User-friendly AI Interface (Supports Ollama, OpenAI API, ...)
Dive is an open-source MCP Host desktop application that seamlessly integrates with any LLM that supports function calling. ✨
LLMX: the easiest third-party local LLM UI for the web!
Belullama is a comprehensive AI application that bundles Ollama, Open WebUI, and Automatic1111 (Stable Diffusion WebUI) into a single, easy-to-use package.
Fully-featured, beautiful web interface for vLLM - built with NextJS.
RikkaHub is an Android app that supports multiple LLM providers.
🔬 Experiment is an experiment is an experiment is an experiment.
A NextJS "Local First" AI Interface
Dive-APP is a Flutter-based mobile application that brings your own powerful AI agents to your pocket by connecting to your own MCP client.
RotinaPy: Simplify your daily life and maximize productivity with an integrated app for task management, study tracking, flashcards, and more. Built with Streamlit and Python.
A modern, feature-rich web interface built with Next.js and shadcn/ui for interacting with local Ollama large language models.
Ollama web UI
User-friendly WebUI for LLMs based on Open WebUI, used by the Kompetenzwerkstatt Digital Humanities (KDH) at Humboldt-Universität zu Berlin.