Local LLM endpoints other than Ollama Integration Support #33

Open
athulchandroth opened this issue Feb 21, 2025 · 0 comments
Labels
enhancement (New feature or request)

@athulchandroth (Collaborator)

Description

Expand local model support beyond Ollama to include other local LLM platforms like LMStudio and AnythingLLM.

Current Status (from team response)

  • Ollama support is currently implemented
  • Planning to add custom endpoint support in the next version
  • Will focus on OpenAI-compatible API endpoints
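
For context, a minimal sketch (using Python's requests library, purely illustrative) of what "OpenAI-compatible" means from the client side: LMStudio's local server (port 1234 by default) and Ollama's own OpenAI-compatible layer (http://localhost:11434/v1) accept the same POST /chat/completions request shape, so one code path could cover all of them. The base URL and model name below are placeholders, not final configuration.

```python
import requests

# Illustrative base URL; LMStudio's local server defaults to port 1234,
# but any OpenAI-compatible endpoint could be substituted here.
BASE_URL = "http://localhost:1234/v1"

response = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": "local-model",  # placeholder; local platforms expose their own model names
        "messages": [{"role": "user", "content": "Hello from a local endpoint"}],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```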

Impact

  • Limited options for users who want to run models locally
  • Users of LMStudio and AnythingLLM cannot use their preferred platforms
  • Dependency on a single local LLM platform (Ollama)

Acceptance Criteria

  1. Support for OpenAI-compatible API endpoints
  2. Integration with common local LLM platforms:
    • LMStudio
    • AnythingLLM
    • Other OpenAI-compatible endpoints
  3. Easy configuration UI for endpoint settings
  4. Proper error handling for endpoint connectivity (see the validation sketch below)
  5. Documentation for setting up different local LLM platforms
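
To make criteria 1 and 4 concrete, here is a rough sketch of how endpoint connectivity could be checked before a configuration is saved: OpenAI-compatible servers expose GET /models, so a single probe can confirm reachability and return the available models. The function name and error messages are hypothetical, not part of the current codebase.

```python
import requests

def validate_endpoint(base_url: str, api_key: str | None = None, timeout: float = 5.0) -> list[str]:
    """Probe an OpenAI-compatible endpoint and return the model IDs it advertises.

    Raises RuntimeError with a user-facing message so a configuration UI can
    surface connectivity problems instead of failing silently.
    """
    headers = {"Authorization": f"Bearer {api_key}"} if api_key else {}
    url = f"{base_url.rstrip('/')}/models"
    try:
        resp = requests.get(url, headers=headers, timeout=timeout)
        resp.raise_for_status()
    except requests.exceptions.ConnectionError as exc:
        raise RuntimeError(f"Cannot reach {base_url}; is the local server running?") from exc
    except requests.exceptions.Timeout as exc:
        raise RuntimeError(f"Timed out connecting to {base_url}") from exc
    except requests.exceptions.HTTPError as exc:
        raise RuntimeError(f"{base_url} responded with HTTP {resp.status_code}") from exc
    return [model["id"] for model in resp.json().get("data", [])]
```

A "Test connection" button in the settings UI could call this and display either the returned model list or the raised message.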

Implementation Phases

  1. Phase 1 (Current):
    • Maintain Ollama support
  2. Phase 2 (Next Version):
    • Add custom OpenAI-compatible endpoint support
    • Create configuration interface (example settings sketched below)
    • Add endpoint validation
  3. Phase 3 (Future):
    • Expand to additional local LLM platforms
    • Add platform-specific optimizations
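
As a rough illustration of the Phase 2 configuration interface, the settings it needs to persist are small. The field names and preset ports below are assumptions based on each platform's usual defaults, not a committed schema.

```python
from dataclasses import dataclass

@dataclass
class LocalEndpointConfig:
    """Hypothetical settings an endpoint configuration UI could persist."""
    name: str                        # display label, e.g. "LMStudio"
    base_url: str                    # OpenAI-compatible base URL, usually ending in /v1
    api_key: str | None = None       # many local servers ignore the key, but some require one
    default_model: str | None = None
    request_timeout_s: float = 30.0

# Illustrative presets; ports reflect the platforms' usual defaults.
PRESETS = [
    LocalEndpointConfig(name="Ollama (OpenAI-compatible)", base_url="http://localhost:11434/v1"),
    LocalEndpointConfig(name="LMStudio", base_url="http://localhost:1234/v1"),
    LocalEndpointConfig(name="Custom endpoint", base_url="http://localhost:8080/v1"),
]
```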

Priority

Medium - a partial solution already exists via the current Ollama support

Notes

  • Focus will be on OpenAI-compatible endpoints for maximum compatibility
  • Will enable users to use their preferred local LLM setup
  • Builds on existing Ollama integration experience
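
A note on why OpenAI compatibility buys maximum coverage: the stock openai Python client can already target any such endpoint by overriding base_url, so LMStudio, AnythingLLM, Ollama, or a custom server can all be driven through the same code path as long as they expose a compatible endpoint. The URL, key, and model name below are placeholders.

```python
from openai import OpenAI

# Any OpenAI-compatible local server can be targeted by overriding base_url.
# Local servers typically ignore the API key, but the client requires a value.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

reply = client.chat.completions.create(
    model="local-model",  # placeholder; the local platform lists its own model names
    messages=[{"role": "user", "content": "Hello from a local model"}],
)
print(reply.choices[0].message.content)
```
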
@athulchandroth added the enhancement label Feb 21, 2025
@sujithatzackriya changed the title from "Local LLM Integration Support" to "Local LLM endpoints other than Ollama Integration Support" Mar 6, 2025