Run with local Ollama #55
The app is not yet ready, according to feedback from the devs. In general, if you have a proper Ollama server running, after you stop the recording you will have a button to select the model via the Ollama API. The issue is that the summarizer is hardcoded to Claude.
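For reference, the model-selection step described above maps onto Ollama's standard REST API: a running server exposes `/api/tags`, which returns the models it has pulled. This is a minimal sketch, assuming the default port 11434; the endpoint is Ollama's documented API, but the function names here are just illustrative.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default address; adjust if yours differs


def tags_endpoint(base_url: str) -> str:
    """Build the URL for Ollama's model-listing endpoint (/api/tags)."""
    return base_url.rstrip("/") + "/api/tags"


def list_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Return the names of models the local Ollama server has pulled.

    Requires a running Ollama server at base_url.
    """
    with urllib.request.urlopen(tags_endpoint(base_url)) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]
```

An app could populate its model-picker button from the result of `list_models()` instead of hardcoding a provider.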
Uhm, I get it. I read in the README that it was supported and wanted to try. Thanks
Yeah, I believe it is a matter of being patient since the project is quite new
I personally think this would be one of the major features to get working in some fashion. I love the idea. I am currently using this to do the transcript, copying the transcript to my Obsidian, then using LocalGPT (which accesses my Ollama model) to run a task against the transcript and deliver notes, tasks, follow-ups, and a call summary. It would be amazing if I could just point this app at my Ollama install and have it produce my notes for me.

Prompt: You are a transcript specialist. You specialize in digesting transcripts, deciphering between different speakers, understanding the context of statements, and summarizing the conversation in an easy-to-understand .MD file structure. When you are done summarizing, you will find key takeaway items and create follow-up bullet points.

Objective: To review call transcriptions, extract key details, provide tone analysis, and offer improvement suggestions for future interactions.

Sub-Roles:
Implementation Considerations:
Conclusion: This AI agent profile is designed to enhance call review and analysis by handling multiple roles efficiently. By breaking the process into sub-roles, the AI can perform each task with precision, ensuring comprehensive and useful summaries that aid in improving future interactions. Find follow-up items and list them. Finally, all returned summaries should be in Markdown format. All follow-up items should have checkboxes.
Hey @hitmandied |
How do I run the backend with a local Ollama server? I did not find it mentioned in the README. It only talks about Groq and Anthropic, but nothing about Ollama.