
gabriel-dehan/llm-chat-client-prototype


LLM Chat Client Prototype

Very much a work in progress: a quick prototype for streaming chat completions to a chat client, with a hybrid approach (components are rendered inside the chat when JSON data is received from the server).
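The hybrid approach described above can be sketched roughly as follows. This is an illustrative sketch, not the prototype's actual code: the `ChatEvent` type and `classifyChunk` function are hypothetical names, and the real client may parse the stream differently.

```typescript
// Hypothetical sketch of hybrid rendering: as chunks stream in, JSON
// payloads are routed to a component renderer while plain text is
// appended to the current chat message.

type ChatEvent =
  | { kind: "text"; content: string }
  | { kind: "component"; data: unknown };

// Classify a single streamed chunk: a parseable JSON object becomes a
// component event; anything else is treated as plain message text.
function classifyChunk(chunk: string): ChatEvent {
  const trimmed = chunk.trim();
  if (trimmed.startsWith("{")) {
    try {
      return { kind: "component", data: JSON.parse(trimmed) };
    } catch {
      // Incomplete or invalid JSON falls back to plain text.
    }
  }
  return { kind: "text", content: chunk };
}
```

A rendering loop would then switch on `kind`: `"text"` events append to the message bubble, while `"component"` events mount a UI component from the parsed data.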

It interfaces with a back-end using my Gemini Completion Rails gem.

The long-term goal is a fully featured LLM chat client that handles forking the conversation elegantly.
