Inference, fine-tuning, and evaluation engines (such as vLLM, TGI, SGLang, LLaMA-Factory, and Swift) are now pluggable: new engine types can be integrated simply by editing the configuration file.
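A pluggable-engine registry of this kind might be declared as in the sketch below; the file layout, keys, and image names here are illustrative assumptions, not CSGHub's actual configuration schema:

```yaml
# Hypothetical engine registry (illustrative only; field names are assumed).
engines:
  - name: vllm
    type: inference
    image: example.com/vllm-engine:latest      # container image to launch (assumed field)
    port: 8000
  - name: llama-factory
    type: fine-tuning
    image: example.com/llamafactory:latest
    port: 7860
```

Under such a scheme, adding a new engine would mean appending an entry here rather than changing application code.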
Introduced a new MCP (Model Context Protocol) application space to facilitate the seamless deployment and management of MCP servers.
Launched the new AI Gateway service to streamline interactions with serverless and dedicated inference endpoints running on CSGHub. Chat is supported now, with embedding and tool calling expected to follow soon.
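As a minimal sketch of what a chat request through such a gateway could look like, the snippet below builds an OpenAI-style chat-completion payload. The endpoint URL, model id, and wire format are assumptions for illustration, not CSGHub's documented API:

```python
import json

# Hypothetical gateway endpoint; the real URL and schema depend on your
# CSGHub deployment (assumption: an OpenAI-compatible chat API).
GATEWAY_URL = "https://your-csghub-host/v1/chat/completions"

payload = {
    "model": "your-deployed-model",  # hypothetical model identifier
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
    "stream": False,
}

body = json.dumps(payload)
print(body)

# An actual request could then be sent with any HTTP client, e.g.:
# requests.post(GATEWAY_URL,
#               headers={"Authorization": f"Bearer {token}"},
#               data=body)
```

The gateway's value in this setup is that the same request shape would work against both serverless and dedicated endpoints, with routing handled server-side.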
The Space Docker image builder is now open source and has been incorporated as a submodule of the CSGHub runner service.
The Inference Playground now supports image-text-to-text generation, enhancing its capabilities for versatile AI applications.
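An image-text-to-text request pairs an image with a text prompt in a single message. The sketch below uses the OpenAI-style multimodal "content parts" format as an assumed example; the Playground's actual wire format may differ:

```python
import json

# Hypothetical multimodal message: one text part plus one image part
# (assumption: OpenAI-style content-part schema).
message = {
    "role": "user",
    "content": [
        {"type": "text", "text": "Describe this image."},
        {"type": "image_url",
         "image_url": {"url": "https://example.com/sample.png"}},
    ],
}

print(json.dumps({"messages": [message]}, indent=2))
```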