tidyllm Version 0.3.4 Release Notes
This release marks a major internal refactor accompanied by a suite of subtle yet impactful improvements. While many changes occur under the hood, they collectively deliver a more robust, flexible, and maintainable framework.
Key Improvements
- Robust Streaming (a short streaming sketch follows this list):
  - New Streaming Backend: Streaming is now handled via httr2::req_perform_connection() (httr2 ≥ 1.1.1), resulting in a more stable and reliable experience.
  - Metadata for Streaming: Streaming requests now also support metadata extraction, logprobs, and other features, making them even more informative.
- Optimized Internal Processing:
  - S7 Methods Integration: Stream handling and chat parsing now use proper S7 methods instead of the clunky function generation used previously.
  - OpenAI Request Construction: Both OpenAI and Azure OpenAI (along with their batch functions) now share a common request-construction function, reducing code duplication and simplifying maintenance.
- Schema Support (see the nested-schema sketch after this list):
  - New field_object() function to allow for nested schemata.
- Expanded API Features:
  - JSON Schema Support: mistral() now accepts the .json_schema argument. claude() incorporates .json_schema via a JSON-extractor tool, in line with Anthropic's guidelines. A short example follows this list.
  - Batch API for groq(): A new batch processing interface has been implemented for groq() (sketched after this list).
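
A minimal streaming sketch, assuming an OpenAI API key is configured and that openai() accepts .model and .stream arguments as in previous releases; the model name is purely illustrative:

```r
library(tidyllm)

# .stream = TRUE switches on the new httr2::req_perform_connection() backend;
# the model name is just an example.
conversation <- llm_message("Explain the difference between a list and a vector in R.") |>
  chat(openai(.model = "gpt-4o-mini", .stream = TRUE))

# Tokens are printed as they arrive; the finished reply is stored in the
# conversation as usual and can be read back afterwards.
get_reply(conversation)
```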
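A sketch of a nested schema built with the new field_object(). The signature is an assumption: field_object() is taken to accept named sub-fields in the same way tidyllm_schema() does, with field_chr()/field_dbl() carrying field descriptions.

```r
library(tidyllm)

# Assumed signature: field_object() composes named sub-fields, analogous to
# the top-level fields of tidyllm_schema().
person_schema <- tidyllm_schema(
  name      = "person",
  full_name = field_chr("Full name of the person"),
  age       = field_dbl("Age in years"),
  address   = field_object(
    street = field_chr("Street and house number"),
    city   = field_chr("City")
  )
)
```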
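A sketch of the new .json_schema argument, reusing the schema from the previous example; get_reply_data() is assumed here as the accessor for the structured reply:

```r
# Pass the schema to mistral() via .json_schema; swapping mistral() for
# claude() routes the same schema through the JSON-extractor tool instead.
answer <- llm_message("Extract the person mentioned in: 'Maria, 34, lives in Vienna.'") |>
  chat(mistral(.json_schema = person_schema))

# Assumed accessor that parses the structured JSON reply into an R object.
get_reply_data(answer)
```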
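A sketch of the groq() batch interface, under the assumption that it follows the same provider-agnostic send_batch()/check_batch()/fetch_batch() verbs used for the other batch-capable providers:

```r
# Assumed batch verbs: send_batch(), check_batch(), fetch_batch().
questions <- list(
  q1 = llm_message("What is the capital of France?"),
  q2 = llm_message("What is the capital of Japan?")
)

batch   <- send_batch(questions, groq())  # submit the batch to groq
check_batch(batch)                        # poll processing status
results <- fetch_batch(batch)             # retrieve replies once finished
```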
Bug Fixes
- claude() Batch Requests: Fixed an issue where system prompts were not transmitted correctly in batch mode.
- gemini() Prompt Handling: Resolved a bug that caused system prompts to be omitted from API calls in earlier versions.