
tidyllm 0.3.4


@edubruell released this 26 Mar 22:13 · 14 commits to main since this release

tidyllm Version 0.3.4 Release Notes

This release centers on a major internal refactor, accompanied by a set of smaller but meaningful improvements. Although many of the changes are under the hood, together they make the framework more robust, flexible, and maintainable.

Key Improvements

  • Robust Streaming:

    • New Streaming Backend: Streaming is now handled via httr2::req_perform_connection() (httr2 ≥ 1.1.1), resulting in a more stable and reliable experience.
    • Metadata for Streaming: Streaming requests now also support metadata extraction, logprobs, and other features, making them even more informative.
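
    As a rough sketch of how streaming fits into the verb-based interface (function and argument names such as `chat()`, `openai()`, `.stream`, and `get_metadata()` follow tidyllm's dot-prefixed conventions, but treat the exact signatures here as assumptions rather than the release's definitive API):

    ```r
    library(tidyllm)

    # Stream a reply token-by-token instead of waiting for the full response
    # (.stream = TRUE is assumed to toggle the new httr2-based backend):
    conversation <- llm_message("Explain vectors vs. lists in R in one paragraph.") |>
      chat(openai(.model = "gpt-4o-mini"), .stream = TRUE)

    # With this release, streamed replies also carry metadata
    # (get_metadata() is assumed to expose it, as for non-streamed chats):
    get_metadata(conversation)
    ```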
  • Optimized Internal Processing:

    • S7 Methods Integration: Stream handling and chat parsing now use proper S7 methods instead of the previous ad-hoc function generation.
    • OpenAI Request Construction: Both OpenAI and Azure OpenAI (along with their batch functions) now use a common request construction function to reduce code duplication and simplify maintenance.
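
    To illustrate the kind of S7 dispatch the refactor relies on (the class and generic names below are invented for illustration and are not tidyllm internals):

    ```r
    library(S7)

    # A minimal S7 class holding one chunk of a streamed response:
    StreamChunk <- new_class("StreamChunk",
                             properties = list(text = class_character))

    # A generic that dispatches on the chunk's class, replacing
    # what used to be done via generated closures:
    parse_chunk <- new_generic("parse_chunk", dispatch_args = "chunk")

    method(parse_chunk, StreamChunk) <- function(chunk) {
      paste0("parsed: ", chunk@text)
    }

    parse_chunk(StreamChunk(text = "hello"))
    ```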
  • Schema Support:

    • New field_object() function that allows for nested schemata.
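
    A hedged sketch of nesting schemata with the new field_object() (the field helpers and `tidyllm_schema()` follow tidyllm's existing schema interface, but the exact argument forms shown are assumptions):

    ```r
    library(tidyllm)

    # A nested object built with the new field_object():
    address <- field_object(
      street = field_chr("Street name and number"),
      city   = field_chr("City")
    )

    person <- tidyllm_schema(
      name      = "person",
      full_name = field_chr("Full name of the person"),
      age       = field_dbl("Age in years"),
      address   = address   # nested schema via field_object()
    )

    llm_message("Extract: John Doe, 42, lives at 1 Main St, Springfield.") |>
      chat(openai(.model = "gpt-4o-mini"), .json_schema = person)
    ```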

  • Expanded API Features:

    • JSON Schema Support:
      • mistral() now accepts the .json_schema argument.
      • claude() incorporates .json_schema via a JSON-extractor tool, in line with Anthropic's guidelines.
    • Batch API for groq(): A new batch processing interface has been implemented for groq().
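
    The expanded API features might be combined roughly as follows (a sketch: `send_batch()`, `check_batch()`, and `fetch_batch()` are tidyllm's generic batch verbs, but their use with `groq()`, the model names, and the `.json_schema` pass-through shown here are assumptions, not guaranteed signatures):

    ```r
    library(tidyllm)

    # .json_schema is now accepted by mistral() as well:
    schema <- tidyllm_schema(capital = field_chr("Capital city"))
    llm_message("Return the capital of France as JSON.") |>
      chat(mistral(.model = "mistral-small-latest"), .json_schema = schema)

    # Batch processing with groq() via the generic batch verbs:
    msgs  <- list(llm_message("Summarise doc A."), llm_message("Summarise doc B."))
    batch <- msgs |> send_batch(groq(.model = "llama-3.3-70b-versatile"))
    batch |> check_batch(groq())
    results <- batch |> fetch_batch(groq())
    ```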

Bug Fixes

  • claude() Batch Requests: Fixed an issue where system prompts were not transmitted correctly in batch mode.
  • gemini() Prompt Handling: Resolved a bug that caused system prompts to be omitted from API calls.