diff --git a/docs/capabilities/batch.md b/docs/capabilities/batch.md
index 95ff53f1..03b0a8e6 100644
--- a/docs/capabilities/batch.md
+++ b/docs/capabilities/batch.md
@@ -55,7 +55,8 @@
 const batchData = await client.files.upload({
     file: {
         fileName: "batch_input_file.jsonl",
         content: batchFile,
-    }
+    },
+    purpose: "batch"
 });
 ```
diff --git a/docs/capabilities/code-generation.mdx b/docs/capabilities/code-generation.mdx
index 405fac41..3a50737b 100644
--- a/docs/capabilities/code-generation.mdx
+++ b/docs/capabilities/code-generation.mdx
@@ -279,7 +279,7 @@ Check out the README of [mistral-inference](https://github.com/mistralai/mistral
 
 ## Integration with continue.dev
 
 Continue.dev supports both Codestral base for code generation and Codestral Instruct for chat.
- 
+
 
 ### How to set up Codestral with Continue
@@ -325,8 +325,7 @@ If you run into any issues or have any questions, please join our Discord and po
 
 ## Integration with Tabnine
 
 Tabnine supports Codestral Instruct for chat.
- 
- 
+
 
 ### How to set up Codestral with Tabnine
@@ -366,7 +365,7 @@ llm.invoke([("user", "Write a function for fibonacci")])
 
 For a more complex use case of self-corrective code generation using the instruct Codestral tool use, check out this [notebook](https://github.com/mistralai/cookbook/blob/main/third_party/langchain/langgraph_code_assistant_mistral.ipynb) and this video:
- 
+
 
 ## Integration with LlamaIndex
 
 LlamaIndex provides support for Codestral Instruct and Fill In Middle (FIM) endpoints. Here is how you can use it in LlamaIndex:
@@ -402,7 +401,7 @@ jupyter lab
 
 Afterwards, you can select Codestral as your model of choice, input your Mistral API key, and start coding with Codestral!
 
- 
+
 
 ## Integration with JupyterLite
 
@@ -411,14 +410,14 @@ JupyterLite is a project that aims to bring the JupyterLab environment to the we
 
 You can try Codestral with JupyterLite in your browser: [![lite-badge](https://jupyterlite.rtfd.io/en/latest/_static/badge.svg)](https://jupyterlite.github.io/ai/lab/index.html)
 
- 
+
 
 ## Integration with CodeGPT
 
 CodeGPT is a powerful agnostic extension harnessing the capabilities of Large Language Models (LLMs) to boost your programming tasks using AI in VSCode.
 
 You can select Codestral in CodeGPT for code generation and tab completion.
 
- 
+
 
 ## Integration with Tabby
@@ -435,7 +434,7 @@ api_key = "secret-api-key"
 
 You can check out [Tabby's documentation](https://tabby.tabbyml.com/docs/administration/model/#mistral--codestral) to learn more.
 
- 
+
 
 ## Integration with E2B
@@ -444,7 +443,7 @@ With E2B, it is easy for developers to add code interpreting capabilities to AI
 In the following examples, the AI agent performs a data analysis task on an uploaded CSV file, executes the AI-generated code by Codestral in the sandboxed environment by E2B, and returns a chart, saving it as a PNG file.
 
 Python implementation ([cookbook](https://github.com/mistralai/cookbook/tree/main/third_party/E2B_Code_Interpreting/codestral-code-interpreter-python)):
- 
+
 
 JS implementation ([cookbook](https://github.com/mistralai/cookbook/tree/main/third_party/E2B_Code_Interpreting/codestral-code-interpreter-js)):
- 
+