
feat: Add Ollama module #1099


Merged
merged 15 commits into testcontainers:develop on Apr 29, 2025

Conversation

frankhaugen
Contributor

What does this PR do?

Adds a new module project that uses Ollama Docker images to provide local LLM capabilities. The module currently supports CPU workloads only, but GPU acceleration can be enabled by setting the appropriate environment variables when the prerequisites are present, so CUDA-based processing isn't precluded.
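For illustration, a minimal usage sketch of what consuming the module could look like. The names OllamaBuilder, OllamaContainer, and GetBaseAddress() are assumptions for this sketch and may differ from the merged API; the HTTP endpoints (/api/pull, /api/generate) belong to Ollama itself.

// Minimal sketch: start an Ollama container and call its HTTP API.
// OllamaBuilder, OllamaContainer, and GetBaseAddress() are assumed names
// for illustration; the merged module's API surface may differ.
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Testcontainers.Ollama;

public static class OllamaQuickstart
{
    public static async Task RunAsync()
    {
        // Build and start the container (CPU-only by default).
        await using OllamaContainer ollamaContainer = new OllamaBuilder()
            .WithImage("ollama/ollama:latest")
            .Build();

        await ollamaContainer.StartAsync();

        using var httpClient = new HttpClient();

        // Note: the requested model must be pulled first (e.g. via the
        // /api/pull endpoint) before /api/generate will succeed.
        using var payload = new StringContent(
            "{\"model\":\"llama3\",\"prompt\":\"Why is the sky blue?\",\"stream\":false}",
            Encoding.UTF8,
            "application/json");

        var response = await httpClient.PostAsync(
            new Uri(new Uri(ollamaContainer.GetBaseAddress()), "/api/generate"),
            payload);

        response.EnsureSuccessStatusCode();
    }
}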

Why is it important?

This module is very powerful and can provide a lot of value to anyone incorporating Ollama models into their projects.

Related issues

Link related issues below. Insert the issue link or reference after the word "Closes" if merging this should automatically close it.

How to test this PR

Run the added test.

Follow-ups

  • Adding more docs to guide users through enabling GPU support might be a future task, but there is no need for it right now.


netlify bot commented Jan 25, 2024

Deploy Preview for testcontainers-dotnet ready!

🔨 Latest commit: 4c69ac7
🔍 Latest deploy log: https://app.netlify.com/sites/testcontainers-dotnet/deploys/680f94f1137a7d00086f86fe
😎 Deploy Preview: https://deploy-preview-1099--testcontainers-dotnet.netlify.app

HofmeisterAn (Collaborator) left a comment


Thanks for the PR. I had a quick look and added some minor suggestions and improvements.

This commit updates the Ollama configuration to allow more customization options and removes unnecessary test helpers. The OllamaConfiguration class was refactored to provide more configurable parameters such as the VolumePath and VolumeName. Additionally, the TestOutputHelperExtensions and TestOutputLogger classes were deleted as they were not providing any significant value.
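For context, a hypothetical sketch of the kind of configuration object this describes. The property names VolumeName and VolumePath come from the description above; the defaults shown are illustrative assumptions only, not the merged implementation.

// Hypothetical sketch of a configuration class exposing volume settings.
// The default values below are assumptions, not the merged implementation.
public sealed class OllamaConfiguration
{
    public OllamaConfiguration(
        string volumeName = "ollama-data",
        string volumePath = "/root/.ollama")
    {
        VolumeName = volumeName;
        VolumePath = volumePath;
    }

    // Name of the Docker volume used to persist downloaded models between runs.
    public string VolumeName { get; }

    // Path inside the container where Ollama stores its models.
    public string VolumePath { get; }
}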
frankhaugen (Contributor, Author) commented

@HofmeisterAn I'm updating the branch with my current state, as the tests were green locally, but I think it needs another pass. No rush: I'm busy with my day job, so there's no need for you to rush 😄

eddumelendez (Member) left a comment


Thanks for your contribution. Testcontainers for Java and Go have recently released a new Ollama module, which can detect and enable GPU support and allows committing the container's state to create a new image. It would be nice to align the .NET implementation with the others. See the Java implementation.

@HofmeisterAn added the enhancement (New feature or request) and module (An official Testcontainers module) labels on Mar 11, 2024
duranserkan commented

Is there any update on this?


IRusio commented Apr 17, 2025

any update on that?

HofmeisterAn (Collaborator) left a comment


I updated the PR to develop. I'll take a look at it sometime next week and address the necessary parts.

HofmeisterAn (Collaborator) left a comment


I updated the module implementation and aligned it with the other Testcontainers module implementations. I also added a test example.

> Thanks for your contribution. Testcontainers for Java and Go have recently released a new Ollama module, which can detect and enable GPU support and allows committing the container's state to create a new image. It would be nice to align the .NET implementation with the others. See the Java implementation.

I didn't include detecting and setting the GPU configuration, as it's not that simple in .NET (but it's something we can address in the future). I'm also not sure if committing an image is supported in Docker.DotNet. I need to double-check that. It might be something we need to implement first. If someone wants to pick it up, that would be great, but we need to discuss how to address this in an issue first.

// How to set the device requests using the container builder API:
.WithCreateParameterModifier(parameterModifier =>
    parameterModifier.HostConfig.DeviceRequests
        = new List<DeviceRequest> { new DeviceRequest { /* TODO: Set properties. */ } })
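For reference, a sketch of how that TODO could be filled in to request all available NVIDIA GPUs, mirroring docker run --gpus=all. The DeviceRequest fields follow the Docker Engine API; this assumes the NVIDIA Container Toolkit is installed on the Docker host.

// Sketch: request all NVIDIA GPUs via Docker.DotNet's DeviceRequest
// (equivalent to `docker run --gpus=all`). Requires the NVIDIA Container
// Toolkit on the Docker host; otherwise container creation will fail.
using System.Collections.Generic;
using Docker.DotNet.Models;
using DotNet.Testcontainers.Builders;

var gpuEnabledContainer = new ContainerBuilder()
    .WithImage("ollama/ollama:latest")
    .WithPortBinding(11434, true) // Expose Ollama's default port 11434 on a random host port.
    .WithCreateParameterModifier(parameterModifier =>
        parameterModifier.HostConfig.DeviceRequests = new List<DeviceRequest>
        {
            new DeviceRequest
            {
                Driver = "nvidia",
                Count = -1, // -1 requests all available GPUs.
                Capabilities = new List<IList<string>> { new List<string> { "gpu" } }
            }
        })
    .Build();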

@HofmeisterAn changed the title from "Frank/ollama module" to "feat: Add Ollama module" on Apr 28, 2025
kiview (Member) commented Apr 28, 2025

I'd also like to reference the useLocal feature in the tc-go implementation. Pragmatically, it is the best way to get a good DX on macOS.

@HofmeisterAn merged commit fff9742 into testcontainers:develop on Apr 29, 2025
70 checks passed
Labels
enhancement (New feature or request), module (An official Testcontainers module)

Successfully merging this pull request may close these issues.

[Enhancement]: Ollama module
6 participants