
Conversation

@sonnh-uit commented Aug 20, 2025

Why are these changes needed?

This change is needed to load a model client with the ollama provider.

Following the instructions at https://microsoft.github.io/autogen/stable//reference/python/autogen_ext.models.ollama.html#module-autogen_ext.models.ollama, the client can be created from a config via `client = ChatCompletionClient.load_component(config)`. But these lines

```
output = loaded_model.provider.rsplit(".", maxsplit=1)
if len(output) != 2:
    raise ValueError("Invalid")
```

make the load fail whenever the provider is not in the well-known list: a short provider name such as `OllamaChatCompletionClient` contains no `.` to split on, so `rsplit` returns a single element and the `ValueError` is raised.
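A minimal sketch of the failing load, assuming the config shape shown in the `ChatCompletionClient` documentation (the model name `llama3.2` is just an example):

```
from autogen_core.models import ChatCompletionClient

# Short provider name with no "." in it; it has to be resolved through the
# well-known provider table to autogen_ext.models.ollama.OllamaChatCompletionClient.
config = {
    "provider": "OllamaChatCompletionClient",
    "config": {"model": "llama3.2"},
}

# Before this fix: the name is not in the well-known list, rsplit(".") yields
# a single element, and load_component raises ValueError("Invalid").
client = ChatCompletionClient.load_component(config)
```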

Related issue number

I did not create an issue; I am facing this bug and am fixing it directly.


This change adds ollama to the well-known provider list to fix that bug.
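Roughly, the fix maps the ollama short names to their full import path in the well-known provider table (a sketch; the exact dict name and module inside autogen_core may differ):

```
# Sketch of the change in autogen_core's component config (names assumed):
WELL_KNOWN_PROVIDERS = {
    # ... existing entries for the OpenAI/Azure clients ...
    "ollama_chat_completion_client": "autogen_ext.models.ollama.OllamaChatCompletionClient",
    "OllamaChatCompletionClient": "autogen_ext.models.ollama.OllamaChatCompletionClient",
}
```

With this mapping in place, `rsplit(".", maxsplit=1)` runs on the full dotted path and yields the expected module/class pair.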
@sonnh-uit (Author) commented Aug 20, 2025

@Z1m4-blu3 already created pull request #6709 for this, but it is now out of date with the base branch.

@sonnh-uit (Author) commented
Hi @ekzhu, could you please approve and run the workflows?
