send parameter in request when args is nil #818

Open
wants to merge 3 commits into master

Conversation

longlene (Author)

No description provided.

karthink (Owner) commented May 1, 2025

As you can see from the comment in the original code, OpenAI complains if the function takes no arguments but the tool spec contains a :parameters field. From #688, it looks like some OpenAI-compatible APIs (like Grok?) complain if the tool spec does not contain a :parameters field.

So this needs to be tested carefully against all OpenAI-compatible APIs.
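
For reference, here is an editorial sketch (not gptel's actual request-building code; the tool name and description are only illustrative) of the two tool-spec shapes the compatibility problem is about, written as roughly the plists that get serialized to JSON:

```elisp
;; Shape A: a zero-argument tool with no :parameters field at all.
;; Some OpenAI-compatible servers (per #688) reject this.
(list :type "function"
      :function (list :name "git_status"
                      :description "Report the current git status."))

;; Shape B: the same tool with an explicit :parameters field (here an
;; empty object schema).  OpenAI has historically complained about
;; this for functions that take no arguments.
(list :type "function"
      :function (list :name "git_status"
                      :description "Report the current git status."
                      :parameters (list :type "object" :properties nil)))
```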

longlene (Author) commented May 1, 2025

@karthink Thank you for the explanation.
If so, maybe we can add a defcustom variable: gptel would still not send the :parameters field by default, and then we could customize the value to change the behavior to suit different inference servers.
What do you think?
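
For concreteness, a hypothetical sketch of the kind of option being proposed (the name, default, and docstring below are illustrative, not taken from this PR):

```elisp
(defcustom gptel-always-send-tool-parameters nil
  "When non-nil, always include a :parameters field in tool specs,
even for tools that take no arguments.
Some OpenAI-compatible servers appear to require the field, while
others reject it for zero-argument tools, so the default leaves
gptel's current behavior unchanged."
  :type 'boolean
  :group 'gptel)
```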

karthink (Owner) commented May 1, 2025

> If so, maybe we can add a defcustom variable: gptel would still not send the :parameters field by default, and then we could customize the value to change the behavior to suit different inference servers.

This will not work -- consider the case where you switch your backend from OpenAI to llama.cpp.

longlene (Author) commented May 1, 2025

@karthink I updated the code. The default behavior is still the same as before, but users who know what they are doing now have a way to always send the parameters field when needed.

karthink (Owner) commented May 5, 2025

@longlene As I mentioned above, a user option is not the solution here.

Could you check whether the OpenAI API works correctly after your original change? It did not use to, but things might be different now. If it works with :parameters specified, we can merge the change without the user option.

Also :parameters should probably be set to :null, not nil.

longlene (Author) commented May 6, 2025

@karthink OK, let me see if I can test it with the OpenAI API.

longlene (Author) commented May 6, 2025

@karthink Tested with gpt-4.1-nano. It looks like parameters: null is OK now; I used nil just because it also works. I will update the PR then.
gptel.log

longlene (Author) commented May 6, 2025

(list :parameters nil) would cause gptel to send parameters: {} to the API.
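
A minimal illustration of the difference, using Emacs's built-in json-serialize (gptel's own JSON encoder may differ in detail, but this matches the behavior described above):

```elisp
(json-serialize '(:parameters nil))    ; => "{\"parameters\":{}}"
(json-serialize '(:parameters :null))  ; => "{\"parameters\":null}"
```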

karthink (Owner) commented May 6, 2025

> (list :parameters nil) would cause gptel to send parameters: {} to the API.

Yes. In your testing, do null and {} both work?

longlene (Author) commented May 6, 2025

@karthink Yes, both OpenAI and llama-server work with parameters: {} and with null. I updated the PR to use null.

longlene (Author) commented May 6, 2025

In the log file I uploaded, the value is null. GPT can call the git_status tool correctly.
