InferenceService / ServingRuntime compatibility #1

Closed
vrutkovs opened this issue Apr 5, 2025 · 1 comment

Comments

vrutkovs commented Apr 5, 2025

Thanks for a great diffusers demo! If I understood correctly, this uses a standalone KServe deployment. Any pointers on how to use it with ServingRuntime / InferenceService objects?

mcaimi (Owner) commented Apr 7, 2025

Hello Vadim,
yes, you understood correctly, this is an example custom serving engine built with KServe. It is only an example though; it is not at all ready for anything other than demos 👍🏻

Sure thing, here is the ServingRuntime I use to deploy the custom image.
As for the InferenceService... I currently create one using the RHOAI web portal, so I do not have any YAML example ready to share.
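(Illustrative only, not the exact manifest from this repository: a ServingRuntime wrapping a custom KServe serving image usually looks roughly like the sketch below. The runtime name, model format label, image reference, and GPU limit are placeholders to adapt.)

```yaml
apiVersion: serving.kserve.io/v1alpha1
kind: ServingRuntime
metadata:
  name: diffusers-runtime            # placeholder name
spec:
  supportedModelFormats:
    - name: diffusers                # placeholder format label, matched by the InferenceService
      version: "1"
      autoSelect: true
  containers:
    - name: kserve-container
      image: quay.io/example/diffusers-kserve:latest   # replace with the custom serving image
      ports:
        - containerPort: 8080
          protocol: TCP
      resources:
        limits:
          nvidia.com/gpu: "1"        # assumes a GPU node; adjust to the cluster
```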
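(Again only a hedged sketch, not taken from this repo: an InferenceService that selects a runtime like the one above by model format could look like this. The name, storageUri, and resource limits are assumptions; the RHOAI web portal generates an equivalent object for you.)

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: diffusers-demo               # placeholder name
spec:
  predictor:
    model:
      modelFormat:
        name: diffusers              # must match a format declared by the ServingRuntime
      runtime: diffusers-runtime     # explicit runtime selection (optional when autoSelect matches)
      storageUri: pvc://diffusers-models/stable-diffusion   # placeholder model location
      resources:
        limits:
          nvidia.com/gpu: "1"
```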

mcaimi closed this as completed Apr 16, 2025