Pull requests: HabanaAI/vllm-fork (forked from vllm-project/vllm)
Clear the Mamba cache table if the running queue is empty
#2046 opened Oct 17, 2025 by Wei-Lin-Intel
Fix bug where VLLM_SKIP_WARMUP=1 is not recognized in vision_bucket
#2036 opened Oct 15, 2025 by yingjie-han
internvl: cache prompt_tokens and output_tokens for penalty sampling
#1974 opened Sep 23, 2025 by libinta