
how to set tensor_parallel_size for vllm backend #8055

Closed
@Jasonsey

Description

What happened?

My code is here:

import dspy


lm = dspy.LM("vllm//home/stone/max/base_model/hf_model/Qwen/Qwen2.5-VL-72B-Instruct")
dspy.configure(lm=lm)


qa = dspy.Predict("question: str -> answer: str", tensor_parallel_size=8)
res = qa(question="who are you?")
print(res)

My question is: how do I set tensor_parallel_size for the vLLM backend? This code does not work for that parameter.
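For reference, a minimal sketch of one likely workaround, under the assumption that tensor_parallel_size is a vLLM engine/server argument rather than something DSPy forwards: start a vLLM OpenAI-compatible server with the flag, then point dspy.LM at it. The hosted_vllm/ route, the localhost:8000 address, and the api_key value are assumptions here, not confirmed DSPy behavior for this issue.

```python
# Assumption: tensor_parallel_size belongs to the vLLM engine, so it is set
# when the server is launched, not in dspy.Predict. For example (shell):
#
#   vllm serve /home/stone/max/base_model/hf_model/Qwen/Qwen2.5-VL-72B-Instruct \
#       --tensor-parallel-size 8
#
# DSPy then talks to the already-running server over its OpenAI-compatible API.
import dspy

lm = dspy.LM(
    # hosted_vllm/ is LiteLLM's route for a running vLLM server (assumption).
    "hosted_vllm/Qwen/Qwen2.5-VL-72B-Instruct",
    api_base="http://localhost:8000/v1",  # default `vllm serve` address (assumption)
    api_key="EMPTY",                      # vLLM ignores the key by default
)
dspy.configure(lm=lm)

qa = dspy.Predict("question: str -> answer: str")
print(qa(question="who are you?"))
```

This keeps the DSPy program unchanged; only the model-loading configuration (including tensor parallelism) moves to the server launch command.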

Steps to reproduce

Run the code above.

DSPy version

2.6.17


Labels: bug (Something isn't working)
