In that case, you should do something like:

```python
from peft import PeftModel

base_model = ...  # the same quantized base model as previously
model = PeftModel.from_pretrained(
    base_model,
    adapter_path_from_first_qlora_training,
    is_trainable=True,  # load the adapter weights for further training, not just inference
)
```

From there, you get a trainable PEFT model, similar to what you'd get with `get_peft_model`.

Answer selected by supreme-gg-gg