How to use chenjoya/videollm-online-8b-v1plus with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model, then attach the adapter weights on top of it
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-8B-Instruct")
model = PeftModel.from_pretrained(base_model, "chenjoya/videollm-online-8b-v1plus")
```
Thanks!