Can Devs Please Provide the Qwen2 HAR with Attention Layer LoRA Slots?

Partway through setting up LoRA, I found that the only supported model (Qwen2 1.5) exposes only the MLP layers for reweighting (gate_proj, up_proj, and down_proj), not the attention layers (q_proj, k_proj, v_proj, o_proj). The attention layers are simply not supported here, and I can't extract them from the compiled files to force the issue.
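For context on why I don't think this should be a heavy addition: a back-of-the-envelope sketch of the per-layer LoRA adapter sizes for the projections named above. The dimensions are my assumptions based on the published Qwen2-1.5B config (hidden_size 1536, intermediate_size 8960, GQA with 2 KV heads of head_dim 128, so a KV projection width of 256) and should be checked against the actual HAR.

```python
# Assumed Qwen2-1.5B dimensions -- verify against the model config / HAR.
HIDDEN = 1536
INTERMEDIATE = 8960
KV_WIDTH = 2 * 128  # num_key_value_heads * head_dim under GQA

def lora_params(in_features, out_features, rank):
    """A rank-r LoRA adapter on a linear layer adds an (in x r) A matrix
    and an (r x out) B matrix, i.e. rank * (in + out) parameters."""
    return rank * (in_features + out_features)

RANK = 8

# Attention projections -- the surfaces requested in this thread.
attn = {
    "q_proj": lora_params(HIDDEN, HIDDEN, RANK),
    "k_proj": lora_params(HIDDEN, KV_WIDTH, RANK),
    "v_proj": lora_params(HIDDEN, KV_WIDTH, RANK),
    "o_proj": lora_params(HIDDEN, HIDDEN, RANK),
}

# MLP projections -- the surfaces the current HAR already exposes.
mlp = {
    "gate_proj": lora_params(HIDDEN, INTERMEDIATE, RANK),
    "up_proj": lora_params(HIDDEN, INTERMEDIATE, RANK),
    "down_proj": lora_params(INTERMEDIATE, HIDDEN, RANK),
}

print("attention LoRA params/layer:", sum(attn.values()))  # 77824 at rank 8
print("MLP LoRA params/layer:", sum(mlp.values()))         # 251904 at rank 8
```

If those dimensions are right, a rank-8 attention adapter is actually about a third the size of the MLP adapter per layer (largely because of the narrow GQA key/value projections), so exposing those slots shouldn't inflate the HAR much.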

MLP tuning isn't as useful for me: I'm already running RAG with my model, which largely covers what a LoRA on those trainable surfaces would give me. What I'm after is tweaking personality, and that calls for the attention layers.

Is it possible to publish a new HAR with those trainable surfaces exposed? Qwen2.5 would be ideal, but I would be happy with Qwen2 if it’s a lighter lift.


Hi @bill_bz ,

We are constantly evaluating new features and have taken note of your request.
However, for the time being we cannot share specific timelines.

Thanks,