LoRAModel
- class mmpretrain.models.peft.LoRAModel(module, alpha=1, rank=0, drop_rate=0.0, targets=[])
Implements LoRA in a module.
A PyTorch implementation of: LoRA: Low-Rank Adaptation of Large Language Models
- Parameters:
module (dict) – The config of the module to be fine-tuned. See mmpretrain.models.
alpha (int) – The scale factor of LoRA. Defaults to 1.
rank (int) – The rank of LoRA. Defaults to 0.
drop_rate (float) – The dropout rate for LoRA. Defaults to 0.
targets (List[dict]) – The target layers to apply LoRA to. Defaults to an empty list. Each target is specified by a regular expression or a layer-name suffix (see the sketch below for how matched layers are adapted).
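For reference, LoRA freezes the pretrained weight of each matched layer and adds a trainable low-rank branch scaled by alpha / rank, so only the two small rank-r matrices are updated during fine-tuning. The sketch below shows that computation for a linear layer; the class name LoRALinearSketch and the initialization choices are illustrative, not the exact mmpretrain implementation:

import torch
import torch.nn as nn

class LoRALinearSketch(nn.Module):
    """Illustrative LoRA adapter around a frozen linear layer.

    Computes y = W x + (alpha / rank) * B A dropout(x), following the
    LoRA paper; not the exact mmpretrain implementation.
    """

    def __init__(self, original, alpha=1, rank=4, drop_rate=0.0):
        super().__init__()
        self.original = original
        for param in self.original.parameters():
            param.requires_grad = False  # pretrained weight stays frozen
        # Low-rank pair: A projects down to `rank`, B projects back up.
        self.lora_down = nn.Linear(original.in_features, rank, bias=False)
        self.lora_up = nn.Linear(rank, original.out_features, bias=False)
        nn.init.kaiming_uniform_(self.lora_down.weight)
        nn.init.zeros_(self.lora_up.weight)  # adapter starts as a no-op
        self.dropout = nn.Dropout(drop_rate)
        self.scaling = alpha / rank

    def forward(self, x):
        return self.original(x) + self.scaling * self.lora_up(
            self.lora_down(self.dropout(x)))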
Examples
>>> model = LoRAModel(
...     module=dict(type='VisionTransformer', arch='b'),
...     alpha=4,
...     rank=4,
...     drop_rate=0.1,
...     targets=[
...         dict(type='.*qkv'),  # regular expression
...         dict(type='proj', alpha=8, rank=8),  # suffix
...     ])
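In the example, the second target carries its own alpha and rank, which override the top-level values for layers matched by the proj suffix. The wrapper can also be built through the config system; a minimal sketch, assuming LoRAModel is registered under its class name in the standard MODELS registry:

from mmpretrain.registry import MODELS

# Equivalent construction via a config dict.
cfg = dict(
    type='LoRAModel',
    module=dict(type='VisionTransformer', arch='b'),
    alpha=4,
    rank=4,
    drop_rate=0.1,
    targets=[dict(type='.*qkv')],
)
model = MODELS.build(cfg)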