Commit 8eabf11

updated lora
Parent: 09f52fb

File tree: 2 files changed (+5, -3 lines)


config.yaml

Lines changed: 2 additions & 1 deletion
@@ -17,5 +17,6 @@ training:
   save_frequency: 100
   warmup_epochs: 5
   use_lora: true
-  use_lora_layers: true
+  use_lora_layers: true  # applies LoRA only to the bbox prediction head and a few transformer decoder layers; trainable parameters are then < 1% of the total
+  # if false, LoRA is applied via add_lora_to_model (query/key/value/dense projections and bbox_embed)
   visualization_frequency: 5
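
How these two flags are consumed is not shown in the diff. Below is a minimal sketch of the dispatch logic, assuming the training script loads config.yaml and calls the two helpers changed in this commit; the function apply_lora and the exact config-reading code are assumptions, not part of the commit:

import yaml

from groundingdino.util.lora import add_lora_to_layers, add_lora_to_model

def apply_lora(model, config_path="config.yaml"):
    # Hypothetical dispatch based on the flags in the `training:` section above.
    with open(config_path) as f:
        cfg = yaml.safe_load(f)["training"]
    if not cfg.get("use_lora", False):
        return model  # LoRA disabled entirely
    if cfg.get("use_lora_layers", False):
        # Lightweight path: adapt only the bbox-prediction MLP layers
        # (< 1% of parameters trainable, per the config comment).
        return add_lora_to_layers(model, rank=32)
    # Broader path: adapt the attention projections plus bbox_embed.
    return add_lora_to_model(model, rank=8)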

groundingdino/util/lora.py

Lines changed: 3 additions & 2 deletions
@@ -64,7 +64,8 @@ def add_lora_to_model(model, rank=8):
             "query",
             "key",
             "value",
-            "dense"
+            "dense",
+            "bbox_embed",
         ],
         lora_dropout=0.1,
         bias="none",
@@ -102,7 +103,7 @@ def add_lora_to_layers(model, rank=32):
         r=rank,
         lora_alpha=rank,
         target_modules=["0", "1", "2"],  # MLP layer indices
-        lora_dropout=0.1,
+        lora_dropout=1e-3,
         bias="none",
     )
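
The keyword arguments in these hunks (r, lora_alpha, target_modules, lora_dropout, bias) match Hugging Face PEFT's LoraConfig, so the helpers plausibly wrap that library. A sketch of add_lora_to_model as it would look after this commit, with everything outside the hunk inferred rather than taken from the source:

from peft import LoraConfig, get_peft_model

def add_lora_to_model(model, rank=8):
    # Low-rank adapters on the attention projections plus, as of this
    # commit, the bbox_embed prediction head.
    config = LoraConfig(
        r=rank,
        lora_alpha=rank,
        target_modules=[
            "query",
            "key",
            "value",
            "dense",
            "bbox_embed",  # newly added in this commit
        ],
        lora_dropout=0.1,
        bias="none",
    )
    return get_peft_model(model, config)

Adding "bbox_embed" to target_modules means the box-regression head is adapted alongside attention, which is consistent with the config.yaml comment about the bbox prediction layer.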
