
Fix auxiliary loss related code in transformers #28406

Status: Merged (53 commits, merged Jan 19, 2024)

Commits
4fc204f [DETA] fix freeze/unfreeze function (SangbumChoi, Dec 5, 2023)
97e3d23 Update src/transformers/models/deta/modeling_deta.py (SangbumChoi, Dec 6, 2023)
933119e Update src/transformers/models/deta/modeling_deta.py (SangbumChoi, Dec 6, 2023)
a583fa7 add freeze/unfreeze test case in DETA (SangbumChoi, Dec 8, 2023)
fa83da9 Merge branch 'main' of https://github.com/SangbumChoi/transformers in… (SangbumChoi, Dec 8, 2023)
3407bd1 fix type (SangbumChoi, Dec 8, 2023)
921c7d1 fix typo 2 (SangbumChoi, Dec 8, 2023)
df490fc Merge branch 'huggingface:main' into main (SangbumChoi, Dec 11, 2023)
d727a49 fix : enable aux and enc loss in training pipeline (SangbumChoi, Dec 12, 2023)
beab307 Merge branch 'huggingface:main' into main (SangbumChoi, Dec 12, 2023)
cd351da Add unsynced variables from original DETA for training (SangbumChoi, Dec 13, 2023)
954da10 Merge branch 'main' of https://github.com/SangbumChoi/transformers in… (SangbumChoi, Dec 13, 2023)
ad8aa0e Merge branch 'huggingface:main' into main (SangbumChoi, Dec 13, 2023)
fae8b0a Merge branch 'main' of https://github.com/SangbumChoi/transformers in… (SangbumChoi, Dec 13, 2023)
c38cba8 modification for passing CI test (SangbumChoi, Dec 13, 2023)
ddd3f6d make style (SangbumChoi, Dec 13, 2023)
15409c3 make fix (SangbumChoi, Dec 13, 2023)
b2ea2b3 manual make fix (SangbumChoi, Dec 13, 2023)
92280f5 change deta_modeling_test of configuration 'two_stage' default to TRU… (SangbumChoi, Dec 13, 2023)
5429755 remove print (SangbumChoi, Dec 13, 2023)
db0f225 divide configuration in DetaModel and DetaForObjectDetection (SangbumChoi, Dec 13, 2023)
483e9fc image smaller size than 224 will give topk error (SangbumChoi, Dec 14, 2023)
153e8b1 pred_boxes and logits should be equivalent to two_stage_num_proposals (SangbumChoi, Dec 14, 2023)
b537c2a add missing part in DetaConfig (SangbumChoi, Dec 14, 2023)
d3a6cce Merge branch 'huggingface:main' into main (SangbumChoi, Dec 15, 2023)
a86d762 Update src/transformers/models/deta/modeling_deta.py (SangbumChoi, Dec 20, 2023)
4e8b9f7 add docstring in configure and prettify TO DO part (SangbumChoi, Dec 20, 2023)
7895b7e Merge branch 'main' of https://github.com/SangbumChoi/transformers in… (SangbumChoi, Dec 20, 2023)
0138a74 Merge branch 'huggingface:main' into main (SangbumChoi, Dec 29, 2023)
526a8b0 change distribute related code to accelerate (SangbumChoi, Jan 2, 2024)
c7b9818 Update src/transformers/models/deta/configuration_deta.py (SangbumChoi, Jan 5, 2024)
8fe68e4 Update tests/models/deta/test_modeling_deta.py (SangbumChoi, Jan 5, 2024)
e760c4a protect importing accelerate (SangbumChoi, Jan 5, 2024)
2386d73 change variable name to specific value (SangbumChoi, Jan 5, 2024)
5643284 wrong import (SangbumChoi, Jan 5, 2024)
94960c2 Merge branch 'huggingface:main' into main (SangbumChoi, Jan 7, 2024)
1ea2af7 Merge branch 'huggingface:main' into main (SangbumChoi, Jan 9, 2024)
4a5cc44 fix aux_loss in conditional_detr (SangbumChoi, Jan 9, 2024)
10a6e10 add test aux_loss (SangbumChoi, Jan 9, 2024)
6fb27f5 add aux_loss test in deta and table_transformer (SangbumChoi, Jan 9, 2024)
5cd3e2f fix yolos since it doesn't have auxiliary function (SangbumChoi, Jan 9, 2024)
0d8a0f2 fix maskformer auxiliary_loss related code (SangbumChoi, Jan 9, 2024)
2c7fb3a make style (SangbumChoi, Jan 9, 2024)
98f9f0b change param 'auxiliary_loss' to 'use_auxiliary_loss' (SangbumChoi, Jan 9, 2024)
34f70e6 change param 'auxiliary_loss' to 'use_auxiliary_loss' in tests (SangbumChoi, Jan 9, 2024)
a1127e2 make style & fix-copies, also revert yolos related parameter (SangbumChoi, Jan 10, 2024)
3cdbe9e revert variable name 'use_auxiliary_loss' to 'auxiliary_loss' due to … (SangbumChoi, Jan 10, 2024)
bfbdf1a Merge branch 'huggingface:main' into fix_aux_loss (SangbumChoi, Jan 11, 2024)
4384702 revert variable name in yolos (SangbumChoi, Jan 12, 2024)
a0f1a09 revert maskformer (SangbumChoi, Jan 12, 2024)
8cfcde6 add aux_loss test in maskformer (SangbumChoi, Jan 12, 2024)
1a44d67 make style (SangbumChoi, Jan 12, 2024)
01e7864 Update src/transformers/models/yolos/configuration_yolos.py (SangbumChoi, Jan 19, 2024)
Files changed

src/transformers/models/conditional_detr/modeling_conditional_detr.py
```diff
@@ -1874,8 +1874,8 @@ def forward(
             intermediate = outputs.intermediate_hidden_states if return_dict else outputs[4]
             outputs_class = self.class_labels_classifier(intermediate)

-            for lvl in range(hs.shape[0]):
-                tmp = self.bbox_predictor(hs[lvl])
+            for lvl in range(intermediate.shape[0]):
+                tmp = self.bbox_predictor(intermediate[lvl])
                 tmp[..., :2] += reference_before_sigmoid
                 outputs_coord = tmp.sigmoid()
                 outputs_coords.append(outputs_coord)
```
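`hs` was never defined in this scope (it looks like a leftover name from the reference implementation), so enabling `config.auxiliary_loss` previously raised a `NameError`. The corrected loop walks `intermediate_hidden_states`, which stacks one output per decoder layer. A minimal shape sketch of what the loop computes, with invented sizes and a plain `Linear` standing in for the real box-prediction MLP:

```python
import torch

# intermediate_hidden_states stacks every decoder layer's output, so its
# first dimension is the number of decoder layers (sizes here are made up)
num_layers, batch, num_queries, d_model = 6, 2, 300, 256
intermediate = torch.randn(num_layers, batch, num_queries, d_model)

bbox_predictor = torch.nn.Linear(d_model, 4)   # stand-in for the real MLP head
reference_before_sigmoid = torch.zeros(batch, num_queries, 2)

outputs_coords = []
for lvl in range(intermediate.shape[0]):       # one prediction set per decoder layer
    tmp = bbox_predictor(intermediate[lvl])    # (batch, num_queries, 4)
    tmp[..., :2] += reference_before_sigmoid   # offset cx, cy by the reference points
    outputs_coords.append(tmp.sigmoid())

assert len(outputs_coords) == num_layers
```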
```diff
@@ -2118,9 +2118,9 @@ def forward(
             outputs_loss["pred_masks"] = pred_masks
             if self.config.auxiliary_loss:
                 intermediate = decoder_outputs.intermediate_hidden_states if return_dict else decoder_outputs[-1]
-                outputs_class = self.class_labels_classifier(intermediate)
-                outputs_coord = self.bbox_predictor(intermediate).sigmoid()
-                auxiliary_outputs = self._set_aux_loss(outputs_class, outputs_coord)
+                outputs_class = self.conditional_detr.class_labels_classifier(intermediate)
+                outputs_coord = self.conditional_detr.bbox_predictor(intermediate).sigmoid()
+                auxiliary_outputs = self.conditional_detr._set_aux_loss(outputs_class, outputs_coord)
                 outputs_loss["auxiliary_outputs"] = auxiliary_outputs

             loss_dict = criterion(outputs_loss, labels)
```
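This second hunk fixes an `AttributeError`: `ConditionalDetrForSegmentation` wraps the detection model as a submodule rather than inheriting from it, so the classification and box heads only exist on `self.conditional_detr`. A quick structural check, as a sketch (randomly initialized; assumes `timm` for the default backbone, with pretrained weights disabled to avoid a download):

```python
from transformers import ConditionalDetrConfig, ConditionalDetrForSegmentation

# we only inspect the module structure here
model = ConditionalDetrForSegmentation(ConditionalDetrConfig(use_pretrained_backbone=False))

assert hasattr(model, "conditional_detr")                          # wrapped detection model
assert hasattr(model.conditional_detr, "class_labels_classifier")  # heads live here...
assert hasattr(model.conditional_detr, "bbox_predictor")
assert not hasattr(model, "class_labels_classifier")               # ...not on the wrapper
```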
tests/models/conditional_detr/test_modeling_conditional_detr.py (16 additions, 0 deletions)
```diff
@@ -399,6 +399,22 @@ def test_retain_grad_hidden_states_attentions(self):
         self.assertIsNotNone(decoder_attentions.grad)
         self.assertIsNotNone(cross_attentions.grad)

+    def test_forward_auxiliary_loss(self):
+        config, inputs_dict = self.model_tester.prepare_config_and_inputs_for_common()
+        config.auxiliary_loss = True
+
+        # only test for object detection and segmentation model
+        for model_class in self.all_model_classes[1:]:
+            model = model_class(config)
+            model.to(torch_device)
+
+            inputs = self._prepare_for_class(inputs_dict, model_class, return_labels=True)
+
+            outputs = model(**inputs)
+
+            self.assertIsNotNone(outputs.auxiliary_outputs)
+            self.assertEqual(len(outputs.auxiliary_outputs), self.model_tester.num_hidden_layers - 1)
+
     def test_forward_signature(self):
         config, _ = self.model_tester.prepare_config_and_inputs_for_common()
```
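The same path can be exercised end to end outside the test suite; a rough sketch with randomly initialized weights (again assuming `timm` for the default backbone):

```python
import torch
from transformers import ConditionalDetrConfig, ConditionalDetrForObjectDetection

config = ConditionalDetrConfig(auxiliary_loss=True,        # turn on deep supervision
                               use_pretrained_backbone=False)
model = ConditionalDetrForObjectDetection(config)

pixel_values = torch.randn(1, 3, 224, 224)
# DETR-style targets: one dict per image with class indices and normalized boxes
labels = [{"class_labels": torch.tensor([0]),
           "boxes": torch.tensor([[0.5, 0.5, 0.2, 0.2]])}]

outputs = model(pixel_values=pixel_values, labels=labels)
print(outputs.loss)                    # total loss, auxiliary terms included
print(len(outputs.auxiliary_outputs))  # one entry per intermediate decoder layer
```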
tests/models/deformable_detr/test_modeling_deformable_detr.py (16 additions, 0 deletions)
```diff
@@ -476,6 +476,22 @@ def test_retain_grad_hidden_states_attentions(self):
         self.assertIsNotNone(decoder_attentions.grad)
         self.assertIsNotNone(cross_attentions.grad)

+    def test_forward_auxiliary_loss(self):
+        config, inputs_dict = self.model_tester.prepare_config_and_inputs_for_common()
+        config.auxiliary_loss = True
+
+        # only test for object detection and segmentation model
+        for model_class in self.all_model_classes[1:]:
+            model = model_class(config)
+            model.to(torch_device)
+
+            inputs = self._prepare_for_class(inputs_dict, model_class, return_labels=True)
+
+            outputs = model(**inputs)
+
+            self.assertIsNotNone(outputs.auxiliary_outputs)
+            self.assertEqual(len(outputs.auxiliary_outputs), self.model_tester.num_hidden_layers - 1)
+
     def test_forward_signature(self):
         config, _ = self.model_tester.prepare_config_and_inputs_for_common()
```
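The `num_hidden_layers - 1` expectation in these tests follows from how the DETR-family `_set_aux_loss` helpers are written: the final decoder layer's predictions are returned as the main `logits`/`pred_boxes`, so only the intermediate layers become auxiliary entries. Roughly (a simplified sketch; exact keys and decorators vary slightly per model):

```python
def _set_aux_loss(outputs_class, outputs_coord):
    # outputs_class / outputs_coord are stacked per decoder layer; drop the
    # last layer because its predictions are the model's primary outputs,
    # leaving num_decoder_layers - 1 auxiliary prediction dicts
    return [
        {"logits": a, "pred_boxes": b}
        for a, b in zip(outputs_class[:-1], outputs_coord[:-1])
    ]
```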
tests/models/deta/test_modeling_deta.py (16 additions, 0 deletions)
```diff
@@ -449,6 +449,22 @@ def test_retain_grad_hidden_states_attentions(self):
         self.assertIsNotNone(decoder_attentions.grad)
         self.assertIsNotNone(cross_attentions.grad)

+    def test_forward_auxiliary_loss(self):
+        config, inputs_dict = self.model_tester.prepare_config_and_inputs_for_common()
+        config.auxiliary_loss = True
+
+        # only test for object detection and segmentation model
+        for model_class in self.all_model_classes[1:]:
+            model = model_class(config)
+            model.to(torch_device)
+
+            inputs = self._prepare_for_class(inputs_dict, model_class, return_labels=True)
+
+            outputs = model(**inputs)
+
+            self.assertIsNotNone(outputs.auxiliary_outputs)
+            self.assertEqual(len(outputs.auxiliary_outputs), self.model_tester.num_hidden_layers - 1)
+
     def test_forward_signature(self):
         config, _ = self.model_tester.prepare_config_and_inputs_for_common()
```
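Several DETA commits above ("image smaller size than 224 will give topk error", "pred_boxes and logits should be equivalent to two_stage_num_proposals") concern two-stage proposal selection, where `torch.topk` fails if the encoder yields fewer candidates than the requested proposal count. A toy reproduction of that failure mode, with invented numbers:

```python
import torch

scores = torch.randn(1, 100)  # pretend a small image left only 100 proposals
k = 300                       # e.g. config.two_stage_num_proposals

try:
    torch.topk(scores, k=k, dim=1)            # k > 100 raises a RuntimeError
except RuntimeError as err:
    print(f"topk failed: {err}")

topk = torch.topk(scores, k=min(k, scores.shape[1]), dim=1)  # clamped k succeeds
```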
tests/models/maskformer/test_modeling_maskformer.py (18 additions, 0 deletions)
```diff
@@ -362,6 +362,24 @@ def test_retain_grad_hidden_states_attentions(self):
         self.assertIsNotNone(transformer_decoder_hidden_states.grad)
         self.assertIsNotNone(attentions.grad)

+    def test_forward_auxiliary_loss(self):
+        config, inputs_dict = self.model_tester.prepare_config_and_inputs_for_common()
+        config.use_auxiliary_loss = True
+        config.output_auxiliary_logits = True
+        config.output_hidden_states = True
+
+        # only test for object detection and segmentation model
+        for model_class in self.all_model_classes[1:]:
+            model = model_class(config)
+            model.to(torch_device)
+
+            inputs = self._prepare_for_class(inputs_dict, model_class, return_labels=True)
+
+            outputs = model(**inputs)
+
+            self.assertIsNotNone(outputs.auxiliary_logits)
+            self.assertEqual(len(outputs.auxiliary_logits), self.model_tester.num_channels - 1)
+

 TOLERANCE = 1e-4
```
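MaskFormer's test differs from the DETR-style ones because its flags are named differently: `use_auxiliary_loss` enables the loss, `output_auxiliary_logits` makes the model return the per-layer logits, and the output field is `auxiliary_logits` rather than `auxiliary_outputs`. A minimal configuration sketch (randomly initialized):

```python
from transformers import MaskFormerConfig, MaskFormerForInstanceSegmentation

# note the naming: use_auxiliary_loss / output_auxiliary_logits here, versus
# auxiliary_loss / auxiliary_outputs in the DETR-family models above
config = MaskFormerConfig(use_auxiliary_loss=True, output_auxiliary_logits=True)
model = MaskFormerForInstanceSegmentation(config)
```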
tests/models/table_transformer/test_modeling_table_transformer.py (16 additions, 0 deletions)
```diff
@@ -411,6 +411,22 @@ def test_retain_grad_hidden_states_attentions(self):
         self.assertIsNotNone(decoder_attentions.grad)
         self.assertIsNotNone(cross_attentions.grad)

+    def test_forward_auxiliary_loss(self):
+        config, inputs_dict = self.model_tester.prepare_config_and_inputs_for_common()
+        config.auxiliary_loss = True
+
+        # only test for object detection and segmentation model
+        for model_class in self.all_model_classes[1:]:
+            model = model_class(config)
+            model.to(torch_device)
+
+            inputs = self._prepare_for_class(inputs_dict, model_class, return_labels=True)
+
+            outputs = model(**inputs)
+
+            self.assertIsNotNone(outputs.auxiliary_outputs)
+            self.assertEqual(len(outputs.auxiliary_outputs), self.model_tester.num_hidden_layers - 1)
+
     def test_forward_signature(self):
         config, _ = self.model_tester.prepare_config_and_inputs_for_common()
```