---
library_name: transformers
license: apache-2.0
base_model: microsoft/conditional-detr-resnet-50
tags:
- generated_from_trainer
model-index:
- name: detr_finetuned_cppe5
  results: []
---

# detr_finetuned_cppe5

This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the CPPE-5 dataset.
It achieves the following COCO-style detection results on the evaluation set (a sketch of how such metrics can be computed follows the list):
- Loss: 1.3299
- Map: 0.0678
- Map 50: 0.1359
- Map 75: 0.0621
- Map Small: 0.0953
- Map Medium: 0.0947
- Map Large: 0.0707
- Mar 1: 0.123
- Mar 10: 0.3315
- Mar 100: 0.4317
- Mar Small: 0.4556
- Mar Medium: 0.3046
- Mar Large: 0.4536
- Map Coverall: 0.1683
- Mar 100 Coverall: 0.5644
- Map Face Shield: 0.02
- Mar 100 Face Shield: 0.2595
- Map Gloves: 0.0512
- Mar 100 Gloves: 0.4817
- Map Goggles: 0.0345
- Mar 100 Goggles: 0.3015
- Map Mask: 0.0647
- Mar 100 Mask: 0.5511
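
The values above follow the COCO detection protocol: mAP averaged over IoU thresholds 0.50:0.95, plus mAP@0.50, mAP@0.75, size-based breakdowns, and mAR at 1/10/100 detections per image. Below is a minimal sketch of computing these metrics with `torchmetrics`; it illustrates the metric definitions only and is not the exact evaluation code used for this card, and the prediction/target tensors are placeholders.

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# Placeholder predictions and ground truth in [xmin, ymin, xmax, ymax] format.
preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 110.0, 210.0]]),
    "scores": torch.tensor([0.86]),
    "labels": torch.tensor([2]),   # e.g. the "gloves" class id (assumed)
}]
targets = [{
    "boxes": torch.tensor([[12.0, 8.0, 108.0, 205.0]]),
    "labels": torch.tensor([2]),
}]

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)
metric.update(preds, targets)
results = metric.compute()

# Keys such as map, map_50, map_75, map_small, mar_1, mar_10, mar_100
# correspond to the rows reported in this card.
print({k: v for k, v in results.items() if k.startswith(("map", "mar"))})
```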

## Model description

This model is [Conditional DETR](https://huggingface.co/microsoft/conditional-detr-resnet-50), a DETR variant that uses conditional cross-attention for faster training convergence, with a ResNet-50 backbone. It was fine-tuned for object detection of five personal protective equipment categories: coverall, face shield, gloves, goggles, and mask.

## Intended uses & limitations

The model is intended for detecting the five PPE categories above in images (see the usage sketch below). Its overall evaluation mAP is 0.0678 (mAP@0.50 = 0.1359), so detections should be reviewed, and further training or hyperparameter tuning is likely needed before the model is suitable for production use.
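
A minimal inference sketch, using the standard `transformers` object-detection API. The checkpoint path and the image filename below are placeholders; replace them with the actual Hub repo id or local output directory.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "detr_finetuned_cppe5"  # placeholder: actual repo id or local path

image_processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg").convert("RGB")
inputs = image_processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) triples at the original image size.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = image_processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score.item():.2f} at {box.tolist()}")
```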

## Training and evaluation data

The model was fine-tuned and evaluated on CPPE-5, a dataset of images annotated with five medical personal protective equipment categories (coverall, face shield, gloves, goggles, mask). The exact splits and any preprocessing or augmentation used for this run are not recorded in this card.
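
A short sketch of loading the dataset with 🤗 Datasets, assuming it is available on the Hub under the id `cppe-5` (field names follow the Hub dataset card):

```python
from datasets import load_dataset

# Assumes the CPPE-5 dataset is hosted on the Hub under this id.
cppe5 = load_dataset("cppe-5")

print(cppe5)                          # splits and sizes
print(cppe5["train"][0]["objects"])   # bounding boxes and category labels for one image
```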

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
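
A minimal sketch of how these hyperparameters map onto the `transformers` `Trainer` configuration. The `output_dir` and any arguments not listed above (such as `remove_unused_columns`) are assumptions for illustration, not a record of the actual training run.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr_finetuned_cppe5",   # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                 # AdamW defaults: betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="cosine",
    num_train_epochs=30,
    # Assumed: DETR-style models with a custom collator typically need this.
    remove_unused_columns=False,
)
```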

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map    | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1  | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log        | 1.0   | 107  | 2.0850          | 0.0082 | 0.0334 | 0.0018 | 0.001     | 0.0046     | 0.009     | 0.0157 | 0.0658 | 0.1123  | 0.0333    | 0.0983     | 0.1162    | 0.0321       | 0.2923           | 0.0             | 0.0051              | 0.0021     | 0.1098         | 0.0         | 0.0             | 0.0067   | 0.1542       |
| No log        | 2.0   | 214  | 1.7616          | 0.0139 | 0.0413 | 0.0061 | 0.0005    | 0.047      | 0.0148    | 0.032  | 0.097  | 0.1704  | 0.0556    | 0.1232     | 0.1787    | 0.0492       | 0.3464           | 0.0             | 0.0                 | 0.0065     | 0.1768         | 0.0         | 0.0             | 0.0136   | 0.3289       |
| No log        | 3.0   | 321  | 1.8263          | 0.0213 | 0.068  | 0.0086 | 0.0047    | 0.0567     | 0.0241    | 0.036  | 0.1253 | 0.2014  | 0.1111    | 0.1304     | 0.2085    | 0.0659       | 0.5212           | 0.0001          | 0.0013              | 0.0152     | 0.1759         | 0.0         | 0.0             | 0.0252   | 0.3084       |
| No log        | 4.0   | 428  | 1.7623          | 0.0254 | 0.066  | 0.0157 | 0.0151    | 0.0437     | 0.0265    | 0.0439 | 0.1393 | 0.2158  | 0.1111    | 0.1517     | 0.2252    | 0.0757       | 0.4342           | 0.0             | 0.0                 | 0.0107     | 0.2241         | 0.0         | 0.0             | 0.0405   | 0.4209       |
| 2.7673        | 5.0   | 535  | 1.7402          | 0.0137 | 0.041  | 0.0051 | 0.005     | 0.0735     | 0.0146    | 0.0379 | 0.1097 | 0.1786  | 0.2556    | 0.1474     | 0.1831    | 0.046        | 0.3851           | 0.0012          | 0.0291              | 0.0021     | 0.1045         | 0.0         | 0.0             | 0.0191   | 0.3742       |
| 2.7673        | 6.0   | 642  | 1.6179          | 0.0256 | 0.0682 | 0.019  | 0.0089    | 0.0539     | 0.0273    | 0.0474 | 0.1269 | 0.2129  | 0.1778    | 0.141      | 0.2251    | 0.0673       | 0.423            | 0.001           | 0.0139              | 0.0022     | 0.1129         | 0.0074      | 0.0908          | 0.0503   | 0.424        |
| 2.7673        | 7.0   | 749  | 1.6199          | 0.0292 | 0.0672 | 0.0226 | 0.0053    | 0.0555     | 0.031     | 0.0546 | 0.145  | 0.2148  | 0.1333    | 0.1666     | 0.2271    | 0.0855       | 0.3883           | 0.0109          | 0.0139              | 0.0042     | 0.1357         | 0.0084      | 0.1015          | 0.0369   | 0.4347       |
| 2.7673        | 8.0   | 856  | 1.5755          | 0.029  | 0.0672 | 0.0219 | 0.0116    | 0.0722     | 0.0308    | 0.0558 | 0.1596 | 0.2416  | 0.2222    | 0.2017     | 0.2514    | 0.0858       | 0.4532           | 0.0026          | 0.0835              | 0.0036     | 0.1219         | 0.0095      | 0.1062          | 0.0436   | 0.4431       |
| 2.7673        | 9.0   | 963  | 1.5618          | 0.0338 | 0.0765 | 0.0277 | 0.0203    | 0.0719     | 0.0345    | 0.0474 | 0.1673 | 0.2511  | 0.2667    | 0.1616     | 0.2654    | 0.1082       | 0.5171           | 0.0002          | 0.0051              | 0.0198     | 0.2263         | 0.0047      | 0.0723          | 0.0362   | 0.4347       |
| 1.5478        | 10.0  | 1070 | 1.5111          | 0.0384 | 0.087  | 0.0319 | 0.065     | 0.0612     | 0.0407    | 0.0642 | 0.1982 | 0.2903  | 0.3222    | 0.1896     | 0.3069    | 0.0963       | 0.5284           | 0.0026          | 0.0392              | 0.0341     | 0.2871         | 0.0137      | 0.16            | 0.0453   | 0.4369       |
| 1.5478        | 11.0  | 1177 | 1.5122          | 0.0403 | 0.0922 | 0.0333 | 0.1083    | 0.072      | 0.0431    | 0.0729 | 0.2024 | 0.2958  | 0.4333    | 0.213      | 0.3086    | 0.1113       | 0.545            | 0.0052          | 0.0494              | 0.0333     | 0.2897         | 0.0142      | 0.1646          | 0.0371   | 0.4302       |
| 1.5478        | 12.0  | 1284 | 1.5083          | 0.0443 | 0.0985 | 0.0355 | 0.0534    | 0.0833     | 0.0451    | 0.0801 | 0.2004 | 0.2943  | 0.2333    | 0.2097     | 0.3095    | 0.1221       | 0.5441           | 0.0008          | 0.0114              | 0.0404     | 0.3433         | 0.0108      | 0.12            | 0.0475   | 0.4529       |
| 1.5478        | 13.0  | 1391 | 1.5374          | 0.0397 | 0.0859 | 0.0314 | 0.0861    | 0.0688     | 0.0419    | 0.0689 | 0.2201 | 0.3134  | 0.2889    | 0.2209     | 0.3291    | 0.1162       | 0.5405           | 0.003           | 0.0671              | 0.0297     | 0.3187         | 0.011       | 0.1908          | 0.0389   | 0.4498       |
| 1.5478        | 14.0  | 1498 | 1.5145          | 0.0449 | 0.1019 | 0.0344 | 0.0438    | 0.0794     | 0.0454    | 0.0684 | 0.2017 | 0.2897  | 0.3222    | 0.2005     | 0.3029    | 0.1423       | 0.5784           | 0.0009          | 0.0316              | 0.0356     | 0.2991         | 0.0089      | 0.1138          | 0.0367   | 0.4253       |
| 1.4217        | 15.0  | 1605 | 1.4832          | 0.0488 | 0.1141 | 0.0366 | 0.0474    | 0.0845     | 0.052     | 0.0798 | 0.2294 | 0.3337  | 0.3778    | 0.2367     | 0.3474    | 0.1426       | 0.5577           | 0.0033          | 0.0759              | 0.0419     | 0.4018         | 0.0214      | 0.2108          | 0.0347   | 0.4222       |
| 1.4217        | 16.0  | 1712 | 1.4627          | 0.047  | 0.1086 | 0.0344 | 0.0654    | 0.0892     | 0.0471    | 0.0896 | 0.2317 | 0.3142  | 0.3111    | 0.2416     | 0.3254    | 0.1408       | 0.5464           | 0.0053          | 0.1025              | 0.0303     | 0.3022         | 0.0119      | 0.1508          | 0.0466   | 0.4689       |
| 1.4217        | 17.0  | 1819 | 1.4152          | 0.0548 | 0.1195 | 0.0462 | 0.1189    | 0.0943     | 0.0558    | 0.097  | 0.2685 | 0.3578  | 0.3556    | 0.2809     | 0.371     | 0.1493       | 0.5234           | 0.0082          | 0.119               | 0.0401     | 0.3884         | 0.0155      | 0.2123          | 0.061    | 0.5458       |
| 1.4217        | 18.0  | 1926 | 1.4076          | 0.0506 | 0.1097 | 0.0425 | 0.054     | 0.0801     | 0.051     | 0.1038 | 0.2696 | 0.3692  | 0.3333    | 0.2823     | 0.3858    | 0.1389       | 0.5671           | 0.0109          | 0.1544              | 0.0293     | 0.3746         | 0.0142      | 0.24            | 0.0597   | 0.5098       |
| 1.3356        | 19.0  | 2033 | 1.3879          | 0.0576 | 0.1206 | 0.0504 | 0.1236    | 0.0895     | 0.0598    | 0.1222 | 0.2867 | 0.3818  | 0.3556    | 0.2535     | 0.4032    | 0.1396       | 0.5689           | 0.0161          | 0.1506              | 0.0359     | 0.4071         | 0.0333      | 0.2692          | 0.0629   | 0.5129       |
| 1.3356        | 20.0  | 2140 | 1.3471          | 0.064  | 0.1333 | 0.0581 | 0.1344    | 0.0948     | 0.0664    | 0.1268 | 0.3056 | 0.3965  | 0.3889    | 0.2996     | 0.4138    | 0.1558       | 0.5532           | 0.0217          | 0.1911              | 0.0491     | 0.4554         | 0.0253      | 0.2569          | 0.068    | 0.5258       |
| 1.3356        | 21.0  | 2247 | 1.3539          | 0.0652 | 0.1366 | 0.0549 | 0.1063    | 0.0885     | 0.0679    | 0.1163 | 0.3217 | 0.4142  | 0.4333    | 0.2995     | 0.4336    | 0.1572       | 0.5802           | 0.0178          | 0.2051              | 0.0595     | 0.4826         | 0.0254      | 0.2631          | 0.0663   | 0.54         |
| 1.3356        | 22.0  | 2354 | 1.3476          | 0.0653 | 0.1307 | 0.0583 | 0.0702    | 0.0963     | 0.0675    | 0.1116 | 0.3069 | 0.4026  | 0.4556    | 0.2928     | 0.4192    | 0.1678       | 0.568            | 0.0145          | 0.2013              | 0.0565     | 0.4848         | 0.0269      | 0.2354          | 0.061    | 0.5236       |
| 1.3356        | 23.0  | 2461 | 1.3519          | 0.064  | 0.1276 | 0.0584 | 0.0643    | 0.0999     | 0.0666    | 0.1169 | 0.3317 | 0.4235  | 0.4222    | 0.3046     | 0.4442    | 0.1609       | 0.5527           | 0.0178          | 0.2544              | 0.05       | 0.4674         | 0.0292      | 0.3             | 0.062    | 0.5431       |
| 1.2549        | 24.0  | 2568 | 1.3419          | 0.065  | 0.1327 | 0.058  | 0.0643    | 0.1002     | 0.0678    | 0.118  | 0.3265 | 0.4157  | 0.4222    | 0.295      | 0.4359    | 0.161        | 0.5833           | 0.0174          | 0.238               | 0.0546     | 0.4679         | 0.0314      | 0.26            | 0.0608   | 0.5293       |
| 1.2549        | 25.0  | 2675 | 1.3354          | 0.0687 | 0.1351 | 0.0617 | 0.0811    | 0.0904     | 0.0725    | 0.1241 | 0.3242 | 0.4217  | 0.4667    | 0.2988     | 0.4422    | 0.1643       | 0.5779           | 0.0179          | 0.2468              | 0.0521     | 0.4723         | 0.044       | 0.2646          | 0.0651   | 0.5467       |
| 1.2549        | 26.0  | 2782 | 1.3312          | 0.0679 | 0.1364 | 0.0626 | 0.083     | 0.0894     | 0.0711    | 0.1246 | 0.329  | 0.4268  | 0.4556    | 0.3039     | 0.4472    | 0.1683       | 0.5676           | 0.0201          | 0.2544              | 0.0523     | 0.4857         | 0.0349      | 0.2862          | 0.0641   | 0.54         |
| 1.2549        | 27.0  | 2889 | 1.3333          | 0.0667 | 0.1347 | 0.0615 | 0.0919    | 0.0904     | 0.0695    | 0.1237 | 0.3287 | 0.4308  | 0.4667    | 0.3055     | 0.4528    | 0.1653       | 0.568            | 0.019           | 0.2481              | 0.0513     | 0.4817         | 0.0315      | 0.3031          | 0.0666   | 0.5529       |
| 1.2549        | 28.0  | 2996 | 1.3306          | 0.0666 | 0.1343 | 0.0609 | 0.098     | 0.0899     | 0.0692    | 0.1248 | 0.3286 | 0.4284  | 0.4667    | 0.3021     | 0.4502    | 0.1666       | 0.5635           | 0.0188          | 0.2468              | 0.051      | 0.4777         | 0.0315      | 0.3             | 0.0652   | 0.5538       |
| 1.2213        | 29.0  | 3103 | 1.3300          | 0.0678 | 0.1358 | 0.0618 | 0.0955    | 0.0948     | 0.0707    | 0.1238 | 0.3319 | 0.432   | 0.4556    | 0.3046     | 0.454     | 0.1683       | 0.5649           | 0.0201          | 0.2595              | 0.0511     | 0.4826         | 0.0345      | 0.3015          | 0.0648   | 0.5516       |
| 1.2213        | 30.0  | 3210 | 1.3299          | 0.0678 | 0.1359 | 0.0621 | 0.0953    | 0.0947     | 0.0707    | 0.123  | 0.3315 | 0.4317  | 0.4556    | 0.3046     | 0.4536    | 0.1683       | 0.5644           | 0.02            | 0.2595              | 0.0512     | 0.4817         | 0.0345      | 0.3015          | 0.0647   | 0.5511       |


### Framework versions

- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0