model_id | model_card | model_labels
---|---|---|
Taki3d/CrackDetectionLowRes
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# CrackDetectionLowRes
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on a dataset loaded in `imagefolder` format.
It achieves the following results on the evaluation set:
- Accuracy: 0.9940
- Loss: 0.0183
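For quick reference, a minimal inference sketch (not part of the original card; it assumes the checkpoint is public on the Hub and that `transformers` and `Pillow` are installed):

```python
# Minimal inference sketch -- an assumption-laden example, not the
# authors' documented usage. The image path below is hypothetical.
from transformers import pipeline

classifier = pipeline("image-classification", model="Taki3d/CrackDetectionLowRes")
predictions = classifier("surface_photo.jpg")  # hypothetical input image
print(predictions)  # e.g. [{'label': 'crack', 'score': ...}, {'label': 'noncrack', 'score': ...}]
```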
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5.0
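As a sketch, these settings map onto the `transformers` Trainer roughly as follows (the card does not include the actual training script, so the values are transcribed, not verified):

```python
# Reconstruction of the listed hyperparameters as TrainingArguments.
# A sketch only -- not the authors' exact code.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="CrackDetectionLowRes",  # hypothetical output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    lr_scheduler_type="linear",
    num_train_epochs=5.0,
    # Adam betas=(0.9, 0.999) and epsilon=1e-8 are the Trainer defaults.
)
```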
### Training results
| Training Loss | Epoch | Step | Accuracy | Validation Loss |
|:-------------:|:-----:|:----:|:--------:|:---------------:|
| 0.0126 | 1.0 | 952 | 0.9879 | 0.0344 |
| 0.0788 | 2.0 | 1904 | 0.9933 | 0.0220 |
| 0.1336 | 3.0 | 2856 | 0.9933 | 0.0222 |
| 0.0066 | 4.0 | 3808 | 0.9933 | 0.0190 |
| 0.0528 | 5.0 | 4760 | 0.9940 | 0.0183 |
### Framework versions
- Transformers 4.31.0.dev0
- Pytorch 2.0.1+cpu
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"crack",
"noncrack"
] |
jordyvl/vit-tiny_tobacco3482_kd_NKD_t1.0_g1.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_tobacco3482_kd_NKD_t1.0_g1.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 3.6948
- Accuracy: 0.85
- Brier Loss: 0.2427
- NLL: 1.2265
- F1 Micro: 0.85
- F1 Macro: 0.8401
- ECE: 0.1325
- AURC: 0.0510
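Brier loss and ECE measure probability quality rather than raw accuracy; as a sketch, they can be computed from predicted class probabilities like this (NumPy, assuming one-hot targets and 10 equal-width confidence bins -- the card does not state the binning it used):

```python
import numpy as np

def brier_loss(probs, labels):
    """Mean squared error between predicted probabilities and one-hot targets."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=10):
    """Gap between confidence and accuracy, averaged over confidence bins."""
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    ece = 0.0
    for lo in np.linspace(0.0, 1.0, n_bins, endpoint=False):
        mask = (conf > lo) & (conf <= lo + 1.0 / n_bins)
        if mask.any():
            ece += mask.mean() * abs((pred[mask] == labels[mask]).mean() - conf[mask].mean())
    return ece

# toy check
probs = np.array([[0.9, 0.1], [0.3, 0.7]])
labels = np.array([0, 1])
print(brier_loss(probs, labels), expected_calibration_error(probs, labels))
```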
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
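With `lr_scheduler_warmup_ratio: 0.1`, the learning rate climbs linearly over the first 10% of optimizer steps and then decays linearly to zero. Outside the Trainer, the equivalent schedule looks roughly like this (step counts taken from the table below; the model is a placeholder):

```python
# Equivalent warmup-then-decay schedule -- a sketch, not the card's script.
import torch
from transformers import get_linear_schedule_with_warmup

model = torch.nn.Linear(10, 10)              # placeholder model for illustration
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4,
                             betas=(0.9, 0.999), eps=1e-8)
total_steps = 700                            # 7 steps/epoch x 100 epochs, per the table
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=int(0.1 * total_steps), # lr_scheduler_warmup_ratio: 0.1
    num_training_steps=total_steps,
)
```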
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 6.1834 | 0.09 | 1.0218 | 8.8168 | 0.09 | 0.0691 | 0.3227 | 0.8979 |
| No log | 2.0 | 14 | 5.0430 | 0.14 | 0.9049 | 8.5503 | 0.14 | 0.1102 | 0.2495 | 0.7963 |
| No log | 3.0 | 21 | 4.6038 | 0.26 | 0.8385 | 6.2625 | 0.26 | 0.1872 | 0.2701 | 0.5801 |
| No log | 4.0 | 28 | 4.3342 | 0.475 | 0.7283 | 4.0652 | 0.4750 | 0.3906 | 0.3309 | 0.3230 |
| No log | 5.0 | 35 | 4.0844 | 0.575 | 0.6168 | 2.7058 | 0.575 | 0.4691 | 0.2963 | 0.2299 |
| No log | 6.0 | 42 | 3.9551 | 0.67 | 0.5721 | 2.2402 | 0.67 | 0.5531 | 0.3412 | 0.1586 |
| No log | 7.0 | 49 | 3.8114 | 0.695 | 0.4999 | 1.8621 | 0.695 | 0.5789 | 0.3028 | 0.1342 |
| No log | 8.0 | 56 | 3.7627 | 0.77 | 0.4472 | 1.6313 | 0.7700 | 0.7306 | 0.3170 | 0.0894 |
| No log | 9.0 | 63 | 3.7510 | 0.75 | 0.4223 | 1.4350 | 0.75 | 0.6574 | 0.2576 | 0.1046 |
| No log | 10.0 | 70 | 3.6586 | 0.8 | 0.3576 | 1.5473 | 0.8000 | 0.7615 | 0.2754 | 0.0701 |
| No log | 11.0 | 77 | 3.6643 | 0.825 | 0.3327 | 1.4335 | 0.825 | 0.7921 | 0.2555 | 0.0670 |
| No log | 12.0 | 84 | 3.6561 | 0.79 | 0.3465 | 1.3859 | 0.79 | 0.7541 | 0.2165 | 0.0833 |
| No log | 13.0 | 91 | 3.6518 | 0.805 | 0.3209 | 1.0963 | 0.805 | 0.7716 | 0.1894 | 0.0706 |
| No log | 14.0 | 98 | 3.6401 | 0.84 | 0.3045 | 1.1974 | 0.8400 | 0.8290 | 0.2261 | 0.0551 |
| No log | 15.0 | 105 | 3.6253 | 0.79 | 0.3200 | 1.2187 | 0.79 | 0.7658 | 0.1973 | 0.0777 |
| No log | 16.0 | 112 | 3.6019 | 0.83 | 0.2920 | 1.0547 | 0.83 | 0.8050 | 0.1655 | 0.0632 |
| No log | 17.0 | 119 | 3.5622 | 0.84 | 0.2741 | 1.0769 | 0.8400 | 0.8228 | 0.1641 | 0.0636 |
| No log | 18.0 | 126 | 3.5921 | 0.835 | 0.2706 | 1.0996 | 0.835 | 0.8205 | 0.1810 | 0.0581 |
| No log | 19.0 | 133 | 3.5886 | 0.815 | 0.2773 | 1.2384 | 0.815 | 0.7975 | 0.1772 | 0.0643 |
| No log | 20.0 | 140 | 3.5798 | 0.84 | 0.2732 | 1.4226 | 0.8400 | 0.8236 | 0.1624 | 0.0672 |
| No log | 21.0 | 147 | 3.5683 | 0.82 | 0.2868 | 1.3978 | 0.82 | 0.7990 | 0.1629 | 0.0681 |
| No log | 22.0 | 154 | 3.5891 | 0.825 | 0.2893 | 1.4084 | 0.825 | 0.8024 | 0.1856 | 0.0666 |
| No log | 23.0 | 161 | 3.5484 | 0.82 | 0.2595 | 1.0782 | 0.82 | 0.7990 | 0.1477 | 0.0595 |
| No log | 24.0 | 168 | 3.5882 | 0.8 | 0.2686 | 1.0513 | 0.8000 | 0.7745 | 0.1334 | 0.0596 |
| No log | 25.0 | 175 | 3.5636 | 0.81 | 0.2774 | 1.1250 | 0.81 | 0.7838 | 0.1563 | 0.0662 |
| No log | 26.0 | 182 | 3.5478 | 0.82 | 0.2669 | 1.0724 | 0.82 | 0.8021 | 0.1318 | 0.0576 |
| No log | 27.0 | 189 | 3.5092 | 0.84 | 0.2487 | 1.0652 | 0.8400 | 0.8144 | 0.1500 | 0.0567 |
| No log | 28.0 | 196 | 3.5746 | 0.815 | 0.2810 | 1.3942 | 0.815 | 0.7997 | 0.1547 | 0.0682 |
| No log | 29.0 | 203 | 3.5633 | 0.835 | 0.2587 | 1.1580 | 0.835 | 0.8154 | 0.1476 | 0.0604 |
| No log | 30.0 | 210 | 3.5221 | 0.835 | 0.2591 | 0.9385 | 0.835 | 0.8180 | 0.1496 | 0.0584 |
| No log | 31.0 | 217 | 3.6263 | 0.83 | 0.2676 | 1.3260 | 0.83 | 0.8213 | 0.1670 | 0.0550 |
| No log | 32.0 | 224 | 3.5758 | 0.825 | 0.2855 | 1.3100 | 0.825 | 0.8082 | 0.1453 | 0.0637 |
| No log | 33.0 | 231 | 3.5836 | 0.84 | 0.2550 | 0.9703 | 0.8400 | 0.8268 | 0.1344 | 0.0521 |
| No log | 34.0 | 238 | 3.5466 | 0.825 | 0.2580 | 1.3118 | 0.825 | 0.8071 | 0.1553 | 0.0604 |
| No log | 35.0 | 245 | 3.5566 | 0.835 | 0.2574 | 1.1729 | 0.835 | 0.8134 | 0.1357 | 0.0592 |
| No log | 36.0 | 252 | 3.6022 | 0.83 | 0.2848 | 1.3337 | 0.83 | 0.8158 | 0.1528 | 0.0631 |
| No log | 37.0 | 259 | 3.5422 | 0.845 | 0.2598 | 1.2362 | 0.845 | 0.8311 | 0.1488 | 0.0588 |
| No log | 38.0 | 266 | 3.5677 | 0.825 | 0.2662 | 1.2509 | 0.825 | 0.8026 | 0.1417 | 0.0566 |
| No log | 39.0 | 273 | 3.5600 | 0.83 | 0.2673 | 1.2032 | 0.83 | 0.8128 | 0.1344 | 0.0561 |
| No log | 40.0 | 280 | 3.5818 | 0.82 | 0.2634 | 1.1062 | 0.82 | 0.8070 | 0.1389 | 0.0558 |
| No log | 41.0 | 287 | 3.5326 | 0.85 | 0.2520 | 1.3207 | 0.85 | 0.8404 | 0.1485 | 0.0529 |
| No log | 42.0 | 294 | 3.5954 | 0.83 | 0.2708 | 1.1103 | 0.83 | 0.8092 | 0.1401 | 0.0595 |
| No log | 43.0 | 301 | 3.5330 | 0.84 | 0.2528 | 1.2339 | 0.8400 | 0.8241 | 0.1405 | 0.0540 |
| No log | 44.0 | 308 | 3.5696 | 0.825 | 0.2654 | 1.1943 | 0.825 | 0.8079 | 0.1358 | 0.0544 |
| No log | 45.0 | 315 | 3.5438 | 0.83 | 0.2558 | 1.1267 | 0.83 | 0.8138 | 0.1314 | 0.0530 |
| No log | 46.0 | 322 | 3.5537 | 0.845 | 0.2497 | 1.2612 | 0.845 | 0.8298 | 0.1338 | 0.0525 |
| No log | 47.0 | 329 | 3.5609 | 0.85 | 0.2467 | 1.4284 | 0.85 | 0.8315 | 0.1333 | 0.0563 |
| No log | 48.0 | 336 | 3.5723 | 0.835 | 0.2595 | 1.1814 | 0.835 | 0.8187 | 0.1402 | 0.0545 |
| No log | 49.0 | 343 | 3.5591 | 0.825 | 0.2485 | 1.1736 | 0.825 | 0.8072 | 0.1429 | 0.0536 |
| No log | 50.0 | 350 | 3.5715 | 0.825 | 0.2585 | 1.3645 | 0.825 | 0.8098 | 0.1445 | 0.0564 |
| No log | 51.0 | 357 | 3.5813 | 0.83 | 0.2617 | 1.2375 | 0.83 | 0.8210 | 0.1371 | 0.0551 |
| No log | 52.0 | 364 | 3.6084 | 0.835 | 0.2592 | 1.2465 | 0.835 | 0.8168 | 0.1557 | 0.0550 |
| No log | 53.0 | 371 | 3.5574 | 0.84 | 0.2474 | 1.1932 | 0.8400 | 0.8255 | 0.1351 | 0.0543 |
| No log | 54.0 | 378 | 3.5863 | 0.85 | 0.2428 | 1.2885 | 0.85 | 0.8346 | 0.1347 | 0.0536 |
| No log | 55.0 | 385 | 3.5510 | 0.83 | 0.2520 | 1.2654 | 0.83 | 0.8163 | 0.1342 | 0.0543 |
| No log | 56.0 | 392 | 3.5516 | 0.835 | 0.2476 | 1.0430 | 0.835 | 0.8210 | 0.1336 | 0.0549 |
| No log | 57.0 | 399 | 3.5754 | 0.835 | 0.2475 | 1.3656 | 0.835 | 0.8245 | 0.1165 | 0.0528 |
| No log | 58.0 | 406 | 3.6017 | 0.83 | 0.2584 | 1.3561 | 0.83 | 0.8198 | 0.1490 | 0.0542 |
| No log | 59.0 | 413 | 3.5767 | 0.845 | 0.2488 | 1.2699 | 0.845 | 0.8357 | 0.1291 | 0.0521 |
| No log | 60.0 | 420 | 3.5844 | 0.835 | 0.2513 | 1.2919 | 0.835 | 0.8218 | 0.1286 | 0.0541 |
| No log | 61.0 | 427 | 3.5744 | 0.84 | 0.2443 | 1.2315 | 0.8400 | 0.8334 | 0.1441 | 0.0515 |
| No log | 62.0 | 434 | 3.5948 | 0.825 | 0.2505 | 1.2265 | 0.825 | 0.8052 | 0.1266 | 0.0539 |
| No log | 63.0 | 441 | 3.5833 | 0.845 | 0.2403 | 1.2410 | 0.845 | 0.8382 | 0.1268 | 0.0506 |
| No log | 64.0 | 448 | 3.6000 | 0.845 | 0.2451 | 1.2889 | 0.845 | 0.8282 | 0.1408 | 0.0526 |
| No log | 65.0 | 455 | 3.6050 | 0.84 | 0.2497 | 1.2870 | 0.8400 | 0.8298 | 0.1401 | 0.0534 |
| No log | 66.0 | 462 | 3.5950 | 0.86 | 0.2381 | 1.3004 | 0.8600 | 0.8491 | 0.1231 | 0.0511 |
| No log | 67.0 | 469 | 3.6030 | 0.85 | 0.2435 | 1.2246 | 0.85 | 0.8374 | 0.1252 | 0.0517 |
| No log | 68.0 | 476 | 3.6028 | 0.85 | 0.2433 | 1.2260 | 0.85 | 0.8404 | 0.1370 | 0.0513 |
| No log | 69.0 | 483 | 3.6112 | 0.86 | 0.2438 | 1.2148 | 0.8600 | 0.8487 | 0.1362 | 0.0522 |
| No log | 70.0 | 490 | 3.6147 | 0.85 | 0.2409 | 1.2230 | 0.85 | 0.8401 | 0.1258 | 0.0511 |
| No log | 71.0 | 497 | 3.6231 | 0.845 | 0.2391 | 1.2277 | 0.845 | 0.8341 | 0.1332 | 0.0506 |
| 3.5909 | 72.0 | 504 | 3.6298 | 0.845 | 0.2415 | 1.2897 | 0.845 | 0.8346 | 0.1288 | 0.0508 |
| 3.5909 | 73.0 | 511 | 3.6384 | 0.85 | 0.2427 | 1.2893 | 0.85 | 0.8401 | 0.1366 | 0.0515 |
| 3.5909 | 74.0 | 518 | 3.6364 | 0.845 | 0.2420 | 1.2224 | 0.845 | 0.8346 | 0.1219 | 0.0511 |
| 3.5909 | 75.0 | 525 | 3.6471 | 0.845 | 0.2441 | 1.2252 | 0.845 | 0.8346 | 0.1322 | 0.0517 |
| 3.5909 | 76.0 | 532 | 3.6469 | 0.85 | 0.2423 | 1.2259 | 0.85 | 0.8404 | 0.1300 | 0.0513 |
| 3.5909 | 77.0 | 539 | 3.6493 | 0.85 | 0.2423 | 1.2253 | 0.85 | 0.8401 | 0.1248 | 0.0514 |
| 3.5909 | 78.0 | 546 | 3.6534 | 0.85 | 0.2434 | 1.2273 | 0.85 | 0.8404 | 0.1271 | 0.0512 |
| 3.5909 | 79.0 | 553 | 3.6588 | 0.845 | 0.2430 | 1.2254 | 0.845 | 0.8346 | 0.1307 | 0.0513 |
| 3.5909 | 80.0 | 560 | 3.6636 | 0.845 | 0.2434 | 1.2249 | 0.845 | 0.8346 | 0.1259 | 0.0513 |
| 3.5909 | 81.0 | 567 | 3.6670 | 0.845 | 0.2433 | 1.2253 | 0.845 | 0.8346 | 0.1356 | 0.0513 |
| 3.5909 | 82.0 | 574 | 3.6689 | 0.845 | 0.2427 | 1.2256 | 0.845 | 0.8346 | 0.1365 | 0.0511 |
| 3.5909 | 83.0 | 581 | 3.6724 | 0.845 | 0.2433 | 1.2278 | 0.845 | 0.8346 | 0.1315 | 0.0510 |
| 3.5909 | 84.0 | 588 | 3.6768 | 0.85 | 0.2431 | 1.2260 | 0.85 | 0.8401 | 0.1374 | 0.0510 |
| 3.5909 | 85.0 | 595 | 3.6782 | 0.85 | 0.2424 | 1.2265 | 0.85 | 0.8401 | 0.1340 | 0.0509 |
| 3.5909 | 86.0 | 602 | 3.6817 | 0.85 | 0.2428 | 1.2261 | 0.85 | 0.8401 | 0.1332 | 0.0510 |
| 3.5909 | 87.0 | 609 | 3.6822 | 0.85 | 0.2427 | 1.2266 | 0.85 | 0.8401 | 0.1330 | 0.0508 |
| 3.5909 | 88.0 | 616 | 3.6835 | 0.85 | 0.2425 | 1.2259 | 0.85 | 0.8401 | 0.1328 | 0.0510 |
| 3.5909 | 89.0 | 623 | 3.6854 | 0.85 | 0.2425 | 1.2260 | 0.85 | 0.8401 | 0.1328 | 0.0509 |
| 3.5909 | 90.0 | 630 | 3.6874 | 0.85 | 0.2426 | 1.2259 | 0.85 | 0.8401 | 0.1327 | 0.0510 |
| 3.5909 | 91.0 | 637 | 3.6891 | 0.85 | 0.2428 | 1.2264 | 0.85 | 0.8401 | 0.1327 | 0.0510 |
| 3.5909 | 92.0 | 644 | 3.6903 | 0.85 | 0.2426 | 1.2265 | 0.85 | 0.8401 | 0.1328 | 0.0509 |
| 3.5909 | 93.0 | 651 | 3.6913 | 0.85 | 0.2427 | 1.2264 | 0.85 | 0.8401 | 0.1327 | 0.0509 |
| 3.5909 | 94.0 | 658 | 3.6922 | 0.85 | 0.2427 | 1.2265 | 0.85 | 0.8401 | 0.1326 | 0.0509 |
| 3.5909 | 95.0 | 665 | 3.6930 | 0.85 | 0.2426 | 1.2262 | 0.85 | 0.8401 | 0.1326 | 0.0510 |
| 3.5909 | 96.0 | 672 | 3.6936 | 0.85 | 0.2427 | 1.2266 | 0.85 | 0.8401 | 0.1327 | 0.0509 |
| 3.5909 | 97.0 | 679 | 3.6940 | 0.85 | 0.2426 | 1.2264 | 0.85 | 0.8401 | 0.1325 | 0.0510 |
| 3.5909 | 98.0 | 686 | 3.6946 | 0.85 | 0.2427 | 1.2265 | 0.85 | 0.8401 | 0.1326 | 0.0510 |
| 3.5909 | 99.0 | 693 | 3.6948 | 0.85 | 0.2427 | 1.2266 | 0.85 | 0.8401 | 0.1325 | 0.0510 |
| 3.5909 | 100.0 | 700 | 3.6948 | 0.85 | 0.2427 | 1.2265 | 0.85 | 0.8401 | 0.1325 | 0.0510 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
ALM-AHME/swinv2-large-patch4-window12to16-192to256-22kto1k-ft-finetuned-LungCancer-LC25000-AH-40-30-30
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swinv2-large-patch4-window12to16-192to256-22kto1k-ft-finetuned-LungCancer-LC25000-AH-40-30-30
This model is a fine-tuned version of [microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft](https://huggingface.co/microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft) on a dataset loaded in `imagefolder` format.
It achieves the following results on the evaluation set:
- Loss: 0.0418
- Accuracy: 0.9869
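A usage sketch with the lower-level `transformers` API (the card documents no usage of its own; the input image below is hypothetical):

```python
# Usage sketch -- not part of the original card.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "ALM-AHME/swinv2-large-patch4-window12to16-192to256-22kto1k-ft-finetuned-LungCancer-LC25000-AH-40-30-30"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("slide_patch.png")        # hypothetical histology patch
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```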
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.5
- num_epochs: 5
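Here `gradient_accumulation_steps: 4` turns the per-device batch of 8 into the effective batch of 32 (8 × 4). The mechanics, as a generic PyTorch sketch with toy tensors rather than the Trainer's internals:

```python
# How gradient accumulation reaches the effective batch of 32:
# 4 micro-batches of 8 are accumulated before each optimizer step.
import torch

model = torch.nn.Linear(16, 3)                       # stand-in classifier
optimizer = torch.optim.Adam(model.parameters(), lr=5e-4)
loss_fn = torch.nn.CrossEntropyLoss()
accum_steps = 4                                       # gradient_accumulation_steps

optimizer.zero_grad()
for step in range(8):                                 # toy micro-batches of 8
    x, y = torch.randn(8, 16), torch.randint(0, 3, (8,))
    loss = loss_fn(model(x), y) / accum_steps         # average over micro-batches
    loss.backward()
    if (step + 1) % accum_steps == 0:
        optimizer.step()                              # one update per 32 examples
        optimizer.zero_grad()
```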
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3299 | 1.0 | 187 | 0.2118 | 0.9218 |
| 0.5922 | 2.0 | 374 | 0.3206 | 0.8629 |
| 0.1763 | 3.0 | 561 | 0.2447 | 0.9127 |
| 0.1351 | 4.0 | 749 | 0.1028 | 0.9564 |
| 0.142 | 4.99 | 935 | 0.0418 | 0.9869 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"lung-benign_tissue",
"lung_adenocarcinoma",
"lung_squamous_cell_carcinoma"
] |
jordyvl/vit-tiny_rvl_cdip_100_examples_per_class_simkd_CEKD_tNone_aNone_tNone_gNone
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_rvl_cdip_100_examples_per_class_simkd_CEKD_tNone_aNone_tNone_gNone
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0733
- Accuracy: 0.4825
- Brier Loss: 0.7791
- NLL: 2.6387
- F1 Micro: 0.4825
- F1 Macro: 0.4847
- ECE: 0.3427
- AURC: 0.2765
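AURC (area under the risk-coverage curve) summarizes selective prediction: sort examples by confidence and average the error rate of the most-confident top-k over all k. A NumPy sketch of the common estimator (the card does not name the exact variant it used):

```python
import numpy as np

def aurc(probs, labels):
    """Area under the risk-coverage curve: mean error rate of the
    top-k most confident predictions, averaged over all k."""
    conf = probs.max(axis=1)
    errors = (probs.argmax(axis=1) != labels).astype(float)
    order = np.argsort(-conf)                  # most confident first
    risks = np.cumsum(errors[order]) / np.arange(1, len(labels) + 1)
    return risks.mean()
```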
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 0.0917 | 0.0675 | 0.9375 | 7.4169 | 0.0675 | 0.0426 | 0.1087 | 0.9278 |
| No log | 2.0 | 50 | 0.0830 | 0.07 | 0.9373 | 7.3255 | 0.07 | 0.0326 | 0.1058 | 0.9149 |
| No log | 3.0 | 75 | 0.0823 | 0.08 | 0.9370 | 7.0476 | 0.08 | 0.0333 | 0.1129 | 0.9007 |
| No log | 4.0 | 100 | 0.0820 | 0.0825 | 0.9368 | 6.9259 | 0.0825 | 0.0333 | 0.1113 | 0.8914 |
| No log | 5.0 | 125 | 0.0817 | 0.095 | 0.9366 | 7.1920 | 0.095 | 0.0593 | 0.1189 | 0.8845 |
| No log | 6.0 | 150 | 0.0814 | 0.105 | 0.9363 | 7.6541 | 0.1050 | 0.0654 | 0.1364 | 0.8354 |
| No log | 7.0 | 175 | 0.0810 | 0.1075 | 0.9361 | 7.5199 | 0.1075 | 0.0628 | 0.1235 | 0.8559 |
| No log | 8.0 | 200 | 0.0806 | 0.1025 | 0.9357 | 7.3552 | 0.1025 | 0.0532 | 0.1230 | 0.8697 |
| No log | 9.0 | 225 | 0.0801 | 0.1125 | 0.9353 | 6.2436 | 0.1125 | 0.0580 | 0.1291 | 0.8258 |
| No log | 10.0 | 250 | 0.0797 | 0.0975 | 0.9342 | 6.1811 | 0.0975 | 0.0486 | 0.1217 | 0.8531 |
| No log | 11.0 | 275 | 0.0792 | 0.11 | 0.9331 | 5.1954 | 0.11 | 0.0558 | 0.1330 | 0.8172 |
| No log | 12.0 | 300 | 0.0789 | 0.1225 | 0.9310 | 5.0567 | 0.1225 | 0.0536 | 0.1428 | 0.7847 |
| No log | 13.0 | 325 | 0.0785 | 0.14 | 0.9283 | 4.2411 | 0.14 | 0.1085 | 0.1561 | 0.7098 |
| No log | 14.0 | 350 | 0.0780 | 0.1925 | 0.9234 | 3.9402 | 0.1925 | 0.1627 | 0.1956 | 0.6553 |
| No log | 15.0 | 375 | 0.0780 | 0.2275 | 0.9186 | 4.2282 | 0.2275 | 0.1806 | 0.2151 | 0.5919 |
| No log | 16.0 | 400 | 0.0770 | 0.2925 | 0.9082 | 3.5789 | 0.2925 | 0.2357 | 0.2602 | 0.5043 |
| No log | 17.0 | 425 | 0.0766 | 0.305 | 0.8993 | 3.6388 | 0.305 | 0.2465 | 0.2603 | 0.4771 |
| No log | 18.0 | 450 | 0.0762 | 0.31 | 0.8916 | 3.2067 | 0.31 | 0.2602 | 0.2755 | 0.4341 |
| No log | 19.0 | 475 | 0.0758 | 0.315 | 0.8861 | 3.1537 | 0.315 | 0.2659 | 0.2820 | 0.4282 |
| 0.0818 | 20.0 | 500 | 0.0755 | 0.3475 | 0.8713 | 3.3614 | 0.3475 | 0.2869 | 0.2830 | 0.3966 |
| 0.0818 | 21.0 | 525 | 0.0755 | 0.34 | 0.8627 | 3.3538 | 0.34 | 0.2728 | 0.2781 | 0.3934 |
| 0.0818 | 22.0 | 550 | 0.0752 | 0.3575 | 0.8578 | 3.4181 | 0.3575 | 0.3052 | 0.2867 | 0.4037 |
| 0.0818 | 23.0 | 575 | 0.0745 | 0.365 | 0.8486 | 2.7931 | 0.3650 | 0.3297 | 0.2908 | 0.3669 |
| 0.0818 | 24.0 | 600 | 0.0743 | 0.395 | 0.8392 | 2.8800 | 0.395 | 0.3419 | 0.3054 | 0.3602 |
| 0.0818 | 25.0 | 625 | 0.0741 | 0.3975 | 0.8382 | 2.8294 | 0.3975 | 0.3584 | 0.3049 | 0.3469 |
| 0.0818 | 26.0 | 650 | 0.0739 | 0.4125 | 0.8308 | 2.9306 | 0.4125 | 0.3650 | 0.3179 | 0.3342 |
| 0.0818 | 27.0 | 675 | 0.0740 | 0.425 | 0.8237 | 3.0954 | 0.425 | 0.3831 | 0.3069 | 0.3356 |
| 0.0818 | 28.0 | 700 | 0.0739 | 0.425 | 0.8325 | 3.0230 | 0.425 | 0.3933 | 0.3154 | 0.3316 |
| 0.0818 | 29.0 | 725 | 0.0735 | 0.445 | 0.8150 | 2.9001 | 0.445 | 0.4078 | 0.3320 | 0.3125 |
| 0.0818 | 30.0 | 750 | 0.0734 | 0.44 | 0.8127 | 2.8272 | 0.44 | 0.4048 | 0.3196 | 0.3145 |
| 0.0818 | 31.0 | 775 | 0.0733 | 0.45 | 0.8105 | 2.9716 | 0.45 | 0.4224 | 0.3214 | 0.3126 |
| 0.0818 | 32.0 | 800 | 0.0732 | 0.4475 | 0.8059 | 2.7234 | 0.4475 | 0.4211 | 0.3166 | 0.3098 |
| 0.0818 | 33.0 | 825 | 0.0734 | 0.45 | 0.8091 | 2.8963 | 0.45 | 0.4298 | 0.3174 | 0.3144 |
| 0.0818 | 34.0 | 850 | 0.0732 | 0.45 | 0.8021 | 2.7268 | 0.45 | 0.4216 | 0.3203 | 0.3024 |
| 0.0818 | 35.0 | 875 | 0.0732 | 0.465 | 0.8013 | 2.9374 | 0.465 | 0.4379 | 0.3417 | 0.2959 |
| 0.0818 | 36.0 | 900 | 0.0732 | 0.4575 | 0.8039 | 2.9305 | 0.4575 | 0.4360 | 0.3166 | 0.3029 |
| 0.0818 | 37.0 | 925 | 0.0733 | 0.4725 | 0.8017 | 2.7705 | 0.4725 | 0.4542 | 0.3348 | 0.2859 |
| 0.0818 | 38.0 | 950 | 0.0732 | 0.4725 | 0.7963 | 2.8600 | 0.4725 | 0.4559 | 0.3432 | 0.2826 |
| 0.0818 | 39.0 | 975 | 0.0731 | 0.4825 | 0.7979 | 2.7795 | 0.4825 | 0.4675 | 0.3361 | 0.2930 |
| 0.0698 | 40.0 | 1000 | 0.0732 | 0.445 | 0.7962 | 2.8308 | 0.445 | 0.4366 | 0.3056 | 0.3058 |
| 0.0698 | 41.0 | 1025 | 0.0732 | 0.4675 | 0.7914 | 2.7809 | 0.4675 | 0.4582 | 0.3173 | 0.2904 |
| 0.0698 | 42.0 | 1050 | 0.0731 | 0.4625 | 0.7952 | 2.8907 | 0.4625 | 0.4644 | 0.3175 | 0.2910 |
| 0.0698 | 43.0 | 1075 | 0.0733 | 0.4625 | 0.7955 | 2.7470 | 0.4625 | 0.4545 | 0.3107 | 0.2930 |
| 0.0698 | 44.0 | 1100 | 0.0731 | 0.4725 | 0.7894 | 2.8684 | 0.4725 | 0.4640 | 0.3281 | 0.2883 |
| 0.0698 | 45.0 | 1125 | 0.0731 | 0.475 | 0.7912 | 2.9091 | 0.4750 | 0.4594 | 0.3302 | 0.2830 |
| 0.0698 | 46.0 | 1150 | 0.0731 | 0.47 | 0.7911 | 2.7282 | 0.47 | 0.4705 | 0.3344 | 0.2865 |
| 0.0698 | 47.0 | 1175 | 0.0732 | 0.4775 | 0.7886 | 2.8402 | 0.4775 | 0.4737 | 0.3151 | 0.2846 |
| 0.0698 | 48.0 | 1200 | 0.0731 | 0.4825 | 0.7850 | 2.7818 | 0.4825 | 0.4833 | 0.3422 | 0.2807 |
| 0.0698 | 49.0 | 1225 | 0.0731 | 0.4625 | 0.7863 | 2.7929 | 0.4625 | 0.4621 | 0.3205 | 0.2828 |
| 0.0698 | 50.0 | 1250 | 0.0732 | 0.4725 | 0.7875 | 2.8382 | 0.4725 | 0.4686 | 0.3364 | 0.2831 |
| 0.0698 | 51.0 | 1275 | 0.0731 | 0.4725 | 0.7861 | 2.7543 | 0.4725 | 0.4661 | 0.3229 | 0.2838 |
| 0.0698 | 52.0 | 1300 | 0.0731 | 0.475 | 0.7863 | 2.7936 | 0.4750 | 0.4771 | 0.3285 | 0.2801 |
| 0.0698 | 53.0 | 1325 | 0.0731 | 0.4825 | 0.7846 | 2.8369 | 0.4825 | 0.4843 | 0.3369 | 0.2747 |
| 0.0698 | 54.0 | 1350 | 0.0731 | 0.4725 | 0.7852 | 2.8102 | 0.4725 | 0.4747 | 0.3175 | 0.2869 |
| 0.0698 | 55.0 | 1375 | 0.0731 | 0.475 | 0.7855 | 2.8205 | 0.4750 | 0.4801 | 0.3409 | 0.2802 |
| 0.0698 | 56.0 | 1400 | 0.0731 | 0.48 | 0.7855 | 2.7926 | 0.48 | 0.4815 | 0.3403 | 0.2827 |
| 0.0698 | 57.0 | 1425 | 0.0731 | 0.4825 | 0.7826 | 2.7536 | 0.4825 | 0.4815 | 0.3381 | 0.2788 |
| 0.0698 | 58.0 | 1450 | 0.0731 | 0.4875 | 0.7851 | 2.8313 | 0.4875 | 0.4901 | 0.3395 | 0.2719 |
| 0.0698 | 59.0 | 1475 | 0.0731 | 0.4875 | 0.7838 | 2.7423 | 0.4875 | 0.4905 | 0.3410 | 0.2735 |
| 0.0654 | 60.0 | 1500 | 0.0731 | 0.48 | 0.7849 | 2.7730 | 0.48 | 0.4818 | 0.3344 | 0.2807 |
| 0.0654 | 61.0 | 1525 | 0.0732 | 0.48 | 0.7816 | 2.7517 | 0.48 | 0.4813 | 0.3370 | 0.2762 |
| 0.0654 | 62.0 | 1550 | 0.0731 | 0.4775 | 0.7833 | 2.8441 | 0.4775 | 0.4804 | 0.3314 | 0.2767 |
| 0.0654 | 63.0 | 1575 | 0.0731 | 0.4775 | 0.7835 | 2.7252 | 0.4775 | 0.4811 | 0.3354 | 0.2769 |
| 0.0654 | 64.0 | 1600 | 0.0732 | 0.4925 | 0.7819 | 2.7991 | 0.4925 | 0.4958 | 0.3371 | 0.2726 |
| 0.0654 | 65.0 | 1625 | 0.0731 | 0.4825 | 0.7806 | 2.6719 | 0.4825 | 0.4850 | 0.3190 | 0.2752 |
| 0.0654 | 66.0 | 1650 | 0.0732 | 0.48 | 0.7817 | 2.7669 | 0.48 | 0.4828 | 0.3336 | 0.2791 |
| 0.0654 | 67.0 | 1675 | 0.0731 | 0.4775 | 0.7813 | 2.6678 | 0.4775 | 0.4822 | 0.3304 | 0.2750 |
| 0.0654 | 68.0 | 1700 | 0.0732 | 0.4875 | 0.7829 | 2.7529 | 0.4875 | 0.4919 | 0.3381 | 0.2756 |
| 0.0654 | 69.0 | 1725 | 0.0731 | 0.4825 | 0.7795 | 2.7291 | 0.4825 | 0.4839 | 0.3418 | 0.2737 |
| 0.0654 | 70.0 | 1750 | 0.0732 | 0.4875 | 0.7827 | 2.7613 | 0.4875 | 0.4909 | 0.3308 | 0.2747 |
| 0.0654 | 71.0 | 1775 | 0.0732 | 0.4825 | 0.7816 | 2.7348 | 0.4825 | 0.4863 | 0.3306 | 0.2733 |
| 0.0654 | 72.0 | 1800 | 0.0732 | 0.4825 | 0.7813 | 2.6920 | 0.4825 | 0.4863 | 0.3268 | 0.2724 |
| 0.0654 | 73.0 | 1825 | 0.0731 | 0.485 | 0.7809 | 2.6890 | 0.485 | 0.4872 | 0.3307 | 0.2741 |
| 0.0654 | 74.0 | 1850 | 0.0732 | 0.4825 | 0.7810 | 2.6668 | 0.4825 | 0.4854 | 0.3245 | 0.2758 |
| 0.0654 | 75.0 | 1875 | 0.0732 | 0.48 | 0.7814 | 2.7337 | 0.48 | 0.4836 | 0.3232 | 0.2767 |
| 0.0654 | 76.0 | 1900 | 0.0731 | 0.49 | 0.7802 | 2.7219 | 0.49 | 0.4900 | 0.3290 | 0.2727 |
| 0.0654 | 77.0 | 1925 | 0.0732 | 0.48 | 0.7804 | 2.7187 | 0.48 | 0.4821 | 0.3223 | 0.2759 |
| 0.0654 | 78.0 | 1950 | 0.0732 | 0.485 | 0.7811 | 2.6797 | 0.485 | 0.4884 | 0.3343 | 0.2754 |
| 0.0654 | 79.0 | 1975 | 0.0731 | 0.48 | 0.7784 | 2.6604 | 0.48 | 0.4816 | 0.3345 | 0.2751 |
| 0.0641 | 80.0 | 2000 | 0.0732 | 0.485 | 0.7797 | 2.6380 | 0.485 | 0.4876 | 0.3317 | 0.2755 |
| 0.0641 | 81.0 | 2025 | 0.0732 | 0.4775 | 0.7805 | 2.6934 | 0.4775 | 0.4808 | 0.3225 | 0.2758 |
| 0.0641 | 82.0 | 2050 | 0.0732 | 0.4825 | 0.7802 | 2.7315 | 0.4825 | 0.4851 | 0.3364 | 0.2781 |
| 0.0641 | 83.0 | 2075 | 0.0732 | 0.4875 | 0.7800 | 2.7011 | 0.4875 | 0.4899 | 0.3222 | 0.2736 |
| 0.0641 | 84.0 | 2100 | 0.0732 | 0.4825 | 0.7796 | 2.6672 | 0.4825 | 0.4845 | 0.3203 | 0.2772 |
| 0.0641 | 85.0 | 2125 | 0.0732 | 0.4825 | 0.7798 | 2.6956 | 0.4825 | 0.4833 | 0.3373 | 0.2757 |
| 0.0641 | 86.0 | 2150 | 0.0732 | 0.48 | 0.7797 | 2.6349 | 0.48 | 0.4823 | 0.3265 | 0.2774 |
| 0.0641 | 87.0 | 2175 | 0.0732 | 0.49 | 0.7800 | 2.7238 | 0.49 | 0.4921 | 0.3407 | 0.2755 |
| 0.0641 | 88.0 | 2200 | 0.0732 | 0.4775 | 0.7800 | 2.6423 | 0.4775 | 0.4804 | 0.3163 | 0.2785 |
| 0.0641 | 89.0 | 2225 | 0.0732 | 0.485 | 0.7793 | 2.6734 | 0.485 | 0.4881 | 0.3310 | 0.2760 |
| 0.0641 | 90.0 | 2250 | 0.0732 | 0.4825 | 0.7796 | 2.6582 | 0.4825 | 0.4858 | 0.3232 | 0.2774 |
| 0.0641 | 91.0 | 2275 | 0.0732 | 0.485 | 0.7790 | 2.6705 | 0.485 | 0.4882 | 0.3277 | 0.2760 |
| 0.0641 | 92.0 | 2300 | 0.0732 | 0.49 | 0.7795 | 2.6465 | 0.49 | 0.4943 | 0.3513 | 0.2767 |
| 0.0641 | 93.0 | 2325 | 0.0732 | 0.4825 | 0.7791 | 2.6495 | 0.4825 | 0.4852 | 0.3414 | 0.2763 |
| 0.0641 | 94.0 | 2350 | 0.0732 | 0.49 | 0.7793 | 2.6402 | 0.49 | 0.4933 | 0.3458 | 0.2760 |
| 0.0641 | 95.0 | 2375 | 0.0732 | 0.4875 | 0.7792 | 2.6448 | 0.4875 | 0.4898 | 0.3420 | 0.2763 |
| 0.0641 | 96.0 | 2400 | 0.0732 | 0.4825 | 0.7792 | 2.6402 | 0.4825 | 0.4847 | 0.3346 | 0.2766 |
| 0.0641 | 97.0 | 2425 | 0.0733 | 0.485 | 0.7793 | 2.6397 | 0.485 | 0.4873 | 0.3407 | 0.2768 |
| 0.0641 | 98.0 | 2450 | 0.0732 | 0.4825 | 0.7790 | 2.6388 | 0.4825 | 0.4847 | 0.3374 | 0.2763 |
| 0.0641 | 99.0 | 2475 | 0.0733 | 0.4825 | 0.7792 | 2.6390 | 0.4825 | 0.4847 | 0.3393 | 0.2767 |
| 0.0637 | 100.0 | 2500 | 0.0733 | 0.4825 | 0.7791 | 2.6387 | 0.4825 | 0.4847 | 0.3427 | 0.2765 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-small_rvl_cdip_100_examples_per_class_kd_NKD_t1.0_g1.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_rvl_cdip_100_examples_per_class_kd_NKD_t1.0_g1.5
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 5.2014
- Accuracy: 0.635
- Brier Loss: 0.5252
- NLL: 2.1069
- F1 Micro: 0.635
- F1 Macro: 0.6363
- ECE: 0.1836
- AURC: 0.1520
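Note that F1 Micro equals plain accuracy for single-label multiclass predictions, which is why the two columns agree row for row throughout these tables; F1 Macro instead averages per-class F1 scores without frequency weighting. A quick check with scikit-learn (toy labels, assuming `scikit-learn` is installed):

```python
# For single-label multiclass predictions, micro-averaged F1 reduces
# to accuracy; macro-F1 averages per-class F1 scores equally.
from sklearn.metrics import accuracy_score, f1_score

y_true = [0, 1, 2, 2, 1, 0]                  # toy labels for illustration
y_pred = [0, 2, 2, 2, 1, 0]
print(accuracy_score(y_true, y_pred))              # 0.833...
print(f1_score(y_true, y_pred, average="micro"))   # same value
print(f1_score(y_true, y_pred, average="macro"))   # unweighted class mean
```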
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 7.0792 | 0.0825 | 0.9657 | 10.6330 | 0.0825 | 0.0736 | 0.1618 | 0.9054 |
| No log | 2.0 | 14 | 6.4691 | 0.07 | 0.9514 | 8.1662 | 0.07 | 0.0643 | 0.1779 | 0.9461 |
| No log | 3.0 | 21 | 5.8986 | 0.2975 | 0.8596 | 5.3260 | 0.2975 | 0.2944 | 0.2211 | 0.5304 |
| No log | 4.0 | 28 | 5.5468 | 0.3925 | 0.7531 | 3.5791 | 0.3925 | 0.3860 | 0.2145 | 0.3645 |
| No log | 5.0 | 35 | 5.2678 | 0.46 | 0.6755 | 3.2144 | 0.46 | 0.4517 | 0.1901 | 0.2901 |
| No log | 6.0 | 42 | 5.1237 | 0.5075 | 0.6334 | 2.9369 | 0.5075 | 0.4888 | 0.1985 | 0.2550 |
| No log | 7.0 | 49 | 5.1530 | 0.5125 | 0.6131 | 3.1196 | 0.5125 | 0.4843 | 0.1634 | 0.2417 |
| No log | 8.0 | 56 | 5.0462 | 0.545 | 0.5898 | 2.7596 | 0.545 | 0.5376 | 0.1792 | 0.2232 |
| No log | 9.0 | 63 | 5.1437 | 0.565 | 0.5759 | 2.9426 | 0.565 | 0.5660 | 0.1715 | 0.2208 |
| No log | 10.0 | 70 | 4.9658 | 0.605 | 0.5382 | 2.4096 | 0.605 | 0.5945 | 0.1828 | 0.1734 |
| No log | 11.0 | 77 | 5.1189 | 0.57 | 0.5592 | 2.5892 | 0.57 | 0.5677 | 0.1381 | 0.1952 |
| No log | 12.0 | 84 | 5.2082 | 0.54 | 0.5774 | 2.7250 | 0.54 | 0.5323 | 0.1578 | 0.2144 |
| No log | 13.0 | 91 | 4.9674 | 0.5775 | 0.5365 | 2.5469 | 0.5775 | 0.5654 | 0.1603 | 0.1824 |
| No log | 14.0 | 98 | 5.0007 | 0.5875 | 0.5299 | 2.6635 | 0.5875 | 0.5778 | 0.1567 | 0.1701 |
| No log | 15.0 | 105 | 4.9925 | 0.585 | 0.5417 | 2.6416 | 0.585 | 0.5760 | 0.1731 | 0.1896 |
| No log | 16.0 | 112 | 4.8314 | 0.6425 | 0.4939 | 2.4601 | 0.6425 | 0.6444 | 0.1492 | 0.1506 |
| No log | 17.0 | 119 | 4.8729 | 0.6075 | 0.5197 | 2.4297 | 0.6075 | 0.6054 | 0.1511 | 0.1702 |
| No log | 18.0 | 126 | 4.8960 | 0.61 | 0.5085 | 2.2405 | 0.61 | 0.6197 | 0.1664 | 0.1657 |
| No log | 19.0 | 133 | 4.8227 | 0.62 | 0.5032 | 2.4320 | 0.62 | 0.6177 | 0.1399 | 0.1615 |
| No log | 20.0 | 140 | 4.9420 | 0.61 | 0.5160 | 2.3051 | 0.61 | 0.6119 | 0.1460 | 0.1722 |
| No log | 21.0 | 147 | 4.8779 | 0.6125 | 0.5132 | 2.3564 | 0.6125 | 0.6080 | 0.1549 | 0.1639 |
| No log | 22.0 | 154 | 4.9454 | 0.6125 | 0.5261 | 2.4064 | 0.6125 | 0.6155 | 0.1792 | 0.1733 |
| No log | 23.0 | 161 | 4.8659 | 0.5925 | 0.5018 | 2.5961 | 0.5925 | 0.5897 | 0.1537 | 0.1607 |
| No log | 24.0 | 168 | 4.8150 | 0.605 | 0.4996 | 2.2624 | 0.605 | 0.6050 | 0.1525 | 0.1588 |
| No log | 25.0 | 175 | 4.8303 | 0.6175 | 0.4970 | 2.1999 | 0.6175 | 0.6204 | 0.1284 | 0.1515 |
| No log | 26.0 | 182 | 4.8442 | 0.6225 | 0.5060 | 2.2842 | 0.6225 | 0.6251 | 0.1639 | 0.1614 |
| No log | 27.0 | 189 | 4.8260 | 0.63 | 0.4953 | 2.2666 | 0.63 | 0.6345 | 0.1638 | 0.1531 |
| No log | 28.0 | 196 | 4.8421 | 0.6375 | 0.4979 | 2.3173 | 0.6375 | 0.6344 | 0.1430 | 0.1525 |
| No log | 29.0 | 203 | 4.9011 | 0.62 | 0.5066 | 2.2663 | 0.62 | 0.6221 | 0.1596 | 0.1602 |
| No log | 30.0 | 210 | 4.8689 | 0.62 | 0.4994 | 2.1498 | 0.62 | 0.6260 | 0.1581 | 0.1567 |
| No log | 31.0 | 217 | 4.8681 | 0.6075 | 0.5143 | 2.0979 | 0.6075 | 0.6080 | 0.1673 | 0.1641 |
| No log | 32.0 | 224 | 4.8489 | 0.6 | 0.5074 | 2.1485 | 0.6 | 0.5913 | 0.1579 | 0.1613 |
| No log | 33.0 | 231 | 4.8669 | 0.63 | 0.5037 | 2.3142 | 0.63 | 0.6272 | 0.1512 | 0.1519 |
| No log | 34.0 | 238 | 4.8382 | 0.6075 | 0.5005 | 2.1817 | 0.6075 | 0.6038 | 0.1683 | 0.1552 |
| No log | 35.0 | 245 | 4.8406 | 0.61 | 0.5012 | 2.2132 | 0.61 | 0.6019 | 0.1443 | 0.1518 |
| No log | 36.0 | 252 | 4.8241 | 0.6275 | 0.5040 | 2.2466 | 0.6275 | 0.6182 | 0.1511 | 0.1563 |
| No log | 37.0 | 259 | 4.8359 | 0.6225 | 0.4993 | 2.1727 | 0.6225 | 0.6201 | 0.1665 | 0.1570 |
| No log | 38.0 | 266 | 4.8812 | 0.6025 | 0.5155 | 2.2712 | 0.6025 | 0.5990 | 0.1634 | 0.1649 |
| No log | 39.0 | 273 | 4.8672 | 0.61 | 0.5075 | 2.1626 | 0.61 | 0.6073 | 0.1603 | 0.1592 |
| No log | 40.0 | 280 | 4.9083 | 0.6175 | 0.5098 | 2.1507 | 0.6175 | 0.6204 | 0.1524 | 0.1594 |
| No log | 41.0 | 287 | 4.8942 | 0.61 | 0.5132 | 2.2443 | 0.61 | 0.6070 | 0.1574 | 0.1618 |
| No log | 42.0 | 294 | 4.9435 | 0.62 | 0.5177 | 2.1770 | 0.62 | 0.6186 | 0.1567 | 0.1664 |
| No log | 43.0 | 301 | 4.8836 | 0.63 | 0.5089 | 2.1922 | 0.63 | 0.6300 | 0.1612 | 0.1553 |
| No log | 44.0 | 308 | 4.9806 | 0.6225 | 0.5205 | 2.1855 | 0.6225 | 0.6213 | 0.1715 | 0.1631 |
| No log | 45.0 | 315 | 4.9314 | 0.6225 | 0.5185 | 2.1783 | 0.6225 | 0.6182 | 0.1743 | 0.1631 |
| No log | 46.0 | 322 | 4.8615 | 0.6275 | 0.4984 | 2.2407 | 0.6275 | 0.6259 | 0.1529 | 0.1497 |
| No log | 47.0 | 329 | 4.8550 | 0.625 | 0.4985 | 2.1229 | 0.625 | 0.6261 | 0.1517 | 0.1531 |
| No log | 48.0 | 336 | 4.9218 | 0.6125 | 0.5113 | 2.2200 | 0.6125 | 0.6114 | 0.1627 | 0.1588 |
| No log | 49.0 | 343 | 4.9067 | 0.63 | 0.5102 | 2.2177 | 0.63 | 0.6299 | 0.1534 | 0.1567 |
| No log | 50.0 | 350 | 4.9040 | 0.6125 | 0.5110 | 2.1105 | 0.6125 | 0.6136 | 0.1731 | 0.1559 |
| No log | 51.0 | 357 | 4.9557 | 0.615 | 0.5180 | 2.2031 | 0.615 | 0.6157 | 0.1726 | 0.1602 |
| No log | 52.0 | 364 | 4.9409 | 0.61 | 0.5195 | 2.2616 | 0.61 | 0.6079 | 0.1627 | 0.1618 |
| No log | 53.0 | 371 | 4.9290 | 0.6225 | 0.5125 | 2.1352 | 0.6225 | 0.6227 | 0.1873 | 0.1549 |
| No log | 54.0 | 378 | 4.9297 | 0.6225 | 0.5075 | 2.1558 | 0.6225 | 0.6216 | 0.1724 | 0.1530 |
| No log | 55.0 | 385 | 4.9192 | 0.6225 | 0.5131 | 2.1572 | 0.6225 | 0.6220 | 0.1655 | 0.1578 |
| No log | 56.0 | 392 | 4.9760 | 0.61 | 0.5203 | 2.1227 | 0.61 | 0.6092 | 0.1852 | 0.1594 |
| No log | 57.0 | 399 | 4.9860 | 0.6125 | 0.5208 | 2.1996 | 0.6125 | 0.6154 | 0.1812 | 0.1608 |
| No log | 58.0 | 406 | 4.9418 | 0.62 | 0.5176 | 2.1034 | 0.62 | 0.6220 | 0.1635 | 0.1549 |
| No log | 59.0 | 413 | 4.9462 | 0.62 | 0.5143 | 2.1095 | 0.62 | 0.6221 | 0.1855 | 0.1553 |
| No log | 60.0 | 420 | 4.9447 | 0.6175 | 0.5142 | 2.0731 | 0.6175 | 0.6180 | 0.1571 | 0.1533 |
| No log | 61.0 | 427 | 4.9677 | 0.63 | 0.5091 | 2.1491 | 0.63 | 0.6346 | 0.1693 | 0.1498 |
| No log | 62.0 | 434 | 4.9567 | 0.62 | 0.5089 | 2.1222 | 0.62 | 0.6242 | 0.1609 | 0.1546 |
| No log | 63.0 | 441 | 4.9378 | 0.6325 | 0.5030 | 2.1787 | 0.6325 | 0.6310 | 0.1558 | 0.1471 |
| No log | 64.0 | 448 | 4.9764 | 0.6175 | 0.5154 | 2.0751 | 0.6175 | 0.6192 | 0.1835 | 0.1549 |
| No log | 65.0 | 455 | 4.9520 | 0.6325 | 0.5069 | 2.1067 | 0.6325 | 0.6352 | 0.1670 | 0.1499 |
| No log | 66.0 | 462 | 4.9649 | 0.6375 | 0.5109 | 2.1016 | 0.6375 | 0.6361 | 0.1665 | 0.1506 |
| No log | 67.0 | 469 | 5.0023 | 0.635 | 0.5174 | 2.2166 | 0.635 | 0.6350 | 0.1653 | 0.1543 |
| No log | 68.0 | 476 | 5.0084 | 0.63 | 0.5187 | 2.1238 | 0.63 | 0.6302 | 0.1674 | 0.1535 |
| No log | 69.0 | 483 | 4.9875 | 0.6325 | 0.5096 | 2.1744 | 0.6325 | 0.6345 | 0.1822 | 0.1510 |
| No log | 70.0 | 490 | 5.0129 | 0.6325 | 0.5151 | 2.1042 | 0.6325 | 0.6335 | 0.1691 | 0.1535 |
| No log | 71.0 | 497 | 5.0389 | 0.6275 | 0.5201 | 2.0941 | 0.6275 | 0.6283 | 0.1765 | 0.1550 |
| 3.4121 | 72.0 | 504 | 5.0288 | 0.6325 | 0.5168 | 2.1299 | 0.6325 | 0.6314 | 0.1802 | 0.1529 |
| 3.4121 | 73.0 | 511 | 5.0181 | 0.625 | 0.5121 | 2.1690 | 0.625 | 0.6236 | 0.1683 | 0.1511 |
| 3.4121 | 74.0 | 518 | 5.0422 | 0.625 | 0.5139 | 2.1323 | 0.625 | 0.6264 | 0.1873 | 0.1517 |
| 3.4121 | 75.0 | 525 | 5.0557 | 0.6325 | 0.5177 | 2.1695 | 0.6325 | 0.6342 | 0.1677 | 0.1503 |
| 3.4121 | 76.0 | 532 | 5.0440 | 0.6375 | 0.5113 | 2.1384 | 0.6375 | 0.6384 | 0.1714 | 0.1489 |
| 3.4121 | 77.0 | 539 | 5.0710 | 0.6375 | 0.5163 | 2.1017 | 0.6375 | 0.6397 | 0.1785 | 0.1508 |
| 3.4121 | 78.0 | 546 | 5.1024 | 0.63 | 0.5218 | 2.0905 | 0.63 | 0.6280 | 0.1724 | 0.1538 |
| 3.4121 | 79.0 | 553 | 5.0906 | 0.635 | 0.5186 | 2.1293 | 0.635 | 0.6358 | 0.1908 | 0.1509 |
| 3.4121 | 80.0 | 560 | 5.1027 | 0.63 | 0.5206 | 2.1292 | 0.63 | 0.6299 | 0.1850 | 0.1525 |
| 3.4121 | 81.0 | 567 | 5.1063 | 0.64 | 0.5161 | 2.1620 | 0.64 | 0.6404 | 0.1754 | 0.1489 |
| 3.4121 | 82.0 | 574 | 5.1267 | 0.64 | 0.5207 | 2.1291 | 0.64 | 0.6400 | 0.1849 | 0.1516 |
| 3.4121 | 83.0 | 581 | 5.1332 | 0.63 | 0.5224 | 2.1338 | 0.63 | 0.6322 | 0.1750 | 0.1522 |
| 3.4121 | 84.0 | 588 | 5.1408 | 0.6325 | 0.5233 | 2.1333 | 0.6325 | 0.6334 | 0.1797 | 0.1522 |
| 3.4121 | 85.0 | 595 | 5.1510 | 0.63 | 0.5224 | 2.1635 | 0.63 | 0.6301 | 0.1755 | 0.1522 |
| 3.4121 | 86.0 | 602 | 5.1536 | 0.6375 | 0.5215 | 2.1628 | 0.6375 | 0.6382 | 0.1683 | 0.1511 |
| 3.4121 | 87.0 | 609 | 5.1580 | 0.6325 | 0.5228 | 2.1348 | 0.6325 | 0.6328 | 0.1779 | 0.1523 |
| 3.4121 | 88.0 | 616 | 5.1701 | 0.64 | 0.5235 | 2.1352 | 0.64 | 0.6417 | 0.1818 | 0.1515 |
| 3.4121 | 89.0 | 623 | 5.1734 | 0.6375 | 0.5235 | 2.1354 | 0.6375 | 0.6385 | 0.1775 | 0.1515 |
| 3.4121 | 90.0 | 630 | 5.1779 | 0.635 | 0.5243 | 2.1334 | 0.635 | 0.6360 | 0.1842 | 0.1519 |
| 3.4121 | 91.0 | 637 | 5.1834 | 0.635 | 0.5241 | 2.1344 | 0.635 | 0.6363 | 0.1813 | 0.1521 |
| 3.4121 | 92.0 | 644 | 5.1877 | 0.6375 | 0.5247 | 2.1356 | 0.6375 | 0.6385 | 0.1871 | 0.1517 |
| 3.4121 | 93.0 | 651 | 5.1906 | 0.635 | 0.5245 | 2.1389 | 0.635 | 0.6360 | 0.1888 | 0.1520 |
| 3.4121 | 94.0 | 658 | 5.1935 | 0.635 | 0.5248 | 2.1083 | 0.635 | 0.6363 | 0.1831 | 0.1521 |
| 3.4121 | 95.0 | 665 | 5.1955 | 0.635 | 0.5249 | 2.1098 | 0.635 | 0.6363 | 0.1795 | 0.1521 |
| 3.4121 | 96.0 | 672 | 5.1978 | 0.635 | 0.5250 | 2.1079 | 0.635 | 0.6363 | 0.1820 | 0.1521 |
| 3.4121 | 97.0 | 679 | 5.1995 | 0.635 | 0.5251 | 2.1073 | 0.635 | 0.6363 | 0.1834 | 0.1521 |
| 3.4121 | 98.0 | 686 | 5.2004 | 0.635 | 0.5251 | 2.1072 | 0.635 | 0.6360 | 0.1834 | 0.1520 |
| 3.4121 | 99.0 | 693 | 5.2012 | 0.635 | 0.5252 | 2.1071 | 0.635 | 0.6360 | 0.1836 | 0.1520 |
| 3.4121 | 100.0 | 700 | 5.2014 | 0.635 | 0.5252 | 2.1069 | 0.635 | 0.6363 | 0.1836 | 0.1520 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
sbaner24/vit-base-patch16-224-Trial006-007-008-YEL_STEM1
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-Trial006-007-008-YEL_STEM1
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on a dataset loaded in `imagefolder` format.
It achieves the following results on the evaluation set:
- Loss: 0.0457
- Accuracy: 0.9912
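The `imagefolder` format referenced above infers class labels from sub-directory names; a sketch of how such a dataset is typically loaded (the directory layout is hypothetical, since the card does not document it):

```python
# Sketch of the "imagefolder" loading convention used by these cards:
# class labels come from sub-directory names.
from datasets import load_dataset

dataset = load_dataset("imagefolder", data_dir="data/yel_stem1")  # hypothetical path
# expected layout:
#   data/yel_stem1/plant2/xxx.jpg
#   data/yel_stem1/plant3/yyy.jpg
print(dataset["train"].features["label"].names)  # ['plant2', 'plant3']
```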
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 60
- eval_batch_size: 60
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 240
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7163 | 0.89 | 4 | 0.7036 | 0.5175 |
| 0.7183 | 2.0 | 9 | 0.6982 | 0.5439 |
| 0.6561 | 2.89 | 13 | 0.5914 | 0.6667 |
| 0.5799 | 4.0 | 18 | 0.6411 | 0.5526 |
| 0.5731 | 4.89 | 22 | 0.4833 | 0.8158 |
| 0.5162 | 6.0 | 27 | 0.4483 | 0.7807 |
| 0.4738 | 6.89 | 31 | 0.3961 | 0.8596 |
| 0.3283 | 8.0 | 36 | 0.2897 | 0.8772 |
| 0.3365 | 8.89 | 40 | 0.2774 | 0.8860 |
| 0.5987 | 10.0 | 45 | 0.2617 | 0.8947 |
| 0.3633 | 10.89 | 49 | 0.2661 | 0.8772 |
| 0.3342 | 12.0 | 54 | 0.2269 | 0.9211 |
| 0.3406 | 12.89 | 58 | 0.1903 | 0.9211 |
| 0.2331 | 14.0 | 63 | 0.2075 | 0.9123 |
| 0.2494 | 14.89 | 67 | 0.1345 | 0.9474 |
| 0.3143 | 16.0 | 72 | 0.1508 | 0.9298 |
| 0.2622 | 16.89 | 76 | 0.1267 | 0.9298 |
| 0.2894 | 18.0 | 81 | 0.0921 | 0.9737 |
| 0.2105 | 18.89 | 85 | 0.1058 | 0.9298 |
| 0.2011 | 20.0 | 90 | 0.1021 | 0.9561 |
| 0.1977 | 20.89 | 94 | 0.1122 | 0.9561 |
| 0.2685 | 22.0 | 99 | 0.1656 | 0.9298 |
| 0.2394 | 22.89 | 103 | 0.1197 | 0.9561 |
| 0.2394 | 24.0 | 108 | 0.1461 | 0.9211 |
| 0.1523 | 24.89 | 112 | 0.0969 | 0.9649 |
| 0.2231 | 26.0 | 117 | 0.0827 | 0.9649 |
| 0.1887 | 26.89 | 121 | 0.0979 | 0.9561 |
| 0.1656 | 28.0 | 126 | 0.0907 | 0.9386 |
| 0.1824 | 28.89 | 130 | 0.1009 | 0.9561 |
| 0.2443 | 30.0 | 135 | 0.1249 | 0.9474 |
| 0.2082 | 30.89 | 139 | 0.0755 | 0.9649 |
| 0.1276 | 32.0 | 144 | 0.0531 | 0.9825 |
| 0.1857 | 32.89 | 148 | 0.0707 | 0.9561 |
| 0.1676 | 34.0 | 153 | 0.0570 | 0.9825 |
| 0.1992 | 34.89 | 157 | 0.0591 | 0.9737 |
| 0.2283 | 36.0 | 162 | 0.0575 | 0.9737 |
| 0.2125 | 36.89 | 166 | 0.0519 | 0.9825 |
| 0.1915 | 38.0 | 171 | 0.0457 | 0.9912 |
| 0.1969 | 38.89 | 175 | 0.0644 | 0.9561 |
| 0.1522 | 40.0 | 180 | 0.0771 | 0.9561 |
| 0.1037 | 40.89 | 184 | 0.0557 | 0.9561 |
| 0.2348 | 42.0 | 189 | 0.0470 | 0.9737 |
| 0.1674 | 42.89 | 193 | 0.0469 | 0.9737 |
| 0.0874 | 44.0 | 198 | 0.0483 | 0.9737 |
| 0.1605 | 44.44 | 200 | 0.0485 | 0.9649 |
### Framework versions
- Transformers 4.30.0.dev0
- Pytorch 1.12.1
- Datasets 2.12.0
- Tokenizers 0.13.1
|
[
"plant2",
"plant3"
] |
sbaner24/vit-base-patch16-224-Trial006-007-008-YEL_STEM2
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-Trial006-007-008-YEL_STEM2
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on a dataset loaded in `imagefolder` format.
It achieves the following results on the evaluation set:
- Loss: 0.0188
- Accuracy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 60
- eval_batch_size: 60
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 240
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7014 | 0.94 | 4 | 0.6789 | 0.5872 |
| 0.6242 | 1.88 | 8 | 0.6182 | 0.6147 |
| 0.5903 | 2.82 | 12 | 0.5147 | 0.8165 |
| 0.516 | 4.0 | 17 | 0.4137 | 0.8440 |
| 0.4612 | 4.94 | 21 | 0.3127 | 0.8899 |
| 0.3615 | 5.88 | 25 | 0.2448 | 0.8899 |
| 0.2899 | 6.82 | 29 | 0.2502 | 0.9083 |
| 0.3271 | 8.0 | 34 | 0.2268 | 0.8899 |
| 0.3765 | 8.94 | 38 | 0.1419 | 0.9266 |
| 0.2954 | 9.88 | 42 | 0.1233 | 0.9450 |
| 0.2187 | 10.82 | 46 | 0.1541 | 0.9358 |
| 0.3391 | 12.0 | 51 | 0.1575 | 0.9266 |
| 0.2027 | 12.94 | 55 | 0.0838 | 0.9908 |
| 0.2894 | 13.88 | 59 | 0.0852 | 0.9633 |
| 0.1972 | 14.82 | 63 | 0.0648 | 0.9908 |
| 0.1533 | 16.0 | 68 | 0.0884 | 0.9908 |
| 0.1588 | 16.94 | 72 | 0.0951 | 0.9633 |
| 0.1746 | 17.88 | 76 | 0.0793 | 0.9725 |
| 0.2386 | 18.82 | 80 | 0.1204 | 0.9633 |
| 0.1226 | 20.0 | 85 | 0.0502 | 0.9633 |
| 0.1901 | 20.94 | 89 | 0.0188 | 1.0 |
| 0.1305 | 21.88 | 93 | 0.0244 | 0.9908 |
| 0.1695 | 22.82 | 97 | 0.0186 | 1.0 |
| 0.2682 | 24.0 | 102 | 0.0460 | 0.9817 |
| 0.1956 | 24.94 | 106 | 0.0328 | 0.9817 |
| 0.1651 | 25.88 | 110 | 0.0493 | 0.9817 |
| 0.1841 | 26.82 | 114 | 0.0168 | 1.0 |
| 0.1897 | 28.0 | 119 | 0.0139 | 1.0 |
| 0.2088 | 28.94 | 123 | 0.0195 | 1.0 |
| 0.1386 | 29.88 | 127 | 0.0483 | 0.9817 |
| 0.1544 | 30.82 | 131 | 0.0258 | 0.9817 |
| 0.1293 | 32.0 | 136 | 0.0461 | 0.9725 |
| 0.1696 | 32.94 | 140 | 0.0207 | 0.9908 |
| 0.1793 | 33.88 | 144 | 0.0182 | 1.0 |
| 0.1545 | 34.82 | 148 | 0.0361 | 0.9817 |
| 0.141 | 36.0 | 153 | 0.0455 | 0.9817 |
| 0.138 | 36.94 | 157 | 0.0175 | 0.9908 |
| 0.1382 | 37.88 | 161 | 0.0359 | 0.9817 |
| 0.163 | 38.82 | 165 | 0.0463 | 0.9817 |
| 0.1374 | 40.0 | 170 | 0.0153 | 0.9908 |
| 0.1993 | 40.94 | 174 | 0.0116 | 1.0 |
| 0.1471 | 41.88 | 178 | 0.0199 | 0.9817 |
| 0.1414 | 42.82 | 182 | 0.0330 | 0.9817 |
| 0.1494 | 44.0 | 187 | 0.0228 | 0.9817 |
| 0.1367 | 44.94 | 191 | 0.0174 | 0.9817 |
| 0.1529 | 45.88 | 195 | 0.0162 | 0.9817 |
| 0.1365 | 46.82 | 199 | 0.0153 | 0.9908 |
| 0.1301 | 47.06 | 200 | 0.0152 | 0.9908 |
### Framework versions
- Transformers 4.30.0.dev0
- Pytorch 1.12.1
- Datasets 2.12.0
- Tokenizers 0.13.1
|
[
"plant2",
"plant3"
] |
jordyvl/vit-tiny_rvl_cdip_100_examples_per_class_kd_NKD_t1.0_g1.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_rvl_cdip_100_examples_per_class_kd_NKD_t1.0_g1.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 5.4262
- Accuracy: 0.535
- Brier Loss: 0.6080
- NLL: 2.4569
- F1 Micro: 0.535
- F1 Macro: 0.5345
- ECE: 0.2120
- AURC: 0.2105
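The comparatively high ECE here (≈0.21) points to over- or under-confident probabilities; a standard post-hoc remedy is temperature scaling, sketched below for context (the card does not report applying it):

```python
# Post-hoc temperature scaling -- a common fix for high ECE, shown
# here for context only.
import torch

def fit_temperature(logits, labels, steps=100):
    """Find T > 0 minimising NLL of softmax(logits / T) on held-out data."""
    log_t = torch.zeros(1, requires_grad=True)
    optimizer = torch.optim.Adam([log_t], lr=0.05)
    for _ in range(steps):
        optimizer.zero_grad()
        loss = torch.nn.functional.cross_entropy(logits / log_t.exp(), labels)
        loss.backward()
        optimizer.step()
    return log_t.exp().item()

# toy usage with random logits
logits = torch.randn(200, 16)
labels = torch.randint(0, 16, (200,))
print(fit_temperature(logits, labels))
```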
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 8.7328 | 0.0375 | 1.0712 | 7.1959 | 0.0375 | 0.0284 | 0.2770 | 0.9564 |
| No log | 2.0 | 14 | 6.6754 | 0.0725 | 0.9650 | 6.2033 | 0.0725 | 0.0442 | 0.1890 | 0.9239 |
| No log | 3.0 | 21 | 6.2399 | 0.1525 | 0.9213 | 5.6927 | 0.1525 | 0.1339 | 0.1647 | 0.8105 |
| No log | 4.0 | 28 | 5.9557 | 0.2125 | 0.8771 | 4.2773 | 0.2125 | 0.1952 | 0.1637 | 0.6347 |
| No log | 5.0 | 35 | 5.5957 | 0.3275 | 0.7962 | 3.1599 | 0.3275 | 0.3087 | 0.1928 | 0.4595 |
| No log | 6.0 | 42 | 5.4573 | 0.4075 | 0.7283 | 3.0860 | 0.4075 | 0.3732 | 0.1803 | 0.3610 |
| No log | 7.0 | 49 | 5.3445 | 0.43 | 0.7150 | 2.9840 | 0.4300 | 0.4112 | 0.2071 | 0.3498 |
| No log | 8.0 | 56 | 5.2986 | 0.445 | 0.6860 | 2.9281 | 0.445 | 0.4327 | 0.1891 | 0.3159 |
| No log | 9.0 | 63 | 5.2567 | 0.465 | 0.6608 | 3.0783 | 0.465 | 0.4494 | 0.1835 | 0.2885 |
| No log | 10.0 | 70 | 5.4317 | 0.4425 | 0.6752 | 2.9067 | 0.4425 | 0.4477 | 0.1836 | 0.3209 |
| No log | 11.0 | 77 | 5.2073 | 0.4825 | 0.6342 | 2.7960 | 0.4825 | 0.4664 | 0.1731 | 0.2708 |
| No log | 12.0 | 84 | 5.2094 | 0.51 | 0.6150 | 2.8886 | 0.51 | 0.4767 | 0.1463 | 0.2459 |
| No log | 13.0 | 91 | 5.2379 | 0.4975 | 0.6135 | 2.7446 | 0.4975 | 0.4819 | 0.1702 | 0.2436 |
| No log | 14.0 | 98 | 5.2716 | 0.505 | 0.6123 | 2.7888 | 0.505 | 0.4962 | 0.1595 | 0.2496 |
| No log | 15.0 | 105 | 5.2410 | 0.5125 | 0.6090 | 2.6474 | 0.5125 | 0.5144 | 0.1659 | 0.2451 |
| No log | 16.0 | 112 | 5.2764 | 0.54 | 0.6063 | 2.7700 | 0.54 | 0.5316 | 0.1634 | 0.2383 |
| No log | 17.0 | 119 | 5.2652 | 0.5275 | 0.6027 | 2.7934 | 0.5275 | 0.5171 | 0.1800 | 0.2326 |
| No log | 18.0 | 126 | 5.2145 | 0.54 | 0.5944 | 2.6456 | 0.54 | 0.5350 | 0.1548 | 0.2323 |
| No log | 19.0 | 133 | 5.2611 | 0.5175 | 0.6096 | 2.5302 | 0.5175 | 0.5273 | 0.1709 | 0.2435 |
| No log | 20.0 | 140 | 5.3536 | 0.52 | 0.6110 | 2.6530 | 0.52 | 0.5229 | 0.1619 | 0.2359 |
| No log | 21.0 | 147 | 5.3020 | 0.5125 | 0.6060 | 2.7184 | 0.5125 | 0.5070 | 0.1724 | 0.2398 |
| No log | 22.0 | 154 | 5.2107 | 0.5275 | 0.5926 | 2.4436 | 0.5275 | 0.5242 | 0.1618 | 0.2255 |
| No log | 23.0 | 161 | 5.2723 | 0.53 | 0.5953 | 2.7008 | 0.53 | 0.5209 | 0.1698 | 0.2253 |
| No log | 24.0 | 168 | 5.1615 | 0.5325 | 0.5875 | 2.4753 | 0.5325 | 0.5254 | 0.1699 | 0.2247 |
| No log | 25.0 | 175 | 5.1795 | 0.5375 | 0.5825 | 2.6856 | 0.5375 | 0.5316 | 0.1781 | 0.2144 |
| No log | 26.0 | 182 | 5.2340 | 0.54 | 0.5937 | 2.6542 | 0.54 | 0.5271 | 0.1778 | 0.2215 |
| No log | 27.0 | 189 | 5.2197 | 0.5375 | 0.5831 | 2.4800 | 0.5375 | 0.5366 | 0.1666 | 0.2119 |
| No log | 28.0 | 196 | 5.2345 | 0.5275 | 0.6105 | 2.6475 | 0.5275 | 0.5247 | 0.1919 | 0.2338 |
| No log | 29.0 | 203 | 5.2050 | 0.5475 | 0.5917 | 2.6350 | 0.5475 | 0.5531 | 0.1753 | 0.2251 |
| No log | 30.0 | 210 | 5.1753 | 0.5425 | 0.5891 | 2.6472 | 0.5425 | 0.5282 | 0.1831 | 0.2215 |
| No log | 31.0 | 217 | 5.2349 | 0.535 | 0.5946 | 2.5653 | 0.535 | 0.5257 | 0.1617 | 0.2186 |
| No log | 32.0 | 224 | 5.1497 | 0.545 | 0.5778 | 2.6174 | 0.545 | 0.5425 | 0.1716 | 0.2138 |
| No log | 33.0 | 231 | 5.1688 | 0.5175 | 0.5899 | 2.5079 | 0.5175 | 0.5149 | 0.1624 | 0.2159 |
| No log | 34.0 | 238 | 5.2269 | 0.53 | 0.5961 | 2.5188 | 0.53 | 0.5326 | 0.1746 | 0.2206 |
| No log | 35.0 | 245 | 5.1477 | 0.5325 | 0.5867 | 2.4762 | 0.5325 | 0.5369 | 0.1728 | 0.2176 |
| No log | 36.0 | 252 | 5.2229 | 0.5375 | 0.5838 | 2.4397 | 0.5375 | 0.5386 | 0.1693 | 0.2167 |
| No log | 37.0 | 259 | 5.1578 | 0.535 | 0.5802 | 2.5103 | 0.535 | 0.5286 | 0.1755 | 0.2124 |
| No log | 38.0 | 266 | 5.1405 | 0.535 | 0.5979 | 2.5852 | 0.535 | 0.5346 | 0.1913 | 0.2268 |
| No log | 39.0 | 273 | 5.1236 | 0.535 | 0.5844 | 2.4851 | 0.535 | 0.5378 | 0.1729 | 0.2168 |
| No log | 40.0 | 280 | 5.0813 | 0.5475 | 0.5757 | 2.4305 | 0.5475 | 0.5434 | 0.1781 | 0.2091 |
| No log | 41.0 | 287 | 5.1844 | 0.535 | 0.5888 | 2.4730 | 0.535 | 0.5306 | 0.1707 | 0.2159 |
| No log | 42.0 | 294 | 5.1468 | 0.53 | 0.5926 | 2.4866 | 0.53 | 0.5316 | 0.1776 | 0.2200 |
| No log | 43.0 | 301 | 5.1469 | 0.53 | 0.5837 | 2.5769 | 0.53 | 0.5252 | 0.1805 | 0.2168 |
| No log | 44.0 | 308 | 5.2168 | 0.54 | 0.5955 | 2.5216 | 0.54 | 0.5419 | 0.1689 | 0.2226 |
| No log | 45.0 | 315 | 5.1395 | 0.525 | 0.5861 | 2.4328 | 0.525 | 0.5293 | 0.2006 | 0.2180 |
| No log | 46.0 | 322 | 5.1163 | 0.5425 | 0.5822 | 2.4635 | 0.5425 | 0.5416 | 0.1937 | 0.2106 |
| No log | 47.0 | 329 | 5.1227 | 0.5475 | 0.5786 | 2.5198 | 0.5475 | 0.5489 | 0.1580 | 0.2111 |
| No log | 48.0 | 336 | 5.1134 | 0.5375 | 0.5839 | 2.5239 | 0.5375 | 0.5318 | 0.1832 | 0.2071 |
| No log | 49.0 | 343 | 5.1907 | 0.5375 | 0.5913 | 2.5012 | 0.5375 | 0.5334 | 0.1853 | 0.2145 |
| No log | 50.0 | 350 | 5.1364 | 0.5375 | 0.5875 | 2.4105 | 0.5375 | 0.5415 | 0.1857 | 0.2121 |
| No log | 51.0 | 357 | 5.1739 | 0.5425 | 0.5905 | 2.5208 | 0.5425 | 0.5399 | 0.1894 | 0.2112 |
| No log | 52.0 | 364 | 5.1635 | 0.5325 | 0.5841 | 2.4658 | 0.5325 | 0.5300 | 0.1924 | 0.2124 |
| No log | 53.0 | 371 | 5.2055 | 0.5425 | 0.5866 | 2.4800 | 0.5425 | 0.5390 | 0.1983 | 0.2135 |
| No log | 54.0 | 378 | 5.1547 | 0.5375 | 0.5869 | 2.4575 | 0.5375 | 0.5340 | 0.1839 | 0.2117 |
| No log | 55.0 | 385 | 5.1437 | 0.535 | 0.5838 | 2.4117 | 0.535 | 0.5366 | 0.1914 | 0.2110 |
| No log | 56.0 | 392 | 5.2042 | 0.5425 | 0.5915 | 2.4286 | 0.5425 | 0.5445 | 0.1905 | 0.2124 |
| No log | 57.0 | 399 | 5.2084 | 0.5625 | 0.5909 | 2.4774 | 0.5625 | 0.5646 | 0.2006 | 0.2116 |
| No log | 58.0 | 406 | 5.1844 | 0.545 | 0.5895 | 2.3826 | 0.545 | 0.5466 | 0.1948 | 0.2102 |
| No log | 59.0 | 413 | 5.1759 | 0.545 | 0.5892 | 2.4790 | 0.545 | 0.5498 | 0.1730 | 0.2143 |
| No log | 60.0 | 420 | 5.1783 | 0.5475 | 0.5894 | 2.4294 | 0.5475 | 0.5452 | 0.2043 | 0.2087 |
| No log | 61.0 | 427 | 5.1874 | 0.545 | 0.5879 | 2.4295 | 0.545 | 0.5412 | 0.1959 | 0.2080 |
| No log | 62.0 | 434 | 5.1861 | 0.5475 | 0.5840 | 2.4513 | 0.5475 | 0.5430 | 0.2097 | 0.2107 |
| No log | 63.0 | 441 | 5.1608 | 0.545 | 0.5818 | 2.4581 | 0.545 | 0.5450 | 0.1666 | 0.2055 |
| No log | 64.0 | 448 | 5.2018 | 0.5475 | 0.5911 | 2.4537 | 0.5475 | 0.5448 | 0.1938 | 0.2113 |
| No log | 65.0 | 455 | 5.2113 | 0.5375 | 0.5953 | 2.4444 | 0.5375 | 0.5360 | 0.1757 | 0.2106 |
| No log | 66.0 | 462 | 5.1985 | 0.5425 | 0.5897 | 2.4287 | 0.5425 | 0.5377 | 0.1870 | 0.2095 |
| No log | 67.0 | 469 | 5.2218 | 0.5325 | 0.5856 | 2.4340 | 0.5325 | 0.5320 | 0.1882 | 0.2059 |
| No log | 68.0 | 476 | 5.2243 | 0.545 | 0.5931 | 2.3923 | 0.545 | 0.5447 | 0.1799 | 0.2120 |
| No log | 69.0 | 483 | 5.2103 | 0.55 | 0.5881 | 2.4619 | 0.55 | 0.5486 | 0.2084 | 0.2060 |
| No log | 70.0 | 490 | 5.2370 | 0.55 | 0.5933 | 2.4236 | 0.55 | 0.5521 | 0.1920 | 0.2108 |
| No log | 71.0 | 497 | 5.2185 | 0.5475 | 0.5890 | 2.4137 | 0.5475 | 0.5435 | 0.2121 | 0.2076 |
| 3.6002 | 72.0 | 504 | 5.2460 | 0.545 | 0.5944 | 2.4704 | 0.545 | 0.5430 | 0.1922 | 0.2117 |
| 3.6002 | 73.0 | 511 | 5.2454 | 0.5425 | 0.5928 | 2.4750 | 0.5425 | 0.5406 | 0.1940 | 0.2080 |
| 3.6002 | 74.0 | 518 | 5.2307 | 0.5575 | 0.5935 | 2.4623 | 0.5575 | 0.5599 | 0.1959 | 0.2071 |
| 3.6002 | 75.0 | 525 | 5.2674 | 0.56 | 0.5877 | 2.4453 | 0.56 | 0.5587 | 0.1956 | 0.2033 |
| 3.6002 | 76.0 | 532 | 5.2263 | 0.5525 | 0.5907 | 2.5044 | 0.5525 | 0.5526 | 0.1862 | 0.2067 |
| 3.6002 | 77.0 | 539 | 5.2498 | 0.55 | 0.5938 | 2.4668 | 0.55 | 0.5467 | 0.2072 | 0.2059 |
| 3.6002 | 78.0 | 546 | 5.2671 | 0.545 | 0.5961 | 2.4394 | 0.545 | 0.5421 | 0.2056 | 0.2093 |
| 3.6002 | 79.0 | 553 | 5.2923 | 0.545 | 0.5950 | 2.4662 | 0.545 | 0.5455 | 0.1833 | 0.2058 |
| 3.6002 | 80.0 | 560 | 5.2854 | 0.555 | 0.5918 | 2.5010 | 0.555 | 0.5526 | 0.2040 | 0.2059 |
| 3.6002 | 81.0 | 567 | 5.3009 | 0.535 | 0.5955 | 2.4253 | 0.535 | 0.5319 | 0.1939 | 0.2101 |
| 3.6002 | 82.0 | 574 | 5.3016 | 0.535 | 0.5979 | 2.4528 | 0.535 | 0.5315 | 0.2020 | 0.2101 |
| 3.6002 | 83.0 | 581 | 5.3262 | 0.545 | 0.5990 | 2.4245 | 0.545 | 0.5422 | 0.1816 | 0.2081 |
| 3.6002 | 84.0 | 588 | 5.3206 | 0.535 | 0.5990 | 2.4519 | 0.535 | 0.5350 | 0.1959 | 0.2121 |
| 3.6002 | 85.0 | 595 | 5.3333 | 0.5375 | 0.5999 | 2.4909 | 0.5375 | 0.5352 | 0.1881 | 0.2109 |
| 3.6002 | 86.0 | 602 | 5.3407 | 0.535 | 0.6008 | 2.5019 | 0.535 | 0.5331 | 0.2087 | 0.2096 |
| 3.6002 | 87.0 | 609 | 5.3413 | 0.5425 | 0.6015 | 2.4753 | 0.5425 | 0.5402 | 0.2147 | 0.2101 |
| 3.6002 | 88.0 | 616 | 5.3716 | 0.5375 | 0.6041 | 2.4290 | 0.5375 | 0.5373 | 0.2234 | 0.2094 |
| 3.6002 | 89.0 | 623 | 5.3639 | 0.535 | 0.6010 | 2.4159 | 0.535 | 0.5319 | 0.2068 | 0.2108 |
| 3.6002 | 90.0 | 630 | 5.3742 | 0.5425 | 0.6030 | 2.4588 | 0.5425 | 0.5420 | 0.2021 | 0.2099 |
| 3.6002 | 91.0 | 637 | 5.3731 | 0.53 | 0.6046 | 2.4580 | 0.53 | 0.5284 | 0.2193 | 0.2122 |
| 3.6002 | 92.0 | 644 | 5.3919 | 0.54 | 0.6051 | 2.4317 | 0.54 | 0.5395 | 0.2057 | 0.2090 |
| 3.6002 | 93.0 | 651 | 5.3947 | 0.54 | 0.6049 | 2.4372 | 0.54 | 0.5385 | 0.2053 | 0.2092 |
| 3.6002 | 94.0 | 658 | 5.4070 | 0.535 | 0.6067 | 2.4600 | 0.535 | 0.5328 | 0.2297 | 0.2108 |
| 3.6002 | 95.0 | 665 | 5.4129 | 0.535 | 0.6071 | 2.4249 | 0.535 | 0.5345 | 0.2186 | 0.2104 |
| 3.6002 | 96.0 | 672 | 5.4137 | 0.535 | 0.6071 | 2.4580 | 0.535 | 0.5339 | 0.2192 | 0.2102 |
| 3.6002 | 97.0 | 679 | 5.4218 | 0.5325 | 0.6079 | 2.4584 | 0.5325 | 0.5316 | 0.2197 | 0.2112 |
| 3.6002 | 98.0 | 686 | 5.4236 | 0.5325 | 0.6083 | 2.4585 | 0.5325 | 0.5311 | 0.2154 | 0.2110 |
| 3.6002 | 99.0 | 693 | 5.4261 | 0.5325 | 0.6081 | 2.4569 | 0.5325 | 0.5316 | 0.2294 | 0.2114 |
| 3.6002 | 100.0 | 700 | 5.4262 | 0.535 | 0.6080 | 2.4569 | 0.535 | 0.5345 | 0.2120 | 0.2105 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
sbaner24/vit-base-patch16-224-Trial006-007-008-YEL_STEM3
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-Trial006-007-008-YEL_STEM3
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on a dataset loaded in `imagefolder` format.
It achieves the following results on the evaluation set:
- Loss: 0.1618
- Accuracy: 0.9241
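The headline numbers above match the epoch-10 row of the table below rather than the final epoch, which suggests an intermediate checkpoint was kept. With the Trainer, that selection is typically automated like this (a sketch of standard options, not this run's verified configuration):

```python
# Keeping the best intermediate checkpoint -- a sketch of standard
# Trainer options. Assumes a compute_metrics function that reports
# an "accuracy" key.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="YEL_STEM3",            # hypothetical output directory
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="accuracy",
)
```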
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 60
- eval_batch_size: 60
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 240
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7035 | 1.0 | 6 | 0.6557 | 0.6203 |
| 0.6168 | 2.0 | 12 | 0.5788 | 0.7215 |
| 0.5412 | 3.0 | 18 | 0.5005 | 0.7785 |
| 0.496 | 4.0 | 24 | 0.4946 | 0.7722 |
| 0.4024 | 5.0 | 30 | 0.4057 | 0.8165 |
| 0.4098 | 6.0 | 36 | 0.3076 | 0.8544 |
| 0.3645 | 7.0 | 42 | 0.3250 | 0.8418 |
| 0.276 | 8.0 | 48 | 0.2206 | 0.8924 |
| 0.3358 | 9.0 | 54 | 0.2100 | 0.8987 |
| 0.3386 | 10.0 | 60 | 0.1618 | 0.9241 |
| 0.2778 | 11.0 | 66 | 0.1609 | 0.9177 |
| 0.25 | 12.0 | 72 | 0.1581 | 0.9114 |
| 0.2914 | 13.0 | 78 | 0.1663 | 0.9114 |
| 0.2273 | 14.0 | 84 | 0.1525 | 0.9177 |
| 0.2694 | 15.0 | 90 | 0.1708 | 0.9051 |
| 0.2745 | 16.0 | 96 | 0.2364 | 0.8734 |
| 0.2809 | 17.0 | 102 | 0.1976 | 0.8608 |
| 0.2368 | 18.0 | 108 | 0.1517 | 0.9114 |
| 0.328 | 19.0 | 114 | 0.2454 | 0.8671 |
| 0.2571 | 20.0 | 120 | 0.1482 | 0.9114 |
| 0.2996 | 21.0 | 126 | 0.1629 | 0.8987 |
| 0.266 | 22.0 | 132 | 0.1360 | 0.9114 |
| 0.2323 | 23.0 | 138 | 0.1427 | 0.9114 |
| 0.2285 | 24.0 | 144 | 0.1683 | 0.9051 |
| 0.2566 | 25.0 | 150 | 0.1442 | 0.9114 |
| 0.2509 | 26.0 | 156 | 0.1595 | 0.9114 |
| 0.2337 | 27.0 | 162 | 0.1291 | 0.9177 |
| 0.2203 | 28.0 | 168 | 0.1302 | 0.8987 |
| 0.2409 | 29.0 | 174 | 0.1274 | 0.9114 |
| 0.2256 | 30.0 | 180 | 0.1272 | 0.8987 |
| 0.2157 | 31.0 | 186 | 0.1289 | 0.9177 |
| 0.2168 | 32.0 | 192 | 0.1267 | 0.9114 |
| 0.2426 | 33.0 | 198 | 0.1438 | 0.8987 |
| 0.2404 | 34.0 | 204 | 0.1388 | 0.8987 |
| 0.2218 | 35.0 | 210 | 0.1243 | 0.9241 |
| 0.3068 | 36.0 | 216 | 0.1268 | 0.9241 |
| 0.1721 | 37.0 | 222 | 0.1477 | 0.8987 |
| 0.2201 | 38.0 | 228 | 0.1545 | 0.8987 |
| 0.2581 | 39.0 | 234 | 0.1700 | 0.8987 |
| 0.213 | 40.0 | 240 | 0.1254 | 0.9114 |
| 0.2953 | 41.0 | 246 | 0.1237 | 0.9114 |
| 0.2564 | 42.0 | 252 | 0.1472 | 0.9051 |
| 0.249 | 43.0 | 258 | 0.1409 | 0.9051 |
| 0.2372 | 44.0 | 264 | 0.1495 | 0.9114 |
| 0.2541 | 45.0 | 270 | 0.1412 | 0.9051 |
| 0.1997 | 46.0 | 276 | 0.1308 | 0.9114 |
| 0.2381 | 47.0 | 282 | 0.1253 | 0.9177 |
| 0.2623 | 48.0 | 288 | 0.1267 | 0.9051 |
| 0.1855 | 49.0 | 294 | 0.1285 | 0.9051 |
| 0.1877 | 50.0 | 300 | 0.1289 | 0.9051 |
### Framework versions
- Transformers 4.30.0.dev0
- Pytorch 1.12.1
- Datasets 2.12.0
- Tokenizers 0.13.1
|
[
"plant2",
"plant3"
] |
sbaner24/vit-base-patch16-224-Trial006-007-008-YEL_STEM4
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-Trial006-007-008-YEL_STEM4
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1729
- Accuracy: 0.9355
## Model description
More information needed
## Intended uses & limitations
More information needed
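Pending fuller documentation, a minimal inference sketch, assuming the checkpoint is published under this repository id (the image path is a placeholder):
```python
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="sbaner24/vit-base-patch16-224-Trial006-007-008-YEL_STEM4",
)
print(classifier("stem_photo.jpg"))  # e.g. [{"label": "plant2", "score": 0.97}, ...]
```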
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 60
- eval_batch_size: 60
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 240
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6572 | 1.0 | 6 | 0.6719 | 0.5677 |
| 0.6279 | 2.0 | 12 | 0.6436 | 0.6516 |
| 0.5282 | 3.0 | 18 | 0.5677 | 0.7871 |
| 0.5405 | 4.0 | 24 | 0.4718 | 0.8258 |
| 0.5276 | 5.0 | 30 | 0.4959 | 0.7419 |
| 0.412 | 6.0 | 36 | 0.3081 | 0.8581 |
| 0.3011 | 7.0 | 42 | 0.2542 | 0.8839 |
| 0.3329 | 8.0 | 48 | 0.2350 | 0.8774 |
| 0.4478 | 9.0 | 54 | 0.1743 | 0.9290 |
| 0.268 | 10.0 | 60 | 0.1708 | 0.9161 |
| 0.2177 | 11.0 | 66 | 0.1729 | 0.9355 |
| 0.2675 | 12.0 | 72 | 0.1913 | 0.8968 |
| 0.4784 | 13.0 | 78 | 0.1826 | 0.9032 |
| 0.2456 | 14.0 | 84 | 0.1774 | 0.9032 |
| 0.6229 | 15.0 | 90 | 0.2196 | 0.8968 |
| 0.2561 | 16.0 | 96 | 0.1823 | 0.9226 |
| 0.3785 | 17.0 | 102 | 0.1770 | 0.9032 |
| 0.2334 | 18.0 | 108 | 0.2056 | 0.8903 |
| 0.1904 | 19.0 | 114 | 0.1564 | 0.9097 |
| 0.2256 | 20.0 | 120 | 0.1407 | 0.9226 |
| 0.2547 | 21.0 | 126 | 0.1552 | 0.9032 |
| 0.3468 | 22.0 | 132 | 0.1819 | 0.8968 |
| 0.4116 | 23.0 | 138 | 0.1537 | 0.9290 |
| 0.3689 | 24.0 | 144 | 0.1645 | 0.9097 |
| 0.3541 | 25.0 | 150 | 0.1527 | 0.9290 |
| 0.2498 | 26.0 | 156 | 0.1670 | 0.9161 |
| 0.3625 | 27.0 | 162 | 0.1522 | 0.9161 |
| 0.2463 | 28.0 | 168 | 0.1552 | 0.9226 |
| 0.3447 | 29.0 | 174 | 0.1510 | 0.9097 |
| 0.205 | 30.0 | 180 | 0.1924 | 0.9032 |
| 0.2023 | 31.0 | 186 | 0.1376 | 0.9355 |
| 0.3617 | 32.0 | 192 | 0.1518 | 0.9097 |
| 0.3515 | 33.0 | 198 | 0.1473 | 0.9097 |
| 0.1927 | 34.0 | 204 | 0.1544 | 0.9097 |
| 0.4567 | 35.0 | 210 | 0.1528 | 0.9097 |
| 0.3113 | 36.0 | 216 | 0.1510 | 0.9226 |
| 0.3475 | 37.0 | 222 | 0.1594 | 0.9161 |
| 0.1889 | 38.0 | 228 | 0.1448 | 0.9290 |
| 0.1979 | 39.0 | 234 | 0.1533 | 0.9226 |
| 0.3578 | 40.0 | 240 | 0.1627 | 0.9097 |
| 0.2004 | 41.0 | 246 | 0.1620 | 0.9161 |
| 0.3567 | 42.0 | 252 | 0.1475 | 0.9226 |
| 0.192 | 43.0 | 258 | 0.1504 | 0.9032 |
| 0.1872 | 44.0 | 264 | 0.1535 | 0.9097 |
| 0.2079 | 45.0 | 270 | 0.1490 | 0.9161 |
| 0.1503 | 46.0 | 276 | 0.1459 | 0.9161 |
| 0.169 | 47.0 | 282 | 0.1506 | 0.8968 |
| 0.1884 | 48.0 | 288 | 0.1556 | 0.8968 |
| 0.1638 | 49.0 | 294 | 0.1573 | 0.8968 |
| 0.1921 | 50.0 | 300 | 0.1570 | 0.8968 |
### Framework versions
- Transformers 4.30.0.dev0
- Pytorch 1.12.1
- Datasets 2.12.0
- Tokenizers 0.13.1
|
[
"plant2",
"plant3"
] |
02shanky/test_model_graphics_classification_LION
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# test_model_graphics_classification_LION
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1569
- Accuracy: 0.9558
## Model description
More information needed
## Intended uses & limitations
More information needed
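Pending fuller documentation, a minimal usage sketch with the lower-level API, assuming the checkpoint is published under this repository id (the image path is a placeholder):
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "02shanky/test_model_graphics_classification_LION"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])  # cartoon / icon / images / nongraphics
```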
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1765 | 0.99 | 42 | 0.1569 | 0.9558 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"cartoon",
"icon",
"images",
"nongraphics"
] |
jordyvl/vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t1.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t1.5_a0.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6786
- Accuracy: 0.5475
- Brier Loss: 0.5719
- Nll: 2.4381
- F1 Micro: 0.5475
- F1 Macro: 0.5432
- Ece: 0.1563
- Aurc: 0.2080
## Model description
More information needed
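The name suggests training with a combined cross-entropy + knowledge-distillation objective at temperature t = 1.5 and mixing weight a = 0.5. A minimal sketch of such a loss, assuming the common convention that `a` weights the cross-entropy term (the actual training code is not published):
```python
import torch.nn.functional as F

def cekd_loss(student_logits, teacher_logits, labels, t=1.5, a=0.5):
    """Cross-entropy mixed with temperature-scaled KL to a teacher."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / t, dim=-1),
        F.log_softmax(teacher_logits / t, dim=-1),
        reduction="batchmean",
        log_target=True,
    ) * (t * t)  # standard temperature-squared rescaling
    return a * ce + (1 - a) * kd
```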
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 4.6969 | 0.0425 | 1.0710 | 7.3408 | 0.0425 | 0.0336 | 0.2818 | 0.9543 |
| No log | 2.0 | 14 | 3.6834 | 0.1 | 0.9492 | 5.8001 | 0.1000 | 0.0878 | 0.1662 | 0.8811 |
| No log | 3.0 | 21 | 3.2796 | 0.2325 | 0.8815 | 5.3289 | 0.2325 | 0.1817 | 0.1601 | 0.6217 |
| No log | 4.0 | 28 | 2.7674 | 0.2975 | 0.7907 | 3.6563 | 0.2975 | 0.2855 | 0.1624 | 0.4730 |
| No log | 5.0 | 35 | 2.4072 | 0.3925 | 0.7175 | 3.0843 | 0.3925 | 0.3726 | 0.1811 | 0.3501 |
| No log | 6.0 | 42 | 2.2499 | 0.44 | 0.6923 | 3.0183 | 0.44 | 0.4252 | 0.2052 | 0.3017 |
| No log | 7.0 | 49 | 2.2199 | 0.455 | 0.6973 | 3.0254 | 0.455 | 0.4233 | 0.2117 | 0.2958 |
| No log | 8.0 | 56 | 2.1564 | 0.4725 | 0.6958 | 2.8105 | 0.4725 | 0.4637 | 0.2159 | 0.2954 |
| No log | 9.0 | 63 | 2.2963 | 0.4575 | 0.7105 | 2.7884 | 0.4575 | 0.4386 | 0.2617 | 0.2866 |
| No log | 10.0 | 70 | 2.3511 | 0.4475 | 0.7553 | 2.8174 | 0.4475 | 0.4278 | 0.2698 | 0.3171 |
| No log | 11.0 | 77 | 2.4111 | 0.4425 | 0.7483 | 2.9313 | 0.4425 | 0.4290 | 0.2569 | 0.3056 |
| No log | 12.0 | 84 | 2.2927 | 0.4825 | 0.7158 | 2.7197 | 0.4825 | 0.4799 | 0.2453 | 0.2920 |
| No log | 13.0 | 91 | 2.2809 | 0.4825 | 0.7098 | 2.6808 | 0.4825 | 0.4758 | 0.2454 | 0.2811 |
| No log | 14.0 | 98 | 2.1787 | 0.4825 | 0.7036 | 2.5537 | 0.4825 | 0.4801 | 0.2417 | 0.2877 |
| No log | 15.0 | 105 | 2.1643 | 0.495 | 0.6934 | 2.7769 | 0.495 | 0.4892 | 0.2375 | 0.2707 |
| No log | 16.0 | 112 | 2.1309 | 0.48 | 0.6866 | 2.5936 | 0.48 | 0.4647 | 0.2130 | 0.2663 |
| No log | 17.0 | 119 | 2.1389 | 0.5125 | 0.6580 | 2.6313 | 0.5125 | 0.4962 | 0.1976 | 0.2424 |
| No log | 18.0 | 126 | 2.0813 | 0.5025 | 0.6649 | 2.6517 | 0.5025 | 0.5019 | 0.2126 | 0.2602 |
| No log | 19.0 | 133 | 2.0742 | 0.4975 | 0.6763 | 2.6690 | 0.4975 | 0.4739 | 0.2473 | 0.2575 |
| No log | 20.0 | 140 | 2.0550 | 0.48 | 0.6702 | 2.5468 | 0.48 | 0.4838 | 0.2206 | 0.2745 |
| No log | 21.0 | 147 | 1.9216 | 0.52 | 0.6332 | 2.6838 | 0.52 | 0.5237 | 0.1895 | 0.2410 |
| No log | 22.0 | 154 | 1.9572 | 0.5025 | 0.6500 | 2.4719 | 0.5025 | 0.4877 | 0.2223 | 0.2470 |
| No log | 23.0 | 161 | 1.8821 | 0.5175 | 0.6294 | 2.5337 | 0.5175 | 0.5067 | 0.2081 | 0.2332 |
| No log | 24.0 | 168 | 1.9198 | 0.5125 | 0.6332 | 2.5310 | 0.5125 | 0.5028 | 0.2039 | 0.2434 |
| No log | 25.0 | 175 | 1.9419 | 0.515 | 0.6358 | 2.6324 | 0.515 | 0.5005 | 0.2203 | 0.2383 |
| No log | 26.0 | 182 | 1.8714 | 0.525 | 0.6145 | 2.5820 | 0.525 | 0.5005 | 0.1843 | 0.2232 |
| No log | 27.0 | 189 | 1.9460 | 0.4975 | 0.6365 | 2.6119 | 0.4975 | 0.4907 | 0.2169 | 0.2395 |
| No log | 28.0 | 196 | 1.9364 | 0.525 | 0.6333 | 2.5789 | 0.525 | 0.5013 | 0.1808 | 0.2437 |
| No log | 29.0 | 203 | 1.9143 | 0.5425 | 0.6144 | 2.7809 | 0.5425 | 0.5181 | 0.1700 | 0.2224 |
| No log | 30.0 | 210 | 1.8565 | 0.5275 | 0.6171 | 2.5610 | 0.5275 | 0.5142 | 0.1823 | 0.2304 |
| No log | 31.0 | 217 | 1.8281 | 0.5325 | 0.6113 | 2.6216 | 0.5325 | 0.5191 | 0.1802 | 0.2167 |
| No log | 32.0 | 224 | 1.8213 | 0.525 | 0.6192 | 2.5521 | 0.525 | 0.5278 | 0.1844 | 0.2368 |
| No log | 33.0 | 231 | 1.7724 | 0.545 | 0.6030 | 2.4900 | 0.545 | 0.5460 | 0.1810 | 0.2211 |
| No log | 34.0 | 238 | 1.7617 | 0.5475 | 0.5985 | 2.5994 | 0.5475 | 0.5345 | 0.1907 | 0.2161 |
| No log | 35.0 | 245 | 1.8276 | 0.54 | 0.6090 | 2.5254 | 0.54 | 0.5172 | 0.1839 | 0.2249 |
| No log | 36.0 | 252 | 1.7646 | 0.5475 | 0.5924 | 2.5729 | 0.5475 | 0.5458 | 0.1930 | 0.2190 |
| No log | 37.0 | 259 | 1.7851 | 0.5675 | 0.5872 | 2.5036 | 0.5675 | 0.5600 | 0.1670 | 0.2056 |
| No log | 38.0 | 266 | 1.7743 | 0.5225 | 0.6136 | 2.4765 | 0.5225 | 0.5163 | 0.1831 | 0.2276 |
| No log | 39.0 | 273 | 1.7760 | 0.5325 | 0.5912 | 2.5692 | 0.5325 | 0.5228 | 0.1589 | 0.2108 |
| No log | 40.0 | 280 | 1.7664 | 0.53 | 0.5917 | 2.5363 | 0.53 | 0.5089 | 0.1800 | 0.2095 |
| No log | 41.0 | 287 | 1.8177 | 0.535 | 0.6070 | 2.5195 | 0.535 | 0.5272 | 0.1883 | 0.2268 |
| No log | 42.0 | 294 | 1.7575 | 0.56 | 0.5868 | 2.5956 | 0.56 | 0.5504 | 0.1740 | 0.2091 |
| No log | 43.0 | 301 | 1.7616 | 0.54 | 0.6001 | 2.3469 | 0.54 | 0.5401 | 0.1861 | 0.2272 |
| No log | 44.0 | 308 | 1.7105 | 0.5425 | 0.5831 | 2.4709 | 0.5425 | 0.5380 | 0.1884 | 0.2086 |
| No log | 45.0 | 315 | 1.7502 | 0.565 | 0.5880 | 2.4600 | 0.565 | 0.5546 | 0.1527 | 0.2084 |
| No log | 46.0 | 322 | 1.7135 | 0.565 | 0.5834 | 2.4571 | 0.565 | 0.5631 | 0.1703 | 0.2103 |
| No log | 47.0 | 329 | 1.7327 | 0.5525 | 0.5892 | 2.4573 | 0.5525 | 0.5491 | 0.1820 | 0.2178 |
| No log | 48.0 | 336 | 1.7405 | 0.535 | 0.5947 | 2.5414 | 0.535 | 0.5338 | 0.1840 | 0.2180 |
| No log | 49.0 | 343 | 1.7265 | 0.555 | 0.5758 | 2.4824 | 0.555 | 0.5423 | 0.1603 | 0.2025 |
| No log | 50.0 | 350 | 1.7065 | 0.5525 | 0.5904 | 2.3807 | 0.5525 | 0.5576 | 0.1769 | 0.2212 |
| No log | 51.0 | 357 | 1.7265 | 0.545 | 0.5833 | 2.4535 | 0.545 | 0.5413 | 0.1704 | 0.2108 |
| No log | 52.0 | 364 | 1.7197 | 0.55 | 0.5740 | 2.5386 | 0.55 | 0.5347 | 0.1467 | 0.2032 |
| No log | 53.0 | 371 | 1.7015 | 0.5575 | 0.5826 | 2.4159 | 0.5575 | 0.5578 | 0.1751 | 0.2138 |
| No log | 54.0 | 378 | 1.7263 | 0.55 | 0.5873 | 2.4471 | 0.55 | 0.5456 | 0.1629 | 0.2144 |
| No log | 55.0 | 385 | 1.6786 | 0.555 | 0.5780 | 2.3908 | 0.555 | 0.5490 | 0.1627 | 0.2106 |
| No log | 56.0 | 392 | 1.7147 | 0.55 | 0.5811 | 2.3876 | 0.55 | 0.5476 | 0.1724 | 0.2122 |
| No log | 57.0 | 399 | 1.6983 | 0.5525 | 0.5769 | 2.5028 | 0.5525 | 0.5415 | 0.1716 | 0.2054 |
| No log | 58.0 | 406 | 1.7350 | 0.5425 | 0.5984 | 2.3835 | 0.5425 | 0.5406 | 0.1806 | 0.2282 |
| No log | 59.0 | 413 | 1.7015 | 0.54 | 0.5779 | 2.4865 | 0.54 | 0.5288 | 0.1848 | 0.2079 |
| No log | 60.0 | 420 | 1.6783 | 0.5525 | 0.5712 | 2.4231 | 0.5525 | 0.5497 | 0.1777 | 0.2046 |
| No log | 61.0 | 427 | 1.7236 | 0.545 | 0.5856 | 2.5104 | 0.545 | 0.5379 | 0.1850 | 0.2142 |
| No log | 62.0 | 434 | 1.6904 | 0.5425 | 0.5752 | 2.4626 | 0.5425 | 0.5400 | 0.1787 | 0.2084 |
| No log | 63.0 | 441 | 1.7136 | 0.5375 | 0.5837 | 2.4091 | 0.5375 | 0.5304 | 0.1681 | 0.2150 |
| No log | 64.0 | 448 | 1.6783 | 0.56 | 0.5762 | 2.4017 | 0.56 | 0.5594 | 0.1607 | 0.2093 |
| No log | 65.0 | 455 | 1.7098 | 0.5425 | 0.5797 | 2.5062 | 0.5425 | 0.5406 | 0.1707 | 0.2120 |
| No log | 66.0 | 462 | 1.6752 | 0.5525 | 0.5712 | 2.4684 | 0.5525 | 0.5478 | 0.1711 | 0.2063 |
| No log | 67.0 | 469 | 1.7068 | 0.545 | 0.5761 | 2.5107 | 0.545 | 0.5415 | 0.1750 | 0.2090 |
| No log | 68.0 | 476 | 1.6706 | 0.5525 | 0.5700 | 2.4322 | 0.5525 | 0.5466 | 0.1798 | 0.2068 |
| No log | 69.0 | 483 | 1.6941 | 0.5525 | 0.5749 | 2.4689 | 0.5525 | 0.5501 | 0.1654 | 0.2072 |
| No log | 70.0 | 490 | 1.6726 | 0.545 | 0.5711 | 2.4348 | 0.545 | 0.5416 | 0.1811 | 0.2083 |
| No log | 71.0 | 497 | 1.6831 | 0.5525 | 0.5740 | 2.4659 | 0.5525 | 0.5498 | 0.1687 | 0.2061 |
| 0.4086 | 72.0 | 504 | 1.6749 | 0.5575 | 0.5699 | 2.4122 | 0.5575 | 0.5535 | 0.1706 | 0.2051 |
| 0.4086 | 73.0 | 511 | 1.6867 | 0.555 | 0.5740 | 2.4402 | 0.555 | 0.5495 | 0.1725 | 0.2069 |
| 0.4086 | 74.0 | 518 | 1.6768 | 0.5475 | 0.5716 | 2.4369 | 0.5475 | 0.5442 | 0.1621 | 0.2080 |
| 0.4086 | 75.0 | 525 | 1.6835 | 0.55 | 0.5736 | 2.4404 | 0.55 | 0.5448 | 0.1824 | 0.2075 |
| 0.4086 | 76.0 | 532 | 1.6737 | 0.5525 | 0.5701 | 2.4361 | 0.5525 | 0.5497 | 0.1731 | 0.2063 |
| 0.4086 | 77.0 | 539 | 1.6796 | 0.55 | 0.5721 | 2.4399 | 0.55 | 0.5450 | 0.1723 | 0.2066 |
| 0.4086 | 78.0 | 546 | 1.6774 | 0.55 | 0.5718 | 2.4362 | 0.55 | 0.5472 | 0.1732 | 0.2071 |
| 0.4086 | 79.0 | 553 | 1.6781 | 0.555 | 0.5711 | 2.4390 | 0.555 | 0.5506 | 0.1588 | 0.2059 |
| 0.4086 | 80.0 | 560 | 1.6787 | 0.555 | 0.5714 | 2.4380 | 0.555 | 0.5508 | 0.1832 | 0.2070 |
| 0.4086 | 81.0 | 567 | 1.6778 | 0.555 | 0.5717 | 2.4366 | 0.555 | 0.5516 | 0.1615 | 0.2060 |
| 0.4086 | 82.0 | 574 | 1.6788 | 0.55 | 0.5717 | 2.4380 | 0.55 | 0.5462 | 0.1680 | 0.2070 |
| 0.4086 | 83.0 | 581 | 1.6761 | 0.5525 | 0.5712 | 2.4366 | 0.5525 | 0.5505 | 0.1809 | 0.2064 |
| 0.4086 | 84.0 | 588 | 1.6778 | 0.55 | 0.5714 | 2.4388 | 0.55 | 0.5443 | 0.1571 | 0.2073 |
| 0.4086 | 85.0 | 595 | 1.6772 | 0.5525 | 0.5716 | 2.4367 | 0.5525 | 0.5479 | 0.1768 | 0.2073 |
| 0.4086 | 86.0 | 602 | 1.6789 | 0.5525 | 0.5722 | 2.4376 | 0.5525 | 0.5470 | 0.1646 | 0.2066 |
| 0.4086 | 87.0 | 609 | 1.6784 | 0.5525 | 0.5717 | 2.4384 | 0.5525 | 0.5486 | 0.1743 | 0.2073 |
| 0.4086 | 88.0 | 616 | 1.6786 | 0.55 | 0.5720 | 2.4382 | 0.55 | 0.5443 | 0.1559 | 0.2077 |
| 0.4086 | 89.0 | 623 | 1.6784 | 0.5525 | 0.5718 | 2.4379 | 0.5525 | 0.5479 | 0.1561 | 0.2073 |
| 0.4086 | 90.0 | 630 | 1.6782 | 0.5525 | 0.5718 | 2.4381 | 0.5525 | 0.5482 | 0.1561 | 0.2066 |
| 0.4086 | 91.0 | 637 | 1.6784 | 0.5525 | 0.5719 | 2.4375 | 0.5525 | 0.5482 | 0.1612 | 0.2065 |
| 0.4086 | 92.0 | 644 | 1.6783 | 0.55 | 0.5718 | 2.4377 | 0.55 | 0.5456 | 0.1702 | 0.2073 |
| 0.4086 | 93.0 | 651 | 1.6781 | 0.55 | 0.5718 | 2.4378 | 0.55 | 0.5456 | 0.1581 | 0.2075 |
| 0.4086 | 94.0 | 658 | 1.6785 | 0.55 | 0.5719 | 2.4379 | 0.55 | 0.5455 | 0.1602 | 0.2074 |
| 0.4086 | 95.0 | 665 | 1.6785 | 0.55 | 0.5719 | 2.4379 | 0.55 | 0.5456 | 0.1635 | 0.2075 |
| 0.4086 | 96.0 | 672 | 1.6785 | 0.5475 | 0.5719 | 2.4381 | 0.5475 | 0.5432 | 0.1659 | 0.2080 |
| 0.4086 | 97.0 | 679 | 1.6786 | 0.55 | 0.5719 | 2.4381 | 0.55 | 0.5458 | 0.1545 | 0.2071 |
| 0.4086 | 98.0 | 686 | 1.6786 | 0.5475 | 0.5719 | 2.4381 | 0.5475 | 0.5431 | 0.1613 | 0.2081 |
| 0.4086 | 99.0 | 693 | 1.6786 | 0.5475 | 0.5719 | 2.4381 | 0.5475 | 0.5431 | 0.1613 | 0.2081 |
| 0.4086 | 100.0 | 700 | 1.6786 | 0.5475 | 0.5719 | 2.4381 | 0.5475 | 0.5432 | 0.1563 | 0.2080 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-small_rvl_cdip
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_rvl_cdip
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0854
- Accuracy: 0.9110
- Brier Loss: 0.1305
- Nll: 1.2339
- F1 Micro: 0.9110
- F1 Macro: 0.9112
- Ece: 0.0097
- Aurc: 0.0122
## Model description
More information needed
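Beyond accuracy, this card reports calibration metrics (Brier loss, ECE). A sketch of the standard definitions, using equal-width confidence bins for ECE; the binning behind the numbers above is not documented:
```python
import numpy as np

def brier_score(probs, labels):
    # probs: (N, C) predicted probabilities; labels: (N,) integer class ids
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs, labels, n_bins=10):
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(correct[in_bin].mean() - conf[in_bin].mean())
    return float(ece)
```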
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| 0.2638 | 1.0 | 2500 | 0.2093 | 0.8574 | 0.2035 | 1.6580 | 0.8574 | 0.8564 | 0.0105 | 0.0272 |
| 0.1559 | 2.0 | 5000 | 0.1413 | 0.8854 | 0.1645 | 1.5062 | 0.8854 | 0.8863 | 0.0101 | 0.0185 |
| 0.1036 | 3.0 | 7500 | 0.1212 | 0.8994 | 0.1461 | 1.4353 | 0.8994 | 0.8999 | 0.0096 | 0.0150 |
| 0.0693 | 4.0 | 10000 | 0.1196 | 0.9013 | 0.1440 | 1.3827 | 0.9013 | 0.9017 | 0.0152 | 0.0148 |
| 0.0451 | 5.0 | 12500 | 0.1094 | 0.9062 | 0.1391 | 1.3099 | 0.9062 | 0.9064 | 0.0124 | 0.0139 |
| 0.0317 | 6.0 | 15000 | 0.0997 | 0.9073 | 0.1357 | 1.2889 | 0.9073 | 0.9078 | 0.0091 | 0.0132 |
| 0.0224 | 7.0 | 17500 | 0.0961 | 0.9081 | 0.1348 | 1.2705 | 0.9081 | 0.9082 | 0.0100 | 0.0129 |
| 0.0166 | 8.0 | 20000 | 0.0890 | 0.9099 | 0.1328 | 1.2484 | 0.9099 | 0.9102 | 0.0085 | 0.0126 |
| 0.0117 | 9.0 | 22500 | 0.0862 | 0.9096 | 0.1316 | 1.2428 | 0.9096 | 0.9100 | 0.0100 | 0.0123 |
| 0.0085 | 10.0 | 25000 | 0.0854 | 0.9110 | 0.1305 | 1.2339 | 0.9110 | 0.9112 | 0.0097 | 0.0122 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t1.5_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t1.5_a0.7
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5923
- Accuracy: 0.57
- Brier Loss: 0.5750
- Nll: 2.3088
- F1 Micro: 0.57
- F1 Macro: 0.5661
- Ece: 0.1722
- Aurc: 0.2058
## Model description
More information needed
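The AURC figure above is the area under the risk-coverage curve. A sketch of the usual formulation (the exact implementation behind this card is not published): rank predictions by confidence, track the error rate as coverage grows, and average.
```python
import numpy as np

def aurc(probs, labels):
    conf = probs.max(axis=1)
    errors = (probs.argmax(axis=1) != labels).astype(float)
    order = np.argsort(-conf)  # most confident first
    risks = np.cumsum(errors[order]) / np.arange(1, len(errors) + 1)
    return float(risks.mean())
```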
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 4.3150 | 0.0425 | 1.0714 | 7.3695 | 0.0425 | 0.0339 | 0.2866 | 0.9549 |
| No log | 2.0 | 14 | 3.3516 | 0.1 | 0.9507 | 5.8558 | 0.1000 | 0.0866 | 0.1647 | 0.8831 |
| No log | 3.0 | 21 | 2.9890 | 0.225 | 0.8838 | 5.3580 | 0.225 | 0.1805 | 0.1576 | 0.6316 |
| No log | 4.0 | 28 | 2.5376 | 0.29 | 0.7946 | 3.5543 | 0.29 | 0.2749 | 0.1832 | 0.4807 |
| No log | 5.0 | 35 | 2.2193 | 0.3875 | 0.7186 | 3.0794 | 0.3875 | 0.3677 | 0.1577 | 0.3531 |
| No log | 6.0 | 42 | 2.0818 | 0.43 | 0.6905 | 2.9853 | 0.4300 | 0.4165 | 0.1668 | 0.3056 |
| No log | 7.0 | 49 | 2.1032 | 0.45 | 0.7019 | 3.1044 | 0.45 | 0.4208 | 0.2121 | 0.2997 |
| No log | 8.0 | 56 | 2.0360 | 0.455 | 0.6994 | 2.8491 | 0.455 | 0.4492 | 0.2131 | 0.3026 |
| No log | 9.0 | 63 | 2.1719 | 0.475 | 0.7092 | 2.7831 | 0.4750 | 0.4549 | 0.2332 | 0.2870 |
| No log | 10.0 | 70 | 2.1820 | 0.4525 | 0.7393 | 2.8185 | 0.4525 | 0.4318 | 0.2813 | 0.2994 |
| No log | 11.0 | 77 | 2.2659 | 0.4475 | 0.7485 | 2.8020 | 0.4475 | 0.4227 | 0.2677 | 0.3046 |
| No log | 12.0 | 84 | 2.1798 | 0.4575 | 0.7325 | 2.6772 | 0.4575 | 0.4555 | 0.2738 | 0.3081 |
| No log | 13.0 | 91 | 2.3031 | 0.465 | 0.7431 | 2.8956 | 0.465 | 0.4390 | 0.2771 | 0.2945 |
| No log | 14.0 | 98 | 2.0867 | 0.49 | 0.7048 | 2.5312 | 0.49 | 0.4823 | 0.2528 | 0.2921 |
| No log | 15.0 | 105 | 2.1671 | 0.49 | 0.7218 | 2.7779 | 0.49 | 0.4749 | 0.2396 | 0.2877 |
| No log | 16.0 | 112 | 2.0091 | 0.485 | 0.6857 | 2.7234 | 0.485 | 0.4608 | 0.2493 | 0.2577 |
| No log | 17.0 | 119 | 1.9494 | 0.515 | 0.6714 | 2.4841 | 0.515 | 0.5072 | 0.2380 | 0.2614 |
| No log | 18.0 | 126 | 1.9132 | 0.505 | 0.6665 | 2.4777 | 0.505 | 0.4945 | 0.2206 | 0.2622 |
| No log | 19.0 | 133 | 2.0539 | 0.505 | 0.6776 | 2.7885 | 0.505 | 0.4986 | 0.2209 | 0.2724 |
| No log | 20.0 | 140 | 1.9533 | 0.5125 | 0.6666 | 2.7287 | 0.5125 | 0.5044 | 0.2385 | 0.2645 |
| No log | 21.0 | 147 | 1.9038 | 0.525 | 0.6365 | 2.8345 | 0.525 | 0.5099 | 0.2021 | 0.2290 |
| No log | 22.0 | 154 | 1.8525 | 0.5075 | 0.6448 | 2.6337 | 0.5075 | 0.4958 | 0.2083 | 0.2494 |
| No log | 23.0 | 161 | 1.7880 | 0.51 | 0.6386 | 2.4856 | 0.51 | 0.5078 | 0.2186 | 0.2478 |
| No log | 24.0 | 168 | 1.8363 | 0.505 | 0.6456 | 2.5075 | 0.505 | 0.4966 | 0.1962 | 0.2399 |
| No log | 25.0 | 175 | 1.9655 | 0.4725 | 0.6864 | 2.6331 | 0.4725 | 0.4608 | 0.2291 | 0.2669 |
| No log | 26.0 | 182 | 1.8660 | 0.5175 | 0.6547 | 2.5404 | 0.5175 | 0.5076 | 0.2252 | 0.2489 |
| No log | 27.0 | 189 | 1.8693 | 0.525 | 0.6446 | 2.6230 | 0.525 | 0.5145 | 0.2047 | 0.2540 |
| No log | 28.0 | 196 | 1.8113 | 0.51 | 0.6407 | 2.4380 | 0.51 | 0.4978 | 0.2030 | 0.2454 |
| No log | 29.0 | 203 | 1.8313 | 0.53 | 0.6445 | 2.4777 | 0.53 | 0.5284 | 0.2071 | 0.2575 |
| No log | 30.0 | 210 | 1.7310 | 0.5425 | 0.6197 | 2.4559 | 0.5425 | 0.5384 | 0.1869 | 0.2367 |
| No log | 31.0 | 217 | 1.8023 | 0.5325 | 0.6351 | 2.5026 | 0.5325 | 0.5216 | 0.2081 | 0.2496 |
| No log | 32.0 | 224 | 1.7652 | 0.5325 | 0.6186 | 2.4794 | 0.5325 | 0.5145 | 0.1715 | 0.2338 |
| No log | 33.0 | 231 | 1.7583 | 0.525 | 0.6363 | 2.4889 | 0.525 | 0.5275 | 0.1984 | 0.2463 |
| No log | 34.0 | 238 | 1.7552 | 0.5475 | 0.6164 | 2.4291 | 0.5475 | 0.5305 | 0.2026 | 0.2377 |
| No log | 35.0 | 245 | 1.6839 | 0.5375 | 0.6085 | 2.5915 | 0.5375 | 0.5253 | 0.1828 | 0.2214 |
| No log | 36.0 | 252 | 1.7480 | 0.5425 | 0.6104 | 2.5809 | 0.5425 | 0.5366 | 0.1716 | 0.2232 |
| No log | 37.0 | 259 | 1.7534 | 0.525 | 0.6225 | 2.3614 | 0.525 | 0.5183 | 0.1930 | 0.2249 |
| No log | 38.0 | 266 | 1.7484 | 0.5425 | 0.6125 | 2.5714 | 0.5425 | 0.5282 | 0.1792 | 0.2272 |
| No log | 39.0 | 273 | 1.7073 | 0.55 | 0.6172 | 2.4200 | 0.55 | 0.5370 | 0.1902 | 0.2314 |
| No log | 40.0 | 280 | 1.7303 | 0.55 | 0.6134 | 2.4829 | 0.55 | 0.5394 | 0.1916 | 0.2324 |
| No log | 41.0 | 287 | 1.6684 | 0.54 | 0.6060 | 2.4632 | 0.54 | 0.5350 | 0.2028 | 0.2251 |
| No log | 42.0 | 294 | 1.7171 | 0.5375 | 0.6055 | 2.4705 | 0.5375 | 0.5213 | 0.1776 | 0.2262 |
| No log | 43.0 | 301 | 1.6493 | 0.545 | 0.5991 | 2.5207 | 0.545 | 0.5412 | 0.1779 | 0.2214 |
| No log | 44.0 | 308 | 1.6548 | 0.5625 | 0.5920 | 2.4810 | 0.5625 | 0.5568 | 0.1892 | 0.2182 |
| No log | 45.0 | 315 | 1.6392 | 0.565 | 0.5943 | 2.3771 | 0.565 | 0.5586 | 0.2165 | 0.2162 |
| No log | 46.0 | 322 | 1.6923 | 0.5225 | 0.6159 | 2.3661 | 0.5225 | 0.5158 | 0.1775 | 0.2400 |
| No log | 47.0 | 329 | 1.6266 | 0.5525 | 0.5827 | 2.4385 | 0.5525 | 0.5468 | 0.1845 | 0.2100 |
| No log | 48.0 | 336 | 1.6804 | 0.55 | 0.6019 | 2.3884 | 0.55 | 0.5481 | 0.1895 | 0.2291 |
| No log | 49.0 | 343 | 1.6202 | 0.5725 | 0.5847 | 2.4882 | 0.5725 | 0.5596 | 0.1642 | 0.2125 |
| No log | 50.0 | 350 | 1.6222 | 0.54 | 0.5882 | 2.4144 | 0.54 | 0.5311 | 0.1830 | 0.2226 |
| No log | 51.0 | 357 | 1.6119 | 0.5775 | 0.5794 | 2.4063 | 0.5775 | 0.5731 | 0.1647 | 0.2019 |
| No log | 52.0 | 364 | 1.5958 | 0.57 | 0.5757 | 2.3342 | 0.57 | 0.5642 | 0.1778 | 0.2094 |
| No log | 53.0 | 371 | 1.6206 | 0.545 | 0.5913 | 2.3884 | 0.545 | 0.5365 | 0.1799 | 0.2187 |
| No log | 54.0 | 378 | 1.5982 | 0.5675 | 0.5745 | 2.4276 | 0.5675 | 0.5640 | 0.1746 | 0.2050 |
| No log | 55.0 | 385 | 1.6258 | 0.5525 | 0.5856 | 2.4005 | 0.5525 | 0.5373 | 0.1890 | 0.2124 |
| No log | 56.0 | 392 | 1.5763 | 0.57 | 0.5744 | 2.4477 | 0.57 | 0.5729 | 0.1651 | 0.2081 |
| No log | 57.0 | 399 | 1.6249 | 0.5525 | 0.5861 | 2.3791 | 0.5525 | 0.5432 | 0.1531 | 0.2114 |
| No log | 58.0 | 406 | 1.6240 | 0.5775 | 0.5791 | 2.4540 | 0.5775 | 0.5730 | 0.1582 | 0.2054 |
| No log | 59.0 | 413 | 1.6149 | 0.545 | 0.5851 | 2.3134 | 0.545 | 0.5395 | 0.1870 | 0.2137 |
| No log | 60.0 | 420 | 1.6163 | 0.5775 | 0.5792 | 2.3778 | 0.5775 | 0.5708 | 0.1762 | 0.2076 |
| No log | 61.0 | 427 | 1.6132 | 0.5575 | 0.5868 | 2.3759 | 0.5575 | 0.5530 | 0.1842 | 0.2159 |
| No log | 62.0 | 434 | 1.5940 | 0.5725 | 0.5756 | 2.3394 | 0.5725 | 0.5731 | 0.2102 | 0.2054 |
| No log | 63.0 | 441 | 1.6167 | 0.56 | 0.5841 | 2.4117 | 0.56 | 0.5541 | 0.1806 | 0.2160 |
| No log | 64.0 | 448 | 1.5988 | 0.57 | 0.5775 | 2.3388 | 0.57 | 0.5667 | 0.1680 | 0.2064 |
| No log | 65.0 | 455 | 1.5893 | 0.5725 | 0.5752 | 2.4281 | 0.5725 | 0.5695 | 0.1624 | 0.2050 |
| No log | 66.0 | 462 | 1.5975 | 0.5725 | 0.5737 | 2.3760 | 0.5725 | 0.5662 | 0.1733 | 0.2026 |
| No log | 67.0 | 469 | 1.5903 | 0.57 | 0.5772 | 2.2921 | 0.57 | 0.5675 | 0.1888 | 0.2112 |
| No log | 68.0 | 476 | 1.5878 | 0.575 | 0.5730 | 2.3676 | 0.575 | 0.5706 | 0.1683 | 0.2039 |
| No log | 69.0 | 483 | 1.5950 | 0.57 | 0.5775 | 2.3006 | 0.57 | 0.5641 | 0.1639 | 0.2076 |
| No log | 70.0 | 490 | 1.5916 | 0.58 | 0.5728 | 2.3424 | 0.58 | 0.5769 | 0.1714 | 0.2026 |
| No log | 71.0 | 497 | 1.5960 | 0.5675 | 0.5784 | 2.3057 | 0.5675 | 0.5624 | 0.1600 | 0.2073 |
| 0.3705 | 72.0 | 504 | 1.5907 | 0.575 | 0.5755 | 2.3322 | 0.575 | 0.5723 | 0.1578 | 0.2066 |
| 0.3705 | 73.0 | 511 | 1.5918 | 0.5675 | 0.5762 | 2.3182 | 0.5675 | 0.5605 | 0.1942 | 0.2071 |
| 0.3705 | 74.0 | 518 | 1.5894 | 0.585 | 0.5747 | 2.3335 | 0.585 | 0.5818 | 0.1739 | 0.2035 |
| 0.3705 | 75.0 | 525 | 1.5878 | 0.565 | 0.5750 | 2.3019 | 0.565 | 0.5607 | 0.1649 | 0.2060 |
| 0.3705 | 76.0 | 532 | 1.5923 | 0.575 | 0.5742 | 2.3376 | 0.575 | 0.5699 | 0.1779 | 0.2048 |
| 0.3705 | 77.0 | 539 | 1.5891 | 0.565 | 0.5760 | 2.2978 | 0.565 | 0.5616 | 0.1691 | 0.2066 |
| 0.3705 | 78.0 | 546 | 1.5896 | 0.575 | 0.5738 | 2.3748 | 0.575 | 0.5703 | 0.1733 | 0.2048 |
| 0.3705 | 79.0 | 553 | 1.5901 | 0.5675 | 0.5757 | 2.3039 | 0.5675 | 0.5634 | 0.1710 | 0.2064 |
| 0.3705 | 80.0 | 560 | 1.5906 | 0.57 | 0.5746 | 2.3125 | 0.57 | 0.5657 | 0.1692 | 0.2054 |
| 0.3705 | 81.0 | 567 | 1.5907 | 0.57 | 0.5751 | 2.3097 | 0.57 | 0.5659 | 0.1600 | 0.2047 |
| 0.3705 | 82.0 | 574 | 1.5902 | 0.57 | 0.5746 | 2.3072 | 0.57 | 0.5657 | 0.1797 | 0.2055 |
| 0.3705 | 83.0 | 581 | 1.5906 | 0.5725 | 0.5746 | 2.3145 | 0.5725 | 0.5681 | 0.1547 | 0.2050 |
| 0.3705 | 84.0 | 588 | 1.5909 | 0.5725 | 0.5750 | 2.3057 | 0.5725 | 0.5684 | 0.1746 | 0.2055 |
| 0.3705 | 85.0 | 595 | 1.5906 | 0.57 | 0.5746 | 2.3098 | 0.57 | 0.5661 | 0.1721 | 0.2054 |
| 0.3705 | 86.0 | 602 | 1.5916 | 0.57 | 0.5749 | 2.3093 | 0.57 | 0.5661 | 0.1659 | 0.2058 |
| 0.3705 | 87.0 | 609 | 1.5913 | 0.57 | 0.5748 | 2.3084 | 0.57 | 0.5661 | 0.1631 | 0.2058 |
| 0.3705 | 88.0 | 616 | 1.5918 | 0.57 | 0.5749 | 2.3082 | 0.57 | 0.5661 | 0.1652 | 0.2058 |
| 0.3705 | 89.0 | 623 | 1.5919 | 0.57 | 0.5750 | 2.3084 | 0.57 | 0.5661 | 0.1658 | 0.2059 |
| 0.3705 | 90.0 | 630 | 1.5918 | 0.5725 | 0.5749 | 2.3087 | 0.5725 | 0.5685 | 0.1650 | 0.2056 |
| 0.3705 | 91.0 | 637 | 1.5921 | 0.57 | 0.5750 | 2.3076 | 0.57 | 0.5661 | 0.1549 | 0.2059 |
| 0.3705 | 92.0 | 644 | 1.5920 | 0.57 | 0.5750 | 2.3079 | 0.57 | 0.5661 | 0.1581 | 0.2058 |
| 0.3705 | 93.0 | 651 | 1.5917 | 0.57 | 0.5749 | 2.3080 | 0.57 | 0.5661 | 0.1680 | 0.2057 |
| 0.3705 | 94.0 | 658 | 1.5923 | 0.57 | 0.5750 | 2.3083 | 0.57 | 0.5661 | 0.1643 | 0.2058 |
| 0.3705 | 95.0 | 665 | 1.5924 | 0.57 | 0.5751 | 2.3085 | 0.57 | 0.5661 | 0.1543 | 0.2059 |
| 0.3705 | 96.0 | 672 | 1.5922 | 0.57 | 0.5750 | 2.3085 | 0.57 | 0.5661 | 0.1530 | 0.2058 |
| 0.3705 | 97.0 | 679 | 1.5923 | 0.57 | 0.5750 | 2.3088 | 0.57 | 0.5661 | 0.1688 | 0.2058 |
| 0.3705 | 98.0 | 686 | 1.5923 | 0.57 | 0.5749 | 2.3089 | 0.57 | 0.5661 | 0.1733 | 0.2058 |
| 0.3705 | 99.0 | 693 | 1.5923 | 0.57 | 0.5750 | 2.3088 | 0.57 | 0.5661 | 0.1735 | 0.2058 |
| 0.3705 | 100.0 | 700 | 1.5923 | 0.57 | 0.5750 | 2.3088 | 0.57 | 0.5661 | 0.1722 | 0.2058 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t1.5_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t1.5_a0.9
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5908
- Accuracy: 0.54
- Brier Loss: 0.6121
- Nll: 2.4999
- F1 Micro: 0.54
- F1 Macro: 0.5334
- Ece: 0.1895
- Aurc: 0.2228
## Model description
More information needed
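The Nll column above is a mean negative log-likelihood of the true class. A sketch of the standard definition (the clipping constant is an implementation choice, and the card's exact reduction is not documented):
```python
import numpy as np

def nll(probs, labels, eps=1e-12):
    p_true = probs[np.arange(len(labels)), labels]
    return float(-np.log(np.clip(p_true, eps, 1.0)).mean())
```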
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 3.9334 | 0.0425 | 1.0719 | 7.3494 | 0.0425 | 0.0341 | 0.2781 | 0.9557 |
| No log | 2.0 | 14 | 3.0206 | 0.09 | 0.9526 | 5.9271 | 0.09 | 0.0682 | 0.1659 | 0.8896 |
| No log | 3.0 | 21 | 2.6955 | 0.2275 | 0.8867 | 5.4041 | 0.2275 | 0.1804 | 0.1637 | 0.6351 |
| No log | 4.0 | 28 | 2.3029 | 0.29 | 0.8005 | 3.5699 | 0.29 | 0.2727 | 0.1512 | 0.4877 |
| No log | 5.0 | 35 | 2.0293 | 0.37 | 0.7219 | 2.9856 | 0.37 | 0.3519 | 0.1495 | 0.3632 |
| No log | 6.0 | 42 | 1.9194 | 0.44 | 0.6926 | 2.9746 | 0.44 | 0.4240 | 0.1778 | 0.3091 |
| No log | 7.0 | 49 | 1.9640 | 0.4675 | 0.7000 | 3.0857 | 0.4675 | 0.4390 | 0.2009 | 0.3004 |
| No log | 8.0 | 56 | 1.9049 | 0.4625 | 0.7003 | 2.8472 | 0.4625 | 0.4602 | 0.2053 | 0.3051 |
| No log | 9.0 | 63 | 2.0561 | 0.4675 | 0.7168 | 2.8229 | 0.4675 | 0.4501 | 0.2288 | 0.2994 |
| No log | 10.0 | 70 | 2.1002 | 0.45 | 0.7433 | 2.7915 | 0.45 | 0.4234 | 0.2691 | 0.3022 |
| No log | 11.0 | 77 | 2.2528 | 0.4525 | 0.7686 | 3.0103 | 0.4525 | 0.4320 | 0.2921 | 0.3183 |
| No log | 12.0 | 84 | 2.1190 | 0.475 | 0.7427 | 2.6715 | 0.4750 | 0.4660 | 0.2832 | 0.3077 |
| No log | 13.0 | 91 | 2.3102 | 0.445 | 0.7825 | 2.9698 | 0.445 | 0.4252 | 0.3093 | 0.3100 |
| No log | 14.0 | 98 | 2.3501 | 0.42 | 0.8145 | 2.7585 | 0.4200 | 0.4248 | 0.3206 | 0.3662 |
| No log | 15.0 | 105 | 2.2402 | 0.495 | 0.7423 | 3.0313 | 0.495 | 0.4702 | 0.2692 | 0.2818 |
| No log | 16.0 | 112 | 2.2266 | 0.49 | 0.7349 | 2.8824 | 0.49 | 0.4714 | 0.2763 | 0.2895 |
| No log | 17.0 | 119 | 2.2989 | 0.4725 | 0.7509 | 3.0951 | 0.4725 | 0.4499 | 0.2863 | 0.2855 |
| No log | 18.0 | 126 | 2.1355 | 0.47 | 0.7322 | 2.9349 | 0.47 | 0.4616 | 0.2725 | 0.2845 |
| No log | 19.0 | 133 | 2.0965 | 0.505 | 0.7067 | 2.8254 | 0.505 | 0.4956 | 0.2523 | 0.2757 |
| No log | 20.0 | 140 | 2.1961 | 0.485 | 0.7358 | 3.1604 | 0.485 | 0.4567 | 0.2825 | 0.2841 |
| No log | 21.0 | 147 | 2.1287 | 0.5025 | 0.7247 | 2.5998 | 0.5025 | 0.5074 | 0.2703 | 0.3064 |
| No log | 22.0 | 154 | 2.2280 | 0.4675 | 0.7760 | 2.8571 | 0.4675 | 0.4636 | 0.2911 | 0.3232 |
| No log | 23.0 | 161 | 1.9649 | 0.5025 | 0.6828 | 2.8224 | 0.5025 | 0.4970 | 0.2410 | 0.2633 |
| No log | 24.0 | 168 | 1.9361 | 0.5125 | 0.6780 | 2.7309 | 0.5125 | 0.5035 | 0.2326 | 0.2553 |
| No log | 25.0 | 175 | 2.0161 | 0.5 | 0.6980 | 2.9958 | 0.5 | 0.4912 | 0.2580 | 0.2556 |
| No log | 26.0 | 182 | 1.8763 | 0.5025 | 0.6624 | 2.8291 | 0.5025 | 0.4952 | 0.2305 | 0.2431 |
| No log | 27.0 | 189 | 1.9057 | 0.525 | 0.6793 | 2.5627 | 0.525 | 0.5174 | 0.2161 | 0.2634 |
| No log | 28.0 | 196 | 1.8529 | 0.52 | 0.6683 | 2.7191 | 0.52 | 0.5132 | 0.2375 | 0.2535 |
| No log | 29.0 | 203 | 1.9603 | 0.5125 | 0.6831 | 2.7822 | 0.5125 | 0.5076 | 0.2395 | 0.2657 |
| No log | 30.0 | 210 | 1.8247 | 0.52 | 0.6533 | 2.8547 | 0.52 | 0.5058 | 0.2080 | 0.2426 |
| No log | 31.0 | 217 | 1.8275 | 0.5125 | 0.6547 | 2.6194 | 0.5125 | 0.5032 | 0.2208 | 0.2488 |
| No log | 32.0 | 224 | 1.8003 | 0.52 | 0.6455 | 2.6138 | 0.52 | 0.5124 | 0.2302 | 0.2370 |
| No log | 33.0 | 231 | 1.8714 | 0.505 | 0.6694 | 2.6643 | 0.505 | 0.4970 | 0.2195 | 0.2553 |
| No log | 34.0 | 238 | 1.8018 | 0.5075 | 0.6659 | 2.5423 | 0.5075 | 0.4978 | 0.2241 | 0.2515 |
| No log | 35.0 | 245 | 1.7844 | 0.5225 | 0.6503 | 2.6100 | 0.5225 | 0.5181 | 0.2181 | 0.2435 |
| No log | 36.0 | 252 | 1.8321 | 0.5225 | 0.6674 | 2.7821 | 0.5225 | 0.5020 | 0.2285 | 0.2462 |
| No log | 37.0 | 259 | 1.7859 | 0.4975 | 0.6725 | 2.6066 | 0.4975 | 0.4974 | 0.2351 | 0.2627 |
| No log | 38.0 | 266 | 1.7790 | 0.5125 | 0.6595 | 2.6983 | 0.5125 | 0.5023 | 0.2172 | 0.2497 |
| No log | 39.0 | 273 | 1.6989 | 0.5225 | 0.6401 | 2.6743 | 0.5225 | 0.5151 | 0.2100 | 0.2407 |
| No log | 40.0 | 280 | 1.7568 | 0.52 | 0.6488 | 2.5294 | 0.52 | 0.5132 | 0.2208 | 0.2442 |
| No log | 41.0 | 287 | 1.6896 | 0.5275 | 0.6362 | 2.5489 | 0.5275 | 0.5141 | 0.2045 | 0.2323 |
| No log | 42.0 | 294 | 1.7193 | 0.5275 | 0.6517 | 2.5525 | 0.5275 | 0.5232 | 0.1986 | 0.2467 |
| No log | 43.0 | 301 | 1.7199 | 0.535 | 0.6403 | 2.5974 | 0.535 | 0.5279 | 0.2104 | 0.2432 |
| No log | 44.0 | 308 | 1.6594 | 0.5375 | 0.6330 | 2.4854 | 0.5375 | 0.5316 | 0.2015 | 0.2321 |
| No log | 45.0 | 315 | 1.6543 | 0.5275 | 0.6239 | 2.4955 | 0.5275 | 0.5223 | 0.2144 | 0.2308 |
| No log | 46.0 | 322 | 1.6490 | 0.5425 | 0.6262 | 2.5215 | 0.5425 | 0.5358 | 0.2104 | 0.2273 |
| No log | 47.0 | 329 | 1.6570 | 0.54 | 0.6233 | 2.5454 | 0.54 | 0.5380 | 0.2047 | 0.2301 |
| No log | 48.0 | 336 | 1.6359 | 0.5375 | 0.6218 | 2.5546 | 0.5375 | 0.5320 | 0.2171 | 0.2257 |
| No log | 49.0 | 343 | 1.6320 | 0.55 | 0.6214 | 2.4958 | 0.55 | 0.5452 | 0.2014 | 0.2267 |
| No log | 50.0 | 350 | 1.6230 | 0.53 | 0.6208 | 2.4979 | 0.53 | 0.5243 | 0.2017 | 0.2315 |
| No log | 51.0 | 357 | 1.6374 | 0.535 | 0.6257 | 2.4644 | 0.535 | 0.5293 | 0.2038 | 0.2286 |
| No log | 52.0 | 364 | 1.6190 | 0.5375 | 0.6199 | 2.5279 | 0.5375 | 0.5310 | 0.1855 | 0.2290 |
| No log | 53.0 | 371 | 1.6155 | 0.5475 | 0.6158 | 2.4738 | 0.5475 | 0.5435 | 0.1913 | 0.2239 |
| No log | 54.0 | 378 | 1.6131 | 0.5425 | 0.6184 | 2.4982 | 0.5425 | 0.5377 | 0.1969 | 0.2248 |
| No log | 55.0 | 385 | 1.6035 | 0.545 | 0.6138 | 2.4690 | 0.545 | 0.5406 | 0.2164 | 0.2223 |
| No log | 56.0 | 392 | 1.5990 | 0.54 | 0.6153 | 2.4701 | 0.54 | 0.5356 | 0.2019 | 0.2249 |
| No log | 57.0 | 399 | 1.6024 | 0.5425 | 0.6153 | 2.4626 | 0.5425 | 0.5375 | 0.1826 | 0.2237 |
| No log | 58.0 | 406 | 1.5935 | 0.545 | 0.6141 | 2.4390 | 0.545 | 0.5415 | 0.1933 | 0.2238 |
| No log | 59.0 | 413 | 1.6016 | 0.545 | 0.6137 | 2.4640 | 0.545 | 0.5401 | 0.2021 | 0.2230 |
| No log | 60.0 | 420 | 1.5976 | 0.54 | 0.6146 | 2.4618 | 0.54 | 0.5355 | 0.1912 | 0.2245 |
| No log | 61.0 | 427 | 1.5984 | 0.545 | 0.6133 | 2.4683 | 0.545 | 0.5408 | 0.1971 | 0.2228 |
| No log | 62.0 | 434 | 1.5941 | 0.54 | 0.6131 | 2.4639 | 0.54 | 0.5358 | 0.1898 | 0.2236 |
| No log | 63.0 | 441 | 1.5936 | 0.545 | 0.6123 | 2.4689 | 0.545 | 0.5404 | 0.1953 | 0.2222 |
| No log | 64.0 | 448 | 1.5970 | 0.5425 | 0.6138 | 2.4647 | 0.5425 | 0.5384 | 0.2015 | 0.2238 |
| No log | 65.0 | 455 | 1.5943 | 0.545 | 0.6130 | 2.4963 | 0.545 | 0.5400 | 0.1979 | 0.2229 |
| No log | 66.0 | 462 | 1.5936 | 0.545 | 0.6127 | 2.4977 | 0.545 | 0.5400 | 0.1933 | 0.2229 |
| No log | 67.0 | 469 | 1.5928 | 0.5425 | 0.6127 | 2.4965 | 0.5425 | 0.5381 | 0.1976 | 0.2233 |
| No log | 68.0 | 476 | 1.5946 | 0.5425 | 0.6128 | 2.4768 | 0.5425 | 0.5383 | 0.2149 | 0.2233 |
| No log | 69.0 | 483 | 1.5924 | 0.54 | 0.6126 | 2.4946 | 0.54 | 0.5356 | 0.2094 | 0.2233 |
| No log | 70.0 | 490 | 1.5921 | 0.54 | 0.6120 | 2.4964 | 0.54 | 0.5356 | 0.1801 | 0.2230 |
| No log | 71.0 | 497 | 1.5926 | 0.54 | 0.6126 | 2.4955 | 0.54 | 0.5356 | 0.2039 | 0.2235 |
| 0.3138 | 72.0 | 504 | 1.5916 | 0.5425 | 0.6121 | 2.4964 | 0.5425 | 0.5366 | 0.1898 | 0.2229 |
| 0.3138 | 73.0 | 511 | 1.5917 | 0.54 | 0.6119 | 2.4966 | 0.54 | 0.5356 | 0.2039 | 0.2231 |
| 0.3138 | 74.0 | 518 | 1.5918 | 0.54 | 0.6123 | 2.4964 | 0.54 | 0.5351 | 0.2035 | 0.2229 |
| 0.3138 | 75.0 | 525 | 1.5912 | 0.54 | 0.6118 | 2.4975 | 0.54 | 0.5351 | 0.2059 | 0.2228 |
| 0.3138 | 76.0 | 532 | 1.5918 | 0.54 | 0.6124 | 2.4965 | 0.54 | 0.5351 | 0.1971 | 0.2231 |
| 0.3138 | 77.0 | 539 | 1.5919 | 0.5425 | 0.6120 | 2.4974 | 0.5425 | 0.5358 | 0.2087 | 0.2227 |
| 0.3138 | 78.0 | 546 | 1.5903 | 0.54 | 0.6118 | 2.4978 | 0.54 | 0.5341 | 0.2169 | 0.2228 |
| 0.3138 | 79.0 | 553 | 1.5922 | 0.54 | 0.6124 | 2.4976 | 0.54 | 0.5351 | 0.2109 | 0.2234 |
| 0.3138 | 80.0 | 560 | 1.5914 | 0.54 | 0.6122 | 2.4983 | 0.54 | 0.5345 | 0.2041 | 0.2228 |
| 0.3138 | 81.0 | 567 | 1.5907 | 0.54 | 0.6119 | 2.4981 | 0.54 | 0.5345 | 0.2128 | 0.2226 |
| 0.3138 | 82.0 | 574 | 1.5921 | 0.5425 | 0.6124 | 2.4986 | 0.5425 | 0.5362 | 0.2084 | 0.2227 |
| 0.3138 | 83.0 | 581 | 1.5918 | 0.5425 | 0.6125 | 2.4987 | 0.5425 | 0.5362 | 0.2038 | 0.2230 |
| 0.3138 | 84.0 | 588 | 1.5902 | 0.54 | 0.6120 | 2.4989 | 0.54 | 0.5345 | 0.2043 | 0.2226 |
| 0.3138 | 85.0 | 595 | 1.5919 | 0.5425 | 0.6124 | 2.4988 | 0.5425 | 0.5360 | 0.1998 | 0.2228 |
| 0.3138 | 86.0 | 602 | 1.5916 | 0.5425 | 0.6124 | 2.4990 | 0.5425 | 0.5362 | 0.2079 | 0.2227 |
| 0.3138 | 87.0 | 609 | 1.5906 | 0.54 | 0.6120 | 2.4990 | 0.54 | 0.5345 | 0.2037 | 0.2227 |
| 0.3138 | 88.0 | 616 | 1.5908 | 0.54 | 0.6120 | 2.4989 | 0.54 | 0.5345 | 0.2091 | 0.2230 |
| 0.3138 | 89.0 | 623 | 1.5909 | 0.54 | 0.6120 | 2.4995 | 0.54 | 0.5344 | 0.2113 | 0.2228 |
| 0.3138 | 90.0 | 630 | 1.5906 | 0.54 | 0.6119 | 2.4996 | 0.54 | 0.5345 | 0.1969 | 0.2228 |
| 0.3138 | 91.0 | 637 | 1.5911 | 0.5425 | 0.6121 | 2.4999 | 0.5425 | 0.5360 | 0.1954 | 0.2226 |
| 0.3138 | 92.0 | 644 | 1.5909 | 0.54 | 0.6121 | 2.4994 | 0.54 | 0.5344 | 0.1928 | 0.2228 |
| 0.3138 | 93.0 | 651 | 1.5907 | 0.5425 | 0.6121 | 2.4999 | 0.5425 | 0.5360 | 0.2034 | 0.2225 |
| 0.3138 | 94.0 | 658 | 1.5910 | 0.5425 | 0.6122 | 2.4996 | 0.5425 | 0.5360 | 0.1974 | 0.2227 |
| 0.3138 | 95.0 | 665 | 1.5909 | 0.5375 | 0.6121 | 2.4995 | 0.5375 | 0.5319 | 0.1990 | 0.2230 |
| 0.3138 | 96.0 | 672 | 1.5907 | 0.5375 | 0.6120 | 2.4997 | 0.5375 | 0.5318 | 0.1980 | 0.2229 |
| 0.3138 | 97.0 | 679 | 1.5907 | 0.54 | 0.6120 | 2.4998 | 0.54 | 0.5344 | 0.1900 | 0.2228 |
| 0.3138 | 98.0 | 686 | 1.5907 | 0.5425 | 0.6120 | 2.4999 | 0.5425 | 0.5362 | 0.1899 | 0.2226 |
| 0.3138 | 99.0 | 693 | 1.5908 | 0.54 | 0.6121 | 2.4999 | 0.54 | 0.5334 | 0.1936 | 0.2228 |
| 0.3138 | 100.0 | 700 | 1.5908 | 0.54 | 0.6121 | 2.4999 | 0.54 | 0.5334 | 0.1895 | 0.2228 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
ayanban011/vit-base_tobacco
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base_tobacco
This model is a fine-tuned version of [jordyvl/vit-base_tobacco](https://huggingface.co/jordyvl/vit-base_tobacco) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7324
- Accuracy: 0.8
- Brier Loss: 0.3049
- Nll: 1.3070
- F1 Micro: 0.8000
- F1 Macro: 0.7733
- Ece: 0.2124
- Aurc: 0.0840
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a short sketch after the list shows how the fractional epoch column below follows from these settings):
- learning_rate: 2e-05
- train_batch_size: 40
- eval_batch_size: 40
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 640
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
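As referenced above, a short sketch of how these settings produce the fractional epoch column in the results table (the training-set size is inferred from the table, not stated in the card):
```python
# Each optimizer step consumes 40 * 16 = 640 examples; step 1 logging at
# epoch 0.8 implies roughly 640 / 0.8 = 800 training examples (inferred).
per_device_batch = 40
grad_accum_steps = 16
total_batch = per_device_batch * grad_accum_steps  # 640, as listed above
train_examples = 800  # inferred
print(f"epochs advanced per step: {total_batch / train_examples}")  # 0.8
```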
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.8 | 1 | 0.7434 | 0.815 | 0.3073 | 1.1863 | 0.815 | 0.7942 | 0.2217 | 0.0720 |
| No log | 1.6 | 2 | 0.7569 | 0.81 | 0.3117 | 1.2131 | 0.81 | 0.7893 | 0.2153 | 0.0800 |
| No log | 2.4 | 3 | 0.7491 | 0.82 | 0.3107 | 1.2631 | 0.82 | 0.8063 | 0.2311 | 0.0777 |
| No log | 4.0 | 5 | 0.7489 | 0.795 | 0.3088 | 1.1544 | 0.795 | 0.7766 | 0.2427 | 0.0730 |
| No log | 4.8 | 6 | 0.7658 | 0.81 | 0.3171 | 1.3766 | 0.81 | 0.7983 | 0.2434 | 0.0822 |
| No log | 5.6 | 7 | 0.7496 | 0.815 | 0.3097 | 1.1920 | 0.815 | 0.8014 | 0.2434 | 0.0848 |
| No log | 6.4 | 8 | 0.7468 | 0.8 | 0.3090 | 1.0732 | 0.8000 | 0.7750 | 0.2195 | 0.0774 |
| No log | 8.0 | 10 | 0.7563 | 0.815 | 0.3131 | 1.3472 | 0.815 | 0.8082 | 0.2255 | 0.0741 |
| No log | 8.8 | 11 | 0.7548 | 0.81 | 0.3116 | 1.2016 | 0.81 | 0.7930 | 0.2496 | 0.0868 |
| No log | 9.6 | 12 | 0.7395 | 0.805 | 0.3071 | 1.1664 | 0.805 | 0.7841 | 0.2432 | 0.0772 |
| No log | 10.4 | 13 | 0.7296 | 0.82 | 0.3018 | 1.1776 | 0.82 | 0.8078 | 0.2214 | 0.0676 |
| No log | 12.0 | 15 | 0.7515 | 0.815 | 0.3104 | 1.2034 | 0.815 | 0.7987 | 0.2307 | 0.0835 |
| No log | 12.8 | 16 | 0.7350 | 0.81 | 0.3053 | 1.1762 | 0.81 | 0.7978 | 0.2196 | 0.0747 |
| No log | 13.6 | 17 | 0.7281 | 0.805 | 0.3023 | 1.1664 | 0.805 | 0.7841 | 0.2144 | 0.0718 |
| No log | 14.4 | 18 | 0.7395 | 0.81 | 0.3064 | 1.1750 | 0.81 | 0.7871 | 0.2306 | 0.0778 |
| No log | 16.0 | 20 | 0.7427 | 0.81 | 0.3076 | 1.2637 | 0.81 | 0.7986 | 0.2194 | 0.0808 |
| No log | 16.8 | 21 | 0.7337 | 0.81 | 0.3044 | 1.2447 | 0.81 | 0.7948 | 0.2321 | 0.0743 |
| No log | 17.6 | 22 | 0.7340 | 0.805 | 0.3050 | 1.1681 | 0.805 | 0.7841 | 0.2307 | 0.0743 |
| No log | 18.4 | 23 | 0.7338 | 0.805 | 0.3047 | 1.1708 | 0.805 | 0.7841 | 0.2290 | 0.0759 |
| No log | 20.0 | 25 | 0.7390 | 0.815 | 0.3058 | 1.2551 | 0.815 | 0.7984 | 0.2489 | 0.0818 |
| No log | 20.8 | 26 | 0.7390 | 0.815 | 0.3063 | 1.1894 | 0.815 | 0.7984 | 0.2294 | 0.0818 |
| No log | 21.6 | 27 | 0.7349 | 0.805 | 0.3054 | 1.1714 | 0.805 | 0.7847 | 0.2011 | 0.0791 |
| No log | 22.4 | 28 | 0.7308 | 0.81 | 0.3037 | 1.1694 | 0.81 | 0.7948 | 0.2128 | 0.0766 |
| No log | 24.0 | 30 | 0.7353 | 0.81 | 0.3051 | 1.1852 | 0.81 | 0.7956 | 0.2282 | 0.0794 |
| No log | 24.8 | 31 | 0.7378 | 0.81 | 0.3062 | 1.1870 | 0.81 | 0.7956 | 0.2293 | 0.0819 |
| No log | 25.6 | 32 | 0.7356 | 0.81 | 0.3054 | 1.1863 | 0.81 | 0.7956 | 0.2287 | 0.0817 |
| No log | 26.4 | 33 | 0.7309 | 0.81 | 0.3037 | 1.1801 | 0.81 | 0.7954 | 0.2209 | 0.0795 |
| No log | 28.0 | 35 | 0.7336 | 0.805 | 0.3050 | 1.1733 | 0.805 | 0.7850 | 0.2082 | 0.0789 |
| No log | 28.8 | 36 | 0.7334 | 0.81 | 0.3045 | 1.1799 | 0.81 | 0.7956 | 0.2207 | 0.0797 |
| No log | 29.6 | 37 | 0.7320 | 0.81 | 0.3040 | 1.2447 | 0.81 | 0.7956 | 0.2279 | 0.0804 |
| No log | 30.4 | 38 | 0.7328 | 0.81 | 0.3045 | 1.2473 | 0.81 | 0.7956 | 0.2154 | 0.0812 |
| No log | 32.0 | 40 | 0.7322 | 0.805 | 0.3044 | 1.1796 | 0.805 | 0.7850 | 0.2384 | 0.0804 |
| No log | 32.8 | 41 | 0.7318 | 0.81 | 0.3045 | 1.1792 | 0.81 | 0.7954 | 0.2291 | 0.0794 |
| No log | 33.6 | 42 | 0.7302 | 0.81 | 0.3034 | 1.2401 | 0.81 | 0.7954 | 0.2086 | 0.0794 |
| No log | 34.4 | 43 | 0.7311 | 0.805 | 0.3036 | 1.2424 | 0.805 | 0.7850 | 0.2278 | 0.0804 |
| No log | 36.0 | 45 | 0.7323 | 0.805 | 0.3043 | 1.1902 | 0.805 | 0.7850 | 0.2119 | 0.0816 |
| No log | 36.8 | 46 | 0.7304 | 0.805 | 0.3034 | 1.2428 | 0.805 | 0.7850 | 0.2330 | 0.0807 |
| No log | 37.6 | 47 | 0.7297 | 0.805 | 0.3032 | 1.2413 | 0.805 | 0.7850 | 0.2447 | 0.0801 |
| No log | 38.4 | 48 | 0.7310 | 0.805 | 0.3039 | 1.2424 | 0.805 | 0.7850 | 0.2233 | 0.0802 |
| No log | 40.0 | 50 | 0.7316 | 0.805 | 0.3040 | 1.2451 | 0.805 | 0.7850 | 0.2094 | 0.0809 |
| No log | 40.8 | 51 | 0.7313 | 0.805 | 0.3041 | 1.2450 | 0.805 | 0.7850 | 0.2093 | 0.0810 |
| No log | 41.6 | 52 | 0.7313 | 0.805 | 0.3041 | 1.2445 | 0.805 | 0.7850 | 0.2073 | 0.0814 |
| No log | 42.4 | 53 | 0.7315 | 0.805 | 0.3040 | 1.2447 | 0.805 | 0.7850 | 0.2198 | 0.0821 |
| No log | 44.0 | 55 | 0.7303 | 0.805 | 0.3034 | 1.2441 | 0.805 | 0.7850 | 0.2048 | 0.0813 |
| No log | 44.8 | 56 | 0.7306 | 0.805 | 0.3038 | 1.2444 | 0.805 | 0.7850 | 0.1966 | 0.0809 |
| No log | 45.6 | 57 | 0.7317 | 0.805 | 0.3043 | 1.2449 | 0.805 | 0.7850 | 0.1976 | 0.0821 |
| No log | 46.4 | 58 | 0.7317 | 0.805 | 0.3041 | 1.2466 | 0.805 | 0.7850 | 0.2007 | 0.0822 |
| No log | 48.0 | 60 | 0.7316 | 0.805 | 0.3041 | 1.2499 | 0.805 | 0.7850 | 0.2137 | 0.0820 |
| No log | 48.8 | 61 | 0.7320 | 0.8 | 0.3043 | 1.2536 | 0.8000 | 0.7733 | 0.2081 | 0.0822 |
| No log | 49.6 | 62 | 0.7319 | 0.805 | 0.3044 | 1.2494 | 0.805 | 0.7850 | 0.1998 | 0.0825 |
| No log | 50.4 | 63 | 0.7326 | 0.805 | 0.3048 | 1.2476 | 0.805 | 0.7850 | 0.1936 | 0.0828 |
| No log | 52.0 | 65 | 0.7313 | 0.8 | 0.3044 | 1.2495 | 0.8000 | 0.7733 | 0.2117 | 0.0822 |
| No log | 52.8 | 66 | 0.7304 | 0.8 | 0.3039 | 1.2524 | 0.8000 | 0.7733 | 0.2009 | 0.0818 |
| No log | 53.6 | 67 | 0.7306 | 0.8 | 0.3038 | 1.2505 | 0.8000 | 0.7733 | 0.2182 | 0.0818 |
| No log | 54.4 | 68 | 0.7321 | 0.8 | 0.3044 | 1.2513 | 0.8000 | 0.7733 | 0.2185 | 0.0833 |
| No log | 56.0 | 70 | 0.7326 | 0.8 | 0.3049 | 1.2519 | 0.8000 | 0.7733 | 0.2014 | 0.0833 |
| No log | 56.8 | 71 | 0.7320 | 0.8 | 0.3047 | 1.2580 | 0.8000 | 0.7733 | 0.2175 | 0.0829 |
| No log | 57.6 | 72 | 0.7313 | 0.8 | 0.3043 | 1.2571 | 0.8000 | 0.7733 | 0.2045 | 0.0828 |
| No log | 58.4 | 73 | 0.7314 | 0.8 | 0.3043 | 1.3065 | 0.8000 | 0.7733 | 0.2038 | 0.0827 |
| No log | 60.0 | 75 | 0.7322 | 0.8 | 0.3046 | 1.3081 | 0.8000 | 0.7733 | 0.2047 | 0.0840 |
| No log | 60.8 | 76 | 0.7323 | 0.8 | 0.3047 | 1.3078 | 0.8000 | 0.7733 | 0.2053 | 0.0839 |
| No log | 61.6 | 77 | 0.7322 | 0.8 | 0.3047 | 1.3070 | 0.8000 | 0.7733 | 0.2051 | 0.0837 |
| No log | 62.4 | 78 | 0.7316 | 0.8 | 0.3045 | 1.3062 | 0.8000 | 0.7733 | 0.2145 | 0.0835 |
| No log | 64.0 | 80 | 0.7315 | 0.8 | 0.3044 | 1.3063 | 0.8000 | 0.7733 | 0.2067 | 0.0836 |
| No log | 64.8 | 81 | 0.7320 | 0.8 | 0.3047 | 1.3064 | 0.8000 | 0.7733 | 0.2041 | 0.0839 |
| No log | 65.6 | 82 | 0.7323 | 0.8 | 0.3048 | 1.3070 | 0.8000 | 0.7733 | 0.2046 | 0.0839 |
| No log | 66.4 | 83 | 0.7323 | 0.8 | 0.3048 | 1.3068 | 0.8000 | 0.7733 | 0.2045 | 0.0838 |
| No log | 68.0 | 85 | 0.7320 | 0.8 | 0.3046 | 1.3068 | 0.8000 | 0.7733 | 0.2046 | 0.0840 |
| No log | 68.8 | 86 | 0.7318 | 0.8 | 0.3045 | 1.3069 | 0.8000 | 0.7733 | 0.2114 | 0.0838 |
| No log | 69.6 | 87 | 0.7316 | 0.8 | 0.3045 | 1.3066 | 0.8000 | 0.7733 | 0.2149 | 0.0836 |
| No log | 70.4 | 88 | 0.7316 | 0.8 | 0.3045 | 1.3066 | 0.8000 | 0.7733 | 0.2244 | 0.0834 |
| No log | 72.0 | 90 | 0.7321 | 0.8 | 0.3047 | 1.3069 | 0.8000 | 0.7733 | 0.2151 | 0.0837 |
| No log | 72.8 | 91 | 0.7322 | 0.8 | 0.3048 | 1.3070 | 0.8000 | 0.7733 | 0.2151 | 0.0839 |
| No log | 73.6 | 92 | 0.7322 | 0.8 | 0.3048 | 1.3070 | 0.8000 | 0.7733 | 0.2155 | 0.0840 |
| No log | 74.4 | 93 | 0.7323 | 0.8 | 0.3048 | 1.3071 | 0.8000 | 0.7733 | 0.2129 | 0.0842 |
| No log | 76.0 | 95 | 0.7324 | 0.8 | 0.3049 | 1.3071 | 0.8000 | 0.7733 | 0.2084 | 0.0841 |
| No log | 76.8 | 96 | 0.7324 | 0.8 | 0.3049 | 1.3071 | 0.8000 | 0.7733 | 0.2141 | 0.0842 |
| No log | 77.6 | 97 | 0.7324 | 0.8 | 0.3049 | 1.3070 | 0.8000 | 0.7733 | 0.2136 | 0.0841 |
| No log | 78.4 | 98 | 0.7324 | 0.8 | 0.3049 | 1.3070 | 0.8000 | 0.7733 | 0.2136 | 0.0841 |
| No log | 80.0 | 100 | 0.7324 | 0.8 | 0.3049 | 1.3070 | 0.8000 | 0.7733 | 0.2124 | 0.0840 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t2.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t2.5_a0.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.9682
- Accuracy: 0.5575
- Brier Loss: 0.5680
- Nll: 2.3526
- F1 Micro: 0.5575
- F1 Macro: 0.5516
- Ece: 0.1676
- Aurc: 0.1973
## Model description
More information needed
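The t2.5 in the model name points to a distillation temperature of 2.5 (an assumption from the naming scheme). A minimal sketch of what that temperature does to a probability distribution:
```python
# Higher temperature flattens the distribution the student is trained to
# match; t = 2.5 is read off the model name.
import torch
import torch.nn.functional as F

logits = torch.tensor([[2.0, 0.5, -1.0]])
for t in (1.0, 2.5):
    print(t, F.softmax(logits / t, dim=-1))
```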
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 5.4578 | 0.045 | 1.0710 | 7.3211 | 0.045 | 0.0342 | 0.2911 | 0.9529 |
| No log | 2.0 | 14 | 4.3718 | 0.0975 | 0.9466 | 5.6031 | 0.0975 | 0.0893 | 0.1665 | 0.8820 |
| No log | 3.0 | 21 | 3.9389 | 0.2325 | 0.8820 | 5.3266 | 0.2325 | 0.1837 | 0.1656 | 0.6302 |
| No log | 4.0 | 28 | 3.3251 | 0.3075 | 0.7848 | 3.6819 | 0.3075 | 0.2963 | 0.1644 | 0.4646 |
| No log | 5.0 | 35 | 2.8980 | 0.4025 | 0.7180 | 2.9699 | 0.4025 | 0.3740 | 0.1867 | 0.3388 |
| No log | 6.0 | 42 | 2.7055 | 0.4375 | 0.7010 | 3.2343 | 0.4375 | 0.4029 | 0.2096 | 0.3042 |
| No log | 7.0 | 49 | 2.6466 | 0.4575 | 0.7040 | 2.9977 | 0.4575 | 0.4272 | 0.2082 | 0.2999 |
| No log | 8.0 | 56 | 2.5399 | 0.4775 | 0.6809 | 2.8098 | 0.4775 | 0.4693 | 0.2106 | 0.2837 |
| No log | 9.0 | 63 | 2.5949 | 0.49 | 0.6824 | 2.6503 | 0.49 | 0.4827 | 0.2289 | 0.2718 |
| No log | 10.0 | 70 | 2.6997 | 0.4725 | 0.7318 | 2.8289 | 0.4725 | 0.4523 | 0.2536 | 0.3056 |
| No log | 11.0 | 77 | 2.5694 | 0.47 | 0.7040 | 2.7969 | 0.47 | 0.4534 | 0.2348 | 0.2766 |
| No log | 12.0 | 84 | 2.4539 | 0.4975 | 0.6762 | 2.6929 | 0.4975 | 0.4831 | 0.2300 | 0.2664 |
| No log | 13.0 | 91 | 2.4841 | 0.5025 | 0.6664 | 2.6140 | 0.5025 | 0.4944 | 0.2098 | 0.2584 |
| No log | 14.0 | 98 | 2.2755 | 0.535 | 0.6410 | 2.4991 | 0.535 | 0.5249 | 0.1905 | 0.2405 |
| No log | 15.0 | 105 | 2.2998 | 0.5125 | 0.6282 | 2.6561 | 0.5125 | 0.4982 | 0.1916 | 0.2371 |
| No log | 16.0 | 112 | 2.2156 | 0.525 | 0.6195 | 2.4837 | 0.525 | 0.5191 | 0.1870 | 0.2232 |
| No log | 17.0 | 119 | 2.1862 | 0.5225 | 0.6096 | 2.7252 | 0.5225 | 0.5202 | 0.1747 | 0.2292 |
| No log | 18.0 | 126 | 2.2058 | 0.5375 | 0.6160 | 2.5446 | 0.5375 | 0.5154 | 0.2011 | 0.2231 |
| No log | 19.0 | 133 | 2.2147 | 0.5375 | 0.6143 | 2.5720 | 0.5375 | 0.5232 | 0.2028 | 0.2221 |
| No log | 20.0 | 140 | 2.1791 | 0.525 | 0.6191 | 2.4505 | 0.525 | 0.5107 | 0.1918 | 0.2233 |
| No log | 21.0 | 147 | 2.1165 | 0.535 | 0.5960 | 2.5369 | 0.535 | 0.5280 | 0.1867 | 0.2133 |
| No log | 22.0 | 154 | 2.1193 | 0.54 | 0.6009 | 2.5568 | 0.54 | 0.5313 | 0.2049 | 0.2184 |
| No log | 23.0 | 161 | 2.1082 | 0.5425 | 0.5929 | 2.5238 | 0.5425 | 0.5360 | 0.1691 | 0.2099 |
| No log | 24.0 | 168 | 2.1221 | 0.535 | 0.6115 | 2.4854 | 0.535 | 0.5252 | 0.1779 | 0.2234 |
| No log | 25.0 | 175 | 2.1912 | 0.52 | 0.6295 | 2.4975 | 0.52 | 0.5109 | 0.1970 | 0.2339 |
| No log | 26.0 | 182 | 2.1056 | 0.5225 | 0.6150 | 2.4697 | 0.5225 | 0.5250 | 0.2020 | 0.2346 |
| No log | 27.0 | 189 | 2.1017 | 0.535 | 0.6027 | 2.4992 | 0.535 | 0.5399 | 0.2003 | 0.2173 |
| No log | 28.0 | 196 | 2.0999 | 0.545 | 0.5929 | 2.6313 | 0.545 | 0.5306 | 0.1844 | 0.2126 |
| No log | 29.0 | 203 | 2.1188 | 0.54 | 0.6044 | 2.5420 | 0.54 | 0.5211 | 0.1745 | 0.2159 |
| No log | 30.0 | 210 | 2.0670 | 0.56 | 0.5938 | 2.4868 | 0.56 | 0.5500 | 0.1849 | 0.2132 |
| No log | 31.0 | 217 | 2.0709 | 0.5525 | 0.5937 | 2.4206 | 0.5525 | 0.5489 | 0.1759 | 0.2140 |
| No log | 32.0 | 224 | 2.0390 | 0.5675 | 0.5794 | 2.5007 | 0.5675 | 0.5547 | 0.1745 | 0.2032 |
| No log | 33.0 | 231 | 2.0725 | 0.5375 | 0.5912 | 2.4328 | 0.5375 | 0.5237 | 0.1817 | 0.2157 |
| No log | 34.0 | 238 | 2.0644 | 0.565 | 0.5950 | 2.5103 | 0.565 | 0.5560 | 0.1712 | 0.2085 |
| No log | 35.0 | 245 | 2.0665 | 0.5575 | 0.5918 | 2.4401 | 0.5575 | 0.5519 | 0.1604 | 0.2133 |
| No log | 36.0 | 252 | 2.0485 | 0.53 | 0.5977 | 2.4321 | 0.53 | 0.5259 | 0.1927 | 0.2163 |
| No log | 37.0 | 259 | 2.0352 | 0.555 | 0.5780 | 2.4665 | 0.555 | 0.5450 | 0.1701 | 0.2000 |
| No log | 38.0 | 266 | 2.0423 | 0.5425 | 0.5777 | 2.5061 | 0.5425 | 0.5327 | 0.1689 | 0.2021 |
| No log | 39.0 | 273 | 2.0138 | 0.5625 | 0.5799 | 2.3714 | 0.5625 | 0.5581 | 0.1705 | 0.2060 |
| No log | 40.0 | 280 | 2.0161 | 0.5525 | 0.5918 | 2.3704 | 0.5525 | 0.5439 | 0.1720 | 0.2111 |
| No log | 41.0 | 287 | 2.0315 | 0.545 | 0.5843 | 2.4582 | 0.545 | 0.5355 | 0.1782 | 0.2083 |
| No log | 42.0 | 294 | 2.0157 | 0.545 | 0.5861 | 2.5374 | 0.545 | 0.5386 | 0.1995 | 0.2129 |
| No log | 43.0 | 301 | 2.0495 | 0.555 | 0.5922 | 2.4841 | 0.555 | 0.5393 | 0.1538 | 0.2138 |
| No log | 44.0 | 308 | 2.0293 | 0.5525 | 0.5824 | 2.4853 | 0.5525 | 0.5352 | 0.1745 | 0.2042 |
| No log | 45.0 | 315 | 2.0253 | 0.5575 | 0.5776 | 2.4516 | 0.5575 | 0.5421 | 0.1978 | 0.2045 |
| No log | 46.0 | 322 | 2.0246 | 0.5525 | 0.5953 | 2.4196 | 0.5525 | 0.5362 | 0.1715 | 0.2122 |
| No log | 47.0 | 329 | 2.0114 | 0.555 | 0.5730 | 2.4431 | 0.555 | 0.5462 | 0.1759 | 0.2008 |
| No log | 48.0 | 336 | 2.0046 | 0.5575 | 0.5801 | 2.3784 | 0.5575 | 0.5445 | 0.1703 | 0.2039 |
| No log | 49.0 | 343 | 1.9721 | 0.565 | 0.5672 | 2.5034 | 0.565 | 0.5594 | 0.1686 | 0.1963 |
| No log | 50.0 | 350 | 1.9872 | 0.565 | 0.5704 | 2.4067 | 0.565 | 0.5620 | 0.1888 | 0.2000 |
| No log | 51.0 | 357 | 1.9668 | 0.5725 | 0.5695 | 2.3935 | 0.5725 | 0.5711 | 0.1534 | 0.1981 |
| No log | 52.0 | 364 | 1.9796 | 0.5525 | 0.5742 | 2.3977 | 0.5525 | 0.5504 | 0.1614 | 0.2023 |
| No log | 53.0 | 371 | 2.0086 | 0.56 | 0.5835 | 2.4361 | 0.56 | 0.5510 | 0.1912 | 0.2098 |
| No log | 54.0 | 378 | 1.9998 | 0.54 | 0.5776 | 2.4292 | 0.54 | 0.5270 | 0.1679 | 0.2042 |
| No log | 55.0 | 385 | 1.9736 | 0.555 | 0.5732 | 2.3619 | 0.555 | 0.5427 | 0.1830 | 0.2024 |
| No log | 56.0 | 392 | 1.9850 | 0.57 | 0.5691 | 2.4491 | 0.57 | 0.5670 | 0.1798 | 0.1967 |
| No log | 57.0 | 399 | 1.9680 | 0.5675 | 0.5680 | 2.4863 | 0.5675 | 0.5624 | 0.1644 | 0.1967 |
| No log | 58.0 | 406 | 1.9633 | 0.565 | 0.5682 | 2.4407 | 0.565 | 0.5573 | 0.1576 | 0.1960 |
| No log | 59.0 | 413 | 1.9715 | 0.555 | 0.5750 | 2.4037 | 0.555 | 0.5459 | 0.1753 | 0.2027 |
| No log | 60.0 | 420 | 1.9791 | 0.575 | 0.5684 | 2.3886 | 0.575 | 0.5732 | 0.1647 | 0.1969 |
| No log | 61.0 | 427 | 1.9587 | 0.5675 | 0.5669 | 2.4318 | 0.5675 | 0.5616 | 0.1583 | 0.1948 |
| No log | 62.0 | 434 | 1.9735 | 0.5575 | 0.5699 | 2.3758 | 0.5575 | 0.5550 | 0.1442 | 0.1978 |
| No log | 63.0 | 441 | 1.9720 | 0.5475 | 0.5748 | 2.4373 | 0.5475 | 0.5405 | 0.1660 | 0.2004 |
| No log | 64.0 | 448 | 1.9825 | 0.55 | 0.5710 | 2.3517 | 0.55 | 0.5490 | 0.1810 | 0.2030 |
| No log | 65.0 | 455 | 1.9679 | 0.5725 | 0.5705 | 2.4311 | 0.5725 | 0.5718 | 0.1690 | 0.1980 |
| No log | 66.0 | 462 | 1.9735 | 0.565 | 0.5706 | 2.4810 | 0.565 | 0.5624 | 0.1740 | 0.1976 |
| No log | 67.0 | 469 | 1.9752 | 0.5675 | 0.5689 | 2.3718 | 0.5675 | 0.5615 | 0.1681 | 0.1980 |
| No log | 68.0 | 476 | 1.9798 | 0.565 | 0.5690 | 2.4235 | 0.565 | 0.5602 | 0.1671 | 0.1980 |
| No log | 69.0 | 483 | 1.9720 | 0.5625 | 0.5681 | 2.3483 | 0.5625 | 0.5576 | 0.1845 | 0.1970 |
| No log | 70.0 | 490 | 1.9640 | 0.5675 | 0.5673 | 2.3772 | 0.5675 | 0.5639 | 0.1621 | 0.1962 |
| No log | 71.0 | 497 | 1.9641 | 0.5625 | 0.5680 | 2.3925 | 0.5625 | 0.5567 | 0.1670 | 0.1963 |
| 0.4915 | 72.0 | 504 | 1.9753 | 0.5625 | 0.5707 | 2.4507 | 0.5625 | 0.5566 | 0.1780 | 0.1986 |
| 0.4915 | 73.0 | 511 | 1.9792 | 0.56 | 0.5700 | 2.3604 | 0.56 | 0.5545 | 0.1580 | 0.1990 |
| 0.4915 | 74.0 | 518 | 1.9679 | 0.55 | 0.5700 | 2.3519 | 0.55 | 0.5412 | 0.1781 | 0.1981 |
| 0.4915 | 75.0 | 525 | 1.9711 | 0.57 | 0.5685 | 2.4204 | 0.57 | 0.5676 | 0.1891 | 0.1962 |
| 0.4915 | 76.0 | 532 | 1.9705 | 0.565 | 0.5686 | 2.3512 | 0.565 | 0.5600 | 0.1684 | 0.1967 |
| 0.4915 | 77.0 | 539 | 1.9673 | 0.5625 | 0.5685 | 2.3481 | 0.5625 | 0.5577 | 0.1761 | 0.1968 |
| 0.4915 | 78.0 | 546 | 1.9641 | 0.5625 | 0.5668 | 2.3519 | 0.5625 | 0.5577 | 0.1747 | 0.1964 |
| 0.4915 | 79.0 | 553 | 1.9685 | 0.5675 | 0.5677 | 2.3833 | 0.5675 | 0.5630 | 0.1490 | 0.1967 |
| 0.4915 | 80.0 | 560 | 1.9722 | 0.5625 | 0.5691 | 2.3864 | 0.5625 | 0.5565 | 0.1583 | 0.1979 |
| 0.4915 | 81.0 | 567 | 1.9641 | 0.5625 | 0.5675 | 2.3528 | 0.5625 | 0.5566 | 0.1820 | 0.1965 |
| 0.4915 | 82.0 | 574 | 1.9666 | 0.5625 | 0.5678 | 2.3516 | 0.5625 | 0.5564 | 0.1622 | 0.1968 |
| 0.4915 | 83.0 | 581 | 1.9703 | 0.56 | 0.5679 | 2.3564 | 0.56 | 0.5550 | 0.1577 | 0.1967 |
| 0.4915 | 84.0 | 588 | 1.9666 | 0.56 | 0.5673 | 2.3532 | 0.56 | 0.5546 | 0.1633 | 0.1967 |
| 0.4915 | 85.0 | 595 | 1.9683 | 0.56 | 0.5682 | 2.3506 | 0.56 | 0.5539 | 0.1526 | 0.1971 |
| 0.4915 | 86.0 | 602 | 1.9673 | 0.5575 | 0.5674 | 2.3523 | 0.5575 | 0.5516 | 0.1642 | 0.1972 |
| 0.4915 | 87.0 | 609 | 1.9678 | 0.56 | 0.5678 | 2.3514 | 0.56 | 0.5539 | 0.1543 | 0.1970 |
| 0.4915 | 88.0 | 616 | 1.9680 | 0.56 | 0.5678 | 2.3531 | 0.56 | 0.5539 | 0.1581 | 0.1970 |
| 0.4915 | 89.0 | 623 | 1.9683 | 0.5575 | 0.5681 | 2.3532 | 0.5575 | 0.5516 | 0.1732 | 0.1976 |
| 0.4915 | 90.0 | 630 | 1.9677 | 0.56 | 0.5677 | 2.3536 | 0.56 | 0.5539 | 0.1702 | 0.1970 |
| 0.4915 | 91.0 | 637 | 1.9680 | 0.5575 | 0.5679 | 2.3528 | 0.5575 | 0.5515 | 0.1760 | 0.1974 |
| 0.4915 | 92.0 | 644 | 1.9681 | 0.5575 | 0.5679 | 2.3522 | 0.5575 | 0.5515 | 0.1584 | 0.1975 |
| 0.4915 | 93.0 | 651 | 1.9680 | 0.5575 | 0.5678 | 2.3524 | 0.5575 | 0.5515 | 0.1610 | 0.1973 |
| 0.4915 | 94.0 | 658 | 1.9680 | 0.5575 | 0.5680 | 2.3521 | 0.5575 | 0.5515 | 0.1653 | 0.1973 |
| 0.4915 | 95.0 | 665 | 1.9681 | 0.5575 | 0.5679 | 2.3524 | 0.5575 | 0.5515 | 0.1663 | 0.1973 |
| 0.4915 | 96.0 | 672 | 1.9682 | 0.5575 | 0.5680 | 2.3528 | 0.5575 | 0.5516 | 0.1663 | 0.1973 |
| 0.4915 | 97.0 | 679 | 1.9682 | 0.5575 | 0.5680 | 2.3526 | 0.5575 | 0.5515 | 0.1625 | 0.1973 |
| 0.4915 | 98.0 | 686 | 1.9681 | 0.5575 | 0.5679 | 2.3526 | 0.5575 | 0.5516 | 0.1676 | 0.1973 |
| 0.4915 | 99.0 | 693 | 1.9682 | 0.5575 | 0.5679 | 2.3526 | 0.5575 | 0.5516 | 0.1635 | 0.1973 |
| 0.4915 | 100.0 | 700 | 1.9682 | 0.5575 | 0.5680 | 2.3526 | 0.5575 | 0.5516 | 0.1676 | 0.1973 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
sbaner24/vit-base-patch16-224-Soybean_11-46
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-Soybean_11-46
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2058
- Accuracy: 0.9306
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 60
- eval_batch_size: 60
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 240
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
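For reference, these settings map onto Hugging Face `TrainingArguments` roughly as below. This is a minimal sketch, not the exact training script; `output_dir` is a placeholder, and the Adam betas/epsilon are the library defaults:
```python
from transformers import TrainingArguments

# Effective batch size: 60 per device x 4 accumulation steps = 240 per update.
training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-Soybean_11-46",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=60,
    per_device_eval_batch_size=60,
    gradient_accumulation_steps=4,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
```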
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.3661 | 1.0 | 11 | 1.3698 | 0.5069 |
| 0.9979 | 2.0 | 22 | 0.9817 | 0.6632 |
| 0.6746 | 3.0 | 33 | 0.7423 | 0.7396 |
| 0.6364 | 4.0 | 44 | 0.6075 | 0.7569 |
| 0.5425 | 5.0 | 55 | 0.5500 | 0.7951 |
| 0.5001 | 6.0 | 66 | 0.4883 | 0.8160 |
| 0.3519 | 7.0 | 77 | 0.4539 | 0.8264 |
| 0.4421 | 8.0 | 88 | 0.4483 | 0.8194 |
| 0.3207 | 9.0 | 99 | 0.3785 | 0.8438 |
| 0.3682 | 10.0 | 110 | 0.3385 | 0.8646 |
| 0.2642 | 11.0 | 121 | 0.3827 | 0.8403 |
| 0.3444 | 12.0 | 132 | 0.3462 | 0.8507 |
| 0.2423 | 13.0 | 143 | 0.3170 | 0.8681 |
| 0.3168 | 14.0 | 154 | 0.3168 | 0.8715 |
| 0.2781 | 15.0 | 165 | 0.3323 | 0.8333 |
| 0.2411 | 16.0 | 176 | 0.3200 | 0.8715 |
| 0.2276 | 17.0 | 187 | 0.3296 | 0.875 |
| 0.192 | 18.0 | 198 | 0.3119 | 0.8854 |
| 0.1612 | 19.0 | 209 | 0.3647 | 0.875 |
| 0.1084 | 20.0 | 220 | 0.2641 | 0.8993 |
| 0.2099 | 21.0 | 231 | 0.2807 | 0.8958 |
| 0.1666 | 22.0 | 242 | 0.2595 | 0.9097 |
| 0.1355 | 23.0 | 253 | 0.2735 | 0.8924 |
| 0.1165 | 24.0 | 264 | 0.3238 | 0.8785 |
| 0.112 | 25.0 | 275 | 0.3066 | 0.8889 |
| 0.1191 | 26.0 | 286 | 0.2427 | 0.9062 |
| 0.1293 | 27.0 | 297 | 0.2536 | 0.9201 |
| 0.2932 | 28.0 | 308 | 0.2707 | 0.8924 |
| 0.0918 | 29.0 | 319 | 0.2688 | 0.8924 |
| 0.1529 | 30.0 | 330 | 0.2715 | 0.8889 |
| 0.227 | 31.0 | 341 | 0.2664 | 0.9028 |
| 0.1044 | 32.0 | 352 | 0.2809 | 0.8993 |
| 0.0894 | 33.0 | 363 | 0.2863 | 0.8924 |
| 0.0566 | 34.0 | 374 | 0.2474 | 0.9201 |
| 0.0915 | 35.0 | 385 | 0.2428 | 0.9097 |
| 0.1136 | 36.0 | 396 | 0.2545 | 0.9097 |
| 0.0947 | 37.0 | 407 | 0.2599 | 0.9097 |
| 0.1012 | 38.0 | 418 | 0.2454 | 0.9167 |
| 0.0465 | 39.0 | 429 | 0.2435 | 0.9201 |
| 0.0299 | 40.0 | 440 | 0.2532 | 0.9062 |
| 0.0311 | 41.0 | 451 | 0.2298 | 0.9271 |
| 0.0796 | 42.0 | 462 | 0.2422 | 0.9167 |
| 0.058 | 43.0 | 473 | 0.2058 | 0.9306 |
| 0.0853 | 44.0 | 484 | 0.2266 | 0.9306 |
| 0.0868 | 45.0 | 495 | 0.2266 | 0.9236 |
| 0.0554 | 46.0 | 506 | 0.2163 | 0.9271 |
| 0.0508 | 47.0 | 517 | 0.2104 | 0.9306 |
| 0.0589 | 48.0 | 528 | 0.2172 | 0.9271 |
| 0.0369 | 49.0 | 539 | 0.2214 | 0.9271 |
| 0.0852 | 50.0 | 550 | 0.2241 | 0.9271 |
### Framework versions
- Transformers 4.30.0.dev0
- Pytorch 1.12.1
- Datasets 2.12.0
- Tokenizers 0.13.1
|
[
"0",
"1",
"2",
"3",
"4"
] |
jordyvl/vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t2.5_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t2.5_a0.7
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7789
- Accuracy: 0.565
- Brier Loss: 0.5798
- Nll: 2.3548
- F1 Micro: 0.565
- F1 Macro: 0.5569
- Ece: 0.1677
- Aurc: 0.2032
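Brier loss and ECE above measure calibration rather than raw accuracy. A minimal NumPy sketch of both, assuming `probs` is an `(N, C)` array of softmax outputs and `labels` an `(N,)` array of class indices; the multi-class Brier score and 10-bin ECE below follow the standard conventions, not necessarily the exact implementation behind this card:
```python
import numpy as np

def brier_score(probs, labels):
    """Mean squared distance between predicted probabilities and one-hot targets."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=10):
    """Weighted average gap between confidence and accuracy over confidence bins."""
    conf = probs.max(axis=1)              # top-class confidence
    correct = (probs.argmax(axis=1) == labels).astype(float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return ece
```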
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the presumed distillation objective follows the list):
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
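The `CEKD_t2.5_a0.7` suffix suggests a cross-entropy plus knowledge-distillation objective with temperature 2.5 and mixing weight 0.7. The card does not spell the loss out, so the sketch below is one standard (Hinton-style) formulation under that reading; which term the weight multiplies is an assumption:
```python
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, T=2.5, alpha=0.7):
    """CE + KD mixture; alpha weighting the KD term is an assumption."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # T^2 rescaling keeps gradient magnitudes comparable to CE
    return alpha * kd + (1 - alpha) * ce
```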
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 4.7719 | 0.04 | 1.0712 | 7.3456 | 0.04 | 0.0315 | 0.2816 | 0.9543 |
| No log | 2.0 | 14 | 3.7636 | 0.1 | 0.9486 | 5.7989 | 0.1000 | 0.0863 | 0.1625 | 0.8854 |
| No log | 3.0 | 21 | 3.3871 | 0.235 | 0.8838 | 5.3315 | 0.235 | 0.1857 | 0.1589 | 0.6274 |
| No log | 4.0 | 28 | 2.8780 | 0.2975 | 0.7900 | 3.5467 | 0.2975 | 0.2884 | 0.1672 | 0.4712 |
| No log | 5.0 | 35 | 2.5223 | 0.3875 | 0.7164 | 3.0188 | 0.3875 | 0.3630 | 0.1596 | 0.3495 |
| No log | 6.0 | 42 | 2.3634 | 0.43 | 0.6927 | 3.1378 | 0.4300 | 0.4031 | 0.1877 | 0.3055 |
| No log | 7.0 | 49 | 2.3487 | 0.445 | 0.7009 | 3.0084 | 0.445 | 0.4123 | 0.2026 | 0.3003 |
| No log | 8.0 | 56 | 2.2521 | 0.47 | 0.6857 | 2.8634 | 0.47 | 0.4607 | 0.1977 | 0.2918 |
| No log | 9.0 | 63 | 2.3597 | 0.4775 | 0.6955 | 2.6768 | 0.4775 | 0.4642 | 0.2215 | 0.2787 |
| No log | 10.0 | 70 | 2.3524 | 0.46 | 0.7130 | 2.7830 | 0.46 | 0.4441 | 0.2429 | 0.2941 |
| No log | 11.0 | 77 | 2.4233 | 0.46 | 0.7108 | 3.0867 | 0.46 | 0.4418 | 0.2247 | 0.2828 |
| No log | 12.0 | 84 | 2.2723 | 0.485 | 0.6901 | 2.7313 | 0.485 | 0.4742 | 0.2293 | 0.2774 |
| No log | 13.0 | 91 | 2.2818 | 0.49 | 0.7007 | 2.5829 | 0.49 | 0.4859 | 0.2473 | 0.2791 |
| No log | 14.0 | 98 | 2.1682 | 0.4975 | 0.6695 | 2.8292 | 0.4975 | 0.4815 | 0.2170 | 0.2653 |
| No log | 15.0 | 105 | 2.0652 | 0.52 | 0.6520 | 2.4319 | 0.52 | 0.5120 | 0.2079 | 0.2617 |
| No log | 16.0 | 112 | 2.0524 | 0.5225 | 0.6384 | 2.5273 | 0.5225 | 0.5128 | 0.1980 | 0.2391 |
| No log | 17.0 | 119 | 2.1736 | 0.48 | 0.6765 | 2.4887 | 0.48 | 0.4809 | 0.2112 | 0.2834 |
| No log | 18.0 | 126 | 2.0096 | 0.515 | 0.6496 | 2.4230 | 0.515 | 0.5215 | 0.1957 | 0.2513 |
| No log | 19.0 | 133 | 2.0207 | 0.5175 | 0.6417 | 2.4591 | 0.5175 | 0.5104 | 0.2011 | 0.2526 |
| No log | 20.0 | 140 | 1.9152 | 0.5425 | 0.6044 | 2.5069 | 0.5425 | 0.5405 | 0.1826 | 0.2147 |
| No log | 21.0 | 147 | 1.9600 | 0.52 | 0.6332 | 2.5016 | 0.52 | 0.5204 | 0.1871 | 0.2427 |
| No log | 22.0 | 154 | 1.9325 | 0.515 | 0.6226 | 2.4981 | 0.515 | 0.5078 | 0.1860 | 0.2323 |
| No log | 23.0 | 161 | 1.9172 | 0.53 | 0.6144 | 2.4601 | 0.53 | 0.5221 | 0.1941 | 0.2311 |
| No log | 24.0 | 168 | 1.8891 | 0.5425 | 0.6091 | 2.4653 | 0.5425 | 0.5399 | 0.1933 | 0.2260 |
| No log | 25.0 | 175 | 1.9460 | 0.5175 | 0.6214 | 2.4785 | 0.5175 | 0.5168 | 0.1694 | 0.2335 |
| No log | 26.0 | 182 | 1.9060 | 0.5525 | 0.5970 | 2.4789 | 0.5525 | 0.5430 | 0.1934 | 0.2137 |
| No log | 27.0 | 189 | 1.9421 | 0.5375 | 0.6205 | 2.4814 | 0.5375 | 0.5313 | 0.2135 | 0.2402 |
| No log | 28.0 | 196 | 2.0195 | 0.545 | 0.6187 | 2.5330 | 0.545 | 0.5256 | 0.1800 | 0.2360 |
| No log | 29.0 | 203 | 1.9428 | 0.535 | 0.6167 | 2.4894 | 0.535 | 0.5111 | 0.1862 | 0.2322 |
| No log | 30.0 | 210 | 1.8996 | 0.5225 | 0.6207 | 2.4810 | 0.5225 | 0.5137 | 0.1994 | 0.2330 |
| No log | 31.0 | 217 | 1.8462 | 0.54 | 0.6119 | 2.4201 | 0.54 | 0.5341 | 0.1817 | 0.2190 |
| No log | 32.0 | 224 | 1.8324 | 0.55 | 0.5988 | 2.4230 | 0.55 | 0.5427 | 0.1888 | 0.2171 |
| No log | 33.0 | 231 | 1.8393 | 0.545 | 0.5977 | 2.3943 | 0.545 | 0.5339 | 0.1838 | 0.2172 |
| No log | 34.0 | 238 | 1.8704 | 0.5475 | 0.6081 | 2.4488 | 0.5475 | 0.5427 | 0.1768 | 0.2200 |
| No log | 35.0 | 245 | 1.8546 | 0.54 | 0.6022 | 2.3273 | 0.54 | 0.5316 | 0.1847 | 0.2226 |
| No log | 36.0 | 252 | 1.8608 | 0.53 | 0.5972 | 2.5153 | 0.53 | 0.5139 | 0.1810 | 0.2202 |
| No log | 37.0 | 259 | 1.8663 | 0.5325 | 0.6057 | 2.4642 | 0.5325 | 0.5243 | 0.1836 | 0.2205 |
| No log | 38.0 | 266 | 1.8300 | 0.545 | 0.5954 | 2.5101 | 0.545 | 0.5418 | 0.1890 | 0.2141 |
| No log | 39.0 | 273 | 1.8121 | 0.5625 | 0.5853 | 2.4397 | 0.5625 | 0.5550 | 0.1704 | 0.2110 |
| No log | 40.0 | 280 | 1.7916 | 0.54 | 0.5884 | 2.3565 | 0.54 | 0.5361 | 0.1685 | 0.2135 |
| No log | 41.0 | 287 | 1.8353 | 0.5575 | 0.5929 | 2.4252 | 0.5575 | 0.5451 | 0.1823 | 0.2116 |
| No log | 42.0 | 294 | 1.7999 | 0.5675 | 0.5839 | 2.4820 | 0.5675 | 0.5631 | 0.1729 | 0.2045 |
| No log | 43.0 | 301 | 1.8622 | 0.52 | 0.6106 | 2.4823 | 0.52 | 0.5028 | 0.1948 | 0.2270 |
| No log | 44.0 | 308 | 1.7892 | 0.55 | 0.5892 | 2.3342 | 0.55 | 0.5450 | 0.1798 | 0.2126 |
| No log | 45.0 | 315 | 1.7978 | 0.545 | 0.5868 | 2.4345 | 0.545 | 0.5439 | 0.1894 | 0.2094 |
| No log | 46.0 | 322 | 1.7697 | 0.56 | 0.5772 | 2.4272 | 0.56 | 0.5585 | 0.1601 | 0.1997 |
| No log | 47.0 | 329 | 1.7754 | 0.5475 | 0.5835 | 2.3977 | 0.5475 | 0.5438 | 0.1759 | 0.2059 |
| No log | 48.0 | 336 | 1.7922 | 0.545 | 0.5929 | 2.4119 | 0.545 | 0.5390 | 0.1891 | 0.2131 |
| No log | 49.0 | 343 | 1.8055 | 0.5625 | 0.5872 | 2.3654 | 0.5625 | 0.5497 | 0.1759 | 0.2073 |
| No log | 50.0 | 350 | 1.7972 | 0.56 | 0.5894 | 2.3366 | 0.56 | 0.5487 | 0.1803 | 0.2083 |
| No log | 51.0 | 357 | 1.7890 | 0.555 | 0.5815 | 2.3858 | 0.555 | 0.5501 | 0.1693 | 0.2067 |
| No log | 52.0 | 364 | 1.7958 | 0.5475 | 0.5883 | 2.4244 | 0.5475 | 0.5355 | 0.1910 | 0.2105 |
| No log | 53.0 | 371 | 1.7881 | 0.5675 | 0.5834 | 2.4135 | 0.5675 | 0.5603 | 0.1836 | 0.2028 |
| No log | 54.0 | 378 | 1.7675 | 0.555 | 0.5766 | 2.4043 | 0.555 | 0.5563 | 0.1653 | 0.2047 |
| No log | 55.0 | 385 | 1.7688 | 0.55 | 0.5843 | 2.3641 | 0.55 | 0.5505 | 0.1729 | 0.2092 |
| No log | 56.0 | 392 | 1.7796 | 0.55 | 0.5861 | 2.3404 | 0.55 | 0.5458 | 0.1808 | 0.2114 |
| No log | 57.0 | 399 | 1.7861 | 0.54 | 0.5885 | 2.3460 | 0.54 | 0.5323 | 0.1902 | 0.2073 |
| No log | 58.0 | 406 | 1.7746 | 0.56 | 0.5818 | 2.3715 | 0.56 | 0.5557 | 0.1643 | 0.2034 |
| No log | 59.0 | 413 | 1.7828 | 0.5575 | 0.5868 | 2.3086 | 0.5575 | 0.5526 | 0.1956 | 0.2088 |
| No log | 60.0 | 420 | 1.7735 | 0.565 | 0.5825 | 2.3405 | 0.565 | 0.5619 | 0.1696 | 0.2058 |
| No log | 61.0 | 427 | 1.7651 | 0.5675 | 0.5760 | 2.4771 | 0.5675 | 0.5636 | 0.1847 | 0.2027 |
| No log | 62.0 | 434 | 1.7751 | 0.5575 | 0.5834 | 2.3727 | 0.5575 | 0.5524 | 0.1638 | 0.2052 |
| No log | 63.0 | 441 | 1.7900 | 0.56 | 0.5834 | 2.3635 | 0.56 | 0.5502 | 0.1789 | 0.2061 |
| No log | 64.0 | 448 | 1.7729 | 0.56 | 0.5821 | 2.3797 | 0.56 | 0.5554 | 0.1676 | 0.2046 |
| No log | 65.0 | 455 | 1.7743 | 0.5625 | 0.5826 | 2.4174 | 0.5625 | 0.5581 | 0.1538 | 0.2052 |
| No log | 66.0 | 462 | 1.7749 | 0.5625 | 0.5801 | 2.3799 | 0.5625 | 0.5592 | 0.1709 | 0.2036 |
| No log | 67.0 | 469 | 1.7795 | 0.5625 | 0.5814 | 2.3169 | 0.5625 | 0.5533 | 0.1883 | 0.2037 |
| No log | 68.0 | 476 | 1.7773 | 0.5675 | 0.5794 | 2.3588 | 0.5675 | 0.5622 | 0.1779 | 0.2013 |
| No log | 69.0 | 483 | 1.7762 | 0.56 | 0.5793 | 2.3514 | 0.56 | 0.5566 | 0.1707 | 0.2039 |
| No log | 70.0 | 490 | 1.7762 | 0.5625 | 0.5787 | 2.3620 | 0.5625 | 0.5529 | 0.1607 | 0.2017 |
| No log | 71.0 | 497 | 1.7740 | 0.5675 | 0.5798 | 2.3235 | 0.5675 | 0.5612 | 0.1637 | 0.2046 |
| 0.4215 | 72.0 | 504 | 1.7739 | 0.56 | 0.5790 | 2.3235 | 0.56 | 0.5542 | 0.1583 | 0.2023 |
| 0.4215 | 73.0 | 511 | 1.7783 | 0.56 | 0.5806 | 2.4187 | 0.56 | 0.5545 | 0.1674 | 0.2040 |
| 0.4215 | 74.0 | 518 | 1.7785 | 0.56 | 0.5805 | 2.3302 | 0.56 | 0.5544 | 0.1748 | 0.2033 |
| 0.4215 | 75.0 | 525 | 1.7777 | 0.5625 | 0.5795 | 2.3321 | 0.5625 | 0.5548 | 0.1754 | 0.2029 |
| 0.4215 | 76.0 | 532 | 1.7785 | 0.565 | 0.5799 | 2.3249 | 0.565 | 0.5586 | 0.1696 | 0.2023 |
| 0.4215 | 77.0 | 539 | 1.7763 | 0.565 | 0.5790 | 2.3561 | 0.565 | 0.5573 | 0.1574 | 0.2022 |
| 0.4215 | 78.0 | 546 | 1.7767 | 0.565 | 0.5790 | 2.3296 | 0.565 | 0.5572 | 0.1633 | 0.2024 |
| 0.4215 | 79.0 | 553 | 1.7763 | 0.565 | 0.5790 | 2.3555 | 0.565 | 0.5580 | 0.1687 | 0.2016 |
| 0.4215 | 80.0 | 560 | 1.7783 | 0.565 | 0.5800 | 2.3254 | 0.565 | 0.5576 | 0.1752 | 0.2032 |
| 0.4215 | 81.0 | 567 | 1.7773 | 0.5675 | 0.5796 | 2.3530 | 0.5675 | 0.5605 | 0.1519 | 0.2023 |
| 0.4215 | 82.0 | 574 | 1.7774 | 0.5625 | 0.5797 | 2.3253 | 0.5625 | 0.5549 | 0.1911 | 0.2028 |
| 0.4215 | 83.0 | 581 | 1.7784 | 0.5625 | 0.5794 | 2.3554 | 0.5625 | 0.5544 | 0.1659 | 0.2030 |
| 0.4215 | 84.0 | 588 | 1.7769 | 0.565 | 0.5793 | 2.3527 | 0.565 | 0.5585 | 0.1588 | 0.2024 |
| 0.4215 | 85.0 | 595 | 1.7787 | 0.565 | 0.5799 | 2.3549 | 0.565 | 0.5576 | 0.1687 | 0.2032 |
| 0.4215 | 86.0 | 602 | 1.7778 | 0.565 | 0.5795 | 2.3548 | 0.565 | 0.5574 | 0.1577 | 0.2029 |
| 0.4215 | 87.0 | 609 | 1.7787 | 0.5625 | 0.5798 | 2.3545 | 0.5625 | 0.5549 | 0.1643 | 0.2032 |
| 0.4215 | 88.0 | 616 | 1.7786 | 0.565 | 0.5796 | 2.3554 | 0.565 | 0.5574 | 0.1667 | 0.2031 |
| 0.4215 | 89.0 | 623 | 1.7785 | 0.565 | 0.5799 | 2.3546 | 0.565 | 0.5574 | 0.1691 | 0.2032 |
| 0.4215 | 90.0 | 630 | 1.7784 | 0.565 | 0.5797 | 2.3548 | 0.565 | 0.5574 | 0.1656 | 0.2031 |
| 0.4215 | 91.0 | 637 | 1.7784 | 0.565 | 0.5797 | 2.3550 | 0.565 | 0.5569 | 0.1753 | 0.2032 |
| 0.4215 | 92.0 | 644 | 1.7786 | 0.565 | 0.5797 | 2.3545 | 0.565 | 0.5574 | 0.1744 | 0.2030 |
| 0.4215 | 93.0 | 651 | 1.7785 | 0.565 | 0.5797 | 2.3545 | 0.565 | 0.5574 | 0.1709 | 0.2031 |
| 0.4215 | 94.0 | 658 | 1.7787 | 0.565 | 0.5797 | 2.3543 | 0.565 | 0.5574 | 0.1704 | 0.2032 |
| 0.4215 | 95.0 | 665 | 1.7787 | 0.565 | 0.5798 | 2.3545 | 0.565 | 0.5574 | 0.1713 | 0.2032 |
| 0.4215 | 96.0 | 672 | 1.7788 | 0.565 | 0.5798 | 2.3549 | 0.565 | 0.5569 | 0.1777 | 0.2031 |
| 0.4215 | 97.0 | 679 | 1.7789 | 0.565 | 0.5798 | 2.3550 | 0.565 | 0.5569 | 0.1677 | 0.2032 |
| 0.4215 | 98.0 | 686 | 1.7789 | 0.565 | 0.5798 | 2.3549 | 0.565 | 0.5569 | 0.1648 | 0.2032 |
| 0.4215 | 99.0 | 693 | 1.7789 | 0.565 | 0.5798 | 2.3548 | 0.565 | 0.5569 | 0.1728 | 0.2032 |
| 0.4215 | 100.0 | 700 | 1.7789 | 0.565 | 0.5798 | 2.3548 | 0.565 | 0.5569 | 0.1677 | 0.2032 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
ayanban011/vit-base_tobacco_lr5e-6_e_200
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base_tobacco_lr5e-6_e_200
This model is a fine-tuned version of [jordyvl/vit-base_tobacco](https://huggingface.co/jordyvl/vit-base_tobacco) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7893
- Accuracy: 0.78
- Brier Loss: 0.3364
- Nll: 1.4430
- F1 Micro: 0.78
- F1 Macro: 0.7488
- Ece: 0.1922
- Aurc: 0.1018
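Aurc is presumably the area under the risk–coverage curve (lower is better): rank predictions by confidence, then average the error rate over every coverage prefix. A small NumPy sketch under that assumption:
```python
import numpy as np

def aurc(probs, labels):
    """Mean selective risk over all coverage levels, most-confident-first."""
    conf = probs.max(axis=1)
    errors = (probs.argmax(axis=1) != labels).astype(float)
    order = np.argsort(-conf)  # most confident predictions first
    coverage_risk = np.cumsum(errors[order]) / np.arange(1, len(labels) + 1)
    return coverage_risk.mean()
```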
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a gradient-accumulation sketch follows the list):
- learning_rate: 5e-06
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 200
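With a per-device batch of 4 and 16 accumulation steps, each optimizer update averages gradients over 64 examples. A minimal PyTorch sketch of that pattern; the `Trainer` handles this internally, so this only illustrates the mechanics:
```python
def train_one_epoch(model, loader, optimizer, accum_steps=16):
    """Gradient accumulation: 4-example batches, one update per 64 examples."""
    model.train()
    optimizer.zero_grad()
    for step, batch in enumerate(loader):
        loss = model(**batch).loss        # HF classifiers return .loss when labels are passed
        (loss / accum_steps).backward()   # average gradients over the window
        if (step + 1) % accum_steps == 0:
            optimizer.step()
            optimizer.zero_grad()
```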
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:------:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 12 | 0.7450 | 0.815 | 0.3079 | 1.1882 | 0.815 | 0.7942 | 0.2383 | 0.0734 |
| No log | 2.0 | 25 | 0.7440 | 0.815 | 0.3074 | 1.1877 | 0.815 | 0.7922 | 0.2303 | 0.0734 |
| No log | 2.96 | 37 | 0.7429 | 0.81 | 0.3071 | 1.1883 | 0.81 | 0.7914 | 0.2367 | 0.0714 |
| No log | 4.0 | 50 | 0.7463 | 0.815 | 0.3083 | 1.1912 | 0.815 | 0.7942 | 0.2334 | 0.0768 |
| No log | 4.96 | 62 | 0.7453 | 0.815 | 0.3080 | 1.1927 | 0.815 | 0.7922 | 0.2224 | 0.0751 |
| No log | 6.0 | 75 | 0.7416 | 0.81 | 0.3067 | 1.1870 | 0.81 | 0.7914 | 0.2232 | 0.0716 |
| No log | 6.96 | 87 | 0.7420 | 0.81 | 0.3070 | 1.1858 | 0.81 | 0.7914 | 0.2309 | 0.0730 |
| No log | 8.0 | 100 | 0.7409 | 0.81 | 0.3062 | 1.1879 | 0.81 | 0.7871 | 0.2186 | 0.0749 |
| No log | 8.96 | 112 | 0.7444 | 0.815 | 0.3079 | 1.1984 | 0.815 | 0.8059 | 0.2342 | 0.0766 |
| No log | 10.0 | 125 | 0.7445 | 0.81 | 0.3079 | 1.1918 | 0.81 | 0.7894 | 0.2353 | 0.0775 |
| No log | 10.96 | 137 | 0.7451 | 0.81 | 0.3085 | 1.1888 | 0.81 | 0.7871 | 0.2319 | 0.0784 |
| No log | 12.0 | 150 | 0.7411 | 0.805 | 0.3060 | 1.1914 | 0.805 | 0.7829 | 0.2186 | 0.0779 |
| No log | 12.96 | 162 | 0.7406 | 0.815 | 0.3075 | 1.1967 | 0.815 | 0.8057 | 0.2472 | 0.0746 |
| No log | 14.0 | 175 | 0.7360 | 0.805 | 0.3048 | 1.2494 | 0.805 | 0.7841 | 0.2383 | 0.0770 |
| No log | 14.96 | 187 | 0.7331 | 0.81 | 0.3037 | 1.1896 | 0.81 | 0.7912 | 0.2306 | 0.0721 |
| No log | 16.0 | 200 | 0.7345 | 0.81 | 0.3042 | 1.2488 | 0.81 | 0.7956 | 0.2064 | 0.0794 |
| No log | 16.96 | 212 | 0.7329 | 0.815 | 0.3018 | 1.2595 | 0.815 | 0.8043 | 0.2331 | 0.0797 |
| No log | 18.0 | 225 | 0.7354 | 0.795 | 0.3055 | 1.1743 | 0.795 | 0.7765 | 0.2154 | 0.0742 |
| No log | 18.96 | 237 | 0.7282 | 0.805 | 0.3001 | 1.1920 | 0.805 | 0.7844 | 0.2444 | 0.0748 |
| No log | 20.0 | 250 | 0.7225 | 0.805 | 0.2981 | 1.1804 | 0.805 | 0.7874 | 0.2041 | 0.0744 |
| No log | 20.96 | 262 | 0.7250 | 0.81 | 0.3000 | 1.2516 | 0.81 | 0.7900 | 0.2148 | 0.0761 |
| No log | 22.0 | 275 | 0.7252 | 0.81 | 0.3009 | 1.3033 | 0.81 | 0.7954 | 0.2375 | 0.0782 |
| No log | 22.96 | 287 | 0.7293 | 0.8 | 0.3016 | 1.2557 | 0.8000 | 0.7796 | 0.2243 | 0.0810 |
| No log | 24.0 | 300 | 0.7344 | 0.805 | 0.3057 | 1.3122 | 0.805 | 0.7853 | 0.2096 | 0.0815 |
| No log | 24.96 | 312 | 0.7314 | 0.81 | 0.3044 | 1.2501 | 0.81 | 0.7909 | 0.2068 | 0.0808 |
| No log | 26.0 | 325 | 0.7293 | 0.81 | 0.3036 | 1.3066 | 0.81 | 0.7909 | 0.2015 | 0.0804 |
| No log | 26.96 | 337 | 0.7323 | 0.805 | 0.3042 | 1.3147 | 0.805 | 0.7853 | 0.2242 | 0.0827 |
| No log | 28.0 | 350 | 0.7288 | 0.805 | 0.3029 | 1.3109 | 0.805 | 0.7853 | 0.1976 | 0.0826 |
| No log | 28.96 | 362 | 0.7343 | 0.805 | 0.3058 | 1.3108 | 0.805 | 0.7853 | 0.2128 | 0.0851 |
| No log | 30.0 | 375 | 0.7351 | 0.8 | 0.3062 | 1.3129 | 0.8000 | 0.7733 | 0.2088 | 0.0845 |
| No log | 30.96 | 387 | 0.7301 | 0.8 | 0.3044 | 1.2452 | 0.8000 | 0.7733 | 0.2272 | 0.0836 |
| No log | 32.0 | 400 | 0.7340 | 0.8 | 0.3055 | 1.3769 | 0.8000 | 0.7806 | 0.2207 | 0.0855 |
| No log | 32.96 | 412 | 0.7322 | 0.805 | 0.3052 | 1.3132 | 0.805 | 0.7784 | 0.2470 | 0.0865 |
| No log | 34.0 | 425 | 0.7301 | 0.8 | 0.3045 | 1.2543 | 0.8000 | 0.7733 | 0.2021 | 0.0863 |
| No log | 34.96 | 437 | 0.7297 | 0.8 | 0.3046 | 1.3056 | 0.8000 | 0.7731 | 0.1886 | 0.0836 |
| No log | 36.0 | 450 | 0.7326 | 0.805 | 0.3056 | 1.3102 | 0.805 | 0.7784 | 0.2256 | 0.0892 |
| No log | 36.96 | 462 | 0.7311 | 0.8 | 0.3046 | 1.3681 | 0.8000 | 0.7733 | 0.2065 | 0.0880 |
| No log | 38.0 | 475 | 0.7356 | 0.8 | 0.3077 | 1.3666 | 0.8000 | 0.7733 | 0.2034 | 0.0895 |
| No log | 38.96 | 487 | 0.7302 | 0.8 | 0.3042 | 1.3114 | 0.8000 | 0.7733 | 0.1996 | 0.0894 |
| 0.1528 | 40.0 | 500 | 0.7289 | 0.805 | 0.3035 | 1.3669 | 0.805 | 0.7797 | 0.1951 | 0.0874 |
| 0.1528 | 40.96 | 512 | 0.7292 | 0.8 | 0.3037 | 1.3685 | 0.8000 | 0.7737 | 0.2103 | 0.0878 |
| 0.1528 | 42.0 | 525 | 0.7347 | 0.805 | 0.3077 | 1.3640 | 0.805 | 0.7784 | 0.2005 | 0.0894 |
| 0.1528 | 42.96 | 537 | 0.7343 | 0.805 | 0.3063 | 1.3684 | 0.805 | 0.7784 | 0.2055 | 0.0899 |
| 0.1528 | 44.0 | 550 | 0.7283 | 0.805 | 0.3038 | 1.3660 | 0.805 | 0.7797 | 0.2142 | 0.0881 |
| 0.1528 | 44.96 | 562 | 0.7325 | 0.805 | 0.3061 | 1.3658 | 0.805 | 0.7784 | 0.1967 | 0.0896 |
| 0.1528 | 46.0 | 575 | 0.7303 | 0.81 | 0.3049 | 1.3659 | 0.81 | 0.7844 | 0.1985 | 0.0883 |
| 0.1528 | 46.96 | 587 | 0.7321 | 0.805 | 0.3054 | 1.3657 | 0.805 | 0.7797 | 0.1925 | 0.0888 |
| 0.1528 | 48.0 | 600 | 0.7322 | 0.8 | 0.3054 | 1.3680 | 0.8000 | 0.7704 | 0.1837 | 0.0903 |
| 0.1528 | 48.96 | 612 | 0.7368 | 0.8 | 0.3079 | 1.3234 | 0.8000 | 0.7704 | 0.2111 | 0.0905 |
| 0.1528 | 50.0 | 625 | 0.7358 | 0.8 | 0.3074 | 1.3682 | 0.8000 | 0.7704 | 0.1971 | 0.0899 |
| 0.1528 | 50.96 | 637 | 0.7318 | 0.8 | 0.3054 | 1.3661 | 0.8000 | 0.7704 | 0.2111 | 0.0889 |
| 0.1528 | 52.0 | 650 | 0.7473 | 0.795 | 0.3133 | 1.3707 | 0.795 | 0.7660 | 0.2032 | 0.0954 |
| 0.1528 | 52.96 | 662 | 0.7299 | 0.805 | 0.3050 | 1.3631 | 0.805 | 0.7797 | 0.1977 | 0.0891 |
| 0.1528 | 54.0 | 675 | 0.7427 | 0.795 | 0.3116 | 1.3673 | 0.795 | 0.7644 | 0.2181 | 0.0923 |
| 0.1528 | 54.96 | 687 | 0.7371 | 0.8 | 0.3087 | 1.3648 | 0.8000 | 0.7704 | 0.1984 | 0.0902 |
| 0.1528 | 56.0 | 700 | 0.7399 | 0.8 | 0.3100 | 1.3671 | 0.8000 | 0.7704 | 0.2178 | 0.0920 |
| 0.1528 | 56.96 | 712 | 0.7420 | 0.8 | 0.3114 | 1.3671 | 0.8000 | 0.7689 | 0.2035 | 0.0925 |
| 0.1528 | 58.0 | 725 | 0.7380 | 0.8 | 0.3088 | 1.3646 | 0.8000 | 0.7704 | 0.2016 | 0.0916 |
| 0.1528 | 58.96 | 737 | 0.7390 | 0.8 | 0.3091 | 1.3671 | 0.8000 | 0.7704 | 0.2005 | 0.0921 |
| 0.1528 | 60.0 | 750 | 0.7370 | 0.8 | 0.3085 | 1.3651 | 0.8000 | 0.7704 | 0.1891 | 0.0908 |
| 0.1528 | 60.96 | 762 | 0.7388 | 0.8 | 0.3097 | 1.3622 | 0.8000 | 0.7704 | 0.2010 | 0.0906 |
| 0.1528 | 62.0 | 775 | 0.7403 | 0.795 | 0.3102 | 1.3691 | 0.795 | 0.7654 | 0.2142 | 0.0910 |
| 0.1528 | 62.96 | 787 | 0.7390 | 0.8 | 0.3088 | 1.3950 | 0.8000 | 0.7684 | 0.2058 | 0.0910 |
| 0.1528 | 64.0 | 800 | 0.7431 | 0.795 | 0.3110 | 1.3722 | 0.795 | 0.7656 | 0.1959 | 0.0914 |
| 0.1528 | 64.96 | 812 | 0.7429 | 0.8 | 0.3117 | 1.3673 | 0.8000 | 0.7704 | 0.1933 | 0.0925 |
| 0.1528 | 66.0 | 825 | 0.7419 | 0.8 | 0.3115 | 1.3663 | 0.8000 | 0.7704 | 0.1986 | 0.0913 |
| 0.1528 | 66.96 | 837 | 0.7434 | 0.8 | 0.3120 | 1.3672 | 0.8000 | 0.7704 | 0.1929 | 0.0927 |
| 0.1528 | 68.0 | 850 | 0.7414 | 0.8 | 0.3115 | 1.3649 | 0.8000 | 0.7704 | 0.1988 | 0.0923 |
| 0.1528 | 68.96 | 862 | 0.7448 | 0.8 | 0.3129 | 1.3685 | 0.8000 | 0.7704 | 0.2251 | 0.0928 |
| 0.1528 | 70.0 | 875 | 0.7450 | 0.8 | 0.3130 | 1.3657 | 0.8000 | 0.7704 | 0.1969 | 0.0934 |
| 0.1528 | 70.96 | 887 | 0.7464 | 0.8 | 0.3132 | 1.3686 | 0.8000 | 0.7704 | 0.1988 | 0.0946 |
| 0.1528 | 72.0 | 900 | 0.7465 | 0.8 | 0.3138 | 1.3682 | 0.8000 | 0.7707 | 0.2015 | 0.0935 |
| 0.1528 | 72.96 | 912 | 0.7471 | 0.8 | 0.3142 | 1.3685 | 0.8000 | 0.7696 | 0.2093 | 0.0936 |
| 0.1528 | 74.0 | 925 | 0.7476 | 0.8 | 0.3145 | 1.3685 | 0.8000 | 0.7704 | 0.2120 | 0.0942 |
| 0.1528 | 74.96 | 937 | 0.7433 | 0.8 | 0.3126 | 1.3655 | 0.8000 | 0.7696 | 0.1907 | 0.0918 |
| 0.1528 | 76.0 | 950 | 0.7519 | 0.8 | 0.3166 | 1.3683 | 0.8000 | 0.7704 | 0.2111 | 0.0949 |
| 0.1528 | 76.96 | 962 | 0.7485 | 0.8 | 0.3152 | 1.3679 | 0.8000 | 0.7696 | 0.2035 | 0.0942 |
| 0.1528 | 78.0 | 975 | 0.7496 | 0.8 | 0.3155 | 1.3716 | 0.8000 | 0.7704 | 0.1931 | 0.0940 |
| 0.1528 | 78.96 | 987 | 0.7513 | 0.8 | 0.3163 | 1.3684 | 0.8000 | 0.7704 | 0.1888 | 0.0952 |
| 0.1059 | 80.0 | 1000 | 0.7490 | 0.8 | 0.3157 | 1.3707 | 0.8000 | 0.7691 | 0.2118 | 0.0943 |
| 0.1059 | 80.96 | 1012 | 0.7482 | 0.8 | 0.3151 | 1.3687 | 0.8000 | 0.7696 | 0.2060 | 0.0945 |
| 0.1059 | 82.0 | 1025 | 0.7516 | 0.8 | 0.3163 | 1.3682 | 0.8000 | 0.7704 | 0.2248 | 0.0950 |
| 0.1059 | 82.96 | 1037 | 0.7527 | 0.8 | 0.3174 | 1.3660 | 0.8000 | 0.7707 | 0.2204 | 0.0948 |
| 0.1059 | 84.0 | 1050 | 0.7495 | 0.8 | 0.3156 | 1.3663 | 0.8000 | 0.7704 | 0.2067 | 0.0939 |
| 0.1059 | 84.96 | 1062 | 0.7506 | 0.8 | 0.3161 | 1.3659 | 0.8000 | 0.7707 | 0.2086 | 0.0947 |
| 0.1059 | 86.0 | 1075 | 0.7537 | 0.8 | 0.3179 | 1.3687 | 0.8000 | 0.7698 | 0.2001 | 0.0955 |
| 0.1059 | 86.96 | 1087 | 0.7525 | 0.8 | 0.3172 | 1.3693 | 0.8000 | 0.7696 | 0.2096 | 0.0947 |
| 0.1059 | 88.0 | 1100 | 0.7535 | 0.8 | 0.3175 | 1.3675 | 0.8000 | 0.7704 | 0.2091 | 0.0951 |
| 0.1059 | 88.96 | 1112 | 0.7525 | 0.8 | 0.3172 | 1.3680 | 0.8000 | 0.7696 | 0.2009 | 0.0946 |
| 0.1059 | 90.0 | 1125 | 0.7566 | 0.8 | 0.3190 | 1.3725 | 0.8000 | 0.7698 | 0.1881 | 0.0964 |
| 0.1059 | 90.96 | 1137 | 0.7578 | 0.8 | 0.3195 | 1.3726 | 0.8000 | 0.7704 | 0.1880 | 0.0968 |
| 0.1059 | 92.0 | 1150 | 0.7560 | 0.8 | 0.3186 | 1.3715 | 0.8000 | 0.7707 | 0.2095 | 0.0969 |
| 0.1059 | 92.96 | 1162 | 0.7623 | 0.795 | 0.3219 | 1.3765 | 0.795 | 0.7681 | 0.1907 | 0.0979 |
| 0.1059 | 94.0 | 1175 | 0.7567 | 0.8 | 0.3192 | 1.3709 | 0.8000 | 0.7698 | 0.2000 | 0.0953 |
| 0.1059 | 94.96 | 1187 | 0.7538 | 0.8 | 0.3181 | 1.3708 | 0.8000 | 0.7691 | 0.1986 | 0.0958 |
| 0.1059 | 96.0 | 1200 | 0.7530 | 0.8 | 0.3175 | 1.3725 | 0.8000 | 0.7693 | 0.2122 | 0.0958 |
| 0.1059 | 96.96 | 1212 | 0.7607 | 0.8 | 0.3207 | 1.3730 | 0.8000 | 0.7709 | 0.1906 | 0.0972 |
| 0.1059 | 98.0 | 1225 | 0.7647 | 0.79 | 0.3229 | 1.3411 | 0.79 | 0.7633 | 0.1889 | 0.0984 |
| 0.1059 | 98.96 | 1237 | 0.7568 | 0.79 | 0.3191 | 1.4425 | 0.79 | 0.7644 | 0.1911 | 0.0976 |
| 0.1059 | 100.0 | 1250 | 0.7586 | 0.8 | 0.3200 | 1.4486 | 0.8000 | 0.7696 | 0.2009 | 0.0952 |
| 0.1059 | 100.96 | 1262 | 0.7552 | 0.8 | 0.3192 | 1.3728 | 0.8000 | 0.7696 | 0.1962 | 0.0953 |
| 0.1059 | 102.0 | 1275 | 0.7601 | 0.8 | 0.3217 | 1.4309 | 0.8000 | 0.7691 | 0.2071 | 0.0969 |
| 0.1059 | 102.96 | 1287 | 0.7608 | 0.795 | 0.3213 | 1.3702 | 0.795 | 0.7668 | 0.2068 | 0.0967 |
| 0.1059 | 104.0 | 1300 | 0.7590 | 0.795 | 0.3202 | 1.3748 | 0.795 | 0.7668 | 0.1922 | 0.0967 |
| 0.1059 | 104.96 | 1312 | 0.7626 | 0.795 | 0.3222 | 1.3775 | 0.795 | 0.7673 | 0.1917 | 0.0974 |
| 0.1059 | 106.0 | 1325 | 0.7632 | 0.795 | 0.3228 | 1.3765 | 0.795 | 0.7664 | 0.2072 | 0.0977 |
| 0.1059 | 106.96 | 1337 | 0.7612 | 0.795 | 0.3223 | 1.3764 | 0.795 | 0.7666 | 0.2011 | 0.0974 |
| 0.1059 | 108.0 | 1350 | 0.7669 | 0.79 | 0.3246 | 1.3777 | 0.79 | 0.7639 | 0.2005 | 0.0981 |
| 0.1059 | 108.96 | 1362 | 0.7658 | 0.795 | 0.3238 | 1.3782 | 0.795 | 0.7673 | 0.2099 | 0.0983 |
| 0.1059 | 110.0 | 1375 | 0.7632 | 0.79 | 0.3232 | 1.3766 | 0.79 | 0.7639 | 0.2002 | 0.0978 |
| 0.1059 | 110.96 | 1387 | 0.7651 | 0.79 | 0.3236 | 1.3758 | 0.79 | 0.7580 | 0.2134 | 0.0980 |
| 0.1059 | 112.0 | 1400 | 0.7649 | 0.79 | 0.3235 | 1.3765 | 0.79 | 0.7583 | 0.1975 | 0.0982 |
| 0.1059 | 112.96 | 1412 | 0.7683 | 0.79 | 0.3253 | 1.3759 | 0.79 | 0.7639 | 0.1913 | 0.0986 |
| 0.1059 | 114.0 | 1425 | 0.7682 | 0.795 | 0.3253 | 1.3778 | 0.795 | 0.7668 | 0.1986 | 0.0984 |
| 0.1059 | 114.96 | 1437 | 0.7666 | 0.79 | 0.3245 | 1.3771 | 0.79 | 0.7639 | 0.1838 | 0.0976 |
| 0.1059 | 116.0 | 1450 | 0.7685 | 0.79 | 0.3255 | 1.3802 | 0.79 | 0.7639 | 0.1962 | 0.0988 |
| 0.1059 | 116.96 | 1462 | 0.7676 | 0.785 | 0.3249 | 1.3783 | 0.785 | 0.7554 | 0.2011 | 0.0991 |
| 0.1059 | 118.0 | 1475 | 0.7704 | 0.785 | 0.3262 | 1.3789 | 0.785 | 0.7516 | 0.2098 | 0.0995 |
| 0.1059 | 118.96 | 1487 | 0.7701 | 0.785 | 0.3262 | 1.3806 | 0.785 | 0.7554 | 0.2167 | 0.0991 |
| 0.0842 | 120.0 | 1500 | 0.7708 | 0.79 | 0.3267 | 1.3815 | 0.79 | 0.7639 | 0.1934 | 0.0990 |
| 0.0842 | 120.96 | 1512 | 0.7710 | 0.785 | 0.3265 | 1.3790 | 0.785 | 0.7516 | 0.1928 | 0.0995 |
| 0.0842 | 122.0 | 1525 | 0.7728 | 0.79 | 0.3274 | 1.3830 | 0.79 | 0.7639 | 0.1917 | 0.0996 |
| 0.0842 | 122.96 | 1537 | 0.7708 | 0.785 | 0.3267 | 1.3826 | 0.785 | 0.7554 | 0.2057 | 0.0992 |
| 0.0842 | 124.0 | 1550 | 0.7698 | 0.785 | 0.3262 | 1.3777 | 0.785 | 0.7554 | 0.2021 | 0.0996 |
| 0.0842 | 124.96 | 1562 | 0.7706 | 0.785 | 0.3267 | 1.3825 | 0.785 | 0.7554 | 0.1949 | 0.0994 |
| 0.0842 | 126.0 | 1575 | 0.7751 | 0.79 | 0.3290 | 1.3827 | 0.79 | 0.7639 | 0.1945 | 0.1002 |
| 0.0842 | 126.96 | 1587 | 0.7734 | 0.79 | 0.3282 | 1.3855 | 0.79 | 0.7639 | 0.1882 | 0.0998 |
| 0.0842 | 128.0 | 1600 | 0.7753 | 0.785 | 0.3290 | 1.3842 | 0.785 | 0.7554 | 0.1926 | 0.1006 |
| 0.0842 | 128.96 | 1612 | 0.7731 | 0.78 | 0.3278 | 1.3808 | 0.78 | 0.7488 | 0.2115 | 0.0994 |
| 0.0842 | 130.0 | 1625 | 0.7723 | 0.78 | 0.3276 | 1.3830 | 0.78 | 0.7486 | 0.2005 | 0.0994 |
| 0.0842 | 130.96 | 1637 | 0.7746 | 0.78 | 0.3287 | 1.3810 | 0.78 | 0.7488 | 0.2103 | 0.0999 |
| 0.0842 | 132.0 | 1650 | 0.7758 | 0.78 | 0.3291 | 1.3848 | 0.78 | 0.7488 | 0.2138 | 0.1004 |
| 0.0842 | 132.96 | 1662 | 0.7771 | 0.78 | 0.3299 | 1.3833 | 0.78 | 0.7488 | 0.2080 | 0.1007 |
| 0.0842 | 134.0 | 1675 | 0.7757 | 0.78 | 0.3295 | 1.3837 | 0.78 | 0.7488 | 0.2060 | 0.0999 |
| 0.0842 | 134.96 | 1687 | 0.7754 | 0.78 | 0.3292 | 1.3836 | 0.78 | 0.7488 | 0.2070 | 0.1001 |
| 0.0842 | 136.0 | 1700 | 0.7755 | 0.78 | 0.3294 | 1.3834 | 0.78 | 0.7488 | 0.2057 | 0.0999 |
| 0.0842 | 136.96 | 1712 | 0.7755 | 0.78 | 0.3295 | 1.3874 | 0.78 | 0.7488 | 0.1999 | 0.0998 |
| 0.0842 | 138.0 | 1725 | 0.7759 | 0.78 | 0.3296 | 1.4380 | 0.78 | 0.7488 | 0.2007 | 0.1002 |
| 0.0842 | 138.96 | 1737 | 0.7776 | 0.78 | 0.3303 | 1.3868 | 0.78 | 0.7488 | 0.2091 | 0.1002 |
| 0.0842 | 140.0 | 1750 | 0.7780 | 0.78 | 0.3304 | 1.3868 | 0.78 | 0.7488 | 0.2086 | 0.1005 |
| 0.0842 | 140.96 | 1762 | 0.7780 | 0.78 | 0.3306 | 1.3855 | 0.78 | 0.7488 | 0.2065 | 0.1001 |
| 0.0842 | 142.0 | 1775 | 0.7789 | 0.78 | 0.3311 | 1.3855 | 0.78 | 0.7488 | 0.2067 | 0.1003 |
| 0.0842 | 142.96 | 1787 | 0.7798 | 0.78 | 0.3314 | 1.3856 | 0.78 | 0.7488 | 0.2083 | 0.1010 |
| 0.0842 | 144.0 | 1800 | 0.7799 | 0.78 | 0.3315 | 1.3914 | 0.78 | 0.7488 | 0.2161 | 0.1004 |
| 0.0842 | 144.96 | 1812 | 0.7806 | 0.78 | 0.3317 | 1.3857 | 0.78 | 0.7488 | 0.2024 | 0.1008 |
| 0.0842 | 146.0 | 1825 | 0.7817 | 0.78 | 0.3322 | 1.3947 | 0.78 | 0.7488 | 0.2043 | 0.1009 |
| 0.0842 | 146.96 | 1837 | 0.7815 | 0.78 | 0.3324 | 1.3898 | 0.78 | 0.7488 | 0.2118 | 0.1006 |
| 0.0842 | 148.0 | 1850 | 0.7820 | 0.78 | 0.3326 | 1.3874 | 0.78 | 0.7488 | 0.2114 | 0.1008 |
| 0.0842 | 148.96 | 1862 | 0.7821 | 0.78 | 0.3327 | 1.4391 | 0.78 | 0.7488 | 0.2087 | 0.1006 |
| 0.0842 | 150.0 | 1875 | 0.7816 | 0.78 | 0.3324 | 1.4410 | 0.78 | 0.7488 | 0.2160 | 0.1006 |
| 0.0842 | 150.96 | 1887 | 0.7826 | 0.78 | 0.3328 | 1.3927 | 0.78 | 0.7488 | 0.2011 | 0.1007 |
| 0.0842 | 152.0 | 1900 | 0.7834 | 0.78 | 0.3332 | 1.4411 | 0.78 | 0.7488 | 0.1994 | 0.1009 |
| 0.0842 | 152.96 | 1912 | 0.7830 | 0.78 | 0.3331 | 1.4409 | 0.78 | 0.7488 | 0.1967 | 0.1008 |
| 0.0842 | 154.0 | 1925 | 0.7825 | 0.78 | 0.3329 | 1.4412 | 0.78 | 0.7488 | 0.2070 | 0.1008 |
| 0.0842 | 154.96 | 1937 | 0.7827 | 0.78 | 0.3332 | 1.4399 | 0.78 | 0.7488 | 0.2097 | 0.1007 |
| 0.0842 | 156.0 | 1950 | 0.7822 | 0.78 | 0.3329 | 1.4391 | 0.78 | 0.7488 | 0.1986 | 0.1006 |
| 0.0842 | 156.96 | 1962 | 0.7838 | 0.78 | 0.3335 | 1.4414 | 0.78 | 0.7488 | 0.1967 | 0.1007 |
| 0.0842 | 158.0 | 1975 | 0.7845 | 0.78 | 0.3337 | 1.3899 | 0.78 | 0.7488 | 0.2087 | 0.1008 |
| 0.0842 | 158.96 | 1987 | 0.7846 | 0.78 | 0.3340 | 1.4419 | 0.78 | 0.7488 | 0.2002 | 0.1007 |
| 0.0731 | 160.0 | 2000 | 0.7853 | 0.78 | 0.3343 | 1.4419 | 0.78 | 0.7488 | 0.1967 | 0.1010 |
| 0.0731 | 160.96 | 2012 | 0.7853 | 0.78 | 0.3343 | 1.4425 | 0.78 | 0.7488 | 0.1963 | 0.1012 |
| 0.0731 | 162.0 | 2025 | 0.7846 | 0.78 | 0.3341 | 1.4411 | 0.78 | 0.7488 | 0.1998 | 0.1008 |
| 0.0731 | 162.96 | 2037 | 0.7856 | 0.78 | 0.3345 | 1.4419 | 0.78 | 0.7488 | 0.2000 | 0.1011 |
| 0.0731 | 164.0 | 2050 | 0.7863 | 0.78 | 0.3348 | 1.4425 | 0.78 | 0.7488 | 0.1991 | 0.1014 |
| 0.0731 | 164.96 | 2062 | 0.7863 | 0.78 | 0.3349 | 1.4414 | 0.78 | 0.7488 | 0.1969 | 0.1013 |
| 0.0731 | 166.0 | 2075 | 0.7865 | 0.78 | 0.3349 | 1.4425 | 0.78 | 0.7488 | 0.1994 | 0.1015 |
| 0.0731 | 166.96 | 2087 | 0.7863 | 0.78 | 0.3349 | 1.4412 | 0.78 | 0.7488 | 0.1966 | 0.1015 |
| 0.0731 | 168.0 | 2100 | 0.7870 | 0.78 | 0.3352 | 1.4431 | 0.78 | 0.7488 | 0.1944 | 0.1016 |
| 0.0731 | 168.96 | 2112 | 0.7868 | 0.78 | 0.3351 | 1.4425 | 0.78 | 0.7488 | 0.1919 | 0.1017 |
| 0.0731 | 170.0 | 2125 | 0.7873 | 0.78 | 0.3354 | 1.4424 | 0.78 | 0.7488 | 0.1965 | 0.1017 |
| 0.0731 | 170.96 | 2137 | 0.7870 | 0.78 | 0.3352 | 1.4418 | 0.78 | 0.7488 | 0.2081 | 0.1014 |
| 0.0731 | 172.0 | 2150 | 0.7879 | 0.78 | 0.3356 | 1.4431 | 0.78 | 0.7488 | 0.1922 | 0.1017 |
| 0.0731 | 172.96 | 2162 | 0.7874 | 0.78 | 0.3355 | 1.4421 | 0.78 | 0.7488 | 0.1967 | 0.1015 |
| 0.0731 | 174.0 | 2175 | 0.7872 | 0.78 | 0.3355 | 1.4418 | 0.78 | 0.7488 | 0.1965 | 0.1016 |
| 0.0731 | 174.96 | 2187 | 0.7882 | 0.78 | 0.3358 | 1.4433 | 0.78 | 0.7488 | 0.1945 | 0.1017 |
| 0.0731 | 176.0 | 2200 | 0.7882 | 0.78 | 0.3358 | 1.4423 | 0.78 | 0.7488 | 0.1969 | 0.1016 |
| 0.0731 | 176.96 | 2212 | 0.7885 | 0.78 | 0.3361 | 1.4431 | 0.78 | 0.7488 | 0.1948 | 0.1017 |
| 0.0731 | 178.0 | 2225 | 0.7883 | 0.78 | 0.3359 | 1.4428 | 0.78 | 0.7488 | 0.1946 | 0.1017 |
| 0.0731 | 178.96 | 2237 | 0.7882 | 0.78 | 0.3359 | 1.4426 | 0.78 | 0.7488 | 0.1920 | 0.1017 |
| 0.0731 | 180.0 | 2250 | 0.7884 | 0.78 | 0.3360 | 1.4425 | 0.78 | 0.7488 | 0.2000 | 0.1016 |
| 0.0731 | 180.96 | 2262 | 0.7886 | 0.78 | 0.3361 | 1.4431 | 0.78 | 0.7488 | 0.1946 | 0.1017 |
| 0.0731 | 182.0 | 2275 | 0.7888 | 0.78 | 0.3362 | 1.4428 | 0.78 | 0.7488 | 0.1977 | 0.1016 |
| 0.0731 | 182.96 | 2287 | 0.7889 | 0.78 | 0.3362 | 1.4428 | 0.78 | 0.7488 | 0.1922 | 0.1017 |
| 0.0731 | 184.0 | 2300 | 0.7889 | 0.78 | 0.3362 | 1.4431 | 0.78 | 0.7488 | 0.1946 | 0.1017 |
| 0.0731 | 184.96 | 2312 | 0.7889 | 0.78 | 0.3362 | 1.4427 | 0.78 | 0.7488 | 0.1946 | 0.1017 |
| 0.0731 | 186.0 | 2325 | 0.7889 | 0.78 | 0.3362 | 1.4431 | 0.78 | 0.7488 | 0.1922 | 0.1018 |
| 0.0731 | 186.96 | 2337 | 0.7892 | 0.78 | 0.3364 | 1.4432 | 0.78 | 0.7488 | 0.1921 | 0.1017 |
| 0.0731 | 188.0 | 2350 | 0.7890 | 0.78 | 0.3363 | 1.4427 | 0.78 | 0.7488 | 0.1921 | 0.1017 |
| 0.0731 | 188.96 | 2362 | 0.7892 | 0.78 | 0.3364 | 1.4429 | 0.78 | 0.7488 | 0.1922 | 0.1017 |
| 0.0731 | 190.0 | 2375 | 0.7893 | 0.78 | 0.3364 | 1.4433 | 0.78 | 0.7488 | 0.1945 | 0.1017 |
| 0.0731 | 190.96 | 2387 | 0.7893 | 0.78 | 0.3364 | 1.4430 | 0.78 | 0.7488 | 0.1921 | 0.1017 |
| 0.0731 | 192.0 | 2400 | 0.7893 | 0.78 | 0.3364 | 1.4430 | 0.78 | 0.7488 | 0.1922 | 0.1018 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
ayanban011/vit-base_tobacco_wr_0.01_wd_0.2
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base_tobacco_wr_0.01_wd_0.2
This model is a fine-tuned version of [jordyvl/vit-base_tobacco](https://huggingface.co/jordyvl/vit-base_tobacco) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9831
- Accuracy: 0.755
- Brier Loss: 0.3936
- Nll: 1.4572
- F1 Micro: 0.755
- F1 Macro: 0.7245
- Ece: 0.2101
- Aurc: 0.1059
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a scheduler sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 200
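With `warmup_ratio: 0.01`, roughly the first 1% of optimizer updates ramp the learning rate from 0 up to 2e-5 before it decays linearly back to 0. A sketch with the `transformers` scheduler utility; the step count is read off the table below (~12.5 updates/epoch x 200 epochs), and `weight_decay=0.2` is inferred from the `_wd_0.2` model name rather than listed above:
```python
import torch
from transformers import AutoModelForImageClassification, get_linear_schedule_with_warmup

model = AutoModelForImageClassification.from_pretrained("jordyvl/vit-base_tobacco")
total_steps = 2500                       # ~12.5 updates/epoch x 200 epochs
warmup_steps = int(0.01 * total_steps)   # warmup_ratio 0.01 -> 25 warmup steps

optimizer = torch.optim.Adam(model.parameters(), lr=2e-5,
                             betas=(0.9, 0.999), eps=1e-8,
                             weight_decay=0.2)  # assumed from the model name
scheduler = get_linear_schedule_with_warmup(optimizer, warmup_steps, total_steps)
```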
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:------:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 12 | 0.7384 | 0.815 | 0.3056 | 1.1863 | 0.815 | 0.8021 | 0.2211 | 0.0789 |
| No log | 2.0 | 25 | 0.7504 | 0.795 | 0.3109 | 1.2594 | 0.795 | 0.7691 | 0.2235 | 0.0789 |
| No log | 2.96 | 37 | 0.7415 | 0.765 | 0.3109 | 1.0445 | 0.765 | 0.7422 | 0.2323 | 0.0762 |
| No log | 4.0 | 50 | 0.7222 | 0.805 | 0.2989 | 1.3200 | 0.805 | 0.7951 | 0.2028 | 0.0682 |
| No log | 4.96 | 62 | 0.7160 | 0.815 | 0.2989 | 1.2307 | 0.815 | 0.8067 | 0.2381 | 0.0688 |
| No log | 6.0 | 75 | 0.7149 | 0.825 | 0.2950 | 1.3097 | 0.825 | 0.8127 | 0.2110 | 0.0809 |
| No log | 6.96 | 87 | 0.7157 | 0.815 | 0.2986 | 1.2310 | 0.815 | 0.8007 | 0.2195 | 0.0779 |
| No log | 8.0 | 100 | 0.7245 | 0.81 | 0.3020 | 1.3030 | 0.81 | 0.7949 | 0.1997 | 0.0834 |
| No log | 8.96 | 112 | 0.7111 | 0.815 | 0.2972 | 1.2310 | 0.815 | 0.8009 | 0.2090 | 0.0793 |
| No log | 10.0 | 125 | 0.7344 | 0.81 | 0.3087 | 1.2974 | 0.81 | 0.7956 | 0.2240 | 0.0913 |
| No log | 10.96 | 137 | 0.7264 | 0.81 | 0.3057 | 1.3216 | 0.81 | 0.7907 | 0.2074 | 0.0879 |
| No log | 12.0 | 150 | 0.7301 | 0.805 | 0.3056 | 1.2448 | 0.805 | 0.7919 | 0.2112 | 0.0932 |
| No log | 12.96 | 162 | 0.7224 | 0.805 | 0.3020 | 1.2950 | 0.805 | 0.7899 | 0.1930 | 0.0915 |
| No log | 14.0 | 175 | 0.7235 | 0.81 | 0.3045 | 1.2965 | 0.81 | 0.7956 | 0.2025 | 0.0902 |
| No log | 14.96 | 187 | 0.7239 | 0.81 | 0.3039 | 1.2970 | 0.81 | 0.7956 | 0.2156 | 0.0906 |
| No log | 16.0 | 200 | 0.7293 | 0.81 | 0.3060 | 1.4336 | 0.81 | 0.7924 | 0.2029 | 0.0955 |
| No log | 16.96 | 212 | 0.7253 | 0.805 | 0.3044 | 1.2984 | 0.805 | 0.7853 | 0.1946 | 0.0911 |
| No log | 18.0 | 225 | 0.7293 | 0.805 | 0.3076 | 1.1852 | 0.805 | 0.7861 | 0.2015 | 0.0916 |
| No log | 18.96 | 237 | 0.7328 | 0.8 | 0.3050 | 1.3844 | 0.8000 | 0.7836 | 0.1900 | 0.0986 |
| No log | 20.0 | 250 | 0.7263 | 0.8 | 0.3058 | 1.2943 | 0.8000 | 0.7754 | 0.2003 | 0.0900 |
| No log | 20.96 | 262 | 0.7370 | 0.805 | 0.3102 | 1.3612 | 0.805 | 0.7818 | 0.1998 | 0.0939 |
| No log | 22.0 | 275 | 0.7412 | 0.795 | 0.3118 | 1.3673 | 0.795 | 0.7714 | 0.1952 | 0.0957 |
| No log | 22.96 | 287 | 0.7326 | 0.795 | 0.3077 | 1.3572 | 0.795 | 0.7666 | 0.1935 | 0.0925 |
| No log | 24.0 | 300 | 0.7308 | 0.805 | 0.3076 | 1.4265 | 0.805 | 0.7818 | 0.1968 | 0.0906 |
| No log | 24.96 | 312 | 0.7424 | 0.8 | 0.3122 | 1.3666 | 0.8000 | 0.7774 | 0.1853 | 0.0960 |
| No log | 26.0 | 325 | 0.7383 | 0.8 | 0.3097 | 1.3644 | 0.8000 | 0.7774 | 0.1911 | 0.0952 |
| No log | 26.96 | 337 | 0.7468 | 0.805 | 0.3143 | 1.4295 | 0.805 | 0.7829 | 0.1952 | 0.0971 |
| No log | 28.0 | 350 | 0.7484 | 0.795 | 0.3135 | 1.4327 | 0.795 | 0.7668 | 0.1906 | 0.0989 |
| No log | 28.96 | 362 | 0.7459 | 0.8 | 0.3135 | 1.3624 | 0.8000 | 0.7704 | 0.1945 | 0.0944 |
| No log | 30.0 | 375 | 0.7513 | 0.8 | 0.3160 | 1.4282 | 0.8000 | 0.7698 | 0.1991 | 0.0953 |
| No log | 30.96 | 387 | 0.7535 | 0.795 | 0.3168 | 1.4336 | 0.795 | 0.7656 | 0.1799 | 0.0980 |
| No log | 32.0 | 400 | 0.7540 | 0.8 | 0.3171 | 1.4469 | 0.8000 | 0.7774 | 0.1943 | 0.0959 |
| No log | 32.96 | 412 | 0.7566 | 0.8 | 0.3180 | 1.3772 | 0.8000 | 0.7704 | 0.1809 | 0.0972 |
| No log | 34.0 | 425 | 0.7641 | 0.795 | 0.3220 | 1.3878 | 0.795 | 0.7691 | 0.1937 | 0.1021 |
| No log | 34.96 | 437 | 0.7602 | 0.795 | 0.3192 | 1.3748 | 0.795 | 0.7656 | 0.1908 | 0.0973 |
| No log | 36.0 | 450 | 0.7577 | 0.8 | 0.3186 | 1.4397 | 0.8000 | 0.7704 | 0.1858 | 0.0951 |
| No log | 36.96 | 462 | 0.7724 | 0.795 | 0.3233 | 1.3800 | 0.795 | 0.7656 | 0.1796 | 0.1002 |
| No log | 38.0 | 475 | 0.7675 | 0.795 | 0.3223 | 1.3659 | 0.795 | 0.7654 | 0.1862 | 0.0976 |
| No log | 38.96 | 487 | 0.7772 | 0.79 | 0.3261 | 1.4406 | 0.79 | 0.7633 | 0.1877 | 0.1062 |
| 0.0949 | 40.0 | 500 | 0.7631 | 0.795 | 0.3211 | 1.4302 | 0.795 | 0.7654 | 0.1896 | 0.0953 |
| 0.0949 | 40.96 | 512 | 0.7878 | 0.79 | 0.3304 | 1.5689 | 0.79 | 0.7631 | 0.1770 | 0.1014 |
| 0.0949 | 42.0 | 525 | 0.7639 | 0.815 | 0.3198 | 1.5524 | 0.815 | 0.7879 | 0.1890 | 0.0938 |
| 0.0949 | 42.96 | 537 | 0.7967 | 0.785 | 0.3350 | 1.4516 | 0.785 | 0.7606 | 0.1809 | 0.1040 |
| 0.0949 | 44.0 | 550 | 0.7735 | 0.81 | 0.3237 | 1.4915 | 0.81 | 0.7784 | 0.2079 | 0.0963 |
| 0.0949 | 44.96 | 562 | 0.7859 | 0.79 | 0.3300 | 1.5076 | 0.79 | 0.7594 | 0.1851 | 0.0991 |
| 0.0949 | 46.0 | 575 | 0.7917 | 0.79 | 0.3323 | 1.5062 | 0.79 | 0.7631 | 0.1944 | 0.1015 |
| 0.0949 | 46.96 | 587 | 0.7879 | 0.795 | 0.3310 | 1.4304 | 0.795 | 0.7656 | 0.1761 | 0.0987 |
| 0.0949 | 48.0 | 600 | 0.7917 | 0.79 | 0.3320 | 1.4382 | 0.79 | 0.7629 | 0.1828 | 0.0994 |
| 0.0949 | 48.96 | 612 | 0.7950 | 0.79 | 0.3333 | 1.4929 | 0.79 | 0.7549 | 0.1811 | 0.0995 |
| 0.0949 | 50.0 | 625 | 0.8005 | 0.79 | 0.3363 | 1.4351 | 0.79 | 0.7631 | 0.1953 | 0.1009 |
| 0.0949 | 50.96 | 637 | 0.8019 | 0.79 | 0.3363 | 1.4891 | 0.79 | 0.7631 | 0.1837 | 0.0998 |
| 0.0949 | 52.0 | 650 | 0.8021 | 0.79 | 0.3354 | 1.4899 | 0.79 | 0.7586 | 0.1990 | 0.0994 |
| 0.0949 | 52.96 | 662 | 0.8012 | 0.795 | 0.3346 | 1.4913 | 0.795 | 0.7648 | 0.1878 | 0.0984 |
| 0.0949 | 54.0 | 675 | 0.8078 | 0.785 | 0.3388 | 1.4920 | 0.785 | 0.7559 | 0.1938 | 0.1000 |
| 0.0949 | 54.96 | 687 | 0.8092 | 0.785 | 0.3391 | 1.5003 | 0.785 | 0.7561 | 0.1878 | 0.1004 |
| 0.0949 | 56.0 | 700 | 0.8095 | 0.795 | 0.3378 | 1.4914 | 0.795 | 0.7634 | 0.1879 | 0.0994 |
| 0.0949 | 56.96 | 712 | 0.8124 | 0.785 | 0.3396 | 1.4926 | 0.785 | 0.7555 | 0.1996 | 0.1005 |
| 0.0949 | 58.0 | 725 | 0.8120 | 0.78 | 0.3391 | 1.4405 | 0.78 | 0.7469 | 0.1924 | 0.1005 |
| 0.0949 | 58.96 | 737 | 0.8185 | 0.785 | 0.3406 | 1.5007 | 0.785 | 0.7523 | 0.1910 | 0.1013 |
| 0.0949 | 60.0 | 750 | 0.8182 | 0.785 | 0.3421 | 1.4371 | 0.785 | 0.7555 | 0.1727 | 0.1010 |
| 0.0949 | 60.96 | 762 | 0.8224 | 0.78 | 0.3443 | 1.4404 | 0.78 | 0.7475 | 0.1941 | 0.1019 |
| 0.0949 | 62.0 | 775 | 0.8267 | 0.78 | 0.3463 | 1.4995 | 0.78 | 0.7535 | 0.1927 | 0.1016 |
| 0.0949 | 62.96 | 787 | 0.8252 | 0.775 | 0.3447 | 1.4965 | 0.775 | 0.7465 | 0.1798 | 0.1016 |
| 0.0949 | 64.0 | 800 | 0.8286 | 0.78 | 0.3446 | 1.4987 | 0.78 | 0.7475 | 0.1911 | 0.1014 |
| 0.0949 | 64.96 | 812 | 0.8308 | 0.78 | 0.3467 | 1.4438 | 0.78 | 0.7469 | 0.1866 | 0.1019 |
| 0.0949 | 66.0 | 825 | 0.8346 | 0.775 | 0.3476 | 1.4592 | 0.775 | 0.7444 | 0.1878 | 0.1034 |
| 0.0949 | 66.96 | 837 | 0.8348 | 0.78 | 0.3484 | 1.4423 | 0.78 | 0.7515 | 0.1821 | 0.1021 |
| 0.0949 | 68.0 | 850 | 0.8376 | 0.78 | 0.3481 | 1.4970 | 0.78 | 0.7469 | 0.1922 | 0.1015 |
| 0.0949 | 68.96 | 862 | 0.8379 | 0.78 | 0.3497 | 1.4388 | 0.78 | 0.7515 | 0.1850 | 0.1018 |
| 0.0949 | 70.0 | 875 | 0.8452 | 0.78 | 0.3512 | 1.4974 | 0.78 | 0.7525 | 0.1828 | 0.1036 |
| 0.0949 | 70.96 | 887 | 0.8437 | 0.78 | 0.3511 | 1.4519 | 0.78 | 0.7469 | 0.2076 | 0.1032 |
| 0.0949 | 72.0 | 900 | 0.8485 | 0.775 | 0.3535 | 1.5163 | 0.775 | 0.7444 | 0.1863 | 0.1033 |
| 0.0949 | 72.96 | 912 | 0.8516 | 0.77 | 0.3555 | 1.5015 | 0.7700 | 0.7404 | 0.1868 | 0.1030 |
| 0.0949 | 74.0 | 925 | 0.8507 | 0.77 | 0.3541 | 1.4407 | 0.7700 | 0.7404 | 0.1996 | 0.1026 |
| 0.0949 | 74.96 | 937 | 0.8508 | 0.77 | 0.3540 | 1.4424 | 0.7700 | 0.7400 | 0.1996 | 0.1025 |
| 0.0949 | 76.0 | 950 | 0.8559 | 0.77 | 0.3558 | 1.4487 | 0.7700 | 0.7400 | 0.2001 | 0.1031 |
| 0.0949 | 76.96 | 962 | 0.8564 | 0.77 | 0.3568 | 1.4389 | 0.7700 | 0.7400 | 0.1804 | 0.1024 |
| 0.0949 | 78.0 | 975 | 0.8611 | 0.775 | 0.3569 | 1.4966 | 0.775 | 0.7440 | 0.1831 | 0.1031 |
| 0.0949 | 78.96 | 987 | 0.8573 | 0.77 | 0.3557 | 1.4946 | 0.7700 | 0.7400 | 0.1933 | 0.1013 |
| 0.0366 | 80.0 | 1000 | 0.8640 | 0.775 | 0.3587 | 1.4998 | 0.775 | 0.7440 | 0.1871 | 0.1032 |
| 0.0366 | 80.96 | 1012 | 0.8640 | 0.77 | 0.3585 | 1.4983 | 0.7700 | 0.7400 | 0.1921 | 0.1032 |
| 0.0366 | 82.0 | 1025 | 0.8696 | 0.77 | 0.3608 | 1.5032 | 0.7700 | 0.7400 | 0.2033 | 0.1036 |
| 0.0366 | 82.96 | 1037 | 0.8702 | 0.77 | 0.3613 | 1.4987 | 0.7700 | 0.7400 | 0.2022 | 0.1029 |
| 0.0366 | 84.0 | 1050 | 0.8686 | 0.77 | 0.3597 | 1.4446 | 0.7700 | 0.7400 | 0.1887 | 0.1028 |
| 0.0366 | 84.96 | 1062 | 0.8700 | 0.77 | 0.3607 | 1.4365 | 0.7700 | 0.7400 | 0.1900 | 0.1025 |
| 0.0366 | 86.0 | 1075 | 0.8756 | 0.765 | 0.3621 | 1.5009 | 0.765 | 0.7308 | 0.1983 | 0.1040 |
| 0.0366 | 86.96 | 1087 | 0.8768 | 0.76 | 0.3623 | 1.5035 | 0.76 | 0.7282 | 0.1956 | 0.1040 |
| 0.0366 | 88.0 | 1100 | 0.8762 | 0.765 | 0.3618 | 1.4409 | 0.765 | 0.7308 | 0.1957 | 0.1033 |
| 0.0366 | 88.96 | 1112 | 0.8777 | 0.765 | 0.3629 | 1.4427 | 0.765 | 0.7308 | 0.1974 | 0.1036 |
| 0.0366 | 90.0 | 1125 | 0.8854 | 0.76 | 0.3661 | 1.4681 | 0.76 | 0.7282 | 0.2023 | 0.1049 |
| 0.0366 | 90.96 | 1137 | 0.8867 | 0.76 | 0.3660 | 1.5057 | 0.76 | 0.7282 | 0.1995 | 0.1049 |
| 0.0366 | 92.0 | 1150 | 0.8849 | 0.765 | 0.3648 | 1.5011 | 0.765 | 0.7308 | 0.1953 | 0.1039 |
| 0.0366 | 92.96 | 1162 | 0.8898 | 0.76 | 0.3670 | 1.5077 | 0.76 | 0.7282 | 0.2083 | 0.1045 |
| 0.0366 | 94.0 | 1175 | 0.8891 | 0.765 | 0.3662 | 1.4520 | 0.765 | 0.7308 | 0.2091 | 0.1040 |
| 0.0366 | 94.96 | 1187 | 0.8910 | 0.755 | 0.3679 | 1.4460 | 0.755 | 0.7247 | 0.2039 | 0.1043 |
| 0.0366 | 96.0 | 1200 | 0.8935 | 0.76 | 0.3684 | 1.4435 | 0.76 | 0.7273 | 0.1929 | 0.1041 |
| 0.0366 | 96.96 | 1212 | 0.8964 | 0.755 | 0.3689 | 1.4526 | 0.755 | 0.7247 | 0.1980 | 0.1048 |
| 0.0366 | 98.0 | 1225 | 0.8979 | 0.755 | 0.3701 | 1.4507 | 0.755 | 0.7247 | 0.2017 | 0.1045 |
| 0.0366 | 98.96 | 1237 | 0.8965 | 0.755 | 0.3692 | 1.4474 | 0.755 | 0.7247 | 0.2057 | 0.1044 |
| 0.0366 | 100.0 | 1250 | 0.9019 | 0.755 | 0.3716 | 1.4526 | 0.755 | 0.7247 | 0.2051 | 0.1047 |
| 0.0366 | 100.96 | 1262 | 0.8994 | 0.755 | 0.3694 | 1.4485 | 0.755 | 0.7247 | 0.1979 | 0.1041 |
| 0.0366 | 102.0 | 1275 | 0.9023 | 0.755 | 0.3715 | 1.4465 | 0.755 | 0.7247 | 0.2140 | 0.1044 |
| 0.0366 | 102.96 | 1287 | 0.9048 | 0.755 | 0.3720 | 1.4472 | 0.755 | 0.7247 | 0.2066 | 0.1045 |
| 0.0366 | 104.0 | 1300 | 0.9060 | 0.755 | 0.3719 | 1.4565 | 0.755 | 0.7247 | 0.2003 | 0.1044 |
| 0.0366 | 104.96 | 1312 | 0.9105 | 0.755 | 0.3735 | 1.4625 | 0.755 | 0.7247 | 0.2094 | 0.1053 |
| 0.0366 | 106.0 | 1325 | 0.9099 | 0.76 | 0.3738 | 1.4463 | 0.76 | 0.7273 | 0.2050 | 0.1044 |
| 0.0366 | 106.96 | 1337 | 0.9111 | 0.755 | 0.3751 | 1.4486 | 0.755 | 0.7247 | 0.2173 | 0.1041 |
| 0.0366 | 108.0 | 1350 | 0.9149 | 0.755 | 0.3745 | 1.5081 | 0.755 | 0.7247 | 0.2062 | 0.1052 |
| 0.0366 | 108.96 | 1362 | 0.9146 | 0.755 | 0.3744 | 1.4513 | 0.755 | 0.7247 | 0.2073 | 0.1047 |
| 0.0366 | 110.0 | 1375 | 0.9157 | 0.755 | 0.3764 | 1.4486 | 0.755 | 0.7247 | 0.2094 | 0.1046 |
| 0.0366 | 110.96 | 1387 | 0.9201 | 0.755 | 0.3767 | 1.4529 | 0.755 | 0.7247 | 0.2093 | 0.1051 |
| 0.0366 | 112.0 | 1400 | 0.9190 | 0.755 | 0.3759 | 1.4533 | 0.755 | 0.7247 | 0.2020 | 0.1049 |
| 0.0366 | 112.96 | 1412 | 0.9230 | 0.755 | 0.3782 | 1.4529 | 0.755 | 0.7247 | 0.2054 | 0.1050 |
| 0.0366 | 114.0 | 1425 | 0.9234 | 0.755 | 0.3778 | 1.4505 | 0.755 | 0.7247 | 0.2009 | 0.1049 |
| 0.0366 | 114.96 | 1437 | 0.9238 | 0.755 | 0.3780 | 1.4469 | 0.755 | 0.7247 | 0.1974 | 0.1051 |
| 0.0366 | 116.0 | 1450 | 0.9264 | 0.755 | 0.3786 | 1.4538 | 0.755 | 0.7247 | 0.2016 | 0.1051 |
| 0.0366 | 116.96 | 1462 | 0.9275 | 0.755 | 0.3787 | 1.4553 | 0.755 | 0.7247 | 0.2012 | 0.1054 |
| 0.0366 | 118.0 | 1475 | 0.9306 | 0.755 | 0.3798 | 1.4595 | 0.755 | 0.7247 | 0.2139 | 0.1057 |
| 0.0366 | 118.96 | 1487 | 0.9291 | 0.76 | 0.3789 | 1.4529 | 0.76 | 0.7273 | 0.2014 | 0.1054 |
| 0.0214 | 120.0 | 1500 | 0.9318 | 0.755 | 0.3804 | 1.4520 | 0.755 | 0.7247 | 0.2017 | 0.1050 |
| 0.0214 | 120.96 | 1512 | 0.9325 | 0.755 | 0.3796 | 1.4535 | 0.755 | 0.7247 | 0.2087 | 0.1053 |
| 0.0214 | 122.0 | 1525 | 0.9339 | 0.755 | 0.3804 | 1.4534 | 0.755 | 0.7247 | 0.2068 | 0.1056 |
| 0.0214 | 122.96 | 1537 | 0.9342 | 0.755 | 0.3807 | 1.4519 | 0.755 | 0.7247 | 0.1986 | 0.1052 |
| 0.0214 | 124.0 | 1550 | 0.9357 | 0.755 | 0.3808 | 1.4524 | 0.755 | 0.7247 | 0.2058 | 0.1054 |
| 0.0214 | 124.96 | 1562 | 0.9360 | 0.755 | 0.3808 | 1.4514 | 0.755 | 0.7247 | 0.2061 | 0.1052 |
| 0.0214 | 126.0 | 1575 | 0.9409 | 0.755 | 0.3828 | 1.4557 | 0.755 | 0.7247 | 0.2044 | 0.1058 |
| 0.0214 | 126.96 | 1587 | 0.9390 | 0.755 | 0.3819 | 1.4523 | 0.755 | 0.7247 | 0.2062 | 0.1052 |
| 0.0214 | 128.0 | 1600 | 0.9425 | 0.755 | 0.3833 | 1.4559 | 0.755 | 0.7247 | 0.2049 | 0.1058 |
| 0.0214 | 128.96 | 1612 | 0.9421 | 0.755 | 0.3824 | 1.4534 | 0.755 | 0.7247 | 0.1978 | 0.1055 |
| 0.0214 | 130.0 | 1625 | 0.9433 | 0.755 | 0.3831 | 1.4530 | 0.755 | 0.7247 | 0.2069 | 0.1056 |
| 0.0214 | 130.96 | 1637 | 0.9463 | 0.755 | 0.3842 | 1.4535 | 0.755 | 0.7247 | 0.2127 | 0.1058 |
| 0.0214 | 132.0 | 1650 | 0.9462 | 0.755 | 0.3835 | 1.4546 | 0.755 | 0.7247 | 0.1975 | 0.1057 |
| 0.0214 | 132.96 | 1662 | 0.9479 | 0.755 | 0.3842 | 1.4562 | 0.755 | 0.7247 | 0.2012 | 0.1059 |
| 0.0214 | 134.0 | 1675 | 0.9493 | 0.755 | 0.3852 | 1.4547 | 0.755 | 0.7247 | 0.2161 | 0.1057 |
| 0.0214 | 134.96 | 1687 | 0.9484 | 0.755 | 0.3842 | 1.4563 | 0.755 | 0.7247 | 0.2046 | 0.1055 |
| 0.0214 | 136.0 | 1700 | 0.9500 | 0.755 | 0.3849 | 1.4528 | 0.755 | 0.7247 | 0.2081 | 0.1058 |
| 0.0214 | 136.96 | 1712 | 0.9510 | 0.755 | 0.3854 | 1.4510 | 0.755 | 0.7247 | 0.1976 | 0.1053 |
| 0.0214 | 138.0 | 1725 | 0.9519 | 0.755 | 0.3855 | 1.4516 | 0.755 | 0.7247 | 0.2090 | 0.1056 |
| 0.0214 | 138.96 | 1737 | 0.9535 | 0.755 | 0.3857 | 1.4537 | 0.755 | 0.7247 | 0.2091 | 0.1057 |
| 0.0214 | 140.0 | 1750 | 0.9546 | 0.755 | 0.3862 | 1.4557 | 0.755 | 0.7247 | 0.2020 | 0.1057 |
| 0.0214 | 140.96 | 1762 | 0.9558 | 0.755 | 0.3867 | 1.4530 | 0.755 | 0.7247 | 0.2018 | 0.1059 |
| 0.0214 | 142.0 | 1775 | 0.9567 | 0.755 | 0.3870 | 1.4522 | 0.755 | 0.7247 | 0.2059 | 0.1055 |
| 0.0214 | 142.96 | 1787 | 0.9589 | 0.755 | 0.3876 | 1.4555 | 0.755 | 0.7247 | 0.2063 | 0.1060 |
| 0.0214 | 144.0 | 1800 | 0.9584 | 0.755 | 0.3872 | 1.4568 | 0.755 | 0.7245 | 0.2162 | 0.1056 |
| 0.0214 | 144.96 | 1812 | 0.9613 | 0.755 | 0.3881 | 1.4594 | 0.755 | 0.7247 | 0.2085 | 0.1063 |
| 0.0214 | 146.0 | 1825 | 0.9604 | 0.755 | 0.3873 | 1.5132 | 0.755 | 0.7247 | 0.2047 | 0.1056 |
| 0.0214 | 146.96 | 1837 | 0.9627 | 0.755 | 0.3887 | 1.4573 | 0.755 | 0.7247 | 0.2107 | 0.1059 |
| 0.0214 | 148.0 | 1850 | 0.9643 | 0.755 | 0.3890 | 1.4570 | 0.755 | 0.7247 | 0.2041 | 0.1065 |
| 0.0214 | 148.96 | 1862 | 0.9633 | 0.755 | 0.3886 | 1.4526 | 0.755 | 0.7247 | 0.2085 | 0.1055 |
| 0.0214 | 150.0 | 1875 | 0.9637 | 0.755 | 0.3887 | 1.4551 | 0.755 | 0.7245 | 0.2096 | 0.1054 |
| 0.0214 | 150.96 | 1887 | 0.9645 | 0.755 | 0.3886 | 1.4534 | 0.755 | 0.7247 | 0.2100 | 0.1058 |
| 0.0214 | 152.0 | 1900 | 0.9661 | 0.755 | 0.3891 | 1.4550 | 0.755 | 0.7247 | 0.2072 | 0.1060 |
| 0.0214 | 152.96 | 1912 | 0.9665 | 0.755 | 0.3894 | 1.4543 | 0.755 | 0.7247 | 0.2092 | 0.1057 |
| 0.0214 | 154.0 | 1925 | 0.9667 | 0.755 | 0.3894 | 1.4570 | 0.755 | 0.7245 | 0.2019 | 0.1056 |
| 0.0214 | 154.96 | 1937 | 0.9681 | 0.755 | 0.3900 | 1.4537 | 0.755 | 0.7245 | 0.2098 | 0.1059 |
| 0.0214 | 156.0 | 1950 | 0.9689 | 0.755 | 0.3903 | 1.4533 | 0.755 | 0.7245 | 0.2022 | 0.1056 |
| 0.0214 | 156.96 | 1962 | 0.9689 | 0.755 | 0.3898 | 1.4553 | 0.755 | 0.7247 | 0.2074 | 0.1058 |
| 0.0214 | 158.0 | 1975 | 0.9705 | 0.755 | 0.3901 | 1.4566 | 0.755 | 0.7247 | 0.2072 | 0.1062 |
| 0.0214 | 158.96 | 1987 | 0.9721 | 0.755 | 0.3911 | 1.4570 | 0.755 | 0.7245 | 0.2055 | 0.1062 |
| 0.0155 | 160.0 | 2000 | 0.9712 | 0.755 | 0.3904 | 1.4551 | 0.755 | 0.7245 | 0.2070 | 0.1060 |
| 0.0155 | 160.96 | 2012 | 0.9721 | 0.755 | 0.3906 | 1.4563 | 0.755 | 0.7245 | 0.2108 | 0.1059 |
| 0.0155 | 162.0 | 2025 | 0.9734 | 0.755 | 0.3914 | 1.4550 | 0.755 | 0.7245 | 0.2071 | 0.1057 |
| 0.0155 | 162.96 | 2037 | 0.9740 | 0.755 | 0.3913 | 1.4565 | 0.755 | 0.7245 | 0.2073 | 0.1062 |
| 0.0155 | 164.0 | 2050 | 0.9744 | 0.755 | 0.3915 | 1.4563 | 0.755 | 0.7245 | 0.2101 | 0.1059 |
| 0.0155 | 164.96 | 2062 | 0.9752 | 0.755 | 0.3917 | 1.4558 | 0.755 | 0.7245 | 0.2102 | 0.1059 |
| 0.0155 | 166.0 | 2075 | 0.9755 | 0.755 | 0.3916 | 1.4566 | 0.755 | 0.7245 | 0.2102 | 0.1062 |
| 0.0155 | 166.96 | 2087 | 0.9755 | 0.755 | 0.3917 | 1.4542 | 0.755 | 0.7245 | 0.2100 | 0.1057 |
| 0.0155 | 168.0 | 2100 | 0.9768 | 0.755 | 0.3921 | 1.4578 | 0.755 | 0.7245 | 0.2103 | 0.1059 |
| 0.0155 | 168.96 | 2112 | 0.9767 | 0.755 | 0.3919 | 1.4575 | 0.755 | 0.7245 | 0.2106 | 0.1057 |
| 0.0155 | 170.0 | 2125 | 0.9776 | 0.755 | 0.3922 | 1.4567 | 0.755 | 0.7245 | 0.2101 | 0.1058 |
| 0.0155 | 170.96 | 2137 | 0.9783 | 0.755 | 0.3925 | 1.4560 | 0.755 | 0.7245 | 0.2080 | 0.1062 |
| 0.0155 | 172.0 | 2150 | 0.9793 | 0.755 | 0.3927 | 1.4578 | 0.755 | 0.7245 | 0.2102 | 0.1064 |
| 0.0155 | 172.96 | 2162 | 0.9783 | 0.755 | 0.3923 | 1.4557 | 0.755 | 0.7245 | 0.2098 | 0.1058 |
| 0.0155 | 174.0 | 2175 | 0.9790 | 0.755 | 0.3927 | 1.4550 | 0.755 | 0.7245 | 0.2087 | 0.1057 |
| 0.0155 | 174.96 | 2187 | 0.9801 | 0.755 | 0.3929 | 1.4577 | 0.755 | 0.7245 | 0.2100 | 0.1059 |
| 0.0155 | 176.0 | 2200 | 0.9800 | 0.755 | 0.3928 | 1.4566 | 0.755 | 0.7245 | 0.2081 | 0.1058 |
| 0.0155 | 176.96 | 2212 | 0.9809 | 0.755 | 0.3932 | 1.4568 | 0.755 | 0.7245 | 0.2101 | 0.1058 |
| 0.0155 | 178.0 | 2225 | 0.9808 | 0.755 | 0.3930 | 1.4571 | 0.755 | 0.7245 | 0.2101 | 0.1058 |
| 0.0155 | 178.96 | 2237 | 0.9808 | 0.755 | 0.3930 | 1.4566 | 0.755 | 0.7245 | 0.2082 | 0.1057 |
| 0.0155 | 180.0 | 2250 | 0.9813 | 0.755 | 0.3931 | 1.4567 | 0.755 | 0.7245 | 0.2082 | 0.1058 |
| 0.0155 | 180.96 | 2262 | 0.9817 | 0.755 | 0.3933 | 1.4571 | 0.755 | 0.7245 | 0.2032 | 0.1059 |
| 0.0155 | 182.0 | 2275 | 0.9819 | 0.755 | 0.3933 | 1.4570 | 0.755 | 0.7245 | 0.2103 | 0.1058 |
| 0.0155 | 182.96 | 2287 | 0.9822 | 0.755 | 0.3934 | 1.4567 | 0.755 | 0.7245 | 0.2101 | 0.1058 |
| 0.0155 | 184.0 | 2300 | 0.9824 | 0.755 | 0.3934 | 1.4572 | 0.755 | 0.7245 | 0.2101 | 0.1059 |
| 0.0155 | 184.96 | 2312 | 0.9825 | 0.755 | 0.3935 | 1.4569 | 0.755 | 0.7245 | 0.2081 | 0.1060 |
| 0.0155 | 186.0 | 2325 | 0.9825 | 0.755 | 0.3934 | 1.4574 | 0.755 | 0.7245 | 0.2101 | 0.1059 |
| 0.0155 | 186.96 | 2337 | 0.9829 | 0.755 | 0.3935 | 1.4571 | 0.755 | 0.7245 | 0.2101 | 0.1059 |
| 0.0155 | 188.0 | 2350 | 0.9830 | 0.755 | 0.3936 | 1.4569 | 0.755 | 0.7245 | 0.2081 | 0.1060 |
| 0.0155 | 188.96 | 2362 | 0.9831 | 0.755 | 0.3936 | 1.4572 | 0.755 | 0.7245 | 0.2082 | 0.1059 |
| 0.0155 | 190.0 | 2375 | 0.9830 | 0.755 | 0.3935 | 1.4575 | 0.755 | 0.7245 | 0.2101 | 0.1059 |
| 0.0155 | 190.96 | 2387 | 0.9831 | 0.755 | 0.3936 | 1.4572 | 0.755 | 0.7245 | 0.2101 | 0.1059 |
| 0.0155 | 192.0 | 2400 | 0.9831 | 0.755 | 0.3936 | 1.4572 | 0.755 | 0.7245 | 0.2101 | 0.1059 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
ayanban011/vit-base_tobacco_bs_16_lr_2e-4_e_200_wr_0.01_wd_0.2
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base_tobacco_bs_16_lr_2e-4_e_200_wr_0.01_wd_0.2
This model is a fine-tuned version of [jordyvl/vit-base_tobacco](https://huggingface.co/jordyvl/vit-base_tobacco) on an unspecified dataset.
It achieves the following results on the evaluation set (a sketch of the calibration metrics follows the list):
- Loss: 1.0644
- Accuracy: 0.86
- Brier Loss: 0.2705
- Nll: 1.3085
- F1 Micro: 0.8600
- F1 Macro: 0.8552
- Ece: 0.1378
- Aurc: 0.0461
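Brier loss and ECE are reported alongside accuracy but are less familiar; the following is a minimal NumPy sketch of how such calibration metrics can be computed from predicted class probabilities (an illustration of the standard definitions, not the exact evaluation code behind this card):
```python
import numpy as np

def brier_loss(probs, labels):
    """Mean squared error between predicted probabilities and one-hot targets."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=10):
    """Average |accuracy - confidence| gap over equal-width confidence bins."""
    confidences = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(correct[in_bin].mean() - confidences[in_bin].mean())
    return ece
```
Conventions differ slightly between implementations (e.g., whether the per-class Brier term is summed or averaged), so treat these as reference implementations of the usual definitions.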
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an equivalent configuration sketch follows the list):
- learning_rate: 0.0002
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 200
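A rough reconstruction of this setup with the Hugging Face `TrainingArguments` API is sketched below; `weight_decay=0.2` is inferred from the run name (`wd_0.2`) rather than from the list above, and the output directory is illustrative:
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base_tobacco_bs_16_lr_2e-4_e_200_wr_0.01_wd_0.2",
    learning_rate=2e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=16,  # effective train batch size: 4 * 16 = 64
    num_train_epochs=200,
    lr_scheduler_type="linear",
    warmup_ratio=0.01,
    weight_decay=0.2,  # assumed from the "wd_0.2" suffix in the run name
    seed=42,
)
```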
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:------:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 12 | 0.7869 | 0.78 | 0.3223 | 1.3413 | 0.78 | 0.7402 | 0.2269 | 0.0782 |
| No log | 2.0 | 25 | 0.9889 | 0.715 | 0.4300 | 1.9648 | 0.715 | 0.6881 | 0.2669 | 0.1397 |
| No log | 2.96 | 37 | 0.7053 | 0.82 | 0.2995 | 1.2578 | 0.82 | 0.8270 | 0.2253 | 0.0758 |
| No log | 4.0 | 50 | 0.7535 | 0.78 | 0.3225 | 1.3427 | 0.78 | 0.7395 | 0.2159 | 0.0616 |
| No log | 4.96 | 62 | 0.8538 | 0.775 | 0.3634 | 1.5684 | 0.775 | 0.7523 | 0.2181 | 0.1149 |
| No log | 6.0 | 75 | 0.7825 | 0.77 | 0.3557 | 1.3406 | 0.7700 | 0.7663 | 0.2136 | 0.0625 |
| No log | 6.96 | 87 | 1.0777 | 0.67 | 0.4896 | 1.5465 | 0.67 | 0.6728 | 0.2540 | 0.1106 |
| No log | 8.0 | 100 | 1.1030 | 0.73 | 0.4453 | 2.5744 | 0.7300 | 0.6939 | 0.2294 | 0.1423 |
| No log | 8.96 | 112 | 1.0215 | 0.725 | 0.4339 | 2.0485 | 0.7250 | 0.7012 | 0.2348 | 0.1278 |
| No log | 10.0 | 125 | 0.7940 | 0.795 | 0.3378 | 1.3057 | 0.795 | 0.7911 | 0.1828 | 0.0750 |
| No log | 10.96 | 137 | 0.7648 | 0.82 | 0.2963 | 1.3907 | 0.82 | 0.8022 | 0.1597 | 0.0744 |
| No log | 12.0 | 150 | 1.0755 | 0.74 | 0.4383 | 2.1271 | 0.74 | 0.7182 | 0.2281 | 0.0847 |
| No log | 12.96 | 162 | 1.0091 | 0.775 | 0.3856 | 1.7383 | 0.775 | 0.7339 | 0.1969 | 0.1029 |
| No log | 14.0 | 175 | 1.0531 | 0.77 | 0.4027 | 1.5532 | 0.7700 | 0.7592 | 0.2152 | 0.0888 |
| No log | 14.96 | 187 | 1.0221 | 0.77 | 0.4027 | 1.5199 | 0.7700 | 0.7259 | 0.2059 | 0.1031 |
| No log | 16.0 | 200 | 1.1795 | 0.735 | 0.4435 | 2.0739 | 0.735 | 0.7063 | 0.2262 | 0.1305 |
| No log | 16.96 | 212 | 1.1560 | 0.745 | 0.4379 | 2.0155 | 0.745 | 0.7240 | 0.2207 | 0.1273 |
| No log | 18.0 | 225 | 1.0635 | 0.76 | 0.4159 | 1.5491 | 0.76 | 0.7508 | 0.2124 | 0.0879 |
| No log | 18.96 | 237 | 1.2639 | 0.73 | 0.4649 | 1.9828 | 0.7300 | 0.7276 | 0.2298 | 0.1079 |
| No log | 20.0 | 250 | 1.0598 | 0.78 | 0.3866 | 1.5139 | 0.78 | 0.7676 | 0.1885 | 0.0914 |
| No log | 20.96 | 262 | 0.8900 | 0.81 | 0.3241 | 1.8355 | 0.81 | 0.7925 | 0.1691 | 0.0648 |
| No log | 22.0 | 275 | 1.0617 | 0.79 | 0.3788 | 1.8951 | 0.79 | 0.7783 | 0.1893 | 0.0676 |
| No log | 22.96 | 287 | 1.0362 | 0.785 | 0.3646 | 1.9399 | 0.785 | 0.7653 | 0.1914 | 0.0816 |
| No log | 24.0 | 300 | 1.1701 | 0.775 | 0.4060 | 2.1593 | 0.775 | 0.7718 | 0.2114 | 0.0842 |
| No log | 24.96 | 312 | 1.0841 | 0.79 | 0.3799 | 1.8773 | 0.79 | 0.7795 | 0.2016 | 0.0775 |
| No log | 26.0 | 325 | 1.0064 | 0.785 | 0.3650 | 1.7371 | 0.785 | 0.7674 | 0.1915 | 0.0813 |
| No log | 26.96 | 337 | 0.8886 | 0.825 | 0.3114 | 1.4858 | 0.825 | 0.8116 | 0.1609 | 0.0636 |
| No log | 28.0 | 350 | 1.1174 | 0.8 | 0.3751 | 1.9584 | 0.8000 | 0.7869 | 0.1928 | 0.0930 |
| No log | 28.96 | 362 | 1.0922 | 0.8 | 0.3672 | 1.8702 | 0.8000 | 0.7673 | 0.1954 | 0.0771 |
| No log | 30.0 | 375 | 1.0281 | 0.805 | 0.3506 | 1.6105 | 0.805 | 0.7809 | 0.1773 | 0.0936 |
| No log | 30.96 | 387 | 0.9041 | 0.82 | 0.3210 | 1.3323 | 0.82 | 0.8148 | 0.1627 | 0.0651 |
| No log | 32.0 | 400 | 1.1018 | 0.79 | 0.3804 | 1.9928 | 0.79 | 0.7962 | 0.1859 | 0.0574 |
| No log | 32.96 | 412 | 1.1973 | 0.765 | 0.4156 | 1.4304 | 0.765 | 0.7682 | 0.2147 | 0.0760 |
| No log | 34.0 | 425 | 1.0216 | 0.805 | 0.3605 | 1.4476 | 0.805 | 0.7864 | 0.1830 | 0.0633 |
| No log | 34.96 | 437 | 1.2356 | 0.755 | 0.4237 | 1.9897 | 0.755 | 0.7350 | 0.2214 | 0.0890 |
| No log | 36.0 | 450 | 1.0881 | 0.8 | 0.3757 | 1.3848 | 0.8000 | 0.7810 | 0.1960 | 0.0703 |
| No log | 36.96 | 462 | 1.1133 | 0.795 | 0.3687 | 2.0286 | 0.795 | 0.7707 | 0.1790 | 0.0652 |
| No log | 38.0 | 475 | 1.1243 | 0.78 | 0.3839 | 1.5683 | 0.78 | 0.7699 | 0.1905 | 0.0704 |
| No log | 38.96 | 487 | 1.1351 | 0.785 | 0.3983 | 1.4970 | 0.785 | 0.7647 | 0.1969 | 0.0666 |
| 0.0934 | 40.0 | 500 | 1.2551 | 0.775 | 0.4089 | 2.0438 | 0.775 | 0.7688 | 0.2082 | 0.1007 |
| 0.0934 | 40.96 | 512 | 1.1739 | 0.775 | 0.4003 | 1.3286 | 0.775 | 0.7654 | 0.2056 | 0.0819 |
| 0.0934 | 42.0 | 525 | 1.0007 | 0.83 | 0.3207 | 1.2576 | 0.83 | 0.8345 | 0.1579 | 0.0677 |
| 0.0934 | 42.96 | 537 | 1.0509 | 0.805 | 0.3580 | 1.2330 | 0.805 | 0.7933 | 0.1884 | 0.0716 |
| 0.0934 | 44.0 | 550 | 1.0830 | 0.805 | 0.3537 | 1.7652 | 0.805 | 0.7871 | 0.1740 | 0.0688 |
| 0.0934 | 44.96 | 562 | 0.8544 | 0.83 | 0.2957 | 1.4716 | 0.83 | 0.8039 | 0.1560 | 0.0532 |
| 0.0934 | 46.0 | 575 | 1.0803 | 0.815 | 0.3549 | 1.5802 | 0.815 | 0.7951 | 0.1840 | 0.0691 |
| 0.0934 | 46.96 | 587 | 0.9441 | 0.815 | 0.3318 | 1.2883 | 0.815 | 0.7924 | 0.1709 | 0.0514 |
| 0.0934 | 48.0 | 600 | 0.9007 | 0.845 | 0.2765 | 1.3443 | 0.845 | 0.8353 | 0.1402 | 0.0539 |
| 0.0934 | 48.96 | 612 | 0.9601 | 0.84 | 0.2952 | 1.4755 | 0.8400 | 0.8306 | 0.1499 | 0.0565 |
| 0.0934 | 50.0 | 625 | 0.9801 | 0.84 | 0.2992 | 1.4646 | 0.8400 | 0.8306 | 0.1529 | 0.0559 |
| 0.0934 | 50.96 | 637 | 0.9747 | 0.845 | 0.2950 | 1.4544 | 0.845 | 0.8338 | 0.1526 | 0.0546 |
| 0.0934 | 52.0 | 650 | 0.9651 | 0.845 | 0.2895 | 1.4442 | 0.845 | 0.8338 | 0.1469 | 0.0537 |
| 0.0934 | 52.96 | 662 | 0.9583 | 0.85 | 0.2848 | 1.4367 | 0.85 | 0.8370 | 0.1465 | 0.0525 |
| 0.0934 | 54.0 | 675 | 0.9534 | 0.85 | 0.2805 | 1.4300 | 0.85 | 0.8370 | 0.1455 | 0.0514 |
| 0.0934 | 54.96 | 687 | 0.9503 | 0.855 | 0.2776 | 1.4252 | 0.855 | 0.8425 | 0.1408 | 0.0510 |
| 0.0934 | 56.0 | 700 | 0.9480 | 0.855 | 0.2754 | 1.4207 | 0.855 | 0.8425 | 0.1407 | 0.0506 |
| 0.0934 | 56.96 | 712 | 0.9471 | 0.855 | 0.2739 | 1.4175 | 0.855 | 0.8425 | 0.1442 | 0.0504 |
| 0.0934 | 58.0 | 725 | 0.9471 | 0.855 | 0.2729 | 1.4147 | 0.855 | 0.8442 | 0.1435 | 0.0501 |
| 0.0934 | 58.96 | 737 | 0.9474 | 0.855 | 0.2720 | 1.4125 | 0.855 | 0.8442 | 0.1432 | 0.0497 |
| 0.0934 | 60.0 | 750 | 0.9482 | 0.855 | 0.2713 | 1.4101 | 0.855 | 0.8442 | 0.1420 | 0.0497 |
| 0.0934 | 60.96 | 762 | 0.9490 | 0.855 | 0.2708 | 1.4082 | 0.855 | 0.8442 | 0.1421 | 0.0493 |
| 0.0934 | 62.0 | 775 | 0.9500 | 0.86 | 0.2703 | 1.4063 | 0.8600 | 0.8534 | 0.1411 | 0.0493 |
| 0.0934 | 62.96 | 787 | 0.9512 | 0.86 | 0.2702 | 1.4046 | 0.8600 | 0.8534 | 0.1410 | 0.0492 |
| 0.0934 | 64.0 | 800 | 0.9528 | 0.86 | 0.2699 | 1.4032 | 0.8600 | 0.8534 | 0.1408 | 0.0489 |
| 0.0934 | 64.96 | 812 | 0.9541 | 0.86 | 0.2697 | 1.3472 | 0.8600 | 0.8534 | 0.1349 | 0.0487 |
| 0.0934 | 66.0 | 825 | 0.9558 | 0.86 | 0.2696 | 1.3431 | 0.8600 | 0.8534 | 0.1408 | 0.0487 |
| 0.0934 | 66.96 | 837 | 0.9574 | 0.86 | 0.2697 | 1.3403 | 0.8600 | 0.8534 | 0.1405 | 0.0486 |
| 0.0934 | 68.0 | 850 | 0.9591 | 0.86 | 0.2698 | 1.3375 | 0.8600 | 0.8534 | 0.1402 | 0.0486 |
| 0.0934 | 68.96 | 862 | 0.9605 | 0.86 | 0.2698 | 1.3355 | 0.8600 | 0.8552 | 0.1394 | 0.0486 |
| 0.0934 | 70.0 | 875 | 0.9624 | 0.86 | 0.2698 | 1.3338 | 0.8600 | 0.8552 | 0.1394 | 0.0486 |
| 0.0934 | 70.96 | 887 | 0.9638 | 0.86 | 0.2700 | 1.3322 | 0.8600 | 0.8552 | 0.1397 | 0.0485 |
| 0.0934 | 72.0 | 900 | 0.9657 | 0.86 | 0.2701 | 1.3310 | 0.8600 | 0.8552 | 0.1397 | 0.0485 |
| 0.0934 | 72.96 | 912 | 0.9673 | 0.86 | 0.2702 | 1.3299 | 0.8600 | 0.8552 | 0.1397 | 0.0484 |
| 0.0934 | 74.0 | 925 | 0.9691 | 0.86 | 0.2703 | 1.3289 | 0.8600 | 0.8552 | 0.1397 | 0.0484 |
| 0.0934 | 74.96 | 937 | 0.9708 | 0.86 | 0.2704 | 1.3280 | 0.8600 | 0.8552 | 0.1398 | 0.0485 |
| 0.0934 | 76.0 | 950 | 0.9725 | 0.86 | 0.2706 | 1.3271 | 0.8600 | 0.8552 | 0.1398 | 0.0485 |
| 0.0934 | 76.96 | 962 | 0.9740 | 0.86 | 0.2707 | 1.3263 | 0.8600 | 0.8552 | 0.1398 | 0.0485 |
| 0.0934 | 78.0 | 975 | 0.9757 | 0.86 | 0.2707 | 1.3256 | 0.8600 | 0.8552 | 0.1383 | 0.0485 |
| 0.0934 | 78.96 | 987 | 0.9772 | 0.86 | 0.2708 | 1.3248 | 0.8600 | 0.8552 | 0.1357 | 0.0484 |
| 0.0038 | 80.0 | 1000 | 0.9789 | 0.86 | 0.2709 | 1.3243 | 0.8600 | 0.8552 | 0.1359 | 0.0485 |
| 0.0038 | 80.96 | 1012 | 0.9806 | 0.86 | 0.2710 | 1.3238 | 0.8600 | 0.8552 | 0.1360 | 0.0484 |
| 0.0038 | 82.0 | 1025 | 0.9820 | 0.86 | 0.2711 | 1.3232 | 0.8600 | 0.8552 | 0.1361 | 0.0482 |
| 0.0038 | 82.96 | 1037 | 0.9837 | 0.86 | 0.2712 | 1.3227 | 0.8600 | 0.8552 | 0.1361 | 0.0481 |
| 0.0038 | 84.0 | 1050 | 0.9853 | 0.86 | 0.2713 | 1.3222 | 0.8600 | 0.8552 | 0.1362 | 0.0480 |
| 0.0038 | 84.96 | 1062 | 0.9867 | 0.86 | 0.2713 | 1.3216 | 0.8600 | 0.8552 | 0.1363 | 0.0481 |
| 0.0038 | 86.0 | 1075 | 0.9883 | 0.86 | 0.2714 | 1.3212 | 0.8600 | 0.8552 | 0.1364 | 0.0479 |
| 0.0038 | 86.96 | 1087 | 0.9896 | 0.86 | 0.2714 | 1.3208 | 0.8600 | 0.8552 | 0.1365 | 0.0477 |
| 0.0038 | 88.0 | 1100 | 0.9911 | 0.86 | 0.2715 | 1.3203 | 0.8600 | 0.8552 | 0.1366 | 0.0478 |
| 0.0038 | 88.96 | 1112 | 0.9925 | 0.86 | 0.2715 | 1.3200 | 0.8600 | 0.8552 | 0.1369 | 0.0478 |
| 0.0038 | 90.0 | 1125 | 0.9940 | 0.86 | 0.2715 | 1.3196 | 0.8600 | 0.8552 | 0.1369 | 0.0477 |
| 0.0038 | 90.96 | 1137 | 0.9954 | 0.86 | 0.2715 | 1.3194 | 0.8600 | 0.8552 | 0.1369 | 0.0476 |
| 0.0038 | 92.0 | 1150 | 0.9968 | 0.86 | 0.2716 | 1.3190 | 0.8600 | 0.8552 | 0.1368 | 0.0476 |
| 0.0038 | 92.96 | 1162 | 0.9983 | 0.86 | 0.2716 | 1.3187 | 0.8600 | 0.8552 | 0.1368 | 0.0476 |
| 0.0038 | 94.0 | 1175 | 0.9996 | 0.86 | 0.2716 | 1.3184 | 0.8600 | 0.8552 | 0.1394 | 0.0476 |
| 0.0038 | 94.96 | 1187 | 1.0009 | 0.86 | 0.2716 | 1.3182 | 0.8600 | 0.8552 | 0.1393 | 0.0475 |
| 0.0038 | 96.0 | 1200 | 1.0023 | 0.86 | 0.2717 | 1.3179 | 0.8600 | 0.8552 | 0.1392 | 0.0475 |
| 0.0038 | 96.96 | 1212 | 1.0035 | 0.86 | 0.2717 | 1.3176 | 0.8600 | 0.8552 | 0.1391 | 0.0475 |
| 0.0038 | 98.0 | 1225 | 1.0049 | 0.86 | 0.2717 | 1.3175 | 0.8600 | 0.8552 | 0.1391 | 0.0474 |
| 0.0038 | 98.96 | 1237 | 1.0062 | 0.86 | 0.2717 | 1.3172 | 0.8600 | 0.8552 | 0.1391 | 0.0475 |
| 0.0038 | 100.0 | 1250 | 1.0075 | 0.86 | 0.2717 | 1.3169 | 0.8600 | 0.8552 | 0.1367 | 0.0475 |
| 0.0038 | 100.96 | 1262 | 1.0087 | 0.86 | 0.2717 | 1.3167 | 0.8600 | 0.8552 | 0.1368 | 0.0475 |
| 0.0038 | 102.0 | 1275 | 1.0099 | 0.86 | 0.2717 | 1.3164 | 0.8600 | 0.8552 | 0.1375 | 0.0474 |
| 0.0038 | 102.96 | 1287 | 1.0111 | 0.86 | 0.2717 | 1.3162 | 0.8600 | 0.8552 | 0.1376 | 0.0473 |
| 0.0038 | 104.0 | 1300 | 1.0122 | 0.86 | 0.2717 | 1.3159 | 0.8600 | 0.8552 | 0.1378 | 0.0471 |
| 0.0038 | 104.96 | 1312 | 1.0134 | 0.86 | 0.2716 | 1.3158 | 0.8600 | 0.8552 | 0.1378 | 0.0473 |
| 0.0038 | 106.0 | 1325 | 1.0146 | 0.86 | 0.2717 | 1.3155 | 0.8600 | 0.8552 | 0.1379 | 0.0472 |
| 0.0038 | 106.96 | 1337 | 1.0158 | 0.86 | 0.2717 | 1.3153 | 0.8600 | 0.8552 | 0.1379 | 0.0471 |
| 0.0038 | 108.0 | 1350 | 1.0169 | 0.86 | 0.2716 | 1.3151 | 0.8600 | 0.8552 | 0.1380 | 0.0471 |
| 0.0038 | 108.96 | 1362 | 1.0180 | 0.86 | 0.2716 | 1.3149 | 0.8600 | 0.8552 | 0.1381 | 0.0471 |
| 0.0038 | 110.0 | 1375 | 1.0191 | 0.86 | 0.2716 | 1.3146 | 0.8600 | 0.8552 | 0.1381 | 0.0471 |
| 0.0038 | 110.96 | 1387 | 1.0201 | 0.86 | 0.2716 | 1.3144 | 0.8600 | 0.8552 | 0.1382 | 0.0471 |
| 0.0038 | 112.0 | 1400 | 1.0211 | 0.86 | 0.2716 | 1.3142 | 0.8600 | 0.8552 | 0.1382 | 0.0470 |
| 0.0038 | 112.96 | 1412 | 1.0222 | 0.86 | 0.2716 | 1.3141 | 0.8600 | 0.8552 | 0.1382 | 0.0471 |
| 0.0038 | 114.0 | 1425 | 1.0233 | 0.86 | 0.2715 | 1.3139 | 0.8600 | 0.8552 | 0.1383 | 0.0470 |
| 0.0038 | 114.96 | 1437 | 1.0242 | 0.86 | 0.2715 | 1.3138 | 0.8600 | 0.8552 | 0.1383 | 0.0470 |
| 0.0038 | 116.0 | 1450 | 1.0253 | 0.86 | 0.2715 | 1.3136 | 0.8600 | 0.8552 | 0.1383 | 0.0469 |
| 0.0038 | 116.96 | 1462 | 1.0263 | 0.86 | 0.2715 | 1.3134 | 0.8600 | 0.8552 | 0.1383 | 0.0470 |
| 0.0038 | 118.0 | 1475 | 1.0273 | 0.86 | 0.2715 | 1.3133 | 0.8600 | 0.8552 | 0.1384 | 0.0470 |
| 0.0038 | 118.96 | 1487 | 1.0282 | 0.86 | 0.2714 | 1.3131 | 0.8600 | 0.8552 | 0.1384 | 0.0468 |
| 0.0006 | 120.0 | 1500 | 1.0292 | 0.86 | 0.2714 | 1.3130 | 0.8600 | 0.8552 | 0.1385 | 0.0468 |
| 0.0006 | 120.96 | 1512 | 1.0301 | 0.86 | 0.2714 | 1.3128 | 0.8600 | 0.8552 | 0.1385 | 0.0468 |
| 0.0006 | 122.0 | 1525 | 1.0311 | 0.86 | 0.2714 | 1.3127 | 0.8600 | 0.8552 | 0.1386 | 0.0467 |
| 0.0006 | 122.96 | 1537 | 1.0319 | 0.86 | 0.2714 | 1.3126 | 0.8600 | 0.8552 | 0.1386 | 0.0467 |
| 0.0006 | 124.0 | 1550 | 1.0329 | 0.86 | 0.2714 | 1.3124 | 0.8600 | 0.8552 | 0.1387 | 0.0467 |
| 0.0006 | 124.96 | 1562 | 1.0337 | 0.86 | 0.2713 | 1.3123 | 0.8600 | 0.8552 | 0.1393 | 0.0467 |
| 0.0006 | 126.0 | 1575 | 1.0346 | 0.86 | 0.2713 | 1.3122 | 0.8600 | 0.8552 | 0.1374 | 0.0466 |
| 0.0006 | 126.96 | 1587 | 1.0354 | 0.86 | 0.2713 | 1.3120 | 0.8600 | 0.8552 | 0.1375 | 0.0466 |
| 0.0006 | 128.0 | 1600 | 1.0363 | 0.86 | 0.2713 | 1.3119 | 0.8600 | 0.8552 | 0.1375 | 0.0467 |
| 0.0006 | 128.96 | 1612 | 1.0372 | 0.86 | 0.2713 | 1.3118 | 0.8600 | 0.8552 | 0.1375 | 0.0466 |
| 0.0006 | 130.0 | 1625 | 1.0380 | 0.86 | 0.2712 | 1.3117 | 0.8600 | 0.8552 | 0.1375 | 0.0466 |
| 0.0006 | 130.96 | 1637 | 1.0388 | 0.86 | 0.2712 | 1.3116 | 0.8600 | 0.8552 | 0.1375 | 0.0467 |
| 0.0006 | 132.0 | 1650 | 1.0396 | 0.86 | 0.2712 | 1.3115 | 0.8600 | 0.8552 | 0.1375 | 0.0465 |
| 0.0006 | 132.96 | 1662 | 1.0403 | 0.86 | 0.2712 | 1.3113 | 0.8600 | 0.8552 | 0.1375 | 0.0466 |
| 0.0006 | 134.0 | 1675 | 1.0411 | 0.86 | 0.2712 | 1.3113 | 0.8600 | 0.8552 | 0.1376 | 0.0466 |
| 0.0006 | 134.96 | 1687 | 1.0419 | 0.86 | 0.2711 | 1.3112 | 0.8600 | 0.8552 | 0.1376 | 0.0466 |
| 0.0006 | 136.0 | 1700 | 1.0426 | 0.86 | 0.2711 | 1.3111 | 0.8600 | 0.8552 | 0.1376 | 0.0465 |
| 0.0006 | 136.96 | 1712 | 1.0433 | 0.86 | 0.2711 | 1.3110 | 0.8600 | 0.8552 | 0.1376 | 0.0465 |
| 0.0006 | 138.0 | 1725 | 1.0441 | 0.86 | 0.2711 | 1.3109 | 0.8600 | 0.8552 | 0.1376 | 0.0465 |
| 0.0006 | 138.96 | 1737 | 1.0448 | 0.86 | 0.2711 | 1.3108 | 0.8600 | 0.8552 | 0.1376 | 0.0465 |
| 0.0006 | 140.0 | 1750 | 1.0455 | 0.86 | 0.2710 | 1.3107 | 0.8600 | 0.8552 | 0.1377 | 0.0465 |
| 0.0006 | 140.96 | 1762 | 1.0461 | 0.86 | 0.2710 | 1.3106 | 0.8600 | 0.8552 | 0.1377 | 0.0465 |
| 0.0006 | 142.0 | 1775 | 1.0468 | 0.86 | 0.2710 | 1.3106 | 0.8600 | 0.8552 | 0.1377 | 0.0465 |
| 0.0006 | 142.96 | 1787 | 1.0474 | 0.86 | 0.2710 | 1.3105 | 0.8600 | 0.8552 | 0.1377 | 0.0465 |
| 0.0006 | 144.0 | 1800 | 1.0481 | 0.86 | 0.2710 | 1.3104 | 0.8600 | 0.8552 | 0.1377 | 0.0465 |
| 0.0006 | 144.96 | 1812 | 1.0487 | 0.86 | 0.2710 | 1.3103 | 0.8600 | 0.8552 | 0.1378 | 0.0465 |
| 0.0006 | 146.0 | 1825 | 1.0494 | 0.86 | 0.2709 | 1.3102 | 0.8600 | 0.8552 | 0.1378 | 0.0465 |
| 0.0006 | 146.96 | 1837 | 1.0500 | 0.86 | 0.2709 | 1.3102 | 0.8600 | 0.8552 | 0.1378 | 0.0465 |
| 0.0006 | 148.0 | 1850 | 1.0506 | 0.86 | 0.2709 | 1.3101 | 0.8600 | 0.8552 | 0.1378 | 0.0465 |
| 0.0006 | 148.96 | 1862 | 1.0511 | 0.86 | 0.2709 | 1.3100 | 0.8600 | 0.8552 | 0.1378 | 0.0464 |
| 0.0006 | 150.0 | 1875 | 1.0517 | 0.86 | 0.2709 | 1.3099 | 0.8600 | 0.8552 | 0.1378 | 0.0464 |
| 0.0006 | 150.96 | 1887 | 1.0523 | 0.86 | 0.2709 | 1.3099 | 0.8600 | 0.8552 | 0.1378 | 0.0464 |
| 0.0006 | 152.0 | 1900 | 1.0529 | 0.86 | 0.2708 | 1.3098 | 0.8600 | 0.8552 | 0.1378 | 0.0464 |
| 0.0006 | 152.96 | 1912 | 1.0534 | 0.86 | 0.2708 | 1.3097 | 0.8600 | 0.8552 | 0.1378 | 0.0464 |
| 0.0006 | 154.0 | 1925 | 1.0539 | 0.86 | 0.2708 | 1.3096 | 0.8600 | 0.8552 | 0.1378 | 0.0464 |
| 0.0006 | 154.96 | 1937 | 1.0544 | 0.86 | 0.2708 | 1.3096 | 0.8600 | 0.8552 | 0.1378 | 0.0464 |
| 0.0006 | 156.0 | 1950 | 1.0550 | 0.86 | 0.2708 | 1.3095 | 0.8600 | 0.8552 | 0.1378 | 0.0464 |
| 0.0006 | 156.96 | 1962 | 1.0554 | 0.86 | 0.2708 | 1.3094 | 0.8600 | 0.8552 | 0.1378 | 0.0464 |
| 0.0006 | 158.0 | 1975 | 1.0559 | 0.86 | 0.2707 | 1.3094 | 0.8600 | 0.8552 | 0.1378 | 0.0464 |
| 0.0006 | 158.96 | 1987 | 1.0563 | 0.86 | 0.2707 | 1.3093 | 0.8600 | 0.8552 | 0.1378 | 0.0463 |
| 0.0004 | 160.0 | 2000 | 1.0568 | 0.86 | 0.2707 | 1.3093 | 0.8600 | 0.8552 | 0.1378 | 0.0463 |
| 0.0004 | 160.96 | 2012 | 1.0573 | 0.86 | 0.2707 | 1.3092 | 0.8600 | 0.8552 | 0.1378 | 0.0463 |
| 0.0004 | 162.0 | 2025 | 1.0577 | 0.86 | 0.2707 | 1.3092 | 0.8600 | 0.8552 | 0.1378 | 0.0463 |
| 0.0004 | 162.96 | 2037 | 1.0581 | 0.86 | 0.2707 | 1.3091 | 0.8600 | 0.8552 | 0.1378 | 0.0463 |
| 0.0004 | 164.0 | 2050 | 1.0585 | 0.86 | 0.2707 | 1.3091 | 0.8600 | 0.8552 | 0.1378 | 0.0463 |
| 0.0004 | 164.96 | 2062 | 1.0589 | 0.86 | 0.2707 | 1.3090 | 0.8600 | 0.8552 | 0.1378 | 0.0463 |
| 0.0004 | 166.0 | 2075 | 1.0593 | 0.86 | 0.2707 | 1.3090 | 0.8600 | 0.8552 | 0.1378 | 0.0463 |
| 0.0004 | 166.96 | 2087 | 1.0597 | 0.86 | 0.2706 | 1.3089 | 0.8600 | 0.8552 | 0.1378 | 0.0463 |
| 0.0004 | 168.0 | 2100 | 1.0600 | 0.86 | 0.2706 | 1.3089 | 0.8600 | 0.8552 | 0.1378 | 0.0463 |
| 0.0004 | 168.96 | 2112 | 1.0603 | 0.86 | 0.2706 | 1.3089 | 0.8600 | 0.8552 | 0.1378 | 0.0463 |
| 0.0004 | 170.0 | 2125 | 1.0607 | 0.86 | 0.2706 | 1.3088 | 0.8600 | 0.8552 | 0.1378 | 0.0463 |
| 0.0004 | 170.96 | 2137 | 1.0610 | 0.86 | 0.2706 | 1.3088 | 0.8600 | 0.8552 | 0.1378 | 0.0463 |
| 0.0004 | 172.0 | 2150 | 1.0613 | 0.86 | 0.2706 | 1.3088 | 0.8600 | 0.8552 | 0.1378 | 0.0462 |
| 0.0004 | 172.96 | 2162 | 1.0616 | 0.86 | 0.2706 | 1.3087 | 0.8600 | 0.8552 | 0.1378 | 0.0462 |
| 0.0004 | 174.0 | 2175 | 1.0619 | 0.86 | 0.2706 | 1.3087 | 0.8600 | 0.8552 | 0.1378 | 0.0463 |
| 0.0004 | 174.96 | 2187 | 1.0621 | 0.86 | 0.2706 | 1.3087 | 0.8600 | 0.8552 | 0.1378 | 0.0462 |
| 0.0004 | 176.0 | 2200 | 1.0624 | 0.86 | 0.2706 | 1.3087 | 0.8600 | 0.8552 | 0.1378 | 0.0462 |
| 0.0004 | 176.96 | 2212 | 1.0626 | 0.86 | 0.2706 | 1.3086 | 0.8600 | 0.8552 | 0.1378 | 0.0462 |
| 0.0004 | 178.0 | 2225 | 1.0629 | 0.86 | 0.2706 | 1.3086 | 0.8600 | 0.8552 | 0.1378 | 0.0462 |
| 0.0004 | 178.96 | 2237 | 1.0630 | 0.86 | 0.2706 | 1.3086 | 0.8600 | 0.8552 | 0.1378 | 0.0462 |
| 0.0004 | 180.0 | 2250 | 1.0632 | 0.86 | 0.2706 | 1.3086 | 0.8600 | 0.8552 | 0.1378 | 0.0462 |
| 0.0004 | 180.96 | 2262 | 1.0634 | 0.86 | 0.2706 | 1.3086 | 0.8600 | 0.8552 | 0.1378 | 0.0463 |
| 0.0004 | 182.0 | 2275 | 1.0636 | 0.86 | 0.2705 | 1.3085 | 0.8600 | 0.8552 | 0.1378 | 0.0462 |
| 0.0004 | 182.96 | 2287 | 1.0637 | 0.86 | 0.2705 | 1.3085 | 0.8600 | 0.8552 | 0.1378 | 0.0462 |
| 0.0004 | 184.0 | 2300 | 1.0639 | 0.86 | 0.2705 | 1.3085 | 0.8600 | 0.8552 | 0.1378 | 0.0462 |
| 0.0004 | 184.96 | 2312 | 1.0640 | 0.86 | 0.2705 | 1.3085 | 0.8600 | 0.8552 | 0.1378 | 0.0462 |
| 0.0004 | 186.0 | 2325 | 1.0641 | 0.86 | 0.2705 | 1.3085 | 0.8600 | 0.8552 | 0.1378 | 0.0462 |
| 0.0004 | 186.96 | 2337 | 1.0642 | 0.86 | 0.2705 | 1.3085 | 0.8600 | 0.8552 | 0.1378 | 0.0462 |
| 0.0004 | 188.0 | 2350 | 1.0643 | 0.86 | 0.2705 | 1.3085 | 0.8600 | 0.8552 | 0.1378 | 0.0461 |
| 0.0004 | 188.96 | 2362 | 1.0643 | 0.86 | 0.2705 | 1.3085 | 0.8600 | 0.8552 | 0.1378 | 0.0461 |
| 0.0004 | 190.0 | 2375 | 1.0644 | 0.86 | 0.2705 | 1.3085 | 0.8600 | 0.8552 | 0.1378 | 0.0461 |
| 0.0004 | 190.96 | 2387 | 1.0644 | 0.86 | 0.2705 | 1.3085 | 0.8600 | 0.8552 | 0.1378 | 0.0461 |
| 0.0004 | 192.0 | 2400 | 1.0644 | 0.86 | 0.2705 | 1.3085 | 0.8600 | 0.8552 | 0.1378 | 0.0461 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t2.5_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t2.5_a0.9
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set (sketches of the NLL and AURC metrics follow the list):
- Loss: 1.6228
- Accuracy: 0.5375
- Brier Loss: 0.6009
- Nll: 2.3362
- F1 Micro: 0.5375
- F1 Macro: 0.5332
- Ece: 0.1886
- Aurc: 0.2188
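NLL is the negative log-likelihood of the true class; AURC is read here as the area under the risk–coverage curve from selective prediction, which is an assumption based on the metric's usual meaning rather than anything stated in this card. A minimal NumPy sketch:
```python
import numpy as np

def negative_log_likelihood(probs, labels, eps=1e-12):
    """Average negative log-probability assigned to the true class."""
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + eps))

def aurc(probs, labels):
    """Area under the risk-coverage curve: rank samples by confidence and
    average the running error rate as coverage grows from 1/N to 1."""
    confidences = probs.max(axis=1)
    errors = (probs.argmax(axis=1) != labels).astype(float)
    order = np.argsort(-confidences)  # most confident first
    running_risk = np.cumsum(errors[order]) / np.arange(1, len(labels) + 1)
    return running_risk.mean()
```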
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the assumed distillation objective follows the list):
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
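The `CEKD_t2.5_a0.9` suffix suggests a cross-entropy plus knowledge-distillation objective with temperature 2.5 and mixing weight 0.9. The exact weighting convention is not documented in this card, so the sketch below shows one common Hinton-style formulation rather than the verified training code:
```python
import torch.nn.functional as F

def cekd_loss(student_logits, teacher_logits, labels, temperature=2.5, alpha=0.9):
    """Common CE+KD mixture: alpha weights the soft-target KL term,
    (1 - alpha) the hard-label cross-entropy; the T^2 factor keeps
    gradient magnitudes comparable across temperatures."""
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```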
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 4.0859 | 0.0425 | 1.0718 | 7.3724 | 0.0425 | 0.0340 | 0.2811 | 0.9554 |
| No log | 2.0 | 14 | 3.1575 | 0.0925 | 0.9517 | 5.8523 | 0.0925 | 0.0709 | 0.1660 | 0.8903 |
| No log | 3.0 | 21 | 2.8291 | 0.2325 | 0.8866 | 5.4080 | 0.2325 | 0.1824 | 0.1601 | 0.6363 |
| No log | 4.0 | 28 | 2.4189 | 0.2875 | 0.7987 | 3.5373 | 0.2875 | 0.2714 | 0.1822 | 0.4858 |
| No log | 5.0 | 35 | 2.1338 | 0.3775 | 0.7203 | 2.9804 | 0.3775 | 0.3569 | 0.1731 | 0.3581 |
| No log | 6.0 | 42 | 2.0168 | 0.44 | 0.6910 | 2.9903 | 0.44 | 0.4255 | 0.1854 | 0.3060 |
| No log | 7.0 | 49 | 2.0458 | 0.4575 | 0.6988 | 3.1988 | 0.4575 | 0.4286 | 0.2078 | 0.3000 |
| No log | 8.0 | 56 | 1.9688 | 0.4575 | 0.6928 | 2.8691 | 0.4575 | 0.4542 | 0.2122 | 0.2986 |
| No log | 9.0 | 63 | 2.1143 | 0.4625 | 0.7081 | 2.8507 | 0.4625 | 0.4496 | 0.2232 | 0.2940 |
| No log | 10.0 | 70 | 2.1352 | 0.455 | 0.7242 | 2.9232 | 0.455 | 0.4317 | 0.2581 | 0.2923 |
| No log | 11.0 | 77 | 2.2760 | 0.44 | 0.7545 | 2.9083 | 0.44 | 0.4144 | 0.2835 | 0.3144 |
| No log | 12.0 | 84 | 2.1320 | 0.4675 | 0.7389 | 2.6277 | 0.4675 | 0.4607 | 0.2715 | 0.3142 |
| No log | 13.0 | 91 | 2.0889 | 0.4825 | 0.7170 | 2.6786 | 0.4825 | 0.4755 | 0.2522 | 0.2933 |
| No log | 14.0 | 98 | 2.0173 | 0.475 | 0.7020 | 2.7765 | 0.4750 | 0.4555 | 0.2593 | 0.2791 |
| No log | 15.0 | 105 | 2.1265 | 0.4875 | 0.7124 | 3.0127 | 0.4875 | 0.4753 | 0.2458 | 0.3069 |
| No log | 16.0 | 112 | 2.1517 | 0.4825 | 0.7184 | 2.7981 | 0.4825 | 0.4719 | 0.2505 | 0.2964 |
| No log | 17.0 | 119 | 1.9485 | 0.49 | 0.6828 | 2.6670 | 0.49 | 0.4773 | 0.2430 | 0.2649 |
| No log | 18.0 | 126 | 1.9434 | 0.5 | 0.6802 | 2.6131 | 0.5 | 0.4941 | 0.2267 | 0.2741 |
| No log | 19.0 | 133 | 1.9055 | 0.4925 | 0.6741 | 2.7035 | 0.4925 | 0.4953 | 0.2201 | 0.2700 |
| No log | 20.0 | 140 | 1.8119 | 0.5 | 0.6520 | 2.4537 | 0.5 | 0.4937 | 0.2016 | 0.2449 |
| No log | 21.0 | 147 | 1.8420 | 0.49 | 0.6593 | 2.6244 | 0.49 | 0.4798 | 0.2100 | 0.2668 |
| No log | 22.0 | 154 | 1.9282 | 0.5 | 0.6739 | 2.6650 | 0.5 | 0.4837 | 0.2166 | 0.2652 |
| No log | 23.0 | 161 | 1.8139 | 0.5125 | 0.6543 | 2.4043 | 0.5125 | 0.5009 | 0.2275 | 0.2604 |
| No log | 24.0 | 168 | 1.6997 | 0.5325 | 0.6124 | 2.3950 | 0.5325 | 0.5271 | 0.1873 | 0.2248 |
| No log | 25.0 | 175 | 1.8370 | 0.5025 | 0.6526 | 2.4967 | 0.5025 | 0.4926 | 0.2249 | 0.2513 |
| No log | 26.0 | 182 | 1.7508 | 0.5025 | 0.6388 | 2.4321 | 0.5025 | 0.4947 | 0.2135 | 0.2416 |
| No log | 27.0 | 189 | 1.7354 | 0.52 | 0.6368 | 2.3706 | 0.52 | 0.5149 | 0.2101 | 0.2440 |
| No log | 28.0 | 196 | 1.7809 | 0.52 | 0.6421 | 2.4895 | 0.52 | 0.5076 | 0.2033 | 0.2446 |
| No log | 29.0 | 203 | 1.6770 | 0.55 | 0.6046 | 2.3913 | 0.55 | 0.5409 | 0.1849 | 0.2189 |
| No log | 30.0 | 210 | 1.6794 | 0.5325 | 0.6127 | 2.5677 | 0.5325 | 0.5263 | 0.2095 | 0.2247 |
| No log | 31.0 | 217 | 1.7352 | 0.525 | 0.6325 | 2.5068 | 0.525 | 0.5062 | 0.1865 | 0.2320 |
| No log | 32.0 | 224 | 1.7396 | 0.5225 | 0.6379 | 2.4798 | 0.5225 | 0.5169 | 0.1829 | 0.2402 |
| No log | 33.0 | 231 | 1.7013 | 0.52 | 0.6188 | 2.5032 | 0.52 | 0.5029 | 0.1986 | 0.2233 |
| No log | 34.0 | 238 | 1.7121 | 0.5075 | 0.6313 | 2.5680 | 0.5075 | 0.5074 | 0.2249 | 0.2482 |
| No log | 35.0 | 245 | 1.7053 | 0.5175 | 0.6368 | 2.3967 | 0.5175 | 0.5064 | 0.2088 | 0.2389 |
| No log | 36.0 | 252 | 1.6616 | 0.55 | 0.6058 | 2.5726 | 0.55 | 0.5357 | 0.1762 | 0.2174 |
| No log | 37.0 | 259 | 1.7302 | 0.5075 | 0.6350 | 2.3745 | 0.5075 | 0.4993 | 0.1939 | 0.2481 |
| No log | 38.0 | 266 | 1.6741 | 0.5225 | 0.6152 | 2.5862 | 0.5225 | 0.5211 | 0.2042 | 0.2234 |
| No log | 39.0 | 273 | 1.6960 | 0.515 | 0.6216 | 2.4210 | 0.515 | 0.5049 | 0.1776 | 0.2270 |
| No log | 40.0 | 280 | 1.6512 | 0.54 | 0.6075 | 2.4495 | 0.54 | 0.5377 | 0.1848 | 0.2217 |
| No log | 41.0 | 287 | 1.6477 | 0.52 | 0.6086 | 2.5354 | 0.52 | 0.5177 | 0.1976 | 0.2219 |
| No log | 42.0 | 294 | 1.6680 | 0.52 | 0.6198 | 2.2815 | 0.52 | 0.5215 | 0.1951 | 0.2314 |
| No log | 43.0 | 301 | 1.6153 | 0.545 | 0.5935 | 2.3952 | 0.545 | 0.5376 | 0.1721 | 0.2088 |
| No log | 44.0 | 308 | 1.6347 | 0.5375 | 0.6023 | 2.3985 | 0.5375 | 0.5340 | 0.1778 | 0.2178 |
| No log | 45.0 | 315 | 1.6132 | 0.5375 | 0.6001 | 2.3149 | 0.5375 | 0.5375 | 0.1902 | 0.2163 |
| No log | 46.0 | 322 | 1.6366 | 0.5275 | 0.6102 | 2.3820 | 0.5275 | 0.5210 | 0.2057 | 0.2239 |
| No log | 47.0 | 329 | 1.6118 | 0.55 | 0.5963 | 2.3834 | 0.55 | 0.5463 | 0.1748 | 0.2137 |
| No log | 48.0 | 336 | 1.6208 | 0.535 | 0.6038 | 2.2855 | 0.535 | 0.5340 | 0.1791 | 0.2201 |
| No log | 49.0 | 343 | 1.6006 | 0.545 | 0.5960 | 2.3253 | 0.545 | 0.5440 | 0.2007 | 0.2158 |
| No log | 50.0 | 350 | 1.6184 | 0.5425 | 0.6005 | 2.3181 | 0.5425 | 0.5432 | 0.1846 | 0.2176 |
| No log | 51.0 | 357 | 1.6215 | 0.5425 | 0.6022 | 2.3219 | 0.5425 | 0.5371 | 0.1776 | 0.2163 |
| No log | 52.0 | 364 | 1.6068 | 0.54 | 0.5978 | 2.2928 | 0.54 | 0.5383 | 0.1865 | 0.2164 |
| No log | 53.0 | 371 | 1.6128 | 0.535 | 0.5979 | 2.3243 | 0.535 | 0.5335 | 0.1913 | 0.2189 |
| No log | 54.0 | 378 | 1.6182 | 0.545 | 0.6002 | 2.3604 | 0.545 | 0.5432 | 0.1740 | 0.2157 |
| No log | 55.0 | 385 | 1.6143 | 0.54 | 0.5981 | 2.3632 | 0.54 | 0.5348 | 0.1911 | 0.2165 |
| No log | 56.0 | 392 | 1.6163 | 0.5375 | 0.6010 | 2.3551 | 0.5375 | 0.5338 | 0.1950 | 0.2191 |
| No log | 57.0 | 399 | 1.6125 | 0.54 | 0.5984 | 2.3597 | 0.54 | 0.5378 | 0.1680 | 0.2159 |
| No log | 58.0 | 406 | 1.6159 | 0.5425 | 0.5997 | 2.3564 | 0.5425 | 0.5388 | 0.1796 | 0.2175 |
| No log | 59.0 | 413 | 1.6102 | 0.5475 | 0.5967 | 2.3879 | 0.5475 | 0.5448 | 0.1863 | 0.2152 |
| No log | 60.0 | 420 | 1.6171 | 0.535 | 0.6006 | 2.3594 | 0.535 | 0.5298 | 0.1987 | 0.2183 |
| No log | 61.0 | 427 | 1.6162 | 0.5375 | 0.5987 | 2.3631 | 0.5375 | 0.5341 | 0.1836 | 0.2172 |
| No log | 62.0 | 434 | 1.6167 | 0.5425 | 0.5992 | 2.3344 | 0.5425 | 0.5395 | 0.2011 | 0.2170 |
| No log | 63.0 | 441 | 1.6123 | 0.545 | 0.5984 | 2.3575 | 0.545 | 0.5423 | 0.1946 | 0.2174 |
| No log | 64.0 | 448 | 1.6209 | 0.535 | 0.6007 | 2.3333 | 0.535 | 0.5299 | 0.1973 | 0.2180 |
| No log | 65.0 | 455 | 1.6184 | 0.5475 | 0.6003 | 2.3620 | 0.5475 | 0.5442 | 0.1911 | 0.2167 |
| No log | 66.0 | 462 | 1.6179 | 0.5375 | 0.5992 | 2.3321 | 0.5375 | 0.5333 | 0.2019 | 0.2182 |
| No log | 67.0 | 469 | 1.6179 | 0.5425 | 0.5999 | 2.3309 | 0.5425 | 0.5397 | 0.1780 | 0.2176 |
| No log | 68.0 | 476 | 1.6204 | 0.545 | 0.6004 | 2.3282 | 0.545 | 0.5408 | 0.1740 | 0.2167 |
| No log | 69.0 | 483 | 1.6195 | 0.54 | 0.6005 | 2.3372 | 0.54 | 0.5370 | 0.1898 | 0.2187 |
| No log | 70.0 | 490 | 1.6189 | 0.5425 | 0.5996 | 2.3328 | 0.5425 | 0.5398 | 0.1754 | 0.2169 |
| No log | 71.0 | 497 | 1.6202 | 0.5425 | 0.6007 | 2.3352 | 0.5425 | 0.5394 | 0.1973 | 0.2180 |
| 0.3297 | 72.0 | 504 | 1.6175 | 0.5375 | 0.5997 | 2.3297 | 0.5375 | 0.5348 | 0.1721 | 0.2184 |
| 0.3297 | 73.0 | 511 | 1.6185 | 0.545 | 0.6001 | 2.3339 | 0.545 | 0.5414 | 0.1780 | 0.2175 |
| 0.3297 | 74.0 | 518 | 1.6203 | 0.5425 | 0.6003 | 2.3315 | 0.5425 | 0.5391 | 0.1855 | 0.2175 |
| 0.3297 | 75.0 | 525 | 1.6199 | 0.54 | 0.6002 | 2.3366 | 0.54 | 0.5363 | 0.1841 | 0.2180 |
| 0.3297 | 76.0 | 532 | 1.6203 | 0.5425 | 0.6002 | 2.3332 | 0.5425 | 0.5395 | 0.1736 | 0.2176 |
| 0.3297 | 77.0 | 539 | 1.6191 | 0.5425 | 0.6000 | 2.3328 | 0.5425 | 0.5395 | 0.1722 | 0.2176 |
| 0.3297 | 78.0 | 546 | 1.6202 | 0.54 | 0.6001 | 2.3317 | 0.54 | 0.5362 | 0.1909 | 0.2177 |
| 0.3297 | 79.0 | 553 | 1.6199 | 0.54 | 0.6002 | 2.3335 | 0.54 | 0.5370 | 0.1982 | 0.2181 |
| 0.3297 | 80.0 | 560 | 1.6207 | 0.54 | 0.6006 | 2.3336 | 0.54 | 0.5363 | 0.1983 | 0.2185 |
| 0.3297 | 81.0 | 567 | 1.6210 | 0.5425 | 0.6003 | 2.3353 | 0.5425 | 0.5389 | 0.1875 | 0.2174 |
| 0.3297 | 82.0 | 574 | 1.6216 | 0.5425 | 0.6005 | 2.3340 | 0.5425 | 0.5386 | 0.1792 | 0.2178 |
| 0.3297 | 83.0 | 581 | 1.6213 | 0.5375 | 0.6007 | 2.3360 | 0.5375 | 0.5340 | 0.1797 | 0.2193 |
| 0.3297 | 84.0 | 588 | 1.6208 | 0.54 | 0.6007 | 2.3345 | 0.54 | 0.5363 | 0.1848 | 0.2189 |
| 0.3297 | 85.0 | 595 | 1.6221 | 0.54 | 0.6005 | 2.3360 | 0.54 | 0.5369 | 0.1836 | 0.2184 |
| 0.3297 | 86.0 | 602 | 1.6218 | 0.5425 | 0.6005 | 2.3342 | 0.5425 | 0.5391 | 0.1922 | 0.2178 |
| 0.3297 | 87.0 | 609 | 1.6224 | 0.54 | 0.6006 | 2.3365 | 0.54 | 0.5363 | 0.1849 | 0.2185 |
| 0.3297 | 88.0 | 616 | 1.6216 | 0.54 | 0.6005 | 2.3355 | 0.54 | 0.5363 | 0.1912 | 0.2188 |
| 0.3297 | 89.0 | 623 | 1.6222 | 0.535 | 0.6008 | 2.3359 | 0.535 | 0.5310 | 0.1890 | 0.2195 |
| 0.3297 | 90.0 | 630 | 1.6218 | 0.5425 | 0.6007 | 2.3356 | 0.5425 | 0.5386 | 0.1930 | 0.2179 |
| 0.3297 | 91.0 | 637 | 1.6226 | 0.54 | 0.6009 | 2.3363 | 0.54 | 0.5363 | 0.1941 | 0.2188 |
| 0.3297 | 92.0 | 644 | 1.6224 | 0.54 | 0.6009 | 2.3360 | 0.54 | 0.5363 | 0.1890 | 0.2188 |
| 0.3297 | 93.0 | 651 | 1.6226 | 0.5425 | 0.6009 | 2.3359 | 0.5425 | 0.5386 | 0.1964 | 0.2179 |
| 0.3297 | 94.0 | 658 | 1.6227 | 0.54 | 0.6009 | 2.3363 | 0.54 | 0.5363 | 0.1877 | 0.2187 |
| 0.3297 | 95.0 | 665 | 1.6226 | 0.54 | 0.6008 | 2.3360 | 0.54 | 0.5363 | 0.1916 | 0.2186 |
| 0.3297 | 96.0 | 672 | 1.6225 | 0.54 | 0.6008 | 2.3362 | 0.54 | 0.5363 | 0.1972 | 0.2186 |
| 0.3297 | 97.0 | 679 | 1.6227 | 0.5375 | 0.6009 | 2.3362 | 0.5375 | 0.5332 | 0.1949 | 0.2188 |
| 0.3297 | 98.0 | 686 | 1.6228 | 0.5375 | 0.6009 | 2.3362 | 0.5375 | 0.5332 | 0.1924 | 0.2188 |
| 0.3297 | 99.0 | 693 | 1.6228 | 0.5375 | 0.6009 | 2.3363 | 0.5375 | 0.5332 | 0.1924 | 0.2189 |
| 0.3297 | 100.0 | 700 | 1.6228 | 0.5375 | 0.6009 | 2.3362 | 0.5375 | 0.5332 | 0.1886 | 0.2188 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t5.0_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t5.0_a0.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6906
- Accuracy: 0.5675
- Brier Loss: 0.5696
- Nll: 2.4654
- F1 Micro: 0.5675
- F1 Macro: 0.5648
- Ece: 0.1638
- Aurc: 0.1969
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 4.6645 | 0.0425 | 1.0732 | 7.4792 | 0.0425 | 0.0329 | 0.2822 | 0.9534 |
| No log | 2.0 | 14 | 3.6090 | 0.1 | 0.9462 | 5.7033 | 0.1000 | 0.0909 | 0.1715 | 0.8908 |
| No log | 3.0 | 21 | 3.2938 | 0.1975 | 0.8945 | 5.2434 | 0.1975 | 0.1585 | 0.1594 | 0.6803 |
| No log | 4.0 | 28 | 2.8501 | 0.29 | 0.7966 | 3.6125 | 0.29 | 0.2703 | 0.1664 | 0.4859 |
| No log | 5.0 | 35 | 2.5122 | 0.39 | 0.7162 | 3.1122 | 0.39 | 0.3562 | 0.1838 | 0.3484 |
| No log | 6.0 | 42 | 2.3667 | 0.4175 | 0.6955 | 3.3546 | 0.4175 | 0.3757 | 0.1683 | 0.3138 |
| No log | 7.0 | 49 | 2.2629 | 0.4725 | 0.6766 | 2.9737 | 0.4725 | 0.4417 | 0.1917 | 0.2939 |
| No log | 8.0 | 56 | 2.1495 | 0.485 | 0.6530 | 2.7257 | 0.485 | 0.4798 | 0.1901 | 0.2762 |
| No log | 9.0 | 63 | 2.1491 | 0.495 | 0.6517 | 2.7709 | 0.495 | 0.4982 | 0.1934 | 0.2664 |
| No log | 10.0 | 70 | 2.2541 | 0.465 | 0.6794 | 2.6880 | 0.465 | 0.4654 | 0.1923 | 0.2870 |
| No log | 11.0 | 77 | 2.1607 | 0.4925 | 0.6564 | 2.7773 | 0.4925 | 0.4769 | 0.1904 | 0.2545 |
| No log | 12.0 | 84 | 2.1581 | 0.505 | 0.6714 | 2.7017 | 0.505 | 0.4876 | 0.2023 | 0.2669 |
| No log | 13.0 | 91 | 2.0025 | 0.515 | 0.6324 | 2.5557 | 0.515 | 0.5048 | 0.2194 | 0.2348 |
| No log | 14.0 | 98 | 2.0267 | 0.5025 | 0.6474 | 2.4764 | 0.5025 | 0.4876 | 0.2071 | 0.2474 |
| No log | 15.0 | 105 | 1.9258 | 0.535 | 0.6106 | 2.4995 | 0.535 | 0.5310 | 0.1960 | 0.2251 |
| No log | 16.0 | 112 | 1.8650 | 0.545 | 0.6031 | 2.4944 | 0.545 | 0.5383 | 0.1847 | 0.2187 |
| No log | 17.0 | 119 | 1.8902 | 0.5375 | 0.6118 | 2.5468 | 0.5375 | 0.5284 | 0.1852 | 0.2278 |
| No log | 18.0 | 126 | 1.9187 | 0.53 | 0.6188 | 2.5580 | 0.53 | 0.5174 | 0.1792 | 0.2285 |
| No log | 19.0 | 133 | 1.8399 | 0.54 | 0.6049 | 2.5246 | 0.54 | 0.5362 | 0.1758 | 0.2191 |
| No log | 20.0 | 140 | 1.8912 | 0.53 | 0.6117 | 2.5403 | 0.53 | 0.5139 | 0.1819 | 0.2210 |
| No log | 21.0 | 147 | 1.8930 | 0.5325 | 0.6099 | 2.6739 | 0.5325 | 0.5294 | 0.1736 | 0.2281 |
| No log | 22.0 | 154 | 1.8930 | 0.5275 | 0.6070 | 2.6060 | 0.5275 | 0.5244 | 0.1832 | 0.2214 |
| No log | 23.0 | 161 | 1.8596 | 0.5375 | 0.6030 | 2.5066 | 0.5375 | 0.5260 | 0.1864 | 0.2185 |
| No log | 24.0 | 168 | 1.8328 | 0.5375 | 0.5942 | 2.5127 | 0.5375 | 0.5302 | 0.1807 | 0.2079 |
| No log | 25.0 | 175 | 1.8069 | 0.53 | 0.6004 | 2.4747 | 0.53 | 0.5241 | 0.1910 | 0.2189 |
| No log | 26.0 | 182 | 1.7937 | 0.5425 | 0.5912 | 2.5083 | 0.5425 | 0.5366 | 0.1841 | 0.2123 |
| No log | 27.0 | 189 | 1.7986 | 0.55 | 0.5945 | 2.5267 | 0.55 | 0.5476 | 0.1901 | 0.2134 |
| No log | 28.0 | 196 | 1.8090 | 0.5525 | 0.5918 | 2.4467 | 0.5525 | 0.5466 | 0.1765 | 0.2125 |
| No log | 29.0 | 203 | 1.7911 | 0.55 | 0.5931 | 2.4536 | 0.55 | 0.5462 | 0.1619 | 0.2140 |
| No log | 30.0 | 210 | 1.7702 | 0.5425 | 0.5911 | 2.5037 | 0.5425 | 0.5340 | 0.1819 | 0.2080 |
| No log | 31.0 | 217 | 1.8248 | 0.53 | 0.6063 | 2.4965 | 0.53 | 0.5145 | 0.1757 | 0.2217 |
| No log | 32.0 | 224 | 1.7866 | 0.545 | 0.5920 | 2.5543 | 0.545 | 0.5365 | 0.1745 | 0.2082 |
| No log | 33.0 | 231 | 1.7793 | 0.555 | 0.5867 | 2.4751 | 0.555 | 0.5503 | 0.1657 | 0.2107 |
| No log | 34.0 | 238 | 1.7417 | 0.5525 | 0.5799 | 2.4174 | 0.5525 | 0.5517 | 0.1778 | 0.2049 |
| No log | 35.0 | 245 | 1.7557 | 0.54 | 0.5869 | 2.4065 | 0.54 | 0.5379 | 0.1640 | 0.2119 |
| No log | 36.0 | 252 | 1.7567 | 0.54 | 0.5884 | 2.5031 | 0.54 | 0.5374 | 0.1519 | 0.2122 |
| No log | 37.0 | 259 | 1.7306 | 0.56 | 0.5781 | 2.5313 | 0.56 | 0.5551 | 0.1906 | 0.2042 |
| No log | 38.0 | 266 | 1.7615 | 0.5425 | 0.5888 | 2.4510 | 0.5425 | 0.5423 | 0.1783 | 0.2121 |
| No log | 39.0 | 273 | 1.7607 | 0.555 | 0.5855 | 2.5044 | 0.555 | 0.5473 | 0.1808 | 0.2062 |
| No log | 40.0 | 280 | 1.7430 | 0.5275 | 0.5840 | 2.4265 | 0.5275 | 0.5228 | 0.1822 | 0.2125 |
| No log | 41.0 | 287 | 1.7515 | 0.5625 | 0.5800 | 2.5432 | 0.5625 | 0.5500 | 0.1812 | 0.2019 |
| No log | 42.0 | 294 | 1.7229 | 0.5475 | 0.5778 | 2.4597 | 0.5475 | 0.5416 | 0.1656 | 0.2042 |
| No log | 43.0 | 301 | 1.7566 | 0.555 | 0.5818 | 2.4305 | 0.555 | 0.5517 | 0.1521 | 0.2036 |
| No log | 44.0 | 308 | 1.7093 | 0.575 | 0.5745 | 2.4087 | 0.575 | 0.5761 | 0.1678 | 0.2019 |
| No log | 45.0 | 315 | 1.7319 | 0.56 | 0.5788 | 2.5339 | 0.56 | 0.5550 | 0.1647 | 0.2028 |
| No log | 46.0 | 322 | 1.7331 | 0.5475 | 0.5863 | 2.4386 | 0.5475 | 0.5470 | 0.1737 | 0.2113 |
| No log | 47.0 | 329 | 1.7243 | 0.56 | 0.5776 | 2.5336 | 0.56 | 0.5548 | 0.1642 | 0.2037 |
| No log | 48.0 | 336 | 1.7205 | 0.5675 | 0.5817 | 2.4578 | 0.5675 | 0.5661 | 0.1800 | 0.2091 |
| No log | 49.0 | 343 | 1.7210 | 0.5725 | 0.5721 | 2.4792 | 0.5725 | 0.5656 | 0.1550 | 0.1970 |
| No log | 50.0 | 350 | 1.6994 | 0.58 | 0.5719 | 2.4645 | 0.58 | 0.5773 | 0.1580 | 0.1957 |
| No log | 51.0 | 357 | 1.7082 | 0.5775 | 0.5737 | 2.4805 | 0.5775 | 0.5701 | 0.1734 | 0.1992 |
| No log | 52.0 | 364 | 1.7144 | 0.5575 | 0.5693 | 2.4501 | 0.5575 | 0.5519 | 0.1812 | 0.1974 |
| No log | 53.0 | 371 | 1.7196 | 0.57 | 0.5796 | 2.4555 | 0.57 | 0.5618 | 0.1728 | 0.2022 |
| No log | 54.0 | 378 | 1.7149 | 0.5675 | 0.5746 | 2.4363 | 0.5675 | 0.5605 | 0.1901 | 0.1999 |
| No log | 55.0 | 385 | 1.7030 | 0.555 | 0.5775 | 2.4253 | 0.555 | 0.5528 | 0.1823 | 0.2029 |
| No log | 56.0 | 392 | 1.7209 | 0.5675 | 0.5747 | 2.4926 | 0.5675 | 0.5634 | 0.1644 | 0.2008 |
| No log | 57.0 | 399 | 1.7103 | 0.5525 | 0.5760 | 2.4528 | 0.5525 | 0.5487 | 0.1738 | 0.2025 |
| No log | 58.0 | 406 | 1.7005 | 0.5575 | 0.5771 | 2.4678 | 0.5575 | 0.5547 | 0.1514 | 0.2041 |
| No log | 59.0 | 413 | 1.7098 | 0.56 | 0.5763 | 2.4368 | 0.56 | 0.5585 | 0.1650 | 0.2022 |
| No log | 60.0 | 420 | 1.6976 | 0.5775 | 0.5681 | 2.4633 | 0.5775 | 0.5737 | 0.1729 | 0.1963 |
| No log | 61.0 | 427 | 1.7057 | 0.5575 | 0.5739 | 2.4717 | 0.5575 | 0.5522 | 0.1932 | 0.2007 |
| No log | 62.0 | 434 | 1.6884 | 0.5725 | 0.5646 | 2.4538 | 0.5725 | 0.5693 | 0.1692 | 0.1930 |
| No log | 63.0 | 441 | 1.6979 | 0.56 | 0.5731 | 2.4635 | 0.56 | 0.5562 | 0.1690 | 0.2016 |
| No log | 64.0 | 448 | 1.6848 | 0.55 | 0.5686 | 2.4583 | 0.55 | 0.5452 | 0.1782 | 0.1980 |
| No log | 65.0 | 455 | 1.7072 | 0.5575 | 0.5774 | 2.4596 | 0.5575 | 0.5549 | 0.1627 | 0.2036 |
| No log | 66.0 | 462 | 1.7057 | 0.5625 | 0.5759 | 2.4644 | 0.5625 | 0.5586 | 0.1780 | 0.2009 |
| No log | 67.0 | 469 | 1.7016 | 0.56 | 0.5719 | 2.4976 | 0.56 | 0.5546 | 0.1734 | 0.2004 |
| No log | 68.0 | 476 | 1.6951 | 0.5675 | 0.5725 | 2.4025 | 0.5675 | 0.5648 | 0.1866 | 0.1980 |
| No log | 69.0 | 483 | 1.7012 | 0.555 | 0.5750 | 2.4970 | 0.555 | 0.5527 | 0.1958 | 0.2022 |
| No log | 70.0 | 490 | 1.6983 | 0.575 | 0.5689 | 2.4763 | 0.575 | 0.5708 | 0.1648 | 0.1950 |
| No log | 71.0 | 497 | 1.6954 | 0.5675 | 0.5729 | 2.4642 | 0.5675 | 0.5638 | 0.1762 | 0.2004 |
| 0.437 | 72.0 | 504 | 1.6973 | 0.5625 | 0.5718 | 2.4639 | 0.5625 | 0.5605 | 0.1680 | 0.1991 |
| 0.437 | 73.0 | 511 | 1.6942 | 0.565 | 0.5714 | 2.4629 | 0.565 | 0.5610 | 0.1776 | 0.1980 |
| 0.437 | 74.0 | 518 | 1.6923 | 0.5725 | 0.5694 | 2.4717 | 0.5725 | 0.5698 | 0.1676 | 0.1967 |
| 0.437 | 75.0 | 525 | 1.6926 | 0.5675 | 0.5699 | 2.4674 | 0.5675 | 0.5633 | 0.1741 | 0.1975 |
| 0.437 | 76.0 | 532 | 1.6906 | 0.5675 | 0.5692 | 2.4673 | 0.5675 | 0.5641 | 0.1785 | 0.1962 |
| 0.437 | 77.0 | 539 | 1.6912 | 0.565 | 0.5692 | 2.4671 | 0.565 | 0.5617 | 0.1568 | 0.1958 |
| 0.437 | 78.0 | 546 | 1.6879 | 0.565 | 0.5685 | 2.4629 | 0.565 | 0.5620 | 0.1860 | 0.1954 |
| 0.437 | 79.0 | 553 | 1.6886 | 0.565 | 0.5695 | 2.4650 | 0.565 | 0.5625 | 0.1777 | 0.1968 |
| 0.437 | 80.0 | 560 | 1.6882 | 0.5625 | 0.5700 | 2.4632 | 0.5625 | 0.5583 | 0.1791 | 0.1982 |
| 0.437 | 81.0 | 567 | 1.6918 | 0.565 | 0.5704 | 2.4638 | 0.565 | 0.5622 | 0.1630 | 0.1978 |
| 0.437 | 82.0 | 574 | 1.6909 | 0.5675 | 0.5697 | 2.4646 | 0.5675 | 0.5641 | 0.1862 | 0.1959 |
| 0.437 | 83.0 | 581 | 1.6881 | 0.565 | 0.5687 | 2.4665 | 0.565 | 0.5621 | 0.1494 | 0.1956 |
| 0.437 | 84.0 | 588 | 1.6897 | 0.565 | 0.5692 | 2.4648 | 0.565 | 0.5626 | 0.1716 | 0.1969 |
| 0.437 | 85.0 | 595 | 1.6910 | 0.5675 | 0.5697 | 2.4654 | 0.5675 | 0.5652 | 0.1747 | 0.1971 |
| 0.437 | 86.0 | 602 | 1.6905 | 0.57 | 0.5694 | 2.4648 | 0.57 | 0.5667 | 0.1659 | 0.1959 |
| 0.437 | 87.0 | 609 | 1.6896 | 0.5675 | 0.5693 | 2.4648 | 0.5675 | 0.5642 | 0.1681 | 0.1958 |
| 0.437 | 88.0 | 616 | 1.6902 | 0.5675 | 0.5695 | 2.4655 | 0.5675 | 0.5642 | 0.1709 | 0.1961 |
| 0.437 | 89.0 | 623 | 1.6907 | 0.5675 | 0.5697 | 2.4655 | 0.5675 | 0.5648 | 0.1676 | 0.1971 |
| 0.437 | 90.0 | 630 | 1.6903 | 0.5675 | 0.5694 | 2.4657 | 0.5675 | 0.5648 | 0.1699 | 0.1966 |
| 0.437 | 91.0 | 637 | 1.6906 | 0.565 | 0.5696 | 2.4656 | 0.565 | 0.5624 | 0.1689 | 0.1970 |
| 0.437 | 92.0 | 644 | 1.6904 | 0.565 | 0.5695 | 2.4654 | 0.565 | 0.5622 | 0.1668 | 0.1970 |
| 0.437 | 93.0 | 651 | 1.6904 | 0.5675 | 0.5696 | 2.4655 | 0.5675 | 0.5649 | 0.1619 | 0.1970 |
| 0.437 | 94.0 | 658 | 1.6903 | 0.565 | 0.5694 | 2.4655 | 0.565 | 0.5622 | 0.1662 | 0.1969 |
| 0.437 | 95.0 | 665 | 1.6905 | 0.565 | 0.5695 | 2.4653 | 0.565 | 0.5622 | 0.1710 | 0.1969 |
| 0.437 | 96.0 | 672 | 1.6905 | 0.5675 | 0.5695 | 2.4655 | 0.5675 | 0.5648 | 0.1638 | 0.1968 |
| 0.437 | 97.0 | 679 | 1.6906 | 0.565 | 0.5695 | 2.4654 | 0.565 | 0.5622 | 0.1664 | 0.1970 |
| 0.437 | 98.0 | 686 | 1.6906 | 0.5675 | 0.5695 | 2.4655 | 0.5675 | 0.5648 | 0.1638 | 0.1969 |
| 0.437 | 99.0 | 693 | 1.6906 | 0.5675 | 0.5696 | 2.4654 | 0.5675 | 0.5648 | 0.1638 | 0.1969 |
| 0.437 | 100.0 | 700 | 1.6906 | 0.5675 | 0.5696 | 2.4654 | 0.5675 | 0.5648 | 0.1638 | 0.1969 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t5.0_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t5.0_a0.7
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6224
- Accuracy: 0.555
- Brier Loss: 0.5813
- Nll: 2.4451
- F1 Micro: 0.555
- F1 Macro: 0.5481
- Ece: 0.1732
- Aurc: 0.2090
## Model description
More information needed
## Intended uses & limitations
More information needed
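No usage example is provided; a minimal inference sketch, assuming the fine-tuned checkpoint is available on the Hub under this repository id and given a scanned document image, might look like:
```python
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="jordyvl/vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t5.0_a0.7",
)
print(classifier("scanned_document.png"))  # e.g. [{"label": "letter", "score": ...}, ...]
```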
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 4.2974 | 0.04 | 1.0725 | 7.3803 | 0.04 | 0.0315 | 0.2825 | 0.9545 |
| No log | 2.0 | 14 | 3.3036 | 0.095 | 0.9476 | 5.7516 | 0.095 | 0.0792 | 0.1590 | 0.8944 |
| No log | 3.0 | 21 | 2.9979 | 0.215 | 0.8918 | 5.3309 | 0.2150 | 0.1763 | 0.1514 | 0.6635 |
| No log | 4.0 | 28 | 2.5846 | 0.2875 | 0.7979 | 3.5812 | 0.2875 | 0.2707 | 0.1619 | 0.4840 |
| No log | 5.0 | 35 | 2.2908 | 0.3925 | 0.7162 | 3.1082 | 0.3925 | 0.3675 | 0.1724 | 0.3500 |
| No log | 6.0 | 42 | 2.1582 | 0.4275 | 0.6903 | 3.3486 | 0.4275 | 0.3933 | 0.1723 | 0.3106 |
| No log | 7.0 | 49 | 2.1910 | 0.445 | 0.7011 | 2.9994 | 0.445 | 0.4233 | 0.1889 | 0.3105 |
| No log | 8.0 | 56 | 2.0287 | 0.485 | 0.6673 | 2.8482 | 0.485 | 0.4803 | 0.1781 | 0.2848 |
| No log | 9.0 | 63 | 2.1037 | 0.4775 | 0.6684 | 2.8143 | 0.4775 | 0.4715 | 0.2010 | 0.2690 |
| No log | 10.0 | 70 | 2.1168 | 0.4825 | 0.6846 | 2.8143 | 0.4825 | 0.4703 | 0.2082 | 0.2818 |
| No log | 11.0 | 77 | 2.1094 | 0.495 | 0.6825 | 2.9020 | 0.495 | 0.4834 | 0.1990 | 0.2632 |
| No log | 12.0 | 84 | 2.0835 | 0.48 | 0.6897 | 2.7522 | 0.48 | 0.4651 | 0.2261 | 0.2741 |
| No log | 13.0 | 91 | 1.9606 | 0.505 | 0.6631 | 2.4794 | 0.505 | 0.4988 | 0.2034 | 0.2712 |
| No log | 14.0 | 98 | 1.9519 | 0.4975 | 0.6567 | 2.7608 | 0.4975 | 0.4833 | 0.2042 | 0.2563 |
| No log | 15.0 | 105 | 1.8794 | 0.52 | 0.6304 | 2.6588 | 0.52 | 0.5121 | 0.1814 | 0.2337 |
| No log | 16.0 | 112 | 1.7934 | 0.5375 | 0.6191 | 2.5142 | 0.5375 | 0.5234 | 0.1853 | 0.2272 |
| No log | 17.0 | 119 | 1.8110 | 0.5225 | 0.6242 | 2.5106 | 0.5225 | 0.5071 | 0.1918 | 0.2336 |
| No log | 18.0 | 126 | 1.8027 | 0.515 | 0.6283 | 2.4142 | 0.515 | 0.4983 | 0.2020 | 0.2359 |
| No log | 19.0 | 133 | 1.8123 | 0.5375 | 0.6318 | 2.5551 | 0.5375 | 0.5235 | 0.2132 | 0.2358 |
| No log | 20.0 | 140 | 1.7937 | 0.5225 | 0.6292 | 2.5237 | 0.5225 | 0.5145 | 0.2070 | 0.2336 |
| No log | 21.0 | 147 | 1.7272 | 0.5175 | 0.6046 | 2.4278 | 0.5175 | 0.5103 | 0.1806 | 0.2286 |
| No log | 22.0 | 154 | 1.8337 | 0.5325 | 0.6396 | 2.5603 | 0.5325 | 0.5136 | 0.1884 | 0.2405 |
| No log | 23.0 | 161 | 1.7416 | 0.5275 | 0.6102 | 2.5228 | 0.5275 | 0.5077 | 0.1784 | 0.2232 |
| No log | 24.0 | 168 | 1.7036 | 0.55 | 0.6063 | 2.4933 | 0.55 | 0.5380 | 0.1776 | 0.2230 |
| No log | 25.0 | 175 | 1.7330 | 0.545 | 0.6084 | 2.4943 | 0.545 | 0.5365 | 0.1989 | 0.2251 |
| No log | 26.0 | 182 | 1.6911 | 0.55 | 0.5993 | 2.4401 | 0.55 | 0.5416 | 0.1792 | 0.2208 |
| No log | 27.0 | 189 | 1.7329 | 0.5475 | 0.6162 | 2.4824 | 0.5475 | 0.5380 | 0.1830 | 0.2317 |
| No log | 28.0 | 196 | 1.6890 | 0.5475 | 0.5992 | 2.4828 | 0.5475 | 0.5401 | 0.1725 | 0.2178 |
| No log | 29.0 | 203 | 1.7256 | 0.5425 | 0.6124 | 2.5121 | 0.5425 | 0.5299 | 0.1765 | 0.2260 |
| No log | 30.0 | 210 | 1.6854 | 0.5375 | 0.5952 | 2.4275 | 0.5375 | 0.5282 | 0.2015 | 0.2163 |
| No log | 31.0 | 217 | 1.7010 | 0.5475 | 0.6030 | 2.4832 | 0.5475 | 0.5385 | 0.1862 | 0.2236 |
| No log | 32.0 | 224 | 1.6840 | 0.535 | 0.5934 | 2.4512 | 0.535 | 0.5310 | 0.1794 | 0.2182 |
| No log | 33.0 | 231 | 1.6808 | 0.545 | 0.6016 | 2.4424 | 0.545 | 0.5396 | 0.1828 | 0.2222 |
| No log | 34.0 | 238 | 1.6965 | 0.535 | 0.6000 | 2.5453 | 0.535 | 0.5270 | 0.1846 | 0.2243 |
| No log | 35.0 | 245 | 1.6650 | 0.545 | 0.5930 | 2.4901 | 0.545 | 0.5418 | 0.1706 | 0.2130 |
| No log | 36.0 | 252 | 1.6494 | 0.54 | 0.5979 | 2.4011 | 0.54 | 0.5319 | 0.1731 | 0.2201 |
| No log | 37.0 | 259 | 1.6738 | 0.54 | 0.5892 | 2.4632 | 0.54 | 0.5257 | 0.1781 | 0.2088 |
| No log | 38.0 | 266 | 1.6502 | 0.55 | 0.5889 | 2.4733 | 0.55 | 0.5416 | 0.1957 | 0.2087 |
| No log | 39.0 | 273 | 1.6539 | 0.55 | 0.5832 | 2.5105 | 0.55 | 0.5424 | 0.1584 | 0.2120 |
| No log | 40.0 | 280 | 1.6399 | 0.545 | 0.5919 | 2.4450 | 0.545 | 0.5439 | 0.1877 | 0.2168 |
| No log | 41.0 | 287 | 1.6968 | 0.535 | 0.6006 | 2.5310 | 0.535 | 0.5281 | 0.1895 | 0.2221 |
| No log | 42.0 | 294 | 1.6430 | 0.5425 | 0.5955 | 2.4020 | 0.5425 | 0.5449 | 0.1722 | 0.2203 |
| No log | 43.0 | 301 | 1.6743 | 0.535 | 0.5970 | 2.5138 | 0.535 | 0.5239 | 0.1869 | 0.2167 |
| No log | 44.0 | 308 | 1.6544 | 0.5475 | 0.5949 | 2.4259 | 0.5475 | 0.5408 | 0.1651 | 0.2164 |
| No log | 45.0 | 315 | 1.6763 | 0.535 | 0.5980 | 2.4379 | 0.535 | 0.5249 | 0.1868 | 0.2174 |
| No log | 46.0 | 322 | 1.6509 | 0.525 | 0.5933 | 2.4351 | 0.525 | 0.5137 | 0.1861 | 0.2194 |
| No log | 47.0 | 329 | 1.6530 | 0.5475 | 0.5929 | 2.4628 | 0.5475 | 0.5419 | 0.1859 | 0.2148 |
| No log | 48.0 | 336 | 1.6410 | 0.555 | 0.5835 | 2.4992 | 0.555 | 0.5490 | 0.1805 | 0.2110 |
| No log | 49.0 | 343 | 1.6398 | 0.5525 | 0.5857 | 2.5060 | 0.5525 | 0.5409 | 0.1706 | 0.2101 |
| No log | 50.0 | 350 | 1.6343 | 0.5525 | 0.5814 | 2.4890 | 0.5525 | 0.5442 | 0.1608 | 0.2065 |
| No log | 51.0 | 357 | 1.6335 | 0.5475 | 0.5846 | 2.4407 | 0.5475 | 0.5392 | 0.1720 | 0.2109 |
| No log | 52.0 | 364 | 1.6309 | 0.555 | 0.5844 | 2.4944 | 0.555 | 0.5488 | 0.1697 | 0.2091 |
| No log | 53.0 | 371 | 1.6308 | 0.5575 | 0.5826 | 2.4815 | 0.5575 | 0.5505 | 0.1704 | 0.2080 |
| No log | 54.0 | 378 | 1.6279 | 0.56 | 0.5832 | 2.4741 | 0.56 | 0.5525 | 0.1724 | 0.2067 |
| No log | 55.0 | 385 | 1.6226 | 0.55 | 0.5825 | 2.4048 | 0.55 | 0.5425 | 0.1656 | 0.2094 |
| No log | 56.0 | 392 | 1.6141 | 0.555 | 0.5797 | 2.4716 | 0.555 | 0.5474 | 0.1813 | 0.2076 |
| No log | 57.0 | 399 | 1.6179 | 0.56 | 0.5760 | 2.4682 | 0.56 | 0.5549 | 0.1474 | 0.2030 |
| No log | 58.0 | 406 | 1.6278 | 0.56 | 0.5831 | 2.4758 | 0.56 | 0.5540 | 0.1681 | 0.2075 |
| No log | 59.0 | 413 | 1.6257 | 0.5525 | 0.5817 | 2.4462 | 0.5525 | 0.5455 | 0.1648 | 0.2084 |
| No log | 60.0 | 420 | 1.6306 | 0.5575 | 0.5861 | 2.5090 | 0.5575 | 0.5505 | 0.1687 | 0.2108 |
| No log | 61.0 | 427 | 1.6314 | 0.555 | 0.5821 | 2.5024 | 0.555 | 0.5480 | 0.1725 | 0.2092 |
| No log | 62.0 | 434 | 1.6322 | 0.545 | 0.5846 | 2.4694 | 0.545 | 0.5407 | 0.1848 | 0.2123 |
| No log | 63.0 | 441 | 1.6229 | 0.5575 | 0.5829 | 2.4413 | 0.5575 | 0.5508 | 0.1698 | 0.2098 |
| No log | 64.0 | 448 | 1.6187 | 0.56 | 0.5809 | 2.4420 | 0.56 | 0.5522 | 0.1848 | 0.2083 |
| No log | 65.0 | 455 | 1.6160 | 0.555 | 0.5794 | 2.4349 | 0.555 | 0.5506 | 0.1761 | 0.2076 |
| No log | 66.0 | 462 | 1.6254 | 0.55 | 0.5822 | 2.4752 | 0.55 | 0.5432 | 0.1629 | 0.2086 |
| No log | 67.0 | 469 | 1.6259 | 0.55 | 0.5839 | 2.4425 | 0.55 | 0.5423 | 0.1769 | 0.2113 |
| No log | 68.0 | 476 | 1.6245 | 0.55 | 0.5822 | 2.4382 | 0.55 | 0.5421 | 0.1736 | 0.2091 |
| No log | 69.0 | 483 | 1.6243 | 0.5575 | 0.5830 | 2.4422 | 0.5575 | 0.5497 | 0.1808 | 0.2094 |
| No log | 70.0 | 490 | 1.6223 | 0.5575 | 0.5810 | 2.4787 | 0.5575 | 0.5507 | 0.1556 | 0.2089 |
| No log | 71.0 | 497 | 1.6204 | 0.5575 | 0.5809 | 2.4408 | 0.5575 | 0.5515 | 0.1555 | 0.2083 |
| 0.3852 | 72.0 | 504 | 1.6225 | 0.55 | 0.5816 | 2.4404 | 0.55 | 0.5424 | 0.1750 | 0.2104 |
| 0.3852 | 73.0 | 511 | 1.6237 | 0.55 | 0.5822 | 2.4403 | 0.55 | 0.5429 | 0.1772 | 0.2107 |
| 0.3852 | 74.0 | 518 | 1.6220 | 0.55 | 0.5815 | 2.4441 | 0.55 | 0.5420 | 0.1649 | 0.2105 |
| 0.3852 | 75.0 | 525 | 1.6228 | 0.5475 | 0.5818 | 2.4736 | 0.5475 | 0.5405 | 0.1882 | 0.2109 |
| 0.3852 | 76.0 | 532 | 1.6224 | 0.5525 | 0.5814 | 2.4442 | 0.5525 | 0.5446 | 0.1817 | 0.2108 |
| 0.3852 | 77.0 | 539 | 1.6225 | 0.5525 | 0.5815 | 2.4431 | 0.5525 | 0.5448 | 0.1798 | 0.2098 |
| 0.3852 | 78.0 | 546 | 1.6213 | 0.555 | 0.5812 | 2.4417 | 0.555 | 0.5471 | 0.1680 | 0.2096 |
| 0.3852 | 79.0 | 553 | 1.6208 | 0.5575 | 0.5808 | 2.4423 | 0.5575 | 0.5501 | 0.1784 | 0.2082 |
| 0.3852 | 80.0 | 560 | 1.6218 | 0.5525 | 0.5811 | 2.4425 | 0.5525 | 0.5447 | 0.1683 | 0.2095 |
| 0.3852 | 81.0 | 567 | 1.6225 | 0.5525 | 0.5814 | 2.4429 | 0.5525 | 0.5447 | 0.1856 | 0.2098 |
| 0.3852 | 82.0 | 574 | 1.6222 | 0.5575 | 0.5812 | 2.4469 | 0.5575 | 0.5501 | 0.1953 | 0.2085 |
| 0.3852 | 83.0 | 581 | 1.6219 | 0.555 | 0.5811 | 2.4442 | 0.555 | 0.5471 | 0.1940 | 0.2093 |
| 0.3852 | 84.0 | 588 | 1.6220 | 0.555 | 0.5813 | 2.4443 | 0.555 | 0.5471 | 0.1867 | 0.2095 |
| 0.3852 | 85.0 | 595 | 1.6223 | 0.555 | 0.5813 | 2.4446 | 0.555 | 0.5471 | 0.1885 | 0.2094 |
| 0.3852 | 86.0 | 602 | 1.6222 | 0.5525 | 0.5812 | 2.4448 | 0.5525 | 0.5447 | 0.1749 | 0.2095 |
| 0.3852 | 87.0 | 609 | 1.6222 | 0.555 | 0.5813 | 2.4454 | 0.555 | 0.5481 | 0.1745 | 0.2091 |
| 0.3852 | 88.0 | 616 | 1.6222 | 0.5575 | 0.5813 | 2.4446 | 0.5575 | 0.5504 | 0.1767 | 0.2087 |
| 0.3852 | 89.0 | 623 | 1.6222 | 0.5575 | 0.5813 | 2.4445 | 0.5575 | 0.5504 | 0.1839 | 0.2087 |
| 0.3852 | 90.0 | 630 | 1.6221 | 0.555 | 0.5812 | 2.4447 | 0.555 | 0.5481 | 0.1814 | 0.2091 |
| 0.3852 | 91.0 | 637 | 1.6222 | 0.5575 | 0.5813 | 2.4446 | 0.5575 | 0.5504 | 0.1790 | 0.2087 |
| 0.3852 | 92.0 | 644 | 1.6222 | 0.555 | 0.5813 | 2.4447 | 0.555 | 0.5481 | 0.1755 | 0.2091 |
| 0.3852 | 93.0 | 651 | 1.6223 | 0.5575 | 0.5813 | 2.4446 | 0.5575 | 0.5504 | 0.1747 | 0.2087 |
| 0.3852 | 94.0 | 658 | 1.6223 | 0.5575 | 0.5813 | 2.4449 | 0.5575 | 0.5504 | 0.1747 | 0.2087 |
| 0.3852 | 95.0 | 665 | 1.6223 | 0.555 | 0.5813 | 2.4448 | 0.555 | 0.5481 | 0.1732 | 0.2090 |
| 0.3852 | 96.0 | 672 | 1.6224 | 0.555 | 0.5813 | 2.4451 | 0.555 | 0.5481 | 0.1720 | 0.2091 |
| 0.3852 | 97.0 | 679 | 1.6223 | 0.5575 | 0.5813 | 2.4451 | 0.5575 | 0.5504 | 0.1735 | 0.2088 |
| 0.3852 | 98.0 | 686 | 1.6223 | 0.5575 | 0.5813 | 2.4451 | 0.5575 | 0.5504 | 0.1747 | 0.2087 |
| 0.3852 | 99.0 | 693 | 1.6224 | 0.555 | 0.5813 | 2.4451 | 0.555 | 0.5481 | 0.1732 | 0.2090 |
| 0.3852 | 100.0 | 700 | 1.6224 | 0.555 | 0.5813 | 2.4451 | 0.555 | 0.5481 | 0.1732 | 0.2090 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t5.0_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_rvl_cdip_100_examples_per_class_kd_CEKD_t5.0_a0.9
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5750
- Accuracy: 0.5325
- Brier Loss: 0.5990
- Nll: 2.5263
- F1 Micro: 0.5325
- F1 Macro: 0.5240
- Ece: 0.1659
- Aurc: 0.2152
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a scheduler sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
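With `warmup_ratio: 0.1`, the linear schedule ramps the learning rate up over the first 10% of optimizer steps and then decays it to zero. A sketch with the `transformers` scheduler helper, using the 700 total steps implied by the table below (7 optimizer steps per epoch for 100 epochs) and a dummy parameter for illustration:
```python
import torch
from transformers import get_linear_schedule_with_warmup

total_steps = 7 * 100  # 7 optimizer steps per epoch for 100 epochs (see table below)
optimizer = torch.optim.Adam(
    [torch.nn.Parameter(torch.zeros(1))], lr=1e-4, betas=(0.9, 0.999), eps=1e-8
)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=int(0.1 * total_steps),  # warmup_ratio = 0.1
    num_training_steps=total_steps,
)
```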
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 3.9285 | 0.04 | 1.0722 | 7.3572 | 0.04 | 0.0319 | 0.2792 | 0.9556 |
| No log | 2.0 | 14 | 3.0027 | 0.095 | 0.9510 | 5.8779 | 0.095 | 0.0766 | 0.1668 | 0.8900 |
| No log | 3.0 | 21 | 2.6988 | 0.225 | 0.8896 | 5.3750 | 0.225 | 0.1801 | 0.1669 | 0.6473 |
| No log | 4.0 | 28 | 2.3179 | 0.285 | 0.8016 | 3.5658 | 0.285 | 0.2657 | 0.1679 | 0.4916 |
| No log | 5.0 | 35 | 2.0566 | 0.37 | 0.7203 | 2.9834 | 0.37 | 0.3493 | 0.1684 | 0.3612 |
| No log | 6.0 | 42 | 1.9505 | 0.4325 | 0.6892 | 3.0719 | 0.4325 | 0.4127 | 0.1775 | 0.3084 |
| No log | 7.0 | 49 | 1.9995 | 0.4375 | 0.7008 | 3.1569 | 0.4375 | 0.4084 | 0.2032 | 0.3103 |
| No log | 8.0 | 56 | 1.9133 | 0.445 | 0.6906 | 2.8574 | 0.445 | 0.4464 | 0.2016 | 0.3062 |
| No log | 9.0 | 63 | 1.9876 | 0.4625 | 0.6918 | 2.9267 | 0.4625 | 0.4538 | 0.2228 | 0.2868 |
| No log | 10.0 | 70 | 2.0051 | 0.4725 | 0.6971 | 2.9249 | 0.4725 | 0.4553 | 0.2234 | 0.2814 |
| No log | 11.0 | 77 | 2.1834 | 0.465 | 0.7319 | 2.9998 | 0.465 | 0.4426 | 0.2444 | 0.3006 |
| No log | 12.0 | 84 | 1.9953 | 0.4825 | 0.7087 | 2.7128 | 0.4825 | 0.4731 | 0.2386 | 0.2980 |
| No log | 13.0 | 91 | 1.8834 | 0.4975 | 0.6771 | 2.6879 | 0.4975 | 0.4954 | 0.2240 | 0.2748 |
| No log | 14.0 | 98 | 1.9647 | 0.4675 | 0.6987 | 2.8305 | 0.4675 | 0.4429 | 0.2409 | 0.2902 |
| No log | 15.0 | 105 | 1.8810 | 0.5 | 0.6785 | 2.6402 | 0.5 | 0.4847 | 0.2171 | 0.2725 |
| No log | 16.0 | 112 | 1.8777 | 0.4875 | 0.6877 | 2.6940 | 0.4875 | 0.4871 | 0.2210 | 0.2846 |
| No log | 17.0 | 119 | 1.9260 | 0.4925 | 0.6796 | 2.7055 | 0.4925 | 0.4834 | 0.2012 | 0.2744 |
| No log | 18.0 | 126 | 1.7864 | 0.505 | 0.6547 | 2.6724 | 0.505 | 0.4912 | 0.2081 | 0.2434 |
| No log | 19.0 | 133 | 1.7618 | 0.4975 | 0.6430 | 2.5951 | 0.4975 | 0.4915 | 0.2172 | 0.2490 |
| No log | 20.0 | 140 | 1.7496 | 0.515 | 0.6513 | 2.5263 | 0.515 | 0.5025 | 0.1975 | 0.2502 |
| No log | 21.0 | 147 | 1.7082 | 0.5275 | 0.6438 | 2.4039 | 0.5275 | 0.5224 | 0.2017 | 0.2450 |
| No log | 22.0 | 154 | 1.7482 | 0.4975 | 0.6682 | 2.5194 | 0.4975 | 0.4911 | 0.2247 | 0.2571 |
| No log | 23.0 | 161 | 1.7377 | 0.5075 | 0.6482 | 2.4136 | 0.5075 | 0.4900 | 0.2221 | 0.2396 |
| No log | 24.0 | 168 | 1.7094 | 0.515 | 0.6372 | 2.5605 | 0.515 | 0.5083 | 0.2137 | 0.2474 |
| No log | 25.0 | 175 | 1.6884 | 0.5175 | 0.6422 | 2.5270 | 0.5175 | 0.5104 | 0.2111 | 0.2444 |
| No log | 26.0 | 182 | 1.6489 | 0.5275 | 0.6246 | 2.5344 | 0.5275 | 0.5211 | 0.2066 | 0.2333 |
| No log | 27.0 | 189 | 1.6165 | 0.53 | 0.6191 | 2.5418 | 0.53 | 0.5256 | 0.2021 | 0.2305 |
| No log | 28.0 | 196 | 1.6316 | 0.5275 | 0.6181 | 2.6568 | 0.5275 | 0.5212 | 0.2004 | 0.2300 |
| No log | 29.0 | 203 | 1.6595 | 0.5175 | 0.6306 | 2.4298 | 0.5175 | 0.5096 | 0.2020 | 0.2427 |
| No log | 30.0 | 210 | 1.6193 | 0.5325 | 0.6157 | 2.5455 | 0.5325 | 0.5272 | 0.1779 | 0.2278 |
| No log | 31.0 | 217 | 1.6517 | 0.5325 | 0.6274 | 2.4579 | 0.5325 | 0.5259 | 0.2006 | 0.2362 |
| No log | 32.0 | 224 | 1.6434 | 0.5325 | 0.6167 | 2.5805 | 0.5325 | 0.5229 | 0.1995 | 0.2273 |
| No log | 33.0 | 231 | 1.6660 | 0.5225 | 0.6269 | 2.6794 | 0.5225 | 0.5132 | 0.2244 | 0.2283 |
| No log | 34.0 | 238 | 1.6353 | 0.515 | 0.6194 | 2.6085 | 0.515 | 0.5069 | 0.1839 | 0.2303 |
| No log | 35.0 | 245 | 1.5920 | 0.5325 | 0.6051 | 2.5645 | 0.5325 | 0.5248 | 0.1868 | 0.2208 |
| No log | 36.0 | 252 | 1.5909 | 0.54 | 0.6028 | 2.4786 | 0.54 | 0.5323 | 0.1902 | 0.2194 |
| No log | 37.0 | 259 | 1.5730 | 0.5425 | 0.5983 | 2.4877 | 0.5425 | 0.5368 | 0.1799 | 0.2177 |
| No log | 38.0 | 266 | 1.5800 | 0.535 | 0.6029 | 2.4736 | 0.535 | 0.5282 | 0.1761 | 0.2196 |
| No log | 39.0 | 273 | 1.5594 | 0.54 | 0.5955 | 2.5093 | 0.54 | 0.5327 | 0.1900 | 0.2126 |
| No log | 40.0 | 280 | 1.5685 | 0.53 | 0.5979 | 2.6068 | 0.53 | 0.5208 | 0.1893 | 0.2173 |
| No log | 41.0 | 287 | 1.5757 | 0.53 | 0.5995 | 2.5655 | 0.53 | 0.5218 | 0.1862 | 0.2164 |
| No log | 42.0 | 294 | 1.5797 | 0.535 | 0.6039 | 2.5445 | 0.535 | 0.5273 | 0.1834 | 0.2182 |
| No log | 43.0 | 301 | 1.5900 | 0.53 | 0.6074 | 2.5201 | 0.53 | 0.5189 | 0.1747 | 0.2206 |
| No log | 44.0 | 308 | 1.5760 | 0.5325 | 0.5986 | 2.4974 | 0.5325 | 0.5225 | 0.1870 | 0.2148 |
| No log | 45.0 | 315 | 1.5768 | 0.53 | 0.6013 | 2.5174 | 0.53 | 0.5204 | 0.1979 | 0.2158 |
| No log | 46.0 | 322 | 1.5774 | 0.53 | 0.6011 | 2.5199 | 0.53 | 0.5206 | 0.1882 | 0.2165 |
| No log | 47.0 | 329 | 1.5714 | 0.54 | 0.5983 | 2.5329 | 0.54 | 0.5303 | 0.1884 | 0.2135 |
| No log | 48.0 | 336 | 1.5834 | 0.5325 | 0.6026 | 2.5253 | 0.5325 | 0.5238 | 0.1658 | 0.2190 |
| No log | 49.0 | 343 | 1.5724 | 0.5375 | 0.5979 | 2.5569 | 0.5375 | 0.5299 | 0.1617 | 0.2151 |
| No log | 50.0 | 350 | 1.5685 | 0.5375 | 0.5985 | 2.5189 | 0.5375 | 0.5285 | 0.1919 | 0.2151 |
| No log | 51.0 | 357 | 1.5708 | 0.54 | 0.5986 | 2.5002 | 0.54 | 0.5305 | 0.1755 | 0.2149 |
| No log | 52.0 | 364 | 1.5665 | 0.535 | 0.5977 | 2.5224 | 0.535 | 0.5267 | 0.1842 | 0.2160 |
| No log | 53.0 | 371 | 1.5713 | 0.5325 | 0.5993 | 2.5515 | 0.5325 | 0.5250 | 0.1753 | 0.2160 |
| No log | 54.0 | 378 | 1.5693 | 0.535 | 0.5986 | 2.5516 | 0.535 | 0.5276 | 0.1841 | 0.2158 |
| No log | 55.0 | 385 | 1.5693 | 0.5375 | 0.5984 | 2.5190 | 0.5375 | 0.5285 | 0.1842 | 0.2144 |
| No log | 56.0 | 392 | 1.5725 | 0.535 | 0.5992 | 2.5527 | 0.535 | 0.5262 | 0.1776 | 0.2150 |
| No log | 57.0 | 399 | 1.5674 | 0.5425 | 0.5976 | 2.5502 | 0.5425 | 0.5326 | 0.1902 | 0.2137 |
| No log | 58.0 | 406 | 1.5675 | 0.5375 | 0.5974 | 2.5517 | 0.5375 | 0.5288 | 0.1794 | 0.2139 |
| No log | 59.0 | 413 | 1.5713 | 0.535 | 0.5988 | 2.5515 | 0.535 | 0.5257 | 0.1791 | 0.2147 |
| No log | 60.0 | 420 | 1.5729 | 0.535 | 0.5988 | 2.5512 | 0.535 | 0.5262 | 0.1796 | 0.2148 |
| No log | 61.0 | 427 | 1.5702 | 0.5375 | 0.5976 | 2.5521 | 0.5375 | 0.5281 | 0.1817 | 0.2139 |
| No log | 62.0 | 434 | 1.5728 | 0.535 | 0.5988 | 2.5514 | 0.535 | 0.5266 | 0.1722 | 0.2149 |
| No log | 63.0 | 441 | 1.5720 | 0.5325 | 0.5985 | 2.5206 | 0.5325 | 0.5231 | 0.1790 | 0.2149 |
| No log | 64.0 | 448 | 1.5704 | 0.5325 | 0.5975 | 2.5510 | 0.5325 | 0.5236 | 0.1706 | 0.2139 |
| No log | 65.0 | 455 | 1.5724 | 0.5325 | 0.5986 | 2.5225 | 0.5325 | 0.5236 | 0.1557 | 0.2148 |
| No log | 66.0 | 462 | 1.5718 | 0.5325 | 0.5985 | 2.5246 | 0.5325 | 0.5241 | 0.1772 | 0.2148 |
| No log | 67.0 | 469 | 1.5710 | 0.5325 | 0.5981 | 2.5511 | 0.5325 | 0.5237 | 0.1625 | 0.2146 |
| No log | 68.0 | 476 | 1.5716 | 0.54 | 0.5981 | 2.5001 | 0.54 | 0.5304 | 0.1622 | 0.2141 |
| No log | 69.0 | 483 | 1.5732 | 0.5325 | 0.5988 | 2.5517 | 0.5325 | 0.5232 | 0.1641 | 0.2150 |
| No log | 70.0 | 490 | 1.5733 | 0.5325 | 0.5987 | 2.5522 | 0.5325 | 0.5237 | 0.1715 | 0.2149 |
| No log | 71.0 | 497 | 1.5729 | 0.5325 | 0.5985 | 2.5523 | 0.5325 | 0.5241 | 0.1670 | 0.2147 |
| 0.3153 | 72.0 | 504 | 1.5730 | 0.5325 | 0.5987 | 2.5236 | 0.5325 | 0.5237 | 0.1656 | 0.2149 |
| 0.3153 | 73.0 | 511 | 1.5723 | 0.5325 | 0.5985 | 2.5212 | 0.5325 | 0.5238 | 0.1893 | 0.2145 |
| 0.3153 | 74.0 | 518 | 1.5738 | 0.5325 | 0.5989 | 2.5515 | 0.5325 | 0.5238 | 0.1744 | 0.2147 |
| 0.3153 | 75.0 | 525 | 1.5740 | 0.5325 | 0.5988 | 2.5318 | 0.5325 | 0.5237 | 0.1683 | 0.2150 |
| 0.3153 | 76.0 | 532 | 1.5734 | 0.535 | 0.5985 | 2.5525 | 0.535 | 0.5261 | 0.1763 | 0.2145 |
| 0.3153 | 77.0 | 539 | 1.5740 | 0.5325 | 0.5989 | 2.5516 | 0.5325 | 0.5243 | 0.1726 | 0.2149 |
| 0.3153 | 78.0 | 546 | 1.5738 | 0.5325 | 0.5987 | 2.5289 | 0.5325 | 0.5241 | 0.1692 | 0.2148 |
| 0.3153 | 79.0 | 553 | 1.5736 | 0.5325 | 0.5987 | 2.5255 | 0.5325 | 0.5242 | 0.1807 | 0.2147 |
| 0.3153 | 80.0 | 560 | 1.5739 | 0.5325 | 0.5988 | 2.5522 | 0.5325 | 0.5237 | 0.1769 | 0.2150 |
| 0.3153 | 81.0 | 567 | 1.5743 | 0.5325 | 0.5989 | 2.5519 | 0.5325 | 0.5238 | 0.1837 | 0.2151 |
| 0.3153 | 82.0 | 574 | 1.5742 | 0.5325 | 0.5989 | 2.5232 | 0.5325 | 0.5240 | 0.1712 | 0.2149 |
| 0.3153 | 83.0 | 581 | 1.5744 | 0.5325 | 0.5989 | 2.5256 | 0.5325 | 0.5239 | 0.1803 | 0.2151 |
| 0.3153 | 84.0 | 588 | 1.5741 | 0.5325 | 0.5988 | 2.5233 | 0.5325 | 0.5233 | 0.1655 | 0.2147 |
| 0.3153 | 85.0 | 595 | 1.5747 | 0.5325 | 0.5990 | 2.5274 | 0.5325 | 0.5237 | 0.1696 | 0.2152 |
| 0.3153 | 86.0 | 602 | 1.5747 | 0.5325 | 0.5989 | 2.5263 | 0.5325 | 0.5238 | 0.1689 | 0.2150 |
| 0.3153 | 87.0 | 609 | 1.5745 | 0.5325 | 0.5989 | 2.5251 | 0.5325 | 0.5237 | 0.1654 | 0.2149 |
| 0.3153 | 88.0 | 616 | 1.5747 | 0.5325 | 0.5989 | 2.5283 | 0.5325 | 0.5241 | 0.1693 | 0.2151 |
| 0.3153 | 89.0 | 623 | 1.5748 | 0.5325 | 0.5990 | 2.5275 | 0.5325 | 0.5239 | 0.1596 | 0.2152 |
| 0.3153 | 90.0 | 630 | 1.5749 | 0.5325 | 0.5990 | 2.5278 | 0.5325 | 0.5240 | 0.1602 | 0.2151 |
| 0.3153 | 91.0 | 637 | 1.5750 | 0.5325 | 0.5990 | 2.5337 | 0.5325 | 0.5239 | 0.1623 | 0.2152 |
| 0.3153 | 92.0 | 644 | 1.5749 | 0.5325 | 0.5990 | 2.5272 | 0.5325 | 0.5238 | 0.1653 | 0.2151 |
| 0.3153 | 93.0 | 651 | 1.5751 | 0.5325 | 0.5990 | 2.5281 | 0.5325 | 0.5240 | 0.1663 | 0.2149 |
| 0.3153 | 94.0 | 658 | 1.5750 | 0.5325 | 0.5990 | 2.5249 | 0.5325 | 0.5239 | 0.1715 | 0.2152 |
| 0.3153 | 95.0 | 665 | 1.5749 | 0.535 | 0.5990 | 2.5257 | 0.535 | 0.5263 | 0.1625 | 0.2149 |
| 0.3153 | 96.0 | 672 | 1.5750 | 0.5325 | 0.5990 | 2.5266 | 0.5325 | 0.5239 | 0.1655 | 0.2151 |
| 0.3153 | 97.0 | 679 | 1.5750 | 0.5325 | 0.5990 | 2.5268 | 0.5325 | 0.5239 | 0.1686 | 0.2152 |
| 0.3153 | 98.0 | 686 | 1.5750 | 0.5325 | 0.5990 | 2.5275 | 0.5325 | 0.5240 | 0.1664 | 0.2152 |
| 0.3153 | 99.0 | 693 | 1.5750 | 0.5325 | 0.5990 | 2.5269 | 0.5325 | 0.5240 | 0.1678 | 0.2152 |
| 0.3153 | 100.0 | 700 | 1.5750 | 0.5325 | 0.5990 | 2.5263 | 0.5325 | 0.5240 | 0.1659 | 0.2152 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
ayanban011/vit-base_tobacco_bs_16_lr_1e-5_e_200_wr_0.05_wd_0.4_split
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base_tobacco_bs_16_lr_1e-5_e_200_wr_0.05_wd_0.4_split
This model is a fine-tuned version of [jordyvl/vit-base_tobacco](https://huggingface.co/jordyvl/vit-base_tobacco) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0411
- Accuracy: 0.8333
- Brier Loss: 0.3084
- Nll: 1.3568
- F1 Micro: 0.8333
- F1 Macro: 0.8183
- Ece: 0.1563
- Aurc: 0.0847
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 200
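The `gradient_accumulation_steps: 16` entry above is what turns the per-device batch of 4 into the total train batch of 64 (4 × 16). A minimal PyTorch sketch of that accumulation pattern, with `model`, `optimizer`, and `loader` as hypothetical stand-ins for the actual training setup:
```python
accum_steps = 16  # per-device batch 4 * 16 accumulation steps = 64 effective batch

for step, batch in enumerate(loader):  # hypothetical DataLoader yielding input dicts
    loss = model(**batch).loss
    (loss / accum_steps).backward()    # scale so the summed gradient matches one batch of 64
    if (step + 1) % accum_steps == 0:
        optimizer.step()               # one optimizer update per 64 examples
        optimizer.zero_grad()
```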
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:------:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.99 | 43 | 0.7544 | 0.7960 | 0.3088 | 1.3391 | 0.7960 | 0.7715 | 0.1991 | 0.0817 |
| No log | 2.0 | 87 | 0.7158 | 0.8218 | 0.2920 | 1.1888 | 0.8218 | 0.7941 | 0.1863 | 0.0741 |
| No log | 2.98 | 130 | 0.7144 | 0.7989 | 0.2932 | 1.2958 | 0.7989 | 0.7701 | 0.1628 | 0.0749 |
| No log | 3.99 | 174 | 0.6762 | 0.8305 | 0.2749 | 1.1916 | 0.8305 | 0.8076 | 0.1844 | 0.0678 |
| No log | 4.98 | 217 | 0.6710 | 0.8362 | 0.2745 | 1.0739 | 0.8362 | 0.8076 | 0.1696 | 0.0664 |
| No log | 5.99 | 261 | 0.6532 | 0.8362 | 0.2675 | 1.0011 | 0.8362 | 0.8115 | 0.1750 | 0.0602 |
| No log | 6.98 | 304 | 0.6404 | 0.8362 | 0.2635 | 1.0072 | 0.8362 | 0.8106 | 0.1714 | 0.0633 |
| No log | 7.99 | 348 | 0.6635 | 0.8218 | 0.2707 | 1.0903 | 0.8218 | 0.8030 | 0.1513 | 0.0770 |
| No log | 9.0 | 392 | 0.6167 | 0.8420 | 0.2534 | 1.0176 | 0.8420 | 0.8259 | 0.1613 | 0.0796 |
| No log | 9.99 | 435 | 0.6496 | 0.8276 | 0.2703 | 0.9646 | 0.8276 | 0.8085 | 0.1643 | 0.0588 |
| No log | 11.0 | 479 | 0.6091 | 0.8506 | 0.2467 | 1.1036 | 0.8506 | 0.8308 | 0.1483 | 0.0650 |
| 0.4309 | 11.98 | 522 | 0.6075 | 0.8420 | 0.2483 | 0.9144 | 0.8420 | 0.8246 | 0.1391 | 0.0519 |
| 0.4309 | 12.99 | 566 | 0.6164 | 0.8276 | 0.2576 | 0.9703 | 0.8276 | 0.8092 | 0.1467 | 0.0645 |
| 0.4309 | 13.98 | 609 | 0.5893 | 0.8592 | 0.2347 | 1.1493 | 0.8592 | 0.8483 | 0.1347 | 0.0715 |
| 0.4309 | 14.99 | 653 | 0.6123 | 0.8477 | 0.2485 | 1.1889 | 0.8477 | 0.8232 | 0.1587 | 0.0764 |
| 0.4309 | 16.0 | 697 | 0.6352 | 0.8420 | 0.2615 | 1.1999 | 0.8420 | 0.8403 | 0.1368 | 0.0668 |
| 0.4309 | 16.99 | 740 | 0.6329 | 0.8333 | 0.2625 | 1.1748 | 0.8333 | 0.8249 | 0.1267 | 0.0744 |
| 0.4309 | 18.0 | 784 | 0.6350 | 0.8448 | 0.2590 | 1.2154 | 0.8448 | 0.8386 | 0.1423 | 0.0688 |
| 0.4309 | 18.98 | 827 | 0.5892 | 0.8592 | 0.2383 | 1.1001 | 0.8592 | 0.8515 | 0.1293 | 0.0630 |
| 0.4309 | 19.99 | 871 | 0.5981 | 0.8477 | 0.2476 | 1.0104 | 0.8477 | 0.8375 | 0.1345 | 0.0630 |
| 0.4309 | 20.98 | 914 | 0.6484 | 0.8420 | 0.2642 | 1.3553 | 0.8420 | 0.8292 | 0.1490 | 0.0770 |
| 0.4309 | 21.99 | 958 | 0.6298 | 0.8305 | 0.2657 | 1.1220 | 0.8305 | 0.8208 | 0.1292 | 0.0670 |
| 0.1285 | 22.98 | 1001 | 0.6325 | 0.8391 | 0.2633 | 1.2549 | 0.8391 | 0.8362 | 0.1328 | 0.0708 |
| 0.1285 | 23.99 | 1045 | 0.6032 | 0.8534 | 0.2486 | 1.1258 | 0.8534 | 0.8444 | 0.1229 | 0.0706 |
| 0.1285 | 25.0 | 1089 | 0.6080 | 0.8534 | 0.2460 | 1.2033 | 0.8534 | 0.8414 | 0.1257 | 0.0755 |
| 0.1285 | 25.99 | 1132 | 0.6321 | 0.8391 | 0.2667 | 1.2242 | 0.8391 | 0.8355 | 0.1349 | 0.0697 |
| 0.1285 | 27.0 | 1176 | 0.6325 | 0.8592 | 0.2522 | 1.2029 | 0.8592 | 0.8493 | 0.1278 | 0.0778 |
| 0.1285 | 27.98 | 1219 | 0.6585 | 0.8534 | 0.2546 | 1.3669 | 0.8534 | 0.8378 | 0.1368 | 0.0890 |
| 0.1285 | 28.99 | 1263 | 0.6302 | 0.8563 | 0.2517 | 1.2419 | 0.8563 | 0.8508 | 0.1294 | 0.0751 |
| 0.1285 | 29.98 | 1306 | 0.6663 | 0.8477 | 0.2637 | 1.4132 | 0.8477 | 0.8339 | 0.1399 | 0.0828 |
| 0.1285 | 30.99 | 1350 | 0.7063 | 0.8362 | 0.2799 | 1.4323 | 0.8362 | 0.8330 | 0.1441 | 0.0863 |
| 0.1285 | 32.0 | 1394 | 0.6564 | 0.8506 | 0.2570 | 1.1583 | 0.8506 | 0.8417 | 0.1358 | 0.0847 |
| 0.1285 | 32.99 | 1437 | 0.6738 | 0.8477 | 0.2647 | 1.3855 | 0.8477 | 0.8398 | 0.1305 | 0.0775 |
| 0.1285 | 34.0 | 1481 | 0.6528 | 0.8563 | 0.2559 | 1.2601 | 0.8563 | 0.8462 | 0.1310 | 0.0789 |
| 0.0385 | 34.98 | 1524 | 0.6534 | 0.8563 | 0.2537 | 1.2931 | 0.8563 | 0.8461 | 0.1241 | 0.0773 |
| 0.0385 | 35.99 | 1568 | 0.6541 | 0.8534 | 0.2525 | 1.2589 | 0.8534 | 0.8449 | 0.1315 | 0.0833 |
| 0.0385 | 36.98 | 1611 | 0.6769 | 0.8592 | 0.2545 | 1.4351 | 0.8592 | 0.8492 | 0.1242 | 0.0792 |
| 0.0385 | 37.99 | 1655 | 0.6824 | 0.8592 | 0.2576 | 1.2241 | 0.8592 | 0.8472 | 0.1327 | 0.0810 |
| 0.0385 | 38.98 | 1698 | 0.6843 | 0.8563 | 0.2589 | 1.3394 | 0.8563 | 0.8450 | 0.1311 | 0.0802 |
| 0.0385 | 39.99 | 1742 | 0.6964 | 0.8506 | 0.2630 | 1.2625 | 0.8506 | 0.8405 | 0.1310 | 0.0789 |
| 0.0385 | 41.0 | 1786 | 0.7051 | 0.8534 | 0.2671 | 1.3296 | 0.8534 | 0.8434 | 0.1353 | 0.0794 |
| 0.0385 | 41.99 | 1829 | 0.7006 | 0.8506 | 0.2645 | 1.2965 | 0.8506 | 0.8400 | 0.1373 | 0.0796 |
| 0.0385 | 43.0 | 1873 | 0.7054 | 0.8563 | 0.2646 | 1.2973 | 0.8563 | 0.8450 | 0.1313 | 0.0790 |
| 0.0385 | 43.98 | 1916 | 0.7143 | 0.8506 | 0.2673 | 1.2640 | 0.8506 | 0.8399 | 0.1359 | 0.0803 |
| 0.0385 | 44.99 | 1960 | 0.7168 | 0.8534 | 0.2665 | 1.3058 | 0.8534 | 0.8429 | 0.1389 | 0.0820 |
| 0.0206 | 45.98 | 2003 | 0.7204 | 0.8506 | 0.2669 | 1.3009 | 0.8506 | 0.8384 | 0.1336 | 0.0805 |
| 0.0206 | 46.99 | 2047 | 0.7265 | 0.8534 | 0.2683 | 1.2633 | 0.8534 | 0.8415 | 0.1319 | 0.0806 |
| 0.0206 | 48.0 | 2091 | 0.7311 | 0.8506 | 0.2695 | 1.2725 | 0.8506 | 0.8396 | 0.1372 | 0.0811 |
| 0.0206 | 48.99 | 2134 | 0.7384 | 0.8477 | 0.2729 | 1.3385 | 0.8477 | 0.8364 | 0.1387 | 0.0807 |
| 0.0206 | 50.0 | 2178 | 0.7383 | 0.8534 | 0.2695 | 1.1951 | 0.8534 | 0.8406 | 0.1344 | 0.0827 |
| 0.0206 | 50.98 | 2221 | 0.7440 | 0.8506 | 0.2740 | 1.3360 | 0.8506 | 0.8394 | 0.1418 | 0.0812 |
| 0.0206 | 51.99 | 2265 | 0.7455 | 0.8506 | 0.2727 | 1.2704 | 0.8506 | 0.8388 | 0.1351 | 0.0816 |
| 0.0206 | 52.98 | 2308 | 0.7474 | 0.8506 | 0.2708 | 1.2622 | 0.8506 | 0.8384 | 0.1334 | 0.0823 |
| 0.0206 | 53.99 | 2352 | 0.7581 | 0.8477 | 0.2750 | 1.3446 | 0.8477 | 0.8374 | 0.1406 | 0.0826 |
| 0.0206 | 54.98 | 2395 | 0.7571 | 0.8477 | 0.2751 | 1.3703 | 0.8477 | 0.8363 | 0.1378 | 0.0814 |
| 0.0206 | 55.99 | 2439 | 0.7618 | 0.8477 | 0.2752 | 1.3702 | 0.8477 | 0.8363 | 0.1363 | 0.0827 |
| 0.0206 | 57.0 | 2483 | 0.7638 | 0.8477 | 0.2749 | 1.3774 | 0.8477 | 0.8363 | 0.1394 | 0.0819 |
| 0.0135 | 57.99 | 2526 | 0.7693 | 0.8477 | 0.2760 | 1.3370 | 0.8477 | 0.8363 | 0.1378 | 0.0824 |
| 0.0135 | 59.0 | 2570 | 0.7724 | 0.8448 | 0.2779 | 1.3710 | 0.8448 | 0.8344 | 0.1431 | 0.0823 |
| 0.0135 | 59.98 | 2613 | 0.7780 | 0.8477 | 0.2784 | 1.3328 | 0.8477 | 0.8363 | 0.1463 | 0.0828 |
| 0.0135 | 60.99 | 2657 | 0.7818 | 0.8477 | 0.2795 | 1.3289 | 0.8477 | 0.8363 | 0.1466 | 0.0828 |
| 0.0135 | 61.98 | 2700 | 0.7847 | 0.8420 | 0.2805 | 1.3308 | 0.8420 | 0.8308 | 0.1418 | 0.0830 |
| 0.0135 | 62.99 | 2744 | 0.7851 | 0.8448 | 0.2782 | 1.3650 | 0.8448 | 0.8344 | 0.1411 | 0.0834 |
| 0.0135 | 64.0 | 2788 | 0.7925 | 0.8420 | 0.2829 | 1.4383 | 0.8420 | 0.8319 | 0.1425 | 0.0821 |
| 0.0135 | 64.99 | 2831 | 0.7959 | 0.8448 | 0.2826 | 1.4130 | 0.8448 | 0.8353 | 0.1431 | 0.0826 |
| 0.0135 | 66.0 | 2875 | 0.7989 | 0.8420 | 0.2821 | 1.4040 | 0.8420 | 0.8285 | 0.1446 | 0.0833 |
| 0.0135 | 66.98 | 2918 | 0.7996 | 0.8477 | 0.2807 | 1.3296 | 0.8477 | 0.8363 | 0.1464 | 0.0837 |
| 0.0135 | 67.99 | 2962 | 0.8042 | 0.8448 | 0.2824 | 1.3637 | 0.8448 | 0.8344 | 0.1434 | 0.0837 |
| 0.0097 | 68.98 | 3005 | 0.8095 | 0.8391 | 0.2845 | 1.3635 | 0.8391 | 0.8275 | 0.1468 | 0.0835 |
| 0.0097 | 69.99 | 3049 | 0.8073 | 0.8448 | 0.2824 | 1.3640 | 0.8448 | 0.8344 | 0.1413 | 0.0833 |
| 0.0097 | 70.98 | 3092 | 0.8140 | 0.8477 | 0.2834 | 1.3617 | 0.8477 | 0.8363 | 0.1444 | 0.0837 |
| 0.0097 | 71.99 | 3136 | 0.8152 | 0.8420 | 0.2842 | 1.4009 | 0.8420 | 0.8277 | 0.1439 | 0.0840 |
| 0.0097 | 73.0 | 3180 | 0.8163 | 0.8391 | 0.2858 | 1.4029 | 0.8391 | 0.8246 | 0.1482 | 0.0836 |
| 0.0097 | 73.99 | 3223 | 0.8192 | 0.8391 | 0.2844 | 1.3644 | 0.8391 | 0.8240 | 0.1475 | 0.0843 |
| 0.0097 | 75.0 | 3267 | 0.8225 | 0.8448 | 0.2836 | 1.3593 | 0.8448 | 0.8344 | 0.1473 | 0.0847 |
| 0.0097 | 75.98 | 3310 | 0.8267 | 0.8362 | 0.2859 | 1.3642 | 0.8362 | 0.8207 | 0.1473 | 0.0840 |
| 0.0097 | 76.99 | 3354 | 0.8275 | 0.8391 | 0.2847 | 1.3618 | 0.8391 | 0.8240 | 0.1450 | 0.0849 |
| 0.0097 | 77.98 | 3397 | 0.8325 | 0.8362 | 0.2879 | 1.3686 | 0.8362 | 0.8207 | 0.1491 | 0.0843 |
| 0.0097 | 78.99 | 3441 | 0.8389 | 0.8448 | 0.2885 | 1.3629 | 0.8448 | 0.8329 | 0.1504 | 0.0833 |
| 0.0097 | 80.0 | 3485 | 0.8420 | 0.8420 | 0.2887 | 1.3610 | 0.8420 | 0.8261 | 0.1458 | 0.0837 |
| 0.0073 | 80.99 | 3528 | 0.8452 | 0.8362 | 0.2900 | 1.4064 | 0.8362 | 0.8221 | 0.1488 | 0.0833 |
| 0.0073 | 82.0 | 3572 | 0.8492 | 0.8362 | 0.2898 | 1.4076 | 0.8362 | 0.8221 | 0.1500 | 0.0837 |
| 0.0073 | 82.98 | 3615 | 0.8478 | 0.8362 | 0.2895 | 1.3609 | 0.8362 | 0.8207 | 0.1485 | 0.0847 |
| 0.0073 | 83.99 | 3659 | 0.8483 | 0.8391 | 0.2880 | 1.3622 | 0.8391 | 0.8243 | 0.1480 | 0.0842 |
| 0.0073 | 84.98 | 3702 | 0.8534 | 0.8420 | 0.2892 | 1.3609 | 0.8420 | 0.8261 | 0.1468 | 0.0843 |
| 0.0073 | 85.99 | 3746 | 0.8547 | 0.8333 | 0.2898 | 1.4028 | 0.8333 | 0.8186 | 0.1513 | 0.0846 |
| 0.0073 | 86.98 | 3789 | 0.8618 | 0.8391 | 0.2906 | 1.3597 | 0.8391 | 0.8243 | 0.1445 | 0.0846 |
| 0.0073 | 87.99 | 3833 | 0.8594 | 0.8420 | 0.2885 | 1.3265 | 0.8420 | 0.8311 | 0.1462 | 0.0848 |
| 0.0073 | 89.0 | 3877 | 0.8669 | 0.8391 | 0.2911 | 1.3592 | 0.8391 | 0.8243 | 0.1471 | 0.0843 |
| 0.0073 | 89.99 | 3920 | 0.8664 | 0.8391 | 0.2901 | 1.3597 | 0.8391 | 0.8243 | 0.1468 | 0.0852 |
| 0.0073 | 91.0 | 3964 | 0.8678 | 0.8420 | 0.2905 | 1.3253 | 0.8420 | 0.8296 | 0.1462 | 0.0854 |
| 0.0057 | 91.98 | 4007 | 0.8719 | 0.8391 | 0.2909 | 1.3585 | 0.8391 | 0.8243 | 0.1475 | 0.0853 |
| 0.0057 | 92.99 | 4051 | 0.8768 | 0.8391 | 0.2930 | 1.3595 | 0.8391 | 0.8243 | 0.1493 | 0.0852 |
| 0.0057 | 93.98 | 4094 | 0.8785 | 0.8333 | 0.2928 | 1.4034 | 0.8333 | 0.8203 | 0.1529 | 0.0849 |
| 0.0057 | 94.99 | 4138 | 0.8859 | 0.8333 | 0.2942 | 1.3684 | 0.8333 | 0.8183 | 0.1543 | 0.0844 |
| 0.0057 | 96.0 | 4182 | 0.8839 | 0.8362 | 0.2937 | 1.3597 | 0.8362 | 0.8221 | 0.1497 | 0.0852 |
| 0.0057 | 96.99 | 4225 | 0.8864 | 0.8333 | 0.2940 | 1.4012 | 0.8333 | 0.8203 | 0.1532 | 0.0850 |
| 0.0057 | 98.0 | 4269 | 0.8879 | 0.8362 | 0.2941 | 1.3607 | 0.8362 | 0.8221 | 0.1504 | 0.0849 |
| 0.0057 | 98.98 | 4312 | 0.8921 | 0.8333 | 0.2954 | 1.3609 | 0.8333 | 0.8183 | 0.1521 | 0.0851 |
| 0.0057 | 99.99 | 4356 | 0.8949 | 0.8391 | 0.2945 | 1.3575 | 0.8391 | 0.8243 | 0.1491 | 0.0854 |
| 0.0057 | 100.98 | 4399 | 0.8945 | 0.8362 | 0.2945 | 1.3591 | 0.8362 | 0.8221 | 0.1500 | 0.0856 |
| 0.0057 | 101.99 | 4443 | 0.8985 | 0.8333 | 0.2944 | 1.3599 | 0.8333 | 0.8183 | 0.1530 | 0.0854 |
| 0.0057 | 102.98 | 4486 | 0.8987 | 0.8391 | 0.2951 | 1.3586 | 0.8391 | 0.8246 | 0.1499 | 0.0850 |
| 0.0045 | 103.99 | 4530 | 0.9025 | 0.8362 | 0.2957 | 1.3592 | 0.8362 | 0.8221 | 0.1510 | 0.0857 |
| 0.0045 | 105.0 | 4574 | 0.9082 | 0.8305 | 0.2972 | 1.3625 | 0.8305 | 0.8165 | 0.1568 | 0.0852 |
| 0.0045 | 105.99 | 4617 | 0.9087 | 0.8362 | 0.2958 | 1.3579 | 0.8362 | 0.8221 | 0.1505 | 0.0858 |
| 0.0045 | 107.0 | 4661 | 0.9105 | 0.8305 | 0.2977 | 1.3619 | 0.8305 | 0.8165 | 0.1561 | 0.0844 |
| 0.0045 | 107.98 | 4704 | 0.9136 | 0.8305 | 0.2978 | 1.3994 | 0.8305 | 0.8165 | 0.1559 | 0.0851 |
| 0.0045 | 108.99 | 4748 | 0.9148 | 0.8391 | 0.2968 | 1.3573 | 0.8391 | 0.8243 | 0.1504 | 0.0856 |
| 0.0045 | 109.98 | 4791 | 0.9188 | 0.8333 | 0.2974 | 1.3569 | 0.8333 | 0.8183 | 0.1532 | 0.0850 |
| 0.0045 | 110.99 | 4835 | 0.9164 | 0.8362 | 0.2959 | 1.3595 | 0.8362 | 0.8221 | 0.1507 | 0.0857 |
| 0.0045 | 112.0 | 4879 | 0.9221 | 0.8333 | 0.2977 | 1.3573 | 0.8333 | 0.8183 | 0.1550 | 0.0857 |
| 0.0045 | 112.99 | 4922 | 0.9256 | 0.8305 | 0.2990 | 1.3599 | 0.8305 | 0.8165 | 0.1574 | 0.0852 |
| 0.0045 | 114.0 | 4966 | 0.9284 | 0.8305 | 0.2994 | 1.3610 | 0.8305 | 0.8165 | 0.1572 | 0.0848 |
| 0.0037 | 114.98 | 5009 | 0.9312 | 0.8333 | 0.2998 | 1.3565 | 0.8333 | 0.8183 | 0.1537 | 0.0857 |
| 0.0037 | 115.99 | 5053 | 0.9322 | 0.8333 | 0.2995 | 1.3583 | 0.8333 | 0.8183 | 0.1543 | 0.0852 |
| 0.0037 | 116.98 | 5096 | 0.9385 | 0.8305 | 0.3007 | 1.3593 | 0.8305 | 0.8165 | 0.1577 | 0.0852 |
| 0.0037 | 117.99 | 5140 | 0.9386 | 0.8305 | 0.3009 | 1.4329 | 0.8305 | 0.8165 | 0.1582 | 0.0851 |
| 0.0037 | 118.98 | 5183 | 0.9386 | 0.8333 | 0.2996 | 1.3570 | 0.8333 | 0.8183 | 0.1542 | 0.0855 |
| 0.0037 | 119.99 | 5227 | 0.9406 | 0.8333 | 0.2995 | 1.3554 | 0.8333 | 0.8183 | 0.1540 | 0.0848 |
| 0.0037 | 121.0 | 5271 | 0.9442 | 0.8305 | 0.3006 | 1.3589 | 0.8305 | 0.8165 | 0.1570 | 0.0849 |
| 0.0037 | 121.99 | 5314 | 0.9435 | 0.8333 | 0.3000 | 1.3551 | 0.8333 | 0.8183 | 0.1546 | 0.0855 |
| 0.0037 | 123.0 | 5358 | 0.9456 | 0.8333 | 0.2996 | 1.3550 | 0.8333 | 0.8183 | 0.1544 | 0.0848 |
| 0.0037 | 123.98 | 5401 | 0.9490 | 0.8333 | 0.3008 | 1.3561 | 0.8333 | 0.8183 | 0.1547 | 0.0850 |
| 0.0037 | 124.99 | 5445 | 0.9500 | 0.8333 | 0.3011 | 1.3592 | 0.8333 | 0.8183 | 0.1551 | 0.0846 |
| 0.0037 | 125.98 | 5488 | 0.9513 | 0.8333 | 0.3003 | 1.3549 | 0.8333 | 0.8183 | 0.1544 | 0.0845 |
| 0.0031 | 126.99 | 5532 | 0.9575 | 0.8305 | 0.3024 | 1.3580 | 0.8305 | 0.8165 | 0.1581 | 0.0849 |
| 0.0031 | 128.0 | 5576 | 0.9593 | 0.8305 | 0.3025 | 1.4028 | 0.8305 | 0.8165 | 0.1591 | 0.0851 |
| 0.0031 | 128.99 | 5619 | 0.9594 | 0.8305 | 0.3021 | 1.3619 | 0.8305 | 0.8165 | 0.1579 | 0.0849 |
| 0.0031 | 130.0 | 5663 | 0.9628 | 0.8305 | 0.3025 | 1.3589 | 0.8305 | 0.8165 | 0.1587 | 0.0847 |
| 0.0031 | 130.98 | 5706 | 0.9652 | 0.8305 | 0.3031 | 1.3599 | 0.8305 | 0.8165 | 0.1593 | 0.0844 |
| 0.0031 | 131.99 | 5750 | 0.9646 | 0.8362 | 0.3005 | 1.3353 | 0.8362 | 0.8205 | 0.1520 | 0.0851 |
| 0.0031 | 132.98 | 5793 | 0.9658 | 0.8333 | 0.3021 | 1.3562 | 0.8333 | 0.8183 | 0.1555 | 0.0849 |
| 0.0031 | 133.99 | 5837 | 0.9698 | 0.8333 | 0.3023 | 1.3545 | 0.8333 | 0.8183 | 0.1554 | 0.0845 |
| 0.0031 | 134.98 | 5880 | 0.9716 | 0.8333 | 0.3032 | 1.3559 | 0.8333 | 0.8183 | 0.1555 | 0.0852 |
| 0.0031 | 135.99 | 5924 | 0.9736 | 0.8305 | 0.3037 | 1.3624 | 0.8305 | 0.8165 | 0.1584 | 0.0849 |
| 0.0031 | 137.0 | 5968 | 0.9760 | 0.8333 | 0.3039 | 1.3575 | 0.8333 | 0.8183 | 0.1551 | 0.0845 |
| 0.0026 | 137.99 | 6011 | 0.9789 | 0.8305 | 0.3041 | 1.3569 | 0.8305 | 0.8165 | 0.1592 | 0.0848 |
| 0.0026 | 139.0 | 6055 | 0.9801 | 0.8305 | 0.3040 | 1.3574 | 0.8305 | 0.8165 | 0.1598 | 0.0854 |
| 0.0026 | 139.98 | 6098 | 0.9806 | 0.8333 | 0.3035 | 1.3552 | 0.8333 | 0.8183 | 0.1557 | 0.0852 |
| 0.0026 | 140.99 | 6142 | 0.9835 | 0.8333 | 0.3041 | 1.3574 | 0.8333 | 0.8183 | 0.1564 | 0.0846 |
| 0.0026 | 141.98 | 6185 | 0.9838 | 0.8333 | 0.3037 | 1.3549 | 0.8333 | 0.8183 | 0.1557 | 0.0849 |
| 0.0026 | 142.99 | 6229 | 0.9872 | 0.8333 | 0.3044 | 1.3544 | 0.8333 | 0.8183 | 0.1557 | 0.0851 |
| 0.0026 | 144.0 | 6273 | 0.9900 | 0.8305 | 0.3056 | 1.3654 | 0.8305 | 0.8165 | 0.1597 | 0.0847 |
| 0.0026 | 144.99 | 6316 | 0.9907 | 0.8333 | 0.3049 | 1.3551 | 0.8333 | 0.8183 | 0.1565 | 0.0854 |
| 0.0026 | 146.0 | 6360 | 0.9896 | 0.8333 | 0.3044 | 1.3569 | 0.8333 | 0.8183 | 0.1563 | 0.0843 |
| 0.0026 | 146.98 | 6403 | 0.9938 | 0.8333 | 0.3053 | 1.3550 | 0.8333 | 0.8183 | 0.1562 | 0.0844 |
| 0.0026 | 147.99 | 6447 | 0.9962 | 0.8305 | 0.3056 | 1.3615 | 0.8305 | 0.8165 | 0.1594 | 0.0844 |
| 0.0026 | 148.98 | 6490 | 0.9954 | 0.8305 | 0.3051 | 1.3601 | 0.8305 | 0.8165 | 0.1590 | 0.0847 |
| 0.0022 | 149.99 | 6534 | 0.9961 | 0.8333 | 0.3043 | 1.3550 | 0.8333 | 0.8183 | 0.1554 | 0.0847 |
| 0.0022 | 150.98 | 6577 | 1.0026 | 0.8333 | 0.3059 | 1.3555 | 0.8333 | 0.8183 | 0.1563 | 0.0853 |
| 0.0022 | 151.99 | 6621 | 1.0004 | 0.8333 | 0.3049 | 1.3544 | 0.8333 | 0.8183 | 0.1566 | 0.0847 |
| 0.0022 | 153.0 | 6665 | 1.0024 | 0.8305 | 0.3058 | 1.3606 | 0.8305 | 0.8165 | 0.1595 | 0.0846 |
| 0.0022 | 153.99 | 6708 | 1.0054 | 0.8305 | 0.3064 | 1.3598 | 0.8305 | 0.8165 | 0.1591 | 0.0848 |
| 0.0022 | 155.0 | 6752 | 1.0053 | 0.8333 | 0.3054 | 1.3548 | 0.8333 | 0.8183 | 0.1562 | 0.0845 |
| 0.0022 | 155.98 | 6795 | 1.0068 | 0.8333 | 0.3053 | 1.3548 | 0.8333 | 0.8183 | 0.1562 | 0.0846 |
| 0.0022 | 156.99 | 6839 | 1.0076 | 0.8333 | 0.3055 | 1.3551 | 0.8333 | 0.8183 | 0.1561 | 0.0844 |
| 0.0022 | 157.98 | 6882 | 1.0105 | 0.8333 | 0.3059 | 1.3546 | 0.8333 | 0.8183 | 0.1563 | 0.0845 |
| 0.0022 | 158.99 | 6926 | 1.0114 | 0.8333 | 0.3061 | 1.3555 | 0.8333 | 0.8183 | 0.1559 | 0.0851 |
| 0.0022 | 160.0 | 6970 | 1.0108 | 0.8333 | 0.3061 | 1.3586 | 0.8333 | 0.8183 | 0.1561 | 0.0848 |
| 0.002 | 160.99 | 7013 | 1.0129 | 0.8333 | 0.3064 | 1.3577 | 0.8333 | 0.8183 | 0.1560 | 0.0845 |
| 0.002 | 162.0 | 7057 | 1.0141 | 0.8333 | 0.3060 | 1.3542 | 0.8333 | 0.8183 | 0.1562 | 0.0845 |
| 0.002 | 162.98 | 7100 | 1.0150 | 0.8333 | 0.3063 | 1.3555 | 0.8333 | 0.8183 | 0.1563 | 0.0847 |
| 0.002 | 163.99 | 7144 | 1.0181 | 0.8305 | 0.3071 | 1.3616 | 0.8305 | 0.8165 | 0.1587 | 0.0847 |
| 0.002 | 164.98 | 7187 | 1.0197 | 0.8305 | 0.3073 | 1.3610 | 0.8305 | 0.8165 | 0.1585 | 0.0847 |
| 0.002 | 165.99 | 7231 | 1.0203 | 0.8333 | 0.3071 | 1.3566 | 0.8333 | 0.8183 | 0.1565 | 0.0846 |
| 0.002 | 166.98 | 7274 | 1.0214 | 0.8333 | 0.3070 | 1.3561 | 0.8333 | 0.8183 | 0.1564 | 0.0845 |
| 0.002 | 167.99 | 7318 | 1.0211 | 0.8333 | 0.3067 | 1.3558 | 0.8333 | 0.8183 | 0.1562 | 0.0846 |
| 0.002 | 169.0 | 7362 | 1.0255 | 0.8305 | 0.3077 | 1.3564 | 0.8305 | 0.8165 | 0.1592 | 0.0846 |
| 0.002 | 169.99 | 7405 | 1.0238 | 0.8333 | 0.3066 | 1.3535 | 0.8333 | 0.8183 | 0.1567 | 0.0844 |
| 0.002 | 171.0 | 7449 | 1.0258 | 0.8333 | 0.3075 | 1.3580 | 0.8333 | 0.8183 | 0.1562 | 0.0847 |
| 0.002 | 171.98 | 7492 | 1.0260 | 0.8333 | 0.3073 | 1.3594 | 0.8333 | 0.8183 | 0.1559 | 0.0846 |
| 0.0018 | 172.99 | 7536 | 1.0281 | 0.8305 | 0.3077 | 1.3584 | 0.8305 | 0.8165 | 0.1586 | 0.0847 |
| 0.0018 | 173.98 | 7579 | 1.0274 | 0.8333 | 0.3073 | 1.3577 | 0.8333 | 0.8183 | 0.1560 | 0.0851 |
| 0.0018 | 174.99 | 7623 | 1.0323 | 0.8305 | 0.3082 | 1.3577 | 0.8305 | 0.8165 | 0.1596 | 0.0848 |
| 0.0018 | 176.0 | 7667 | 1.0303 | 0.8333 | 0.3076 | 1.3579 | 0.8333 | 0.8183 | 0.1561 | 0.0846 |
| 0.0018 | 176.99 | 7710 | 1.0325 | 0.8333 | 0.3081 | 1.3567 | 0.8333 | 0.8183 | 0.1565 | 0.0845 |
| 0.0018 | 178.0 | 7754 | 1.0319 | 0.8333 | 0.3077 | 1.3569 | 0.8333 | 0.8183 | 0.1560 | 0.0847 |
| 0.0018 | 178.98 | 7797 | 1.0340 | 0.8333 | 0.3081 | 1.3568 | 0.8333 | 0.8183 | 0.1562 | 0.0847 |
| 0.0018 | 179.99 | 7841 | 1.0331 | 0.8333 | 0.3072 | 1.3550 | 0.8333 | 0.8183 | 0.1564 | 0.0847 |
| 0.0018 | 180.98 | 7884 | 1.0346 | 0.8333 | 0.3079 | 1.3563 | 0.8333 | 0.8183 | 0.1561 | 0.0847 |
| 0.0018 | 181.99 | 7928 | 1.0344 | 0.8333 | 0.3079 | 1.3577 | 0.8333 | 0.8183 | 0.1565 | 0.0847 |
| 0.0018 | 182.98 | 7971 | 1.0363 | 0.8333 | 0.3080 | 1.3556 | 0.8333 | 0.8183 | 0.1566 | 0.0850 |
| 0.0016 | 183.99 | 8015 | 1.0368 | 0.8333 | 0.3080 | 1.3569 | 0.8333 | 0.8183 | 0.1561 | 0.0847 |
| 0.0016 | 185.0 | 8059 | 1.0369 | 0.8333 | 0.3080 | 1.3563 | 0.8333 | 0.8183 | 0.1562 | 0.0847 |
| 0.0016 | 185.99 | 8102 | 1.0373 | 0.8333 | 0.3080 | 1.3565 | 0.8333 | 0.8183 | 0.1561 | 0.0850 |
| 0.0016 | 187.0 | 8146 | 1.0377 | 0.8333 | 0.3080 | 1.3568 | 0.8333 | 0.8183 | 0.1561 | 0.0846 |
| 0.0016 | 187.98 | 8189 | 1.0392 | 0.8333 | 0.3084 | 1.3577 | 0.8333 | 0.8183 | 0.1565 | 0.0846 |
| 0.0016 | 188.99 | 8233 | 1.0391 | 0.8333 | 0.3082 | 1.3564 | 0.8333 | 0.8183 | 0.1564 | 0.0848 |
| 0.0016 | 189.98 | 8276 | 1.0393 | 0.8333 | 0.3081 | 1.3561 | 0.8333 | 0.8183 | 0.1562 | 0.0847 |
| 0.0016 | 190.99 | 8320 | 1.0398 | 0.8333 | 0.3084 | 1.3582 | 0.8333 | 0.8183 | 0.1562 | 0.0846 |
| 0.0016 | 192.0 | 8364 | 1.0405 | 0.8333 | 0.3083 | 1.3558 | 0.8333 | 0.8183 | 0.1564 | 0.0847 |
| 0.0016 | 192.99 | 8407 | 1.0401 | 0.8333 | 0.3082 | 1.3558 | 0.8333 | 0.8183 | 0.1564 | 0.0847 |
| 0.0016 | 194.0 | 8451 | 1.0407 | 0.8333 | 0.3083 | 1.3564 | 0.8333 | 0.8183 | 0.1564 | 0.0847 |
| 0.0016 | 194.98 | 8494 | 1.0414 | 0.8333 | 0.3086 | 1.3573 | 0.8333 | 0.8183 | 0.1564 | 0.0847 |
| 0.0015 | 195.99 | 8538 | 1.0410 | 0.8333 | 0.3084 | 1.3567 | 0.8333 | 0.8183 | 0.1564 | 0.0848 |
| 0.0015 | 196.98 | 8581 | 1.0411 | 0.8333 | 0.3084 | 1.3568 | 0.8333 | 0.8183 | 0.1563 | 0.0846 |
| 0.0015 | 197.42 | 8600 | 1.0411 | 0.8333 | 0.3084 | 1.3568 | 0.8333 | 0.8183 | 0.1563 | 0.0847 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
hbenitez/AV_classifier1
|
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# hbenitez/AV_classifier1
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.6840
- Validation Loss: 0.6659
- Train Accuracy: 0.5
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 80, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
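The optimizer entry above is a serialized Keras config: AdamWeightDecay with a linear (power 1.0) `PolynomialDecay` from 3e-05 to 0 over 80 steps. A sketch of how the same optimizer is typically built with the `create_optimizer` helper from `transformers`, with the step count taken from that config:
```python
from transformers import create_optimizer

optimizer, lr_schedule = create_optimizer(
    init_lr=3e-05,
    num_train_steps=80,      # decay_steps in the serialized config
    num_warmup_steps=0,
    weight_decay_rate=0.01,
)
```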
### Training results
| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 0.6840 | 0.6659 | 0.5 | 0 |
### Framework versions
- Transformers 4.30.2
- TensorFlow 2.13.0-rc2
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"brake",
"dont_brake"
] |
arunboss/triage_R5_model
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# triage_R5_model
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0123
- Accuracy: 0.6837
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 12
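Put together, the 159 optimizer steps per epoch visible in the table below give roughly 1908 training steps over 12 epochs, of which the first ~10% are linear warmup. A sketch under those assumptions:
```python
import torch
from transformers import SwinForImageClassification, get_linear_schedule_with_warmup

model = SwinForImageClassification.from_pretrained("microsoft/swin-tiny-patch4-window7-224")
optimizer = torch.optim.Adam(model.parameters(), lr=5e-5, betas=(0.9, 0.999), eps=1e-8)

total_steps = 159 * 12  # steps per epoch * epochs, ~1908 as in the table
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=int(0.1 * total_steps),  # warmup_ratio = 0.1
    num_training_steps=total_steps,
)
```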
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.0452 | 1.0 | 159 | 1.9622 | 0.3814 |
| 1.7034 | 2.0 | 319 | 1.5695 | 0.4923 |
| 1.441 | 3.0 | 479 | 1.4427 | 0.5433 |
| 1.2908 | 4.0 | 639 | 1.2970 | 0.5895 |
| 1.2294 | 5.0 | 798 | 1.2293 | 0.6071 |
| 1.1097 | 6.0 | 958 | 1.1892 | 0.6300 |
| 1.0342 | 7.0 | 1118 | 1.1048 | 0.6546 |
| 0.9644 | 8.0 | 1278 | 1.0731 | 0.6678 |
| 0.8534 | 9.0 | 1437 | 1.0367 | 0.6766 |
| 0.8037 | 10.0 | 1597 | 1.0211 | 0.6802 |
| 0.7765 | 11.0 | 1757 | 1.0073 | 0.6885 |
| 0.7658 | 11.94 | 1908 | 1.0123 | 0.6837 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"acne and rosacea",
"actinic keratosis bcc other malignant lesions",
"allergic reactions & similar",
"atopic dermatitis eczema",
"bullous disease",
"hair loss alopecia and other hair diseases",
"melanoma (skin cancer) nevi and moles",
"nail fungus and other nail disease",
"pigmentation disorders",
"psoriasis & related diseases",
"seborrheic keratoses and other benign tumors",
"tinea ringworm candida",
"urticaria (hives)",
"vascular tumors",
"vasculitis",
"viral exanthems",
"warts (molluscum) and other viral infections"
] |
jordyvl/vit-tiny_tobacco3482_simkd__tNone_gNone__logits
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_tobacco3482_simkd__tNone_gNone__logits
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0388
- Accuracy: 0.82
- Brier Loss: 0.7227
- Nll: 2.6079
- F1 Micro: 0.82
- F1 Macro: 0.7960
- Ece: 0.6182
- Aurc: 0.0575
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
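A minimal sketch of wiring these arguments into a `Trainer` run; the Tobacco3482 splits and the simkd distillation loss are not described in this card, so `train_ds`/`eval_ds` are hypothetical and a plain classification head stands in for the distillation setup:
```python
from transformers import Trainer, TrainingArguments, ViTForImageClassification

model = ViTForImageClassification.from_pretrained(
    "WinKawaks/vit-tiny-patch16-224",
    num_labels=10,                 # the 10 Tobacco3482 document classes
    ignore_mismatched_sizes=True,  # replace the checkpoint's 1000-way ImageNet head
)
args = TrainingArguments(
    output_dir="vit-tiny_tobacco3482_simkd",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds)
trainer.train()
```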
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 100 | 0.0492 | 0.03 | 0.9002 | 15.7630 | 0.03 | 0.0130 | 0.1184 | 0.9407 |
| No log | 2.0 | 200 | 0.0479 | 0.045 | 0.8991 | 12.1839 | 0.045 | 0.0140 | 0.1350 | 0.9268 |
| No log | 3.0 | 300 | 0.0472 | 0.305 | 0.8968 | 12.2323 | 0.305 | 0.1551 | 0.2930 | 0.5596 |
| No log | 4.0 | 400 | 0.0463 | 0.305 | 0.8938 | 10.0711 | 0.305 | 0.1809 | 0.3077 | 0.5037 |
| 0.0541 | 5.0 | 500 | 0.0453 | 0.325 | 0.8898 | 9.3778 | 0.325 | 0.2046 | 0.3193 | 0.4698 |
| 0.0541 | 6.0 | 600 | 0.0442 | 0.4 | 0.8787 | 9.6363 | 0.4000 | 0.2403 | 0.3713 | 0.3759 |
| 0.0541 | 7.0 | 700 | 0.0433 | 0.4 | 0.8639 | 9.3483 | 0.4000 | 0.2510 | 0.3685 | 0.3742 |
| 0.0541 | 8.0 | 800 | 0.0428 | 0.475 | 0.8521 | 9.0007 | 0.4750 | 0.2666 | 0.4177 | 0.2978 |
| 0.0541 | 9.0 | 900 | 0.0422 | 0.515 | 0.8438 | 7.6934 | 0.515 | 0.3644 | 0.4480 | 0.2431 |
| 0.0451 | 10.0 | 1000 | 0.0417 | 0.515 | 0.8210 | 7.6243 | 0.515 | 0.3743 | 0.4387 | 0.2146 |
| 0.0451 | 11.0 | 1100 | 0.0411 | 0.68 | 0.8190 | 3.6435 | 0.68 | 0.5226 | 0.5677 | 0.1272 |
| 0.0451 | 12.0 | 1200 | 0.0403 | 0.655 | 0.7803 | 5.4537 | 0.655 | 0.5379 | 0.5202 | 0.1352 |
| 0.0451 | 13.0 | 1300 | 0.0398 | 0.75 | 0.7745 | 4.1150 | 0.75 | 0.6543 | 0.5945 | 0.0749 |
| 0.0451 | 14.0 | 1400 | 0.0390 | 0.77 | 0.7561 | 2.8538 | 0.7700 | 0.6703 | 0.6085 | 0.0753 |
| 0.0406 | 15.0 | 1500 | 0.0392 | 0.745 | 0.7637 | 3.7047 | 0.745 | 0.6673 | 0.5796 | 0.1060 |
| 0.0406 | 16.0 | 1600 | 0.0398 | 0.73 | 0.7603 | 4.3010 | 0.7300 | 0.6681 | 0.5693 | 0.0949 |
| 0.0406 | 17.0 | 1700 | 0.0399 | 0.705 | 0.7610 | 3.6375 | 0.705 | 0.6564 | 0.5590 | 0.0923 |
| 0.0406 | 18.0 | 1800 | 0.0397 | 0.705 | 0.7536 | 4.2628 | 0.705 | 0.6170 | 0.5472 | 0.1332 |
| 0.0406 | 19.0 | 1900 | 0.0390 | 0.745 | 0.7400 | 2.8861 | 0.745 | 0.6686 | 0.5644 | 0.1070 |
| 0.0379 | 20.0 | 2000 | 0.0394 | 0.785 | 0.7238 | 3.3111 | 0.785 | 0.7255 | 0.5995 | 0.0695 |
| 0.0379 | 21.0 | 2100 | 0.0396 | 0.76 | 0.7419 | 3.1935 | 0.76 | 0.7463 | 0.5885 | 0.0793 |
| 0.0379 | 22.0 | 2200 | 0.0396 | 0.785 | 0.7423 | 3.7954 | 0.785 | 0.7737 | 0.6043 | 0.0873 |
| 0.0379 | 23.0 | 2300 | 0.0395 | 0.78 | 0.7321 | 3.8067 | 0.78 | 0.7491 | 0.5885 | 0.0908 |
| 0.0379 | 24.0 | 2400 | 0.0387 | 0.8 | 0.7228 | 2.9339 | 0.8000 | 0.7758 | 0.6038 | 0.0609 |
| 0.037 | 25.0 | 2500 | 0.0387 | 0.795 | 0.7222 | 2.6252 | 0.795 | 0.7601 | 0.6094 | 0.0606 |
| 0.037 | 26.0 | 2600 | 0.0387 | 0.8 | 0.7241 | 2.6253 | 0.8000 | 0.7628 | 0.6110 | 0.0607 |
| 0.037 | 27.0 | 2700 | 0.0387 | 0.795 | 0.7235 | 2.4818 | 0.795 | 0.7616 | 0.6093 | 0.0629 |
| 0.037 | 28.0 | 2800 | 0.0387 | 0.795 | 0.7245 | 2.6226 | 0.795 | 0.7586 | 0.6032 | 0.0604 |
| 0.037 | 29.0 | 2900 | 0.0387 | 0.805 | 0.7253 | 2.7588 | 0.805 | 0.7725 | 0.6144 | 0.0609 |
| 0.0364 | 30.0 | 3000 | 0.0387 | 0.805 | 0.7233 | 2.4956 | 0.805 | 0.7701 | 0.6204 | 0.0594 |
| 0.0364 | 31.0 | 3100 | 0.0387 | 0.81 | 0.7241 | 2.7695 | 0.81 | 0.7797 | 0.6188 | 0.0602 |
| 0.0364 | 32.0 | 3200 | 0.0386 | 0.81 | 0.7239 | 2.6185 | 0.81 | 0.7797 | 0.6190 | 0.0580 |
| 0.0364 | 33.0 | 3300 | 0.0387 | 0.805 | 0.7238 | 2.9106 | 0.805 | 0.7717 | 0.6182 | 0.0586 |
| 0.0364 | 34.0 | 3400 | 0.0386 | 0.805 | 0.7231 | 2.9062 | 0.805 | 0.7725 | 0.6133 | 0.0590 |
| 0.0364 | 35.0 | 3500 | 0.0387 | 0.805 | 0.7247 | 2.7645 | 0.805 | 0.7717 | 0.6141 | 0.0590 |
| 0.0364 | 36.0 | 3600 | 0.0386 | 0.805 | 0.7238 | 2.9152 | 0.805 | 0.7717 | 0.6104 | 0.0578 |
| 0.0364 | 37.0 | 3700 | 0.0387 | 0.805 | 0.7229 | 2.9094 | 0.805 | 0.7717 | 0.6142 | 0.0588 |
| 0.0364 | 38.0 | 3800 | 0.0386 | 0.805 | 0.7237 | 2.9185 | 0.805 | 0.7717 | 0.6173 | 0.0565 |
| 0.0364 | 39.0 | 3900 | 0.0386 | 0.805 | 0.7230 | 2.9178 | 0.805 | 0.7717 | 0.6131 | 0.0578 |
| 0.0364 | 40.0 | 4000 | 0.0386 | 0.805 | 0.7233 | 2.9155 | 0.805 | 0.7717 | 0.6131 | 0.0561 |
| 0.0364 | 41.0 | 4100 | 0.0386 | 0.805 | 0.7235 | 2.9142 | 0.805 | 0.7717 | 0.6173 | 0.0574 |
| 0.0364 | 42.0 | 4200 | 0.0387 | 0.805 | 0.7225 | 2.9162 | 0.805 | 0.7717 | 0.6196 | 0.0572 |
| 0.0364 | 43.0 | 4300 | 0.0387 | 0.805 | 0.7231 | 3.0596 | 0.805 | 0.7717 | 0.6139 | 0.0560 |
| 0.0364 | 44.0 | 4400 | 0.0387 | 0.805 | 0.7229 | 3.0584 | 0.805 | 0.7717 | 0.6140 | 0.0558 |
| 0.0364 | 45.0 | 4500 | 0.0386 | 0.815 | 0.7231 | 2.9107 | 0.815 | 0.7856 | 0.6224 | 0.0551 |
| 0.0364 | 46.0 | 4600 | 0.0386 | 0.8 | 0.7228 | 3.0609 | 0.8000 | 0.7683 | 0.6154 | 0.0570 |
| 0.0364 | 47.0 | 4700 | 0.0386 | 0.8 | 0.7229 | 3.0539 | 0.8000 | 0.7683 | 0.6141 | 0.0564 |
| 0.0364 | 48.0 | 4800 | 0.0387 | 0.805 | 0.7228 | 2.9149 | 0.805 | 0.7753 | 0.6164 | 0.0559 |
| 0.0364 | 49.0 | 4900 | 0.0387 | 0.805 | 0.7239 | 3.0631 | 0.805 | 0.7729 | 0.6144 | 0.0569 |
| 0.0364 | 50.0 | 5000 | 0.0387 | 0.8 | 0.7231 | 3.0551 | 0.8000 | 0.7683 | 0.6094 | 0.0562 |
| 0.0364 | 51.0 | 5100 | 0.0387 | 0.815 | 0.7232 | 3.0662 | 0.815 | 0.7868 | 0.6212 | 0.0569 |
| 0.0364 | 52.0 | 5200 | 0.0386 | 0.805 | 0.7226 | 2.9067 | 0.805 | 0.7753 | 0.6241 | 0.0550 |
| 0.0364 | 53.0 | 5300 | 0.0387 | 0.81 | 0.7236 | 2.9086 | 0.81 | 0.7761 | 0.6209 | 0.0558 |
| 0.0364 | 54.0 | 5400 | 0.0387 | 0.805 | 0.7240 | 2.9108 | 0.805 | 0.7729 | 0.6072 | 0.0577 |
| 0.0364 | 55.0 | 5500 | 0.0387 | 0.815 | 0.7228 | 2.9235 | 0.815 | 0.7832 | 0.6202 | 0.0556 |
| 0.0364 | 56.0 | 5600 | 0.0387 | 0.82 | 0.7229 | 2.9335 | 0.82 | 0.7898 | 0.6273 | 0.0544 |
| 0.0364 | 57.0 | 5700 | 0.0387 | 0.82 | 0.7230 | 2.9210 | 0.82 | 0.7900 | 0.6292 | 0.0561 |
| 0.0364 | 58.0 | 5800 | 0.0387 | 0.82 | 0.7227 | 2.9211 | 0.82 | 0.7898 | 0.6325 | 0.0560 |
| 0.0364 | 59.0 | 5900 | 0.0386 | 0.82 | 0.7238 | 3.0664 | 0.82 | 0.7898 | 0.6249 | 0.0548 |
| 0.0364 | 60.0 | 6000 | 0.0387 | 0.82 | 0.7218 | 2.9137 | 0.82 | 0.7900 | 0.6397 | 0.0545 |
| 0.0364 | 61.0 | 6100 | 0.0387 | 0.82 | 0.7233 | 3.0756 | 0.82 | 0.7900 | 0.6225 | 0.0563 |
| 0.0364 | 62.0 | 6200 | 0.0387 | 0.815 | 0.7231 | 3.0725 | 0.815 | 0.7878 | 0.6190 | 0.0558 |
| 0.0364 | 63.0 | 6300 | 0.0387 | 0.82 | 0.7218 | 3.0589 | 0.82 | 0.7900 | 0.6223 | 0.0554 |
| 0.0364 | 64.0 | 6400 | 0.0387 | 0.82 | 0.7231 | 3.0632 | 0.82 | 0.7900 | 0.6289 | 0.0551 |
| 0.0363 | 65.0 | 6500 | 0.0387 | 0.82 | 0.7227 | 3.0595 | 0.82 | 0.7900 | 0.6339 | 0.0560 |
| 0.0363 | 66.0 | 6600 | 0.0387 | 0.82 | 0.7229 | 2.9000 | 0.82 | 0.7900 | 0.6320 | 0.0551 |
| 0.0363 | 67.0 | 6700 | 0.0387 | 0.82 | 0.7222 | 2.9092 | 0.82 | 0.7900 | 0.6292 | 0.0548 |
| 0.0363 | 68.0 | 6800 | 0.0387 | 0.82 | 0.7233 | 2.7662 | 0.82 | 0.7900 | 0.6299 | 0.0564 |
| 0.0363 | 69.0 | 6900 | 0.0387 | 0.82 | 0.7230 | 2.9095 | 0.82 | 0.7900 | 0.6239 | 0.0555 |
| 0.0363 | 70.0 | 7000 | 0.0387 | 0.82 | 0.7226 | 2.9175 | 0.82 | 0.7908 | 0.6274 | 0.0549 |
| 0.0363 | 71.0 | 7100 | 0.0387 | 0.82 | 0.7237 | 3.0548 | 0.82 | 0.7900 | 0.6337 | 0.0550 |
| 0.0363 | 72.0 | 7200 | 0.0387 | 0.815 | 0.7229 | 2.9144 | 0.815 | 0.7841 | 0.6207 | 0.0570 |
| 0.0363 | 73.0 | 7300 | 0.0387 | 0.82 | 0.7235 | 2.9041 | 0.82 | 0.7900 | 0.6310 | 0.0564 |
| 0.0363 | 74.0 | 7400 | 0.0387 | 0.82 | 0.7227 | 2.9094 | 0.82 | 0.7908 | 0.6291 | 0.0558 |
| 0.0363 | 75.0 | 7500 | 0.0387 | 0.825 | 0.7236 | 2.9105 | 0.825 | 0.7983 | 0.6319 | 0.0543 |
| 0.0363 | 76.0 | 7600 | 0.0387 | 0.82 | 0.7225 | 2.9172 | 0.82 | 0.7908 | 0.6260 | 0.0550 |
| 0.0363 | 77.0 | 7700 | 0.0387 | 0.815 | 0.7227 | 2.9050 | 0.815 | 0.7841 | 0.6325 | 0.0557 |
| 0.0363 | 78.0 | 7800 | 0.0387 | 0.825 | 0.7236 | 2.9242 | 0.825 | 0.7983 | 0.6264 | 0.0575 |
| 0.0363 | 79.0 | 7900 | 0.0387 | 0.82 | 0.7231 | 2.9167 | 0.82 | 0.7900 | 0.6263 | 0.0572 |
| 0.0363 | 80.0 | 8000 | 0.0387 | 0.825 | 0.7229 | 2.7707 | 0.825 | 0.8004 | 0.6311 | 0.0569 |
| 0.0363 | 81.0 | 8100 | 0.0387 | 0.81 | 0.7230 | 2.9083 | 0.81 | 0.7812 | 0.6295 | 0.0573 |
| 0.0363 | 82.0 | 8200 | 0.0387 | 0.82 | 0.7227 | 2.7600 | 0.82 | 0.7927 | 0.6263 | 0.0576 |
| 0.0363 | 83.0 | 8300 | 0.0387 | 0.815 | 0.7226 | 2.7902 | 0.815 | 0.7930 | 0.6169 | 0.0576 |
| 0.0363 | 84.0 | 8400 | 0.0387 | 0.815 | 0.7226 | 2.7666 | 0.815 | 0.7841 | 0.6254 | 0.0571 |
| 0.0363 | 85.0 | 8500 | 0.0387 | 0.82 | 0.7228 | 2.7730 | 0.82 | 0.7960 | 0.6226 | 0.0566 |
| 0.0363 | 86.0 | 8600 | 0.0387 | 0.815 | 0.7228 | 2.6311 | 0.815 | 0.7878 | 0.6186 | 0.0572 |
| 0.0363 | 87.0 | 8700 | 0.0388 | 0.82 | 0.7230 | 2.6272 | 0.82 | 0.7924 | 0.6321 | 0.0575 |
| 0.0363 | 88.0 | 8800 | 0.0387 | 0.82 | 0.7227 | 2.7579 | 0.82 | 0.7924 | 0.6318 | 0.0568 |
| 0.0363 | 89.0 | 8900 | 0.0388 | 0.82 | 0.7227 | 2.7688 | 0.82 | 0.7960 | 0.6238 | 0.0575 |
| 0.0363 | 90.0 | 9000 | 0.0387 | 0.82 | 0.7232 | 2.6100 | 0.82 | 0.7986 | 0.6290 | 0.0569 |
| 0.0363 | 91.0 | 9100 | 0.0387 | 0.82 | 0.7228 | 2.6088 | 0.82 | 0.7960 | 0.6258 | 0.0574 |
| 0.0363 | 92.0 | 9200 | 0.0387 | 0.825 | 0.7228 | 2.6134 | 0.825 | 0.8038 | 0.6239 | 0.0575 |
| 0.0363 | 93.0 | 9300 | 0.0387 | 0.825 | 0.7229 | 2.6136 | 0.825 | 0.7990 | 0.6283 | 0.0576 |
| 0.0363 | 94.0 | 9400 | 0.0387 | 0.82 | 0.7228 | 2.6112 | 0.82 | 0.7924 | 0.6235 | 0.0580 |
| 0.0363 | 95.0 | 9500 | 0.0388 | 0.82 | 0.7228 | 2.6077 | 0.82 | 0.7960 | 0.6185 | 0.0575 |
| 0.0363 | 96.0 | 9600 | 0.0388 | 0.825 | 0.7229 | 2.6073 | 0.825 | 0.8038 | 0.6206 | 0.0572 |
| 0.0363 | 97.0 | 9700 | 0.0388 | 0.82 | 0.7227 | 2.6077 | 0.82 | 0.7960 | 0.6213 | 0.0572 |
| 0.0363 | 98.0 | 9800 | 0.0388 | 0.815 | 0.7228 | 2.6094 | 0.815 | 0.7893 | 0.6161 | 0.0579 |
| 0.0363 | 99.0 | 9900 | 0.0388 | 0.815 | 0.7228 | 2.6086 | 0.815 | 0.7893 | 0.6160 | 0.0579 |
| 0.0363 | 100.0 | 10000 | 0.0388 | 0.82 | 0.7227 | 2.6079 | 0.82 | 0.7960 | 0.6182 | 0.0575 |
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.12.0
- Tokenizers 0.12.1
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
daxiboy/vit-base-patch16-224-finetuned-flower
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-finetuned-flower
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
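Once trained, a checkpoint like this can be queried with the image-classification pipeline. A minimal inference sketch, assuming the model is published under the repo name in the title; the image path is a placeholder:
```python
from transformers import pipeline

classifier = pipeline("image-classification",
                      model="daxiboy/vit-base-patch16-224-finetuned-flower")
print(classifier("flower.jpg"))  # e.g. [{'label': 'daisy', 'score': ...}, ...]
```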
### Training results
### Framework versions
- Transformers 4.24.0
- Pytorch 2.0.1+cu118
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"daisy",
"dandelion",
"roses",
"sunflowers",
"tulips"
] |
ayanban011/vit-base_tobacco_bs_16_lr_5e-6_e_300_wr_0.1_wd_0.2
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base_tobacco_bs_16_lr_5e-6_e_300_wr_0.1_wd_0.2
This model is a fine-tuned version of [jordyvl/vit-base_tobacco](https://huggingface.co/jordyvl/vit-base_tobacco) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8461
- Accuracy: 0.775
- Brier Loss: 0.3632
- Nll: 1.4570
- F1 Micro: 0.775
- F1 Macro: 0.7418
- Ece: 0.2043
- Aurc: 0.1066
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 300
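A back-of-the-envelope check of what the 0.1 warmup ratio means here. The results table below logs roughly 12–13 optimizer steps per epoch, consistent with an effective batch of 64 over roughly 800 training images (the image count is an assumption read off the table, not stated in the card), so warmup covers roughly the first 375 of ~3750 steps:
```python
steps_per_epoch = 800 / 64           # assumed ~800 train images / effective batch 64 = 12.5
total_steps = steps_per_epoch * 300  # ~3750 optimizer steps overall
warmup_steps = 0.1 * total_steps     # warmup_ratio = 0.1 -> ~375 warmup steps
print(round(total_steps), round(warmup_steps))
```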
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:------:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 12 | 0.7447 | 0.815 | 0.3078 | 1.1882 | 0.815 | 0.7942 | 0.2385 | 0.0731 |
| No log | 2.0 | 25 | 0.7442 | 0.815 | 0.3075 | 1.1872 | 0.815 | 0.7922 | 0.2401 | 0.0736 |
| No log | 2.96 | 37 | 0.7439 | 0.815 | 0.3075 | 1.1883 | 0.815 | 0.7942 | 0.2292 | 0.0722 |
| No log | 4.0 | 50 | 0.7463 | 0.815 | 0.3083 | 1.1904 | 0.815 | 0.7942 | 0.2454 | 0.0762 |
| No log | 4.96 | 62 | 0.7441 | 0.805 | 0.3077 | 1.1886 | 0.805 | 0.7819 | 0.2322 | 0.0731 |
| No log | 6.0 | 75 | 0.7408 | 0.81 | 0.3064 | 1.1842 | 0.81 | 0.7914 | 0.2217 | 0.0704 |
| No log | 6.96 | 87 | 0.7448 | 0.81 | 0.3082 | 1.1852 | 0.81 | 0.7847 | 0.2341 | 0.0748 |
| No log | 8.0 | 100 | 0.7454 | 0.815 | 0.3084 | 1.1882 | 0.815 | 0.7942 | 0.2129 | 0.0767 |
| No log | 8.96 | 112 | 0.7462 | 0.815 | 0.3080 | 1.1954 | 0.815 | 0.7922 | 0.2535 | 0.0775 |
| No log | 10.0 | 125 | 0.7427 | 0.81 | 0.3067 | 1.1924 | 0.81 | 0.7876 | 0.2280 | 0.0767 |
| No log | 10.96 | 137 | 0.7420 | 0.815 | 0.3067 | 1.2033 | 0.815 | 0.7942 | 0.2611 | 0.0755 |
| No log | 12.0 | 150 | 0.7417 | 0.805 | 0.3063 | 1.1881 | 0.805 | 0.7820 | 0.2456 | 0.0774 |
| No log | 12.96 | 162 | 0.7442 | 0.815 | 0.3089 | 1.1895 | 0.815 | 0.8059 | 0.2230 | 0.0768 |
| No log | 14.0 | 175 | 0.7398 | 0.805 | 0.3061 | 1.2547 | 0.805 | 0.7843 | 0.2310 | 0.0766 |
| No log | 14.96 | 187 | 0.7355 | 0.81 | 0.3046 | 1.1887 | 0.81 | 0.7914 | 0.2328 | 0.0746 |
| No log | 16.0 | 200 | 0.7368 | 0.81 | 0.3053 | 1.1894 | 0.81 | 0.7922 | 0.2256 | 0.0774 |
| No log | 16.96 | 212 | 0.7355 | 0.81 | 0.3037 | 1.2537 | 0.81 | 0.7947 | 0.2077 | 0.0788 |
| No log | 18.0 | 225 | 0.7407 | 0.81 | 0.3065 | 1.1882 | 0.81 | 0.7871 | 0.2421 | 0.0767 |
| No log | 18.96 | 237 | 0.7279 | 0.8 | 0.2999 | 1.2540 | 0.8000 | 0.7796 | 0.2159 | 0.0742 |
| No log | 20.0 | 250 | 0.7324 | 0.805 | 0.3042 | 1.1811 | 0.805 | 0.7841 | 0.2269 | 0.0763 |
| No log | 20.96 | 262 | 0.7421 | 0.805 | 0.3079 | 1.1827 | 0.805 | 0.7850 | 0.2339 | 0.0797 |
| No log | 22.0 | 275 | 0.7343 | 0.81 | 0.3050 | 1.1689 | 0.81 | 0.7877 | 0.2223 | 0.0784 |
| No log | 22.96 | 287 | 0.7308 | 0.81 | 0.3032 | 1.1901 | 0.81 | 0.7922 | 0.2190 | 0.0774 |
| No log | 24.0 | 300 | 0.7381 | 0.805 | 0.3057 | 1.3200 | 0.805 | 0.7853 | 0.2500 | 0.0819 |
| No log | 24.96 | 312 | 0.7336 | 0.81 | 0.3042 | 1.3123 | 0.81 | 0.7903 | 0.2082 | 0.0795 |
| No log | 26.0 | 325 | 0.7282 | 0.805 | 0.3020 | 1.2465 | 0.805 | 0.7847 | 0.2248 | 0.0792 |
| No log | 26.96 | 337 | 0.7346 | 0.81 | 0.3050 | 1.2538 | 0.81 | 0.7956 | 0.2095 | 0.0818 |
| No log | 28.0 | 350 | 0.7305 | 0.805 | 0.3031 | 1.2443 | 0.805 | 0.7850 | 0.2488 | 0.0823 |
| No log | 28.96 | 362 | 0.7395 | 0.8 | 0.3071 | 1.3235 | 0.8000 | 0.7818 | 0.2223 | 0.0843 |
| No log | 30.0 | 375 | 0.7349 | 0.8 | 0.3058 | 1.2511 | 0.8000 | 0.7733 | 0.2004 | 0.0817 |
| No log | 30.96 | 387 | 0.7344 | 0.8 | 0.3048 | 1.2516 | 0.8000 | 0.7818 | 0.2183 | 0.0837 |
| No log | 32.0 | 400 | 0.7332 | 0.795 | 0.3037 | 1.3836 | 0.795 | 0.7686 | 0.2185 | 0.0844 |
| No log | 32.96 | 412 | 0.7306 | 0.81 | 0.3042 | 1.1767 | 0.81 | 0.7905 | 0.2117 | 0.0837 |
| No log | 34.0 | 425 | 0.7326 | 0.8 | 0.3040 | 1.2058 | 0.8000 | 0.7783 | 0.2106 | 0.0857 |
| No log | 34.96 | 437 | 0.7317 | 0.8 | 0.3045 | 1.3068 | 0.8000 | 0.7733 | 0.2337 | 0.0843 |
| No log | 36.0 | 450 | 0.7345 | 0.805 | 0.3073 | 1.3065 | 0.805 | 0.7782 | 0.1928 | 0.0823 |
| No log | 36.96 | 462 | 0.7367 | 0.8 | 0.3074 | 1.3259 | 0.8000 | 0.7733 | 0.1941 | 0.0860 |
| No log | 38.0 | 475 | 0.7349 | 0.8 | 0.3073 | 1.3074 | 0.8000 | 0.7731 | 0.2138 | 0.0853 |
| No log | 38.96 | 487 | 0.7331 | 0.81 | 0.3057 | 1.3149 | 0.81 | 0.7909 | 0.1981 | 0.0865 |
| 0.1577 | 40.0 | 500 | 0.7269 | 0.8 | 0.3018 | 1.3700 | 0.8000 | 0.7746 | 0.2033 | 0.0865 |
| 0.1577 | 40.96 | 512 | 0.7270 | 0.8 | 0.3020 | 1.3687 | 0.8000 | 0.7737 | 0.2108 | 0.0860 |
| 0.1577 | 42.0 | 525 | 0.7356 | 0.805 | 0.3078 | 1.3105 | 0.805 | 0.7784 | 0.2053 | 0.0892 |
| 0.1577 | 42.96 | 537 | 0.7291 | 0.8 | 0.3031 | 1.3687 | 0.8000 | 0.7746 | 0.2066 | 0.0876 |
| 0.1577 | 44.0 | 550 | 0.7276 | 0.81 | 0.3034 | 1.3655 | 0.81 | 0.7844 | 0.2189 | 0.0872 |
| 0.1577 | 44.96 | 562 | 0.7318 | 0.805 | 0.3050 | 1.3684 | 0.805 | 0.7793 | 0.2209 | 0.0893 |
| 0.1577 | 46.0 | 575 | 0.7300 | 0.805 | 0.3041 | 1.3679 | 0.805 | 0.7793 | 0.2040 | 0.0885 |
| 0.1577 | 46.96 | 587 | 0.7342 | 0.805 | 0.3060 | 1.3679 | 0.805 | 0.7797 | 0.2059 | 0.0893 |
| 0.1577 | 48.0 | 600 | 0.7303 | 0.805 | 0.3045 | 1.3672 | 0.805 | 0.7797 | 0.1862 | 0.0889 |
| 0.1577 | 48.96 | 612 | 0.7401 | 0.8 | 0.3090 | 1.3710 | 0.8000 | 0.7746 | 0.1930 | 0.0915 |
| 0.1577 | 50.0 | 625 | 0.7329 | 0.795 | 0.3054 | 1.3696 | 0.795 | 0.7654 | 0.1984 | 0.0891 |
| 0.1577 | 50.96 | 637 | 0.7363 | 0.795 | 0.3072 | 1.3689 | 0.795 | 0.7654 | 0.2196 | 0.0907 |
| 0.1577 | 52.0 | 650 | 0.7402 | 0.805 | 0.3101 | 1.3646 | 0.805 | 0.7784 | 0.2028 | 0.0911 |
| 0.1577 | 52.96 | 662 | 0.7347 | 0.8 | 0.3065 | 1.3687 | 0.8000 | 0.7746 | 0.2062 | 0.0894 |
| 0.1577 | 54.0 | 675 | 0.7388 | 0.805 | 0.3097 | 1.3649 | 0.805 | 0.7784 | 0.2027 | 0.0907 |
| 0.1577 | 54.96 | 687 | 0.7381 | 0.8 | 0.3087 | 1.3681 | 0.8000 | 0.7704 | 0.2120 | 0.0908 |
| 0.1577 | 56.0 | 700 | 0.7372 | 0.805 | 0.3088 | 1.3646 | 0.805 | 0.7749 | 0.1866 | 0.0903 |
| 0.1577 | 56.96 | 712 | 0.7403 | 0.805 | 0.3102 | 1.3682 | 0.805 | 0.7749 | 0.2287 | 0.0922 |
| 0.1577 | 58.0 | 725 | 0.7352 | 0.8 | 0.3069 | 1.3680 | 0.8000 | 0.7704 | 0.2117 | 0.0900 |
| 0.1577 | 58.96 | 737 | 0.7373 | 0.8 | 0.3079 | 1.3699 | 0.8000 | 0.7704 | 0.1990 | 0.0923 |
| 0.1577 | 60.0 | 750 | 0.7353 | 0.795 | 0.3065 | 1.3690 | 0.795 | 0.7656 | 0.2078 | 0.0900 |
| 0.1577 | 60.96 | 762 | 0.7357 | 0.805 | 0.3071 | 1.3657 | 0.805 | 0.7732 | 0.2076 | 0.0899 |
| 0.1577 | 62.0 | 775 | 0.7409 | 0.79 | 0.3103 | 1.3737 | 0.79 | 0.7623 | 0.2066 | 0.0920 |
| 0.1577 | 62.96 | 787 | 0.7393 | 0.795 | 0.3082 | 1.4518 | 0.795 | 0.7670 | 0.2047 | 0.0912 |
| 0.1577 | 64.0 | 800 | 0.7417 | 0.8 | 0.3093 | 1.3304 | 0.8000 | 0.7684 | 0.1955 | 0.0917 |
| 0.1577 | 64.96 | 812 | 0.7438 | 0.8 | 0.3121 | 1.3714 | 0.8000 | 0.7707 | 0.1782 | 0.0920 |
| 0.1577 | 66.0 | 825 | 0.7408 | 0.8 | 0.3100 | 1.3758 | 0.8000 | 0.7709 | 0.1965 | 0.0931 |
| 0.1577 | 66.96 | 837 | 0.7434 | 0.8 | 0.3112 | 1.3767 | 0.8000 | 0.7707 | 0.2124 | 0.0935 |
| 0.1577 | 68.0 | 850 | 0.7393 | 0.8 | 0.3107 | 1.3038 | 0.8000 | 0.7704 | 0.1786 | 0.0901 |
| 0.1577 | 68.96 | 862 | 0.7383 | 0.8 | 0.3090 | 1.3689 | 0.8000 | 0.7704 | 0.2041 | 0.0913 |
| 0.1577 | 70.0 | 875 | 0.7436 | 0.8 | 0.3119 | 1.3658 | 0.8000 | 0.7704 | 0.1983 | 0.0932 |
| 0.1577 | 70.96 | 887 | 0.7463 | 0.8 | 0.3130 | 1.3700 | 0.8000 | 0.7707 | 0.1932 | 0.0947 |
| 0.1577 | 72.0 | 900 | 0.7464 | 0.795 | 0.3135 | 1.3720 | 0.795 | 0.7656 | 0.2089 | 0.0932 |
| 0.1577 | 72.96 | 912 | 0.7469 | 0.8 | 0.3137 | 1.3703 | 0.8000 | 0.7707 | 0.2004 | 0.0943 |
| 0.1577 | 74.0 | 925 | 0.7435 | 0.8 | 0.3124 | 1.3674 | 0.8000 | 0.7704 | 0.1958 | 0.0930 |
| 0.1577 | 74.96 | 937 | 0.7427 | 0.8 | 0.3117 | 1.3708 | 0.8000 | 0.7707 | 0.2224 | 0.0921 |
| 0.1577 | 76.0 | 950 | 0.7420 | 0.8 | 0.3111 | 1.3664 | 0.8000 | 0.7704 | 0.2145 | 0.0928 |
| 0.1577 | 76.96 | 962 | 0.7457 | 0.8 | 0.3135 | 1.3690 | 0.8000 | 0.7707 | 0.2178 | 0.0934 |
| 0.1577 | 78.0 | 975 | 0.7513 | 0.8 | 0.3163 | 1.3707 | 0.8000 | 0.7707 | 0.1964 | 0.0947 |
| 0.1577 | 78.96 | 987 | 0.7466 | 0.8 | 0.3139 | 1.3722 | 0.8000 | 0.7704 | 0.2001 | 0.0936 |
| 0.1081 | 80.0 | 1000 | 0.7491 | 0.8 | 0.3154 | 1.3712 | 0.8000 | 0.7707 | 0.2100 | 0.0943 |
| 0.1081 | 80.96 | 1012 | 0.7483 | 0.8 | 0.3150 | 1.3675 | 0.8000 | 0.7704 | 0.2083 | 0.0939 |
| 0.1081 | 82.0 | 1025 | 0.7523 | 0.8 | 0.3163 | 1.3742 | 0.8000 | 0.7707 | 0.2095 | 0.0958 |
| 0.1081 | 82.96 | 1037 | 0.7511 | 0.8 | 0.3166 | 1.3703 | 0.8000 | 0.7707 | 0.2034 | 0.0944 |
| 0.1081 | 84.0 | 1050 | 0.7481 | 0.8 | 0.3150 | 1.3687 | 0.8000 | 0.7704 | 0.2113 | 0.0941 |
| 0.1081 | 84.96 | 1062 | 0.7501 | 0.8 | 0.3164 | 1.3668 | 0.8000 | 0.7693 | 0.2053 | 0.0932 |
| 0.1081 | 86.0 | 1075 | 0.7539 | 0.8 | 0.3177 | 1.3725 | 0.8000 | 0.7707 | 0.2025 | 0.0951 |
| 0.1081 | 86.96 | 1087 | 0.7550 | 0.8 | 0.3182 | 1.3731 | 0.8000 | 0.7707 | 0.1969 | 0.0953 |
| 0.1081 | 88.0 | 1100 | 0.7553 | 0.8 | 0.3183 | 1.3697 | 0.8000 | 0.7707 | 0.1972 | 0.0952 |
| 0.1081 | 88.96 | 1112 | 0.7535 | 0.8 | 0.3176 | 1.3719 | 0.8000 | 0.7707 | 0.2073 | 0.0945 |
| 0.1081 | 90.0 | 1125 | 0.7558 | 0.795 | 0.3186 | 1.3742 | 0.795 | 0.7681 | 0.2018 | 0.0959 |
| 0.1081 | 90.96 | 1137 | 0.7573 | 0.8 | 0.3193 | 1.3739 | 0.8000 | 0.7704 | 0.1919 | 0.0965 |
| 0.1081 | 92.0 | 1150 | 0.7565 | 0.8 | 0.3193 | 1.3743 | 0.8000 | 0.7698 | 0.1967 | 0.0959 |
| 0.1081 | 92.96 | 1162 | 0.7619 | 0.795 | 0.3218 | 1.3758 | 0.795 | 0.7681 | 0.1989 | 0.0974 |
| 0.1081 | 94.0 | 1175 | 0.7577 | 0.8 | 0.3198 | 1.3793 | 0.8000 | 0.7696 | 0.1996 | 0.0957 |
| 0.1081 | 94.96 | 1187 | 0.7575 | 0.795 | 0.3201 | 1.3781 | 0.795 | 0.7666 | 0.1954 | 0.0964 |
| 0.1081 | 96.0 | 1200 | 0.7573 | 0.8 | 0.3199 | 1.3752 | 0.8000 | 0.7693 | 0.1863 | 0.0955 |
| 0.1081 | 96.96 | 1212 | 0.7615 | 0.795 | 0.3216 | 1.3753 | 0.795 | 0.7681 | 0.1997 | 0.0975 |
| 0.1081 | 98.0 | 1225 | 0.7603 | 0.795 | 0.3215 | 1.3731 | 0.795 | 0.7681 | 0.2051 | 0.0963 |
| 0.1081 | 98.96 | 1237 | 0.7596 | 0.795 | 0.3209 | 1.3744 | 0.795 | 0.7673 | 0.2081 | 0.0959 |
| 0.1081 | 100.0 | 1250 | 0.7582 | 0.795 | 0.3203 | 1.3743 | 0.795 | 0.7673 | 0.2024 | 0.0955 |
| 0.1081 | 100.96 | 1262 | 0.7609 | 0.795 | 0.3223 | 1.3761 | 0.795 | 0.7681 | 0.1823 | 0.0968 |
| 0.1081 | 102.0 | 1275 | 0.7632 | 0.785 | 0.3233 | 1.3758 | 0.785 | 0.7528 | 0.1833 | 0.0970 |
| 0.1081 | 102.96 | 1287 | 0.7618 | 0.785 | 0.3219 | 1.3785 | 0.785 | 0.7516 | 0.2141 | 0.0970 |
| 0.1081 | 104.0 | 1300 | 0.7633 | 0.795 | 0.3230 | 1.4970 | 0.795 | 0.7664 | 0.1956 | 0.0952 |
| 0.1081 | 104.96 | 1312 | 0.7657 | 0.79 | 0.3243 | 1.4406 | 0.79 | 0.7639 | 0.1960 | 0.0961 |
| 0.1081 | 106.0 | 1325 | 0.7673 | 0.785 | 0.3251 | 1.4424 | 0.785 | 0.7516 | 0.2083 | 0.0978 |
| 0.1081 | 106.96 | 1337 | 0.7667 | 0.79 | 0.3250 | 1.4392 | 0.79 | 0.7639 | 0.1875 | 0.0976 |
| 0.1081 | 108.0 | 1350 | 0.7690 | 0.785 | 0.3250 | 1.3876 | 0.785 | 0.7526 | 0.2078 | 0.0990 |
| 0.1081 | 108.96 | 1362 | 0.7676 | 0.785 | 0.3252 | 1.3872 | 0.785 | 0.7554 | 0.2073 | 0.0985 |
| 0.1081 | 110.0 | 1375 | 0.7662 | 0.79 | 0.3249 | 1.4335 | 0.79 | 0.7639 | 0.1939 | 0.0980 |
| 0.1081 | 110.96 | 1387 | 0.7723 | 0.785 | 0.3273 | 1.4567 | 0.785 | 0.7554 | 0.2066 | 0.0995 |
| 0.1081 | 112.0 | 1400 | 0.7665 | 0.78 | 0.3250 | 1.3960 | 0.78 | 0.7488 | 0.2066 | 0.0976 |
| 0.1081 | 112.96 | 1412 | 0.7722 | 0.785 | 0.3275 | 1.4410 | 0.785 | 0.7573 | 0.2063 | 0.0991 |
| 0.1081 | 114.0 | 1425 | 0.7722 | 0.79 | 0.3271 | 1.4039 | 0.79 | 0.7639 | 0.1902 | 0.0990 |
| 0.1081 | 114.96 | 1437 | 0.7699 | 0.79 | 0.3264 | 1.3849 | 0.79 | 0.7644 | 0.1914 | 0.0982 |
| 0.1081 | 116.0 | 1450 | 0.7749 | 0.785 | 0.3285 | 1.3854 | 0.785 | 0.7573 | 0.1942 | 0.0999 |
| 0.1081 | 116.96 | 1462 | 0.7722 | 0.78 | 0.3279 | 1.4365 | 0.78 | 0.7488 | 0.1973 | 0.0991 |
| 0.1081 | 118.0 | 1475 | 0.7763 | 0.78 | 0.3293 | 1.3823 | 0.78 | 0.7488 | 0.2050 | 0.1006 |
| 0.1081 | 118.96 | 1487 | 0.7740 | 0.78 | 0.3287 | 1.3822 | 0.78 | 0.7488 | 0.2105 | 0.0991 |
| 0.0821 | 120.0 | 1500 | 0.7761 | 0.785 | 0.3294 | 1.4414 | 0.785 | 0.7573 | 0.1996 | 0.0995 |
| 0.0821 | 120.96 | 1512 | 0.7749 | 0.78 | 0.3289 | 1.4387 | 0.78 | 0.7488 | 0.1981 | 0.0991 |
| 0.0821 | 122.0 | 1525 | 0.7763 | 0.78 | 0.3297 | 1.4395 | 0.78 | 0.7488 | 0.2175 | 0.0993 |
| 0.0821 | 122.96 | 1537 | 0.7775 | 0.78 | 0.3305 | 1.4407 | 0.78 | 0.7488 | 0.2073 | 0.0993 |
| 0.0821 | 124.0 | 1550 | 0.7770 | 0.78 | 0.3299 | 1.4411 | 0.78 | 0.7488 | 0.2096 | 0.0996 |
| 0.0821 | 124.96 | 1562 | 0.7785 | 0.78 | 0.3309 | 1.4415 | 0.78 | 0.7488 | 0.2174 | 0.1004 |
| 0.0821 | 126.0 | 1575 | 0.7808 | 0.78 | 0.3321 | 1.4431 | 0.78 | 0.7488 | 0.2082 | 0.1005 |
| 0.0821 | 126.96 | 1587 | 0.7791 | 0.78 | 0.3312 | 1.4405 | 0.78 | 0.7488 | 0.2087 | 0.0998 |
| 0.0821 | 128.0 | 1600 | 0.7789 | 0.78 | 0.3312 | 1.4386 | 0.78 | 0.7488 | 0.2047 | 0.0995 |
| 0.0821 | 128.96 | 1612 | 0.7829 | 0.78 | 0.3330 | 1.4423 | 0.78 | 0.7488 | 0.1920 | 0.1005 |
| 0.0821 | 130.0 | 1625 | 0.7797 | 0.78 | 0.3317 | 1.4400 | 0.78 | 0.7488 | 0.2013 | 0.1006 |
| 0.0821 | 130.96 | 1637 | 0.7849 | 0.78 | 0.3336 | 1.4446 | 0.78 | 0.7491 | 0.2064 | 0.1006 |
| 0.0821 | 132.0 | 1650 | 0.7817 | 0.78 | 0.3322 | 1.4396 | 0.78 | 0.7488 | 0.2060 | 0.1003 |
| 0.0821 | 132.96 | 1662 | 0.7823 | 0.78 | 0.3329 | 1.4407 | 0.78 | 0.7488 | 0.1990 | 0.0999 |
| 0.0821 | 134.0 | 1675 | 0.7869 | 0.78 | 0.3354 | 1.4482 | 0.78 | 0.7488 | 0.1999 | 0.1009 |
| 0.0821 | 134.96 | 1687 | 0.7859 | 0.78 | 0.3349 | 1.4429 | 0.78 | 0.7488 | 0.1934 | 0.1013 |
| 0.0821 | 136.0 | 1700 | 0.7867 | 0.78 | 0.3352 | 1.4437 | 0.78 | 0.7488 | 0.2114 | 0.1006 |
| 0.0821 | 136.96 | 1712 | 0.7867 | 0.78 | 0.3350 | 1.4403 | 0.78 | 0.7488 | 0.2070 | 0.1011 |
| 0.0821 | 138.0 | 1725 | 0.7851 | 0.78 | 0.3341 | 1.4439 | 0.78 | 0.7488 | 0.1906 | 0.1009 |
| 0.0821 | 138.96 | 1737 | 0.7892 | 0.78 | 0.3360 | 1.4495 | 0.78 | 0.7488 | 0.2009 | 0.1020 |
| 0.0821 | 140.0 | 1750 | 0.7893 | 0.78 | 0.3366 | 1.4434 | 0.78 | 0.7488 | 0.1976 | 0.1013 |
| 0.0821 | 140.96 | 1762 | 0.7848 | 0.78 | 0.3344 | 1.4383 | 0.78 | 0.7488 | 0.1995 | 0.1001 |
| 0.0821 | 142.0 | 1775 | 0.7911 | 0.78 | 0.3372 | 1.4487 | 0.78 | 0.7488 | 0.1995 | 0.1020 |
| 0.0821 | 142.96 | 1787 | 0.7890 | 0.78 | 0.3362 | 1.4416 | 0.78 | 0.7488 | 0.2075 | 0.1010 |
| 0.0821 | 144.0 | 1800 | 0.7915 | 0.78 | 0.3372 | 1.4476 | 0.78 | 0.7488 | 0.1842 | 0.1019 |
| 0.0821 | 144.96 | 1812 | 0.7876 | 0.78 | 0.3351 | 1.4999 | 0.78 | 0.7488 | 0.1904 | 0.0995 |
| 0.0821 | 146.0 | 1825 | 0.7933 | 0.78 | 0.3378 | 1.4469 | 0.78 | 0.7488 | 0.1973 | 0.1023 |
| 0.0821 | 146.96 | 1837 | 0.7932 | 0.78 | 0.3383 | 1.4441 | 0.78 | 0.7488 | 0.2070 | 0.1016 |
| 0.0821 | 148.0 | 1850 | 0.7907 | 0.78 | 0.3369 | 1.4439 | 0.78 | 0.7488 | 0.1932 | 0.1014 |
| 0.0821 | 148.96 | 1862 | 0.7939 | 0.78 | 0.3386 | 1.4462 | 0.78 | 0.7488 | 0.1906 | 0.1015 |
| 0.0821 | 150.0 | 1875 | 0.7943 | 0.78 | 0.3386 | 1.4449 | 0.78 | 0.7488 | 0.1965 | 0.1016 |
| 0.0821 | 150.96 | 1887 | 0.7955 | 0.78 | 0.3393 | 1.5025 | 0.78 | 0.7488 | 0.2112 | 0.1015 |
| 0.0821 | 152.0 | 1900 | 0.7936 | 0.78 | 0.3386 | 1.4407 | 0.78 | 0.7488 | 0.2112 | 0.1012 |
| 0.0821 | 152.96 | 1912 | 0.7966 | 0.78 | 0.3400 | 1.5033 | 0.78 | 0.7488 | 0.1963 | 0.1012 |
| 0.0821 | 154.0 | 1925 | 0.7981 | 0.78 | 0.3405 | 1.4495 | 0.78 | 0.7488 | 0.1895 | 0.1020 |
| 0.0821 | 154.96 | 1937 | 0.7972 | 0.78 | 0.3401 | 1.4417 | 0.78 | 0.7488 | 0.1953 | 0.1018 |
| 0.0821 | 156.0 | 1950 | 0.7922 | 0.78 | 0.3381 | 1.4395 | 0.78 | 0.7488 | 0.2056 | 0.0999 |
| 0.0821 | 156.96 | 1962 | 0.8013 | 0.775 | 0.3425 | 1.4473 | 0.775 | 0.7451 | 0.1869 | 0.1028 |
| 0.0821 | 158.0 | 1975 | 0.7977 | 0.78 | 0.3403 | 1.4446 | 0.78 | 0.7488 | 0.1872 | 0.1014 |
| 0.0821 | 158.96 | 1987 | 0.7990 | 0.78 | 0.3412 | 1.4413 | 0.78 | 0.7488 | 0.1939 | 0.1017 |
| 0.0668 | 160.0 | 2000 | 0.8048 | 0.775 | 0.3435 | 1.4532 | 0.775 | 0.7451 | 0.1966 | 0.1049 |
| 0.0668 | 160.96 | 2012 | 0.8064 | 0.77 | 0.3448 | 1.4529 | 0.7700 | 0.7358 | 0.1953 | 0.1044 |
| 0.0668 | 162.0 | 2025 | 0.7989 | 0.78 | 0.3412 | 1.4423 | 0.78 | 0.7488 | 0.2038 | 0.1022 |
| 0.0668 | 162.96 | 2037 | 0.8001 | 0.78 | 0.3414 | 1.4440 | 0.78 | 0.7488 | 0.1972 | 0.1015 |
| 0.0668 | 164.0 | 2050 | 0.8068 | 0.775 | 0.3448 | 1.4523 | 0.775 | 0.7396 | 0.2031 | 0.1036 |
| 0.0668 | 164.96 | 2062 | 0.8046 | 0.785 | 0.3438 | 1.4475 | 0.785 | 0.7536 | 0.2070 | 0.1037 |
| 0.0668 | 166.0 | 2075 | 0.8016 | 0.78 | 0.3426 | 1.4451 | 0.78 | 0.7488 | 0.1975 | 0.1012 |
| 0.0668 | 166.96 | 2087 | 0.8053 | 0.78 | 0.3442 | 1.4485 | 0.78 | 0.7477 | 0.2112 | 0.1022 |
| 0.0668 | 168.0 | 2100 | 0.8040 | 0.78 | 0.3433 | 1.4459 | 0.78 | 0.7422 | 0.2014 | 0.1031 |
| 0.0668 | 168.96 | 2112 | 0.8048 | 0.785 | 0.3437 | 1.4479 | 0.785 | 0.7515 | 0.2046 | 0.1033 |
| 0.0668 | 170.0 | 2125 | 0.8054 | 0.775 | 0.3447 | 1.5060 | 0.775 | 0.7450 | 0.1896 | 0.1017 |
| 0.0668 | 170.96 | 2137 | 0.8067 | 0.775 | 0.3451 | 1.5079 | 0.775 | 0.7450 | 0.1898 | 0.1018 |
| 0.0668 | 172.0 | 2150 | 0.8060 | 0.78 | 0.3447 | 1.4508 | 0.78 | 0.7488 | 0.1842 | 0.1022 |
| 0.0668 | 172.96 | 2162 | 0.8127 | 0.77 | 0.3484 | 1.4513 | 0.7700 | 0.7358 | 0.2006 | 0.1042 |
| 0.0668 | 174.0 | 2175 | 0.8080 | 0.77 | 0.3457 | 1.4453 | 0.7700 | 0.7349 | 0.2198 | 0.1034 |
| 0.0668 | 174.96 | 2187 | 0.8095 | 0.775 | 0.3460 | 1.4471 | 0.775 | 0.7384 | 0.2029 | 0.1027 |
| 0.0668 | 176.0 | 2200 | 0.8112 | 0.775 | 0.3467 | 1.4559 | 0.775 | 0.7395 | 0.1995 | 0.1036 |
| 0.0668 | 176.96 | 2212 | 0.8089 | 0.77 | 0.3460 | 1.4485 | 0.7700 | 0.7357 | 0.2050 | 0.1019 |
| 0.0668 | 178.0 | 2225 | 0.8093 | 0.77 | 0.3461 | 1.4459 | 0.7700 | 0.7357 | 0.1989 | 0.1021 |
| 0.0668 | 178.96 | 2237 | 0.8118 | 0.775 | 0.3473 | 1.4499 | 0.775 | 0.7384 | 0.2085 | 0.1029 |
| 0.0668 | 180.0 | 2250 | 0.8112 | 0.775 | 0.3472 | 1.4471 | 0.775 | 0.7384 | 0.2070 | 0.1027 |
| 0.0668 | 180.96 | 2262 | 0.8124 | 0.77 | 0.3478 | 1.4484 | 0.7700 | 0.7357 | 0.1983 | 0.1029 |
| 0.0668 | 182.0 | 2275 | 0.8140 | 0.77 | 0.3484 | 1.4489 | 0.7700 | 0.7357 | 0.1987 | 0.1038 |
| 0.0668 | 182.96 | 2287 | 0.8137 | 0.77 | 0.3483 | 1.4491 | 0.7700 | 0.7357 | 0.2036 | 0.1030 |
| 0.0668 | 184.0 | 2300 | 0.8133 | 0.77 | 0.3481 | 1.4468 | 0.7700 | 0.7357 | 0.2012 | 0.1024 |
| 0.0668 | 184.96 | 2312 | 0.8152 | 0.77 | 0.3489 | 1.4525 | 0.7700 | 0.7357 | 0.1996 | 0.1029 |
| 0.0668 | 186.0 | 2325 | 0.8149 | 0.77 | 0.3490 | 1.4511 | 0.7700 | 0.7357 | 0.1917 | 0.1027 |
| 0.0668 | 186.96 | 2337 | 0.8151 | 0.77 | 0.3490 | 1.4489 | 0.7700 | 0.7357 | 0.1956 | 0.1028 |
| 0.0668 | 188.0 | 2350 | 0.8175 | 0.77 | 0.3500 | 1.5084 | 0.7700 | 0.7357 | 0.2011 | 0.1038 |
| 0.0668 | 188.96 | 2362 | 0.8181 | 0.765 | 0.3499 | 1.4506 | 0.765 | 0.7323 | 0.1975 | 0.1056 |
| 0.0668 | 190.0 | 2375 | 0.8180 | 0.765 | 0.3504 | 1.4499 | 0.765 | 0.7323 | 0.2162 | 0.1050 |
| 0.0668 | 190.96 | 2387 | 0.8168 | 0.77 | 0.3498 | 1.4510 | 0.7700 | 0.7357 | 0.2014 | 0.1039 |
| 0.0668 | 192.0 | 2400 | 0.8183 | 0.77 | 0.3505 | 1.4483 | 0.7700 | 0.7379 | 0.2114 | 0.1032 |
| 0.0668 | 192.96 | 2412 | 0.8193 | 0.775 | 0.3507 | 1.4508 | 0.775 | 0.7384 | 0.2025 | 0.1042 |
| 0.0668 | 194.0 | 2425 | 0.8181 | 0.77 | 0.3503 | 1.4565 | 0.7700 | 0.7357 | 0.2090 | 0.1027 |
| 0.0668 | 194.96 | 2437 | 0.8192 | 0.77 | 0.3507 | 1.4513 | 0.7700 | 0.7357 | 0.1953 | 0.1032 |
| 0.0668 | 196.0 | 2450 | 0.8214 | 0.77 | 0.3520 | 1.4519 | 0.7700 | 0.7349 | 0.2112 | 0.1045 |
| 0.0668 | 196.96 | 2462 | 0.8231 | 0.765 | 0.3531 | 1.4517 | 0.765 | 0.7323 | 0.2042 | 0.1049 |
| 0.0668 | 198.0 | 2475 | 0.8219 | 0.77 | 0.3521 | 1.4512 | 0.7700 | 0.7349 | 0.2152 | 0.1044 |
| 0.0668 | 198.96 | 2487 | 0.8223 | 0.77 | 0.3523 | 1.4507 | 0.7700 | 0.7349 | 0.1888 | 0.1050 |
| 0.0571 | 200.0 | 2500 | 0.8235 | 0.77 | 0.3529 | 1.4533 | 0.7700 | 0.7349 | 0.2029 | 0.1050 |
| 0.0571 | 200.96 | 2512 | 0.8227 | 0.77 | 0.3525 | 1.4718 | 0.7700 | 0.7357 | 0.2170 | 0.1033 |
| 0.0571 | 202.0 | 2525 | 0.8226 | 0.77 | 0.3525 | 1.4505 | 0.7700 | 0.7349 | 0.1954 | 0.1041 |
| 0.0571 | 202.96 | 2537 | 0.8231 | 0.765 | 0.3530 | 1.4506 | 0.765 | 0.7321 | 0.1962 | 0.1046 |
| 0.0571 | 204.0 | 2550 | 0.8255 | 0.77 | 0.3535 | 1.4520 | 0.7700 | 0.7380 | 0.2078 | 0.1060 |
| 0.0571 | 204.96 | 2562 | 0.8276 | 0.77 | 0.3550 | 1.4594 | 0.7700 | 0.7349 | 0.2013 | 0.1046 |
| 0.0571 | 206.0 | 2575 | 0.8257 | 0.77 | 0.3542 | 1.4532 | 0.7700 | 0.7349 | 0.1987 | 0.1040 |
| 0.0571 | 206.96 | 2587 | 0.8248 | 0.775 | 0.3536 | 1.4499 | 0.775 | 0.7406 | 0.1903 | 0.1043 |
| 0.0571 | 208.0 | 2600 | 0.8250 | 0.77 | 0.3534 | 1.4537 | 0.7700 | 0.7349 | 0.2070 | 0.1040 |
| 0.0571 | 208.96 | 2612 | 0.8277 | 0.77 | 0.3548 | 1.4521 | 0.7700 | 0.7380 | 0.1867 | 0.1058 |
| 0.0571 | 210.0 | 2625 | 0.8271 | 0.77 | 0.3545 | 1.4543 | 0.7700 | 0.7349 | 0.2213 | 0.1036 |
| 0.0571 | 210.96 | 2637 | 0.8284 | 0.775 | 0.3552 | 1.4516 | 0.775 | 0.7406 | 0.1992 | 0.1053 |
| 0.0571 | 212.0 | 2650 | 0.8278 | 0.77 | 0.3545 | 1.4533 | 0.7700 | 0.7360 | 0.1938 | 0.1056 |
| 0.0571 | 212.96 | 2662 | 0.8289 | 0.77 | 0.3552 | 1.4533 | 0.7700 | 0.7380 | 0.2017 | 0.1057 |
| 0.0571 | 214.0 | 2675 | 0.8290 | 0.775 | 0.3556 | 1.4530 | 0.775 | 0.7406 | 0.2005 | 0.1052 |
| 0.0571 | 214.96 | 2687 | 0.8282 | 0.77 | 0.3551 | 1.4517 | 0.7700 | 0.7379 | 0.1985 | 0.1037 |
| 0.0571 | 216.0 | 2700 | 0.8294 | 0.77 | 0.3555 | 1.4588 | 0.7700 | 0.7349 | 0.1941 | 0.1045 |
| 0.0571 | 216.96 | 2712 | 0.8305 | 0.775 | 0.3562 | 1.4516 | 0.775 | 0.7406 | 0.1977 | 0.1057 |
| 0.0571 | 218.0 | 2725 | 0.8310 | 0.77 | 0.3565 | 1.4539 | 0.7700 | 0.7380 | 0.1926 | 0.1054 |
| 0.0571 | 218.96 | 2737 | 0.8304 | 0.775 | 0.3560 | 1.4516 | 0.775 | 0.7406 | 0.1986 | 0.1054 |
| 0.0571 | 220.0 | 2750 | 0.8320 | 0.775 | 0.3568 | 1.4545 | 0.775 | 0.7406 | 0.1953 | 0.1054 |
| 0.0571 | 220.96 | 2762 | 0.8316 | 0.775 | 0.3569 | 1.4523 | 0.775 | 0.7406 | 0.1945 | 0.1045 |
| 0.0571 | 222.0 | 2775 | 0.8330 | 0.77 | 0.3573 | 1.4547 | 0.7700 | 0.7380 | 0.1892 | 0.1067 |
| 0.0571 | 222.96 | 2787 | 0.8309 | 0.77 | 0.3563 | 1.4548 | 0.7700 | 0.7379 | 0.2060 | 0.1033 |
| 0.0571 | 224.0 | 2800 | 0.8323 | 0.775 | 0.3572 | 1.4515 | 0.775 | 0.7406 | 0.1910 | 0.1050 |
| 0.0571 | 224.96 | 2812 | 0.8329 | 0.775 | 0.3569 | 1.4530 | 0.775 | 0.7406 | 0.1931 | 0.1055 |
| 0.0571 | 226.0 | 2825 | 0.8319 | 0.78 | 0.3567 | 1.4513 | 0.78 | 0.7444 | 0.2038 | 0.1043 |
| 0.0571 | 226.96 | 2837 | 0.8354 | 0.77 | 0.3586 | 1.4556 | 0.7700 | 0.7380 | 0.1969 | 0.1068 |
| 0.0571 | 228.0 | 2850 | 0.8340 | 0.78 | 0.3575 | 1.4550 | 0.78 | 0.7444 | 0.2043 | 0.1062 |
| 0.0571 | 228.96 | 2862 | 0.8355 | 0.775 | 0.3584 | 1.4546 | 0.775 | 0.7406 | 0.2048 | 0.1055 |
| 0.0571 | 230.0 | 2875 | 0.8350 | 0.78 | 0.3579 | 1.4538 | 0.78 | 0.7444 | 0.2069 | 0.1064 |
| 0.0571 | 230.96 | 2887 | 0.8358 | 0.77 | 0.3584 | 1.4550 | 0.7700 | 0.7380 | 0.1899 | 0.1061 |
| 0.0571 | 232.0 | 2900 | 0.8366 | 0.77 | 0.3587 | 1.4564 | 0.7700 | 0.7380 | 0.1921 | 0.1070 |
| 0.0571 | 232.96 | 2912 | 0.8364 | 0.775 | 0.3587 | 1.4557 | 0.775 | 0.7418 | 0.1970 | 0.1065 |
| 0.0571 | 234.0 | 2925 | 0.8359 | 0.775 | 0.3585 | 1.4543 | 0.775 | 0.7406 | 0.1912 | 0.1061 |
| 0.0571 | 234.96 | 2937 | 0.8360 | 0.775 | 0.3587 | 1.4540 | 0.775 | 0.7406 | 0.2017 | 0.1049 |
| 0.0571 | 236.0 | 2950 | 0.8362 | 0.78 | 0.3587 | 1.4527 | 0.78 | 0.7444 | 0.1985 | 0.1060 |
| 0.0571 | 236.96 | 2962 | 0.8375 | 0.78 | 0.3593 | 1.4554 | 0.78 | 0.7444 | 0.2035 | 0.1061 |
| 0.0571 | 238.0 | 2975 | 0.8378 | 0.775 | 0.3593 | 1.4544 | 0.775 | 0.7418 | 0.1971 | 0.1068 |
| 0.0571 | 238.96 | 2987 | 0.8369 | 0.78 | 0.3588 | 1.4557 | 0.78 | 0.7444 | 0.2178 | 0.1057 |
| 0.0512 | 240.0 | 3000 | 0.8388 | 0.77 | 0.3600 | 1.4558 | 0.7700 | 0.7380 | 0.1939 | 0.1067 |
| 0.0512 | 240.96 | 3012 | 0.8375 | 0.78 | 0.3593 | 1.4540 | 0.78 | 0.7444 | 0.2071 | 0.1058 |
| 0.0512 | 242.0 | 3025 | 0.8393 | 0.775 | 0.3602 | 1.4546 | 0.775 | 0.7406 | 0.1990 | 0.1066 |
| 0.0512 | 242.96 | 3037 | 0.8391 | 0.775 | 0.3601 | 1.4551 | 0.775 | 0.7406 | 0.2025 | 0.1063 |
| 0.0512 | 244.0 | 3050 | 0.8414 | 0.77 | 0.3610 | 1.4575 | 0.7700 | 0.7380 | 0.1924 | 0.1072 |
| 0.0512 | 244.96 | 3062 | 0.8385 | 0.78 | 0.3597 | 1.4531 | 0.78 | 0.7444 | 0.2062 | 0.1059 |
| 0.0512 | 246.0 | 3075 | 0.8394 | 0.78 | 0.3603 | 1.4583 | 0.78 | 0.7444 | 0.1962 | 0.1057 |
| 0.0512 | 246.96 | 3087 | 0.8401 | 0.775 | 0.3604 | 1.4535 | 0.775 | 0.7406 | 0.1880 | 0.1060 |
| 0.0512 | 248.0 | 3100 | 0.8400 | 0.78 | 0.3605 | 1.4550 | 0.78 | 0.7444 | 0.2156 | 0.1058 |
| 0.0512 | 248.96 | 3112 | 0.8404 | 0.78 | 0.3606 | 1.4554 | 0.78 | 0.7444 | 0.1977 | 0.1061 |
| 0.0512 | 250.0 | 3125 | 0.8406 | 0.78 | 0.3607 | 1.4542 | 0.78 | 0.7444 | 0.2055 | 0.1062 |
| 0.0512 | 250.96 | 3137 | 0.8408 | 0.78 | 0.3608 | 1.4545 | 0.78 | 0.7444 | 0.2036 | 0.1062 |
| 0.0512 | 252.0 | 3150 | 0.8414 | 0.78 | 0.3611 | 1.4560 | 0.78 | 0.7444 | 0.2054 | 0.1063 |
| 0.0512 | 252.96 | 3162 | 0.8424 | 0.775 | 0.3614 | 1.4580 | 0.775 | 0.7418 | 0.2037 | 0.1072 |
| 0.0512 | 254.0 | 3175 | 0.8423 | 0.775 | 0.3616 | 1.4558 | 0.775 | 0.7406 | 0.2057 | 0.1064 |
| 0.0512 | 254.96 | 3187 | 0.8422 | 0.775 | 0.3613 | 1.4562 | 0.775 | 0.7418 | 0.2070 | 0.1066 |
| 0.0512 | 256.0 | 3200 | 0.8419 | 0.78 | 0.3612 | 1.4562 | 0.78 | 0.7444 | 0.2196 | 0.1063 |
| 0.0512 | 256.96 | 3212 | 0.8434 | 0.775 | 0.3620 | 1.4565 | 0.775 | 0.7406 | 0.2033 | 0.1065 |
| 0.0512 | 258.0 | 3225 | 0.8431 | 0.775 | 0.3619 | 1.4557 | 0.775 | 0.7418 | 0.2072 | 0.1064 |
| 0.0512 | 258.96 | 3237 | 0.8435 | 0.77 | 0.3620 | 1.4567 | 0.7700 | 0.7380 | 0.1985 | 0.1066 |
| 0.0512 | 260.0 | 3250 | 0.8433 | 0.78 | 0.3619 | 1.4567 | 0.78 | 0.7444 | 0.2179 | 0.1065 |
| 0.0512 | 260.96 | 3262 | 0.8430 | 0.78 | 0.3619 | 1.4558 | 0.78 | 0.7444 | 0.2120 | 0.1060 |
| 0.0512 | 262.0 | 3275 | 0.8432 | 0.78 | 0.3619 | 1.4552 | 0.78 | 0.7444 | 0.2058 | 0.1060 |
| 0.0512 | 262.96 | 3287 | 0.8444 | 0.775 | 0.3623 | 1.4572 | 0.775 | 0.7418 | 0.2035 | 0.1068 |
| 0.0512 | 264.0 | 3300 | 0.8442 | 0.775 | 0.3622 | 1.4574 | 0.775 | 0.7418 | 0.2054 | 0.1067 |
| 0.0512 | 264.96 | 3312 | 0.8441 | 0.78 | 0.3623 | 1.4554 | 0.78 | 0.7444 | 0.2051 | 0.1062 |
| 0.0512 | 266.0 | 3325 | 0.8446 | 0.775 | 0.3624 | 1.4561 | 0.775 | 0.7418 | 0.1975 | 0.1066 |
| 0.0512 | 266.96 | 3337 | 0.8447 | 0.775 | 0.3624 | 1.4570 | 0.775 | 0.7418 | 0.2053 | 0.1065 |
| 0.0512 | 268.0 | 3350 | 0.8448 | 0.78 | 0.3624 | 1.4573 | 0.78 | 0.7444 | 0.2085 | 0.1065 |
| 0.0512 | 268.96 | 3362 | 0.8443 | 0.78 | 0.3624 | 1.4558 | 0.78 | 0.7444 | 0.2119 | 0.1065 |
| 0.0512 | 270.0 | 3375 | 0.8453 | 0.775 | 0.3628 | 1.4571 | 0.775 | 0.7418 | 0.2035 | 0.1067 |
| 0.0512 | 270.96 | 3387 | 0.8444 | 0.78 | 0.3623 | 1.4561 | 0.78 | 0.7444 | 0.2076 | 0.1063 |
| 0.0512 | 272.0 | 3400 | 0.8455 | 0.775 | 0.3629 | 1.4569 | 0.775 | 0.7418 | 0.2034 | 0.1066 |
| 0.0512 | 272.96 | 3412 | 0.8453 | 0.78 | 0.3628 | 1.4574 | 0.78 | 0.7444 | 0.2021 | 0.1065 |
| 0.0512 | 274.0 | 3425 | 0.8450 | 0.78 | 0.3626 | 1.4560 | 0.78 | 0.7444 | 0.2058 | 0.1064 |
| 0.0512 | 274.96 | 3437 | 0.8456 | 0.775 | 0.3629 | 1.4569 | 0.775 | 0.7418 | 0.2035 | 0.1066 |
| 0.0512 | 276.0 | 3450 | 0.8454 | 0.775 | 0.3628 | 1.4565 | 0.775 | 0.7418 | 0.2033 | 0.1065 |
| 0.0512 | 276.96 | 3462 | 0.8454 | 0.78 | 0.3628 | 1.4575 | 0.78 | 0.7444 | 0.2137 | 0.1063 |
| 0.0512 | 278.0 | 3475 | 0.8457 | 0.78 | 0.3630 | 1.4567 | 0.78 | 0.7444 | 0.2092 | 0.1065 |
| 0.0512 | 278.96 | 3487 | 0.8462 | 0.775 | 0.3632 | 1.4567 | 0.775 | 0.7418 | 0.1994 | 0.1067 |
| 0.0481 | 280.0 | 3500 | 0.8456 | 0.78 | 0.3630 | 1.4572 | 0.78 | 0.7444 | 0.2192 | 0.1064 |
| 0.0481 | 280.96 | 3512 | 0.8462 | 0.775 | 0.3632 | 1.4571 | 0.775 | 0.7418 | 0.2034 | 0.1066 |
| 0.0481 | 282.0 | 3525 | 0.8457 | 0.775 | 0.3630 | 1.4563 | 0.775 | 0.7418 | 0.2042 | 0.1065 |
| 0.0481 | 282.96 | 3537 | 0.8460 | 0.775 | 0.3631 | 1.4570 | 0.775 | 0.7418 | 0.2106 | 0.1066 |
| 0.0481 | 284.0 | 3550 | 0.8462 | 0.775 | 0.3632 | 1.4570 | 0.775 | 0.7418 | 0.2106 | 0.1067 |
| 0.0481 | 284.96 | 3562 | 0.8460 | 0.775 | 0.3631 | 1.4567 | 0.775 | 0.7418 | 0.2042 | 0.1065 |
| 0.0481 | 286.0 | 3575 | 0.8461 | 0.775 | 0.3632 | 1.4568 | 0.775 | 0.7418 | 0.2043 | 0.1066 |
| 0.0481 | 286.96 | 3587 | 0.8461 | 0.775 | 0.3632 | 1.4570 | 0.775 | 0.7418 | 0.2043 | 0.1066 |
| 0.0481 | 288.0 | 3600 | 0.8461 | 0.775 | 0.3632 | 1.4570 | 0.775 | 0.7418 | 0.2043 | 0.1066 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
ayanban011/vit-base_tobacco_lr1e-5_wr_0.05_wd_0.1
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base_tobacco_lr1e-5_wr_0.05_wd_0.1
This model is a fine-tuned version of [jordyvl/vit-base_tobacco](https://huggingface.co/jordyvl/vit-base_tobacco) on the None dataset.
It achieves the following results on the evaluation set (a short sketch of the calibration metrics follows the list):
- Loss: 0.9592
- Accuracy: 0.775
- Brier Loss: 0.3981
- Nll: 1.5416
- F1 Micro: 0.775
- F1 Macro: 0.7418
- Ece: 0.2227
- Aurc: 0.1082
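Brier Loss, Nll, Ece and Aurc above are calibration and selective-risk metrics rather than standard `Trainer` outputs: Nll is the negative log-likelihood and Aurc the area under the risk-coverage curve. As a rough reference, here is a minimal sketch of how the multi-class Brier score and the expected calibration error (ECE) can be computed from softmax probabilities, assuming `probs` is an `(N, C)` array and `labels` an `(N,)` array of integer targets (the 10-bin equal-width scheme is an assumption, not necessarily the exact code behind the numbers above):
```python
import numpy as np

def brier_score(probs: np.ndarray, labels: np.ndarray) -> float:
    """Multi-class Brier score: mean squared distance between the
    predicted probability vector and the one-hot target."""
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs: np.ndarray, labels: np.ndarray,
                               n_bins: int = 10) -> float:
    """ECE: bin samples by confidence (max probability) and average the
    |accuracy - confidence| gap per bin, weighted by bin size."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return float(ece)
```
Lower Brier score and lower ECE both indicate better-calibrated probabilities, which is what the per-epoch columns in the table below track alongside accuracy.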
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 300
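A rough `transformers.TrainingArguments` equivalent of the list above, as a sketch only: `output_dir` is a placeholder, and `weight_decay=0.1` is inferred from the `wd_0.1` suffix of the model id rather than from the list itself. Adam with betas=(0.9,0.999) and epsilon=1e-08 is the library default optimizer, so it needs no explicit argument:
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base_tobacco_lr1e-5_wr_0.05_wd_0.1",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=16,  # effective train batch size: 4 * 16 = 64
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    weight_decay=0.1,  # assumption: taken from the "wd_0.1" model id suffix
    num_train_epochs=300,
)
```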
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:------:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 12 | 0.7440 | 0.815 | 0.3076 | 1.1842 | 0.815 | 0.7942 | 0.2216 | 0.0733 |
| No log | 2.0 | 25 | 0.7436 | 0.82 | 0.3075 | 1.1869 | 0.82 | 0.8049 | 0.2132 | 0.0741 |
| No log | 2.96 | 37 | 0.7454 | 0.81 | 0.3085 | 1.1880 | 0.81 | 0.7914 | 0.2312 | 0.0755 |
| No log | 4.0 | 50 | 0.7439 | 0.815 | 0.3077 | 1.1846 | 0.815 | 0.7926 | 0.2369 | 0.0760 |
| No log | 4.96 | 62 | 0.7370 | 0.82 | 0.3040 | 1.1982 | 0.82 | 0.8028 | 0.2374 | 0.0745 |
| No log | 6.0 | 75 | 0.7507 | 0.82 | 0.3112 | 1.1980 | 0.82 | 0.8005 | 0.2513 | 0.0809 |
| No log | 6.96 | 87 | 0.7370 | 0.805 | 0.3060 | 1.1778 | 0.805 | 0.7841 | 0.2522 | 0.0746 |
| No log | 8.0 | 100 | 0.7437 | 0.81 | 0.3076 | 1.1846 | 0.81 | 0.7877 | 0.2301 | 0.0804 |
| No log | 8.96 | 112 | 0.7311 | 0.81 | 0.3031 | 1.1975 | 0.81 | 0.7920 | 0.2084 | 0.0753 |
| No log | 10.0 | 125 | 0.7305 | 0.8 | 0.3020 | 1.1785 | 0.8000 | 0.7792 | 0.2131 | 0.0777 |
| No log | 10.96 | 137 | 0.7478 | 0.805 | 0.3119 | 1.3979 | 0.805 | 0.7860 | 0.2133 | 0.0827 |
| No log | 12.0 | 150 | 0.7469 | 0.805 | 0.3082 | 1.3337 | 0.805 | 0.7844 | 0.2213 | 0.0843 |
| No log | 12.96 | 162 | 0.7545 | 0.805 | 0.3114 | 1.4280 | 0.805 | 0.7893 | 0.2092 | 0.0935 |
| No log | 14.0 | 175 | 0.7283 | 0.795 | 0.3012 | 1.1856 | 0.795 | 0.7739 | 0.2182 | 0.0806 |
| No log | 14.96 | 187 | 0.7219 | 0.815 | 0.2972 | 1.2792 | 0.815 | 0.8043 | 0.2024 | 0.0734 |
| No log | 16.0 | 200 | 0.7284 | 0.805 | 0.3001 | 1.2528 | 0.805 | 0.7899 | 0.2068 | 0.0858 |
| No log | 16.96 | 212 | 0.7191 | 0.805 | 0.2981 | 1.3067 | 0.805 | 0.7919 | 0.2062 | 0.0809 |
| No log | 18.0 | 225 | 0.7221 | 0.8 | 0.3011 | 1.1747 | 0.8000 | 0.7792 | 0.2091 | 0.0803 |
| No log | 18.96 | 237 | 0.7253 | 0.81 | 0.2995 | 1.3143 | 0.81 | 0.7955 | 0.2136 | 0.0889 |
| No log | 20.0 | 250 | 0.7186 | 0.8 | 0.2981 | 1.1839 | 0.8000 | 0.7819 | 0.1899 | 0.0812 |
| No log | 20.96 | 262 | 0.7247 | 0.805 | 0.3012 | 1.2501 | 0.805 | 0.7925 | 0.2214 | 0.0891 |
| No log | 22.0 | 275 | 0.7317 | 0.805 | 0.3058 | 1.3767 | 0.805 | 0.7853 | 0.2141 | 0.0893 |
| No log | 22.96 | 287 | 0.7250 | 0.81 | 0.3031 | 1.3683 | 0.81 | 0.7907 | 0.1886 | 0.0838 |
| No log | 24.0 | 300 | 0.7137 | 0.805 | 0.2983 | 1.3088 | 0.805 | 0.7851 | 0.1799 | 0.0782 |
| No log | 24.96 | 312 | 0.7334 | 0.81 | 0.3070 | 1.4296 | 0.81 | 0.7909 | 0.1903 | 0.0898 |
| No log | 26.0 | 325 | 0.7284 | 0.81 | 0.3035 | 1.2467 | 0.81 | 0.7984 | 0.2152 | 0.0916 |
| No log | 26.96 | 337 | 0.7242 | 0.805 | 0.3020 | 1.3077 | 0.805 | 0.7862 | 0.2071 | 0.0891 |
| No log | 28.0 | 350 | 0.7285 | 0.81 | 0.3028 | 1.3756 | 0.81 | 0.7910 | 0.2158 | 0.0915 |
| No log | 28.96 | 362 | 0.7253 | 0.8 | 0.3016 | 1.3714 | 0.8000 | 0.7716 | 0.2057 | 0.0894 |
| No log | 30.0 | 375 | 0.7321 | 0.8 | 0.3068 | 1.3688 | 0.8000 | 0.7736 | 0.1943 | 0.0885 |
| No log | 30.96 | 387 | 0.7294 | 0.8 | 0.3047 | 1.3713 | 0.8000 | 0.7746 | 0.2138 | 0.0900 |
| No log | 32.0 | 400 | 0.7296 | 0.81 | 0.3054 | 1.3749 | 0.81 | 0.7921 | 0.2074 | 0.0910 |
| No log | 32.96 | 412 | 0.7311 | 0.805 | 0.3061 | 1.3704 | 0.805 | 0.7811 | 0.1984 | 0.0920 |
| No log | 34.0 | 425 | 0.7291 | 0.805 | 0.3049 | 1.3686 | 0.805 | 0.7811 | 0.2126 | 0.0916 |
| No log | 34.96 | 437 | 0.7301 | 0.795 | 0.3048 | 1.3712 | 0.795 | 0.7654 | 0.1917 | 0.0904 |
| No log | 36.0 | 450 | 0.7318 | 0.81 | 0.3072 | 1.3695 | 0.81 | 0.7844 | 0.1976 | 0.0900 |
| No log | 36.96 | 462 | 0.7403 | 0.795 | 0.3102 | 1.3712 | 0.795 | 0.7656 | 0.2039 | 0.0934 |
| No log | 38.0 | 475 | 0.7376 | 0.795 | 0.3095 | 1.3653 | 0.795 | 0.7654 | 0.1982 | 0.0920 |
| No log | 38.96 | 487 | 0.7326 | 0.805 | 0.3049 | 1.3815 | 0.805 | 0.7744 | 0.1820 | 0.0948 |
| 0.1331 | 40.0 | 500 | 0.7268 | 0.8 | 0.3038 | 1.3702 | 0.8000 | 0.7704 | 0.2051 | 0.0899 |
| 0.1331 | 40.96 | 512 | 0.7371 | 0.8 | 0.3074 | 1.3824 | 0.8000 | 0.7684 | 0.1946 | 0.0939 |
| 0.1331 | 42.0 | 525 | 0.7374 | 0.81 | 0.3107 | 1.3600 | 0.81 | 0.7844 | 0.2109 | 0.0910 |
| 0.1331 | 42.96 | 537 | 0.7366 | 0.8 | 0.3071 | 1.4434 | 0.8000 | 0.7776 | 0.2042 | 0.0935 |
| 0.1331 | 44.0 | 550 | 0.7362 | 0.805 | 0.3083 | 1.3721 | 0.805 | 0.7829 | 0.1782 | 0.0929 |
| 0.1331 | 44.96 | 562 | 0.7389 | 0.8 | 0.3110 | 1.3695 | 0.8000 | 0.7704 | 0.1966 | 0.0917 |
| 0.1331 | 46.0 | 575 | 0.7426 | 0.79 | 0.3108 | 1.5068 | 0.79 | 0.7644 | 0.1938 | 0.0968 |
| 0.1331 | 46.96 | 587 | 0.7395 | 0.8 | 0.3096 | 1.3760 | 0.8000 | 0.7704 | 0.1951 | 0.0927 |
| 0.1331 | 48.0 | 600 | 0.7540 | 0.805 | 0.3185 | 1.4936 | 0.805 | 0.7821 | 0.1958 | 0.0979 |
| 0.1331 | 48.96 | 612 | 0.7413 | 0.805 | 0.3116 | 1.4368 | 0.805 | 0.7829 | 0.1835 | 0.0955 |
| 0.1331 | 50.0 | 625 | 0.7543 | 0.805 | 0.3167 | 1.4402 | 0.805 | 0.7831 | 0.2143 | 0.0974 |
| 0.1331 | 50.96 | 637 | 0.7378 | 0.805 | 0.3087 | 1.3850 | 0.805 | 0.7829 | 0.1886 | 0.0935 |
| 0.1331 | 52.0 | 650 | 0.7545 | 0.795 | 0.3175 | 1.3873 | 0.795 | 0.7656 | 0.2007 | 0.0957 |
| 0.1331 | 52.96 | 662 | 0.7464 | 0.8 | 0.3140 | 1.3734 | 0.8000 | 0.7707 | 0.1872 | 0.0938 |
| 0.1331 | 54.0 | 675 | 0.7439 | 0.8 | 0.3120 | 1.3765 | 0.8000 | 0.7704 | 0.2036 | 0.0942 |
| 0.1331 | 54.96 | 687 | 0.7506 | 0.8 | 0.3150 | 1.3788 | 0.8000 | 0.7707 | 0.1788 | 0.0959 |
| 0.1331 | 56.0 | 700 | 0.7511 | 0.805 | 0.3158 | 1.4378 | 0.805 | 0.7829 | 0.2054 | 0.0955 |
| 0.1331 | 56.96 | 712 | 0.7587 | 0.805 | 0.3196 | 1.4494 | 0.805 | 0.7831 | 0.1844 | 0.0972 |
| 0.1331 | 58.0 | 725 | 0.7505 | 0.8 | 0.3154 | 1.3759 | 0.8000 | 0.7704 | 0.1913 | 0.0953 |
| 0.1331 | 58.96 | 737 | 0.7553 | 0.79 | 0.3167 | 1.4457 | 0.79 | 0.7549 | 0.1977 | 0.0959 |
| 0.1331 | 60.0 | 750 | 0.7543 | 0.8 | 0.3175 | 1.3807 | 0.8000 | 0.7707 | 0.1963 | 0.0953 |
| 0.1331 | 60.96 | 762 | 0.7592 | 0.795 | 0.3200 | 1.3759 | 0.795 | 0.7681 | 0.1986 | 0.0961 |
| 0.1331 | 62.0 | 775 | 0.7557 | 0.795 | 0.3185 | 1.3785 | 0.795 | 0.7634 | 0.1971 | 0.0948 |
| 0.1331 | 62.96 | 787 | 0.7591 | 0.79 | 0.3200 | 1.4466 | 0.79 | 0.7613 | 0.2033 | 0.0963 |
| 0.1331 | 64.0 | 800 | 0.7624 | 0.795 | 0.3210 | 1.4423 | 0.795 | 0.7621 | 0.2030 | 0.0962 |
| 0.1331 | 64.96 | 812 | 0.7674 | 0.79 | 0.3240 | 1.4454 | 0.79 | 0.7596 | 0.1973 | 0.0969 |
| 0.1331 | 66.0 | 825 | 0.7645 | 0.79 | 0.3224 | 1.4497 | 0.79 | 0.7611 | 0.1999 | 0.0964 |
| 0.1331 | 66.96 | 837 | 0.7652 | 0.795 | 0.3234 | 1.4418 | 0.795 | 0.7668 | 0.1819 | 0.0968 |
| 0.1331 | 68.0 | 850 | 0.7695 | 0.795 | 0.3250 | 1.4969 | 0.795 | 0.7606 | 0.1914 | 0.0979 |
| 0.1331 | 68.96 | 862 | 0.7708 | 0.785 | 0.3258 | 1.4482 | 0.785 | 0.7516 | 0.1954 | 0.0976 |
| 0.1331 | 70.0 | 875 | 0.7691 | 0.795 | 0.3249 | 1.4960 | 0.795 | 0.7673 | 0.1895 | 0.0976 |
| 0.1331 | 70.96 | 887 | 0.7741 | 0.785 | 0.3272 | 1.5043 | 0.785 | 0.7519 | 0.1898 | 0.0982 |
| 0.1331 | 72.0 | 900 | 0.7788 | 0.79 | 0.3293 | 1.5094 | 0.79 | 0.7611 | 0.1738 | 0.0995 |
| 0.1331 | 72.96 | 912 | 0.7837 | 0.785 | 0.3329 | 1.5306 | 0.785 | 0.7577 | 0.2002 | 0.1004 |
| 0.1331 | 74.0 | 925 | 0.7755 | 0.785 | 0.3280 | 1.4985 | 0.785 | 0.7514 | 0.1906 | 0.0981 |
| 0.1331 | 74.96 | 937 | 0.7797 | 0.785 | 0.3308 | 1.4611 | 0.785 | 0.7580 | 0.1925 | 0.0994 |
| 0.1331 | 76.0 | 950 | 0.7744 | 0.785 | 0.3273 | 1.4441 | 0.785 | 0.7519 | 0.1929 | 0.0976 |
| 0.1331 | 76.96 | 962 | 0.7766 | 0.785 | 0.3295 | 1.4420 | 0.785 | 0.7516 | 0.1899 | 0.0972 |
| 0.1331 | 78.0 | 975 | 0.7888 | 0.785 | 0.3339 | 1.4991 | 0.785 | 0.7573 | 0.1879 | 0.0998 |
| 0.1331 | 78.96 | 987 | 0.7765 | 0.795 | 0.3292 | 1.4915 | 0.795 | 0.7663 | 0.1750 | 0.0948 |
| 0.071 | 80.0 | 1000 | 0.7821 | 0.785 | 0.3303 | 1.4990 | 0.785 | 0.7519 | 0.1940 | 0.0986 |
| 0.071 | 80.96 | 1012 | 0.7860 | 0.79 | 0.3330 | 1.4977 | 0.79 | 0.7644 | 0.1698 | 0.0976 |
| 0.071 | 82.0 | 1025 | 0.7882 | 0.78 | 0.3342 | 1.5243 | 0.78 | 0.7482 | 0.1930 | 0.1006 |
| 0.071 | 82.96 | 1037 | 0.7879 | 0.78 | 0.3333 | 1.5037 | 0.78 | 0.7491 | 0.2055 | 0.0995 |
| 0.071 | 84.0 | 1050 | 0.7842 | 0.78 | 0.3326 | 1.4959 | 0.78 | 0.7488 | 0.1945 | 0.0985 |
| 0.071 | 84.96 | 1062 | 0.7866 | 0.78 | 0.3338 | 1.4961 | 0.78 | 0.7488 | 0.1877 | 0.0982 |
| 0.071 | 86.0 | 1075 | 0.7931 | 0.785 | 0.3369 | 1.5006 | 0.785 | 0.7573 | 0.1898 | 0.1003 |
| 0.071 | 86.96 | 1087 | 0.7937 | 0.78 | 0.3360 | 1.5043 | 0.78 | 0.7488 | 0.1828 | 0.0999 |
| 0.071 | 88.0 | 1100 | 0.7948 | 0.78 | 0.3374 | 1.5034 | 0.78 | 0.7488 | 0.1893 | 0.0999 |
| 0.071 | 88.96 | 1112 | 0.7962 | 0.78 | 0.3372 | 1.5078 | 0.78 | 0.7494 | 0.1943 | 0.1011 |
| 0.071 | 90.0 | 1125 | 0.7956 | 0.785 | 0.3377 | 1.5039 | 0.785 | 0.7516 | 0.1918 | 0.0999 |
| 0.071 | 90.96 | 1137 | 0.7996 | 0.78 | 0.3382 | 1.5060 | 0.78 | 0.7491 | 0.1982 | 0.1013 |
| 0.071 | 92.0 | 1150 | 0.7980 | 0.78 | 0.3381 | 1.5023 | 0.78 | 0.7488 | 0.1902 | 0.1004 |
| 0.071 | 92.96 | 1162 | 0.8015 | 0.78 | 0.3396 | 1.5029 | 0.78 | 0.7488 | 0.1978 | 0.1007 |
| 0.071 | 94.0 | 1175 | 0.8044 | 0.78 | 0.3411 | 1.5047 | 0.78 | 0.7488 | 0.1929 | 0.1012 |
| 0.071 | 94.96 | 1187 | 0.7977 | 0.78 | 0.3392 | 1.4989 | 0.78 | 0.7488 | 0.1866 | 0.0989 |
| 0.071 | 96.0 | 1200 | 0.8071 | 0.78 | 0.3425 | 1.5021 | 0.78 | 0.7488 | 0.1941 | 0.1018 |
| 0.071 | 96.96 | 1212 | 0.8033 | 0.78 | 0.3406 | 1.4967 | 0.78 | 0.7488 | 0.1913 | 0.1000 |
| 0.071 | 98.0 | 1225 | 0.8148 | 0.775 | 0.3466 | 1.4555 | 0.775 | 0.7462 | 0.1828 | 0.1036 |
| 0.071 | 98.96 | 1237 | 0.8062 | 0.78 | 0.3417 | 1.5007 | 0.78 | 0.7488 | 0.1949 | 0.1004 |
| 0.071 | 100.0 | 1250 | 0.8123 | 0.77 | 0.3456 | 1.5069 | 0.7700 | 0.7424 | 0.1964 | 0.1020 |
| 0.071 | 100.96 | 1262 | 0.8117 | 0.78 | 0.3452 | 1.5048 | 0.78 | 0.7488 | 0.2081 | 0.1020 |
| 0.071 | 102.0 | 1275 | 0.8125 | 0.77 | 0.3454 | 1.5066 | 0.7700 | 0.7424 | 0.2040 | 0.1022 |
| 0.071 | 102.96 | 1287 | 0.8134 | 0.775 | 0.3458 | 1.5048 | 0.775 | 0.7450 | 0.1977 | 0.1013 |
| 0.071 | 104.0 | 1300 | 0.8152 | 0.78 | 0.3461 | 1.5027 | 0.78 | 0.7488 | 0.2044 | 0.1014 |
| 0.071 | 104.96 | 1312 | 0.8185 | 0.78 | 0.3478 | 1.5057 | 0.78 | 0.7488 | 0.1900 | 0.1022 |
| 0.071 | 106.0 | 1325 | 0.8191 | 0.78 | 0.3480 | 1.5053 | 0.78 | 0.7488 | 0.2084 | 0.1026 |
| 0.071 | 106.96 | 1337 | 0.8207 | 0.77 | 0.3497 | 1.5095 | 0.7700 | 0.7424 | 0.1984 | 0.1025 |
| 0.071 | 108.0 | 1350 | 0.8221 | 0.77 | 0.3487 | 1.5095 | 0.7700 | 0.7424 | 0.1871 | 0.1031 |
| 0.071 | 108.96 | 1362 | 0.8229 | 0.765 | 0.3501 | 1.4607 | 0.765 | 0.7331 | 0.1920 | 0.1028 |
| 0.071 | 110.0 | 1375 | 0.8232 | 0.78 | 0.3498 | 1.5044 | 0.78 | 0.7488 | 0.1995 | 0.1023 |
| 0.071 | 110.96 | 1387 | 0.8279 | 0.785 | 0.3513 | 1.5060 | 0.785 | 0.7526 | 0.2073 | 0.1033 |
| 0.071 | 112.0 | 1400 | 0.8246 | 0.775 | 0.3505 | 1.5038 | 0.775 | 0.7450 | 0.1927 | 0.1018 |
| 0.071 | 112.96 | 1412 | 0.8308 | 0.765 | 0.3537 | 1.5095 | 0.765 | 0.7331 | 0.1931 | 0.1035 |
| 0.071 | 114.0 | 1425 | 0.8277 | 0.775 | 0.3513 | 1.5058 | 0.775 | 0.7395 | 0.1977 | 0.1022 |
| 0.071 | 114.96 | 1437 | 0.8302 | 0.76 | 0.3531 | 1.4583 | 0.76 | 0.7296 | 0.2112 | 0.1028 |
| 0.071 | 116.0 | 1450 | 0.8328 | 0.765 | 0.3535 | 1.5125 | 0.765 | 0.7331 | 0.2008 | 0.1037 |
| 0.071 | 116.96 | 1462 | 0.8309 | 0.76 | 0.3533 | 1.4542 | 0.76 | 0.7296 | 0.2037 | 0.1029 |
| 0.071 | 118.0 | 1475 | 0.8378 | 0.765 | 0.3558 | 1.5162 | 0.765 | 0.7323 | 0.2040 | 0.1055 |
| 0.071 | 118.96 | 1487 | 0.8341 | 0.76 | 0.3547 | 1.5076 | 0.76 | 0.7296 | 0.1942 | 0.1032 |
| 0.0462 | 120.0 | 1500 | 0.8367 | 0.76 | 0.3557 | 1.5134 | 0.76 | 0.7296 | 0.1987 | 0.1034 |
| 0.0462 | 120.96 | 1512 | 0.8369 | 0.76 | 0.3553 | 1.5081 | 0.76 | 0.7296 | 0.2121 | 0.1036 |
| 0.0462 | 122.0 | 1525 | 0.8385 | 0.77 | 0.3560 | 1.5076 | 0.7700 | 0.7357 | 0.1944 | 0.1034 |
| 0.0462 | 122.96 | 1537 | 0.8415 | 0.76 | 0.3577 | 1.5127 | 0.76 | 0.7296 | 0.2080 | 0.1040 |
| 0.0462 | 124.0 | 1550 | 0.8418 | 0.765 | 0.3571 | 1.5123 | 0.765 | 0.7333 | 0.1905 | 0.1043 |
| 0.0462 | 124.96 | 1562 | 0.8431 | 0.76 | 0.3581 | 1.5124 | 0.76 | 0.7296 | 0.2029 | 0.1043 |
| 0.0462 | 126.0 | 1575 | 0.8461 | 0.765 | 0.3595 | 1.5115 | 0.765 | 0.7331 | 0.1861 | 0.1044 |
| 0.0462 | 126.96 | 1587 | 0.8446 | 0.76 | 0.3586 | 1.5117 | 0.76 | 0.7296 | 0.1962 | 0.1043 |
| 0.0462 | 128.0 | 1600 | 0.8448 | 0.765 | 0.3585 | 1.5106 | 0.765 | 0.7333 | 0.1899 | 0.1048 |
| 0.0462 | 128.96 | 1612 | 0.8503 | 0.765 | 0.3611 | 1.5156 | 0.765 | 0.7323 | 0.1865 | 0.1050 |
| 0.0462 | 130.0 | 1625 | 0.8473 | 0.765 | 0.3597 | 1.5082 | 0.765 | 0.7333 | 0.1992 | 0.1040 |
| 0.0462 | 130.96 | 1637 | 0.8530 | 0.76 | 0.3617 | 1.5178 | 0.76 | 0.7296 | 0.2008 | 0.1053 |
| 0.0462 | 132.0 | 1650 | 0.8499 | 0.765 | 0.3608 | 1.5105 | 0.765 | 0.7321 | 0.1910 | 0.1035 |
| 0.0462 | 132.96 | 1662 | 0.8529 | 0.765 | 0.3612 | 1.5095 | 0.765 | 0.7333 | 0.1943 | 0.1043 |
| 0.0462 | 134.0 | 1675 | 0.8547 | 0.765 | 0.3635 | 1.5095 | 0.765 | 0.7321 | 0.2002 | 0.1032 |
| 0.0462 | 134.96 | 1687 | 0.8572 | 0.765 | 0.3638 | 1.5159 | 0.765 | 0.7333 | 0.1979 | 0.1056 |
| 0.0462 | 136.0 | 1700 | 0.8582 | 0.765 | 0.3642 | 1.5165 | 0.765 | 0.7333 | 0.2026 | 0.1057 |
| 0.0462 | 136.96 | 1712 | 0.8581 | 0.76 | 0.3639 | 1.5118 | 0.76 | 0.7296 | 0.1965 | 0.1052 |
| 0.0462 | 138.0 | 1725 | 0.8570 | 0.77 | 0.3629 | 1.5094 | 0.7700 | 0.7358 | 0.1870 | 0.1029 |
| 0.0462 | 138.96 | 1737 | 0.8611 | 0.76 | 0.3650 | 1.5129 | 0.76 | 0.7296 | 0.1919 | 0.1040 |
| 0.0462 | 140.0 | 1750 | 0.8618 | 0.76 | 0.3659 | 1.5131 | 0.76 | 0.7296 | 0.1981 | 0.1042 |
| 0.0462 | 140.96 | 1762 | 0.8605 | 0.765 | 0.3652 | 1.5115 | 0.765 | 0.7333 | 0.1875 | 0.1048 |
| 0.0462 | 142.0 | 1775 | 0.8647 | 0.76 | 0.3666 | 1.5157 | 0.76 | 0.7296 | 0.2002 | 0.1052 |
| 0.0462 | 142.96 | 1787 | 0.8618 | 0.76 | 0.3654 | 1.5116 | 0.76 | 0.7296 | 0.2006 | 0.1045 |
| 0.0462 | 144.0 | 1800 | 0.8672 | 0.765 | 0.3672 | 1.5160 | 0.765 | 0.7333 | 0.1979 | 0.1053 |
| 0.0462 | 144.96 | 1812 | 0.8625 | 0.77 | 0.3648 | 1.5080 | 0.7700 | 0.7358 | 0.1975 | 0.1026 |
| 0.0462 | 146.0 | 1825 | 0.8695 | 0.765 | 0.3679 | 1.5169 | 0.765 | 0.7323 | 0.1973 | 0.1051 |
| 0.0462 | 146.96 | 1837 | 0.8696 | 0.76 | 0.3685 | 1.5132 | 0.76 | 0.7296 | 0.1936 | 0.1037 |
| 0.0462 | 148.0 | 1850 | 0.8678 | 0.765 | 0.3671 | 1.5110 | 0.765 | 0.7333 | 0.2008 | 0.1040 |
| 0.0462 | 148.96 | 1862 | 0.8713 | 0.765 | 0.3690 | 1.5152 | 0.765 | 0.7333 | 0.1983 | 0.1050 |
| 0.0462 | 150.0 | 1875 | 0.8716 | 0.765 | 0.3687 | 1.5163 | 0.765 | 0.7323 | 0.2029 | 0.1051 |
| 0.0462 | 150.96 | 1887 | 0.8724 | 0.77 | 0.3691 | 1.5113 | 0.7700 | 0.7358 | 0.1997 | 0.1037 |
| 0.0462 | 152.0 | 1900 | 0.8729 | 0.765 | 0.3695 | 1.5134 | 0.765 | 0.7333 | 0.1966 | 0.1050 |
| 0.0462 | 152.96 | 1912 | 0.8760 | 0.765 | 0.3706 | 1.5131 | 0.765 | 0.7333 | 0.2046 | 0.1040 |
| 0.0462 | 154.0 | 1925 | 0.8761 | 0.765 | 0.3707 | 1.5138 | 0.765 | 0.7333 | 0.1896 | 0.1037 |
| 0.0462 | 154.96 | 1937 | 0.8778 | 0.765 | 0.3711 | 1.5138 | 0.765 | 0.7333 | 0.2012 | 0.1046 |
| 0.0462 | 156.0 | 1950 | 0.8768 | 0.765 | 0.3712 | 1.5125 | 0.765 | 0.7333 | 0.1891 | 0.1041 |
| 0.0462 | 156.96 | 1962 | 0.8816 | 0.77 | 0.3732 | 1.5205 | 0.7700 | 0.7360 | 0.1993 | 0.1067 |
| 0.0462 | 158.0 | 1975 | 0.8793 | 0.765 | 0.3718 | 1.5157 | 0.765 | 0.7333 | 0.2025 | 0.1049 |
| 0.0462 | 158.96 | 1987 | 0.8788 | 0.77 | 0.3713 | 1.5126 | 0.7700 | 0.7358 | 0.2044 | 0.1039 |
| 0.0335 | 160.0 | 2000 | 0.8851 | 0.77 | 0.3739 | 1.5193 | 0.7700 | 0.7360 | 0.2042 | 0.1069 |
| 0.0335 | 160.96 | 2012 | 0.8872 | 0.77 | 0.3748 | 1.5200 | 0.7700 | 0.7360 | 0.2009 | 0.1057 |
| 0.0335 | 162.0 | 2025 | 0.8827 | 0.765 | 0.3731 | 1.5144 | 0.765 | 0.7333 | 0.1897 | 0.1050 |
| 0.0335 | 162.96 | 2037 | 0.8821 | 0.765 | 0.3724 | 1.5129 | 0.765 | 0.7333 | 0.1971 | 0.1042 |
| 0.0335 | 164.0 | 2050 | 0.8919 | 0.77 | 0.3770 | 1.5229 | 0.7700 | 0.7360 | 0.2119 | 0.1061 |
| 0.0335 | 164.96 | 2062 | 0.8907 | 0.765 | 0.3764 | 1.5240 | 0.765 | 0.7323 | 0.2125 | 0.1069 |
| 0.0335 | 166.0 | 2075 | 0.8857 | 0.765 | 0.3743 | 1.5127 | 0.765 | 0.7333 | 0.1906 | 0.1044 |
| 0.0335 | 166.96 | 2087 | 0.8928 | 0.77 | 0.3771 | 1.5253 | 0.7700 | 0.7360 | 0.2062 | 0.1062 |
| 0.0335 | 168.0 | 2100 | 0.8895 | 0.77 | 0.3750 | 1.5179 | 0.7700 | 0.7360 | 0.2062 | 0.1054 |
| 0.0335 | 168.96 | 2112 | 0.8904 | 0.77 | 0.3754 | 1.5178 | 0.7700 | 0.7360 | 0.2048 | 0.1055 |
| 0.0335 | 170.0 | 2125 | 0.8919 | 0.765 | 0.3766 | 1.5137 | 0.765 | 0.7333 | 0.2170 | 0.1044 |
| 0.0335 | 170.96 | 2137 | 0.8949 | 0.77 | 0.3779 | 1.5203 | 0.7700 | 0.7360 | 0.2042 | 0.1069 |
| 0.0335 | 172.0 | 2150 | 0.8949 | 0.77 | 0.3779 | 1.5204 | 0.7700 | 0.7360 | 0.2078 | 0.1069 |
| 0.0335 | 172.96 | 2162 | 0.8986 | 0.765 | 0.3794 | 1.5241 | 0.765 | 0.7310 | 0.2079 | 0.1072 |
| 0.0335 | 174.0 | 2175 | 0.8978 | 0.76 | 0.3787 | 1.5201 | 0.76 | 0.7272 | 0.2108 | 0.1056 |
| 0.0335 | 174.96 | 2187 | 0.8990 | 0.77 | 0.3786 | 1.5198 | 0.7700 | 0.7360 | 0.2032 | 0.1053 |
| 0.0335 | 176.0 | 2200 | 0.9003 | 0.77 | 0.3794 | 1.5206 | 0.7700 | 0.7360 | 0.1996 | 0.1060 |
| 0.0335 | 176.96 | 2212 | 0.9000 | 0.77 | 0.3797 | 1.5196 | 0.7700 | 0.7360 | 0.2116 | 0.1063 |
| 0.0335 | 178.0 | 2225 | 0.9000 | 0.765 | 0.3794 | 1.5178 | 0.765 | 0.7333 | 0.1875 | 0.1055 |
| 0.0335 | 178.96 | 2237 | 0.9034 | 0.77 | 0.3804 | 1.5218 | 0.7700 | 0.7360 | 0.1964 | 0.1068 |
| 0.0335 | 180.0 | 2250 | 0.9020 | 0.77 | 0.3802 | 1.5198 | 0.7700 | 0.7360 | 0.2058 | 0.1063 |
| 0.0335 | 180.96 | 2262 | 0.9037 | 0.77 | 0.3808 | 1.5192 | 0.7700 | 0.7360 | 0.1976 | 0.1063 |
| 0.0335 | 182.0 | 2275 | 0.9059 | 0.77 | 0.3812 | 1.5227 | 0.7700 | 0.7360 | 0.1962 | 0.1067 |
| 0.0335 | 182.96 | 2287 | 0.9063 | 0.77 | 0.3818 | 1.5206 | 0.7700 | 0.7360 | 0.2000 | 0.1065 |
| 0.0335 | 184.0 | 2300 | 0.9058 | 0.77 | 0.3814 | 1.5196 | 0.7700 | 0.7360 | 0.1926 | 0.1061 |
| 0.0335 | 184.96 | 2312 | 0.9082 | 0.77 | 0.3821 | 1.5211 | 0.7700 | 0.7360 | 0.2001 | 0.1067 |
| 0.0335 | 186.0 | 2325 | 0.9083 | 0.77 | 0.3824 | 1.5204 | 0.7700 | 0.7360 | 0.2062 | 0.1057 |
| 0.0335 | 186.96 | 2337 | 0.9090 | 0.77 | 0.3824 | 1.5220 | 0.7700 | 0.7360 | 0.2027 | 0.1063 |
| 0.0335 | 188.0 | 2350 | 0.9106 | 0.77 | 0.3828 | 1.5213 | 0.7700 | 0.7360 | 0.1968 | 0.1068 |
| 0.0335 | 188.96 | 2362 | 0.9116 | 0.77 | 0.3829 | 1.5238 | 0.7700 | 0.7360 | 0.2029 | 0.1071 |
| 0.0335 | 190.0 | 2375 | 0.9120 | 0.77 | 0.3835 | 1.5225 | 0.7700 | 0.7360 | 0.1953 | 0.1064 |
| 0.0335 | 190.96 | 2387 | 0.9123 | 0.77 | 0.3835 | 1.5227 | 0.7700 | 0.7360 | 0.2080 | 0.1069 |
| 0.0335 | 192.0 | 2400 | 0.9131 | 0.775 | 0.3838 | 1.5222 | 0.775 | 0.7418 | 0.2039 | 0.1061 |
| 0.0335 | 192.96 | 2412 | 0.9144 | 0.765 | 0.3841 | 1.5200 | 0.765 | 0.7333 | 0.2163 | 0.1060 |
| 0.0335 | 194.0 | 2425 | 0.9138 | 0.77 | 0.3839 | 1.5200 | 0.7700 | 0.7360 | 0.2092 | 0.1057 |
| 0.0335 | 194.96 | 2437 | 0.9164 | 0.775 | 0.3850 | 1.5249 | 0.775 | 0.7418 | 0.2188 | 0.1065 |
| 0.0335 | 196.0 | 2450 | 0.9185 | 0.77 | 0.3861 | 1.5257 | 0.7700 | 0.7360 | 0.2087 | 0.1067 |
| 0.0335 | 196.96 | 2462 | 0.9207 | 0.77 | 0.3868 | 1.5286 | 0.7700 | 0.7360 | 0.2063 | 0.1074 |
| 0.0335 | 198.0 | 2475 | 0.9191 | 0.77 | 0.3858 | 1.5254 | 0.7700 | 0.7360 | 0.2129 | 0.1068 |
| 0.0335 | 198.96 | 2487 | 0.9195 | 0.77 | 0.3861 | 1.5240 | 0.7700 | 0.7360 | 0.2059 | 0.1066 |
| 0.0264 | 200.0 | 2500 | 0.9205 | 0.77 | 0.3864 | 1.5246 | 0.7700 | 0.7360 | 0.2081 | 0.1069 |
| 0.0264 | 200.96 | 2512 | 0.9214 | 0.77 | 0.3865 | 1.5235 | 0.7700 | 0.7360 | 0.2018 | 0.1066 |
| 0.0264 | 202.0 | 2525 | 0.9216 | 0.775 | 0.3867 | 1.5253 | 0.775 | 0.7418 | 0.2156 | 0.1068 |
| 0.0264 | 202.96 | 2537 | 0.9218 | 0.775 | 0.3870 | 1.5225 | 0.775 | 0.7418 | 0.2108 | 0.1064 |
| 0.0264 | 204.0 | 2550 | 0.9241 | 0.775 | 0.3871 | 1.4893 | 0.775 | 0.7418 | 0.2087 | 0.1071 |
| 0.0264 | 204.96 | 2562 | 0.9270 | 0.77 | 0.3889 | 1.5244 | 0.7700 | 0.7360 | 0.2024 | 0.1071 |
| 0.0264 | 206.0 | 2575 | 0.9260 | 0.775 | 0.3885 | 1.5262 | 0.775 | 0.7418 | 0.2116 | 0.1069 |
| 0.0264 | 206.96 | 2587 | 0.9259 | 0.775 | 0.3883 | 1.5269 | 0.775 | 0.7418 | 0.2089 | 0.1065 |
| 0.0264 | 208.0 | 2600 | 0.9254 | 0.77 | 0.3875 | 1.5247 | 0.7700 | 0.7360 | 0.2060 | 0.1069 |
| 0.0264 | 208.96 | 2612 | 0.9285 | 0.775 | 0.3889 | 1.5281 | 0.775 | 0.7418 | 0.2115 | 0.1074 |
| 0.0264 | 210.0 | 2625 | 0.9277 | 0.775 | 0.3886 | 1.5254 | 0.775 | 0.7418 | 0.2114 | 0.1069 |
| 0.0264 | 210.96 | 2637 | 0.9304 | 0.775 | 0.3897 | 1.5278 | 0.775 | 0.7418 | 0.2095 | 0.1071 |
| 0.0264 | 212.0 | 2650 | 0.9288 | 0.77 | 0.3886 | 1.5270 | 0.7700 | 0.7360 | 0.2068 | 0.1070 |
| 0.0264 | 212.96 | 2662 | 0.9310 | 0.775 | 0.3896 | 1.5316 | 0.775 | 0.7418 | 0.2135 | 0.1071 |
| 0.0264 | 214.0 | 2675 | 0.9311 | 0.775 | 0.3899 | 1.5263 | 0.775 | 0.7418 | 0.2187 | 0.1070 |
| 0.0264 | 214.96 | 2687 | 0.9315 | 0.775 | 0.3899 | 1.5256 | 0.775 | 0.7418 | 0.2123 | 0.1068 |
| 0.0264 | 216.0 | 2700 | 0.9315 | 0.77 | 0.3896 | 1.5258 | 0.7700 | 0.7360 | 0.2070 | 0.1071 |
| 0.0264 | 216.96 | 2712 | 0.9334 | 0.775 | 0.3905 | 1.5291 | 0.775 | 0.7418 | 0.2088 | 0.1071 |
| 0.0264 | 218.0 | 2725 | 0.9342 | 0.775 | 0.3908 | 1.5283 | 0.775 | 0.7418 | 0.2146 | 0.1072 |
| 0.0264 | 218.96 | 2737 | 0.9337 | 0.775 | 0.3903 | 1.5282 | 0.775 | 0.7418 | 0.2110 | 0.1070 |
| 0.0264 | 220.0 | 2750 | 0.9357 | 0.775 | 0.3913 | 1.5284 | 0.775 | 0.7418 | 0.2149 | 0.1073 |
| 0.0264 | 220.96 | 2762 | 0.9367 | 0.775 | 0.3918 | 1.5299 | 0.775 | 0.7418 | 0.2088 | 0.1072 |
| 0.0264 | 222.0 | 2775 | 0.9371 | 0.775 | 0.3916 | 1.5294 | 0.775 | 0.7418 | 0.2141 | 0.1075 |
| 0.0264 | 222.96 | 2787 | 0.9359 | 0.775 | 0.3910 | 1.5271 | 0.775 | 0.7418 | 0.2126 | 0.1067 |
| 0.0264 | 224.0 | 2800 | 0.9374 | 0.775 | 0.3918 | 1.5298 | 0.775 | 0.7418 | 0.2084 | 0.1072 |
| 0.0264 | 224.96 | 2812 | 0.9378 | 0.775 | 0.3914 | 1.5296 | 0.775 | 0.7418 | 0.2073 | 0.1072 |
| 0.0264 | 226.0 | 2825 | 0.9377 | 0.775 | 0.3916 | 1.5274 | 0.775 | 0.7418 | 0.2075 | 0.1066 |
| 0.0264 | 226.96 | 2837 | 0.9412 | 0.775 | 0.3932 | 1.5310 | 0.775 | 0.7418 | 0.2096 | 0.1077 |
| 0.0264 | 228.0 | 2850 | 0.9402 | 0.775 | 0.3923 | 1.5329 | 0.775 | 0.7418 | 0.2161 | 0.1076 |
| 0.0264 | 228.96 | 2862 | 0.9420 | 0.775 | 0.3932 | 1.5301 | 0.775 | 0.7418 | 0.2078 | 0.1074 |
| 0.0264 | 230.0 | 2875 | 0.9412 | 0.775 | 0.3925 | 1.5315 | 0.775 | 0.7418 | 0.2078 | 0.1076 |
| 0.0264 | 230.96 | 2887 | 0.9422 | 0.775 | 0.3930 | 1.5340 | 0.775 | 0.7418 | 0.2179 | 0.1077 |
| 0.0264 | 232.0 | 2900 | 0.9431 | 0.775 | 0.3933 | 1.5336 | 0.775 | 0.7418 | 0.2158 | 0.1081 |
| 0.0264 | 232.96 | 2912 | 0.9428 | 0.775 | 0.3931 | 1.5304 | 0.775 | 0.7418 | 0.2086 | 0.1075 |
| 0.0264 | 234.0 | 2925 | 0.9434 | 0.775 | 0.3935 | 1.5325 | 0.775 | 0.7418 | 0.2152 | 0.1074 |
| 0.0264 | 234.96 | 2937 | 0.9431 | 0.775 | 0.3933 | 1.5286 | 0.775 | 0.7418 | 0.2081 | 0.1070 |
| 0.0264 | 236.0 | 2950 | 0.9438 | 0.775 | 0.3935 | 1.5307 | 0.775 | 0.7418 | 0.2077 | 0.1073 |
| 0.0264 | 236.96 | 2962 | 0.9452 | 0.775 | 0.3940 | 1.5329 | 0.775 | 0.7418 | 0.2217 | 0.1074 |
| 0.0264 | 238.0 | 2975 | 0.9453 | 0.775 | 0.3939 | 1.5328 | 0.775 | 0.7418 | 0.2129 | 0.1076 |
| 0.0264 | 238.96 | 2987 | 0.9451 | 0.775 | 0.3937 | 1.5308 | 0.775 | 0.7418 | 0.2133 | 0.1073 |
| 0.0223 | 240.0 | 3000 | 0.9470 | 0.775 | 0.3947 | 1.5333 | 0.775 | 0.7418 | 0.2220 | 0.1077 |
| 0.0223 | 240.96 | 3012 | 0.9461 | 0.775 | 0.3942 | 1.5329 | 0.775 | 0.7418 | 0.2127 | 0.1072 |
| 0.0223 | 242.0 | 3025 | 0.9477 | 0.775 | 0.3949 | 1.5310 | 0.775 | 0.7418 | 0.2133 | 0.1074 |
| 0.0223 | 242.96 | 3037 | 0.9480 | 0.775 | 0.3949 | 1.5331 | 0.775 | 0.7418 | 0.2165 | 0.1073 |
| 0.0223 | 244.0 | 3050 | 0.9499 | 0.775 | 0.3955 | 1.5384 | 0.775 | 0.7418 | 0.2226 | 0.1080 |
| 0.0223 | 244.96 | 3062 | 0.9476 | 0.775 | 0.3946 | 1.5322 | 0.775 | 0.7418 | 0.2128 | 0.1069 |
| 0.0223 | 246.0 | 3075 | 0.9490 | 0.775 | 0.3953 | 1.5298 | 0.775 | 0.7418 | 0.2137 | 0.1071 |
| 0.0223 | 246.96 | 3087 | 0.9496 | 0.775 | 0.3953 | 1.5315 | 0.775 | 0.7418 | 0.2133 | 0.1071 |
| 0.0223 | 248.0 | 3100 | 0.9500 | 0.775 | 0.3955 | 1.5335 | 0.775 | 0.7418 | 0.2131 | 0.1072 |
| 0.0223 | 248.96 | 3112 | 0.9503 | 0.775 | 0.3956 | 1.5323 | 0.775 | 0.7418 | 0.2164 | 0.1072 |
| 0.0223 | 250.0 | 3125 | 0.9505 | 0.775 | 0.3955 | 1.5338 | 0.775 | 0.7418 | 0.2128 | 0.1071 |
| 0.0223 | 250.96 | 3137 | 0.9510 | 0.775 | 0.3957 | 1.5372 | 0.775 | 0.7418 | 0.2266 | 0.1072 |
| 0.0223 | 252.0 | 3150 | 0.9517 | 0.775 | 0.3960 | 1.5363 | 0.775 | 0.7418 | 0.2222 | 0.1073 |
| 0.0223 | 252.96 | 3162 | 0.9526 | 0.775 | 0.3961 | 1.5372 | 0.775 | 0.7418 | 0.2227 | 0.1080 |
| 0.0223 | 254.0 | 3175 | 0.9527 | 0.77 | 0.3963 | 1.5340 | 0.7700 | 0.7368 | 0.2174 | 0.1081 |
| 0.0223 | 254.96 | 3187 | 0.9527 | 0.775 | 0.3962 | 1.5389 | 0.775 | 0.7418 | 0.2222 | 0.1074 |
| 0.0223 | 256.0 | 3200 | 0.9528 | 0.775 | 0.3962 | 1.5347 | 0.775 | 0.7418 | 0.2258 | 0.1073 |
| 0.0223 | 256.96 | 3212 | 0.9545 | 0.775 | 0.3969 | 1.5401 | 0.775 | 0.7418 | 0.2226 | 0.1083 |
| 0.0223 | 258.0 | 3225 | 0.9540 | 0.775 | 0.3966 | 1.5369 | 0.775 | 0.7418 | 0.2224 | 0.1074 |
| 0.0223 | 258.96 | 3237 | 0.9547 | 0.775 | 0.3969 | 1.5370 | 0.775 | 0.7418 | 0.2228 | 0.1082 |
| 0.0223 | 260.0 | 3250 | 0.9549 | 0.775 | 0.3969 | 1.5381 | 0.775 | 0.7418 | 0.2226 | 0.1075 |
| 0.0223 | 260.96 | 3262 | 0.9545 | 0.775 | 0.3968 | 1.5345 | 0.775 | 0.7418 | 0.2134 | 0.1072 |
| 0.0223 | 262.0 | 3275 | 0.9550 | 0.775 | 0.3970 | 1.5362 | 0.775 | 0.7418 | 0.2145 | 0.1079 |
| 0.0223 | 262.96 | 3287 | 0.9558 | 0.775 | 0.3971 | 1.5392 | 0.775 | 0.7418 | 0.2227 | 0.1076 |
| 0.0223 | 264.0 | 3300 | 0.9557 | 0.775 | 0.3970 | 1.5383 | 0.775 | 0.7418 | 0.2226 | 0.1074 |
| 0.0223 | 264.96 | 3312 | 0.9561 | 0.775 | 0.3973 | 1.5393 | 0.775 | 0.7418 | 0.2224 | 0.1080 |
| 0.0223 | 266.0 | 3325 | 0.9563 | 0.775 | 0.3972 | 1.5387 | 0.775 | 0.7418 | 0.2224 | 0.1073 |
| 0.0223 | 266.96 | 3337 | 0.9568 | 0.775 | 0.3974 | 1.5407 | 0.775 | 0.7418 | 0.2225 | 0.1082 |
| 0.0223 | 268.0 | 3350 | 0.9567 | 0.775 | 0.3973 | 1.5373 | 0.775 | 0.7418 | 0.2259 | 0.1080 |
| 0.0223 | 268.96 | 3362 | 0.9566 | 0.775 | 0.3973 | 1.5371 | 0.775 | 0.7418 | 0.2225 | 0.1080 |
| 0.0223 | 270.0 | 3375 | 0.9574 | 0.775 | 0.3976 | 1.5403 | 0.775 | 0.7418 | 0.2227 | 0.1075 |
| 0.0223 | 270.96 | 3387 | 0.9568 | 0.775 | 0.3974 | 1.5363 | 0.775 | 0.7418 | 0.2225 | 0.1072 |
| 0.0223 | 272.0 | 3400 | 0.9580 | 0.775 | 0.3978 | 1.5465 | 0.775 | 0.7418 | 0.2241 | 0.1081 |
| 0.0223 | 272.96 | 3412 | 0.9577 | 0.775 | 0.3977 | 1.5383 | 0.775 | 0.7418 | 0.2228 | 0.1074 |
| 0.0223 | 274.0 | 3425 | 0.9577 | 0.775 | 0.3976 | 1.5409 | 0.775 | 0.7418 | 0.2225 | 0.1080 |
| 0.0223 | 274.96 | 3437 | 0.9582 | 0.775 | 0.3978 | 1.5409 | 0.775 | 0.7418 | 0.2226 | 0.1075 |
| 0.0223 | 276.0 | 3450 | 0.9581 | 0.775 | 0.3978 | 1.5412 | 0.775 | 0.7418 | 0.2225 | 0.1082 |
| 0.0223 | 276.96 | 3462 | 0.9582 | 0.775 | 0.3978 | 1.5367 | 0.775 | 0.7418 | 0.2220 | 0.1073 |
| 0.0223 | 278.0 | 3475 | 0.9587 | 0.775 | 0.3980 | 1.5422 | 0.775 | 0.7418 | 0.2244 | 0.1082 |
| 0.0223 | 278.96 | 3487 | 0.9588 | 0.775 | 0.3980 | 1.5478 | 0.775 | 0.7418 | 0.2242 | 0.1082 |
| 0.0202 | 280.0 | 3500 | 0.9586 | 0.775 | 0.3980 | 1.5381 | 0.775 | 0.7418 | 0.2219 | 0.1081 |
| 0.0202 | 280.96 | 3512 | 0.9592 | 0.775 | 0.3981 | 1.5474 | 0.775 | 0.7418 | 0.2243 | 0.1082 |
| 0.0202 | 282.0 | 3525 | 0.9588 | 0.775 | 0.3980 | 1.5396 | 0.775 | 0.7418 | 0.2227 | 0.1080 |
| 0.0202 | 282.96 | 3537 | 0.9589 | 0.775 | 0.3980 | 1.5401 | 0.775 | 0.7418 | 0.2218 | 0.1074 |
| 0.0202 | 284.0 | 3550 | 0.9593 | 0.775 | 0.3982 | 1.5441 | 0.775 | 0.7418 | 0.2243 | 0.1083 |
| 0.0202 | 284.96 | 3562 | 0.9591 | 0.775 | 0.3981 | 1.5412 | 0.775 | 0.7418 | 0.2227 | 0.1082 |
| 0.0202 | 286.0 | 3575 | 0.9592 | 0.775 | 0.3981 | 1.5417 | 0.775 | 0.7418 | 0.2227 | 0.1082 |
| 0.0202 | 286.96 | 3587 | 0.9592 | 0.775 | 0.3981 | 1.5416 | 0.775 | 0.7418 | 0.2227 | 0.1082 |
| 0.0202 | 288.0 | 3600 | 0.9592 | 0.775 | 0.3981 | 1.5416 | 0.775 | 0.7418 | 0.2227 | 0.1082 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-tiny_tobacco3482_dualsimkd_
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_tobacco3482_dualsimkd_
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1401
- Accuracy: 0.385
- Brier Loss: 0.8709
- Nll: 8.8462
- F1 Micro: 0.3850
- F1 Macro: 0.1979
- Ece: 0.3606
- Aurc: 0.3874
## Model description
More information needed
## Intended uses & limitations
More information needed
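No intended uses are documented, but if this checkpoint is published on the Hub under the id above, a document-image classification call would look roughly like the following sketch (`page.png` is a placeholder path; the label names come from the list at the end of this card):
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "jordyvl/vit-tiny_tobacco3482_dualsimkd_"  # assumes the checkpoint is on the Hub
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("page.png").convert("RGB")  # placeholder path to a document scan
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(-1).item()
print(model.config.id2label[pred])  # e.g. "memo", "letter", "email", ...
```
Note that the evaluation accuracy reported above is only 0.385, so predictions from this particular checkpoint should be treated with caution.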
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 100 | 0.5117 | 0.04 | 0.9009 | 19.1664 | 0.04 | 0.0077 | 0.1344 | 0.9445 |
| No log | 2.0 | 200 | 0.3168 | 0.05 | 0.8997 | 15.0313 | 0.0500 | 0.0095 | 0.1344 | 0.8364 |
| No log | 3.0 | 300 | 0.2703 | 0.18 | 0.8978 | 9.6860 | 0.18 | 0.0305 | 0.2180 | 0.7731 |
| No log | 4.0 | 400 | 0.2266 | 0.18 | 0.8952 | 12.0957 | 0.18 | 0.0305 | 0.2223 | 0.7993 |
| 1.1219 | 5.0 | 500 | 0.1687 | 0.18 | 0.8951 | 12.7136 | 0.18 | 0.0305 | 0.2215 | 0.7713 |
| 1.1219 | 6.0 | 600 | 0.1331 | 0.165 | 0.8956 | 12.6737 | 0.165 | 0.0284 | 0.2044 | 0.7829 |
| 1.1219 | 7.0 | 700 | 0.1139 | 0.18 | 0.8960 | 12.6380 | 0.18 | 0.0305 | 0.2283 | 0.7875 |
| 1.1219 | 8.0 | 800 | 0.1143 | 0.18 | 0.8963 | 12.6385 | 0.18 | 0.0306 | 0.2183 | 0.7703 |
| 1.1219 | 9.0 | 900 | 0.1246 | 0.18 | 0.8966 | 12.5389 | 0.18 | 0.0305 | 0.2223 | 0.7726 |
| 0.0694 | 10.0 | 1000 | 0.1262 | 0.18 | 0.8961 | 12.6316 | 0.18 | 0.0305 | 0.2271 | 0.7894 |
| 0.0694 | 11.0 | 1100 | 0.1186 | 0.155 | 0.8961 | 12.6309 | 0.155 | 0.0268 | 0.2169 | 0.6418 |
| 0.0694 | 12.0 | 1200 | 0.1290 | 0.18 | 0.8960 | 12.6360 | 0.18 | 0.0305 | 0.2272 | 0.8014 |
| 0.0694 | 13.0 | 1300 | 0.1202 | 0.18 | 0.8959 | 12.6644 | 0.18 | 0.0305 | 0.2274 | 0.7910 |
| 0.0694 | 14.0 | 1400 | 0.1341 | 0.18 | 0.8960 | 12.6667 | 0.18 | 0.0305 | 0.2273 | 0.7916 |
| 0.0505 | 15.0 | 1500 | 0.1234 | 0.18 | 0.8961 | 12.6653 | 0.18 | 0.0305 | 0.2261 | 0.7819 |
| 0.0505 | 16.0 | 1600 | 0.1375 | 0.18 | 0.8960 | 12.6951 | 0.18 | 0.0305 | 0.2283 | 0.7929 |
| 0.0505 | 17.0 | 1700 | 0.1249 | 0.18 | 0.8959 | 12.7041 | 0.18 | 0.0305 | 0.2262 | 0.7820 |
| 0.0505 | 18.0 | 1800 | 0.1263 | 0.18 | 0.8964 | 12.6096 | 0.18 | 0.0305 | 0.2228 | 0.7900 |
| 0.0505 | 19.0 | 1900 | 0.1243 | 0.18 | 0.8961 | 12.6667 | 0.18 | 0.0305 | 0.2229 | 0.7896 |
| 0.0483 | 20.0 | 2000 | 0.1246 | 0.18 | 0.8960 | 12.6285 | 0.18 | 0.0305 | 0.2172 | 0.7913 |
| 0.0483 | 21.0 | 2100 | 0.1218 | 0.18 | 0.8961 | 12.6375 | 0.18 | 0.0305 | 0.2250 | 0.8003 |
| 0.0483 | 22.0 | 2200 | 0.1228 | 0.18 | 0.8964 | 12.5765 | 0.18 | 0.0305 | 0.2258 | 0.7938 |
| 0.0483 | 23.0 | 2300 | 0.1270 | 0.18 | 0.8963 | 12.6332 | 0.18 | 0.0305 | 0.2239 | 0.8055 |
| 0.0483 | 24.0 | 2400 | 0.1303 | 0.18 | 0.8963 | 12.5914 | 0.18 | 0.0305 | 0.2270 | 0.8006 |
| 0.0484 | 25.0 | 2500 | 0.1234 | 0.18 | 0.8960 | 12.6429 | 0.18 | 0.0305 | 0.2208 | 0.7990 |
| 0.0484 | 26.0 | 2600 | 0.1313 | 0.18 | 0.8965 | 12.5721 | 0.18 | 0.0305 | 0.2205 | 0.8069 |
| 0.0484 | 27.0 | 2700 | 0.1314 | 0.18 | 0.8963 | 12.5982 | 0.18 | 0.0305 | 0.2247 | 0.8110 |
| 0.0484 | 28.0 | 2800 | 0.1326 | 0.18 | 0.8962 | 12.6539 | 0.18 | 0.0305 | 0.2143 | 0.8083 |
| 0.0484 | 29.0 | 2900 | 0.1337 | 0.18 | 0.8964 | 12.5814 | 0.18 | 0.0305 | 0.2225 | 0.8106 |
| 0.0473 | 30.0 | 3000 | 0.1369 | 0.18 | 0.8962 | 12.6021 | 0.18 | 0.0305 | 0.2258 | 0.8095 |
| 0.0473 | 31.0 | 3100 | 0.1295 | 0.18 | 0.8958 | 12.6587 | 0.18 | 0.0305 | 0.2273 | 0.8104 |
| 0.0473 | 32.0 | 3200 | 0.1343 | 0.18 | 0.8959 | 12.6740 | 0.18 | 0.0305 | 0.2220 | 0.8119 |
| 0.0473 | 33.0 | 3300 | 0.1359 | 0.18 | 0.8960 | 12.6790 | 0.18 | 0.0305 | 0.2273 | 0.8134 |
| 0.0473 | 34.0 | 3400 | 0.1367 | 0.18 | 0.8961 | 12.6336 | 0.18 | 0.0305 | 0.2228 | 0.8159 |
| 0.0476 | 35.0 | 3500 | 0.1378 | 0.18 | 0.8963 | 12.6119 | 0.18 | 0.0305 | 0.2270 | 0.8172 |
| 0.0476 | 36.0 | 3600 | 0.1286 | 0.18 | 0.8961 | 12.6340 | 0.18 | 0.0305 | 0.2218 | 0.8148 |
| 0.0476 | 37.0 | 3700 | 0.1333 | 0.18 | 0.8960 | 12.6328 | 0.18 | 0.0305 | 0.2207 | 0.8164 |
| 0.0476 | 38.0 | 3800 | 0.1328 | 0.18 | 0.8963 | 12.6294 | 0.18 | 0.0305 | 0.2196 | 0.8180 |
| 0.0476 | 39.0 | 3900 | 0.1344 | 0.18 | 0.8961 | 12.6417 | 0.18 | 0.0305 | 0.2207 | 0.8209 |
| 0.0474 | 40.0 | 4000 | 0.1362 | 0.18 | 0.8959 | 12.6775 | 0.18 | 0.0305 | 0.2187 | 0.8198 |
| 0.0474 | 41.0 | 4100 | 0.1340 | 0.18 | 0.8961 | 12.6746 | 0.18 | 0.0305 | 0.2249 | 0.8215 |
| 0.0474 | 42.0 | 4200 | 0.1308 | 0.18 | 0.8958 | 12.6621 | 0.18 | 0.0305 | 0.2208 | 0.8215 |
| 0.0474 | 43.0 | 4300 | 0.1372 | 0.18 | 0.8960 | 12.6133 | 0.18 | 0.0305 | 0.2249 | 0.8204 |
| 0.0474 | 44.0 | 4400 | 0.1436 | 0.18 | 0.8963 | 12.6014 | 0.18 | 0.0305 | 0.2280 | 0.8201 |
| 0.0472 | 45.0 | 4500 | 0.1374 | 0.18 | 0.8960 | 12.6316 | 0.18 | 0.0305 | 0.2228 | 0.8193 |
| 0.0472 | 46.0 | 4600 | 0.1261 | 0.18 | 0.8957 | 12.6840 | 0.18 | 0.0305 | 0.2251 | 0.8220 |
| 0.0472 | 47.0 | 4700 | 0.1340 | 0.18 | 0.8956 | 12.6704 | 0.18 | 0.0305 | 0.2251 | 0.8221 |
| 0.0472 | 48.0 | 4800 | 0.1320 | 0.18 | 0.8959 | 12.6111 | 0.18 | 0.0305 | 0.2227 | 0.8203 |
| 0.0472 | 49.0 | 4900 | 0.1336 | 0.18 | 0.8956 | 12.6838 | 0.18 | 0.0305 | 0.2294 | 0.8209 |
| 0.0474 | 50.0 | 5000 | 0.1342 | 0.18 | 0.8959 | 12.3426 | 0.18 | 0.0305 | 0.2292 | 0.8218 |
| 0.0474 | 51.0 | 5100 | 0.1362 | 0.18 | 0.8957 | 12.3611 | 0.18 | 0.0305 | 0.2261 | 0.8224 |
| 0.0474 | 52.0 | 5200 | 0.1368 | 0.18 | 0.8958 | 11.5617 | 0.18 | 0.0305 | 0.2205 | 0.8222 |
| 0.0474 | 53.0 | 5300 | 0.1391 | 0.18 | 0.8955 | 11.5519 | 0.18 | 0.0305 | 0.2312 | 0.8225 |
| 0.0474 | 54.0 | 5400 | 0.1366 | 0.18 | 0.8947 | 12.2068 | 0.18 | 0.0305 | 0.2231 | 0.8231 |
| 0.047 | 55.0 | 5500 | 0.1355 | 0.19 | 0.8943 | 11.5922 | 0.19 | 0.0641 | 0.2299 | 0.8248 |
| 0.047 | 56.0 | 5600 | 0.1386 | 0.17 | 0.8930 | 11.8204 | 0.17 | 0.0705 | 0.2240 | 0.5968 |
| 0.047 | 57.0 | 5700 | 0.1364 | 0.33 | 0.8936 | 11.0092 | 0.33 | 0.1878 | 0.3195 | 0.4381 |
| 0.047 | 58.0 | 5800 | 0.1368 | 0.27 | 0.8923 | 11.0463 | 0.27 | 0.1541 | 0.2874 | 0.5187 |
| 0.047 | 59.0 | 5900 | 0.1328 | 0.325 | 0.8915 | 10.5269 | 0.325 | 0.1702 | 0.3247 | 0.4469 |
| 0.0469 | 60.0 | 6000 | 0.1402 | 0.235 | 0.8945 | 9.2940 | 0.235 | 0.1141 | 0.2558 | 0.6612 |
| 0.0469 | 61.0 | 6100 | 0.1387 | 0.345 | 0.8913 | 9.2678 | 0.345 | 0.1657 | 0.3422 | 0.4100 |
| 0.0469 | 62.0 | 6200 | 0.1386 | 0.31 | 0.8891 | 10.1100 | 0.31 | 0.1637 | 0.3134 | 0.4609 |
| 0.0469 | 63.0 | 6300 | 0.1379 | 0.34 | 0.8892 | 9.1965 | 0.34 | 0.1582 | 0.3388 | 0.4344 |
| 0.0469 | 64.0 | 6400 | 0.1375 | 0.335 | 0.8876 | 9.2252 | 0.335 | 0.1624 | 0.3356 | 0.4239 |
| 0.0469 | 65.0 | 6500 | 0.1357 | 0.345 | 0.8868 | 9.1887 | 0.345 | 0.1659 | 0.3361 | 0.4061 |
| 0.0469 | 66.0 | 6600 | 0.1394 | 0.345 | 0.8850 | 9.1819 | 0.345 | 0.1641 | 0.3398 | 0.4265 |
| 0.0469 | 67.0 | 6700 | 0.1410 | 0.34 | 0.8850 | 9.1158 | 0.34 | 0.1590 | 0.3328 | 0.4302 |
| 0.0469 | 68.0 | 6800 | 0.1387 | 0.295 | 0.8814 | 9.2693 | 0.295 | 0.1374 | 0.3039 | 0.4572 |
| 0.0469 | 69.0 | 6900 | 0.1385 | 0.335 | 0.8814 | 9.1526 | 0.335 | 0.1668 | 0.3324 | 0.4205 |
| 0.0463 | 70.0 | 7000 | 0.1392 | 0.34 | 0.8814 | 9.1159 | 0.34 | 0.1546 | 0.3405 | 0.4263 |
| 0.0463 | 71.0 | 7100 | 0.1418 | 0.35 | 0.8820 | 9.1363 | 0.35 | 0.1692 | 0.3436 | 0.4019 |
| 0.0463 | 72.0 | 7200 | 0.1379 | 0.35 | 0.8791 | 9.0483 | 0.35 | 0.1726 | 0.3402 | 0.4226 |
| 0.0463 | 73.0 | 7300 | 0.1405 | 0.33 | 0.8760 | 9.3563 | 0.33 | 0.1731 | 0.3207 | 0.4307 |
| 0.0463 | 74.0 | 7400 | 0.1401 | 0.31 | 0.8769 | 9.4413 | 0.31 | 0.1676 | 0.3099 | 0.4383 |
| 0.0458 | 75.0 | 7500 | 0.1393 | 0.38 | 0.8778 | 9.0788 | 0.38 | 0.1985 | 0.3518 | 0.3976 |
| 0.0458 | 76.0 | 7600 | 0.1384 | 0.39 | 0.8779 | 9.0233 | 0.39 | 0.2027 | 0.3673 | 0.4144 |
| 0.0458 | 77.0 | 7700 | 0.1403 | 0.365 | 0.8818 | 9.1567 | 0.3650 | 0.1953 | 0.3518 | 0.4181 |
| 0.0458 | 78.0 | 7800 | 0.1400 | 0.27 | 0.8725 | 11.0592 | 0.27 | 0.1627 | 0.2896 | 0.4809 |
| 0.0458 | 79.0 | 7900 | 0.1402 | 0.375 | 0.8739 | 9.1158 | 0.375 | 0.1961 | 0.3540 | 0.3929 |
| 0.0455 | 80.0 | 8000 | 0.1401 | 0.315 | 0.8722 | 9.9114 | 0.315 | 0.1771 | 0.3220 | 0.4443 |
| 0.0455 | 81.0 | 8100 | 0.1378 | 0.39 | 0.8761 | 9.0128 | 0.39 | 0.2048 | 0.3642 | 0.4020 |
| 0.0455 | 82.0 | 8200 | 0.1401 | 0.38 | 0.8729 | 9.1624 | 0.38 | 0.2006 | 0.3612 | 0.3924 |
| 0.0455 | 83.0 | 8300 | 0.1391 | 0.38 | 0.8742 | 8.8982 | 0.38 | 0.2048 | 0.3561 | 0.3991 |
| 0.0455 | 84.0 | 8400 | 0.1381 | 0.375 | 0.8734 | 9.0598 | 0.375 | 0.1901 | 0.3567 | 0.4010 |
| 0.0453 | 85.0 | 8500 | 0.1398 | 0.39 | 0.8718 | 9.1407 | 0.39 | 0.2057 | 0.3693 | 0.3892 |
| 0.0453 | 86.0 | 8600 | 0.1389 | 0.37 | 0.8721 | 9.3494 | 0.37 | 0.2006 | 0.3505 | 0.3914 |
| 0.0453 | 87.0 | 8700 | 0.1390 | 0.395 | 0.8743 | 8.7444 | 0.395 | 0.2113 | 0.3724 | 0.3854 |
| 0.0453 | 88.0 | 8800 | 0.1404 | 0.395 | 0.8739 | 8.7654 | 0.395 | 0.2134 | 0.3657 | 0.3925 |
| 0.0453 | 89.0 | 8900 | 0.1409 | 0.385 | 0.8726 | 8.7763 | 0.3850 | 0.2032 | 0.3643 | 0.3963 |
| 0.0451 | 90.0 | 9000 | 0.1403 | 0.39 | 0.8717 | 8.8363 | 0.39 | 0.2055 | 0.3668 | 0.3926 |
| 0.0451 | 91.0 | 9100 | 0.1388 | 0.39 | 0.8719 | 9.2985 | 0.39 | 0.2099 | 0.3662 | 0.3847 |
| 0.0451 | 92.0 | 9200 | 0.1397 | 0.385 | 0.8702 | 9.4449 | 0.3850 | 0.2050 | 0.3535 | 0.3877 |
| 0.0451 | 93.0 | 9300 | 0.1403 | 0.385 | 0.8709 | 8.9790 | 0.3850 | 0.1989 | 0.3473 | 0.3887 |
| 0.0451 | 94.0 | 9400 | 0.1400 | 0.39 | 0.8705 | 9.1647 | 0.39 | 0.2053 | 0.3569 | 0.3865 |
| 0.045 | 95.0 | 9500 | 0.1404 | 0.395 | 0.8712 | 9.1707 | 0.395 | 0.2087 | 0.3688 | 0.3815 |
| 0.045 | 96.0 | 9600 | 0.1404 | 0.385 | 0.8711 | 8.6711 | 0.3850 | 0.1980 | 0.3566 | 0.3867 |
| 0.045 | 97.0 | 9700 | 0.1399 | 0.39 | 0.8706 | 9.1288 | 0.39 | 0.2035 | 0.3610 | 0.3845 |
| 0.045 | 98.0 | 9800 | 0.1400 | 0.385 | 0.8708 | 9.1302 | 0.3850 | 0.1982 | 0.3538 | 0.3870 |
| 0.045 | 99.0 | 9900 | 0.1398 | 0.39 | 0.8712 | 8.8257 | 0.39 | 0.2002 | 0.3660 | 0.3825 |
| 0.0449 | 100.0 | 10000 | 0.1401 | 0.385 | 0.8709 | 8.8462 | 0.3850 | 0.1979 | 0.3606 | 0.3874 |
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.12.0
- Tokenizers 0.12.1
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
ayanban011/6_e_200-tiny_tobacco3482_kd_CEKD_t1.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 6_e_200-tiny_tobacco3482_kd_CEKD_t1.5_a0.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the Tobacco3482 dataset.
It achieves the following results on the evaluation set (a computation sketch for the Brier and ECE columns follows the list):
- Loss: 0.4277
- Accuracy: 0.835
- Brier Loss: 0.2653
- Nll: 1.5700
- F1 Micro: 0.835
- F1 Macro: 0.8164
- Ece: 0.1805
- Aurc: 0.0632
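The Brier loss and ECE reported above are calibration metrics. Below is a minimal sketch of one common way to compute them from logits; the function names are illustrative and not taken from the training code.

```python
import torch
import torch.nn.functional as F

def brier_loss(logits: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """One common multi-class Brier score: mean squared distance between
    softmax probabilities and the one-hot label vector."""
    probs = F.softmax(logits, dim=-1)
    onehot = F.one_hot(labels, num_classes=probs.shape[-1]).float()
    return ((probs - onehot) ** 2).sum(dim=-1).mean()

def expected_calibration_error(logits, labels, n_bins: int = 10):
    """ECE: confidence/accuracy gap averaged over equal-width confidence bins,
    weighted by the fraction of samples in each bin."""
    probs = F.softmax(logits, dim=-1)
    conf, preds = probs.max(dim=-1)
    correct = preds.eq(labels).float()
    edges = torch.linspace(0, 1, n_bins + 1, device=conf.device)
    ece = conf.new_zeros(())
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            ece = ece + mask.float().mean() * (conf[mask].mean() - correct[mask].mean()).abs()
    return ece
```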
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an optimizer/scheduler sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
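The optimizer and scheduler above map directly onto the standard PyTorch/transformers helpers. A minimal sketch, assuming `model` and `train_dataloader` come from the surrounding training script:

```python
import torch
from transformers import get_linear_schedule_with_warmup

optimizer = torch.optim.Adam(model.parameters(),
                             lr=1e-4, betas=(0.9, 0.999), eps=1e-8)
num_training_steps = 100 * len(train_dataloader)      # num_epochs = 100
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=int(0.1 * num_training_steps),   # warmup_ratio = 0.1
    num_training_steps=num_training_steps,
)
```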
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 1.6826 | 0.23 | 0.8622 | 4.7953 | 0.23 | 0.1892 | 0.2929 | 0.7651 |
| No log | 2.0 | 50 | 1.0374 | 0.53 | 0.6004 | 2.7646 | 0.53 | 0.4280 | 0.2624 | 0.2619 |
| No log | 3.0 | 75 | 0.8158 | 0.665 | 0.4678 | 2.4034 | 0.665 | 0.5565 | 0.2488 | 0.1416 |
| No log | 4.0 | 100 | 0.6879 | 0.72 | 0.3838 | 1.5355 | 0.72 | 0.6873 | 0.2297 | 0.1064 |
| No log | 5.0 | 125 | 0.6511 | 0.775 | 0.3538 | 1.5183 | 0.775 | 0.7285 | 0.2235 | 0.0915 |
| No log | 6.0 | 150 | 0.7310 | 0.755 | 0.3579 | 1.3899 | 0.755 | 0.7257 | 0.2190 | 0.0926 |
| No log | 7.0 | 175 | 0.5698 | 0.795 | 0.3246 | 1.3920 | 0.795 | 0.7691 | 0.2251 | 0.0956 |
| No log | 8.0 | 200 | 0.5675 | 0.805 | 0.3064 | 1.4278 | 0.805 | 0.7733 | 0.2093 | 0.0655 |
| No log | 9.0 | 225 | 0.5986 | 0.8 | 0.3356 | 1.5317 | 0.8000 | 0.7890 | 0.2249 | 0.0913 |
| No log | 10.0 | 250 | 0.6158 | 0.755 | 0.3475 | 1.5027 | 0.755 | 0.7340 | 0.2152 | 0.0782 |
| No log | 11.0 | 275 | 0.5353 | 0.815 | 0.3037 | 1.6003 | 0.815 | 0.8143 | 0.2305 | 0.0749 |
| No log | 12.0 | 300 | 0.5460 | 0.825 | 0.3008 | 1.7407 | 0.825 | 0.8070 | 0.2378 | 0.0856 |
| No log | 13.0 | 325 | 0.4905 | 0.83 | 0.2787 | 1.1328 | 0.83 | 0.8099 | 0.2344 | 0.0481 |
| No log | 14.0 | 350 | 0.4913 | 0.795 | 0.2881 | 1.2261 | 0.795 | 0.7521 | 0.2121 | 0.0661 |
| No log | 15.0 | 375 | 0.4773 | 0.835 | 0.2753 | 1.2716 | 0.835 | 0.8140 | 0.2125 | 0.0636 |
| No log | 16.0 | 400 | 0.4848 | 0.84 | 0.2751 | 1.5983 | 0.8400 | 0.8139 | 0.2195 | 0.0707 |
| No log | 17.0 | 425 | 0.4994 | 0.805 | 0.2886 | 1.5637 | 0.805 | 0.7689 | 0.2049 | 0.0617 |
| No log | 18.0 | 450 | 0.4610 | 0.835 | 0.2871 | 1.3906 | 0.835 | 0.8122 | 0.2175 | 0.0675 |
| No log | 19.0 | 475 | 0.4594 | 0.84 | 0.2669 | 1.2217 | 0.8400 | 0.8214 | 0.2022 | 0.0516 |
| 0.4534 | 20.0 | 500 | 0.4793 | 0.815 | 0.2874 | 1.4445 | 0.815 | 0.7965 | 0.2024 | 0.0641 |
| 0.4534 | 21.0 | 525 | 0.5185 | 0.785 | 0.3215 | 1.8358 | 0.785 | 0.7743 | 0.2250 | 0.0850 |
| 0.4534 | 22.0 | 550 | 0.4339 | 0.83 | 0.2635 | 1.2137 | 0.83 | 0.8200 | 0.1944 | 0.0610 |
| 0.4534 | 23.0 | 575 | 0.4640 | 0.825 | 0.2770 | 1.4137 | 0.825 | 0.8086 | 0.1800 | 0.0674 |
| 0.4534 | 24.0 | 600 | 0.4528 | 0.825 | 0.2692 | 1.3148 | 0.825 | 0.8077 | 0.1912 | 0.0678 |
| 0.4534 | 25.0 | 625 | 0.4361 | 0.84 | 0.2600 | 1.4205 | 0.8400 | 0.8278 | 0.2066 | 0.0534 |
| 0.4534 | 26.0 | 650 | 0.4239 | 0.835 | 0.2590 | 1.2112 | 0.835 | 0.8224 | 0.1850 | 0.0544 |
| 0.4534 | 27.0 | 675 | 0.4294 | 0.82 | 0.2636 | 1.2671 | 0.82 | 0.8023 | 0.1866 | 0.0619 |
| 0.4534 | 28.0 | 700 | 0.4327 | 0.84 | 0.2633 | 1.3084 | 0.8400 | 0.8283 | 0.1954 | 0.0628 |
| 0.4534 | 29.0 | 725 | 0.4309 | 0.825 | 0.2640 | 1.4275 | 0.825 | 0.8022 | 0.2117 | 0.0667 |
| 0.4534 | 30.0 | 750 | 0.4299 | 0.83 | 0.2636 | 1.3161 | 0.83 | 0.8103 | 0.2110 | 0.0620 |
| 0.4534 | 31.0 | 775 | 0.4345 | 0.835 | 0.2634 | 1.4605 | 0.835 | 0.8269 | 0.1998 | 0.0562 |
| 0.4534 | 32.0 | 800 | 0.4404 | 0.83 | 0.2743 | 1.3965 | 0.83 | 0.8077 | 0.2198 | 0.0669 |
| 0.4534 | 33.0 | 825 | 0.4254 | 0.83 | 0.2614 | 1.3734 | 0.83 | 0.8133 | 0.1990 | 0.0567 |
| 0.4534 | 34.0 | 850 | 0.4271 | 0.835 | 0.2632 | 1.3963 | 0.835 | 0.8164 | 0.1932 | 0.0649 |
| 0.4534 | 35.0 | 875 | 0.4284 | 0.835 | 0.2636 | 1.3713 | 0.835 | 0.8164 | 0.2127 | 0.0634 |
| 0.4534 | 36.0 | 900 | 0.4262 | 0.835 | 0.2628 | 1.4403 | 0.835 | 0.8164 | 0.1926 | 0.0649 |
| 0.4534 | 37.0 | 925 | 0.4253 | 0.835 | 0.2621 | 1.3813 | 0.835 | 0.8164 | 0.2015 | 0.0628 |
| 0.4534 | 38.0 | 950 | 0.4262 | 0.835 | 0.2626 | 1.4528 | 0.835 | 0.8164 | 0.1971 | 0.0628 |
| 0.4534 | 39.0 | 975 | 0.4271 | 0.835 | 0.2629 | 1.4410 | 0.835 | 0.8164 | 0.1933 | 0.0627 |
| 0.0663 | 40.0 | 1000 | 0.4283 | 0.835 | 0.2639 | 1.4647 | 0.835 | 0.8164 | 0.1996 | 0.0631 |
| 0.0663 | 41.0 | 1025 | 0.4272 | 0.835 | 0.2639 | 1.4417 | 0.835 | 0.8164 | 0.2088 | 0.0630 |
| 0.0663 | 42.0 | 1050 | 0.4276 | 0.835 | 0.2640 | 1.3976 | 0.835 | 0.8164 | 0.1992 | 0.0634 |
| 0.0663 | 43.0 | 1075 | 0.4270 | 0.835 | 0.2633 | 1.4392 | 0.835 | 0.8164 | 0.1892 | 0.0628 |
| 0.0663 | 44.0 | 1100 | 0.4264 | 0.835 | 0.2635 | 1.4429 | 0.835 | 0.8164 | 0.1885 | 0.0631 |
| 0.0663 | 45.0 | 1125 | 0.4269 | 0.835 | 0.2637 | 1.4461 | 0.835 | 0.8164 | 0.1974 | 0.0629 |
| 0.0663 | 46.0 | 1150 | 0.4268 | 0.835 | 0.2636 | 1.4415 | 0.835 | 0.8164 | 0.1866 | 0.0625 |
| 0.0663 | 47.0 | 1175 | 0.4269 | 0.835 | 0.2641 | 1.4646 | 0.835 | 0.8164 | 0.1812 | 0.0636 |
| 0.0663 | 48.0 | 1200 | 0.4271 | 0.835 | 0.2639 | 1.3990 | 0.835 | 0.8164 | 0.1865 | 0.0631 |
| 0.0663 | 49.0 | 1225 | 0.4267 | 0.835 | 0.2639 | 1.4474 | 0.835 | 0.8164 | 0.1946 | 0.0629 |
| 0.0663 | 50.0 | 1250 | 0.4273 | 0.835 | 0.2642 | 1.4492 | 0.835 | 0.8164 | 0.1802 | 0.0631 |
| 0.0663 | 51.0 | 1275 | 0.4272 | 0.835 | 0.2644 | 1.4475 | 0.835 | 0.8164 | 0.1942 | 0.0630 |
| 0.0663 | 52.0 | 1300 | 0.4283 | 0.835 | 0.2648 | 1.5157 | 0.835 | 0.8164 | 0.1963 | 0.0635 |
| 0.0663 | 53.0 | 1325 | 0.4271 | 0.835 | 0.2643 | 1.5046 | 0.835 | 0.8164 | 0.1955 | 0.0633 |
| 0.0663 | 54.0 | 1350 | 0.4271 | 0.835 | 0.2642 | 1.4629 | 0.835 | 0.8164 | 0.1790 | 0.0617 |
| 0.0663 | 55.0 | 1375 | 0.4278 | 0.835 | 0.2649 | 1.5752 | 0.835 | 0.8164 | 0.2007 | 0.0635 |
| 0.0663 | 56.0 | 1400 | 0.4280 | 0.835 | 0.2648 | 1.5165 | 0.835 | 0.8164 | 0.1706 | 0.0631 |
| 0.0663 | 57.0 | 1425 | 0.4275 | 0.835 | 0.2644 | 1.5134 | 0.835 | 0.8164 | 0.1864 | 0.0629 |
| 0.0663 | 58.0 | 1450 | 0.4270 | 0.835 | 0.2643 | 1.5088 | 0.835 | 0.8164 | 0.1883 | 0.0630 |
| 0.0663 | 59.0 | 1475 | 0.4273 | 0.835 | 0.2644 | 1.5111 | 0.835 | 0.8164 | 0.1951 | 0.0630 |
| 0.0615 | 60.0 | 1500 | 0.4281 | 0.835 | 0.2651 | 1.5727 | 0.835 | 0.8164 | 0.2084 | 0.0630 |
| 0.0615 | 61.0 | 1525 | 0.4271 | 0.835 | 0.2647 | 1.5198 | 0.835 | 0.8164 | 0.1957 | 0.0631 |
| 0.0615 | 62.0 | 1550 | 0.4276 | 0.835 | 0.2649 | 1.5139 | 0.835 | 0.8164 | 0.1969 | 0.0630 |
| 0.0615 | 63.0 | 1575 | 0.4269 | 0.835 | 0.2646 | 1.4579 | 0.835 | 0.8164 | 0.1802 | 0.0629 |
| 0.0615 | 64.0 | 1600 | 0.4275 | 0.835 | 0.2648 | 1.5144 | 0.835 | 0.8164 | 0.2006 | 0.0632 |
| 0.0615 | 65.0 | 1625 | 0.4276 | 0.835 | 0.2649 | 1.5129 | 0.835 | 0.8164 | 0.1846 | 0.0632 |
| 0.0615 | 66.0 | 1650 | 0.4272 | 0.835 | 0.2647 | 1.5165 | 0.835 | 0.8164 | 0.1796 | 0.0629 |
| 0.0615 | 67.0 | 1675 | 0.4273 | 0.835 | 0.2647 | 1.5141 | 0.835 | 0.8164 | 0.1882 | 0.0631 |
| 0.0615 | 68.0 | 1700 | 0.4276 | 0.835 | 0.2649 | 1.5146 | 0.835 | 0.8164 | 0.1799 | 0.0631 |
| 0.0615 | 69.0 | 1725 | 0.4275 | 0.835 | 0.2649 | 1.5215 | 0.835 | 0.8164 | 0.1799 | 0.0631 |
| 0.0615 | 70.0 | 1750 | 0.4275 | 0.835 | 0.2647 | 1.5124 | 0.835 | 0.8164 | 0.1884 | 0.0632 |
| 0.0615 | 71.0 | 1775 | 0.4278 | 0.835 | 0.2652 | 1.5245 | 0.835 | 0.8164 | 0.1800 | 0.0631 |
| 0.0615 | 72.0 | 1800 | 0.4277 | 0.835 | 0.2650 | 1.5169 | 0.835 | 0.8164 | 0.1802 | 0.0631 |
| 0.0615 | 73.0 | 1825 | 0.4277 | 0.835 | 0.2651 | 1.5282 | 0.835 | 0.8164 | 0.1804 | 0.0633 |
| 0.0615 | 74.0 | 1850 | 0.4273 | 0.835 | 0.2650 | 1.5156 | 0.835 | 0.8164 | 0.1804 | 0.0632 |
| 0.0615 | 75.0 | 1875 | 0.4278 | 0.835 | 0.2653 | 1.5706 | 0.835 | 0.8164 | 0.1804 | 0.0632 |
| 0.0615 | 76.0 | 1900 | 0.4275 | 0.835 | 0.2651 | 1.5337 | 0.835 | 0.8164 | 0.1807 | 0.0633 |
| 0.0615 | 77.0 | 1925 | 0.4276 | 0.835 | 0.2652 | 1.5357 | 0.835 | 0.8164 | 0.1804 | 0.0633 |
| 0.0615 | 78.0 | 1950 | 0.4275 | 0.835 | 0.2651 | 1.5701 | 0.835 | 0.8164 | 0.1805 | 0.0633 |
| 0.0615 | 79.0 | 1975 | 0.4277 | 0.835 | 0.2651 | 1.5161 | 0.835 | 0.8164 | 0.1807 | 0.0633 |
| 0.0614 | 80.0 | 2000 | 0.4278 | 0.835 | 0.2653 | 1.5709 | 0.835 | 0.8164 | 0.1808 | 0.0632 |
| 0.0614 | 81.0 | 2025 | 0.4278 | 0.835 | 0.2653 | 1.5703 | 0.835 | 0.8164 | 0.1804 | 0.0632 |
| 0.0614 | 82.0 | 2050 | 0.4278 | 0.835 | 0.2653 | 1.5700 | 0.835 | 0.8164 | 0.1806 | 0.0633 |
| 0.0614 | 83.0 | 2075 | 0.4277 | 0.835 | 0.2652 | 1.5700 | 0.835 | 0.8164 | 0.1803 | 0.0631 |
| 0.0614 | 84.0 | 2100 | 0.4276 | 0.835 | 0.2652 | 1.5694 | 0.835 | 0.8164 | 0.1804 | 0.0632 |
| 0.0614 | 85.0 | 2125 | 0.4275 | 0.835 | 0.2652 | 1.5702 | 0.835 | 0.8164 | 0.1807 | 0.0633 |
| 0.0614 | 86.0 | 2150 | 0.4276 | 0.835 | 0.2652 | 1.5699 | 0.835 | 0.8164 | 0.1805 | 0.0633 |
| 0.0614 | 87.0 | 2175 | 0.4277 | 0.835 | 0.2653 | 1.5703 | 0.835 | 0.8164 | 0.1805 | 0.0633 |
| 0.0614 | 88.0 | 2200 | 0.4277 | 0.835 | 0.2652 | 1.5702 | 0.835 | 0.8164 | 0.1882 | 0.0632 |
| 0.0614 | 89.0 | 2225 | 0.4277 | 0.835 | 0.2653 | 1.5702 | 0.835 | 0.8164 | 0.1806 | 0.0633 |
| 0.0614 | 90.0 | 2250 | 0.4276 | 0.835 | 0.2653 | 1.5696 | 0.835 | 0.8164 | 0.1806 | 0.0633 |
| 0.0614 | 91.0 | 2275 | 0.4277 | 0.835 | 0.2653 | 1.5698 | 0.835 | 0.8164 | 0.1805 | 0.0632 |
| 0.0614 | 92.0 | 2300 | 0.4276 | 0.835 | 0.2652 | 1.5699 | 0.835 | 0.8164 | 0.1805 | 0.0632 |
| 0.0614 | 93.0 | 2325 | 0.4277 | 0.835 | 0.2653 | 1.5700 | 0.835 | 0.8164 | 0.1805 | 0.0632 |
| 0.0614 | 94.0 | 2350 | 0.4276 | 0.835 | 0.2653 | 1.5698 | 0.835 | 0.8164 | 0.1805 | 0.0632 |
| 0.0614 | 95.0 | 2375 | 0.4277 | 0.835 | 0.2653 | 1.5699 | 0.835 | 0.8164 | 0.1805 | 0.0632 |
| 0.0614 | 96.0 | 2400 | 0.4276 | 0.835 | 0.2653 | 1.5700 | 0.835 | 0.8164 | 0.1805 | 0.0632 |
| 0.0614 | 97.0 | 2425 | 0.4277 | 0.835 | 0.2653 | 1.5699 | 0.835 | 0.8164 | 0.1805 | 0.0632 |
| 0.0614 | 98.0 | 2450 | 0.4276 | 0.835 | 0.2653 | 1.5699 | 0.835 | 0.8164 | 0.1805 | 0.0632 |
| 0.0614 | 99.0 | 2475 | 0.4277 | 0.835 | 0.2653 | 1.5700 | 0.835 | 0.8164 | 0.1805 | 0.0632 |
| 0.0614 | 100.0 | 2500 | 0.4277 | 0.835 | 0.2653 | 1.5700 | 0.835 | 0.8164 | 0.1805 | 0.0632 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
ayanban011/6_e_200-tiny_tobacco3482_kd_CEKD_t1.5_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 6_e_200-tiny_tobacco3482_kd_CEKD_t1.5_a0.7
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the Tobacco3482 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4925
- Accuracy: 0.845
- Brier Loss: 0.2526
- Nll: 1.5547
- F1 Micro: 0.845
- F1 Macro: 0.8258
- Ece: 0.1785
- Aurc: 0.0736
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the distillation loss implied by the model name follows the list):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
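The card does not spell out the training objective, but the model name (`CEKD_t1.5_a0.7`) suggests a cross-entropy + knowledge-distillation loss with temperature T = 1.5 and mixing weight a = 0.7. A common formulation is sketched below; which term `alpha` weights is an assumption, not confirmed by the card.

```python
import torch.nn.functional as F

def cekd_loss(student_logits, teacher_logits, labels, T=1.5, alpha=0.7):
    # Hard-label cross-entropy term.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label KL term at temperature T, rescaled by T^2 so its gradient
    # magnitude matches the CE term (Hinton et al., 2015).
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * ce + (1.0 - alpha) * kd
```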
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 1.8463 | 0.245 | 0.8631 | 4.7256 | 0.245 | 0.2002 | 0.2955 | 0.7640 |
| No log | 2.0 | 50 | 1.1593 | 0.535 | 0.5972 | 2.7208 | 0.535 | 0.4319 | 0.2539 | 0.2591 |
| No log | 3.0 | 75 | 0.9039 | 0.67 | 0.4555 | 2.3747 | 0.67 | 0.5677 | 0.2448 | 0.1349 |
| No log | 4.0 | 100 | 0.7631 | 0.73 | 0.3757 | 1.5518 | 0.7300 | 0.7026 | 0.1947 | 0.0987 |
| No log | 5.0 | 125 | 0.7412 | 0.775 | 0.3497 | 1.4677 | 0.775 | 0.7456 | 0.2239 | 0.0892 |
| No log | 6.0 | 150 | 0.9198 | 0.72 | 0.3977 | 1.7618 | 0.72 | 0.6958 | 0.2190 | 0.1118 |
| No log | 7.0 | 175 | 0.6117 | 0.81 | 0.2969 | 1.2112 | 0.81 | 0.7726 | 0.2244 | 0.0661 |
| No log | 8.0 | 200 | 0.6296 | 0.78 | 0.3090 | 1.3439 | 0.78 | 0.7443 | 0.1959 | 0.0771 |
| No log | 9.0 | 225 | 0.6850 | 0.785 | 0.3187 | 1.6325 | 0.785 | 0.7651 | 0.2194 | 0.0986 |
| No log | 10.0 | 250 | 0.6304 | 0.79 | 0.3111 | 1.3598 | 0.79 | 0.7821 | 0.2106 | 0.0838 |
| No log | 11.0 | 275 | 0.6668 | 0.775 | 0.3242 | 1.9754 | 0.775 | 0.6942 | 0.2005 | 0.0947 |
| No log | 12.0 | 300 | 0.6795 | 0.775 | 0.3263 | 1.6182 | 0.775 | 0.7692 | 0.2155 | 0.0875 |
| No log | 13.0 | 325 | 0.5156 | 0.85 | 0.2454 | 0.9647 | 0.85 | 0.8378 | 0.2033 | 0.0515 |
| No log | 14.0 | 350 | 0.5341 | 0.845 | 0.2644 | 1.0410 | 0.845 | 0.8402 | 0.2050 | 0.0503 |
| No log | 15.0 | 375 | 0.4678 | 0.865 | 0.2245 | 0.9232 | 0.865 | 0.8564 | 0.1836 | 0.0363 |
| No log | 16.0 | 400 | 0.5620 | 0.82 | 0.2819 | 1.1475 | 0.82 | 0.7980 | 0.2050 | 0.0710 |
| No log | 17.0 | 425 | 0.5253 | 0.83 | 0.2642 | 0.8809 | 0.83 | 0.8145 | 0.1811 | 0.0723 |
| No log | 18.0 | 450 | 0.6295 | 0.815 | 0.2997 | 1.8144 | 0.815 | 0.8062 | 0.2120 | 0.0636 |
| No log | 19.0 | 475 | 0.5748 | 0.83 | 0.2774 | 1.7900 | 0.83 | 0.8200 | 0.1920 | 0.0506 |
| 0.466 | 20.0 | 500 | 0.4704 | 0.84 | 0.2275 | 0.8869 | 0.8400 | 0.8135 | 0.1882 | 0.0472 |
| 0.466 | 21.0 | 525 | 0.5693 | 0.82 | 0.2820 | 1.3315 | 0.82 | 0.8013 | 0.2011 | 0.0821 |
| 0.466 | 22.0 | 550 | 0.5251 | 0.81 | 0.2677 | 1.2663 | 0.81 | 0.7890 | 0.2037 | 0.0745 |
| 0.466 | 23.0 | 575 | 0.5158 | 0.83 | 0.2638 | 1.2621 | 0.83 | 0.8070 | 0.1927 | 0.0614 |
| 0.466 | 24.0 | 600 | 0.5056 | 0.835 | 0.2590 | 1.5337 | 0.835 | 0.8080 | 0.1887 | 0.0617 |
| 0.466 | 25.0 | 625 | 0.4897 | 0.85 | 0.2476 | 1.4341 | 0.85 | 0.8361 | 0.1870 | 0.0627 |
| 0.466 | 26.0 | 650 | 0.4994 | 0.85 | 0.2556 | 1.5846 | 0.85 | 0.8302 | 0.1965 | 0.0718 |
| 0.466 | 27.0 | 675 | 0.4720 | 0.845 | 0.2406 | 1.3093 | 0.845 | 0.8234 | 0.1873 | 0.0704 |
| 0.466 | 28.0 | 700 | 0.4858 | 0.84 | 0.2486 | 1.4459 | 0.8400 | 0.8192 | 0.1676 | 0.0730 |
| 0.466 | 29.0 | 725 | 0.4908 | 0.84 | 0.2510 | 1.4941 | 0.8400 | 0.8159 | 0.1754 | 0.0717 |
| 0.466 | 30.0 | 750 | 0.4805 | 0.855 | 0.2442 | 1.3279 | 0.855 | 0.8334 | 0.1827 | 0.0667 |
| 0.466 | 31.0 | 775 | 0.4783 | 0.845 | 0.2428 | 1.4150 | 0.845 | 0.8264 | 0.1759 | 0.0660 |
| 0.466 | 32.0 | 800 | 0.4822 | 0.855 | 0.2449 | 1.4848 | 0.855 | 0.8322 | 0.1928 | 0.0702 |
| 0.466 | 33.0 | 825 | 0.4845 | 0.84 | 0.2462 | 1.4925 | 0.8400 | 0.8227 | 0.1837 | 0.0692 |
| 0.466 | 34.0 | 850 | 0.4843 | 0.85 | 0.2466 | 1.4881 | 0.85 | 0.8295 | 0.1752 | 0.0683 |
| 0.466 | 35.0 | 875 | 0.4837 | 0.85 | 0.2464 | 1.4939 | 0.85 | 0.8295 | 0.1842 | 0.0718 |
| 0.466 | 36.0 | 900 | 0.4843 | 0.85 | 0.2467 | 1.4910 | 0.85 | 0.8295 | 0.1950 | 0.0705 |
| 0.466 | 37.0 | 925 | 0.4862 | 0.85 | 0.2479 | 1.4938 | 0.85 | 0.8295 | 0.1871 | 0.0713 |
| 0.466 | 38.0 | 950 | 0.4854 | 0.85 | 0.2478 | 1.4945 | 0.85 | 0.8295 | 0.1859 | 0.0719 |
| 0.466 | 39.0 | 975 | 0.4850 | 0.85 | 0.2471 | 1.4891 | 0.85 | 0.8295 | 0.1855 | 0.0724 |
| 0.0749 | 40.0 | 1000 | 0.4869 | 0.85 | 0.2484 | 1.4967 | 0.85 | 0.8295 | 0.1969 | 0.0718 |
| 0.0749 | 41.0 | 1025 | 0.4857 | 0.85 | 0.2482 | 1.5544 | 0.85 | 0.8295 | 0.1904 | 0.0726 |
| 0.0749 | 42.0 | 1050 | 0.4872 | 0.85 | 0.2487 | 1.5559 | 0.85 | 0.8295 | 0.1877 | 0.0732 |
| 0.0749 | 43.0 | 1075 | 0.4873 | 0.85 | 0.2488 | 1.5534 | 0.85 | 0.8295 | 0.1871 | 0.0723 |
| 0.0749 | 44.0 | 1100 | 0.4870 | 0.85 | 0.2489 | 1.5542 | 0.85 | 0.8295 | 0.1787 | 0.0730 |
| 0.0749 | 45.0 | 1125 | 0.4874 | 0.85 | 0.2490 | 1.5544 | 0.85 | 0.8295 | 0.1867 | 0.0724 |
| 0.0749 | 46.0 | 1150 | 0.4868 | 0.85 | 0.2486 | 1.5531 | 0.85 | 0.8295 | 0.1954 | 0.0723 |
| 0.0749 | 47.0 | 1175 | 0.4879 | 0.85 | 0.2493 | 1.5546 | 0.85 | 0.8295 | 0.1842 | 0.0727 |
| 0.0749 | 48.0 | 1200 | 0.4882 | 0.85 | 0.2495 | 1.5537 | 0.85 | 0.8295 | 0.1864 | 0.0730 |
| 0.0749 | 49.0 | 1225 | 0.4875 | 0.85 | 0.2492 | 1.5537 | 0.85 | 0.8295 | 0.1884 | 0.0727 |
| 0.0749 | 50.0 | 1250 | 0.4880 | 0.85 | 0.2494 | 1.5528 | 0.85 | 0.8295 | 0.1877 | 0.0726 |
| 0.0749 | 51.0 | 1275 | 0.4888 | 0.85 | 0.2499 | 1.5539 | 0.85 | 0.8295 | 0.1754 | 0.0725 |
| 0.0749 | 52.0 | 1300 | 0.4894 | 0.85 | 0.2501 | 1.5540 | 0.85 | 0.8295 | 0.1883 | 0.0736 |
| 0.0749 | 53.0 | 1325 | 0.4889 | 0.85 | 0.2501 | 1.5533 | 0.85 | 0.8295 | 0.1708 | 0.0727 |
| 0.0749 | 54.0 | 1350 | 0.4891 | 0.85 | 0.2500 | 1.5531 | 0.85 | 0.8295 | 0.1785 | 0.0729 |
| 0.0749 | 55.0 | 1375 | 0.4904 | 0.85 | 0.2509 | 1.5541 | 0.85 | 0.8295 | 0.1744 | 0.0730 |
| 0.0749 | 56.0 | 1400 | 0.4903 | 0.85 | 0.2507 | 1.5541 | 0.85 | 0.8295 | 0.1897 | 0.0730 |
| 0.0749 | 57.0 | 1425 | 0.4894 | 0.85 | 0.2503 | 1.5536 | 0.85 | 0.8295 | 0.1792 | 0.0730 |
| 0.0749 | 58.0 | 1450 | 0.4889 | 0.85 | 0.2501 | 1.5531 | 0.85 | 0.8295 | 0.1892 | 0.0730 |
| 0.0749 | 59.0 | 1475 | 0.4907 | 0.85 | 0.2511 | 1.5542 | 0.85 | 0.8295 | 0.1767 | 0.0733 |
| 0.0712 | 60.0 | 1500 | 0.4897 | 0.85 | 0.2506 | 1.5540 | 0.85 | 0.8295 | 0.1813 | 0.0732 |
| 0.0712 | 61.0 | 1525 | 0.4906 | 0.85 | 0.2512 | 1.5545 | 0.85 | 0.8295 | 0.1853 | 0.0733 |
| 0.0712 | 62.0 | 1550 | 0.4905 | 0.85 | 0.2512 | 1.5541 | 0.85 | 0.8295 | 0.1723 | 0.0733 |
| 0.0712 | 63.0 | 1575 | 0.4904 | 0.85 | 0.2512 | 1.5543 | 0.85 | 0.8295 | 0.1817 | 0.0732 |
| 0.0712 | 64.0 | 1600 | 0.4915 | 0.85 | 0.2515 | 1.5544 | 0.85 | 0.8295 | 0.1942 | 0.0736 |
| 0.0712 | 65.0 | 1625 | 0.4898 | 0.85 | 0.2506 | 1.5534 | 0.85 | 0.8295 | 0.1712 | 0.0735 |
| 0.0712 | 66.0 | 1650 | 0.4911 | 0.85 | 0.2516 | 1.5548 | 0.85 | 0.8295 | 0.1824 | 0.0733 |
| 0.0712 | 67.0 | 1675 | 0.4908 | 0.85 | 0.2513 | 1.5546 | 0.85 | 0.8295 | 0.1896 | 0.0734 |
| 0.0712 | 68.0 | 1700 | 0.4911 | 0.85 | 0.2516 | 1.5548 | 0.85 | 0.8295 | 0.1744 | 0.0734 |
| 0.0712 | 69.0 | 1725 | 0.4912 | 0.85 | 0.2516 | 1.5541 | 0.85 | 0.8295 | 0.1726 | 0.0733 |
| 0.0712 | 70.0 | 1750 | 0.4910 | 0.85 | 0.2514 | 1.5543 | 0.85 | 0.8295 | 0.1827 | 0.0736 |
| 0.0712 | 71.0 | 1775 | 0.4918 | 0.85 | 0.2520 | 1.5546 | 0.85 | 0.8295 | 0.1909 | 0.0736 |
| 0.0712 | 72.0 | 1800 | 0.4916 | 0.85 | 0.2519 | 1.5545 | 0.85 | 0.8295 | 0.1830 | 0.0734 |
| 0.0712 | 73.0 | 1825 | 0.4913 | 0.85 | 0.2517 | 1.5540 | 0.85 | 0.8295 | 0.1835 | 0.0733 |
| 0.0712 | 74.0 | 1850 | 0.4918 | 0.85 | 0.2521 | 1.5544 | 0.85 | 0.8295 | 0.1831 | 0.0736 |
| 0.0712 | 75.0 | 1875 | 0.4919 | 0.85 | 0.2521 | 1.5548 | 0.85 | 0.8295 | 0.1829 | 0.0734 |
| 0.0712 | 76.0 | 1900 | 0.4916 | 0.85 | 0.2520 | 1.5547 | 0.85 | 0.8295 | 0.1831 | 0.0733 |
| 0.0712 | 77.0 | 1925 | 0.4919 | 0.85 | 0.2521 | 1.5542 | 0.85 | 0.8295 | 0.1732 | 0.0735 |
| 0.0712 | 78.0 | 1950 | 0.4920 | 0.85 | 0.2521 | 1.5541 | 0.85 | 0.8295 | 0.1831 | 0.0734 |
| 0.0712 | 79.0 | 1975 | 0.4920 | 0.85 | 0.2522 | 1.5544 | 0.85 | 0.8295 | 0.1833 | 0.0734 |
| 0.0712 | 80.0 | 2000 | 0.4922 | 0.845 | 0.2523 | 1.5549 | 0.845 | 0.8258 | 0.1859 | 0.0735 |
| 0.0712 | 81.0 | 2025 | 0.4920 | 0.85 | 0.2522 | 1.5542 | 0.85 | 0.8295 | 0.1830 | 0.0732 |
| 0.0712 | 82.0 | 2050 | 0.4920 | 0.845 | 0.2522 | 1.5549 | 0.845 | 0.8258 | 0.1783 | 0.0734 |
| 0.0712 | 83.0 | 2075 | 0.4922 | 0.85 | 0.2524 | 1.5546 | 0.85 | 0.8295 | 0.1832 | 0.0734 |
| 0.0712 | 84.0 | 2100 | 0.4920 | 0.845 | 0.2522 | 1.5543 | 0.845 | 0.8258 | 0.1784 | 0.0735 |
| 0.0712 | 85.0 | 2125 | 0.4921 | 0.845 | 0.2523 | 1.5547 | 0.845 | 0.8258 | 0.1785 | 0.0735 |
| 0.0712 | 86.0 | 2150 | 0.4921 | 0.85 | 0.2523 | 1.5545 | 0.85 | 0.8295 | 0.1836 | 0.0733 |
| 0.0712 | 87.0 | 2175 | 0.4924 | 0.85 | 0.2524 | 1.5547 | 0.85 | 0.8295 | 0.1836 | 0.0734 |
| 0.0712 | 88.0 | 2200 | 0.4925 | 0.845 | 0.2524 | 1.5548 | 0.845 | 0.8258 | 0.1785 | 0.0735 |
| 0.0712 | 89.0 | 2225 | 0.4924 | 0.85 | 0.2525 | 1.5548 | 0.85 | 0.8295 | 0.1835 | 0.0734 |
| 0.0712 | 90.0 | 2250 | 0.4921 | 0.845 | 0.2523 | 1.5545 | 0.845 | 0.8258 | 0.1688 | 0.0735 |
| 0.0712 | 91.0 | 2275 | 0.4925 | 0.845 | 0.2525 | 1.5546 | 0.845 | 0.8258 | 0.1785 | 0.0735 |
| 0.0712 | 92.0 | 2300 | 0.4924 | 0.845 | 0.2524 | 1.5546 | 0.845 | 0.8258 | 0.1785 | 0.0736 |
| 0.0712 | 93.0 | 2325 | 0.4925 | 0.845 | 0.2526 | 1.5548 | 0.845 | 0.8258 | 0.1785 | 0.0736 |
| 0.0712 | 94.0 | 2350 | 0.4924 | 0.845 | 0.2525 | 1.5547 | 0.845 | 0.8258 | 0.1786 | 0.0736 |
| 0.0712 | 95.0 | 2375 | 0.4926 | 0.845 | 0.2526 | 1.5547 | 0.845 | 0.8258 | 0.1785 | 0.0736 |
| 0.0712 | 96.0 | 2400 | 0.4925 | 0.845 | 0.2526 | 1.5548 | 0.845 | 0.8258 | 0.1785 | 0.0736 |
| 0.0712 | 97.0 | 2425 | 0.4925 | 0.845 | 0.2526 | 1.5547 | 0.845 | 0.8258 | 0.1785 | 0.0735 |
| 0.0712 | 98.0 | 2450 | 0.4926 | 0.845 | 0.2526 | 1.5548 | 0.845 | 0.8258 | 0.1785 | 0.0736 |
| 0.0712 | 99.0 | 2475 | 0.4925 | 0.845 | 0.2526 | 1.5548 | 0.845 | 0.8258 | 0.1785 | 0.0736 |
| 0.0711 | 100.0 | 2500 | 0.4925 | 0.845 | 0.2526 | 1.5547 | 0.845 | 0.8258 | 0.1785 | 0.0736 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
ayanban011/6_e_200-tiny_tobacco3482_kd_CEKD_t1.5_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 6_e_200-tiny_tobacco3482_kd_CEKD_t1.5_a0.9
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the Tobacco3482 dataset (a loading sketch for this setup follows the results list).
It achieves the following results on the evaluation set:
- Loss: 0.5536
- Accuracy: 0.82
- Brier Loss: 0.2571
- Nll: 1.4560
- F1 Micro: 0.82
- F1 Macro: 0.7994
- Ece: 0.1404
- Aurc: 0.0578
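To set up a comparable fine-tune, the base checkpoint can be loaded with a fresh 10-way head for the Tobacco3482 classes listed at the end of this card. A minimal sketch, not the original training code:

```python
from transformers import AutoImageProcessor, AutoModelForImageClassification

labels = ["adve", "email", "form", "letter", "memo",
          "news", "note", "report", "resume", "scientific"]
processor = AutoImageProcessor.from_pretrained("WinKawaks/vit-tiny-patch16-224")
model = AutoModelForImageClassification.from_pretrained(
    "WinKawaks/vit-tiny-patch16-224",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={name: i for i, name in enumerate(labels)},
    ignore_mismatched_sizes=True,  # swap the 1000-way ImageNet head for 10 classes
)
```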
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 2.0125 | 0.23 | 0.8650 | 4.4951 | 0.23 | 0.1799 | 0.2806 | 0.7660 |
| No log | 2.0 | 50 | 1.2756 | 0.555 | 0.5948 | 2.6781 | 0.555 | 0.4537 | 0.2800 | 0.2519 |
| No log | 3.0 | 75 | 0.9515 | 0.685 | 0.4392 | 1.9416 | 0.685 | 0.5937 | 0.2067 | 0.1288 |
| No log | 4.0 | 100 | 0.7861 | 0.72 | 0.3622 | 1.5125 | 0.72 | 0.6675 | 0.2050 | 0.0961 |
| No log | 5.0 | 125 | 0.7551 | 0.77 | 0.3362 | 1.5478 | 0.7700 | 0.7318 | 0.2043 | 0.0838 |
| No log | 6.0 | 150 | 0.8056 | 0.77 | 0.3525 | 1.4305 | 0.7700 | 0.7589 | 0.1943 | 0.0891 |
| No log | 7.0 | 175 | 0.7942 | 0.775 | 0.3310 | 1.8237 | 0.775 | 0.7454 | 0.1812 | 0.0924 |
| No log | 8.0 | 200 | 0.7735 | 0.77 | 0.3384 | 1.5161 | 0.7700 | 0.7530 | 0.1987 | 0.0931 |
| No log | 9.0 | 225 | 0.6992 | 0.79 | 0.3025 | 1.5664 | 0.79 | 0.7777 | 0.1631 | 0.0774 |
| No log | 10.0 | 250 | 0.6753 | 0.8 | 0.2955 | 1.5189 | 0.8000 | 0.7900 | 0.1654 | 0.0633 |
| No log | 11.0 | 275 | 0.7701 | 0.805 | 0.3018 | 1.4787 | 0.805 | 0.7932 | 0.1581 | 0.0881 |
| No log | 12.0 | 300 | 0.7164 | 0.79 | 0.3292 | 1.3527 | 0.79 | 0.7892 | 0.1946 | 0.0871 |
| No log | 13.0 | 325 | 0.6376 | 0.8 | 0.2901 | 1.4953 | 0.8000 | 0.7824 | 0.1770 | 0.0659 |
| No log | 14.0 | 350 | 0.7319 | 0.77 | 0.3247 | 1.6062 | 0.7700 | 0.7424 | 0.1803 | 0.0816 |
| No log | 15.0 | 375 | 0.5749 | 0.805 | 0.2738 | 0.8483 | 0.805 | 0.8010 | 0.1569 | 0.0647 |
| No log | 16.0 | 400 | 0.6879 | 0.775 | 0.3085 | 1.3379 | 0.775 | 0.7759 | 0.1909 | 0.0730 |
| No log | 17.0 | 425 | 0.5094 | 0.85 | 0.2241 | 1.4391 | 0.85 | 0.8360 | 0.1589 | 0.0441 |
| No log | 18.0 | 450 | 0.6826 | 0.8 | 0.3015 | 1.6933 | 0.8000 | 0.7969 | 0.1651 | 0.0792 |
| No log | 19.0 | 475 | 0.5677 | 0.825 | 0.2622 | 1.5426 | 0.825 | 0.8051 | 0.1600 | 0.0515 |
| 0.4493 | 20.0 | 500 | 0.5156 | 0.85 | 0.2312 | 1.5882 | 0.85 | 0.8471 | 0.1466 | 0.0427 |
| 0.4493 | 21.0 | 525 | 0.5743 | 0.83 | 0.2600 | 1.5702 | 0.83 | 0.8187 | 0.1604 | 0.0540 |
| 0.4493 | 22.0 | 550 | 0.5872 | 0.825 | 0.2712 | 1.6270 | 0.825 | 0.8056 | 0.1687 | 0.0572 |
| 0.4493 | 23.0 | 575 | 0.5770 | 0.81 | 0.2701 | 1.5089 | 0.81 | 0.7969 | 0.1559 | 0.0655 |
| 0.4493 | 24.0 | 600 | 0.5621 | 0.82 | 0.2590 | 1.3500 | 0.82 | 0.8052 | 0.1621 | 0.0587 |
| 0.4493 | 25.0 | 625 | 0.5480 | 0.805 | 0.2518 | 1.2519 | 0.805 | 0.7884 | 0.1483 | 0.0619 |
| 0.4493 | 26.0 | 650 | 0.5555 | 0.81 | 0.2575 | 1.3183 | 0.81 | 0.7926 | 0.1585 | 0.0598 |
| 0.4493 | 27.0 | 675 | 0.5449 | 0.82 | 0.2524 | 1.4400 | 0.82 | 0.8059 | 0.1713 | 0.0579 |
| 0.4493 | 28.0 | 700 | 0.5483 | 0.81 | 0.2545 | 1.4400 | 0.81 | 0.7894 | 0.1450 | 0.0580 |
| 0.4493 | 29.0 | 725 | 0.5448 | 0.81 | 0.2524 | 1.3070 | 0.81 | 0.7931 | 0.1447 | 0.0595 |
| 0.4493 | 30.0 | 750 | 0.5476 | 0.815 | 0.2538 | 1.3101 | 0.815 | 0.7982 | 0.1536 | 0.0582 |
| 0.4493 | 31.0 | 775 | 0.5433 | 0.82 | 0.2529 | 1.3812 | 0.82 | 0.8011 | 0.1637 | 0.0575 |
| 0.4493 | 32.0 | 800 | 0.5469 | 0.805 | 0.2528 | 1.2973 | 0.805 | 0.7905 | 0.1668 | 0.0600 |
| 0.4493 | 33.0 | 825 | 0.5443 | 0.815 | 0.2525 | 1.3020 | 0.815 | 0.7933 | 0.1768 | 0.0579 |
| 0.4493 | 34.0 | 850 | 0.5442 | 0.82 | 0.2521 | 1.3234 | 0.82 | 0.8011 | 0.1555 | 0.0580 |
| 0.4493 | 35.0 | 875 | 0.5434 | 0.82 | 0.2531 | 1.4362 | 0.82 | 0.8011 | 0.1430 | 0.0564 |
| 0.4493 | 36.0 | 900 | 0.5469 | 0.815 | 0.2534 | 1.3075 | 0.815 | 0.7933 | 0.1590 | 0.0578 |
| 0.4493 | 37.0 | 925 | 0.5468 | 0.815 | 0.2546 | 1.3204 | 0.815 | 0.7933 | 0.1623 | 0.0567 |
| 0.4493 | 38.0 | 950 | 0.5473 | 0.815 | 0.2540 | 1.3722 | 0.815 | 0.7933 | 0.1514 | 0.0582 |
| 0.4493 | 39.0 | 975 | 0.5453 | 0.82 | 0.2532 | 1.3874 | 0.82 | 0.8011 | 0.1751 | 0.0568 |
| 0.0581 | 40.0 | 1000 | 0.5475 | 0.815 | 0.2543 | 1.3116 | 0.815 | 0.7933 | 0.1654 | 0.0573 |
| 0.0581 | 41.0 | 1025 | 0.5452 | 0.815 | 0.2533 | 1.4421 | 0.815 | 0.7933 | 0.1459 | 0.0579 |
| 0.0581 | 42.0 | 1050 | 0.5467 | 0.815 | 0.2538 | 1.3730 | 0.815 | 0.7933 | 0.1642 | 0.0576 |
| 0.0581 | 43.0 | 1075 | 0.5478 | 0.815 | 0.2544 | 1.3086 | 0.815 | 0.7933 | 0.1657 | 0.0581 |
| 0.0581 | 44.0 | 1100 | 0.5482 | 0.815 | 0.2545 | 1.3744 | 0.815 | 0.7933 | 0.1629 | 0.0583 |
| 0.0581 | 45.0 | 1125 | 0.5493 | 0.815 | 0.2550 | 1.3676 | 0.815 | 0.7933 | 0.1638 | 0.0594 |
| 0.0581 | 46.0 | 1150 | 0.5478 | 0.82 | 0.2547 | 1.4645 | 0.82 | 0.8011 | 0.1631 | 0.0572 |
| 0.0581 | 47.0 | 1175 | 0.5487 | 0.815 | 0.2547 | 1.3795 | 0.815 | 0.7933 | 0.1634 | 0.0577 |
| 0.0581 | 48.0 | 1200 | 0.5471 | 0.825 | 0.2546 | 1.4421 | 0.825 | 0.8067 | 0.1436 | 0.0564 |
| 0.0581 | 49.0 | 1225 | 0.5489 | 0.815 | 0.2547 | 1.3676 | 0.815 | 0.7933 | 0.1663 | 0.0578 |
| 0.0581 | 50.0 | 1250 | 0.5482 | 0.82 | 0.2549 | 1.4346 | 0.82 | 0.7990 | 0.1481 | 0.0574 |
| 0.0581 | 51.0 | 1275 | 0.5472 | 0.82 | 0.2540 | 1.5012 | 0.82 | 0.8011 | 0.1565 | 0.0569 |
| 0.0581 | 52.0 | 1300 | 0.5489 | 0.825 | 0.2553 | 1.4351 | 0.825 | 0.8051 | 0.1608 | 0.0576 |
| 0.0581 | 53.0 | 1325 | 0.5486 | 0.815 | 0.2549 | 1.3799 | 0.815 | 0.7933 | 0.1483 | 0.0573 |
| 0.0581 | 54.0 | 1350 | 0.5498 | 0.815 | 0.2552 | 1.4434 | 0.815 | 0.7933 | 0.1542 | 0.0578 |
| 0.0581 | 55.0 | 1375 | 0.5508 | 0.82 | 0.2559 | 1.4394 | 0.82 | 0.7994 | 0.1562 | 0.0576 |
| 0.0581 | 56.0 | 1400 | 0.5492 | 0.825 | 0.2552 | 1.4368 | 0.825 | 0.8051 | 0.1483 | 0.0572 |
| 0.0581 | 57.0 | 1425 | 0.5501 | 0.815 | 0.2552 | 1.3874 | 0.815 | 0.7933 | 0.1390 | 0.0579 |
| 0.0581 | 58.0 | 1450 | 0.5497 | 0.82 | 0.2553 | 1.4365 | 0.82 | 0.7994 | 0.1437 | 0.0579 |
| 0.0581 | 59.0 | 1475 | 0.5507 | 0.82 | 0.2557 | 1.4343 | 0.82 | 0.7994 | 0.1389 | 0.0584 |
| 0.056 | 60.0 | 1500 | 0.5501 | 0.825 | 0.2555 | 1.4410 | 0.825 | 0.8051 | 0.1585 | 0.0583 |
| 0.056 | 61.0 | 1525 | 0.5510 | 0.82 | 0.2559 | 1.4380 | 0.82 | 0.7994 | 0.1395 | 0.0578 |
| 0.056 | 62.0 | 1550 | 0.5510 | 0.82 | 0.2558 | 1.4421 | 0.82 | 0.7994 | 0.1441 | 0.0573 |
| 0.056 | 63.0 | 1575 | 0.5508 | 0.82 | 0.2559 | 1.4369 | 0.82 | 0.7994 | 0.1395 | 0.0575 |
| 0.056 | 64.0 | 1600 | 0.5514 | 0.82 | 0.2560 | 1.4410 | 0.82 | 0.7994 | 0.1393 | 0.0579 |
| 0.056 | 65.0 | 1625 | 0.5519 | 0.825 | 0.2563 | 1.4544 | 0.825 | 0.8051 | 0.1427 | 0.0575 |
| 0.056 | 66.0 | 1650 | 0.5510 | 0.82 | 0.2560 | 1.4400 | 0.82 | 0.7994 | 0.1391 | 0.0576 |
| 0.056 | 67.0 | 1675 | 0.5520 | 0.825 | 0.2563 | 1.4396 | 0.825 | 0.8051 | 0.1422 | 0.0580 |
| 0.056 | 68.0 | 1700 | 0.5516 | 0.82 | 0.2561 | 1.4412 | 0.82 | 0.7994 | 0.1394 | 0.0580 |
| 0.056 | 69.0 | 1725 | 0.5512 | 0.82 | 0.2560 | 1.4433 | 0.82 | 0.7994 | 0.1393 | 0.0577 |
| 0.056 | 70.0 | 1750 | 0.5515 | 0.82 | 0.2561 | 1.4418 | 0.82 | 0.7994 | 0.1391 | 0.0576 |
| 0.056 | 71.0 | 1775 | 0.5517 | 0.82 | 0.2562 | 1.4448 | 0.82 | 0.7994 | 0.1449 | 0.0581 |
| 0.056 | 72.0 | 1800 | 0.5524 | 0.825 | 0.2566 | 1.4421 | 0.825 | 0.8051 | 0.1437 | 0.0579 |
| 0.056 | 73.0 | 1825 | 0.5518 | 0.82 | 0.2562 | 1.4403 | 0.82 | 0.7994 | 0.1469 | 0.0576 |
| 0.056 | 74.0 | 1850 | 0.5529 | 0.825 | 0.2568 | 1.4450 | 0.825 | 0.8051 | 0.1434 | 0.0580 |
| 0.056 | 75.0 | 1875 | 0.5528 | 0.82 | 0.2566 | 1.4475 | 0.82 | 0.7994 | 0.1447 | 0.0585 |
| 0.056 | 76.0 | 1900 | 0.5529 | 0.82 | 0.2568 | 1.4463 | 0.82 | 0.7994 | 0.1447 | 0.0578 |
| 0.056 | 77.0 | 1925 | 0.5528 | 0.82 | 0.2567 | 1.4469 | 0.82 | 0.7994 | 0.1401 | 0.0577 |
| 0.056 | 78.0 | 1950 | 0.5525 | 0.82 | 0.2565 | 1.4506 | 0.82 | 0.7994 | 0.1444 | 0.0576 |
| 0.056 | 79.0 | 1975 | 0.5527 | 0.825 | 0.2567 | 1.4479 | 0.825 | 0.8051 | 0.1423 | 0.0576 |
| 0.0559 | 80.0 | 2000 | 0.5530 | 0.825 | 0.2568 | 1.4429 | 0.825 | 0.8051 | 0.1423 | 0.0578 |
| 0.0559 | 81.0 | 2025 | 0.5529 | 0.825 | 0.2567 | 1.4489 | 0.825 | 0.8051 | 0.1422 | 0.0581 |
| 0.0559 | 82.0 | 2050 | 0.5529 | 0.82 | 0.2568 | 1.4550 | 0.82 | 0.7994 | 0.1401 | 0.0576 |
| 0.0559 | 83.0 | 2075 | 0.5534 | 0.82 | 0.2570 | 1.4458 | 0.82 | 0.7994 | 0.1399 | 0.0580 |
| 0.0559 | 84.0 | 2100 | 0.5530 | 0.82 | 0.2568 | 1.4497 | 0.82 | 0.7994 | 0.1399 | 0.0577 |
| 0.0559 | 85.0 | 2125 | 0.5533 | 0.82 | 0.2570 | 1.4507 | 0.82 | 0.7994 | 0.1401 | 0.0577 |
| 0.0559 | 86.0 | 2150 | 0.5531 | 0.825 | 0.2568 | 1.4515 | 0.825 | 0.8051 | 0.1428 | 0.0577 |
| 0.0559 | 87.0 | 2175 | 0.5534 | 0.82 | 0.2569 | 1.4503 | 0.82 | 0.7994 | 0.1404 | 0.0577 |
| 0.0559 | 88.0 | 2200 | 0.5534 | 0.82 | 0.2569 | 1.4532 | 0.82 | 0.7994 | 0.1399 | 0.0581 |
| 0.0559 | 89.0 | 2225 | 0.5533 | 0.825 | 0.2569 | 1.4499 | 0.825 | 0.8051 | 0.1423 | 0.0578 |
| 0.0559 | 90.0 | 2250 | 0.5534 | 0.82 | 0.2570 | 1.4517 | 0.82 | 0.7994 | 0.1404 | 0.0577 |
| 0.0559 | 91.0 | 2275 | 0.5533 | 0.82 | 0.2569 | 1.4526 | 0.82 | 0.7994 | 0.1405 | 0.0579 |
| 0.0559 | 92.0 | 2300 | 0.5534 | 0.825 | 0.2570 | 1.4533 | 0.825 | 0.8051 | 0.1424 | 0.0577 |
| 0.0559 | 93.0 | 2325 | 0.5535 | 0.82 | 0.2570 | 1.4527 | 0.82 | 0.7994 | 0.1399 | 0.0580 |
| 0.0559 | 94.0 | 2350 | 0.5536 | 0.82 | 0.2571 | 1.4533 | 0.82 | 0.7994 | 0.1404 | 0.0577 |
| 0.0559 | 95.0 | 2375 | 0.5536 | 0.82 | 0.2571 | 1.4547 | 0.82 | 0.7994 | 0.1400 | 0.0579 |
| 0.0559 | 96.0 | 2400 | 0.5535 | 0.82 | 0.2570 | 1.4567 | 0.82 | 0.7994 | 0.1400 | 0.0578 |
| 0.0559 | 97.0 | 2425 | 0.5536 | 0.82 | 0.2571 | 1.4523 | 0.82 | 0.7994 | 0.1404 | 0.0579 |
| 0.0559 | 98.0 | 2450 | 0.5536 | 0.82 | 0.2571 | 1.4570 | 0.82 | 0.7994 | 0.1404 | 0.0578 |
| 0.0559 | 99.0 | 2475 | 0.5536 | 0.82 | 0.2571 | 1.4570 | 0.82 | 0.7994 | 0.1404 | 0.0578 |
| 0.0559 | 100.0 | 2500 | 0.5536 | 0.82 | 0.2571 | 1.4560 | 0.82 | 0.7994 | 0.1404 | 0.0578 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
ayanban011/6_e_200-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 6_e_200-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the Tobacco3482 dataset.
It achieves the following results on the evaluation set (a sketch of the NLL column follows the list):
- Loss: 0.4137
- Accuracy: 0.83
- Brier Loss: 0.2631
- Nll: 1.5189
- F1 Micro: 0.83
- F1 Macro: 0.8172
- Ece: 0.2007
- Aurc: 0.0591
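The NLL column is presumably the average negative log-likelihood of the true class, which, computed on logits, is just cross-entropy:

```python
import torch.nn.functional as F

def nll(logits, labels):
    # Mean negative log-likelihood of the correct class.
    return F.cross_entropy(logits, labels)
```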
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 1.6265 | 0.23 | 0.8647 | 5.1432 | 0.23 | 0.1847 | 0.2751 | 0.7516 |
| No log | 2.0 | 50 | 1.0240 | 0.505 | 0.6074 | 2.7425 | 0.505 | 0.3980 | 0.2977 | 0.2705 |
| No log | 3.0 | 75 | 0.8130 | 0.655 | 0.4809 | 2.4939 | 0.655 | 0.5584 | 0.2576 | 0.1502 |
| No log | 4.0 | 100 | 0.6703 | 0.735 | 0.3867 | 1.3509 | 0.735 | 0.6895 | 0.2334 | 0.1109 |
| No log | 5.0 | 125 | 0.6313 | 0.755 | 0.3420 | 1.2521 | 0.755 | 0.7207 | 0.2081 | 0.0789 |
| No log | 6.0 | 150 | 0.6598 | 0.76 | 0.3543 | 1.4171 | 0.76 | 0.7103 | 0.2275 | 0.0886 |
| No log | 7.0 | 175 | 0.5669 | 0.77 | 0.3368 | 1.5060 | 0.7700 | 0.7351 | 0.2247 | 0.0941 |
| No log | 8.0 | 200 | 0.5486 | 0.775 | 0.3004 | 1.1511 | 0.775 | 0.7413 | 0.2252 | 0.0640 |
| No log | 9.0 | 225 | 0.5456 | 0.795 | 0.3141 | 1.4663 | 0.795 | 0.7762 | 0.2198 | 0.0889 |
| No log | 10.0 | 250 | 0.4954 | 0.82 | 0.2819 | 1.4644 | 0.82 | 0.7981 | 0.2150 | 0.0649 |
| No log | 11.0 | 275 | 0.4804 | 0.805 | 0.2866 | 1.3705 | 0.805 | 0.7927 | 0.2078 | 0.0658 |
| No log | 12.0 | 300 | 0.5234 | 0.785 | 0.3152 | 1.5290 | 0.785 | 0.7681 | 0.2149 | 0.0637 |
| No log | 13.0 | 325 | 0.4701 | 0.815 | 0.2839 | 1.4490 | 0.815 | 0.8010 | 0.2315 | 0.0586 |
| No log | 14.0 | 350 | 0.4859 | 0.795 | 0.2807 | 1.1224 | 0.795 | 0.7957 | 0.2170 | 0.0512 |
| No log | 15.0 | 375 | 0.5580 | 0.79 | 0.3272 | 1.7539 | 0.79 | 0.7735 | 0.2376 | 0.0708 |
| No log | 16.0 | 400 | 0.4918 | 0.8 | 0.2961 | 1.5112 | 0.8000 | 0.7988 | 0.1850 | 0.0568 |
| No log | 17.0 | 425 | 0.4442 | 0.8 | 0.2846 | 1.6182 | 0.8000 | 0.7767 | 0.2083 | 0.0712 |
| No log | 18.0 | 450 | 0.4460 | 0.82 | 0.2760 | 1.6839 | 0.82 | 0.8027 | 0.2127 | 0.0523 |
| No log | 19.0 | 475 | 0.4423 | 0.825 | 0.2676 | 1.3774 | 0.825 | 0.8176 | 0.1853 | 0.0557 |
| 0.4472 | 20.0 | 500 | 0.4998 | 0.81 | 0.2910 | 1.7711 | 0.81 | 0.8152 | 0.2181 | 0.0635 |
| 0.4472 | 21.0 | 525 | 0.4579 | 0.83 | 0.2871 | 1.7025 | 0.83 | 0.8135 | 0.1927 | 0.0696 |
| 0.4472 | 22.0 | 550 | 0.4421 | 0.825 | 0.2683 | 1.6453 | 0.825 | 0.8215 | 0.1929 | 0.0613 |
| 0.4472 | 23.0 | 575 | 0.4368 | 0.8 | 0.2821 | 1.7298 | 0.8000 | 0.7684 | 0.2060 | 0.0771 |
| 0.4472 | 24.0 | 600 | 0.4310 | 0.83 | 0.2689 | 1.4699 | 0.83 | 0.8163 | 0.2067 | 0.0556 |
| 0.4472 | 25.0 | 625 | 0.4394 | 0.83 | 0.2751 | 1.5955 | 0.83 | 0.8166 | 0.2138 | 0.0681 |
| 0.4472 | 26.0 | 650 | 0.4395 | 0.815 | 0.2786 | 1.6788 | 0.815 | 0.8033 | 0.2034 | 0.0643 |
| 0.4472 | 27.0 | 675 | 0.4118 | 0.84 | 0.2578 | 1.5641 | 0.8400 | 0.8293 | 0.2024 | 0.0554 |
| 0.4472 | 28.0 | 700 | 0.4273 | 0.82 | 0.2707 | 1.7118 | 0.82 | 0.8090 | 0.2133 | 0.0674 |
| 0.4472 | 29.0 | 725 | 0.4207 | 0.835 | 0.2648 | 1.6469 | 0.835 | 0.8206 | 0.1948 | 0.0652 |
| 0.4472 | 30.0 | 750 | 0.4172 | 0.825 | 0.2620 | 1.5024 | 0.825 | 0.8114 | 0.1833 | 0.0601 |
| 0.4472 | 31.0 | 775 | 0.4148 | 0.825 | 0.2610 | 1.4994 | 0.825 | 0.8070 | 0.2052 | 0.0593 |
| 0.4472 | 32.0 | 800 | 0.4148 | 0.825 | 0.2627 | 1.6293 | 0.825 | 0.8088 | 0.2080 | 0.0618 |
| 0.4472 | 33.0 | 825 | 0.4159 | 0.825 | 0.2625 | 1.5069 | 0.825 | 0.8135 | 0.2082 | 0.0604 |
| 0.4472 | 34.0 | 850 | 0.4168 | 0.825 | 0.2638 | 1.5770 | 0.825 | 0.8137 | 0.1888 | 0.0588 |
| 0.4472 | 35.0 | 875 | 0.4181 | 0.82 | 0.2640 | 1.5404 | 0.82 | 0.8043 | 0.2145 | 0.0582 |
| 0.4472 | 36.0 | 900 | 0.4154 | 0.83 | 0.2618 | 1.5719 | 0.83 | 0.8165 | 0.1965 | 0.0586 |
| 0.4472 | 37.0 | 925 | 0.4160 | 0.825 | 0.2632 | 1.5840 | 0.825 | 0.8137 | 0.2003 | 0.0604 |
| 0.4472 | 38.0 | 950 | 0.4133 | 0.83 | 0.2616 | 1.5711 | 0.83 | 0.8163 | 0.2040 | 0.0596 |
| 0.4472 | 39.0 | 975 | 0.4167 | 0.825 | 0.2635 | 1.5210 | 0.825 | 0.8138 | 0.1930 | 0.0590 |
| 0.0652 | 40.0 | 1000 | 0.4162 | 0.83 | 0.2630 | 1.6312 | 0.83 | 0.8163 | 0.1973 | 0.0593 |
| 0.0652 | 41.0 | 1025 | 0.4144 | 0.83 | 0.2626 | 1.5787 | 0.83 | 0.8163 | 0.2068 | 0.0603 |
| 0.0652 | 42.0 | 1050 | 0.4150 | 0.83 | 0.2631 | 1.5789 | 0.83 | 0.8163 | 0.1970 | 0.0588 |
| 0.0652 | 43.0 | 1075 | 0.4158 | 0.825 | 0.2635 | 1.5833 | 0.825 | 0.8138 | 0.1927 | 0.0597 |
| 0.0652 | 44.0 | 1100 | 0.4132 | 0.83 | 0.2622 | 1.5130 | 0.83 | 0.8163 | 0.2030 | 0.0593 |
| 0.0652 | 45.0 | 1125 | 0.4146 | 0.83 | 0.2630 | 1.6312 | 0.83 | 0.8165 | 0.2010 | 0.0587 |
| 0.0652 | 46.0 | 1150 | 0.4138 | 0.825 | 0.2624 | 1.5301 | 0.825 | 0.8135 | 0.2065 | 0.0587 |
| 0.0652 | 47.0 | 1175 | 0.4142 | 0.83 | 0.2627 | 1.6292 | 0.83 | 0.8163 | 0.1984 | 0.0591 |
| 0.0652 | 48.0 | 1200 | 0.4146 | 0.825 | 0.2629 | 1.5735 | 0.825 | 0.8137 | 0.1998 | 0.0589 |
| 0.0652 | 49.0 | 1225 | 0.4143 | 0.83 | 0.2630 | 1.5276 | 0.83 | 0.8163 | 0.2116 | 0.0599 |
| 0.0652 | 50.0 | 1250 | 0.4140 | 0.83 | 0.2628 | 1.5705 | 0.83 | 0.8163 | 0.1966 | 0.0590 |
| 0.0652 | 51.0 | 1275 | 0.4152 | 0.825 | 0.2637 | 1.5747 | 0.825 | 0.8138 | 0.1835 | 0.0593 |
| 0.0652 | 52.0 | 1300 | 0.4145 | 0.825 | 0.2629 | 1.5796 | 0.825 | 0.8137 | 0.1926 | 0.0593 |
| 0.0652 | 53.0 | 1325 | 0.4147 | 0.825 | 0.2631 | 1.6323 | 0.825 | 0.8138 | 0.1838 | 0.0588 |
| 0.0652 | 54.0 | 1350 | 0.4141 | 0.83 | 0.2628 | 1.5763 | 0.83 | 0.8163 | 0.2035 | 0.0592 |
| 0.0652 | 55.0 | 1375 | 0.4137 | 0.83 | 0.2630 | 1.5751 | 0.83 | 0.8163 | 0.2042 | 0.0590 |
| 0.0652 | 56.0 | 1400 | 0.4145 | 0.83 | 0.2632 | 1.6307 | 0.83 | 0.8163 | 0.1981 | 0.0588 |
| 0.0652 | 57.0 | 1425 | 0.4149 | 0.825 | 0.2634 | 1.5225 | 0.825 | 0.8137 | 0.2008 | 0.0589 |
| 0.0652 | 58.0 | 1450 | 0.4146 | 0.83 | 0.2634 | 1.5725 | 0.83 | 0.8163 | 0.2121 | 0.0589 |
| 0.0652 | 59.0 | 1475 | 0.4142 | 0.83 | 0.2632 | 1.5214 | 0.83 | 0.8163 | 0.2028 | 0.0590 |
| 0.0614 | 60.0 | 1500 | 0.4145 | 0.83 | 0.2634 | 1.5237 | 0.83 | 0.8163 | 0.1981 | 0.0585 |
| 0.0614 | 61.0 | 1525 | 0.4142 | 0.83 | 0.2630 | 1.5710 | 0.83 | 0.8163 | 0.2070 | 0.0591 |
| 0.0614 | 62.0 | 1550 | 0.4139 | 0.825 | 0.2631 | 1.5733 | 0.825 | 0.8135 | 0.1986 | 0.0594 |
| 0.0614 | 63.0 | 1575 | 0.4139 | 0.825 | 0.2630 | 1.5813 | 0.825 | 0.8136 | 0.1984 | 0.0593 |
| 0.0614 | 64.0 | 1600 | 0.4138 | 0.83 | 0.2629 | 1.5729 | 0.83 | 0.8163 | 0.2035 | 0.0590 |
| 0.0614 | 65.0 | 1625 | 0.4139 | 0.825 | 0.2629 | 1.5715 | 0.825 | 0.8136 | 0.2026 | 0.0593 |
| 0.0614 | 66.0 | 1650 | 0.4136 | 0.825 | 0.2629 | 1.5768 | 0.825 | 0.8135 | 0.1988 | 0.0592 |
| 0.0614 | 67.0 | 1675 | 0.4139 | 0.825 | 0.2629 | 1.5709 | 0.825 | 0.8135 | 0.1987 | 0.0593 |
| 0.0614 | 68.0 | 1700 | 0.4143 | 0.825 | 0.2633 | 1.5744 | 0.825 | 0.8138 | 0.1896 | 0.0595 |
| 0.0614 | 69.0 | 1725 | 0.4142 | 0.825 | 0.2632 | 1.5752 | 0.825 | 0.8138 | 0.1896 | 0.0593 |
| 0.0614 | 70.0 | 1750 | 0.4142 | 0.825 | 0.2632 | 1.5769 | 0.825 | 0.8138 | 0.1879 | 0.0594 |
| 0.0614 | 71.0 | 1775 | 0.4138 | 0.83 | 0.2630 | 1.5734 | 0.83 | 0.8163 | 0.2073 | 0.0588 |
| 0.0614 | 72.0 | 1800 | 0.4140 | 0.825 | 0.2631 | 1.5734 | 0.825 | 0.8138 | 0.1977 | 0.0593 |
| 0.0614 | 73.0 | 1825 | 0.4135 | 0.83 | 0.2629 | 1.5711 | 0.83 | 0.8163 | 0.2035 | 0.0589 |
| 0.0614 | 74.0 | 1850 | 0.4140 | 0.83 | 0.2632 | 1.5717 | 0.83 | 0.8163 | 0.2038 | 0.0590 |
| 0.0614 | 75.0 | 1875 | 0.4141 | 0.825 | 0.2633 | 1.5205 | 0.825 | 0.8138 | 0.1838 | 0.0593 |
| 0.0614 | 76.0 | 1900 | 0.4138 | 0.825 | 0.2631 | 1.5218 | 0.825 | 0.8137 | 0.1838 | 0.0595 |
| 0.0614 | 77.0 | 1925 | 0.4134 | 0.825 | 0.2628 | 1.5710 | 0.825 | 0.8135 | 0.1937 | 0.0591 |
| 0.0614 | 78.0 | 1950 | 0.4135 | 0.83 | 0.2629 | 1.5688 | 0.83 | 0.8163 | 0.2067 | 0.0588 |
| 0.0614 | 79.0 | 1975 | 0.4138 | 0.825 | 0.2631 | 1.5143 | 0.825 | 0.8137 | 0.1942 | 0.0592 |
| 0.0613 | 80.0 | 2000 | 0.4134 | 0.825 | 0.2628 | 1.5152 | 0.825 | 0.8135 | 0.1939 | 0.0591 |
| 0.0613 | 81.0 | 2025 | 0.4139 | 0.825 | 0.2632 | 1.5144 | 0.825 | 0.8136 | 0.1903 | 0.0593 |
| 0.0613 | 82.0 | 2050 | 0.4139 | 0.83 | 0.2632 | 1.5242 | 0.83 | 0.8163 | 0.1894 | 0.0589 |
| 0.0613 | 83.0 | 2075 | 0.4138 | 0.825 | 0.2631 | 1.5159 | 0.825 | 0.8136 | 0.2014 | 0.0594 |
| 0.0613 | 84.0 | 2100 | 0.4137 | 0.825 | 0.2631 | 1.5707 | 0.825 | 0.8136 | 0.1954 | 0.0592 |
| 0.0613 | 85.0 | 2125 | 0.4136 | 0.825 | 0.2630 | 1.5252 | 0.825 | 0.8136 | 0.1878 | 0.0592 |
| 0.0613 | 86.0 | 2150 | 0.4138 | 0.83 | 0.2630 | 1.5186 | 0.83 | 0.8172 | 0.2024 | 0.0588 |
| 0.0613 | 87.0 | 2175 | 0.4139 | 0.825 | 0.2632 | 1.5201 | 0.825 | 0.8138 | 0.1927 | 0.0592 |
| 0.0613 | 88.0 | 2200 | 0.4138 | 0.83 | 0.2631 | 1.5285 | 0.83 | 0.8172 | 0.1897 | 0.0591 |
| 0.0613 | 89.0 | 2225 | 0.4137 | 0.825 | 0.2631 | 1.5185 | 0.825 | 0.8136 | 0.1956 | 0.0593 |
| 0.0613 | 90.0 | 2250 | 0.4137 | 0.83 | 0.2631 | 1.5212 | 0.83 | 0.8172 | 0.2007 | 0.0591 |
| 0.0613 | 91.0 | 2275 | 0.4138 | 0.825 | 0.2631 | 1.5185 | 0.825 | 0.8138 | 0.1915 | 0.0593 |
| 0.0613 | 92.0 | 2300 | 0.4136 | 0.83 | 0.2630 | 1.5174 | 0.83 | 0.8172 | 0.2067 | 0.0590 |
| 0.0613 | 93.0 | 2325 | 0.4137 | 0.83 | 0.2631 | 1.5204 | 0.83 | 0.8172 | 0.1939 | 0.0591 |
| 0.0613 | 94.0 | 2350 | 0.4137 | 0.83 | 0.2631 | 1.5255 | 0.83 | 0.8172 | 0.2007 | 0.0592 |
| 0.0613 | 95.0 | 2375 | 0.4137 | 0.83 | 0.2631 | 1.5161 | 0.83 | 0.8172 | 0.1966 | 0.0591 |
| 0.0613 | 96.0 | 2400 | 0.4136 | 0.83 | 0.2630 | 1.5180 | 0.83 | 0.8172 | 0.2007 | 0.0590 |
| 0.0613 | 97.0 | 2425 | 0.4137 | 0.83 | 0.2631 | 1.5176 | 0.83 | 0.8172 | 0.1966 | 0.0591 |
| 0.0613 | 98.0 | 2450 | 0.4137 | 0.83 | 0.2631 | 1.5194 | 0.83 | 0.8172 | 0.1966 | 0.0590 |
| 0.0613 | 99.0 | 2475 | 0.4137 | 0.83 | 0.2631 | 1.5195 | 0.83 | 0.8172 | 0.2005 | 0.0591 |
| 0.0613 | 100.0 | 2500 | 0.4137 | 0.83 | 0.2631 | 1.5189 | 0.83 | 0.8172 | 0.2007 | 0.0591 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
ayanban011/6_e_200-tiny_tobacco3482_kd_CEKD_t2.5_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 6_e_200-tiny_tobacco3482_kd_CEKD_t2.5_a0.7
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the Tobacco3482 dataset.
It achieves the following results on the evaluation set (a sketch of the AURC column follows the list):
- Loss: 0.4858
- Accuracy: 0.845
- Brier Loss: 0.2480
- Nll: 1.6086
- F1 Micro: 0.845
- F1 Macro: 0.8265
- Ece: 0.1885
- Aurc: 0.0556
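AURC is the area under the risk-coverage curve: sort predictions from most to least confident, then average the cumulative error rate over all coverage levels. The card's exact implementation is not shown; a common sketch:

```python
import torch
import torch.nn.functional as F

def aurc(logits, labels):
    probs = F.softmax(logits, dim=-1)
    conf, preds = probs.max(dim=-1)
    errors = preds.ne(labels).float()
    order = conf.argsort(descending=True)              # most confident first
    cum_err = errors[order].cumsum(dim=0)
    coverage_counts = torch.arange(1, len(errors) + 1, device=errors.device)
    risks = cum_err / coverage_counts                  # risk at each coverage level
    return risks.mean()
```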
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 1.8068 | 0.235 | 0.8615 | 4.8212 | 0.235 | 0.1982 | 0.2851 | 0.7569 |
| No log | 2.0 | 50 | 1.1592 | 0.525 | 0.6018 | 2.7552 | 0.525 | 0.4251 | 0.2868 | 0.2632 |
| No log | 3.0 | 75 | 0.9152 | 0.66 | 0.4661 | 2.3855 | 0.66 | 0.5569 | 0.2425 | 0.1423 |
| No log | 4.0 | 100 | 0.7630 | 0.72 | 0.3764 | 1.5145 | 0.72 | 0.6868 | 0.2192 | 0.1030 |
| No log | 5.0 | 125 | 0.7141 | 0.77 | 0.3437 | 1.5182 | 0.7700 | 0.7351 | 0.2394 | 0.0885 |
| No log | 6.0 | 150 | 0.7497 | 0.765 | 0.3420 | 1.4392 | 0.765 | 0.7401 | 0.2226 | 0.0850 |
| No log | 7.0 | 175 | 0.6465 | 0.8 | 0.3252 | 1.4412 | 0.8000 | 0.7688 | 0.2262 | 0.0878 |
| No log | 8.0 | 200 | 0.6100 | 0.79 | 0.3013 | 1.1306 | 0.79 | 0.7626 | 0.2135 | 0.0732 |
| No log | 9.0 | 225 | 0.7123 | 0.78 | 0.3429 | 1.5207 | 0.78 | 0.7679 | 0.2302 | 0.1021 |
| No log | 10.0 | 250 | 0.5881 | 0.83 | 0.2873 | 1.3827 | 0.83 | 0.8274 | 0.2288 | 0.0503 |
| No log | 11.0 | 275 | 0.6079 | 0.805 | 0.2958 | 1.3646 | 0.805 | 0.7659 | 0.1826 | 0.0795 |
| No log | 12.0 | 300 | 0.5697 | 0.815 | 0.2829 | 1.4925 | 0.815 | 0.7846 | 0.1783 | 0.0637 |
| No log | 13.0 | 325 | 0.5641 | 0.815 | 0.2834 | 1.7183 | 0.815 | 0.7950 | 0.1949 | 0.0587 |
| No log | 14.0 | 350 | 0.5287 | 0.815 | 0.2615 | 1.1254 | 0.815 | 0.7990 | 0.1903 | 0.0508 |
| No log | 15.0 | 375 | 0.5220 | 0.82 | 0.2627 | 1.5375 | 0.82 | 0.8024 | 0.2039 | 0.0477 |
| No log | 16.0 | 400 | 0.5982 | 0.8 | 0.3044 | 1.9197 | 0.8000 | 0.7762 | 0.2026 | 0.0664 |
| No log | 17.0 | 425 | 0.5165 | 0.83 | 0.2653 | 1.6907 | 0.83 | 0.8120 | 0.2057 | 0.0505 |
| No log | 18.0 | 450 | 0.5345 | 0.84 | 0.2695 | 1.8508 | 0.8400 | 0.8259 | 0.1987 | 0.0505 |
| No log | 19.0 | 475 | 0.5325 | 0.83 | 0.2670 | 1.3625 | 0.83 | 0.8226 | 0.1937 | 0.0450 |
| 0.4632 | 20.0 | 500 | 0.5102 | 0.835 | 0.2561 | 1.4043 | 0.835 | 0.8179 | 0.1957 | 0.0568 |
| 0.4632 | 21.0 | 525 | 0.5171 | 0.83 | 0.2605 | 1.6651 | 0.83 | 0.8199 | 0.1834 | 0.0508 |
| 0.4632 | 22.0 | 550 | 0.5294 | 0.825 | 0.2594 | 1.6140 | 0.825 | 0.7987 | 0.1745 | 0.0559 |
| 0.4632 | 23.0 | 575 | 0.5234 | 0.835 | 0.2633 | 1.7530 | 0.835 | 0.8190 | 0.1937 | 0.0534 |
| 0.4632 | 24.0 | 600 | 0.4999 | 0.845 | 0.2469 | 1.4458 | 0.845 | 0.8181 | 0.1650 | 0.0580 |
| 0.4632 | 25.0 | 625 | 0.5025 | 0.825 | 0.2571 | 1.6308 | 0.825 | 0.8111 | 0.1712 | 0.0493 |
| 0.4632 | 26.0 | 650 | 0.5058 | 0.83 | 0.2563 | 1.5906 | 0.83 | 0.8074 | 0.1828 | 0.0566 |
| 0.4632 | 27.0 | 675 | 0.4873 | 0.835 | 0.2476 | 1.4946 | 0.835 | 0.8148 | 0.1727 | 0.0584 |
| 0.4632 | 28.0 | 700 | 0.4894 | 0.835 | 0.2486 | 1.6165 | 0.835 | 0.8153 | 0.1538 | 0.0573 |
| 0.4632 | 29.0 | 725 | 0.4834 | 0.835 | 0.2455 | 1.6185 | 0.835 | 0.8178 | 0.1869 | 0.0523 |
| 0.4632 | 30.0 | 750 | 0.4842 | 0.835 | 0.2460 | 1.4898 | 0.835 | 0.8115 | 0.1683 | 0.0524 |
| 0.4632 | 31.0 | 775 | 0.4869 | 0.835 | 0.2480 | 1.5350 | 0.835 | 0.8144 | 0.1796 | 0.0543 |
| 0.4632 | 32.0 | 800 | 0.4865 | 0.835 | 0.2473 | 1.6052 | 0.835 | 0.8179 | 0.1792 | 0.0560 |
| 0.4632 | 33.0 | 825 | 0.4856 | 0.83 | 0.2479 | 1.5508 | 0.83 | 0.8118 | 0.1773 | 0.0550 |
| 0.4632 | 34.0 | 850 | 0.4861 | 0.835 | 0.2477 | 1.5486 | 0.835 | 0.8144 | 0.1779 | 0.0568 |
| 0.4632 | 35.0 | 875 | 0.4873 | 0.835 | 0.2484 | 1.6076 | 0.835 | 0.8179 | 0.1803 | 0.0556 |
| 0.4632 | 36.0 | 900 | 0.4867 | 0.84 | 0.2480 | 1.5438 | 0.8400 | 0.8205 | 0.1791 | 0.0561 |
| 0.4632 | 37.0 | 925 | 0.4863 | 0.84 | 0.2480 | 1.6035 | 0.8400 | 0.8205 | 0.1768 | 0.0558 |
| 0.4632 | 38.0 | 950 | 0.4864 | 0.835 | 0.2479 | 1.5469 | 0.835 | 0.8144 | 0.1876 | 0.0563 |
| 0.4632 | 39.0 | 975 | 0.4866 | 0.835 | 0.2483 | 1.5996 | 0.835 | 0.8144 | 0.1759 | 0.0546 |
| 0.0757 | 40.0 | 1000 | 0.4869 | 0.84 | 0.2480 | 1.5412 | 0.8400 | 0.8205 | 0.1726 | 0.0550 |
| 0.0757 | 41.0 | 1025 | 0.4869 | 0.84 | 0.2482 | 1.6060 | 0.8400 | 0.8205 | 0.1848 | 0.0545 |
| 0.0757 | 42.0 | 1050 | 0.4872 | 0.835 | 0.2485 | 1.6080 | 0.835 | 0.8179 | 0.1812 | 0.0558 |
| 0.0757 | 43.0 | 1075 | 0.4863 | 0.84 | 0.2476 | 1.6046 | 0.8400 | 0.8205 | 0.1773 | 0.0538 |
| 0.0757 | 44.0 | 1100 | 0.4866 | 0.84 | 0.2482 | 1.6117 | 0.8400 | 0.8205 | 0.1719 | 0.0557 |
| 0.0757 | 45.0 | 1125 | 0.4858 | 0.835 | 0.2478 | 1.6046 | 0.835 | 0.8144 | 0.1728 | 0.0554 |
| 0.0757 | 46.0 | 1150 | 0.4860 | 0.845 | 0.2478 | 1.6034 | 0.845 | 0.8265 | 0.2000 | 0.0551 |
| 0.0757 | 47.0 | 1175 | 0.4863 | 0.845 | 0.2480 | 1.5422 | 0.845 | 0.8265 | 0.1918 | 0.0560 |
| 0.0757 | 48.0 | 1200 | 0.4871 | 0.84 | 0.2486 | 1.5468 | 0.8400 | 0.8205 | 0.1795 | 0.0558 |
| 0.0757 | 49.0 | 1225 | 0.4855 | 0.835 | 0.2475 | 1.5511 | 0.835 | 0.8144 | 0.1814 | 0.0555 |
| 0.0757 | 50.0 | 1250 | 0.4854 | 0.84 | 0.2477 | 1.5488 | 0.8400 | 0.8205 | 0.1830 | 0.0549 |
| 0.0757 | 51.0 | 1275 | 0.4854 | 0.835 | 0.2475 | 1.5505 | 0.835 | 0.8144 | 0.1879 | 0.0547 |
| 0.0757 | 52.0 | 1300 | 0.4867 | 0.845 | 0.2483 | 1.6045 | 0.845 | 0.8265 | 0.1700 | 0.0560 |
| 0.0757 | 53.0 | 1325 | 0.4849 | 0.835 | 0.2473 | 1.6070 | 0.835 | 0.8179 | 0.1736 | 0.0546 |
| 0.0757 | 54.0 | 1350 | 0.4860 | 0.845 | 0.2479 | 1.5524 | 0.845 | 0.8265 | 0.1762 | 0.0549 |
| 0.0757 | 55.0 | 1375 | 0.4863 | 0.84 | 0.2481 | 1.5535 | 0.8400 | 0.8205 | 0.1864 | 0.0560 |
| 0.0757 | 56.0 | 1400 | 0.4871 | 0.84 | 0.2484 | 1.6074 | 0.8400 | 0.8205 | 0.1821 | 0.0562 |
| 0.0757 | 57.0 | 1425 | 0.4868 | 0.845 | 0.2483 | 1.5544 | 0.845 | 0.8265 | 0.1832 | 0.0557 |
| 0.0757 | 58.0 | 1450 | 0.4848 | 0.845 | 0.2474 | 1.5455 | 0.845 | 0.8265 | 0.1770 | 0.0551 |
| 0.0757 | 59.0 | 1475 | 0.4861 | 0.845 | 0.2481 | 1.6056 | 0.845 | 0.8265 | 0.1824 | 0.0552 |
| 0.0718 | 60.0 | 1500 | 0.4860 | 0.845 | 0.2481 | 1.5485 | 0.845 | 0.8265 | 0.1874 | 0.0555 |
| 0.0718 | 61.0 | 1525 | 0.4859 | 0.845 | 0.2480 | 1.5471 | 0.845 | 0.8265 | 0.1833 | 0.0545 |
| 0.0718 | 62.0 | 1550 | 0.4855 | 0.84 | 0.2477 | 1.5557 | 0.8400 | 0.8205 | 0.1892 | 0.0556 |
| 0.0718 | 63.0 | 1575 | 0.4856 | 0.845 | 0.2479 | 1.5511 | 0.845 | 0.8265 | 0.1804 | 0.0555 |
| 0.0718 | 64.0 | 1600 | 0.4863 | 0.845 | 0.2482 | 1.5482 | 0.845 | 0.8265 | 0.1761 | 0.0559 |
| 0.0718 | 65.0 | 1625 | 0.4857 | 0.845 | 0.2478 | 1.5503 | 0.845 | 0.8265 | 0.1944 | 0.0559 |
| 0.0718 | 66.0 | 1650 | 0.4858 | 0.845 | 0.2480 | 1.5463 | 0.845 | 0.8265 | 0.1832 | 0.0556 |
| 0.0718 | 67.0 | 1675 | 0.4855 | 0.845 | 0.2478 | 1.5493 | 0.845 | 0.8265 | 0.1947 | 0.0558 |
| 0.0718 | 68.0 | 1700 | 0.4858 | 0.845 | 0.2480 | 1.5482 | 0.845 | 0.8265 | 0.1831 | 0.0553 |
| 0.0718 | 69.0 | 1725 | 0.4863 | 0.845 | 0.2483 | 1.5470 | 0.845 | 0.8265 | 0.1904 | 0.0558 |
| 0.0718 | 70.0 | 1750 | 0.4859 | 0.845 | 0.2479 | 1.5449 | 0.845 | 0.8265 | 0.1836 | 0.0556 |
| 0.0718 | 71.0 | 1775 | 0.4865 | 0.845 | 0.2484 | 1.5476 | 0.845 | 0.8265 | 0.1999 | 0.0557 |
| 0.0718 | 72.0 | 1800 | 0.4855 | 0.845 | 0.2479 | 1.5472 | 0.845 | 0.8265 | 0.1875 | 0.0555 |
| 0.0718 | 73.0 | 1825 | 0.4859 | 0.845 | 0.2481 | 1.5514 | 0.845 | 0.8265 | 0.1946 | 0.0557 |
| 0.0718 | 74.0 | 1850 | 0.4863 | 0.845 | 0.2483 | 1.6121 | 0.845 | 0.8265 | 0.1904 | 0.0559 |
| 0.0718 | 75.0 | 1875 | 0.4861 | 0.845 | 0.2482 | 1.6150 | 0.845 | 0.8265 | 0.1807 | 0.0559 |
| 0.0718 | 76.0 | 1900 | 0.4856 | 0.845 | 0.2479 | 1.5755 | 0.845 | 0.8265 | 0.1877 | 0.0555 |
| 0.0718 | 77.0 | 1925 | 0.4857 | 0.845 | 0.2480 | 1.5550 | 0.845 | 0.8265 | 0.1880 | 0.0557 |
| 0.0718 | 78.0 | 1950 | 0.4856 | 0.845 | 0.2479 | 1.6081 | 0.845 | 0.8265 | 0.1877 | 0.0553 |
| 0.0718 | 79.0 | 1975 | 0.4858 | 0.845 | 0.2481 | 1.5526 | 0.845 | 0.8265 | 0.1878 | 0.0556 |
| 0.0718 | 80.0 | 2000 | 0.4860 | 0.845 | 0.2481 | 1.5562 | 0.845 | 0.8265 | 0.1881 | 0.0556 |
| 0.0718 | 81.0 | 2025 | 0.4860 | 0.845 | 0.2481 | 1.5569 | 0.845 | 0.8265 | 0.1837 | 0.0553 |
| 0.0718 | 82.0 | 2050 | 0.4863 | 0.845 | 0.2483 | 1.5522 | 0.845 | 0.8265 | 0.1937 | 0.0558 |
| 0.0718 | 83.0 | 2075 | 0.4859 | 0.845 | 0.2481 | 1.5539 | 0.845 | 0.8265 | 0.1878 | 0.0558 |
| 0.0718 | 84.0 | 2100 | 0.4858 | 0.845 | 0.2480 | 1.5575 | 0.845 | 0.8265 | 0.1877 | 0.0557 |
| 0.0718 | 85.0 | 2125 | 0.4859 | 0.845 | 0.2481 | 1.5652 | 0.845 | 0.8265 | 0.1884 | 0.0558 |
| 0.0718 | 86.0 | 2150 | 0.4858 | 0.845 | 0.2480 | 1.5511 | 0.845 | 0.8265 | 0.1840 | 0.0555 |
| 0.0718 | 87.0 | 2175 | 0.4858 | 0.845 | 0.2480 | 1.5571 | 0.845 | 0.8265 | 0.1883 | 0.0556 |
| 0.0718 | 88.0 | 2200 | 0.4859 | 0.845 | 0.2481 | 1.5554 | 0.845 | 0.8265 | 0.1842 | 0.0556 |
| 0.0718 | 89.0 | 2225 | 0.4858 | 0.845 | 0.2480 | 1.5627 | 0.845 | 0.8265 | 0.1884 | 0.0557 |
| 0.0718 | 90.0 | 2250 | 0.4856 | 0.845 | 0.2479 | 1.6086 | 0.845 | 0.8265 | 0.1883 | 0.0557 |
| 0.0718 | 91.0 | 2275 | 0.4859 | 0.845 | 0.2481 | 1.5552 | 0.845 | 0.8265 | 0.1884 | 0.0558 |
| 0.0718 | 92.0 | 2300 | 0.4857 | 0.845 | 0.2479 | 1.6085 | 0.845 | 0.8265 | 0.1884 | 0.0557 |
| 0.0718 | 93.0 | 2325 | 0.4858 | 0.845 | 0.2480 | 1.5633 | 0.845 | 0.8265 | 0.1884 | 0.0556 |
| 0.0718 | 94.0 | 2350 | 0.4856 | 0.845 | 0.2480 | 1.6085 | 0.845 | 0.8265 | 0.1884 | 0.0556 |
| 0.0718 | 95.0 | 2375 | 0.4857 | 0.845 | 0.2480 | 1.6082 | 0.845 | 0.8265 | 0.1884 | 0.0556 |
| 0.0718 | 96.0 | 2400 | 0.4857 | 0.845 | 0.2480 | 1.6088 | 0.845 | 0.8265 | 0.1884 | 0.0556 |
| 0.0718 | 97.0 | 2425 | 0.4858 | 0.845 | 0.2480 | 1.6082 | 0.845 | 0.8265 | 0.1885 | 0.0556 |
| 0.0718 | 98.0 | 2450 | 0.4857 | 0.845 | 0.2480 | 1.6084 | 0.845 | 0.8265 | 0.1884 | 0.0556 |
| 0.0718 | 99.0 | 2475 | 0.4858 | 0.845 | 0.2480 | 1.6086 | 0.845 | 0.8265 | 0.1885 | 0.0556 |
| 0.0718 | 100.0 | 2500 | 0.4858 | 0.845 | 0.2480 | 1.6086 | 0.845 | 0.8265 | 0.1885 | 0.0556 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
ayanban011/6_e_200-tiny_tobacco3482_kd_CEKD_t2.5_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 6_e_200-tiny_tobacco3482_kd_CEKD_t2.5_a0.9
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set (a sketch of the calibration metrics follows the list):
- Loss: 0.5528
- Accuracy: 0.84
- Brier Loss: 0.2493
- Nll: 1.6062
- F1 Micro: 0.8400
- F1 Macro: 0.8256
- Ece: 0.1626
- Aurc: 0.0556
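Brier loss and ECE are calibration metrics rather than accuracy metrics. As a reference for readers reproducing figures like the ones above, the sketch below shows one common formulation of both; it is illustrative only, not the evaluation code that produced these numbers, and the 15-bin ECE binning is an assumption.
```python
import numpy as np

def brier_loss(probs, labels):
    """Multiclass Brier score: mean squared distance between the predicted
    probability vector and the one-hot encoding of the true class."""
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs, labels, n_bins=15):
    """ECE: coverage-weighted gap between mean confidence and accuracy over
    equal-width confidence bins (the bin count is an assumption)."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return float(ece)
```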
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the reproduction sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
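A minimal sketch of how this list maps onto the `transformers` `TrainingArguments` API; the model and dataset preparation, and any distillation-specific loss implied by the model name, are omitted, and `output_dir` is a placeholder. The reported Adam betas/epsilon and the linear scheduler are the `Trainer` defaults, so only the non-default values strictly need to be set.
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="6_e_200-tiny_tobacco3482_kd_CEKD_t2.5_a0.9",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",  # Trainer default, shown for completeness
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```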
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 1.9962 | 0.24 | 0.8634 | 4.6099 | 0.24 | 0.1987 | 0.2924 | 0.7628 |
| No log | 2.0 | 50 | 1.2785 | 0.545 | 0.5960 | 2.6707 | 0.545 | 0.4456 | 0.2808 | 0.2569 |
| No log | 3.0 | 75 | 0.9740 | 0.685 | 0.4455 | 2.0688 | 0.685 | 0.5937 | 0.2231 | 0.1314 |
| No log | 4.0 | 100 | 0.8052 | 0.75 | 0.3628 | 1.5271 | 0.75 | 0.7144 | 0.2031 | 0.0915 |
| No log | 5.0 | 125 | 0.7531 | 0.77 | 0.3324 | 1.5448 | 0.7700 | 0.7348 | 0.1868 | 0.0829 |
| No log | 6.0 | 150 | 0.9730 | 0.735 | 0.4050 | 1.5875 | 0.735 | 0.7229 | 0.1899 | 0.1118 |
| No log | 7.0 | 175 | 0.6437 | 0.83 | 0.2790 | 1.3915 | 0.83 | 0.7996 | 0.1967 | 0.0612 |
| No log | 8.0 | 200 | 0.6670 | 0.78 | 0.2984 | 1.2128 | 0.78 | 0.7429 | 0.1701 | 0.0716 |
| No log | 9.0 | 225 | 0.6012 | 0.845 | 0.2521 | 1.4007 | 0.845 | 0.8208 | 0.1573 | 0.0581 |
| No log | 10.0 | 250 | 0.6754 | 0.795 | 0.3063 | 1.4194 | 0.795 | 0.7638 | 0.2036 | 0.0759 |
| No log | 11.0 | 275 | 0.5950 | 0.83 | 0.2554 | 1.1558 | 0.83 | 0.8053 | 0.1877 | 0.0529 |
| No log | 12.0 | 300 | 0.7061 | 0.79 | 0.3153 | 1.6781 | 0.79 | 0.7676 | 0.1879 | 0.0848 |
| No log | 13.0 | 325 | 0.6349 | 0.805 | 0.2806 | 1.3102 | 0.805 | 0.7767 | 0.1523 | 0.0667 |
| No log | 14.0 | 350 | 0.5973 | 0.82 | 0.2677 | 1.5498 | 0.82 | 0.8020 | 0.1734 | 0.0567 |
| No log | 15.0 | 375 | 0.6186 | 0.825 | 0.2792 | 1.3816 | 0.825 | 0.8170 | 0.1558 | 0.0672 |
| No log | 16.0 | 400 | 0.5694 | 0.815 | 0.2662 | 1.1759 | 0.815 | 0.7962 | 0.1675 | 0.0559 |
| No log | 17.0 | 425 | 0.5993 | 0.825 | 0.2793 | 1.2547 | 0.825 | 0.8112 | 0.1822 | 0.0647 |
| No log | 18.0 | 450 | 0.6333 | 0.815 | 0.2844 | 1.6540 | 0.815 | 0.8024 | 0.1562 | 0.0622 |
| No log | 19.0 | 475 | 0.5208 | 0.845 | 0.2349 | 1.2879 | 0.845 | 0.8155 | 0.1553 | 0.0494 |
| 0.4544 | 20.0 | 500 | 0.5412 | 0.86 | 0.2438 | 1.6726 | 0.8600 | 0.8465 | 0.1531 | 0.0485 |
| 0.4544 | 21.0 | 525 | 0.6171 | 0.825 | 0.2775 | 1.9997 | 0.825 | 0.8183 | 0.1464 | 0.0549 |
| 0.4544 | 22.0 | 550 | 0.5479 | 0.84 | 0.2447 | 1.5015 | 0.8400 | 0.8263 | 0.1481 | 0.0680 |
| 0.4544 | 23.0 | 575 | 0.5508 | 0.835 | 0.2491 | 1.8095 | 0.835 | 0.8209 | 0.1616 | 0.0469 |
| 0.4544 | 24.0 | 600 | 0.5597 | 0.825 | 0.2577 | 1.6676 | 0.825 | 0.8077 | 0.1572 | 0.0486 |
| 0.4544 | 25.0 | 625 | 0.5505 | 0.835 | 0.2535 | 1.6085 | 0.835 | 0.8166 | 0.1664 | 0.0524 |
| 0.4544 | 26.0 | 650 | 0.5347 | 0.84 | 0.2442 | 1.4694 | 0.8400 | 0.8288 | 0.1825 | 0.0505 |
| 0.4544 | 27.0 | 675 | 0.5333 | 0.84 | 0.2418 | 1.5809 | 0.8400 | 0.8280 | 0.1634 | 0.0521 |
| 0.4544 | 28.0 | 700 | 0.5417 | 0.84 | 0.2471 | 1.5289 | 0.8400 | 0.8231 | 0.1500 | 0.0503 |
| 0.4544 | 29.0 | 725 | 0.5369 | 0.845 | 0.2434 | 1.5333 | 0.845 | 0.8318 | 0.1690 | 0.0523 |
| 0.4544 | 30.0 | 750 | 0.5396 | 0.84 | 0.2448 | 1.5269 | 0.8400 | 0.8260 | 0.1689 | 0.0534 |
| 0.4544 | 31.0 | 775 | 0.5411 | 0.845 | 0.2459 | 1.5325 | 0.845 | 0.8289 | 0.1524 | 0.0514 |
| 0.4544 | 32.0 | 800 | 0.5429 | 0.845 | 0.2456 | 1.5239 | 0.845 | 0.8318 | 0.1550 | 0.0527 |
| 0.4544 | 33.0 | 825 | 0.5445 | 0.84 | 0.2468 | 1.5275 | 0.8400 | 0.8231 | 0.1626 | 0.0535 |
| 0.4544 | 34.0 | 850 | 0.5432 | 0.845 | 0.2461 | 1.5210 | 0.845 | 0.8289 | 0.1557 | 0.0533 |
| 0.4544 | 35.0 | 875 | 0.5438 | 0.845 | 0.2459 | 1.5269 | 0.845 | 0.8318 | 0.1564 | 0.0533 |
| 0.4544 | 36.0 | 900 | 0.5451 | 0.845 | 0.2466 | 1.5262 | 0.845 | 0.8289 | 0.1610 | 0.0541 |
| 0.4544 | 37.0 | 925 | 0.5415 | 0.85 | 0.2448 | 1.5254 | 0.85 | 0.8348 | 0.1667 | 0.0528 |
| 0.4544 | 38.0 | 950 | 0.5447 | 0.845 | 0.2461 | 1.5367 | 0.845 | 0.8318 | 0.1519 | 0.0535 |
| 0.4544 | 39.0 | 975 | 0.5437 | 0.85 | 0.2454 | 1.5223 | 0.85 | 0.8348 | 0.1605 | 0.0536 |
| 0.0607 | 40.0 | 1000 | 0.5445 | 0.845 | 0.2460 | 1.5252 | 0.845 | 0.8318 | 0.1610 | 0.0539 |
| 0.0607 | 41.0 | 1025 | 0.5460 | 0.845 | 0.2465 | 1.5925 | 0.845 | 0.8318 | 0.1416 | 0.0541 |
| 0.0607 | 42.0 | 1050 | 0.5466 | 0.84 | 0.2467 | 1.5304 | 0.8400 | 0.8260 | 0.1555 | 0.0542 |
| 0.0607 | 43.0 | 1075 | 0.5458 | 0.84 | 0.2464 | 1.5272 | 0.8400 | 0.8231 | 0.1633 | 0.0539 |
| 0.0607 | 44.0 | 1100 | 0.5460 | 0.85 | 0.2464 | 1.5459 | 0.85 | 0.8377 | 0.1534 | 0.0550 |
| 0.0607 | 45.0 | 1125 | 0.5464 | 0.85 | 0.2465 | 1.5390 | 0.85 | 0.8377 | 0.1471 | 0.0544 |
| 0.0607 | 46.0 | 1150 | 0.5462 | 0.85 | 0.2465 | 1.5972 | 0.85 | 0.8377 | 0.1549 | 0.0540 |
| 0.0607 | 47.0 | 1175 | 0.5475 | 0.85 | 0.2472 | 1.5910 | 0.85 | 0.8377 | 0.1592 | 0.0546 |
| 0.0607 | 48.0 | 1200 | 0.5482 | 0.845 | 0.2475 | 1.5943 | 0.845 | 0.8294 | 0.1548 | 0.0545 |
| 0.0607 | 49.0 | 1225 | 0.5475 | 0.845 | 0.2471 | 1.5922 | 0.845 | 0.8294 | 0.1534 | 0.0545 |
| 0.0607 | 50.0 | 1250 | 0.5476 | 0.85 | 0.2470 | 1.5908 | 0.85 | 0.8377 | 0.1539 | 0.0545 |
| 0.0607 | 51.0 | 1275 | 0.5480 | 0.845 | 0.2471 | 1.5990 | 0.845 | 0.8322 | 0.1545 | 0.0547 |
| 0.0607 | 52.0 | 1300 | 0.5479 | 0.85 | 0.2469 | 1.5917 | 0.85 | 0.8348 | 0.1688 | 0.0547 |
| 0.0607 | 53.0 | 1325 | 0.5479 | 0.845 | 0.2472 | 1.6052 | 0.845 | 0.8322 | 0.1545 | 0.0543 |
| 0.0607 | 54.0 | 1350 | 0.5490 | 0.85 | 0.2477 | 1.5948 | 0.85 | 0.8348 | 0.1610 | 0.0545 |
| 0.0607 | 55.0 | 1375 | 0.5489 | 0.85 | 0.2474 | 1.5967 | 0.85 | 0.8377 | 0.1543 | 0.0560 |
| 0.0607 | 56.0 | 1400 | 0.5499 | 0.845 | 0.2480 | 1.5939 | 0.845 | 0.8294 | 0.1561 | 0.0549 |
| 0.0607 | 57.0 | 1425 | 0.5492 | 0.845 | 0.2476 | 1.6048 | 0.845 | 0.8322 | 0.1570 | 0.0549 |
| 0.0607 | 58.0 | 1450 | 0.5497 | 0.845 | 0.2478 | 1.6004 | 0.845 | 0.8322 | 0.1724 | 0.0548 |
| 0.0607 | 59.0 | 1475 | 0.5496 | 0.85 | 0.2477 | 1.5982 | 0.85 | 0.8377 | 0.1634 | 0.0546 |
| 0.0589 | 60.0 | 1500 | 0.5497 | 0.845 | 0.2478 | 1.5969 | 0.845 | 0.8322 | 0.1592 | 0.0545 |
| 0.0589 | 61.0 | 1525 | 0.5492 | 0.85 | 0.2476 | 1.6095 | 0.85 | 0.8377 | 0.1630 | 0.0547 |
| 0.0589 | 62.0 | 1550 | 0.5507 | 0.845 | 0.2483 | 1.6060 | 0.845 | 0.8322 | 0.1649 | 0.0554 |
| 0.0589 | 63.0 | 1575 | 0.5490 | 0.845 | 0.2474 | 1.6021 | 0.845 | 0.8322 | 0.1635 | 0.0546 |
| 0.0589 | 64.0 | 1600 | 0.5508 | 0.845 | 0.2483 | 1.5970 | 0.845 | 0.8294 | 0.1697 | 0.0552 |
| 0.0589 | 65.0 | 1625 | 0.5505 | 0.84 | 0.2483 | 1.6023 | 0.8400 | 0.8256 | 0.1658 | 0.0553 |
| 0.0589 | 66.0 | 1650 | 0.5503 | 0.845 | 0.2481 | 1.6032 | 0.845 | 0.8322 | 0.1637 | 0.0546 |
| 0.0589 | 67.0 | 1675 | 0.5514 | 0.84 | 0.2486 | 1.6000 | 0.8400 | 0.8227 | 0.1649 | 0.0559 |
| 0.0589 | 68.0 | 1700 | 0.5516 | 0.84 | 0.2487 | 1.5979 | 0.8400 | 0.8227 | 0.1649 | 0.0550 |
| 0.0589 | 69.0 | 1725 | 0.5510 | 0.84 | 0.2485 | 1.6005 | 0.8400 | 0.8256 | 0.1639 | 0.0548 |
| 0.0589 | 70.0 | 1750 | 0.5510 | 0.84 | 0.2484 | 1.5990 | 0.8400 | 0.8256 | 0.1653 | 0.0549 |
| 0.0589 | 71.0 | 1775 | 0.5517 | 0.84 | 0.2487 | 1.6080 | 0.8400 | 0.8256 | 0.1640 | 0.0558 |
| 0.0589 | 72.0 | 1800 | 0.5525 | 0.84 | 0.2491 | 1.6069 | 0.8400 | 0.8227 | 0.1669 | 0.0558 |
| 0.0589 | 73.0 | 1825 | 0.5519 | 0.84 | 0.2488 | 1.6147 | 0.8400 | 0.8256 | 0.1638 | 0.0554 |
| 0.0589 | 74.0 | 1850 | 0.5519 | 0.84 | 0.2487 | 1.6027 | 0.8400 | 0.8256 | 0.1657 | 0.0558 |
| 0.0589 | 75.0 | 1875 | 0.5522 | 0.84 | 0.2490 | 1.6082 | 0.8400 | 0.8256 | 0.1717 | 0.0556 |
| 0.0589 | 76.0 | 1900 | 0.5523 | 0.84 | 0.2489 | 1.6022 | 0.8400 | 0.8256 | 0.1645 | 0.0553 |
| 0.0589 | 77.0 | 1925 | 0.5514 | 0.84 | 0.2486 | 1.6027 | 0.8400 | 0.8256 | 0.1635 | 0.0551 |
| 0.0589 | 78.0 | 1950 | 0.5518 | 0.84 | 0.2488 | 1.6007 | 0.8400 | 0.8256 | 0.1641 | 0.0556 |
| 0.0589 | 79.0 | 1975 | 0.5522 | 0.84 | 0.2490 | 1.6057 | 0.8400 | 0.8256 | 0.1637 | 0.0556 |
| 0.0588 | 80.0 | 2000 | 0.5520 | 0.84 | 0.2489 | 1.6110 | 0.8400 | 0.8256 | 0.1658 | 0.0552 |
| 0.0588 | 81.0 | 2025 | 0.5521 | 0.84 | 0.2489 | 1.6047 | 0.8400 | 0.8256 | 0.1659 | 0.0555 |
| 0.0588 | 82.0 | 2050 | 0.5521 | 0.84 | 0.2490 | 1.6015 | 0.8400 | 0.8256 | 0.1635 | 0.0551 |
| 0.0588 | 83.0 | 2075 | 0.5521 | 0.84 | 0.2489 | 1.6115 | 0.8400 | 0.8256 | 0.1637 | 0.0553 |
| 0.0588 | 84.0 | 2100 | 0.5523 | 0.84 | 0.2490 | 1.6033 | 0.8400 | 0.8256 | 0.1738 | 0.0553 |
| 0.0588 | 85.0 | 2125 | 0.5525 | 0.84 | 0.2491 | 1.6072 | 0.8400 | 0.8256 | 0.1658 | 0.0555 |
| 0.0588 | 86.0 | 2150 | 0.5521 | 0.84 | 0.2489 | 1.6057 | 0.8400 | 0.8256 | 0.1574 | 0.0553 |
| 0.0588 | 87.0 | 2175 | 0.5527 | 0.84 | 0.2492 | 1.6605 | 0.8400 | 0.8256 | 0.1610 | 0.0555 |
| 0.0588 | 88.0 | 2200 | 0.5526 | 0.84 | 0.2491 | 1.6056 | 0.8400 | 0.8256 | 0.1544 | 0.0556 |
| 0.0588 | 89.0 | 2225 | 0.5527 | 0.84 | 0.2492 | 1.6126 | 0.8400 | 0.8256 | 0.1547 | 0.0556 |
| 0.0588 | 90.0 | 2250 | 0.5525 | 0.84 | 0.2491 | 1.6059 | 0.8400 | 0.8256 | 0.1525 | 0.0556 |
| 0.0588 | 91.0 | 2275 | 0.5528 | 0.84 | 0.2492 | 1.6060 | 0.8400 | 0.8256 | 0.1604 | 0.0556 |
| 0.0588 | 92.0 | 2300 | 0.5526 | 0.84 | 0.2491 | 1.6080 | 0.8400 | 0.8256 | 0.1525 | 0.0555 |
| 0.0588 | 93.0 | 2325 | 0.5527 | 0.84 | 0.2492 | 1.6034 | 0.8400 | 0.8256 | 0.1547 | 0.0556 |
| 0.0588 | 94.0 | 2350 | 0.5526 | 0.84 | 0.2492 | 1.6040 | 0.8400 | 0.8256 | 0.1673 | 0.0555 |
| 0.0588 | 95.0 | 2375 | 0.5529 | 0.84 | 0.2493 | 1.6053 | 0.8400 | 0.8256 | 0.1545 | 0.0556 |
| 0.0588 | 96.0 | 2400 | 0.5526 | 0.84 | 0.2492 | 1.6050 | 0.8400 | 0.8256 | 0.1626 | 0.0555 |
| 0.0588 | 97.0 | 2425 | 0.5528 | 0.84 | 0.2492 | 1.6040 | 0.8400 | 0.8256 | 0.1686 | 0.0557 |
| 0.0588 | 98.0 | 2450 | 0.5528 | 0.84 | 0.2492 | 1.6068 | 0.8400 | 0.8256 | 0.1626 | 0.0555 |
| 0.0588 | 99.0 | 2475 | 0.5528 | 0.84 | 0.2492 | 1.6065 | 0.8400 | 0.8256 | 0.1626 | 0.0556 |
| 0.0588 | 100.0 | 2500 | 0.5528 | 0.84 | 0.2493 | 1.6062 | 0.8400 | 0.8256 | 0.1626 | 0.0556 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
ayanban011/6_e_200-tiny_tobacco3482_kd_CEKD_t5.0_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 6_e_200-tiny_tobacco3482_kd_CEKD_t5.0_a0.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set (a usage sketch follows the list):
- Loss: 0.3932
- Accuracy: 0.83
- Brier Loss: 0.2507
- Nll: 1.3117
- F1 Micro: 0.83
- F1 Macro: 0.8164
- Ece: 0.1915
- Aurc: 0.0602
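A usage sketch, assuming the checkpoint is published on the Hub under the repository path matching this card's title and that inputs are scanned document pages:
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "ayanban011/6_e_200-tiny_tobacco3482_kd_CEKD_t5.0_a0.5"  # assumed hub path
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("page.png").convert("RGB")  # placeholder document image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[int(logits.argmax(-1))])
```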
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 1.5453 | 0.21 | 0.8721 | 5.2626 | 0.2100 | 0.1784 | 0.2602 | 0.7492 |
| No log | 2.0 | 50 | 0.9799 | 0.515 | 0.6132 | 2.7421 | 0.515 | 0.4183 | 0.2927 | 0.2698 |
| No log | 3.0 | 75 | 0.7948 | 0.655 | 0.4888 | 2.6042 | 0.655 | 0.5597 | 0.2331 | 0.1485 |
| No log | 4.0 | 100 | 0.6422 | 0.735 | 0.3906 | 1.6168 | 0.735 | 0.6662 | 0.2448 | 0.1095 |
| No log | 5.0 | 125 | 0.6261 | 0.785 | 0.3475 | 1.1890 | 0.785 | 0.7582 | 0.2616 | 0.0792 |
| No log | 6.0 | 150 | 0.6060 | 0.775 | 0.3402 | 1.3317 | 0.775 | 0.7269 | 0.2750 | 0.0709 |
| No log | 7.0 | 175 | 0.5659 | 0.8 | 0.3459 | 1.4773 | 0.8000 | 0.7741 | 0.2628 | 0.0960 |
| No log | 8.0 | 200 | 0.5339 | 0.81 | 0.3038 | 1.5029 | 0.81 | 0.7882 | 0.2449 | 0.0699 |
| No log | 9.0 | 225 | 0.5429 | 0.805 | 0.3117 | 1.4140 | 0.805 | 0.7840 | 0.2242 | 0.0771 |
| No log | 10.0 | 250 | 0.5337 | 0.815 | 0.3139 | 1.4630 | 0.815 | 0.8167 | 0.2253 | 0.0801 |
| No log | 11.0 | 275 | 0.5257 | 0.815 | 0.3084 | 1.4325 | 0.815 | 0.7943 | 0.2431 | 0.0823 |
| No log | 12.0 | 300 | 0.4704 | 0.81 | 0.2879 | 1.3557 | 0.81 | 0.7859 | 0.2139 | 0.0770 |
| No log | 13.0 | 325 | 0.4828 | 0.81 | 0.2898 | 1.5643 | 0.81 | 0.7767 | 0.2115 | 0.0712 |
| No log | 14.0 | 350 | 0.4579 | 0.815 | 0.2733 | 1.4403 | 0.815 | 0.8061 | 0.1799 | 0.0609 |
| No log | 15.0 | 375 | 0.4642 | 0.815 | 0.2892 | 1.4598 | 0.815 | 0.8017 | 0.1973 | 0.0537 |
| No log | 16.0 | 400 | 0.4378 | 0.84 | 0.2683 | 1.1278 | 0.8400 | 0.8320 | 0.2135 | 0.0545 |
| No log | 17.0 | 425 | 0.4403 | 0.825 | 0.2750 | 1.3817 | 0.825 | 0.8024 | 0.1898 | 0.0511 |
| No log | 18.0 | 450 | 0.4211 | 0.825 | 0.2646 | 1.5604 | 0.825 | 0.8123 | 0.1918 | 0.0538 |
| No log | 19.0 | 475 | 0.4280 | 0.82 | 0.2700 | 1.5824 | 0.82 | 0.8016 | 0.1814 | 0.0587 |
| 0.4177 | 20.0 | 500 | 0.4223 | 0.83 | 0.2649 | 1.8744 | 0.83 | 0.8152 | 0.1885 | 0.0681 |
| 0.4177 | 21.0 | 525 | 0.4219 | 0.835 | 0.2682 | 1.6053 | 0.835 | 0.8252 | 0.1932 | 0.0600 |
| 0.4177 | 22.0 | 550 | 0.3935 | 0.835 | 0.2534 | 1.4134 | 0.835 | 0.8229 | 0.2049 | 0.0680 |
| 0.4177 | 23.0 | 575 | 0.4231 | 0.82 | 0.2651 | 1.7267 | 0.82 | 0.8028 | 0.1943 | 0.0636 |
| 0.4177 | 24.0 | 600 | 0.4135 | 0.845 | 0.2576 | 1.8708 | 0.845 | 0.8304 | 0.1872 | 0.0559 |
| 0.4177 | 25.0 | 625 | 0.4027 | 0.835 | 0.2526 | 1.5970 | 0.835 | 0.8187 | 0.1886 | 0.0645 |
| 0.4177 | 26.0 | 650 | 0.4000 | 0.835 | 0.2513 | 1.7233 | 0.835 | 0.8151 | 0.1998 | 0.0613 |
| 0.4177 | 27.0 | 675 | 0.3956 | 0.83 | 0.2478 | 1.5716 | 0.83 | 0.8143 | 0.1871 | 0.0533 |
| 0.4177 | 28.0 | 700 | 0.3952 | 0.835 | 0.2493 | 1.5638 | 0.835 | 0.8153 | 0.1942 | 0.0593 |
| 0.4177 | 29.0 | 725 | 0.3965 | 0.83 | 0.2509 | 1.5069 | 0.83 | 0.8150 | 0.1811 | 0.0594 |
| 0.4177 | 30.0 | 750 | 0.3935 | 0.835 | 0.2486 | 1.5657 | 0.835 | 0.8168 | 0.1948 | 0.0573 |
| 0.4177 | 31.0 | 775 | 0.3951 | 0.83 | 0.2498 | 1.5061 | 0.83 | 0.8146 | 0.1886 | 0.0591 |
| 0.4177 | 32.0 | 800 | 0.3940 | 0.835 | 0.2506 | 1.5054 | 0.835 | 0.8203 | 0.1991 | 0.0598 |
| 0.4177 | 33.0 | 825 | 0.3935 | 0.84 | 0.2493 | 1.5025 | 0.8400 | 0.8230 | 0.1932 | 0.0590 |
| 0.4177 | 34.0 | 850 | 0.3928 | 0.84 | 0.2485 | 1.5679 | 0.8400 | 0.8227 | 0.1981 | 0.0570 |
| 0.4177 | 35.0 | 875 | 0.3942 | 0.835 | 0.2497 | 1.5670 | 0.835 | 0.8177 | 0.1940 | 0.0592 |
| 0.4177 | 36.0 | 900 | 0.3935 | 0.835 | 0.2491 | 1.5120 | 0.835 | 0.8203 | 0.1931 | 0.0596 |
| 0.4177 | 37.0 | 925 | 0.3932 | 0.835 | 0.2496 | 1.5715 | 0.835 | 0.8203 | 0.1989 | 0.0597 |
| 0.4177 | 38.0 | 950 | 0.3924 | 0.835 | 0.2491 | 1.5091 | 0.835 | 0.8203 | 0.1973 | 0.0606 |
| 0.4177 | 39.0 | 975 | 0.3936 | 0.835 | 0.2500 | 1.5036 | 0.835 | 0.8203 | 0.1954 | 0.0602 |
| 0.0597 | 40.0 | 1000 | 0.3936 | 0.835 | 0.2497 | 1.4602 | 0.835 | 0.8203 | 0.2053 | 0.0597 |
| 0.0597 | 41.0 | 1025 | 0.3936 | 0.835 | 0.2505 | 1.5040 | 0.835 | 0.8203 | 0.2026 | 0.0607 |
| 0.0597 | 42.0 | 1050 | 0.3931 | 0.83 | 0.2500 | 1.4565 | 0.83 | 0.8164 | 0.1961 | 0.0590 |
| 0.0597 | 43.0 | 1075 | 0.3931 | 0.835 | 0.2497 | 1.5208 | 0.835 | 0.8203 | 0.1972 | 0.0591 |
| 0.0597 | 44.0 | 1100 | 0.3932 | 0.835 | 0.2503 | 1.5040 | 0.835 | 0.8203 | 0.2030 | 0.0606 |
| 0.0597 | 45.0 | 1125 | 0.3930 | 0.835 | 0.2502 | 1.4555 | 0.835 | 0.8203 | 0.1992 | 0.0604 |
| 0.0597 | 46.0 | 1150 | 0.3927 | 0.835 | 0.2500 | 1.4553 | 0.835 | 0.8203 | 0.1960 | 0.0616 |
| 0.0597 | 47.0 | 1175 | 0.3928 | 0.835 | 0.2501 | 1.3970 | 0.835 | 0.8203 | 0.1965 | 0.0610 |
| 0.0597 | 48.0 | 1200 | 0.3930 | 0.835 | 0.2498 | 1.3967 | 0.835 | 0.8203 | 0.1989 | 0.0599 |
| 0.0597 | 49.0 | 1225 | 0.3931 | 0.835 | 0.2502 | 1.4578 | 0.835 | 0.8203 | 0.1963 | 0.0606 |
| 0.0597 | 50.0 | 1250 | 0.3932 | 0.835 | 0.2504 | 1.4475 | 0.835 | 0.8203 | 0.1996 | 0.0604 |
| 0.0597 | 51.0 | 1275 | 0.3928 | 0.835 | 0.2500 | 1.3382 | 0.835 | 0.8203 | 0.2002 | 0.0609 |
| 0.0597 | 52.0 | 1300 | 0.3933 | 0.83 | 0.2502 | 1.4424 | 0.83 | 0.8164 | 0.1991 | 0.0597 |
| 0.0597 | 53.0 | 1325 | 0.3933 | 0.83 | 0.2502 | 1.3390 | 0.83 | 0.8164 | 0.1965 | 0.0604 |
| 0.0597 | 54.0 | 1350 | 0.3929 | 0.83 | 0.2502 | 1.3351 | 0.83 | 0.8164 | 0.1914 | 0.0608 |
| 0.0597 | 55.0 | 1375 | 0.3932 | 0.83 | 0.2503 | 1.3422 | 0.83 | 0.8164 | 0.1969 | 0.0608 |
| 0.0597 | 56.0 | 1400 | 0.3934 | 0.83 | 0.2506 | 1.3369 | 0.83 | 0.8164 | 0.1950 | 0.0599 |
| 0.0597 | 57.0 | 1425 | 0.3930 | 0.83 | 0.2502 | 1.3829 | 0.83 | 0.8164 | 0.1966 | 0.0603 |
| 0.0597 | 58.0 | 1450 | 0.3930 | 0.835 | 0.2503 | 1.3219 | 0.835 | 0.8203 | 0.1907 | 0.0607 |
| 0.0597 | 59.0 | 1475 | 0.3930 | 0.83 | 0.2504 | 1.3268 | 0.83 | 0.8164 | 0.1919 | 0.0599 |
| 0.0574 | 60.0 | 1500 | 0.3933 | 0.835 | 0.2505 | 1.3242 | 0.835 | 0.8203 | 0.1913 | 0.0601 |
| 0.0574 | 61.0 | 1525 | 0.3930 | 0.83 | 0.2504 | 1.3205 | 0.83 | 0.8164 | 0.1943 | 0.0607 |
| 0.0574 | 62.0 | 1550 | 0.3930 | 0.83 | 0.2504 | 1.3189 | 0.83 | 0.8164 | 0.1947 | 0.0608 |
| 0.0574 | 63.0 | 1575 | 0.3931 | 0.83 | 0.2504 | 1.3197 | 0.83 | 0.8164 | 0.1917 | 0.0600 |
| 0.0574 | 64.0 | 1600 | 0.3931 | 0.835 | 0.2505 | 1.3200 | 0.835 | 0.8203 | 0.1904 | 0.0597 |
| 0.0574 | 65.0 | 1625 | 0.3932 | 0.83 | 0.2505 | 1.3175 | 0.83 | 0.8164 | 0.1915 | 0.0601 |
| 0.0574 | 66.0 | 1650 | 0.3931 | 0.83 | 0.2506 | 1.3200 | 0.83 | 0.8164 | 0.1917 | 0.0608 |
| 0.0574 | 67.0 | 1675 | 0.3929 | 0.83 | 0.2503 | 1.3188 | 0.83 | 0.8164 | 0.1940 | 0.0598 |
| 0.0574 | 68.0 | 1700 | 0.3931 | 0.83 | 0.2505 | 1.3160 | 0.83 | 0.8164 | 0.1913 | 0.0599 |
| 0.0574 | 69.0 | 1725 | 0.3931 | 0.83 | 0.2505 | 1.3161 | 0.83 | 0.8164 | 0.1941 | 0.0598 |
| 0.0574 | 70.0 | 1750 | 0.3932 | 0.83 | 0.2506 | 1.3171 | 0.83 | 0.8164 | 0.1961 | 0.0608 |
| 0.0574 | 71.0 | 1775 | 0.3930 | 0.83 | 0.2506 | 1.3161 | 0.83 | 0.8164 | 0.1913 | 0.0602 |
| 0.0574 | 72.0 | 1800 | 0.3929 | 0.83 | 0.2505 | 1.3155 | 0.83 | 0.8164 | 0.1960 | 0.0603 |
| 0.0574 | 73.0 | 1825 | 0.3930 | 0.83 | 0.2506 | 1.3152 | 0.83 | 0.8164 | 0.1941 | 0.0601 |
| 0.0574 | 74.0 | 1850 | 0.3930 | 0.83 | 0.2506 | 1.3167 | 0.83 | 0.8164 | 0.1940 | 0.0602 |
| 0.0574 | 75.0 | 1875 | 0.3933 | 0.83 | 0.2507 | 1.3148 | 0.83 | 0.8164 | 0.1918 | 0.0600 |
| 0.0574 | 76.0 | 1900 | 0.3930 | 0.83 | 0.2505 | 1.3146 | 0.83 | 0.8164 | 0.1914 | 0.0602 |
| 0.0574 | 77.0 | 1925 | 0.3930 | 0.83 | 0.2505 | 1.3147 | 0.83 | 0.8164 | 0.1914 | 0.0598 |
| 0.0574 | 78.0 | 1950 | 0.3931 | 0.83 | 0.2506 | 1.3134 | 0.83 | 0.8164 | 0.1942 | 0.0601 |
| 0.0574 | 79.0 | 1975 | 0.3931 | 0.83 | 0.2505 | 1.3137 | 0.83 | 0.8164 | 0.1916 | 0.0598 |
| 0.0573 | 80.0 | 2000 | 0.3931 | 0.83 | 0.2506 | 1.3136 | 0.83 | 0.8164 | 0.1915 | 0.0601 |
| 0.0573 | 81.0 | 2025 | 0.3932 | 0.83 | 0.2506 | 1.3132 | 0.83 | 0.8164 | 0.1915 | 0.0607 |
| 0.0573 | 82.0 | 2050 | 0.3933 | 0.83 | 0.2507 | 1.3142 | 0.83 | 0.8164 | 0.1943 | 0.0603 |
| 0.0573 | 83.0 | 2075 | 0.3933 | 0.83 | 0.2507 | 1.3135 | 0.83 | 0.8164 | 0.1916 | 0.0603 |
| 0.0573 | 84.0 | 2100 | 0.3931 | 0.83 | 0.2506 | 1.3124 | 0.83 | 0.8164 | 0.1914 | 0.0601 |
| 0.0573 | 85.0 | 2125 | 0.3931 | 0.83 | 0.2507 | 1.3128 | 0.83 | 0.8164 | 0.1915 | 0.0602 |
| 0.0573 | 86.0 | 2150 | 0.3931 | 0.83 | 0.2506 | 1.3128 | 0.83 | 0.8164 | 0.1916 | 0.0602 |
| 0.0573 | 87.0 | 2175 | 0.3932 | 0.83 | 0.2507 | 1.3130 | 0.83 | 0.8164 | 0.1943 | 0.0602 |
| 0.0573 | 88.0 | 2200 | 0.3932 | 0.83 | 0.2507 | 1.3123 | 0.83 | 0.8164 | 0.1915 | 0.0602 |
| 0.0573 | 89.0 | 2225 | 0.3932 | 0.83 | 0.2507 | 1.3123 | 0.83 | 0.8164 | 0.1915 | 0.0599 |
| 0.0573 | 90.0 | 2250 | 0.3931 | 0.83 | 0.2507 | 1.3119 | 0.83 | 0.8164 | 0.1915 | 0.0602 |
| 0.0573 | 91.0 | 2275 | 0.3932 | 0.83 | 0.2507 | 1.3121 | 0.83 | 0.8164 | 0.1915 | 0.0602 |
| 0.0573 | 92.0 | 2300 | 0.3932 | 0.83 | 0.2507 | 1.3117 | 0.83 | 0.8164 | 0.1915 | 0.0601 |
| 0.0573 | 93.0 | 2325 | 0.3931 | 0.83 | 0.2507 | 1.3117 | 0.83 | 0.8164 | 0.1915 | 0.0602 |
| 0.0573 | 94.0 | 2350 | 0.3932 | 0.83 | 0.2507 | 1.3120 | 0.83 | 0.8164 | 0.1915 | 0.0602 |
| 0.0573 | 95.0 | 2375 | 0.3932 | 0.83 | 0.2507 | 1.3120 | 0.83 | 0.8164 | 0.1916 | 0.0602 |
| 0.0573 | 96.0 | 2400 | 0.3932 | 0.83 | 0.2507 | 1.3119 | 0.83 | 0.8164 | 0.1915 | 0.0602 |
| 0.0573 | 97.0 | 2425 | 0.3932 | 0.83 | 0.2507 | 1.3118 | 0.83 | 0.8164 | 0.1915 | 0.0601 |
| 0.0573 | 98.0 | 2450 | 0.3932 | 0.83 | 0.2507 | 1.3117 | 0.83 | 0.8164 | 0.1915 | 0.0602 |
| 0.0573 | 99.0 | 2475 | 0.3932 | 0.83 | 0.2507 | 1.3118 | 0.83 | 0.8164 | 0.1915 | 0.0602 |
| 0.0573 | 100.0 | 2500 | 0.3932 | 0.83 | 0.2507 | 1.3117 | 0.83 | 0.8164 | 0.1915 | 0.0602 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
ayanban011/6_e_200-tiny_tobacco3482_kd_CEKD_t5.0_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 6_e_200-tiny_tobacco3482_kd_CEKD_t5.0_a0.7
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6810
- Accuracy: 0.795
- Brier Loss: 0.3284
- Nll: 1.4069
- F1 Micro: 0.795
- F1 Macro: 0.7636
- Ece: 0.2215
- Aurc: 0.0726
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (the resulting learning-rate schedule is sketched after this list):
- learning_rate: 1e-06
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
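The training-results table below shows 2500 optimization steps in total (100 epochs × 25 steps), so `warmup_ratio: 0.1` corresponds to 250 warmup steps. A sketch of the resulting schedule, using a dummy parameter for illustration:
```python
import torch
from transformers import get_linear_schedule_with_warmup

param = torch.nn.Parameter(torch.zeros(1))  # dummy parameter
optimizer = torch.optim.Adam([param], lr=1e-6, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=250, num_training_steps=2500  # 0.1 * 2500
)
lrs = []
for _ in range(2500):
    optimizer.step()
    scheduler.step()
    lrs.append(scheduler.get_last_lr()[0])
# lrs ramps 0 -> 1e-6 over the first 250 steps, then decays linearly to 0
```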
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 3.5123 | 0.19 | 1.2292 | 9.8836 | 0.19 | 0.0557 | 0.5319 | 0.8455 |
| No log | 2.0 | 50 | 3.3061 | 0.19 | 1.1746 | 9.5259 | 0.19 | 0.0557 | 0.4948 | 0.8502 |
| No log | 3.0 | 75 | 3.0163 | 0.18 | 1.1038 | 9.0069 | 0.18 | 0.0530 | 0.3959 | 0.8542 |
| No log | 4.0 | 100 | 2.7051 | 0.16 | 1.0425 | 8.3332 | 0.16 | 0.0685 | 0.3637 | 0.8603 |
| No log | 5.0 | 125 | 2.4101 | 0.145 | 0.9955 | 7.8875 | 0.145 | 0.0886 | 0.3276 | 0.8594 |
| No log | 6.0 | 150 | 2.1622 | 0.16 | 0.9581 | 7.4210 | 0.16 | 0.1094 | 0.3098 | 0.8418 |
| No log | 7.0 | 175 | 1.9665 | 0.17 | 0.9199 | 6.3502 | 0.17 | 0.1141 | 0.2939 | 0.8129 |
| No log | 8.0 | 200 | 1.8218 | 0.19 | 0.8852 | 4.6467 | 0.19 | 0.1580 | 0.2691 | 0.7908 |
| No log | 9.0 | 225 | 1.7041 | 0.205 | 0.8503 | 3.9915 | 0.205 | 0.1722 | 0.2746 | 0.7349 |
| No log | 10.0 | 250 | 1.6030 | 0.315 | 0.8144 | 3.8156 | 0.315 | 0.2716 | 0.2999 | 0.6443 |
| No log | 11.0 | 275 | 1.5078 | 0.39 | 0.7749 | 3.6634 | 0.39 | 0.3103 | 0.2992 | 0.5291 |
| No log | 12.0 | 300 | 1.4231 | 0.46 | 0.7351 | 3.5329 | 0.46 | 0.3550 | 0.3143 | 0.4339 |
| No log | 13.0 | 325 | 1.3487 | 0.465 | 0.6988 | 3.4534 | 0.465 | 0.3851 | 0.2929 | 0.3704 |
| No log | 14.0 | 350 | 1.2823 | 0.51 | 0.6657 | 3.2203 | 0.51 | 0.4164 | 0.2835 | 0.3215 |
| No log | 15.0 | 375 | 1.2282 | 0.54 | 0.6399 | 3.0431 | 0.54 | 0.4415 | 0.2943 | 0.2928 |
| No log | 16.0 | 400 | 1.1793 | 0.555 | 0.6170 | 2.8562 | 0.555 | 0.4502 | 0.2813 | 0.2676 |
| No log | 17.0 | 425 | 1.1466 | 0.565 | 0.6019 | 2.7630 | 0.565 | 0.4607 | 0.2450 | 0.2524 |
| No log | 18.0 | 450 | 1.1114 | 0.59 | 0.5832 | 2.7390 | 0.59 | 0.4721 | 0.2850 | 0.2289 |
| No log | 19.0 | 475 | 1.0835 | 0.6 | 0.5699 | 2.5928 | 0.6 | 0.5073 | 0.2760 | 0.2195 |
| 1.9045 | 20.0 | 500 | 1.0547 | 0.615 | 0.5539 | 2.6261 | 0.615 | 0.5273 | 0.2883 | 0.2044 |
| 1.9045 | 21.0 | 525 | 1.0294 | 0.625 | 0.5404 | 2.6118 | 0.625 | 0.5343 | 0.2703 | 0.1945 |
| 1.9045 | 22.0 | 550 | 1.0085 | 0.635 | 0.5300 | 2.4378 | 0.635 | 0.5381 | 0.2727 | 0.1842 |
| 1.9045 | 23.0 | 575 | 0.9900 | 0.64 | 0.5188 | 2.4290 | 0.64 | 0.5435 | 0.2781 | 0.1745 |
| 1.9045 | 24.0 | 600 | 0.9674 | 0.645 | 0.5071 | 2.3513 | 0.645 | 0.5527 | 0.2631 | 0.1647 |
| 1.9045 | 25.0 | 625 | 0.9522 | 0.64 | 0.4980 | 2.2815 | 0.64 | 0.5494 | 0.2687 | 0.1606 |
| 1.9045 | 26.0 | 650 | 0.9336 | 0.65 | 0.4883 | 2.2513 | 0.65 | 0.5603 | 0.2727 | 0.1515 |
| 1.9045 | 27.0 | 675 | 0.9175 | 0.665 | 0.4795 | 2.2466 | 0.665 | 0.5707 | 0.2848 | 0.1450 |
| 1.9045 | 28.0 | 700 | 0.9060 | 0.655 | 0.4731 | 2.2223 | 0.655 | 0.5655 | 0.2598 | 0.1426 |
| 1.9045 | 29.0 | 725 | 0.8924 | 0.67 | 0.4648 | 2.1571 | 0.67 | 0.5748 | 0.2504 | 0.1364 |
| 1.9045 | 30.0 | 750 | 0.8808 | 0.675 | 0.4580 | 2.1970 | 0.675 | 0.5804 | 0.2124 | 0.1302 |
| 1.9045 | 31.0 | 775 | 0.8698 | 0.675 | 0.4513 | 1.9818 | 0.675 | 0.5784 | 0.2413 | 0.1248 |
| 1.9045 | 32.0 | 800 | 0.8581 | 0.685 | 0.4451 | 2.0653 | 0.685 | 0.6062 | 0.2783 | 0.1221 |
| 1.9045 | 33.0 | 825 | 0.8493 | 0.68 | 0.4398 | 1.9229 | 0.68 | 0.5964 | 0.2430 | 0.1198 |
| 1.9045 | 34.0 | 850 | 0.8416 | 0.675 | 0.4351 | 1.9147 | 0.675 | 0.5901 | 0.2547 | 0.1181 |
| 1.9045 | 35.0 | 875 | 0.8329 | 0.69 | 0.4296 | 1.9727 | 0.69 | 0.6098 | 0.2498 | 0.1121 |
| 1.9045 | 36.0 | 900 | 0.8222 | 0.7 | 0.4234 | 1.8988 | 0.7 | 0.6185 | 0.2404 | 0.1084 |
| 1.9045 | 37.0 | 925 | 0.8178 | 0.69 | 0.4201 | 1.8900 | 0.69 | 0.6103 | 0.2338 | 0.1079 |
| 1.9045 | 38.0 | 950 | 0.8091 | 0.69 | 0.4153 | 1.9396 | 0.69 | 0.6100 | 0.2469 | 0.1058 |
| 1.9045 | 39.0 | 975 | 0.7992 | 0.705 | 0.4098 | 1.8177 | 0.705 | 0.6453 | 0.1971 | 0.1037 |
| 0.8325 | 40.0 | 1000 | 0.7962 | 0.715 | 0.4069 | 1.7962 | 0.715 | 0.6560 | 0.2474 | 0.1008 |
| 0.8325 | 41.0 | 1025 | 0.7890 | 0.715 | 0.4032 | 1.7233 | 0.715 | 0.6592 | 0.2417 | 0.1009 |
| 0.8325 | 42.0 | 1050 | 0.7842 | 0.715 | 0.3997 | 1.6669 | 0.715 | 0.6573 | 0.2441 | 0.1005 |
| 0.8325 | 43.0 | 1075 | 0.7788 | 0.71 | 0.3962 | 1.6468 | 0.7100 | 0.6515 | 0.2199 | 0.0998 |
| 0.8325 | 44.0 | 1100 | 0.7713 | 0.725 | 0.3918 | 1.6398 | 0.7250 | 0.6698 | 0.2363 | 0.0945 |
| 0.8325 | 45.0 | 1125 | 0.7687 | 0.725 | 0.3895 | 1.6397 | 0.7250 | 0.6677 | 0.2478 | 0.0939 |
| 0.8325 | 46.0 | 1150 | 0.7642 | 0.73 | 0.3867 | 1.6259 | 0.7300 | 0.6684 | 0.2538 | 0.0941 |
| 0.8325 | 47.0 | 1175 | 0.7571 | 0.74 | 0.3822 | 1.5033 | 0.74 | 0.6907 | 0.2269 | 0.0903 |
| 0.8325 | 48.0 | 1200 | 0.7547 | 0.74 | 0.3806 | 1.5595 | 0.74 | 0.6830 | 0.2448 | 0.0912 |
| 0.8325 | 49.0 | 1225 | 0.7516 | 0.75 | 0.3779 | 1.6264 | 0.75 | 0.6921 | 0.2489 | 0.0887 |
| 0.8325 | 50.0 | 1250 | 0.7477 | 0.75 | 0.3759 | 1.5568 | 0.75 | 0.7029 | 0.2388 | 0.0888 |
| 0.8325 | 51.0 | 1275 | 0.7431 | 0.755 | 0.3725 | 1.5037 | 0.755 | 0.6986 | 0.2254 | 0.0848 |
| 0.8325 | 52.0 | 1300 | 0.7418 | 0.755 | 0.3713 | 1.4951 | 0.755 | 0.7085 | 0.2261 | 0.0862 |
| 0.8325 | 53.0 | 1325 | 0.7360 | 0.77 | 0.3676 | 1.4881 | 0.7700 | 0.7241 | 0.2474 | 0.0825 |
| 0.8325 | 54.0 | 1350 | 0.7339 | 0.77 | 0.3665 | 1.5554 | 0.7700 | 0.7241 | 0.2646 | 0.0827 |
| 0.8325 | 55.0 | 1375 | 0.7294 | 0.775 | 0.3636 | 1.4885 | 0.775 | 0.7275 | 0.2283 | 0.0818 |
| 0.8325 | 56.0 | 1400 | 0.7265 | 0.78 | 0.3617 | 1.5387 | 0.78 | 0.7306 | 0.2416 | 0.0799 |
| 0.8325 | 57.0 | 1425 | 0.7247 | 0.77 | 0.3598 | 1.4382 | 0.7700 | 0.7241 | 0.2553 | 0.0806 |
| 0.8325 | 58.0 | 1450 | 0.7234 | 0.78 | 0.3589 | 1.4888 | 0.78 | 0.7306 | 0.2231 | 0.0796 |
| 0.8325 | 59.0 | 1475 | 0.7186 | 0.78 | 0.3565 | 1.5400 | 0.78 | 0.7306 | 0.2200 | 0.0790 |
| 0.618 | 60.0 | 1500 | 0.7174 | 0.78 | 0.3549 | 1.4823 | 0.78 | 0.7306 | 0.2340 | 0.0787 |
| 0.618 | 61.0 | 1525 | 0.7148 | 0.78 | 0.3534 | 1.4804 | 0.78 | 0.7306 | 0.2412 | 0.0785 |
| 0.618 | 62.0 | 1550 | 0.7122 | 0.785 | 0.3516 | 1.4334 | 0.785 | 0.7445 | 0.2353 | 0.0787 |
| 0.618 | 63.0 | 1575 | 0.7107 | 0.79 | 0.3501 | 1.4153 | 0.79 | 0.7516 | 0.2354 | 0.0777 |
| 0.618 | 64.0 | 1600 | 0.7091 | 0.78 | 0.3491 | 1.4698 | 0.78 | 0.7306 | 0.2324 | 0.0780 |
| 0.618 | 65.0 | 1625 | 0.7071 | 0.79 | 0.3481 | 1.4097 | 0.79 | 0.7516 | 0.2198 | 0.0785 |
| 0.618 | 66.0 | 1650 | 0.7047 | 0.785 | 0.3464 | 1.4088 | 0.785 | 0.7458 | 0.2325 | 0.0778 |
| 0.618 | 67.0 | 1675 | 0.7041 | 0.785 | 0.3457 | 1.4108 | 0.785 | 0.7458 | 0.2248 | 0.0781 |
| 0.618 | 68.0 | 1700 | 0.7025 | 0.79 | 0.3444 | 1.4145 | 0.79 | 0.7516 | 0.2195 | 0.0773 |
| 0.618 | 69.0 | 1725 | 0.7014 | 0.79 | 0.3436 | 1.4120 | 0.79 | 0.7516 | 0.2629 | 0.0771 |
| 0.618 | 70.0 | 1750 | 0.6992 | 0.785 | 0.3422 | 1.4046 | 0.785 | 0.7458 | 0.2294 | 0.0767 |
| 0.618 | 71.0 | 1775 | 0.6982 | 0.785 | 0.3412 | 1.4142 | 0.785 | 0.7458 | 0.2325 | 0.0761 |
| 0.618 | 72.0 | 1800 | 0.6954 | 0.79 | 0.3395 | 1.4009 | 0.79 | 0.7516 | 0.2253 | 0.0763 |
| 0.618 | 73.0 | 1825 | 0.6942 | 0.79 | 0.3389 | 1.3994 | 0.79 | 0.7559 | 0.2383 | 0.0763 |
| 0.618 | 74.0 | 1850 | 0.6937 | 0.785 | 0.3382 | 1.4061 | 0.785 | 0.7458 | 0.2213 | 0.0762 |
| 0.618 | 75.0 | 1875 | 0.6935 | 0.785 | 0.3378 | 1.4082 | 0.785 | 0.7458 | 0.2218 | 0.0762 |
| 0.618 | 76.0 | 1900 | 0.6910 | 0.795 | 0.3359 | 1.4098 | 0.795 | 0.7599 | 0.2689 | 0.0746 |
| 0.618 | 77.0 | 1925 | 0.6907 | 0.79 | 0.3356 | 1.4072 | 0.79 | 0.7541 | 0.2254 | 0.0741 |
| 0.618 | 78.0 | 1950 | 0.6896 | 0.795 | 0.3352 | 1.3996 | 0.795 | 0.7636 | 0.2226 | 0.0743 |
| 0.618 | 79.0 | 1975 | 0.6896 | 0.79 | 0.3349 | 1.4073 | 0.79 | 0.7541 | 0.2295 | 0.0742 |
| 0.516 | 80.0 | 2000 | 0.6874 | 0.79 | 0.3335 | 1.4089 | 0.79 | 0.7541 | 0.2287 | 0.0743 |
| 0.516 | 81.0 | 2025 | 0.6874 | 0.795 | 0.3333 | 1.3983 | 0.795 | 0.7636 | 0.2387 | 0.0742 |
| 0.516 | 82.0 | 2050 | 0.6867 | 0.795 | 0.3327 | 1.4098 | 0.795 | 0.7636 | 0.2162 | 0.0736 |
| 0.516 | 83.0 | 2075 | 0.6865 | 0.795 | 0.3323 | 1.4656 | 0.795 | 0.7636 | 0.2072 | 0.0738 |
| 0.516 | 84.0 | 2100 | 0.6857 | 0.795 | 0.3323 | 1.4107 | 0.795 | 0.7636 | 0.2138 | 0.0741 |
| 0.516 | 85.0 | 2125 | 0.6854 | 0.795 | 0.3316 | 1.4223 | 0.795 | 0.7636 | 0.2262 | 0.0732 |
| 0.516 | 86.0 | 2150 | 0.6846 | 0.795 | 0.3311 | 1.4138 | 0.795 | 0.7636 | 0.2224 | 0.0733 |
| 0.516 | 87.0 | 2175 | 0.6834 | 0.795 | 0.3302 | 1.4113 | 0.795 | 0.7636 | 0.2307 | 0.0731 |
| 0.516 | 88.0 | 2200 | 0.6831 | 0.795 | 0.3300 | 1.4088 | 0.795 | 0.7636 | 0.2256 | 0.0730 |
| 0.516 | 89.0 | 2225 | 0.6821 | 0.795 | 0.3295 | 1.4126 | 0.795 | 0.7636 | 0.2395 | 0.0728 |
| 0.516 | 90.0 | 2250 | 0.6821 | 0.795 | 0.3294 | 1.4123 | 0.795 | 0.7636 | 0.2237 | 0.0728 |
| 0.516 | 91.0 | 2275 | 0.6823 | 0.795 | 0.3294 | 1.4085 | 0.795 | 0.7636 | 0.2213 | 0.0728 |
| 0.516 | 92.0 | 2300 | 0.6819 | 0.795 | 0.3290 | 1.4105 | 0.795 | 0.7636 | 0.2332 | 0.0730 |
| 0.516 | 93.0 | 2325 | 0.6816 | 0.795 | 0.3289 | 1.4094 | 0.795 | 0.7636 | 0.2236 | 0.0729 |
| 0.516 | 94.0 | 2350 | 0.6812 | 0.795 | 0.3287 | 1.4092 | 0.795 | 0.7636 | 0.2235 | 0.0729 |
| 0.516 | 95.0 | 2375 | 0.6813 | 0.795 | 0.3286 | 1.4065 | 0.795 | 0.7636 | 0.2197 | 0.0727 |
| 0.516 | 96.0 | 2400 | 0.6811 | 0.795 | 0.3285 | 1.4079 | 0.795 | 0.7636 | 0.2247 | 0.0729 |
| 0.516 | 97.0 | 2425 | 0.6810 | 0.795 | 0.3284 | 1.4072 | 0.795 | 0.7636 | 0.2320 | 0.0729 |
| 0.516 | 98.0 | 2450 | 0.6810 | 0.795 | 0.3284 | 1.4062 | 0.795 | 0.7636 | 0.2148 | 0.0727 |
| 0.516 | 99.0 | 2475 | 0.6810 | 0.795 | 0.3284 | 1.4068 | 0.795 | 0.7636 | 0.2215 | 0.0726 |
| 0.4715 | 100.0 | 2500 | 0.6810 | 0.795 | 0.3284 | 1.4069 | 0.795 | 0.7636 | 0.2215 | 0.0726 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
ayanban011/6_e_200-tiny_tobacco3482_kd_CEKD_t5.0_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 6_e_200-tiny_tobacco3482_kd_CEKD_t5.0_a0.9
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set (an AURC reference implementation follows the list):
- Loss: 0.5583
- Accuracy: 0.82
- Brier Loss: 0.2563
- Nll: 1.8898
- F1 Micro: 0.82
- F1 Macro: 0.8009
- Ece: 0.1578
- Aurc: 0.0530
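AURC (area under the risk–coverage curve) is the least standard metric in this list. One common formulation is sketched below; the evaluation code behind the number above may differ in detail.
```python
import numpy as np

def aurc(probs, labels):
    """Sort predictions by confidence, then average the error rate of the
    top-k most confident predictions over all coverage levels k."""
    conf = probs.max(axis=1)
    errors = (probs.argmax(axis=1) != labels).astype(float)
    order = np.argsort(-conf)  # most confident first
    cumulative_risk = np.cumsum(errors[order]) / np.arange(1, len(labels) + 1)
    return float(cumulative_risk.mean())
```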
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 1.9764 | 0.23 | 0.8621 | 4.6756 | 0.23 | 0.1902 | 0.2733 | 0.7604 |
| No log | 2.0 | 50 | 1.2764 | 0.535 | 0.5973 | 2.7212 | 0.535 | 0.4337 | 0.2769 | 0.2592 |
| No log | 3.0 | 75 | 0.9774 | 0.68 | 0.4478 | 2.1874 | 0.68 | 0.5915 | 0.2144 | 0.1334 |
| No log | 4.0 | 100 | 0.8047 | 0.755 | 0.3617 | 1.4629 | 0.755 | 0.7257 | 0.1850 | 0.0888 |
| No log | 5.0 | 125 | 0.7616 | 0.765 | 0.3363 | 1.4885 | 0.765 | 0.7391 | 0.2017 | 0.0843 |
| No log | 6.0 | 150 | 1.0029 | 0.72 | 0.4200 | 1.6550 | 0.72 | 0.7047 | 0.2303 | 0.1169 |
| No log | 7.0 | 175 | 0.6286 | 0.825 | 0.2766 | 1.2493 | 0.825 | 0.7930 | 0.1954 | 0.0646 |
| No log | 8.0 | 200 | 0.6859 | 0.82 | 0.2857 | 1.4847 | 0.82 | 0.7971 | 0.1837 | 0.0699 |
| No log | 9.0 | 225 | 0.6365 | 0.81 | 0.2765 | 1.1457 | 0.81 | 0.7913 | 0.1604 | 0.0669 |
| No log | 10.0 | 250 | 0.6085 | 0.81 | 0.2614 | 1.5809 | 0.81 | 0.7928 | 0.1874 | 0.0536 |
| No log | 11.0 | 275 | 0.5900 | 0.84 | 0.2620 | 1.1457 | 0.8400 | 0.8308 | 0.1695 | 0.0674 |
| No log | 12.0 | 300 | 0.8544 | 0.75 | 0.3667 | 1.9577 | 0.75 | 0.7330 | 0.1988 | 0.1329 |
| No log | 13.0 | 325 | 0.5265 | 0.845 | 0.2278 | 1.2521 | 0.845 | 0.8209 | 0.1518 | 0.0399 |
| No log | 14.0 | 350 | 0.5702 | 0.815 | 0.2567 | 1.5233 | 0.815 | 0.8032 | 0.1551 | 0.0519 |
| No log | 15.0 | 375 | 0.5933 | 0.845 | 0.2581 | 1.4776 | 0.845 | 0.8341 | 0.1659 | 0.0738 |
| No log | 16.0 | 400 | 0.5697 | 0.84 | 0.2496 | 1.6732 | 0.8400 | 0.8235 | 0.1470 | 0.0557 |
| No log | 17.0 | 425 | 0.5471 | 0.825 | 0.2428 | 1.7010 | 0.825 | 0.8093 | 0.1406 | 0.0461 |
| No log | 18.0 | 450 | 0.5696 | 0.825 | 0.2546 | 1.4095 | 0.825 | 0.7977 | 0.1461 | 0.0612 |
| No log | 19.0 | 475 | 0.6544 | 0.805 | 0.2959 | 1.8251 | 0.805 | 0.7970 | 0.1681 | 0.0605 |
| 0.4416 | 20.0 | 500 | 0.5113 | 0.83 | 0.2327 | 1.4103 | 0.83 | 0.8093 | 0.1380 | 0.0541 |
| 0.4416 | 21.0 | 525 | 0.5255 | 0.84 | 0.2375 | 1.6750 | 0.8400 | 0.8220 | 0.1320 | 0.0462 |
| 0.4416 | 22.0 | 550 | 0.5889 | 0.835 | 0.2681 | 1.7850 | 0.835 | 0.8242 | 0.1507 | 0.0683 |
| 0.4416 | 23.0 | 575 | 0.5456 | 0.835 | 0.2492 | 1.8481 | 0.835 | 0.8137 | 0.1716 | 0.0550 |
| 0.4416 | 24.0 | 600 | 0.5661 | 0.83 | 0.2611 | 1.8434 | 0.83 | 0.8156 | 0.1618 | 0.0591 |
| 0.4416 | 25.0 | 625 | 0.5444 | 0.83 | 0.2484 | 1.7579 | 0.83 | 0.8091 | 0.1478 | 0.0530 |
| 0.4416 | 26.0 | 650 | 0.5418 | 0.83 | 0.2503 | 1.7188 | 0.83 | 0.8125 | 0.1564 | 0.0484 |
| 0.4416 | 27.0 | 675 | 0.5532 | 0.835 | 0.2540 | 1.8931 | 0.835 | 0.8146 | 0.1694 | 0.0514 |
| 0.4416 | 28.0 | 700 | 0.5492 | 0.835 | 0.2518 | 1.8959 | 0.835 | 0.8155 | 0.1505 | 0.0495 |
| 0.4416 | 29.0 | 725 | 0.5478 | 0.825 | 0.2505 | 1.8907 | 0.825 | 0.8069 | 0.1548 | 0.0503 |
| 0.4416 | 30.0 | 750 | 0.5478 | 0.835 | 0.2510 | 1.8881 | 0.835 | 0.8178 | 0.1467 | 0.0521 |
| 0.4416 | 31.0 | 775 | 0.5472 | 0.825 | 0.2505 | 1.8888 | 0.825 | 0.8064 | 0.1527 | 0.0510 |
| 0.4416 | 32.0 | 800 | 0.5522 | 0.83 | 0.2527 | 1.8927 | 0.83 | 0.8126 | 0.1449 | 0.0520 |
| 0.4416 | 33.0 | 825 | 0.5513 | 0.825 | 0.2524 | 1.8989 | 0.825 | 0.8064 | 0.1625 | 0.0509 |
| 0.4416 | 34.0 | 850 | 0.5465 | 0.835 | 0.2504 | 1.8880 | 0.835 | 0.8148 | 0.1519 | 0.0520 |
| 0.4416 | 35.0 | 875 | 0.5489 | 0.825 | 0.2515 | 1.8866 | 0.825 | 0.8064 | 0.1538 | 0.0510 |
| 0.4416 | 36.0 | 900 | 0.5508 | 0.825 | 0.2521 | 1.8922 | 0.825 | 0.8053 | 0.1356 | 0.0526 |
| 0.4416 | 37.0 | 925 | 0.5495 | 0.825 | 0.2522 | 1.8881 | 0.825 | 0.8064 | 0.1517 | 0.0514 |
| 0.4416 | 38.0 | 950 | 0.5483 | 0.825 | 0.2514 | 1.8859 | 0.825 | 0.8064 | 0.1749 | 0.0511 |
| 0.4416 | 39.0 | 975 | 0.5508 | 0.825 | 0.2524 | 1.8868 | 0.825 | 0.8064 | 0.1459 | 0.0514 |
| 0.0519 | 40.0 | 1000 | 0.5519 | 0.825 | 0.2529 | 1.8862 | 0.825 | 0.8064 | 0.1532 | 0.0513 |
| 0.0519 | 41.0 | 1025 | 0.5522 | 0.825 | 0.2530 | 1.8882 | 0.825 | 0.8064 | 0.1665 | 0.0519 |
| 0.0519 | 42.0 | 1050 | 0.5507 | 0.825 | 0.2525 | 1.8870 | 0.825 | 0.8064 | 0.1613 | 0.0508 |
| 0.0519 | 43.0 | 1075 | 0.5528 | 0.825 | 0.2536 | 1.8884 | 0.825 | 0.8064 | 0.1634 | 0.0517 |
| 0.0519 | 44.0 | 1100 | 0.5520 | 0.825 | 0.2531 | 1.8879 | 0.825 | 0.8064 | 0.1519 | 0.0525 |
| 0.0519 | 45.0 | 1125 | 0.5524 | 0.825 | 0.2535 | 1.8876 | 0.825 | 0.8053 | 0.1582 | 0.0515 |
| 0.0519 | 46.0 | 1150 | 0.5525 | 0.825 | 0.2534 | 1.8867 | 0.825 | 0.8064 | 0.1592 | 0.0519 |
| 0.0519 | 47.0 | 1175 | 0.5532 | 0.825 | 0.2539 | 1.8875 | 0.825 | 0.8064 | 0.1621 | 0.0521 |
| 0.0519 | 48.0 | 1200 | 0.5540 | 0.825 | 0.2540 | 1.8865 | 0.825 | 0.8064 | 0.1502 | 0.0522 |
| 0.0519 | 49.0 | 1225 | 0.5523 | 0.825 | 0.2538 | 1.8268 | 0.825 | 0.8072 | 0.1625 | 0.0514 |
| 0.0519 | 50.0 | 1250 | 0.5535 | 0.825 | 0.2539 | 1.8871 | 0.825 | 0.8064 | 0.1684 | 0.0517 |
| 0.0519 | 51.0 | 1275 | 0.5526 | 0.825 | 0.2534 | 1.8850 | 0.825 | 0.8064 | 0.1621 | 0.0519 |
| 0.0519 | 52.0 | 1300 | 0.5543 | 0.825 | 0.2543 | 1.8865 | 0.825 | 0.8064 | 0.1429 | 0.0521 |
| 0.0519 | 53.0 | 1325 | 0.5526 | 0.825 | 0.2538 | 1.8866 | 0.825 | 0.8064 | 0.1613 | 0.0515 |
| 0.0519 | 54.0 | 1350 | 0.5530 | 0.82 | 0.2538 | 1.8877 | 0.82 | 0.8009 | 0.1620 | 0.0518 |
| 0.0519 | 55.0 | 1375 | 0.5550 | 0.825 | 0.2547 | 1.8872 | 0.825 | 0.8064 | 0.1567 | 0.0522 |
| 0.0519 | 56.0 | 1400 | 0.5565 | 0.825 | 0.2552 | 1.8859 | 0.825 | 0.8064 | 0.1400 | 0.0523 |
| 0.0519 | 57.0 | 1425 | 0.5552 | 0.825 | 0.2548 | 1.8874 | 0.825 | 0.8064 | 0.1543 | 0.0520 |
| 0.0519 | 58.0 | 1450 | 0.5537 | 0.825 | 0.2542 | 1.8860 | 0.825 | 0.8064 | 0.1531 | 0.0516 |
| 0.0519 | 59.0 | 1475 | 0.5559 | 0.825 | 0.2551 | 1.8879 | 0.825 | 0.8064 | 0.1564 | 0.0525 |
| 0.0508 | 60.0 | 1500 | 0.5548 | 0.825 | 0.2545 | 1.8866 | 0.825 | 0.8064 | 0.1526 | 0.0522 |
| 0.0508 | 61.0 | 1525 | 0.5557 | 0.825 | 0.2550 | 1.8884 | 0.825 | 0.8064 | 0.1443 | 0.0524 |
| 0.0508 | 62.0 | 1550 | 0.5548 | 0.82 | 0.2546 | 1.8874 | 0.82 | 0.8009 | 0.1709 | 0.0527 |
| 0.0508 | 63.0 | 1575 | 0.5556 | 0.825 | 0.2551 | 1.8899 | 0.825 | 0.8064 | 0.1606 | 0.0524 |
| 0.0508 | 64.0 | 1600 | 0.5562 | 0.825 | 0.2553 | 1.8872 | 0.825 | 0.8064 | 0.1467 | 0.0527 |
| 0.0508 | 65.0 | 1625 | 0.5569 | 0.825 | 0.2554 | 1.8879 | 0.825 | 0.8064 | 0.1537 | 0.0524 |
| 0.0508 | 66.0 | 1650 | 0.5567 | 0.825 | 0.2555 | 1.8873 | 0.825 | 0.8064 | 0.1601 | 0.0525 |
| 0.0508 | 67.0 | 1675 | 0.5556 | 0.825 | 0.2550 | 1.8878 | 0.825 | 0.8064 | 0.1601 | 0.0527 |
| 0.0508 | 68.0 | 1700 | 0.5570 | 0.825 | 0.2555 | 1.8879 | 0.825 | 0.8064 | 0.1679 | 0.0528 |
| 0.0508 | 69.0 | 1725 | 0.5560 | 0.825 | 0.2553 | 1.8886 | 0.825 | 0.8064 | 0.1525 | 0.0521 |
| 0.0508 | 70.0 | 1750 | 0.5562 | 0.825 | 0.2553 | 1.8878 | 0.825 | 0.8064 | 0.1531 | 0.0528 |
| 0.0508 | 71.0 | 1775 | 0.5572 | 0.82 | 0.2557 | 1.8883 | 0.82 | 0.8009 | 0.1718 | 0.0530 |
| 0.0508 | 72.0 | 1800 | 0.5567 | 0.82 | 0.2555 | 1.8888 | 0.82 | 0.8009 | 0.1630 | 0.0525 |
| 0.0508 | 73.0 | 1825 | 0.5571 | 0.825 | 0.2556 | 1.8882 | 0.825 | 0.8064 | 0.1598 | 0.0528 |
| 0.0508 | 74.0 | 1850 | 0.5580 | 0.825 | 0.2561 | 1.8901 | 0.825 | 0.8064 | 0.1543 | 0.0530 |
| 0.0508 | 75.0 | 1875 | 0.5579 | 0.82 | 0.2561 | 1.8892 | 0.82 | 0.8009 | 0.1721 | 0.0530 |
| 0.0508 | 76.0 | 1900 | 0.5574 | 0.82 | 0.2559 | 1.8892 | 0.82 | 0.8009 | 0.1636 | 0.0528 |
| 0.0508 | 77.0 | 1925 | 0.5569 | 0.82 | 0.2557 | 1.8393 | 0.82 | 0.8009 | 0.1634 | 0.0526 |
| 0.0508 | 78.0 | 1950 | 0.5572 | 0.82 | 0.2558 | 1.8887 | 0.82 | 0.8009 | 0.1637 | 0.0530 |
| 0.0508 | 79.0 | 1975 | 0.5578 | 0.82 | 0.2560 | 1.8888 | 0.82 | 0.8009 | 0.1579 | 0.0530 |
| 0.0507 | 80.0 | 2000 | 0.5577 | 0.82 | 0.2559 | 1.8889 | 0.82 | 0.8009 | 0.1578 | 0.0532 |
| 0.0507 | 81.0 | 2025 | 0.5578 | 0.82 | 0.2560 | 1.8889 | 0.82 | 0.8009 | 0.1578 | 0.0533 |
| 0.0507 | 82.0 | 2050 | 0.5579 | 0.825 | 0.2561 | 1.8891 | 0.825 | 0.8064 | 0.1602 | 0.0528 |
| 0.0507 | 83.0 | 2075 | 0.5581 | 0.825 | 0.2562 | 1.8894 | 0.825 | 0.8064 | 0.1544 | 0.0528 |
| 0.0507 | 84.0 | 2100 | 0.5579 | 0.82 | 0.2561 | 1.8894 | 0.82 | 0.8009 | 0.1581 | 0.0531 |
| 0.0507 | 85.0 | 2125 | 0.5580 | 0.82 | 0.2561 | 1.8896 | 0.82 | 0.8009 | 0.1578 | 0.0528 |
| 0.0507 | 86.0 | 2150 | 0.5581 | 0.82 | 0.2562 | 1.8891 | 0.82 | 0.8009 | 0.1580 | 0.0532 |
| 0.0507 | 87.0 | 2175 | 0.5582 | 0.82 | 0.2562 | 1.8467 | 0.82 | 0.8009 | 0.1581 | 0.0528 |
| 0.0507 | 88.0 | 2200 | 0.5583 | 0.82 | 0.2562 | 1.8891 | 0.82 | 0.8009 | 0.1580 | 0.0531 |
| 0.0507 | 89.0 | 2225 | 0.5584 | 0.815 | 0.2563 | 1.8894 | 0.815 | 0.7976 | 0.1608 | 0.0534 |
| 0.0507 | 90.0 | 2250 | 0.5578 | 0.82 | 0.2561 | 1.8894 | 0.82 | 0.8009 | 0.1578 | 0.0530 |
| 0.0507 | 91.0 | 2275 | 0.5584 | 0.815 | 0.2563 | 1.8896 | 0.815 | 0.7976 | 0.1607 | 0.0532 |
| 0.0507 | 92.0 | 2300 | 0.5583 | 0.82 | 0.2562 | 1.8893 | 0.82 | 0.8009 | 0.1581 | 0.0531 |
| 0.0507 | 93.0 | 2325 | 0.5582 | 0.82 | 0.2562 | 1.8898 | 0.82 | 0.8009 | 0.1579 | 0.0530 |
| 0.0507 | 94.0 | 2350 | 0.5582 | 0.82 | 0.2562 | 1.8392 | 0.82 | 0.8009 | 0.1578 | 0.0530 |
| 0.0507 | 95.0 | 2375 | 0.5584 | 0.82 | 0.2563 | 1.8897 | 0.82 | 0.8009 | 0.1582 | 0.0531 |
| 0.0507 | 96.0 | 2400 | 0.5582 | 0.82 | 0.2562 | 1.8898 | 0.82 | 0.8009 | 0.1578 | 0.0530 |
| 0.0507 | 97.0 | 2425 | 0.5583 | 0.82 | 0.2563 | 1.8896 | 0.82 | 0.8009 | 0.1580 | 0.0530 |
| 0.0507 | 98.0 | 2450 | 0.5582 | 0.82 | 0.2562 | 1.8898 | 0.82 | 0.8009 | 0.1578 | 0.0530 |
| 0.0507 | 99.0 | 2475 | 0.5583 | 0.82 | 0.2563 | 1.8898 | 0.82 | 0.8009 | 0.1578 | 0.0530 |
| 0.0507 | 100.0 | 2500 | 0.5583 | 0.82 | 0.2563 | 1.8898 | 0.82 | 0.8009 | 0.1578 | 0.0530 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
ayanban011/6_e_200-tiny_tobacco3482
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 6_e_200-tiny_tobacco3482
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set (an NLL sketch follows the list):
- Loss: 0.5797
- Accuracy: 0.68
- Brier Loss: 0.4846
- Nll: 2.7977
- F1 Micro: 0.68
- F1 Macro: 0.6624
- Ece: 0.2589
- Aurc: 0.2179
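The NLL figure here is presumably the mean negative log-likelihood assigned to the true class; a sketch under that assumption:
```python
import numpy as np

def negative_log_likelihood(probs, labels, eps=1e-12):
    """Mean negative log-probability assigned to the true class."""
    true_probs = probs[np.arange(len(labels)), labels]
    return float(-np.log(np.clip(true_probs, eps, None)).mean())
```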
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 0.6199 | 0.545 | 0.5837 | 2.2367 | 0.545 | 0.4488 | 0.2385 | 0.2462 |
| No log | 2.0 | 50 | 0.4278 | 0.72 | 0.4443 | 1.6749 | 0.72 | 0.7298 | 0.3095 | 0.1339 |
| No log | 3.0 | 75 | 0.6756 | 0.625 | 0.4913 | 2.4923 | 0.625 | 0.5751 | 0.2278 | 0.1729 |
| No log | 4.0 | 100 | 0.6851 | 0.615 | 0.5031 | 2.5374 | 0.615 | 0.5988 | 0.2487 | 0.1568 |
| No log | 5.0 | 125 | 0.5550 | 0.69 | 0.4473 | 1.6315 | 0.69 | 0.6549 | 0.2726 | 0.1697 |
| No log | 6.0 | 150 | 0.4135 | 0.79 | 0.3326 | 1.3743 | 0.79 | 0.7851 | 0.2019 | 0.0767 |
| No log | 7.0 | 175 | 0.4111 | 0.77 | 0.3527 | 1.6472 | 0.7700 | 0.7388 | 0.1917 | 0.0849 |
| No log | 8.0 | 200 | 0.4809 | 0.745 | 0.3805 | 2.1093 | 0.745 | 0.7260 | 0.2375 | 0.0818 |
| No log | 9.0 | 225 | 0.6527 | 0.65 | 0.4651 | 2.5269 | 0.65 | 0.6271 | 0.2237 | 0.1333 |
| No log | 10.0 | 250 | 0.5215 | 0.715 | 0.4203 | 1.7818 | 0.715 | 0.6875 | 0.2155 | 0.1328 |
| No log | 11.0 | 275 | 0.5673 | 0.64 | 0.4965 | 2.1840 | 0.64 | 0.5458 | 0.2547 | 0.1750 |
| No log | 12.0 | 300 | 0.4713 | 0.75 | 0.3879 | 2.0629 | 0.75 | 0.7397 | 0.2143 | 0.1262 |
| No log | 13.0 | 325 | 0.5793 | 0.655 | 0.4791 | 2.6089 | 0.655 | 0.6206 | 0.2461 | 0.1870 |
| No log | 14.0 | 350 | 0.6098 | 0.685 | 0.4375 | 2.0773 | 0.685 | 0.6802 | 0.2415 | 0.1480 |
| No log | 15.0 | 375 | 0.5338 | 0.65 | 0.4777 | 2.1062 | 0.65 | 0.6267 | 0.2486 | 0.1577 |
| No log | 16.0 | 400 | 0.6278 | 0.675 | 0.4482 | 2.3275 | 0.675 | 0.6822 | 0.2425 | 0.1606 |
| No log | 17.0 | 425 | 0.5165 | 0.69 | 0.4524 | 2.1670 | 0.69 | 0.6661 | 0.2650 | 0.1554 |
| No log | 18.0 | 450 | 0.6064 | 0.64 | 0.4978 | 2.5380 | 0.64 | 0.5838 | 0.2508 | 0.1850 |
| No log | 19.0 | 475 | 0.6753 | 0.645 | 0.5125 | 2.4190 | 0.645 | 0.5875 | 0.2664 | 0.2124 |
| 0.3785 | 20.0 | 500 | 0.6946 | 0.65 | 0.5331 | 3.1366 | 0.65 | 0.6336 | 0.2715 | 0.2605 |
| 0.3785 | 21.0 | 525 | 0.5328 | 0.695 | 0.4699 | 2.8365 | 0.695 | 0.6863 | 0.2500 | 0.1660 |
| 0.3785 | 22.0 | 550 | 0.6684 | 0.62 | 0.5374 | 3.2087 | 0.62 | 0.6071 | 0.2363 | 0.2429 |
| 0.3785 | 23.0 | 575 | 0.7235 | 0.615 | 0.5613 | 3.4750 | 0.615 | 0.5866 | 0.2748 | 0.2185 |
| 0.3785 | 24.0 | 600 | 0.6748 | 0.67 | 0.5028 | 3.0615 | 0.67 | 0.6185 | 0.2697 | 0.2251 |
| 0.3785 | 25.0 | 625 | 0.6778 | 0.645 | 0.5068 | 2.7608 | 0.645 | 0.6235 | 0.2442 | 0.1589 |
| 0.3785 | 26.0 | 650 | 0.7163 | 0.6 | 0.5690 | 2.7443 | 0.6 | 0.5766 | 0.2655 | 0.2821 |
| 0.3785 | 27.0 | 675 | 0.7571 | 0.635 | 0.5278 | 3.3670 | 0.635 | 0.6085 | 0.2635 | 0.2025 |
| 0.3785 | 28.0 | 700 | 0.6955 | 0.605 | 0.5718 | 3.1717 | 0.605 | 0.5973 | 0.3092 | 0.2624 |
| 0.3785 | 29.0 | 725 | 0.7951 | 0.585 | 0.5869 | 3.2346 | 0.585 | 0.5777 | 0.2849 | 0.3039 |
| 0.3785 | 30.0 | 750 | 0.5426 | 0.655 | 0.4898 | 2.6384 | 0.655 | 0.6295 | 0.2758 | 0.1781 |
| 0.3785 | 31.0 | 775 | 0.7721 | 0.6 | 0.5956 | 3.5480 | 0.6 | 0.5717 | 0.2908 | 0.2548 |
| 0.3785 | 32.0 | 800 | 0.6102 | 0.65 | 0.4974 | 2.6613 | 0.65 | 0.6348 | 0.2529 | 0.1661 |
| 0.3785 | 33.0 | 825 | 0.7592 | 0.62 | 0.5666 | 3.5174 | 0.62 | 0.5736 | 0.2821 | 0.2752 |
| 0.3785 | 34.0 | 850 | 0.6516 | 0.655 | 0.5283 | 3.1254 | 0.655 | 0.6341 | 0.2596 | 0.1954 |
| 0.3785 | 35.0 | 875 | 0.6626 | 0.65 | 0.5329 | 2.9794 | 0.65 | 0.6120 | 0.2793 | 0.2390 |
| 0.3785 | 36.0 | 900 | 0.6939 | 0.66 | 0.5190 | 3.4020 | 0.66 | 0.6258 | 0.2473 | 0.1785 |
| 0.3785 | 37.0 | 925 | 0.7580 | 0.605 | 0.5970 | 3.1545 | 0.605 | 0.5466 | 0.2996 | 0.2424 |
| 0.3785 | 38.0 | 950 | 0.6088 | 0.655 | 0.5187 | 2.7205 | 0.655 | 0.6457 | 0.2636 | 0.2413 |
| 0.3785 | 39.0 | 975 | 0.7394 | 0.605 | 0.5815 | 2.8167 | 0.605 | 0.5798 | 0.2975 | 0.2782 |
| 0.0886 | 40.0 | 1000 | 0.6910 | 0.65 | 0.5015 | 2.9680 | 0.65 | 0.5993 | 0.2697 | 0.1652 |
| 0.0886 | 41.0 | 1025 | 0.6618 | 0.635 | 0.5752 | 3.5088 | 0.635 | 0.5937 | 0.2929 | 0.2653 |
| 0.0886 | 42.0 | 1050 | 0.7742 | 0.6 | 0.5556 | 3.5946 | 0.6 | 0.5644 | 0.2556 | 0.1974 |
| 0.0886 | 43.0 | 1075 | 0.7379 | 0.62 | 0.5589 | 2.7882 | 0.62 | 0.6042 | 0.3169 | 0.2143 |
| 0.0886 | 44.0 | 1100 | 0.6702 | 0.64 | 0.5359 | 2.9335 | 0.64 | 0.6088 | 0.2765 | 0.2133 |
| 0.0886 | 45.0 | 1125 | 0.8900 | 0.585 | 0.6173 | 3.8349 | 0.585 | 0.5639 | 0.2934 | 0.2364 |
| 0.0886 | 46.0 | 1150 | 0.7800 | 0.62 | 0.5707 | 3.2446 | 0.62 | 0.6171 | 0.3002 | 0.2156 |
| 0.0886 | 47.0 | 1175 | 0.8554 | 0.57 | 0.6256 | 3.6828 | 0.57 | 0.5583 | 0.3191 | 0.2611 |
| 0.0886 | 48.0 | 1200 | 0.6486 | 0.67 | 0.4911 | 3.4792 | 0.67 | 0.6449 | 0.2741 | 0.1870 |
| 0.0886 | 49.0 | 1225 | 0.7315 | 0.59 | 0.5829 | 3.4916 | 0.59 | 0.5963 | 0.2720 | 0.2101 |
| 0.0886 | 50.0 | 1250 | 0.6939 | 0.665 | 0.5022 | 2.9091 | 0.665 | 0.6362 | 0.2743 | 0.1829 |
| 0.0886 | 51.0 | 1275 | 0.7256 | 0.625 | 0.5687 | 3.4914 | 0.625 | 0.5740 | 0.2943 | 0.2493 |
| 0.0886 | 52.0 | 1300 | 0.6374 | 0.66 | 0.5144 | 2.7071 | 0.66 | 0.6297 | 0.2529 | 0.2006 |
| 0.0886 | 53.0 | 1325 | 0.7862 | 0.645 | 0.5470 | 3.2902 | 0.645 | 0.6385 | 0.2899 | 0.2053 |
| 0.0886 | 54.0 | 1350 | 0.7717 | 0.63 | 0.5762 | 3.8614 | 0.63 | 0.6027 | 0.2954 | 0.2150 |
| 0.0886 | 55.0 | 1375 | 0.6664 | 0.675 | 0.5120 | 3.1014 | 0.675 | 0.6582 | 0.2850 | 0.1842 |
| 0.0886 | 56.0 | 1400 | 0.6957 | 0.615 | 0.5602 | 3.0253 | 0.615 | 0.5977 | 0.3033 | 0.2229 |
| 0.0886 | 57.0 | 1425 | 0.6794 | 0.64 | 0.5581 | 3.0174 | 0.64 | 0.6205 | 0.2802 | 0.2056 |
| 0.0886 | 58.0 | 1450 | 0.6345 | 0.655 | 0.5162 | 2.7909 | 0.655 | 0.6422 | 0.2856 | 0.2789 |
| 0.0886 | 59.0 | 1475 | 0.6447 | 0.655 | 0.5271 | 2.9860 | 0.655 | 0.6432 | 0.2735 | 0.1774 |
| 0.0219 | 60.0 | 1500 | 0.7042 | 0.665 | 0.5404 | 3.1132 | 0.665 | 0.6268 | 0.2871 | 0.2981 |
| 0.0219 | 61.0 | 1525 | 0.7288 | 0.64 | 0.5486 | 3.3084 | 0.64 | 0.6225 | 0.2869 | 0.1861 |
| 0.0219 | 62.0 | 1550 | 0.6605 | 0.69 | 0.5078 | 2.9123 | 0.69 | 0.6642 | 0.2668 | 0.2487 |
| 0.0219 | 63.0 | 1575 | 0.5905 | 0.715 | 0.4712 | 3.4707 | 0.715 | 0.7013 | 0.2548 | 0.2257 |
| 0.0219 | 64.0 | 1600 | 0.6209 | 0.69 | 0.4940 | 2.6873 | 0.69 | 0.6770 | 0.2771 | 0.2263 |
| 0.0219 | 65.0 | 1625 | 0.6039 | 0.68 | 0.4914 | 2.6448 | 0.68 | 0.6620 | 0.2926 | 0.2184 |
| 0.0219 | 66.0 | 1650 | 0.5985 | 0.69 | 0.4918 | 2.7592 | 0.69 | 0.6757 | 0.2844 | 0.2181 |
| 0.0219 | 67.0 | 1675 | 0.5955 | 0.69 | 0.4903 | 2.7566 | 0.69 | 0.6757 | 0.2617 | 0.2227 |
| 0.0219 | 68.0 | 1700 | 0.5944 | 0.69 | 0.4898 | 2.7730 | 0.69 | 0.6757 | 0.2683 | 0.2211 |
| 0.0219 | 69.0 | 1725 | 0.5934 | 0.695 | 0.4893 | 2.7575 | 0.695 | 0.6823 | 0.2666 | 0.2171 |
| 0.0219 | 70.0 | 1750 | 0.5913 | 0.695 | 0.4890 | 2.7043 | 0.695 | 0.6823 | 0.2649 | 0.2160 |
| 0.0219 | 71.0 | 1775 | 0.5904 | 0.69 | 0.4888 | 2.7476 | 0.69 | 0.6742 | 0.2718 | 0.2163 |
| 0.0219 | 72.0 | 1800 | 0.5895 | 0.69 | 0.4883 | 2.7463 | 0.69 | 0.6742 | 0.2714 | 0.2160 |
| 0.0219 | 73.0 | 1825 | 0.5882 | 0.69 | 0.4877 | 2.7478 | 0.69 | 0.6742 | 0.2779 | 0.2171 |
| 0.0219 | 74.0 | 1850 | 0.5878 | 0.69 | 0.4876 | 2.7489 | 0.69 | 0.6742 | 0.2813 | 0.2169 |
| 0.0219 | 75.0 | 1875 | 0.5879 | 0.69 | 0.4871 | 2.7592 | 0.69 | 0.6742 | 0.2765 | 0.2185 |
| 0.0219 | 76.0 | 1900 | 0.5868 | 0.69 | 0.4870 | 2.8058 | 0.69 | 0.6742 | 0.2670 | 0.2183 |
| 0.0219 | 77.0 | 1925 | 0.5843 | 0.69 | 0.4864 | 2.8037 | 0.69 | 0.6745 | 0.2764 | 0.2185 |
| 0.0219 | 78.0 | 1950 | 0.5844 | 0.69 | 0.4862 | 2.8040 | 0.69 | 0.6745 | 0.2788 | 0.2191 |
| 0.0219 | 79.0 | 1975 | 0.5831 | 0.69 | 0.4857 | 2.8018 | 0.69 | 0.6745 | 0.2655 | 0.2178 |
| 0.0013 | 80.0 | 2000 | 0.5846 | 0.69 | 0.4858 | 2.8022 | 0.69 | 0.6745 | 0.2633 | 0.2182 |
| 0.0013 | 81.0 | 2025 | 0.5821 | 0.69 | 0.4851 | 2.8020 | 0.69 | 0.6745 | 0.2750 | 0.2177 |
| 0.0013 | 82.0 | 2050 | 0.5826 | 0.685 | 0.4852 | 2.8013 | 0.685 | 0.6713 | 0.2728 | 0.2180 |
| 0.0013 | 83.0 | 2075 | 0.5821 | 0.685 | 0.4851 | 2.8005 | 0.685 | 0.6713 | 0.2705 | 0.2179 |
| 0.0013 | 84.0 | 2100 | 0.5817 | 0.685 | 0.4850 | 2.8007 | 0.685 | 0.6713 | 0.2773 | 0.2180 |
| 0.0013 | 85.0 | 2125 | 0.5814 | 0.685 | 0.4849 | 2.7998 | 0.685 | 0.6713 | 0.2740 | 0.2176 |
| 0.0013 | 86.0 | 2150 | 0.5814 | 0.685 | 0.4848 | 2.7997 | 0.685 | 0.6713 | 0.2686 | 0.2174 |
| 0.0013 | 87.0 | 2175 | 0.5807 | 0.68 | 0.4847 | 2.7994 | 0.68 | 0.6624 | 0.2658 | 0.2186 |
| 0.0013 | 88.0 | 2200 | 0.5803 | 0.68 | 0.4845 | 2.7992 | 0.68 | 0.6624 | 0.2703 | 0.2180 |
| 0.0013 | 89.0 | 2225 | 0.5805 | 0.68 | 0.4846 | 2.7990 | 0.68 | 0.6624 | 0.2632 | 0.2194 |
| 0.0013 | 90.0 | 2250 | 0.5803 | 0.685 | 0.4846 | 2.7979 | 0.685 | 0.6703 | 0.2596 | 0.2183 |
| 0.0013 | 91.0 | 2275 | 0.5805 | 0.685 | 0.4847 | 2.7980 | 0.685 | 0.6703 | 0.2674 | 0.2183 |
| 0.0013 | 92.0 | 2300 | 0.5805 | 0.685 | 0.4846 | 2.7993 | 0.685 | 0.6703 | 0.2562 | 0.2181 |
| 0.0013 | 93.0 | 2325 | 0.5802 | 0.68 | 0.4847 | 2.7974 | 0.68 | 0.6624 | 0.2598 | 0.2182 |
| 0.0013 | 94.0 | 2350 | 0.5800 | 0.68 | 0.4846 | 2.7981 | 0.68 | 0.6624 | 0.2613 | 0.2175 |
| 0.0013 | 95.0 | 2375 | 0.5796 | 0.68 | 0.4846 | 2.7985 | 0.68 | 0.6624 | 0.2589 | 0.2179 |
| 0.0013 | 96.0 | 2400 | 0.5799 | 0.68 | 0.4846 | 2.7980 | 0.68 | 0.6624 | 0.2560 | 0.2183 |
| 0.0013 | 97.0 | 2425 | 0.5796 | 0.68 | 0.4846 | 2.7978 | 0.68 | 0.6624 | 0.2588 | 0.2175 |
| 0.0013 | 98.0 | 2450 | 0.5798 | 0.68 | 0.4846 | 2.7977 | 0.68 | 0.6624 | 0.2589 | 0.2179 |
| 0.0013 | 99.0 | 2475 | 0.5797 | 0.68 | 0.4846 | 2.7977 | 0.68 | 0.6624 | 0.2589 | 0.2178 |
| 0.0003 | 100.0 | 2500 | 0.5797 | 0.68 | 0.4846 | 2.7977 | 0.68 | 0.6624 | 0.2589 | 0.2179 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
achyudev/vit-base-patch16-224-in21k
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-in21k
This model was trained from scratch on the None dataset.
It achieves the following results on the evaluation set (a sketch of the F1 computation follows the list):
- Loss: 0.3383
- F1: 0.9799
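The card does not state how this F1 is averaged over the six classes. A sketch of the computation with the `evaluate` library, with the averaging mode marked as an assumption:
```python
import evaluate

f1_metric = evaluate.load("f1")
# Placeholder predictions/references over the six instrument classes.
result = f1_metric.compute(
    predictions=[0, 1, 2, 2, 4, 5],
    references=[0, 1, 2, 3, 4, 5],
    average="weighted",  # assumed; the card does not state the averaging mode
)
print(result["f1"])
```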
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log | 1.0 | 10 | 0.7369 | 0.9461 |
| No log | 2.0 | 20 | 0.6143 | 0.9526 |
| No log | 3.0 | 30 | 0.5401 | 0.9594 |
| No log | 4.0 | 40 | 0.4809 | 0.9662 |
| No log | 5.0 | 50 | 0.4219 | 0.9595 |
| No log | 6.0 | 60 | 0.3864 | 0.9730 |
| No log | 7.0 | 70 | 0.3628 | 0.9730 |
| No log | 8.0 | 80 | 0.3496 | 0.9732 |
| No log | 9.0 | 90 | 0.3419 | 0.9799 |
| No log | 10.0 | 100 | 0.3383 | 0.9799 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
|
[
"viola",
"cello",
"sax",
"flute",
"trumpet",
"oboe"
] |
Akshay-123/vit-base-patch16-224-in21k
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-in21k
This model was trained from scratch on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7692
- F1: 0.9865
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log | 1.0 | 10 | 1.5877 | 0.6675 |
| No log | 2.0 | 20 | 1.4149 | 0.8402 |
| No log | 3.0 | 30 | 1.2687 | 0.8917 |
| No log | 4.0 | 40 | 1.1382 | 0.9113 |
| No log | 5.0 | 50 | 1.0214 | 0.9523 |
| No log | 6.0 | 60 | 0.9285 | 0.9662 |
| No log | 7.0 | 70 | 0.8601 | 0.9728 |
| No log | 8.0 | 80 | 0.8089 | 0.9797 |
| No log | 9.0 | 90 | 0.7796 | 0.9865 |
| No log | 10.0 | 100 | 0.7692 | 0.9865 |
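A hedged inference sketch for the resulting checkpoint, assuming the fine-tuned weights were pushed to the Hub under this repo id:
```python
from transformers import pipeline

classifier = pipeline("image-classification", model="Akshay-123/vit-base-patch16-224-in21k")
print(classifier("trumpet.jpg"))  # hypothetical local image; returns label/score pairs
```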
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"cello",
"oboe",
"sax",
"viola",
"trumpet",
"flute"
] |
Schwab/vit-base-patch16-224-finetuned-flower
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-finetuned-flower
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.24.0
- Pytorch 2.0.1+cu118
- Datasets 2.7.1
- Tokenizers 0.13.3
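The resulting checkpoint is a standard ViT classifier. A hedged usage sketch with the lower-level API, assuming the fine-tuned weights live at this Hub id, a local test image, and a recent `transformers` release (the `AutoImageProcessor` API postdates the 4.24 version listed above):
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "Schwab/vit-base-patch16-224-finetuned-flower"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("daisy.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```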
|
[
"daisy",
"dandelion",
"roses",
"sunflowers",
"tulips"
] |
3b3r/vit_model
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit_model
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0414
- Accuracy: 0.9925
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
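The card names the public beans dataset; a minimal sketch of loading it and inspecting its three labels (matching the label list below):
```python
from datasets import load_dataset

ds = load_dataset("beans")
print(ds["train"].features["labels"].names)  # ['angular_leaf_spot', 'bean_rust', 'healthy']
print(ds["train"][0]["image"].size)          # each example carries a PIL image
```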
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1498 | 3.85 | 500 | 0.0414 | 0.9925 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"angular_leaf_spot",
"bean_rust",
"healthy"
] |
jordyvl/vit-base_rvl_tobacco
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base_rvl_tobacco
This model is a fine-tuned version of [jordyvl/vit-base_rvl-cdip](https://huggingface.co/jordyvl/vit-base_rvl-cdip) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4152
- Accuracy: 0.905
- Brier Loss: 0.1584
- Nll: 0.7130
- F1 Micro: 0.905
- F1 Macro: 0.9056
- Ece: 0.1601
- Aurc: 0.0196
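Beyond accuracy, the card reports calibration-style metrics. A hedged sketch of two of them, the multiclass Brier score and a 10-bin expected calibration error (ECE), on toy softmax outputs; this illustrates the standard definitions, not the exact evaluation code:
```python
import numpy as np

probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.3, 0.3, 0.4]])
labels = np.array([0, 1, 2])

onehot = np.eye(probs.shape[1])[labels]
brier = np.mean(np.sum((probs - onehot) ** 2, axis=1))  # multiclass Brier score

conf = probs.max(axis=1)                   # confidence of the predicted class
correct = probs.argmax(axis=1) == labels
edges = np.linspace(0.0, 1.0, 11)          # 10 equal-width confidence bins
ece = 0.0
for lo, hi in zip(edges[:-1], edges[1:]):
    mask = (conf > lo) & (conf <= hi)
    if mask.any():
        ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
print(brier, ece)
```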
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 2.3234 | 0.045 | 0.9050 | 9.6090 | 0.045 | 0.0479 | 0.1570 | 0.9674 |
| No log | 1.96 | 6 | 2.3007 | 0.05 | 0.9005 | 8.5690 | 0.0500 | 0.0549 | 0.1567 | 0.9599 |
| No log | 2.96 | 9 | 2.2614 | 0.095 | 0.8924 | 6.9011 | 0.095 | 0.0853 | 0.1807 | 0.9128 |
| No log | 3.96 | 12 | 2.2062 | 0.255 | 0.8804 | 5.5442 | 0.255 | 0.1609 | 0.2738 | 0.7469 |
| No log | 4.96 | 15 | 2.1348 | 0.385 | 0.8636 | 4.0613 | 0.3850 | 0.2330 | 0.3605 | 0.4157 |
| No log | 5.96 | 18 | 2.0473 | 0.48 | 0.8410 | 2.5353 | 0.48 | 0.3152 | 0.4376 | 0.2329 |
| No log | 6.96 | 21 | 1.9483 | 0.64 | 0.8128 | 2.0469 | 0.64 | 0.5131 | 0.5355 | 0.1314 |
| No log | 7.96 | 24 | 1.8371 | 0.735 | 0.7783 | 1.7309 | 0.735 | 0.6333 | 0.5897 | 0.0802 |
| No log | 8.96 | 27 | 1.7227 | 0.775 | 0.7393 | 1.3371 | 0.775 | 0.6937 | 0.6049 | 0.0560 |
| No log | 9.96 | 30 | 1.6124 | 0.805 | 0.6978 | 1.1320 | 0.805 | 0.7319 | 0.5981 | 0.0462 |
| No log | 10.96 | 33 | 1.4990 | 0.82 | 0.6518 | 0.9973 | 0.82 | 0.7658 | 0.5882 | 0.0444 |
| No log | 11.96 | 36 | 1.3922 | 0.855 | 0.6064 | 0.8830 | 0.855 | 0.8127 | 0.5823 | 0.0397 |
| No log | 12.96 | 39 | 1.2985 | 0.865 | 0.5653 | 0.8957 | 0.865 | 0.8350 | 0.5604 | 0.0365 |
| No log | 13.96 | 42 | 1.2141 | 0.89 | 0.5271 | 0.6892 | 0.89 | 0.8733 | 0.5564 | 0.0331 |
| No log | 14.96 | 45 | 1.1402 | 0.895 | 0.4926 | 0.6695 | 0.895 | 0.8803 | 0.5341 | 0.0321 |
| No log | 15.96 | 48 | 1.0699 | 0.91 | 0.4596 | 0.6407 | 0.91 | 0.8999 | 0.5185 | 0.0285 |
| No log | 16.96 | 51 | 1.0037 | 0.91 | 0.4282 | 0.6163 | 0.91 | 0.8979 | 0.4831 | 0.0270 |
| No log | 17.96 | 54 | 0.9457 | 0.915 | 0.4004 | 0.6126 | 0.915 | 0.9011 | 0.4618 | 0.0247 |
| No log | 18.96 | 57 | 0.8914 | 0.915 | 0.3742 | 0.6066 | 0.915 | 0.9011 | 0.4426 | 0.0242 |
| No log | 19.96 | 60 | 0.8405 | 0.92 | 0.3495 | 0.5898 | 0.92 | 0.9102 | 0.4314 | 0.0216 |
| No log | 20.96 | 63 | 0.7995 | 0.915 | 0.3291 | 0.5934 | 0.915 | 0.9049 | 0.4033 | 0.0204 |
| No log | 21.96 | 66 | 0.7583 | 0.915 | 0.3089 | 0.5883 | 0.915 | 0.9049 | 0.3818 | 0.0206 |
| No log | 22.96 | 69 | 0.7228 | 0.915 | 0.2915 | 0.5835 | 0.915 | 0.9049 | 0.3707 | 0.0199 |
| No log | 23.96 | 72 | 0.6889 | 0.925 | 0.2747 | 0.5703 | 0.925 | 0.9169 | 0.3649 | 0.0191 |
| No log | 24.96 | 75 | 0.6624 | 0.925 | 0.2614 | 0.5769 | 0.925 | 0.9200 | 0.3375 | 0.0190 |
| No log | 25.96 | 78 | 0.6373 | 0.925 | 0.2491 | 0.5764 | 0.925 | 0.9218 | 0.3206 | 0.0191 |
| No log | 26.96 | 81 | 0.6106 | 0.93 | 0.2363 | 0.5570 | 0.93 | 0.9251 | 0.3276 | 0.0186 |
| No log | 27.96 | 84 | 0.5945 | 0.93 | 0.2281 | 0.5721 | 0.93 | 0.9251 | 0.3201 | 0.0187 |
| No log | 28.96 | 87 | 0.5780 | 0.92 | 0.2206 | 0.5668 | 0.92 | 0.9190 | 0.3008 | 0.0200 |
| No log | 29.96 | 90 | 0.5613 | 0.925 | 0.2125 | 0.5709 | 0.925 | 0.9218 | 0.2961 | 0.0191 |
| No log | 30.96 | 93 | 0.5456 | 0.925 | 0.2051 | 0.6155 | 0.925 | 0.9175 | 0.2764 | 0.0182 |
| No log | 31.96 | 96 | 0.5354 | 0.91 | 0.2008 | 0.6139 | 0.91 | 0.9104 | 0.2600 | 0.0187 |
| No log | 32.96 | 99 | 0.5248 | 0.91 | 0.1961 | 0.6078 | 0.91 | 0.9104 | 0.2610 | 0.0194 |
| No log | 33.96 | 102 | 0.5151 | 0.91 | 0.1915 | 0.6158 | 0.91 | 0.9084 | 0.2529 | 0.0186 |
| No log | 34.96 | 105 | 0.5066 | 0.91 | 0.1880 | 0.6121 | 0.91 | 0.9084 | 0.2409 | 0.0186 |
| No log | 35.96 | 108 | 0.4986 | 0.91 | 0.1846 | 0.6070 | 0.91 | 0.9084 | 0.2429 | 0.0186 |
| No log | 36.96 | 111 | 0.4920 | 0.91 | 0.1817 | 0.6208 | 0.91 | 0.9084 | 0.2380 | 0.0187 |
| No log | 37.96 | 114 | 0.4858 | 0.91 | 0.1793 | 0.6081 | 0.91 | 0.9084 | 0.2319 | 0.0185 |
| No log | 38.96 | 117 | 0.4792 | 0.91 | 0.1766 | 0.6044 | 0.91 | 0.9084 | 0.2276 | 0.0184 |
| No log | 39.96 | 120 | 0.4753 | 0.91 | 0.1749 | 0.6671 | 0.91 | 0.9084 | 0.2245 | 0.0185 |
| No log | 40.96 | 123 | 0.4704 | 0.905 | 0.1731 | 0.6137 | 0.905 | 0.9056 | 0.2321 | 0.0186 |
| No log | 41.96 | 126 | 0.4656 | 0.91 | 0.1714 | 0.6028 | 0.91 | 0.9084 | 0.2259 | 0.0187 |
| No log | 42.96 | 129 | 0.4624 | 0.91 | 0.1703 | 0.6048 | 0.91 | 0.9084 | 0.2080 | 0.0189 |
| No log | 43.96 | 132 | 0.4604 | 0.905 | 0.1695 | 0.6674 | 0.905 | 0.9056 | 0.2167 | 0.0187 |
| No log | 44.96 | 135 | 0.4553 | 0.905 | 0.1678 | 0.6190 | 0.905 | 0.9056 | 0.2130 | 0.0185 |
| No log | 45.96 | 138 | 0.4512 | 0.905 | 0.1663 | 0.6002 | 0.905 | 0.9056 | 0.2182 | 0.0186 |
| No log | 46.96 | 141 | 0.4513 | 0.905 | 0.1665 | 0.6681 | 0.905 | 0.9056 | 0.1902 | 0.0185 |
| No log | 47.96 | 144 | 0.4480 | 0.905 | 0.1656 | 0.6661 | 0.905 | 0.9056 | 0.1900 | 0.0186 |
| No log | 48.96 | 147 | 0.4451 | 0.905 | 0.1647 | 0.6085 | 0.905 | 0.9056 | 0.1969 | 0.0185 |
| No log | 49.96 | 150 | 0.4429 | 0.905 | 0.1638 | 0.6729 | 0.905 | 0.9056 | 0.1954 | 0.0186 |
| No log | 50.96 | 153 | 0.4416 | 0.905 | 0.1637 | 0.7300 | 0.905 | 0.9056 | 0.1730 | 0.0188 |
| No log | 51.96 | 156 | 0.4390 | 0.905 | 0.1627 | 0.6832 | 0.905 | 0.9056 | 0.1881 | 0.0187 |
| No log | 52.96 | 159 | 0.4377 | 0.905 | 0.1625 | 0.6708 | 0.905 | 0.9056 | 0.1724 | 0.0187 |
| No log | 53.96 | 162 | 0.4360 | 0.905 | 0.1620 | 0.7300 | 0.905 | 0.9056 | 0.1714 | 0.0189 |
| No log | 54.96 | 165 | 0.4338 | 0.905 | 0.1613 | 0.6734 | 0.905 | 0.9056 | 0.1923 | 0.0190 |
| No log | 55.96 | 168 | 0.4321 | 0.905 | 0.1609 | 0.6635 | 0.905 | 0.9056 | 0.1846 | 0.0189 |
| No log | 56.96 | 171 | 0.4326 | 0.905 | 0.1614 | 0.6722 | 0.905 | 0.9056 | 0.1851 | 0.0190 |
| No log | 57.96 | 174 | 0.4322 | 0.905 | 0.1613 | 0.7871 | 0.905 | 0.9056 | 0.1850 | 0.0191 |
| No log | 58.96 | 177 | 0.4286 | 0.905 | 0.1600 | 0.6660 | 0.905 | 0.9056 | 0.1733 | 0.0190 |
| No log | 59.96 | 180 | 0.4267 | 0.905 | 0.1596 | 0.6581 | 0.905 | 0.9056 | 0.1720 | 0.0190 |
| No log | 60.96 | 183 | 0.4277 | 0.905 | 0.1601 | 0.7252 | 0.905 | 0.9056 | 0.1772 | 0.0189 |
| No log | 61.96 | 186 | 0.4274 | 0.905 | 0.1601 | 0.7841 | 0.905 | 0.9056 | 0.1866 | 0.0192 |
| No log | 62.96 | 189 | 0.4264 | 0.905 | 0.1598 | 0.7830 | 0.905 | 0.9056 | 0.1669 | 0.0191 |
| No log | 63.96 | 192 | 0.4246 | 0.905 | 0.1595 | 0.7188 | 0.905 | 0.9056 | 0.1671 | 0.0191 |
| No log | 64.96 | 195 | 0.4236 | 0.905 | 0.1592 | 0.7170 | 0.905 | 0.9056 | 0.1762 | 0.0193 |
| No log | 65.96 | 198 | 0.4238 | 0.905 | 0.1594 | 0.7235 | 0.905 | 0.9056 | 0.1757 | 0.0192 |
| No log | 66.96 | 201 | 0.4227 | 0.905 | 0.1591 | 0.7218 | 0.905 | 0.9056 | 0.1724 | 0.0192 |
| No log | 67.96 | 204 | 0.4220 | 0.905 | 0.1590 | 0.7195 | 0.905 | 0.9056 | 0.1715 | 0.0191 |
| No log | 68.96 | 207 | 0.4214 | 0.905 | 0.1589 | 0.7201 | 0.905 | 0.9056 | 0.1708 | 0.0191 |
| No log | 69.96 | 210 | 0.4210 | 0.905 | 0.1588 | 0.7210 | 0.905 | 0.9056 | 0.1703 | 0.0193 |
| No log | 70.96 | 213 | 0.4211 | 0.905 | 0.1590 | 0.7226 | 0.905 | 0.9056 | 0.1697 | 0.0193 |
| No log | 71.96 | 216 | 0.4201 | 0.905 | 0.1587 | 0.7165 | 0.905 | 0.9056 | 0.1785 | 0.0193 |
| No log | 72.96 | 219 | 0.4194 | 0.905 | 0.1587 | 0.7145 | 0.905 | 0.9056 | 0.1780 | 0.0194 |
| No log | 73.96 | 222 | 0.4194 | 0.905 | 0.1587 | 0.7189 | 0.905 | 0.9056 | 0.1777 | 0.0194 |
| No log | 74.96 | 225 | 0.4192 | 0.905 | 0.1587 | 0.7193 | 0.905 | 0.9056 | 0.1770 | 0.0194 |
| No log | 75.96 | 228 | 0.4188 | 0.905 | 0.1586 | 0.7186 | 0.905 | 0.9056 | 0.1764 | 0.0192 |
| No log | 76.96 | 231 | 0.4180 | 0.905 | 0.1585 | 0.7148 | 0.905 | 0.9056 | 0.1786 | 0.0192 |
| No log | 77.96 | 234 | 0.4174 | 0.905 | 0.1584 | 0.7121 | 0.905 | 0.9056 | 0.1746 | 0.0193 |
| No log | 78.96 | 237 | 0.4178 | 0.905 | 0.1585 | 0.7159 | 0.905 | 0.9056 | 0.1720 | 0.0195 |
| No log | 79.96 | 240 | 0.4177 | 0.905 | 0.1586 | 0.7161 | 0.905 | 0.9056 | 0.1627 | 0.0195 |
| No log | 80.96 | 243 | 0.4173 | 0.905 | 0.1585 | 0.7147 | 0.905 | 0.9056 | 0.1627 | 0.0195 |
| No log | 81.96 | 246 | 0.4171 | 0.905 | 0.1585 | 0.7159 | 0.905 | 0.9056 | 0.1650 | 0.0195 |
| No log | 82.96 | 249 | 0.4162 | 0.905 | 0.1582 | 0.7135 | 0.905 | 0.9056 | 0.1742 | 0.0194 |
| No log | 83.96 | 252 | 0.4163 | 0.905 | 0.1584 | 0.7138 | 0.905 | 0.9056 | 0.1522 | 0.0196 |
| No log | 84.96 | 255 | 0.4161 | 0.905 | 0.1583 | 0.7136 | 0.905 | 0.9056 | 0.1616 | 0.0195 |
| No log | 85.96 | 258 | 0.4163 | 0.905 | 0.1585 | 0.7143 | 0.905 | 0.9056 | 0.1615 | 0.0196 |
| No log | 86.96 | 261 | 0.4161 | 0.905 | 0.1585 | 0.7132 | 0.905 | 0.9056 | 0.1614 | 0.0195 |
| No log | 87.96 | 264 | 0.4159 | 0.905 | 0.1584 | 0.7133 | 0.905 | 0.9056 | 0.1514 | 0.0195 |
| No log | 88.96 | 267 | 0.4157 | 0.905 | 0.1584 | 0.7132 | 0.905 | 0.9056 | 0.1513 | 0.0195 |
| No log | 89.96 | 270 | 0.4156 | 0.905 | 0.1584 | 0.7134 | 0.905 | 0.9056 | 0.1511 | 0.0195 |
| No log | 90.96 | 273 | 0.4153 | 0.905 | 0.1583 | 0.7124 | 0.905 | 0.9056 | 0.1605 | 0.0195 |
| No log | 91.96 | 276 | 0.4153 | 0.905 | 0.1584 | 0.7121 | 0.905 | 0.9056 | 0.1604 | 0.0195 |
| No log | 92.96 | 279 | 0.4154 | 0.905 | 0.1584 | 0.7127 | 0.905 | 0.9056 | 0.1603 | 0.0195 |
| No log | 93.96 | 282 | 0.4154 | 0.905 | 0.1585 | 0.7131 | 0.905 | 0.9056 | 0.1603 | 0.0195 |
| No log | 94.96 | 285 | 0.4154 | 0.905 | 0.1585 | 0.7132 | 0.905 | 0.9056 | 0.1603 | 0.0195 |
| No log | 95.96 | 288 | 0.4154 | 0.905 | 0.1585 | 0.7135 | 0.905 | 0.9056 | 0.1603 | 0.0196 |
| No log | 96.96 | 291 | 0.4153 | 0.905 | 0.1585 | 0.7133 | 0.905 | 0.9056 | 0.1602 | 0.0195 |
| No log | 97.96 | 294 | 0.4152 | 0.905 | 0.1584 | 0.7132 | 0.905 | 0.9056 | 0.1601 | 0.0196 |
| No log | 98.96 | 297 | 0.4152 | 0.905 | 0.1584 | 0.7130 | 0.905 | 0.9056 | 0.1601 | 0.0196 |
| No log | 99.96 | 300 | 0.4152 | 0.905 | 0.1584 | 0.7130 | 0.905 | 0.9056 | 0.1601 | 0.0196 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
XO-Appleton/vit-base-patch16-224-in21k-MR
|
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# XO-Appleton/vit-base-patch16-224-in21k-MR
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0042
- Train Accuracy: 1.0
- Train Top-3-accuracy: 1.0
- Validation Loss: 0.0126
- Validation Accuracy: 0.9983
- Validation Top-3-accuracy: 1.0
- Epoch: 3
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'inner_optimizer': {'class_name': 'AdamWeightDecay', 'config': {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 4485, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}}, 'dynamic': True, 'initial_scale': 32768.0, 'dynamic_growth_steps': 2000}
- training_precision: mixed_float16
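The serialized optimizer above decodes to an `AdamWeightDecay` optimizer on a `PolynomialDecay` schedule (power 1.0, i.e. linear decay), trained under mixed_float16 with dynamic loss scaling. A hedged reconstruction, assuming the TF side of `transformers`:
```python
import tensorflow as tf
from transformers import AdamWeightDecay

tf.keras.mixed_precision.set_global_policy("mixed_float16")  # matches training_precision

lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=3e-05,
    decay_steps=4485,
    end_learning_rate=0.0,
    power=1.0,  # power 1.0 == linear decay
)
optimizer = AdamWeightDecay(
    learning_rate=lr_schedule,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
)
```
Keras applies the dynamic loss-scale wrapper shown in the config automatically when a model is compiled under the mixed_float16 policy.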
### Training results
| Train Loss | Train Accuracy | Train Top-3-accuracy | Validation Loss | Validation Accuracy | Validation Top-3-accuracy | Epoch |
|:----------:|:--------------:|:--------------------:|:---------------:|:-------------------:|:-------------------------:|:-----:|
| 0.1624 | 0.9544 | 1.0 | 0.0380 | 0.9933 | 1.0 | 0 |
| 0.0178 | 0.9979 | 1.0 | 0.0197 | 0.9966 | 1.0 | 1 |
| 0.0063 | 1.0 | 1.0 | 0.0139 | 0.9983 | 1.0 | 2 |
| 0.0042 | 1.0 | 1.0 | 0.0126 | 0.9983 | 1.0 | 3 |
### Framework versions
- Transformers 4.30.2
- TensorFlow 2.12.0
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"flip",
"notflip"
] |
sghirardelli/vit-base-patch16-224-in21k-rgbd
|
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-in21k-rgbd
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.5496
- Train Accuracy: 1.0
- Train Top-3-accuracy: 1.0
- Validation Loss: 0.3955
- Validation Accuracy: 0.9994
- Validation Top-3-accuracy: 1.0
- Epoch: 1
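Top-3 accuracy as reported above counts a prediction correct when the true class appears among the three highest-probability outputs; a tiny Keras sketch with toy values:
```python
import tensorflow as tf

y_true = tf.one_hot([2, 0], depth=4)
y_pred = tf.constant([[0.10, 0.20, 0.30, 0.40],   # class 2 is in the top 3 -> hit
                      [0.05, 0.50, 0.30, 0.15]])  # class 0 is not -> miss
metric = tf.keras.metrics.TopKCategoricalAccuracy(k=3)
metric.update_state(y_true, y_pred)
print(metric.result().numpy())  # 0.5
```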
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'inner_optimizer': {'class_name': 'AdamWeightDecay', 'config': {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 1455, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}}, 'dynamic': True, 'initial_scale': 32768.0, 'dynamic_growth_steps': 2000}
- training_precision: mixed_float16
### Training results
| Train Loss | Train Accuracy | Train Top-3-accuracy | Validation Loss | Validation Accuracy | Validation Top-3-accuracy | Epoch |
|:----------:|:--------------:|:--------------------:|:---------------:|:-------------------:|:-------------------------:|:-----:|
| 1.6822 | 0.9392 | 0.9664 | 0.7810 | 0.9994 | 1.0 | 0 |
| 0.5496 | 1.0 | 1.0 | 0.3955 | 0.9994 | 1.0 | 1 |
### Framework versions
- Transformers 4.30.2
- TensorFlow 2.12.0
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"apple",
"ball",
"cereal_box",
"coffee_mug",
"comb",
"dry_battery",
"flashlight",
"food_bag",
"food_box",
"food_can",
"food_cup",
"food_jar",
"banana",
"garlic",
"glue_stick",
"greens",
"hand_towel",
"instant_noodles",
"keyboard",
"kleenex",
"lemon",
"lightbulb",
"lime",
"bell_pepper",
"marker",
"mushroom",
"notebook",
"onion",
"orange",
"peach",
"pear",
"pitcher",
"plate",
"pliers",
"binder",
"potato",
"rubber_eraser",
"scissors",
"shampoo",
"soda_can",
"sponge",
"stapler",
"tomato",
"toothbrush",
"toothpaste",
"bowl",
"water_bottle",
"calculator",
"camera",
"cap",
"cell_phone"
] |
ALM-AHME/beit-large-patch16-224-finetuned-BreastCancer-Classification-BreakHis-AH-60-20-20
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# beit-large-patch16-224-finetuned-BreastCancer-Classification-BreakHis-AH-60-20-20
This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0275
- Accuracy: 0.9939
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.9
- num_epochs: 12
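With `lr_scheduler_warmup_ratio: 0.9` and 2388 total optimization steps (12 epochs × 199 steps per epoch, per the table below), roughly the first 2149 steps ramp the learning rate up linearly before the linear decay begins. A hedged sketch with the standard `transformers` helper and a stand-in optimizer:
```python
import torch
from transformers import get_linear_schedule_with_warmup

optimizer = torch.optim.Adam([torch.nn.Parameter(torch.zeros(1))], lr=5e-6)  # stand-in
total_steps = 2388                      # 12 epochs x 199 steps (table below)
warmup_steps = int(0.9 * total_steps)   # warmup_ratio 0.9 -> 2149 steps
scheduler = get_linear_schedule_with_warmup(optimizer, warmup_steps, total_steps)
```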
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.46 | 1.0 | 199 | 0.3950 | 0.8482 |
| 0.2048 | 2.0 | 398 | 0.1886 | 0.9189 |
| 0.182 | 3.0 | 597 | 0.1382 | 0.9481 |
| 0.0826 | 4.0 | 796 | 0.0760 | 0.9694 |
| 0.0886 | 5.0 | 995 | 0.0600 | 0.9788 |
| 0.0896 | 6.0 | 1194 | 0.0523 | 0.9802 |
| 0.0774 | 7.0 | 1393 | 0.0482 | 0.9826 |
| 0.0876 | 8.0 | 1592 | 0.0289 | 0.9877 |
| 0.1105 | 9.0 | 1791 | 0.0580 | 0.9821 |
| 0.0289 | 10.0 | 1990 | 0.0294 | 0.9925 |
| 0.0594 | 11.0 | 2189 | 0.0331 | 0.9906 |
| 0.0011 | 12.0 | 2388 | 0.0275 | 0.9939 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"benign",
"malignant"
] |
ALM-AHME/convnextv2-large-1k-224-finetuned-BreastCancer-Classification-BreakHis-AH-60-20-20
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# convnextv2-large-1k-224-finetuned-BreastCancer-Classification-BreakHis-AH-60-20-20
This model is a fine-tuned version of [facebook/convnextv2-large-1k-224](https://huggingface.co/facebook/convnextv2-large-1k-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0353
- Accuracy: 0.9901
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.9
- num_epochs: 12
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.5207 | 1.0 | 199 | 0.4745 | 0.8887 |
| 0.2029 | 2.0 | 398 | 0.2072 | 0.9401 |
| 0.1615 | 3.0 | 597 | 0.1489 | 0.9547 |
| 0.1662 | 4.0 | 796 | 0.1312 | 0.9562 |
| 0.1986 | 5.0 | 995 | 0.1026 | 0.9698 |
| 0.0854 | 6.0 | 1194 | 0.0583 | 0.9802 |
| 0.0538 | 7.0 | 1393 | 0.0568 | 0.9835 |
| 0.0977 | 8.0 | 1592 | 0.0654 | 0.9793 |
| 0.6971 | 9.0 | 1791 | 0.6821 | 0.5450 |
| 0.211 | 10.0 | 1990 | 0.1654 | 0.9326 |
| 0.1775 | 11.0 | 2189 | 0.0859 | 0.9665 |
| 0.0042 | 12.0 | 2388 | 0.0353 | 0.9901 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"benign",
"malignant"
] |
akar49/mri_classifier
|
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# akar49/mri_classifier
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.1032
- Validation Loss: 0.1556
- Train Accuracy: 0.9367
- Epoch: 14
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'SGD', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 0.001, 'momentum': 0.0, 'nesterov': False}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 0.6447 | 0.6133 | 0.7004 | 0 |
| 0.5405 | 0.5010 | 0.8256 | 1 |
| 0.4181 | 0.3917 | 0.8650 | 2 |
| 0.3122 | 0.3189 | 0.9058 | 3 |
| 0.2474 | 0.3069 | 0.8875 | 4 |
| 0.2021 | 0.2733 | 0.9044 | 5 |
| 0.1745 | 0.2455 | 0.9100 | 6 |
| 0.1591 | 0.2203 | 0.9212 | 7 |
| 0.1450 | 0.2350 | 0.9142 | 8 |
| 0.1397 | 0.2122 | 0.9198 | 9 |
| 0.1227 | 0.2098 | 0.9212 | 10 |
| 0.1169 | 0.1754 | 0.9325 | 11 |
| 0.1080 | 0.1782 | 0.9339 | 12 |
| 0.0971 | 0.1705 | 0.9353 | 13 |
| 0.1032 | 0.1556 | 0.9367 | 14 |
### Framework versions
- Transformers 4.30.2
- TensorFlow 2.12.0
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"notumor",
"tumor"
] |
jvadlamudi2/swin-tiny-patch4-window7-224-jvadlamudi2
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-jvadlamudi2
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4467
- Accuracy: 0.8304
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
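The `total_train_batch_size` of 128 is the per-device batch of 32 times the 4 gradient-accumulation steps. A toy PyTorch sketch of the mechanism (not the Trainer's exact internals):
```python
import torch
from torch import nn

model = nn.Linear(8, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=5e-5)
loss_fn = nn.CrossEntropyLoss()
accumulation_steps = 4                       # 32 x 4 = 128 effective batch size

optimizer.zero_grad()
for step in range(8):
    x, y = torch.randn(32, 8), torch.randint(0, 2, (32,))
    loss = loss_fn(model(x), y) / accumulation_steps
    loss.backward()                          # gradients add up across micro-batches
    if (step + 1) % accumulation_steps == 0:
        optimizer.step()                     # one update per 128 effective examples
        optimizer.zero_grad()
```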
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 7 | 0.4580 | 0.8214 |
| 0.4488 | 2.0 | 14 | 0.4733 | 0.8304 |
| 0.4924 | 3.0 | 21 | 0.4467 | 0.8304 |
### Framework versions
- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.0
- Tokenizers 0.13.3
|
[
"0",
"1"
] |
ALM-AHME/swinv2-large-patch4-window12to16-192to256-22kto1k-ft-finetuned-BreastCancer-BreakHis-AH-60-20-20
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swinv2-large-patch4-window12to16-192to256-22kto1k-ft-finetuned-BreastCancer-BreakHis-AH-60-20-20
This model is a fine-tuned version of [microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft](https://huggingface.co/microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0229
- Accuracy: 0.9943
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.5
- num_epochs: 7
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2053 | 1.0 | 199 | 0.1227 | 0.9496 |
| 0.1302 | 2.0 | 398 | 0.0665 | 0.9736 |
| 0.0784 | 3.0 | 597 | 0.0600 | 0.9778 |
| 0.1181 | 4.0 | 796 | 0.0449 | 0.9849 |
| 0.208 | 5.0 | 995 | 0.0393 | 0.9887 |
| 0.0057 | 6.0 | 1194 | 0.0229 | 0.9943 |
| 0.0017 | 7.0 | 1393 | 0.0263 | 0.9939 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"benign",
"malignant"
] |
dyvapandhu/vit-base-molecul-v2-5-epoch
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-molecul-v2-5-epoch
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5290
- Accuracy: 0.77
- F1: 0.7698
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.31.0.dev0
- Pytorch 2.0.1
- Datasets 2.13.1
- Tokenizers 0.11.0
|
[
"a",
"c"
] |
margosabry/food_classifier
|
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# margosabry/food_classifier
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.3853
- Validation Loss: 0.3150
- Train Accuracy: 0.928
- Epoch: 4
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 20000, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 2.8055 | 1.6705 | 0.808 | 0 |
| 1.2418 | 0.8233 | 0.883 | 1 |
| 0.7004 | 0.5248 | 0.912 | 2 |
| 0.5037 | 0.3802 | 0.926 | 3 |
| 0.3853 | 0.3150 | 0.928 | 4 |
### Framework versions
- Transformers 4.30.2
- TensorFlow 2.12.0
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"apple_pie",
"baby_back_ribs",
"bruschetta",
"waffles",
"caesar_salad",
"cannoli",
"caprese_salad",
"carrot_cake",
"ceviche",
"cheesecake",
"cheese_plate",
"chicken_curry",
"chicken_quesadilla",
"baklava",
"chicken_wings",
"chocolate_cake",
"chocolate_mousse",
"churros",
"clam_chowder",
"club_sandwich",
"crab_cakes",
"creme_brulee",
"croque_madame",
"cup_cakes",
"beef_carpaccio",
"deviled_eggs",
"donuts",
"dumplings",
"edamame",
"eggs_benedict",
"escargots",
"falafel",
"filet_mignon",
"fish_and_chips",
"foie_gras",
"beef_tartare",
"french_fries",
"french_onion_soup",
"french_toast",
"fried_calamari",
"fried_rice",
"frozen_yogurt",
"garlic_bread",
"gnocchi",
"greek_salad",
"grilled_cheese_sandwich",
"beet_salad",
"grilled_salmon",
"guacamole",
"gyoza",
"hamburger",
"hot_and_sour_soup",
"hot_dog",
"huevos_rancheros",
"hummus",
"ice_cream",
"lasagna",
"beignets",
"lobster_bisque",
"lobster_roll_sandwich",
"macaroni_and_cheese",
"macarons",
"miso_soup",
"mussels",
"nachos",
"omelette",
"onion_rings",
"oysters",
"bibimbap",
"pad_thai",
"paella",
"pancakes",
"panna_cotta",
"peking_duck",
"pho",
"pizza",
"pork_chop",
"poutine",
"prime_rib",
"bread_pudding",
"pulled_pork_sandwich",
"ramen",
"ravioli",
"red_velvet_cake",
"risotto",
"samosa",
"sashimi",
"scallops",
"seaweed_salad",
"shrimp_and_grits",
"breakfast_burrito",
"spaghetti_bolognese",
"spaghetti_carbonara",
"spring_rolls",
"steak",
"strawberry_shortcake",
"sushi",
"tacos",
"takoyaki",
"tiramisu",
"tuna_tartare"
] |
rshrott/vit-base-beans-demo-v5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-beans-demo-v5
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8460
- Accuracy: 0.6695
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
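Native AMP here refers to PyTorch's `torch.cuda.amp` mixed precision; a minimal toy sketch of the pattern (requires a CUDA device, and is not the Trainer's exact loop):
```python
import torch
from torch import nn

model = nn.Linear(8, 7).cuda()
optimizer = torch.optim.Adam(model.parameters(), lr=2e-4)
scaler = torch.cuda.amp.GradScaler()

x = torch.randn(16, 8, device="cuda")
y = torch.randint(0, 7, (16,), device="cuda")

optimizer.zero_grad()
with torch.cuda.amp.autocast():              # forward pass in reduced precision
    loss = nn.functional.cross_entropy(model(x), y)
scaler.scale(loss).backward()                # scaled loss to avoid fp16 underflow
scaler.step(optimizer)
scaler.update()
```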
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.0616 | 0.17 | 100 | 1.0267 | 0.5818 |
| 0.9594 | 0.34 | 200 | 0.9468 | 0.6073 |
| 1.1785 | 0.51 | 300 | 0.9976 | 0.5869 |
| 0.865 | 0.68 | 400 | 0.9288 | 0.6388 |
| 0.8494 | 0.85 | 500 | 0.8573 | 0.6516 |
| 0.8151 | 1.02 | 600 | 0.8729 | 0.6397 |
| 0.5787 | 1.19 | 700 | 0.9067 | 0.6448 |
| 0.7768 | 1.36 | 800 | 0.8996 | 0.6533 |
| 0.6098 | 1.53 | 900 | 0.8460 | 0.6695 |
| 0.6251 | 1.7 | 1000 | 0.8610 | 0.6704 |
| 0.7863 | 1.87 | 1100 | 0.8668 | 0.6431 |
| 0.2595 | 2.04 | 1200 | 0.8725 | 0.6840 |
| 0.2735 | 2.21 | 1300 | 0.9307 | 0.6746 |
| 0.2429 | 2.39 | 1400 | 1.0958 | 0.6354 |
| 0.3224 | 2.56 | 1500 | 1.0305 | 0.6687 |
| 0.1602 | 2.73 | 1600 | 1.0072 | 0.6746 |
| 0.2042 | 2.9 | 1700 | 1.0971 | 0.6789 |
| 0.0604 | 3.07 | 1800 | 1.0817 | 0.6917 |
| 0.0716 | 3.24 | 1900 | 1.1307 | 0.6925 |
| 0.0822 | 3.41 | 2000 | 1.1827 | 0.6925 |
| 0.0889 | 3.58 | 2100 | 1.2424 | 0.6934 |
| 0.0855 | 3.75 | 2200 | 1.2667 | 0.6899 |
| 0.0682 | 3.92 | 2300 | 1.2470 | 0.6951 |
### Framework versions
- Transformers 4.39.1
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
|
[
"not applicable",
"very poor",
"poor",
"fair",
"good",
"great",
"excellent"
] |
jthetzel/swin-tiny-patch4-window7-224-finetuned-eurosat
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-eurosat
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0604
- Accuracy: 0.9822
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2326 | 1.0 | 190 | 0.1175 | 0.9604 |
| 0.1789 | 2.0 | 380 | 0.0765 | 0.9763 |
| 0.1414 | 3.0 | 570 | 0.0604 | 0.9822 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"annualcrop",
"forest",
"herbaceousvegetation",
"highway",
"industrial",
"pasture",
"permanentcrop",
"residential",
"river",
"sealake"
] |
hyeongjin99/vit-base-aihub_model-v2
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-aihub_model-v2
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3076
- Accuracy: 0.9639
- Precision: 0.9610
- Recall: 0.9614
- F1: 0.9604
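A hedged sketch of how precision, recall, and F1 of this kind are typically computed (assuming macro averaging, which is an assumption, not stated in the card), on toy predictions:
```python
from sklearn.metrics import precision_recall_fscore_support

y_true = [0, 1, 2, 2, 1, 0]   # toy labels
y_pred = [0, 1, 2, 1, 1, 0]   # one class-2 example misclassified as class 1
precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="macro"
)
print(precision, recall, f1)
```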
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| No log | 1.0 | 3 | 1.2753 | 0.8373 | 0.8563 | 0.7993 | 0.8022 |
| No log | 2.0 | 6 | 1.1252 | 0.8675 | 0.8895 | 0.8300 | 0.8333 |
| No log | 3.0 | 9 | 0.9427 | 0.8976 | 0.9185 | 0.8696 | 0.8760 |
| 1.1721 | 4.0 | 12 | 0.7995 | 0.9398 | 0.9474 | 0.9195 | 0.9246 |
| 1.1721 | 5.0 | 15 | 0.6820 | 0.9699 | 0.9704 | 0.9613 | 0.9642 |
| 1.1721 | 6.0 | 18 | 0.5927 | 0.9639 | 0.9603 | 0.9583 | 0.9587 |
| 0.7084 | 7.0 | 21 | 0.5239 | 0.9759 | 0.9725 | 0.9729 | 0.9725 |
| 0.7084 | 8.0 | 24 | 0.4743 | 0.9699 | 0.9665 | 0.9671 | 0.9665 |
| 0.7084 | 9.0 | 27 | 0.4436 | 0.9578 | 0.9558 | 0.9556 | 0.9544 |
| 0.4668 | 10.0 | 30 | 0.4070 | 0.9639 | 0.9610 | 0.9614 | 0.9604 |
| 0.4668 | 11.0 | 33 | 0.3817 | 0.9699 | 0.9665 | 0.9671 | 0.9665 |
| 0.4668 | 12.0 | 36 | 0.3625 | 0.9699 | 0.9665 | 0.9671 | 0.9665 |
| 0.4668 | 13.0 | 39 | 0.3536 | 0.9578 | 0.9558 | 0.9556 | 0.9544 |
| 0.3611 | 14.0 | 42 | 0.3384 | 0.9578 | 0.9558 | 0.9556 | 0.9544 |
| 0.3611 | 15.0 | 45 | 0.3249 | 0.9699 | 0.9665 | 0.9671 | 0.9665 |
| 0.3611 | 16.0 | 48 | 0.3164 | 0.9699 | 0.9665 | 0.9671 | 0.9665 |
| 0.3063 | 17.0 | 51 | 0.3142 | 0.9639 | 0.9610 | 0.9614 | 0.9604 |
| 0.3063 | 18.0 | 54 | 0.3122 | 0.9639 | 0.9610 | 0.9614 | 0.9604 |
| 0.3063 | 19.0 | 57 | 0.3093 | 0.9639 | 0.9610 | 0.9614 | 0.9604 |
| 0.294 | 20.0 | 60 | 0.3076 | 0.9639 | 0.9610 | 0.9614 | 0.9604 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3
|
[
"cloudy",
"rainy",
"sandy",
"shine",
"snowy",
"sunrise"
] |
sezer12138/results
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# results
This model is a fine-tuned version of [microsoft/swinv2-large-patch4-window12-192-22k](https://huggingface.co/microsoft/swinv2-large-patch4-window12-192-22k) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5248
- Accuracy: 0.6505
## Model description
Image classification on the ADE20K dataset.
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.8495 | 1.0 | 158 | 2.3313 | 0.5525 |
| 1.8825 | 2.0 | 316 | 1.6815 | 0.623 |
| 1.4956 | 3.0 | 474 | 1.5248 | 0.6505 |
### Framework versions
- Transformers 4.29.2
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.13.2
|
[
"abbey",
"access_road",
"airport_terminal",
"bathroom",
"village",
"vinery",
"vineyard",
"volcano",
"volleyball_court_indoor",
"volleyball_court_outdoor",
"voting_booth",
"waiting_room",
"walk_in_freezer",
"walkway",
"batters_box",
"war_room",
"warehouse_indoor",
"warehouse_outdoor",
"washhouse_indoor",
"washhouse_outdoor",
"washroom",
"watchtower",
"water",
"water_fountain",
"water_gate",
"batting_cage_indoor",
"water_mill",
"water_park",
"water_tower",
"water_treatment_plant_indoor",
"water_treatment_plant_outdoor",
"watering_hole",
"waterscape",
"waterway",
"wave",
"weighbridge",
"batting_cage_outdoor",
"western",
"wet_bar",
"wetland",
"wharf",
"wheat_field",
"whispering_gallery",
"widows_walk_indoor",
"widows_walk_interior",
"wild",
"wind_farm",
"battlefield",
"windmill",
"window_seat",
"windstorm",
"winery",
"witness_stand",
"woodland",
"workroom",
"workshop",
"wrestling_ring_indoor",
"wrestling_ring_outdoor",
"battlement",
"yard",
"youth_hostel",
"zen_garden",
"ziggurat",
"zoo",
"bay",
"bayou",
"bazaar_indoor",
"bazaar_outdoor",
"airport_ticket_counter",
"beach",
"beach_house",
"beauty_salon",
"bedchamber",
"bedroom",
"beer_garden",
"beer_hall",
"belfry",
"bell_foundry",
"berth",
"alcove",
"berth_deck",
"betting_shop",
"bicycle_racks",
"bindery",
"biology_laboratory",
"bistro_indoor",
"bistro_outdoor",
"bleachers_indoor",
"bleachers_outdoor",
"block",
"alley",
"boardwalk",
"boat",
"boat_deck",
"boathouse",
"bog",
"bomb_shelter_indoor",
"bookbindery",
"bookshelf",
"bookstore",
"booth",
"amphitheater",
"booth_indoor",
"booth_outdoor",
"botanical_garden",
"bottle_storage",
"bottomland",
"bow_window_indoor",
"bow_window_outdoor",
"bowling_alley",
"box_seat",
"boxing_ring",
"amphitheater_indoor",
"breakfast_table",
"breakroom",
"brewery_indoor",
"brewery_outdoor",
"bric-a-brac",
"brickyard_indoor",
"brickyard_outdoor",
"bridge",
"bridle_path",
"broadleaf",
"amusement_arcade",
"brooklet",
"bubble_chamber",
"buffet",
"building_complex",
"building_facade",
"bulkhead",
"bullpen",
"bullring",
"bunk_bed",
"burial_chamber",
"amusement_park",
"bus_depot_indoor",
"bus_depot_outdoor",
"bus_interior",
"bus_shelter",
"bus_station_indoor",
"bus_station_outdoor",
"butchers_shop",
"butte",
"bypass",
"byroad",
"anechoic_chamber",
"cabana",
"cabin_cruiser",
"cabin_indoor",
"cabin_outdoor",
"cafeteria",
"call_center",
"campsite",
"campus",
"candy_store",
"canteen",
"apartment_building_outdoor",
"canyon",
"car_dealership",
"caravansary",
"cardroom",
"cargo_container_interior",
"cargo_deck",
"cargo_helicopter",
"carport_indoor",
"carport_outdoor",
"carrousel",
"acropolis",
"apse_indoor",
"cascade",
"casino_indoor",
"casino_outdoor",
"castle",
"catacomb",
"cataract",
"cathedral_indoor",
"cathedral_outdoor",
"catwalk",
"cavern_indoor",
"apse_outdoor",
"cavern_outdoor",
"cellar",
"cemetery",
"chair_lift",
"chalet",
"chaparral",
"chapel",
"checkout_counter",
"cheese_factory",
"chemical_plant",
"aquarium",
"chemistry_lab",
"chicken_coop_indoor",
"chicken_coop_outdoor",
"chicken_farm_indoor",
"chicken_farm_outdoor",
"childs_room",
"choir_loft_interior",
"chuck_wagon",
"church_indoor",
"church_outdoor",
"aquatic_theater",
"circus_tent_indoor",
"circus_tent_outdoor",
"city",
"classroom",
"clean_room",
"cliff",
"clock_tower_indoor",
"cloister_indoor",
"cloister_outdoor",
"closet",
"aqueduct",
"clothing_store",
"coast",
"coast_road",
"cockpit",
"cocktail_lounge",
"coffee_shop",
"computer_room",
"conference_center",
"conference_hall",
"conference_room",
"arbor",
"confessional",
"construction_site",
"control_room",
"control_tower_indoor",
"control_tower_outdoor",
"convenience_store_indoor",
"convenience_store_outdoor",
"coral_reef",
"corn_field",
"corner",
"arcade",
"corral",
"corridor",
"cottage",
"cottage_garden",
"country_house",
"country_road",
"courthouse",
"courtroom",
"courtyard",
"covered_bridge_interior",
"arch",
"crawl_space",
"creek",
"crevasse",
"crosswalk",
"cultivated",
"customhouse",
"cybercafe",
"dacha",
"dairy_indoor",
"dairy_outdoor",
"archaelogical_excavation",
"dam",
"dance_floor",
"dance_school",
"darkroom",
"day_care_center",
"deck-house_boat_deck_house",
"deck-house_deck_house",
"delicatessen",
"dentists_office",
"department_store",
"archipelago",
"departure_lounge",
"desert_road",
"diner_indoor",
"diner_outdoor",
"dinette_home",
"dining_area",
"dining_car",
"dining_hall",
"dining_room",
"dirt_track",
"air_base",
"archive",
"discotheque",
"distillery",
"ditch",
"diving_board",
"dock",
"dolmen",
"donjon",
"door",
"doorway_indoor",
"doorway_outdoor",
"armory",
"dorm_room",
"downtown",
"drainage_ditch",
"dress_shop",
"dressing_room",
"drill_rig",
"driveway",
"driving_range_indoor",
"driving_range_outdoor",
"drugstore",
"army_base",
"dry",
"dry_dock",
"dugout",
"earth_fissure",
"east_asia",
"editing_room",
"electrical_substation",
"elevated_catwalk",
"elevator_interior",
"elevator_lobby",
"arrival_gate_indoor",
"elevator_shaft",
"embankment",
"embassy",
"embrasure",
"engine_room",
"entrance",
"entrance_hall",
"entranceway_indoor",
"entranceway_outdoor",
"entryway_outdoor",
"arrival_gate_outdoor",
"escalator_indoor",
"escalator_outdoor",
"escarpment",
"establishment",
"estaminet",
"estuary",
"excavation",
"exhibition_hall",
"exterior",
"fabric_store",
"art_gallery",
"factory_indoor",
"factory_outdoor",
"fairway",
"fan",
"farm",
"farm_building",
"farmhouse",
"fastfood_restaurant",
"feed_bunk",
"fence",
"art_school",
"ferryboat_indoor",
"field_house",
"field_road",
"field_tent_indoor",
"field_tent_outdoor",
"fire_escape",
"fire_station",
"fire_trench",
"fireplace",
"firing_range_indoor",
"art_studio",
"firing_range_outdoor",
"fish_farm",
"fishmarket",
"fishpond",
"fitting_room_interior",
"fjord",
"flashflood",
"flatlet",
"flea_market_indoor",
"flea_market_outdoor",
"artificial",
"floating_dock",
"floating_dry_dock",
"flood",
"flood_plain",
"florist_shop_indoor",
"florist_shop_outdoor",
"flowerbed",
"flume_indoor",
"fly_bridge",
"flying_buttress",
"artists_loft",
"food_court",
"football",
"football_field",
"foothill",
"forecourt",
"foreshore",
"forest_fire",
"forest_path",
"forest_road",
"forklift",
"aircraft_carrier_object",
"assembly_hall",
"formal_garden",
"fort",
"fortress",
"foundry_indoor",
"foundry_outdoor",
"fountain",
"freestanding",
"freeway",
"freight_elevator",
"front_porch",
"assembly_line",
"frontseat",
"funeral_chapel",
"funeral_home",
"furnace_room",
"galley",
"game_room",
"gangplank",
"garage_indoor",
"garage_outdoor",
"garbage_dump",
"assembly_plant",
"garden",
"gas_station",
"gas_well",
"gasworks",
"gate",
"gatehouse",
"gazebo_interior",
"general_store_indoor",
"general_store_outdoor",
"geodesic_dome_indoor",
"athletic_field_indoor",
"geodesic_dome_outdoor",
"ghost_town",
"gift_shop",
"glacier",
"glade",
"glen",
"golf_course",
"gorge",
"granary",
"grape_arbor",
"athletic_field_outdoor",
"great_hall",
"greengrocery",
"greenhouse_indoor",
"greenhouse_outdoor",
"grotto",
"grove",
"guardhouse",
"guardroom",
"guesthouse",
"gulch",
"atrium_home",
"gun_deck_indoor",
"gun_deck_outdoor",
"gun_store",
"gymnasium_indoor",
"gymnasium_outdoor",
"hacienda",
"hallway",
"handball_court",
"hangar_indoor",
"hangar_outdoor",
"atrium_public",
"harbor",
"hardware_store",
"hat_shop",
"hatchery",
"hayfield",
"hayloft",
"head_shop",
"hearth",
"heath",
"hedge_maze",
"attic",
"hedgerow",
"heliport",
"hen_yard",
"herb_garden",
"highway",
"hill",
"hillock",
"hockey",
"hollow",
"home_office",
"auditorium",
"home_theater",
"hoodoo",
"hospital",
"hospital_room",
"hot_spring",
"hot_tub_indoor",
"hot_tub_outdoor",
"hotel_breakfast_area",
"hotel_outdoor",
"hotel_room",
"auto_factory",
"house",
"housing_estate",
"housing_project",
"howdah",
"hunting_lodge_indoor",
"hunting_lodge_outdoor",
"hut",
"hutment",
"ice_cream_parlor",
"ice_floe",
"airfield",
"auto_mechanics_indoor",
"ice_shelf",
"ice_skating_rink_indoor",
"ice_skating_rink_outdoor",
"iceberg",
"igloo",
"imaret",
"incinerator_indoor",
"incinerator_outdoor",
"indoor_procenium",
"indoor_round",
"auto_mechanics_outdoor",
"indoor_seats",
"industrial_area",
"industrial_park",
"inlet",
"inn_indoor",
"inn_outdoor",
"insane_asylum",
"irrigation_ditch",
"islet",
"jacuzzi_indoor",
"auto_racing_paddock",
"jacuzzi_outdoor",
"jail_cell",
"jail_indoor",
"jail_outdoor",
"japanese_garden",
"jetty",
"jewelry_shop",
"joss_house",
"juke_joint",
"jungle",
"auto_showroom",
"junk_pile",
"junkyard",
"jury_box",
"kasbah",
"kennel_indoor",
"kennel_outdoor",
"kindergarden_classroom",
"kiosk_indoor",
"kiosk_outdoor",
"kitchen",
"awning_deck",
"kitchenette",
"kraal",
"lab_classroom",
"laboratorywet",
"labyrinth_indoor",
"labyrinth_outdoor",
"lagoon",
"landfill",
"landing",
"landing_deck",
"back_porch",
"landing_strip",
"laundromat",
"lava_flow",
"lavatory",
"lawn",
"layby",
"lean-to",
"lean-to_tent",
"lecture_room",
"legislative_chamber",
"backdrop",
"levee",
"library",
"library_indoor",
"library_outdoor",
"lido_deck_indoor",
"lido_deck_outdoor",
"lift_bridge",
"lighthouse",
"limousine_interior",
"liquor_store_indoor",
"backroom",
"liquor_store_outdoor",
"living_room",
"loading_dock",
"lobby",
"lock_chamber",
"locker_room",
"loft",
"loge",
"loggia_outdoor",
"lookout_station_indoor",
"backseat",
"lookout_station_outdoor",
"lower_deck",
"luggage_van",
"lumberyard_indoor",
"lumberyard_outdoor",
"lyceum",
"machine_shop",
"manhole",
"mansard",
"mansion",
"backstage",
"manufactured_home",
"market_indoor",
"market_outdoor",
"marsh",
"martial_arts_gym",
"massage_room",
"mastaba",
"maternity_ward",
"mausoleum",
"meadow",
"airlock",
"backstage_outdoor",
"meat_house",
"medina",
"megalith",
"menhir",
"mens_store_outdoor",
"mental_institution_indoor",
"mental_institution_outdoor",
"mesa",
"mesoamerican",
"mess_hall",
"backstairs",
"mews",
"mezzanine",
"military_headquarters",
"military_hospital",
"military_hut",
"military_tent",
"millpond",
"millrace",
"mine",
"mineral_bath",
"backstairs_indoor",
"mineshaft",
"mini_golf_course_indoor",
"mini_golf_course_outdoor",
"misc",
"mission",
"mobile_home",
"monastery_indoor",
"monastery_outdoor",
"moon_bounce",
"moor",
"backwoods",
"morgue",
"mosque_indoor",
"mosque_outdoor",
"motel",
"mountain",
"mountain_path",
"mountain_road",
"mountain_snowy",
"movie_theater_indoor",
"movie_theater_outdoor",
"badlands",
"mudflat",
"museum_indoor",
"museum_outdoor",
"music_store",
"music_studio",
"natural",
"natural_history_museum",
"natural_spring",
"naval_base",
"needleleaf",
"badminton_court_indoor",
"newsroom",
"newsstand_indoor",
"newsstand_outdoor",
"nightclub",
"nook",
"nuclear_power_plant_indoor",
"nuclear_power_plant_outdoor",
"nunnery",
"nursery",
"nursing_home",
"badminton_court_outdoor",
"nursing_home_outdoor",
"oasis",
"oast_house",
"observation_station",
"observatory_indoor",
"observatory_outdoor",
"observatory_post",
"ocean",
"ocean_deep",
"ocean_shallow",
"baggage_claim",
"office",
"office_building",
"office_cubicles",
"oil_refinery_indoor",
"oil_refinery_outdoor",
"oilrig",
"one-way_street",
"open-hearth_furnace",
"operating_room",
"operating_table",
"balcony_interior",
"optician",
"orchard",
"orchestra_pit",
"organ_loft_interior",
"orlop_deck",
"ossuary",
"outbuilding",
"outcropping",
"outhouse_indoor",
"outhouse_outdoor",
"ball_pit",
"outside",
"overpass",
"oyster_bar",
"oyster_farm",
"packaging_plant",
"pagoda",
"palace",
"palace_hall",
"palestra",
"pantry",
"airplane",
"ballet",
"paper_mill",
"parade_ground",
"park",
"parking_garage_indoor",
"parking_garage_outdoor",
"parking_lot",
"parkway",
"parlor",
"particle_accelerator",
"party_tent_indoor",
"ballroom",
"party_tent_outdoor",
"passenger_deck",
"pasture",
"patio",
"patio_indoor",
"pavement",
"pavilion",
"pawnshop",
"pawnshop_outdoor",
"pedestrian_overpass_indoor",
"balustrade",
"penalty_box",
"performance",
"perfume_shop",
"pet_shop",
"pharmacy",
"phone_booth",
"physics_laboratory",
"piano_store",
"picnic_area",
"pier",
"bamboo_forest",
"pig_farm",
"pilothouse_indoor",
"pilothouse_outdoor",
"pinetum",
"piste_road",
"pitchers_mound",
"pizzeria",
"pizzeria_outdoor",
"planetarium_indoor",
"planetarium_outdoor",
"bank_indoor",
"plantation_house",
"platform",
"playground",
"playroom",
"plaza",
"plunge",
"podium_indoor",
"podium_outdoor",
"police_station",
"pond",
"bank_outdoor",
"pontoon_bridge",
"poolroom_home",
"poop_deck",
"porch",
"portico",
"portrait_studio",
"postern",
"powder_room",
"power_plant_outdoor",
"preserve",
"bank_vault",
"print_shop",
"priory",
"promenade",
"promenade_deck",
"pub_indoor",
"pub_outdoor",
"pueblo",
"pulpit",
"pump_room",
"pumping_station",
"banquet_hall",
"putting_green",
"quadrangle",
"questionable",
"quicksand",
"quonset_hut_indoor",
"quonset_hut_outdoor",
"racecourse",
"raceway",
"raft",
"rail_indoor",
"baptistry_indoor",
"rail_outdoor",
"railroad_track",
"railway_yard",
"rainforest",
"ramp",
"ranch",
"ranch_house",
"reading_room",
"reception",
"reception_room",
"baptistry_outdoor",
"recreation_room",
"rectory",
"recycling_plant_indoor",
"recycling_plant_outdoor",
"refectory",
"repair_shop",
"residential_neighborhood",
"resort",
"rest_area",
"rest_stop",
"airplane_cabin",
"bar",
"restaurant",
"restaurant_kitchen",
"restaurant_patio",
"restroom_indoor",
"restroom_outdoor",
"retaining_wall",
"revolving_door",
"rice_paddy",
"riding_arena",
"rift_valley",
"barbeque",
"river",
"road",
"road_cut",
"road_indoor",
"road_outdoor",
"rock_arch",
"rock_garden",
"rodeo",
"roller_skating_rink_indoor",
"roller_skating_rink_outdoor",
"barbershop",
"rolling_mill",
"roof",
"roof_garden",
"room",
"root_cellar",
"rope_bridge",
"rotisserie",
"roundabout",
"roundhouse",
"rubble",
"barn",
"ruin",
"runway",
"sacristy",
"safari_park",
"salon",
"saloon",
"salt_plain",
"sanatorium",
"sand",
"sand_trap",
"barndoor",
"sandbar",
"sandbox",
"sauna",
"savanna",
"sawmill",
"schoolhouse",
"schoolyard",
"science_laboratory",
"science_museum",
"scriptorium",
"barnyard",
"scrubland",
"scullery",
"sea_cliff",
"seaside",
"seawall",
"security_check_point",
"semidesert",
"server_room",
"sewer",
"sewing_room",
"barrack",
"shed",
"shelter",
"shelter_deck",
"shelter_tent",
"shipping_room",
"shipyard_outdoor",
"shoe_shop",
"shop",
"shopfront",
"shopping_mall_indoor",
"barrel_storage",
"shopping_mall_outdoor",
"shore",
"shower",
"shower_room",
"shrine",
"shrubbery",
"sidewalk",
"signal_box",
"sinkhole",
"ski_jump",
"baseball",
"ski_lodge",
"ski_resort",
"ski_slope",
"sky",
"skyscraper",
"skywalk_indoor",
"skywalk_outdoor",
"slum",
"snack_bar",
"snowbank",
"baseball_field",
"snowfield",
"soccer",
"south_asia",
"spillway",
"sporting_goods_store",
"squash_court",
"stable",
"stadium_outdoor",
"stage_indoor",
"stage_outdoor",
"airport",
"basement",
"stage_set",
"staircase",
"stall",
"starting_gate",
"stateroom",
"station",
"steam_plant_outdoor",
"steel_mill_indoor",
"steel_mill_outdoor",
"stone_circle",
"basilica",
"storage_room",
"store",
"storm_cellar",
"street",
"streetcar_track",
"strip_mall",
"strip_mine",
"student_center",
"student_residence",
"study_hall",
"basin_outdoor",
"submarine_interior",
"subway_interior",
"sugar_refinery",
"sun_deck",
"sunroom",
"supermarket",
"supply_chamber",
"sushi_bar",
"swamp",
"swimming_hole",
"basketball",
"swimming_pool_indoor",
"swimming_pool_outdoor",
"synagogue_indoor",
"synagogue_outdoor",
"t-bar_lift",
"tannery",
"taxistand",
"taxiway",
"tea_garden",
"teahouse",
"basketball_court_indoor",
"tearoom",
"teashop",
"television_room",
"television_studio",
"tennis_court_indoor",
"tennis_court_outdoor",
"tent_outdoor",
"terrace_farm",
"theater_outdoor",
"threshing_floor",
"basketball_court_outdoor",
"thriftshop",
"throne_room",
"ticket_booth",
"ticket_window_indoor",
"tidal_basin",
"tidal_river",
"tiltyard",
"tobacco_shop_indoor",
"toll_plaza",
"tollbooth",
"bath_indoor",
"tollgate",
"tomb",
"topiary_garden",
"tower",
"town_house",
"toyshop",
"track_outdoor",
"tract_housing",
"trading_floor",
"traffic_island",
"bath_outdoor",
"trailer_park",
"train_interior",
"train_railway",
"train_station_outdoor",
"tree_farm",
"tree_house",
"trellis",
"trench",
"trestle_bridge",
"truck_stop",
"bathhouse",
"tundra",
"turkish_bath",
"upper_balcony",
"urban",
"utility_room",
"valley",
"van_interior",
"vat",
"vegetable_garden",
"vegetation",
"bathhouse_outdoor",
"vehicle",
"velodrome_indoor",
"velodrome_outdoor",
"ventilation_shaft",
"veranda",
"vestibule",
"vestry",
"veterinarians_office",
"viaduct",
"videostore"
] |
adam-bourne/food-classifier
|
# food-classifier
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the food101 dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.2136
- Validation Loss: 0.2284
- Train Accuracy: 0.94
- Epoch: 4
## Model description
This is an image classification model fine-tuned from the Google Vision Transformer (ViT) to classify images of food.
## Intended uses & limitations
For messing around!
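For example, a hedged TF inference sketch, assuming the fine-tuned weights live on the Hub under this repo id:
```python
import tensorflow as tf
from PIL import Image
from transformers import AutoImageProcessor, TFAutoModelForImageClassification

repo = "adam-bourne/food-classifier"
processor = AutoImageProcessor.from_pretrained(repo)
model = TFAutoModelForImageClassification.from_pretrained(repo)

image = Image.open("pizza.jpg")  # hypothetical local image
inputs = processor(images=image, return_tensors="tf")
logits = model(**inputs).logits
print(model.config.id2label[int(tf.argmax(logits, axis=-1)[0])])
```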
## Training and evaluation data
The dataset covered 101 food classes across 101,000 images; the train/eval split was 80/20.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 20000, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
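The optimizer dictionary above matches what `transformers.create_optimizer` builds for TensorFlow; a sketch reconstructing it (the zero warmup step count is an assumption, since no warmup appears in the config):
```python
from transformers import create_optimizer

# AdamWeightDecay with a linear (power=1.0) PolynomialDecay from 3e-5 down to 0
optimizer, lr_schedule = create_optimizer(
    init_lr=3e-5,
    num_train_steps=20_000,  # decay_steps from the config above
    num_warmup_steps=0,      # assumption: the config lists no warmup
    weight_decay_rate=0.01,
)
```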
### Training results
| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 0.3409 | 0.2903 | 0.932 | 0 |
| 0.2838 | 0.2897 | 0.917 | 1 |
| 0.2415 | 0.2869 | 0.914 | 2 |
| 0.2143 | 0.2630 | 0.924 | 3 |
| 0.2136 | 0.2284 | 0.94 | 4 |
### Framework versions
- Transformers 4.30.2
- TensorFlow 2.12.0
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"apple_pie",
"baby_back_ribs",
"bruschetta",
"waffles",
"caesar_salad",
"cannoli",
"caprese_salad",
"carrot_cake",
"ceviche",
"cheesecake",
"cheese_plate",
"chicken_curry",
"chicken_quesadilla",
"baklava",
"chicken_wings",
"chocolate_cake",
"chocolate_mousse",
"churros",
"clam_chowder",
"club_sandwich",
"crab_cakes",
"creme_brulee",
"croque_madame",
"cup_cakes",
"beef_carpaccio",
"deviled_eggs",
"donuts",
"dumplings",
"edamame",
"eggs_benedict",
"escargots",
"falafel",
"filet_mignon",
"fish_and_chips",
"foie_gras",
"beef_tartare",
"french_fries",
"french_onion_soup",
"french_toast",
"fried_calamari",
"fried_rice",
"frozen_yogurt",
"garlic_bread",
"gnocchi",
"greek_salad",
"grilled_cheese_sandwich",
"beet_salad",
"grilled_salmon",
"guacamole",
"gyoza",
"hamburger",
"hot_and_sour_soup",
"hot_dog",
"huevos_rancheros",
"hummus",
"ice_cream",
"lasagna",
"beignets",
"lobster_bisque",
"lobster_roll_sandwich",
"macaroni_and_cheese",
"macarons",
"miso_soup",
"mussels",
"nachos",
"omelette",
"onion_rings",
"oysters",
"bibimbap",
"pad_thai",
"paella",
"pancakes",
"panna_cotta",
"peking_duck",
"pho",
"pizza",
"pork_chop",
"poutine",
"prime_rib",
"bread_pudding",
"pulled_pork_sandwich",
"ramen",
"ravioli",
"red_velvet_cake",
"risotto",
"samosa",
"sashimi",
"scallops",
"seaweed_salad",
"shrimp_and_grits",
"breakfast_burrito",
"spaghetti_bolognese",
"spaghetti_carbonara",
"spring_rolls",
"steak",
"strawberry_shortcake",
"sushi",
"tacos",
"takoyaki",
"tiramisu",
"tuna_tartare"
] |
bgirardot/my_awesome_food_model_v2
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_food_model_v2
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the food101 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.9051
- Accuracy: 0.965
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 3.6221 | 1.0 | 25 | 2.8425 | 0.875 |
| 2.195 | 2.0 | 50 | 2.0761 | 0.955 |
| 1.9134 | 3.0 | 75 | 1.9051 | 0.965 |
### Framework versions
- Transformers 4.29.2
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.13.2
|
[
"apple_pie",
"baby_back_ribs",
"bruschetta",
"waffles",
"caesar_salad",
"cannoli",
"caprese_salad",
"carrot_cake",
"ceviche",
"cheesecake",
"cheese_plate",
"chicken_curry",
"chicken_quesadilla",
"baklava",
"chicken_wings",
"chocolate_cake",
"chocolate_mousse",
"churros",
"clam_chowder",
"club_sandwich",
"crab_cakes",
"creme_brulee",
"croque_madame",
"cup_cakes",
"beef_carpaccio",
"deviled_eggs",
"donuts",
"dumplings",
"edamame",
"eggs_benedict",
"escargots",
"falafel",
"filet_mignon",
"fish_and_chips",
"foie_gras",
"beef_tartare",
"french_fries",
"french_onion_soup",
"french_toast",
"fried_calamari",
"fried_rice",
"frozen_yogurt",
"garlic_bread",
"gnocchi",
"greek_salad",
"grilled_cheese_sandwich",
"beet_salad",
"grilled_salmon",
"guacamole",
"gyoza",
"hamburger",
"hot_and_sour_soup",
"hot_dog",
"huevos_rancheros",
"hummus",
"ice_cream",
"lasagna",
"beignets",
"lobster_bisque",
"lobster_roll_sandwich",
"macaroni_and_cheese",
"macarons",
"miso_soup",
"mussels",
"nachos",
"omelette",
"onion_rings",
"oysters",
"bibimbap",
"pad_thai",
"paella",
"pancakes",
"panna_cotta",
"peking_duck",
"pho",
"pizza",
"pork_chop",
"poutine",
"prime_rib",
"bread_pudding",
"pulled_pork_sandwich",
"ramen",
"ravioli",
"red_velvet_cake",
"risotto",
"samosa",
"sashimi",
"scallops",
"seaweed_salad",
"shrimp_and_grits",
"breakfast_burrito",
"spaghetti_bolognese",
"spaghetti_carbonara",
"spring_rolls",
"steak",
"strawberry_shortcake",
"sushi",
"tacos",
"takoyaki",
"tiramisu",
"tuna_tartare"
] |
rshrott/vit-base-renovation
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-renovation
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the renovations dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0025
- Accuracy: 0.6667
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.382 | 0.2 | 25 | 1.1103 | 0.6073 |
| 0.5741 | 0.4 | 50 | 1.0628 | 0.6210 |
| 0.5589 | 0.6 | 75 | 1.0025 | 0.6667 |
| 0.4074 | 0.81 | 100 | 1.1324 | 0.6073 |
| 0.3581 | 1.01 | 125 | 1.1935 | 0.6438 |
| 0.2618 | 1.21 | 150 | 1.8300 | 0.5023 |
| 0.1299 | 1.41 | 175 | 1.2577 | 0.6301 |
| 0.2562 | 1.61 | 200 | 1.0924 | 0.6895 |
| 0.2573 | 1.81 | 225 | 1.1285 | 0.6849 |
| 0.2471 | 2.02 | 250 | 1.3387 | 0.6256 |
| 0.0618 | 2.22 | 275 | 1.2246 | 0.6667 |
| 0.0658 | 2.42 | 300 | 1.4132 | 0.6347 |
| 0.0592 | 2.62 | 325 | 1.4326 | 0.6530 |
| 0.0464 | 2.82 | 350 | 1.2484 | 0.6849 |
| 0.0567 | 3.02 | 375 | 1.5350 | 0.6347 |
| 0.0269 | 3.23 | 400 | 1.4797 | 0.6667 |
| 0.0239 | 3.43 | 425 | 1.4444 | 0.6530 |
| 0.0184 | 3.63 | 450 | 1.4474 | 0.6575 |
| 0.0286 | 3.83 | 475 | 1.4621 | 0.6667 |
### Framework versions
- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
|
[
"not applicable",
"very poor",
"poor",
"fair",
"good",
"excellent",
"exceptional"
] |
jordyvl/vit-small_tobacco3482_og_simkd_
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_tobacco3482_og_simkd_
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 212.9178
- Accuracy: 0.855
- Brier Loss: 0.2563
- Nll: 1.4722
- F1 Micro: 0.855
- F1 Macro: 0.8333
- Ece: 0.1253
- Aurc: 0.0422
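Besides accuracy, the card tracks calibration-oriented metrics (Brier loss, NLL, ECE, AURC). A minimal NumPy sketch of how the Brier loss and ECE columns can be computed from softmax probabilities (the 10-bin choice for ECE is an assumption):
```python
import numpy as np

def brier_loss(probs, labels):
    """Mean squared error between softmax outputs and one-hot labels."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=10):
    """Weighted average gap between confidence and accuracy per bin."""
    conf, pred = probs.max(axis=1), probs.argmax(axis=1)
    ece = 0.0
    for lo in np.linspace(0.0, 1.0, n_bins, endpoint=False):
        mask = (conf > lo) & (conf <= lo + 1.0 / n_bins)
        if mask.any():
            ece += mask.mean() * abs((pred[mask] == labels[mask]).mean() - conf[mask].mean())
    return ece
```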
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
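These settings map onto a Hugging Face `TrainingArguments` configuration roughly as sketched below (the output directory is a placeholder; the model/data wiring is omitted):
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-small_tobacco3482_og_simkd_",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```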
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 219.2297 | 0.145 | 0.8895 | 6.5562 | 0.145 | 0.0567 | 0.2109 | 0.7661 |
| No log | 2.0 | 50 | 217.9786 | 0.49 | 0.6839 | 2.2035 | 0.49 | 0.4113 | 0.3237 | 0.3097 |
| No log | 3.0 | 75 | 216.6085 | 0.565 | 0.5671 | 1.7658 | 0.565 | 0.4471 | 0.2471 | 0.2239 |
| No log | 4.0 | 100 | 216.0210 | 0.68 | 0.4722 | 1.8682 | 0.68 | 0.5586 | 0.2322 | 0.1557 |
| No log | 5.0 | 125 | 215.5695 | 0.68 | 0.4668 | 1.9385 | 0.68 | 0.5570 | 0.2289 | 0.1440 |
| No log | 6.0 | 150 | 215.3762 | 0.745 | 0.3963 | 2.1043 | 0.745 | 0.6608 | 0.1949 | 0.0976 |
| No log | 7.0 | 175 | 214.8964 | 0.745 | 0.3675 | 1.7226 | 0.745 | 0.6693 | 0.1765 | 0.0949 |
| No log | 8.0 | 200 | 215.0440 | 0.735 | 0.3838 | 1.9180 | 0.735 | 0.6935 | 0.2056 | 0.0958 |
| No log | 9.0 | 225 | 214.7017 | 0.775 | 0.3466 | 1.5816 | 0.775 | 0.6897 | 0.1756 | 0.0661 |
| No log | 10.0 | 250 | 214.6309 | 0.775 | 0.3505 | 1.6245 | 0.775 | 0.7604 | 0.1828 | 0.0763 |
| No log | 11.0 | 275 | 214.6275 | 0.735 | 0.4314 | 2.3367 | 0.735 | 0.7342 | 0.2234 | 0.1203 |
| No log | 12.0 | 300 | 214.5664 | 0.75 | 0.3769 | 1.7889 | 0.75 | 0.7420 | 0.1873 | 0.1070 |
| No log | 13.0 | 325 | 214.6764 | 0.735 | 0.4425 | 2.3533 | 0.735 | 0.7404 | 0.2267 | 0.1508 |
| No log | 14.0 | 350 | 214.5261 | 0.805 | 0.3093 | 1.8504 | 0.805 | 0.7870 | 0.1732 | 0.0580 |
| No log | 15.0 | 375 | 214.4932 | 0.79 | 0.3255 | 1.4649 | 0.79 | 0.7575 | 0.1796 | 0.0543 |
| No log | 16.0 | 400 | 214.3134 | 0.85 | 0.2467 | 1.4769 | 0.85 | 0.8388 | 0.1149 | 0.0513 |
| No log | 17.0 | 425 | 214.3825 | 0.82 | 0.2845 | 1.4858 | 0.82 | 0.8014 | 0.1445 | 0.0540 |
| No log | 18.0 | 450 | 214.2077 | 0.85 | 0.2681 | 1.4891 | 0.85 | 0.8406 | 0.1462 | 0.0684 |
| No log | 19.0 | 475 | 214.1675 | 0.845 | 0.2623 | 1.5311 | 0.845 | 0.8329 | 0.1414 | 0.0485 |
| 220.0633 | 20.0 | 500 | 214.1433 | 0.84 | 0.2663 | 1.5269 | 0.8400 | 0.8182 | 0.1302 | 0.0562 |
| 220.0633 | 21.0 | 525 | 214.0829 | 0.805 | 0.3293 | 2.0021 | 0.805 | 0.8019 | 0.1710 | 0.0833 |
| 220.0633 | 22.0 | 550 | 213.9282 | 0.84 | 0.2586 | 1.5127 | 0.8400 | 0.8205 | 0.1397 | 0.0453 |
| 220.0633 | 23.0 | 575 | 213.9303 | 0.87 | 0.2260 | 1.4450 | 0.87 | 0.8552 | 0.1205 | 0.0365 |
| 220.0633 | 24.0 | 600 | 213.9140 | 0.84 | 0.2620 | 1.5244 | 0.8400 | 0.8161 | 0.1462 | 0.0490 |
| 220.0633 | 25.0 | 625 | 213.7616 | 0.86 | 0.2306 | 1.5288 | 0.8600 | 0.8409 | 0.1215 | 0.0361 |
| 220.0633 | 26.0 | 650 | 213.7738 | 0.845 | 0.2431 | 1.5303 | 0.845 | 0.8271 | 0.1335 | 0.0443 |
| 220.0633 | 27.0 | 675 | 213.8470 | 0.85 | 0.2427 | 1.3459 | 0.85 | 0.8275 | 0.1296 | 0.0445 |
| 220.0633 | 28.0 | 700 | 213.7198 | 0.85 | 0.2381 | 1.3868 | 0.85 | 0.8328 | 0.1267 | 0.0424 |
| 220.0633 | 29.0 | 725 | 213.6302 | 0.855 | 0.2293 | 1.4191 | 0.855 | 0.8361 | 0.1157 | 0.0394 |
| 220.0633 | 30.0 | 750 | 213.6385 | 0.85 | 0.2424 | 1.5410 | 0.85 | 0.8334 | 0.1339 | 0.0464 |
| 220.0633 | 31.0 | 775 | 213.6397 | 0.865 | 0.2234 | 1.4012 | 0.865 | 0.8464 | 0.1226 | 0.0402 |
| 220.0633 | 32.0 | 800 | 213.6658 | 0.86 | 0.2271 | 1.3863 | 0.8600 | 0.8470 | 0.1164 | 0.0354 |
| 220.0633 | 33.0 | 825 | 213.6526 | 0.85 | 0.2448 | 1.5357 | 0.85 | 0.8292 | 0.1214 | 0.0397 |
| 220.0633 | 34.0 | 850 | 213.5407 | 0.855 | 0.2282 | 1.3470 | 0.855 | 0.8405 | 0.1245 | 0.0393 |
| 220.0633 | 35.0 | 875 | 213.6166 | 0.83 | 0.2624 | 1.4288 | 0.83 | 0.8102 | 0.1415 | 0.0458 |
| 220.0633 | 36.0 | 900 | 213.5887 | 0.84 | 0.2613 | 1.3928 | 0.8400 | 0.8135 | 0.1298 | 0.0442 |
| 220.0633 | 37.0 | 925 | 213.4976 | 0.845 | 0.2338 | 1.3784 | 0.845 | 0.8244 | 0.1319 | 0.0355 |
| 220.0633 | 38.0 | 950 | 213.4554 | 0.85 | 0.2374 | 1.3680 | 0.85 | 0.8323 | 0.1192 | 0.0385 |
| 220.0633 | 39.0 | 975 | 213.4758 | 0.845 | 0.2319 | 1.4895 | 0.845 | 0.8274 | 0.1185 | 0.0385 |
| 217.7609 | 40.0 | 1000 | 213.4440 | 0.845 | 0.2432 | 1.3737 | 0.845 | 0.8265 | 0.1310 | 0.0415 |
| 217.7609 | 41.0 | 1025 | 213.4492 | 0.845 | 0.2385 | 1.4970 | 0.845 | 0.8297 | 0.1207 | 0.0373 |
| 217.7609 | 42.0 | 1050 | 213.4319 | 0.85 | 0.2384 | 1.3580 | 0.85 | 0.8276 | 0.1250 | 0.0383 |
| 217.7609 | 43.0 | 1075 | 213.3094 | 0.855 | 0.2287 | 1.4375 | 0.855 | 0.8334 | 0.1188 | 0.0353 |
| 217.7609 | 44.0 | 1100 | 213.3809 | 0.845 | 0.2514 | 1.4969 | 0.845 | 0.8250 | 0.1318 | 0.0435 |
| 217.7609 | 45.0 | 1125 | 213.3981 | 0.85 | 0.2478 | 1.6052 | 0.85 | 0.8287 | 0.1268 | 0.0408 |
| 217.7609 | 46.0 | 1150 | 213.3004 | 0.86 | 0.2292 | 1.3632 | 0.8600 | 0.8430 | 0.1180 | 0.0355 |
| 217.7609 | 47.0 | 1175 | 213.3041 | 0.86 | 0.2363 | 1.3407 | 0.8600 | 0.8444 | 0.1235 | 0.0359 |
| 217.7609 | 48.0 | 1200 | 213.2955 | 0.845 | 0.2462 | 1.5071 | 0.845 | 0.8179 | 0.1253 | 0.0396 |
| 217.7609 | 49.0 | 1225 | 213.2531 | 0.85 | 0.2433 | 1.2946 | 0.85 | 0.8277 | 0.1270 | 0.0392 |
| 217.7609 | 50.0 | 1250 | 213.2612 | 0.845 | 0.2378 | 1.2852 | 0.845 | 0.8193 | 0.1281 | 0.0361 |
| 217.7609 | 51.0 | 1275 | 213.2246 | 0.855 | 0.2370 | 1.5829 | 0.855 | 0.8393 | 0.1234 | 0.0357 |
| 217.7609 | 52.0 | 1300 | 213.1795 | 0.845 | 0.2431 | 1.4923 | 0.845 | 0.8300 | 0.1280 | 0.0372 |
| 217.7609 | 53.0 | 1325 | 213.2721 | 0.855 | 0.2467 | 1.5096 | 0.855 | 0.8333 | 0.1248 | 0.0385 |
| 217.7609 | 54.0 | 1350 | 213.1976 | 0.85 | 0.2453 | 1.4167 | 0.85 | 0.8275 | 0.1240 | 0.0384 |
| 217.7609 | 55.0 | 1375 | 213.2822 | 0.845 | 0.2430 | 1.4438 | 0.845 | 0.8193 | 0.1283 | 0.0396 |
| 217.7609 | 56.0 | 1400 | 213.1443 | 0.85 | 0.2479 | 1.5246 | 0.85 | 0.8277 | 0.1304 | 0.0389 |
| 217.7609 | 57.0 | 1425 | 213.1679 | 0.85 | 0.2455 | 1.4468 | 0.85 | 0.8291 | 0.1224 | 0.0387 |
| 217.7609 | 58.0 | 1450 | 213.1116 | 0.85 | 0.2467 | 1.4372 | 0.85 | 0.8287 | 0.1269 | 0.0378 |
| 217.7609 | 59.0 | 1475 | 213.1005 | 0.85 | 0.2490 | 1.4214 | 0.85 | 0.8271 | 0.1316 | 0.0392 |
| 217.1217 | 60.0 | 1500 | 213.1516 | 0.855 | 0.2425 | 1.4600 | 0.855 | 0.8343 | 0.1316 | 0.0369 |
| 217.1217 | 61.0 | 1525 | 213.1205 | 0.855 | 0.2458 | 1.4436 | 0.855 | 0.8303 | 0.1197 | 0.0409 |
| 217.1217 | 62.0 | 1550 | 213.1318 | 0.85 | 0.2488 | 1.4405 | 0.85 | 0.8275 | 0.1304 | 0.0378 |
| 217.1217 | 63.0 | 1575 | 213.0243 | 0.855 | 0.2521 | 1.5810 | 0.855 | 0.8328 | 0.1341 | 0.0447 |
| 217.1217 | 64.0 | 1600 | 213.1191 | 0.84 | 0.2567 | 1.4478 | 0.8400 | 0.8185 | 0.1292 | 0.0436 |
| 217.1217 | 65.0 | 1625 | 213.0329 | 0.855 | 0.2528 | 1.3910 | 0.855 | 0.8333 | 0.1311 | 0.0404 |
| 217.1217 | 66.0 | 1650 | 212.9868 | 0.85 | 0.2525 | 1.4652 | 0.85 | 0.8275 | 0.1226 | 0.0408 |
| 217.1217 | 67.0 | 1675 | 213.0856 | 0.84 | 0.2561 | 1.4601 | 0.8400 | 0.8178 | 0.1367 | 0.0419 |
| 217.1217 | 68.0 | 1700 | 213.0379 | 0.845 | 0.2544 | 1.5222 | 0.845 | 0.8216 | 0.1362 | 0.0426 |
| 217.1217 | 69.0 | 1725 | 213.0535 | 0.835 | 0.2606 | 1.5085 | 0.835 | 0.8093 | 0.1346 | 0.0445 |
| 217.1217 | 70.0 | 1750 | 213.0247 | 0.85 | 0.2530 | 1.4349 | 0.85 | 0.8274 | 0.1373 | 0.0427 |
| 217.1217 | 71.0 | 1775 | 213.0161 | 0.855 | 0.2510 | 1.4529 | 0.855 | 0.8333 | 0.1212 | 0.0411 |
| 217.1217 | 72.0 | 1800 | 213.0249 | 0.845 | 0.2494 | 1.4511 | 0.845 | 0.8229 | 0.1358 | 0.0412 |
| 217.1217 | 73.0 | 1825 | 213.0014 | 0.85 | 0.2548 | 1.4435 | 0.85 | 0.8264 | 0.1277 | 0.0390 |
| 217.1217 | 74.0 | 1850 | 213.0011 | 0.845 | 0.2527 | 1.3719 | 0.845 | 0.8206 | 0.1360 | 0.0379 |
| 217.1217 | 75.0 | 1875 | 213.0240 | 0.845 | 0.2576 | 1.4072 | 0.845 | 0.8221 | 0.1284 | 0.0425 |
| 217.1217 | 76.0 | 1900 | 212.9793 | 0.845 | 0.2534 | 1.4026 | 0.845 | 0.8212 | 0.1241 | 0.0404 |
| 217.1217 | 77.0 | 1925 | 212.9800 | 0.85 | 0.2514 | 1.5023 | 0.85 | 0.8271 | 0.1279 | 0.0407 |
| 217.1217 | 78.0 | 1950 | 212.9125 | 0.845 | 0.2564 | 1.4258 | 0.845 | 0.8211 | 0.1298 | 0.0427 |
| 217.1217 | 79.0 | 1975 | 212.9454 | 0.85 | 0.2527 | 1.5227 | 0.85 | 0.8271 | 0.1279 | 0.0423 |
| 216.765 | 80.0 | 2000 | 212.9475 | 0.845 | 0.2551 | 1.5025 | 0.845 | 0.8206 | 0.1311 | 0.0423 |
| 216.765 | 81.0 | 2025 | 212.9739 | 0.84 | 0.2567 | 1.5305 | 0.8400 | 0.8162 | 0.1294 | 0.0431 |
| 216.765 | 82.0 | 2050 | 212.9351 | 0.855 | 0.2526 | 1.5373 | 0.855 | 0.8339 | 0.1277 | 0.0401 |
| 216.765 | 83.0 | 2075 | 213.0053 | 0.845 | 0.2560 | 1.4724 | 0.845 | 0.8228 | 0.1341 | 0.0417 |
| 216.765 | 84.0 | 2100 | 212.9326 | 0.845 | 0.2568 | 1.5217 | 0.845 | 0.8206 | 0.1303 | 0.0472 |
| 216.765 | 85.0 | 2125 | 212.9555 | 0.855 | 0.2537 | 1.5265 | 0.855 | 0.8339 | 0.1233 | 0.0416 |
| 216.765 | 86.0 | 2150 | 212.9121 | 0.85 | 0.2534 | 1.5224 | 0.85 | 0.8280 | 0.1283 | 0.0398 |
| 216.765 | 87.0 | 2175 | 212.8850 | 0.845 | 0.2551 | 1.4480 | 0.845 | 0.8221 | 0.1328 | 0.0412 |
| 216.765 | 88.0 | 2200 | 212.9121 | 0.855 | 0.2518 | 1.5069 | 0.855 | 0.8339 | 0.1234 | 0.0404 |
| 216.765 | 89.0 | 2225 | 212.9327 | 0.845 | 0.2517 | 1.4532 | 0.845 | 0.8206 | 0.1231 | 0.0401 |
| 216.765 | 90.0 | 2250 | 212.9305 | 0.85 | 0.2542 | 1.4506 | 0.85 | 0.8271 | 0.1374 | 0.0398 |
| 216.765 | 91.0 | 2275 | 212.9274 | 0.85 | 0.2567 | 1.5045 | 0.85 | 0.8280 | 0.1297 | 0.0419 |
| 216.765 | 92.0 | 2300 | 212.8962 | 0.85 | 0.2545 | 1.4956 | 0.85 | 0.8280 | 0.1261 | 0.0405 |
| 216.765 | 93.0 | 2325 | 212.9133 | 0.845 | 0.2567 | 1.5274 | 0.845 | 0.8212 | 0.1291 | 0.0431 |
| 216.765 | 94.0 | 2350 | 212.8708 | 0.85 | 0.2576 | 1.4410 | 0.85 | 0.8280 | 0.1302 | 0.0410 |
| 216.765 | 95.0 | 2375 | 212.9661 | 0.855 | 0.2546 | 1.3988 | 0.855 | 0.8339 | 0.1248 | 0.0404 |
| 216.765 | 96.0 | 2400 | 212.9099 | 0.855 | 0.2547 | 1.5096 | 0.855 | 0.8333 | 0.1256 | 0.0402 |
| 216.765 | 97.0 | 2425 | 212.9668 | 0.85 | 0.2549 | 1.5337 | 0.85 | 0.8271 | 0.1289 | 0.0390 |
| 216.765 | 98.0 | 2450 | 212.9587 | 0.845 | 0.2545 | 1.5161 | 0.845 | 0.8215 | 0.1304 | 0.0412 |
| 216.765 | 99.0 | 2475 | 212.9395 | 0.855 | 0.2554 | 1.4606 | 0.855 | 0.8333 | 0.1253 | 0.0410 |
| 216.6085 | 100.0 | 2500 | 212.9178 | 0.855 | 0.2563 | 1.4722 | 0.855 | 0.8333 | 0.1253 | 0.0422 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
hbenitez/AV_classifier_resnet50
|
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# hbenitez/AV_classifier_resnet50
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.8261
- Validation Loss: 2.4425
- Train Accuracy: 0.6
- Epoch: 99
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 8000, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 8.2326 | 8.1060 | 0.0 | 0 |
| 8.3420 | 7.6394 | 0.05 | 1 |
| 7.9643 | 7.5706 | 0.05 | 2 |
| 7.9337 | 7.6265 | 0.05 | 3 |
| 7.8018 | 7.7736 | 0.05 | 4 |
| 7.8009 | 7.7905 | 0.05 | 5 |
| 7.6369 | 7.6354 | 0.05 | 6 |
| 7.4782 | 7.5608 | 0.05 | 7 |
| 7.3655 | 7.6271 | 0.05 | 8 |
| 7.2886 | 7.6028 | 0.0 | 9 |
| 7.1145 | 7.5211 | 0.0 | 10 |
| 7.1232 | 7.2993 | 0.0 | 11 |
| 6.8393 | 7.2079 | 0.0 | 12 |
| 6.8202 | 7.2143 | 0.0 | 13 |
| 6.7180 | 7.1236 | 0.05 | 14 |
| 6.7318 | 7.1061 | 0.0 | 15 |
| 6.4563 | 6.9758 | 0.05 | 16 |
| 6.3765 | 6.9413 | 0.05 | 17 |
| 6.1791 | 6.8315 | 0.05 | 18 |
| 6.1946 | 6.7703 | 0.05 | 19 |
| 5.8448 | 6.7431 | 0.1 | 20 |
| 5.8514 | 6.6876 | 0.1 | 21 |
| 5.8200 | 6.6353 | 0.05 | 22 |
| 5.8323 | 6.5814 | 0.05 | 23 |
| 5.5553 | 6.4306 | 0.05 | 24 |
| 5.4999 | 6.4455 | 0.05 | 25 |
| 5.4370 | 6.3026 | 0.05 | 26 |
| 5.2288 | 6.0093 | 0.1 | 27 |
| 5.2173 | 6.0593 | 0.05 | 28 |
| 5.2280 | 6.0598 | 0.05 | 29 |
| 5.0484 | 5.9769 | 0.05 | 30 |
| 4.8703 | 5.8336 | 0.05 | 31 |
| 4.9881 | 5.7711 | 0.1 | 32 |
| 4.5905 | 5.6685 | 0.1 | 33 |
| 4.7240 | 5.6156 | 0.15 | 34 |
| 4.5095 | 5.4680 | 0.15 | 35 |
| 4.2225 | 5.3962 | 0.15 | 36 |
| 4.3615 | 5.3290 | 0.2 | 37 |
| 4.1862 | 5.3602 | 0.15 | 38 |
| 3.9455 | 5.2635 | 0.15 | 39 |
| 3.9737 | 5.2337 | 0.15 | 40 |
| 4.0922 | 5.1268 | 0.15 | 41 |
| 3.6042 | 4.9972 | 0.2 | 42 |
| 3.7219 | 4.8787 | 0.15 | 43 |
| 3.5563 | 4.9075 | 0.2 | 44 |
| 3.5897 | 4.9157 | 0.25 | 45 |
| 3.5769 | 4.7936 | 0.25 | 46 |
| 3.6225 | 4.8689 | 0.2 | 47 |
| 3.4568 | 4.8767 | 0.2 | 48 |
| 3.1431 | 4.7520 | 0.25 | 49 |
| 3.0607 | 4.5815 | 0.3 | 50 |
| 2.8904 | 4.5007 | 0.2 | 51 |
| 2.8308 | 4.5054 | 0.25 | 52 |
| 2.8136 | 4.2745 | 0.25 | 53 |
| 2.6192 | 4.3300 | 0.2 | 54 |
| 2.5308 | 4.3180 | 0.2 | 55 |
| 2.5192 | 4.2706 | 0.2 | 56 |
| 2.5761 | 4.1395 | 0.25 | 57 |
| 2.3516 | 3.9031 | 0.3 | 58 |
| 2.3231 | 3.8172 | 0.35 | 59 |
| 2.2735 | 3.7651 | 0.35 | 60 |
| 2.1215 | 3.8034 | 0.35 | 61 |
| 2.3229 | 3.8096 | 0.35 | 62 |
| 2.2230 | 3.7000 | 0.35 | 63 |
| 1.9059 | 3.6666 | 0.25 | 64 |
| 2.0289 | 3.6743 | 0.25 | 65 |
| 1.9178 | 3.5819 | 0.3 | 66 |
| 2.0295 | 3.5087 | 0.35 | 67 |
| 1.6499 | 3.4962 | 0.4 | 68 |
| 1.6261 | 3.4146 | 0.3 | 69 |
| 1.7059 | 3.4097 | 0.35 | 70 |
| 1.4837 | 3.2702 | 0.35 | 71 |
| 1.3766 | 3.2214 | 0.4 | 72 |
| 1.5898 | 3.2674 | 0.4 | 73 |
| 1.5002 | 3.1907 | 0.4 | 74 |
| 1.2641 | 3.1176 | 0.4 | 75 |
| 1.3456 | 3.1562 | 0.4 | 76 |
| 1.2655 | 2.9548 | 0.5 | 77 |
| 1.5449 | 2.8738 | 0.5 | 78 |
| 1.2519 | 2.8336 | 0.45 | 79 |
| 1.0682 | 2.8478 | 0.35 | 80 |
| 1.1891 | 2.8408 | 0.5 | 81 |
| 1.2920 | 2.6254 | 0.5 | 82 |
| 1.1239 | 2.7507 | 0.5 | 83 |
| 1.0857 | 2.7772 | 0.4 | 84 |
| 0.9821 | 2.8372 | 0.45 | 85 |
| 1.0457 | 2.8636 | 0.45 | 86 |
| 1.1419 | 2.8426 | 0.45 | 87 |
| 1.0782 | 2.7856 | 0.5 | 88 |
| 0.9906 | 2.6826 | 0.55 | 89 |
| 1.0766 | 2.6707 | 0.5 | 90 |
| 1.1115 | 2.6457 | 0.5 | 91 |
| 1.2201 | 2.6838 | 0.55 | 92 |
| 0.8706 | 2.5262 | 0.55 | 93 |
| 0.7441 | 2.5422 | 0.55 | 94 |
| 0.9710 | 2.4211 | 0.6 | 95 |
| 0.9731 | 2.4090 | 0.6 | 96 |
| 0.8942 | 2.3773 | 0.6 | 97 |
| 1.0461 | 2.4159 | 0.55 | 98 |
| 0.8261 | 2.4425 | 0.6 | 99 |
### Framework versions
- Transformers 4.30.2
- TensorFlow 2.13.0-rc2
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"tench, tinca tinca",
"goldfish, carassius auratus",
"great white shark, white shark, man-eater, man-eating shark, carcharodon carcharias",
"tiger shark, galeocerdo cuvieri",
"hammerhead, hammerhead shark",
"electric ray, crampfish, numbfish, torpedo",
"stingray",
"cock",
"hen",
"ostrich, struthio camelus",
"brambling, fringilla montifringilla",
"goldfinch, carduelis carduelis",
"house finch, linnet, carpodacus mexicanus",
"junco, snowbird",
"indigo bunting, indigo finch, indigo bird, passerina cyanea",
"robin, american robin, turdus migratorius",
"bulbul",
"jay",
"magpie",
"chickadee",
"water ouzel, dipper",
"kite",
"bald eagle, american eagle, haliaeetus leucocephalus",
"vulture",
"great grey owl, great gray owl, strix nebulosa",
"european fire salamander, salamandra salamandra",
"common newt, triturus vulgaris",
"eft",
"spotted salamander, ambystoma maculatum",
"axolotl, mud puppy, ambystoma mexicanum",
"bullfrog, rana catesbeiana",
"tree frog, tree-frog",
"tailed frog, bell toad, ribbed toad, tailed toad, ascaphus trui",
"loggerhead, loggerhead turtle, caretta caretta",
"leatherback turtle, leatherback, leathery turtle, dermochelys coriacea",
"mud turtle",
"terrapin",
"box turtle, box tortoise",
"banded gecko",
"common iguana, iguana, iguana iguana",
"american chameleon, anole, anolis carolinensis",
"whiptail, whiptail lizard",
"agama",
"frilled lizard, chlamydosaurus kingi",
"alligator lizard",
"gila monster, heloderma suspectum",
"green lizard, lacerta viridis",
"african chameleon, chamaeleo chamaeleon",
"komodo dragon, komodo lizard, dragon lizard, giant lizard, varanus komodoensis",
"african crocodile, nile crocodile, crocodylus niloticus",
"american alligator, alligator mississipiensis",
"triceratops",
"thunder snake, worm snake, carphophis amoenus",
"ringneck snake, ring-necked snake, ring snake",
"hognose snake, puff adder, sand viper",
"green snake, grass snake",
"king snake, kingsnake",
"garter snake, grass snake",
"water snake",
"vine snake",
"night snake, hypsiglena torquata",
"boa constrictor, constrictor constrictor",
"rock python, rock snake, python sebae",
"indian cobra, naja naja",
"green mamba",
"sea snake",
"horned viper, cerastes, sand viper, horned asp, cerastes cornutus",
"diamondback, diamondback rattlesnake, crotalus adamanteus",
"sidewinder, horned rattlesnake, crotalus cerastes",
"trilobite",
"harvestman, daddy longlegs, phalangium opilio",
"scorpion",
"black and gold garden spider, argiope aurantia",
"barn spider, araneus cavaticus",
"garden spider, aranea diademata",
"black widow, latrodectus mactans",
"tarantula",
"wolf spider, hunting spider",
"tick",
"centipede",
"black grouse",
"ptarmigan",
"ruffed grouse, partridge, bonasa umbellus",
"prairie chicken, prairie grouse, prairie fowl",
"peacock",
"quail",
"partridge",
"african grey, african gray, psittacus erithacus",
"macaw",
"sulphur-crested cockatoo, kakatoe galerita, cacatua galerita",
"lorikeet",
"coucal",
"bee eater",
"hornbill",
"hummingbird",
"jacamar",
"toucan",
"drake",
"red-breasted merganser, mergus serrator",
"goose",
"black swan, cygnus atratus",
"tusker",
"echidna, spiny anteater, anteater",
"platypus, duckbill, duckbilled platypus, duck-billed platypus, ornithorhynchus anatinus",
"wallaby, brush kangaroo",
"koala, koala bear, kangaroo bear, native bear, phascolarctos cinereus",
"wombat",
"jellyfish",
"sea anemone, anemone",
"brain coral",
"flatworm, platyhelminth",
"nematode, nematode worm, roundworm",
"conch",
"snail",
"slug",
"sea slug, nudibranch",
"chiton, coat-of-mail shell, sea cradle, polyplacophore",
"chambered nautilus, pearly nautilus, nautilus",
"dungeness crab, cancer magister",
"rock crab, cancer irroratus",
"fiddler crab",
"king crab, alaska crab, alaskan king crab, alaska king crab, paralithodes camtschatica",
"american lobster, northern lobster, maine lobster, homarus americanus",
"spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish",
"crayfish, crawfish, crawdad, crawdaddy",
"hermit crab",
"isopod",
"white stork, ciconia ciconia",
"black stork, ciconia nigra",
"spoonbill",
"flamingo",
"little blue heron, egretta caerulea",
"american egret, great white heron, egretta albus",
"bittern",
"crane",
"limpkin, aramus pictus",
"european gallinule, porphyrio porphyrio",
"american coot, marsh hen, mud hen, water hen, fulica americana",
"bustard",
"ruddy turnstone, arenaria interpres",
"red-backed sandpiper, dunlin, erolia alpina",
"redshank, tringa totanus",
"dowitcher",
"oystercatcher, oyster catcher",
"pelican",
"king penguin, aptenodytes patagonica",
"albatross, mollymawk",
"grey whale, gray whale, devilfish, eschrichtius gibbosus, eschrichtius robustus",
"killer whale, killer, orca, grampus, sea wolf, orcinus orca",
"dugong, dugong dugon",
"sea lion",
"chihuahua",
"japanese spaniel",
"maltese dog, maltese terrier, maltese",
"pekinese, pekingese, peke",
"shih-tzu",
"blenheim spaniel",
"papillon",
"toy terrier",
"rhodesian ridgeback",
"afghan hound, afghan",
"basset, basset hound",
"beagle",
"bloodhound, sleuthhound",
"bluetick",
"black-and-tan coonhound",
"walker hound, walker foxhound",
"english foxhound",
"redbone",
"borzoi, russian wolfhound",
"irish wolfhound",
"italian greyhound",
"whippet",
"ibizan hound, ibizan podenco",
"norwegian elkhound, elkhound",
"otterhound, otter hound",
"saluki, gazelle hound",
"scottish deerhound, deerhound",
"weimaraner",
"staffordshire bullterrier, staffordshire bull terrier",
"american staffordshire terrier, staffordshire terrier, american pit bull terrier, pit bull terrier",
"bedlington terrier",
"border terrier",
"kerry blue terrier",
"irish terrier",
"norfolk terrier",
"norwich terrier",
"yorkshire terrier",
"wire-haired fox terrier",
"lakeland terrier",
"sealyham terrier, sealyham",
"airedale, airedale terrier",
"cairn, cairn terrier",
"australian terrier",
"dandie dinmont, dandie dinmont terrier",
"boston bull, boston terrier",
"miniature schnauzer",
"giant schnauzer",
"standard schnauzer",
"scotch terrier, scottish terrier, scottie",
"tibetan terrier, chrysanthemum dog",
"silky terrier, sydney silky",
"soft-coated wheaten terrier",
"west highland white terrier",
"lhasa, lhasa apso",
"flat-coated retriever",
"curly-coated retriever",
"golden retriever",
"labrador retriever",
"chesapeake bay retriever",
"german short-haired pointer",
"vizsla, hungarian pointer",
"english setter",
"irish setter, red setter",
"gordon setter",
"brittany spaniel",
"clumber, clumber spaniel",
"english springer, english springer spaniel",
"welsh springer spaniel",
"cocker spaniel, english cocker spaniel, cocker",
"sussex spaniel",
"irish water spaniel",
"kuvasz",
"schipperke",
"groenendael",
"malinois",
"briard",
"kelpie",
"komondor",
"old english sheepdog, bobtail",
"shetland sheepdog, shetland sheep dog, shetland",
"collie",
"border collie",
"bouvier des flandres, bouviers des flandres",
"rottweiler",
"german shepherd, german shepherd dog, german police dog, alsatian",
"doberman, doberman pinscher",
"miniature pinscher",
"greater swiss mountain dog",
"bernese mountain dog",
"appenzeller",
"entlebucher",
"boxer",
"bull mastiff",
"tibetan mastiff",
"french bulldog",
"great dane",
"saint bernard, st bernard",
"eskimo dog, husky",
"malamute, malemute, alaskan malamute",
"siberian husky",
"dalmatian, coach dog, carriage dog",
"affenpinscher, monkey pinscher, monkey dog",
"basenji",
"pug, pug-dog",
"leonberg",
"newfoundland, newfoundland dog",
"great pyrenees",
"samoyed, samoyede",
"pomeranian",
"chow, chow chow",
"keeshond",
"brabancon griffon",
"pembroke, pembroke welsh corgi",
"cardigan, cardigan welsh corgi",
"toy poodle",
"miniature poodle",
"standard poodle",
"mexican hairless",
"timber wolf, grey wolf, gray wolf, canis lupus",
"white wolf, arctic wolf, canis lupus tundrarum",
"red wolf, maned wolf, canis rufus, canis niger",
"coyote, prairie wolf, brush wolf, canis latrans",
"dingo, warrigal, warragal, canis dingo",
"dhole, cuon alpinus",
"african hunting dog, hyena dog, cape hunting dog, lycaon pictus",
"hyena, hyaena",
"red fox, vulpes vulpes",
"kit fox, vulpes macrotis",
"arctic fox, white fox, alopex lagopus",
"grey fox, gray fox, urocyon cinereoargenteus",
"tabby, tabby cat",
"tiger cat",
"persian cat",
"siamese cat, siamese",
"egyptian cat",
"cougar, puma, catamount, mountain lion, painter, panther, felis concolor",
"lynx, catamount",
"leopard, panthera pardus",
"snow leopard, ounce, panthera uncia",
"jaguar, panther, panthera onca, felis onca",
"lion, king of beasts, panthera leo",
"tiger, panthera tigris",
"cheetah, chetah, acinonyx jubatus",
"brown bear, bruin, ursus arctos",
"american black bear, black bear, ursus americanus, euarctos americanus",
"ice bear, polar bear, ursus maritimus, thalarctos maritimus",
"sloth bear, melursus ursinus, ursus ursinus",
"mongoose",
"meerkat, mierkat",
"tiger beetle",
"ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle",
"ground beetle, carabid beetle",
"long-horned beetle, longicorn, longicorn beetle",
"leaf beetle, chrysomelid",
"dung beetle",
"rhinoceros beetle",
"weevil",
"fly",
"bee",
"ant, emmet, pismire",
"grasshopper, hopper",
"cricket",
"walking stick, walkingstick, stick insect",
"cockroach, roach",
"mantis, mantid",
"cicada, cicala",
"leafhopper",
"lacewing, lacewing fly",
"dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk",
"damselfly",
"admiral",
"ringlet, ringlet butterfly",
"monarch, monarch butterfly, milkweed butterfly, danaus plexippus",
"cabbage butterfly",
"sulphur butterfly, sulfur butterfly",
"lycaenid, lycaenid butterfly",
"starfish, sea star",
"sea urchin",
"sea cucumber, holothurian",
"wood rabbit, cottontail, cottontail rabbit",
"hare",
"angora, angora rabbit",
"hamster",
"porcupine, hedgehog",
"fox squirrel, eastern fox squirrel, sciurus niger",
"marmot",
"beaver",
"guinea pig, cavia cobaya",
"sorrel",
"zebra",
"hog, pig, grunter, squealer, sus scrofa",
"wild boar, boar, sus scrofa",
"warthog",
"hippopotamus, hippo, river horse, hippopotamus amphibius",
"ox",
"water buffalo, water ox, asiatic buffalo, bubalus bubalis",
"bison",
"ram, tup",
"bighorn, bighorn sheep, cimarron, rocky mountain bighorn, rocky mountain sheep, ovis canadensis",
"ibex, capra ibex",
"hartebeest",
"impala, aepyceros melampus",
"gazelle",
"arabian camel, dromedary, camelus dromedarius",
"llama",
"weasel",
"mink",
"polecat, fitch, foulmart, foumart, mustela putorius",
"black-footed ferret, ferret, mustela nigripes",
"otter",
"skunk, polecat, wood pussy",
"badger",
"armadillo",
"three-toed sloth, ai, bradypus tridactylus",
"orangutan, orang, orangutang, pongo pygmaeus",
"gorilla, gorilla gorilla",
"chimpanzee, chimp, pan troglodytes",
"gibbon, hylobates lar",
"siamang, hylobates syndactylus, symphalangus syndactylus",
"guenon, guenon monkey",
"patas, hussar monkey, erythrocebus patas",
"baboon",
"macaque",
"langur",
"colobus, colobus monkey",
"proboscis monkey, nasalis larvatus",
"marmoset",
"capuchin, ringtail, cebus capucinus",
"howler monkey, howler",
"titi, titi monkey",
"spider monkey, ateles geoffroyi",
"squirrel monkey, saimiri sciureus",
"madagascar cat, ring-tailed lemur, lemur catta",
"indri, indris, indri indri, indri brevicaudatus",
"indian elephant, elephas maximus",
"african elephant, loxodonta africana",
"lesser panda, red panda, panda, bear cat, cat bear, ailurus fulgens",
"giant panda, panda, panda bear, coon bear, ailuropoda melanoleuca",
"barracouta, snoek",
"eel",
"coho, cohoe, coho salmon, blue jack, silver salmon, oncorhynchus kisutch",
"rock beauty, holocanthus tricolor",
"anemone fish",
"sturgeon",
"gar, garfish, garpike, billfish, lepisosteus osseus",
"lionfish",
"puffer, pufferfish, blowfish, globefish",
"abacus",
"abaya",
"academic gown, academic robe, judge's robe",
"accordion, piano accordion, squeeze box",
"acoustic guitar",
"aircraft carrier, carrier, flattop, attack aircraft carrier",
"airliner",
"airship, dirigible",
"altar",
"ambulance",
"amphibian, amphibious vehicle",
"analog clock",
"apiary, bee house",
"apron",
"ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin",
"assault rifle, assault gun",
"backpack, back pack, knapsack, packsack, rucksack, haversack",
"bakery, bakeshop, bakehouse",
"balance beam, beam",
"balloon",
"ballpoint, ballpoint pen, ballpen, biro",
"band aid",
"banjo",
"bannister, banister, balustrade, balusters, handrail",
"barbell",
"barber chair",
"barbershop",
"barn",
"barometer",
"barrel, cask",
"barrow, garden cart, lawn cart, wheelbarrow",
"baseball",
"basketball",
"bassinet",
"bassoon",
"bathing cap, swimming cap",
"bath towel",
"bathtub, bathing tub, bath, tub",
"beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon",
"beacon, lighthouse, beacon light, pharos",
"beaker",
"bearskin, busby, shako",
"beer bottle",
"beer glass",
"bell cote, bell cot",
"bib",
"bicycle-built-for-two, tandem bicycle, tandem",
"bikini, two-piece",
"binder, ring-binder",
"binoculars, field glasses, opera glasses",
"birdhouse",
"boathouse",
"bobsled, bobsleigh, bob",
"bolo tie, bolo, bola tie, bola",
"bonnet, poke bonnet",
"bookcase",
"bookshop, bookstore, bookstall",
"bottlecap",
"bow",
"bow tie, bow-tie, bowtie",
"brass, memorial tablet, plaque",
"brassiere, bra, bandeau",
"breakwater, groin, groyne, mole, bulwark, seawall, jetty",
"breastplate, aegis, egis",
"broom",
"bucket, pail",
"buckle",
"bulletproof vest",
"bullet train, bullet",
"butcher shop, meat market",
"cab, hack, taxi, taxicab",
"caldron, cauldron",
"candle, taper, wax light",
"cannon",
"canoe",
"can opener, tin opener",
"cardigan",
"car mirror",
"carousel, carrousel, merry-go-round, roundabout, whirligig",
"carpenter's kit, tool kit",
"carton",
"car wheel",
"cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, atm",
"cassette",
"cassette player",
"castle",
"catamaran",
"cd player",
"cello, violoncello",
"cellular telephone, cellular phone, cellphone, cell, mobile phone",
"chain",
"chainlink fence",
"chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour",
"chain saw, chainsaw",
"chest",
"chiffonier, commode",
"chime, bell, gong",
"china cabinet, china closet",
"christmas stocking",
"church, church building",
"cinema, movie theater, movie theatre, movie house, picture palace",
"cleaver, meat cleaver, chopper",
"cliff dwelling",
"cloak",
"clog, geta, patten, sabot",
"cocktail shaker",
"coffee mug",
"coffeepot",
"coil, spiral, volute, whorl, helix",
"combination lock",
"computer keyboard, keypad",
"confectionery, confectionary, candy store",
"container ship, containership, container vessel",
"convertible",
"corkscrew, bottle screw",
"cornet, horn, trumpet, trump",
"cowboy boot",
"cowboy hat, ten-gallon hat",
"cradle",
"crane",
"crash helmet",
"crate",
"crib, cot",
"crock pot",
"croquet ball",
"crutch",
"cuirass",
"dam, dike, dyke",
"desk",
"desktop computer",
"dial telephone, dial phone",
"diaper, nappy, napkin",
"digital clock",
"digital watch",
"dining table, board",
"dishrag, dishcloth",
"dishwasher, dish washer, dishwashing machine",
"disk brake, disc brake",
"dock, dockage, docking facility",
"dogsled, dog sled, dog sleigh",
"dome",
"doormat, welcome mat",
"drilling platform, offshore rig",
"drum, membranophone, tympan",
"drumstick",
"dumbbell",
"dutch oven",
"electric fan, blower",
"electric guitar",
"electric locomotive",
"entertainment center",
"envelope",
"espresso maker",
"face powder",
"feather boa, boa",
"file, file cabinet, filing cabinet",
"fireboat",
"fire engine, fire truck",
"fire screen, fireguard",
"flagpole, flagstaff",
"flute, transverse flute",
"folding chair",
"football helmet",
"forklift",
"fountain",
"fountain pen",
"four-poster",
"freight car",
"french horn, horn",
"frying pan, frypan, skillet",
"fur coat",
"garbage truck, dustcart",
"gasmask, respirator, gas helmet",
"gas pump, gasoline pump, petrol pump, island dispenser",
"goblet",
"go-kart",
"golf ball",
"golfcart, golf cart",
"gondola",
"gong, tam-tam",
"gown",
"grand piano, grand",
"greenhouse, nursery, glasshouse",
"grille, radiator grille",
"grocery store, grocery, food market, market",
"guillotine",
"hair slide",
"hair spray",
"half track",
"hammer",
"hamper",
"hand blower, blow dryer, blow drier, hair dryer, hair drier",
"hand-held computer, hand-held microcomputer",
"handkerchief, hankie, hanky, hankey",
"hard disc, hard disk, fixed disk",
"harmonica, mouth organ, harp, mouth harp",
"harp",
"harvester, reaper",
"hatchet",
"holster",
"home theater, home theatre",
"honeycomb",
"hook, claw",
"hoopskirt, crinoline",
"horizontal bar, high bar",
"horse cart, horse-cart",
"hourglass",
"ipod",
"iron, smoothing iron",
"jack-o'-lantern",
"jean, blue jean, denim",
"jeep, landrover",
"jersey, t-shirt, tee shirt",
"jigsaw puzzle",
"jinrikisha, ricksha, rickshaw",
"joystick",
"kimono",
"knee pad",
"knot",
"lab coat, laboratory coat",
"ladle",
"lampshade, lamp shade",
"laptop, laptop computer",
"lawn mower, mower",
"lens cap, lens cover",
"letter opener, paper knife, paperknife",
"library",
"lifeboat",
"lighter, light, igniter, ignitor",
"limousine, limo",
"liner, ocean liner",
"lipstick, lip rouge",
"loafer",
"lotion",
"loudspeaker, speaker, speaker unit, loudspeaker system, speaker system",
"loupe, jeweler's loupe",
"lumbermill, sawmill",
"magnetic compass",
"mailbag, postbag",
"mailbox, letter box",
"maillot",
"maillot, tank suit",
"manhole cover",
"maraca",
"marimba, xylophone",
"mask",
"matchstick",
"maypole",
"maze, labyrinth",
"measuring cup",
"medicine chest, medicine cabinet",
"megalith, megalithic structure",
"microphone, mike",
"microwave, microwave oven",
"military uniform",
"milk can",
"minibus",
"miniskirt, mini",
"minivan",
"missile",
"mitten",
"mixing bowl",
"mobile home, manufactured home",
"model t",
"modem",
"monastery",
"monitor",
"moped",
"mortar",
"mortarboard",
"mosque",
"mosquito net",
"motor scooter, scooter",
"mountain bike, all-terrain bike, off-roader",
"mountain tent",
"mouse, computer mouse",
"mousetrap",
"moving van",
"muzzle",
"nail",
"neck brace",
"necklace",
"nipple",
"notebook, notebook computer",
"obelisk",
"oboe, hautboy, hautbois",
"ocarina, sweet potato",
"odometer, hodometer, mileometer, milometer",
"oil filter",
"organ, pipe organ",
"oscilloscope, scope, cathode-ray oscilloscope, cro",
"overskirt",
"oxcart",
"oxygen mask",
"packet",
"paddle, boat paddle",
"paddlewheel, paddle wheel",
"padlock",
"paintbrush",
"pajama, pyjama, pj's, jammies",
"palace",
"panpipe, pandean pipe, syrinx",
"paper towel",
"parachute, chute",
"parallel bars, bars",
"park bench",
"parking meter",
"passenger car, coach, carriage",
"patio, terrace",
"pay-phone, pay-station",
"pedestal, plinth, footstall",
"pencil box, pencil case",
"pencil sharpener",
"perfume, essence",
"petri dish",
"photocopier",
"pick, plectrum, plectron",
"pickelhaube",
"picket fence, paling",
"pickup, pickup truck",
"pier",
"piggy bank, penny bank",
"pill bottle",
"pillow",
"ping-pong ball",
"pinwheel",
"pirate, pirate ship",
"pitcher, ewer",
"plane, carpenter's plane, woodworking plane",
"planetarium",
"plastic bag",
"plate rack",
"plow, plough",
"plunger, plumber's helper",
"polaroid camera, polaroid land camera",
"pole",
"police van, police wagon, paddy wagon, patrol wagon, wagon, black maria",
"poncho",
"pool table, billiard table, snooker table",
"pop bottle, soda bottle",
"pot, flowerpot",
"potter's wheel",
"power drill",
"prayer rug, prayer mat",
"printer",
"prison, prison house",
"projectile, missile",
"projector",
"puck, hockey puck",
"punching bag, punch bag, punching ball, punchball",
"purse",
"quill, quill pen",
"quilt, comforter, comfort, puff",
"racer, race car, racing car",
"racket, racquet",
"radiator",
"radio, wireless",
"radio telescope, radio reflector",
"rain barrel",
"recreational vehicle, rv, r.v.",
"reel",
"reflex camera",
"refrigerator, icebox",
"remote control, remote",
"restaurant, eating house, eating place, eatery",
"revolver, six-gun, six-shooter",
"rifle",
"rocking chair, rocker",
"rotisserie",
"rubber eraser, rubber, pencil eraser",
"rugby ball",
"rule, ruler",
"running shoe",
"safe",
"safety pin",
"saltshaker, salt shaker",
"sandal",
"sarong",
"sax, saxophone",
"scabbard",
"scale, weighing machine",
"school bus",
"schooner",
"scoreboard",
"screen, crt screen",
"screw",
"screwdriver",
"seat belt, seatbelt",
"sewing machine",
"shield, buckler",
"shoe shop, shoe-shop, shoe store",
"shoji",
"shopping basket",
"shopping cart",
"shovel",
"shower cap",
"shower curtain",
"ski",
"ski mask",
"sleeping bag",
"slide rule, slipstick",
"sliding door",
"slot, one-armed bandit",
"snorkel",
"snowmobile",
"snowplow, snowplough",
"soap dispenser",
"soccer ball",
"sock",
"solar dish, solar collector, solar furnace",
"sombrero",
"soup bowl",
"space bar",
"space heater",
"space shuttle",
"spatula",
"speedboat",
"spider web, spider's web",
"spindle",
"sports car, sport car",
"spotlight, spot",
"stage",
"steam locomotive",
"steel arch bridge",
"steel drum",
"stethoscope",
"stole",
"stone wall",
"stopwatch, stop watch",
"stove",
"strainer",
"streetcar, tram, tramcar, trolley, trolley car",
"stretcher",
"studio couch, day bed",
"stupa, tope",
"submarine, pigboat, sub, u-boat",
"suit, suit of clothes",
"sundial",
"sunglass",
"sunglasses, dark glasses, shades",
"sunscreen, sunblock, sun blocker",
"suspension bridge",
"swab, swob, mop",
"sweatshirt",
"swimming trunks, bathing trunks",
"swing",
"switch, electric switch, electrical switch",
"syringe",
"table lamp",
"tank, army tank, armored combat vehicle, armoured combat vehicle",
"tape player",
"teapot",
"teddy, teddy bear",
"television, television system",
"tennis ball",
"thatch, thatched roof",
"theater curtain, theatre curtain",
"thimble",
"thresher, thrasher, threshing machine",
"throne",
"tile roof",
"toaster",
"tobacco shop, tobacconist shop, tobacconist",
"toilet seat",
"torch",
"totem pole",
"tow truck, tow car, wrecker",
"toyshop",
"tractor",
"trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi",
"tray",
"trench coat",
"tricycle, trike, velocipede",
"trimaran",
"tripod",
"triumphal arch",
"trolleybus, trolley coach, trackless trolley",
"trombone",
"tub, vat",
"turnstile",
"typewriter keyboard",
"umbrella",
"unicycle, monocycle",
"upright, upright piano",
"vacuum, vacuum cleaner",
"vase",
"vault",
"velvet",
"vending machine",
"vestment",
"viaduct",
"violin, fiddle",
"volleyball",
"waffle iron",
"wall clock",
"wallet, billfold, notecase, pocketbook",
"wardrobe, closet, press",
"warplane, military plane",
"washbasin, handbasin, washbowl, lavabo, wash-hand basin",
"washer, automatic washer, washing machine",
"water bottle",
"water jug",
"water tower",
"whiskey jug",
"whistle",
"wig",
"window screen",
"window shade",
"windsor tie",
"wine bottle",
"wing",
"wok",
"wooden spoon",
"wool, woolen, woollen",
"worm fence, snake fence, snake-rail fence, virginia fence",
"wreck",
"yawl",
"yurt",
"web site, website, internet site, site",
"comic book",
"crossword puzzle, crossword",
"street sign",
"traffic light, traffic signal, stoplight",
"book jacket, dust cover, dust jacket, dust wrapper",
"menu",
"plate",
"guacamole",
"consomme",
"hot pot, hotpot",
"trifle",
"ice cream, icecream",
"ice lolly, lolly, lollipop, popsicle",
"french loaf",
"bagel, beigel",
"pretzel",
"cheeseburger",
"hotdog, hot dog, red hot",
"mashed potato",
"head cabbage",
"broccoli",
"cauliflower",
"zucchini, courgette",
"spaghetti squash",
"acorn squash",
"butternut squash",
"cucumber, cuke",
"artichoke, globe artichoke",
"bell pepper",
"cardoon",
"mushroom",
"granny smith",
"strawberry",
"orange",
"lemon",
"fig",
"pineapple, ananas",
"banana",
"jackfruit, jak, jack",
"custard apple",
"pomegranate",
"hay",
"carbonara",
"chocolate sauce, chocolate syrup",
"dough",
"meat loaf, meatloaf",
"pizza, pizza pie",
"potpie",
"burrito",
"red wine",
"espresso",
"cup",
"eggnog",
"alp",
"bubble",
"cliff, drop, drop-off",
"coral reef",
"geyser",
"lakeside, lakeshore",
"promontory, headland, head, foreland",
"sandbar, sand bar",
"seashore, coast, seacoast, sea-coast",
"valley, vale",
"volcano",
"ballplayer, baseball player",
"groom, bridegroom",
"scuba diver",
"rapeseed",
"daisy",
"yellow lady's slipper, yellow lady-slipper, cypripedium calceolus, cypripedium parviflorum",
"corn",
"acorn",
"hip, rose hip, rosehip",
"buckeye, horse chestnut, conker",
"coral fungus",
"agaric",
"gyromitra",
"stinkhorn, carrion fungus",
"earthstar",
"hen-of-the-woods, hen of the woods, polyporus frondosus, grifola frondosa",
"bolete",
"ear, spike, capitulum",
"toilet tissue, toilet paper, bathroom tissue"
] |
jordyvl/vit-tiny_tobacco3482_og_simkd_
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_tobacco3482_og_simkd_
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 212.9255
- Accuracy: 0.825
- Brier Loss: 0.2955
- Nll: 1.3247
- F1 Micro: 0.825
- F1 Macro: 0.8027
- Ece: 0.1660
- Aurc: 0.0554
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 219.5859 | 0.225 | 0.8919 | 5.3905 | 0.225 | 0.1403 | 0.2467 | 0.7026 |
| No log | 2.0 | 50 | 218.7605 | 0.385 | 0.8149 | 2.3886 | 0.3850 | 0.3040 | 0.3416 | 0.4009 |
| No log | 3.0 | 75 | 217.4839 | 0.49 | 0.6311 | 1.8415 | 0.49 | 0.3775 | 0.2502 | 0.2789 |
| No log | 4.0 | 100 | 216.7303 | 0.605 | 0.5512 | 1.9525 | 0.605 | 0.4738 | 0.2423 | 0.1956 |
| No log | 5.0 | 125 | 215.7727 | 0.625 | 0.4792 | 1.8491 | 0.625 | 0.5148 | 0.2037 | 0.1503 |
| No log | 6.0 | 150 | 215.9141 | 0.595 | 0.5723 | 2.5766 | 0.595 | 0.4836 | 0.2297 | 0.2039 |
| No log | 7.0 | 175 | 215.2466 | 0.715 | 0.4245 | 1.8434 | 0.715 | 0.5962 | 0.2236 | 0.1201 |
| No log | 8.0 | 200 | 215.0580 | 0.72 | 0.3878 | 1.8760 | 0.72 | 0.6108 | 0.2068 | 0.0879 |
| No log | 9.0 | 225 | 214.9401 | 0.745 | 0.3919 | 1.7852 | 0.745 | 0.7183 | 0.1855 | 0.0936 |
| No log | 10.0 | 250 | 214.6627 | 0.66 | 0.4840 | 2.1588 | 0.66 | 0.6144 | 0.2371 | 0.1261 |
| No log | 11.0 | 275 | 214.9265 | 0.69 | 0.5027 | 1.7910 | 0.69 | 0.6599 | 0.2514 | 0.1279 |
| No log | 12.0 | 300 | 214.4700 | 0.78 | 0.3372 | 1.7013 | 0.78 | 0.7459 | 0.1740 | 0.0741 |
| No log | 13.0 | 325 | 214.5191 | 0.805 | 0.3164 | 1.5824 | 0.805 | 0.7689 | 0.1662 | 0.0689 |
| No log | 14.0 | 350 | 214.3915 | 0.8 | 0.3278 | 1.5290 | 0.8000 | 0.7742 | 0.1716 | 0.0675 |
| No log | 15.0 | 375 | 214.2643 | 0.8 | 0.3315 | 1.5964 | 0.8000 | 0.7697 | 0.1491 | 0.0876 |
| No log | 16.0 | 400 | 214.2815 | 0.78 | 0.3751 | 1.8388 | 0.78 | 0.7670 | 0.1944 | 0.0836 |
| No log | 17.0 | 425 | 214.0954 | 0.78 | 0.3505 | 1.4905 | 0.78 | 0.7521 | 0.1832 | 0.0662 |
| No log | 18.0 | 450 | 214.1399 | 0.785 | 0.3392 | 1.2883 | 0.785 | 0.7662 | 0.1760 | 0.0749 |
| No log | 19.0 | 475 | 214.1986 | 0.81 | 0.3229 | 1.6305 | 0.81 | 0.7984 | 0.1754 | 0.0777 |
| 220.3818 | 20.0 | 500 | 214.1931 | 0.815 | 0.2994 | 1.5637 | 0.815 | 0.7940 | 0.1520 | 0.0606 |
| 220.3818 | 21.0 | 525 | 213.9438 | 0.815 | 0.3066 | 1.4756 | 0.815 | 0.7983 | 0.1616 | 0.0678 |
| 220.3818 | 22.0 | 550 | 213.9014 | 0.8 | 0.3485 | 1.9629 | 0.8000 | 0.7885 | 0.1838 | 0.0727 |
| 220.3818 | 23.0 | 575 | 214.0186 | 0.83 | 0.2863 | 1.6314 | 0.83 | 0.8107 | 0.1516 | 0.0574 |
| 220.3818 | 24.0 | 600 | 213.8764 | 0.805 | 0.3323 | 1.4873 | 0.805 | 0.7903 | 0.1716 | 0.0726 |
| 220.3818 | 25.0 | 625 | 214.0043 | 0.81 | 0.3094 | 1.5562 | 0.81 | 0.7764 | 0.1625 | 0.0598 |
| 220.3818 | 26.0 | 650 | 213.6884 | 0.825 | 0.3165 | 1.6150 | 0.825 | 0.8038 | 0.1667 | 0.0755 |
| 220.3818 | 27.0 | 675 | 213.7763 | 0.81 | 0.3164 | 1.5526 | 0.81 | 0.7904 | 0.1767 | 0.0747 |
| 220.3818 | 28.0 | 700 | 213.9658 | 0.825 | 0.2996 | 1.7947 | 0.825 | 0.8149 | 0.1686 | 0.0651 |
| 220.3818 | 29.0 | 725 | 213.7030 | 0.815 | 0.3155 | 1.3772 | 0.815 | 0.8024 | 0.1616 | 0.0613 |
| 220.3818 | 30.0 | 750 | 213.7211 | 0.805 | 0.3421 | 1.5621 | 0.805 | 0.7986 | 0.1794 | 0.0624 |
| 220.3818 | 31.0 | 775 | 213.6852 | 0.815 | 0.3094 | 1.5177 | 0.815 | 0.7872 | 0.1500 | 0.0735 |
| 220.3818 | 32.0 | 800 | 213.6889 | 0.785 | 0.3345 | 1.4134 | 0.785 | 0.7652 | 0.1669 | 0.0563 |
| 220.3818 | 33.0 | 825 | 213.6302 | 0.805 | 0.3298 | 1.8630 | 0.805 | 0.7865 | 0.1689 | 0.0693 |
| 220.3818 | 34.0 | 850 | 213.6116 | 0.83 | 0.2890 | 1.5365 | 0.83 | 0.8033 | 0.1555 | 0.0661 |
| 220.3818 | 35.0 | 875 | 213.6136 | 0.805 | 0.3026 | 1.1797 | 0.805 | 0.7744 | 0.1551 | 0.0477 |
| 220.3818 | 36.0 | 900 | 213.5340 | 0.815 | 0.3008 | 1.6963 | 0.815 | 0.7938 | 0.1531 | 0.0687 |
| 220.3818 | 37.0 | 925 | 213.5380 | 0.84 | 0.2968 | 1.3848 | 0.8400 | 0.8396 | 0.1631 | 0.0577 |
| 220.3818 | 38.0 | 950 | 213.5322 | 0.83 | 0.3016 | 1.5511 | 0.83 | 0.8133 | 0.1647 | 0.0599 |
| 220.3818 | 39.0 | 975 | 213.4971 | 0.82 | 0.3155 | 1.4175 | 0.82 | 0.7999 | 0.1571 | 0.0535 |
| 217.7955 | 40.0 | 1000 | 213.4139 | 0.825 | 0.3103 | 1.6359 | 0.825 | 0.8043 | 0.1749 | 0.0683 |
| 217.7955 | 41.0 | 1025 | 213.4513 | 0.83 | 0.3002 | 1.5369 | 0.83 | 0.8139 | 0.1580 | 0.0606 |
| 217.7955 | 42.0 | 1050 | 213.4196 | 0.8 | 0.3251 | 1.3570 | 0.8000 | 0.7779 | 0.1745 | 0.0608 |
| 217.7955 | 43.0 | 1075 | 213.3506 | 0.815 | 0.3142 | 1.3579 | 0.815 | 0.7988 | 0.1724 | 0.0563 |
| 217.7955 | 44.0 | 1100 | 213.3151 | 0.805 | 0.3217 | 1.3796 | 0.805 | 0.7820 | 0.1748 | 0.0580 |
| 217.7955 | 45.0 | 1125 | 213.3202 | 0.825 | 0.3114 | 1.4198 | 0.825 | 0.8090 | 0.1568 | 0.0614 |
| 217.7955 | 46.0 | 1150 | 213.3313 | 0.805 | 0.3203 | 1.3750 | 0.805 | 0.7860 | 0.1667 | 0.0563 |
| 217.7955 | 47.0 | 1175 | 213.3293 | 0.835 | 0.2910 | 1.3909 | 0.835 | 0.8190 | 0.1515 | 0.0576 |
| 217.7955 | 48.0 | 1200 | 213.2646 | 0.825 | 0.2916 | 1.3674 | 0.825 | 0.8022 | 0.1526 | 0.0577 |
| 217.7955 | 49.0 | 1225 | 213.2620 | 0.83 | 0.3137 | 1.4579 | 0.83 | 0.8101 | 0.1634 | 0.0565 |
| 217.7955 | 50.0 | 1250 | 213.2164 | 0.815 | 0.3087 | 1.2599 | 0.815 | 0.7917 | 0.1618 | 0.0573 |
| 217.7955 | 51.0 | 1275 | 213.2495 | 0.795 | 0.3155 | 1.2060 | 0.795 | 0.7648 | 0.1712 | 0.0573 |
| 217.7955 | 52.0 | 1300 | 213.2231 | 0.82 | 0.3232 | 1.4463 | 0.82 | 0.8029 | 0.1658 | 0.0544 |
| 217.7955 | 53.0 | 1325 | 213.2242 | 0.83 | 0.2891 | 1.3586 | 0.83 | 0.8056 | 0.1427 | 0.0520 |
| 217.7955 | 54.0 | 1350 | 213.2049 | 0.83 | 0.2959 | 1.2968 | 0.83 | 0.8063 | 0.1573 | 0.0523 |
| 217.7955 | 55.0 | 1375 | 213.1500 | 0.84 | 0.2844 | 1.4084 | 0.8400 | 0.8137 | 0.1490 | 0.0570 |
| 217.7955 | 56.0 | 1400 | 213.1851 | 0.815 | 0.3097 | 1.4005 | 0.815 | 0.7892 | 0.1605 | 0.0605 |
| 217.7955 | 57.0 | 1425 | 213.1577 | 0.805 | 0.3130 | 1.2482 | 0.805 | 0.7805 | 0.1635 | 0.0550 |
| 217.7955 | 58.0 | 1450 | 213.1812 | 0.835 | 0.2943 | 1.3191 | 0.835 | 0.8047 | 0.1584 | 0.0550 |
| 217.7955 | 59.0 | 1475 | 213.0962 | 0.82 | 0.3102 | 1.3163 | 0.82 | 0.7906 | 0.1672 | 0.0513 |
| 217.1347 | 60.0 | 1500 | 213.1257 | 0.835 | 0.2876 | 1.3165 | 0.835 | 0.8122 | 0.1433 | 0.0562 |
| 217.1347 | 61.0 | 1525 | 213.1092 | 0.83 | 0.2896 | 1.3196 | 0.83 | 0.8061 | 0.1533 | 0.0549 |
| 217.1347 | 62.0 | 1550 | 213.0827 | 0.815 | 0.3072 | 1.3053 | 0.815 | 0.7925 | 0.1546 | 0.0523 |
| 217.1347 | 63.0 | 1575 | 213.0972 | 0.825 | 0.2965 | 1.2148 | 0.825 | 0.8044 | 0.1472 | 0.0507 |
| 217.1347 | 64.0 | 1600 | 213.0849 | 0.83 | 0.2923 | 1.3196 | 0.83 | 0.8106 | 0.1637 | 0.0551 |
| 217.1347 | 65.0 | 1625 | 213.0354 | 0.815 | 0.3158 | 1.2749 | 0.815 | 0.7867 | 0.1675 | 0.0515 |
| 217.1347 | 66.0 | 1650 | 213.0493 | 0.825 | 0.2993 | 1.3046 | 0.825 | 0.7949 | 0.1636 | 0.0558 |
| 217.1347 | 67.0 | 1675 | 212.9869 | 0.815 | 0.3081 | 1.3846 | 0.815 | 0.7868 | 0.1579 | 0.0583 |
| 217.1347 | 68.0 | 1700 | 213.0629 | 0.84 | 0.2915 | 1.3902 | 0.8400 | 0.8046 | 0.1590 | 0.0540 |
| 217.1347 | 69.0 | 1725 | 213.0391 | 0.825 | 0.3068 | 1.3801 | 0.825 | 0.8015 | 0.1553 | 0.0531 |
| 217.1347 | 70.0 | 1750 | 212.9991 | 0.835 | 0.2864 | 1.3331 | 0.835 | 0.8097 | 0.1531 | 0.0562 |
| 217.1347 | 71.0 | 1775 | 213.0157 | 0.83 | 0.2897 | 1.2788 | 0.83 | 0.8002 | 0.1584 | 0.0551 |
| 217.1347 | 72.0 | 1800 | 212.9134 | 0.82 | 0.3051 | 1.3131 | 0.82 | 0.7960 | 0.1690 | 0.0522 |
| 217.1347 | 73.0 | 1825 | 213.0014 | 0.825 | 0.2926 | 1.3111 | 0.825 | 0.8040 | 0.1390 | 0.0574 |
| 217.1347 | 74.0 | 1850 | 212.9525 | 0.82 | 0.2985 | 1.3181 | 0.82 | 0.7962 | 0.1579 | 0.0543 |
| 217.1347 | 75.0 | 1875 | 212.9581 | 0.815 | 0.3024 | 1.2835 | 0.815 | 0.7810 | 0.1648 | 0.0504 |
| 217.1347 | 76.0 | 1900 | 213.0073 | 0.835 | 0.2970 | 1.3745 | 0.835 | 0.8095 | 0.1597 | 0.0579 |
| 217.1347 | 77.0 | 1925 | 213.0066 | 0.805 | 0.3046 | 1.3071 | 0.805 | 0.7783 | 0.1502 | 0.0547 |
| 217.1347 | 78.0 | 1950 | 212.9872 | 0.82 | 0.3018 | 1.4088 | 0.82 | 0.7928 | 0.1527 | 0.0527 |
| 217.1347 | 79.0 | 1975 | 212.9629 | 0.82 | 0.3024 | 1.3665 | 0.82 | 0.8012 | 0.1626 | 0.0551 |
| 216.794 | 80.0 | 2000 | 212.9545 | 0.825 | 0.3080 | 1.3609 | 0.825 | 0.8062 | 0.1652 | 0.0541 |
| 216.794 | 81.0 | 2025 | 212.9253 | 0.825 | 0.3077 | 1.3779 | 0.825 | 0.8044 | 0.1662 | 0.0547 |
| 216.794 | 82.0 | 2050 | 212.9501 | 0.82 | 0.3024 | 1.3636 | 0.82 | 0.7928 | 0.1677 | 0.0553 |
| 216.794 | 83.0 | 2075 | 212.9160 | 0.81 | 0.3055 | 1.3686 | 0.81 | 0.7786 | 0.1624 | 0.0578 |
| 216.794 | 84.0 | 2100 | 212.9532 | 0.84 | 0.2914 | 1.4589 | 0.8400 | 0.8129 | 0.1482 | 0.0510 |
| 216.794 | 85.0 | 2125 | 212.9397 | 0.825 | 0.3067 | 1.3768 | 0.825 | 0.7981 | 0.1653 | 0.0537 |
| 216.794 | 86.0 | 2150 | 212.8927 | 0.83 | 0.2980 | 1.3825 | 0.83 | 0.8118 | 0.1662 | 0.0560 |
| 216.794 | 87.0 | 2175 | 212.8856 | 0.825 | 0.3004 | 1.4017 | 0.825 | 0.8063 | 0.1595 | 0.0555 |
| 216.794 | 88.0 | 2200 | 212.9423 | 0.82 | 0.3033 | 1.3619 | 0.82 | 0.8012 | 0.1539 | 0.0517 |
| 216.794 | 89.0 | 2225 | 212.8776 | 0.84 | 0.2922 | 1.3845 | 0.8400 | 0.8176 | 0.1555 | 0.0537 |
| 216.794 | 90.0 | 2250 | 212.9439 | 0.82 | 0.3011 | 1.3923 | 0.82 | 0.8012 | 0.1526 | 0.0535 |
| 216.794 | 91.0 | 2275 | 212.8640 | 0.815 | 0.3006 | 1.3680 | 0.815 | 0.7920 | 0.1443 | 0.0548 |
| 216.794 | 92.0 | 2300 | 212.8850 | 0.825 | 0.2940 | 1.3317 | 0.825 | 0.8087 | 0.1466 | 0.0548 |
| 216.794 | 93.0 | 2325 | 212.8843 | 0.825 | 0.3024 | 1.3848 | 0.825 | 0.8027 | 0.1688 | 0.0529 |
| 216.794 | 94.0 | 2350 | 212.9464 | 0.825 | 0.3013 | 1.3634 | 0.825 | 0.8027 | 0.1607 | 0.0515 |
| 216.794 | 95.0 | 2375 | 212.9154 | 0.825 | 0.3001 | 1.3262 | 0.825 | 0.8087 | 0.1604 | 0.0539 |
| 216.794 | 96.0 | 2400 | 212.9131 | 0.825 | 0.2984 | 1.3121 | 0.825 | 0.8027 | 0.1568 | 0.0538 |
| 216.794 | 97.0 | 2425 | 212.8888 | 0.82 | 0.3003 | 1.3975 | 0.82 | 0.8012 | 0.1614 | 0.0527 |
| 216.794 | 98.0 | 2450 | 212.8581 | 0.825 | 0.2989 | 1.2911 | 0.825 | 0.8027 | 0.1673 | 0.0542 |
| 216.794 | 99.0 | 2475 | 212.9360 | 0.83 | 0.2983 | 1.3929 | 0.83 | 0.8118 | 0.1616 | 0.0534 |
| 216.6176 | 100.0 | 2500 | 212.9255 | 0.825 | 0.2955 | 1.3247 | 0.825 | 0.8027 | 0.1660 | 0.0554 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
mvkvc/convnextv2-base-22k-224-finetuned-critique
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# convnextv2-base-22k-224-finetuned-critique
This model is a fine-tuned version of [facebook/convnextv2-base-22k-224](https://huggingface.co/facebook/convnextv2-base-22k-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2878
- Accuracy: 0.8825
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
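For reference, these settings map onto Hugging Face `TrainingArguments` roughly as below. This is a minimal sketch; the `output_dir` is a placeholder, not anything stated on this card:

```python
from transformers import TrainingArguments

# Minimal sketch mirroring the hyperparameter list above; output_dir is a placeholder.
args = TrainingArguments(
    output_dir="convnextv2-base-22k-224-finetuned-critique",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,  # 32 * 4 = 128 total train batch size
    seed=42,
    adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=3,
)
```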
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2128 | 1.0 | 56 | 0.3349 | 0.8588 |
| 0.2064 | 1.99 | 112 | 0.2999 | 0.8812 |
| 0.1788 | 2.99 | 168 | 0.2878 | 0.8825 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"ai",
"real"
] |
annazhong/swin-tiny-patch4-window7-224-finetuned-eurosat
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-eurosat
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the cifar10 dataset.
It achieves the following results on the evaluation set (an inference sketch follows the list):
- Loss: 0.1055
- Accuracy: 0.9674
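A minimal inference sketch, assuming the checkpoint is published on the Hub under this repository id and that `some_image.png` is a local image in the CIFAR-10 style:

```python
from transformers import pipeline

# Image-classification pipeline over the fine-tuned checkpoint (assumed repo id).
classifier = pipeline(
    "image-classification",
    model="annazhong/swin-tiny-patch4-window7-224-finetuned-eurosat",
)
print(classifier("some_image.png"))  # e.g. [{'label': 'cat', 'score': 0.98}, ...]
```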
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.5173 | 1.0 | 351 | 0.1623 | 0.9444 |
| 0.4107        | 2.0   | 703  | 0.1182          | 0.9620   |
| 0.3376 | 2.99 | 1053 | 0.1055 | 0.9674 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"airplane",
"automobile",
"bird",
"cat",
"deer",
"dog",
"frog",
"horse",
"ship",
"truck"
] |
mvkvc/convnextv2-base-22k-224-finetuned-critique-100k
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# convnextv2-base-22k-224-finetuned-critique-100k
This model is a fine-tuned version of [facebook/convnextv2-base-22k-224](https://huggingface.co/facebook/convnextv2-base-22k-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1309
- Accuracy: 0.9479
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a batch-size sanity check follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
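The `total_train_batch_size` above is derived rather than set directly; a one-line check, assuming a single device (the card does not state the device count):

```python
# Effective batch size under gradient accumulation (single-device assumption).
per_device_train_batch_size = 32
gradient_accumulation_steps = 4
assert per_device_train_batch_size * gradient_accumulation_steps == 128
```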
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6277 | 0.07 | 50 | 0.5987 | 0.6767 |
| 0.5459 | 0.14 | 100 | 0.5187 | 0.7401 |
| 0.4397 | 0.21 | 150 | 0.4448 | 0.7768 |
| 0.4197 | 0.28 | 200 | 0.3686 | 0.8401 |
| 0.3397 | 0.36 | 250 | 0.3153 | 0.8664 |
| 0.3345 | 0.43 | 300 | 0.3071 | 0.8701 |
| 0.3177 | 0.5 | 350 | 0.2576 | 0.8938 |
| 0.3182 | 0.57 | 400 | 0.2546 | 0.8926 |
| 0.2596 | 0.64 | 450 | 0.2320 | 0.9004 |
| 0.2563 | 0.71 | 500 | 0.2205 | 0.9082 |
| 0.2543 | 0.78 | 550 | 0.2142 | 0.9147 |
| 0.2768 | 0.85 | 600 | 0.2136 | 0.9132 |
| 0.2486 | 0.92 | 650 | 0.2052 | 0.9175 |
| 0.2504 | 1.0 | 700 | 0.2314 | 0.9058 |
| 0.2437 | 1.07 | 750 | 0.1943 | 0.9235 |
| 0.212 | 1.14 | 800 | 0.2019 | 0.9183 |
| 0.1891 | 1.21 | 850 | 0.1845 | 0.9254 |
| 0.2105 | 1.28 | 900 | 0.1834 | 0.9288 |
| 0.2285 | 1.35 | 950 | 0.1994 | 0.9206 |
| 0.2214 | 1.42 | 1000 | 0.1804 | 0.9251 |
| 0.1848 | 1.49 | 1050 | 0.1975 | 0.9196 |
| 0.191 | 1.56 | 1100 | 0.1795 | 0.9269 |
| 0.1794 | 1.64 | 1150 | 0.1606 | 0.9358 |
| 0.2084 | 1.71 | 1200 | 0.1807 | 0.9293 |
| 0.199 | 1.78 | 1250 | 0.1697 | 0.9307 |
| 0.1874 | 1.85 | 1300 | 0.1650 | 0.9372 |
| 0.1681        | 1.92  | 1350 | 0.1515          | 0.9390   |
| 0.1696 | 1.99 | 1400 | 0.1473 | 0.9416 |
| 0.1651 | 2.06 | 1450 | 0.1489 | 0.9428 |
| 0.1627 | 2.13 | 1500 | 0.1529 | 0.9395 |
| 0.1754 | 2.2 | 1550 | 0.1540 | 0.9379 |
| 0.1302        | 2.28  | 1600 | 0.1579          | 0.9390   |
| 0.1643 | 2.35 | 1650 | 0.1518 | 0.9401 |
| 0.1938        | 2.42  | 1700 | 0.1479          | 0.9410   |
| 0.1441 | 2.49 | 1750 | 0.1451 | 0.9436 |
| 0.1478 | 2.56 | 1800 | 0.1324 | 0.9472 |
| 0.1275 | 2.63 | 1850 | 0.1340 | 0.9466 |
| 0.1582 | 2.7 | 1900 | 0.1501 | 0.9391 |
| 0.1472 | 2.77 | 1950 | 0.1354 | 0.9451 |
| 0.1522 | 2.84 | 2000 | 0.1309 | 0.9479 |
| 0.1593 | 2.92 | 2050 | 0.1433 | 0.9452 |
| 0.1541 | 2.99 | 2100 | 0.1381 | 0.9466 |
| 0.1297 | 3.06 | 2150 | 0.1320 | 0.9479 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"ai",
"real"
] |
jordyvl/vit-small_tobacco3482_hint_
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_tobacco3482_hint_
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set (a sketch of the ECE metric follows the list):
- Loss: 56.9670
- Accuracy: 0.835
- Brier Loss: 0.2969
- Nll: 1.1900
- F1 Micro: 0.835
- F1 Macro: 0.8377
- Ece: 0.1545
- Aurc: 0.0499
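The `Ece` figure is an expected calibration error. Below is a minimal NumPy sketch of the standard equal-width-bin recipe; `n_bins=10` is an assumption, since the card does not state the bin count:

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Equal-width-bin ECE; n_bins=10 is an assumption, not from the card."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            # |accuracy - mean confidence|, weighted by the bin's share of samples
            ece += in_bin.mean() * abs(correct[in_bin].mean() - confidences[in_bin].mean())
    return ece
```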
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 59.9015 | 0.37 | 0.7765 | 4.3056 | 0.37 | 0.2185 | 0.2975 | 0.4588 |
| No log | 2.0 | 50 | 58.9173 | 0.66 | 0.4866 | 2.0758 | 0.66 | 0.5717 | 0.2732 | 0.1578 |
| No log | 3.0 | 75 | 58.3604 | 0.745 | 0.3466 | 1.5077 | 0.745 | 0.7135 | 0.1846 | 0.0854 |
| No log | 4.0 | 100 | 58.0585 | 0.75 | 0.3628 | 1.5044 | 0.75 | 0.7674 | 0.2058 | 0.1052 |
| No log | 5.0 | 125 | 57.8363 | 0.76 | 0.3782 | 1.7066 | 0.76 | 0.7657 | 0.2174 | 0.1039 |
| No log | 6.0 | 150 | 57.4894 | 0.75 | 0.3593 | 1.5137 | 0.75 | 0.7377 | 0.1724 | 0.0800 |
| No log | 7.0 | 175 | 57.5188 | 0.76 | 0.3631 | 1.9770 | 0.76 | 0.7514 | 0.1968 | 0.0874 |
| No log | 8.0 | 200 | 57.4349 | 0.74 | 0.3947 | 1.8766 | 0.74 | 0.7412 | 0.1753 | 0.0777 |
| No log | 9.0 | 225 | 57.1764 | 0.765 | 0.3481 | 1.1532 | 0.765 | 0.7411 | 0.1956 | 0.0829 |
| No log | 10.0 | 250 | 57.6192 | 0.765 | 0.3943 | 1.8998 | 0.765 | 0.7755 | 0.1850 | 0.0981 |
| No log | 11.0 | 275 | 57.2121 | 0.77 | 0.3531 | 1.2685 | 0.7700 | 0.7643 | 0.1739 | 0.0689 |
| No log | 12.0 | 300 | 57.2250 | 0.795 | 0.3279 | 1.7553 | 0.795 | 0.7816 | 0.1596 | 0.0660 |
| No log | 13.0 | 325 | 57.4911 | 0.785 | 0.3678 | 2.0499 | 0.785 | 0.7857 | 0.1788 | 0.0945 |
| No log | 14.0 | 350 | 57.1481 | 0.77 | 0.3542 | 1.4834 | 0.7700 | 0.7649 | 0.1892 | 0.0636 |
| No log | 15.0 | 375 | 57.1701 | 0.825 | 0.3041 | 1.6075 | 0.825 | 0.8223 | 0.1609 | 0.0621 |
| No log | 16.0 | 400 | 57.4059 | 0.805 | 0.3343 | 1.7348 | 0.805 | 0.8080 | 0.1654 | 0.0822 |
| No log | 17.0 | 425 | 57.9813 | 0.72 | 0.4616 | 2.6345 | 0.72 | 0.7252 | 0.2263 | 0.1101 |
| No log | 18.0 | 450 | 57.2677 | 0.825 | 0.2953 | 1.5836 | 0.825 | 0.8171 | 0.1590 | 0.0572 |
| No log | 19.0 | 475 | 57.6052 | 0.765 | 0.4023 | 1.7463 | 0.765 | 0.7333 | 0.2052 | 0.0822 |
| 57.2084 | 20.0 | 500 | 57.4249 | 0.79 | 0.3653 | 1.5564 | 0.79 | 0.7941 | 0.1818 | 0.0845 |
| 57.2084 | 21.0 | 525 | 57.2631 | 0.845 | 0.2704 | 1.7326 | 0.845 | 0.8312 | 0.1358 | 0.0628 |
| 57.2084 | 22.0 | 550 | 57.1520 | 0.845 | 0.2723 | 1.2743 | 0.845 | 0.8386 | 0.1402 | 0.0551 |
| 57.2084 | 23.0 | 575 | 57.2977 | 0.82 | 0.3137 | 1.3068 | 0.82 | 0.8029 | 0.1578 | 0.0621 |
| 57.2084 | 24.0 | 600 | 57.2030 | 0.81 | 0.3107 | 1.5814 | 0.81 | 0.7870 | 0.1594 | 0.0688 |
| 57.2084 | 25.0 | 625 | 57.1500 | 0.82 | 0.3027 | 1.4128 | 0.82 | 0.8229 | 0.1584 | 0.0436 |
| 57.2084 | 26.0 | 650 | 57.1619 | 0.855 | 0.2735 | 1.5164 | 0.855 | 0.8558 | 0.1404 | 0.0530 |
| 57.2084 | 27.0 | 675 | 57.1504 | 0.845 | 0.2832 | 1.5742 | 0.845 | 0.8507 | 0.1500 | 0.0516 |
| 57.2084 | 28.0 | 700 | 57.1829 | 0.835 | 0.2932 | 1.4010 | 0.835 | 0.8410 | 0.1489 | 0.0496 |
| 57.2084 | 29.0 | 725 | 57.1899 | 0.83 | 0.2953 | 1.4038 | 0.83 | 0.8338 | 0.1497 | 0.0511 |
| 57.2084 | 30.0 | 750 | 57.1644 | 0.835 | 0.2948 | 1.3923 | 0.835 | 0.8374 | 0.1509 | 0.0507 |
| 57.2084 | 31.0 | 775 | 57.1720 | 0.83 | 0.2958 | 1.4622 | 0.83 | 0.8296 | 0.1502 | 0.0509 |
| 57.2084 | 32.0 | 800 | 57.1365 | 0.835 | 0.3024 | 1.2976 | 0.835 | 0.8374 | 0.1575 | 0.0509 |
| 57.2084 | 33.0 | 825 | 57.1499 | 0.835 | 0.2995 | 1.3654 | 0.835 | 0.8308 | 0.1574 | 0.0523 |
| 57.2084 | 34.0 | 850 | 57.1064 | 0.83 | 0.3022 | 1.3606 | 0.83 | 0.8251 | 0.1578 | 0.0526 |
| 57.2084 | 35.0 | 875 | 57.0901 | 0.835 | 0.3003 | 1.2803 | 0.835 | 0.8336 | 0.1554 | 0.0516 |
| 57.2084 | 36.0 | 900 | 57.0922 | 0.835 | 0.3047 | 1.2749 | 0.835 | 0.8336 | 0.1571 | 0.0517 |
| 57.2084 | 37.0 | 925 | 57.0673 | 0.83 | 0.3034 | 1.2533 | 0.83 | 0.8344 | 0.1559 | 0.0509 |
| 57.2084 | 38.0 | 950 | 57.0810 | 0.83 | 0.3024 | 1.2718 | 0.83 | 0.8344 | 0.1620 | 0.0526 |
| 57.2084 | 39.0 | 975 | 57.1040 | 0.835 | 0.3041 | 1.2522 | 0.835 | 0.8392 | 0.1571 | 0.0506 |
| 56.1387 | 40.0 | 1000 | 57.0542 | 0.835 | 0.3024 | 1.3210 | 0.835 | 0.8392 | 0.1525 | 0.0501 |
| 56.1387 | 41.0 | 1025 | 57.0554 | 0.83 | 0.3037 | 1.3231 | 0.83 | 0.8344 | 0.1534 | 0.0508 |
| 56.1387 | 42.0 | 1050 | 57.0724 | 0.83 | 0.2989 | 1.2517 | 0.83 | 0.8344 | 0.1485 | 0.0495 |
| 56.1387 | 43.0 | 1075 | 57.0429 | 0.835 | 0.3010 | 1.3082 | 0.835 | 0.8401 | 0.1557 | 0.0506 |
| 56.1387 | 44.0 | 1100 | 57.0208 | 0.835 | 0.3001 | 1.2428 | 0.835 | 0.8392 | 0.1583 | 0.0496 |
| 56.1387 | 45.0 | 1125 | 57.0700 | 0.835 | 0.2996 | 1.3149 | 0.835 | 0.8454 | 0.1601 | 0.0509 |
| 56.1387 | 46.0 | 1150 | 57.0054 | 0.835 | 0.2950 | 1.3019 | 0.835 | 0.8407 | 0.1476 | 0.0492 |
| 56.1387 | 47.0 | 1175 | 57.0516 | 0.825 | 0.3000 | 1.2344 | 0.825 | 0.8317 | 0.1485 | 0.0511 |
| 56.1387 | 48.0 | 1200 | 57.0373 | 0.835 | 0.3008 | 1.3016 | 0.835 | 0.8434 | 0.1611 | 0.0498 |
| 56.1387 | 49.0 | 1225 | 57.0154 | 0.83 | 0.2982 | 1.2376 | 0.83 | 0.8329 | 0.1515 | 0.0501 |
| 56.1387 | 50.0 | 1250 | 57.0000 | 0.835 | 0.2982 | 1.2196 | 0.835 | 0.8434 | 0.1535 | 0.0493 |
| 56.1387 | 51.0 | 1275 | 57.0054 | 0.825 | 0.2987 | 1.2217 | 0.825 | 0.8352 | 0.1517 | 0.0505 |
| 56.1387 | 52.0 | 1300 | 57.0347 | 0.835 | 0.2996 | 1.2239 | 0.835 | 0.8407 | 0.1643 | 0.0486 |
| 56.1387 | 53.0 | 1325 | 57.0183 | 0.835 | 0.2989 | 1.2208 | 0.835 | 0.8411 | 0.1604 | 0.0495 |
| 56.1387 | 54.0 | 1350 | 57.0094 | 0.845 | 0.2925 | 1.1545 | 0.845 | 0.8494 | 0.1515 | 0.0486 |
| 56.1387 | 55.0 | 1375 | 57.0027 | 0.83 | 0.2974 | 1.2161 | 0.83 | 0.8380 | 0.1538 | 0.0491 |
| 56.1387 | 56.0 | 1400 | 57.0060 | 0.835 | 0.2975 | 1.2215 | 0.835 | 0.8407 | 0.1546 | 0.0505 |
| 56.1387 | 57.0 | 1425 | 56.9898 | 0.835 | 0.2959 | 1.1432 | 0.835 | 0.8411 | 0.1483 | 0.0501 |
| 56.1387 | 58.0 | 1450 | 56.9907 | 0.835 | 0.2963 | 1.1437 | 0.835 | 0.8406 | 0.1527 | 0.0485 |
| 56.1387 | 59.0 | 1475 | 56.9578 | 0.84 | 0.2935 | 1.1583 | 0.8400 | 0.8439 | 0.1513 | 0.0488 |
| 55.9877 | 60.0 | 1500 | 57.0032 | 0.84 | 0.2957 | 1.2160 | 0.8400 | 0.8439 | 0.1460 | 0.0502 |
| 55.9877 | 61.0 | 1525 | 56.9880 | 0.835 | 0.2990 | 1.2836 | 0.835 | 0.8406 | 0.1475 | 0.0489 |
| 55.9877 | 62.0 | 1550 | 56.9920 | 0.83 | 0.2973 | 1.2071 | 0.83 | 0.8349 | 0.1519 | 0.0494 |
| 55.9877 | 63.0 | 1575 | 56.9681 | 0.835 | 0.2978 | 1.2076 | 0.835 | 0.8406 | 0.1465 | 0.0483 |
| 55.9877 | 64.0 | 1600 | 56.9772 | 0.835 | 0.3003 | 1.1997 | 0.835 | 0.8406 | 0.1567 | 0.0489 |
| 55.9877 | 65.0 | 1625 | 56.9705 | 0.835 | 0.2973 | 1.2038 | 0.835 | 0.8406 | 0.1520 | 0.0495 |
| 55.9877 | 66.0 | 1650 | 56.9682 | 0.835 | 0.2977 | 1.2005 | 0.835 | 0.8406 | 0.1576 | 0.0488 |
| 55.9877 | 67.0 | 1675 | 56.9775 | 0.835 | 0.2981 | 1.2093 | 0.835 | 0.8406 | 0.1497 | 0.0501 |
| 55.9877 | 68.0 | 1700 | 56.9762 | 0.835 | 0.2989 | 1.2061 | 0.835 | 0.8406 | 0.1626 | 0.0491 |
| 55.9877 | 69.0 | 1725 | 56.9807 | 0.84 | 0.2978 | 1.2023 | 0.8400 | 0.8434 | 0.1503 | 0.0481 |
| 55.9877 | 70.0 | 1750 | 56.9705 | 0.835 | 0.2988 | 1.1987 | 0.835 | 0.8406 | 0.1564 | 0.0487 |
| 55.9877 | 71.0 | 1775 | 56.9752 | 0.83 | 0.2987 | 1.2027 | 0.83 | 0.8349 | 0.1593 | 0.0497 |
| 55.9877 | 72.0 | 1800 | 56.9957 | 0.83 | 0.2996 | 1.2060 | 0.83 | 0.8349 | 0.1607 | 0.0496 |
| 55.9877 | 73.0 | 1825 | 56.9697 | 0.84 | 0.2966 | 1.1977 | 0.8400 | 0.8434 | 0.1510 | 0.0487 |
| 55.9877 | 74.0 | 1850 | 56.9644 | 0.83 | 0.2997 | 1.2055 | 0.83 | 0.8349 | 0.1528 | 0.0506 |
| 55.9877 | 75.0 | 1875 | 56.9677 | 0.84 | 0.2968 | 1.1969 | 0.8400 | 0.8434 | 0.1536 | 0.0495 |
| 55.9877 | 76.0 | 1900 | 56.9609 | 0.84 | 0.2958 | 1.1921 | 0.8400 | 0.8434 | 0.1531 | 0.0495 |
| 55.9877 | 77.0 | 1925 | 56.9663 | 0.835 | 0.2965 | 1.1950 | 0.835 | 0.8406 | 0.1576 | 0.0494 |
| 55.9877 | 78.0 | 1950 | 56.9796 | 0.83 | 0.2968 | 1.2049 | 0.83 | 0.8349 | 0.1525 | 0.0496 |
| 55.9877 | 79.0 | 1975 | 56.9648 | 0.835 | 0.2966 | 1.1944 | 0.835 | 0.8406 | 0.1545 | 0.0494 |
| 55.9237 | 80.0 | 2000 | 56.9596 | 0.845 | 0.2944 | 1.1912 | 0.845 | 0.8480 | 0.1543 | 0.0492 |
| 55.9237 | 81.0 | 2025 | 56.9596 | 0.84 | 0.2951 | 1.1878 | 0.8400 | 0.8434 | 0.1546 | 0.0492 |
| 55.9237 | 82.0 | 2050 | 56.9737 | 0.84 | 0.2958 | 1.1954 | 0.8400 | 0.8434 | 0.1521 | 0.0498 |
| 55.9237 | 83.0 | 2075 | 56.9725 | 0.835 | 0.2974 | 1.1963 | 0.835 | 0.8377 | 0.1512 | 0.0500 |
| 55.9237 | 84.0 | 2100 | 56.9743 | 0.835 | 0.2978 | 1.1928 | 0.835 | 0.8406 | 0.1554 | 0.0500 |
| 55.9237 | 85.0 | 2125 | 56.9788 | 0.835 | 0.2971 | 1.1952 | 0.835 | 0.8377 | 0.1493 | 0.0500 |
| 55.9237 | 86.0 | 2150 | 56.9705 | 0.84 | 0.2968 | 1.1933 | 0.8400 | 0.8434 | 0.1541 | 0.0499 |
| 55.9237 | 87.0 | 2175 | 56.9684 | 0.835 | 0.2966 | 1.1926 | 0.835 | 0.8377 | 0.1517 | 0.0497 |
| 55.9237 | 88.0 | 2200 | 56.9725 | 0.835 | 0.2979 | 1.1934 | 0.835 | 0.8377 | 0.1548 | 0.0497 |
| 55.9237 | 89.0 | 2225 | 56.9704 | 0.84 | 0.2959 | 1.1934 | 0.8400 | 0.8434 | 0.1527 | 0.0495 |
| 55.9237 | 90.0 | 2250 | 56.9681 | 0.84 | 0.2950 | 1.1907 | 0.8400 | 0.8434 | 0.1503 | 0.0498 |
| 55.9237 | 91.0 | 2275 | 56.9763 | 0.835 | 0.2979 | 1.1934 | 0.835 | 0.8377 | 0.1516 | 0.0501 |
| 55.9237 | 92.0 | 2300 | 56.9649 | 0.835 | 0.2959 | 1.1889 | 0.835 | 0.8377 | 0.1501 | 0.0495 |
| 55.9237 | 93.0 | 2325 | 56.9687 | 0.835 | 0.2959 | 1.1871 | 0.835 | 0.8377 | 0.1519 | 0.0501 |
| 55.9237 | 94.0 | 2350 | 56.9663 | 0.835 | 0.2963 | 1.1901 | 0.835 | 0.8377 | 0.1533 | 0.0496 |
| 55.9237 | 95.0 | 2375 | 56.9674 | 0.84 | 0.2955 | 1.1895 | 0.8400 | 0.8434 | 0.1534 | 0.0498 |
| 55.9237 | 96.0 | 2400 | 56.9661 | 0.835 | 0.2966 | 1.1907 | 0.835 | 0.8377 | 0.1520 | 0.0496 |
| 55.9237 | 97.0 | 2425 | 56.9623 | 0.84 | 0.2958 | 1.1871 | 0.8400 | 0.8434 | 0.1532 | 0.0499 |
| 55.9237 | 98.0 | 2450 | 56.9694 | 0.835 | 0.2969 | 1.1897 | 0.835 | 0.8377 | 0.1543 | 0.0499 |
| 55.9237 | 99.0 | 2475 | 56.9698 | 0.835 | 0.2967 | 1.1906 | 0.835 | 0.8377 | 0.1543 | 0.0499 |
| 55.8955 | 100.0 | 2500 | 56.9670 | 0.835 | 0.2969 | 1.1900 | 0.835 | 0.8377 | 0.1545 | 0.0499 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
dyvapandhu/vit-base-molecul
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-molecul
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set (a loading sketch follows the list):
- Loss: 0.6052
- Accuracy: 0.69
- F1: 0.6900
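A loading sketch, assuming the checkpoint is published under this repository id; the two labels (`a`, `c`) come from the label list at the end of this card:

```python
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Assumed Hub repo id; a binary molecule classifier with labels "a" and "c".
processor = AutoImageProcessor.from_pretrained("dyvapandhu/vit-base-molecul")
model = AutoModelForImageClassification.from_pretrained("dyvapandhu/vit-base-molecul")
print(model.config.id2label)  # expected: {0: 'a', 1: 'c'}
```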
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.4633 | 3.85 | 50 | 0.6052 | 0.69 | 0.6900 |
| 0.0341 | 7.69 | 100 | 0.8705 | 0.71 | 0.7100 |
### Framework versions
- Transformers 4.31.0.dev0
- Pytorch 2.0.1
- Datasets 2.13.1
- Tokenizers 0.11.0
|
[
"a",
"c"
] |
jordyvl/vit-tiny_tobacco3482_hint_
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_tobacco3482_hint_
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set (a Brier-score sketch follows the list):
- Loss: 56.8340
- Accuracy: 0.85
- Brier Loss: 0.2391
- Nll: 1.2700
- F1 Micro: 0.85
- F1 Macro: 0.8414
- Ece: 0.1228
- Aurc: 0.0436
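`Brier Loss` here is the multiclass Brier score. A short NumPy sketch, assuming the usual mean-over-examples, sum-over-classes reduction (the card does not spell out its convention):

```python
import numpy as np

def brier_score(probs, labels):
    """Multiclass Brier score: squared error between the predicted
    distribution and the one-hot target, summed over classes and
    averaged over examples (an assumed convention)."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))
```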
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 59.9929 | 0.25 | 0.8706 | 4.9000 | 0.25 | 0.1856 | 0.3000 | 0.7819 |
| No log | 2.0 | 50 | 59.2119 | 0.54 | 0.5951 | 2.8481 | 0.54 | 0.4540 | 0.2868 | 0.2585 |
| No log | 3.0 | 75 | 58.7608 | 0.675 | 0.4417 | 1.6161 | 0.675 | 0.6086 | 0.2356 | 0.1335 |
| No log | 4.0 | 100 | 58.5280 | 0.75 | 0.3897 | 1.7733 | 0.75 | 0.7378 | 0.2211 | 0.1173 |
| No log | 5.0 | 125 | 58.2236 | 0.81 | 0.3119 | 1.5196 | 0.81 | 0.7653 | 0.1573 | 0.0769 |
| No log | 6.0 | 150 | 58.0804 | 0.805 | 0.3048 | 1.6467 | 0.805 | 0.7766 | 0.1355 | 0.0599 |
| No log | 7.0 | 175 | 58.3912 | 0.645 | 0.5320 | 2.1424 | 0.645 | 0.6017 | 0.2645 | 0.1497 |
| No log | 8.0 | 200 | 57.7380 | 0.755 | 0.3634 | 1.4581 | 0.755 | 0.7173 | 0.1913 | 0.0920 |
| No log | 9.0 | 225 | 57.5060 | 0.805 | 0.2939 | 1.4346 | 0.805 | 0.7860 | 0.1603 | 0.0636 |
| No log | 10.0 | 250 | 57.5367 | 0.805 | 0.3345 | 1.7441 | 0.805 | 0.7872 | 0.1748 | 0.0727 |
| No log | 11.0 | 275 | 57.7193 | 0.725 | 0.4195 | 1.7223 | 0.7250 | 0.6706 | 0.2121 | 0.0984 |
| No log | 12.0 | 300 | 57.5447 | 0.785 | 0.3435 | 1.7574 | 0.785 | 0.7772 | 0.1667 | 0.0678 |
| No log | 13.0 | 325 | 57.4991 | 0.795 | 0.3335 | 1.5337 | 0.795 | 0.7724 | 0.1685 | 0.0895 |
| No log | 14.0 | 350 | 57.3536 | 0.835 | 0.2942 | 1.5660 | 0.835 | 0.8089 | 0.1451 | 0.0594 |
| No log | 15.0 | 375 | 57.5013 | 0.815 | 0.3096 | 1.7294 | 0.815 | 0.7978 | 0.1579 | 0.0696 |
| No log | 16.0 | 400 | 57.3610 | 0.835 | 0.2763 | 1.7078 | 0.835 | 0.8182 | 0.1354 | 0.0526 |
| No log | 17.0 | 425 | 57.3415 | 0.825 | 0.3002 | 1.3895 | 0.825 | 0.8249 | 0.1455 | 0.0651 |
| No log | 18.0 | 450 | 57.4573 | 0.79 | 0.3649 | 1.6263 | 0.79 | 0.7688 | 0.1936 | 0.0943 |
| No log | 19.0 | 475 | 57.5728 | 0.8 | 0.3261 | 1.7616 | 0.8000 | 0.7461 | 0.1669 | 0.0659 |
| 57.5035 | 20.0 | 500 | 57.3014 | 0.835 | 0.2982 | 1.3501 | 0.835 | 0.8175 | 0.1442 | 0.0549 |
| 57.5035 | 21.0 | 525 | 57.3792 | 0.82 | 0.3206 | 1.4779 | 0.82 | 0.7957 | 0.1698 | 0.0568 |
| 57.5035 | 22.0 | 550 | 57.6968 | 0.795 | 0.3627 | 1.8539 | 0.795 | 0.7468 | 0.1809 | 0.0698 |
| 57.5035 | 23.0 | 575 | 57.4539 | 0.8 | 0.3465 | 1.4210 | 0.8000 | 0.7787 | 0.1736 | 0.0776 |
| 57.5035 | 24.0 | 600 | 57.1479 | 0.815 | 0.2998 | 1.3434 | 0.815 | 0.7989 | 0.1513 | 0.0607 |
| 57.5035 | 25.0 | 625 | 57.3390 | 0.79 | 0.3482 | 1.2595 | 0.79 | 0.7803 | 0.1635 | 0.0652 |
| 57.5035 | 26.0 | 650 | 57.4046 | 0.82 | 0.3130 | 1.5825 | 0.82 | 0.8060 | 0.1495 | 0.0692 |
| 57.5035 | 27.0 | 675 | 57.1398 | 0.835 | 0.2643 | 1.5441 | 0.835 | 0.8185 | 0.1317 | 0.0485 |
| 57.5035 | 28.0 | 700 | 57.4322 | 0.82 | 0.3217 | 1.5260 | 0.82 | 0.7912 | 0.1576 | 0.0643 |
| 57.5035 | 29.0 | 725 | 57.1362 | 0.87 | 0.2291 | 1.3903 | 0.87 | 0.8621 | 0.1104 | 0.0539 |
| 57.5035 | 30.0 | 750 | 57.1131 | 0.855 | 0.2511 | 1.1423 | 0.855 | 0.8451 | 0.1292 | 0.0492 |
| 57.5035 | 31.0 | 775 | 56.8690 | 0.845 | 0.2454 | 1.2822 | 0.845 | 0.8265 | 0.1254 | 0.0473 |
| 57.5035 | 32.0 | 800 | 56.8384 | 0.855 | 0.2220 | 1.2243 | 0.855 | 0.8453 | 0.1119 | 0.0382 |
| 57.5035 | 33.0 | 825 | 56.9461 | 0.855 | 0.2450 | 1.2192 | 0.855 | 0.8537 | 0.1276 | 0.0395 |
| 57.5035 | 34.0 | 850 | 56.9061 | 0.85 | 0.2450 | 1.2097 | 0.85 | 0.8337 | 0.1216 | 0.0408 |
| 57.5035 | 35.0 | 875 | 56.9100 | 0.86 | 0.2413 | 1.2197 | 0.8600 | 0.8546 | 0.1189 | 0.0387 |
| 57.5035 | 36.0 | 900 | 56.9087 | 0.855 | 0.2444 | 1.2098 | 0.855 | 0.8460 | 0.1281 | 0.0379 |
| 57.5035 | 37.0 | 925 | 56.8923 | 0.86 | 0.2438 | 1.2156 | 0.8600 | 0.8509 | 0.1259 | 0.0378 |
| 57.5035 | 38.0 | 950 | 56.8908 | 0.85 | 0.2450 | 1.2187 | 0.85 | 0.8397 | 0.1236 | 0.0390 |
| 57.5035 | 39.0 | 975 | 56.8591 | 0.855 | 0.2404 | 1.2063 | 0.855 | 0.8411 | 0.1326 | 0.0383 |
| 56.2493 | 40.0 | 1000 | 56.8479 | 0.86 | 0.2352 | 1.2142 | 0.8600 | 0.8539 | 0.1257 | 0.0386 |
| 56.2493 | 41.0 | 1025 | 56.8762 | 0.86 | 0.2365 | 1.2166 | 0.8600 | 0.8492 | 0.1171 | 0.0383 |
| 56.2493 | 42.0 | 1050 | 56.8551 | 0.865 | 0.2321 | 1.2078 | 0.865 | 0.8547 | 0.1183 | 0.0383 |
| 56.2493 | 43.0 | 1075 | 56.8913 | 0.87 | 0.2360 | 1.2013 | 0.87 | 0.8617 | 0.1187 | 0.0410 |
| 56.2493 | 44.0 | 1100 | 56.8523 | 0.86 | 0.2359 | 1.2094 | 0.8600 | 0.8506 | 0.1214 | 0.0399 |
| 56.2493 | 45.0 | 1125 | 56.8546 | 0.87 | 0.2330 | 1.2028 | 0.87 | 0.8650 | 0.1233 | 0.0384 |
| 56.2493 | 46.0 | 1150 | 56.8536 | 0.865 | 0.2323 | 1.2099 | 0.865 | 0.8579 | 0.1230 | 0.0389 |
| 56.2493 | 47.0 | 1175 | 56.8490 | 0.865 | 0.2346 | 1.2095 | 0.865 | 0.8580 | 0.1139 | 0.0411 |
| 56.2493 | 48.0 | 1200 | 56.8569 | 0.88 | 0.2295 | 1.1277 | 0.88 | 0.8693 | 0.1238 | 0.0381 |
| 56.2493 | 49.0 | 1225 | 56.8528 | 0.875 | 0.2292 | 1.1966 | 0.875 | 0.8681 | 0.1264 | 0.0394 |
| 56.2493 | 50.0 | 1250 | 56.8462 | 0.875 | 0.2309 | 1.1215 | 0.875 | 0.8690 | 0.1178 | 0.0424 |
| 56.2493 | 51.0 | 1275 | 56.8438 | 0.87 | 0.2285 | 1.1259 | 0.87 | 0.8631 | 0.1304 | 0.0404 |
| 56.2493 | 52.0 | 1300 | 56.8660 | 0.865 | 0.2334 | 1.1231 | 0.865 | 0.8623 | 0.1343 | 0.0416 |
| 56.2493 | 53.0 | 1325 | 56.8802 | 0.885 | 0.2273 | 1.1220 | 0.885 | 0.8784 | 0.1202 | 0.0415 |
| 56.2493 | 54.0 | 1350 | 56.8581 | 0.885 | 0.2261 | 1.1938 | 0.885 | 0.8772 | 0.1286 | 0.0408 |
| 56.2493 | 55.0 | 1375 | 56.8348 | 0.875 | 0.2301 | 1.1953 | 0.875 | 0.8675 | 0.1267 | 0.0404 |
| 56.2493 | 56.0 | 1400 | 56.8489 | 0.87 | 0.2336 | 1.1974 | 0.87 | 0.8589 | 0.1079 | 0.0393 |
| 56.2493 | 57.0 | 1425 | 56.8508 | 0.87 | 0.2307 | 1.1928 | 0.87 | 0.8648 | 0.1143 | 0.0399 |
| 56.2493 | 58.0 | 1450 | 56.8172 | 0.875 | 0.2198 | 1.2030 | 0.875 | 0.8675 | 0.1062 | 0.0378 |
| 56.2493 | 59.0 | 1475 | 56.8662 | 0.87 | 0.2287 | 1.1354 | 0.87 | 0.8659 | 0.1198 | 0.0410 |
| 56.0552 | 60.0 | 1500 | 56.8437 | 0.88 | 0.2241 | 1.1968 | 0.88 | 0.8735 | 0.1127 | 0.0407 |
| 56.0552 | 61.0 | 1525 | 56.8468 | 0.87 | 0.2303 | 1.1998 | 0.87 | 0.8648 | 0.1170 | 0.0428 |
| 56.0552 | 62.0 | 1550 | 56.8575 | 0.865 | 0.2299 | 1.1940 | 0.865 | 0.8599 | 0.1171 | 0.0411 |
| 56.0552 | 63.0 | 1575 | 56.8451 | 0.875 | 0.2291 | 1.1981 | 0.875 | 0.8697 | 0.1225 | 0.0395 |
| 56.0552 | 64.0 | 1600 | 56.8355 | 0.875 | 0.2255 | 1.2009 | 0.875 | 0.8675 | 0.1137 | 0.0394 |
| 56.0552 | 65.0 | 1625 | 56.8490 | 0.875 | 0.2338 | 1.2653 | 0.875 | 0.8697 | 0.1259 | 0.0413 |
| 56.0552 | 66.0 | 1650 | 56.8206 | 0.87 | 0.2271 | 1.2860 | 0.87 | 0.8603 | 0.1197 | 0.0379 |
| 56.0552 | 67.0 | 1675 | 56.8592 | 0.87 | 0.2307 | 1.1992 | 0.87 | 0.8636 | 0.1210 | 0.0408 |
| 56.0552 | 68.0 | 1700 | 56.8339 | 0.865 | 0.2290 | 1.1968 | 0.865 | 0.8599 | 0.1101 | 0.0386 |
| 56.0552 | 69.0 | 1725 | 56.8533 | 0.87 | 0.2336 | 1.2625 | 0.87 | 0.8636 | 0.1154 | 0.0406 |
| 56.0552 | 70.0 | 1750 | 56.8281 | 0.87 | 0.2328 | 1.2012 | 0.87 | 0.8659 | 0.1229 | 0.0406 |
| 56.0552 | 71.0 | 1775 | 56.8431 | 0.875 | 0.2335 | 1.2557 | 0.875 | 0.8697 | 0.1318 | 0.0398 |
| 56.0552 | 72.0 | 1800 | 56.8396 | 0.865 | 0.2321 | 1.2659 | 0.865 | 0.8599 | 0.1165 | 0.0397 |
| 56.0552 | 73.0 | 1825 | 56.8229 | 0.86 | 0.2302 | 1.2615 | 0.8600 | 0.8524 | 0.1258 | 0.0402 |
| 56.0552 | 74.0 | 1850 | 56.8445 | 0.87 | 0.2344 | 1.2371 | 0.87 | 0.8659 | 0.1202 | 0.0393 |
| 56.0552 | 75.0 | 1875 | 56.8475 | 0.865 | 0.2341 | 1.2660 | 0.865 | 0.8599 | 0.1202 | 0.0423 |
| 56.0552 | 76.0 | 1900 | 56.8338 | 0.86 | 0.2320 | 1.2643 | 0.8600 | 0.8524 | 0.1296 | 0.0422 |
| 56.0552 | 77.0 | 1925 | 56.8481 | 0.87 | 0.2353 | 1.2665 | 0.87 | 0.8659 | 0.1266 | 0.0426 |
| 56.0552 | 78.0 | 1950 | 56.8328 | 0.865 | 0.2323 | 1.2584 | 0.865 | 0.8599 | 0.1128 | 0.0424 |
| 56.0552 | 79.0 | 1975 | 56.8382 | 0.86 | 0.2363 | 1.2658 | 0.8600 | 0.8553 | 0.1273 | 0.0425 |
| 55.9822 | 80.0 | 2000 | 56.8260 | 0.86 | 0.2354 | 1.2710 | 0.8600 | 0.8553 | 0.1129 | 0.0430 |
| 55.9822 | 81.0 | 2025 | 56.8474 | 0.86 | 0.2398 | 1.2679 | 0.8600 | 0.8553 | 0.1212 | 0.0433 |
| 55.9822 | 82.0 | 2050 | 56.8105 | 0.855 | 0.2320 | 1.2655 | 0.855 | 0.8478 | 0.1269 | 0.0423 |
| 55.9822 | 83.0 | 2075 | 56.8240 | 0.86 | 0.2347 | 1.2651 | 0.8600 | 0.8524 | 0.1312 | 0.0425 |
| 55.9822 | 84.0 | 2100 | 56.8350 | 0.86 | 0.2353 | 1.2690 | 0.8600 | 0.8553 | 0.1225 | 0.0435 |
| 55.9822 | 85.0 | 2125 | 56.8317 | 0.855 | 0.2371 | 1.2674 | 0.855 | 0.8478 | 0.1211 | 0.0433 |
| 55.9822 | 86.0 | 2150 | 56.8270 | 0.855 | 0.2364 | 1.2646 | 0.855 | 0.8478 | 0.1270 | 0.0433 |
| 55.9822 | 87.0 | 2175 | 56.8275 | 0.855 | 0.2359 | 1.2660 | 0.855 | 0.8478 | 0.1167 | 0.0423 |
| 55.9822 | 88.0 | 2200 | 56.8426 | 0.855 | 0.2385 | 1.2683 | 0.855 | 0.8478 | 0.1239 | 0.0428 |
| 55.9822 | 89.0 | 2225 | 56.8376 | 0.855 | 0.2368 | 1.2676 | 0.855 | 0.8478 | 0.1239 | 0.0426 |
| 55.9822 | 90.0 | 2250 | 56.8358 | 0.855 | 0.2382 | 1.2670 | 0.855 | 0.8451 | 0.1213 | 0.0435 |
| 55.9822 | 91.0 | 2275 | 56.8254 | 0.86 | 0.2374 | 1.2687 | 0.8600 | 0.8536 | 0.1308 | 0.0432 |
| 55.9822 | 92.0 | 2300 | 56.8269 | 0.855 | 0.2359 | 1.2684 | 0.855 | 0.8476 | 0.1223 | 0.0425 |
| 55.9822 | 93.0 | 2325 | 56.8324 | 0.85 | 0.2381 | 1.2708 | 0.85 | 0.8414 | 0.1224 | 0.0432 |
| 55.9822 | 94.0 | 2350 | 56.8344 | 0.85 | 0.2384 | 1.2682 | 0.85 | 0.8414 | 0.1222 | 0.0433 |
| 55.9822 | 95.0 | 2375 | 56.8344 | 0.85 | 0.2387 | 1.2708 | 0.85 | 0.8414 | 0.1228 | 0.0434 |
| 55.9822 | 96.0 | 2400 | 56.8342 | 0.85 | 0.2389 | 1.2687 | 0.85 | 0.8414 | 0.1222 | 0.0432 |
| 55.9822 | 97.0 | 2425 | 56.8339 | 0.85 | 0.2387 | 1.2713 | 0.85 | 0.8414 | 0.1222 | 0.0433 |
| 55.9822 | 98.0 | 2450 | 56.8342 | 0.85 | 0.2393 | 1.2708 | 0.85 | 0.8414 | 0.1228 | 0.0437 |
| 55.9822 | 99.0 | 2475 | 56.8325 | 0.85 | 0.2390 | 1.2703 | 0.85 | 0.8414 | 0.1228 | 0.0436 |
| 55.9517 | 100.0 | 2500 | 56.8340 | 0.85 | 0.2391 | 1.2700 | 0.85 | 0.8414 | 0.1228 | 0.0436 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/81-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 81-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set (a hypothetical loss reconstruction follows the list):
- Loss: 0.4383
- Accuracy: 0.815
- Brier Loss: 0.3137
- Nll: 0.9180
- F1 Micro: 0.815
- F1 Macro: 0.7935
- Ece: 0.2739
- Aurc: 0.0546
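The `CEKD_t2.5_a0.5` suffix suggests a cross-entropy-plus-distillation objective with temperature 2.5 and mixing weight 0.5. The card itself does not document the loss, so the following is a hypothetical reconstruction of a common CE+KD formulation, inferred purely from the run name:

```python
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, T=2.5, alpha=0.5):
    # Hypothetical reconstruction from the run name; not verified against the card.
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # scale by T^2 to keep gradient magnitudes comparable
    return alpha * ce + (1.0 - alpha) * kd
```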
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 2.1414 | 0.1 | 1.0055 | 8.0967 | 0.1000 | 0.0761 | 0.3253 | 0.8939 |
| No log | 2.0 | 14 | 1.4539 | 0.215 | 0.8614 | 5.0189 | 0.2150 | 0.1812 | 0.2795 | 0.7550 |
| No log | 3.0 | 21 | 1.1486 | 0.39 | 0.7467 | 4.3499 | 0.39 | 0.3077 | 0.2755 | 0.4066 |
| No log | 4.0 | 28 | 0.9182 | 0.575 | 0.5990 | 2.9424 | 0.575 | 0.4619 | 0.2988 | 0.2367 |
| No log | 5.0 | 35 | 0.8134 | 0.645 | 0.5398 | 2.1261 | 0.645 | 0.5549 | 0.2689 | 0.1878 |
| No log | 6.0 | 42 | 0.7149 | 0.7 | 0.4612 | 2.2189 | 0.7 | 0.6006 | 0.2518 | 0.1316 |
| No log | 7.0 | 49 | 0.6607 | 0.73 | 0.4303 | 1.7076 | 0.7300 | 0.6689 | 0.2976 | 0.1200 |
| No log | 8.0 | 56 | 0.6109 | 0.755 | 0.3922 | 1.7703 | 0.755 | 0.6743 | 0.2638 | 0.0861 |
| No log | 9.0 | 63 | 0.6119 | 0.715 | 0.3943 | 1.4194 | 0.715 | 0.6534 | 0.2276 | 0.0840 |
| No log | 10.0 | 70 | 0.5462 | 0.8 | 0.3535 | 1.4026 | 0.8000 | 0.7760 | 0.2888 | 0.0660 |
| No log | 11.0 | 77 | 0.5376 | 0.785 | 0.3481 | 1.2329 | 0.785 | 0.7600 | 0.2660 | 0.0757 |
| No log | 12.0 | 84 | 0.5250 | 0.785 | 0.3442 | 1.1226 | 0.785 | 0.7669 | 0.2476 | 0.0802 |
| No log | 13.0 | 91 | 0.5053 | 0.81 | 0.3313 | 1.1038 | 0.81 | 0.7824 | 0.2543 | 0.0628 |
| No log | 14.0 | 98 | 0.5188 | 0.79 | 0.3497 | 1.0872 | 0.79 | 0.7678 | 0.2495 | 0.0759 |
| No log | 15.0 | 105 | 0.5020 | 0.805 | 0.3412 | 1.4342 | 0.805 | 0.7868 | 0.2669 | 0.0652 |
| No log | 16.0 | 112 | 0.5221 | 0.795 | 0.3496 | 1.3609 | 0.795 | 0.7473 | 0.2682 | 0.0621 |
| No log | 17.0 | 119 | 0.5046 | 0.8 | 0.3372 | 1.1543 | 0.8000 | 0.7766 | 0.2604 | 0.0689 |
| No log | 18.0 | 126 | 0.4733 | 0.805 | 0.3248 | 1.1335 | 0.805 | 0.7800 | 0.2599 | 0.0582 |
| No log | 19.0 | 133 | 0.4725 | 0.815 | 0.3242 | 1.1607 | 0.815 | 0.7855 | 0.2733 | 0.0573 |
| No log | 20.0 | 140 | 0.4887 | 0.82 | 0.3325 | 1.0316 | 0.82 | 0.7998 | 0.2684 | 0.0568 |
| No log | 21.0 | 147 | 0.4708 | 0.815 | 0.3205 | 1.1275 | 0.815 | 0.8033 | 0.2412 | 0.0663 |
| No log | 22.0 | 154 | 0.4773 | 0.83 | 0.3309 | 1.1147 | 0.83 | 0.8101 | 0.2739 | 0.0518 |
| No log | 23.0 | 161 | 0.4957 | 0.815 | 0.3402 | 1.0884 | 0.815 | 0.8012 | 0.2582 | 0.0726 |
| No log | 24.0 | 168 | 0.4666 | 0.805 | 0.3305 | 1.0784 | 0.805 | 0.7858 | 0.2792 | 0.0560 |
| No log | 25.0 | 175 | 0.4830 | 0.795 | 0.3324 | 1.1757 | 0.795 | 0.7595 | 0.2505 | 0.0715 |
| No log | 26.0 | 182 | 0.4622 | 0.8 | 0.3272 | 1.0698 | 0.8000 | 0.7873 | 0.2795 | 0.0590 |
| No log | 27.0 | 189 | 0.4604 | 0.8 | 0.3200 | 1.1104 | 0.8000 | 0.7717 | 0.2561 | 0.0630 |
| No log | 28.0 | 196 | 0.4635 | 0.82 | 0.3253 | 1.1271 | 0.82 | 0.7903 | 0.2756 | 0.0571 |
| No log | 29.0 | 203 | 0.4590 | 0.815 | 0.3211 | 1.1048 | 0.815 | 0.7952 | 0.2881 | 0.0567 |
| No log | 30.0 | 210 | 0.4575 | 0.795 | 0.3210 | 0.9174 | 0.795 | 0.7786 | 0.2833 | 0.0625 |
| No log | 31.0 | 217 | 0.4684 | 0.83 | 0.3337 | 0.9485 | 0.83 | 0.8093 | 0.2892 | 0.0557 |
| No log | 32.0 | 224 | 0.4520 | 0.81 | 0.3208 | 1.0186 | 0.81 | 0.7955 | 0.2438 | 0.0577 |
| No log | 33.0 | 231 | 0.4567 | 0.825 | 0.3233 | 0.9246 | 0.825 | 0.7928 | 0.2665 | 0.0592 |
| No log | 34.0 | 238 | 0.4468 | 0.82 | 0.3152 | 1.0065 | 0.82 | 0.8000 | 0.2710 | 0.0563 |
| No log | 35.0 | 245 | 0.4562 | 0.78 | 0.3244 | 1.0626 | 0.78 | 0.7614 | 0.2602 | 0.0624 |
| No log | 36.0 | 252 | 0.4542 | 0.815 | 0.3223 | 1.1362 | 0.815 | 0.7852 | 0.2584 | 0.0579 |
| No log | 37.0 | 259 | 0.4441 | 0.82 | 0.3136 | 1.0419 | 0.82 | 0.7901 | 0.2790 | 0.0529 |
| No log | 38.0 | 266 | 0.4408 | 0.825 | 0.3125 | 0.9860 | 0.825 | 0.8023 | 0.2766 | 0.0553 |
| No log | 39.0 | 273 | 0.4354 | 0.83 | 0.3082 | 0.8958 | 0.83 | 0.8116 | 0.2713 | 0.0504 |
| No log | 40.0 | 280 | 0.4465 | 0.79 | 0.3164 | 1.1111 | 0.79 | 0.7715 | 0.2668 | 0.0628 |
| No log | 41.0 | 287 | 0.4416 | 0.845 | 0.3128 | 1.0103 | 0.845 | 0.8162 | 0.3044 | 0.0527 |
| No log | 42.0 | 294 | 0.4463 | 0.83 | 0.3165 | 1.0849 | 0.83 | 0.8106 | 0.2683 | 0.0580 |
| No log | 43.0 | 301 | 0.4405 | 0.845 | 0.3132 | 1.0312 | 0.845 | 0.8247 | 0.2792 | 0.0509 |
| No log | 44.0 | 308 | 0.4443 | 0.83 | 0.3174 | 0.9196 | 0.83 | 0.8094 | 0.2687 | 0.0524 |
| No log | 45.0 | 315 | 0.4445 | 0.82 | 0.3194 | 1.0665 | 0.82 | 0.7897 | 0.2651 | 0.0560 |
| No log | 46.0 | 322 | 0.4405 | 0.81 | 0.3133 | 1.1805 | 0.81 | 0.7770 | 0.2771 | 0.0550 |
| No log | 47.0 | 329 | 0.4380 | 0.84 | 0.3132 | 0.9508 | 0.8400 | 0.8104 | 0.2916 | 0.0535 |
| No log | 48.0 | 336 | 0.4407 | 0.825 | 0.3139 | 0.9044 | 0.825 | 0.7978 | 0.2702 | 0.0542 |
| No log | 49.0 | 343 | 0.4418 | 0.835 | 0.3154 | 0.8965 | 0.835 | 0.8178 | 0.2877 | 0.0569 |
| No log | 50.0 | 350 | 0.4368 | 0.825 | 0.3123 | 0.9774 | 0.825 | 0.8073 | 0.2607 | 0.0531 |
| No log | 51.0 | 357 | 0.4402 | 0.825 | 0.3140 | 0.9170 | 0.825 | 0.8052 | 0.2810 | 0.0550 |
| No log | 52.0 | 364 | 0.4374 | 0.82 | 0.3107 | 0.9873 | 0.82 | 0.7952 | 0.2602 | 0.0542 |
| No log | 53.0 | 371 | 0.4368 | 0.83 | 0.3120 | 0.9832 | 0.83 | 0.8084 | 0.2709 | 0.0541 |
| No log | 54.0 | 378 | 0.4375 | 0.82 | 0.3131 | 0.9094 | 0.82 | 0.7943 | 0.2633 | 0.0538 |
| No log | 55.0 | 385 | 0.4379 | 0.815 | 0.3134 | 0.9927 | 0.815 | 0.7856 | 0.2960 | 0.0552 |
| No log | 56.0 | 392 | 0.4370 | 0.83 | 0.3125 | 0.9746 | 0.83 | 0.8100 | 0.2744 | 0.0535 |
| No log | 57.0 | 399 | 0.4366 | 0.825 | 0.3123 | 1.0392 | 0.825 | 0.8021 | 0.2730 | 0.0536 |
| No log | 58.0 | 406 | 0.4372 | 0.825 | 0.3129 | 0.9174 | 0.825 | 0.8026 | 0.2800 | 0.0542 |
| No log | 59.0 | 413 | 0.4380 | 0.81 | 0.3134 | 0.9770 | 0.81 | 0.7831 | 0.2612 | 0.0557 |
| No log | 60.0 | 420 | 0.4374 | 0.82 | 0.3130 | 0.9124 | 0.82 | 0.7961 | 0.2589 | 0.0541 |
| No log | 61.0 | 427 | 0.4366 | 0.825 | 0.3121 | 0.9038 | 0.825 | 0.8061 | 0.2641 | 0.0538 |
| No log | 62.0 | 434 | 0.4372 | 0.825 | 0.3126 | 0.9105 | 0.825 | 0.8042 | 0.2684 | 0.0547 |
| No log | 63.0 | 441 | 0.4381 | 0.82 | 0.3135 | 0.9160 | 0.82 | 0.7961 | 0.2810 | 0.0545 |
| No log | 64.0 | 448 | 0.4376 | 0.83 | 0.3133 | 0.9134 | 0.83 | 0.8100 | 0.2757 | 0.0539 |
| No log | 65.0 | 455 | 0.4376 | 0.825 | 0.3130 | 0.9133 | 0.825 | 0.8061 | 0.2977 | 0.0541 |
| No log | 66.0 | 462 | 0.4378 | 0.825 | 0.3133 | 0.9153 | 0.825 | 0.8061 | 0.2767 | 0.0543 |
| No log | 67.0 | 469 | 0.4373 | 0.825 | 0.3129 | 0.9139 | 0.825 | 0.8042 | 0.2905 | 0.0541 |
| No log | 68.0 | 476 | 0.4375 | 0.82 | 0.3129 | 0.9128 | 0.82 | 0.7961 | 0.2739 | 0.0543 |
| No log | 69.0 | 483 | 0.4376 | 0.82 | 0.3131 | 0.9125 | 0.82 | 0.7961 | 0.2757 | 0.0542 |
| No log | 70.0 | 490 | 0.4377 | 0.825 | 0.3133 | 0.9174 | 0.825 | 0.8061 | 0.2924 | 0.0538 |
| No log | 71.0 | 497 | 0.4380 | 0.82 | 0.3134 | 0.9179 | 0.82 | 0.7961 | 0.2896 | 0.0541 |
| 0.2684 | 72.0 | 504 | 0.4378 | 0.82 | 0.3133 | 0.9148 | 0.82 | 0.8035 | 0.2912 | 0.0543 |
| 0.2684 | 73.0 | 511 | 0.4375 | 0.82 | 0.3131 | 0.9169 | 0.82 | 0.7961 | 0.2731 | 0.0542 |
| 0.2684 | 74.0 | 518 | 0.4379 | 0.82 | 0.3133 | 0.9177 | 0.82 | 0.7961 | 0.2732 | 0.0540 |
| 0.2684 | 75.0 | 525 | 0.4383 | 0.82 | 0.3138 | 0.9194 | 0.82 | 0.8035 | 0.2835 | 0.0545 |
| 0.2684 | 76.0 | 532 | 0.4378 | 0.815 | 0.3133 | 0.9133 | 0.815 | 0.7935 | 0.2824 | 0.0543 |
| 0.2684 | 77.0 | 539 | 0.4378 | 0.815 | 0.3133 | 0.9146 | 0.815 | 0.7935 | 0.2735 | 0.0542 |
| 0.2684 | 78.0 | 546 | 0.4379 | 0.815 | 0.3134 | 0.9139 | 0.815 | 0.7935 | 0.2828 | 0.0547 |
| 0.2684 | 79.0 | 553 | 0.4382 | 0.815 | 0.3136 | 0.9179 | 0.815 | 0.7935 | 0.2817 | 0.0547 |
| 0.2684 | 80.0 | 560 | 0.4380 | 0.815 | 0.3134 | 0.9168 | 0.815 | 0.7935 | 0.2818 | 0.0545 |
| 0.2684 | 81.0 | 567 | 0.4381 | 0.815 | 0.3135 | 0.9183 | 0.815 | 0.7935 | 0.2736 | 0.0544 |
| 0.2684 | 82.0 | 574 | 0.4379 | 0.815 | 0.3134 | 0.9164 | 0.815 | 0.7935 | 0.2736 | 0.0544 |
| 0.2684 | 83.0 | 581 | 0.4382 | 0.815 | 0.3136 | 0.9168 | 0.815 | 0.7935 | 0.2736 | 0.0541 |
| 0.2684 | 84.0 | 588 | 0.4381 | 0.815 | 0.3136 | 0.9199 | 0.815 | 0.7935 | 0.2737 | 0.0541 |
| 0.2684 | 85.0 | 595 | 0.4380 | 0.815 | 0.3134 | 0.9175 | 0.815 | 0.7935 | 0.2735 | 0.0543 |
| 0.2684 | 86.0 | 602 | 0.4383 | 0.815 | 0.3137 | 0.9197 | 0.815 | 0.7935 | 0.2674 | 0.0545 |
| 0.2684 | 87.0 | 609 | 0.4381 | 0.815 | 0.3135 | 0.9176 | 0.815 | 0.7935 | 0.2738 | 0.0547 |
| 0.2684 | 88.0 | 616 | 0.4381 | 0.815 | 0.3135 | 0.9179 | 0.815 | 0.7935 | 0.2736 | 0.0541 |
| 0.2684 | 89.0 | 623 | 0.4381 | 0.815 | 0.3135 | 0.9177 | 0.815 | 0.7935 | 0.2737 | 0.0543 |
| 0.2684 | 90.0 | 630 | 0.4382 | 0.815 | 0.3136 | 0.9173 | 0.815 | 0.7935 | 0.2736 | 0.0544 |
| 0.2684 | 91.0 | 637 | 0.4383 | 0.815 | 0.3136 | 0.9187 | 0.815 | 0.7935 | 0.2738 | 0.0546 |
| 0.2684 | 92.0 | 644 | 0.4382 | 0.815 | 0.3136 | 0.9171 | 0.815 | 0.7935 | 0.2738 | 0.0544 |
| 0.2684 | 93.0 | 651 | 0.4383 | 0.815 | 0.3137 | 0.9190 | 0.815 | 0.7935 | 0.2738 | 0.0546 |
| 0.2684 | 94.0 | 658 | 0.4382 | 0.815 | 0.3136 | 0.9187 | 0.815 | 0.7935 | 0.2737 | 0.0543 |
| 0.2684 | 95.0 | 665 | 0.4383 | 0.815 | 0.3136 | 0.9184 | 0.815 | 0.7935 | 0.2739 | 0.0545 |
| 0.2684 | 96.0 | 672 | 0.4382 | 0.815 | 0.3136 | 0.9184 | 0.815 | 0.7935 | 0.2737 | 0.0545 |
| 0.2684 | 97.0 | 679 | 0.4382 | 0.815 | 0.3136 | 0.9179 | 0.815 | 0.7935 | 0.2739 | 0.0545 |
| 0.2684 | 98.0 | 686 | 0.4383 | 0.815 | 0.3137 | 0.9185 | 0.815 | 0.7935 | 0.2738 | 0.0544 |
| 0.2684 | 99.0 | 693 | 0.4383 | 0.815 | 0.3137 | 0.9182 | 0.815 | 0.7935 | 0.2739 | 0.0546 |
| 0.2684 | 100.0 | 700 | 0.4383 | 0.815 | 0.3137 | 0.9180 | 0.815 | 0.7935 | 0.2739 | 0.0546 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/81-tiny_tobacco3482
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 81-tiny_tobacco3482
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set (an AURC sketch follows the list):
- Loss: 0.1635
- Accuracy: 0.815
- Brier Loss: 0.3719
- Nll: 0.7360
- F1 Micro: 0.815
- F1 Macro: 0.7973
- Ece: 0.3345
- Aurc: 0.0542
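`Aurc` is the area under the risk-coverage curve (lower is better). A sketch of the usual definition, since the card does not state which variant it uses:

```python
import numpy as np

def aurc(confidences, correct):
    """Rank predictions by confidence, then average the cumulative error
    rate (risk) as coverage grows; a sketch of the common definition."""
    order = np.argsort(-confidences)           # most confident first
    errors = 1.0 - correct[order].astype(float)
    coverage = np.arange(1, len(errors) + 1)
    risk = np.cumsum(errors) / coverage
    return risk.mean()
```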
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an optimizer/scheduler sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
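A sketch of the optimizer/scheduler pairing described above. The step counts are read off the training-results table (700 optimization steps in total, so a 10% warmup is 70 steps), and the `ignore_mismatched_sizes` head swap is an assumption about how the 10-way classifier was initialized:

```python
from torch.optim import Adam
from transformers import AutoModelForImageClassification, get_linear_schedule_with_warmup

# Re-headed ViT-tiny for 10 document classes (initialization is an assumption).
model = AutoModelForImageClassification.from_pretrained(
    "WinKawaks/vit-tiny-patch16-224", num_labels=10, ignore_mismatched_sizes=True)
optimizer = Adam(model.parameters(), lr=1e-4, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=70, num_training_steps=700)
```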
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 2.2942 | 0.16 | 1.0273 | 8.4554 | 0.16 | 0.0543 | 0.3538 | 0.8479 |
| No log | 2.0 | 14 | 1.1952 | 0.095 | 0.9063 | 7.1126 | 0.095 | 0.0819 | 0.2327 | 0.8615 |
| No log | 3.0 | 21 | 0.7862 | 0.305 | 0.8434 | 5.1628 | 0.305 | 0.1452 | 0.2965 | 0.6092 |
| No log | 4.0 | 28 | 0.6144 | 0.385 | 0.7594 | 3.3452 | 0.3850 | 0.2942 | 0.2978 | 0.4140 |
| No log | 5.0 | 35 | 0.5040 | 0.54 | 0.6858 | 2.6433 | 0.54 | 0.4419 | 0.3200 | 0.2853 |
| No log | 6.0 | 42 | 0.4291 | 0.625 | 0.6353 | 2.2564 | 0.625 | 0.5273 | 0.3772 | 0.1975 |
| No log | 7.0 | 49 | 0.3839 | 0.63 | 0.5815 | 2.0215 | 0.63 | 0.4970 | 0.3469 | 0.1795 |
| No log | 8.0 | 56 | 0.3390 | 0.705 | 0.5512 | 1.5239 | 0.705 | 0.5983 | 0.3704 | 0.1337 |
| No log | 9.0 | 63 | 0.3128 | 0.705 | 0.5015 | 1.7837 | 0.705 | 0.6172 | 0.3403 | 0.1138 |
| No log | 10.0 | 70 | 0.2868 | 0.705 | 0.4532 | 1.1243 | 0.705 | 0.6108 | 0.3221 | 0.1015 |
| No log | 11.0 | 77 | 0.2866 | 0.74 | 0.4532 | 1.0967 | 0.74 | 0.6909 | 0.3422 | 0.0932 |
| No log | 12.0 | 84 | 0.2625 | 0.765 | 0.4235 | 1.4646 | 0.765 | 0.6788 | 0.3240 | 0.0805 |
| No log | 13.0 | 91 | 0.2516 | 0.73 | 0.4141 | 1.4855 | 0.7300 | 0.6495 | 0.2770 | 0.0869 |
| No log | 14.0 | 98 | 0.2487 | 0.805 | 0.4440 | 1.2873 | 0.805 | 0.7319 | 0.3699 | 0.0670 |
| No log | 15.0 | 105 | 0.2103 | 0.77 | 0.4029 | 1.0834 | 0.7700 | 0.6891 | 0.3007 | 0.0677 |
| No log | 16.0 | 112 | 0.2216 | 0.8 | 0.4029 | 1.0621 | 0.8000 | 0.7247 | 0.3513 | 0.0611 |
| No log | 17.0 | 119 | 0.2230 | 0.745 | 0.3999 | 1.0367 | 0.745 | 0.6710 | 0.2992 | 0.0809 |
| No log | 18.0 | 126 | 0.2301 | 0.805 | 0.4044 | 0.9650 | 0.805 | 0.7473 | 0.3406 | 0.0538 |
| No log | 19.0 | 133 | 0.2024 | 0.795 | 0.3935 | 0.9429 | 0.795 | 0.7562 | 0.3327 | 0.0669 |
| No log | 20.0 | 140 | 0.1959 | 0.82 | 0.3935 | 0.8984 | 0.82 | 0.8000 | 0.3453 | 0.0571 |
| No log | 21.0 | 147 | 0.2020 | 0.815 | 0.3946 | 0.8507 | 0.815 | 0.7949 | 0.3669 | 0.0588 |
| No log | 22.0 | 154 | 0.2100 | 0.805 | 0.3758 | 0.8083 | 0.805 | 0.7743 | 0.3087 | 0.0625 |
| No log | 23.0 | 161 | 0.2002 | 0.81 | 0.3881 | 1.0046 | 0.81 | 0.7953 | 0.3276 | 0.0617 |
| No log | 24.0 | 168 | 0.2004 | 0.83 | 0.3987 | 0.8922 | 0.83 | 0.8034 | 0.3565 | 0.0531 |
| No log | 25.0 | 175 | 0.1914 | 0.785 | 0.3699 | 0.8571 | 0.785 | 0.7484 | 0.3117 | 0.0601 |
| No log | 26.0 | 182 | 0.1845 | 0.815 | 0.3764 | 0.8153 | 0.815 | 0.7970 | 0.3215 | 0.0539 |
| No log | 27.0 | 189 | 0.1821 | 0.835 | 0.3815 | 0.8488 | 0.835 | 0.8175 | 0.3441 | 0.0497 |
| No log | 28.0 | 196 | 0.1869 | 0.84 | 0.3808 | 0.8654 | 0.8400 | 0.8236 | 0.3639 | 0.0504 |
| No log | 29.0 | 203 | 0.1859 | 0.79 | 0.3752 | 0.7067 | 0.79 | 0.7661 | 0.3019 | 0.0618 |
| No log | 30.0 | 210 | 0.1842 | 0.795 | 0.3826 | 0.9031 | 0.795 | 0.7646 | 0.3170 | 0.0599 |
| No log | 31.0 | 217 | 0.1797 | 0.815 | 0.3714 | 0.8572 | 0.815 | 0.8038 | 0.3214 | 0.0588 |
| No log | 32.0 | 224 | 0.1754 | 0.805 | 0.3679 | 0.7412 | 0.805 | 0.7883 | 0.3063 | 0.0563 |
| No log | 33.0 | 231 | 0.1790 | 0.835 | 0.3736 | 0.8357 | 0.835 | 0.8128 | 0.3431 | 0.0497 |
| No log | 34.0 | 238 | 0.1761 | 0.81 | 0.3744 | 0.7360 | 0.81 | 0.7907 | 0.3292 | 0.0526 |
| No log | 35.0 | 245 | 0.1731 | 0.83 | 0.3744 | 0.7494 | 0.83 | 0.8202 | 0.3351 | 0.0540 |
| No log | 36.0 | 252 | 0.1758 | 0.79 | 0.3738 | 0.9441 | 0.79 | 0.7710 | 0.3099 | 0.0625 |
| No log | 37.0 | 259 | 0.1730 | 0.81 | 0.3785 | 0.9418 | 0.81 | 0.7939 | 0.3425 | 0.0577 |
| No log | 38.0 | 266 | 0.1727 | 0.815 | 0.3752 | 0.7539 | 0.815 | 0.8001 | 0.3290 | 0.0540 |
| No log | 39.0 | 273 | 0.1769 | 0.81 | 0.3754 | 0.9013 | 0.81 | 0.7888 | 0.3296 | 0.0563 |
| No log | 40.0 | 280 | 0.1770 | 0.805 | 0.3809 | 0.7637 | 0.805 | 0.7809 | 0.3291 | 0.0519 |
| No log | 41.0 | 287 | 0.1771 | 0.815 | 0.3814 | 0.7495 | 0.815 | 0.7993 | 0.3352 | 0.0565 |
| No log | 42.0 | 294 | 0.1745 | 0.8 | 0.3807 | 0.8386 | 0.8000 | 0.7865 | 0.3212 | 0.0601 |
| No log | 43.0 | 301 | 0.1700 | 0.825 | 0.3762 | 0.8216 | 0.825 | 0.8067 | 0.3420 | 0.0517 |
| No log | 44.0 | 308 | 0.1686 | 0.825 | 0.3706 | 0.7310 | 0.825 | 0.8024 | 0.3448 | 0.0509 |
| No log | 45.0 | 315 | 0.1692 | 0.81 | 0.3739 | 0.6805 | 0.81 | 0.7914 | 0.3397 | 0.0508 |
| No log | 46.0 | 322 | 0.1697 | 0.825 | 0.3709 | 0.7482 | 0.825 | 0.7995 | 0.3364 | 0.0531 |
| No log | 47.0 | 329 | 0.1681 | 0.805 | 0.3735 | 0.8810 | 0.805 | 0.7872 | 0.3388 | 0.0585 |
| No log | 48.0 | 336 | 0.1667 | 0.815 | 0.3731 | 0.6788 | 0.815 | 0.7968 | 0.3427 | 0.0567 |
| No log | 49.0 | 343 | 0.1690 | 0.82 | 0.3786 | 0.8693 | 0.82 | 0.7967 | 0.3507 | 0.0502 |
| No log | 50.0 | 350 | 0.1668 | 0.82 | 0.3761 | 0.7854 | 0.82 | 0.8032 | 0.3509 | 0.0526 |
| No log | 51.0 | 357 | 0.1659 | 0.82 | 0.3760 | 0.7437 | 0.82 | 0.8045 | 0.3374 | 0.0557 |
| No log | 52.0 | 364 | 0.1663 | 0.805 | 0.3682 | 0.7932 | 0.805 | 0.7863 | 0.3171 | 0.0560 |
| No log | 53.0 | 371 | 0.1649 | 0.805 | 0.3649 | 0.7249 | 0.805 | 0.7863 | 0.3139 | 0.0562 |
| No log | 54.0 | 378 | 0.1654 | 0.815 | 0.3725 | 0.7390 | 0.815 | 0.7935 | 0.3273 | 0.0560 |
| No log | 55.0 | 385 | 0.1667 | 0.84 | 0.3741 | 0.7324 | 0.8400 | 0.8279 | 0.3501 | 0.0513 |
| No log | 56.0 | 392 | 0.1659 | 0.825 | 0.3718 | 0.7278 | 0.825 | 0.8058 | 0.3288 | 0.0522 |
| No log | 57.0 | 399 | 0.1659 | 0.825 | 0.3731 | 0.7351 | 0.825 | 0.8084 | 0.3204 | 0.0537 |
| No log | 58.0 | 406 | 0.1645 | 0.825 | 0.3730 | 0.7356 | 0.825 | 0.8068 | 0.3468 | 0.0524 |
| No log | 59.0 | 413 | 0.1626 | 0.825 | 0.3715 | 0.7282 | 0.825 | 0.8040 | 0.3391 | 0.0527 |
| No log | 60.0 | 420 | 0.1628 | 0.825 | 0.3698 | 0.7297 | 0.825 | 0.8056 | 0.3212 | 0.0534 |
| No log | 61.0 | 427 | 0.1630 | 0.825 | 0.3719 | 0.7312 | 0.825 | 0.8065 | 0.3260 | 0.0533 |
| No log | 62.0 | 434 | 0.1642 | 0.82 | 0.3714 | 0.7335 | 0.82 | 0.8040 | 0.3388 | 0.0549 |
| No log | 63.0 | 441 | 0.1634 | 0.82 | 0.3733 | 0.7349 | 0.82 | 0.8011 | 0.3399 | 0.0553 |
| No log | 64.0 | 448 | 0.1628 | 0.815 | 0.3710 | 0.7301 | 0.815 | 0.7974 | 0.3249 | 0.0532 |
| No log | 65.0 | 455 | 0.1629 | 0.815 | 0.3704 | 0.7362 | 0.815 | 0.7981 | 0.3362 | 0.0541 |
| No log | 66.0 | 462 | 0.1630 | 0.82 | 0.3727 | 0.7354 | 0.82 | 0.8024 | 0.3559 | 0.0526 |
| No log | 67.0 | 469 | 0.1633 | 0.825 | 0.3733 | 0.7355 | 0.825 | 0.8038 | 0.3492 | 0.0526 |
| No log | 68.0 | 476 | 0.1638 | 0.82 | 0.3724 | 0.7369 | 0.82 | 0.8040 | 0.3468 | 0.0550 |
| No log | 69.0 | 483 | 0.1629 | 0.82 | 0.3717 | 0.7329 | 0.82 | 0.8040 | 0.3251 | 0.0536 |
| No log | 70.0 | 490 | 0.1629 | 0.815 | 0.3709 | 0.7312 | 0.815 | 0.7947 | 0.3221 | 0.0541 |
| No log | 71.0 | 497 | 0.1630 | 0.815 | 0.3718 | 0.7371 | 0.815 | 0.7973 | 0.3436 | 0.0545 |
| 0.1743 | 72.0 | 504 | 0.1631 | 0.815 | 0.3712 | 0.7350 | 0.815 | 0.7973 | 0.3264 | 0.0536 |
| 0.1743 | 73.0 | 511 | 0.1634 | 0.815 | 0.3721 | 0.7348 | 0.815 | 0.7981 | 0.3340 | 0.0543 |
| 0.1743 | 74.0 | 518 | 0.1631 | 0.815 | 0.3716 | 0.7332 | 0.815 | 0.7973 | 0.3279 | 0.0541 |
| 0.1743 | 75.0 | 525 | 0.1633 | 0.82 | 0.3720 | 0.7346 | 0.82 | 0.8008 | 0.3186 | 0.0542 |
| 0.1743 | 76.0 | 532 | 0.1631 | 0.815 | 0.3716 | 0.7342 | 0.815 | 0.7973 | 0.3189 | 0.0542 |
| 0.1743 | 77.0 | 539 | 0.1633 | 0.815 | 0.3718 | 0.7358 | 0.815 | 0.7973 | 0.3344 | 0.0542 |
| 0.1743 | 78.0 | 546 | 0.1633 | 0.815 | 0.3718 | 0.7363 | 0.815 | 0.7991 | 0.3342 | 0.0542 |
| 0.1743 | 79.0 | 553 | 0.1633 | 0.815 | 0.3718 | 0.7350 | 0.815 | 0.7973 | 0.3344 | 0.0543 |
| 0.1743 | 80.0 | 560 | 0.1634 | 0.815 | 0.3718 | 0.7354 | 0.815 | 0.7973 | 0.3136 | 0.0542 |
| 0.1743 | 81.0 | 567 | 0.1633 | 0.815 | 0.3721 | 0.7360 | 0.815 | 0.7973 | 0.3279 | 0.0543 |
| 0.1743 | 82.0 | 574 | 0.1634 | 0.815 | 0.3719 | 0.7358 | 0.815 | 0.7973 | 0.3276 | 0.0542 |
| 0.1743 | 83.0 | 581 | 0.1634 | 0.815 | 0.3720 | 0.7358 | 0.815 | 0.7973 | 0.3345 | 0.0542 |
| 0.1743 | 84.0 | 588 | 0.1633 | 0.815 | 0.3717 | 0.7352 | 0.815 | 0.7973 | 0.3275 | 0.0542 |
| 0.1743 | 85.0 | 595 | 0.1633 | 0.815 | 0.3720 | 0.7357 | 0.815 | 0.7973 | 0.3194 | 0.0541 |
| 0.1743 | 86.0 | 602 | 0.1634 | 0.815 | 0.3717 | 0.7361 | 0.815 | 0.7973 | 0.3276 | 0.0542 |
| 0.1743 | 87.0 | 609 | 0.1634 | 0.815 | 0.3719 | 0.7359 | 0.815 | 0.7973 | 0.3345 | 0.0542 |
| 0.1743 | 88.0 | 616 | 0.1634 | 0.815 | 0.3720 | 0.7359 | 0.815 | 0.7973 | 0.3278 | 0.0542 |
| 0.1743 | 89.0 | 623 | 0.1634 | 0.815 | 0.3718 | 0.7359 | 0.815 | 0.7973 | 0.3192 | 0.0543 |
| 0.1743 | 90.0 | 630 | 0.1635 | 0.815 | 0.3719 | 0.7359 | 0.815 | 0.7973 | 0.3278 | 0.0543 |
| 0.1743 | 91.0 | 637 | 0.1635 | 0.815 | 0.3719 | 0.7357 | 0.815 | 0.7973 | 0.3344 | 0.0541 |
| 0.1743 | 92.0 | 644 | 0.1635 | 0.815 | 0.3719 | 0.7361 | 0.815 | 0.7973 | 0.3278 | 0.0542 |
| 0.1743 | 93.0 | 651 | 0.1635 | 0.815 | 0.3719 | 0.7357 | 0.815 | 0.7973 | 0.3345 | 0.0542 |
| 0.1743 | 94.0 | 658 | 0.1635 | 0.815 | 0.3719 | 0.7356 | 0.815 | 0.7973 | 0.3261 | 0.0543 |
| 0.1743 | 95.0 | 665 | 0.1635 | 0.815 | 0.3719 | 0.7360 | 0.815 | 0.7973 | 0.3345 | 0.0542 |
| 0.1743 | 96.0 | 672 | 0.1635 | 0.815 | 0.3719 | 0.7357 | 0.815 | 0.7973 | 0.3278 | 0.0542 |
| 0.1743 | 97.0 | 679 | 0.1635 | 0.815 | 0.3719 | 0.7360 | 0.815 | 0.7973 | 0.3345 | 0.0542 |
| 0.1743 | 98.0 | 686 | 0.1635 | 0.815 | 0.3719 | 0.7360 | 0.815 | 0.7973 | 0.3278 | 0.0543 |
| 0.1743 | 99.0 | 693 | 0.1635 | 0.815 | 0.3719 | 0.7360 | 0.815 | 0.7973 | 0.3345 | 0.0542 |
| 0.1743 | 100.0 | 700 | 0.1635 | 0.815 | 0.3719 | 0.7360 | 0.815 | 0.7973 | 0.3345 | 0.0542 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/81-tiny_tobacco3482_kd_NKD_t1.0_g1.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 81-tiny_tobacco3482_kd_NKD_t1.0_g1.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set (an NLL sketch follows the list):
- Loss: 4.0317
- Accuracy: 0.86
- Brier Loss: 0.2316
- Nll: 0.9681
- F1 Micro: 0.8600
- F1 Macro: 0.8444
- Ece: 0.1162
- Aurc: 0.0505
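`Nll` is the average negative log-likelihood of the true class. A minimal sketch, where the clipping epsilon is an assumption added for numerical stability:

```python
import numpy as np

def negative_log_likelihood(probs, labels):
    """Mean -log p(true class); the 1e-12 clip is an assumption, not from the card."""
    p_true = probs[np.arange(len(labels)), labels]
    return -np.mean(np.log(np.clip(p_true, 1e-12, 1.0)))
```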
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 6.0999 | 0.095 | 1.0191 | 8.7249 | 0.095 | 0.0767 | 0.3174 | 0.8988 |
| No log | 2.0 | 14 | 5.0037 | 0.145 | 0.9023 | 8.1615 | 0.145 | 0.1212 | 0.2431 | 0.8041 |
| No log | 3.0 | 21 | 4.5740 | 0.265 | 0.8363 | 5.9714 | 0.265 | 0.2003 | 0.2732 | 0.5711 |
| No log | 4.0 | 28 | 4.3460 | 0.435 | 0.7292 | 3.9720 | 0.435 | 0.3780 | 0.3029 | 0.3535 |
| No log | 5.0 | 35 | 4.1953 | 0.565 | 0.6344 | 2.7772 | 0.565 | 0.4465 | 0.3056 | 0.2433 |
| No log | 6.0 | 42 | 4.0769 | 0.665 | 0.5852 | 2.3975 | 0.665 | 0.5345 | 0.3469 | 0.1662 |
| No log | 7.0 | 49 | 3.9670 | 0.685 | 0.5103 | 2.1879 | 0.685 | 0.5645 | 0.3335 | 0.1339 |
| No log | 8.0 | 56 | 3.9085 | 0.74 | 0.4625 | 1.7988 | 0.74 | 0.6444 | 0.3276 | 0.1028 |
| No log | 9.0 | 63 | 3.8671 | 0.765 | 0.4073 | 1.5650 | 0.765 | 0.6567 | 0.2800 | 0.0878 |
| No log | 10.0 | 70 | 3.8008 | 0.77 | 0.3617 | 1.4468 | 0.7700 | 0.6730 | 0.2443 | 0.0692 |
| No log | 11.0 | 77 | 3.8924 | 0.76 | 0.3685 | 1.4401 | 0.76 | 0.7045 | 0.2129 | 0.0859 |
| No log | 12.0 | 84 | 3.8523 | 0.78 | 0.3275 | 1.3333 | 0.78 | 0.7165 | 0.2191 | 0.0691 |
| No log | 13.0 | 91 | 3.7745 | 0.78 | 0.3229 | 1.4554 | 0.78 | 0.6820 | 0.1980 | 0.0653 |
| No log | 14.0 | 98 | 3.8155 | 0.805 | 0.3070 | 1.1638 | 0.805 | 0.7564 | 0.1993 | 0.0621 |
| No log | 15.0 | 105 | 3.8060 | 0.815 | 0.3038 | 1.2009 | 0.815 | 0.7624 | 0.2051 | 0.0607 |
| No log | 16.0 | 112 | 3.8269 | 0.81 | 0.3044 | 1.2937 | 0.81 | 0.7751 | 0.1978 | 0.0643 |
| No log | 17.0 | 119 | 3.8191 | 0.83 | 0.2841 | 0.9571 | 0.83 | 0.7978 | 0.1773 | 0.0593 |
| No log | 18.0 | 126 | 3.8986 | 0.81 | 0.3109 | 1.1933 | 0.81 | 0.7858 | 0.1728 | 0.0692 |
| No log | 19.0 | 133 | 3.8134 | 0.845 | 0.2694 | 0.9314 | 0.845 | 0.8175 | 0.1791 | 0.0549 |
| No log | 20.0 | 140 | 3.8148 | 0.825 | 0.2768 | 0.8896 | 0.825 | 0.8108 | 0.1495 | 0.0619 |
| No log | 21.0 | 147 | 3.7769 | 0.83 | 0.2639 | 0.9579 | 0.83 | 0.8064 | 0.1521 | 0.0521 |
| No log | 22.0 | 154 | 3.7941 | 0.84 | 0.2596 | 0.9457 | 0.8400 | 0.8176 | 0.1644 | 0.0534 |
| No log | 23.0 | 161 | 3.9296 | 0.79 | 0.2938 | 1.0757 | 0.79 | 0.7732 | 0.1612 | 0.0706 |
| No log | 24.0 | 168 | 3.7899 | 0.815 | 0.2734 | 1.0937 | 0.815 | 0.7910 | 0.1570 | 0.0531 |
| No log | 25.0 | 175 | 3.8261 | 0.855 | 0.2554 | 0.9640 | 0.855 | 0.8370 | 0.1378 | 0.0559 |
| No log | 26.0 | 182 | 3.8596 | 0.805 | 0.2626 | 1.0429 | 0.805 | 0.7814 | 0.1548 | 0.0597 |
| No log | 27.0 | 189 | 3.7974 | 0.825 | 0.2637 | 1.0256 | 0.825 | 0.8107 | 0.1251 | 0.0522 |
| No log | 28.0 | 196 | 3.8304 | 0.83 | 0.2555 | 1.1640 | 0.83 | 0.8081 | 0.1337 | 0.0561 |
| No log | 29.0 | 203 | 3.8346 | 0.835 | 0.2631 | 0.8774 | 0.835 | 0.8110 | 0.1473 | 0.0564 |
| No log | 30.0 | 210 | 3.8251 | 0.825 | 0.2549 | 0.9254 | 0.825 | 0.7979 | 0.1298 | 0.0541 |
| No log | 31.0 | 217 | 3.8759 | 0.825 | 0.2671 | 0.9357 | 0.825 | 0.8014 | 0.1672 | 0.0609 |
| No log | 32.0 | 224 | 3.8466 | 0.835 | 0.2567 | 1.0822 | 0.835 | 0.8168 | 0.1454 | 0.0553 |
| No log | 33.0 | 231 | 3.8600 | 0.835 | 0.2578 | 1.0196 | 0.835 | 0.8103 | 0.1363 | 0.0577 |
| No log | 34.0 | 238 | 3.8364 | 0.84 | 0.2615 | 0.9060 | 0.8400 | 0.8153 | 0.1556 | 0.0550 |
| No log | 35.0 | 245 | 3.8615 | 0.84 | 0.2741 | 0.9777 | 0.8400 | 0.8175 | 0.1553 | 0.0609 |
| No log | 36.0 | 252 | 3.8354 | 0.815 | 0.2672 | 1.0578 | 0.815 | 0.7829 | 0.1412 | 0.0571 |
| No log | 37.0 | 259 | 3.8214 | 0.825 | 0.2586 | 1.1244 | 0.825 | 0.8005 | 0.1567 | 0.0555 |
| No log | 38.0 | 266 | 3.8379 | 0.84 | 0.2557 | 0.9827 | 0.8400 | 0.8159 | 0.1623 | 0.0569 |
| No log | 39.0 | 273 | 3.8269 | 0.82 | 0.2590 | 1.0025 | 0.82 | 0.7983 | 0.1300 | 0.0552 |
| No log | 40.0 | 280 | 3.8326 | 0.835 | 0.2576 | 0.9914 | 0.835 | 0.8145 | 0.1339 | 0.0549 |
| No log | 41.0 | 287 | 3.8171 | 0.845 | 0.2446 | 0.9369 | 0.845 | 0.8234 | 0.1345 | 0.0555 |
| No log | 42.0 | 294 | 3.8197 | 0.835 | 0.2421 | 0.9189 | 0.835 | 0.8127 | 0.1263 | 0.0551 |
| No log | 43.0 | 301 | 3.8182 | 0.835 | 0.2421 | 1.0651 | 0.835 | 0.8181 | 0.1348 | 0.0528 |
| No log | 44.0 | 308 | 3.8461 | 0.85 | 0.2482 | 0.9057 | 0.85 | 0.8353 | 0.1580 | 0.0530 |
| No log | 45.0 | 315 | 3.8203 | 0.845 | 0.2405 | 0.8504 | 0.845 | 0.8241 | 0.1461 | 0.0512 |
| No log | 46.0 | 322 | 3.8468 | 0.845 | 0.2431 | 0.7909 | 0.845 | 0.8252 | 0.1273 | 0.0534 |
| No log | 47.0 | 329 | 3.8798 | 0.84 | 0.2486 | 0.9810 | 0.8400 | 0.8219 | 0.1412 | 0.0565 |
| No log | 48.0 | 336 | 3.8650 | 0.855 | 0.2447 | 0.9117 | 0.855 | 0.8372 | 0.1465 | 0.0525 |
| No log | 49.0 | 343 | 3.8425 | 0.845 | 0.2414 | 1.0040 | 0.845 | 0.8231 | 0.1251 | 0.0536 |
| No log | 50.0 | 350 | 3.8350 | 0.86 | 0.2351 | 0.8541 | 0.8600 | 0.8454 | 0.1403 | 0.0511 |
| No log | 51.0 | 357 | 3.8572 | 0.84 | 0.2427 | 0.9366 | 0.8400 | 0.8197 | 0.1249 | 0.0527 |
| No log | 52.0 | 364 | 3.8461 | 0.85 | 0.2374 | 0.8631 | 0.85 | 0.8352 | 0.1269 | 0.0506 |
| No log | 53.0 | 371 | 3.8692 | 0.835 | 0.2431 | 0.9839 | 0.835 | 0.8110 | 0.1356 | 0.0546 |
| No log | 54.0 | 378 | 3.8806 | 0.855 | 0.2403 | 1.0493 | 0.855 | 0.8343 | 0.1287 | 0.0530 |
| No log | 55.0 | 385 | 3.8875 | 0.86 | 0.2389 | 0.9404 | 0.8600 | 0.8435 | 0.1416 | 0.0532 |
| No log | 56.0 | 392 | 3.8893 | 0.84 | 0.2407 | 0.9410 | 0.8400 | 0.8189 | 0.1256 | 0.0539 |
| No log | 57.0 | 399 | 3.8923 | 0.84 | 0.2408 | 0.9917 | 0.8400 | 0.8176 | 0.1274 | 0.0540 |
| No log | 58.0 | 406 | 3.8865 | 0.86 | 0.2323 | 0.9745 | 0.8600 | 0.8444 | 0.1372 | 0.0517 |
| No log | 59.0 | 413 | 3.9037 | 0.845 | 0.2363 | 0.9451 | 0.845 | 0.8230 | 0.1422 | 0.0533 |
| No log | 60.0 | 420 | 3.9040 | 0.845 | 0.2396 | 1.0699 | 0.845 | 0.8233 | 0.1272 | 0.0522 |
| No log | 61.0 | 427 | 3.9136 | 0.85 | 0.2378 | 1.0664 | 0.85 | 0.8289 | 0.1283 | 0.0527 |
| No log | 62.0 | 434 | 3.8877 | 0.85 | 0.2355 | 0.8483 | 0.85 | 0.8315 | 0.1445 | 0.0519 |
| No log | 63.0 | 441 | 3.8991 | 0.855 | 0.2342 | 0.9246 | 0.855 | 0.8374 | 0.1124 | 0.0509 |
| No log | 64.0 | 448 | 3.9207 | 0.85 | 0.2383 | 1.0477 | 0.85 | 0.8330 | 0.1235 | 0.0514 |
| No log | 65.0 | 455 | 3.9000 | 0.855 | 0.2362 | 1.0504 | 0.855 | 0.8384 | 0.1375 | 0.0529 |
| No log | 66.0 | 462 | 3.9542 | 0.84 | 0.2463 | 0.9763 | 0.8400 | 0.8255 | 0.1405 | 0.0540 |
| No log | 67.0 | 469 | 3.9153 | 0.855 | 0.2374 | 0.9954 | 0.855 | 0.8376 | 0.1353 | 0.0523 |
| No log | 68.0 | 476 | 3.9264 | 0.845 | 0.2410 | 0.9917 | 0.845 | 0.8295 | 0.1081 | 0.0515 |
| No log | 69.0 | 483 | 3.8989 | 0.86 | 0.2272 | 0.9322 | 0.8600 | 0.8438 | 0.1260 | 0.0492 |
| No log | 70.0 | 490 | 3.9224 | 0.86 | 0.2329 | 0.9317 | 0.8600 | 0.8443 | 0.1091 | 0.0515 |
| No log | 71.0 | 497 | 3.9313 | 0.85 | 0.2360 | 1.0424 | 0.85 | 0.8310 | 0.1259 | 0.0511 |
| 3.5118 | 72.0 | 504 | 3.9407 | 0.85 | 0.2343 | 0.9333 | 0.85 | 0.8331 | 0.1156 | 0.0529 |
| 3.5118 | 73.0 | 511 | 3.9407 | 0.865 | 0.2318 | 0.9791 | 0.865 | 0.8523 | 0.1245 | 0.0497 |
| 3.5118 | 74.0 | 518 | 3.9461 | 0.855 | 0.2347 | 1.0488 | 0.855 | 0.8385 | 0.1298 | 0.0508 |
| 3.5118 | 75.0 | 525 | 3.9560 | 0.86 | 0.2319 | 0.9924 | 0.8600 | 0.8444 | 0.1410 | 0.0504 |
| 3.5118 | 76.0 | 532 | 3.9608 | 0.855 | 0.2317 | 0.9253 | 0.855 | 0.8390 | 0.1380 | 0.0517 |
| 3.5118 | 77.0 | 539 | 3.9638 | 0.865 | 0.2319 | 0.9210 | 0.865 | 0.8528 | 0.1202 | 0.0504 |
| 3.5118 | 78.0 | 546 | 3.9718 | 0.86 | 0.2323 | 0.9413 | 0.8600 | 0.8444 | 0.1255 | 0.0505 |
| 3.5118 | 79.0 | 553 | 3.9778 | 0.86 | 0.2324 | 0.9916 | 0.8600 | 0.8444 | 0.1270 | 0.0506 |
| 3.5118 | 80.0 | 560 | 3.9813 | 0.855 | 0.2323 | 0.9919 | 0.855 | 0.8390 | 0.1246 | 0.0509 |
| 3.5118 | 81.0 | 567 | 3.9876 | 0.86 | 0.2319 | 0.9330 | 0.8600 | 0.8444 | 0.1318 | 0.0506 |
| 3.5118 | 82.0 | 574 | 3.9939 | 0.855 | 0.2324 | 0.9328 | 0.855 | 0.8390 | 0.1280 | 0.0510 |
| 3.5118 | 83.0 | 581 | 3.9971 | 0.86 | 0.2319 | 0.9321 | 0.8600 | 0.8444 | 0.1303 | 0.0503 |
| 3.5118 | 84.0 | 588 | 4.0003 | 0.855 | 0.2316 | 0.9348 | 0.855 | 0.8390 | 0.1284 | 0.0508 |
| 3.5118 | 85.0 | 595 | 4.0054 | 0.86 | 0.2319 | 0.9909 | 0.8600 | 0.8444 | 0.1348 | 0.0503 |
| 3.5118 | 86.0 | 602 | 4.0086 | 0.86 | 0.2315 | 0.9338 | 0.8600 | 0.8444 | 0.1340 | 0.0504 |
| 3.5118 | 87.0 | 609 | 4.0125 | 0.86 | 0.2318 | 0.9522 | 0.8600 | 0.8444 | 0.1348 | 0.0504 |
| 3.5118 | 88.0 | 616 | 4.0148 | 0.86 | 0.2316 | 0.9396 | 0.8600 | 0.8444 | 0.1323 | 0.0504 |
| 3.5118 | 89.0 | 623 | 4.0185 | 0.86 | 0.2318 | 0.9378 | 0.8600 | 0.8444 | 0.1326 | 0.0505 |
| 3.5118 | 90.0 | 630 | 4.0197 | 0.86 | 0.2316 | 0.9412 | 0.8600 | 0.8444 | 0.1253 | 0.0506 |
| 3.5118 | 91.0 | 637 | 4.0231 | 0.86 | 0.2318 | 0.9395 | 0.8600 | 0.8444 | 0.1165 | 0.0506 |
| 3.5118 | 92.0 | 644 | 4.0249 | 0.86 | 0.2316 | 0.9921 | 0.8600 | 0.8444 | 0.1159 | 0.0504 |
| 3.5118 | 93.0 | 651 | 4.0266 | 0.86 | 0.2316 | 0.9441 | 0.8600 | 0.8444 | 0.1161 | 0.0505 |
| 3.5118 | 94.0 | 658 | 4.0275 | 0.86 | 0.2317 | 0.9934 | 0.8600 | 0.8444 | 0.1159 | 0.0504 |
| 3.5118 | 95.0 | 665 | 4.0289 | 0.86 | 0.2315 | 0.9429 | 0.8600 | 0.8444 | 0.1160 | 0.0505 |
| 3.5118 | 96.0 | 672 | 4.0301 | 0.86 | 0.2316 | 0.9932 | 0.8600 | 0.8444 | 0.1163 | 0.0505 |
| 3.5118 | 97.0 | 679 | 4.0304 | 0.86 | 0.2315 | 0.9936 | 0.8600 | 0.8444 | 0.1163 | 0.0505 |
| 3.5118 | 98.0 | 686 | 4.0313 | 0.86 | 0.2316 | 0.9935 | 0.8600 | 0.8444 | 0.1163 | 0.0504 |
| 3.5118 | 99.0 | 693 | 4.0317 | 0.86 | 0.2316 | 0.9601 | 0.8600 | 0.8444 | 0.1162 | 0.0505 |
| 3.5118 | 100.0 | 700 | 4.0317 | 0.86 | 0.2316 | 0.9681 | 0.8600 | 0.8444 | 0.1162 | 0.0505 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/81-tiny_tobacco3482_hint_
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 81-tiny_tobacco3482_hint_
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset (recorded as `None`; the model name suggests Tobacco-3482).
It achieves the following results on the evaluation set (the ECE metric is sketched after the list):
- Loss: 63.0326
- Accuracy: 0.85
- Brier Loss: 0.2647
- NLL: 1.1178
- F1 Micro: 0.85
- F1 Macro: 0.8409
- ECE: 0.1296
- AURC: 0.0380
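ECE (expected calibration error) compares per-bin confidence with per-bin accuracy. A sketch assuming the common recipe of 10 equal-width confidence bins (the exact binning used for this card is not documented):
```python
import numpy as np

def ece(probs, labels, n_bins=10):
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            # bin weight times the |accuracy - confidence| gap
            total += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return total
```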
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
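The `hint_` suffix in the model name suggests a FitNets-style feature-hint distillation term; whether that matches the actual training script is an assumption. Purely as an illustration (192 is ViT-tiny's hidden size; the teacher width 768 is a placeholder):
```python
import torch.nn as nn
import torch.nn.functional as F

regressor = nn.Linear(192, 768)  # project student features to the (assumed) teacher width

def hint_loss(student_feat, teacher_feat):
    # MSE between the projected student feature and the frozen teacher feature
    return F.mse_loss(regressor(student_feat), teacher_feat.detach())
```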
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 66.3150 | 0.26 | 0.8706 | 4.9002 | 0.26 | 0.1920 | 0.2904 | 0.7808 |
| No log | 2.0 | 50 | 65.5312 | 0.54 | 0.5948 | 2.8544 | 0.54 | 0.4535 | 0.2938 | 0.2564 |
| No log | 3.0 | 75 | 65.0613 | 0.675 | 0.4403 | 1.6101 | 0.675 | 0.6134 | 0.2276 | 0.1328 |
| No log | 4.0 | 100 | 64.7775 | 0.765 | 0.3800 | 1.5512 | 0.765 | 0.7486 | 0.2394 | 0.1111 |
| No log | 5.0 | 125 | 64.4695 | 0.8 | 0.3065 | 1.7477 | 0.8000 | 0.7658 | 0.1814 | 0.0713 |
| No log | 6.0 | 150 | 64.3179 | 0.78 | 0.3286 | 1.7786 | 0.78 | 0.7409 | 0.1555 | 0.0636 |
| No log | 7.0 | 175 | 64.0707 | 0.77 | 0.3516 | 1.6165 | 0.7700 | 0.7179 | 0.1677 | 0.0927 |
| No log | 8.0 | 200 | 63.8878 | 0.775 | 0.3677 | 1.5868 | 0.775 | 0.7675 | 0.1922 | 0.0867 |
| No log | 9.0 | 225 | 64.2141 | 0.715 | 0.4341 | 2.1772 | 0.715 | 0.7074 | 0.2273 | 0.1130 |
| No log | 10.0 | 250 | 63.8037 | 0.775 | 0.3649 | 1.4471 | 0.775 | 0.7221 | 0.1744 | 0.0766 |
| No log | 11.0 | 275 | 64.2412 | 0.655 | 0.5396 | 2.0448 | 0.655 | 0.6532 | 0.2692 | 0.1670 |
| No log | 12.0 | 300 | 63.8514 | 0.73 | 0.4372 | 1.9358 | 0.7300 | 0.6997 | 0.2194 | 0.1008 |
| No log | 13.0 | 325 | 63.4352 | 0.815 | 0.2888 | 1.6462 | 0.815 | 0.7858 | 0.1490 | 0.0469 |
| No log | 14.0 | 350 | 63.5101 | 0.795 | 0.3262 | 1.3794 | 0.795 | 0.7816 | 0.1599 | 0.0648 |
| No log | 15.0 | 375 | 63.6365 | 0.785 | 0.3704 | 1.4766 | 0.785 | 0.7627 | 0.1691 | 0.0781 |
| No log | 16.0 | 400 | 63.7245 | 0.73 | 0.4145 | 1.7447 | 0.7300 | 0.7261 | 0.2087 | 0.0917 |
| No log | 17.0 | 425 | 63.4312 | 0.795 | 0.3148 | 1.4363 | 0.795 | 0.7868 | 0.1665 | 0.0636 |
| No log | 18.0 | 450 | 63.7070 | 0.835 | 0.2915 | 1.4455 | 0.835 | 0.8078 | 0.1542 | 0.0570 |
| No log | 19.0 | 475 | 63.3600 | 0.81 | 0.2997 | 1.3326 | 0.81 | 0.7878 | 0.1602 | 0.0534 |
| 62.5692 | 20.0 | 500 | 63.4339 | 0.81 | 0.3158 | 1.2051 | 0.81 | 0.7809 | 0.1541 | 0.0506 |
| 62.5692 | 21.0 | 525 | 63.3477 | 0.805 | 0.3088 | 1.2575 | 0.805 | 0.7943 | 0.1570 | 0.0489 |
| 62.5692 | 22.0 | 550 | 63.3362 | 0.83 | 0.2911 | 1.3409 | 0.83 | 0.8104 | 0.1440 | 0.0514 |
| 62.5692 | 23.0 | 575 | 63.3897 | 0.805 | 0.3004 | 1.2505 | 0.805 | 0.7833 | 0.1595 | 0.0453 |
| 62.5692 | 24.0 | 600 | 63.3475 | 0.8 | 0.3185 | 1.1190 | 0.8000 | 0.7750 | 0.1626 | 0.0486 |
| 62.5692 | 25.0 | 625 | 63.4552 | 0.805 | 0.3470 | 1.2483 | 0.805 | 0.7904 | 0.1818 | 0.0652 |
| 62.5692 | 26.0 | 650 | 63.4364 | 0.79 | 0.3453 | 1.1298 | 0.79 | 0.7798 | 0.1827 | 0.0651 |
| 62.5692 | 27.0 | 675 | 63.3001 | 0.83 | 0.2899 | 1.4329 | 0.83 | 0.8141 | 0.1370 | 0.0466 |
| 62.5692 | 28.0 | 700 | 63.1848 | 0.85 | 0.2514 | 1.2175 | 0.85 | 0.8354 | 0.1220 | 0.0396 |
| 62.5692 | 29.0 | 725 | 63.2303 | 0.835 | 0.2744 | 1.4886 | 0.835 | 0.8143 | 0.1389 | 0.0532 |
| 62.5692 | 30.0 | 750 | 63.2275 | 0.84 | 0.2774 | 1.0426 | 0.8400 | 0.8405 | 0.1325 | 0.0632 |
| 62.5692 | 31.0 | 775 | 63.1341 | 0.835 | 0.2597 | 1.1066 | 0.835 | 0.8083 | 0.1170 | 0.0510 |
| 62.5692 | 32.0 | 800 | 63.2045 | 0.81 | 0.2970 | 1.0432 | 0.81 | 0.8028 | 0.1561 | 0.0552 |
| 62.5692 | 33.0 | 825 | 63.1898 | 0.82 | 0.2952 | 1.0189 | 0.82 | 0.8060 | 0.1440 | 0.0521 |
| 62.5692 | 34.0 | 850 | 63.1330 | 0.835 | 0.2760 | 1.0095 | 0.835 | 0.8265 | 0.1356 | 0.0542 |
| 62.5692 | 35.0 | 875 | 63.1572 | 0.84 | 0.2834 | 1.0174 | 0.8400 | 0.8234 | 0.1337 | 0.0508 |
| 62.5692 | 36.0 | 900 | 63.1922 | 0.835 | 0.2894 | 1.0102 | 0.835 | 0.8233 | 0.1469 | 0.0500 |
| 62.5692 | 37.0 | 925 | 63.1305 | 0.83 | 0.2818 | 1.0146 | 0.83 | 0.8172 | 0.1387 | 0.0510 |
| 62.5692 | 38.0 | 950 | 63.1902 | 0.815 | 0.2865 | 1.0101 | 0.815 | 0.8016 | 0.1500 | 0.0516 |
| 62.5692 | 39.0 | 975 | 63.1835 | 0.825 | 0.2851 | 1.0177 | 0.825 | 0.8162 | 0.1436 | 0.0496 |
| 61.2333 | 40.0 | 1000 | 63.1741 | 0.84 | 0.2783 | 1.0160 | 0.8400 | 0.8266 | 0.1275 | 0.0510 |
| 61.2333 | 41.0 | 1025 | 63.1755 | 0.835 | 0.2756 | 1.0117 | 0.835 | 0.8192 | 0.1447 | 0.0483 |
| 61.2333 | 42.0 | 1050 | 63.1281 | 0.83 | 0.2820 | 1.0142 | 0.83 | 0.8169 | 0.1415 | 0.0466 |
| 61.2333 | 43.0 | 1075 | 63.1697 | 0.85 | 0.2675 | 0.9929 | 0.85 | 0.8358 | 0.1423 | 0.0484 |
| 61.2333 | 44.0 | 1100 | 63.1141 | 0.835 | 0.2767 | 1.0005 | 0.835 | 0.8237 | 0.1293 | 0.0481 |
| 61.2333 | 45.0 | 1125 | 63.1441 | 0.85 | 0.2638 | 1.0023 | 0.85 | 0.8383 | 0.1335 | 0.0471 |
| 61.2333 | 46.0 | 1150 | 63.1221 | 0.84 | 0.2745 | 0.9981 | 0.8400 | 0.8308 | 0.1271 | 0.0451 |
| 61.2333 | 47.0 | 1175 | 63.1140 | 0.845 | 0.2654 | 0.9891 | 0.845 | 0.8317 | 0.1351 | 0.0458 |
| 61.2333 | 48.0 | 1200 | 63.1056 | 0.845 | 0.2654 | 1.0016 | 0.845 | 0.8351 | 0.1364 | 0.0458 |
| 61.2333 | 49.0 | 1225 | 63.0906 | 0.83 | 0.2713 | 1.0042 | 0.83 | 0.8221 | 0.1455 | 0.0449 |
| 61.2333 | 50.0 | 1250 | 63.0942 | 0.835 | 0.2633 | 1.0003 | 0.835 | 0.8314 | 0.1397 | 0.0452 |
| 61.2333 | 51.0 | 1275 | 63.0929 | 0.84 | 0.2641 | 0.9957 | 0.8400 | 0.8359 | 0.1340 | 0.0440 |
| 61.2333 | 52.0 | 1300 | 63.0913 | 0.83 | 0.2646 | 1.0040 | 0.83 | 0.8242 | 0.1422 | 0.0440 |
| 61.2333 | 53.0 | 1325 | 63.1152 | 0.83 | 0.2754 | 0.9985 | 0.83 | 0.8250 | 0.1416 | 0.0447 |
| 61.2333 | 54.0 | 1350 | 63.0923 | 0.835 | 0.2649 | 0.9997 | 0.835 | 0.8278 | 0.1356 | 0.0426 |
| 61.2333 | 55.0 | 1375 | 63.0720 | 0.83 | 0.2686 | 0.9988 | 0.83 | 0.8243 | 0.1396 | 0.0431 |
| 61.2333 | 56.0 | 1400 | 63.0627 | 0.83 | 0.2636 | 1.0713 | 0.83 | 0.8243 | 0.1369 | 0.0427 |
| 61.2333 | 57.0 | 1425 | 63.0742 | 0.835 | 0.2692 | 1.0572 | 0.835 | 0.8305 | 0.1391 | 0.0425 |
| 61.2333 | 58.0 | 1450 | 63.0910 | 0.84 | 0.2639 | 1.0727 | 0.8400 | 0.8334 | 0.1320 | 0.0432 |
| 61.2333 | 59.0 | 1475 | 63.1015 | 0.84 | 0.2648 | 1.1382 | 0.8400 | 0.8354 | 0.1331 | 0.0423 |
| 61.0482 | 60.0 | 1500 | 63.0557 | 0.835 | 0.2655 | 1.0688 | 0.835 | 0.8293 | 0.1333 | 0.0420 |
| 61.0482 | 61.0 | 1525 | 63.0590 | 0.835 | 0.2655 | 1.1378 | 0.835 | 0.8315 | 0.1425 | 0.0416 |
| 61.0482 | 62.0 | 1550 | 63.0732 | 0.845 | 0.2661 | 1.0565 | 0.845 | 0.8381 | 0.1404 | 0.0413 |
| 61.0482 | 63.0 | 1575 | 63.0972 | 0.855 | 0.2659 | 1.1274 | 0.855 | 0.8501 | 0.1424 | 0.0416 |
| 61.0482 | 64.0 | 1600 | 63.0528 | 0.84 | 0.2694 | 1.1315 | 0.8400 | 0.8330 | 0.1355 | 0.0418 |
| 61.0482 | 65.0 | 1625 | 63.0625 | 0.835 | 0.2683 | 1.1336 | 0.835 | 0.8281 | 0.1373 | 0.0411 |
| 61.0482 | 66.0 | 1650 | 63.0512 | 0.835 | 0.2747 | 1.1250 | 0.835 | 0.8242 | 0.1436 | 0.0410 |
| 61.0482 | 67.0 | 1675 | 63.0634 | 0.85 | 0.2671 | 1.1270 | 0.85 | 0.8397 | 0.1376 | 0.0419 |
| 61.0482 | 68.0 | 1700 | 63.0609 | 0.835 | 0.2717 | 1.1311 | 0.835 | 0.8295 | 0.1365 | 0.0411 |
| 61.0482 | 69.0 | 1725 | 63.0513 | 0.835 | 0.2707 | 1.1261 | 0.835 | 0.8223 | 0.1461 | 0.0412 |
| 61.0482 | 70.0 | 1750 | 63.0510 | 0.845 | 0.2712 | 1.1219 | 0.845 | 0.8396 | 0.1369 | 0.0411 |
| 61.0482 | 71.0 | 1775 | 63.0530 | 0.845 | 0.2688 | 1.1244 | 0.845 | 0.8364 | 0.1403 | 0.0412 |
| 61.0482 | 72.0 | 1800 | 63.0456 | 0.84 | 0.2665 | 1.1204 | 0.8400 | 0.8293 | 0.1341 | 0.0400 |
| 61.0482 | 73.0 | 1825 | 63.0459 | 0.845 | 0.2680 | 1.1244 | 0.845 | 0.8360 | 0.1430 | 0.0398 |
| 61.0482 | 74.0 | 1850 | 63.0773 | 0.85 | 0.2684 | 1.1291 | 0.85 | 0.8440 | 0.1386 | 0.0410 |
| 61.0482 | 75.0 | 1875 | 63.0497 | 0.85 | 0.2664 | 1.1285 | 0.85 | 0.8388 | 0.1293 | 0.0392 |
| 61.0482 | 76.0 | 1900 | 63.0483 | 0.845 | 0.2695 | 1.1256 | 0.845 | 0.8352 | 0.1440 | 0.0409 |
| 61.0482 | 77.0 | 1925 | 63.0352 | 0.845 | 0.2680 | 1.1229 | 0.845 | 0.8352 | 0.1420 | 0.0398 |
| 61.0482 | 78.0 | 1950 | 63.0291 | 0.845 | 0.2701 | 1.1225 | 0.845 | 0.8352 | 0.1342 | 0.0394 |
| 61.0482 | 79.0 | 1975 | 63.0508 | 0.85 | 0.2695 | 1.1224 | 0.85 | 0.8388 | 0.1418 | 0.0399 |
| 60.9704 | 80.0 | 2000 | 63.0510 | 0.85 | 0.2708 | 1.1169 | 0.85 | 0.8388 | 0.1376 | 0.0394 |
| 60.9704 | 81.0 | 2025 | 63.0460 | 0.85 | 0.2648 | 1.1205 | 0.85 | 0.8440 | 0.1357 | 0.0397 |
| 60.9704 | 82.0 | 2050 | 63.0505 | 0.845 | 0.2697 | 1.1148 | 0.845 | 0.8352 | 0.1464 | 0.0392 |
| 60.9704 | 83.0 | 2075 | 63.0425 | 0.845 | 0.2651 | 1.1229 | 0.845 | 0.8352 | 0.1396 | 0.0389 |
| 60.9704 | 84.0 | 2100 | 63.0398 | 0.845 | 0.2664 | 1.1197 | 0.845 | 0.8337 | 0.1330 | 0.0388 |
| 60.9704 | 85.0 | 2125 | 63.0355 | 0.845 | 0.2667 | 1.1192 | 0.845 | 0.8360 | 0.1307 | 0.0387 |
| 60.9704 | 86.0 | 2150 | 63.0386 | 0.85 | 0.2649 | 1.1223 | 0.85 | 0.8409 | 0.1279 | 0.0379 |
| 60.9704 | 87.0 | 2175 | 63.0405 | 0.85 | 0.2642 | 1.1218 | 0.85 | 0.8409 | 0.1437 | 0.0378 |
| 60.9704 | 88.0 | 2200 | 63.0363 | 0.85 | 0.2667 | 1.1165 | 0.85 | 0.8388 | 0.1320 | 0.0390 |
| 60.9704 | 89.0 | 2225 | 63.0456 | 0.845 | 0.2644 | 1.1180 | 0.845 | 0.8352 | 0.1354 | 0.0381 |
| 60.9704 | 90.0 | 2250 | 63.0343 | 0.845 | 0.2656 | 1.1159 | 0.845 | 0.8337 | 0.1390 | 0.0385 |
| 60.9704 | 91.0 | 2275 | 63.0391 | 0.85 | 0.2654 | 1.1194 | 0.85 | 0.8409 | 0.1389 | 0.0380 |
| 60.9704 | 92.0 | 2300 | 63.0354 | 0.85 | 0.2665 | 1.1203 | 0.85 | 0.8409 | 0.1419 | 0.0377 |
| 60.9704 | 93.0 | 2325 | 63.0272 | 0.845 | 0.2650 | 1.1166 | 0.845 | 0.8381 | 0.1370 | 0.0381 |
| 60.9704 | 94.0 | 2350 | 63.0313 | 0.85 | 0.2647 | 1.1181 | 0.85 | 0.8409 | 0.1322 | 0.0380 |
| 60.9704 | 95.0 | 2375 | 63.0220 | 0.845 | 0.2658 | 1.1197 | 0.845 | 0.8357 | 0.1311 | 0.0378 |
| 60.9704 | 96.0 | 2400 | 63.0345 | 0.85 | 0.2639 | 1.1179 | 0.85 | 0.8409 | 0.1284 | 0.0381 |
| 60.9704 | 97.0 | 2425 | 63.0330 | 0.845 | 0.2651 | 1.1163 | 0.845 | 0.8352 | 0.1391 | 0.0382 |
| 60.9704 | 98.0 | 2450 | 63.0302 | 0.85 | 0.2646 | 1.1182 | 0.85 | 0.8409 | 0.1311 | 0.0380 |
| 60.9704 | 99.0 | 2475 | 63.0287 | 0.85 | 0.2646 | 1.1175 | 0.85 | 0.8409 | 0.1250 | 0.0380 |
| 60.9392 | 100.0 | 2500 | 63.0326 | 0.85 | 0.2647 | 1.1178 | 0.85 | 0.8409 | 0.1296 | 0.0380 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/60-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 60-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset (recorded as `None`; the model name suggests Tobacco-3482).
It achieves the following results on the evaluation set (the AURC metric is sketched after the list):
- Loss: 0.4631
- Accuracy: 0.81
- Brier Loss: 0.3427
- NLL: 0.9896
- F1 Micro: 0.81
- F1 Macro: 0.7794
- ECE: 0.3002
- AURC: 0.0534
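AURC (area under the risk-coverage curve) summarizes selective prediction: samples are admitted most-confident-first and the error rate is averaged over all coverage levels, so lower is better. A sketch under that standard definition:
```python
import numpy as np

def aurc(confidences, correct):
    order = np.argsort(-confidences)           # most confident first
    errors = (~correct[order]).astype(float)   # `correct` is a boolean array
    risks = np.cumsum(errors) / np.arange(1, len(errors) + 1)
    return risks.mean()
```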
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
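The `CEKD_t2.5_a0.5` suffix suggests a cross-entropy plus soft-target distillation objective with temperature 2.5 and mixing weight 0.5. A sketch of the canonical formulation (whether the script used exactly this is an assumption):
```python
import torch.nn.functional as F

T, alpha = 2.5, 0.5  # read off the model name

def cekd_loss(student_logits, teacher_logits, labels):
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale the soft-target term for the temperature
    return alpha * ce + (1 - alpha) * kd
```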
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 2.0115 | 0.1 | 1.0070 | 8.1872 | 0.1000 | 0.0765 | 0.3197 | 0.8958 |
| No log | 2.0 | 14 | 1.3480 | 0.19 | 0.8667 | 5.2268 | 0.19 | 0.1599 | 0.2694 | 0.7716 |
| No log | 3.0 | 21 | 1.0745 | 0.38 | 0.7608 | 4.8555 | 0.38 | 0.3111 | 0.2804 | 0.4226 |
| No log | 4.0 | 28 | 0.8841 | 0.57 | 0.6204 | 2.8740 | 0.57 | 0.4526 | 0.2904 | 0.2536 |
| No log | 5.0 | 35 | 0.7962 | 0.63 | 0.5646 | 2.4082 | 0.63 | 0.5462 | 0.3178 | 0.2005 |
| No log | 6.0 | 42 | 0.7065 | 0.68 | 0.4894 | 2.3364 | 0.68 | 0.5773 | 0.2660 | 0.1461 |
| No log | 7.0 | 49 | 0.6639 | 0.7 | 0.4548 | 1.8260 | 0.7 | 0.6412 | 0.2748 | 0.1340 |
| No log | 8.0 | 56 | 0.6374 | 0.745 | 0.4265 | 1.8720 | 0.745 | 0.6620 | 0.3127 | 0.1047 |
| No log | 9.0 | 63 | 0.6184 | 0.73 | 0.4136 | 1.7957 | 0.7300 | 0.6650 | 0.3022 | 0.0845 |
| No log | 10.0 | 70 | 0.5569 | 0.8 | 0.3838 | 1.2127 | 0.8000 | 0.7683 | 0.3047 | 0.0721 |
| No log | 11.0 | 77 | 0.5225 | 0.815 | 0.3514 | 1.0539 | 0.815 | 0.7871 | 0.2818 | 0.0625 |
| No log | 12.0 | 84 | 0.5330 | 0.795 | 0.3609 | 1.0544 | 0.795 | 0.7695 | 0.2723 | 0.0786 |
| No log | 13.0 | 91 | 0.5241 | 0.78 | 0.3615 | 1.1338 | 0.78 | 0.7599 | 0.2681 | 0.0726 |
| No log | 14.0 | 98 | 0.5345 | 0.795 | 0.3640 | 1.1229 | 0.795 | 0.7527 | 0.2950 | 0.0661 |
| No log | 15.0 | 105 | 0.5042 | 0.8 | 0.3459 | 1.1690 | 0.8000 | 0.7664 | 0.2915 | 0.0664 |
| No log | 16.0 | 112 | 0.5285 | 0.795 | 0.3779 | 1.1973 | 0.795 | 0.7468 | 0.3056 | 0.0647 |
| No log | 17.0 | 119 | 0.5374 | 0.77 | 0.3716 | 1.5324 | 0.7700 | 0.7406 | 0.2658 | 0.0910 |
| No log | 18.0 | 126 | 0.5132 | 0.79 | 0.3677 | 1.1252 | 0.79 | 0.7699 | 0.2842 | 0.0722 |
| No log | 19.0 | 133 | 0.4961 | 0.815 | 0.3558 | 0.8880 | 0.815 | 0.7941 | 0.2985 | 0.0611 |
| No log | 20.0 | 140 | 0.4971 | 0.82 | 0.3556 | 1.0845 | 0.82 | 0.8049 | 0.3045 | 0.0631 |
| No log | 21.0 | 147 | 0.4791 | 0.835 | 0.3328 | 0.9109 | 0.835 | 0.8168 | 0.2803 | 0.0534 |
| No log | 22.0 | 154 | 0.4953 | 0.805 | 0.3531 | 1.0087 | 0.805 | 0.7758 | 0.3015 | 0.0642 |
| No log | 23.0 | 161 | 0.4962 | 0.825 | 0.3543 | 0.8886 | 0.825 | 0.8091 | 0.3085 | 0.0672 |
| No log | 24.0 | 168 | 0.4753 | 0.82 | 0.3435 | 1.0609 | 0.82 | 0.8025 | 0.2994 | 0.0508 |
| No log | 25.0 | 175 | 0.4856 | 0.825 | 0.3495 | 1.1336 | 0.825 | 0.7846 | 0.3018 | 0.0539 |
| No log | 26.0 | 182 | 0.4824 | 0.805 | 0.3499 | 1.0142 | 0.805 | 0.7890 | 0.3000 | 0.0577 |
| No log | 27.0 | 189 | 0.4843 | 0.83 | 0.3506 | 0.7550 | 0.83 | 0.8090 | 0.3163 | 0.0556 |
| No log | 28.0 | 196 | 0.4728 | 0.825 | 0.3451 | 1.1111 | 0.825 | 0.8012 | 0.3006 | 0.0554 |
| No log | 29.0 | 203 | 0.4844 | 0.8 | 0.3483 | 1.2220 | 0.8000 | 0.7750 | 0.2913 | 0.0609 |
| No log | 30.0 | 210 | 0.4728 | 0.815 | 0.3478 | 1.1306 | 0.815 | 0.7840 | 0.3051 | 0.0540 |
| No log | 31.0 | 217 | 0.4694 | 0.83 | 0.3411 | 0.8267 | 0.83 | 0.8156 | 0.3101 | 0.0474 |
| No log | 32.0 | 224 | 0.4685 | 0.82 | 0.3406 | 1.0564 | 0.82 | 0.7909 | 0.3269 | 0.0529 |
| No log | 33.0 | 231 | 0.4692 | 0.81 | 0.3407 | 0.8508 | 0.81 | 0.7900 | 0.2838 | 0.0517 |
| No log | 34.0 | 238 | 0.4647 | 0.815 | 0.3404 | 1.0430 | 0.815 | 0.7894 | 0.3098 | 0.0563 |
| No log | 35.0 | 245 | 0.4761 | 0.795 | 0.3503 | 1.0340 | 0.795 | 0.7687 | 0.2731 | 0.0594 |
| No log | 36.0 | 252 | 0.4802 | 0.83 | 0.3571 | 0.8575 | 0.83 | 0.8046 | 0.3275 | 0.0511 |
| No log | 37.0 | 259 | 0.4686 | 0.8 | 0.3414 | 0.8818 | 0.8000 | 0.7836 | 0.3020 | 0.0560 |
| No log | 38.0 | 266 | 0.4612 | 0.815 | 0.3361 | 0.9901 | 0.815 | 0.7807 | 0.2988 | 0.0528 |
| No log | 39.0 | 273 | 0.4721 | 0.81 | 0.3475 | 0.9803 | 0.81 | 0.7875 | 0.3019 | 0.0576 |
| No log | 40.0 | 280 | 0.4646 | 0.83 | 0.3425 | 0.8495 | 0.83 | 0.8059 | 0.3063 | 0.0507 |
| No log | 41.0 | 287 | 0.4622 | 0.805 | 0.3396 | 1.0087 | 0.805 | 0.7688 | 0.2903 | 0.0558 |
| No log | 42.0 | 294 | 0.4599 | 0.82 | 0.3375 | 0.9166 | 0.82 | 0.7996 | 0.3319 | 0.0533 |
| No log | 43.0 | 301 | 0.4688 | 0.805 | 0.3475 | 1.0195 | 0.805 | 0.7771 | 0.2994 | 0.0573 |
| No log | 44.0 | 308 | 0.4652 | 0.805 | 0.3437 | 0.9095 | 0.805 | 0.7809 | 0.3134 | 0.0545 |
| No log | 45.0 | 315 | 0.4641 | 0.81 | 0.3418 | 0.8922 | 0.81 | 0.7806 | 0.3014 | 0.0550 |
| No log | 46.0 | 322 | 0.4579 | 0.825 | 0.3361 | 0.9516 | 0.825 | 0.7956 | 0.2971 | 0.0490 |
| No log | 47.0 | 329 | 0.4638 | 0.82 | 0.3423 | 0.8830 | 0.82 | 0.7961 | 0.3259 | 0.0557 |
| No log | 48.0 | 336 | 0.4643 | 0.81 | 0.3434 | 1.0046 | 0.81 | 0.7789 | 0.3042 | 0.0541 |
| No log | 49.0 | 343 | 0.4596 | 0.81 | 0.3388 | 0.9862 | 0.81 | 0.7835 | 0.3170 | 0.0532 |
| No log | 50.0 | 350 | 0.4603 | 0.815 | 0.3399 | 0.9288 | 0.815 | 0.7963 | 0.3031 | 0.0533 |
| No log | 51.0 | 357 | 0.4610 | 0.815 | 0.3403 | 0.9900 | 0.815 | 0.7898 | 0.3306 | 0.0546 |
| No log | 52.0 | 364 | 0.4617 | 0.81 | 0.3412 | 0.9834 | 0.81 | 0.7793 | 0.3079 | 0.0533 |
| No log | 53.0 | 371 | 0.4627 | 0.815 | 0.3423 | 0.9901 | 0.815 | 0.7898 | 0.3023 | 0.0543 |
| No log | 54.0 | 378 | 0.4612 | 0.815 | 0.3415 | 0.9868 | 0.815 | 0.7962 | 0.3178 | 0.0534 |
| No log | 55.0 | 385 | 0.4617 | 0.815 | 0.3416 | 0.9904 | 0.815 | 0.7898 | 0.3117 | 0.0533 |
| No log | 56.0 | 392 | 0.4605 | 0.81 | 0.3399 | 0.9845 | 0.81 | 0.7793 | 0.3069 | 0.0535 |
| No log | 57.0 | 399 | 0.4606 | 0.81 | 0.3405 | 0.9818 | 0.81 | 0.7793 | 0.3045 | 0.0531 |
| No log | 58.0 | 406 | 0.4614 | 0.81 | 0.3413 | 0.9853 | 0.81 | 0.7793 | 0.3114 | 0.0537 |
| No log | 59.0 | 413 | 0.4623 | 0.81 | 0.3424 | 0.9848 | 0.81 | 0.7793 | 0.3045 | 0.0534 |
| No log | 60.0 | 420 | 0.4621 | 0.81 | 0.3421 | 0.9898 | 0.81 | 0.7863 | 0.3150 | 0.0536 |
| No log | 61.0 | 427 | 0.4620 | 0.81 | 0.3417 | 0.9868 | 0.81 | 0.7793 | 0.3060 | 0.0534 |
| No log | 62.0 | 434 | 0.4618 | 0.81 | 0.3413 | 0.9843 | 0.81 | 0.7793 | 0.3029 | 0.0533 |
| No log | 63.0 | 441 | 0.4622 | 0.81 | 0.3419 | 0.9868 | 0.81 | 0.7793 | 0.2969 | 0.0535 |
| No log | 64.0 | 448 | 0.4621 | 0.81 | 0.3419 | 0.9881 | 0.81 | 0.7793 | 0.3070 | 0.0542 |
| No log | 65.0 | 455 | 0.4625 | 0.81 | 0.3422 | 0.9871 | 0.81 | 0.7794 | 0.3131 | 0.0532 |
| No log | 66.0 | 462 | 0.4626 | 0.81 | 0.3423 | 0.9880 | 0.81 | 0.7794 | 0.3066 | 0.0533 |
| No log | 67.0 | 469 | 0.4621 | 0.81 | 0.3420 | 0.9872 | 0.81 | 0.7793 | 0.3066 | 0.0536 |
| No log | 68.0 | 476 | 0.4624 | 0.81 | 0.3421 | 0.9882 | 0.81 | 0.7794 | 0.2966 | 0.0533 |
| No log | 69.0 | 483 | 0.4627 | 0.81 | 0.3425 | 0.9891 | 0.81 | 0.7794 | 0.3160 | 0.0534 |
| No log | 70.0 | 490 | 0.4628 | 0.81 | 0.3424 | 0.9899 | 0.81 | 0.7794 | 0.2970 | 0.0533 |
| No log | 71.0 | 497 | 0.4627 | 0.81 | 0.3423 | 0.9890 | 0.81 | 0.7794 | 0.2968 | 0.0532 |
| 0.3139 | 72.0 | 504 | 0.4625 | 0.81 | 0.3423 | 0.9886 | 0.81 | 0.7794 | 0.2971 | 0.0534 |
| 0.3139 | 73.0 | 511 | 0.4625 | 0.81 | 0.3423 | 0.9892 | 0.81 | 0.7794 | 0.3043 | 0.0535 |
| 0.3139 | 74.0 | 518 | 0.4626 | 0.81 | 0.3422 | 0.9881 | 0.81 | 0.7794 | 0.2969 | 0.0533 |
| 0.3139 | 75.0 | 525 | 0.4631 | 0.81 | 0.3428 | 0.9896 | 0.81 | 0.7794 | 0.3142 | 0.0533 |
| 0.3139 | 76.0 | 532 | 0.4628 | 0.81 | 0.3425 | 0.9893 | 0.81 | 0.7794 | 0.3138 | 0.0532 |
| 0.3139 | 77.0 | 539 | 0.4627 | 0.81 | 0.3423 | 0.9889 | 0.81 | 0.7794 | 0.3040 | 0.0533 |
| 0.3139 | 78.0 | 546 | 0.4628 | 0.81 | 0.3425 | 0.9888 | 0.81 | 0.7794 | 0.3138 | 0.0533 |
| 0.3139 | 79.0 | 553 | 0.4629 | 0.81 | 0.3426 | 0.9898 | 0.81 | 0.7794 | 0.3002 | 0.0535 |
| 0.3139 | 80.0 | 560 | 0.4630 | 0.81 | 0.3426 | 0.9892 | 0.81 | 0.7794 | 0.3041 | 0.0534 |
| 0.3139 | 81.0 | 567 | 0.4631 | 0.81 | 0.3428 | 0.9899 | 0.81 | 0.7794 | 0.3042 | 0.0534 |
| 0.3139 | 82.0 | 574 | 0.4628 | 0.81 | 0.3424 | 0.9889 | 0.81 | 0.7794 | 0.3039 | 0.0532 |
| 0.3139 | 83.0 | 581 | 0.4630 | 0.81 | 0.3427 | 0.9893 | 0.81 | 0.7794 | 0.3068 | 0.0533 |
| 0.3139 | 84.0 | 588 | 0.4629 | 0.81 | 0.3426 | 0.9894 | 0.81 | 0.7794 | 0.3069 | 0.0534 |
| 0.3139 | 85.0 | 595 | 0.4629 | 0.81 | 0.3425 | 0.9893 | 0.81 | 0.7794 | 0.3138 | 0.0535 |
| 0.3139 | 86.0 | 602 | 0.4630 | 0.81 | 0.3427 | 0.9896 | 0.81 | 0.7794 | 0.3070 | 0.0533 |
| 0.3139 | 87.0 | 609 | 0.4630 | 0.81 | 0.3426 | 0.9890 | 0.81 | 0.7794 | 0.3069 | 0.0534 |
| 0.3139 | 88.0 | 616 | 0.4630 | 0.81 | 0.3426 | 0.9893 | 0.81 | 0.7794 | 0.3069 | 0.0533 |
| 0.3139 | 89.0 | 623 | 0.4630 | 0.81 | 0.3426 | 0.9897 | 0.81 | 0.7794 | 0.3001 | 0.0535 |
| 0.3139 | 90.0 | 630 | 0.4631 | 0.81 | 0.3428 | 0.9902 | 0.81 | 0.7794 | 0.2904 | 0.0534 |
| 0.3139 | 91.0 | 637 | 0.4631 | 0.81 | 0.3427 | 0.9892 | 0.81 | 0.7794 | 0.3139 | 0.0533 |
| 0.3139 | 92.0 | 644 | 0.4631 | 0.81 | 0.3427 | 0.9894 | 0.81 | 0.7794 | 0.3071 | 0.0535 |
| 0.3139 | 93.0 | 651 | 0.4631 | 0.81 | 0.3428 | 0.9899 | 0.81 | 0.7794 | 0.3001 | 0.0534 |
| 0.3139 | 94.0 | 658 | 0.4630 | 0.81 | 0.3427 | 0.9894 | 0.81 | 0.7794 | 0.3069 | 0.0534 |
| 0.3139 | 95.0 | 665 | 0.4631 | 0.81 | 0.3428 | 0.9896 | 0.81 | 0.7794 | 0.3071 | 0.0534 |
| 0.3139 | 96.0 | 672 | 0.4630 | 0.81 | 0.3427 | 0.9893 | 0.81 | 0.7794 | 0.3070 | 0.0534 |
| 0.3139 | 97.0 | 679 | 0.4631 | 0.81 | 0.3427 | 0.9895 | 0.81 | 0.7794 | 0.3002 | 0.0535 |
| 0.3139 | 98.0 | 686 | 0.4631 | 0.81 | 0.3428 | 0.9899 | 0.81 | 0.7794 | 0.3002 | 0.0534 |
| 0.3139 | 99.0 | 693 | 0.4631 | 0.81 | 0.3427 | 0.9897 | 0.81 | 0.7794 | 0.3002 | 0.0534 |
| 0.3139 | 100.0 | 700 | 0.4631 | 0.81 | 0.3427 | 0.9896 | 0.81 | 0.7794 | 0.3002 | 0.0534 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
Sanfee18/my_awesome_food_model
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_food_model
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the food101 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6056
- Accuracy: 0.901
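The checkpoint can be loaded for inference with the standard `pipeline` API (the image path below is a placeholder):
```python
from transformers import pipeline

classifier = pipeline("image-classification", model="Sanfee18/my_awesome_food_model")
print(classifier("some_dish.jpg"))  # top predicted food101 labels with scores
```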
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64 (train_batch_size 16 × gradient_accumulation_steps 4)
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.6911 | 0.99 | 62 | 2.5097 | 0.856 |
| 1.8202 | 2.0 | 125 | 1.7735 | 0.894 |
| 1.5694 | 2.98 | 186 | 1.6056 | 0.901 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"apple_pie",
"baby_back_ribs",
"bruschetta",
"waffles",
"caesar_salad",
"cannoli",
"caprese_salad",
"carrot_cake",
"ceviche",
"cheesecake",
"cheese_plate",
"chicken_curry",
"chicken_quesadilla",
"baklava",
"chicken_wings",
"chocolate_cake",
"chocolate_mousse",
"churros",
"clam_chowder",
"club_sandwich",
"crab_cakes",
"creme_brulee",
"croque_madame",
"cup_cakes",
"beef_carpaccio",
"deviled_eggs",
"donuts",
"dumplings",
"edamame",
"eggs_benedict",
"escargots",
"falafel",
"filet_mignon",
"fish_and_chips",
"foie_gras",
"beef_tartare",
"french_fries",
"french_onion_soup",
"french_toast",
"fried_calamari",
"fried_rice",
"frozen_yogurt",
"garlic_bread",
"gnocchi",
"greek_salad",
"grilled_cheese_sandwich",
"beet_salad",
"grilled_salmon",
"guacamole",
"gyoza",
"hamburger",
"hot_and_sour_soup",
"hot_dog",
"huevos_rancheros",
"hummus",
"ice_cream",
"lasagna",
"beignets",
"lobster_bisque",
"lobster_roll_sandwich",
"macaroni_and_cheese",
"macarons",
"miso_soup",
"mussels",
"nachos",
"omelette",
"onion_rings",
"oysters",
"bibimbap",
"pad_thai",
"paella",
"pancakes",
"panna_cotta",
"peking_duck",
"pho",
"pizza",
"pork_chop",
"poutine",
"prime_rib",
"bread_pudding",
"pulled_pork_sandwich",
"ramen",
"ravioli",
"red_velvet_cake",
"risotto",
"samosa",
"sashimi",
"scallops",
"seaweed_salad",
"shrimp_and_grits",
"breakfast_burrito",
"spaghetti_bolognese",
"spaghetti_carbonara",
"spring_rolls",
"steak",
"strawberry_shortcake",
"sushi",
"tacos",
"takoyaki",
"tiramisu",
"tuna_tartare"
] |
jordyvl/60-tiny_tobacco3482
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 60-tiny_tobacco3482
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset (recorded as `None`; the model name suggests Tobacco-3482).
It achieves the following results on the evaluation set:
- Loss: 0.1098
- Accuracy: 0.79
- Brier Loss: 0.4604
- NLL: 0.9058
- F1 Micro: 0.79
- F1 Macro: 0.7539
- ECE: 0.4083
- AURC: 0.0644
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
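The label list below matches the Tobacco-3482 document classes, which are commonly distributed as class-named image folders; if so, loading would plausibly use the `imagefolder` builder (the path is hypothetical):
```python
from datasets import load_dataset

dataset = load_dataset("imagefolder", data_dir="path/to/tobacco3482")
```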
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 1.9988 | 0.16 | 1.0265 | 8.4697 | 0.16 | 0.0543 | 0.3527 | 0.8493 |
| No log | 2.0 | 14 | 0.9241 | 0.1 | 0.9095 | 7.2634 | 0.1000 | 0.0853 | 0.2311 | 0.8707 |
| No log | 3.0 | 21 | 0.5325 | 0.29 | 0.8525 | 5.4845 | 0.29 | 0.1287 | 0.2809 | 0.6344 |
| No log | 4.0 | 28 | 0.4050 | 0.36 | 0.7891 | 3.8464 | 0.36 | 0.2714 | 0.2948 | 0.4598 |
| No log | 5.0 | 35 | 0.3282 | 0.495 | 0.7189 | 2.5837 | 0.495 | 0.3741 | 0.3317 | 0.3367 |
| No log | 6.0 | 42 | 0.2722 | 0.585 | 0.6827 | 2.2625 | 0.585 | 0.4818 | 0.3751 | 0.2241 |
| No log | 7.0 | 49 | 0.2426 | 0.605 | 0.6424 | 2.0945 | 0.605 | 0.4641 | 0.3516 | 0.2034 |
| No log | 8.0 | 56 | 0.2223 | 0.685 | 0.6171 | 1.9679 | 0.685 | 0.5711 | 0.4148 | 0.1545 |
| No log | 9.0 | 63 | 0.1939 | 0.67 | 0.5684 | 1.8320 | 0.67 | 0.5495 | 0.3460 | 0.1438 |
| No log | 10.0 | 70 | 0.1950 | 0.695 | 0.5454 | 1.3666 | 0.695 | 0.5750 | 0.3667 | 0.1276 |
| No log | 11.0 | 77 | 0.1838 | 0.69 | 0.5436 | 1.4378 | 0.69 | 0.5745 | 0.3846 | 0.1162 |
| No log | 12.0 | 84 | 0.1787 | 0.735 | 0.5138 | 1.4550 | 0.735 | 0.6490 | 0.3893 | 0.0949 |
| No log | 13.0 | 91 | 0.1682 | 0.74 | 0.5209 | 1.6406 | 0.74 | 0.6452 | 0.3904 | 0.1052 |
| No log | 14.0 | 98 | 0.1734 | 0.75 | 0.5420 | 1.4280 | 0.75 | 0.6656 | 0.4440 | 0.0864 |
| No log | 15.0 | 105 | 0.1404 | 0.725 | 0.4961 | 1.3478 | 0.7250 | 0.6343 | 0.3816 | 0.0895 |
| No log | 16.0 | 112 | 0.1439 | 0.76 | 0.4709 | 1.4366 | 0.76 | 0.6679 | 0.3721 | 0.0754 |
| No log | 17.0 | 119 | 0.1356 | 0.74 | 0.4745 | 1.3609 | 0.74 | 0.6596 | 0.3643 | 0.0868 |
| No log | 18.0 | 126 | 0.1373 | 0.75 | 0.4760 | 1.4421 | 0.75 | 0.6703 | 0.3773 | 0.0783 |
| No log | 19.0 | 133 | 0.1352 | 0.765 | 0.4851 | 1.1693 | 0.765 | 0.6803 | 0.3952 | 0.0693 |
| No log | 20.0 | 140 | 0.1323 | 0.765 | 0.4838 | 1.0026 | 0.765 | 0.6844 | 0.3866 | 0.0741 |
| No log | 21.0 | 147 | 0.1334 | 0.785 | 0.4713 | 1.1479 | 0.785 | 0.7430 | 0.4105 | 0.0737 |
| No log | 22.0 | 154 | 0.1267 | 0.775 | 0.4706 | 1.1082 | 0.775 | 0.7355 | 0.3838 | 0.0727 |
| No log | 23.0 | 161 | 0.1279 | 0.77 | 0.4598 | 1.0869 | 0.7700 | 0.7254 | 0.3846 | 0.0732 |
| No log | 24.0 | 168 | 0.1229 | 0.805 | 0.4838 | 1.0060 | 0.805 | 0.7635 | 0.4268 | 0.0625 |
| No log | 25.0 | 175 | 0.1250 | 0.79 | 0.4740 | 0.9769 | 0.79 | 0.7462 | 0.3898 | 0.0684 |
| No log | 26.0 | 182 | 0.1371 | 0.795 | 0.4784 | 1.1316 | 0.795 | 0.7641 | 0.4246 | 0.0732 |
| No log | 27.0 | 189 | 0.1230 | 0.77 | 0.4625 | 0.9606 | 0.7700 | 0.7185 | 0.3816 | 0.0712 |
| No log | 28.0 | 196 | 0.1161 | 0.775 | 0.4661 | 0.9889 | 0.775 | 0.7375 | 0.3925 | 0.0658 |
| No log | 29.0 | 203 | 0.1194 | 0.775 | 0.4688 | 1.0280 | 0.775 | 0.7320 | 0.4087 | 0.0709 |
| No log | 30.0 | 210 | 0.1211 | 0.795 | 0.4680 | 1.0785 | 0.795 | 0.7677 | 0.4168 | 0.0671 |
| No log | 31.0 | 217 | 0.1208 | 0.79 | 0.4629 | 0.9986 | 0.79 | 0.7536 | 0.3892 | 0.0658 |
| No log | 32.0 | 224 | 0.1194 | 0.77 | 0.4588 | 0.9202 | 0.7700 | 0.7313 | 0.3791 | 0.0679 |
| No log | 33.0 | 231 | 0.1167 | 0.795 | 0.4567 | 0.9374 | 0.795 | 0.7602 | 0.3852 | 0.0668 |
| No log | 34.0 | 238 | 0.1205 | 0.77 | 0.4653 | 0.9700 | 0.7700 | 0.7291 | 0.3829 | 0.0721 |
| No log | 35.0 | 245 | 0.1179 | 0.77 | 0.4616 | 0.9313 | 0.7700 | 0.7366 | 0.3797 | 0.0724 |
| No log | 36.0 | 252 | 0.1155 | 0.78 | 0.4566 | 0.9870 | 0.78 | 0.7391 | 0.3718 | 0.0661 |
| No log | 37.0 | 259 | 0.1151 | 0.785 | 0.4614 | 0.8936 | 0.785 | 0.7455 | 0.4010 | 0.0684 |
| No log | 38.0 | 266 | 0.1126 | 0.78 | 0.4588 | 0.9190 | 0.78 | 0.7406 | 0.3874 | 0.0669 |
| No log | 39.0 | 273 | 0.1139 | 0.78 | 0.4637 | 0.9150 | 0.78 | 0.7408 | 0.3874 | 0.0708 |
| No log | 40.0 | 280 | 0.1138 | 0.785 | 0.4650 | 0.9096 | 0.785 | 0.7500 | 0.4002 | 0.0680 |
| No log | 41.0 | 287 | 0.1139 | 0.79 | 0.4644 | 0.9092 | 0.79 | 0.7590 | 0.4034 | 0.0668 |
| No log | 42.0 | 294 | 0.1140 | 0.79 | 0.4670 | 0.9062 | 0.79 | 0.7494 | 0.4042 | 0.0665 |
| No log | 43.0 | 301 | 0.1126 | 0.785 | 0.4623 | 0.8470 | 0.785 | 0.7502 | 0.4079 | 0.0660 |
| No log | 44.0 | 308 | 0.1146 | 0.775 | 0.4651 | 0.9065 | 0.775 | 0.7314 | 0.3907 | 0.0721 |
| No log | 45.0 | 315 | 0.1122 | 0.785 | 0.4643 | 0.8626 | 0.785 | 0.7443 | 0.3991 | 0.0639 |
| No log | 46.0 | 322 | 0.1109 | 0.795 | 0.4644 | 0.9087 | 0.795 | 0.7631 | 0.4017 | 0.0629 |
| No log | 47.0 | 329 | 0.1116 | 0.79 | 0.4640 | 0.8473 | 0.79 | 0.7584 | 0.3959 | 0.0634 |
| No log | 48.0 | 336 | 0.1147 | 0.78 | 0.4662 | 0.8717 | 0.78 | 0.7467 | 0.3859 | 0.0677 |
| No log | 49.0 | 343 | 0.1154 | 0.765 | 0.4586 | 1.0035 | 0.765 | 0.7366 | 0.3826 | 0.0764 |
| No log | 50.0 | 350 | 0.1112 | 0.79 | 0.4582 | 0.9230 | 0.79 | 0.7525 | 0.3854 | 0.0672 |
| No log | 51.0 | 357 | 0.1104 | 0.79 | 0.4633 | 0.9120 | 0.79 | 0.7598 | 0.4000 | 0.0667 |
| No log | 52.0 | 364 | 0.1115 | 0.79 | 0.4641 | 0.8550 | 0.79 | 0.7603 | 0.3914 | 0.0672 |
| No log | 53.0 | 371 | 0.1150 | 0.77 | 0.4613 | 0.9215 | 0.7700 | 0.7333 | 0.3882 | 0.0733 |
| No log | 54.0 | 378 | 0.1100 | 0.8 | 0.4596 | 0.9149 | 0.8000 | 0.7610 | 0.4055 | 0.0657 |
| No log | 55.0 | 385 | 0.1094 | 0.785 | 0.4613 | 0.9060 | 0.785 | 0.7506 | 0.3956 | 0.0664 |
| No log | 56.0 | 392 | 0.1087 | 0.785 | 0.4607 | 0.9068 | 0.785 | 0.7498 | 0.3984 | 0.0649 |
| No log | 57.0 | 399 | 0.1094 | 0.785 | 0.4630 | 0.8993 | 0.785 | 0.7491 | 0.3943 | 0.0674 |
| No log | 58.0 | 406 | 0.1100 | 0.805 | 0.4627 | 0.9130 | 0.805 | 0.7693 | 0.4018 | 0.0637 |
| No log | 59.0 | 413 | 0.1103 | 0.795 | 0.4619 | 0.8483 | 0.795 | 0.7618 | 0.3992 | 0.0632 |
| No log | 60.0 | 420 | 0.1093 | 0.79 | 0.4631 | 0.9007 | 0.79 | 0.7539 | 0.3936 | 0.0647 |
| No log | 61.0 | 427 | 0.1095 | 0.79 | 0.4594 | 0.9073 | 0.79 | 0.7539 | 0.4129 | 0.0654 |
| No log | 62.0 | 434 | 0.1092 | 0.79 | 0.4591 | 0.9087 | 0.79 | 0.7539 | 0.3956 | 0.0638 |
| No log | 63.0 | 441 | 0.1096 | 0.79 | 0.4611 | 0.9075 | 0.79 | 0.7539 | 0.4088 | 0.0654 |
| No log | 64.0 | 448 | 0.1093 | 0.79 | 0.4610 | 0.9041 | 0.79 | 0.7539 | 0.3953 | 0.0650 |
| No log | 65.0 | 455 | 0.1092 | 0.79 | 0.4602 | 0.9049 | 0.79 | 0.7539 | 0.3845 | 0.0642 |
| No log | 66.0 | 462 | 0.1092 | 0.79 | 0.4605 | 0.9027 | 0.79 | 0.7539 | 0.3870 | 0.0646 |
| No log | 67.0 | 469 | 0.1094 | 0.79 | 0.4610 | 0.9047 | 0.79 | 0.7539 | 0.3967 | 0.0643 |
| No log | 68.0 | 476 | 0.1094 | 0.79 | 0.4602 | 0.9053 | 0.79 | 0.7539 | 0.3956 | 0.0645 |
| No log | 69.0 | 483 | 0.1094 | 0.79 | 0.4599 | 0.9054 | 0.79 | 0.7539 | 0.3950 | 0.0646 |
| No log | 70.0 | 490 | 0.1095 | 0.79 | 0.4609 | 0.9036 | 0.79 | 0.7539 | 0.4054 | 0.0647 |
| No log | 71.0 | 497 | 0.1095 | 0.79 | 0.4601 | 0.9066 | 0.79 | 0.7539 | 0.3937 | 0.0646 |
| 0.1361 | 72.0 | 504 | 0.1095 | 0.79 | 0.4602 | 0.9045 | 0.79 | 0.7539 | 0.3958 | 0.0644 |
| 0.1361 | 73.0 | 511 | 0.1095 | 0.79 | 0.4605 | 0.9064 | 0.79 | 0.7539 | 0.3900 | 0.0645 |
| 0.1361 | 74.0 | 518 | 0.1095 | 0.79 | 0.4606 | 0.9037 | 0.79 | 0.7539 | 0.4116 | 0.0642 |
| 0.1361 | 75.0 | 525 | 0.1095 | 0.79 | 0.4606 | 0.9072 | 0.79 | 0.7539 | 0.4079 | 0.0646 |
| 0.1361 | 76.0 | 532 | 0.1096 | 0.79 | 0.4604 | 0.9066 | 0.79 | 0.7539 | 0.4017 | 0.0644 |
| 0.1361 | 77.0 | 539 | 0.1095 | 0.79 | 0.4603 | 0.9062 | 0.79 | 0.7539 | 0.4014 | 0.0646 |
| 0.1361 | 78.0 | 546 | 0.1096 | 0.79 | 0.4600 | 0.9053 | 0.79 | 0.7539 | 0.3957 | 0.0644 |
| 0.1361 | 79.0 | 553 | 0.1096 | 0.79 | 0.4606 | 0.9056 | 0.79 | 0.7539 | 0.3986 | 0.0645 |
| 0.1361 | 80.0 | 560 | 0.1097 | 0.79 | 0.4602 | 0.9059 | 0.79 | 0.7539 | 0.4023 | 0.0647 |
| 0.1361 | 81.0 | 567 | 0.1096 | 0.79 | 0.4604 | 0.9056 | 0.79 | 0.7539 | 0.4042 | 0.0645 |
| 0.1361 | 82.0 | 574 | 0.1097 | 0.79 | 0.4603 | 0.9058 | 0.79 | 0.7539 | 0.4082 | 0.0646 |
| 0.1361 | 83.0 | 581 | 0.1097 | 0.79 | 0.4606 | 0.9066 | 0.79 | 0.7539 | 0.4085 | 0.0645 |
| 0.1361 | 84.0 | 588 | 0.1097 | 0.79 | 0.4603 | 0.9060 | 0.79 | 0.7539 | 0.4040 | 0.0645 |
| 0.1361 | 85.0 | 595 | 0.1097 | 0.79 | 0.4606 | 0.9059 | 0.79 | 0.7539 | 0.3949 | 0.0645 |
| 0.1361 | 86.0 | 602 | 0.1097 | 0.79 | 0.4603 | 0.9059 | 0.79 | 0.7539 | 0.4040 | 0.0645 |
| 0.1361 | 87.0 | 609 | 0.1097 | 0.79 | 0.4605 | 0.9051 | 0.79 | 0.7539 | 0.4025 | 0.0644 |
| 0.1361 | 88.0 | 616 | 0.1097 | 0.79 | 0.4605 | 0.9055 | 0.79 | 0.7539 | 0.3962 | 0.0643 |
| 0.1361 | 89.0 | 623 | 0.1097 | 0.79 | 0.4603 | 0.9056 | 0.79 | 0.7539 | 0.4040 | 0.0643 |
| 0.1361 | 90.0 | 630 | 0.1098 | 0.79 | 0.4604 | 0.9051 | 0.79 | 0.7539 | 0.3962 | 0.0643 |
| 0.1361 | 91.0 | 637 | 0.1098 | 0.79 | 0.4604 | 0.9064 | 0.79 | 0.7539 | 0.4041 | 0.0644 |
| 0.1361 | 92.0 | 644 | 0.1098 | 0.79 | 0.4605 | 0.9055 | 0.79 | 0.7539 | 0.4004 | 0.0644 |
| 0.1361 | 93.0 | 651 | 0.1098 | 0.79 | 0.4605 | 0.9059 | 0.79 | 0.7539 | 0.4042 | 0.0644 |
| 0.1361 | 94.0 | 658 | 0.1098 | 0.79 | 0.4603 | 0.9059 | 0.79 | 0.7539 | 0.4094 | 0.0643 |
| 0.1361 | 95.0 | 665 | 0.1098 | 0.79 | 0.4605 | 0.9056 | 0.79 | 0.7539 | 0.4138 | 0.0645 |
| 0.1361 | 96.0 | 672 | 0.1098 | 0.79 | 0.4604 | 0.9059 | 0.79 | 0.7539 | 0.4095 | 0.0643 |
| 0.1361 | 97.0 | 679 | 0.1098 | 0.79 | 0.4604 | 0.9057 | 0.79 | 0.7539 | 0.4137 | 0.0643 |
| 0.1361 | 98.0 | 686 | 0.1098 | 0.79 | 0.4604 | 0.9059 | 0.79 | 0.7539 | 0.4096 | 0.0643 |
| 0.1361 | 99.0 | 693 | 0.1098 | 0.79 | 0.4604 | 0.9059 | 0.79 | 0.7539 | 0.4137 | 0.0644 |
| 0.1361 | 100.0 | 700 | 0.1098 | 0.79 | 0.4604 | 0.9058 | 0.79 | 0.7539 | 0.4083 | 0.0644 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/60-tiny_tobacco3482_kd_NKD_t1.0_g1.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 60-tiny_tobacco3482_kd_NKD_t1.0_g1.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset (recorded as `None`; the model name suggests Tobacco-3482).
It achieves the following results on the evaluation set (the NLL metric is sketched after the list):
- Loss: 4.0655
- Accuracy: 0.82
- Brier Loss: 0.2655
- NLL: 1.0777
- F1 Micro: 0.82
- F1 Macro: 0.7953
- ECE: 0.1364
- AURC: 0.0539
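NLL is presumably the mean negative log-likelihood of the true class under the softmax output; a short sketch with the usual array conventions (`probs` of shape `(N, C)`, integer `labels`):
```python
import numpy as np

def nll(probs, labels, eps=1e-12):
    # mean negative log-probability assigned to the true class
    return -np.log(probs[np.arange(len(labels)), labels] + eps).mean()
```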
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 5.7153 | 0.09 | 1.0182 | 8.6654 | 0.09 | 0.0745 | 0.3226 | 0.8984 |
| No log | 2.0 | 14 | 4.7081 | 0.13 | 0.9028 | 8.0605 | 0.13 | 0.0949 | 0.2402 | 0.8297 |
| No log | 3.0 | 21 | 4.3197 | 0.215 | 0.8419 | 6.1917 | 0.2150 | 0.1338 | 0.2566 | 0.5994 |
| No log | 4.0 | 28 | 4.1502 | 0.405 | 0.7493 | 4.1855 | 0.405 | 0.3261 | 0.3035 | 0.4065 |
| No log | 5.0 | 35 | 4.0310 | 0.545 | 0.6643 | 2.9984 | 0.545 | 0.4451 | 0.3080 | 0.2731 |
| No log | 6.0 | 42 | 3.9472 | 0.64 | 0.6089 | 2.4473 | 0.64 | 0.5047 | 0.3290 | 0.1849 |
| No log | 7.0 | 49 | 3.8616 | 0.655 | 0.5435 | 2.3233 | 0.655 | 0.5434 | 0.3086 | 0.1536 |
| No log | 8.0 | 56 | 3.8289 | 0.72 | 0.5177 | 2.1266 | 0.72 | 0.6162 | 0.3496 | 0.1202 |
| No log | 9.0 | 63 | 3.7812 | 0.715 | 0.4461 | 1.7718 | 0.715 | 0.6276 | 0.2464 | 0.1099 |
| No log | 10.0 | 70 | 3.7599 | 0.715 | 0.3960 | 1.7795 | 0.715 | 0.6158 | 0.2262 | 0.0879 |
| No log | 11.0 | 77 | 3.7506 | 0.74 | 0.3758 | 1.9447 | 0.74 | 0.6284 | 0.2212 | 0.0847 |
| No log | 12.0 | 84 | 3.7261 | 0.79 | 0.3515 | 1.7839 | 0.79 | 0.7107 | 0.2419 | 0.0698 |
| No log | 13.0 | 91 | 3.8045 | 0.72 | 0.3765 | 1.8827 | 0.72 | 0.6249 | 0.1943 | 0.0941 |
| No log | 14.0 | 98 | 3.6964 | 0.79 | 0.3309 | 1.8042 | 0.79 | 0.6873 | 0.2162 | 0.0629 |
| No log | 15.0 | 105 | 3.7058 | 0.78 | 0.3285 | 1.8229 | 0.78 | 0.6827 | 0.1999 | 0.0672 |
| No log | 16.0 | 112 | 3.8395 | 0.74 | 0.3502 | 1.9683 | 0.74 | 0.6554 | 0.1777 | 0.0880 |
| No log | 17.0 | 119 | 3.7568 | 0.795 | 0.3120 | 1.3060 | 0.795 | 0.7121 | 0.1656 | 0.0633 |
| No log | 18.0 | 126 | 3.7399 | 0.795 | 0.3099 | 1.6225 | 0.795 | 0.7133 | 0.1721 | 0.0615 |
| No log | 19.0 | 133 | 3.7813 | 0.82 | 0.2884 | 1.3165 | 0.82 | 0.7475 | 0.1698 | 0.0565 |
| No log | 20.0 | 140 | 3.7907 | 0.8 | 0.3074 | 1.2638 | 0.8000 | 0.7701 | 0.1862 | 0.0654 |
| No log | 21.0 | 147 | 3.7687 | 0.83 | 0.2860 | 1.2646 | 0.83 | 0.8077 | 0.1730 | 0.0569 |
| No log | 22.0 | 154 | 3.7182 | 0.82 | 0.2813 | 1.3271 | 0.82 | 0.7848 | 0.1762 | 0.0551 |
| No log | 23.0 | 161 | 3.7801 | 0.83 | 0.2679 | 1.1397 | 0.83 | 0.8015 | 0.1574 | 0.0568 |
| No log | 24.0 | 168 | 3.7659 | 0.82 | 0.2752 | 1.1301 | 0.82 | 0.7893 | 0.1581 | 0.0582 |
| No log | 25.0 | 175 | 3.7960 | 0.825 | 0.2755 | 1.2565 | 0.825 | 0.8017 | 0.1663 | 0.0540 |
| No log | 26.0 | 182 | 3.7630 | 0.82 | 0.2715 | 1.0265 | 0.82 | 0.7971 | 0.1439 | 0.0536 |
| No log | 27.0 | 189 | 3.8411 | 0.82 | 0.2822 | 1.2532 | 0.82 | 0.7929 | 0.1621 | 0.0623 |
| No log | 28.0 | 196 | 3.7869 | 0.83 | 0.2608 | 1.1781 | 0.83 | 0.8013 | 0.1461 | 0.0531 |
| No log | 29.0 | 203 | 3.7798 | 0.81 | 0.2724 | 1.0950 | 0.81 | 0.7785 | 0.1436 | 0.0562 |
| No log | 30.0 | 210 | 3.8412 | 0.82 | 0.2724 | 1.3168 | 0.82 | 0.7892 | 0.1473 | 0.0596 |
| No log | 31.0 | 217 | 3.7937 | 0.83 | 0.2683 | 1.1818 | 0.83 | 0.8137 | 0.1523 | 0.0515 |
| No log | 32.0 | 224 | 3.7964 | 0.825 | 0.2655 | 1.1605 | 0.825 | 0.8028 | 0.1580 | 0.0551 |
| No log | 33.0 | 231 | 3.7950 | 0.81 | 0.2685 | 1.2514 | 0.81 | 0.7854 | 0.1275 | 0.0534 |
| No log | 34.0 | 238 | 3.8086 | 0.83 | 0.2599 | 0.9670 | 0.83 | 0.8022 | 0.1309 | 0.0529 |
| No log | 35.0 | 245 | 3.8393 | 0.83 | 0.2649 | 0.9615 | 0.83 | 0.8050 | 0.1445 | 0.0543 |
| No log | 36.0 | 252 | 3.8140 | 0.84 | 0.2528 | 1.0685 | 0.8400 | 0.8194 | 0.1371 | 0.0528 |
| No log | 37.0 | 259 | 3.8164 | 0.825 | 0.2665 | 1.0001 | 0.825 | 0.8030 | 0.1501 | 0.0548 |
| No log | 38.0 | 266 | 3.8298 | 0.82 | 0.2669 | 1.0166 | 0.82 | 0.8007 | 0.1413 | 0.0549 |
| No log | 39.0 | 273 | 3.8346 | 0.815 | 0.2646 | 1.1609 | 0.815 | 0.7876 | 0.1453 | 0.0551 |
| No log | 40.0 | 280 | 3.8409 | 0.825 | 0.2610 | 1.1811 | 0.825 | 0.8054 | 0.1432 | 0.0551 |
| No log | 41.0 | 287 | 3.8106 | 0.835 | 0.2642 | 1.0262 | 0.835 | 0.8078 | 0.1471 | 0.0547 |
| No log | 42.0 | 294 | 3.8402 | 0.81 | 0.2591 | 0.9683 | 0.81 | 0.7863 | 0.1638 | 0.0546 |
| No log | 43.0 | 301 | 3.8480 | 0.82 | 0.2604 | 1.1408 | 0.82 | 0.7933 | 0.1448 | 0.0538 |
| No log | 44.0 | 308 | 3.8627 | 0.835 | 0.2717 | 1.0142 | 0.835 | 0.8113 | 0.1472 | 0.0550 |
| No log | 45.0 | 315 | 3.8274 | 0.83 | 0.2589 | 1.0148 | 0.83 | 0.8138 | 0.1333 | 0.0518 |
| No log | 46.0 | 322 | 3.8349 | 0.83 | 0.2563 | 0.9773 | 0.83 | 0.8086 | 0.1237 | 0.0537 |
| No log | 47.0 | 329 | 3.8656 | 0.82 | 0.2668 | 0.9728 | 0.82 | 0.7947 | 0.1330 | 0.0546 |
| No log | 48.0 | 336 | 3.8455 | 0.84 | 0.2514 | 0.9784 | 0.8400 | 0.8175 | 0.1519 | 0.0509 |
| No log | 49.0 | 343 | 3.8827 | 0.81 | 0.2740 | 0.9515 | 0.81 | 0.7852 | 0.1325 | 0.0565 |
| No log | 50.0 | 350 | 3.8617 | 0.815 | 0.2659 | 1.2129 | 0.815 | 0.7885 | 0.1430 | 0.0565 |
| No log | 51.0 | 357 | 3.8555 | 0.825 | 0.2601 | 0.9544 | 0.825 | 0.8023 | 0.1500 | 0.0535 |
| No log | 52.0 | 364 | 3.8794 | 0.825 | 0.2611 | 1.1499 | 0.825 | 0.7998 | 0.1520 | 0.0550 |
| No log | 53.0 | 371 | 3.8563 | 0.815 | 0.2592 | 1.0856 | 0.815 | 0.7896 | 0.1198 | 0.0525 |
| No log | 54.0 | 378 | 3.8851 | 0.825 | 0.2579 | 0.8909 | 0.825 | 0.8042 | 0.1320 | 0.0543 |
| No log | 55.0 | 385 | 3.8682 | 0.815 | 0.2589 | 1.1381 | 0.815 | 0.7906 | 0.1376 | 0.0553 |
| No log | 56.0 | 392 | 3.8646 | 0.83 | 0.2582 | 1.0602 | 0.83 | 0.8144 | 0.1330 | 0.0532 |
| No log | 57.0 | 399 | 3.9205 | 0.82 | 0.2678 | 1.0739 | 0.82 | 0.7991 | 0.1403 | 0.0569 |
| No log | 58.0 | 406 | 3.8878 | 0.835 | 0.2613 | 0.9320 | 0.835 | 0.8112 | 0.1575 | 0.0527 |
| No log | 59.0 | 413 | 3.8944 | 0.82 | 0.2662 | 1.0116 | 0.82 | 0.7966 | 0.1304 | 0.0544 |
| No log | 60.0 | 420 | 3.8819 | 0.825 | 0.2597 | 1.1330 | 0.825 | 0.8019 | 0.1471 | 0.0529 |
| No log | 61.0 | 427 | 3.9219 | 0.82 | 0.2719 | 1.0206 | 0.82 | 0.7957 | 0.1465 | 0.0563 |
| No log | 62.0 | 434 | 3.8903 | 0.815 | 0.2583 | 0.9983 | 0.815 | 0.7945 | 0.1428 | 0.0532 |
| No log | 63.0 | 441 | 3.9258 | 0.82 | 0.2674 | 1.0722 | 0.82 | 0.7941 | 0.1312 | 0.0552 |
| No log | 64.0 | 448 | 3.9086 | 0.83 | 0.2546 | 1.1474 | 0.83 | 0.8049 | 0.1375 | 0.0522 |
| No log | 65.0 | 455 | 3.9260 | 0.83 | 0.2655 | 1.0795 | 0.83 | 0.8074 | 0.1300 | 0.0537 |
| No log | 66.0 | 462 | 3.9424 | 0.82 | 0.2634 | 1.0220 | 0.82 | 0.7978 | 0.1298 | 0.0542 |
| No log | 67.0 | 469 | 3.9315 | 0.815 | 0.2605 | 1.0206 | 0.815 | 0.7882 | 0.1404 | 0.0534 |
| No log | 68.0 | 476 | 3.9583 | 0.82 | 0.2634 | 1.0018 | 0.82 | 0.7991 | 0.1321 | 0.0532 |
| No log | 69.0 | 483 | 3.9447 | 0.825 | 0.2606 | 1.0840 | 0.825 | 0.8003 | 0.1326 | 0.0533 |
| No log | 70.0 | 490 | 3.9698 | 0.815 | 0.2646 | 0.9721 | 0.815 | 0.7940 | 0.1359 | 0.0546 |
| No log | 71.0 | 497 | 3.9621 | 0.82 | 0.2615 | 1.0207 | 0.82 | 0.7953 | 0.1312 | 0.0540 |
| 3.4582 | 72.0 | 504 | 3.9687 | 0.81 | 0.2628 | 1.0078 | 0.81 | 0.7861 | 0.1379 | 0.0549 |
| 3.4582 | 73.0 | 511 | 3.9699 | 0.825 | 0.2575 | 1.0739 | 0.825 | 0.8024 | 0.1257 | 0.0534 |
| 3.4582 | 74.0 | 518 | 3.9791 | 0.82 | 0.2638 | 1.0290 | 0.82 | 0.7978 | 0.1337 | 0.0543 |
| 3.4582 | 75.0 | 525 | 3.9893 | 0.82 | 0.2664 | 1.0711 | 0.82 | 0.7978 | 0.1323 | 0.0543 |
| 3.4582 | 76.0 | 532 | 3.9843 | 0.825 | 0.2620 | 0.9559 | 0.825 | 0.7999 | 0.1307 | 0.0535 |
| 3.4582 | 77.0 | 539 | 3.9983 | 0.82 | 0.2626 | 1.0239 | 0.82 | 0.7953 | 0.1333 | 0.0543 |
| 3.4582 | 78.0 | 546 | 4.0068 | 0.82 | 0.2663 | 1.0205 | 0.82 | 0.7953 | 0.1421 | 0.0542 |
| 3.4582 | 79.0 | 553 | 4.0096 | 0.82 | 0.2636 | 1.0204 | 0.82 | 0.7953 | 0.1296 | 0.0541 |
| 3.4582 | 80.0 | 560 | 4.0211 | 0.82 | 0.2655 | 1.0147 | 0.82 | 0.7953 | 0.1365 | 0.0541 |
| 3.4582 | 81.0 | 567 | 4.0206 | 0.815 | 0.2631 | 1.0781 | 0.815 | 0.7902 | 0.1378 | 0.0543 |
| 3.4582 | 82.0 | 574 | 4.0270 | 0.82 | 0.2652 | 1.0752 | 0.82 | 0.7953 | 0.1407 | 0.0539 |
| 3.4582 | 83.0 | 581 | 4.0301 | 0.82 | 0.2642 | 1.0763 | 0.82 | 0.7953 | 0.1444 | 0.0537 |
| 3.4582 | 84.0 | 588 | 4.0369 | 0.82 | 0.2652 | 1.0758 | 0.82 | 0.7953 | 0.1355 | 0.0539 |
| 3.4582 | 85.0 | 595 | 4.0395 | 0.82 | 0.2648 | 1.0244 | 0.82 | 0.7953 | 0.1300 | 0.0539 |
| 3.4582 | 86.0 | 602 | 4.0428 | 0.82 | 0.2651 | 1.0768 | 0.82 | 0.7953 | 0.1209 | 0.0538 |
| 3.4582 | 87.0 | 609 | 4.0461 | 0.82 | 0.2653 | 1.0780 | 0.82 | 0.7953 | 0.1295 | 0.0536 |
| 3.4582 | 88.0 | 616 | 4.0485 | 0.82 | 0.2650 | 1.0780 | 0.82 | 0.7953 | 0.1375 | 0.0537 |
| 3.4582 | 89.0 | 623 | 4.0521 | 0.82 | 0.2652 | 1.0259 | 0.82 | 0.7953 | 0.1296 | 0.0537 |
| 3.4582 | 90.0 | 630 | 4.0540 | 0.82 | 0.2652 | 1.0466 | 0.82 | 0.7953 | 0.1282 | 0.0538 |
| 3.4582 | 91.0 | 637 | 4.0562 | 0.82 | 0.2653 | 1.0778 | 0.82 | 0.7953 | 0.1304 | 0.0538 |
| 3.4582 | 92.0 | 644 | 4.0587 | 0.82 | 0.2655 | 1.0774 | 0.82 | 0.7953 | 0.1241 | 0.0537 |
| 3.4582 | 93.0 | 651 | 4.0607 | 0.82 | 0.2657 | 1.0771 | 0.82 | 0.7953 | 0.1357 | 0.0538 |
| 3.4582 | 94.0 | 658 | 4.0615 | 0.82 | 0.2655 | 1.0775 | 0.82 | 0.7953 | 0.1268 | 0.0538 |
| 3.4582 | 95.0 | 665 | 4.0625 | 0.82 | 0.2654 | 1.0300 | 0.82 | 0.7953 | 0.1211 | 0.0538 |
| 3.4582 | 96.0 | 672 | 4.0641 | 0.82 | 0.2657 | 1.0771 | 0.82 | 0.7953 | 0.1365 | 0.0538 |
| 3.4582 | 97.0 | 679 | 4.0641 | 0.82 | 0.2654 | 1.0312 | 0.82 | 0.7953 | 0.1210 | 0.0538 |
| 3.4582 | 98.0 | 686 | 4.0652 | 0.82 | 0.2656 | 1.0776 | 0.82 | 0.7953 | 0.1269 | 0.0538 |
| 3.4582 | 99.0 | 693 | 4.0654 | 0.82 | 0.2656 | 1.0775 | 0.82 | 0.7953 | 0.1364 | 0.0539 |
| 3.4582 | 100.0 | 700 | 4.0655 | 0.82 | 0.2655 | 1.0777 | 0.82 | 0.7953 | 0.1364 | 0.0539 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/60-tiny_tobacco3482_hint_
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 60-tiny_tobacco3482_hint_
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset (recorded as `None`; the model name suggests Tobacco-3482).
It achieves the following results on the evaluation set (the F1 Micro/Macro gap is illustrated after the list):
- Loss: 63.5396
- Accuracy: 0.84
- Brier Loss: 0.3043
- NLL: 1.1495
- F1 Micro: 0.8400
- F1 Macro: 0.8244
- ECE: 0.1568
- AURC: 0.0457
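F1 Micro pools every individual decision, while F1 Macro averages per-class scores, so weak rare classes drag Macro below Micro (0.8244 vs. 0.8400 above). With scikit-learn (the arrays are placeholders):
```python
from sklearn.metrics import f1_score

y_true = [0, 1, 2, 2]  # placeholder labels
y_pred = [0, 1, 2, 1]  # placeholder predictions
f1_micro = f1_score(y_true, y_pred, average="micro")  # pools all decisions
f1_macro = f1_score(y_true, y_pred, average="macro")  # unweighted mean of per-class F1
```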
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 66.6267 | 0.26 | 0.8706 | 4.9000 | 0.26 | 0.1920 | 0.2904 | 0.7812 |
| No log | 2.0 | 50 | 65.8408 | 0.54 | 0.5951 | 2.8543 | 0.54 | 0.4535 | 0.2542 | 0.2567 |
| No log | 3.0 | 75 | 65.3708 | 0.675 | 0.4400 | 1.6094 | 0.675 | 0.6134 | 0.2395 | 0.1333 |
| No log | 4.0 | 100 | 65.0889 | 0.76 | 0.3809 | 1.5505 | 0.76 | 0.7422 | 0.2333 | 0.1125 |
| No log | 5.0 | 125 | 64.7800 | 0.8 | 0.3080 | 1.7523 | 0.8000 | 0.7663 | 0.1734 | 0.0708 |
| No log | 6.0 | 150 | 64.6296 | 0.78 | 0.3286 | 1.7771 | 0.78 | 0.7427 | 0.1752 | 0.0642 |
| No log | 7.0 | 175 | 64.3879 | 0.765 | 0.3584 | 1.7400 | 0.765 | 0.6986 | 0.1799 | 0.0937 |
| No log | 8.0 | 200 | 64.4361 | 0.72 | 0.4640 | 1.4368 | 0.72 | 0.7385 | 0.2350 | 0.1314 |
| No log | 9.0 | 225 | 64.2223 | 0.76 | 0.3846 | 1.6420 | 0.76 | 0.7417 | 0.2006 | 0.0915 |
| No log | 10.0 | 250 | 64.2618 | 0.725 | 0.4268 | 1.6667 | 0.7250 | 0.7132 | 0.2131 | 0.1110 |
| No log | 11.0 | 275 | 64.2839 | 0.7 | 0.4830 | 1.7975 | 0.7 | 0.6829 | 0.2213 | 0.1207 |
| No log | 12.0 | 300 | 64.0218 | 0.785 | 0.3523 | 1.7098 | 0.785 | 0.7363 | 0.1742 | 0.0702 |
| No log | 13.0 | 325 | 63.8071 | 0.78 | 0.3218 | 1.4587 | 0.78 | 0.7574 | 0.1640 | 0.0674 |
| No log | 14.0 | 350 | 64.7387 | 0.645 | 0.5871 | 2.0188 | 0.645 | 0.6360 | 0.2996 | 0.1765 |
| No log | 15.0 | 375 | 64.2173 | 0.765 | 0.3832 | 1.8093 | 0.765 | 0.6909 | 0.1978 | 0.0892 |
| No log | 16.0 | 400 | 64.2233 | 0.765 | 0.3897 | 1.4456 | 0.765 | 0.7432 | 0.1983 | 0.0805 |
| No log | 17.0 | 425 | 63.7977 | 0.825 | 0.2971 | 1.4248 | 0.825 | 0.8057 | 0.1583 | 0.0546 |
| No log | 18.0 | 450 | 63.5818 | 0.82 | 0.2983 | 1.3079 | 0.82 | 0.7936 | 0.1532 | 0.0474 |
| No log | 19.0 | 475 | 64.1935 | 0.78 | 0.3764 | 1.6662 | 0.78 | 0.7618 | 0.1911 | 0.0669 |
| 63.0313 | 20.0 | 500 | 63.6054 | 0.825 | 0.2871 | 1.4054 | 0.825 | 0.8118 | 0.1605 | 0.0520 |
| 63.0313 | 21.0 | 525 | 63.6316 | 0.79 | 0.3258 | 1.3131 | 0.79 | 0.7714 | 0.1632 | 0.0485 |
| 63.0313 | 22.0 | 550 | 63.6978 | 0.84 | 0.2935 | 1.2425 | 0.8400 | 0.8236 | 0.1508 | 0.0586 |
| 63.0313 | 23.0 | 575 | 63.8266 | 0.825 | 0.3117 | 1.5766 | 0.825 | 0.8019 | 0.1550 | 0.0554 |
| 63.0313 | 24.0 | 600 | 63.6750 | 0.825 | 0.3130 | 1.1848 | 0.825 | 0.8158 | 0.1553 | 0.0462 |
| 63.0313 | 25.0 | 625 | 63.8469 | 0.82 | 0.3259 | 1.3997 | 0.82 | 0.8007 | 0.1603 | 0.0564 |
| 63.0313 | 26.0 | 650 | 63.7656 | 0.815 | 0.3285 | 1.2752 | 0.815 | 0.7969 | 0.1656 | 0.0535 |
| 63.0313 | 27.0 | 675 | 63.8074 | 0.805 | 0.3455 | 1.1282 | 0.805 | 0.7870 | 0.1732 | 0.0542 |
| 63.0313 | 28.0 | 700 | 63.8411 | 0.81 | 0.3437 | 1.1501 | 0.81 | 0.7917 | 0.1759 | 0.0529 |
| 63.0313 | 29.0 | 725 | 63.8158 | 0.81 | 0.3345 | 1.1519 | 0.81 | 0.7901 | 0.1706 | 0.0544 |
| 63.0313 | 30.0 | 750 | 63.7917 | 0.815 | 0.3383 | 1.2013 | 0.815 | 0.8006 | 0.1706 | 0.0557 |
| 63.0313 | 31.0 | 775 | 63.7855 | 0.815 | 0.3396 | 1.2088 | 0.815 | 0.7974 | 0.1687 | 0.0551 |
| 63.0313 | 32.0 | 800 | 63.8003 | 0.825 | 0.3297 | 1.2233 | 0.825 | 0.8091 | 0.1694 | 0.0547 |
| 63.0313 | 33.0 | 825 | 63.8029 | 0.815 | 0.3405 | 1.2628 | 0.815 | 0.8007 | 0.1729 | 0.0547 |
| 63.0313 | 34.0 | 850 | 63.7752 | 0.81 | 0.3352 | 1.2587 | 0.81 | 0.7979 | 0.1727 | 0.0574 |
| 63.0313 | 35.0 | 875 | 63.7800 | 0.815 | 0.3346 | 1.1948 | 0.815 | 0.7977 | 0.1679 | 0.0560 |
| 63.0313 | 36.0 | 900 | 63.7885 | 0.825 | 0.3313 | 1.2728 | 0.825 | 0.8173 | 0.1591 | 0.0569 |
| 63.0313 | 37.0 | 925 | 63.7730 | 0.815 | 0.3354 | 1.2726 | 0.815 | 0.8027 | 0.1689 | 0.0555 |
| 63.0313 | 38.0 | 950 | 63.8327 | 0.815 | 0.3405 | 1.4350 | 0.815 | 0.8043 | 0.1675 | 0.0632 |
| 63.0313 | 39.0 | 975 | 63.7324 | 0.785 | 0.3686 | 1.6439 | 0.785 | 0.7745 | 0.1808 | 0.0666 |
| 61.6786 | 40.0 | 1000 | 63.8625 | 0.765 | 0.3946 | 1.6127 | 0.765 | 0.7727 | 0.1961 | 0.0723 |
| 61.6786 | 41.0 | 1025 | 64.1254 | 0.765 | 0.3904 | 1.5456 | 0.765 | 0.7570 | 0.2020 | 0.0850 |
| 61.6786 | 42.0 | 1050 | 63.6201 | 0.78 | 0.3728 | 1.4198 | 0.78 | 0.7447 | 0.1869 | 0.0647 |
| 61.6786 | 43.0 | 1075 | 63.6033 | 0.835 | 0.2968 | 1.5430 | 0.835 | 0.8059 | 0.1574 | 0.0479 |
| 61.6786 | 44.0 | 1100 | 63.6777 | 0.795 | 0.3606 | 1.3542 | 0.795 | 0.7638 | 0.1806 | 0.0529 |
| 61.6786 | 45.0 | 1125 | 63.5747 | 0.83 | 0.2996 | 1.5403 | 0.83 | 0.8079 | 0.1450 | 0.0504 |
| 61.6786 | 46.0 | 1150 | 63.6022 | 0.805 | 0.3389 | 1.3842 | 0.805 | 0.7791 | 0.1794 | 0.0466 |
| 61.6786 | 47.0 | 1175 | 63.6342 | 0.81 | 0.3346 | 1.2861 | 0.81 | 0.7811 | 0.1678 | 0.0476 |
| 61.6786 | 48.0 | 1200 | 63.6065 | 0.81 | 0.3298 | 1.2911 | 0.81 | 0.7807 | 0.1654 | 0.0465 |
| 61.6786 | 49.0 | 1225 | 63.5937 | 0.815 | 0.3260 | 1.3576 | 0.815 | 0.7844 | 0.1613 | 0.0467 |
| 61.6786 | 50.0 | 1250 | 63.6029 | 0.815 | 0.3241 | 1.2826 | 0.815 | 0.7844 | 0.1662 | 0.0467 |
| 61.6786 | 51.0 | 1275 | 63.5947 | 0.81 | 0.3232 | 1.4156 | 0.81 | 0.7789 | 0.1631 | 0.0471 |
| 61.6786 | 52.0 | 1300 | 63.6501 | 0.81 | 0.3268 | 1.4148 | 0.81 | 0.7785 | 0.1703 | 0.0468 |
| 61.6786 | 53.0 | 1325 | 63.6207 | 0.81 | 0.3207 | 1.2785 | 0.81 | 0.7785 | 0.1698 | 0.0479 |
| 61.6786 | 54.0 | 1350 | 63.6021 | 0.815 | 0.3233 | 1.3519 | 0.815 | 0.7818 | 0.1629 | 0.0456 |
| 61.6786 | 55.0 | 1375 | 63.6128 | 0.815 | 0.3207 | 1.2837 | 0.815 | 0.7818 | 0.1641 | 0.0474 |
| 61.6786 | 56.0 | 1400 | 63.5974 | 0.81 | 0.3194 | 1.3542 | 0.81 | 0.7789 | 0.1679 | 0.0474 |
| 61.6786 | 57.0 | 1425 | 63.6173 | 0.81 | 0.3260 | 1.2907 | 0.81 | 0.7761 | 0.1653 | 0.0486 |
| 61.6786 | 58.0 | 1450 | 63.6057 | 0.81 | 0.3163 | 1.2981 | 0.81 | 0.7789 | 0.1651 | 0.0471 |
| 61.6786 | 59.0 | 1475 | 63.6052 | 0.81 | 0.3197 | 1.3444 | 0.81 | 0.7789 | 0.1680 | 0.0467 |
| 61.52 | 60.0 | 1500 | 63.5865 | 0.82 | 0.3143 | 1.2748 | 0.82 | 0.7920 | 0.1617 | 0.0465 |
| 61.52 | 61.0 | 1525 | 63.5754 | 0.82 | 0.3126 | 1.2677 | 0.82 | 0.7920 | 0.1595 | 0.0468 |
| 61.52 | 62.0 | 1550 | 63.5876 | 0.815 | 0.3120 | 1.2691 | 0.815 | 0.7879 | 0.1567 | 0.0478 |
| 61.52 | 63.0 | 1575 | 63.6040 | 0.82 | 0.3110 | 1.2632 | 0.82 | 0.7920 | 0.1526 | 0.0472 |
| 61.52 | 64.0 | 1600 | 63.5956 | 0.82 | 0.3111 | 1.1976 | 0.82 | 0.7963 | 0.1592 | 0.0468 |
| 61.52 | 65.0 | 1625 | 63.5792 | 0.815 | 0.3095 | 1.1928 | 0.815 | 0.7879 | 0.1571 | 0.0469 |
| 61.52 | 66.0 | 1650 | 63.5704 | 0.82 | 0.3086 | 1.2509 | 0.82 | 0.7936 | 0.1543 | 0.0467 |
| 61.52 | 67.0 | 1675 | 63.5918 | 0.82 | 0.3118 | 1.2536 | 0.82 | 0.7936 | 0.1619 | 0.0471 |
| 61.52 | 68.0 | 1700 | 63.5741 | 0.82 | 0.3072 | 1.2491 | 0.82 | 0.7963 | 0.1562 | 0.0465 |
| 61.52 | 69.0 | 1725 | 63.5581 | 0.825 | 0.3085 | 1.2490 | 0.825 | 0.8021 | 0.1566 | 0.0460 |
| 61.52 | 70.0 | 1750 | 63.5796 | 0.82 | 0.3087 | 1.2456 | 0.82 | 0.7963 | 0.1556 | 0.0471 |
| 61.52 | 71.0 | 1775 | 63.5776 | 0.825 | 0.3073 | 1.2530 | 0.825 | 0.8021 | 0.1571 | 0.0474 |
| 61.52 | 72.0 | 1800 | 63.5524 | 0.825 | 0.3064 | 1.2402 | 0.825 | 0.8021 | 0.1555 | 0.0465 |
| 61.52 | 73.0 | 1825 | 63.5638 | 0.825 | 0.3075 | 1.2465 | 0.825 | 0.8021 | 0.1607 | 0.0466 |
| 61.52 | 74.0 | 1850 | 63.5654 | 0.82 | 0.3058 | 1.2425 | 0.82 | 0.7963 | 0.1552 | 0.0468 |
| 61.52 | 75.0 | 1875 | 63.5654 | 0.825 | 0.3041 | 1.2439 | 0.825 | 0.8021 | 0.1563 | 0.0466 |
| 61.52 | 76.0 | 1900 | 63.5499 | 0.83 | 0.3018 | 1.2432 | 0.83 | 0.8082 | 0.1541 | 0.0463 |
| 61.52 | 77.0 | 1925 | 63.5563 | 0.825 | 0.3059 | 1.2385 | 0.825 | 0.8021 | 0.1570 | 0.0466 |
| 61.52 | 78.0 | 1950 | 63.5524 | 0.825 | 0.3045 | 1.2364 | 0.825 | 0.8021 | 0.1524 | 0.0464 |
| 61.52 | 79.0 | 1975 | 63.5507 | 0.825 | 0.3064 | 1.2344 | 0.825 | 0.8021 | 0.1523 | 0.0463 |
| 61.4257 | 80.0 | 2000 | 63.5531 | 0.825 | 0.3062 | 1.2266 | 0.825 | 0.8035 | 0.1625 | 0.0463 |
| 61.4257 | 81.0 | 2025 | 63.5486 | 0.825 | 0.3029 | 1.1850 | 0.825 | 0.8024 | 0.1506 | 0.0463 |
| 61.4257 | 82.0 | 2050 | 63.5479 | 0.82 | 0.3081 | 1.2269 | 0.82 | 0.7963 | 0.1588 | 0.0458 |
| 61.4257 | 83.0 | 2075 | 63.5444 | 0.835 | 0.3029 | 1.1721 | 0.835 | 0.8139 | 0.1475 | 0.0461 |
| 61.4257 | 84.0 | 2100 | 63.5435 | 0.835 | 0.3047 | 1.2306 | 0.835 | 0.8171 | 0.1529 | 0.0464 |
| 61.4257 | 85.0 | 2125 | 63.5393 | 0.83 | 0.3058 | 1.2255 | 0.83 | 0.8081 | 0.1462 | 0.0464 |
| 61.4257 | 86.0 | 2150 | 63.5437 | 0.835 | 0.3048 | 1.2254 | 0.835 | 0.8171 | 0.1481 | 0.0464 |
| 61.4257 | 87.0 | 2175 | 63.5463 | 0.83 | 0.3039 | 1.1549 | 0.83 | 0.8115 | 0.1562 | 0.0463 |
| 61.4257 | 88.0 | 2200 | 63.5408 | 0.835 | 0.3055 | 1.2211 | 0.835 | 0.8187 | 0.1485 | 0.0462 |
| 61.4257 | 89.0 | 2225 | 63.5477 | 0.825 | 0.3054 | 1.1541 | 0.825 | 0.8024 | 0.1521 | 0.0463 |
| 61.4257 | 90.0 | 2250 | 63.5383 | 0.83 | 0.3051 | 1.1577 | 0.83 | 0.8095 | 0.1532 | 0.0463 |
| 61.4257 | 91.0 | 2275 | 63.5466 | 0.84 | 0.3057 | 1.1583 | 0.8400 | 0.8244 | 0.1516 | 0.0458 |
| 61.4257 | 92.0 | 2300 | 63.5447 | 0.835 | 0.3049 | 1.1518 | 0.835 | 0.8188 | 0.1615 | 0.0462 |
| 61.4257 | 93.0 | 2325 | 63.5327 | 0.84 | 0.3044 | 1.1540 | 0.8400 | 0.8244 | 0.1508 | 0.0459 |
| 61.4257 | 94.0 | 2350 | 63.5392 | 0.84 | 0.3046 | 1.1506 | 0.8400 | 0.8244 | 0.1569 | 0.0459 |
| 61.4257 | 95.0 | 2375 | 63.5305 | 0.835 | 0.3050 | 1.1520 | 0.835 | 0.8188 | 0.1571 | 0.0457 |
| 61.4257 | 96.0 | 2400 | 63.5413 | 0.835 | 0.3042 | 1.1494 | 0.835 | 0.8188 | 0.1571 | 0.0461 |
| 61.4257 | 97.0 | 2425 | 63.5387 | 0.835 | 0.3047 | 1.1489 | 0.835 | 0.8188 | 0.1652 | 0.0461 |
| 61.4257 | 98.0 | 2450 | 63.5383 | 0.84 | 0.3046 | 1.1503 | 0.8400 | 0.8244 | 0.1568 | 0.0458 |
| 61.4257 | 99.0 | 2475 | 63.5374 | 0.835 | 0.3045 | 1.1489 | 0.835 | 0.8188 | 0.1570 | 0.0456 |
| 61.3919 | 100.0 | 2500 | 63.5396 | 0.84 | 0.3043 | 1.1495 | 0.8400 | 0.8244 | 0.1568 | 0.0457 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/39-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 39-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the Tobacco3482 dataset (the Trainer recorded the dataset name as `None`).
It achieves the following results on the evaluation set:
- Loss: 0.5285
- Accuracy: 0.81
- Brier Loss: 0.3907
- Nll: 0.9159
- F1 Micro: 0.81
- F1 Macro: 0.7907
- Ece: 0.3421
- Aurc: 0.0542
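Brier loss and NLL above are computed from the softmax outputs; a minimal numpy sketch under one common multi-class convention (Brier as the mean squared distance between the probability vector and the one-hot target, NLL as the mean negative log-probability of the true class):

```python
import numpy as np

def brier_and_nll(probs, labels):
    """probs: (n, k) softmax outputs; labels: (n,) integer targets."""
    n, k = probs.shape
    onehot = np.eye(k)[labels]
    brier = np.mean(np.sum((probs - onehot) ** 2, axis=1))
    nll = -np.mean(np.log(probs[np.arange(n), labels] + 1e-12))
    return brier, nll

# Toy check on random probabilities over the 10 classes.
rng = np.random.default_rng(0)
print(brier_and_nll(rng.dirichlet(np.ones(10), size=3), np.array([0, 3, 7])))
```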
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 1.8936 | 0.11 | 1.0097 | 8.5078 | 0.11 | 0.0902 | 0.3251 | 0.8953 |
| No log | 2.0 | 14 | 1.2597 | 0.16 | 0.8753 | 5.5353 | 0.16 | 0.1308 | 0.2539 | 0.7902 |
| No log | 3.0 | 21 | 1.0230 | 0.355 | 0.7833 | 5.1396 | 0.3550 | 0.2872 | 0.2810 | 0.4521 |
| No log | 4.0 | 28 | 0.8743 | 0.545 | 0.6497 | 3.0412 | 0.545 | 0.4328 | 0.3224 | 0.2775 |
| No log | 5.0 | 35 | 0.8020 | 0.625 | 0.5958 | 2.6099 | 0.625 | 0.5465 | 0.3186 | 0.2136 |
| No log | 6.0 | 42 | 0.7221 | 0.675 | 0.5300 | 2.3085 | 0.675 | 0.5632 | 0.3257 | 0.1562 |
| No log | 7.0 | 49 | 0.6964 | 0.68 | 0.4843 | 1.9033 | 0.68 | 0.5761 | 0.3039 | 0.1453 |
| No log | 8.0 | 56 | 0.6729 | 0.72 | 0.4598 | 1.8200 | 0.72 | 0.6195 | 0.3089 | 0.1170 |
| No log | 9.0 | 63 | 0.6470 | 0.77 | 0.4318 | 1.5607 | 0.7700 | 0.7058 | 0.3518 | 0.0897 |
| No log | 10.0 | 70 | 0.5889 | 0.795 | 0.4019 | 1.1238 | 0.795 | 0.7546 | 0.3324 | 0.0675 |
| No log | 11.0 | 77 | 0.5829 | 0.795 | 0.4013 | 1.0267 | 0.795 | 0.7667 | 0.3142 | 0.0728 |
| No log | 12.0 | 84 | 0.5763 | 0.785 | 0.3923 | 1.2697 | 0.785 | 0.7655 | 0.3286 | 0.0751 |
| No log | 13.0 | 91 | 0.5854 | 0.765 | 0.3934 | 1.4915 | 0.765 | 0.7291 | 0.2936 | 0.0806 |
| No log | 14.0 | 98 | 0.5779 | 0.795 | 0.3983 | 1.2207 | 0.795 | 0.7409 | 0.3141 | 0.0681 |
| No log | 15.0 | 105 | 0.5564 | 0.795 | 0.3752 | 1.1974 | 0.795 | 0.7687 | 0.3201 | 0.0626 |
| No log | 16.0 | 112 | 0.5599 | 0.815 | 0.3945 | 1.0987 | 0.815 | 0.7827 | 0.3233 | 0.0618 |
| No log | 17.0 | 119 | 0.5748 | 0.77 | 0.4001 | 1.2395 | 0.7700 | 0.7497 | 0.3136 | 0.0866 |
| No log | 18.0 | 126 | 0.5611 | 0.79 | 0.4028 | 1.3279 | 0.79 | 0.7738 | 0.3127 | 0.0680 |
| No log | 19.0 | 133 | 0.5514 | 0.805 | 0.4063 | 0.8598 | 0.805 | 0.7873 | 0.3656 | 0.0575 |
| No log | 20.0 | 140 | 0.5566 | 0.81 | 0.4028 | 0.9944 | 0.81 | 0.7943 | 0.3449 | 0.0676 |
| No log | 21.0 | 147 | 0.5489 | 0.81 | 0.3879 | 1.1351 | 0.81 | 0.7966 | 0.3432 | 0.0682 |
| No log | 22.0 | 154 | 0.5586 | 0.82 | 0.4091 | 1.1107 | 0.82 | 0.7894 | 0.3526 | 0.0580 |
| No log | 23.0 | 161 | 0.5593 | 0.795 | 0.4131 | 1.1693 | 0.795 | 0.7765 | 0.3483 | 0.0641 |
| No log | 24.0 | 168 | 0.5493 | 0.79 | 0.3962 | 1.2363 | 0.79 | 0.7740 | 0.3494 | 0.0646 |
| No log | 25.0 | 175 | 0.5489 | 0.8 | 0.3930 | 1.0310 | 0.8000 | 0.7638 | 0.3342 | 0.0614 |
| No log | 26.0 | 182 | 0.5492 | 0.79 | 0.3944 | 1.3201 | 0.79 | 0.7670 | 0.3096 | 0.0667 |
| No log | 27.0 | 189 | 0.5441 | 0.805 | 0.4002 | 1.1304 | 0.805 | 0.7886 | 0.3528 | 0.0600 |
| No log | 28.0 | 196 | 0.5397 | 0.815 | 0.3960 | 1.1210 | 0.815 | 0.7902 | 0.3630 | 0.0544 |
| No log | 29.0 | 203 | 0.5418 | 0.785 | 0.3977 | 0.9580 | 0.785 | 0.7575 | 0.3536 | 0.0646 |
| No log | 30.0 | 210 | 0.5374 | 0.815 | 0.3931 | 1.0186 | 0.815 | 0.7855 | 0.3422 | 0.0604 |
| No log | 31.0 | 217 | 0.5405 | 0.815 | 0.3983 | 0.8948 | 0.815 | 0.7980 | 0.3671 | 0.0531 |
| No log | 32.0 | 224 | 0.5394 | 0.805 | 0.3998 | 1.0680 | 0.805 | 0.7841 | 0.3695 | 0.0568 |
| No log | 33.0 | 231 | 0.5296 | 0.81 | 0.3868 | 1.1222 | 0.81 | 0.7891 | 0.3530 | 0.0545 |
| No log | 34.0 | 238 | 0.5338 | 0.81 | 0.3952 | 1.1333 | 0.81 | 0.7825 | 0.3453 | 0.0559 |
| No log | 35.0 | 245 | 0.5339 | 0.805 | 0.3941 | 0.8600 | 0.805 | 0.7905 | 0.3552 | 0.0554 |
| No log | 36.0 | 252 | 0.5332 | 0.81 | 0.3918 | 0.9018 | 0.81 | 0.7996 | 0.3669 | 0.0527 |
| No log | 37.0 | 259 | 0.5336 | 0.79 | 0.3907 | 0.7768 | 0.79 | 0.7612 | 0.3374 | 0.0611 |
| No log | 38.0 | 266 | 0.5327 | 0.805 | 0.3906 | 0.9987 | 0.805 | 0.7750 | 0.3430 | 0.0564 |
| No log | 39.0 | 273 | 0.5342 | 0.805 | 0.3898 | 1.1024 | 0.805 | 0.7837 | 0.3295 | 0.0563 |
| No log | 40.0 | 280 | 0.5310 | 0.81 | 0.3906 | 0.8426 | 0.81 | 0.7820 | 0.3513 | 0.0556 |
| No log | 41.0 | 287 | 0.5327 | 0.81 | 0.3950 | 1.0952 | 0.81 | 0.7927 | 0.3418 | 0.0570 |
| No log | 42.0 | 294 | 0.5305 | 0.82 | 0.3961 | 0.7830 | 0.82 | 0.8011 | 0.3501 | 0.0545 |
| No log | 43.0 | 301 | 0.5308 | 0.81 | 0.3926 | 0.9752 | 0.81 | 0.7907 | 0.3534 | 0.0573 |
| No log | 44.0 | 308 | 0.5287 | 0.81 | 0.3898 | 0.9838 | 0.81 | 0.7904 | 0.3454 | 0.0570 |
| No log | 45.0 | 315 | 0.5270 | 0.815 | 0.3890 | 0.8682 | 0.815 | 0.8004 | 0.3499 | 0.0543 |
| No log | 46.0 | 322 | 0.5272 | 0.81 | 0.3884 | 0.9784 | 0.81 | 0.7827 | 0.3415 | 0.0541 |
| No log | 47.0 | 329 | 0.5306 | 0.805 | 0.3900 | 1.1153 | 0.805 | 0.7800 | 0.3388 | 0.0571 |
| No log | 48.0 | 336 | 0.5288 | 0.82 | 0.3915 | 0.9916 | 0.82 | 0.7912 | 0.3519 | 0.0527 |
| No log | 49.0 | 343 | 0.5274 | 0.81 | 0.3886 | 0.8415 | 0.81 | 0.7855 | 0.3524 | 0.0550 |
| No log | 50.0 | 350 | 0.5264 | 0.81 | 0.3868 | 0.9713 | 0.81 | 0.7907 | 0.3408 | 0.0559 |
| No log | 51.0 | 357 | 0.5295 | 0.815 | 0.3916 | 1.0340 | 0.815 | 0.7933 | 0.3683 | 0.0536 |
| No log | 52.0 | 364 | 0.5294 | 0.81 | 0.3920 | 0.9178 | 0.81 | 0.7854 | 0.3499 | 0.0563 |
| No log | 53.0 | 371 | 0.5283 | 0.81 | 0.3912 | 0.8517 | 0.81 | 0.7907 | 0.3648 | 0.0540 |
| No log | 54.0 | 378 | 0.5301 | 0.815 | 0.3927 | 0.9279 | 0.815 | 0.7933 | 0.3579 | 0.0558 |
| No log | 55.0 | 385 | 0.5275 | 0.805 | 0.3888 | 0.9225 | 0.805 | 0.7800 | 0.3406 | 0.0553 |
| No log | 56.0 | 392 | 0.5284 | 0.815 | 0.3903 | 0.9064 | 0.815 | 0.7933 | 0.3463 | 0.0551 |
| No log | 57.0 | 399 | 0.5261 | 0.81 | 0.3872 | 0.9072 | 0.81 | 0.7907 | 0.3527 | 0.0551 |
| No log | 58.0 | 406 | 0.5278 | 0.815 | 0.3900 | 0.8469 | 0.815 | 0.7966 | 0.3622 | 0.0526 |
| No log | 59.0 | 413 | 0.5280 | 0.81 | 0.3900 | 0.9220 | 0.81 | 0.7907 | 0.3467 | 0.0551 |
| No log | 60.0 | 420 | 0.5296 | 0.81 | 0.3932 | 0.9166 | 0.81 | 0.7907 | 0.3620 | 0.0555 |
| No log | 61.0 | 427 | 0.5288 | 0.815 | 0.3925 | 0.8647 | 0.815 | 0.7966 | 0.3491 | 0.0529 |
| No log | 62.0 | 434 | 0.5288 | 0.81 | 0.3909 | 0.9205 | 0.81 | 0.7907 | 0.3482 | 0.0552 |
| No log | 63.0 | 441 | 0.5274 | 0.81 | 0.3889 | 0.9143 | 0.81 | 0.7907 | 0.3457 | 0.0541 |
| No log | 64.0 | 448 | 0.5283 | 0.81 | 0.3905 | 0.9141 | 0.81 | 0.7907 | 0.3578 | 0.0549 |
| No log | 65.0 | 455 | 0.5283 | 0.81 | 0.3907 | 0.9177 | 0.81 | 0.7907 | 0.3536 | 0.0548 |
| No log | 66.0 | 462 | 0.5289 | 0.81 | 0.3912 | 0.9179 | 0.81 | 0.7907 | 0.3502 | 0.0550 |
| No log | 67.0 | 469 | 0.5282 | 0.81 | 0.3903 | 0.9134 | 0.81 | 0.7907 | 0.3511 | 0.0547 |
| No log | 68.0 | 476 | 0.5279 | 0.81 | 0.3901 | 0.9105 | 0.81 | 0.7907 | 0.3473 | 0.0541 |
| No log | 69.0 | 483 | 0.5283 | 0.81 | 0.3907 | 0.9128 | 0.81 | 0.7907 | 0.3558 | 0.0539 |
| No log | 70.0 | 490 | 0.5283 | 0.81 | 0.3904 | 0.9191 | 0.81 | 0.7907 | 0.3414 | 0.0543 |
| No log | 71.0 | 497 | 0.5284 | 0.81 | 0.3905 | 0.9183 | 0.81 | 0.7907 | 0.3478 | 0.0546 |
| 0.3962 | 72.0 | 504 | 0.5285 | 0.81 | 0.3909 | 0.9151 | 0.81 | 0.7907 | 0.3415 | 0.0545 |
| 0.3962 | 73.0 | 511 | 0.5283 | 0.81 | 0.3906 | 0.9144 | 0.81 | 0.7907 | 0.3499 | 0.0542 |
| 0.3962 | 74.0 | 518 | 0.5282 | 0.81 | 0.3903 | 0.9146 | 0.81 | 0.7907 | 0.3411 | 0.0541 |
| 0.3962 | 75.0 | 525 | 0.5284 | 0.81 | 0.3909 | 0.9159 | 0.81 | 0.7907 | 0.3571 | 0.0542 |
| 0.3962 | 76.0 | 532 | 0.5284 | 0.81 | 0.3906 | 0.9155 | 0.81 | 0.7907 | 0.3361 | 0.0543 |
| 0.3962 | 77.0 | 539 | 0.5283 | 0.81 | 0.3906 | 0.9159 | 0.81 | 0.7907 | 0.3480 | 0.0541 |
| 0.3962 | 78.0 | 546 | 0.5282 | 0.81 | 0.3905 | 0.9120 | 0.81 | 0.7907 | 0.3413 | 0.0540 |
| 0.3962 | 79.0 | 553 | 0.5283 | 0.81 | 0.3905 | 0.9162 | 0.81 | 0.7907 | 0.3412 | 0.0542 |
| 0.3962 | 80.0 | 560 | 0.5285 | 0.81 | 0.3907 | 0.9189 | 0.81 | 0.7907 | 0.3361 | 0.0543 |
| 0.3962 | 81.0 | 567 | 0.5285 | 0.81 | 0.3907 | 0.9162 | 0.81 | 0.7907 | 0.3470 | 0.0541 |
| 0.3962 | 82.0 | 574 | 0.5283 | 0.81 | 0.3904 | 0.9144 | 0.81 | 0.7907 | 0.3411 | 0.0540 |
| 0.3962 | 83.0 | 581 | 0.5284 | 0.81 | 0.3906 | 0.9153 | 0.81 | 0.7907 | 0.3361 | 0.0542 |
| 0.3962 | 84.0 | 588 | 0.5284 | 0.81 | 0.3907 | 0.9151 | 0.81 | 0.7907 | 0.3419 | 0.0542 |
| 0.3962 | 85.0 | 595 | 0.5283 | 0.81 | 0.3905 | 0.9143 | 0.81 | 0.7907 | 0.3362 | 0.0541 |
| 0.3962 | 86.0 | 602 | 0.5285 | 0.81 | 0.3908 | 0.9152 | 0.81 | 0.7907 | 0.3418 | 0.0540 |
| 0.3962 | 87.0 | 609 | 0.5284 | 0.81 | 0.3907 | 0.9156 | 0.81 | 0.7907 | 0.3365 | 0.0543 |
| 0.3962 | 88.0 | 616 | 0.5285 | 0.81 | 0.3907 | 0.9155 | 0.81 | 0.7907 | 0.3419 | 0.0541 |
| 0.3962 | 89.0 | 623 | 0.5284 | 0.81 | 0.3906 | 0.9154 | 0.81 | 0.7907 | 0.3360 | 0.0541 |
| 0.3962 | 90.0 | 630 | 0.5285 | 0.81 | 0.3907 | 0.9168 | 0.81 | 0.7907 | 0.3418 | 0.0543 |
| 0.3962 | 91.0 | 637 | 0.5285 | 0.81 | 0.3907 | 0.9160 | 0.81 | 0.7907 | 0.3420 | 0.0543 |
| 0.3962 | 92.0 | 644 | 0.5285 | 0.81 | 0.3908 | 0.9164 | 0.81 | 0.7907 | 0.3421 | 0.0541 |
| 0.3962 | 93.0 | 651 | 0.5285 | 0.81 | 0.3907 | 0.9164 | 0.81 | 0.7907 | 0.3473 | 0.0542 |
| 0.3962 | 94.0 | 658 | 0.5285 | 0.81 | 0.3907 | 0.9164 | 0.81 | 0.7907 | 0.3420 | 0.0542 |
| 0.3962 | 95.0 | 665 | 0.5285 | 0.81 | 0.3907 | 0.9161 | 0.81 | 0.7907 | 0.3473 | 0.0541 |
| 0.3962 | 96.0 | 672 | 0.5285 | 0.81 | 0.3907 | 0.9157 | 0.81 | 0.7907 | 0.3421 | 0.0542 |
| 0.3962 | 97.0 | 679 | 0.5285 | 0.81 | 0.3907 | 0.9154 | 0.81 | 0.7907 | 0.3363 | 0.0542 |
| 0.3962 | 98.0 | 686 | 0.5285 | 0.81 | 0.3907 | 0.9164 | 0.81 | 0.7907 | 0.3420 | 0.0542 |
| 0.3962 | 99.0 | 693 | 0.5285 | 0.81 | 0.3907 | 0.9162 | 0.81 | 0.7907 | 0.3420 | 0.0542 |
| 0.3962 | 100.0 | 700 | 0.5285 | 0.81 | 0.3907 | 0.9159 | 0.81 | 0.7907 | 0.3421 | 0.0542 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/39-tiny_tobacco3482
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 39-tiny_tobacco3482
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the Tobacco3482 dataset (the Trainer recorded the dataset name as `None`).
It achieves the following results on the evaluation set:
- Loss: 0.0652
- Accuracy: 0.68
- Brier Loss: 0.8297
- Nll: 1.9403
- F1 Micro: 0.68
- F1 Macro: 0.5899
- Ece: 0.5728
- Aurc: 0.1321
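The Ece column is the standard binned expected calibration error; a minimal sketch (15 equal-width confidence bins assumed, which may differ from the bin count the evaluation code used):

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=15):
    """Weighted mean |accuracy - confidence| over confidence bins."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return ece
```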
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 0.0877 | 0.11 | 0.8983 | 8.4701 | 0.11 | 0.0377 | 0.1765 | 0.9065 |
| No log | 2.0 | 50 | 0.0811 | 0.09 | 0.8984 | 6.6861 | 0.09 | 0.0364 | 0.1641 | 0.9033 |
| No log | 3.0 | 75 | 0.0803 | 0.165 | 0.8981 | 7.2209 | 0.165 | 0.0520 | 0.2115 | 0.7480 |
| No log | 4.0 | 100 | 0.0799 | 0.17 | 0.8977 | 7.1709 | 0.17 | 0.0713 | 0.2184 | 0.6439 |
| No log | 5.0 | 125 | 0.0794 | 0.15 | 0.8970 | 7.1655 | 0.15 | 0.0553 | 0.2075 | 0.6278 |
| No log | 6.0 | 150 | 0.0786 | 0.225 | 0.8959 | 6.1378 | 0.225 | 0.1083 | 0.2511 | 0.5827 |
| No log | 7.0 | 175 | 0.0777 | 0.275 | 0.8944 | 5.1450 | 0.275 | 0.1053 | 0.2840 | 0.5127 |
| No log | 8.0 | 200 | 0.0767 | 0.305 | 0.8920 | 5.0689 | 0.305 | 0.1177 | 0.3095 | 0.4372 |
| No log | 9.0 | 225 | 0.0756 | 0.305 | 0.8893 | 4.6689 | 0.305 | 0.1015 | 0.3114 | 0.4279 |
| No log | 10.0 | 250 | 0.0747 | 0.325 | 0.8865 | 5.1088 | 0.325 | 0.1067 | 0.3257 | 0.3891 |
| No log | 11.0 | 275 | 0.0738 | 0.345 | 0.8803 | 4.9911 | 0.345 | 0.1297 | 0.3418 | 0.3672 |
| No log | 12.0 | 300 | 0.0729 | 0.405 | 0.8748 | 4.7892 | 0.405 | 0.1649 | 0.3774 | 0.3276 |
| No log | 13.0 | 325 | 0.0721 | 0.395 | 0.8696 | 4.6346 | 0.395 | 0.1561 | 0.3723 | 0.3221 |
| No log | 14.0 | 350 | 0.0713 | 0.44 | 0.8676 | 3.8051 | 0.44 | 0.1780 | 0.4047 | 0.2743 |
| No log | 15.0 | 375 | 0.0705 | 0.43 | 0.8612 | 3.6440 | 0.4300 | 0.1893 | 0.3994 | 0.2701 |
| No log | 16.0 | 400 | 0.0698 | 0.48 | 0.8569 | 3.4952 | 0.48 | 0.2607 | 0.4216 | 0.2640 |
| No log | 17.0 | 425 | 0.0690 | 0.54 | 0.8533 | 3.2018 | 0.54 | 0.3042 | 0.4771 | 0.2168 |
| No log | 18.0 | 450 | 0.0683 | 0.54 | 0.8500 | 2.7042 | 0.54 | 0.3161 | 0.4677 | 0.2075 |
| No log | 19.0 | 475 | 0.0679 | 0.6 | 0.8469 | 2.5343 | 0.6 | 0.4246 | 0.5186 | 0.1865 |
| 0.0767 | 20.0 | 500 | 0.0677 | 0.615 | 0.8455 | 2.6485 | 0.615 | 0.4407 | 0.5207 | 0.1908 |
| 0.0767 | 21.0 | 525 | 0.0673 | 0.625 | 0.8463 | 2.2841 | 0.625 | 0.4471 | 0.5369 | 0.1558 |
| 0.0767 | 22.0 | 550 | 0.0667 | 0.645 | 0.8398 | 2.4032 | 0.645 | 0.4633 | 0.5462 | 0.1500 |
| 0.0767 | 23.0 | 575 | 0.0664 | 0.63 | 0.8388 | 2.4376 | 0.63 | 0.4600 | 0.5356 | 0.1583 |
| 0.0767 | 24.0 | 600 | 0.0663 | 0.645 | 0.8371 | 2.3057 | 0.645 | 0.4731 | 0.5443 | 0.1437 |
| 0.0767 | 25.0 | 625 | 0.0662 | 0.635 | 0.8386 | 2.2486 | 0.635 | 0.4606 | 0.5425 | 0.1515 |
| 0.0767 | 26.0 | 650 | 0.0661 | 0.63 | 0.8374 | 2.2367 | 0.63 | 0.4543 | 0.5423 | 0.1549 |
| 0.0767 | 27.0 | 675 | 0.0660 | 0.64 | 0.8358 | 2.1278 | 0.64 | 0.4554 | 0.5486 | 0.1350 |
| 0.0767 | 28.0 | 700 | 0.0660 | 0.64 | 0.8360 | 2.2416 | 0.64 | 0.4726 | 0.5363 | 0.1429 |
| 0.0767 | 29.0 | 725 | 0.0660 | 0.67 | 0.8364 | 2.1574 | 0.67 | 0.4990 | 0.5648 | 0.1264 |
| 0.0767 | 30.0 | 750 | 0.0659 | 0.665 | 0.8357 | 2.2015 | 0.665 | 0.5113 | 0.5645 | 0.1383 |
| 0.0767 | 31.0 | 775 | 0.0658 | 0.65 | 0.8347 | 2.1367 | 0.65 | 0.4995 | 0.5522 | 0.1461 |
| 0.0767 | 32.0 | 800 | 0.0656 | 0.67 | 0.8341 | 2.1025 | 0.67 | 0.5110 | 0.5666 | 0.1307 |
| 0.0767 | 33.0 | 825 | 0.0656 | 0.645 | 0.8354 | 2.0398 | 0.645 | 0.5034 | 0.5442 | 0.1334 |
| 0.0767 | 34.0 | 850 | 0.0656 | 0.67 | 0.8346 | 2.1934 | 0.67 | 0.5112 | 0.5569 | 0.1299 |
| 0.0767 | 35.0 | 875 | 0.0658 | 0.665 | 0.8353 | 2.0671 | 0.665 | 0.5255 | 0.5646 | 0.1295 |
| 0.0767 | 36.0 | 900 | 0.0655 | 0.665 | 0.8320 | 2.0168 | 0.665 | 0.5138 | 0.5680 | 0.1306 |
| 0.0767 | 37.0 | 925 | 0.0655 | 0.675 | 0.8315 | 2.0974 | 0.675 | 0.5229 | 0.5672 | 0.1333 |
| 0.0767 | 38.0 | 950 | 0.0655 | 0.675 | 0.8341 | 2.0624 | 0.675 | 0.5457 | 0.5750 | 0.1256 |
| 0.0767 | 39.0 | 975 | 0.0653 | 0.69 | 0.8321 | 2.0556 | 0.69 | 0.5498 | 0.5856 | 0.1250 |
| 0.0625 | 40.0 | 1000 | 0.0653 | 0.69 | 0.8330 | 1.9627 | 0.69 | 0.5812 | 0.5765 | 0.1243 |
| 0.0625 | 41.0 | 1025 | 0.0653 | 0.705 | 0.8335 | 2.0491 | 0.705 | 0.5900 | 0.5919 | 0.1155 |
| 0.0625 | 42.0 | 1050 | 0.0653 | 0.705 | 0.8335 | 2.0357 | 0.705 | 0.5984 | 0.5945 | 0.1250 |
| 0.0625 | 43.0 | 1075 | 0.0652 | 0.7 | 0.8316 | 2.0326 | 0.7 | 0.5957 | 0.5932 | 0.1230 |
| 0.0625 | 44.0 | 1100 | 0.0653 | 0.69 | 0.8323 | 2.0244 | 0.69 | 0.5904 | 0.5911 | 0.1252 |
| 0.0625 | 45.0 | 1125 | 0.0653 | 0.68 | 0.8310 | 2.0410 | 0.68 | 0.5644 | 0.5699 | 0.1305 |
| 0.0625 | 46.0 | 1150 | 0.0653 | 0.695 | 0.8323 | 2.0288 | 0.695 | 0.5944 | 0.5837 | 0.1251 |
| 0.0625 | 47.0 | 1175 | 0.0652 | 0.685 | 0.8312 | 1.9613 | 0.685 | 0.5894 | 0.5834 | 0.1244 |
| 0.0625 | 48.0 | 1200 | 0.0652 | 0.685 | 0.8312 | 1.9620 | 0.685 | 0.5753 | 0.5728 | 0.1321 |
| 0.0625 | 49.0 | 1225 | 0.0652 | 0.695 | 0.8317 | 1.9706 | 0.695 | 0.5962 | 0.5837 | 0.1291 |
| 0.0625 | 50.0 | 1250 | 0.0651 | 0.69 | 0.8314 | 1.9661 | 0.69 | 0.5902 | 0.5759 | 0.1315 |
| 0.0625 | 51.0 | 1275 | 0.0652 | 0.68 | 0.8319 | 1.9542 | 0.68 | 0.5695 | 0.5704 | 0.1288 |
| 0.0625 | 52.0 | 1300 | 0.0651 | 0.695 | 0.8308 | 1.9577 | 0.695 | 0.5834 | 0.5823 | 0.1276 |
| 0.0625 | 53.0 | 1325 | 0.0652 | 0.67 | 0.8315 | 1.8876 | 0.67 | 0.5604 | 0.5680 | 0.1326 |
| 0.0625 | 54.0 | 1350 | 0.0651 | 0.68 | 0.8318 | 1.8731 | 0.68 | 0.5925 | 0.5644 | 0.1317 |
| 0.0625 | 55.0 | 1375 | 0.0651 | 0.7 | 0.8292 | 1.9448 | 0.7 | 0.5856 | 0.5903 | 0.1214 |
| 0.0625 | 56.0 | 1400 | 0.0652 | 0.705 | 0.8310 | 2.0042 | 0.705 | 0.6059 | 0.5881 | 0.1195 |
| 0.0625 | 57.0 | 1425 | 0.0651 | 0.685 | 0.8309 | 1.9467 | 0.685 | 0.5832 | 0.5734 | 0.1273 |
| 0.0625 | 58.0 | 1450 | 0.0651 | 0.705 | 0.8306 | 1.9480 | 0.705 | 0.6064 | 0.5956 | 0.1227 |
| 0.0625 | 59.0 | 1475 | 0.0651 | 0.695 | 0.8302 | 1.9453 | 0.695 | 0.5998 | 0.5806 | 0.1310 |
| 0.0604 | 60.0 | 1500 | 0.0651 | 0.68 | 0.8305 | 1.8892 | 0.68 | 0.5813 | 0.5643 | 0.1276 |
| 0.0604 | 61.0 | 1525 | 0.0651 | 0.725 | 0.8302 | 1.9304 | 0.7250 | 0.6346 | 0.6022 | 0.1194 |
| 0.0604 | 62.0 | 1550 | 0.0651 | 0.685 | 0.8303 | 1.8831 | 0.685 | 0.5773 | 0.5815 | 0.1322 |
| 0.0604 | 63.0 | 1575 | 0.0650 | 0.71 | 0.8299 | 1.9502 | 0.7100 | 0.6140 | 0.5944 | 0.1257 |
| 0.0604 | 64.0 | 1600 | 0.0651 | 0.68 | 0.8296 | 1.9407 | 0.68 | 0.5701 | 0.5727 | 0.1337 |
| 0.0604 | 65.0 | 1625 | 0.0651 | 0.695 | 0.8309 | 1.9413 | 0.695 | 0.5995 | 0.5884 | 0.1234 |
| 0.0604 | 66.0 | 1650 | 0.0651 | 0.69 | 0.8298 | 1.9474 | 0.69 | 0.5865 | 0.5723 | 0.1293 |
| 0.0604 | 67.0 | 1675 | 0.0650 | 0.705 | 0.8298 | 1.8996 | 0.705 | 0.6109 | 0.5966 | 0.1258 |
| 0.0604 | 68.0 | 1700 | 0.0651 | 0.7 | 0.8298 | 1.9938 | 0.7 | 0.6089 | 0.5895 | 0.1283 |
| 0.0604 | 69.0 | 1725 | 0.0651 | 0.695 | 0.8296 | 1.9273 | 0.695 | 0.5923 | 0.5776 | 0.1251 |
| 0.0604 | 70.0 | 1750 | 0.0651 | 0.705 | 0.8297 | 1.8920 | 0.705 | 0.6162 | 0.5868 | 0.1323 |
| 0.0604 | 71.0 | 1775 | 0.0651 | 0.7 | 0.8304 | 1.9852 | 0.7 | 0.6123 | 0.5878 | 0.1282 |
| 0.0604 | 72.0 | 1800 | 0.0651 | 0.68 | 0.8310 | 1.9399 | 0.68 | 0.5963 | 0.5633 | 0.1345 |
| 0.0604 | 73.0 | 1825 | 0.0650 | 0.725 | 0.8302 | 1.9237 | 0.7250 | 0.6266 | 0.6029 | 0.1192 |
| 0.0604 | 74.0 | 1850 | 0.0651 | 0.68 | 0.8306 | 1.9521 | 0.68 | 0.5967 | 0.5745 | 0.1342 |
| 0.0604 | 75.0 | 1875 | 0.0651 | 0.695 | 0.8301 | 1.9911 | 0.695 | 0.6047 | 0.5841 | 0.1317 |
| 0.0604 | 76.0 | 1900 | 0.0651 | 0.695 | 0.8299 | 1.9333 | 0.695 | 0.5935 | 0.5715 | 0.1299 |
| 0.0604 | 77.0 | 1925 | 0.0651 | 0.695 | 0.8298 | 1.9429 | 0.695 | 0.6041 | 0.5679 | 0.1293 |
| 0.0604 | 78.0 | 1950 | 0.0651 | 0.695 | 0.8298 | 1.9367 | 0.695 | 0.6101 | 0.5792 | 0.1279 |
| 0.0604 | 79.0 | 1975 | 0.0651 | 0.695 | 0.8301 | 1.9934 | 0.695 | 0.6095 | 0.5898 | 0.1324 |
| 0.0596 | 80.0 | 2000 | 0.0651 | 0.7 | 0.8297 | 1.9332 | 0.7 | 0.6071 | 0.5778 | 0.1271 |
| 0.0596 | 81.0 | 2025 | 0.0651 | 0.685 | 0.8303 | 1.9457 | 0.685 | 0.5986 | 0.5807 | 0.1320 |
| 0.0596 | 82.0 | 2050 | 0.0651 | 0.7 | 0.8300 | 1.9337 | 0.7 | 0.6072 | 0.5896 | 0.1296 |
| 0.0596 | 83.0 | 2075 | 0.0651 | 0.685 | 0.8298 | 1.9424 | 0.685 | 0.5985 | 0.5753 | 0.1319 |
| 0.0596 | 84.0 | 2100 | 0.0651 | 0.7 | 0.8297 | 1.9407 | 0.7 | 0.6116 | 0.5847 | 0.1311 |
| 0.0596 | 85.0 | 2125 | 0.0651 | 0.685 | 0.8298 | 1.9364 | 0.685 | 0.5983 | 0.5841 | 0.1311 |
| 0.0596 | 86.0 | 2150 | 0.0651 | 0.685 | 0.8299 | 1.9345 | 0.685 | 0.5983 | 0.5806 | 0.1318 |
| 0.0596 | 87.0 | 2175 | 0.0652 | 0.685 | 0.8299 | 1.9402 | 0.685 | 0.5979 | 0.5778 | 0.1317 |
| 0.0596 | 88.0 | 2200 | 0.0651 | 0.685 | 0.8298 | 1.9385 | 0.685 | 0.5983 | 0.5726 | 0.1315 |
| 0.0596 | 89.0 | 2225 | 0.0652 | 0.68 | 0.8296 | 1.9367 | 0.68 | 0.5899 | 0.5732 | 0.1314 |
| 0.0596 | 90.0 | 2250 | 0.0652 | 0.68 | 0.8298 | 1.9383 | 0.68 | 0.5896 | 0.5782 | 0.1321 |
| 0.0596 | 91.0 | 2275 | 0.0652 | 0.68 | 0.8297 | 1.9408 | 0.68 | 0.5896 | 0.5782 | 0.1317 |
| 0.0596 | 92.0 | 2300 | 0.0652 | 0.68 | 0.8299 | 1.9370 | 0.68 | 0.5899 | 0.5701 | 0.1320 |
| 0.0596 | 93.0 | 2325 | 0.0652 | 0.68 | 0.8298 | 1.9395 | 0.68 | 0.5899 | 0.5754 | 0.1321 |
| 0.0596 | 94.0 | 2350 | 0.0652 | 0.68 | 0.8297 | 1.9392 | 0.68 | 0.5899 | 0.5701 | 0.1326 |
| 0.0596 | 95.0 | 2375 | 0.0652 | 0.68 | 0.8297 | 1.9393 | 0.68 | 0.5899 | 0.5651 | 0.1320 |
| 0.0596 | 96.0 | 2400 | 0.0652 | 0.68 | 0.8297 | 1.9397 | 0.68 | 0.5899 | 0.5701 | 0.1321 |
| 0.0596 | 97.0 | 2425 | 0.0652 | 0.68 | 0.8297 | 1.9400 | 0.68 | 0.5899 | 0.5676 | 0.1322 |
| 0.0596 | 98.0 | 2450 | 0.0652 | 0.68 | 0.8297 | 1.9391 | 0.68 | 0.5899 | 0.5677 | 0.1320 |
| 0.0596 | 99.0 | 2475 | 0.0652 | 0.68 | 0.8297 | 1.9397 | 0.68 | 0.5899 | 0.5701 | 0.1321 |
| 0.0592 | 100.0 | 2500 | 0.0652 | 0.68 | 0.8297 | 1.9403 | 0.68 | 0.5899 | 0.5728 | 0.1321 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/39-tiny_tobacco3482_kd_NKD_t1.0_g1.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 39-tiny_tobacco3482_kd_NKD_t1.0_g1.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the Tobacco3482 dataset (the Trainer recorded the dataset name as `None`).
It achieves the following results on the evaluation set:
- Loss: 4.0812
- Accuracy: 0.835
- Brier Loss: 0.2748
- Nll: 1.2215
- F1 Micro: 0.835
- F1 Macro: 0.8213
- Ece: 0.1443
- Aurc: 0.0548
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
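A minimal sketch of how the 10-way classification head on the WinKawaks/vit-tiny-patch16-224 backbone might be set up before training (the NKD distillation loss itself is a custom objective and is omitted):

```python
from transformers import AutoModelForImageClassification

# Swap the checkpoint's 1000-way ImageNet head for a fresh 10-way head.
model = AutoModelForImageClassification.from_pretrained(
    "WinKawaks/vit-tiny-patch16-224",
    num_labels=10,
    ignore_mismatched_sizes=True,  # classifier shape differs from checkpoint
)
```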
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 5.1938 | 0.095 | 1.0201 | 8.6917 | 0.095 | 0.0778 | 0.3242 | 0.9009 |
| No log | 2.0 | 14 | 4.3129 | 0.13 | 0.9109 | 8.2379 | 0.13 | 0.0910 | 0.2544 | 0.8466 |
| No log | 3.0 | 21 | 3.9690 | 0.225 | 0.8600 | 6.8547 | 0.225 | 0.1401 | 0.2626 | 0.6398 |
| No log | 4.0 | 28 | 3.8651 | 0.375 | 0.7978 | 5.6610 | 0.375 | 0.2964 | 0.3198 | 0.4692 |
| No log | 5.0 | 35 | 3.8115 | 0.465 | 0.7222 | 3.4731 | 0.465 | 0.3435 | 0.3007 | 0.3464 |
| No log | 6.0 | 42 | 3.7351 | 0.575 | 0.6691 | 2.6672 | 0.575 | 0.4736 | 0.3509 | 0.2284 |
| No log | 7.0 | 49 | 3.6913 | 0.62 | 0.6152 | 2.6026 | 0.62 | 0.4700 | 0.3255 | 0.1827 |
| No log | 8.0 | 56 | 3.6687 | 0.68 | 0.5820 | 1.9726 | 0.68 | 0.5400 | 0.3735 | 0.1472 |
| No log | 9.0 | 63 | 3.6771 | 0.645 | 0.5464 | 1.9938 | 0.645 | 0.5211 | 0.3013 | 0.1595 |
| No log | 10.0 | 70 | 3.6759 | 0.685 | 0.4884 | 1.9735 | 0.685 | 0.5678 | 0.2672 | 0.1278 |
| No log | 11.0 | 77 | 3.6587 | 0.71 | 0.4696 | 2.0625 | 0.7100 | 0.6080 | 0.2956 | 0.1115 |
| No log | 12.0 | 84 | 3.6317 | 0.72 | 0.4121 | 2.2088 | 0.72 | 0.6137 | 0.2372 | 0.0925 |
| No log | 13.0 | 91 | 3.6799 | 0.745 | 0.4167 | 2.0639 | 0.745 | 0.6372 | 0.2480 | 0.0978 |
| No log | 14.0 | 98 | 3.6191 | 0.745 | 0.3850 | 1.9955 | 0.745 | 0.6384 | 0.2363 | 0.0728 |
| No log | 15.0 | 105 | 3.6813 | 0.715 | 0.3814 | 2.0731 | 0.715 | 0.6026 | 0.1995 | 0.0918 |
| No log | 16.0 | 112 | 3.6394 | 0.75 | 0.3644 | 1.9093 | 0.75 | 0.6492 | 0.1904 | 0.0777 |
| No log | 17.0 | 119 | 3.7661 | 0.735 | 0.3786 | 1.5402 | 0.735 | 0.6352 | 0.2032 | 0.0982 |
| No log | 18.0 | 126 | 3.6849 | 0.79 | 0.3369 | 1.8761 | 0.79 | 0.6965 | 0.1954 | 0.0708 |
| No log | 19.0 | 133 | 3.6776 | 0.775 | 0.3358 | 1.4981 | 0.775 | 0.7021 | 0.1919 | 0.0744 |
| No log | 20.0 | 140 | 3.6814 | 0.755 | 0.3546 | 1.5225 | 0.755 | 0.6873 | 0.1840 | 0.0794 |
| No log | 21.0 | 147 | 3.6948 | 0.775 | 0.3267 | 1.4776 | 0.775 | 0.7052 | 0.1630 | 0.0710 |
| No log | 22.0 | 154 | 3.7210 | 0.795 | 0.3191 | 1.3634 | 0.795 | 0.7383 | 0.1737 | 0.0705 |
| No log | 23.0 | 161 | 3.7231 | 0.805 | 0.3062 | 1.3141 | 0.805 | 0.7679 | 0.1629 | 0.0665 |
| No log | 24.0 | 168 | 3.7322 | 0.815 | 0.2903 | 1.2030 | 0.815 | 0.7771 | 0.1789 | 0.0609 |
| No log | 25.0 | 175 | 3.7237 | 0.815 | 0.3020 | 1.1721 | 0.815 | 0.7947 | 0.1759 | 0.0603 |
| No log | 26.0 | 182 | 3.8243 | 0.8 | 0.3138 | 1.3356 | 0.8000 | 0.7699 | 0.1735 | 0.0720 |
| No log | 27.0 | 189 | 3.7675 | 0.81 | 0.3038 | 1.2662 | 0.81 | 0.7853 | 0.1891 | 0.0699 |
| No log | 28.0 | 196 | 3.8006 | 0.81 | 0.2992 | 1.3422 | 0.81 | 0.7805 | 0.1709 | 0.0698 |
| No log | 29.0 | 203 | 3.7783 | 0.815 | 0.3009 | 1.3322 | 0.815 | 0.7959 | 0.1729 | 0.0669 |
| No log | 30.0 | 210 | 3.7547 | 0.835 | 0.2775 | 0.9761 | 0.835 | 0.8228 | 0.1751 | 0.0566 |
| No log | 31.0 | 217 | 3.7810 | 0.82 | 0.2905 | 1.1472 | 0.82 | 0.7953 | 0.1670 | 0.0631 |
| No log | 32.0 | 224 | 3.7935 | 0.82 | 0.2732 | 1.2016 | 0.82 | 0.7967 | 0.1429 | 0.0590 |
| No log | 33.0 | 231 | 3.7871 | 0.83 | 0.2774 | 1.2459 | 0.83 | 0.8134 | 0.1495 | 0.0562 |
| No log | 34.0 | 238 | 3.7689 | 0.815 | 0.2756 | 1.1135 | 0.815 | 0.7825 | 0.1609 | 0.0596 |
| No log | 35.0 | 245 | 3.8169 | 0.81 | 0.2801 | 1.2621 | 0.81 | 0.7880 | 0.1570 | 0.0624 |
| No log | 36.0 | 252 | 3.7973 | 0.82 | 0.2729 | 1.1310 | 0.82 | 0.7894 | 0.1466 | 0.0585 |
| No log | 37.0 | 259 | 3.8560 | 0.835 | 0.2825 | 1.3222 | 0.835 | 0.8114 | 0.1466 | 0.0606 |
| No log | 38.0 | 266 | 3.8351 | 0.83 | 0.2892 | 1.2548 | 0.83 | 0.8178 | 0.1489 | 0.0593 |
| No log | 39.0 | 273 | 3.8258 | 0.82 | 0.2711 | 1.1900 | 0.82 | 0.8037 | 0.1455 | 0.0589 |
| No log | 40.0 | 280 | 3.8288 | 0.815 | 0.2840 | 1.2167 | 0.815 | 0.7913 | 0.1574 | 0.0619 |
| No log | 41.0 | 287 | 3.8264 | 0.82 | 0.2790 | 1.1737 | 0.82 | 0.8020 | 0.1394 | 0.0609 |
| No log | 42.0 | 294 | 3.8276 | 0.81 | 0.2797 | 1.1603 | 0.81 | 0.7888 | 0.1585 | 0.0580 |
| No log | 43.0 | 301 | 3.8554 | 0.815 | 0.2771 | 1.1695 | 0.815 | 0.7943 | 0.1310 | 0.0594 |
| No log | 44.0 | 308 | 3.8405 | 0.825 | 0.2768 | 1.1593 | 0.825 | 0.8149 | 0.1413 | 0.0569 |
| No log | 45.0 | 315 | 3.8640 | 0.815 | 0.2891 | 1.1752 | 0.815 | 0.7980 | 0.1516 | 0.0590 |
| No log | 46.0 | 322 | 3.8624 | 0.825 | 0.2653 | 1.1548 | 0.825 | 0.8024 | 0.1384 | 0.0581 |
| No log | 47.0 | 329 | 3.8546 | 0.83 | 0.2766 | 1.1634 | 0.83 | 0.8106 | 0.1411 | 0.0594 |
| No log | 48.0 | 336 | 3.8652 | 0.82 | 0.2805 | 1.1651 | 0.82 | 0.8069 | 0.1278 | 0.0581 |
| No log | 49.0 | 343 | 3.8716 | 0.83 | 0.2758 | 1.1895 | 0.83 | 0.8065 | 0.1486 | 0.0590 |
| No log | 50.0 | 350 | 3.8720 | 0.815 | 0.2737 | 1.1709 | 0.815 | 0.7937 | 0.1375 | 0.0578 |
| No log | 51.0 | 357 | 3.8812 | 0.82 | 0.2762 | 1.2348 | 0.82 | 0.7993 | 0.1292 | 0.0600 |
| No log | 52.0 | 364 | 3.8844 | 0.805 | 0.2815 | 1.0870 | 0.805 | 0.7843 | 0.1525 | 0.0581 |
| No log | 53.0 | 371 | 3.8968 | 0.825 | 0.2704 | 1.2235 | 0.825 | 0.8011 | 0.1452 | 0.0582 |
| No log | 54.0 | 378 | 3.8996 | 0.81 | 0.2788 | 1.3264 | 0.81 | 0.7909 | 0.1453 | 0.0573 |
| No log | 55.0 | 385 | 3.9037 | 0.81 | 0.2757 | 1.2231 | 0.81 | 0.7928 | 0.1307 | 0.0574 |
| No log | 56.0 | 392 | 3.9024 | 0.81 | 0.2775 | 1.2369 | 0.81 | 0.7869 | 0.1493 | 0.0581 |
| No log | 57.0 | 399 | 3.8951 | 0.83 | 0.2722 | 1.2151 | 0.83 | 0.8171 | 0.1491 | 0.0556 |
| No log | 58.0 | 406 | 3.9224 | 0.82 | 0.2741 | 1.2957 | 0.82 | 0.8001 | 0.1351 | 0.0575 |
| No log | 59.0 | 413 | 3.9397 | 0.805 | 0.2782 | 1.3017 | 0.805 | 0.7870 | 0.1342 | 0.0584 |
| No log | 60.0 | 420 | 3.9250 | 0.835 | 0.2721 | 1.2251 | 0.835 | 0.8151 | 0.1466 | 0.0570 |
| No log | 61.0 | 427 | 3.9381 | 0.825 | 0.2753 | 1.2330 | 0.825 | 0.8044 | 0.1384 | 0.0577 |
| No log | 62.0 | 434 | 3.9475 | 0.82 | 0.2759 | 1.2171 | 0.82 | 0.8054 | 0.1485 | 0.0576 |
| No log | 63.0 | 441 | 3.9591 | 0.83 | 0.2761 | 1.2299 | 0.83 | 0.8122 | 0.1551 | 0.0568 |
| No log | 64.0 | 448 | 3.9496 | 0.835 | 0.2709 | 1.2282 | 0.835 | 0.8223 | 0.1397 | 0.0559 |
| No log | 65.0 | 455 | 3.9360 | 0.83 | 0.2688 | 1.2238 | 0.83 | 0.8171 | 0.1384 | 0.0535 |
| No log | 66.0 | 462 | 3.9594 | 0.835 | 0.2733 | 1.2395 | 0.835 | 0.8094 | 0.1540 | 0.0563 |
| No log | 67.0 | 469 | 3.9648 | 0.84 | 0.2700 | 1.2154 | 0.8400 | 0.8252 | 0.1673 | 0.0557 |
| No log | 68.0 | 476 | 3.9725 | 0.83 | 0.2712 | 1.2297 | 0.83 | 0.8171 | 0.1248 | 0.0552 |
| No log | 69.0 | 483 | 3.9844 | 0.835 | 0.2719 | 1.2243 | 0.835 | 0.8151 | 0.1605 | 0.0557 |
| No log | 70.0 | 490 | 3.9845 | 0.83 | 0.2699 | 1.2288 | 0.83 | 0.8100 | 0.1223 | 0.0553 |
| No log | 71.0 | 497 | 3.9986 | 0.835 | 0.2729 | 1.2206 | 0.835 | 0.8223 | 0.1381 | 0.0556 |
| 3.4116 | 72.0 | 504 | 3.9973 | 0.835 | 0.2727 | 1.2242 | 0.835 | 0.8223 | 0.1446 | 0.0553 |
| 3.4116 | 73.0 | 511 | 4.0092 | 0.835 | 0.2733 | 1.2226 | 0.835 | 0.8223 | 0.1482 | 0.0554 |
| 3.4116 | 74.0 | 518 | 4.0072 | 0.83 | 0.2714 | 1.2248 | 0.83 | 0.8152 | 0.1219 | 0.0549 |
| 3.4116 | 75.0 | 525 | 4.0168 | 0.835 | 0.2742 | 1.2200 | 0.835 | 0.8223 | 0.1329 | 0.0551 |
| 3.4116 | 76.0 | 532 | 4.0223 | 0.835 | 0.2737 | 1.2248 | 0.835 | 0.8213 | 0.1380 | 0.0552 |
| 3.4116 | 77.0 | 539 | 4.0250 | 0.84 | 0.2719 | 1.2208 | 0.8400 | 0.8252 | 0.1405 | 0.0551 |
| 3.4116 | 78.0 | 546 | 4.0338 | 0.835 | 0.2745 | 1.2242 | 0.835 | 0.8213 | 0.1536 | 0.0551 |
| 3.4116 | 79.0 | 553 | 4.0380 | 0.835 | 0.2740 | 1.2234 | 0.835 | 0.8213 | 0.1494 | 0.0552 |
| 3.4116 | 80.0 | 560 | 4.0445 | 0.835 | 0.2744 | 1.2223 | 0.835 | 0.8213 | 0.1500 | 0.0555 |
| 3.4116 | 81.0 | 567 | 4.0449 | 0.835 | 0.2735 | 1.2209 | 0.835 | 0.8213 | 0.1504 | 0.0552 |
| 3.4116 | 82.0 | 574 | 4.0515 | 0.835 | 0.2747 | 1.2228 | 0.835 | 0.8213 | 0.1526 | 0.0549 |
| 3.4116 | 83.0 | 581 | 4.0534 | 0.835 | 0.2743 | 1.2226 | 0.835 | 0.8213 | 0.1501 | 0.0548 |
| 3.4116 | 84.0 | 588 | 4.0572 | 0.835 | 0.2740 | 1.2225 | 0.835 | 0.8213 | 0.1447 | 0.0550 |
| 3.4116 | 85.0 | 595 | 4.0605 | 0.835 | 0.2743 | 1.2222 | 0.835 | 0.8213 | 0.1466 | 0.0548 |
| 3.4116 | 86.0 | 602 | 4.0621 | 0.835 | 0.2744 | 1.2215 | 0.835 | 0.8213 | 0.1427 | 0.0548 |
| 3.4116 | 87.0 | 609 | 4.0653 | 0.835 | 0.2745 | 1.2214 | 0.835 | 0.8213 | 0.1439 | 0.0549 |
| 3.4116 | 88.0 | 616 | 4.0673 | 0.835 | 0.2746 | 1.2217 | 0.835 | 0.8213 | 0.1410 | 0.0548 |
| 3.4116 | 89.0 | 623 | 4.0705 | 0.835 | 0.2748 | 1.2214 | 0.835 | 0.8213 | 0.1440 | 0.0549 |
| 3.4116 | 90.0 | 630 | 4.0717 | 0.835 | 0.2744 | 1.2217 | 0.835 | 0.8213 | 0.1426 | 0.0547 |
| 3.4116 | 91.0 | 637 | 4.0740 | 0.835 | 0.2747 | 1.2217 | 0.835 | 0.8213 | 0.1432 | 0.0548 |
| 3.4116 | 92.0 | 644 | 4.0753 | 0.835 | 0.2748 | 1.2217 | 0.835 | 0.8213 | 0.1442 | 0.0547 |
| 3.4116 | 93.0 | 651 | 4.0763 | 0.835 | 0.2746 | 1.2214 | 0.835 | 0.8213 | 0.1434 | 0.0546 |
| 3.4116 | 94.0 | 658 | 4.0777 | 0.835 | 0.2746 | 1.2213 | 0.835 | 0.8213 | 0.1433 | 0.0547 |
| 3.4116 | 95.0 | 665 | 4.0788 | 0.835 | 0.2747 | 1.2217 | 0.835 | 0.8213 | 0.1442 | 0.0547 |
| 3.4116 | 96.0 | 672 | 4.0800 | 0.835 | 0.2748 | 1.2217 | 0.835 | 0.8213 | 0.1466 | 0.0547 |
| 3.4116 | 97.0 | 679 | 4.0802 | 0.835 | 0.2747 | 1.2215 | 0.835 | 0.8213 | 0.1435 | 0.0547 |
| 3.4116 | 98.0 | 686 | 4.0808 | 0.835 | 0.2747 | 1.2214 | 0.835 | 0.8213 | 0.1435 | 0.0547 |
| 3.4116 | 99.0 | 693 | 4.0811 | 0.835 | 0.2748 | 1.2214 | 0.835 | 0.8213 | 0.1443 | 0.0547 |
| 3.4116 | 100.0 | 700 | 4.0812 | 0.835 | 0.2748 | 1.2215 | 0.835 | 0.8213 | 0.1443 | 0.0548 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
leopuv/dog_chicken_muffin_classifier
|
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# leopuv/dog_chicken_muffin_classifier
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0326
- Train Accuracy: 0.9947
- Validation Loss: 0.0191
- Validation Accuracy: 0.9947
- Epoch: 9
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 22500, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
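The serialized optimizer above is the stock `AdamWeightDecay` with a linear `PolynomialDecay` schedule; a minimal sketch rebuilding it with the TensorFlow helper in `transformers` (step count taken from the logged `decay_steps`):

```python
from transformers import create_optimizer

optimizer, lr_schedule = create_optimizer(
    init_lr=3e-5,
    num_train_steps=22500,   # decay_steps in the config above
    num_warmup_steps=0,
    weight_decay_rate=0.01,
)
```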
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.4271 | 0.996 | 0.1010 | 0.9960 | 0 |
| 0.1111 | 0.9933 | 0.0613 | 0.9933 | 1 |
| 0.0774 | 0.9947 | 0.0451 | 0.9947 | 2 |
| 0.0592 | 0.9947 | 0.0354 | 0.9947 | 3 |
| 0.0457 | 0.9987 | 0.0252 | 0.9987 | 4 |
| 0.0434 | 0.992 | 0.0346 | 0.9920 | 5 |
| 0.0429 | 0.996 | 0.0270 | 0.9960 | 6 |
| 0.0397 | 0.9973 | 0.0177 | 0.9973 | 7 |
| 0.0326 | 0.9973 | 0.0161 | 0.9973 | 8 |
| 0.0326 | 0.9947 | 0.0191 | 0.9947 | 9 |
### Framework versions
- Transformers 4.30.2
- TensorFlow 2.12.0
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"chicken",
"dog",
"muffin"
] |
jordyvl/39-tiny_tobacco3482_hint_
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 39-tiny_tobacco3482_hint_
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the Tobacco3482 dataset (the Trainer recorded the dataset name as `None`).
It achieves the following results on the evaluation set:
- Loss: 65.8239
- Accuracy: 0.84
- Brier Loss: 0.2807
- Nll: 1.1327
- F1 Micro: 0.8400
- F1 Macro: 0.8280
- Ece: 0.1437
- Aurc: 0.0472
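A minimal inference sketch for this checkpoint (assuming it is public on the Hub; `document.png` is a placeholder path):

```python
from transformers import pipeline

clf = pipeline("image-classification",
               model="jordyvl/39-tiny_tobacco3482_hint_")
print(clf("document.png"))  # top classes among the 10 tobacco labels
```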
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 69.1264 | 0.26 | 0.8707 | 4.9002 | 0.26 | 0.1920 | 0.3064 | 0.7815 |
| No log | 2.0 | 50 | 68.3319 | 0.545 | 0.5960 | 2.8558 | 0.545 | 0.4562 | 0.2850 | 0.2564 |
| No log | 3.0 | 75 | 67.8627 | 0.68 | 0.4406 | 1.6064 | 0.68 | 0.6157 | 0.2543 | 0.1333 |
| No log | 4.0 | 100 | 67.5797 | 0.75 | 0.3829 | 1.5484 | 0.75 | 0.7343 | 0.2220 | 0.1152 |
| No log | 5.0 | 125 | 67.2608 | 0.8 | 0.3072 | 1.7491 | 0.8000 | 0.7573 | 0.1809 | 0.0698 |
| No log | 6.0 | 150 | 67.0950 | 0.78 | 0.3169 | 1.7708 | 0.78 | 0.7441 | 0.1576 | 0.0607 |
| No log | 7.0 | 175 | 66.9178 | 0.755 | 0.3812 | 1.6929 | 0.755 | 0.6848 | 0.1899 | 0.1043 |
| No log | 8.0 | 200 | 66.7335 | 0.75 | 0.3763 | 1.7649 | 0.75 | 0.7399 | 0.1870 | 0.0806 |
| No log | 9.0 | 225 | 66.4371 | 0.805 | 0.3036 | 1.4686 | 0.805 | 0.7896 | 0.1378 | 0.0545 |
| No log | 10.0 | 250 | 66.6823 | 0.75 | 0.3924 | 1.8808 | 0.75 | 0.6665 | 0.1964 | 0.0829 |
| No log | 11.0 | 275 | 66.6079 | 0.775 | 0.3570 | 1.8872 | 0.775 | 0.7442 | 0.1811 | 0.0839 |
| No log | 12.0 | 300 | 66.4364 | 0.765 | 0.3689 | 1.6981 | 0.765 | 0.7550 | 0.1909 | 0.0732 |
| No log | 13.0 | 325 | 66.1317 | 0.785 | 0.3346 | 1.4062 | 0.785 | 0.7823 | 0.1753 | 0.0572 |
| No log | 14.0 | 350 | 66.5182 | 0.73 | 0.4453 | 1.4431 | 0.7300 | 0.7208 | 0.2310 | 0.0985 |
| No log | 15.0 | 375 | 66.5154 | 0.775 | 0.3769 | 1.4897 | 0.775 | 0.7398 | 0.1875 | 0.0821 |
| No log | 16.0 | 400 | 66.4191 | 0.8 | 0.3315 | 1.5327 | 0.8000 | 0.7771 | 0.1651 | 0.0685 |
| No log | 17.0 | 425 | 66.2374 | 0.765 | 0.3520 | 1.5388 | 0.765 | 0.7401 | 0.1767 | 0.0760 |
| No log | 18.0 | 450 | 66.2010 | 0.805 | 0.3320 | 1.4280 | 0.805 | 0.7721 | 0.1756 | 0.0684 |
| No log | 19.0 | 475 | 66.0335 | 0.85 | 0.2625 | 1.3549 | 0.85 | 0.8352 | 0.1431 | 0.0430 |
| 65.4034 | 20.0 | 500 | 66.2213 | 0.815 | 0.3213 | 1.3912 | 0.815 | 0.7955 | 0.1645 | 0.0579 |
| 65.4034 | 21.0 | 525 | 66.2647 | 0.77 | 0.3656 | 1.3241 | 0.7700 | 0.7743 | 0.1899 | 0.0755 |
| 65.4034 | 22.0 | 550 | 66.1220 | 0.86 | 0.2684 | 1.2459 | 0.8600 | 0.8354 | 0.1327 | 0.0473 |
| 65.4034 | 23.0 | 575 | 66.1615 | 0.85 | 0.2623 | 1.3400 | 0.85 | 0.8231 | 0.1291 | 0.0448 |
| 65.4034 | 24.0 | 600 | 66.2114 | 0.825 | 0.3130 | 1.4118 | 0.825 | 0.8122 | 0.1565 | 0.0498 |
| 65.4034 | 25.0 | 625 | 66.1048 | 0.835 | 0.2704 | 1.3571 | 0.835 | 0.8196 | 0.1405 | 0.0450 |
| 65.4034 | 26.0 | 650 | 65.9832 | 0.825 | 0.2990 | 1.1514 | 0.825 | 0.8253 | 0.1603 | 0.0423 |
| 65.4034 | 27.0 | 675 | 66.2567 | 0.805 | 0.3307 | 1.3509 | 0.805 | 0.8022 | 0.1699 | 0.0634 |
| 65.4034 | 28.0 | 700 | 66.0668 | 0.82 | 0.3172 | 1.1445 | 0.82 | 0.7973 | 0.1538 | 0.0419 |
| 65.4034 | 29.0 | 725 | 66.2254 | 0.81 | 0.3252 | 1.3290 | 0.81 | 0.8011 | 0.1659 | 0.0523 |
| 65.4034 | 30.0 | 750 | 65.9643 | 0.84 | 0.2697 | 1.2052 | 0.8400 | 0.8245 | 0.1319 | 0.0425 |
| 65.4034 | 31.0 | 775 | 66.3419 | 0.81 | 0.3249 | 1.2772 | 0.81 | 0.7969 | 0.1700 | 0.0612 |
| 65.4034 | 32.0 | 800 | 66.0324 | 0.825 | 0.3003 | 1.3138 | 0.825 | 0.8000 | 0.1584 | 0.0445 |
| 65.4034 | 33.0 | 825 | 66.3326 | 0.82 | 0.3336 | 1.2983 | 0.82 | 0.7826 | 0.1754 | 0.0590 |
| 65.4034 | 34.0 | 850 | 66.1374 | 0.825 | 0.3061 | 1.5645 | 0.825 | 0.8012 | 0.1500 | 0.0459 |
| 65.4034 | 35.0 | 875 | 66.2310 | 0.815 | 0.3207 | 1.5607 | 0.815 | 0.7939 | 0.1712 | 0.0646 |
| 65.4034 | 36.0 | 900 | 66.0388 | 0.84 | 0.2873 | 1.1966 | 0.8400 | 0.8327 | 0.1456 | 0.0585 |
| 65.4034 | 37.0 | 925 | 66.0520 | 0.835 | 0.2958 | 1.2728 | 0.835 | 0.8180 | 0.1508 | 0.0477 |
| 65.4034 | 38.0 | 950 | 65.9916 | 0.84 | 0.2783 | 1.1635 | 0.8400 | 0.8233 | 0.1398 | 0.0438 |
| 65.4034 | 39.0 | 975 | 65.9391 | 0.845 | 0.2743 | 1.2660 | 0.845 | 0.8289 | 0.1396 | 0.0458 |
| 64.0802 | 40.0 | 1000 | 65.9291 | 0.845 | 0.2762 | 1.3335 | 0.845 | 0.8259 | 0.1373 | 0.0430 |
| 64.0802 | 41.0 | 1025 | 65.8559 | 0.85 | 0.2686 | 1.3432 | 0.85 | 0.8338 | 0.1345 | 0.0428 |
| 64.0802 | 42.0 | 1050 | 65.8612 | 0.845 | 0.2772 | 1.2679 | 0.845 | 0.8255 | 0.1389 | 0.0431 |
| 64.0802 | 43.0 | 1075 | 65.8953 | 0.84 | 0.2742 | 1.2614 | 0.8400 | 0.8227 | 0.1408 | 0.0435 |
| 64.0802 | 44.0 | 1100 | 65.8569 | 0.835 | 0.2769 | 1.2730 | 0.835 | 0.8199 | 0.1426 | 0.0432 |
| 64.0802 | 45.0 | 1125 | 65.8610 | 0.84 | 0.2769 | 1.2622 | 0.8400 | 0.8248 | 0.1485 | 0.0425 |
| 64.0802 | 46.0 | 1150 | 65.8237 | 0.845 | 0.2729 | 1.1920 | 0.845 | 0.8334 | 0.1462 | 0.0432 |
| 64.0802 | 47.0 | 1175 | 65.8416 | 0.845 | 0.2785 | 1.1826 | 0.845 | 0.8317 | 0.1376 | 0.0431 |
| 64.0802 | 48.0 | 1200 | 65.8452 | 0.845 | 0.2817 | 1.1876 | 0.845 | 0.8317 | 0.1417 | 0.0441 |
| 64.0802 | 49.0 | 1225 | 65.8394 | 0.845 | 0.2750 | 1.1993 | 0.845 | 0.8309 | 0.1315 | 0.0419 |
| 64.0802 | 50.0 | 1250 | 65.8527 | 0.84 | 0.2796 | 1.1860 | 0.8400 | 0.8279 | 0.1410 | 0.0432 |
| 64.0802 | 51.0 | 1275 | 65.8286 | 0.845 | 0.2749 | 1.1977 | 0.845 | 0.8333 | 0.1444 | 0.0428 |
| 64.0802 | 52.0 | 1300 | 65.8296 | 0.83 | 0.2779 | 1.1926 | 0.83 | 0.8171 | 0.1382 | 0.0435 |
| 64.0802 | 53.0 | 1325 | 65.8121 | 0.83 | 0.2779 | 1.1955 | 0.83 | 0.8155 | 0.1387 | 0.0436 |
| 64.0802 | 54.0 | 1350 | 65.8361 | 0.825 | 0.2769 | 1.1909 | 0.825 | 0.8095 | 0.1435 | 0.0419 |
| 64.0802 | 55.0 | 1375 | 65.8370 | 0.83 | 0.2816 | 1.1925 | 0.83 | 0.8171 | 0.1416 | 0.0435 |
| 64.0802 | 56.0 | 1400 | 65.8301 | 0.825 | 0.2763 | 1.1908 | 0.825 | 0.8101 | 0.1393 | 0.0439 |
| 64.0802 | 57.0 | 1425 | 65.8301 | 0.82 | 0.2791 | 1.1881 | 0.82 | 0.8040 | 0.1443 | 0.0440 |
| 64.0802 | 58.0 | 1450 | 65.8324 | 0.83 | 0.2754 | 1.1938 | 0.83 | 0.8198 | 0.1387 | 0.0460 |
| 64.0802 | 59.0 | 1475 | 65.8407 | 0.825 | 0.2818 | 1.1893 | 0.825 | 0.8138 | 0.1393 | 0.0439 |
| 63.8765 | 60.0 | 1500 | 65.8236 | 0.84 | 0.2782 | 1.1871 | 0.8400 | 0.8290 | 0.1512 | 0.0449 |
| 63.8765 | 61.0 | 1525 | 65.8198 | 0.825 | 0.2846 | 1.1752 | 0.825 | 0.8138 | 0.1505 | 0.0438 |
| 63.8765 | 62.0 | 1550 | 65.8243 | 0.83 | 0.2796 | 1.1753 | 0.83 | 0.8196 | 0.1480 | 0.0445 |
| 63.8765 | 63.0 | 1575 | 65.8495 | 0.835 | 0.2781 | 1.1766 | 0.835 | 0.8257 | 0.1353 | 0.0451 |
| 63.8765 | 64.0 | 1600 | 65.8204 | 0.835 | 0.2833 | 1.1752 | 0.835 | 0.8239 | 0.1400 | 0.0447 |
| 63.8765 | 65.0 | 1625 | 65.8374 | 0.835 | 0.2800 | 1.1829 | 0.835 | 0.8239 | 0.1474 | 0.0441 |
| 63.8765 | 66.0 | 1650 | 65.8433 | 0.83 | 0.2855 | 1.1678 | 0.83 | 0.8148 | 0.1498 | 0.0444 |
| 63.8765 | 67.0 | 1675 | 65.8259 | 0.835 | 0.2820 | 1.1725 | 0.835 | 0.8257 | 0.1518 | 0.0457 |
| 63.8765 | 68.0 | 1700 | 65.8443 | 0.83 | 0.2841 | 1.1652 | 0.83 | 0.8196 | 0.1491 | 0.0457 |
| 63.8765 | 69.0 | 1725 | 65.8255 | 0.835 | 0.2849 | 1.1620 | 0.835 | 0.8247 | 0.1499 | 0.0460 |
| 63.8765 | 70.0 | 1750 | 65.8421 | 0.83 | 0.2870 | 1.1681 | 0.83 | 0.8196 | 0.1418 | 0.0475 |
| 63.8765 | 71.0 | 1775 | 65.8402 | 0.835 | 0.2839 | 1.1614 | 0.835 | 0.8230 | 0.1359 | 0.0466 |
| 63.8765 | 72.0 | 1800 | 65.8224 | 0.84 | 0.2831 | 1.1555 | 0.8400 | 0.8280 | 0.1467 | 0.0459 |
| 63.8765 | 73.0 | 1825 | 65.8233 | 0.84 | 0.2824 | 1.1578 | 0.8400 | 0.8280 | 0.1428 | 0.0465 |
| 63.8765 | 74.0 | 1850 | 65.8299 | 0.84 | 0.2814 | 1.1574 | 0.8400 | 0.8280 | 0.1469 | 0.0465 |
| 63.8765 | 75.0 | 1875 | 65.8309 | 0.835 | 0.2790 | 1.1575 | 0.835 | 0.8219 | 0.1407 | 0.0465 |
| 63.8765 | 76.0 | 1900 | 65.8199 | 0.84 | 0.2789 | 1.1496 | 0.8400 | 0.8280 | 0.1437 | 0.0460 |
| 63.8765 | 77.0 | 1925 | 65.8222 | 0.84 | 0.2828 | 1.1520 | 0.8400 | 0.8280 | 0.1539 | 0.0461 |
| 63.8765 | 78.0 | 1950 | 65.8312 | 0.84 | 0.2801 | 1.1459 | 0.8400 | 0.8280 | 0.1354 | 0.0458 |
| 63.8765 | 79.0 | 1975 | 65.8253 | 0.84 | 0.2836 | 1.1448 | 0.8400 | 0.8280 | 0.1542 | 0.0465 |
| 63.7964 | 80.0 | 2000 | 65.8332 | 0.84 | 0.2839 | 1.1408 | 0.8400 | 0.8280 | 0.1486 | 0.0462 |
| 63.7964 | 81.0 | 2025 | 65.8316 | 0.84 | 0.2818 | 1.1419 | 0.8400 | 0.8280 | 0.1430 | 0.0460 |
| 63.7964 | 82.0 | 2050 | 65.8238 | 0.84 | 0.2824 | 1.1387 | 0.8400 | 0.8280 | 0.1411 | 0.0452 |
| 63.7964 | 83.0 | 2075 | 65.8294 | 0.84 | 0.2786 | 1.1410 | 0.8400 | 0.8280 | 0.1539 | 0.0469 |
| 63.7964 | 84.0 | 2100 | 65.8267 | 0.84 | 0.2818 | 1.1391 | 0.8400 | 0.8280 | 0.1463 | 0.0471 |
| 63.7964 | 85.0 | 2125 | 65.8222 | 0.84 | 0.2814 | 1.1401 | 0.8400 | 0.8280 | 0.1463 | 0.0470 |
| 63.7964 | 86.0 | 2150 | 65.8264 | 0.84 | 0.2776 | 1.1380 | 0.8400 | 0.8280 | 0.1359 | 0.0460 |
| 63.7964 | 87.0 | 2175 | 65.8228 | 0.84 | 0.2781 | 1.1366 | 0.8400 | 0.8280 | 0.1468 | 0.0460 |
| 63.7964 | 88.0 | 2200 | 65.8229 | 0.84 | 0.2832 | 1.1367 | 0.8400 | 0.8280 | 0.1455 | 0.0476 |
| 63.7964 | 89.0 | 2225 | 65.8271 | 0.84 | 0.2792 | 1.1376 | 0.8400 | 0.8280 | 0.1598 | 0.0467 |
| 63.7964 | 90.0 | 2250 | 65.8234 | 0.84 | 0.2830 | 1.1352 | 0.8400 | 0.8280 | 0.1427 | 0.0474 |
| 63.7964 | 91.0 | 2275 | 65.8309 | 0.84 | 0.2804 | 1.1352 | 0.8400 | 0.8280 | 0.1426 | 0.0467 |
| 63.7964 | 92.0 | 2300 | 65.8305 | 0.84 | 0.2796 | 1.1345 | 0.8400 | 0.8280 | 0.1438 | 0.0466 |
| 63.7964 | 93.0 | 2325 | 65.8155 | 0.84 | 0.2808 | 1.1347 | 0.8400 | 0.8280 | 0.1499 | 0.0471 |
| 63.7964 | 94.0 | 2350 | 65.8218 | 0.84 | 0.2803 | 1.1336 | 0.8400 | 0.8280 | 0.1487 | 0.0473 |
| 63.7964 | 95.0 | 2375 | 65.8152 | 0.84 | 0.2812 | 1.1334 | 0.8400 | 0.8280 | 0.1441 | 0.0466 |
| 63.7964 | 96.0 | 2400 | 65.8230 | 0.84 | 0.2801 | 1.1344 | 0.8400 | 0.8280 | 0.1488 | 0.0472 |
| 63.7964 | 97.0 | 2425 | 65.8206 | 0.84 | 0.2808 | 1.1328 | 0.8400 | 0.8280 | 0.1490 | 0.0472 |
| 63.7964 | 98.0 | 2450 | 65.8221 | 0.84 | 0.2807 | 1.1332 | 0.8400 | 0.8280 | 0.1438 | 0.0474 |
| 63.7964 | 99.0 | 2475 | 65.8207 | 0.84 | 0.2809 | 1.1326 | 0.8400 | 0.8280 | 0.1446 | 0.0472 |
| 63.7613 | 100.0 | 2500 | 65.8239 | 0.84 | 0.2807 | 1.1327 | 0.8400 | 0.8280 | 0.1437 | 0.0472 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
annazhong/vit-base-patch16-224-finetuned-eurosat
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-finetuned-eurosat
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7730
- Accuracy: 0.2
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 150
- eval_batch_size: 150
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 600
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
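The total train batch size of 600 is the per-device batch of 150 times 4 gradient accumulation steps; a minimal sketch of the corresponding arguments (`output_dir` is a placeholder):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="vit-base-patch16-224-finetuned-eurosat",  # placeholder
    per_device_train_batch_size=150,
    gradient_accumulation_steps=4,  # 150 x 4 = total_train_batch_size 600
    num_train_epochs=3,
)
```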
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 1 | 1.8264 | 0.1143 |
| No log | 2.0 | 2 | 1.7730 | 0.2 |
| No log | 3.0 | 3 | 1.8143 | 0.2 |
### Framework versions
- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5"
] |
jordyvl/18-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 18-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the Tobacco3482 dataset (the Trainer recorded the dataset name as `None`).
It achieves the following results on the evaluation set:
- Loss: 0.6385
- Accuracy: 0.795
- Brier Loss: 0.4484
- Nll: 0.9250
- F1 Micro: 0.795
- F1 Macro: 0.7709
- Ece: 0.4225
- Aurc: 0.0567
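Aurc is the area under the risk-coverage curve: rank predictions by confidence, then average the error rate over all coverage levels; a minimal sketch under one common convention (normalization details may differ from the evaluation code):

```python
import numpy as np

def aurc(probs, labels):
    """Area under the risk-coverage curve; lower is better."""
    conf = probs.max(axis=1)
    errors = (probs.argmax(axis=1) != labels).astype(float)
    order = np.argsort(-conf)                # most confident first
    risk = np.cumsum(errors[order]) / (np.arange(len(errors)) + 1)
    return risk.mean()
```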
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 1.8736 | 0.105 | 1.0144 | 8.6059 | 0.1050 | 0.0844 | 0.3169 | 0.8967 |
| No log | 2.0 | 14 | 1.2559 | 0.155 | 0.8899 | 7.1587 | 0.155 | 0.1259 | 0.2459 | 0.7824 |
| No log | 3.0 | 21 | 1.0441 | 0.33 | 0.8123 | 5.3633 | 0.33 | 0.2575 | 0.2995 | 0.5173 |
| No log | 4.0 | 28 | 0.9169 | 0.525 | 0.6852 | 3.4671 | 0.525 | 0.4253 | 0.3387 | 0.2892 |
| No log | 5.0 | 35 | 0.8589 | 0.615 | 0.6269 | 3.1119 | 0.615 | 0.5500 | 0.3683 | 0.2124 |
| No log | 6.0 | 42 | 0.7954 | 0.675 | 0.5756 | 2.2578 | 0.675 | 0.5752 | 0.3626 | 0.1550 |
| No log | 7.0 | 49 | 0.7664 | 0.685 | 0.5143 | 1.8811 | 0.685 | 0.6073 | 0.3254 | 0.1390 |
| No log | 8.0 | 56 | 0.7305 | 0.76 | 0.4895 | 1.5449 | 0.76 | 0.6768 | 0.3695 | 0.1016 |
| No log | 9.0 | 63 | 0.7056 | 0.765 | 0.4721 | 1.3575 | 0.765 | 0.6991 | 0.3828 | 0.0927 |
| No log | 10.0 | 70 | 0.6961 | 0.77 | 0.4380 | 1.2662 | 0.7700 | 0.7509 | 0.3549 | 0.0803 |
| No log | 11.0 | 77 | 0.6772 | 0.81 | 0.4508 | 1.3169 | 0.81 | 0.7915 | 0.4175 | 0.0629 |
| No log | 12.0 | 84 | 0.6766 | 0.785 | 0.4491 | 1.2979 | 0.785 | 0.7650 | 0.3839 | 0.0800 |
| No log | 13.0 | 91 | 0.6754 | 0.785 | 0.4382 | 1.2395 | 0.785 | 0.7794 | 0.3609 | 0.0689 |
| No log | 14.0 | 98 | 0.6768 | 0.8 | 0.4472 | 1.2218 | 0.8000 | 0.7837 | 0.3910 | 0.0640 |
| No log | 15.0 | 105 | 0.6793 | 0.81 | 0.4663 | 1.2698 | 0.81 | 0.7856 | 0.4293 | 0.0672 |
| No log | 16.0 | 112 | 0.6784 | 0.795 | 0.4726 | 1.3043 | 0.795 | 0.7728 | 0.4232 | 0.0669 |
| No log | 17.0 | 119 | 0.6638 | 0.805 | 0.4372 | 1.2746 | 0.805 | 0.7747 | 0.3956 | 0.0677 |
| No log | 18.0 | 126 | 0.6588 | 0.8 | 0.4297 | 1.4466 | 0.8000 | 0.7762 | 0.3866 | 0.0686 |
| No log | 19.0 | 133 | 0.6588 | 0.81 | 0.4588 | 1.2093 | 0.81 | 0.7912 | 0.4029 | 0.0702 |
| No log | 20.0 | 140 | 0.6587 | 0.81 | 0.4534 | 1.0697 | 0.81 | 0.7980 | 0.4197 | 0.0641 |
| No log | 21.0 | 147 | 0.6527 | 0.815 | 0.4529 | 1.1527 | 0.815 | 0.7942 | 0.4196 | 0.0598 |
| No log | 22.0 | 154 | 0.6608 | 0.78 | 0.4559 | 1.2039 | 0.78 | 0.7581 | 0.3612 | 0.0725 |
| No log | 23.0 | 161 | 0.6558 | 0.8 | 0.4547 | 1.0687 | 0.8000 | 0.7644 | 0.3964 | 0.0584 |
| No log | 24.0 | 168 | 0.6584 | 0.8 | 0.4491 | 1.2869 | 0.8000 | 0.7735 | 0.3810 | 0.0687 |
| No log | 25.0 | 175 | 0.6493 | 0.805 | 0.4497 | 0.9981 | 0.805 | 0.7887 | 0.4162 | 0.0570 |
| No log | 26.0 | 182 | 0.6425 | 0.795 | 0.4424 | 1.1317 | 0.795 | 0.7790 | 0.3974 | 0.0596 |
| No log | 27.0 | 189 | 0.6518 | 0.8 | 0.4552 | 0.9743 | 0.8000 | 0.7715 | 0.4122 | 0.0592 |
| No log | 28.0 | 196 | 0.6526 | 0.805 | 0.4630 | 1.1343 | 0.805 | 0.7941 | 0.4171 | 0.0672 |
| No log | 29.0 | 203 | 0.6515 | 0.8 | 0.4531 | 1.0062 | 0.8000 | 0.7681 | 0.3970 | 0.0566 |
| No log | 30.0 | 210 | 0.6459 | 0.795 | 0.4534 | 1.0893 | 0.795 | 0.7853 | 0.3972 | 0.0600 |
| No log | 31.0 | 217 | 0.6423 | 0.81 | 0.4483 | 0.9035 | 0.81 | 0.7927 | 0.4297 | 0.0536 |
| No log | 32.0 | 224 | 0.6454 | 0.8 | 0.4517 | 1.1025 | 0.8000 | 0.7688 | 0.3923 | 0.0599 |
| No log | 33.0 | 231 | 0.6417 | 0.805 | 0.4476 | 0.9658 | 0.805 | 0.7767 | 0.4136 | 0.0563 |
| No log | 34.0 | 238 | 0.6399 | 0.815 | 0.4462 | 0.8565 | 0.815 | 0.7940 | 0.4234 | 0.0550 |
| No log | 35.0 | 245 | 0.6430 | 0.81 | 0.4505 | 1.0491 | 0.81 | 0.7855 | 0.4279 | 0.0629 |
| No log | 36.0 | 252 | 0.6440 | 0.815 | 0.4481 | 1.0288 | 0.815 | 0.7813 | 0.4132 | 0.0539 |
| No log | 37.0 | 259 | 0.6396 | 0.82 | 0.4493 | 0.9477 | 0.82 | 0.8125 | 0.4266 | 0.0525 |
| No log | 38.0 | 266 | 0.6410 | 0.815 | 0.4462 | 1.0462 | 0.815 | 0.7971 | 0.4157 | 0.0522 |
| No log | 39.0 | 273 | 0.6360 | 0.8 | 0.4399 | 0.9645 | 0.8000 | 0.7779 | 0.3974 | 0.0566 |
| No log | 40.0 | 280 | 0.6376 | 0.805 | 0.4412 | 0.8777 | 0.805 | 0.7772 | 0.4104 | 0.0544 |
| No log | 41.0 | 287 | 0.6411 | 0.795 | 0.4475 | 0.9240 | 0.795 | 0.7780 | 0.4062 | 0.0583 |
| No log | 42.0 | 294 | 0.6398 | 0.795 | 0.4509 | 0.9279 | 0.795 | 0.7650 | 0.4068 | 0.0577 |
| No log | 43.0 | 301 | 0.6430 | 0.79 | 0.4567 | 0.9279 | 0.79 | 0.7683 | 0.4073 | 0.0590 |
| No log | 44.0 | 308 | 0.6401 | 0.8 | 0.4495 | 0.9915 | 0.8000 | 0.7744 | 0.4200 | 0.0565 |
| No log | 45.0 | 315 | 0.6364 | 0.795 | 0.4448 | 0.9245 | 0.795 | 0.7729 | 0.4115 | 0.0568 |
| No log | 46.0 | 322 | 0.6391 | 0.79 | 0.4472 | 1.0060 | 0.79 | 0.7633 | 0.4044 | 0.0561 |
| No log | 47.0 | 329 | 0.6376 | 0.795 | 0.4470 | 0.9530 | 0.795 | 0.7693 | 0.3989 | 0.0578 |
| No log | 48.0 | 336 | 0.6383 | 0.8 | 0.4476 | 0.9992 | 0.8000 | 0.7804 | 0.4084 | 0.0579 |
| No log | 49.0 | 343 | 0.6353 | 0.8 | 0.4424 | 0.8500 | 0.8000 | 0.7756 | 0.4055 | 0.0546 |
| No log | 50.0 | 350 | 0.6381 | 0.795 | 0.4470 | 0.9931 | 0.795 | 0.7691 | 0.4170 | 0.0573 |
| No log | 51.0 | 357 | 0.6374 | 0.795 | 0.4477 | 0.9729 | 0.795 | 0.7630 | 0.4076 | 0.0563 |
| No log | 52.0 | 364 | 0.6377 | 0.8 | 0.4481 | 0.9846 | 0.8000 | 0.7759 | 0.4212 | 0.0555 |
| No log | 53.0 | 371 | 0.6378 | 0.795 | 0.4485 | 0.9379 | 0.795 | 0.7733 | 0.4052 | 0.0565 |
| No log | 54.0 | 378 | 0.6385 | 0.79 | 0.4477 | 0.9900 | 0.79 | 0.7684 | 0.4165 | 0.0571 |
| No log | 55.0 | 385 | 0.6371 | 0.81 | 0.4466 | 0.9178 | 0.81 | 0.7867 | 0.4149 | 0.0546 |
| No log | 56.0 | 392 | 0.6373 | 0.795 | 0.4460 | 0.9254 | 0.795 | 0.7692 | 0.4081 | 0.0568 |
| No log | 57.0 | 399 | 0.6376 | 0.79 | 0.4476 | 0.9194 | 0.79 | 0.7596 | 0.3996 | 0.0568 |
| No log | 58.0 | 406 | 0.6380 | 0.79 | 0.4477 | 0.9259 | 0.79 | 0.7619 | 0.4024 | 0.0575 |
| No log | 59.0 | 413 | 0.6377 | 0.8 | 0.4474 | 0.9100 | 0.8000 | 0.7806 | 0.4096 | 0.0569 |
| No log | 60.0 | 420 | 0.6378 | 0.8 | 0.4481 | 0.9189 | 0.8000 | 0.7806 | 0.4076 | 0.0566 |
| No log | 61.0 | 427 | 0.6378 | 0.795 | 0.4478 | 0.9860 | 0.795 | 0.7709 | 0.3994 | 0.0566 |
| No log | 62.0 | 434 | 0.6380 | 0.795 | 0.4480 | 0.9189 | 0.795 | 0.7692 | 0.4070 | 0.0564 |
| No log | 63.0 | 441 | 0.6381 | 0.8 | 0.4482 | 0.9195 | 0.8000 | 0.7806 | 0.4047 | 0.0568 |
| No log | 64.0 | 448 | 0.6379 | 0.8 | 0.4480 | 0.9223 | 0.8000 | 0.7806 | 0.4224 | 0.0563 |
| No log | 65.0 | 455 | 0.6382 | 0.8 | 0.4481 | 0.9196 | 0.8000 | 0.7806 | 0.4113 | 0.0569 |
| No log | 66.0 | 462 | 0.6381 | 0.8 | 0.4484 | 0.9200 | 0.8000 | 0.7806 | 0.4308 | 0.0566 |
| No log | 67.0 | 469 | 0.6379 | 0.8 | 0.4479 | 0.9198 | 0.8000 | 0.7806 | 0.4186 | 0.0566 |
| No log | 68.0 | 476 | 0.6378 | 0.8 | 0.4476 | 0.9167 | 0.8000 | 0.7806 | 0.4166 | 0.0569 |
| No log | 69.0 | 483 | 0.6380 | 0.8 | 0.4481 | 0.9179 | 0.8000 | 0.7806 | 0.4254 | 0.0566 |
| No log | 70.0 | 490 | 0.6384 | 0.795 | 0.4486 | 0.9225 | 0.795 | 0.7709 | 0.4158 | 0.0566 |
| No log | 71.0 | 497 | 0.6380 | 0.795 | 0.4476 | 0.9211 | 0.795 | 0.7709 | 0.4215 | 0.0568 |
| 0.5133 | 72.0 | 504 | 0.6381 | 0.795 | 0.4480 | 0.9232 | 0.795 | 0.7709 | 0.4151 | 0.0566 |
| 0.5133 | 73.0 | 511 | 0.6380 | 0.795 | 0.4479 | 0.9242 | 0.795 | 0.7709 | 0.4218 | 0.0564 |
| 0.5133 | 74.0 | 518 | 0.6380 | 0.795 | 0.4478 | 0.9231 | 0.795 | 0.7709 | 0.4151 | 0.0566 |
| 0.5133 | 75.0 | 525 | 0.6382 | 0.795 | 0.4484 | 0.9245 | 0.795 | 0.7709 | 0.4156 | 0.0565 |
| 0.5133 | 76.0 | 532 | 0.6382 | 0.795 | 0.4481 | 0.9216 | 0.795 | 0.7709 | 0.4153 | 0.0567 |
| 0.5133 | 77.0 | 539 | 0.6382 | 0.795 | 0.4481 | 0.9231 | 0.795 | 0.7709 | 0.4222 | 0.0567 |
| 0.5133 | 78.0 | 546 | 0.6382 | 0.795 | 0.4481 | 0.9210 | 0.795 | 0.7709 | 0.4220 | 0.0565 |
| 0.5133 | 79.0 | 553 | 0.6382 | 0.795 | 0.4480 | 0.9220 | 0.795 | 0.7709 | 0.4220 | 0.0565 |
| 0.5133 | 80.0 | 560 | 0.6384 | 0.795 | 0.4484 | 0.9220 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
| 0.5133 | 81.0 | 567 | 0.6383 | 0.795 | 0.4483 | 0.9218 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
| 0.5133 | 82.0 | 574 | 0.6382 | 0.795 | 0.4480 | 0.9220 | 0.795 | 0.7709 | 0.4221 | 0.0568 |
| 0.5133 | 83.0 | 581 | 0.6384 | 0.795 | 0.4484 | 0.9240 | 0.795 | 0.7709 | 0.4157 | 0.0566 |
| 0.5133 | 84.0 | 588 | 0.6384 | 0.795 | 0.4484 | 0.9262 | 0.795 | 0.7709 | 0.4224 | 0.0566 |
| 0.5133 | 85.0 | 595 | 0.6382 | 0.795 | 0.4481 | 0.9235 | 0.795 | 0.7709 | 0.4221 | 0.0566 |
| 0.5133 | 86.0 | 602 | 0.6384 | 0.795 | 0.4484 | 0.9236 | 0.795 | 0.7709 | 0.4225 | 0.0566 |
| 0.5133 | 87.0 | 609 | 0.6384 | 0.795 | 0.4484 | 0.9235 | 0.795 | 0.7709 | 0.4225 | 0.0567 |
| 0.5133 | 88.0 | 616 | 0.6384 | 0.795 | 0.4483 | 0.9250 | 0.795 | 0.7709 | 0.4224 | 0.0566 |
| 0.5133 | 89.0 | 623 | 0.6384 | 0.795 | 0.4483 | 0.9244 | 0.795 | 0.7709 | 0.4223 | 0.0567 |
| 0.5133 | 90.0 | 630 | 0.6384 | 0.795 | 0.4483 | 0.9251 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
| 0.5133 | 91.0 | 637 | 0.6384 | 0.795 | 0.4484 | 0.9246 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
| 0.5133 | 92.0 | 644 | 0.6384 | 0.795 | 0.4484 | 0.9256 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
| 0.5133 | 93.0 | 651 | 0.6385 | 0.795 | 0.4484 | 0.9252 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
| 0.5133 | 94.0 | 658 | 0.6384 | 0.795 | 0.4484 | 0.9245 | 0.795 | 0.7709 | 0.4223 | 0.0565 |
| 0.5133 | 95.0 | 665 | 0.6385 | 0.795 | 0.4484 | 0.9254 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
| 0.5133 | 96.0 | 672 | 0.6384 | 0.795 | 0.4484 | 0.9242 | 0.795 | 0.7709 | 0.4225 | 0.0566 |
| 0.5133 | 97.0 | 679 | 0.6384 | 0.795 | 0.4484 | 0.9242 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
| 0.5133 | 98.0 | 686 | 0.6385 | 0.795 | 0.4484 | 0.9249 | 0.795 | 0.7709 | 0.4224 | 0.0567 |
| 0.5133 | 99.0 | 693 | 0.6385 | 0.795 | 0.4484 | 0.9252 | 0.795 | 0.7709 | 0.4224 | 0.0566 |
| 0.5133 | 100.0 | 700 | 0.6385 | 0.795 | 0.4484 | 0.9250 | 0.795 | 0.7709 | 0.4225 | 0.0567 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/18-tiny_tobacco3482
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 18-tiny_tobacco3482
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0652
- Accuracy: 0.29
- Brier Loss: 0.8849
- Nll: 4.7897
- F1 Micro: 0.29
- F1 Macro: 0.1634
- Ece: 0.2953
- Aurc: 0.5243
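AURC (area under the risk–coverage curve) summarizes selective prediction: examples are ranked by confidence and the error rate (risk) is averaged over all coverage levels, so lower is better. A minimal sketch of the standard definition, not necessarily the exact implementation behind the number above:
```python
import numpy as np

def aurc(confidences, correct):
    """Area under the risk-coverage curve; lower is better."""
    order = np.argsort(-confidences)           # most confident first
    errors = 1.0 - correct[order].astype(float)
    # risk at coverage k/n = error rate among the k most confident samples
    risks = np.cumsum(errors) / np.arange(1, len(errors) + 1)
    return risks.mean()
```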
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
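With `lr_scheduler_warmup_ratio: 0.1`, the linear schedule warms up over the first 10% of optimizer steps; given the 25 steps per epoch and 100 epochs in the table below, that is 250 of 2500 steps. A minimal sketch of the equivalent optimizer/scheduler setup, assuming the 10 Tobacco3482 classes listed at the end of this card:
```python
import torch
from transformers import ViTForImageClassification, get_linear_schedule_with_warmup

model = ViTForImageClassification.from_pretrained(
    "WinKawaks/vit-tiny-patch16-224", num_labels=10, ignore_mismatched_sizes=True
)
total_steps = 25 * 100                 # 25 optimizer steps per epoch x 100 epochs
warmup_steps = int(0.1 * total_steps)  # warmup_ratio 0.1 -> 250 warmup steps

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=warmup_steps, num_training_steps=total_steps
)
```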
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 0.0888 | 0.1 | 0.8987 | 8.8684 | 0.1000 | 0.036 | 0.1707 | 0.9181 |
| No log | 2.0 | 50 | 0.0822 | 0.07 | 0.8987 | 7.4095 | 0.07 | 0.0264 | 0.1526 | 0.9367 |
| No log | 3.0 | 75 | 0.0815 | 0.17 | 0.8985 | 7.9603 | 0.17 | 0.0549 | 0.2145 | 0.8165 |
| No log | 4.0 | 100 | 0.0811 | 0.175 | 0.8981 | 8.1220 | 0.175 | 0.0636 | 0.2224 | 0.6841 |
| No log | 5.0 | 125 | 0.0807 | 0.18 | 0.8980 | 8.0205 | 0.18 | 0.0595 | 0.2162 | 0.6466 |
| No log | 6.0 | 150 | 0.0802 | 0.16 | 0.8979 | 8.0022 | 0.16 | 0.0588 | 0.2146 | 0.6987 |
| No log | 7.0 | 175 | 0.0795 | 0.16 | 0.8978 | 8.1389 | 0.16 | 0.1003 | 0.2087 | 0.8300 |
| No log | 8.0 | 200 | 0.0786 | 0.165 | 0.8973 | 7.2103 | 0.165 | 0.0840 | 0.2019 | 0.7778 |
| No log | 9.0 | 225 | 0.0777 | 0.135 | 0.8971 | 6.5106 | 0.135 | 0.0644 | 0.2023 | 0.9632 |
| No log | 10.0 | 250 | 0.0768 | 0.12 | 0.8971 | 7.0176 | 0.12 | 0.0555 | 0.1928 | 0.9684 |
| No log | 11.0 | 275 | 0.0758 | 0.08 | 0.8962 | 6.4528 | 0.08 | 0.0356 | 0.1609 | 0.9735 |
| No log | 12.0 | 300 | 0.0747 | 0.08 | 0.8954 | 6.9898 | 0.08 | 0.0365 | 0.1608 | 0.9594 |
| No log | 13.0 | 325 | 0.0736 | 0.14 | 0.8941 | 6.5287 | 0.14 | 0.0645 | 0.2064 | 0.6977 |
| No log | 14.0 | 350 | 0.0725 | 0.17 | 0.8937 | 7.0519 | 0.17 | 0.0891 | 0.2278 | 0.6074 |
| No log | 15.0 | 375 | 0.0712 | 0.17 | 0.8919 | 6.3769 | 0.17 | 0.0888 | 0.2218 | 0.6154 |
| No log | 16.0 | 400 | 0.0702 | 0.18 | 0.8912 | 6.2503 | 0.18 | 0.1086 | 0.2297 | 0.6140 |
| No log | 17.0 | 425 | 0.0691 | 0.16 | 0.8909 | 6.5096 | 0.16 | 0.1064 | 0.2081 | 0.6497 |
| No log | 18.0 | 450 | 0.0686 | 0.175 | 0.8904 | 6.2506 | 0.175 | 0.1196 | 0.2252 | 0.6744 |
| No log | 19.0 | 475 | 0.0681 | 0.16 | 0.8891 | 5.3019 | 0.16 | 0.1000 | 0.2185 | 0.7715 |
| 0.0776 | 20.0 | 500 | 0.0678 | 0.155 | 0.8887 | 5.3003 | 0.155 | 0.0980 | 0.2208 | 0.7213 |
| 0.0776 | 21.0 | 525 | 0.0677 | 0.155 | 0.8891 | 5.7776 | 0.155 | 0.0944 | 0.2223 | 0.7075 |
| 0.0776 | 22.0 | 550 | 0.0671 | 0.185 | 0.8885 | 5.4219 | 0.185 | 0.0998 | 0.2333 | 0.6805 |
| 0.0776 | 23.0 | 575 | 0.0669 | 0.17 | 0.8887 | 5.4554 | 0.17 | 0.0813 | 0.2304 | 0.6676 |
| 0.0776 | 24.0 | 600 | 0.0667 | 0.175 | 0.8877 | 5.4877 | 0.175 | 0.0791 | 0.2294 | 0.6991 |
| 0.0776 | 25.0 | 625 | 0.0665 | 0.18 | 0.8876 | 5.3499 | 0.18 | 0.0846 | 0.2342 | 0.6774 |
| 0.0776 | 26.0 | 650 | 0.0665 | 0.18 | 0.8870 | 4.9329 | 0.18 | 0.0888 | 0.2320 | 0.6823 |
| 0.0776 | 27.0 | 675 | 0.0663 | 0.21 | 0.8872 | 4.7787 | 0.2100 | 0.1204 | 0.2521 | 0.6698 |
| 0.0776 | 28.0 | 700 | 0.0665 | 0.21 | 0.8870 | 4.8102 | 0.2100 | 0.1177 | 0.2481 | 0.6893 |
| 0.0776 | 29.0 | 725 | 0.0663 | 0.2 | 0.8866 | 4.8474 | 0.2000 | 0.1124 | 0.2433 | 0.6616 |
| 0.0776 | 30.0 | 750 | 0.0663 | 0.21 | 0.8862 | 4.6460 | 0.2100 | 0.1060 | 0.2514 | 0.6663 |
| 0.0776 | 31.0 | 775 | 0.0661 | 0.215 | 0.8874 | 5.0331 | 0.2150 | 0.1219 | 0.2585 | 0.6564 |
| 0.0776 | 32.0 | 800 | 0.0661 | 0.23 | 0.8871 | 4.9313 | 0.23 | 0.1299 | 0.2600 | 0.6596 |
| 0.0776 | 33.0 | 825 | 0.0661 | 0.245 | 0.8870 | 5.2183 | 0.245 | 0.1400 | 0.2729 | 0.6216 |
| 0.0776 | 34.0 | 850 | 0.0660 | 0.215 | 0.8869 | 5.0987 | 0.2150 | 0.1130 | 0.2473 | 0.6745 |
| 0.0776 | 35.0 | 875 | 0.0659 | 0.245 | 0.8872 | 5.1536 | 0.245 | 0.1404 | 0.2684 | 0.6753 |
| 0.0776 | 36.0 | 900 | 0.0661 | 0.23 | 0.8865 | 5.0364 | 0.23 | 0.1219 | 0.2603 | 0.7257 |
| 0.0776 | 37.0 | 925 | 0.0659 | 0.25 | 0.8866 | 5.0315 | 0.25 | 0.1289 | 0.2709 | 0.6743 |
| 0.0776 | 38.0 | 950 | 0.0658 | 0.245 | 0.8865 | 5.1304 | 0.245 | 0.1348 | 0.2708 | 0.6383 |
| 0.0776 | 39.0 | 975 | 0.0659 | 0.255 | 0.8865 | 4.8854 | 0.255 | 0.1343 | 0.2762 | 0.6336 |
| 0.0623 | 40.0 | 1000 | 0.0658 | 0.245 | 0.8866 | 4.8705 | 0.245 | 0.1287 | 0.2709 | 0.6663 |
| 0.0623 | 41.0 | 1025 | 0.0656 | 0.245 | 0.8872 | 5.2534 | 0.245 | 0.1307 | 0.2662 | 0.6369 |
| 0.0623 | 42.0 | 1050 | 0.0658 | 0.235 | 0.8866 | 5.1467 | 0.235 | 0.1220 | 0.2630 | 0.6834 |
| 0.0623 | 43.0 | 1075 | 0.0656 | 0.235 | 0.8868 | 5.2909 | 0.235 | 0.1201 | 0.2629 | 0.6663 |
| 0.0623 | 44.0 | 1100 | 0.0655 | 0.24 | 0.8870 | 5.3291 | 0.24 | 0.1315 | 0.2600 | 0.5948 |
| 0.0623 | 45.0 | 1125 | 0.0656 | 0.25 | 0.8867 | 5.0194 | 0.25 | 0.1397 | 0.2663 | 0.6091 |
| 0.0623 | 46.0 | 1150 | 0.0656 | 0.255 | 0.8863 | 5.0523 | 0.255 | 0.1303 | 0.2729 | 0.5687 |
| 0.0623 | 47.0 | 1175 | 0.0655 | 0.25 | 0.8863 | 4.9527 | 0.25 | 0.1293 | 0.2679 | 0.5724 |
| 0.0623 | 48.0 | 1200 | 0.0654 | 0.25 | 0.8863 | 5.0401 | 0.25 | 0.1307 | 0.2588 | 0.5783 |
| 0.0623 | 49.0 | 1225 | 0.0655 | 0.25 | 0.8862 | 5.0198 | 0.25 | 0.1301 | 0.2781 | 0.6005 |
| 0.0623 | 50.0 | 1250 | 0.0653 | 0.255 | 0.8858 | 5.0534 | 0.255 | 0.1311 | 0.2732 | 0.5601 |
| 0.0623 | 51.0 | 1275 | 0.0653 | 0.26 | 0.8861 | 4.9558 | 0.26 | 0.1329 | 0.2779 | 0.5458 |
| 0.0623 | 52.0 | 1300 | 0.0653 | 0.275 | 0.8855 | 4.8780 | 0.275 | 0.1497 | 0.2863 | 0.5410 |
| 0.0623 | 53.0 | 1325 | 0.0653 | 0.27 | 0.8854 | 4.8854 | 0.27 | 0.1453 | 0.2857 | 0.5276 |
| 0.0623 | 54.0 | 1350 | 0.0652 | 0.26 | 0.8853 | 4.9379 | 0.26 | 0.1322 | 0.2803 | 0.5457 |
| 0.0623 | 55.0 | 1375 | 0.0653 | 0.26 | 0.8852 | 4.9101 | 0.26 | 0.1388 | 0.2778 | 0.5414 |
| 0.0623 | 56.0 | 1400 | 0.0653 | 0.275 | 0.8849 | 4.8107 | 0.275 | 0.1541 | 0.2881 | 0.5204 |
| 0.0623 | 57.0 | 1425 | 0.0652 | 0.26 | 0.8850 | 4.7712 | 0.26 | 0.1402 | 0.2709 | 0.5323 |
| 0.0623 | 58.0 | 1450 | 0.0652 | 0.275 | 0.8850 | 4.8164 | 0.275 | 0.1618 | 0.2885 | 0.5288 |
| 0.0623 | 59.0 | 1475 | 0.0652 | 0.27 | 0.8848 | 4.7627 | 0.27 | 0.1402 | 0.2830 | 0.5194 |
| 0.0599 | 60.0 | 1500 | 0.0652 | 0.28 | 0.8851 | 4.8057 | 0.28 | 0.1624 | 0.2889 | 0.5304 |
| 0.0599 | 61.0 | 1525 | 0.0652 | 0.26 | 0.8851 | 4.8892 | 0.26 | 0.1351 | 0.2823 | 0.5372 |
| 0.0599 | 62.0 | 1550 | 0.0651 | 0.28 | 0.8852 | 4.8944 | 0.28 | 0.1625 | 0.2889 | 0.5320 |
| 0.0599 | 63.0 | 1575 | 0.0652 | 0.275 | 0.8849 | 4.8180 | 0.275 | 0.1484 | 0.2858 | 0.5207 |
| 0.0599 | 64.0 | 1600 | 0.0652 | 0.265 | 0.8849 | 4.8034 | 0.265 | 0.1366 | 0.2807 | 0.5431 |
| 0.0599 | 65.0 | 1625 | 0.0651 | 0.28 | 0.8852 | 4.7696 | 0.28 | 0.1587 | 0.2888 | 0.5327 |
| 0.0599 | 66.0 | 1650 | 0.0652 | 0.28 | 0.8850 | 4.8120 | 0.28 | 0.1561 | 0.2886 | 0.5134 |
| 0.0599 | 67.0 | 1675 | 0.0651 | 0.28 | 0.8850 | 4.7193 | 0.28 | 0.1623 | 0.2867 | 0.5339 |
| 0.0599 | 68.0 | 1700 | 0.0652 | 0.255 | 0.8848 | 4.7659 | 0.255 | 0.1287 | 0.2726 | 0.5406 |
| 0.0599 | 69.0 | 1725 | 0.0651 | 0.275 | 0.8849 | 4.7354 | 0.275 | 0.1479 | 0.2880 | 0.5181 |
| 0.0599 | 70.0 | 1750 | 0.0651 | 0.275 | 0.8850 | 4.8271 | 0.275 | 0.1543 | 0.2893 | 0.5313 |
| 0.0599 | 71.0 | 1775 | 0.0651 | 0.275 | 0.8849 | 4.7387 | 0.275 | 0.1549 | 0.2839 | 0.5454 |
| 0.0599 | 72.0 | 1800 | 0.0652 | 0.275 | 0.8851 | 4.7695 | 0.275 | 0.1475 | 0.2849 | 0.5313 |
| 0.0599 | 73.0 | 1825 | 0.0652 | 0.275 | 0.8848 | 4.7794 | 0.275 | 0.1418 | 0.2845 | 0.5159 |
| 0.0599 | 74.0 | 1850 | 0.0652 | 0.265 | 0.8848 | 4.7832 | 0.265 | 0.1334 | 0.2827 | 0.5350 |
| 0.0599 | 75.0 | 1875 | 0.0652 | 0.27 | 0.8849 | 4.7822 | 0.27 | 0.1388 | 0.2822 | 0.5478 |
| 0.0599 | 76.0 | 1900 | 0.0652 | 0.26 | 0.8850 | 4.7813 | 0.26 | 0.1293 | 0.2720 | 0.5386 |
| 0.0599 | 77.0 | 1925 | 0.0652 | 0.265 | 0.8849 | 4.7949 | 0.265 | 0.1365 | 0.2818 | 0.5456 |
| 0.0599 | 78.0 | 1950 | 0.0652 | 0.28 | 0.8850 | 4.7960 | 0.28 | 0.1509 | 0.2878 | 0.5361 |
| 0.0599 | 79.0 | 1975 | 0.0652 | 0.265 | 0.8849 | 4.7800 | 0.265 | 0.1353 | 0.2803 | 0.5407 |
| 0.059 | 80.0 | 2000 | 0.0652 | 0.265 | 0.8849 | 4.8926 | 0.265 | 0.1429 | 0.2782 | 0.5381 |
| 0.059 | 81.0 | 2025 | 0.0653 | 0.27 | 0.8850 | 4.7975 | 0.27 | 0.1388 | 0.2847 | 0.5436 |
| 0.059 | 82.0 | 2050 | 0.0652 | 0.275 | 0.8849 | 4.7902 | 0.275 | 0.1505 | 0.2849 | 0.5342 |
| 0.059 | 83.0 | 2075 | 0.0652 | 0.265 | 0.8850 | 4.7813 | 0.265 | 0.1435 | 0.2805 | 0.5388 |
| 0.059 | 84.0 | 2100 | 0.0652 | 0.265 | 0.8849 | 4.7866 | 0.265 | 0.1457 | 0.2873 | 0.5358 |
| 0.059 | 85.0 | 2125 | 0.0652 | 0.275 | 0.8850 | 4.7877 | 0.275 | 0.1514 | 0.2872 | 0.5395 |
| 0.059 | 86.0 | 2150 | 0.0652 | 0.265 | 0.8849 | 4.7870 | 0.265 | 0.1453 | 0.2862 | 0.5424 |
| 0.059 | 87.0 | 2175 | 0.0652 | 0.275 | 0.8849 | 4.7887 | 0.275 | 0.1517 | 0.2926 | 0.5312 |
| 0.059 | 88.0 | 2200 | 0.0652 | 0.27 | 0.8849 | 4.7862 | 0.27 | 0.1494 | 0.2809 | 0.5357 |
| 0.059 | 89.0 | 2225 | 0.0652 | 0.275 | 0.8849 | 4.7911 | 0.275 | 0.1515 | 0.2871 | 0.5317 |
| 0.059 | 90.0 | 2250 | 0.0652 | 0.28 | 0.8849 | 4.7825 | 0.28 | 0.1540 | 0.2887 | 0.5305 |
| 0.059 | 91.0 | 2275 | 0.0652 | 0.28 | 0.8849 | 4.7892 | 0.28 | 0.1547 | 0.2900 | 0.5339 |
| 0.059 | 92.0 | 2300 | 0.0652 | 0.29 | 0.8848 | 4.7848 | 0.29 | 0.1642 | 0.2932 | 0.5252 |
| 0.059 | 93.0 | 2325 | 0.0652 | 0.29 | 0.8848 | 4.7823 | 0.29 | 0.1634 | 0.2953 | 0.5243 |
| 0.059 | 94.0 | 2350 | 0.0652 | 0.29 | 0.8849 | 4.7869 | 0.29 | 0.1631 | 0.2932 | 0.5239 |
| 0.059 | 95.0 | 2375 | 0.0652 | 0.285 | 0.8849 | 4.7902 | 0.285 | 0.1615 | 0.2893 | 0.5254 |
| 0.059 | 96.0 | 2400 | 0.0652 | 0.29 | 0.8849 | 4.7894 | 0.29 | 0.1634 | 0.2909 | 0.5256 |
| 0.059 | 97.0 | 2425 | 0.0652 | 0.29 | 0.8849 | 4.7915 | 0.29 | 0.1642 | 0.2932 | 0.5244 |
| 0.059 | 98.0 | 2450 | 0.0652 | 0.29 | 0.8849 | 4.7902 | 0.29 | 0.1642 | 0.2909 | 0.5244 |
| 0.059 | 99.0 | 2475 | 0.0652 | 0.29 | 0.8849 | 4.7889 | 0.29 | 0.1634 | 0.2953 | 0.5241 |
| 0.0586 | 100.0 | 2500 | 0.0652 | 0.29 | 0.8849 | 4.7897 | 0.29 | 0.1634 | 0.2953 | 0.5243 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/18-tiny_tobacco3482_kd_NKD_t1.0_g1.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 18-tiny_tobacco3482_kd_NKD_t1.0_g1.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 4.0957
- Accuracy: 0.805
- Brier Loss: 0.2927
- Nll: 1.1753
- F1 Micro: 0.805
- F1 Macro: 0.7833
- Ece: 0.1572
- Aurc: 0.0655
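The suffix `kd_NKD_t1.0_g1.5` suggests a knowledge-distillation objective (NKD with temperature 1.0 and a gamma of 1.5); the exact loss is not documented in this card. As a generic point of reference only, a vanilla temperature-scaled distillation loss looks like the following sketch (NKD itself modifies the soft-label term further):
```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=1.0, alpha=0.5):
    """Generic CE + temperature-scaled KL distillation loss (illustrative only)."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * ce + (1 - alpha) * kd
```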
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 4.7898 | 0.1 | 1.0292 | 9.4902 | 0.1000 | 0.0772 | 0.3220 | 0.9001 |
| No log | 2.0 | 14 | 3.9970 | 0.1 | 0.9420 | 10.0981 | 0.1000 | 0.1071 | 0.2441 | 0.8581 |
| No log | 3.0 | 21 | 3.6641 | 0.075 | 0.8956 | 9.5324 | 0.075 | 0.0777 | 0.1896 | 0.9137 |
| No log | 4.0 | 28 | 3.6014 | 0.18 | 0.8691 | 9.6679 | 0.18 | 0.0781 | 0.2345 | 0.5824 |
| No log | 5.0 | 35 | 3.5833 | 0.23 | 0.8347 | 9.6569 | 0.23 | 0.1572 | 0.2618 | 0.5205 |
| No log | 6.0 | 42 | 3.5576 | 0.44 | 0.7860 | 5.9410 | 0.44 | 0.2946 | 0.3475 | 0.3232 |
| No log | 7.0 | 49 | 3.5400 | 0.575 | 0.7404 | 4.2387 | 0.575 | 0.4638 | 0.4007 | 0.2294 |
| No log | 8.0 | 56 | 3.5319 | 0.545 | 0.7181 | 4.5958 | 0.545 | 0.4482 | 0.3502 | 0.2374 |
| No log | 9.0 | 63 | 3.5405 | 0.52 | 0.7002 | 3.9862 | 0.52 | 0.4101 | 0.3148 | 0.2506 |
| No log | 10.0 | 70 | 3.5341 | 0.61 | 0.6897 | 3.2707 | 0.61 | 0.5118 | 0.3775 | 0.2235 |
| No log | 11.0 | 77 | 3.5259 | 0.66 | 0.6771 | 2.6882 | 0.66 | 0.5201 | 0.4365 | 0.1420 |
| No log | 12.0 | 84 | 3.5215 | 0.66 | 0.6463 | 2.4544 | 0.66 | 0.5387 | 0.3750 | 0.1664 |
| No log | 13.0 | 91 | 3.5363 | 0.58 | 0.6232 | 2.3149 | 0.58 | 0.5090 | 0.3285 | 0.1858 |
| No log | 14.0 | 98 | 3.5161 | 0.675 | 0.6008 | 2.6144 | 0.675 | 0.5411 | 0.3690 | 0.1237 |
| No log | 15.0 | 105 | 3.5073 | 0.67 | 0.5845 | 2.1229 | 0.67 | 0.5577 | 0.3405 | 0.1350 |
| No log | 16.0 | 112 | 3.5272 | 0.67 | 0.5338 | 2.4215 | 0.67 | 0.5603 | 0.3154 | 0.1325 |
| No log | 17.0 | 119 | 3.5332 | 0.695 | 0.5367 | 2.1675 | 0.695 | 0.6056 | 0.3140 | 0.1071 |
| No log | 18.0 | 126 | 3.5659 | 0.655 | 0.4841 | 1.9565 | 0.655 | 0.5559 | 0.2600 | 0.1365 |
| No log | 19.0 | 133 | 3.5438 | 0.69 | 0.4817 | 1.8201 | 0.69 | 0.5735 | 0.2574 | 0.1202 |
| No log | 20.0 | 140 | 3.5019 | 0.74 | 0.4725 | 1.6346 | 0.74 | 0.6486 | 0.2939 | 0.0931 |
| No log | 21.0 | 147 | 3.5236 | 0.755 | 0.4407 | 1.3134 | 0.755 | 0.6811 | 0.2762 | 0.0820 |
| No log | 22.0 | 154 | 3.5303 | 0.755 | 0.4143 | 1.2834 | 0.755 | 0.6843 | 0.2434 | 0.0806 |
| No log | 23.0 | 161 | 3.5541 | 0.77 | 0.4034 | 1.4417 | 0.7700 | 0.6891 | 0.2382 | 0.0842 |
| No log | 24.0 | 168 | 3.5675 | 0.765 | 0.3853 | 1.6692 | 0.765 | 0.7072 | 0.2309 | 0.0807 |
| No log | 25.0 | 175 | 3.5411 | 0.745 | 0.3914 | 1.2777 | 0.745 | 0.6720 | 0.2271 | 0.0784 |
| No log | 26.0 | 182 | 3.5877 | 0.75 | 0.3710 | 1.4838 | 0.75 | 0.6717 | 0.2082 | 0.0789 |
| No log | 27.0 | 189 | 3.6026 | 0.77 | 0.3483 | 1.4211 | 0.7700 | 0.7018 | 0.2089 | 0.0694 |
| No log | 28.0 | 196 | 3.6374 | 0.78 | 0.3365 | 1.3205 | 0.78 | 0.7181 | 0.1953 | 0.0694 |
| No log | 29.0 | 203 | 3.7319 | 0.775 | 0.3538 | 1.2749 | 0.775 | 0.7012 | 0.2149 | 0.0814 |
| No log | 30.0 | 210 | 3.6359 | 0.805 | 0.3291 | 1.3272 | 0.805 | 0.7761 | 0.1991 | 0.0637 |
| No log | 31.0 | 217 | 3.7160 | 0.785 | 0.3337 | 1.2632 | 0.785 | 0.7445 | 0.1727 | 0.0757 |
| No log | 32.0 | 224 | 3.6810 | 0.8 | 0.3234 | 1.3720 | 0.8000 | 0.7636 | 0.1999 | 0.0649 |
| No log | 33.0 | 231 | 3.7139 | 0.82 | 0.3221 | 1.2150 | 0.82 | 0.7919 | 0.2051 | 0.0677 |
| No log | 34.0 | 238 | 3.7286 | 0.795 | 0.3130 | 1.0622 | 0.795 | 0.7575 | 0.1919 | 0.0639 |
| No log | 35.0 | 245 | 3.7807 | 0.795 | 0.3154 | 1.0146 | 0.795 | 0.7672 | 0.1565 | 0.0714 |
| No log | 36.0 | 252 | 3.6802 | 0.815 | 0.3131 | 1.0083 | 0.815 | 0.7933 | 0.2051 | 0.0626 |
| No log | 37.0 | 259 | 3.7369 | 0.81 | 0.3168 | 1.0017 | 0.81 | 0.7862 | 0.1792 | 0.0690 |
| No log | 38.0 | 266 | 3.7638 | 0.82 | 0.2971 | 1.3357 | 0.82 | 0.7977 | 0.1913 | 0.0628 |
| No log | 39.0 | 273 | 3.7415 | 0.825 | 0.2954 | 1.0423 | 0.825 | 0.8072 | 0.1893 | 0.0599 |
| No log | 40.0 | 280 | 3.8005 | 0.785 | 0.3140 | 1.0817 | 0.785 | 0.7453 | 0.1694 | 0.0684 |
| No log | 41.0 | 287 | 3.7901 | 0.82 | 0.3127 | 1.0853 | 0.82 | 0.7993 | 0.1789 | 0.0673 |
| No log | 42.0 | 294 | 3.7811 | 0.825 | 0.3019 | 1.2712 | 0.825 | 0.8020 | 0.1644 | 0.0644 |
| No log | 43.0 | 301 | 3.7689 | 0.81 | 0.3110 | 0.8553 | 0.81 | 0.7932 | 0.1785 | 0.0645 |
| No log | 44.0 | 308 | 3.7796 | 0.82 | 0.2919 | 1.2589 | 0.82 | 0.7972 | 0.1875 | 0.0643 |
| No log | 45.0 | 315 | 3.8005 | 0.805 | 0.3036 | 1.1993 | 0.805 | 0.7789 | 0.1840 | 0.0660 |
| No log | 46.0 | 322 | 3.7811 | 0.82 | 0.2909 | 1.0962 | 0.82 | 0.8004 | 0.1735 | 0.0618 |
| No log | 47.0 | 329 | 3.8145 | 0.8 | 0.3040 | 1.1968 | 0.8000 | 0.7759 | 0.1795 | 0.0671 |
| No log | 48.0 | 336 | 3.7969 | 0.835 | 0.2816 | 1.1019 | 0.835 | 0.8118 | 0.1624 | 0.0603 |
| No log | 49.0 | 343 | 3.8020 | 0.815 | 0.2855 | 1.0383 | 0.815 | 0.7978 | 0.1556 | 0.0639 |
| No log | 50.0 | 350 | 3.8049 | 0.815 | 0.2884 | 1.1121 | 0.815 | 0.7935 | 0.1608 | 0.0616 |
| No log | 51.0 | 357 | 3.8048 | 0.81 | 0.2873 | 1.1173 | 0.81 | 0.7898 | 0.1574 | 0.0632 |
| No log | 52.0 | 364 | 3.8581 | 0.8 | 0.2923 | 1.1257 | 0.8000 | 0.7767 | 0.1436 | 0.0664 |
| No log | 53.0 | 371 | 3.8565 | 0.79 | 0.2984 | 1.0513 | 0.79 | 0.7670 | 0.1622 | 0.0668 |
| No log | 54.0 | 378 | 3.8787 | 0.805 | 0.2901 | 1.0619 | 0.805 | 0.7874 | 0.1335 | 0.0655 |
| No log | 55.0 | 385 | 3.8777 | 0.805 | 0.2940 | 1.0378 | 0.805 | 0.7883 | 0.1450 | 0.0647 |
| No log | 56.0 | 392 | 3.8743 | 0.805 | 0.2906 | 1.1702 | 0.805 | 0.7849 | 0.1610 | 0.0634 |
| No log | 57.0 | 399 | 3.9082 | 0.795 | 0.2959 | 1.0951 | 0.795 | 0.7711 | 0.1761 | 0.0662 |
| No log | 58.0 | 406 | 3.8894 | 0.8 | 0.2898 | 1.0979 | 0.8000 | 0.7816 | 0.1774 | 0.0638 |
| No log | 59.0 | 413 | 3.9005 | 0.825 | 0.2914 | 1.2358 | 0.825 | 0.8088 | 0.1687 | 0.0637 |
| No log | 60.0 | 420 | 3.9115 | 0.815 | 0.2863 | 1.0318 | 0.815 | 0.7928 | 0.1672 | 0.0640 |
| No log | 61.0 | 427 | 3.9172 | 0.805 | 0.2956 | 1.1397 | 0.805 | 0.7884 | 0.1646 | 0.0667 |
| No log | 62.0 | 434 | 3.8993 | 0.82 | 0.2862 | 1.2349 | 0.82 | 0.8001 | 0.1544 | 0.0645 |
| No log | 63.0 | 441 | 3.9334 | 0.825 | 0.2896 | 1.1718 | 0.825 | 0.8061 | 0.1662 | 0.0646 |
| No log | 64.0 | 448 | 3.9179 | 0.815 | 0.2861 | 1.1727 | 0.815 | 0.7966 | 0.1592 | 0.0650 |
| No log | 65.0 | 455 | 3.9489 | 0.8 | 0.2981 | 1.1681 | 0.8000 | 0.7805 | 0.1522 | 0.0674 |
| No log | 66.0 | 462 | 3.9372 | 0.81 | 0.2855 | 1.1041 | 0.81 | 0.7870 | 0.1709 | 0.0647 |
| No log | 67.0 | 469 | 3.9651 | 0.8 | 0.2935 | 1.1723 | 0.8000 | 0.7816 | 0.1492 | 0.0667 |
| No log | 68.0 | 476 | 3.9600 | 0.815 | 0.2903 | 1.1687 | 0.815 | 0.7950 | 0.1466 | 0.0650 |
| No log | 69.0 | 483 | 3.9695 | 0.82 | 0.2908 | 1.1251 | 0.82 | 0.8026 | 0.1532 | 0.0654 |
| No log | 70.0 | 490 | 3.9817 | 0.805 | 0.2915 | 1.1879 | 0.805 | 0.7861 | 0.1537 | 0.0657 |
| No log | 71.0 | 497 | 3.9838 | 0.81 | 0.2899 | 1.1688 | 0.81 | 0.7892 | 0.1538 | 0.0648 |
| 3.4085 | 72.0 | 504 | 3.9960 | 0.805 | 0.2910 | 1.1702 | 0.805 | 0.7904 | 0.1568 | 0.0657 |
| 3.4085 | 73.0 | 511 | 4.0046 | 0.8 | 0.2931 | 1.1743 | 0.8000 | 0.7800 | 0.1529 | 0.0658 |
| 3.4085 | 74.0 | 518 | 4.0115 | 0.815 | 0.2917 | 1.1718 | 0.815 | 0.7968 | 0.1589 | 0.0647 |
| 3.4085 | 75.0 | 525 | 4.0205 | 0.805 | 0.2920 | 1.1719 | 0.805 | 0.7833 | 0.1575 | 0.0654 |
| 3.4085 | 76.0 | 532 | 4.0272 | 0.805 | 0.2919 | 1.1725 | 0.805 | 0.7833 | 0.1547 | 0.0659 |
| 3.4085 | 77.0 | 539 | 4.0323 | 0.81 | 0.2923 | 1.1720 | 0.81 | 0.7892 | 0.1547 | 0.0653 |
| 3.4085 | 78.0 | 546 | 4.0364 | 0.81 | 0.2907 | 1.1715 | 0.81 | 0.7892 | 0.1607 | 0.0650 |
| 3.4085 | 79.0 | 553 | 4.0405 | 0.81 | 0.2910 | 1.1716 | 0.81 | 0.7892 | 0.1451 | 0.0650 |
| 3.4085 | 80.0 | 560 | 4.0476 | 0.81 | 0.2917 | 1.1743 | 0.81 | 0.7892 | 0.1453 | 0.0650 |
| 3.4085 | 81.0 | 567 | 4.0529 | 0.805 | 0.2921 | 1.1736 | 0.805 | 0.7833 | 0.1573 | 0.0654 |
| 3.4085 | 82.0 | 574 | 4.0570 | 0.805 | 0.2919 | 1.1741 | 0.805 | 0.7861 | 0.1717 | 0.0655 |
| 3.4085 | 83.0 | 581 | 4.0601 | 0.81 | 0.2918 | 1.1727 | 0.81 | 0.7892 | 0.1508 | 0.0650 |
| 3.4085 | 84.0 | 588 | 4.0643 | 0.81 | 0.2919 | 1.1743 | 0.81 | 0.7892 | 0.1507 | 0.0652 |
| 3.4085 | 85.0 | 595 | 4.0678 | 0.81 | 0.2922 | 1.1744 | 0.81 | 0.7892 | 0.1552 | 0.0651 |
| 3.4085 | 86.0 | 602 | 4.0743 | 0.81 | 0.2925 | 1.1746 | 0.81 | 0.7892 | 0.1526 | 0.0651 |
| 3.4085 | 87.0 | 609 | 4.0758 | 0.805 | 0.2924 | 1.1753 | 0.805 | 0.7833 | 0.1718 | 0.0653 |
| 3.4085 | 88.0 | 616 | 4.0796 | 0.805 | 0.2924 | 1.1758 | 0.805 | 0.7833 | 0.1567 | 0.0654 |
| 3.4085 | 89.0 | 623 | 4.0803 | 0.81 | 0.2920 | 1.1742 | 0.81 | 0.7892 | 0.1587 | 0.0650 |
| 3.4085 | 90.0 | 630 | 4.0842 | 0.81 | 0.2925 | 1.1744 | 0.81 | 0.7892 | 0.1529 | 0.0651 |
| 3.4085 | 91.0 | 637 | 4.0864 | 0.805 | 0.2926 | 1.1752 | 0.805 | 0.7833 | 0.1568 | 0.0654 |
| 3.4085 | 92.0 | 644 | 4.0880 | 0.81 | 0.2925 | 1.1757 | 0.81 | 0.7892 | 0.1526 | 0.0651 |
| 3.4085 | 93.0 | 651 | 4.0903 | 0.805 | 0.2927 | 1.1752 | 0.805 | 0.7833 | 0.1567 | 0.0654 |
| 3.4085 | 94.0 | 658 | 4.0918 | 0.805 | 0.2927 | 1.1750 | 0.805 | 0.7833 | 0.1572 | 0.0655 |
| 3.4085 | 95.0 | 665 | 4.0927 | 0.805 | 0.2926 | 1.1750 | 0.805 | 0.7833 | 0.1570 | 0.0655 |
| 3.4085 | 96.0 | 672 | 4.0937 | 0.805 | 0.2927 | 1.1751 | 0.805 | 0.7833 | 0.1572 | 0.0655 |
| 3.4085 | 97.0 | 679 | 4.0946 | 0.805 | 0.2926 | 1.1750 | 0.805 | 0.7833 | 0.1573 | 0.0655 |
| 3.4085 | 98.0 | 686 | 4.0950 | 0.805 | 0.2926 | 1.1752 | 0.805 | 0.7833 | 0.1572 | 0.0655 |
| 3.4085 | 99.0 | 693 | 4.0955 | 0.805 | 0.2927 | 1.1753 | 0.805 | 0.7833 | 0.1572 | 0.0655 |
| 3.4085 | 100.0 | 700 | 4.0957 | 0.805 | 0.2927 | 1.1753 | 0.805 | 0.7833 | 0.1572 | 0.0655 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/18-tiny_tobacco3482_hint_
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 18-tiny_tobacco3482_hint_
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 71.6043
- Accuracy: 0.85
- Brier Loss: 0.2638
- Nll: 1.3122
- F1 Micro: 0.85
- F1 Macro: 0.8425
- Ece: 0.1342
- Aurc: 0.0355
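The `hint_` in the name points at hint-based (FitNets-style) feature distillation, where an intermediate student feature map is regressed onto a teacher "hint" layer; the large absolute loss values below are consistent with such a feature-matching term. A generic sketch, purely illustrative (the 192/768 hidden sizes are assumptions matching ViT-tiny and ViT-base):
```python
import torch.nn as nn
import torch.nn.functional as F

class HintLoss(nn.Module):
    """FitNets-style hint loss: MSE between a projected student feature
    and a teacher feature map. Dimensions here are assumptions."""
    def __init__(self, student_dim=192, teacher_dim=768):
        super().__init__()
        self.proj = nn.Linear(student_dim, teacher_dim)  # regressor

    def forward(self, student_feat, teacher_feat):
        return F.mse_loss(self.proj(student_feat), teacher_feat)
```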
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 74.9320 | 0.26 | 0.8713 | 4.9021 | 0.26 | 0.1920 | 0.3072 | 0.7822 |
| No log | 2.0 | 50 | 74.1171 | 0.54 | 0.5982 | 2.7516 | 0.54 | 0.4539 | 0.2763 | 0.2591 |
| No log | 3.0 | 75 | 73.6525 | 0.685 | 0.4431 | 1.5999 | 0.685 | 0.6196 | 0.1890 | 0.1355 |
| No log | 4.0 | 100 | 73.3675 | 0.735 | 0.3894 | 1.5393 | 0.735 | 0.7201 | 0.2334 | 0.1214 |
| No log | 5.0 | 125 | 73.0199 | 0.785 | 0.3130 | 1.6799 | 0.785 | 0.7426 | 0.1738 | 0.0705 |
| No log | 6.0 | 150 | 72.7837 | 0.8 | 0.3068 | 1.5290 | 0.8000 | 0.7711 | 0.1531 | 0.0582 |
| No log | 7.0 | 175 | 72.7218 | 0.76 | 0.3913 | 1.8944 | 0.76 | 0.7334 | 0.1998 | 0.0987 |
| No log | 8.0 | 200 | 72.7351 | 0.695 | 0.4805 | 1.4948 | 0.695 | 0.6565 | 0.2585 | 0.1329 |
| No log | 9.0 | 225 | 72.2520 | 0.79 | 0.3279 | 1.5198 | 0.79 | 0.7440 | 0.1878 | 0.0704 |
| No log | 10.0 | 250 | 72.3792 | 0.785 | 0.3370 | 1.7982 | 0.785 | 0.7687 | 0.1655 | 0.0757 |
| No log | 11.0 | 275 | 72.2408 | 0.765 | 0.3542 | 1.6304 | 0.765 | 0.7364 | 0.1838 | 0.0660 |
| No log | 12.0 | 300 | 72.1003 | 0.76 | 0.3664 | 1.4090 | 0.76 | 0.7247 | 0.1832 | 0.0817 |
| No log | 13.0 | 325 | 72.4671 | 0.77 | 0.3886 | 1.7820 | 0.7700 | 0.7380 | 0.1985 | 0.0833 |
| No log | 14.0 | 350 | 72.1519 | 0.785 | 0.3633 | 1.5939 | 0.785 | 0.7563 | 0.1814 | 0.0687 |
| No log | 15.0 | 375 | 72.2206 | 0.775 | 0.3723 | 1.6025 | 0.775 | 0.7406 | 0.1824 | 0.0800 |
| No log | 16.0 | 400 | 72.0554 | 0.805 | 0.3080 | 1.4448 | 0.805 | 0.7520 | 0.1577 | 0.0485 |
| No log | 17.0 | 425 | 72.3130 | 0.8 | 0.3412 | 1.7255 | 0.8000 | 0.7653 | 0.1637 | 0.0694 |
| No log | 18.0 | 450 | 72.1019 | 0.815 | 0.3228 | 1.4827 | 0.815 | 0.7918 | 0.1753 | 0.0592 |
| No log | 19.0 | 475 | 72.2643 | 0.8 | 0.3558 | 1.5960 | 0.8000 | 0.7604 | 0.1677 | 0.0737 |
| 71.4928 | 20.0 | 500 | 71.9509 | 0.78 | 0.3398 | 1.5615 | 0.78 | 0.7596 | 0.1683 | 0.0561 |
| 71.4928 | 21.0 | 525 | 71.9389 | 0.82 | 0.2947 | 1.5336 | 0.82 | 0.8008 | 0.1561 | 0.0538 |
| 71.4928 | 22.0 | 550 | 72.0399 | 0.83 | 0.2843 | 1.4576 | 0.83 | 0.7914 | 0.1347 | 0.0523 |
| 71.4928 | 23.0 | 575 | 72.0529 | 0.815 | 0.3263 | 1.3174 | 0.815 | 0.7923 | 0.1677 | 0.0523 |
| 71.4928 | 24.0 | 600 | 72.3487 | 0.775 | 0.3838 | 1.5418 | 0.775 | 0.7560 | 0.1880 | 0.0794 |
| 71.4928 | 25.0 | 625 | 71.9154 | 0.825 | 0.2949 | 1.2538 | 0.825 | 0.7992 | 0.1471 | 0.0584 |
| 71.4928 | 26.0 | 650 | 72.0222 | 0.815 | 0.3151 | 1.5809 | 0.815 | 0.7830 | 0.1601 | 0.0594 |
| 71.4928 | 27.0 | 675 | 72.0422 | 0.815 | 0.3269 | 1.5161 | 0.815 | 0.7954 | 0.1606 | 0.0597 |
| 71.4928 | 28.0 | 700 | 72.0172 | 0.845 | 0.2828 | 1.3388 | 0.845 | 0.8350 | 0.1447 | 0.0649 |
| 71.4928 | 29.0 | 725 | 71.9113 | 0.84 | 0.2685 | 1.2082 | 0.8400 | 0.8202 | 0.1365 | 0.0562 |
| 71.4928 | 30.0 | 750 | 71.9516 | 0.84 | 0.2856 | 1.2664 | 0.8400 | 0.8359 | 0.1415 | 0.0563 |
| 71.4928 | 31.0 | 775 | 71.8583 | 0.835 | 0.2782 | 1.2979 | 0.835 | 0.8277 | 0.1447 | 0.0545 |
| 71.4928 | 32.0 | 800 | 71.9071 | 0.84 | 0.2766 | 1.2772 | 0.8400 | 0.8359 | 0.1378 | 0.0546 |
| 71.4928 | 33.0 | 825 | 71.8580 | 0.85 | 0.2699 | 1.2985 | 0.85 | 0.8482 | 0.1351 | 0.0525 |
| 71.4928 | 34.0 | 850 | 71.8499 | 0.835 | 0.2872 | 1.3022 | 0.835 | 0.8292 | 0.1462 | 0.0532 |
| 71.4928 | 35.0 | 875 | 72.0085 | 0.84 | 0.2897 | 1.3042 | 0.8400 | 0.8323 | 0.1420 | 0.0616 |
| 71.4928 | 36.0 | 900 | 71.8423 | 0.82 | 0.2929 | 1.2266 | 0.82 | 0.8056 | 0.1543 | 0.0521 |
| 71.4928 | 37.0 | 925 | 71.7886 | 0.845 | 0.2807 | 1.2181 | 0.845 | 0.8254 | 0.1332 | 0.0400 |
| 71.4928 | 38.0 | 950 | 71.8857 | 0.83 | 0.2877 | 1.4036 | 0.83 | 0.8166 | 0.1490 | 0.0480 |
| 71.4928 | 39.0 | 975 | 71.9388 | 0.83 | 0.2877 | 1.3374 | 0.83 | 0.8119 | 0.1459 | 0.0451 |
| 70.1673 | 40.0 | 1000 | 72.0368 | 0.8 | 0.3368 | 1.7112 | 0.8000 | 0.7897 | 0.1741 | 0.0578 |
| 70.1673 | 41.0 | 1025 | 72.0295 | 0.8 | 0.3208 | 1.5473 | 0.8000 | 0.7862 | 0.1587 | 0.0622 |
| 70.1673 | 42.0 | 1050 | 71.7048 | 0.86 | 0.2547 | 1.3240 | 0.8600 | 0.8374 | 0.1257 | 0.0408 |
| 70.1673 | 43.0 | 1075 | 71.7541 | 0.835 | 0.2680 | 1.4095 | 0.835 | 0.8178 | 0.1418 | 0.0445 |
| 70.1673 | 44.0 | 1100 | 71.7746 | 0.845 | 0.2721 | 1.5529 | 0.845 | 0.8383 | 0.1412 | 0.0405 |
| 70.1673 | 45.0 | 1125 | 71.7661 | 0.83 | 0.2908 | 1.5315 | 0.83 | 0.8104 | 0.1415 | 0.0437 |
| 70.1673 | 46.0 | 1150 | 71.7563 | 0.84 | 0.2787 | 1.4088 | 0.8400 | 0.8238 | 0.1360 | 0.0416 |
| 70.1673 | 47.0 | 1175 | 71.7670 | 0.84 | 0.2709 | 1.2801 | 0.8400 | 0.8303 | 0.1322 | 0.0401 |
| 70.1673 | 48.0 | 1200 | 71.7458 | 0.84 | 0.2699 | 1.4180 | 0.8400 | 0.8265 | 0.1433 | 0.0397 |
| 70.1673 | 49.0 | 1225 | 71.7226 | 0.84 | 0.2653 | 1.4126 | 0.8400 | 0.8265 | 0.1282 | 0.0390 |
| 70.1673 | 50.0 | 1250 | 71.7163 | 0.85 | 0.2608 | 1.4227 | 0.85 | 0.8402 | 0.1339 | 0.0394 |
| 70.1673 | 51.0 | 1275 | 71.7044 | 0.845 | 0.2612 | 1.4130 | 0.845 | 0.8371 | 0.1314 | 0.0387 |
| 70.1673 | 52.0 | 1300 | 71.6821 | 0.85 | 0.2545 | 1.4880 | 0.85 | 0.8392 | 0.1286 | 0.0383 |
| 70.1673 | 53.0 | 1325 | 71.6764 | 0.845 | 0.2598 | 1.3776 | 0.845 | 0.8301 | 0.1299 | 0.0377 |
| 70.1673 | 54.0 | 1350 | 71.6750 | 0.855 | 0.2590 | 1.3404 | 0.855 | 0.8479 | 0.1361 | 0.0383 |
| 70.1673 | 55.0 | 1375 | 71.7192 | 0.855 | 0.2543 | 1.4804 | 0.855 | 0.8482 | 0.1346 | 0.0388 |
| 70.1673 | 56.0 | 1400 | 71.6907 | 0.85 | 0.2552 | 1.4102 | 0.85 | 0.8389 | 0.1274 | 0.0377 |
| 70.1673 | 57.0 | 1425 | 71.6778 | 0.85 | 0.2572 | 1.4026 | 0.85 | 0.8392 | 0.1319 | 0.0389 |
| 70.1673 | 58.0 | 1450 | 71.6735 | 0.85 | 0.2559 | 1.4062 | 0.85 | 0.8394 | 0.1288 | 0.0386 |
| 70.1673 | 59.0 | 1475 | 71.6938 | 0.855 | 0.2549 | 1.4710 | 0.855 | 0.8482 | 0.1309 | 0.0387 |
| 69.9715 | 60.0 | 1500 | 71.6799 | 0.85 | 0.2599 | 1.3989 | 0.85 | 0.8389 | 0.1309 | 0.0384 |
| 69.9715 | 61.0 | 1525 | 71.6753 | 0.865 | 0.2539 | 1.3555 | 0.865 | 0.8619 | 0.1201 | 0.0374 |
| 69.9715 | 62.0 | 1550 | 71.6657 | 0.855 | 0.2562 | 1.4005 | 0.855 | 0.8453 | 0.1301 | 0.0365 |
| 69.9715 | 63.0 | 1575 | 71.6941 | 0.855 | 0.2569 | 1.4083 | 0.855 | 0.8453 | 0.1297 | 0.0369 |
| 69.9715 | 64.0 | 1600 | 71.6430 | 0.85 | 0.2567 | 1.3933 | 0.85 | 0.8395 | 0.1239 | 0.0370 |
| 69.9715 | 65.0 | 1625 | 71.6666 | 0.85 | 0.2582 | 1.4014 | 0.85 | 0.8395 | 0.1357 | 0.0375 |
| 69.9715 | 66.0 | 1650 | 71.6550 | 0.85 | 0.2578 | 1.3849 | 0.85 | 0.8395 | 0.1253 | 0.0370 |
| 69.9715 | 67.0 | 1675 | 71.6321 | 0.855 | 0.2573 | 1.3932 | 0.855 | 0.8466 | 0.1276 | 0.0362 |
| 69.9715 | 68.0 | 1700 | 71.6237 | 0.855 | 0.2576 | 1.3976 | 0.855 | 0.8453 | 0.1231 | 0.0374 |
| 69.9715 | 69.0 | 1725 | 71.6287 | 0.85 | 0.2589 | 1.3914 | 0.85 | 0.8403 | 0.1299 | 0.0366 |
| 69.9715 | 70.0 | 1750 | 71.6325 | 0.85 | 0.2580 | 1.3907 | 0.85 | 0.8425 | 0.1321 | 0.0365 |
| 69.9715 | 71.0 | 1775 | 71.6175 | 0.85 | 0.2572 | 1.3914 | 0.85 | 0.8412 | 0.1318 | 0.0365 |
| 69.9715 | 72.0 | 1800 | 71.6208 | 0.85 | 0.2591 | 1.3860 | 0.85 | 0.8425 | 0.1325 | 0.0355 |
| 69.9715 | 73.0 | 1825 | 71.6157 | 0.85 | 0.2600 | 1.3894 | 0.85 | 0.8425 | 0.1335 | 0.0361 |
| 69.9715 | 74.0 | 1850 | 71.6405 | 0.85 | 0.2632 | 1.3335 | 0.85 | 0.8425 | 0.1306 | 0.0359 |
| 69.9715 | 75.0 | 1875 | 71.6099 | 0.85 | 0.2586 | 1.3899 | 0.85 | 0.8425 | 0.1283 | 0.0360 |
| 69.9715 | 76.0 | 1900 | 71.6058 | 0.85 | 0.2599 | 1.3220 | 0.85 | 0.8425 | 0.1260 | 0.0357 |
| 69.9715 | 77.0 | 1925 | 71.6096 | 0.85 | 0.2591 | 1.3859 | 0.85 | 0.8425 | 0.1279 | 0.0355 |
| 69.9715 | 78.0 | 1950 | 71.6213 | 0.85 | 0.2604 | 1.3875 | 0.85 | 0.8425 | 0.1284 | 0.0351 |
| 69.9715 | 79.0 | 1975 | 71.6240 | 0.85 | 0.2610 | 1.3867 | 0.85 | 0.8425 | 0.1347 | 0.0356 |
| 69.8814 | 80.0 | 2000 | 71.6246 | 0.855 | 0.2608 | 1.3183 | 0.855 | 0.8483 | 0.1238 | 0.0354 |
| 69.8814 | 81.0 | 2025 | 71.6133 | 0.85 | 0.2595 | 1.3177 | 0.85 | 0.8425 | 0.1268 | 0.0357 |
| 69.8814 | 82.0 | 2050 | 71.6120 | 0.85 | 0.2593 | 1.3873 | 0.85 | 0.8425 | 0.1278 | 0.0353 |
| 69.8814 | 83.0 | 2075 | 71.6183 | 0.85 | 0.2600 | 1.3220 | 0.85 | 0.8425 | 0.1342 | 0.0356 |
| 69.8814 | 84.0 | 2100 | 71.6008 | 0.85 | 0.2610 | 1.3168 | 0.85 | 0.8425 | 0.1359 | 0.0358 |
| 69.8814 | 85.0 | 2125 | 71.5986 | 0.85 | 0.2621 | 1.3156 | 0.85 | 0.8425 | 0.1327 | 0.0355 |
| 69.8814 | 86.0 | 2150 | 71.6000 | 0.85 | 0.2609 | 1.3167 | 0.85 | 0.8425 | 0.1311 | 0.0354 |
| 69.8814 | 87.0 | 2175 | 71.6006 | 0.855 | 0.2611 | 1.3171 | 0.855 | 0.8483 | 0.1251 | 0.0353 |
| 69.8814 | 88.0 | 2200 | 71.6033 | 0.85 | 0.2620 | 1.3132 | 0.85 | 0.8425 | 0.1331 | 0.0357 |
| 69.8814 | 89.0 | 2225 | 71.6176 | 0.855 | 0.2635 | 1.3142 | 0.855 | 0.8483 | 0.1326 | 0.0357 |
| 69.8814 | 90.0 | 2250 | 71.6201 | 0.855 | 0.2636 | 1.3126 | 0.855 | 0.8483 | 0.1282 | 0.0356 |
| 69.8814 | 91.0 | 2275 | 71.6128 | 0.85 | 0.2629 | 1.3126 | 0.85 | 0.8425 | 0.1273 | 0.0356 |
| 69.8814 | 92.0 | 2300 | 71.6086 | 0.855 | 0.2631 | 1.3147 | 0.855 | 0.8515 | 0.1261 | 0.0358 |
| 69.8814 | 93.0 | 2325 | 71.6010 | 0.85 | 0.2638 | 1.3122 | 0.85 | 0.8425 | 0.1292 | 0.0356 |
| 69.8814 | 94.0 | 2350 | 71.6053 | 0.85 | 0.2636 | 1.3125 | 0.85 | 0.8425 | 0.1269 | 0.0354 |
| 69.8814 | 95.0 | 2375 | 71.6004 | 0.85 | 0.2640 | 1.3128 | 0.85 | 0.8425 | 0.1346 | 0.0356 |
| 69.8814 | 96.0 | 2400 | 71.6035 | 0.85 | 0.2644 | 1.3128 | 0.85 | 0.8425 | 0.1346 | 0.0356 |
| 69.8814 | 97.0 | 2425 | 71.6027 | 0.85 | 0.2639 | 1.3117 | 0.85 | 0.8425 | 0.1343 | 0.0355 |
| 69.8814 | 98.0 | 2450 | 71.6039 | 0.85 | 0.2639 | 1.3117 | 0.85 | 0.8425 | 0.1343 | 0.0354 |
| 69.8814 | 99.0 | 2475 | 71.6018 | 0.85 | 0.2640 | 1.3122 | 0.85 | 0.8425 | 0.1342 | 0.0356 |
| 69.8448 | 100.0 | 2500 | 71.6043 | 0.85 | 0.2638 | 1.3122 | 0.85 | 0.8425 | 0.1342 | 0.0355 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
marcoi/vit-base-patch16-224-finetuned-flower
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-finetuned-flower
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.24.0
- Pytorch 2.0.1+cu118
- Datasets 2.7.1
- Tokenizers 0.13.3
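For completeness, a minimal inference sketch for this checkpoint, assuming it is published on the Hub under `marcoi/vit-base-patch16-224-finetuned-flower` (the input image path is a placeholder):
```python
from PIL import Image
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="marcoi/vit-base-patch16-224-finetuned-flower",
)
# Prints the top predictions among the five flower classes listed after this card.
print(classifier(Image.open("flower.jpg")))
```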
|
[
"daisy",
"dandelion",
"roses",
"sunflowers",
"tulips"
] |
jordyvl/81-tiny_tobacco3482_og_simkd
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 81-tiny_tobacco3482_og_simkd
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 230.0462
- Accuracy: 0.845
- Brier Loss: 0.2451
- Nll: 1.1250
- F1 Micro: 0.845
- F1 Macro: 0.8350
- Ece: 0.1115
- Aurc: 0.0383
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 50 | 238.2967 | 0.275 | 0.8789 | 6.2995 | 0.275 | 0.1849 | 0.2898 | 0.4790 |
| No log | 2.0 | 100 | 236.7866 | 0.41 | 0.7214 | 3.0130 | 0.41 | 0.2599 | 0.2544 | 0.3985 |
| No log | 3.0 | 150 | 235.3687 | 0.59 | 0.5690 | 2.4483 | 0.59 | 0.4507 | 0.2464 | 0.2336 |
| No log | 4.0 | 200 | 234.7639 | 0.65 | 0.4658 | 2.3318 | 0.65 | 0.4996 | 0.2365 | 0.1474 |
| No log | 5.0 | 250 | 234.0798 | 0.675 | 0.4692 | 2.3351 | 0.675 | 0.5158 | 0.2359 | 0.1335 |
| No log | 6.0 | 300 | 234.2002 | 0.68 | 0.4267 | 2.1570 | 0.68 | 0.5724 | 0.1990 | 0.1124 |
| No log | 7.0 | 350 | 234.1154 | 0.665 | 0.4874 | 2.9269 | 0.665 | 0.5631 | 0.2249 | 0.1259 |
| No log | 8.0 | 400 | 233.2390 | 0.77 | 0.3224 | 1.8874 | 0.7700 | 0.6850 | 0.1754 | 0.0695 |
| No log | 9.0 | 450 | 233.1396 | 0.755 | 0.3583 | 2.1139 | 0.755 | 0.7060 | 0.1573 | 0.0802 |
| 234.3515 | 10.0 | 500 | 233.5895 | 0.705 | 0.4653 | 3.2740 | 0.705 | 0.6813 | 0.2183 | 0.1180 |
| 234.3515 | 11.0 | 550 | 233.0481 | 0.745 | 0.3803 | 2.7330 | 0.745 | 0.6697 | 0.1730 | 0.0936 |
| 234.3515 | 12.0 | 600 | 232.9339 | 0.78 | 0.3319 | 1.8094 | 0.78 | 0.7416 | 0.1700 | 0.0660 |
| 234.3515 | 13.0 | 650 | 233.0181 | 0.755 | 0.3182 | 1.9630 | 0.755 | 0.7596 | 0.1772 | 0.0823 |
| 234.3515 | 14.0 | 700 | 232.4934 | 0.815 | 0.2659 | 1.3869 | 0.815 | 0.7749 | 0.1512 | 0.0511 |
| 234.3515 | 15.0 | 750 | 232.6922 | 0.79 | 0.3361 | 1.8793 | 0.79 | 0.7671 | 0.1633 | 0.0926 |
| 234.3515 | 16.0 | 800 | 232.3391 | 0.815 | 0.2785 | 1.5488 | 0.815 | 0.8038 | 0.1476 | 0.0549 |
| 234.3515 | 17.0 | 850 | 232.5412 | 0.775 | 0.3100 | 2.0192 | 0.775 | 0.7675 | 0.1617 | 0.0502 |
| 234.3515 | 18.0 | 900 | 232.5307 | 0.785 | 0.3211 | 1.6793 | 0.785 | 0.7541 | 0.1679 | 0.0676 |
| 234.3515 | 19.0 | 950 | 232.3055 | 0.825 | 0.2862 | 1.5993 | 0.825 | 0.8204 | 0.1443 | 0.0620 |
| 231.365 | 20.0 | 1000 | 232.5636 | 0.78 | 0.3431 | 2.6048 | 0.78 | 0.7835 | 0.1658 | 0.0847 |
| 231.365 | 21.0 | 1050 | 231.8748 | 0.845 | 0.2737 | 1.4332 | 0.845 | 0.8454 | 0.1616 | 0.0602 |
| 231.365 | 22.0 | 1100 | 232.1032 | 0.825 | 0.2557 | 1.6240 | 0.825 | 0.8290 | 0.1422 | 0.0442 |
| 231.365 | 23.0 | 1150 | 231.7278 | 0.835 | 0.2740 | 1.6260 | 0.835 | 0.8246 | 0.1541 | 0.0613 |
| 231.365 | 24.0 | 1200 | 231.9350 | 0.84 | 0.2636 | 1.6028 | 0.8400 | 0.8410 | 0.1472 | 0.0471 |
| 231.365 | 25.0 | 1250 | 231.6054 | 0.82 | 0.2753 | 1.3835 | 0.82 | 0.8090 | 0.1526 | 0.0478 |
| 231.365 | 26.0 | 1300 | 231.6516 | 0.84 | 0.2737 | 1.3849 | 0.8400 | 0.8354 | 0.1492 | 0.0543 |
| 231.365 | 27.0 | 1350 | 231.5453 | 0.84 | 0.2536 | 1.1384 | 0.8400 | 0.8265 | 0.1363 | 0.0431 |
| 231.365 | 28.0 | 1400 | 231.4833 | 0.85 | 0.2292 | 1.1206 | 0.85 | 0.8350 | 0.1250 | 0.0362 |
| 231.365 | 29.0 | 1450 | 231.3722 | 0.815 | 0.2856 | 1.1706 | 0.815 | 0.8038 | 0.1606 | 0.0486 |
| 230.3328 | 30.0 | 1500 | 231.3517 | 0.84 | 0.2608 | 1.3387 | 0.8400 | 0.8382 | 0.1366 | 0.0483 |
| 230.3328 | 31.0 | 1550 | 231.3705 | 0.815 | 0.2724 | 1.2558 | 0.815 | 0.7992 | 0.1448 | 0.0463 |
| 230.3328 | 32.0 | 1600 | 231.4319 | 0.84 | 0.2588 | 1.0691 | 0.8400 | 0.8301 | 0.1317 | 0.0435 |
| 230.3328 | 33.0 | 1650 | 231.2119 | 0.86 | 0.2323 | 1.1693 | 0.8600 | 0.8609 | 0.1229 | 0.0470 |
| 230.3328 | 34.0 | 1700 | 231.2836 | 0.83 | 0.2477 | 1.1294 | 0.83 | 0.8201 | 0.1398 | 0.0406 |
| 230.3328 | 35.0 | 1750 | 231.2669 | 0.845 | 0.2569 | 1.1508 | 0.845 | 0.8369 | 0.1357 | 0.0449 |
| 230.3328 | 36.0 | 1800 | 231.0634 | 0.85 | 0.2422 | 1.0830 | 0.85 | 0.8404 | 0.1372 | 0.0419 |
| 230.3328 | 37.0 | 1850 | 231.1141 | 0.85 | 0.2398 | 1.0879 | 0.85 | 0.8455 | 0.1347 | 0.0437 |
| 230.3328 | 38.0 | 1900 | 231.0520 | 0.815 | 0.2626 | 1.2325 | 0.815 | 0.8077 | 0.1304 | 0.0478 |
| 230.3328 | 39.0 | 1950 | 230.9089 | 0.83 | 0.2507 | 1.1245 | 0.83 | 0.8258 | 0.1347 | 0.0487 |
| 229.6409 | 40.0 | 2000 | 231.0532 | 0.86 | 0.2207 | 1.1258 | 0.8600 | 0.8464 | 0.1206 | 0.0428 |
| 229.6409 | 41.0 | 2050 | 230.9307 | 0.855 | 0.2350 | 1.1326 | 0.855 | 0.8498 | 0.1284 | 0.0401 |
| 229.6409 | 42.0 | 2100 | 230.8493 | 0.86 | 0.2306 | 1.1075 | 0.8600 | 0.8548 | 0.1441 | 0.0417 |
| 229.6409 | 43.0 | 2150 | 230.7516 | 0.87 | 0.2198 | 1.0312 | 0.87 | 0.8587 | 0.1184 | 0.0404 |
| 229.6409 | 44.0 | 2200 | 230.8540 | 0.85 | 0.2485 | 1.1724 | 0.85 | 0.8444 | 0.1425 | 0.0451 |
| 229.6409 | 45.0 | 2250 | 230.7995 | 0.86 | 0.2284 | 1.1183 | 0.8600 | 0.8490 | 0.1284 | 0.0368 |
| 229.6409 | 46.0 | 2300 | 230.7162 | 0.825 | 0.2701 | 1.1206 | 0.825 | 0.8159 | 0.1390 | 0.0437 |
| 229.6409 | 47.0 | 2350 | 230.5593 | 0.855 | 0.2341 | 1.2242 | 0.855 | 0.8459 | 0.1226 | 0.0392 |
| 229.6409 | 48.0 | 2400 | 230.6472 | 0.86 | 0.2377 | 1.0233 | 0.8600 | 0.8558 | 0.1319 | 0.0354 |
| 229.6409 | 49.0 | 2450 | 230.7080 | 0.84 | 0.2548 | 1.1208 | 0.8400 | 0.8294 | 0.1484 | 0.0426 |
| 229.142 | 50.0 | 2500 | 230.5862 | 0.845 | 0.2543 | 1.2129 | 0.845 | 0.8322 | 0.1358 | 0.0415 |
| 229.142 | 51.0 | 2550 | 230.6550 | 0.845 | 0.2462 | 1.0937 | 0.845 | 0.8333 | 0.1306 | 0.0411 |
| 229.142 | 52.0 | 2600 | 230.5789 | 0.835 | 0.2595 | 1.1393 | 0.835 | 0.8249 | 0.1369 | 0.0428 |
| 229.142 | 53.0 | 2650 | 230.5895 | 0.85 | 0.2519 | 1.0185 | 0.85 | 0.8447 | 0.1263 | 0.0439 |
| 229.142 | 54.0 | 2700 | 230.4955 | 0.86 | 0.2402 | 1.0837 | 0.8600 | 0.8590 | 0.1382 | 0.0394 |
| 229.142 | 55.0 | 2750 | 230.4579 | 0.84 | 0.2560 | 1.1514 | 0.8400 | 0.8312 | 0.1439 | 0.0431 |
| 229.142 | 56.0 | 2800 | 230.5190 | 0.845 | 0.2527 | 1.2868 | 0.845 | 0.8334 | 0.1342 | 0.0406 |
| 229.142 | 57.0 | 2850 | 230.4989 | 0.84 | 0.2536 | 1.0785 | 0.8400 | 0.8278 | 0.1377 | 0.0397 |
| 229.142 | 58.0 | 2900 | 230.4445 | 0.84 | 0.2416 | 1.2104 | 0.8400 | 0.8340 | 0.1339 | 0.0383 |
| 229.142 | 59.0 | 2950 | 230.3159 | 0.87 | 0.2211 | 1.1138 | 0.87 | 0.8630 | 0.1171 | 0.0394 |
| 228.7442 | 60.0 | 3000 | 230.3405 | 0.835 | 0.2603 | 1.1327 | 0.835 | 0.8273 | 0.1422 | 0.0416 |
| 228.7442 | 61.0 | 3050 | 230.3892 | 0.845 | 0.2395 | 1.1619 | 0.845 | 0.8308 | 0.1263 | 0.0377 |
| 228.7442 | 62.0 | 3100 | 230.3728 | 0.84 | 0.2478 | 1.0053 | 0.8400 | 0.8291 | 0.1208 | 0.0389 |
| 228.7442 | 63.0 | 3150 | 230.2486 | 0.845 | 0.2545 | 1.1989 | 0.845 | 0.8438 | 0.1364 | 0.0434 |
| 228.7442 | 64.0 | 3200 | 230.2353 | 0.84 | 0.2512 | 1.1260 | 0.8400 | 0.8330 | 0.1191 | 0.0430 |
| 228.7442 | 65.0 | 3250 | 230.2041 | 0.845 | 0.2429 | 1.1529 | 0.845 | 0.8384 | 0.1343 | 0.0401 |
| 228.7442 | 66.0 | 3300 | 230.2439 | 0.845 | 0.2511 | 1.1125 | 0.845 | 0.8366 | 0.1367 | 0.0430 |
| 228.7442 | 67.0 | 3350 | 230.1510 | 0.85 | 0.2471 | 1.0528 | 0.85 | 0.8457 | 0.1414 | 0.0402 |
| 228.7442 | 68.0 | 3400 | 230.2274 | 0.85 | 0.2455 | 1.1150 | 0.85 | 0.8397 | 0.1311 | 0.0427 |
| 228.7442 | 69.0 | 3450 | 230.2165 | 0.845 | 0.2524 | 1.0517 | 0.845 | 0.8421 | 0.1312 | 0.0410 |
| 228.4757 | 70.0 | 3500 | 230.1976 | 0.835 | 0.2600 | 1.0845 | 0.835 | 0.8258 | 0.1353 | 0.0410 |
| 228.4757 | 71.0 | 3550 | 230.1062 | 0.85 | 0.2487 | 1.1447 | 0.85 | 0.8410 | 0.1297 | 0.0427 |
| 228.4757 | 72.0 | 3600 | 229.9867 | 0.835 | 0.2584 | 1.0641 | 0.835 | 0.8273 | 0.1236 | 0.0440 |
| 228.4757 | 73.0 | 3650 | 230.1918 | 0.845 | 0.2411 | 1.1521 | 0.845 | 0.8363 | 0.1373 | 0.0389 |
| 228.4757 | 74.0 | 3700 | 230.0781 | 0.85 | 0.2524 | 1.0980 | 0.85 | 0.8390 | 0.1298 | 0.0409 |
| 228.4757 | 75.0 | 3750 | 230.1432 | 0.835 | 0.2554 | 1.0967 | 0.835 | 0.8230 | 0.1227 | 0.0407 |
| 228.4757 | 76.0 | 3800 | 230.1512 | 0.84 | 0.2535 | 1.0945 | 0.8400 | 0.8295 | 0.1321 | 0.0422 |
| 228.4757 | 77.0 | 3850 | 230.0682 | 0.84 | 0.2502 | 1.0301 | 0.8400 | 0.8370 | 0.1312 | 0.0403 |
| 228.4757 | 78.0 | 3900 | 230.0357 | 0.835 | 0.2521 | 1.1572 | 0.835 | 0.8293 | 0.1244 | 0.0412 |
| 228.4757 | 79.0 | 3950 | 230.1252 | 0.845 | 0.2509 | 1.0961 | 0.845 | 0.8381 | 0.1273 | 0.0409 |
| 228.2815 | 80.0 | 4000 | 230.0584 | 0.845 | 0.2539 | 1.0795 | 0.845 | 0.8363 | 0.1235 | 0.0432 |
| 228.2815 | 81.0 | 4050 | 229.9967 | 0.85 | 0.2427 | 1.1156 | 0.85 | 0.8382 | 0.1184 | 0.0394 |
| 228.2815 | 82.0 | 4100 | 230.0755 | 0.84 | 0.2563 | 1.0833 | 0.8400 | 0.8302 | 0.1295 | 0.0406 |
| 228.2815 | 83.0 | 4150 | 230.0798 | 0.845 | 0.2477 | 1.1713 | 0.845 | 0.8385 | 0.1259 | 0.0427 |
| 228.2815 | 84.0 | 4200 | 230.0299 | 0.84 | 0.2477 | 1.0907 | 0.8400 | 0.8260 | 0.1213 | 0.0383 |
| 228.2815 | 85.0 | 4250 | 230.0568 | 0.845 | 0.2483 | 1.0763 | 0.845 | 0.8350 | 0.1238 | 0.0409 |
| 228.2815 | 86.0 | 4300 | 230.0743 | 0.85 | 0.2464 | 1.0549 | 0.85 | 0.8418 | 0.1271 | 0.0398 |
| 228.2815 | 87.0 | 4350 | 230.0061 | 0.845 | 0.2505 | 1.1585 | 0.845 | 0.8350 | 0.1312 | 0.0375 |
| 228.2815 | 88.0 | 4400 | 229.9674 | 0.845 | 0.2478 | 1.0763 | 0.845 | 0.8346 | 0.1394 | 0.0410 |
| 228.2815 | 89.0 | 4450 | 229.9697 | 0.85 | 0.2451 | 1.0833 | 0.85 | 0.8406 | 0.1324 | 0.0364 |
| 228.1298 | 90.0 | 4500 | 230.0305 | 0.845 | 0.2496 | 1.1008 | 0.845 | 0.8350 | 0.1308 | 0.0395 |
| 228.1298 | 91.0 | 4550 | 229.9740 | 0.845 | 0.2495 | 1.0605 | 0.845 | 0.8350 | 0.1309 | 0.0413 |
| 228.1298 | 92.0 | 4600 | 229.9962 | 0.85 | 0.2497 | 1.1193 | 0.85 | 0.8408 | 0.1294 | 0.0399 |
| 228.1298 | 93.0 | 4650 | 229.9740 | 0.85 | 0.2491 | 1.0496 | 0.85 | 0.8383 | 0.1270 | 0.0390 |
| 228.1298 | 94.0 | 4700 | 229.9698 | 0.84 | 0.2516 | 1.0644 | 0.8400 | 0.8295 | 0.1430 | 0.0398 |
| 228.1298 | 95.0 | 4750 | 229.9247 | 0.845 | 0.2516 | 1.1705 | 0.845 | 0.8350 | 0.1227 | 0.0371 |
| 228.1298 | 96.0 | 4800 | 230.0451 | 0.84 | 0.2507 | 1.0970 | 0.8400 | 0.8285 | 0.1368 | 0.0391 |
| 228.1298 | 97.0 | 4850 | 229.9402 | 0.85 | 0.2495 | 1.1427 | 0.85 | 0.8430 | 0.1379 | 0.0413 |
| 228.1298 | 98.0 | 4900 | 230.0130 | 0.84 | 0.2532 | 1.0964 | 0.8400 | 0.8285 | 0.1412 | 0.0389 |
| 228.1298 | 99.0 | 4950 | 230.0200 | 0.845 | 0.2482 | 1.0916 | 0.845 | 0.8317 | 0.1190 | 0.0392 |
| 228.0583 | 100.0 | 5000 | 230.0462 | 0.845 | 0.2451 | 1.1250 | 0.845 | 0.8350 | 0.1115 | 0.0383 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/60-tiny_tobacco3482_og_simkd
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 60-tiny_tobacco3482_og_simkd
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 230.9055
- Accuracy: 0.855
- Brier Loss: 0.2199
- Nll: 1.0336
- F1 Micro: 0.855
- F1 Macro: 0.8458
- Ece: 0.1335
- Aurc: 0.0336
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 50 | 238.9631 | 0.285 | 0.8828 | 6.5847 | 0.285 | 0.1874 | 0.2964 | 0.5019 |
| No log | 2.0 | 100 | 237.5668 | 0.385 | 0.7452 | 3.0464 | 0.3850 | 0.2403 | 0.2776 | 0.4072 |
| No log | 3.0 | 150 | 236.0767 | 0.615 | 0.5834 | 2.4900 | 0.615 | 0.4713 | 0.2629 | 0.2189 |
| No log | 4.0 | 200 | 235.7085 | 0.635 | 0.5104 | 2.9259 | 0.635 | 0.4933 | 0.2605 | 0.1700 |
| No log | 5.0 | 250 | 234.8680 | 0.66 | 0.4494 | 2.2358 | 0.66 | 0.4974 | 0.2021 | 0.1292 |
| No log | 6.0 | 300 | 235.0645 | 0.68 | 0.4615 | 2.3164 | 0.68 | 0.5568 | 0.2601 | 0.1346 |
| No log | 7.0 | 350 | 234.3820 | 0.735 | 0.3741 | 1.8624 | 0.735 | 0.6245 | 0.2265 | 0.0894 |
| No log | 8.0 | 400 | 233.9906 | 0.72 | 0.3954 | 1.7623 | 0.72 | 0.6479 | 0.1779 | 0.0980 |
| No log | 9.0 | 450 | 234.0194 | 0.755 | 0.3410 | 1.6963 | 0.755 | 0.6965 | 0.1730 | 0.0718 |
| 234.3621 | 10.0 | 500 | 234.2365 | 0.705 | 0.3859 | 2.4174 | 0.705 | 0.6957 | 0.1759 | 0.0944 |
| 234.3621 | 11.0 | 550 | 233.5580 | 0.76 | 0.3331 | 1.5730 | 0.76 | 0.7135 | 0.1817 | 0.0709 |
| 234.3621 | 12.0 | 600 | 233.3485 | 0.815 | 0.2725 | 1.6179 | 0.815 | 0.7769 | 0.1835 | 0.0497 |
| 234.3621 | 13.0 | 650 | 233.6395 | 0.805 | 0.3038 | 1.7563 | 0.805 | 0.7813 | 0.1976 | 0.0698 |
| 234.3621 | 14.0 | 700 | 233.1443 | 0.805 | 0.2928 | 1.4722 | 0.805 | 0.7779 | 0.1513 | 0.0546 |
| 234.3621 | 15.0 | 750 | 233.1237 | 0.83 | 0.2569 | 1.7889 | 0.83 | 0.8231 | 0.1361 | 0.0460 |
| 234.3621 | 16.0 | 800 | 232.9007 | 0.825 | 0.2675 | 1.7492 | 0.825 | 0.8125 | 0.1742 | 0.0558 |
| 234.3621 | 17.0 | 850 | 233.0284 | 0.79 | 0.2861 | 1.4243 | 0.79 | 0.7698 | 0.1418 | 0.0520 |
| 234.3621 | 18.0 | 900 | 232.9831 | 0.79 | 0.3009 | 1.4085 | 0.79 | 0.7440 | 0.1962 | 0.0643 |
| 234.3621 | 19.0 | 950 | 232.9837 | 0.825 | 0.2618 | 1.4839 | 0.825 | 0.7949 | 0.1444 | 0.0457 |
| 231.5126 | 20.0 | 1000 | 232.9143 | 0.825 | 0.2599 | 1.5299 | 0.825 | 0.8086 | 0.1570 | 0.0463 |
| 231.5126 | 21.0 | 1050 | 232.5251 | 0.835 | 0.2495 | 1.2311 | 0.835 | 0.8279 | 0.1511 | 0.0472 |
| 231.5126 | 22.0 | 1100 | 232.6748 | 0.855 | 0.2165 | 1.2547 | 0.855 | 0.8448 | 0.1450 | 0.0299 |
| 231.5126 | 23.0 | 1150 | 232.6610 | 0.83 | 0.2450 | 1.2944 | 0.83 | 0.8113 | 0.1545 | 0.0403 |
| 231.5126 | 24.0 | 1200 | 232.7660 | 0.83 | 0.2480 | 1.4783 | 0.83 | 0.8101 | 0.1645 | 0.0397 |
| 231.5126 | 25.0 | 1250 | 232.5843 | 0.855 | 0.2336 | 1.1227 | 0.855 | 0.8303 | 0.1672 | 0.0397 |
| 231.5126 | 26.0 | 1300 | 232.3482 | 0.84 | 0.2321 | 1.1720 | 0.8400 | 0.8346 | 0.1527 | 0.0380 |
| 231.5126 | 27.0 | 1350 | 232.3758 | 0.84 | 0.2353 | 1.1684 | 0.8400 | 0.8327 | 0.1520 | 0.0372 |
| 231.5126 | 28.0 | 1400 | 232.3022 | 0.82 | 0.2460 | 1.0888 | 0.82 | 0.7920 | 0.1746 | 0.0392 |
| 231.5126 | 29.0 | 1450 | 232.1077 | 0.845 | 0.2355 | 1.3720 | 0.845 | 0.8306 | 0.1314 | 0.0372 |
| 230.5624 | 30.0 | 1500 | 232.3631 | 0.825 | 0.2443 | 1.3202 | 0.825 | 0.7971 | 0.1404 | 0.0373 |
| 230.5624 | 31.0 | 1550 | 232.1099 | 0.84 | 0.2509 | 1.2935 | 0.8400 | 0.8199 | 0.1552 | 0.0453 |
| 230.5624 | 32.0 | 1600 | 232.1548 | 0.855 | 0.2275 | 1.0948 | 0.855 | 0.8360 | 0.1428 | 0.0347 |
| 230.5624 | 33.0 | 1650 | 232.0168 | 0.84 | 0.2284 | 1.1303 | 0.8400 | 0.8260 | 0.1460 | 0.0374 |
| 230.5624 | 34.0 | 1700 | 232.1122 | 0.875 | 0.2154 | 1.3044 | 0.875 | 0.8612 | 0.1410 | 0.0307 |
| 230.5624 | 35.0 | 1750 | 232.0452 | 0.84 | 0.2382 | 1.1732 | 0.8400 | 0.8199 | 0.1218 | 0.0376 |
| 230.5624 | 36.0 | 1800 | 231.8549 | 0.835 | 0.2390 | 1.1322 | 0.835 | 0.8227 | 0.1478 | 0.0375 |
| 230.5624 | 37.0 | 1850 | 231.8938 | 0.845 | 0.2558 | 1.0115 | 0.845 | 0.8261 | 0.1730 | 0.0447 |
| 230.5624 | 38.0 | 1900 | 231.8979 | 0.85 | 0.2292 | 1.1142 | 0.85 | 0.8361 | 0.1359 | 0.0348 |
| 230.5624 | 39.0 | 1950 | 231.7455 | 0.84 | 0.2217 | 1.0746 | 0.8400 | 0.8185 | 0.1294 | 0.0309 |
| 229.9089 | 40.0 | 2000 | 231.8694 | 0.85 | 0.2200 | 1.0529 | 0.85 | 0.8334 | 0.1435 | 0.0343 |
| 229.9089 | 41.0 | 2050 | 231.8044 | 0.84 | 0.2317 | 0.9332 | 0.8400 | 0.8204 | 0.1432 | 0.0337 |
| 229.9089 | 42.0 | 2100 | 231.6414 | 0.855 | 0.2165 | 1.1356 | 0.855 | 0.8393 | 0.1357 | 0.0321 |
| 229.9089 | 43.0 | 2150 | 231.5806 | 0.835 | 0.2378 | 1.0314 | 0.835 | 0.8124 | 0.1443 | 0.0382 |
| 229.9089 | 44.0 | 2200 | 231.6199 | 0.855 | 0.2287 | 1.0907 | 0.855 | 0.8463 | 0.1196 | 0.0362 |
| 229.9089 | 45.0 | 2250 | 231.5991 | 0.85 | 0.2208 | 1.0967 | 0.85 | 0.8350 | 0.1321 | 0.0339 |
| 229.9089 | 46.0 | 2300 | 231.5103 | 0.85 | 0.2249 | 1.0330 | 0.85 | 0.8270 | 0.1239 | 0.0332 |
| 229.9089 | 47.0 | 2350 | 231.4252 | 0.87 | 0.2126 | 1.1054 | 0.87 | 0.8618 | 0.1230 | 0.0312 |
| 229.9089 | 48.0 | 2400 | 231.4696 | 0.86 | 0.2136 | 1.0952 | 0.8600 | 0.8503 | 0.1304 | 0.0302 |
| 229.9089 | 49.0 | 2450 | 231.5416 | 0.84 | 0.2329 | 1.0155 | 0.8400 | 0.8256 | 0.1381 | 0.0356 |
| 229.4364 | 50.0 | 2500 | 231.4932 | 0.84 | 0.2215 | 1.1382 | 0.8400 | 0.8177 | 0.1557 | 0.0319 |
| 229.4364 | 51.0 | 2550 | 231.4270 | 0.84 | 0.2312 | 1.0191 | 0.8400 | 0.8253 | 0.1376 | 0.0371 |
| 229.4364 | 52.0 | 2600 | 231.3520 | 0.85 | 0.2233 | 1.2815 | 0.85 | 0.8289 | 0.1444 | 0.0334 |
| 229.4364 | 53.0 | 2650 | 231.3922 | 0.86 | 0.2223 | 1.0950 | 0.8600 | 0.8423 | 0.1309 | 0.0334 |
| 229.4364 | 54.0 | 2700 | 231.3504 | 0.855 | 0.2171 | 1.0165 | 0.855 | 0.8422 | 0.1304 | 0.0313 |
| 229.4364 | 55.0 | 2750 | 231.2676 | 0.87 | 0.2129 | 1.0704 | 0.87 | 0.8598 | 0.1334 | 0.0320 |
| 229.4364 | 56.0 | 2800 | 231.2823 | 0.84 | 0.2390 | 1.0982 | 0.8400 | 0.8226 | 0.1187 | 0.0372 |
| 229.4364 | 57.0 | 2850 | 231.2740 | 0.85 | 0.2251 | 0.9521 | 0.85 | 0.8271 | 0.1388 | 0.0320 |
| 229.4364 | 58.0 | 2900 | 231.2784 | 0.85 | 0.2284 | 1.0194 | 0.85 | 0.8320 | 0.1306 | 0.0360 |
| 229.4364 | 59.0 | 2950 | 231.2078 | 0.84 | 0.2265 | 1.0036 | 0.8400 | 0.8253 | 0.1359 | 0.0366 |
| 229.043 | 60.0 | 3000 | 231.2086 | 0.85 | 0.2380 | 1.0247 | 0.85 | 0.8391 | 0.1395 | 0.0374 |
| 229.043 | 61.0 | 3050 | 231.2673 | 0.845 | 0.2410 | 1.0205 | 0.845 | 0.8272 | 0.1466 | 0.0389 |
| 229.043 | 62.0 | 3100 | 231.1900 | 0.855 | 0.2219 | 1.0835 | 0.855 | 0.8449 | 0.1351 | 0.0346 |
| 229.043 | 63.0 | 3150 | 231.0561 | 0.845 | 0.2417 | 0.9740 | 0.845 | 0.8332 | 0.1503 | 0.0405 |
| 229.043 | 64.0 | 3200 | 231.1282 | 0.845 | 0.2387 | 1.1105 | 0.845 | 0.8270 | 0.1198 | 0.0379 |
| 229.043 | 65.0 | 3250 | 231.0782 | 0.85 | 0.2334 | 0.9838 | 0.85 | 0.8403 | 0.1282 | 0.0356 |
| 229.043 | 66.0 | 3300 | 231.0704 | 0.84 | 0.2442 | 1.0380 | 0.8400 | 0.8275 | 0.1543 | 0.0410 |
| 229.043 | 67.0 | 3350 | 231.0450 | 0.85 | 0.2246 | 1.0023 | 0.85 | 0.8394 | 0.1212 | 0.0353 |
| 229.043 | 68.0 | 3400 | 231.1017 | 0.85 | 0.2257 | 1.0437 | 0.85 | 0.8334 | 0.1232 | 0.0350 |
| 229.043 | 69.0 | 3450 | 231.0068 | 0.85 | 0.2321 | 1.0075 | 0.85 | 0.8445 | 0.1271 | 0.0361 |
| 228.7748 | 70.0 | 3500 | 231.0666 | 0.85 | 0.2274 | 1.0133 | 0.85 | 0.8382 | 0.1334 | 0.0361 |
| 228.7748 | 71.0 | 3550 | 230.9450 | 0.85 | 0.2417 | 1.0738 | 0.85 | 0.8356 | 0.1294 | 0.0380 |
| 228.7748 | 72.0 | 3600 | 230.7952 | 0.85 | 0.2379 | 0.9779 | 0.85 | 0.8391 | 0.1309 | 0.0393 |
| 228.7748 | 73.0 | 3650 | 231.0920 | 0.86 | 0.2188 | 1.0154 | 0.8600 | 0.8538 | 0.1230 | 0.0335 |
| 228.7748 | 74.0 | 3700 | 230.9152 | 0.855 | 0.2408 | 1.1637 | 0.855 | 0.8486 | 0.1490 | 0.0400 |
| 228.7748 | 75.0 | 3750 | 230.9537 | 0.85 | 0.2195 | 1.0135 | 0.85 | 0.8301 | 0.1131 | 0.0321 |
| 228.7748 | 76.0 | 3800 | 230.9977 | 0.855 | 0.2208 | 1.0136 | 0.855 | 0.8484 | 0.1296 | 0.0334 |
| 228.7748 | 77.0 | 3850 | 230.9619 | 0.855 | 0.2348 | 1.0158 | 0.855 | 0.8526 | 0.1346 | 0.0371 |
| 228.7748 | 78.0 | 3900 | 230.9416 | 0.84 | 0.2315 | 1.0372 | 0.8400 | 0.8219 | 0.1290 | 0.0353 |
| 228.7748 | 79.0 | 3950 | 231.0093 | 0.85 | 0.2196 | 1.0981 | 0.85 | 0.8318 | 0.1380 | 0.0335 |
| 228.5779 | 80.0 | 4000 | 230.9455 | 0.845 | 0.2290 | 1.0193 | 0.845 | 0.8332 | 0.1459 | 0.0350 |
| 228.5779 | 81.0 | 4050 | 230.8672 | 0.845 | 0.2184 | 1.0164 | 0.845 | 0.8309 | 0.1560 | 0.0322 |
| 228.5779 | 82.0 | 4100 | 230.9410 | 0.855 | 0.2282 | 1.0116 | 0.855 | 0.8486 | 0.1309 | 0.0349 |
| 228.5779 | 83.0 | 4150 | 230.9393 | 0.855 | 0.2258 | 1.0168 | 0.855 | 0.8498 | 0.1227 | 0.0355 |
| 228.5779 | 84.0 | 4200 | 230.8770 | 0.845 | 0.2204 | 1.0105 | 0.845 | 0.8303 | 0.1259 | 0.0315 |
| 228.5779 | 85.0 | 4250 | 230.9236 | 0.845 | 0.2271 | 1.0079 | 0.845 | 0.8332 | 0.1274 | 0.0353 |
| 228.5779 | 86.0 | 4300 | 230.9332 | 0.845 | 0.2237 | 1.0410 | 0.845 | 0.8288 | 0.1480 | 0.0340 |
| 228.5779 | 87.0 | 4350 | 230.8607 | 0.845 | 0.2273 | 1.0343 | 0.845 | 0.8313 | 0.1070 | 0.0338 |
| 228.5779 | 88.0 | 4400 | 230.7832 | 0.85 | 0.2434 | 1.1399 | 0.85 | 0.8432 | 0.1483 | 0.0407 |
| 228.5779 | 89.0 | 4450 | 230.8136 | 0.85 | 0.2263 | 1.0002 | 0.85 | 0.8367 | 0.1283 | 0.0343 |
| 228.4285 | 90.0 | 4500 | 230.9154 | 0.86 | 0.2243 | 1.0189 | 0.8600 | 0.8552 | 0.1319 | 0.0332 |
| 228.4285 | 91.0 | 4550 | 230.8125 | 0.86 | 0.2320 | 0.9955 | 0.8600 | 0.8552 | 0.1437 | 0.0361 |
| 228.4285 | 92.0 | 4600 | 230.8634 | 0.855 | 0.2238 | 1.0341 | 0.855 | 0.8477 | 0.1252 | 0.0329 |
| 228.4285 | 93.0 | 4650 | 230.8527 | 0.84 | 0.2326 | 0.9819 | 0.8400 | 0.8242 | 0.1248 | 0.0362 |
| 228.4285 | 94.0 | 4700 | 230.8176 | 0.845 | 0.2347 | 1.0080 | 0.845 | 0.8332 | 0.1356 | 0.0369 |
| 228.4285 | 95.0 | 4750 | 230.7853 | 0.85 | 0.2286 | 1.0163 | 0.85 | 0.8445 | 0.1176 | 0.0356 |
| 228.4285 | 96.0 | 4800 | 230.9345 | 0.855 | 0.2218 | 1.0151 | 0.855 | 0.8458 | 0.1252 | 0.0328 |
| 228.4285 | 97.0 | 4850 | 230.8020 | 0.855 | 0.2300 | 1.0166 | 0.855 | 0.8462 | 0.1424 | 0.0349 |
| 228.4285 | 98.0 | 4900 | 230.8873 | 0.855 | 0.2240 | 1.0244 | 0.855 | 0.8477 | 0.1309 | 0.0343 |
| 228.4285 | 99.0 | 4950 | 230.8796 | 0.855 | 0.2276 | 1.0084 | 0.855 | 0.8498 | 0.1254 | 0.0344 |
| 228.3542 | 100.0 | 5000 | 230.9055 | 0.855 | 0.2199 | 1.0336 | 0.855 | 0.8458 | 0.1335 | 0.0336 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/39-tiny_tobacco3482_og_simkd
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 39-tiny_tobacco3482_og_simkd
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set (a sketch of the Brier loss and NLL metrics follows the list):
- Loss: 239.0946
- Accuracy: 0.835
- Brier Loss: 0.3084
- Nll: 1.2679
- F1 Micro: 0.835
- F1 Macro: 0.8047
- Ece: 0.2467
- Aurc: 0.0448
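The Brier loss and NLL values above are averages over the evaluation set; sketched out below (not the exact code used here), with `probs` an `(N, 10)` softmax array and `labels` the true class indices:
```python
import numpy as np

def brier_loss(probs, labels):
    """Mean squared distance between the predicted distribution and the
    one-hot target (summed over classes, averaged over samples)."""
    one_hot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - one_hot) ** 2, axis=1))

def negative_log_likelihood(probs, labels, eps=1e-12):
    """Average negative log-probability assigned to the true class."""
    true_class_probs = probs[np.arange(len(labels)), labels]
    return -np.mean(np.log(true_class_probs + eps))
```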
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 50 | 246.9268 | 0.22 | 0.8877 | 7.2845 | 0.22 | 0.1442 | 0.2508 | 0.5734 |
| No log | 2.0 | 100 | 245.5493 | 0.355 | 0.7851 | 4.2469 | 0.3550 | 0.2408 | 0.2761 | 0.4302 |
| No log | 3.0 | 150 | 244.0258 | 0.58 | 0.6569 | 2.7197 | 0.58 | 0.4698 | 0.3335 | 0.2529 |
| No log | 4.0 | 200 | 243.8010 | 0.615 | 0.5800 | 3.1495 | 0.615 | 0.4701 | 0.2865 | 0.1927 |
| No log | 5.0 | 250 | 242.9670 | 0.64 | 0.5193 | 2.2510 | 0.64 | 0.4843 | 0.2774 | 0.1631 |
| No log | 6.0 | 300 | 243.1274 | 0.62 | 0.5595 | 2.8430 | 0.62 | 0.4881 | 0.2980 | 0.1558 |
| No log | 7.0 | 350 | 242.1450 | 0.675 | 0.4620 | 2.2479 | 0.675 | 0.5389 | 0.2322 | 0.1266 |
| No log | 8.0 | 400 | 242.5806 | 0.63 | 0.5222 | 2.6597 | 0.63 | 0.5539 | 0.2649 | 0.1584 |
| No log | 9.0 | 450 | 241.9880 | 0.755 | 0.4313 | 1.8660 | 0.755 | 0.6923 | 0.2780 | 0.1028 |
| 241.4671 | 10.0 | 500 | 241.9459 | 0.73 | 0.4274 | 1.9762 | 0.7300 | 0.6550 | 0.2539 | 0.0843 |
| 241.4671 | 11.0 | 550 | 241.4543 | 0.74 | 0.4410 | 1.7736 | 0.74 | 0.6867 | 0.2589 | 0.1197 |
| 241.4671 | 12.0 | 600 | 241.5311 | 0.79 | 0.3750 | 1.6579 | 0.79 | 0.7193 | 0.2754 | 0.0582 |
| 241.4671 | 13.0 | 650 | 241.6211 | 0.77 | 0.4231 | 1.4441 | 0.7700 | 0.7350 | 0.3134 | 0.0761 |
| 241.4671 | 14.0 | 700 | 241.2984 | 0.795 | 0.3898 | 1.5615 | 0.795 | 0.7650 | 0.2879 | 0.0721 |
| 241.4671 | 15.0 | 750 | 241.2752 | 0.81 | 0.3641 | 1.5167 | 0.81 | 0.7692 | 0.2956 | 0.0515 |
| 241.4671 | 16.0 | 800 | 240.9899 | 0.78 | 0.3821 | 1.7296 | 0.78 | 0.7434 | 0.2935 | 0.0688 |
| 241.4671 | 17.0 | 850 | 241.3208 | 0.81 | 0.3850 | 1.6487 | 0.81 | 0.7628 | 0.2946 | 0.0657 |
| 241.4671 | 18.0 | 900 | 241.0431 | 0.82 | 0.3678 | 1.5667 | 0.82 | 0.7701 | 0.3109 | 0.0504 |
| 241.4671 | 19.0 | 950 | 240.9436 | 0.815 | 0.3615 | 1.1826 | 0.815 | 0.7624 | 0.3019 | 0.0557 |
| 238.6853 | 20.0 | 1000 | 240.7993 | 0.83 | 0.3475 | 1.2078 | 0.83 | 0.7887 | 0.3062 | 0.0541 |
| 238.6853 | 21.0 | 1050 | 240.6124 | 0.825 | 0.3577 | 1.1927 | 0.825 | 0.7956 | 0.2883 | 0.0560 |
| 238.6853 | 22.0 | 1100 | 240.7105 | 0.815 | 0.3535 | 1.4564 | 0.815 | 0.7550 | 0.2814 | 0.0553 |
| 238.6853 | 23.0 | 1150 | 240.6886 | 0.815 | 0.3584 | 1.1556 | 0.815 | 0.7815 | 0.2993 | 0.0564 |
| 238.6853 | 24.0 | 1200 | 240.5978 | 0.84 | 0.3451 | 1.1351 | 0.8400 | 0.8002 | 0.2937 | 0.0485 |
| 238.6853 | 25.0 | 1250 | 240.3825 | 0.815 | 0.3453 | 1.2408 | 0.815 | 0.7918 | 0.2853 | 0.0525 |
| 238.6853 | 26.0 | 1300 | 240.2633 | 0.83 | 0.3431 | 1.3299 | 0.83 | 0.7883 | 0.2715 | 0.0562 |
| 238.6853 | 27.0 | 1350 | 240.5535 | 0.82 | 0.3475 | 1.4406 | 0.82 | 0.7874 | 0.2897 | 0.0536 |
| 238.6853 | 28.0 | 1400 | 240.3554 | 0.835 | 0.3447 | 1.2483 | 0.835 | 0.7947 | 0.3078 | 0.0464 |
| 238.6853 | 29.0 | 1450 | 240.2271 | 0.82 | 0.3274 | 1.5224 | 0.82 | 0.7862 | 0.2691 | 0.0467 |
| 237.7411 | 30.0 | 1500 | 240.2261 | 0.825 | 0.3388 | 1.2921 | 0.825 | 0.7939 | 0.2913 | 0.0485 |
| 237.7411 | 31.0 | 1550 | 240.4772 | 0.83 | 0.3474 | 1.3063 | 0.83 | 0.7823 | 0.2995 | 0.0516 |
| 237.7411 | 32.0 | 1600 | 240.2594 | 0.85 | 0.3509 | 1.3993 | 0.85 | 0.8184 | 0.3142 | 0.0496 |
| 237.7411 | 33.0 | 1650 | 239.9501 | 0.805 | 0.3413 | 1.2018 | 0.805 | 0.7630 | 0.2619 | 0.0592 |
| 237.7411 | 34.0 | 1700 | 240.2149 | 0.86 | 0.3409 | 1.2633 | 0.8600 | 0.8490 | 0.3229 | 0.0467 |
| 237.7411 | 35.0 | 1750 | 240.0215 | 0.82 | 0.3264 | 1.1435 | 0.82 | 0.7705 | 0.2538 | 0.0499 |
| 237.7411 | 36.0 | 1800 | 239.9490 | 0.85 | 0.3204 | 1.4364 | 0.85 | 0.8147 | 0.2752 | 0.0432 |
| 237.7411 | 37.0 | 1850 | 239.9360 | 0.825 | 0.3262 | 1.1487 | 0.825 | 0.7921 | 0.2706 | 0.0482 |
| 237.7411 | 38.0 | 1900 | 240.0647 | 0.855 | 0.3322 | 1.1655 | 0.855 | 0.8339 | 0.3053 | 0.0460 |
| 237.7411 | 39.0 | 1950 | 239.7761 | 0.825 | 0.3237 | 1.2252 | 0.825 | 0.7862 | 0.2732 | 0.0499 |
| 237.1023 | 40.0 | 2000 | 239.9426 | 0.83 | 0.3220 | 1.2096 | 0.83 | 0.7906 | 0.2748 | 0.0457 |
| 237.1023 | 41.0 | 2050 | 239.8765 | 0.835 | 0.3241 | 1.3547 | 0.835 | 0.7929 | 0.2661 | 0.0462 |
| 237.1023 | 42.0 | 2100 | 239.8423 | 0.835 | 0.3329 | 1.2423 | 0.835 | 0.8000 | 0.2955 | 0.0527 |
| 237.1023 | 43.0 | 2150 | 239.6674 | 0.83 | 0.3215 | 1.1217 | 0.83 | 0.8069 | 0.2597 | 0.0486 |
| 237.1023 | 44.0 | 2200 | 239.7237 | 0.81 | 0.3206 | 1.1322 | 0.81 | 0.7724 | 0.2500 | 0.0473 |
| 237.1023 | 45.0 | 2250 | 239.7981 | 0.82 | 0.3263 | 1.4167 | 0.82 | 0.7754 | 0.2786 | 0.0485 |
| 237.1023 | 46.0 | 2300 | 239.4825 | 0.83 | 0.3149 | 1.2610 | 0.83 | 0.7879 | 0.2436 | 0.0487 |
| 237.1023 | 47.0 | 2350 | 239.6016 | 0.83 | 0.3195 | 1.2602 | 0.83 | 0.7903 | 0.2631 | 0.0529 |
| 237.1023 | 48.0 | 2400 | 239.6693 | 0.84 | 0.3248 | 1.2935 | 0.8400 | 0.7992 | 0.2581 | 0.0505 |
| 237.1023 | 49.0 | 2450 | 239.6932 | 0.845 | 0.3235 | 1.1698 | 0.845 | 0.8024 | 0.2600 | 0.0459 |
| 236.6301 | 50.0 | 2500 | 239.5297 | 0.845 | 0.3151 | 1.4464 | 0.845 | 0.8057 | 0.2775 | 0.0467 |
| 236.6301 | 51.0 | 2550 | 239.6262 | 0.83 | 0.3236 | 1.2198 | 0.83 | 0.7879 | 0.2680 | 0.0466 |
| 236.6301 | 52.0 | 2600 | 239.4326 | 0.855 | 0.3111 | 1.2320 | 0.855 | 0.8338 | 0.2785 | 0.0452 |
| 236.6301 | 53.0 | 2650 | 239.5918 | 0.825 | 0.3171 | 1.2295 | 0.825 | 0.7966 | 0.2517 | 0.0440 |
| 236.6301 | 54.0 | 2700 | 239.5557 | 0.845 | 0.3215 | 1.1543 | 0.845 | 0.8027 | 0.2723 | 0.0453 |
| 236.6301 | 55.0 | 2750 | 239.4767 | 0.83 | 0.3113 | 1.0283 | 0.83 | 0.8107 | 0.2457 | 0.0446 |
| 236.6301 | 56.0 | 2800 | 239.3879 | 0.85 | 0.3112 | 1.2353 | 0.85 | 0.8146 | 0.2547 | 0.0449 |
| 236.6301 | 57.0 | 2850 | 239.3733 | 0.85 | 0.3104 | 1.3730 | 0.85 | 0.8236 | 0.2593 | 0.0451 |
| 236.6301 | 58.0 | 2900 | 239.4988 | 0.835 | 0.3174 | 1.3295 | 0.835 | 0.7984 | 0.2572 | 0.0468 |
| 236.6301 | 59.0 | 2950 | 239.3514 | 0.815 | 0.3103 | 1.2168 | 0.815 | 0.7746 | 0.2382 | 0.0453 |
| 236.2404 | 60.0 | 3000 | 239.3031 | 0.835 | 0.3083 | 1.2398 | 0.835 | 0.7944 | 0.2577 | 0.0479 |
| 236.2404 | 61.0 | 3050 | 239.3242 | 0.855 | 0.3095 | 1.2921 | 0.855 | 0.8307 | 0.2739 | 0.0438 |
| 236.2404 | 62.0 | 3100 | 239.3217 | 0.815 | 0.3135 | 1.2385 | 0.815 | 0.7799 | 0.2559 | 0.0481 |
| 236.2404 | 63.0 | 3150 | 239.2695 | 0.835 | 0.3102 | 1.0724 | 0.835 | 0.8068 | 0.2276 | 0.0471 |
| 236.2404 | 64.0 | 3200 | 239.3596 | 0.83 | 0.3124 | 1.1534 | 0.83 | 0.7911 | 0.2440 | 0.0442 |
| 236.2404 | 65.0 | 3250 | 239.2498 | 0.825 | 0.3097 | 1.2191 | 0.825 | 0.7804 | 0.2450 | 0.0459 |
| 236.2404 | 66.0 | 3300 | 239.1448 | 0.82 | 0.3020 | 1.0196 | 0.82 | 0.7825 | 0.2559 | 0.0452 |
| 236.2404 | 67.0 | 3350 | 239.1730 | 0.825 | 0.3017 | 1.1825 | 0.825 | 0.8065 | 0.2494 | 0.0444 |
| 236.2404 | 68.0 | 3400 | 239.3466 | 0.835 | 0.3159 | 1.2392 | 0.835 | 0.7973 | 0.2617 | 0.0448 |
| 236.2404 | 69.0 | 3450 | 239.1867 | 0.825 | 0.3058 | 1.1611 | 0.825 | 0.7940 | 0.2596 | 0.0450 |
| 235.9626 | 70.0 | 3500 | 239.2792 | 0.84 | 0.3106 | 1.1812 | 0.8400 | 0.8134 | 0.2590 | 0.0444 |
| 235.9626 | 71.0 | 3550 | 239.0603 | 0.82 | 0.3055 | 1.1710 | 0.82 | 0.7895 | 0.2251 | 0.0466 |
| 235.9626 | 72.0 | 3600 | 238.9947 | 0.815 | 0.3069 | 1.1766 | 0.815 | 0.7905 | 0.2315 | 0.0485 |
| 235.9626 | 73.0 | 3650 | 239.2322 | 0.83 | 0.3108 | 1.1724 | 0.83 | 0.8015 | 0.2595 | 0.0450 |
| 235.9626 | 74.0 | 3700 | 239.0134 | 0.825 | 0.3087 | 1.2661 | 0.825 | 0.7854 | 0.2352 | 0.0467 |
| 235.9626 | 75.0 | 3750 | 239.1055 | 0.825 | 0.3090 | 1.1748 | 0.825 | 0.7946 | 0.2483 | 0.0458 |
| 235.9626 | 76.0 | 3800 | 239.0925 | 0.825 | 0.3133 | 1.1843 | 0.825 | 0.7918 | 0.2466 | 0.0490 |
| 235.9626 | 77.0 | 3850 | 239.1586 | 0.835 | 0.3115 | 1.1877 | 0.835 | 0.8140 | 0.2498 | 0.0455 |
| 235.9626 | 78.0 | 3900 | 239.1394 | 0.83 | 0.3103 | 1.2698 | 0.83 | 0.7897 | 0.2424 | 0.0467 |
| 235.9626 | 79.0 | 3950 | 239.2314 | 0.83 | 0.3121 | 1.2519 | 0.83 | 0.7938 | 0.2378 | 0.0453 |
| 235.7667 | 80.0 | 4000 | 239.1433 | 0.83 | 0.3076 | 1.1725 | 0.83 | 0.7924 | 0.2412 | 0.0459 |
| 235.7667 | 81.0 | 4050 | 239.0533 | 0.83 | 0.3026 | 1.2551 | 0.83 | 0.7994 | 0.2661 | 0.0448 |
| 235.7667 | 82.0 | 4100 | 239.0847 | 0.825 | 0.3123 | 1.1798 | 0.825 | 0.7964 | 0.2611 | 0.0483 |
| 235.7667 | 83.0 | 4150 | 239.1199 | 0.835 | 0.3089 | 1.2642 | 0.835 | 0.8047 | 0.2551 | 0.0471 |
| 235.7667 | 84.0 | 4200 | 239.0148 | 0.835 | 0.3055 | 1.1264 | 0.835 | 0.7975 | 0.2341 | 0.0439 |
| 235.7667 | 85.0 | 4250 | 239.0657 | 0.825 | 0.3107 | 1.1645 | 0.825 | 0.7924 | 0.2363 | 0.0452 |
| 235.7667 | 86.0 | 4300 | 239.1023 | 0.83 | 0.3134 | 1.2785 | 0.83 | 0.7949 | 0.2353 | 0.0462 |
| 235.7667 | 87.0 | 4350 | 239.0670 | 0.83 | 0.3028 | 1.2849 | 0.83 | 0.7927 | 0.2295 | 0.0428 |
| 235.7667 | 88.0 | 4400 | 239.0111 | 0.82 | 0.3097 | 1.1759 | 0.82 | 0.7865 | 0.2447 | 0.0462 |
| 235.7667 | 89.0 | 4450 | 238.9677 | 0.845 | 0.2995 | 1.1189 | 0.845 | 0.8231 | 0.2402 | 0.0423 |
| 235.6149 | 90.0 | 4500 | 239.1255 | 0.83 | 0.3063 | 1.1282 | 0.83 | 0.8028 | 0.2695 | 0.0431 |
| 235.6149 | 91.0 | 4550 | 238.9880 | 0.815 | 0.3102 | 1.1766 | 0.815 | 0.7851 | 0.2388 | 0.0486 |
| 235.6149 | 92.0 | 4600 | 239.0212 | 0.84 | 0.3079 | 1.1941 | 0.8400 | 0.8121 | 0.2627 | 0.0442 |
| 235.6149 | 93.0 | 4650 | 238.9948 | 0.82 | 0.3077 | 1.1702 | 0.82 | 0.7874 | 0.2533 | 0.0468 |
| 235.6149 | 94.0 | 4700 | 239.0357 | 0.82 | 0.3103 | 1.1823 | 0.82 | 0.7878 | 0.2639 | 0.0471 |
| 235.6149 | 95.0 | 4750 | 238.9936 | 0.81 | 0.2995 | 1.1251 | 0.81 | 0.7738 | 0.2519 | 0.0447 |
| 235.6149 | 96.0 | 4800 | 239.1319 | 0.835 | 0.3138 | 1.1957 | 0.835 | 0.8045 | 0.2806 | 0.0455 |
| 235.6149 | 97.0 | 4850 | 239.0160 | 0.815 | 0.3061 | 1.1852 | 0.815 | 0.7756 | 0.2281 | 0.0453 |
| 235.6149 | 98.0 | 4900 | 239.1168 | 0.83 | 0.3152 | 1.2637 | 0.83 | 0.7997 | 0.2509 | 0.0474 |
| 235.6149 | 99.0 | 4950 | 239.0691 | 0.83 | 0.3085 | 1.1799 | 0.83 | 0.7998 | 0.2524 | 0.0439 |
| 235.5384 | 100.0 | 5000 | 239.0946 | 0.835 | 0.3084 | 1.2679 | 0.835 | 0.8047 | 0.2467 | 0.0448 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/18-tiny_tobacco3482_og_simkd
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 18-tiny_tobacco3482_og_simkd
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set (an assumed AURC formulation follows the list):
- Loss: 269.3729
- Accuracy: 0.36
- Brier Loss: 0.8320
- Nll: 4.3971
- F1 Micro: 0.36
- F1 Macro: 0.2725
- Ece: 0.3539
- Aurc: 0.6748
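AURC is usually the area under the risk–coverage curve: how the error rate behaves as the least confident predictions are progressively rejected. The variant used here is not documented, so the following is an assumed but common formulation:
```python
import numpy as np

def aurc(probs, labels):
    """Assumed AURC: sort predictions by confidence (most confident first)
    and average the cumulative error rate over all coverage levels."""
    confidences = probs.max(axis=1)
    errors = (probs.argmax(axis=1) != labels).astype(float)
    order = np.argsort(-confidences)
    risks = np.cumsum(errors[order]) / np.arange(1, len(labels) + 1)
    return risks.mean()
```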
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 50 | 278.3639 | 0.075 | 0.8950 | 9.1030 | 0.075 | 0.0452 | 0.1624 | 0.8470 |
| No log | 2.0 | 100 | 276.7339 | 0.095 | 0.8890 | 8.6785 | 0.095 | 0.0820 | 0.1806 | 0.9095 |
| No log | 3.0 | 150 | 275.0460 | 0.16 | 0.8755 | 6.5857 | 0.16 | 0.0866 | 0.2296 | 0.8951 |
| No log | 4.0 | 200 | 274.4719 | 0.33 | 0.8662 | 6.6801 | 0.33 | 0.1980 | 0.3166 | 0.5931 |
| No log | 5.0 | 250 | 273.5627 | 0.23 | 0.8660 | 6.4504 | 0.23 | 0.1193 | 0.2606 | 0.7477 |
| No log | 6.0 | 300 | 273.5152 | 0.22 | 0.8622 | 6.1576 | 0.22 | 0.1323 | 0.2578 | 0.6395 |
| No log | 7.0 | 350 | 273.0874 | 0.2 | 0.8584 | 5.8986 | 0.2000 | 0.1249 | 0.2508 | 0.8129 |
| No log | 8.0 | 400 | 273.7433 | 0.19 | 0.8764 | 6.0516 | 0.19 | 0.1554 | 0.2550 | 0.8545 |
| No log | 9.0 | 450 | 272.4757 | 0.185 | 0.8533 | 5.9072 | 0.185 | 0.1340 | 0.2529 | 0.7360 |
| 273.9531 | 10.0 | 500 | 272.7899 | 0.14 | 0.8669 | 6.3132 | 0.14 | 0.1080 | 0.2373 | 0.8882 |
| 273.9531 | 11.0 | 550 | 271.7198 | 0.245 | 0.8484 | 4.7561 | 0.245 | 0.1370 | 0.2741 | 0.7203 |
| 273.9531 | 12.0 | 600 | 272.7391 | 0.095 | 0.8761 | 6.3636 | 0.095 | 0.0930 | 0.2120 | 0.9142 |
| 273.9531 | 13.0 | 650 | 271.8800 | 0.24 | 0.8566 | 4.9415 | 0.24 | 0.1926 | 0.2825 | 0.7850 |
| 273.9531 | 14.0 | 700 | 271.4349 | 0.305 | 0.8408 | 4.6971 | 0.305 | 0.1935 | 0.3087 | 0.7274 |
| 273.9531 | 15.0 | 750 | 271.7440 | 0.24 | 0.8476 | 3.9346 | 0.24 | 0.1835 | 0.2902 | 0.8098 |
| 273.9531 | 16.0 | 800 | 271.1944 | 0.28 | 0.8464 | 4.1470 | 0.28 | 0.2074 | 0.3074 | 0.7919 |
| 273.9531 | 17.0 | 850 | 271.2062 | 0.34 | 0.8363 | 4.8972 | 0.34 | 0.2136 | 0.3237 | 0.6403 |
| 273.9531 | 18.0 | 900 | 271.6591 | 0.295 | 0.8548 | 4.7423 | 0.295 | 0.1990 | 0.3066 | 0.7590 |
| 273.9531 | 19.0 | 950 | 271.5129 | 0.255 | 0.8571 | 4.8864 | 0.255 | 0.1898 | 0.2901 | 0.7929 |
| 270.6992 | 20.0 | 1000 | 271.4151 | 0.265 | 0.8543 | 5.2793 | 0.265 | 0.1924 | 0.2928 | 0.7842 |
| 270.6992 | 21.0 | 1050 | 271.0936 | 0.35 | 0.8449 | 4.4171 | 0.35 | 0.2347 | 0.3389 | 0.7230 |
| 270.6992 | 22.0 | 1100 | 271.2365 | 0.285 | 0.8504 | 4.7910 | 0.285 | 0.2020 | 0.3147 | 0.7840 |
| 270.6992 | 23.0 | 1150 | 271.3472 | 0.26 | 0.8496 | 4.1785 | 0.26 | 0.1800 | 0.2922 | 0.7983 |
| 270.6992 | 24.0 | 1200 | 271.1550 | 0.335 | 0.8466 | 4.8743 | 0.335 | 0.2518 | 0.3381 | 0.7148 |
| 270.6992 | 25.0 | 1250 | 270.7837 | 0.34 | 0.8455 | 4.7405 | 0.34 | 0.2392 | 0.3420 | 0.7592 |
| 270.6992 | 26.0 | 1300 | 271.2034 | 0.325 | 0.8485 | 5.0231 | 0.325 | 0.2418 | 0.3302 | 0.7411 |
| 270.6992 | 27.0 | 1350 | 270.5752 | 0.385 | 0.8334 | 4.5239 | 0.3850 | 0.2866 | 0.3535 | 0.6197 |
| 270.6992 | 28.0 | 1400 | 270.6892 | 0.35 | 0.8407 | 4.5651 | 0.35 | 0.2593 | 0.3428 | 0.6812 |
| 270.6992 | 29.0 | 1450 | 270.7246 | 0.36 | 0.8411 | 4.6054 | 0.36 | 0.2699 | 0.3365 | 0.6772 |
| 269.5806 | 30.0 | 1500 | 270.6962 | 0.32 | 0.8453 | 4.3997 | 0.32 | 0.2170 | 0.3145 | 0.7129 |
| 269.5806 | 31.0 | 1550 | 270.8489 | 0.325 | 0.8478 | 4.8399 | 0.325 | 0.2367 | 0.3180 | 0.6999 |
| 269.5806 | 32.0 | 1600 | 270.5093 | 0.36 | 0.8370 | 4.6472 | 0.36 | 0.2765 | 0.3456 | 0.6472 |
| 269.5806 | 33.0 | 1650 | 270.5440 | 0.35 | 0.8407 | 4.6948 | 0.35 | 0.2532 | 0.3292 | 0.6436 |
| 269.5806 | 34.0 | 1700 | 270.6743 | 0.395 | 0.8387 | 4.1379 | 0.395 | 0.2872 | 0.3730 | 0.6805 |
| 269.5806 | 35.0 | 1750 | 270.6282 | 0.33 | 0.8415 | 4.2499 | 0.33 | 0.2445 | 0.3410 | 0.7220 |
| 269.5806 | 36.0 | 1800 | 270.5709 | 0.31 | 0.8478 | 4.8275 | 0.31 | 0.2295 | 0.3063 | 0.7223 |
| 269.5806 | 37.0 | 1850 | 270.4933 | 0.33 | 0.8449 | 4.7300 | 0.33 | 0.2415 | 0.3157 | 0.6997 |
| 269.5806 | 38.0 | 1900 | 270.4860 | 0.355 | 0.8360 | 4.4343 | 0.3550 | 0.2598 | 0.3373 | 0.6545 |
| 269.5806 | 39.0 | 1950 | 269.9668 | 0.37 | 0.8306 | 4.4080 | 0.37 | 0.2602 | 0.3349 | 0.6056 |
| 268.8099 | 40.0 | 2000 | 270.3401 | 0.385 | 0.8389 | 5.0506 | 0.3850 | 0.2777 | 0.3567 | 0.6686 |
| 268.8099 | 41.0 | 2050 | 270.0377 | 0.365 | 0.8320 | 4.6238 | 0.3650 | 0.2466 | 0.3429 | 0.6167 |
| 268.8099 | 42.0 | 2100 | 270.2510 | 0.355 | 0.8363 | 4.7053 | 0.3550 | 0.2498 | 0.3352 | 0.6564 |
| 268.8099 | 43.0 | 2150 | 270.0252 | 0.38 | 0.8275 | 4.2261 | 0.38 | 0.2690 | 0.3557 | 0.5995 |
| 268.8099 | 44.0 | 2200 | 269.9756 | 0.36 | 0.8251 | 4.2336 | 0.36 | 0.2516 | 0.3226 | 0.6005 |
| 268.8099 | 45.0 | 2250 | 270.1610 | 0.34 | 0.8383 | 4.5052 | 0.34 | 0.2593 | 0.3164 | 0.6854 |
| 268.8099 | 46.0 | 2300 | 270.0320 | 0.33 | 0.8396 | 4.4659 | 0.33 | 0.2458 | 0.3114 | 0.7070 |
| 268.8099 | 47.0 | 2350 | 269.9535 | 0.335 | 0.8316 | 4.6707 | 0.335 | 0.2419 | 0.3203 | 0.6503 |
| 268.8099 | 48.0 | 2400 | 270.0645 | 0.365 | 0.8320 | 4.4182 | 0.3650 | 0.2517 | 0.3230 | 0.6360 |
| 268.8099 | 49.0 | 2450 | 270.0541 | 0.375 | 0.8348 | 4.4803 | 0.375 | 0.2717 | 0.3504 | 0.6444 |
| 268.2443 | 50.0 | 2500 | 269.8891 | 0.33 | 0.8374 | 4.7223 | 0.33 | 0.2365 | 0.3223 | 0.6716 |
| 268.2443 | 51.0 | 2550 | 269.8773 | 0.39 | 0.8355 | 4.4693 | 0.39 | 0.2637 | 0.3547 | 0.6525 |
| 268.2443 | 52.0 | 2600 | 269.6983 | 0.39 | 0.8298 | 4.5670 | 0.39 | 0.2716 | 0.3624 | 0.6230 |
| 268.2443 | 53.0 | 2650 | 270.0016 | 0.35 | 0.8327 | 4.3052 | 0.35 | 0.2402 | 0.3367 | 0.6602 |
| 268.2443 | 54.0 | 2700 | 269.7764 | 0.35 | 0.8318 | 4.3783 | 0.35 | 0.2432 | 0.3189 | 0.6373 |
| 268.2443 | 55.0 | 2750 | 269.7582 | 0.36 | 0.8272 | 4.4652 | 0.36 | 0.2535 | 0.3130 | 0.6197 |
| 268.2443 | 56.0 | 2800 | 269.7654 | 0.385 | 0.8253 | 4.4241 | 0.3850 | 0.2751 | 0.3375 | 0.6019 |
| 268.2443 | 57.0 | 2850 | 269.7094 | 0.375 | 0.8331 | 4.5019 | 0.375 | 0.2691 | 0.3413 | 0.6391 |
| 268.2443 | 58.0 | 2900 | 269.7442 | 0.37 | 0.8323 | 4.5422 | 0.37 | 0.2813 | 0.3311 | 0.6468 |
| 268.2443 | 59.0 | 2950 | 269.7930 | 0.385 | 0.8265 | 4.4502 | 0.3850 | 0.2820 | 0.3331 | 0.6032 |
| 267.7843 | 60.0 | 3000 | 269.6804 | 0.375 | 0.8270 | 4.3152 | 0.375 | 0.2723 | 0.3221 | 0.6005 |
| 267.7843 | 61.0 | 3050 | 269.7132 | 0.36 | 0.8355 | 4.4928 | 0.36 | 0.2625 | 0.3464 | 0.6679 |
| 267.7843 | 62.0 | 3100 | 269.6766 | 0.375 | 0.8308 | 4.5392 | 0.375 | 0.2644 | 0.3272 | 0.6307 |
| 267.7843 | 63.0 | 3150 | 269.5538 | 0.32 | 0.8345 | 4.4571 | 0.32 | 0.2299 | 0.3160 | 0.6791 |
| 267.7843 | 64.0 | 3200 | 269.5080 | 0.375 | 0.8262 | 4.1608 | 0.375 | 0.2581 | 0.3323 | 0.6357 |
| 267.7843 | 65.0 | 3250 | 269.6766 | 0.355 | 0.8362 | 4.4220 | 0.3550 | 0.2647 | 0.3250 | 0.6591 |
| 267.7843 | 66.0 | 3300 | 269.4889 | 0.38 | 0.8249 | 4.3636 | 0.38 | 0.2786 | 0.3452 | 0.6364 |
| 267.7843 | 67.0 | 3350 | 269.4171 | 0.4 | 0.8246 | 4.2888 | 0.4000 | 0.2877 | 0.3479 | 0.6114 |
| 267.7843 | 68.0 | 3400 | 269.6962 | 0.37 | 0.8286 | 4.3171 | 0.37 | 0.2614 | 0.3376 | 0.6369 |
| 267.7843 | 69.0 | 3450 | 269.3859 | 0.425 | 0.8225 | 4.2805 | 0.425 | 0.3053 | 0.3649 | 0.5751 |
| 267.4508 | 70.0 | 3500 | 269.5482 | 0.345 | 0.8377 | 4.7547 | 0.345 | 0.2559 | 0.3341 | 0.6728 |
| 267.4508 | 71.0 | 3550 | 269.3435 | 0.4 | 0.8216 | 4.3030 | 0.4000 | 0.2858 | 0.3391 | 0.6045 |
| 267.4508 | 72.0 | 3600 | 269.2284 | 0.405 | 0.8198 | 4.3223 | 0.405 | 0.3006 | 0.3598 | 0.5984 |
| 267.4508 | 73.0 | 3650 | 269.5614 | 0.37 | 0.8300 | 4.3579 | 0.37 | 0.2645 | 0.3515 | 0.6597 |
| 267.4508 | 74.0 | 3700 | 269.2826 | 0.39 | 0.8217 | 4.3413 | 0.39 | 0.2738 | 0.3317 | 0.5869 |
| 267.4508 | 75.0 | 3750 | 269.3375 | 0.39 | 0.8226 | 4.2796 | 0.39 | 0.2712 | 0.3485 | 0.6140 |
| 267.4508 | 76.0 | 3800 | 269.2377 | 0.39 | 0.8198 | 4.4825 | 0.39 | 0.2771 | 0.3450 | 0.5936 |
| 267.4508 | 77.0 | 3850 | 269.4238 | 0.375 | 0.8278 | 4.5048 | 0.375 | 0.2736 | 0.3349 | 0.6386 |
| 267.4508 | 78.0 | 3900 | 269.4522 | 0.4 | 0.8283 | 4.5498 | 0.4000 | 0.2870 | 0.3318 | 0.6187 |
| 267.4508 | 79.0 | 3950 | 269.4372 | 0.355 | 0.8316 | 4.5034 | 0.3550 | 0.2621 | 0.3182 | 0.6581 |
| 267.2267 | 80.0 | 4000 | 269.4058 | 0.38 | 0.8262 | 4.4096 | 0.38 | 0.2834 | 0.3453 | 0.6197 |
| 267.2267 | 81.0 | 4050 | 269.2977 | 0.37 | 0.8251 | 4.4666 | 0.37 | 0.2669 | 0.3374 | 0.6406 |
| 267.2267 | 82.0 | 4100 | 269.3194 | 0.37 | 0.8286 | 4.4947 | 0.37 | 0.2751 | 0.3332 | 0.6326 |
| 267.2267 | 83.0 | 4150 | 269.2936 | 0.365 | 0.8301 | 4.3184 | 0.3650 | 0.2725 | 0.3358 | 0.6815 |
| 267.2267 | 84.0 | 4200 | 269.2045 | 0.4 | 0.8242 | 4.2851 | 0.4000 | 0.2947 | 0.3526 | 0.6133 |
| 267.2267 | 85.0 | 4250 | 269.2916 | 0.38 | 0.8256 | 4.4965 | 0.38 | 0.2872 | 0.3190 | 0.6079 |
| 267.2267 | 86.0 | 4300 | 269.3817 | 0.35 | 0.8327 | 4.3865 | 0.35 | 0.2686 | 0.3341 | 0.6660 |
| 267.2267 | 87.0 | 4350 | 269.2809 | 0.36 | 0.8296 | 4.4521 | 0.36 | 0.2664 | 0.3367 | 0.6620 |
| 267.2267 | 88.0 | 4400 | 269.1650 | 0.405 | 0.8181 | 4.2997 | 0.405 | 0.2857 | 0.3527 | 0.5624 |
| 267.2267 | 89.0 | 4450 | 269.1071 | 0.375 | 0.8297 | 4.3559 | 0.375 | 0.2737 | 0.3386 | 0.6423 |
| 267.0481 | 90.0 | 4500 | 269.3120 | 0.385 | 0.8268 | 4.4102 | 0.3850 | 0.2833 | 0.3438 | 0.6264 |
| 267.0481 | 91.0 | 4550 | 269.1590 | 0.4 | 0.8243 | 4.2495 | 0.4000 | 0.2964 | 0.3382 | 0.5986 |
| 267.0481 | 92.0 | 4600 | 269.1963 | 0.4 | 0.8230 | 4.4469 | 0.4000 | 0.3070 | 0.3521 | 0.5969 |
| 267.0481 | 93.0 | 4650 | 269.1320 | 0.39 | 0.8256 | 4.2528 | 0.39 | 0.2856 | 0.3355 | 0.6072 |
| 267.0481 | 94.0 | 4700 | 269.2391 | 0.4 | 0.8222 | 4.3351 | 0.4000 | 0.2891 | 0.3392 | 0.6019 |
| 267.0481 | 95.0 | 4750 | 269.2017 | 0.395 | 0.8243 | 4.3717 | 0.395 | 0.2931 | 0.3435 | 0.6271 |
| 267.0481 | 96.0 | 4800 | 269.4085 | 0.37 | 0.8290 | 4.3941 | 0.37 | 0.2817 | 0.3255 | 0.6242 |
| 267.0481 | 97.0 | 4850 | 269.2195 | 0.4 | 0.8231 | 4.4454 | 0.4000 | 0.2889 | 0.3452 | 0.5916 |
| 267.0481 | 98.0 | 4900 | 269.3515 | 0.375 | 0.8296 | 4.3291 | 0.375 | 0.2754 | 0.3370 | 0.6380 |
| 267.0481 | 99.0 | 4950 | 269.3496 | 0.395 | 0.8244 | 4.4743 | 0.395 | 0.2989 | 0.3432 | 0.6100 |
| 266.9625 | 100.0 | 5000 | 269.3729 | 0.36 | 0.8320 | 4.3971 | 0.36 | 0.2725 | 0.3539 | 0.6748 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/300-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 300-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4770
- Accuracy: 0.82
- Brier Loss: 0.2875
- Nll: 1.3922
- F1 Micro: 0.82
- F1 Macro: 0.8020
- Ece: 0.2219
- Aurc: 0.0517
## Model description
More information needed
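The model name (`kd_CEKD_t2.5_a0.5`) suggests a cross-entropy-plus-distillation objective with temperature 2.5 and mixing weight 0.5, but the training code is not included here. A generic sketch of that kind of loss, under those assumptions:
```python
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, T=2.5, alpha=0.5):
    """Assumed CE+KD objective: hard-label cross-entropy mixed with a
    temperature-softened KL term; T**2 rescales the soft-target gradients."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T ** 2)
    return alpha * ce + (1.0 - alpha) * kd
```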
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 13 | 1.7938 | 0.235 | 0.8938 | 7.8599 | 0.235 | 0.1394 | 0.3127 | 0.7433 |
| No log | 2.0 | 26 | 1.2738 | 0.455 | 0.6913 | 3.6965 | 0.455 | 0.3679 | 0.2904 | 0.3352 |
| No log | 3.0 | 39 | 1.0682 | 0.555 | 0.5748 | 2.0566 | 0.555 | 0.4968 | 0.2475 | 0.2296 |
| No log | 4.0 | 52 | 0.8509 | 0.655 | 0.4621 | 1.7782 | 0.655 | 0.6085 | 0.2242 | 0.1405 |
| No log | 5.0 | 65 | 0.7670 | 0.71 | 0.4142 | 1.4993 | 0.7100 | 0.6560 | 0.2271 | 0.1082 |
| No log | 6.0 | 78 | 0.7285 | 0.735 | 0.3857 | 1.5730 | 0.735 | 0.6874 | 0.2098 | 0.0996 |
| No log | 7.0 | 91 | 0.7052 | 0.72 | 0.3804 | 1.4916 | 0.72 | 0.6974 | 0.2249 | 0.0959 |
| No log | 8.0 | 104 | 0.7590 | 0.71 | 0.3925 | 1.8047 | 0.7100 | 0.6641 | 0.1956 | 0.1008 |
| No log | 9.0 | 117 | 0.7657 | 0.71 | 0.4006 | 1.8296 | 0.7100 | 0.7169 | 0.2330 | 0.1025 |
| No log | 10.0 | 130 | 0.6512 | 0.755 | 0.3514 | 1.5899 | 0.755 | 0.7256 | 0.1863 | 0.0853 |
| No log | 11.0 | 143 | 0.6615 | 0.775 | 0.3638 | 1.8180 | 0.775 | 0.7564 | 0.2106 | 0.0911 |
| No log | 12.0 | 156 | 0.6195 | 0.785 | 0.3398 | 1.6998 | 0.785 | 0.7419 | 0.2337 | 0.0643 |
| No log | 13.0 | 169 | 0.6065 | 0.78 | 0.3471 | 1.5917 | 0.78 | 0.7550 | 0.2280 | 0.0793 |
| No log | 14.0 | 182 | 0.6314 | 0.75 | 0.3486 | 1.9235 | 0.75 | 0.7315 | 0.2105 | 0.0755 |
| No log | 15.0 | 195 | 0.6426 | 0.745 | 0.3686 | 1.8633 | 0.745 | 0.7100 | 0.2099 | 0.0891 |
| No log | 16.0 | 208 | 0.5849 | 0.765 | 0.3476 | 1.3466 | 0.765 | 0.7505 | 0.1978 | 0.0827 |
| No log | 17.0 | 221 | 0.5604 | 0.79 | 0.3311 | 1.3948 | 0.79 | 0.7581 | 0.2258 | 0.0705 |
| No log | 18.0 | 234 | 0.5504 | 0.78 | 0.3230 | 1.4757 | 0.78 | 0.7712 | 0.2104 | 0.0624 |
| No log | 19.0 | 247 | 0.5586 | 0.785 | 0.3247 | 1.5297 | 0.785 | 0.7642 | 0.2159 | 0.0655 |
| No log | 20.0 | 260 | 0.5879 | 0.78 | 0.3366 | 1.5348 | 0.78 | 0.7727 | 0.2162 | 0.0716 |
| No log | 21.0 | 273 | 0.5558 | 0.805 | 0.3113 | 1.5720 | 0.805 | 0.7945 | 0.2161 | 0.0652 |
| No log | 22.0 | 286 | 0.5439 | 0.795 | 0.3258 | 1.7373 | 0.795 | 0.7883 | 0.2307 | 0.0745 |
| No log | 23.0 | 299 | 0.5155 | 0.795 | 0.3094 | 1.4183 | 0.795 | 0.7725 | 0.2221 | 0.0625 |
| No log | 24.0 | 312 | 0.5039 | 0.81 | 0.2994 | 1.4458 | 0.81 | 0.7830 | 0.2114 | 0.0624 |
| No log | 25.0 | 325 | 0.5142 | 0.81 | 0.3101 | 1.2798 | 0.81 | 0.7928 | 0.2205 | 0.0624 |
| No log | 26.0 | 338 | 0.5007 | 0.8 | 0.3100 | 1.2390 | 0.8000 | 0.7730 | 0.2038 | 0.0645 |
| No log | 27.0 | 351 | 0.4779 | 0.815 | 0.2865 | 1.3312 | 0.815 | 0.7863 | 0.2061 | 0.0518 |
| No log | 28.0 | 364 | 0.4893 | 0.825 | 0.2927 | 1.3993 | 0.825 | 0.8009 | 0.2219 | 0.0555 |
| No log | 29.0 | 377 | 0.4938 | 0.82 | 0.2996 | 1.4038 | 0.82 | 0.7888 | 0.2138 | 0.0586 |
| No log | 30.0 | 390 | 0.4668 | 0.82 | 0.2795 | 1.3366 | 0.82 | 0.7944 | 0.2217 | 0.0495 |
| No log | 31.0 | 403 | 0.4662 | 0.8 | 0.2805 | 1.1721 | 0.8000 | 0.7761 | 0.2009 | 0.0494 |
| No log | 32.0 | 416 | 0.4787 | 0.82 | 0.2887 | 1.3872 | 0.82 | 0.8043 | 0.2161 | 0.0542 |
| No log | 33.0 | 429 | 0.4842 | 0.81 | 0.2909 | 1.4774 | 0.81 | 0.7854 | 0.2246 | 0.0562 |
| No log | 34.0 | 442 | 0.4899 | 0.81 | 0.2979 | 1.4419 | 0.81 | 0.7843 | 0.2155 | 0.0607 |
| No log | 35.0 | 455 | 0.4832 | 0.815 | 0.2920 | 1.3892 | 0.815 | 0.7907 | 0.2296 | 0.0552 |
| No log | 36.0 | 468 | 0.4739 | 0.815 | 0.2869 | 1.2603 | 0.815 | 0.7932 | 0.2385 | 0.0532 |
| No log | 37.0 | 481 | 0.4747 | 0.81 | 0.2877 | 1.4390 | 0.81 | 0.7848 | 0.2163 | 0.0526 |
| No log | 38.0 | 494 | 0.4710 | 0.815 | 0.2842 | 1.3024 | 0.815 | 0.7885 | 0.2153 | 0.0516 |
| 0.2992 | 39.0 | 507 | 0.4712 | 0.81 | 0.2839 | 1.3676 | 0.81 | 0.7860 | 0.2282 | 0.0518 |
| 0.2992 | 40.0 | 520 | 0.4772 | 0.815 | 0.2883 | 1.3845 | 0.815 | 0.7953 | 0.2216 | 0.0527 |
| 0.2992 | 41.0 | 533 | 0.4751 | 0.82 | 0.2877 | 1.3207 | 0.82 | 0.8018 | 0.2177 | 0.0521 |
| 0.2992 | 42.0 | 546 | 0.4724 | 0.82 | 0.2860 | 1.3075 | 0.82 | 0.8018 | 0.2183 | 0.0508 |
| 0.2992 | 43.0 | 559 | 0.4745 | 0.82 | 0.2869 | 1.3079 | 0.82 | 0.8020 | 0.2184 | 0.0522 |
| 0.2992 | 44.0 | 572 | 0.4779 | 0.815 | 0.2884 | 1.4039 | 0.815 | 0.7922 | 0.2142 | 0.0531 |
| 0.2992 | 45.0 | 585 | 0.4738 | 0.82 | 0.2859 | 1.3153 | 0.82 | 0.8018 | 0.2079 | 0.0516 |
| 0.2992 | 46.0 | 598 | 0.4755 | 0.815 | 0.2874 | 1.3273 | 0.815 | 0.7922 | 0.2279 | 0.0526 |
| 0.2992 | 47.0 | 611 | 0.4736 | 0.82 | 0.2858 | 1.3190 | 0.82 | 0.8018 | 0.2182 | 0.0515 |
| 0.2992 | 48.0 | 624 | 0.4753 | 0.82 | 0.2876 | 1.3170 | 0.82 | 0.8018 | 0.2274 | 0.0521 |
| 0.2992 | 49.0 | 637 | 0.4755 | 0.82 | 0.2866 | 1.4452 | 0.82 | 0.8018 | 0.2245 | 0.0516 |
| 0.2992 | 50.0 | 650 | 0.4754 | 0.815 | 0.2869 | 1.2915 | 0.815 | 0.7924 | 0.2336 | 0.0523 |
| 0.2992 | 51.0 | 663 | 0.4747 | 0.82 | 0.2861 | 1.3336 | 0.82 | 0.8020 | 0.2309 | 0.0517 |
| 0.2992 | 52.0 | 676 | 0.4765 | 0.815 | 0.2880 | 1.3456 | 0.815 | 0.7924 | 0.2137 | 0.0524 |
| 0.2992 | 53.0 | 689 | 0.4756 | 0.82 | 0.2866 | 1.3288 | 0.82 | 0.8020 | 0.2236 | 0.0518 |
| 0.2992 | 54.0 | 702 | 0.4757 | 0.82 | 0.2873 | 1.3860 | 0.82 | 0.8018 | 0.2085 | 0.0516 |
| 0.2992 | 55.0 | 715 | 0.4753 | 0.815 | 0.2866 | 1.3284 | 0.815 | 0.7922 | 0.2100 | 0.0515 |
| 0.2992 | 56.0 | 728 | 0.4759 | 0.82 | 0.2870 | 1.3199 | 0.82 | 0.8020 | 0.2240 | 0.0518 |
| 0.2992 | 57.0 | 741 | 0.4764 | 0.82 | 0.2874 | 1.3901 | 0.82 | 0.8020 | 0.2241 | 0.0517 |
| 0.2992 | 58.0 | 754 | 0.4754 | 0.815 | 0.2870 | 1.3246 | 0.815 | 0.7924 | 0.2260 | 0.0520 |
| 0.2992 | 59.0 | 767 | 0.4759 | 0.815 | 0.2870 | 1.3862 | 0.815 | 0.7924 | 0.2176 | 0.0520 |
| 0.2992 | 60.0 | 780 | 0.4765 | 0.815 | 0.2874 | 1.3873 | 0.815 | 0.7924 | 0.2266 | 0.0523 |
| 0.2992 | 61.0 | 793 | 0.4763 | 0.82 | 0.2873 | 1.3851 | 0.82 | 0.8020 | 0.2161 | 0.0517 |
| 0.2992 | 62.0 | 806 | 0.4768 | 0.815 | 0.2878 | 1.3903 | 0.815 | 0.7924 | 0.2128 | 0.0522 |
| 0.2992 | 63.0 | 819 | 0.4767 | 0.82 | 0.2876 | 1.3866 | 0.82 | 0.8020 | 0.2120 | 0.0521 |
| 0.2992 | 64.0 | 832 | 0.4762 | 0.82 | 0.2872 | 1.3910 | 0.82 | 0.8020 | 0.2157 | 0.0516 |
| 0.2992 | 65.0 | 845 | 0.4765 | 0.82 | 0.2874 | 1.3892 | 0.82 | 0.8020 | 0.2178 | 0.0519 |
| 0.2992 | 66.0 | 858 | 0.4767 | 0.82 | 0.2875 | 1.3462 | 0.82 | 0.8020 | 0.2180 | 0.0519 |
| 0.2992 | 67.0 | 871 | 0.4764 | 0.82 | 0.2872 | 1.3894 | 0.82 | 0.8020 | 0.2252 | 0.0518 |
| 0.2992 | 68.0 | 884 | 0.4767 | 0.82 | 0.2874 | 1.3860 | 0.82 | 0.8020 | 0.2118 | 0.0518 |
| 0.2992 | 69.0 | 897 | 0.4766 | 0.82 | 0.2874 | 1.3894 | 0.82 | 0.8020 | 0.2180 | 0.0519 |
| 0.2992 | 70.0 | 910 | 0.4765 | 0.82 | 0.2872 | 1.3882 | 0.82 | 0.8020 | 0.2280 | 0.0517 |
| 0.2992 | 71.0 | 923 | 0.4766 | 0.82 | 0.2874 | 1.3875 | 0.82 | 0.8020 | 0.2177 | 0.0519 |
| 0.2992 | 72.0 | 936 | 0.4765 | 0.82 | 0.2874 | 1.3880 | 0.82 | 0.8020 | 0.2148 | 0.0517 |
| 0.2992 | 73.0 | 949 | 0.4766 | 0.82 | 0.2873 | 1.3915 | 0.82 | 0.8020 | 0.2109 | 0.0516 |
| 0.2992 | 74.0 | 962 | 0.4765 | 0.82 | 0.2872 | 1.3900 | 0.82 | 0.8020 | 0.2110 | 0.0517 |
| 0.2992 | 75.0 | 975 | 0.4769 | 0.82 | 0.2875 | 1.3913 | 0.82 | 0.8020 | 0.2251 | 0.0520 |
| 0.2992 | 76.0 | 988 | 0.4770 | 0.82 | 0.2876 | 1.3909 | 0.82 | 0.8020 | 0.2196 | 0.0520 |
| 0.0695 | 77.0 | 1001 | 0.4768 | 0.82 | 0.2875 | 1.3890 | 0.82 | 0.8020 | 0.2212 | 0.0517 |
| 0.0695 | 78.0 | 1014 | 0.4767 | 0.82 | 0.2873 | 1.3935 | 0.82 | 0.8020 | 0.2281 | 0.0518 |
| 0.0695 | 79.0 | 1027 | 0.4767 | 0.82 | 0.2874 | 1.3897 | 0.82 | 0.8020 | 0.2282 | 0.0517 |
| 0.0695 | 80.0 | 1040 | 0.4770 | 0.82 | 0.2876 | 1.3889 | 0.82 | 0.8020 | 0.2174 | 0.0518 |
| 0.0695 | 81.0 | 1053 | 0.4770 | 0.82 | 0.2875 | 1.3935 | 0.82 | 0.8020 | 0.2221 | 0.0518 |
| 0.0695 | 82.0 | 1066 | 0.4766 | 0.82 | 0.2873 | 1.3901 | 0.82 | 0.8020 | 0.2283 | 0.0517 |
| 0.0695 | 83.0 | 1079 | 0.4768 | 0.82 | 0.2874 | 1.3902 | 0.82 | 0.8020 | 0.2283 | 0.0517 |
| 0.0695 | 84.0 | 1092 | 0.4770 | 0.82 | 0.2874 | 1.3917 | 0.82 | 0.8020 | 0.2217 | 0.0518 |
| 0.0695 | 85.0 | 1105 | 0.4769 | 0.82 | 0.2875 | 1.3913 | 0.82 | 0.8020 | 0.2283 | 0.0518 |
| 0.0695 | 86.0 | 1118 | 0.4769 | 0.82 | 0.2874 | 1.3916 | 0.82 | 0.8020 | 0.2282 | 0.0517 |
| 0.0695 | 87.0 | 1131 | 0.4769 | 0.82 | 0.2874 | 1.3912 | 0.82 | 0.8020 | 0.2218 | 0.0517 |
| 0.0695 | 88.0 | 1144 | 0.4770 | 0.82 | 0.2875 | 1.3923 | 0.82 | 0.8020 | 0.2218 | 0.0517 |
| 0.0695 | 89.0 | 1157 | 0.4768 | 0.82 | 0.2874 | 1.3905 | 0.82 | 0.8020 | 0.2283 | 0.0518 |
| 0.0695 | 90.0 | 1170 | 0.4769 | 0.82 | 0.2875 | 1.3924 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 91.0 | 1183 | 0.4769 | 0.82 | 0.2874 | 1.3923 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 92.0 | 1196 | 0.4768 | 0.82 | 0.2874 | 1.3908 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 93.0 | 1209 | 0.4770 | 0.82 | 0.2875 | 1.3909 | 0.82 | 0.8020 | 0.2219 | 0.0518 |
| 0.0695 | 94.0 | 1222 | 0.4768 | 0.82 | 0.2873 | 1.3918 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 95.0 | 1235 | 0.4769 | 0.82 | 0.2874 | 1.3914 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 96.0 | 1248 | 0.4770 | 0.82 | 0.2875 | 1.3917 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 97.0 | 1261 | 0.4769 | 0.82 | 0.2874 | 1.3918 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 98.0 | 1274 | 0.4770 | 0.82 | 0.2875 | 1.3920 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 99.0 | 1287 | 0.4770 | 0.82 | 0.2875 | 1.3922 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
| 0.0695 | 100.0 | 1300 | 0.4770 | 0.82 | 0.2875 | 1.3922 | 0.82 | 0.8020 | 0.2219 | 0.0517 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/300-tiny_tobacco3482_kd
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 300-tiny_tobacco3482_kd
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3298
- Accuracy: 0.79
- Brier Loss: 0.3334
- Nll: 1.0051
- F1 Micro: 0.79
- F1 Macro: 0.7591
- Ece: 0.2152
- Aurc: 0.0601
## Model description
More information needed
## Intended uses & limitations
More information needed
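Pending proper documentation, a minimal inference sketch (this assumes the checkpoint is published on the Hub under the id above; the input file name is hypothetical):
```python
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="jordyvl/300-tiny_tobacco3482_kd",  # assumed Hub id of this checkpoint
)
preds = classifier("scanned_document.png")  # hypothetical input image
print(preds[0])  # top prediction, e.g. one of the ten document classes
```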
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 13 | 1.7587 | 0.225 | 0.8896 | 7.6699 | 0.225 | 0.1499 | 0.2894 | 0.7588 |
| No log | 2.0 | 26 | 1.1905 | 0.4 | 0.7901 | 3.7933 | 0.4000 | 0.2961 | 0.3234 | 0.4395 |
| No log | 3.0 | 39 | 0.9530 | 0.53 | 0.6670 | 2.9907 | 0.53 | 0.4066 | 0.3137 | 0.2821 |
| No log | 4.0 | 52 | 0.8046 | 0.615 | 0.5821 | 2.2124 | 0.615 | 0.4898 | 0.3097 | 0.1862 |
| No log | 5.0 | 65 | 0.7084 | 0.685 | 0.5084 | 2.1522 | 0.685 | 0.6027 | 0.3191 | 0.1389 |
| No log | 6.0 | 78 | 0.7243 | 0.68 | 0.4683 | 2.2673 | 0.68 | 0.5904 | 0.2695 | 0.1279 |
| No log | 7.0 | 91 | 0.6734 | 0.67 | 0.4675 | 2.2909 | 0.67 | 0.5844 | 0.2436 | 0.1510 |
| No log | 8.0 | 104 | 0.5780 | 0.7 | 0.4215 | 2.0061 | 0.7 | 0.6160 | 0.2418 | 0.1016 |
| No log | 9.0 | 117 | 0.6270 | 0.71 | 0.4402 | 1.8620 | 0.7100 | 0.6574 | 0.2485 | 0.1249 |
| No log | 10.0 | 130 | 0.5604 | 0.72 | 0.4074 | 1.5914 | 0.72 | 0.6430 | 0.2566 | 0.0935 |
| No log | 11.0 | 143 | 0.5814 | 0.705 | 0.4079 | 1.6933 | 0.705 | 0.6190 | 0.2350 | 0.1035 |
| No log | 12.0 | 156 | 0.5901 | 0.71 | 0.4176 | 1.7974 | 0.7100 | 0.6472 | 0.2225 | 0.1058 |
| No log | 13.0 | 169 | 0.5041 | 0.71 | 0.3918 | 1.8429 | 0.7100 | 0.6562 | 0.2336 | 0.0958 |
| No log | 14.0 | 182 | 0.5099 | 0.72 | 0.3982 | 1.6343 | 0.72 | 0.6550 | 0.2202 | 0.1021 |
| No log | 15.0 | 195 | 0.4843 | 0.745 | 0.3951 | 1.3599 | 0.745 | 0.6719 | 0.2680 | 0.0884 |
| No log | 16.0 | 208 | 0.4529 | 0.74 | 0.3776 | 1.3838 | 0.74 | 0.6951 | 0.2112 | 0.0839 |
| No log | 17.0 | 221 | 0.4420 | 0.745 | 0.3782 | 1.4403 | 0.745 | 0.6982 | 0.2285 | 0.0800 |
| No log | 18.0 | 234 | 0.4428 | 0.755 | 0.3710 | 1.3696 | 0.755 | 0.7298 | 0.2170 | 0.0825 |
| No log | 19.0 | 247 | 0.4306 | 0.75 | 0.3794 | 1.4095 | 0.75 | 0.7235 | 0.2470 | 0.0862 |
| No log | 20.0 | 260 | 0.4166 | 0.74 | 0.3648 | 1.2893 | 0.74 | 0.6776 | 0.2312 | 0.0835 |
| No log | 21.0 | 273 | 0.3830 | 0.77 | 0.3524 | 1.1764 | 0.7700 | 0.7256 | 0.2535 | 0.0730 |
| No log | 22.0 | 286 | 0.3918 | 0.77 | 0.3564 | 1.2293 | 0.7700 | 0.7067 | 0.2372 | 0.0710 |
| No log | 23.0 | 299 | 0.4125 | 0.75 | 0.3656 | 1.1419 | 0.75 | 0.7109 | 0.2357 | 0.0766 |
| No log | 24.0 | 312 | 0.3771 | 0.785 | 0.3543 | 1.0960 | 0.785 | 0.7583 | 0.2345 | 0.0712 |
| No log | 25.0 | 325 | 0.3846 | 0.745 | 0.3613 | 1.0616 | 0.745 | 0.7061 | 0.2060 | 0.0766 |
| No log | 26.0 | 338 | 0.3660 | 0.77 | 0.3547 | 1.3094 | 0.7700 | 0.7196 | 0.2515 | 0.0724 |
| No log | 27.0 | 351 | 0.3634 | 0.78 | 0.3476 | 1.0645 | 0.78 | 0.7479 | 0.2401 | 0.0677 |
| No log | 28.0 | 364 | 0.3715 | 0.755 | 0.3522 | 1.1981 | 0.755 | 0.6984 | 0.2257 | 0.0709 |
| No log | 29.0 | 377 | 0.3701 | 0.765 | 0.3597 | 1.1645 | 0.765 | 0.7239 | 0.2631 | 0.0747 |
| No log | 30.0 | 390 | 0.3562 | 0.775 | 0.3465 | 1.1094 | 0.775 | 0.7140 | 0.2428 | 0.0659 |
| No log | 31.0 | 403 | 0.3811 | 0.775 | 0.3499 | 1.2515 | 0.775 | 0.7368 | 0.2214 | 0.0694 |
| No log | 32.0 | 416 | 0.3555 | 0.77 | 0.3439 | 1.1715 | 0.7700 | 0.7053 | 0.2532 | 0.0705 |
| No log | 33.0 | 429 | 0.3592 | 0.775 | 0.3449 | 1.1606 | 0.775 | 0.7364 | 0.2336 | 0.0729 |
| No log | 34.0 | 442 | 0.3555 | 0.78 | 0.3431 | 1.1054 | 0.78 | 0.7373 | 0.2143 | 0.0653 |
| No log | 35.0 | 455 | 0.3454 | 0.77 | 0.3415 | 1.0386 | 0.7700 | 0.7333 | 0.2463 | 0.0668 |
| No log | 36.0 | 468 | 0.3403 | 0.8 | 0.3394 | 1.1435 | 0.8000 | 0.7664 | 0.2674 | 0.0625 |
| No log | 37.0 | 481 | 0.3390 | 0.785 | 0.3379 | 1.1183 | 0.785 | 0.7552 | 0.2432 | 0.0633 |
| No log | 38.0 | 494 | 0.3413 | 0.79 | 0.3347 | 1.1538 | 0.79 | 0.7406 | 0.2239 | 0.0615 |
| 0.2994 | 39.0 | 507 | 0.3364 | 0.795 | 0.3362 | 0.9975 | 0.795 | 0.7650 | 0.2334 | 0.0639 |
| 0.2994 | 40.0 | 520 | 0.3340 | 0.79 | 0.3328 | 1.0045 | 0.79 | 0.7466 | 0.2711 | 0.0580 |
| 0.2994 | 41.0 | 533 | 0.3381 | 0.77 | 0.3391 | 0.9829 | 0.7700 | 0.7427 | 0.2147 | 0.0675 |
| 0.2994 | 42.0 | 546 | 0.3297 | 0.8 | 0.3319 | 1.0739 | 0.8000 | 0.7685 | 0.2613 | 0.0585 |
| 0.2994 | 43.0 | 559 | 0.3338 | 0.8 | 0.3373 | 1.1507 | 0.8000 | 0.7719 | 0.2491 | 0.0637 |
| 0.2994 | 44.0 | 572 | 0.3316 | 0.79 | 0.3359 | 1.1274 | 0.79 | 0.7539 | 0.2469 | 0.0620 |
| 0.2994 | 45.0 | 585 | 0.3283 | 0.79 | 0.3336 | 1.0644 | 0.79 | 0.7531 | 0.2636 | 0.0612 |
| 0.2994 | 46.0 | 598 | 0.3297 | 0.8 | 0.3344 | 1.1343 | 0.8000 | 0.7670 | 0.2317 | 0.0600 |
| 0.2994 | 47.0 | 611 | 0.3293 | 0.79 | 0.3318 | 1.0692 | 0.79 | 0.7542 | 0.2396 | 0.0616 |
| 0.2994 | 48.0 | 624 | 0.3339 | 0.79 | 0.3357 | 1.1225 | 0.79 | 0.7590 | 0.2508 | 0.0617 |
| 0.2994 | 49.0 | 637 | 0.3290 | 0.795 | 0.3343 | 1.0692 | 0.795 | 0.7618 | 0.2529 | 0.0604 |
| 0.2994 | 50.0 | 650 | 0.3298 | 0.79 | 0.3348 | 1.1343 | 0.79 | 0.7591 | 0.2330 | 0.0609 |
| 0.2994 | 51.0 | 663 | 0.3305 | 0.795 | 0.3330 | 1.0045 | 0.795 | 0.7618 | 0.2357 | 0.0607 |
| 0.2994 | 52.0 | 676 | 0.3299 | 0.79 | 0.3339 | 1.0722 | 0.79 | 0.7542 | 0.2562 | 0.0614 |
| 0.2994 | 53.0 | 689 | 0.3280 | 0.8 | 0.3325 | 1.0688 | 0.8000 | 0.7685 | 0.2500 | 0.0593 |
| 0.2994 | 54.0 | 702 | 0.3284 | 0.795 | 0.3323 | 1.0175 | 0.795 | 0.7618 | 0.2436 | 0.0598 |
| 0.2994 | 55.0 | 715 | 0.3287 | 0.79 | 0.3331 | 1.0750 | 0.79 | 0.7591 | 0.2497 | 0.0604 |
| 0.2994 | 56.0 | 728 | 0.3286 | 0.795 | 0.3335 | 1.0115 | 0.795 | 0.7618 | 0.2296 | 0.0602 |
| 0.2994 | 57.0 | 741 | 0.3285 | 0.79 | 0.3330 | 1.0648 | 0.79 | 0.7591 | 0.2446 | 0.0602 |
| 0.2994 | 58.0 | 754 | 0.3299 | 0.795 | 0.3339 | 1.0193 | 0.795 | 0.7618 | 0.2345 | 0.0608 |
| 0.2994 | 59.0 | 767 | 0.3294 | 0.79 | 0.3329 | 1.0139 | 0.79 | 0.7591 | 0.2369 | 0.0601 |
| 0.2994 | 60.0 | 780 | 0.3292 | 0.795 | 0.3332 | 1.0118 | 0.795 | 0.7618 | 0.2226 | 0.0601 |
| 0.2994 | 61.0 | 793 | 0.3293 | 0.795 | 0.3333 | 1.0716 | 0.795 | 0.7618 | 0.2282 | 0.0602 |
| 0.2994 | 62.0 | 806 | 0.3294 | 0.795 | 0.3331 | 1.0107 | 0.795 | 0.7618 | 0.2224 | 0.0601 |
| 0.2994 | 63.0 | 819 | 0.3295 | 0.795 | 0.3336 | 1.0144 | 0.795 | 0.7618 | 0.2294 | 0.0605 |
| 0.2994 | 64.0 | 832 | 0.3293 | 0.795 | 0.3332 | 1.0104 | 0.795 | 0.7618 | 0.2324 | 0.0603 |
| 0.2994 | 65.0 | 845 | 0.3298 | 0.795 | 0.3337 | 1.0114 | 0.795 | 0.7618 | 0.2478 | 0.0606 |
| 0.2994 | 66.0 | 858 | 0.3297 | 0.795 | 0.3333 | 1.0076 | 0.795 | 0.7618 | 0.2366 | 0.0601 |
| 0.2994 | 67.0 | 871 | 0.3298 | 0.79 | 0.3338 | 1.0120 | 0.79 | 0.7591 | 0.2513 | 0.0606 |
| 0.2994 | 68.0 | 884 | 0.3297 | 0.795 | 0.3337 | 1.0110 | 0.795 | 0.7618 | 0.2376 | 0.0605 |
| 0.2994 | 69.0 | 897 | 0.3297 | 0.795 | 0.3335 | 1.0115 | 0.795 | 0.7618 | 0.2228 | 0.0602 |
| 0.2994 | 70.0 | 910 | 0.3292 | 0.795 | 0.3333 | 1.0089 | 0.795 | 0.7618 | 0.2215 | 0.0602 |
| 0.2994 | 71.0 | 923 | 0.3297 | 0.795 | 0.3334 | 1.0083 | 0.795 | 0.7618 | 0.2226 | 0.0600 |
| 0.2994 | 72.0 | 936 | 0.3297 | 0.79 | 0.3335 | 1.0072 | 0.79 | 0.7591 | 0.2257 | 0.0604 |
| 0.2994 | 73.0 | 949 | 0.3297 | 0.795 | 0.3332 | 1.0060 | 0.795 | 0.7618 | 0.2381 | 0.0600 |
| 0.2994 | 74.0 | 962 | 0.3295 | 0.795 | 0.3335 | 1.0082 | 0.795 | 0.7618 | 0.2366 | 0.0603 |
| 0.2994 | 75.0 | 975 | 0.3296 | 0.79 | 0.3334 | 1.0089 | 0.79 | 0.7591 | 0.2373 | 0.0601 |
| 0.2994 | 76.0 | 988 | 0.3298 | 0.795 | 0.3334 | 1.0098 | 0.795 | 0.7618 | 0.2310 | 0.0602 |
| 0.0006 | 77.0 | 1001 | 0.3297 | 0.79 | 0.3334 | 1.0084 | 0.79 | 0.7591 | 0.2228 | 0.0603 |
| 0.0006 | 78.0 | 1014 | 0.3297 | 0.79 | 0.3333 | 1.0071 | 0.79 | 0.7591 | 0.2148 | 0.0600 |
| 0.0006 | 79.0 | 1027 | 0.3298 | 0.795 | 0.3334 | 1.0059 | 0.795 | 0.7618 | 0.2309 | 0.0602 |
| 0.0006 | 80.0 | 1040 | 0.3298 | 0.795 | 0.3334 | 1.0046 | 0.795 | 0.7618 | 0.2309 | 0.0602 |
| 0.0006 | 81.0 | 1053 | 0.3298 | 0.79 | 0.3335 | 1.0073 | 0.79 | 0.7591 | 0.2239 | 0.0602 |
| 0.0006 | 82.0 | 1066 | 0.3298 | 0.795 | 0.3336 | 1.0072 | 0.795 | 0.7618 | 0.2317 | 0.0603 |
| 0.0006 | 83.0 | 1079 | 0.3297 | 0.795 | 0.3334 | 1.0055 | 0.795 | 0.7618 | 0.2224 | 0.0601 |
| 0.0006 | 84.0 | 1092 | 0.3298 | 0.79 | 0.3335 | 1.0061 | 0.79 | 0.7591 | 0.2240 | 0.0601 |
| 0.0006 | 85.0 | 1105 | 0.3297 | 0.79 | 0.3334 | 1.0052 | 0.79 | 0.7591 | 0.2322 | 0.0601 |
| 0.0006 | 86.0 | 1118 | 0.3298 | 0.79 | 0.3335 | 1.0059 | 0.79 | 0.7591 | 0.2323 | 0.0602 |
| 0.0006 | 87.0 | 1131 | 0.3298 | 0.79 | 0.3335 | 1.0065 | 0.79 | 0.7591 | 0.2152 | 0.0602 |
| 0.0006 | 88.0 | 1144 | 0.3298 | 0.79 | 0.3335 | 1.0056 | 0.79 | 0.7591 | 0.2235 | 0.0603 |
| 0.0006 | 89.0 | 1157 | 0.3297 | 0.79 | 0.3334 | 1.0050 | 0.79 | 0.7591 | 0.2152 | 0.0602 |
| 0.0006 | 90.0 | 1170 | 0.3297 | 0.79 | 0.3334 | 1.0049 | 0.79 | 0.7591 | 0.2153 | 0.0602 |
| 0.0006 | 91.0 | 1183 | 0.3297 | 0.79 | 0.3334 | 1.0059 | 0.79 | 0.7591 | 0.2234 | 0.0601 |
| 0.0006 | 92.0 | 1196 | 0.3298 | 0.79 | 0.3334 | 1.0049 | 0.79 | 0.7591 | 0.2152 | 0.0602 |
| 0.0006 | 93.0 | 1209 | 0.3299 | 0.79 | 0.3335 | 1.0056 | 0.79 | 0.7591 | 0.2152 | 0.0601 |
| 0.0006 | 94.0 | 1222 | 0.3298 | 0.79 | 0.3335 | 1.0049 | 0.79 | 0.7591 | 0.2152 | 0.0602 |
| 0.0006 | 95.0 | 1235 | 0.3298 | 0.79 | 0.3334 | 1.0048 | 0.79 | 0.7591 | 0.2152 | 0.0602 |
| 0.0006 | 96.0 | 1248 | 0.3298 | 0.79 | 0.3334 | 1.0050 | 0.79 | 0.7591 | 0.2152 | 0.0601 |
| 0.0006 | 97.0 | 1261 | 0.3298 | 0.79 | 0.3335 | 1.0053 | 0.79 | 0.7591 | 0.2152 | 0.0602 |
| 0.0006 | 98.0 | 1274 | 0.3298 | 0.79 | 0.3334 | 1.0051 | 0.79 | 0.7591 | 0.2152 | 0.0602 |
| 0.0006 | 99.0 | 1287 | 0.3298 | 0.79 | 0.3334 | 1.0052 | 0.79 | 0.7591 | 0.2152 | 0.0601 |
| 0.0006 | 100.0 | 1300 | 0.3298 | 0.79 | 0.3334 | 1.0051 | 0.79 | 0.7591 | 0.2152 | 0.0601 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |