Columns: model_id (string, lengths 7–105), model_card (string, lengths 1–130k), model_labels (list, lengths 2–80k)
jordyvl/300-tiny_tobacco3482_kd_NKD_t1.0_g1.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 300-tiny_tobacco3482_kd_NKD_t1.0_g1.5 This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 4.2183 - Accuracy: 0.83 - Brier Loss: 0.2913 - Nll: 1.7770 - F1 Micro: 0.83 - F1 Macro: 0.8264 - Ece: 0.1561 - Aurc: 0.0404 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 13 | 5.3145 | 0.225 | 0.8879 | 8.2660 | 0.225 | 0.1306 | 0.2937 | 0.7940 | | No log | 2.0 | 26 | 4.7229 | 0.395 | 0.7730 | 4.0151 | 0.395 | 0.3145 | 0.3180 | 0.3989 | | No log | 3.0 | 39 | 4.4853 | 0.52 | 0.6589 | 3.4597 | 0.52 | 0.3986 | 0.2918 | 0.2753 | | No log | 4.0 | 52 | 4.3071 | 0.595 | 0.5739 | 2.6724 | 0.595 | 0.5030 | 0.3024 | 0.1820 | | No log | 5.0 | 65 | 4.1849 | 0.67 | 0.5011 | 2.3415 | 0.67 | 0.6072 | 0.2760 | 0.1327 | | No log | 6.0 | 78 | 4.2077 | 0.69 | 0.4390 | 2.6524 | 0.69 | 0.6081 | 0.2261 | 0.1082 | | No log | 7.0 | 91 | 4.0758 | 0.705 | 0.4177 | 2.1183 | 0.705 | 0.6424 | 0.2219 | 0.1011 | | No log | 8.0 | 104 | 4.1727 | 0.705 | 0.4036 | 2.0253 | 0.705 | 0.6324 | 0.2012 | 0.1048 | | No log | 9.0 | 117 | 4.1369 | 0.72 | 0.4007 | 2.1063 | 0.72 | 0.6700 | 0.2155 | 0.1058 | | No log | 10.0 | 130 | 4.2585 | 0.705 | 0.4100 | 2.2749 | 0.705 | 0.6445 | 0.2011 | 0.1081 | | No log | 11.0 | 143 | 3.9977 | 0.765 | 0.3467 | 1.7895 | 0.765 | 0.6913 | 0.1865 | 0.0703 | | No log | 12.0 | 156 | 4.1178 | 0.72 | 0.3606 | 1.5964 | 0.72 | 0.6535 | 0.1917 | 0.0879 | | No log | 13.0 | 169 | 4.2015 | 0.69 | 0.4133 | 2.1808 | 0.69 | 0.6412 | 0.2060 | 0.1157 | | No log | 14.0 | 182 | 3.9586 | 0.775 | 0.3258 | 1.6381 | 0.775 | 0.7529 | 0.1930 | 0.0637 | | No log | 15.0 | 195 | 3.9068 | 0.775 | 0.3176 | 1.3371 | 0.775 | 0.7259 | 0.1640 | 0.0671 | | No log | 16.0 | 208 | 3.9768 | 0.75 | 0.3493 | 1.6983 | 0.75 | 0.7460 | 0.1677 | 0.0827 | | No log | 17.0 | 221 | 3.9648 | 0.77 | 0.3116 | 1.5883 | 0.7700 | 0.7352 | 0.1523 | 0.0632 | | No log | 18.0 | 234 | 3.8706 | 0.8 | 0.2933 | 1.7099 | 0.8000 | 0.7800 | 0.1548 | 0.0568 | | No log | 19.0 | 247 | 3.8302 | 0.82 | 0.2815 | 1.4136 | 0.82 | 0.8092 | 0.1606 | 0.0508 | | No log | 20.0 | 260 | 3.9628 | 0.815 | 0.3061 | 1.4014 | 0.815 | 0.8032 | 0.1701 | 0.0645 | | No log | 21.0 | 273 | 3.9288 | 0.81 | 0.3065 | 1.3082 | 0.81 | 0.8034 | 0.1744 | 0.0665 | | No log | 22.0 | 286 | 3.8470 | 0.795 | 0.2934 | 1.5781 | 0.795 | 0.7626 | 0.1510 | 0.0557 | | No log | 23.0 | 299 | 3.9932 | 0.785 | 0.3214 | 1.6262 | 0.785 | 0.7678 | 0.1626 | 0.0646 | | No log | 24.0 | 312 | 3.9308 | 0.795 | 0.3165 | 1.7210 | 0.795 | 0.7907 | 0.1692 | 0.0661 | | No log | 25.0 | 325 | 
3.9077 | 0.82 | 0.2895 | 1.7608 | 0.82 | 0.8019 | 0.1556 | 0.0577 | | No log | 26.0 | 338 | 3.8450 | 0.82 | 0.2816 | 1.6690 | 0.82 | 0.8035 | 0.1629 | 0.0557 | | No log | 27.0 | 351 | 4.0009 | 0.815 | 0.3063 | 1.7843 | 0.815 | 0.8128 | 0.1705 | 0.0679 | | No log | 28.0 | 364 | 3.8787 | 0.815 | 0.2839 | 1.6762 | 0.815 | 0.8071 | 0.1355 | 0.0535 | | No log | 29.0 | 377 | 3.8246 | 0.825 | 0.2713 | 1.7012 | 0.825 | 0.8100 | 0.1428 | 0.0503 | | No log | 30.0 | 390 | 3.8846 | 0.815 | 0.2758 | 1.7815 | 0.815 | 0.8025 | 0.1440 | 0.0490 | | No log | 31.0 | 403 | 3.9277 | 0.81 | 0.2935 | 1.5897 | 0.81 | 0.8011 | 0.1523 | 0.0586 | | No log | 32.0 | 416 | 3.8308 | 0.805 | 0.2744 | 1.5821 | 0.805 | 0.7951 | 0.1290 | 0.0475 | | No log | 33.0 | 429 | 3.8453 | 0.82 | 0.2668 | 1.6016 | 0.82 | 0.8070 | 0.1307 | 0.0476 | | No log | 34.0 | 442 | 3.8837 | 0.81 | 0.2767 | 1.6053 | 0.81 | 0.8003 | 0.1368 | 0.0464 | | No log | 35.0 | 455 | 3.8103 | 0.83 | 0.2561 | 1.6922 | 0.83 | 0.8328 | 0.1285 | 0.0434 | | No log | 36.0 | 468 | 3.8702 | 0.81 | 0.2712 | 1.8361 | 0.81 | 0.8011 | 0.1387 | 0.0465 | | No log | 37.0 | 481 | 3.8080 | 0.81 | 0.2635 | 1.7576 | 0.81 | 0.7993 | 0.1247 | 0.0398 | | No log | 38.0 | 494 | 3.8234 | 0.81 | 0.2672 | 1.7059 | 0.81 | 0.8008 | 0.1349 | 0.0420 | | 3.5835 | 39.0 | 507 | 3.8962 | 0.81 | 0.2789 | 1.7265 | 0.81 | 0.8106 | 0.1459 | 0.0448 | | 3.5835 | 40.0 | 520 | 3.8600 | 0.805 | 0.2731 | 1.7552 | 0.805 | 0.7917 | 0.1328 | 0.0415 | | 3.5835 | 41.0 | 533 | 3.9201 | 0.815 | 0.2799 | 1.8308 | 0.815 | 0.8072 | 0.1416 | 0.0428 | | 3.5835 | 42.0 | 546 | 3.9188 | 0.815 | 0.2721 | 1.7661 | 0.815 | 0.8102 | 0.1153 | 0.0416 | | 3.5835 | 43.0 | 559 | 3.8939 | 0.82 | 0.2660 | 1.6435 | 0.82 | 0.8171 | 0.1349 | 0.0414 | | 3.5835 | 44.0 | 572 | 3.9485 | 0.8 | 0.2806 | 1.8239 | 0.8000 | 0.7896 | 0.1515 | 0.0432 | | 3.5835 | 45.0 | 585 | 3.9293 | 0.805 | 0.2731 | 1.7495 | 0.805 | 0.7980 | 0.1468 | 0.0408 | | 3.5835 | 46.0 | 598 | 3.9623 | 0.815 | 0.2744 | 1.8305 | 0.815 | 0.8102 | 0.1397 | 0.0423 | | 3.5835 | 47.0 | 611 | 3.9737 | 0.815 | 0.2776 | 1.8283 | 0.815 | 0.8132 | 0.1394 | 0.0415 | | 3.5835 | 48.0 | 624 | 4.0066 | 0.825 | 0.2839 | 1.8282 | 0.825 | 0.8232 | 0.1422 | 0.0420 | | 3.5835 | 49.0 | 637 | 4.0039 | 0.82 | 0.2789 | 1.8268 | 0.82 | 0.8166 | 0.1371 | 0.0423 | | 3.5835 | 50.0 | 650 | 4.0113 | 0.82 | 0.2812 | 1.7108 | 0.82 | 0.8166 | 0.1352 | 0.0411 | | 3.5835 | 51.0 | 663 | 4.0170 | 0.82 | 0.2810 | 1.7028 | 0.82 | 0.8197 | 0.1308 | 0.0404 | | 3.5835 | 52.0 | 676 | 4.0325 | 0.82 | 0.2806 | 1.7729 | 0.82 | 0.8197 | 0.1364 | 0.0413 | | 3.5835 | 53.0 | 689 | 4.0359 | 0.815 | 0.2824 | 1.7656 | 0.815 | 0.8065 | 0.1372 | 0.0417 | | 3.5835 | 54.0 | 702 | 4.0536 | 0.82 | 0.2833 | 1.7665 | 0.82 | 0.8197 | 0.1390 | 0.0414 | | 3.5835 | 55.0 | 715 | 4.0646 | 0.825 | 0.2864 | 1.6994 | 0.825 | 0.8232 | 0.1505 | 0.0405 | | 3.5835 | 56.0 | 728 | 4.0602 | 0.83 | 0.2830 | 1.7634 | 0.83 | 0.8264 | 0.1543 | 0.0408 | | 3.5835 | 57.0 | 741 | 4.0725 | 0.825 | 0.2850 | 1.7749 | 0.825 | 0.8220 | 0.1521 | 0.0411 | | 3.5835 | 58.0 | 754 | 4.0659 | 0.83 | 0.2819 | 1.7018 | 0.83 | 0.8252 | 0.1426 | 0.0400 | | 3.5835 | 59.0 | 767 | 4.0838 | 0.83 | 0.2832 | 1.8336 | 0.83 | 0.8264 | 0.1441 | 0.0416 | | 3.5835 | 60.0 | 780 | 4.0928 | 0.825 | 0.2850 | 1.7666 | 0.825 | 0.8232 | 0.1425 | 0.0411 | | 3.5835 | 61.0 | 793 | 4.0945 | 0.83 | 0.2841 | 1.7751 | 0.83 | 0.8252 | 0.1442 | 0.0412 | | 3.5835 | 62.0 | 806 | 4.1037 | 0.83 | 0.2863 | 1.7089 | 0.83 | 0.8252 | 0.1389 | 0.0404 | | 3.5835 | 63.0 | 819 | 4.1134 | 0.825 | 0.2879 | 
1.6997 | 0.825 | 0.8220 | 0.1299 | 0.0404 | | 3.5835 | 64.0 | 832 | 4.1176 | 0.83 | 0.2869 | 1.7757 | 0.83 | 0.8264 | 0.1378 | 0.0411 | | 3.5835 | 65.0 | 845 | 4.1232 | 0.83 | 0.2862 | 1.7746 | 0.83 | 0.8264 | 0.1424 | 0.0406 | | 3.5835 | 66.0 | 858 | 4.1335 | 0.83 | 0.2866 | 1.8341 | 0.83 | 0.8252 | 0.1439 | 0.0414 | | 3.5835 | 67.0 | 871 | 4.1352 | 0.825 | 0.2865 | 1.7806 | 0.825 | 0.8229 | 0.1495 | 0.0413 | | 3.5835 | 68.0 | 884 | 4.1348 | 0.83 | 0.2864 | 1.7896 | 0.83 | 0.8252 | 0.1427 | 0.0405 | | 3.5835 | 69.0 | 897 | 4.1445 | 0.83 | 0.2883 | 1.7743 | 0.83 | 0.8252 | 0.1404 | 0.0406 | | 3.5835 | 70.0 | 910 | 4.1455 | 0.83 | 0.2868 | 1.8347 | 0.83 | 0.8252 | 0.1445 | 0.0409 | | 3.5835 | 71.0 | 923 | 4.1512 | 0.83 | 0.2876 | 1.7757 | 0.83 | 0.8252 | 0.1450 | 0.0409 | | 3.5835 | 72.0 | 936 | 4.1579 | 0.83 | 0.2884 | 1.7730 | 0.83 | 0.8252 | 0.1463 | 0.0407 | | 3.5835 | 73.0 | 949 | 4.1602 | 0.83 | 0.2879 | 1.7744 | 0.83 | 0.8252 | 0.1400 | 0.0406 | | 3.5835 | 74.0 | 962 | 4.1661 | 0.83 | 0.2886 | 1.7724 | 0.83 | 0.8252 | 0.1464 | 0.0407 | | 3.5835 | 75.0 | 975 | 4.1724 | 0.83 | 0.2894 | 1.7735 | 0.83 | 0.8252 | 0.1465 | 0.0407 | | 3.5835 | 76.0 | 988 | 4.1726 | 0.83 | 0.2888 | 1.7730 | 0.83 | 0.8252 | 0.1455 | 0.0406 | | 3.1987 | 77.0 | 1001 | 4.1784 | 0.83 | 0.2894 | 1.7734 | 0.83 | 0.8252 | 0.1460 | 0.0404 | | 3.1987 | 78.0 | 1014 | 4.1805 | 0.83 | 0.2891 | 1.7744 | 0.83 | 0.8264 | 0.1455 | 0.0406 | | 3.1987 | 79.0 | 1027 | 4.1858 | 0.83 | 0.2898 | 1.7749 | 0.83 | 0.8252 | 0.1551 | 0.0406 | | 3.1987 | 80.0 | 1040 | 4.1881 | 0.83 | 0.2898 | 1.7749 | 0.83 | 0.8264 | 0.1549 | 0.0406 | | 3.1987 | 81.0 | 1053 | 4.1907 | 0.83 | 0.2899 | 1.7744 | 0.83 | 0.8252 | 0.1562 | 0.0405 | | 3.1987 | 82.0 | 1066 | 4.1930 | 0.83 | 0.2900 | 1.7763 | 0.83 | 0.8252 | 0.1555 | 0.0406 | | 3.1987 | 83.0 | 1079 | 4.1954 | 0.83 | 0.2900 | 1.7761 | 0.83 | 0.8264 | 0.1552 | 0.0406 | | 3.1987 | 84.0 | 1092 | 4.2000 | 0.83 | 0.2905 | 1.7773 | 0.83 | 0.8264 | 0.1553 | 0.0408 | | 3.1987 | 85.0 | 1105 | 4.2015 | 0.83 | 0.2905 | 1.7746 | 0.83 | 0.8252 | 0.1501 | 0.0404 | | 3.1987 | 86.0 | 1118 | 4.2041 | 0.83 | 0.2909 | 1.7765 | 0.83 | 0.8252 | 0.1565 | 0.0405 | | 3.1987 | 87.0 | 1131 | 4.2060 | 0.83 | 0.2908 | 1.7768 | 0.83 | 0.8264 | 0.1556 | 0.0405 | | 3.1987 | 88.0 | 1144 | 4.2070 | 0.83 | 0.2908 | 1.7756 | 0.83 | 0.8252 | 0.1565 | 0.0404 | | 3.1987 | 89.0 | 1157 | 4.2093 | 0.83 | 0.2908 | 1.7770 | 0.83 | 0.8264 | 0.1556 | 0.0407 | | 3.1987 | 90.0 | 1170 | 4.2099 | 0.83 | 0.2909 | 1.7763 | 0.83 | 0.8252 | 0.1559 | 0.0404 | | 3.1987 | 91.0 | 1183 | 4.2123 | 0.83 | 0.2910 | 1.7776 | 0.83 | 0.8264 | 0.1559 | 0.0407 | | 3.1987 | 92.0 | 1196 | 4.2138 | 0.83 | 0.2912 | 1.7764 | 0.83 | 0.8252 | 0.1512 | 0.0404 | | 3.1987 | 93.0 | 1209 | 4.2142 | 0.83 | 0.2911 | 1.7767 | 0.83 | 0.8252 | 0.1505 | 0.0403 | | 3.1987 | 94.0 | 1222 | 4.2154 | 0.83 | 0.2912 | 1.7765 | 0.83 | 0.8252 | 0.1561 | 0.0404 | | 3.1987 | 95.0 | 1235 | 4.2162 | 0.83 | 0.2912 | 1.7766 | 0.83 | 0.8264 | 0.1560 | 0.0404 | | 3.1987 | 96.0 | 1248 | 4.2165 | 0.83 | 0.2912 | 1.7769 | 0.83 | 0.8264 | 0.1561 | 0.0404 | | 3.1987 | 97.0 | 1261 | 4.2180 | 0.83 | 0.2914 | 1.7773 | 0.83 | 0.8264 | 0.1561 | 0.0404 | | 3.1987 | 98.0 | 1274 | 4.2180 | 0.83 | 0.2913 | 1.7768 | 0.83 | 0.8252 | 0.1561 | 0.0404 | | 3.1987 | 99.0 | 1287 | 4.2183 | 0.83 | 0.2914 | 1.7772 | 0.83 | 0.8264 | 0.1561 | 0.0404 | | 3.1987 | 100.0 | 1300 | 4.2183 | 0.83 | 0.2913 | 1.7770 | 0.83 | 0.8264 | 0.1561 | 0.0404 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - 
Datasets 2.9.0 - Tokenizers 0.13.2
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/300-tiny_tobacco3482_og_simkd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 300-tiny_tobacco3482_og_simkd This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1030.4594 - Accuracy: 0.845 - Brier Loss: 0.2507 - Nll: 1.6602 - F1 Micro: 0.845 - F1 Macro: 0.8273 - Ece: 0.1440 - Aurc: 0.0504 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 50 | 1063.5994 | 0.215 | 0.8548 | 4.5394 | 0.2150 | 0.0866 | 0.2467 | 0.5619 | | No log | 2.0 | 100 | 1058.8120 | 0.53 | 0.6920 | 2.3234 | 0.53 | 0.4420 | 0.3178 | 0.2841 | | No log | 3.0 | 150 | 1054.1038 | 0.63 | 0.5596 | 2.2480 | 0.63 | 0.5709 | 0.3050 | 0.1733 | | No log | 4.0 | 200 | 1049.1915 | 0.67 | 0.4903 | 2.6916 | 0.67 | 0.5866 | 0.2481 | 0.1460 | | No log | 5.0 | 250 | 1049.7939 | 0.71 | 0.4663 | 1.9364 | 0.7100 | 0.6359 | 0.2438 | 0.1365 | | No log | 6.0 | 300 | 1048.1417 | 0.69 | 0.4830 | 2.1830 | 0.69 | 0.6309 | 0.2518 | 0.1379 | | No log | 7.0 | 350 | 1048.1290 | 0.725 | 0.4265 | 2.1076 | 0.7250 | 0.6383 | 0.2114 | 0.1172 | | No log | 8.0 | 400 | 1044.3457 | 0.775 | 0.3807 | 1.9343 | 0.775 | 0.7467 | 0.2407 | 0.0982 | | No log | 9.0 | 450 | 1046.8309 | 0.67 | 0.4528 | 2.7967 | 0.67 | 0.6493 | 0.2279 | 0.1225 | | 1185.4053 | 10.0 | 500 | 1044.5997 | 0.76 | 0.3748 | 2.0448 | 0.76 | 0.7066 | 0.1889 | 0.0965 | | 1185.4053 | 11.0 | 550 | 1046.6857 | 0.745 | 0.3799 | 2.1911 | 0.745 | 0.7195 | 0.2129 | 0.0936 | | 1185.4053 | 12.0 | 600 | 1036.3792 | 0.69 | 0.4258 | 2.9605 | 0.69 | 0.6744 | 0.1946 | 0.1255 | | 1185.4053 | 13.0 | 650 | 1042.4764 | 0.715 | 0.4055 | 2.8193 | 0.715 | 0.7326 | 0.1879 | 0.0969 | | 1185.4053 | 14.0 | 700 | 1040.1765 | 0.765 | 0.3811 | 2.5360 | 0.765 | 0.7497 | 0.2079 | 0.0957 | | 1185.4053 | 15.0 | 750 | 1045.1747 | 0.655 | 0.4732 | 3.4740 | 0.655 | 0.6577 | 0.2284 | 0.1180 | | 1185.4053 | 16.0 | 800 | 1035.7695 | 0.785 | 0.3445 | 2.7152 | 0.785 | 0.7564 | 0.1876 | 0.1176 | | 1185.4053 | 17.0 | 850 | 1037.5389 | 0.755 | 0.3937 | 2.8771 | 0.755 | 0.7418 | 0.2153 | 0.1097 | | 1185.4053 | 18.0 | 900 | 1045.9521 | 0.775 | 0.3902 | 2.4149 | 0.775 | 0.7261 | 0.2202 | 0.1157 | | 1185.4053 | 19.0 | 950 | 1039.2958 | 0.81 | 0.3229 | 1.8751 | 0.81 | 0.8025 | 0.1808 | 0.0855 | | 1173.9102 | 20.0 | 1000 | 1037.2432 | 0.765 | 0.3557 | 2.4940 | 0.765 | 0.7378 | 0.1849 | 0.0929 | | 1173.9102 | 21.0 | 1050 | 1042.5205 | 0.785 | 0.3416 | 2.1312 | 0.785 | 0.7641 | 0.2139 | 0.0791 | | 1173.9102 | 22.0 | 1100 | 1031.1548 | 0.785 | 0.3667 | 2.6195 | 0.785 | 0.7555 | 0.1945 | 0.1464 | | 1173.9102 | 23.0 | 1150 | 1038.0391 | 0.815 | 0.3085 | 2.4661 | 0.815 | 0.7769 | 0.1814 | 0.0923 
| | 1173.9102 | 24.0 | 1200 | 1033.7769 | 0.75 | 0.3758 | 2.5229 | 0.75 | 0.7341 | 0.1910 | 0.1272 | | 1173.9102 | 25.0 | 1250 | 1040.0878 | 0.82 | 0.2923 | 2.2061 | 0.82 | 0.7882 | 0.1810 | 0.0600 | | 1173.9102 | 26.0 | 1300 | 1041.4139 | 0.84 | 0.2750 | 2.0378 | 0.8400 | 0.8224 | 0.1849 | 0.0645 | | 1173.9102 | 27.0 | 1350 | 1033.1660 | 0.835 | 0.2818 | 1.9645 | 0.835 | 0.8018 | 0.1471 | 0.0771 | | 1173.9102 | 28.0 | 1400 | 1036.1976 | 0.82 | 0.2973 | 2.2955 | 0.82 | 0.7993 | 0.1691 | 0.0755 | | 1173.9102 | 29.0 | 1450 | 1035.8601 | 0.845 | 0.2740 | 2.1097 | 0.845 | 0.8171 | 0.1696 | 0.0637 | | 1169.7248 | 30.0 | 1500 | 1033.2889 | 0.82 | 0.3046 | 2.2615 | 0.82 | 0.7985 | 0.1820 | 0.0910 | | 1169.7248 | 31.0 | 1550 | 1030.7528 | 0.85 | 0.2549 | 1.8264 | 0.85 | 0.8265 | 0.1459 | 0.0798 | | 1169.7248 | 32.0 | 1600 | 1030.7676 | 0.835 | 0.2708 | 1.9641 | 0.835 | 0.8127 | 0.1646 | 0.0648 | | 1169.7248 | 33.0 | 1650 | 1033.2319 | 0.84 | 0.2689 | 1.9015 | 0.8400 | 0.8091 | 0.1652 | 0.0741 | | 1169.7248 | 34.0 | 1700 | 1039.8257 | 0.82 | 0.2974 | 2.2209 | 0.82 | 0.7998 | 0.1767 | 0.0687 | | 1169.7248 | 35.0 | 1750 | 1035.8896 | 0.865 | 0.2331 | 1.8113 | 0.865 | 0.8458 | 0.1535 | 0.0674 | | 1169.7248 | 36.0 | 1800 | 1038.4092 | 0.82 | 0.3014 | 2.1408 | 0.82 | 0.8013 | 0.1708 | 0.0844 | | 1169.7248 | 37.0 | 1850 | 1034.9086 | 0.84 | 0.2722 | 1.8873 | 0.8400 | 0.8233 | 0.1509 | 0.0689 | | 1169.7248 | 38.0 | 1900 | 1034.8866 | 0.84 | 0.2751 | 2.1269 | 0.8400 | 0.8196 | 0.1647 | 0.0781 | | 1169.7248 | 39.0 | 1950 | 1031.3431 | 0.815 | 0.2934 | 2.0669 | 0.815 | 0.7844 | 0.1669 | 0.0839 | | 1167.1226 | 40.0 | 2000 | 1031.2334 | 0.84 | 0.2766 | 2.1016 | 0.8400 | 0.8322 | 0.1580 | 0.0847 | | 1167.1226 | 41.0 | 2050 | 1033.4176 | 0.84 | 0.2725 | 1.7005 | 0.8400 | 0.8114 | 0.1473 | 0.0731 | | 1167.1226 | 42.0 | 2100 | 1032.7366 | 0.83 | 0.2871 | 2.1337 | 0.83 | 0.8037 | 0.1592 | 0.0852 | | 1167.1226 | 43.0 | 2150 | 1032.4486 | 0.845 | 0.2683 | 2.0076 | 0.845 | 0.8218 | 0.1645 | 0.0852 | | 1167.1226 | 44.0 | 2200 | 1035.2195 | 0.84 | 0.2660 | 1.8355 | 0.8400 | 0.8147 | 0.1670 | 0.0643 | | 1167.1226 | 45.0 | 2250 | 1034.2225 | 0.84 | 0.2556 | 1.8249 | 0.8400 | 0.8022 | 0.1622 | 0.0598 | | 1167.1226 | 46.0 | 2300 | 1032.7642 | 0.845 | 0.2534 | 1.6936 | 0.845 | 0.8166 | 0.1438 | 0.0639 | | 1167.1226 | 47.0 | 2350 | 1036.8080 | 0.83 | 0.2765 | 1.7153 | 0.83 | 0.8076 | 0.1638 | 0.0600 | | 1167.1226 | 48.0 | 2400 | 1033.1261 | 0.865 | 0.2290 | 1.7616 | 0.865 | 0.8577 | 0.1402 | 0.0536 | | 1167.1226 | 49.0 | 2450 | 1036.7234 | 0.84 | 0.2656 | 1.7275 | 0.8400 | 0.8093 | 0.1459 | 0.0597 | | 1165.4459 | 50.0 | 2500 | 1026.7501 | 0.85 | 0.2643 | 2.0250 | 0.85 | 0.8276 | 0.1577 | 0.0791 | | 1165.4459 | 51.0 | 2550 | 1033.3628 | 0.825 | 0.2804 | 1.7547 | 0.825 | 0.7943 | 0.1592 | 0.0589 | | 1165.4459 | 52.0 | 2600 | 1033.1439 | 0.845 | 0.2433 | 1.8201 | 0.845 | 0.8156 | 0.1370 | 0.0559 | | 1165.4459 | 53.0 | 2650 | 1031.2720 | 0.85 | 0.2602 | 1.8013 | 0.85 | 0.8276 | 0.1550 | 0.0637 | | 1165.4459 | 54.0 | 2700 | 1031.4177 | 0.845 | 0.2625 | 1.7994 | 0.845 | 0.8295 | 0.1514 | 0.0590 | | 1165.4459 | 55.0 | 2750 | 1029.6421 | 0.84 | 0.2534 | 1.5053 | 0.8400 | 0.8163 | 0.1512 | 0.0541 | | 1165.4459 | 56.0 | 2800 | 1032.8274 | 0.85 | 0.2475 | 1.7805 | 0.85 | 0.8405 | 0.1384 | 0.0554 | | 1165.4459 | 57.0 | 2850 | 1030.2781 | 0.835 | 0.2608 | 1.6087 | 0.835 | 0.8059 | 0.1570 | 0.0585 | | 1165.4459 | 58.0 | 2900 | 1032.2584 | 0.85 | 0.2434 | 1.6753 | 0.85 | 0.8311 | 0.1389 | 0.0606 | | 1165.4459 | 59.0 | 2950 | 1031.4507 | 
0.845 | 0.2590 | 1.7351 | 0.845 | 0.8291 | 0.1464 | 0.0556 | | 1163.88 | 60.0 | 3000 | 1031.8136 | 0.85 | 0.2461 | 1.8215 | 0.85 | 0.8329 | 0.1438 | 0.0578 | | 1163.88 | 61.0 | 3050 | 1032.6718 | 0.86 | 0.2410 | 1.6261 | 0.8600 | 0.8465 | 0.1483 | 0.0566 | | 1163.88 | 62.0 | 3100 | 1028.8600 | 0.855 | 0.2488 | 1.6111 | 0.855 | 0.8358 | 0.1421 | 0.0578 | | 1163.88 | 63.0 | 3150 | 1029.1522 | 0.845 | 0.2787 | 1.8558 | 0.845 | 0.8135 | 0.1539 | 0.0591 | | 1163.88 | 64.0 | 3200 | 1026.7474 | 0.85 | 0.2457 | 1.9070 | 0.85 | 0.8280 | 0.1411 | 0.0570 | | 1163.88 | 65.0 | 3250 | 1032.3363 | 0.845 | 0.2595 | 1.7872 | 0.845 | 0.8169 | 0.1435 | 0.0616 | | 1163.88 | 66.0 | 3300 | 1031.6006 | 0.855 | 0.2474 | 1.7063 | 0.855 | 0.8357 | 0.1357 | 0.0521 | | 1163.88 | 67.0 | 3350 | 1031.1060 | 0.855 | 0.2420 | 1.5538 | 0.855 | 0.8401 | 0.1357 | 0.0562 | | 1163.88 | 68.0 | 3400 | 1032.2750 | 0.855 | 0.2443 | 1.7408 | 0.855 | 0.8455 | 0.1348 | 0.0520 | | 1163.88 | 69.0 | 3450 | 1031.3723 | 0.84 | 0.2523 | 1.6058 | 0.8400 | 0.8211 | 0.1356 | 0.0550 | | 1162.7836 | 70.0 | 3500 | 1032.2831 | 0.855 | 0.2539 | 1.6918 | 0.855 | 0.8386 | 0.1406 | 0.0605 | | 1162.7836 | 71.0 | 3550 | 1028.9554 | 0.85 | 0.2479 | 1.6822 | 0.85 | 0.8290 | 0.1395 | 0.0567 | | 1162.7836 | 72.0 | 3600 | 1031.4127 | 0.855 | 0.2495 | 1.7386 | 0.855 | 0.8386 | 0.1365 | 0.0564 | | 1162.7836 | 73.0 | 3650 | 1031.2170 | 0.86 | 0.2450 | 1.7208 | 0.8600 | 0.8454 | 0.1337 | 0.0551 | | 1162.7836 | 74.0 | 3700 | 1030.2428 | 0.86 | 0.2458 | 1.7499 | 0.8600 | 0.8454 | 0.1490 | 0.0525 | | 1162.7836 | 75.0 | 3750 | 1031.7998 | 0.86 | 0.2442 | 1.6081 | 0.8600 | 0.8384 | 0.1373 | 0.0569 | | 1162.7836 | 76.0 | 3800 | 1030.5450 | 0.85 | 0.2469 | 1.5548 | 0.85 | 0.8354 | 0.1380 | 0.0563 | | 1162.7836 | 77.0 | 3850 | 1031.4061 | 0.855 | 0.2383 | 1.6571 | 0.855 | 0.8376 | 0.1313 | 0.0549 | | 1162.7836 | 78.0 | 3900 | 1033.5166 | 0.85 | 0.2554 | 1.5379 | 0.85 | 0.8279 | 0.1400 | 0.0572 | | 1162.7836 | 79.0 | 3950 | 1033.4709 | 0.855 | 0.2496 | 1.5471 | 0.855 | 0.8437 | 0.1287 | 0.0524 | | 1162.105 | 80.0 | 4000 | 1031.8766 | 0.85 | 0.2453 | 1.6612 | 0.85 | 0.8325 | 0.1388 | 0.0556 | | 1162.105 | 81.0 | 4050 | 1030.5789 | 0.855 | 0.2441 | 1.7341 | 0.855 | 0.8356 | 0.1259 | 0.0535 | | 1162.105 | 82.0 | 4100 | 1033.6682 | 0.85 | 0.2517 | 1.6263 | 0.85 | 0.8340 | 0.1443 | 0.0513 | | 1162.105 | 83.0 | 4150 | 1031.2977 | 0.855 | 0.2425 | 1.6676 | 0.855 | 0.8427 | 0.1334 | 0.0537 | | 1162.105 | 84.0 | 4200 | 1027.3252 | 0.86 | 0.2456 | 1.7430 | 0.8600 | 0.8515 | 0.1326 | 0.0542 | | 1162.105 | 85.0 | 4250 | 1030.4331 | 0.855 | 0.2454 | 1.7415 | 0.855 | 0.8439 | 0.1507 | 0.0501 | | 1162.105 | 86.0 | 4300 | 1031.9957 | 0.845 | 0.2553 | 1.6819 | 0.845 | 0.8322 | 0.1463 | 0.0497 | | 1162.105 | 87.0 | 4350 | 1031.3456 | 0.845 | 0.2543 | 1.7632 | 0.845 | 0.8240 | 0.1503 | 0.0556 | | 1162.105 | 88.0 | 4400 | 1030.5261 | 0.845 | 0.2522 | 1.7459 | 0.845 | 0.8243 | 0.1450 | 0.0482 | | 1162.105 | 89.0 | 4450 | 1028.4374 | 0.845 | 0.2507 | 1.6968 | 0.845 | 0.8243 | 0.1356 | 0.0538 | | 1161.3639 | 90.0 | 4500 | 1030.4808 | 0.845 | 0.2501 | 1.6386 | 0.845 | 0.8240 | 0.1442 | 0.0531 | | 1161.3639 | 91.0 | 4550 | 1032.4968 | 0.845 | 0.2583 | 1.7297 | 0.845 | 0.8235 | 0.1329 | 0.0519 | | 1161.3639 | 92.0 | 4600 | 1031.8092 | 0.85 | 0.2523 | 1.6594 | 0.85 | 0.8340 | 0.1411 | 0.0519 | | 1161.3639 | 93.0 | 4650 | 1031.8936 | 0.85 | 0.2539 | 1.7723 | 0.85 | 0.8366 | 0.1392 | 0.0475 | | 1161.3639 | 94.0 | 4700 | 1031.4855 | 0.855 | 0.2525 | 1.6232 | 0.855 | 0.8448 | 0.1477 | 0.0508 | | 
1161.3639 | 95.0 | 4750 | 1027.4985 | 0.855 | 0.2502 | 1.8411 | 0.855 | 0.8448 | 0.1442 | 0.0510 | | 1161.3639 | 96.0 | 4800 | 1031.0199 | 0.85 | 0.2500 | 1.7087 | 0.85 | 0.8353 | 0.1342 | 0.0488 | | 1161.3639 | 97.0 | 4850 | 1031.1750 | 0.84 | 0.2592 | 1.7172 | 0.8400 | 0.8152 | 0.1407 | 0.0530 | | 1161.3639 | 98.0 | 4900 | 1032.8660 | 0.845 | 0.2541 | 1.6367 | 0.845 | 0.8273 | 0.1352 | 0.0488 | | 1161.3639 | 99.0 | 4950 | 1029.1611 | 0.855 | 0.2509 | 1.6982 | 0.855 | 0.8448 | 0.1485 | 0.0525 | | 1161.2033 | 100.0 | 5000 | 1030.4594 | 0.845 | 0.2507 | 1.6602 | 0.845 | 0.8273 | 0.1440 | 0.0504 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/225-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 225-tiny_tobacco3482_kd_CEKD_t2.5_a0.5 This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.4875 - Accuracy: 0.805 - Brier Loss: 0.3011 - Nll: 1.4097 - F1 Micro: 0.805 - F1 Macro: 0.7862 - Ece: 0.2181 - Aurc: 0.0530 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 13 | 1.7287 | 0.235 | 0.8932 | 7.9209 | 0.235 | 0.1394 | 0.3142 | 0.7431 | | No log | 2.0 | 26 | 1.2278 | 0.455 | 0.6937 | 3.7028 | 0.455 | 0.3630 | 0.2894 | 0.3370 | | No log | 3.0 | 39 | 1.0392 | 0.545 | 0.5804 | 2.0622 | 0.545 | 0.4755 | 0.2549 | 0.2370 | | No log | 4.0 | 52 | 0.8315 | 0.635 | 0.4677 | 1.8368 | 0.635 | 0.5858 | 0.2429 | 0.1436 | | No log | 5.0 | 65 | 0.7488 | 0.72 | 0.4187 | 1.4849 | 0.72 | 0.6856 | 0.2558 | 0.1081 | | No log | 6.0 | 78 | 0.7076 | 0.735 | 0.3886 | 1.4403 | 0.735 | 0.6840 | 0.2107 | 0.1002 | | No log | 7.0 | 91 | 0.6946 | 0.725 | 0.3876 | 1.5056 | 0.7250 | 0.7017 | 0.2229 | 0.0982 | | No log | 8.0 | 104 | 0.7344 | 0.72 | 0.3928 | 1.8498 | 0.72 | 0.6796 | 0.1817 | 0.1016 | | No log | 9.0 | 117 | 0.7380 | 0.735 | 0.4000 | 1.8338 | 0.735 | 0.7369 | 0.2382 | 0.1032 | | No log | 10.0 | 130 | 0.6790 | 0.76 | 0.3677 | 1.8100 | 0.76 | 0.7343 | 0.2346 | 0.1003 | | No log | 11.0 | 143 | 0.7008 | 0.735 | 0.3997 | 1.8469 | 0.735 | 0.7211 | 0.2510 | 0.1226 | | No log | 12.0 | 156 | 0.6378 | 0.74 | 0.3762 | 1.7911 | 0.74 | 0.7187 | 0.2214 | 0.0869 | | No log | 13.0 | 169 | 0.6066 | 0.745 | 0.3544 | 1.6249 | 0.745 | 0.7012 | 0.2009 | 0.0823 | | No log | 14.0 | 182 | 0.5888 | 0.77 | 0.3365 | 1.7603 | 0.7700 | 0.7568 | 0.2047 | 0.0613 | | No log | 15.0 | 195 | 0.5817 | 0.765 | 0.3430 | 1.6581 | 0.765 | 0.7411 | 0.2345 | 0.0661 | | No log | 16.0 | 208 | 0.5510 | 0.795 | 0.3265 | 1.3740 | 0.795 | 0.7815 | 0.2347 | 0.0590 | | No log | 17.0 | 221 | 0.5449 | 0.77 | 0.3362 | 1.3348 | 0.7700 | 0.7541 | 0.1971 | 0.0709 | | No log | 18.0 | 234 | 0.5686 | 0.775 | 0.3363 | 1.8806 | 0.775 | 0.7604 | 0.2006 | 0.0654 | | No log | 19.0 | 247 | 0.5499 | 0.82 | 0.3237 | 1.3751 | 0.82 | 0.7981 | 0.2471 | 0.0586 | | No log | 20.0 | 260 | 0.5321 | 0.77 | 0.3213 | 1.5905 | 0.7700 | 0.7474 | 0.2332 | 0.0643 | | No log | 21.0 | 273 | 0.5349 | 0.805 | 0.3050 | 2.0032 | 0.805 | 0.7788 | 0.2238 | 0.0533 | | No log | 22.0 | 286 | 0.5318 | 0.8 | 0.3105 | 1.5868 | 0.8000 | 0.7620 | 0.2377 | 0.0538 | | No log | 23.0 | 299 | 0.5021 | 0.83 | 0.2982 | 1.4067 | 0.83 | 0.8190 | 0.2631 | 0.0463 | | No log | 24.0 | 312 | 0.5008 | 0.805 | 0.3023 | 1.4409 | 0.805 | 0.7863 | 0.2248 | 0.0501 | | No log | 25.0 | 
325 | 0.5069 | 0.805 | 0.3036 | 1.4965 | 0.805 | 0.7770 | 0.2218 | 0.0519 | | No log | 26.0 | 338 | 0.4967 | 0.8 | 0.3002 | 1.6267 | 0.8000 | 0.7788 | 0.2188 | 0.0598 | | No log | 27.0 | 351 | 0.4892 | 0.81 | 0.3006 | 1.6391 | 0.81 | 0.7886 | 0.2170 | 0.0513 | | No log | 28.0 | 364 | 0.5099 | 0.82 | 0.3129 | 1.5802 | 0.82 | 0.8004 | 0.2285 | 0.0589 | | No log | 29.0 | 377 | 0.5009 | 0.8 | 0.3054 | 1.5187 | 0.8000 | 0.7747 | 0.2260 | 0.0570 | | No log | 30.0 | 390 | 0.4869 | 0.805 | 0.2989 | 1.4292 | 0.805 | 0.7823 | 0.2380 | 0.0511 | | No log | 31.0 | 403 | 0.4876 | 0.82 | 0.2970 | 1.4254 | 0.82 | 0.7984 | 0.2293 | 0.0524 | | No log | 32.0 | 416 | 0.4916 | 0.81 | 0.3024 | 1.5657 | 0.81 | 0.7872 | 0.2239 | 0.0557 | | No log | 33.0 | 429 | 0.4834 | 0.805 | 0.2969 | 1.5227 | 0.805 | 0.7939 | 0.2108 | 0.0537 | | No log | 34.0 | 442 | 0.4910 | 0.8 | 0.3074 | 1.4463 | 0.8000 | 0.7745 | 0.2236 | 0.0580 | | No log | 35.0 | 455 | 0.4854 | 0.805 | 0.2990 | 1.4106 | 0.805 | 0.7875 | 0.2280 | 0.0547 | | No log | 36.0 | 468 | 0.4861 | 0.815 | 0.2985 | 1.4682 | 0.815 | 0.7921 | 0.2310 | 0.0527 | | No log | 37.0 | 481 | 0.4880 | 0.8 | 0.3032 | 1.4765 | 0.8000 | 0.7743 | 0.2174 | 0.0565 | | No log | 38.0 | 494 | 0.4871 | 0.805 | 0.2993 | 1.4592 | 0.805 | 0.7854 | 0.2072 | 0.0551 | | 0.3005 | 39.0 | 507 | 0.4908 | 0.805 | 0.3037 | 1.4704 | 0.805 | 0.7854 | 0.2269 | 0.0575 | | 0.3005 | 40.0 | 520 | 0.4893 | 0.805 | 0.3018 | 1.3980 | 0.805 | 0.7862 | 0.2105 | 0.0555 | | 0.3005 | 41.0 | 533 | 0.4866 | 0.8 | 0.3016 | 1.4087 | 0.8000 | 0.7766 | 0.2219 | 0.0547 | | 0.3005 | 42.0 | 546 | 0.4851 | 0.805 | 0.2997 | 1.3968 | 0.805 | 0.7862 | 0.2110 | 0.0536 | | 0.3005 | 43.0 | 559 | 0.4859 | 0.805 | 0.3011 | 1.4078 | 0.805 | 0.7875 | 0.2126 | 0.0545 | | 0.3005 | 44.0 | 572 | 0.4869 | 0.805 | 0.3011 | 1.4629 | 0.805 | 0.7862 | 0.2122 | 0.0546 | | 0.3005 | 45.0 | 585 | 0.4868 | 0.805 | 0.3010 | 1.4646 | 0.805 | 0.7854 | 0.2151 | 0.0549 | | 0.3005 | 46.0 | 598 | 0.4870 | 0.805 | 0.3012 | 1.4644 | 0.805 | 0.7854 | 0.2110 | 0.0544 | | 0.3005 | 47.0 | 611 | 0.4858 | 0.805 | 0.2999 | 1.4066 | 0.805 | 0.7875 | 0.2180 | 0.0534 | | 0.3005 | 48.0 | 624 | 0.4866 | 0.805 | 0.3014 | 1.4032 | 0.805 | 0.7875 | 0.2265 | 0.0538 | | 0.3005 | 49.0 | 637 | 0.4854 | 0.805 | 0.2996 | 1.4117 | 0.805 | 0.7862 | 0.2156 | 0.0534 | | 0.3005 | 50.0 | 650 | 0.4860 | 0.805 | 0.3003 | 1.4683 | 0.805 | 0.7854 | 0.2100 | 0.0533 | | 0.3005 | 51.0 | 663 | 0.4860 | 0.805 | 0.3002 | 1.4041 | 0.805 | 0.7854 | 0.2352 | 0.0534 | | 0.3005 | 52.0 | 676 | 0.4872 | 0.805 | 0.3015 | 1.4067 | 0.805 | 0.7875 | 0.2033 | 0.0540 | | 0.3005 | 53.0 | 689 | 0.4866 | 0.805 | 0.3005 | 1.4105 | 0.805 | 0.7875 | 0.2310 | 0.0538 | | 0.3005 | 54.0 | 702 | 0.4861 | 0.805 | 0.3006 | 1.4036 | 0.805 | 0.7875 | 0.2340 | 0.0533 | | 0.3005 | 55.0 | 715 | 0.4864 | 0.805 | 0.3005 | 1.4063 | 0.805 | 0.7875 | 0.2199 | 0.0537 | | 0.3005 | 56.0 | 728 | 0.4871 | 0.805 | 0.3009 | 1.4091 | 0.805 | 0.7862 | 0.2282 | 0.0537 | | 0.3005 | 57.0 | 741 | 0.4869 | 0.805 | 0.3007 | 1.4079 | 0.805 | 0.7862 | 0.2214 | 0.0531 | | 0.3005 | 58.0 | 754 | 0.4864 | 0.805 | 0.3005 | 1.4086 | 0.805 | 0.7862 | 0.2206 | 0.0532 | | 0.3005 | 59.0 | 767 | 0.4868 | 0.805 | 0.3007 | 1.4133 | 0.805 | 0.7862 | 0.2372 | 0.0531 | | 0.3005 | 60.0 | 780 | 0.4871 | 0.805 | 0.3009 | 1.4079 | 0.805 | 0.7875 | 0.2172 | 0.0534 | | 0.3005 | 61.0 | 793 | 0.4875 | 0.805 | 0.3014 | 1.4106 | 0.805 | 0.7862 | 0.2295 | 0.0536 | | 0.3005 | 62.0 | 806 | 0.4875 | 0.805 | 0.3013 | 1.4136 | 0.805 | 0.7875 | 0.2219 | 0.0535 | | 0.3005 | 
63.0 | 819 | 0.4874 | 0.805 | 0.3013 | 1.4085 | 0.805 | 0.7862 | 0.2189 | 0.0534 | | 0.3005 | 64.0 | 832 | 0.4867 | 0.805 | 0.3007 | 1.4075 | 0.805 | 0.7862 | 0.2325 | 0.0530 | | 0.3005 | 65.0 | 845 | 0.4876 | 0.805 | 0.3013 | 1.4122 | 0.805 | 0.7862 | 0.2379 | 0.0537 | | 0.3005 | 66.0 | 858 | 0.4878 | 0.805 | 0.3015 | 1.4090 | 0.805 | 0.7862 | 0.2220 | 0.0536 | | 0.3005 | 67.0 | 871 | 0.4869 | 0.805 | 0.3007 | 1.4101 | 0.805 | 0.7862 | 0.2253 | 0.0529 | | 0.3005 | 68.0 | 884 | 0.4871 | 0.805 | 0.3009 | 1.4096 | 0.805 | 0.7862 | 0.2340 | 0.0530 | | 0.3005 | 69.0 | 897 | 0.4873 | 0.805 | 0.3010 | 1.4120 | 0.805 | 0.7862 | 0.2138 | 0.0534 | | 0.3005 | 70.0 | 910 | 0.4874 | 0.805 | 0.3011 | 1.4121 | 0.805 | 0.7862 | 0.2292 | 0.0533 | | 0.3005 | 71.0 | 923 | 0.4874 | 0.805 | 0.3012 | 1.4095 | 0.805 | 0.7862 | 0.2276 | 0.0532 | | 0.3005 | 72.0 | 936 | 0.4870 | 0.805 | 0.3009 | 1.4083 | 0.805 | 0.7862 | 0.2262 | 0.0532 | | 0.3005 | 73.0 | 949 | 0.4877 | 0.805 | 0.3013 | 1.4115 | 0.805 | 0.7862 | 0.2273 | 0.0533 | | 0.3005 | 74.0 | 962 | 0.4872 | 0.805 | 0.3010 | 1.4109 | 0.805 | 0.7862 | 0.2275 | 0.0533 | | 0.3005 | 75.0 | 975 | 0.4874 | 0.805 | 0.3010 | 1.4100 | 0.805 | 0.7862 | 0.2186 | 0.0533 | | 0.3005 | 76.0 | 988 | 0.4874 | 0.805 | 0.3011 | 1.4095 | 0.805 | 0.7862 | 0.2174 | 0.0532 | | 0.0815 | 77.0 | 1001 | 0.4876 | 0.805 | 0.3012 | 1.4096 | 0.805 | 0.7862 | 0.2185 | 0.0533 | | 0.0815 | 78.0 | 1014 | 0.4875 | 0.805 | 0.3011 | 1.4114 | 0.805 | 0.7862 | 0.2189 | 0.0532 | | 0.0815 | 79.0 | 1027 | 0.4874 | 0.805 | 0.3011 | 1.4092 | 0.805 | 0.7862 | 0.2347 | 0.0533 | | 0.0815 | 80.0 | 1040 | 0.4877 | 0.805 | 0.3012 | 1.4110 | 0.805 | 0.7862 | 0.2272 | 0.0532 | | 0.0815 | 81.0 | 1053 | 0.4876 | 0.805 | 0.3012 | 1.4092 | 0.805 | 0.7862 | 0.2259 | 0.0532 | | 0.0815 | 82.0 | 1066 | 0.4873 | 0.805 | 0.3009 | 1.4103 | 0.805 | 0.7862 | 0.2171 | 0.0531 | | 0.0815 | 83.0 | 1079 | 0.4875 | 0.805 | 0.3011 | 1.4091 | 0.805 | 0.7862 | 0.2260 | 0.0532 | | 0.0815 | 84.0 | 1092 | 0.4875 | 0.805 | 0.3010 | 1.4108 | 0.805 | 0.7862 | 0.2346 | 0.0532 | | 0.0815 | 85.0 | 1105 | 0.4876 | 0.805 | 0.3012 | 1.4098 | 0.805 | 0.7862 | 0.2276 | 0.0531 | | 0.0815 | 86.0 | 1118 | 0.4876 | 0.805 | 0.3011 | 1.4127 | 0.805 | 0.7862 | 0.2272 | 0.0532 | | 0.0815 | 87.0 | 1131 | 0.4875 | 0.805 | 0.3011 | 1.4093 | 0.805 | 0.7862 | 0.2275 | 0.0531 | | 0.0815 | 88.0 | 1144 | 0.4876 | 0.805 | 0.3011 | 1.4092 | 0.805 | 0.7862 | 0.2184 | 0.0531 | | 0.0815 | 89.0 | 1157 | 0.4874 | 0.805 | 0.3011 | 1.4086 | 0.805 | 0.7862 | 0.2271 | 0.0531 | | 0.0815 | 90.0 | 1170 | 0.4875 | 0.805 | 0.3011 | 1.4098 | 0.805 | 0.7862 | 0.2272 | 0.0531 | | 0.0815 | 91.0 | 1183 | 0.4875 | 0.805 | 0.3011 | 1.4104 | 0.805 | 0.7862 | 0.2275 | 0.0531 | | 0.0815 | 92.0 | 1196 | 0.4874 | 0.805 | 0.3010 | 1.4092 | 0.805 | 0.7862 | 0.2183 | 0.0531 | | 0.0815 | 93.0 | 1209 | 0.4876 | 0.805 | 0.3011 | 1.4097 | 0.805 | 0.7862 | 0.2276 | 0.0531 | | 0.0815 | 94.0 | 1222 | 0.4874 | 0.805 | 0.3009 | 1.4095 | 0.805 | 0.7862 | 0.2180 | 0.0529 | | 0.0815 | 95.0 | 1235 | 0.4874 | 0.805 | 0.3011 | 1.4092 | 0.805 | 0.7862 | 0.2182 | 0.0531 | | 0.0815 | 96.0 | 1248 | 0.4875 | 0.805 | 0.3011 | 1.4100 | 0.805 | 0.7862 | 0.2183 | 0.0531 | | 0.0815 | 97.0 | 1261 | 0.4875 | 0.805 | 0.3011 | 1.4097 | 0.805 | 0.7862 | 0.2181 | 0.0530 | | 0.0815 | 98.0 | 1274 | 0.4876 | 0.805 | 0.3011 | 1.4098 | 0.805 | 0.7862 | 0.2181 | 0.0530 | | 0.0815 | 99.0 | 1287 | 0.4875 | 0.805 | 0.3011 | 1.4097 | 0.805 | 0.7862 | 0.2273 | 0.0531 | | 0.0815 | 100.0 | 1300 | 0.4875 | 0.805 | 0.3011 | 1.4097 | 
0.805 | 0.7862 | 0.2181 | 0.0530 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/225-tiny_tobacco3482_kd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 225-tiny_tobacco3482_kd This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.2991 - Accuracy: 0.775 - Brier Loss: 0.3491 - Nll: 1.2196 - F1 Micro: 0.775 - F1 Macro: 0.7302 - Ece: 0.2602 - Aurc: 0.0644 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 13 | 1.6344 | 0.23 | 0.8900 | 7.7885 | 0.23 | 0.1633 | 0.2747 | 0.7585 | | No log | 2.0 | 26 | 1.0824 | 0.385 | 0.7943 | 4.1668 | 0.3850 | 0.2795 | 0.3160 | 0.4560 | | No log | 3.0 | 39 | 0.8639 | 0.535 | 0.6762 | 3.0145 | 0.535 | 0.4086 | 0.3233 | 0.2913 | | No log | 4.0 | 52 | 0.7309 | 0.595 | 0.5956 | 2.2236 | 0.595 | 0.4646 | 0.3075 | 0.1944 | | No log | 5.0 | 65 | 0.6374 | 0.67 | 0.5211 | 2.1759 | 0.67 | 0.5737 | 0.2898 | 0.1450 | | No log | 6.0 | 78 | 0.6720 | 0.685 | 0.4833 | 2.2861 | 0.685 | 0.5860 | 0.2904 | 0.1331 | | No log | 7.0 | 91 | 0.6097 | 0.675 | 0.4767 | 2.3133 | 0.675 | 0.5733 | 0.2622 | 0.1519 | | No log | 8.0 | 104 | 0.5206 | 0.705 | 0.4301 | 1.8228 | 0.705 | 0.6164 | 0.2603 | 0.1038 | | No log | 9.0 | 117 | 0.5486 | 0.715 | 0.4414 | 1.8451 | 0.715 | 0.6444 | 0.2583 | 0.1063 | | No log | 10.0 | 130 | 0.5067 | 0.7 | 0.4171 | 1.7759 | 0.7 | 0.6325 | 0.2611 | 0.1071 | | No log | 11.0 | 143 | 0.4612 | 0.745 | 0.4017 | 1.4919 | 0.745 | 0.6635 | 0.2840 | 0.0838 | | No log | 12.0 | 156 | 0.4785 | 0.745 | 0.4204 | 1.8579 | 0.745 | 0.6750 | 0.2542 | 0.0979 | | No log | 13.0 | 169 | 0.4518 | 0.715 | 0.4036 | 1.5697 | 0.715 | 0.6496 | 0.2744 | 0.1002 | | No log | 14.0 | 182 | 0.5081 | 0.7 | 0.4294 | 1.9850 | 0.7 | 0.6514 | 0.2364 | 0.1225 | | No log | 15.0 | 195 | 0.4415 | 0.705 | 0.3994 | 1.7828 | 0.705 | 0.6301 | 0.2380 | 0.0992 | | No log | 16.0 | 208 | 0.3859 | 0.73 | 0.3832 | 1.3431 | 0.7300 | 0.6516 | 0.2548 | 0.0817 | | No log | 17.0 | 221 | 0.3869 | 0.75 | 0.3832 | 1.2075 | 0.75 | 0.6651 | 0.2622 | 0.0758 | | No log | 18.0 | 234 | 0.3637 | 0.755 | 0.3770 | 1.2290 | 0.755 | 0.7108 | 0.2569 | 0.0687 | | No log | 19.0 | 247 | 0.3933 | 0.745 | 0.3700 | 1.4931 | 0.745 | 0.6812 | 0.2434 | 0.0799 | | No log | 20.0 | 260 | 0.3540 | 0.745 | 0.3721 | 1.1910 | 0.745 | 0.6702 | 0.2208 | 0.0760 | | No log | 21.0 | 273 | 0.3560 | 0.77 | 0.3718 | 1.1248 | 0.7700 | 0.7142 | 0.2731 | 0.0743 | | No log | 22.0 | 286 | 0.3530 | 0.74 | 0.3758 | 1.4213 | 0.74 | 0.6902 | 0.2326 | 0.0768 | | No log | 23.0 | 299 | 0.3419 | 0.745 | 0.3699 | 1.2528 | 0.745 | 0.6714 | 0.2324 | 0.0765 | | No log | 24.0 | 312 | 0.3302 | 0.775 | 0.3595 | 1.3338 | 0.775 | 0.7120 | 0.2521 | 0.0665 | | No log | 25.0 | 325 | 0.3533 | 
0.775 | 0.3672 | 1.4609 | 0.775 | 0.7167 | 0.2482 | 0.0740 | | No log | 26.0 | 338 | 0.3416 | 0.775 | 0.3684 | 1.1575 | 0.775 | 0.7124 | 0.2601 | 0.0732 | | No log | 27.0 | 351 | 0.3463 | 0.75 | 0.3714 | 1.1053 | 0.75 | 0.6868 | 0.2512 | 0.0808 | | No log | 28.0 | 364 | 0.3298 | 0.775 | 0.3605 | 1.2108 | 0.775 | 0.6986 | 0.2537 | 0.0668 | | No log | 29.0 | 377 | 0.3278 | 0.77 | 0.3645 | 1.1893 | 0.7700 | 0.7013 | 0.2447 | 0.0765 | | No log | 30.0 | 390 | 0.3165 | 0.78 | 0.3608 | 1.1615 | 0.78 | 0.7285 | 0.2472 | 0.0712 | | No log | 31.0 | 403 | 0.3212 | 0.765 | 0.3571 | 1.1317 | 0.765 | 0.6999 | 0.2497 | 0.0725 | | No log | 32.0 | 416 | 0.3119 | 0.765 | 0.3581 | 1.0644 | 0.765 | 0.6881 | 0.2285 | 0.0675 | | No log | 33.0 | 429 | 0.3229 | 0.765 | 0.3523 | 1.2937 | 0.765 | 0.7138 | 0.2517 | 0.0658 | | No log | 34.0 | 442 | 0.3193 | 0.78 | 0.3660 | 1.1849 | 0.78 | 0.7329 | 0.2686 | 0.0700 | | No log | 35.0 | 455 | 0.3088 | 0.775 | 0.3556 | 1.1613 | 0.775 | 0.7071 | 0.2640 | 0.0680 | | No log | 36.0 | 468 | 0.3113 | 0.785 | 0.3508 | 1.1715 | 0.785 | 0.7501 | 0.2443 | 0.0656 | | No log | 37.0 | 481 | 0.3113 | 0.79 | 0.3526 | 1.2334 | 0.79 | 0.7388 | 0.2580 | 0.0639 | | No log | 38.0 | 494 | 0.3077 | 0.755 | 0.3528 | 1.1152 | 0.755 | 0.6973 | 0.2401 | 0.0692 | | 0.2783 | 39.0 | 507 | 0.3064 | 0.775 | 0.3567 | 1.2289 | 0.775 | 0.7370 | 0.2417 | 0.0696 | | 0.2783 | 40.0 | 520 | 0.3063 | 0.77 | 0.3521 | 1.2437 | 0.7700 | 0.7232 | 0.2396 | 0.0688 | | 0.2783 | 41.0 | 533 | 0.3042 | 0.77 | 0.3541 | 1.2490 | 0.7700 | 0.7234 | 0.2470 | 0.0682 | | 0.2783 | 42.0 | 546 | 0.2999 | 0.77 | 0.3486 | 1.1626 | 0.7700 | 0.7082 | 0.2491 | 0.0638 | | 0.2783 | 43.0 | 559 | 0.3020 | 0.77 | 0.3515 | 1.2141 | 0.7700 | 0.7312 | 0.2570 | 0.0687 | | 0.2783 | 44.0 | 572 | 0.3024 | 0.775 | 0.3502 | 1.2184 | 0.775 | 0.7168 | 0.2568 | 0.0648 | | 0.2783 | 45.0 | 585 | 0.3002 | 0.78 | 0.3517 | 1.2189 | 0.78 | 0.7364 | 0.2673 | 0.0644 | | 0.2783 | 46.0 | 598 | 0.3022 | 0.775 | 0.3511 | 1.1594 | 0.775 | 0.7266 | 0.2538 | 0.0661 | | 0.2783 | 47.0 | 611 | 0.2974 | 0.775 | 0.3464 | 1.2157 | 0.775 | 0.7238 | 0.2630 | 0.0627 | | 0.2783 | 48.0 | 624 | 0.3003 | 0.78 | 0.3519 | 1.1584 | 0.78 | 0.7318 | 0.2413 | 0.0666 | | 0.2783 | 49.0 | 637 | 0.2990 | 0.77 | 0.3492 | 1.2187 | 0.7700 | 0.7136 | 0.2401 | 0.0643 | | 0.2783 | 50.0 | 650 | 0.3019 | 0.765 | 0.3516 | 1.2254 | 0.765 | 0.7180 | 0.2409 | 0.0673 | | 0.2783 | 51.0 | 663 | 0.2991 | 0.77 | 0.3499 | 1.2186 | 0.7700 | 0.7145 | 0.2566 | 0.0646 | | 0.2783 | 52.0 | 676 | 0.2990 | 0.77 | 0.3507 | 1.2204 | 0.7700 | 0.7207 | 0.2360 | 0.0651 | | 0.2783 | 53.0 | 689 | 0.2982 | 0.765 | 0.3488 | 1.1663 | 0.765 | 0.7042 | 0.2338 | 0.0643 | | 0.2783 | 54.0 | 702 | 0.2969 | 0.775 | 0.3485 | 1.1667 | 0.775 | 0.7302 | 0.2586 | 0.0642 | | 0.2783 | 55.0 | 715 | 0.2989 | 0.775 | 0.3487 | 1.2181 | 0.775 | 0.7302 | 0.2670 | 0.0647 | | 0.2783 | 56.0 | 728 | 0.2991 | 0.77 | 0.3499 | 1.2208 | 0.7700 | 0.7136 | 0.2339 | 0.0650 | | 0.2783 | 57.0 | 741 | 0.2986 | 0.775 | 0.3487 | 1.2162 | 0.775 | 0.7302 | 0.2415 | 0.0639 | | 0.2783 | 58.0 | 754 | 0.2985 | 0.77 | 0.3490 | 1.2183 | 0.7700 | 0.7207 | 0.2547 | 0.0647 | | 0.2783 | 59.0 | 767 | 0.2993 | 0.77 | 0.3494 | 1.2218 | 0.7700 | 0.7136 | 0.2417 | 0.0649 | | 0.2783 | 60.0 | 780 | 0.2983 | 0.77 | 0.3487 | 1.2185 | 0.7700 | 0.7207 | 0.2555 | 0.0646 | | 0.2783 | 61.0 | 793 | 0.2989 | 0.775 | 0.3492 | 1.2182 | 0.775 | 0.7302 | 0.2444 | 0.0645 | | 0.2783 | 62.0 | 806 | 0.2987 | 0.775 | 0.3487 | 1.2174 | 0.775 | 0.7302 | 0.2438 | 0.0642 | | 0.2783 | 63.0 | 819 | 
0.2987 | 0.775 | 0.3490 | 1.2198 | 0.775 | 0.7302 | 0.2508 | 0.0646 | | 0.2783 | 64.0 | 832 | 0.2989 | 0.775 | 0.3494 | 1.2195 | 0.775 | 0.7302 | 0.2609 | 0.0646 | | 0.2783 | 65.0 | 845 | 0.2990 | 0.775 | 0.3492 | 1.2177 | 0.775 | 0.7302 | 0.2528 | 0.0644 | | 0.2783 | 66.0 | 858 | 0.2992 | 0.775 | 0.3493 | 1.2193 | 0.775 | 0.7302 | 0.2537 | 0.0646 | | 0.2783 | 67.0 | 871 | 0.2990 | 0.775 | 0.3493 | 1.2199 | 0.775 | 0.7302 | 0.2510 | 0.0647 | | 0.2783 | 68.0 | 884 | 0.2991 | 0.775 | 0.3495 | 1.2199 | 0.775 | 0.7302 | 0.2476 | 0.0646 | | 0.2783 | 69.0 | 897 | 0.2989 | 0.775 | 0.3491 | 1.2187 | 0.775 | 0.7302 | 0.2606 | 0.0646 | | 0.2783 | 70.0 | 910 | 0.2987 | 0.775 | 0.3490 | 1.2187 | 0.775 | 0.7302 | 0.2436 | 0.0642 | | 0.2783 | 71.0 | 923 | 0.2990 | 0.775 | 0.3491 | 1.2190 | 0.775 | 0.7302 | 0.2510 | 0.0646 | | 0.2783 | 72.0 | 936 | 0.2990 | 0.775 | 0.3492 | 1.2191 | 0.775 | 0.7302 | 0.2541 | 0.0646 | | 0.2783 | 73.0 | 949 | 0.2990 | 0.775 | 0.3491 | 1.2176 | 0.775 | 0.7302 | 0.2509 | 0.0647 | | 0.2783 | 74.0 | 962 | 0.2990 | 0.775 | 0.3493 | 1.2203 | 0.775 | 0.7302 | 0.2600 | 0.0643 | | 0.2783 | 75.0 | 975 | 0.2989 | 0.775 | 0.3492 | 1.2203 | 0.775 | 0.7302 | 0.2665 | 0.0643 | | 0.2783 | 76.0 | 988 | 0.2991 | 0.775 | 0.3492 | 1.2193 | 0.775 | 0.7302 | 0.2601 | 0.0643 | | 0.0005 | 77.0 | 1001 | 0.2991 | 0.775 | 0.3491 | 1.2201 | 0.775 | 0.7302 | 0.2598 | 0.0645 | | 0.0005 | 78.0 | 1014 | 0.2991 | 0.775 | 0.3490 | 1.2198 | 0.775 | 0.7302 | 0.2441 | 0.0645 | | 0.0005 | 79.0 | 1027 | 0.2991 | 0.775 | 0.3492 | 1.2182 | 0.775 | 0.7302 | 0.2513 | 0.0645 | | 0.0005 | 80.0 | 1040 | 0.2992 | 0.775 | 0.3491 | 1.2183 | 0.775 | 0.7302 | 0.2514 | 0.0645 | | 0.0005 | 81.0 | 1053 | 0.2992 | 0.775 | 0.3492 | 1.2196 | 0.775 | 0.7302 | 0.2584 | 0.0646 | | 0.0005 | 82.0 | 1066 | 0.2992 | 0.775 | 0.3493 | 1.2199 | 0.775 | 0.7302 | 0.2520 | 0.0646 | | 0.0005 | 83.0 | 1079 | 0.2991 | 0.775 | 0.3491 | 1.2191 | 0.775 | 0.7302 | 0.2514 | 0.0643 | | 0.0005 | 84.0 | 1092 | 0.2991 | 0.775 | 0.3491 | 1.2194 | 0.775 | 0.7302 | 0.2516 | 0.0645 | | 0.0005 | 85.0 | 1105 | 0.2990 | 0.775 | 0.3491 | 1.2188 | 0.775 | 0.7302 | 0.2585 | 0.0645 | | 0.0005 | 86.0 | 1118 | 0.2991 | 0.775 | 0.3492 | 1.2193 | 0.775 | 0.7302 | 0.2584 | 0.0645 | | 0.0005 | 87.0 | 1131 | 0.2991 | 0.775 | 0.3491 | 1.2201 | 0.775 | 0.7302 | 0.2667 | 0.0643 | | 0.0005 | 88.0 | 1144 | 0.2991 | 0.775 | 0.3492 | 1.2199 | 0.775 | 0.7302 | 0.2516 | 0.0645 | | 0.0005 | 89.0 | 1157 | 0.2990 | 0.775 | 0.3491 | 1.2193 | 0.775 | 0.7302 | 0.2603 | 0.0644 | | 0.0005 | 90.0 | 1170 | 0.2990 | 0.775 | 0.3492 | 1.2197 | 0.775 | 0.7302 | 0.2536 | 0.0645 | | 0.0005 | 91.0 | 1183 | 0.2990 | 0.775 | 0.3491 | 1.2201 | 0.775 | 0.7302 | 0.2668 | 0.0644 | | 0.0005 | 92.0 | 1196 | 0.2991 | 0.775 | 0.3491 | 1.2190 | 0.775 | 0.7302 | 0.2533 | 0.0644 | | 0.0005 | 93.0 | 1209 | 0.2991 | 0.775 | 0.3492 | 1.2192 | 0.775 | 0.7302 | 0.2602 | 0.0645 | | 0.0005 | 94.0 | 1222 | 0.2991 | 0.775 | 0.3492 | 1.2193 | 0.775 | 0.7302 | 0.2533 | 0.0645 | | 0.0005 | 95.0 | 1235 | 0.2991 | 0.775 | 0.3491 | 1.2192 | 0.775 | 0.7302 | 0.2533 | 0.0644 | | 0.0005 | 96.0 | 1248 | 0.2991 | 0.775 | 0.3491 | 1.2196 | 0.775 | 0.7302 | 0.2668 | 0.0644 | | 0.0005 | 97.0 | 1261 | 0.2991 | 0.775 | 0.3492 | 1.2196 | 0.775 | 0.7302 | 0.2602 | 0.0644 | | 0.0005 | 98.0 | 1274 | 0.2991 | 0.775 | 0.3491 | 1.2194 | 0.775 | 0.7302 | 0.2533 | 0.0644 | | 0.0005 | 99.0 | 1287 | 0.2991 | 0.775 | 0.3491 | 1.2195 | 0.775 | 0.7302 | 0.2602 | 0.0644 | | 0.0005 | 100.0 | 1300 | 0.2991 | 0.775 | 0.3491 | 1.2196 | 0.775 | 0.7302 | 
0.2602 | 0.0644 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/225-tiny_tobacco3482_kd_NKD_t1.0_g1.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 225-tiny_tobacco3482_kd_NKD_t1.0_g1.5 This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 4.3111 - Accuracy: 0.82 - Brier Loss: 0.2977 - Nll: 1.6959 - F1 Micro: 0.82 - F1 Macro: 0.8150 - Ece: 0.1454 - Aurc: 0.0488 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 13 | 5.2368 | 0.225 | 0.8876 | 8.2751 | 0.225 | 0.1306 | 0.3140 | 0.7919 | | No log | 2.0 | 26 | 4.6617 | 0.385 | 0.7736 | 4.0165 | 0.3850 | 0.3071 | 0.3295 | 0.4033 | | No log | 3.0 | 39 | 4.4343 | 0.525 | 0.6609 | 3.4855 | 0.525 | 0.4017 | 0.3068 | 0.2761 | | No log | 4.0 | 52 | 4.2677 | 0.59 | 0.5775 | 2.7458 | 0.59 | 0.4879 | 0.3037 | 0.1850 | | No log | 5.0 | 65 | 4.1495 | 0.67 | 0.5044 | 2.2848 | 0.67 | 0.6081 | 0.3100 | 0.1336 | | No log | 6.0 | 78 | 4.1699 | 0.71 | 0.4412 | 2.9360 | 0.7100 | 0.6211 | 0.2407 | 0.1076 | | No log | 7.0 | 91 | 4.0527 | 0.725 | 0.4198 | 2.1169 | 0.7250 | 0.6606 | 0.2359 | 0.0993 | | No log | 8.0 | 104 | 4.0491 | 0.715 | 0.4001 | 2.1794 | 0.715 | 0.6343 | 0.1955 | 0.1013 | | No log | 9.0 | 117 | 4.2070 | 0.715 | 0.4096 | 2.1137 | 0.715 | 0.6363 | 0.1968 | 0.1104 | | No log | 10.0 | 130 | 4.2307 | 0.715 | 0.4030 | 2.4228 | 0.715 | 0.6467 | 0.1977 | 0.1054 | | No log | 11.0 | 143 | 4.0841 | 0.73 | 0.3673 | 2.2764 | 0.7300 | 0.6697 | 0.1840 | 0.0781 | | No log | 12.0 | 156 | 3.9980 | 0.74 | 0.3569 | 1.7264 | 0.74 | 0.6752 | 0.1822 | 0.0779 | | No log | 13.0 | 169 | 4.0921 | 0.735 | 0.3704 | 1.8601 | 0.735 | 0.6818 | 0.1835 | 0.0888 | | No log | 14.0 | 182 | 3.9026 | 0.755 | 0.3362 | 1.6596 | 0.755 | 0.7128 | 0.1684 | 0.0757 | | No log | 15.0 | 195 | 4.0542 | 0.765 | 0.3472 | 2.0096 | 0.765 | 0.7051 | 0.1789 | 0.0783 | | No log | 16.0 | 208 | 4.0180 | 0.75 | 0.3634 | 1.6543 | 0.75 | 0.7364 | 0.1958 | 0.0890 | | No log | 17.0 | 221 | 3.9665 | 0.8 | 0.3330 | 1.4940 | 0.8000 | 0.7935 | 0.1919 | 0.0793 | | No log | 18.0 | 234 | 3.9523 | 0.785 | 0.3225 | 1.6353 | 0.785 | 0.7825 | 0.1598 | 0.0719 | | No log | 19.0 | 247 | 3.9298 | 0.79 | 0.3262 | 1.8606 | 0.79 | 0.7757 | 0.1785 | 0.0749 | | No log | 20.0 | 260 | 3.9484 | 0.8 | 0.3106 | 1.6615 | 0.8000 | 0.8034 | 0.1692 | 0.0763 | | No log | 21.0 | 273 | 3.9056 | 0.785 | 0.2930 | 1.6180 | 0.785 | 0.7499 | 0.1542 | 0.0609 | | No log | 22.0 | 286 | 3.8094 | 0.82 | 0.2765 | 1.3116 | 0.82 | 0.8028 | 0.1784 | 0.0532 | | No log | 23.0 | 299 | 3.8352 | 0.81 | 0.2939 | 1.5765 | 0.81 | 0.7971 | 0.1592 | 0.0559 | | No log | 24.0 | 312 | 3.9996 | 0.79 | 0.3192 | 1.6863 | 0.79 | 0.7914 | 0.1678 | 0.0742 | | No log | 25.0 | 325 | 
3.8680 | 0.805 | 0.2932 | 1.4217 | 0.805 | 0.8052 | 0.1505 | 0.0578 | | No log | 26.0 | 338 | 3.8913 | 0.8 | 0.3025 | 1.6254 | 0.8000 | 0.7971 | 0.1370 | 0.0607 | | No log | 27.0 | 351 | 3.8603 | 0.815 | 0.2893 | 1.6578 | 0.815 | 0.8094 | 0.1659 | 0.0570 | | No log | 28.0 | 364 | 3.9414 | 0.795 | 0.2990 | 1.9161 | 0.795 | 0.7900 | 0.1504 | 0.0593 | | No log | 29.0 | 377 | 3.8802 | 0.815 | 0.2836 | 1.7091 | 0.815 | 0.7943 | 0.1395 | 0.0565 | | No log | 30.0 | 390 | 3.9025 | 0.8 | 0.2957 | 1.7376 | 0.8000 | 0.7894 | 0.1373 | 0.0594 | | No log | 31.0 | 403 | 3.8744 | 0.835 | 0.2785 | 1.5096 | 0.835 | 0.8185 | 0.1405 | 0.0550 | | No log | 32.0 | 416 | 3.8670 | 0.8 | 0.2813 | 1.5817 | 0.8000 | 0.7825 | 0.1279 | 0.0500 | | No log | 33.0 | 429 | 3.9197 | 0.8 | 0.2852 | 1.5082 | 0.8000 | 0.7802 | 0.1488 | 0.0540 | | No log | 34.0 | 442 | 3.9589 | 0.795 | 0.3005 | 1.9897 | 0.795 | 0.7872 | 0.1487 | 0.0563 | | No log | 35.0 | 455 | 3.9669 | 0.82 | 0.2863 | 1.7012 | 0.82 | 0.8161 | 0.1483 | 0.0551 | | No log | 36.0 | 468 | 3.8924 | 0.81 | 0.2803 | 1.5552 | 0.81 | 0.7961 | 0.1322 | 0.0484 | | No log | 37.0 | 481 | 3.9455 | 0.81 | 0.2838 | 1.6590 | 0.81 | 0.7989 | 0.1423 | 0.0531 | | No log | 38.0 | 494 | 3.8957 | 0.82 | 0.2726 | 1.5431 | 0.82 | 0.8072 | 0.1409 | 0.0482 | | 3.5636 | 39.0 | 507 | 3.9710 | 0.81 | 0.2979 | 1.7156 | 0.81 | 0.7989 | 0.1399 | 0.0524 | | 3.5636 | 40.0 | 520 | 3.8789 | 0.83 | 0.2606 | 1.5452 | 0.83 | 0.8227 | 0.1323 | 0.0478 | | 3.5636 | 41.0 | 533 | 3.9488 | 0.81 | 0.2839 | 1.6447 | 0.81 | 0.8016 | 0.1326 | 0.0509 | | 3.5636 | 42.0 | 546 | 3.9774 | 0.815 | 0.2937 | 1.6907 | 0.815 | 0.8111 | 0.1291 | 0.0488 | | 3.5636 | 43.0 | 559 | 3.9991 | 0.805 | 0.2877 | 1.7106 | 0.805 | 0.7979 | 0.1504 | 0.0518 | | 3.5636 | 44.0 | 572 | 3.9634 | 0.815 | 0.2798 | 1.5063 | 0.815 | 0.8048 | 0.1272 | 0.0493 | | 3.5636 | 45.0 | 585 | 4.0229 | 0.82 | 0.2904 | 1.6439 | 0.82 | 0.8156 | 0.1392 | 0.0511 | | 3.5636 | 46.0 | 598 | 4.0206 | 0.82 | 0.2836 | 1.5407 | 0.82 | 0.8150 | 0.1233 | 0.0497 | | 3.5636 | 47.0 | 611 | 4.0351 | 0.81 | 0.2835 | 1.7627 | 0.81 | 0.8003 | 0.1338 | 0.0486 | | 3.5636 | 48.0 | 624 | 4.0646 | 0.82 | 0.2889 | 1.7694 | 0.82 | 0.8150 | 0.1341 | 0.0499 | | 3.5636 | 49.0 | 637 | 4.0496 | 0.815 | 0.2828 | 1.7548 | 0.815 | 0.8071 | 0.1391 | 0.0477 | | 3.5636 | 50.0 | 650 | 4.0914 | 0.815 | 0.2917 | 1.6381 | 0.815 | 0.8053 | 0.1310 | 0.0502 | | 3.5636 | 51.0 | 663 | 4.0748 | 0.82 | 0.2866 | 1.5646 | 0.82 | 0.8148 | 0.1325 | 0.0483 | | 3.5636 | 52.0 | 676 | 4.0921 | 0.82 | 0.2871 | 1.5732 | 0.82 | 0.8148 | 0.1381 | 0.0487 | | 3.5636 | 53.0 | 689 | 4.1093 | 0.82 | 0.2886 | 1.6448 | 0.82 | 0.8147 | 0.1506 | 0.0481 | | 3.5636 | 54.0 | 702 | 4.1200 | 0.82 | 0.2910 | 1.6446 | 0.82 | 0.8150 | 0.1335 | 0.0493 | | 3.5636 | 55.0 | 715 | 4.1250 | 0.815 | 0.2901 | 1.5641 | 0.815 | 0.8098 | 0.1386 | 0.0491 | | 3.5636 | 56.0 | 728 | 4.1340 | 0.82 | 0.2893 | 1.6575 | 0.82 | 0.8148 | 0.1298 | 0.0489 | | 3.5636 | 57.0 | 741 | 4.1575 | 0.82 | 0.2935 | 1.6360 | 0.82 | 0.8150 | 0.1402 | 0.0499 | | 3.5636 | 58.0 | 754 | 4.1495 | 0.82 | 0.2895 | 1.6349 | 0.82 | 0.8148 | 0.1398 | 0.0486 | | 3.5636 | 59.0 | 767 | 4.1582 | 0.82 | 0.2909 | 1.6327 | 0.82 | 0.8150 | 0.1341 | 0.0487 | | 3.5636 | 60.0 | 780 | 4.1720 | 0.82 | 0.2923 | 1.5746 | 0.82 | 0.8150 | 0.1386 | 0.0493 | | 3.5636 | 61.0 | 793 | 4.1848 | 0.825 | 0.2940 | 1.6424 | 0.825 | 0.8181 | 0.1380 | 0.0494 | | 3.5636 | 62.0 | 806 | 4.1880 | 0.82 | 0.2939 | 1.6323 | 0.82 | 0.8148 | 0.1389 | 0.0488 | | 3.5636 | 63.0 | 819 | 4.1825 | 0.82 | 0.2916 | 
1.6920 | 0.82 | 0.8150 | 0.1421 | 0.0483 | | 3.5636 | 64.0 | 832 | 4.2037 | 0.82 | 0.2946 | 1.6365 | 0.82 | 0.8148 | 0.1393 | 0.0493 | | 3.5636 | 65.0 | 845 | 4.2096 | 0.82 | 0.2948 | 1.5852 | 0.82 | 0.8150 | 0.1462 | 0.0493 | | 3.5636 | 66.0 | 858 | 4.2191 | 0.82 | 0.2962 | 1.6349 | 0.82 | 0.8150 | 0.1491 | 0.0495 | | 3.5636 | 67.0 | 871 | 4.2189 | 0.82 | 0.2948 | 1.6389 | 0.82 | 0.8150 | 0.1313 | 0.0489 | | 3.5636 | 68.0 | 884 | 4.2243 | 0.82 | 0.2947 | 1.6322 | 0.82 | 0.8150 | 0.1398 | 0.0491 | | 3.5636 | 69.0 | 897 | 4.2334 | 0.82 | 0.2957 | 1.6398 | 0.82 | 0.8150 | 0.1355 | 0.0491 | | 3.5636 | 70.0 | 910 | 4.2312 | 0.82 | 0.2943 | 1.6395 | 0.82 | 0.8148 | 0.1419 | 0.0484 | | 3.5636 | 71.0 | 923 | 4.2376 | 0.82 | 0.2956 | 1.6389 | 0.82 | 0.8150 | 0.1372 | 0.0490 | | 3.5636 | 72.0 | 936 | 4.2420 | 0.82 | 0.2951 | 1.6368 | 0.82 | 0.8150 | 0.1427 | 0.0489 | | 3.5636 | 73.0 | 949 | 4.2464 | 0.82 | 0.2946 | 1.6375 | 0.82 | 0.8150 | 0.1449 | 0.0488 | | 3.5636 | 74.0 | 962 | 4.2540 | 0.82 | 0.2956 | 1.6364 | 0.82 | 0.8150 | 0.1476 | 0.0489 | | 3.5636 | 75.0 | 975 | 4.2579 | 0.82 | 0.2955 | 1.6361 | 0.82 | 0.8150 | 0.1361 | 0.0491 | | 3.5636 | 76.0 | 988 | 4.2638 | 0.82 | 0.2960 | 1.6368 | 0.82 | 0.8150 | 0.1483 | 0.0490 | | 3.1969 | 77.0 | 1001 | 4.2653 | 0.82 | 0.2956 | 1.6950 | 0.82 | 0.8150 | 0.1509 | 0.0487 | | 3.1969 | 78.0 | 1014 | 4.2708 | 0.82 | 0.2965 | 1.6365 | 0.82 | 0.8150 | 0.1398 | 0.0490 | | 3.1969 | 79.0 | 1027 | 4.2761 | 0.82 | 0.2968 | 1.6400 | 0.82 | 0.8150 | 0.1399 | 0.0490 | | 3.1969 | 80.0 | 1040 | 4.2792 | 0.82 | 0.2969 | 1.6381 | 0.82 | 0.8150 | 0.1425 | 0.0490 | | 3.1969 | 81.0 | 1053 | 4.2801 | 0.82 | 0.2963 | 1.6949 | 0.82 | 0.8148 | 0.1477 | 0.0487 | | 3.1969 | 82.0 | 1066 | 4.2841 | 0.82 | 0.2968 | 1.6459 | 0.82 | 0.8150 | 0.1425 | 0.0488 | | 3.1969 | 83.0 | 1079 | 4.2864 | 0.82 | 0.2968 | 1.6378 | 0.82 | 0.8150 | 0.1421 | 0.0489 | | 3.1969 | 84.0 | 1092 | 4.2918 | 0.82 | 0.2973 | 1.6398 | 0.82 | 0.8150 | 0.1373 | 0.0491 | | 3.1969 | 85.0 | 1105 | 4.2930 | 0.82 | 0.2970 | 1.6408 | 0.82 | 0.8150 | 0.1486 | 0.0490 | | 3.1969 | 86.0 | 1118 | 4.2956 | 0.82 | 0.2973 | 1.6420 | 0.82 | 0.8150 | 0.1427 | 0.0489 | | 3.1969 | 87.0 | 1131 | 4.2988 | 0.82 | 0.2976 | 1.6390 | 0.82 | 0.8150 | 0.1374 | 0.0491 | | 3.1969 | 88.0 | 1144 | 4.2995 | 0.82 | 0.2974 | 1.6509 | 0.82 | 0.8150 | 0.1427 | 0.0489 | | 3.1969 | 89.0 | 1157 | 4.3026 | 0.82 | 0.2976 | 1.6418 | 0.82 | 0.8150 | 0.1375 | 0.0490 | | 3.1969 | 90.0 | 1170 | 4.3028 | 0.82 | 0.2974 | 1.6445 | 0.82 | 0.8150 | 0.1453 | 0.0488 | | 3.1969 | 91.0 | 1183 | 4.3054 | 0.82 | 0.2976 | 1.6443 | 0.82 | 0.8150 | 0.1402 | 0.0488 | | 3.1969 | 92.0 | 1196 | 4.3060 | 0.82 | 0.2975 | 1.6530 | 0.82 | 0.8150 | 0.1454 | 0.0488 | | 3.1969 | 93.0 | 1209 | 4.3074 | 0.82 | 0.2975 | 1.6961 | 0.82 | 0.8150 | 0.1453 | 0.0488 | | 3.1969 | 94.0 | 1222 | 4.3078 | 0.82 | 0.2975 | 1.6638 | 0.82 | 0.8150 | 0.1454 | 0.0488 | | 3.1969 | 95.0 | 1235 | 4.3092 | 0.82 | 0.2976 | 1.6959 | 0.82 | 0.8150 | 0.1454 | 0.0488 | | 3.1969 | 96.0 | 1248 | 4.3094 | 0.82 | 0.2976 | 1.6957 | 0.82 | 0.8150 | 0.1454 | 0.0488 | | 3.1969 | 97.0 | 1261 | 4.3106 | 0.82 | 0.2977 | 1.6961 | 0.82 | 0.8150 | 0.1455 | 0.0489 | | 3.1969 | 98.0 | 1274 | 4.3110 | 0.82 | 0.2977 | 1.6960 | 0.82 | 0.8150 | 0.1455 | 0.0488 | | 3.1969 | 99.0 | 1287 | 4.3111 | 0.82 | 0.2977 | 1.6959 | 0.82 | 0.8150 | 0.1454 | 0.0488 | | 3.1969 | 100.0 | 1300 | 4.3111 | 0.82 | 0.2977 | 1.6959 | 0.82 | 0.8150 | 0.1454 | 0.0488 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 
2.9.0 - Tokenizers 0.13.2
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/225-tiny_tobacco3482_og_simkd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 225-tiny_tobacco3482_og_simkd This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1232.9076 - Accuracy: 0.845 - Brier Loss: 0.2612 - Nll: 1.7130 - F1 Micro: 0.845 - F1 Macro: 0.8331 - Ece: 0.1516 - Aurc: 0.0543 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 50 | 1271.0875 | 0.2 | 0.8576 | 4.7335 | 0.2000 | 0.0749 | 0.2387 | 0.5656 | | No log | 2.0 | 100 | 1266.3860 | 0.48 | 0.6981 | 2.5151 | 0.48 | 0.3418 | 0.2977 | 0.2958 | | No log | 3.0 | 150 | 1260.7906 | 0.65 | 0.5606 | 2.2427 | 0.65 | 0.5932 | 0.3175 | 0.1738 | | No log | 4.0 | 200 | 1257.9875 | 0.635 | 0.5237 | 2.6934 | 0.635 | 0.5258 | 0.2661 | 0.1758 | | No log | 5.0 | 250 | 1253.2234 | 0.68 | 0.4767 | 1.7822 | 0.68 | 0.5759 | 0.2604 | 0.1457 | | No log | 6.0 | 300 | 1255.6379 | 0.65 | 0.4880 | 2.1128 | 0.65 | 0.5824 | 0.2301 | 0.1431 | | No log | 7.0 | 350 | 1255.5657 | 0.68 | 0.4868 | 2.1551 | 0.68 | 0.6119 | 0.2091 | 0.1567 | | No log | 8.0 | 400 | 1249.9714 | 0.7 | 0.4528 | 2.3724 | 0.7 | 0.6674 | 0.2404 | 0.1247 | | No log | 9.0 | 450 | 1252.9314 | 0.75 | 0.4210 | 2.0979 | 0.75 | 0.7443 | 0.2484 | 0.1069 | | 1335.0542 | 10.0 | 500 | 1259.8634 | 0.685 | 0.4673 | 3.3686 | 0.685 | 0.6315 | 0.2186 | 0.1396 | | 1335.0542 | 11.0 | 550 | 1260.2100 | 0.655 | 0.4945 | 3.6990 | 0.655 | 0.5917 | 0.2486 | 0.1514 | | 1335.0542 | 12.0 | 600 | 1242.3972 | 0.73 | 0.3849 | 2.4725 | 0.7300 | 0.7323 | 0.2188 | 0.0916 | | 1335.0542 | 13.0 | 650 | 1242.8196 | 0.785 | 0.3174 | 2.0359 | 0.785 | 0.7707 | 0.1768 | 0.0635 | | 1335.0542 | 14.0 | 700 | 1242.0604 | 0.76 | 0.3600 | 2.5990 | 0.76 | 0.7563 | 0.2073 | 0.0806 | | 1335.0542 | 15.0 | 750 | 1249.1082 | 0.81 | 0.3139 | 2.3924 | 0.81 | 0.7738 | 0.2004 | 0.0686 | | 1335.0542 | 16.0 | 800 | 1235.6434 | 0.815 | 0.2970 | 2.1356 | 0.815 | 0.7957 | 0.1658 | 0.0874 | | 1335.0542 | 17.0 | 850 | 1246.7532 | 0.805 | 0.3122 | 2.3216 | 0.805 | 0.7720 | 0.1863 | 0.0648 | | 1335.0542 | 18.0 | 900 | 1245.1010 | 0.8 | 0.3012 | 1.8121 | 0.8000 | 0.7747 | 0.1744 | 0.0587 | | 1335.0542 | 19.0 | 950 | 1248.3215 | 0.79 | 0.3271 | 1.9954 | 0.79 | 0.7668 | 0.2001 | 0.0653 | | 1322.7586 | 20.0 | 1000 | 1246.8469 | 0.785 | 0.3455 | 2.4830 | 0.785 | 0.7633 | 0.1968 | 0.0788 | | 1322.7586 | 21.0 | 1050 | 1248.2886 | 0.8 | 0.3211 | 2.2441 | 0.8000 | 0.7899 | 0.1820 | 0.0692 | | 1322.7586 | 22.0 | 1100 | 1243.5780 | 0.805 | 0.3075 | 2.3942 | 0.805 | 0.7995 | 0.1963 | 0.0614 | | 1322.7586 | 23.0 | 1150 | 1244.5054 | 0.82 | 0.2993 | 2.0642 | 0.82 | 0.7988 | 0.1923 | 0.0560 | | 1322.7586 
| 24.0 | 1200 | 1243.2177 | 0.82 | 0.2948 | 1.9610 | 0.82 | 0.8156 | 0.1860 | 0.0573 | | 1322.7586 | 25.0 | 1250 | 1244.5066 | 0.8 | 0.3089 | 2.0136 | 0.8000 | 0.7677 | 0.1817 | 0.0511 | | 1322.7586 | 26.0 | 1300 | 1241.2290 | 0.835 | 0.2683 | 1.6252 | 0.835 | 0.8179 | 0.1716 | 0.0473 | | 1322.7586 | 27.0 | 1350 | 1242.3634 | 0.815 | 0.2971 | 2.1384 | 0.815 | 0.8043 | 0.1805 | 0.0586 | | 1322.7586 | 28.0 | 1400 | 1248.5602 | 0.805 | 0.3035 | 2.3228 | 0.805 | 0.7644 | 0.1682 | 0.0628 | | 1322.7586 | 29.0 | 1450 | 1241.1305 | 0.825 | 0.2758 | 2.0506 | 0.825 | 0.8003 | 0.1599 | 0.0513 | | 1318.3501 | 30.0 | 1500 | 1234.8096 | 0.84 | 0.2547 | 1.8920 | 0.8400 | 0.8217 | 0.1542 | 0.0556 | | 1318.3501 | 31.0 | 1550 | 1235.2516 | 0.84 | 0.2426 | 1.8788 | 0.8400 | 0.8250 | 0.1429 | 0.0380 | | 1318.3501 | 32.0 | 1600 | 1237.9358 | 0.835 | 0.2643 | 1.7957 | 0.835 | 0.8171 | 0.1596 | 0.0431 | | 1318.3501 | 33.0 | 1650 | 1231.1899 | 0.86 | 0.2449 | 1.8820 | 0.8600 | 0.8424 | 0.1565 | 0.0519 | | 1318.3501 | 34.0 | 1700 | 1241.4664 | 0.84 | 0.2614 | 1.7047 | 0.8400 | 0.8240 | 0.1771 | 0.0491 | | 1318.3501 | 35.0 | 1750 | 1241.1458 | 0.85 | 0.2485 | 1.8466 | 0.85 | 0.8372 | 0.1585 | 0.0403 | | 1318.3501 | 36.0 | 1800 | 1238.1477 | 0.845 | 0.2570 | 1.8164 | 0.845 | 0.8236 | 0.1604 | 0.0739 | | 1318.3501 | 37.0 | 1850 | 1238.3875 | 0.85 | 0.2646 | 1.9949 | 0.85 | 0.8333 | 0.1638 | 0.0641 | | 1318.3501 | 38.0 | 1900 | 1238.3080 | 0.86 | 0.2393 | 1.7820 | 0.8600 | 0.8458 | 0.1528 | 0.0474 | | 1318.3501 | 39.0 | 1950 | 1235.3929 | 0.86 | 0.2459 | 1.8287 | 0.8600 | 0.8544 | 0.1636 | 0.0556 | | 1315.684 | 40.0 | 2000 | 1239.4463 | 0.86 | 0.2420 | 1.6866 | 0.8600 | 0.8423 | 0.1507 | 0.0378 | | 1315.684 | 41.0 | 2050 | 1237.7450 | 0.85 | 0.2523 | 1.9391 | 0.85 | 0.8387 | 0.1452 | 0.0536 | | 1315.684 | 42.0 | 2100 | 1237.9618 | 0.86 | 0.2476 | 1.8292 | 0.8600 | 0.8481 | 0.1636 | 0.0509 | | 1315.684 | 43.0 | 2150 | 1235.4918 | 0.845 | 0.2661 | 1.9061 | 0.845 | 0.8333 | 0.1551 | 0.0663 | | 1315.684 | 44.0 | 2200 | 1239.4510 | 0.865 | 0.2423 | 1.6291 | 0.865 | 0.8515 | 0.1553 | 0.0565 | | 1315.684 | 45.0 | 2250 | 1237.6595 | 0.85 | 0.2470 | 1.8245 | 0.85 | 0.8346 | 0.1514 | 0.0554 | | 1315.684 | 46.0 | 2300 | 1238.8110 | 0.835 | 0.2543 | 1.8533 | 0.835 | 0.8232 | 0.1356 | 0.0412 | | 1315.684 | 47.0 | 2350 | 1240.4524 | 0.855 | 0.2487 | 1.7030 | 0.855 | 0.8489 | 0.1489 | 0.0402 | | 1315.684 | 48.0 | 2400 | 1239.2617 | 0.87 | 0.2387 | 1.6849 | 0.87 | 0.8573 | 0.1506 | 0.0404 | | 1315.684 | 49.0 | 2450 | 1240.5238 | 0.85 | 0.2544 | 1.8495 | 0.85 | 0.8365 | 0.1514 | 0.0593 | | 1313.9472 | 50.0 | 2500 | 1224.2273 | 0.87 | 0.2408 | 1.8714 | 0.87 | 0.8505 | 0.1504 | 0.0757 | | 1313.9472 | 51.0 | 2550 | 1239.5197 | 0.85 | 0.2599 | 1.7630 | 0.85 | 0.8371 | 0.1659 | 0.0587 | | 1313.9472 | 52.0 | 2600 | 1237.7816 | 0.865 | 0.2353 | 1.7327 | 0.865 | 0.8518 | 0.1456 | 0.0461 | | 1313.9472 | 53.0 | 2650 | 1236.0118 | 0.865 | 0.2414 | 1.7887 | 0.865 | 0.8539 | 0.1607 | 0.0614 | | 1313.9472 | 54.0 | 2700 | 1236.8806 | 0.875 | 0.2323 | 1.5017 | 0.875 | 0.8657 | 0.1481 | 0.0481 | | 1313.9472 | 55.0 | 2750 | 1232.2323 | 0.865 | 0.2215 | 1.6013 | 0.865 | 0.8524 | 0.1395 | 0.0541 | | 1313.9472 | 56.0 | 2800 | 1239.0905 | 0.845 | 0.2495 | 1.6543 | 0.845 | 0.8321 | 0.1602 | 0.0401 | | 1313.9472 | 57.0 | 2850 | 1232.8143 | 0.86 | 0.2561 | 1.8669 | 0.8600 | 0.8475 | 0.1551 | 0.0573 | | 1313.9472 | 58.0 | 2900 | 1239.1449 | 0.86 | 0.2430 | 1.7579 | 0.8600 | 0.8469 | 0.1491 | 0.0413 | | 1313.9472 | 59.0 | 2950 | 1233.6656 | 0.86 | 0.2540 | 
1.8388 | 0.8600 | 0.8394 | 0.1534 | 0.0723 | | 1312.3446 | 60.0 | 3000 | 1237.9213 | 0.86 | 0.2467 | 1.7297 | 0.8600 | 0.8495 | 0.1516 | 0.0413 | | 1312.3446 | 61.0 | 3050 | 1234.9781 | 0.855 | 0.2436 | 1.7597 | 0.855 | 0.8445 | 0.1441 | 0.0650 | | 1312.3446 | 62.0 | 3100 | 1233.9817 | 0.86 | 0.2396 | 1.7017 | 0.8600 | 0.8446 | 0.1526 | 0.0517 | | 1312.3446 | 63.0 | 3150 | 1231.5956 | 0.88 | 0.2261 | 1.8523 | 0.88 | 0.8694 | 0.1428 | 0.0502 | | 1312.3446 | 64.0 | 3200 | 1231.5542 | 0.85 | 0.2574 | 1.7764 | 0.85 | 0.8335 | 0.1640 | 0.0445 | | 1312.3446 | 65.0 | 3250 | 1235.3212 | 0.86 | 0.2387 | 1.7426 | 0.8600 | 0.8458 | 0.1437 | 0.0569 | | 1312.3446 | 66.0 | 3300 | 1234.1420 | 0.86 | 0.2446 | 1.5064 | 0.8600 | 0.8532 | 0.1491 | 0.0504 | | 1312.3446 | 67.0 | 3350 | 1234.3502 | 0.855 | 0.2418 | 1.7734 | 0.855 | 0.8426 | 0.1439 | 0.0560 | | 1312.3446 | 68.0 | 3400 | 1235.5698 | 0.865 | 0.2367 | 1.6948 | 0.865 | 0.8563 | 0.1520 | 0.0462 | | 1312.3446 | 69.0 | 3450 | 1234.6050 | 0.86 | 0.2403 | 1.7511 | 0.8600 | 0.8458 | 0.1405 | 0.0498 | | 1311.1923 | 70.0 | 3500 | 1235.3489 | 0.835 | 0.2654 | 1.7495 | 0.835 | 0.8199 | 0.1503 | 0.0512 | | 1311.1923 | 71.0 | 3550 | 1233.4445 | 0.855 | 0.2671 | 1.7786 | 0.855 | 0.8465 | 0.1558 | 0.0589 | | 1311.1923 | 72.0 | 3600 | 1234.6138 | 0.86 | 0.2543 | 1.6259 | 0.8600 | 0.8487 | 0.1559 | 0.0612 | | 1311.1923 | 73.0 | 3650 | 1234.8722 | 0.86 | 0.2407 | 1.5390 | 0.8600 | 0.8487 | 0.1471 | 0.0566 | | 1311.1923 | 74.0 | 3700 | 1233.2711 | 0.87 | 0.2436 | 1.7559 | 0.87 | 0.8575 | 0.1554 | 0.0497 | | 1311.1923 | 75.0 | 3750 | 1235.2708 | 0.865 | 0.2386 | 1.6956 | 0.865 | 0.8528 | 0.1520 | 0.0554 | | 1311.1923 | 76.0 | 3800 | 1233.7223 | 0.865 | 0.2385 | 1.5563 | 0.865 | 0.8511 | 0.1429 | 0.0565 | | 1311.1923 | 77.0 | 3850 | 1234.5378 | 0.865 | 0.2441 | 1.5156 | 0.865 | 0.8528 | 0.1633 | 0.0547 | | 1311.1923 | 78.0 | 3900 | 1238.3745 | 0.85 | 0.2469 | 1.4876 | 0.85 | 0.8335 | 0.1400 | 0.0465 | | 1311.1923 | 79.0 | 3950 | 1237.1874 | 0.86 | 0.2460 | 1.6190 | 0.8600 | 0.8451 | 0.1579 | 0.0537 | | 1310.5116 | 80.0 | 4000 | 1235.0160 | 0.865 | 0.2379 | 1.4887 | 0.865 | 0.8618 | 0.1567 | 0.0547 | | 1310.5116 | 81.0 | 4050 | 1233.9181 | 0.85 | 0.2590 | 1.7047 | 0.85 | 0.8362 | 0.1631 | 0.0590 | | 1310.5116 | 82.0 | 4100 | 1237.2312 | 0.865 | 0.2485 | 1.6650 | 0.865 | 0.8549 | 0.1674 | 0.0462 | | 1310.5116 | 83.0 | 4150 | 1234.4546 | 0.86 | 0.2472 | 1.5453 | 0.8600 | 0.8497 | 0.1449 | 0.0578 | | 1310.5116 | 84.0 | 4200 | 1230.0541 | 0.85 | 0.2579 | 1.7589 | 0.85 | 0.8371 | 0.1493 | 0.0615 | | 1310.5116 | 85.0 | 4250 | 1234.1154 | 0.855 | 0.2523 | 1.6053 | 0.855 | 0.8423 | 0.1560 | 0.0541 | | 1310.5116 | 86.0 | 4300 | 1235.0112 | 0.86 | 0.2541 | 1.6794 | 0.8600 | 0.8497 | 0.1582 | 0.0525 | | 1310.5116 | 87.0 | 4350 | 1234.1501 | 0.845 | 0.2566 | 1.7223 | 0.845 | 0.8317 | 0.1568 | 0.0496 | | 1310.5116 | 88.0 | 4400 | 1233.6084 | 0.85 | 0.2575 | 1.6697 | 0.85 | 0.8401 | 0.1606 | 0.0509 | | 1310.5116 | 89.0 | 4450 | 1230.3450 | 0.855 | 0.2541 | 1.6316 | 0.855 | 0.8402 | 0.1397 | 0.0567 | | 1309.7242 | 90.0 | 4500 | 1233.0825 | 0.85 | 0.2584 | 1.7262 | 0.85 | 0.8371 | 0.1430 | 0.0509 | | 1309.7242 | 91.0 | 4550 | 1235.7081 | 0.855 | 0.2551 | 1.5906 | 0.855 | 0.8402 | 0.1449 | 0.0504 | | 1309.7242 | 92.0 | 4600 | 1234.7166 | 0.855 | 0.2545 | 1.6704 | 0.855 | 0.8501 | 0.1431 | 0.0519 | | 1309.7242 | 93.0 | 4650 | 1235.1996 | 0.85 | 0.2597 | 1.7262 | 0.85 | 0.8371 | 0.1595 | 0.0525 | | 1309.7242 | 94.0 | 4700 | 1233.9705 | 0.845 | 0.2555 | 1.7033 | 0.845 | 0.8331 | 0.1533 | 
0.0521 | | 1309.7242 | 95.0 | 4750 | 1229.2874 | 0.845 | 0.2605 | 1.7454 | 0.845 | 0.8331 | 0.1535 | 0.0530 | | 1309.7242 | 96.0 | 4800 | 1233.8939 | 0.845 | 0.2614 | 1.7398 | 0.845 | 0.8331 | 0.1540 | 0.0527 | | 1309.7242 | 97.0 | 4850 | 1234.3517 | 0.85 | 0.2641 | 1.7150 | 0.85 | 0.8371 | 0.1504 | 0.0543 | | 1309.7242 | 98.0 | 4900 | 1236.2716 | 0.845 | 0.2594 | 1.6780 | 0.845 | 0.8331 | 0.1440 | 0.0502 | | 1309.7242 | 99.0 | 4950 | 1231.1798 | 0.85 | 0.2633 | 1.7550 | 0.85 | 0.8371 | 0.1446 | 0.0565 | | 1309.5659 | 100.0 | 5000 | 1232.9076 | 0.845 | 0.2612 | 1.7130 | 0.845 | 0.8331 | 0.1516 | 0.0543 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/171-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 171-tiny_tobacco3482_kd_CEKD_t2.5_a0.5 This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.4688 - Accuracy: 0.815 - Brier Loss: 0.3067 - Nll: 1.4679 - F1 Micro: 0.815 - F1 Macro: 0.7970 - Ece: 0.2440 - Aurc: 0.0500 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 13 | 1.6008 | 0.23 | 0.8921 | 8.0367 | 0.23 | 0.1380 | 0.3153 | 0.7486 | | No log | 2.0 | 26 | 1.1383 | 0.445 | 0.6997 | 3.6320 | 0.445 | 0.3583 | 0.2866 | 0.3390 | | No log | 3.0 | 39 | 0.9781 | 0.555 | 0.5896 | 2.1989 | 0.555 | 0.4763 | 0.2856 | 0.2440 | | No log | 4.0 | 52 | 0.7953 | 0.65 | 0.4796 | 1.7904 | 0.65 | 0.5880 | 0.2308 | 0.1417 | | No log | 5.0 | 65 | 0.7282 | 0.705 | 0.4370 | 1.4923 | 0.705 | 0.6654 | 0.2538 | 0.1123 | | No log | 6.0 | 78 | 0.6794 | 0.73 | 0.3987 | 1.5706 | 0.7300 | 0.6928 | 0.2386 | 0.1041 | | No log | 7.0 | 91 | 0.6813 | 0.73 | 0.4024 | 1.6519 | 0.7300 | 0.6984 | 0.2553 | 0.1027 | | No log | 8.0 | 104 | 0.6669 | 0.72 | 0.3910 | 1.6057 | 0.72 | 0.6811 | 0.2234 | 0.0990 | | No log | 9.0 | 117 | 0.7152 | 0.72 | 0.4167 | 1.9716 | 0.72 | 0.7201 | 0.2259 | 0.1091 | | No log | 10.0 | 130 | 0.6722 | 0.745 | 0.3751 | 1.9561 | 0.745 | 0.7290 | 0.2362 | 0.0849 | | No log | 11.0 | 143 | 0.6263 | 0.75 | 0.3817 | 1.8594 | 0.75 | 0.7238 | 0.2511 | 0.0980 | | No log | 12.0 | 156 | 0.6259 | 0.725 | 0.3946 | 1.8363 | 0.7250 | 0.6835 | 0.2186 | 0.0974 | | No log | 13.0 | 169 | 0.5756 | 0.77 | 0.3487 | 1.3847 | 0.7700 | 0.7171 | 0.2271 | 0.0723 | | No log | 14.0 | 182 | 0.5670 | 0.76 | 0.3492 | 1.7986 | 0.76 | 0.7323 | 0.2201 | 0.0713 | | No log | 15.0 | 195 | 0.5538 | 0.785 | 0.3532 | 1.6319 | 0.785 | 0.7608 | 0.2479 | 0.0629 | | No log | 16.0 | 208 | 0.5634 | 0.75 | 0.3582 | 1.5131 | 0.75 | 0.7397 | 0.2333 | 0.0747 | | No log | 17.0 | 221 | 0.5348 | 0.77 | 0.3378 | 1.5843 | 0.7700 | 0.7421 | 0.2193 | 0.0646 | | No log | 18.0 | 234 | 0.5306 | 0.78 | 0.3310 | 1.6298 | 0.78 | 0.7618 | 0.2290 | 0.0644 | | No log | 19.0 | 247 | 0.5185 | 0.805 | 0.3400 | 1.4945 | 0.805 | 0.7755 | 0.2627 | 0.0622 | | No log | 20.0 | 260 | 0.5335 | 0.76 | 0.3402 | 1.5758 | 0.76 | 0.7108 | 0.2372 | 0.0699 | | No log | 21.0 | 273 | 0.5191 | 0.76 | 0.3389 | 1.3860 | 0.76 | 0.7413 | 0.2587 | 0.0661 | | No log | 22.0 | 286 | 0.5198 | 0.785 | 0.3423 | 1.4790 | 0.785 | 0.7607 | 0.2513 | 0.0649 | | No log | 23.0 | 299 | 0.5155 | 0.79 | 0.3344 | 1.5003 | 0.79 | 0.7648 | 0.2393 | 0.0671 | | No log | 24.0 | 312 | 0.5156 | 0.775 | 0.3380 | 1.5898 | 0.775 | 0.7388 | 0.2295 | 0.0667 | | No log | 25.0 | 325 | 
0.4808 | 0.815 | 0.3033 | 1.4602 | 0.815 | 0.7837 | 0.2520 | 0.0520 | | No log | 26.0 | 338 | 0.4975 | 0.785 | 0.3325 | 1.3864 | 0.785 | 0.7563 | 0.2298 | 0.0673 | | No log | 27.0 | 351 | 0.4988 | 0.785 | 0.3257 | 1.5206 | 0.785 | 0.7717 | 0.2156 | 0.0638 | | No log | 28.0 | 364 | 0.4928 | 0.795 | 0.3209 | 1.3717 | 0.795 | 0.7719 | 0.2303 | 0.0612 | | No log | 29.0 | 377 | 0.4660 | 0.81 | 0.3022 | 1.2190 | 0.81 | 0.7864 | 0.2285 | 0.0485 | | No log | 30.0 | 390 | 0.4777 | 0.815 | 0.3123 | 1.4266 | 0.815 | 0.7926 | 0.2535 | 0.0562 | | No log | 31.0 | 403 | 0.4695 | 0.82 | 0.3067 | 1.3425 | 0.82 | 0.8000 | 0.2338 | 0.0528 | | No log | 32.0 | 416 | 0.4701 | 0.815 | 0.3026 | 1.3247 | 0.815 | 0.7893 | 0.2259 | 0.0522 | | No log | 33.0 | 429 | 0.4625 | 0.82 | 0.3023 | 1.2646 | 0.82 | 0.7915 | 0.2441 | 0.0486 | | No log | 34.0 | 442 | 0.4684 | 0.81 | 0.3080 | 1.3468 | 0.81 | 0.7846 | 0.2373 | 0.0521 | | No log | 35.0 | 455 | 0.4629 | 0.81 | 0.3000 | 1.3441 | 0.81 | 0.7869 | 0.2375 | 0.0492 | | No log | 36.0 | 468 | 0.4680 | 0.81 | 0.3074 | 1.2158 | 0.81 | 0.7894 | 0.2417 | 0.0508 | | No log | 37.0 | 481 | 0.4672 | 0.81 | 0.3053 | 1.3329 | 0.81 | 0.7866 | 0.2320 | 0.0508 | | No log | 38.0 | 494 | 0.4716 | 0.805 | 0.3091 | 1.2975 | 0.805 | 0.7863 | 0.2361 | 0.0545 | | 0.3111 | 39.0 | 507 | 0.4703 | 0.805 | 0.3081 | 1.2855 | 0.805 | 0.7863 | 0.2473 | 0.0534 | | 0.3111 | 40.0 | 520 | 0.4692 | 0.81 | 0.3073 | 1.2833 | 0.81 | 0.7894 | 0.2361 | 0.0525 | | 0.3111 | 41.0 | 533 | 0.4681 | 0.81 | 0.3068 | 1.2804 | 0.81 | 0.7890 | 0.2386 | 0.0517 | | 0.3111 | 42.0 | 546 | 0.4672 | 0.81 | 0.3058 | 1.4597 | 0.81 | 0.7898 | 0.2276 | 0.0521 | | 0.3111 | 43.0 | 559 | 0.4691 | 0.81 | 0.3080 | 1.4136 | 0.81 | 0.7894 | 0.2280 | 0.0520 | | 0.3111 | 44.0 | 572 | 0.4664 | 0.815 | 0.3048 | 1.4593 | 0.815 | 0.7921 | 0.2459 | 0.0509 | | 0.3111 | 45.0 | 585 | 0.4684 | 0.81 | 0.3069 | 1.4071 | 0.81 | 0.7894 | 0.2415 | 0.0514 | | 0.3111 | 46.0 | 598 | 0.4688 | 0.81 | 0.3066 | 1.4084 | 0.81 | 0.7890 | 0.2174 | 0.0516 | | 0.3111 | 47.0 | 611 | 0.4683 | 0.81 | 0.3061 | 1.4052 | 0.81 | 0.7890 | 0.2406 | 0.0515 | | 0.3111 | 48.0 | 624 | 0.4677 | 0.81 | 0.3065 | 1.4045 | 0.81 | 0.7890 | 0.2346 | 0.0508 | | 0.3111 | 49.0 | 637 | 0.4679 | 0.81 | 0.3058 | 1.4072 | 0.81 | 0.7890 | 0.2177 | 0.0507 | | 0.3111 | 50.0 | 650 | 0.4679 | 0.81 | 0.3061 | 1.4681 | 0.81 | 0.7890 | 0.2619 | 0.0510 | | 0.3111 | 51.0 | 663 | 0.4688 | 0.81 | 0.3068 | 1.4662 | 0.81 | 0.7890 | 0.2325 | 0.0513 | | 0.3111 | 52.0 | 676 | 0.4679 | 0.81 | 0.3063 | 1.4062 | 0.81 | 0.7890 | 0.2257 | 0.0508 | | 0.3111 | 53.0 | 689 | 0.4682 | 0.81 | 0.3064 | 1.4667 | 0.81 | 0.7890 | 0.2279 | 0.0512 | | 0.3111 | 54.0 | 702 | 0.4674 | 0.81 | 0.3058 | 1.4075 | 0.81 | 0.7890 | 0.2269 | 0.0507 | | 0.3111 | 55.0 | 715 | 0.4689 | 0.81 | 0.3069 | 1.4674 | 0.81 | 0.7890 | 0.2428 | 0.0511 | | 0.3111 | 56.0 | 728 | 0.4678 | 0.81 | 0.3062 | 1.4081 | 0.81 | 0.7890 | 0.2402 | 0.0507 | | 0.3111 | 57.0 | 741 | 0.4691 | 0.81 | 0.3069 | 1.4691 | 0.81 | 0.7890 | 0.2279 | 0.0511 | | 0.3111 | 58.0 | 754 | 0.4686 | 0.81 | 0.3067 | 1.4114 | 0.81 | 0.7890 | 0.2647 | 0.0510 | | 0.3111 | 59.0 | 767 | 0.4688 | 0.81 | 0.3069 | 1.4130 | 0.81 | 0.7890 | 0.2416 | 0.0510 | | 0.3111 | 60.0 | 780 | 0.4685 | 0.81 | 0.3065 | 1.4206 | 0.81 | 0.7890 | 0.2278 | 0.0509 | | 0.3111 | 61.0 | 793 | 0.4688 | 0.81 | 0.3069 | 1.4145 | 0.81 | 0.7890 | 0.2307 | 0.0513 | | 0.3111 | 62.0 | 806 | 0.4690 | 0.81 | 0.3070 | 1.4681 | 0.81 | 0.7890 | 0.2437 | 0.0510 | | 0.3111 | 63.0 | 819 | 0.4688 | 0.81 | 0.3068 | 1.4680 | 0.81 | 
0.7890 | 0.2465 | 0.0510 | | 0.3111 | 64.0 | 832 | 0.4681 | 0.81 | 0.3062 | 1.4670 | 0.81 | 0.7890 | 0.2565 | 0.0507 | | 0.3111 | 65.0 | 845 | 0.4690 | 0.81 | 0.3069 | 1.4675 | 0.81 | 0.7890 | 0.2444 | 0.0510 | | 0.3111 | 66.0 | 858 | 0.4688 | 0.81 | 0.3069 | 1.4673 | 0.81 | 0.7890 | 0.2433 | 0.0510 | | 0.3111 | 67.0 | 871 | 0.4686 | 0.81 | 0.3066 | 1.4676 | 0.81 | 0.7890 | 0.2560 | 0.0507 | | 0.3111 | 68.0 | 884 | 0.4684 | 0.81 | 0.3064 | 1.4667 | 0.81 | 0.7890 | 0.2496 | 0.0506 | | 0.3111 | 69.0 | 897 | 0.4686 | 0.81 | 0.3066 | 1.4675 | 0.81 | 0.7890 | 0.2407 | 0.0507 | | 0.3111 | 70.0 | 910 | 0.4689 | 0.81 | 0.3068 | 1.4679 | 0.81 | 0.7890 | 0.2502 | 0.0508 | | 0.3111 | 71.0 | 923 | 0.4690 | 0.81 | 0.3071 | 1.4687 | 0.81 | 0.7890 | 0.2445 | 0.0507 | | 0.3111 | 72.0 | 936 | 0.4688 | 0.81 | 0.3068 | 1.4678 | 0.81 | 0.7890 | 0.2500 | 0.0506 | | 0.3111 | 73.0 | 949 | 0.4689 | 0.81 | 0.3068 | 1.4685 | 0.81 | 0.7890 | 0.2662 | 0.0510 | | 0.3111 | 74.0 | 962 | 0.4687 | 0.81 | 0.3067 | 1.4679 | 0.81 | 0.7890 | 0.2496 | 0.0507 | | 0.3111 | 75.0 | 975 | 0.4688 | 0.81 | 0.3067 | 1.4683 | 0.81 | 0.7890 | 0.2468 | 0.0508 | | 0.3111 | 76.0 | 988 | 0.4688 | 0.81 | 0.3067 | 1.4676 | 0.81 | 0.7890 | 0.2511 | 0.0508 | | 0.1126 | 77.0 | 1001 | 0.4689 | 0.81 | 0.3068 | 1.4672 | 0.81 | 0.7890 | 0.2365 | 0.0506 | | 0.1126 | 78.0 | 1014 | 0.4688 | 0.81 | 0.3066 | 1.4681 | 0.81 | 0.7890 | 0.2507 | 0.0507 | | 0.1126 | 79.0 | 1027 | 0.4688 | 0.81 | 0.3068 | 1.4680 | 0.81 | 0.7890 | 0.2498 | 0.0508 | | 0.1126 | 80.0 | 1040 | 0.4689 | 0.81 | 0.3068 | 1.4676 | 0.81 | 0.7890 | 0.2497 | 0.0507 | | 0.1126 | 81.0 | 1053 | 0.4690 | 0.81 | 0.3068 | 1.4682 | 0.81 | 0.7890 | 0.2338 | 0.0506 | | 0.1126 | 82.0 | 1066 | 0.4686 | 0.81 | 0.3065 | 1.4682 | 0.81 | 0.7890 | 0.2541 | 0.0505 | | 0.1126 | 83.0 | 1079 | 0.4689 | 0.815 | 0.3067 | 1.4675 | 0.815 | 0.7970 | 0.2503 | 0.0501 | | 0.1126 | 84.0 | 1092 | 0.4687 | 0.815 | 0.3065 | 1.4676 | 0.815 | 0.7970 | 0.2567 | 0.0501 | | 0.1126 | 85.0 | 1105 | 0.4689 | 0.81 | 0.3067 | 1.4680 | 0.81 | 0.7890 | 0.2678 | 0.0507 | | 0.1126 | 86.0 | 1118 | 0.4689 | 0.815 | 0.3067 | 1.4684 | 0.815 | 0.7970 | 0.2566 | 0.0502 | | 0.1126 | 87.0 | 1131 | 0.4687 | 0.815 | 0.3066 | 1.4672 | 0.815 | 0.7970 | 0.2529 | 0.0501 | | 0.1126 | 88.0 | 1144 | 0.4689 | 0.815 | 0.3067 | 1.4680 | 0.815 | 0.7970 | 0.2569 | 0.0502 | | 0.1126 | 89.0 | 1157 | 0.4688 | 0.815 | 0.3067 | 1.4678 | 0.815 | 0.7970 | 0.2527 | 0.0500 | | 0.1126 | 90.0 | 1170 | 0.4689 | 0.815 | 0.3067 | 1.4681 | 0.815 | 0.7970 | 0.2527 | 0.0501 | | 0.1126 | 91.0 | 1183 | 0.4688 | 0.815 | 0.3067 | 1.4683 | 0.815 | 0.7970 | 0.2527 | 0.0500 | | 0.1126 | 92.0 | 1196 | 0.4688 | 0.815 | 0.3066 | 1.4675 | 0.815 | 0.7970 | 0.2528 | 0.0500 | | 0.1126 | 93.0 | 1209 | 0.4689 | 0.815 | 0.3068 | 1.4680 | 0.815 | 0.7970 | 0.2527 | 0.0500 | | 0.1126 | 94.0 | 1222 | 0.4688 | 0.815 | 0.3066 | 1.4678 | 0.815 | 0.7970 | 0.2440 | 0.0499 | | 0.1126 | 95.0 | 1235 | 0.4688 | 0.815 | 0.3066 | 1.4677 | 0.815 | 0.7970 | 0.2440 | 0.0499 | | 0.1126 | 96.0 | 1248 | 0.4688 | 0.815 | 0.3067 | 1.4681 | 0.815 | 0.7970 | 0.2528 | 0.0500 | | 0.1126 | 97.0 | 1261 | 0.4688 | 0.815 | 0.3066 | 1.4679 | 0.815 | 0.7970 | 0.2440 | 0.0500 | | 0.1126 | 98.0 | 1274 | 0.4689 | 0.815 | 0.3067 | 1.4680 | 0.815 | 0.7970 | 0.2440 | 0.0500 | | 0.1126 | 99.0 | 1287 | 0.4689 | 0.815 | 0.3067 | 1.4679 | 0.815 | 0.7970 | 0.2440 | 0.0500 | | 0.1126 | 100.0 | 1300 | 0.4688 | 0.815 | 0.3067 | 1.4679 | 0.815 | 0.7970 | 0.2440 | 0.0500 | ### Framework versions - Transformers 4.26.1 - Pytorch 
1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/171-tiny_tobacco3482_kd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 171-tiny_tobacco3482_kd This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.2409 - Accuracy: 0.765 - Brier Loss: 0.3934 - Nll: 1.2941 - F1 Micro: 0.765 - F1 Macro: 0.6802 - Ece: 0.2843 - Aurc: 0.0716 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 13 | 1.3775 | 0.235 | 0.8906 | 7.4936 | 0.235 | 0.1630 | 0.2705 | 0.7664 | | No log | 2.0 | 26 | 0.8552 | 0.38 | 0.8035 | 4.2966 | 0.38 | 0.2738 | 0.3086 | 0.4787 | | No log | 3.0 | 39 | 0.6791 | 0.51 | 0.6994 | 3.1397 | 0.51 | 0.3754 | 0.3208 | 0.3181 | | No log | 4.0 | 52 | 0.5747 | 0.59 | 0.6289 | 2.2673 | 0.59 | 0.4483 | 0.3491 | 0.2156 | | No log | 5.0 | 65 | 0.4985 | 0.64 | 0.5620 | 2.3221 | 0.64 | 0.5323 | 0.3258 | 0.1637 | | No log | 6.0 | 78 | 0.5387 | 0.665 | 0.5254 | 2.2659 | 0.665 | 0.5673 | 0.2991 | 0.1582 | | No log | 7.0 | 91 | 0.5050 | 0.66 | 0.5100 | 2.4796 | 0.66 | 0.5360 | 0.2241 | 0.1675 | | No log | 8.0 | 104 | 0.4247 | 0.685 | 0.4864 | 2.1512 | 0.685 | 0.6017 | 0.2848 | 0.1242 | | No log | 9.0 | 117 | 0.5322 | 0.65 | 0.5101 | 1.9216 | 0.65 | 0.5658 | 0.2808 | 0.1775 | | No log | 10.0 | 130 | 0.4128 | 0.705 | 0.4744 | 1.6466 | 0.705 | 0.6320 | 0.3126 | 0.1011 | | No log | 11.0 | 143 | 0.4216 | 0.72 | 0.4621 | 2.3585 | 0.72 | 0.6365 | 0.2807 | 0.1181 | | No log | 12.0 | 156 | 0.3925 | 0.72 | 0.4535 | 2.1047 | 0.72 | 0.6466 | 0.2976 | 0.0989 | | No log | 13.0 | 169 | 0.3664 | 0.71 | 0.4406 | 1.6481 | 0.7100 | 0.6388 | 0.2789 | 0.1056 | | No log | 14.0 | 182 | 0.4002 | 0.715 | 0.4396 | 1.8360 | 0.715 | 0.6437 | 0.2705 | 0.1096 | | No log | 15.0 | 195 | 0.3324 | 0.72 | 0.4156 | 1.8256 | 0.72 | 0.6309 | 0.2857 | 0.0874 | | No log | 16.0 | 208 | 0.3041 | 0.75 | 0.4277 | 1.5808 | 0.75 | 0.6746 | 0.3014 | 0.0843 | | No log | 17.0 | 221 | 0.3178 | 0.78 | 0.4139 | 1.1784 | 0.78 | 0.7004 | 0.3016 | 0.0772 | | No log | 18.0 | 234 | 0.2911 | 0.735 | 0.4210 | 1.4736 | 0.735 | 0.6754 | 0.2909 | 0.0844 | | No log | 19.0 | 247 | 0.2988 | 0.76 | 0.4107 | 1.2700 | 0.76 | 0.6933 | 0.2904 | 0.0775 | | No log | 20.0 | 260 | 0.2904 | 0.745 | 0.4215 | 1.4039 | 0.745 | 0.6686 | 0.2920 | 0.0852 | | No log | 21.0 | 273 | 0.3022 | 0.77 | 0.4196 | 1.0212 | 0.7700 | 0.7040 | 0.3041 | 0.0714 | | No log | 22.0 | 286 | 0.2748 | 0.73 | 0.4106 | 1.1826 | 0.7300 | 0.6715 | 0.2977 | 0.0854 | | No log | 23.0 | 299 | 0.2835 | 0.745 | 0.4079 | 1.2464 | 0.745 | 0.6654 | 0.3083 | 0.0797 | | No log | 24.0 | 312 | 0.2748 | 0.75 | 0.4089 | 1.1540 | 0.75 | 0.6797 | 0.2772 | 0.0802 | | No log | 25.0 | 325 | 0.2818 | 0.735 | 
0.4142 | 1.3465 | 0.735 | 0.6523 | 0.2693 | 0.0916 | | No log | 26.0 | 338 | 0.2666 | 0.74 | 0.4076 | 1.3420 | 0.74 | 0.6560 | 0.2892 | 0.0831 | | No log | 27.0 | 351 | 0.2693 | 0.745 | 0.4083 | 1.4070 | 0.745 | 0.6883 | 0.3074 | 0.0858 | | No log | 28.0 | 364 | 0.2598 | 0.725 | 0.4007 | 1.3015 | 0.7250 | 0.6509 | 0.2874 | 0.0843 | | No log | 29.0 | 377 | 0.2579 | 0.745 | 0.4023 | 1.2920 | 0.745 | 0.6770 | 0.2808 | 0.0788 | | No log | 30.0 | 390 | 0.2606 | 0.745 | 0.4053 | 1.2203 | 0.745 | 0.6643 | 0.2838 | 0.0805 | | No log | 31.0 | 403 | 0.2588 | 0.735 | 0.3982 | 1.3655 | 0.735 | 0.6743 | 0.2941 | 0.0852 | | No log | 32.0 | 416 | 0.2524 | 0.74 | 0.3941 | 1.1515 | 0.74 | 0.6771 | 0.2735 | 0.0813 | | No log | 33.0 | 429 | 0.2579 | 0.765 | 0.4002 | 1.3257 | 0.765 | 0.6993 | 0.2784 | 0.0753 | | No log | 34.0 | 442 | 0.2448 | 0.775 | 0.3981 | 1.2289 | 0.775 | 0.7015 | 0.2923 | 0.0720 | | No log | 35.0 | 455 | 0.2483 | 0.75 | 0.3987 | 1.2485 | 0.75 | 0.6645 | 0.2751 | 0.0751 | | No log | 36.0 | 468 | 0.2417 | 0.765 | 0.3879 | 1.0562 | 0.765 | 0.6856 | 0.2827 | 0.0723 | | No log | 37.0 | 481 | 0.2506 | 0.755 | 0.3944 | 1.2087 | 0.755 | 0.6855 | 0.3045 | 0.0744 | | No log | 38.0 | 494 | 0.2427 | 0.765 | 0.3917 | 1.2356 | 0.765 | 0.6862 | 0.2822 | 0.0703 | | 0.2351 | 39.0 | 507 | 0.2449 | 0.745 | 0.3958 | 1.2868 | 0.745 | 0.6750 | 0.2697 | 0.0762 | | 0.2351 | 40.0 | 520 | 0.2413 | 0.755 | 0.3917 | 1.3279 | 0.755 | 0.6831 | 0.2720 | 0.0724 | | 0.2351 | 41.0 | 533 | 0.2428 | 0.75 | 0.3924 | 1.2369 | 0.75 | 0.6781 | 0.2700 | 0.0766 | | 0.2351 | 42.0 | 546 | 0.2412 | 0.76 | 0.3919 | 1.2235 | 0.76 | 0.6913 | 0.2945 | 0.0732 | | 0.2351 | 43.0 | 559 | 0.2428 | 0.74 | 0.3968 | 1.3021 | 0.74 | 0.6648 | 0.2743 | 0.0773 | | 0.2351 | 44.0 | 572 | 0.2400 | 0.75 | 0.3936 | 1.2410 | 0.75 | 0.6643 | 0.2789 | 0.0723 | | 0.2351 | 45.0 | 585 | 0.2424 | 0.77 | 0.3949 | 1.2480 | 0.7700 | 0.7041 | 0.2813 | 0.0722 | | 0.2351 | 46.0 | 598 | 0.2398 | 0.77 | 0.3931 | 1.2463 | 0.7700 | 0.7005 | 0.3050 | 0.0722 | | 0.2351 | 47.0 | 611 | 0.2397 | 0.77 | 0.3919 | 1.2957 | 0.7700 | 0.6874 | 0.2961 | 0.0703 | | 0.2351 | 48.0 | 624 | 0.2401 | 0.77 | 0.3926 | 1.2360 | 0.7700 | 0.7045 | 0.2945 | 0.0720 | | 0.2351 | 49.0 | 637 | 0.2401 | 0.77 | 0.3927 | 1.2905 | 0.7700 | 0.6876 | 0.2825 | 0.0706 | | 0.2351 | 50.0 | 650 | 0.2413 | 0.765 | 0.3936 | 1.2892 | 0.765 | 0.6978 | 0.3016 | 0.0743 | | 0.2351 | 51.0 | 663 | 0.2410 | 0.77 | 0.3943 | 1.2913 | 0.7700 | 0.7005 | 0.2849 | 0.0728 | | 0.2351 | 52.0 | 676 | 0.2391 | 0.765 | 0.3926 | 1.2846 | 0.765 | 0.6805 | 0.2777 | 0.0710 | | 0.2351 | 53.0 | 689 | 0.2400 | 0.77 | 0.3929 | 1.2927 | 0.7700 | 0.6876 | 0.2698 | 0.0711 | | 0.2351 | 54.0 | 702 | 0.2401 | 0.77 | 0.3929 | 1.2917 | 0.7700 | 0.6876 | 0.2775 | 0.0711 | | 0.2351 | 55.0 | 715 | 0.2405 | 0.775 | 0.3934 | 1.2912 | 0.775 | 0.7074 | 0.2858 | 0.0709 | | 0.2351 | 56.0 | 728 | 0.2403 | 0.775 | 0.3927 | 1.2912 | 0.775 | 0.7077 | 0.3059 | 0.0710 | | 0.2351 | 57.0 | 741 | 0.2403 | 0.77 | 0.3932 | 1.2886 | 0.7700 | 0.6876 | 0.2914 | 0.0709 | | 0.2351 | 58.0 | 754 | 0.2403 | 0.765 | 0.3932 | 1.2893 | 0.765 | 0.6805 | 0.2735 | 0.0717 | | 0.2351 | 59.0 | 767 | 0.2405 | 0.77 | 0.3932 | 1.2923 | 0.7700 | 0.6876 | 0.2915 | 0.0714 | | 0.2351 | 60.0 | 780 | 0.2403 | 0.77 | 0.3931 | 1.2893 | 0.7700 | 0.6874 | 0.2830 | 0.0708 | | 0.2351 | 61.0 | 793 | 0.2405 | 0.77 | 0.3931 | 1.2910 | 0.7700 | 0.6876 | 0.2912 | 0.0711 | | 0.2351 | 62.0 | 806 | 0.2405 | 0.77 | 0.3935 | 1.2905 | 0.7700 | 0.6874 | 0.2831 | 0.0711 | | 0.2351 | 63.0 | 819 | 0.2405 | 
0.765 | 0.3933 | 1.2918 | 0.765 | 0.6805 | 0.2776 | 0.0716 | | 0.2351 | 64.0 | 832 | 0.2405 | 0.77 | 0.3932 | 1.2937 | 0.7700 | 0.6876 | 0.2747 | 0.0712 | | 0.2351 | 65.0 | 845 | 0.2405 | 0.765 | 0.3934 | 1.2927 | 0.765 | 0.6802 | 0.2781 | 0.0717 | | 0.2351 | 66.0 | 858 | 0.2406 | 0.77 | 0.3930 | 1.2926 | 0.7700 | 0.6876 | 0.2780 | 0.0708 | | 0.2351 | 67.0 | 871 | 0.2407 | 0.77 | 0.3934 | 1.2933 | 0.7700 | 0.6876 | 0.2784 | 0.0712 | | 0.2351 | 68.0 | 884 | 0.2407 | 0.77 | 0.3934 | 1.2942 | 0.7700 | 0.6876 | 0.2853 | 0.0714 | | 0.2351 | 69.0 | 897 | 0.2406 | 0.77 | 0.3932 | 1.2920 | 0.7700 | 0.6876 | 0.2780 | 0.0709 | | 0.2351 | 70.0 | 910 | 0.2403 | 0.77 | 0.3931 | 1.2937 | 0.7700 | 0.6874 | 0.2748 | 0.0709 | | 0.2351 | 71.0 | 923 | 0.2407 | 0.77 | 0.3934 | 1.2929 | 0.7700 | 0.6874 | 0.2855 | 0.0710 | | 0.2351 | 72.0 | 936 | 0.2405 | 0.77 | 0.3930 | 1.2944 | 0.7700 | 0.6874 | 0.2779 | 0.0708 | | 0.2351 | 73.0 | 949 | 0.2407 | 0.765 | 0.3931 | 1.2919 | 0.765 | 0.6802 | 0.2799 | 0.0715 | | 0.2351 | 74.0 | 962 | 0.2408 | 0.765 | 0.3933 | 1.2937 | 0.765 | 0.6802 | 0.2647 | 0.0716 | | 0.2351 | 75.0 | 975 | 0.2407 | 0.765 | 0.3932 | 1.2935 | 0.765 | 0.6802 | 0.2728 | 0.0716 | | 0.2351 | 76.0 | 988 | 0.2407 | 0.77 | 0.3933 | 1.2933 | 0.7700 | 0.6874 | 0.2773 | 0.0709 | | 0.0004 | 77.0 | 1001 | 0.2407 | 0.77 | 0.3932 | 1.2941 | 0.7700 | 0.6874 | 0.2892 | 0.0709 | | 0.0004 | 78.0 | 1014 | 0.2408 | 0.77 | 0.3933 | 1.2936 | 0.7700 | 0.6874 | 0.2820 | 0.0709 | | 0.0004 | 79.0 | 1027 | 0.2410 | 0.77 | 0.3934 | 1.2931 | 0.7700 | 0.6874 | 0.2892 | 0.0710 | | 0.0004 | 80.0 | 1040 | 0.2409 | 0.77 | 0.3933 | 1.2929 | 0.7700 | 0.6874 | 0.2855 | 0.0711 | | 0.0004 | 81.0 | 1053 | 0.2409 | 0.77 | 0.3933 | 1.2937 | 0.7700 | 0.6874 | 0.2820 | 0.0709 | | 0.0004 | 82.0 | 1066 | 0.2409 | 0.77 | 0.3934 | 1.2947 | 0.7700 | 0.6874 | 0.2819 | 0.0709 | | 0.0004 | 83.0 | 1079 | 0.2408 | 0.77 | 0.3934 | 1.2933 | 0.7700 | 0.6874 | 0.2893 | 0.0709 | | 0.0004 | 84.0 | 1092 | 0.2409 | 0.765 | 0.3934 | 1.2934 | 0.765 | 0.6802 | 0.2843 | 0.0716 | | 0.0004 | 85.0 | 1105 | 0.2408 | 0.77 | 0.3933 | 1.2933 | 0.7700 | 0.6874 | 0.2893 | 0.0710 | | 0.0004 | 86.0 | 1118 | 0.2409 | 0.77 | 0.3933 | 1.2940 | 0.7700 | 0.6874 | 0.2819 | 0.0710 | | 0.0004 | 87.0 | 1131 | 0.2409 | 0.77 | 0.3934 | 1.2944 | 0.7700 | 0.6874 | 0.2820 | 0.0709 | | 0.0004 | 88.0 | 1144 | 0.2409 | 0.77 | 0.3934 | 1.2936 | 0.7700 | 0.6874 | 0.2893 | 0.0709 | | 0.0004 | 89.0 | 1157 | 0.2409 | 0.765 | 0.3933 | 1.2936 | 0.765 | 0.6802 | 0.2844 | 0.0717 | | 0.0004 | 90.0 | 1170 | 0.2409 | 0.77 | 0.3934 | 1.2939 | 0.7700 | 0.6874 | 0.2893 | 0.0709 | | 0.0004 | 91.0 | 1183 | 0.2409 | 0.765 | 0.3934 | 1.2943 | 0.765 | 0.6802 | 0.2843 | 0.0716 | | 0.0004 | 92.0 | 1196 | 0.2409 | 0.77 | 0.3934 | 1.2942 | 0.7700 | 0.6874 | 0.2893 | 0.0709 | | 0.0004 | 93.0 | 1209 | 0.2410 | 0.765 | 0.3934 | 1.2939 | 0.765 | 0.6802 | 0.2843 | 0.0716 | | 0.0004 | 94.0 | 1222 | 0.2409 | 0.765 | 0.3934 | 1.2937 | 0.765 | 0.6802 | 0.2843 | 0.0716 | | 0.0004 | 95.0 | 1235 | 0.2409 | 0.765 | 0.3934 | 1.2938 | 0.765 | 0.6802 | 0.2844 | 0.0716 | | 0.0004 | 96.0 | 1248 | 0.2409 | 0.765 | 0.3934 | 1.2939 | 0.765 | 0.6802 | 0.2770 | 0.0716 | | 0.0004 | 97.0 | 1261 | 0.2409 | 0.765 | 0.3934 | 1.2941 | 0.765 | 0.6802 | 0.2843 | 0.0715 | | 0.0004 | 98.0 | 1274 | 0.2409 | 0.765 | 0.3934 | 1.2940 | 0.765 | 0.6802 | 0.2844 | 0.0715 | | 0.0004 | 99.0 | 1287 | 0.2409 | 0.765 | 0.3934 | 1.2941 | 0.765 | 0.6802 | 0.2843 | 0.0716 | | 0.0004 | 100.0 | 1300 | 0.2409 | 0.765 | 0.3934 | 1.2941 | 0.765 | 0.6802 | 0.2843 | 
0.0716 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/171-tiny_tobacco3482_kd_NKD_t1.0_g1.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 171-tiny_tobacco3482_kd_NKD_t1.0_g1.5 This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 4.3051 - Accuracy: 0.815 - Brier Loss: 0.3074 - Nll: 1.7785 - F1 Micro: 0.815 - F1 Macro: 0.8049 - Ece: 0.1516 - Aurc: 0.0489 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 13 | 5.0451 | 0.22 | 0.8885 | 8.4566 | 0.22 | 0.1247 | 0.2851 | 0.7928 | | No log | 2.0 | 26 | 4.5055 | 0.385 | 0.7765 | 3.9655 | 0.3850 | 0.3066 | 0.3049 | 0.4178 | | No log | 3.0 | 39 | 4.3096 | 0.52 | 0.6691 | 3.8146 | 0.52 | 0.3950 | 0.3177 | 0.2860 | | No log | 4.0 | 52 | 4.1755 | 0.575 | 0.5907 | 2.9444 | 0.575 | 0.4546 | 0.2729 | 0.2029 | | No log | 5.0 | 65 | 4.0437 | 0.675 | 0.5104 | 2.4241 | 0.675 | 0.5995 | 0.2991 | 0.1354 | | No log | 6.0 | 78 | 4.0642 | 0.69 | 0.4602 | 2.3471 | 0.69 | 0.5925 | 0.2798 | 0.1256 | | No log | 7.0 | 91 | 4.0104 | 0.695 | 0.4319 | 2.2902 | 0.695 | 0.6109 | 0.2430 | 0.1101 | | No log | 8.0 | 104 | 4.1702 | 0.7 | 0.4296 | 2.5778 | 0.7 | 0.6065 | 0.2231 | 0.1201 | | No log | 9.0 | 117 | 4.2785 | 0.695 | 0.4433 | 2.7331 | 0.695 | 0.6269 | 0.2296 | 0.1283 | | No log | 10.0 | 130 | 3.9853 | 0.725 | 0.3705 | 2.0880 | 0.7250 | 0.6477 | 0.1971 | 0.0874 | | No log | 11.0 | 143 | 3.9595 | 0.725 | 0.3506 | 2.1144 | 0.7250 | 0.6431 | 0.1650 | 0.0750 | | No log | 12.0 | 156 | 3.8678 | 0.735 | 0.3504 | 2.0683 | 0.735 | 0.6839 | 0.2047 | 0.0764 | | No log | 13.0 | 169 | 3.9641 | 0.745 | 0.3520 | 2.0788 | 0.745 | 0.6754 | 0.1899 | 0.0837 | | No log | 14.0 | 182 | 4.0188 | 0.725 | 0.3639 | 2.3771 | 0.7250 | 0.6643 | 0.1740 | 0.0893 | | No log | 15.0 | 195 | 3.8558 | 0.765 | 0.3342 | 1.4620 | 0.765 | 0.7097 | 0.1866 | 0.0696 | | No log | 16.0 | 208 | 3.9103 | 0.79 | 0.3416 | 1.7139 | 0.79 | 0.7662 | 0.2043 | 0.0770 | | No log | 17.0 | 221 | 4.0320 | 0.795 | 0.3548 | 1.8525 | 0.795 | 0.7690 | 0.1901 | 0.0924 | | No log | 18.0 | 234 | 3.8974 | 0.79 | 0.3264 | 1.8646 | 0.79 | 0.7582 | 0.1656 | 0.0739 | | No log | 19.0 | 247 | 3.8235 | 0.815 | 0.3074 | 1.4771 | 0.815 | 0.8185 | 0.1825 | 0.0617 | | No log | 20.0 | 260 | 3.8918 | 0.805 | 0.3150 | 1.6824 | 0.805 | 0.7893 | 0.1859 | 0.0631 | | No log | 21.0 | 273 | 3.8919 | 0.785 | 0.3161 | 1.7951 | 0.785 | 0.7725 | 0.1450 | 0.0701 | | No log | 22.0 | 286 | 3.8626 | 0.795 | 0.3121 | 1.6707 | 0.795 | 0.7832 | 0.1570 | 0.0684 | | No log | 23.0 | 299 | 3.8132 | 0.825 | 0.2906 | 1.4511 | 0.825 | 0.8097 | 0.1552 | 0.0564 | | No log | 24.0 | 312 | 3.8680 | 0.81 | 0.3048 | 1.9348 | 0.81 | 0.8027 | 0.1572 | 0.0611 | | No log | 25.0 | 
325 | 3.8305 | 0.81 | 0.2954 | 1.5734 | 0.81 | 0.7999 | 0.1645 | 0.0556 | | No log | 26.0 | 338 | 3.8050 | 0.81 | 0.2965 | 1.7904 | 0.81 | 0.8013 | 0.1495 | 0.0546 | | No log | 27.0 | 351 | 3.9524 | 0.79 | 0.3212 | 2.0459 | 0.79 | 0.7846 | 0.1643 | 0.0669 | | No log | 28.0 | 364 | 3.9299 | 0.81 | 0.3076 | 1.7819 | 0.81 | 0.7967 | 0.1393 | 0.0601 | | No log | 29.0 | 377 | 3.9315 | 0.805 | 0.3158 | 2.0697 | 0.805 | 0.8046 | 0.1618 | 0.0663 | | No log | 30.0 | 390 | 3.8141 | 0.825 | 0.2853 | 1.9079 | 0.825 | 0.8150 | 0.1487 | 0.0528 | | No log | 31.0 | 403 | 3.8682 | 0.815 | 0.2932 | 1.9092 | 0.815 | 0.8030 | 0.1448 | 0.0585 | | No log | 32.0 | 416 | 3.8275 | 0.82 | 0.2823 | 1.6793 | 0.82 | 0.8043 | 0.1459 | 0.0508 | | No log | 33.0 | 429 | 3.8782 | 0.82 | 0.2895 | 1.6565 | 0.82 | 0.8077 | 0.1465 | 0.0542 | | No log | 34.0 | 442 | 3.8433 | 0.825 | 0.2891 | 1.6481 | 0.825 | 0.8157 | 0.1467 | 0.0525 | | No log | 35.0 | 455 | 3.8403 | 0.82 | 0.2891 | 1.5960 | 0.82 | 0.8090 | 0.1398 | 0.0497 | | No log | 36.0 | 468 | 3.8627 | 0.81 | 0.2848 | 1.6935 | 0.81 | 0.8015 | 0.1557 | 0.0471 | | No log | 37.0 | 481 | 3.8992 | 0.81 | 0.2937 | 1.8237 | 0.81 | 0.7991 | 0.1511 | 0.0515 | | No log | 38.0 | 494 | 3.9662 | 0.82 | 0.2978 | 1.8392 | 0.82 | 0.8143 | 0.1503 | 0.0527 | | 3.5354 | 39.0 | 507 | 3.9440 | 0.825 | 0.2899 | 1.7818 | 0.825 | 0.8159 | 0.1454 | 0.0540 | | 3.5354 | 40.0 | 520 | 3.9479 | 0.81 | 0.2959 | 1.7465 | 0.81 | 0.7986 | 0.1504 | 0.0501 | | 3.5354 | 41.0 | 533 | 3.9760 | 0.815 | 0.2964 | 1.7821 | 0.815 | 0.8049 | 0.1519 | 0.0522 | | 3.5354 | 42.0 | 546 | 3.9696 | 0.82 | 0.2906 | 1.7671 | 0.82 | 0.8127 | 0.1468 | 0.0503 | | 3.5354 | 43.0 | 559 | 4.0107 | 0.81 | 0.2994 | 1.8207 | 0.81 | 0.7986 | 0.1474 | 0.0517 | | 3.5354 | 44.0 | 572 | 3.9970 | 0.815 | 0.2913 | 1.7706 | 0.815 | 0.8049 | 0.1465 | 0.0504 | | 3.5354 | 45.0 | 585 | 3.9890 | 0.815 | 0.2886 | 1.6384 | 0.815 | 0.8049 | 0.1516 | 0.0495 | | 3.5354 | 46.0 | 598 | 4.0585 | 0.82 | 0.3006 | 1.7773 | 0.82 | 0.8127 | 0.1522 | 0.0518 | | 3.5354 | 47.0 | 611 | 4.0448 | 0.825 | 0.2925 | 1.8226 | 0.825 | 0.8109 | 0.1540 | 0.0505 | | 3.5354 | 48.0 | 624 | 4.0918 | 0.815 | 0.3016 | 1.8403 | 0.815 | 0.8049 | 0.1492 | 0.0512 | | 3.5354 | 49.0 | 637 | 4.0677 | 0.82 | 0.2971 | 1.8256 | 0.82 | 0.8127 | 0.1396 | 0.0493 | | 3.5354 | 50.0 | 650 | 4.0831 | 0.815 | 0.2986 | 1.8232 | 0.815 | 0.8049 | 0.1479 | 0.0513 | | 3.5354 | 51.0 | 663 | 4.0846 | 0.815 | 0.2994 | 1.8268 | 0.815 | 0.8049 | 0.1525 | 0.0496 | | 3.5354 | 52.0 | 676 | 4.0828 | 0.82 | 0.2978 | 1.7538 | 0.82 | 0.8127 | 0.1425 | 0.0486 | | 3.5354 | 53.0 | 689 | 4.0890 | 0.815 | 0.3004 | 1.7552 | 0.815 | 0.8049 | 0.1491 | 0.0485 | | 3.5354 | 54.0 | 702 | 4.1299 | 0.815 | 0.3029 | 1.8902 | 0.815 | 0.8049 | 0.1614 | 0.0506 | | 3.5354 | 55.0 | 715 | 4.1200 | 0.815 | 0.3016 | 1.8279 | 0.815 | 0.8049 | 0.1510 | 0.0499 | | 3.5354 | 56.0 | 728 | 4.1196 | 0.815 | 0.3008 | 1.8883 | 0.815 | 0.8049 | 0.1503 | 0.0503 | | 3.5354 | 57.0 | 741 | 4.1200 | 0.815 | 0.3003 | 1.7620 | 0.815 | 0.8049 | 0.1499 | 0.0490 | | 3.5354 | 58.0 | 754 | 4.1419 | 0.815 | 0.3017 | 1.8463 | 0.815 | 0.8049 | 0.1459 | 0.0499 | | 3.5354 | 59.0 | 767 | 4.1527 | 0.815 | 0.3041 | 1.8269 | 0.815 | 0.8049 | 0.1618 | 0.0496 | | 3.5354 | 60.0 | 780 | 4.1362 | 0.815 | 0.3002 | 1.7666 | 0.815 | 0.8049 | 0.1461 | 0.0489 | | 3.5354 | 61.0 | 793 | 4.1470 | 0.815 | 0.3009 | 1.8213 | 0.815 | 0.8049 | 0.1471 | 0.0491 | | 3.5354 | 62.0 | 806 | 4.1503 | 0.815 | 0.2991 | 1.8235 | 0.815 | 0.8049 | 0.1604 | 0.0496 | | 3.5354 | 63.0 | 819 | 4.1544 | 
0.815 | 0.3003 | 1.7546 | 0.815 | 0.8049 | 0.1518 | 0.0487 | | 3.5354 | 64.0 | 832 | 4.1713 | 0.815 | 0.3023 | 1.8223 | 0.815 | 0.8049 | 0.1543 | 0.0499 | | 3.5354 | 65.0 | 845 | 4.1716 | 0.815 | 0.3010 | 1.8213 | 0.815 | 0.8049 | 0.1485 | 0.0494 | | 3.5354 | 66.0 | 858 | 4.1956 | 0.815 | 0.3042 | 1.8287 | 0.815 | 0.8049 | 0.1637 | 0.0496 | | 3.5354 | 67.0 | 871 | 4.1845 | 0.815 | 0.3018 | 1.8259 | 0.815 | 0.8049 | 0.1519 | 0.0488 | | 3.5354 | 68.0 | 884 | 4.2055 | 0.815 | 0.3037 | 1.8339 | 0.815 | 0.8049 | 0.1504 | 0.0496 | | 3.5354 | 69.0 | 897 | 4.2079 | 0.815 | 0.3039 | 1.8281 | 0.815 | 0.8049 | 0.1554 | 0.0491 | | 3.5354 | 70.0 | 910 | 4.2125 | 0.815 | 0.3034 | 1.7637 | 0.815 | 0.8049 | 0.1500 | 0.0490 | | 3.5354 | 71.0 | 923 | 4.2179 | 0.815 | 0.3035 | 1.8254 | 0.815 | 0.8049 | 0.1531 | 0.0492 | | 3.5354 | 72.0 | 936 | 4.2270 | 0.815 | 0.3040 | 1.8270 | 0.815 | 0.8049 | 0.1528 | 0.0493 | | 3.5354 | 73.0 | 949 | 4.2294 | 0.815 | 0.3041 | 1.8260 | 0.815 | 0.8049 | 0.1531 | 0.0488 | | 3.5354 | 74.0 | 962 | 4.2383 | 0.815 | 0.3043 | 1.8261 | 0.815 | 0.8049 | 0.1513 | 0.0492 | | 3.5354 | 75.0 | 975 | 4.2441 | 0.815 | 0.3051 | 1.7691 | 0.815 | 0.8049 | 0.1539 | 0.0488 | | 3.5354 | 76.0 | 988 | 4.2500 | 0.815 | 0.3051 | 1.8287 | 0.815 | 0.8049 | 0.1540 | 0.0490 | | 3.192 | 77.0 | 1001 | 4.2538 | 0.815 | 0.3053 | 1.8273 | 0.815 | 0.8049 | 0.1542 | 0.0490 | | 3.192 | 78.0 | 1014 | 4.2573 | 0.815 | 0.3055 | 1.8281 | 0.815 | 0.8049 | 0.1541 | 0.0491 | | 3.192 | 79.0 | 1027 | 4.2603 | 0.815 | 0.3054 | 1.8275 | 0.815 | 0.8049 | 0.1544 | 0.0490 | | 3.192 | 80.0 | 1040 | 4.2673 | 0.815 | 0.3060 | 1.8277 | 0.815 | 0.8049 | 0.1544 | 0.0489 | | 3.192 | 81.0 | 1053 | 4.2697 | 0.815 | 0.3060 | 1.8272 | 0.815 | 0.8049 | 0.1500 | 0.0489 | | 3.192 | 82.0 | 1066 | 4.2747 | 0.815 | 0.3064 | 1.7765 | 0.815 | 0.8049 | 0.1544 | 0.0489 | | 3.192 | 83.0 | 1079 | 4.2769 | 0.815 | 0.3063 | 1.8273 | 0.815 | 0.8049 | 0.1503 | 0.0489 | | 3.192 | 84.0 | 1092 | 4.2824 | 0.815 | 0.3066 | 1.8278 | 0.815 | 0.8049 | 0.1548 | 0.0491 | | 3.192 | 85.0 | 1105 | 4.2842 | 0.815 | 0.3066 | 1.8276 | 0.815 | 0.8049 | 0.1506 | 0.0489 | | 3.192 | 86.0 | 1118 | 4.2883 | 0.815 | 0.3070 | 1.8281 | 0.815 | 0.8049 | 0.1508 | 0.0488 | | 3.192 | 87.0 | 1131 | 4.2907 | 0.815 | 0.3071 | 1.7730 | 0.815 | 0.8049 | 0.1548 | 0.0489 | | 3.192 | 88.0 | 1144 | 4.2919 | 0.815 | 0.3070 | 1.7739 | 0.815 | 0.8049 | 0.1513 | 0.0489 | | 3.192 | 89.0 | 1157 | 4.2943 | 0.815 | 0.3071 | 1.8281 | 0.815 | 0.8049 | 0.1514 | 0.0489 | | 3.192 | 90.0 | 1170 | 4.2954 | 0.815 | 0.3070 | 1.8280 | 0.815 | 0.8049 | 0.1508 | 0.0489 | | 3.192 | 91.0 | 1183 | 4.2976 | 0.815 | 0.3071 | 1.8282 | 0.815 | 0.8049 | 0.1514 | 0.0489 | | 3.192 | 92.0 | 1196 | 4.2985 | 0.815 | 0.3070 | 1.7799 | 0.815 | 0.8049 | 0.1509 | 0.0489 | | 3.192 | 93.0 | 1209 | 4.3000 | 0.815 | 0.3072 | 1.7832 | 0.815 | 0.8049 | 0.1514 | 0.0489 | | 3.192 | 94.0 | 1222 | 4.3016 | 0.815 | 0.3073 | 1.7775 | 0.815 | 0.8049 | 0.1516 | 0.0489 | | 3.192 | 95.0 | 1235 | 4.3025 | 0.815 | 0.3072 | 1.8282 | 0.815 | 0.8049 | 0.1510 | 0.0489 | | 3.192 | 96.0 | 1248 | 4.3030 | 0.815 | 0.3073 | 1.7778 | 0.815 | 0.8049 | 0.1510 | 0.0489 | | 3.192 | 97.0 | 1261 | 4.3042 | 0.815 | 0.3073 | 1.7770 | 0.815 | 0.8049 | 0.1516 | 0.0489 | | 3.192 | 98.0 | 1274 | 4.3047 | 0.815 | 0.3074 | 1.7826 | 0.815 | 0.8049 | 0.1516 | 0.0489 | | 3.192 | 99.0 | 1287 | 4.3051 | 0.815 | 0.3074 | 1.7777 | 0.815 | 0.8049 | 0.1516 | 0.0489 | | 3.192 | 100.0 | 1300 | 4.3051 | 0.815 | 0.3074 | 1.7785 | 0.815 | 0.8049 | 0.1516 | 0.0489 | ### Framework 
versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/171-tiny_tobacco3482_og_simkd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 171-tiny_tobacco3482_og_simkd This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1545.9613 - Accuracy: 0.835 - Brier Loss: 0.2581 - Nll: 1.8079 - F1 Micro: 0.835 - F1 Macro: 0.8078 - Ece: 0.1626 - Aurc: 0.0552 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 50 | 1583.9828 | 0.195 | 0.8629 | 4.7365 | 0.195 | 0.0729 | 0.2366 | 0.5739 | | No log | 2.0 | 100 | 1579.1852 | 0.47 | 0.7171 | 2.4638 | 0.47 | 0.3442 | 0.3084 | 0.3108 | | No log | 3.0 | 150 | 1572.6602 | 0.625 | 0.5876 | 2.4898 | 0.625 | 0.5699 | 0.3192 | 0.1832 | | No log | 4.0 | 200 | 1568.2220 | 0.655 | 0.5098 | 2.4424 | 0.655 | 0.5454 | 0.2498 | 0.1503 | | No log | 5.0 | 250 | 1564.8569 | 0.72 | 0.4455 | 2.0840 | 0.72 | 0.5990 | 0.2387 | 0.1189 | | No log | 6.0 | 300 | 1567.0118 | 0.51 | 0.6163 | 3.9029 | 0.51 | 0.4434 | 0.2623 | 0.2494 | | No log | 7.0 | 350 | 1563.1887 | 0.745 | 0.4081 | 2.3136 | 0.745 | 0.6818 | 0.2108 | 0.1029 | | No log | 8.0 | 400 | 1561.1106 | 0.74 | 0.4192 | 2.0321 | 0.74 | 0.6974 | 0.2351 | 0.1040 | | No log | 9.0 | 450 | 1559.3597 | 0.665 | 0.4981 | 2.6615 | 0.665 | 0.6722 | 0.2456 | 0.1715 | | 1524.8844 | 10.0 | 500 | 1564.1249 | 0.73 | 0.4146 | 2.2876 | 0.7300 | 0.6839 | 0.2202 | 0.0966 | | 1524.8844 | 11.0 | 550 | 1571.7053 | 0.69 | 0.4778 | 3.9365 | 0.69 | 0.6192 | 0.2256 | 0.1386 | | 1524.8844 | 12.0 | 600 | 1566.6997 | 0.74 | 0.3931 | 2.4325 | 0.74 | 0.6717 | 0.2158 | 0.0952 | | 1524.8844 | 13.0 | 650 | 1563.5497 | 0.71 | 0.4268 | 3.1600 | 0.7100 | 0.6571 | 0.2277 | 0.1104 | | 1524.8844 | 14.0 | 700 | 1554.3715 | 0.795 | 0.3309 | 2.8276 | 0.795 | 0.7856 | 0.1968 | 0.0575 | | 1524.8844 | 15.0 | 750 | 1557.7578 | 0.83 | 0.2923 | 2.0899 | 0.83 | 0.8167 | 0.2174 | 0.0508 | | 1524.8844 | 16.0 | 800 | 1551.9381 | 0.78 | 0.3410 | 2.2729 | 0.78 | 0.7566 | 0.1860 | 0.0857 | | 1524.8844 | 17.0 | 850 | 1561.5450 | 0.765 | 0.3687 | 2.6934 | 0.765 | 0.7402 | 0.2318 | 0.0777 | | 1524.8844 | 18.0 | 900 | 1561.7700 | 0.755 | 0.3746 | 2.6989 | 0.755 | 0.7246 | 0.2163 | 0.0802 | | 1524.8844 | 19.0 | 950 | 1556.6021 | 0.78 | 0.3457 | 2.1642 | 0.78 | 0.7820 | 0.1890 | 0.0700 | | 1511.8512 | 20.0 | 1000 | 1560.8047 | 0.745 | 0.4095 | 3.1277 | 0.745 | 0.7095 | 0.2325 | 0.1169 | | 1511.8512 | 21.0 | 1050 | 1562.6659 | 0.82 | 0.3097 | 2.1455 | 0.82 | 0.7896 | 0.2215 | 0.0631 | | 1511.8512 | 22.0 | 1100 | 1552.2571 | 0.795 | 0.3317 | 2.2236 | 0.795 | 0.7833 | 0.2097 | 0.0866 | | 1511.8512 | 23.0 | 1150 | 1556.3745 | 0.815 | 0.2791 | 1.7801 | 0.815 | 0.7915 | 0.1747 | 0.0456 | | 
1511.8512 | 24.0 | 1200 | 1553.9163 | 0.835 | 0.2826 | 1.8593 | 0.835 | 0.8139 | 0.2010 | 0.0576 | | 1511.8512 | 25.0 | 1250 | 1553.7266 | 0.835 | 0.2823 | 1.9374 | 0.835 | 0.8099 | 0.1944 | 0.0640 | | 1511.8512 | 26.0 | 1300 | 1551.4546 | 0.83 | 0.2789 | 1.8261 | 0.83 | 0.8093 | 0.1795 | 0.0630 | | 1511.8512 | 27.0 | 1350 | 1550.9459 | 0.84 | 0.2788 | 1.9731 | 0.8400 | 0.8213 | 0.1788 | 0.0638 | | 1511.8512 | 28.0 | 1400 | 1554.3896 | 0.825 | 0.2678 | 1.7180 | 0.825 | 0.8069 | 0.1793 | 0.0560 | | 1511.8512 | 29.0 | 1450 | 1550.7052 | 0.81 | 0.3100 | 1.9994 | 0.81 | 0.7942 | 0.1852 | 0.0689 | | 1507.4645 | 30.0 | 1500 | 1552.1665 | 0.83 | 0.2807 | 1.7044 | 0.83 | 0.7991 | 0.1742 | 0.0644 | | 1507.4645 | 31.0 | 1550 | 1544.2642 | 0.845 | 0.2619 | 1.7200 | 0.845 | 0.8219 | 0.1690 | 0.0731 | | 1507.4645 | 32.0 | 1600 | 1548.5247 | 0.83 | 0.2698 | 1.8894 | 0.83 | 0.8061 | 0.1609 | 0.0597 | | 1507.4645 | 33.0 | 1650 | 1543.8158 | 0.845 | 0.2641 | 1.6868 | 0.845 | 0.8348 | 0.1911 | 0.0601 | | 1507.4645 | 34.0 | 1700 | 1550.6659 | 0.855 | 0.2562 | 1.7688 | 0.855 | 0.8349 | 0.1867 | 0.0643 | | 1507.4645 | 35.0 | 1750 | 1552.8594 | 0.85 | 0.2560 | 1.5886 | 0.85 | 0.8338 | 0.1746 | 0.0541 | | 1507.4645 | 36.0 | 1800 | 1550.4800 | 0.835 | 0.2620 | 1.7978 | 0.835 | 0.8217 | 0.1659 | 0.0531 | | 1507.4645 | 37.0 | 1850 | 1548.1787 | 0.835 | 0.2700 | 1.7099 | 0.835 | 0.8122 | 0.1750 | 0.0710 | | 1507.4645 | 38.0 | 1900 | 1550.8218 | 0.84 | 0.2601 | 1.7914 | 0.8400 | 0.8170 | 0.1637 | 0.0560 | | 1507.4645 | 39.0 | 1950 | 1546.4485 | 0.845 | 0.2624 | 1.8862 | 0.845 | 0.8288 | 0.1672 | 0.0669 | | 1504.6109 | 40.0 | 2000 | 1548.0247 | 0.84 | 0.2641 | 1.8206 | 0.8400 | 0.8265 | 0.1677 | 0.0574 | | 1504.6109 | 41.0 | 2050 | 1548.5134 | 0.85 | 0.2636 | 1.8483 | 0.85 | 0.8370 | 0.1794 | 0.0536 | | 1504.6109 | 42.0 | 2100 | 1549.4111 | 0.84 | 0.2714 | 1.8670 | 0.8400 | 0.8216 | 0.1741 | 0.0656 | | 1504.6109 | 43.0 | 2150 | 1545.5203 | 0.855 | 0.2578 | 1.8265 | 0.855 | 0.8343 | 0.1555 | 0.0688 | | 1504.6109 | 44.0 | 2200 | 1550.9375 | 0.835 | 0.2802 | 2.0555 | 0.835 | 0.8104 | 0.1645 | 0.0599 | | 1504.6109 | 45.0 | 2250 | 1548.9200 | 0.85 | 0.2560 | 1.6944 | 0.85 | 0.8233 | 0.1539 | 0.0636 | | 1504.6109 | 46.0 | 2300 | 1551.2212 | 0.835 | 0.2620 | 1.7128 | 0.835 | 0.8161 | 0.1679 | 0.0444 | | 1504.6109 | 47.0 | 2350 | 1551.1859 | 0.84 | 0.2644 | 1.7914 | 0.8400 | 0.8255 | 0.1696 | 0.0522 | | 1504.6109 | 48.0 | 2400 | 1549.4532 | 0.84 | 0.2629 | 1.8107 | 0.8400 | 0.8154 | 0.1651 | 0.0513 | | 1504.6109 | 49.0 | 2450 | 1552.7788 | 0.835 | 0.2626 | 1.7638 | 0.835 | 0.8102 | 0.1767 | 0.0521 | | 1502.7574 | 50.0 | 2500 | 1536.4287 | 0.855 | 0.2483 | 1.7998 | 0.855 | 0.8292 | 0.1589 | 0.0694 | | 1502.7574 | 51.0 | 2550 | 1549.2762 | 0.855 | 0.2449 | 1.6393 | 0.855 | 0.8306 | 0.1829 | 0.0626 | | 1502.7574 | 52.0 | 2600 | 1549.4481 | 0.845 | 0.2684 | 1.8626 | 0.845 | 0.8194 | 0.1674 | 0.0590 | | 1502.7574 | 53.0 | 2650 | 1546.8684 | 0.855 | 0.2592 | 1.8554 | 0.855 | 0.8293 | 0.1681 | 0.0634 | | 1502.7574 | 54.0 | 2700 | 1546.9109 | 0.83 | 0.2613 | 1.7866 | 0.83 | 0.8031 | 0.1627 | 0.0610 | | 1502.7574 | 55.0 | 2750 | 1545.6719 | 0.83 | 0.2710 | 1.6850 | 0.83 | 0.8005 | 0.1635 | 0.0617 | | 1502.7574 | 56.0 | 2800 | 1549.3009 | 0.83 | 0.2740 | 1.7925 | 0.83 | 0.8082 | 0.1739 | 0.0621 | | 1502.7574 | 57.0 | 2850 | 1544.9680 | 0.825 | 0.2710 | 1.7718 | 0.825 | 0.7994 | 0.1725 | 0.0642 | | 1502.7574 | 58.0 | 2900 | 1549.2238 | 0.835 | 0.2536 | 1.6798 | 0.835 | 0.8074 | 0.1711 | 0.0538 | | 1502.7574 | 59.0 | 2950 | 1546.0543 | 
0.85 | 0.2579 | 1.7481 | 0.85 | 0.8254 | 0.1719 | 0.0682 | | 1501.0283 | 60.0 | 3000 | 1545.8566 | 0.845 | 0.2503 | 1.8351 | 0.845 | 0.8218 | 0.1598 | 0.0583 | | 1501.0283 | 61.0 | 3050 | 1548.5146 | 0.845 | 0.2529 | 1.7119 | 0.845 | 0.8188 | 0.1609 | 0.0577 | | 1501.0283 | 62.0 | 3100 | 1545.9747 | 0.835 | 0.2619 | 1.7379 | 0.835 | 0.8045 | 0.1545 | 0.0566 | | 1501.0283 | 63.0 | 3150 | 1543.2375 | 0.84 | 0.2631 | 1.9799 | 0.8400 | 0.8181 | 0.1544 | 0.0701 | | 1501.0283 | 64.0 | 3200 | 1542.9421 | 0.84 | 0.2659 | 1.8430 | 0.8400 | 0.8122 | 0.1674 | 0.0635 | | 1501.0283 | 65.0 | 3250 | 1548.4230 | 0.855 | 0.2566 | 1.8601 | 0.855 | 0.8308 | 0.1732 | 0.0548 | | 1501.0283 | 66.0 | 3300 | 1545.2163 | 0.835 | 0.2603 | 1.7657 | 0.835 | 0.8123 | 0.1661 | 0.0685 | | 1501.0283 | 67.0 | 3350 | 1545.8247 | 0.835 | 0.2619 | 1.7753 | 0.835 | 0.8122 | 0.1683 | 0.0625 | | 1501.0283 | 68.0 | 3400 | 1548.2606 | 0.845 | 0.2498 | 1.7430 | 0.845 | 0.8151 | 0.1692 | 0.0539 | | 1501.0283 | 69.0 | 3450 | 1546.6490 | 0.85 | 0.2601 | 1.7287 | 0.85 | 0.8212 | 0.1707 | 0.0559 | | 1499.823 | 70.0 | 3500 | 1547.7959 | 0.83 | 0.2623 | 1.7656 | 0.83 | 0.8039 | 0.1696 | 0.0606 | | 1499.823 | 71.0 | 3550 | 1545.1508 | 0.84 | 0.2579 | 1.7362 | 0.8400 | 0.8141 | 0.1546 | 0.0587 | | 1499.823 | 72.0 | 3600 | 1547.2922 | 0.83 | 0.2599 | 1.7354 | 0.83 | 0.8070 | 0.1609 | 0.0565 | | 1499.823 | 73.0 | 3650 | 1547.1040 | 0.85 | 0.2556 | 1.7806 | 0.85 | 0.8271 | 0.1635 | 0.0598 | | 1499.823 | 74.0 | 3700 | 1546.0535 | 0.83 | 0.2644 | 1.7560 | 0.83 | 0.8022 | 0.1557 | 0.0585 | | 1499.823 | 75.0 | 3750 | 1548.1322 | 0.85 | 0.2569 | 1.7507 | 0.85 | 0.8297 | 0.1802 | 0.0559 | | 1499.823 | 76.0 | 3800 | 1546.1481 | 0.84 | 0.2539 | 1.7530 | 0.8400 | 0.8147 | 0.1551 | 0.0572 | | 1499.823 | 77.0 | 3850 | 1546.6122 | 0.855 | 0.2459 | 1.7644 | 0.855 | 0.8269 | 0.1687 | 0.0548 | | 1499.823 | 78.0 | 3900 | 1550.3534 | 0.835 | 0.2651 | 1.7555 | 0.835 | 0.8085 | 0.1719 | 0.0579 | | 1499.823 | 79.0 | 3950 | 1549.6628 | 0.85 | 0.2536 | 1.7469 | 0.85 | 0.8272 | 0.1678 | 0.0535 | | 1499.0814 | 80.0 | 4000 | 1547.6503 | 0.85 | 0.2484 | 1.8084 | 0.85 | 0.8243 | 0.1628 | 0.0557 | | 1499.0814 | 81.0 | 4050 | 1546.0641 | 0.855 | 0.2545 | 1.7587 | 0.855 | 0.8295 | 0.1725 | 0.0592 | | 1499.0814 | 82.0 | 4100 | 1550.3037 | 0.83 | 0.2541 | 1.7713 | 0.83 | 0.8016 | 0.1598 | 0.0483 | | 1499.0814 | 83.0 | 4150 | 1547.8164 | 0.85 | 0.2516 | 1.7390 | 0.85 | 0.8190 | 0.1701 | 0.0568 | | 1499.0814 | 84.0 | 4200 | 1541.3584 | 0.84 | 0.2536 | 1.7410 | 0.8400 | 0.8165 | 0.1482 | 0.0625 | | 1499.0814 | 85.0 | 4250 | 1546.9172 | 0.835 | 0.2577 | 1.7380 | 0.835 | 0.8089 | 0.1497 | 0.0520 | | 1499.0814 | 86.0 | 4300 | 1548.3782 | 0.84 | 0.2591 | 1.8017 | 0.8400 | 0.8169 | 0.1655 | 0.0515 | | 1499.0814 | 87.0 | 4350 | 1547.3690 | 0.83 | 0.2629 | 1.7771 | 0.83 | 0.8030 | 0.1591 | 0.0546 | | 1499.0814 | 88.0 | 4400 | 1546.2750 | 0.84 | 0.2637 | 1.7430 | 0.8400 | 0.8197 | 0.1538 | 0.0529 | | 1499.0814 | 89.0 | 4450 | 1542.7203 | 0.84 | 0.2524 | 1.8141 | 0.8400 | 0.8181 | 0.1616 | 0.0583 | | 1498.2634 | 90.0 | 4500 | 1546.0441 | 0.84 | 0.2537 | 1.7778 | 0.8400 | 0.8111 | 0.1550 | 0.0534 | | 1498.2634 | 91.0 | 4550 | 1549.2853 | 0.835 | 0.2646 | 1.7224 | 0.835 | 0.8093 | 0.1503 | 0.0514 | | 1498.2634 | 92.0 | 4600 | 1547.8093 | 0.825 | 0.2631 | 1.7433 | 0.825 | 0.8052 | 0.1469 | 0.0534 | | 1498.2634 | 93.0 | 4650 | 1547.4916 | 0.84 | 0.2608 | 1.7245 | 0.8400 | 0.8163 | 0.1615 | 0.0512 | | 1498.2634 | 94.0 | 4700 | 1546.7931 | 0.835 | 0.2636 | 1.7105 | 0.835 | 0.8110 | 0.1640 | 
0.0564 | | 1498.2634 | 95.0 | 4750 | 1542.2678 | 0.83 | 0.2633 | 1.7869 | 0.83 | 0.8008 | 0.1528 | 0.0567 | | 1498.2634 | 96.0 | 4800 | 1546.7603 | 0.845 | 0.2611 | 1.7291 | 0.845 | 0.8253 | 0.1671 | 0.0547 | | 1498.2634 | 97.0 | 4850 | 1547.1423 | 0.835 | 0.2639 | 1.7659 | 0.835 | 0.8144 | 0.1608 | 0.0557 | | 1498.2634 | 98.0 | 4900 | 1549.5010 | 0.835 | 0.2588 | 1.7856 | 0.835 | 0.8078 | 0.1637 | 0.0499 | | 1498.2634 | 99.0 | 4950 | 1544.0619 | 0.84 | 0.2604 | 1.7604 | 0.8400 | 0.8162 | 0.1582 | 0.0560 | | 1498.0844 | 100.0 | 5000 | 1545.9613 | 0.835 | 0.2581 | 1.8079 | 0.835 | 0.8078 | 0.1626 | 0.0552 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/114-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 114-tiny_tobacco3482_kd_CEKD_t2.5_a0.5 This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.5018 - Accuracy: 0.81 - Brier Loss: 0.3510 - Nll: 1.3312 - F1 Micro: 0.81 - F1 Macro: 0.7895 - Ece: 0.2880 - Aurc: 0.0537 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 13 | 1.4179 | 0.225 | 0.8892 | 8.1717 | 0.225 | 0.1452 | 0.3025 | 0.7575 | | No log | 2.0 | 26 | 1.0222 | 0.44 | 0.7148 | 3.7941 | 0.44 | 0.3487 | 0.2946 | 0.3460 | | No log | 3.0 | 39 | 0.8765 | 0.57 | 0.5993 | 2.4902 | 0.57 | 0.4787 | 0.2800 | 0.2411 | | No log | 4.0 | 52 | 0.7663 | 0.645 | 0.5127 | 1.8672 | 0.645 | 0.5731 | 0.2922 | 0.1395 | | No log | 5.0 | 65 | 0.7017 | 0.695 | 0.4672 | 1.3663 | 0.695 | 0.6567 | 0.3016 | 0.1153 | | No log | 6.0 | 78 | 0.6744 | 0.725 | 0.4267 | 1.5639 | 0.7250 | 0.6508 | 0.2743 | 0.1066 | | No log | 7.0 | 91 | 0.6457 | 0.735 | 0.4093 | 1.5745 | 0.735 | 0.7161 | 0.2659 | 0.0897 | | No log | 8.0 | 104 | 0.6701 | 0.715 | 0.4180 | 1.7405 | 0.715 | 0.6768 | 0.2274 | 0.1123 | | No log | 9.0 | 117 | 0.6394 | 0.735 | 0.4147 | 1.5335 | 0.735 | 0.6938 | 0.2380 | 0.1089 | | No log | 10.0 | 130 | 0.6231 | 0.795 | 0.4000 | 1.4767 | 0.795 | 0.7795 | 0.3033 | 0.0702 | | No log | 11.0 | 143 | 0.6335 | 0.74 | 0.3955 | 2.0438 | 0.74 | 0.7274 | 0.2632 | 0.0900 | | No log | 12.0 | 156 | 0.5905 | 0.745 | 0.3898 | 1.6288 | 0.745 | 0.7078 | 0.2524 | 0.0808 | | No log | 13.0 | 169 | 0.6359 | 0.72 | 0.4279 | 2.0889 | 0.72 | 0.7190 | 0.2676 | 0.1137 | | No log | 14.0 | 182 | 0.5462 | 0.78 | 0.3627 | 1.5296 | 0.78 | 0.7550 | 0.2957 | 0.0678 | | No log | 15.0 | 195 | 0.5639 | 0.76 | 0.3913 | 1.6057 | 0.76 | 0.7296 | 0.2769 | 0.0763 | | No log | 16.0 | 208 | 0.5731 | 0.77 | 0.3957 | 1.7224 | 0.7700 | 0.7444 | 0.3190 | 0.0661 | | No log | 17.0 | 221 | 0.5561 | 0.745 | 0.3842 | 1.3096 | 0.745 | 0.7130 | 0.2562 | 0.0784 | | No log | 18.0 | 234 | 0.5559 | 0.755 | 0.3880 | 1.5937 | 0.755 | 0.7263 | 0.2810 | 0.0798 | | No log | 19.0 | 247 | 0.5454 | 0.8 | 0.3824 | 1.4121 | 0.8000 | 0.7875 | 0.3284 | 0.0595 | | No log | 20.0 | 260 | 0.5327 | 0.78 | 0.3638 | 1.4363 | 0.78 | 0.7462 | 0.2774 | 0.0633 | | No log | 21.0 | 273 | 0.5291 | 0.775 | 0.3596 | 1.5236 | 0.775 | 0.7470 | 0.2902 | 0.0686 | | No log | 22.0 | 286 | 0.5175 | 0.79 | 0.3597 | 1.2045 | 0.79 | 0.7583 | 0.3013 | 0.0547 | | No log | 23.0 | 299 | 0.5160 | 0.83 | 0.3560 | 1.4152 | 0.83 | 0.8028 | 0.3339 | 0.0545 | | No log | 24.0 | 312 | 0.5473 | 0.79 | 0.3736 | 1.5232 | 0.79 | 0.7719 | 0.2815 | 0.0842 | | No log | 25.0 | 325 | 
0.5270 | 0.805 | 0.3685 | 1.5013 | 0.805 | 0.7801 | 0.2881 | 0.0606 | | No log | 26.0 | 338 | 0.5032 | 0.815 | 0.3444 | 1.2979 | 0.815 | 0.7913 | 0.2776 | 0.0544 | | No log | 27.0 | 351 | 0.5245 | 0.79 | 0.3607 | 1.4567 | 0.79 | 0.7767 | 0.2932 | 0.0658 | | No log | 28.0 | 364 | 0.5192 | 0.775 | 0.3636 | 1.3492 | 0.775 | 0.7445 | 0.2730 | 0.0627 | | No log | 29.0 | 377 | 0.5219 | 0.79 | 0.3595 | 1.3888 | 0.79 | 0.7590 | 0.2956 | 0.0627 | | No log | 30.0 | 390 | 0.5138 | 0.82 | 0.3548 | 1.4869 | 0.82 | 0.7911 | 0.3138 | 0.0579 | | No log | 31.0 | 403 | 0.5110 | 0.8 | 0.3523 | 1.3410 | 0.8000 | 0.7800 | 0.3211 | 0.0535 | | No log | 32.0 | 416 | 0.5059 | 0.825 | 0.3517 | 1.3191 | 0.825 | 0.8090 | 0.3256 | 0.0499 | | No log | 33.0 | 429 | 0.5060 | 0.805 | 0.3487 | 1.4018 | 0.805 | 0.7850 | 0.2882 | 0.0568 | | No log | 34.0 | 442 | 0.5063 | 0.79 | 0.3558 | 1.4104 | 0.79 | 0.7607 | 0.2863 | 0.0593 | | No log | 35.0 | 455 | 0.4944 | 0.81 | 0.3426 | 1.3328 | 0.81 | 0.7918 | 0.2916 | 0.0465 | | No log | 36.0 | 468 | 0.4999 | 0.815 | 0.3476 | 1.3893 | 0.815 | 0.7973 | 0.2954 | 0.0534 | | No log | 37.0 | 481 | 0.5014 | 0.8 | 0.3487 | 1.3948 | 0.8000 | 0.7800 | 0.3012 | 0.0574 | | No log | 38.0 | 494 | 0.5013 | 0.81 | 0.3500 | 1.3885 | 0.81 | 0.7896 | 0.2923 | 0.0537 | | 0.3577 | 39.0 | 507 | 0.5016 | 0.81 | 0.3495 | 1.3905 | 0.81 | 0.7927 | 0.3037 | 0.0565 | | 0.3577 | 40.0 | 520 | 0.4999 | 0.805 | 0.3491 | 1.3888 | 0.805 | 0.7850 | 0.2920 | 0.0547 | | 0.3577 | 41.0 | 533 | 0.5014 | 0.805 | 0.3509 | 1.3877 | 0.805 | 0.7878 | 0.3036 | 0.0546 | | 0.3577 | 42.0 | 546 | 0.4997 | 0.805 | 0.3487 | 1.3352 | 0.805 | 0.7850 | 0.2921 | 0.0537 | | 0.3577 | 43.0 | 559 | 0.5012 | 0.81 | 0.3508 | 1.3924 | 0.81 | 0.7882 | 0.3171 | 0.0542 | | 0.3577 | 44.0 | 572 | 0.5014 | 0.805 | 0.3504 | 1.3910 | 0.805 | 0.7878 | 0.2963 | 0.0545 | | 0.3577 | 45.0 | 585 | 0.5008 | 0.805 | 0.3500 | 1.3281 | 0.805 | 0.7878 | 0.2871 | 0.0539 | | 0.3577 | 46.0 | 598 | 0.5014 | 0.805 | 0.3512 | 1.3288 | 0.805 | 0.7850 | 0.3028 | 0.0542 | | 0.3577 | 47.0 | 611 | 0.5005 | 0.805 | 0.3502 | 1.3282 | 0.805 | 0.7864 | 0.3115 | 0.0531 | | 0.3577 | 48.0 | 624 | 0.5010 | 0.805 | 0.3510 | 1.3280 | 0.805 | 0.7878 | 0.3022 | 0.0542 | | 0.3577 | 49.0 | 637 | 0.5014 | 0.805 | 0.3502 | 1.3334 | 0.805 | 0.7864 | 0.3023 | 0.0543 | | 0.3577 | 50.0 | 650 | 0.5016 | 0.805 | 0.3509 | 1.3344 | 0.805 | 0.7878 | 0.3095 | 0.0547 | | 0.3577 | 51.0 | 663 | 0.5015 | 0.805 | 0.3507 | 1.3325 | 0.805 | 0.7878 | 0.3089 | 0.0543 | | 0.3577 | 52.0 | 676 | 0.5009 | 0.81 | 0.3505 | 1.3292 | 0.81 | 0.7908 | 0.3082 | 0.0543 | | 0.3577 | 53.0 | 689 | 0.5012 | 0.81 | 0.3506 | 1.3313 | 0.81 | 0.7923 | 0.3008 | 0.0539 | | 0.3577 | 54.0 | 702 | 0.5012 | 0.805 | 0.3506 | 1.3284 | 0.805 | 0.7892 | 0.2850 | 0.0539 | | 0.3577 | 55.0 | 715 | 0.5012 | 0.81 | 0.3506 | 1.3288 | 0.81 | 0.7894 | 0.2940 | 0.0537 | | 0.3577 | 56.0 | 728 | 0.5017 | 0.805 | 0.3508 | 1.3284 | 0.805 | 0.7892 | 0.2967 | 0.0544 | | 0.3577 | 57.0 | 741 | 0.5017 | 0.81 | 0.3510 | 1.3316 | 0.81 | 0.7895 | 0.2838 | 0.0542 | | 0.3577 | 58.0 | 754 | 0.5013 | 0.815 | 0.3508 | 1.3299 | 0.815 | 0.7940 | 0.2792 | 0.0539 | | 0.3577 | 59.0 | 767 | 0.5015 | 0.81 | 0.3508 | 1.3326 | 0.81 | 0.7910 | 0.2894 | 0.0538 | | 0.3577 | 60.0 | 780 | 0.5014 | 0.81 | 0.3505 | 1.3305 | 0.81 | 0.7910 | 0.2891 | 0.0540 | | 0.3577 | 61.0 | 793 | 0.5016 | 0.81 | 0.3508 | 1.3305 | 0.81 | 0.7910 | 0.3138 | 0.0538 | | 0.3577 | 62.0 | 806 | 0.5015 | 0.815 | 0.3507 | 1.3329 | 0.815 | 0.7952 | 0.2835 | 0.0538 | | 0.3577 | 63.0 | 819 | 0.5017 | 0.81 
| 0.3509 | 1.3288 | 0.81 | 0.7895 | 0.2841 | 0.0542 | | 0.3577 | 64.0 | 832 | 0.5018 | 0.81 | 0.3511 | 1.3311 | 0.81 | 0.7895 | 0.2908 | 0.0540 | | 0.3577 | 65.0 | 845 | 0.5015 | 0.815 | 0.3509 | 1.3310 | 0.815 | 0.7940 | 0.3124 | 0.0539 | | 0.3577 | 66.0 | 858 | 0.5014 | 0.81 | 0.3507 | 1.3273 | 0.81 | 0.7923 | 0.3039 | 0.0536 | | 0.3577 | 67.0 | 871 | 0.5014 | 0.81 | 0.3507 | 1.3287 | 0.81 | 0.7895 | 0.2907 | 0.0538 | | 0.3577 | 68.0 | 884 | 0.5016 | 0.81 | 0.3509 | 1.3304 | 0.81 | 0.7910 | 0.3043 | 0.0540 | | 0.3577 | 69.0 | 897 | 0.5017 | 0.81 | 0.3509 | 1.3316 | 0.81 | 0.7923 | 0.3041 | 0.0539 | | 0.3577 | 70.0 | 910 | 0.5017 | 0.81 | 0.3510 | 1.3307 | 0.81 | 0.7882 | 0.2986 | 0.0538 | | 0.3577 | 71.0 | 923 | 0.5019 | 0.81 | 0.3511 | 1.3315 | 0.81 | 0.7923 | 0.3047 | 0.0540 | | 0.3577 | 72.0 | 936 | 0.5016 | 0.81 | 0.3510 | 1.3305 | 0.81 | 0.7910 | 0.3044 | 0.0539 | | 0.3577 | 73.0 | 949 | 0.5017 | 0.81 | 0.3508 | 1.3308 | 0.81 | 0.7895 | 0.2951 | 0.0540 | | 0.3577 | 74.0 | 962 | 0.5015 | 0.815 | 0.3509 | 1.3309 | 0.815 | 0.7952 | 0.3046 | 0.0537 | | 0.3577 | 75.0 | 975 | 0.5018 | 0.81 | 0.3510 | 1.3308 | 0.81 | 0.7923 | 0.3099 | 0.0542 | | 0.3577 | 76.0 | 988 | 0.5019 | 0.81 | 0.3511 | 1.3311 | 0.81 | 0.7923 | 0.3045 | 0.0543 | | 0.197 | 77.0 | 1001 | 0.5017 | 0.81 | 0.3510 | 1.3308 | 0.81 | 0.7882 | 0.3045 | 0.0538 | | 0.197 | 78.0 | 1014 | 0.5017 | 0.81 | 0.3509 | 1.3308 | 0.81 | 0.7923 | 0.2974 | 0.0538 | | 0.197 | 79.0 | 1027 | 0.5018 | 0.81 | 0.3510 | 1.3313 | 0.81 | 0.7882 | 0.3028 | 0.0540 | | 0.197 | 80.0 | 1040 | 0.5018 | 0.81 | 0.3511 | 1.3310 | 0.81 | 0.7910 | 0.3030 | 0.0539 | | 0.197 | 81.0 | 1053 | 0.5019 | 0.815 | 0.3511 | 1.3307 | 0.815 | 0.7952 | 0.2952 | 0.0540 | | 0.197 | 82.0 | 1066 | 0.5015 | 0.81 | 0.3507 | 1.3314 | 0.81 | 0.7882 | 0.2948 | 0.0535 | | 0.197 | 83.0 | 1079 | 0.5018 | 0.81 | 0.3510 | 1.3305 | 0.81 | 0.7895 | 0.2802 | 0.0538 | | 0.197 | 84.0 | 1092 | 0.5018 | 0.81 | 0.3510 | 1.3310 | 0.81 | 0.7882 | 0.2951 | 0.0539 | | 0.197 | 85.0 | 1105 | 0.5018 | 0.81 | 0.3510 | 1.3314 | 0.81 | 0.7882 | 0.2879 | 0.0539 | | 0.197 | 86.0 | 1118 | 0.5018 | 0.81 | 0.3510 | 1.3318 | 0.81 | 0.7910 | 0.3028 | 0.0539 | | 0.197 | 87.0 | 1131 | 0.5018 | 0.81 | 0.3510 | 1.3311 | 0.81 | 0.7895 | 0.2802 | 0.0538 | | 0.197 | 88.0 | 1144 | 0.5018 | 0.81 | 0.3511 | 1.3315 | 0.81 | 0.7895 | 0.2880 | 0.0539 | | 0.197 | 89.0 | 1157 | 0.5018 | 0.81 | 0.3510 | 1.3310 | 0.81 | 0.7895 | 0.2880 | 0.0539 | | 0.197 | 90.0 | 1170 | 0.5019 | 0.81 | 0.3511 | 1.3314 | 0.81 | 0.7923 | 0.2881 | 0.0538 | | 0.197 | 91.0 | 1183 | 0.5018 | 0.81 | 0.3510 | 1.3316 | 0.81 | 0.7895 | 0.2880 | 0.0537 | | 0.197 | 92.0 | 1196 | 0.5018 | 0.81 | 0.3510 | 1.3311 | 0.81 | 0.7882 | 0.2880 | 0.0538 | | 0.197 | 93.0 | 1209 | 0.5019 | 0.81 | 0.3511 | 1.3315 | 0.81 | 0.7895 | 0.2935 | 0.0539 | | 0.197 | 94.0 | 1222 | 0.5018 | 0.81 | 0.3510 | 1.3313 | 0.81 | 0.7895 | 0.2803 | 0.0537 | | 0.197 | 95.0 | 1235 | 0.5017 | 0.81 | 0.3510 | 1.3312 | 0.81 | 0.7882 | 0.2880 | 0.0537 | | 0.197 | 96.0 | 1248 | 0.5018 | 0.81 | 0.3510 | 1.3315 | 0.81 | 0.7895 | 0.2935 | 0.0538 | | 0.197 | 97.0 | 1261 | 0.5018 | 0.81 | 0.3510 | 1.3313 | 0.81 | 0.7895 | 0.2880 | 0.0537 | | 0.197 | 98.0 | 1274 | 0.5018 | 0.81 | 0.3510 | 1.3314 | 0.81 | 0.7895 | 0.2935 | 0.0538 | | 0.197 | 99.0 | 1287 | 0.5018 | 0.81 | 0.3510 | 1.3313 | 0.81 | 0.7895 | 0.2880 | 0.0537 | | 0.197 | 100.0 | 1300 | 0.5018 | 0.81 | 0.3510 | 1.3312 | 0.81 | 0.7895 | 0.2880 | 0.0537 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - 
Tokenizers 0.13.2
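The validation table above reports Brier loss and ECE alongside accuracy. As a reference, the snippet below is a minimal sketch of how these two calibration metrics can be computed from predicted class probabilities; it uses the standard multi-class Brier score and an equal-width-binned ECE, which may differ in detail from the exact implementation used to produce the numbers above. The random inputs only illustrate the expected shapes.

```python
import numpy as np

def brier_score(probs: np.ndarray, labels: np.ndarray) -> float:
    """Multi-class Brier loss: mean squared distance between the predicted
    probability vector and the one-hot encoding of the true label."""
    one_hot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - one_hot) ** 2, axis=1)))

def expected_calibration_error(probs: np.ndarray, labels: np.ndarray, n_bins: int = 10) -> float:
    """ECE with equal-width confidence bins: weighted average gap between
    per-bin confidence and per-bin accuracy."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for low, high in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > low) & (confidences <= high)
        if in_bin.any():
            ece += in_bin.mean() * abs(accuracies[in_bin].mean() - confidences[in_bin].mean())
    return float(ece)

# Toy data with the right shapes (200 samples, 10 document classes).
rng = np.random.default_rng(0)
logits = rng.normal(size=(200, 10))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
labels = rng.integers(0, 10, size=200)
print(brier_score(probs, labels), expected_calibration_error(probs, labels))
```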
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
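For convenience, the ten Tobacco-3482 class names above can be turned into the id2label / label2id mappings that a transformers classification config expects. The index order here is an assumption based on the list as given; the authoritative mapping is the one stored in the checkpoint's config.json.

```python
labels = ["adve", "email", "form", "letter", "memo",
          "news", "note", "report", "resume", "scientific"]

# Assumed ordering; verify against the checkpoint's config.json before relying on it.
id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in enumerate(labels)}

print(id2label[4])        # -> "memo" under this assumed ordering
print(label2id["email"])  # -> 1
```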
jordyvl/114-tiny_tobacco3482_kd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 114-tiny_tobacco3482_kd This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.1493 - Accuracy: 0.71 - Brier Loss: 0.5252 - Nll: 1.5452 - F1 Micro: 0.7100 - F1 Macro: 0.6059 - Ece: 0.3891 - Aurc: 0.0888 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 13 | 0.9704 | 0.23 | 0.8929 | 7.3757 | 0.23 | 0.1649 | 0.2840 | 0.7790 | | No log | 2.0 | 26 | 0.4936 | 0.34 | 0.8279 | 4.5780 | 0.34 | 0.2115 | 0.3215 | 0.5666 | | No log | 3.0 | 39 | 0.3759 | 0.455 | 0.7568 | 3.7623 | 0.455 | 0.3073 | 0.3262 | 0.3769 | | No log | 4.0 | 52 | 0.3165 | 0.545 | 0.7159 | 2.7078 | 0.545 | 0.4004 | 0.3944 | 0.2710 | | No log | 5.0 | 65 | 0.2807 | 0.6 | 0.6537 | 2.7061 | 0.6 | 0.4506 | 0.3539 | 0.2091 | | No log | 6.0 | 78 | 0.2714 | 0.575 | 0.6476 | 2.6202 | 0.575 | 0.4324 | 0.3576 | 0.2081 | | No log | 7.0 | 91 | 0.2473 | 0.64 | 0.6163 | 2.4990 | 0.64 | 0.5080 | 0.3882 | 0.1616 | | No log | 8.0 | 104 | 0.2752 | 0.625 | 0.5837 | 2.5795 | 0.625 | 0.5022 | 0.3291 | 0.1892 | | No log | 9.0 | 117 | 0.3128 | 0.59 | 0.6027 | 2.7893 | 0.59 | 0.4859 | 0.2944 | 0.2296 | | No log | 10.0 | 130 | 0.2292 | 0.66 | 0.5612 | 2.3152 | 0.66 | 0.5253 | 0.3822 | 0.1155 | | No log | 11.0 | 143 | 0.2676 | 0.665 | 0.5632 | 2.6937 | 0.665 | 0.5479 | 0.3608 | 0.1422 | | No log | 12.0 | 156 | 0.2512 | 0.65 | 0.5543 | 2.2519 | 0.65 | 0.5533 | 0.3324 | 0.1300 | | No log | 13.0 | 169 | 0.2053 | 0.67 | 0.5555 | 1.9904 | 0.67 | 0.5739 | 0.3659 | 0.1162 | | No log | 14.0 | 182 | 0.2281 | 0.68 | 0.5613 | 2.2343 | 0.68 | 0.5508 | 0.3683 | 0.1193 | | No log | 15.0 | 195 | 0.2029 | 0.705 | 0.5511 | 1.6184 | 0.705 | 0.5984 | 0.4175 | 0.0937 | | No log | 16.0 | 208 | 0.2090 | 0.71 | 0.5459 | 1.9750 | 0.7100 | 0.6052 | 0.3911 | 0.0983 | | No log | 17.0 | 221 | 0.1828 | 0.705 | 0.5385 | 1.8272 | 0.705 | 0.5973 | 0.3700 | 0.0969 | | No log | 18.0 | 234 | 0.1739 | 0.73 | 0.5358 | 1.6202 | 0.7300 | 0.6180 | 0.4115 | 0.0962 | | No log | 19.0 | 247 | 0.1847 | 0.685 | 0.5300 | 2.1083 | 0.685 | 0.5717 | 0.3582 | 0.1047 | | No log | 20.0 | 260 | 0.1839 | 0.69 | 0.5390 | 1.8560 | 0.69 | 0.5932 | 0.3708 | 0.1090 | | No log | 21.0 | 273 | 0.1756 | 0.72 | 0.5417 | 1.7203 | 0.72 | 0.6132 | 0.4000 | 0.0855 | | No log | 22.0 | 286 | 0.1727 | 0.69 | 0.5212 | 1.9503 | 0.69 | 0.5853 | 0.3574 | 0.1041 | | No log | 23.0 | 299 | 0.1684 | 0.72 | 0.5333 | 1.5951 | 0.72 | 0.6229 | 0.3922 | 0.0943 | | No log | 24.0 | 312 | 0.1652 | 0.735 | 0.5263 | 1.6768 | 0.735 | 0.6519 | 0.4001 | 0.0920 | | No log | 25.0 | 325 | 0.1637 | 0.735 | 0.5363 
| 1.6879 | 0.735 | 0.6514 | 0.4079 | 0.0819 | | No log | 26.0 | 338 | 0.1609 | 0.675 | 0.5299 | 1.5660 | 0.675 | 0.5602 | 0.3593 | 0.0989 | | No log | 27.0 | 351 | 0.1581 | 0.725 | 0.5210 | 1.5886 | 0.7250 | 0.6206 | 0.3739 | 0.0847 | | No log | 28.0 | 364 | 0.1591 | 0.71 | 0.5286 | 1.7728 | 0.7100 | 0.6076 | 0.3868 | 0.0921 | | No log | 29.0 | 377 | 0.1544 | 0.715 | 0.5251 | 1.6215 | 0.715 | 0.6201 | 0.3813 | 0.0948 | | No log | 30.0 | 390 | 0.1618 | 0.705 | 0.5340 | 1.5824 | 0.705 | 0.6064 | 0.3853 | 0.1000 | | No log | 31.0 | 403 | 0.1580 | 0.705 | 0.5202 | 1.7228 | 0.705 | 0.5949 | 0.3710 | 0.0963 | | No log | 32.0 | 416 | 0.1531 | 0.72 | 0.5257 | 1.6330 | 0.72 | 0.6137 | 0.3857 | 0.0904 | | No log | 33.0 | 429 | 0.1521 | 0.72 | 0.5248 | 1.6212 | 0.72 | 0.6349 | 0.3928 | 0.0898 | | No log | 34.0 | 442 | 0.1526 | 0.71 | 0.5261 | 1.4652 | 0.7100 | 0.6141 | 0.3829 | 0.0905 | | No log | 35.0 | 455 | 0.1529 | 0.7 | 0.5256 | 1.5784 | 0.7 | 0.5926 | 0.3885 | 0.0887 | | No log | 36.0 | 468 | 0.1526 | 0.735 | 0.5268 | 1.5163 | 0.735 | 0.6514 | 0.3991 | 0.0878 | | No log | 37.0 | 481 | 0.1497 | 0.695 | 0.5222 | 1.6068 | 0.695 | 0.5785 | 0.3918 | 0.0919 | | No log | 38.0 | 494 | 0.1488 | 0.72 | 0.5248 | 1.5401 | 0.72 | 0.6115 | 0.3905 | 0.0891 | | 0.1554 | 39.0 | 507 | 0.1504 | 0.715 | 0.5208 | 1.5917 | 0.715 | 0.6152 | 0.3730 | 0.0894 | | 0.1554 | 40.0 | 520 | 0.1487 | 0.725 | 0.5260 | 1.5258 | 0.7250 | 0.6399 | 0.3998 | 0.0879 | | 0.1554 | 41.0 | 533 | 0.1484 | 0.71 | 0.5250 | 1.6093 | 0.7100 | 0.6073 | 0.3908 | 0.0880 | | 0.1554 | 42.0 | 546 | 0.1481 | 0.715 | 0.5245 | 1.5711 | 0.715 | 0.6096 | 0.3857 | 0.0860 | | 0.1554 | 43.0 | 559 | 0.1493 | 0.705 | 0.5243 | 1.6261 | 0.705 | 0.6000 | 0.3727 | 0.0901 | | 0.1554 | 44.0 | 572 | 0.1495 | 0.71 | 0.5242 | 1.5942 | 0.7100 | 0.6080 | 0.3808 | 0.0868 | | 0.1554 | 45.0 | 585 | 0.1495 | 0.71 | 0.5242 | 1.5417 | 0.7100 | 0.6059 | 0.3813 | 0.0881 | | 0.1554 | 46.0 | 598 | 0.1490 | 0.715 | 0.5239 | 1.5403 | 0.715 | 0.6134 | 0.3826 | 0.0893 | | 0.1554 | 47.0 | 611 | 0.1486 | 0.715 | 0.5248 | 1.5387 | 0.715 | 0.6112 | 0.3754 | 0.0883 | | 0.1554 | 48.0 | 624 | 0.1491 | 0.71 | 0.5252 | 1.5527 | 0.7100 | 0.6059 | 0.3761 | 0.0889 | | 0.1554 | 49.0 | 637 | 0.1491 | 0.71 | 0.5249 | 1.5545 | 0.7100 | 0.6059 | 0.3880 | 0.0885 | | 0.1554 | 50.0 | 650 | 0.1489 | 0.71 | 0.5247 | 1.5376 | 0.7100 | 0.6059 | 0.3900 | 0.0895 | | 0.1554 | 51.0 | 663 | 0.1492 | 0.71 | 0.5257 | 1.5385 | 0.7100 | 0.6059 | 0.3857 | 0.0890 | | 0.1554 | 52.0 | 676 | 0.1491 | 0.71 | 0.5251 | 1.5460 | 0.7100 | 0.6059 | 0.3816 | 0.0888 | | 0.1554 | 53.0 | 689 | 0.1491 | 0.71 | 0.5248 | 1.5429 | 0.7100 | 0.6059 | 0.3806 | 0.0886 | | 0.1554 | 54.0 | 702 | 0.1489 | 0.71 | 0.5247 | 1.5426 | 0.7100 | 0.6059 | 0.3949 | 0.0887 | | 0.1554 | 55.0 | 715 | 0.1492 | 0.71 | 0.5258 | 1.5550 | 0.7100 | 0.6059 | 0.3921 | 0.0890 | | 0.1554 | 56.0 | 728 | 0.1492 | 0.71 | 0.5248 | 1.5470 | 0.7100 | 0.6059 | 0.3859 | 0.0888 | | 0.1554 | 57.0 | 741 | 0.1491 | 0.71 | 0.5251 | 1.5447 | 0.7100 | 0.6059 | 0.4035 | 0.0888 | | 0.1554 | 58.0 | 754 | 0.1491 | 0.71 | 0.5248 | 1.5440 | 0.7100 | 0.6059 | 0.4033 | 0.0886 | | 0.1554 | 59.0 | 767 | 0.1491 | 0.71 | 0.5246 | 1.5561 | 0.7100 | 0.6059 | 0.3920 | 0.0890 | | 0.1554 | 60.0 | 780 | 0.1492 | 0.71 | 0.5251 | 1.5461 | 0.7100 | 0.6059 | 0.3847 | 0.0889 | | 0.1554 | 61.0 | 793 | 0.1493 | 0.71 | 0.5251 | 1.5455 | 0.7100 | 0.6059 | 0.3931 | 0.0887 | | 0.1554 | 62.0 | 806 | 0.1493 | 0.71 | 0.5252 | 1.5443 | 0.7100 | 0.6059 | 0.3912 | 0.0889 | | 0.1554 | 63.0 | 819 | 0.1493 | 0.71 
| 0.5253 | 1.5441 | 0.7100 | 0.6059 | 0.3944 | 0.0887 | | 0.1554 | 64.0 | 832 | 0.1492 | 0.71 | 0.5249 | 1.5444 | 0.7100 | 0.6059 | 0.3891 | 0.0888 | | 0.1554 | 65.0 | 845 | 0.1492 | 0.71 | 0.5255 | 1.5430 | 0.7100 | 0.6059 | 0.3995 | 0.0888 | | 0.1554 | 66.0 | 858 | 0.1493 | 0.71 | 0.5250 | 1.5435 | 0.7100 | 0.6059 | 0.3991 | 0.0890 | | 0.1554 | 67.0 | 871 | 0.1493 | 0.71 | 0.5252 | 1.5449 | 0.7100 | 0.6059 | 0.3991 | 0.0890 | | 0.1554 | 68.0 | 884 | 0.1492 | 0.71 | 0.5251 | 1.5458 | 0.7100 | 0.6059 | 0.3968 | 0.0889 | | 0.1554 | 69.0 | 897 | 0.1493 | 0.71 | 0.5250 | 1.5468 | 0.7100 | 0.6059 | 0.4036 | 0.0888 | | 0.1554 | 70.0 | 910 | 0.1494 | 0.71 | 0.5253 | 1.5464 | 0.7100 | 0.6059 | 0.3889 | 0.0887 | | 0.1554 | 71.0 | 923 | 0.1493 | 0.71 | 0.5251 | 1.5452 | 0.7100 | 0.6059 | 0.3888 | 0.0887 | | 0.1554 | 72.0 | 936 | 0.1493 | 0.71 | 0.5250 | 1.5457 | 0.7100 | 0.6059 | 0.3928 | 0.0888 | | 0.1554 | 73.0 | 949 | 0.1494 | 0.71 | 0.5253 | 1.5455 | 0.7100 | 0.6059 | 0.3946 | 0.0889 | | 0.1554 | 74.0 | 962 | 0.1493 | 0.71 | 0.5251 | 1.5441 | 0.7100 | 0.6059 | 0.3928 | 0.0888 | | 0.1554 | 75.0 | 975 | 0.1493 | 0.71 | 0.5252 | 1.5455 | 0.7100 | 0.6059 | 0.3929 | 0.0891 | | 0.1554 | 76.0 | 988 | 0.1493 | 0.71 | 0.5252 | 1.5449 | 0.7100 | 0.6059 | 0.3940 | 0.0886 | | 0.0002 | 77.0 | 1001 | 0.1493 | 0.71 | 0.5253 | 1.5455 | 0.7100 | 0.6059 | 0.3891 | 0.0887 | | 0.0002 | 78.0 | 1014 | 0.1493 | 0.71 | 0.5251 | 1.5468 | 0.7100 | 0.6059 | 0.3889 | 0.0887 | | 0.0002 | 79.0 | 1027 | 0.1494 | 0.71 | 0.5252 | 1.5462 | 0.7100 | 0.6059 | 0.3891 | 0.0888 | | 0.0002 | 80.0 | 1040 | 0.1493 | 0.71 | 0.5252 | 1.5443 | 0.7100 | 0.6059 | 0.3994 | 0.0886 | | 0.0002 | 81.0 | 1053 | 0.1493 | 0.71 | 0.5251 | 1.5451 | 0.7100 | 0.6059 | 0.3890 | 0.0887 | | 0.0002 | 82.0 | 1066 | 0.1493 | 0.71 | 0.5251 | 1.5448 | 0.7100 | 0.6059 | 0.3938 | 0.0886 | | 0.0002 | 83.0 | 1079 | 0.1493 | 0.71 | 0.5251 | 1.5453 | 0.7100 | 0.6059 | 0.3890 | 0.0886 | | 0.0002 | 84.0 | 1092 | 0.1493 | 0.71 | 0.5252 | 1.5462 | 0.7100 | 0.6059 | 0.3890 | 0.0888 | | 0.0002 | 85.0 | 1105 | 0.1493 | 0.71 | 0.5252 | 1.5454 | 0.7100 | 0.6059 | 0.3891 | 0.0887 | | 0.0002 | 86.0 | 1118 | 0.1493 | 0.71 | 0.5252 | 1.5452 | 0.7100 | 0.6059 | 0.3890 | 0.0887 | | 0.0002 | 87.0 | 1131 | 0.1494 | 0.71 | 0.5252 | 1.5452 | 0.7100 | 0.6059 | 0.3891 | 0.0888 | | 0.0002 | 88.0 | 1144 | 0.1494 | 0.71 | 0.5252 | 1.5453 | 0.7100 | 0.6059 | 0.3891 | 0.0886 | | 0.0002 | 89.0 | 1157 | 0.1493 | 0.71 | 0.5252 | 1.5451 | 0.7100 | 0.6059 | 0.3891 | 0.0887 | | 0.0002 | 90.0 | 1170 | 0.1493 | 0.71 | 0.5252 | 1.5449 | 0.7100 | 0.6059 | 0.3891 | 0.0887 | | 0.0002 | 91.0 | 1183 | 0.1493 | 0.71 | 0.5252 | 1.5454 | 0.7100 | 0.6059 | 0.3890 | 0.0887 | | 0.0002 | 92.0 | 1196 | 0.1493 | 0.71 | 0.5252 | 1.5450 | 0.7100 | 0.6059 | 0.3891 | 0.0887 | | 0.0002 | 93.0 | 1209 | 0.1493 | 0.71 | 0.5252 | 1.5454 | 0.7100 | 0.6059 | 0.3891 | 0.0888 | | 0.0002 | 94.0 | 1222 | 0.1493 | 0.71 | 0.5252 | 1.5453 | 0.7100 | 0.6059 | 0.3890 | 0.0886 | | 0.0002 | 95.0 | 1235 | 0.1493 | 0.71 | 0.5252 | 1.5454 | 0.7100 | 0.6059 | 0.3891 | 0.0887 | | 0.0002 | 96.0 | 1248 | 0.1493 | 0.71 | 0.5252 | 1.5450 | 0.7100 | 0.6059 | 0.3890 | 0.0887 | | 0.0002 | 97.0 | 1261 | 0.1493 | 0.71 | 0.5252 | 1.5455 | 0.7100 | 0.6059 | 0.3891 | 0.0888 | | 0.0002 | 98.0 | 1274 | 0.1494 | 0.71 | 0.5252 | 1.5453 | 0.7100 | 0.6059 | 0.3891 | 0.0886 | | 0.0002 | 99.0 | 1287 | 0.1493 | 0.71 | 0.5252 | 1.5452 | 0.7100 | 0.6059 | 0.3891 | 0.0887 | | 0.0002 | 100.0 | 1300 | 0.1493 | 0.71 | 0.5252 | 1.5452 | 0.7100 | 0.6059 | 0.3891 | 
0.0888 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
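A minimal inference sketch for this checkpoint, assuming it is available on the Hugging Face Hub under the id shown above and exposes a standard image-classification head. The file name `scanned_page.png` is only a placeholder for a document image.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "jordyvl/114-tiny_tobacco3482_kd"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("scanned_page.png").convert("RGB")  # placeholder input
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

probs = logits.softmax(dim=-1)[0]
pred = int(probs.argmax())
print(model.config.id2label[pred], float(probs[pred]))
```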
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/114-tiny_tobacco3482_kd_NKD_t1.0_g1.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 114-tiny_tobacco3482_kd_NKD_t1.0_g1.5 This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 4.1549 - Accuracy: 0.83 - Brier Loss: 0.2823 - Nll: 1.5555 - F1 Micro: 0.83 - F1 Macro: 0.8165 - Ece: 0.1345 - Aurc: 0.0505 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 13 | 4.6466 | 0.205 | 0.8898 | 8.5236 | 0.205 | 0.1171 | 0.2921 | 0.7976 | | No log | 2.0 | 26 | 4.1877 | 0.37 | 0.7948 | 4.6881 | 0.37 | 0.2765 | 0.3019 | 0.4618 | | No log | 3.0 | 39 | 4.0470 | 0.495 | 0.7046 | 4.2682 | 0.495 | 0.3642 | 0.2930 | 0.3256 | | No log | 4.0 | 52 | 3.9713 | 0.535 | 0.6402 | 3.2866 | 0.535 | 0.3996 | 0.2908 | 0.2421 | | No log | 5.0 | 65 | 3.8631 | 0.645 | 0.5467 | 2.7568 | 0.645 | 0.5172 | 0.3135 | 0.1523 | | No log | 6.0 | 78 | 3.9102 | 0.65 | 0.5148 | 2.6608 | 0.65 | 0.5059 | 0.2764 | 0.1533 | | No log | 7.0 | 91 | 3.8517 | 0.67 | 0.4656 | 2.6499 | 0.67 | 0.5649 | 0.2541 | 0.1313 | | No log | 8.0 | 104 | 4.1021 | 0.645 | 0.4936 | 2.7236 | 0.645 | 0.5425 | 0.2230 | 0.1622 | | No log | 9.0 | 117 | 3.9369 | 0.66 | 0.4363 | 2.1539 | 0.66 | 0.5584 | 0.2057 | 0.1246 | | No log | 10.0 | 130 | 4.0468 | 0.675 | 0.4564 | 2.9594 | 0.675 | 0.5976 | 0.2109 | 0.1319 | | No log | 11.0 | 143 | 3.8168 | 0.735 | 0.3788 | 2.3105 | 0.735 | 0.6427 | 0.2121 | 0.0768 | | No log | 12.0 | 156 | 3.9052 | 0.72 | 0.3890 | 2.7166 | 0.72 | 0.6500 | 0.1906 | 0.0978 | | No log | 13.0 | 169 | 3.8460 | 0.735 | 0.3741 | 2.3764 | 0.735 | 0.6513 | 0.1941 | 0.0904 | | No log | 14.0 | 182 | 3.7683 | 0.78 | 0.3418 | 2.0497 | 0.78 | 0.7024 | 0.1980 | 0.0656 | | No log | 15.0 | 195 | 3.7220 | 0.78 | 0.3242 | 2.1470 | 0.78 | 0.7176 | 0.1780 | 0.0624 | | No log | 16.0 | 208 | 3.7405 | 0.785 | 0.3170 | 1.6643 | 0.785 | 0.7151 | 0.1874 | 0.0642 | | No log | 17.0 | 221 | 3.8003 | 0.775 | 0.3264 | 1.8059 | 0.775 | 0.7434 | 0.1699 | 0.0740 | | No log | 18.0 | 234 | 3.7431 | 0.795 | 0.3134 | 1.6740 | 0.795 | 0.7780 | 0.1891 | 0.0625 | | No log | 19.0 | 247 | 3.7844 | 0.8 | 0.3366 | 1.6058 | 0.8000 | 0.7896 | 0.2193 | 0.0707 | | No log | 20.0 | 260 | 3.7887 | 0.81 | 0.3142 | 1.8916 | 0.81 | 0.7984 | 0.1869 | 0.0660 | | No log | 21.0 | 273 | 3.7557 | 0.82 | 0.2997 | 1.6302 | 0.82 | 0.8003 | 0.1653 | 0.0548 | | No log | 22.0 | 286 | 3.8324 | 0.815 | 0.3303 | 2.0226 | 0.815 | 0.8124 | 0.1903 | 0.0742 | | No log | 23.0 | 299 | 3.7161 | 0.815 | 0.2916 | 1.9124 | 0.815 | 0.8080 | 0.1504 | 0.0554 | | No log | 24.0 | 312 | 3.8438 | 0.785 | 0.3072 | 1.9052 | 0.785 | 0.7848 | 0.1823 | 0.0688 | | No log | 25.0 | 325 | 
3.7427 | 0.82 | 0.2859 | 1.9856 | 0.82 | 0.8014 | 0.1583 | 0.0546 | | No log | 26.0 | 338 | 3.7653 | 0.81 | 0.2838 | 1.6727 | 0.81 | 0.8053 | 0.1527 | 0.0565 | | No log | 27.0 | 351 | 3.7667 | 0.82 | 0.2927 | 1.8255 | 0.82 | 0.8134 | 0.1655 | 0.0601 | | No log | 28.0 | 364 | 3.7104 | 0.815 | 0.2747 | 1.7848 | 0.815 | 0.8052 | 0.1732 | 0.0513 | | No log | 29.0 | 377 | 3.7794 | 0.825 | 0.2884 | 1.7768 | 0.825 | 0.8194 | 0.1616 | 0.0581 | | No log | 30.0 | 390 | 3.7582 | 0.81 | 0.2732 | 1.8177 | 0.81 | 0.7965 | 0.1358 | 0.0520 | | No log | 31.0 | 403 | 3.7540 | 0.82 | 0.2790 | 1.7627 | 0.82 | 0.8027 | 0.1399 | 0.0520 | | No log | 32.0 | 416 | 3.7146 | 0.82 | 0.2644 | 1.7950 | 0.82 | 0.8097 | 0.1532 | 0.0487 | | No log | 33.0 | 429 | 3.8304 | 0.79 | 0.2983 | 2.3232 | 0.79 | 0.7817 | 0.1596 | 0.0599 | | No log | 34.0 | 442 | 3.7604 | 0.82 | 0.2714 | 1.7869 | 0.82 | 0.7976 | 0.1429 | 0.0474 | | No log | 35.0 | 455 | 3.8126 | 0.815 | 0.2768 | 1.8130 | 0.815 | 0.8075 | 0.1388 | 0.0510 | | No log | 36.0 | 468 | 3.7828 | 0.825 | 0.2648 | 1.5233 | 0.825 | 0.8101 | 0.1603 | 0.0471 | | No log | 37.0 | 481 | 3.8297 | 0.82 | 0.2781 | 1.6717 | 0.82 | 0.8114 | 0.1557 | 0.0491 | | No log | 38.0 | 494 | 3.8217 | 0.82 | 0.2704 | 1.5368 | 0.82 | 0.8058 | 0.1461 | 0.0498 | | 3.4744 | 39.0 | 507 | 3.8171 | 0.845 | 0.2639 | 1.7121 | 0.845 | 0.8325 | 0.1270 | 0.0468 | | 3.4744 | 40.0 | 520 | 3.8336 | 0.83 | 0.2691 | 1.6086 | 0.83 | 0.8158 | 0.1289 | 0.0486 | | 3.4744 | 41.0 | 533 | 3.8612 | 0.815 | 0.2699 | 1.6193 | 0.815 | 0.8052 | 0.1393 | 0.0516 | | 3.4744 | 42.0 | 546 | 3.8801 | 0.825 | 0.2716 | 1.6084 | 0.825 | 0.8139 | 0.1309 | 0.0499 | | 3.4744 | 43.0 | 559 | 3.8851 | 0.82 | 0.2744 | 1.6179 | 0.82 | 0.8124 | 0.1320 | 0.0504 | | 3.4744 | 44.0 | 572 | 3.8818 | 0.825 | 0.2708 | 1.5941 | 0.825 | 0.8107 | 0.1295 | 0.0492 | | 3.4744 | 45.0 | 585 | 3.8843 | 0.83 | 0.2656 | 1.5945 | 0.83 | 0.8207 | 0.1301 | 0.0485 | | 3.4744 | 46.0 | 598 | 3.9274 | 0.82 | 0.2749 | 1.6790 | 0.82 | 0.8079 | 0.1391 | 0.0504 | | 3.4744 | 47.0 | 611 | 3.9210 | 0.83 | 0.2736 | 1.6600 | 0.83 | 0.8137 | 0.1198 | 0.0486 | | 3.4744 | 48.0 | 624 | 3.9345 | 0.82 | 0.2746 | 1.7165 | 0.82 | 0.8056 | 0.1393 | 0.0488 | | 3.4744 | 49.0 | 637 | 3.9491 | 0.835 | 0.2706 | 1.5963 | 0.835 | 0.8240 | 0.1355 | 0.0495 | | 3.4744 | 50.0 | 650 | 3.9607 | 0.815 | 0.2767 | 1.6837 | 0.815 | 0.8029 | 0.1407 | 0.0503 | | 3.4744 | 51.0 | 663 | 3.9657 | 0.825 | 0.2727 | 1.6016 | 0.825 | 0.8125 | 0.1338 | 0.0500 | | 3.4744 | 52.0 | 676 | 3.9763 | 0.825 | 0.2766 | 1.6100 | 0.825 | 0.8125 | 0.1419 | 0.0501 | | 3.4744 | 53.0 | 689 | 3.9726 | 0.83 | 0.2746 | 1.5261 | 0.83 | 0.8189 | 0.1296 | 0.0495 | | 3.4744 | 54.0 | 702 | 3.9863 | 0.825 | 0.2765 | 1.6034 | 0.825 | 0.8125 | 0.1320 | 0.0502 | | 3.4744 | 55.0 | 715 | 3.9922 | 0.815 | 0.2767 | 1.7367 | 0.815 | 0.8029 | 0.1305 | 0.0494 | | 3.4744 | 56.0 | 728 | 4.0013 | 0.825 | 0.2745 | 1.5431 | 0.825 | 0.8133 | 0.1239 | 0.0496 | | 3.4744 | 57.0 | 741 | 4.0124 | 0.825 | 0.2773 | 1.6129 | 0.825 | 0.8118 | 0.1291 | 0.0504 | | 3.4744 | 58.0 | 754 | 4.0031 | 0.83 | 0.2740 | 1.5309 | 0.83 | 0.8174 | 0.1353 | 0.0489 | | 3.4744 | 59.0 | 767 | 4.0280 | 0.825 | 0.2781 | 1.6088 | 0.825 | 0.8107 | 0.1419 | 0.0505 | | 3.4744 | 60.0 | 780 | 4.0266 | 0.83 | 0.2757 | 1.5379 | 0.83 | 0.8170 | 0.1375 | 0.0501 | | 3.4744 | 61.0 | 793 | 4.0333 | 0.82 | 0.2769 | 1.5573 | 0.82 | 0.8060 | 0.1430 | 0.0504 | | 3.4744 | 62.0 | 806 | 4.0370 | 0.835 | 0.2766 | 1.5287 | 0.835 | 0.8221 | 0.1284 | 0.0498 | | 3.4744 | 63.0 | 819 | 4.0418 | 0.825 | 
0.2763 | 1.5459 | 0.825 | 0.8107 | 0.1381 | 0.0500 | | 3.4744 | 64.0 | 832 | 4.0393 | 0.83 | 0.2745 | 1.5433 | 0.83 | 0.8174 | 0.1371 | 0.0493 | | 3.4744 | 65.0 | 845 | 4.0636 | 0.83 | 0.2779 | 1.5378 | 0.83 | 0.8174 | 0.1369 | 0.0505 | | 3.4744 | 66.0 | 858 | 4.0627 | 0.825 | 0.2790 | 1.6074 | 0.825 | 0.8107 | 0.1378 | 0.0504 | | 3.4744 | 67.0 | 871 | 4.0715 | 0.825 | 0.2797 | 1.5569 | 0.825 | 0.8107 | 0.1338 | 0.0507 | | 3.4744 | 68.0 | 884 | 4.0698 | 0.835 | 0.2770 | 1.5355 | 0.835 | 0.8221 | 0.1362 | 0.0498 | | 3.4744 | 69.0 | 897 | 4.0808 | 0.825 | 0.2798 | 1.5505 | 0.825 | 0.8107 | 0.1304 | 0.0506 | | 3.4744 | 70.0 | 910 | 4.0837 | 0.825 | 0.2794 | 1.5387 | 0.825 | 0.8118 | 0.1432 | 0.0502 | | 3.4744 | 71.0 | 923 | 4.0868 | 0.825 | 0.2793 | 1.6048 | 0.825 | 0.8107 | 0.1343 | 0.0507 | | 3.4744 | 72.0 | 936 | 4.0912 | 0.825 | 0.2793 | 1.5487 | 0.825 | 0.8118 | 0.1482 | 0.0504 | | 3.4744 | 73.0 | 949 | 4.0962 | 0.825 | 0.2795 | 1.5468 | 0.825 | 0.8118 | 0.1414 | 0.0502 | | 3.4744 | 74.0 | 962 | 4.1011 | 0.83 | 0.2802 | 1.5485 | 0.83 | 0.8165 | 0.1528 | 0.0501 | | 3.4744 | 75.0 | 975 | 4.1044 | 0.825 | 0.2801 | 1.6062 | 0.825 | 0.8107 | 0.1410 | 0.0504 | | 3.4744 | 76.0 | 988 | 4.1094 | 0.825 | 0.2804 | 1.5688 | 0.825 | 0.8107 | 0.1435 | 0.0506 | | 3.1992 | 77.0 | 1001 | 4.1127 | 0.825 | 0.2805 | 1.6066 | 0.825 | 0.8107 | 0.1496 | 0.0506 | | 3.1992 | 78.0 | 1014 | 4.1164 | 0.83 | 0.2804 | 1.5433 | 0.83 | 0.8165 | 0.1384 | 0.0503 | | 3.1992 | 79.0 | 1027 | 4.1198 | 0.83 | 0.2805 | 1.5485 | 0.83 | 0.8165 | 0.1530 | 0.0504 | | 3.1992 | 80.0 | 1040 | 4.1228 | 0.83 | 0.2808 | 1.5610 | 0.83 | 0.8165 | 0.1385 | 0.0504 | | 3.1992 | 81.0 | 1053 | 4.1257 | 0.83 | 0.2809 | 1.5505 | 0.83 | 0.8165 | 0.1475 | 0.0504 | | 3.1992 | 82.0 | 1066 | 4.1301 | 0.83 | 0.2812 | 1.5504 | 0.83 | 0.8165 | 0.1474 | 0.0504 | | 3.1992 | 83.0 | 1079 | 4.1310 | 0.83 | 0.2811 | 1.5544 | 0.83 | 0.8165 | 0.1390 | 0.0504 | | 3.1992 | 84.0 | 1092 | 4.1345 | 0.83 | 0.2813 | 1.5513 | 0.83 | 0.8165 | 0.1476 | 0.0504 | | 3.1992 | 85.0 | 1105 | 4.1373 | 0.83 | 0.2814 | 1.5530 | 0.83 | 0.8165 | 0.1340 | 0.0504 | | 3.1992 | 86.0 | 1118 | 4.1401 | 0.83 | 0.2818 | 1.5528 | 0.83 | 0.8165 | 0.1427 | 0.0504 | | 3.1992 | 87.0 | 1131 | 4.1416 | 0.83 | 0.2816 | 1.5521 | 0.83 | 0.8165 | 0.1341 | 0.0504 | | 3.1992 | 88.0 | 1144 | 4.1439 | 0.83 | 0.2819 | 1.5527 | 0.83 | 0.8165 | 0.1341 | 0.0504 | | 3.1992 | 89.0 | 1157 | 4.1453 | 0.83 | 0.2819 | 1.5536 | 0.83 | 0.8165 | 0.1343 | 0.0504 | | 3.1992 | 90.0 | 1170 | 4.1475 | 0.83 | 0.2820 | 1.5534 | 0.83 | 0.8165 | 0.1343 | 0.0504 | | 3.1992 | 91.0 | 1183 | 4.1488 | 0.83 | 0.2820 | 1.5514 | 0.83 | 0.8165 | 0.1344 | 0.0504 | | 3.1992 | 92.0 | 1196 | 4.1495 | 0.83 | 0.2820 | 1.5561 | 0.83 | 0.8165 | 0.1392 | 0.0503 | | 3.1992 | 93.0 | 1209 | 4.1510 | 0.83 | 0.2821 | 1.5510 | 0.83 | 0.8165 | 0.1432 | 0.0506 | | 3.1992 | 94.0 | 1222 | 4.1518 | 0.83 | 0.2821 | 1.5548 | 0.83 | 0.8165 | 0.1344 | 0.0504 | | 3.1992 | 95.0 | 1235 | 4.1529 | 0.83 | 0.2822 | 1.5544 | 0.83 | 0.8165 | 0.1345 | 0.0505 | | 3.1992 | 96.0 | 1248 | 4.1536 | 0.83 | 0.2822 | 1.5564 | 0.83 | 0.8165 | 0.1344 | 0.0504 | | 3.1992 | 97.0 | 1261 | 4.1542 | 0.83 | 0.2822 | 1.5542 | 0.83 | 0.8165 | 0.1345 | 0.0505 | | 3.1992 | 98.0 | 1274 | 4.1545 | 0.83 | 0.2823 | 1.5574 | 0.83 | 0.8165 | 0.1345 | 0.0505 | | 3.1992 | 99.0 | 1287 | 4.1548 | 0.83 | 0.2823 | 1.5553 | 0.83 | 0.8165 | 0.1345 | 0.0505 | | 3.1992 | 100.0 | 1300 | 4.1549 | 0.83 | 0.2823 | 1.5555 | 0.83 | 0.8165 | 0.1345 | 0.0505 | ### Framework versions - Transformers 4.26.1 - 
Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
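The hyperparameters listed above map directly onto `transformers.TrainingArguments`. The sketch below mirrors them; the output directory and the per-epoch evaluation/logging strategies are assumptions (the table shows one validation row per epoch), and the dataset, model, and distillation objective wiring are not shown.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="tiny_tobacco3482_kd_nkd",  # hypothetical output path
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumed; matches the per-epoch validation rows above
    logging_strategy="epoch",     # assumed
)
```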
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/114-tiny_tobacco3482_og_simkd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 114-tiny_tobacco3482_og_simkd This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1893.8862 - Accuracy: 0.81 - Brier Loss: 0.3043 - Nll: 1.7237 - F1 Micro: 0.81 - F1 Macro: 0.7986 - Ece: 0.2043 - Aurc: 0.0518 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 50 | 1935.1045 | 0.185 | 0.8736 | 6.0686 | 0.185 | 0.0688 | 0.2403 | 0.5781 | | No log | 2.0 | 100 | 1929.4067 | 0.38 | 0.7591 | 2.7813 | 0.38 | 0.2846 | 0.2736 | 0.3992 | | No log | 3.0 | 150 | 1922.6112 | 0.52 | 0.6781 | 3.2915 | 0.52 | 0.4584 | 0.3133 | 0.2609 | | No log | 4.0 | 200 | 1915.7050 | 0.555 | 0.5833 | 2.9277 | 0.555 | 0.4448 | 0.2570 | 0.2195 | | No log | 5.0 | 250 | 1909.4652 | 0.705 | 0.4886 | 2.2512 | 0.705 | 0.5909 | 0.2891 | 0.1184 | | No log | 6.0 | 300 | 1921.1434 | 0.56 | 0.5606 | 2.8580 | 0.56 | 0.4688 | 0.2439 | 0.1803 | | No log | 7.0 | 350 | 1915.3008 | 0.68 | 0.4581 | 2.5626 | 0.68 | 0.5349 | 0.2522 | 0.1105 | | No log | 8.0 | 400 | 1910.5537 | 0.755 | 0.4383 | 2.3359 | 0.755 | 0.7020 | 0.3135 | 0.0876 | | No log | 9.0 | 450 | 1904.0891 | 0.745 | 0.4316 | 2.7111 | 0.745 | 0.7106 | 0.2586 | 0.0938 | | 1805.2034 | 10.0 | 500 | 1910.0713 | 0.74 | 0.4450 | 2.5024 | 0.74 | 0.7050 | 0.2549 | 0.1025 | | 1805.2034 | 11.0 | 550 | 1919.6288 | 0.645 | 0.5364 | 3.8307 | 0.645 | 0.5714 | 0.2608 | 0.1798 | | 1805.2034 | 12.0 | 600 | 1906.6981 | 0.755 | 0.4213 | 2.3151 | 0.755 | 0.7259 | 0.2818 | 0.0784 | | 1805.2034 | 13.0 | 650 | 1917.8431 | 0.75 | 0.4398 | 2.6290 | 0.75 | 0.6892 | 0.2979 | 0.1018 | | 1805.2034 | 14.0 | 700 | 1903.4856 | 0.775 | 0.4175 | 1.9907 | 0.775 | 0.7792 | 0.2951 | 0.0818 | | 1805.2034 | 15.0 | 750 | 1921.8427 | 0.65 | 0.5353 | 3.4019 | 0.65 | 0.6228 | 0.3013 | 0.1318 | | 1805.2034 | 16.0 | 800 | 1909.0247 | 0.78 | 0.4072 | 2.3930 | 0.78 | 0.7669 | 0.2861 | 0.0777 | | 1805.2034 | 17.0 | 850 | 1904.9219 | 0.77 | 0.4239 | 3.0975 | 0.7700 | 0.7413 | 0.2926 | 0.0850 | | 1805.2034 | 18.0 | 900 | 1909.7672 | 0.795 | 0.4094 | 1.8444 | 0.795 | 0.7631 | 0.3217 | 0.0703 | | 1805.2034 | 19.0 | 950 | 1902.0480 | 0.78 | 0.3774 | 2.2148 | 0.78 | 0.7783 | 0.2871 | 0.0818 | | 1791.9011 | 20.0 | 1000 | 1909.4414 | 0.775 | 0.3855 | 2.1104 | 0.775 | 0.7248 | 0.2782 | 0.0678 | | 1791.9011 | 21.0 | 1050 | 1908.0966 | 0.805 | 0.3740 | 2.2179 | 0.805 | 0.7927 | 0.2643 | 0.0708 | | 1791.9011 | 22.0 | 1100 | 1901.1847 | 0.81 | 0.3374 | 2.0363 | 0.81 | 0.7951 | 0.2685 | 0.0482 | | 1791.9011 | 23.0 | 1150 | 1906.8840 | 0.79 | 0.3623 | 2.0552 | 0.79 | 0.7541 | 0.2722 | 0.0722 | | 
1791.9011 | 24.0 | 1200 | 1906.2512 | 0.805 | 0.3717 | 2.1442 | 0.805 | 0.7782 | 0.2826 | 0.0673 | | 1791.9011 | 25.0 | 1250 | 1907.1316 | 0.8 | 0.3593 | 2.2944 | 0.8000 | 0.7751 | 0.2603 | 0.0540 | | 1791.9011 | 26.0 | 1300 | 1905.5479 | 0.81 | 0.3382 | 2.1642 | 0.81 | 0.8011 | 0.2643 | 0.0588 | | 1791.9011 | 27.0 | 1350 | 1901.1903 | 0.81 | 0.3459 | 1.8076 | 0.81 | 0.8003 | 0.2701 | 0.0593 | | 1791.9011 | 28.0 | 1400 | 1909.6206 | 0.775 | 0.3710 | 2.0871 | 0.775 | 0.7602 | 0.2669 | 0.0662 | | 1791.9011 | 29.0 | 1450 | 1898.1127 | 0.825 | 0.3183 | 1.9650 | 0.825 | 0.8106 | 0.2453 | 0.0466 | | 1787.102 | 30.0 | 1500 | 1899.2822 | 0.815 | 0.3199 | 2.0598 | 0.815 | 0.8009 | 0.2249 | 0.0535 | | 1787.102 | 31.0 | 1550 | 1896.6500 | 0.815 | 0.3261 | 1.8001 | 0.815 | 0.8092 | 0.2422 | 0.0518 | | 1787.102 | 32.0 | 1600 | 1895.1382 | 0.815 | 0.3185 | 1.7859 | 0.815 | 0.7927 | 0.2447 | 0.0528 | | 1787.102 | 33.0 | 1650 | 1895.5653 | 0.83 | 0.3093 | 1.8941 | 0.83 | 0.8208 | 0.2492 | 0.0514 | | 1787.102 | 34.0 | 1700 | 1903.5587 | 0.825 | 0.3330 | 2.0897 | 0.825 | 0.8191 | 0.2669 | 0.0564 | | 1787.102 | 35.0 | 1750 | 1906.5038 | 0.79 | 0.3470 | 2.0105 | 0.79 | 0.7706 | 0.2470 | 0.0555 | | 1787.102 | 36.0 | 1800 | 1900.6022 | 0.82 | 0.3315 | 1.7296 | 0.82 | 0.7975 | 0.2521 | 0.0562 | | 1787.102 | 37.0 | 1850 | 1905.2075 | 0.82 | 0.3317 | 1.6814 | 0.82 | 0.8032 | 0.2497 | 0.0500 | | 1787.102 | 38.0 | 1900 | 1903.8790 | 0.805 | 0.3429 | 1.7498 | 0.805 | 0.7981 | 0.2636 | 0.0551 | | 1787.102 | 39.0 | 1950 | 1900.6611 | 0.815 | 0.3180 | 1.8225 | 0.815 | 0.8030 | 0.2445 | 0.0508 | | 1783.8555 | 40.0 | 2000 | 1898.1110 | 0.815 | 0.3065 | 1.7107 | 0.815 | 0.7966 | 0.2119 | 0.0468 | | 1783.8555 | 41.0 | 2050 | 1898.4365 | 0.83 | 0.3143 | 1.7429 | 0.83 | 0.8211 | 0.2559 | 0.0459 | | 1783.8555 | 42.0 | 2100 | 1897.6564 | 0.825 | 0.3174 | 1.6802 | 0.825 | 0.8289 | 0.2337 | 0.0528 | | 1783.8555 | 43.0 | 2150 | 1895.1722 | 0.83 | 0.3140 | 1.8016 | 0.83 | 0.8274 | 0.2389 | 0.0546 | | 1783.8555 | 44.0 | 2200 | 1899.7227 | 0.815 | 0.3283 | 1.7356 | 0.815 | 0.8041 | 0.2357 | 0.0548 | | 1783.8555 | 45.0 | 2250 | 1900.8423 | 0.815 | 0.3203 | 1.7309 | 0.815 | 0.8027 | 0.2426 | 0.0475 | | 1783.8555 | 46.0 | 2300 | 1897.0514 | 0.805 | 0.3144 | 1.6915 | 0.805 | 0.7944 | 0.2256 | 0.0487 | | 1783.8555 | 47.0 | 2350 | 1899.8257 | 0.82 | 0.3165 | 1.8902 | 0.82 | 0.8063 | 0.2467 | 0.0534 | | 1783.8555 | 48.0 | 2400 | 1898.5613 | 0.83 | 0.3184 | 1.6568 | 0.83 | 0.8262 | 0.2501 | 0.0505 | | 1783.8555 | 49.0 | 2450 | 1900.2554 | 0.815 | 0.3214 | 1.5944 | 0.815 | 0.8144 | 0.2196 | 0.0526 | | 1781.6714 | 50.0 | 2500 | 1887.5803 | 0.82 | 0.3064 | 1.9301 | 0.82 | 0.8071 | 0.2161 | 0.0509 | | 1781.6714 | 51.0 | 2550 | 1897.9454 | 0.8 | 0.3173 | 1.7138 | 0.8000 | 0.7864 | 0.2198 | 0.0516 | | 1781.6714 | 52.0 | 2600 | 1897.1787 | 0.82 | 0.3129 | 1.7421 | 0.82 | 0.8134 | 0.2336 | 0.0484 | | 1781.6714 | 53.0 | 2650 | 1894.3107 | 0.815 | 0.3074 | 1.8254 | 0.815 | 0.8044 | 0.2158 | 0.0503 | | 1781.6714 | 54.0 | 2700 | 1893.0331 | 0.82 | 0.3107 | 1.8731 | 0.82 | 0.8128 | 0.2317 | 0.0510 | | 1781.6714 | 55.0 | 2750 | 1891.3662 | 0.805 | 0.3096 | 1.5809 | 0.805 | 0.7907 | 0.2200 | 0.0530 | | 1781.6714 | 56.0 | 2800 | 1898.0635 | 0.81 | 0.3201 | 1.8833 | 0.81 | 0.7984 | 0.2422 | 0.0527 | | 1781.6714 | 57.0 | 2850 | 1893.0884 | 0.82 | 0.3090 | 1.7285 | 0.82 | 0.8109 | 0.2237 | 0.0478 | | 1781.6714 | 58.0 | 2900 | 1898.0316 | 0.81 | 0.3096 | 1.7963 | 0.81 | 0.7911 | 0.2020 | 0.0500 | | 1781.6714 | 59.0 | 2950 | 1894.0997 | 0.83 | 0.3112 | 1.7991 | 
0.83 | 0.8258 | 0.2204 | 0.0527 | | 1779.6704 | 60.0 | 3000 | 1896.0547 | 0.815 | 0.3134 | 1.7401 | 0.815 | 0.8015 | 0.2084 | 0.0573 | | 1779.6704 | 61.0 | 3050 | 1895.2075 | 0.8 | 0.3155 | 1.7718 | 0.8000 | 0.7934 | 0.2166 | 0.0535 | | 1779.6704 | 62.0 | 3100 | 1893.9904 | 0.82 | 0.3038 | 1.6664 | 0.82 | 0.8008 | 0.2104 | 0.0529 | | 1779.6704 | 63.0 | 3150 | 1891.7444 | 0.83 | 0.3003 | 1.6876 | 0.83 | 0.8184 | 0.2047 | 0.0502 | | 1779.6704 | 64.0 | 3200 | 1891.1179 | 0.825 | 0.3024 | 1.6612 | 0.825 | 0.8057 | 0.2204 | 0.0532 | | 1779.6704 | 65.0 | 3250 | 1896.2891 | 0.805 | 0.3155 | 1.6865 | 0.805 | 0.7950 | 0.2164 | 0.0579 | | 1779.6704 | 66.0 | 3300 | 1893.3801 | 0.82 | 0.2999 | 1.6871 | 0.82 | 0.8052 | 0.2143 | 0.0531 | | 1779.6704 | 67.0 | 3350 | 1894.5913 | 0.825 | 0.3076 | 1.6609 | 0.825 | 0.8120 | 0.1837 | 0.0541 | | 1779.6704 | 68.0 | 3400 | 1896.0500 | 0.82 | 0.3114 | 1.7333 | 0.82 | 0.8047 | 0.2223 | 0.0517 | | 1779.6704 | 69.0 | 3450 | 1893.4884 | 0.805 | 0.3052 | 1.6768 | 0.805 | 0.7973 | 0.2042 | 0.0519 | | 1778.3283 | 70.0 | 3500 | 1895.4883 | 0.815 | 0.3078 | 1.7596 | 0.815 | 0.8082 | 0.2222 | 0.0508 | | 1778.3283 | 71.0 | 3550 | 1894.0317 | 0.81 | 0.3078 | 1.8215 | 0.81 | 0.7981 | 0.2106 | 0.0538 | | 1778.3283 | 72.0 | 3600 | 1895.4956 | 0.815 | 0.3058 | 1.7385 | 0.815 | 0.7993 | 0.2126 | 0.0532 | | 1778.3283 | 73.0 | 3650 | 1896.8766 | 0.8 | 0.3106 | 1.8469 | 0.8000 | 0.7887 | 0.2042 | 0.0557 | | 1778.3283 | 74.0 | 3700 | 1893.9586 | 0.82 | 0.3104 | 1.7539 | 0.82 | 0.8027 | 0.2229 | 0.0545 | | 1778.3283 | 75.0 | 3750 | 1896.0406 | 0.81 | 0.3155 | 1.7257 | 0.81 | 0.7947 | 0.2177 | 0.0576 | | 1778.3283 | 76.0 | 3800 | 1895.0575 | 0.815 | 0.3115 | 1.6848 | 0.815 | 0.7940 | 0.2165 | 0.0558 | | 1778.3283 | 77.0 | 3850 | 1895.8911 | 0.8 | 0.3102 | 1.7059 | 0.8000 | 0.7860 | 0.2135 | 0.0550 | | 1778.3283 | 78.0 | 3900 | 1897.4371 | 0.81 | 0.3138 | 1.8054 | 0.81 | 0.7986 | 0.2218 | 0.0541 | | 1778.3283 | 79.0 | 3950 | 1897.8650 | 0.805 | 0.3148 | 1.8384 | 0.805 | 0.7899 | 0.2226 | 0.0529 | | 1777.483 | 80.0 | 4000 | 1896.5586 | 0.82 | 0.3121 | 1.6966 | 0.82 | 0.8040 | 0.2163 | 0.0547 | | 1777.483 | 81.0 | 4050 | 1894.1078 | 0.805 | 0.3029 | 1.7155 | 0.805 | 0.7913 | 0.2053 | 0.0504 | | 1777.483 | 82.0 | 4100 | 1898.5256 | 0.81 | 0.3134 | 1.7079 | 0.81 | 0.7917 | 0.2099 | 0.0546 | | 1777.483 | 83.0 | 4150 | 1895.9496 | 0.8 | 0.3137 | 1.8060 | 0.8000 | 0.7837 | 0.2138 | 0.0542 | | 1777.483 | 84.0 | 4200 | 1890.2460 | 0.81 | 0.3126 | 1.7942 | 0.81 | 0.7929 | 0.2040 | 0.0565 | | 1777.483 | 85.0 | 4250 | 1894.1871 | 0.805 | 0.3062 | 1.8438 | 0.805 | 0.7940 | 0.1993 | 0.0514 | | 1777.483 | 86.0 | 4300 | 1895.9297 | 0.81 | 0.3099 | 1.8574 | 0.81 | 0.7941 | 0.2017 | 0.0521 | | 1777.483 | 87.0 | 4350 | 1894.5403 | 0.82 | 0.3044 | 1.6991 | 0.82 | 0.8036 | 0.2360 | 0.0523 | | 1777.483 | 88.0 | 4400 | 1893.5771 | 0.815 | 0.3056 | 1.7819 | 0.815 | 0.7949 | 0.2241 | 0.0531 | | 1777.483 | 89.0 | 4450 | 1890.9475 | 0.8 | 0.3060 | 1.7981 | 0.8000 | 0.7852 | 0.2025 | 0.0544 | | 1776.5495 | 90.0 | 4500 | 1893.1115 | 0.82 | 0.3018 | 1.9212 | 0.82 | 0.8066 | 0.1983 | 0.0520 | | 1776.5495 | 91.0 | 4550 | 1897.0479 | 0.815 | 0.3124 | 1.8394 | 0.815 | 0.7950 | 0.2129 | 0.0544 | | 1776.5495 | 92.0 | 4600 | 1895.9264 | 0.815 | 0.3075 | 1.8241 | 0.815 | 0.8003 | 0.2220 | 0.0517 | | 1776.5495 | 93.0 | 4650 | 1894.7333 | 0.805 | 0.3077 | 1.8649 | 0.805 | 0.7835 | 0.2071 | 0.0523 | | 1776.5495 | 94.0 | 4700 | 1894.5804 | 0.805 | 0.3086 | 1.8344 | 0.805 | 0.7857 | 0.1945 | 0.0533 | | 1776.5495 | 95.0 | 4750 | 
1889.6356 | 0.805 | 0.2979 | 1.7220 | 0.805 | 0.7944 | 0.1813 | 0.0532 |
| 1776.5495 | 96.0 | 4800 | 1894.5663 | 0.805 | 0.3077 | 1.7782 | 0.805 | 0.7902 | 0.2031 | 0.0525 |
| 1776.5495 | 97.0 | 4850 | 1894.6851 | 0.815 | 0.3085 | 1.8374 | 0.815 | 0.7972 | 0.2140 | 0.0541 |
| 1776.5495 | 98.0 | 4900 | 1897.3888 | 0.805 | 0.3111 | 1.8603 | 0.805 | 0.7905 | 0.2092 | 0.0537 |
| 1776.5495 | 99.0 | 4950 | 1892.2825 | 0.815 | 0.3038 | 1.8404 | 0.815 | 0.8017 | 0.2080 | 0.0514 |
| 1776.3791 | 100.0 | 5000 | 1893.8862 | 0.81 | 0.3043 | 1.7237 | 0.81 | 0.7986 | 0.2043 | 0.0518 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
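This card names SimKD-style distillation in its title but does not spell out the objective. As a point of reference only, the snippet below shows a generic soft-label knowledge-distillation term (temperature-scaled KL divergence between teacher and student logits); it is not claimed to be the SimKD formulation used for this checkpoint.

```python
import torch
import torch.nn.functional as F

def soft_target_kd_loss(student_logits: torch.Tensor,
                        teacher_logits: torch.Tensor,
                        temperature: float = 1.0) -> torch.Tensor:
    """Generic distillation term: KL divergence between temperature-softened
    teacher and student distributions, rescaled by T^2."""
    t = temperature
    student_log_probs = F.log_softmax(student_logits / t, dim=-1)
    teacher_probs = F.softmax(teacher_logits / t, dim=-1)
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * (t * t)

# Toy shapes: a batch of 8 examples over the 10 document classes.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
print(soft_target_kd_loss(student_logits, teacher_logits, temperature=1.0))
```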
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
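The validation tables in these cards also track a negative log-likelihood (Nll) column. One common way to compute a per-sample negative log-likelihood from predicted probabilities is sketched below; the exact computation behind the reported Nll values is not documented here, so treat this as illustrative.

```python
import numpy as np

def mean_nll(probs: np.ndarray, labels: np.ndarray, eps: float = 1e-12) -> float:
    """Average negative log-probability assigned to the true class."""
    true_class_probs = probs[np.arange(len(labels)), labels]
    return float(-np.mean(np.log(true_class_probs + eps)))

rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(10), size=50)  # toy predictions over 10 classes
labels = rng.integers(0, 10, size=50)
print(mean_nll(probs, labels))
```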
jordyvl/vit-base_tobacco_crl
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_tobacco_crl This model is a fine-tuned version of [jordyvl/vit-base_tobacco](https://huggingface.co/jordyvl/vit-base_tobacco) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.8584 - Accuracy: 0.8 - Brier Loss: 0.3083 - Nll: 1.3299 - F1 Micro: 0.8000 - F1 Macro: 0.7728 - Ece: 0.2079 - Aurc: 0.0851 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 256 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 0.96 | 3 | 0.8895 | 0.82 | 0.3092 | 1.1901 | 0.82 | 0.8049 | 0.2293 | 0.0751 | | No log | 1.96 | 6 | 0.8886 | 0.81 | 0.3071 | 1.1861 | 0.81 | 0.7912 | 0.2245 | 0.0705 | | No log | 2.96 | 9 | 0.8747 | 0.815 | 0.3065 | 1.1876 | 0.815 | 0.8021 | 0.2265 | 0.0734 | | No log | 3.96 | 12 | 0.8812 | 0.805 | 0.3085 | 1.2661 | 0.805 | 0.7783 | 0.2087 | 0.0761 | | No log | 4.96 | 15 | 0.8874 | 0.81 | 0.3080 | 1.1831 | 0.81 | 0.7871 | 0.2325 | 0.0786 | | No log | 5.96 | 18 | 0.8818 | 0.81 | 0.3089 | 1.2715 | 0.81 | 0.7960 | 0.2345 | 0.0788 | | No log | 6.96 | 21 | 0.8790 | 0.81 | 0.3045 | 1.2619 | 0.81 | 0.7904 | 0.2235 | 0.0693 | | No log | 7.96 | 24 | 0.8794 | 0.805 | 0.3084 | 1.2566 | 0.805 | 0.7884 | 0.2205 | 0.0787 | | No log | 8.96 | 27 | 0.8838 | 0.815 | 0.3134 | 1.3380 | 0.815 | 0.8072 | 0.2230 | 0.0751 | | No log | 9.96 | 30 | 0.8849 | 0.8 | 0.3132 | 1.3205 | 0.8000 | 0.7757 | 0.2229 | 0.0824 | | No log | 10.96 | 33 | 0.8633 | 0.81 | 0.3061 | 1.3978 | 0.81 | 0.7938 | 0.2004 | 0.0756 | | No log | 11.96 | 36 | 0.8746 | 0.81 | 0.3089 | 1.3970 | 0.81 | 0.7918 | 0.2346 | 0.0741 | | No log | 12.96 | 39 | 0.8625 | 0.805 | 0.3078 | 1.1961 | 0.805 | 0.7945 | 0.2505 | 0.0854 | | No log | 13.96 | 42 | 0.8636 | 0.815 | 0.3068 | 1.2113 | 0.815 | 0.8046 | 0.2371 | 0.0804 | | No log | 14.96 | 45 | 0.8906 | 0.79 | 0.3157 | 1.3748 | 0.79 | 0.7777 | 0.2279 | 0.0847 | | No log | 15.96 | 48 | 0.8601 | 0.805 | 0.3040 | 1.2977 | 0.805 | 0.7876 | 0.2176 | 0.0805 | | No log | 16.96 | 51 | 0.8606 | 0.815 | 0.3083 | 1.4136 | 0.815 | 0.8077 | 0.2279 | 0.0787 | | No log | 17.96 | 54 | 0.9013 | 0.8 | 0.3261 | 1.2494 | 0.8000 | 0.7886 | 0.2194 | 0.0871 | | No log | 18.96 | 57 | 0.8653 | 0.805 | 0.3143 | 1.4166 | 0.805 | 0.7935 | 0.2170 | 0.0786 | | No log | 19.96 | 60 | 0.8459 | 0.81 | 0.3030 | 1.2629 | 0.81 | 0.7953 | 0.2129 | 0.0892 | | No log | 20.96 | 63 | 0.8689 | 0.795 | 0.3106 | 1.2823 | 0.795 | 0.7725 | 0.2099 | 0.0828 | | No log | 21.96 | 66 | 0.8563 | 0.81 | 0.3016 | 1.2789 | 0.81 | 0.7954 | 0.2324 | 0.0742 | | No log | 22.96 | 69 | 0.8998 | 0.785 | 0.3231 | 1.6511 | 0.785 | 0.7642 | 0.2178 | 0.1015 | | No log | 23.96 | 72 | 0.8338 | 0.805 | 0.2971 | 1.0504 | 0.805 | 0.7868 | 0.2135 | 
0.0645 | | No log | 24.96 | 75 | 0.8423 | 0.8 | 0.3040 | 1.4777 | 0.8000 | 0.7771 | 0.2283 | 0.0689 | | No log | 25.96 | 78 | 0.8775 | 0.8 | 0.3218 | 1.4206 | 0.8000 | 0.7774 | 0.2204 | 0.1120 | | No log | 26.96 | 81 | 0.8389 | 0.8 | 0.2984 | 1.1946 | 0.8000 | 0.7771 | 0.1990 | 0.0737 | | No log | 27.96 | 84 | 0.9119 | 0.795 | 0.3319 | 1.6978 | 0.795 | 0.7805 | 0.2279 | 0.1109 | | No log | 28.96 | 87 | 0.8689 | 0.805 | 0.3144 | 1.2644 | 0.805 | 0.7971 | 0.2216 | 0.0787 | | No log | 29.96 | 90 | 0.8404 | 0.8 | 0.2990 | 1.1775 | 0.8000 | 0.7848 | 0.1962 | 0.0805 | | No log | 30.96 | 93 | 0.8842 | 0.8 | 0.3226 | 1.3091 | 0.8000 | 0.7904 | 0.2168 | 0.1020 | | No log | 31.96 | 96 | 0.8653 | 0.805 | 0.3086 | 1.3926 | 0.805 | 0.7818 | 0.1996 | 0.0853 | | No log | 32.96 | 99 | 0.8767 | 0.785 | 0.3142 | 1.2268 | 0.785 | 0.7684 | 0.2117 | 0.0739 | | No log | 33.96 | 102 | 0.9349 | 0.775 | 0.3410 | 1.3988 | 0.775 | 0.7600 | 0.2246 | 0.1024 | | No log | 34.96 | 105 | 0.8606 | 0.79 | 0.3035 | 1.0902 | 0.79 | 0.7683 | 0.1954 | 0.0830 | | No log | 35.96 | 108 | 0.8578 | 0.815 | 0.3050 | 1.3418 | 0.815 | 0.7923 | 0.2155 | 0.0923 | | No log | 36.96 | 111 | 0.8641 | 0.795 | 0.3128 | 1.2449 | 0.795 | 0.7694 | 0.2068 | 0.0878 | | No log | 37.96 | 114 | 0.8489 | 0.8 | 0.2996 | 1.2505 | 0.8000 | 0.7698 | 0.2027 | 0.0827 | | No log | 38.96 | 117 | 0.8465 | 0.82 | 0.3011 | 1.3264 | 0.82 | 0.7947 | 0.2033 | 0.0923 | | No log | 39.96 | 120 | 0.8608 | 0.8 | 0.3051 | 1.3178 | 0.8000 | 0.7706 | 0.2072 | 0.0894 | | No log | 40.96 | 123 | 0.8592 | 0.8 | 0.3066 | 1.3141 | 0.8000 | 0.7692 | 0.2069 | 0.0909 | | No log | 41.96 | 126 | 0.8611 | 0.805 | 0.3125 | 1.2988 | 0.805 | 0.7832 | 0.2094 | 0.0791 | | No log | 42.96 | 129 | 0.8516 | 0.805 | 0.3000 | 1.3221 | 0.805 | 0.7791 | 0.2179 | 0.0884 | | No log | 43.96 | 132 | 0.8587 | 0.8 | 0.3064 | 1.3414 | 0.8000 | 0.7784 | 0.2056 | 0.0922 | | No log | 44.96 | 135 | 0.8691 | 0.79 | 0.3181 | 1.3262 | 0.79 | 0.7765 | 0.2153 | 0.0884 | | No log | 45.96 | 138 | 0.8576 | 0.81 | 0.3066 | 1.1918 | 0.81 | 0.7847 | 0.2182 | 0.1009 | | No log | 46.96 | 141 | 0.8722 | 0.8 | 0.3152 | 1.4909 | 0.8000 | 0.7798 | 0.2219 | 0.1012 | | No log | 47.96 | 144 | 0.8399 | 0.81 | 0.3087 | 1.5338 | 0.81 | 0.7849 | 0.2138 | 0.0740 | | No log | 48.96 | 147 | 0.8393 | 0.805 | 0.3004 | 1.3810 | 0.805 | 0.7819 | 0.2150 | 0.0696 | | No log | 49.96 | 150 | 0.8899 | 0.78 | 0.3201 | 1.5622 | 0.78 | 0.7644 | 0.2227 | 0.0960 | | No log | 50.96 | 153 | 0.8954 | 0.78 | 0.3249 | 1.6494 | 0.78 | 0.7654 | 0.2135 | 0.0902 | | No log | 51.96 | 156 | 0.8259 | 0.79 | 0.2954 | 1.2271 | 0.79 | 0.7707 | 0.2129 | 0.0659 | | No log | 52.96 | 159 | 0.8806 | 0.795 | 0.3145 | 1.4079 | 0.795 | 0.7759 | 0.2046 | 0.0877 | | No log | 53.96 | 162 | 0.8842 | 0.81 | 0.3178 | 1.3465 | 0.81 | 0.7925 | 0.2173 | 0.1037 | | No log | 54.96 | 165 | 0.8741 | 0.8 | 0.3173 | 1.4540 | 0.8000 | 0.7750 | 0.2079 | 0.0819 | | No log | 55.96 | 168 | 0.8242 | 0.8 | 0.2964 | 1.3053 | 0.8000 | 0.7838 | 0.1972 | 0.0670 | | No log | 56.96 | 171 | 0.8350 | 0.825 | 0.2962 | 1.2110 | 0.825 | 0.8135 | 0.2126 | 0.0780 | | No log | 57.96 | 174 | 0.8491 | 0.815 | 0.3034 | 1.3250 | 0.815 | 0.8070 | 0.2116 | 0.0875 | | No log | 58.96 | 177 | 0.8584 | 0.795 | 0.3119 | 1.3162 | 0.795 | 0.7764 | 0.1956 | 0.0860 | | No log | 59.96 | 180 | 0.8546 | 0.79 | 0.3115 | 1.3315 | 0.79 | 0.7740 | 0.1855 | 0.0828 | | No log | 60.96 | 183 | 0.8564 | 0.79 | 0.3068 | 1.3275 | 0.79 | 0.7760 | 0.2008 | 0.0862 | | No log | 61.96 | 186 | 0.8573 | 0.795 | 0.3068 | 1.3160 | 0.795 | 
0.7738 | 0.2117 | 0.0884 | | No log | 62.96 | 189 | 0.8503 | 0.785 | 0.3088 | 1.3498 | 0.785 | 0.7650 | 0.2069 | 0.0856 | | No log | 63.96 | 192 | 0.8639 | 0.81 | 0.3111 | 1.2614 | 0.81 | 0.7873 | 0.2247 | 0.0893 | | No log | 64.96 | 195 | 0.8744 | 0.805 | 0.3128 | 1.3294 | 0.805 | 0.7888 | 0.2096 | 0.0912 | | No log | 65.96 | 198 | 0.8727 | 0.8 | 0.3138 | 1.4212 | 0.8000 | 0.7903 | 0.2031 | 0.0849 | | No log | 66.96 | 201 | 0.8612 | 0.79 | 0.3084 | 1.3592 | 0.79 | 0.7702 | 0.1855 | 0.0816 | | No log | 67.96 | 204 | 0.8576 | 0.79 | 0.3071 | 1.4005 | 0.79 | 0.7667 | 0.1896 | 0.0863 | | No log | 68.96 | 207 | 0.8540 | 0.805 | 0.3037 | 1.3957 | 0.805 | 0.7775 | 0.2263 | 0.0876 | | No log | 69.96 | 210 | 0.8499 | 0.81 | 0.2982 | 1.3987 | 0.81 | 0.7874 | 0.2109 | 0.0856 | | No log | 70.96 | 213 | 0.8465 | 0.815 | 0.3001 | 1.3222 | 0.815 | 0.7901 | 0.2224 | 0.0928 | | No log | 71.96 | 216 | 0.8541 | 0.81 | 0.3041 | 1.3331 | 0.81 | 0.7827 | 0.2169 | 0.0897 | | No log | 72.96 | 219 | 0.8546 | 0.795 | 0.3066 | 1.3991 | 0.795 | 0.7720 | 0.2141 | 0.0871 | | No log | 73.96 | 222 | 0.8569 | 0.79 | 0.3039 | 1.3544 | 0.79 | 0.7672 | 0.1958 | 0.0863 | | No log | 74.96 | 225 | 0.8622 | 0.805 | 0.3028 | 1.3384 | 0.805 | 0.7847 | 0.1938 | 0.0879 | | No log | 75.96 | 228 | 0.8610 | 0.805 | 0.3039 | 1.3285 | 0.805 | 0.7810 | 0.2033 | 0.0947 | | No log | 76.96 | 231 | 0.8581 | 0.81 | 0.3031 | 1.3334 | 0.81 | 0.7840 | 0.1993 | 0.0944 | | No log | 77.96 | 234 | 0.8607 | 0.8 | 0.3055 | 1.3260 | 0.8000 | 0.7785 | 0.1979 | 0.0899 | | No log | 78.96 | 237 | 0.8642 | 0.79 | 0.3068 | 1.3928 | 0.79 | 0.7672 | 0.1822 | 0.0869 | | No log | 79.96 | 240 | 0.8640 | 0.805 | 0.3044 | 1.3311 | 0.805 | 0.7786 | 0.2001 | 0.0916 | | No log | 80.96 | 243 | 0.8648 | 0.81 | 0.3056 | 1.2812 | 0.81 | 0.7836 | 0.2173 | 0.0955 | | No log | 81.96 | 246 | 0.8639 | 0.825 | 0.3056 | 1.3295 | 0.825 | 0.8062 | 0.1952 | 0.0913 | | No log | 82.96 | 249 | 0.8643 | 0.805 | 0.3082 | 1.3334 | 0.805 | 0.7887 | 0.2108 | 0.0881 | | No log | 83.96 | 252 | 0.8626 | 0.795 | 0.3068 | 1.3334 | 0.795 | 0.7780 | 0.2097 | 0.0845 | | No log | 84.96 | 255 | 0.8586 | 0.81 | 0.3033 | 1.2646 | 0.81 | 0.7893 | 0.2035 | 0.0808 | | No log | 85.96 | 258 | 0.8570 | 0.805 | 0.3024 | 1.2694 | 0.805 | 0.7802 | 0.1947 | 0.0811 | | No log | 86.96 | 261 | 0.8557 | 0.795 | 0.3023 | 1.3261 | 0.795 | 0.7657 | 0.1966 | 0.0828 | | No log | 87.96 | 264 | 0.8576 | 0.8 | 0.3051 | 1.3283 | 0.8000 | 0.7754 | 0.2072 | 0.0848 | | No log | 88.96 | 267 | 0.8537 | 0.8 | 0.3083 | 1.3257 | 0.8000 | 0.7771 | 0.2167 | 0.0859 | | No log | 89.96 | 270 | 0.8591 | 0.795 | 0.3106 | 1.3262 | 0.795 | 0.7737 | 0.2011 | 0.0866 | | No log | 90.96 | 273 | 0.8612 | 0.785 | 0.3122 | 1.3279 | 0.785 | 0.7594 | 0.1885 | 0.0868 | | No log | 91.96 | 276 | 0.8571 | 0.795 | 0.3104 | 1.3248 | 0.795 | 0.7667 | 0.1966 | 0.0853 | | No log | 92.96 | 279 | 0.8560 | 0.795 | 0.3082 | 1.3244 | 0.795 | 0.7667 | 0.2147 | 0.0836 | | No log | 93.96 | 282 | 0.8551 | 0.8 | 0.3071 | 1.3251 | 0.8000 | 0.7766 | 0.2109 | 0.0830 | | No log | 94.96 | 285 | 0.8556 | 0.79 | 0.3076 | 1.3264 | 0.79 | 0.7577 | 0.1885 | 0.0834 | | No log | 95.96 | 288 | 0.8569 | 0.795 | 0.3078 | 1.3280 | 0.795 | 0.7675 | 0.1980 | 0.0840 | | No log | 96.96 | 291 | 0.8581 | 0.795 | 0.3082 | 1.3290 | 0.795 | 0.7675 | 0.2039 | 0.0842 | | No log | 97.96 | 294 | 0.8585 | 0.8 | 0.3084 | 1.3300 | 0.8000 | 0.7728 | 0.2137 | 0.0849 | | No log | 98.96 | 297 | 0.8589 | 0.8 | 0.3083 | 1.3301 | 0.8000 | 0.7728 | 0.2156 | 0.0850 | | No log | 99.96 | 300 | 0.8584 | 0.8 | 
0.3083 | 1.3299 | 0.8000 | 0.7728 | 0.2079 | 0.0851 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
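For this ViT-base run, the listed per-device batch size of 16 is combined with 16 gradient-accumulation steps to reach the reported total train batch size of 256. A sketch of the corresponding `TrainingArguments`, with the output directory as a placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base_tobacco_crl",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,     # 16 x 16 = effective batch of 256
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
)
```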
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/dit-base_tobacco_crl
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-base_tobacco_crl This model is a fine-tuned version of [jordyvl/dit-base_tobacco](https://huggingface.co/jordyvl/dit-base_tobacco) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.3591 - Accuracy: 0.935 - Brier Loss: 0.1049 - Nll: 0.7583 - F1 Micro: 0.935 - F1 Macro: 0.9303 - Ece: 0.0608 - Aurc: 0.0092 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 256 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 0.96 | 3 | 0.3864 | 0.925 | 0.1155 | 1.0908 | 0.925 | 0.9187 | 0.0833 | 0.0112 | | No log | 1.96 | 6 | 0.3462 | 0.935 | 0.1104 | 1.0903 | 0.935 | 0.9308 | 0.0791 | 0.0123 | | No log | 2.96 | 9 | 0.3278 | 0.94 | 0.1121 | 1.0884 | 0.94 | 0.9399 | 0.0806 | 0.0143 | | No log | 3.96 | 12 | 0.3634 | 0.925 | 0.1178 | 1.0882 | 0.925 | 0.9213 | 0.0849 | 0.0162 | | No log | 4.96 | 15 | 0.3682 | 0.925 | 0.1187 | 1.0834 | 0.925 | 0.9213 | 0.0763 | 0.0167 | | No log | 5.96 | 18 | 0.3456 | 0.93 | 0.1133 | 1.0735 | 0.93 | 0.9247 | 0.0710 | 0.0148 | | No log | 6.96 | 21 | 0.3335 | 0.93 | 0.1107 | 1.0762 | 0.93 | 0.9236 | 0.0598 | 0.0119 | | No log | 7.96 | 24 | 0.3303 | 0.925 | 0.1131 | 1.0704 | 0.925 | 0.9204 | 0.0713 | 0.0113 | | No log | 8.96 | 27 | 0.3394 | 0.93 | 0.1110 | 1.0586 | 0.93 | 0.9246 | 0.0731 | 0.0115 | | No log | 9.96 | 30 | 0.3348 | 0.94 | 0.1067 | 1.0420 | 0.94 | 0.9374 | 0.0677 | 0.0127 | | No log | 10.96 | 33 | 0.3362 | 0.94 | 0.1047 | 1.0205 | 0.94 | 0.9375 | 0.0732 | 0.0137 | | No log | 11.96 | 36 | 0.3314 | 0.94 | 0.1042 | 1.0173 | 0.94 | 0.9390 | 0.0679 | 0.0123 | | No log | 12.96 | 39 | 0.3310 | 0.94 | 0.1089 | 1.0218 | 0.94 | 0.9390 | 0.0728 | 0.0116 | | No log | 13.96 | 42 | 0.3400 | 0.935 | 0.1100 | 1.0094 | 0.935 | 0.9361 | 0.0704 | 0.0140 | | No log | 14.96 | 45 | 0.3507 | 0.93 | 0.1142 | 0.9968 | 0.93 | 0.9292 | 0.0710 | 0.0151 | | No log | 15.96 | 48 | 0.3661 | 0.93 | 0.1178 | 0.9775 | 0.93 | 0.9315 | 0.0668 | 0.0154 | | No log | 16.96 | 51 | 0.3673 | 0.93 | 0.1149 | 0.9696 | 0.93 | 0.9315 | 0.0580 | 0.0135 | | No log | 17.96 | 54 | 0.3520 | 0.935 | 0.1112 | 0.9773 | 0.935 | 0.9317 | 0.0700 | 0.0113 | | No log | 18.96 | 57 | 0.3450 | 0.93 | 0.1074 | 0.9759 | 0.93 | 0.9273 | 0.0580 | 0.0108 | | No log | 19.96 | 60 | 0.3381 | 0.94 | 0.0998 | 0.9696 | 0.94 | 0.9445 | 0.0587 | 0.0124 | | No log | 20.96 | 63 | 0.3371 | 0.935 | 0.0995 | 0.9645 | 0.935 | 0.9346 | 0.0578 | 0.0132 | | No log | 21.96 | 66 | 0.3452 | 0.945 | 0.1014 | 0.9606 | 0.945 | 0.9446 | 0.0617 | 0.0147 | | No log | 22.96 | 69 | 0.3627 | 0.94 | 0.1034 | 0.9504 | 0.94 | 0.9416 | 0.0655 | 0.0141 | | No log | 23.96 | 72 | 0.3576 | 0.945 | 0.1025 | 0.9361 | 0.945 | 0.9445 | 0.0583 | 0.0123 
| | No log | 24.96 | 75 | 0.3528 | 0.94 | 0.1008 | 0.9230 | 0.94 | 0.9390 | 0.0610 | 0.0116 | | No log | 25.96 | 78 | 0.3514 | 0.935 | 0.1030 | 0.9219 | 0.935 | 0.9338 | 0.0609 | 0.0112 | | No log | 26.96 | 81 | 0.3560 | 0.94 | 0.1031 | 0.9334 | 0.94 | 0.9370 | 0.0623 | 0.0105 | | No log | 27.96 | 84 | 0.3613 | 0.945 | 0.1038 | 0.9425 | 0.945 | 0.9422 | 0.0570 | 0.0102 | | No log | 28.96 | 87 | 0.3701 | 0.94 | 0.1065 | 0.9366 | 0.94 | 0.9353 | 0.0605 | 0.0098 | | No log | 29.96 | 90 | 0.3682 | 0.94 | 0.1064 | 0.9261 | 0.94 | 0.9353 | 0.0662 | 0.0094 | | No log | 30.96 | 93 | 0.3583 | 0.945 | 0.1022 | 0.9012 | 0.945 | 0.9422 | 0.0591 | 0.0100 | | No log | 31.96 | 96 | 0.3649 | 0.93 | 0.1057 | 0.8904 | 0.93 | 0.9307 | 0.0649 | 0.0112 | | No log | 32.96 | 99 | 0.3597 | 0.935 | 0.1048 | 0.8894 | 0.935 | 0.9359 | 0.0613 | 0.0107 | | No log | 33.96 | 102 | 0.3589 | 0.935 | 0.1056 | 0.8950 | 0.935 | 0.9359 | 0.0657 | 0.0106 | | No log | 34.96 | 105 | 0.3595 | 0.94 | 0.1081 | 0.9086 | 0.94 | 0.9370 | 0.0614 | 0.0104 | | No log | 35.96 | 108 | 0.3626 | 0.94 | 0.1100 | 0.9047 | 0.94 | 0.9370 | 0.0622 | 0.0098 | | No log | 36.96 | 111 | 0.3520 | 0.94 | 0.1073 | 0.8958 | 0.94 | 0.9378 | 0.0579 | 0.0096 | | No log | 37.96 | 114 | 0.3451 | 0.94 | 0.1033 | 0.8793 | 0.94 | 0.9378 | 0.0542 | 0.0094 | | No log | 38.96 | 117 | 0.3427 | 0.935 | 0.1000 | 0.8685 | 0.935 | 0.9309 | 0.0604 | 0.0090 | | No log | 39.96 | 120 | 0.3391 | 0.94 | 0.0977 | 0.8597 | 0.94 | 0.9353 | 0.0548 | 0.0089 | | No log | 40.96 | 123 | 0.3363 | 0.95 | 0.0964 | 0.8537 | 0.9500 | 0.9522 | 0.0576 | 0.0088 | | No log | 41.96 | 126 | 0.3458 | 0.95 | 0.1015 | 0.8524 | 0.9500 | 0.9522 | 0.0590 | 0.0087 | | No log | 42.96 | 129 | 0.3618 | 0.935 | 0.1099 | 0.8628 | 0.935 | 0.9406 | 0.0640 | 0.0094 | | No log | 43.96 | 132 | 0.3631 | 0.935 | 0.1109 | 0.8657 | 0.935 | 0.9406 | 0.0611 | 0.0093 | | No log | 44.96 | 135 | 0.3622 | 0.94 | 0.1076 | 0.8571 | 0.94 | 0.9459 | 0.0555 | 0.0087 | | No log | 45.96 | 138 | 0.3654 | 0.94 | 0.1066 | 0.8452 | 0.94 | 0.9459 | 0.0546 | 0.0083 | | No log | 46.96 | 141 | 0.3672 | 0.935 | 0.1104 | 0.8382 | 0.935 | 0.9431 | 0.0588 | 0.0084 | | No log | 47.96 | 144 | 0.3624 | 0.94 | 0.1081 | 0.8342 | 0.94 | 0.9482 | 0.0611 | 0.0085 | | No log | 48.96 | 147 | 0.3603 | 0.945 | 0.1051 | 0.8251 | 0.945 | 0.9515 | 0.0590 | 0.0084 | | No log | 49.96 | 150 | 0.3540 | 0.945 | 0.1013 | 0.8215 | 0.945 | 0.9515 | 0.0571 | 0.0082 | | No log | 50.96 | 153 | 0.3538 | 0.945 | 0.0996 | 0.8169 | 0.945 | 0.9515 | 0.0600 | 0.0083 | | No log | 51.96 | 156 | 0.3588 | 0.945 | 0.1014 | 0.8110 | 0.945 | 0.9515 | 0.0557 | 0.0082 | | No log | 52.96 | 159 | 0.3600 | 0.945 | 0.1029 | 0.8046 | 0.945 | 0.9515 | 0.0597 | 0.0085 | | No log | 53.96 | 162 | 0.3615 | 0.94 | 0.1037 | 0.7998 | 0.94 | 0.9398 | 0.0556 | 0.0083 | | No log | 54.96 | 165 | 0.3589 | 0.94 | 0.1037 | 0.7987 | 0.94 | 0.9398 | 0.0562 | 0.0082 | | No log | 55.96 | 168 | 0.3582 | 0.945 | 0.1037 | 0.7922 | 0.945 | 0.9515 | 0.0570 | 0.0083 | | No log | 56.96 | 171 | 0.3695 | 0.945 | 0.1032 | 0.7887 | 0.945 | 0.9515 | 0.0590 | 0.0092 | | No log | 57.96 | 174 | 0.3764 | 0.945 | 0.1008 | 0.7897 | 0.945 | 0.9490 | 0.0637 | 0.0098 | | No log | 58.96 | 177 | 0.3719 | 0.955 | 0.0990 | 0.7909 | 0.955 | 0.9575 | 0.0616 | 0.0099 | | No log | 59.96 | 180 | 0.3613 | 0.95 | 0.0977 | 0.7942 | 0.9500 | 0.9543 | 0.0550 | 0.0096 | | No log | 60.96 | 183 | 0.3593 | 0.95 | 0.0984 | 0.7958 | 0.9500 | 0.9543 | 0.0573 | 0.0100 | | No log | 61.96 | 186 | 0.3583 | 0.945 | 0.1005 | 0.7982 | 0.945 | 0.9512 | 
0.0542 | 0.0102 | | No log | 62.96 | 189 | 0.3594 | 0.94 | 0.1027 | 0.7994 | 0.94 | 0.9483 | 0.0558 | 0.0101 | | No log | 63.96 | 192 | 0.3627 | 0.94 | 0.1042 | 0.7976 | 0.94 | 0.9483 | 0.0554 | 0.0101 | | No log | 64.96 | 195 | 0.3668 | 0.94 | 0.1053 | 0.7944 | 0.94 | 0.9483 | 0.0560 | 0.0104 | | No log | 65.96 | 198 | 0.3685 | 0.94 | 0.1063 | 0.7914 | 0.94 | 0.9483 | 0.0548 | 0.0104 | | No log | 66.96 | 201 | 0.3701 | 0.94 | 0.1060 | 0.7880 | 0.94 | 0.9483 | 0.0563 | 0.0105 | | No log | 67.96 | 204 | 0.3668 | 0.94 | 0.1047 | 0.7872 | 0.94 | 0.9483 | 0.0555 | 0.0111 | | No log | 68.96 | 207 | 0.3682 | 0.945 | 0.1046 | 0.7863 | 0.945 | 0.9512 | 0.0587 | 0.0115 | | No log | 69.96 | 210 | 0.3692 | 0.94 | 0.1054 | 0.7855 | 0.94 | 0.9425 | 0.0584 | 0.0117 | | No log | 70.96 | 213 | 0.3734 | 0.94 | 0.1071 | 0.7829 | 0.94 | 0.9425 | 0.0582 | 0.0116 | | No log | 71.96 | 216 | 0.3733 | 0.94 | 0.1079 | 0.7802 | 0.94 | 0.9425 | 0.0576 | 0.0116 | | No log | 72.96 | 219 | 0.3720 | 0.935 | 0.1081 | 0.7769 | 0.935 | 0.9396 | 0.0579 | 0.0109 | | No log | 73.96 | 222 | 0.3689 | 0.93 | 0.1074 | 0.7723 | 0.93 | 0.9296 | 0.0652 | 0.0096 | | No log | 74.96 | 225 | 0.3687 | 0.935 | 0.1057 | 0.7701 | 0.935 | 0.9327 | 0.0634 | 0.0090 | | No log | 75.96 | 228 | 0.3672 | 0.935 | 0.1053 | 0.7723 | 0.935 | 0.9327 | 0.0606 | 0.0083 | | No log | 76.96 | 231 | 0.3649 | 0.935 | 0.1051 | 0.7772 | 0.935 | 0.9303 | 0.0575 | 0.0084 | | No log | 77.96 | 234 | 0.3654 | 0.935 | 0.1055 | 0.7806 | 0.935 | 0.9303 | 0.0633 | 0.0084 | | No log | 78.96 | 237 | 0.3630 | 0.935 | 0.1048 | 0.7823 | 0.935 | 0.9312 | 0.0647 | 0.0086 | | No log | 79.96 | 240 | 0.3613 | 0.935 | 0.1036 | 0.7819 | 0.935 | 0.9312 | 0.0552 | 0.0088 | | No log | 80.96 | 243 | 0.3578 | 0.94 | 0.1020 | 0.7808 | 0.94 | 0.9356 | 0.0550 | 0.0086 | | No log | 81.96 | 246 | 0.3561 | 0.94 | 0.1009 | 0.7794 | 0.94 | 0.9356 | 0.0551 | 0.0086 | | No log | 82.96 | 249 | 0.3573 | 0.935 | 0.1010 | 0.7767 | 0.935 | 0.9303 | 0.0557 | 0.0088 | | No log | 83.96 | 252 | 0.3574 | 0.935 | 0.1010 | 0.7744 | 0.935 | 0.9303 | 0.0556 | 0.0089 | | No log | 84.96 | 255 | 0.3574 | 0.935 | 0.1004 | 0.7696 | 0.935 | 0.9303 | 0.0562 | 0.0091 | | No log | 85.96 | 258 | 0.3578 | 0.935 | 0.1003 | 0.7659 | 0.935 | 0.9303 | 0.0593 | 0.0088 | | No log | 86.96 | 261 | 0.3569 | 0.935 | 0.0994 | 0.7640 | 0.935 | 0.9303 | 0.0599 | 0.0090 | | No log | 87.96 | 264 | 0.3541 | 0.94 | 0.0984 | 0.7629 | 0.94 | 0.9390 | 0.0592 | 0.0094 | | No log | 88.96 | 267 | 0.3529 | 0.94 | 0.0982 | 0.7613 | 0.94 | 0.9390 | 0.0555 | 0.0095 | | No log | 89.96 | 270 | 0.3517 | 0.94 | 0.0984 | 0.7598 | 0.94 | 0.9390 | 0.0514 | 0.0095 | | No log | 90.96 | 273 | 0.3528 | 0.94 | 0.0992 | 0.7586 | 0.94 | 0.9390 | 0.0516 | 0.0096 | | No log | 91.96 | 276 | 0.3539 | 0.94 | 0.1004 | 0.7581 | 0.94 | 0.9390 | 0.0518 | 0.0095 | | No log | 92.96 | 279 | 0.3551 | 0.94 | 0.1016 | 0.7583 | 0.94 | 0.9390 | 0.0559 | 0.0095 | | No log | 93.96 | 282 | 0.3565 | 0.94 | 0.1028 | 0.7586 | 0.94 | 0.9390 | 0.0554 | 0.0093 | | No log | 94.96 | 285 | 0.3572 | 0.94 | 0.1035 | 0.7588 | 0.94 | 0.9390 | 0.0554 | 0.0093 | | No log | 95.96 | 288 | 0.3586 | 0.94 | 0.1041 | 0.7587 | 0.94 | 0.9390 | 0.0556 | 0.0092 | | No log | 96.96 | 291 | 0.3586 | 0.94 | 0.1044 | 0.7586 | 0.94 | 0.9390 | 0.0607 | 0.0092 | | No log | 97.96 | 294 | 0.3589 | 0.94 | 0.1047 | 0.7585 | 0.94 | 0.9390 | 0.0608 | 0.0092 | | No log | 98.96 | 297 | 0.3590 | 0.935 | 0.1049 | 0.7584 | 0.935 | 0.9303 | 0.0608 | 0.0092 | | No log | 99.96 | 300 | 0.3591 | 0.935 | 0.1049 | 0.7583 | 0.935 | 
0.9303 | 0.0608 | 0.0092 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/vit-base_rvl_tobacco_crl
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl_tobacco_crl This model is a fine-tuned version of [jordyvl/vit-base_rvl-cdip](https://huggingface.co/jordyvl/vit-base_rvl-cdip) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.5075 - Accuracy: 0.92 - Brier Loss: 0.1544 - Nll: 0.6650 - F1 Micro: 0.92 - F1 Macro: 0.9150 - Ece: 0.1721 - Aurc: 0.0193 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 256 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 0.96 | 3 | 2.3823 | 0.045 | 0.9050 | 9.6078 | 0.045 | 0.0481 | 0.1570 | 0.9673 | | No log | 1.96 | 6 | 2.3642 | 0.05 | 0.9005 | 8.5700 | 0.0500 | 0.0549 | 0.1567 | 0.9599 | | No log | 2.96 | 9 | 2.3130 | 0.095 | 0.8925 | 6.9490 | 0.095 | 0.0853 | 0.1833 | 0.9127 | | No log | 3.96 | 12 | 2.2603 | 0.265 | 0.8804 | 5.6508 | 0.265 | 0.1642 | 0.2794 | 0.7458 | | No log | 4.96 | 15 | 2.2077 | 0.38 | 0.8637 | 4.0696 | 0.38 | 0.2272 | 0.3548 | 0.4172 | | No log | 5.96 | 18 | 2.1176 | 0.47 | 0.8411 | 2.4954 | 0.47 | 0.3062 | 0.4299 | 0.2410 | | No log | 6.96 | 21 | 2.0268 | 0.64 | 0.8132 | 2.0526 | 0.64 | 0.5126 | 0.5273 | 0.1330 | | No log | 7.96 | 24 | 1.9258 | 0.735 | 0.7792 | 1.7187 | 0.735 | 0.6337 | 0.5870 | 0.0787 | | No log | 8.96 | 27 | 1.8114 | 0.77 | 0.7409 | 1.3797 | 0.7700 | 0.6746 | 0.6034 | 0.0556 | | No log | 9.96 | 30 | 1.7062 | 0.8 | 0.6999 | 1.1402 | 0.8000 | 0.7266 | 0.6005 | 0.0466 | | No log | 10.96 | 33 | 1.5916 | 0.825 | 0.6548 | 0.9516 | 0.825 | 0.7706 | 0.5882 | 0.0427 | | No log | 11.96 | 36 | 1.4855 | 0.86 | 0.6103 | 0.8848 | 0.8600 | 0.8201 | 0.5829 | 0.0388 | | No log | 12.96 | 39 | 1.3944 | 0.87 | 0.5688 | 0.7924 | 0.87 | 0.8361 | 0.5720 | 0.0349 | | No log | 13.96 | 42 | 1.3176 | 0.895 | 0.5326 | 0.6952 | 0.895 | 0.8740 | 0.5576 | 0.0324 | | No log | 14.96 | 45 | 1.2435 | 0.9 | 0.4978 | 0.6632 | 0.9 | 0.8838 | 0.5370 | 0.0293 | | No log | 15.96 | 48 | 1.1760 | 0.915 | 0.4653 | 0.6368 | 0.915 | 0.9034 | 0.5272 | 0.0257 | | No log | 16.96 | 51 | 1.1101 | 0.915 | 0.4338 | 0.6194 | 0.915 | 0.9011 | 0.4963 | 0.0241 | | No log | 17.96 | 54 | 1.0518 | 0.915 | 0.4058 | 0.6131 | 0.915 | 0.9011 | 0.4750 | 0.0231 | | No log | 18.96 | 57 | 1.0011 | 0.915 | 0.3808 | 0.6125 | 0.915 | 0.9011 | 0.4479 | 0.0222 | | No log | 19.96 | 60 | 0.9471 | 0.92 | 0.3566 | 0.5890 | 0.92 | 0.9102 | 0.4353 | 0.0203 | | No log | 20.96 | 63 | 0.8962 | 0.915 | 0.3352 | 0.5856 | 0.915 | 0.9047 | 0.4245 | 0.0185 | | No log | 21.96 | 66 | 0.8635 | 0.92 | 0.3159 | 0.5865 | 0.92 | 0.9115 | 0.3999 | 0.0192 | | No log | 22.96 | 69 | 0.8333 | 0.93 | 0.2987 | 0.5791 | 0.93 | 0.9260 | 0.3917 | 0.0189 | | No log | 23.96 | 72 | 0.8079 | 0.925 | 0.2839 | 0.5871 | 0.925 | 0.9159 | 
0.3733 | 0.0173 | | No log | 24.96 | 75 | 0.7644 | 0.93 | 0.2681 | 0.5755 | 0.93 | 0.9233 | 0.3644 | 0.0198 | | No log | 25.96 | 78 | 0.7443 | 0.925 | 0.2567 | 0.5750 | 0.925 | 0.9204 | 0.3419 | 0.0193 | | No log | 26.96 | 81 | 0.7250 | 0.93 | 0.2461 | 0.5722 | 0.93 | 0.9227 | 0.3345 | 0.0176 | | No log | 27.96 | 84 | 0.6988 | 0.93 | 0.2344 | 0.5118 | 0.93 | 0.9227 | 0.3151 | 0.0172 | | No log | 28.96 | 87 | 0.6923 | 0.935 | 0.2272 | 0.5730 | 0.935 | 0.9303 | 0.3162 | 0.0175 | | No log | 29.96 | 90 | 0.6752 | 0.935 | 0.2196 | 0.5646 | 0.935 | 0.9303 | 0.3016 | 0.0179 | | No log | 30.96 | 93 | 0.6576 | 0.93 | 0.2117 | 0.5554 | 0.93 | 0.9227 | 0.2934 | 0.0188 | | No log | 31.96 | 96 | 0.6476 | 0.93 | 0.2073 | 0.5617 | 0.93 | 0.9227 | 0.2867 | 0.0193 | | No log | 32.96 | 99 | 0.6349 | 0.93 | 0.2009 | 0.5648 | 0.93 | 0.9245 | 0.2818 | 0.0178 | | No log | 33.96 | 102 | 0.6195 | 0.92 | 0.1949 | 0.6098 | 0.92 | 0.9140 | 0.2612 | 0.0185 | | No log | 34.96 | 105 | 0.6158 | 0.92 | 0.1921 | 0.6190 | 0.92 | 0.9140 | 0.2659 | 0.0184 | | No log | 35.96 | 108 | 0.6093 | 0.93 | 0.1891 | 0.6182 | 0.93 | 0.9273 | 0.2616 | 0.0187 | | No log | 36.96 | 111 | 0.6007 | 0.925 | 0.1854 | 0.6169 | 0.925 | 0.9170 | 0.2561 | 0.0182 | | No log | 37.96 | 114 | 0.5877 | 0.925 | 0.1815 | 0.5400 | 0.925 | 0.9170 | 0.2575 | 0.0179 | | No log | 38.96 | 117 | 0.5887 | 0.925 | 0.1793 | 0.6079 | 0.925 | 0.9170 | 0.2544 | 0.0188 | | No log | 39.96 | 120 | 0.5865 | 0.915 | 0.1775 | 0.6123 | 0.915 | 0.9107 | 0.2510 | 0.0192 | | No log | 40.96 | 123 | 0.5753 | 0.925 | 0.1738 | 0.5984 | 0.925 | 0.9230 | 0.2323 | 0.0190 | | No log | 41.96 | 126 | 0.5727 | 0.92 | 0.1738 | 0.5394 | 0.92 | 0.9140 | 0.2305 | 0.0184 | | No log | 42.96 | 129 | 0.5644 | 0.92 | 0.1724 | 0.5476 | 0.92 | 0.9140 | 0.2276 | 0.0186 | | No log | 43.96 | 132 | 0.5597 | 0.92 | 0.1703 | 0.6031 | 0.92 | 0.9140 | 0.2285 | 0.0194 | | No log | 44.96 | 135 | 0.5597 | 0.92 | 0.1688 | 0.6026 | 0.92 | 0.9140 | 0.2216 | 0.0187 | | No log | 45.96 | 138 | 0.5580 | 0.925 | 0.1676 | 0.6051 | 0.925 | 0.9170 | 0.2194 | 0.0187 | | No log | 46.96 | 141 | 0.5541 | 0.925 | 0.1658 | 0.6063 | 0.925 | 0.9170 | 0.2252 | 0.0184 | | No log | 47.96 | 144 | 0.5533 | 0.925 | 0.1654 | 0.6153 | 0.925 | 0.9170 | 0.2164 | 0.0183 | | No log | 48.96 | 147 | 0.5464 | 0.925 | 0.1629 | 0.6085 | 0.925 | 0.9170 | 0.2225 | 0.0183 | | No log | 49.96 | 150 | 0.5407 | 0.925 | 0.1612 | 0.5988 | 0.925 | 0.9170 | 0.2187 | 0.0179 | | No log | 50.96 | 153 | 0.5432 | 0.92 | 0.1625 | 0.6095 | 0.92 | 0.9150 | 0.2040 | 0.0177 | | No log | 51.96 | 156 | 0.5425 | 0.915 | 0.1648 | 0.6964 | 0.915 | 0.9118 | 0.1977 | 0.0182 | | No log | 52.96 | 159 | 0.5376 | 0.915 | 0.1623 | 0.6959 | 0.915 | 0.9118 | 0.2129 | 0.0192 | | No log | 53.96 | 162 | 0.5299 | 0.915 | 0.1596 | 0.6710 | 0.915 | 0.9118 | 0.2120 | 0.0194 | | No log | 54.96 | 165 | 0.5240 | 0.92 | 0.1579 | 0.6072 | 0.92 | 0.9150 | 0.2076 | 0.0183 | | No log | 55.96 | 168 | 0.5297 | 0.92 | 0.1583 | 0.6704 | 0.92 | 0.9150 | 0.1997 | 0.0182 | | No log | 56.96 | 171 | 0.5307 | 0.915 | 0.1585 | 0.6782 | 0.915 | 0.9118 | 0.2091 | 0.0187 | | No log | 57.96 | 174 | 0.5257 | 0.925 | 0.1566 | 0.6692 | 0.925 | 0.9180 | 0.1970 | 0.0193 | | No log | 58.96 | 177 | 0.5281 | 0.925 | 0.1576 | 0.6703 | 0.925 | 0.9180 | 0.2007 | 0.0182 | | No log | 59.96 | 180 | 0.5282 | 0.92 | 0.1579 | 0.6690 | 0.92 | 0.9150 | 0.1842 | 0.0185 | | No log | 60.96 | 183 | 0.5212 | 0.92 | 0.1573 | 0.6672 | 0.92 | 0.9150 | 0.1957 | 0.0189 | | No log | 61.96 | 186 | 0.5203 | 0.92 | 0.1554 | 0.6655 | 0.92 | 
0.9207 | 0.1918 | 0.0199 | | No log | 62.96 | 189 | 0.5166 | 0.915 | 0.1557 | 0.6689 | 0.915 | 0.9118 | 0.1817 | 0.0195 | | No log | 63.96 | 192 | 0.5168 | 0.915 | 0.1556 | 0.6695 | 0.915 | 0.9118 | 0.1895 | 0.0191 | | No log | 64.96 | 195 | 0.5153 | 0.915 | 0.1547 | 0.6661 | 0.915 | 0.9118 | 0.1879 | 0.0188 | | No log | 65.96 | 198 | 0.5157 | 0.915 | 0.1545 | 0.6665 | 0.915 | 0.9118 | 0.1890 | 0.0191 | | No log | 66.96 | 201 | 0.5181 | 0.915 | 0.1549 | 0.6703 | 0.915 | 0.9118 | 0.1890 | 0.0191 | | No log | 67.96 | 204 | 0.5168 | 0.915 | 0.1542 | 0.6686 | 0.915 | 0.9118 | 0.1882 | 0.0193 | | No log | 68.96 | 207 | 0.5120 | 0.93 | 0.1532 | 0.6643 | 0.93 | 0.9269 | 0.1901 | 0.0195 | | No log | 69.96 | 210 | 0.5091 | 0.92 | 0.1528 | 0.6596 | 0.92 | 0.9150 | 0.1866 | 0.0194 | | No log | 70.96 | 213 | 0.5093 | 0.92 | 0.1526 | 0.6607 | 0.92 | 0.9150 | 0.1847 | 0.0182 | | No log | 71.96 | 216 | 0.5143 | 0.925 | 0.1538 | 0.6675 | 0.925 | 0.9180 | 0.1789 | 0.0180 | | No log | 72.96 | 219 | 0.5145 | 0.925 | 0.1550 | 0.6728 | 0.925 | 0.9180 | 0.1765 | 0.0187 | | No log | 73.96 | 222 | 0.5090 | 0.92 | 0.1540 | 0.6658 | 0.92 | 0.9150 | 0.1904 | 0.0191 | | No log | 74.96 | 225 | 0.5069 | 0.92 | 0.1530 | 0.6606 | 0.92 | 0.9150 | 0.1840 | 0.0189 | | No log | 75.96 | 228 | 0.5051 | 0.92 | 0.1524 | 0.6624 | 0.92 | 0.9150 | 0.1925 | 0.0186 | | No log | 76.96 | 231 | 0.5089 | 0.92 | 0.1539 | 0.6698 | 0.92 | 0.9150 | 0.1759 | 0.0189 | | No log | 77.96 | 234 | 0.5053 | 0.92 | 0.1528 | 0.6647 | 0.92 | 0.9150 | 0.1748 | 0.0188 | | No log | 78.96 | 237 | 0.5028 | 0.92 | 0.1524 | 0.6598 | 0.92 | 0.9150 | 0.1821 | 0.0182 | | No log | 79.96 | 240 | 0.5043 | 0.92 | 0.1527 | 0.6615 | 0.92 | 0.9150 | 0.1810 | 0.0181 | | No log | 80.96 | 243 | 0.5014 | 0.92 | 0.1523 | 0.6622 | 0.92 | 0.9150 | 0.1733 | 0.0184 | | No log | 81.96 | 246 | 0.5035 | 0.92 | 0.1531 | 0.6635 | 0.92 | 0.9150 | 0.1791 | 0.0183 | | No log | 82.96 | 249 | 0.5052 | 0.92 | 0.1538 | 0.6669 | 0.92 | 0.9150 | 0.1799 | 0.0186 | | No log | 83.96 | 252 | 0.5040 | 0.92 | 0.1533 | 0.6640 | 0.92 | 0.9150 | 0.1833 | 0.0188 | | No log | 84.96 | 255 | 0.5008 | 0.92 | 0.1530 | 0.6588 | 0.92 | 0.9150 | 0.1735 | 0.0188 | | No log | 85.96 | 258 | 0.5027 | 0.915 | 0.1538 | 0.6599 | 0.915 | 0.9121 | 0.1751 | 0.0187 | | No log | 86.96 | 261 | 0.5075 | 0.915 | 0.1551 | 0.6661 | 0.915 | 0.9121 | 0.1684 | 0.0187 | | No log | 87.96 | 264 | 0.5107 | 0.92 | 0.1555 | 0.6734 | 0.92 | 0.9150 | 0.1748 | 0.0186 | | No log | 88.96 | 267 | 0.5035 | 0.92 | 0.1534 | 0.6676 | 0.92 | 0.9150 | 0.1810 | 0.0192 | | No log | 89.96 | 270 | 0.5006 | 0.92 | 0.1523 | 0.6624 | 0.92 | 0.9150 | 0.1867 | 0.0200 | | No log | 90.96 | 273 | 0.4984 | 0.92 | 0.1521 | 0.6605 | 0.92 | 0.9150 | 0.1704 | 0.0201 | | No log | 91.96 | 276 | 0.4976 | 0.92 | 0.1518 | 0.6586 | 0.92 | 0.9150 | 0.1702 | 0.0201 | | No log | 92.96 | 279 | 0.4986 | 0.92 | 0.1520 | 0.6584 | 0.92 | 0.9150 | 0.1701 | 0.0201 | | No log | 93.96 | 282 | 0.5005 | 0.92 | 0.1526 | 0.6596 | 0.92 | 0.9150 | 0.1714 | 0.0201 | | No log | 94.96 | 285 | 0.5025 | 0.92 | 0.1533 | 0.6614 | 0.92 | 0.9150 | 0.1820 | 0.0202 | | No log | 95.96 | 288 | 0.5043 | 0.92 | 0.1539 | 0.6634 | 0.92 | 0.9150 | 0.1721 | 0.0195 | | No log | 96.96 | 291 | 0.5056 | 0.92 | 0.1542 | 0.6644 | 0.92 | 0.9150 | 0.1783 | 0.0194 | | No log | 97.96 | 294 | 0.5075 | 0.92 | 0.1544 | 0.6648 | 0.92 | 0.9150 | 0.1723 | 0.0194 | | No log | 98.96 | 297 | 0.5077 | 0.92 | 0.1544 | 0.6649 | 0.92 | 0.9150 | 0.1722 | 0.0194 | | No log | 99.96 | 300 | 0.5075 | 0.92 | 0.1544 | 0.6650 | 0.92 | 
0.9150 | 0.1721 | 0.0193 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/dit-finetuned_rvl_tobacco_crl
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-finetuned_rvl_tobacco_crl This model is a fine-tuned version of [microsoft/dit-base-finetuned-rvlcdip](https://huggingface.co/microsoft/dit-base-finetuned-rvlcdip) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.5587 - Accuracy: 0.935 - Brier Loss: 0.1614 - Nll: 0.7604 - F1 Micro: 0.935 - F1 Macro: 0.9244 - Ece: 0.2320 - Aurc: 0.0099 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 256 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 0.96 | 3 | 2.3702 | 0.005 | 0.9059 | 8.8946 | 0.005 | 0.0048 | 0.1391 | 0.9945 | | No log | 1.96 | 6 | 2.3612 | 0.005 | 0.9052 | 8.8523 | 0.005 | 0.0048 | 0.1390 | 0.9942 | | No log | 2.96 | 9 | 2.3406 | 0.005 | 0.9038 | 8.5993 | 0.005 | 0.0048 | 0.1389 | 0.9939 | | No log | 3.96 | 12 | 2.3415 | 0.01 | 0.9018 | 7.9436 | 0.01 | 0.0091 | 0.1413 | 0.9925 | | No log | 4.96 | 15 | 2.3297 | 0.02 | 0.8991 | 7.1590 | 0.02 | 0.0303 | 0.1461 | 0.9874 | | No log | 5.96 | 18 | 2.3164 | 0.05 | 0.8958 | 6.6598 | 0.0500 | 0.0527 | 0.1638 | 0.9769 | | No log | 6.96 | 21 | 2.3021 | 0.16 | 0.8918 | 6.3655 | 0.16 | 0.1028 | 0.2371 | 0.9458 | | No log | 7.96 | 24 | 2.2776 | 0.18 | 0.8869 | 6.0316 | 0.18 | 0.1076 | 0.2436 | 0.9253 | | No log | 8.96 | 27 | 2.2639 | 0.195 | 0.8811 | 4.6385 | 0.195 | 0.1154 | 0.2533 | 0.8971 | | No log | 9.96 | 30 | 2.2388 | 0.215 | 0.8736 | 3.3475 | 0.2150 | 0.1273 | 0.2506 | 0.8034 | | No log | 10.96 | 33 | 2.2053 | 0.3 | 0.8635 | 2.6087 | 0.3 | 0.1896 | 0.2977 | 0.6413 | | No log | 11.96 | 36 | 2.1526 | 0.39 | 0.8496 | 2.2967 | 0.39 | 0.2387 | 0.3672 | 0.4311 | | No log | 12.96 | 39 | 2.1007 | 0.475 | 0.8335 | 1.8576 | 0.4750 | 0.3168 | 0.4171 | 0.3033 | | No log | 13.96 | 42 | 2.0444 | 0.575 | 0.8173 | 1.4725 | 0.575 | 0.3985 | 0.4782 | 0.2201 | | No log | 14.96 | 45 | 1.9806 | 0.6 | 0.7977 | 1.2973 | 0.6 | 0.4402 | 0.5055 | 0.1902 | | No log | 15.96 | 48 | 1.9183 | 0.645 | 0.7791 | 1.2239 | 0.645 | 0.4909 | 0.5235 | 0.1422 | | No log | 16.96 | 51 | 1.8671 | 0.705 | 0.7619 | 1.1838 | 0.705 | 0.5916 | 0.5620 | 0.1137 | | No log | 17.96 | 54 | 1.8104 | 0.785 | 0.7434 | 1.0809 | 0.785 | 0.6794 | 0.6135 | 0.0684 | | No log | 18.96 | 57 | 1.7607 | 0.805 | 0.7265 | 1.0477 | 0.805 | 0.6970 | 0.6211 | 0.0550 | | No log | 19.96 | 60 | 1.7100 | 0.825 | 0.7079 | 1.0221 | 0.825 | 0.7236 | 0.6304 | 0.0478 | | No log | 20.96 | 63 | 1.6615 | 0.825 | 0.6892 | 1.0108 | 0.825 | 0.7236 | 0.6201 | 0.0445 | | No log | 21.96 | 66 | 1.6118 | 0.845 | 0.6705 | 0.9972 | 0.845 | 0.7428 | 0.6329 | 0.0364 | | No log | 22.96 | 69 | 1.5710 | 0.845 | 0.6520 | 0.9833 | 0.845 | 0.7417 | 0.6179 | 0.0337 | | No log | 23.96 | 72 | 1.5225 | 0.85 | 0.6329 
| 0.9634 | 0.85 | 0.7421 | 0.6078 | 0.0306 | | No log | 24.96 | 75 | 1.4797 | 0.865 | 0.6132 | 0.9521 | 0.865 | 0.7643 | 0.6058 | 0.0256 | | No log | 25.96 | 78 | 1.4397 | 0.865 | 0.5934 | 0.9487 | 0.865 | 0.7643 | 0.5917 | 0.0249 | | No log | 26.96 | 81 | 1.4028 | 0.87 | 0.5735 | 0.9501 | 0.87 | 0.7708 | 0.5875 | 0.0229 | | No log | 27.96 | 84 | 1.3612 | 0.87 | 0.5537 | 0.9488 | 0.87 | 0.7708 | 0.5747 | 0.0225 | | No log | 28.96 | 87 | 1.3246 | 0.87 | 0.5347 | 0.9884 | 0.87 | 0.7756 | 0.5592 | 0.0220 | | No log | 29.96 | 90 | 1.2879 | 0.87 | 0.5163 | 0.9824 | 0.87 | 0.7759 | 0.5428 | 0.0225 | | No log | 30.96 | 93 | 1.2546 | 0.87 | 0.4993 | 0.9798 | 0.87 | 0.7752 | 0.5337 | 0.0225 | | No log | 31.96 | 96 | 1.2207 | 0.875 | 0.4815 | 0.9755 | 0.875 | 0.7822 | 0.5094 | 0.0218 | | No log | 32.96 | 99 | 1.1855 | 0.88 | 0.4628 | 0.9779 | 0.88 | 0.8016 | 0.5062 | 0.0206 | | No log | 33.96 | 102 | 1.1557 | 0.875 | 0.4467 | 1.0389 | 0.875 | 0.7946 | 0.4862 | 0.0210 | | No log | 34.96 | 105 | 1.1322 | 0.885 | 0.4327 | 0.9684 | 0.885 | 0.8176 | 0.4812 | 0.0210 | | No log | 35.96 | 108 | 1.1061 | 0.895 | 0.4176 | 0.9561 | 0.895 | 0.8405 | 0.4697 | 0.0206 | | No log | 36.96 | 111 | 1.0796 | 0.9 | 0.4027 | 0.9468 | 0.9 | 0.8513 | 0.4678 | 0.0203 | | No log | 37.96 | 114 | 1.0579 | 0.91 | 0.3907 | 0.8753 | 0.91 | 0.8753 | 0.4617 | 0.0195 | | No log | 38.96 | 117 | 1.0277 | 0.91 | 0.3774 | 0.8706 | 0.91 | 0.8772 | 0.4447 | 0.0187 | | No log | 39.96 | 120 | 1.0031 | 0.915 | 0.3647 | 0.8547 | 0.915 | 0.8837 | 0.4374 | 0.0175 | | No log | 40.96 | 123 | 0.9803 | 0.925 | 0.3535 | 0.8474 | 0.925 | 0.9037 | 0.4327 | 0.0172 | | No log | 41.96 | 126 | 0.9621 | 0.92 | 0.3440 | 0.8505 | 0.92 | 0.8985 | 0.4129 | 0.0182 | | No log | 42.96 | 129 | 0.9428 | 0.91 | 0.3347 | 0.8515 | 0.91 | 0.8846 | 0.3943 | 0.0191 | | No log | 43.96 | 132 | 0.9231 | 0.92 | 0.3249 | 0.8403 | 0.92 | 0.9003 | 0.4079 | 0.0177 | | No log | 44.96 | 135 | 0.9075 | 0.93 | 0.3159 | 0.8224 | 0.93 | 0.9139 | 0.4073 | 0.0167 | | No log | 45.96 | 138 | 0.8876 | 0.925 | 0.3073 | 0.8091 | 0.925 | 0.9096 | 0.3878 | 0.0173 | | No log | 46.96 | 141 | 0.8799 | 0.93 | 0.2977 | 0.8091 | 0.93 | 0.9148 | 0.3785 | 0.0161 | | No log | 47.96 | 144 | 0.8567 | 0.915 | 0.2901 | 0.8123 | 0.915 | 0.8922 | 0.3693 | 0.0173 | | No log | 48.96 | 147 | 0.8430 | 0.92 | 0.2837 | 0.8045 | 0.92 | 0.9055 | 0.3525 | 0.0177 | | No log | 49.96 | 150 | 0.8270 | 0.925 | 0.2764 | 0.7970 | 0.925 | 0.9132 | 0.3499 | 0.0172 | | No log | 50.96 | 153 | 0.8168 | 0.925 | 0.2685 | 0.7991 | 0.925 | 0.9132 | 0.3417 | 0.0164 | | No log | 51.96 | 156 | 0.7975 | 0.93 | 0.2598 | 0.7987 | 0.93 | 0.9184 | 0.3379 | 0.0148 | | No log | 52.96 | 159 | 0.7821 | 0.935 | 0.2522 | 0.7911 | 0.935 | 0.9245 | 0.3345 | 0.0137 | | No log | 53.96 | 162 | 0.7693 | 0.935 | 0.2468 | 0.7805 | 0.935 | 0.9245 | 0.3423 | 0.0135 | | No log | 54.96 | 165 | 0.7486 | 0.93 | 0.2416 | 0.7829 | 0.93 | 0.9195 | 0.3272 | 0.0142 | | No log | 55.96 | 168 | 0.7409 | 0.93 | 0.2381 | 0.7833 | 0.93 | 0.9195 | 0.3216 | 0.0144 | | No log | 56.96 | 171 | 0.7290 | 0.93 | 0.2327 | 0.7815 | 0.93 | 0.9195 | 0.3055 | 0.0138 | | No log | 57.96 | 174 | 0.7137 | 0.935 | 0.2268 | 0.7757 | 0.935 | 0.9245 | 0.3039 | 0.0128 | | No log | 58.96 | 177 | 0.7026 | 0.935 | 0.2214 | 0.7698 | 0.935 | 0.9245 | 0.2911 | 0.0122 | | No log | 59.96 | 180 | 0.6935 | 0.935 | 0.2168 | 0.7604 | 0.935 | 0.9245 | 0.2853 | 0.0119 | | No log | 60.96 | 183 | 0.6855 | 0.935 | 0.2134 | 0.7605 | 0.935 | 0.9245 | 0.2895 | 0.0117 | | No log | 61.96 | 186 | 0.6755 | 0.94 | 
0.2094 | 0.8161 | 0.94 | 0.9330 | 0.2902 | 0.0114 | | No log | 62.96 | 189 | 0.6641 | 0.94 | 0.2046 | 0.8131 | 0.94 | 0.9330 | 0.2761 | 0.0108 | | No log | 63.96 | 192 | 0.6536 | 0.94 | 0.2000 | 0.8113 | 0.94 | 0.9330 | 0.2865 | 0.0104 | | No log | 64.96 | 195 | 0.6441 | 0.94 | 0.1964 | 0.8071 | 0.94 | 0.9330 | 0.2739 | 0.0103 | | No log | 65.96 | 198 | 0.6395 | 0.94 | 0.1937 | 0.7997 | 0.94 | 0.9330 | 0.2771 | 0.0102 | | No log | 66.96 | 201 | 0.6345 | 0.94 | 0.1915 | 0.7930 | 0.94 | 0.9330 | 0.2764 | 0.0104 | | No log | 67.96 | 204 | 0.6355 | 0.94 | 0.1901 | 0.7901 | 0.94 | 0.9330 | 0.2763 | 0.0105 | | No log | 68.96 | 207 | 0.6302 | 0.94 | 0.1880 | 0.7887 | 0.94 | 0.9330 | 0.2631 | 0.0108 | | No log | 69.96 | 210 | 0.6242 | 0.94 | 0.1858 | 0.7887 | 0.94 | 0.9330 | 0.2595 | 0.0109 | | No log | 70.96 | 213 | 0.6182 | 0.94 | 0.1837 | 0.7898 | 0.94 | 0.9330 | 0.2628 | 0.0105 | | No log | 71.96 | 216 | 0.6129 | 0.94 | 0.1816 | 0.7910 | 0.94 | 0.9330 | 0.2597 | 0.0103 | | No log | 72.96 | 219 | 0.6085 | 0.94 | 0.1795 | 0.7878 | 0.94 | 0.9330 | 0.2572 | 0.0101 | | No log | 73.96 | 222 | 0.6049 | 0.94 | 0.1777 | 0.7837 | 0.94 | 0.9330 | 0.2561 | 0.0099 | | No log | 74.96 | 225 | 0.6004 | 0.94 | 0.1756 | 0.7824 | 0.94 | 0.9330 | 0.2372 | 0.0093 | | No log | 75.96 | 228 | 0.5966 | 0.94 | 0.1740 | 0.7799 | 0.94 | 0.9330 | 0.2436 | 0.0093 | | No log | 76.96 | 231 | 0.5934 | 0.94 | 0.1731 | 0.7803 | 0.94 | 0.9330 | 0.2395 | 0.0094 | | No log | 77.96 | 234 | 0.5902 | 0.94 | 0.1722 | 0.7779 | 0.94 | 0.9330 | 0.2470 | 0.0096 | | No log | 78.96 | 237 | 0.5866 | 0.94 | 0.1709 | 0.7737 | 0.94 | 0.9330 | 0.2456 | 0.0097 | | No log | 79.96 | 240 | 0.5833 | 0.94 | 0.1696 | 0.7702 | 0.94 | 0.9330 | 0.2361 | 0.0099 | | No log | 80.96 | 243 | 0.5809 | 0.94 | 0.1687 | 0.7693 | 0.94 | 0.9330 | 0.2346 | 0.0099 | | No log | 81.96 | 246 | 0.5786 | 0.94 | 0.1679 | 0.7701 | 0.94 | 0.9330 | 0.2333 | 0.0100 | | No log | 82.96 | 249 | 0.5772 | 0.935 | 0.1675 | 0.7703 | 0.935 | 0.9244 | 0.2296 | 0.0099 | | No log | 83.96 | 252 | 0.5758 | 0.935 | 0.1671 | 0.7703 | 0.935 | 0.9244 | 0.2373 | 0.0101 | | No log | 84.96 | 255 | 0.5741 | 0.935 | 0.1664 | 0.7686 | 0.935 | 0.9244 | 0.2362 | 0.0100 | | No log | 85.96 | 258 | 0.5725 | 0.935 | 0.1657 | 0.7665 | 0.935 | 0.9244 | 0.2328 | 0.0099 | | No log | 86.96 | 261 | 0.5710 | 0.935 | 0.1651 | 0.7650 | 0.935 | 0.9244 | 0.2313 | 0.0101 | | No log | 87.96 | 264 | 0.5692 | 0.935 | 0.1646 | 0.7641 | 0.935 | 0.9244 | 0.2323 | 0.0101 | | No log | 88.96 | 267 | 0.5674 | 0.935 | 0.1641 | 0.7641 | 0.935 | 0.9244 | 0.2303 | 0.0100 | | No log | 89.96 | 270 | 0.5659 | 0.935 | 0.1636 | 0.7640 | 0.935 | 0.9244 | 0.2290 | 0.0098 | | No log | 90.96 | 273 | 0.5648 | 0.935 | 0.1633 | 0.7636 | 0.935 | 0.9244 | 0.2281 | 0.0098 | | No log | 91.96 | 276 | 0.5639 | 0.935 | 0.1630 | 0.7632 | 0.935 | 0.9244 | 0.2345 | 0.0100 | | No log | 92.96 | 279 | 0.5628 | 0.935 | 0.1626 | 0.7628 | 0.935 | 0.9244 | 0.2340 | 0.0100 | | No log | 93.96 | 282 | 0.5619 | 0.935 | 0.1623 | 0.7623 | 0.935 | 0.9244 | 0.2334 | 0.0101 | | No log | 94.96 | 285 | 0.5609 | 0.935 | 0.1620 | 0.7615 | 0.935 | 0.9244 | 0.2328 | 0.0100 | | No log | 95.96 | 288 | 0.5602 | 0.935 | 0.1617 | 0.7610 | 0.935 | 0.9244 | 0.2324 | 0.0099 | | No log | 96.96 | 291 | 0.5596 | 0.935 | 0.1616 | 0.7605 | 0.935 | 0.9244 | 0.2322 | 0.0099 | | No log | 97.96 | 294 | 0.5589 | 0.935 | 0.1615 | 0.7604 | 0.935 | 0.9244 | 0.2321 | 0.0099 | | No log | 98.96 | 297 | 0.5589 | 0.935 | 0.1614 | 0.7604 | 0.935 | 0.9244 | 0.2320 | 0.0099 | | No log | 99.96 | 300 | 
0.5587 | 0.935 | 0.1614 | 0.7604 | 0.935 | 0.9244 | 0.2320 | 0.0099 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
gowrias12/swin-tiny-patch4-window7-224-finetuned-cac
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-cac This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 1.0394 - Accuracy: 0.3636 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 1 | 1.3354 | 0.1818 | | No log | 2.0 | 3 | 1.0394 | 0.3636 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "hippo", "penguin", "turtle" ]
Epl1/food_classifier
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. --> # Epl1/food_classifier This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Train Loss: 0.3725 - Validation Loss: 0.3553 - Train Accuracy: 0.911 - Epoch: 4 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 20000, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01} - training_precision: float32 ### Training results | Train Loss | Validation Loss | Train Accuracy | Epoch | |:----------:|:---------------:|:--------------:|:-----:| | 2.8116 | 1.7125 | 0.778 | 0 | | 1.2501 | 0.8766 | 0.851 | 1 | | 0.7145 | 0.5461 | 0.888 | 2 | | 0.5083 | 0.4211 | 0.904 | 3 | | 0.3725 | 0.3553 | 0.911 | 4 | ### Framework versions - Transformers 4.31.0 - TensorFlow 2.12.0 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
Epl1/my_awesome_food_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # my_awesome_food_model This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the food101 dataset. It achieves the following results on the evaluation set: - Loss: 1.6141 - Accuracy: 0.892 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 2.7048 | 0.99 | 62 | 2.5361 | 0.823 | | 1.8279 | 2.0 | 125 | 1.7878 | 0.875 | | 1.5917 | 2.98 | 186 | 1.6141 | 0.892 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
merve/vit-mobilenet-beans-224
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # ViT distilled to MobileNet This is a distilled model: the teacher is [merve/beans-vit-224](https://huggingface.co/merve/beans-vit-224), a version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) fine-tuned on the beans dataset, and the student is a randomly initialized MobileNetV2. It achieves the following results on the evaluation set: - Loss: 0.5922 - Accuracy: 0.7266 ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 25 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.9217 | 1.0 | 130 | 1.0079 | 0.3835 | | 0.8973 | 2.0 | 260 | 0.8349 | 0.4286 | | 0.7912 | 3.0 | 390 | 0.8905 | 0.5414 | | 0.7151 | 4.0 | 520 | 1.1400 | 0.4887 | | 0.6797 | 5.0 | 650 | 4.5343 | 0.4135 | | 0.6471 | 6.0 | 780 | 2.1551 | 0.3985 | | 0.5989 | 7.0 | 910 | 0.8552 | 0.6090 | | 0.6252 | 8.0 | 1040 | 1.7453 | 0.5489 | | 0.6025 | 9.0 | 1170 | 0.7852 | 0.6466 | | 0.5643 | 10.0 | 1300 | 1.4728 | 0.6090 | | 0.5505 | 11.0 | 1430 | 1.1570 | 0.6015 | | 0.5207 | 12.0 | 1560 | 3.2526 | 0.4436 | | 0.4957 | 13.0 | 1690 | 0.6617 | 0.6541 | | 0.4935 | 14.0 | 1820 | 0.7502 | 0.6241 | | 0.4836 | 15.0 | 1950 | 1.2039 | 0.5338 | | 0.4648 | 16.0 | 2080 | 1.0283 | 0.5338 | | 0.4662 | 17.0 | 2210 | 0.6695 | 0.7293 | | 0.4351 | 18.0 | 2340 | 0.8694 | 0.5940 | | 0.4286 | 19.0 | 2470 | 1.2751 | 0.4737 | | 0.4166 | 20.0 | 2600 | 0.8719 | 0.6241 | | 0.4263 | 21.0 | 2730 | 0.8767 | 0.6015 | | 0.4261 | 22.0 | 2860 | 1.2780 | 0.5564 | | 0.4124 | 23.0 | 2990 | 1.4095 | 0.5940 | | 0.4082 | 24.0 | 3120 | 0.9104 | 0.6015 | | 0.3923 | 25.0 | 3250 | 0.6430 | 0.7068 | ### Framework versions - Transformers 4.34.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.14.1
[ "label_0", "label_1", "label_2" ]
MHRDYN7/my_awesome_food_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # my_awesome_food_model This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the food101 dataset. It achieves the following results on the evaluation set: - Loss: 1.6130 - Accuracy: 0.889 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 2.7036 | 0.99 | 62 | 2.4963 | 0.839 | | 1.808 | 2.0 | 125 | 1.7523 | 0.875 | | 1.5765 | 2.98 | 186 | 1.6130 | 0.889 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
leopuv/cats_vs_dogs_classifier
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. --> # leopuv/cats_vs_dogs_classifier This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Train Loss: 0.0285 - Train Accuracy: 0.9865 - Validation Loss: 0.0340 - Validation Accuracy: 0.9865 - Epoch: 9 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 80000, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01} - training_precision: float32 ### Training results | Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch | |:----------:|:--------------:|:---------------:|:-------------------:|:-----:| | 0.1739 | 0.9715 | 0.0787 | 0.9715 | 0 | | 0.0744 | 0.984 | 0.0432 | 0.9840 | 1 | | 0.0543 | 0.9895 | 0.0365 | 0.9895 | 2 | | 0.0420 | 0.9885 | 0.0346 | 0.9885 | 3 | | 0.0402 | 0.9855 | 0.0414 | 0.9855 | 4 | | 0.0378 | 0.9885 | 0.0307 | 0.9885 | 5 | | 0.0306 | 0.9855 | 0.0375 | 0.9855 | 6 | | 0.0343 | 0.987 | 0.0402 | 0.9870 | 7 | | 0.0283 | 0.9875 | 0.0381 | 0.9875 | 8 | | 0.0285 | 0.9865 | 0.0340 | 0.9865 | 9 | ### Framework versions - Transformers 4.31.0 - TensorFlow 2.12.0 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "cat", "dog" ]
akaashp15/food_classifier
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. --> # akaashp15/food_classifier This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Train Loss: 3.8114 - Validation Loss: 4.0450 - Train Accuracy: 0.5875 - Epoch: 4 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 20000, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01} - training_precision: float32 ### Training results | Train Loss | Validation Loss | Train Accuracy | Epoch | |:----------:|:---------------:|:--------------:|:-----:| | 4.5874 | 4.5203 | 0.0875 | 0 | | 4.3743 | 4.4135 | 0.35 | 1 | | 4.1829 | 4.2913 | 0.55 | 2 | | 3.9908 | 4.1636 | 0.6125 | 3 | | 3.8114 | 4.0450 | 0.5875 | 4 | ### Framework versions - Transformers 4.30.2 - TensorFlow 2.13.0-rc2 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
akaashp15/my_food_classifier
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> This modelcard aims to be a base template for new models. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/modelcard_template.md?plain=1). ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Data Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
jvadlamudi2/vit-base-patch16-224-jvadlamudi2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-jvadlamudi2 This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.4552 - Accuracy: 0.8378 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 7 | 0.4525 | 0.8333 | | 0.4945 | 2.0 | 14 | 0.4563 | 0.8243 | | 0.4492 | 3.0 | 21 | 0.4552 | 0.8378 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "0", "1" ]
Shojint/my_awesome_food_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # my_awesome_food_model This model was trained from scratch on the food101 dataset. It achieves the following results on the evaluation set: - eval_loss: 0.2752 - eval_accuracy: 0.923 - eval_runtime: 18.6571 - eval_samples_per_second: 53.599 - eval_steps_per_second: 3.377 - epoch: 29.71 - step: 1857 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "apple_pie", "baby_back_ribs", "baklava", "beef_carpaccio", "beef_tartare", "beet_salad", "beignets", "bibimbap", "bread_pudding", "breakfast_burrito", "bruschetta", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare", "waffles" ]
Shojint/food_vit_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # food_vit_model This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the food101 dataset. It achieves the following results on the evaluation set: - Loss: 0.5959 - Accuracy: 0.8749 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 0.4008 | 1.0 | 592 | 0.5106 | 0.8690 | | 0.313 | 2.0 | 1184 | 0.5045 | 0.8718 | | 0.3444 | 3.0 | 1776 | 0.5029 | 0.8711 | | 0.3305 | 4.0 | 2368 | 0.5049 | 0.8706 | | 0.2993 | 5.0 | 2960 | 0.5123 | 0.8702 | | 0.3658 | 6.0 | 3552 | 0.5202 | 0.8662 | | 0.314 | 7.0 | 4144 | 0.5344 | 0.8633 | | 0.2973 | 8.0 | 4736 | 0.5558 | 0.8589 | | 0.3171 | 9.0 | 5328 | 0.5806 | 0.8566 | | 0.2841 | 10.0 | 5920 | 0.5932 | 0.856 | | 0.4034 | 11.0 | 6512 | 0.5770 | 0.8554 | | 0.3231 | 12.0 | 7104 | 0.5455 | 0.8607 | | 0.3162 | 13.0 | 7696 | 0.5420 | 0.8634 | | 0.3706 | 14.0 | 8288 | 0.5591 | 0.8590 | | 0.2857 | 15.0 | 8880 | 0.5284 | 0.8653 | | 0.2647 | 16.0 | 9472 | 0.5680 | 0.8567 | | 0.2411 | 17.0 | 10064 | 0.5492 | 0.8648 | | 0.2566 | 18.0 | 10656 | 0.5716 | 0.8581 | | 0.2338 | 19.0 | 11248 | 0.5842 | 0.8573 | | 0.2862 | 20.0 | 11840 | 0.5735 | 0.8592 | | 0.2689 | 21.0 | 12432 | 0.5669 | 0.8604 | | 0.1892 | 22.0 | 13024 | 0.5747 | 0.8602 | | 0.1801 | 23.0 | 13616 | 0.5581 | 0.8627 | | 0.2258 | 24.0 | 14208 | 0.5717 | 0.8614 | | 0.2215 | 25.0 | 14800 | 0.6046 | 0.8562 | | 0.1443 | 26.0 | 15392 | 0.5758 | 0.8642 | | 0.2143 | 27.0 | 15984 | 0.5805 | 0.8626 | | 0.1699 | 28.0 | 16576 | 0.5843 | 0.8616 | | 0.1787 | 29.0 | 17168 | 0.5740 | 0.8657 | | 0.1702 | 30.0 | 17760 | 0.5718 | 0.8653 | | 0.1703 | 31.0 | 18352 | 0.5703 | 0.8646 | | 0.1692 | 32.0 | 18944 | 0.5918 | 0.8627 | | 0.1643 | 33.0 | 19536 | 0.6041 | 0.8608 | | 0.214 | 34.0 | 20128 | 0.5950 | 0.8624 | | 0.1996 | 35.0 | 20720 | 0.5861 | 0.8637 | | 0.1618 | 36.0 | 21312 | 0.6032 | 0.8622 | | 0.181 | 37.0 | 21904 | 0.5915 | 0.8646 | | 0.1641 | 38.0 | 22496 | 0.5697 | 0.8663 | | 0.1233 | 39.0 | 23088 | 0.5987 | 0.8617 | | 0.1469 | 40.0 | 23680 | 0.5944 | 0.8635 | | 0.1492 | 41.0 | 24272 | 0.5893 | 0.8651 | | 0.1616 | 42.0 | 24864 | 0.5717 | 0.8667 | | 0.1359 | 43.0 | 25456 | 0.5897 | 0.8655 | | 0.1318 | 44.0 | 26048 | 0.5920 | 0.8684 | | 0.102 | 45.0 | 26640 | 0.5908 | 0.8683 | | 0.1416 | 46.0 | 27232 | 0.5977 | 0.8625 | | 0.1393 | 47.0 | 27824 | 0.6069 | 0.8648 | | 0.1003 | 48.0 | 28416 | 0.5849 | 0.8682 | | 0.121 | 49.0 | 29008 | 0.5880 | 0.8661 | | 0.128 | 50.0 | 29600 | 0.5800 | 0.8693 | | 0.1409 | 51.0 | 30192 | 0.6004 | 0.8663 | | 0.1783 | 52.0 | 30784 | 0.5847 | 0.8678 | | 0.1177 | 53.0 | 31376 | 0.5984 | 0.8683 | | 0.097 | 54.0 | 31968 | 0.5973 | 0.8669 | | 0.137 | 55.0 | 32560 | 0.5983 | 0.8668 | | 0.1227 | 56.0 | 33152 | 
0.5913 | 0.8689 | | 0.1259 | 57.0 | 33744 | 0.5949 | 0.868 | | 0.0947 | 58.0 | 34336 | 0.6065 | 0.8664 | | 0.1184 | 59.0 | 34928 | 0.6098 | 0.8667 | | 0.0996 | 60.0 | 35520 | 0.5958 | 0.8700 | | 0.0977 | 61.0 | 36112 | 0.6019 | 0.8694 | | 0.1295 | 62.0 | 36704 | 0.6012 | 0.8698 | | 0.0842 | 63.0 | 37296 | 0.5993 | 0.8688 | | 0.0784 | 64.0 | 37888 | 0.6074 | 0.8689 | | 0.1183 | 65.0 | 38480 | 0.5853 | 0.8713 | | 0.1215 | 66.0 | 39072 | 0.5962 | 0.8709 | | 0.1069 | 67.0 | 39664 | 0.5786 | 0.8728 | | 0.101 | 68.0 | 40256 | 0.5938 | 0.8691 | | 0.1004 | 69.0 | 40848 | 0.5985 | 0.8716 | | 0.0958 | 70.0 | 41440 | 0.5961 | 0.8721 | | 0.0914 | 71.0 | 42032 | 0.6053 | 0.8704 | | 0.0915 | 72.0 | 42624 | 0.5937 | 0.8713 | | 0.0964 | 73.0 | 43216 | 0.6001 | 0.8703 | | 0.0558 | 74.0 | 43808 | 0.5993 | 0.8697 | | 0.0977 | 75.0 | 44400 | 0.6025 | 0.8706 | | 0.1096 | 76.0 | 44992 | 0.6018 | 0.8706 | | 0.0883 | 77.0 | 45584 | 0.5973 | 0.8733 | | 0.0811 | 78.0 | 46176 | 0.6023 | 0.8741 | | 0.0912 | 79.0 | 46768 | 0.6004 | 0.8733 | | 0.0981 | 80.0 | 47360 | 0.5851 | 0.8730 | | 0.0892 | 81.0 | 47952 | 0.5782 | 0.8754 | | 0.1119 | 82.0 | 48544 | 0.5893 | 0.8727 | | 0.1016 | 83.0 | 49136 | 0.5911 | 0.8722 | | 0.0801 | 84.0 | 49728 | 0.5880 | 0.8755 | | 0.107 | 85.0 | 50320 | 0.6088 | 0.8710 | | 0.0763 | 86.0 | 50912 | 0.5912 | 0.8760 | | 0.0667 | 87.0 | 51504 | 0.5974 | 0.8752 | | 0.0485 | 88.0 | 52096 | 0.5903 | 0.8763 | | 0.1002 | 89.0 | 52688 | 0.6097 | 0.8744 | | 0.0786 | 90.0 | 53280 | 0.5853 | 0.8762 | | 0.1067 | 91.0 | 53872 | 0.5874 | 0.8772 | | 0.0618 | 92.0 | 54464 | 0.5847 | 0.8762 | | 0.0667 | 93.0 | 55056 | 0.5803 | 0.8774 | | 0.0702 | 94.0 | 55648 | 0.5812 | 0.8781 | | 0.055 | 95.0 | 56240 | 0.5918 | 0.8761 | | 0.0941 | 96.0 | 56832 | 0.5904 | 0.8766 | | 0.0821 | 97.0 | 57424 | 0.5849 | 0.8762 | | 0.0998 | 98.0 | 58016 | 0.5891 | 0.8757 | | 0.0594 | 99.0 | 58608 | 0.5845 | 0.8778 | | 0.0727 | 100.0 | 59200 | 0.5959 | 0.8749 | ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.0+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
inmdd/vit-base-beans
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-beans This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset. It achieves the following results on the evaluation set: - Loss: 0.0857 - Accuracy: 0.9850 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.308 | 1.0 | 130 | 0.2118 | 0.9774 | | 0.2219 | 2.0 | 260 | 0.1303 | 0.9699 | | 0.1831 | 3.0 | 390 | 0.1142 | 0.9774 | | 0.0838 | 4.0 | 520 | 0.1031 | 0.9774 | | 0.1266 | 5.0 | 650 | 0.0857 | 0.9850 | ### Framework versions - Transformers 4.32.0.dev0 - Pytorch 2.0.1+cu117 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "angular_leaf_spot", "bean_rust", "healthy" ]
pbyrnes/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-flower This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results ### Framework versions - Transformers 4.24.0 - Pytorch 2.0.1+cu118 - Datasets 2.7.1 - Tokenizers 0.13.3
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
ALM-AHME/beit-large-patch16-224-finetuned-Lesion-Classification-HAM10000-AH-60-20-20-Shuffled
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # beit-large-patch16-224-finetuned-Lesion-Classification-HAM10000-AH-60-20-20-Shuffled This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.0487 - Accuracy: 0.9893 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-06 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.9 - num_epochs: 12 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 2.1055 | 1.0 | 114 | 2.0091 | 0.1601 | | 1.6582 | 2.0 | 229 | 1.5953 | 0.4187 | | 1.2399 | 3.0 | 343 | 1.1053 | 0.5977 | | 0.8417 | 4.0 | 458 | 0.7602 | 0.7241 | | 0.5517 | 5.0 | 572 | 0.5651 | 0.8013 | | 0.5777 | 6.0 | 687 | 0.3980 | 0.8768 | | 0.408 | 7.0 | 801 | 0.2912 | 0.9154 | | 0.2395 | 8.0 | 916 | 0.2185 | 0.9417 | | 0.3613 | 9.0 | 1030 | 0.1753 | 0.9475 | | 0.2408 | 10.0 | 1145 | 0.1353 | 0.9614 | | 0.2777 | 11.0 | 1259 | 0.0699 | 0.9860 | | 0.1528 | 11.95 | 1368 | 0.0487 | 0.9893 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "akiec", "bcc", "bkl", "df", "mel", "nv", "vasc" ]
ALM-AHME/convnextv2-large-1k-224-finetuned-Lesion-Classification-HAM10000-AH-60-20-20-Shuffled
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # convnextv2-large-1k-224-finetuned-Lesion-Classification-HAM10000-AH-60-20-20-Shuffled This model is a fine-tuned version of [facebook/convnextv2-large-1k-224](https://huggingface.co/facebook/convnextv2-large-1k-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.0721 - Accuracy: 0.9869 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.9 - num_epochs: 15 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.8937 | 1.0 | 114 | 1.9040 | 0.3144 | | 1.7208 | 2.0 | 229 | 1.6891 | 0.5632 | | 1.3822 | 3.0 | 343 | 1.3554 | 0.6897 | | 1.1497 | 4.0 | 458 | 1.2437 | 0.5755 | | 0.8979 | 5.0 | 572 | 0.8548 | 0.7701 | | 0.6382 | 6.0 | 687 | 0.6359 | 0.8424 | | 0.583 | 7.0 | 801 | 0.4687 | 0.8966 | | 0.6295 | 8.0 | 916 | 0.5029 | 0.8456 | | 0.5367 | 9.0 | 1030 | 0.4742 | 0.8670 | | 0.5091 | 10.0 | 1145 | 0.3038 | 0.9212 | | 0.3521 | 11.0 | 1259 | 0.1855 | 0.9606 | | 0.318 | 12.0 | 1374 | 0.1893 | 0.9573 | | 0.2725 | 13.0 | 1488 | 0.2292 | 0.9409 | | 0.2937 | 14.0 | 1603 | 0.0866 | 0.9836 | | 0.1185 | 14.93 | 1710 | 0.0721 | 0.9869 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "akiec", "bcc", "bkl", "df", "mel", "nv", "vasc" ]
ALM-AHME/swinv2-large-patch4-window12to16-192to256-22kto1k-ft-finetuned-Lesion-Classification-HAM10000-S
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swinv2-large-patch4-window12to16-192to256-22kto1k-ft-finetuned-Lesion-Classification-HAM10000-S This model is a fine-tuned version of [microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft](https://huggingface.co/microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.0069 - Accuracy: 0.9975 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.5 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.2378 | 1.0 | 114 | 0.9976 | 0.5936 | | 0.7272 | 2.0 | 228 | 0.4749 | 0.8309 | | 0.4335 | 2.99 | 342 | 0.2488 | 0.9195 | | 0.3298 | 4.0 | 457 | 0.1700 | 0.9310 | | 0.177 | 5.0 | 571 | 0.2116 | 0.9261 | | 0.2299 | 6.0 | 685 | 0.0933 | 0.9754 | | 0.2586 | 6.99 | 799 | 0.0316 | 0.9869 | | 0.1053 | 8.0 | 914 | 0.0256 | 0.9910 | | 0.2159 | 9.0 | 1028 | 0.0147 | 0.9959 | | 0.0607 | 9.98 | 1140 | 0.0069 | 0.9975 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "akiec", "bcc", "bkl", "df", "mel", "nv", "vasc" ]
sghirardelli/vit-base-patch16-224-rgbd1k2
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. --> # sghirardelli/vit-base-patch16-224-rgbd1k2 This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset. It achieves the following results on the evaluation set: - Train Loss: 1.9711 - Train Accuracy: 0.4384 - Train Top-3-accuracy: 0.6297 - Validation Loss: 0.2537 - Validation Accuracy: 0.9323 - Validation Top-3-accuracy: 0.9940 - Epoch: 0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - optimizer: {'inner_optimizer': {'class_name': 'AdamWeightDecay', 'config': {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 0.002, 'decay_steps': 544, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}}, 'dynamic': True, 'initial_scale': 32768.0, 'dynamic_growth_steps': 2000} - training_precision: mixed_float16 ### Training results | Train Loss | Train Accuracy | Train Top-3-accuracy | Validation Loss | Validation Accuracy | Validation Top-3-accuracy | Epoch | |:----------:|:--------------:|:--------------------:|:---------------:|:-------------------:|:-------------------------:|:-----:| | 1.9711 | 0.4384 | 0.6297 | 0.2537 | 0.9323 | 0.9940 | 0 | ### Framework versions - Transformers 4.31.0 - TensorFlow 2.12.0 - Datasets 2.14.1 - Tokenizers 0.13.3
[ "apple", "ball", "cereal_box", "coffee_mug", "comb", "dry_battery", "flashlight", "food_bag", "food_box", "food_can", "food_cup", "food_jar", "banana", "garlic", "glue_stick", "greens", "hand_towel", "instant_noodles", "keyboard", "kleenex", "lemon", "lightbulb", "lime", "bell_pepper", "marker", "mushroom", "notebook", "onion", "orange", "peach", "pear", "pitcher", "plate", "pliers", "binder", "potato", "rubber_eraser", "scissors", "shampoo", "soda_can", "sponge", "stapler", "tomato", "toothbrush", "toothpaste", "bowl", "water_bottle", "calculator", "camera", "cap", "cell_phone" ]
CindyShawn/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-flower This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results ### Framework versions - Transformers 4.24.0 - Pytorch 2.0.1+cu118 - Datasets 2.7.1 - Tokenizers 0.13.3
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
ALM-AHME/beit-large-patch16-224-finetuned-LungCancer-Classification-LC25000-AH-40-30-30-Shuffled
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # beit-large-patch16-224-finetuned-LungCancer-Classification-LC25000-AH-40-30-30-Shuffled This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0600 - Accuracy: 0.9765 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0005 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.5 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.1531 | 0.99 | 93 | 0.1351 | 0.9506 | | 0.2389 | 1.99 | 187 | 0.1534 | 0.9344 | | 0.2517 | 3.0 | 281 | 0.1484 | 0.9402 | | 0.1769 | 4.0 | 375 | 0.1108 | 0.9570 | | 0.0764 | 4.96 | 465 | 0.0600 | 0.9765 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "lung-benign_tissue", "lung_adenocarcinoma", "lung_squamous_cell_carcinoma" ]
ALM-AHME/swinv2-large-patch4-window12to16-192to256-22kto1k-ft-finetuned-LungCancer-LC25000-AH-40-30-30-S
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swinv2-large-patch4-window12to16-192to256-22kto1k-ft-finetuned-LungCancer-LC25000-AH-40-30-30-S This model is a fine-tuned version of [microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft](https://huggingface.co/microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0217 - Accuracy: 0.9931 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0005 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.5 - num_epochs: 7 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.176 | 1.0 | 187 | 0.0891 | 0.9663 | | 0.2574 | 2.0 | 374 | 0.2127 | 0.9249 | | 0.2416 | 3.0 | 561 | 0.2407 | 0.9236 | | 0.2457 | 4.0 | 749 | 0.1245 | 0.9632 | | 0.3583 | 5.0 | 936 | 0.1709 | 0.9404 | | 0.149 | 6.0 | 1123 | 0.0502 | 0.9814 | | 0.061 | 6.99 | 1309 | 0.0217 | 0.9931 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "lung-benign_tissue", "lung_adenocarcinoma", "lung_squamous_cell_carcinoma" ]
ALM-AHME/convnextv2-large-1k-224-finetuned-LungCancer-Classification-LC25000-AH-40-30-30-Shuffled
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # convnextv2-large-1k-224-finetuned-LungCancer-Classification-LC25000-AH-40-30-30-Shuffled This model is a fine-tuned version of [facebook/convnextv2-large-1k-224](https://huggingface.co/facebook/convnextv2-large-1k-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.1288 - Accuracy: 0.9623 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0005 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.5 - num_epochs: 7 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.2522 | 0.99 | 93 | 0.1288 | 0.9623 | | 0.1579 | 1.99 | 187 | 0.1211 | 0.9573 | | 1.1016 | 3.0 | 281 | 1.1018 | 0.3216 | | 1.0934 | 4.0 | 375 | 1.0787 | 0.6432 | | 0.5795 | 4.99 | 468 | 0.5864 | 0.6445 | | 0.5437 | 5.99 | 562 | 0.5733 | 0.7369 | | 0.3369 | 6.94 | 651 | 0.3298 | 0.9030 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "lung-benign_tissue", "lung_adenocarcinoma", "lung_squamous_cell_carcinoma" ]
wuru330/flipped_results
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # flipped_results This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 2.5433 - Accuracy: 0.4099 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 30 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.3063 | 1.0 | 37 | 1.2485 | 0.4592 | | 1.2272 | 2.0 | 74 | 1.1915 | 0.4626 | | 1.1518 | 3.0 | 111 | 1.1770 | 0.4575 | | 1.0504 | 4.0 | 148 | 1.1724 | 0.4745 | | 0.9525 | 5.0 | 185 | 1.1966 | 0.4898 | | 0.7485 | 6.0 | 222 | 1.2403 | 0.4626 | | 0.5645 | 7.0 | 259 | 1.3973 | 0.4235 | | 0.4645 | 8.0 | 296 | 1.4260 | 0.4898 | | 0.374 | 9.0 | 333 | 1.5838 | 0.4405 | | 0.2721 | 10.0 | 370 | 1.6747 | 0.4354 | | 0.2679 | 11.0 | 407 | 1.7427 | 0.4507 | | 0.2284 | 12.0 | 444 | 1.8097 | 0.4269 | | 0.2003 | 13.0 | 481 | 1.9738 | 0.3997 | | 0.1844 | 14.0 | 518 | 1.9745 | 0.4524 | | 0.1631 | 15.0 | 555 | 2.0326 | 0.4456 | | 0.1135 | 16.0 | 592 | 2.1294 | 0.4184 | | 0.1188 | 17.0 | 629 | 2.1613 | 0.4065 | | 0.1204 | 18.0 | 666 | 2.1795 | 0.4252 | | 0.1083 | 19.0 | 703 | 2.2433 | 0.4167 | | 0.0794 | 20.0 | 740 | 2.2762 | 0.4150 | | 0.0589 | 21.0 | 777 | 2.3736 | 0.4065 | | 0.0646 | 22.0 | 814 | 2.3644 | 0.4252 | | 0.0744 | 23.0 | 851 | 2.4478 | 0.4099 | | 0.0728 | 24.0 | 888 | 2.4367 | 0.4099 | | 0.0382 | 25.0 | 925 | 2.5123 | 0.3997 | | 0.033 | 26.0 | 962 | 2.5202 | 0.4031 | | 0.0326 | 27.0 | 999 | 2.5145 | 0.4099 | | 0.023 | 28.0 | 1036 | 2.5309 | 0.4099 | | 0.0377 | 29.0 | 1073 | 2.5409 | 0.4099 | | 0.0243 | 30.0 | 1110 | 2.5433 | 0.4099 | ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu117 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "label_0", "label_1", "label_2", "label_3" ]
ALM-AHME/beit-large-patch16-224-finetuned-BreastCancer-Classification-BreakHis-AH-60-20-20-Shuffled
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # beit-large-patch16-224-finetuned-BreastCancer-Classification-BreakHis-AH-60-20-20-Shuffled This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.0146 - Accuracy: 0.9958 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-06 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.9 - num_epochs: 15 ### Training results | Training Loss | Epoch | Step | Accuracy | Validation Loss | |:-------------:|:-----:|:----:|:--------:|:---------------:| | 0.5847 | 1.0 | 199 | 0.8030 | 0.4640 | | 0.2856 | 2.0 | 398 | 0.9354 | 0.1753 | | 0.156 | 3.0 | 597 | 0.9552 | 0.1179 | | 0.1049 | 4.0 | 796 | 0.9585 | 0.1043 | | 0.1399 | 5.0 | 995 | 0.9760 | 0.0673 | | 0.0423 | 6.0 | 1194 | 0.9802 | 0.0455 | | 0.078 | 7.0 | 1393 | 0.9802 | 0.0554 | | 0.1769 | 8.0 | 1592 | 0.9764 | 0.0556 | | 0.0568 | 9.0 | 1791 | 0.9807 | 0.0569 | | 0.0728 | 10.0 | 1990 | 0.9915 | 0.0234 | | 0.0229 | 11.0 | 2189 | 0.9910 | 0.0240 | | 0.0561 | 12.0 | 2388 | 0.9901 | 0.0352 | | 0.014 | 13.0 | 2587 | 0.9797 | 0.0749 | | 0.096 | 14.0 | 2786 | 0.9934 | 0.0268 | | 0.0005 | 15.0 | 2985 | 0.9958 | 0.0146 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "benign", "malignant" ]
Shojint/vit_awesome_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit_awesome_model This model is a fine-tuned version of [stevhliu/my_awesome_food_model](https://huggingface.co/stevhliu/my_awesome_food_model) on the food101 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.0+cu118 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "apple_pie", "baby_back_ribs", "baklava", "beef_carpaccio", "beef_tartare", "beet_salad", "beignets", "bibimbap", "bread_pudding", "breakfast_burrito", "bruschetta", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare", "waffles" ]
jordyvl/vit-base_tobacco_crl_allv2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_tobacco_crl_allv2 This model is a fine-tuned version of [jordyvl/vit-base_tobacco](https://huggingface.co/jordyvl/vit-base_tobacco) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.8070 - Accuracy: 0.82 - Brier Loss: 0.2884 - Nll: 1.3803 - F1 Micro: 0.82 - F1 Macro: 0.8075 - Ece: 0.1968 - Aurc: 0.0591 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 256 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 0.96 | 3 | 0.8264 | 0.815 | 0.3085 | 1.1929 | 0.815 | 0.7922 | 0.2249 | 0.0764 | | No log | 1.96 | 6 | 0.8371 | 0.815 | 0.3061 | 1.1842 | 0.815 | 0.7910 | 0.2299 | 0.0718 | | No log | 2.96 | 9 | 0.8530 | 0.8 | 0.3072 | 1.1942 | 0.8000 | 0.7798 | 0.2335 | 0.0749 | | No log | 3.96 | 12 | 0.8469 | 0.82 | 0.3056 | 1.1930 | 0.82 | 0.8049 | 0.2385 | 0.0743 | | No log | 4.96 | 15 | 0.8274 | 0.84 | 0.3006 | 1.3642 | 0.8400 | 0.8268 | 0.2376 | 0.0716 | | No log | 5.96 | 18 | 0.8632 | 0.81 | 0.3105 | 1.2012 | 0.81 | 0.7873 | 0.2468 | 0.0806 | | No log | 6.96 | 21 | 0.8216 | 0.815 | 0.2949 | 1.1817 | 0.815 | 0.8037 | 0.2343 | 0.0623 | | No log | 7.96 | 24 | 0.8540 | 0.805 | 0.3056 | 1.1903 | 0.805 | 0.7842 | 0.1968 | 0.0782 | | No log | 8.96 | 27 | 0.8223 | 0.82 | 0.2955 | 1.3001 | 0.82 | 0.8114 | 0.2195 | 0.0629 | | No log | 9.96 | 30 | 0.8432 | 0.81 | 0.3025 | 1.1862 | 0.81 | 0.7999 | 0.2074 | 0.0752 | | No log | 10.96 | 33 | 0.8445 | 0.795 | 0.3049 | 1.2622 | 0.795 | 0.7820 | 0.2254 | 0.0629 | | No log | 11.96 | 36 | 0.8367 | 0.81 | 0.2987 | 1.2088 | 0.81 | 0.7945 | 0.2269 | 0.0736 | | No log | 12.96 | 39 | 0.8365 | 0.81 | 0.3022 | 1.4703 | 0.81 | 0.7926 | 0.2216 | 0.0733 | | No log | 13.96 | 42 | 0.8460 | 0.805 | 0.3020 | 1.2887 | 0.805 | 0.7864 | 0.2151 | 0.0673 | | No log | 14.96 | 45 | 0.8791 | 0.805 | 0.3161 | 1.2981 | 0.805 | 0.7822 | 0.2089 | 0.0840 | | No log | 15.96 | 48 | 0.8318 | 0.805 | 0.3049 | 1.3976 | 0.805 | 0.7900 | 0.2216 | 0.0787 | | No log | 16.96 | 51 | 0.9041 | 0.795 | 0.3295 | 1.3060 | 0.795 | 0.7798 | 0.2362 | 0.0772 | | No log | 17.96 | 54 | 0.8434 | 0.81 | 0.3051 | 1.4598 | 0.81 | 0.7863 | 0.2121 | 0.0756 | | No log | 18.96 | 57 | 0.8693 | 0.8 | 0.3080 | 1.4271 | 0.8000 | 0.7818 | 0.2227 | 0.0780 | | No log | 19.96 | 60 | 0.8635 | 0.795 | 0.3138 | 1.4281 | 0.795 | 0.7611 | 0.2259 | 0.0764 | | No log | 20.96 | 63 | 0.9061 | 0.78 | 0.3285 | 1.3005 | 0.78 | 0.7667 | 0.2245 | 0.0788 | | No log | 21.96 | 66 | 0.8564 | 0.805 | 0.3065 | 1.3226 | 0.805 | 0.7827 | 0.2093 | 0.0901 | | No log | 22.96 | 69 | 0.8354 | 0.78 | 0.3035 | 1.2478 | 0.78 | 0.7556 | 0.1985 | 0.0633 | | No log | 23.96 | 72 | 0.8458 | 0.8 | 0.3068 | 1.1995 | 0.8000 | 0.7869 | 
0.2117 | 0.0666 | | No log | 24.96 | 75 | 0.8406 | 0.78 | 0.3055 | 1.1192 | 0.78 | 0.7605 | 0.2131 | 0.0646 | | No log | 25.96 | 78 | 0.8048 | 0.81 | 0.2904 | 1.1867 | 0.81 | 0.7937 | 0.1940 | 0.0611 | | No log | 26.96 | 81 | 0.8429 | 0.815 | 0.3050 | 1.4600 | 0.815 | 0.7924 | 0.2020 | 0.0801 | | No log | 27.96 | 84 | 0.8362 | 0.785 | 0.3042 | 1.4610 | 0.785 | 0.7667 | 0.1953 | 0.0733 | | No log | 28.96 | 87 | 0.8709 | 0.795 | 0.3096 | 1.5073 | 0.795 | 0.7787 | 0.2015 | 0.0844 | | No log | 29.96 | 90 | 0.8696 | 0.785 | 0.3223 | 1.4348 | 0.785 | 0.7607 | 0.2249 | 0.0798 | | No log | 30.96 | 93 | 0.8748 | 0.795 | 0.3133 | 1.5627 | 0.795 | 0.7759 | 0.2125 | 0.0919 | | No log | 31.96 | 96 | 0.8320 | 0.815 | 0.3006 | 1.4744 | 0.815 | 0.8039 | 0.2239 | 0.0816 | | No log | 32.96 | 99 | 0.8299 | 0.79 | 0.3041 | 1.3752 | 0.79 | 0.7725 | 0.2193 | 0.0726 | | No log | 33.96 | 102 | 0.8377 | 0.795 | 0.3058 | 1.3802 | 0.795 | 0.7769 | 0.2196 | 0.0773 | | No log | 34.96 | 105 | 0.8195 | 0.8 | 0.2986 | 1.4524 | 0.8000 | 0.7836 | 0.2032 | 0.0756 | | No log | 35.96 | 108 | 0.8600 | 0.785 | 0.3081 | 1.4267 | 0.785 | 0.7599 | 0.2042 | 0.0914 | | No log | 36.96 | 111 | 0.8565 | 0.795 | 0.3073 | 1.3142 | 0.795 | 0.7720 | 0.2018 | 0.0773 | | No log | 37.96 | 114 | 0.8175 | 0.815 | 0.3002 | 1.3310 | 0.815 | 0.8107 | 0.2289 | 0.0659 | | No log | 38.96 | 117 | 0.8309 | 0.825 | 0.2975 | 1.3417 | 0.825 | 0.8100 | 0.1944 | 0.0890 | | No log | 39.96 | 120 | 0.8479 | 0.795 | 0.3071 | 1.3669 | 0.795 | 0.7796 | 0.2001 | 0.0796 | | No log | 40.96 | 123 | 0.8052 | 0.825 | 0.2940 | 1.4031 | 0.825 | 0.8087 | 0.1969 | 0.0661 | | No log | 41.96 | 126 | 0.8009 | 0.8 | 0.2937 | 1.1756 | 0.8000 | 0.7867 | 0.2003 | 0.0624 | | No log | 42.96 | 129 | 0.8367 | 0.79 | 0.3059 | 1.2063 | 0.79 | 0.7746 | 0.1964 | 0.0657 | | No log | 43.96 | 132 | 0.8222 | 0.81 | 0.3021 | 1.2555 | 0.81 | 0.7994 | 0.2268 | 0.0695 | | No log | 44.96 | 135 | 0.8234 | 0.805 | 0.2990 | 1.2438 | 0.805 | 0.7889 | 0.2045 | 0.0820 | | No log | 45.96 | 138 | 0.8378 | 0.81 | 0.3096 | 1.3797 | 0.81 | 0.7904 | 0.2136 | 0.0761 | | No log | 46.96 | 141 | 0.8089 | 0.8 | 0.2959 | 1.2355 | 0.8000 | 0.7837 | 0.1987 | 0.0716 | | No log | 47.96 | 144 | 0.8427 | 0.79 | 0.3052 | 1.3295 | 0.79 | 0.7746 | 0.2032 | 0.0794 | | No log | 48.96 | 147 | 0.8269 | 0.81 | 0.3034 | 1.3293 | 0.81 | 0.7968 | 0.2014 | 0.0826 | | No log | 49.96 | 150 | 0.8081 | 0.81 | 0.2958 | 1.3146 | 0.81 | 0.7901 | 0.2004 | 0.0721 | | No log | 50.96 | 153 | 0.8084 | 0.8 | 0.2967 | 1.3800 | 0.8000 | 0.7799 | 0.2114 | 0.0623 | | No log | 51.96 | 156 | 0.8076 | 0.805 | 0.2931 | 1.3180 | 0.805 | 0.7850 | 0.2068 | 0.0631 | | No log | 52.96 | 159 | 0.8163 | 0.82 | 0.3025 | 1.3950 | 0.82 | 0.8028 | 0.2294 | 0.0694 | | No log | 53.96 | 162 | 0.8519 | 0.765 | 0.3193 | 1.3976 | 0.765 | 0.7522 | 0.2062 | 0.0703 | | No log | 54.96 | 165 | 0.8146 | 0.79 | 0.2991 | 1.3098 | 0.79 | 0.7738 | 0.1900 | 0.0615 | | No log | 55.96 | 168 | 0.8064 | 0.815 | 0.2918 | 1.3889 | 0.815 | 0.7978 | 0.2099 | 0.0719 | | No log | 56.96 | 171 | 0.8225 | 0.81 | 0.2991 | 1.3051 | 0.81 | 0.7916 | 0.2153 | 0.0794 | | No log | 57.96 | 174 | 0.8222 | 0.815 | 0.3021 | 1.2774 | 0.815 | 0.8011 | 0.2052 | 0.0722 | | No log | 58.96 | 177 | 0.8025 | 0.82 | 0.2949 | 1.4508 | 0.82 | 0.8103 | 0.2228 | 0.0636 | | No log | 59.96 | 180 | 0.7993 | 0.8 | 0.2911 | 1.3141 | 0.8000 | 0.7829 | 0.2008 | 0.0685 | | No log | 60.96 | 183 | 0.7919 | 0.805 | 0.2900 | 1.3050 | 0.805 | 0.7891 | 0.1983 | 0.0618 | | No log | 61.96 | 186 | 0.7994 | 0.81 | 0.2921 | 1.2423 | 
0.81 | 0.7964 | 0.1920 | 0.0710 | | No log | 62.96 | 189 | 0.7966 | 0.815 | 0.2887 | 1.2438 | 0.815 | 0.7970 | 0.1940 | 0.0727 | | No log | 63.96 | 192 | 0.7892 | 0.82 | 0.2860 | 1.3182 | 0.82 | 0.8032 | 0.2006 | 0.0622 | | No log | 64.96 | 195 | 0.7916 | 0.815 | 0.2880 | 1.2500 | 0.815 | 0.7998 | 0.2005 | 0.0611 | | No log | 65.96 | 198 | 0.7920 | 0.81 | 0.2892 | 1.2549 | 0.81 | 0.7925 | 0.2207 | 0.0595 | | No log | 66.96 | 201 | 0.7924 | 0.81 | 0.2895 | 1.3216 | 0.81 | 0.7925 | 0.2084 | 0.0587 | | No log | 67.96 | 204 | 0.7935 | 0.8 | 0.2898 | 1.2552 | 0.8000 | 0.7858 | 0.2142 | 0.0592 | | No log | 68.96 | 207 | 0.7929 | 0.805 | 0.2894 | 1.3132 | 0.805 | 0.7891 | 0.2224 | 0.0581 | | No log | 69.96 | 210 | 0.7960 | 0.805 | 0.2892 | 1.2494 | 0.805 | 0.7887 | 0.2193 | 0.0606 | | No log | 70.96 | 213 | 0.7928 | 0.815 | 0.2876 | 1.3287 | 0.815 | 0.8024 | 0.2152 | 0.0595 | | No log | 71.96 | 216 | 0.7939 | 0.81 | 0.2875 | 1.2485 | 0.81 | 0.7920 | 0.2085 | 0.0604 | | No log | 72.96 | 219 | 0.7937 | 0.81 | 0.2871 | 1.2538 | 0.81 | 0.7920 | 0.2009 | 0.0600 | | No log | 73.96 | 222 | 0.7924 | 0.82 | 0.2869 | 1.3766 | 0.82 | 0.8095 | 0.1965 | 0.0586 | | No log | 74.96 | 225 | 0.7980 | 0.81 | 0.2884 | 1.2557 | 0.81 | 0.7920 | 0.2197 | 0.0614 | | No log | 75.96 | 228 | 0.7938 | 0.82 | 0.2863 | 1.3205 | 0.82 | 0.8052 | 0.1892 | 0.0598 | | No log | 76.96 | 231 | 0.7935 | 0.825 | 0.2860 | 1.3242 | 0.825 | 0.8123 | 0.1866 | 0.0592 | | No log | 77.96 | 234 | 0.7942 | 0.815 | 0.2864 | 1.3761 | 0.815 | 0.7990 | 0.1821 | 0.0581 | | No log | 78.96 | 237 | 0.7939 | 0.815 | 0.2864 | 1.3762 | 0.815 | 0.7990 | 0.1893 | 0.0578 | | No log | 79.96 | 240 | 0.7961 | 0.815 | 0.2874 | 1.3788 | 0.815 | 0.7990 | 0.1992 | 0.0581 | | No log | 80.96 | 243 | 0.7981 | 0.815 | 0.2880 | 1.3816 | 0.815 | 0.7990 | 0.1946 | 0.0583 | | No log | 81.96 | 246 | 0.7989 | 0.815 | 0.2882 | 1.3820 | 0.815 | 0.7990 | 0.1939 | 0.0584 | | No log | 82.96 | 249 | 0.7995 | 0.81 | 0.2882 | 1.3813 | 0.81 | 0.7957 | 0.1929 | 0.0587 | | No log | 83.96 | 252 | 0.8014 | 0.81 | 0.2886 | 1.3263 | 0.81 | 0.7957 | 0.2109 | 0.0587 | | No log | 84.96 | 255 | 0.7998 | 0.82 | 0.2878 | 1.3793 | 0.82 | 0.8090 | 0.1873 | 0.0582 | | No log | 85.96 | 258 | 0.8021 | 0.82 | 0.2881 | 1.3816 | 0.82 | 0.8090 | 0.2131 | 0.0587 | | No log | 86.96 | 261 | 0.8036 | 0.815 | 0.2884 | 1.3827 | 0.815 | 0.7975 | 0.2065 | 0.0589 | | No log | 87.96 | 264 | 0.8051 | 0.815 | 0.2887 | 1.3836 | 0.815 | 0.7975 | 0.1989 | 0.0594 | | No log | 88.96 | 267 | 0.8053 | 0.815 | 0.2885 | 1.3830 | 0.815 | 0.8034 | 0.1883 | 0.0592 | | No log | 89.96 | 270 | 0.8056 | 0.815 | 0.2883 | 1.3830 | 0.815 | 0.8034 | 0.1977 | 0.0590 | | No log | 90.96 | 273 | 0.8051 | 0.82 | 0.2878 | 1.3815 | 0.82 | 0.8075 | 0.2024 | 0.0586 | | No log | 91.96 | 276 | 0.8052 | 0.82 | 0.2877 | 1.3805 | 0.82 | 0.8075 | 0.2133 | 0.0584 | | No log | 92.96 | 279 | 0.8055 | 0.82 | 0.2879 | 1.3799 | 0.82 | 0.8075 | 0.1953 | 0.0587 | | No log | 93.96 | 282 | 0.8055 | 0.82 | 0.2878 | 1.3793 | 0.82 | 0.8075 | 0.1995 | 0.0586 | | No log | 94.96 | 285 | 0.8061 | 0.815 | 0.2881 | 1.3797 | 0.815 | 0.7975 | 0.1905 | 0.0590 | | No log | 95.96 | 288 | 0.8064 | 0.815 | 0.2882 | 1.3798 | 0.815 | 0.7975 | 0.1990 | 0.0590 | | No log | 96.96 | 291 | 0.8065 | 0.82 | 0.2883 | 1.3799 | 0.82 | 0.8075 | 0.2140 | 0.0588 | | No log | 97.96 | 294 | 0.8069 | 0.815 | 0.2884 | 1.3802 | 0.815 | 0.7975 | 0.1916 | 0.0591 | | No log | 98.96 | 297 | 0.8070 | 0.82 | 0.2884 | 1.3803 | 0.82 | 0.8075 | 0.1968 | 0.0592 | | No log | 99.96 | 300 | 0.8070 | 0.82 | 
0.2884 | 1.3803 | 0.82 | 0.8075 | 0.1968 | 0.0591 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/dit-base_tobacco_crl_allv2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-base_tobacco_crl_allv2 This model is a fine-tuned version of [jordyvl/dit-base_tobacco](https://huggingface.co/jordyvl/dit-base_tobacco) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.3885 - Accuracy: 0.945 - Brier Loss: 0.1018 - Nll: 0.7205 - F1 Micro: 0.945 - F1 Macro: 0.9429 - Ece: 0.0554 - Aurc: 0.0107 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 256 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 0.96 | 3 | 0.3286 | 0.925 | 0.1161 | 1.0901 | 0.925 | 0.9187 | 0.0807 | 0.0113 | | No log | 1.96 | 6 | 0.3344 | 0.93 | 0.1123 | 1.0906 | 0.93 | 0.9219 | 0.0760 | 0.0120 | | No log | 2.96 | 9 | 0.3363 | 0.935 | 0.1092 | 1.0909 | 0.935 | 0.9319 | 0.0711 | 0.0135 | | No log | 3.96 | 12 | 0.3530 | 0.935 | 0.1135 | 1.0883 | 0.935 | 0.9320 | 0.0753 | 0.0158 | | No log | 4.96 | 15 | 0.3673 | 0.93 | 0.1170 | 1.0778 | 0.93 | 0.9247 | 0.0811 | 0.0162 | | No log | 5.96 | 18 | 0.3583 | 0.93 | 0.1167 | 1.0706 | 0.93 | 0.9247 | 0.0708 | 0.0159 | | No log | 6.96 | 21 | 0.3469 | 0.93 | 0.1121 | 1.0675 | 0.93 | 0.9247 | 0.0783 | 0.0148 | | No log | 7.96 | 24 | 0.3357 | 0.935 | 0.1071 | 1.0654 | 0.935 | 0.9279 | 0.0724 | 0.0136 | | No log | 8.96 | 27 | 0.3329 | 0.935 | 0.1048 | 1.0488 | 0.935 | 0.9279 | 0.0615 | 0.0127 | | No log | 9.96 | 30 | 0.3354 | 0.94 | 0.1027 | 1.0178 | 0.94 | 0.9391 | 0.0636 | 0.0130 | | No log | 10.96 | 33 | 0.3350 | 0.94 | 0.1018 | 1.0054 | 0.94 | 0.9418 | 0.0616 | 0.0136 | | No log | 11.96 | 36 | 0.3342 | 0.94 | 0.1012 | 1.0160 | 0.94 | 0.9418 | 0.0632 | 0.0133 | | No log | 12.96 | 39 | 0.3341 | 0.935 | 0.1002 | 1.0132 | 0.935 | 0.9318 | 0.0692 | 0.0135 | | No log | 13.96 | 42 | 0.3427 | 0.93 | 0.1039 | 1.0032 | 0.93 | 0.9275 | 0.0644 | 0.0137 | | No log | 14.96 | 45 | 0.3393 | 0.945 | 0.0985 | 0.9986 | 0.945 | 0.9406 | 0.0581 | 0.0122 | | No log | 15.96 | 48 | 0.3304 | 0.94 | 0.0995 | 0.9934 | 0.94 | 0.9390 | 0.0575 | 0.0124 | | No log | 16.96 | 51 | 0.3372 | 0.94 | 0.1010 | 0.9796 | 0.94 | 0.9390 | 0.0597 | 0.0127 | | No log | 17.96 | 54 | 0.3399 | 0.94 | 0.1023 | 0.9591 | 0.94 | 0.9459 | 0.0603 | 0.0123 | | No log | 18.96 | 57 | 0.3443 | 0.94 | 0.1044 | 0.9473 | 0.94 | 0.9459 | 0.0588 | 0.0122 | | No log | 19.96 | 60 | 0.3491 | 0.94 | 0.1064 | 0.9401 | 0.94 | 0.9398 | 0.0617 | 0.0122 | | No log | 20.96 | 63 | 0.3510 | 0.94 | 0.1081 | 0.9288 | 0.94 | 0.9398 | 0.0681 | 0.0131 | | No log | 21.96 | 66 | 0.3485 | 0.94 | 0.1074 | 0.9111 | 0.94 | 0.9398 | 0.0628 | 0.0132 | | No log | 22.96 | 69 | 0.3481 | 0.935 | 0.1056 | 0.8993 | 0.935 | 0.9382 | 0.0616 | 0.0132 | | No log | 23.96 | 72 | 0.3605 | 0.935 | 0.1131 | 0.9013 | 0.935 | 0.9378 | 0.0684 | 
0.0120 | | No log | 24.96 | 75 | 0.3738 | 0.935 | 0.1159 | 0.9113 | 0.935 | 0.9377 | 0.0683 | 0.0117 | | No log | 25.96 | 78 | 0.3657 | 0.935 | 0.1108 | 0.8932 | 0.935 | 0.9394 | 0.0690 | 0.0124 | | No log | 26.96 | 81 | 0.3511 | 0.94 | 0.1060 | 0.8761 | 0.94 | 0.9446 | 0.0563 | 0.0120 | | No log | 27.96 | 84 | 0.3375 | 0.94 | 0.1025 | 0.8662 | 0.94 | 0.9446 | 0.0602 | 0.0108 | | No log | 28.96 | 87 | 0.3369 | 0.94 | 0.1019 | 0.8654 | 0.94 | 0.9446 | 0.0558 | 0.0090 | | No log | 29.96 | 90 | 0.3423 | 0.94 | 0.1055 | 0.8602 | 0.94 | 0.9446 | 0.0610 | 0.0076 | | No log | 30.96 | 93 | 0.3458 | 0.945 | 0.1065 | 0.8525 | 0.945 | 0.9474 | 0.0605 | 0.0078 | | No log | 31.96 | 96 | 0.3436 | 0.945 | 0.1035 | 0.8390 | 0.945 | 0.9490 | 0.0591 | 0.0082 | | No log | 32.96 | 99 | 0.3436 | 0.94 | 0.1025 | 0.8294 | 0.94 | 0.9397 | 0.0574 | 0.0086 | | No log | 33.96 | 102 | 0.3481 | 0.94 | 0.0990 | 0.8225 | 0.94 | 0.9398 | 0.0579 | 0.0102 | | No log | 34.96 | 105 | 0.3519 | 0.945 | 0.0965 | 0.8203 | 0.945 | 0.9491 | 0.0576 | 0.0109 | | No log | 35.96 | 108 | 0.3551 | 0.945 | 0.0939 | 0.8213 | 0.945 | 0.9491 | 0.0547 | 0.0116 | | No log | 36.96 | 111 | 0.3611 | 0.95 | 0.0945 | 0.8193 | 0.9500 | 0.9519 | 0.0556 | 0.0117 | | No log | 37.96 | 114 | 0.3678 | 0.94 | 0.1037 | 0.8166 | 0.94 | 0.9446 | 0.0591 | 0.0116 | | No log | 38.96 | 117 | 0.3740 | 0.94 | 0.1086 | 0.8226 | 0.94 | 0.9446 | 0.0588 | 0.0112 | | No log | 39.96 | 120 | 0.3754 | 0.94 | 0.1106 | 0.8328 | 0.94 | 0.9446 | 0.0631 | 0.0114 | | No log | 40.96 | 123 | 0.3699 | 0.94 | 0.1097 | 0.8241 | 0.94 | 0.9446 | 0.0584 | 0.0116 | | No log | 41.96 | 126 | 0.3606 | 0.94 | 0.1051 | 0.8010 | 0.94 | 0.9446 | 0.0550 | 0.0122 | | No log | 42.96 | 129 | 0.3548 | 0.94 | 0.0970 | 0.7939 | 0.94 | 0.9447 | 0.0603 | 0.0130 | | No log | 43.96 | 132 | 0.3533 | 0.95 | 0.0948 | 0.7902 | 0.9500 | 0.9522 | 0.0589 | 0.0131 | | No log | 44.96 | 135 | 0.3588 | 0.945 | 0.0973 | 0.7818 | 0.945 | 0.9478 | 0.0540 | 0.0128 | | No log | 45.96 | 138 | 0.3634 | 0.945 | 0.1011 | 0.7802 | 0.945 | 0.9478 | 0.0566 | 0.0124 | | No log | 46.96 | 141 | 0.3642 | 0.945 | 0.1018 | 0.7813 | 0.945 | 0.9478 | 0.0564 | 0.0108 | | No log | 47.96 | 144 | 0.3624 | 0.945 | 0.1018 | 0.7858 | 0.945 | 0.9478 | 0.0568 | 0.0104 | | No log | 48.96 | 147 | 0.3653 | 0.945 | 0.1011 | 0.7949 | 0.945 | 0.9478 | 0.0570 | 0.0110 | | No log | 49.96 | 150 | 0.3697 | 0.945 | 0.1022 | 0.8296 | 0.945 | 0.9478 | 0.0565 | 0.0110 | | No log | 50.96 | 153 | 0.3705 | 0.945 | 0.1025 | 0.8677 | 0.945 | 0.9478 | 0.0558 | 0.0111 | | No log | 51.96 | 156 | 0.3753 | 0.945 | 0.1042 | 0.7933 | 0.945 | 0.9478 | 0.0553 | 0.0106 | | No log | 52.96 | 159 | 0.3763 | 0.945 | 0.1038 | 0.7869 | 0.945 | 0.9478 | 0.0580 | 0.0112 | | No log | 53.96 | 162 | 0.3735 | 0.95 | 0.1007 | 0.7751 | 0.9500 | 0.9522 | 0.0551 | 0.0114 | | No log | 54.96 | 165 | 0.3713 | 0.95 | 0.0995 | 0.7660 | 0.9500 | 0.9522 | 0.0551 | 0.0112 | | No log | 55.96 | 168 | 0.3709 | 0.95 | 0.0985 | 0.7592 | 0.9500 | 0.9522 | 0.0540 | 0.0108 | | No log | 56.96 | 171 | 0.3747 | 0.95 | 0.0985 | 0.7580 | 0.9500 | 0.9522 | 0.0530 | 0.0119 | | No log | 57.96 | 174 | 0.3793 | 0.95 | 0.0992 | 0.7567 | 0.9500 | 0.9522 | 0.0532 | 0.0121 | | No log | 58.96 | 177 | 0.3802 | 0.95 | 0.0992 | 0.7519 | 0.9500 | 0.9522 | 0.0540 | 0.0115 | | No log | 59.96 | 180 | 0.3815 | 0.95 | 0.1007 | 0.7485 | 0.9500 | 0.9522 | 0.0545 | 0.0108 | | No log | 60.96 | 183 | 0.3873 | 0.945 | 0.1069 | 0.7489 | 0.945 | 0.9478 | 0.0572 | 0.0098 | | No log | 61.96 | 186 | 0.3883 | 0.945 | 0.1070 | 0.7477 | 
0.945 | 0.9478 | 0.0564 | 0.0097 | | No log | 62.96 | 189 | 0.3818 | 0.945 | 0.1053 | 0.7451 | 0.945 | 0.9478 | 0.0561 | 0.0096 | | No log | 63.96 | 192 | 0.3745 | 0.945 | 0.1048 | 0.7446 | 0.945 | 0.9478 | 0.0586 | 0.0101 | | No log | 64.96 | 195 | 0.3762 | 0.945 | 0.1030 | 0.8090 | 0.945 | 0.9478 | 0.0576 | 0.0103 | | No log | 65.96 | 198 | 0.3822 | 0.95 | 0.1025 | 0.8092 | 0.9500 | 0.9522 | 0.0564 | 0.0104 | | No log | 66.96 | 201 | 0.3896 | 0.95 | 0.1030 | 0.8112 | 0.9500 | 0.9522 | 0.0566 | 0.0103 | | No log | 67.96 | 204 | 0.3914 | 0.945 | 0.1036 | 0.8095 | 0.945 | 0.9490 | 0.0586 | 0.0102 | | No log | 68.96 | 207 | 0.3900 | 0.945 | 0.1043 | 0.8060 | 0.945 | 0.9490 | 0.0585 | 0.0097 | | No log | 69.96 | 210 | 0.3903 | 0.945 | 0.1059 | 0.7370 | 0.945 | 0.9490 | 0.0586 | 0.0099 | | No log | 70.96 | 213 | 0.3923 | 0.94 | 0.1069 | 0.7327 | 0.94 | 0.9446 | 0.0568 | 0.0096 | | No log | 71.96 | 216 | 0.3894 | 0.94 | 0.1070 | 0.7316 | 0.94 | 0.9446 | 0.0611 | 0.0094 | | No log | 72.96 | 219 | 0.3847 | 0.94 | 0.1053 | 0.7318 | 0.94 | 0.9446 | 0.0607 | 0.0100 | | No log | 73.96 | 222 | 0.3833 | 0.94 | 0.1043 | 0.7315 | 0.94 | 0.9446 | 0.0603 | 0.0105 | | No log | 74.96 | 225 | 0.3822 | 0.935 | 0.1041 | 0.7310 | 0.935 | 0.9353 | 0.0620 | 0.0101 | | No log | 75.96 | 228 | 0.3771 | 0.945 | 0.1026 | 0.7314 | 0.945 | 0.9429 | 0.0552 | 0.0100 | | No log | 76.96 | 231 | 0.3748 | 0.945 | 0.1014 | 0.7322 | 0.945 | 0.9429 | 0.0569 | 0.0100 | | No log | 77.96 | 234 | 0.3759 | 0.945 | 0.1010 | 0.7332 | 0.945 | 0.9429 | 0.0556 | 0.0104 | | No log | 78.96 | 237 | 0.3775 | 0.945 | 0.1009 | 0.7346 | 0.945 | 0.9429 | 0.0546 | 0.0107 | | No log | 79.96 | 240 | 0.3784 | 0.94 | 0.1012 | 0.7343 | 0.94 | 0.9398 | 0.0558 | 0.0108 | | No log | 80.96 | 243 | 0.3797 | 0.94 | 0.1013 | 0.7340 | 0.94 | 0.9398 | 0.0559 | 0.0109 | | No log | 81.96 | 246 | 0.3821 | 0.94 | 0.1012 | 0.7359 | 0.94 | 0.9398 | 0.0578 | 0.0109 | | No log | 82.96 | 249 | 0.3836 | 0.94 | 0.1011 | 0.7332 | 0.94 | 0.9398 | 0.0576 | 0.0108 | | No log | 83.96 | 252 | 0.3844 | 0.94 | 0.1009 | 0.7318 | 0.94 | 0.9398 | 0.0574 | 0.0106 | | No log | 84.96 | 255 | 0.3859 | 0.94 | 0.1009 | 0.7316 | 0.94 | 0.9398 | 0.0572 | 0.0106 | | No log | 85.96 | 258 | 0.3885 | 0.94 | 0.1012 | 0.7312 | 0.94 | 0.9398 | 0.0546 | 0.0106 | | No log | 86.96 | 261 | 0.3898 | 0.945 | 0.1015 | 0.7292 | 0.945 | 0.9429 | 0.0546 | 0.0106 | | No log | 87.96 | 264 | 0.3905 | 0.945 | 0.1018 | 0.7265 | 0.945 | 0.9429 | 0.0560 | 0.0108 | | No log | 88.96 | 267 | 0.3909 | 0.945 | 0.1020 | 0.7239 | 0.945 | 0.9429 | 0.0558 | 0.0106 | | No log | 89.96 | 270 | 0.3903 | 0.945 | 0.1018 | 0.7219 | 0.945 | 0.9429 | 0.0559 | 0.0105 | | No log | 90.96 | 273 | 0.3895 | 0.945 | 0.1017 | 0.7208 | 0.945 | 0.9429 | 0.0559 | 0.0105 | | No log | 91.96 | 276 | 0.3891 | 0.945 | 0.1017 | 0.7202 | 0.945 | 0.9429 | 0.0562 | 0.0104 | | No log | 92.96 | 279 | 0.3890 | 0.945 | 0.1017 | 0.7201 | 0.945 | 0.9429 | 0.0564 | 0.0106 | | No log | 93.96 | 282 | 0.3889 | 0.945 | 0.1018 | 0.7202 | 0.945 | 0.9429 | 0.0554 | 0.0105 | | No log | 94.96 | 285 | 0.3883 | 0.945 | 0.1016 | 0.7206 | 0.945 | 0.9429 | 0.0555 | 0.0105 | | No log | 95.96 | 288 | 0.3880 | 0.945 | 0.1016 | 0.7210 | 0.945 | 0.9429 | 0.0556 | 0.0107 | | No log | 96.96 | 291 | 0.3880 | 0.945 | 0.1016 | 0.7209 | 0.945 | 0.9429 | 0.0555 | 0.0107 | | No log | 97.96 | 294 | 0.3882 | 0.945 | 0.1017 | 0.7207 | 0.945 | 0.9429 | 0.0555 | 0.0107 | | No log | 98.96 | 297 | 0.3884 | 0.945 | 0.1017 | 0.7205 | 0.945 | 0.9429 | 0.0554 | 0.0107 | | No log | 99.96 | 300 | 
0.3885 | 0.945 | 0.1018 | 0.7205 | 0.945 | 0.9429 | 0.0554 | 0.0107 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/test_crl_entropy_large
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # test_crl_entropy_large This model is a fine-tuned version of [jordyvl/vit-base_rvl-cdip](https://huggingface.co/jordyvl/vit-base_rvl-cdip) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.4581 - Accuracy: 0.915 - Brier Loss: 0.1532 - Nll: 1.2415 - F1 Micro: 0.915 - F1 Macro: 0.9138 - Ece: 0.1052 - Aurc: 0.0187 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:| | No log | 0.96 | 12 | 2.5859 | 0.05 | 0.9010 | 18.0478 | 0.0500 | 0.0539 | 0.1568 | 0.9609 | | No log | 2.0 | 25 | 2.5164 | 0.225 | 0.8825 | 11.6684 | 0.225 | 0.1480 | 0.2600 | 0.7882 | | No log | 2.96 | 37 | 2.3841 | 0.435 | 0.8485 | 4.9677 | 0.435 | 0.2723 | 0.4030 | 0.2751 | | No log | 4.0 | 50 | 2.1765 | 0.72 | 0.7878 | 3.1818 | 0.72 | 0.6092 | 0.5962 | 0.0863 | | No log | 4.96 | 62 | 1.9498 | 0.8 | 0.7104 | 1.5065 | 0.8000 | 0.7202 | 0.6069 | 0.0453 | | No log | 6.0 | 75 | 1.6936 | 0.85 | 0.6116 | 1.3333 | 0.85 | 0.8023 | 0.5897 | 0.0375 | | No log | 6.96 | 87 | 1.4794 | 0.885 | 0.5207 | 1.1014 | 0.885 | 0.8689 | 0.5429 | 0.0360 | | No log | 8.0 | 100 | 1.2753 | 0.905 | 0.4293 | 0.8992 | 0.905 | 0.8958 | 0.4920 | 0.0289 | | No log | 8.96 | 112 | 1.1140 | 0.91 | 0.3546 | 1.0480 | 0.91 | 0.8979 | 0.4226 | 0.0229 | | No log | 10.0 | 125 | 0.9650 | 0.91 | 0.2872 | 0.8855 | 0.91 | 0.9076 | 0.3683 | 0.0210 | | No log | 10.96 | 137 | 0.8788 | 0.915 | 0.2493 | 1.0637 | 0.915 | 0.9156 | 0.3070 | 0.0202 | | No log | 12.0 | 150 | 0.7997 | 0.9 | 0.2207 | 1.0602 | 0.9 | 0.9020 | 0.2929 | 0.0201 | | No log | 12.96 | 162 | 0.7460 | 0.915 | 0.2033 | 0.8874 | 0.915 | 0.9106 | 0.2675 | 0.0175 | | No log | 14.0 | 175 | 0.6631 | 0.925 | 0.1713 | 0.8091 | 0.925 | 0.9178 | 0.2271 | 0.0137 | | No log | 14.96 | 187 | 0.6219 | 0.925 | 0.1593 | 0.9488 | 0.925 | 0.9169 | 0.2183 | 0.0177 | | No log | 16.0 | 200 | 0.5861 | 0.93 | 0.1531 | 0.9181 | 0.93 | 0.9345 | 0.2110 | 0.0193 | | No log | 16.96 | 212 | 0.5557 | 0.93 | 0.1407 | 0.9323 | 0.93 | 0.9247 | 0.1725 | 0.0186 | | No log | 18.0 | 225 | 0.5394 | 0.92 | 0.1446 | 0.7790 | 0.92 | 0.9150 | 0.1847 | 0.0165 | | No log | 18.96 | 237 | 0.5170 | 0.93 | 0.1345 | 0.7822 | 0.93 | 0.9269 | 0.1598 | 0.0157 | | No log | 20.0 | 250 | 0.5079 | 0.91 | 0.1356 | 0.9286 | 0.91 | 0.9084 | 0.1598 | 0.0161 | | No log | 20.96 | 262 | 0.4945 | 0.92 | 0.1342 | 0.7583 | 0.92 | 0.9150 | 0.1470 | 0.0153 | | No log | 22.0 | 275 | 0.4850 | 0.91 | 0.1330 | 0.7760 | 0.91 | 0.9084 | 0.1398 | 0.0155 | | No log | 22.96 | 287 | 0.4828 | 0.91 | 0.1334 | 0.9411 | 0.91 | 0.9084 | 0.1487 | 0.0154 | | No log | 24.0 | 300 | 0.4758 | 0.91 | 0.1324 | 0.9241 | 0.91 | 0.9084 | 
0.1294 | 0.0153 | | No log | 24.96 | 312 | 0.4712 | 0.91 | 0.1327 | 1.0746 | 0.91 | 0.9084 | 0.1322 | 0.0156 | | No log | 26.0 | 325 | 0.4672 | 0.91 | 0.1321 | 1.0726 | 0.91 | 0.9084 | 0.1248 | 0.0153 | | No log | 26.96 | 337 | 0.4659 | 0.91 | 0.1331 | 1.0712 | 0.91 | 0.9084 | 0.1320 | 0.0153 | | No log | 28.0 | 350 | 0.4618 | 0.91 | 0.1323 | 1.0693 | 0.91 | 0.9084 | 0.1276 | 0.0151 | | No log | 28.96 | 362 | 0.4564 | 0.91 | 0.1315 | 1.0707 | 0.91 | 0.9084 | 0.1284 | 0.0156 | | No log | 30.0 | 375 | 0.4587 | 0.91 | 0.1348 | 1.0736 | 0.91 | 0.9080 | 0.1334 | 0.0160 | | No log | 30.96 | 387 | 0.4554 | 0.91 | 0.1334 | 1.0663 | 0.91 | 0.9080 | 0.1104 | 0.0155 | | No log | 32.0 | 400 | 0.4512 | 0.91 | 0.1331 | 1.0691 | 0.91 | 0.9080 | 0.1079 | 0.0158 | | No log | 32.96 | 412 | 0.4569 | 0.91 | 0.1385 | 1.2348 | 0.91 | 0.9080 | 0.1184 | 0.0166 | | No log | 34.0 | 425 | 0.4516 | 0.91 | 0.1362 | 1.0761 | 0.91 | 0.9080 | 0.1178 | 0.0160 | | No log | 34.96 | 437 | 0.4531 | 0.905 | 0.1376 | 1.0636 | 0.905 | 0.9051 | 0.1252 | 0.0156 | | No log | 36.0 | 450 | 0.4526 | 0.91 | 0.1384 | 1.1004 | 0.91 | 0.9080 | 0.1128 | 0.0164 | | No log | 36.96 | 462 | 0.4460 | 0.91 | 0.1345 | 1.0779 | 0.91 | 0.9080 | 0.1093 | 0.0173 | | No log | 38.0 | 475 | 0.4422 | 0.905 | 0.1347 | 1.0680 | 0.905 | 0.9047 | 0.1237 | 0.0163 | | No log | 38.96 | 487 | 0.4510 | 0.905 | 0.1410 | 1.2254 | 0.905 | 0.9065 | 0.1138 | 0.0176 | | 0.6142 | 40.0 | 500 | 0.4426 | 0.91 | 0.1370 | 1.2232 | 0.91 | 0.9080 | 0.1067 | 0.0163 | | 0.6142 | 40.96 | 512 | 0.4469 | 0.91 | 0.1407 | 1.2227 | 0.91 | 0.9080 | 0.1285 | 0.0159 | | 0.6142 | 42.0 | 525 | 0.4456 | 0.91 | 0.1403 | 1.2315 | 0.91 | 0.9080 | 0.1099 | 0.0163 | | 0.6142 | 42.96 | 537 | 0.4336 | 0.905 | 0.1319 | 1.2228 | 0.905 | 0.9028 | 0.1132 | 0.0180 | | 0.6142 | 44.0 | 550 | 0.4435 | 0.915 | 0.1357 | 1.0493 | 0.915 | 0.9133 | 0.1055 | 0.0146 | | 0.6142 | 44.96 | 562 | 0.5008 | 0.905 | 0.1585 | 1.4460 | 0.905 | 0.9070 | 0.1219 | 0.0234 | | 0.6142 | 46.0 | 575 | 0.4496 | 0.915 | 0.1428 | 1.3785 | 0.915 | 0.9138 | 0.1162 | 0.0164 | | 0.6142 | 46.96 | 587 | 0.4627 | 0.915 | 0.1476 | 1.3925 | 0.915 | 0.9138 | 0.1166 | 0.0178 | | 0.6142 | 48.0 | 600 | 0.4545 | 0.915 | 0.1459 | 1.3840 | 0.915 | 0.9138 | 0.1145 | 0.0172 | | 0.6142 | 48.96 | 612 | 0.4528 | 0.915 | 0.1457 | 1.3862 | 0.915 | 0.9138 | 0.1088 | 0.0175 | | 0.6142 | 50.0 | 625 | 0.4578 | 0.915 | 0.1480 | 1.3905 | 0.915 | 0.9138 | 0.1079 | 0.0176 | | 0.6142 | 50.96 | 637 | 0.4574 | 0.91 | 0.1473 | 1.3863 | 0.91 | 0.9105 | 0.1135 | 0.0178 | | 0.6142 | 52.0 | 650 | 0.4497 | 0.915 | 0.1455 | 1.3807 | 0.915 | 0.9138 | 0.1072 | 0.0172 | | 0.6142 | 52.96 | 662 | 0.4614 | 0.905 | 0.1503 | 1.3893 | 0.905 | 0.9076 | 0.1144 | 0.0174 | | 0.6142 | 54.0 | 675 | 0.4560 | 0.915 | 0.1480 | 1.3884 | 0.915 | 0.9138 | 0.1134 | 0.0179 | | 0.6142 | 54.96 | 687 | 0.4460 | 0.905 | 0.1457 | 1.2136 | 0.905 | 0.9076 | 0.1129 | 0.0163 | | 0.6142 | 56.0 | 700 | 0.4499 | 0.91 | 0.1480 | 1.3847 | 0.91 | 0.9109 | 0.1075 | 0.0176 | | 0.6142 | 56.96 | 712 | 0.4520 | 0.915 | 0.1477 | 1.3829 | 0.915 | 0.9138 | 0.1113 | 0.0175 | | 0.6142 | 58.0 | 725 | 0.4519 | 0.915 | 0.1481 | 1.5477 | 0.915 | 0.9138 | 0.1059 | 0.0187 | | 0.6142 | 58.96 | 737 | 0.4399 | 0.915 | 0.1440 | 1.0570 | 0.915 | 0.9138 | 0.1110 | 0.0167 | | 0.6142 | 60.0 | 750 | 0.4451 | 0.91 | 0.1473 | 1.2166 | 0.91 | 0.9080 | 0.1071 | 0.0172 | | 0.6142 | 60.96 | 762 | 0.4538 | 0.915 | 0.1502 | 1.2374 | 0.915 | 0.9138 | 0.1115 | 0.0183 | | 0.6142 | 62.0 | 775 | 0.4489 | 0.91 | 0.1487 | 1.2152 | 0.91 | 0.9105 | 
0.1049 | 0.0171 | | 0.6142 | 62.96 | 787 | 0.4489 | 0.915 | 0.1488 | 1.2265 | 0.915 | 0.9138 | 0.1048 | 0.0183 | | 0.6142 | 64.0 | 800 | 0.4480 | 0.915 | 0.1481 | 1.2384 | 0.915 | 0.9138 | 0.1092 | 0.0176 | | 0.6142 | 64.96 | 812 | 0.4422 | 0.91 | 0.1465 | 1.0633 | 0.91 | 0.9080 | 0.1064 | 0.0164 | | 0.6142 | 66.0 | 825 | 0.4410 | 0.91 | 0.1462 | 1.2394 | 0.91 | 0.9080 | 0.1060 | 0.0166 | | 0.6142 | 66.96 | 837 | 0.4380 | 0.91 | 0.1460 | 1.2227 | 0.91 | 0.9080 | 0.1006 | 0.0165 | | 0.6142 | 68.0 | 850 | 0.4434 | 0.91 | 0.1478 | 1.3838 | 0.91 | 0.9080 | 0.1025 | 0.0171 | | 0.6142 | 68.96 | 862 | 0.4511 | 0.91 | 0.1504 | 1.3874 | 0.91 | 0.9080 | 0.1023 | 0.0177 | | 0.6142 | 70.0 | 875 | 0.4405 | 0.91 | 0.1460 | 1.3857 | 0.91 | 0.9080 | 0.1076 | 0.0185 | | 0.6142 | 70.96 | 887 | 0.4558 | 0.915 | 0.1518 | 1.3987 | 0.915 | 0.9138 | 0.1061 | 0.0191 | | 0.6142 | 72.0 | 900 | 0.4434 | 0.91 | 0.1480 | 1.2224 | 0.91 | 0.9080 | 0.1021 | 0.0176 | | 0.6142 | 72.96 | 912 | 0.4455 | 0.91 | 0.1491 | 1.2312 | 0.91 | 0.9080 | 0.1031 | 0.0177 | | 0.6142 | 74.0 | 925 | 0.4549 | 0.915 | 0.1521 | 1.2332 | 0.915 | 0.9138 | 0.1066 | 0.0183 | | 0.6142 | 74.96 | 937 | 0.4567 | 0.91 | 0.1516 | 1.2475 | 0.91 | 0.9109 | 0.0993 | 0.0194 | | 0.6142 | 76.0 | 950 | 0.4465 | 0.905 | 0.1490 | 1.2255 | 0.905 | 0.9047 | 0.1050 | 0.0182 | | 0.6142 | 76.96 | 962 | 0.4425 | 0.91 | 0.1476 | 1.2256 | 0.91 | 0.9080 | 0.1037 | 0.0180 | | 0.6142 | 78.0 | 975 | 0.4535 | 0.91 | 0.1519 | 1.2320 | 0.91 | 0.9080 | 0.1070 | 0.0184 | | 0.6142 | 78.96 | 987 | 0.4509 | 0.915 | 0.1507 | 1.2352 | 0.915 | 0.9138 | 0.1036 | 0.0185 | | 0.0861 | 80.0 | 1000 | 0.4566 | 0.91 | 0.1527 | 1.2324 | 0.91 | 0.9109 | 0.1061 | 0.0185 | | 0.0861 | 80.96 | 1012 | 0.4544 | 0.915 | 0.1513 | 1.3928 | 0.915 | 0.9138 | 0.1074 | 0.0183 | | 0.0861 | 82.0 | 1025 | 0.4544 | 0.915 | 0.1512 | 1.2383 | 0.915 | 0.9138 | 0.1034 | 0.0186 | | 0.0861 | 82.96 | 1037 | 0.4597 | 0.915 | 0.1532 | 1.3974 | 0.915 | 0.9138 | 0.1014 | 0.0188 | | 0.0861 | 84.0 | 1050 | 0.4508 | 0.905 | 0.1515 | 1.2298 | 0.905 | 0.9047 | 0.1029 | 0.0180 | | 0.0861 | 84.96 | 1062 | 0.4507 | 0.915 | 0.1508 | 1.2297 | 0.915 | 0.9138 | 0.1058 | 0.0178 | | 0.0861 | 86.0 | 1075 | 0.4557 | 0.915 | 0.1524 | 1.3915 | 0.915 | 0.9138 | 0.1082 | 0.0183 | | 0.0861 | 86.96 | 1087 | 0.4532 | 0.915 | 0.1516 | 1.2274 | 0.915 | 0.9138 | 0.1077 | 0.0184 | | 0.0861 | 88.0 | 1100 | 0.4492 | 0.91 | 0.1506 | 1.2265 | 0.91 | 0.9080 | 0.1018 | 0.0181 | | 0.0861 | 88.96 | 1112 | 0.4502 | 0.91 | 0.1511 | 1.2304 | 0.91 | 0.9080 | 0.1062 | 0.0184 | | 0.0861 | 90.0 | 1125 | 0.4582 | 0.91 | 0.1536 | 1.2361 | 0.91 | 0.9080 | 0.1046 | 0.0186 | | 0.0861 | 90.96 | 1137 | 0.4576 | 0.915 | 0.1531 | 1.2504 | 0.915 | 0.9138 | 0.1046 | 0.0187 | | 0.0861 | 92.0 | 1150 | 0.4589 | 0.915 | 0.1533 | 1.3958 | 0.915 | 0.9138 | 0.1052 | 0.0188 | | 0.0861 | 92.96 | 1162 | 0.4594 | 0.915 | 0.1536 | 1.2503 | 0.915 | 0.9138 | 0.1114 | 0.0188 | | 0.0861 | 94.0 | 1175 | 0.4570 | 0.915 | 0.1529 | 1.2441 | 0.915 | 0.9138 | 0.1078 | 0.0185 | | 0.0861 | 94.96 | 1187 | 0.4577 | 0.915 | 0.1531 | 1.2402 | 0.915 | 0.9138 | 0.1052 | 0.0186 | | 0.0861 | 96.0 | 1200 | 0.4581 | 0.915 | 0.1532 | 1.2415 | 0.915 | 0.9138 | 0.1052 | 0.0187 | ### Framework versions - Transformers 4.28.0.dev0 - Pytorch 1.12.1+cu113 - Datasets 2.12.0 - Tokenizers 0.12.1
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
ALM-AHME/convnextv2-large-1k-224-finetuned-BreastCancer-Classification-BreakHis-AH-60-20-20-Shuffled
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# convnextv2-large-1k-224-finetuned-BreastCancer-Classification-BreakHis-AH-60-20-20-Shuffled

This model is a fine-tuned version of [facebook/convnextv2-large-1k-224](https://huggingface.co/facebook/convnextv2-large-1k-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0398
- Accuracy: 0.9882

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.9
- num_epochs: 14

### Training results

| Training Loss | Epoch | Step | Accuracy | Validation Loss |
|:-------------:|:-----:|:----:|:--------:|:---------------:|
| 0.5059        | 1.0   | 199  | 0.9001   | 0.4826          |
| 0.2533        | 2.0   | 398  | 0.9515   | 0.2124          |
| 0.2358        | 3.0   | 597  | 0.9538   | 0.1543          |
| 0.2584        | 4.0   | 796  | 0.9642   | 0.1136          |
| 0.1085        | 5.0   | 995  | 0.9746   | 0.0891          |
| 0.1007        | 6.0   | 1194 | 0.9769   | 0.0725          |
| 0.1463        | 7.0   | 1393 | 0.9840   | 0.0541          |
| 0.3564        | 8.0   | 1592 | 0.9802   | 0.0880          |
| 0.0957        | 9.0   | 1791 | 0.9656   | 0.1375          |
| 0.1481        | 10.0  | 1990 | 0.9873   | 0.0511          |
| 0.1536        | 11.0  | 2189 | 0.9713   | 0.0827          |
| 0.0458        | 12.0  | 2388 | 0.9882   | 0.0398          |
| 0.4956        | 13.0  | 2587 | 0.8643   | 0.3474          |
| 0.0801        | 14.0  | 2786 | 0.9797   | 0.0850          |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
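Since the card does not yet include a usage snippet, here is a minimal inference sketch. It assumes the standard `transformers` image-classification pipeline; the image path is a placeholder, and the label names (`benign`/`malignant`) come from the checkpoint's own config.

```python
from transformers import pipeline

# Minimal sketch: load the fine-tuned checkpoint and classify one histopathology image.
# "slide.png" is a placeholder path.
classifier = pipeline(
    "image-classification",
    model="ALM-AHME/convnextv2-large-1k-224-finetuned-BreastCancer-Classification-BreakHis-AH-60-20-20-Shuffled",
)

predictions = classifier("slide.png")
print(predictions)  # e.g. [{"label": "malignant", "score": ...}, {"label": "benign", "score": ...}]
```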
[ "benign", "malignant" ]
jordyvl/vit-base_rvl_tobacco_crl_allv2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl_tobacco_crl_allv2 This model is a fine-tuned version of [jordyvl/vit-base_rvl-cdip](https://huggingface.co/jordyvl/vit-base_rvl-cdip) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.4914 - Accuracy: 0.905 - Brier Loss: 0.1511 - Nll: 0.7302 - F1 Micro: 0.905 - F1 Macro: 0.9056 - Ece: 0.1839 - Aurc: 0.0184 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 256 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 0.96 | 3 | 2.3674 | 0.04 | 0.9050 | 9.6086 | 0.04 | 0.0438 | 0.1520 | 0.9678 | | No log | 1.96 | 6 | 2.3484 | 0.05 | 0.9004 | 8.5656 | 0.0500 | 0.0549 | 0.1567 | 0.9600 | | No log | 2.96 | 9 | 2.3106 | 0.095 | 0.8923 | 6.8919 | 0.095 | 0.0853 | 0.1832 | 0.9122 | | No log | 3.96 | 12 | 2.2538 | 0.27 | 0.8801 | 5.5263 | 0.27 | 0.1666 | 0.2921 | 0.7417 | | No log | 4.96 | 15 | 2.1898 | 0.39 | 0.8630 | 3.9890 | 0.39 | 0.2376 | 0.3516 | 0.4066 | | No log | 5.96 | 18 | 2.1124 | 0.49 | 0.8400 | 2.4772 | 0.49 | 0.3221 | 0.4366 | 0.2273 | | No log | 6.96 | 21 | 2.0178 | 0.65 | 0.8115 | 2.0300 | 0.65 | 0.5251 | 0.5447 | 0.1261 | | No log | 7.96 | 24 | 1.9053 | 0.74 | 0.7769 | 1.7062 | 0.74 | 0.6400 | 0.5939 | 0.0731 | | No log | 8.96 | 27 | 1.7938 | 0.78 | 0.7383 | 1.3250 | 0.78 | 0.6962 | 0.6183 | 0.0523 | | No log | 9.96 | 30 | 1.6845 | 0.825 | 0.6966 | 1.1128 | 0.825 | 0.7679 | 0.6216 | 0.0419 | | No log | 10.96 | 33 | 1.5726 | 0.835 | 0.6507 | 0.9279 | 0.835 | 0.7902 | 0.5892 | 0.0413 | | No log | 11.96 | 36 | 1.4703 | 0.855 | 0.6058 | 0.8610 | 0.855 | 0.8168 | 0.5789 | 0.0388 | | No log | 12.96 | 39 | 1.3783 | 0.88 | 0.5647 | 0.7009 | 0.88 | 0.8462 | 0.5745 | 0.0333 | | No log | 13.96 | 42 | 1.2986 | 0.885 | 0.5277 | 0.6657 | 0.885 | 0.8617 | 0.5458 | 0.0319 | | No log | 14.96 | 45 | 1.2260 | 0.91 | 0.4930 | 0.6487 | 0.91 | 0.8999 | 0.5400 | 0.0284 | | No log | 15.96 | 48 | 1.1579 | 0.91 | 0.4596 | 0.6307 | 0.91 | 0.8999 | 0.5268 | 0.0261 | | No log | 16.96 | 51 | 1.0942 | 0.915 | 0.4281 | 0.5573 | 0.915 | 0.9011 | 0.4938 | 0.0242 | | No log | 17.96 | 54 | 1.0383 | 0.915 | 0.4004 | 0.6216 | 0.915 | 0.9011 | 0.4780 | 0.0228 | | No log | 18.96 | 57 | 0.9877 | 0.92 | 0.3742 | 0.6119 | 0.92 | 0.9077 | 0.4502 | 0.0212 | | No log | 19.96 | 60 | 0.9380 | 0.925 | 0.3495 | 0.5968 | 0.925 | 0.9167 | 0.4332 | 0.0195 | | No log | 20.96 | 63 | 0.8961 | 0.92 | 0.3280 | 0.5971 | 0.92 | 0.9130 | 0.4097 | 0.0188 | | No log | 21.96 | 66 | 0.8581 | 0.925 | 0.3086 | 0.5969 | 0.925 | 0.9169 | 0.4032 | 0.0190 | | No log | 22.96 | 69 | 0.8218 | 0.92 | 0.2901 | 0.5901 | 0.92 | 0.9130 | 0.3764 | 0.0186 | | No log | 23.96 | 72 | 0.7899 | 0.92 | 0.2741 | 0.5890 | 0.92 | 0.9130 | 
0.3736 | 0.0181 | | No log | 24.96 | 75 | 0.7627 | 0.93 | 0.2603 | 0.5946 | 0.93 | 0.9245 | 0.3535 | 0.0175 | | No log | 25.96 | 78 | 0.7346 | 0.93 | 0.2470 | 0.5839 | 0.93 | 0.9245 | 0.3440 | 0.0165 | | No log | 26.96 | 81 | 0.7114 | 0.93 | 0.2361 | 0.5823 | 0.93 | 0.9245 | 0.3261 | 0.0169 | | No log | 27.96 | 84 | 0.6914 | 0.93 | 0.2265 | 0.5842 | 0.93 | 0.9245 | 0.3151 | 0.0170 | | No log | 28.96 | 87 | 0.6730 | 0.925 | 0.2179 | 0.5840 | 0.925 | 0.9216 | 0.3048 | 0.0170 | | No log | 29.96 | 90 | 0.6564 | 0.92 | 0.2107 | 0.6412 | 0.92 | 0.9159 | 0.2932 | 0.0171 | | No log | 30.96 | 93 | 0.6416 | 0.92 | 0.2043 | 0.6918 | 0.92 | 0.9159 | 0.2884 | 0.0172 | | No log | 31.96 | 96 | 0.6287 | 0.92 | 0.1986 | 0.6890 | 0.92 | 0.9140 | 0.2801 | 0.0169 | | No log | 32.96 | 99 | 0.6165 | 0.92 | 0.1934 | 0.6360 | 0.92 | 0.9150 | 0.2650 | 0.0170 | | No log | 33.96 | 102 | 0.6064 | 0.92 | 0.1892 | 0.6819 | 0.92 | 0.9150 | 0.2585 | 0.0170 | | No log | 34.96 | 105 | 0.5970 | 0.92 | 0.1855 | 0.6806 | 0.92 | 0.9150 | 0.2525 | 0.0169 | | No log | 35.96 | 108 | 0.5880 | 0.915 | 0.1818 | 0.6778 | 0.915 | 0.9121 | 0.2494 | 0.0171 | | No log | 36.96 | 111 | 0.5801 | 0.91 | 0.1789 | 0.6766 | 0.91 | 0.9089 | 0.2427 | 0.0170 | | No log | 37.96 | 114 | 0.5732 | 0.91 | 0.1764 | 0.6757 | 0.91 | 0.9089 | 0.2376 | 0.0166 | | No log | 38.96 | 117 | 0.5672 | 0.91 | 0.1739 | 0.6741 | 0.91 | 0.9089 | 0.2337 | 0.0165 | | No log | 39.96 | 120 | 0.5617 | 0.91 | 0.1717 | 0.6732 | 0.91 | 0.9089 | 0.2305 | 0.0165 | | No log | 40.96 | 123 | 0.5562 | 0.91 | 0.1698 | 0.6723 | 0.91 | 0.9089 | 0.2266 | 0.0164 | | No log | 41.96 | 126 | 0.5516 | 0.91 | 0.1682 | 0.6724 | 0.91 | 0.9089 | 0.2230 | 0.0165 | | No log | 42.96 | 129 | 0.5470 | 0.91 | 0.1666 | 0.6719 | 0.91 | 0.9089 | 0.2201 | 0.0165 | | No log | 43.96 | 132 | 0.5428 | 0.91 | 0.1653 | 0.6718 | 0.91 | 0.9089 | 0.2174 | 0.0167 | | No log | 44.96 | 135 | 0.5400 | 0.91 | 0.1642 | 0.6726 | 0.91 | 0.9089 | 0.2145 | 0.0167 | | No log | 45.96 | 138 | 0.5373 | 0.91 | 0.1632 | 0.6738 | 0.91 | 0.9089 | 0.2216 | 0.0168 | | No log | 46.96 | 141 | 0.5347 | 0.91 | 0.1621 | 0.6744 | 0.91 | 0.9089 | 0.2095 | 0.0167 | | No log | 47.96 | 144 | 0.5318 | 0.91 | 0.1613 | 0.6775 | 0.91 | 0.9089 | 0.2074 | 0.0166 | | No log | 48.96 | 147 | 0.5294 | 0.91 | 0.1606 | 0.7357 | 0.91 | 0.9089 | 0.2054 | 0.0166 | | No log | 49.96 | 150 | 0.5270 | 0.91 | 0.1598 | 0.7350 | 0.91 | 0.9089 | 0.2029 | 0.0166 | | No log | 50.96 | 153 | 0.5240 | 0.91 | 0.1593 | 0.7347 | 0.91 | 0.9089 | 0.1920 | 0.0166 | | No log | 51.96 | 156 | 0.5218 | 0.91 | 0.1586 | 0.7340 | 0.91 | 0.9089 | 0.1902 | 0.0167 | | No log | 52.96 | 159 | 0.5200 | 0.91 | 0.1581 | 0.7336 | 0.91 | 0.9089 | 0.1925 | 0.0168 | | No log | 53.96 | 162 | 0.5178 | 0.905 | 0.1576 | 0.7333 | 0.905 | 0.9056 | 0.1903 | 0.0171 | | No log | 54.96 | 165 | 0.5156 | 0.905 | 0.1571 | 0.7323 | 0.905 | 0.9056 | 0.1888 | 0.0172 | | No log | 55.96 | 168 | 0.5138 | 0.905 | 0.1565 | 0.7318 | 0.905 | 0.9056 | 0.1971 | 0.0172 | | No log | 56.96 | 171 | 0.5122 | 0.905 | 0.1561 | 0.7313 | 0.905 | 0.9056 | 0.1771 | 0.0173 | | No log | 57.96 | 174 | 0.5106 | 0.905 | 0.1557 | 0.7308 | 0.905 | 0.9056 | 0.1947 | 0.0175 | | No log | 58.96 | 177 | 0.5093 | 0.905 | 0.1554 | 0.7308 | 0.905 | 0.9056 | 0.2033 | 0.0176 | | No log | 59.96 | 180 | 0.5080 | 0.905 | 0.1550 | 0.7307 | 0.905 | 0.9056 | 0.2021 | 0.0175 | | No log | 60.96 | 183 | 0.5068 | 0.905 | 0.1547 | 0.7305 | 0.905 | 0.9056 | 0.1914 | 0.0176 | | No log | 61.96 | 186 | 0.5055 | 0.905 | 0.1545 | 0.7301 | 0.905 | 0.9056 | 0.1813 
| 0.0178 | | No log | 62.96 | 189 | 0.5039 | 0.905 | 0.1541 | 0.7292 | 0.905 | 0.9056 | 0.1803 | 0.0178 | | No log | 63.96 | 192 | 0.5029 | 0.905 | 0.1539 | 0.7297 | 0.905 | 0.9056 | 0.1893 | 0.0179 | | No log | 64.96 | 195 | 0.5018 | 0.905 | 0.1537 | 0.7292 | 0.905 | 0.9056 | 0.1883 | 0.0178 | | No log | 65.96 | 198 | 0.5007 | 0.905 | 0.1534 | 0.7286 | 0.905 | 0.9056 | 0.1873 | 0.0178 | | No log | 66.96 | 201 | 0.5002 | 0.905 | 0.1534 | 0.7292 | 0.905 | 0.9056 | 0.1869 | 0.0179 | | No log | 67.96 | 204 | 0.4993 | 0.905 | 0.1532 | 0.7287 | 0.905 | 0.9056 | 0.1771 | 0.0179 | | No log | 68.96 | 207 | 0.4989 | 0.905 | 0.1530 | 0.7287 | 0.905 | 0.9056 | 0.1668 | 0.0179 | | No log | 69.96 | 210 | 0.4988 | 0.905 | 0.1530 | 0.7290 | 0.905 | 0.9056 | 0.1664 | 0.0179 | | No log | 70.96 | 213 | 0.4983 | 0.905 | 0.1528 | 0.7289 | 0.905 | 0.9056 | 0.1658 | 0.0179 | | No log | 71.96 | 216 | 0.4977 | 0.905 | 0.1526 | 0.7290 | 0.905 | 0.9056 | 0.1653 | 0.0180 | | No log | 72.96 | 219 | 0.4971 | 0.905 | 0.1525 | 0.7289 | 0.905 | 0.9056 | 0.1732 | 0.0181 | | No log | 73.96 | 222 | 0.4966 | 0.905 | 0.1525 | 0.7291 | 0.905 | 0.9056 | 0.1731 | 0.0181 | | No log | 74.96 | 225 | 0.4961 | 0.905 | 0.1523 | 0.7291 | 0.905 | 0.9056 | 0.1726 | 0.0182 | | No log | 75.96 | 228 | 0.4955 | 0.905 | 0.1522 | 0.7292 | 0.905 | 0.9056 | 0.1718 | 0.0181 | | No log | 76.96 | 231 | 0.4950 | 0.905 | 0.1521 | 0.7290 | 0.905 | 0.9056 | 0.1714 | 0.0181 | | No log | 77.96 | 234 | 0.4945 | 0.905 | 0.1520 | 0.7289 | 0.905 | 0.9056 | 0.1712 | 0.0181 | | No log | 78.96 | 237 | 0.4946 | 0.905 | 0.1519 | 0.7290 | 0.905 | 0.9056 | 0.1709 | 0.0181 | | No log | 79.96 | 240 | 0.4944 | 0.905 | 0.1518 | 0.7292 | 0.905 | 0.9056 | 0.1705 | 0.0181 | | No log | 80.96 | 243 | 0.4942 | 0.905 | 0.1518 | 0.7293 | 0.905 | 0.9056 | 0.1732 | 0.0182 | | No log | 81.96 | 246 | 0.4939 | 0.905 | 0.1517 | 0.7293 | 0.905 | 0.9056 | 0.1698 | 0.0182 | | No log | 82.96 | 249 | 0.4937 | 0.905 | 0.1517 | 0.7299 | 0.905 | 0.9056 | 0.1695 | 0.0182 | | No log | 83.96 | 252 | 0.4932 | 0.905 | 0.1515 | 0.7294 | 0.905 | 0.9056 | 0.1698 | 0.0182 | | No log | 84.96 | 255 | 0.4929 | 0.905 | 0.1514 | 0.7292 | 0.905 | 0.9056 | 0.1695 | 0.0182 | | No log | 85.96 | 258 | 0.4928 | 0.905 | 0.1515 | 0.7297 | 0.905 | 0.9056 | 0.1693 | 0.0182 | | No log | 86.96 | 261 | 0.4925 | 0.905 | 0.1514 | 0.7296 | 0.905 | 0.9056 | 0.1690 | 0.0182 | | No log | 87.96 | 264 | 0.4923 | 0.905 | 0.1513 | 0.7294 | 0.905 | 0.9056 | 0.1688 | 0.0183 | | No log | 88.96 | 267 | 0.4921 | 0.905 | 0.1512 | 0.7296 | 0.905 | 0.9056 | 0.1687 | 0.0182 | | No log | 89.96 | 270 | 0.4920 | 0.905 | 0.1512 | 0.7299 | 0.905 | 0.9056 | 0.1685 | 0.0182 | | No log | 90.96 | 273 | 0.4918 | 0.905 | 0.1512 | 0.7298 | 0.905 | 0.9056 | 0.1683 | 0.0182 | | No log | 91.96 | 276 | 0.4917 | 0.905 | 0.1512 | 0.7298 | 0.905 | 0.9056 | 0.1775 | 0.0183 | | No log | 92.96 | 279 | 0.4916 | 0.905 | 0.1512 | 0.7299 | 0.905 | 0.9056 | 0.1773 | 0.0183 | | No log | 93.96 | 282 | 0.4915 | 0.905 | 0.1511 | 0.7300 | 0.905 | 0.9056 | 0.1843 | 0.0184 | | No log | 94.96 | 285 | 0.4915 | 0.905 | 0.1511 | 0.7300 | 0.905 | 0.9056 | 0.1842 | 0.0184 | | No log | 95.96 | 288 | 0.4915 | 0.905 | 0.1511 | 0.7302 | 0.905 | 0.9056 | 0.1841 | 0.0184 | | No log | 96.96 | 291 | 0.4914 | 0.905 | 0.1511 | 0.7302 | 0.905 | 0.9056 | 0.1840 | 0.0184 | | No log | 97.96 | 294 | 0.4914 | 0.905 | 0.1511 | 0.7302 | 0.905 | 0.9056 | 0.1840 | 0.0184 | | No log | 98.96 | 297 | 0.4914 | 0.905 | 0.1511 | 0.7302 | 0.905 | 0.9056 | 0.1840 | 0.0184 | | No log | 99.96 | 300 | 
0.4914 | 0.905 | 0.1511 | 0.7302 | 0.905 | 0.9056 | 0.1839 | 0.0184 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
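For readers who want to reproduce the optimisation setup listed above, a minimal `TrainingArguments` sketch that mirrors those hyperparameters could look as follows; the output directory is a placeholder and the model/dataset wiring is omitted.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed in this card. Adam's default betas (0.9, 0.999)
# and epsilon 1e-08 in TrainingArguments already match the values above.
training_args = TrainingArguments(
    output_dir="vit-base_rvl_tobacco_crl_allv2",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,  # 16 * 16 = total_train_batch_size of 256
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    seed=42,
)
```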
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/vit-base_rvl_tobacco_test_entropy_large
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl_tobacco_test_entropy_large This model is a fine-tuned version of [jordyvl/vit-base_rvl-cdip](https://huggingface.co/jordyvl/vit-base_rvl-cdip) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.4296 - Accuracy: 0.9 - Brier Loss: 0.1643 - Nll: 1.3751 - F1 Micro: 0.9 - F1 Macro: 0.9013 - Ece: 0.1074 - Aurc: 0.0220 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:| | No log | 0.96 | 12 | 2.5324 | 0.05 | 0.9010 | 18.0477 | 0.0500 | 0.0539 | 0.1568 | 0.9609 | | No log | 2.0 | 25 | 2.4440 | 0.23 | 0.8823 | 11.6657 | 0.23 | 0.1501 | 0.2626 | 0.7837 | | No log | 2.96 | 37 | 2.3026 | 0.435 | 0.8481 | 4.6649 | 0.435 | 0.2723 | 0.4061 | 0.2744 | | No log | 4.0 | 50 | 2.0897 | 0.71 | 0.7862 | 3.1934 | 0.7100 | 0.5915 | 0.5791 | 0.0865 | | No log | 4.96 | 62 | 1.8603 | 0.785 | 0.7065 | 1.8079 | 0.785 | 0.7047 | 0.6067 | 0.0508 | | No log | 6.0 | 75 | 1.6025 | 0.845 | 0.6037 | 1.3384 | 0.845 | 0.7944 | 0.5657 | 0.0393 | | No log | 6.96 | 87 | 1.3819 | 0.885 | 0.5101 | 1.1034 | 0.885 | 0.8660 | 0.5288 | 0.0355 | | No log | 8.0 | 100 | 1.1701 | 0.905 | 0.4187 | 0.9002 | 0.905 | 0.8958 | 0.4765 | 0.0302 | | No log | 8.96 | 112 | 1.0024 | 0.9 | 0.3462 | 0.8889 | 0.9 | 0.8913 | 0.3961 | 0.0259 | | No log | 10.0 | 125 | 0.8487 | 0.91 | 0.2796 | 1.0389 | 0.91 | 0.9067 | 0.3464 | 0.0225 | | No log | 10.96 | 137 | 0.7522 | 0.91 | 0.2395 | 1.0408 | 0.91 | 0.9090 | 0.2930 | 0.0225 | | No log | 12.0 | 150 | 0.6862 | 0.905 | 0.2163 | 1.2027 | 0.905 | 0.9094 | 0.2529 | 0.0250 | | No log | 12.96 | 162 | 0.6437 | 0.895 | 0.2035 | 1.0504 | 0.895 | 0.8998 | 0.2306 | 0.0230 | | No log | 14.0 | 175 | 0.5896 | 0.905 | 0.1851 | 0.8758 | 0.905 | 0.9040 | 0.2122 | 0.0201 | | No log | 14.96 | 187 | 0.5167 | 0.92 | 0.1528 | 0.7812 | 0.92 | 0.9119 | 0.1872 | 0.0143 | | No log | 16.0 | 200 | 0.4894 | 0.935 | 0.1450 | 0.9154 | 0.935 | 0.9314 | 0.1962 | 0.0186 | | No log | 16.96 | 212 | 0.4769 | 0.91 | 0.1451 | 1.1071 | 0.91 | 0.9109 | 0.1703 | 0.0196 | | No log | 18.0 | 225 | 0.4573 | 0.91 | 0.1399 | 1.0919 | 0.91 | 0.9084 | 0.1621 | 0.0182 | | No log | 18.96 | 237 | 0.4501 | 0.905 | 0.1409 | 0.9304 | 0.905 | 0.9056 | 0.1549 | 0.0177 | | No log | 20.0 | 250 | 0.4434 | 0.905 | 0.1393 | 1.2519 | 0.905 | 0.9056 | 0.1552 | 0.0180 | | No log | 20.96 | 262 | 0.4391 | 0.905 | 0.1403 | 1.2476 | 0.905 | 0.9056 | 0.1434 | 0.0178 | | No log | 22.0 | 275 | 0.4326 | 0.905 | 0.1401 | 1.2419 | 0.905 | 0.9056 | 0.1396 | 0.0180 | | No log | 22.96 | 287 | 0.4290 | 0.905 | 0.1407 | 1.2397 | 0.905 | 0.9051 | 0.1414 | 0.0181 | | No log | 24.0 | 300 | 0.4255 | 0.905 | 0.1408 | 
1.2373 | 0.905 | 0.9051 | 0.1346 | 0.0182 | | No log | 24.96 | 312 | 0.4238 | 0.905 | 0.1418 | 1.2372 | 0.905 | 0.9051 | 0.1308 | 0.0183 | | No log | 26.0 | 325 | 0.4212 | 0.905 | 0.1423 | 1.2348 | 0.905 | 0.9051 | 0.1288 | 0.0184 | | No log | 26.96 | 337 | 0.4197 | 0.905 | 0.1429 | 1.2350 | 0.905 | 0.9051 | 0.1242 | 0.0187 | | No log | 28.0 | 350 | 0.4181 | 0.905 | 0.1436 | 1.2331 | 0.905 | 0.9051 | 0.1298 | 0.0187 | | No log | 28.96 | 362 | 0.4171 | 0.905 | 0.1443 | 1.2339 | 0.905 | 0.9051 | 0.1341 | 0.0188 | | No log | 30.0 | 375 | 0.4158 | 0.905 | 0.1449 | 1.2322 | 0.905 | 0.9051 | 0.1261 | 0.0190 | | No log | 30.96 | 387 | 0.4156 | 0.905 | 0.1458 | 1.2337 | 0.905 | 0.9051 | 0.1310 | 0.0190 | | No log | 32.0 | 400 | 0.4145 | 0.905 | 0.1463 | 1.2323 | 0.905 | 0.9051 | 0.1244 | 0.0192 | | No log | 32.96 | 412 | 0.4145 | 0.905 | 0.1472 | 1.2342 | 0.905 | 0.9051 | 0.1175 | 0.0193 | | No log | 34.0 | 425 | 0.4140 | 0.905 | 0.1477 | 1.2353 | 0.905 | 0.9051 | 0.1163 | 0.0194 | | No log | 34.96 | 437 | 0.4138 | 0.905 | 0.1485 | 1.2384 | 0.905 | 0.9051 | 0.1296 | 0.0195 | | No log | 36.0 | 450 | 0.4137 | 0.905 | 0.1491 | 1.3855 | 0.905 | 0.9051 | 0.1271 | 0.0195 | | No log | 36.96 | 462 | 0.4134 | 0.905 | 0.1497 | 1.3846 | 0.905 | 0.9051 | 0.1264 | 0.0196 | | No log | 38.0 | 475 | 0.4137 | 0.905 | 0.1504 | 1.3842 | 0.905 | 0.9051 | 0.1254 | 0.0196 | | No log | 38.96 | 487 | 0.4136 | 0.91 | 0.1509 | 1.3835 | 0.91 | 0.9109 | 0.1160 | 0.0196 | | 0.5543 | 40.0 | 500 | 0.4140 | 0.91 | 0.1515 | 1.3831 | 0.91 | 0.9109 | 0.1190 | 0.0198 | | 0.5543 | 40.96 | 512 | 0.4138 | 0.91 | 0.1519 | 1.3825 | 0.91 | 0.9109 | 0.1186 | 0.0198 | | 0.5543 | 42.0 | 525 | 0.4143 | 0.91 | 0.1526 | 1.3822 | 0.91 | 0.9109 | 0.1180 | 0.0198 | | 0.5543 | 42.96 | 537 | 0.4143 | 0.91 | 0.1530 | 1.3816 | 0.91 | 0.9109 | 0.1220 | 0.0199 | | 0.5543 | 44.0 | 550 | 0.4148 | 0.91 | 0.1536 | 1.3815 | 0.91 | 0.9109 | 0.1214 | 0.0200 | | 0.5543 | 44.96 | 562 | 0.4149 | 0.91 | 0.1539 | 1.3809 | 0.91 | 0.9109 | 0.1208 | 0.0200 | | 0.5543 | 46.0 | 575 | 0.4154 | 0.91 | 0.1545 | 1.3807 | 0.91 | 0.9109 | 0.1173 | 0.0200 | | 0.5543 | 46.96 | 587 | 0.4157 | 0.91 | 0.1549 | 1.3803 | 0.91 | 0.9109 | 0.1084 | 0.0202 | | 0.5543 | 48.0 | 600 | 0.4162 | 0.91 | 0.1554 | 1.3801 | 0.91 | 0.9109 | 0.1080 | 0.0202 | | 0.5543 | 48.96 | 612 | 0.4163 | 0.91 | 0.1557 | 1.3798 | 0.91 | 0.9109 | 0.1068 | 0.0202 | | 0.5543 | 50.0 | 625 | 0.4169 | 0.91 | 0.1562 | 1.3795 | 0.91 | 0.9109 | 0.1066 | 0.0203 | | 0.5543 | 50.96 | 637 | 0.4171 | 0.91 | 0.1565 | 1.3793 | 0.91 | 0.9109 | 0.1064 | 0.0203 | | 0.5543 | 52.0 | 650 | 0.4177 | 0.91 | 0.1570 | 1.3791 | 0.91 | 0.9109 | 0.1120 | 0.0203 | | 0.5543 | 52.96 | 662 | 0.4180 | 0.91 | 0.1573 | 1.3789 | 0.91 | 0.9109 | 0.1117 | 0.0203 | | 0.5543 | 54.0 | 675 | 0.4185 | 0.91 | 0.1577 | 1.3786 | 0.91 | 0.9109 | 0.1065 | 0.0204 | | 0.5543 | 54.96 | 687 | 0.4187 | 0.91 | 0.1579 | 1.3785 | 0.91 | 0.9109 | 0.1063 | 0.0204 | | 0.5543 | 56.0 | 700 | 0.4193 | 0.91 | 0.1584 | 1.3782 | 0.91 | 0.9109 | 0.1062 | 0.0204 | | 0.5543 | 56.96 | 712 | 0.4196 | 0.91 | 0.1586 | 1.3782 | 0.91 | 0.9109 | 0.1058 | 0.0206 | | 0.5543 | 58.0 | 725 | 0.4200 | 0.91 | 0.1590 | 1.3779 | 0.91 | 0.9109 | 0.1060 | 0.0206 | | 0.5543 | 58.96 | 737 | 0.4203 | 0.91 | 0.1592 | 1.3778 | 0.91 | 0.9109 | 0.1095 | 0.0207 | | 0.5543 | 60.0 | 750 | 0.4209 | 0.91 | 0.1596 | 1.3776 | 0.91 | 0.9109 | 0.1055 | 0.0209 | | 0.5543 | 60.96 | 762 | 0.4213 | 0.91 | 0.1598 | 1.3776 | 0.91 | 0.9109 | 0.1091 | 0.0209 | | 0.5543 | 62.0 | 775 | 0.4216 | 0.91 | 0.1601 | 1.3772 | 
0.91 | 0.9109 | 0.1019 | 0.0210 | | 0.5543 | 62.96 | 787 | 0.4221 | 0.91 | 0.1603 | 1.3773 | 0.91 | 0.9109 | 0.1017 | 0.0211 | | 0.5543 | 64.0 | 800 | 0.4225 | 0.905 | 0.1606 | 1.3769 | 0.905 | 0.9064 | 0.0997 | 0.0211 | | 0.5543 | 64.96 | 812 | 0.4228 | 0.905 | 0.1608 | 1.3771 | 0.905 | 0.9064 | 0.0995 | 0.0211 | | 0.5543 | 66.0 | 825 | 0.4232 | 0.9 | 0.1611 | 1.3766 | 0.9 | 0.9013 | 0.1046 | 0.0212 | | 0.5543 | 66.96 | 837 | 0.4236 | 0.9 | 0.1613 | 1.3768 | 0.9 | 0.9013 | 0.1045 | 0.0213 | | 0.5543 | 68.0 | 850 | 0.4240 | 0.9 | 0.1615 | 1.3764 | 0.9 | 0.9013 | 0.1026 | 0.0213 | | 0.5543 | 68.96 | 862 | 0.4243 | 0.9 | 0.1617 | 1.3765 | 0.9 | 0.9013 | 0.1043 | 0.0213 | | 0.5543 | 70.0 | 875 | 0.4247 | 0.9 | 0.1619 | 1.3762 | 0.9 | 0.9013 | 0.1060 | 0.0213 | | 0.5543 | 70.96 | 887 | 0.4249 | 0.9 | 0.1620 | 1.3762 | 0.9 | 0.9013 | 0.1077 | 0.0214 | | 0.5543 | 72.0 | 900 | 0.4254 | 0.9 | 0.1623 | 1.3760 | 0.9 | 0.9013 | 0.1057 | 0.0214 | | 0.5543 | 72.96 | 912 | 0.4257 | 0.9 | 0.1624 | 1.3760 | 0.9 | 0.9013 | 0.1074 | 0.0213 | | 0.5543 | 74.0 | 925 | 0.4259 | 0.9 | 0.1625 | 1.3758 | 0.9 | 0.9013 | 0.1056 | 0.0213 | | 0.5543 | 74.96 | 937 | 0.4262 | 0.9 | 0.1627 | 1.3758 | 0.9 | 0.9013 | 0.1056 | 0.0214 | | 0.5543 | 76.0 | 950 | 0.4266 | 0.9 | 0.1629 | 1.3757 | 0.9 | 0.9013 | 0.1058 | 0.0216 | | 0.5543 | 76.96 | 962 | 0.4268 | 0.9 | 0.1630 | 1.3756 | 0.9 | 0.9013 | 0.1057 | 0.0216 | | 0.5543 | 78.0 | 975 | 0.4271 | 0.9 | 0.1631 | 1.3755 | 0.9 | 0.9013 | 0.1076 | 0.0216 | | 0.5543 | 78.96 | 987 | 0.4274 | 0.9 | 0.1632 | 1.3756 | 0.9 | 0.9013 | 0.1075 | 0.0216 | | 0.0526 | 80.0 | 1000 | 0.4275 | 0.9 | 0.1634 | 1.3754 | 0.9 | 0.9013 | 0.1100 | 0.0217 | | 0.0526 | 80.96 | 1012 | 0.4278 | 0.9 | 0.1635 | 1.3754 | 0.9 | 0.9013 | 0.1099 | 0.0218 | | 0.0526 | 82.0 | 1025 | 0.4280 | 0.9 | 0.1636 | 1.3753 | 0.9 | 0.9013 | 0.1098 | 0.0218 | | 0.0526 | 82.96 | 1037 | 0.4282 | 0.9 | 0.1637 | 1.3753 | 0.9 | 0.9013 | 0.1098 | 0.0218 | | 0.0526 | 84.0 | 1050 | 0.4284 | 0.9 | 0.1638 | 1.3753 | 0.9 | 0.9013 | 0.1097 | 0.0218 | | 0.0526 | 84.96 | 1062 | 0.4286 | 0.9 | 0.1638 | 1.3753 | 0.9 | 0.9013 | 0.1097 | 0.0218 | | 0.0526 | 86.0 | 1075 | 0.4288 | 0.9 | 0.1639 | 1.3752 | 0.9 | 0.9013 | 0.1096 | 0.0218 | | 0.0526 | 86.96 | 1087 | 0.4289 | 0.9 | 0.1640 | 1.3752 | 0.9 | 0.9013 | 0.1096 | 0.0218 | | 0.0526 | 88.0 | 1100 | 0.4291 | 0.9 | 0.1641 | 1.3752 | 0.9 | 0.9013 | 0.1095 | 0.0218 | | 0.0526 | 88.96 | 1112 | 0.4292 | 0.9 | 0.1641 | 1.3752 | 0.9 | 0.9013 | 0.1095 | 0.0218 | | 0.0526 | 90.0 | 1125 | 0.4293 | 0.9 | 0.1642 | 1.3752 | 0.9 | 0.9013 | 0.1074 | 0.0219 | | 0.0526 | 90.96 | 1137 | 0.4294 | 0.9 | 0.1642 | 1.3752 | 0.9 | 0.9013 | 0.1075 | 0.0219 | | 0.0526 | 92.0 | 1150 | 0.4295 | 0.9 | 0.1642 | 1.3752 | 0.9 | 0.9013 | 0.1075 | 0.0220 | | 0.0526 | 92.96 | 1162 | 0.4295 | 0.9 | 0.1643 | 1.3752 | 0.9 | 0.9013 | 0.1075 | 0.0220 | | 0.0526 | 94.0 | 1175 | 0.4296 | 0.9 | 0.1643 | 1.3751 | 0.9 | 0.9013 | 0.1075 | 0.0220 | | 0.0526 | 94.96 | 1187 | 0.4296 | 0.9 | 0.1643 | 1.3751 | 0.9 | 0.9013 | 0.1075 | 0.0220 | | 0.0526 | 96.0 | 1200 | 0.4296 | 0.9 | 0.1643 | 1.3751 | 0.9 | 0.9013 | 0.1074 | 0.0220 | ### Framework versions - Transformers 4.28.0.dev0 - Pytorch 1.12.1+cu113 - Datasets 2.12.0 - Tokenizers 0.12.1
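This card, like the related jordyvl checkpoints, reports calibration metrics (Brier loss, NLL, ECE, AURC) alongside accuracy. As an illustration of what the multi-class Brier score measures (this is not the exact evaluation code used for the card), a small NumPy sketch:

```python
import numpy as np

def brier_score(probs: np.ndarray, labels: np.ndarray) -> float:
    """Multi-class Brier score: mean squared distance between the predicted
    probability vector and the one-hot encoding of the true label."""
    n_samples, n_classes = probs.shape
    one_hot = np.zeros((n_samples, n_classes))
    one_hot[np.arange(n_samples), labels] = 1.0
    return float(np.mean(np.sum((probs - one_hot) ** 2, axis=1)))

# Toy example with 3 classes: a confident correct prediction contributes ~0,
# a confident wrong one approaches 2.
probs = np.array([[0.90, 0.05, 0.05], [0.20, 0.70, 0.10]])
labels = np.array([0, 2])
print(brier_score(probs, labels))  # (0.015 + 1.34) / 2 = 0.6775
```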
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
jordyvl/dit-finetuned_rvl_tobacco_crl_allv2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-finetuned_rvl_tobacco_crl_allv2 This model is a fine-tuned version of [microsoft/dit-base-finetuned-rvlcdip](https://huggingface.co/microsoft/dit-base-finetuned-rvlcdip) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.5457 - Accuracy: 0.93 - Brier Loss: 0.1595 - Nll: 0.7692 - F1 Micro: 0.93 - F1 Macro: 0.9154 - Ece: 0.2119 - Aurc: 0.0106 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 256 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 0.96 | 3 | 2.3428 | 0.005 | 0.9059 | 8.8950 | 0.005 | 0.0048 | 0.1391 | 0.9945 | | No log | 1.96 | 6 | 2.3437 | 0.005 | 0.9052 | 8.8527 | 0.005 | 0.0048 | 0.1390 | 0.9942 | | No log | 2.96 | 9 | 2.3397 | 0.005 | 0.9038 | 8.6108 | 0.005 | 0.0048 | 0.1389 | 0.9940 | | No log | 3.96 | 12 | 2.3292 | 0.01 | 0.9018 | 7.9364 | 0.01 | 0.0091 | 0.1413 | 0.9925 | | No log | 4.96 | 15 | 2.3200 | 0.02 | 0.8991 | 7.1029 | 0.02 | 0.0303 | 0.1462 | 0.9876 | | No log | 5.96 | 18 | 2.3074 | 0.05 | 0.8958 | 6.6583 | 0.0500 | 0.0527 | 0.1638 | 0.9775 | | No log | 6.96 | 21 | 2.2903 | 0.155 | 0.8918 | 6.3637 | 0.155 | 0.1012 | 0.2321 | 0.9477 | | No log | 7.96 | 24 | 2.2706 | 0.18 | 0.8869 | 6.0129 | 0.18 | 0.1088 | 0.2487 | 0.9279 | | No log | 8.96 | 27 | 2.2479 | 0.195 | 0.8810 | 4.5304 | 0.195 | 0.1154 | 0.2509 | 0.8991 | | No log | 9.96 | 30 | 2.2212 | 0.215 | 0.8734 | 3.2783 | 0.2150 | 0.1365 | 0.2504 | 0.8008 | | No log | 10.96 | 33 | 2.1853 | 0.305 | 0.8631 | 2.6641 | 0.305 | 0.1889 | 0.3073 | 0.6426 | | No log | 11.96 | 36 | 2.1391 | 0.405 | 0.8489 | 2.2824 | 0.405 | 0.2593 | 0.3747 | 0.4193 | | No log | 12.96 | 39 | 2.0878 | 0.485 | 0.8327 | 1.8576 | 0.485 | 0.3408 | 0.4203 | 0.2962 | | No log | 13.96 | 42 | 2.0311 | 0.58 | 0.8157 | 1.4206 | 0.58 | 0.4327 | 0.4908 | 0.2186 | | No log | 14.96 | 45 | 1.9707 | 0.605 | 0.7970 | 1.2940 | 0.605 | 0.4601 | 0.4866 | 0.1784 | | No log | 15.96 | 48 | 1.9196 | 0.64 | 0.7803 | 1.2372 | 0.64 | 0.4958 | 0.5098 | 0.1483 | | No log | 16.96 | 51 | 1.8690 | 0.68 | 0.7630 | 1.1960 | 0.68 | 0.5548 | 0.5403 | 0.1286 | | No log | 17.96 | 54 | 1.8192 | 0.73 | 0.7448 | 1.1405 | 0.7300 | 0.6227 | 0.5655 | 0.0954 | | No log | 18.96 | 57 | 1.7691 | 0.8 | 0.7272 | 1.1234 | 0.8000 | 0.6965 | 0.6131 | 0.0618 | | No log | 19.96 | 60 | 1.7210 | 0.835 | 0.7096 | 1.0912 | 0.835 | 0.7380 | 0.6351 | 0.0463 | | No log | 20.96 | 63 | 1.6734 | 0.84 | 0.6912 | 1.0707 | 0.8400 | 0.7410 | 0.6305 | 0.0404 | | No log | 21.96 | 66 | 1.6273 | 0.85 | 0.6724 | 1.0024 | 0.85 | 0.7458 | 0.6316 | 0.0335 | | No log | 22.96 | 69 | 1.5782 | 0.86 | 0.6525 | 0.9764 | 0.8600 | 0.7612 | 0.6265 | 0.0289 | | No log | 23.96 | 72 | 1.5293 | 0.87 | 
0.6327 | 0.9580 | 0.87 | 0.7690 | 0.6211 | 0.0227 | | No log | 24.96 | 75 | 1.4840 | 0.87 | 0.6131 | 0.9549 | 0.87 | 0.7730 | 0.6157 | 0.0223 | | No log | 25.96 | 78 | 1.4425 | 0.88 | 0.5937 | 0.9521 | 0.88 | 0.7829 | 0.6121 | 0.0209 | | No log | 26.96 | 81 | 1.4013 | 0.875 | 0.5740 | 0.9456 | 0.875 | 0.7799 | 0.5931 | 0.0216 | | No log | 27.96 | 84 | 1.3626 | 0.875 | 0.5550 | 0.9392 | 0.875 | 0.7828 | 0.5840 | 0.0219 | | No log | 28.96 | 87 | 1.3237 | 0.87 | 0.5360 | 0.9386 | 0.87 | 0.7802 | 0.5656 | 0.0225 | | No log | 29.96 | 90 | 1.2847 | 0.875 | 0.5166 | 0.9371 | 0.875 | 0.7854 | 0.5605 | 0.0208 | | No log | 30.96 | 93 | 1.2515 | 0.88 | 0.4993 | 0.8803 | 0.88 | 0.7875 | 0.5431 | 0.0205 | | No log | 31.96 | 96 | 1.2196 | 0.89 | 0.4824 | 0.9276 | 0.89 | 0.8120 | 0.5328 | 0.0192 | | No log | 32.96 | 99 | 1.1854 | 0.885 | 0.4648 | 0.9279 | 0.885 | 0.8020 | 0.5183 | 0.0195 | | No log | 33.96 | 102 | 1.1523 | 0.895 | 0.4480 | 0.9823 | 0.895 | 0.8347 | 0.5018 | 0.0189 | | No log | 34.96 | 105 | 1.1231 | 0.895 | 0.4324 | 0.9680 | 0.895 | 0.8347 | 0.4931 | 0.0187 | | No log | 35.96 | 108 | 1.0944 | 0.905 | 0.4174 | 0.9642 | 0.905 | 0.8527 | 0.4880 | 0.0173 | | No log | 36.96 | 111 | 1.0648 | 0.91 | 0.4020 | 0.9006 | 0.91 | 0.8720 | 0.4702 | 0.0183 | | No log | 37.96 | 114 | 1.0380 | 0.925 | 0.3885 | 0.8823 | 0.925 | 0.9005 | 0.4695 | 0.0160 | | No log | 38.96 | 117 | 1.0127 | 0.92 | 0.3758 | 0.8700 | 0.92 | 0.8954 | 0.4523 | 0.0171 | | No log | 39.96 | 120 | 0.9882 | 0.915 | 0.3629 | 0.8668 | 0.915 | 0.8864 | 0.4399 | 0.0170 | | No log | 40.96 | 123 | 0.9655 | 0.93 | 0.3509 | 0.8642 | 0.93 | 0.9048 | 0.4435 | 0.0150 | | No log | 41.96 | 126 | 0.9452 | 0.925 | 0.3412 | 0.8563 | 0.925 | 0.9016 | 0.4240 | 0.0159 | | No log | 42.96 | 129 | 0.9248 | 0.925 | 0.3312 | 0.8581 | 0.925 | 0.9016 | 0.4130 | 0.0160 | | No log | 43.96 | 132 | 0.9037 | 0.935 | 0.3207 | 0.8536 | 0.935 | 0.9204 | 0.4066 | 0.0159 | | No log | 44.96 | 135 | 0.8859 | 0.93 | 0.3120 | 0.8420 | 0.93 | 0.9154 | 0.3929 | 0.0166 | | No log | 45.96 | 138 | 0.8695 | 0.93 | 0.3039 | 0.8342 | 0.93 | 0.9154 | 0.3947 | 0.0163 | | No log | 46.96 | 141 | 0.8536 | 0.935 | 0.2959 | 0.8340 | 0.935 | 0.9204 | 0.3970 | 0.0152 | | No log | 47.96 | 144 | 0.8381 | 0.935 | 0.2889 | 0.8297 | 0.935 | 0.9204 | 0.3877 | 0.0150 | | No log | 48.96 | 147 | 0.8217 | 0.94 | 0.2810 | 0.8237 | 0.94 | 0.9269 | 0.3783 | 0.0147 | | No log | 49.96 | 150 | 0.8060 | 0.94 | 0.2725 | 0.8241 | 0.94 | 0.9277 | 0.3762 | 0.0142 | | No log | 50.96 | 153 | 0.7892 | 0.935 | 0.2642 | 0.8185 | 0.935 | 0.9245 | 0.3522 | 0.0139 | | No log | 51.96 | 156 | 0.7750 | 0.94 | 0.2577 | 0.8128 | 0.94 | 0.9330 | 0.3512 | 0.0131 | | No log | 52.96 | 159 | 0.7602 | 0.94 | 0.2517 | 0.8020 | 0.94 | 0.9330 | 0.3517 | 0.0135 | | No log | 53.96 | 162 | 0.7457 | 0.94 | 0.2449 | 0.7927 | 0.94 | 0.9330 | 0.3443 | 0.0134 | | No log | 54.96 | 165 | 0.7342 | 0.94 | 0.2393 | 0.8457 | 0.94 | 0.9330 | 0.3248 | 0.0130 | | No log | 55.96 | 168 | 0.7235 | 0.94 | 0.2344 | 0.8500 | 0.94 | 0.9330 | 0.3244 | 0.0127 | | No log | 56.96 | 171 | 0.7161 | 0.935 | 0.2303 | 0.8536 | 0.935 | 0.9214 | 0.3181 | 0.0129 | | No log | 57.96 | 174 | 0.7052 | 0.935 | 0.2251 | 0.8537 | 0.935 | 0.9214 | 0.3122 | 0.0128 | | No log | 58.96 | 177 | 0.6930 | 0.935 | 0.2192 | 0.8442 | 0.935 | 0.9214 | 0.3123 | 0.0121 | | No log | 59.96 | 180 | 0.6830 | 0.94 | 0.2146 | 0.8339 | 0.94 | 0.9263 | 0.3003 | 0.0119 | | No log | 60.96 | 183 | 0.6735 | 0.94 | 0.2108 | 0.8266 | 0.94 | 0.9263 | 0.2944 | 0.0120 | | No log | 61.96 | 186 | 0.6654 | 
0.935 | 0.2068 | 0.8249 | 0.935 | 0.9231 | 0.3001 | 0.0120 | | No log | 62.96 | 189 | 0.6572 | 0.935 | 0.2029 | 0.8228 | 0.935 | 0.9231 | 0.2784 | 0.0115 | | No log | 63.96 | 192 | 0.6526 | 0.935 | 0.2001 | 0.8257 | 0.935 | 0.9231 | 0.2749 | 0.0116 | | No log | 64.96 | 195 | 0.6448 | 0.935 | 0.1977 | 0.8244 | 0.935 | 0.9244 | 0.2643 | 0.0118 | | No log | 65.96 | 198 | 0.6366 | 0.935 | 0.1944 | 0.8169 | 0.935 | 0.9244 | 0.2607 | 0.0115 | | No log | 66.96 | 201 | 0.6281 | 0.935 | 0.1908 | 0.8088 | 0.935 | 0.9231 | 0.2745 | 0.0113 | | No log | 67.96 | 204 | 0.6206 | 0.935 | 0.1876 | 0.8037 | 0.935 | 0.9231 | 0.2784 | 0.0112 | | No log | 68.96 | 207 | 0.6143 | 0.935 | 0.1853 | 0.8025 | 0.935 | 0.9231 | 0.2662 | 0.0112 | | No log | 69.96 | 210 | 0.6102 | 0.94 | 0.1839 | 0.8010 | 0.94 | 0.9330 | 0.2584 | 0.0113 | | No log | 70.96 | 213 | 0.6053 | 0.94 | 0.1822 | 0.7991 | 0.94 | 0.9330 | 0.2639 | 0.0113 | | No log | 71.96 | 216 | 0.6000 | 0.94 | 0.1802 | 0.7949 | 0.94 | 0.9330 | 0.2516 | 0.0113 | | No log | 72.96 | 219 | 0.5941 | 0.94 | 0.1773 | 0.7891 | 0.94 | 0.9330 | 0.2670 | 0.0112 | | No log | 73.96 | 222 | 0.5888 | 0.94 | 0.1747 | 0.7846 | 0.94 | 0.9330 | 0.2457 | 0.0109 | | No log | 74.96 | 225 | 0.5837 | 0.94 | 0.1724 | 0.7831 | 0.94 | 0.9330 | 0.2425 | 0.0107 | | No log | 75.96 | 228 | 0.5801 | 0.94 | 0.1711 | 0.7857 | 0.94 | 0.9330 | 0.2411 | 0.0108 | | No log | 76.96 | 231 | 0.5783 | 0.935 | 0.1708 | 0.7884 | 0.935 | 0.9231 | 0.2342 | 0.0110 | | No log | 77.96 | 234 | 0.5767 | 0.93 | 0.1707 | 0.7882 | 0.93 | 0.9139 | 0.2375 | 0.0109 | | No log | 78.96 | 237 | 0.5754 | 0.93 | 0.1703 | 0.7863 | 0.93 | 0.9154 | 0.2255 | 0.0109 | | No log | 79.96 | 240 | 0.5732 | 0.93 | 0.1694 | 0.7848 | 0.93 | 0.9154 | 0.2365 | 0.0111 | | No log | 80.96 | 243 | 0.5715 | 0.93 | 0.1685 | 0.7834 | 0.93 | 0.9154 | 0.2433 | 0.0111 | | No log | 81.96 | 246 | 0.5675 | 0.93 | 0.1672 | 0.7803 | 0.93 | 0.9154 | 0.2423 | 0.0110 | | No log | 82.96 | 249 | 0.5648 | 0.93 | 0.1661 | 0.7773 | 0.93 | 0.9154 | 0.2352 | 0.0108 | | No log | 83.96 | 252 | 0.5624 | 0.93 | 0.1653 | 0.7753 | 0.93 | 0.9154 | 0.2343 | 0.0108 | | No log | 84.96 | 255 | 0.5608 | 0.93 | 0.1646 | 0.7746 | 0.93 | 0.9154 | 0.2332 | 0.0108 | | No log | 85.96 | 258 | 0.5589 | 0.93 | 0.1642 | 0.7748 | 0.93 | 0.9154 | 0.2310 | 0.0110 | | No log | 86.96 | 261 | 0.5574 | 0.93 | 0.1638 | 0.7741 | 0.93 | 0.9154 | 0.2286 | 0.0111 | | No log | 87.96 | 264 | 0.5559 | 0.93 | 0.1636 | 0.7739 | 0.93 | 0.9154 | 0.2274 | 0.0108 | | No log | 88.96 | 267 | 0.5555 | 0.93 | 0.1631 | 0.7748 | 0.93 | 0.9154 | 0.2264 | 0.0109 | | No log | 89.96 | 270 | 0.5539 | 0.93 | 0.1623 | 0.7743 | 0.93 | 0.9154 | 0.2255 | 0.0108 | | No log | 90.96 | 273 | 0.5525 | 0.93 | 0.1616 | 0.7738 | 0.93 | 0.9154 | 0.2251 | 0.0108 | | No log | 91.96 | 276 | 0.5508 | 0.93 | 0.1612 | 0.7726 | 0.93 | 0.9154 | 0.2254 | 0.0108 | | No log | 92.96 | 279 | 0.5498 | 0.93 | 0.1610 | 0.7721 | 0.93 | 0.9154 | 0.2240 | 0.0107 | | No log | 93.96 | 282 | 0.5497 | 0.93 | 0.1607 | 0.7717 | 0.93 | 0.9154 | 0.2141 | 0.0107 | | No log | 94.96 | 285 | 0.5487 | 0.93 | 0.1605 | 0.7711 | 0.93 | 0.9154 | 0.2138 | 0.0107 | | No log | 95.96 | 288 | 0.5473 | 0.93 | 0.1601 | 0.7704 | 0.93 | 0.9154 | 0.2129 | 0.0106 | | No log | 96.96 | 291 | 0.5467 | 0.93 | 0.1599 | 0.7701 | 0.93 | 0.9154 | 0.2124 | 0.0106 | | No log | 97.96 | 294 | 0.5462 | 0.93 | 0.1597 | 0.7698 | 0.93 | 0.9154 | 0.2121 | 0.0106 | | No log | 98.96 | 297 | 0.5458 | 0.93 | 0.1596 | 0.7694 | 0.93 | 0.9154 | 0.2120 | 0.0106 | | No log | 99.96 | 300 | 0.5457 | 
0.93 | 0.1595 | 0.7692 | 0.93 | 0.9154 | 0.2119 | 0.0106 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
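A lower-level alternative to the pipeline API, useful when you want the raw logits rather than post-processed labels, might look like the sketch below. It assumes the repository ships an image processor config; the document image path is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Sketch only: load the checkpoint and map the top logit back to a label name.
repo = "jordyvl/dit-finetuned_rvl_tobacco_crl_allv2"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("document.png").convert("RGB")  # placeholder path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```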
[ "adve", "email", "form", "letter", "memo", "news", "note", "report", "resume", "scientific" ]
vesteinn/vit-mae-cub
Note that this model does not work out of the box with Hugging Face Transformers; a modification that applies mean pooling before the layernorm and classification head is needed, as in the snippet below.

```python
from transformers import (
    ViTForImageClassification,
    pipeline,
    AutoImageProcessor,
    ViTConfig,
    ViTModel,
)
from transformers.modeling_outputs import (
    ImageClassifierOutput,
    BaseModelOutputWithPooling,
)
from PIL import Image
import torch
from torch import nn
from typing import Optional, Union, Tuple


class CustomViTModel(ViTModel):
    def forward(
        self,
        pixel_values: Optional[torch.Tensor] = None,
        bool_masked_pos: Optional[torch.BoolTensor] = None,
        head_mask: Optional[torch.Tensor] = None,
        output_attentions: Optional[bool] = None,
        output_hidden_states: Optional[bool] = None,
        interpolate_pos_encoding: Optional[bool] = None,
        return_dict: Optional[bool] = None,
    ) -> Union[Tuple, BaseModelOutputWithPooling]:
        r"""
        bool_masked_pos (`torch.BoolTensor` of shape `(batch_size, num_patches)`, *optional*):
            Boolean masked positions. Indicates which patches are masked (1) and which aren't (0).
        """
        output_attentions = (
            output_attentions if output_attentions is not None else self.config.output_attentions
        )
        output_hidden_states = (
            output_hidden_states
            if output_hidden_states is not None
            else self.config.output_hidden_states
        )
        return_dict = (
            return_dict if return_dict is not None else self.config.use_return_dict
        )

        if pixel_values is None:
            raise ValueError("You have to specify pixel_values")

        # Prepare head mask if needed
        # 1.0 in head_mask indicate we keep the head
        # attention_probs has shape bsz x n_heads x N x N
        # input head_mask has shape [num_heads] or [num_hidden_layers x num_heads]
        # and head_mask is converted to shape [num_hidden_layers x batch x num_heads x seq_length x seq_length]
        head_mask = self.get_head_mask(head_mask, self.config.num_hidden_layers)

        # TODO: maybe have a cleaner way to cast the input (from `ImageProcessor` side?)
        expected_dtype = self.embeddings.patch_embeddings.projection.weight.dtype
        if pixel_values.dtype != expected_dtype:
            pixel_values = pixel_values.to(expected_dtype)

        embedding_output = self.embeddings(
            pixel_values,
            bool_masked_pos=bool_masked_pos,
            interpolate_pos_encoding=interpolate_pos_encoding,
        )

        encoder_outputs = self.encoder(
            embedding_output,
            head_mask=head_mask,
            output_attentions=output_attentions,
            output_hidden_states=output_hidden_states,
            return_dict=return_dict,
        )
        sequence_output = encoder_outputs[0]

        # Modification: mean-pool the patch tokens (excluding the [CLS] token)
        # before the layernorm, instead of using the [CLS] token directly.
        sequence_output = sequence_output[:, 1:, :].mean(dim=1)
        sequence_output = self.layernorm(sequence_output)
        pooled_output = (
            self.pooler(sequence_output) if self.pooler is not None else None
        )

        if not return_dict:
            head_outputs = (
                (sequence_output, pooled_output)
                if pooled_output is not None
                else (sequence_output,)
            )
            return head_outputs + encoder_outputs[1:]

        return BaseModelOutputWithPooling(
            last_hidden_state=sequence_output,
            pooler_output=pooled_output,
            hidden_states=encoder_outputs.hidden_states,
            attentions=encoder_outputs.attentions,
        )


class CustomViTForImageClassification(ViTForImageClassification):
    def __init__(self, config: ViTConfig) -> None:
        super().__init__(config)

        self.num_labels = config.num_labels
        self.vit = CustomViTModel(config, add_pooling_layer=False)

        # Classifier head
        self.classifier = (
            nn.Linear(config.hidden_size, config.num_labels)
            if config.num_labels > 0
            else nn.Identity()
        )

        # Initialize weights and apply final processing
        self.post_init()

    def forward(
        self,
        pixel_values: Optional[torch.Tensor] = None,
        head_mask: Optional[torch.Tensor] = None,
        labels: Optional[torch.Tensor] = None,
        output_attentions: Optional[bool] = None,
        output_hidden_states: Optional[bool] = None,
        interpolate_pos_encoding: Optional[bool] = None,
        return_dict: Optional[bool] = None,
    ) -> Union[tuple, ImageClassifierOutput]:
        r"""
        labels (`torch.LongTensor` of shape `(batch_size,)`, *optional*):
            Labels for computing the image classification/regression loss. Indices should be in
            `[0, ..., config.num_labels - 1]`. If `config.num_labels == 1` a regression loss is computed
            (Mean-Square loss), If `config.num_labels > 1` a classification loss is computed (Cross-Entropy).
        """
        return_dict = (
            return_dict if return_dict is not None else self.config.use_return_dict
        )

        outputs = self.vit(
            pixel_values,
            head_mask=head_mask,
            output_attentions=output_attentions,
            output_hidden_states=output_hidden_states,
            interpolate_pos_encoding=interpolate_pos_encoding,
            return_dict=return_dict,
        )

        sequence_output = outputs[0]
        logits = self.classifier(sequence_output)

        # Loss is intentionally left as None here; labels are accepted for API compatibility only.
        loss = None

        return ImageClassifierOutput(
            loss=loss,
            logits=logits,
            hidden_states=outputs.hidden_states,
            attentions=outputs.attentions,
        )


model = CustomViTForImageClassification.from_pretrained("vesteinn/vit-mae-cub")
image_processor = AutoImageProcessor.from_pretrained("vesteinn/vit-mae-cub")

classifier = pipeline(
    "image-classification", model=model, image_processor=image_processor
)
```
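The snippet above builds a `classifier` pipeline but never calls it; a short usage sketch (the image path is a placeholder, and `classifier` plus the `Image` import come from the block above) is:

```python
# Usage sketch for the pipeline constructed above; "bird.jpg" is a placeholder path.
image = Image.open("bird.jpg").convert("RGB")
predictions = classifier(image)

# The pipeline returns the top-k classes with their scores.
for pred in predictions:
    print(f'{pred["label"]}: {pred["score"]:.3f}')
```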
[ "black footed albatross", "laysan albatross", "sooty albatross", "groove billed ani", "crested auklet", "least auklet", "parakeet auklet", "rhinoceros auklet", "brewer blackbird", "red winged blackbird", "rusty blackbird", "yellow headed blackbird", "bobolink", "indigo bunting", "lazuli bunting", "painted bunting", "cardinal", "spotted catbird", "gray catbird", "yellow breasted chat", "eastern towhee", "chuck will widow", "brandt cormorant", "red faced cormorant", "pelagic cormorant", "bronzed cowbird", "shiny cowbird", "brown creeper", "american crow", "fish crow", "black billed cuckoo", "mangrove cuckoo", "yellow billed cuckoo", "gray crowned rosy finch", "purple finch", "northern flicker", "acadian flycatcher", "great crested flycatcher", "least flycatcher", "olive sided flycatcher", "scissor tailed flycatcher", "vermilion flycatcher", "yellow bellied flycatcher", "frigatebird", "northern fulmar", "gadwall", "american goldfinch", "european goldfinch", "boat tailed grackle", "eared grebe", "horned grebe", "pied billed grebe", "western grebe", "blue grosbeak", "evening grosbeak", "pine grosbeak", "rose breasted grosbeak", "pigeon guillemot", "california gull", "glaucous winged gull", "heermann gull", "herring gull", "ivory gull", "ring billed gull", "slaty backed gull", "western gull", "anna hummingbird", "ruby throated hummingbird", "rufous hummingbird", "green violetear", "long tailed jaeger", "pomarine jaeger", "blue jay", "florida jay", "green jay", "dark eyed junco", "tropical kingbird", "gray kingbird", "belted kingfisher", "green kingfisher", "pied kingfisher", "ringed kingfisher", "white breasted kingfisher", "red legged kittiwake", "horned lark", "pacific loon", "mallard", "western meadowlark", "hooded merganser", "red breasted merganser", "mockingbird", "nighthawk", "clark nutcracker", "white breasted nuthatch", "baltimore oriole", "hooded oriole", "orchard oriole", "scott oriole", "ovenbird", "brown pelican", "white pelican", "western wood pewee", "sayornis", "american pipit", "whip poor will", "horned puffin", "common raven", "white necked raven", "american redstart", "geococcyx", "loggerhead shrike", "great grey shrike", "baird sparrow", "black throated sparrow", "brewer sparrow", "chipping sparrow", "clay colored sparrow", "house sparrow", "field sparrow", "fox sparrow", "grasshopper sparrow", "harris sparrow", "henslow sparrow", "le conte sparrow", "lincoln sparrow", "nelson sharp tailed sparrow", "savannah sparrow", "seaside sparrow", "song sparrow", "tree sparrow", "vesper sparrow", "white crowned sparrow", "white throated sparrow", "cape glossy starling", "bank swallow", "barn swallow", "cliff swallow", "tree swallow", "scarlet tanager", "summer tanager", "artic tern", "black tern", "caspian tern", "common tern", "elegant tern", "forsters tern", "least tern", "green tailed towhee", "brown thrasher", "sage thrasher", "black capped vireo", "blue headed vireo", "philadelphia vireo", "red eyed vireo", "warbling vireo", "white eyed vireo", "yellow throated vireo", "bay breasted warbler", "black and white warbler", "black throated blue warbler", "blue winged warbler", "canada warbler", "cape may warbler", "cerulean warbler", "chestnut sided warbler", "golden winged warbler", "hooded warbler", "kentucky warbler", "magnolia warbler", "mourning warbler", "myrtle warbler", "nashville warbler", "orange crowned warbler", "palm warbler", "pine warbler", "prairie warbler", "prothonotary warbler", "swainson warbler", "tennessee warbler", "wilson warbler", "worm eating warbler", 
"yellow warbler", "northern waterthrush", "louisiana waterthrush", "bohemian waxwing", "cedar waxwing", "american three toed woodpecker", "pileated woodpecker", "red bellied woodpecker", "red cockaded woodpecker", "red headed woodpecker", "downy woodpecker", "bewick wren", "cactus wren", "carolina wren", "house wren", "marsh wren", "rock wren", "winter wren", "common yellowthroat", "label_200", "label_201", "label_202", "label_203", "label_204", "label_205", "label_206", "label_207", "label_208", "label_209", "label_210", "label_211", "label_212", "label_213", "label_214", "label_215", "label_216", "label_217", "label_218", "label_219", "label_220", "label_221", "label_222", "label_223", "label_224", "label_225", "label_226", "label_227", "label_228", "label_229", "label_230", "label_231", "label_232", "label_233", "label_234", "label_235", "label_236", "label_237", "label_238", "label_239", "label_240", "label_241", "label_242", "label_243", "label_244", "label_245", "label_246", "label_247", "label_248", "label_249", "label_250", "label_251", "label_252", "label_253", "label_254", "label_255", "label_256", "label_257", "label_258", "label_259", "label_260", "label_261", "label_262", "label_263", "label_264", "label_265", "label_266", "label_267", "label_268", "label_269", "label_270", "label_271", "label_272", "label_273", "label_274", "label_275", "label_276", "label_277", "label_278", "label_279", "label_280", "label_281", "label_282", "label_283", "label_284", "label_285", "label_286", "label_287", "label_288", "label_289", "label_290", "label_291", "label_292", "label_293", "label_294", "label_295", "label_296", "label_297", "label_298", "label_299", "label_300", "label_301", "label_302", "label_303", "label_304", "label_305", "label_306", "label_307", "label_308", "label_309", "label_310", "label_311", "label_312", "label_313", "label_314", "label_315", "label_316", "label_317", "label_318", "label_319", "label_320", "label_321", "label_322", "label_323", "label_324", "label_325", "label_326", "label_327", "label_328", "label_329", "label_330", "label_331", "label_332", "label_333", "label_334", "label_335", "label_336", "label_337", "label_338", "label_339", "label_340", "label_341", "label_342", "label_343", "label_344", "label_345", "label_346", "label_347", "label_348", "label_349", "label_350", "label_351", "label_352", "label_353", "label_354", "label_355", "label_356", "label_357", "label_358", "label_359", "label_360", "label_361", "label_362", "label_363", "label_364", "label_365", "label_366", "label_367", "label_368", "label_369", "label_370", "label_371", "label_372", "label_373", "label_374", "label_375", "label_376", "label_377", "label_378", "label_379", "label_380", "label_381", "label_382", "label_383", "label_384", "label_385", "label_386", "label_387", "label_388", "label_389", "label_390", "label_391", "label_392", "label_393", "label_394", "label_395", "label_396", "label_397", "label_398", "label_399", "label_400", "label_401", "label_402", "label_403", "label_404", "label_405", "label_406", "label_407", "label_408", "label_409", "label_410", "label_411", "label_412", "label_413", "label_414", "label_415", "label_416", "label_417", "label_418", "label_419", "label_420", "label_421", "label_422", "label_423", "label_424", "label_425", "label_426", "label_427", "label_428", "label_429", "label_430", "label_431", "label_432", "label_433", "label_434", "label_435", "label_436", "label_437", "label_438", "label_439", "label_440", "label_441", "label_442", 
"label_443", "label_444", "label_445", "label_446", "label_447", "label_448", "label_449", "label_450", "label_451", "label_452", "label_453", "label_454", "label_455", "label_456", "label_457", "label_458", "label_459", "label_460", "label_461", "label_462", "label_463", "label_464", "label_465", "label_466", "label_467", "label_468", "label_469", "label_470", "label_471", "label_472", "label_473", "label_474", "label_475", "label_476", "label_477", "label_478", "label_479", "label_480", "label_481", "label_482", "label_483", "label_484", "label_485", "label_486", "label_487", "label_488", "label_489", "label_490", "label_491", "label_492", "label_493", "label_494", "label_495", "label_496", "label_497", "label_498", "label_499", "label_500", "label_501", "label_502", "label_503", "label_504", "label_505", "label_506", "label_507", "label_508", "label_509", "label_510", "label_511", "label_512", "label_513", "label_514", "label_515", "label_516", "label_517", "label_518", "label_519", "label_520", "label_521", "label_522", "label_523", "label_524", "label_525", "label_526", "label_527", "label_528", "label_529", "label_530", "label_531", "label_532", "label_533", "label_534", "label_535", "label_536", "label_537", "label_538", "label_539", "label_540", "label_541", "label_542", "label_543", "label_544", "label_545", "label_546", "label_547", "label_548", "label_549", "label_550", "label_551", "label_552", "label_553", "label_554", "label_555", "label_556", "label_557", "label_558", "label_559", "label_560", "label_561", "label_562", "label_563", "label_564", "label_565", "label_566", "label_567", "label_568", "label_569", "label_570", "label_571", "label_572", "label_573", "label_574", "label_575", "label_576", "label_577", "label_578", "label_579", "label_580", "label_581", "label_582", "label_583", "label_584", "label_585", "label_586", "label_587", "label_588", "label_589", "label_590", "label_591", "label_592", "label_593", "label_594", "label_595", "label_596", "label_597", "label_598", "label_599", "label_600", "label_601", "label_602", "label_603", "label_604", "label_605", "label_606", "label_607", "label_608", "label_609", "label_610", "label_611", "label_612", "label_613", "label_614", "label_615", "label_616", "label_617", "label_618", "label_619", "label_620", "label_621", "label_622", "label_623", "label_624", "label_625", "label_626", "label_627", "label_628", "label_629", "label_630", "label_631", "label_632", "label_633", "label_634", "label_635", "label_636", "label_637", "label_638", "label_639", "label_640", "label_641", "label_642", "label_643", "label_644", "label_645", "label_646", "label_647", "label_648", "label_649", "label_650", "label_651", "label_652", "label_653", "label_654", "label_655", "label_656", "label_657", "label_658", "label_659", "label_660", "label_661", "label_662", "label_663", "label_664", "label_665", "label_666", "label_667", "label_668", "label_669", "label_670", "label_671", "label_672", "label_673", "label_674", "label_675", "label_676", "label_677", "label_678", "label_679", "label_680", "label_681", "label_682", "label_683", "label_684", "label_685", "label_686", "label_687", "label_688", "label_689", "label_690", "label_691", "label_692", "label_693", "label_694", "label_695", "label_696", "label_697", "label_698", "label_699", "label_700", "label_701", "label_702", "label_703", "label_704", "label_705", "label_706", "label_707", "label_708", "label_709", "label_710", "label_711", "label_712", "label_713", "label_714", "label_715", 
"label_716", "label_717", "label_718", "label_719", "label_720", "label_721", "label_722", "label_723", "label_724", "label_725", "label_726", "label_727", "label_728", "label_729", "label_730", "label_731", "label_732", "label_733", "label_734", "label_735", "label_736", "label_737", "label_738", "label_739", "label_740", "label_741", "label_742", "label_743", "label_744", "label_745", "label_746", "label_747", "label_748", "label_749", "label_750", "label_751", "label_752", "label_753", "label_754", "label_755", "label_756", "label_757", "label_758", "label_759", "label_760", "label_761", "label_762", "label_763", "label_764", "label_765", "label_766", "label_767", "label_768", "label_769", "label_770", "label_771", "label_772", "label_773", "label_774", "label_775", "label_776", "label_777", "label_778", "label_779", "label_780", "label_781", "label_782", "label_783", "label_784", "label_785", "label_786", "label_787", "label_788", "label_789", "label_790", "label_791", "label_792", "label_793", "label_794", "label_795", "label_796", "label_797", "label_798", "label_799", "label_800", "label_801", "label_802", "label_803", "label_804", "label_805", "label_806", "label_807", "label_808", "label_809", "label_810", "label_811", "label_812", "label_813", "label_814", "label_815", "label_816", "label_817", "label_818", "label_819", "label_820", "label_821", "label_822", "label_823", "label_824", "label_825", "label_826", "label_827", "label_828", "label_829", "label_830", "label_831", "label_832", "label_833", "label_834", "label_835", "label_836", "label_837", "label_838", "label_839", "label_840", "label_841", "label_842", "label_843", "label_844", "label_845", "label_846", "label_847", "label_848", "label_849", "label_850", "label_851", "label_852", "label_853", "label_854", "label_855", "label_856", "label_857", "label_858", "label_859", "label_860", "label_861", "label_862", "label_863", "label_864", "label_865", "label_866", "label_867", "label_868", "label_869", "label_870", "label_871", "label_872", "label_873", "label_874", "label_875", "label_876", "label_877", "label_878", "label_879", "label_880", "label_881", "label_882", "label_883", "label_884", "label_885", "label_886", "label_887", "label_888", "label_889", "label_890", "label_891", "label_892", "label_893", "label_894", "label_895", "label_896", "label_897", "label_898", "label_899", "label_900", "label_901", "label_902", "label_903", "label_904", "label_905", "label_906", "label_907", "label_908", "label_909", "label_910", "label_911", "label_912", "label_913", "label_914", "label_915", "label_916", "label_917", "label_918", "label_919", "label_920", "label_921", "label_922", "label_923", "label_924", "label_925", "label_926", "label_927", "label_928", "label_929", "label_930", "label_931", "label_932", "label_933", "label_934", "label_935", "label_936", "label_937", "label_938", "label_939", "label_940", "label_941", "label_942", "label_943", "label_944", "label_945", "label_946", "label_947", "label_948", "label_949", "label_950", "label_951", "label_952", "label_953", "label_954", "label_955", "label_956", "label_957", "label_958", "label_959", "label_960", "label_961", "label_962", "label_963", "label_964", "label_965", "label_966", "label_967", "label_968", "label_969", "label_970", "label_971", "label_972", "label_973", "label_974", "label_975", "label_976", "label_977", "label_978", "label_979", "label_980", "label_981", "label_982", "label_983", "label_984", "label_985", "label_986", "label_987", "label_988", 
"label_989", "label_990", "label_991", "label_992", "label_993", "label_994", "label_995", "label_996", "label_997", "label_998", "label_999" ]
wuru330/378A1_results
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 378A1_results This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.4492 - Accuracy: 0.9014 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 30 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.2401 | 1.0 | 37 | 1.0427 | 0.6582 | | 0.669 | 2.0 | 74 | 0.5486 | 0.8418 | | 0.4662 | 3.0 | 111 | 0.4012 | 0.8690 | | 0.3211 | 4.0 | 148 | 0.5338 | 0.7942 | | 0.2136 | 5.0 | 185 | 0.3189 | 0.8861 | | 0.1626 | 6.0 | 222 | 0.4406 | 0.8435 | | 0.1042 | 7.0 | 259 | 0.3812 | 0.8741 | | 0.0688 | 8.0 | 296 | 0.3501 | 0.8946 | | 0.0425 | 9.0 | 333 | 0.3845 | 0.8912 | | 0.0586 | 10.0 | 370 | 0.3640 | 0.8980 | | 0.0276 | 11.0 | 407 | 0.3708 | 0.9031 | | 0.0342 | 12.0 | 444 | 0.3862 | 0.9082 | | 0.0251 | 13.0 | 481 | 0.5206 | 0.8776 | | 0.0209 | 14.0 | 518 | 0.4078 | 0.8929 | | 0.0173 | 15.0 | 555 | 0.4168 | 0.8895 | | 0.0159 | 16.0 | 592 | 0.4108 | 0.8997 | | 0.0151 | 17.0 | 629 | 0.4176 | 0.9014 | | 0.014 | 18.0 | 666 | 0.4228 | 0.9014 | | 0.0131 | 19.0 | 703 | 0.4266 | 0.9014 | | 0.0125 | 20.0 | 740 | 0.4301 | 0.9014 | | 0.012 | 21.0 | 777 | 0.4339 | 0.9014 | | 0.0115 | 22.0 | 814 | 0.4372 | 0.9014 | | 0.0111 | 23.0 | 851 | 0.4401 | 0.9014 | | 0.0107 | 24.0 | 888 | 0.4424 | 0.9014 | | 0.0101 | 25.0 | 925 | 0.4444 | 0.9014 | | 0.01 | 26.0 | 962 | 0.4461 | 0.9014 | | 0.01 | 27.0 | 999 | 0.4475 | 0.9014 | | 0.0099 | 28.0 | 1036 | 0.4485 | 0.9014 | | 0.0097 | 29.0 | 1073 | 0.4490 | 0.9014 | | 0.0097 | 30.0 | 1110 | 0.4492 | 0.9014 | ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu117 - Datasets 2.13.1 - Tokenizers 0.13.3
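Because this checkpoint only exposes generic class names (`label_0` through `label_3`), it can help to inspect, and if needed override, `id2label` before presenting predictions. A minimal sketch, where the replacement class names are purely hypothetical:

```python
from transformers import AutoModelForImageClassification

model = AutoModelForImageClassification.from_pretrained("wuru330/378A1_results")

# The fine-tuned head only carries placeholder names.
print(model.config.id2label)

# Hypothetical mapping: substitute the real class names used when the dataset was built.
model.config.id2label = {0: "class_a", 1: "class_b", 2: "class_c", 3: "class_d"}
model.config.label2id = {name: idx for idx, name in model.config.id2label.items()}
```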
[ "label_0", "label_1", "label_2", "label_3" ]
ALM-AHME/swinv2-large-patch4-window12to16-192to256-22kto1k-ft-finetuned-BreastCancer-BreakHis-AH-Shuffled
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swinv2-large-patch4-window12to16-192to256-22kto1k-ft-finetuned-BreastCancer-BreakHis-AH-Shuffled This model is a fine-tuned version of [microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft](https://huggingface.co/microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.0289 - Accuracy: 0.9953 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.5 - num_epochs: 12 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.1894 | 1.0 | 199 | 0.1739 | 0.9307 | | 0.3951 | 2.0 | 398 | 0.1066 | 0.9614 | | 0.1021 | 3.0 | 597 | 0.0741 | 0.9708 | | 0.0784 | 4.0 | 796 | 0.0815 | 0.9760 | | 0.0835 | 5.0 | 995 | 0.0723 | 0.9774 | | 0.1394 | 6.0 | 1194 | 0.0532 | 0.9840 | | 0.1755 | 7.0 | 1393 | 0.1068 | 0.9722 | | 0.1134 | 8.0 | 1592 | 0.0390 | 0.9892 | | 0.0237 | 9.0 | 1791 | 0.0789 | 0.9863 | | 0.027 | 10.0 | 1990 | 0.0492 | 0.9887 | | 0.0081 | 11.0 | 2189 | 0.0429 | 0.9934 | | 0.011 | 12.0 | 2388 | 0.0289 | 0.9953 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.0 - Tokenizers 0.13.3
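Note the unusually high `lr_scheduler_warmup_ratio` of 0.5: with a linear schedule, the learning rate ramps up for roughly the first half of training before decaying. A quick back-of-the-envelope check against the table above (2388 optimisation steps over 12 epochs):

```python
# Rough arithmetic for the warmup implied by the hyperparameters above.
total_steps = 2388               # final "Step" value in the training table (12 epochs x 199 steps)
warmup_ratio = 0.5
warmup_steps = int(total_steps * warmup_ratio)
print(warmup_steps)              # 1194 steps, i.e. roughly the first 6 of 12 epochs are warmup
```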
[ "benign", "malignant" ]
jvadlamudi2/convnext-tiny-224-jvadlamudi2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # convnext-tiny-224-jvadlamudi2 This model is a fine-tuned version of [facebook/convnext-tiny-224](https://huggingface.co/facebook/convnext-tiny-224) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.5780 - Accuracy: 0.7946 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 7 | 0.5882 | 0.8036 | | 0.6213 | 2.0 | 14 | 0.5821 | 0.7857 | | 0.6123 | 3.0 | 21 | 0.5780 | 0.7946 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.0 - Tokenizers 0.13.3
[ "0", "1" ]
Helenbzbz/swin-tiny-patch4-window7-224-finetuned-eurosat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-eurosat This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.1798 - Accuracy: 0.9346 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.269 | 1.0 | 26 | 0.2186 | 0.9128 | | 0.163 | 2.0 | 52 | 0.1899 | 0.9401 | | 0.1792 | 3.0 | 78 | 0.1798 | 0.9346 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Tokenizers 0.13.3
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
annazhong/vit-base-patch16-224-finetuned-original-images
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-original-images This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.1367 - Accuracy: 0.4865 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 150 - eval_batch_size: 150 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 600 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 1 | 1.4730 | 0.2703 | | No log | 2.0 | 2 | 1.1367 | 0.4865 | | No log | 3.0 | 3 | 0.9924 | 0.4324 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.0 - Tokenizers 0.13.3
[ "label_0", "label_1", "label_2" ]
pankajgharai/my_awesome_food_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # my_awesome_food_model This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the food101 dataset. It achieves the following results on the evaluation set: - Loss: 1.5995 - Accuracy: 0.892 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 2.6508 | 0.99 | 62 | 2.5037 | 0.82 | | 1.8322 | 2.0 | 125 | 1.7732 | 0.875 | | 1.5648 | 2.98 | 186 | 1.5995 | 0.892 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.0 - Tokenizers 0.13.3
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
annazhong/vit-base-patch16-224-finetuned-foveated-features
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-foveated-features This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.1242 - Accuracy: 0.4595 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 150 - eval_batch_size: 150 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 600 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 1 | 1.2615 | 0.1622 | | No log | 2.0 | 2 | 1.2910 | 0.3514 | | No log | 3.0 | 3 | 1.1242 | 0.4595 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.0 - Tokenizers 0.13.3
[ "label_0", "label_1", "label_2" ]
annazhong/vit-base-patch16-224-finetuned-feature-map-v2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-feature-map-v2 This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.9026 - Accuracy: 0.22 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 150 - eval_batch_size: 150 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 600 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 1 | 2.1272 | 0.21 | | No log | 2.0 | 3 | 1.9026 | 0.22 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.0 - Tokenizers 0.13.3
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5", "label_6" ]
mansee/vit-base-patch16-224-blur_vs_clean
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-blur_vs_clean This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0714 - Accuracy: 0.9754 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0539 | 1.0 | 151 | 0.1078 | 0.9596 | | 0.0611 | 2.0 | 302 | 0.0846 | 0.9698 | | 0.049 | 3.0 | 453 | 0.0714 | 0.9754 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.0 - Tokenizers 0.13.3
[ "blur", "clean" ]
pankajgharai/swin-tiny-patch4-window7-224-finetuned-eurosat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-eurosat This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0798 - Accuracy: 0.9715 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.22 | 1.0 | 190 | 0.1221 | 0.9570 | | 0.1705 | 2.0 | 380 | 0.0891 | 0.97 | | 0.1084 | 3.0 | 570 | 0.0798 | 0.9715 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.0 - Tokenizers 0.13.3
[ "annualcrop", "forest", "herbaceousvegetation", "highway", "industrial", "pasture", "permanentcrop", "residential", "river", "sealake" ]
jordyvl/cdip-tiny_rvl_cdip-NK1000_kd_test
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # cdip-tiny_rvl_cdip-NK1000_kd_test This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.4588 - Accuracy: 0.8227 - Brier Loss: 0.2552 - Nll: 1.9117 - F1 Micro: 0.8227 - F1 Macro: 0.8239 - Ece: 0.0507 - Aurc: 0.0420 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 128 - eval_batch_size: 128 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 125 | 1.3371 | 0.5423 | 0.5838 | 2.6217 | 0.5423 | 0.5341 | 0.0583 | 0.2219 | | No log | 2.0 | 250 | 0.9983 | 0.646 | 0.4725 | 2.3143 | 0.646 | 0.6407 | 0.0490 | 0.1432 | | No log | 3.0 | 375 | 0.8094 | 0.7085 | 0.3977 | 2.2571 | 0.7085 | 0.7034 | 0.0500 | 0.1017 | | 1.2477 | 4.0 | 500 | 0.7633 | 0.7215 | 0.3806 | 2.2013 | 0.7215 | 0.7275 | 0.0447 | 0.0926 | | 1.2477 | 5.0 | 625 | 0.7295 | 0.7505 | 0.3565 | 2.1741 | 0.7505 | 0.7417 | 0.0631 | 0.0775 | | 1.2477 | 6.0 | 750 | 0.6706 | 0.7692 | 0.3321 | 2.1869 | 0.7692 | 0.7669 | 0.0614 | 0.0690 | | 1.2477 | 7.0 | 875 | 0.6933 | 0.767 | 0.3344 | 2.1685 | 0.767 | 0.7650 | 0.0811 | 0.0676 | | 0.3434 | 8.0 | 1000 | 0.6640 | 0.7778 | 0.3251 | 2.1666 | 0.7778 | 0.7795 | 0.0681 | 0.0650 | | 0.3434 | 9.0 | 1125 | 0.6874 | 0.774 | 0.3328 | 2.1401 | 0.774 | 0.7739 | 0.0863 | 0.0660 | | 0.3434 | 10.0 | 1250 | 0.6639 | 0.7795 | 0.3194 | 2.1335 | 0.7795 | 0.7800 | 0.0800 | 0.0618 | | 0.3434 | 11.0 | 1375 | 0.6827 | 0.7728 | 0.3332 | 2.1140 | 0.7728 | 0.7771 | 0.0891 | 0.0650 | | 0.1507 | 12.0 | 1500 | 0.6197 | 0.786 | 0.3106 | 2.1052 | 0.786 | 0.7873 | 0.0716 | 0.0586 | | 0.1507 | 13.0 | 1625 | 0.6264 | 0.7823 | 0.3133 | 2.1077 | 0.7823 | 0.7834 | 0.0784 | 0.0595 | | 0.1507 | 14.0 | 1750 | 0.5822 | 0.796 | 0.2964 | 2.0567 | 0.796 | 0.7983 | 0.0676 | 0.0549 | | 0.1507 | 15.0 | 1875 | 0.5900 | 0.7923 | 0.3016 | 2.0704 | 0.7923 | 0.7936 | 0.0724 | 0.0541 | | 0.107 | 16.0 | 2000 | 0.6044 | 0.7855 | 0.3099 | 2.0625 | 0.7855 | 0.7901 | 0.0730 | 0.0617 | | 0.107 | 17.0 | 2125 | 0.5692 | 0.7973 | 0.2930 | 2.0627 | 0.7973 | 0.7990 | 0.0676 | 0.0528 | | 0.107 | 18.0 | 2250 | 0.5836 | 0.7907 | 0.2984 | 2.0575 | 0.7907 | 0.7922 | 0.0749 | 0.0554 | | 0.107 | 19.0 | 2375 | 0.5469 | 0.806 | 0.2835 | 2.0754 | 0.806 | 0.8060 | 0.0576 | 0.0498 | | 0.0879 | 20.0 | 2500 | 0.5427 | 0.804 | 0.2892 | 2.0655 | 0.804 | 0.8089 | 0.0593 | 0.0528 | | 0.0879 | 21.0 | 2625 | 0.5305 | 0.806 | 0.2777 | 2.0213 | 0.806 | 0.8070 | 0.0604 | 0.0495 | | 0.0879 | 22.0 | 2750 | 0.5146 | 0.8113 | 0.2741 | 2.0127 | 0.8113 | 0.8121 | 0.0534 | 0.0480 | | 0.0879 | 23.0 | 2875 | 0.5196 | 0.8107 | 0.2750 | 2.0261 | 0.8108 | 0.8117 | 0.0541 | 0.0489 | | 0.0755 | 24.0 | 3000 | 0.5169 | 0.8123 | 0.2743 | 
1.9561 | 0.8123 | 0.8127 | 0.0595 | 0.0478 | | 0.0755 | 25.0 | 3125 | 0.5129 | 0.8073 | 0.2777 | 2.0020 | 0.8073 | 0.8089 | 0.0552 | 0.0491 | | 0.0755 | 26.0 | 3250 | 0.4898 | 0.8177 | 0.2649 | 1.9710 | 0.8178 | 0.8177 | 0.0474 | 0.0451 | | 0.0755 | 27.0 | 3375 | 0.4966 | 0.8155 | 0.2682 | 2.0075 | 0.8155 | 0.8163 | 0.0514 | 0.0458 | | 0.0652 | 28.0 | 3500 | 0.4883 | 0.813 | 0.2690 | 1.9655 | 0.813 | 0.8141 | 0.0557 | 0.0465 | | 0.0652 | 29.0 | 3625 | 0.4860 | 0.8185 | 0.2659 | 1.9593 | 0.8185 | 0.8194 | 0.0481 | 0.0456 | | 0.0652 | 30.0 | 3750 | 0.4760 | 0.818 | 0.2600 | 1.9517 | 0.818 | 0.8194 | 0.0505 | 0.0441 | | 0.0652 | 31.0 | 3875 | 0.4755 | 0.8195 | 0.2611 | 1.9593 | 0.8195 | 0.8196 | 0.0507 | 0.0440 | | 0.0568 | 32.0 | 4000 | 0.4763 | 0.8155 | 0.2628 | 1.9508 | 0.8155 | 0.8161 | 0.0484 | 0.0451 | | 0.0568 | 33.0 | 4125 | 0.4675 | 0.8225 | 0.2574 | 1.9474 | 0.8225 | 0.8238 | 0.0477 | 0.0433 | | 0.0568 | 34.0 | 4250 | 0.4664 | 0.8207 | 0.2579 | 1.9478 | 0.8207 | 0.8220 | 0.0498 | 0.0431 | | 0.0568 | 35.0 | 4375 | 0.4635 | 0.8213 | 0.2567 | 1.9233 | 0.8213 | 0.8219 | 0.0481 | 0.0427 | | 0.0514 | 36.0 | 4500 | 0.4584 | 0.8245 | 0.2551 | 1.9196 | 0.8245 | 0.8260 | 0.0461 | 0.0424 | | 0.0514 | 37.0 | 4625 | 0.4627 | 0.825 | 0.2557 | 1.9274 | 0.825 | 0.8256 | 0.0454 | 0.0424 | | 0.0514 | 38.0 | 4750 | 0.4603 | 0.8213 | 0.2552 | 1.9319 | 0.8213 | 0.8221 | 0.0478 | 0.0425 | | 0.0514 | 39.0 | 4875 | 0.4610 | 0.8245 | 0.2560 | 1.9337 | 0.8245 | 0.8252 | 0.0476 | 0.0424 | | 0.0483 | 40.0 | 5000 | 0.4603 | 0.825 | 0.2559 | 1.9319 | 0.825 | 0.8262 | 0.0460 | 0.0421 | | 0.0483 | 41.0 | 5125 | 0.4589 | 0.8253 | 0.2545 | 1.9317 | 0.8253 | 0.8260 | 0.0459 | 0.0421 | | 0.0483 | 42.0 | 5250 | 0.4586 | 0.8245 | 0.2552 | 1.9192 | 0.8245 | 0.8260 | 0.0524 | 0.0420 | | 0.0483 | 43.0 | 5375 | 0.4581 | 0.825 | 0.2552 | 1.9179 | 0.825 | 0.8263 | 0.0477 | 0.0421 | | 0.0465 | 44.0 | 5500 | 0.4573 | 0.8245 | 0.2543 | 1.9187 | 0.8245 | 0.8257 | 0.0457 | 0.0417 | | 0.0465 | 45.0 | 5625 | 0.4589 | 0.8225 | 0.2554 | 1.9184 | 0.8225 | 0.8235 | 0.0549 | 0.0421 | | 0.0465 | 46.0 | 5750 | 0.4582 | 0.823 | 0.2547 | 1.9128 | 0.823 | 0.8242 | 0.0512 | 0.0420 | | 0.0465 | 47.0 | 5875 | 0.4587 | 0.823 | 0.2551 | 1.9135 | 0.823 | 0.8241 | 0.0484 | 0.0420 | | 0.0458 | 48.0 | 6000 | 0.4585 | 0.8235 | 0.2550 | 1.9127 | 0.8235 | 0.8246 | 0.0479 | 0.0420 | | 0.0458 | 49.0 | 6125 | 0.4589 | 0.8227 | 0.2553 | 1.9117 | 0.8227 | 0.8238 | 0.0490 | 0.0421 | | 0.0458 | 50.0 | 6250 | 0.4588 | 0.8227 | 0.2552 | 1.9117 | 0.8227 | 0.8239 | 0.0507 | 0.0420 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/cdip-small_rvl_cdip-NK1000_kd_test
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # cdip-small_rvl_cdip-NK1000_kd_test This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.3813 - Accuracy: 0.8558 - Brier Loss: 0.2176 - Nll: 1.4251 - F1 Micro: 0.8558 - F1 Macro: 0.8566 - Ece: 0.0597 - Aurc: 0.0299 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 24 - eval_batch_size: 24 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 1.7536 | 1.0 | 667 | 0.9652 | 0.6695 | 0.4474 | 2.2965 | 0.6695 | 0.6595 | 0.0494 | 0.1257 | | 0.8802 | 2.0 | 1334 | 0.7683 | 0.7195 | 0.3806 | 2.0303 | 0.7195 | 0.7116 | 0.0473 | 0.0920 | | 0.5767 | 3.0 | 2001 | 0.6276 | 0.7698 | 0.3253 | 1.9446 | 0.7698 | 0.7711 | 0.0436 | 0.0684 | | 0.4263 | 4.0 | 2668 | 0.6095 | 0.7785 | 0.3110 | 1.9810 | 0.7785 | 0.7810 | 0.0474 | 0.0624 | | 0.3987 | 5.0 | 3335 | 0.5608 | 0.791 | 0.2939 | 1.8539 | 0.791 | 0.7918 | 0.0504 | 0.0557 | | 0.3179 | 6.0 | 4002 | 0.6057 | 0.7935 | 0.3027 | 1.8778 | 0.7935 | 0.7940 | 0.0811 | 0.0548 | | 0.2428 | 7.0 | 4669 | 0.5828 | 0.8043 | 0.2905 | 1.8616 | 0.8043 | 0.8050 | 0.0662 | 0.0520 | | 0.2094 | 8.0 | 5336 | 0.5812 | 0.7957 | 0.2973 | 1.8459 | 0.7957 | 0.8019 | 0.0783 | 0.0532 | | 0.1715 | 9.0 | 6003 | 0.6152 | 0.7987 | 0.2993 | 1.9533 | 0.7987 | 0.7998 | 0.0723 | 0.0539 | | 0.1508 | 10.0 | 6670 | 0.5442 | 0.808 | 0.2820 | 1.8159 | 0.808 | 0.8097 | 0.0836 | 0.0476 | | 0.1434 | 11.0 | 7337 | 0.4881 | 0.828 | 0.2549 | 1.6938 | 0.828 | 0.8286 | 0.0610 | 0.0410 | | 0.1267 | 12.0 | 8004 | 0.4720 | 0.8365 | 0.2465 | 1.6878 | 0.8365 | 0.8360 | 0.0576 | 0.0400 | | 0.115 | 13.0 | 8671 | 0.4648 | 0.8335 | 0.2482 | 1.6871 | 0.8335 | 0.8353 | 0.0630 | 0.0387 | | 0.1112 | 14.0 | 9338 | 0.4777 | 0.8317 | 0.2509 | 1.6393 | 0.8317 | 0.8312 | 0.0614 | 0.0418 | | 0.1002 | 15.0 | 10005 | 0.4684 | 0.8333 | 0.2484 | 1.6054 | 0.8333 | 0.8335 | 0.0657 | 0.0392 | | 0.0944 | 16.0 | 10672 | 0.4693 | 0.8365 | 0.2480 | 1.6381 | 0.8365 | 0.8366 | 0.0658 | 0.0383 | | 0.0934 | 17.0 | 11339 | 0.4534 | 0.8323 | 0.2465 | 1.6420 | 0.8323 | 0.8343 | 0.0561 | 0.0373 | | 0.0835 | 18.0 | 12006 | 0.4512 | 0.8357 | 0.2435 | 1.6301 | 0.8357 | 0.8367 | 0.0575 | 0.0372 | | 0.08 | 19.0 | 12673 | 0.4345 | 0.838 | 0.2394 | 1.6382 | 0.838 | 0.8398 | 0.0562 | 0.0366 | | 0.0819 | 20.0 | 13340 | 0.4356 | 0.838 | 0.2374 | 1.5973 | 0.838 | 0.8384 | 0.0588 | 0.0364 | | 0.0709 | 21.0 | 14007 | 0.4484 | 0.8415 | 0.2368 | 1.6231 | 0.8415 | 0.8411 | 0.0595 | 0.0368 | | 0.0691 | 22.0 | 14674 | 0.4194 | 0.8495 | 0.2287 | 1.5968 | 0.8495 | 0.8505 | 0.0531 | 0.0335 | | 0.068 | 23.0 | 15341 | 0.4308 | 0.8413 | 0.2346 | 1.5599 | 0.8413 | 0.8410 | 0.0542 | 0.0360 | | 0.0641 | 24.0 | 16008 | 
0.4209 | 0.8405 | 0.2336 | 1.5539 | 0.8405 | 0.8422 | 0.0590 | 0.0339 | | 0.0617 | 25.0 | 16675 | 0.4181 | 0.841 | 0.2352 | 1.5735 | 0.841 | 0.8435 | 0.0568 | 0.0356 | | 0.0633 | 26.0 | 17342 | 0.4193 | 0.8508 | 0.2286 | 1.5299 | 0.8508 | 0.8510 | 0.0650 | 0.0348 | | 0.0569 | 27.0 | 18009 | 0.4065 | 0.8468 | 0.2278 | 1.5267 | 0.8468 | 0.8479 | 0.0546 | 0.0332 | | 0.0571 | 28.0 | 18676 | 0.4109 | 0.8498 | 0.2255 | 1.5147 | 0.8498 | 0.8499 | 0.0590 | 0.0331 | | 0.0543 | 29.0 | 19343 | 0.4026 | 0.8482 | 0.2250 | 1.5187 | 0.8482 | 0.8498 | 0.0623 | 0.0327 | | 0.0543 | 30.0 | 20010 | 0.4124 | 0.847 | 0.2293 | 1.5125 | 0.847 | 0.8473 | 0.0605 | 0.0330 | | 0.0536 | 31.0 | 20677 | 0.4022 | 0.851 | 0.2238 | 1.5100 | 0.851 | 0.8527 | 0.0594 | 0.0323 | | 0.0522 | 32.0 | 21344 | 0.4120 | 0.8475 | 0.2290 | 1.5044 | 0.8475 | 0.8483 | 0.0633 | 0.0327 | | 0.0493 | 33.0 | 22011 | 0.3990 | 0.8492 | 0.2258 | 1.5197 | 0.8492 | 0.8503 | 0.0589 | 0.0318 | | 0.0512 | 34.0 | 22678 | 0.3983 | 0.85 | 0.2251 | 1.4644 | 0.85 | 0.8503 | 0.0597 | 0.0319 | | 0.0517 | 35.0 | 23345 | 0.3969 | 0.8465 | 0.2257 | 1.4814 | 0.8465 | 0.8479 | 0.0630 | 0.0309 | | 0.0477 | 36.0 | 24012 | 0.3939 | 0.8528 | 0.2237 | 1.4797 | 0.8528 | 0.8531 | 0.0604 | 0.0316 | | 0.0482 | 37.0 | 24679 | 0.3934 | 0.852 | 0.2218 | 1.4595 | 0.852 | 0.8527 | 0.0613 | 0.0316 | | 0.0481 | 38.0 | 25346 | 0.3930 | 0.8532 | 0.2217 | 1.4561 | 0.8532 | 0.8544 | 0.0593 | 0.0306 | | 0.0477 | 39.0 | 26013 | 0.3875 | 0.8512 | 0.2202 | 1.4610 | 0.8512 | 0.8523 | 0.0609 | 0.0310 | | 0.048 | 40.0 | 26680 | 0.3900 | 0.8538 | 0.2202 | 1.4541 | 0.8537 | 0.8546 | 0.0629 | 0.0307 | | 0.0448 | 41.0 | 27347 | 0.3901 | 0.8525 | 0.2221 | 1.4519 | 0.8525 | 0.8532 | 0.0621 | 0.0308 | | 0.0454 | 42.0 | 28014 | 0.3858 | 0.851 | 0.2186 | 1.4554 | 0.851 | 0.8519 | 0.0633 | 0.0298 | | 0.0464 | 43.0 | 28681 | 0.3861 | 0.8528 | 0.2197 | 1.4516 | 0.8528 | 0.8535 | 0.0618 | 0.0307 | | 0.0444 | 44.0 | 29348 | 0.3824 | 0.8548 | 0.2176 | 1.4288 | 0.8547 | 0.8557 | 0.0607 | 0.0299 | | 0.0461 | 45.0 | 30015 | 0.3833 | 0.8555 | 0.2181 | 1.4330 | 0.8555 | 0.8566 | 0.0606 | 0.0302 | | 0.0442 | 46.0 | 30682 | 0.3830 | 0.8552 | 0.2174 | 1.4358 | 0.8552 | 0.8560 | 0.0604 | 0.0302 | | 0.0456 | 47.0 | 31349 | 0.3797 | 0.8552 | 0.2173 | 1.4264 | 0.8552 | 0.8560 | 0.0596 | 0.0297 | | 0.0447 | 48.0 | 32016 | 0.3811 | 0.8558 | 0.2176 | 1.4273 | 0.8558 | 0.8566 | 0.0595 | 0.0300 | | 0.0439 | 49.0 | 32683 | 0.3814 | 0.856 | 0.2176 | 1.4252 | 0.856 | 0.8568 | 0.0600 | 0.0300 | | 0.0437 | 50.0 | 33350 | 0.3813 | 0.8558 | 0.2176 | 1.4251 | 0.8558 | 0.8566 | 0.0597 | 0.0299 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
annazhong/vit-base-patch16-224-finetuned-foveated-features-v2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-foveated-features-v2 This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.9396 - Accuracy: 0.24 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 150 - eval_batch_size: 150 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 600 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 1 | 1.9396 | 0.24 | | No log | 2.0 | 3 | 1.9830 | 0.12 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.0 - Tokenizers 0.13.3
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5", "label_6" ]
JoseVilla/cfe-telmex-classification-finetuned-v3
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # cfe-telmex-classification-finetuned-v3 This model is a fine-tuned version of [JoseVilla/cfe-telmex-classification-finetuned-v2](https://huggingface.co/JoseVilla/cfe-telmex-classification-finetuned-v2) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.2047 - Accuracy: 0.9583 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 0.94 | 8 | 0.4386 | 0.8167 | | 0.6716 | 2.0 | 17 | 0.2047 | 0.9583 | | 0.1864 | 2.82 | 24 | 0.1664 | 0.9583 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.0 - Tokenizers 0.13.3
[ "cfe", "other", "telmex" ]
annazhong/vit-base-patch16-224-finetuned-feature-maps-v3
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-feature-maps-v3 This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.0989 - Accuracy: 0.3810 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 150 - eval_batch_size: 150 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 600 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 1 | 1.0989 | 0.3810 | | No log | 2.0 | 2 | 1.1292 | 0.3651 | | No log | 3.0 | 3 | 1.0972 | 0.3810 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.0 - Tokenizers 0.13.3
[ "label_0", "label_1", "label_2" ]
jordyvl/cdip-tiny_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # cdip-tiny_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5 This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.5622 - Accuracy: 0.8255 - Brier Loss: 0.2585 - Nll: 1.9229 - F1 Micro: 0.8255 - F1 Macro: 0.8273 - Ece: 0.0661 - Aurc: 0.0421 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 128 - eval_batch_size: 128 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 125 | 2.1936 | 0.55 | 0.5790 | 2.6092 | 0.55 | 0.5380 | 0.0723 | 0.2136 | | No log | 2.0 | 250 | 1.5136 | 0.666 | 0.4551 | 2.2832 | 0.666 | 0.6632 | 0.0686 | 0.1282 | | No log | 3.0 | 375 | 1.2758 | 0.6993 | 0.4127 | 2.2483 | 0.6993 | 0.6928 | 0.0836 | 0.1026 | | 1.9726 | 4.0 | 500 | 1.1382 | 0.726 | 0.3794 | 2.0849 | 0.726 | 0.7288 | 0.0725 | 0.0878 | | 1.9726 | 5.0 | 625 | 1.0625 | 0.7542 | 0.3523 | 2.1178 | 0.7542 | 0.7426 | 0.0755 | 0.0758 | | 1.9726 | 6.0 | 750 | 0.8657 | 0.7775 | 0.3184 | 1.9516 | 0.7775 | 0.7783 | 0.0736 | 0.0618 | | 1.9726 | 7.0 | 875 | 0.8285 | 0.7887 | 0.3089 | 2.0439 | 0.7887 | 0.7867 | 0.0724 | 0.0586 | | 0.5092 | 8.0 | 1000 | 0.7988 | 0.7925 | 0.3056 | 2.0106 | 0.7925 | 0.7953 | 0.0700 | 0.0572 | | 0.5092 | 9.0 | 1125 | 0.7783 | 0.7925 | 0.3060 | 1.9710 | 0.7925 | 0.7914 | 0.0822 | 0.0573 | | 0.5092 | 10.0 | 1250 | 0.7640 | 0.796 | 0.3007 | 1.9819 | 0.796 | 0.7996 | 0.0801 | 0.0536 | | 0.5092 | 11.0 | 1375 | 0.7705 | 0.7913 | 0.3048 | 1.9588 | 0.7913 | 0.7952 | 0.0857 | 0.0559 | | 0.2071 | 12.0 | 1500 | 0.7328 | 0.8015 | 0.2937 | 1.9484 | 0.8015 | 0.8017 | 0.0760 | 0.0537 | | 0.2071 | 13.0 | 1625 | 0.6946 | 0.811 | 0.2881 | 1.9173 | 0.811 | 0.8125 | 0.0824 | 0.0507 | | 0.2071 | 14.0 | 1750 | 0.6902 | 0.8053 | 0.2880 | 1.9154 | 0.8053 | 0.8068 | 0.0738 | 0.0502 | | 0.2071 | 15.0 | 1875 | 0.6756 | 0.8083 | 0.2840 | 1.9317 | 0.8083 | 0.8078 | 0.0761 | 0.0487 | | 0.1424 | 16.0 | 2000 | 0.6684 | 0.8067 | 0.2852 | 1.9192 | 0.8067 | 0.8073 | 0.0765 | 0.0507 | | 0.1424 | 17.0 | 2125 | 0.6548 | 0.8095 | 0.2816 | 1.9398 | 0.8095 | 0.8110 | 0.0758 | 0.0472 | | 0.1424 | 18.0 | 2250 | 0.6477 | 0.8117 | 0.2762 | 1.9054 | 0.8117 | 0.8140 | 0.0759 | 0.0464 | | 0.1424 | 19.0 | 2375 | 0.6423 | 0.8145 | 0.2794 | 1.9081 | 0.8145 | 0.8148 | 0.0774 | 0.0478 | | 0.1102 | 20.0 | 2500 | 0.6312 | 0.8103 | 0.2771 | 1.9581 | 0.8103 | 0.8125 | 0.0746 | 0.0454 | | 0.1102 | 21.0 | 2625 | 0.6299 | 0.8133 | 0.2720 | 1.9275 | 0.8133 | 0.8132 | 0.0758 | 0.0466 | | 0.1102 | 22.0 | 2750 | 0.6148 | 0.8197 | 0.2691 | 1.9463 | 0.8197 | 0.8223 | 0.0681 | 0.0447 | | 0.1102 | 23.0 | 2875 | 0.6132 | 0.8187 | 0.2700 | 1.9301 | 0.8187 | 0.8200 | 0.0691 | 0.0451 | | 0.0931 | 24.0 | 3000 | 0.5995 | 
0.8245 | 0.2649 | 1.9173 | 0.8245 | 0.8251 | 0.0640 | 0.0444 | | 0.0931 | 25.0 | 3125 | 0.6020 | 0.8177 | 0.2697 | 1.9205 | 0.8178 | 0.8201 | 0.0723 | 0.0440 | | 0.0931 | 26.0 | 3250 | 0.5914 | 0.8247 | 0.2617 | 1.9385 | 0.8247 | 0.8264 | 0.0667 | 0.0428 | | 0.0931 | 27.0 | 3375 | 0.5833 | 0.822 | 0.2621 | 1.9390 | 0.822 | 0.8228 | 0.0658 | 0.0429 | | 0.0789 | 28.0 | 3500 | 0.5884 | 0.8247 | 0.2626 | 1.9400 | 0.8247 | 0.8259 | 0.0619 | 0.0435 | | 0.0789 | 29.0 | 3625 | 0.5771 | 0.8285 | 0.2568 | 1.9252 | 0.8285 | 0.8313 | 0.0612 | 0.0413 | | 0.0789 | 30.0 | 3750 | 0.5815 | 0.823 | 0.2628 | 1.9413 | 0.823 | 0.8236 | 0.0676 | 0.0433 | | 0.0789 | 31.0 | 3875 | 0.5789 | 0.8205 | 0.2617 | 1.9209 | 0.8205 | 0.8219 | 0.0667 | 0.0431 | | 0.0686 | 32.0 | 4000 | 0.5775 | 0.8247 | 0.2616 | 1.9045 | 0.8247 | 0.8265 | 0.0674 | 0.0428 | | 0.0686 | 33.0 | 4125 | 0.5744 | 0.827 | 0.2603 | 1.9088 | 0.827 | 0.8275 | 0.0656 | 0.0420 | | 0.0686 | 34.0 | 4250 | 0.5685 | 0.824 | 0.2607 | 1.9372 | 0.824 | 0.8264 | 0.0647 | 0.0421 | | 0.0686 | 35.0 | 4375 | 0.5649 | 0.8255 | 0.2584 | 1.9375 | 0.8255 | 0.8274 | 0.0694 | 0.0419 | | 0.0596 | 36.0 | 4500 | 0.5629 | 0.8263 | 0.2574 | 1.9304 | 0.8263 | 0.8283 | 0.0651 | 0.0415 | | 0.0596 | 37.0 | 4625 | 0.5622 | 0.8237 | 0.2579 | 1.9228 | 0.8237 | 0.8254 | 0.0644 | 0.0419 | | 0.0596 | 38.0 | 4750 | 0.5623 | 0.8257 | 0.2579 | 1.9310 | 0.8257 | 0.8277 | 0.0650 | 0.0418 | | 0.0596 | 39.0 | 4875 | 0.5625 | 0.827 | 0.2579 | 1.9311 | 0.827 | 0.8286 | 0.0668 | 0.0418 | | 0.0538 | 40.0 | 5000 | 0.5633 | 0.8247 | 0.2590 | 1.9264 | 0.8247 | 0.8264 | 0.0671 | 0.0424 | | 0.0538 | 41.0 | 5125 | 0.5607 | 0.8257 | 0.2575 | 1.9239 | 0.8257 | 0.8275 | 0.0621 | 0.0417 | | 0.0538 | 42.0 | 5250 | 0.5605 | 0.8263 | 0.2569 | 1.9305 | 0.8263 | 0.8279 | 0.0620 | 0.0418 | | 0.0538 | 43.0 | 5375 | 0.5613 | 0.8255 | 0.2581 | 1.9295 | 0.8255 | 0.8272 | 0.0672 | 0.0418 | | 0.0512 | 44.0 | 5500 | 0.5616 | 0.8255 | 0.2581 | 1.9235 | 0.8255 | 0.8273 | 0.0636 | 0.0419 | | 0.0512 | 45.0 | 5625 | 0.5624 | 0.8253 | 0.2585 | 1.9206 | 0.8253 | 0.8270 | 0.0646 | 0.0422 | | 0.0512 | 46.0 | 5750 | 0.5622 | 0.8257 | 0.2582 | 1.9283 | 0.8257 | 0.8273 | 0.0647 | 0.0420 | | 0.0512 | 47.0 | 5875 | 0.5616 | 0.825 | 0.2581 | 1.9283 | 0.825 | 0.8267 | 0.0640 | 0.0420 | | 0.05 | 48.0 | 6000 | 0.5620 | 0.8257 | 0.2584 | 1.9262 | 0.8257 | 0.8275 | 0.0644 | 0.0421 | | 0.05 | 49.0 | 6125 | 0.5622 | 0.8253 | 0.2585 | 1.9257 | 0.8253 | 0.8270 | 0.0630 | 0.0421 | | 0.05 | 50.0 | 6250 | 0.5622 | 0.8255 | 0.2585 | 1.9229 | 0.8255 | 0.8273 | 0.0661 | 0.0421 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/vit-base_rvl_cdip_crl
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl_cdip_crl This model is a fine-tuned version of [jordyvl/vit-base_rvl-cdip](https://huggingface.co/jordyvl/vit-base_rvl-cdip) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.6238 - Accuracy: 0.8956 - Brier Loss: 0.1819 - Nll: 1.1791 - F1 Micro: 0.8957 - F1 Macro: 0.8958 - Ece: 0.0846 - Aurc: 0.0210 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 256 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 0.1844 | 1.0 | 1250 | 0.4411 | 0.8961 | 0.1614 | 1.1240 | 0.8961 | 0.8963 | 0.0528 | 0.0161 | | 0.1394 | 2.0 | 2500 | 0.4830 | 0.8927 | 0.1716 | 1.1324 | 0.8927 | 0.8927 | 0.0646 | 0.0175 | | 0.1 | 3.0 | 3750 | 0.5257 | 0.8911 | 0.1791 | 1.1569 | 0.8911 | 0.8912 | 0.0737 | 0.0187 | | 0.068 | 4.0 | 5000 | 0.5497 | 0.8913 | 0.1806 | 1.1705 | 0.8913 | 0.8913 | 0.0770 | 0.0192 | | 0.048 | 5.0 | 6250 | 0.5762 | 0.8915 | 0.1834 | 1.1906 | 0.8915 | 0.8914 | 0.0808 | 0.0195 | | 0.033 | 6.0 | 7500 | 0.5877 | 0.8936 | 0.1822 | 1.1690 | 0.8936 | 0.8938 | 0.0817 | 0.0196 | | 0.0231 | 7.0 | 8750 | 0.6000 | 0.8938 | 0.1822 | 1.1867 | 0.8938 | 0.8939 | 0.0833 | 0.0206 | | 0.0162 | 8.0 | 10000 | 0.6187 | 0.8948 | 0.1834 | 1.1827 | 0.8948 | 0.8949 | 0.0841 | 0.0208 | | 0.0123 | 9.0 | 11250 | 0.6191 | 0.8953 | 0.1824 | 1.1868 | 0.8953 | 0.8955 | 0.0836 | 0.0207 | | 0.0102 | 10.0 | 12500 | 0.6238 | 0.8956 | 0.1819 | 1.1791 | 0.8957 | 0.8958 | 0.0846 | 0.0210 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
wuru330/378A1_results_2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 378A1_results_2 This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.5007 - Accuracy: 0.8861 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 30 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.2051 | 1.0 | 37 | 1.0438 | 0.6429 | | 0.6643 | 2.0 | 74 | 0.6103 | 0.7925 | | 0.4615 | 3.0 | 111 | 0.4720 | 0.8435 | | 0.3136 | 4.0 | 148 | 0.3668 | 0.8776 | | 0.278 | 5.0 | 185 | 0.3650 | 0.8622 | | 0.1875 | 6.0 | 222 | 0.3705 | 0.8690 | | 0.1215 | 7.0 | 259 | 0.4093 | 0.8741 | | 0.0885 | 8.0 | 296 | 0.3428 | 0.9014 | | 0.0497 | 9.0 | 333 | 0.3854 | 0.8759 | | 0.0348 | 10.0 | 370 | 0.4291 | 0.8707 | | 0.0301 | 11.0 | 407 | 0.4464 | 0.8895 | | 0.0246 | 12.0 | 444 | 0.4208 | 0.8929 | | 0.0218 | 13.0 | 481 | 0.4256 | 0.8912 | | 0.0198 | 14.0 | 518 | 0.4300 | 0.8878 | | 0.0179 | 15.0 | 555 | 0.4403 | 0.8861 | | 0.0165 | 16.0 | 592 | 0.4481 | 0.8861 | | 0.0155 | 17.0 | 629 | 0.4554 | 0.8878 | | 0.0146 | 18.0 | 666 | 0.4632 | 0.8878 | | 0.0137 | 19.0 | 703 | 0.4691 | 0.8844 | | 0.0129 | 20.0 | 740 | 0.4747 | 0.8861 | | 0.0125 | 21.0 | 777 | 0.4792 | 0.8844 | | 0.0119 | 22.0 | 814 | 0.4840 | 0.8844 | | 0.0113 | 23.0 | 851 | 0.4875 | 0.8861 | | 0.0111 | 24.0 | 888 | 0.4924 | 0.8844 | | 0.0108 | 25.0 | 925 | 0.4947 | 0.8844 | | 0.0105 | 26.0 | 962 | 0.4966 | 0.8844 | | 0.0104 | 27.0 | 999 | 0.4988 | 0.8861 | | 0.0102 | 28.0 | 1036 | 0.4997 | 0.8861 | | 0.0101 | 29.0 | 1073 | 0.5005 | 0.8861 | | 0.01 | 30.0 | 1110 | 0.5007 | 0.8861 | ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu117 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "label_0", "label_1", "label_2", "label_3" ]
jordyvl/vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_kd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_kd This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.8222 - Accuracy: 0.5543 - Brier Loss: 0.6966 - Nll: 3.2790 - F1 Micro: 0.5543 - F1 Macro: 0.5553 - Ece: 0.2764 - Aurc: 0.2323 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 128 - eval_batch_size: 128 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 125 | 2.4938 | 0.1348 | 0.9105 | 6.1860 | 0.1348 | 0.0673 | 0.0701 | 0.7303 | | No log | 2.0 | 250 | 2.3746 | 0.1895 | 0.8922 | 5.4000 | 0.1895 | 0.1384 | 0.0653 | 0.6997 | | No log | 3.0 | 375 | 2.2780 | 0.198 | 0.8746 | 4.3716 | 0.198 | 0.1340 | 0.0676 | 0.6597 | | 2.396 | 4.0 | 500 | 2.0622 | 0.302 | 0.8212 | 4.2504 | 0.302 | 0.2268 | 0.0655 | 0.5280 | | 2.396 | 5.0 | 625 | 1.8529 | 0.3703 | 0.7693 | 3.3328 | 0.3703 | 0.3184 | 0.0727 | 0.4470 | | 2.396 | 6.0 | 750 | 1.6847 | 0.423 | 0.7103 | 3.0730 | 0.4230 | 0.3879 | 0.0698 | 0.3567 | | 2.396 | 7.0 | 875 | 1.5975 | 0.45 | 0.6817 | 3.0713 | 0.45 | 0.4139 | 0.0632 | 0.3257 | | 1.7095 | 8.0 | 1000 | 1.5156 | 0.4768 | 0.6588 | 2.8252 | 0.4768 | 0.4517 | 0.0635 | 0.3029 | | 1.7095 | 9.0 | 1125 | 1.4425 | 0.5018 | 0.6308 | 2.7656 | 0.5018 | 0.4812 | 0.0650 | 0.2728 | | 1.7095 | 10.0 | 1250 | 1.4089 | 0.5092 | 0.6218 | 2.6715 | 0.5092 | 0.4894 | 0.0527 | 0.2642 | | 1.7095 | 11.0 | 1375 | 1.3930 | 0.523 | 0.6150 | 2.6821 | 0.523 | 0.5261 | 0.0635 | 0.2584 | | 1.3064 | 12.0 | 1500 | 1.4166 | 0.5205 | 0.6262 | 2.7691 | 0.5205 | 0.4991 | 0.0813 | 0.2639 | | 1.3064 | 13.0 | 1625 | 1.3343 | 0.5312 | 0.5961 | 2.6475 | 0.5312 | 0.5194 | 0.0586 | 0.2383 | | 1.3064 | 14.0 | 1750 | 1.3277 | 0.5417 | 0.5917 | 2.6528 | 0.5417 | 0.5361 | 0.0669 | 0.2327 | | 1.3064 | 15.0 | 1875 | 1.3407 | 0.5312 | 0.5958 | 2.6880 | 0.5312 | 0.5356 | 0.0637 | 0.2378 | | 1.0419 | 16.0 | 2000 | 1.2873 | 0.5545 | 0.5801 | 2.6042 | 0.5545 | 0.5509 | 0.0870 | 0.2193 | | 1.0419 | 17.0 | 2125 | 1.3539 | 0.5375 | 0.6022 | 2.6706 | 0.5375 | 0.5329 | 0.0970 | 0.2376 | | 1.0419 | 18.0 | 2250 | 1.3073 | 0.5543 | 0.5857 | 2.6217 | 0.5543 | 0.5502 | 0.1006 | 0.2200 | | 1.0419 | 19.0 | 2375 | 1.3225 | 0.558 | 0.5886 | 2.6258 | 0.558 | 0.5530 | 0.1047 | 0.2206 | | 0.8001 | 20.0 | 2500 | 1.3573 | 0.554 | 0.5955 | 2.7139 | 0.554 | 0.5489 | 0.1221 | 0.2200 | | 0.8001 | 21.0 | 2625 | 1.4029 | 0.546 | 0.6150 | 2.7649 | 0.546 | 0.5456 | 0.1547 | 0.2274 | | 0.8001 | 22.0 | 2750 | 1.4006 | 0.5525 | 0.6092 | 2.8131 | 0.5525 | 0.5504 | 0.1474 | 0.2246 | | 0.8001 | 23.0 | 2875 | 1.4523 | 0.5513 | 0.6223 | 2.8803 | 0.5513 | 0.5448 | 0.1818 | 0.2269 | | 0.5716 | 24.0 | 3000 | 1.4744 | 
0.5495 | 0.6261 | 2.9958 | 0.5495 | 0.5525 | 0.1799 | 0.2253 | | 0.5716 | 25.0 | 3125 | 1.5278 | 0.5445 | 0.6418 | 3.0853 | 0.5445 | 0.5485 | 0.1915 | 0.2321 | | 0.5716 | 26.0 | 3250 | 1.5782 | 0.5433 | 0.6566 | 3.0618 | 0.5433 | 0.5448 | 0.2171 | 0.2333 | | 0.5716 | 27.0 | 3375 | 1.6368 | 0.5375 | 0.6704 | 3.2249 | 0.5375 | 0.5389 | 0.2277 | 0.2401 | | 0.3744 | 28.0 | 3500 | 1.6339 | 0.5445 | 0.6694 | 3.1689 | 0.5445 | 0.5447 | 0.2376 | 0.2338 | | 0.3744 | 29.0 | 3625 | 1.6589 | 0.548 | 0.6714 | 3.1654 | 0.548 | 0.5469 | 0.2376 | 0.2319 | | 0.3744 | 30.0 | 3750 | 1.7679 | 0.5353 | 0.6989 | 3.3537 | 0.5353 | 0.5387 | 0.2524 | 0.2558 | | 0.3744 | 31.0 | 3875 | 1.7441 | 0.5475 | 0.6846 | 3.3716 | 0.5475 | 0.5501 | 0.2455 | 0.2395 | | 0.2439 | 32.0 | 4000 | 1.7856 | 0.5365 | 0.6977 | 3.4176 | 0.5365 | 0.5443 | 0.2510 | 0.2462 | | 0.2439 | 33.0 | 4125 | 1.7886 | 0.545 | 0.6997 | 3.3804 | 0.545 | 0.5454 | 0.2646 | 0.2379 | | 0.2439 | 34.0 | 4250 | 1.8658 | 0.5275 | 0.7187 | 3.6006 | 0.5275 | 0.5300 | 0.2840 | 0.2482 | | 0.2439 | 35.0 | 4375 | 1.8668 | 0.5387 | 0.7145 | 3.3922 | 0.5387 | 0.5391 | 0.2797 | 0.2453 | | 0.1695 | 36.0 | 4500 | 1.8920 | 0.5288 | 0.7263 | 3.4756 | 0.5288 | 0.5320 | 0.2878 | 0.2507 | | 0.1695 | 37.0 | 4625 | 1.8767 | 0.542 | 0.7146 | 3.5924 | 0.542 | 0.5357 | 0.2792 | 0.2469 | | 0.1695 | 38.0 | 4750 | 1.8617 | 0.5435 | 0.7094 | 3.5434 | 0.5435 | 0.5467 | 0.2729 | 0.2440 | | 0.1695 | 39.0 | 4875 | 1.8746 | 0.5525 | 0.7073 | 3.4325 | 0.5525 | 0.5514 | 0.2789 | 0.2434 | | 0.1278 | 40.0 | 5000 | 1.8877 | 0.5435 | 0.7171 | 3.4872 | 0.5435 | 0.5438 | 0.2852 | 0.2393 | | 0.1278 | 41.0 | 5125 | 1.8919 | 0.54 | 0.7219 | 3.4577 | 0.54 | 0.5456 | 0.2869 | 0.2487 | | 0.1278 | 42.0 | 5250 | 1.8631 | 0.548 | 0.7089 | 3.4287 | 0.548 | 0.5502 | 0.2758 | 0.2390 | | 0.1278 | 43.0 | 5375 | 1.8433 | 0.5475 | 0.7058 | 3.2993 | 0.5475 | 0.5468 | 0.2863 | 0.2335 | | 0.0993 | 44.0 | 5500 | 1.8458 | 0.5505 | 0.7048 | 3.3852 | 0.5505 | 0.5528 | 0.2776 | 0.2378 | | 0.0993 | 45.0 | 5625 | 1.8408 | 0.5443 | 0.7100 | 3.3510 | 0.5443 | 0.5490 | 0.2769 | 0.2392 | | 0.0993 | 46.0 | 5750 | 1.8492 | 0.5477 | 0.7064 | 3.2989 | 0.5477 | 0.5496 | 0.2807 | 0.2363 | | 0.0993 | 47.0 | 5875 | 1.8100 | 0.5497 | 0.6969 | 3.2853 | 0.5497 | 0.5534 | 0.2761 | 0.2341 | | 0.0803 | 48.0 | 6000 | 1.8260 | 0.5523 | 0.6984 | 3.2543 | 0.5523 | 0.5532 | 0.2783 | 0.2326 | | 0.0803 | 49.0 | 6125 | 1.8225 | 0.5563 | 0.6970 | 3.3070 | 0.5563 | 0.5573 | 0.2739 | 0.2327 | | 0.0803 | 50.0 | 6250 | 1.8222 | 0.5543 | 0.6966 | 3.2790 | 0.5543 | 0.5553 | 0.2764 | 0.2323 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/cdip-tiny_rvl_cdip-NK1000_kd_NKD_t1.0_g1.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # cdip-tiny_rvl_cdip-NK1000_kd_NKD_t1.0_g1.5 This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 3.9475 - Accuracy: 0.8255 - Brier Loss: 0.2849 - Nll: 1.5880 - F1 Micro: 0.8255 - F1 Macro: 0.8276 - Ece: 0.1152 - Aurc: 0.0416 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 128 - eval_batch_size: 128 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 125 | 5.0886 | 0.506 | 0.6520 | 2.3481 | 0.506 | 0.4936 | 0.1512 | 0.2571 | | No log | 2.0 | 250 | 4.5869 | 0.639 | 0.5026 | 2.1689 | 0.639 | 0.6339 | 0.1308 | 0.1459 | | No log | 3.0 | 375 | 4.3135 | 0.7065 | 0.4267 | 2.0672 | 0.7065 | 0.7112 | 0.1248 | 0.1057 | | 5.1406 | 4.0 | 500 | 4.1859 | 0.717 | 0.3906 | 2.0573 | 0.7170 | 0.7213 | 0.0822 | 0.0921 | | 5.1406 | 5.0 | 625 | 3.9770 | 0.7615 | 0.3347 | 1.9365 | 0.7615 | 0.7597 | 0.0587 | 0.0699 | | 5.1406 | 6.0 | 750 | 3.9247 | 0.7698 | 0.3189 | 1.8905 | 0.7698 | 0.7710 | 0.0411 | 0.0654 | | 5.1406 | 7.0 | 875 | 3.8799 | 0.7728 | 0.3165 | 1.9278 | 0.7728 | 0.7800 | 0.0362 | 0.0644 | | 3.778 | 8.0 | 1000 | 3.8415 | 0.786 | 0.3036 | 1.8576 | 0.786 | 0.7927 | 0.0417 | 0.0602 | | 3.778 | 9.0 | 1125 | 3.7909 | 0.7977 | 0.2904 | 1.8383 | 0.7977 | 0.7996 | 0.0444 | 0.0547 | | 3.778 | 10.0 | 1250 | 3.8216 | 0.794 | 0.2950 | 1.8486 | 0.7940 | 0.7953 | 0.0526 | 0.0534 | | 3.778 | 11.0 | 1375 | 3.8084 | 0.797 | 0.2929 | 1.8672 | 0.797 | 0.8004 | 0.0653 | 0.0513 | | 3.3945 | 12.0 | 1500 | 3.7547 | 0.8113 | 0.2801 | 1.8312 | 0.8113 | 0.8124 | 0.0548 | 0.0482 | | 3.3945 | 13.0 | 1625 | 3.7730 | 0.8137 | 0.2825 | 1.8087 | 0.8137 | 0.8151 | 0.0762 | 0.0482 | | 3.3945 | 14.0 | 1750 | 3.8090 | 0.807 | 0.2863 | 1.7529 | 0.807 | 0.8075 | 0.0713 | 0.0504 | | 3.3945 | 15.0 | 1875 | 3.7612 | 0.8067 | 0.2886 | 1.7934 | 0.8067 | 0.8113 | 0.0741 | 0.0498 | | 3.2666 | 16.0 | 2000 | 3.7760 | 0.809 | 0.2863 | 1.8104 | 0.809 | 0.8116 | 0.0779 | 0.0490 | | 3.2666 | 17.0 | 2125 | 3.7504 | 0.8155 | 0.2765 | 1.7438 | 0.8155 | 0.8160 | 0.0798 | 0.0458 | | 3.2666 | 18.0 | 2250 | 3.7798 | 0.8085 | 0.2858 | 1.7447 | 0.8085 | 0.8097 | 0.0844 | 0.0462 | | 3.2666 | 19.0 | 2375 | 3.7784 | 0.8073 | 0.2876 | 1.7731 | 0.8073 | 0.8112 | 0.0832 | 0.0481 | | 3.1995 | 20.0 | 2500 | 3.7772 | 0.8123 | 0.2862 | 1.7110 | 0.8123 | 0.8137 | 0.0887 | 0.0461 | | 3.1995 | 21.0 | 2625 | 3.7531 | 0.8155 | 0.2780 | 1.6920 | 0.8155 | 0.8166 | 0.0860 | 0.0435 | | 3.1995 | 22.0 | 2750 | 3.7922 | 0.8123 | 0.2850 | 1.7335 | 0.8123 | 0.8162 | 0.0957 | 0.0454 | | 3.1995 | 23.0 | 2875 | 3.7857 | 0.8185 | 0.2760 | 1.7026 | 0.8185 | 0.8201 | 0.0870 | 0.0455 | | 3.154 | 24.0 | 3000 | 3.7452 | 0.821 | 
0.2724 | 1.6936 | 0.821 | 0.8234 | 0.0902 | 0.0425 | | 3.154 | 25.0 | 3125 | 3.7485 | 0.8233 | 0.2734 | 1.6908 | 0.8233 | 0.8252 | 0.0889 | 0.0418 | | 3.154 | 26.0 | 3250 | 3.7627 | 0.8197 | 0.2754 | 1.6656 | 0.8197 | 0.8212 | 0.0951 | 0.0424 | | 3.154 | 27.0 | 3375 | 3.7635 | 0.821 | 0.2743 | 1.6732 | 0.821 | 0.8221 | 0.0955 | 0.0419 | | 3.1227 | 28.0 | 3500 | 3.7829 | 0.821 | 0.2765 | 1.6749 | 0.821 | 0.8223 | 0.0980 | 0.0426 | | 3.1227 | 29.0 | 3625 | 3.7738 | 0.8207 | 0.2752 | 1.6585 | 0.8207 | 0.8223 | 0.0936 | 0.0417 | | 3.1227 | 30.0 | 3750 | 3.7622 | 0.822 | 0.2763 | 1.6672 | 0.822 | 0.8243 | 0.0942 | 0.0421 | | 3.1227 | 31.0 | 3875 | 3.7884 | 0.824 | 0.2749 | 1.6566 | 0.824 | 0.8249 | 0.0980 | 0.0416 | | 3.0998 | 32.0 | 4000 | 3.7948 | 0.8205 | 0.2780 | 1.6516 | 0.8205 | 0.8225 | 0.1004 | 0.0420 | | 3.0998 | 33.0 | 4125 | 3.7831 | 0.8175 | 0.2787 | 1.6503 | 0.8175 | 0.8207 | 0.1022 | 0.0416 | | 3.0998 | 34.0 | 4250 | 3.8119 | 0.8223 | 0.2785 | 1.6290 | 0.8223 | 0.8246 | 0.1004 | 0.0421 | | 3.0998 | 35.0 | 4375 | 3.8186 | 0.8235 | 0.2798 | 1.6490 | 0.8235 | 0.8263 | 0.1019 | 0.0422 | | 3.0845 | 36.0 | 4500 | 3.8304 | 0.8205 | 0.2821 | 1.6117 | 0.8205 | 0.8228 | 0.1062 | 0.0421 | | 3.0845 | 37.0 | 4625 | 3.8128 | 0.8267 | 0.2758 | 1.6362 | 0.8267 | 0.8292 | 0.1007 | 0.0409 | | 3.0845 | 38.0 | 4750 | 3.8488 | 0.8217 | 0.2812 | 1.6245 | 0.8217 | 0.8236 | 0.1080 | 0.0417 | | 3.0845 | 39.0 | 4875 | 3.8459 | 0.826 | 0.2781 | 1.6239 | 0.826 | 0.8281 | 0.1050 | 0.0417 | | 3.0726 | 40.0 | 5000 | 3.8527 | 0.8257 | 0.2790 | 1.6083 | 0.8257 | 0.8280 | 0.1078 | 0.0412 | | 3.0726 | 41.0 | 5125 | 3.8496 | 0.829 | 0.2777 | 1.6026 | 0.8290 | 0.8304 | 0.1018 | 0.0412 | | 3.0726 | 42.0 | 5250 | 3.8656 | 0.826 | 0.2803 | 1.6125 | 0.826 | 0.8283 | 0.1074 | 0.0412 | | 3.0726 | 43.0 | 5375 | 3.8860 | 0.8253 | 0.2815 | 1.6029 | 0.8253 | 0.8273 | 0.1102 | 0.0415 | | 3.0635 | 44.0 | 5500 | 3.8868 | 0.8225 | 0.2810 | 1.5939 | 0.8225 | 0.8248 | 0.1132 | 0.0414 | | 3.0635 | 45.0 | 5625 | 3.9087 | 0.8247 | 0.2825 | 1.5956 | 0.8247 | 0.8268 | 0.1122 | 0.0414 | | 3.0635 | 46.0 | 5750 | 3.9273 | 0.8243 | 0.2842 | 1.5863 | 0.8243 | 0.8263 | 0.1150 | 0.0415 | | 3.0635 | 47.0 | 5875 | 3.9352 | 0.8247 | 0.2841 | 1.5859 | 0.8247 | 0.8268 | 0.1148 | 0.0416 | | 3.0576 | 48.0 | 6000 | 3.9397 | 0.8253 | 0.2843 | 1.5907 | 0.8253 | 0.8274 | 0.1146 | 0.0416 | | 3.0576 | 49.0 | 6125 | 3.9444 | 0.8255 | 0.2847 | 1.5886 | 0.8255 | 0.8276 | 0.1147 | 0.0416 | | 3.0576 | 50.0 | 6250 | 3.9475 | 0.8255 | 0.2849 | 1.5880 | 0.8255 | 0.8276 | 0.1152 | 0.0416 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
vincentiussgk/vit-base-patch16-224-in21k-finetuned-eurosat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-in21k-finetuned-eurosat This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the food101 dataset. It achieves the following results on the evaluation set: - Loss: 1.1055 - Accuracy: 0.927 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 3.0689 | 0.99 | 31 | 2.6415 | 0.82 | | 1.6615 | 1.98 | 62 | 1.4504 | 0.898 | | 1.1467 | 2.98 | 93 | 1.1055 | 0.927 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.1 - Tokenizers 0.13.3
[ "apple_pie", "baby_back_ribs", "baklava", "beef_carpaccio", "beef_tartare", "beet_salad", "beignets", "bibimbap", "bread_pudding", "breakfast_burrito", "bruschetta", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare", "waffles" ]
jordyvl/vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_hint
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_hint This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 73.8137 - Accuracy: 0.8137 - Brier Loss: 0.3252 - Nll: 2.0673 - F1 Micro: 0.8137 - F1 Macro: 0.8140 - Ece: 0.1539 - Aurc: 0.0483 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 250 | 76.1878 | 0.5783 | 0.5528 | 2.6021 | 0.5783 | 0.5765 | 0.0527 | 0.2026 | | 76.425 | 2.0 | 500 | 75.1954 | 0.6558 | 0.4561 | 2.2844 | 0.6558 | 0.6549 | 0.0488 | 0.1337 | | 76.425 | 3.0 | 750 | 74.7574 | 0.716 | 0.3935 | 2.2465 | 0.7160 | 0.7170 | 0.0489 | 0.0983 | | 74.5686 | 4.0 | 1000 | 74.5759 | 0.7265 | 0.3815 | 2.1845 | 0.7265 | 0.7306 | 0.0445 | 0.0951 | | 74.5686 | 5.0 | 1250 | 74.4539 | 0.7245 | 0.3774 | 2.2022 | 0.7245 | 0.7264 | 0.0560 | 0.0919 | | 73.9702 | 6.0 | 1500 | 74.4498 | 0.7468 | 0.3680 | 2.1854 | 0.7468 | 0.7555 | 0.0829 | 0.0826 | | 73.9702 | 7.0 | 1750 | 74.2701 | 0.773 | 0.3350 | 2.1685 | 0.7730 | 0.7724 | 0.0855 | 0.0683 | | 73.5091 | 8.0 | 2000 | 74.2610 | 0.7675 | 0.3548 | 2.1544 | 0.7675 | 0.7704 | 0.1155 | 0.0709 | | 73.5091 | 9.0 | 2250 | 74.2621 | 0.772 | 0.3501 | 2.2087 | 0.772 | 0.7703 | 0.1242 | 0.0638 | | 73.2311 | 10.0 | 2500 | 74.2978 | 0.7592 | 0.3768 | 2.1953 | 0.7592 | 0.7592 | 0.1462 | 0.0738 | | 73.2311 | 11.0 | 2750 | 74.3242 | 0.7645 | 0.3803 | 2.1374 | 0.7645 | 0.7603 | 0.1528 | 0.0747 | | 73.0554 | 12.0 | 3000 | 74.2177 | 0.7847 | 0.3545 | 2.1892 | 0.7847 | 0.7862 | 0.1411 | 0.0650 | | 73.0554 | 13.0 | 3250 | 74.2360 | 0.779 | 0.3598 | 2.1518 | 0.779 | 0.7781 | 0.1513 | 0.0629 | | 72.9294 | 14.0 | 3500 | 74.2339 | 0.7772 | 0.3684 | 2.1404 | 0.7773 | 0.7799 | 0.1583 | 0.0644 | | 72.9294 | 15.0 | 3750 | 74.1185 | 0.7953 | 0.3416 | 2.1394 | 0.7953 | 0.7966 | 0.1436 | 0.0562 | | 72.8246 | 16.0 | 4000 | 74.1754 | 0.7915 | 0.3498 | 2.1599 | 0.7915 | 0.7929 | 0.1525 | 0.0606 | | 72.8246 | 17.0 | 4250 | 74.2033 | 0.7885 | 0.3559 | 2.2161 | 0.7885 | 0.7898 | 0.1558 | 0.0597 | | 72.7339 | 18.0 | 4500 | 74.2018 | 0.7873 | 0.3640 | 2.1417 | 0.7873 | 0.7881 | 0.1590 | 0.0613 | | 72.7339 | 19.0 | 4750 | 74.1204 | 0.7913 | 0.3517 | 2.1363 | 0.7913 | 0.7927 | 0.1553 | 0.0601 | | 72.6572 | 20.0 | 5000 | 74.0625 | 0.7975 | 0.3431 | 2.1165 | 0.7975 | 0.7989 | 0.1530 | 0.0587 | | 72.6572 | 21.0 | 5250 | 74.2249 | 0.7893 | 0.3609 | 2.1703 | 0.7893 | 0.7909 | 0.1663 | 0.0620 | | 72.5815 | 22.0 | 5500 | 74.1181 | 0.8025 | 0.3400 | 2.1457 | 0.8025 | 0.8024 | 0.1531 | 0.0543 | | 72.5815 | 23.0 | 5750 | 74.0536 | 0.8113 | 0.3293 | 2.1567 | 0.8113 | 0.8121 | 
0.1489 | 0.0511 | | 72.5166 | 24.0 | 6000 | 74.0110 | 0.8073 | 0.3345 | 2.1831 | 0.8073 | 0.8072 | 0.1487 | 0.0524 | | 72.5166 | 25.0 | 6250 | 74.1061 | 0.8005 | 0.3424 | 2.1431 | 0.8005 | 0.8013 | 0.1573 | 0.0576 | | 72.4615 | 26.0 | 6500 | 74.0349 | 0.8013 | 0.3399 | 2.1286 | 0.8013 | 0.7997 | 0.1565 | 0.0548 | | 72.4615 | 27.0 | 6750 | 74.0363 | 0.805 | 0.3416 | 2.1198 | 0.805 | 0.8057 | 0.1551 | 0.0573 | | 72.4072 | 28.0 | 7000 | 74.0054 | 0.8107 | 0.3322 | 2.1186 | 0.8108 | 0.8104 | 0.1495 | 0.0528 | | 72.4072 | 29.0 | 7250 | 74.0448 | 0.8043 | 0.3429 | 2.0845 | 0.8043 | 0.8058 | 0.1560 | 0.0563 | | 72.3615 | 30.0 | 7500 | 73.9915 | 0.805 | 0.3376 | 2.1142 | 0.805 | 0.8059 | 0.1571 | 0.0527 | | 72.3615 | 31.0 | 7750 | 73.9340 | 0.81 | 0.3284 | 2.0976 | 0.81 | 0.8101 | 0.1516 | 0.0500 | | 72.3206 | 32.0 | 8000 | 73.9701 | 0.814 | 0.3264 | 2.1364 | 0.8140 | 0.8139 | 0.1488 | 0.0534 | | 72.3206 | 33.0 | 8250 | 73.8978 | 0.8115 | 0.3287 | 2.1375 | 0.8115 | 0.8110 | 0.1517 | 0.0487 | | 72.289 | 34.0 | 8500 | 73.8993 | 0.8175 | 0.3185 | 2.0686 | 0.8175 | 0.8196 | 0.1443 | 0.0505 | | 72.289 | 35.0 | 8750 | 73.8655 | 0.814 | 0.3231 | 2.0881 | 0.8140 | 0.8149 | 0.1504 | 0.0488 | | 72.2572 | 36.0 | 9000 | 73.8631 | 0.8153 | 0.3190 | 2.0729 | 0.8153 | 0.8158 | 0.1479 | 0.0489 | | 72.2572 | 37.0 | 9250 | 73.8671 | 0.8163 | 0.3200 | 2.1224 | 0.8163 | 0.8154 | 0.1504 | 0.0486 | | 72.2292 | 38.0 | 9500 | 73.8828 | 0.8155 | 0.3259 | 2.0859 | 0.8155 | 0.8151 | 0.1502 | 0.0476 | | 72.2292 | 39.0 | 9750 | 73.8538 | 0.8115 | 0.3296 | 2.0611 | 0.8115 | 0.8119 | 0.1541 | 0.0493 | | 72.2054 | 40.0 | 10000 | 73.8624 | 0.8115 | 0.3260 | 2.0991 | 0.8115 | 0.8113 | 0.1547 | 0.0481 | | 72.2054 | 41.0 | 10250 | 73.8335 | 0.819 | 0.3199 | 2.0802 | 0.819 | 0.8189 | 0.1468 | 0.0479 | | 72.1861 | 42.0 | 10500 | 73.8582 | 0.8123 | 0.3314 | 2.0555 | 0.8123 | 0.8130 | 0.1548 | 0.0490 | | 72.1861 | 43.0 | 10750 | 73.8290 | 0.8153 | 0.3235 | 2.0956 | 0.8153 | 0.8158 | 0.1514 | 0.0480 | | 72.1705 | 44.0 | 11000 | 73.8210 | 0.8107 | 0.3291 | 2.0636 | 0.8108 | 0.8112 | 0.1570 | 0.0489 | | 72.1705 | 45.0 | 11250 | 73.8179 | 0.8143 | 0.3260 | 2.0835 | 0.8143 | 0.8148 | 0.1534 | 0.0474 | | 72.1588 | 46.0 | 11500 | 73.8054 | 0.8117 | 0.3239 | 2.0814 | 0.8117 | 0.8122 | 0.1553 | 0.0479 | | 72.1588 | 47.0 | 11750 | 73.8085 | 0.8137 | 0.3251 | 2.0705 | 0.8137 | 0.8138 | 0.1536 | 0.0485 | | 72.1506 | 48.0 | 12000 | 73.8144 | 0.814 | 0.3254 | 2.0702 | 0.8140 | 0.8142 | 0.1534 | 0.0483 | | 72.1506 | 49.0 | 12250 | 73.8181 | 0.8137 | 0.3252 | 2.0666 | 0.8137 | 0.8141 | 0.1539 | 0.0483 | | 72.146 | 50.0 | 12500 | 73.8137 | 0.8137 | 0.3252 | 2.0673 | 0.8137 | 0.8140 | 0.1539 | 0.0483 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
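As an illustration only (the original card leaves the usage section empty): assuming the distilled checkpoint is available on the Hub under `jordyvl/vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_hint`, a document page could be classified into the 16 RVL-CDIP classes listed below. The file name is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "jordyvl/vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_hint"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

# "scan.png" is a placeholder for a scanned document page.
image = Image.open("scan.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])  # e.g. "letter", "invoice", ...
```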
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_simkd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_simkd This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 37.0129 - Accuracy: 0.8277 - Brier Loss: 0.3307 - Nll: 1.8775 - F1 Micro: 0.8277 - F1 Macro: 0.8289 - Ece: 0.1649 - Aurc: 0.0944 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 250 | 56.2115 | 0.3142 | 0.8385 | 3.5992 | 0.3142 | 0.2499 | 0.1012 | 0.5692 | | 56.615 | 2.0 | 500 | 54.0327 | 0.4025 | 0.9176 | 3.1629 | 0.4025 | 0.3116 | 0.4002 | 0.3781 | | 56.615 | 3.0 | 750 | 49.9569 | 0.4728 | 0.8906 | 2.8997 | 0.4728 | 0.4076 | 0.4129 | 0.2864 | | 50.7474 | 4.0 | 1000 | 47.4945 | 0.5685 | 0.7670 | 2.6755 | 0.5685 | 0.5350 | 0.3561 | 0.2844 | | 50.7474 | 5.0 | 1250 | 45.5054 | 0.6378 | 0.6629 | 2.5408 | 0.6378 | 0.6030 | 0.3212 | 0.1851 | | 45.4907 | 6.0 | 1500 | 43.9471 | 0.679 | 0.5949 | 2.6322 | 0.679 | 0.6636 | 0.2925 | 0.1474 | | 45.4907 | 7.0 | 1750 | 42.9273 | 0.7342 | 0.4843 | 2.4382 | 0.7342 | 0.7365 | 0.2245 | 0.1436 | | 42.5191 | 8.0 | 2000 | 41.9715 | 0.7548 | 0.4560 | 2.3596 | 0.7548 | 0.7533 | 0.2231 | 0.1400 | | 42.5191 | 9.0 | 2250 | 41.4349 | 0.7722 | 0.4310 | 2.3144 | 0.7722 | 0.7718 | 0.2103 | 0.1304 | | 40.8849 | 10.0 | 2500 | 41.0961 | 0.7805 | 0.4187 | 2.2268 | 0.7805 | 0.7826 | 0.2047 | 0.1305 | | 40.8849 | 11.0 | 2750 | 40.5831 | 0.7893 | 0.4030 | 2.1663 | 0.7893 | 0.7930 | 0.2001 | 0.1246 | | 39.8394 | 12.0 | 3000 | 40.1596 | 0.7987 | 0.3877 | 2.1719 | 0.7987 | 0.8015 | 0.1929 | 0.1162 | | 39.8394 | 13.0 | 3250 | 39.8469 | 0.8033 | 0.3821 | 2.1455 | 0.8033 | 0.8077 | 0.1889 | 0.1183 | | 38.9442 | 14.0 | 3500 | 39.5865 | 0.8055 | 0.3761 | 2.1121 | 0.8055 | 0.8096 | 0.1864 | 0.1110 | | 38.9442 | 15.0 | 3750 | 39.4686 | 0.81 | 0.3693 | 2.0948 | 0.81 | 0.8125 | 0.1831 | 0.1114 | | 38.3612 | 16.0 | 4000 | 39.1387 | 0.8207 | 0.3446 | 1.9957 | 0.8207 | 0.8219 | 0.1716 | 0.1038 | | 38.3612 | 17.0 | 4250 | 38.8950 | 0.8143 | 0.3575 | 2.0339 | 0.8143 | 0.8152 | 0.1781 | 0.1034 | | 37.7855 | 18.0 | 4500 | 38.6442 | 0.8215 | 0.3442 | 1.9658 | 0.8215 | 0.8236 | 0.1718 | 0.1036 | | 37.7855 | 19.0 | 4750 | 38.5218 | 0.8197 | 0.3477 | 1.9627 | 0.8197 | 0.8220 | 0.1735 | 0.1070 | | 37.3649 | 20.0 | 5000 | 38.3474 | 0.8225 | 0.3413 | 1.9886 | 0.8225 | 0.8239 | 0.1710 | 0.1028 | | 37.3649 | 21.0 | 5250 | 38.2377 | 0.8257 | 0.3358 | 1.9864 | 0.8257 | 0.8269 | 0.1674 | 0.0957 | | 37.0326 | 22.0 | 5500 | 38.1089 | 0.824 | 0.3418 | 1.9404 | 0.824 | 0.8257 | 0.1678 | 0.0980 | | 37.0326 | 23.0 | 5750 | 37.9861 | 0.8273 | 0.3339 | 1.9540 | 0.8273 | 0.8285 | 
0.1664 | 0.0985 | | 36.7372 | 24.0 | 6000 | 37.8397 | 0.8255 | 0.3376 | 1.9492 | 0.8255 | 0.8268 | 0.1685 | 0.0944 | | 36.7372 | 25.0 | 6250 | 37.7772 | 0.8253 | 0.3370 | 1.9078 | 0.8253 | 0.8255 | 0.1669 | 0.0997 | | 36.4341 | 26.0 | 6500 | 37.6550 | 0.828 | 0.3325 | 1.9388 | 0.828 | 0.8284 | 0.1647 | 0.0943 | | 36.4341 | 27.0 | 6750 | 37.5873 | 0.8255 | 0.3364 | 1.9319 | 0.8255 | 0.8261 | 0.1680 | 0.0920 | | 36.2152 | 28.0 | 7000 | 37.5052 | 0.825 | 0.3379 | 1.8945 | 0.825 | 0.8268 | 0.1681 | 0.0981 | | 36.2152 | 29.0 | 7250 | 37.4586 | 0.8243 | 0.3361 | 1.9094 | 0.8243 | 0.8251 | 0.1692 | 0.0945 | | 36.0128 | 30.0 | 7500 | 37.3730 | 0.8277 | 0.3304 | 1.9062 | 0.8277 | 0.8288 | 0.1657 | 0.0946 | | 36.0128 | 31.0 | 7750 | 37.3309 | 0.8277 | 0.3309 | 1.9045 | 0.8277 | 0.8291 | 0.1660 | 0.0947 | | 35.8486 | 32.0 | 8000 | 37.2620 | 0.8267 | 0.3323 | 1.8884 | 0.8267 | 0.8279 | 0.1652 | 0.0950 | | 35.8486 | 33.0 | 8250 | 37.2147 | 0.8275 | 0.3308 | 1.9079 | 0.8275 | 0.8290 | 0.1654 | 0.0960 | | 35.6854 | 34.0 | 8500 | 37.1911 | 0.831 | 0.3252 | 1.8935 | 0.831 | 0.8323 | 0.1613 | 0.0939 | | 35.6854 | 35.0 | 8750 | 37.1523 | 0.8283 | 0.3301 | 1.8847 | 0.8283 | 0.8293 | 0.1644 | 0.0972 | | 35.5758 | 36.0 | 9000 | 37.1315 | 0.8305 | 0.3252 | 1.8941 | 0.8305 | 0.8317 | 0.1627 | 0.0934 | | 35.5758 | 37.0 | 9250 | 37.1184 | 0.8275 | 0.3320 | 1.8844 | 0.8275 | 0.8285 | 0.1654 | 0.0923 | | 35.4911 | 38.0 | 9500 | 37.1149 | 0.827 | 0.3327 | 1.8885 | 0.827 | 0.8288 | 0.1668 | 0.0953 | | 35.4911 | 39.0 | 9750 | 37.1067 | 0.8267 | 0.3323 | 1.8846 | 0.8267 | 0.8281 | 0.1659 | 0.0932 | | 35.4248 | 40.0 | 10000 | 37.0792 | 0.8293 | 0.3294 | 1.8840 | 0.8293 | 0.8305 | 0.1633 | 0.0937 | | 35.4248 | 41.0 | 10250 | 37.0798 | 0.8297 | 0.3288 | 1.8718 | 0.8297 | 0.8309 | 0.1639 | 0.0929 | | 35.3648 | 42.0 | 10500 | 37.0635 | 0.8265 | 0.3351 | 1.8883 | 0.8265 | 0.8279 | 0.1680 | 0.0951 | | 35.3648 | 43.0 | 10750 | 37.0470 | 0.828 | 0.3308 | 1.8746 | 0.828 | 0.8294 | 0.1656 | 0.0939 | | 35.2961 | 44.0 | 11000 | 37.0305 | 0.8273 | 0.3321 | 1.8901 | 0.8273 | 0.8286 | 0.1657 | 0.0932 | | 35.2961 | 45.0 | 11250 | 37.0261 | 0.8275 | 0.3315 | 1.8823 | 0.8275 | 0.8287 | 0.1650 | 0.0949 | | 35.241 | 46.0 | 11500 | 37.0253 | 0.827 | 0.3311 | 1.8751 | 0.827 | 0.8283 | 0.1662 | 0.0940 | | 35.241 | 47.0 | 11750 | 37.0200 | 0.8277 | 0.3321 | 1.8708 | 0.8277 | 0.8289 | 0.1653 | 0.0949 | | 35.2059 | 48.0 | 12000 | 37.0165 | 0.8277 | 0.3305 | 1.8745 | 0.8277 | 0.8289 | 0.1650 | 0.0934 | | 35.2059 | 49.0 | 12250 | 37.0130 | 0.8275 | 0.3312 | 1.8743 | 0.8275 | 0.8287 | 0.1655 | 0.0942 | | 35.18 | 50.0 | 12500 | 37.0129 | 0.8277 | 0.3307 | 1.8775 | 0.8277 | 0.8289 | 0.1649 | 0.0944 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
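For completeness, a hedged batch-inference sketch (assuming the `jordyvl/vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_simkd` weights can be downloaded from the Hub); the page paths are placeholders.

```python
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="jordyvl/vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_simkd",
)

# Placeholder paths; any list of document page images works.
pages = ["page_001.png", "page_002.png"]
for page, preds in zip(pages, classifier(pages, top_k=1)):
    print(page, "->", preds[0]["label"], round(preds[0]["score"], 3))
```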
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_og_simkd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_og_simkd This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 268.5710 - Accuracy: 0.832 - Brier Loss: 0.3051 - Nll: 1.8984 - F1 Micro: 0.832 - F1 Macro: 0.8328 - Ece: 0.1458 - Aurc: 0.0566 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 286.5314 | 1.0 | 500 | 283.8548 | 0.4555 | 0.8016 | 2.8438 | 0.4555 | 0.3694 | 0.3223 | 0.2744 | | 283.1635 | 2.0 | 1000 | 282.0722 | 0.5577 | 0.6555 | 2.5676 | 0.5577 | 0.5116 | 0.2428 | 0.1873 | | 282.0305 | 3.0 | 1500 | 281.5496 | 0.6355 | 0.5598 | 2.4665 | 0.6355 | 0.6232 | 0.2150 | 0.1424 | | 281.2235 | 4.0 | 2000 | 280.5680 | 0.7065 | 0.4447 | 2.2146 | 0.7065 | 0.7028 | 0.1680 | 0.1010 | | 280.4245 | 5.0 | 2500 | 279.8773 | 0.7245 | 0.4675 | 2.2237 | 0.7245 | 0.7253 | 0.1993 | 0.1058 | | 279.6686 | 6.0 | 3000 | 279.1086 | 0.7748 | 0.3740 | 2.0556 | 0.7748 | 0.7729 | 0.1601 | 0.0700 | | 278.8635 | 7.0 | 3500 | 278.2839 | 0.7675 | 0.3894 | 1.9970 | 0.7675 | 0.7689 | 0.1713 | 0.0791 | | 278.1131 | 8.0 | 4000 | 277.6409 | 0.7977 | 0.3416 | 1.9272 | 0.7977 | 0.7969 | 0.1478 | 0.0618 | | 277.4522 | 9.0 | 4500 | 277.2180 | 0.8083 | 0.3280 | 1.9560 | 0.8083 | 0.8122 | 0.1474 | 0.0559 | | 276.8156 | 10.0 | 5000 | 276.5454 | 0.8145 | 0.3158 | 1.8932 | 0.8145 | 0.8149 | 0.1404 | 0.0531 | | 276.2433 | 11.0 | 5500 | 275.8951 | 0.8117 | 0.3233 | 1.8813 | 0.8117 | 0.8122 | 0.1460 | 0.0551 | | 275.7038 | 12.0 | 6000 | 275.6540 | 0.8217 | 0.3033 | 1.8605 | 0.8217 | 0.8228 | 0.1355 | 0.0521 | | 275.1847 | 13.0 | 6500 | 275.1825 | 0.8263 | 0.3063 | 1.8834 | 0.8263 | 0.8282 | 0.1384 | 0.0513 | | 274.7168 | 14.0 | 7000 | 274.8503 | 0.8203 | 0.3133 | 1.8742 | 0.8203 | 0.8222 | 0.1448 | 0.0520 | | 274.2753 | 15.0 | 7500 | 274.2773 | 0.8273 | 0.3015 | 1.8833 | 0.8273 | 0.8282 | 0.1392 | 0.0497 | | 273.8617 | 16.0 | 8000 | 273.9056 | 0.825 | 0.3018 | 1.8546 | 0.825 | 0.8273 | 0.1391 | 0.0527 | | 273.4509 | 17.0 | 8500 | 273.4976 | 0.827 | 0.3028 | 1.8656 | 0.827 | 0.8270 | 0.1415 | 0.0500 | | 273.0643 | 18.0 | 9000 | 273.0985 | 0.8315 | 0.2977 | 1.8671 | 0.8315 | 0.8326 | 0.1382 | 0.0491 | | 272.7104 | 19.0 | 9500 | 272.9490 | 0.8273 | 0.3035 | 1.8686 | 0.8273 | 0.8285 | 0.1427 | 0.0525 | | 272.3669 | 20.0 | 10000 | 272.6702 | 0.8253 | 0.3052 | 1.8809 | 0.8253 | 0.8258 | 0.1441 | 0.0499 | | 272.0331 | 21.0 | 10500 | 272.2651 | 0.833 | 0.2966 | 1.8759 | 0.833 | 0.8340 | 0.1397 | 0.0479 | | 271.7213 | 22.0 | 11000 | 272.2740 | 0.829 | 0.2999 | 1.8507 | 0.8290 | 0.8295 | 0.1419 | 0.0493 | | 271.4253 | 23.0 | 11500 
| 271.7973 | 0.8327 | 0.2962 | 1.8837 | 0.8327 | 0.8326 | 0.1404 | 0.0490 | | 271.1327 | 24.0 | 12000 | 271.5110 | 0.8355 | 0.2930 | 1.8580 | 0.8355 | 0.8365 | 0.1393 | 0.0502 | | 270.858 | 25.0 | 12500 | 271.1653 | 0.828 | 0.3035 | 1.8529 | 0.828 | 0.8287 | 0.1455 | 0.0520 | | 270.5978 | 26.0 | 13000 | 270.9584 | 0.8283 | 0.3056 | 1.8615 | 0.8283 | 0.8282 | 0.1442 | 0.0535 | | 270.3636 | 27.0 | 13500 | 270.7707 | 0.832 | 0.3030 | 1.8635 | 0.832 | 0.8331 | 0.1442 | 0.0507 | | 270.1365 | 28.0 | 14000 | 270.3265 | 0.8287 | 0.3096 | 1.8857 | 0.8287 | 0.8295 | 0.1458 | 0.0551 | | 269.9005 | 29.0 | 14500 | 270.4089 | 0.8307 | 0.3017 | 1.8722 | 0.8308 | 0.8306 | 0.1441 | 0.0528 | | 269.6876 | 30.0 | 15000 | 270.2905 | 0.8303 | 0.3043 | 1.8792 | 0.8303 | 0.8308 | 0.1446 | 0.0518 | | 269.5015 | 31.0 | 15500 | 269.9496 | 0.834 | 0.2997 | 1.8925 | 0.834 | 0.8344 | 0.1413 | 0.0519 | | 269.3106 | 32.0 | 16000 | 269.8872 | 0.8333 | 0.2990 | 1.8822 | 0.8333 | 0.8333 | 0.1419 | 0.0524 | | 269.1204 | 33.0 | 16500 | 269.7998 | 0.8303 | 0.3057 | 1.9016 | 0.8303 | 0.8310 | 0.1463 | 0.0541 | | 268.9658 | 34.0 | 17000 | 269.3946 | 0.8347 | 0.3003 | 1.8922 | 0.8347 | 0.8356 | 0.1423 | 0.0535 | | 268.8073 | 35.0 | 17500 | 269.5928 | 0.8327 | 0.3035 | 1.8508 | 0.8327 | 0.8332 | 0.1443 | 0.0546 | | 268.6654 | 36.0 | 18000 | 269.2020 | 0.8307 | 0.3058 | 1.8891 | 0.8308 | 0.8317 | 0.1456 | 0.0543 | | 268.5213 | 37.0 | 18500 | 269.3784 | 0.8295 | 0.3095 | 1.8732 | 0.8295 | 0.8299 | 0.1478 | 0.0549 | | 268.3883 | 38.0 | 19000 | 269.0580 | 0.8303 | 0.3060 | 1.8621 | 0.8303 | 0.8303 | 0.1466 | 0.0559 | | 268.2752 | 39.0 | 19500 | 269.0785 | 0.8317 | 0.3038 | 1.8956 | 0.8317 | 0.8320 | 0.1449 | 0.0534 | | 268.1814 | 40.0 | 20000 | 268.8612 | 0.8357 | 0.3029 | 1.9057 | 0.8357 | 0.8367 | 0.1416 | 0.0557 | | 268.0695 | 41.0 | 20500 | 268.8330 | 0.8303 | 0.3047 | 1.8963 | 0.8303 | 0.8309 | 0.1475 | 0.0565 | | 267.9566 | 42.0 | 21000 | 268.7392 | 0.8313 | 0.3055 | 1.9059 | 0.8313 | 0.8319 | 0.1462 | 0.0550 | | 267.9045 | 43.0 | 21500 | 268.6012 | 0.8307 | 0.3063 | 1.8974 | 0.8308 | 0.8317 | 0.1478 | 0.0565 | | 267.8455 | 44.0 | 22000 | 268.8788 | 0.832 | 0.3042 | 1.8960 | 0.832 | 0.8325 | 0.1447 | 0.0553 | | 267.776 | 45.0 | 22500 | 268.5588 | 0.829 | 0.3093 | 1.9087 | 0.8290 | 0.8296 | 0.1489 | 0.0556 | | 267.7253 | 46.0 | 23000 | 268.4604 | 0.8303 | 0.3079 | 1.9259 | 0.8303 | 0.8307 | 0.1474 | 0.0573 | | 267.6786 | 47.0 | 23500 | 268.5825 | 0.8317 | 0.3062 | 1.9043 | 0.8317 | 0.8322 | 0.1464 | 0.0561 | | 267.6514 | 48.0 | 24000 | 268.4191 | 0.8315 | 0.3072 | 1.8933 | 0.8315 | 0.8320 | 0.1466 | 0.0576 | | 267.6384 | 49.0 | 24500 | 268.4131 | 0.8303 | 0.3087 | 1.9256 | 0.8303 | 0.8306 | 0.1483 | 0.0576 | | 267.6158 | 50.0 | 25000 | 268.5710 | 0.832 | 0.3051 | 1.8984 | 0.832 | 0.8328 | 0.1458 | 0.0566 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
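The hyperparameter list above maps onto `transformers.TrainingArguments`; the sketch below mirrors only those generic Trainer settings (the simkd distillation objective itself lives in custom training code that is not shown, and the output directory is a placeholder).

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters reported above for this run (plain Trainer settings only).
training_args = TrainingArguments(
    output_dir="vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_og_simkd",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
# The Adam betas/epsilon listed above are the Trainer's optimizer defaults.
```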
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/rvlcdip-tiny_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # rvlcdip-tiny_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5 This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.6215 - Accuracy: 0.7963 - Brier Loss: 0.3076 - Nll: 1.6291 - F1 Micro: 0.7963 - F1 Macro: 0.7978 - Ece: 0.0919 - Aurc: 0.0682 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 128 - eval_batch_size: 128 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 125 | 1.3808 | 0.541 | 0.5996 | 3.3159 | 0.541 | 0.5235 | 0.1039 | 0.2209 | | No log | 2.0 | 250 | 1.0577 | 0.6525 | 0.4662 | 2.6310 | 0.6525 | 0.6396 | 0.0871 | 0.1302 | | No log | 3.0 | 375 | 0.9165 | 0.7075 | 0.4104 | 2.2685 | 0.7075 | 0.7041 | 0.0788 | 0.1048 | | 1.3004 | 4.0 | 500 | 0.8505 | 0.7298 | 0.3804 | 2.1171 | 0.7298 | 0.7380 | 0.0622 | 0.0934 | | 1.3004 | 5.0 | 625 | 0.8063 | 0.745 | 0.3603 | 2.1178 | 0.745 | 0.7359 | 0.0588 | 0.0814 | | 1.3004 | 6.0 | 750 | 0.7441 | 0.7662 | 0.3348 | 1.9219 | 0.7663 | 0.7636 | 0.0545 | 0.0741 | | 1.3004 | 7.0 | 875 | 0.6987 | 0.7732 | 0.3193 | 1.8601 | 0.7732 | 0.7741 | 0.0509 | 0.0697 | | 0.4682 | 8.0 | 1000 | 0.7033 | 0.773 | 0.3240 | 1.8889 | 0.7730 | 0.7733 | 0.0516 | 0.0776 | | 0.4682 | 9.0 | 1125 | 0.6973 | 0.7865 | 0.3151 | 1.9589 | 0.7865 | 0.7838 | 0.0441 | 0.0760 | | 0.4682 | 10.0 | 1250 | 0.7068 | 0.7748 | 0.3252 | 2.0362 | 0.7748 | 0.7749 | 0.0515 | 0.0791 | | 0.4682 | 11.0 | 1375 | 0.6988 | 0.7768 | 0.3285 | 1.9227 | 0.7768 | 0.7801 | 0.0555 | 0.0840 | | 0.1899 | 12.0 | 1500 | 0.7048 | 0.7762 | 0.3303 | 1.9777 | 0.7762 | 0.7719 | 0.0627 | 0.0809 | | 0.1899 | 13.0 | 1625 | 0.6842 | 0.7785 | 0.3240 | 1.9360 | 0.7785 | 0.7784 | 0.0614 | 0.0808 | | 0.1899 | 14.0 | 1750 | 0.6993 | 0.7742 | 0.3319 | 1.9508 | 0.7742 | 0.7727 | 0.0731 | 0.0759 | | 0.1899 | 15.0 | 1875 | 0.6936 | 0.7742 | 0.3333 | 1.9042 | 0.7742 | 0.7760 | 0.0717 | 0.0853 | | 0.1304 | 16.0 | 2000 | 0.6818 | 0.7837 | 0.3233 | 1.9541 | 0.7837 | 0.7855 | 0.0713 | 0.0853 | | 0.1304 | 17.0 | 2125 | 0.6757 | 0.78 | 0.3255 | 1.8818 | 0.78 | 0.7829 | 0.0755 | 0.0834 | | 0.1304 | 18.0 | 2250 | 0.7018 | 0.781 | 0.3348 | 2.0078 | 0.7810 | 0.7829 | 0.0786 | 0.0876 | | 0.1304 | 19.0 | 2375 | 0.6872 | 0.7775 | 0.3340 | 1.8345 | 0.7775 | 0.7786 | 0.0864 | 0.0787 | | 0.11 | 20.0 | 2500 | 0.7054 | 0.7758 | 0.3379 | 1.9542 | 0.7758 | 0.7747 | 0.0731 | 0.0847 | | 0.11 | 21.0 | 2625 | 0.7006 | 0.782 | 0.3371 | 1.8610 | 0.782 | 0.7813 | 0.0821 | 0.0891 | | 0.11 | 22.0 | 2750 | 0.7046 | 0.775 | 0.3428 | 1.8464 | 0.775 | 0.7772 | 0.0833 | 0.0814 | | 0.11 | 23.0 | 2875 | 0.6620 | 0.789 | 0.3201 | 1.8174 | 0.7890 | 0.7908 | 0.0761 | 0.0799 | | 0.0979 | 24.0 | 3000 | 0.6886 | 0.783 | 
0.3324 | 1.8706 | 0.7830 | 0.7848 | 0.0807 | 0.0773 | | 0.0979 | 25.0 | 3125 | 0.6600 | 0.7847 | 0.3236 | 1.8218 | 0.7847 | 0.7863 | 0.0833 | 0.0749 | | 0.0979 | 26.0 | 3250 | 0.6777 | 0.7798 | 0.3349 | 1.7189 | 0.7798 | 0.7812 | 0.0951 | 0.0752 | | 0.0979 | 27.0 | 3375 | 0.6554 | 0.7857 | 0.3212 | 1.7356 | 0.7857 | 0.7888 | 0.0871 | 0.0709 | | 0.087 | 28.0 | 3500 | 0.6460 | 0.7955 | 0.3140 | 1.7680 | 0.7955 | 0.7970 | 0.0761 | 0.0696 | | 0.087 | 29.0 | 3625 | 0.6371 | 0.7935 | 0.3136 | 1.6350 | 0.7935 | 0.7946 | 0.0830 | 0.0706 | | 0.087 | 30.0 | 3750 | 0.6334 | 0.7915 | 0.3127 | 1.7187 | 0.7915 | 0.7933 | 0.0857 | 0.0712 | | 0.087 | 31.0 | 3875 | 0.6293 | 0.7977 | 0.3075 | 1.7781 | 0.7977 | 0.7999 | 0.0799 | 0.0661 | | 0.0793 | 32.0 | 4000 | 0.6273 | 0.7973 | 0.3076 | 1.6439 | 0.7973 | 0.7976 | 0.0782 | 0.0695 | | 0.0793 | 33.0 | 4125 | 0.6320 | 0.7933 | 0.3123 | 1.6486 | 0.7932 | 0.7954 | 0.0899 | 0.0679 | | 0.0793 | 34.0 | 4250 | 0.6345 | 0.79 | 0.3154 | 1.6402 | 0.79 | 0.7903 | 0.0922 | 0.0675 | | 0.0793 | 35.0 | 4375 | 0.6209 | 0.793 | 0.3098 | 1.6026 | 0.793 | 0.7943 | 0.0863 | 0.0630 | | 0.0733 | 36.0 | 4500 | 0.6187 | 0.7947 | 0.3076 | 1.6282 | 0.7947 | 0.7967 | 0.0880 | 0.0666 | | 0.0733 | 37.0 | 4625 | 0.6146 | 0.7957 | 0.3051 | 1.6186 | 0.7957 | 0.7971 | 0.0885 | 0.0623 | | 0.0733 | 38.0 | 4750 | 0.6169 | 0.7983 | 0.3062 | 1.6182 | 0.7983 | 0.7996 | 0.0835 | 0.0650 | | 0.0733 | 39.0 | 4875 | 0.6180 | 0.7953 | 0.3074 | 1.6241 | 0.7953 | 0.7975 | 0.0889 | 0.0655 | | 0.0693 | 40.0 | 5000 | 0.6204 | 0.7977 | 0.3069 | 1.6048 | 0.7977 | 0.7987 | 0.0824 | 0.0659 | | 0.0693 | 41.0 | 5125 | 0.6140 | 0.7967 | 0.3055 | 1.6065 | 0.7967 | 0.7986 | 0.0911 | 0.0662 | | 0.0693 | 42.0 | 5250 | 0.6162 | 0.7957 | 0.3062 | 1.6182 | 0.7957 | 0.7971 | 0.0883 | 0.0655 | | 0.0693 | 43.0 | 5375 | 0.6169 | 0.796 | 0.3058 | 1.6212 | 0.796 | 0.7976 | 0.0879 | 0.0662 | | 0.0673 | 44.0 | 5500 | 0.6173 | 0.7973 | 0.3063 | 1.6161 | 0.7973 | 0.7990 | 0.0877 | 0.0666 | | 0.0673 | 45.0 | 5625 | 0.6193 | 0.797 | 0.3070 | 1.6151 | 0.797 | 0.7986 | 0.0881 | 0.0678 | | 0.0673 | 46.0 | 5750 | 0.6209 | 0.7963 | 0.3076 | 1.6211 | 0.7963 | 0.7979 | 0.0894 | 0.0678 | | 0.0673 | 47.0 | 5875 | 0.6211 | 0.7977 | 0.3075 | 1.6284 | 0.7977 | 0.7993 | 0.0905 | 0.0691 | | 0.0662 | 48.0 | 6000 | 0.6206 | 0.7967 | 0.3072 | 1.6289 | 0.7967 | 0.7983 | 0.0892 | 0.0673 | | 0.0662 | 49.0 | 6125 | 0.6213 | 0.7965 | 0.3075 | 1.6262 | 0.7965 | 0.7980 | 0.0886 | 0.0684 | | 0.0662 | 50.0 | 6250 | 0.6215 | 0.7963 | 0.3076 | 1.6291 | 0.7963 | 0.7978 | 0.0919 | 0.0682 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
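A short, illustrative inference example (assuming the `jordyvl/rvlcdip-tiny_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5` checkpoint is hosted on the Hub); the input image is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "jordyvl/rvlcdip-tiny_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id).eval()

# Placeholder input; any document page image can be used.
inputs = processor(images=Image.open("memo.png").convert("RGB"), return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(-1).squeeze(0)

top = torch.topk(probs, k=3)
for score, idx in zip(top.values.tolist(), top.indices.tolist()):
    print(f"{model.config.id2label[idx]}: {score:.3f}")
```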
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
Dewa/dog_emotion_v3_resnet
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dog_emotion_v3_resnet This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.3063 - Accuracy: 0.5075 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5.5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 50 | 1.3721 | 0.3475 | | No log | 2.0 | 100 | 1.3502 | 0.45 | | No log | 3.0 | 150 | 1.3292 | 0.485 | | No log | 4.0 | 200 | 1.3103 | 0.5025 | | No log | 5.0 | 250 | 1.3063 | 0.5075 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.0 - Tokenizers 0.13.3
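An illustrative usage sketch (assuming the checkpoint is published under `Dewa/dog_emotion_v3_resnet`): given the roughly 0.51 evaluation accuracy reported above, predictions should be treated as rough estimates. The image path is a placeholder.

```python
from transformers import pipeline

# Hypothetical inference with the fine-tuned ResNet-50 dog-emotion classifier described above.
classifier = pipeline("image-classification", model="Dewa/dog_emotion_v3_resnet")

# "dog.jpg" is a placeholder photo of a dog.
print(classifier("dog.jpg", top_k=4))  # scores over: sad, angry, happy, relaxed
```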
[ "sad", "angry", "happy", "relaxed" ]
mpterradillos/beans_vit_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # beans_vit_model This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset. It achieves the following results on the evaluation set: - Loss: 0.0068 - Accuracy: 1.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.1356 | 3.85 | 500 | 0.0068 | 1.0 | ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.0 - Tokenizers 0.13.3
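A minimal sketch of how the classifier could be called (assuming the `mpterradillos/beans_vit_model` checkpoint is available on the Hub); the leaf photo path is a placeholder.

```python
from transformers import pipeline

classifier = pipeline("image-classification", model="mpterradillos/beans_vit_model")

# "leaf.jpg" is a placeholder photo of a bean leaf.
best = classifier("leaf.jpg", top_k=1)[0]
print(best["label"], round(best["score"], 3))  # angular_leaf_spot, bean_rust or healthy
```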
[ "angular_leaf_spot", "bean_rust", "healthy" ]
jordyvl/dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_kd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_kd This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.5815 - Accuracy: 0.8055 - Brier Loss: 0.2836 - Nll: 1.6135 - F1 Micro: 0.8055 - F1 Macro: 0.8061 - Ece: 0.0597 - Aurc: 0.0526 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 128 - eval_batch_size: 128 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 125 | 1.2844 | 0.5403 | 0.5889 | 3.0582 | 0.5403 | 0.5275 | 0.0742 | 0.2209 | | No log | 2.0 | 250 | 0.9687 | 0.655 | 0.4587 | 2.4358 | 0.655 | 0.6414 | 0.0559 | 0.1296 | | No log | 3.0 | 375 | 0.8401 | 0.7063 | 0.4019 | 2.2308 | 0.7063 | 0.7008 | 0.0588 | 0.0990 | | 1.234 | 4.0 | 500 | 0.8080 | 0.7145 | 0.3874 | 2.1628 | 0.7145 | 0.7163 | 0.0487 | 0.0951 | | 1.234 | 5.0 | 625 | 0.7772 | 0.7238 | 0.3755 | 2.0380 | 0.7237 | 0.7167 | 0.0421 | 0.0914 | | 1.234 | 6.0 | 750 | 0.7530 | 0.7498 | 0.3484 | 2.1346 | 0.7498 | 0.7464 | 0.0477 | 0.0774 | | 1.234 | 7.0 | 875 | 0.7034 | 0.7652 | 0.3267 | 2.0596 | 0.7652 | 0.7664 | 0.0467 | 0.0678 | | 0.3976 | 8.0 | 1000 | 0.7390 | 0.7715 | 0.3350 | 2.0568 | 0.7715 | 0.7704 | 0.0448 | 0.0763 | | 0.3976 | 9.0 | 1125 | 0.7019 | 0.7762 | 0.3209 | 2.0168 | 0.7762 | 0.7768 | 0.0556 | 0.0769 | | 0.3976 | 10.0 | 1250 | 0.7318 | 0.7668 | 0.3346 | 2.1148 | 0.7668 | 0.7699 | 0.0529 | 0.0792 | | 0.3976 | 11.0 | 1375 | 0.7083 | 0.7782 | 0.3213 | 2.0671 | 0.7782 | 0.7775 | 0.0452 | 0.0756 | | 0.1591 | 12.0 | 1500 | 0.7535 | 0.7668 | 0.3424 | 2.1407 | 0.7668 | 0.7636 | 0.0564 | 0.0845 | | 0.1591 | 13.0 | 1625 | 0.7117 | 0.775 | 0.3288 | 2.0935 | 0.775 | 0.7766 | 0.0525 | 0.0785 | | 0.1591 | 14.0 | 1750 | 0.6421 | 0.785 | 0.3039 | 1.9939 | 0.785 | 0.7860 | 0.0512 | 0.0643 | | 0.1591 | 15.0 | 1875 | 0.6475 | 0.7865 | 0.3050 | 1.9301 | 0.7865 | 0.7867 | 0.0552 | 0.0636 | | 0.1125 | 16.0 | 2000 | 0.6477 | 0.7893 | 0.3064 | 1.9442 | 0.7893 | 0.7920 | 0.0556 | 0.0684 | | 0.1125 | 17.0 | 2125 | 0.6509 | 0.7883 | 0.3113 | 1.8957 | 0.7883 | 0.7907 | 0.0498 | 0.0710 | | 0.1125 | 18.0 | 2250 | 0.6291 | 0.7925 | 0.3038 | 1.8697 | 0.7925 | 0.7963 | 0.0512 | 0.0677 | | 0.1125 | 19.0 | 2375 | 0.6279 | 0.7963 | 0.2992 | 1.8155 | 0.7963 | 0.7950 | 0.0478 | 0.0647 | | 0.095 | 20.0 | 2500 | 0.6246 | 0.7937 | 0.3008 | 1.7925 | 0.7937 | 0.7946 | 0.0595 | 0.0659 | | 0.095 | 21.0 | 2625 | 0.6149 | 0.7953 | 0.2962 | 1.8237 | 0.7953 | 0.7951 | 0.0547 | 0.0590 | | 0.095 | 22.0 | 2750 | 0.6196 | 0.7953 | 0.3000 | 1.8031 | 0.7953 | 0.7969 | 0.0567 | 0.0643 | | 0.095 | 23.0 | 2875 | 0.6023 | 0.798 | 0.2932 | 1.7663 | 0.798 | 0.7983 | 0.0497 | 0.0616 | | 0.0829 | 24.0 | 3000 | 0.6107 
| 0.7943 | 0.2951 | 1.7755 | 0.7943 | 0.7958 | 0.0564 | 0.0581 | | 0.0829 | 25.0 | 3125 | 0.5986 | 0.8015 | 0.2930 | 1.7243 | 0.8015 | 0.8027 | 0.0565 | 0.0574 | | 0.0829 | 26.0 | 3250 | 0.5899 | 0.8005 | 0.2886 | 1.7304 | 0.8005 | 0.8021 | 0.0546 | 0.0560 | | 0.0829 | 27.0 | 3375 | 0.5836 | 0.8023 | 0.2846 | 1.6865 | 0.8023 | 0.8024 | 0.0479 | 0.0561 | | 0.074 | 28.0 | 3500 | 0.5824 | 0.8047 | 0.2850 | 1.6817 | 0.8047 | 0.8060 | 0.0524 | 0.0559 | | 0.074 | 29.0 | 3625 | 0.5760 | 0.8063 | 0.2822 | 1.6505 | 0.8062 | 0.8065 | 0.0500 | 0.0546 | | 0.074 | 30.0 | 3750 | 0.5819 | 0.8065 | 0.2843 | 1.6667 | 0.8065 | 0.8079 | 0.0563 | 0.0544 | | 0.074 | 31.0 | 3875 | 0.5800 | 0.8045 | 0.2841 | 1.6658 | 0.8045 | 0.8059 | 0.0511 | 0.0548 | | 0.0668 | 32.0 | 4000 | 0.5828 | 0.8053 | 0.2841 | 1.6883 | 0.8053 | 0.8054 | 0.0559 | 0.0547 | | 0.0668 | 33.0 | 4125 | 0.5802 | 0.8037 | 0.2838 | 1.6669 | 0.8037 | 0.8038 | 0.0572 | 0.0545 | | 0.0668 | 34.0 | 4250 | 0.5772 | 0.8067 | 0.2821 | 1.6588 | 0.8067 | 0.8083 | 0.0520 | 0.0525 | | 0.0668 | 35.0 | 4375 | 0.5745 | 0.807 | 0.2812 | 1.6524 | 0.807 | 0.8072 | 0.0528 | 0.0528 | | 0.0631 | 36.0 | 4500 | 0.5770 | 0.8063 | 0.2826 | 1.6433 | 0.8062 | 0.8071 | 0.0559 | 0.0528 | | 0.0631 | 37.0 | 4625 | 0.5782 | 0.8007 | 0.2837 | 1.5953 | 0.8007 | 0.8021 | 0.0581 | 0.0541 | | 0.0631 | 38.0 | 4750 | 0.5780 | 0.8047 | 0.2829 | 1.6275 | 0.8047 | 0.8052 | 0.0540 | 0.0521 | | 0.0631 | 39.0 | 4875 | 0.5759 | 0.8055 | 0.2817 | 1.6162 | 0.8055 | 0.8065 | 0.0528 | 0.0529 | | 0.0612 | 40.0 | 5000 | 0.5770 | 0.8047 | 0.2825 | 1.6131 | 0.8047 | 0.8051 | 0.0575 | 0.0524 | | 0.0612 | 41.0 | 5125 | 0.5771 | 0.8043 | 0.2819 | 1.6015 | 0.8043 | 0.8048 | 0.0562 | 0.0519 | | 0.0612 | 42.0 | 5250 | 0.5776 | 0.8043 | 0.2825 | 1.6152 | 0.8043 | 0.8047 | 0.0566 | 0.0527 | | 0.0612 | 43.0 | 5375 | 0.5793 | 0.8057 | 0.2830 | 1.6196 | 0.8057 | 0.8065 | 0.0538 | 0.0527 | | 0.06 | 44.0 | 5500 | 0.5801 | 0.8053 | 0.2835 | 1.6183 | 0.8053 | 0.8060 | 0.0618 | 0.0527 | | 0.06 | 45.0 | 5625 | 0.5800 | 0.805 | 0.2831 | 1.6057 | 0.805 | 0.8055 | 0.0568 | 0.0530 | | 0.06 | 46.0 | 5750 | 0.5812 | 0.805 | 0.2836 | 1.6034 | 0.805 | 0.8056 | 0.0577 | 0.0529 | | 0.06 | 47.0 | 5875 | 0.5809 | 0.805 | 0.2834 | 1.6164 | 0.805 | 0.8056 | 0.0580 | 0.0526 | | 0.0593 | 48.0 | 6000 | 0.5810 | 0.8057 | 0.2834 | 1.6108 | 0.8057 | 0.8064 | 0.0617 | 0.0525 | | 0.0593 | 49.0 | 6125 | 0.5812 | 0.8053 | 0.2836 | 1.6140 | 0.8053 | 0.8058 | 0.0570 | 0.0527 | | 0.0593 | 50.0 | 6250 | 0.5815 | 0.8055 | 0.2836 | 1.6135 | 0.8055 | 0.8061 | 0.0597 | 0.0526 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
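Illustration only (not from the original card): assuming the distilled student is downloadable as `jordyvl/dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_kd`, it can be wrapped in a pipeline; `device=0` assumes a CUDA GPU, and the file name is a placeholder.

```python
from transformers import pipeline

repo_id = "jordyvl/dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_kd"

# device=0 assumes a CUDA GPU is available; drop the argument to run on CPU.
classifier = pipeline("image-classification", model=repo_id, device=0)

# Placeholder path to a scanned document page.
print(classifier("report_page.png", top_k=5))
```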
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/rvlcdip-tiny_rvl_cdip-NK1000_kd_NKD_t1.0_g1.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # rvlcdip-tiny_rvl_cdip-NK1000_kd_NKD_t1.0_g1.5 This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 5.4561 - Accuracy: 0.802 - Brier Loss: 0.3399 - Nll: 1.6335 - F1 Micro: 0.802 - F1 Macro: 0.8037 - Ece: 0.1478 - Aurc: 0.0576 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 128 - eval_batch_size: 128 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 125 | 5.6533 | 0.5288 | 0.6696 | 3.8280 | 0.5288 | 0.4993 | 0.2170 | 0.2294 | | No log | 2.0 | 250 | 5.3016 | 0.6285 | 0.5364 | 2.7651 | 0.6285 | 0.6089 | 0.1861 | 0.1465 | | No log | 3.0 | 375 | 5.1153 | 0.696 | 0.4775 | 2.4052 | 0.696 | 0.6956 | 0.2003 | 0.1078 | | 5.7003 | 4.0 | 500 | 4.9491 | 0.7358 | 0.3968 | 2.1532 | 0.7358 | 0.7375 | 0.1406 | 0.0813 | | 5.7003 | 5.0 | 625 | 4.8556 | 0.754 | 0.3676 | 1.8243 | 0.754 | 0.7472 | 0.1001 | 0.0756 | | 5.7003 | 6.0 | 750 | 4.8060 | 0.7625 | 0.3475 | 1.8558 | 0.7625 | 0.7636 | 0.0808 | 0.0696 | | 5.7003 | 7.0 | 875 | 4.8301 | 0.7648 | 0.3320 | 1.7367 | 0.7648 | 0.7663 | 0.0434 | 0.0677 | | 4.6459 | 8.0 | 1000 | 4.7883 | 0.7692 | 0.3305 | 1.8366 | 0.7692 | 0.7728 | 0.0532 | 0.0666 | | 4.6459 | 9.0 | 1125 | 4.8347 | 0.7762 | 0.3282 | 1.7122 | 0.7762 | 0.7789 | 0.0610 | 0.0675 | | 4.6459 | 10.0 | 1250 | 4.8679 | 0.7682 | 0.3338 | 1.8225 | 0.7682 | 0.7713 | 0.0634 | 0.0672 | | 4.6459 | 11.0 | 1375 | 4.9875 | 0.7655 | 0.3521 | 1.9651 | 0.7655 | 0.7647 | 0.0914 | 0.0692 | | 4.2436 | 12.0 | 1500 | 4.9708 | 0.77 | 0.3410 | 2.0195 | 0.7700 | 0.7694 | 0.0838 | 0.0684 | | 4.2436 | 13.0 | 1625 | 4.9246 | 0.7752 | 0.3349 | 1.8150 | 0.7752 | 0.7758 | 0.0801 | 0.0666 | | 4.2436 | 14.0 | 1750 | 4.9235 | 0.776 | 0.3327 | 1.8364 | 0.776 | 0.7782 | 0.0896 | 0.0628 | | 4.2436 | 15.0 | 1875 | 4.9149 | 0.7817 | 0.3348 | 1.9243 | 0.7817 | 0.7857 | 0.0917 | 0.0650 | | 4.0997 | 16.0 | 2000 | 4.8998 | 0.7837 | 0.3255 | 1.8326 | 0.7837 | 0.7874 | 0.0901 | 0.0637 | | 4.0997 | 17.0 | 2125 | 4.9658 | 0.7792 | 0.3358 | 1.8156 | 0.7792 | 0.7815 | 0.1025 | 0.0640 | | 4.0997 | 18.0 | 2250 | 4.9819 | 0.7905 | 0.3256 | 1.8605 | 0.7905 | 0.7919 | 0.1016 | 0.0613 | | 4.0997 | 19.0 | 2375 | 5.0040 | 0.778 | 0.3417 | 1.9392 | 0.778 | 0.7800 | 0.1095 | 0.0638 | | 4.0325 | 20.0 | 2500 | 5.0084 | 0.7817 | 0.3387 | 1.9882 | 0.7817 | 0.7833 | 0.1043 | 0.0642 | | 4.0325 | 21.0 | 2625 | 5.0680 | 0.7805 | 0.3473 | 1.8641 | 0.7805 | 0.7803 | 0.1200 | 0.0631 | | 4.0325 | 22.0 | 2750 | 5.0324 | 0.7808 | 0.3395 | 1.8541 | 0.7808 | 0.7835 | 0.1124 | 0.0620 | | 4.0325 | 23.0 | 2875 | 5.0734 | 0.7845 | 0.3446 | 1.9087 | 0.7845 | 0.7884 | 0.1170 | 0.0625 | | 3.99 | 24.0 | 3000 | 5.2144 | 
0.782 | 0.3564 | 1.9540 | 0.782 | 0.7845 | 0.1293 | 0.0640 | | 3.99 | 25.0 | 3125 | 5.0299 | 0.7873 | 0.3387 | 1.8106 | 0.7873 | 0.7887 | 0.1167 | 0.0614 | | 3.99 | 26.0 | 3250 | 5.0673 | 0.792 | 0.3318 | 1.7538 | 0.792 | 0.7930 | 0.1134 | 0.0599 | | 3.99 | 27.0 | 3375 | 5.0854 | 0.791 | 0.3379 | 1.8144 | 0.791 | 0.7932 | 0.1253 | 0.0586 | | 3.9606 | 28.0 | 3500 | 5.0962 | 0.787 | 0.3403 | 1.7780 | 0.787 | 0.7884 | 0.1224 | 0.0592 | | 3.9606 | 29.0 | 3625 | 5.0812 | 0.7877 | 0.3379 | 1.7721 | 0.7877 | 0.7900 | 0.1247 | 0.0592 | | 3.9606 | 30.0 | 3750 | 5.1318 | 0.7905 | 0.3359 | 1.8105 | 0.7905 | 0.7931 | 0.1290 | 0.0597 | | 3.9606 | 31.0 | 3875 | 5.0330 | 0.7953 | 0.3276 | 1.7361 | 0.7953 | 0.7978 | 0.1144 | 0.0584 | | 3.9355 | 32.0 | 4000 | 5.0843 | 0.7975 | 0.3276 | 1.7556 | 0.7975 | 0.7990 | 0.1236 | 0.0560 | | 3.9355 | 33.0 | 4125 | 5.1843 | 0.7995 | 0.3315 | 1.7084 | 0.7995 | 0.8004 | 0.1297 | 0.0575 | | 3.9355 | 34.0 | 4250 | 5.1703 | 0.7987 | 0.3333 | 1.6918 | 0.7987 | 0.8000 | 0.1257 | 0.0580 | | 3.9355 | 35.0 | 4375 | 5.1933 | 0.7937 | 0.3372 | 1.7084 | 0.7937 | 0.7941 | 0.1307 | 0.0561 | | 3.9148 | 36.0 | 4500 | 5.1404 | 0.7987 | 0.3275 | 1.6423 | 0.7987 | 0.8011 | 0.1308 | 0.0547 | | 3.9148 | 37.0 | 4625 | 5.1734 | 0.8017 | 0.3272 | 1.6836 | 0.8017 | 0.8034 | 0.1272 | 0.0572 | | 3.9148 | 38.0 | 4750 | 5.2479 | 0.802 | 0.3322 | 1.7081 | 0.802 | 0.8032 | 0.1353 | 0.0550 | | 3.9148 | 39.0 | 4875 | 5.1921 | 0.8 | 0.3320 | 1.6554 | 0.8000 | 0.8012 | 0.1334 | 0.0538 | | 3.9001 | 40.0 | 5000 | 5.2477 | 0.801 | 0.3353 | 1.6333 | 0.801 | 0.8022 | 0.1390 | 0.0539 | | 3.9001 | 41.0 | 5125 | 5.2140 | 0.801 | 0.3299 | 1.6370 | 0.801 | 0.8017 | 0.1340 | 0.0544 | | 3.9001 | 42.0 | 5250 | 5.2660 | 0.807 | 0.3303 | 1.6090 | 0.807 | 0.8079 | 0.1339 | 0.0545 | | 3.9001 | 43.0 | 5375 | 5.2884 | 0.8007 | 0.3319 | 1.6816 | 0.8007 | 0.8022 | 0.1394 | 0.0547 | | 3.8892 | 44.0 | 5500 | 5.3358 | 0.804 | 0.3352 | 1.6399 | 0.804 | 0.8049 | 0.1387 | 0.0560 | | 3.8892 | 45.0 | 5625 | 5.3545 | 0.8043 | 0.3349 | 1.6445 | 0.8043 | 0.8060 | 0.1408 | 0.0555 | | 3.8892 | 46.0 | 5750 | 5.4026 | 0.8033 | 0.3373 | 1.6493 | 0.8033 | 0.8049 | 0.1439 | 0.0567 | | 3.8892 | 47.0 | 5875 | 5.4195 | 0.8015 | 0.3386 | 1.6393 | 0.8015 | 0.8031 | 0.1468 | 0.0570 | | 3.8834 | 48.0 | 6000 | 5.4409 | 0.803 | 0.3396 | 1.6392 | 0.803 | 0.8046 | 0.1458 | 0.0574 | | 3.8834 | 49.0 | 6125 | 5.4501 | 0.8023 | 0.3395 | 1.6367 | 0.8023 | 0.8039 | 0.1468 | 0.0574 | | 3.8834 | 50.0 | 6250 | 5.4561 | 0.802 | 0.3399 | 1.6335 | 0.802 | 0.8037 | 0.1478 | 0.0576 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
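A hedged sketch of single-image inference (assuming the `jordyvl/rvlcdip-tiny_rvl_cdip-NK1000_kd_NKD_t1.0_g1.5` weights are on the Hub); the reported ECE of roughly 0.15 suggests the raw confidence printed below may be over-optimistic, and the input path is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "jordyvl/rvlcdip-tiny_rvl_cdip-NK1000_kd_NKD_t1.0_g1.5"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id).eval()

# "form.png" is a placeholder document image.
inputs = processor(images=Image.open("form.png").convert("RGB"), return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(-1)

confidence, idx = probs.max(-1)
print(model.config.id2label[idx.item()], f"confidence={confidence.item():.3f}")
```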
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
tommilyjones/swin-tiny-patch4-window7-224-cats_dogs
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-cats_dogs This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0126 - Accuracy: 0.9973 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0832 | 0.98 | 47 | 0.0235 | 0.9909 | | 0.0788 | 1.99 | 95 | 0.0126 | 0.9973 | | 0.0534 | 2.95 | 141 | 0.0127 | 0.9957 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu117 - Datasets 2.13.1 - Tokenizers 0.13.3
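For illustration (assuming the checkpoint is hosted as `tommilyjones/swin-tiny-patch4-window7-224-cats_dogs`); the pet photo path is a placeholder.

```python
from transformers import pipeline

# Hypothetical use of the fine-tuned Swin cats-vs-dogs classifier described above.
classifier = pipeline(
    "image-classification",
    model="tommilyjones/swin-tiny-patch4-window7-224-cats_dogs",
)

# "pet.jpg" is a placeholder image; expected labels are "cat" or "dog".
print(classifier("pet.jpg"))
```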
[ "cat", "dog" ]
jordyvl/dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_hint
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_hint This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 2.7080 - Accuracy: 0.8275 - Brier Loss: 0.3142 - Nll: 2.0399 - F1 Micro: 0.8275 - F1 Macro: 0.8270 - Ece: 0.1526 - Aurc: 0.0520 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 3.111 | 1.0 | 1000 | 2.9416 | 0.5917 | 0.5262 | 2.5737 | 0.5917 | 0.5835 | 0.0528 | 0.1821 | | 2.5832 | 2.0 | 2000 | 2.4518 | 0.6917 | 0.4147 | 2.1569 | 0.6917 | 0.6919 | 0.0508 | 0.1107 | | 2.2618 | 3.0 | 3000 | 2.2194 | 0.7418 | 0.3548 | 2.0905 | 0.7418 | 0.7417 | 0.0384 | 0.0775 | | 2.0277 | 4.0 | 4000 | 2.1469 | 0.7575 | 0.3418 | 2.0638 | 0.7575 | 0.7547 | 0.0661 | 0.0729 | | 1.9024 | 5.0 | 5000 | 2.1380 | 0.7355 | 0.3703 | 2.0583 | 0.7355 | 0.7365 | 0.0619 | 0.0880 | | 1.7315 | 6.0 | 6000 | 2.0423 | 0.7508 | 0.3495 | 2.0467 | 0.7508 | 0.7566 | 0.0631 | 0.0752 | | 1.5844 | 7.0 | 7000 | 2.0832 | 0.7628 | 0.3382 | 2.1301 | 0.7628 | 0.7651 | 0.0953 | 0.0689 | | 1.4761 | 8.0 | 8000 | 2.2224 | 0.773 | 0.3548 | 2.1347 | 0.7730 | 0.7734 | 0.1284 | 0.0708 | | 1.3852 | 9.0 | 9000 | 2.2341 | 0.7853 | 0.3452 | 2.0905 | 0.7853 | 0.7874 | 0.1349 | 0.0614 | | 1.3234 | 10.0 | 10000 | 2.3403 | 0.778 | 0.3614 | 2.1125 | 0.778 | 0.7797 | 0.1530 | 0.0649 | | 1.2546 | 11.0 | 11000 | 2.4153 | 0.7768 | 0.3675 | 2.1438 | 0.7768 | 0.7772 | 0.1601 | 0.0649 | | 1.2161 | 12.0 | 12000 | 2.5661 | 0.7742 | 0.3810 | 2.1581 | 0.7742 | 0.7752 | 0.1715 | 0.0669 | | 1.1611 | 13.0 | 13000 | 2.5638 | 0.789 | 0.3616 | 2.0957 | 0.7890 | 0.7888 | 0.1648 | 0.0595 | | 1.1349 | 14.0 | 14000 | 2.6037 | 0.7957 | 0.3569 | 2.1299 | 0.7957 | 0.7963 | 0.1641 | 0.0578 | | 1.1043 | 15.0 | 15000 | 2.6763 | 0.7817 | 0.3786 | 2.1078 | 0.7817 | 0.7855 | 0.1755 | 0.0680 | | 1.0768 | 16.0 | 16000 | 2.6931 | 0.792 | 0.3636 | 2.1056 | 0.792 | 0.7942 | 0.1679 | 0.0601 | | 1.0675 | 17.0 | 17000 | 2.6384 | 0.7957 | 0.3549 | 2.1658 | 0.7957 | 0.7941 | 0.1651 | 0.0570 | | 1.0387 | 18.0 | 18000 | 2.8320 | 0.7825 | 0.3899 | 2.1964 | 0.7825 | 0.7804 | 0.1804 | 0.0706 | | 1.035 | 19.0 | 19000 | 2.7127 | 0.7947 | 0.3641 | 2.0771 | 0.7947 | 0.7981 | 0.1741 | 0.0607 | | 1.0053 | 20.0 | 20000 | 2.7164 | 0.8035 | 0.3508 | 2.0693 | 0.8035 | 0.8017 | 0.1638 | 0.0594 | | 0.9783 | 21.0 | 21000 | 2.7162 | 0.8085 | 0.3475 | 2.0165 | 0.8085 | 0.8080 | 0.1622 | 0.0601 | | 0.9606 | 22.0 | 22000 | 2.7740 | 0.804 | 0.3505 | 2.0738 | 0.804 | 0.8057 | 0.1678 | 0.0585 | | 0.9579 | 23.0 | 23000 | 2.7597 | 0.803 | 0.3544 | 2.0507 | 0.803 | 0.8038 | 0.1668 | 0.0600 | | 
0.9439 | 24.0 | 24000 | 2.7108 | 0.809 | 0.3407 | 2.0218 | 0.809 | 0.8099 | 0.1626 | 0.0574 | | 0.9247 | 25.0 | 25000 | 2.6918 | 0.8125 | 0.3355 | 2.0449 | 0.8125 | 0.8114 | 0.1580 | 0.0549 | | 0.9275 | 26.0 | 26000 | 2.6996 | 0.8163 | 0.3316 | 2.0140 | 0.8163 | 0.8159 | 0.1585 | 0.0582 | | 0.914 | 27.0 | 27000 | 2.7846 | 0.8113 | 0.3389 | 2.0190 | 0.8113 | 0.8110 | 0.1626 | 0.0598 | | 0.9036 | 28.0 | 28000 | 2.7436 | 0.817 | 0.3341 | 2.0702 | 0.817 | 0.8166 | 0.1587 | 0.0564 | | 0.893 | 29.0 | 29000 | 2.7354 | 0.8197 | 0.3272 | 2.0581 | 0.8197 | 0.8207 | 0.1551 | 0.0588 | | 0.8815 | 30.0 | 30000 | 2.8377 | 0.813 | 0.3414 | 2.1163 | 0.813 | 0.8149 | 0.1630 | 0.0614 | | 0.8688 | 31.0 | 31000 | 2.7815 | 0.8207 | 0.3310 | 2.0502 | 0.8207 | 0.8205 | 0.1576 | 0.0554 | | 0.8727 | 32.0 | 32000 | 2.7370 | 0.82 | 0.3292 | 2.1149 | 0.82 | 0.8193 | 0.1563 | 0.0545 | | 0.8581 | 33.0 | 33000 | 2.8168 | 0.812 | 0.3443 | 2.0026 | 0.8120 | 0.8146 | 0.1658 | 0.0594 | | 0.8504 | 34.0 | 34000 | 2.7660 | 0.8173 | 0.3321 | 2.0497 | 0.8173 | 0.8181 | 0.1597 | 0.0556 | | 0.8563 | 35.0 | 35000 | 2.8457 | 0.8097 | 0.3442 | 2.0815 | 0.8097 | 0.8107 | 0.1669 | 0.0592 | | 0.8415 | 36.0 | 36000 | 2.7366 | 0.8245 | 0.3179 | 2.0282 | 0.8245 | 0.8251 | 0.1511 | 0.0566 | | 0.8372 | 37.0 | 37000 | 2.7731 | 0.821 | 0.3249 | 2.1084 | 0.821 | 0.8198 | 0.1563 | 0.0546 | | 0.8406 | 38.0 | 38000 | 2.6948 | 0.8283 | 0.3131 | 2.0343 | 0.8283 | 0.8281 | 0.1493 | 0.0533 | | 0.831 | 39.0 | 39000 | 2.7781 | 0.827 | 0.3192 | 2.0592 | 0.827 | 0.8270 | 0.1534 | 0.0544 | | 0.8223 | 40.0 | 40000 | 2.7811 | 0.8267 | 0.3161 | 2.0946 | 0.8267 | 0.8271 | 0.1512 | 0.0570 | | 0.8258 | 41.0 | 41000 | 2.6993 | 0.827 | 0.3138 | 2.0347 | 0.827 | 0.8271 | 0.1507 | 0.0531 | | 0.8209 | 42.0 | 42000 | 2.7467 | 0.828 | 0.3197 | 2.0159 | 0.828 | 0.8279 | 0.1530 | 0.0541 | | 0.8146 | 43.0 | 43000 | 2.7050 | 0.8257 | 0.3159 | 2.0518 | 0.8257 | 0.8249 | 0.1526 | 0.0523 | | 0.8161 | 44.0 | 44000 | 2.6919 | 0.8257 | 0.3160 | 1.9889 | 0.8257 | 0.8255 | 0.1515 | 0.0530 | | 0.8121 | 45.0 | 45000 | 2.7314 | 0.8235 | 0.3210 | 2.0259 | 0.8235 | 0.8244 | 0.1542 | 0.0537 | | 0.809 | 46.0 | 46000 | 2.7203 | 0.8275 | 0.3146 | 2.0431 | 0.8275 | 0.8272 | 0.1526 | 0.0514 | | 0.8091 | 47.0 | 47000 | 2.7174 | 0.826 | 0.3176 | 2.0313 | 0.826 | 0.8253 | 0.1534 | 0.0527 | | 0.8073 | 48.0 | 48000 | 2.7058 | 0.8277 | 0.3130 | 2.0258 | 0.8277 | 0.8272 | 0.1515 | 0.0519 | | 0.8073 | 49.0 | 49000 | 2.7065 | 0.827 | 0.3146 | 2.0301 | 0.827 | 0.8266 | 0.1528 | 0.0523 | | 0.8069 | 50.0 | 50000 | 2.7080 | 0.8275 | 0.3142 | 2.0399 | 0.8275 | 0.8270 | 0.1526 | 0.0520 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
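A small batch-processing sketch (assuming the `jordyvl/dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_hint` checkpoint can be pulled from the Hub); the `scans/` directory is a placeholder.

```python
from pathlib import Path
from transformers import pipeline

repo_id = "jordyvl/dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_hint"
classifier = pipeline("image-classification", model=repo_id)

# "scans/" is a placeholder directory of document page images.
for path in sorted(Path("scans").glob("*.png")):
    top = classifier(str(path), top_k=1)[0]
    print(path.name, "->", top["label"])
```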
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_simkd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_simkd This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.0914 - Accuracy: 0.8157 - Brier Loss: 0.5152 - Nll: 1.5529 - F1 Micro: 0.8157 - F1 Macro: 0.8167 - Ece: 0.4370 - Aurc: 0.1010 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 0.149 | 1.0 | 1000 | 0.1482 | 0.1598 | 0.9334 | 6.4910 | 0.1598 | 0.1128 | 0.0991 | 0.7250 | | 0.1278 | 2.0 | 2000 | 0.1248 | 0.5295 | 0.8391 | 3.1275 | 0.5295 | 0.4951 | 0.4011 | 0.2355 | | 0.1156 | 3.0 | 3000 | 0.1146 | 0.6388 | 0.7532 | 2.4008 | 0.6388 | 0.6238 | 0.4480 | 0.1552 | | 0.1094 | 4.0 | 4000 | 0.1089 | 0.7085 | 0.7069 | 2.1449 | 0.7085 | 0.7039 | 0.4850 | 0.1225 | | 0.105 | 5.0 | 5000 | 0.1063 | 0.7252 | 0.6970 | 2.0868 | 0.7252 | 0.7291 | 0.4969 | 0.1087 | | 0.1011 | 6.0 | 6000 | 0.1038 | 0.7472 | 0.6562 | 2.1759 | 0.7472 | 0.7478 | 0.4844 | 0.1095 | | 0.097 | 7.0 | 7000 | 0.1043 | 0.7415 | 0.6509 | 2.5078 | 0.7415 | 0.7404 | 0.4668 | 0.1364 | | 0.0946 | 8.0 | 8000 | 0.1022 | 0.7508 | 0.6416 | 2.1930 | 0.7508 | 0.7550 | 0.4714 | 0.1300 | | 0.0916 | 9.0 | 9000 | 0.0995 | 0.7642 | 0.6271 | 1.9398 | 0.7642 | 0.7691 | 0.4791 | 0.1086 | | 0.0901 | 10.0 | 10000 | 0.1013 | 0.747 | 0.6277 | 2.3724 | 0.747 | 0.7538 | 0.4538 | 0.1227 | | 0.0881 | 11.0 | 11000 | 0.0991 | 0.7752 | 0.6037 | 1.9848 | 0.7752 | 0.7784 | 0.4696 | 0.1054 | | 0.0868 | 12.0 | 12000 | 0.0983 | 0.7738 | 0.6074 | 2.0011 | 0.7738 | 0.7757 | 0.4741 | 0.0996 | | 0.0855 | 13.0 | 13000 | 0.0977 | 0.7833 | 0.5864 | 1.9790 | 0.7833 | 0.7868 | 0.4633 | 0.1068 | | 0.0845 | 14.0 | 14000 | 0.0986 | 0.782 | 0.5928 | 2.0415 | 0.782 | 0.7847 | 0.4645 | 0.1158 | | 0.083 | 15.0 | 15000 | 0.0974 | 0.78 | 0.5793 | 2.0235 | 0.78 | 0.7857 | 0.4455 | 0.1243 | | 0.0821 | 16.0 | 16000 | 0.0975 | 0.7823 | 0.5776 | 2.0363 | 0.7823 | 0.7859 | 0.4462 | 0.1238 | | 0.0811 | 17.0 | 17000 | 0.0962 | 0.7883 | 0.5667 | 2.0085 | 0.7883 | 0.7907 | 0.4474 | 0.1108 | | 0.0803 | 18.0 | 18000 | 0.0969 | 0.7833 | 0.5720 | 2.0028 | 0.7833 | 0.7840 | 0.4421 | 0.1276 | | 0.0801 | 19.0 | 19000 | 0.0962 | 0.7823 | 0.5727 | 1.9412 | 0.7823 | 0.7847 | 0.4447 | 0.1182 | | 0.0794 | 20.0 | 20000 | 0.0961 | 0.7847 | 0.5681 | 1.9442 | 0.7847 | 0.7851 | 0.4449 | 0.1121 | | 0.0786 | 21.0 | 21000 | 0.0993 | 0.7612 | 0.5748 | 2.2878 | 0.7612 | 0.7627 | 0.4088 | 0.1494 | | 0.0776 | 22.0 | 22000 | 0.0947 | 0.797 | 0.5491 | 1.8933 | 0.797 | 0.7986 | 0.4379 | 0.1211 | | 0.0771 | 23.0 | 23000 | 0.0955 | 0.7893 | 0.5564 | 1.8974 | 0.7893 | 0.7918 | 0.4391 | 0.1124 | | 
0.0772 | 24.0 | 24000 | 0.0956 | 0.788 | 0.5524 | 1.9541 | 0.788 | 0.7898 | 0.4309 | 0.1166 | | 0.0768 | 25.0 | 25000 | 0.0970 | 0.7748 | 0.5568 | 2.0627 | 0.7748 | 0.7776 | 0.4152 | 0.1264 | | 0.0765 | 26.0 | 26000 | 0.0939 | 0.7975 | 0.5448 | 1.7874 | 0.7975 | 0.7996 | 0.4397 | 0.1086 | | 0.0759 | 27.0 | 27000 | 0.0944 | 0.797 | 0.5425 | 1.8354 | 0.797 | 0.7982 | 0.4328 | 0.1185 | | 0.0755 | 28.0 | 28000 | 0.0938 | 0.7993 | 0.5399 | 1.6911 | 0.7993 | 0.7993 | 0.4391 | 0.1025 | | 0.0754 | 29.0 | 29000 | 0.0945 | 0.797 | 0.5387 | 1.8083 | 0.797 | 0.7980 | 0.4323 | 0.1117 | | 0.075 | 30.0 | 30000 | 0.0941 | 0.8005 | 0.5353 | 1.7803 | 0.8005 | 0.8020 | 0.4318 | 0.1128 | | 0.0745 | 31.0 | 31000 | 0.0928 | 0.805 | 0.5282 | 1.6621 | 0.805 | 0.8070 | 0.4338 | 0.1107 | | 0.0747 | 32.0 | 32000 | 0.0935 | 0.806 | 0.5316 | 1.6745 | 0.806 | 0.8066 | 0.4368 | 0.1111 | | 0.0743 | 33.0 | 33000 | 0.0928 | 0.8095 | 0.5288 | 1.7115 | 0.8095 | 0.8096 | 0.4401 | 0.1045 | | 0.074 | 34.0 | 34000 | 0.0927 | 0.8063 | 0.5286 | 1.6801 | 0.8062 | 0.8064 | 0.4378 | 0.1001 | | 0.0734 | 35.0 | 35000 | 0.0925 | 0.8083 | 0.5260 | 1.6524 | 0.8083 | 0.8102 | 0.4364 | 0.1066 | | 0.0734 | 36.0 | 36000 | 0.0924 | 0.8087 | 0.5252 | 1.6727 | 0.8087 | 0.8106 | 0.4352 | 0.1077 | | 0.0733 | 37.0 | 37000 | 0.0920 | 0.8133 | 0.5215 | 1.6062 | 0.8133 | 0.8147 | 0.4399 | 0.1000 | | 0.0733 | 38.0 | 38000 | 0.0924 | 0.8083 | 0.5243 | 1.6319 | 0.8083 | 0.8100 | 0.4343 | 0.1063 | | 0.0732 | 39.0 | 39000 | 0.0921 | 0.8105 | 0.5222 | 1.5823 | 0.8105 | 0.8106 | 0.4363 | 0.1034 | | 0.073 | 40.0 | 40000 | 0.0917 | 0.8157 | 0.5203 | 1.5771 | 0.8157 | 0.8163 | 0.4414 | 0.1014 | | 0.0728 | 41.0 | 41000 | 0.0916 | 0.8153 | 0.5192 | 1.5726 | 0.8153 | 0.8163 | 0.4395 | 0.1033 | | 0.0729 | 42.0 | 42000 | 0.0916 | 0.8133 | 0.5188 | 1.5495 | 0.8133 | 0.8145 | 0.4392 | 0.1026 | | 0.0726 | 43.0 | 43000 | 0.0917 | 0.816 | 0.5185 | 1.5969 | 0.816 | 0.8169 | 0.4395 | 0.1054 | | 0.0728 | 44.0 | 44000 | 0.0914 | 0.8163 | 0.5164 | 1.5257 | 0.8163 | 0.8167 | 0.4388 | 0.1023 | | 0.0725 | 45.0 | 45000 | 0.0914 | 0.8153 | 0.5165 | 1.5699 | 0.8153 | 0.8161 | 0.4386 | 0.1012 | | 0.0723 | 46.0 | 46000 | 0.0915 | 0.816 | 0.5160 | 1.5653 | 0.816 | 0.8171 | 0.4386 | 0.1008 | | 0.0723 | 47.0 | 47000 | 0.0914 | 0.8155 | 0.5159 | 1.5478 | 0.8155 | 0.8165 | 0.4380 | 0.0997 | | 0.0721 | 48.0 | 48000 | 0.0914 | 0.816 | 0.5156 | 1.5579 | 0.816 | 0.8169 | 0.4379 | 0.1006 | | 0.0725 | 49.0 | 49000 | 0.0914 | 0.8155 | 0.5153 | 1.5636 | 0.8155 | 0.8165 | 0.4369 | 0.1009 | | 0.0721 | 50.0 | 50000 | 0.0914 | 0.8157 | 0.5152 | 1.5529 | 0.8157 | 0.8167 | 0.4370 | 0.1010 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
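As a quick, illustrative sanity check (assuming the `jordyvl/dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_simkd` weights are published on the Hub), the distilled ViT-tiny student can be loaded and its size and label mapping inspected before use.

```python
from transformers import AutoModelForImageClassification

repo_id = "jordyvl/dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_simkd"
model = AutoModelForImageClassification.from_pretrained(repo_id)

# Rough size check on the distilled ViT-tiny student.
n_params = sum(p.numel() for p in model.parameters())
print(f"parameters: {n_params / 1e6:.1f}M")
print(model.config.id2label)  # the 16 RVL-CDIP classes listed below
```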
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_og_simkd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_og_simkd This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 12216.4121 - Accuracy: 0.8337 - Brier Loss: 0.3073 - Nll: 2.1945 - F1 Micro: 0.8337 - F1 Macro: 0.8337 - Ece: 0.1506 - Aurc: 0.0535 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 12731.392 | 1.0 | 1000 | 12479.0312 | 0.5323 | 0.5967 | 3.1288 | 0.5323 | 0.4700 | 0.1047 | 0.2100 | | 12715.18 | 2.0 | 2000 | 12453.9434 | 0.5787 | 0.6261 | 3.4535 | 0.5787 | 0.5518 | 0.1953 | 0.2225 | | 12672.101 | 3.0 | 3000 | 12456.4629 | 0.6723 | 0.4708 | 2.9701 | 0.6723 | 0.6487 | 0.1102 | 0.1171 | | 12681.216 | 4.0 | 4000 | 12448.6143 | 0.6815 | 0.4741 | 2.8370 | 0.6815 | 0.6707 | 0.1250 | 0.1245 | | 12638.181 | 5.0 | 5000 | 12443.0645 | 0.716 | 0.4178 | 2.7837 | 0.7160 | 0.7273 | 0.1095 | 0.0932 | | 12802.019 | 6.0 | 6000 | 12432.3438 | 0.7468 | 0.3768 | 2.6575 | 0.7468 | 0.7537 | 0.0898 | 0.0800 | | 12671.194 | 7.0 | 7000 | 12420.4395 | 0.7335 | 0.4163 | 2.8018 | 0.7335 | 0.7331 | 0.1256 | 0.1139 | | 12705.783 | 8.0 | 8000 | 12410.6143 | 0.7598 | 0.3739 | 2.7562 | 0.7598 | 0.7634 | 0.1157 | 0.0729 | | 12559.44 | 9.0 | 9000 | 12412.9453 | 0.7612 | 0.3793 | 2.8905 | 0.7612 | 0.7658 | 0.1232 | 0.0731 | | 12618.725 | 10.0 | 10000 | 12390.2139 | 0.7722 | 0.3613 | 2.6477 | 0.7722 | 0.7753 | 0.1190 | 0.0750 | | 12731.292 | 11.0 | 11000 | 12387.4863 | 0.7875 | 0.3427 | 2.5376 | 0.7875 | 0.7898 | 0.1154 | 0.0689 | | 12705.794 | 12.0 | 12000 | 12379.9336 | 0.7805 | 0.3555 | 2.6072 | 0.7805 | 0.7813 | 0.1284 | 0.0681 | | 12550.782 | 13.0 | 13000 | 12380.7959 | 0.787 | 0.3400 | 2.5381 | 0.787 | 0.7887 | 0.1162 | 0.0662 | | 12670.568 | 14.0 | 14000 | 12376.7646 | 0.7867 | 0.3423 | 2.6149 | 0.7868 | 0.7925 | 0.1186 | 0.0574 | | 12580.616 | 15.0 | 15000 | 12352.3135 | 0.7953 | 0.3468 | 2.6324 | 0.7953 | 0.7969 | 0.1382 | 0.0622 | | 12723.865 | 16.0 | 16000 | 12345.4600 | 0.8015 | 0.3312 | 2.4793 | 0.8015 | 0.8034 | 0.1244 | 0.0601 | | 12620.305 | 17.0 | 17000 | 12343.1553 | 0.8023 | 0.3424 | 2.6488 | 0.8023 | 0.8031 | 0.1420 | 0.0644 | | 12668.087 | 18.0 | 18000 | 12336.9277 | 0.8 | 0.3455 | 2.7019 | 0.8000 | 0.8023 | 0.1401 | 0.0592 | | 12654.687 | 19.0 | 19000 | 12332.4404 | 0.8075 | 0.3321 | 2.5589 | 0.8075 | 0.8094 | 0.1393 | 0.0556 | | 12578.655 | 20.0 | 20000 | 12321.3037 | 0.8075 | 0.3395 | 2.4255 | 0.8075 | 0.8050 | 0.1484 | 0.0638 | | 12525.448 | 21.0 | 21000 | 12315.5303 | 0.8067 | 0.3328 | 2.5264 | 0.8067 | 0.8066 | 0.1440 | 0.0548 | | 12610.837 | 22.0 | 22000 | 12311.0215 | 
0.8105 | 0.3291 | 2.4781 | 0.8105 | 0.8112 | 0.1445 | 0.0540 | | 12494.528 | 23.0 | 23000 | 12303.3623 | 0.8145 | 0.3337 | 2.5535 | 0.8145 | 0.8154 | 0.1510 | 0.0561 | | 12561.799 | 24.0 | 24000 | 12296.2363 | 0.8153 | 0.3246 | 2.4243 | 0.8153 | 0.8142 | 0.1475 | 0.0513 | | 12580.176 | 25.0 | 25000 | 12291.8018 | 0.8193 | 0.3262 | 2.3932 | 0.8193 | 0.8174 | 0.1484 | 0.0550 | | 12455.165 | 26.0 | 26000 | 12276.9355 | 0.826 | 0.3223 | 2.4710 | 0.826 | 0.8251 | 0.1507 | 0.0597 | | 12528.496 | 27.0 | 27000 | 12280.9180 | 0.8257 | 0.3154 | 2.4010 | 0.8257 | 0.8260 | 0.1462 | 0.0524 | | 12521.554 | 28.0 | 28000 | 12262.9600 | 0.821 | 0.3274 | 2.4721 | 0.821 | 0.8201 | 0.1560 | 0.0595 | | 12557.871 | 29.0 | 29000 | 12260.7754 | 0.823 | 0.3217 | 2.3929 | 0.823 | 0.8226 | 0.1552 | 0.0551 | | 12535.524 | 30.0 | 30000 | 12271.4717 | 0.8263 | 0.3183 | 2.3249 | 0.8263 | 0.8269 | 0.1503 | 0.0502 | | 12488.263 | 31.0 | 31000 | 12259.3057 | 0.823 | 0.3219 | 2.3830 | 0.823 | 0.8226 | 0.1541 | 0.0528 | | 12498.048 | 32.0 | 32000 | 12253.2412 | 0.8263 | 0.3174 | 2.2771 | 0.8263 | 0.8243 | 0.1527 | 0.0541 | | 12465.825 | 33.0 | 33000 | 12257.4863 | 0.8323 | 0.3088 | 2.3466 | 0.8323 | 0.8319 | 0.1454 | 0.0500 | | 12439.6 | 34.0 | 34000 | 12238.5957 | 0.8323 | 0.3093 | 2.4057 | 0.8323 | 0.8329 | 0.1482 | 0.0552 | | 12407.423 | 35.0 | 35000 | 12250.7178 | 0.8335 | 0.3072 | 2.2532 | 0.8335 | 0.8336 | 0.1471 | 0.0521 | | 12534.711 | 36.0 | 36000 | 12231.9902 | 0.8353 | 0.3032 | 2.2711 | 0.8353 | 0.8353 | 0.1464 | 0.0548 | | 12458.666 | 37.0 | 37000 | 12232.9521 | 0.835 | 0.3041 | 2.2523 | 0.835 | 0.8352 | 0.1467 | 0.0539 | | 12461.748 | 38.0 | 38000 | 12230.4639 | 0.8317 | 0.3096 | 2.3052 | 0.8317 | 0.8318 | 0.1512 | 0.0539 | | 12434.679 | 39.0 | 39000 | 12229.0684 | 0.8317 | 0.3081 | 2.2172 | 0.8317 | 0.8317 | 0.1497 | 0.0547 | | 12468.468 | 40.0 | 40000 | 12226.4775 | 0.8323 | 0.3096 | 2.3112 | 0.8323 | 0.8324 | 0.1509 | 0.0524 | | 12540.176 | 41.0 | 41000 | 12213.8359 | 0.8357 | 0.3085 | 2.2929 | 0.8357 | 0.8356 | 0.1502 | 0.0541 | | 12513.896 | 42.0 | 42000 | 12216.2480 | 0.8333 | 0.3096 | 2.1638 | 0.8333 | 0.8329 | 0.1501 | 0.0559 | | 12406.31 | 43.0 | 43000 | 12213.7012 | 0.8347 | 0.3078 | 2.1971 | 0.8347 | 0.8345 | 0.1504 | 0.0542 | | 12350.768 | 44.0 | 44000 | 12224.6738 | 0.8323 | 0.3086 | 2.1722 | 0.8323 | 0.8320 | 0.1514 | 0.0546 | | 12394.478 | 45.0 | 45000 | 12221.9336 | 0.8325 | 0.3100 | 2.2464 | 0.8325 | 0.8323 | 0.1516 | 0.0536 | | 12399.318 | 46.0 | 46000 | 12207.5957 | 0.8347 | 0.3089 | 2.2193 | 0.8347 | 0.8344 | 0.1517 | 0.0553 | | 12476.218 | 47.0 | 47000 | 12213.4814 | 0.8353 | 0.3055 | 2.2084 | 0.8353 | 0.8353 | 0.1488 | 0.0532 | | 12448.278 | 48.0 | 48000 | 12212.2119 | 0.8347 | 0.3058 | 2.1518 | 0.8347 | 0.8345 | 0.1492 | 0.0545 | | 12486.848 | 49.0 | 49000 | 12210.5742 | 0.8347 | 0.3062 | 2.2778 | 0.8347 | 0.8345 | 0.1498 | 0.0546 | | 12376.327 | 50.0 | 50000 | 12216.4121 | 0.8337 | 0.3073 | 2.1945 | 0.8337 | 0.8337 | 0.1506 | 0.0535 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
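As a rough guide to reproducing the configuration listed above, the sketch below expresses those hyperparameters as `transformers.TrainingArguments`. It mirrors only what the card states; the custom distillation objective implied by the model name is not shown, and `output_dir` is an arbitrary choice.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed in the card; not the author's script.
training_args = TrainingArguments(
    output_dir="dit-base-finetuned-rvlcdip-tiny_rvl_cdip-NK1000_og_simkd",
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
```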
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
phunc20/swin-tiny-patch4-window7-224-finetuned-wuhan
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-wuhan This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.7953 - Accuracy: 0.4 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 20 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 3 | 0.7953 | 0.4 | | No log | 2.0 | 6 | 0.9477 | 0.4 | | No log | 3.0 | 9 | 1.0106 | 0.4 | | 0.5883 | 4.0 | 12 | 1.4170 | 0.4 | | 0.5883 | 5.0 | 15 | 1.7436 | 0.4 | | 0.5883 | 6.0 | 18 | 2.5380 | 0.4 | | 0.241 | 7.0 | 21 | 3.8803 | 0.4 | | 0.241 | 8.0 | 24 | 2.4040 | 0.2222 | | 0.241 | 9.0 | 27 | 3.9968 | 0.4 | | 0.125 | 10.0 | 30 | 3.2731 | 0.4 | | 0.125 | 11.0 | 33 | 3.2202 | 0.2222 | | 0.125 | 12.0 | 36 | 4.7008 | 0.4 | | 0.125 | 13.0 | 39 | 4.5588 | 0.3556 | | 0.0766 | 14.0 | 42 | 4.5434 | 0.2444 | | 0.0766 | 15.0 | 45 | 4.9792 | 0.2667 | | 0.0766 | 16.0 | 48 | 5.4095 | 0.2667 | | 0.0239 | 17.0 | 51 | 5.8507 | 0.2222 | | 0.0239 | 18.0 | 54 | 6.1023 | 0.2222 | | 0.0239 | 19.0 | 57 | 6.1666 | 0.2222 | | 0.0129 | 20.0 | 60 | 6.1948 | 0.2222 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.1 - Tokenizers 0.13.3
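The card only identifies the training data as an `imagefolder` dataset; a typical way to load such a folder with the `datasets` library is sketched below. The directory path and layout (one sub-folder per class, matching the labels listed after this card) are assumptions, since the card gives no further detail.

```python
from datasets import load_dataset

# Assumed layout: data_dir/<class_name>/<image files>; the path is a placeholder.
dataset = load_dataset("imagefolder", data_dir="path/to/wuhan_images")
print(dataset)                                   # splits discovered on disk
print(dataset["train"].features["label"].names)  # class names inferred from folder names
```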
[ "healthy", "sick" ]
jordyvl/cdip-small_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # cdip-small_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5 This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.4315 - Accuracy: 0.8522 - Brier Loss: 0.2145 - Nll: 1.3474 - F1 Micro: 0.8522 - F1 Macro: 0.8535 - Ece: 0.0573 - Aurc: 0.0300 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 96 - eval_batch_size: 96 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 167 | 1.6705 | 0.6378 | 0.4837 | 2.4248 | 0.6378 | 0.6323 | 0.0655 | 0.1457 | | No log | 2.0 | 334 | 1.1423 | 0.7322 | 0.3740 | 1.9847 | 0.7322 | 0.7285 | 0.0695 | 0.0846 | | 1.7909 | 3.0 | 501 | 0.9082 | 0.7682 | 0.3248 | 1.7674 | 0.7682 | 0.7676 | 0.0620 | 0.0642 | | 1.7909 | 4.0 | 668 | 0.8494 | 0.7865 | 0.3082 | 1.7306 | 0.7865 | 0.7904 | 0.0665 | 0.0560 | | 1.7909 | 5.0 | 835 | 0.7837 | 0.798 | 0.2988 | 1.6072 | 0.798 | 0.7953 | 0.0729 | 0.0553 | | 0.4994 | 6.0 | 1002 | 0.6867 | 0.804 | 0.2862 | 1.5014 | 0.804 | 0.8059 | 0.0794 | 0.0471 | | 0.4994 | 7.0 | 1169 | 0.7037 | 0.8157 | 0.2797 | 1.5533 | 0.8157 | 0.8178 | 0.0807 | 0.0478 | | 0.4994 | 8.0 | 1336 | 0.6709 | 0.8163 | 0.2756 | 1.5297 | 0.8163 | 0.8166 | 0.0728 | 0.0478 | | 0.2478 | 9.0 | 1503 | 0.6132 | 0.825 | 0.2576 | 1.4349 | 0.825 | 0.8247 | 0.0728 | 0.0398 | | 0.2478 | 10.0 | 1670 | 0.6389 | 0.8235 | 0.2671 | 1.4455 | 0.8235 | 0.8266 | 0.0746 | 0.0419 | | 0.2478 | 11.0 | 1837 | 0.6043 | 0.8257 | 0.2585 | 1.4609 | 0.8257 | 0.8293 | 0.0752 | 0.0403 | | 0.1683 | 12.0 | 2004 | 0.5639 | 0.8327 | 0.2457 | 1.4470 | 0.8327 | 0.8350 | 0.0676 | 0.0375 | | 0.1683 | 13.0 | 2171 | 0.5665 | 0.8317 | 0.2508 | 1.4054 | 0.8317 | 0.8324 | 0.0731 | 0.0388 | | 0.1683 | 14.0 | 2338 | 0.5505 | 0.8403 | 0.2427 | 1.4059 | 0.8403 | 0.8408 | 0.0649 | 0.0377 | | 0.131 | 15.0 | 2505 | 0.5321 | 0.836 | 0.2428 | 1.4078 | 0.836 | 0.8372 | 0.0684 | 0.0365 | | 0.131 | 16.0 | 2672 | 0.5161 | 0.8373 | 0.2383 | 1.3900 | 0.8373 | 0.8373 | 0.0711 | 0.0368 | | 0.131 | 17.0 | 2839 | 0.5177 | 0.8403 | 0.2371 | 1.3828 | 0.8403 | 0.8413 | 0.0633 | 0.0354 | | 0.1071 | 18.0 | 3006 | 0.5113 | 0.8407 | 0.2377 | 1.3832 | 0.8407 | 0.8432 | 0.0718 | 0.0343 | | 0.1071 | 19.0 | 3173 | 0.4949 | 0.8415 | 0.2332 | 1.3767 | 0.8415 | 0.8428 | 0.0667 | 0.0338 | | 0.1071 | 20.0 | 3340 | 0.4857 | 0.848 | 0.2271 | 1.3664 | 0.848 | 0.8492 | 0.0615 | 0.0338 | | 0.0877 | 21.0 | 3507 | 0.4812 | 0.847 | 0.2283 | 1.3360 | 0.847 | 0.8478 | 0.0602 | 0.0346 | | 0.0877 | 22.0 | 3674 | 0.4715 | 0.8495 | 0.2243 | 1.3761 | 0.8495 | 0.8506 | 0.0560 | 0.0320 | | 0.0877 | 23.0 | 3841 | 0.4622 | 0.8508 | 0.2206 | 1.3584 | 0.8508 | 0.8515 | 0.0557 | 0.0323 | | 0.0694 | 24.0 | 4008 | 0.4432 | 
0.8515 | 0.2167 | 1.3653 | 0.8515 | 0.8531 | 0.0555 | 0.0309 | | 0.0694 | 25.0 | 4175 | 0.4467 | 0.8498 | 0.2193 | 1.3499 | 0.8498 | 0.8512 | 0.0581 | 0.0309 | | 0.0694 | 26.0 | 4342 | 0.4412 | 0.8545 | 0.2162 | 1.3535 | 0.8545 | 0.8560 | 0.0534 | 0.0306 | | 0.0586 | 27.0 | 4509 | 0.4402 | 0.8498 | 0.2180 | 1.3390 | 0.8498 | 0.8510 | 0.0597 | 0.0309 | | 0.0586 | 28.0 | 4676 | 0.4408 | 0.8522 | 0.2174 | 1.3568 | 0.8522 | 0.8536 | 0.0576 | 0.0306 | | 0.0586 | 29.0 | 4843 | 0.4391 | 0.851 | 0.2168 | 1.3429 | 0.851 | 0.8523 | 0.0585 | 0.0305 | | 0.0549 | 30.0 | 5010 | 0.4371 | 0.853 | 0.2160 | 1.3389 | 0.853 | 0.8543 | 0.0573 | 0.0303 | | 0.0549 | 31.0 | 5177 | 0.4382 | 0.8498 | 0.2168 | 1.3486 | 0.8498 | 0.8513 | 0.0602 | 0.0304 | | 0.0549 | 32.0 | 5344 | 0.4372 | 0.853 | 0.2166 | 1.3501 | 0.853 | 0.8540 | 0.0591 | 0.0306 | | 0.0527 | 33.0 | 5511 | 0.4379 | 0.852 | 0.2156 | 1.3546 | 0.852 | 0.8531 | 0.0576 | 0.0304 | | 0.0527 | 34.0 | 5678 | 0.4353 | 0.8532 | 0.2154 | 1.3381 | 0.8532 | 0.8543 | 0.0574 | 0.0302 | | 0.0527 | 35.0 | 5845 | 0.4347 | 0.8525 | 0.2148 | 1.3550 | 0.8525 | 0.8535 | 0.0591 | 0.0304 | | 0.0511 | 36.0 | 6012 | 0.4311 | 0.8542 | 0.2141 | 1.3233 | 0.8542 | 0.8552 | 0.0572 | 0.0299 | | 0.0511 | 37.0 | 6179 | 0.4323 | 0.852 | 0.2150 | 1.3332 | 0.852 | 0.8532 | 0.0586 | 0.0302 | | 0.0511 | 38.0 | 6346 | 0.4321 | 0.8515 | 0.2152 | 1.3382 | 0.8515 | 0.8527 | 0.0583 | 0.0299 | | 0.0494 | 39.0 | 6513 | 0.4335 | 0.8495 | 0.2152 | 1.3385 | 0.8495 | 0.8511 | 0.0593 | 0.0303 | | 0.0494 | 40.0 | 6680 | 0.4323 | 0.852 | 0.2146 | 1.3603 | 0.852 | 0.8533 | 0.0576 | 0.0299 | | 0.0494 | 41.0 | 6847 | 0.4309 | 0.8512 | 0.2143 | 1.3448 | 0.8512 | 0.8525 | 0.0570 | 0.0299 | | 0.0477 | 42.0 | 7014 | 0.4327 | 0.8525 | 0.2149 | 1.3439 | 0.8525 | 0.8539 | 0.0580 | 0.0299 | | 0.0477 | 43.0 | 7181 | 0.4309 | 0.8532 | 0.2140 | 1.3406 | 0.8532 | 0.8544 | 0.0560 | 0.0299 | | 0.0477 | 44.0 | 7348 | 0.4308 | 0.8528 | 0.2141 | 1.3404 | 0.8528 | 0.8540 | 0.0573 | 0.0299 | | 0.0466 | 45.0 | 7515 | 0.4317 | 0.8525 | 0.2147 | 1.3402 | 0.8525 | 0.8538 | 0.0580 | 0.0299 | | 0.0466 | 46.0 | 7682 | 0.4317 | 0.8535 | 0.2144 | 1.3475 | 0.8535 | 0.8547 | 0.0553 | 0.0298 | | 0.0466 | 47.0 | 7849 | 0.4314 | 0.8525 | 0.2143 | 1.3479 | 0.8525 | 0.8537 | 0.0559 | 0.0299 | | 0.0465 | 48.0 | 8016 | 0.4314 | 0.8525 | 0.2143 | 1.3479 | 0.8525 | 0.8538 | 0.0559 | 0.0299 | | 0.0465 | 49.0 | 8183 | 0.4316 | 0.8528 | 0.2145 | 1.3471 | 0.8528 | 0.8540 | 0.0573 | 0.0299 | | 0.0465 | 50.0 | 8350 | 0.4315 | 0.8522 | 0.2145 | 1.3474 | 0.8522 | 0.8535 | 0.0573 | 0.0300 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
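The card does not spell out the distillation objective; the repository name (`kd_CEKD_t2.5_a0.5`) suggests a cross-entropy loss combined with soft-target distillation at temperature 2.5 and weight 0.5. Under that assumption only, a generic version of such a loss looks like the sketch below; the author's actual implementation may differ.

```python
import torch.nn.functional as F

def cekd_loss(student_logits, teacher_logits, labels, temperature=2.5, alpha=0.5):
    """Generic CE + soft-target KD loss; temperature/alpha are read off the repo name."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return (1.0 - alpha) * ce + alpha * kd
```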
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
tommilyjones/vit-base-patch16-224-finetuned-hateful-meme-restructured
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-hateful-meme-restructured This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.7152 - Accuracy: 0.552 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.6546 | 0.99 | 66 | 0.7185 | 0.52 | | 0.6222 | 2.0 | 133 | 0.7152 | 0.552 | | 0.5986 | 2.99 | 199 | 0.7344 | 0.542 | | 0.5535 | 4.0 | 266 | 0.7782 | 0.514 | | 0.5377 | 4.99 | 332 | 0.8329 | 0.514 | | 0.5115 | 6.0 | 399 | 0.7596 | 0.528 | | 0.5133 | 6.99 | 465 | 0.8151 | 0.512 | | 0.511 | 8.0 | 532 | 0.7897 | 0.538 | | 0.4712 | 8.99 | 598 | 0.8539 | 0.514 | | 0.4626 | 9.92 | 660 | 0.8449 | 0.522 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu117 - Datasets 2.13.1 - Tokenizers 0.13.3
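Accuracy is the only metric reported in the table above; if it was produced through `transformers.Trainer`, the metric function would look roughly like the sketch below. This is a conventional pattern, not the code actually used for this card.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # eval_pred is (logits, labels) as passed by transformers.Trainer.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```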
[ "hate", "nohate" ]
tommilyjones/resnet-50-finetuned-hateful-meme-restructured
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # resnet-50-finetuned-hateful-meme-restructured This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.7132 - Accuracy: 0.5 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.6633 | 0.99 | 66 | 0.7132 | 0.5 | | 0.6561 | 2.0 | 133 | 0.7309 | 0.5 | | 0.6497 | 2.99 | 199 | 0.7314 | 0.5 | | 0.6529 | 4.0 | 266 | 0.7296 | 0.5 | | 0.6336 | 4.99 | 332 | 0.7386 | 0.5 | | 0.625 | 6.0 | 399 | 0.7403 | 0.5 | | 0.6511 | 6.99 | 465 | 0.7425 | 0.5 | | 0.6567 | 8.0 | 532 | 0.7314 | 0.5 | | 0.6389 | 8.99 | 598 | 0.7380 | 0.5 | | 0.6446 | 9.92 | 660 | 0.7426 | 0.5 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu117 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "hate", "nohate" ]
tommilyjones/swin-tiny-patch4-window7-224-finetuned-hateful-meme-restructured
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-hateful-meme-restructured This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.8519 - Accuracy: 0.52 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.6441 | 0.99 | 66 | 0.7419 | 0.492 | | 0.6368 | 2.0 | 133 | 0.7235 | 0.51 | | 0.6157 | 2.99 | 199 | 0.7516 | 0.504 | | 0.5928 | 4.0 | 266 | 0.8009 | 0.502 | | 0.5735 | 4.99 | 332 | 0.8270 | 0.508 | | 0.5559 | 6.0 | 399 | 0.7804 | 0.502 | | 0.5533 | 6.99 | 465 | 0.8053 | 0.486 | | 0.5541 | 8.0 | 532 | 0.8078 | 0.504 | | 0.5218 | 8.99 | 598 | 0.8519 | 0.52 | | 0.5226 | 9.92 | 660 | 0.8522 | 0.508 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu117 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "hate", "nohate" ]
tommilyjones/swin-tiny-patch4-window7-224-finetuned-masked-hateful-meme-restructured
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-masked-hateful-meme-restructured This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.7166 - Accuracy: 0.53 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.6507 | 0.99 | 66 | 0.7352 | 0.502 | | 0.6411 | 2.0 | 133 | 0.7070 | 0.528 | | 0.6268 | 2.99 | 199 | 0.7166 | 0.53 | | 0.6007 | 4.0 | 266 | 0.7934 | 0.506 | | 0.5875 | 4.99 | 332 | 0.8053 | 0.52 | | 0.5554 | 6.0 | 399 | 0.7534 | 0.524 | | 0.5613 | 6.99 | 465 | 0.8075 | 0.524 | | 0.5714 | 8.0 | 532 | 0.7882 | 0.522 | | 0.5244 | 8.99 | 598 | 0.8380 | 0.518 | | 0.5251 | 9.92 | 660 | 0.8331 | 0.52 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu117 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "hate", "nohate" ]
tommilyjones/resnet-50-finetuned-masked-hateful-meme-restructured
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # resnet-50-finetuned-masked-hateful-meme-restructured This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.7093 - Accuracy: 0.5 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.6639 | 0.99 | 66 | 0.7093 | 0.5 | | 0.6569 | 2.0 | 133 | 0.7295 | 0.5 | | 0.6489 | 2.99 | 199 | 0.7257 | 0.5 | | 0.6553 | 4.0 | 266 | 0.7274 | 0.5 | | 0.6334 | 4.99 | 332 | 0.7311 | 0.5 | | 0.627 | 6.0 | 399 | 0.7371 | 0.5 | | 0.6561 | 6.99 | 465 | 0.7386 | 0.5 | | 0.6552 | 8.0 | 532 | 0.7354 | 0.5 | | 0.6427 | 8.99 | 598 | 0.7346 | 0.5 | | 0.6451 | 9.92 | 660 | 0.7377 | 0.498 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu117 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "hate", "nohate" ]
tommilyjones/vit-base-patch16-224-finetuned-masked-hateful-meme-restructured
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-masked-hateful-meme-restructured This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.7518 - Accuracy: 0.54 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.6625 | 0.99 | 66 | 0.7385 | 0.518 | | 0.6413 | 2.0 | 133 | 0.6980 | 0.538 | | 0.6063 | 2.99 | 199 | 0.7422 | 0.53 | | 0.5813 | 4.0 | 266 | 0.7794 | 0.52 | | 0.5551 | 4.99 | 332 | 0.7975 | 0.52 | | 0.5249 | 6.0 | 399 | 0.7518 | 0.54 | | 0.5254 | 6.99 | 465 | 0.8074 | 0.53 | | 0.5335 | 8.0 | 532 | 0.7907 | 0.52 | | 0.4867 | 8.99 | 598 | 0.8286 | 0.524 | | 0.4746 | 9.92 | 660 | 0.8262 | 0.522 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu117 - Datasets 2.13.1 - Tokenizers 0.13.3
[ "hate", "nohate" ]
TangTide/vit-base-patch16-224-in21k-Dog-Classification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-in21k-Dog-Classification This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagewoof dataset. It achieves the following results on the evaluation set: - Loss: 0.5386 - Accuracy: 0.9524 ## Model description Trained on the [frgfm/imagewoof dataset](https://huggingface.co/datasets/frgfm/imagewoof), this model classifies ten dog breeds: Shih-Tzu, Rhodesian ridgeback, Beagle, English foxhound, Border terrier, Australian terrier, Golden retriever, Old English sheepdog, Samoyed, and Dingo. ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.2356 | 0.99 | 63 | 1.0520 | 0.9059 | | 0.6987 | 2.0 | 127 | 0.6162 | 0.9446 | | 0.5787 | 2.98 | 189 | 0.5386 | 0.9524 | ### Framework versions - Transformers 4.30.2 - Pytorch 1.13.0+cu117 - Datasets 2.14.0 - Tokenizers 0.13.3
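For anyone repeating this fine-tune, the sketch below shows the usual way to attach the ten imagewoof class names to a ViT classification head. The label order follows the list in this card; check it against the published checkpoint's `config.id2label` before relying on it.

```python
from transformers import AutoModelForImageClassification

labels = [
    "Shih-Tzu", "Rhodesian ridgeback", "Beagle", "English foxhound",
    "Border terrier", "Australian terrier", "Golden retriever",
    "Old English sheepdog", "Samoyed", "Dingo",
]
id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in id2label.items()}

# Fresh 10-way head on top of the same base checkpoint the card starts from.
model = AutoModelForImageClassification.from_pretrained(
    "google/vit-base-patch16-224-in21k",
    num_labels=len(labels),
    id2label=id2label,
    label2id=label2id,
)
```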
[ "shih-tzu", "rhodesian ridgeback", "beagle", "english foxhound", "border terrier", "australian terrier", "golden retriever", "old english sheepdog", "samoyed", "dingo" ]
loucad/mobilevit-small-finetuned-eurosat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # mobilevit-small-finetuned-eurosat This model is a fine-tuned version of [apple/mobilevit-small](https://huggingface.co/apple/mobilevit-small) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.3838 - Accuracy: 0.9189 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.065 | 1.0 | 190 | 0.8779 | 0.8385 | | 0.636 | 2.0 | 380 | 0.4618 | 0.9011 | | 0.5761 | 3.0 | 570 | 0.3838 | 0.9189 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu117 - Datasets 2.14.1 - Tokenizers 0.13.3
[ "annualcrop", "forest", "herbaceousvegetation", "highway", "industrial", "pasture", "permanentcrop", "residential", "river", "sealake" ]
jordyvl/vit-base_rvl-cdip-small_rvl_cdip-NK1000_kd_MSE
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-small_rvl_cdip-NK1000_kd_MSE This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.3508 - Accuracy: 0.861 - Brier Loss: 0.2072 - Nll: 1.3138 - F1 Micro: 0.861 - F1 Macro: 0.8630 - Ece: 0.0470 - Aurc: 0.0290 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 96 - eval_batch_size: 96 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 167 | 2.0205 | 0.6145 | 0.5068 | 2.2570 | 0.6145 | 0.6027 | 0.0547 | 0.1646 | | No log | 2.0 | 334 | 1.3347 | 0.7055 | 0.3976 | 1.8932 | 0.7055 | 0.7001 | 0.0615 | 0.1010 | | 2.0481 | 3.0 | 501 | 0.9336 | 0.764 | 0.3401 | 1.6693 | 0.764 | 0.7665 | 0.0772 | 0.0686 | | 2.0481 | 4.0 | 668 | 0.7982 | 0.7895 | 0.3047 | 1.5439 | 0.7895 | 0.7930 | 0.0670 | 0.0569 | | 2.0481 | 5.0 | 835 | 0.7154 | 0.7973 | 0.3037 | 1.5600 | 0.7973 | 0.7969 | 0.0836 | 0.0571 | | 0.5656 | 6.0 | 1002 | 0.6158 | 0.8113 | 0.2903 | 1.4591 | 0.8113 | 0.8139 | 0.0921 | 0.0493 | | 0.5656 | 7.0 | 1169 | 0.5531 | 0.8207 | 0.2707 | 1.4410 | 0.8207 | 0.8236 | 0.0832 | 0.0431 | | 0.5656 | 8.0 | 1336 | 0.5706 | 0.815 | 0.2826 | 1.4722 | 0.815 | 0.8208 | 0.0881 | 0.0465 | | 0.2988 | 9.0 | 1503 | 0.4654 | 0.8355 | 0.2488 | 1.3791 | 0.8355 | 0.8368 | 0.0745 | 0.0382 | | 0.2988 | 10.0 | 1670 | 0.4695 | 0.8315 | 0.2579 | 1.3701 | 0.8315 | 0.8333 | 0.0813 | 0.0403 | | 0.2988 | 11.0 | 1837 | 0.4358 | 0.8405 | 0.2424 | 1.3500 | 0.8405 | 0.8424 | 0.0725 | 0.0361 | | 0.1829 | 12.0 | 2004 | 0.4333 | 0.8425 | 0.2402 | 1.3740 | 0.8425 | 0.8446 | 0.0662 | 0.0362 | | 0.1829 | 13.0 | 2171 | 0.4239 | 0.8462 | 0.2326 | 1.3541 | 0.8462 | 0.8477 | 0.0648 | 0.0335 | | 0.1829 | 14.0 | 2338 | 0.3902 | 0.8488 | 0.2263 | 1.2996 | 0.8488 | 0.8512 | 0.0642 | 0.0318 | | 0.1215 | 15.0 | 2505 | 0.3740 | 0.8522 | 0.2194 | 1.3374 | 0.8522 | 0.8543 | 0.0595 | 0.0313 | | 0.1215 | 16.0 | 2672 | 0.3735 | 0.8548 | 0.2189 | 1.3420 | 0.8547 | 0.8553 | 0.0525 | 0.0320 | | 0.1215 | 17.0 | 2839 | 0.3700 | 0.8538 | 0.2161 | 1.3217 | 0.8537 | 0.8561 | 0.0521 | 0.0304 | | 0.082 | 18.0 | 3006 | 0.3574 | 0.8548 | 0.2164 | 1.3245 | 0.8547 | 0.8561 | 0.0583 | 0.0301 | | 0.082 | 19.0 | 3173 | 0.3669 | 0.8555 | 0.2140 | 1.3197 | 0.8555 | 0.8572 | 0.0538 | 0.0304 | | 0.082 | 20.0 | 3340 | 0.3561 | 0.8548 | 0.2125 | 1.3367 | 0.8547 | 0.8560 | 0.0540 | 0.0296 | | 0.0535 | 21.0 | 3507 | 0.3495 | 0.854 | 0.2116 | 1.3422 | 0.854 | 0.8558 | 0.0556 | 0.0294 | | 0.0535 | 22.0 | 3674 | 0.3412 | 0.8602 | 0.2092 | 1.2970 | 0.8602 | 0.8621 | 0.0527 | 0.0293 | | 0.0535 | 23.0 | 3841 | 0.3445 | 0.8595 | 0.2086 | 1.2979 | 0.8595 | 0.8613 | 0.0500 | 0.0286 | | 0.0309 | 24.0 | 4008 | 
0.3456 | 0.8585 | 0.2105 | 1.3220 | 0.8585 | 0.8601 | 0.0507 | 0.0292 | | 0.0309 | 25.0 | 4175 | 0.3451 | 0.862 | 0.2091 | 1.3080 | 0.8620 | 0.8640 | 0.0465 | 0.0290 | | 0.0309 | 26.0 | 4342 | 0.3484 | 0.8578 | 0.2090 | 1.3165 | 0.8578 | 0.8596 | 0.0527 | 0.0290 | | 0.019 | 27.0 | 4509 | 0.3452 | 0.8612 | 0.2072 | 1.3133 | 0.8612 | 0.8634 | 0.0494 | 0.0288 | | 0.019 | 28.0 | 4676 | 0.3451 | 0.8598 | 0.2089 | 1.3197 | 0.8598 | 0.8619 | 0.0515 | 0.0295 | | 0.019 | 29.0 | 4843 | 0.3445 | 0.8618 | 0.2072 | 1.3057 | 0.8618 | 0.8633 | 0.0496 | 0.0294 | | 0.0137 | 30.0 | 5010 | 0.3452 | 0.8592 | 0.2078 | 1.3108 | 0.8592 | 0.8609 | 0.0499 | 0.0292 | | 0.0137 | 31.0 | 5177 | 0.3439 | 0.8615 | 0.2074 | 1.2960 | 0.8615 | 0.8631 | 0.0495 | 0.0286 | | 0.0137 | 32.0 | 5344 | 0.3475 | 0.8618 | 0.2080 | 1.3146 | 0.8618 | 0.8638 | 0.0468 | 0.0288 | | 0.01 | 33.0 | 5511 | 0.3468 | 0.8605 | 0.2080 | 1.3095 | 0.8605 | 0.8624 | 0.0470 | 0.0291 | | 0.01 | 34.0 | 5678 | 0.3454 | 0.8638 | 0.2060 | 1.3094 | 0.8638 | 0.8653 | 0.0465 | 0.0285 | | 0.01 | 35.0 | 5845 | 0.3463 | 0.8612 | 0.2067 | 1.3145 | 0.8612 | 0.8632 | 0.0479 | 0.0287 | | 0.0071 | 36.0 | 6012 | 0.3466 | 0.8615 | 0.2070 | 1.3189 | 0.8615 | 0.8634 | 0.0449 | 0.0289 | | 0.0071 | 37.0 | 6179 | 0.3457 | 0.8635 | 0.2065 | 1.3085 | 0.8635 | 0.8653 | 0.0487 | 0.0287 | | 0.0071 | 38.0 | 6346 | 0.3471 | 0.8618 | 0.2066 | 1.3132 | 0.8618 | 0.8637 | 0.0488 | 0.0286 | | 0.0047 | 39.0 | 6513 | 0.3481 | 0.8615 | 0.2067 | 1.3116 | 0.8615 | 0.8632 | 0.0485 | 0.0288 | | 0.0047 | 40.0 | 6680 | 0.3482 | 0.8618 | 0.2074 | 1.3149 | 0.8618 | 0.8638 | 0.0512 | 0.0290 | | 0.0047 | 41.0 | 6847 | 0.3488 | 0.862 | 0.2072 | 1.3162 | 0.8620 | 0.8640 | 0.0467 | 0.0287 | | 0.0029 | 42.0 | 7014 | 0.3485 | 0.862 | 0.2069 | 1.3136 | 0.8620 | 0.8640 | 0.0466 | 0.0288 | | 0.0029 | 43.0 | 7181 | 0.3492 | 0.8612 | 0.2072 | 1.3151 | 0.8612 | 0.8633 | 0.0470 | 0.0288 | | 0.0029 | 44.0 | 7348 | 0.3492 | 0.8615 | 0.2070 | 1.3117 | 0.8615 | 0.8634 | 0.0459 | 0.0289 | | 0.0019 | 45.0 | 7515 | 0.3502 | 0.8612 | 0.2073 | 1.3153 | 0.8612 | 0.8632 | 0.0460 | 0.0289 | | 0.0019 | 46.0 | 7682 | 0.3500 | 0.8615 | 0.2072 | 1.3136 | 0.8615 | 0.8634 | 0.0474 | 0.0290 | | 0.0019 | 47.0 | 7849 | 0.3505 | 0.862 | 0.2072 | 1.3153 | 0.8620 | 0.8640 | 0.0457 | 0.0289 | | 0.0014 | 48.0 | 8016 | 0.3507 | 0.861 | 0.2072 | 1.3113 | 0.861 | 0.8630 | 0.0475 | 0.0290 | | 0.0014 | 49.0 | 8183 | 0.3508 | 0.861 | 0.2071 | 1.3111 | 0.861 | 0.8630 | 0.0474 | 0.0290 | | 0.0014 | 50.0 | 8350 | 0.3508 | 0.861 | 0.2072 | 1.3138 | 0.861 | 0.8630 | 0.0470 | 0.0290 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
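Beyond accuracy, the table above reports calibration metrics (Brier loss, ECE) that the Trainer does not emit by default. The sketch below gives standard textbook definitions of two of them; the evaluation code behind the numbers above is not published in the card and may differ in binning or other details.

```python
import numpy as np

def brier_score(probs, labels, num_classes):
    """Multi-class Brier score: mean squared distance to the one-hot target."""
    onehot = np.eye(num_classes)[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE with equal-width confidence bins."""
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            acc = (pred[mask] == labels[mask]).mean()
            ece += mask.mean() * abs(acc - conf[mask].mean())
    return ece
```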
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
rdmpage/autotrain-pagex-78162140824
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 78162140824 - CO2 Emissions (in grams): 0.0075 ## Validation Metrics - Loss: 0.184 - Accuracy: 0.933 - Macro F1: 0.934 - Micro F1: 0.933 - Weighted F1: 0.936 - Macro Precision: 0.917 - Micro Precision: 0.933 - Weighted Precision: 0.950 - Macro Recall: 0.965 - Micro Recall: 0.933 - Weighted Recall: 0.933
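The macro / micro / weighted precision, recall, and F1 figures above follow the usual scikit-learn averaging conventions; a small illustration with toy predictions over this model's three page classes is sketched below. It is not the AutoTrain evaluation code.

```python
from sklearn.metrics import precision_recall_fscore_support

# Toy labels over the three page classes; purely illustrative.
y_true = ["content", "content", "end", "start", "start", "content"]
y_pred = ["content", "end", "end", "start", "content", "content"]

for average in ("macro", "micro", "weighted"):
    p, r, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average=average, zero_division=0
    )
    print(f"{average}: precision={p:.3f}, recall={r:.3f}, f1={f1:.3f}")
```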
[ "content", "end", "start" ]
jordyvl/vit-base_rvl-cdip-small_rvl_cdip-NK1000_kd_NKD_t1.0_g1.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-small_rvl_cdip-NK1000_kd_NKD_t1.0_g1.5 This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 3.6996 - Accuracy: 0.856 - Brier Loss: 0.2370 - Nll: 1.1806 - F1 Micro: 0.856 - F1 Macro: 0.8578 - Ece: 0.1015 - Aurc: 0.0302 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 96 - eval_batch_size: 96 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 167 | 4.6372 | 0.632 | 0.5161 | 2.2084 | 0.632 | 0.6245 | 0.1480 | 0.1490 | | No log | 2.0 | 334 | 4.2247 | 0.7143 | 0.3976 | 1.8912 | 0.7142 | 0.7110 | 0.0878 | 0.0942 | | 4.8599 | 3.0 | 501 | 4.0290 | 0.7488 | 0.3551 | 1.7330 | 0.7488 | 0.7552 | 0.0711 | 0.0785 | | 4.8599 | 4.0 | 668 | 3.8716 | 0.7903 | 0.2981 | 1.6409 | 0.7903 | 0.7898 | 0.0468 | 0.0593 | | 4.8599 | 5.0 | 835 | 3.7535 | 0.8055 | 0.2829 | 1.5302 | 0.8055 | 0.8039 | 0.0465 | 0.0521 | | 3.7258 | 6.0 | 1002 | 3.7365 | 0.8023 | 0.2787 | 1.5134 | 0.8023 | 0.8043 | 0.0352 | 0.0509 | | 3.7258 | 7.0 | 1169 | 3.7092 | 0.811 | 0.2705 | 1.3930 | 0.811 | 0.8130 | 0.0472 | 0.0488 | | 3.7258 | 8.0 | 1336 | 3.6799 | 0.8213 | 0.2643 | 1.4444 | 0.8213 | 0.8242 | 0.0484 | 0.0453 | | 3.4329 | 9.0 | 1503 | 3.6148 | 0.8265 | 0.2522 | 1.3355 | 0.8265 | 0.8295 | 0.0522 | 0.0425 | | 3.4329 | 10.0 | 1670 | 3.5723 | 0.826 | 0.2524 | 1.3332 | 0.826 | 0.8286 | 0.0637 | 0.0398 | | 3.4329 | 11.0 | 1837 | 3.6298 | 0.8277 | 0.2565 | 1.3664 | 0.8277 | 0.8304 | 0.0720 | 0.0422 | | 3.2987 | 12.0 | 2004 | 3.5604 | 0.8407 | 0.2376 | 1.3420 | 0.8407 | 0.8424 | 0.0609 | 0.0359 | | 3.2987 | 13.0 | 2171 | 3.5885 | 0.8393 | 0.2446 | 1.3552 | 0.8393 | 0.8420 | 0.0712 | 0.0381 | | 3.2987 | 14.0 | 2338 | 3.6191 | 0.8315 | 0.2518 | 1.3329 | 0.8315 | 0.8322 | 0.0772 | 0.0383 | | 3.2268 | 15.0 | 2505 | 3.5920 | 0.837 | 0.2465 | 1.3397 | 0.8370 | 0.8399 | 0.0816 | 0.0372 | | 3.2268 | 16.0 | 2672 | 3.5483 | 0.847 | 0.2324 | 1.2733 | 0.847 | 0.8489 | 0.0697 | 0.0337 | | 3.2268 | 17.0 | 2839 | 3.5924 | 0.8438 | 0.2444 | 1.2686 | 0.8438 | 0.8450 | 0.0838 | 0.0355 | | 3.174 | 18.0 | 3006 | 3.5909 | 0.8427 | 0.2419 | 1.2631 | 0.8427 | 0.8449 | 0.0856 | 0.0336 | | 3.174 | 19.0 | 3173 | 3.5857 | 0.8452 | 0.2393 | 1.2979 | 0.8452 | 0.8474 | 0.0804 | 0.0338 | | 3.174 | 20.0 | 3340 | 3.5700 | 0.8455 | 0.2373 | 1.2916 | 0.8455 | 0.8471 | 0.0824 | 0.0336 | | 3.1369 | 21.0 | 3507 | 3.5578 | 0.8518 | 0.2298 | 1.2615 | 0.8518 | 0.8531 | 0.0779 | 0.0316 | | 3.1369 | 22.0 | 3674 | 3.5659 | 0.8478 | 0.2349 | 1.2532 | 0.8478 | 0.8502 | 0.0848 | 0.0325 | | 3.1369 | 23.0 | 3841 | 3.5506 | 0.8552 | 0.2302 | 1.2530 | 0.8552 | 0.8572 | 0.0817 | 0.0312 | | 3.1077 | 24.0 | 
4008 | 3.5551 | 0.857 | 0.2298 | 1.2669 | 0.857 | 0.8585 | 0.0817 | 0.0306 | | 3.1077 | 25.0 | 4175 | 3.5563 | 0.8575 | 0.2259 | 1.2374 | 0.8575 | 0.8587 | 0.0820 | 0.0296 | | 3.1077 | 26.0 | 4342 | 3.5642 | 0.8555 | 0.2312 | 1.2159 | 0.8555 | 0.8577 | 0.0855 | 0.0305 | | 3.0885 | 27.0 | 4509 | 3.5739 | 0.856 | 0.2332 | 1.2143 | 0.856 | 0.8581 | 0.0854 | 0.0309 | | 3.0885 | 28.0 | 4676 | 3.5544 | 0.855 | 0.2294 | 1.2305 | 0.855 | 0.8567 | 0.0860 | 0.0302 | | 3.0885 | 29.0 | 4843 | 3.5574 | 0.8598 | 0.2262 | 1.2330 | 0.8598 | 0.8616 | 0.0839 | 0.0304 | | 3.0716 | 30.0 | 5010 | 3.5673 | 0.8572 | 0.2291 | 1.2208 | 0.8572 | 0.8591 | 0.0888 | 0.0298 | | 3.0716 | 31.0 | 5177 | 3.5818 | 0.853 | 0.2293 | 1.1947 | 0.853 | 0.8550 | 0.0917 | 0.0302 | | 3.0716 | 32.0 | 5344 | 3.5792 | 0.858 | 0.2295 | 1.2086 | 0.858 | 0.8597 | 0.0881 | 0.0297 | | 3.064 | 33.0 | 5511 | 3.5895 | 0.8575 | 0.2315 | 1.2009 | 0.8575 | 0.8590 | 0.0900 | 0.0296 | | 3.064 | 34.0 | 5678 | 3.5923 | 0.8565 | 0.2293 | 1.1905 | 0.8565 | 0.8583 | 0.0901 | 0.0295 | | 3.064 | 35.0 | 5845 | 3.5997 | 0.8562 | 0.2310 | 1.2128 | 0.8562 | 0.8577 | 0.0922 | 0.0297 | | 3.0572 | 36.0 | 6012 | 3.6041 | 0.8572 | 0.2307 | 1.1932 | 0.8572 | 0.8589 | 0.0917 | 0.0296 | | 3.0572 | 37.0 | 6179 | 3.6123 | 0.857 | 0.2319 | 1.1984 | 0.857 | 0.8587 | 0.0932 | 0.0294 | | 3.0572 | 38.0 | 6346 | 3.6162 | 0.8585 | 0.2304 | 1.1909 | 0.8585 | 0.8600 | 0.0917 | 0.0293 | | 3.0542 | 39.0 | 6513 | 3.6318 | 0.8575 | 0.2321 | 1.1982 | 0.8575 | 0.8590 | 0.0945 | 0.0298 | | 3.0542 | 40.0 | 6680 | 3.6319 | 0.8572 | 0.2324 | 1.1905 | 0.8572 | 0.8588 | 0.0944 | 0.0296 | | 3.0542 | 41.0 | 6847 | 3.6401 | 0.856 | 0.2327 | 1.1953 | 0.856 | 0.8577 | 0.0964 | 0.0295 | | 3.0527 | 42.0 | 7014 | 3.6567 | 0.8572 | 0.2343 | 1.1821 | 0.8572 | 0.8588 | 0.0968 | 0.0299 | | 3.0527 | 43.0 | 7181 | 3.6601 | 0.8562 | 0.2341 | 1.1885 | 0.8562 | 0.8580 | 0.0981 | 0.0299 | | 3.0527 | 44.0 | 7348 | 3.6683 | 0.8572 | 0.2351 | 1.1854 | 0.8572 | 0.8588 | 0.0977 | 0.0299 | | 3.0479 | 45.0 | 7515 | 3.6742 | 0.8568 | 0.2353 | 1.1894 | 0.8568 | 0.8584 | 0.0986 | 0.0299 | | 3.0479 | 46.0 | 7682 | 3.6847 | 0.8565 | 0.2360 | 1.1813 | 0.8565 | 0.8582 | 0.1002 | 0.0301 | | 3.0479 | 47.0 | 7849 | 3.6891 | 0.8562 | 0.2363 | 1.1814 | 0.8562 | 0.8581 | 0.1004 | 0.0302 | | 3.0469 | 48.0 | 8016 | 3.6964 | 0.8558 | 0.2367 | 1.1806 | 0.8558 | 0.8575 | 0.1014 | 0.0302 | | 3.0469 | 49.0 | 8183 | 3.6982 | 0.8565 | 0.2369 | 1.1808 | 0.8565 | 0.8583 | 0.1008 | 0.0302 | | 3.0469 | 50.0 | 8350 | 3.6996 | 0.856 | 0.2370 | 1.1806 | 0.856 | 0.8578 | 0.1015 | 0.0302 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
petrznel/blurred_landmarks
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # blurred_landmarks This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.1152 - Accuracy: 0.9645 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 20 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.6588 | 1.0 | 357 | 0.6460 | 0.7707 | | 0.3752 | 2.0 | 714 | 0.2969 | 0.8933 | | 0.3275 | 3.0 | 1071 | 0.1912 | 0.9319 | | 0.2183 | 4.0 | 1429 | 0.1794 | 0.9305 | | 0.2133 | 5.0 | 1786 | 0.1638 | 0.9414 | | 0.1984 | 6.0 | 2143 | 0.1322 | 0.9484 | | 0.1409 | 7.0 | 2500 | 0.1304 | 0.9529 | | 0.1864 | 8.0 | 2858 | 0.1212 | 0.9572 | | 0.1778 | 9.0 | 3215 | 0.1216 | 0.9540 | | 0.1734 | 10.0 | 3572 | 0.1129 | 0.9593 | | 0.1349 | 11.0 | 3929 | 0.1127 | 0.9614 | | 0.1057 | 12.0 | 4287 | 0.1177 | 0.9582 | | 0.1434 | 13.0 | 4644 | 0.1153 | 0.9603 | | 0.0832 | 14.0 | 5001 | 0.1264 | 0.9593 | | 0.0963 | 15.0 | 5358 | 0.1146 | 0.9607 | | 0.0642 | 16.0 | 5716 | 0.1135 | 0.9635 | | 0.0763 | 17.0 | 6073 | 0.1210 | 0.9614 | | 0.0432 | 18.0 | 6430 | 0.1162 | 0.9645 | | 0.0618 | 19.0 | 6787 | 0.1269 | 0.9600 | | 0.049 | 19.99 | 7140 | 0.1152 | 0.9645 | ### Framework versions - Transformers 4.30.0.dev0 - Pytorch 1.13.0 - Datasets 2.10.1 - Tokenizers 0.11.0
[ "generated", "real" ]
jordyvl/vit-base_rvl-cdip-small_rvl_cdip-NK1000_hint
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-small_rvl_cdip-NK1000_hint This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 72.4419 - Accuracy: 0.8455 - Brier Loss: 0.2823 - Nll: 1.9713 - F1 Micro: 0.8455 - F1 Macro: 0.8458 - Ece: 0.1379 - Aurc: 0.0501 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 75.292 | 1.0 | 1000 | 74.9267 | 0.669 | 0.4449 | 2.2033 | 0.669 | 0.6688 | 0.0509 | 0.1261 | | 74.3102 | 2.0 | 2000 | 74.4753 | 0.7228 | 0.3815 | 2.1695 | 0.7228 | 0.7241 | 0.0443 | 0.0931 | | 73.8711 | 3.0 | 3000 | 74.1051 | 0.7718 | 0.3245 | 1.9516 | 0.7717 | 0.7726 | 0.0520 | 0.0687 | | 73.8038 | 4.0 | 4000 | 74.0142 | 0.764 | 0.3412 | 1.9168 | 0.764 | 0.7656 | 0.0862 | 0.0705 | | 73.5111 | 5.0 | 5000 | 73.9154 | 0.762 | 0.3443 | 1.9320 | 0.762 | 0.7684 | 0.0774 | 0.0734 | | 73.2405 | 6.0 | 6000 | 73.6544 | 0.7923 | 0.3034 | 1.9497 | 0.7923 | 0.7877 | 0.0780 | 0.0555 | | 72.892 | 7.0 | 7000 | 73.6057 | 0.8073 | 0.3042 | 1.9087 | 0.8073 | 0.8074 | 0.1072 | 0.0549 | | 72.6235 | 8.0 | 8000 | 73.6740 | 0.8007 | 0.3175 | 1.9674 | 0.8007 | 0.8002 | 0.1221 | 0.0566 | | 72.6609 | 9.0 | 9000 | 73.6103 | 0.7977 | 0.3290 | 2.0128 | 0.7977 | 0.7992 | 0.1345 | 0.0595 | | 72.4643 | 10.0 | 10000 | 73.5996 | 0.8143 | 0.3166 | 1.9412 | 0.8143 | 0.8148 | 0.1372 | 0.0498 | | 72.2435 | 11.0 | 11000 | 73.5005 | 0.8057 | 0.3268 | 1.9323 | 0.8057 | 0.8070 | 0.1469 | 0.0530 | | 72.1757 | 12.0 | 12000 | 73.5284 | 0.8165 | 0.3197 | 1.9353 | 0.8165 | 0.8175 | 0.1442 | 0.0511 | | 72.0746 | 13.0 | 13000 | 73.5250 | 0.8023 | 0.3434 | 2.0172 | 0.8023 | 0.8047 | 0.1577 | 0.0584 | | 72.0251 | 14.0 | 14000 | 73.3937 | 0.817 | 0.3176 | 1.9784 | 0.817 | 0.8172 | 0.1471 | 0.0510 | | 71.8588 | 15.0 | 15000 | 73.3792 | 0.814 | 0.3249 | 1.9812 | 0.8140 | 0.8148 | 0.1506 | 0.0514 | | 71.8093 | 16.0 | 16000 | 73.2188 | 0.825 | 0.3084 | 1.9155 | 0.825 | 0.8259 | 0.1446 | 0.0449 | | 71.5835 | 17.0 | 17000 | 73.2452 | 0.8307 | 0.3019 | 1.9447 | 0.8308 | 0.8308 | 0.1386 | 0.0437 | | 71.6995 | 18.0 | 18000 | 73.2568 | 0.83 | 0.3042 | 2.0599 | 0.83 | 0.8304 | 0.1444 | 0.0459 | | 71.4455 | 19.0 | 19000 | 73.2655 | 0.8203 | 0.3204 | 2.0173 | 0.8203 | 0.8205 | 0.1524 | 0.0484 | | 71.4047 | 20.0 | 20000 | 73.1947 | 0.8203 | 0.3218 | 1.9651 | 0.8203 | 0.8218 | 0.1524 | 0.0484 | | 71.5116 | 21.0 | 21000 | 73.1398 | 0.8217 | 0.3197 | 1.9803 | 0.8217 | 0.8209 | 0.1521 | 0.0453 | | 71.3315 | 22.0 | 22000 | 73.0498 | 0.832 | 0.3033 | 1.9617 | 0.832 | 0.8327 | 0.1445 | 0.0478 | | 71.329 | 23.0 | 23000 | 73.0645 | 0.8277 | 0.3129 | 1.9073 | 0.8277 | 
0.8293 | 0.1502 | 0.0486 | | 71.2208 | 24.0 | 24000 | 73.1448 | 0.8257 | 0.3188 | 2.0056 | 0.8257 | 0.8252 | 0.1511 | 0.0520 | | 71.22 | 25.0 | 25000 | 72.9177 | 0.827 | 0.3103 | 1.9543 | 0.827 | 0.8261 | 0.1493 | 0.0500 | | 71.1268 | 26.0 | 26000 | 72.9064 | 0.8323 | 0.3022 | 1.9069 | 0.8323 | 0.8311 | 0.1458 | 0.0463 | | 70.8954 | 27.0 | 27000 | 72.8821 | 0.8403 | 0.2930 | 1.9264 | 0.8403 | 0.8405 | 0.1399 | 0.0454 | | 70.7553 | 28.0 | 28000 | 72.8230 | 0.8347 | 0.3012 | 1.9356 | 0.8347 | 0.8347 | 0.1434 | 0.0487 | | 70.8785 | 29.0 | 29000 | 72.8549 | 0.8347 | 0.3023 | 1.9905 | 0.8347 | 0.8339 | 0.1464 | 0.0477 | | 70.8139 | 30.0 | 30000 | 72.8073 | 0.8405 | 0.2933 | 1.9148 | 0.8405 | 0.8400 | 0.1399 | 0.0482 | | 70.9162 | 31.0 | 31000 | 72.7751 | 0.8367 | 0.3005 | 1.9466 | 0.8367 | 0.8364 | 0.1441 | 0.0473 | | 70.8988 | 32.0 | 32000 | 72.7235 | 0.8365 | 0.2990 | 1.9178 | 0.8365 | 0.8362 | 0.1432 | 0.0453 | | 70.7529 | 33.0 | 33000 | 72.6744 | 0.8415 | 0.2937 | 1.9929 | 0.8415 | 0.8424 | 0.1391 | 0.0491 | | 70.6705 | 34.0 | 34000 | 72.6624 | 0.8407 | 0.2927 | 1.9562 | 0.8407 | 0.8423 | 0.1417 | 0.0490 | | 70.6404 | 35.0 | 35000 | 72.7689 | 0.8317 | 0.3071 | 2.0079 | 0.8317 | 0.8311 | 0.1479 | 0.0551 | | 70.5201 | 36.0 | 36000 | 72.6579 | 0.8425 | 0.2928 | 1.9879 | 0.8425 | 0.8431 | 0.1404 | 0.0519 | | 70.6383 | 37.0 | 37000 | 72.5850 | 0.8458 | 0.2839 | 1.9513 | 0.8458 | 0.8470 | 0.1362 | 0.0500 | | 70.5781 | 38.0 | 38000 | 72.5590 | 0.8423 | 0.2894 | 1.9416 | 0.8423 | 0.8427 | 0.1407 | 0.0496 | | 70.4386 | 39.0 | 39000 | 72.5131 | 0.8435 | 0.2855 | 1.9475 | 0.8435 | 0.8443 | 0.1397 | 0.0492 | | 70.4275 | 40.0 | 40000 | 72.5441 | 0.8455 | 0.2896 | 1.9216 | 0.8455 | 0.8460 | 0.1387 | 0.0481 | | 70.5018 | 41.0 | 41000 | 72.5071 | 0.8452 | 0.2866 | 1.9148 | 0.8452 | 0.8465 | 0.1384 | 0.0486 | | 70.5176 | 42.0 | 42000 | 72.5110 | 0.8445 | 0.2862 | 1.9591 | 0.8445 | 0.8447 | 0.1384 | 0.0523 | | 70.3675 | 43.0 | 43000 | 72.4968 | 0.844 | 0.2845 | 1.9376 | 0.844 | 0.8445 | 0.1398 | 0.0507 | | 70.4295 | 44.0 | 44000 | 72.4633 | 0.846 | 0.2838 | 1.9475 | 0.8460 | 0.8464 | 0.1381 | 0.0506 | | 70.5198 | 45.0 | 45000 | 72.4940 | 0.8468 | 0.2828 | 1.9559 | 0.8468 | 0.8471 | 0.1374 | 0.0501 | | 70.3838 | 46.0 | 46000 | 72.4658 | 0.8462 | 0.2823 | 1.9468 | 0.8462 | 0.8465 | 0.1366 | 0.0500 | | 70.5383 | 47.0 | 47000 | 72.4590 | 0.8475 | 0.2817 | 1.9537 | 0.8475 | 0.8480 | 0.1362 | 0.0502 | | 70.3619 | 48.0 | 48000 | 72.4522 | 0.8458 | 0.2824 | 1.9632 | 0.8458 | 0.8461 | 0.1377 | 0.0495 | | 70.4764 | 49.0 | 49000 | 72.4573 | 0.8455 | 0.2825 | 1.9689 | 0.8455 | 0.8457 | 0.1378 | 0.0499 | | 70.3705 | 50.0 | 50000 | 72.4419 | 0.8455 | 0.2823 | 1.9713 | 0.8455 | 0.8458 | 0.1379 | 0.0501 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
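The `_hint` suffix in the model name points at a FitNets-style intermediate-feature ("hint") loss, but the card does not document the objective. Purely as an illustration of that family of losses, and not as the author's implementation, a minimal version is sketched below.

```python
import torch.nn as nn
import torch.nn.functional as F

class HintLoss(nn.Module):
    """FitNets-style hint loss: regress student features onto teacher features."""

    def __init__(self, student_dim, teacher_dim):
        super().__init__()
        # Linear regressor lifts the (smaller) student feature space to the teacher's.
        self.regressor = nn.Linear(student_dim, teacher_dim)

    def forward(self, student_feats, teacher_feats):
        return F.mse_loss(self.regressor(student_feats), teacher_feats)
```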
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/vit-base_rvl-cdip-small_rvl_cdip-NK1000_simkd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-small_rvl_cdip-NK1000_simkd This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.0540 - Accuracy: 0.859 - Brier Loss: 0.2977 - Nll: 1.1492 - F1 Micro: 0.859 - F1 Macro: 0.8598 - Ece: 0.2784 - Aurc: 0.0325 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 0.0787 | 1.0 | 1000 | 0.0770 | 0.2978 | 0.9091 | 2.9717 | 0.2978 | 0.2516 | 0.2150 | 0.5174 | | 0.0697 | 2.0 | 2000 | 0.0679 | 0.6372 | 0.6886 | 1.8065 | 0.6372 | 0.6302 | 0.4021 | 0.1388 | | 0.0655 | 3.0 | 3000 | 0.0645 | 0.7492 | 0.5738 | 1.7388 | 0.7492 | 0.7460 | 0.4215 | 0.0921 | | 0.0628 | 4.0 | 4000 | 0.0631 | 0.752 | 0.5394 | 1.7446 | 0.752 | 0.7551 | 0.3922 | 0.0837 | | 0.0611 | 5.0 | 5000 | 0.0612 | 0.768 | 0.4928 | 1.5830 | 0.768 | 0.7700 | 0.3710 | 0.0655 | | 0.0593 | 6.0 | 6000 | 0.0609 | 0.7598 | 0.4655 | 1.5730 | 0.7598 | 0.7667 | 0.3228 | 0.0802 | | 0.0578 | 7.0 | 7000 | 0.0585 | 0.8063 | 0.4195 | 1.4053 | 0.8062 | 0.8065 | 0.3459 | 0.0521 | | 0.0566 | 8.0 | 8000 | 0.0581 | 0.8073 | 0.3997 | 1.2957 | 0.8073 | 0.8084 | 0.3207 | 0.0538 | | 0.0557 | 9.0 | 9000 | 0.0571 | 0.8287 | 0.3810 | 1.3269 | 0.8287 | 0.8301 | 0.3307 | 0.0473 | | 0.0554 | 10.0 | 10000 | 0.0573 | 0.8115 | 0.3780 | 1.3469 | 0.8115 | 0.8128 | 0.3011 | 0.0508 | | 0.0546 | 11.0 | 11000 | 0.0563 | 0.8395 | 0.3549 | 1.2882 | 0.8395 | 0.8401 | 0.3197 | 0.0386 | | 0.0541 | 12.0 | 12000 | 0.0558 | 0.839 | 0.3426 | 1.2653 | 0.839 | 0.8401 | 0.3014 | 0.0394 | | 0.0536 | 13.0 | 13000 | 0.0553 | 0.8465 | 0.3259 | 1.1941 | 0.8465 | 0.8473 | 0.2980 | 0.0357 | | 0.0537 | 14.0 | 14000 | 0.0559 | 0.8303 | 0.3499 | 1.2460 | 0.8303 | 0.8338 | 0.2955 | 0.0427 | | 0.0532 | 15.0 | 15000 | 0.0551 | 0.8445 | 0.3296 | 1.1799 | 0.8445 | 0.8453 | 0.2990 | 0.0360 | | 0.0529 | 16.0 | 16000 | 0.0549 | 0.845 | 0.3224 | 1.1801 | 0.845 | 0.8456 | 0.2895 | 0.0364 | | 0.0527 | 17.0 | 17000 | 0.0549 | 0.849 | 0.3264 | 1.1725 | 0.849 | 0.8503 | 0.2991 | 0.0363 | | 0.0526 | 18.0 | 18000 | 0.0547 | 0.8518 | 0.3170 | 1.1755 | 0.8518 | 0.8527 | 0.2943 | 0.0334 | | 0.0524 | 19.0 | 19000 | 0.0546 | 0.8458 | 0.3213 | 1.1417 | 0.8458 | 0.8466 | 0.2917 | 0.0344 | | 0.0522 | 20.0 | 20000 | 0.0544 | 0.8545 | 0.3105 | 1.1512 | 0.8545 | 0.8542 | 0.2891 | 0.0333 | | 0.052 | 21.0 | 21000 | 0.0542 | 0.855 | 0.3120 | 1.1403 | 0.855 | 0.8555 | 0.2940 | 0.0333 | | 0.0518 | 22.0 | 22000 | 0.0542 | 0.854 | 0.3096 | 1.1533 | 0.854 | 0.8545 | 0.2893 | 0.0319 | | 0.0517 | 23.0 | 23000 | 0.0541 | 0.8545 | 0.3098 | 1.1445 | 0.8545 | 0.8556 | 0.2920 | 0.0315 | | 0.0516 | 24.0 
| 24000 | 0.0540 | 0.8578 | 0.3097 | 1.1273 | 0.8578 | 0.8586 | 0.2958 | 0.0315 | | 0.0514 | 25.0 | 25000 | 0.0540 | 0.8532 | 0.3076 | 1.1579 | 0.8532 | 0.8533 | 0.2849 | 0.0342 | | 0.0513 | 26.0 | 26000 | 0.0540 | 0.855 | 0.3055 | 1.1269 | 0.855 | 0.8563 | 0.2855 | 0.0325 | | 0.0511 | 27.0 | 27000 | 0.0538 | 0.8565 | 0.3029 | 1.1571 | 0.8565 | 0.8572 | 0.2827 | 0.0334 | | 0.051 | 28.0 | 28000 | 0.0538 | 0.8598 | 0.3012 | 1.1409 | 0.8598 | 0.8604 | 0.2851 | 0.0317 | | 0.0509 | 29.0 | 29000 | 0.0537 | 0.86 | 0.3003 | 1.1525 | 0.8600 | 0.8603 | 0.2839 | 0.0323 | | 0.0508 | 30.0 | 30000 | 0.0537 | 0.8575 | 0.3024 | 1.1430 | 0.8575 | 0.8585 | 0.2849 | 0.0319 | | 0.0507 | 31.0 | 31000 | 0.0537 | 0.8595 | 0.3015 | 1.1454 | 0.8595 | 0.8603 | 0.2859 | 0.0311 | | 0.0507 | 32.0 | 32000 | 0.0537 | 0.8598 | 0.3005 | 1.1463 | 0.8598 | 0.8603 | 0.2847 | 0.0316 | | 0.0506 | 33.0 | 33000 | 0.0537 | 0.8598 | 0.2966 | 1.1392 | 0.8598 | 0.8605 | 0.2800 | 0.0309 | | 0.0506 | 34.0 | 34000 | 0.0537 | 0.8562 | 0.3018 | 1.1442 | 0.8562 | 0.8574 | 0.2813 | 0.0327 | | 0.0505 | 35.0 | 35000 | 0.0537 | 0.855 | 0.2995 | 1.1402 | 0.855 | 0.8556 | 0.2790 | 0.0324 | | 0.0505 | 36.0 | 36000 | 0.0537 | 0.8575 | 0.2980 | 1.1324 | 0.8575 | 0.8582 | 0.2783 | 0.0314 | | 0.0504 | 37.0 | 37000 | 0.0538 | 0.8562 | 0.2981 | 1.1429 | 0.8562 | 0.8570 | 0.2770 | 0.0320 | | 0.0503 | 38.0 | 38000 | 0.0538 | 0.8565 | 0.2997 | 1.1319 | 0.8565 | 0.8573 | 0.2795 | 0.0324 | | 0.0503 | 39.0 | 39000 | 0.0538 | 0.857 | 0.2988 | 1.1447 | 0.857 | 0.8578 | 0.2791 | 0.0320 | | 0.0502 | 40.0 | 40000 | 0.0538 | 0.8588 | 0.2982 | 1.1409 | 0.8588 | 0.8595 | 0.2798 | 0.0320 | | 0.0502 | 41.0 | 41000 | 0.0538 | 0.8572 | 0.2982 | 1.1455 | 0.8572 | 0.8580 | 0.2781 | 0.0319 | | 0.0502 | 42.0 | 42000 | 0.0538 | 0.8602 | 0.2979 | 1.1357 | 0.8602 | 0.8609 | 0.2809 | 0.0320 | | 0.0501 | 43.0 | 43000 | 0.0539 | 0.8568 | 0.2987 | 1.1462 | 0.8568 | 0.8574 | 0.2787 | 0.0322 | | 0.0501 | 44.0 | 44000 | 0.0539 | 0.8595 | 0.2974 | 1.1456 | 0.8595 | 0.8602 | 0.2789 | 0.0322 | | 0.0501 | 45.0 | 45000 | 0.0539 | 0.8592 | 0.2980 | 1.1460 | 0.8592 | 0.8601 | 0.2792 | 0.0322 | | 0.05 | 46.0 | 46000 | 0.0539 | 0.8588 | 0.2979 | 1.1441 | 0.8588 | 0.8596 | 0.2787 | 0.0322 | | 0.05 | 47.0 | 47000 | 0.0540 | 0.8592 | 0.2983 | 1.1501 | 0.8592 | 0.8600 | 0.2793 | 0.0324 | | 0.05 | 48.0 | 48000 | 0.0540 | 0.8588 | 0.2980 | 1.1462 | 0.8588 | 0.8595 | 0.2787 | 0.0324 | | 0.05 | 49.0 | 49000 | 0.0540 | 0.8598 | 0.2978 | 1.1507 | 0.8598 | 0.8604 | 0.2793 | 0.0324 | | 0.05 | 50.0 | 50000 | 0.0540 | 0.859 | 0.2977 | 1.1492 | 0.859 | 0.8598 | 0.2784 | 0.0325 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/vit-base_rvl-cdip-small_rvl_cdip-NK1000_og_simkd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vit-base_rvl-cdip-small_rvl_cdip-NK1000_og_simkd

This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 261.8253
- Accuracy: 0.845
- Brier Loss: 0.2896
- Nll: 1.8917
- F1 Micro: 0.845
- F1 Macro: 0.8458
- Ece: 0.1431
- Aurc: 0.0597

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| 284.1528 | 1.0 | 1000 | 282.4832 | 0.5725 | 0.6206 | 2.4811 | 0.5725 | 0.5320 | 0.2175 | 0.1848 |
| 281.942 | 2.0 | 2000 | 280.8445 | 0.6943 | 0.4704 | 2.2781 | 0.6943 | 0.6823 | 0.1809 | 0.1131 |
| 281.4574 | 3.0 | 3000 | 280.4852 | 0.7185 | 0.4524 | 2.1204 | 0.7185 | 0.7250 | 0.1810 | 0.1008 |
| 279.8457 | 4.0 | 4000 | 278.7610 | 0.769 | 0.3964 | 2.0520 | 0.769 | 0.7685 | 0.1789 | 0.0775 |
| 279.1982 | 5.0 | 5000 | 278.0784 | 0.792 | 0.3522 | 1.9832 | 0.792 | 0.7915 | 0.1570 | 0.0670 |
| 278.1353 | 6.0 | 6000 | 277.1822 | 0.8135 | 0.3198 | 1.8943 | 0.8135 | 0.8160 | 0.1427 | 0.0547 |
| 277.4303 | 7.0 | 7000 | 275.9198 | 0.8193 | 0.3170 | 1.9321 | 0.8193 | 0.8203 | 0.1453 | 0.0589 |
| 276.2535 | 8.0 | 8000 | 274.8677 | 0.8273 | 0.3063 | 1.8543 | 0.8273 | 0.8266 | 0.1404 | 0.0538 |
| 275.1405 | 9.0 | 9000 | 273.8240 | 0.8345 | 0.2905 | 1.8312 | 0.8345 | 0.8362 | 0.1369 | 0.0525 |
| 274.3982 | 10.0 | 10000 | 273.2765 | 0.835 | 0.2892 | 1.8405 | 0.835 | 0.8362 | 0.1363 | 0.0512 |
| 272.9251 | 11.0 | 11000 | 272.4844 | 0.8455 | 0.2730 | 1.8874 | 0.8455 | 0.8468 | 0.1277 | 0.0478 |
| 272.1662 | 12.0 | 12000 | 271.4586 | 0.8373 | 0.2923 | 1.8514 | 0.8373 | 0.8374 | 0.1396 | 0.0508 |
| 272.1504 | 13.0 | 13000 | 271.0098 | 0.8452 | 0.2765 | 1.8428 | 0.8452 | 0.8454 | 0.1304 | 0.0505 |
| 271.0841 | 14.0 | 14000 | 270.4739 | 0.8405 | 0.2884 | 1.8279 | 0.8405 | 0.8421 | 0.1368 | 0.0522 |
| 270.5412 | 15.0 | 15000 | 269.5290 | 0.843 | 0.2861 | 1.8339 | 0.843 | 0.8434 | 0.1375 | 0.0524 |
| 269.4117 | 16.0 | 16000 | 269.1779 | 0.842 | 0.2874 | 1.8357 | 0.842 | 0.8422 | 0.1383 | 0.0520 |
| 269.1644 | 17.0 | 17000 | 268.5929 | 0.8465 | 0.2743 | 1.8563 | 0.8465 | 0.8470 | 0.1333 | 0.0491 |
| 268.7355 | 18.0 | 18000 | 268.2595 | 0.8475 | 0.2790 | 1.8540 | 0.8475 | 0.8479 | 0.1345 | 0.0505 |
| 268.3442 | 19.0 | 19000 | 267.7969 | 0.8508 | 0.2749 | 1.8406 | 0.8508 | 0.8509 | 0.1307 | 0.0505 |
| 267.4279 | 20.0 | 20000 | 267.2394 | 0.844 | 0.2811 | 1.8676 | 0.844 | 0.8448 | 0.1384 | 0.0509 |
| 267.468 | 21.0 | 21000 | 267.0267 | 0.8525 | 0.2694 | 1.8311 | 0.8525 | 0.8534 | 0.1293 | 0.0519 |
| 266.6685 | 22.0 | 22000 | 266.3500 | 0.8485 | 0.2772 | 1.8471 | 0.8485 | 0.8487 | 0.1368 | 0.0507 |
| 266.4612 | 23.0 | 23000 | 265.8022 | 0.8433 | 0.2863 | 1.8363 | 0.8433 | 0.8441 | 0.1399 | 0.0536 |
| 266.3148 | 24.0 | 24000 | 265.7575 | 0.8488 | 0.2783 | 1.8835 | 0.8488 | 0.8495 | 0.1366 | 0.0518 |
| 265.0058 | 25.0 | 25000 | 265.1237 | 0.8468 | 0.2841 | 1.8232 | 0.8468 | 0.8476 | 0.1370 | 0.0555 |
| 265.3975 | 26.0 | 26000 | 265.0540 | 0.8518 | 0.2757 | 1.8747 | 0.8518 | 0.8525 | 0.1324 | 0.0527 |
| 265.4347 | 27.0 | 27000 | 264.8875 | 0.8502 | 0.2755 | 1.8525 | 0.8502 | 0.8509 | 0.1339 | 0.0515 |
| 264.4956 | 28.0 | 28000 | 264.4421 | 0.8448 | 0.2864 | 1.8596 | 0.8448 | 0.8457 | 0.1402 | 0.0535 |
| 264.3941 | 29.0 | 29000 | 264.0486 | 0.8472 | 0.2815 | 1.8533 | 0.8472 | 0.8480 | 0.1379 | 0.0538 |
| 264.138 | 30.0 | 30000 | 264.2021 | 0.8495 | 0.2772 | 1.8547 | 0.8495 | 0.8500 | 0.1363 | 0.0531 |
| 263.8278 | 31.0 | 31000 | 263.6598 | 0.8472 | 0.2840 | 1.8715 | 0.8472 | 0.8472 | 0.1393 | 0.0549 |
| 263.4683 | 32.0 | 32000 | 263.4160 | 0.8465 | 0.2820 | 1.8844 | 0.8465 | 0.8471 | 0.1389 | 0.0558 |
| 263.5281 | 33.0 | 33000 | 263.2498 | 0.851 | 0.2788 | 1.8720 | 0.851 | 0.8520 | 0.1361 | 0.0554 |
| 263.3538 | 34.0 | 34000 | 262.9030 | 0.8472 | 0.2839 | 1.9007 | 0.8472 | 0.8482 | 0.1393 | 0.0562 |
| 262.673 | 35.0 | 35000 | 262.9031 | 0.8452 | 0.2859 | 1.8754 | 0.8452 | 0.8463 | 0.1406 | 0.0564 |
| 262.9104 | 36.0 | 36000 | 262.8404 | 0.8468 | 0.2867 | 1.8730 | 0.8468 | 0.8478 | 0.1398 | 0.0561 |
| 262.9824 | 37.0 | 37000 | 262.8044 | 0.849 | 0.2810 | 1.8759 | 0.849 | 0.8494 | 0.1372 | 0.0524 |
| 262.2614 | 38.0 | 38000 | 262.8396 | 0.8458 | 0.2861 | 1.8657 | 0.8458 | 0.8468 | 0.1410 | 0.0548 |
| 262.2726 | 39.0 | 39000 | 262.3623 | 0.846 | 0.2833 | 1.8772 | 0.8460 | 0.8465 | 0.1405 | 0.0565 |
| 262.3102 | 40.0 | 40000 | 262.4073 | 0.8465 | 0.2831 | 1.8798 | 0.8465 | 0.8475 | 0.1395 | 0.0553 |
| 262.2994 | 41.0 | 41000 | 262.2219 | 0.8472 | 0.2836 | 1.8810 | 0.8472 | 0.8475 | 0.1399 | 0.0579 |
| 262.222 | 42.0 | 42000 | 262.4181 | 0.8472 | 0.2775 | 1.8712 | 0.8472 | 0.8482 | 0.1389 | 0.0552 |
| 261.6536 | 43.0 | 43000 | 262.2162 | 0.8465 | 0.2844 | 1.8668 | 0.8465 | 0.8479 | 0.1401 | 0.0565 |
| 261.9964 | 44.0 | 44000 | 262.1039 | 0.8472 | 0.2848 | 1.8718 | 0.8472 | 0.8481 | 0.1403 | 0.0590 |
| 261.4522 | 45.0 | 45000 | 261.7883 | 0.846 | 0.2868 | 1.8589 | 0.8460 | 0.8459 | 0.1419 | 0.0556 |
| 261.6668 | 46.0 | 46000 | 262.0215 | 0.8492 | 0.2822 | 1.8682 | 0.8492 | 0.8494 | 0.1385 | 0.0542 |
| 261.8742 | 47.0 | 47000 | 261.9067 | 0.847 | 0.2846 | 1.8765 | 0.847 | 0.8476 | 0.1403 | 0.0599 |
| 261.5992 | 48.0 | 48000 | 261.7719 | 0.8475 | 0.2820 | 1.8854 | 0.8475 | 0.8485 | 0.1401 | 0.0583 |
| 261.6406 | 49.0 | 49000 | 261.5148 | 0.846 | 0.2873 | 1.8737 | 0.8460 | 0.8466 | 0.1427 | 0.0598 |
| 261.9611 | 50.0 | 50000 | 261.8253 | 0.845 | 0.2896 | 1.8917 | 0.845 | 0.8458 | 0.1431 | 0.0597 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
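Both cards report calibration metrics (Brier loss, ECE) alongside accuracy. The following sketch shows one common way to compute these from softmax outputs; it assumes a multiclass sum-of-squares Brier score and equal-width confidence bins, since the card does not state the exact formulation behind its numbers.

```python
import numpy as np

def brier_and_ece(probs, labels, n_bins=10):
    """Brier score and expected calibration error from softmax probabilities.

    `probs` is an (n_samples, n_classes) array of predicted probabilities and
    `labels` an array of integer class ids. Equal-width confidence bins are
    assumed; the binning used for the ECE values in the card is not stated.
    """
    probs = np.asarray(probs, dtype=float)
    labels = np.asarray(labels, dtype=int)
    n_classes = probs.shape[1]

    # Multiclass Brier score: mean squared distance to the one-hot target.
    onehot = np.eye(n_classes)[labels]
    brier = np.mean(np.sum((probs - onehot) ** 2, axis=1))

    # ECE: bin predictions by confidence, then average |accuracy - confidence|
    # weighted by the fraction of samples falling in each bin.
    confidence = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidence > lo) & (confidence <= hi)
        if in_bin.any():
            bin_acc = (predictions[in_bin] == labels[in_bin]).mean()
            bin_conf = confidence[in_bin].mean()
            ece += in_bin.mean() * abs(bin_conf - bin_acc)
    return brier, ece
```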
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]