model_id (string, 7-105 chars) | model_card (string, 1-130k chars) | model_labels (list, 2-80k items)
---|---|---|
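Before the rows begin, a hedged aside on the `model_labels` column: for these checkpoints it appears to mirror the `id2label` mapping stored in each model's Hub config, and it can be reconstructed with a few lines of `transformers` code. The repo id below is taken from one of the rows in this dump; everything else is illustrative.

```python
from transformers import AutoConfig

# Sketch only: rebuild the model_labels entry for one row of this dump
# from the checkpoint's config on the Hugging Face Hub.
config = AutoConfig.from_pretrained("jordyvl/dit-base_tobacco")
labels = [config.id2label[i] for i in range(len(config.id2label))]
print(labels)  # expected to match the row's label list, e.g. ["adve", "email", ...]
```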
ahmedALM1221/convnextv2-base-22k-224-finetuned-eurosat-50
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# convnextv2-base-22k-224-finetuned-eurosat-50
This model is a fine-tuned version of [facebook/convnextv2-base-22k-224](https://huggingface.co/facebook/convnextv2-base-22k-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2875
- Accuracy: 0.9147
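As a usage sketch (not part of the original card), the checkpoint can be loaded for inference with the standard `transformers` image-classification pipeline; only the repo id comes from the card, and the image path is a placeholder.

```python
from transformers import pipeline

# Sketch only: classify a local image with the fine-tuned checkpoint.
# "example.jpg" is a placeholder path, not taken from the card.
classifier = pipeline(
    "image-classification",
    model="ahmedALM1221/convnextv2-base-22k-224-finetuned-eurosat-50",
)
print(classifier("example.jpg"))  # list of {"label": ..., "score": ...} dicts
```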
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.9
- num_epochs: 12
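For readers who want to set up a comparable run, a minimal sketch of how these values map onto `transformers.TrainingArguments` follows; the output directory is a placeholder and the data/model wiring is omitted, so only the listed hyperparameters come from the card.

```python
from transformers import TrainingArguments

# Sketch only: maps the hyperparameters listed above onto TrainingArguments.
# output_dir is a placeholder; dataset and model setup are not shown.
training_args = TrainingArguments(
    output_dir="convnextv2-base-22k-224-finetuned-eurosat-50",
    learning_rate=5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=2,  # 16 x 2 = effective train batch size of 32
    lr_scheduler_type="linear",
    warmup_ratio=0.9,
    num_train_epochs=12,
)
```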
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.9019 | 1.0 | 122 | 1.9510 | 0.1727 |
| 1.7782 | 2.0 | 244 | 1.8239 | 0.3073 |
| 1.6214 | 3.0 | 366 | 1.6121 | 0.4913 |
| 1.3495 | 4.0 | 488 | 1.3064 | 0.6238 |
| 1.0994 | 5.0 | 610 | 1.0243 | 0.7163 |
| 0.8866 | 6.0 | 732 | 0.8165 | 0.7564 |
| 0.7282 | 7.0 | 854 | 0.6637 | 0.7996 |
| 0.6211 | 8.0 | 976 | 0.5623 | 0.8160 |
| 0.5114 | 9.0 | 1098 | 0.4681 | 0.8551 |
| 0.3835 | 10.0 | 1220 | 0.3917 | 0.8787 |
| 0.3543 | 11.0 | 1342 | 0.3122 | 0.9013 |
| 0.3534 | 12.0 | 1464 | 0.2875 | 0.9147 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"akiec",
"bcc",
"bkl",
"df",
"mel",
"nv",
"vasc"
] |
vinson099/DatasetModel
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# DatasetModel
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6457
- Accuracy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 0.96 | 6 | 1.2651 | 0.99 |
| 1.3875 | 1.92 | 12 | 0.7931 | 1.0 |
| 1.3875 | 2.88 | 18 | 0.6457 | 1.0 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"daisy",
"dandelion",
"roses",
"sunflowers",
"tulips"
] |
jcm-art/hf_image_classification_tuning_pipeline
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# hf_image_classification_tuning_pipeline
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the food101 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5764
- Accuracy: 0.903
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.7113 | 0.99 | 62 | 2.4840 | 0.849 |
| 1.8024 | 2.0 | 125 | 1.7298 | 0.891 |
| 1.5532 | 2.98 | 186 | 1.5764 | 0.903 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"apple_pie",
"baby_back_ribs",
"bruschetta",
"waffles",
"caesar_salad",
"cannoli",
"caprese_salad",
"carrot_cake",
"ceviche",
"cheesecake",
"cheese_plate",
"chicken_curry",
"chicken_quesadilla",
"baklava",
"chicken_wings",
"chocolate_cake",
"chocolate_mousse",
"churros",
"clam_chowder",
"club_sandwich",
"crab_cakes",
"creme_brulee",
"croque_madame",
"cup_cakes",
"beef_carpaccio",
"deviled_eggs",
"donuts",
"dumplings",
"edamame",
"eggs_benedict",
"escargots",
"falafel",
"filet_mignon",
"fish_and_chips",
"foie_gras",
"beef_tartare",
"french_fries",
"french_onion_soup",
"french_toast",
"fried_calamari",
"fried_rice",
"frozen_yogurt",
"garlic_bread",
"gnocchi",
"greek_salad",
"grilled_cheese_sandwich",
"beet_salad",
"grilled_salmon",
"guacamole",
"gyoza",
"hamburger",
"hot_and_sour_soup",
"hot_dog",
"huevos_rancheros",
"hummus",
"ice_cream",
"lasagna",
"beignets",
"lobster_bisque",
"lobster_roll_sandwich",
"macaroni_and_cheese",
"macarons",
"miso_soup",
"mussels",
"nachos",
"omelette",
"onion_rings",
"oysters",
"bibimbap",
"pad_thai",
"paella",
"pancakes",
"panna_cotta",
"peking_duck",
"pho",
"pizza",
"pork_chop",
"poutine",
"prime_rib",
"bread_pudding",
"pulled_pork_sandwich",
"ramen",
"ravioli",
"red_velvet_cake",
"risotto",
"samosa",
"sashimi",
"scallops",
"seaweed_salad",
"shrimp_and_grits",
"breakfast_burrito",
"spaghetti_bolognese",
"spaghetti_carbonara",
"spring_rolls",
"steak",
"strawberry_shortcake",
"sushi",
"tacos",
"takoyaki",
"tiramisu",
"tuna_tartare"
] |
DunnBC22/dit-base-Document_Classification-RVL_CDIP
|
# dit-base-Document_Classification-RVL_CDIP
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base).
It achieves the following results on the evaluation set:
- Loss: 0.0786
- Accuracy: 0.9767
- F1
  - Weighted: 0.9768
  - Micro: 0.9767
  - Macro: 0.9154
- Recall
  - Weighted: 0.9767
  - Micro: 0.9767
  - Macro: 0.9019
- Precision
  - Weighted: 0.9771
  - Micro: 0.9767
  - Macro: 0.9314
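A hedged note on the weighted, micro, and macro variants reported above: they correspond to the `average` argument of scikit-learn's `f1_score` (and the analogous precision/recall functions). The toy sketch below is illustrative and does not use the card's data.

```python
from sklearn.metrics import f1_score

# Toy example, not the card's data: four documents, three classes.
y_true = [0, 1, 2, 2]
y_pred = [0, 1, 1, 2]

for average in ("weighted", "micro", "macro"):
    print(average, f1_score(y_true, y_pred, average=average))
```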
## Model description
For more information on how it was created, check out the following link: https://github.com/DunnBC22/Vision_Audio_and_Multimodal_Projects/blob/main/Document%20AI/Multiclass%20Classification/Document%20Classification%20-%20RVL-CDIP/Document%20Classification%20-%20RVL-CDIP.ipynb
## Intended uses & limitations
This model is intended to demonstrate my ability to solve a complex problem using technology.
## Training and evaluation data
Dataset Source: https://www.kaggle.com/datasets/achrafbribiche/document-classification
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Weighted F1 | Micro F1 | Macro F1 | Weighted Recall | Micro Recall | Macro Recall | Weighted Precision | Micro Precision | Macro Precision |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-----------:|:--------:|:--------:|:---------------:|:------------:|:------------:|:------------------:|:---------------:|:---------------:|
| 0.1535 | 1.0 | 208 | 0.1126 | 0.9622 | 0.9597 | 0.9622 | 0.5711 | 0.9622 | 0.9622 | 0.5925 | 0.9577 | 0.9622 | 0.5531 |
| 0.1195 | 2.0 | 416 | 0.0843 | 0.9738 | 0.9736 | 0.9738 | 0.8502 | 0.9738 | 0.9738 | 0.8037 | 0.9741 | 0.9738 | 0.9287 |
| 0.0979 | 3.0 | 624 | 0.0786 | 0.9767 | 0.9768 | 0.9767 | 0.9154 | 0.9767 | 0.9767 | 0.9019 | 0.9771 | 0.9767 | 0.9314 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0
- Datasets 2.11.0
- Tokenizers 0.13.3
|
[
"receipt",
"other",
"invoice"
] |
cherrue/RandomCrop_Rescale_epoch_3_learning_rate_5e_5_decay_0_01
|
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# cherrue/pricetag_classifier
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 1.0546
- Validation Loss: 1.2226
- Train Accuracy: 0.3846
- Epoch: 2
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 5e-05, 'decay_steps': 1251, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
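As a rough sketch (not the author's exact training script), an equivalent optimizer can be rebuilt with `transformers.create_optimizer`, which pairs AdamWeightDecay with a polynomial (here linear) decay schedule; the warmup value is an assumption, since the serialized config above does not record one.

```python
from transformers import create_optimizer

# Sketch only: mirrors the serialized AdamWeightDecay config above
# (PolynomialDecay from 5e-5 to 0.0 over 1251 steps, weight decay 0.01).
# num_warmup_steps=0 is an assumption, not taken from the card.
optimizer, lr_schedule = create_optimizer(
    init_lr=5e-5,
    num_train_steps=1251,
    num_warmup_steps=0,
    weight_decay_rate=0.01,
)
```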
### Training results
| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 1.3379 | 1.2276 | 0.5128 | 0 |
| 1.1973 | 1.1561 | 0.4615 | 1 |
| 1.0546 | 1.2226 | 0.3846 | 2 |
### Framework versions
- Transformers 4.28.0
- TensorFlow 2.12.0
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"template1_one",
"template2_one_left_label",
"template3_two",
"template5_one_sold_out"
] |
arunboss/test_triage
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# test_triage
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the Test dataset.
It achieves the following results on the evaluation set:
- Loss: 1.9758
- Accuracy: 0.5008
## Model description
This is a basic skin disease recognition model, without the specific disease information for now. I just wanted to test the platform's hosting capabilities and check out other features.
## Intended uses & limitations
For now, it's just a test environment. We have the basic data and processing pipeline in place to push models here. The future goal is to open-source the dataset and allow the community to fine-tune the skin identification and triaging module, free for both commercial and non-commercial use.
## Training and evaluation data
We have many open and closed datasets that have been compiled and annotated over the years.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 6
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 3.3471 | 1.0 | 151 | 3.2152 | 0.2452 |
| 2.7313 | 2.0 | 303 | 2.5291 | 0.3817 |
| 2.48 | 3.0 | 454 | 2.2459 | 0.4413 |
| 2.2192 | 4.0 | 606 | 2.0968 | 0.4702 |
| 2.0479 | 5.0 | 757 | 2.0026 | 0.4897 |
| 1.9702 | 5.98 | 906 | 1.9758 | 0.5008 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"1",
"100",
"102",
"104",
"105",
"108",
"109",
"113",
"115",
"117",
"118",
"119",
"121",
"122",
"124",
"126",
"128",
"129",
"131",
"132",
"133",
"136",
"14",
"142",
"143",
"148",
"15",
"153",
"154",
"155",
"157",
"158",
"159",
"16",
"172",
"173",
"18",
"181",
"182",
"183",
"184",
"189",
"19",
"20",
"201",
"202",
"213",
"216",
"219",
"22",
"225",
"23",
"233",
"234",
"237",
"238",
"239",
"24",
"240",
"242",
"244",
"249",
"250",
"251",
"252",
"253",
"27",
"274",
"275",
"3",
"31",
"32",
"34",
"36",
"37",
"38",
"39",
"41",
"42",
"44",
"45",
"46",
"47",
"50",
"51",
"52",
"53",
"54",
"55",
"57",
"58",
"59",
"60",
"63",
"64",
"66",
"67",
"68",
"70",
"71",
"72",
"73",
"75",
"77",
"78",
"79",
"81",
"82",
"83",
"84",
"86",
"89",
"92",
"93",
"98"
] |
jordyvl/swin-base_tobacco
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-base_tobacco
This model is a fine-tuned version of [microsoft/swinv2-base-patch4-window8-256](https://huggingface.co/microsoft/swinv2-base-patch4-window8-256) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6059
- Accuracy: 0.835
- Brier Loss: 0.2576
- Nll: 1.2824
- F1 Micro: 0.835
- F1 Macro: 0.8348
- Ece: 0.1310
- Aurc: 0.0387
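The card tracks calibration metrics alongside accuracy. As an illustration of what the Brier Loss column measures (one common multi-class convention, not necessarily the author's exact implementation), it is the mean squared error between the predicted probability vector and the one-hot label:

```python
import numpy as np

def brier_score(probs: np.ndarray, labels: np.ndarray) -> float:
    """Mean over samples of the squared error between predicted probabilities
    (n_samples, n_classes) and one-hot encoded integer labels (n_samples,)."""
    one_hot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - one_hot) ** 2, axis=1)))

# Toy example: a fairly confident correct prediction vs. a confident wrong one.
print(brier_score(np.array([[0.9, 0.1], [0.2, 0.8]]), np.array([0, 0])))
```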
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 2.3165 | 0.11 | 0.9031 | 7.6310 | 0.11 | 0.0604 | 0.2004 | 0.8718 |
| No log | 1.96 | 6 | 2.2894 | 0.155 | 0.8975 | 6.8146 | 0.155 | 0.0944 | 0.2230 | 0.8555 |
| No log | 2.96 | 9 | 2.2481 | 0.215 | 0.8888 | 5.1480 | 0.2150 | 0.1472 | 0.2492 | 0.8119 |
| No log | 3.96 | 12 | 2.1955 | 0.275 | 0.8770 | 4.2879 | 0.275 | 0.1939 | 0.2844 | 0.6562 |
| No log | 4.96 | 15 | 2.1326 | 0.36 | 0.8619 | 3.8809 | 0.36 | 0.2199 | 0.3357 | 0.4962 |
| No log | 5.96 | 18 | 2.0568 | 0.375 | 0.8415 | 3.9254 | 0.375 | 0.2309 | 0.3377 | 0.4471 |
| No log | 6.96 | 21 | 1.9639 | 0.375 | 0.8126 | 3.8158 | 0.375 | 0.2319 | 0.3195 | 0.4534 |
| No log | 7.96 | 24 | 1.8621 | 0.375 | 0.7781 | 3.3244 | 0.375 | 0.2456 | 0.2924 | 0.4833 |
| No log | 8.96 | 27 | 1.7100 | 0.44 | 0.7273 | 2.8211 | 0.44 | 0.3136 | 0.3188 | 0.3515 |
| No log | 9.96 | 30 | 1.5377 | 0.535 | 0.6611 | 2.4560 | 0.535 | 0.4259 | 0.3557 | 0.2259 |
| No log | 10.96 | 33 | 1.3588 | 0.595 | 0.5825 | 2.3216 | 0.595 | 0.4933 | 0.2986 | 0.1795 |
| No log | 11.96 | 36 | 1.2072 | 0.62 | 0.5215 | 2.3831 | 0.62 | 0.5352 | 0.2927 | 0.1541 |
| No log | 12.96 | 39 | 1.0766 | 0.67 | 0.4715 | 2.2078 | 0.67 | 0.5966 | 0.2727 | 0.1219 |
| No log | 13.96 | 42 | 0.9699 | 0.675 | 0.4408 | 1.8028 | 0.675 | 0.5961 | 0.2568 | 0.1215 |
| No log | 14.96 | 45 | 0.8660 | 0.68 | 0.4011 | 1.4772 | 0.68 | 0.5978 | 0.2176 | 0.1014 |
| No log | 15.96 | 48 | 0.7907 | 0.725 | 0.3709 | 1.4755 | 0.7250 | 0.6768 | 0.2055 | 0.0904 |
| No log | 16.96 | 51 | 0.7362 | 0.75 | 0.3501 | 1.3822 | 0.75 | 0.7077 | 0.2042 | 0.0806 |
| No log | 17.96 | 54 | 0.6867 | 0.76 | 0.3322 | 1.3191 | 0.76 | 0.7177 | 0.1926 | 0.0724 |
| No log | 18.96 | 57 | 0.6572 | 0.78 | 0.3203 | 1.2996 | 0.78 | 0.7424 | 0.1920 | 0.0699 |
| No log | 19.96 | 60 | 0.6074 | 0.785 | 0.2967 | 1.3136 | 0.785 | 0.7686 | 0.1705 | 0.0589 |
| No log | 20.96 | 63 | 0.6050 | 0.795 | 0.2956 | 1.3729 | 0.795 | 0.7793 | 0.1762 | 0.0600 |
| No log | 21.96 | 66 | 0.5748 | 0.83 | 0.2785 | 1.3558 | 0.83 | 0.8113 | 0.1744 | 0.0529 |
| No log | 22.96 | 69 | 0.5722 | 0.815 | 0.2756 | 1.3937 | 0.815 | 0.8097 | 0.1767 | 0.0489 |
| No log | 23.96 | 72 | 0.5689 | 0.795 | 0.2750 | 1.3641 | 0.795 | 0.7947 | 0.1452 | 0.0539 |
| No log | 24.96 | 75 | 0.5536 | 0.825 | 0.2718 | 1.2773 | 0.825 | 0.8068 | 0.1698 | 0.0509 |
| No log | 25.96 | 78 | 0.5464 | 0.805 | 0.2726 | 1.2772 | 0.805 | 0.7888 | 0.1499 | 0.0487 |
| No log | 26.96 | 81 | 0.5455 | 0.81 | 0.2626 | 1.3607 | 0.81 | 0.8080 | 0.1750 | 0.0471 |
| No log | 27.96 | 84 | 0.5542 | 0.815 | 0.2609 | 1.3643 | 0.815 | 0.8089 | 0.1521 | 0.0466 |
| No log | 28.96 | 87 | 0.5480 | 0.82 | 0.2710 | 1.2996 | 0.82 | 0.8227 | 0.1422 | 0.0468 |
| No log | 29.96 | 90 | 0.5507 | 0.83 | 0.2654 | 1.3425 | 0.83 | 0.8320 | 0.1491 | 0.0475 |
| No log | 30.96 | 93 | 0.5608 | 0.815 | 0.2591 | 1.4365 | 0.815 | 0.8145 | 0.1405 | 0.0442 |
| No log | 31.96 | 96 | 0.5473 | 0.825 | 0.2622 | 1.3600 | 0.825 | 0.8198 | 0.1339 | 0.0424 |
| No log | 32.96 | 99 | 0.5296 | 0.83 | 0.2588 | 1.2906 | 0.83 | 0.8311 | 0.1373 | 0.0416 |
| No log | 33.96 | 102 | 0.5370 | 0.82 | 0.2522 | 1.2895 | 0.82 | 0.8214 | 0.1428 | 0.0436 |
| No log | 34.96 | 105 | 0.5578 | 0.8 | 0.2707 | 1.3364 | 0.8000 | 0.8056 | 0.1708 | 0.0481 |
| No log | 35.96 | 108 | 0.5193 | 0.825 | 0.2484 | 1.2883 | 0.825 | 0.8250 | 0.1316 | 0.0405 |
| No log | 36.96 | 111 | 0.5306 | 0.815 | 0.2569 | 1.2856 | 0.815 | 0.8093 | 0.1344 | 0.0420 |
| No log | 37.96 | 114 | 0.5824 | 0.815 | 0.2729 | 1.3994 | 0.815 | 0.8182 | 0.1418 | 0.0479 |
| No log | 38.96 | 117 | 0.5486 | 0.82 | 0.2549 | 1.2974 | 0.82 | 0.8259 | 0.1312 | 0.0443 |
| No log | 39.96 | 120 | 0.5421 | 0.83 | 0.2545 | 1.3575 | 0.83 | 0.8316 | 0.1491 | 0.0415 |
| No log | 40.96 | 123 | 0.5477 | 0.81 | 0.2700 | 1.3251 | 0.81 | 0.8166 | 0.1499 | 0.0418 |
| No log | 41.96 | 126 | 0.5404 | 0.825 | 0.2553 | 1.3186 | 0.825 | 0.8309 | 0.1519 | 0.0414 |
| No log | 42.96 | 129 | 0.5698 | 0.83 | 0.2598 | 1.3249 | 0.83 | 0.8386 | 0.1396 | 0.0452 |
| No log | 43.96 | 132 | 0.5538 | 0.815 | 0.2605 | 1.3122 | 0.815 | 0.8212 | 0.1410 | 0.0430 |
| No log | 44.96 | 135 | 0.5369 | 0.81 | 0.2586 | 1.3030 | 0.81 | 0.8141 | 0.1404 | 0.0409 |
| No log | 45.96 | 138 | 0.5614 | 0.825 | 0.2615 | 1.3881 | 0.825 | 0.8278 | 0.1404 | 0.0427 |
| No log | 46.96 | 141 | 0.5636 | 0.825 | 0.2601 | 1.4077 | 0.825 | 0.8286 | 0.1345 | 0.0421 |
| No log | 47.96 | 144 | 0.5783 | 0.83 | 0.2684 | 1.3350 | 0.83 | 0.8304 | 0.1373 | 0.0422 |
| No log | 48.96 | 147 | 0.5749 | 0.825 | 0.2663 | 1.3167 | 0.825 | 0.8241 | 0.1308 | 0.0424 |
| No log | 49.96 | 150 | 0.5802 | 0.82 | 0.2692 | 1.3191 | 0.82 | 0.8194 | 0.1217 | 0.0461 |
| No log | 50.96 | 153 | 0.5696 | 0.82 | 0.2639 | 1.3330 | 0.82 | 0.8175 | 0.1372 | 0.0429 |
| No log | 51.96 | 156 | 0.5827 | 0.84 | 0.2656 | 1.3975 | 0.8400 | 0.8444 | 0.1378 | 0.0426 |
| No log | 52.96 | 159 | 0.5725 | 0.805 | 0.2669 | 1.3172 | 0.805 | 0.7997 | 0.1459 | 0.0422 |
| No log | 53.96 | 162 | 0.5769 | 0.805 | 0.2691 | 1.3111 | 0.805 | 0.7991 | 0.1457 | 0.0434 |
| No log | 54.96 | 165 | 0.5883 | 0.805 | 0.2647 | 1.4581 | 0.805 | 0.8104 | 0.1405 | 0.0430 |
| No log | 55.96 | 168 | 0.5834 | 0.835 | 0.2543 | 1.4586 | 0.835 | 0.8349 | 0.1346 | 0.0407 |
| No log | 56.96 | 171 | 0.5875 | 0.835 | 0.2543 | 1.3211 | 0.835 | 0.8358 | 0.1320 | 0.0402 |
| No log | 57.96 | 174 | 0.5741 | 0.84 | 0.2533 | 1.3027 | 0.8400 | 0.8405 | 0.1290 | 0.0395 |
| No log | 58.96 | 177 | 0.5737 | 0.82 | 0.2624 | 1.3104 | 0.82 | 0.8167 | 0.1437 | 0.0396 |
| No log | 59.96 | 180 | 0.5796 | 0.815 | 0.2603 | 1.4021 | 0.815 | 0.8154 | 0.1286 | 0.0406 |
| No log | 60.96 | 183 | 0.5711 | 0.83 | 0.2553 | 1.4016 | 0.83 | 0.8306 | 0.1272 | 0.0390 |
| No log | 61.96 | 186 | 0.5670 | 0.825 | 0.2591 | 1.3136 | 0.825 | 0.8263 | 0.1429 | 0.0406 |
| No log | 62.96 | 189 | 0.5736 | 0.825 | 0.2592 | 1.3077 | 0.825 | 0.8231 | 0.1244 | 0.0417 |
| No log | 63.96 | 192 | 0.5730 | 0.83 | 0.2531 | 1.3007 | 0.83 | 0.8274 | 0.1275 | 0.0401 |
| No log | 64.96 | 195 | 0.6130 | 0.82 | 0.2687 | 1.3014 | 0.82 | 0.8246 | 0.1484 | 0.0414 |
| No log | 65.96 | 198 | 0.6023 | 0.825 | 0.2596 | 1.3107 | 0.825 | 0.8254 | 0.1373 | 0.0404 |
| No log | 66.96 | 201 | 0.5923 | 0.825 | 0.2599 | 1.3078 | 0.825 | 0.8263 | 0.1312 | 0.0411 |
| No log | 67.96 | 204 | 0.6197 | 0.81 | 0.2766 | 1.3046 | 0.81 | 0.8035 | 0.1373 | 0.0451 |
| No log | 68.96 | 207 | 0.5918 | 0.805 | 0.2651 | 1.3019 | 0.805 | 0.8044 | 0.1407 | 0.0404 |
| No log | 69.96 | 210 | 0.5908 | 0.835 | 0.2544 | 1.3286 | 0.835 | 0.8344 | 0.1354 | 0.0394 |
| No log | 70.96 | 213 | 0.5941 | 0.83 | 0.2558 | 1.3019 | 0.83 | 0.8324 | 0.1402 | 0.0401 |
| No log | 71.96 | 216 | 0.5994 | 0.82 | 0.2588 | 1.2998 | 0.82 | 0.8215 | 0.1297 | 0.0411 |
| No log | 72.96 | 219 | 0.6083 | 0.825 | 0.2638 | 1.3525 | 0.825 | 0.8257 | 0.1379 | 0.0410 |
| No log | 73.96 | 222 | 0.5980 | 0.825 | 0.2609 | 1.3515 | 0.825 | 0.8295 | 0.1457 | 0.0394 |
| No log | 74.96 | 225 | 0.5945 | 0.83 | 0.2568 | 1.3670 | 0.83 | 0.8302 | 0.1324 | 0.0390 |
| No log | 75.96 | 228 | 0.5982 | 0.845 | 0.2535 | 1.4552 | 0.845 | 0.8476 | 0.1246 | 0.0390 |
| No log | 76.96 | 231 | 0.5850 | 0.83 | 0.2507 | 1.3700 | 0.83 | 0.8287 | 0.1348 | 0.0391 |
| No log | 77.96 | 234 | 0.5859 | 0.825 | 0.2566 | 1.2917 | 0.825 | 0.8232 | 0.1309 | 0.0394 |
| No log | 78.96 | 237 | 0.6085 | 0.835 | 0.2630 | 1.3516 | 0.835 | 0.8370 | 0.1329 | 0.0420 |
| No log | 79.96 | 240 | 0.6108 | 0.835 | 0.2621 | 1.2943 | 0.835 | 0.8370 | 0.1395 | 0.0414 |
| No log | 80.96 | 243 | 0.6061 | 0.81 | 0.2596 | 1.2898 | 0.81 | 0.8119 | 0.1313 | 0.0413 |
| No log | 81.96 | 246 | 0.6006 | 0.815 | 0.2564 | 1.2952 | 0.815 | 0.8122 | 0.1453 | 0.0406 |
| No log | 82.96 | 249 | 0.6050 | 0.825 | 0.2577 | 1.2998 | 0.825 | 0.8283 | 0.1271 | 0.0400 |
| No log | 83.96 | 252 | 0.6197 | 0.835 | 0.2658 | 1.3021 | 0.835 | 0.8386 | 0.1222 | 0.0414 |
| No log | 84.96 | 255 | 0.6086 | 0.825 | 0.2651 | 1.2889 | 0.825 | 0.8251 | 0.1207 | 0.0404 |
| No log | 85.96 | 258 | 0.5965 | 0.83 | 0.2587 | 1.2929 | 0.83 | 0.8304 | 0.1323 | 0.0397 |
| No log | 86.96 | 261 | 0.5897 | 0.82 | 0.2550 | 1.2980 | 0.82 | 0.8171 | 0.1372 | 0.0394 |
| No log | 87.96 | 264 | 0.5887 | 0.83 | 0.2551 | 1.2950 | 0.83 | 0.8290 | 0.1251 | 0.0391 |
| No log | 88.96 | 267 | 0.5958 | 0.82 | 0.2598 | 1.2871 | 0.82 | 0.8180 | 0.1319 | 0.0392 |
| No log | 89.96 | 270 | 0.6088 | 0.82 | 0.2658 | 1.2805 | 0.82 | 0.8184 | 0.1513 | 0.0396 |
| No log | 90.96 | 273 | 0.6192 | 0.825 | 0.2692 | 1.2772 | 0.825 | 0.8263 | 0.1258 | 0.0402 |
| No log | 91.96 | 276 | 0.6230 | 0.825 | 0.2689 | 1.2777 | 0.825 | 0.8263 | 0.1416 | 0.0404 |
| No log | 92.96 | 279 | 0.6223 | 0.83 | 0.2667 | 1.2792 | 0.83 | 0.8318 | 0.1296 | 0.0401 |
| No log | 93.96 | 282 | 0.6145 | 0.83 | 0.2627 | 1.2797 | 0.83 | 0.8321 | 0.1265 | 0.0394 |
| No log | 94.96 | 285 | 0.6105 | 0.83 | 0.2610 | 1.2807 | 0.83 | 0.8321 | 0.1352 | 0.0392 |
| No log | 95.96 | 288 | 0.6095 | 0.83 | 0.2602 | 1.2815 | 0.83 | 0.8321 | 0.1360 | 0.0390 |
| No log | 96.96 | 291 | 0.6076 | 0.835 | 0.2590 | 1.2824 | 0.835 | 0.8348 | 0.1255 | 0.0389 |
| No log | 97.96 | 294 | 0.6060 | 0.835 | 0.2578 | 1.2827 | 0.835 | 0.8348 | 0.1281 | 0.0388 |
| No log | 98.96 | 297 | 0.6058 | 0.835 | 0.2575 | 1.2825 | 0.835 | 0.8348 | 0.1410 | 0.0387 |
| No log | 99.96 | 300 | 0.6059 | 0.835 | 0.2576 | 1.2824 | 0.835 | 0.8348 | 0.1310 | 0.0387 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-base_tobacco
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base_tobacco
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7442
- Accuracy: 0.815
- Brier Loss: 0.3076
- Nll: 1.1877
- F1 Micro: 0.815
- F1 Macro: 0.7942
- Ece: 0.2072
- Aurc: 0.0734
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 6 | 2.3082 | 0.085 | 0.9012 | 6.2672 | 0.085 | 0.0735 | 0.1625 | 0.9316 |
| No log | 1.96 | 12 | 2.2872 | 0.14 | 0.8970 | 4.8533 | 0.14 | 0.0885 | 0.1958 | 0.8912 |
| No log | 2.96 | 18 | 2.2562 | 0.225 | 0.8906 | 4.5559 | 0.225 | 0.1319 | 0.2527 | 0.8101 |
| No log | 3.96 | 24 | 2.2107 | 0.265 | 0.8808 | 4.3151 | 0.265 | 0.1614 | 0.2710 | 0.6990 |
| No log | 4.96 | 30 | 2.1433 | 0.3 | 0.8654 | 4.1825 | 0.3 | 0.1615 | 0.2943 | 0.6102 |
| No log | 5.96 | 36 | 2.0764 | 0.325 | 0.8493 | 3.6715 | 0.325 | 0.1696 | 0.3160 | 0.4502 |
| No log | 6.96 | 42 | 2.0012 | 0.375 | 0.8287 | 3.5534 | 0.375 | 0.1901 | 0.3542 | 0.3791 |
| No log | 7.96 | 48 | 1.9197 | 0.41 | 0.8041 | 3.3582 | 0.41 | 0.2136 | 0.3528 | 0.3342 |
| No log | 8.96 | 54 | 1.8379 | 0.45 | 0.7767 | 3.1997 | 0.45 | 0.2279 | 0.3709 | 0.2872 |
| No log | 9.96 | 60 | 1.7538 | 0.535 | 0.7475 | 2.9586 | 0.535 | 0.3755 | 0.4024 | 0.2508 |
| No log | 10.96 | 66 | 1.6634 | 0.57 | 0.7132 | 2.6969 | 0.57 | 0.4025 | 0.4182 | 0.2183 |
| No log | 11.96 | 72 | 1.5952 | 0.61 | 0.6842 | 2.4519 | 0.61 | 0.4427 | 0.4153 | 0.1882 |
| No log | 12.96 | 78 | 1.5205 | 0.655 | 0.6554 | 1.9703 | 0.655 | 0.5306 | 0.4572 | 0.1651 |
| No log | 13.96 | 84 | 1.4566 | 0.67 | 0.6308 | 1.7832 | 0.67 | 0.5458 | 0.4240 | 0.1514 |
| No log | 14.96 | 90 | 1.4009 | 0.685 | 0.6074 | 1.8217 | 0.685 | 0.5641 | 0.4221 | 0.1406 |
| No log | 15.96 | 96 | 1.3520 | 0.7 | 0.5866 | 1.6223 | 0.7 | 0.5896 | 0.4107 | 0.1304 |
| No log | 16.96 | 102 | 1.3220 | 0.7 | 0.5741 | 1.4452 | 0.7 | 0.5865 | 0.4029 | 0.1225 |
| No log | 17.96 | 108 | 1.2764 | 0.705 | 0.5522 | 1.4534 | 0.705 | 0.6076 | 0.3805 | 0.1269 |
| No log | 18.96 | 114 | 1.2448 | 0.72 | 0.5378 | 1.4843 | 0.72 | 0.6321 | 0.3724 | 0.1193 |
| No log | 19.96 | 120 | 1.2049 | 0.74 | 0.5210 | 1.2527 | 0.74 | 0.6471 | 0.3947 | 0.1039 |
| No log | 20.96 | 126 | 1.1712 | 0.74 | 0.5057 | 1.1657 | 0.74 | 0.6464 | 0.3833 | 0.0955 |
| No log | 21.96 | 132 | 1.1453 | 0.735 | 0.4936 | 1.0277 | 0.735 | 0.6597 | 0.3628 | 0.1015 |
| No log | 22.96 | 138 | 1.1094 | 0.745 | 0.4771 | 1.0003 | 0.745 | 0.6667 | 0.3841 | 0.0938 |
| No log | 23.96 | 144 | 1.0803 | 0.75 | 0.4628 | 1.0334 | 0.75 | 0.6972 | 0.3490 | 0.0891 |
| No log | 24.96 | 150 | 1.0658 | 0.755 | 0.4559 | 1.0092 | 0.755 | 0.6937 | 0.3536 | 0.0925 |
| No log | 25.96 | 156 | 1.0345 | 0.765 | 0.4423 | 0.9971 | 0.765 | 0.7356 | 0.3661 | 0.0852 |
| No log | 26.96 | 162 | 1.0133 | 0.76 | 0.4323 | 0.9206 | 0.76 | 0.7302 | 0.3343 | 0.0791 |
| No log | 27.96 | 168 | 0.9927 | 0.775 | 0.4225 | 0.9015 | 0.775 | 0.7433 | 0.3457 | 0.0794 |
| No log | 28.96 | 174 | 0.9789 | 0.765 | 0.4152 | 0.8946 | 0.765 | 0.7282 | 0.3337 | 0.0818 |
| No log | 29.96 | 180 | 0.9509 | 0.78 | 0.4025 | 0.9323 | 0.78 | 0.7565 | 0.3135 | 0.0733 |
| No log | 30.96 | 186 | 0.9388 | 0.79 | 0.3968 | 0.8616 | 0.79 | 0.7642 | 0.3353 | 0.0694 |
| No log | 31.96 | 192 | 0.9316 | 0.78 | 0.3927 | 0.8636 | 0.78 | 0.7588 | 0.3426 | 0.0739 |
| No log | 32.96 | 198 | 0.9197 | 0.79 | 0.3876 | 0.8581 | 0.79 | 0.7656 | 0.3042 | 0.0800 |
| No log | 33.96 | 204 | 0.9020 | 0.775 | 0.3792 | 0.8458 | 0.775 | 0.7543 | 0.2872 | 0.0744 |
| No log | 34.96 | 210 | 0.8833 | 0.785 | 0.3694 | 0.8288 | 0.785 | 0.7619 | 0.3305 | 0.0663 |
| No log | 35.96 | 216 | 0.8684 | 0.795 | 0.3624 | 0.8462 | 0.795 | 0.7779 | 0.3184 | 0.0690 |
| No log | 36.96 | 222 | 0.8608 | 0.79 | 0.3584 | 0.8860 | 0.79 | 0.7707 | 0.2790 | 0.0709 |
| No log | 37.96 | 228 | 0.8586 | 0.79 | 0.3587 | 0.8954 | 0.79 | 0.7724 | 0.3153 | 0.0754 |
| No log | 38.96 | 234 | 0.8470 | 0.79 | 0.3515 | 0.8822 | 0.79 | 0.7684 | 0.3075 | 0.0726 |
| No log | 39.96 | 240 | 0.8288 | 0.79 | 0.3434 | 0.8192 | 0.79 | 0.7700 | 0.2700 | 0.0648 |
| No log | 40.96 | 246 | 0.8255 | 0.8 | 0.3426 | 0.8191 | 0.8000 | 0.7808 | 0.2760 | 0.0727 |
| No log | 41.96 | 252 | 0.8247 | 0.8 | 0.3411 | 0.8876 | 0.8000 | 0.7737 | 0.2903 | 0.0701 |
| No log | 42.96 | 258 | 0.8196 | 0.8 | 0.3389 | 0.8841 | 0.8000 | 0.7786 | 0.2768 | 0.0727 |
| No log | 43.96 | 264 | 0.8118 | 0.805 | 0.3351 | 0.9510 | 0.805 | 0.7806 | 0.2620 | 0.0685 |
| No log | 44.96 | 270 | 0.8127 | 0.795 | 0.3352 | 1.0119 | 0.795 | 0.7705 | 0.2650 | 0.0707 |
| No log | 45.96 | 276 | 0.7968 | 0.8 | 0.3285 | 1.0041 | 0.8000 | 0.7788 | 0.2734 | 0.0665 |
| No log | 46.96 | 282 | 0.7946 | 0.81 | 0.3274 | 1.0647 | 0.81 | 0.7921 | 0.2765 | 0.0703 |
| No log | 47.96 | 288 | 0.7996 | 0.805 | 0.3298 | 1.0108 | 0.805 | 0.7867 | 0.2772 | 0.0714 |
| No log | 48.96 | 294 | 0.7971 | 0.805 | 0.3283 | 1.0728 | 0.805 | 0.7816 | 0.2756 | 0.0732 |
| No log | 49.96 | 300 | 0.7950 | 0.8 | 0.3278 | 1.0694 | 0.8000 | 0.7758 | 0.2540 | 0.0750 |
| No log | 50.96 | 306 | 0.7826 | 0.8 | 0.3222 | 1.0211 | 0.8000 | 0.7784 | 0.2596 | 0.0643 |
| No log | 51.96 | 312 | 0.7933 | 0.795 | 0.3273 | 1.0680 | 0.795 | 0.7712 | 0.2619 | 0.0764 |
| No log | 52.96 | 318 | 0.7883 | 0.805 | 0.3247 | 1.0730 | 0.805 | 0.7834 | 0.2426 | 0.0712 |
| No log | 53.96 | 324 | 0.7811 | 0.815 | 0.3219 | 1.0623 | 0.815 | 0.7913 | 0.2259 | 0.0716 |
| No log | 54.96 | 330 | 0.7784 | 0.815 | 0.3203 | 1.0657 | 0.815 | 0.7917 | 0.2797 | 0.0690 |
| No log | 55.96 | 336 | 0.7827 | 0.81 | 0.3219 | 1.0770 | 0.81 | 0.7885 | 0.2491 | 0.0752 |
| No log | 56.96 | 342 | 0.7701 | 0.815 | 0.3166 | 1.0614 | 0.815 | 0.7913 | 0.2664 | 0.0689 |
| No log | 57.96 | 348 | 0.7748 | 0.815 | 0.3187 | 1.0699 | 0.815 | 0.7913 | 0.2487 | 0.0722 |
| No log | 58.96 | 354 | 0.7669 | 0.815 | 0.3155 | 1.0607 | 0.815 | 0.7919 | 0.2482 | 0.0685 |
| No log | 59.96 | 360 | 0.7721 | 0.81 | 0.3180 | 1.0746 | 0.81 | 0.7859 | 0.2385 | 0.0730 |
| No log | 60.96 | 366 | 0.7645 | 0.815 | 0.3145 | 1.0650 | 0.815 | 0.7913 | 0.2468 | 0.0688 |
| No log | 61.96 | 372 | 0.7672 | 0.815 | 0.3157 | 1.0782 | 0.815 | 0.7913 | 0.2228 | 0.0728 |
| No log | 62.96 | 378 | 0.7625 | 0.82 | 0.3139 | 1.0673 | 0.82 | 0.8025 | 0.2323 | 0.0688 |
| No log | 63.96 | 384 | 0.7627 | 0.81 | 0.3144 | 1.1893 | 0.81 | 0.7892 | 0.2236 | 0.0710 |
| No log | 64.96 | 390 | 0.7629 | 0.815 | 0.3141 | 1.1934 | 0.815 | 0.7972 | 0.2277 | 0.0707 |
| No log | 65.96 | 396 | 0.7569 | 0.81 | 0.3118 | 1.1003 | 0.81 | 0.7866 | 0.2577 | 0.0696 |
| No log | 66.96 | 402 | 0.7619 | 0.815 | 0.3136 | 1.1365 | 0.815 | 0.7919 | 0.2562 | 0.0732 |
| No log | 67.96 | 408 | 0.7565 | 0.815 | 0.3114 | 1.1325 | 0.815 | 0.7919 | 0.2467 | 0.0694 |
| No log | 68.96 | 414 | 0.7558 | 0.815 | 0.3117 | 1.1895 | 0.815 | 0.7972 | 0.2453 | 0.0705 |
| No log | 69.96 | 420 | 0.7550 | 0.815 | 0.3111 | 1.1924 | 0.815 | 0.7972 | 0.2107 | 0.0709 |
| No log | 70.96 | 426 | 0.7573 | 0.805 | 0.3123 | 1.1886 | 0.805 | 0.7795 | 0.2476 | 0.0737 |
| No log | 71.96 | 432 | 0.7521 | 0.81 | 0.3099 | 1.1911 | 0.81 | 0.7866 | 0.2117 | 0.0698 |
| No log | 72.96 | 438 | 0.7542 | 0.81 | 0.3112 | 1.1878 | 0.81 | 0.7827 | 0.2332 | 0.0726 |
| No log | 73.96 | 444 | 0.7509 | 0.815 | 0.3096 | 1.1880 | 0.815 | 0.7899 | 0.2364 | 0.0709 |
| No log | 74.96 | 450 | 0.7526 | 0.81 | 0.3105 | 1.1889 | 0.81 | 0.7827 | 0.2453 | 0.0724 |
| No log | 75.96 | 456 | 0.7488 | 0.81 | 0.3090 | 1.1869 | 0.81 | 0.7827 | 0.2285 | 0.0699 |
| No log | 76.96 | 462 | 0.7506 | 0.815 | 0.3097 | 1.1901 | 0.815 | 0.7934 | 0.2547 | 0.0721 |
| No log | 77.96 | 468 | 0.7505 | 0.81 | 0.3098 | 1.1876 | 0.81 | 0.7827 | 0.2110 | 0.0724 |
| No log | 78.96 | 474 | 0.7487 | 0.815 | 0.3089 | 1.1885 | 0.815 | 0.7934 | 0.2319 | 0.0715 |
| No log | 79.96 | 480 | 0.7472 | 0.81 | 0.3083 | 1.1877 | 0.81 | 0.7827 | 0.2310 | 0.0714 |
| No log | 80.96 | 486 | 0.7494 | 0.81 | 0.3094 | 1.1877 | 0.81 | 0.7827 | 0.2462 | 0.0738 |
| No log | 81.96 | 492 | 0.7466 | 0.815 | 0.3082 | 1.1888 | 0.815 | 0.7922 | 0.2181 | 0.0709 |
| No log | 82.96 | 498 | 0.7467 | 0.81 | 0.3083 | 1.1874 | 0.81 | 0.7827 | 0.2454 | 0.0714 |
| 0.7129 | 83.96 | 504 | 0.7479 | 0.815 | 0.3088 | 1.1888 | 0.815 | 0.7922 | 0.2272 | 0.0741 |
| 0.7129 | 84.96 | 510 | 0.7456 | 0.81 | 0.3080 | 1.1853 | 0.81 | 0.7847 | 0.2358 | 0.0719 |
| 0.7129 | 85.96 | 516 | 0.7465 | 0.815 | 0.3082 | 1.1908 | 0.815 | 0.7922 | 0.2322 | 0.0721 |
| 0.7129 | 86.96 | 522 | 0.7454 | 0.805 | 0.3081 | 1.1848 | 0.805 | 0.7819 | 0.2262 | 0.0719 |
| 0.7129 | 87.96 | 528 | 0.7471 | 0.815 | 0.3086 | 1.1894 | 0.815 | 0.7922 | 0.2351 | 0.0741 |
| 0.7129 | 88.96 | 534 | 0.7459 | 0.815 | 0.3082 | 1.1885 | 0.815 | 0.7922 | 0.2159 | 0.0726 |
| 0.7129 | 89.96 | 540 | 0.7435 | 0.815 | 0.3072 | 1.1861 | 0.815 | 0.7922 | 0.2291 | 0.0712 |
| 0.7129 | 90.96 | 546 | 0.7454 | 0.81 | 0.3080 | 1.1876 | 0.81 | 0.7847 | 0.2180 | 0.0733 |
| 0.7129 | 91.96 | 552 | 0.7461 | 0.815 | 0.3083 | 1.1883 | 0.815 | 0.7942 | 0.2308 | 0.0743 |
| 0.7129 | 92.96 | 558 | 0.7451 | 0.815 | 0.3079 | 1.1883 | 0.815 | 0.7922 | 0.2330 | 0.0734 |
| 0.7129 | 93.96 | 564 | 0.7434 | 0.815 | 0.3073 | 1.1863 | 0.815 | 0.7942 | 0.2217 | 0.0720 |
| 0.7129 | 94.96 | 570 | 0.7446 | 0.815 | 0.3077 | 1.1882 | 0.815 | 0.7942 | 0.2400 | 0.0731 |
| 0.7129 | 95.96 | 576 | 0.7450 | 0.815 | 0.3079 | 1.1882 | 0.815 | 0.7942 | 0.2144 | 0.0735 |
| 0.7129 | 96.96 | 582 | 0.7440 | 0.815 | 0.3075 | 1.1871 | 0.815 | 0.7942 | 0.2348 | 0.0731 |
| 0.7129 | 97.96 | 588 | 0.7441 | 0.815 | 0.3076 | 1.1876 | 0.815 | 0.7942 | 0.2225 | 0.0732 |
| 0.7129 | 98.96 | 594 | 0.7442 | 0.815 | 0.3076 | 1.1877 | 0.815 | 0.7942 | 0.2072 | 0.0734 |
| 0.7129 | 99.96 | 600 | 0.7442 | 0.815 | 0.3076 | 1.1877 | 0.815 | 0.7942 | 0.2072 | 0.0734 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-base_tobacco
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-base_tobacco
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3120
- Accuracy: 0.95
- Brier Loss: 0.0965
- Nll: 0.6372
- F1 Micro: 0.9500
- F1 Macro: 0.9545
- Ece: 0.0560
- Aurc: 0.0092
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 6 | 2.4454 | 0.175 | 0.9193 | 8.6626 | 0.175 | 0.0676 | 0.2489 | 0.8592 |
| No log | 1.96 | 12 | 2.3287 | 0.175 | 0.9034 | 7.2049 | 0.175 | 0.0674 | 0.2590 | 0.8557 |
| No log | 2.96 | 18 | 2.0836 | 0.23 | 0.8528 | 3.3114 | 0.23 | 0.1544 | 0.2652 | 0.7357 |
| No log | 3.96 | 24 | 2.0456 | 0.315 | 0.8435 | 3.8932 | 0.315 | 0.1785 | 0.3010 | 0.6372 |
| No log | 4.96 | 30 | 1.8778 | 0.3 | 0.7820 | 3.0975 | 0.3 | 0.1659 | 0.2985 | 0.5174 |
| No log | 5.96 | 36 | 1.7247 | 0.365 | 0.7305 | 2.7808 | 0.3650 | 0.2235 | 0.2507 | 0.4036 |
| No log | 6.96 | 42 | 1.6610 | 0.38 | 0.7183 | 2.6958 | 0.38 | 0.2449 | 0.2538 | 0.4119 |
| No log | 7.96 | 48 | 1.4667 | 0.505 | 0.6417 | 2.4078 | 0.505 | 0.3653 | 0.2881 | 0.2656 |
| No log | 8.96 | 54 | 1.3427 | 0.58 | 0.6031 | 2.0381 | 0.58 | 0.5304 | 0.2885 | 0.2470 |
| No log | 9.96 | 60 | 1.1586 | 0.635 | 0.5217 | 1.8792 | 0.635 | 0.5496 | 0.2831 | 0.1697 |
| No log | 10.96 | 66 | 1.0108 | 0.71 | 0.4578 | 1.6886 | 0.7100 | 0.6273 | 0.2851 | 0.1340 |
| No log | 11.96 | 72 | 0.8648 | 0.75 | 0.3849 | 1.5408 | 0.75 | 0.6788 | 0.2530 | 0.0801 |
| No log | 12.96 | 78 | 0.7342 | 0.79 | 0.3327 | 1.3588 | 0.79 | 0.7264 | 0.2152 | 0.0575 |
| No log | 13.96 | 84 | 0.6024 | 0.835 | 0.2734 | 1.2694 | 0.835 | 0.7937 | 0.1876 | 0.0429 |
| No log | 14.96 | 90 | 0.5143 | 0.85 | 0.2386 | 1.1756 | 0.85 | 0.8175 | 0.1714 | 0.0363 |
| No log | 15.96 | 96 | 0.4429 | 0.865 | 0.2044 | 1.1080 | 0.865 | 0.8435 | 0.1380 | 0.0277 |
| No log | 16.96 | 102 | 0.3999 | 0.885 | 0.1854 | 1.0748 | 0.885 | 0.8673 | 0.1407 | 0.0274 |
| No log | 17.96 | 108 | 0.3635 | 0.88 | 0.1732 | 1.0361 | 0.88 | 0.8594 | 0.1117 | 0.0247 |
| No log | 18.96 | 114 | 0.3166 | 0.89 | 0.1454 | 1.0855 | 0.89 | 0.8682 | 0.0971 | 0.0196 |
| No log | 19.96 | 120 | 0.3137 | 0.905 | 0.1418 | 1.1614 | 0.905 | 0.8934 | 0.1041 | 0.0195 |
| No log | 20.96 | 126 | 0.3207 | 0.91 | 0.1408 | 1.1941 | 0.91 | 0.9002 | 0.0856 | 0.0198 |
| No log | 21.96 | 132 | 0.2753 | 0.925 | 0.1224 | 1.0928 | 0.925 | 0.9209 | 0.0858 | 0.0145 |
| No log | 22.96 | 138 | 0.2538 | 0.925 | 0.1169 | 1.0895 | 0.925 | 0.9187 | 0.0863 | 0.0111 |
| No log | 23.96 | 144 | 0.2691 | 0.935 | 0.1138 | 1.0767 | 0.935 | 0.9279 | 0.0730 | 0.0149 |
| No log | 24.96 | 150 | 0.2775 | 0.935 | 0.1131 | 1.0538 | 0.935 | 0.9292 | 0.0676 | 0.0157 |
| No log | 25.96 | 156 | 0.2544 | 0.94 | 0.1011 | 1.0266 | 0.94 | 0.9292 | 0.0643 | 0.0131 |
| No log | 26.96 | 162 | 0.2637 | 0.945 | 0.1013 | 1.0337 | 0.945 | 0.9384 | 0.0648 | 0.0150 |
| No log | 27.96 | 168 | 0.2787 | 0.94 | 0.1089 | 1.0202 | 0.94 | 0.9348 | 0.0685 | 0.0161 |
| No log | 28.96 | 174 | 0.2794 | 0.935 | 0.1091 | 1.0099 | 0.935 | 0.9306 | 0.0671 | 0.0143 |
| No log | 29.96 | 180 | 0.2631 | 0.935 | 0.1025 | 0.9815 | 0.935 | 0.9306 | 0.0575 | 0.0129 |
| No log | 30.96 | 186 | 0.2616 | 0.945 | 0.1009 | 0.9683 | 0.945 | 0.9401 | 0.0674 | 0.0120 |
| No log | 31.96 | 192 | 0.2726 | 0.935 | 0.1074 | 0.9598 | 0.935 | 0.9346 | 0.0641 | 0.0100 |
| No log | 32.96 | 198 | 0.2765 | 0.935 | 0.1058 | 0.9067 | 0.935 | 0.9321 | 0.0696 | 0.0101 |
| No log | 33.96 | 204 | 0.2662 | 0.95 | 0.0965 | 0.8891 | 0.9500 | 0.9522 | 0.0672 | 0.0120 |
| No log | 34.96 | 210 | 0.2761 | 0.935 | 0.1019 | 0.8893 | 0.935 | 0.9338 | 0.0597 | 0.0134 |
| No log | 35.96 | 216 | 0.2729 | 0.945 | 0.0961 | 0.8807 | 0.945 | 0.9419 | 0.0552 | 0.0119 |
| No log | 36.96 | 222 | 0.2741 | 0.94 | 0.1037 | 0.8782 | 0.94 | 0.9356 | 0.0645 | 0.0086 |
| No log | 37.96 | 228 | 0.2686 | 0.94 | 0.0994 | 0.8423 | 0.94 | 0.9356 | 0.0592 | 0.0085 |
| No log | 38.96 | 234 | 0.2712 | 0.95 | 0.0906 | 0.8179 | 0.9500 | 0.9545 | 0.0610 | 0.0105 |
| No log | 39.96 | 240 | 0.2644 | 0.95 | 0.0870 | 0.8240 | 0.9500 | 0.9443 | 0.0510 | 0.0110 |
| No log | 40.96 | 246 | 0.2653 | 0.95 | 0.0932 | 0.8386 | 0.9500 | 0.9525 | 0.0572 | 0.0118 |
| No log | 41.96 | 252 | 0.2724 | 0.955 | 0.0939 | 0.8369 | 0.955 | 0.9573 | 0.0602 | 0.0104 |
| No log | 42.96 | 258 | 0.2552 | 0.95 | 0.0868 | 0.8079 | 0.9500 | 0.9522 | 0.0539 | 0.0079 |
| No log | 43.96 | 264 | 0.2629 | 0.95 | 0.0879 | 0.7800 | 0.9500 | 0.9545 | 0.0526 | 0.0080 |
| No log | 44.96 | 270 | 0.2664 | 0.955 | 0.0864 | 0.7660 | 0.955 | 0.9575 | 0.0515 | 0.0086 |
| No log | 45.96 | 276 | 0.2777 | 0.945 | 0.0948 | 0.7670 | 0.945 | 0.9513 | 0.0524 | 0.0096 |
| No log | 46.96 | 282 | 0.2824 | 0.94 | 0.1014 | 0.7799 | 0.94 | 0.9436 | 0.0570 | 0.0093 |
| No log | 47.96 | 288 | 0.2699 | 0.95 | 0.0896 | 0.7706 | 0.9500 | 0.9546 | 0.0528 | 0.0087 |
| No log | 48.96 | 294 | 0.2809 | 0.945 | 0.0950 | 0.7691 | 0.945 | 0.9480 | 0.0475 | 0.0087 |
| No log | 49.96 | 300 | 0.2827 | 0.945 | 0.0940 | 0.7635 | 0.945 | 0.9447 | 0.0571 | 0.0091 |
| No log | 50.96 | 306 | 0.2781 | 0.945 | 0.0921 | 0.7591 | 0.945 | 0.9478 | 0.0552 | 0.0090 |
| No log | 51.96 | 312 | 0.2834 | 0.95 | 0.0946 | 0.7572 | 0.9500 | 0.9484 | 0.0549 | 0.0089 |
| No log | 52.96 | 318 | 0.2986 | 0.94 | 0.0994 | 0.7541 | 0.94 | 0.9363 | 0.0605 | 0.0091 |
| No log | 53.96 | 324 | 0.2957 | 0.94 | 0.1016 | 0.7447 | 0.94 | 0.9385 | 0.0562 | 0.0086 |
| No log | 54.96 | 330 | 0.2991 | 0.94 | 0.1047 | 0.7392 | 0.94 | 0.9377 | 0.0592 | 0.0102 |
| No log | 55.96 | 336 | 0.3027 | 0.94 | 0.1031 | 0.7235 | 0.94 | 0.9377 | 0.0572 | 0.0113 |
| No log | 56.96 | 342 | 0.2945 | 0.945 | 0.0968 | 0.7143 | 0.945 | 0.9470 | 0.0581 | 0.0104 |
| No log | 57.96 | 348 | 0.2935 | 0.94 | 0.0955 | 0.7046 | 0.94 | 0.9459 | 0.0569 | 0.0097 |
| No log | 58.96 | 354 | 0.2909 | 0.94 | 0.0934 | 0.6969 | 0.94 | 0.9459 | 0.0544 | 0.0092 |
| No log | 59.96 | 360 | 0.2973 | 0.95 | 0.0939 | 0.6964 | 0.9500 | 0.9545 | 0.0524 | 0.0082 |
| No log | 60.96 | 366 | 0.3222 | 0.93 | 0.1108 | 0.7078 | 0.93 | 0.9266 | 0.0586 | 0.0088 |
| No log | 61.96 | 372 | 0.3247 | 0.935 | 0.1093 | 0.7743 | 0.935 | 0.9353 | 0.0622 | 0.0091 |
| No log | 62.96 | 378 | 0.3125 | 0.945 | 0.1003 | 0.7651 | 0.945 | 0.9453 | 0.0559 | 0.0089 |
| No log | 63.96 | 384 | 0.3035 | 0.945 | 0.0993 | 0.7515 | 0.945 | 0.9476 | 0.0545 | 0.0088 |
| No log | 64.96 | 390 | 0.3002 | 0.945 | 0.0973 | 0.7408 | 0.945 | 0.9476 | 0.0537 | 0.0091 |
| No log | 65.96 | 396 | 0.3023 | 0.95 | 0.0965 | 0.7321 | 0.9500 | 0.9545 | 0.0523 | 0.0095 |
| No log | 66.96 | 402 | 0.3075 | 0.945 | 0.1007 | 0.7323 | 0.945 | 0.9477 | 0.0540 | 0.0096 |
| No log | 67.96 | 408 | 0.3062 | 0.945 | 0.0999 | 0.6682 | 0.945 | 0.9514 | 0.0525 | 0.0098 |
| No log | 68.96 | 414 | 0.3182 | 0.945 | 0.0968 | 0.6809 | 0.945 | 0.9432 | 0.0485 | 0.0115 |
| No log | 69.96 | 420 | 0.3272 | 0.945 | 0.0972 | 0.6879 | 0.945 | 0.9432 | 0.0513 | 0.0132 |
| No log | 70.96 | 426 | 0.3210 | 0.945 | 0.0973 | 0.7545 | 0.945 | 0.9488 | 0.0522 | 0.0124 |
| No log | 71.96 | 432 | 0.3194 | 0.945 | 0.1027 | 0.7464 | 0.945 | 0.9514 | 0.0546 | 0.0108 |
| No log | 72.96 | 438 | 0.3236 | 0.94 | 0.1067 | 0.7486 | 0.94 | 0.9427 | 0.0587 | 0.0097 |
| No log | 73.96 | 444 | 0.3166 | 0.94 | 0.1049 | 0.6751 | 0.94 | 0.9427 | 0.0597 | 0.0096 |
| No log | 74.96 | 450 | 0.3062 | 0.945 | 0.0982 | 0.6702 | 0.945 | 0.9514 | 0.0526 | 0.0100 |
| No log | 75.96 | 456 | 0.3018 | 0.95 | 0.0948 | 0.6823 | 0.9500 | 0.9545 | 0.0523 | 0.0102 |
| No log | 76.96 | 462 | 0.3062 | 0.95 | 0.0951 | 0.7444 | 0.9500 | 0.9545 | 0.0522 | 0.0109 |
| No log | 77.96 | 468 | 0.3072 | 0.95 | 0.0933 | 0.7437 | 0.9500 | 0.9545 | 0.0501 | 0.0118 |
| No log | 78.96 | 474 | 0.3095 | 0.95 | 0.0943 | 0.6749 | 0.9500 | 0.9545 | 0.0512 | 0.0121 |
| No log | 79.96 | 480 | 0.3097 | 0.945 | 0.0968 | 0.6654 | 0.945 | 0.9514 | 0.0576 | 0.0116 |
| No log | 80.96 | 486 | 0.3094 | 0.95 | 0.0967 | 0.6581 | 0.9500 | 0.9545 | 0.0526 | 0.0112 |
| No log | 81.96 | 492 | 0.3109 | 0.95 | 0.0954 | 0.6549 | 0.9500 | 0.9545 | 0.0507 | 0.0115 |
| No log | 82.96 | 498 | 0.3104 | 0.95 | 0.0949 | 0.7168 | 0.9500 | 0.9545 | 0.0521 | 0.0113 |
| 0.3747 | 83.96 | 504 | 0.3122 | 0.95 | 0.0949 | 0.7130 | 0.9500 | 0.9545 | 0.0513 | 0.0111 |
| 0.3747 | 84.96 | 510 | 0.3140 | 0.95 | 0.0944 | 0.7116 | 0.9500 | 0.9545 | 0.0534 | 0.0113 |
| 0.3747 | 85.96 | 516 | 0.3175 | 0.95 | 0.0949 | 0.7100 | 0.9500 | 0.9545 | 0.0544 | 0.0113 |
| 0.3747 | 86.96 | 522 | 0.3187 | 0.95 | 0.0958 | 0.7072 | 0.9500 | 0.9545 | 0.0537 | 0.0111 |
| 0.3747 | 87.96 | 528 | 0.3191 | 0.95 | 0.0967 | 0.6428 | 0.9500 | 0.9545 | 0.0536 | 0.0103 |
| 0.3747 | 88.96 | 534 | 0.3168 | 0.95 | 0.0963 | 0.6438 | 0.9500 | 0.9545 | 0.0542 | 0.0102 |
| 0.3747 | 89.96 | 540 | 0.3136 | 0.95 | 0.0963 | 0.6418 | 0.9500 | 0.9545 | 0.0554 | 0.0099 |
| 0.3747 | 90.96 | 546 | 0.3117 | 0.95 | 0.0963 | 0.6407 | 0.9500 | 0.9545 | 0.0533 | 0.0097 |
| 0.3747 | 91.96 | 552 | 0.3113 | 0.95 | 0.0964 | 0.6403 | 0.9500 | 0.9545 | 0.0528 | 0.0091 |
| 0.3747 | 92.96 | 558 | 0.3112 | 0.95 | 0.0968 | 0.6401 | 0.9500 | 0.9545 | 0.0517 | 0.0091 |
| 0.3747 | 93.96 | 564 | 0.3109 | 0.95 | 0.0967 | 0.6393 | 0.9500 | 0.9545 | 0.0563 | 0.0091 |
| 0.3747 | 94.96 | 570 | 0.3112 | 0.95 | 0.0969 | 0.6370 | 0.9500 | 0.9545 | 0.0567 | 0.0092 |
| 0.3747 | 95.96 | 576 | 0.3118 | 0.95 | 0.0971 | 0.6364 | 0.9500 | 0.9545 | 0.0568 | 0.0091 |
| 0.3747 | 96.96 | 582 | 0.3120 | 0.95 | 0.0969 | 0.6377 | 0.9500 | 0.9545 | 0.0564 | 0.0092 |
| 0.3747 | 97.96 | 588 | 0.3121 | 0.95 | 0.0966 | 0.6379 | 0.9500 | 0.9545 | 0.0560 | 0.0092 |
| 0.3747 | 98.96 | 594 | 0.3121 | 0.95 | 0.0965 | 0.6374 | 0.9500 | 0.9545 | 0.0560 | 0.0092 |
| 0.3747 | 99.96 | 600 | 0.3120 | 0.95 | 0.0965 | 0.6372 | 0.9500 | 0.9545 | 0.0560 | 0.0092 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
cherrue/pricetag_classifier
|
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# cherrue/pricetag_classifier
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.1636
- Validation Loss: 0.1296
- Train Accuracy: 1.0
- Epoch: 2
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 5e-05, 'decay_steps': 1251, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 0.9521 | 0.4894 | 1.0 | 0 |
| 0.3312 | 0.2032 | 1.0 | 1 |
| 0.1636 | 0.1296 | 1.0 | 2 |
### Framework versions
- Transformers 4.28.0
- TensorFlow 2.12.0
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"template1_one",
"template2_one_left_label",
"template3_two",
"template5_one_sold_out"
] |
ahmedALM1221/convnextv2-tiny-22k-224-finetuned-eurosat-50
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# convnextv2-tiny-22k-224-finetuned-eurosat-50
This model is a fine-tuned version of [facebook/convnextv2-tiny-22k-224](https://huggingface.co/facebook/convnextv2-tiny-22k-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5332
- Accuracy: 0.8273
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.9
- num_epochs: 12
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.0127 | 1.0 | 122 | 1.9838 | 0.1942 |
| 1.8499 | 2.0 | 244 | 1.8667 | 0.2456 |
| 1.8024 | 3.0 | 366 | 1.7247 | 0.3792 |
| 1.5952 | 4.0 | 488 | 1.5540 | 0.4861 |
| 1.3867 | 5.0 | 610 | 1.3568 | 0.5550 |
| 1.1846 | 6.0 | 732 | 1.1521 | 0.6372 |
| 1.0063 | 7.0 | 854 | 0.9649 | 0.6824 |
| 0.8932 | 8.0 | 976 | 0.8284 | 0.7307 |
| 0.7576 | 9.0 | 1098 | 0.7217 | 0.7780 |
| 0.6742 | 10.0 | 1220 | 0.6412 | 0.7924 |
| 0.6398 | 11.0 | 1342 | 0.5679 | 0.8160 |
| 0.6243 | 12.0 | 1464 | 0.5332 | 0.8273 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"akiec",
"bcc",
"bkl",
"df",
"mel",
"nv",
"vasc"
] |
ALM-AHME/beit-large-patch16-224-finetuned-eurosat-50
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# beit-large-patch16-224-finetuned-eurosat-50
This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0568
- Accuracy: 0.9856
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.9
- num_epochs: 12
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.7148 | 1.0 | 122 | 1.6402 | 0.3916 |
| 1.1543 | 2.0 | 244 | 1.0718 | 0.6208 |
| 0.8948 | 3.0 | 366 | 0.7228 | 0.7564 |
| 0.6348 | 4.0 | 488 | 0.5327 | 0.8160 |
| 0.647 | 5.0 | 610 | 0.4081 | 0.8551 |
| 0.3244 | 6.0 | 732 | 0.2965 | 0.9096 |
| 0.305 | 7.0 | 854 | 0.2515 | 0.9342 |
| 0.3522 | 8.0 | 976 | 0.1667 | 0.9568 |
| 0.1782 | 9.0 | 1098 | 0.1494 | 0.9568 |
| 0.1849 | 10.0 | 1220 | 0.0972 | 0.9712 |
| 0.1814 | 11.0 | 1342 | 0.0559 | 0.9846 |
| 0.1682 | 12.0 | 1464 | 0.0568 | 0.9856 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"akiec",
"bcc",
"bkl",
"df",
"mel",
"nv",
"vasc"
] |
nesanchezo/model_prueba
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# model_prueba
This model is a fine-tuned version of [farleyknight-org-username/vit-base-mnist](https://huggingface.co/farleyknight-org-username/vit-base-mnist) on the handwriten-Numbers dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1889
- Accuracy: 0.9606
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.13.3
|
[
"0",
"1",
"2",
"3",
"4",
"5",
"6",
"7",
"8",
"9"
] |
BubbleJoe/swin-tiny-patch4-window7-224-finetuned-eurosat
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-eurosat
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0742
- Accuracy: 0.9748
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2967 | 1.0 | 190 | 0.1191 | 0.9622 |
| 0.1776 | 2.0 | 380 | 0.0897 | 0.9719 |
| 0.1334 | 3.0 | 570 | 0.0742 | 0.9748 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"annualcrop",
"forest",
"herbaceousvegetation",
"highway",
"industrial",
"pasture",
"permanentcrop",
"residential",
"river",
"sealake"
] |
hbenitez/food_classifier
|
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# hbenitez/food_classifier
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 2.3735
- Validation Loss: 2.5622
- Train Accuracy: 0.0769
- Epoch: 4
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 260, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 2.5417 | 2.5922 | 0.0 | 0 |
| 2.5103 | 2.5856 | 0.0 | 1 |
| 2.4593 | 2.5738 | 0.0 | 2 |
| 2.4104 | 2.5671 | 0.0 | 3 |
| 2.3735 | 2.5622 | 0.0769 | 4 |
### Framework versions
- Transformers 4.30.2
- TensorFlow 2.13.0-rc2
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"banana custom",
"bicycle",
"smoke from fire",
"steam from man hole",
"street",
"big bird",
"car",
"carriage",
"flocks of insects",
"leaves",
"motorbyke",
"person",
"small bird"
] |
02shanky/test_model_graphics_classification
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# test_model_graphics_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1381
- Accuracy: 0.9842
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1521 | 0.98 | 44 | 0.1381 | 0.9842 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"cartoon",
"icon",
"images",
"nongraphics"
] |
KyriaAnnwyn/vit-large-artifacts
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-large-artifacts
This model is a fine-tuned version of [kakaobrain/vit-large-patch16-512](https://huggingface.co/kakaobrain/vit-large-patch16-512) on the KyriaAnnwyn/artifacts_ds dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5995
- Accuracy: 0.6705
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.7001 | 0.01 | 100 | 0.6414 | 0.6559 |
| 0.6288 | 0.01 | 200 | 0.6666 | 0.6559 |
| 0.7237 | 0.02 | 300 | 0.7087 | 0.6559 |
| 0.8741 | 0.03 | 400 | 0.6739 | 0.6257 |
| 0.6093 | 0.04 | 500 | 0.6462 | 0.6559 |
| 0.5801 | 0.04 | 600 | 0.6822 | 0.6559 |
| 0.594 | 0.05 | 700 | 1.9948 | 0.6395 |
| 0.7724 | 0.06 | 800 | 0.6566 | 0.6553 |
| 0.6976 | 0.07 | 900 | 0.6774 | 0.6325 |
| 0.6583 | 0.07 | 1000 | 0.7175 | 0.3517 |
| 0.6779 | 0.08 | 1100 | 0.7012 | 0.6559 |
| 0.6478 | 0.09 | 1200 | 0.6336 | 0.6559 |
| 0.7405 | 0.1 | 1300 | 0.6577 | 0.6559 |
| 0.7362 | 0.1 | 1400 | 0.6630 | 0.6142 |
| 0.535 | 0.11 | 1500 | 0.7445 | 0.6559 |
| 0.7338 | 0.12 | 1600 | 0.7046 | 0.4718 |
| 0.6519 | 0.13 | 1700 | 0.6601 | 0.6426 |
| 0.5969 | 0.13 | 1800 | 0.6518 | 0.6559 |
| 0.5992 | 0.14 | 1900 | 0.6544 | 0.6559 |
| 0.5762 | 0.15 | 2000 | 0.6608 | 0.6559 |
| 0.6483 | 0.16 | 2100 | 0.6436 | 0.6331 |
| 0.7594 | 0.16 | 2200 | 0.7562 | 0.5213 |
| 0.6423 | 0.17 | 2300 | 0.6326 | 0.6433 |
| 0.7006 | 0.18 | 2400 | 0.6669 | 0.6108 |
| 0.833 | 0.19 | 2500 | 0.7043 | 0.6559 |
| 0.6133 | 0.19 | 2600 | 0.6356 | 0.6532 |
| 0.5285 | 0.2 | 2700 | 0.6619 | 0.6606 |
| 0.7209 | 0.21 | 2800 | 0.7306 | 0.4196 |
| 0.682 | 0.22 | 2900 | 0.6400 | 0.6539 |
| 0.7148 | 0.22 | 3000 | 0.6421 | 0.6559 |
| 0.6288 | 0.23 | 3100 | 0.7416 | 0.6559 |
| 0.666 | 0.24 | 3200 | 0.6368 | 0.6293 |
| 0.772 | 0.25 | 3300 | 0.6973 | 0.4985 |
| 0.6778 | 0.25 | 3400 | 0.6288 | 0.6604 |
| 0.5939 | 0.26 | 3500 | 0.6566 | 0.6559 |
| 0.6246 | 0.27 | 3600 | 0.6347 | 0.6618 |
| 0.649 | 0.28 | 3700 | 0.6353 | 0.6277 |
| 0.7122 | 0.28 | 3800 | 0.6407 | 0.6559 |
| 0.6292 | 0.29 | 3900 | 0.6776 | 0.6560 |
| 0.6079 | 0.3 | 4000 | 0.6220 | 0.6609 |
| 0.6971 | 0.31 | 4100 | 0.6258 | 0.6394 |
| 0.7131 | 0.31 | 4200 | 0.7202 | 0.6556 |
| 0.5346 | 0.32 | 4300 | 0.6394 | 0.6571 |
| 0.5801 | 0.33 | 4400 | 0.6960 | 0.6664 |
| 0.6806 | 0.34 | 4500 | 0.6339 | 0.6348 |
| 0.6245 | 0.34 | 4600 | 0.6226 | 0.6477 |
| 0.6905 | 0.35 | 4700 | 0.6203 | 0.6533 |
| 0.741 | 0.36 | 4800 | 0.6464 | 0.6680 |
| 0.5712 | 0.37 | 4900 | 0.6162 | 0.6640 |
| 0.5566 | 0.37 | 5000 | 0.6182 | 0.6507 |
| 0.6443 | 0.38 | 5100 | 0.6457 | 0.6664 |
| 0.6107 | 0.39 | 5200 | 0.6092 | 0.6617 |
| 0.5824 | 0.4 | 5300 | 0.6383 | 0.6571 |
| 0.4775 | 0.4 | 5400 | 0.6606 | 0.6621 |
| 0.7114 | 0.41 | 5500 | 0.6179 | 0.6619 |
| 0.7701 | 0.42 | 5600 | 0.7982 | 0.4217 |
| 0.6974 | 0.42 | 5700 | 0.6223 | 0.6540 |
| 0.6669 | 0.43 | 5800 | 0.6249 | 0.6559 |
| 0.6982 | 0.44 | 5900 | 0.6287 | 0.6564 |
| 0.5811 | 0.45 | 6000 | 0.6104 | 0.6506 |
| 0.4347 | 0.45 | 6100 | 1.0475 | 0.6559 |
| 0.5885 | 0.46 | 6200 | 0.6125 | 0.6552 |
| 0.6867 | 0.47 | 6300 | 0.6435 | 0.6468 |
| 0.6088 | 0.48 | 6400 | 0.6047 | 0.6623 |
| 0.8194 | 0.48 | 6500 | 0.6972 | 0.6589 |
| 0.8182 | 0.49 | 6600 | 0.6053 | 0.6644 |
| 0.6104 | 0.5 | 6700 | 0.7375 | 0.6571 |
| 0.5552 | 0.51 | 6800 | 0.6231 | 0.6402 |
| 0.6451 | 0.51 | 6900 | 0.6452 | 0.6561 |
| 0.7849 | 0.52 | 7000 | 0.6177 | 0.6612 |
| 0.64 | 0.53 | 7100 | 0.6307 | 0.6234 |
| 0.6393 | 0.54 | 7200 | 0.6130 | 0.6554 |
| 0.8326 | 0.54 | 7300 | 0.7210 | 0.6421 |
| 0.6579 | 0.55 | 7400 | 0.6227 | 0.6544 |
| 0.5195 | 0.56 | 7500 | 0.6619 | 0.6557 |
| 0.6197 | 0.57 | 7600 | 0.6354 | 0.6498 |
| 0.8507 | 0.57 | 7700 | 0.6820 | 0.6550 |
| 0.7163 | 0.58 | 7800 | 0.6720 | 0.5328 |
| 0.6896 | 0.59 | 7900 | 0.6530 | 0.6386 |
| 0.62 | 0.6 | 8000 | 0.6296 | 0.6559 |
| 0.8254 | 0.6 | 8100 | 0.6752 | 0.6200 |
| 0.7653 | 0.61 | 8200 | 0.7118 | 0.6558 |
| 0.7742 | 0.62 | 8300 | 0.6262 | 0.6497 |
| 0.6861 | 0.63 | 8400 | 0.6799 | 0.5566 |
| 0.5652 | 0.63 | 8500 | 0.6708 | 0.6559 |
| 0.7486 | 0.64 | 8600 | 0.6319 | 0.6559 |
| 0.6204 | 0.65 | 8700 | 0.6407 | 0.6530 |
| 0.673 | 0.66 | 8800 | 0.7154 | 0.4672 |
| 0.7272 | 0.66 | 8900 | 0.6323 | 0.6528 |
| 0.7364 | 0.67 | 9000 | 0.6436 | 0.6188 |
| 0.71 | 0.68 | 9100 | 0.6507 | 0.5924 |
| 0.6767 | 0.69 | 9200 | 0.6347 | 0.6575 |
| 0.7046 | 0.69 | 9300 | 0.6723 | 0.6127 |
| 0.7486 | 0.7 | 9400 | 0.6328 | 0.6485 |
| 0.7646 | 0.71 | 9500 | 0.6244 | 0.6550 |
| 0.5971 | 0.72 | 9600 | 0.6610 | 0.6558 |
| 0.6195 | 0.72 | 9700 | 0.6219 | 0.6515 |
| 0.6891 | 0.73 | 9800 | 0.6300 | 0.6619 |
| 0.6829 | 0.74 | 9900 | 0.6312 | 0.6568 |
| 0.4786 | 0.75 | 10000 | 0.7160 | 0.6573 |
| 0.6093 | 0.75 | 10100 | 0.6245 | 0.6503 |
| 0.672 | 0.76 | 10200 | 0.6248 | 0.6577 |
| 0.6734 | 0.77 | 10300 | 0.6541 | 0.6600 |
| 0.7826 | 0.78 | 10400 | 0.6413 | 0.6559 |
| 0.6851 | 0.78 | 10500 | 0.6478 | 0.6006 |
| 0.6776 | 0.79 | 10600 | 0.6453 | 0.6175 |
| 0.7322 | 0.8 | 10700 | 0.6188 | 0.6353 |
| 0.5144 | 0.81 | 10800 | 0.6762 | 0.6571 |
| 0.6977 | 0.81 | 10900 | 0.6559 | 0.6544 |
| 0.5681 | 0.82 | 11000 | 0.7225 | 0.6559 |
| 0.6449 | 0.83 | 11100 | 0.6372 | 0.6576 |
| 0.6067 | 0.83 | 11200 | 0.6207 | 0.6391 |
| 0.5921 | 0.84 | 11300 | 0.6178 | 0.6538 |
| 0.5373 | 0.85 | 11400 | 0.7370 | 0.6559 |
| 0.6926 | 0.86 | 11500 | 0.6346 | 0.6372 |
| 0.6634 | 0.86 | 11600 | 0.6274 | 0.6489 |
| 0.61 | 0.87 | 11700 | 0.6309 | 0.6427 |
| 0.6214 | 0.88 | 11800 | 0.6273 | 0.6480 |
| 0.6202 | 0.89 | 11900 | 0.6255 | 0.6559 |
| 0.6153 | 0.89 | 12000 | 0.6348 | 0.6459 |
| 0.7062 | 0.9 | 12100 | 0.6283 | 0.6512 |
| 0.6977 | 0.91 | 12200 | 0.6159 | 0.6515 |
| 0.6041 | 0.92 | 12300 | 0.6251 | 0.6504 |
| 0.6609 | 0.92 | 12400 | 0.6633 | 0.5870 |
| 0.7565 | 0.93 | 12500 | 0.6200 | 0.6562 |
| 0.6133 | 0.94 | 12600 | 0.6193 | 0.6527 |
| 0.7066 | 0.95 | 12700 | 0.6279 | 0.6180 |
| 0.5706 | 0.95 | 12800 | 0.6128 | 0.6575 |
| 0.6992 | 0.96 | 12900 | 0.6334 | 0.6449 |
| 0.6834 | 0.97 | 13000 | 0.6258 | 0.6591 |
| 0.6069 | 0.98 | 13100 | 0.6290 | 0.6620 |
| 0.743 | 0.98 | 13200 | 0.6110 | 0.6562 |
| 0.5226 | 0.99 | 13300 | 0.6165 | 0.6557 |
| 0.7359 | 1.0 | 13400 | 0.6207 | 0.6376 |
| 0.5812 | 1.01 | 13500 | 0.6192 | 0.6559 |
| 0.666 | 1.01 | 13600 | 0.6347 | 0.6602 |
| 0.5489 | 1.02 | 13700 | 0.6107 | 0.6459 |
| 0.701 | 1.03 | 13800 | 0.6172 | 0.6518 |
| 0.4873 | 1.04 | 13900 | 0.6786 | 0.6559 |
| 0.5807 | 1.04 | 14000 | 0.6636 | 0.6433 |
| 0.6824 | 1.05 | 14100 | 0.6176 | 0.6315 |
| 0.6012 | 1.06 | 14200 | 0.6097 | 0.6617 |
| 0.4865 | 1.07 | 14300 | 0.6103 | 0.6623 |
| 0.5612 | 1.07 | 14400 | 0.6947 | 0.6559 |
| 0.5968 | 1.08 | 14500 | 0.6559 | 0.5981 |
| 0.5657 | 1.09 | 14600 | 0.6076 | 0.6509 |
| 0.4778 | 1.1 | 14700 | 0.6808 | 0.6535 |
| 0.6047 | 1.1 | 14800 | 0.6131 | 0.6480 |
| 0.5999 | 1.11 | 14900 | 0.6120 | 0.6559 |
| 0.5852 | 1.12 | 15000 | 0.6356 | 0.6553 |
| 0.7033 | 1.13 | 15100 | 0.6578 | 0.6647 |
| 0.5925 | 1.13 | 15200 | 0.6153 | 0.6633 |
| 0.5959 | 1.14 | 15300 | 0.6306 | 0.6211 |
| 0.5929 | 1.15 | 15400 | 0.6246 | 0.6655 |
| 0.5621 | 1.16 | 15500 | 0.6126 | 0.6424 |
| 0.5508 | 1.16 | 15600 | 0.6844 | 0.6559 |
| 0.6276 | 1.17 | 15700 | 0.6066 | 0.6531 |
| 1.0359 | 1.18 | 15800 | 0.6271 | 0.6617 |
| 0.6191 | 1.19 | 15900 | 0.6166 | 0.6480 |
| 0.7095 | 1.19 | 16000 | 0.6228 | 0.6462 |
| 0.6567 | 1.2 | 16100 | 0.6066 | 0.6653 |
| 0.5653 | 1.21 | 16200 | 0.6022 | 0.6605 |
| 0.6894 | 1.21 | 16300 | 0.6216 | 0.6568 |
| 0.608 | 1.22 | 16400 | 0.6041 | 0.6559 |
| 0.665 | 1.23 | 16500 | 0.6111 | 0.6564 |
| 0.6753 | 1.24 | 16600 | 0.6138 | 0.6581 |
| 0.6213 | 1.24 | 16700 | 0.6121 | 0.6380 |
| 0.6983 | 1.25 | 16800 | 0.6166 | 0.6661 |
| 0.8521 | 1.26 | 16900 | 0.6202 | 0.6461 |
| 0.4927 | 1.27 | 17000 | 0.6313 | 0.6547 |
| 0.6414 | 1.27 | 17100 | 0.6011 | 0.6667 |
| 0.539 | 1.28 | 17200 | 0.6451 | 0.6664 |
| 0.5118 | 1.29 | 17300 | 0.6243 | 0.6641 |
| 0.7512 | 1.3 | 17400 | 0.6257 | 0.6586 |
| 0.5943 | 1.3 | 17500 | 0.6186 | 0.6423 |
| 0.5861 | 1.31 | 17600 | 0.6435 | 0.6638 |
| 0.7065 | 1.32 | 17700 | 0.6197 | 0.6279 |
| 0.5973 | 1.33 | 17800 | 0.6081 | 0.6535 |
| 0.5997 | 1.33 | 17900 | 0.6053 | 0.6608 |
| 0.7091 | 1.34 | 18000 | 0.6013 | 0.6644 |
| 0.691 | 1.35 | 18100 | 0.6103 | 0.6654 |
| 0.5559 | 1.36 | 18200 | 0.6110 | 0.6658 |
| 0.6309 | 1.36 | 18300 | 0.6067 | 0.6664 |
| 0.6262 | 1.37 | 18400 | 0.6027 | 0.6616 |
| 0.5551 | 1.38 | 18500 | 0.6106 | 0.6671 |
| 0.6703 | 1.39 | 18600 | 0.6043 | 0.6576 |
| 0.6849 | 1.39 | 18700 | 0.6018 | 0.6616 |
| 0.6136 | 1.4 | 18800 | 0.6324 | 0.6629 |
| 0.7075 | 1.41 | 18900 | 0.6057 | 0.6561 |
| 0.6036 | 1.42 | 19000 | 0.6081 | 0.6559 |
| 0.6549 | 1.42 | 19100 | 0.6352 | 0.6655 |
| 0.5168 | 1.43 | 19200 | 0.6042 | 0.6632 |
| 0.5864 | 1.44 | 19300 | 0.6111 | 0.6639 |
| 0.5961 | 1.45 | 19400 | 0.6003 | 0.6644 |
| 0.6077 | 1.45 | 19500 | 0.6125 | 0.6566 |
| 0.6215 | 1.46 | 19600 | 0.6128 | 0.6582 |
| 0.4005 | 1.47 | 19700 | 0.6348 | 0.6642 |
| 0.5689 | 1.48 | 19800 | 0.6355 | 0.6647 |
| 0.6026 | 1.48 | 19900 | 0.6127 | 0.6444 |
| 0.4982 | 1.49 | 20000 | 0.6034 | 0.6654 |
| 0.6189 | 1.5 | 20100 | 0.6202 | 0.6609 |
| 0.5502 | 1.51 | 20200 | 0.6044 | 0.6621 |
| 0.5924 | 1.51 | 20300 | 0.6107 | 0.6445 |
| 0.744 | 1.52 | 20400 | 0.6164 | 0.6559 |
| 0.5582 | 1.53 | 20500 | 0.6166 | 0.6559 |
| 0.6994 | 1.54 | 20600 | 0.6109 | 0.6664 |
| 0.5396 | 1.54 | 20700 | 0.6189 | 0.6670 |
| 0.7232 | 1.55 | 20800 | 0.6104 | 0.6610 |
| 0.9802 | 1.56 | 20900 | 0.6232 | 0.6642 |
| 0.6487 | 1.57 | 21000 | 0.6056 | 0.6505 |
| 0.5932 | 1.57 | 21100 | 0.5980 | 0.6702 |
| 0.7897 | 1.58 | 21200 | 0.6012 | 0.6638 |
| 0.6006 | 1.59 | 21300 | 0.6232 | 0.6672 |
| 0.4481 | 1.6 | 21400 | 0.6124 | 0.6676 |
| 0.6078 | 1.6 | 21500 | 0.6495 | 0.6664 |
| 0.595 | 1.61 | 21600 | 0.7122 | 0.6675 |
| 0.6388 | 1.62 | 21700 | 0.6227 | 0.6671 |
| 0.5731 | 1.62 | 21800 | 0.6252 | 0.6682 |
| 0.8603 | 1.63 | 21900 | 0.6026 | 0.6653 |
| 0.6316 | 1.64 | 22000 | 0.6494 | 0.6669 |
| 0.6712 | 1.65 | 22100 | 0.6097 | 0.6676 |
| 0.6102 | 1.65 | 22200 | 0.6221 | 0.6585 |
| 0.7099 | 1.66 | 22300 | 0.6006 | 0.6658 |
| 0.621 | 1.67 | 22400 | 0.6026 | 0.6626 |
| 0.478 | 1.68 | 22500 | 0.6062 | 0.6624 |
| 0.6106 | 1.68 | 22600 | 0.5990 | 0.6669 |
| 0.5793 | 1.69 | 22700 | 0.5980 | 0.6681 |
| 0.5804 | 1.7 | 22800 | 0.6014 | 0.6626 |
| 0.6304 | 1.71 | 22900 | 0.6107 | 0.6380 |
| 0.7427 | 1.71 | 23000 | 0.6051 | 0.6682 |
| 0.5794 | 1.72 | 23100 | 0.6105 | 0.6611 |
| 0.5084 | 1.73 | 23200 | 0.6643 | 0.6673 |
| 0.6518 | 1.74 | 23300 | 0.6366 | 0.6687 |
| 0.5129 | 1.74 | 23400 | 0.6053 | 0.6682 |
| 0.7593 | 1.75 | 23500 | 0.5977 | 0.6662 |
| 0.6645 | 1.76 | 23600 | 0.5988 | 0.6683 |
| 0.6144 | 1.77 | 23700 | 0.6130 | 0.6673 |
| 0.6855 | 1.77 | 23800 | 0.6192 | 0.6596 |
| 0.559 | 1.78 | 23900 | 0.6208 | 0.6574 |
| 0.4202 | 1.79 | 24000 | 0.6125 | 0.6690 |
| 0.6604 | 1.8 | 24100 | 0.6052 | 0.6685 |
| 0.5487 | 1.8 | 24200 | 0.6086 | 0.6685 |
| 0.6816 | 1.81 | 24300 | 0.5997 | 0.6620 |
| 0.6057 | 1.82 | 24400 | 0.6128 | 0.6530 |
| 0.4335 | 1.83 | 24500 | 0.6121 | 0.6676 |
| 0.6147 | 1.83 | 24600 | 0.6225 | 0.6670 |
| 0.7414 | 1.84 | 24700 | 0.6248 | 0.6718 |
| 0.622 | 1.85 | 24800 | 0.6084 | 0.6722 |
| 0.5356 | 1.86 | 24900 | 0.6003 | 0.6611 |
| 0.7994 | 1.86 | 25000 | 0.6098 | 0.6657 |
| 0.5389 | 1.87 | 25100 | 0.6052 | 0.6633 |
| 0.6985 | 1.88 | 25200 | 0.6073 | 0.6694 |
| 0.652 | 1.89 | 25300 | 0.6040 | 0.6709 |
| 0.5409 | 1.89 | 25400 | 0.6065 | 0.6709 |
| 0.6356 | 1.9 | 25500 | 0.6062 | 0.6699 |
| 0.7588 | 1.91 | 25600 | 0.6025 | 0.6711 |
| 0.5109 | 1.92 | 25700 | 0.5992 | 0.6693 |
| 0.6766 | 1.92 | 25800 | 0.6004 | 0.6693 |
| 0.6517 | 1.93 | 25900 | 0.6020 | 0.6701 |
| 0.6561 | 1.94 | 26000 | 0.5995 | 0.6705 |
| 0.6224 | 1.95 | 26100 | 0.6008 | 0.6717 |
| 0.6054 | 1.95 | 26200 | 0.6005 | 0.6714 |
| 0.5152 | 1.96 | 26300 | 0.6023 | 0.6709 |
| 0.5503 | 1.97 | 26400 | 0.6032 | 0.6706 |
| 0.5101 | 1.98 | 26500 | 0.6067 | 0.6709 |
| 0.5229 | 1.98 | 26600 | 0.6079 | 0.6702 |
| 0.8387 | 1.99 | 26700 | 0.6079 | 0.6700 |
| 0.608 | 2.0 | 26800 | 0.6069 | 0.6699 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1+cu116
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"false",
"true"
] |
jordyvl/vit-base_rvl-cdip
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base_rvl-cdip
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5535
- Accuracy: 0.897
- Brier Loss: 0.1768
- Nll: 1.0978
- F1 Micro: 0.897
- F1 Macro: 0.8972
- Ece: 0.0801
- Aurc: 0.0180
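The Brier loss and ECE reported above are calibration metrics. The snippet below is an illustrative sketch of how they can be computed from predicted probabilities; it is not the evaluation code used for this card:

```python
# Assumed, illustrative implementations of the multi-class Brier loss and
# expected calibration error (ECE) with equal-width confidence bins.
import numpy as np

def brier_loss(probs: np.ndarray, labels: np.ndarray) -> float:
    """Mean squared error between predicted probabilities and one-hot labels."""
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs: np.ndarray, labels: np.ndarray, n_bins: int = 10) -> float:
    """Weighted gap between mean confidence and accuracy over confidence bins."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    bin_edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(accuracies[in_bin].mean() - confidences[in_bin].mean())
    return float(ece)
```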
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| 0.676 | 1.0 | 5000 | 0.6451 | 0.8230 | 0.2574 | 1.2627 | 0.8230 | 0.8237 | 0.0458 | 0.0425 |
| 0.4207 | 2.0 | 10000 | 0.4251 | 0.8766 | 0.1800 | 1.2821 | 0.8766 | 0.8779 | 0.0154 | 0.0218 |
| 0.3335 | 3.0 | 15000 | 0.3914 | 0.8861 | 0.1676 | 1.2589 | 0.8861 | 0.8858 | 0.0252 | 0.0192 |
| 0.2447 | 4.0 | 20000 | 0.3687 | 0.8934 | 0.1574 | 1.2243 | 0.8934 | 0.8937 | 0.0331 | 0.0164 |
| 0.1623 | 5.0 | 25000 | 0.3843 | 0.8976 | 0.1583 | 1.1553 | 0.8976 | 0.8973 | 0.0461 | 0.0159 |
| 0.1083 | 6.0 | 30000 | 0.4131 | 0.8964 | 0.1624 | 1.1514 | 0.8964 | 0.8967 | 0.0581 | 0.0163 |
| 0.0652 | 7.0 | 35000 | 0.4633 | 0.8966 | 0.1690 | 1.1300 | 0.8966 | 0.8967 | 0.0692 | 0.0169 |
| 0.0361 | 8.0 | 40000 | 0.5068 | 0.8976 | 0.1723 | 1.1161 | 0.8976 | 0.8976 | 0.0737 | 0.0175 |
| 0.0192 | 9.0 | 45000 | 0.5418 | 0.8982 | 0.1748 | 1.1015 | 0.8982 | 0.8983 | 0.0779 | 0.0179 |
| 0.0111 | 10.0 | 50000 | 0.5535 | 0.897 | 0.1768 | 1.0978 | 0.897 | 0.8972 | 0.0801 | 0.0180 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
nesanchezo/model_handwritenNumbers-nesanchezo
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# model_handwritenNumbers-nesanchezo
This model is a fine-tuned version of [farleyknight-org-username/vit-base-mnist](https://huggingface.co/farleyknight-org-username/vit-base-mnist) on the handwriten-Numbers dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0807
- Accuracy: 0.9839
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.396 | 0.34 | 500 | 0.1925 | 0.9470 |
| 0.2672 | 0.67 | 1000 | 0.2655 | 0.9297 |
| 0.2261 | 1.01 | 1500 | 0.1767 | 0.9548 |
| 0.1603 | 1.34 | 2000 | 0.1423 | 0.9658 |
| 0.1308 | 1.68 | 2500 | 0.1378 | 0.9709 |
| 0.1187 | 2.02 | 3000 | 0.1168 | 0.9737 |
| 0.0873 | 2.35 | 3500 | 0.0857 | 0.9823 |
| 0.0686 | 2.69 | 4000 | 0.1188 | 0.9753 |
| 0.0635 | 3.03 | 4500 | 0.0836 | 0.9804 |
| 0.034 | 3.36 | 5000 | 0.0807 | 0.9839 |
| 0.0155 | 3.7 | 5500 | 0.0898 | 0.9823 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.13.3
|
[
"0",
"1",
"2",
"3",
"4",
"5",
"6",
"7",
"8",
"9"
] |
ALM-AHME/convnextv2-large-1k-224-finetuned-Lesion-Classification-HAM10000-AH-60-20-20
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# convnextv2-large-1k-224-finetuned-Lesion-Classification-HAM10000-AH-60-20-20
This model is a fine-tuned version of [facebook/convnextv2-large-1k-224](https://huggingface.co/facebook/convnextv2-large-1k-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0976
- Accuracy: 0.9866
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.9
- num_epochs: 12
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.8977 | 1.0 | 122 | 1.8949 | 0.2939 |
| 1.6493 | 2.0 | 244 | 1.6449 | 0.5447 |
| 1.239 | 3.0 | 366 | 1.2819 | 0.6886 |
| 0.9342 | 4.0 | 488 | 0.9664 | 0.7276 |
| 0.7011 | 5.0 | 610 | 0.6760 | 0.8356 |
| 0.5809 | 6.0 | 732 | 0.5792 | 0.8469 |
| 0.4846 | 7.0 | 854 | 0.4280 | 0.8890 |
| 0.6914 | 8.0 | 976 | 0.4121 | 0.8849 |
| 0.3815 | 9.0 | 1098 | 0.2751 | 0.9353 |
| 0.2931 | 10.0 | 1220 | 0.2980 | 0.9198 |
| 0.2485 | 11.0 | 1342 | 0.3090 | 0.9106 |
| 0.1759 | 12.0 | 1464 | 0.0976 | 0.9866 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
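A minimal usage sketch (not an official example) for running this checkpoint on a single image with the `image-classification` pipeline; the image path is a placeholder:

```python
# Assumed usage sketch for this fine-tuned checkpoint.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="ALM-AHME/convnextv2-large-1k-224-finetuned-Lesion-Classification-HAM10000-AH-60-20-20",
)
print(classifier("lesion_example.jpg"))  # hypothetical local image file
```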
|
[
"akiec",
"bcc",
"bkl",
"df",
"mel",
"nv",
"vasc"
] |
rdmpage/autotrain-inat2018-72960139083
|
# Model Trained Using AutoTrain
- Problem type: Multi-class Classification
- Model ID: 72960139083
- CO2 Emissions (in grams): 0.8379
## Validation Metrics
- Loss: 0.001
- Accuracy: 1.000
- Macro F1: 1.000
- Micro F1: 1.000
- Weighted F1: 1.000
- Macro Precision: 1.000
- Micro Precision: 1.000
- Weighted Precision: 1.000
- Macro Recall: 1.000
- Micro Recall: 1.000
- Weighted Recall: 1.000
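For clarity on how the macro, micro, and weighted averages above differ, here is a small illustrative sketch using scikit-learn on made-up predictions (not the validation data):

```python
# Illustrative only: hypothetical labels/predictions, not this model's outputs.
from sklearn.metrics import f1_score, precision_score, recall_score

y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 0, 1, 2, 2, 2]

for avg in ("macro", "micro", "weighted"):
    print(avg,
          f1_score(y_true, y_pred, average=avg),
          precision_score(y_true, y_pred, average=avg),
          recall_score(y_true, y_pred, average=avg))
```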
|
[
"1478",
"613",
"676"
] |
ALM-AHME/swinv2-large-patch4-window12to16-192to256-22kto1k-ft-finetuned-Lesion-Classification-HAM10000-AH
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swinv2-large-patch4-window12to16-192to256-22kto1k-ft-finetuned-Lesion-Classification-HAM10000-AH
This model is a fine-tuned version of [microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft](https://huggingface.co/microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1143
- Accuracy: 0.9681
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.9
- num_epochs: 12
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.9527 | 1.0 | 122 | 1.9746 | 0.1716 |
| 1.818 | 2.0 | 244 | 1.7423 | 0.3628 |
| 1.5044 | 3.0 | 366 | 1.3707 | 0.5046 |
| 1.1173 | 4.0 | 488 | 0.9796 | 0.6300 |
| 0.8714 | 5.0 | 610 | 0.7475 | 0.7379 |
| 0.8631 | 6.0 | 732 | 0.5978 | 0.7729 |
| 0.628 | 7.0 | 854 | 0.4791 | 0.8212 |
| 0.5588 | 8.0 | 976 | 0.3517 | 0.8705 |
| 0.5632 | 9.0 | 1098 | 0.2564 | 0.9168 |
| 0.3693 | 10.0 | 1220 | 0.1875 | 0.9455 |
| 0.321 | 11.0 | 1342 | 0.1525 | 0.9424 |
| 0.2761 | 12.0 | 1464 | 0.1143 | 0.9681 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"akiec",
"bcc",
"bkl",
"df",
"mel",
"nv",
"vasc"
] |
ALM-AHME/beit-large-patch16-224-finetuned-Lesion-Classification-HAM10000-AH-60-20-20
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# beit-large-patch16-224-finetuned-Lesion-Classification-HAM10000-AH-60-20-20
This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0434
- Accuracy: 0.9908
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.9
- num_epochs: 12
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.9688 | 1.0 | 122 | 1.8425 | 0.2775 |
| 1.4822 | 2.0 | 244 | 1.3833 | 0.5457 |
| 1.1239 | 3.0 | 366 | 0.9321 | 0.6680 |
| 0.8686 | 4.0 | 488 | 0.6691 | 0.7698 |
| 0.5234 | 5.0 | 610 | 0.4872 | 0.8335 |
| 0.5246 | 6.0 | 732 | 0.3586 | 0.8736 |
| 0.3691 | 7.0 | 854 | 0.3134 | 0.8993 |
| 0.4708 | 8.0 | 976 | 0.2069 | 0.9394 |
| 0.1694 | 9.0 | 1098 | 0.1832 | 0.9414 |
| 0.2749 | 10.0 | 1220 | 0.1198 | 0.9640 |
| 0.1777 | 11.0 | 1342 | 0.0845 | 0.9733 |
| 0.1529 | 12.0 | 1464 | 0.0434 | 0.9908 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"akiec",
"bcc",
"bkl",
"df",
"mel",
"nv",
"vasc"
] |
usamaaleem99tech/segformer-class-classWeights-augmentation
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-class-classWeights-augmentation
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1855
- Accuracy: 0.9655
- F1: 0.9647
- Precision: 0.9674
- Recall: 0.9655
- Learning Rate: 0.0000
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 10
- eval_batch_size: 10
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 40
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall | Learning Rate |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|:------:|
| No log | 0.89 | 6 | 0.1113 | 0.9655 | 0.9647 | 0.9674 | 0.9655 | 0.0000 |
| 0.1153 | 1.93 | 13 | 0.0929 | 0.9655 | 0.9647 | 0.9674 | 0.9655 | 0.0000 |
| 0.2246 | 2.96 | 20 | 0.1026 | 0.9655 | 0.9647 | 0.9674 | 0.9655 | 0.0000 |
| 0.2246 | 4.0 | 27 | 0.0391 | 0.9655 | 0.9647 | 0.9674 | 0.9655 | 0.0000 |
| 0.1433 | 4.89 | 33 | 0.0673 | 0.9655 | 0.9647 | 0.9674 | 0.9655 | 0.0000 |
| 0.1816 | 5.93 | 40 | 0.0794 | 0.9655 | 0.9647 | 0.9674 | 0.9655 | 0.0000 |
| 0.1816 | 6.96 | 47 | 0.0687 | 0.9655 | 0.9647 | 0.9674 | 0.9655 | 0.0000 |
| 0.1448 | 8.0 | 54 | 0.1123 | 0.9655 | 0.9647 | 0.9674 | 0.9655 | 0.0000 |
| 0.1124 | 8.89 | 60 | 0.1855 | 0.9655 | 0.9647 | 0.9674 | 0.9655 | 0.0000 |
### Framework versions
- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"bcc",
"iec",
"scc"
] |
jordyvl/dit-tiny_tobacco3482_kd_CEKD_t2.0_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-tiny_tobacco3482_kd_CEKD_t2.0_a0.5
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 3.5976
- Accuracy: 0.18
- Brier Loss: 0.8781
- Nll: 6.8947
- F1 Micro: 0.18
- F1 Macro: 0.0306
- Ece: 0.2499
- Aurc: 0.8510
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 3.8479 | 0.145 | 0.8999 | 10.1604 | 0.145 | 0.0253 | 0.2222 | 0.8467 |
| No log | 1.96 | 6 | 3.8090 | 0.145 | 0.8946 | 10.5967 | 0.145 | 0.0253 | 0.2246 | 0.8470 |
| No log | 2.96 | 9 | 3.7500 | 0.16 | 0.8866 | 8.6365 | 0.16 | 0.0406 | 0.2205 | 0.8486 |
| No log | 3.96 | 12 | 3.7003 | 0.16 | 0.8805 | 6.5484 | 0.16 | 0.0327 | 0.2242 | 0.8816 |
| No log | 4.96 | 15 | 3.6677 | 0.155 | 0.8776 | 6.7592 | 0.155 | 0.0271 | 0.2365 | 0.8919 |
| No log | 5.96 | 18 | 3.6477 | 0.155 | 0.8770 | 7.2639 | 0.155 | 0.0278 | 0.2368 | 0.8961 |
| No log | 6.96 | 21 | 3.6339 | 0.18 | 0.8774 | 7.3546 | 0.18 | 0.0313 | 0.2486 | 0.8556 |
| No log | 7.96 | 24 | 3.6240 | 0.18 | 0.8781 | 7.0685 | 0.18 | 0.0308 | 0.2654 | 0.8528 |
| No log | 8.96 | 27 | 3.6163 | 0.18 | 0.8784 | 7.0041 | 0.18 | 0.0306 | 0.2561 | 0.8532 |
| No log | 9.96 | 30 | 3.6114 | 0.18 | 0.8787 | 6.9904 | 0.18 | 0.0306 | 0.2584 | 0.8537 |
| No log | 10.96 | 33 | 3.6078 | 0.18 | 0.8788 | 6.9806 | 0.18 | 0.0306 | 0.2594 | 0.8538 |
| No log | 11.96 | 36 | 3.6052 | 0.18 | 0.8789 | 6.9768 | 0.18 | 0.0306 | 0.2596 | 0.8537 |
| No log | 12.96 | 39 | 3.6034 | 0.18 | 0.8788 | 6.9716 | 0.18 | 0.0306 | 0.2507 | 0.8532 |
| No log | 13.96 | 42 | 3.6018 | 0.18 | 0.8786 | 6.9683 | 0.18 | 0.0306 | 0.2548 | 0.8527 |
| No log | 14.96 | 45 | 3.6005 | 0.18 | 0.8786 | 6.9040 | 0.18 | 0.0306 | 0.2597 | 0.8524 |
| No log | 15.96 | 48 | 3.5995 | 0.18 | 0.8784 | 6.8978 | 0.18 | 0.0306 | 0.2685 | 0.8518 |
| No log | 16.96 | 51 | 3.5989 | 0.18 | 0.8784 | 6.8972 | 0.18 | 0.0306 | 0.2641 | 0.8515 |
| No log | 17.96 | 54 | 3.5989 | 0.18 | 0.8784 | 6.8961 | 0.18 | 0.0306 | 0.2550 | 0.8513 |
| No log | 18.96 | 57 | 3.5988 | 0.18 | 0.8784 | 6.8968 | 0.18 | 0.0306 | 0.2505 | 0.8510 |
| No log | 19.96 | 60 | 3.5982 | 0.18 | 0.8782 | 6.8956 | 0.18 | 0.0306 | 0.2478 | 0.8511 |
| No log | 20.96 | 63 | 3.5980 | 0.18 | 0.8782 | 6.8954 | 0.18 | 0.0306 | 0.2456 | 0.8507 |
| No log | 21.96 | 66 | 3.5978 | 0.18 | 0.8782 | 6.8951 | 0.18 | 0.0306 | 0.2499 | 0.8511 |
| No log | 22.96 | 69 | 3.5976 | 0.18 | 0.8781 | 6.8949 | 0.18 | 0.0306 | 0.2499 | 0.8510 |
| No log | 23.96 | 72 | 3.5976 | 0.18 | 0.8781 | 6.8949 | 0.18 | 0.0306 | 0.2499 | 0.8510 |
| No log | 24.96 | 75 | 3.5976 | 0.18 | 0.8781 | 6.8947 | 0.18 | 0.0306 | 0.2499 | 0.8510 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-tiny_tobacco3482_kd_MSE
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-tiny_tobacco3482_kd_MSE
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 6.8328
- Accuracy: 0.19
- Brier Loss: 0.8942
- Nll: 7.0296
- F1 Micro: 0.19
- F1 Macro: 0.0703
- Ece: 0.2429
- Aurc: 0.8146
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 7.1188 | 0.145 | 0.9003 | 10.1627 | 0.145 | 0.0253 | 0.2218 | 0.8463 |
| No log | 1.96 | 6 | 7.0608 | 0.145 | 0.8969 | 9.8809 | 0.145 | 0.0253 | 0.2197 | 0.8454 |
| No log | 2.96 | 9 | 6.9777 | 0.145 | 0.8929 | 8.9712 | 0.145 | 0.0442 | 0.2065 | 0.7921 |
| No log | 3.96 | 12 | 6.9144 | 0.17 | 0.8908 | 4.9924 | 0.17 | 0.0413 | 0.2325 | 0.7807 |
| No log | 4.96 | 15 | 6.8797 | 0.145 | 0.8912 | 6.8983 | 0.145 | 0.0399 | 0.2089 | 0.7932 |
| No log | 5.96 | 18 | 6.8636 | 0.085 | 0.8926 | 6.9917 | 0.085 | 0.0299 | 0.1822 | 0.8755 |
| No log | 6.96 | 21 | 6.8545 | 0.075 | 0.8946 | 7.0604 | 0.075 | 0.0307 | 0.1849 | 0.8758 |
| No log | 7.96 | 24 | 6.8486 | 0.06 | 0.8958 | 7.1035 | 0.06 | 0.0230 | 0.1801 | 0.8891 |
| No log | 8.96 | 27 | 6.8455 | 0.165 | 0.8967 | 7.1315 | 0.165 | 0.0604 | 0.2414 | 0.8438 |
| No log | 9.96 | 30 | 6.8450 | 0.185 | 0.8973 | 7.1546 | 0.185 | 0.0468 | 0.2477 | 0.8436 |
| No log | 10.96 | 33 | 6.8438 | 0.18 | 0.8969 | 7.1569 | 0.18 | 0.0308 | 0.2406 | 0.8504 |
| No log | 11.96 | 36 | 6.8414 | 0.18 | 0.8962 | 7.1492 | 0.18 | 0.0306 | 0.2510 | 0.8501 |
| No log | 12.96 | 39 | 6.8390 | 0.18 | 0.8958 | 7.1455 | 0.18 | 0.0306 | 0.2374 | 0.8494 |
| No log | 13.96 | 42 | 6.8365 | 0.18 | 0.8950 | 7.0793 | 0.18 | 0.0306 | 0.2436 | 0.8488 |
| No log | 14.96 | 45 | 6.8349 | 0.18 | 0.8944 | 7.0591 | 0.18 | 0.0306 | 0.2369 | 0.8486 |
| No log | 15.96 | 48 | 6.8338 | 0.18 | 0.8942 | 7.0493 | 0.18 | 0.0306 | 0.2396 | 0.8482 |
| No log | 16.96 | 51 | 6.8335 | 0.18 | 0.8940 | 7.0429 | 0.18 | 0.0309 | 0.2390 | 0.8486 |
| No log | 17.96 | 54 | 6.8341 | 0.18 | 0.8943 | 7.0410 | 0.18 | 0.0314 | 0.2351 | 0.8514 |
| No log | 18.96 | 57 | 6.8338 | 0.19 | 0.8943 | 7.0391 | 0.19 | 0.0495 | 0.2480 | 0.8471 |
| No log | 19.96 | 60 | 6.8335 | 0.205 | 0.8943 | 7.0342 | 0.205 | 0.0722 | 0.2562 | 0.8204 |
| No log | 20.96 | 63 | 6.8334 | 0.2 | 0.8942 | 7.0308 | 0.2000 | 0.0683 | 0.2541 | 0.8199 |
| No log | 21.96 | 66 | 6.8332 | 0.195 | 0.8942 | 7.0296 | 0.195 | 0.0714 | 0.2511 | 0.8099 |
| No log | 22.96 | 69 | 6.8330 | 0.195 | 0.8942 | 7.0297 | 0.195 | 0.0717 | 0.2572 | 0.8123 |
| No log | 23.96 | 72 | 6.8329 | 0.19 | 0.8942 | 7.0294 | 0.19 | 0.0703 | 0.2459 | 0.8148 |
| No log | 24.96 | 75 | 6.8328 | 0.19 | 0.8942 | 7.0296 | 0.19 | 0.0703 | 0.2429 | 0.8146 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-small_tobacco3482_kd_MSE
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-small_tobacco3482_kd_MSE
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 6.7275
- Accuracy: 0.21
- Brier Loss: 0.8834
- Nll: 6.7677
- F1 Micro: 0.2100
- F1 Macro: 0.1146
- Ece: 0.2647
- Aurc: 0.7666
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 7.1014 | 0.06 | 0.9055 | 7.9056 | 0.06 | 0.0114 | 0.1732 | 0.9050 |
| No log | 1.96 | 6 | 6.9659 | 0.125 | 0.8970 | 10.1253 | 0.125 | 0.0631 | 0.2010 | 0.8465 |
| No log | 2.96 | 9 | 6.8528 | 0.075 | 0.8954 | 7.0315 | 0.075 | 0.0258 | 0.1912 | 0.8871 |
| No log | 3.96 | 12 | 6.8522 | 0.205 | 0.8955 | 7.0990 | 0.205 | 0.0776 | 0.2426 | 0.7588 |
| No log | 4.96 | 15 | 6.8465 | 0.19 | 0.8959 | 7.1340 | 0.19 | 0.0627 | 0.2308 | 0.7536 |
| No log | 5.96 | 18 | 6.8246 | 0.205 | 0.8937 | 7.1101 | 0.205 | 0.0867 | 0.2410 | 0.7354 |
| No log | 6.96 | 21 | 6.8054 | 0.085 | 0.8918 | 7.0215 | 0.085 | 0.0435 | 0.1847 | 0.8289 |
| No log | 7.96 | 24 | 6.8025 | 0.22 | 0.8879 | 6.8272 | 0.22 | 0.0967 | 0.2487 | 0.7438 |
| No log | 8.96 | 27 | 6.8045 | 0.21 | 0.8871 | 6.3740 | 0.2100 | 0.0992 | 0.2412 | 0.7634 |
| No log | 9.96 | 30 | 6.8013 | 0.22 | 0.8869 | 6.9538 | 0.22 | 0.1016 | 0.2495 | 0.7633 |
| No log | 10.96 | 33 | 6.7920 | 0.215 | 0.8865 | 6.9670 | 0.2150 | 0.0968 | 0.2549 | 0.7577 |
| No log | 11.96 | 36 | 6.7817 | 0.22 | 0.8867 | 6.9953 | 0.22 | 0.1004 | 0.2455 | 0.7437 |
| No log | 12.96 | 39 | 6.7729 | 0.17 | 0.8884 | 6.9738 | 0.17 | 0.0891 | 0.2277 | 0.7865 |
| No log | 13.96 | 42 | 6.7632 | 0.2 | 0.8873 | 6.9622 | 0.2000 | 0.0998 | 0.2393 | 0.7413 |
| No log | 14.96 | 45 | 6.7548 | 0.215 | 0.8860 | 6.9576 | 0.2150 | 0.1010 | 0.2635 | 0.7189 |
| No log | 15.96 | 48 | 6.7489 | 0.22 | 0.8857 | 6.8386 | 0.22 | 0.1024 | 0.2665 | 0.7098 |
| No log | 16.96 | 51 | 6.7457 | 0.23 | 0.8855 | 6.8730 | 0.23 | 0.1129 | 0.2506 | 0.7217 |
| No log | 17.96 | 54 | 6.7455 | 0.215 | 0.8864 | 6.8688 | 0.2150 | 0.1058 | 0.2576 | 0.7528 |
| No log | 18.96 | 57 | 6.7424 | 0.16 | 0.8861 | 6.8631 | 0.16 | 0.0843 | 0.2281 | 0.8036 |
| No log | 19.96 | 60 | 6.7380 | 0.155 | 0.8850 | 6.8443 | 0.155 | 0.0871 | 0.2315 | 0.7937 |
| No log | 20.96 | 63 | 6.7348 | 0.195 | 0.8841 | 6.7769 | 0.195 | 0.0949 | 0.2501 | 0.7799 |
| No log | 21.96 | 66 | 6.7317 | 0.175 | 0.8838 | 6.7692 | 0.175 | 0.1025 | 0.2421 | 0.7797 |
| No log | 22.96 | 69 | 6.7293 | 0.175 | 0.8836 | 6.7682 | 0.175 | 0.1012 | 0.2452 | 0.7799 |
| No log | 23.96 | 72 | 6.7281 | 0.205 | 0.8834 | 6.7672 | 0.205 | 0.1132 | 0.2566 | 0.7679 |
| No log | 24.96 | 75 | 6.7275 | 0.21 | 0.8834 | 6.7677 | 0.2100 | 0.1146 | 0.2647 | 0.7666 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-tiny_tobacco3482_simkd_CEKD_t1_aNone
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-tiny_tobacco3482_simkd_CEKD_t1_aNone
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9983
- Accuracy: 0.18
- Brier Loss: 0.8965
- Nll: 6.7849
- F1 Micro: 0.18
- F1 Macro: 0.0305
- Ece: 0.2195
- Aurc: 0.8182
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 12 | 1.0062 | 0.18 | 0.8980 | 6.1518 | 0.18 | 0.0309 | 0.2213 | 0.7838 |
| No log | 1.96 | 24 | 1.0034 | 0.18 | 0.8987 | 5.7795 | 0.18 | 0.0305 | 0.2273 | 0.8165 |
| No log | 2.96 | 36 | 1.0025 | 0.18 | 0.8984 | 6.4819 | 0.18 | 0.0305 | 0.2249 | 0.8306 |
| No log | 3.96 | 48 | 1.0018 | 0.18 | 0.8982 | 6.8521 | 0.18 | 0.0306 | 0.2205 | 0.8505 |
| No log | 4.96 | 60 | 1.0015 | 0.16 | 0.8980 | 6.6853 | 0.16 | 0.0324 | 0.2089 | 0.8798 |
| No log | 5.96 | 72 | 1.0011 | 0.175 | 0.8979 | 6.8349 | 0.175 | 0.0314 | 0.2134 | 0.8345 |
| No log | 6.96 | 84 | 1.0008 | 0.18 | 0.8976 | 6.8293 | 0.18 | 0.0313 | 0.2249 | 0.8208 |
| No log | 7.96 | 96 | 1.0005 | 0.18 | 0.8975 | 6.9400 | 0.18 | 0.0305 | 0.2230 | 0.8140 |
| No log | 8.96 | 108 | 1.0003 | 0.18 | 0.8974 | 6.5877 | 0.18 | 0.0306 | 0.2230 | 0.8246 |
| No log | 9.96 | 120 | 1.0000 | 0.18 | 0.8973 | 6.5454 | 0.18 | 0.0306 | 0.2188 | 0.8188 |
| No log | 10.96 | 132 | 0.9998 | 0.18 | 0.8972 | 6.5555 | 0.18 | 0.0306 | 0.2274 | 0.8151 |
| No log | 11.96 | 144 | 0.9996 | 0.18 | 0.8971 | 6.5819 | 0.18 | 0.0306 | 0.2254 | 0.8131 |
| No log | 12.96 | 156 | 0.9994 | 0.18 | 0.8970 | 6.7150 | 0.18 | 0.0305 | 0.2255 | 0.8162 |
| No log | 13.96 | 168 | 0.9993 | 0.18 | 0.8969 | 6.6542 | 0.18 | 0.0305 | 0.2213 | 0.8220 |
| No log | 14.96 | 180 | 0.9991 | 0.18 | 0.8968 | 6.6025 | 0.18 | 0.0305 | 0.2213 | 0.8125 |
| No log | 15.96 | 192 | 0.9990 | 0.18 | 0.8968 | 7.0424 | 0.18 | 0.0305 | 0.2301 | 0.8201 |
| No log | 16.96 | 204 | 0.9988 | 0.18 | 0.8967 | 6.6676 | 0.18 | 0.0305 | 0.2258 | 0.8153 |
| No log | 17.96 | 216 | 0.9987 | 0.18 | 0.8967 | 6.6621 | 0.18 | 0.0305 | 0.2270 | 0.8145 |
| No log | 18.96 | 228 | 0.9986 | 0.18 | 0.8967 | 7.0058 | 0.18 | 0.0305 | 0.2259 | 0.8214 |
| No log | 19.96 | 240 | 0.9985 | 0.18 | 0.8966 | 6.8777 | 0.18 | 0.0305 | 0.2194 | 0.8183 |
| No log | 20.96 | 252 | 0.9984 | 0.18 | 0.8966 | 6.7612 | 0.18 | 0.0305 | 0.2282 | 0.8131 |
| No log | 21.96 | 264 | 0.9984 | 0.18 | 0.8966 | 6.7811 | 0.18 | 0.0305 | 0.2282 | 0.8145 |
| No log | 22.96 | 276 | 0.9983 | 0.18 | 0.8965 | 6.7044 | 0.18 | 0.0305 | 0.2239 | 0.8167 |
| No log | 23.96 | 288 | 0.9983 | 0.18 | 0.8965 | 6.7813 | 0.18 | 0.0305 | 0.2217 | 0.8183 |
| No log | 24.96 | 300 | 0.9983 | 0.18 | 0.8965 | 6.7849 | 0.18 | 0.0305 | 0.2195 | 0.8182 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-small_tobacco3482_simkd_CEKD_t1_aNone
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-small_tobacco3482_simkd_CEKD_t1_aNone
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9876
- Accuracy: 0.085
- Brier Loss: 0.8927
- Nll: 8.3272
- F1 Micro: 0.085
- F1 Macro: 0.0461
- Ece: 0.1645
- Aurc: 0.7988
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 12 | 1.0049 | 0.08 | 0.8993 | 5.4663 | 0.08 | 0.0322 | 0.1476 | 0.8883 |
| No log | 1.96 | 24 | 1.0007 | 0.165 | 0.8988 | 5.5926 | 0.165 | 0.0284 | 0.2066 | 0.8251 |
| No log | 2.96 | 36 | 0.9994 | 0.16 | 0.8982 | 5.9135 | 0.16 | 0.0277 | 0.2100 | 0.8518 |
| No log | 3.96 | 48 | 0.9984 | 0.17 | 0.8975 | 6.1195 | 0.17 | 0.0574 | 0.2142 | 0.8153 |
| No log | 4.96 | 60 | 0.9976 | 0.19 | 0.8970 | 6.2724 | 0.19 | 0.0752 | 0.2294 | 0.8254 |
| No log | 5.96 | 72 | 0.9967 | 0.09 | 0.8968 | 6.3787 | 0.09 | 0.0315 | 0.1591 | 0.7950 |
| No log | 6.96 | 84 | 0.9958 | 0.065 | 0.8964 | 6.4218 | 0.065 | 0.0122 | 0.1433 | 0.8333 |
| No log | 7.96 | 96 | 0.9949 | 0.065 | 0.8960 | 6.5170 | 0.065 | 0.0122 | 0.1543 | 0.8344 |
| No log | 8.96 | 108 | 0.9941 | 0.065 | 0.8956 | 6.5572 | 0.065 | 0.0123 | 0.1545 | 0.8331 |
| No log | 9.96 | 120 | 0.9934 | 0.07 | 0.8954 | 6.6362 | 0.07 | 0.0304 | 0.1597 | 0.8313 |
| No log | 10.96 | 132 | 0.9926 | 0.07 | 0.8951 | 6.6430 | 0.07 | 0.0304 | 0.1576 | 0.8325 |
| No log | 11.96 | 144 | 0.9920 | 0.07 | 0.8948 | 6.6842 | 0.07 | 0.0304 | 0.1590 | 0.8225 |
| No log | 12.96 | 156 | 0.9914 | 0.07 | 0.8947 | 6.7731 | 0.07 | 0.0304 | 0.1619 | 0.8155 |
| No log | 13.96 | 168 | 0.9909 | 0.07 | 0.8944 | 6.8584 | 0.07 | 0.0304 | 0.1522 | 0.8128 |
| No log | 14.96 | 180 | 0.9904 | 0.07 | 0.8941 | 6.8161 | 0.07 | 0.0304 | 0.1524 | 0.8142 |
| No log | 15.96 | 192 | 0.9899 | 0.07 | 0.8940 | 7.3169 | 0.07 | 0.0304 | 0.1532 | 0.8109 |
| No log | 16.96 | 204 | 0.9894 | 0.07 | 0.8937 | 7.8481 | 0.07 | 0.0304 | 0.1531 | 0.8132 |
| No log | 17.96 | 216 | 0.9890 | 0.08 | 0.8935 | 8.3375 | 0.08 | 0.0439 | 0.1587 | 0.8002 |
| No log | 18.96 | 228 | 0.9886 | 0.07 | 0.8933 | 8.4250 | 0.07 | 0.0307 | 0.1536 | 0.8132 |
| No log | 19.96 | 240 | 0.9883 | 0.085 | 0.8931 | 8.4316 | 0.085 | 0.0445 | 0.1618 | 0.8014 |
| No log | 20.96 | 252 | 0.9880 | 0.075 | 0.8930 | 8.4395 | 0.075 | 0.0392 | 0.1566 | 0.8088 |
| No log | 21.96 | 264 | 0.9878 | 0.085 | 0.8929 | 8.3319 | 0.085 | 0.0476 | 0.1621 | 0.7956 |
| No log | 22.96 | 276 | 0.9877 | 0.08 | 0.8928 | 8.3274 | 0.08 | 0.0439 | 0.1594 | 0.8024 |
| No log | 23.96 | 288 | 0.9876 | 0.08 | 0.8927 | 8.3285 | 0.08 | 0.0440 | 0.1595 | 0.8014 |
| No log | 24.96 | 300 | 0.9876 | 0.085 | 0.8927 | 8.3272 | 0.085 | 0.0461 | 0.1645 | 0.7988 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
Xanadu00/galaxy_classifier_mobilevit_3
|
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Xanadu00/galaxy_classifier_mobilevit_3
This model is a fine-tuned version of [apple/mobilevit-small](https://huggingface.co/apple/mobilevit-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.1914
- Train Accuracy: 0.9341
- Validation Loss: 0.5148
- Validation Accuracy: 0.8512
- Epoch: 16
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamW', 'weight_decay': 0.01, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'class_name': 'ExponentialDecay', 'config': {'initial_learning_rate': 0.002, 'decay_steps': 10000, 'decay_rate': 0.01, 'staircase': False, 'name': None}}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
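A minimal sketch, assuming plain Keras APIs, of how the AdamW optimizer and ExponentialDecay schedule described above could be reconstructed (not the original training script):

```python
# Assumed sketch based on the optimizer config above.
import tensorflow as tf

lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.002,  # from the config above
    decay_steps=10000,
    decay_rate=0.01,
    staircase=False,
)
optimizer = tf.keras.optimizers.AdamW(learning_rate=lr_schedule, weight_decay=0.01)
```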
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 1.1049 | 0.6128 | 0.7422 | 0.7517 | 0 |
| 0.7149 | 0.7564 | 0.6376 | 0.7821 | 1 |
| 0.6080 | 0.7945 | 0.6947 | 0.7745 | 2 |
| 0.5376 | 0.8160 | 0.5589 | 0.8134 | 3 |
| 0.4977 | 0.8279 | 0.5458 | 0.8162 | 4 |
| 0.4564 | 0.8407 | 0.4799 | 0.8441 | 5 |
| 0.4271 | 0.8557 | 0.4765 | 0.8413 | 6 |
| 0.3957 | 0.8619 | 0.4790 | 0.8453 | 7 |
| 0.3701 | 0.8741 | 0.5376 | 0.8329 | 8 |
| 0.3425 | 0.8829 | 0.4359 | 0.8619 | 9 |
| 0.3192 | 0.8892 | 0.4475 | 0.8585 | 10 |
| 0.2972 | 0.8967 | 0.4143 | 0.8712 | 11 |
| 0.2691 | 0.9080 | 0.4819 | 0.8498 | 12 |
| 0.2445 | 0.9144 | 0.4543 | 0.8563 | 13 |
| 0.2261 | 0.9220 | 0.4221 | 0.8689 | 14 |
| 0.2127 | 0.9251 | 0.5076 | 0.8540 | 15 |
| 0.1914 | 0.9341 | 0.5148 | 0.8512 | 16 |
### Framework versions
- Transformers 4.30.2
- TensorFlow 2.12.0
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"barred spiral",
"cigar round smooth",
"distributed",
"edge-on with bulge",
"edge-on without bulge",
"in-between round smooth",
"merging",
"round smooth",
"unbarred loss spiral",
"unbarred tight spiral"
] |
rdmpage/autotrain-lasiocampidae-73081139111
|
# Model Trained Using AutoTrain
- Problem type: Multi-class Classification
- Model ID: 73081139111
- CO2 Emissions (in grams): 2.2329
## Validation Metrics
- Loss: 0.365
- Accuracy: 0.871
- Macro F1: 0.824
- Micro F1: 0.871
- Weighted F1: 0.865
- Macro Precision: 0.898
- Micro Precision: 0.871
- Weighted Precision: 0.874
- Macro Recall: 0.796
- Micro Recall: 0.871
- Weighted Recall: 0.871
|
[
"1114",
"1115",
"1116",
"1117",
"1118",
"1119",
"1120",
"1121"
] |
gcicceri/organoids-prova_organoid
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# organoids-prova_organoid
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3433
- Accuracy: 0.8576
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 40
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.2121 | 0.99 | 36 | 1.3066 | 0.4116 |
| 0.8905 | 1.99 | 72 | 0.9344 | 0.6749 |
| 0.6942 | 2.98 | 108 | 0.6875 | 0.7507 |
| 0.6087 | 4.0 | 145 | 0.5493 | 0.7896 |
| 0.5896 | 4.99 | 181 | 0.5028 | 0.7993 |
| 0.6168 | 5.99 | 217 | 0.4787 | 0.8100 |
| 0.5627 | 6.98 | 253 | 0.4373 | 0.8319 |
| 0.5654 | 8.0 | 290 | 0.4324 | 0.8299 |
| 0.5204 | 8.99 | 326 | 0.4130 | 0.8319 |
| 0.5581 | 9.99 | 362 | 0.4264 | 0.8241 |
| 0.5232 | 10.98 | 398 | 0.4074 | 0.8294 |
| 0.483 | 12.0 | 435 | 0.3850 | 0.8445 |
| 0.5208 | 12.99 | 471 | 0.3791 | 0.8489 |
| 0.4937 | 13.99 | 507 | 0.3723 | 0.8528 |
| 0.4436 | 14.98 | 543 | 0.3910 | 0.8440 |
| 0.5169 | 16.0 | 580 | 0.3794 | 0.8465 |
| 0.4394 | 16.99 | 616 | 0.3876 | 0.8440 |
| 0.4616 | 17.99 | 652 | 0.3844 | 0.8465 |
| 0.4983 | 18.98 | 688 | 0.3552 | 0.8591 |
| 0.5295 | 20.0 | 725 | 0.3561 | 0.8547 |
| 0.5121 | 20.99 | 761 | 0.3573 | 0.8537 |
| 0.4379 | 21.99 | 797 | 0.3593 | 0.8576 |
| 0.4653 | 22.98 | 833 | 0.3473 | 0.8601 |
| 0.486 | 24.0 | 870 | 0.3473 | 0.8610 |
| 0.4751 | 24.99 | 906 | 0.3638 | 0.8552 |
| 0.4462 | 25.99 | 942 | 0.3533 | 0.8542 |
| 0.4197 | 26.98 | 978 | 0.3464 | 0.8601 |
| 0.4966 | 28.0 | 1015 | 0.3451 | 0.8649 |
| 0.5004 | 28.99 | 1051 | 0.3634 | 0.8508 |
| 0.4156 | 29.99 | 1087 | 0.3723 | 0.8474 |
| 0.4508 | 30.98 | 1123 | 0.3342 | 0.8669 |
| 0.43 | 32.0 | 1160 | 0.3389 | 0.8639 |
| 0.5004 | 32.99 | 1196 | 0.3416 | 0.8615 |
| 0.4927 | 33.99 | 1232 | 0.3545 | 0.8533 |
| 0.4802 | 34.98 | 1268 | 0.3382 | 0.8610 |
| 0.4334 | 36.0 | 1305 | 0.3480 | 0.8542 |
| 0.4557 | 36.99 | 1341 | 0.3392 | 0.8601 |
| 0.4551 | 37.99 | 1377 | 0.3488 | 0.8542 |
| 0.4643 | 38.98 | 1413 | 0.3424 | 0.8586 |
| 0.513 | 39.72 | 1440 | 0.3433 | 0.8576 |
### Framework versions
- Transformers 4.28.0
- Pytorch 1.8.1+cu111
- Datasets 2.14.5
- Tokenizers 0.13.3
|
[
"0",
"1",
"2",
"3"
] |
mietlinski/autotrain-parking
|
# Model Trained Using AutoTrain
- Problem type: Multi-class Classification
- Model ID: 72236139154
- CO2 Emissions (in grams): 0.7902
## Validation Metrics
- Loss: 0.424
- Accuracy: 0.871
- Macro F1: 0.706
- Micro F1: 0.871
- Weighted F1: 0.863
- Macro Precision: 0.695
- Micro Precision: 0.871
- Weighted Precision: 0.860
- Macro Recall: 0.722
- Micro Recall: 0.871
- Weighted Recall: 0.871
|
[
"spot_0",
"spot_1",
"spot_2",
"spot_3",
"spot_4"
] |
jordyvl/dit-tiny_tobacco3482_kd_CEKD_t1.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-tiny_tobacco3482_kd_CEKD_t1.5_a0.5
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 2.9246
- Accuracy: 0.18
- Brier Loss: 0.8755
- Nll: 6.7967
- F1 Micro: 0.18
- F1 Macro: 0.0306
- Ece: 0.2497
- Aurc: 0.8499
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 3.1239 | 0.145 | 0.8999 | 10.1580 | 0.145 | 0.0253 | 0.2222 | 0.8467 |
| No log | 1.96 | 6 | 3.0895 | 0.145 | 0.8946 | 10.5934 | 0.145 | 0.0253 | 0.2303 | 0.8470 |
| No log | 2.96 | 9 | 3.0385 | 0.165 | 0.8866 | 8.6307 | 0.165 | 0.0502 | 0.2200 | 0.8458 |
| No log | 3.96 | 12 | 2.9972 | 0.21 | 0.8806 | 6.5449 | 0.2100 | 0.0615 | 0.2512 | 0.8364 |
| No log | 4.96 | 15 | 2.9719 | 0.155 | 0.8776 | 6.7565 | 0.155 | 0.0271 | 0.2414 | 0.8884 |
| No log | 5.96 | 18 | 2.9579 | 0.215 | 0.8768 | 7.0870 | 0.2150 | 0.0643 | 0.2713 | 0.8778 |
| No log | 6.96 | 21 | 2.9485 | 0.18 | 0.8768 | 7.0291 | 0.18 | 0.0308 | 0.2482 | 0.8532 |
| No log | 7.96 | 24 | 2.9417 | 0.18 | 0.8770 | 6.9706 | 0.18 | 0.0306 | 0.2559 | 0.8525 |
| No log | 8.96 | 27 | 2.9360 | 0.18 | 0.8768 | 6.9349 | 0.18 | 0.0306 | 0.2498 | 0.8527 |
| No log | 9.96 | 30 | 2.9326 | 0.18 | 0.8767 | 6.9268 | 0.18 | 0.0306 | 0.2635 | 0.8533 |
| No log | 10.96 | 33 | 2.9303 | 0.18 | 0.8765 | 6.9226 | 0.18 | 0.0306 | 0.2637 | 0.8531 |
| No log | 11.96 | 36 | 2.9289 | 0.18 | 0.8764 | 6.9217 | 0.18 | 0.0306 | 0.2591 | 0.8524 |
| No log | 12.96 | 39 | 2.9279 | 0.18 | 0.8762 | 6.8547 | 0.18 | 0.0306 | 0.2505 | 0.8526 |
| No log | 13.96 | 42 | 2.9270 | 0.18 | 0.8760 | 6.8491 | 0.18 | 0.0306 | 0.2500 | 0.8520 |
| No log | 14.96 | 45 | 2.9263 | 0.18 | 0.8759 | 6.8471 | 0.18 | 0.0306 | 0.2463 | 0.8518 |
| No log | 15.96 | 48 | 2.9258 | 0.18 | 0.8758 | 6.8445 | 0.18 | 0.0306 | 0.2462 | 0.8520 |
| No log | 16.96 | 51 | 2.9255 | 0.18 | 0.8758 | 6.8452 | 0.18 | 0.0306 | 0.2587 | 0.8511 |
| No log | 17.96 | 54 | 2.9256 | 0.18 | 0.8758 | 6.7940 | 0.18 | 0.0306 | 0.2585 | 0.8513 |
| No log | 18.96 | 57 | 2.9256 | 0.18 | 0.8758 | 6.7930 | 0.18 | 0.0306 | 0.2625 | 0.8508 |
| No log | 19.96 | 60 | 2.9252 | 0.18 | 0.8757 | 6.7945 | 0.18 | 0.0306 | 0.2580 | 0.8506 |
| No log | 20.96 | 63 | 2.9250 | 0.18 | 0.8756 | 6.7999 | 0.18 | 0.0306 | 0.2539 | 0.8505 |
| No log | 21.96 | 66 | 2.9248 | 0.18 | 0.8756 | 6.8441 | 0.18 | 0.0306 | 0.2538 | 0.8502 |
| No log | 22.96 | 69 | 2.9247 | 0.18 | 0.8755 | 6.8439 | 0.18 | 0.0306 | 0.2497 | 0.8500 |
| No log | 23.96 | 72 | 2.9247 | 0.18 | 0.8755 | 6.7977 | 0.18 | 0.0306 | 0.2497 | 0.8500 |
| No log | 24.96 | 75 | 2.9246 | 0.18 | 0.8755 | 6.7967 | 0.18 | 0.0306 | 0.2497 | 0.8499 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-small_tobacco3482_kd_CEKD_t1.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-small_tobacco3482_kd_CEKD_t1.5_a0.5
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 2.8753
- Accuracy: 0.185
- Brier Loss: 0.8660
- Nll: 6.5533
- F1 Micro: 0.185
- F1 Macro: 0.0488
- Ece: 0.2451
- Aurc: 0.7363
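For context, a minimal inference sketch for this checkpoint is shown below; it assumes the repository ships an image-processor configuration alongside the weights.
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "jordyvl/dit-small_tobacco3482_kd_CEKD_t1.5_a0.5"
processor = AutoImageProcessor.from_pretrained(repo)   # assumes a processor config is present
model = AutoModelForImageClassification.from_pretrained(repo).eval()

image = Image.open("document.png").convert("RGB")      # any scanned document page
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[int(logits.argmax(-1))])
```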
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
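The settings above correspond roughly to the following `transformers.TrainingArguments`; this is a reproduction sketch rather than the original training script, and the output directory is a placeholder.
```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="dit-small_tobacco3482_kd_CEKD_t1.5_a0.5",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,   # 16 x 16 = 256 effective train batch size
    num_train_epochs=25,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```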
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 3.1378 | 0.06 | 0.9042 | 9.2898 | 0.06 | 0.0114 | 0.1754 | 0.9032 |
| No log | 1.96 | 6 | 3.0447 | 0.18 | 0.8884 | 6.2145 | 0.18 | 0.0305 | 0.2294 | 0.8048 |
| No log | 2.96 | 9 | 2.9500 | 0.18 | 0.8761 | 6.9445 | 0.18 | 0.0305 | 0.2447 | 0.8193 |
| No log | 3.96 | 12 | 2.9328 | 0.18 | 0.8800 | 6.9512 | 0.18 | 0.0305 | 0.2565 | 0.8122 |
| No log | 4.96 | 15 | 2.9305 | 0.185 | 0.8793 | 6.9136 | 0.185 | 0.0488 | 0.2557 | 0.7823 |
| No log | 5.96 | 18 | 2.9286 | 0.185 | 0.8762 | 6.7762 | 0.185 | 0.0488 | 0.2533 | 0.7721 |
| No log | 6.96 | 21 | 2.9265 | 0.185 | 0.8731 | 5.9902 | 0.185 | 0.0488 | 0.2345 | 0.7682 |
| No log | 7.96 | 24 | 2.9240 | 0.185 | 0.8718 | 5.9696 | 0.185 | 0.0488 | 0.2625 | 0.7621 |
| No log | 8.96 | 27 | 2.9177 | 0.185 | 0.8707 | 5.9711 | 0.185 | 0.0488 | 0.2463 | 0.7578 |
| No log | 9.96 | 30 | 2.9129 | 0.185 | 0.8702 | 6.6932 | 0.185 | 0.0488 | 0.2485 | 0.7574 |
| No log | 10.96 | 33 | 2.9082 | 0.185 | 0.8704 | 6.7772 | 0.185 | 0.0488 | 0.2500 | 0.7560 |
| No log | 11.96 | 36 | 2.9039 | 0.185 | 0.8707 | 6.8060 | 0.185 | 0.0488 | 0.2464 | 0.7537 |
| No log | 12.96 | 39 | 2.8990 | 0.185 | 0.8704 | 6.7988 | 0.185 | 0.0488 | 0.2466 | 0.7515 |
| No log | 13.96 | 42 | 2.8933 | 0.185 | 0.8696 | 6.7771 | 0.185 | 0.0488 | 0.2505 | 0.7479 |
| No log | 14.96 | 45 | 2.8879 | 0.185 | 0.8688 | 6.7597 | 0.185 | 0.0488 | 0.2523 | 0.7482 |
| No log | 15.96 | 48 | 2.8840 | 0.185 | 0.8679 | 6.6825 | 0.185 | 0.0488 | 0.2648 | 0.7454 |
| No log | 16.96 | 51 | 2.8822 | 0.185 | 0.8676 | 6.6742 | 0.185 | 0.0488 | 0.2473 | 0.7425 |
| No log | 17.96 | 54 | 2.8819 | 0.185 | 0.8672 | 6.5521 | 0.185 | 0.0488 | 0.2479 | 0.7405 |
| No log | 18.96 | 57 | 2.8817 | 0.185 | 0.8671 | 6.5498 | 0.185 | 0.0488 | 0.2536 | 0.7385 |
| No log | 19.96 | 60 | 2.8797 | 0.185 | 0.8667 | 6.5563 | 0.185 | 0.0488 | 0.2442 | 0.7371 |
| No log | 20.96 | 63 | 2.8784 | 0.185 | 0.8666 | 6.6145 | 0.185 | 0.0488 | 0.2528 | 0.7374 |
| No log | 21.96 | 66 | 2.8770 | 0.185 | 0.8663 | 6.6084 | 0.185 | 0.0488 | 0.2489 | 0.7366 |
| No log | 22.96 | 69 | 2.8760 | 0.185 | 0.8662 | 6.5683 | 0.185 | 0.0488 | 0.2448 | 0.7360 |
| No log | 23.96 | 72 | 2.8756 | 0.185 | 0.8661 | 6.5544 | 0.185 | 0.0488 | 0.2450 | 0.7363 |
| No log | 24.96 | 75 | 2.8753 | 0.185 | 0.8660 | 6.5533 | 0.185 | 0.0488 | 0.2451 | 0.7363 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-tiny_tobacco3482_kd_CEKD_t1.5_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-tiny_tobacco3482_kd_CEKD_t1.5_a0.7
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco-3482 document-classification dataset.
It achieves the following results on the evaluation set (a Brier-score sketch follows the list):
- Loss: 2.6280
- Accuracy: 0.18
- Brier Loss: 0.8747
- Nll: 6.7569
- F1 Micro: 0.18
- F1 Macro: 0.0306
- Ece: 0.2550
- Aurc: 0.8496
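Brier Loss here appears to be the multi-class Brier score: the squared error between the predicted probability vector and the one-hot label, summed over classes and averaged over examples, which matches the ~0.9 values a near-uniform 10-class predictor would produce. A minimal sketch:
```python
import numpy as np

def brier_score(probs, labels):
    """Multi-class Brier score: squared error to the one-hot label,
    summed over classes and averaged over examples (sketch)."""
    probs = np.asarray(probs, dtype=float)
    one_hot = np.zeros_like(probs)
    one_hot[np.arange(len(labels)), labels] = 1.0
    return float(np.mean(np.sum((probs - one_hot) ** 2, axis=1)))

# A near-uniform 10-class predictor scores close to 0.9, the regime seen below.
print(brier_score(np.full((4, 10), 0.1), [0, 1, 2, 3]))  # 0.9
```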
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 2.7961 | 0.145 | 0.8999 | 10.1560 | 0.145 | 0.0253 | 0.2221 | 0.8467 |
| No log | 1.96 | 6 | 2.7646 | 0.145 | 0.8946 | 10.5828 | 0.145 | 0.0253 | 0.2242 | 0.8475 |
| No log | 2.96 | 9 | 2.7185 | 0.155 | 0.8868 | 8.6137 | 0.155 | 0.0501 | 0.2145 | 0.8394 |
| No log | 3.96 | 12 | 2.6825 | 0.21 | 0.8808 | 6.5439 | 0.2100 | 0.0613 | 0.2567 | 0.8351 |
| No log | 4.96 | 15 | 2.6619 | 0.155 | 0.8778 | 6.7839 | 0.155 | 0.0274 | 0.2346 | 0.8880 |
| No log | 5.96 | 18 | 2.6517 | 0.18 | 0.8769 | 7.4578 | 0.18 | 0.0395 | 0.2461 | 0.8571 |
| No log | 6.96 | 21 | 2.6450 | 0.18 | 0.8767 | 7.1192 | 0.18 | 0.0308 | 0.2518 | 0.8516 |
| No log | 7.96 | 24 | 2.6400 | 0.18 | 0.8766 | 6.9539 | 0.18 | 0.0306 | 0.2472 | 0.8526 |
| No log | 8.96 | 27 | 2.6355 | 0.18 | 0.8762 | 6.9109 | 0.18 | 0.0306 | 0.2524 | 0.8527 |
| No log | 9.96 | 30 | 2.6332 | 0.18 | 0.8759 | 6.8997 | 0.18 | 0.0306 | 0.2491 | 0.8527 |
| No log | 10.96 | 33 | 2.6317 | 0.18 | 0.8757 | 6.8943 | 0.18 | 0.0306 | 0.2529 | 0.8524 |
| No log | 11.96 | 36 | 2.6309 | 0.18 | 0.8755 | 6.8287 | 0.18 | 0.0306 | 0.2442 | 0.8523 |
| No log | 12.96 | 39 | 2.6304 | 0.18 | 0.8753 | 6.7670 | 0.18 | 0.0306 | 0.2478 | 0.8521 |
| No log | 13.96 | 42 | 2.6298 | 0.18 | 0.8752 | 6.7597 | 0.18 | 0.0306 | 0.2433 | 0.8517 |
| No log | 14.96 | 45 | 2.6293 | 0.18 | 0.8751 | 6.7590 | 0.18 | 0.0306 | 0.2516 | 0.8513 |
| No log | 15.96 | 48 | 2.6290 | 0.18 | 0.8750 | 6.7556 | 0.18 | 0.0306 | 0.2555 | 0.8515 |
| No log | 16.96 | 51 | 2.6287 | 0.18 | 0.8750 | 6.7582 | 0.18 | 0.0306 | 0.2557 | 0.8514 |
| No log | 17.96 | 54 | 2.6289 | 0.18 | 0.8750 | 6.7556 | 0.18 | 0.0306 | 0.2476 | 0.8509 |
| No log | 18.96 | 57 | 2.6289 | 0.18 | 0.8750 | 6.7567 | 0.18 | 0.0306 | 0.2475 | 0.8505 |
| No log | 19.96 | 60 | 2.6285 | 0.18 | 0.8748 | 6.7567 | 0.18 | 0.0306 | 0.2433 | 0.8502 |
| No log | 20.96 | 63 | 2.6283 | 0.18 | 0.8748 | 6.7577 | 0.18 | 0.0306 | 0.2512 | 0.8500 |
| No log | 21.96 | 66 | 2.6281 | 0.18 | 0.8748 | 6.7586 | 0.18 | 0.0306 | 0.2551 | 0.8495 |
| No log | 22.96 | 69 | 2.6280 | 0.18 | 0.8747 | 6.7580 | 0.18 | 0.0306 | 0.2550 | 0.8496 |
| No log | 23.96 | 72 | 2.6280 | 0.18 | 0.8747 | 6.7573 | 0.18 | 0.0306 | 0.2550 | 0.8496 |
| No log | 24.96 | 75 | 2.6280 | 0.18 | 0.8747 | 6.7569 | 0.18 | 0.0306 | 0.2550 | 0.8496 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-small_tobacco3482_kd_CEKD_t1.5_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-small_tobacco3482_kd_CEKD_t1.5_a0.7
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco-3482 document-classification dataset.
It achieves the following results on the evaluation set (an ECE sketch follows the list):
- Loss: 2.5836
- Accuracy: 0.185
- Brier Loss: 0.8652
- Nll: 6.4546
- F1 Micro: 0.185
- F1 Macro: 0.0488
- Ece: 0.2424
- Aurc: 0.7342
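Ece presumably denotes expected calibration error. A minimal equal-width-bin sketch is given below; the number of bins behind the card's numbers is not stated, so 10 is an assumption.
```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=10):
    """Equal-width-bin ECE sketch: weighted average of |accuracy - confidence| per bin."""
    probs, labels = np.asarray(probs), np.asarray(labels)
    confidences = probs.max(axis=1)
    accuracies = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(accuracies[in_bin].mean() - confidences[in_bin].mean())
    return float(ece)
```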
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 2.8093 | 0.06 | 0.9041 | 9.2868 | 0.06 | 0.0114 | 0.1752 | 0.9033 |
| No log | 1.96 | 6 | 2.7245 | 0.18 | 0.8884 | 6.2166 | 0.18 | 0.0305 | 0.2292 | 0.8036 |
| No log | 2.96 | 9 | 2.6443 | 0.18 | 0.8760 | 6.9627 | 0.18 | 0.0305 | 0.2437 | 0.8179 |
| No log | 3.96 | 12 | 2.6356 | 0.185 | 0.8785 | 6.9306 | 0.185 | 0.0488 | 0.2534 | 0.7877 |
| No log | 4.96 | 15 | 2.6338 | 0.185 | 0.8768 | 6.8870 | 0.185 | 0.0488 | 0.2605 | 0.7787 |
| No log | 5.96 | 18 | 2.6325 | 0.185 | 0.8740 | 6.2086 | 0.185 | 0.0490 | 0.2453 | 0.7699 |
| No log | 6.96 | 21 | 2.6322 | 0.185 | 0.8721 | 5.9554 | 0.185 | 0.0488 | 0.2474 | 0.7629 |
| No log | 7.96 | 24 | 2.6293 | 0.185 | 0.8712 | 5.9359 | 0.185 | 0.0488 | 0.2550 | 0.7576 |
| No log | 8.96 | 27 | 2.6221 | 0.185 | 0.8701 | 5.9468 | 0.185 | 0.0488 | 0.2436 | 0.7536 |
| No log | 9.96 | 30 | 2.6171 | 0.185 | 0.8697 | 6.6875 | 0.185 | 0.0488 | 0.2497 | 0.7541 |
| No log | 10.96 | 33 | 2.6126 | 0.185 | 0.8697 | 6.7549 | 0.185 | 0.0488 | 0.2512 | 0.7517 |
| No log | 11.96 | 36 | 2.6084 | 0.185 | 0.8697 | 6.7827 | 0.185 | 0.0488 | 0.2476 | 0.7489 |
| No log | 12.96 | 39 | 2.6037 | 0.185 | 0.8692 | 6.7652 | 0.185 | 0.0488 | 0.2557 | 0.7476 |
| No log | 13.96 | 42 | 2.5986 | 0.185 | 0.8683 | 6.6847 | 0.185 | 0.0488 | 0.2513 | 0.7446 |
| No log | 14.96 | 45 | 2.5940 | 0.185 | 0.8676 | 6.6600 | 0.185 | 0.0488 | 0.2572 | 0.7447 |
| No log | 15.96 | 48 | 2.5910 | 0.185 | 0.8669 | 6.6410 | 0.185 | 0.0488 | 0.2448 | 0.7424 |
| No log | 16.96 | 51 | 2.5897 | 0.185 | 0.8667 | 6.6371 | 0.185 | 0.0488 | 0.2402 | 0.7402 |
| No log | 17.96 | 54 | 2.5898 | 0.185 | 0.8664 | 6.5096 | 0.185 | 0.0488 | 0.2549 | 0.7371 |
| No log | 18.96 | 57 | 2.5897 | 0.185 | 0.8664 | 6.5160 | 0.185 | 0.0488 | 0.2504 | 0.7363 |
| No log | 19.96 | 60 | 2.5877 | 0.185 | 0.8660 | 6.4661 | 0.185 | 0.0488 | 0.2416 | 0.7346 |
| No log | 20.96 | 63 | 2.5865 | 0.185 | 0.8658 | 6.4833 | 0.185 | 0.0488 | 0.2459 | 0.7347 |
| No log | 21.96 | 66 | 2.5852 | 0.185 | 0.8655 | 6.4690 | 0.185 | 0.0488 | 0.2460 | 0.7343 |
| No log | 22.96 | 69 | 2.5843 | 0.185 | 0.8654 | 6.4625 | 0.185 | 0.0488 | 0.2461 | 0.7340 |
| No log | 23.96 | 72 | 2.5838 | 0.185 | 0.8653 | 6.4568 | 0.185 | 0.0488 | 0.2424 | 0.7342 |
| No log | 24.96 | 75 | 2.5836 | 0.185 | 0.8652 | 6.4546 | 0.185 | 0.0488 | 0.2424 | 0.7342 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-tiny_tobacco3482_kd_CEKD_t1.5_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-tiny_tobacco3482_kd_CEKD_t1.5_a0.9
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco-3482 document-classification dataset.
It achieves the following results on the evaluation set (an AURC sketch follows the list):
- Loss: 2.3286
- Accuracy: 0.18
- Brier Loss: 0.8742
- Nll: 6.7213
- F1 Micro: 0.18
- F1 Macro: 0.0306
- Ece: 0.2558
- Aurc: 0.8491
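Aurc presumably denotes the area under the risk–coverage curve from selective prediction, where lower is better. A minimal sketch under that assumption:
```python
import numpy as np

def aurc(confidences, correct):
    """Area under the risk-coverage curve (sketch): rank examples by confidence,
    track the error rate among the most-confident fraction, and average that
    risk over all coverage levels."""
    confidences = np.asarray(confidences, dtype=float)
    errors = 1.0 - np.asarray(correct, dtype=float)
    order = np.argsort(-confidences)                  # most confident first
    risk = np.cumsum(errors[order]) / np.arange(1, len(errors) + 1)
    return float(risk.mean())
```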
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 2.4683 | 0.145 | 0.8999 | 10.1538 | 0.145 | 0.0253 | 0.2220 | 0.8466 |
| No log | 1.96 | 6 | 2.4396 | 0.145 | 0.8947 | 10.5704 | 0.145 | 0.0253 | 0.2237 | 0.8463 |
| No log | 2.96 | 9 | 2.3985 | 0.145 | 0.8869 | 8.5511 | 0.145 | 0.0451 | 0.2116 | 0.8036 |
| No log | 3.96 | 12 | 2.3677 | 0.21 | 0.8810 | 6.5446 | 0.2100 | 0.0611 | 0.2566 | 0.8335 |
| No log | 4.96 | 15 | 2.3517 | 0.155 | 0.8780 | 6.8400 | 0.155 | 0.0279 | 0.2309 | 0.8894 |
| No log | 5.96 | 18 | 2.3450 | 0.18 | 0.8771 | 8.1897 | 0.18 | 0.0313 | 0.2495 | 0.8531 |
| No log | 6.96 | 21 | 2.3407 | 0.18 | 0.8767 | 7.3073 | 0.18 | 0.0306 | 0.2551 | 0.8513 |
| No log | 7.96 | 24 | 2.3371 | 0.18 | 0.8763 | 6.9328 | 0.18 | 0.0306 | 0.2501 | 0.8520 |
| No log | 8.96 | 27 | 2.3337 | 0.18 | 0.8757 | 6.8828 | 0.18 | 0.0306 | 0.2507 | 0.8525 |
| No log | 9.96 | 30 | 2.3321 | 0.18 | 0.8753 | 6.8682 | 0.18 | 0.0306 | 0.2508 | 0.8524 |
| No log | 10.96 | 33 | 2.3312 | 0.18 | 0.8751 | 6.7981 | 0.18 | 0.0306 | 0.2462 | 0.8521 |
| No log | 11.96 | 36 | 2.3309 | 0.18 | 0.8749 | 6.7375 | 0.18 | 0.0306 | 0.2531 | 0.8520 |
| No log | 12.96 | 39 | 2.3307 | 0.18 | 0.8748 | 6.7235 | 0.18 | 0.0306 | 0.2524 | 0.8518 |
| No log | 13.96 | 42 | 2.3304 | 0.18 | 0.8747 | 6.7200 | 0.18 | 0.0306 | 0.2482 | 0.8514 |
| No log | 14.96 | 45 | 2.3301 | 0.18 | 0.8746 | 6.7201 | 0.18 | 0.0306 | 0.2410 | 0.8509 |
| No log | 15.96 | 48 | 2.3298 | 0.18 | 0.8746 | 6.7182 | 0.18 | 0.0306 | 0.2449 | 0.8505 |
| No log | 16.96 | 51 | 2.3295 | 0.18 | 0.8745 | 6.7211 | 0.18 | 0.0306 | 0.2412 | 0.8500 |
| No log | 17.96 | 54 | 2.3297 | 0.18 | 0.8745 | 6.7201 | 0.18 | 0.0306 | 0.2449 | 0.8496 |
| No log | 18.96 | 57 | 2.3296 | 0.18 | 0.8745 | 6.7216 | 0.18 | 0.0306 | 0.2392 | 0.8494 |
| No log | 19.96 | 60 | 2.3292 | 0.18 | 0.8744 | 6.7214 | 0.18 | 0.0306 | 0.2371 | 0.8494 |
| No log | 20.96 | 63 | 2.3290 | 0.18 | 0.8744 | 6.7222 | 0.18 | 0.0306 | 0.2371 | 0.8493 |
| No log | 21.96 | 66 | 2.3288 | 0.18 | 0.8743 | 6.7227 | 0.18 | 0.0306 | 0.2408 | 0.8494 |
| No log | 22.96 | 69 | 2.3286 | 0.18 | 0.8743 | 6.7223 | 0.18 | 0.0306 | 0.2558 | 0.8490 |
| No log | 23.96 | 72 | 2.3286 | 0.18 | 0.8743 | 6.7218 | 0.18 | 0.0306 | 0.2558 | 0.8491 |
| No log | 24.96 | 75 | 2.3286 | 0.18 | 0.8742 | 6.7213 | 0.18 | 0.0306 | 0.2558 | 0.8491 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-small_tobacco3482_kd_CEKD_t1.5_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-small_tobacco3482_kd_CEKD_t1.5_a0.9
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco-3482 document-classification dataset.
It achieves the following results on the evaluation set (an F1 sketch follows the list):
- Loss: 2.2890
- Accuracy: 0.19
- Brier Loss: 0.8648
- Nll: 6.4150
- F1 Micro: 0.19
- F1 Macro: 0.0641
- Ece: 0.2450
- Aurc: 0.7332
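F1 Micro and F1 Macro are standard scikit-learn averaging modes; for single-label multi-class data, micro-F1 equals accuracy, which is why those two columns coincide in the table below. For example:
```python
from sklearn.metrics import f1_score

y_true = [0, 1, 2, 2, 3]   # placeholder labels
y_pred = [0, 1, 2, 3, 3]   # placeholder predictions

print(f1_score(y_true, y_pred, average="micro"))  # pools all decisions; equals accuracy
print(f1_score(y_true, y_pred, average="macro"))  # unweighted mean of per-class F1
```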
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 2.4806 | 0.06 | 0.9041 | 9.2838 | 0.06 | 0.0114 | 0.1750 | 0.9034 |
| No log | 1.96 | 6 | 2.4041 | 0.18 | 0.8884 | 6.3227 | 0.18 | 0.0305 | 0.2317 | 0.8027 |
| No log | 2.96 | 9 | 2.3381 | 0.18 | 0.8760 | 6.9952 | 0.18 | 0.0305 | 0.2424 | 0.8118 |
| No log | 3.96 | 12 | 2.3362 | 0.185 | 0.8771 | 6.9040 | 0.185 | 0.0488 | 0.2544 | 0.7841 |
| No log | 4.96 | 15 | 2.3345 | 0.185 | 0.8747 | 6.8515 | 0.185 | 0.0488 | 0.2476 | 0.7768 |
| No log | 5.96 | 18 | 2.3339 | 0.185 | 0.8725 | 6.0111 | 0.185 | 0.0490 | 0.2457 | 0.7670 |
| No log | 6.96 | 21 | 2.3348 | 0.185 | 0.8718 | 5.9199 | 0.185 | 0.0488 | 0.2328 | 0.7596 |
| No log | 7.96 | 24 | 2.3310 | 0.185 | 0.8711 | 5.9008 | 0.185 | 0.0488 | 0.2443 | 0.7536 |
| No log | 8.96 | 27 | 2.3231 | 0.185 | 0.8699 | 5.8793 | 0.185 | 0.0488 | 0.2337 | 0.7516 |
| No log | 9.96 | 30 | 2.3181 | 0.185 | 0.8694 | 6.6980 | 0.185 | 0.0488 | 0.2507 | 0.7500 |
| No log | 10.96 | 33 | 2.3139 | 0.185 | 0.8692 | 6.7350 | 0.185 | 0.0488 | 0.2481 | 0.7488 |
| No log | 11.96 | 36 | 2.3099 | 0.185 | 0.8690 | 6.7557 | 0.185 | 0.0488 | 0.2484 | 0.7463 |
| No log | 12.96 | 39 | 2.3057 | 0.185 | 0.8684 | 6.6765 | 0.185 | 0.0488 | 0.2598 | 0.7441 |
| No log | 13.96 | 42 | 2.3014 | 0.185 | 0.8676 | 6.6313 | 0.185 | 0.0488 | 0.2478 | 0.7420 |
| No log | 14.96 | 45 | 2.2978 | 0.185 | 0.8669 | 6.6142 | 0.185 | 0.0488 | 0.2496 | 0.7412 |
| No log | 15.96 | 48 | 2.2955 | 0.185 | 0.8664 | 6.5990 | 0.185 | 0.0488 | 0.2379 | 0.7399 |
| No log | 16.96 | 51 | 2.2947 | 0.185 | 0.8662 | 6.4895 | 0.185 | 0.0488 | 0.2452 | 0.7375 |
| No log | 17.96 | 54 | 2.2949 | 0.185 | 0.8661 | 6.4730 | 0.185 | 0.0488 | 0.2438 | 0.7354 |
| No log | 18.96 | 57 | 2.2949 | 0.185 | 0.8661 | 6.4244 | 0.185 | 0.0488 | 0.2435 | 0.7356 |
| No log | 19.96 | 60 | 2.2930 | 0.185 | 0.8657 | 6.3676 | 0.185 | 0.0490 | 0.2389 | 0.7341 |
| No log | 20.96 | 63 | 2.2918 | 0.19 | 0.8654 | 6.4233 | 0.19 | 0.0641 | 0.2446 | 0.7336 |
| No log | 21.96 | 66 | 2.2905 | 0.19 | 0.8651 | 6.4742 | 0.19 | 0.0641 | 0.2485 | 0.7334 |
| No log | 22.96 | 69 | 2.2897 | 0.19 | 0.8649 | 6.4243 | 0.19 | 0.0641 | 0.2448 | 0.7332 |
| No log | 23.96 | 72 | 2.2893 | 0.19 | 0.8648 | 6.4174 | 0.19 | 0.0641 | 0.2450 | 0.7332 |
| No log | 24.96 | 75 | 2.2890 | 0.19 | 0.8648 | 6.4150 | 0.19 | 0.0641 | 0.2450 | 0.7332 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-tiny_tobacco3482_kd_CEKD_t2.5_a0.5
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco-3482 document-classification dataset.
It achieves the following results on the evaluation set (a distillation-loss sketch follows the list):
- Loss: 3.9560
- Accuracy: 0.18
- Brier Loss: 0.8800
- Nll: 6.8606
- F1 Micro: 0.18
- F1 Macro: 0.0306
- Ece: 0.2612
- Aurc: 0.8512
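The `kd_CEKD_t2.5_a0.5` name suggests a cross-entropy plus knowledge-distillation objective with softmax temperature 2.5 and mixing weight 0.5. The exact recipe, including which term the weight multiplies, is not documented in the card, so the following is only a generic sketch:
```python
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, temperature=2.5, alpha=0.5):
    """Generic CE + KD loss sketch: hard-label cross-entropy mixed with a
    temperature-scaled KL term toward the teacher's softened predictions."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha * ce + (1.0 - alpha) * kd
```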
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 4.2281 | 0.145 | 0.8999 | 10.1620 | 0.145 | 0.0253 | 0.2222 | 0.8467 |
| No log | 1.96 | 6 | 4.1872 | 0.145 | 0.8946 | 10.5915 | 0.145 | 0.0253 | 0.2275 | 0.8468 |
| No log | 2.96 | 9 | 4.1248 | 0.155 | 0.8866 | 8.6280 | 0.155 | 0.0360 | 0.2179 | 0.8487 |
| No log | 3.96 | 12 | 4.0716 | 0.155 | 0.8806 | 6.5480 | 0.155 | 0.0272 | 0.2254 | 0.8851 |
| No log | 4.96 | 15 | 4.0359 | 0.155 | 0.8778 | 6.7781 | 0.155 | 0.0271 | 0.2310 | 0.8931 |
| No log | 5.96 | 18 | 4.0135 | 0.155 | 0.8774 | 7.8547 | 0.155 | 0.0271 | 0.2345 | 0.8965 |
| No log | 6.96 | 21 | 3.9978 | 0.185 | 0.8779 | 8.3528 | 0.185 | 0.0468 | 0.2615 | 0.8612 |
| No log | 7.96 | 24 | 3.9867 | 0.18 | 0.8789 | 7.6001 | 0.18 | 0.0308 | 0.2618 | 0.8546 |
| No log | 8.96 | 27 | 3.9782 | 0.18 | 0.8796 | 7.0871 | 0.18 | 0.0306 | 0.2613 | 0.8538 |
| No log | 9.96 | 30 | 3.9726 | 0.18 | 0.8800 | 7.0519 | 0.18 | 0.0306 | 0.2687 | 0.8545 |
| No log | 10.96 | 33 | 3.9684 | 0.18 | 0.8803 | 7.0277 | 0.18 | 0.0306 | 0.2656 | 0.8537 |
| No log | 11.96 | 36 | 3.9654 | 0.18 | 0.8805 | 7.0162 | 0.18 | 0.0306 | 0.2708 | 0.8536 |
| No log | 12.96 | 39 | 3.9633 | 0.18 | 0.8805 | 7.0056 | 0.18 | 0.0306 | 0.2619 | 0.8535 |
| No log | 13.96 | 42 | 3.9614 | 0.18 | 0.8804 | 6.9981 | 0.18 | 0.0306 | 0.2617 | 0.8532 |
| No log | 14.96 | 45 | 3.9598 | 0.18 | 0.8804 | 6.9923 | 0.18 | 0.0306 | 0.2669 | 0.8531 |
| No log | 15.96 | 48 | 3.9586 | 0.18 | 0.8803 | 6.9334 | 0.18 | 0.0306 | 0.2669 | 0.8529 |
| No log | 16.96 | 51 | 3.9578 | 0.18 | 0.8802 | 6.9237 | 0.18 | 0.0306 | 0.2716 | 0.8522 |
| No log | 17.96 | 54 | 3.9576 | 0.18 | 0.8802 | 6.8704 | 0.18 | 0.0306 | 0.2666 | 0.8521 |
| No log | 18.96 | 57 | 3.9574 | 0.18 | 0.8802 | 6.8662 | 0.18 | 0.0306 | 0.2664 | 0.8523 |
| No log | 19.96 | 60 | 3.9568 | 0.18 | 0.8801 | 6.8641 | 0.18 | 0.0306 | 0.2614 | 0.8518 |
| No log | 20.96 | 63 | 3.9566 | 0.18 | 0.8801 | 6.8634 | 0.18 | 0.0306 | 0.2659 | 0.8516 |
| No log | 21.96 | 66 | 3.9563 | 0.18 | 0.8800 | 6.8632 | 0.18 | 0.0306 | 0.2612 | 0.8516 |
| No log | 22.96 | 69 | 3.9561 | 0.18 | 0.8800 | 6.8620 | 0.18 | 0.0306 | 0.2612 | 0.8513 |
| No log | 23.96 | 72 | 3.9561 | 0.18 | 0.8800 | 6.8611 | 0.18 | 0.0306 | 0.2612 | 0.8513 |
| No log | 24.96 | 75 | 3.9560 | 0.18 | 0.8800 | 6.8606 | 0.18 | 0.0306 | 0.2612 | 0.8512 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-tiny_rvl_cdip_100_examples_per_class_simkd_CEKD_t1_aNone
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-tiny_rvl_cdip_100_examples_per_class_simkd_CEKD_t1_aNone
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on a 100-examples-per-class subset of the RVL-CDIP dataset.
It achieves the following results on the evaluation set (a feature-distillation sketch follows the list):
- Loss: 0.1502
- Accuracy: 0.0625
- Brier Loss: 0.9374
- Nll: 9.1398
- F1 Micro: 0.0625
- F1 Macro: 0.0074
- Ece: 0.1015
- Aurc: 0.9383
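The `simkd` tag suggests a SimKD-style setup, where the student's features are projected to the teacher's feature space, matched with an MSE loss, and scored by the reused, frozen teacher classifier; that reading is inferred from the name only and is not stated in the card. A generic sketch under that assumption:
```python
import torch.nn as nn
import torch.nn.functional as F

class FeatureDistillationHead(nn.Module):
    """SimKD-style head (sketch; the recipe is assumed from the model name).
    Projects student features to the teacher's width, matches them with MSE,
    and classifies them with the frozen teacher classifier."""

    def __init__(self, student_dim, teacher_dim, teacher_classifier):
        super().__init__()
        self.projector = nn.Linear(student_dim, teacher_dim)
        self.teacher_classifier = teacher_classifier
        for p in self.teacher_classifier.parameters():
            p.requires_grad = False

    def forward(self, student_feat, teacher_feat):
        projected = self.projector(student_feat)
        loss = F.mse_loss(projected, teacher_feat)       # the distillation objective
        logits = self.teacher_classifier(projected)      # reused teacher head
        return logits, loss
```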
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 12 | 0.1540 | 0.0625 | 0.9376 | 8.5438 | 0.0625 | 0.0074 | 0.1043 | 0.9530 |
| No log | 1.96 | 24 | 0.1519 | 0.0625 | 0.9376 | 8.2831 | 0.0625 | 0.0074 | 0.1008 | 0.9465 |
| No log | 2.96 | 36 | 0.1512 | 0.0625 | 0.9375 | 8.4629 | 0.0625 | 0.0074 | 0.1028 | 0.9336 |
| No log | 3.96 | 48 | 0.1510 | 0.0625 | 0.9375 | 8.6283 | 0.0625 | 0.0074 | 0.1027 | 0.9365 |
| No log | 4.96 | 60 | 0.1509 | 0.0625 | 0.9375 | 8.5065 | 0.0625 | 0.0074 | 0.1030 | 0.9433 |
| No log | 5.96 | 72 | 0.1508 | 0.0625 | 0.9375 | 8.4779 | 0.0625 | 0.0074 | 0.1017 | 0.9414 |
| No log | 6.96 | 84 | 0.1507 | 0.0625 | 0.9375 | 8.5053 | 0.0625 | 0.0074 | 0.1045 | 0.9438 |
| No log | 7.96 | 96 | 0.1507 | 0.0625 | 0.9375 | 8.7396 | 0.0625 | 0.0074 | 0.1032 | 0.9440 |
| No log | 8.96 | 108 | 0.1506 | 0.0625 | 0.9375 | 8.6420 | 0.0625 | 0.0074 | 0.1031 | 0.9448 |
| No log | 9.96 | 120 | 0.1506 | 0.0625 | 0.9375 | 8.8410 | 0.0625 | 0.0074 | 0.1045 | 0.9438 |
| No log | 10.96 | 132 | 0.1506 | 0.0625 | 0.9374 | 8.9438 | 0.0625 | 0.0074 | 0.1042 | 0.9413 |
| No log | 11.96 | 144 | 0.1505 | 0.0625 | 0.9374 | 8.9847 | 0.0625 | 0.0074 | 0.1032 | 0.9418 |
| No log | 12.96 | 156 | 0.1505 | 0.0625 | 0.9374 | 9.0594 | 0.0625 | 0.0074 | 0.1031 | 0.9397 |
| No log | 13.96 | 168 | 0.1504 | 0.0625 | 0.9374 | 9.0748 | 0.0625 | 0.0074 | 0.1045 | 0.9343 |
| No log | 14.96 | 180 | 0.1504 | 0.0625 | 0.9374 | 9.0912 | 0.0625 | 0.0074 | 0.1018 | 0.9358 |
| No log | 15.96 | 192 | 0.1504 | 0.0625 | 0.9374 | 9.0950 | 0.0625 | 0.0074 | 0.1032 | 0.9331 |
| No log | 16.96 | 204 | 0.1503 | 0.0625 | 0.9374 | 9.2141 | 0.0625 | 0.0074 | 0.1015 | 0.9363 |
| No log | 17.96 | 216 | 0.1503 | 0.0625 | 0.9374 | 9.0918 | 0.0625 | 0.0074 | 0.1046 | 0.9354 |
| No log | 18.96 | 228 | 0.1503 | 0.0625 | 0.9374 | 9.1430 | 0.0625 | 0.0074 | 0.1018 | 0.9385 |
| No log | 19.96 | 240 | 0.1503 | 0.0625 | 0.9374 | 9.2149 | 0.0625 | 0.0074 | 0.0991 | 0.9404 |
| No log | 20.96 | 252 | 0.1503 | 0.0625 | 0.9374 | 9.0900 | 0.0625 | 0.0074 | 0.1043 | 0.9386 |
| No log | 21.96 | 264 | 0.1503 | 0.0625 | 0.9374 | 9.1244 | 0.0625 | 0.0074 | 0.1060 | 0.9395 |
| No log | 22.96 | 276 | 0.1503 | 0.0625 | 0.9374 | 9.1353 | 0.0625 | 0.0074 | 0.1005 | 0.9378 |
| No log | 23.96 | 288 | 0.1502 | 0.0625 | 0.9374 | 9.2063 | 0.0625 | 0.0074 | 0.1032 | 0.9373 |
| No log | 24.96 | 300 | 0.1502 | 0.0625 | 0.9374 | 9.1398 | 0.0625 | 0.0074 | 0.1015 | 0.9383 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/dit-small_tobacco3482_kd_CEKD_t2.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-small_tobacco3482_kd_CEKD_t2.5_a0.5
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco-3482 document-classification dataset.
It achieves the following results on the evaluation set (a metrics-callback sketch follows the list):
- Loss: 3.8936
- Accuracy: 0.185
- Brier Loss: 0.8707
- Nll: 6.6284
- F1 Micro: 0.185
- F1 Macro: 0.0488
- Ece: 0.2527
- Aurc: 0.7434
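Accuracy and the F1 scores on the evaluation set are the kind of numbers a standard `compute_metrics` callback produces for the `Trainer`; a minimal sketch is below (the original evaluation also reported Brier loss, NLL, ECE and AURC).
```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    """Minimal Trainer metrics callback (sketch)."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1_micro": f1_score(labels, preds, average="micro"),
        "f1_macro": f1_score(labels, preds, average="macro"),
    }
```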
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 4.2363 | 0.06 | 0.9043 | 9.2962 | 0.06 | 0.0114 | 0.1758 | 0.9032 |
| No log | 1.96 | 6 | 4.1268 | 0.18 | 0.8887 | 6.8683 | 0.18 | 0.0305 | 0.2329 | 0.8055 |
| No log | 2.96 | 9 | 4.0044 | 0.18 | 0.8773 | 7.3055 | 0.18 | 0.0305 | 0.2510 | 0.8219 |
| No log | 3.96 | 12 | 3.9678 | 0.18 | 0.8851 | 7.2435 | 0.18 | 0.0305 | 0.2677 | 0.8214 |
| No log | 4.96 | 15 | 3.9645 | 0.185 | 0.8877 | 6.9806 | 0.185 | 0.0488 | 0.2757 | 0.7934 |
| No log | 5.96 | 18 | 3.9635 | 0.185 | 0.8853 | 6.9543 | 0.185 | 0.0488 | 0.2551 | 0.7812 |
| No log | 6.96 | 21 | 3.9564 | 0.185 | 0.8801 | 6.0556 | 0.185 | 0.0488 | 0.2515 | 0.7771 |
| No log | 7.96 | 24 | 3.9505 | 0.185 | 0.8772 | 6.0356 | 0.185 | 0.0488 | 0.2598 | 0.7724 |
| No log | 8.96 | 27 | 3.9435 | 0.185 | 0.8751 | 6.0288 | 0.185 | 0.0488 | 0.2590 | 0.7697 |
| No log | 9.96 | 30 | 3.9383 | 0.185 | 0.8742 | 6.0724 | 0.185 | 0.0488 | 0.2474 | 0.7712 |
| No log | 10.96 | 33 | 3.9336 | 0.185 | 0.8746 | 6.7953 | 0.185 | 0.0488 | 0.2533 | 0.7685 |
| No log | 11.96 | 36 | 3.9298 | 0.185 | 0.8755 | 6.9469 | 0.185 | 0.0488 | 0.2679 | 0.7659 |
| No log | 12.96 | 39 | 3.9253 | 0.185 | 0.8756 | 6.9654 | 0.185 | 0.0488 | 0.2591 | 0.7640 |
| No log | 13.96 | 42 | 3.9194 | 0.185 | 0.8750 | 6.9522 | 0.185 | 0.0488 | 0.2681 | 0.7604 |
| No log | 14.96 | 45 | 3.9128 | 0.185 | 0.8744 | 6.9200 | 0.185 | 0.0488 | 0.2611 | 0.7617 |
| No log | 15.96 | 48 | 3.9074 | 0.185 | 0.8733 | 6.8369 | 0.185 | 0.0488 | 0.2611 | 0.7600 |
| No log | 16.96 | 51 | 3.9041 | 0.185 | 0.8726 | 6.8278 | 0.185 | 0.0488 | 0.2558 | 0.7566 |
| No log | 17.96 | 54 | 3.9025 | 0.185 | 0.8719 | 6.7039 | 0.185 | 0.0488 | 0.2588 | 0.7510 |
| No log | 18.96 | 57 | 3.9012 | 0.185 | 0.8717 | 6.6384 | 0.185 | 0.0488 | 0.2580 | 0.7484 |
| No log | 19.96 | 60 | 3.8987 | 0.185 | 0.8712 | 6.6323 | 0.185 | 0.0488 | 0.2612 | 0.7450 |
| No log | 20.96 | 63 | 3.8971 | 0.185 | 0.8712 | 6.6319 | 0.185 | 0.0488 | 0.2615 | 0.7443 |
| No log | 21.96 | 66 | 3.8956 | 0.185 | 0.8710 | 6.6323 | 0.185 | 0.0488 | 0.2659 | 0.7439 |
| No log | 22.96 | 69 | 3.8945 | 0.185 | 0.8708 | 6.6307 | 0.185 | 0.0488 | 0.2569 | 0.7436 |
| No log | 23.96 | 72 | 3.8940 | 0.185 | 0.8708 | 6.6295 | 0.185 | 0.0488 | 0.2526 | 0.7434 |
| No log | 24.96 | 75 | 3.8936 | 0.185 | 0.8707 | 6.6284 | 0.185 | 0.0488 | 0.2527 | 0.7434 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-tiny_tobacco3482_kd_CEKD_t2.5_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-tiny_tobacco3482_kd_CEKD_t2.5_a0.7
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco-3482 document-classification dataset.
It achieves the following results on the evaluation set:
- Loss: 3.2510
- Accuracy: 0.18
- Brier Loss: 0.8767
- Nll: 6.8039
- F1 Micro: 0.18
- F1 Macro: 0.0306
- Ece: 0.2513
- Aurc: 0.8508
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a gradient-accumulation sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
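With `gradient_accumulation_steps: 16`, each optimizer update aggregates 16 micro-batches of 16 images. The `Trainer` handles this internally; the toy sketch below only illustrates the mechanics.
```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Toy setup purely to show gradient accumulation; the real run uses DiT + Trainer.
model = nn.Linear(8, 10)
optimizer = torch.optim.Adam(model.parameters(), lr=2e-5)
loader = DataLoader(TensorDataset(torch.randn(64, 8), torch.randint(0, 10, (64,))), batch_size=4)
criterion = nn.CrossEntropyLoss()
accumulation_steps = 16

optimizer.zero_grad()
for step, (features, labels) in enumerate(loader):
    loss = criterion(model(features), labels)
    (loss / accumulation_steps).backward()        # scale so 16 micro-batches act as one batch
    if (step + 1) % accumulation_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
```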
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 3.4586 | 0.145 | 0.8999 | 10.1587 | 0.145 | 0.0253 | 0.2221 | 0.8467 |
| No log | 1.96 | 6 | 3.4232 | 0.145 | 0.8946 | 10.5824 | 0.145 | 0.0253 | 0.2242 | 0.8475 |
| No log | 2.96 | 9 | 3.3704 | 0.16 | 0.8867 | 8.6135 | 0.16 | 0.0503 | 0.2171 | 0.8440 |
| No log | 3.96 | 12 | 3.3273 | 0.155 | 0.8807 | 6.5471 | 0.155 | 0.0274 | 0.2248 | 0.8831 |
| No log | 4.96 | 15 | 3.3006 | 0.155 | 0.8779 | 6.8045 | 0.155 | 0.0271 | 0.2331 | 0.8918 |
| No log | 5.96 | 18 | 3.2856 | 0.16 | 0.8773 | 8.2046 | 0.16 | 0.0329 | 0.2361 | 0.8956 |
| No log | 6.96 | 21 | 3.2758 | 0.18 | 0.8774 | 8.0738 | 0.18 | 0.0308 | 0.2561 | 0.8544 |
| No log | 7.96 | 24 | 3.2688 | 0.18 | 0.8778 | 7.1046 | 0.18 | 0.0308 | 0.2647 | 0.8524 |
| No log | 8.96 | 27 | 3.2630 | 0.18 | 0.8778 | 6.9910 | 0.18 | 0.0306 | 0.2591 | 0.8530 |
| No log | 9.96 | 30 | 3.2597 | 0.18 | 0.8778 | 6.9680 | 0.18 | 0.0306 | 0.2736 | 0.8538 |
| No log | 10.96 | 33 | 3.2573 | 0.18 | 0.8776 | 6.9547 | 0.18 | 0.0306 | 0.2698 | 0.8536 |
| No log | 11.96 | 36 | 3.2557 | 0.18 | 0.8775 | 6.9491 | 0.18 | 0.0306 | 0.2653 | 0.8533 |
| No log | 12.96 | 39 | 3.2546 | 0.18 | 0.8773 | 6.8987 | 0.18 | 0.0306 | 0.2606 | 0.8526 |
| No log | 13.96 | 42 | 3.2536 | 0.18 | 0.8771 | 6.8204 | 0.18 | 0.0306 | 0.2601 | 0.8523 |
| No log | 14.96 | 45 | 3.2528 | 0.18 | 0.8771 | 6.8141 | 0.18 | 0.0306 | 0.2521 | 0.8519 |
| No log | 15.96 | 48 | 3.2522 | 0.18 | 0.8769 | 6.8074 | 0.18 | 0.0306 | 0.2606 | 0.8517 |
| No log | 16.96 | 51 | 3.2519 | 0.18 | 0.8769 | 6.8077 | 0.18 | 0.0306 | 0.2607 | 0.8515 |
| No log | 17.96 | 54 | 3.2520 | 0.18 | 0.8769 | 6.8050 | 0.18 | 0.0306 | 0.2561 | 0.8510 |
| No log | 18.96 | 57 | 3.2520 | 0.18 | 0.8769 | 6.8057 | 0.18 | 0.0306 | 0.2519 | 0.8509 |
| No log | 19.96 | 60 | 3.2515 | 0.18 | 0.8768 | 6.8046 | 0.18 | 0.0306 | 0.2556 | 0.8507 |
| No log | 20.96 | 63 | 3.2514 | 0.18 | 0.8768 | 6.8048 | 0.18 | 0.0306 | 0.2515 | 0.8506 |
| No log | 21.96 | 66 | 3.2512 | 0.18 | 0.8767 | 6.8048 | 0.18 | 0.0306 | 0.2556 | 0.8508 |
| No log | 22.96 | 69 | 3.2510 | 0.18 | 0.8767 | 6.8045 | 0.18 | 0.0306 | 0.2513 | 0.8509 |
| No log | 23.96 | 72 | 3.2510 | 0.18 | 0.8767 | 6.8043 | 0.18 | 0.0306 | 0.2513 | 0.8508 |
| No log | 24.96 | 75 | 3.2510 | 0.18 | 0.8767 | 6.8039 | 0.18 | 0.0306 | 0.2513 | 0.8508 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-small_tobacco3482_kd_CEKD_t2.5_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-small_tobacco3482_kd_CEKD_t2.5_a0.7
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco-3482 document-classification dataset.
It achieves the following results on the evaluation set:
- Loss: 3.1993
- Accuracy: 0.185
- Brier Loss: 0.8672
- Nll: 6.5703
- F1 Micro: 0.185
- F1 Macro: 0.0488
- Ece: 0.2594
- Aurc: 0.7367
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 3.4684 | 0.06 | 0.9042 | 9.2910 | 0.06 | 0.0114 | 0.1755 | 0.9033 |
| No log | 1.96 | 6 | 3.3741 | 0.18 | 0.8886 | 6.5491 | 0.18 | 0.0305 | 0.2324 | 0.8055 |
| No log | 2.96 | 9 | 3.2779 | 0.18 | 0.8767 | 7.2662 | 0.18 | 0.0305 | 0.2493 | 0.8196 |
| No log | 3.96 | 12 | 3.2605 | 0.18 | 0.8816 | 7.0963 | 0.18 | 0.0305 | 0.2628 | 0.8140 |
| No log | 4.96 | 15 | 3.2592 | 0.185 | 0.8814 | 6.9350 | 0.185 | 0.0488 | 0.2584 | 0.7850 |
| No log | 5.96 | 18 | 3.2576 | 0.185 | 0.8782 | 6.3113 | 0.185 | 0.0488 | 0.2561 | 0.7731 |
| No log | 6.96 | 21 | 3.2540 | 0.185 | 0.8747 | 6.0058 | 0.185 | 0.0488 | 0.2446 | 0.7705 |
| No log | 7.96 | 24 | 3.2500 | 0.185 | 0.8731 | 5.9849 | 0.185 | 0.0488 | 0.2442 | 0.7669 |
| No log | 8.96 | 27 | 3.2430 | 0.185 | 0.8717 | 5.9785 | 0.185 | 0.0488 | 0.2483 | 0.7626 |
| No log | 9.96 | 30 | 3.2377 | 0.185 | 0.8711 | 6.2837 | 0.185 | 0.0488 | 0.2462 | 0.7609 |
| No log | 10.96 | 33 | 3.2332 | 0.185 | 0.8713 | 6.8641 | 0.185 | 0.0488 | 0.2560 | 0.7601 |
| No log | 11.96 | 36 | 3.2293 | 0.185 | 0.8719 | 6.8631 | 0.185 | 0.0488 | 0.2523 | 0.7587 |
| No log | 12.96 | 39 | 3.2246 | 0.185 | 0.8717 | 6.8535 | 0.185 | 0.0488 | 0.2526 | 0.7558 |
| No log | 13.96 | 42 | 3.2190 | 0.185 | 0.8709 | 6.8177 | 0.185 | 0.0488 | 0.2565 | 0.7533 |
| No log | 14.96 | 45 | 3.2134 | 0.185 | 0.8700 | 6.7894 | 0.185 | 0.0488 | 0.2630 | 0.7533 |
| No log | 15.96 | 48 | 3.2091 | 0.185 | 0.8691 | 6.7672 | 0.185 | 0.0488 | 0.2585 | 0.7500 |
| No log | 16.96 | 51 | 3.2069 | 0.185 | 0.8687 | 6.6512 | 0.185 | 0.0488 | 0.2536 | 0.7466 |
| No log | 17.96 | 54 | 3.2063 | 0.185 | 0.8682 | 6.5227 | 0.185 | 0.0488 | 0.2520 | 0.7429 |
| No log | 18.96 | 57 | 3.2057 | 0.185 | 0.8682 | 6.5119 | 0.185 | 0.0488 | 0.2514 | 0.7406 |
| No log | 19.96 | 60 | 3.2036 | 0.185 | 0.8678 | 6.5674 | 0.185 | 0.0488 | 0.2501 | 0.7385 |
| No log | 20.96 | 63 | 3.2023 | 0.185 | 0.8677 | 6.5709 | 0.185 | 0.0488 | 0.2506 | 0.7385 |
| No log | 21.96 | 66 | 3.2010 | 0.185 | 0.8675 | 6.5731 | 0.185 | 0.0488 | 0.2631 | 0.7376 |
| No log | 22.96 | 69 | 3.2000 | 0.185 | 0.8673 | 6.5723 | 0.185 | 0.0488 | 0.2591 | 0.7371 |
| No log | 23.96 | 72 | 3.1996 | 0.185 | 0.8673 | 6.5715 | 0.185 | 0.0488 | 0.2593 | 0.7368 |
| No log | 24.96 | 75 | 3.1993 | 0.185 | 0.8672 | 6.5703 | 0.185 | 0.0488 | 0.2594 | 0.7367 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-tiny_tobacco3482_kd_CEKD_t2.5_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-tiny_tobacco3482_kd_CEKD_t2.5_a0.9
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco-3482 document-classification dataset.
It achieves the following results on the evaluation set:
- Loss: 2.5379
- Accuracy: 0.18
- Brier Loss: 0.8746
- Nll: 6.7389
- F1 Micro: 0.18
- F1 Macro: 0.0306
- Ece: 0.2460
- Aurc: 0.8496
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 2.6891 | 0.145 | 0.8999 | 10.1550 | 0.145 | 0.0253 | 0.2220 | 0.8466 |
| No log | 1.96 | 6 | 2.6592 | 0.145 | 0.8947 | 10.5706 | 0.145 | 0.0253 | 0.2238 | 0.8463 |
| No log | 2.96 | 9 | 2.6158 | 0.14 | 0.8869 | 8.5528 | 0.14 | 0.0422 | 0.2066 | 0.8175 |
| No log | 3.96 | 12 | 2.5827 | 0.175 | 0.8810 | 6.5464 | 0.175 | 0.0467 | 0.2385 | 0.8661 |
| No log | 4.96 | 15 | 2.5647 | 0.155 | 0.8781 | 6.8570 | 0.155 | 0.0274 | 0.2316 | 0.8886 |
| No log | 5.96 | 18 | 2.5566 | 0.19 | 0.8772 | 8.4283 | 0.19 | 0.0413 | 0.2460 | 0.8532 |
| No log | 6.96 | 21 | 2.5515 | 0.18 | 0.8769 | 7.6865 | 0.18 | 0.0308 | 0.2480 | 0.8517 |
| No log | 7.96 | 24 | 2.5475 | 0.18 | 0.8767 | 6.9727 | 0.18 | 0.0306 | 0.2469 | 0.8521 |
| No log | 8.96 | 27 | 2.5438 | 0.18 | 0.8762 | 6.9080 | 0.18 | 0.0306 | 0.2438 | 0.8525 |
| No log | 9.96 | 30 | 2.5420 | 0.18 | 0.8758 | 6.8906 | 0.18 | 0.0306 | 0.2521 | 0.8528 |
| No log | 10.96 | 33 | 2.5410 | 0.18 | 0.8755 | 6.8317 | 0.18 | 0.0306 | 0.2516 | 0.8524 |
| No log | 11.96 | 36 | 2.5404 | 0.18 | 0.8753 | 6.7606 | 0.18 | 0.0306 | 0.2469 | 0.8516 |
| No log | 12.96 | 39 | 2.5401 | 0.18 | 0.8752 | 6.7444 | 0.18 | 0.0306 | 0.2425 | 0.8516 |
| No log | 13.96 | 42 | 2.5397 | 0.18 | 0.8751 | 6.7397 | 0.18 | 0.0306 | 0.2498 | 0.8514 |
| No log | 14.96 | 45 | 2.5393 | 0.18 | 0.8750 | 6.7390 | 0.18 | 0.0306 | 0.2579 | 0.8511 |
| No log | 15.96 | 48 | 2.5389 | 0.18 | 0.8749 | 6.7366 | 0.18 | 0.0306 | 0.2463 | 0.8513 |
| No log | 16.96 | 51 | 2.5387 | 0.18 | 0.8749 | 6.7390 | 0.18 | 0.0306 | 0.2465 | 0.8510 |
| No log | 17.96 | 54 | 2.5389 | 0.18 | 0.8749 | 6.7382 | 0.18 | 0.0306 | 0.2425 | 0.8505 |
| No log | 18.96 | 57 | 2.5389 | 0.18 | 0.8749 | 6.7397 | 0.18 | 0.0306 | 0.2463 | 0.8504 |
| No log | 19.96 | 60 | 2.5384 | 0.18 | 0.8748 | 6.7391 | 0.18 | 0.0306 | 0.2421 | 0.8495 |
| No log | 20.96 | 63 | 2.5383 | 0.18 | 0.8747 | 6.7396 | 0.18 | 0.0306 | 0.2422 | 0.8500 |
| No log | 21.96 | 66 | 2.5380 | 0.18 | 0.8747 | 6.7399 | 0.18 | 0.0306 | 0.2460 | 0.8496 |
| No log | 22.96 | 69 | 2.5379 | 0.18 | 0.8746 | 6.7395 | 0.18 | 0.0306 | 0.2460 | 0.8497 |
| No log | 23.96 | 72 | 2.5379 | 0.18 | 0.8746 | 6.7393 | 0.18 | 0.0306 | 0.2460 | 0.8497 |
| No log | 24.96 | 75 | 2.5379 | 0.18 | 0.8746 | 6.7389 | 0.18 | 0.0306 | 0.2460 | 0.8496 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-small_rvl_cdip_100_examples_per_class_simkd_CEKD_t1_aNone
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-small_rvl_cdip_100_examples_per_class_simkd_CEKD_t1_aNone
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on a 100-examples-per-class subset of the RVL-CDIP dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1481
- Accuracy: 0.08
- Brier Loss: 0.9369
- Nll: 9.2883
- F1 Micro: 0.08
- F1 Macro: 0.0357
- Ece: 0.1153
- Aurc: 0.8531
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 12 | 0.1528 | 0.0625 | 0.9377 | 9.9656 | 0.0625 | 0.0074 | 0.1025 | 0.9319 |
| No log | 1.96 | 24 | 0.1507 | 0.06 | 0.9377 | 9.9434 | 0.06 | 0.0074 | 0.1036 | 0.9537 |
| No log | 2.96 | 36 | 0.1500 | 0.0625 | 0.9376 | 8.6216 | 0.0625 | 0.0074 | 0.1019 | 0.9383 |
| No log | 3.96 | 48 | 0.1498 | 0.0625 | 0.9376 | 9.2776 | 0.0625 | 0.0074 | 0.1032 | 0.9438 |
| No log | 4.96 | 60 | 0.1496 | 0.0625 | 0.9375 | 9.3105 | 0.0625 | 0.0074 | 0.1017 | 0.9421 |
| No log | 5.96 | 72 | 0.1495 | 0.0625 | 0.9375 | 9.7276 | 0.0625 | 0.0074 | 0.1029 | 0.9380 |
| No log | 6.96 | 84 | 0.1494 | 0.0625 | 0.9374 | 9.6348 | 0.0625 | 0.0074 | 0.1017 | 0.9347 |
| No log | 7.96 | 96 | 0.1493 | 0.0625 | 0.9374 | 9.6145 | 0.0625 | 0.0074 | 0.1008 | 0.9359 |
| No log | 8.96 | 108 | 0.1492 | 0.0625 | 0.9374 | 9.5748 | 0.0625 | 0.0074 | 0.1019 | 0.9371 |
| No log | 9.96 | 120 | 0.1491 | 0.0625 | 0.9373 | 9.5551 | 0.0625 | 0.0074 | 0.1005 | 0.9372 |
| No log | 10.96 | 132 | 0.1490 | 0.065 | 0.9373 | 9.5267 | 0.065 | 0.0122 | 0.1047 | 0.9315 |
| No log | 11.96 | 144 | 0.1489 | 0.065 | 0.9373 | 9.5165 | 0.065 | 0.0122 | 0.1043 | 0.9284 |
| No log | 12.96 | 156 | 0.1488 | 0.065 | 0.9372 | 9.5162 | 0.065 | 0.0123 | 0.1068 | 0.9302 |
| No log | 13.96 | 168 | 0.1488 | 0.07 | 0.9372 | 9.5139 | 0.07 | 0.0213 | 0.1070 | 0.9275 |
| No log | 14.96 | 180 | 0.1487 | 0.0725 | 0.9371 | 9.4579 | 0.0725 | 0.0253 | 0.1095 | 0.9174 |
| No log | 15.96 | 192 | 0.1486 | 0.075 | 0.9371 | 9.3950 | 0.075 | 0.0286 | 0.1106 | 0.9161 |
| No log | 16.96 | 204 | 0.1485 | 0.075 | 0.9371 | 9.3347 | 0.075 | 0.0280 | 0.1055 | 0.9014 |
| No log | 17.96 | 216 | 0.1484 | 0.0775 | 0.9370 | 9.3157 | 0.0775 | 0.0315 | 0.1089 | 0.8695 |
| No log | 18.96 | 228 | 0.1483 | 0.08 | 0.9370 | 9.3125 | 0.08 | 0.0362 | 0.1133 | 0.8526 |
| No log | 19.96 | 240 | 0.1483 | 0.08 | 0.9370 | 9.2915 | 0.08 | 0.0360 | 0.1113 | 0.8554 |
| No log | 20.96 | 252 | 0.1482 | 0.0775 | 0.9370 | 9.2937 | 0.0775 | 0.0374 | 0.1118 | 0.8475 |
| No log | 21.96 | 264 | 0.1482 | 0.08 | 0.9369 | 9.2903 | 0.08 | 0.0357 | 0.1167 | 0.8526 |
| No log | 22.96 | 276 | 0.1482 | 0.08 | 0.9369 | 9.2888 | 0.08 | 0.0357 | 0.1099 | 0.8540 |
| No log | 23.96 | 288 | 0.1481 | 0.08 | 0.9369 | 9.2877 | 0.08 | 0.0357 | 0.1126 | 0.8531 |
| No log | 24.96 | 300 | 0.1481 | 0.08 | 0.9369 | 9.2883 | 0.08 | 0.0357 | 0.1153 | 0.8531 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/dit-tiny_tobacco3482_kd_CEKD_t5.0_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-tiny_tobacco3482_kd_CEKD_t5.0_a0.5
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco-3482 document-classification dataset.
It achieves the following results on the evaluation set:
- Loss: 3.8497
- Accuracy: 0.18
- Brier Loss: 0.8788
- Nll: 6.0432
- F1 Micro: 0.18
- F1 Macro: 0.0305
- Ece: 0.2578
- Aurc: 0.8511
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 4.0678 | 0.145 | 0.8999 | 10.1608 | 0.145 | 0.0253 | 0.2221 | 0.8466 |
| No log | 1.96 | 6 | 4.0316 | 0.145 | 0.8948 | 10.5160 | 0.145 | 0.0253 | 0.2239 | 0.8468 |
| No log | 2.96 | 9 | 3.9774 | 0.16 | 0.8871 | 8.6333 | 0.16 | 0.0524 | 0.2217 | 0.8424 |
| No log | 3.96 | 12 | 3.9325 | 0.155 | 0.8813 | 6.5340 | 0.155 | 0.0272 | 0.2161 | 0.8837 |
| No log | 4.96 | 15 | 3.9041 | 0.155 | 0.8787 | 7.1704 | 0.155 | 0.0271 | 0.2296 | 0.8923 |
| No log | 5.96 | 18 | 3.8876 | 0.155 | 0.8782 | 8.7334 | 0.155 | 0.0277 | 0.2325 | 0.8942 |
| No log | 6.96 | 21 | 3.8766 | 0.18 | 0.8785 | 8.8120 | 0.18 | 0.0314 | 0.2476 | 0.8555 |
| No log | 7.96 | 24 | 3.8690 | 0.18 | 0.8791 | 8.8676 | 0.18 | 0.0308 | 0.2643 | 0.8534 |
| No log | 8.96 | 27 | 3.8633 | 0.18 | 0.8793 | 8.5299 | 0.18 | 0.0306 | 0.2594 | 0.8541 |
| No log | 9.96 | 30 | 3.8601 | 0.18 | 0.8796 | 7.4142 | 0.18 | 0.0305 | 0.2622 | 0.8548 |
| No log | 10.96 | 33 | 3.8577 | 0.18 | 0.8797 | 6.6642 | 0.18 | 0.0305 | 0.2720 | 0.8546 |
| No log | 11.96 | 36 | 3.8560 | 0.18 | 0.8797 | 6.2862 | 0.18 | 0.0305 | 0.2723 | 0.8543 |
| No log | 12.96 | 39 | 3.8547 | 0.18 | 0.8796 | 6.2084 | 0.18 | 0.0305 | 0.2678 | 0.8541 |
| No log | 13.96 | 42 | 3.8535 | 0.18 | 0.8794 | 6.1826 | 0.18 | 0.0305 | 0.2631 | 0.8534 |
| No log | 14.96 | 45 | 3.8525 | 0.18 | 0.8793 | 6.1744 | 0.18 | 0.0305 | 0.2593 | 0.8529 |
| No log | 15.96 | 48 | 3.8516 | 0.18 | 0.8792 | 6.1606 | 0.18 | 0.0305 | 0.2680 | 0.8527 |
| No log | 16.96 | 51 | 3.8511 | 0.18 | 0.8791 | 6.1634 | 0.18 | 0.0305 | 0.2724 | 0.8528 |
| No log | 17.96 | 54 | 3.8510 | 0.18 | 0.8791 | 6.0971 | 0.18 | 0.0305 | 0.2676 | 0.8525 |
| No log | 18.96 | 57 | 3.8508 | 0.18 | 0.8790 | 6.0686 | 0.18 | 0.0305 | 0.2630 | 0.8522 |
| No log | 19.96 | 60 | 3.8503 | 0.18 | 0.8789 | 6.0495 | 0.18 | 0.0305 | 0.2581 | 0.8518 |
| No log | 20.96 | 63 | 3.8501 | 0.18 | 0.8789 | 6.0918 | 0.18 | 0.0305 | 0.2581 | 0.8516 |
| No log | 21.96 | 66 | 3.8499 | 0.18 | 0.8788 | 6.0464 | 0.18 | 0.0305 | 0.2536 | 0.8516 |
| No log | 22.96 | 69 | 3.8497 | 0.18 | 0.8788 | 6.0419 | 0.18 | 0.0305 | 0.2535 | 0.8513 |
| No log | 23.96 | 72 | 3.8497 | 0.18 | 0.8788 | 6.0432 | 0.18 | 0.0305 | 0.2578 | 0.8511 |
| No log | 24.96 | 75 | 3.8497 | 0.18 | 0.8788 | 6.0432 | 0.18 | 0.0305 | 0.2578 | 0.8511 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-small_tobacco3482_kd_CEKD_t5.0_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-small_tobacco3482_kd_CEKD_t5.0_a0.5
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco-3482 document-classification dataset.
It achieves the following results on the evaluation set:
- Loss: 3.7912
- Accuracy: 0.185
- Brier Loss: 0.8688
- Nll: 5.6106
- F1 Micro: 0.185
- F1 Macro: 0.0488
- Ece: 0.2524
- Aurc: 0.7391
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 4.0715 | 0.06 | 0.9043 | 8.8976 | 0.06 | 0.0114 | 0.1751 | 0.9034 |
| No log | 1.96 | 6 | 3.9774 | 0.18 | 0.8893 | 8.0316 | 0.18 | 0.0305 | 0.2237 | 0.8040 |
| No log | 2.96 | 9 | 3.8805 | 0.18 | 0.8782 | 8.6752 | 0.18 | 0.0305 | 0.2566 | 0.8189 |
| No log | 3.96 | 12 | 3.8615 | 0.18 | 0.8836 | 8.9177 | 0.18 | 0.0305 | 0.2645 | 0.8205 |
| No log | 4.96 | 15 | 3.8624 | 0.185 | 0.8844 | 6.3245 | 0.185 | 0.0488 | 0.2727 | 0.7889 |
| No log | 5.96 | 18 | 3.8605 | 0.185 | 0.8813 | 5.1679 | 0.185 | 0.0488 | 0.2558 | 0.7797 |
| No log | 6.96 | 21 | 3.8511 | 0.185 | 0.8774 | 5.1770 | 0.185 | 0.0488 | 0.2510 | 0.7741 |
| No log | 7.96 | 24 | 3.8410 | 0.185 | 0.8751 | 5.6014 | 0.185 | 0.0488 | 0.2458 | 0.7699 |
| No log | 8.96 | 27 | 3.8317 | 0.185 | 0.8733 | 5.9766 | 0.185 | 0.0488 | 0.2537 | 0.7681 |
| No log | 9.96 | 30 | 3.8259 | 0.185 | 0.8724 | 6.0278 | 0.185 | 0.0488 | 0.2473 | 0.7689 |
| No log | 10.96 | 33 | 3.8226 | 0.185 | 0.8724 | 6.8070 | 0.185 | 0.0488 | 0.2618 | 0.7671 |
| No log | 11.96 | 36 | 3.8209 | 0.185 | 0.8730 | 7.6044 | 0.185 | 0.0488 | 0.2539 | 0.7643 |
| No log | 12.96 | 39 | 3.8187 | 0.185 | 0.8730 | 8.1654 | 0.185 | 0.0488 | 0.2542 | 0.7612 |
| No log | 13.96 | 42 | 3.8147 | 0.185 | 0.8725 | 7.1073 | 0.185 | 0.0488 | 0.2542 | 0.7566 |
| No log | 14.96 | 45 | 3.8096 | 0.185 | 0.8720 | 6.3875 | 0.185 | 0.0488 | 0.2565 | 0.7566 |
| No log | 15.96 | 48 | 3.8052 | 0.185 | 0.8712 | 6.0256 | 0.185 | 0.0488 | 0.2518 | 0.7524 |
| No log | 16.96 | 51 | 3.8022 | 0.185 | 0.8707 | 5.7809 | 0.185 | 0.0488 | 0.2558 | 0.7485 |
| No log | 17.96 | 54 | 3.8008 | 0.185 | 0.8701 | 5.6835 | 0.185 | 0.0488 | 0.2496 | 0.7442 |
| No log | 18.96 | 57 | 3.7992 | 0.185 | 0.8700 | 5.3867 | 0.185 | 0.0488 | 0.2490 | 0.7421 |
| No log | 19.96 | 60 | 3.7965 | 0.185 | 0.8694 | 5.4928 | 0.185 | 0.0488 | 0.2478 | 0.7406 |
| No log | 20.96 | 63 | 3.7948 | 0.185 | 0.8693 | 5.5527 | 0.185 | 0.0488 | 0.2481 | 0.7405 |
| No log | 21.96 | 66 | 3.7932 | 0.185 | 0.8691 | 5.5585 | 0.185 | 0.0488 | 0.2564 | 0.7396 |
| No log | 22.96 | 69 | 3.7921 | 0.185 | 0.8689 | 5.5607 | 0.185 | 0.0488 | 0.2479 | 0.7391 |
| No log | 23.96 | 72 | 3.7915 | 0.185 | 0.8688 | 5.6116 | 0.185 | 0.0488 | 0.2523 | 0.7390 |
| No log | 24.96 | 75 | 3.7912 | 0.185 | 0.8688 | 5.6106 | 0.185 | 0.0488 | 0.2524 | 0.7391 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-tiny_tobacco3482_kd_CEKD_t5.0_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-tiny_tobacco3482_kd_CEKD_t5.0_a0.7
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco-3482 document-classification dataset.
It achieves the following results on the evaluation set:
- Loss: 3.1844
- Accuracy: 0.18
- Brier Loss: 0.8763
- Nll: 6.0873
- F1 Micro: 0.18
- F1 Macro: 0.0306
- Ece: 0.2492
- Aurc: 0.8505
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 3.3625 | 0.145 | 0.8999 | 10.1577 | 0.145 | 0.0253 | 0.2220 | 0.8466 |
| No log | 1.96 | 6 | 3.3300 | 0.145 | 0.8947 | 10.5652 | 0.145 | 0.0253 | 0.2237 | 0.8468 |
| No log | 2.96 | 9 | 3.2822 | 0.14 | 0.8870 | 8.5877 | 0.14 | 0.0453 | 0.2040 | 0.8325 |
| No log | 3.96 | 12 | 3.2442 | 0.16 | 0.8812 | 6.5385 | 0.16 | 0.0327 | 0.2208 | 0.8814 |
| No log | 4.96 | 15 | 3.2219 | 0.155 | 0.8784 | 7.1527 | 0.155 | 0.0271 | 0.2352 | 0.8898 |
| No log | 5.96 | 18 | 3.2105 | 0.185 | 0.8778 | 8.7319 | 0.185 | 0.0517 | 0.2548 | 0.8944 |
| No log | 6.96 | 21 | 3.2032 | 0.18 | 0.8778 | 8.8034 | 0.18 | 0.0308 | 0.2478 | 0.8527 |
| No log | 7.96 | 24 | 3.1980 | 0.18 | 0.8779 | 8.1814 | 0.18 | 0.0306 | 0.2635 | 0.8527 |
| No log | 8.96 | 27 | 3.1937 | 0.18 | 0.8777 | 7.0314 | 0.18 | 0.0306 | 0.2618 | 0.8529 |
| No log | 9.96 | 30 | 3.1915 | 0.18 | 0.8776 | 6.9166 | 0.18 | 0.0306 | 0.2591 | 0.8537 |
| No log | 10.96 | 33 | 3.1900 | 0.18 | 0.8774 | 6.8864 | 0.18 | 0.0306 | 0.2551 | 0.8535 |
| No log | 11.96 | 36 | 3.1889 | 0.18 | 0.8773 | 6.5148 | 0.18 | 0.0306 | 0.2547 | 0.8532 |
| No log | 12.96 | 39 | 3.1881 | 0.18 | 0.8771 | 6.1469 | 0.18 | 0.0306 | 0.2543 | 0.8530 |
| No log | 13.96 | 42 | 3.1872 | 0.18 | 0.8769 | 6.1318 | 0.18 | 0.0306 | 0.2538 | 0.8525 |
| No log | 14.96 | 45 | 3.1865 | 0.18 | 0.8768 | 6.0783 | 0.18 | 0.0306 | 0.2501 | 0.8525 |
| No log | 15.96 | 48 | 3.1859 | 0.18 | 0.8766 | 6.0654 | 0.18 | 0.0306 | 0.2500 | 0.8520 |
| No log | 16.96 | 51 | 3.1855 | 0.18 | 0.8766 | 6.0809 | 0.18 | 0.0306 | 0.2459 | 0.8516 |
| No log | 17.96 | 54 | 3.1855 | 0.18 | 0.8766 | 6.0610 | 0.18 | 0.0306 | 0.2497 | 0.8515 |
| No log | 18.96 | 57 | 3.1854 | 0.18 | 0.8766 | 6.0659 | 0.18 | 0.0306 | 0.2579 | 0.8515 |
| No log | 19.96 | 60 | 3.1850 | 0.18 | 0.8764 | 6.0737 | 0.18 | 0.0306 | 0.2656 | 0.8513 |
| No log | 20.96 | 63 | 3.1848 | 0.18 | 0.8764 | 6.0869 | 0.18 | 0.0306 | 0.2575 | 0.8510 |
| No log | 21.96 | 66 | 3.1846 | 0.18 | 0.8764 | 6.1423 | 0.18 | 0.0306 | 0.2533 | 0.8510 |
| No log | 22.96 | 69 | 3.1845 | 0.18 | 0.8763 | 6.1047 | 0.18 | 0.0306 | 0.2532 | 0.8505 |
| No log | 23.96 | 72 | 3.1845 | 0.18 | 0.8763 | 6.0895 | 0.18 | 0.0306 | 0.2532 | 0.8504 |
| No log | 24.96 | 75 | 3.1844 | 0.18 | 0.8763 | 6.0873 | 0.18 | 0.0306 | 0.2492 | 0.8505 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-small_tobacco3482_kd_CEKD_t5.0_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-small_tobacco3482_kd_CEKD_t5.0_a0.7
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 3.1347
- Accuracy: 0.185
- Brier Loss: 0.8666
- Nll: 5.9997
- F1 Micro: 0.185
- F1 Macro: 0.0488
- Ece: 0.2480
- Aurc: 0.7353
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 3.3695 | 0.06 | 0.9042 | 9.1505 | 0.06 | 0.0114 | 0.1750 | 0.9033 |
| No log | 1.96 | 6 | 3.2847 | 0.18 | 0.8890 | 7.1646 | 0.18 | 0.0305 | 0.2263 | 0.8027 |
| No log | 2.96 | 9 | 3.2039 | 0.18 | 0.8773 | 8.6118 | 0.18 | 0.0305 | 0.2478 | 0.8186 |
| No log | 3.96 | 12 | 3.1950 | 0.18 | 0.8806 | 7.4891 | 0.18 | 0.0305 | 0.2514 | 0.8131 |
| No log | 4.96 | 15 | 3.1951 | 0.185 | 0.8795 | 6.7125 | 0.185 | 0.0488 | 0.2555 | 0.7835 |
| No log | 5.96 | 18 | 3.1931 | 0.185 | 0.8766 | 5.2600 | 0.185 | 0.0488 | 0.2526 | 0.7702 |
| No log | 6.96 | 21 | 3.1876 | 0.185 | 0.8741 | 5.6453 | 0.185 | 0.0488 | 0.2372 | 0.7672 |
| No log | 7.96 | 24 | 3.1800 | 0.185 | 0.8726 | 5.9473 | 0.185 | 0.0488 | 0.2412 | 0.7644 |
| No log | 8.96 | 27 | 3.1712 | 0.185 | 0.8712 | 5.9421 | 0.185 | 0.0488 | 0.2491 | 0.7615 |
| No log | 9.96 | 30 | 3.1656 | 0.185 | 0.8704 | 6.6276 | 0.185 | 0.0488 | 0.2516 | 0.7602 |
| No log | 10.96 | 33 | 3.1623 | 0.185 | 0.8704 | 6.8796 | 0.185 | 0.0488 | 0.2487 | 0.7598 |
| No log | 11.96 | 36 | 3.1601 | 0.185 | 0.8708 | 7.1352 | 0.185 | 0.0488 | 0.2451 | 0.7559 |
| No log | 12.96 | 39 | 3.1573 | 0.185 | 0.8706 | 7.0151 | 0.185 | 0.0488 | 0.2492 | 0.7531 |
| No log | 13.96 | 42 | 3.1531 | 0.185 | 0.8699 | 6.7912 | 0.185 | 0.0488 | 0.2450 | 0.7484 |
| No log | 14.96 | 45 | 3.1485 | 0.185 | 0.8693 | 6.6578 | 0.185 | 0.0488 | 0.2513 | 0.7491 |
| No log | 15.96 | 48 | 3.1449 | 0.185 | 0.8685 | 6.1407 | 0.185 | 0.0488 | 0.2596 | 0.7463 |
| No log | 16.96 | 51 | 3.1428 | 0.185 | 0.8681 | 5.9160 | 0.185 | 0.0488 | 0.2548 | 0.7432 |
| No log | 17.96 | 54 | 3.1421 | 0.185 | 0.8678 | 5.8419 | 0.185 | 0.0488 | 0.2449 | 0.7401 |
| No log | 18.96 | 57 | 3.1413 | 0.185 | 0.8677 | 5.7417 | 0.185 | 0.0488 | 0.2606 | 0.7382 |
| No log | 19.96 | 60 | 3.1391 | 0.185 | 0.8673 | 5.7824 | 0.185 | 0.0488 | 0.2432 | 0.7365 |
| No log | 20.96 | 63 | 3.1378 | 0.185 | 0.8671 | 5.9509 | 0.185 | 0.0488 | 0.2598 | 0.7368 |
| No log | 21.96 | 66 | 3.1364 | 0.185 | 0.8668 | 6.0164 | 0.185 | 0.0488 | 0.2477 | 0.7361 |
| No log | 22.96 | 69 | 3.1355 | 0.185 | 0.8667 | 6.0109 | 0.185 | 0.0488 | 0.2437 | 0.7352 |
| No log | 23.96 | 72 | 3.1350 | 0.185 | 0.8666 | 6.0029 | 0.185 | 0.0488 | 0.2438 | 0.7351 |
| No log | 24.96 | 75 | 3.1347 | 0.185 | 0.8666 | 5.9997 | 0.185 | 0.0488 | 0.2480 | 0.7353 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-tiny_tobacco3482_kd_CEKD_t5.0_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-tiny_tobacco3482_kd_CEKD_t5.0_a0.9
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 2.5147
- Accuracy: 0.18
- Brier Loss: 0.8746
- Nll: 6.7241
- F1 Micro: 0.18
- F1 Macro: 0.0306
- Ece: 0.2451
- Aurc: 0.8494
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 2.6571 | 0.145 | 0.8999 | 10.1542 | 0.145 | 0.0253 | 0.2220 | 0.8466 |
| No log | 1.96 | 6 | 2.6281 | 0.145 | 0.8947 | 10.5635 | 0.145 | 0.0253 | 0.2236 | 0.8461 |
| No log | 2.96 | 9 | 2.5865 | 0.14 | 0.8870 | 8.5822 | 0.14 | 0.0433 | 0.2063 | 0.8040 |
| No log | 3.96 | 12 | 2.5552 | 0.19 | 0.8811 | 6.5445 | 0.19 | 0.0552 | 0.2421 | 0.8576 |
| No log | 4.96 | 15 | 2.5387 | 0.155 | 0.8782 | 7.1184 | 0.155 | 0.0277 | 0.2280 | 0.8892 |
| No log | 5.96 | 18 | 2.5317 | 0.18 | 0.8774 | 8.7285 | 0.18 | 0.0319 | 0.2392 | 0.8538 |
| No log | 6.96 | 21 | 2.5274 | 0.18 | 0.8770 | 8.2533 | 0.18 | 0.0306 | 0.2476 | 0.8514 |
| No log | 7.96 | 24 | 2.5238 | 0.18 | 0.8767 | 6.9903 | 0.18 | 0.0306 | 0.2465 | 0.8523 |
| No log | 8.96 | 27 | 2.5205 | 0.18 | 0.8762 | 6.9049 | 0.18 | 0.0306 | 0.2473 | 0.8528 |
| No log | 9.96 | 30 | 2.5189 | 0.18 | 0.8758 | 6.8830 | 0.18 | 0.0306 | 0.2515 | 0.8526 |
| No log | 10.96 | 33 | 2.5180 | 0.18 | 0.8756 | 6.8133 | 0.18 | 0.0306 | 0.2469 | 0.8522 |
| No log | 11.96 | 36 | 2.5175 | 0.18 | 0.8754 | 6.7422 | 0.18 | 0.0306 | 0.2500 | 0.8519 |
| No log | 12.96 | 39 | 2.5173 | 0.18 | 0.8753 | 6.5762 | 0.18 | 0.0306 | 0.2533 | 0.8515 |
| No log | 13.96 | 42 | 2.5168 | 0.18 | 0.8751 | 6.5666 | 0.18 | 0.0306 | 0.2528 | 0.8516 |
| No log | 14.96 | 45 | 2.5164 | 0.18 | 0.8750 | 6.7246 | 0.18 | 0.0306 | 0.2532 | 0.8512 |
| No log | 15.96 | 48 | 2.5160 | 0.18 | 0.8750 | 6.7221 | 0.18 | 0.0306 | 0.2456 | 0.8507 |
| No log | 16.96 | 51 | 2.5157 | 0.18 | 0.8749 | 6.7242 | 0.18 | 0.0306 | 0.2457 | 0.8507 |
| No log | 17.96 | 54 | 2.5158 | 0.18 | 0.8749 | 6.7241 | 0.18 | 0.0306 | 0.2417 | 0.8503 |
| No log | 18.96 | 57 | 2.5157 | 0.18 | 0.8749 | 6.7259 | 0.18 | 0.0306 | 0.2455 | 0.8503 |
| No log | 19.96 | 60 | 2.5153 | 0.18 | 0.8748 | 6.7248 | 0.18 | 0.0306 | 0.2452 | 0.8495 |
| No log | 20.96 | 63 | 2.5151 | 0.18 | 0.8748 | 6.7250 | 0.18 | 0.0306 | 0.2414 | 0.8494 |
| No log | 21.96 | 66 | 2.5149 | 0.18 | 0.8747 | 6.7250 | 0.18 | 0.0306 | 0.2452 | 0.8495 |
| No log | 22.96 | 69 | 2.5147 | 0.18 | 0.8747 | 6.7247 | 0.18 | 0.0306 | 0.2451 | 0.8495 |
| No log | 23.96 | 72 | 2.5147 | 0.18 | 0.8747 | 6.7246 | 0.18 | 0.0306 | 0.2451 | 0.8495 |
| No log | 24.96 | 75 | 2.5147 | 0.18 | 0.8746 | 6.7241 | 0.18 | 0.0306 | 0.2451 | 0.8494 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-small_tobacco3482_kd_CEKD_t5.0_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-small_tobacco3482_kd_CEKD_t5.0_a0.9
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 2.4735
- Accuracy: 0.19
- Brier Loss: 0.8651
- Nll: 6.3618
- F1 Micro: 0.19
- F1 Macro: 0.0641
- Ece: 0.2456
- Aurc: 0.7331
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 2.6674 | 0.06 | 0.9042 | 9.2824 | 0.06 | 0.0114 | 0.1749 | 0.9042 |
| No log | 1.96 | 6 | 2.5911 | 0.18 | 0.8886 | 6.4746 | 0.18 | 0.0305 | 0.2317 | 0.8026 |
| No log | 2.96 | 9 | 2.5252 | 0.18 | 0.8764 | 7.5079 | 0.18 | 0.0305 | 0.2390 | 0.8141 |
| No log | 3.96 | 12 | 2.5235 | 0.185 | 0.8777 | 6.9489 | 0.185 | 0.0488 | 0.2553 | 0.7838 |
| No log | 4.96 | 15 | 2.5223 | 0.185 | 0.8754 | 6.8606 | 0.185 | 0.0488 | 0.2572 | 0.7773 |
| No log | 5.96 | 18 | 2.5213 | 0.185 | 0.8732 | 5.9794 | 0.185 | 0.0488 | 0.2384 | 0.7684 |
| No log | 6.96 | 21 | 2.5203 | 0.185 | 0.8723 | 5.9244 | 0.185 | 0.0488 | 0.2406 | 0.7603 |
| No log | 7.96 | 24 | 2.5149 | 0.185 | 0.8713 | 5.9034 | 0.185 | 0.0488 | 0.2484 | 0.7560 |
| No log | 8.96 | 27 | 2.5064 | 0.185 | 0.8701 | 5.9325 | 0.185 | 0.0488 | 0.2525 | 0.7529 |
| No log | 9.96 | 30 | 2.5014 | 0.185 | 0.8695 | 6.7123 | 0.185 | 0.0488 | 0.2399 | 0.7528 |
| No log | 10.96 | 33 | 2.4977 | 0.185 | 0.8693 | 6.7598 | 0.185 | 0.0488 | 0.2487 | 0.7511 |
| No log | 11.96 | 36 | 2.4944 | 0.185 | 0.8692 | 6.8130 | 0.185 | 0.0488 | 0.2488 | 0.7476 |
| No log | 12.96 | 39 | 2.4908 | 0.185 | 0.8688 | 6.7610 | 0.185 | 0.0488 | 0.2488 | 0.7452 |
| No log | 13.96 | 42 | 2.4867 | 0.185 | 0.8680 | 6.6686 | 0.185 | 0.0488 | 0.2484 | 0.7428 |
| No log | 14.96 | 45 | 2.4830 | 0.185 | 0.8673 | 6.6283 | 0.185 | 0.0488 | 0.2426 | 0.7431 |
| No log | 15.96 | 48 | 2.4805 | 0.185 | 0.8668 | 6.4857 | 0.185 | 0.0488 | 0.2385 | 0.7410 |
| No log | 16.96 | 51 | 2.4794 | 0.185 | 0.8666 | 6.4425 | 0.185 | 0.0488 | 0.2459 | 0.7385 |
| No log | 17.96 | 54 | 2.4795 | 0.185 | 0.8664 | 6.0769 | 0.185 | 0.0488 | 0.2406 | 0.7352 |
| No log | 18.96 | 57 | 2.4793 | 0.185 | 0.8664 | 6.1000 | 0.185 | 0.0488 | 0.2402 | 0.7355 |
| No log | 19.96 | 60 | 2.4774 | 0.185 | 0.8660 | 6.3802 | 0.185 | 0.0488 | 0.2506 | 0.7346 |
| No log | 20.96 | 63 | 2.4762 | 0.185 | 0.8657 | 6.4330 | 0.185 | 0.0488 | 0.2550 | 0.7344 |
| No log | 21.96 | 66 | 2.4750 | 0.185 | 0.8654 | 6.3721 | 0.185 | 0.0488 | 0.2513 | 0.7333 |
| No log | 22.96 | 69 | 2.4741 | 0.19 | 0.8652 | 6.3676 | 0.19 | 0.0641 | 0.2453 | 0.7332 |
| No log | 23.96 | 72 | 2.4738 | 0.19 | 0.8652 | 6.3645 | 0.19 | 0.0641 | 0.2455 | 0.7331 |
| No log | 24.96 | 75 | 2.4735 | 0.19 | 0.8651 | 6.3618 | 0.19 | 0.0641 | 0.2456 | 0.7331 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
wuru330/results
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# results
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8129
- Accuracy: 0.5969
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
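As a hedged sketch (the training script itself is not part of this card, so anything beyond the values listed above is an assumption), these hyperparameters roughly correspond to a `transformers.TrainingArguments` configuration like:

```python
from transformers import TrainingArguments

# Approximate reconstruction of the configuration listed above; output_dir is assumed.
training_args = TrainingArguments(
    output_dir="results",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,   # effective train batch size: 16 * 4 = 64
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=30,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```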
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.9759 | 1.0 | 37 | 0.9392 | 0.5408 |
| 0.8313 | 2.0 | 74 | 0.8845 | 0.6122 |
| 0.8032 | 3.0 | 111 | 0.8459 | 0.6122 |
| 0.7375 | 4.0 | 148 | 0.8693 | 0.5782 |
| 0.635 | 5.0 | 185 | 0.8724 | 0.6344 |
| 0.578 | 6.0 | 222 | 0.9932 | 0.5629 |
| 0.3875 | 7.0 | 259 | 1.0738 | 0.5952 |
| 0.3544 | 8.0 | 296 | 1.1359 | 0.6156 |
| 0.407 | 9.0 | 333 | 1.3020 | 0.5493 |
| 0.2329 | 10.0 | 370 | 1.2567 | 0.6020 |
| 0.2305 | 11.0 | 407 | 1.3148 | 0.6156 |
| 0.2098 | 12.0 | 444 | 1.2928 | 0.6241 |
| 0.1595 | 13.0 | 481 | 1.5325 | 0.5629 |
| 0.1515 | 14.0 | 518 | 1.4402 | 0.6156 |
| 0.1429 | 15.0 | 555 | 1.4456 | 0.6276 |
| 0.1812 | 16.0 | 592 | 1.5088 | 0.5663 |
| 0.1169 | 17.0 | 629 | 1.6266 | 0.5850 |
| 0.1375 | 18.0 | 666 | 1.5252 | 0.6173 |
| 0.0907 | 19.0 | 703 | 1.6055 | 0.6088 |
| 0.1003 | 20.0 | 740 | 1.5785 | 0.6003 |
| 0.0756 | 21.0 | 777 | 1.6485 | 0.5850 |
| 0.0641 | 22.0 | 814 | 1.6257 | 0.6190 |
| 0.0387 | 23.0 | 851 | 1.6758 | 0.6105 |
| 0.0341 | 24.0 | 888 | 1.7239 | 0.6088 |
| 0.0227 | 25.0 | 925 | 1.7956 | 0.6020 |
| 0.0247 | 26.0 | 962 | 1.7542 | 0.6037 |
| 0.014 | 27.0 | 999 | 1.7693 | 0.6139 |
| 0.0152 | 28.0 | 1036 | 1.8133 | 0.5969 |
| 0.0125 | 29.0 | 1073 | 1.8082 | 0.6037 |
| 0.0116 | 30.0 | 1110 | 1.8129 | 0.5969 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"label_0",
"label_1",
"label_2"
] |
hvisionrc/vit-base-patch16-224-finetuned-flower
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-finetuned-flower
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.24.0
- Pytorch 2.0.1+cu118
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"daisy",
"dandelion",
"roses",
"sunflowers",
"tulips"
] |
wesleyacheng/dog-breeds-multiclass-image-classification-with-vit
|
Model made with the notebook first posted on my [Kaggle](https://www.kaggle.com/wesleyacheng/dog-breeds-multiclass-image-classification-w-vit).
# Model Motivation
Recently, someone asked me if you can classify dog images into their respective dog breeds, instead of just differentiating cats from dogs like my last [notebook](https://www.kaggle.com/code/wesleyacheng/cat-vs-dog-image-classification-with-cnns). I say **YES**!
Due to the complexity of the problem, we will be using the most advanced computer vision architecture released in the [2020 Google paper](https://arxiv.org/pdf/2010.11929v2.pdf), the [**Vision Transformer**](https://paperswithcode.com/methods/category/vision-transformer).
The difference between the **Vision Transformer** and the traditional **Convolutional Neural Network (CNN)** is how it treats an image. In **Vision Transformers**, we take patches of the original image, say 16 x 16 pixels each, and feed them into the Transformer as a sequence with positional embeddings and self-attention, while in a **Convolutional Neural Network (CNN)** we take the same image as input but use convolutions and pooling layers as inductive biases. What this means is that the **Vision Transformer** can use its own judgment to attend to any particular patch of the image in a *global* fashion through its self-attention mechanism, without requiring us to guide the neural network the way a **CNN** does with *local* centering/cropping/bounding boxes on our images to help its convolutions.
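As a toy illustration of the patching described above (this calculation is mine, not from the original notebook): a 224 x 224 image cut into 16 x 16 patches becomes a sequence of 196 patch tokens for the Transformer.

```python
# Toy arithmetic: how many 16x16 patches does a 224x224 image yield?
image_size = 224
patch_size = 16

patches_per_side = image_size // patch_size   # 14
num_patches = patches_per_side ** 2           # 196 tokens fed to the Transformer
print(num_patches)
```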
This global, attention-driven view of the image makes the **Vision Transformer** architecture more flexible and scalable, allowing us to create [foundation models](https://blogs.nvidia.com/blog/2023/03/13/what-are-foundation-models) in computer vision, similar to NLP foundation models like [BERT](https://paperswithcode.com/method/bert) and [GPT](https://paperswithcode.com/method/gpt), with self-supervised/supervised pre-training on massive amounts of image data that generalizes to different computer vision tasks such as *image classification, recognition, segmentation, etc.* This cross-pollination helps us move closer towards the goal of Artificial General Intelligence.
One thing about **Vision Transformers** is that they have weaker inductive biases than **Convolutional Neural Networks**, which is exactly what enables their scalability and flexibility. This feature/bug, depending on who you ask, means that most well-performing pre-trained models need more data despite having fewer parameters than their CNN counterparts.
Luckily, in this model, we will use a **Vision Transformer** from [Google hosted at HuggingFace](https://huggingface.co/google/vit-base-patch16-224-in21k), pre-trained on the [ImageNet-21k dataset](https://paperswithcode.com/paper/imagenet-21k-pretraining-for-the-masses) (14 million images, 21k classes) with 16x16 patches and 224x224 resolution, to bypass that data limitation. We will be fine-tuning this model on our "small" dog breeds dataset of around 20 thousand images from the [Stanford Dogs dataset](http://vision.stanford.edu/aditya86/ImageNetDogs/), imported by Jessica Li into [Kaggle](https://www.kaggle.com/datasets/jessicali9530/stanford-dogs-dataset), to classify dog images into 120 types of dog breeds!
# Model Description
This model is finetuned using the [Google Vision Transformer (vit-base-patch16-224-in21k)](https://huggingface.co/google/vit-base-patch16-224-in21k) on the [Stanford Dogs dataset in Kaggle](https://www.kaggle.com/datasets/jessicali9530/stanford-dogs-dataset) to classify dog images into 120 types of dog breeds.
# Intended Uses & Limitations
You can use this finetuned model to classify dog images only, and only into the dog breeds that are in the dataset.
# How to Use
```python
from transformers import AutoImageProcessor, AutoModelForImageClassification
import PIL.Image
import requests
url = "https://upload.wikimedia.org/wikipedia/commons/5/55/Beagle_600.jpg"
image = PIL.Image.open(requests.get(url, stream=True).raw)
image_processor = AutoImageProcessor.from_pretrained("wesleyacheng/dog-breeds-multiclass-image-classification-with-vit")
model = AutoModelForImageClassification.from_pretrained("wesleyacheng/dog-breeds-multiclass-image-classification-with-vit")
inputs = image_processor(images=image, return_tensors="pt")
outputs = model(**inputs)
logits = outputs.logits
# model predicts one of the 120 Stanford dog breeds classes
predicted_class_idx = logits.argmax(-1).item()
print("Predicted class:", model.config.id2label[predicted_class_idx])
```
# Model Training Metrics
| Epoch | Top-1 Accuracy | Top-3 Accuracy | Top-5 Accuracy | Macro F1 |
|-------|----------------|-----------------|----------------|----------|
| 1 | 79.8% | 95.1% | 97.5% | 77.2% |
| 2 | 83.8% | 96.7% | 98.2% | 81.9% |
| 3 | 84.8% | 96.7% | 98.3% | 83.4% |
# Model Evaluation Metrics
| Top-1 Accuracy | Top-3 Accuracy | Top-5 Accuracy | Macro F1 |
|----------------|-----------------|----------------|----------|
| 84.0% | 97.1% | 98.7% | 83.0% |
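Since the table above reports top-3 and top-5 accuracy, here is a small follow-up sketch (reusing the `model` and `inputs` variables from the How to Use snippet above) that prints the top-5 predicted breeds with their probabilities:

```python
import torch

# Re-run the forward pass without tracking gradients and rank the classes.
with torch.no_grad():
    logits = model(**inputs).logits

probs = torch.softmax(logits, dim=-1)[0]
top5 = torch.topk(probs, k=5)
for prob, idx in zip(top5.values.tolist(), top5.indices.tolist()):
    print(f"{model.config.id2label[idx]}: {prob:.3f}")
```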
|
[
"otterhound",
"cocker_spaniel",
"brittany_spaniel",
"afghan_hound",
"maltese_dog",
"schipperke",
"irish_setter",
"pekinese",
"golden_retriever",
"vizsla",
"welsh_springer_spaniel",
"staffordshire_bullterrier",
"border_collie",
"irish_terrier",
"eskimo_dog",
"pug",
"kelpie",
"yorkshire_terrier",
"tibetan_terrier",
"walker_hound",
"affenpinscher",
"cardigan",
"english_springer",
"english_foxhound",
"west_highland_white_terrier",
"lakeland_terrier",
"rhodesian_ridgeback",
"gordon_setter",
"lhasa",
"curly",
"beagle",
"tibetan_mastiff",
"sussex_spaniel",
"saint_bernard",
"toy_terrier",
"standard_poodle",
"bernese_mountain_dog",
"pomeranian",
"ibizan_hound",
"redbone",
"toy_poodle",
"basset",
"scottish_deerhound",
"miniature_pinscher",
"basenji",
"border_terrier",
"bedlington_terrier",
"kerry_blue_terrier",
"weimaraner",
"english_setter",
"bluetick",
"boston_bull",
"italian_greyhound",
"dandie_dinmont",
"airedale",
"irish_water_spaniel",
"norfolk_terrier",
"wire",
"french_bulldog",
"soft",
"komondor",
"african_hunting_dog",
"siberian_husky",
"newfoundland",
"bouvier_des_flandres",
"saluki",
"shetland_sheepdog",
"collie",
"rottweiler",
"silky_terrier",
"norwegian_elkhound",
"chihuahua",
"leonberg",
"norwich_terrier",
"cairn",
"boxer",
"borzoi",
"dhole",
"samoyed",
"german_shepherd",
"labrador_retriever",
"blenheim_spaniel",
"groenendael",
"doberman",
"great_dane",
"flat",
"appenzeller",
"shih",
"japanese_spaniel",
"greater_swiss_mountain_dog",
"black",
"dingo",
"great_pyrenees",
"whippet",
"keeshond",
"malinois",
"american_staffordshire_terrier",
"mexican_hairless",
"giant_schnauzer",
"brabancon_griffon",
"kuvasz",
"miniature_poodle",
"irish_wolfhound",
"briard",
"clumber",
"standard_schnauzer",
"bull_mastiff",
"malamute",
"sealyham_terrier",
"entlebucher",
"chow",
"papillon",
"pembroke",
"german_short",
"old_english_sheepdog",
"chesapeake_bay_retriever",
"scotch_terrier",
"australian_terrier",
"miniature_schnauzer",
"bloodhound"
] |
ALM-AHME/beit-large-patch16-224-finetuned-LungCancer-Classification-LC25000-AH-60-20-20
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# beit-large-patch16-224-finetuned-LungCancer-Classification-LC25000-AH-60-20-20
This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0001
- Accuracy: 1.0
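As a hedged usage sketch (not part of the original card), inference with this checkpoint could look like the following, using the standard `transformers` image-classification pipeline; the image path is a placeholder.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
classifier = pipeline(
    "image-classification",
    model="ALM-AHME/beit-large-patch16-224-finetuned-LungCancer-Classification-LC25000-AH-60-20-20",
)

# Replace with the path to your own histopathology image.
predictions = classifier("lung_tissue_sample.png")
print(predictions)  # list of {"label": ..., "score": ...} dicts
```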
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.5
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1071 | 0.99 | 140 | 0.0181 | 0.993 |
| 0.2317 | 2.0 | 281 | 0.0474 | 0.983 |
| 0.0454 | 3.0 | 422 | 0.0173 | 0.9927 |
| 0.0299 | 4.0 | 563 | 0.0153 | 0.994 |
| 0.0055 | 4.97 | 700 | 0.0001 | 1.0 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"lung-benign_tissue",
"lung_adenocarcinoma",
"lung_squamous_cell_carcinoma"
] |
ALM-AHME/convnextv2-large-1k-224-finetuned-LungCancer-Classification-LC25000-AH-60-20-20
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# convnextv2-large-1k-224-finetuned-LungCancer-Classification-LC25000-AH-60-20-20
This model is a fine-tuned version of [facebook/convnextv2-large-1k-224](https://huggingface.co/facebook/convnextv2-large-1k-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0043
- Accuracy: 0.9997
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.9
- num_epochs: 12
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.4332 | 1.0 | 281 | 0.3824 | 0.947 |
| 0.1558 | 2.0 | 563 | 0.1292 | 0.9737 |
| 0.1161 | 3.0 | 844 | 0.0556 | 0.9887 |
| 0.2337 | 4.0 | 1126 | 0.0683 | 0.982 |
| 0.1285 | 5.0 | 1407 | 0.0293 | 0.9923 |
| 0.047 | 6.0 | 1689 | 0.0987 | 0.975 |
| 0.0741 | 7.0 | 1970 | 0.0373 | 0.988 |
| 0.153 | 8.0 | 2252 | 0.0043 | 0.9997 |
| 0.0244 | 9.0 | 2533 | 0.0696 | 0.981 |
| 0.0646 | 10.0 | 2815 | 0.0120 | 0.995 |
| 0.0025 | 11.0 | 3096 | 0.0076 | 0.9977 |
| 0.0611 | 11.98 | 3372 | 0.0024 | 0.9997 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"lung-benign_tissue",
"lung_adenocarcinoma",
"lung_squamous_cell_carcinoma"
] |
ALM-AHME/swinv2-large-patch4-window12to16-192to256-22kto1k-ft-finetuned-LungCancer-LC25000-AH
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swinv2-large-patch4-window12to16-192to256-22kto1k-ft-finetuned-LungCancer-LC25000-AH
This model is a fine-tuned version of [microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft](https://huggingface.co/microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0002
- Accuracy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.5
- num_epochs: 7
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0929 | 1.0 | 281 | 0.0919 | 0.9657 |
| 0.0908 | 2.0 | 562 | 0.0127 | 0.9967 |
| 0.0525 | 3.0 | 843 | 0.0133 | 0.9947 |
| 0.1301 | 4.0 | 1125 | 0.0270 | 0.9927 |
| 0.0624 | 5.0 | 1406 | 0.0064 | 0.9973 |
| 0.0506 | 6.0 | 1687 | 0.0025 | 0.999 |
| 0.0001 | 6.99 | 1967 | 0.0002 | 1.0 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"lung-benign_tissue",
"lung_adenocarcinoma",
"lung_squamous_cell_carcinoma"
] |
jordyvl/dit-tiny_rvl_cdip_100_examples_per_class_kd_MSE_lr_fix
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-tiny_rvl_cdip_100_examples_per_class_kd_MSE_lr_fix
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4358
- Accuracy: 0.195
- Brier Loss: 0.9035
- Nll: 12.0550
- F1 Micro: 0.195
- F1 Macro: 0.1471
- Ece: 0.1675
- Aurc: 0.6988
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 1.5167 | 0.07 | 0.9368 | 20.8948 | 0.07 | 0.0305 | 0.1106 | 0.8850 |
| No log | 2.0 | 50 | 1.5246 | 0.08 | 0.9362 | 21.4368 | 0.08 | 0.0346 | 0.1200 | 0.8659 |
| No log | 3.0 | 75 | 1.5053 | 0.1 | 0.9340 | 23.7241 | 0.1000 | 0.0522 | 0.1280 | 0.8087 |
| No log | 4.0 | 100 | 1.5097 | 0.0975 | 0.9322 | 17.3004 | 0.0975 | 0.0487 | 0.1220 | 0.8220 |
| No log | 5.0 | 125 | 1.4926 | 0.12 | 0.9296 | 16.3893 | 0.12 | 0.0600 | 0.1284 | 0.7752 |
| No log | 6.0 | 150 | 1.4838 | 0.105 | 0.9273 | 19.3692 | 0.1050 | 0.0356 | 0.1254 | 0.7955 |
| No log | 7.0 | 175 | 1.4729 | 0.0975 | 0.9229 | 18.6899 | 0.0975 | 0.0411 | 0.1134 | 0.7963 |
| No log | 8.0 | 200 | 1.4754 | 0.125 | 0.9196 | 17.7842 | 0.125 | 0.0676 | 0.1238 | 0.7778 |
| No log | 9.0 | 225 | 1.4725 | 0.1125 | 0.9193 | 16.6572 | 0.1125 | 0.0505 | 0.1254 | 0.7839 |
| No log | 10.0 | 250 | 1.4702 | 0.1175 | 0.9168 | 16.3975 | 0.1175 | 0.0556 | 0.1183 | 0.7638 |
| No log | 11.0 | 275 | 1.4648 | 0.1175 | 0.9169 | 18.4274 | 0.1175 | 0.0558 | 0.1219 | 0.7806 |
| No log | 12.0 | 300 | 1.4660 | 0.155 | 0.9166 | 15.6492 | 0.155 | 0.0791 | 0.1411 | 0.7512 |
| No log | 13.0 | 325 | 1.4684 | 0.16 | 0.9164 | 17.1698 | 0.16 | 0.1140 | 0.1519 | 0.7285 |
| No log | 14.0 | 350 | 1.4662 | 0.1175 | 0.9158 | 17.6999 | 0.1175 | 0.0501 | 0.1269 | 0.7637 |
| No log | 15.0 | 375 | 1.4602 | 0.1675 | 0.9143 | 13.2540 | 0.1675 | 0.1153 | 0.1515 | 0.7223 |
| No log | 16.0 | 400 | 1.4556 | 0.1325 | 0.9138 | 13.3868 | 0.1325 | 0.0881 | 0.1323 | 0.7558 |
| No log | 17.0 | 425 | 1.4527 | 0.175 | 0.9128 | 11.1983 | 0.175 | 0.1334 | 0.1596 | 0.7153 |
| No log | 18.0 | 450 | 1.4535 | 0.1625 | 0.9111 | 17.6046 | 0.1625 | 0.1021 | 0.1435 | 0.7379 |
| No log | 19.0 | 475 | 1.4453 | 0.1825 | 0.9086 | 11.8948 | 0.1825 | 0.1228 | 0.1594 | 0.7098 |
| 1.4614 | 20.0 | 500 | 1.4431 | 0.1525 | 0.9078 | 14.2631 | 0.1525 | 0.1115 | 0.1410 | 0.7293 |
| 1.4614 | 21.0 | 525 | 1.4392 | 0.1825 | 0.9063 | 10.7664 | 0.1825 | 0.1378 | 0.1567 | 0.7058 |
| 1.4614 | 22.0 | 550 | 1.4469 | 0.1775 | 0.9055 | 13.4724 | 0.1775 | 0.1212 | 0.1483 | 0.7107 |
| 1.4614 | 23.0 | 575 | 1.4356 | 0.17 | 0.9039 | 11.8141 | 0.17 | 0.1232 | 0.1515 | 0.7091 |
| 1.4614 | 24.0 | 600 | 1.4370 | 0.1875 | 0.9039 | 12.9338 | 0.1875 | 0.1384 | 0.1539 | 0.7017 |
| 1.4614 | 25.0 | 625 | 1.4358 | 0.195 | 0.9035 | 12.0550 | 0.195 | 0.1471 | 0.1675 | 0.6988 |
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.12.0
- Tokenizers 0.12.1
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
mihule/vit-base-patch16-224-finetuned-flower
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-finetuned-flower
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.24.0
- Pytorch 2.0.1+cu118
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"daisy",
"dandelion",
"roses",
"sunflowers",
"tulips"
] |
jordyvl/dit-base_tobacco_small_student
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-base_tobacco_small_student
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 4.3305
- Accuracy: 0.435
- Brier Loss: 1.0472
- Nll: 10.3327
- F1 Micro: 0.435
- F1 Macro: 0.4299
- Ece: 0.5115
- Aurc: 0.4245
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 50 | 2.1780 | 0.16 | 0.8745 | 11.2696 | 0.16 | 0.0323 | 0.2326 | 0.8208 |
| No log | 2.0 | 100 | 2.1761 | 0.19 | 0.8727 | 10.5065 | 0.19 | 0.0548 | 0.2712 | 0.7980 |
| No log | 3.0 | 150 | 2.1426 | 0.16 | 0.8689 | 8.8915 | 0.16 | 0.0451 | 0.2697 | 0.6322 |
| No log | 4.0 | 200 | 2.0668 | 0.225 | 0.8434 | 9.6036 | 0.225 | 0.1219 | 0.2680 | 0.6623 |
| No log | 5.0 | 250 | 2.0633 | 0.21 | 0.8447 | 5.7679 | 0.2100 | 0.1401 | 0.2733 | 0.5765 |
| No log | 6.0 | 300 | 2.0030 | 0.22 | 0.8351 | 7.1501 | 0.22 | 0.1132 | 0.3000 | 0.6750 |
| No log | 7.0 | 350 | 1.9273 | 0.32 | 0.8243 | 6.2911 | 0.32 | 0.2612 | 0.2822 | 0.6549 |
| No log | 8.0 | 400 | 1.7954 | 0.365 | 0.7742 | 4.2641 | 0.3650 | 0.2647 | 0.2630 | 0.5031 |
| No log | 9.0 | 450 | 1.8070 | 0.36 | 0.7720 | 4.9274 | 0.36 | 0.2914 | 0.2601 | 0.4871 |
| 1.9795 | 10.0 | 500 | 1.7838 | 0.34 | 0.7857 | 3.3860 | 0.34 | 0.2387 | 0.2902 | 0.5057 |
| 1.9795 | 11.0 | 550 | 1.7214 | 0.395 | 0.7404 | 4.1630 | 0.395 | 0.2995 | 0.2922 | 0.4210 |
| 1.9795 | 12.0 | 600 | 1.6834 | 0.445 | 0.7284 | 3.7081 | 0.445 | 0.3444 | 0.2700 | 0.3914 |
| 1.9795 | 13.0 | 650 | 1.6992 | 0.38 | 0.7641 | 4.1246 | 0.38 | 0.3045 | 0.3375 | 0.4155 |
| 1.9795 | 14.0 | 700 | 1.8695 | 0.395 | 0.7711 | 5.6899 | 0.395 | 0.3432 | 0.3224 | 0.4425 |
| 1.9795 | 15.0 | 750 | 1.8757 | 0.38 | 0.7939 | 5.1099 | 0.38 | 0.3879 | 0.3102 | 0.4313 |
| 1.9795 | 16.0 | 800 | 2.0457 | 0.405 | 0.8184 | 5.6034 | 0.405 | 0.3957 | 0.3256 | 0.4414 |
| 1.9795 | 17.0 | 850 | 2.2243 | 0.395 | 0.8653 | 7.7124 | 0.395 | 0.3567 | 0.3887 | 0.3997 |
| 1.9795 | 18.0 | 900 | 1.9309 | 0.45 | 0.7794 | 5.2698 | 0.45 | 0.3763 | 0.3626 | 0.3767 |
| 1.9795 | 19.0 | 950 | 2.2285 | 0.415 | 0.8319 | 6.7127 | 0.415 | 0.4153 | 0.3667 | 0.3942 |
| 0.6717 | 20.0 | 1000 | 2.3745 | 0.445 | 0.8643 | 7.4432 | 0.445 | 0.4290 | 0.3859 | 0.4046 |
| 0.6717 | 21.0 | 1050 | 2.5389 | 0.41 | 0.9148 | 7.6865 | 0.41 | 0.3994 | 0.4351 | 0.4054 |
| 0.6717 | 22.0 | 1100 | 2.5537 | 0.465 | 0.8500 | 8.1266 | 0.465 | 0.4623 | 0.4070 | 0.3900 |
| 0.6717 | 23.0 | 1150 | 2.8355 | 0.42 | 0.9426 | 8.8542 | 0.4200 | 0.3930 | 0.4508 | 0.4201 |
| 0.6717 | 24.0 | 1200 | 2.8575 | 0.4 | 0.9962 | 7.6428 | 0.4000 | 0.3502 | 0.4994 | 0.4119 |
| 0.6717 | 25.0 | 1250 | 2.8704 | 0.445 | 0.9418 | 9.2600 | 0.445 | 0.4570 | 0.4309 | 0.4021 |
| 0.6717 | 26.0 | 1300 | 3.4702 | 0.43 | 0.9641 | 12.1621 | 0.4300 | 0.3977 | 0.4590 | 0.3597 |
| 0.6717 | 27.0 | 1350 | 3.1484 | 0.475 | 0.9518 | 8.1474 | 0.4750 | 0.4641 | 0.4809 | 0.4088 |
| 0.6717 | 28.0 | 1400 | 3.2299 | 0.455 | 0.9673 | 9.6161 | 0.455 | 0.4205 | 0.4711 | 0.3806 |
| 0.6717 | 29.0 | 1450 | 3.4968 | 0.425 | 1.0136 | 10.5614 | 0.425 | 0.3992 | 0.4743 | 0.3773 |
| 0.0268 | 30.0 | 1500 | 3.1340 | 0.46 | 0.9443 | 8.5023 | 0.46 | 0.4296 | 0.4557 | 0.3735 |
| 0.0268 | 31.0 | 1550 | 3.4297 | 0.435 | 1.0058 | 8.2428 | 0.435 | 0.3979 | 0.4967 | 0.3848 |
| 0.0268 | 32.0 | 1600 | 3.6922 | 0.4 | 1.0488 | 10.8019 | 0.4000 | 0.3880 | 0.5223 | 0.4017 |
| 0.0268 | 33.0 | 1650 | 3.6009 | 0.445 | 0.9964 | 10.1007 | 0.445 | 0.4204 | 0.4924 | 0.3981 |
| 0.0268 | 34.0 | 1700 | 3.6678 | 0.425 | 1.0494 | 9.1369 | 0.425 | 0.3896 | 0.5159 | 0.4192 |
| 0.0268 | 35.0 | 1750 | 3.5743 | 0.45 | 0.9953 | 9.5996 | 0.45 | 0.4182 | 0.4934 | 0.4030 |
| 0.0268 | 36.0 | 1800 | 3.5551 | 0.465 | 0.9877 | 9.6080 | 0.465 | 0.4221 | 0.5033 | 0.3977 |
| 0.0268 | 37.0 | 1850 | 3.7424 | 0.435 | 1.0191 | 9.9258 | 0.435 | 0.4292 | 0.4955 | 0.4120 |
| 0.0268 | 38.0 | 1900 | 3.7093 | 0.45 | 1.0051 | 9.7038 | 0.45 | 0.4033 | 0.4966 | 0.3857 |
| 0.0268 | 39.0 | 1950 | 3.7240 | 0.45 | 1.0076 | 9.8462 | 0.45 | 0.4027 | 0.4953 | 0.3962 |
| 0.0022 | 40.0 | 2000 | 3.7503 | 0.455 | 1.0090 | 9.9967 | 0.455 | 0.4076 | 0.5056 | 0.3968 |
| 0.0022 | 41.0 | 2050 | 3.5545 | 0.44 | 1.0007 | 8.7616 | 0.44 | 0.4285 | 0.4894 | 0.4008 |
| 0.0022 | 42.0 | 2100 | 3.7452 | 0.435 | 1.0142 | 9.4376 | 0.435 | 0.4135 | 0.5032 | 0.4022 |
| 0.0022 | 43.0 | 2150 | 3.5980 | 0.47 | 0.9532 | 8.2333 | 0.47 | 0.4441 | 0.4650 | 0.4113 |
| 0.0022 | 44.0 | 2200 | 3.7055 | 0.45 | 0.9946 | 9.0121 | 0.45 | 0.4327 | 0.4817 | 0.3688 |
| 0.0022 | 45.0 | 2250 | 3.8500 | 0.435 | 1.0161 | 9.2035 | 0.435 | 0.4164 | 0.5128 | 0.3723 |
| 0.0022 | 46.0 | 2300 | 3.8806 | 0.435 | 1.0261 | 10.7033 | 0.435 | 0.4323 | 0.5008 | 0.3812 |
| 0.0022 | 47.0 | 2350 | 3.8114 | 0.455 | 1.0128 | 9.6784 | 0.455 | 0.4236 | 0.5025 | 0.3873 |
| 0.0022 | 48.0 | 2400 | 3.8743 | 0.435 | 1.0294 | 8.7193 | 0.435 | 0.3734 | 0.5109 | 0.3783 |
| 0.0022 | 49.0 | 2450 | 3.9281 | 0.43 | 1.0414 | 9.9489 | 0.4300 | 0.4296 | 0.5047 | 0.4049 |
| 0.0047 | 50.0 | 2500 | 3.7824 | 0.45 | 0.9956 | 10.7814 | 0.45 | 0.4467 | 0.4975 | 0.3949 |
| 0.0047 | 51.0 | 2550 | 4.0089 | 0.475 | 0.9668 | 11.9005 | 0.4750 | 0.4253 | 0.4637 | 0.4501 |
| 0.0047 | 52.0 | 2600 | 3.7248 | 0.43 | 0.9909 | 10.6449 | 0.4300 | 0.4064 | 0.4750 | 0.4023 |
| 0.0047 | 53.0 | 2650 | 3.7911 | 0.415 | 1.0491 | 9.1188 | 0.415 | 0.3608 | 0.5130 | 0.4173 |
| 0.0047 | 54.0 | 2700 | 3.6925 | 0.44 | 1.0000 | 8.9655 | 0.44 | 0.3970 | 0.4826 | 0.4168 |
| 0.0047 | 55.0 | 2750 | 3.6214 | 0.46 | 0.9590 | 9.5422 | 0.46 | 0.4440 | 0.4636 | 0.3829 |
| 0.0047 | 56.0 | 2800 | 4.3545 | 0.405 | 1.0811 | 10.6531 | 0.405 | 0.4090 | 0.5439 | 0.4533 |
| 0.0047 | 57.0 | 2850 | 3.6835 | 0.46 | 0.9717 | 8.2408 | 0.46 | 0.4367 | 0.4950 | 0.4118 |
| 0.0047 | 58.0 | 2900 | 4.0080 | 0.465 | 1.0011 | 9.3764 | 0.465 | 0.4579 | 0.4927 | 0.4234 |
| 0.0047 | 59.0 | 2950 | 4.0141 | 0.45 | 1.0014 | 9.7100 | 0.45 | 0.4443 | 0.4987 | 0.4220 |
| 0.0118 | 60.0 | 3000 | 3.7963 | 0.43 | 1.0135 | 9.4040 | 0.4300 | 0.4007 | 0.5007 | 0.3979 |
| 0.0118 | 61.0 | 3050 | 4.0609 | 0.43 | 1.0426 | 9.3533 | 0.4300 | 0.3825 | 0.5266 | 0.4285 |
| 0.0118 | 62.0 | 3100 | 4.0150 | 0.47 | 1.0002 | 9.3307 | 0.47 | 0.4490 | 0.5030 | 0.4052 |
| 0.0118 | 63.0 | 3150 | 3.7982 | 0.47 | 0.9660 | 8.5060 | 0.47 | 0.4581 | 0.4716 | 0.3988 |
| 0.0118 | 64.0 | 3200 | 4.3553 | 0.44 | 1.0428 | 10.3840 | 0.44 | 0.4218 | 0.5163 | 0.4312 |
| 0.0118 | 65.0 | 3250 | 3.7142 | 0.44 | 0.9900 | 8.5049 | 0.44 | 0.4298 | 0.4849 | 0.3735 |
| 0.0118 | 66.0 | 3300 | 3.7411 | 0.47 | 0.9661 | 8.1935 | 0.47 | 0.4497 | 0.4789 | 0.3812 |
| 0.0118 | 67.0 | 3350 | 3.7858 | 0.49 | 0.9574 | 8.8397 | 0.49 | 0.4799 | 0.4616 | 0.3895 |
| 0.0118 | 68.0 | 3400 | 3.7927 | 0.495 | 0.9459 | 8.6915 | 0.495 | 0.4870 | 0.4577 | 0.3883 |
| 0.0118 | 69.0 | 3450 | 3.8348 | 0.5 | 0.9454 | 8.8298 | 0.5 | 0.4889 | 0.4715 | 0.3891 |
| 0.0004 | 70.0 | 3500 | 3.8551 | 0.48 | 0.9500 | 8.9827 | 0.48 | 0.4755 | 0.4691 | 0.3913 |
| 0.0004 | 71.0 | 3550 | 3.8432 | 0.48 | 0.9622 | 9.1404 | 0.48 | 0.4691 | 0.4690 | 0.3885 |
| 0.0004 | 72.0 | 3600 | 3.8594 | 0.48 | 0.9617 | 8.8182 | 0.48 | 0.4691 | 0.4805 | 0.3854 |
| 0.0004 | 73.0 | 3650 | 3.8855 | 0.485 | 0.9622 | 8.8248 | 0.485 | 0.4760 | 0.4809 | 0.3881 |
| 0.0004 | 74.0 | 3700 | 3.8996 | 0.49 | 0.9610 | 8.9750 | 0.49 | 0.4818 | 0.4634 | 0.3892 |
| 0.0004 | 75.0 | 3750 | 3.9921 | 0.475 | 0.9642 | 9.5409 | 0.4750 | 0.4597 | 0.4666 | 0.4185 |
| 0.0004 | 76.0 | 3800 | 4.1128 | 0.43 | 1.0429 | 9.9966 | 0.4300 | 0.3844 | 0.5187 | 0.4056 |
| 0.0004 | 77.0 | 3850 | 4.0783 | 0.44 | 1.0172 | 9.3016 | 0.44 | 0.4205 | 0.5051 | 0.3988 |
| 0.0004 | 78.0 | 3900 | 4.0804 | 0.445 | 1.0254 | 8.9753 | 0.445 | 0.4246 | 0.5089 | 0.3982 |
| 0.0004 | 79.0 | 3950 | 4.0892 | 0.445 | 1.0269 | 8.8290 | 0.445 | 0.4246 | 0.5069 | 0.4000 |
| 0.0002 | 80.0 | 4000 | 4.1013 | 0.445 | 1.0258 | 9.1363 | 0.445 | 0.4246 | 0.5129 | 0.4033 |
| 0.0002 | 81.0 | 4050 | 4.0985 | 0.44 | 1.0287 | 9.1459 | 0.44 | 0.4213 | 0.5074 | 0.4054 |
| 0.0002 | 82.0 | 4100 | 4.1029 | 0.44 | 1.0263 | 9.3107 | 0.44 | 0.4211 | 0.5125 | 0.4066 |
| 0.0002 | 83.0 | 4150 | 4.1075 | 0.44 | 1.0248 | 9.4604 | 0.44 | 0.4224 | 0.5164 | 0.4061 |
| 0.0002 | 84.0 | 4200 | 4.1087 | 0.44 | 1.0225 | 9.7739 | 0.44 | 0.4221 | 0.5090 | 0.4055 |
| 0.0002 | 85.0 | 4250 | 4.1248 | 0.44 | 1.0262 | 9.7747 | 0.44 | 0.4259 | 0.5032 | 0.4065 |
| 0.0002 | 86.0 | 4300 | 4.1527 | 0.445 | 1.0263 | 9.4647 | 0.445 | 0.4299 | 0.5128 | 0.4066 |
| 0.0002 | 87.0 | 4350 | 4.0529 | 0.475 | 0.9810 | 9.1439 | 0.4750 | 0.4488 | 0.4910 | 0.3938 |
| 0.0002 | 88.0 | 4400 | 4.1405 | 0.455 | 1.0091 | 9.5149 | 0.455 | 0.4230 | 0.4966 | 0.4147 |
| 0.0002 | 89.0 | 4450 | 4.3483 | 0.41 | 1.0724 | 9.8421 | 0.41 | 0.4083 | 0.5384 | 0.4090 |
| 0.0008 | 90.0 | 4500 | 4.5574 | 0.39 | 1.1077 | 11.2517 | 0.39 | 0.3940 | 0.5618 | 0.4405 |
| 0.0008 | 91.0 | 4550 | 4.5104 | 0.41 | 1.0890 | 10.8687 | 0.41 | 0.4173 | 0.5411 | 0.4350 |
| 0.0008 | 92.0 | 4600 | 4.3791 | 0.425 | 1.0672 | 10.7198 | 0.425 | 0.4202 | 0.5233 | 0.4306 |
| 0.0008 | 93.0 | 4650 | 4.3608 | 0.43 | 1.0553 | 10.8428 | 0.4300 | 0.4236 | 0.5196 | 0.4284 |
| 0.0008 | 94.0 | 4700 | 4.3469 | 0.44 | 1.0474 | 10.6774 | 0.44 | 0.4428 | 0.5020 | 0.4280 |
| 0.0008 | 95.0 | 4750 | 4.3420 | 0.44 | 1.0487 | 10.5138 | 0.44 | 0.4385 | 0.5260 | 0.4270 |
| 0.0008 | 96.0 | 4800 | 4.3385 | 0.435 | 1.0491 | 10.3448 | 0.435 | 0.4312 | 0.5170 | 0.4266 |
| 0.0008 | 97.0 | 4850 | 4.3341 | 0.435 | 1.0485 | 10.3378 | 0.435 | 0.4312 | 0.5136 | 0.4261 |
| 0.0008 | 98.0 | 4900 | 4.3336 | 0.435 | 1.0480 | 10.3350 | 0.435 | 0.4312 | 0.5184 | 0.4253 |
| 0.0008 | 99.0 | 4950 | 4.3306 | 0.435 | 1.0472 | 10.3328 | 0.435 | 0.4299 | 0.5116 | 0.4245 |
| 0.0001 | 100.0 | 5000 | 4.3305 | 0.435 | 1.0472 | 10.3327 | 0.435 | 0.4299 | 0.5115 | 0.4245 |
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.12.0
- Tokenizers 0.12.1
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/dit-small_rvl_cdip_100_examples_per_class_kd_MSE_lr_fix
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-small_rvl_cdip_100_examples_per_class_kd_MSE_lr_fix
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8796
- Accuracy: 0.26
- Brier Loss: 0.8768
- Nll: 6.0962
- F1 Micro: 0.26
- F1 Macro: 0.2480
- Ece: 0.2002
- Aurc: 0.5815
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 1.5365 | 0.065 | 0.9398 | 10.2864 | 0.065 | 0.0116 | 0.1183 | 0.9536 |
| No log | 2.0 | 14 | 1.5332 | 0.06 | 0.9374 | 9.8468 | 0.06 | 0.0269 | 0.1067 | 0.9096 |
| No log | 3.0 | 21 | 1.5119 | 0.085 | 0.9352 | 9.1495 | 0.085 | 0.0355 | 0.1135 | 0.8759 |
| No log | 4.0 | 28 | 1.5040 | 0.0825 | 0.9333 | 8.6549 | 0.0825 | 0.0439 | 0.1181 | 0.8618 |
| No log | 5.0 | 35 | 1.5021 | 0.1 | 0.9301 | 8.9643 | 0.1000 | 0.0558 | 0.1318 | 0.8030 |
| No log | 6.0 | 42 | 1.4885 | 0.1 | 0.9276 | 7.8684 | 0.1000 | 0.0505 | 0.1205 | 0.8190 |
| No log | 7.0 | 49 | 1.4882 | 0.0975 | 0.9254 | 9.4095 | 0.0975 | 0.0584 | 0.1220 | 0.7847 |
| No log | 8.0 | 56 | 1.4909 | 0.1275 | 0.9227 | 9.4274 | 0.1275 | 0.0827 | 0.1335 | 0.7445 |
| No log | 9.0 | 63 | 1.4837 | 0.115 | 0.9217 | 10.2918 | 0.115 | 0.0546 | 0.1366 | 0.7932 |
| No log | 10.0 | 70 | 1.4857 | 0.1125 | 0.9186 | 9.5039 | 0.1125 | 0.0510 | 0.1277 | 0.7749 |
| No log | 11.0 | 77 | 1.4804 | 0.1125 | 0.9183 | 8.5178 | 0.1125 | 0.0515 | 0.1315 | 0.7831 |
| No log | 12.0 | 84 | 1.4701 | 0.11 | 0.9177 | 8.2398 | 0.11 | 0.0655 | 0.1310 | 0.7754 |
| No log | 13.0 | 91 | 1.4721 | 0.16 | 0.9160 | 7.2379 | 0.16 | 0.1155 | 0.1462 | 0.7370 |
| No log | 14.0 | 98 | 1.4717 | 0.11 | 0.9159 | 8.1355 | 0.11 | 0.0633 | 0.1221 | 0.7579 |
| No log | 15.0 | 105 | 1.4739 | 0.1325 | 0.9138 | 7.4037 | 0.1325 | 0.0790 | 0.1419 | 0.7358 |
| No log | 16.0 | 112 | 1.4657 | 0.1425 | 0.9135 | 7.8063 | 0.1425 | 0.0821 | 0.1285 | 0.7269 |
| No log | 17.0 | 119 | 1.4632 | 0.1375 | 0.9112 | 7.8852 | 0.1375 | 0.0948 | 0.1389 | 0.7342 |
| No log | 18.0 | 126 | 1.4769 | 0.15 | 0.9081 | 8.5375 | 0.15 | 0.0894 | 0.1399 | 0.7113 |
| No log | 19.0 | 133 | 1.4547 | 0.1775 | 0.9045 | 6.4114 | 0.1775 | 0.1174 | 0.1507 | 0.7007 |
| No log | 20.0 | 140 | 1.4470 | 0.1725 | 0.9031 | 8.1696 | 0.1725 | 0.1246 | 0.1464 | 0.7079 |
| No log | 21.0 | 147 | 1.4615 | 0.19 | 0.9021 | 6.0696 | 0.19 | 0.1390 | 0.1646 | 0.7023 |
| No log | 22.0 | 154 | 1.4588 | 0.2 | 0.8996 | 6.0038 | 0.2000 | 0.1384 | 0.1628 | 0.6821 |
| No log | 23.0 | 161 | 1.4646 | 0.1525 | 0.8988 | 7.0678 | 0.1525 | 0.1075 | 0.1458 | 0.7000 |
| No log | 24.0 | 168 | 1.4491 | 0.2125 | 0.8933 | 5.9276 | 0.2125 | 0.1503 | 0.1533 | 0.6457 |
| No log | 25.0 | 175 | 1.4526 | 0.205 | 0.8916 | 7.6108 | 0.205 | 0.1479 | 0.1603 | 0.6676 |
| No log | 26.0 | 182 | 1.4510 | 0.17 | 0.8910 | 5.6337 | 0.17 | 0.1333 | 0.1396 | 0.6868 |
| No log | 27.0 | 189 | 1.4567 | 0.19 | 0.8850 | 5.2038 | 0.19 | 0.1380 | 0.1637 | 0.6547 |
| No log | 28.0 | 196 | 1.4570 | 0.2225 | 0.8846 | 6.5368 | 0.2225 | 0.1840 | 0.1701 | 0.6554 |
| No log | 29.0 | 203 | 1.4701 | 0.2075 | 0.8820 | 5.0057 | 0.2075 | 0.1663 | 0.1719 | 0.6598 |
| No log | 30.0 | 210 | 1.4693 | 0.2225 | 0.8755 | 7.4456 | 0.2225 | 0.1729 | 0.1626 | 0.6355 |
| No log | 31.0 | 217 | 1.4670 | 0.23 | 0.8787 | 5.8938 | 0.23 | 0.1904 | 0.1717 | 0.6424 |
| No log | 32.0 | 224 | 1.4540 | 0.2275 | 0.8756 | 6.6513 | 0.2275 | 0.1673 | 0.1676 | 0.6306 |
| No log | 33.0 | 231 | 1.4641 | 0.2275 | 0.8649 | 5.5689 | 0.2275 | 0.1751 | 0.1746 | 0.6138 |
| No log | 34.0 | 238 | 1.4710 | 0.2425 | 0.8640 | 7.0556 | 0.2425 | 0.1957 | 0.1809 | 0.6048 |
| No log | 35.0 | 245 | 1.4685 | 0.23 | 0.8632 | 5.5735 | 0.23 | 0.1940 | 0.1609 | 0.6188 |
| No log | 36.0 | 252 | 1.4665 | 0.2375 | 0.8592 | 5.8835 | 0.2375 | 0.1952 | 0.1727 | 0.6050 |
| No log | 37.0 | 259 | 1.4668 | 0.235 | 0.8540 | 5.3502 | 0.235 | 0.1966 | 0.1746 | 0.6056 |
| No log | 38.0 | 266 | 1.4855 | 0.27 | 0.8510 | 5.3781 | 0.27 | 0.2124 | 0.1692 | 0.5825 |
| No log | 39.0 | 273 | 1.5279 | 0.265 | 0.8562 | 6.2426 | 0.265 | 0.2126 | 0.1772 | 0.5831 |
| No log | 40.0 | 280 | 1.5433 | 0.2425 | 0.8551 | 5.9574 | 0.2425 | 0.1867 | 0.1499 | 0.5874 |
| No log | 41.0 | 287 | 1.5955 | 0.2525 | 0.8597 | 6.1628 | 0.2525 | 0.2024 | 0.1479 | 0.5891 |
| No log | 42.0 | 294 | 1.5528 | 0.2475 | 0.8541 | 6.3624 | 0.2475 | 0.1908 | 0.1566 | 0.5735 |
| No log | 43.0 | 301 | 1.5858 | 0.2675 | 0.8504 | 6.1261 | 0.2675 | 0.2174 | 0.1706 | 0.5674 |
| No log | 44.0 | 308 | 1.6013 | 0.2725 | 0.8496 | 5.8409 | 0.2725 | 0.2463 | 0.1846 | 0.5807 |
| No log | 45.0 | 315 | 1.5632 | 0.2625 | 0.8472 | 5.9669 | 0.2625 | 0.2307 | 0.1689 | 0.5689 |
| No log | 46.0 | 322 | 1.6520 | 0.2675 | 0.8509 | 5.8544 | 0.2675 | 0.2325 | 0.1779 | 0.5622 |
| No log | 47.0 | 329 | 1.6135 | 0.2625 | 0.8476 | 5.5208 | 0.2625 | 0.2504 | 0.1565 | 0.5759 |
| No log | 48.0 | 336 | 1.6565 | 0.275 | 0.8466 | 5.9254 | 0.275 | 0.2527 | 0.2026 | 0.5616 |
| No log | 49.0 | 343 | 1.6807 | 0.2625 | 0.8531 | 6.1297 | 0.2625 | 0.2259 | 0.1813 | 0.5664 |
| No log | 50.0 | 350 | 1.7266 | 0.255 | 0.8560 | 6.0828 | 0.255 | 0.2315 | 0.1817 | 0.5735 |
| No log | 51.0 | 357 | 1.7038 | 0.2525 | 0.8579 | 5.6442 | 0.2525 | 0.2405 | 0.1861 | 0.5828 |
| No log | 52.0 | 364 | 1.7954 | 0.255 | 0.8583 | 5.7016 | 0.255 | 0.2227 | 0.1722 | 0.5725 |
| No log | 53.0 | 371 | 1.7567 | 0.275 | 0.8557 | 6.1586 | 0.275 | 0.2523 | 0.1577 | 0.5619 |
| No log | 54.0 | 378 | 1.7589 | 0.2525 | 0.8565 | 5.3969 | 0.2525 | 0.2325 | 0.1840 | 0.5661 |
| No log | 55.0 | 385 | 1.7778 | 0.265 | 0.8569 | 5.8559 | 0.265 | 0.2447 | 0.1835 | 0.5640 |
| No log | 56.0 | 392 | 1.8044 | 0.275 | 0.8592 | 5.9942 | 0.275 | 0.2517 | 0.1783 | 0.5627 |
| No log | 57.0 | 399 | 1.8327 | 0.2625 | 0.8628 | 6.0224 | 0.2625 | 0.2333 | 0.1801 | 0.5560 |
| No log | 58.0 | 406 | 1.8184 | 0.25 | 0.8609 | 6.0769 | 0.25 | 0.2333 | 0.1941 | 0.5718 |
| No log | 59.0 | 413 | 1.8318 | 0.2575 | 0.8639 | 5.9454 | 0.2575 | 0.2364 | 0.1965 | 0.5743 |
| No log | 60.0 | 420 | 1.8081 | 0.2525 | 0.8641 | 6.0119 | 0.2525 | 0.2380 | 0.1818 | 0.5755 |
| No log | 61.0 | 427 | 1.8405 | 0.2625 | 0.8775 | 6.2129 | 0.2625 | 0.2474 | 0.1767 | 0.5908 |
| No log | 62.0 | 434 | 1.9012 | 0.2625 | 0.8728 | 6.1015 | 0.2625 | 0.2373 | 0.1881 | 0.5716 |
| No log | 63.0 | 441 | 1.8500 | 0.26 | 0.8728 | 6.3885 | 0.26 | 0.2414 | 0.1933 | 0.5809 |
| No log | 64.0 | 448 | 1.8771 | 0.2675 | 0.8733 | 6.2730 | 0.2675 | 0.2553 | 0.2035 | 0.5800 |
| No log | 65.0 | 455 | 1.8744 | 0.2575 | 0.8677 | 5.9805 | 0.2575 | 0.2392 | 0.1918 | 0.5663 |
| No log | 66.0 | 462 | 1.8366 | 0.255 | 0.8694 | 6.0073 | 0.255 | 0.2403 | 0.2048 | 0.5807 |
| No log | 67.0 | 469 | 1.8758 | 0.2575 | 0.8743 | 6.1015 | 0.2575 | 0.2381 | 0.2071 | 0.5825 |
| No log | 68.0 | 476 | 1.8796 | 0.2675 | 0.8711 | 5.9457 | 0.2675 | 0.2470 | 0.2100 | 0.5737 |
| No log | 69.0 | 483 | 1.8635 | 0.2675 | 0.8721 | 5.9312 | 0.2675 | 0.2493 | 0.1788 | 0.5751 |
| No log | 70.0 | 490 | 1.8801 | 0.2625 | 0.8710 | 5.9629 | 0.2625 | 0.2467 | 0.1974 | 0.5721 |
| No log | 71.0 | 497 | 1.8936 | 0.26 | 0.8791 | 6.0358 | 0.26 | 0.2481 | 0.1922 | 0.5844 |
| 0.9216 | 72.0 | 504 | 1.8736 | 0.275 | 0.8715 | 6.0493 | 0.275 | 0.2569 | 0.2099 | 0.5710 |
| 0.9216 | 73.0 | 511 | 1.8784 | 0.2525 | 0.8760 | 6.1441 | 0.2525 | 0.2401 | 0.1978 | 0.5849 |
| 0.9216 | 74.0 | 518 | 1.8843 | 0.2725 | 0.8763 | 6.1948 | 0.2725 | 0.2533 | 0.2007 | 0.5801 |
| 0.9216 | 75.0 | 525 | 1.8785 | 0.2675 | 0.8784 | 5.9868 | 0.2675 | 0.2578 | 0.1975 | 0.5851 |
| 0.9216 | 76.0 | 532 | 1.8812 | 0.275 | 0.8725 | 5.9367 | 0.275 | 0.2594 | 0.2037 | 0.5744 |
| 0.9216 | 77.0 | 539 | 1.8956 | 0.27 | 0.8746 | 5.9038 | 0.27 | 0.2541 | 0.1816 | 0.5738 |
| 0.9216 | 78.0 | 546 | 1.8897 | 0.265 | 0.8802 | 5.9763 | 0.265 | 0.2493 | 0.2098 | 0.5866 |
| 0.9216 | 79.0 | 553 | 1.8728 | 0.275 | 0.8752 | 6.0806 | 0.275 | 0.2623 | 0.1874 | 0.5794 |
| 0.9216 | 80.0 | 560 | 1.8887 | 0.2725 | 0.8759 | 6.2762 | 0.2725 | 0.2520 | 0.2005 | 0.5768 |
| 0.9216 | 81.0 | 567 | 1.8987 | 0.2725 | 0.8787 | 6.2444 | 0.2725 | 0.2587 | 0.2183 | 0.5773 |
| 0.9216 | 82.0 | 574 | 1.8759 | 0.2625 | 0.8773 | 6.1643 | 0.2625 | 0.2541 | 0.1922 | 0.5805 |
| 0.9216 | 83.0 | 581 | 1.8766 | 0.27 | 0.8748 | 6.0036 | 0.27 | 0.2554 | 0.1784 | 0.5762 |
| 0.9216 | 84.0 | 588 | 1.8809 | 0.2625 | 0.8764 | 6.0488 | 0.2625 | 0.2469 | 0.2030 | 0.5833 |
| 0.9216 | 85.0 | 595 | 1.8982 | 0.26 | 0.8775 | 6.0747 | 0.26 | 0.2453 | 0.1998 | 0.5851 |
| 0.9216 | 86.0 | 602 | 1.8912 | 0.27 | 0.8798 | 6.1894 | 0.27 | 0.2566 | 0.1938 | 0.5839 |
| 0.9216 | 87.0 | 609 | 1.8847 | 0.2775 | 0.8769 | 6.2744 | 0.2775 | 0.2643 | 0.2019 | 0.5775 |
| 0.9216 | 88.0 | 616 | 1.8734 | 0.265 | 0.8741 | 6.1928 | 0.265 | 0.2526 | 0.1763 | 0.5820 |
| 0.9216 | 89.0 | 623 | 1.8760 | 0.2725 | 0.8768 | 6.0274 | 0.2725 | 0.2620 | 0.2039 | 0.5792 |
| 0.9216 | 90.0 | 630 | 1.8860 | 0.265 | 0.8771 | 6.0912 | 0.265 | 0.2518 | 0.1924 | 0.5810 |
| 0.9216 | 91.0 | 637 | 1.8865 | 0.2625 | 0.8750 | 6.2350 | 0.2625 | 0.2476 | 0.1844 | 0.5791 |
| 0.9216 | 92.0 | 644 | 1.8815 | 0.2725 | 0.8733 | 6.0962 | 0.2725 | 0.2563 | 0.2013 | 0.5721 |
| 0.9216 | 93.0 | 651 | 1.8794 | 0.27 | 0.8756 | 6.2535 | 0.27 | 0.2562 | 0.2028 | 0.5764 |
| 0.9216 | 94.0 | 658 | 1.8835 | 0.2675 | 0.8769 | 6.2039 | 0.2675 | 0.2562 | 0.1928 | 0.5773 |
| 0.9216 | 95.0 | 665 | 1.8904 | 0.27 | 0.8786 | 6.1504 | 0.27 | 0.2543 | 0.2034 | 0.5768 |
| 0.9216 | 96.0 | 672 | 1.8911 | 0.26 | 0.8788 | 6.1527 | 0.26 | 0.2465 | 0.2025 | 0.5829 |
| 0.9216 | 97.0 | 679 | 1.8871 | 0.265 | 0.8776 | 6.0994 | 0.265 | 0.2519 | 0.2126 | 0.5794 |
| 0.9216 | 98.0 | 686 | 1.8825 | 0.265 | 0.8769 | 6.1564 | 0.265 | 0.2516 | 0.1987 | 0.5776 |
| 0.9216 | 99.0 | 693 | 1.8803 | 0.2675 | 0.8766 | 6.1183 | 0.2675 | 0.2561 | 0.2095 | 0.5798 |
| 0.9216 | 100.0 | 700 | 1.8796 | 0.26 | 0.8768 | 6.0962 | 0.26 | 0.2480 | 0.2002 | 0.5815 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-tiny_tobacco3482_kd_MSE_test_pretrain_student
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_tobacco3482_kd_MSE_test_pretrain_student
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
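The optimizer line above maps directly onto PyTorch's Adam; a minimal sketch of those settings (the `model` here is only a placeholder, not the actual ViT):
```python
# Illustration of the optimizer settings listed above; `model` is a stand-in module.
import torch

model = torch.nn.Linear(10, 10)  # placeholder for the ViT classifier being fine-tuned
optimizer = torch.optim.Adam(
    model.parameters(),
    lr=1e-4,             # learning_rate: 0.0001
    betas=(0.9, 0.999),  # betas=(0.9,0.999)
    eps=1e-8,            # epsilon=1e-08
)
```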
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 0.6243 | 0.595 | 0.6456 | 1.9017 | 0.595 | 0.5113 | 0.3512 | 0.2202 |
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.12.0
- Tokenizers 0.12.1
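For reference, a minimal inference sketch with the Transformers image-classification pipeline, assuming this checkpoint is available on the Hub under the repo id listed for this card (`jordyvl/vit-tiny_tobacco3482_kd_MSE_test_pretrain_student`); the image path and example output are placeholders:
```python
# Hedged usage sketch; the repo id is taken from this card's listing and may not be public.
from transformers import pipeline
from PIL import Image

classifier = pipeline(
    "image-classification",
    model="jordyvl/vit-tiny_tobacco3482_kd_MSE_test_pretrain_student",
)
image = Image.open("some_document_page.png").convert("RGB")
print(classifier(image, top_k=3))  # e.g. [{'label': 'memo', 'score': ...}, ...]
```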
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-_tobacco3482_kd_MSE_test_pretrain_student
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-_tobacco3482_kd_MSE_test_pretrain_student
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
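These settings correspond one-to-one with Hugging Face `TrainingArguments`; a hedged sketch of the equivalent configuration (the output directory is just the card name, and dataset/model setup is omitted):
```python
# Sketch only: maps the hyperparameters listed above onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-_tobacco3482_kd_MSE_test_pretrain_student",
    learning_rate=1e-4,              # learning_rate: 0.0001
    per_device_train_batch_size=32,  # train_batch_size: 32
    per_device_eval_batch_size=32,   # eval_batch_size: 32
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=1,
)
```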
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 0.8077 | 0.4 | 0.7439 | 5.4442 | 0.4000 | 0.2755 | 0.2844 | 0.3738 |
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.12.0
- Tokenizers 0.12.1
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-tiny_tobacco3482_kd_MSE
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_tobacco3482_kd_MSE
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2482
- Accuracy: 0.82
- Brier Loss: 0.3226
- Nll: 0.8343
- F1 Micro: 0.82
- F1 Macro: 0.8090
- Ece: 0.2625
- Aurc: 0.0606
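Brier Loss and Ece above are calibration metrics computed from the softmax probabilities; a hedged sketch of one standard way to compute them (the bin count and exact definitions used for this card are assumptions):
```python
# Illustrative calibration metrics; the card's evaluation code may differ (e.g. in bin count).
import numpy as np

def brier_score(probs: np.ndarray, labels: np.ndarray) -> float:
    """Mean squared error between the probability vector and the one-hot label."""
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs: np.ndarray, labels: np.ndarray, n_bins: int = 10) -> float:
    """Gap between confidence and accuracy, averaged over equal-width confidence bins."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return float(ece)
```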
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 2.6314 | 0.165 | 1.0293 | 8.4704 | 0.165 | 0.0552 | 0.3856 | 0.8464 |
| No log | 2.0 | 14 | 1.4800 | 0.12 | 0.9052 | 7.3409 | 0.12 | 0.0997 | 0.2469 | 0.8374 |
| No log | 3.0 | 21 | 1.0627 | 0.34 | 0.8400 | 5.5369 | 0.34 | 0.2006 | 0.3075 | 0.5758 |
| No log | 4.0 | 28 | 0.8429 | 0.42 | 0.7462 | 3.2442 | 0.4200 | 0.3362 | 0.3072 | 0.3692 |
| No log | 5.0 | 35 | 0.7060 | 0.535 | 0.6558 | 2.7042 | 0.535 | 0.4295 | 0.2993 | 0.2677 |
| No log | 6.0 | 42 | 0.5950 | 0.635 | 0.6056 | 2.0779 | 0.635 | 0.5021 | 0.3375 | 0.1948 |
| No log | 7.0 | 49 | 0.4865 | 0.67 | 0.5486 | 1.4919 | 0.67 | 0.5384 | 0.3633 | 0.1737 |
| No log | 8.0 | 56 | 0.4572 | 0.69 | 0.4897 | 1.4359 | 0.69 | 0.6106 | 0.2889 | 0.1355 |
| No log | 9.0 | 63 | 0.3932 | 0.72 | 0.4496 | 1.0748 | 0.72 | 0.6261 | 0.2914 | 0.1092 |
| No log | 10.0 | 70 | 0.3584 | 0.76 | 0.4091 | 1.1341 | 0.76 | 0.6747 | 0.2946 | 0.0937 |
| No log | 11.0 | 77 | 0.3516 | 0.785 | 0.3906 | 1.0586 | 0.785 | 0.7422 | 0.3026 | 0.0762 |
| No log | 12.0 | 84 | 0.3905 | 0.74 | 0.4155 | 1.1502 | 0.74 | 0.6677 | 0.2827 | 0.1211 |
| No log | 13.0 | 91 | 0.3346 | 0.775 | 0.3640 | 1.0888 | 0.775 | 0.7397 | 0.2743 | 0.0771 |
| No log | 14.0 | 98 | 0.3700 | 0.81 | 0.3728 | 0.9575 | 0.81 | 0.7941 | 0.3125 | 0.0709 |
| No log | 15.0 | 105 | 0.3346 | 0.8 | 0.3631 | 0.9636 | 0.8000 | 0.7731 | 0.3113 | 0.0662 |
| No log | 16.0 | 112 | 0.3084 | 0.785 | 0.3606 | 1.0243 | 0.785 | 0.7610 | 0.2883 | 0.0911 |
| No log | 17.0 | 119 | 0.3266 | 0.785 | 0.3375 | 1.0301 | 0.785 | 0.7603 | 0.2659 | 0.0687 |
| No log | 18.0 | 126 | 0.2987 | 0.805 | 0.3371 | 0.7961 | 0.805 | 0.7895 | 0.2764 | 0.0639 |
| No log | 19.0 | 133 | 0.3468 | 0.815 | 0.3428 | 0.9841 | 0.815 | 0.7884 | 0.2824 | 0.0641 |
| No log | 20.0 | 140 | 0.3111 | 0.81 | 0.3409 | 0.8873 | 0.81 | 0.8010 | 0.2978 | 0.0579 |
| No log | 21.0 | 147 | 0.3042 | 0.8 | 0.3392 | 0.8057 | 0.8000 | 0.7802 | 0.2611 | 0.0556 |
| No log | 22.0 | 154 | 0.2936 | 0.84 | 0.3359 | 0.8659 | 0.8400 | 0.8210 | 0.2920 | 0.0636 |
| No log | 23.0 | 161 | 0.3284 | 0.815 | 0.3457 | 0.9494 | 0.815 | 0.8007 | 0.2793 | 0.0615 |
| No log | 24.0 | 168 | 0.3184 | 0.805 | 0.3474 | 0.9087 | 0.805 | 0.7851 | 0.2793 | 0.0605 |
| No log | 25.0 | 175 | 0.3081 | 0.805 | 0.3357 | 1.0425 | 0.805 | 0.7757 | 0.2781 | 0.0646 |
| No log | 26.0 | 182 | 0.2743 | 0.815 | 0.3425 | 0.7898 | 0.815 | 0.8028 | 0.2814 | 0.0689 |
| No log | 27.0 | 189 | 0.2873 | 0.81 | 0.3254 | 0.8795 | 0.81 | 0.7904 | 0.2755 | 0.0637 |
| No log | 28.0 | 196 | 0.2650 | 0.82 | 0.3236 | 0.7462 | 0.82 | 0.7986 | 0.2714 | 0.0616 |
| No log | 29.0 | 203 | 0.2756 | 0.815 | 0.3310 | 0.8227 | 0.815 | 0.7973 | 0.2766 | 0.0664 |
| No log | 30.0 | 210 | 0.2711 | 0.83 | 0.3337 | 0.8340 | 0.83 | 0.8186 | 0.2992 | 0.0688 |
| No log | 31.0 | 217 | 0.2779 | 0.825 | 0.3234 | 0.8173 | 0.825 | 0.8100 | 0.2579 | 0.0589 |
| No log | 32.0 | 224 | 0.2679 | 0.82 | 0.3216 | 0.7441 | 0.82 | 0.8024 | 0.2940 | 0.0605 |
| No log | 33.0 | 231 | 0.2633 | 0.805 | 0.3277 | 0.8046 | 0.805 | 0.7871 | 0.2710 | 0.0644 |
| No log | 34.0 | 238 | 0.2705 | 0.805 | 0.3333 | 0.8661 | 0.805 | 0.7890 | 0.2626 | 0.0632 |
| No log | 35.0 | 245 | 0.2624 | 0.815 | 0.3295 | 0.8568 | 0.815 | 0.7900 | 0.2865 | 0.0651 |
| No log | 36.0 | 252 | 0.2654 | 0.805 | 0.3262 | 0.8075 | 0.805 | 0.7793 | 0.2726 | 0.0662 |
| No log | 37.0 | 259 | 0.2697 | 0.805 | 0.3293 | 0.8143 | 0.805 | 0.7857 | 0.2587 | 0.0623 |
| No log | 38.0 | 266 | 0.2548 | 0.805 | 0.3267 | 0.8028 | 0.805 | 0.7847 | 0.2606 | 0.0660 |
| No log | 39.0 | 273 | 0.2740 | 0.83 | 0.3218 | 0.8270 | 0.83 | 0.8172 | 0.2697 | 0.0566 |
| No log | 40.0 | 280 | 0.2572 | 0.81 | 0.3302 | 0.8573 | 0.81 | 0.7892 | 0.2722 | 0.0663 |
| No log | 41.0 | 287 | 0.2528 | 0.81 | 0.3300 | 0.8454 | 0.81 | 0.7980 | 0.2555 | 0.0673 |
| No log | 42.0 | 294 | 0.2590 | 0.815 | 0.3271 | 0.8393 | 0.815 | 0.8002 | 0.2554 | 0.0604 |
| No log | 43.0 | 301 | 0.2654 | 0.825 | 0.3273 | 0.8100 | 0.825 | 0.8155 | 0.2687 | 0.0595 |
| No log | 44.0 | 308 | 0.2506 | 0.805 | 0.3290 | 0.8551 | 0.805 | 0.7858 | 0.2425 | 0.0716 |
| No log | 45.0 | 315 | 0.2615 | 0.82 | 0.3255 | 0.8624 | 0.82 | 0.8007 | 0.2773 | 0.0586 |
| No log | 46.0 | 322 | 0.2487 | 0.815 | 0.3240 | 0.8416 | 0.815 | 0.7987 | 0.2757 | 0.0611 |
| No log | 47.0 | 329 | 0.2674 | 0.845 | 0.3207 | 0.8407 | 0.845 | 0.8227 | 0.2963 | 0.0553 |
| No log | 48.0 | 336 | 0.2522 | 0.805 | 0.3312 | 0.9376 | 0.805 | 0.7888 | 0.2714 | 0.0662 |
| No log | 49.0 | 343 | 0.2547 | 0.81 | 0.3280 | 0.7847 | 0.81 | 0.7870 | 0.2696 | 0.0688 |
| No log | 50.0 | 350 | 0.2523 | 0.81 | 0.3213 | 0.7968 | 0.81 | 0.7937 | 0.2599 | 0.0654 |
| No log | 51.0 | 357 | 0.2526 | 0.815 | 0.3291 | 0.8022 | 0.815 | 0.7994 | 0.2888 | 0.0669 |
| No log | 52.0 | 364 | 0.2568 | 0.835 | 0.3180 | 0.8317 | 0.835 | 0.8140 | 0.2750 | 0.0578 |
| No log | 53.0 | 371 | 0.2496 | 0.82 | 0.3267 | 0.8442 | 0.82 | 0.8068 | 0.2825 | 0.0602 |
| No log | 54.0 | 378 | 0.2602 | 0.82 | 0.3229 | 0.7963 | 0.82 | 0.8061 | 0.2714 | 0.0585 |
| No log | 55.0 | 385 | 0.2477 | 0.81 | 0.3237 | 0.8278 | 0.81 | 0.7937 | 0.2511 | 0.0631 |
| No log | 56.0 | 392 | 0.2508 | 0.83 | 0.3210 | 0.8302 | 0.83 | 0.8102 | 0.2706 | 0.0588 |
| No log | 57.0 | 399 | 0.2454 | 0.815 | 0.3240 | 0.8377 | 0.815 | 0.7948 | 0.2607 | 0.0643 |
| No log | 58.0 | 406 | 0.2488 | 0.815 | 0.3229 | 0.8308 | 0.815 | 0.7942 | 0.2569 | 0.0604 |
| No log | 59.0 | 413 | 0.2510 | 0.82 | 0.3223 | 0.8314 | 0.82 | 0.8093 | 0.2779 | 0.0603 |
| No log | 60.0 | 420 | 0.2499 | 0.82 | 0.3235 | 0.8401 | 0.82 | 0.8031 | 0.2578 | 0.0618 |
| No log | 61.0 | 427 | 0.2478 | 0.81 | 0.3227 | 0.8315 | 0.81 | 0.7933 | 0.2645 | 0.0615 |
| No log | 62.0 | 434 | 0.2460 | 0.82 | 0.3231 | 0.8364 | 0.82 | 0.8028 | 0.2795 | 0.0643 |
| No log | 63.0 | 441 | 0.2489 | 0.825 | 0.3224 | 0.8337 | 0.825 | 0.8156 | 0.2759 | 0.0604 |
| No log | 64.0 | 448 | 0.2482 | 0.825 | 0.3230 | 0.8320 | 0.825 | 0.8138 | 0.2753 | 0.0600 |
| No log | 65.0 | 455 | 0.2462 | 0.815 | 0.3231 | 0.8354 | 0.815 | 0.8008 | 0.2551 | 0.0625 |
| No log | 66.0 | 462 | 0.2470 | 0.815 | 0.3219 | 0.8338 | 0.815 | 0.8018 | 0.2729 | 0.0611 |
| No log | 67.0 | 469 | 0.2457 | 0.81 | 0.3231 | 0.8336 | 0.81 | 0.7930 | 0.2587 | 0.0638 |
| No log | 68.0 | 476 | 0.2472 | 0.815 | 0.3225 | 0.8334 | 0.815 | 0.8008 | 0.2706 | 0.0619 |
| No log | 69.0 | 483 | 0.2473 | 0.825 | 0.3223 | 0.8357 | 0.825 | 0.8165 | 0.2668 | 0.0611 |
| No log | 70.0 | 490 | 0.2481 | 0.81 | 0.3223 | 0.8343 | 0.81 | 0.7930 | 0.2685 | 0.0624 |
| No log | 71.0 | 497 | 0.2472 | 0.825 | 0.3227 | 0.8338 | 0.825 | 0.8117 | 0.2839 | 0.0601 |
| 0.232 | 72.0 | 504 | 0.2472 | 0.815 | 0.3220 | 0.8345 | 0.815 | 0.8018 | 0.2617 | 0.0615 |
| 0.232 | 73.0 | 511 | 0.2486 | 0.82 | 0.3218 | 0.8321 | 0.82 | 0.8086 | 0.2768 | 0.0610 |
| 0.232 | 74.0 | 518 | 0.2468 | 0.815 | 0.3219 | 0.8338 | 0.815 | 0.8008 | 0.2717 | 0.0621 |
| 0.232 | 75.0 | 525 | 0.2470 | 0.82 | 0.3223 | 0.8325 | 0.82 | 0.8090 | 0.2625 | 0.0606 |
| 0.232 | 76.0 | 532 | 0.2474 | 0.825 | 0.3223 | 0.8322 | 0.825 | 0.8165 | 0.2723 | 0.0602 |
| 0.232 | 77.0 | 539 | 0.2476 | 0.805 | 0.3227 | 0.8345 | 0.805 | 0.7859 | 0.2589 | 0.0629 |
| 0.232 | 78.0 | 546 | 0.2479 | 0.82 | 0.3228 | 0.8336 | 0.82 | 0.8090 | 0.2674 | 0.0608 |
| 0.232 | 79.0 | 553 | 0.2478 | 0.82 | 0.3225 | 0.8349 | 0.82 | 0.8090 | 0.2624 | 0.0604 |
| 0.232 | 80.0 | 560 | 0.2477 | 0.81 | 0.3227 | 0.8337 | 0.81 | 0.7938 | 0.2577 | 0.0621 |
| 0.232 | 81.0 | 567 | 0.2478 | 0.82 | 0.3226 | 0.8336 | 0.82 | 0.8090 | 0.2670 | 0.0607 |
| 0.232 | 82.0 | 574 | 0.2480 | 0.825 | 0.3224 | 0.8340 | 0.825 | 0.8165 | 0.2673 | 0.0605 |
| 0.232 | 83.0 | 581 | 0.2479 | 0.82 | 0.3227 | 0.8347 | 0.82 | 0.8090 | 0.2564 | 0.0607 |
| 0.232 | 84.0 | 588 | 0.2480 | 0.82 | 0.3226 | 0.8342 | 0.82 | 0.8090 | 0.2625 | 0.0606 |
| 0.232 | 85.0 | 595 | 0.2480 | 0.82 | 0.3225 | 0.8339 | 0.82 | 0.8090 | 0.2625 | 0.0606 |
| 0.232 | 86.0 | 602 | 0.2479 | 0.825 | 0.3226 | 0.8339 | 0.825 | 0.8165 | 0.2677 | 0.0606 |
| 0.232 | 87.0 | 609 | 0.2479 | 0.82 | 0.3225 | 0.8339 | 0.82 | 0.8090 | 0.2624 | 0.0605 |
| 0.232 | 88.0 | 616 | 0.2481 | 0.825 | 0.3225 | 0.8343 | 0.825 | 0.8165 | 0.2675 | 0.0604 |
| 0.232 | 89.0 | 623 | 0.2481 | 0.825 | 0.3225 | 0.8341 | 0.825 | 0.8165 | 0.2722 | 0.0605 |
| 0.232 | 90.0 | 630 | 0.2481 | 0.82 | 0.3225 | 0.8341 | 0.82 | 0.8090 | 0.2625 | 0.0606 |
| 0.232 | 91.0 | 637 | 0.2481 | 0.82 | 0.3226 | 0.8345 | 0.82 | 0.8090 | 0.2629 | 0.0608 |
| 0.232 | 92.0 | 644 | 0.2481 | 0.825 | 0.3226 | 0.8342 | 0.825 | 0.8165 | 0.2675 | 0.0605 |
| 0.232 | 93.0 | 651 | 0.2481 | 0.825 | 0.3225 | 0.8340 | 0.825 | 0.8165 | 0.2675 | 0.0605 |
| 0.232 | 94.0 | 658 | 0.2481 | 0.82 | 0.3225 | 0.8343 | 0.82 | 0.8090 | 0.2625 | 0.0606 |
| 0.232 | 95.0 | 665 | 0.2482 | 0.82 | 0.3226 | 0.8345 | 0.82 | 0.8090 | 0.2627 | 0.0606 |
| 0.232 | 96.0 | 672 | 0.2482 | 0.82 | 0.3225 | 0.8343 | 0.82 | 0.8090 | 0.2627 | 0.0607 |
| 0.232 | 97.0 | 679 | 0.2482 | 0.82 | 0.3226 | 0.8344 | 0.82 | 0.8090 | 0.2627 | 0.0607 |
| 0.232 | 98.0 | 686 | 0.2482 | 0.82 | 0.3226 | 0.8344 | 0.82 | 0.8090 | 0.2626 | 0.0606 |
| 0.232 | 99.0 | 693 | 0.2482 | 0.82 | 0.3226 | 0.8343 | 0.82 | 0.8090 | 0.2625 | 0.0606 |
| 0.232 | 100.0 | 700 | 0.2482 | 0.82 | 0.3226 | 0.8343 | 0.82 | 0.8090 | 0.2625 | 0.0606 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-small_tobacco3482_kd_MSE
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_tobacco3482_kd_MSE
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2131
- Accuracy: 0.84
- Brier Loss: 0.2974
- Nll: 0.8913
- F1 Micro: 0.8400
- F1 Macro: 0.8190
- Ece: 0.2456
- Aurc: 0.0512
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
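The `kd_MSE` suffix in the model name suggests knowledge distillation in which the student matches a teacher through a mean-squared-error term on the logits; the card itself does not spell this out, so the following is only a sketch of that reading (the teacher, the hard-label term and the weighting are assumptions):
```python
# Hedged sketch of MSE-based logit distillation; not the verified training code for this card.
import torch
import torch.nn.functional as F

def kd_mse_loss(student_logits: torch.Tensor,
                teacher_logits: torch.Tensor,
                labels: torch.Tensor,
                alpha: float = 0.5) -> torch.Tensor:
    """Blend of hard-label cross-entropy and MSE between student and teacher logits."""
    ce = F.cross_entropy(student_logits, labels)
    mse = F.mse_loss(student_logits, teacher_logits.detach())
    return alpha * ce + (1.0 - alpha) * mse
```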
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 1.4711 | 0.21 | 0.8898 | 6.2752 | 0.2100 | 0.1403 | 0.2702 | 0.7673 |
| No log | 2.0 | 14 | 1.0769 | 0.41 | 0.8120 | 5.2446 | 0.41 | 0.2713 | 0.3253 | 0.5170 |
| No log | 3.0 | 21 | 0.7901 | 0.51 | 0.7057 | 2.6186 | 0.51 | 0.4114 | 0.3359 | 0.3162 |
| No log | 4.0 | 28 | 0.6044 | 0.61 | 0.5736 | 1.8428 | 0.61 | 0.4989 | 0.3358 | 0.1889 |
| No log | 5.0 | 35 | 0.4605 | 0.7 | 0.5009 | 1.3395 | 0.7 | 0.6120 | 0.3587 | 0.1321 |
| No log | 6.0 | 42 | 0.4484 | 0.73 | 0.4373 | 1.4781 | 0.7300 | 0.6394 | 0.2751 | 0.1150 |
| No log | 7.0 | 49 | 0.4406 | 0.765 | 0.4180 | 1.1081 | 0.765 | 0.7193 | 0.3066 | 0.0981 |
| No log | 8.0 | 56 | 0.3421 | 0.82 | 0.3575 | 0.9309 | 0.82 | 0.7764 | 0.2867 | 0.0703 |
| No log | 9.0 | 63 | 0.4201 | 0.75 | 0.3973 | 1.5859 | 0.75 | 0.7562 | 0.2618 | 0.1051 |
| No log | 10.0 | 70 | 0.4086 | 0.795 | 0.3775 | 1.2870 | 0.795 | 0.7701 | 0.3104 | 0.0691 |
| No log | 11.0 | 77 | 0.2867 | 0.82 | 0.3251 | 1.2141 | 0.82 | 0.7996 | 0.2511 | 0.0683 |
| No log | 12.0 | 84 | 0.2964 | 0.825 | 0.3233 | 1.0042 | 0.825 | 0.8028 | 0.2801 | 0.0538 |
| No log | 13.0 | 91 | 0.3010 | 0.81 | 0.3351 | 1.0085 | 0.81 | 0.7735 | 0.2678 | 0.0584 |
| No log | 14.0 | 98 | 0.2741 | 0.835 | 0.3194 | 1.0574 | 0.835 | 0.8127 | 0.2982 | 0.0542 |
| No log | 15.0 | 105 | 0.2524 | 0.845 | 0.3228 | 1.1162 | 0.845 | 0.8225 | 0.2911 | 0.0568 |
| No log | 16.0 | 112 | 0.2652 | 0.83 | 0.3154 | 0.8145 | 0.83 | 0.8130 | 0.2786 | 0.0516 |
| No log | 17.0 | 119 | 0.2478 | 0.83 | 0.3241 | 1.1158 | 0.83 | 0.8034 | 0.2776 | 0.0683 |
| No log | 18.0 | 126 | 0.2526 | 0.85 | 0.3112 | 1.0132 | 0.85 | 0.8324 | 0.2757 | 0.0517 |
| No log | 19.0 | 133 | 0.2423 | 0.855 | 0.3023 | 1.0623 | 0.855 | 0.8382 | 0.2727 | 0.0561 |
| No log | 20.0 | 140 | 0.2294 | 0.83 | 0.3112 | 1.1134 | 0.83 | 0.8139 | 0.2697 | 0.0703 |
| No log | 21.0 | 147 | 0.2380 | 0.835 | 0.3080 | 0.9961 | 0.835 | 0.8190 | 0.2841 | 0.0489 |
| No log | 22.0 | 154 | 0.2362 | 0.84 | 0.3034 | 0.9586 | 0.8400 | 0.8145 | 0.2626 | 0.0520 |
| No log | 23.0 | 161 | 0.2252 | 0.86 | 0.2946 | 1.1006 | 0.8600 | 0.8471 | 0.2830 | 0.0495 |
| No log | 24.0 | 168 | 0.2325 | 0.85 | 0.2985 | 0.9069 | 0.85 | 0.8288 | 0.2681 | 0.0533 |
| No log | 25.0 | 175 | 0.2335 | 0.825 | 0.3005 | 0.8930 | 0.825 | 0.8000 | 0.2640 | 0.0496 |
| No log | 26.0 | 182 | 0.2309 | 0.845 | 0.2984 | 1.0007 | 0.845 | 0.8308 | 0.2573 | 0.0536 |
| No log | 27.0 | 189 | 0.2265 | 0.835 | 0.3051 | 1.0092 | 0.835 | 0.8158 | 0.2626 | 0.0603 |
| No log | 28.0 | 196 | 0.2192 | 0.83 | 0.2977 | 1.0186 | 0.83 | 0.8019 | 0.2516 | 0.0572 |
| No log | 29.0 | 203 | 0.2276 | 0.83 | 0.3017 | 0.9407 | 0.83 | 0.8179 | 0.2553 | 0.0480 |
| No log | 30.0 | 210 | 0.2131 | 0.84 | 0.2992 | 0.9232 | 0.8400 | 0.8195 | 0.2541 | 0.0546 |
| No log | 31.0 | 217 | 0.2197 | 0.845 | 0.2998 | 0.9012 | 0.845 | 0.8301 | 0.2537 | 0.0569 |
| No log | 32.0 | 224 | 0.2138 | 0.85 | 0.2972 | 0.9117 | 0.85 | 0.8349 | 0.2777 | 0.0551 |
| No log | 33.0 | 231 | 0.2167 | 0.85 | 0.2969 | 1.0176 | 0.85 | 0.8390 | 0.2676 | 0.0535 |
| No log | 34.0 | 238 | 0.2114 | 0.84 | 0.2959 | 0.8912 | 0.8400 | 0.8190 | 0.2512 | 0.0514 |
| No log | 35.0 | 245 | 0.2145 | 0.845 | 0.2952 | 0.8960 | 0.845 | 0.8216 | 0.2638 | 0.0492 |
| No log | 36.0 | 252 | 0.2146 | 0.845 | 0.2960 | 0.9093 | 0.845 | 0.8301 | 0.2841 | 0.0519 |
| No log | 37.0 | 259 | 0.2157 | 0.845 | 0.2973 | 0.9043 | 0.845 | 0.8216 | 0.2614 | 0.0520 |
| No log | 38.0 | 266 | 0.2116 | 0.84 | 0.2949 | 0.8871 | 0.8400 | 0.8190 | 0.2639 | 0.0512 |
| No log | 39.0 | 273 | 0.2138 | 0.845 | 0.2963 | 0.9002 | 0.845 | 0.8301 | 0.2497 | 0.0512 |
| No log | 40.0 | 280 | 0.2129 | 0.84 | 0.2960 | 0.9731 | 0.8400 | 0.8190 | 0.2500 | 0.0511 |
| No log | 41.0 | 287 | 0.2139 | 0.845 | 0.2966 | 1.0111 | 0.845 | 0.8301 | 0.2750 | 0.0523 |
| No log | 42.0 | 294 | 0.2134 | 0.84 | 0.2959 | 0.9515 | 0.8400 | 0.8190 | 0.2577 | 0.0506 |
| No log | 43.0 | 301 | 0.2134 | 0.84 | 0.2972 | 0.9022 | 0.8400 | 0.8190 | 0.2538 | 0.0517 |
| No log | 44.0 | 308 | 0.2131 | 0.84 | 0.2966 | 0.9569 | 0.8400 | 0.8190 | 0.2683 | 0.0519 |
| No log | 45.0 | 315 | 0.2131 | 0.84 | 0.2965 | 0.8931 | 0.8400 | 0.8190 | 0.2504 | 0.0513 |
| No log | 46.0 | 322 | 0.2119 | 0.84 | 0.2963 | 0.8998 | 0.8400 | 0.8190 | 0.2535 | 0.0513 |
| No log | 47.0 | 329 | 0.2129 | 0.84 | 0.2973 | 0.9017 | 0.8400 | 0.8190 | 0.2527 | 0.0514 |
| No log | 48.0 | 336 | 0.2130 | 0.84 | 0.2971 | 0.8947 | 0.8400 | 0.8190 | 0.2520 | 0.0510 |
| No log | 49.0 | 343 | 0.2123 | 0.84 | 0.2972 | 0.9482 | 0.8400 | 0.8190 | 0.2583 | 0.0515 |
| No log | 50.0 | 350 | 0.2124 | 0.84 | 0.2970 | 0.9083 | 0.8400 | 0.8190 | 0.2604 | 0.0513 |
| No log | 51.0 | 357 | 0.2130 | 0.84 | 0.2974 | 0.8978 | 0.8400 | 0.8190 | 0.2446 | 0.0513 |
| No log | 52.0 | 364 | 0.2127 | 0.84 | 0.2975 | 0.8932 | 0.8400 | 0.8190 | 0.2457 | 0.0513 |
| No log | 53.0 | 371 | 0.2125 | 0.84 | 0.2972 | 0.8935 | 0.8400 | 0.8190 | 0.2508 | 0.0512 |
| No log | 54.0 | 378 | 0.2130 | 0.84 | 0.2975 | 0.8989 | 0.8400 | 0.8190 | 0.2551 | 0.0513 |
| No log | 55.0 | 385 | 0.2128 | 0.84 | 0.2972 | 0.8941 | 0.8400 | 0.8190 | 0.2448 | 0.0511 |
| No log | 56.0 | 392 | 0.2128 | 0.84 | 0.2974 | 0.8944 | 0.8400 | 0.8190 | 0.2459 | 0.0515 |
| No log | 57.0 | 399 | 0.2128 | 0.84 | 0.2973 | 0.8934 | 0.8400 | 0.8190 | 0.2517 | 0.0512 |
| No log | 58.0 | 406 | 0.2130 | 0.84 | 0.2973 | 0.8936 | 0.8400 | 0.8190 | 0.2448 | 0.0513 |
| No log | 59.0 | 413 | 0.2129 | 0.84 | 0.2973 | 0.8951 | 0.8400 | 0.8190 | 0.2383 | 0.0513 |
| No log | 60.0 | 420 | 0.2128 | 0.84 | 0.2972 | 0.8921 | 0.8400 | 0.8190 | 0.2519 | 0.0512 |
| No log | 61.0 | 427 | 0.2125 | 0.84 | 0.2974 | 0.8959 | 0.8400 | 0.8190 | 0.2518 | 0.0515 |
| No log | 62.0 | 434 | 0.2128 | 0.84 | 0.2973 | 0.8937 | 0.8400 | 0.8190 | 0.2385 | 0.0513 |
| No log | 63.0 | 441 | 0.2131 | 0.84 | 0.2974 | 0.8933 | 0.8400 | 0.8190 | 0.2551 | 0.0512 |
| No log | 64.0 | 448 | 0.2129 | 0.84 | 0.2974 | 0.8930 | 0.8400 | 0.8190 | 0.2388 | 0.0512 |
| No log | 65.0 | 455 | 0.2129 | 0.84 | 0.2973 | 0.8927 | 0.8400 | 0.8190 | 0.2447 | 0.0513 |
| No log | 66.0 | 462 | 0.2129 | 0.84 | 0.2974 | 0.8930 | 0.8400 | 0.8190 | 0.2385 | 0.0513 |
| No log | 67.0 | 469 | 0.2129 | 0.84 | 0.2974 | 0.8929 | 0.8400 | 0.8190 | 0.2458 | 0.0512 |
| No log | 68.0 | 476 | 0.2130 | 0.84 | 0.2975 | 0.8930 | 0.8400 | 0.8190 | 0.2455 | 0.0512 |
| No log | 69.0 | 483 | 0.2130 | 0.84 | 0.2973 | 0.8917 | 0.8400 | 0.8190 | 0.2459 | 0.0513 |
| No log | 70.0 | 490 | 0.2129 | 0.84 | 0.2973 | 0.8913 | 0.8400 | 0.8190 | 0.2520 | 0.0513 |
| No log | 71.0 | 497 | 0.2131 | 0.84 | 0.2974 | 0.8919 | 0.8400 | 0.8190 | 0.2519 | 0.0513 |
| 0.1234 | 72.0 | 504 | 0.2130 | 0.84 | 0.2973 | 0.8917 | 0.8400 | 0.8190 | 0.2457 | 0.0511 |
| 0.1234 | 73.0 | 511 | 0.2129 | 0.84 | 0.2974 | 0.8917 | 0.8400 | 0.8190 | 0.2455 | 0.0512 |
| 0.1234 | 74.0 | 518 | 0.2129 | 0.84 | 0.2974 | 0.8913 | 0.8400 | 0.8190 | 0.2455 | 0.0512 |
| 0.1234 | 75.0 | 525 | 0.2130 | 0.84 | 0.2973 | 0.8917 | 0.8400 | 0.8190 | 0.2519 | 0.0513 |
| 0.1234 | 76.0 | 532 | 0.2129 | 0.84 | 0.2974 | 0.8921 | 0.8400 | 0.8190 | 0.2455 | 0.0512 |
| 0.1234 | 77.0 | 539 | 0.2130 | 0.84 | 0.2973 | 0.8919 | 0.8400 | 0.8190 | 0.2455 | 0.0511 |
| 0.1234 | 78.0 | 546 | 0.2130 | 0.84 | 0.2973 | 0.8924 | 0.8400 | 0.8190 | 0.2455 | 0.0511 |
| 0.1234 | 79.0 | 553 | 0.2130 | 0.84 | 0.2974 | 0.8919 | 0.8400 | 0.8190 | 0.2456 | 0.0512 |
| 0.1234 | 80.0 | 560 | 0.2130 | 0.84 | 0.2973 | 0.8915 | 0.8400 | 0.8190 | 0.2515 | 0.0512 |
| 0.1234 | 81.0 | 567 | 0.2130 | 0.84 | 0.2973 | 0.8915 | 0.8400 | 0.8190 | 0.2456 | 0.0511 |
| 0.1234 | 82.0 | 574 | 0.2130 | 0.84 | 0.2974 | 0.8915 | 0.8400 | 0.8190 | 0.2456 | 0.0512 |
| 0.1234 | 83.0 | 581 | 0.2130 | 0.84 | 0.2973 | 0.8916 | 0.8400 | 0.8190 | 0.2516 | 0.0512 |
| 0.1234 | 84.0 | 588 | 0.2130 | 0.84 | 0.2974 | 0.8920 | 0.8400 | 0.8190 | 0.2456 | 0.0512 |
| 0.1234 | 85.0 | 595 | 0.2130 | 0.84 | 0.2974 | 0.8915 | 0.8400 | 0.8190 | 0.2456 | 0.0512 |
| 0.1234 | 86.0 | 602 | 0.2130 | 0.84 | 0.2974 | 0.8917 | 0.8400 | 0.8190 | 0.2456 | 0.0512 |
| 0.1234 | 87.0 | 609 | 0.2130 | 0.84 | 0.2974 | 0.8913 | 0.8400 | 0.8190 | 0.2517 | 0.0512 |
| 0.1234 | 88.0 | 616 | 0.2130 | 0.84 | 0.2973 | 0.8916 | 0.8400 | 0.8190 | 0.2456 | 0.0512 |
| 0.1234 | 89.0 | 623 | 0.2130 | 0.84 | 0.2974 | 0.8912 | 0.8400 | 0.8190 | 0.2456 | 0.0512 |
| 0.1234 | 90.0 | 630 | 0.2130 | 0.84 | 0.2973 | 0.8914 | 0.8400 | 0.8190 | 0.2517 | 0.0512 |
| 0.1234 | 91.0 | 637 | 0.2131 | 0.84 | 0.2974 | 0.8915 | 0.8400 | 0.8190 | 0.2456 | 0.0512 |
| 0.1234 | 92.0 | 644 | 0.2130 | 0.84 | 0.2973 | 0.8912 | 0.8400 | 0.8190 | 0.2456 | 0.0512 |
| 0.1234 | 93.0 | 651 | 0.2130 | 0.84 | 0.2974 | 0.8915 | 0.8400 | 0.8190 | 0.2456 | 0.0512 |
| 0.1234 | 94.0 | 658 | 0.2130 | 0.84 | 0.2973 | 0.8913 | 0.8400 | 0.8190 | 0.2456 | 0.0512 |
| 0.1234 | 95.0 | 665 | 0.2130 | 0.84 | 0.2973 | 0.8913 | 0.8400 | 0.8190 | 0.2456 | 0.0512 |
| 0.1234 | 96.0 | 672 | 0.2131 | 0.84 | 0.2974 | 0.8915 | 0.8400 | 0.8190 | 0.2456 | 0.0512 |
| 0.1234 | 97.0 | 679 | 0.2131 | 0.84 | 0.2973 | 0.8914 | 0.8400 | 0.8190 | 0.2517 | 0.0512 |
| 0.1234 | 98.0 | 686 | 0.2130 | 0.84 | 0.2974 | 0.8912 | 0.8400 | 0.8190 | 0.2456 | 0.0512 |
| 0.1234 | 99.0 | 693 | 0.2131 | 0.84 | 0.2974 | 0.8913 | 0.8400 | 0.8190 | 0.2456 | 0.0512 |
| 0.1234 | 100.0 | 700 | 0.2131 | 0.84 | 0.2974 | 0.8913 | 0.8400 | 0.8190 | 0.2456 | 0.0512 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-small_tobacco3482_kd_CEKD_t1.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_tobacco3482_kd_CEKD_t1.5_a0.5
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4258
- Accuracy: 0.825
- Brier Loss: 0.2707
- Nll: 0.8867
- F1 Micro: 0.825
- F1 Macro: 0.8116
- Ece: 0.2129
- Aurc: 0.0681
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
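Reading the name `CEKD_t1.5_a0.5` as cross-entropy plus knowledge distillation with temperature `t = 1.5` and mixing weight `a = 0.5` (an interpretation, not stated in the card), a minimal sketch of such an objective is:
```python
# Hedged sketch of a CE + soft-target KD objective; t and a are read off the model name.
import torch
import torch.nn.functional as F

def cekd_loss(student_logits: torch.Tensor,
              teacher_logits: torch.Tensor,
              labels: torch.Tensor,
              t: float = 1.5,
              a: float = 0.5) -> torch.Tensor:
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / t, dim=-1),
        F.softmax(teacher_logits.detach() / t, dim=-1),
        reduction="batchmean",
    ) * (t * t)  # conventional temperature-squared scaling
    return a * ce + (1.0 - a) * kd
```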
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 1.7307 | 0.22 | 0.8748 | 5.3766 | 0.22 | 0.1294 | 0.2444 | 0.6913 |
| No log | 2.0 | 14 | 1.3514 | 0.405 | 0.7426 | 3.5573 | 0.405 | 0.2280 | 0.2900 | 0.4026 |
| No log | 3.0 | 21 | 0.9121 | 0.62 | 0.5647 | 1.9398 | 0.62 | 0.5595 | 0.2879 | 0.2015 |
| No log | 4.0 | 28 | 0.7084 | 0.695 | 0.4179 | 1.7042 | 0.695 | 0.6379 | 0.2305 | 0.1177 |
| No log | 5.0 | 35 | 0.7167 | 0.735 | 0.3862 | 1.7929 | 0.735 | 0.7392 | 0.2380 | 0.1046 |
| No log | 6.0 | 42 | 0.6442 | 0.765 | 0.3625 | 1.5688 | 0.765 | 0.7549 | 0.2371 | 0.1034 |
| No log | 7.0 | 49 | 0.6147 | 0.805 | 0.3410 | 1.5975 | 0.805 | 0.7789 | 0.2438 | 0.1042 |
| No log | 8.0 | 56 | 0.6444 | 0.775 | 0.3446 | 1.2309 | 0.775 | 0.7725 | 0.2305 | 0.0911 |
| No log | 9.0 | 63 | 0.5964 | 0.8 | 0.3219 | 1.3613 | 0.8000 | 0.7784 | 0.2446 | 0.0734 |
| No log | 10.0 | 70 | 0.5700 | 0.82 | 0.3160 | 1.2605 | 0.82 | 0.7860 | 0.2301 | 0.0632 |
| No log | 11.0 | 77 | 0.5663 | 0.79 | 0.3176 | 1.2939 | 0.79 | 0.7643 | 0.2315 | 0.0666 |
| No log | 12.0 | 84 | 0.5111 | 0.825 | 0.3143 | 1.1082 | 0.825 | 0.8082 | 0.2519 | 0.0844 |
| No log | 13.0 | 91 | 0.5228 | 0.78 | 0.3156 | 0.9444 | 0.78 | 0.7773 | 0.1941 | 0.0650 |
| No log | 14.0 | 98 | 0.5792 | 0.78 | 0.3409 | 1.5054 | 0.78 | 0.7725 | 0.2061 | 0.1019 |
| No log | 15.0 | 105 | 0.4905 | 0.83 | 0.2912 | 1.0068 | 0.83 | 0.8266 | 0.2324 | 0.0545 |
| No log | 16.0 | 112 | 0.4990 | 0.825 | 0.2961 | 1.1452 | 0.825 | 0.8140 | 0.2188 | 0.0632 |
| No log | 17.0 | 119 | 0.4900 | 0.805 | 0.2940 | 1.2027 | 0.805 | 0.8018 | 0.2188 | 0.0860 |
| No log | 18.0 | 126 | 0.4755 | 0.805 | 0.2988 | 1.0223 | 0.805 | 0.7789 | 0.2229 | 0.0792 |
| No log | 19.0 | 133 | 0.4398 | 0.81 | 0.2679 | 0.9732 | 0.81 | 0.7830 | 0.2085 | 0.0585 |
| No log | 20.0 | 140 | 0.4766 | 0.805 | 0.2992 | 0.9730 | 0.805 | 0.7934 | 0.2141 | 0.0662 |
| No log | 21.0 | 147 | 0.4615 | 0.835 | 0.2867 | 0.9343 | 0.835 | 0.8219 | 0.1999 | 0.0751 |
| No log | 22.0 | 154 | 0.4343 | 0.825 | 0.2641 | 1.1353 | 0.825 | 0.8070 | 0.2095 | 0.0603 |
| No log | 23.0 | 161 | 0.4291 | 0.85 | 0.2660 | 1.0109 | 0.85 | 0.8365 | 0.2435 | 0.0615 |
| No log | 24.0 | 168 | 0.4263 | 0.855 | 0.2653 | 0.9395 | 0.855 | 0.8440 | 0.2445 | 0.0623 |
| No log | 25.0 | 175 | 0.4338 | 0.845 | 0.2700 | 0.8794 | 0.845 | 0.8349 | 0.2254 | 0.0584 |
| No log | 26.0 | 182 | 0.4305 | 0.835 | 0.2648 | 0.9062 | 0.835 | 0.8322 | 0.2113 | 0.0658 |
| No log | 27.0 | 189 | 0.4262 | 0.84 | 0.2683 | 0.9967 | 0.8400 | 0.8291 | 0.2240 | 0.0670 |
| No log | 28.0 | 196 | 0.4329 | 0.83 | 0.2724 | 0.9016 | 0.83 | 0.8239 | 0.2016 | 0.0685 |
| No log | 29.0 | 203 | 0.4233 | 0.845 | 0.2653 | 0.9115 | 0.845 | 0.8375 | 0.2005 | 0.0634 |
| No log | 30.0 | 210 | 0.4204 | 0.84 | 0.2638 | 0.8892 | 0.8400 | 0.8348 | 0.2175 | 0.0633 |
| No log | 31.0 | 217 | 0.4240 | 0.83 | 0.2684 | 0.8871 | 0.83 | 0.8217 | 0.2128 | 0.0660 |
| No log | 32.0 | 224 | 0.4246 | 0.84 | 0.2677 | 0.8867 | 0.8400 | 0.8307 | 0.2117 | 0.0670 |
| No log | 33.0 | 231 | 0.4247 | 0.83 | 0.2690 | 0.8917 | 0.83 | 0.8202 | 0.2084 | 0.0679 |
| No log | 34.0 | 238 | 0.4218 | 0.84 | 0.2660 | 0.8848 | 0.8400 | 0.8326 | 0.2138 | 0.0663 |
| No log | 35.0 | 245 | 0.4220 | 0.845 | 0.2667 | 0.8926 | 0.845 | 0.8354 | 0.2109 | 0.0655 |
| No log | 36.0 | 252 | 0.4247 | 0.83 | 0.2694 | 0.8854 | 0.83 | 0.8202 | 0.2213 | 0.0683 |
| No log | 37.0 | 259 | 0.4239 | 0.84 | 0.2683 | 0.8849 | 0.8400 | 0.8326 | 0.2163 | 0.0670 |
| No log | 38.0 | 266 | 0.4239 | 0.835 | 0.2689 | 0.8876 | 0.835 | 0.8208 | 0.2118 | 0.0672 |
| No log | 39.0 | 273 | 0.4252 | 0.83 | 0.2696 | 0.8885 | 0.83 | 0.8180 | 0.2064 | 0.0682 |
| No log | 40.0 | 280 | 0.4237 | 0.835 | 0.2686 | 0.8867 | 0.835 | 0.8208 | 0.2211 | 0.0675 |
| No log | 41.0 | 287 | 0.4256 | 0.83 | 0.2700 | 0.8847 | 0.83 | 0.8180 | 0.2253 | 0.0682 |
| No log | 42.0 | 294 | 0.4243 | 0.835 | 0.2692 | 0.8839 | 0.835 | 0.8208 | 0.2130 | 0.0675 |
| No log | 43.0 | 301 | 0.4248 | 0.83 | 0.2695 | 0.8850 | 0.83 | 0.8180 | 0.2237 | 0.0682 |
| No log | 44.0 | 308 | 0.4246 | 0.83 | 0.2694 | 0.8847 | 0.83 | 0.8180 | 0.2383 | 0.0680 |
| No log | 45.0 | 315 | 0.4253 | 0.83 | 0.2699 | 0.8858 | 0.83 | 0.8180 | 0.2200 | 0.0681 |
| No log | 46.0 | 322 | 0.4246 | 0.83 | 0.2694 | 0.8857 | 0.83 | 0.8180 | 0.2311 | 0.0679 |
| No log | 47.0 | 329 | 0.4253 | 0.83 | 0.2700 | 0.8843 | 0.83 | 0.8180 | 0.2312 | 0.0682 |
| No log | 48.0 | 336 | 0.4252 | 0.83 | 0.2698 | 0.8830 | 0.83 | 0.8180 | 0.2177 | 0.0682 |
| No log | 49.0 | 343 | 0.4257 | 0.83 | 0.2703 | 0.8848 | 0.83 | 0.8180 | 0.2315 | 0.0683 |
| No log | 50.0 | 350 | 0.4256 | 0.83 | 0.2703 | 0.8833 | 0.83 | 0.8180 | 0.2331 | 0.0684 |
| No log | 51.0 | 357 | 0.4254 | 0.83 | 0.2703 | 0.8863 | 0.83 | 0.8180 | 0.2422 | 0.0681 |
| No log | 52.0 | 364 | 0.4261 | 0.83 | 0.2707 | 0.8864 | 0.83 | 0.8180 | 0.2424 | 0.0683 |
| No log | 53.0 | 371 | 0.4249 | 0.83 | 0.2700 | 0.8855 | 0.83 | 0.8180 | 0.2195 | 0.0679 |
| No log | 54.0 | 378 | 0.4255 | 0.83 | 0.2704 | 0.8846 | 0.83 | 0.8180 | 0.2342 | 0.0682 |
| No log | 55.0 | 385 | 0.4256 | 0.825 | 0.2704 | 0.8861 | 0.825 | 0.8116 | 0.2357 | 0.0682 |
| No log | 56.0 | 392 | 0.4264 | 0.83 | 0.2708 | 0.8853 | 0.83 | 0.8180 | 0.2345 | 0.0682 |
| No log | 57.0 | 399 | 0.4257 | 0.825 | 0.2706 | 0.8864 | 0.825 | 0.8116 | 0.2353 | 0.0682 |
| No log | 58.0 | 406 | 0.4258 | 0.825 | 0.2704 | 0.8841 | 0.825 | 0.8116 | 0.2271 | 0.0681 |
| No log | 59.0 | 413 | 0.4255 | 0.825 | 0.2703 | 0.8856 | 0.825 | 0.8116 | 0.2267 | 0.0680 |
| No log | 60.0 | 420 | 0.4259 | 0.825 | 0.2709 | 0.8842 | 0.825 | 0.8116 | 0.2269 | 0.0683 |
| No log | 61.0 | 427 | 0.4254 | 0.83 | 0.2702 | 0.8852 | 0.83 | 0.8180 | 0.2265 | 0.0680 |
| No log | 62.0 | 434 | 0.4261 | 0.83 | 0.2707 | 0.8851 | 0.83 | 0.8180 | 0.2346 | 0.0682 |
| No log | 63.0 | 441 | 0.4257 | 0.825 | 0.2704 | 0.8854 | 0.825 | 0.8116 | 0.2232 | 0.0682 |
| No log | 64.0 | 448 | 0.4261 | 0.825 | 0.2708 | 0.8845 | 0.825 | 0.8116 | 0.2264 | 0.0683 |
| No log | 65.0 | 455 | 0.4259 | 0.825 | 0.2706 | 0.8862 | 0.825 | 0.8116 | 0.2204 | 0.0682 |
| No log | 66.0 | 462 | 0.4258 | 0.825 | 0.2707 | 0.8856 | 0.825 | 0.8116 | 0.2193 | 0.0682 |
| No log | 67.0 | 469 | 0.4255 | 0.83 | 0.2703 | 0.8852 | 0.83 | 0.8180 | 0.2190 | 0.0681 |
| No log | 68.0 | 476 | 0.4260 | 0.825 | 0.2708 | 0.8860 | 0.825 | 0.8116 | 0.2196 | 0.0682 |
| No log | 69.0 | 483 | 0.4259 | 0.825 | 0.2708 | 0.8858 | 0.825 | 0.8116 | 0.2195 | 0.0682 |
| No log | 70.0 | 490 | 0.4255 | 0.825 | 0.2703 | 0.8857 | 0.825 | 0.8116 | 0.2135 | 0.0682 |
| No log | 71.0 | 497 | 0.4258 | 0.825 | 0.2707 | 0.8857 | 0.825 | 0.8116 | 0.2205 | 0.0681 |
| 0.1816 | 72.0 | 504 | 0.4261 | 0.825 | 0.2708 | 0.8857 | 0.825 | 0.8116 | 0.2198 | 0.0682 |
| 0.1816 | 73.0 | 511 | 0.4259 | 0.825 | 0.2706 | 0.8852 | 0.825 | 0.8116 | 0.2192 | 0.0682 |
| 0.1816 | 74.0 | 518 | 0.4259 | 0.825 | 0.2707 | 0.8856 | 0.825 | 0.8116 | 0.2290 | 0.0681 |
| 0.1816 | 75.0 | 525 | 0.4257 | 0.825 | 0.2706 | 0.8864 | 0.825 | 0.8116 | 0.2337 | 0.0681 |
| 0.1816 | 76.0 | 532 | 0.4259 | 0.825 | 0.2707 | 0.8855 | 0.825 | 0.8116 | 0.2211 | 0.0681 |
| 0.1816 | 77.0 | 539 | 0.4255 | 0.825 | 0.2704 | 0.8860 | 0.825 | 0.8116 | 0.2137 | 0.0680 |
| 0.1816 | 78.0 | 546 | 0.4258 | 0.825 | 0.2707 | 0.8868 | 0.825 | 0.8116 | 0.2274 | 0.0682 |
| 0.1816 | 79.0 | 553 | 0.4260 | 0.825 | 0.2708 | 0.8859 | 0.825 | 0.8116 | 0.2209 | 0.0682 |
| 0.1816 | 80.0 | 560 | 0.4260 | 0.825 | 0.2708 | 0.8864 | 0.825 | 0.8116 | 0.2135 | 0.0681 |
| 0.1816 | 81.0 | 567 | 0.4259 | 0.825 | 0.2707 | 0.8859 | 0.825 | 0.8116 | 0.2134 | 0.0682 |
| 0.1816 | 82.0 | 574 | 0.4258 | 0.825 | 0.2706 | 0.8862 | 0.825 | 0.8116 | 0.2062 | 0.0681 |
| 0.1816 | 83.0 | 581 | 0.4259 | 0.825 | 0.2707 | 0.8866 | 0.825 | 0.8116 | 0.2204 | 0.0681 |
| 0.1816 | 84.0 | 588 | 0.4259 | 0.825 | 0.2707 | 0.8868 | 0.825 | 0.8116 | 0.2204 | 0.0681 |
| 0.1816 | 85.0 | 595 | 0.4257 | 0.825 | 0.2706 | 0.8861 | 0.825 | 0.8116 | 0.2141 | 0.0682 |
| 0.1816 | 86.0 | 602 | 0.4258 | 0.825 | 0.2707 | 0.8861 | 0.825 | 0.8116 | 0.2140 | 0.0682 |
| 0.1816 | 87.0 | 609 | 0.4258 | 0.825 | 0.2707 | 0.8867 | 0.825 | 0.8116 | 0.2137 | 0.0680 |
| 0.1816 | 88.0 | 616 | 0.4259 | 0.825 | 0.2707 | 0.8866 | 0.825 | 0.8116 | 0.2129 | 0.0681 |
| 0.1816 | 89.0 | 623 | 0.4258 | 0.825 | 0.2707 | 0.8866 | 0.825 | 0.8116 | 0.2205 | 0.0681 |
| 0.1816 | 90.0 | 630 | 0.4259 | 0.825 | 0.2707 | 0.8865 | 0.825 | 0.8116 | 0.2053 | 0.0680 |
| 0.1816 | 91.0 | 637 | 0.4258 | 0.825 | 0.2706 | 0.8868 | 0.825 | 0.8116 | 0.2130 | 0.0681 |
| 0.1816 | 92.0 | 644 | 0.4258 | 0.825 | 0.2706 | 0.8870 | 0.825 | 0.8116 | 0.2129 | 0.0680 |
| 0.1816 | 93.0 | 651 | 0.4258 | 0.825 | 0.2706 | 0.8868 | 0.825 | 0.8116 | 0.2129 | 0.0681 |
| 0.1816 | 94.0 | 658 | 0.4258 | 0.825 | 0.2707 | 0.8867 | 0.825 | 0.8116 | 0.2129 | 0.0681 |
| 0.1816 | 95.0 | 665 | 0.4258 | 0.825 | 0.2707 | 0.8867 | 0.825 | 0.8116 | 0.2053 | 0.0680 |
| 0.1816 | 96.0 | 672 | 0.4259 | 0.825 | 0.2707 | 0.8866 | 0.825 | 0.8116 | 0.2053 | 0.0681 |
| 0.1816 | 97.0 | 679 | 0.4258 | 0.825 | 0.2707 | 0.8868 | 0.825 | 0.8116 | 0.2129 | 0.0681 |
| 0.1816 | 98.0 | 686 | 0.4258 | 0.825 | 0.2707 | 0.8868 | 0.825 | 0.8116 | 0.2129 | 0.0680 |
| 0.1816 | 99.0 | 693 | 0.4258 | 0.825 | 0.2707 | 0.8868 | 0.825 | 0.8116 | 0.2129 | 0.0681 |
| 0.1816 | 100.0 | 700 | 0.4258 | 0.825 | 0.2707 | 0.8867 | 0.825 | 0.8116 | 0.2129 | 0.0681 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-small_tobacco3482_kd_CEKD_t1.5_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_tobacco3482_kd_CEKD_t1.5_a0.7
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4797
- Accuracy: 0.835
- Brier Loss: 0.2522
- Nll: 0.8627
- F1 Micro: 0.835
- F1 Macro: 0.8222
- Ece: 0.1830
- Aurc: 0.0434
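Aurc above is typically the area under the risk–coverage curve: predictions are sorted by confidence and the error rate (risk) is averaged over increasing coverage, so lower is better. A hedged sketch of that computation (the exact variant behind this column may differ):
```python
# Illustrative AURC computation; the card's evaluation code may use a different variant.
import numpy as np

def aurc(probs: np.ndarray, labels: np.ndarray) -> float:
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    order = np.argsort(-conf)                    # most confident predictions first
    errors = 1.0 - correct[order]
    coverage_risk = np.cumsum(errors) / (np.arange(len(errors)) + 1)
    return float(coverage_risk.mean())           # average risk over all coverage levels
```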
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 1.9341 | 0.215 | 0.8749 | 5.3238 | 0.2150 | 0.1264 | 0.2642 | 0.6914 |
| No log | 2.0 | 14 | 1.5320 | 0.405 | 0.7410 | 3.5078 | 0.405 | 0.2276 | 0.2957 | 0.4015 |
| No log | 3.0 | 21 | 1.0532 | 0.635 | 0.5629 | 2.0153 | 0.635 | 0.5844 | 0.3037 | 0.2006 |
| No log | 4.0 | 28 | 0.7915 | 0.715 | 0.4093 | 1.6974 | 0.715 | 0.6762 | 0.2420 | 0.1131 |
| No log | 5.0 | 35 | 0.8024 | 0.745 | 0.3869 | 1.7109 | 0.745 | 0.7548 | 0.2160 | 0.1006 |
| No log | 6.0 | 42 | 0.7162 | 0.765 | 0.3351 | 1.8105 | 0.765 | 0.7599 | 0.2216 | 0.0874 |
| No log | 7.0 | 49 | 0.6966 | 0.785 | 0.3304 | 1.5292 | 0.785 | 0.7682 | 0.2058 | 0.0979 |
| No log | 8.0 | 56 | 0.6317 | 0.805 | 0.2995 | 1.3486 | 0.805 | 0.7887 | 0.2266 | 0.0721 |
| No log | 9.0 | 63 | 0.6903 | 0.805 | 0.3304 | 1.5866 | 0.805 | 0.7971 | 0.2371 | 0.0995 |
| No log | 10.0 | 70 | 0.6223 | 0.805 | 0.2940 | 1.3478 | 0.805 | 0.8114 | 0.2281 | 0.0697 |
| No log | 11.0 | 77 | 0.6350 | 0.795 | 0.3145 | 1.3386 | 0.795 | 0.7730 | 0.2063 | 0.0962 |
| No log | 12.0 | 84 | 0.5570 | 0.835 | 0.2666 | 1.2662 | 0.835 | 0.8181 | 0.1951 | 0.0553 |
| No log | 13.0 | 91 | 0.5610 | 0.81 | 0.2858 | 1.2619 | 0.81 | 0.8002 | 0.1884 | 0.0626 |
| No log | 14.0 | 98 | 0.5843 | 0.8 | 0.2961 | 1.0782 | 0.8000 | 0.8083 | 0.1993 | 0.0683 |
| No log | 15.0 | 105 | 0.5918 | 0.78 | 0.2965 | 1.1207 | 0.78 | 0.7861 | 0.1895 | 0.0634 |
| No log | 16.0 | 112 | 0.5541 | 0.84 | 0.2765 | 1.3189 | 0.8400 | 0.8455 | 0.1969 | 0.0597 |
| No log | 17.0 | 119 | 0.5037 | 0.835 | 0.2568 | 0.9024 | 0.835 | 0.8248 | 0.2083 | 0.0499 |
| No log | 18.0 | 126 | 0.5050 | 0.85 | 0.2563 | 1.0032 | 0.85 | 0.8441 | 0.2147 | 0.0580 |
| No log | 19.0 | 133 | 0.5430 | 0.815 | 0.2779 | 1.1046 | 0.815 | 0.8044 | 0.1906 | 0.0562 |
| No log | 20.0 | 140 | 0.5276 | 0.84 | 0.2743 | 0.9964 | 0.8400 | 0.8144 | 0.2104 | 0.0597 |
| No log | 21.0 | 147 | 0.5155 | 0.835 | 0.2686 | 0.9556 | 0.835 | 0.8210 | 0.1962 | 0.0572 |
| No log | 22.0 | 154 | 0.4937 | 0.835 | 0.2581 | 1.0079 | 0.835 | 0.8172 | 0.1975 | 0.0479 |
| No log | 23.0 | 161 | 0.4931 | 0.845 | 0.2533 | 1.0021 | 0.845 | 0.8270 | 0.1884 | 0.0503 |
| No log | 24.0 | 168 | 0.4869 | 0.83 | 0.2554 | 0.9660 | 0.83 | 0.8084 | 0.1945 | 0.0481 |
| No log | 25.0 | 175 | 0.4843 | 0.845 | 0.2512 | 0.9979 | 0.845 | 0.8316 | 0.1746 | 0.0466 |
| No log | 26.0 | 182 | 0.4866 | 0.835 | 0.2531 | 0.9006 | 0.835 | 0.8188 | 0.1833 | 0.0472 |
| No log | 27.0 | 189 | 0.4882 | 0.825 | 0.2562 | 0.8929 | 0.825 | 0.8043 | 0.2023 | 0.0469 |
| No log | 28.0 | 196 | 0.4814 | 0.82 | 0.2494 | 0.9122 | 0.82 | 0.8060 | 0.1773 | 0.0451 |
| No log | 29.0 | 203 | 0.4749 | 0.835 | 0.2501 | 0.8770 | 0.835 | 0.8252 | 0.1688 | 0.0442 |
| No log | 30.0 | 210 | 0.4761 | 0.84 | 0.2490 | 0.8848 | 0.8400 | 0.8250 | 0.2068 | 0.0443 |
| No log | 31.0 | 217 | 0.4787 | 0.845 | 0.2508 | 0.8754 | 0.845 | 0.8309 | 0.1635 | 0.0438 |
| No log | 32.0 | 224 | 0.4791 | 0.835 | 0.2521 | 0.8711 | 0.835 | 0.8224 | 0.1876 | 0.0446 |
| No log | 33.0 | 231 | 0.4779 | 0.84 | 0.2509 | 0.8650 | 0.8400 | 0.8252 | 0.1813 | 0.0436 |
| No log | 34.0 | 238 | 0.4774 | 0.84 | 0.2513 | 0.8662 | 0.8400 | 0.8252 | 0.1919 | 0.0441 |
| No log | 35.0 | 245 | 0.4760 | 0.835 | 0.2502 | 0.8636 | 0.835 | 0.8224 | 0.1840 | 0.0434 |
| No log | 36.0 | 252 | 0.4784 | 0.84 | 0.2509 | 0.8688 | 0.8400 | 0.8281 | 0.1691 | 0.0437 |
| No log | 37.0 | 259 | 0.4771 | 0.835 | 0.2507 | 0.8670 | 0.835 | 0.8224 | 0.1936 | 0.0440 |
| No log | 38.0 | 266 | 0.4764 | 0.835 | 0.2499 | 0.8614 | 0.835 | 0.8224 | 0.1830 | 0.0434 |
| No log | 39.0 | 273 | 0.4769 | 0.835 | 0.2503 | 0.8651 | 0.835 | 0.8224 | 0.2001 | 0.0438 |
| No log | 40.0 | 280 | 0.4777 | 0.84 | 0.2514 | 0.8608 | 0.8400 | 0.8281 | 0.1832 | 0.0435 |
| No log | 41.0 | 287 | 0.4777 | 0.835 | 0.2504 | 0.8650 | 0.835 | 0.8224 | 0.1953 | 0.0437 |
| No log | 42.0 | 294 | 0.4779 | 0.835 | 0.2511 | 0.8629 | 0.835 | 0.8224 | 0.1944 | 0.0440 |
| No log | 43.0 | 301 | 0.4790 | 0.835 | 0.2519 | 0.8631 | 0.835 | 0.8222 | 0.1808 | 0.0439 |
| No log | 44.0 | 308 | 0.4777 | 0.835 | 0.2509 | 0.8604 | 0.835 | 0.8222 | 0.1886 | 0.0435 |
| No log | 45.0 | 315 | 0.4787 | 0.835 | 0.2517 | 0.8620 | 0.835 | 0.8222 | 0.1940 | 0.0437 |
| No log | 46.0 | 322 | 0.4774 | 0.84 | 0.2509 | 0.8614 | 0.8400 | 0.8281 | 0.1779 | 0.0433 |
| No log | 47.0 | 329 | 0.4785 | 0.835 | 0.2517 | 0.8609 | 0.835 | 0.8222 | 0.1811 | 0.0438 |
| No log | 48.0 | 336 | 0.4792 | 0.835 | 0.2521 | 0.8611 | 0.835 | 0.8222 | 0.1849 | 0.0436 |
| No log | 49.0 | 343 | 0.4771 | 0.84 | 0.2509 | 0.8623 | 0.8400 | 0.8281 | 0.1908 | 0.0430 |
| No log | 50.0 | 350 | 0.4793 | 0.835 | 0.2520 | 0.8633 | 0.835 | 0.8222 | 0.1900 | 0.0435 |
| No log | 51.0 | 357 | 0.4786 | 0.83 | 0.2517 | 0.8654 | 0.83 | 0.8159 | 0.1684 | 0.0437 |
| No log | 52.0 | 364 | 0.4792 | 0.83 | 0.2521 | 0.8625 | 0.83 | 0.8166 | 0.1915 | 0.0430 |
| No log | 53.0 | 371 | 0.4785 | 0.835 | 0.2513 | 0.8652 | 0.835 | 0.8222 | 0.1853 | 0.0434 |
| No log | 54.0 | 378 | 0.4798 | 0.835 | 0.2523 | 0.8652 | 0.835 | 0.8222 | 0.1767 | 0.0437 |
| No log | 55.0 | 385 | 0.4791 | 0.835 | 0.2519 | 0.8637 | 0.835 | 0.8222 | 0.1891 | 0.0435 |
| No log | 56.0 | 392 | 0.4790 | 0.835 | 0.2519 | 0.8614 | 0.835 | 0.8222 | 0.1749 | 0.0429 |
| No log | 57.0 | 399 | 0.4782 | 0.835 | 0.2513 | 0.8625 | 0.835 | 0.8222 | 0.1909 | 0.0433 |
| No log | 58.0 | 406 | 0.4794 | 0.835 | 0.2521 | 0.8602 | 0.835 | 0.8222 | 0.1758 | 0.0435 |
| No log | 59.0 | 413 | 0.4790 | 0.835 | 0.2517 | 0.8617 | 0.835 | 0.8222 | 0.1754 | 0.0432 |
| No log | 60.0 | 420 | 0.4791 | 0.835 | 0.2520 | 0.8614 | 0.835 | 0.8222 | 0.1830 | 0.0430 |
| No log | 61.0 | 427 | 0.4789 | 0.835 | 0.2518 | 0.8612 | 0.835 | 0.8222 | 0.1870 | 0.0432 |
| No log | 62.0 | 434 | 0.4792 | 0.835 | 0.2520 | 0.8620 | 0.835 | 0.8222 | 0.1902 | 0.0433 |
| No log | 63.0 | 441 | 0.4789 | 0.835 | 0.2518 | 0.8619 | 0.835 | 0.8222 | 0.1997 | 0.0431 |
| No log | 64.0 | 448 | 0.4797 | 0.835 | 0.2523 | 0.8607 | 0.835 | 0.8222 | 0.1833 | 0.0434 |
| No log | 65.0 | 455 | 0.4797 | 0.835 | 0.2522 | 0.8624 | 0.835 | 0.8222 | 0.1922 | 0.0434 |
| No log | 66.0 | 462 | 0.4791 | 0.835 | 0.2519 | 0.8620 | 0.835 | 0.8222 | 0.1894 | 0.0430 |
| No log | 67.0 | 469 | 0.4792 | 0.835 | 0.2520 | 0.8612 | 0.835 | 0.8222 | 0.1885 | 0.0433 |
| No log | 68.0 | 476 | 0.4796 | 0.835 | 0.2522 | 0.8627 | 0.835 | 0.8222 | 0.1918 | 0.0433 |
| No log | 69.0 | 483 | 0.4793 | 0.835 | 0.2521 | 0.8628 | 0.835 | 0.8222 | 0.1828 | 0.0433 |
| No log | 70.0 | 490 | 0.4792 | 0.835 | 0.2519 | 0.8622 | 0.835 | 0.8222 | 0.1918 | 0.0432 |
| No log | 71.0 | 497 | 0.4797 | 0.835 | 0.2523 | 0.8615 | 0.835 | 0.8222 | 0.1836 | 0.0434 |
| 0.194 | 72.0 | 504 | 0.4797 | 0.835 | 0.2522 | 0.8618 | 0.835 | 0.8222 | 0.1842 | 0.0433 |
| 0.194 | 73.0 | 511 | 0.4794 | 0.835 | 0.2521 | 0.8624 | 0.835 | 0.8222 | 0.1914 | 0.0432 |
| 0.194 | 74.0 | 518 | 0.4794 | 0.835 | 0.2521 | 0.8617 | 0.835 | 0.8222 | 0.1915 | 0.0431 |
| 0.194 | 75.0 | 525 | 0.4796 | 0.835 | 0.2522 | 0.8623 | 0.835 | 0.8222 | 0.1917 | 0.0434 |
| 0.194 | 76.0 | 532 | 0.4795 | 0.835 | 0.2520 | 0.8622 | 0.835 | 0.8222 | 0.1985 | 0.0433 |
| 0.194 | 77.0 | 539 | 0.4795 | 0.835 | 0.2520 | 0.8623 | 0.835 | 0.8222 | 0.1985 | 0.0432 |
| 0.194 | 78.0 | 546 | 0.4795 | 0.835 | 0.2522 | 0.8621 | 0.835 | 0.8222 | 0.1981 | 0.0432 |
| 0.194 | 79.0 | 553 | 0.4798 | 0.835 | 0.2522 | 0.8626 | 0.835 | 0.8222 | 0.1909 | 0.0433 |
| 0.194 | 80.0 | 560 | 0.4796 | 0.835 | 0.2521 | 0.8630 | 0.835 | 0.8222 | 0.1984 | 0.0433 |
| 0.194 | 81.0 | 567 | 0.4797 | 0.835 | 0.2522 | 0.8619 | 0.835 | 0.8222 | 0.1902 | 0.0434 |
| 0.194 | 82.0 | 574 | 0.4797 | 0.835 | 0.2522 | 0.8631 | 0.835 | 0.8222 | 0.1913 | 0.0433 |
| 0.194 | 83.0 | 581 | 0.4797 | 0.835 | 0.2522 | 0.8627 | 0.835 | 0.8222 | 0.1909 | 0.0433 |
| 0.194 | 84.0 | 588 | 0.4797 | 0.835 | 0.2522 | 0.8623 | 0.835 | 0.8222 | 0.1909 | 0.0433 |
| 0.194 | 85.0 | 595 | 0.4797 | 0.835 | 0.2522 | 0.8624 | 0.835 | 0.8222 | 0.1909 | 0.0434 |
| 0.194 | 86.0 | 602 | 0.4796 | 0.835 | 0.2522 | 0.8623 | 0.835 | 0.8222 | 0.1830 | 0.0433 |
| 0.194 | 87.0 | 609 | 0.4797 | 0.835 | 0.2522 | 0.8629 | 0.835 | 0.8222 | 0.1909 | 0.0434 |
| 0.194 | 88.0 | 616 | 0.4797 | 0.835 | 0.2521 | 0.8634 | 0.835 | 0.8222 | 0.1830 | 0.0433 |
| 0.194 | 89.0 | 623 | 0.4797 | 0.835 | 0.2522 | 0.8627 | 0.835 | 0.8222 | 0.1910 | 0.0434 |
| 0.194 | 90.0 | 630 | 0.4798 | 0.835 | 0.2523 | 0.8627 | 0.835 | 0.8222 | 0.1909 | 0.0434 |
| 0.194 | 91.0 | 637 | 0.4797 | 0.835 | 0.2522 | 0.8625 | 0.835 | 0.8222 | 0.1909 | 0.0434 |
| 0.194 | 92.0 | 644 | 0.4797 | 0.835 | 0.2522 | 0.8630 | 0.835 | 0.8222 | 0.1830 | 0.0434 |
| 0.194 | 93.0 | 651 | 0.4798 | 0.835 | 0.2522 | 0.8629 | 0.835 | 0.8222 | 0.1910 | 0.0434 |
| 0.194 | 94.0 | 658 | 0.4797 | 0.835 | 0.2522 | 0.8628 | 0.835 | 0.8222 | 0.1910 | 0.0434 |
| 0.194 | 95.0 | 665 | 0.4797 | 0.835 | 0.2522 | 0.8627 | 0.835 | 0.8222 | 0.1910 | 0.0434 |
| 0.194 | 96.0 | 672 | 0.4798 | 0.835 | 0.2522 | 0.8627 | 0.835 | 0.8222 | 0.1834 | 0.0435 |
| 0.194 | 97.0 | 679 | 0.4797 | 0.835 | 0.2522 | 0.8628 | 0.835 | 0.8222 | 0.1830 | 0.0434 |
| 0.194 | 98.0 | 686 | 0.4797 | 0.835 | 0.2522 | 0.8628 | 0.835 | 0.8222 | 0.1830 | 0.0434 |
| 0.194 | 99.0 | 693 | 0.4797 | 0.835 | 0.2522 | 0.8628 | 0.835 | 0.8222 | 0.1830 | 0.0434 |
| 0.194 | 100.0 | 700 | 0.4797 | 0.835 | 0.2522 | 0.8627 | 0.835 | 0.8222 | 0.1830 | 0.0434 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-small_tobacco3482_kd_CEKD_t1.5_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_tobacco3482_kd_CEKD_t1.5_a0.9
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5492
- Accuracy: 0.84
- Brier Loss: 0.2438
- Nll: 1.0175
- F1 Micro: 0.8400
- F1 Macro: 0.8329
- Ece: 0.1581
- Aurc: 0.0460
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 2.1371 | 0.215 | 0.8750 | 5.2661 | 0.2150 | 0.1261 | 0.2616 | 0.6901 |
| No log | 2.0 | 14 | 1.7146 | 0.4 | 0.7405 | 3.6392 | 0.4000 | 0.2249 | 0.2801 | 0.4047 |
| No log | 3.0 | 21 | 1.1877 | 0.625 | 0.5608 | 2.0254 | 0.625 | 0.5681 | 0.3161 | 0.2040 |
| No log | 4.0 | 28 | 0.8633 | 0.715 | 0.4058 | 1.6421 | 0.715 | 0.6656 | 0.2020 | 0.1142 |
| No log | 5.0 | 35 | 0.8597 | 0.72 | 0.3947 | 1.6962 | 0.72 | 0.7299 | 0.2181 | 0.1133 |
| No log | 6.0 | 42 | 0.7266 | 0.785 | 0.3157 | 1.6428 | 0.785 | 0.7648 | 0.2063 | 0.0758 |
| No log | 7.0 | 49 | 0.7662 | 0.77 | 0.3428 | 1.4695 | 0.7700 | 0.7666 | 0.1871 | 0.0998 |
| No log | 8.0 | 56 | 0.7824 | 0.77 | 0.3365 | 1.5995 | 0.7700 | 0.7346 | 0.1840 | 0.0980 |
| No log | 9.0 | 63 | 0.7245 | 0.805 | 0.3102 | 1.2669 | 0.805 | 0.8012 | 0.1789 | 0.0855 |
| No log | 10.0 | 70 | 0.6787 | 0.8 | 0.2944 | 1.3351 | 0.8000 | 0.7754 | 0.1578 | 0.0665 |
| No log | 11.0 | 77 | 0.6497 | 0.805 | 0.2870 | 1.3980 | 0.805 | 0.8029 | 0.1743 | 0.0709 |
| No log | 12.0 | 84 | 0.6353 | 0.82 | 0.2747 | 1.3397 | 0.82 | 0.8085 | 0.1670 | 0.0687 |
| No log | 13.0 | 91 | 0.7204 | 0.79 | 0.3163 | 1.4500 | 0.79 | 0.7945 | 0.1660 | 0.0782 |
| No log | 14.0 | 98 | 0.6632 | 0.825 | 0.2714 | 1.5658 | 0.825 | 0.8110 | 0.1827 | 0.0726 |
| No log | 15.0 | 105 | 0.6417 | 0.8 | 0.2840 | 1.3774 | 0.8000 | 0.7984 | 0.1618 | 0.0703 |
| No log | 16.0 | 112 | 0.5899 | 0.825 | 0.2687 | 1.0331 | 0.825 | 0.8220 | 0.1569 | 0.0603 |
| No log | 17.0 | 119 | 0.5924 | 0.83 | 0.2508 | 1.4167 | 0.83 | 0.8182 | 0.1414 | 0.0549 |
| No log | 18.0 | 126 | 0.5885 | 0.825 | 0.2608 | 1.1991 | 0.825 | 0.8174 | 0.1677 | 0.0607 |
| No log | 19.0 | 133 | 0.5898 | 0.82 | 0.2634 | 1.2879 | 0.82 | 0.8145 | 0.1563 | 0.0610 |
| No log | 20.0 | 140 | 0.5509 | 0.825 | 0.2439 | 1.1130 | 0.825 | 0.8127 | 0.1532 | 0.0475 |
| No log | 21.0 | 147 | 0.5719 | 0.82 | 0.2585 | 1.1331 | 0.82 | 0.8101 | 0.1640 | 0.0490 |
| No log | 22.0 | 154 | 0.5650 | 0.85 | 0.2449 | 1.2095 | 0.85 | 0.8429 | 0.1622 | 0.0595 |
| No log | 23.0 | 161 | 0.5538 | 0.83 | 0.2492 | 1.0979 | 0.83 | 0.8227 | 0.1759 | 0.0515 |
| No log | 24.0 | 168 | 0.5514 | 0.84 | 0.2396 | 1.1748 | 0.8400 | 0.8360 | 0.1449 | 0.0479 |
| No log | 25.0 | 175 | 0.5549 | 0.815 | 0.2497 | 1.0876 | 0.815 | 0.8080 | 0.1668 | 0.0502 |
| No log | 26.0 | 182 | 0.5469 | 0.84 | 0.2397 | 1.1651 | 0.8400 | 0.8317 | 0.1560 | 0.0471 |
| No log | 27.0 | 189 | 0.5584 | 0.84 | 0.2508 | 1.0605 | 0.8400 | 0.8253 | 0.1801 | 0.0486 |
| No log | 28.0 | 196 | 0.5395 | 0.845 | 0.2371 | 1.0749 | 0.845 | 0.8302 | 0.1448 | 0.0438 |
| No log | 29.0 | 203 | 0.5478 | 0.84 | 0.2436 | 1.0599 | 0.8400 | 0.8271 | 0.1556 | 0.0470 |
| No log | 30.0 | 210 | 0.5432 | 0.835 | 0.2402 | 1.0595 | 0.835 | 0.8206 | 0.1613 | 0.0457 |
| No log | 31.0 | 217 | 0.5454 | 0.83 | 0.2422 | 1.0518 | 0.83 | 0.8176 | 0.1556 | 0.0462 |
| No log | 32.0 | 224 | 0.5456 | 0.83 | 0.2415 | 1.0500 | 0.83 | 0.8176 | 0.1555 | 0.0461 |
| No log | 33.0 | 231 | 0.5471 | 0.835 | 0.2430 | 1.0492 | 0.835 | 0.8233 | 0.1616 | 0.0466 |
| No log | 34.0 | 238 | 0.5456 | 0.83 | 0.2424 | 1.0495 | 0.83 | 0.8176 | 0.1636 | 0.0467 |
| No log | 35.0 | 245 | 0.5482 | 0.835 | 0.2434 | 1.0438 | 0.835 | 0.8239 | 0.1717 | 0.0474 |
| No log | 36.0 | 252 | 0.5462 | 0.835 | 0.2425 | 1.0461 | 0.835 | 0.8239 | 0.1507 | 0.0462 |
| No log | 37.0 | 259 | 0.5488 | 0.83 | 0.2435 | 1.0468 | 0.83 | 0.8176 | 0.1377 | 0.0471 |
| No log | 38.0 | 266 | 0.5461 | 0.84 | 0.2420 | 1.0389 | 0.8400 | 0.8296 | 0.1379 | 0.0458 |
| No log | 39.0 | 273 | 0.5458 | 0.84 | 0.2423 | 1.0387 | 0.8400 | 0.8296 | 0.1545 | 0.0457 |
| No log | 40.0 | 280 | 0.5483 | 0.835 | 0.2435 | 1.0382 | 0.835 | 0.8233 | 0.1343 | 0.0466 |
| No log | 41.0 | 287 | 0.5475 | 0.835 | 0.2430 | 1.0378 | 0.835 | 0.8233 | 0.1408 | 0.0454 |
| No log | 42.0 | 294 | 0.5463 | 0.835 | 0.2424 | 1.0368 | 0.835 | 0.8233 | 0.1463 | 0.0454 |
| No log | 43.0 | 301 | 0.5467 | 0.835 | 0.2428 | 1.0335 | 0.835 | 0.8233 | 0.1453 | 0.0458 |
| No log | 44.0 | 308 | 0.5470 | 0.835 | 0.2429 | 1.0331 | 0.835 | 0.8233 | 0.1597 | 0.0459 |
| No log | 45.0 | 315 | 0.5469 | 0.835 | 0.2426 | 1.0336 | 0.835 | 0.8233 | 0.1487 | 0.0459 |
| No log | 46.0 | 322 | 0.5473 | 0.835 | 0.2431 | 1.0322 | 0.835 | 0.8233 | 0.1486 | 0.0465 |
| No log | 47.0 | 329 | 0.5464 | 0.84 | 0.2425 | 1.0324 | 0.8400 | 0.8329 | 0.1443 | 0.0454 |
| No log | 48.0 | 336 | 0.5462 | 0.835 | 0.2426 | 1.0298 | 0.835 | 0.8233 | 0.1527 | 0.0454 |
| No log | 49.0 | 343 | 0.5471 | 0.835 | 0.2427 | 1.0305 | 0.835 | 0.8233 | 0.1619 | 0.0456 |
| No log | 50.0 | 350 | 0.5479 | 0.84 | 0.2433 | 1.0304 | 0.8400 | 0.8329 | 0.1549 | 0.0457 |
| No log | 51.0 | 357 | 0.5471 | 0.835 | 0.2427 | 1.0296 | 0.835 | 0.8233 | 0.1607 | 0.0458 |
| No log | 52.0 | 364 | 0.5475 | 0.835 | 0.2431 | 1.0282 | 0.835 | 0.8233 | 0.1596 | 0.0458 |
| No log | 53.0 | 371 | 0.5474 | 0.84 | 0.2428 | 1.0294 | 0.8400 | 0.8329 | 0.1603 | 0.0457 |
| No log | 54.0 | 378 | 0.5482 | 0.835 | 0.2436 | 1.0263 | 0.835 | 0.8233 | 0.1460 | 0.0461 |
| No log | 55.0 | 385 | 0.5468 | 0.84 | 0.2424 | 1.0264 | 0.8400 | 0.8329 | 0.1491 | 0.0454 |
| No log | 56.0 | 392 | 0.5479 | 0.84 | 0.2432 | 1.0263 | 0.8400 | 0.8329 | 0.1594 | 0.0452 |
| No log | 57.0 | 399 | 0.5467 | 0.84 | 0.2426 | 1.0259 | 0.8400 | 0.8329 | 0.1476 | 0.0454 |
| No log | 58.0 | 406 | 0.5484 | 0.835 | 0.2434 | 1.0237 | 0.835 | 0.8233 | 0.1379 | 0.0463 |
| No log | 59.0 | 413 | 0.5473 | 0.835 | 0.2429 | 1.0245 | 0.835 | 0.8233 | 0.1521 | 0.0458 |
| No log | 60.0 | 420 | 0.5475 | 0.835 | 0.2430 | 1.0240 | 0.835 | 0.8233 | 0.1523 | 0.0458 |
| No log | 61.0 | 427 | 0.5475 | 0.835 | 0.2430 | 1.0239 | 0.835 | 0.8233 | 0.1438 | 0.0461 |
| No log | 62.0 | 434 | 0.5476 | 0.835 | 0.2430 | 1.0227 | 0.835 | 0.8233 | 0.1522 | 0.0461 |
| No log | 63.0 | 441 | 0.5478 | 0.835 | 0.2430 | 1.0235 | 0.835 | 0.8233 | 0.1520 | 0.0460 |
| No log | 64.0 | 448 | 0.5478 | 0.84 | 0.2432 | 1.0215 | 0.8400 | 0.8329 | 0.1576 | 0.0458 |
| No log | 65.0 | 455 | 0.5478 | 0.835 | 0.2430 | 1.0229 | 0.835 | 0.8233 | 0.1592 | 0.0461 |
| No log | 66.0 | 462 | 0.5481 | 0.84 | 0.2433 | 1.0219 | 0.8400 | 0.8329 | 0.1582 | 0.0459 |
| No log | 67.0 | 469 | 0.5482 | 0.84 | 0.2434 | 1.0214 | 0.8400 | 0.8329 | 0.1665 | 0.0456 |
| No log | 68.0 | 476 | 0.5482 | 0.835 | 0.2433 | 1.0209 | 0.835 | 0.8233 | 0.1445 | 0.0463 |
| No log | 69.0 | 483 | 0.5484 | 0.84 | 0.2435 | 1.0210 | 0.8400 | 0.8329 | 0.1578 | 0.0458 |
| No log | 70.0 | 490 | 0.5479 | 0.84 | 0.2433 | 1.0206 | 0.8400 | 0.8329 | 0.1662 | 0.0457 |
| No log | 71.0 | 497 | 0.5486 | 0.84 | 0.2435 | 1.0210 | 0.8400 | 0.8329 | 0.1401 | 0.0460 |
| 0.1783 | 72.0 | 504 | 0.5489 | 0.84 | 0.2437 | 1.0204 | 0.8400 | 0.8329 | 0.1581 | 0.0460 |
| 0.1783 | 73.0 | 511 | 0.5483 | 0.835 | 0.2435 | 1.0194 | 0.835 | 0.8233 | 0.1712 | 0.0460 |
| 0.1783 | 74.0 | 518 | 0.5489 | 0.84 | 0.2437 | 1.0198 | 0.8400 | 0.8329 | 0.1668 | 0.0461 |
| 0.1783 | 75.0 | 525 | 0.5486 | 0.84 | 0.2435 | 1.0194 | 0.8400 | 0.8329 | 0.1666 | 0.0458 |
| 0.1783 | 76.0 | 532 | 0.5487 | 0.84 | 0.2436 | 1.0194 | 0.8400 | 0.8329 | 0.1710 | 0.0458 |
| 0.1783 | 77.0 | 539 | 0.5485 | 0.84 | 0.2434 | 1.0191 | 0.8400 | 0.8329 | 0.1392 | 0.0459 |
| 0.1783 | 78.0 | 546 | 0.5486 | 0.84 | 0.2435 | 1.0191 | 0.8400 | 0.8329 | 0.1579 | 0.0458 |
| 0.1783 | 79.0 | 553 | 0.5486 | 0.84 | 0.2436 | 1.0190 | 0.8400 | 0.8329 | 0.1582 | 0.0459 |
| 0.1783 | 80.0 | 560 | 0.5492 | 0.84 | 0.2438 | 1.0194 | 0.8400 | 0.8329 | 0.1581 | 0.0461 |
| 0.1783 | 81.0 | 567 | 0.5486 | 0.84 | 0.2435 | 1.0189 | 0.8400 | 0.8329 | 0.1581 | 0.0460 |
| 0.1783 | 82.0 | 574 | 0.5489 | 0.84 | 0.2437 | 1.0185 | 0.8400 | 0.8329 | 0.1581 | 0.0460 |
| 0.1783 | 83.0 | 581 | 0.5491 | 0.84 | 0.2438 | 1.0188 | 0.8400 | 0.8329 | 0.1574 | 0.0460 |
| 0.1783 | 84.0 | 588 | 0.5490 | 0.84 | 0.2438 | 1.0183 | 0.8400 | 0.8329 | 0.1581 | 0.0461 |
| 0.1783 | 85.0 | 595 | 0.5491 | 0.84 | 0.2438 | 1.0184 | 0.8400 | 0.8329 | 0.1485 | 0.0461 |
| 0.1783 | 86.0 | 602 | 0.5492 | 0.84 | 0.2439 | 1.0177 | 0.8400 | 0.8329 | 0.1584 | 0.0461 |
| 0.1783 | 87.0 | 609 | 0.5491 | 0.84 | 0.2438 | 1.0180 | 0.8400 | 0.8329 | 0.1582 | 0.0461 |
| 0.1783 | 88.0 | 616 | 0.5493 | 0.84 | 0.2438 | 1.0180 | 0.8400 | 0.8329 | 0.1584 | 0.0462 |
| 0.1783 | 89.0 | 623 | 0.5493 | 0.84 | 0.2438 | 1.0178 | 0.8400 | 0.8329 | 0.1584 | 0.0462 |
| 0.1783 | 90.0 | 630 | 0.5490 | 0.84 | 0.2437 | 1.0180 | 0.8400 | 0.8329 | 0.1584 | 0.0461 |
| 0.1783 | 91.0 | 637 | 0.5491 | 0.84 | 0.2438 | 1.0177 | 0.8400 | 0.8329 | 0.1581 | 0.0459 |
| 0.1783 | 92.0 | 644 | 0.5492 | 0.84 | 0.2438 | 1.0177 | 0.8400 | 0.8329 | 0.1582 | 0.0461 |
| 0.1783 | 93.0 | 651 | 0.5491 | 0.84 | 0.2437 | 1.0180 | 0.8400 | 0.8329 | 0.1581 | 0.0460 |
| 0.1783 | 94.0 | 658 | 0.5491 | 0.84 | 0.2438 | 1.0180 | 0.8400 | 0.8329 | 0.1584 | 0.0461 |
| 0.1783 | 95.0 | 665 | 0.5492 | 0.84 | 0.2438 | 1.0177 | 0.8400 | 0.8329 | 0.1582 | 0.0461 |
| 0.1783 | 96.0 | 672 | 0.5492 | 0.84 | 0.2438 | 1.0176 | 0.8400 | 0.8329 | 0.1582 | 0.0461 |
| 0.1783 | 97.0 | 679 | 0.5492 | 0.84 | 0.2438 | 1.0175 | 0.8400 | 0.8329 | 0.1581 | 0.0460 |
| 0.1783 | 98.0 | 686 | 0.5491 | 0.84 | 0.2438 | 1.0175 | 0.8400 | 0.8329 | 0.1582 | 0.0461 |
| 0.1783 | 99.0 | 693 | 0.5491 | 0.84 | 0.2438 | 1.0175 | 0.8400 | 0.8329 | 0.1580 | 0.0460 |
| 0.1783 | 100.0 | 700 | 0.5492 | 0.84 | 0.2438 | 1.0175 | 0.8400 | 0.8329 | 0.1581 | 0.0460 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-small_tobacco3482_kd_CEKD_t2.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_tobacco3482_kd_CEKD_t2.5_a0.5
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4300
- Accuracy: 0.83
- Brier Loss: 0.2807
- Nll: 1.0350
- F1 Micro: 0.83
- F1 Macro: 0.8295
- Ece: 0.2287
- Aurc: 0.0560
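Brier loss and expected calibration error (Ece) above are calibration metrics derived from the softmax probabilities on the evaluation set. For reference, they can be computed roughly as follows; this is a minimal NumPy sketch, not the exact evaluation code behind this card:
```python
import numpy as np

def brier_loss(probs: np.ndarray, labels: np.ndarray) -> float:
    """Mean squared difference between predicted probabilities and one-hot targets."""
    one_hot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - one_hot) ** 2, axis=1)))

def expected_calibration_error(probs: np.ndarray, labels: np.ndarray, n_bins: int = 10) -> float:
    """Weighted average gap between confidence and accuracy over equal-width confidence bins."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    correct = (predictions == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(correct[in_bin].mean() - confidences[in_bin].mean())
    return float(ece)

# probs: (n_examples, n_classes) softmax outputs; labels: (n_examples,) integer class ids.
```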
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
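The hyperparameters above correspond roughly to the following Hugging Face `TrainingArguments`; this is a hedged sketch for orientation only, the Adam settings listed are simply the library defaults, and the actual training script is not part of this card:
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-small_tobacco3482_kd_CEKD_t2.5_a0.5",
    learning_rate=1e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    # Adam betas=(0.9, 0.999) and epsilon=1e-8 are the optimizer defaults.
    evaluation_strategy="epoch",  # assumption: the table reports one validation row per epoch
    logging_strategy="epoch",
)
```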
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 1.6525 | 0.225 | 0.8757 | 5.3231 | 0.225 | 0.1387 | 0.2689 | 0.6977 |
| No log | 2.0 | 14 | 1.3106 | 0.405 | 0.7470 | 3.3487 | 0.405 | 0.2195 | 0.2936 | 0.4032 |
| No log | 3.0 | 21 | 0.9127 | 0.585 | 0.5785 | 1.8686 | 0.585 | 0.5142 | 0.2974 | 0.2067 |
| No log | 4.0 | 28 | 0.7280 | 0.715 | 0.4339 | 1.6780 | 0.715 | 0.6761 | 0.2672 | 0.1204 |
| No log | 5.0 | 35 | 0.6523 | 0.775 | 0.3676 | 1.6537 | 0.775 | 0.7619 | 0.2554 | 0.0929 |
| No log | 6.0 | 42 | 0.5888 | 0.785 | 0.3502 | 1.3926 | 0.785 | 0.7538 | 0.2277 | 0.0908 |
| No log | 7.0 | 49 | 0.6113 | 0.805 | 0.3326 | 1.7118 | 0.805 | 0.7903 | 0.2428 | 0.0803 |
| No log | 8.0 | 56 | 0.5404 | 0.785 | 0.3178 | 1.1557 | 0.785 | 0.7671 | 0.2183 | 0.0716 |
| No log | 9.0 | 63 | 0.5380 | 0.82 | 0.3051 | 1.3231 | 0.82 | 0.8072 | 0.2168 | 0.0773 |
| No log | 10.0 | 70 | 0.6035 | 0.775 | 0.3508 | 1.3888 | 0.775 | 0.7682 | 0.2191 | 0.0812 |
| No log | 11.0 | 77 | 0.5473 | 0.795 | 0.3202 | 1.2622 | 0.795 | 0.7740 | 0.2303 | 0.0626 |
| No log | 12.0 | 84 | 0.4860 | 0.825 | 0.2937 | 1.3575 | 0.825 | 0.8053 | 0.2392 | 0.0727 |
| No log | 13.0 | 91 | 0.5046 | 0.81 | 0.3032 | 1.1857 | 0.81 | 0.8086 | 0.2248 | 0.0564 |
| No log | 14.0 | 98 | 0.4745 | 0.825 | 0.2870 | 1.2338 | 0.825 | 0.8089 | 0.2441 | 0.0518 |
| No log | 15.0 | 105 | 0.4764 | 0.81 | 0.2943 | 1.0325 | 0.81 | 0.8110 | 0.1935 | 0.0556 |
| No log | 16.0 | 112 | 0.4918 | 0.81 | 0.3062 | 1.0551 | 0.81 | 0.8015 | 0.2198 | 0.0587 |
| No log | 17.0 | 119 | 0.4757 | 0.815 | 0.2970 | 1.4203 | 0.815 | 0.7965 | 0.2263 | 0.0850 |
| No log | 18.0 | 126 | 0.4586 | 0.825 | 0.2926 | 1.0361 | 0.825 | 0.8268 | 0.2279 | 0.0583 |
| No log | 19.0 | 133 | 0.4503 | 0.835 | 0.2855 | 1.1476 | 0.835 | 0.8301 | 0.2392 | 0.0589 |
| No log | 20.0 | 140 | 0.4780 | 0.805 | 0.3105 | 0.9928 | 0.805 | 0.7902 | 0.1988 | 0.0775 |
| No log | 21.0 | 147 | 0.4965 | 0.8 | 0.3205 | 1.1887 | 0.8000 | 0.8029 | 0.2410 | 0.0702 |
| No log | 22.0 | 154 | 0.4753 | 0.815 | 0.3016 | 0.9609 | 0.815 | 0.8169 | 0.2163 | 0.0580 |
| No log | 23.0 | 161 | 0.4733 | 0.8 | 0.3074 | 1.2566 | 0.8000 | 0.8001 | 0.2162 | 0.0704 |
| No log | 24.0 | 168 | 0.4472 | 0.815 | 0.2888 | 1.0352 | 0.815 | 0.8187 | 0.2317 | 0.0590 |
| No log | 25.0 | 175 | 0.4434 | 0.815 | 0.2854 | 0.9874 | 0.815 | 0.8186 | 0.2149 | 0.0554 |
| No log | 26.0 | 182 | 0.4316 | 0.82 | 0.2754 | 1.0477 | 0.82 | 0.8267 | 0.2195 | 0.0508 |
| No log | 27.0 | 189 | 0.4276 | 0.83 | 0.2751 | 1.1016 | 0.83 | 0.8336 | 0.2050 | 0.0525 |
| No log | 28.0 | 196 | 0.4329 | 0.82 | 0.2795 | 1.0537 | 0.82 | 0.8220 | 0.2158 | 0.0611 |
| No log | 29.0 | 203 | 0.4327 | 0.82 | 0.2827 | 1.1766 | 0.82 | 0.8237 | 0.2024 | 0.0603 |
| No log | 30.0 | 210 | 0.4317 | 0.82 | 0.2820 | 1.0331 | 0.82 | 0.8219 | 0.2083 | 0.0611 |
| No log | 31.0 | 217 | 0.4316 | 0.82 | 0.2803 | 1.0974 | 0.82 | 0.8263 | 0.1984 | 0.0575 |
| No log | 32.0 | 224 | 0.4340 | 0.82 | 0.2833 | 1.0384 | 0.82 | 0.8240 | 0.2202 | 0.0590 |
| No log | 33.0 | 231 | 0.4333 | 0.81 | 0.2824 | 1.0355 | 0.81 | 0.8160 | 0.2103 | 0.0586 |
| No log | 34.0 | 238 | 0.4309 | 0.83 | 0.2817 | 1.1015 | 0.83 | 0.8307 | 0.2107 | 0.0577 |
| No log | 35.0 | 245 | 0.4321 | 0.82 | 0.2817 | 1.0359 | 0.82 | 0.8229 | 0.2147 | 0.0590 |
| No log | 36.0 | 252 | 0.4304 | 0.825 | 0.2802 | 1.1016 | 0.825 | 0.8257 | 0.2137 | 0.0569 |
| No log | 37.0 | 259 | 0.4303 | 0.825 | 0.2811 | 1.0990 | 0.825 | 0.8268 | 0.2149 | 0.0581 |
| No log | 38.0 | 266 | 0.4314 | 0.825 | 0.2814 | 1.1003 | 0.825 | 0.8257 | 0.2163 | 0.0581 |
| No log | 39.0 | 273 | 0.4302 | 0.82 | 0.2806 | 1.1007 | 0.82 | 0.8226 | 0.2102 | 0.0576 |
| No log | 40.0 | 280 | 0.4307 | 0.825 | 0.2809 | 1.0376 | 0.825 | 0.8264 | 0.2049 | 0.0573 |
| No log | 41.0 | 287 | 0.4303 | 0.82 | 0.2808 | 1.0434 | 0.82 | 0.8226 | 0.2096 | 0.0574 |
| No log | 42.0 | 294 | 0.4310 | 0.825 | 0.2817 | 1.0376 | 0.825 | 0.8268 | 0.2140 | 0.0580 |
| No log | 43.0 | 301 | 0.4310 | 0.825 | 0.2813 | 1.0391 | 0.825 | 0.8257 | 0.2147 | 0.0580 |
| No log | 44.0 | 308 | 0.4301 | 0.825 | 0.2808 | 1.0389 | 0.825 | 0.8257 | 0.2064 | 0.0573 |
| No log | 45.0 | 315 | 0.4305 | 0.83 | 0.2811 | 1.0419 | 0.83 | 0.8307 | 0.2300 | 0.0577 |
| No log | 46.0 | 322 | 0.4303 | 0.82 | 0.2808 | 1.0423 | 0.82 | 0.8226 | 0.2197 | 0.0582 |
| No log | 47.0 | 329 | 0.4304 | 0.825 | 0.2811 | 1.0405 | 0.825 | 0.8257 | 0.2240 | 0.0580 |
| No log | 48.0 | 336 | 0.4300 | 0.82 | 0.2805 | 1.0407 | 0.82 | 0.8226 | 0.2105 | 0.0574 |
| No log | 49.0 | 343 | 0.4307 | 0.825 | 0.2812 | 1.0381 | 0.825 | 0.8257 | 0.2252 | 0.0577 |
| No log | 50.0 | 350 | 0.4304 | 0.82 | 0.2810 | 1.0422 | 0.82 | 0.8226 | 0.2353 | 0.0578 |
| No log | 51.0 | 357 | 0.4310 | 0.825 | 0.2813 | 1.0382 | 0.825 | 0.8264 | 0.2153 | 0.0569 |
| No log | 52.0 | 364 | 0.4309 | 0.82 | 0.2814 | 1.0380 | 0.82 | 0.8226 | 0.2282 | 0.0574 |
| No log | 53.0 | 371 | 0.4307 | 0.825 | 0.2813 | 1.0357 | 0.825 | 0.8264 | 0.2250 | 0.0568 |
| No log | 54.0 | 378 | 0.4305 | 0.82 | 0.2810 | 1.0366 | 0.82 | 0.8226 | 0.2284 | 0.0575 |
| No log | 55.0 | 385 | 0.4304 | 0.825 | 0.2811 | 1.0351 | 0.825 | 0.8264 | 0.2241 | 0.0566 |
| No log | 56.0 | 392 | 0.4308 | 0.825 | 0.2813 | 1.0369 | 0.825 | 0.8257 | 0.2414 | 0.0572 |
| No log | 57.0 | 399 | 0.4305 | 0.825 | 0.2810 | 1.0356 | 0.825 | 0.8257 | 0.2322 | 0.0571 |
| No log | 58.0 | 406 | 0.4302 | 0.82 | 0.2808 | 1.0359 | 0.82 | 0.8226 | 0.2368 | 0.0569 |
| No log | 59.0 | 413 | 0.4302 | 0.82 | 0.2809 | 1.0346 | 0.82 | 0.8226 | 0.2271 | 0.0569 |
| No log | 60.0 | 420 | 0.4303 | 0.82 | 0.2809 | 1.0357 | 0.82 | 0.8226 | 0.2272 | 0.0570 |
| No log | 61.0 | 427 | 0.4304 | 0.825 | 0.2810 | 1.0360 | 0.825 | 0.8257 | 0.2325 | 0.0569 |
| No log | 62.0 | 434 | 0.4303 | 0.825 | 0.2809 | 1.0360 | 0.825 | 0.8257 | 0.2321 | 0.0568 |
| No log | 63.0 | 441 | 0.4303 | 0.83 | 0.2809 | 1.0356 | 0.83 | 0.8295 | 0.2300 | 0.0562 |
| No log | 64.0 | 448 | 0.4304 | 0.825 | 0.2810 | 1.0347 | 0.825 | 0.8264 | 0.2242 | 0.0564 |
| No log | 65.0 | 455 | 0.4301 | 0.83 | 0.2808 | 1.0361 | 0.83 | 0.8295 | 0.2384 | 0.0564 |
| No log | 66.0 | 462 | 0.4303 | 0.83 | 0.2810 | 1.0359 | 0.83 | 0.8295 | 0.2293 | 0.0563 |
| No log | 67.0 | 469 | 0.4302 | 0.83 | 0.2809 | 1.0360 | 0.83 | 0.8295 | 0.2386 | 0.0564 |
| No log | 68.0 | 476 | 0.4304 | 0.83 | 0.2810 | 1.0360 | 0.83 | 0.8295 | 0.2384 | 0.0563 |
| No log | 69.0 | 483 | 0.4305 | 0.83 | 0.2812 | 1.0355 | 0.83 | 0.8295 | 0.2295 | 0.0564 |
| No log | 70.0 | 490 | 0.4302 | 0.825 | 0.2808 | 1.0354 | 0.825 | 0.8264 | 0.2239 | 0.0561 |
| No log | 71.0 | 497 | 0.4305 | 0.83 | 0.2812 | 1.0352 | 0.83 | 0.8295 | 0.2296 | 0.0564 |
| 0.1776 | 72.0 | 504 | 0.4303 | 0.83 | 0.2808 | 1.0356 | 0.83 | 0.8295 | 0.2287 | 0.0561 |
| 0.1776 | 73.0 | 511 | 0.4301 | 0.825 | 0.2807 | 1.0351 | 0.825 | 0.8264 | 0.2348 | 0.0563 |
| 0.1776 | 74.0 | 518 | 0.4304 | 0.83 | 0.2811 | 1.0353 | 0.83 | 0.8295 | 0.2195 | 0.0562 |
| 0.1776 | 75.0 | 525 | 0.4301 | 0.825 | 0.2808 | 1.0355 | 0.825 | 0.8257 | 0.2320 | 0.0568 |
| 0.1776 | 76.0 | 532 | 0.4302 | 0.83 | 0.2808 | 1.0348 | 0.83 | 0.8295 | 0.2289 | 0.0561 |
| 0.1776 | 77.0 | 539 | 0.4301 | 0.83 | 0.2808 | 1.0355 | 0.83 | 0.8295 | 0.2300 | 0.0562 |
| 0.1776 | 78.0 | 546 | 0.4301 | 0.83 | 0.2808 | 1.0354 | 0.83 | 0.8295 | 0.2394 | 0.0563 |
| 0.1776 | 79.0 | 553 | 0.4302 | 0.83 | 0.2809 | 1.0346 | 0.83 | 0.8295 | 0.2287 | 0.0560 |
| 0.1776 | 80.0 | 560 | 0.4302 | 0.83 | 0.2809 | 1.0353 | 0.83 | 0.8295 | 0.2299 | 0.0563 |
| 0.1776 | 81.0 | 567 | 0.4302 | 0.83 | 0.2809 | 1.0350 | 0.83 | 0.8295 | 0.2299 | 0.0563 |
| 0.1776 | 82.0 | 574 | 0.4302 | 0.83 | 0.2808 | 1.0354 | 0.83 | 0.8295 | 0.2298 | 0.0560 |
| 0.1776 | 83.0 | 581 | 0.4302 | 0.83 | 0.2809 | 1.0350 | 0.83 | 0.8295 | 0.2299 | 0.0561 |
| 0.1776 | 84.0 | 588 | 0.4299 | 0.83 | 0.2807 | 1.0352 | 0.83 | 0.8295 | 0.2287 | 0.0561 |
| 0.1776 | 85.0 | 595 | 0.4301 | 0.83 | 0.2808 | 1.0349 | 0.83 | 0.8295 | 0.2296 | 0.0562 |
| 0.1776 | 86.0 | 602 | 0.4301 | 0.83 | 0.2808 | 1.0351 | 0.83 | 0.8295 | 0.2287 | 0.0562 |
| 0.1776 | 87.0 | 609 | 0.4300 | 0.83 | 0.2807 | 1.0351 | 0.83 | 0.8295 | 0.2297 | 0.0561 |
| 0.1776 | 88.0 | 616 | 0.4300 | 0.83 | 0.2807 | 1.0349 | 0.83 | 0.8295 | 0.2287 | 0.0562 |
| 0.1776 | 89.0 | 623 | 0.4300 | 0.83 | 0.2807 | 1.0353 | 0.83 | 0.8295 | 0.2296 | 0.0560 |
| 0.1776 | 90.0 | 630 | 0.4300 | 0.83 | 0.2807 | 1.0349 | 0.83 | 0.8295 | 0.2297 | 0.0559 |
| 0.1776 | 91.0 | 637 | 0.4300 | 0.83 | 0.2807 | 1.0352 | 0.83 | 0.8295 | 0.2296 | 0.0562 |
| 0.1776 | 92.0 | 644 | 0.4300 | 0.83 | 0.2807 | 1.0351 | 0.83 | 0.8295 | 0.2287 | 0.0561 |
| 0.1776 | 93.0 | 651 | 0.4300 | 0.83 | 0.2807 | 1.0351 | 0.83 | 0.8295 | 0.2297 | 0.0562 |
| 0.1776 | 94.0 | 658 | 0.4300 | 0.83 | 0.2807 | 1.0349 | 0.83 | 0.8295 | 0.2297 | 0.0560 |
| 0.1776 | 95.0 | 665 | 0.4300 | 0.83 | 0.2807 | 1.0350 | 0.83 | 0.8295 | 0.2297 | 0.0562 |
| 0.1776 | 96.0 | 672 | 0.4300 | 0.83 | 0.2807 | 1.0349 | 0.83 | 0.8295 | 0.2296 | 0.0561 |
| 0.1776 | 97.0 | 679 | 0.4300 | 0.83 | 0.2807 | 1.0350 | 0.83 | 0.8295 | 0.2296 | 0.0560 |
| 0.1776 | 98.0 | 686 | 0.4300 | 0.83 | 0.2807 | 1.0350 | 0.83 | 0.8295 | 0.2296 | 0.0560 |
| 0.1776 | 99.0 | 693 | 0.4300 | 0.83 | 0.2807 | 1.0350 | 0.83 | 0.8295 | 0.2287 | 0.0560 |
| 0.1776 | 100.0 | 700 | 0.4300 | 0.83 | 0.2807 | 1.0350 | 0.83 | 0.8295 | 0.2287 | 0.0560 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-small_tobacco3482_kd_CEKD_t2.5_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_tobacco3482_kd_CEKD_t2.5_a0.7
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5012
- Accuracy: 0.845
- Brier Loss: 0.2630
- Nll: 0.9559
- F1 Micro: 0.845
- F1 Macro: 0.8409
- Ece: 0.2081
- Aurc: 0.0487
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
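The model name (`kd_CEKD_t2.5_a0.7`) suggests the objective combined cross-entropy with a knowledge-distillation term at temperature 2.5 and a mixing weight of 0.7. The card does not document the loss itself, so the PyTorch fragment below is only a plausible sketch of such a CE+KD objective, with the exact role of `alpha` assumed:
```python
import torch
import torch.nn.functional as F

def cekd_loss(student_logits, teacher_logits, labels, temperature=2.5, alpha=0.7):
    """Plausible CE + KD objective implied by the model name; not taken from the training code."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2  # usual Hinton-style rescaling of the soft-target term
    # Assumption: alpha weights the cross-entropy term; the naming convention does not say.
    return alpha * ce + (1.0 - alpha) * kd
```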
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 1.8874 | 0.22 | 0.8753 | 5.3264 | 0.22 | 0.1297 | 0.2491 | 0.6946 |
| No log | 2.0 | 14 | 1.5114 | 0.405 | 0.7436 | 3.4265 | 0.405 | 0.2289 | 0.2861 | 0.4056 |
| No log | 3.0 | 21 | 1.0555 | 0.61 | 0.5700 | 1.9393 | 0.61 | 0.5508 | 0.3108 | 0.2025 |
| No log | 4.0 | 28 | 0.8188 | 0.695 | 0.4182 | 1.7503 | 0.695 | 0.6308 | 0.2374 | 0.1153 |
| No log | 5.0 | 35 | 0.8072 | 0.74 | 0.3863 | 1.7844 | 0.74 | 0.7491 | 0.2368 | 0.1059 |
| No log | 6.0 | 42 | 0.7259 | 0.745 | 0.3520 | 1.6780 | 0.745 | 0.7403 | 0.2147 | 0.0935 |
| No log | 7.0 | 49 | 0.6764 | 0.805 | 0.3188 | 1.5699 | 0.805 | 0.7739 | 0.2217 | 0.0875 |
| No log | 8.0 | 56 | 0.6437 | 0.805 | 0.3103 | 1.2385 | 0.805 | 0.7880 | 0.2198 | 0.0667 |
| No log | 9.0 | 63 | 0.6173 | 0.79 | 0.3033 | 1.3362 | 0.79 | 0.7910 | 0.2158 | 0.0688 |
| No log | 10.0 | 70 | 0.5831 | 0.83 | 0.2846 | 1.1457 | 0.83 | 0.8211 | 0.2293 | 0.0530 |
| No log | 11.0 | 77 | 0.6568 | 0.775 | 0.3281 | 1.4185 | 0.775 | 0.7253 | 0.2128 | 0.0708 |
| No log | 12.0 | 84 | 0.5838 | 0.815 | 0.2859 | 1.4799 | 0.815 | 0.8079 | 0.2191 | 0.0518 |
| No log | 13.0 | 91 | 0.6773 | 0.75 | 0.3546 | 1.1980 | 0.75 | 0.7125 | 0.1970 | 0.0977 |
| No log | 14.0 | 98 | 0.5809 | 0.82 | 0.2976 | 1.1146 | 0.82 | 0.8166 | 0.2240 | 0.0850 |
| No log | 15.0 | 105 | 0.5589 | 0.8 | 0.2913 | 1.0169 | 0.8000 | 0.7942 | 0.2018 | 0.0655 |
| No log | 16.0 | 112 | 0.5495 | 0.835 | 0.2857 | 1.1661 | 0.835 | 0.8217 | 0.1943 | 0.0800 |
| No log | 17.0 | 119 | 0.5280 | 0.83 | 0.2697 | 1.0304 | 0.83 | 0.8314 | 0.2216 | 0.0511 |
| No log | 18.0 | 126 | 0.5174 | 0.85 | 0.2583 | 1.1388 | 0.85 | 0.8392 | 0.1992 | 0.0540 |
| No log | 19.0 | 133 | 0.5271 | 0.81 | 0.2838 | 0.9360 | 0.81 | 0.8129 | 0.1742 | 0.0622 |
| No log | 20.0 | 140 | 0.5098 | 0.845 | 0.2604 | 1.1335 | 0.845 | 0.8353 | 0.2040 | 0.0599 |
| No log | 21.0 | 147 | 0.5219 | 0.82 | 0.2725 | 0.9910 | 0.82 | 0.8145 | 0.1831 | 0.0497 |
| No log | 22.0 | 154 | 0.5195 | 0.835 | 0.2706 | 1.0873 | 0.835 | 0.8390 | 0.1908 | 0.0515 |
| No log | 23.0 | 161 | 0.5122 | 0.835 | 0.2666 | 1.0898 | 0.835 | 0.8399 | 0.1998 | 0.0487 |
| No log | 24.0 | 168 | 0.5158 | 0.825 | 0.2723 | 1.0534 | 0.825 | 0.8210 | 0.1811 | 0.0507 |
| No log | 25.0 | 175 | 0.5059 | 0.825 | 0.2654 | 0.9966 | 0.825 | 0.8212 | 0.1910 | 0.0487 |
| No log | 26.0 | 182 | 0.5033 | 0.825 | 0.2648 | 0.9836 | 0.825 | 0.8212 | 0.1768 | 0.0516 |
| No log | 27.0 | 189 | 0.5114 | 0.835 | 0.2703 | 0.9847 | 0.835 | 0.8353 | 0.2040 | 0.0512 |
| No log | 28.0 | 196 | 0.5047 | 0.84 | 0.2654 | 0.9774 | 0.8400 | 0.8359 | 0.1868 | 0.0494 |
| No log | 29.0 | 203 | 0.5027 | 0.84 | 0.2656 | 0.9674 | 0.8400 | 0.8359 | 0.1820 | 0.0502 |
| No log | 30.0 | 210 | 0.5035 | 0.835 | 0.2660 | 0.9606 | 0.835 | 0.8296 | 0.1781 | 0.0494 |
| No log | 31.0 | 217 | 0.5010 | 0.835 | 0.2642 | 0.9628 | 0.835 | 0.8296 | 0.1893 | 0.0487 |
| No log | 32.0 | 224 | 0.5032 | 0.835 | 0.2652 | 0.9705 | 0.835 | 0.8296 | 0.1913 | 0.0494 |
| No log | 33.0 | 231 | 0.5052 | 0.825 | 0.2664 | 0.9657 | 0.825 | 0.8231 | 0.1882 | 0.0503 |
| No log | 34.0 | 238 | 0.5047 | 0.825 | 0.2667 | 0.9605 | 0.825 | 0.8168 | 0.1938 | 0.0508 |
| No log | 35.0 | 245 | 0.5019 | 0.835 | 0.2642 | 0.9596 | 0.835 | 0.8296 | 0.1846 | 0.0491 |
| No log | 36.0 | 252 | 0.5035 | 0.835 | 0.2648 | 0.9646 | 0.835 | 0.8296 | 0.2064 | 0.0492 |
| No log | 37.0 | 259 | 0.5020 | 0.835 | 0.2645 | 0.9589 | 0.835 | 0.8296 | 0.2036 | 0.0491 |
| No log | 38.0 | 266 | 0.5023 | 0.83 | 0.2642 | 0.9595 | 0.83 | 0.8262 | 0.1798 | 0.0495 |
| No log | 39.0 | 273 | 0.5023 | 0.835 | 0.2643 | 0.9615 | 0.835 | 0.8296 | 0.1921 | 0.0491 |
| No log | 40.0 | 280 | 0.5024 | 0.835 | 0.2645 | 0.9589 | 0.835 | 0.8292 | 0.1813 | 0.0499 |
| No log | 41.0 | 287 | 0.5018 | 0.835 | 0.2638 | 0.9583 | 0.835 | 0.8296 | 0.1743 | 0.0492 |
| No log | 42.0 | 294 | 0.5018 | 0.83 | 0.2640 | 0.9592 | 0.83 | 0.8266 | 0.1839 | 0.0498 |
| No log | 43.0 | 301 | 0.5033 | 0.84 | 0.2650 | 0.9588 | 0.8400 | 0.8322 | 0.1914 | 0.0497 |
| No log | 44.0 | 308 | 0.5009 | 0.83 | 0.2632 | 0.9603 | 0.83 | 0.8266 | 0.1652 | 0.0495 |
| No log | 45.0 | 315 | 0.5049 | 0.835 | 0.2659 | 0.9587 | 0.835 | 0.8344 | 0.2130 | 0.0502 |
| No log | 46.0 | 322 | 0.5018 | 0.835 | 0.2637 | 0.9592 | 0.835 | 0.8344 | 0.1685 | 0.0496 |
| No log | 47.0 | 329 | 0.5009 | 0.835 | 0.2632 | 0.9578 | 0.835 | 0.8296 | 0.1971 | 0.0491 |
| No log | 48.0 | 336 | 0.5022 | 0.835 | 0.2641 | 0.9574 | 0.835 | 0.8288 | 0.2050 | 0.0495 |
| No log | 49.0 | 343 | 0.5017 | 0.835 | 0.2635 | 0.9586 | 0.835 | 0.8349 | 0.2025 | 0.0493 |
| No log | 50.0 | 350 | 0.5022 | 0.84 | 0.2640 | 0.9572 | 0.8400 | 0.8322 | 0.1926 | 0.0493 |
| No log | 51.0 | 357 | 0.5022 | 0.835 | 0.2638 | 0.9591 | 0.835 | 0.8288 | 0.1948 | 0.0494 |
| No log | 52.0 | 364 | 0.5025 | 0.835 | 0.2640 | 0.9573 | 0.835 | 0.8344 | 0.2119 | 0.0497 |
| No log | 53.0 | 371 | 0.5022 | 0.835 | 0.2638 | 0.9575 | 0.835 | 0.8292 | 0.2176 | 0.0497 |
| No log | 54.0 | 378 | 0.5020 | 0.83 | 0.2637 | 0.9575 | 0.83 | 0.8257 | 0.1935 | 0.0498 |
| No log | 55.0 | 385 | 0.5024 | 0.835 | 0.2640 | 0.9572 | 0.835 | 0.8288 | 0.2040 | 0.0491 |
| No log | 56.0 | 392 | 0.5023 | 0.835 | 0.2639 | 0.9581 | 0.835 | 0.8288 | 0.2123 | 0.0492 |
| No log | 57.0 | 399 | 0.5018 | 0.84 | 0.2635 | 0.9575 | 0.8400 | 0.8322 | 0.2086 | 0.0487 |
| No log | 58.0 | 406 | 0.5023 | 0.835 | 0.2639 | 0.9568 | 0.835 | 0.8292 | 0.2094 | 0.0496 |
| No log | 59.0 | 413 | 0.5016 | 0.83 | 0.2633 | 0.9563 | 0.83 | 0.8257 | 0.1930 | 0.0493 |
| No log | 60.0 | 420 | 0.5015 | 0.84 | 0.2633 | 0.9565 | 0.8400 | 0.8322 | 0.2004 | 0.0488 |
| No log | 61.0 | 427 | 0.5017 | 0.84 | 0.2635 | 0.9559 | 0.8400 | 0.8322 | 0.2004 | 0.0491 |
| No log | 62.0 | 434 | 0.5018 | 0.83 | 0.2635 | 0.9563 | 0.83 | 0.8257 | 0.1994 | 0.0497 |
| No log | 63.0 | 441 | 0.5020 | 0.835 | 0.2636 | 0.9572 | 0.835 | 0.8288 | 0.2040 | 0.0490 |
| No log | 64.0 | 448 | 0.5020 | 0.835 | 0.2636 | 0.9565 | 0.835 | 0.8288 | 0.2036 | 0.0490 |
| No log | 65.0 | 455 | 0.5017 | 0.835 | 0.2634 | 0.9566 | 0.835 | 0.8288 | 0.1959 | 0.0490 |
| No log | 66.0 | 462 | 0.5018 | 0.84 | 0.2635 | 0.9561 | 0.8400 | 0.8322 | 0.2012 | 0.0491 |
| No log | 67.0 | 469 | 0.5016 | 0.84 | 0.2633 | 0.9566 | 0.8400 | 0.8322 | 0.1946 | 0.0489 |
| No log | 68.0 | 476 | 0.5016 | 0.84 | 0.2633 | 0.9565 | 0.8400 | 0.8322 | 0.1946 | 0.0488 |
| No log | 69.0 | 483 | 0.5018 | 0.835 | 0.2634 | 0.9567 | 0.835 | 0.8288 | 0.1955 | 0.0490 |
| No log | 70.0 | 490 | 0.5012 | 0.84 | 0.2631 | 0.9562 | 0.8400 | 0.8322 | 0.1945 | 0.0488 |
| No log | 71.0 | 497 | 0.5018 | 0.835 | 0.2635 | 0.9568 | 0.835 | 0.8288 | 0.1958 | 0.0491 |
| 0.1944 | 72.0 | 504 | 0.5016 | 0.84 | 0.2633 | 0.9562 | 0.8400 | 0.8322 | 0.2034 | 0.0490 |
| 0.1944 | 73.0 | 511 | 0.5013 | 0.84 | 0.2632 | 0.9558 | 0.8400 | 0.8322 | 0.1944 | 0.0488 |
| 0.1944 | 74.0 | 518 | 0.5013 | 0.84 | 0.2631 | 0.9562 | 0.8400 | 0.8322 | 0.1943 | 0.0487 |
| 0.1944 | 75.0 | 525 | 0.5016 | 0.835 | 0.2633 | 0.9560 | 0.835 | 0.8344 | 0.2035 | 0.0495 |
| 0.1944 | 76.0 | 532 | 0.5018 | 0.84 | 0.2634 | 0.9563 | 0.8400 | 0.8322 | 0.2093 | 0.0487 |
| 0.1944 | 77.0 | 539 | 0.5012 | 0.84 | 0.2630 | 0.9565 | 0.8400 | 0.8322 | 0.1941 | 0.0488 |
| 0.1944 | 78.0 | 546 | 0.5015 | 0.84 | 0.2632 | 0.9561 | 0.8400 | 0.8375 | 0.2008 | 0.0489 |
| 0.1944 | 79.0 | 553 | 0.5016 | 0.835 | 0.2633 | 0.9560 | 0.835 | 0.8288 | 0.1957 | 0.0490 |
| 0.1944 | 80.0 | 560 | 0.5015 | 0.84 | 0.2631 | 0.9568 | 0.8400 | 0.8375 | 0.2093 | 0.0488 |
| 0.1944 | 81.0 | 567 | 0.5015 | 0.835 | 0.2632 | 0.9561 | 0.835 | 0.8288 | 0.1957 | 0.0491 |
| 0.1944 | 82.0 | 574 | 0.5014 | 0.835 | 0.2631 | 0.9565 | 0.835 | 0.8288 | 0.1949 | 0.0489 |
| 0.1944 | 83.0 | 581 | 0.5015 | 0.835 | 0.2632 | 0.9563 | 0.835 | 0.8288 | 0.1957 | 0.0490 |
| 0.1944 | 84.0 | 588 | 0.5015 | 0.84 | 0.2632 | 0.9559 | 0.8400 | 0.8322 | 0.2031 | 0.0488 |
| 0.1944 | 85.0 | 595 | 0.5012 | 0.84 | 0.2630 | 0.9560 | 0.8400 | 0.8322 | 0.1944 | 0.0488 |
| 0.1944 | 86.0 | 602 | 0.5012 | 0.84 | 0.2630 | 0.9561 | 0.8400 | 0.8322 | 0.1944 | 0.0488 |
| 0.1944 | 87.0 | 609 | 0.5012 | 0.84 | 0.2630 | 0.9562 | 0.8400 | 0.8322 | 0.1943 | 0.0488 |
| 0.1944 | 88.0 | 616 | 0.5012 | 0.84 | 0.2630 | 0.9561 | 0.8400 | 0.8322 | 0.2030 | 0.0488 |
| 0.1944 | 89.0 | 623 | 0.5013 | 0.845 | 0.2631 | 0.9559 | 0.845 | 0.8409 | 0.1995 | 0.0488 |
| 0.1944 | 90.0 | 630 | 0.5013 | 0.845 | 0.2631 | 0.9559 | 0.845 | 0.8409 | 0.1995 | 0.0488 |
| 0.1944 | 91.0 | 637 | 0.5012 | 0.845 | 0.2630 | 0.9559 | 0.845 | 0.8409 | 0.1994 | 0.0487 |
| 0.1944 | 92.0 | 644 | 0.5013 | 0.845 | 0.2631 | 0.9561 | 0.845 | 0.8409 | 0.1995 | 0.0487 |
| 0.1944 | 93.0 | 651 | 0.5012 | 0.84 | 0.2630 | 0.9560 | 0.8400 | 0.8322 | 0.2031 | 0.0488 |
| 0.1944 | 94.0 | 658 | 0.5013 | 0.84 | 0.2630 | 0.9558 | 0.8400 | 0.8322 | 0.1944 | 0.0488 |
| 0.1944 | 95.0 | 665 | 0.5012 | 0.84 | 0.2630 | 0.9558 | 0.8400 | 0.8322 | 0.1944 | 0.0488 |
| 0.1944 | 96.0 | 672 | 0.5012 | 0.84 | 0.2630 | 0.9558 | 0.8400 | 0.8322 | 0.1944 | 0.0488 |
| 0.1944 | 97.0 | 679 | 0.5012 | 0.845 | 0.2630 | 0.9559 | 0.845 | 0.8409 | 0.1994 | 0.0487 |
| 0.1944 | 98.0 | 686 | 0.5012 | 0.84 | 0.2630 | 0.9559 | 0.8400 | 0.8322 | 0.1944 | 0.0488 |
| 0.1944 | 99.0 | 693 | 0.5012 | 0.845 | 0.2630 | 0.9560 | 0.845 | 0.8409 | 0.2081 | 0.0487 |
| 0.1944 | 100.0 | 700 | 0.5012 | 0.845 | 0.2630 | 0.9559 | 0.845 | 0.8409 | 0.2081 | 0.0487 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-small_tobacco3482_kd_CEKD_t2.5_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_tobacco3482_kd_CEKD_t2.5_a0.9
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5446
- Accuracy: 0.85
- Brier Loss: 0.2446
- Nll: 1.0816
- F1 Micro: 0.85
- F1 Macro: 0.8348
- Ece: 0.1474
- Aurc: 0.0436
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 2.1216 | 0.215 | 0.8751 | 5.2864 | 0.2150 | 0.1264 | 0.2697 | 0.6907 |
| No log | 2.0 | 14 | 1.7056 | 0.405 | 0.7400 | 3.5721 | 0.405 | 0.2275 | 0.2995 | 0.4011 |
| No log | 3.0 | 21 | 1.1857 | 0.62 | 0.5612 | 2.0143 | 0.62 | 0.5712 | 0.2994 | 0.2024 |
| No log | 4.0 | 28 | 0.8767 | 0.705 | 0.4085 | 1.6918 | 0.705 | 0.6436 | 0.2231 | 0.1152 |
| No log | 5.0 | 35 | 0.8620 | 0.72 | 0.3878 | 1.7931 | 0.72 | 0.7294 | 0.2233 | 0.1076 |
| No log | 6.0 | 42 | 0.7517 | 0.775 | 0.3252 | 1.5573 | 0.775 | 0.7600 | 0.1970 | 0.0790 |
| No log | 7.0 | 49 | 0.7280 | 0.79 | 0.3175 | 1.5140 | 0.79 | 0.7742 | 0.1903 | 0.0826 |
| No log | 8.0 | 56 | 0.6848 | 0.8 | 0.2942 | 1.4438 | 0.8000 | 0.7902 | 0.1828 | 0.0866 |
| No log | 9.0 | 63 | 0.6744 | 0.81 | 0.2889 | 1.4703 | 0.81 | 0.7969 | 0.1989 | 0.0692 |
| No log | 10.0 | 70 | 0.8432 | 0.74 | 0.3859 | 1.3134 | 0.74 | 0.7206 | 0.1959 | 0.1051 |
| No log | 11.0 | 77 | 0.7424 | 0.765 | 0.3294 | 1.5162 | 0.765 | 0.7792 | 0.2005 | 0.1048 |
| No log | 12.0 | 84 | 0.6953 | 0.79 | 0.3194 | 1.2233 | 0.79 | 0.7850 | 0.1800 | 0.0922 |
| No log | 13.0 | 91 | 0.5703 | 0.845 | 0.2538 | 1.2355 | 0.845 | 0.8372 | 0.1739 | 0.0447 |
| No log | 14.0 | 98 | 0.6439 | 0.795 | 0.2924 | 1.2777 | 0.795 | 0.7743 | 0.1771 | 0.0534 |
| No log | 15.0 | 105 | 0.5895 | 0.825 | 0.2650 | 1.2086 | 0.825 | 0.8071 | 0.1665 | 0.0566 |
| No log | 16.0 | 112 | 0.5973 | 0.81 | 0.2753 | 1.0959 | 0.81 | 0.8013 | 0.1839 | 0.0534 |
| No log | 17.0 | 119 | 0.5825 | 0.795 | 0.2722 | 1.1565 | 0.795 | 0.7886 | 0.1855 | 0.0534 |
| No log | 18.0 | 126 | 0.5854 | 0.845 | 0.2661 | 1.1223 | 0.845 | 0.8424 | 0.1981 | 0.0549 |
| No log | 19.0 | 133 | 0.5514 | 0.82 | 0.2553 | 0.9585 | 0.82 | 0.8150 | 0.1600 | 0.0481 |
| No log | 20.0 | 140 | 0.5600 | 0.835 | 0.2443 | 1.2692 | 0.835 | 0.8232 | 0.1657 | 0.0469 |
| No log | 21.0 | 147 | 0.5592 | 0.845 | 0.2473 | 1.1658 | 0.845 | 0.8331 | 0.1683 | 0.0493 |
| No log | 22.0 | 154 | 0.5507 | 0.845 | 0.2411 | 1.1403 | 0.845 | 0.8311 | 0.1797 | 0.0450 |
| No log | 23.0 | 161 | 0.5305 | 0.84 | 0.2361 | 1.1509 | 0.8400 | 0.8287 | 0.1650 | 0.0409 |
| No log | 24.0 | 168 | 0.5352 | 0.835 | 0.2378 | 1.2208 | 0.835 | 0.8201 | 0.1515 | 0.0420 |
| No log | 25.0 | 175 | 0.5425 | 0.845 | 0.2420 | 1.1208 | 0.845 | 0.8321 | 0.1776 | 0.0430 |
| No log | 26.0 | 182 | 0.5396 | 0.84 | 0.2409 | 1.1230 | 0.8400 | 0.8286 | 0.1647 | 0.0446 |
| No log | 27.0 | 189 | 0.5436 | 0.85 | 0.2401 | 1.1179 | 0.85 | 0.8387 | 0.1568 | 0.0427 |
| No log | 28.0 | 196 | 0.5373 | 0.835 | 0.2415 | 1.1092 | 0.835 | 0.8141 | 0.1641 | 0.0427 |
| No log | 29.0 | 203 | 0.5420 | 0.845 | 0.2436 | 1.0988 | 0.845 | 0.8326 | 0.1551 | 0.0444 |
| No log | 30.0 | 210 | 0.5413 | 0.845 | 0.2420 | 1.1064 | 0.845 | 0.8312 | 0.1486 | 0.0440 |
| No log | 31.0 | 217 | 0.5411 | 0.84 | 0.2418 | 1.1024 | 0.8400 | 0.8286 | 0.1565 | 0.0435 |
| No log | 32.0 | 224 | 0.5426 | 0.845 | 0.2429 | 1.0993 | 0.845 | 0.8322 | 0.1631 | 0.0433 |
| No log | 33.0 | 231 | 0.5424 | 0.85 | 0.2426 | 1.0989 | 0.85 | 0.8348 | 0.1615 | 0.0436 |
| No log | 34.0 | 238 | 0.5406 | 0.84 | 0.2419 | 1.0979 | 0.8400 | 0.8251 | 0.1640 | 0.0440 |
| No log | 35.0 | 245 | 0.5438 | 0.85 | 0.2436 | 1.0953 | 0.85 | 0.8348 | 0.1595 | 0.0438 |
| No log | 36.0 | 252 | 0.5429 | 0.85 | 0.2429 | 1.0970 | 0.85 | 0.8348 | 0.1495 | 0.0433 |
| No log | 37.0 | 259 | 0.5431 | 0.85 | 0.2427 | 1.0951 | 0.85 | 0.8348 | 0.1617 | 0.0435 |
| No log | 38.0 | 266 | 0.5424 | 0.85 | 0.2426 | 1.0959 | 0.85 | 0.8348 | 0.1587 | 0.0434 |
| No log | 39.0 | 273 | 0.5428 | 0.85 | 0.2432 | 1.0924 | 0.85 | 0.8348 | 0.1512 | 0.0433 |
| No log | 40.0 | 280 | 0.5437 | 0.85 | 0.2438 | 1.0911 | 0.85 | 0.8348 | 0.1726 | 0.0438 |
| No log | 41.0 | 287 | 0.5438 | 0.85 | 0.2434 | 1.0925 | 0.85 | 0.8348 | 0.1704 | 0.0433 |
| No log | 42.0 | 294 | 0.5428 | 0.85 | 0.2432 | 1.0927 | 0.85 | 0.8348 | 0.1585 | 0.0436 |
| No log | 43.0 | 301 | 0.5455 | 0.85 | 0.2443 | 1.0907 | 0.85 | 0.8348 | 0.1756 | 0.0437 |
| No log | 44.0 | 308 | 0.5427 | 0.85 | 0.2433 | 1.0908 | 0.85 | 0.8348 | 0.1616 | 0.0433 |
| No log | 45.0 | 315 | 0.5456 | 0.85 | 0.2446 | 1.0878 | 0.85 | 0.8348 | 0.1767 | 0.0437 |
| No log | 46.0 | 322 | 0.5439 | 0.85 | 0.2438 | 1.0895 | 0.85 | 0.8348 | 0.1503 | 0.0435 |
| No log | 47.0 | 329 | 0.5448 | 0.85 | 0.2443 | 1.0891 | 0.85 | 0.8348 | 0.1674 | 0.0439 |
| No log | 48.0 | 336 | 0.5440 | 0.85 | 0.2437 | 1.0898 | 0.85 | 0.8348 | 0.1768 | 0.0437 |
| No log | 49.0 | 343 | 0.5443 | 0.85 | 0.2441 | 1.0883 | 0.85 | 0.8348 | 0.1433 | 0.0432 |
| No log | 50.0 | 350 | 0.5449 | 0.85 | 0.2444 | 1.0877 | 0.85 | 0.8348 | 0.1722 | 0.0436 |
| No log | 51.0 | 357 | 0.5443 | 0.85 | 0.2442 | 1.0871 | 0.85 | 0.8348 | 0.1606 | 0.0434 |
| No log | 52.0 | 364 | 0.5453 | 0.85 | 0.2444 | 1.0865 | 0.85 | 0.8348 | 0.1729 | 0.0436 |
| No log | 53.0 | 371 | 0.5433 | 0.845 | 0.2438 | 1.0873 | 0.845 | 0.8287 | 0.1570 | 0.0434 |
| No log | 54.0 | 378 | 0.5453 | 0.85 | 0.2447 | 1.0854 | 0.85 | 0.8348 | 0.1606 | 0.0435 |
| No log | 55.0 | 385 | 0.5438 | 0.85 | 0.2439 | 1.0868 | 0.85 | 0.8348 | 0.1721 | 0.0434 |
| No log | 56.0 | 392 | 0.5455 | 0.85 | 0.2447 | 1.0853 | 0.85 | 0.8348 | 0.1710 | 0.0437 |
| No log | 57.0 | 399 | 0.5435 | 0.85 | 0.2439 | 1.0864 | 0.85 | 0.8348 | 0.1540 | 0.0434 |
| No log | 58.0 | 406 | 0.5451 | 0.85 | 0.2447 | 1.0844 | 0.85 | 0.8348 | 0.1636 | 0.0436 |
| No log | 59.0 | 413 | 0.5442 | 0.85 | 0.2441 | 1.0858 | 0.85 | 0.8348 | 0.1556 | 0.0435 |
| No log | 60.0 | 420 | 0.5453 | 0.85 | 0.2447 | 1.0843 | 0.85 | 0.8348 | 0.1717 | 0.0437 |
| No log | 61.0 | 427 | 0.5439 | 0.85 | 0.2442 | 1.0847 | 0.85 | 0.8348 | 0.1541 | 0.0432 |
| No log | 62.0 | 434 | 0.5455 | 0.85 | 0.2449 | 1.0839 | 0.85 | 0.8348 | 0.1550 | 0.0435 |
| No log | 63.0 | 441 | 0.5446 | 0.85 | 0.2445 | 1.0843 | 0.85 | 0.8348 | 0.1553 | 0.0435 |
| No log | 64.0 | 448 | 0.5448 | 0.85 | 0.2446 | 1.0833 | 0.85 | 0.8348 | 0.1634 | 0.0435 |
| No log | 65.0 | 455 | 0.5443 | 0.85 | 0.2443 | 1.0847 | 0.85 | 0.8348 | 0.1554 | 0.0435 |
| No log | 66.0 | 462 | 0.5448 | 0.85 | 0.2447 | 1.0831 | 0.85 | 0.8348 | 0.1547 | 0.0436 |
| No log | 67.0 | 469 | 0.5452 | 0.85 | 0.2448 | 1.0828 | 0.85 | 0.8348 | 0.1563 | 0.0436 |
| No log | 68.0 | 476 | 0.5443 | 0.85 | 0.2444 | 1.0834 | 0.85 | 0.8348 | 0.1472 | 0.0434 |
| No log | 69.0 | 483 | 0.5447 | 0.85 | 0.2445 | 1.0832 | 0.85 | 0.8348 | 0.1632 | 0.0434 |
| No log | 70.0 | 490 | 0.5447 | 0.85 | 0.2446 | 1.0831 | 0.85 | 0.8348 | 0.1559 | 0.0435 |
| No log | 71.0 | 497 | 0.5447 | 0.85 | 0.2446 | 1.0829 | 0.85 | 0.8348 | 0.1473 | 0.0435 |
| 0.1823 | 72.0 | 504 | 0.5443 | 0.85 | 0.2444 | 1.0828 | 0.85 | 0.8348 | 0.1559 | 0.0434 |
| 0.1823 | 73.0 | 511 | 0.5447 | 0.85 | 0.2447 | 1.0825 | 0.85 | 0.8348 | 0.1472 | 0.0434 |
| 0.1823 | 74.0 | 518 | 0.5444 | 0.85 | 0.2444 | 1.0829 | 0.85 | 0.8348 | 0.1559 | 0.0436 |
| 0.1823 | 75.0 | 525 | 0.5446 | 0.85 | 0.2445 | 1.0829 | 0.85 | 0.8348 | 0.1557 | 0.0435 |
| 0.1823 | 76.0 | 532 | 0.5448 | 0.85 | 0.2445 | 1.0825 | 0.85 | 0.8348 | 0.1559 | 0.0435 |
| 0.1823 | 77.0 | 539 | 0.5443 | 0.85 | 0.2444 | 1.0827 | 0.85 | 0.8348 | 0.1558 | 0.0435 |
| 0.1823 | 78.0 | 546 | 0.5446 | 0.85 | 0.2446 | 1.0824 | 0.85 | 0.8348 | 0.1560 | 0.0436 |
| 0.1823 | 79.0 | 553 | 0.5450 | 0.85 | 0.2448 | 1.0821 | 0.85 | 0.8348 | 0.1637 | 0.0436 |
| 0.1823 | 80.0 | 560 | 0.5447 | 0.85 | 0.2446 | 1.0823 | 0.85 | 0.8348 | 0.1638 | 0.0436 |
| 0.1823 | 81.0 | 567 | 0.5446 | 0.85 | 0.2446 | 1.0820 | 0.85 | 0.8348 | 0.1560 | 0.0435 |
| 0.1823 | 82.0 | 574 | 0.5447 | 0.85 | 0.2446 | 1.0819 | 0.85 | 0.8348 | 0.1561 | 0.0435 |
| 0.1823 | 83.0 | 581 | 0.5448 | 0.85 | 0.2446 | 1.0822 | 0.85 | 0.8348 | 0.1550 | 0.0436 |
| 0.1823 | 84.0 | 588 | 0.5445 | 0.85 | 0.2446 | 1.0819 | 0.85 | 0.8348 | 0.1551 | 0.0435 |
| 0.1823 | 85.0 | 595 | 0.5446 | 0.85 | 0.2446 | 1.0818 | 0.85 | 0.8348 | 0.1560 | 0.0436 |
| 0.1823 | 86.0 | 602 | 0.5446 | 0.85 | 0.2446 | 1.0818 | 0.85 | 0.8348 | 0.1560 | 0.0435 |
| 0.1823 | 87.0 | 609 | 0.5448 | 0.85 | 0.2447 | 1.0820 | 0.85 | 0.8348 | 0.1560 | 0.0435 |
| 0.1823 | 88.0 | 616 | 0.5447 | 0.85 | 0.2446 | 1.0819 | 0.85 | 0.8348 | 0.1551 | 0.0435 |
| 0.1823 | 89.0 | 623 | 0.5446 | 0.85 | 0.2446 | 1.0819 | 0.85 | 0.8348 | 0.1560 | 0.0435 |
| 0.1823 | 90.0 | 630 | 0.5446 | 0.85 | 0.2446 | 1.0816 | 0.85 | 0.8348 | 0.1638 | 0.0436 |
| 0.1823 | 91.0 | 637 | 0.5446 | 0.85 | 0.2445 | 1.0817 | 0.85 | 0.8348 | 0.1474 | 0.0435 |
| 0.1823 | 92.0 | 644 | 0.5445 | 0.85 | 0.2445 | 1.0818 | 0.85 | 0.8348 | 0.1551 | 0.0436 |
| 0.1823 | 93.0 | 651 | 0.5447 | 0.85 | 0.2446 | 1.0818 | 0.85 | 0.8348 | 0.1560 | 0.0436 |
| 0.1823 | 94.0 | 658 | 0.5447 | 0.85 | 0.2446 | 1.0816 | 0.85 | 0.8348 | 0.1561 | 0.0436 |
| 0.1823 | 95.0 | 665 | 0.5447 | 0.85 | 0.2446 | 1.0816 | 0.85 | 0.8348 | 0.1550 | 0.0435 |
| 0.1823 | 96.0 | 672 | 0.5446 | 0.85 | 0.2446 | 1.0816 | 0.85 | 0.8348 | 0.1474 | 0.0436 |
| 0.1823 | 97.0 | 679 | 0.5446 | 0.85 | 0.2446 | 1.0817 | 0.85 | 0.8348 | 0.1551 | 0.0436 |
| 0.1823 | 98.0 | 686 | 0.5446 | 0.85 | 0.2446 | 1.0817 | 0.85 | 0.8348 | 0.1474 | 0.0436 |
| 0.1823 | 99.0 | 693 | 0.5446 | 0.85 | 0.2446 | 1.0816 | 0.85 | 0.8348 | 0.1474 | 0.0436 |
| 0.1823 | 100.0 | 700 | 0.5446 | 0.85 | 0.2446 | 1.0816 | 0.85 | 0.8348 | 0.1474 | 0.0436 |
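Once training finishes, the resulting checkpoint can be used for single-page document classification roughly as follows; this is a minimal sketch assuming the checkpoint is published under the repo id of this card and uses standard ViT preprocessing, with `document_page.png` as a hypothetical input file:
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "jordyvl/vit-small_tobacco3482_kd_CEKD_t2.5_a0.9"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("document_page.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
# Expected output is one of: adve, email, form, letter, memo, news, note, report, resume, scientific.
```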
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
ALM-AHME/convnextv2-large-1k-224-finetuned-Lesion-Classification-HAM10000-AH-60-20-20-V2
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# convnextv2-large-1k-224-finetuned-Lesion-Classification-HAM10000-AH-60-20-20-V2
This model is a fine-tuned version of [ALM-AHME/convnextv2-large-1k-224-finetuned-Lesion-Classification-HAM10000-AH-60-20-20](https://huggingface.co/ALM-AHME/convnextv2-large-1k-224-finetuned-Lesion-Classification-HAM10000-AH-60-20-20) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.9
- num_epochs: 12
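This card records the training setup only; for a quick sanity check, the fine-tuned checkpoint can be queried with the high-level pipeline API along these lines (a minimal sketch assuming the checkpoint is published under the repo id of this card; `lesion.jpg` is a hypothetical input image):
```python
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="ALM-AHME/convnextv2-large-1k-224-finetuned-Lesion-Classification-HAM10000-AH-60-20-20-V2",
)
for prediction in classifier("lesion.jpg", top_k=3):
    print(prediction["label"], round(prediction["score"], 3))
# Labels come from the HAM10000 label set: akiec, bcc, bkl, df, mel, nv, vasc.
```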
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"akiec",
"bcc",
"bkl",
"df",
"mel",
"nv",
"vasc"
] |
jordyvl/vit-tiny_rvl_cdip_100_examples_per_class_kd_MSE
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_rvl_cdip_100_examples_per_class_kd_MSE
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7723
- Accuracy: 0.6025
- Brier Loss: 0.5295
- Nll: 3.6748
- F1 Micro: 0.6025
- F1 Macro: 0.6055
- Ece: 0.1688
- Aurc: 0.1708
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
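The `kd_MSE` suffix suggests distillation by matching the student's logits to a teacher's logits with a mean-squared-error penalty. The objective is not documented on the card, so the fragment below is only a plausible sketch, and the presence and weight of the cross-entropy term are assumptions:
```python
import torch
import torch.nn.functional as F

def mse_kd_loss(student_logits, teacher_logits, labels, alpha=0.5):
    """Plausible CE + MSE logit-matching objective implied by the 'kd_MSE' name (assumed)."""
    ce = F.cross_entropy(student_logits, labels)
    mse = F.mse_loss(student_logits, teacher_logits)  # direct logit matching, no temperature
    return alpha * ce + (1.0 - alpha) * mse
```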
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 4.7870 | 0.065 | 0.9655 | 17.0930 | 0.065 | 0.0550 | 0.1747 | 0.9357 |
| No log | 2.0 | 50 | 3.9498 | 0.205 | 0.8858 | 9.5780 | 0.205 | 0.1863 | 0.1692 | 0.6618 |
| No log | 3.0 | 75 | 3.3698 | 0.3675 | 0.7672 | 6.4908 | 0.3675 | 0.3392 | 0.1676 | 0.4195 |
| No log | 4.0 | 100 | 2.9935 | 0.4075 | 0.6958 | 5.5595 | 0.4075 | 0.3820 | 0.1828 | 0.3327 |
| No log | 5.0 | 125 | 2.8351 | 0.455 | 0.6591 | 4.8619 | 0.455 | 0.4351 | 0.1561 | 0.2833 |
| No log | 6.0 | 150 | 2.8196 | 0.4725 | 0.6595 | 4.7785 | 0.4725 | 0.4367 | 0.1808 | 0.2790 |
| No log | 7.0 | 175 | 2.6352 | 0.5075 | 0.6234 | 4.9881 | 0.5075 | 0.4886 | 0.1563 | 0.2493 |
| No log | 8.0 | 200 | 2.5325 | 0.525 | 0.6162 | 4.3297 | 0.525 | 0.5026 | 0.1724 | 0.2365 |
| No log | 9.0 | 225 | 2.5459 | 0.53 | 0.6099 | 5.1608 | 0.53 | 0.5148 | 0.1944 | 0.2350 |
| No log | 10.0 | 250 | 2.5573 | 0.5325 | 0.6161 | 5.4495 | 0.5325 | 0.5212 | 0.2052 | 0.2397 |
| No log | 11.0 | 275 | 2.3199 | 0.5675 | 0.5828 | 4.1247 | 0.5675 | 0.5626 | 0.1849 | 0.2071 |
| No log | 12.0 | 300 | 2.2917 | 0.565 | 0.5758 | 4.1738 | 0.565 | 0.5694 | 0.1992 | 0.2023 |
| No log | 13.0 | 325 | 2.2744 | 0.555 | 0.5974 | 4.2323 | 0.555 | 0.5544 | 0.1982 | 0.2203 |
| No log | 14.0 | 350 | 2.1638 | 0.5625 | 0.5807 | 4.2049 | 0.5625 | 0.5629 | 0.1868 | 0.2049 |
| No log | 15.0 | 375 | 2.1934 | 0.5575 | 0.5903 | 4.3813 | 0.5575 | 0.5614 | 0.1868 | 0.2022 |
| No log | 16.0 | 400 | 2.1092 | 0.5625 | 0.5702 | 3.6094 | 0.5625 | 0.5700 | 0.1846 | 0.2011 |
| No log | 17.0 | 425 | 2.0379 | 0.5875 | 0.5642 | 4.4351 | 0.5875 | 0.5822 | 0.2036 | 0.1959 |
| No log | 18.0 | 450 | 2.0303 | 0.5825 | 0.5558 | 3.6847 | 0.5825 | 0.5820 | 0.1684 | 0.1881 |
| No log | 19.0 | 475 | 2.0506 | 0.57 | 0.5749 | 4.0014 | 0.57 | 0.5708 | 0.1725 | 0.2027 |
| 1.5026 | 20.0 | 500 | 1.9932 | 0.5875 | 0.5524 | 3.8003 | 0.5875 | 0.5914 | 0.1843 | 0.1831 |
| 1.5026 | 21.0 | 525 | 2.0131 | 0.565 | 0.5643 | 4.0681 | 0.565 | 0.5635 | 0.1776 | 0.1957 |
| 1.5026 | 22.0 | 550 | 2.0162 | 0.5725 | 0.5712 | 3.7068 | 0.5725 | 0.5766 | 0.1934 | 0.1955 |
| 1.5026 | 23.0 | 575 | 1.9093 | 0.605 | 0.5381 | 3.7930 | 0.605 | 0.6032 | 0.1539 | 0.1749 |
| 1.5026 | 24.0 | 600 | 1.9607 | 0.575 | 0.5561 | 4.5740 | 0.575 | 0.5789 | 0.1782 | 0.1902 |
| 1.5026 | 25.0 | 625 | 1.8971 | 0.5825 | 0.5408 | 3.7290 | 0.5825 | 0.5754 | 0.1836 | 0.1751 |
| 1.5026 | 26.0 | 650 | 1.9217 | 0.5775 | 0.5537 | 3.8085 | 0.5775 | 0.5844 | 0.1725 | 0.1843 |
| 1.5026 | 27.0 | 675 | 1.9493 | 0.585 | 0.5606 | 3.6743 | 0.585 | 0.5953 | 0.1755 | 0.1882 |
| 1.5026 | 28.0 | 700 | 1.8884 | 0.585 | 0.5437 | 3.7865 | 0.585 | 0.5828 | 0.1801 | 0.1822 |
| 1.5026 | 29.0 | 725 | 1.9242 | 0.585 | 0.5479 | 3.9607 | 0.585 | 0.5856 | 0.1619 | 0.1817 |
| 1.5026 | 30.0 | 750 | 1.8767 | 0.5975 | 0.5470 | 3.7995 | 0.5975 | 0.5966 | 0.1599 | 0.1790 |
| 1.5026 | 31.0 | 775 | 1.8723 | 0.5925 | 0.5337 | 3.8962 | 0.5925 | 0.5972 | 0.1678 | 0.1729 |
| 1.5026 | 32.0 | 800 | 1.9093 | 0.585 | 0.5545 | 3.8776 | 0.585 | 0.5830 | 0.1902 | 0.1841 |
| 1.5026 | 33.0 | 825 | 1.8667 | 0.595 | 0.5363 | 3.8926 | 0.595 | 0.5917 | 0.1772 | 0.1745 |
| 1.5026 | 34.0 | 850 | 1.8403 | 0.59 | 0.5521 | 3.8560 | 0.59 | 0.5953 | 0.1711 | 0.1800 |
| 1.5026 | 35.0 | 875 | 1.8464 | 0.5925 | 0.5380 | 4.0376 | 0.5925 | 0.5970 | 0.1719 | 0.1756 |
| 1.5026 | 36.0 | 900 | 1.8441 | 0.5975 | 0.5411 | 3.7193 | 0.5975 | 0.6008 | 0.1569 | 0.1753 |
| 1.5026 | 37.0 | 925 | 1.8599 | 0.5875 | 0.5402 | 3.9139 | 0.5875 | 0.5908 | 0.1779 | 0.1789 |
| 1.5026 | 38.0 | 950 | 1.8559 | 0.6 | 0.5458 | 3.8970 | 0.6 | 0.5991 | 0.1583 | 0.1804 |
| 1.5026 | 39.0 | 975 | 1.8285 | 0.61 | 0.5370 | 3.6292 | 0.61 | 0.6155 | 0.1623 | 0.1722 |
| 0.0745 | 40.0 | 1000 | 1.8309 | 0.5975 | 0.5432 | 3.6865 | 0.5975 | 0.6017 | 0.1663 | 0.1821 |
| 0.0745 | 41.0 | 1025 | 1.8237 | 0.59 | 0.5348 | 3.6213 | 0.59 | 0.5921 | 0.1695 | 0.1738 |
| 0.0745 | 42.0 | 1050 | 1.8421 | 0.605 | 0.5360 | 3.8592 | 0.605 | 0.6048 | 0.1601 | 0.1743 |
| 0.0745 | 43.0 | 1075 | 1.8158 | 0.5975 | 0.5300 | 3.4537 | 0.5975 | 0.5953 | 0.1696 | 0.1707 |
| 0.0745 | 44.0 | 1100 | 1.8238 | 0.5875 | 0.5358 | 3.7706 | 0.5875 | 0.5923 | 0.1797 | 0.1754 |
| 0.0745 | 45.0 | 1125 | 1.8214 | 0.595 | 0.5463 | 3.4742 | 0.595 | 0.5981 | 0.1800 | 0.1770 |
| 0.0745 | 46.0 | 1150 | 1.8162 | 0.5925 | 0.5317 | 3.9260 | 0.5925 | 0.5950 | 0.1646 | 0.1733 |
| 0.0745 | 47.0 | 1175 | 1.8050 | 0.5975 | 0.5392 | 3.8322 | 0.5975 | 0.5979 | 0.1794 | 0.1763 |
| 0.0745 | 48.0 | 1200 | 1.8214 | 0.5975 | 0.5347 | 3.7965 | 0.5975 | 0.6009 | 0.1555 | 0.1746 |
| 0.0745 | 49.0 | 1225 | 1.7813 | 0.6 | 0.5294 | 3.8398 | 0.6 | 0.6005 | 0.1674 | 0.1688 |
| 0.0745 | 50.0 | 1250 | 1.8179 | 0.6075 | 0.5336 | 3.4690 | 0.6075 | 0.6112 | 0.1743 | 0.1748 |
| 0.0745 | 51.0 | 1275 | 1.7953 | 0.595 | 0.5380 | 3.7781 | 0.595 | 0.5990 | 0.1380 | 0.1727 |
| 0.0745 | 52.0 | 1300 | 1.7897 | 0.6 | 0.5323 | 3.7412 | 0.6 | 0.6013 | 0.1603 | 0.1707 |
| 0.0745 | 53.0 | 1325 | 1.8072 | 0.59 | 0.5428 | 3.5993 | 0.59 | 0.5947 | 0.1571 | 0.1773 |
| 0.0745 | 54.0 | 1350 | 1.7834 | 0.605 | 0.5219 | 3.7600 | 0.605 | 0.6049 | 0.1563 | 0.1671 |
| 0.0745 | 55.0 | 1375 | 1.7920 | 0.595 | 0.5361 | 3.5986 | 0.595 | 0.5978 | 0.1512 | 0.1717 |
| 0.0745 | 56.0 | 1400 | 1.8074 | 0.5925 | 0.5387 | 3.5383 | 0.5925 | 0.5962 | 0.1669 | 0.1741 |
| 0.0745 | 57.0 | 1425 | 1.7893 | 0.605 | 0.5346 | 3.6929 | 0.605 | 0.6039 | 0.1641 | 0.1681 |
| 0.0745 | 58.0 | 1450 | 1.7787 | 0.6 | 0.5317 | 3.7652 | 0.6 | 0.6004 | 0.1850 | 0.1726 |
| 0.0745 | 59.0 | 1475 | 1.7888 | 0.595 | 0.5323 | 3.4558 | 0.595 | 0.5975 | 0.1797 | 0.1732 |
| 0.0231 | 60.0 | 1500 | 1.8064 | 0.58 | 0.5332 | 3.7773 | 0.58 | 0.5839 | 0.1819 | 0.1762 |
| 0.0231 | 61.0 | 1525 | 1.7795 | 0.6075 | 0.5298 | 3.7998 | 0.6075 | 0.6086 | 0.1678 | 0.1704 |
| 0.0231 | 62.0 | 1550 | 1.7826 | 0.595 | 0.5318 | 3.6741 | 0.595 | 0.5916 | 0.1550 | 0.1715 |
| 0.0231 | 63.0 | 1575 | 1.7704 | 0.5925 | 0.5325 | 3.5942 | 0.5925 | 0.5941 | 0.1619 | 0.1712 |
| 0.0231 | 64.0 | 1600 | 1.7901 | 0.6025 | 0.5289 | 3.4459 | 0.6025 | 0.6054 | 0.2022 | 0.1712 |
| 0.0231 | 65.0 | 1625 | 1.7944 | 0.59 | 0.5381 | 3.7591 | 0.59 | 0.5910 | 0.1599 | 0.1756 |
| 0.0231 | 66.0 | 1650 | 1.7721 | 0.605 | 0.5256 | 3.5227 | 0.605 | 0.6045 | 0.1525 | 0.1677 |
| 0.0231 | 67.0 | 1675 | 1.7779 | 0.5975 | 0.5306 | 3.6792 | 0.5975 | 0.5994 | 0.1667 | 0.1714 |
| 0.0231 | 68.0 | 1700 | 1.7724 | 0.6 | 0.5250 | 3.7552 | 0.6 | 0.6022 | 0.1818 | 0.1683 |
| 0.0231 | 69.0 | 1725 | 1.7765 | 0.6025 | 0.5283 | 3.4264 | 0.6025 | 0.6019 | 0.1671 | 0.1700 |
| 0.0231 | 70.0 | 1750 | 1.7784 | 0.6 | 0.5276 | 3.6887 | 0.6 | 0.6053 | 0.1715 | 0.1703 |
| 0.0231 | 71.0 | 1775 | 1.7659 | 0.6 | 0.5282 | 3.6051 | 0.6 | 0.6006 | 0.1722 | 0.1691 |
| 0.0231 | 72.0 | 1800 | 1.7882 | 0.5975 | 0.5329 | 3.5950 | 0.5975 | 0.6016 | 0.1981 | 0.1716 |
| 0.0231 | 73.0 | 1825 | 1.7678 | 0.6 | 0.5287 | 3.6691 | 0.6 | 0.6032 | 0.1733 | 0.1696 |
| 0.0231 | 74.0 | 1850 | 1.7716 | 0.6 | 0.5286 | 3.7576 | 0.6 | 0.6013 | 0.1734 | 0.1692 |
| 0.0231 | 75.0 | 1875 | 1.7704 | 0.6 | 0.5299 | 3.5917 | 0.6 | 0.6016 | 0.1645 | 0.1709 |
| 0.0231 | 76.0 | 1900 | 1.7729 | 0.6 | 0.5298 | 3.6758 | 0.6 | 0.6024 | 0.1766 | 0.1710 |
| 0.0231 | 77.0 | 1925 | 1.7749 | 0.6 | 0.5308 | 3.6022 | 0.6 | 0.6030 | 0.1604 | 0.1717 |
| 0.0231 | 78.0 | 1950 | 1.7720 | 0.6 | 0.5294 | 3.6759 | 0.6 | 0.6017 | 0.1786 | 0.1708 |
| 0.0231 | 79.0 | 1975 | 1.7734 | 0.6025 | 0.5288 | 3.6765 | 0.6025 | 0.6048 | 0.1673 | 0.1698 |
| 0.0059 | 80.0 | 2000 | 1.7709 | 0.6 | 0.5286 | 3.6755 | 0.6 | 0.6020 | 0.1749 | 0.1704 |
| 0.0059 | 81.0 | 2025 | 1.7730 | 0.6 | 0.5295 | 3.6760 | 0.6 | 0.6020 | 0.1677 | 0.1708 |
| 0.0059 | 82.0 | 2050 | 1.7723 | 0.6025 | 0.5295 | 3.6756 | 0.6025 | 0.6055 | 0.1626 | 0.1708 |
| 0.0059 | 83.0 | 2075 | 1.7721 | 0.6025 | 0.5295 | 3.6741 | 0.6025 | 0.6055 | 0.1709 | 0.1708 |
| 0.0059 | 84.0 | 2100 | 1.7725 | 0.6025 | 0.5297 | 3.6747 | 0.6025 | 0.6048 | 0.1627 | 0.1709 |
| 0.0059 | 85.0 | 2125 | 1.7724 | 0.6025 | 0.5295 | 3.6751 | 0.6025 | 0.6055 | 0.1639 | 0.1707 |
| 0.0059 | 86.0 | 2150 | 1.7724 | 0.6025 | 0.5296 | 3.6751 | 0.6025 | 0.6055 | 0.1630 | 0.1708 |
| 0.0059 | 87.0 | 2175 | 1.7724 | 0.6025 | 0.5295 | 3.6749 | 0.6025 | 0.6055 | 0.1638 | 0.1707 |
| 0.0059 | 88.0 | 2200 | 1.7722 | 0.6025 | 0.5295 | 3.6752 | 0.6025 | 0.6055 | 0.1645 | 0.1708 |
| 0.0059 | 89.0 | 2225 | 1.7723 | 0.6025 | 0.5295 | 3.6747 | 0.6025 | 0.6055 | 0.1639 | 0.1708 |
| 0.0059 | 90.0 | 2250 | 1.7723 | 0.6025 | 0.5294 | 3.6750 | 0.6025 | 0.6055 | 0.1643 | 0.1708 |
| 0.0059 | 91.0 | 2275 | 1.7723 | 0.6025 | 0.5294 | 3.6750 | 0.6025 | 0.6055 | 0.1643 | 0.1708 |
| 0.0059 | 92.0 | 2300 | 1.7723 | 0.6025 | 0.5295 | 3.6747 | 0.6025 | 0.6055 | 0.1639 | 0.1708 |
| 0.0059 | 93.0 | 2325 | 1.7723 | 0.6025 | 0.5295 | 3.6749 | 0.6025 | 0.6055 | 0.1637 | 0.1707 |
| 0.0059 | 94.0 | 2350 | 1.7722 | 0.6025 | 0.5295 | 3.6749 | 0.6025 | 0.6055 | 0.1688 | 0.1708 |
| 0.0059 | 95.0 | 2375 | 1.7723 | 0.6025 | 0.5295 | 3.6748 | 0.6025 | 0.6055 | 0.1643 | 0.1708 |
| 0.0059 | 96.0 | 2400 | 1.7723 | 0.6025 | 0.5294 | 3.6748 | 0.6025 | 0.6055 | 0.1643 | 0.1707 |
| 0.0059 | 97.0 | 2425 | 1.7723 | 0.6025 | 0.5295 | 3.6748 | 0.6025 | 0.6055 | 0.1688 | 0.1708 |
| 0.0059 | 98.0 | 2450 | 1.7723 | 0.6025 | 0.5295 | 3.6749 | 0.6025 | 0.6055 | 0.1643 | 0.1708 |
| 0.0059 | 99.0 | 2475 | 1.7723 | 0.6025 | 0.5295 | 3.6749 | 0.6025 | 0.6055 | 0.1688 | 0.1708 |
| 0.0 | 100.0 | 2500 | 1.7723 | 0.6025 | 0.5295 | 3.6748 | 0.6025 | 0.6055 | 0.1688 | 0.1708 |
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.12.0
- Tokenizers 0.12.1
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-small_tobacco3482_kd_CEKD_t5.0_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_tobacco3482_kd_CEKD_t5.0_a0.5
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3966
- Accuracy: 0.85
- Brier Loss: 0.2593
- Nll: 0.9223
- F1 Micro: 0.85
- F1 Macro: 0.8392
- Ece: 0.1994
- Aurc: 0.0457
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
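The `linear` schedule with `warmup_ratio: 0.1` corresponds to the standard warm-up-then-linear-decay schedule that the Trainer builds internally; a small stand-alone sketch follows (the 700 total steps match the step column of the results table below, and the `torch.nn.Linear` module is only a placeholder for the actual ViT student):
```python
import torch
from transformers import get_linear_schedule_with_warmup

model = torch.nn.Linear(384, 10)  # placeholder module standing in for the ViT-small classifier
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, betas=(0.9, 0.999), eps=1e-8)

total_steps = 700                      # 7 steps per epoch x 100 epochs, as in the results table
warmup_steps = int(0.1 * total_steps)  # warmup_ratio 0.1 -> 70 warm-up steps
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=warmup_steps, num_training_steps=total_steps
)
# The learning rate rises linearly to 1e-4 over the first 70 steps, then decays linearly to zero.
```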
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 1.5608 | 0.225 | 0.8774 | 5.2159 | 0.225 | 0.1397 | 0.2725 | 0.7038 |
| No log | 2.0 | 14 | 1.2539 | 0.415 | 0.7531 | 3.2673 | 0.415 | 0.2434 | 0.3070 | 0.4078 |
| No log | 3.0 | 21 | 0.9055 | 0.585 | 0.5971 | 1.9093 | 0.585 | 0.5086 | 0.3232 | 0.2172 |
| No log | 4.0 | 28 | 0.7122 | 0.72 | 0.4403 | 1.7693 | 0.72 | 0.6805 | 0.3073 | 0.1228 |
| No log | 5.0 | 35 | 0.6584 | 0.74 | 0.3938 | 1.5810 | 0.74 | 0.7214 | 0.2661 | 0.1075 |
| No log | 6.0 | 42 | 0.5711 | 0.8 | 0.3462 | 1.4146 | 0.8000 | 0.7524 | 0.2347 | 0.0843 |
| No log | 7.0 | 49 | 0.5521 | 0.8 | 0.3199 | 1.2631 | 0.8000 | 0.7867 | 0.2542 | 0.0634 |
| No log | 8.0 | 56 | 0.5603 | 0.77 | 0.3381 | 1.1808 | 0.7700 | 0.7680 | 0.2316 | 0.0858 |
| No log | 9.0 | 63 | 0.5209 | 0.82 | 0.3062 | 1.2891 | 0.82 | 0.7972 | 0.2405 | 0.0792 |
| No log | 10.0 | 70 | 0.5705 | 0.78 | 0.3343 | 1.5183 | 0.78 | 0.7743 | 0.2264 | 0.0874 |
| No log | 11.0 | 77 | 0.5137 | 0.82 | 0.3047 | 1.2987 | 0.82 | 0.8096 | 0.2420 | 0.0592 |
| No log | 12.0 | 84 | 0.4664 | 0.835 | 0.2929 | 1.1529 | 0.835 | 0.8101 | 0.2291 | 0.0753 |
| No log | 13.0 | 91 | 0.4772 | 0.82 | 0.2915 | 1.2078 | 0.82 | 0.8029 | 0.2131 | 0.0620 |
| No log | 14.0 | 98 | 0.4553 | 0.825 | 0.2843 | 1.1312 | 0.825 | 0.8112 | 0.2196 | 0.0453 |
| No log | 15.0 | 105 | 0.4574 | 0.825 | 0.2821 | 1.1234 | 0.825 | 0.8163 | 0.2241 | 0.0554 |
| No log | 16.0 | 112 | 0.4873 | 0.8 | 0.3111 | 1.2248 | 0.8000 | 0.8007 | 0.1992 | 0.0657 |
| No log | 17.0 | 119 | 0.4224 | 0.855 | 0.2620 | 1.1871 | 0.855 | 0.8218 | 0.2337 | 0.0479 |
| No log | 18.0 | 126 | 0.4414 | 0.825 | 0.2857 | 1.0723 | 0.825 | 0.8227 | 0.2500 | 0.0517 |
| No log | 19.0 | 133 | 0.4232 | 0.845 | 0.2737 | 0.9360 | 0.845 | 0.8219 | 0.2053 | 0.0543 |
| No log | 20.0 | 140 | 0.4114 | 0.845 | 0.2637 | 1.0046 | 0.845 | 0.8233 | 0.2144 | 0.0460 |
| No log | 21.0 | 147 | 0.4110 | 0.835 | 0.2640 | 0.9853 | 0.835 | 0.8160 | 0.2278 | 0.0466 |
| No log | 22.0 | 154 | 0.4163 | 0.845 | 0.2678 | 1.1494 | 0.845 | 0.8291 | 0.2156 | 0.0458 |
| No log | 23.0 | 161 | 0.4243 | 0.835 | 0.2779 | 0.9475 | 0.835 | 0.8269 | 0.2420 | 0.0554 |
| No log | 24.0 | 168 | 0.4079 | 0.835 | 0.2683 | 0.9249 | 0.835 | 0.8044 | 0.2091 | 0.0532 |
| No log | 25.0 | 175 | 0.4027 | 0.85 | 0.2621 | 0.9433 | 0.85 | 0.8361 | 0.2138 | 0.0530 |
| No log | 26.0 | 182 | 0.3975 | 0.855 | 0.2590 | 0.9310 | 0.855 | 0.8457 | 0.1932 | 0.0487 |
| No log | 27.0 | 189 | 0.4032 | 0.85 | 0.2650 | 0.9823 | 0.85 | 0.8425 | 0.2088 | 0.0528 |
| No log | 28.0 | 196 | 0.4037 | 0.845 | 0.2650 | 1.0692 | 0.845 | 0.8361 | 0.2157 | 0.0496 |
| No log | 29.0 | 203 | 0.4027 | 0.845 | 0.2652 | 1.0423 | 0.845 | 0.8295 | 0.1917 | 0.0502 |
| No log | 30.0 | 210 | 0.3989 | 0.85 | 0.2610 | 1.0633 | 0.85 | 0.8392 | 0.2214 | 0.0482 |
| No log | 31.0 | 217 | 0.3985 | 0.855 | 0.2609 | 1.0374 | 0.855 | 0.8424 | 0.2074 | 0.0472 |
| No log | 32.0 | 224 | 0.3986 | 0.85 | 0.2596 | 1.0403 | 0.85 | 0.8392 | 0.2184 | 0.0462 |
| No log | 33.0 | 231 | 0.3990 | 0.85 | 0.2603 | 1.0369 | 0.85 | 0.8392 | 0.2079 | 0.0470 |
| No log | 34.0 | 238 | 0.3982 | 0.85 | 0.2600 | 0.9765 | 0.85 | 0.8392 | 0.2160 | 0.0467 |
| No log | 35.0 | 245 | 0.3977 | 0.85 | 0.2601 | 0.9762 | 0.85 | 0.8392 | 0.2108 | 0.0465 |
| No log | 36.0 | 252 | 0.3977 | 0.85 | 0.2600 | 1.0372 | 0.85 | 0.8392 | 0.2075 | 0.0466 |
| No log | 37.0 | 259 | 0.3972 | 0.85 | 0.2597 | 1.0383 | 0.85 | 0.8392 | 0.2091 | 0.0465 |
| No log | 38.0 | 266 | 0.3967 | 0.85 | 0.2590 | 0.9796 | 0.85 | 0.8392 | 0.1987 | 0.0461 |
| No log | 39.0 | 273 | 0.3979 | 0.85 | 0.2601 | 1.0390 | 0.85 | 0.8392 | 0.1991 | 0.0467 |
| No log | 40.0 | 280 | 0.3976 | 0.85 | 0.2601 | 0.9775 | 0.85 | 0.8392 | 0.2175 | 0.0465 |
| No log | 41.0 | 287 | 0.3979 | 0.85 | 0.2603 | 0.9796 | 0.85 | 0.8392 | 0.1930 | 0.0467 |
| No log | 42.0 | 294 | 0.3973 | 0.85 | 0.2598 | 0.9746 | 0.85 | 0.8392 | 0.2175 | 0.0468 |
| No log | 43.0 | 301 | 0.3972 | 0.85 | 0.2598 | 0.9798 | 0.85 | 0.8392 | 0.1931 | 0.0466 |
| No log | 44.0 | 308 | 0.3969 | 0.85 | 0.2594 | 0.9784 | 0.85 | 0.8392 | 0.2094 | 0.0465 |
| No log | 45.0 | 315 | 0.3971 | 0.85 | 0.2596 | 0.9847 | 0.85 | 0.8392 | 0.2033 | 0.0464 |
| No log | 46.0 | 322 | 0.3969 | 0.85 | 0.2597 | 0.9768 | 0.85 | 0.8392 | 0.2100 | 0.0465 |
| No log | 47.0 | 329 | 0.3974 | 0.85 | 0.2599 | 0.9788 | 0.85 | 0.8392 | 0.2090 | 0.0467 |
| No log | 48.0 | 336 | 0.3971 | 0.85 | 0.2596 | 0.9797 | 0.85 | 0.8392 | 0.1977 | 0.0463 |
| No log | 49.0 | 343 | 0.3972 | 0.85 | 0.2597 | 0.9391 | 0.85 | 0.8392 | 0.1903 | 0.0465 |
| No log | 50.0 | 350 | 0.3969 | 0.85 | 0.2596 | 0.9802 | 0.85 | 0.8392 | 0.1985 | 0.0464 |
| No log | 51.0 | 357 | 0.3970 | 0.85 | 0.2596 | 0.9795 | 0.85 | 0.8392 | 0.2161 | 0.0463 |
| No log | 52.0 | 364 | 0.3973 | 0.85 | 0.2597 | 0.9333 | 0.85 | 0.8392 | 0.1983 | 0.0462 |
| No log | 53.0 | 371 | 0.3971 | 0.85 | 0.2597 | 0.9408 | 0.85 | 0.8392 | 0.2022 | 0.0467 |
| No log | 54.0 | 378 | 0.3970 | 0.85 | 0.2595 | 0.9371 | 0.85 | 0.8392 | 0.1992 | 0.0460 |
| No log | 55.0 | 385 | 0.3970 | 0.85 | 0.2596 | 0.9262 | 0.85 | 0.8392 | 0.1917 | 0.0464 |
| No log | 56.0 | 392 | 0.3971 | 0.85 | 0.2595 | 0.9195 | 0.85 | 0.8392 | 0.1927 | 0.0461 |
| No log | 57.0 | 399 | 0.3970 | 0.85 | 0.2596 | 0.9789 | 0.85 | 0.8392 | 0.1992 | 0.0462 |
| No log | 58.0 | 406 | 0.3968 | 0.85 | 0.2594 | 0.9255 | 0.85 | 0.8392 | 0.1929 | 0.0462 |
| No log | 59.0 | 413 | 0.3967 | 0.85 | 0.2593 | 0.9795 | 0.85 | 0.8392 | 0.1996 | 0.0459 |
| No log | 60.0 | 420 | 0.3970 | 0.85 | 0.2596 | 0.9787 | 0.85 | 0.8392 | 0.1994 | 0.0461 |
| No log | 61.0 | 427 | 0.3967 | 0.85 | 0.2594 | 0.9803 | 0.85 | 0.8392 | 0.2073 | 0.0461 |
| No log | 62.0 | 434 | 0.3968 | 0.85 | 0.2594 | 0.9325 | 0.85 | 0.8392 | 0.1996 | 0.0460 |
| No log | 63.0 | 441 | 0.3968 | 0.85 | 0.2595 | 0.9276 | 0.85 | 0.8392 | 0.2063 | 0.0459 |
| No log | 64.0 | 448 | 0.3968 | 0.85 | 0.2595 | 0.9247 | 0.85 | 0.8392 | 0.1991 | 0.0461 |
| No log | 65.0 | 455 | 0.3968 | 0.85 | 0.2595 | 0.9301 | 0.85 | 0.8392 | 0.1989 | 0.0459 |
| No log | 66.0 | 462 | 0.3968 | 0.85 | 0.2595 | 0.9310 | 0.85 | 0.8392 | 0.1922 | 0.0459 |
| No log | 67.0 | 469 | 0.3968 | 0.85 | 0.2595 | 0.9250 | 0.85 | 0.8392 | 0.2061 | 0.0459 |
| No log | 68.0 | 476 | 0.3968 | 0.85 | 0.2594 | 0.9234 | 0.85 | 0.8392 | 0.1994 | 0.0461 |
| No log | 69.0 | 483 | 0.3967 | 0.85 | 0.2594 | 0.9257 | 0.85 | 0.8392 | 0.2065 | 0.0459 |
| No log | 70.0 | 490 | 0.3967 | 0.85 | 0.2594 | 0.9205 | 0.85 | 0.8392 | 0.1840 | 0.0459 |
| No log | 71.0 | 497 | 0.3967 | 0.85 | 0.2594 | 0.9258 | 0.85 | 0.8392 | 0.2017 | 0.0458 |
| 0.1666 | 72.0 | 504 | 0.3969 | 0.85 | 0.2594 | 0.9297 | 0.85 | 0.8392 | 0.2017 | 0.0458 |
| 0.1666 | 73.0 | 511 | 0.3966 | 0.85 | 0.2593 | 0.9223 | 0.85 | 0.8392 | 0.1920 | 0.0457 |
| 0.1666 | 74.0 | 518 | 0.3967 | 0.85 | 0.2594 | 0.9228 | 0.85 | 0.8392 | 0.1920 | 0.0459 |
| 0.1666 | 75.0 | 525 | 0.3967 | 0.85 | 0.2594 | 0.9257 | 0.85 | 0.8392 | 0.1919 | 0.0459 |
| 0.1666 | 76.0 | 532 | 0.3966 | 0.85 | 0.2593 | 0.9232 | 0.85 | 0.8392 | 0.1994 | 0.0458 |
| 0.1666 | 77.0 | 539 | 0.3968 | 0.85 | 0.2594 | 0.9224 | 0.85 | 0.8392 | 0.1920 | 0.0459 |
| 0.1666 | 78.0 | 546 | 0.3966 | 0.85 | 0.2593 | 0.9242 | 0.85 | 0.8392 | 0.1918 | 0.0458 |
| 0.1666 | 79.0 | 553 | 0.3967 | 0.85 | 0.2594 | 0.9233 | 0.85 | 0.8392 | 0.1920 | 0.0459 |
| 0.1666 | 80.0 | 560 | 0.3968 | 0.85 | 0.2594 | 0.9241 | 0.85 | 0.8392 | 0.1919 | 0.0458 |
| 0.1666 | 81.0 | 567 | 0.3967 | 0.85 | 0.2594 | 0.9225 | 0.85 | 0.8392 | 0.1918 | 0.0459 |
| 0.1666 | 82.0 | 574 | 0.3967 | 0.85 | 0.2594 | 0.9233 | 0.85 | 0.8392 | 0.1919 | 0.0459 |
| 0.1666 | 83.0 | 581 | 0.3967 | 0.85 | 0.2593 | 0.9246 | 0.85 | 0.8392 | 0.1919 | 0.0458 |
| 0.1666 | 84.0 | 588 | 0.3966 | 0.85 | 0.2593 | 0.9229 | 0.85 | 0.8392 | 0.2017 | 0.0458 |
| 0.1666 | 85.0 | 595 | 0.3966 | 0.85 | 0.2593 | 0.9232 | 0.85 | 0.8392 | 0.2017 | 0.0458 |
| 0.1666 | 86.0 | 602 | 0.3967 | 0.85 | 0.2593 | 0.9225 | 0.85 | 0.8392 | 0.1920 | 0.0458 |
| 0.1666 | 87.0 | 609 | 0.3966 | 0.85 | 0.2593 | 0.9214 | 0.85 | 0.8392 | 0.1999 | 0.0458 |
| 0.1666 | 88.0 | 616 | 0.3967 | 0.85 | 0.2593 | 0.9214 | 0.85 | 0.8392 | 0.1920 | 0.0458 |
| 0.1666 | 89.0 | 623 | 0.3966 | 0.85 | 0.2593 | 0.9227 | 0.85 | 0.8392 | 0.2097 | 0.0458 |
| 0.1666 | 90.0 | 630 | 0.3967 | 0.85 | 0.2594 | 0.9219 | 0.85 | 0.8392 | 0.1919 | 0.0458 |
| 0.1666 | 91.0 | 637 | 0.3966 | 0.85 | 0.2593 | 0.9212 | 0.85 | 0.8392 | 0.1994 | 0.0458 |
| 0.1666 | 92.0 | 644 | 0.3966 | 0.85 | 0.2593 | 0.9227 | 0.85 | 0.8392 | 0.1919 | 0.0458 |
| 0.1666 | 93.0 | 651 | 0.3966 | 0.85 | 0.2593 | 0.9231 | 0.85 | 0.8392 | 0.2017 | 0.0458 |
| 0.1666 | 94.0 | 658 | 0.3967 | 0.85 | 0.2593 | 0.9220 | 0.85 | 0.8392 | 0.1919 | 0.0458 |
| 0.1666 | 95.0 | 665 | 0.3966 | 0.85 | 0.2593 | 0.9217 | 0.85 | 0.8392 | 0.1920 | 0.0457 |
| 0.1666 | 96.0 | 672 | 0.3966 | 0.85 | 0.2593 | 0.9218 | 0.85 | 0.8392 | 0.1920 | 0.0458 |
| 0.1666 | 97.0 | 679 | 0.3966 | 0.85 | 0.2593 | 0.9221 | 0.85 | 0.8392 | 0.1920 | 0.0458 |
| 0.1666 | 98.0 | 686 | 0.3966 | 0.85 | 0.2593 | 0.9224 | 0.85 | 0.8392 | 0.1920 | 0.0457 |
| 0.1666 | 99.0 | 693 | 0.3966 | 0.85 | 0.2593 | 0.9224 | 0.85 | 0.8392 | 0.1994 | 0.0457 |
| 0.1666 | 100.0 | 700 | 0.3966 | 0.85 | 0.2593 | 0.9223 | 0.85 | 0.8392 | 0.1994 | 0.0457 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-small_rvl_cdip_100_examples_per_class_kd_MSE
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_rvl_cdip_100_examples_per_class_kd_MSE
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4673
- Accuracy: 0.6425
- Brier Loss: 0.4763
- Nll: 3.0680
- F1 Micro: 0.6425
- F1 Macro: 0.6485
- Ece: 0.1946
- Aurc: 0.1381
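F1 Micro and F1 Macro above are the usual aggregates over the 16 document classes: micro-averaging counts every example equally (and equals accuracy for single-label problems), while macro-averaging takes an unweighted mean of per-class F1. A minimal scikit-learn sketch with toy class ids:
```python
from sklearn.metrics import f1_score

# Toy stand-ins for evaluation-set references and predictions (class ids 0..15).
y_true = [0, 3, 5, 5, 12, 7]
y_pred = [0, 3, 5, 2, 12, 7]

print(f1_score(y_true, y_pred, average="micro"))  # micro F1, equals accuracy here
print(f1_score(y_true, y_pred, average="macro"))  # macro F1, unweighted per-class mean
```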
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
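The `_kd_MSE` suffix suggests the student was distilled by matching the teacher's logits with a mean-squared-error term alongside the usual cross-entropy. The actual training loss is not reproduced in this card; a minimal PyTorch sketch of that kind of objective (the `alpha` weighting and the teacher detach are assumptions) is:

```python
import torch.nn.functional as F

def kd_mse_loss(student_logits, teacher_logits, labels, alpha=0.5):
    """Cross-entropy on the labels plus MSE between student and teacher logits."""
    ce = F.cross_entropy(student_logits, labels)
    mse = F.mse_loss(student_logits, teacher_logits.detach())
    return alpha * ce + (1.0 - alpha) * mse
```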
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 4.4851 | 0.06 | 0.9565 | 13.8276 | 0.06 | 0.0556 | 0.1688 | 0.9385 |
| No log | 2.0 | 50 | 3.5619 | 0.3775 | 0.7827 | 6.2649 | 0.3775 | 0.3611 | 0.2331 | 0.3882 |
| No log | 3.0 | 75 | 2.8990 | 0.5025 | 0.6453 | 4.7376 | 0.5025 | 0.4858 | 0.1689 | 0.2658 |
| No log | 4.0 | 100 | 2.5972 | 0.515 | 0.5980 | 4.4210 | 0.515 | 0.4895 | 0.1605 | 0.2249 |
| No log | 5.0 | 125 | 2.4353 | 0.56 | 0.5762 | 3.4885 | 0.56 | 0.5566 | 0.1548 | 0.2100 |
| No log | 6.0 | 150 | 2.4157 | 0.5475 | 0.5864 | 3.8261 | 0.5475 | 0.5323 | 0.1837 | 0.2167 |
| No log | 7.0 | 175 | 2.1786 | 0.6075 | 0.5203 | 3.4565 | 0.6075 | 0.6103 | 0.1403 | 0.1670 |
| No log | 8.0 | 200 | 2.1082 | 0.63 | 0.5040 | 3.3570 | 0.63 | 0.6246 | 0.1580 | 0.1530 |
| No log | 9.0 | 225 | 2.0472 | 0.625 | 0.5042 | 3.8572 | 0.625 | 0.6184 | 0.1552 | 0.1530 |
| No log | 10.0 | 250 | 2.0589 | 0.6025 | 0.5468 | 3.5723 | 0.6025 | 0.5982 | 0.1781 | 0.1785 |
| No log | 11.0 | 275 | 1.8965 | 0.65 | 0.4755 | 3.4466 | 0.65 | 0.6497 | 0.1605 | 0.1475 |
| No log | 12.0 | 300 | 1.9014 | 0.6325 | 0.5066 | 3.0881 | 0.6325 | 0.6359 | 0.1658 | 0.1591 |
| No log | 13.0 | 325 | 1.7904 | 0.6175 | 0.5162 | 3.4673 | 0.6175 | 0.6141 | 0.1525 | 0.1598 |
| No log | 14.0 | 350 | 1.8624 | 0.625 | 0.5173 | 3.6824 | 0.625 | 0.6179 | 0.1567 | 0.1624 |
| No log | 15.0 | 375 | 1.7083 | 0.6625 | 0.4817 | 3.1296 | 0.6625 | 0.6686 | 0.1651 | 0.1405 |
| No log | 16.0 | 400 | 1.8848 | 0.59 | 0.5478 | 4.3761 | 0.59 | 0.5913 | 0.2083 | 0.1696 |
| No log | 17.0 | 425 | 1.7238 | 0.6125 | 0.5229 | 3.1232 | 0.6125 | 0.6052 | 0.1833 | 0.1553 |
| No log | 18.0 | 450 | 1.7126 | 0.625 | 0.5152 | 2.9267 | 0.625 | 0.6284 | 0.1747 | 0.1565 |
| No log | 19.0 | 475 | 1.6459 | 0.6275 | 0.5024 | 2.9078 | 0.6275 | 0.6219 | 0.1766 | 0.1527 |
| 1.0542 | 20.0 | 500 | 1.6029 | 0.6275 | 0.4855 | 3.0931 | 0.6275 | 0.6316 | 0.1720 | 0.1414 |
| 1.0542 | 21.0 | 525 | 1.6566 | 0.6525 | 0.4847 | 3.0998 | 0.6525 | 0.6479 | 0.1558 | 0.1438 |
| 1.0542 | 22.0 | 550 | 1.6169 | 0.645 | 0.4894 | 3.0081 | 0.645 | 0.6471 | 0.1687 | 0.1400 |
| 1.0542 | 23.0 | 575 | 1.5322 | 0.6525 | 0.4557 | 3.3587 | 0.6525 | 0.6520 | 0.1428 | 0.1247 |
| 1.0542 | 24.0 | 600 | 1.5991 | 0.6475 | 0.4787 | 2.9349 | 0.6475 | 0.6444 | 0.1580 | 0.1450 |
| 1.0542 | 25.0 | 625 | 1.5625 | 0.6375 | 0.4926 | 3.0245 | 0.6375 | 0.6378 | 0.1641 | 0.1433 |
| 1.0542 | 26.0 | 650 | 1.5366 | 0.64 | 0.4884 | 3.3388 | 0.64 | 0.6461 | 0.1595 | 0.1453 |
| 1.0542 | 27.0 | 675 | 1.5686 | 0.65 | 0.4765 | 3.5120 | 0.65 | 0.6504 | 0.1625 | 0.1359 |
| 1.0542 | 28.0 | 700 | 1.5562 | 0.6475 | 0.4817 | 3.0348 | 0.6475 | 0.6488 | 0.1459 | 0.1388 |
| 1.0542 | 29.0 | 725 | 1.5213 | 0.6475 | 0.4719 | 3.2628 | 0.6475 | 0.6475 | 0.1634 | 0.1326 |
| 1.0542 | 30.0 | 750 | 1.5492 | 0.6675 | 0.4730 | 3.1693 | 0.6675 | 0.6679 | 0.1469 | 0.1415 |
| 1.0542 | 31.0 | 775 | 1.5311 | 0.65 | 0.4896 | 3.0881 | 0.65 | 0.6504 | 0.1815 | 0.1380 |
| 1.0542 | 32.0 | 800 | 1.5556 | 0.6475 | 0.4821 | 3.1829 | 0.6475 | 0.6491 | 0.1640 | 0.1405 |
| 1.0542 | 33.0 | 825 | 1.5471 | 0.6375 | 0.4846 | 3.4190 | 0.6375 | 0.6407 | 0.1628 | 0.1415 |
| 1.0542 | 34.0 | 850 | 1.4809 | 0.6575 | 0.4714 | 2.9136 | 0.6575 | 0.6612 | 0.1729 | 0.1338 |
| 1.0542 | 35.0 | 875 | 1.5256 | 0.66 | 0.4773 | 3.2303 | 0.66 | 0.6650 | 0.1746 | 0.1368 |
| 1.0542 | 36.0 | 900 | 1.4929 | 0.6675 | 0.4671 | 3.2360 | 0.6675 | 0.6698 | 0.1698 | 0.1309 |
| 1.0542 | 37.0 | 925 | 1.4923 | 0.645 | 0.4880 | 3.0567 | 0.645 | 0.6564 | 0.1764 | 0.1395 |
| 1.0542 | 38.0 | 950 | 1.5038 | 0.665 | 0.4672 | 3.2116 | 0.665 | 0.6661 | 0.1588 | 0.1343 |
| 1.0542 | 39.0 | 975 | 1.4708 | 0.6625 | 0.4669 | 3.1420 | 0.6625 | 0.6675 | 0.1683 | 0.1301 |
| 0.0522 | 40.0 | 1000 | 1.5153 | 0.6475 | 0.4865 | 3.1796 | 0.6475 | 0.6447 | 0.1639 | 0.1400 |
| 0.0522 | 41.0 | 1025 | 1.4705 | 0.6575 | 0.4642 | 3.2196 | 0.6575 | 0.6626 | 0.1440 | 0.1308 |
| 0.0522 | 42.0 | 1050 | 1.4844 | 0.6575 | 0.4722 | 3.2445 | 0.6575 | 0.6595 | 0.1746 | 0.1328 |
| 0.0522 | 43.0 | 1075 | 1.4957 | 0.6425 | 0.4828 | 3.1456 | 0.6425 | 0.6468 | 0.1499 | 0.1417 |
| 0.0522 | 44.0 | 1100 | 1.5179 | 0.645 | 0.4910 | 3.3921 | 0.645 | 0.6470 | 0.1861 | 0.1433 |
| 0.0522 | 45.0 | 1125 | 1.4878 | 0.6425 | 0.4839 | 3.2139 | 0.6425 | 0.6478 | 0.1720 | 0.1403 |
| 0.0522 | 46.0 | 1150 | 1.4666 | 0.655 | 0.4741 | 2.9333 | 0.655 | 0.6601 | 0.1813 | 0.1347 |
| 0.0522 | 47.0 | 1175 | 1.4954 | 0.6575 | 0.4776 | 3.2102 | 0.6575 | 0.6604 | 0.1842 | 0.1390 |
| 0.0522 | 48.0 | 1200 | 1.4976 | 0.645 | 0.4856 | 3.1539 | 0.645 | 0.6493 | 0.1549 | 0.1407 |
| 0.0522 | 49.0 | 1225 | 1.4772 | 0.64 | 0.4780 | 2.9845 | 0.64 | 0.6445 | 0.1826 | 0.1388 |
| 0.0522 | 50.0 | 1250 | 1.4584 | 0.65 | 0.4703 | 3.0776 | 0.65 | 0.6533 | 0.1685 | 0.1352 |
| 0.0522 | 51.0 | 1275 | 1.4828 | 0.6325 | 0.4844 | 3.1425 | 0.6325 | 0.6377 | 0.1641 | 0.1409 |
| 0.0522 | 52.0 | 1300 | 1.4676 | 0.6525 | 0.4737 | 3.1483 | 0.6525 | 0.6565 | 0.1773 | 0.1358 |
| 0.0522 | 53.0 | 1325 | 1.4675 | 0.6475 | 0.4791 | 3.1411 | 0.6475 | 0.6515 | 0.1820 | 0.1388 |
| 0.0522 | 54.0 | 1350 | 1.4724 | 0.645 | 0.4764 | 3.0744 | 0.645 | 0.6499 | 0.1847 | 0.1382 |
| 0.0522 | 55.0 | 1375 | 1.4689 | 0.6425 | 0.4769 | 3.2256 | 0.6425 | 0.6476 | 0.1839 | 0.1376 |
| 0.0522 | 56.0 | 1400 | 1.4660 | 0.6425 | 0.4760 | 2.9907 | 0.6425 | 0.6479 | 0.1906 | 0.1378 |
| 0.0522 | 57.0 | 1425 | 1.4663 | 0.645 | 0.4757 | 3.0722 | 0.645 | 0.6514 | 0.1705 | 0.1367 |
| 0.0522 | 58.0 | 1450 | 1.4678 | 0.65 | 0.4770 | 3.0710 | 0.65 | 0.6546 | 0.1794 | 0.1371 |
| 0.0522 | 59.0 | 1475 | 1.4717 | 0.64 | 0.4786 | 3.0737 | 0.64 | 0.6455 | 0.1889 | 0.1392 |
| 0.0064 | 60.0 | 1500 | 1.4691 | 0.645 | 0.4768 | 3.0688 | 0.645 | 0.6499 | 0.1815 | 0.1378 |
| 0.0064 | 61.0 | 1525 | 1.4689 | 0.64 | 0.4767 | 3.0688 | 0.64 | 0.6452 | 0.1846 | 0.1382 |
| 0.0064 | 62.0 | 1550 | 1.4689 | 0.64 | 0.4770 | 3.0674 | 0.64 | 0.6455 | 0.1937 | 0.1383 |
| 0.0064 | 63.0 | 1575 | 1.4687 | 0.6425 | 0.4767 | 3.0700 | 0.6425 | 0.6485 | 0.1897 | 0.1381 |
| 0.0064 | 64.0 | 1600 | 1.4674 | 0.6425 | 0.4764 | 3.0675 | 0.6425 | 0.6472 | 0.1855 | 0.1375 |
| 0.0064 | 65.0 | 1625 | 1.4681 | 0.6425 | 0.4766 | 3.0694 | 0.6425 | 0.6485 | 0.1917 | 0.1381 |
| 0.0064 | 66.0 | 1650 | 1.4681 | 0.6425 | 0.4766 | 3.0687 | 0.6425 | 0.6472 | 0.1905 | 0.1378 |
| 0.0064 | 67.0 | 1675 | 1.4667 | 0.645 | 0.4757 | 3.0681 | 0.645 | 0.6505 | 0.1899 | 0.1375 |
| 0.0064 | 68.0 | 1700 | 1.4683 | 0.6425 | 0.4771 | 3.0686 | 0.6425 | 0.6474 | 0.1871 | 0.1379 |
| 0.0064 | 69.0 | 1725 | 1.4672 | 0.64 | 0.4760 | 3.0679 | 0.64 | 0.6455 | 0.1932 | 0.1380 |
| 0.0064 | 70.0 | 1750 | 1.4673 | 0.6425 | 0.4763 | 3.0683 | 0.6425 | 0.6474 | 0.1955 | 0.1376 |
| 0.0064 | 71.0 | 1775 | 1.4676 | 0.645 | 0.4763 | 3.0680 | 0.645 | 0.6505 | 0.1921 | 0.1376 |
| 0.0064 | 72.0 | 1800 | 1.4674 | 0.6425 | 0.4763 | 3.0683 | 0.6425 | 0.6474 | 0.1946 | 0.1376 |
| 0.0064 | 73.0 | 1825 | 1.4675 | 0.6425 | 0.4763 | 3.0682 | 0.6425 | 0.6474 | 0.1946 | 0.1377 |
| 0.0064 | 74.0 | 1850 | 1.4674 | 0.6425 | 0.4763 | 3.0682 | 0.6425 | 0.6485 | 0.1945 | 0.1380 |
| 0.0064 | 75.0 | 1875 | 1.4674 | 0.64 | 0.4763 | 3.0680 | 0.64 | 0.6455 | 0.1960 | 0.1380 |
| 0.0064 | 76.0 | 1900 | 1.4675 | 0.64 | 0.4764 | 3.0682 | 0.64 | 0.6455 | 0.1972 | 0.1381 |
| 0.0064 | 77.0 | 1925 | 1.4675 | 0.6425 | 0.4763 | 3.0681 | 0.6425 | 0.6485 | 0.1947 | 0.1380 |
| 0.0064 | 78.0 | 1950 | 1.4674 | 0.6425 | 0.4763 | 3.0681 | 0.6425 | 0.6485 | 0.1958 | 0.1381 |
| 0.0064 | 79.0 | 1975 | 1.4674 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6474 | 0.1935 | 0.1376 |
| 0.0 | 80.0 | 2000 | 1.4673 | 0.6425 | 0.4763 | 3.0681 | 0.6425 | 0.6485 | 0.1958 | 0.1380 |
| 0.0 | 81.0 | 2025 | 1.4674 | 0.6425 | 0.4763 | 3.0681 | 0.6425 | 0.6485 | 0.1946 | 0.1380 |
| 0.0 | 82.0 | 2050 | 1.4673 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1935 | 0.1380 |
| 0.0 | 83.0 | 2075 | 1.4674 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1946 | 0.1381 |
| 0.0 | 84.0 | 2100 | 1.4674 | 0.6425 | 0.4763 | 3.0681 | 0.6425 | 0.6485 | 0.1958 | 0.1381 |
| 0.0 | 85.0 | 2125 | 1.4673 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1946 | 0.1381 |
| 0.0 | 86.0 | 2150 | 1.4673 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1946 | 0.1381 |
| 0.0 | 87.0 | 2175 | 1.4673 | 0.6425 | 0.4763 | 3.0681 | 0.6425 | 0.6485 | 0.1958 | 0.1381 |
| 0.0 | 88.0 | 2200 | 1.4673 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1946 | 0.1381 |
| 0.0 | 89.0 | 2225 | 1.4673 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1946 | 0.1381 |
| 0.0 | 90.0 | 2250 | 1.4673 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1946 | 0.1381 |
| 0.0 | 91.0 | 2275 | 1.4673 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1946 | 0.1381 |
| 0.0 | 92.0 | 2300 | 1.4673 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1946 | 0.1381 |
| 0.0 | 93.0 | 2325 | 1.4673 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1946 | 0.1381 |
| 0.0 | 94.0 | 2350 | 1.4673 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1909 | 0.1381 |
| 0.0 | 95.0 | 2375 | 1.4673 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1946 | 0.1381 |
| 0.0 | 96.0 | 2400 | 1.4673 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1946 | 0.1381 |
| 0.0 | 97.0 | 2425 | 1.4673 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1946 | 0.1381 |
| 0.0 | 98.0 | 2450 | 1.4673 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1946 | 0.1381 |
| 0.0 | 99.0 | 2475 | 1.4673 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1946 | 0.1381 |
| 0.0 | 100.0 | 2500 | 1.4673 | 0.6425 | 0.4763 | 3.0680 | 0.6425 | 0.6485 | 0.1946 | 0.1381 |
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.12.0
- Tokenizers 0.12.1
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-small_tobacco3482_kd_CEKD_t5.0_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_tobacco3482_kd_CEKD_t5.0_a0.7
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4918
- Accuracy: 0.85
- Brier Loss: 0.2583
- Nll: 1.0894
- F1 Micro: 0.85
- F1 Macro: 0.8374
- Ece: 0.1917
- Aurc: 0.0470
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
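The suffix `CEKD_t5.0_a0.7` presumably encodes a distillation temperature of 5.0 and a mixing weight of 0.7 between the cross-entropy and the softened KL term. The author's exact loss is not shown in this card; a conventional formulation with those values would look like:

```python
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, temperature=5.0, alpha=0.7):
    """alpha * CE(labels) + (1 - alpha) * T^2 * KL(teacher || student) on softened logits."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits.detach() / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha * ce + (1.0 - alpha) * kd
```

Which of the two terms `alpha` weights is an assumption here; the card only records the value, not the convention.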
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 1.8329 | 0.225 | 0.8761 | 5.2731 | 0.225 | 0.1384 | 0.2607 | 0.6977 |
| No log | 2.0 | 14 | 1.4785 | 0.405 | 0.7460 | 3.4067 | 0.405 | 0.2289 | 0.3097 | 0.4085 |
| No log | 3.0 | 21 | 1.0406 | 0.6 | 0.5725 | 1.8722 | 0.6 | 0.5345 | 0.3050 | 0.2010 |
| No log | 4.0 | 28 | 0.8087 | 0.725 | 0.4192 | 1.6096 | 0.7250 | 0.6767 | 0.2345 | 0.1149 |
| No log | 5.0 | 35 | 0.7666 | 0.735 | 0.3731 | 1.6189 | 0.735 | 0.7350 | 0.2377 | 0.1011 |
| No log | 6.0 | 42 | 0.6960 | 0.78 | 0.3413 | 1.5230 | 0.78 | 0.7592 | 0.2295 | 0.0868 |
| No log | 7.0 | 49 | 0.6490 | 0.805 | 0.3110 | 1.4861 | 0.805 | 0.7864 | 0.2138 | 0.0785 |
| No log | 8.0 | 56 | 0.6238 | 0.795 | 0.3069 | 1.2098 | 0.795 | 0.7816 | 0.2065 | 0.0698 |
| No log | 9.0 | 63 | 0.5755 | 0.83 | 0.2866 | 1.1943 | 0.83 | 0.8117 | 0.1937 | 0.0694 |
| No log | 10.0 | 70 | 0.6360 | 0.77 | 0.3164 | 1.2608 | 0.7700 | 0.7550 | 0.1785 | 0.0677 |
| No log | 11.0 | 77 | 0.6548 | 0.785 | 0.3335 | 1.4895 | 0.785 | 0.7707 | 0.2281 | 0.0885 |
| No log | 12.0 | 84 | 0.5847 | 0.805 | 0.3002 | 1.4317 | 0.805 | 0.7807 | 0.2264 | 0.0756 |
| No log | 13.0 | 91 | 0.5956 | 0.81 | 0.3040 | 1.2590 | 0.81 | 0.7928 | 0.2241 | 0.0556 |
| No log | 14.0 | 98 | 0.5692 | 0.81 | 0.3025 | 1.2119 | 0.81 | 0.8043 | 0.2235 | 0.0665 |
| No log | 15.0 | 105 | 0.5223 | 0.83 | 0.2762 | 1.1162 | 0.83 | 0.8221 | 0.1798 | 0.0552 |
| No log | 16.0 | 112 | 0.4981 | 0.84 | 0.2523 | 1.0864 | 0.8400 | 0.8372 | 0.1868 | 0.0396 |
| No log | 17.0 | 119 | 0.5207 | 0.805 | 0.2741 | 1.0416 | 0.805 | 0.7897 | 0.1960 | 0.0551 |
| No log | 18.0 | 126 | 0.5165 | 0.84 | 0.2723 | 1.1596 | 0.8400 | 0.8325 | 0.1942 | 0.0506 |
| No log | 19.0 | 133 | 0.4979 | 0.845 | 0.2573 | 1.2329 | 0.845 | 0.8297 | 0.1825 | 0.0444 |
| No log | 20.0 | 140 | 0.4953 | 0.855 | 0.2565 | 1.1213 | 0.855 | 0.8442 | 0.1844 | 0.0474 |
| No log | 21.0 | 147 | 0.5296 | 0.82 | 0.2792 | 1.0000 | 0.82 | 0.8218 | 0.1768 | 0.0523 |
| No log | 22.0 | 154 | 0.5027 | 0.835 | 0.2625 | 0.9926 | 0.835 | 0.8238 | 0.2035 | 0.0481 |
| No log | 23.0 | 161 | 0.5027 | 0.84 | 0.2642 | 1.0500 | 0.8400 | 0.8299 | 0.1616 | 0.0482 |
| No log | 24.0 | 168 | 0.5017 | 0.84 | 0.2616 | 1.0560 | 0.8400 | 0.8314 | 0.1819 | 0.0497 |
| No log | 25.0 | 175 | 0.4942 | 0.85 | 0.2594 | 1.1003 | 0.85 | 0.8407 | 0.1793 | 0.0483 |
| No log | 26.0 | 182 | 0.4943 | 0.83 | 0.2586 | 1.0436 | 0.83 | 0.8140 | 0.1869 | 0.0518 |
| No log | 27.0 | 189 | 0.4950 | 0.835 | 0.2613 | 1.0817 | 0.835 | 0.8224 | 0.2039 | 0.0504 |
| No log | 28.0 | 196 | 0.4957 | 0.85 | 0.2599 | 1.1109 | 0.85 | 0.8309 | 0.2058 | 0.0485 |
| No log | 29.0 | 203 | 0.4956 | 0.845 | 0.2599 | 1.0914 | 0.845 | 0.8304 | 0.1916 | 0.0492 |
| No log | 30.0 | 210 | 0.4893 | 0.84 | 0.2561 | 1.0890 | 0.8400 | 0.8214 | 0.2071 | 0.0482 |
| No log | 31.0 | 217 | 0.4920 | 0.835 | 0.2587 | 1.0907 | 0.835 | 0.8270 | 0.2031 | 0.0482 |
| No log | 32.0 | 224 | 0.4927 | 0.83 | 0.2601 | 1.0879 | 0.83 | 0.8157 | 0.2093 | 0.0500 |
| No log | 33.0 | 231 | 0.4925 | 0.835 | 0.2593 | 1.0886 | 0.835 | 0.8270 | 0.1810 | 0.0484 |
| No log | 34.0 | 238 | 0.4909 | 0.845 | 0.2578 | 1.0871 | 0.845 | 0.8304 | 0.1916 | 0.0478 |
| No log | 35.0 | 245 | 0.4927 | 0.845 | 0.2591 | 1.0866 | 0.845 | 0.8378 | 0.1943 | 0.0473 |
| No log | 36.0 | 252 | 0.4919 | 0.85 | 0.2581 | 1.0891 | 0.85 | 0.8342 | 0.2193 | 0.0475 |
| No log | 37.0 | 259 | 0.4908 | 0.845 | 0.2579 | 1.0867 | 0.845 | 0.8346 | 0.2215 | 0.0474 |
| No log | 38.0 | 266 | 0.4929 | 0.85 | 0.2590 | 1.0873 | 0.85 | 0.8407 | 0.1884 | 0.0471 |
| No log | 39.0 | 273 | 0.4913 | 0.85 | 0.2584 | 1.0861 | 0.85 | 0.8374 | 0.1944 | 0.0474 |
| No log | 40.0 | 280 | 0.4933 | 0.835 | 0.2595 | 1.0871 | 0.835 | 0.8248 | 0.1893 | 0.0491 |
| No log | 41.0 | 287 | 0.4936 | 0.84 | 0.2599 | 1.0863 | 0.8400 | 0.8276 | 0.1860 | 0.0486 |
| No log | 42.0 | 294 | 0.4911 | 0.85 | 0.2580 | 1.0861 | 0.85 | 0.8374 | 0.2186 | 0.0474 |
| No log | 43.0 | 301 | 0.4915 | 0.85 | 0.2581 | 1.0860 | 0.85 | 0.8374 | 0.2023 | 0.0475 |
| No log | 44.0 | 308 | 0.4921 | 0.85 | 0.2586 | 1.0874 | 0.85 | 0.8374 | 0.2013 | 0.0477 |
| No log | 45.0 | 315 | 0.4915 | 0.85 | 0.2583 | 1.0862 | 0.85 | 0.8374 | 0.1941 | 0.0475 |
| No log | 46.0 | 322 | 0.4918 | 0.85 | 0.2584 | 1.0878 | 0.85 | 0.8374 | 0.1852 | 0.0473 |
| No log | 47.0 | 329 | 0.4916 | 0.85 | 0.2583 | 1.0873 | 0.85 | 0.8374 | 0.2089 | 0.0473 |
| No log | 48.0 | 336 | 0.4921 | 0.85 | 0.2586 | 1.0879 | 0.85 | 0.8374 | 0.2026 | 0.0477 |
| No log | 49.0 | 343 | 0.4918 | 0.845 | 0.2584 | 1.0884 | 0.845 | 0.8282 | 0.1963 | 0.0478 |
| No log | 50.0 | 350 | 0.4922 | 0.85 | 0.2587 | 1.0871 | 0.85 | 0.8374 | 0.2102 | 0.0474 |
| No log | 51.0 | 357 | 0.4920 | 0.85 | 0.2585 | 1.0879 | 0.85 | 0.8374 | 0.2095 | 0.0474 |
| No log | 52.0 | 364 | 0.4926 | 0.85 | 0.2589 | 1.0878 | 0.85 | 0.8374 | 0.2022 | 0.0477 |
| No log | 53.0 | 371 | 0.4920 | 0.85 | 0.2586 | 1.0888 | 0.85 | 0.8374 | 0.2027 | 0.0475 |
| No log | 54.0 | 378 | 0.4921 | 0.85 | 0.2586 | 1.0886 | 0.85 | 0.8374 | 0.2020 | 0.0474 |
| No log | 55.0 | 385 | 0.4921 | 0.85 | 0.2587 | 1.0890 | 0.85 | 0.8374 | 0.1929 | 0.0471 |
| No log | 56.0 | 392 | 0.4925 | 0.85 | 0.2589 | 1.0881 | 0.85 | 0.8374 | 0.1946 | 0.0473 |
| No log | 57.0 | 399 | 0.4917 | 0.85 | 0.2583 | 1.0893 | 0.85 | 0.8374 | 0.1932 | 0.0472 |
| No log | 58.0 | 406 | 0.4921 | 0.85 | 0.2586 | 1.0877 | 0.85 | 0.8374 | 0.1948 | 0.0476 |
| No log | 59.0 | 413 | 0.4917 | 0.85 | 0.2583 | 1.0883 | 0.85 | 0.8374 | 0.1931 | 0.0472 |
| No log | 60.0 | 420 | 0.4918 | 0.85 | 0.2583 | 1.0882 | 0.85 | 0.8374 | 0.1945 | 0.0475 |
| No log | 61.0 | 427 | 0.4916 | 0.85 | 0.2582 | 1.0883 | 0.85 | 0.8374 | 0.1936 | 0.0472 |
| No log | 62.0 | 434 | 0.4920 | 0.85 | 0.2586 | 1.0882 | 0.85 | 0.8374 | 0.1942 | 0.0473 |
| No log | 63.0 | 441 | 0.4922 | 0.85 | 0.2587 | 1.0889 | 0.85 | 0.8374 | 0.1935 | 0.0473 |
| No log | 64.0 | 448 | 0.4921 | 0.85 | 0.2586 | 1.0885 | 0.85 | 0.8374 | 0.1848 | 0.0473 |
| No log | 65.0 | 455 | 0.4916 | 0.85 | 0.2582 | 1.0887 | 0.85 | 0.8374 | 0.1848 | 0.0474 |
| No log | 66.0 | 462 | 0.4917 | 0.85 | 0.2583 | 1.0883 | 0.85 | 0.8374 | 0.1849 | 0.0472 |
| No log | 67.0 | 469 | 0.4917 | 0.85 | 0.2584 | 1.0887 | 0.85 | 0.8374 | 0.1848 | 0.0472 |
| No log | 68.0 | 476 | 0.4920 | 0.85 | 0.2585 | 1.0888 | 0.85 | 0.8374 | 0.2011 | 0.0471 |
| No log | 69.0 | 483 | 0.4918 | 0.85 | 0.2584 | 1.0889 | 0.85 | 0.8374 | 0.2007 | 0.0471 |
| No log | 70.0 | 490 | 0.4919 | 0.85 | 0.2584 | 1.0886 | 0.85 | 0.8374 | 0.1848 | 0.0474 |
| No log | 71.0 | 497 | 0.4920 | 0.85 | 0.2585 | 1.0888 | 0.85 | 0.8374 | 0.1940 | 0.0474 |
| 0.1824 | 72.0 | 504 | 0.4919 | 0.85 | 0.2584 | 1.0889 | 0.85 | 0.8374 | 0.2011 | 0.0471 |
| 0.1824 | 73.0 | 511 | 0.4917 | 0.85 | 0.2583 | 1.0887 | 0.85 | 0.8374 | 0.1848 | 0.0472 |
| 0.1824 | 74.0 | 518 | 0.4920 | 0.85 | 0.2585 | 1.0890 | 0.85 | 0.8374 | 0.1848 | 0.0472 |
| 0.1824 | 75.0 | 525 | 0.4920 | 0.85 | 0.2585 | 1.0892 | 0.85 | 0.8374 | 0.1846 | 0.0472 |
| 0.1824 | 76.0 | 532 | 0.4918 | 0.85 | 0.2583 | 1.0889 | 0.85 | 0.8374 | 0.1930 | 0.0472 |
| 0.1824 | 77.0 | 539 | 0.4917 | 0.85 | 0.2582 | 1.0891 | 0.85 | 0.8374 | 0.2005 | 0.0472 |
| 0.1824 | 78.0 | 546 | 0.4919 | 0.85 | 0.2584 | 1.0892 | 0.85 | 0.8374 | 0.1928 | 0.0472 |
| 0.1824 | 79.0 | 553 | 0.4920 | 0.85 | 0.2585 | 1.0893 | 0.85 | 0.8374 | 0.1845 | 0.0473 |
| 0.1824 | 80.0 | 560 | 0.4919 | 0.85 | 0.2584 | 1.0890 | 0.85 | 0.8374 | 0.1929 | 0.0473 |
| 0.1824 | 81.0 | 567 | 0.4920 | 0.85 | 0.2585 | 1.0892 | 0.85 | 0.8374 | 0.1925 | 0.0471 |
| 0.1824 | 82.0 | 574 | 0.4920 | 0.85 | 0.2585 | 1.0895 | 0.85 | 0.8374 | 0.1844 | 0.0471 |
| 0.1824 | 83.0 | 581 | 0.4919 | 0.85 | 0.2584 | 1.0892 | 0.85 | 0.8374 | 0.1916 | 0.0471 |
| 0.1824 | 84.0 | 588 | 0.4918 | 0.85 | 0.2584 | 1.0890 | 0.85 | 0.8374 | 0.1926 | 0.0471 |
| 0.1824 | 85.0 | 595 | 0.4918 | 0.85 | 0.2584 | 1.0892 | 0.85 | 0.8374 | 0.1844 | 0.0471 |
| 0.1824 | 86.0 | 602 | 0.4918 | 0.85 | 0.2584 | 1.0893 | 0.85 | 0.8374 | 0.1927 | 0.0472 |
| 0.1824 | 87.0 | 609 | 0.4918 | 0.85 | 0.2584 | 1.0895 | 0.85 | 0.8374 | 0.1844 | 0.0471 |
| 0.1824 | 88.0 | 616 | 0.4918 | 0.85 | 0.2584 | 1.0892 | 0.85 | 0.8374 | 0.1844 | 0.0471 |
| 0.1824 | 89.0 | 623 | 0.4918 | 0.85 | 0.2583 | 1.0895 | 0.85 | 0.8374 | 0.1917 | 0.0471 |
| 0.1824 | 90.0 | 630 | 0.4919 | 0.85 | 0.2584 | 1.0892 | 0.85 | 0.8374 | 0.1998 | 0.0471 |
| 0.1824 | 91.0 | 637 | 0.4919 | 0.85 | 0.2584 | 1.0894 | 0.85 | 0.8374 | 0.1916 | 0.0471 |
| 0.1824 | 92.0 | 644 | 0.4918 | 0.85 | 0.2583 | 1.0895 | 0.85 | 0.8374 | 0.1917 | 0.0470 |
| 0.1824 | 93.0 | 651 | 0.4918 | 0.85 | 0.2583 | 1.0893 | 0.85 | 0.8374 | 0.1917 | 0.0471 |
| 0.1824 | 94.0 | 658 | 0.4918 | 0.85 | 0.2583 | 1.0894 | 0.85 | 0.8374 | 0.1844 | 0.0471 |
| 0.1824 | 95.0 | 665 | 0.4918 | 0.85 | 0.2583 | 1.0894 | 0.85 | 0.8374 | 0.1917 | 0.0470 |
| 0.1824 | 96.0 | 672 | 0.4918 | 0.85 | 0.2583 | 1.0894 | 0.85 | 0.8374 | 0.1917 | 0.0470 |
| 0.1824 | 97.0 | 679 | 0.4918 | 0.85 | 0.2583 | 1.0895 | 0.85 | 0.8374 | 0.1916 | 0.0471 |
| 0.1824 | 98.0 | 686 | 0.4918 | 0.85 | 0.2583 | 1.0895 | 0.85 | 0.8374 | 0.1917 | 0.0470 |
| 0.1824 | 99.0 | 693 | 0.4918 | 0.85 | 0.2583 | 1.0894 | 0.85 | 0.8374 | 0.1917 | 0.0470 |
| 0.1824 | 100.0 | 700 | 0.4918 | 0.85 | 0.2583 | 1.0894 | 0.85 | 0.8374 | 0.1917 | 0.0470 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-small_tobacco3482_kd_CEKD_t5.0_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_tobacco3482_kd_CEKD_t5.0_a0.9
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5373
- Accuracy: 0.85
- Brier Loss: 0.2432
- Nll: 1.1157
- F1 Micro: 0.85
- F1 Macro: 0.8450
- Ece: 0.1621
- Aurc: 0.0427
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
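The settings listed above map directly onto the standard Hugging Face `TrainingArguments`; a sketch of an equivalent configuration (argument names only, not the original training script, and the `output_dir` is a placeholder) is:

```python
from transformers import TrainingArguments

# Values copied from the hyperparameter list above; Adam betas/epsilon are the defaults.
training_args = TrainingArguments(
    output_dir="vit-small_tobacco3482_kd_CEKD_t5.0_a0.9",
    learning_rate=1e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```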
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 2.1036 | 0.215 | 0.8753 | 5.3195 | 0.2150 | 0.1264 | 0.2571 | 0.6923 |
| No log | 2.0 | 14 | 1.6952 | 0.405 | 0.7407 | 3.4929 | 0.405 | 0.2416 | 0.2907 | 0.4040 |
| No log | 3.0 | 21 | 1.1843 | 0.62 | 0.5633 | 2.0113 | 0.62 | 0.5725 | 0.2740 | 0.2014 |
| No log | 4.0 | 28 | 0.8797 | 0.71 | 0.4080 | 1.7043 | 0.7100 | 0.6683 | 0.2024 | 0.1125 |
| No log | 5.0 | 35 | 0.8570 | 0.715 | 0.3837 | 1.6476 | 0.715 | 0.7280 | 0.2189 | 0.1079 |
| No log | 6.0 | 42 | 0.7484 | 0.775 | 0.3285 | 1.5962 | 0.775 | 0.7668 | 0.1873 | 0.0816 |
| No log | 7.0 | 49 | 0.7337 | 0.79 | 0.3131 | 1.5377 | 0.79 | 0.7779 | 0.1904 | 0.0771 |
| No log | 8.0 | 56 | 0.6709 | 0.795 | 0.3012 | 1.2156 | 0.795 | 0.7776 | 0.1939 | 0.0761 |
| No log | 9.0 | 63 | 0.6901 | 0.795 | 0.3069 | 1.4725 | 0.795 | 0.7916 | 0.1882 | 0.0769 |
| No log | 10.0 | 70 | 0.7960 | 0.75 | 0.3586 | 1.4426 | 0.75 | 0.7406 | 0.1868 | 0.0976 |
| No log | 11.0 | 77 | 0.7489 | 0.77 | 0.3296 | 1.6202 | 0.7700 | 0.7794 | 0.2020 | 0.0878 |
| No log | 12.0 | 84 | 0.7068 | 0.785 | 0.3270 | 1.4127 | 0.785 | 0.7812 | 0.1922 | 0.0759 |
| No log | 13.0 | 91 | 0.6687 | 0.79 | 0.3050 | 1.3820 | 0.79 | 0.7945 | 0.1818 | 0.0625 |
| No log | 14.0 | 98 | 0.6052 | 0.79 | 0.2854 | 1.0602 | 0.79 | 0.7716 | 0.1702 | 0.0590 |
| No log | 15.0 | 105 | 0.6369 | 0.795 | 0.2959 | 1.0580 | 0.795 | 0.7953 | 0.1709 | 0.0603 |
| No log | 16.0 | 112 | 0.6204 | 0.81 | 0.2816 | 1.1886 | 0.81 | 0.8050 | 0.1657 | 0.0702 |
| No log | 17.0 | 119 | 0.5648 | 0.83 | 0.2475 | 1.2506 | 0.83 | 0.8241 | 0.1347 | 0.0612 |
| No log | 18.0 | 126 | 0.5849 | 0.83 | 0.2672 | 1.2245 | 0.83 | 0.8155 | 0.1646 | 0.0601 |
| No log | 19.0 | 133 | 0.5536 | 0.835 | 0.2475 | 1.0514 | 0.835 | 0.8254 | 0.1683 | 0.0531 |
| No log | 20.0 | 140 | 0.5689 | 0.835 | 0.2513 | 1.2369 | 0.835 | 0.8437 | 0.1722 | 0.0489 |
| No log | 21.0 | 147 | 0.5540 | 0.83 | 0.2485 | 1.2139 | 0.83 | 0.8165 | 0.1641 | 0.0608 |
| No log | 22.0 | 154 | 0.5352 | 0.835 | 0.2402 | 1.0108 | 0.835 | 0.8295 | 0.1408 | 0.0430 |
| No log | 23.0 | 161 | 0.5380 | 0.84 | 0.2403 | 1.2280 | 0.8400 | 0.8347 | 0.1405 | 0.0436 |
| No log | 24.0 | 168 | 0.5422 | 0.835 | 0.2471 | 1.0204 | 0.835 | 0.8324 | 0.1606 | 0.0445 |
| No log | 25.0 | 175 | 0.5342 | 0.85 | 0.2404 | 1.0767 | 0.85 | 0.8487 | 0.1469 | 0.0432 |
| No log | 26.0 | 182 | 0.5374 | 0.84 | 0.2429 | 1.0774 | 0.8400 | 0.8334 | 0.1420 | 0.0462 |
| No log | 27.0 | 189 | 0.5311 | 0.85 | 0.2395 | 1.0748 | 0.85 | 0.8487 | 0.1439 | 0.0446 |
| No log | 28.0 | 196 | 0.5298 | 0.85 | 0.2384 | 1.1337 | 0.85 | 0.8487 | 0.1570 | 0.0437 |
| No log | 29.0 | 203 | 0.5387 | 0.845 | 0.2435 | 1.1319 | 0.845 | 0.8424 | 0.1539 | 0.0458 |
| No log | 30.0 | 210 | 0.5361 | 0.85 | 0.2430 | 1.0648 | 0.85 | 0.8450 | 0.1679 | 0.0431 |
| No log | 31.0 | 217 | 0.5339 | 0.85 | 0.2413 | 1.0676 | 0.85 | 0.8487 | 0.1646 | 0.0428 |
| No log | 32.0 | 224 | 0.5345 | 0.85 | 0.2421 | 1.0709 | 0.85 | 0.8487 | 0.1476 | 0.0440 |
| No log | 33.0 | 231 | 0.5343 | 0.85 | 0.2421 | 1.1236 | 0.85 | 0.8450 | 0.1621 | 0.0431 |
| No log | 34.0 | 238 | 0.5353 | 0.845 | 0.2426 | 1.1244 | 0.845 | 0.8424 | 0.1710 | 0.0428 |
| No log | 35.0 | 245 | 0.5346 | 0.85 | 0.2423 | 1.0649 | 0.85 | 0.8487 | 0.1520 | 0.0440 |
| No log | 36.0 | 252 | 0.5356 | 0.855 | 0.2422 | 1.1241 | 0.855 | 0.8517 | 0.1814 | 0.0429 |
| No log | 37.0 | 259 | 0.5357 | 0.85 | 0.2426 | 1.1237 | 0.85 | 0.8450 | 0.1670 | 0.0425 |
| No log | 38.0 | 266 | 0.5356 | 0.845 | 0.2426 | 1.1226 | 0.845 | 0.8419 | 0.1607 | 0.0435 |
| No log | 39.0 | 273 | 0.5347 | 0.855 | 0.2420 | 1.0739 | 0.855 | 0.8517 | 0.1597 | 0.0427 |
| No log | 40.0 | 280 | 0.5356 | 0.855 | 0.2423 | 1.1203 | 0.855 | 0.8517 | 0.1676 | 0.0435 |
| No log | 41.0 | 287 | 0.5365 | 0.85 | 0.2431 | 1.1199 | 0.85 | 0.8450 | 0.1780 | 0.0429 |
| No log | 42.0 | 294 | 0.5356 | 0.85 | 0.2426 | 1.1173 | 0.85 | 0.8450 | 0.1653 | 0.0430 |
| No log | 43.0 | 301 | 0.5363 | 0.85 | 0.2428 | 1.1189 | 0.85 | 0.8450 | 0.1550 | 0.0435 |
| No log | 44.0 | 308 | 0.5345 | 0.85 | 0.2418 | 1.1193 | 0.85 | 0.8450 | 0.1590 | 0.0428 |
| No log | 45.0 | 315 | 0.5374 | 0.85 | 0.2435 | 1.1202 | 0.85 | 0.8450 | 0.1633 | 0.0435 |
| No log | 46.0 | 322 | 0.5355 | 0.85 | 0.2423 | 1.1183 | 0.85 | 0.8450 | 0.1564 | 0.0428 |
| No log | 47.0 | 329 | 0.5354 | 0.85 | 0.2425 | 1.1176 | 0.85 | 0.8450 | 0.1509 | 0.0429 |
| No log | 48.0 | 336 | 0.5369 | 0.85 | 0.2433 | 1.1177 | 0.85 | 0.8450 | 0.1517 | 0.0432 |
| No log | 49.0 | 343 | 0.5361 | 0.85 | 0.2428 | 1.1182 | 0.85 | 0.8450 | 0.1490 | 0.0428 |
| No log | 50.0 | 350 | 0.5364 | 0.85 | 0.2431 | 1.1179 | 0.85 | 0.8450 | 0.1654 | 0.0430 |
| No log | 51.0 | 357 | 0.5365 | 0.85 | 0.2428 | 1.1185 | 0.85 | 0.8450 | 0.1729 | 0.0432 |
| No log | 52.0 | 364 | 0.5364 | 0.85 | 0.2430 | 1.1165 | 0.85 | 0.8450 | 0.1614 | 0.0429 |
| No log | 53.0 | 371 | 0.5362 | 0.85 | 0.2429 | 1.1167 | 0.85 | 0.8450 | 0.1694 | 0.0430 |
| No log | 54.0 | 378 | 0.5369 | 0.85 | 0.2432 | 1.1170 | 0.85 | 0.8450 | 0.1597 | 0.0432 |
| No log | 55.0 | 385 | 0.5368 | 0.85 | 0.2430 | 1.1168 | 0.85 | 0.8450 | 0.1670 | 0.0429 |
| No log | 56.0 | 392 | 0.5367 | 0.85 | 0.2430 | 1.1180 | 0.85 | 0.8450 | 0.1619 | 0.0430 |
| No log | 57.0 | 399 | 0.5364 | 0.85 | 0.2429 | 1.1163 | 0.85 | 0.8450 | 0.1649 | 0.0429 |
| No log | 58.0 | 406 | 0.5364 | 0.85 | 0.2430 | 1.1156 | 0.85 | 0.8450 | 0.1611 | 0.0429 |
| No log | 59.0 | 413 | 0.5365 | 0.85 | 0.2428 | 1.1163 | 0.85 | 0.8450 | 0.1591 | 0.0429 |
| No log | 60.0 | 420 | 0.5364 | 0.85 | 0.2429 | 1.1155 | 0.85 | 0.8450 | 0.1588 | 0.0429 |
| No log | 61.0 | 427 | 0.5370 | 0.85 | 0.2432 | 1.1158 | 0.85 | 0.8450 | 0.1772 | 0.0432 |
| No log | 62.0 | 434 | 0.5367 | 0.85 | 0.2429 | 1.1167 | 0.85 | 0.8450 | 0.1622 | 0.0429 |
| No log | 63.0 | 441 | 0.5362 | 0.85 | 0.2428 | 1.1162 | 0.85 | 0.8450 | 0.1503 | 0.0428 |
| No log | 64.0 | 448 | 0.5372 | 0.85 | 0.2433 | 1.1161 | 0.85 | 0.8450 | 0.1616 | 0.0432 |
| No log | 65.0 | 455 | 0.5371 | 0.85 | 0.2431 | 1.1162 | 0.85 | 0.8450 | 0.1499 | 0.0429 |
| No log | 66.0 | 462 | 0.5367 | 0.85 | 0.2430 | 1.1160 | 0.85 | 0.8450 | 0.1591 | 0.0427 |
| No log | 67.0 | 469 | 0.5367 | 0.85 | 0.2430 | 1.1164 | 0.85 | 0.8450 | 0.1562 | 0.0428 |
| No log | 68.0 | 476 | 0.5368 | 0.85 | 0.2430 | 1.1168 | 0.85 | 0.8450 | 0.1556 | 0.0427 |
| No log | 69.0 | 483 | 0.5368 | 0.85 | 0.2431 | 1.1158 | 0.85 | 0.8450 | 0.1593 | 0.0428 |
| No log | 70.0 | 490 | 0.5372 | 0.85 | 0.2432 | 1.1162 | 0.85 | 0.8450 | 0.1628 | 0.0428 |
| No log | 71.0 | 497 | 0.5371 | 0.85 | 0.2432 | 1.1163 | 0.85 | 0.8450 | 0.1599 | 0.0429 |
| 0.1708 | 72.0 | 504 | 0.5370 | 0.85 | 0.2430 | 1.1161 | 0.85 | 0.8450 | 0.1559 | 0.0430 |
| 0.1708 | 73.0 | 511 | 0.5372 | 0.85 | 0.2433 | 1.1154 | 0.85 | 0.8450 | 0.1556 | 0.0428 |
| 0.1708 | 74.0 | 518 | 0.5370 | 0.85 | 0.2429 | 1.1165 | 0.85 | 0.8450 | 0.1540 | 0.0428 |
| 0.1708 | 75.0 | 525 | 0.5371 | 0.85 | 0.2431 | 1.1161 | 0.85 | 0.8450 | 0.1616 | 0.0427 |
| 0.1708 | 76.0 | 532 | 0.5369 | 0.85 | 0.2431 | 1.1161 | 0.85 | 0.8450 | 0.1619 | 0.0427 |
| 0.1708 | 77.0 | 539 | 0.5369 | 0.85 | 0.2430 | 1.1156 | 0.85 | 0.8450 | 0.1623 | 0.0429 |
| 0.1708 | 78.0 | 546 | 0.5372 | 0.85 | 0.2432 | 1.1158 | 0.85 | 0.8450 | 0.1619 | 0.0427 |
| 0.1708 | 79.0 | 553 | 0.5375 | 0.85 | 0.2433 | 1.1162 | 0.85 | 0.8450 | 0.1688 | 0.0429 |
| 0.1708 | 80.0 | 560 | 0.5372 | 0.85 | 0.2432 | 1.1160 | 0.85 | 0.8450 | 0.1623 | 0.0429 |
| 0.1708 | 81.0 | 567 | 0.5373 | 0.85 | 0.2432 | 1.1162 | 0.85 | 0.8450 | 0.1620 | 0.0428 |
| 0.1708 | 82.0 | 574 | 0.5374 | 0.85 | 0.2433 | 1.1160 | 0.85 | 0.8450 | 0.1622 | 0.0428 |
| 0.1708 | 83.0 | 581 | 0.5372 | 0.85 | 0.2432 | 1.1159 | 0.85 | 0.8450 | 0.1622 | 0.0428 |
| 0.1708 | 84.0 | 588 | 0.5371 | 0.85 | 0.2431 | 1.1157 | 0.85 | 0.8450 | 0.1621 | 0.0427 |
| 0.1708 | 85.0 | 595 | 0.5372 | 0.85 | 0.2432 | 1.1158 | 0.85 | 0.8450 | 0.1687 | 0.0426 |
| 0.1708 | 86.0 | 602 | 0.5372 | 0.85 | 0.2432 | 1.1157 | 0.85 | 0.8450 | 0.1619 | 0.0426 |
| 0.1708 | 87.0 | 609 | 0.5374 | 0.85 | 0.2432 | 1.1159 | 0.85 | 0.8450 | 0.1687 | 0.0428 |
| 0.1708 | 88.0 | 616 | 0.5373 | 0.85 | 0.2432 | 1.1160 | 0.85 | 0.8450 | 0.1620 | 0.0427 |
| 0.1708 | 89.0 | 623 | 0.5373 | 0.85 | 0.2432 | 1.1157 | 0.85 | 0.8450 | 0.1620 | 0.0427 |
| 0.1708 | 90.0 | 630 | 0.5373 | 0.85 | 0.2432 | 1.1156 | 0.85 | 0.8450 | 0.1620 | 0.0427 |
| 0.1708 | 91.0 | 637 | 0.5372 | 0.85 | 0.2432 | 1.1156 | 0.85 | 0.8450 | 0.1620 | 0.0427 |
| 0.1708 | 92.0 | 644 | 0.5373 | 0.85 | 0.2432 | 1.1157 | 0.85 | 0.8450 | 0.1620 | 0.0427 |
| 0.1708 | 93.0 | 651 | 0.5372 | 0.85 | 0.2432 | 1.1156 | 0.85 | 0.8450 | 0.1620 | 0.0427 |
| 0.1708 | 94.0 | 658 | 0.5373 | 0.85 | 0.2432 | 1.1158 | 0.85 | 0.8450 | 0.1620 | 0.0427 |
| 0.1708 | 95.0 | 665 | 0.5373 | 0.85 | 0.2432 | 1.1157 | 0.85 | 0.8450 | 0.1621 | 0.0427 |
| 0.1708 | 96.0 | 672 | 0.5372 | 0.85 | 0.2432 | 1.1157 | 0.85 | 0.8450 | 0.1621 | 0.0427 |
| 0.1708 | 97.0 | 679 | 0.5372 | 0.85 | 0.2432 | 1.1157 | 0.85 | 0.8450 | 0.1620 | 0.0427 |
| 0.1708 | 98.0 | 686 | 0.5373 | 0.85 | 0.2432 | 1.1157 | 0.85 | 0.8450 | 0.1621 | 0.0427 |
| 0.1708 | 99.0 | 693 | 0.5373 | 0.85 | 0.2432 | 1.1157 | 0.85 | 0.8450 | 0.1621 | 0.0427 |
| 0.1708 | 100.0 | 700 | 0.5373 | 0.85 | 0.2432 | 1.1157 | 0.85 | 0.8450 | 0.1621 | 0.0427 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-small_rvl_cdip_100_examples_per_class_simkd_CEKD_tNone_aNone_tNone_gNone
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_rvl_cdip_100_examples_per_class_simkd_CEKD_tNone_aNone_tNone_gNone
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0689
- Accuracy: 0.6
- Brier Loss: 0.6433
- Nll: 2.4057
- F1 Micro: 0.6
- F1 Macro: 0.6101
- Ece: 0.3353
- Aurc: 0.1685
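Like any `transformers` image-classification checkpoint, the model can be loaded with the standard auto classes for inference. A minimal sketch, assuming the repository id from this card is publicly available and using a placeholder image path:

```python
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "jordyvl/vit-small_rvl_cdip_100_examples_per_class_simkd_CEKD_tNone_aNone_tNone_gNone"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("scanned_page.png").convert("RGB")  # placeholder document image
inputs = processor(images=image, return_tensors="pt")
logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```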
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 0.0859 | 0.0675 | 0.9373 | 7.3238 | 0.0675 | 0.0163 | 0.1099 | 0.9351 |
| No log | 2.0 | 50 | 0.0810 | 0.0675 | 0.9372 | 7.0436 | 0.0675 | 0.0153 | 0.1067 | 0.9365 |
| No log | 3.0 | 75 | 0.0804 | 0.0725 | 0.9368 | 6.5507 | 0.0725 | 0.0268 | 0.1041 | 0.9438 |
| No log | 4.0 | 100 | 0.0800 | 0.0725 | 0.9362 | 6.2816 | 0.0725 | 0.0293 | 0.1056 | 0.9404 |
| No log | 5.0 | 125 | 0.0797 | 0.0775 | 0.9352 | 6.1624 | 0.0775 | 0.0225 | 0.1125 | 0.9037 |
| No log | 6.0 | 150 | 0.0793 | 0.0875 | 0.9337 | 6.0364 | 0.0875 | 0.0376 | 0.1173 | 0.8572 |
| No log | 7.0 | 175 | 0.0788 | 0.13 | 0.9307 | 4.5728 | 0.13 | 0.0918 | 0.1430 | 0.7693 |
| No log | 8.0 | 200 | 0.0781 | 0.2325 | 0.9246 | 3.6321 | 0.2325 | 0.1958 | 0.2225 | 0.5621 |
| No log | 9.0 | 225 | 0.0770 | 0.31 | 0.9103 | 3.3593 | 0.31 | 0.2693 | 0.2782 | 0.4570 |
| No log | 10.0 | 250 | 0.0755 | 0.34 | 0.8830 | 2.9550 | 0.34 | 0.2911 | 0.2951 | 0.4131 |
| No log | 11.0 | 275 | 0.0740 | 0.4075 | 0.8559 | 2.6844 | 0.4075 | 0.3802 | 0.3347 | 0.3241 |
| No log | 12.0 | 300 | 0.0730 | 0.47 | 0.8216 | 2.7315 | 0.47 | 0.4439 | 0.3582 | 0.2707 |
| No log | 13.0 | 325 | 0.0720 | 0.4925 | 0.7913 | 2.6641 | 0.4925 | 0.4606 | 0.3561 | 0.2588 |
| No log | 14.0 | 350 | 0.0717 | 0.4725 | 0.7854 | 2.7229 | 0.4725 | 0.4565 | 0.3296 | 0.2732 |
| No log | 15.0 | 375 | 0.0708 | 0.5125 | 0.7515 | 2.4866 | 0.5125 | 0.4890 | 0.3445 | 0.2379 |
| No log | 16.0 | 400 | 0.0704 | 0.5375 | 0.7424 | 2.4355 | 0.5375 | 0.5131 | 0.3525 | 0.2259 |
| No log | 17.0 | 425 | 0.0702 | 0.545 | 0.7259 | 2.5234 | 0.545 | 0.5227 | 0.3427 | 0.2199 |
| No log | 18.0 | 450 | 0.0696 | 0.545 | 0.7253 | 2.5796 | 0.545 | 0.5318 | 0.3471 | 0.2118 |
| No log | 19.0 | 475 | 0.0697 | 0.56 | 0.7163 | 2.3050 | 0.56 | 0.5547 | 0.3494 | 0.2048 |
| 0.0745 | 20.0 | 500 | 0.0692 | 0.565 | 0.7044 | 2.4019 | 0.565 | 0.5669 | 0.3598 | 0.1869 |
| 0.0745 | 21.0 | 525 | 0.0690 | 0.5775 | 0.6983 | 2.3271 | 0.5775 | 0.5805 | 0.3615 | 0.1906 |
| 0.0745 | 22.0 | 550 | 0.0689 | 0.58 | 0.6855 | 2.2368 | 0.58 | 0.5808 | 0.3572 | 0.1851 |
| 0.0745 | 23.0 | 575 | 0.0690 | 0.56 | 0.6905 | 2.4557 | 0.56 | 0.5709 | 0.3387 | 0.1925 |
| 0.0745 | 24.0 | 600 | 0.0688 | 0.57 | 0.6895 | 2.3632 | 0.57 | 0.5736 | 0.3516 | 0.1912 |
| 0.0745 | 25.0 | 625 | 0.0686 | 0.5775 | 0.6826 | 2.3272 | 0.5775 | 0.5838 | 0.3376 | 0.1802 |
| 0.0745 | 26.0 | 650 | 0.0689 | 0.5625 | 0.6886 | 2.2696 | 0.5625 | 0.5754 | 0.3445 | 0.1917 |
| 0.0745 | 27.0 | 675 | 0.0687 | 0.575 | 0.6765 | 2.3387 | 0.575 | 0.5800 | 0.3511 | 0.1861 |
| 0.0745 | 28.0 | 700 | 0.0689 | 0.5775 | 0.6785 | 2.3039 | 0.5775 | 0.5821 | 0.3546 | 0.1860 |
| 0.0745 | 29.0 | 725 | 0.0685 | 0.6 | 0.6720 | 2.4176 | 0.6 | 0.6013 | 0.3606 | 0.1750 |
| 0.0745 | 30.0 | 750 | 0.0685 | 0.5925 | 0.6690 | 2.2827 | 0.5925 | 0.5962 | 0.3646 | 0.1750 |
| 0.0745 | 31.0 | 775 | 0.0685 | 0.5825 | 0.6682 | 2.2957 | 0.5825 | 0.5885 | 0.3476 | 0.1771 |
| 0.0745 | 32.0 | 800 | 0.0687 | 0.585 | 0.6700 | 2.2669 | 0.585 | 0.5914 | 0.3428 | 0.1797 |
| 0.0745 | 33.0 | 825 | 0.0685 | 0.59 | 0.6652 | 2.3359 | 0.59 | 0.5927 | 0.3429 | 0.1775 |
| 0.0745 | 34.0 | 850 | 0.0686 | 0.5825 | 0.6717 | 2.3900 | 0.5825 | 0.5919 | 0.3453 | 0.1790 |
| 0.0745 | 35.0 | 875 | 0.0685 | 0.5875 | 0.6721 | 2.3131 | 0.5875 | 0.5932 | 0.3579 | 0.1799 |
| 0.0745 | 36.0 | 900 | 0.0686 | 0.5925 | 0.6625 | 2.3435 | 0.5925 | 0.6005 | 0.3441 | 0.1728 |
| 0.0745 | 37.0 | 925 | 0.0685 | 0.5875 | 0.6649 | 2.4475 | 0.5875 | 0.5885 | 0.3550 | 0.1756 |
| 0.0745 | 38.0 | 950 | 0.0685 | 0.5925 | 0.6607 | 2.2842 | 0.5925 | 0.5962 | 0.3410 | 0.1732 |
| 0.0745 | 39.0 | 975 | 0.0685 | 0.6 | 0.6605 | 2.2073 | 0.6 | 0.6083 | 0.3414 | 0.1708 |
| 0.0599 | 40.0 | 1000 | 0.0685 | 0.575 | 0.6578 | 2.3075 | 0.575 | 0.5788 | 0.3341 | 0.1773 |
| 0.0599 | 41.0 | 1025 | 0.0685 | 0.5975 | 0.6598 | 2.1562 | 0.5975 | 0.6067 | 0.3462 | 0.1685 |
| 0.0599 | 42.0 | 1050 | 0.0685 | 0.5925 | 0.6592 | 2.3363 | 0.5925 | 0.5999 | 0.3262 | 0.1733 |
| 0.0599 | 43.0 | 1075 | 0.0683 | 0.5925 | 0.6545 | 2.2970 | 0.5925 | 0.5975 | 0.3413 | 0.1741 |
| 0.0599 | 44.0 | 1100 | 0.0686 | 0.5975 | 0.6590 | 2.2220 | 0.5975 | 0.6061 | 0.3425 | 0.1698 |
| 0.0599 | 45.0 | 1125 | 0.0684 | 0.585 | 0.6563 | 2.2507 | 0.585 | 0.5876 | 0.3214 | 0.1795 |
| 0.0599 | 46.0 | 1150 | 0.0684 | 0.5975 | 0.6578 | 2.2677 | 0.5975 | 0.6082 | 0.3374 | 0.1712 |
| 0.0599 | 47.0 | 1175 | 0.0684 | 0.5925 | 0.6531 | 2.3091 | 0.5925 | 0.5974 | 0.3362 | 0.1716 |
| 0.0599 | 48.0 | 1200 | 0.0685 | 0.5825 | 0.6539 | 2.3803 | 0.5825 | 0.5901 | 0.3098 | 0.1790 |
| 0.0599 | 49.0 | 1225 | 0.0685 | 0.59 | 0.6518 | 2.1855 | 0.59 | 0.6001 | 0.3229 | 0.1759 |
| 0.0599 | 50.0 | 1250 | 0.0685 | 0.595 | 0.6513 | 2.3357 | 0.595 | 0.6004 | 0.3307 | 0.1711 |
| 0.0599 | 51.0 | 1275 | 0.0684 | 0.59 | 0.6499 | 2.3253 | 0.59 | 0.5968 | 0.3298 | 0.1708 |
| 0.0599 | 52.0 | 1300 | 0.0684 | 0.61 | 0.6500 | 2.3352 | 0.61 | 0.6196 | 0.3692 | 0.1687 |
| 0.0599 | 53.0 | 1325 | 0.0685 | 0.595 | 0.6518 | 2.2189 | 0.595 | 0.6036 | 0.3278 | 0.1735 |
| 0.0599 | 54.0 | 1350 | 0.0684 | 0.6025 | 0.6501 | 2.3238 | 0.6025 | 0.6114 | 0.3410 | 0.1668 |
| 0.0599 | 55.0 | 1375 | 0.0684 | 0.595 | 0.6479 | 2.2696 | 0.595 | 0.6022 | 0.3341 | 0.1719 |
| 0.0599 | 56.0 | 1400 | 0.0685 | 0.595 | 0.6496 | 2.3172 | 0.595 | 0.6008 | 0.3239 | 0.1720 |
| 0.0599 | 57.0 | 1425 | 0.0684 | 0.595 | 0.6476 | 2.2983 | 0.595 | 0.6023 | 0.3310 | 0.1667 |
| 0.0599 | 58.0 | 1450 | 0.0684 | 0.605 | 0.6483 | 2.2607 | 0.605 | 0.6140 | 0.3563 | 0.1660 |
| 0.0599 | 59.0 | 1475 | 0.0685 | 0.5975 | 0.6491 | 2.3956 | 0.5975 | 0.6091 | 0.3222 | 0.1691 |
| 0.0576 | 60.0 | 1500 | 0.0685 | 0.5925 | 0.6476 | 2.2049 | 0.5925 | 0.6032 | 0.3240 | 0.1716 |
| 0.0576 | 61.0 | 1525 | 0.0685 | 0.6 | 0.6482 | 2.3095 | 0.6 | 0.6068 | 0.3276 | 0.1703 |
| 0.0576 | 62.0 | 1550 | 0.0685 | 0.6025 | 0.6448 | 2.2755 | 0.6025 | 0.6101 | 0.3303 | 0.1673 |
| 0.0576 | 63.0 | 1575 | 0.0685 | 0.6 | 0.6480 | 2.3857 | 0.6 | 0.6078 | 0.3358 | 0.1687 |
| 0.0576 | 64.0 | 1600 | 0.0685 | 0.59 | 0.6465 | 2.3280 | 0.59 | 0.5990 | 0.3198 | 0.1705 |
| 0.0576 | 65.0 | 1625 | 0.0684 | 0.605 | 0.6438 | 2.3484 | 0.605 | 0.6125 | 0.3346 | 0.1651 |
| 0.0576 | 66.0 | 1650 | 0.0686 | 0.6 | 0.6462 | 2.2443 | 0.6 | 0.6084 | 0.3371 | 0.1706 |
| 0.0576 | 67.0 | 1675 | 0.0685 | 0.6025 | 0.6449 | 2.3717 | 0.6025 | 0.6115 | 0.3317 | 0.1674 |
| 0.0576 | 68.0 | 1700 | 0.0685 | 0.595 | 0.6449 | 2.3396 | 0.595 | 0.6003 | 0.3292 | 0.1676 |
| 0.0576 | 69.0 | 1725 | 0.0686 | 0.595 | 0.6460 | 2.3315 | 0.595 | 0.6047 | 0.3339 | 0.1683 |
| 0.0576 | 70.0 | 1750 | 0.0687 | 0.5975 | 0.6480 | 2.3967 | 0.5975 | 0.6070 | 0.3404 | 0.1702 |
| 0.0576 | 71.0 | 1775 | 0.0686 | 0.6 | 0.6456 | 2.3870 | 0.6 | 0.6095 | 0.3215 | 0.1689 |
| 0.0576 | 72.0 | 1800 | 0.0686 | 0.59 | 0.6455 | 2.3966 | 0.59 | 0.5985 | 0.3273 | 0.1691 |
| 0.0576 | 73.0 | 1825 | 0.0686 | 0.5875 | 0.6472 | 2.3619 | 0.5875 | 0.5975 | 0.3465 | 0.1711 |
| 0.0576 | 74.0 | 1850 | 0.0686 | 0.595 | 0.6436 | 2.4181 | 0.595 | 0.6054 | 0.3183 | 0.1706 |
| 0.0576 | 75.0 | 1875 | 0.0686 | 0.6 | 0.6440 | 2.4160 | 0.6 | 0.6077 | 0.3285 | 0.1677 |
| 0.0576 | 76.0 | 1900 | 0.0687 | 0.6025 | 0.6446 | 2.4184 | 0.6025 | 0.6111 | 0.3408 | 0.1685 |
| 0.0576 | 77.0 | 1925 | 0.0686 | 0.6025 | 0.6440 | 2.4208 | 0.6025 | 0.6111 | 0.3323 | 0.1670 |
| 0.0576 | 78.0 | 1950 | 0.0687 | 0.5975 | 0.6438 | 2.4236 | 0.5975 | 0.6063 | 0.3298 | 0.1689 |
| 0.0576 | 79.0 | 1975 | 0.0687 | 0.5975 | 0.6438 | 2.4521 | 0.5975 | 0.6057 | 0.3328 | 0.1692 |
| 0.0565 | 80.0 | 2000 | 0.0687 | 0.6 | 0.6448 | 2.4213 | 0.6 | 0.6088 | 0.3368 | 0.1682 |
| 0.0565 | 81.0 | 2025 | 0.0688 | 0.5975 | 0.6444 | 2.4257 | 0.5975 | 0.6076 | 0.3179 | 0.1681 |
| 0.0565 | 82.0 | 2050 | 0.0687 | 0.6 | 0.6446 | 2.4225 | 0.6 | 0.6102 | 0.3392 | 0.1673 |
| 0.0565 | 83.0 | 2075 | 0.0687 | 0.6 | 0.6437 | 2.4571 | 0.6 | 0.6091 | 0.3281 | 0.1681 |
| 0.0565 | 84.0 | 2100 | 0.0688 | 0.595 | 0.6439 | 2.4360 | 0.595 | 0.6042 | 0.3256 | 0.1685 |
| 0.0565 | 85.0 | 2125 | 0.0688 | 0.6 | 0.6436 | 2.4396 | 0.6 | 0.6104 | 0.3318 | 0.1683 |
| 0.0565 | 86.0 | 2150 | 0.0688 | 0.6 | 0.6434 | 2.3977 | 0.6 | 0.6095 | 0.3273 | 0.1675 |
| 0.0565 | 87.0 | 2175 | 0.0688 | 0.595 | 0.6432 | 2.4303 | 0.595 | 0.6053 | 0.3146 | 0.1687 |
| 0.0565 | 88.0 | 2200 | 0.0688 | 0.5975 | 0.6431 | 2.4222 | 0.5975 | 0.6071 | 0.3326 | 0.1686 |
| 0.0565 | 89.0 | 2225 | 0.0688 | 0.6 | 0.6440 | 2.4042 | 0.6 | 0.6108 | 0.3303 | 0.1678 |
| 0.0565 | 90.0 | 2250 | 0.0688 | 0.6 | 0.6433 | 2.3998 | 0.6 | 0.6096 | 0.3301 | 0.1679 |
| 0.0565 | 91.0 | 2275 | 0.0689 | 0.6 | 0.6434 | 2.4026 | 0.6 | 0.6108 | 0.3362 | 0.1680 |
| 0.0565 | 92.0 | 2300 | 0.0689 | 0.5975 | 0.6435 | 2.4037 | 0.5975 | 0.6083 | 0.3335 | 0.1680 |
| 0.0565 | 93.0 | 2325 | 0.0689 | 0.5975 | 0.6434 | 2.4060 | 0.5975 | 0.6077 | 0.3344 | 0.1679 |
| 0.0565 | 94.0 | 2350 | 0.0689 | 0.6 | 0.6433 | 2.4024 | 0.6 | 0.6106 | 0.3204 | 0.1683 |
| 0.0565 | 95.0 | 2375 | 0.0689 | 0.595 | 0.6432 | 2.4060 | 0.595 | 0.6052 | 0.3423 | 0.1684 |
| 0.0565 | 96.0 | 2400 | 0.0689 | 0.6 | 0.6432 | 2.4044 | 0.6 | 0.6101 | 0.3404 | 0.1684 |
| 0.0565 | 97.0 | 2425 | 0.0689 | 0.6 | 0.6434 | 2.4042 | 0.6 | 0.6101 | 0.3349 | 0.1683 |
| 0.0565 | 98.0 | 2450 | 0.0689 | 0.6 | 0.6432 | 2.4055 | 0.6 | 0.6101 | 0.3390 | 0.1684 |
| 0.0565 | 99.0 | 2475 | 0.0689 | 0.6 | 0.6433 | 2.4056 | 0.6 | 0.6101 | 0.3393 | 0.1685 |
| 0.056 | 100.0 | 2500 | 0.0689 | 0.6 | 0.6433 | 2.4057 | 0.6 | 0.6101 | 0.3353 | 0.1685 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-small_rvl_cdip_100_examples_per_class_kd_CEKD_t1.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_rvl_cdip_100_examples_per_class_kd_CEKD_t1.5_a0.5
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3067
- Accuracy: 0.64
- Brier Loss: 0.4889
- Nll: 2.7590
- F1 Micro: 0.64
- F1 Macro: 0.6422
- Ece: 0.1482
- Aurc: 0.1465
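AURC is the area under the risk-coverage curve (lower is better). The evaluation code is not included in this card, so the sketch below uses one common formulation in which examples are accepted in order of decreasing confidence:

```python
import numpy as np

def aurc(confidences, correct):
    """Area under the risk-coverage curve.

    confidences: (N,) max softmax probability per example
    correct:     (N,) 1.0 if the prediction was right, else 0.0
    Risk is the running error rate as coverage grows; the curve is
    averaged over all coverage levels.
    """
    order = np.argsort(-confidences)
    errors = 1.0 - correct[order]
    coverage_risks = np.cumsum(errors) / np.arange(1, len(errors) + 1)
    return coverage_risks.mean()
```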
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 3.5436 | 0.1275 | 0.9288 | 15.5432 | 0.1275 | 0.1271 | 0.1597 | 0.8570 |
| No log | 2.0 | 50 | 2.6686 | 0.4025 | 0.7453 | 9.6119 | 0.4025 | 0.3632 | 0.1957 | 0.3597 |
| No log | 3.0 | 75 | 2.0708 | 0.495 | 0.6252 | 5.5129 | 0.495 | 0.4799 | 0.1581 | 0.2478 |
| No log | 4.0 | 100 | 1.8472 | 0.5475 | 0.5792 | 4.3917 | 0.5475 | 0.5504 | 0.1665 | 0.2138 |
| No log | 5.0 | 125 | 1.8657 | 0.535 | 0.6173 | 4.2639 | 0.535 | 0.5241 | 0.1890 | 0.2311 |
| No log | 6.0 | 150 | 1.7791 | 0.5725 | 0.5777 | 3.7697 | 0.5725 | 0.5672 | 0.1634 | 0.2157 |
| No log | 7.0 | 175 | 1.7957 | 0.555 | 0.5997 | 4.3973 | 0.555 | 0.5219 | 0.1885 | 0.2118 |
| No log | 8.0 | 200 | 1.7306 | 0.56 | 0.5858 | 4.3403 | 0.56 | 0.5499 | 0.1808 | 0.2076 |
| No log | 9.0 | 225 | 1.9129 | 0.55 | 0.6072 | 5.3639 | 0.55 | 0.5253 | 0.2106 | 0.2034 |
| No log | 10.0 | 250 | 1.9057 | 0.565 | 0.6050 | 4.7359 | 0.565 | 0.5514 | 0.2051 | 0.2211 |
| No log | 11.0 | 275 | 1.8169 | 0.5825 | 0.5990 | 4.2043 | 0.5825 | 0.5787 | 0.2048 | 0.2099 |
| No log | 12.0 | 300 | 1.9194 | 0.55 | 0.6387 | 3.9608 | 0.55 | 0.5457 | 0.2246 | 0.2475 |
| No log | 13.0 | 325 | 1.7830 | 0.585 | 0.5961 | 3.9468 | 0.585 | 0.5717 | 0.1971 | 0.2235 |
| No log | 14.0 | 350 | 1.8241 | 0.5575 | 0.6112 | 3.6498 | 0.5575 | 0.5554 | 0.2123 | 0.2116 |
| No log | 15.0 | 375 | 1.8344 | 0.58 | 0.5950 | 3.9880 | 0.58 | 0.5741 | 0.1872 | 0.2168 |
| No log | 16.0 | 400 | 1.8909 | 0.57 | 0.5987 | 4.6112 | 0.57 | 0.5596 | 0.2096 | 0.2100 |
| No log | 17.0 | 425 | 1.6662 | 0.585 | 0.5645 | 4.0403 | 0.585 | 0.5752 | 0.2000 | 0.1872 |
| No log | 18.0 | 450 | 1.5986 | 0.6175 | 0.5315 | 3.8888 | 0.6175 | 0.6162 | 0.1724 | 0.1660 |
| No log | 19.0 | 475 | 1.5392 | 0.5925 | 0.5593 | 2.8593 | 0.5925 | 0.5823 | 0.2056 | 0.1777 |
| 0.718 | 20.0 | 500 | 1.5257 | 0.595 | 0.5386 | 3.5024 | 0.595 | 0.5817 | 0.1909 | 0.1680 |
| 0.718 | 21.0 | 525 | 1.6699 | 0.6125 | 0.5570 | 3.9342 | 0.6125 | 0.6121 | 0.2006 | 0.1898 |
| 0.718 | 22.0 | 550 | 1.5804 | 0.605 | 0.5542 | 3.7562 | 0.605 | 0.5828 | 0.1888 | 0.1826 |
| 0.718 | 23.0 | 575 | 1.5580 | 0.6025 | 0.5407 | 3.4731 | 0.6025 | 0.5877 | 0.1780 | 0.1693 |
| 0.718 | 24.0 | 600 | 1.5693 | 0.58 | 0.5717 | 3.1009 | 0.58 | 0.5830 | 0.1954 | 0.2041 |
| 0.718 | 25.0 | 625 | 1.6368 | 0.57 | 0.5826 | 3.7067 | 0.57 | 0.5684 | 0.2027 | 0.2116 |
| 0.718 | 26.0 | 650 | 1.3959 | 0.635 | 0.5018 | 3.1312 | 0.635 | 0.6342 | 0.1814 | 0.1544 |
| 0.718 | 27.0 | 675 | 1.4555 | 0.635 | 0.5130 | 3.1374 | 0.635 | 0.6344 | 0.1733 | 0.1727 |
| 0.718 | 28.0 | 700 | 1.5010 | 0.605 | 0.5361 | 3.6647 | 0.605 | 0.6030 | 0.1811 | 0.1725 |
| 0.718 | 29.0 | 725 | 1.6266 | 0.585 | 0.5777 | 3.1233 | 0.585 | 0.5757 | 0.1955 | 0.1965 |
| 0.718 | 30.0 | 750 | 1.4467 | 0.635 | 0.5196 | 3.3019 | 0.635 | 0.6371 | 0.1856 | 0.1759 |
| 0.718 | 31.0 | 775 | 1.5051 | 0.6 | 0.5439 | 3.5968 | 0.6 | 0.5950 | 0.2020 | 0.1776 |
| 0.718 | 32.0 | 800 | 1.3890 | 0.6325 | 0.5001 | 3.2391 | 0.6325 | 0.6310 | 0.1639 | 0.1502 |
| 0.718 | 33.0 | 825 | 1.4150 | 0.6075 | 0.5208 | 3.4287 | 0.6075 | 0.6102 | 0.1862 | 0.1667 |
| 0.718 | 34.0 | 850 | 1.3743 | 0.6125 | 0.5133 | 3.0028 | 0.6125 | 0.6123 | 0.1927 | 0.1585 |
| 0.718 | 35.0 | 875 | 1.3564 | 0.6325 | 0.4960 | 2.8056 | 0.6325 | 0.6344 | 0.1624 | 0.1490 |
| 0.718 | 36.0 | 900 | 1.3634 | 0.6325 | 0.5005 | 2.5056 | 0.6325 | 0.6352 | 0.1808 | 0.1513 |
| 0.718 | 37.0 | 925 | 1.3707 | 0.62 | 0.4991 | 3.2196 | 0.62 | 0.6209 | 0.1509 | 0.1530 |
| 0.718 | 38.0 | 950 | 1.3311 | 0.635 | 0.4937 | 2.8078 | 0.635 | 0.6383 | 0.1645 | 0.1478 |
| 0.718 | 39.0 | 975 | 1.2896 | 0.635 | 0.4838 | 2.7910 | 0.635 | 0.6319 | 0.1524 | 0.1420 |
| 0.0894 | 40.0 | 1000 | 1.3209 | 0.65 | 0.4935 | 2.7909 | 0.65 | 0.6523 | 0.1674 | 0.1442 |
| 0.0894 | 41.0 | 1025 | 1.3280 | 0.6525 | 0.4903 | 2.9461 | 0.6525 | 0.6536 | 0.1645 | 0.1457 |
| 0.0894 | 42.0 | 1050 | 1.3220 | 0.65 | 0.4893 | 2.9579 | 0.65 | 0.6505 | 0.1577 | 0.1480 |
| 0.0894 | 43.0 | 1075 | 1.3155 | 0.6425 | 0.4912 | 2.8699 | 0.6425 | 0.6465 | 0.1479 | 0.1461 |
| 0.0894 | 44.0 | 1100 | 1.3243 | 0.6375 | 0.4946 | 2.9297 | 0.6375 | 0.6393 | 0.1624 | 0.1494 |
| 0.0894 | 45.0 | 1125 | 1.3123 | 0.645 | 0.4891 | 2.8813 | 0.645 | 0.6464 | 0.1710 | 0.1443 |
| 0.0894 | 46.0 | 1150 | 1.3051 | 0.6425 | 0.4859 | 2.8460 | 0.6425 | 0.6434 | 0.1570 | 0.1431 |
| 0.0894 | 47.0 | 1175 | 1.3082 | 0.645 | 0.4871 | 2.7740 | 0.645 | 0.6460 | 0.1740 | 0.1449 |
| 0.0894 | 48.0 | 1200 | 1.3026 | 0.6475 | 0.4849 | 2.7773 | 0.6475 | 0.6505 | 0.1800 | 0.1440 |
| 0.0894 | 49.0 | 1225 | 1.3141 | 0.6375 | 0.4895 | 2.7660 | 0.6375 | 0.6396 | 0.1737 | 0.1463 |
| 0.0894 | 50.0 | 1250 | 1.3147 | 0.6325 | 0.4879 | 2.7744 | 0.6325 | 0.6351 | 0.1609 | 0.1450 |
| 0.0894 | 51.0 | 1275 | 1.3080 | 0.64 | 0.4883 | 2.7668 | 0.64 | 0.6423 | 0.1636 | 0.1450 |
| 0.0894 | 52.0 | 1300 | 1.3087 | 0.6425 | 0.4890 | 2.8436 | 0.6425 | 0.6448 | 0.1520 | 0.1462 |
| 0.0894 | 53.0 | 1325 | 1.3101 | 0.64 | 0.4888 | 2.7708 | 0.64 | 0.6415 | 0.1602 | 0.1452 |
| 0.0894 | 54.0 | 1350 | 1.3181 | 0.6425 | 0.4927 | 2.8450 | 0.6425 | 0.6446 | 0.1732 | 0.1490 |
| 0.0894 | 55.0 | 1375 | 1.3144 | 0.6375 | 0.4915 | 2.7718 | 0.6375 | 0.6399 | 0.1542 | 0.1473 |
| 0.0894 | 56.0 | 1400 | 1.3138 | 0.645 | 0.4923 | 2.6836 | 0.645 | 0.6476 | 0.1721 | 0.1471 |
| 0.0894 | 57.0 | 1425 | 1.3156 | 0.645 | 0.4920 | 2.7653 | 0.645 | 0.6468 | 0.1642 | 0.1470 |
| 0.0894 | 58.0 | 1450 | 1.3161 | 0.6425 | 0.4919 | 2.7644 | 0.6425 | 0.6450 | 0.1617 | 0.1472 |
| 0.0894 | 59.0 | 1475 | 1.3069 | 0.6375 | 0.4877 | 2.7658 | 0.6375 | 0.6396 | 0.1635 | 0.1455 |
| 0.0506 | 60.0 | 1500 | 1.3109 | 0.645 | 0.4904 | 2.8426 | 0.645 | 0.6464 | 0.1605 | 0.1467 |
| 0.0506 | 61.0 | 1525 | 1.3111 | 0.6425 | 0.4893 | 2.7618 | 0.6425 | 0.6446 | 0.1704 | 0.1461 |
| 0.0506 | 62.0 | 1550 | 1.3053 | 0.6425 | 0.4884 | 2.7648 | 0.6425 | 0.6449 | 0.1602 | 0.1457 |
| 0.0506 | 63.0 | 1575 | 1.3097 | 0.64 | 0.4887 | 2.7618 | 0.64 | 0.6423 | 0.1632 | 0.1463 |
| 0.0506 | 64.0 | 1600 | 1.3106 | 0.645 | 0.4912 | 2.7681 | 0.645 | 0.6473 | 0.1688 | 0.1469 |
| 0.0506 | 65.0 | 1625 | 1.3095 | 0.64 | 0.4902 | 2.7589 | 0.64 | 0.6419 | 0.1560 | 0.1468 |
| 0.0506 | 66.0 | 1650 | 1.3073 | 0.645 | 0.4895 | 2.7642 | 0.645 | 0.6473 | 0.1800 | 0.1463 |
| 0.0506 | 67.0 | 1675 | 1.3041 | 0.64 | 0.4880 | 2.7619 | 0.64 | 0.6424 | 0.1670 | 0.1454 |
| 0.0506 | 68.0 | 1700 | 1.3062 | 0.64 | 0.4887 | 2.7623 | 0.64 | 0.6423 | 0.1671 | 0.1466 |
| 0.0506 | 69.0 | 1725 | 1.3075 | 0.64 | 0.4888 | 2.7628 | 0.64 | 0.6424 | 0.1533 | 0.1459 |
| 0.0506 | 70.0 | 1750 | 1.3089 | 0.64 | 0.4898 | 2.7607 | 0.64 | 0.6425 | 0.1805 | 0.1466 |
| 0.0506 | 71.0 | 1775 | 1.3068 | 0.64 | 0.4889 | 2.7600 | 0.64 | 0.6424 | 0.1592 | 0.1458 |
| 0.0506 | 72.0 | 1800 | 1.3076 | 0.6425 | 0.4894 | 2.7599 | 0.6425 | 0.6451 | 0.1766 | 0.1461 |
| 0.0506 | 73.0 | 1825 | 1.3071 | 0.6425 | 0.4890 | 2.7609 | 0.6425 | 0.6451 | 0.1538 | 0.1460 |
| 0.0506 | 74.0 | 1850 | 1.3062 | 0.64 | 0.4887 | 2.7601 | 0.64 | 0.6422 | 0.1678 | 0.1461 |
| 0.0506 | 75.0 | 1875 | 1.3076 | 0.6425 | 0.4891 | 2.7598 | 0.6425 | 0.6451 | 0.1660 | 0.1461 |
| 0.0506 | 76.0 | 1900 | 1.3067 | 0.6425 | 0.4890 | 2.7607 | 0.6425 | 0.6450 | 0.1510 | 0.1461 |
| 0.0506 | 77.0 | 1925 | 1.3073 | 0.6425 | 0.4891 | 2.7596 | 0.6425 | 0.6451 | 0.1558 | 0.1461 |
| 0.0506 | 78.0 | 1950 | 1.3075 | 0.6425 | 0.4894 | 2.7612 | 0.6425 | 0.6451 | 0.1604 | 0.1461 |
| 0.0506 | 79.0 | 1975 | 1.3071 | 0.6425 | 0.4889 | 2.7602 | 0.6425 | 0.6452 | 0.1575 | 0.1460 |
| 0.0486 | 80.0 | 2000 | 1.3065 | 0.6425 | 0.4889 | 2.7599 | 0.6425 | 0.6450 | 0.1451 | 0.1461 |
| 0.0486 | 81.0 | 2025 | 1.3066 | 0.6425 | 0.4889 | 2.7594 | 0.6425 | 0.6451 | 0.1532 | 0.1460 |
| 0.0486 | 82.0 | 2050 | 1.3069 | 0.64 | 0.4891 | 2.7599 | 0.64 | 0.6424 | 0.1468 | 0.1463 |
| 0.0486 | 83.0 | 2075 | 1.3068 | 0.64 | 0.4889 | 2.7599 | 0.64 | 0.6422 | 0.1551 | 0.1466 |
| 0.0486 | 84.0 | 2100 | 1.3067 | 0.64 | 0.4889 | 2.7592 | 0.64 | 0.6424 | 0.1445 | 0.1463 |
| 0.0486 | 85.0 | 2125 | 1.3065 | 0.64 | 0.4889 | 2.7591 | 0.64 | 0.6422 | 0.1506 | 0.1465 |
| 0.0486 | 86.0 | 2150 | 1.3067 | 0.64 | 0.4889 | 2.7589 | 0.64 | 0.6422 | 0.1637 | 0.1465 |
| 0.0486 | 87.0 | 2175 | 1.3069 | 0.64 | 0.4889 | 2.7592 | 0.64 | 0.6422 | 0.1530 | 0.1465 |
| 0.0486 | 88.0 | 2200 | 1.3069 | 0.64 | 0.4890 | 2.7591 | 0.64 | 0.6422 | 0.1503 | 0.1465 |
| 0.0486 | 89.0 | 2225 | 1.3067 | 0.64 | 0.4889 | 2.7592 | 0.64 | 0.6422 | 0.1547 | 0.1464 |
| 0.0486 | 90.0 | 2250 | 1.3069 | 0.64 | 0.4890 | 2.7592 | 0.64 | 0.6422 | 0.1477 | 0.1465 |
| 0.0486 | 91.0 | 2275 | 1.3067 | 0.64 | 0.4889 | 2.7590 | 0.64 | 0.6422 | 0.1508 | 0.1465 |
| 0.0486 | 92.0 | 2300 | 1.3066 | 0.64 | 0.4888 | 2.7591 | 0.64 | 0.6422 | 0.1484 | 0.1464 |
| 0.0486 | 93.0 | 2325 | 1.3068 | 0.64 | 0.4889 | 2.7588 | 0.64 | 0.6422 | 0.1485 | 0.1465 |
| 0.0486 | 94.0 | 2350 | 1.3067 | 0.64 | 0.4889 | 2.7590 | 0.64 | 0.6422 | 0.1482 | 0.1465 |
| 0.0486 | 95.0 | 2375 | 1.3068 | 0.64 | 0.4889 | 2.7589 | 0.64 | 0.6422 | 0.1482 | 0.1465 |
| 0.0486 | 96.0 | 2400 | 1.3067 | 0.64 | 0.4889 | 2.7589 | 0.64 | 0.6422 | 0.1482 | 0.1464 |
| 0.0486 | 97.0 | 2425 | 1.3068 | 0.64 | 0.4889 | 2.7590 | 0.64 | 0.6422 | 0.1482 | 0.1465 |
| 0.0486 | 98.0 | 2450 | 1.3067 | 0.64 | 0.4889 | 2.7589 | 0.64 | 0.6422 | 0.1482 | 0.1464 |
| 0.0486 | 99.0 | 2475 | 1.3067 | 0.64 | 0.4889 | 2.7589 | 0.64 | 0.6422 | 0.1482 | 0.1465 |
| 0.0484 | 100.0 | 2500 | 1.3067 | 0.64 | 0.4889 | 2.7590 | 0.64 | 0.6422 | 0.1482 | 0.1465 |
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.12.0
- Tokenizers 0.12.1
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-small_tobacco3482_simkd_CEKD_tNone_aNone_tNone_gNone
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_tobacco3482_simkd_CEKD_tNone_aNone_tNone_gNone
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0379
- Accuracy: 0.8
- Brier Loss: 0.6938
- Nll: 1.3290
- F1 Micro: 0.8000
- F1 Macro: 0.7859
- Ece: 0.5869
- Aurc: 0.0931
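Brier Loss here is the multi-class Brier score, i.e. the squared error between the predicted probability vector and the one-hot label, averaged over examples. The exact reduction used by the evaluation script is not given, so this is only the standard formulation:

```python
import numpy as np

def brier_score(probs, labels):
    """Mean over examples of the squared error between probabilities and one-hot labels."""
    one_hot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - one_hot) ** 2, axis=1))
```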
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 0.0506 | 0.09 | 0.8991 | 6.5155 | 0.09 | 0.0484 | 0.1622 | 0.8986 |
| No log | 2.0 | 50 | 0.0468 | 0.22 | 0.8982 | 4.6950 | 0.22 | 0.1025 | 0.2491 | 0.7656 |
| No log | 3.0 | 75 | 0.0463 | 0.29 | 0.8969 | 3.3099 | 0.29 | 0.1676 | 0.2924 | 0.6888 |
| No log | 4.0 | 100 | 0.0459 | 0.37 | 0.8954 | 3.2920 | 0.37 | 0.1891 | 0.3517 | 0.4208 |
| No log | 5.0 | 125 | 0.0455 | 0.395 | 0.8929 | 3.2550 | 0.395 | 0.2299 | 0.3759 | 0.3617 |
| No log | 6.0 | 150 | 0.0449 | 0.49 | 0.8885 | 2.9109 | 0.49 | 0.3135 | 0.4396 | 0.2804 |
| No log | 7.0 | 175 | 0.0441 | 0.495 | 0.8796 | 2.8950 | 0.495 | 0.3248 | 0.4360 | 0.2721 |
| No log | 8.0 | 200 | 0.0430 | 0.545 | 0.8619 | 2.5199 | 0.545 | 0.3771 | 0.4777 | 0.2129 |
| No log | 9.0 | 225 | 0.0418 | 0.62 | 0.8382 | 2.2126 | 0.62 | 0.4291 | 0.5298 | 0.1659 |
| No log | 10.0 | 250 | 0.0409 | 0.645 | 0.8137 | 2.2525 | 0.645 | 0.4947 | 0.5293 | 0.1552 |
| No log | 11.0 | 275 | 0.0401 | 0.68 | 0.7863 | 2.4423 | 0.68 | 0.5145 | 0.5433 | 0.1215 |
| No log | 12.0 | 300 | 0.0392 | 0.68 | 0.7628 | 1.9779 | 0.68 | 0.5373 | 0.5402 | 0.1172 |
| No log | 13.0 | 325 | 0.0385 | 0.745 | 0.7350 | 1.8986 | 0.745 | 0.6126 | 0.5806 | 0.0843 |
| No log | 14.0 | 350 | 0.0384 | 0.735 | 0.7268 | 1.9922 | 0.735 | 0.6451 | 0.5466 | 0.0997 |
| No log | 15.0 | 375 | 0.0381 | 0.745 | 0.7180 | 1.6965 | 0.745 | 0.6627 | 0.5586 | 0.0761 |
| No log | 16.0 | 400 | 0.0377 | 0.805 | 0.7031 | 1.2564 | 0.805 | 0.7353 | 0.6034 | 0.0713 |
| No log | 17.0 | 425 | 0.0389 | 0.745 | 0.7303 | 1.5063 | 0.745 | 0.7192 | 0.5779 | 0.0705 |
| No log | 18.0 | 450 | 0.0387 | 0.765 | 0.7219 | 1.5776 | 0.765 | 0.7703 | 0.5815 | 0.0923 |
| No log | 19.0 | 475 | 0.0383 | 0.805 | 0.7213 | 1.3953 | 0.805 | 0.7906 | 0.6159 | 0.0667 |
| 0.0432 | 20.0 | 500 | 0.0377 | 0.835 | 0.6952 | 1.3075 | 0.835 | 0.8271 | 0.6116 | 0.0799 |
| 0.0432 | 21.0 | 525 | 0.0381 | 0.795 | 0.7018 | 1.6184 | 0.795 | 0.7723 | 0.5851 | 0.0880 |
| 0.0432 | 22.0 | 550 | 0.0378 | 0.81 | 0.6984 | 1.4292 | 0.81 | 0.7950 | 0.6103 | 0.0673 |
| 0.0432 | 23.0 | 575 | 0.0380 | 0.805 | 0.6976 | 1.4852 | 0.805 | 0.7951 | 0.5942 | 0.0808 |
| 0.0432 | 24.0 | 600 | 0.0377 | 0.825 | 0.6907 | 1.4501 | 0.825 | 0.8103 | 0.6020 | 0.0774 |
| 0.0432 | 25.0 | 625 | 0.0377 | 0.83 | 0.6920 | 1.4509 | 0.83 | 0.8148 | 0.6038 | 0.0759 |
| 0.0432 | 26.0 | 650 | 0.0377 | 0.825 | 0.6927 | 1.4113 | 0.825 | 0.8114 | 0.6072 | 0.0765 |
| 0.0432 | 27.0 | 675 | 0.0377 | 0.825 | 0.6924 | 1.4044 | 0.825 | 0.8114 | 0.6057 | 0.0757 |
| 0.0432 | 28.0 | 700 | 0.0377 | 0.82 | 0.6932 | 1.4521 | 0.82 | 0.8061 | 0.6017 | 0.0815 |
| 0.0432 | 29.0 | 725 | 0.0377 | 0.82 | 0.6932 | 1.3593 | 0.82 | 0.8080 | 0.5983 | 0.0794 |
| 0.0432 | 30.0 | 750 | 0.0377 | 0.82 | 0.6926 | 1.3437 | 0.82 | 0.8069 | 0.6042 | 0.0819 |
| 0.0432 | 31.0 | 775 | 0.0377 | 0.815 | 0.6932 | 1.3453 | 0.815 | 0.8027 | 0.5988 | 0.0815 |
| 0.0432 | 32.0 | 800 | 0.0377 | 0.82 | 0.6930 | 1.3384 | 0.82 | 0.8029 | 0.6044 | 0.0855 |
| 0.0432 | 33.0 | 825 | 0.0377 | 0.81 | 0.6928 | 1.3969 | 0.81 | 0.7927 | 0.5929 | 0.0835 |
| 0.0432 | 34.0 | 850 | 0.0378 | 0.805 | 0.6927 | 1.3995 | 0.805 | 0.7886 | 0.5961 | 0.0855 |
| 0.0432 | 35.0 | 875 | 0.0377 | 0.81 | 0.6927 | 1.3705 | 0.81 | 0.7979 | 0.5910 | 0.0887 |
| 0.0432 | 36.0 | 900 | 0.0378 | 0.805 | 0.6930 | 1.3566 | 0.805 | 0.7886 | 0.5850 | 0.0817 |
| 0.0432 | 37.0 | 925 | 0.0377 | 0.82 | 0.6927 | 1.3537 | 0.82 | 0.8022 | 0.5936 | 0.0847 |
| 0.0432 | 38.0 | 950 | 0.0377 | 0.815 | 0.6930 | 1.3574 | 0.815 | 0.7978 | 0.5976 | 0.0854 |
| 0.0432 | 39.0 | 975 | 0.0377 | 0.815 | 0.6932 | 1.4599 | 0.815 | 0.7978 | 0.5955 | 0.0864 |
| 0.035 | 40.0 | 1000 | 0.0377 | 0.815 | 0.6926 | 1.4147 | 0.815 | 0.7978 | 0.5990 | 0.0869 |
| 0.035 | 41.0 | 1025 | 0.0377 | 0.81 | 0.6931 | 1.4065 | 0.81 | 0.7943 | 0.5966 | 0.0844 |
| 0.035 | 42.0 | 1050 | 0.0378 | 0.81 | 0.6929 | 1.4678 | 0.81 | 0.7961 | 0.5902 | 0.0891 |
| 0.035 | 43.0 | 1075 | 0.0378 | 0.81 | 0.6927 | 1.4164 | 0.81 | 0.7971 | 0.5951 | 0.0897 |
| 0.035 | 44.0 | 1100 | 0.0378 | 0.81 | 0.6930 | 1.4646 | 0.81 | 0.7961 | 0.5948 | 0.0875 |
| 0.035 | 45.0 | 1125 | 0.0378 | 0.815 | 0.6921 | 1.4660 | 0.815 | 0.8004 | 0.6024 | 0.0895 |
| 0.035 | 46.0 | 1150 | 0.0378 | 0.81 | 0.6929 | 1.4098 | 0.81 | 0.7961 | 0.5987 | 0.0831 |
| 0.035 | 47.0 | 1175 | 0.0378 | 0.815 | 0.6928 | 1.4634 | 0.815 | 0.8004 | 0.5963 | 0.0911 |
| 0.035 | 48.0 | 1200 | 0.0378 | 0.81 | 0.6932 | 1.4648 | 0.81 | 0.7961 | 0.5841 | 0.0877 |
| 0.035 | 49.0 | 1225 | 0.0378 | 0.81 | 0.6928 | 1.4635 | 0.81 | 0.7961 | 0.5955 | 0.0898 |
| 0.035 | 50.0 | 1250 | 0.0378 | 0.805 | 0.6935 | 1.4688 | 0.805 | 0.7882 | 0.5795 | 0.0902 |
| 0.035 | 51.0 | 1275 | 0.0378 | 0.805 | 0.6928 | 1.4665 | 0.805 | 0.7882 | 0.5848 | 0.0916 |
| 0.035 | 52.0 | 1300 | 0.0378 | 0.81 | 0.6925 | 1.4249 | 0.81 | 0.7961 | 0.5869 | 0.0926 |
| 0.035 | 53.0 | 1325 | 0.0378 | 0.815 | 0.6926 | 1.4150 | 0.815 | 0.8021 | 0.5934 | 0.0913 |
| 0.035 | 54.0 | 1350 | 0.0378 | 0.81 | 0.6929 | 1.4155 | 0.81 | 0.7961 | 0.5943 | 0.0913 |
| 0.035 | 55.0 | 1375 | 0.0378 | 0.805 | 0.6928 | 1.4141 | 0.805 | 0.7882 | 0.5934 | 0.0964 |
| 0.035 | 56.0 | 1400 | 0.0378 | 0.805 | 0.6930 | 1.4124 | 0.805 | 0.7882 | 0.5926 | 0.0958 |
| 0.035 | 57.0 | 1425 | 0.0378 | 0.81 | 0.6935 | 1.4116 | 0.81 | 0.7934 | 0.6002 | 0.0895 |
| 0.035 | 58.0 | 1450 | 0.0378 | 0.805 | 0.6928 | 1.4059 | 0.805 | 0.7882 | 0.5890 | 0.0937 |
| 0.035 | 59.0 | 1475 | 0.0378 | 0.805 | 0.6929 | 1.4141 | 0.805 | 0.7882 | 0.5918 | 0.0967 |
| 0.0348 | 60.0 | 1500 | 0.0378 | 0.81 | 0.6935 | 1.4086 | 0.81 | 0.7934 | 0.5915 | 0.0934 |
| 0.0348 | 61.0 | 1525 | 0.0378 | 0.81 | 0.6930 | 1.4105 | 0.81 | 0.7941 | 0.5954 | 0.0961 |
| 0.0348 | 62.0 | 1550 | 0.0378 | 0.81 | 0.6933 | 1.4166 | 0.81 | 0.7941 | 0.5889 | 0.0954 |
| 0.0348 | 63.0 | 1575 | 0.0378 | 0.81 | 0.6933 | 1.4109 | 0.81 | 0.7934 | 0.5963 | 0.0975 |
| 0.0348 | 64.0 | 1600 | 0.0378 | 0.81 | 0.6932 | 1.4131 | 0.81 | 0.7934 | 0.5980 | 0.0953 |
| 0.0348 | 65.0 | 1625 | 0.0378 | 0.81 | 0.6937 | 1.4182 | 0.81 | 0.7934 | 0.5956 | 0.0970 |
| 0.0348 | 66.0 | 1650 | 0.0378 | 0.805 | 0.6933 | 1.4125 | 0.805 | 0.7893 | 0.5905 | 0.0966 |
| 0.0348 | 67.0 | 1675 | 0.0378 | 0.81 | 0.6937 | 1.4136 | 0.81 | 0.7934 | 0.5965 | 0.0975 |
| 0.0348 | 68.0 | 1700 | 0.0379 | 0.81 | 0.6935 | 1.4137 | 0.81 | 0.7934 | 0.5994 | 0.0971 |
| 0.0348 | 69.0 | 1725 | 0.0378 | 0.805 | 0.6935 | 1.4196 | 0.805 | 0.7893 | 0.5913 | 0.0946 |
| 0.0348 | 70.0 | 1750 | 0.0379 | 0.805 | 0.6933 | 1.4129 | 0.805 | 0.7893 | 0.5877 | 0.0945 |
| 0.0348 | 71.0 | 1775 | 0.0379 | 0.805 | 0.6933 | 1.4172 | 0.805 | 0.7893 | 0.5921 | 0.0951 |
| 0.0348 | 72.0 | 1800 | 0.0379 | 0.805 | 0.6931 | 1.4136 | 0.805 | 0.7893 | 0.5851 | 0.0953 |
| 0.0348 | 73.0 | 1825 | 0.0379 | 0.805 | 0.6929 | 1.4168 | 0.805 | 0.7893 | 0.5846 | 0.0971 |
| 0.0348 | 74.0 | 1850 | 0.0379 | 0.805 | 0.6939 | 1.4185 | 0.805 | 0.7893 | 0.5892 | 0.0950 |
| 0.0348 | 75.0 | 1875 | 0.0379 | 0.805 | 0.6935 | 1.4171 | 0.805 | 0.7893 | 0.5946 | 0.0938 |
| 0.0348 | 76.0 | 1900 | 0.0379 | 0.805 | 0.6934 | 1.4217 | 0.805 | 0.7893 | 0.5939 | 0.0959 |
| 0.0348 | 77.0 | 1925 | 0.0379 | 0.8 | 0.6932 | 1.4162 | 0.8000 | 0.7859 | 0.5826 | 0.0954 |
| 0.0348 | 78.0 | 1950 | 0.0379 | 0.8 | 0.6935 | 1.4172 | 0.8000 | 0.7859 | 0.5912 | 0.0950 |
| 0.0348 | 79.0 | 1975 | 0.0379 | 0.8 | 0.6933 | 1.4169 | 0.8000 | 0.7859 | 0.5885 | 0.0964 |
| 0.0348 | 80.0 | 2000 | 0.0379 | 0.8 | 0.6935 | 1.4196 | 0.8000 | 0.7859 | 0.5865 | 0.0957 |
| 0.0348 | 81.0 | 2025 | 0.0379 | 0.8 | 0.6937 | 1.4213 | 0.8000 | 0.7859 | 0.5880 | 0.0962 |
| 0.0348 | 82.0 | 2050 | 0.0379 | 0.8 | 0.6939 | 1.4201 | 0.8000 | 0.7859 | 0.5910 | 0.0962 |
| 0.0348 | 83.0 | 2075 | 0.0379 | 0.8 | 0.6938 | 1.3762 | 0.8000 | 0.7859 | 0.5883 | 0.0945 |
| 0.0348 | 84.0 | 2100 | 0.0379 | 0.8 | 0.6938 | 1.4218 | 0.8000 | 0.7859 | 0.5947 | 0.0950 |
| 0.0348 | 85.0 | 2125 | 0.0379 | 0.8 | 0.6935 | 1.3657 | 0.8000 | 0.7859 | 0.5857 | 0.0912 |
| 0.0348 | 86.0 | 2150 | 0.0379 | 0.8 | 0.6938 | 1.3278 | 0.8000 | 0.7859 | 0.5892 | 0.0929 |
| 0.0348 | 87.0 | 2175 | 0.0379 | 0.8 | 0.6938 | 1.3831 | 0.8000 | 0.7859 | 0.5856 | 0.0946 |
| 0.0348 | 88.0 | 2200 | 0.0379 | 0.8 | 0.6938 | 1.3761 | 0.8000 | 0.7859 | 0.5892 | 0.0955 |
| 0.0348 | 89.0 | 2225 | 0.0379 | 0.8 | 0.6938 | 1.3296 | 0.8000 | 0.7859 | 0.5870 | 0.0947 |
| 0.0348 | 90.0 | 2250 | 0.0379 | 0.8 | 0.6939 | 1.3667 | 0.8000 | 0.7859 | 0.5909 | 0.0926 |
| 0.0348 | 91.0 | 2275 | 0.0379 | 0.8 | 0.6940 | 1.3346 | 0.8000 | 0.7859 | 0.5906 | 0.0930 |
| 0.0348 | 92.0 | 2300 | 0.0379 | 0.8 | 0.6938 | 1.3268 | 0.8000 | 0.7859 | 0.5870 | 0.0936 |
| 0.0348 | 93.0 | 2325 | 0.0379 | 0.8 | 0.6937 | 1.3320 | 0.8000 | 0.7859 | 0.5919 | 0.0939 |
| 0.0348 | 94.0 | 2350 | 0.0379 | 0.8 | 0.6939 | 1.3324 | 0.8000 | 0.7859 | 0.5870 | 0.0928 |
| 0.0348 | 95.0 | 2375 | 0.0379 | 0.8 | 0.6937 | 1.3289 | 0.8000 | 0.7859 | 0.5869 | 0.0932 |
| 0.0348 | 96.0 | 2400 | 0.0379 | 0.8 | 0.6938 | 1.3264 | 0.8000 | 0.7859 | 0.5869 | 0.0931 |
| 0.0348 | 97.0 | 2425 | 0.0379 | 0.8 | 0.6938 | 1.3280 | 0.8000 | 0.7859 | 0.5870 | 0.0932 |
| 0.0348 | 98.0 | 2450 | 0.0379 | 0.8 | 0.6938 | 1.3297 | 0.8000 | 0.7859 | 0.5869 | 0.0930 |
| 0.0348 | 99.0 | 2475 | 0.0379 | 0.8 | 0.6938 | 1.3304 | 0.8000 | 0.7859 | 0.5869 | 0.0929 |
| 0.0347 | 100.0 | 2500 | 0.0379 | 0.8 | 0.6938 | 1.3290 | 0.8000 | 0.7859 | 0.5869 | 0.0931 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-small_rvl_cdip_100_examples_per_class_kd_CEKD_t1.5_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_rvl_cdip_100_examples_per_class_kd_CEKD_t1.5_a0.7
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2544
- Accuracy: 0.6375
- Brier Loss: 0.4805
- Nll: 3.0517
- F1 Micro: 0.6375
- F1 Macro: 0.6394
- Ece: 0.1654
- Aurc: 0.1376
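The run name suggests knowledge distillation with a combined cross-entropy / KD objective at temperature 1.5 and mixing weight 0.7. The actual loss is not documented in this card, but a common formulation of that kind of objective looks like the sketch below; the interpretation of `t` and `a` is an assumption based on the run name alone.

```python
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, T=1.5, alpha=0.7):
    """Hypothetical CE + KD mixture matching the run name; the original
    training code may weight or scale the two terms differently."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # standard T^2 rescaling of the soft-target term
    return alpha * ce + (1.0 - alpha) * kd
```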
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 3.2176 | 0.1275 | 0.9297 | 15.5568 | 0.1275 | 0.1255 | 0.1544 | 0.8595 |
| No log | 2.0 | 50 | 2.4392 | 0.405 | 0.7503 | 9.6083 | 0.405 | 0.3723 | 0.1816 | 0.3640 |
| No log | 3.0 | 75 | 1.9211 | 0.5025 | 0.6287 | 5.6023 | 0.5025 | 0.4930 | 0.1991 | 0.2451 |
| No log | 4.0 | 100 | 1.7474 | 0.5375 | 0.5956 | 4.5712 | 0.5375 | 0.5387 | 0.1677 | 0.2244 |
| No log | 5.0 | 125 | 1.7107 | 0.535 | 0.6051 | 4.3431 | 0.535 | 0.5180 | 0.1796 | 0.2269 |
| No log | 6.0 | 150 | 1.7144 | 0.545 | 0.5988 | 3.6699 | 0.545 | 0.5455 | 0.1918 | 0.2253 |
| No log | 7.0 | 175 | 1.9096 | 0.5625 | 0.6262 | 4.6856 | 0.5625 | 0.5459 | 0.1966 | 0.2362 |
| No log | 8.0 | 200 | 1.6325 | 0.575 | 0.5815 | 3.9279 | 0.575 | 0.5705 | 0.1893 | 0.2026 |
| No log | 9.0 | 225 | 1.8268 | 0.56 | 0.6088 | 4.5140 | 0.56 | 0.5482 | 0.1976 | 0.2213 |
| No log | 10.0 | 250 | 1.9253 | 0.5575 | 0.6493 | 4.2860 | 0.5575 | 0.5427 | 0.2286 | 0.2445 |
| No log | 11.0 | 275 | 1.6941 | 0.5725 | 0.5940 | 3.9317 | 0.5725 | 0.5827 | 0.2019 | 0.2232 |
| No log | 12.0 | 300 | 1.8197 | 0.5575 | 0.6138 | 4.7928 | 0.5575 | 0.5476 | 0.2079 | 0.2240 |
| No log | 13.0 | 325 | 1.8958 | 0.54 | 0.6508 | 4.2978 | 0.54 | 0.5338 | 0.2379 | 0.2357 |
| No log | 14.0 | 350 | 1.8939 | 0.535 | 0.6522 | 4.5557 | 0.535 | 0.5143 | 0.2324 | 0.2350 |
| No log | 15.0 | 375 | 1.8018 | 0.585 | 0.6042 | 4.4728 | 0.585 | 0.5829 | 0.2205 | 0.2182 |
| No log | 16.0 | 400 | 1.7645 | 0.5975 | 0.5978 | 3.9939 | 0.5975 | 0.5992 | 0.2130 | 0.1927 |
| No log | 17.0 | 425 | 1.6392 | 0.5925 | 0.5842 | 3.6783 | 0.5925 | 0.6039 | 0.1986 | 0.2017 |
| No log | 18.0 | 450 | 1.6124 | 0.5875 | 0.5761 | 4.0535 | 0.5875 | 0.5721 | 0.2060 | 0.1792 |
| No log | 19.0 | 475 | 1.7517 | 0.585 | 0.6102 | 3.9076 | 0.585 | 0.5786 | 0.2082 | 0.2071 |
| 0.6436 | 20.0 | 500 | 1.7467 | 0.5575 | 0.6166 | 3.5052 | 0.5575 | 0.5476 | 0.2252 | 0.2247 |
| 0.6436 | 21.0 | 525 | 1.6719 | 0.5825 | 0.5745 | 4.1235 | 0.5825 | 0.5877 | 0.1831 | 0.1723 |
| 0.6436 | 22.0 | 550 | 1.4222 | 0.605 | 0.5237 | 3.2051 | 0.605 | 0.6083 | 0.1813 | 0.1559 |
| 0.6436 | 23.0 | 575 | 1.6436 | 0.595 | 0.5701 | 4.3949 | 0.595 | 0.5834 | 0.1921 | 0.1901 |
| 0.6436 | 24.0 | 600 | 1.4244 | 0.6075 | 0.5197 | 3.3207 | 0.6075 | 0.6100 | 0.1548 | 0.1616 |
| 0.6436 | 25.0 | 625 | 1.4567 | 0.6075 | 0.5356 | 3.5288 | 0.6075 | 0.6107 | 0.1768 | 0.1652 |
| 0.6436 | 26.0 | 650 | 1.5889 | 0.595 | 0.5587 | 4.1521 | 0.595 | 0.5907 | 0.1943 | 0.1768 |
| 0.6436 | 27.0 | 675 | 1.4828 | 0.5725 | 0.5532 | 3.4259 | 0.5725 | 0.5720 | 0.2125 | 0.1803 |
| 0.6436 | 28.0 | 700 | 1.4671 | 0.5975 | 0.5509 | 3.2612 | 0.5975 | 0.6006 | 0.1983 | 0.1797 |
| 0.6436 | 29.0 | 725 | 1.4049 | 0.6225 | 0.5273 | 3.3136 | 0.6225 | 0.6237 | 0.1995 | 0.1600 |
| 0.6436 | 30.0 | 750 | 1.4039 | 0.6175 | 0.5208 | 3.2588 | 0.6175 | 0.6063 | 0.1770 | 0.1534 |
| 0.6436 | 31.0 | 775 | 1.4333 | 0.6 | 0.5378 | 3.6417 | 0.6 | 0.5995 | 0.1899 | 0.1632 |
| 0.6436 | 32.0 | 800 | 1.3311 | 0.64 | 0.5032 | 3.0056 | 0.64 | 0.6394 | 0.1699 | 0.1476 |
| 0.6436 | 33.0 | 825 | 1.3361 | 0.61 | 0.5079 | 3.2304 | 0.61 | 0.6123 | 0.1536 | 0.1517 |
| 0.6436 | 34.0 | 850 | 1.2984 | 0.64 | 0.4982 | 3.1446 | 0.64 | 0.6444 | 0.1636 | 0.1424 |
| 0.6436 | 35.0 | 875 | 1.3153 | 0.6275 | 0.4995 | 3.0722 | 0.6275 | 0.6288 | 0.1634 | 0.1486 |
| 0.6436 | 36.0 | 900 | 1.2773 | 0.6375 | 0.4880 | 2.7136 | 0.6375 | 0.6422 | 0.1606 | 0.1411 |
| 0.6436 | 37.0 | 925 | 1.2881 | 0.64 | 0.4946 | 3.0452 | 0.64 | 0.6437 | 0.1732 | 0.1440 |
| 0.6436 | 38.0 | 950 | 1.2609 | 0.64 | 0.4824 | 2.7407 | 0.64 | 0.6430 | 0.1485 | 0.1424 |
| 0.6436 | 39.0 | 975 | 1.2685 | 0.645 | 0.4869 | 2.7203 | 0.645 | 0.6484 | 0.1680 | 0.1398 |
| 0.0861 | 40.0 | 1000 | 1.2546 | 0.635 | 0.4808 | 2.7042 | 0.635 | 0.6356 | 0.1669 | 0.1416 |
| 0.0861 | 41.0 | 1025 | 1.2599 | 0.6425 | 0.4858 | 2.6880 | 0.6425 | 0.6457 | 0.1539 | 0.1387 |
| 0.0861 | 42.0 | 1050 | 1.2413 | 0.635 | 0.4783 | 2.8343 | 0.635 | 0.6361 | 0.1679 | 0.1369 |
| 0.0861 | 43.0 | 1075 | 1.2670 | 0.6325 | 0.4901 | 2.8366 | 0.6325 | 0.6337 | 0.1501 | 0.1399 |
| 0.0861 | 44.0 | 1100 | 1.2793 | 0.63 | 0.4919 | 3.1711 | 0.63 | 0.6309 | 0.1672 | 0.1465 |
| 0.0861 | 45.0 | 1125 | 1.2555 | 0.635 | 0.4844 | 2.9284 | 0.635 | 0.6379 | 0.1791 | 0.1401 |
| 0.0861 | 46.0 | 1150 | 1.2491 | 0.635 | 0.4806 | 2.8475 | 0.635 | 0.6358 | 0.1611 | 0.1392 |
| 0.0861 | 47.0 | 1175 | 1.2533 | 0.6325 | 0.4837 | 2.8229 | 0.6325 | 0.6352 | 0.1623 | 0.1378 |
| 0.0861 | 48.0 | 1200 | 1.2602 | 0.635 | 0.4857 | 2.9963 | 0.635 | 0.6368 | 0.1535 | 0.1426 |
| 0.0861 | 49.0 | 1225 | 1.2598 | 0.635 | 0.4848 | 2.8569 | 0.635 | 0.6370 | 0.1718 | 0.1389 |
| 0.0861 | 50.0 | 1250 | 1.2577 | 0.6225 | 0.4839 | 2.8645 | 0.6225 | 0.6237 | 0.1678 | 0.1420 |
| 0.0861 | 51.0 | 1275 | 1.2547 | 0.63 | 0.4817 | 2.8344 | 0.63 | 0.6314 | 0.1721 | 0.1399 |
| 0.0861 | 52.0 | 1300 | 1.2525 | 0.64 | 0.4819 | 2.7720 | 0.64 | 0.6411 | 0.1567 | 0.1378 |
| 0.0861 | 53.0 | 1325 | 1.2627 | 0.6325 | 0.4854 | 2.9202 | 0.6325 | 0.6337 | 0.1688 | 0.1406 |
| 0.0861 | 54.0 | 1350 | 1.2565 | 0.63 | 0.4836 | 2.8392 | 0.63 | 0.6320 | 0.1612 | 0.1404 |
| 0.0861 | 55.0 | 1375 | 1.2514 | 0.6325 | 0.4813 | 2.9887 | 0.6325 | 0.6343 | 0.1652 | 0.1386 |
| 0.0861 | 56.0 | 1400 | 1.2541 | 0.6275 | 0.4822 | 2.9067 | 0.6275 | 0.6296 | 0.1649 | 0.1401 |
| 0.0861 | 57.0 | 1425 | 1.2529 | 0.64 | 0.4810 | 2.9166 | 0.64 | 0.6432 | 0.1765 | 0.1372 |
| 0.0861 | 58.0 | 1450 | 1.2464 | 0.6275 | 0.4799 | 2.9713 | 0.6275 | 0.6291 | 0.1653 | 0.1401 |
| 0.0861 | 59.0 | 1475 | 1.2576 | 0.63 | 0.4826 | 2.9124 | 0.63 | 0.6323 | 0.1557 | 0.1397 |
| 0.0496 | 60.0 | 1500 | 1.2494 | 0.63 | 0.4804 | 2.8355 | 0.63 | 0.6317 | 0.1672 | 0.1390 |
| 0.0496 | 61.0 | 1525 | 1.2496 | 0.6325 | 0.4803 | 2.9091 | 0.6325 | 0.6352 | 0.1510 | 0.1383 |
| 0.0496 | 62.0 | 1550 | 1.2592 | 0.6375 | 0.4838 | 2.8980 | 0.6375 | 0.6384 | 0.1758 | 0.1398 |
| 0.0496 | 63.0 | 1575 | 1.2504 | 0.63 | 0.4806 | 2.9843 | 0.63 | 0.6316 | 0.1691 | 0.1391 |
| 0.0496 | 64.0 | 1600 | 1.2528 | 0.6325 | 0.4810 | 2.9045 | 0.6325 | 0.6349 | 0.1737 | 0.1388 |
| 0.0496 | 65.0 | 1625 | 1.2589 | 0.6425 | 0.4833 | 2.9817 | 0.6425 | 0.6447 | 0.1719 | 0.1380 |
| 0.0496 | 66.0 | 1650 | 1.2531 | 0.63 | 0.4811 | 2.9027 | 0.63 | 0.6321 | 0.1751 | 0.1391 |
| 0.0496 | 67.0 | 1675 | 1.2520 | 0.635 | 0.4808 | 2.9794 | 0.635 | 0.6379 | 0.1715 | 0.1378 |
| 0.0496 | 68.0 | 1700 | 1.2543 | 0.64 | 0.4815 | 2.9771 | 0.64 | 0.6420 | 0.1562 | 0.1380 |
| 0.0496 | 69.0 | 1725 | 1.2538 | 0.6325 | 0.4808 | 2.9080 | 0.6325 | 0.6345 | 0.1681 | 0.1385 |
| 0.0496 | 70.0 | 1750 | 1.2543 | 0.6325 | 0.4813 | 2.9102 | 0.6325 | 0.6347 | 0.1725 | 0.1390 |
| 0.0496 | 71.0 | 1775 | 1.2534 | 0.6325 | 0.4809 | 2.9778 | 0.6325 | 0.6353 | 0.1495 | 0.1385 |
| 0.0496 | 72.0 | 1800 | 1.2539 | 0.6375 | 0.4809 | 2.9024 | 0.6375 | 0.6394 | 0.1588 | 0.1381 |
| 0.0496 | 73.0 | 1825 | 1.2531 | 0.635 | 0.4806 | 2.9812 | 0.635 | 0.6378 | 0.1552 | 0.1380 |
| 0.0496 | 74.0 | 1850 | 1.2531 | 0.635 | 0.4805 | 2.9783 | 0.635 | 0.6377 | 0.1700 | 0.1380 |
| 0.0496 | 75.0 | 1875 | 1.2533 | 0.6375 | 0.4809 | 2.9772 | 0.6375 | 0.6400 | 0.1645 | 0.1372 |
| 0.0496 | 76.0 | 1900 | 1.2539 | 0.6375 | 0.4808 | 2.9777 | 0.6375 | 0.6393 | 0.1675 | 0.1376 |
| 0.0496 | 77.0 | 1925 | 1.2537 | 0.635 | 0.4808 | 2.9832 | 0.635 | 0.6375 | 0.1648 | 0.1381 |
| 0.0496 | 78.0 | 1950 | 1.2539 | 0.6375 | 0.4807 | 2.9769 | 0.6375 | 0.6394 | 0.1636 | 0.1374 |
| 0.0496 | 79.0 | 1975 | 1.2534 | 0.6375 | 0.4805 | 2.9796 | 0.6375 | 0.6399 | 0.1599 | 0.1375 |
| 0.048 | 80.0 | 2000 | 1.2537 | 0.6375 | 0.4806 | 3.0539 | 0.6375 | 0.6399 | 0.1657 | 0.1375 |
| 0.048 | 81.0 | 2025 | 1.2535 | 0.6375 | 0.4805 | 3.0534 | 0.6375 | 0.6399 | 0.1728 | 0.1375 |
| 0.048 | 82.0 | 2050 | 1.2539 | 0.6375 | 0.4806 | 2.9831 | 0.6375 | 0.6393 | 0.1674 | 0.1375 |
| 0.048 | 83.0 | 2075 | 1.2542 | 0.6375 | 0.4807 | 3.0538 | 0.6375 | 0.6399 | 0.1674 | 0.1375 |
| 0.048 | 84.0 | 2100 | 1.2539 | 0.6375 | 0.4805 | 3.0531 | 0.6375 | 0.6394 | 0.1564 | 0.1375 |
| 0.048 | 85.0 | 2125 | 1.2542 | 0.6375 | 0.4806 | 3.0531 | 0.6375 | 0.6393 | 0.1676 | 0.1376 |
| 0.048 | 86.0 | 2150 | 1.2541 | 0.6375 | 0.4806 | 3.0527 | 0.6375 | 0.6399 | 0.1691 | 0.1375 |
| 0.048 | 87.0 | 2175 | 1.2542 | 0.6375 | 0.4805 | 3.0525 | 0.6375 | 0.6394 | 0.1677 | 0.1376 |
| 0.048 | 88.0 | 2200 | 1.2542 | 0.6375 | 0.4806 | 3.0525 | 0.6375 | 0.6393 | 0.1651 | 0.1375 |
| 0.048 | 89.0 | 2225 | 1.2543 | 0.6375 | 0.4805 | 3.0525 | 0.6375 | 0.6394 | 0.1601 | 0.1375 |
| 0.048 | 90.0 | 2250 | 1.2543 | 0.6375 | 0.4805 | 3.0521 | 0.6375 | 0.6394 | 0.1661 | 0.1375 |
| 0.048 | 91.0 | 2275 | 1.2541 | 0.6375 | 0.4805 | 3.0521 | 0.6375 | 0.6394 | 0.1665 | 0.1376 |
| 0.048 | 92.0 | 2300 | 1.2542 | 0.6375 | 0.4805 | 3.0521 | 0.6375 | 0.6394 | 0.1638 | 0.1375 |
| 0.048 | 93.0 | 2325 | 1.2544 | 0.6375 | 0.4805 | 3.0518 | 0.6375 | 0.6394 | 0.1671 | 0.1376 |
| 0.048 | 94.0 | 2350 | 1.2543 | 0.6375 | 0.4805 | 3.0519 | 0.6375 | 0.6394 | 0.1601 | 0.1376 |
| 0.048 | 95.0 | 2375 | 1.2544 | 0.6375 | 0.4805 | 3.0518 | 0.6375 | 0.6394 | 0.1638 | 0.1376 |
| 0.048 | 96.0 | 2400 | 1.2544 | 0.6375 | 0.4805 | 3.0518 | 0.6375 | 0.6394 | 0.1638 | 0.1376 |
| 0.048 | 97.0 | 2425 | 1.2544 | 0.6375 | 0.4805 | 3.0517 | 0.6375 | 0.6394 | 0.1655 | 0.1376 |
| 0.048 | 98.0 | 2450 | 1.2544 | 0.6375 | 0.4805 | 3.0517 | 0.6375 | 0.6394 | 0.1638 | 0.1376 |
| 0.048 | 99.0 | 2475 | 1.2544 | 0.6375 | 0.4805 | 3.0517 | 0.6375 | 0.6394 | 0.1654 | 0.1376 |
| 0.0478 | 100.0 | 2500 | 1.2544 | 0.6375 | 0.4805 | 3.0517 | 0.6375 | 0.6394 | 0.1654 | 0.1376 |
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.12.0
- Tokenizers 0.12.1
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-small_rvl_cdip_100_examples_per_class_kd_CEKD_t1.5_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_rvl_cdip_100_examples_per_class_kd_CEKD_t1.5_a0.9
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2366
- Accuracy: 0.63
- Brier Loss: 0.5035
- Nll: 2.8588
- F1 Micro: 0.63
- F1 Macro: 0.6311
- Ece: 0.1649
- Aurc: 0.1472
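Assuming the checkpoint is published under the repository name in the title, a minimal inference sketch with the image-classification pipeline would look like this (the image path is a placeholder):

```python
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="jordyvl/vit-small_rvl_cdip_100_examples_per_class_kd_CEKD_t1.5_a0.9",
)
# Placeholder path to a scanned document image; preprocessing is handled by the pipeline.
print(classifier("scanned_document.png", top_k=3))
```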
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 2.8887 | 0.1225 | 0.9306 | 15.9457 | 0.1225 | 0.1226 | 0.1434 | 0.8620 |
| No log | 2.0 | 50 | 2.2120 | 0.3775 | 0.7577 | 9.7500 | 0.3775 | 0.3483 | 0.1992 | 0.3776 |
| No log | 3.0 | 75 | 1.7681 | 0.495 | 0.6387 | 5.6935 | 0.495 | 0.4838 | 0.1885 | 0.2491 |
| No log | 4.0 | 100 | 1.6420 | 0.5225 | 0.6038 | 5.2427 | 0.5225 | 0.5242 | 0.1757 | 0.2301 |
| No log | 5.0 | 125 | 1.5877 | 0.545 | 0.5986 | 4.6187 | 0.545 | 0.5282 | 0.1808 | 0.2248 |
| No log | 6.0 | 150 | 1.6460 | 0.5125 | 0.6162 | 3.9942 | 0.5125 | 0.5060 | 0.1962 | 0.2295 |
| No log | 7.0 | 175 | 1.8436 | 0.5125 | 0.6538 | 4.1740 | 0.5125 | 0.4932 | 0.2299 | 0.2451 |
| No log | 8.0 | 200 | 1.8205 | 0.545 | 0.6453 | 5.0752 | 0.545 | 0.5234 | 0.2057 | 0.2432 |
| No log | 9.0 | 225 | 1.7399 | 0.55 | 0.6260 | 4.5896 | 0.55 | 0.5460 | 0.2057 | 0.2258 |
| No log | 10.0 | 250 | 1.8559 | 0.55 | 0.6521 | 5.0532 | 0.55 | 0.5368 | 0.2209 | 0.2560 |
| No log | 11.0 | 275 | 1.8636 | 0.5625 | 0.6488 | 4.6642 | 0.5625 | 0.5544 | 0.2335 | 0.2187 |
| No log | 12.0 | 300 | 1.7461 | 0.55 | 0.6356 | 4.1298 | 0.55 | 0.5638 | 0.2047 | 0.2313 |
| No log | 13.0 | 325 | 1.7468 | 0.5625 | 0.6281 | 4.5451 | 0.5625 | 0.5570 | 0.2224 | 0.2214 |
| No log | 14.0 | 350 | 1.9616 | 0.545 | 0.6884 | 3.7999 | 0.545 | 0.5484 | 0.2691 | 0.2624 |
| No log | 15.0 | 375 | 2.0977 | 0.5175 | 0.7138 | 4.3792 | 0.5175 | 0.5055 | 0.2658 | 0.2917 |
| No log | 16.0 | 400 | 2.0238 | 0.5275 | 0.6896 | 4.5299 | 0.5275 | 0.5177 | 0.2664 | 0.2603 |
| No log | 17.0 | 425 | 1.8687 | 0.535 | 0.6534 | 3.7356 | 0.535 | 0.5388 | 0.2490 | 0.2448 |
| No log | 18.0 | 450 | 1.8210 | 0.5575 | 0.6492 | 4.3823 | 0.5575 | 0.5537 | 0.2533 | 0.2268 |
| No log | 19.0 | 475 | 1.7610 | 0.555 | 0.6325 | 3.9697 | 0.555 | 0.5503 | 0.2292 | 0.2161 |
| 0.5398 | 20.0 | 500 | 1.7125 | 0.5825 | 0.6125 | 3.4176 | 0.5825 | 0.5731 | 0.2140 | 0.1859 |
| 0.5398 | 21.0 | 525 | 1.6296 | 0.5775 | 0.6163 | 3.6014 | 0.5775 | 0.5871 | 0.2236 | 0.2051 |
| 0.5398 | 22.0 | 550 | 1.5965 | 0.57 | 0.5908 | 3.7668 | 0.57 | 0.5712 | 0.2058 | 0.1883 |
| 0.5398 | 23.0 | 575 | 1.4828 | 0.5875 | 0.5646 | 3.7028 | 0.5875 | 0.5854 | 0.1944 | 0.1714 |
| 0.5398 | 24.0 | 600 | 1.3983 | 0.6075 | 0.5481 | 3.3608 | 0.6075 | 0.6107 | 0.1966 | 0.1628 |
| 0.5398 | 25.0 | 625 | 1.5241 | 0.5925 | 0.5866 | 3.3669 | 0.5925 | 0.6019 | 0.2069 | 0.1886 |
| 0.5398 | 26.0 | 650 | 1.5540 | 0.58 | 0.5780 | 3.5184 | 0.58 | 0.5710 | 0.2131 | 0.1857 |
| 0.5398 | 27.0 | 675 | 1.4653 | 0.6 | 0.5768 | 2.9877 | 0.6 | 0.6043 | 0.2166 | 0.1781 |
| 0.5398 | 28.0 | 700 | 1.4883 | 0.5925 | 0.5646 | 3.7789 | 0.5925 | 0.5910 | 0.2096 | 0.1746 |
| 0.5398 | 29.0 | 725 | 1.5738 | 0.59 | 0.5914 | 4.0558 | 0.59 | 0.5879 | 0.2150 | 0.1957 |
| 0.5398 | 30.0 | 750 | 1.4017 | 0.6025 | 0.5583 | 3.4791 | 0.6025 | 0.6023 | 0.2150 | 0.1752 |
| 0.5398 | 31.0 | 775 | 1.3500 | 0.61 | 0.5365 | 3.2560 | 0.61 | 0.6157 | 0.1988 | 0.1579 |
| 0.5398 | 32.0 | 800 | 1.2977 | 0.6375 | 0.5140 | 3.0503 | 0.6375 | 0.6395 | 0.1847 | 0.1534 |
| 0.5398 | 33.0 | 825 | 1.3471 | 0.6175 | 0.5406 | 3.1888 | 0.6175 | 0.6104 | 0.2077 | 0.1689 |
| 0.5398 | 34.0 | 850 | 1.2992 | 0.615 | 0.5219 | 2.8944 | 0.615 | 0.6191 | 0.1826 | 0.1574 |
| 0.5398 | 35.0 | 875 | 1.2733 | 0.6225 | 0.5124 | 2.9352 | 0.6225 | 0.6238 | 0.1588 | 0.1505 |
| 0.5398 | 36.0 | 900 | 1.2821 | 0.6175 | 0.5231 | 3.0142 | 0.6175 | 0.6169 | 0.1672 | 0.1553 |
| 0.5398 | 37.0 | 925 | 1.2819 | 0.61 | 0.5200 | 2.6874 | 0.61 | 0.6116 | 0.1847 | 0.1540 |
| 0.5398 | 38.0 | 950 | 1.2664 | 0.615 | 0.5145 | 2.9287 | 0.615 | 0.6159 | 0.1961 | 0.1528 |
| 0.5398 | 39.0 | 975 | 1.2584 | 0.6225 | 0.5134 | 3.0058 | 0.6225 | 0.6230 | 0.1747 | 0.1508 |
| 0.0507 | 40.0 | 1000 | 1.2562 | 0.615 | 0.5114 | 2.9269 | 0.615 | 0.6169 | 0.1815 | 0.1504 |
| 0.0507 | 41.0 | 1025 | 1.2525 | 0.6225 | 0.5101 | 2.9199 | 0.6225 | 0.6239 | 0.1770 | 0.1496 |
| 0.0507 | 42.0 | 1050 | 1.2573 | 0.62 | 0.5133 | 2.9195 | 0.62 | 0.6221 | 0.1824 | 0.1511 |
| 0.0507 | 43.0 | 1075 | 1.2536 | 0.6125 | 0.5131 | 2.9026 | 0.6125 | 0.6121 | 0.1820 | 0.1511 |
| 0.0507 | 44.0 | 1100 | 1.2543 | 0.6225 | 0.5109 | 3.0693 | 0.6225 | 0.6235 | 0.1647 | 0.1500 |
| 0.0507 | 45.0 | 1125 | 1.2526 | 0.6125 | 0.5117 | 2.9018 | 0.6125 | 0.6141 | 0.1788 | 0.1500 |
| 0.0507 | 46.0 | 1150 | 1.2432 | 0.615 | 0.5068 | 2.9042 | 0.615 | 0.6167 | 0.1762 | 0.1484 |
| 0.0507 | 47.0 | 1175 | 1.2485 | 0.6275 | 0.5098 | 2.8927 | 0.6275 | 0.6251 | 0.1590 | 0.1496 |
| 0.0507 | 48.0 | 1200 | 1.2576 | 0.6125 | 0.5140 | 2.8956 | 0.6125 | 0.6137 | 0.1824 | 0.1524 |
| 0.0507 | 49.0 | 1225 | 1.2468 | 0.62 | 0.5094 | 2.8918 | 0.62 | 0.6204 | 0.1832 | 0.1496 |
| 0.0507 | 50.0 | 1250 | 1.2479 | 0.6175 | 0.5102 | 2.8921 | 0.6175 | 0.6178 | 0.1706 | 0.1491 |
| 0.0507 | 51.0 | 1275 | 1.2393 | 0.6225 | 0.5057 | 2.8813 | 0.6225 | 0.6229 | 0.1784 | 0.1486 |
| 0.0507 | 52.0 | 1300 | 1.2463 | 0.6175 | 0.5085 | 2.8959 | 0.6175 | 0.6184 | 0.1669 | 0.1495 |
| 0.0507 | 53.0 | 1325 | 1.2391 | 0.62 | 0.5061 | 2.8828 | 0.62 | 0.6215 | 0.1803 | 0.1471 |
| 0.0507 | 54.0 | 1350 | 1.2538 | 0.6175 | 0.5121 | 2.8795 | 0.6175 | 0.6167 | 0.1680 | 0.1512 |
| 0.0507 | 55.0 | 1375 | 1.2407 | 0.625 | 0.5064 | 2.8830 | 0.625 | 0.6259 | 0.1842 | 0.1482 |
| 0.0507 | 56.0 | 1400 | 1.2488 | 0.62 | 0.5099 | 2.8769 | 0.62 | 0.6198 | 0.1568 | 0.1499 |
| 0.0507 | 57.0 | 1425 | 1.2402 | 0.625 | 0.5052 | 2.8778 | 0.625 | 0.6260 | 0.1616 | 0.1481 |
| 0.0507 | 58.0 | 1450 | 1.2457 | 0.625 | 0.5077 | 2.8786 | 0.625 | 0.6260 | 0.1759 | 0.1474 |
| 0.0507 | 59.0 | 1475 | 1.2430 | 0.6275 | 0.5073 | 2.8744 | 0.6275 | 0.6266 | 0.1652 | 0.1486 |
| 0.0319 | 60.0 | 1500 | 1.2399 | 0.625 | 0.5056 | 2.8767 | 0.625 | 0.6256 | 0.1701 | 0.1474 |
| 0.0319 | 61.0 | 1525 | 1.2460 | 0.63 | 0.5087 | 2.8758 | 0.63 | 0.6329 | 0.1865 | 0.1491 |
| 0.0319 | 62.0 | 1550 | 1.2410 | 0.6225 | 0.5058 | 2.8719 | 0.6225 | 0.6229 | 0.1752 | 0.1477 |
| 0.0319 | 63.0 | 1575 | 1.2418 | 0.63 | 0.5060 | 2.8746 | 0.63 | 0.6319 | 0.1692 | 0.1484 |
| 0.0319 | 64.0 | 1600 | 1.2424 | 0.6275 | 0.5069 | 2.8672 | 0.6275 | 0.6279 | 0.1903 | 0.1475 |
| 0.0319 | 65.0 | 1625 | 1.2413 | 0.63 | 0.5061 | 2.8747 | 0.63 | 0.6304 | 0.1737 | 0.1471 |
| 0.0319 | 66.0 | 1650 | 1.2385 | 0.6325 | 0.5039 | 2.8726 | 0.6325 | 0.6358 | 0.1792 | 0.1473 |
| 0.0319 | 67.0 | 1675 | 1.2368 | 0.625 | 0.5047 | 2.8661 | 0.625 | 0.6261 | 0.1843 | 0.1467 |
| 0.0319 | 68.0 | 1700 | 1.2370 | 0.6275 | 0.5039 | 2.8691 | 0.6275 | 0.6294 | 0.1724 | 0.1471 |
| 0.0319 | 69.0 | 1725 | 1.2382 | 0.63 | 0.5050 | 2.8659 | 0.63 | 0.6317 | 0.1698 | 0.1472 |
| 0.0319 | 70.0 | 1750 | 1.2396 | 0.6275 | 0.5051 | 2.8670 | 0.6275 | 0.6290 | 0.1790 | 0.1474 |
| 0.0319 | 71.0 | 1775 | 1.2378 | 0.625 | 0.5045 | 2.8637 | 0.625 | 0.6268 | 0.1742 | 0.1476 |
| 0.0319 | 72.0 | 1800 | 1.2360 | 0.625 | 0.5037 | 2.8669 | 0.625 | 0.6269 | 0.1778 | 0.1468 |
| 0.0319 | 73.0 | 1825 | 1.2390 | 0.63 | 0.5049 | 2.8638 | 0.63 | 0.6310 | 0.1711 | 0.1474 |
| 0.0319 | 74.0 | 1850 | 1.2372 | 0.625 | 0.5045 | 2.8640 | 0.625 | 0.6269 | 0.1817 | 0.1475 |
| 0.0319 | 75.0 | 1875 | 1.2375 | 0.63 | 0.5044 | 2.8640 | 0.63 | 0.6313 | 0.1703 | 0.1472 |
| 0.0319 | 76.0 | 1900 | 1.2372 | 0.6275 | 0.5041 | 2.8621 | 0.6275 | 0.6290 | 0.1794 | 0.1473 |
| 0.0319 | 77.0 | 1925 | 1.2374 | 0.63 | 0.5041 | 2.8629 | 0.63 | 0.6313 | 0.1722 | 0.1472 |
| 0.0319 | 78.0 | 1950 | 1.2367 | 0.6275 | 0.5039 | 2.8620 | 0.6275 | 0.6294 | 0.1704 | 0.1474 |
| 0.0319 | 79.0 | 1975 | 1.2371 | 0.6275 | 0.5039 | 2.8619 | 0.6275 | 0.6294 | 0.1639 | 0.1474 |
| 0.0314 | 80.0 | 2000 | 1.2372 | 0.63 | 0.5041 | 2.8612 | 0.63 | 0.6310 | 0.1750 | 0.1474 |
| 0.0314 | 81.0 | 2025 | 1.2368 | 0.63 | 0.5038 | 2.8613 | 0.63 | 0.6309 | 0.1648 | 0.1473 |
| 0.0314 | 82.0 | 2050 | 1.2370 | 0.63 | 0.5038 | 2.8607 | 0.63 | 0.6305 | 0.1782 | 0.1473 |
| 0.0314 | 83.0 | 2075 | 1.2368 | 0.63 | 0.5038 | 2.8609 | 0.63 | 0.6307 | 0.1686 | 0.1472 |
| 0.0314 | 84.0 | 2100 | 1.2368 | 0.63 | 0.5037 | 2.8603 | 0.63 | 0.6305 | 0.1667 | 0.1472 |
| 0.0314 | 85.0 | 2125 | 1.2366 | 0.63 | 0.5036 | 2.8601 | 0.63 | 0.6309 | 0.1686 | 0.1473 |
| 0.0314 | 86.0 | 2150 | 1.2367 | 0.6325 | 0.5037 | 2.8600 | 0.6325 | 0.6335 | 0.1751 | 0.1471 |
| 0.0314 | 87.0 | 2175 | 1.2369 | 0.63 | 0.5037 | 2.8598 | 0.63 | 0.6307 | 0.1730 | 0.1473 |
| 0.0314 | 88.0 | 2200 | 1.2367 | 0.63 | 0.5036 | 2.8595 | 0.63 | 0.6307 | 0.1657 | 0.1472 |
| 0.0314 | 89.0 | 2225 | 1.2366 | 0.63 | 0.5036 | 2.8597 | 0.63 | 0.6307 | 0.1680 | 0.1472 |
| 0.0314 | 90.0 | 2250 | 1.2366 | 0.63 | 0.5036 | 2.8594 | 0.63 | 0.6307 | 0.1580 | 0.1472 |
| 0.0314 | 91.0 | 2275 | 1.2366 | 0.63 | 0.5035 | 2.8593 | 0.63 | 0.6307 | 0.1677 | 0.1472 |
| 0.0314 | 92.0 | 2300 | 1.2367 | 0.63 | 0.5035 | 2.8593 | 0.63 | 0.6307 | 0.1616 | 0.1472 |
| 0.0314 | 93.0 | 2325 | 1.2366 | 0.63 | 0.5035 | 2.8590 | 0.63 | 0.6307 | 0.1625 | 0.1472 |
| 0.0314 | 94.0 | 2350 | 1.2366 | 0.6325 | 0.5035 | 2.8590 | 0.6325 | 0.6333 | 0.1586 | 0.1470 |
| 0.0314 | 95.0 | 2375 | 1.2366 | 0.63 | 0.5035 | 2.8591 | 0.63 | 0.6307 | 0.1580 | 0.1472 |
| 0.0314 | 96.0 | 2400 | 1.2366 | 0.63 | 0.5035 | 2.8589 | 0.63 | 0.6307 | 0.1695 | 0.1471 |
| 0.0314 | 97.0 | 2425 | 1.2366 | 0.63 | 0.5035 | 2.8589 | 0.63 | 0.6311 | 0.1648 | 0.1472 |
| 0.0314 | 98.0 | 2450 | 1.2366 | 0.63 | 0.5035 | 2.8588 | 0.63 | 0.6311 | 0.1695 | 0.1471 |
| 0.0314 | 99.0 | 2475 | 1.2366 | 0.6325 | 0.5035 | 2.8589 | 0.6325 | 0.6337 | 0.1724 | 0.1470 |
| 0.0312 | 100.0 | 2500 | 1.2366 | 0.63 | 0.5035 | 2.8588 | 0.63 | 0.6311 | 0.1649 | 0.1472 |
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.12.0
- Tokenizers 0.12.1
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-small_rvl_cdip_100_examples_per_class_kd_CEKD_t2.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_rvl_cdip_100_examples_per_class_kd_CEKD_t2.5_a0.5
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4583
- Accuracy: 0.655
- Brier Loss: 0.4857
- Nll: 2.9372
- F1 Micro: 0.655
- F1 Macro: 0.6591
- Ece: 0.1679
- Aurc: 0.1394
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (the warmup-step arithmetic is sketched after the list):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
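With a linear schedule and `lr_scheduler_warmup_ratio: 0.1`, the warmup length is derived from the total number of optimizer steps; the results table below ends at step 2500, so this corresponds to roughly 250 warmup steps. A small sketch of that arithmetic, using a stand-in module in place of the ViT backbone, is:

```python
import torch
from transformers import get_linear_schedule_with_warmup

model = torch.nn.Linear(4, 2)          # stand-in for the ViT backbone named above
total_steps = 2500                     # 100 epochs x 25 steps, as in the table below
warmup_steps = int(0.1 * total_steps)  # warmup_ratio 0.1 -> 250 steps

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4,
                             betas=(0.9, 0.999), eps=1e-8)
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=warmup_steps, num_training_steps=total_steps)
```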
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 4.2264 | 0.1375 | 0.9289 | 15.9084 | 0.1375 | 0.1395 | 0.1536 | 0.8596 |
| No log | 2.0 | 50 | 3.2078 | 0.405 | 0.7396 | 8.9647 | 0.405 | 0.3723 | 0.2073 | 0.3570 |
| No log | 3.0 | 75 | 2.4477 | 0.4975 | 0.6180 | 5.3439 | 0.4975 | 0.4756 | 0.1714 | 0.2421 |
| No log | 4.0 | 100 | 2.2058 | 0.545 | 0.5825 | 4.3028 | 0.545 | 0.5448 | 0.1681 | 0.2147 |
| No log | 5.0 | 125 | 2.1459 | 0.5325 | 0.6143 | 4.3798 | 0.5325 | 0.5164 | 0.2012 | 0.2274 |
| No log | 6.0 | 150 | 2.0457 | 0.5825 | 0.5625 | 4.1921 | 0.5825 | 0.5823 | 0.1712 | 0.2008 |
| No log | 7.0 | 175 | 1.9438 | 0.575 | 0.5557 | 4.2405 | 0.575 | 0.5654 | 0.1805 | 0.1894 |
| No log | 8.0 | 200 | 1.9821 | 0.5675 | 0.5766 | 3.8326 | 0.5675 | 0.5665 | 0.1815 | 0.2050 |
| No log | 9.0 | 225 | 2.1566 | 0.5425 | 0.6068 | 4.2488 | 0.5425 | 0.5367 | 0.2053 | 0.2167 |
| No log | 10.0 | 250 | 1.9672 | 0.5925 | 0.5692 | 4.3417 | 0.5925 | 0.5968 | 0.2005 | 0.2114 |
| No log | 11.0 | 275 | 2.0417 | 0.5725 | 0.6080 | 3.6972 | 0.5725 | 0.5608 | 0.2005 | 0.2168 |
| No log | 12.0 | 300 | 1.9432 | 0.585 | 0.5704 | 3.6005 | 0.585 | 0.5840 | 0.1976 | 0.1939 |
| No log | 13.0 | 325 | 1.9031 | 0.585 | 0.5816 | 4.0984 | 0.585 | 0.5835 | 0.1996 | 0.1911 |
| No log | 14.0 | 350 | 1.8994 | 0.5925 | 0.5897 | 4.2703 | 0.5925 | 0.5926 | 0.2211 | 0.2041 |
| No log | 15.0 | 375 | 1.8136 | 0.6325 | 0.5297 | 4.5861 | 0.6325 | 0.6299 | 0.1622 | 0.1578 |
| No log | 16.0 | 400 | 1.6961 | 0.5925 | 0.5300 | 4.0317 | 0.5925 | 0.5839 | 0.1909 | 0.1630 |
| No log | 17.0 | 425 | 1.7687 | 0.61 | 0.5357 | 3.6514 | 0.61 | 0.6110 | 0.1715 | 0.1703 |
| No log | 18.0 | 450 | 1.8963 | 0.6 | 0.5785 | 4.7474 | 0.6 | 0.5842 | 0.2168 | 0.1893 |
| No log | 19.0 | 475 | 1.7545 | 0.6175 | 0.5506 | 4.4192 | 0.6175 | 0.6086 | 0.2006 | 0.1759 |
| 0.8611 | 20.0 | 500 | 1.7832 | 0.61 | 0.5546 | 4.0543 | 0.61 | 0.6099 | 0.2133 | 0.1662 |
| 0.8611 | 21.0 | 525 | 1.7788 | 0.5875 | 0.5718 | 3.8585 | 0.5875 | 0.5855 | 0.2084 | 0.1848 |
| 0.8611 | 22.0 | 550 | 1.6323 | 0.62 | 0.5184 | 3.6953 | 0.62 | 0.6146 | 0.1921 | 0.1588 |
| 0.8611 | 23.0 | 575 | 1.6384 | 0.6325 | 0.5431 | 3.5349 | 0.6325 | 0.6269 | 0.2042 | 0.1678 |
| 0.8611 | 24.0 | 600 | 1.7895 | 0.62 | 0.5588 | 4.2768 | 0.62 | 0.6169 | 0.1993 | 0.1885 |
| 0.8611 | 25.0 | 625 | 1.5712 | 0.6175 | 0.5111 | 3.1891 | 0.6175 | 0.6199 | 0.1777 | 0.1552 |
| 0.8611 | 26.0 | 650 | 1.6139 | 0.62 | 0.5284 | 3.0912 | 0.62 | 0.6238 | 0.1793 | 0.1599 |
| 0.8611 | 27.0 | 675 | 1.6449 | 0.6375 | 0.5190 | 4.0147 | 0.6375 | 0.6313 | 0.1794 | 0.1606 |
| 0.8611 | 28.0 | 700 | 1.6379 | 0.6325 | 0.5355 | 3.5225 | 0.6325 | 0.6300 | 0.1859 | 0.1693 |
| 0.8611 | 29.0 | 725 | 1.5486 | 0.6375 | 0.5202 | 3.1611 | 0.6375 | 0.6407 | 0.1908 | 0.1608 |
| 0.8611 | 30.0 | 750 | 1.5410 | 0.63 | 0.5074 | 3.2562 | 0.63 | 0.6340 | 0.1772 | 0.1424 |
| 0.8611 | 31.0 | 775 | 1.5033 | 0.6575 | 0.4973 | 3.3321 | 0.6575 | 0.6619 | 0.1802 | 0.1451 |
| 0.8611 | 32.0 | 800 | 1.6065 | 0.6375 | 0.5260 | 3.4264 | 0.6375 | 0.6451 | 0.2028 | 0.1670 |
| 0.8611 | 33.0 | 825 | 1.5188 | 0.6525 | 0.5028 | 3.5128 | 0.6525 | 0.6536 | 0.1813 | 0.1491 |
| 0.8611 | 34.0 | 850 | 1.5034 | 0.635 | 0.5005 | 3.4093 | 0.635 | 0.6345 | 0.1602 | 0.1506 |
| 0.8611 | 35.0 | 875 | 1.5711 | 0.66 | 0.5163 | 3.6591 | 0.66 | 0.6587 | 0.1884 | 0.1574 |
| 0.8611 | 36.0 | 900 | 1.5224 | 0.6475 | 0.5057 | 3.1773 | 0.6475 | 0.6491 | 0.1802 | 0.1526 |
| 0.8611 | 37.0 | 925 | 1.4781 | 0.6475 | 0.4938 | 3.3389 | 0.6475 | 0.6508 | 0.1753 | 0.1420 |
| 0.8611 | 38.0 | 950 | 1.4991 | 0.65 | 0.5005 | 3.4077 | 0.65 | 0.6541 | 0.1843 | 0.1482 |
| 0.8611 | 39.0 | 975 | 1.4613 | 0.6625 | 0.4848 | 3.2461 | 0.6625 | 0.6675 | 0.1647 | 0.1386 |
| 0.0907 | 40.0 | 1000 | 1.4824 | 0.64 | 0.4951 | 3.1830 | 0.64 | 0.6444 | 0.1779 | 0.1431 |
| 0.0907 | 41.0 | 1025 | 1.5224 | 0.6625 | 0.5004 | 3.4231 | 0.6625 | 0.6659 | 0.1769 | 0.1506 |
| 0.0907 | 42.0 | 1050 | 1.4882 | 0.6375 | 0.5013 | 3.0893 | 0.6375 | 0.6451 | 0.1844 | 0.1465 |
| 0.0907 | 43.0 | 1075 | 1.4852 | 0.665 | 0.4901 | 3.4025 | 0.665 | 0.6685 | 0.1869 | 0.1442 |
| 0.0907 | 44.0 | 1100 | 1.4744 | 0.65 | 0.4934 | 3.4829 | 0.65 | 0.6528 | 0.1836 | 0.1426 |
| 0.0907 | 45.0 | 1125 | 1.4735 | 0.66 | 0.4892 | 3.1763 | 0.66 | 0.6642 | 0.1666 | 0.1427 |
| 0.0907 | 46.0 | 1150 | 1.4690 | 0.65 | 0.4898 | 3.0960 | 0.65 | 0.6537 | 0.1642 | 0.1427 |
| 0.0907 | 47.0 | 1175 | 1.4773 | 0.6475 | 0.4909 | 3.2535 | 0.6475 | 0.6506 | 0.1749 | 0.1446 |
| 0.0907 | 48.0 | 1200 | 1.4632 | 0.6575 | 0.4884 | 3.1685 | 0.6575 | 0.6625 | 0.1750 | 0.1398 |
| 0.0907 | 49.0 | 1225 | 1.4712 | 0.66 | 0.4896 | 3.0915 | 0.66 | 0.6634 | 0.1697 | 0.1432 |
| 0.0907 | 50.0 | 1250 | 1.4630 | 0.655 | 0.4883 | 3.0953 | 0.655 | 0.6591 | 0.1650 | 0.1406 |
| 0.0907 | 51.0 | 1275 | 1.4607 | 0.66 | 0.4860 | 3.0153 | 0.66 | 0.6653 | 0.1665 | 0.1411 |
| 0.0907 | 52.0 | 1300 | 1.4646 | 0.6475 | 0.4889 | 3.0242 | 0.6475 | 0.6510 | 0.1713 | 0.1426 |
| 0.0907 | 53.0 | 1325 | 1.4717 | 0.6575 | 0.4904 | 3.0926 | 0.6575 | 0.6605 | 0.1789 | 0.1428 |
| 0.0907 | 54.0 | 1350 | 1.4554 | 0.645 | 0.4868 | 3.0882 | 0.645 | 0.6489 | 0.1664 | 0.1408 |
| 0.0907 | 55.0 | 1375 | 1.4581 | 0.6575 | 0.4855 | 3.0904 | 0.6575 | 0.6614 | 0.1602 | 0.1404 |
| 0.0907 | 56.0 | 1400 | 1.4588 | 0.655 | 0.4866 | 3.0910 | 0.655 | 0.6598 | 0.1722 | 0.1405 |
| 0.0907 | 57.0 | 1425 | 1.4582 | 0.6575 | 0.4859 | 3.0143 | 0.6575 | 0.6619 | 0.1540 | 0.1397 |
| 0.0907 | 58.0 | 1450 | 1.4613 | 0.6575 | 0.4865 | 3.0143 | 0.6575 | 0.6620 | 0.1659 | 0.1402 |
| 0.0907 | 59.0 | 1475 | 1.4593 | 0.655 | 0.4867 | 3.0140 | 0.655 | 0.6599 | 0.1583 | 0.1402 |
| 0.0478 | 60.0 | 1500 | 1.4593 | 0.655 | 0.4864 | 3.0148 | 0.655 | 0.6593 | 0.1657 | 0.1404 |
| 0.0478 | 61.0 | 1525 | 1.4588 | 0.655 | 0.4861 | 3.0165 | 0.655 | 0.6590 | 0.1757 | 0.1401 |
| 0.0478 | 62.0 | 1550 | 1.4598 | 0.6575 | 0.4864 | 3.0140 | 0.6575 | 0.6616 | 0.1528 | 0.1403 |
| 0.0478 | 63.0 | 1575 | 1.4595 | 0.6575 | 0.4865 | 3.0143 | 0.6575 | 0.6623 | 0.1538 | 0.1400 |
| 0.0478 | 64.0 | 1600 | 1.4591 | 0.655 | 0.4864 | 2.9404 | 0.655 | 0.6591 | 0.1669 | 0.1399 |
| 0.0478 | 65.0 | 1625 | 1.4568 | 0.655 | 0.4854 | 2.9393 | 0.655 | 0.6596 | 0.1644 | 0.1393 |
| 0.0478 | 66.0 | 1650 | 1.4569 | 0.655 | 0.4855 | 3.0146 | 0.655 | 0.6599 | 0.1619 | 0.1401 |
| 0.0478 | 67.0 | 1675 | 1.4592 | 0.655 | 0.4865 | 2.9380 | 0.655 | 0.6596 | 0.1540 | 0.1399 |
| 0.0478 | 68.0 | 1700 | 1.4580 | 0.66 | 0.4858 | 2.9406 | 0.66 | 0.6641 | 0.1850 | 0.1396 |
| 0.0478 | 69.0 | 1725 | 1.4591 | 0.655 | 0.4865 | 2.9381 | 0.655 | 0.6593 | 0.1651 | 0.1399 |
| 0.0478 | 70.0 | 1750 | 1.4586 | 0.655 | 0.4859 | 2.9388 | 0.655 | 0.6596 | 0.1773 | 0.1397 |
| 0.0478 | 71.0 | 1775 | 1.4585 | 0.6525 | 0.4862 | 2.9366 | 0.6525 | 0.6566 | 0.1644 | 0.1400 |
| 0.0478 | 72.0 | 1800 | 1.4582 | 0.66 | 0.4858 | 2.9385 | 0.66 | 0.6644 | 0.1809 | 0.1396 |
| 0.0478 | 73.0 | 1825 | 1.4577 | 0.65 | 0.4857 | 2.9374 | 0.65 | 0.6543 | 0.1715 | 0.1403 |
| 0.0478 | 74.0 | 1850 | 1.4578 | 0.6525 | 0.4857 | 2.9381 | 0.6525 | 0.6565 | 0.1748 | 0.1401 |
| 0.0478 | 75.0 | 1875 | 1.4583 | 0.65 | 0.4860 | 2.9371 | 0.65 | 0.6544 | 0.1661 | 0.1402 |
| 0.0478 | 76.0 | 1900 | 1.4582 | 0.65 | 0.4859 | 2.9369 | 0.65 | 0.6544 | 0.1760 | 0.1402 |
| 0.0478 | 77.0 | 1925 | 1.4585 | 0.65 | 0.4859 | 2.9367 | 0.65 | 0.6546 | 0.1609 | 0.1403 |
| 0.0478 | 78.0 | 1950 | 1.4580 | 0.65 | 0.4858 | 2.9372 | 0.65 | 0.6546 | 0.1626 | 0.1401 |
| 0.0478 | 79.0 | 1975 | 1.4578 | 0.6525 | 0.4857 | 2.9369 | 0.6525 | 0.6564 | 0.1706 | 0.1400 |
| 0.0457 | 80.0 | 2000 | 1.4584 | 0.6525 | 0.4859 | 2.9370 | 0.6525 | 0.6564 | 0.1712 | 0.1402 |
| 0.0457 | 81.0 | 2025 | 1.4587 | 0.6525 | 0.4860 | 2.9370 | 0.6525 | 0.6568 | 0.1631 | 0.1402 |
| 0.0457 | 82.0 | 2050 | 1.4584 | 0.6525 | 0.4859 | 2.9369 | 0.6525 | 0.6568 | 0.1631 | 0.1401 |
| 0.0457 | 83.0 | 2075 | 1.4581 | 0.65 | 0.4858 | 2.9369 | 0.65 | 0.6543 | 0.1703 | 0.1401 |
| 0.0457 | 84.0 | 2100 | 1.4581 | 0.6525 | 0.4858 | 2.9370 | 0.6525 | 0.6564 | 0.1588 | 0.1401 |
| 0.0457 | 85.0 | 2125 | 1.4582 | 0.6525 | 0.4858 | 2.9370 | 0.6525 | 0.6568 | 0.1723 | 0.1400 |
| 0.0457 | 86.0 | 2150 | 1.4582 | 0.6525 | 0.4858 | 2.9371 | 0.6525 | 0.6564 | 0.1724 | 0.1400 |
| 0.0457 | 87.0 | 2175 | 1.4582 | 0.6525 | 0.4858 | 2.9369 | 0.6525 | 0.6567 | 0.1720 | 0.1400 |
| 0.0457 | 88.0 | 2200 | 1.4582 | 0.6525 | 0.4858 | 2.9372 | 0.6525 | 0.6567 | 0.1606 | 0.1401 |
| 0.0457 | 89.0 | 2225 | 1.4583 | 0.6525 | 0.4858 | 2.9372 | 0.6525 | 0.6567 | 0.1665 | 0.1401 |
| 0.0457 | 90.0 | 2250 | 1.4583 | 0.6525 | 0.4857 | 2.9370 | 0.6525 | 0.6564 | 0.1688 | 0.1400 |
| 0.0457 | 91.0 | 2275 | 1.4583 | 0.6525 | 0.4858 | 2.9371 | 0.6525 | 0.6567 | 0.1695 | 0.1400 |
| 0.0457 | 92.0 | 2300 | 1.4583 | 0.655 | 0.4858 | 2.9372 | 0.655 | 0.6591 | 0.1660 | 0.1394 |
| 0.0457 | 93.0 | 2325 | 1.4583 | 0.6525 | 0.4857 | 2.9371 | 0.6525 | 0.6565 | 0.1645 | 0.1400 |
| 0.0457 | 94.0 | 2350 | 1.4583 | 0.6525 | 0.4858 | 2.9371 | 0.6525 | 0.6567 | 0.1665 | 0.1399 |
| 0.0457 | 95.0 | 2375 | 1.4583 | 0.6525 | 0.4858 | 2.9372 | 0.6525 | 0.6567 | 0.1704 | 0.1399 |
| 0.0457 | 96.0 | 2400 | 1.4583 | 0.655 | 0.4858 | 2.9372 | 0.655 | 0.6588 | 0.1660 | 0.1395 |
| 0.0457 | 97.0 | 2425 | 1.4582 | 0.6525 | 0.4857 | 2.9372 | 0.6525 | 0.6567 | 0.1704 | 0.1399 |
| 0.0457 | 98.0 | 2450 | 1.4582 | 0.655 | 0.4857 | 2.9372 | 0.655 | 0.6591 | 0.1679 | 0.1394 |
| 0.0457 | 99.0 | 2475 | 1.4583 | 0.6525 | 0.4857 | 2.9372 | 0.6525 | 0.6567 | 0.1704 | 0.1399 |
| 0.0456 | 100.0 | 2500 | 1.4583 | 0.655 | 0.4857 | 2.9372 | 0.655 | 0.6591 | 0.1679 | 0.1394 |
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.12.0
- Tokenizers 0.12.1
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-small_rvl_cdip_100_examples_per_class_kd_CEKD_t5.0_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_rvl_cdip_100_examples_per_class_kd_CEKD_t5.0_a0.5
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2623
- Accuracy: 0.65
- Brier Loss: 0.4803
- Nll: 3.2676
- F1 Micro: 0.65
- F1 Macro: 0.6575
- Ece: 0.1722
- Aurc: 0.1414
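The 16 document classes this head predicts are listed at the end of this card; a sketch of attaching that label set to the backbone when initializing the classifier is below. The id ordering is taken from that list and is an assumption about the mapping used for this run.

```python
from transformers import AutoModelForImageClassification

labels = ["letter", "form", "email", "handwritten", "advertisement",
          "scientific_report", "scientific_publication", "specification",
          "file_folder", "news_article", "budget", "invoice",
          "presentation", "questionnaire", "resume", "memo"]

model = AutoModelForImageClassification.from_pretrained(
    "WinKawaks/vit-small-patch16-224",
    id2label=dict(enumerate(labels)),
    label2id={name: i for i, name in enumerate(labels)},
    ignore_mismatched_sizes=True,  # replaces the 1000-class ImageNet head
)
```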
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 3.4916 | 0.1075 | 0.9342 | 15.2561 | 0.1075 | 0.1132 | 0.1627 | 0.8874 |
| No log | 2.0 | 50 | 2.6905 | 0.395 | 0.7423 | 8.7655 | 0.395 | 0.3694 | 0.1922 | 0.3538 |
| No log | 3.0 | 75 | 2.1229 | 0.505 | 0.6157 | 5.2850 | 0.505 | 0.4830 | 0.1716 | 0.2424 |
| No log | 4.0 | 100 | 1.9322 | 0.55 | 0.5842 | 4.6402 | 0.55 | 0.5501 | 0.1744 | 0.2156 |
| No log | 5.0 | 125 | 1.8231 | 0.5575 | 0.5788 | 4.2830 | 0.5575 | 0.5494 | 0.1777 | 0.2091 |
| No log | 6.0 | 150 | 1.7318 | 0.5875 | 0.5523 | 4.4127 | 0.5875 | 0.5864 | 0.1686 | 0.1950 |
| No log | 7.0 | 175 | 1.6652 | 0.615 | 0.5325 | 3.8720 | 0.615 | 0.6192 | 0.1654 | 0.1740 |
| No log | 8.0 | 200 | 1.5910 | 0.61 | 0.5233 | 3.2435 | 0.61 | 0.6097 | 0.1556 | 0.1702 |
| No log | 9.0 | 225 | 1.7751 | 0.59 | 0.5610 | 3.9627 | 0.59 | 0.5839 | 0.1932 | 0.1965 |
| No log | 10.0 | 250 | 1.5950 | 0.5975 | 0.5521 | 3.9360 | 0.5975 | 0.5922 | 0.1868 | 0.1886 |
| No log | 11.0 | 275 | 1.6105 | 0.6 | 0.5459 | 4.2017 | 0.6 | 0.5960 | 0.1788 | 0.1696 |
| No log | 12.0 | 300 | 1.5566 | 0.5975 | 0.5283 | 3.6344 | 0.5975 | 0.5957 | 0.1843 | 0.1758 |
| No log | 13.0 | 325 | 1.5395 | 0.6225 | 0.5344 | 3.3755 | 0.6225 | 0.6327 | 0.1725 | 0.1721 |
| No log | 14.0 | 350 | 1.5117 | 0.64 | 0.5193 | 3.7990 | 0.64 | 0.6366 | 0.1849 | 0.1659 |
| No log | 15.0 | 375 | 1.5274 | 0.6225 | 0.5381 | 3.5126 | 0.6225 | 0.6198 | 0.1837 | 0.1689 |
| No log | 16.0 | 400 | 1.3822 | 0.645 | 0.4848 | 3.5167 | 0.645 | 0.6501 | 0.1426 | 0.1384 |
| No log | 17.0 | 425 | 1.4390 | 0.6325 | 0.5345 | 3.8558 | 0.6325 | 0.6406 | 0.1859 | 0.1624 |
| No log | 18.0 | 450 | 1.3763 | 0.6425 | 0.4905 | 3.0232 | 0.6425 | 0.6446 | 0.1687 | 0.1388 |
| No log | 19.0 | 475 | 1.5017 | 0.5925 | 0.5558 | 3.9738 | 0.5925 | 0.5699 | 0.2064 | 0.1827 |
| 0.7312 | 20.0 | 500 | 1.4216 | 0.64 | 0.5092 | 3.5054 | 0.64 | 0.6394 | 0.1885 | 0.1583 |
| 0.7312 | 21.0 | 525 | 1.3999 | 0.6325 | 0.5166 | 3.6206 | 0.6325 | 0.6342 | 0.1865 | 0.1586 |
| 0.7312 | 22.0 | 550 | 1.3555 | 0.6575 | 0.5092 | 3.5815 | 0.6575 | 0.6570 | 0.1748 | 0.1565 |
| 0.7312 | 23.0 | 575 | 1.3915 | 0.6375 | 0.5065 | 3.2269 | 0.6375 | 0.6367 | 0.1712 | 0.1485 |
| 0.7312 | 24.0 | 600 | 1.4116 | 0.64 | 0.5130 | 3.7646 | 0.64 | 0.6412 | 0.1690 | 0.1624 |
| 0.7312 | 25.0 | 625 | 1.3663 | 0.64 | 0.5160 | 3.0397 | 0.64 | 0.6471 | 0.1736 | 0.1575 |
| 0.7312 | 26.0 | 650 | 1.3717 | 0.63 | 0.5097 | 3.7950 | 0.63 | 0.6379 | 0.1823 | 0.1570 |
| 0.7312 | 27.0 | 675 | 1.3229 | 0.6425 | 0.4933 | 3.5568 | 0.6425 | 0.6498 | 0.1564 | 0.1470 |
| 0.7312 | 28.0 | 700 | 1.3638 | 0.6275 | 0.5124 | 3.2988 | 0.6275 | 0.6266 | 0.1916 | 0.1600 |
| 0.7312 | 29.0 | 725 | 1.3353 | 0.6475 | 0.5013 | 3.4126 | 0.6475 | 0.6407 | 0.1747 | 0.1558 |
| 0.7312 | 30.0 | 750 | 1.3788 | 0.6325 | 0.5172 | 3.4229 | 0.6325 | 0.6329 | 0.1629 | 0.1650 |
| 0.7312 | 31.0 | 775 | 1.3021 | 0.6525 | 0.4840 | 3.2418 | 0.6525 | 0.6571 | 0.1788 | 0.1412 |
| 0.7312 | 32.0 | 800 | 1.3127 | 0.6525 | 0.5058 | 3.1876 | 0.6525 | 0.6579 | 0.1879 | 0.1525 |
| 0.7312 | 33.0 | 825 | 1.3181 | 0.64 | 0.5023 | 3.1837 | 0.64 | 0.6459 | 0.1751 | 0.1529 |
| 0.7312 | 34.0 | 850 | 1.3071 | 0.6425 | 0.4954 | 3.5271 | 0.6425 | 0.6480 | 0.1615 | 0.1496 |
| 0.7312 | 35.0 | 875 | 1.2808 | 0.655 | 0.4904 | 3.2539 | 0.655 | 0.6606 | 0.1725 | 0.1448 |
| 0.7312 | 36.0 | 900 | 1.2766 | 0.68 | 0.4771 | 3.3397 | 0.68 | 0.6823 | 0.1645 | 0.1408 |
| 0.7312 | 37.0 | 925 | 1.2751 | 0.665 | 0.4837 | 3.3390 | 0.665 | 0.6728 | 0.1723 | 0.1446 |
| 0.7312 | 38.0 | 950 | 1.2658 | 0.67 | 0.4791 | 3.2603 | 0.67 | 0.6760 | 0.1781 | 0.1407 |
| 0.7312 | 39.0 | 975 | 1.2678 | 0.66 | 0.4814 | 3.1865 | 0.66 | 0.6682 | 0.1585 | 0.1414 |
| 0.0683 | 40.0 | 1000 | 1.2737 | 0.66 | 0.4840 | 3.3466 | 0.66 | 0.6658 | 0.1699 | 0.1434 |
| 0.0683 | 41.0 | 1025 | 1.2581 | 0.66 | 0.4769 | 3.1757 | 0.66 | 0.6660 | 0.1752 | 0.1398 |
| 0.0683 | 42.0 | 1050 | 1.2734 | 0.655 | 0.4833 | 3.1843 | 0.655 | 0.6600 | 0.1721 | 0.1440 |
| 0.0683 | 43.0 | 1075 | 1.2628 | 0.66 | 0.4802 | 3.2578 | 0.66 | 0.6670 | 0.1789 | 0.1403 |
| 0.0683 | 44.0 | 1100 | 1.2717 | 0.66 | 0.4837 | 3.2573 | 0.66 | 0.6651 | 0.1584 | 0.1433 |
| 0.0683 | 45.0 | 1125 | 1.2637 | 0.6475 | 0.4791 | 3.3419 | 0.6475 | 0.6545 | 0.1736 | 0.1408 |
| 0.0683 | 46.0 | 1150 | 1.2625 | 0.6575 | 0.4797 | 3.3403 | 0.6575 | 0.6642 | 0.1597 | 0.1406 |
| 0.0683 | 47.0 | 1175 | 1.2642 | 0.6525 | 0.4791 | 3.3527 | 0.6525 | 0.6592 | 0.1731 | 0.1416 |
| 0.0683 | 48.0 | 1200 | 1.2652 | 0.655 | 0.4816 | 3.2664 | 0.655 | 0.6623 | 0.1717 | 0.1413 |
| 0.0683 | 49.0 | 1225 | 1.2646 | 0.65 | 0.4806 | 3.3371 | 0.65 | 0.6568 | 0.1758 | 0.1419 |
| 0.0683 | 50.0 | 1250 | 1.2677 | 0.65 | 0.4812 | 3.4189 | 0.65 | 0.6575 | 0.1582 | 0.1427 |
| 0.0683 | 51.0 | 1275 | 1.2657 | 0.65 | 0.4813 | 3.3393 | 0.65 | 0.6565 | 0.1748 | 0.1413 |
| 0.0683 | 52.0 | 1300 | 1.2648 | 0.655 | 0.4813 | 3.3447 | 0.655 | 0.6629 | 0.1627 | 0.1419 |
| 0.0683 | 53.0 | 1325 | 1.2650 | 0.65 | 0.4813 | 3.3350 | 0.65 | 0.6565 | 0.1780 | 0.1414 |
| 0.0683 | 54.0 | 1350 | 1.2593 | 0.655 | 0.4790 | 3.3427 | 0.655 | 0.6620 | 0.1543 | 0.1399 |
| 0.0683 | 55.0 | 1375 | 1.2648 | 0.6525 | 0.4810 | 3.3368 | 0.6525 | 0.6592 | 0.1723 | 0.1414 |
| 0.0683 | 56.0 | 1400 | 1.2608 | 0.6525 | 0.4802 | 3.2599 | 0.6525 | 0.6603 | 0.1738 | 0.1411 |
| 0.0683 | 57.0 | 1425 | 1.2639 | 0.6525 | 0.4799 | 3.3437 | 0.6525 | 0.6599 | 0.1767 | 0.1413 |
| 0.0683 | 58.0 | 1450 | 1.2631 | 0.65 | 0.4810 | 3.3401 | 0.65 | 0.6578 | 0.1667 | 0.1416 |
| 0.0683 | 59.0 | 1475 | 1.2636 | 0.6525 | 0.4803 | 3.3411 | 0.6525 | 0.6594 | 0.1690 | 0.1416 |
| 0.0391 | 60.0 | 1500 | 1.2618 | 0.6525 | 0.4796 | 3.2684 | 0.6525 | 0.6600 | 0.1813 | 0.1413 |
| 0.0391 | 61.0 | 1525 | 1.2636 | 0.6525 | 0.4807 | 3.2704 | 0.6525 | 0.6595 | 0.1673 | 0.1413 |
| 0.0391 | 62.0 | 1550 | 1.2615 | 0.65 | 0.4794 | 3.2662 | 0.65 | 0.6575 | 0.1741 | 0.1413 |
| 0.0391 | 63.0 | 1575 | 1.2630 | 0.65 | 0.4803 | 3.3417 | 0.65 | 0.6575 | 0.1752 | 0.1411 |
| 0.0391 | 64.0 | 1600 | 1.2618 | 0.65 | 0.4801 | 3.2663 | 0.65 | 0.6575 | 0.1770 | 0.1413 |
| 0.0391 | 65.0 | 1625 | 1.2622 | 0.65 | 0.4802 | 3.2698 | 0.65 | 0.6575 | 0.1686 | 0.1412 |
| 0.0391 | 66.0 | 1650 | 1.2622 | 0.65 | 0.4802 | 3.3400 | 0.65 | 0.6575 | 0.1922 | 0.1412 |
| 0.0391 | 67.0 | 1675 | 1.2625 | 0.65 | 0.4802 | 3.2694 | 0.65 | 0.6575 | 0.1801 | 0.1413 |
| 0.0391 | 68.0 | 1700 | 1.2626 | 0.65 | 0.4803 | 3.2683 | 0.65 | 0.6575 | 0.1656 | 0.1414 |
| 0.0391 | 69.0 | 1725 | 1.2631 | 0.65 | 0.4806 | 3.2696 | 0.65 | 0.6575 | 0.1722 | 0.1413 |
| 0.0391 | 70.0 | 1750 | 1.2622 | 0.65 | 0.4802 | 3.2688 | 0.65 | 0.6575 | 0.1812 | 0.1412 |
| 0.0391 | 71.0 | 1775 | 1.2626 | 0.65 | 0.4803 | 3.2676 | 0.65 | 0.6575 | 0.1845 | 0.1412 |
| 0.0391 | 72.0 | 1800 | 1.2621 | 0.65 | 0.4801 | 3.2683 | 0.65 | 0.6575 | 0.1805 | 0.1411 |
| 0.0391 | 73.0 | 1825 | 1.2626 | 0.65 | 0.4804 | 3.2683 | 0.65 | 0.6575 | 0.1665 | 0.1413 |
| 0.0391 | 74.0 | 1850 | 1.2624 | 0.65 | 0.4803 | 3.2686 | 0.65 | 0.6575 | 0.1773 | 0.1412 |
| 0.0391 | 75.0 | 1875 | 1.2624 | 0.65 | 0.4803 | 3.2682 | 0.65 | 0.6575 | 0.1807 | 0.1412 |
| 0.0391 | 76.0 | 1900 | 1.2627 | 0.65 | 0.4804 | 3.2680 | 0.65 | 0.6575 | 0.1732 | 0.1414 |
| 0.0391 | 77.0 | 1925 | 1.2625 | 0.65 | 0.4803 | 3.2673 | 0.65 | 0.6575 | 0.1715 | 0.1412 |
| 0.0391 | 78.0 | 1950 | 1.2623 | 0.65 | 0.4803 | 3.2681 | 0.65 | 0.6575 | 0.1840 | 0.1413 |
| 0.0391 | 79.0 | 1975 | 1.2624 | 0.65 | 0.4803 | 3.2678 | 0.65 | 0.6575 | 0.1773 | 0.1413 |
| 0.0385 | 80.0 | 2000 | 1.2625 | 0.65 | 0.4803 | 3.2686 | 0.65 | 0.6575 | 0.1802 | 0.1414 |
| 0.0385 | 81.0 | 2025 | 1.2625 | 0.65 | 0.4803 | 3.2677 | 0.65 | 0.6575 | 0.1773 | 0.1413 |
| 0.0385 | 82.0 | 2050 | 1.2625 | 0.65 | 0.4803 | 3.2684 | 0.65 | 0.6575 | 0.1802 | 0.1414 |
| 0.0385 | 83.0 | 2075 | 1.2624 | 0.65 | 0.4803 | 3.2679 | 0.65 | 0.6575 | 0.1823 | 0.1413 |
| 0.0385 | 84.0 | 2100 | 1.2623 | 0.65 | 0.4803 | 3.2681 | 0.65 | 0.6575 | 0.1772 | 0.1413 |
| 0.0385 | 85.0 | 2125 | 1.2624 | 0.65 | 0.4803 | 3.2677 | 0.65 | 0.6575 | 0.1722 | 0.1414 |
| 0.0385 | 86.0 | 2150 | 1.2625 | 0.65 | 0.4804 | 3.2680 | 0.65 | 0.6575 | 0.1751 | 0.1414 |
| 0.0385 | 87.0 | 2175 | 1.2623 | 0.65 | 0.4803 | 3.2677 | 0.65 | 0.6575 | 0.1772 | 0.1413 |
| 0.0385 | 88.0 | 2200 | 1.2624 | 0.65 | 0.4803 | 3.2676 | 0.65 | 0.6575 | 0.1723 | 0.1414 |
| 0.0385 | 89.0 | 2225 | 1.2623 | 0.65 | 0.4803 | 3.2679 | 0.65 | 0.6575 | 0.1722 | 0.1414 |
| 0.0385 | 90.0 | 2250 | 1.2622 | 0.65 | 0.4802 | 3.2677 | 0.65 | 0.6575 | 0.1722 | 0.1413 |
| 0.0385 | 91.0 | 2275 | 1.2623 | 0.65 | 0.4803 | 3.2678 | 0.65 | 0.6575 | 0.1722 | 0.1414 |
| 0.0385 | 92.0 | 2300 | 1.2624 | 0.65 | 0.4803 | 3.2677 | 0.65 | 0.6575 | 0.1722 | 0.1414 |
| 0.0385 | 93.0 | 2325 | 1.2623 | 0.65 | 0.4803 | 3.2679 | 0.65 | 0.6575 | 0.1722 | 0.1414 |
| 0.0385 | 94.0 | 2350 | 1.2623 | 0.65 | 0.4803 | 3.2677 | 0.65 | 0.6575 | 0.1722 | 0.1414 |
| 0.0385 | 95.0 | 2375 | 1.2623 | 0.65 | 0.4803 | 3.2676 | 0.65 | 0.6575 | 0.1722 | 0.1414 |
| 0.0385 | 96.0 | 2400 | 1.2623 | 0.65 | 0.4803 | 3.2677 | 0.65 | 0.6575 | 0.1722 | 0.1414 |
| 0.0385 | 97.0 | 2425 | 1.2623 | 0.65 | 0.4803 | 3.2677 | 0.65 | 0.6575 | 0.1722 | 0.1414 |
| 0.0385 | 98.0 | 2450 | 1.2623 | 0.65 | 0.4803 | 3.2677 | 0.65 | 0.6575 | 0.1722 | 0.1414 |
| 0.0385 | 99.0 | 2475 | 1.2623 | 0.65 | 0.4803 | 3.2676 | 0.65 | 0.6575 | 0.1722 | 0.1414 |
| 0.0385 | 100.0 | 2500 | 1.2623 | 0.65 | 0.4803 | 3.2676 | 0.65 | 0.6575 | 0.1722 | 0.1414 |
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.12.0
- Tokenizers 0.12.1
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-small_rvl_cdip_100_examples_per_class_kd_CEKD_t5.0_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_rvl_cdip_100_examples_per_class_kd_CEKD_t5.0_a0.7
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2378
- Accuracy: 0.645
- Brier Loss: 0.4995
- Nll: 2.6600
- F1 Micro: 0.645
- F1 Macro: 0.6464
- Ece: 0.1850
- Aurc: 0.1447
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 3.1863 | 0.105 | 0.9328 | 15.2391 | 0.1050 | 0.1096 | 0.1551 | 0.8788 |
| No log | 2.0 | 50 | 2.4570 | 0.395 | 0.7500 | 9.2532 | 0.395 | 0.3662 | 0.1883 | 0.3593 |
| No log | 3.0 | 75 | 1.9474 | 0.51 | 0.6157 | 5.2483 | 0.51 | 0.4950 | 0.1693 | 0.2362 |
| No log | 4.0 | 100 | 1.8038 | 0.5375 | 0.5910 | 4.7704 | 0.5375 | 0.5412 | 0.1672 | 0.2240 |
| No log | 5.0 | 125 | 1.7706 | 0.5425 | 0.6043 | 4.4142 | 0.5425 | 0.5313 | 0.1961 | 0.2262 |
| No log | 6.0 | 150 | 1.6182 | 0.58 | 0.5399 | 3.8940 | 0.58 | 0.5814 | 0.1548 | 0.1768 |
| No log | 7.0 | 175 | 1.6199 | 0.6025 | 0.5494 | 3.7722 | 0.6025 | 0.6047 | 0.1571 | 0.1815 |
| No log | 8.0 | 200 | 1.6354 | 0.585 | 0.5620 | 4.3106 | 0.585 | 0.5782 | 0.2067 | 0.1958 |
| No log | 9.0 | 225 | 1.8421 | 0.555 | 0.6076 | 5.4885 | 0.555 | 0.5516 | 0.1995 | 0.2339 |
| No log | 10.0 | 250 | 1.8780 | 0.545 | 0.6302 | 5.0672 | 0.545 | 0.5457 | 0.2036 | 0.2356 |
| No log | 11.0 | 275 | 1.4752 | 0.59 | 0.5450 | 3.4210 | 0.59 | 0.5985 | 0.1751 | 0.1817 |
| No log | 12.0 | 300 | 1.4825 | 0.615 | 0.5332 | 3.3838 | 0.615 | 0.6180 | 0.1764 | 0.1727 |
| No log | 13.0 | 325 | 1.4550 | 0.6325 | 0.5238 | 3.3565 | 0.6325 | 0.6264 | 0.1702 | 0.1607 |
| No log | 14.0 | 350 | 1.4558 | 0.6025 | 0.5424 | 3.2294 | 0.6025 | 0.6060 | 0.1850 | 0.1709 |
| No log | 15.0 | 375 | 1.4164 | 0.6225 | 0.5239 | 3.4651 | 0.6225 | 0.6149 | 0.1797 | 0.1727 |
| No log | 16.0 | 400 | 1.4977 | 0.5975 | 0.5490 | 4.1918 | 0.5975 | 0.5901 | 0.1918 | 0.1761 |
| No log | 17.0 | 425 | 1.4744 | 0.605 | 0.5490 | 3.7221 | 0.605 | 0.5971 | 0.1955 | 0.1752 |
| No log | 18.0 | 450 | 1.5371 | 0.6225 | 0.5563 | 3.9267 | 0.6225 | 0.6194 | 0.1946 | 0.1713 |
| No log | 19.0 | 475 | 1.3703 | 0.61 | 0.5230 | 2.9363 | 0.61 | 0.6115 | 0.1808 | 0.1606 |
| 0.6508 | 20.0 | 500 | 1.3942 | 0.625 | 0.5353 | 3.7288 | 0.625 | 0.6218 | 0.1949 | 0.1549 |
| 0.6508 | 21.0 | 525 | 1.3539 | 0.62 | 0.5281 | 3.2632 | 0.62 | 0.6256 | 0.2058 | 0.1554 |
| 0.6508 | 22.0 | 550 | 1.3411 | 0.6525 | 0.5040 | 3.4382 | 0.6525 | 0.6462 | 0.1740 | 0.1522 |
| 0.6508 | 23.0 | 575 | 1.3133 | 0.62 | 0.5073 | 3.1716 | 0.62 | 0.6213 | 0.1804 | 0.1497 |
| 0.6508 | 24.0 | 600 | 1.4132 | 0.6275 | 0.5343 | 3.4836 | 0.6275 | 0.6311 | 0.1808 | 0.1635 |
| 0.6508 | 25.0 | 625 | 1.4322 | 0.6275 | 0.5464 | 2.9913 | 0.6275 | 0.6374 | 0.1949 | 0.1747 |
| 0.6508 | 26.0 | 650 | 1.4199 | 0.615 | 0.5482 | 3.2476 | 0.615 | 0.6183 | 0.1977 | 0.1705 |
| 0.6508 | 27.0 | 675 | 1.3493 | 0.6275 | 0.5250 | 3.5747 | 0.6275 | 0.6239 | 0.2046 | 0.1518 |
| 0.6508 | 28.0 | 700 | 1.2954 | 0.635 | 0.5078 | 3.0855 | 0.635 | 0.6355 | 0.1787 | 0.1475 |
| 0.6508 | 29.0 | 725 | 1.3715 | 0.6375 | 0.5270 | 3.3421 | 0.6375 | 0.6254 | 0.1888 | 0.1591 |
| 0.6508 | 30.0 | 750 | 1.3038 | 0.645 | 0.5160 | 3.2790 | 0.645 | 0.6443 | 0.1859 | 0.1543 |
| 0.6508 | 31.0 | 775 | 1.3311 | 0.6375 | 0.5259 | 3.0953 | 0.6375 | 0.6364 | 0.1899 | 0.1593 |
| 0.6508 | 32.0 | 800 | 1.2487 | 0.6375 | 0.4942 | 2.9030 | 0.6375 | 0.6406 | 0.1822 | 0.1424 |
| 0.6508 | 33.0 | 825 | 1.2838 | 0.645 | 0.5096 | 2.8108 | 0.645 | 0.6448 | 0.1845 | 0.1532 |
| 0.6508 | 34.0 | 850 | 1.2788 | 0.6525 | 0.5103 | 2.8377 | 0.6525 | 0.6524 | 0.2013 | 0.1505 |
| 0.6508 | 35.0 | 875 | 1.2478 | 0.6425 | 0.5011 | 2.6533 | 0.6425 | 0.6432 | 0.1735 | 0.1435 |
| 0.6508 | 36.0 | 900 | 1.2420 | 0.6375 | 0.5030 | 2.5071 | 0.6375 | 0.6399 | 0.1853 | 0.1461 |
| 0.6508 | 37.0 | 925 | 1.2406 | 0.6375 | 0.4992 | 2.5840 | 0.6375 | 0.6391 | 0.1795 | 0.1456 |
| 0.6508 | 38.0 | 950 | 1.2493 | 0.645 | 0.5035 | 2.5959 | 0.645 | 0.6463 | 0.1905 | 0.1461 |
| 0.6508 | 39.0 | 975 | 1.2446 | 0.6425 | 0.5029 | 2.6545 | 0.6425 | 0.6441 | 0.1943 | 0.1445 |
| 0.0591 | 40.0 | 1000 | 1.2471 | 0.6525 | 0.5005 | 2.5163 | 0.6525 | 0.6529 | 0.1830 | 0.1460 |
| 0.0591 | 41.0 | 1025 | 1.2420 | 0.635 | 0.5009 | 2.5884 | 0.635 | 0.6371 | 0.1842 | 0.1448 |
| 0.0591 | 42.0 | 1050 | 1.2471 | 0.6475 | 0.5016 | 2.6730 | 0.6475 | 0.6476 | 0.1905 | 0.1463 |
| 0.0591 | 43.0 | 1075 | 1.2452 | 0.635 | 0.5036 | 2.5784 | 0.635 | 0.6373 | 0.1786 | 0.1466 |
| 0.0591 | 44.0 | 1100 | 1.2404 | 0.6475 | 0.4999 | 2.5804 | 0.6475 | 0.6468 | 0.1757 | 0.1448 |
| 0.0591 | 45.0 | 1125 | 1.2443 | 0.64 | 0.5025 | 2.5843 | 0.64 | 0.6425 | 0.1852 | 0.1457 |
| 0.0591 | 46.0 | 1150 | 1.2429 | 0.6425 | 0.5001 | 2.5071 | 0.6425 | 0.6441 | 0.1886 | 0.1454 |
| 0.0591 | 47.0 | 1175 | 1.2450 | 0.645 | 0.5028 | 2.5860 | 0.645 | 0.6460 | 0.1957 | 0.1453 |
| 0.0591 | 48.0 | 1200 | 1.2391 | 0.6375 | 0.4993 | 2.6594 | 0.6375 | 0.6379 | 0.1802 | 0.1456 |
| 0.0591 | 49.0 | 1225 | 1.2421 | 0.6425 | 0.5006 | 2.5857 | 0.6425 | 0.6428 | 0.1933 | 0.1450 |
| 0.0591 | 50.0 | 1250 | 1.2413 | 0.6425 | 0.5007 | 2.6657 | 0.6425 | 0.6432 | 0.1861 | 0.1455 |
| 0.0591 | 51.0 | 1275 | 1.2399 | 0.645 | 0.4995 | 2.5804 | 0.645 | 0.6469 | 0.1949 | 0.1448 |
| 0.0591 | 52.0 | 1300 | 1.2425 | 0.645 | 0.5013 | 2.5908 | 0.645 | 0.6442 | 0.1766 | 0.1448 |
| 0.0591 | 53.0 | 1325 | 1.2407 | 0.64 | 0.5006 | 2.5801 | 0.64 | 0.6415 | 0.1818 | 0.1458 |
| 0.0591 | 54.0 | 1350 | 1.2402 | 0.6425 | 0.5004 | 2.6583 | 0.6425 | 0.6451 | 0.1967 | 0.1452 |
| 0.0591 | 55.0 | 1375 | 1.2394 | 0.645 | 0.5000 | 2.5852 | 0.645 | 0.6464 | 0.1829 | 0.1446 |
| 0.0591 | 56.0 | 1400 | 1.2391 | 0.6425 | 0.4999 | 2.5903 | 0.6425 | 0.6444 | 0.1902 | 0.1449 |
| 0.0591 | 57.0 | 1425 | 1.2384 | 0.6475 | 0.4994 | 2.5864 | 0.6475 | 0.6483 | 0.1935 | 0.1446 |
| 0.0591 | 58.0 | 1450 | 1.2409 | 0.6425 | 0.5007 | 2.5842 | 0.6425 | 0.6450 | 0.1868 | 0.1451 |
| 0.0591 | 59.0 | 1475 | 1.2389 | 0.6425 | 0.4999 | 2.5848 | 0.6425 | 0.6444 | 0.1845 | 0.1447 |
| 0.0363 | 60.0 | 1500 | 1.2391 | 0.6425 | 0.4998 | 2.6608 | 0.6425 | 0.6443 | 0.1823 | 0.1449 |
| 0.0363 | 61.0 | 1525 | 1.2393 | 0.6475 | 0.5002 | 2.6602 | 0.6475 | 0.6484 | 0.1966 | 0.1446 |
| 0.0363 | 62.0 | 1550 | 1.2385 | 0.6425 | 0.4994 | 2.5912 | 0.6425 | 0.6427 | 0.1932 | 0.1448 |
| 0.0363 | 63.0 | 1575 | 1.2396 | 0.6425 | 0.5003 | 2.6605 | 0.6425 | 0.6444 | 0.1909 | 0.1450 |
| 0.0363 | 64.0 | 1600 | 1.2388 | 0.6425 | 0.4996 | 2.6609 | 0.6425 | 0.6443 | 0.1862 | 0.1449 |
| 0.0363 | 65.0 | 1625 | 1.2387 | 0.645 | 0.5000 | 2.6604 | 0.645 | 0.6465 | 0.1826 | 0.1446 |
| 0.0363 | 66.0 | 1650 | 1.2390 | 0.645 | 0.4998 | 2.5910 | 0.645 | 0.6464 | 0.1868 | 0.1447 |
| 0.0363 | 67.0 | 1675 | 1.2388 | 0.6425 | 0.4999 | 2.6605 | 0.6425 | 0.6444 | 0.1803 | 0.1448 |
| 0.0363 | 68.0 | 1700 | 1.2387 | 0.6425 | 0.4996 | 2.6608 | 0.6425 | 0.6444 | 0.1845 | 0.1448 |
| 0.0363 | 69.0 | 1725 | 1.2388 | 0.6475 | 0.4999 | 2.6597 | 0.6475 | 0.6484 | 0.1878 | 0.1445 |
| 0.0363 | 70.0 | 1750 | 1.2387 | 0.645 | 0.4997 | 2.6601 | 0.645 | 0.6465 | 0.1870 | 0.1448 |
| 0.0363 | 71.0 | 1775 | 1.2382 | 0.6425 | 0.4996 | 2.6606 | 0.6425 | 0.6444 | 0.1954 | 0.1448 |
| 0.0363 | 72.0 | 1800 | 1.2387 | 0.645 | 0.4998 | 2.6595 | 0.645 | 0.6465 | 0.1866 | 0.1447 |
| 0.0363 | 73.0 | 1825 | 1.2381 | 0.645 | 0.4996 | 2.6602 | 0.645 | 0.6464 | 0.1838 | 0.1446 |
| 0.0363 | 74.0 | 1850 | 1.2384 | 0.6425 | 0.4996 | 2.6605 | 0.6425 | 0.6444 | 0.1908 | 0.1449 |
| 0.0363 | 75.0 | 1875 | 1.2384 | 0.6425 | 0.4997 | 2.6601 | 0.6425 | 0.6443 | 0.1876 | 0.1449 |
| 0.0363 | 76.0 | 1900 | 1.2383 | 0.645 | 0.4996 | 2.6602 | 0.645 | 0.6464 | 0.1881 | 0.1447 |
| 0.0363 | 77.0 | 1925 | 1.2383 | 0.645 | 0.4997 | 2.6601 | 0.645 | 0.6464 | 0.1851 | 0.1447 |
| 0.0363 | 78.0 | 1950 | 1.2382 | 0.6425 | 0.4996 | 2.6601 | 0.6425 | 0.6443 | 0.1882 | 0.1448 |
| 0.0363 | 79.0 | 1975 | 1.2381 | 0.645 | 0.4996 | 2.6600 | 0.645 | 0.6464 | 0.1854 | 0.1447 |
| 0.036 | 80.0 | 2000 | 1.2381 | 0.6425 | 0.4996 | 2.6603 | 0.6425 | 0.6443 | 0.1882 | 0.1448 |
| 0.036 | 81.0 | 2025 | 1.2382 | 0.645 | 0.4996 | 2.6601 | 0.645 | 0.6464 | 0.1854 | 0.1447 |
| 0.036 | 82.0 | 2050 | 1.2380 | 0.6425 | 0.4996 | 2.6601 | 0.6425 | 0.6443 | 0.1942 | 0.1448 |
| 0.036 | 83.0 | 2075 | 1.2380 | 0.645 | 0.4996 | 2.6602 | 0.645 | 0.6464 | 0.1884 | 0.1447 |
| 0.036 | 84.0 | 2100 | 1.2379 | 0.645 | 0.4995 | 2.6601 | 0.645 | 0.6464 | 0.1849 | 0.1447 |
| 0.036 | 85.0 | 2125 | 1.2380 | 0.6425 | 0.4996 | 2.6600 | 0.6425 | 0.6443 | 0.1895 | 0.1449 |
| 0.036 | 86.0 | 2150 | 1.2381 | 0.645 | 0.4996 | 2.6601 | 0.645 | 0.6464 | 0.1870 | 0.1447 |
| 0.036 | 87.0 | 2175 | 1.2379 | 0.6425 | 0.4995 | 2.6601 | 0.6425 | 0.6443 | 0.1925 | 0.1449 |
| 0.036 | 88.0 | 2200 | 1.2379 | 0.645 | 0.4995 | 2.6600 | 0.645 | 0.6464 | 0.1900 | 0.1447 |
| 0.036 | 89.0 | 2225 | 1.2379 | 0.645 | 0.4995 | 2.6601 | 0.645 | 0.6464 | 0.1850 | 0.1447 |
| 0.036 | 90.0 | 2250 | 1.2379 | 0.645 | 0.4995 | 2.6599 | 0.645 | 0.6464 | 0.1900 | 0.1447 |
| 0.036 | 91.0 | 2275 | 1.2378 | 0.6425 | 0.4995 | 2.6600 | 0.6425 | 0.6443 | 0.1875 | 0.1448 |
| 0.036 | 92.0 | 2300 | 1.2379 | 0.645 | 0.4996 | 2.6600 | 0.645 | 0.6464 | 0.1850 | 0.1447 |
| 0.036 | 93.0 | 2325 | 1.2379 | 0.645 | 0.4995 | 2.6600 | 0.645 | 0.6464 | 0.1850 | 0.1447 |
| 0.036 | 94.0 | 2350 | 1.2378 | 0.645 | 0.4995 | 2.6599 | 0.645 | 0.6464 | 0.1850 | 0.1447 |
| 0.036 | 95.0 | 2375 | 1.2378 | 0.645 | 0.4995 | 2.6600 | 0.645 | 0.6464 | 0.1850 | 0.1447 |
| 0.036 | 96.0 | 2400 | 1.2378 | 0.645 | 0.4995 | 2.6600 | 0.645 | 0.6464 | 0.1850 | 0.1447 |
| 0.036 | 97.0 | 2425 | 1.2378 | 0.645 | 0.4995 | 2.6600 | 0.645 | 0.6464 | 0.1850 | 0.1447 |
| 0.036 | 98.0 | 2450 | 1.2378 | 0.645 | 0.4995 | 2.6600 | 0.645 | 0.6464 | 0.1850 | 0.1447 |
| 0.036 | 99.0 | 2475 | 1.2378 | 0.645 | 0.4995 | 2.6600 | 0.645 | 0.6464 | 0.1850 | 0.1447 |
| 0.036 | 100.0 | 2500 | 1.2378 | 0.645 | 0.4995 | 2.6600 | 0.645 | 0.6464 | 0.1850 | 0.1447 |
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.12.0
- Tokenizers 0.12.1
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-small_rvl_cdip_100_examples_per_class_kd_CEKD_t5.0_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_rvl_cdip_100_examples_per_class_kd_CEKD_t5.0_a0.9
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2897
- Accuracy: 0.635
- Brier Loss: 0.5186
- Nll: 2.9908
- F1 Micro: 0.635
- F1 Macro: 0.6391
- Ece: 0.1984
- Aurc: 0.1511
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 2.8799 | 0.12 | 0.9317 | 15.6566 | 0.12 | 0.1217 | 0.1503 | 0.8678 |
| No log | 2.0 | 50 | 2.2166 | 0.395 | 0.7576 | 9.4150 | 0.395 | 0.3645 | 0.2155 | 0.3726 |
| No log | 3.0 | 75 | 1.7821 | 0.505 | 0.6346 | 5.5305 | 0.505 | 0.4975 | 0.1755 | 0.2454 |
| No log | 4.0 | 100 | 1.6660 | 0.5275 | 0.6038 | 4.9669 | 0.5275 | 0.5333 | 0.1684 | 0.2324 |
| No log | 5.0 | 125 | 1.6118 | 0.54 | 0.5943 | 4.8266 | 0.54 | 0.5233 | 0.1947 | 0.2249 |
| No log | 6.0 | 150 | 1.7108 | 0.5275 | 0.6168 | 4.4308 | 0.5275 | 0.5247 | 0.2018 | 0.2418 |
| No log | 7.0 | 175 | 1.6465 | 0.5825 | 0.5721 | 4.8918 | 0.5825 | 0.5614 | 0.1887 | 0.1995 |
| No log | 8.0 | 200 | 1.6441 | 0.565 | 0.6040 | 4.2349 | 0.565 | 0.5591 | 0.1933 | 0.2216 |
| No log | 9.0 | 225 | 1.7054 | 0.565 | 0.6054 | 4.6348 | 0.565 | 0.5649 | 0.1845 | 0.2033 |
| No log | 10.0 | 250 | 1.6724 | 0.5375 | 0.6191 | 4.3502 | 0.5375 | 0.5257 | 0.1991 | 0.2223 |
| No log | 11.0 | 275 | 1.5397 | 0.57 | 0.5757 | 4.1311 | 0.57 | 0.5715 | 0.2079 | 0.1936 |
| No log | 12.0 | 300 | 1.7636 | 0.55 | 0.6394 | 5.0515 | 0.55 | 0.5376 | 0.2252 | 0.2268 |
| No log | 13.0 | 325 | 1.6080 | 0.575 | 0.5997 | 4.2707 | 0.575 | 0.5515 | 0.2048 | 0.1887 |
| No log | 14.0 | 350 | 1.7572 | 0.575 | 0.6205 | 4.6140 | 0.575 | 0.5705 | 0.2203 | 0.2342 |
| No log | 15.0 | 375 | 1.5604 | 0.58 | 0.5872 | 3.8633 | 0.58 | 0.5762 | 0.2089 | 0.1866 |
| No log | 16.0 | 400 | 1.6440 | 0.585 | 0.6042 | 4.2508 | 0.585 | 0.5940 | 0.2253 | 0.2182 |
| No log | 17.0 | 425 | 1.6117 | 0.5825 | 0.6057 | 4.2511 | 0.5825 | 0.5732 | 0.2299 | 0.1947 |
| No log | 18.0 | 450 | 1.5597 | 0.605 | 0.5732 | 4.4755 | 0.605 | 0.6028 | 0.2101 | 0.1721 |
| No log | 19.0 | 475 | 1.4177 | 0.6325 | 0.5429 | 3.4771 | 0.6325 | 0.6319 | 0.1930 | 0.1786 |
| 0.5354 | 20.0 | 500 | 1.5745 | 0.56 | 0.6076 | 3.6058 | 0.56 | 0.5643 | 0.2265 | 0.1898 |
| 0.5354 | 21.0 | 525 | 1.4907 | 0.6125 | 0.5682 | 3.9837 | 0.6125 | 0.6184 | 0.1981 | 0.1810 |
| 0.5354 | 22.0 | 550 | 1.4494 | 0.5925 | 0.5677 | 3.2864 | 0.5925 | 0.5906 | 0.2187 | 0.1670 |
| 0.5354 | 23.0 | 575 | 1.5608 | 0.62 | 0.5830 | 4.0132 | 0.62 | 0.6029 | 0.2286 | 0.1808 |
| 0.5354 | 24.0 | 600 | 1.5038 | 0.58 | 0.5957 | 3.6519 | 0.58 | 0.5956 | 0.2321 | 0.1879 |
| 0.5354 | 25.0 | 625 | 1.4094 | 0.615 | 0.5554 | 3.0313 | 0.615 | 0.6102 | 0.2180 | 0.1689 |
| 0.5354 | 26.0 | 650 | 1.4485 | 0.62 | 0.5712 | 3.3326 | 0.62 | 0.6181 | 0.2138 | 0.1729 |
| 0.5354 | 27.0 | 675 | 1.4156 | 0.6225 | 0.5621 | 3.2257 | 0.6225 | 0.6239 | 0.2158 | 0.1718 |
| 0.5354 | 28.0 | 700 | 1.3729 | 0.6275 | 0.5476 | 3.1300 | 0.6275 | 0.6285 | 0.2078 | 0.1620 |
| 0.5354 | 29.0 | 725 | 1.3671 | 0.6275 | 0.5337 | 3.4625 | 0.6275 | 0.6285 | 0.2177 | 0.1586 |
| 0.5354 | 30.0 | 750 | 1.3263 | 0.63 | 0.5380 | 3.2177 | 0.63 | 0.6338 | 0.2063 | 0.1577 |
| 0.5354 | 31.0 | 775 | 1.2991 | 0.6225 | 0.5223 | 3.0482 | 0.6225 | 0.6238 | 0.1940 | 0.1525 |
| 0.5354 | 32.0 | 800 | 1.3227 | 0.6325 | 0.5333 | 2.9622 | 0.6325 | 0.6351 | 0.1906 | 0.1554 |
| 0.5354 | 33.0 | 825 | 1.3077 | 0.63 | 0.5298 | 3.2060 | 0.63 | 0.6338 | 0.1933 | 0.1555 |
| 0.5354 | 34.0 | 850 | 1.3036 | 0.6225 | 0.5269 | 3.0431 | 0.6225 | 0.6242 | 0.1996 | 0.1535 |
| 0.5354 | 35.0 | 875 | 1.3057 | 0.6275 | 0.5263 | 2.9651 | 0.6275 | 0.6291 | 0.2023 | 0.1538 |
| 0.5354 | 36.0 | 900 | 1.2992 | 0.6275 | 0.5247 | 2.9748 | 0.6275 | 0.6289 | 0.1961 | 0.1518 |
| 0.5354 | 37.0 | 925 | 1.3001 | 0.6325 | 0.5252 | 2.9784 | 0.6325 | 0.6347 | 0.1978 | 0.1531 |
| 0.5354 | 38.0 | 950 | 1.2990 | 0.63 | 0.5229 | 2.9014 | 0.63 | 0.6327 | 0.1981 | 0.1524 |
| 0.5354 | 39.0 | 975 | 1.2995 | 0.6325 | 0.5246 | 2.9776 | 0.6325 | 0.6354 | 0.1946 | 0.1533 |
| 0.0336 | 40.0 | 1000 | 1.2945 | 0.6275 | 0.5226 | 2.9029 | 0.6275 | 0.6302 | 0.1965 | 0.1523 |
| 0.0336 | 41.0 | 1025 | 1.3023 | 0.63 | 0.5247 | 3.0515 | 0.63 | 0.6341 | 0.2044 | 0.1534 |
| 0.0336 | 42.0 | 1050 | 1.2990 | 0.635 | 0.5239 | 3.0673 | 0.635 | 0.6381 | 0.1952 | 0.1516 |
| 0.0336 | 43.0 | 1075 | 1.2962 | 0.635 | 0.5213 | 3.0585 | 0.635 | 0.6378 | 0.2055 | 0.1523 |
| 0.0336 | 44.0 | 1100 | 1.2991 | 0.625 | 0.5229 | 2.9801 | 0.625 | 0.6278 | 0.1954 | 0.1532 |
| 0.0336 | 45.0 | 1125 | 1.2949 | 0.6375 | 0.5222 | 3.0564 | 0.6375 | 0.6419 | 0.2027 | 0.1519 |
| 0.0336 | 46.0 | 1150 | 1.2989 | 0.6275 | 0.5228 | 3.0737 | 0.6275 | 0.6308 | 0.2075 | 0.1529 |
| 0.0336 | 47.0 | 1175 | 1.2902 | 0.6325 | 0.5201 | 3.0606 | 0.6325 | 0.6360 | 0.2099 | 0.1516 |
| 0.0336 | 48.0 | 1200 | 1.2971 | 0.6275 | 0.5217 | 3.0829 | 0.6275 | 0.6305 | 0.1882 | 0.1518 |
| 0.0336 | 49.0 | 1225 | 1.2913 | 0.63 | 0.5212 | 2.9853 | 0.63 | 0.6332 | 0.1928 | 0.1524 |
| 0.0336 | 50.0 | 1250 | 1.2917 | 0.63 | 0.5205 | 2.9850 | 0.63 | 0.6336 | 0.1910 | 0.1518 |
| 0.0336 | 51.0 | 1275 | 1.2928 | 0.63 | 0.5208 | 3.0579 | 0.63 | 0.6330 | 0.2020 | 0.1528 |
| 0.0336 | 52.0 | 1300 | 1.2941 | 0.635 | 0.5205 | 3.0647 | 0.635 | 0.6383 | 0.1919 | 0.1515 |
| 0.0336 | 53.0 | 1325 | 1.2930 | 0.635 | 0.5207 | 3.0637 | 0.635 | 0.6384 | 0.1868 | 0.1518 |
| 0.0336 | 54.0 | 1350 | 1.2918 | 0.63 | 0.5203 | 3.0628 | 0.63 | 0.6335 | 0.1986 | 0.1519 |
| 0.0336 | 55.0 | 1375 | 1.2894 | 0.635 | 0.5198 | 2.9874 | 0.635 | 0.6383 | 0.2026 | 0.1514 |
| 0.0336 | 56.0 | 1400 | 1.2913 | 0.63 | 0.5203 | 3.0691 | 0.63 | 0.6337 | 0.2045 | 0.1519 |
| 0.0336 | 57.0 | 1425 | 1.2923 | 0.6325 | 0.5205 | 2.9869 | 0.6325 | 0.6358 | 0.1962 | 0.1522 |
| 0.0336 | 58.0 | 1450 | 1.2927 | 0.6375 | 0.5199 | 3.0734 | 0.6375 | 0.6408 | 0.1905 | 0.1514 |
| 0.0336 | 59.0 | 1475 | 1.2931 | 0.6325 | 0.5204 | 3.0607 | 0.6325 | 0.6353 | 0.1980 | 0.1520 |
| 0.0236 | 60.0 | 1500 | 1.2911 | 0.6325 | 0.5199 | 3.0664 | 0.6325 | 0.6359 | 0.1875 | 0.1517 |
| 0.0236 | 61.0 | 1525 | 1.2901 | 0.635 | 0.5195 | 2.9877 | 0.635 | 0.6386 | 0.1907 | 0.1516 |
| 0.0236 | 62.0 | 1550 | 1.2913 | 0.635 | 0.5192 | 3.0655 | 0.635 | 0.6383 | 0.1971 | 0.1515 |
| 0.0236 | 63.0 | 1575 | 1.2920 | 0.635 | 0.5201 | 3.0044 | 0.635 | 0.6379 | 0.1991 | 0.1514 |
| 0.0236 | 64.0 | 1600 | 1.2911 | 0.635 | 0.5192 | 3.0654 | 0.635 | 0.6380 | 0.1848 | 0.1509 |
| 0.0236 | 65.0 | 1625 | 1.2924 | 0.635 | 0.5196 | 3.1438 | 0.635 | 0.6379 | 0.1969 | 0.1515 |
| 0.0236 | 66.0 | 1650 | 1.2901 | 0.635 | 0.5191 | 2.9928 | 0.635 | 0.6392 | 0.1978 | 0.1507 |
| 0.0236 | 67.0 | 1675 | 1.2911 | 0.6325 | 0.5189 | 3.0662 | 0.6325 | 0.6359 | 0.1896 | 0.1517 |
| 0.0236 | 68.0 | 1700 | 1.2911 | 0.6375 | 0.5193 | 2.9932 | 0.6375 | 0.6404 | 0.2017 | 0.1507 |
| 0.0236 | 69.0 | 1725 | 1.2893 | 0.635 | 0.5189 | 2.9907 | 0.635 | 0.6391 | 0.1951 | 0.1511 |
| 0.0236 | 70.0 | 1750 | 1.2913 | 0.6325 | 0.5195 | 2.9919 | 0.6325 | 0.6362 | 0.1955 | 0.1513 |
| 0.0236 | 71.0 | 1775 | 1.2899 | 0.635 | 0.5188 | 2.9899 | 0.635 | 0.6386 | 0.2049 | 0.1511 |
| 0.0236 | 72.0 | 1800 | 1.2912 | 0.635 | 0.5192 | 2.9914 | 0.635 | 0.6379 | 0.1924 | 0.1513 |
| 0.0236 | 73.0 | 1825 | 1.2898 | 0.6325 | 0.5188 | 2.9901 | 0.6325 | 0.6367 | 0.2059 | 0.1511 |
| 0.0236 | 74.0 | 1850 | 1.2902 | 0.635 | 0.5190 | 2.9918 | 0.635 | 0.6391 | 0.2069 | 0.1511 |
| 0.0236 | 75.0 | 1875 | 1.2904 | 0.635 | 0.5191 | 2.9916 | 0.635 | 0.6391 | 0.1969 | 0.1511 |
| 0.0236 | 76.0 | 1900 | 1.2905 | 0.635 | 0.5191 | 2.9899 | 0.635 | 0.6391 | 0.1969 | 0.1512 |
| 0.0236 | 77.0 | 1925 | 1.2904 | 0.635 | 0.5191 | 2.9917 | 0.635 | 0.6391 | 0.1926 | 0.1511 |
| 0.0236 | 78.0 | 1950 | 1.2899 | 0.635 | 0.5188 | 2.9909 | 0.635 | 0.6391 | 0.2010 | 0.1510 |
| 0.0236 | 79.0 | 1975 | 1.2900 | 0.635 | 0.5188 | 2.9908 | 0.635 | 0.6391 | 0.2034 | 0.1511 |
| 0.0233 | 80.0 | 2000 | 1.2900 | 0.635 | 0.5188 | 2.9910 | 0.635 | 0.6391 | 0.1967 | 0.1511 |
| 0.0233 | 81.0 | 2025 | 1.2900 | 0.635 | 0.5188 | 2.9911 | 0.635 | 0.6391 | 0.2002 | 0.1511 |
| 0.0233 | 82.0 | 2050 | 1.2901 | 0.635 | 0.5189 | 2.9909 | 0.635 | 0.6391 | 0.1993 | 0.1511 |
| 0.0233 | 83.0 | 2075 | 1.2900 | 0.635 | 0.5188 | 2.9906 | 0.635 | 0.6391 | 0.1937 | 0.1511 |
| 0.0233 | 84.0 | 2100 | 1.2901 | 0.635 | 0.5189 | 2.9917 | 0.635 | 0.6391 | 0.2026 | 0.1511 |
| 0.0233 | 85.0 | 2125 | 1.2899 | 0.635 | 0.5188 | 2.9905 | 0.635 | 0.6391 | 0.1993 | 0.1512 |
| 0.0233 | 86.0 | 2150 | 1.2897 | 0.635 | 0.5187 | 2.9906 | 0.635 | 0.6391 | 0.1976 | 0.1511 |
| 0.0233 | 87.0 | 2175 | 1.2899 | 0.635 | 0.5188 | 2.9905 | 0.635 | 0.6391 | 0.1980 | 0.1511 |
| 0.0233 | 88.0 | 2200 | 1.2897 | 0.635 | 0.5187 | 2.9911 | 0.635 | 0.6391 | 0.1957 | 0.1511 |
| 0.0233 | 89.0 | 2225 | 1.2899 | 0.635 | 0.5187 | 2.9910 | 0.635 | 0.6391 | 0.1970 | 0.1511 |
| 0.0233 | 90.0 | 2250 | 1.2898 | 0.635 | 0.5187 | 2.9905 | 0.635 | 0.6391 | 0.1988 | 0.1512 |
| 0.0233 | 91.0 | 2275 | 1.2897 | 0.635 | 0.5187 | 2.9908 | 0.635 | 0.6391 | 0.1961 | 0.1511 |
| 0.0233 | 92.0 | 2300 | 1.2898 | 0.635 | 0.5187 | 2.9908 | 0.635 | 0.6391 | 0.1966 | 0.1511 |
| 0.0233 | 93.0 | 2325 | 1.2897 | 0.635 | 0.5186 | 2.9908 | 0.635 | 0.6391 | 0.1984 | 0.1511 |
| 0.0233 | 94.0 | 2350 | 1.2898 | 0.635 | 0.5187 | 2.9907 | 0.635 | 0.6391 | 0.2009 | 0.1511 |
| 0.0233 | 95.0 | 2375 | 1.2897 | 0.635 | 0.5186 | 2.9908 | 0.635 | 0.6391 | 0.2023 | 0.1511 |
| 0.0233 | 96.0 | 2400 | 1.2897 | 0.635 | 0.5186 | 2.9908 | 0.635 | 0.6391 | 0.1985 | 0.1511 |
| 0.0233 | 97.0 | 2425 | 1.2897 | 0.635 | 0.5186 | 2.9908 | 0.635 | 0.6391 | 0.1984 | 0.1511 |
| 0.0233 | 98.0 | 2450 | 1.2897 | 0.635 | 0.5186 | 2.9908 | 0.635 | 0.6391 | 0.1985 | 0.1511 |
| 0.0233 | 99.0 | 2475 | 1.2897 | 0.635 | 0.5186 | 2.9909 | 0.635 | 0.6391 | 0.1984 | 0.1511 |
| 0.0232 | 100.0 | 2500 | 1.2897 | 0.635 | 0.5186 | 2.9908 | 0.635 | 0.6391 | 0.1984 | 0.1511 |
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.12.0
- Tokenizers 0.12.1
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
sjdata/vit-base-beans
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-beans
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0138
- Accuracy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
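As a pointer while this section is incomplete: the beans dataset referenced above is distributed through the `datasets` library. A minimal loading sketch, assuming the standard `beans` dataset on the Hub and its `labels` class column:

```python
from datasets import load_dataset

# Leaf images labeled angular_leaf_spot, bean_rust, or healthy.
dataset = load_dataset("beans")
print(dataset)                                    # train/validation/test splits
print(dataset["train"].features["labels"].names)  # class names
```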
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1595 | 1.54 | 100 | 0.1212 | 0.9699 |
| 0.014 | 3.08 | 200 | 0.0138 | 1.0 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"angular_leaf_spot",
"bean_rust",
"healthy"
] |
miki-kawa/huggingdatavit-base-beans
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# huggingdatavit-base-beans
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0356
- Accuracy: 0.9925
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1059 | 1.54 | 100 | 0.0356 | 0.9925 |
| 0.0256 | 3.08 | 200 | 0.0663 | 0.9774 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.13.0
- Tokenizers 0.11.0
|
[
"angular_leaf_spot",
"bean_rust",
"healthy"
] |
jordyvl/vit-small_tobacco3482_kd_CE
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_tobacco3482_kd_CE
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set (a sketch of the ECE calculation follows the list):
- Loss: 0.7692
- Accuracy: 0.845
- Brier Loss: 0.2469
- Nll: 1.1078
- F1 Micro: 0.845
- F1 Macro: 0.8517
- Ece: 0.1239
- Aurc: 0.0373
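Brier loss and ECE above are calibration metrics computed from the predicted class probabilities. The sketch below illustrates one common equal-width-bin formulation of expected calibration error; it is an assumption about the metric's general form, not the exact evaluation code used for this card.

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=10):
    """Equal-width-bin ECE: |accuracy - confidence| per bin, weighted by bin size."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)

    ece = 0.0
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(accuracies[in_bin].mean() - confidences[in_bin].mean())
    return ece

# Toy example: 4 samples, 10 classes (the Tobacco3482 label count).
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(10), size=4)
labels = np.array([0, 1, 2, 3])
print(expected_calibration_error(probs, labels))
```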
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 2.2384 | 0.215 | 0.8750 | 5.2607 | 0.2150 | 0.1263 | 0.2515 | 0.6887 |
| No log | 2.0 | 14 | 1.8043 | 0.395 | 0.7402 | 3.6938 | 0.395 | 0.2230 | 0.2859 | 0.4088 |
| No log | 3.0 | 21 | 1.2581 | 0.61 | 0.5613 | 2.0209 | 0.61 | 0.5557 | 0.2634 | 0.2084 |
| No log | 4.0 | 28 | 0.8820 | 0.7 | 0.4017 | 1.7039 | 0.7 | 0.6574 | 0.2397 | 0.1123 |
| No log | 5.0 | 35 | 0.8338 | 0.74 | 0.3807 | 1.7427 | 0.74 | 0.7425 | 0.2155 | 0.1001 |
| No log | 6.0 | 42 | 0.7026 | 0.775 | 0.3202 | 1.4789 | 0.775 | 0.7666 | 0.1781 | 0.0771 |
| No log | 7.0 | 49 | 0.7935 | 0.77 | 0.3635 | 1.5766 | 0.7700 | 0.7840 | 0.2029 | 0.0857 |
| No log | 8.0 | 56 | 0.6819 | 0.8 | 0.3047 | 1.3800 | 0.8000 | 0.7987 | 0.1758 | 0.0738 |
| No log | 9.0 | 63 | 0.7826 | 0.775 | 0.3434 | 1.5345 | 0.775 | 0.7888 | 0.1910 | 0.0807 |
| No log | 10.0 | 70 | 0.8752 | 0.775 | 0.3392 | 1.5110 | 0.775 | 0.7813 | 0.1731 | 0.0737 |
| No log | 11.0 | 77 | 1.0440 | 0.72 | 0.4285 | 1.7284 | 0.72 | 0.7241 | 0.2416 | 0.1094 |
| No log | 12.0 | 84 | 0.8109 | 0.785 | 0.3411 | 1.3933 | 0.785 | 0.7806 | 0.1694 | 0.0690 |
| No log | 13.0 | 91 | 0.9980 | 0.76 | 0.3852 | 1.5143 | 0.76 | 0.6919 | 0.1917 | 0.0892 |
| No log | 14.0 | 98 | 1.0056 | 0.775 | 0.3773 | 1.4735 | 0.775 | 0.7733 | 0.2060 | 0.0871 |
| No log | 15.0 | 105 | 1.2081 | 0.75 | 0.4260 | 1.5653 | 0.75 | 0.7373 | 0.2110 | 0.0983 |
| No log | 16.0 | 112 | 1.1463 | 0.78 | 0.3781 | 1.5472 | 0.78 | 0.7906 | 0.1736 | 0.1004 |
| No log | 17.0 | 119 | 0.9384 | 0.825 | 0.2973 | 1.5527 | 0.825 | 0.8334 | 0.1529 | 0.0648 |
| No log | 18.0 | 126 | 0.9258 | 0.785 | 0.3464 | 1.2875 | 0.785 | 0.7832 | 0.1694 | 0.0562 |
| No log | 19.0 | 133 | 1.1667 | 0.8 | 0.3406 | 1.8919 | 0.8000 | 0.8038 | 0.1518 | 0.0705 |
| No log | 20.0 | 140 | 0.9351 | 0.81 | 0.3116 | 1.4283 | 0.81 | 0.8084 | 0.1532 | 0.0628 |
| No log | 21.0 | 147 | 1.2016 | 0.77 | 0.4040 | 1.2958 | 0.7700 | 0.7606 | 0.2051 | 0.0827 |
| No log | 22.0 | 154 | 1.3592 | 0.765 | 0.4040 | 1.6059 | 0.765 | 0.7645 | 0.1941 | 0.1163 |
| No log | 23.0 | 161 | 0.9921 | 0.805 | 0.3374 | 1.6304 | 0.805 | 0.8029 | 0.1710 | 0.0558 |
| No log | 24.0 | 168 | 0.8805 | 0.83 | 0.2953 | 1.1996 | 0.83 | 0.8189 | 0.1547 | 0.0611 |
| No log | 25.0 | 175 | 0.9926 | 0.815 | 0.3148 | 1.2949 | 0.815 | 0.8050 | 0.1638 | 0.0606 |
| No log | 26.0 | 182 | 1.0838 | 0.83 | 0.3171 | 1.3327 | 0.83 | 0.8265 | 0.1632 | 0.0732 |
| No log | 27.0 | 189 | 1.1845 | 0.8 | 0.3382 | 1.3456 | 0.8000 | 0.7942 | 0.1814 | 0.0798 |
| No log | 28.0 | 196 | 0.9800 | 0.83 | 0.2999 | 1.3172 | 0.83 | 0.8275 | 0.1563 | 0.0798 |
| No log | 29.0 | 203 | 0.9653 | 0.85 | 0.2724 | 1.3303 | 0.85 | 0.8531 | 0.1415 | 0.0556 |
| No log | 30.0 | 210 | 0.9896 | 0.85 | 0.2837 | 1.3282 | 0.85 | 0.8494 | 0.1373 | 0.0596 |
| No log | 31.0 | 217 | 0.9196 | 0.84 | 0.2844 | 1.2157 | 0.8400 | 0.8437 | 0.1516 | 0.0508 |
| No log | 32.0 | 224 | 0.9701 | 0.83 | 0.3062 | 1.2264 | 0.83 | 0.8364 | 0.1608 | 0.0554 |
| No log | 33.0 | 231 | 0.7464 | 0.865 | 0.2353 | 1.1321 | 0.865 | 0.8613 | 0.1265 | 0.0432 |
| No log | 34.0 | 238 | 0.7593 | 0.865 | 0.2367 | 1.1160 | 0.865 | 0.8649 | 0.1322 | 0.0430 |
| No log | 35.0 | 245 | 0.7450 | 0.855 | 0.2465 | 1.0615 | 0.855 | 0.8536 | 0.1279 | 0.0413 |
| No log | 36.0 | 252 | 0.7389 | 0.845 | 0.2546 | 1.0563 | 0.845 | 0.8429 | 0.1266 | 0.0417 |
| No log | 37.0 | 259 | 0.7332 | 0.845 | 0.2542 | 1.0549 | 0.845 | 0.8452 | 0.1293 | 0.0413 |
| No log | 38.0 | 266 | 0.7328 | 0.85 | 0.2531 | 1.0554 | 0.85 | 0.8490 | 0.1331 | 0.0407 |
| No log | 39.0 | 273 | 0.7342 | 0.85 | 0.2514 | 1.0558 | 0.85 | 0.8490 | 0.1339 | 0.0398 |
| No log | 40.0 | 280 | 0.7367 | 0.855 | 0.2498 | 1.0564 | 0.855 | 0.8529 | 0.1362 | 0.0391 |
| No log | 41.0 | 287 | 0.7395 | 0.855 | 0.2489 | 1.0574 | 0.855 | 0.8529 | 0.1307 | 0.0392 |
| No log | 42.0 | 294 | 0.7412 | 0.855 | 0.2480 | 1.0598 | 0.855 | 0.8529 | 0.1237 | 0.0393 |
| No log | 43.0 | 301 | 0.7434 | 0.855 | 0.2475 | 1.0635 | 0.855 | 0.8550 | 0.1161 | 0.0392 |
| No log | 44.0 | 308 | 0.7453 | 0.855 | 0.2473 | 1.0725 | 0.855 | 0.8550 | 0.1237 | 0.0392 |
| No log | 45.0 | 315 | 0.7462 | 0.855 | 0.2471 | 1.1225 | 0.855 | 0.8550 | 0.1205 | 0.0391 |
| No log | 46.0 | 322 | 0.7471 | 0.855 | 0.2468 | 1.1219 | 0.855 | 0.8550 | 0.1155 | 0.0391 |
| No log | 47.0 | 329 | 0.7481 | 0.85 | 0.2466 | 1.1213 | 0.85 | 0.8519 | 0.1283 | 0.0390 |
| No log | 48.0 | 336 | 0.7492 | 0.85 | 0.2464 | 1.1207 | 0.85 | 0.8519 | 0.1334 | 0.0388 |
| No log | 49.0 | 343 | 0.7504 | 0.85 | 0.2464 | 1.1203 | 0.85 | 0.8519 | 0.1379 | 0.0387 |
| No log | 50.0 | 350 | 0.7515 | 0.85 | 0.2465 | 1.1201 | 0.85 | 0.8519 | 0.1267 | 0.0387 |
| No log | 51.0 | 357 | 0.7523 | 0.85 | 0.2464 | 1.1198 | 0.85 | 0.8519 | 0.1265 | 0.0385 |
| No log | 52.0 | 364 | 0.7532 | 0.85 | 0.2463 | 1.1194 | 0.85 | 0.8519 | 0.1201 | 0.0385 |
| No log | 53.0 | 371 | 0.7534 | 0.855 | 0.2461 | 1.1189 | 0.855 | 0.8602 | 0.1266 | 0.0384 |
| No log | 54.0 | 378 | 0.7542 | 0.855 | 0.2460 | 1.1185 | 0.855 | 0.8602 | 0.1279 | 0.0386 |
| No log | 55.0 | 385 | 0.7547 | 0.855 | 0.2459 | 1.1180 | 0.855 | 0.8602 | 0.1332 | 0.0381 |
| No log | 56.0 | 392 | 0.7556 | 0.855 | 0.2460 | 1.1176 | 0.855 | 0.8602 | 0.1256 | 0.0380 |
| No log | 57.0 | 399 | 0.7564 | 0.855 | 0.2460 | 1.1171 | 0.855 | 0.8602 | 0.1252 | 0.0381 |
| No log | 58.0 | 406 | 0.7571 | 0.855 | 0.2461 | 1.1166 | 0.855 | 0.8602 | 0.1231 | 0.0379 |
| No log | 59.0 | 413 | 0.7581 | 0.855 | 0.2463 | 1.1162 | 0.855 | 0.8602 | 0.1295 | 0.0378 |
| No log | 60.0 | 420 | 0.7588 | 0.855 | 0.2464 | 1.1159 | 0.855 | 0.8602 | 0.1224 | 0.0378 |
| No log | 61.0 | 427 | 0.7594 | 0.855 | 0.2465 | 1.1155 | 0.855 | 0.8602 | 0.1226 | 0.0378 |
| No log | 62.0 | 434 | 0.7598 | 0.855 | 0.2464 | 1.1152 | 0.855 | 0.8602 | 0.1231 | 0.0378 |
| No log | 63.0 | 441 | 0.7605 | 0.855 | 0.2465 | 1.1149 | 0.855 | 0.8602 | 0.1231 | 0.0378 |
| No log | 64.0 | 448 | 0.7610 | 0.855 | 0.2465 | 1.1144 | 0.855 | 0.8602 | 0.1222 | 0.0377 |
| No log | 65.0 | 455 | 0.7618 | 0.855 | 0.2466 | 1.1140 | 0.855 | 0.8602 | 0.1229 | 0.0377 |
| No log | 66.0 | 462 | 0.7625 | 0.855 | 0.2468 | 1.1137 | 0.855 | 0.8602 | 0.1317 | 0.0378 |
| No log | 67.0 | 469 | 0.7630 | 0.855 | 0.2468 | 1.1133 | 0.855 | 0.8602 | 0.1317 | 0.0377 |
| No log | 68.0 | 476 | 0.7633 | 0.855 | 0.2468 | 1.1130 | 0.855 | 0.8602 | 0.1317 | 0.0377 |
| No log | 69.0 | 483 | 0.7636 | 0.855 | 0.2468 | 1.1127 | 0.855 | 0.8602 | 0.1318 | 0.0376 |
| No log | 70.0 | 490 | 0.7640 | 0.855 | 0.2468 | 1.1124 | 0.855 | 0.8602 | 0.1319 | 0.0376 |
| No log | 71.0 | 497 | 0.7645 | 0.85 | 0.2468 | 1.1120 | 0.85 | 0.8548 | 0.1280 | 0.0375 |
| 0.1221 | 72.0 | 504 | 0.7649 | 0.85 | 0.2468 | 1.1116 | 0.85 | 0.8548 | 0.1293 | 0.0375 |
| 0.1221 | 73.0 | 511 | 0.7653 | 0.85 | 0.2469 | 1.1113 | 0.85 | 0.8548 | 0.1293 | 0.0374 |
| 0.1221 | 74.0 | 518 | 0.7656 | 0.85 | 0.2469 | 1.1111 | 0.85 | 0.8548 | 0.1293 | 0.0373 |
| 0.1221 | 75.0 | 525 | 0.7659 | 0.85 | 0.2469 | 1.1108 | 0.85 | 0.8548 | 0.1208 | 0.0373 |
| 0.1221 | 76.0 | 532 | 0.7662 | 0.85 | 0.2469 | 1.1106 | 0.85 | 0.8548 | 0.1207 | 0.0374 |
| 0.1221 | 77.0 | 539 | 0.7664 | 0.85 | 0.2469 | 1.1104 | 0.85 | 0.8548 | 0.1207 | 0.0374 |
| 0.1221 | 78.0 | 546 | 0.7665 | 0.85 | 0.2469 | 1.1102 | 0.85 | 0.8548 | 0.1301 | 0.0375 |
| 0.1221 | 79.0 | 553 | 0.7667 | 0.85 | 0.2469 | 1.1100 | 0.85 | 0.8548 | 0.1301 | 0.0375 |
| 0.1221 | 80.0 | 560 | 0.7668 | 0.85 | 0.2468 | 1.1097 | 0.85 | 0.8548 | 0.1301 | 0.0374 |
| 0.1221 | 81.0 | 567 | 0.7669 | 0.85 | 0.2468 | 1.1095 | 0.85 | 0.8548 | 0.1300 | 0.0374 |
| 0.1221 | 82.0 | 574 | 0.7672 | 0.85 | 0.2468 | 1.1094 | 0.85 | 0.8548 | 0.1301 | 0.0374 |
| 0.1221 | 83.0 | 581 | 0.7674 | 0.85 | 0.2469 | 1.1092 | 0.85 | 0.8548 | 0.1301 | 0.0374 |
| 0.1221 | 84.0 | 588 | 0.7678 | 0.85 | 0.2469 | 1.1090 | 0.85 | 0.8548 | 0.1302 | 0.0374 |
| 0.1221 | 85.0 | 595 | 0.7678 | 0.85 | 0.2468 | 1.1089 | 0.85 | 0.8548 | 0.1284 | 0.0373 |
| 0.1221 | 86.0 | 602 | 0.7679 | 0.845 | 0.2468 | 1.1087 | 0.845 | 0.8517 | 0.1243 | 0.0373 |
| 0.1221 | 87.0 | 609 | 0.7681 | 0.845 | 0.2468 | 1.1086 | 0.845 | 0.8517 | 0.1244 | 0.0373 |
| 0.1221 | 88.0 | 616 | 0.7683 | 0.845 | 0.2468 | 1.1084 | 0.845 | 0.8517 | 0.1244 | 0.0373 |
| 0.1221 | 89.0 | 623 | 0.7685 | 0.845 | 0.2468 | 1.1083 | 0.845 | 0.8517 | 0.1243 | 0.0373 |
| 0.1221 | 90.0 | 630 | 0.7687 | 0.845 | 0.2469 | 1.1083 | 0.845 | 0.8517 | 0.1239 | 0.0372 |
| 0.1221 | 91.0 | 637 | 0.7688 | 0.845 | 0.2469 | 1.1082 | 0.845 | 0.8517 | 0.1239 | 0.0372 |
| 0.1221 | 92.0 | 644 | 0.7689 | 0.845 | 0.2469 | 1.1082 | 0.845 | 0.8517 | 0.1239 | 0.0373 |
| 0.1221 | 93.0 | 651 | 0.7690 | 0.845 | 0.2469 | 1.1081 | 0.845 | 0.8517 | 0.1239 | 0.0373 |
| 0.1221 | 94.0 | 658 | 0.7690 | 0.845 | 0.2469 | 1.1080 | 0.845 | 0.8517 | 0.1239 | 0.0373 |
| 0.1221 | 95.0 | 665 | 0.7691 | 0.845 | 0.2469 | 1.1080 | 0.845 | 0.8517 | 0.1239 | 0.0373 |
| 0.1221 | 96.0 | 672 | 0.7692 | 0.845 | 0.2469 | 1.1079 | 0.845 | 0.8517 | 0.1239 | 0.0373 |
| 0.1221 | 97.0 | 679 | 0.7692 | 0.845 | 0.2469 | 1.1079 | 0.845 | 0.8517 | 0.1239 | 0.0373 |
| 0.1221 | 98.0 | 686 | 0.7692 | 0.845 | 0.2469 | 1.1078 | 0.845 | 0.8517 | 0.1239 | 0.0373 |
| 0.1221 | 99.0 | 693 | 0.7692 | 0.845 | 0.2469 | 1.1078 | 0.845 | 0.8517 | 0.1239 | 0.0373 |
| 0.1221 | 100.0 | 700 | 0.7692 | 0.845 | 0.2469 | 1.1078 | 0.845 | 0.8517 | 0.1239 | 0.0373 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-small_rvl_cdip_100_examples_per_class_kd_CE
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_rvl_cdip_100_examples_per_class_kd_CE
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7493
- Accuracy: 0.6275
- Brier Loss: 0.5677
- Nll: 2.9769
- F1 Micro: 0.6275
- F1 Macro: 0.6250
- Ece: 0.2161
- Aurc: 0.1599
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 3.0120 | 0.085 | 0.9612 | 10.2074 | 0.085 | 0.0737 | 0.1585 | 0.8946 |
| No log | 2.0 | 14 | 2.6380 | 0.12 | 0.9247 | 6.9313 | 0.12 | 0.1257 | 0.1753 | 0.8678 |
| No log | 3.0 | 21 | 2.1951 | 0.36 | 0.7931 | 5.6390 | 0.36 | 0.3256 | 0.2066 | 0.4054 |
| No log | 4.0 | 28 | 1.8405 | 0.445 | 0.6971 | 3.5387 | 0.445 | 0.4479 | 0.1889 | 0.3068 |
| No log | 5.0 | 35 | 1.6213 | 0.525 | 0.6244 | 3.3423 | 0.525 | 0.5188 | 0.1821 | 0.2477 |
| No log | 6.0 | 42 | 1.5983 | 0.5275 | 0.6177 | 3.1202 | 0.5275 | 0.5220 | 0.1781 | 0.2448 |
| No log | 7.0 | 49 | 1.6214 | 0.54 | 0.6243 | 3.2514 | 0.54 | 0.5248 | 0.1758 | 0.2535 |
| No log | 8.0 | 56 | 1.4964 | 0.5675 | 0.5862 | 2.6168 | 0.5675 | 0.5715 | 0.1585 | 0.2200 |
| No log | 9.0 | 63 | 1.5696 | 0.575 | 0.5893 | 2.9901 | 0.575 | 0.5729 | 0.1851 | 0.2123 |
| No log | 10.0 | 70 | 1.6620 | 0.54 | 0.6257 | 3.1275 | 0.54 | 0.5425 | 0.2353 | 0.2343 |
| No log | 11.0 | 77 | 1.6901 | 0.585 | 0.5967 | 3.1708 | 0.585 | 0.5753 | 0.2006 | 0.1900 |
| No log | 12.0 | 84 | 1.5686 | 0.61 | 0.5645 | 2.9975 | 0.61 | 0.6129 | 0.1904 | 0.1830 |
| No log | 13.0 | 91 | 1.7390 | 0.5675 | 0.6159 | 3.0248 | 0.5675 | 0.5574 | 0.2200 | 0.2016 |
| No log | 14.0 | 98 | 1.6423 | 0.59 | 0.5778 | 2.9212 | 0.59 | 0.5827 | 0.2015 | 0.1863 |
| No log | 15.0 | 105 | 1.6262 | 0.61 | 0.5630 | 2.9492 | 0.61 | 0.6074 | 0.1950 | 0.1700 |
| No log | 16.0 | 112 | 1.6987 | 0.5925 | 0.5791 | 3.0433 | 0.5925 | 0.5852 | 0.2123 | 0.1674 |
| No log | 17.0 | 119 | 1.7256 | 0.5975 | 0.5782 | 3.0657 | 0.5975 | 0.5929 | 0.2214 | 0.1713 |
| No log | 18.0 | 126 | 1.7127 | 0.6125 | 0.5697 | 2.9494 | 0.6125 | 0.6110 | 0.2044 | 0.1706 |
| No log | 19.0 | 133 | 1.6961 | 0.62 | 0.5627 | 2.8745 | 0.62 | 0.6181 | 0.1972 | 0.1666 |
| No log | 20.0 | 140 | 1.6784 | 0.6275 | 0.5565 | 2.9077 | 0.6275 | 0.6256 | 0.2005 | 0.1614 |
| No log | 21.0 | 147 | 1.6699 | 0.62 | 0.5549 | 2.9148 | 0.62 | 0.6189 | 0.2089 | 0.1598 |
| No log | 22.0 | 154 | 1.6705 | 0.62 | 0.5561 | 2.9207 | 0.62 | 0.6186 | 0.2036 | 0.1593 |
| No log | 23.0 | 161 | 1.6749 | 0.62 | 0.5576 | 2.8938 | 0.62 | 0.6172 | 0.2017 | 0.1594 |
| No log | 24.0 | 168 | 1.6811 | 0.62 | 0.5586 | 2.9303 | 0.62 | 0.6176 | 0.2064 | 0.1602 |
| No log | 25.0 | 175 | 1.6870 | 0.625 | 0.5595 | 2.9457 | 0.625 | 0.6225 | 0.1996 | 0.1600 |
| No log | 26.0 | 182 | 1.6905 | 0.625 | 0.5600 | 2.9438 | 0.625 | 0.6228 | 0.1957 | 0.1604 |
| No log | 27.0 | 189 | 1.6920 | 0.625 | 0.5601 | 2.9207 | 0.625 | 0.6228 | 0.2030 | 0.1603 |
| No log | 28.0 | 196 | 1.6928 | 0.6225 | 0.5596 | 2.9140 | 0.6225 | 0.6201 | 0.2104 | 0.1598 |
| No log | 29.0 | 203 | 1.6934 | 0.6225 | 0.5596 | 2.9133 | 0.6225 | 0.6201 | 0.2171 | 0.1597 |
| No log | 30.0 | 210 | 1.6952 | 0.6225 | 0.5600 | 2.9156 | 0.6225 | 0.6199 | 0.2175 | 0.1597 |
| No log | 31.0 | 217 | 1.6962 | 0.6225 | 0.5604 | 2.9195 | 0.6225 | 0.6199 | 0.2151 | 0.1597 |
| No log | 32.0 | 224 | 1.6982 | 0.625 | 0.5609 | 2.9466 | 0.625 | 0.6216 | 0.2052 | 0.1598 |
| No log | 33.0 | 231 | 1.6996 | 0.625 | 0.5610 | 2.9468 | 0.625 | 0.6220 | 0.2073 | 0.1598 |
| No log | 34.0 | 238 | 1.7008 | 0.625 | 0.5611 | 2.9223 | 0.625 | 0.6220 | 0.2099 | 0.1595 |
| No log | 35.0 | 245 | 1.7028 | 0.625 | 0.5615 | 2.9159 | 0.625 | 0.6218 | 0.2062 | 0.1597 |
| No log | 36.0 | 252 | 1.7053 | 0.6275 | 0.5621 | 2.9154 | 0.6275 | 0.6246 | 0.2166 | 0.1598 |
| No log | 37.0 | 259 | 1.7078 | 0.625 | 0.5628 | 2.9132 | 0.625 | 0.6216 | 0.2113 | 0.1600 |
| No log | 38.0 | 266 | 1.7098 | 0.6275 | 0.5631 | 2.9119 | 0.6275 | 0.6243 | 0.2209 | 0.1601 |
| No log | 39.0 | 273 | 1.7112 | 0.625 | 0.5632 | 2.9136 | 0.625 | 0.6221 | 0.2164 | 0.1604 |
| No log | 40.0 | 280 | 1.7122 | 0.625 | 0.5633 | 2.9183 | 0.625 | 0.6221 | 0.2206 | 0.1603 |
| No log | 41.0 | 287 | 1.7134 | 0.6275 | 0.5635 | 2.9473 | 0.6275 | 0.6247 | 0.2192 | 0.1602 |
| No log | 42.0 | 294 | 1.7142 | 0.625 | 0.5636 | 2.9477 | 0.625 | 0.6220 | 0.2172 | 0.1600 |
| No log | 43.0 | 301 | 1.7152 | 0.6275 | 0.5634 | 2.9471 | 0.6275 | 0.6245 | 0.2090 | 0.1598 |
| No log | 44.0 | 308 | 1.7160 | 0.6275 | 0.5634 | 2.9175 | 0.6275 | 0.6245 | 0.2074 | 0.1597 |
| No log | 45.0 | 315 | 1.7172 | 0.6275 | 0.5637 | 2.9171 | 0.6275 | 0.6245 | 0.2138 | 0.1597 |
| No log | 46.0 | 322 | 1.7188 | 0.63 | 0.5640 | 2.9184 | 0.63 | 0.6272 | 0.2138 | 0.1597 |
| No log | 47.0 | 329 | 1.7204 | 0.63 | 0.5642 | 2.9171 | 0.63 | 0.6277 | 0.2146 | 0.1599 |
| No log | 48.0 | 336 | 1.7220 | 0.63 | 0.5643 | 2.9178 | 0.63 | 0.6277 | 0.2088 | 0.1599 |
| No log | 49.0 | 343 | 1.7233 | 0.6325 | 0.5643 | 2.9162 | 0.6325 | 0.6296 | 0.2114 | 0.1597 |
| No log | 50.0 | 350 | 1.7244 | 0.6325 | 0.5644 | 2.9149 | 0.6325 | 0.6296 | 0.2117 | 0.1598 |
| No log | 51.0 | 357 | 1.7253 | 0.6325 | 0.5645 | 2.9168 | 0.6325 | 0.6296 | 0.2078 | 0.1597 |
| No log | 52.0 | 364 | 1.7260 | 0.63 | 0.5647 | 2.9198 | 0.63 | 0.6271 | 0.2002 | 0.1598 |
| No log | 53.0 | 371 | 1.7268 | 0.63 | 0.5649 | 2.9230 | 0.63 | 0.6270 | 0.2068 | 0.1596 |
| No log | 54.0 | 378 | 1.7271 | 0.6275 | 0.5649 | 2.9547 | 0.6275 | 0.6241 | 0.2019 | 0.1599 |
| No log | 55.0 | 385 | 1.7281 | 0.6275 | 0.5652 | 2.9814 | 0.6275 | 0.6241 | 0.2084 | 0.1599 |
| No log | 56.0 | 392 | 1.7293 | 0.6275 | 0.5652 | 2.9522 | 0.6275 | 0.6241 | 0.2086 | 0.1599 |
| No log | 57.0 | 399 | 1.7306 | 0.6275 | 0.5653 | 2.9227 | 0.6275 | 0.6244 | 0.2160 | 0.1600 |
| No log | 58.0 | 406 | 1.7315 | 0.6275 | 0.5654 | 2.9203 | 0.6275 | 0.6244 | 0.2140 | 0.1598 |
| No log | 59.0 | 413 | 1.7322 | 0.6275 | 0.5655 | 2.9190 | 0.6275 | 0.6244 | 0.2229 | 0.1600 |
| No log | 60.0 | 420 | 1.7333 | 0.6275 | 0.5657 | 2.9184 | 0.6275 | 0.6250 | 0.2150 | 0.1600 |
| No log | 61.0 | 427 | 1.7343 | 0.63 | 0.5658 | 2.9166 | 0.63 | 0.6273 | 0.2304 | 0.1599 |
| No log | 62.0 | 434 | 1.7351 | 0.63 | 0.5660 | 2.9230 | 0.63 | 0.6275 | 0.2154 | 0.1598 |
| No log | 63.0 | 441 | 1.7354 | 0.63 | 0.5660 | 2.9476 | 0.63 | 0.6275 | 0.2056 | 0.1597 |
| No log | 64.0 | 448 | 1.7359 | 0.63 | 0.5661 | 2.9483 | 0.63 | 0.6275 | 0.2050 | 0.1598 |
| No log | 65.0 | 455 | 1.7366 | 0.6275 | 0.5661 | 2.9515 | 0.6275 | 0.6250 | 0.2053 | 0.1600 |
| No log | 66.0 | 462 | 1.7371 | 0.6275 | 0.5661 | 2.9588 | 0.6275 | 0.6250 | 0.2110 | 0.1600 |
| No log | 67.0 | 469 | 1.7378 | 0.6275 | 0.5663 | 2.9780 | 0.6275 | 0.6250 | 0.2108 | 0.1599 |
| No log | 68.0 | 476 | 1.7384 | 0.6275 | 0.5663 | 2.9530 | 0.6275 | 0.6250 | 0.2150 | 0.1599 |
| No log | 69.0 | 483 | 1.7392 | 0.63 | 0.5663 | 2.9631 | 0.63 | 0.6275 | 0.2114 | 0.1596 |
| No log | 70.0 | 490 | 1.7398 | 0.63 | 0.5663 | 2.9778 | 0.63 | 0.6275 | 0.2129 | 0.1596 |
| No log | 71.0 | 497 | 1.7402 | 0.63 | 0.5664 | 2.9544 | 0.63 | 0.6275 | 0.2227 | 0.1596 |
| 0.1799 | 72.0 | 504 | 1.7408 | 0.63 | 0.5665 | 2.9521 | 0.63 | 0.6275 | 0.2157 | 0.1596 |
| 0.1799 | 73.0 | 511 | 1.7412 | 0.63 | 0.5666 | 2.9508 | 0.63 | 0.6275 | 0.2262 | 0.1596 |
| 0.1799 | 74.0 | 518 | 1.7417 | 0.63 | 0.5666 | 2.9509 | 0.63 | 0.6272 | 0.2248 | 0.1596 |
| 0.1799 | 75.0 | 525 | 1.7420 | 0.63 | 0.5666 | 2.9555 | 0.63 | 0.6272 | 0.2219 | 0.1596 |
| 0.1799 | 76.0 | 532 | 1.7425 | 0.63 | 0.5667 | 2.9541 | 0.63 | 0.6268 | 0.2233 | 0.1596 |
| 0.1799 | 77.0 | 539 | 1.7430 | 0.63 | 0.5668 | 2.9773 | 0.63 | 0.6276 | 0.2133 | 0.1596 |
| 0.1799 | 78.0 | 546 | 1.7435 | 0.63 | 0.5668 | 2.9772 | 0.63 | 0.6276 | 0.2134 | 0.1597 |
| 0.1799 | 79.0 | 553 | 1.7439 | 0.63 | 0.5669 | 2.9514 | 0.63 | 0.6276 | 0.2142 | 0.1596 |
| 0.1799 | 80.0 | 560 | 1.7444 | 0.6325 | 0.5669 | 2.9499 | 0.6325 | 0.6303 | 0.2118 | 0.1594 |
| 0.1799 | 81.0 | 567 | 1.7451 | 0.6325 | 0.5669 | 2.9506 | 0.6325 | 0.6303 | 0.2078 | 0.1594 |
| 0.1799 | 82.0 | 574 | 1.7455 | 0.6325 | 0.5670 | 2.9617 | 0.6325 | 0.6303 | 0.2079 | 0.1594 |
| 0.1799 | 83.0 | 581 | 1.7459 | 0.6325 | 0.5671 | 2.9766 | 0.6325 | 0.6303 | 0.2130 | 0.1594 |
| 0.1799 | 84.0 | 588 | 1.7463 | 0.63 | 0.5672 | 2.9770 | 0.63 | 0.6278 | 0.2085 | 0.1597 |
| 0.1799 | 85.0 | 595 | 1.7466 | 0.6275 | 0.5672 | 2.9768 | 0.6275 | 0.6250 | 0.2111 | 0.1598 |
| 0.1799 | 86.0 | 602 | 1.7469 | 0.63 | 0.5673 | 2.9769 | 0.63 | 0.6278 | 0.2086 | 0.1597 |
| 0.1799 | 87.0 | 609 | 1.7472 | 0.6275 | 0.5673 | 2.9770 | 0.6275 | 0.6250 | 0.2140 | 0.1598 |
| 0.1799 | 88.0 | 616 | 1.7474 | 0.6275 | 0.5674 | 2.9771 | 0.6275 | 0.6250 | 0.2111 | 0.1598 |
| 0.1799 | 89.0 | 623 | 1.7477 | 0.6275 | 0.5674 | 2.9772 | 0.6275 | 0.6250 | 0.2112 | 0.1598 |
| 0.1799 | 90.0 | 630 | 1.7480 | 0.6275 | 0.5675 | 2.9770 | 0.6275 | 0.6250 | 0.2112 | 0.1598 |
| 0.1799 | 91.0 | 637 | 1.7483 | 0.6275 | 0.5675 | 2.9770 | 0.6275 | 0.6250 | 0.2112 | 0.1599 |
| 0.1799 | 92.0 | 644 | 1.7485 | 0.6275 | 0.5676 | 2.9769 | 0.6275 | 0.6250 | 0.2112 | 0.1598 |
| 0.1799 | 93.0 | 651 | 1.7486 | 0.6275 | 0.5676 | 2.9770 | 0.6275 | 0.6250 | 0.2112 | 0.1598 |
| 0.1799 | 94.0 | 658 | 1.7488 | 0.6275 | 0.5676 | 2.9770 | 0.6275 | 0.6250 | 0.2131 | 0.1598 |
| 0.1799 | 95.0 | 665 | 1.7489 | 0.6275 | 0.5676 | 2.9768 | 0.6275 | 0.6250 | 0.2143 | 0.1598 |
| 0.1799 | 96.0 | 672 | 1.7491 | 0.6275 | 0.5676 | 2.9768 | 0.6275 | 0.6250 | 0.2161 | 0.1599 |
| 0.1799 | 97.0 | 679 | 1.7492 | 0.6275 | 0.5676 | 2.9768 | 0.6275 | 0.6250 | 0.2161 | 0.1599 |
| 0.1799 | 98.0 | 686 | 1.7493 | 0.6275 | 0.5677 | 2.9768 | 0.6275 | 0.6250 | 0.2161 | 0.1599 |
| 0.1799 | 99.0 | 693 | 1.7493 | 0.6275 | 0.5677 | 2.9769 | 0.6275 | 0.6250 | 0.2161 | 0.1599 |
| 0.1799 | 100.0 | 700 | 1.7493 | 0.6275 | 0.5677 | 2.9769 | 0.6275 | 0.6250 | 0.2161 | 0.1599 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
ALM-AHME/beit-large-patch16-224-finetuned-LungCancer-Classification-LC25000-AH-40-30-30
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# beit-large-patch16-224-finetuned-LungCancer-Classification-LC25000-AH-40-30-30
This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0474
- Accuracy: 0.9805
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an illustrative gradient-accumulation sketch follows the list):
- learning_rate: 0.0005
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.5
- num_epochs: 5
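With `gradient_accumulation_steps: 4` and a per-device batch size of 16, gradients from four consecutive forward/backward passes are summed before each optimizer step, which yields the listed effective train batch size of 64. Below is a toy, self-contained sketch of that accumulation logic; the real model is BEiT-large and the real inputs are lung-tissue images, so the tiny linear model here is only for illustration.

```python
import torch
from torch import nn

# Toy stand-ins for the real model and data, used only to show the loop shape.
model = nn.Linear(10, 3)
optimizer = torch.optim.Adam(model.parameters(), lr=5e-4)
criterion = nn.CrossEntropyLoss()
dataloader = [(torch.randn(16, 10), torch.randint(0, 3, (16,))) for _ in range(8)]

accumulation_steps = 4  # 16 samples per pass x 4 passes = effective batch of 64

optimizer.zero_grad()
for step, (inputs, targets) in enumerate(dataloader):
    loss = criterion(model(inputs), targets) / accumulation_steps
    loss.backward()
    if (step + 1) % accumulation_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
```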
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2312 | 0.99 | 93 | 0.1822 | 0.9453 |
| 0.3817 | 1.99 | 187 | 0.2106 | 0.9183 |
| 0.2217 | 3.0 | 281 | 0.1902 | 0.9285 |
| 0.1667 | 4.0 | 375 | 0.1127 | 0.9584 |
| 0.0572 | 4.96 | 465 | 0.0474 | 0.9805 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"lung-benign_tissue",
"lung_adenocarcinoma",
"lung_squamous_cell_carcinoma"
] |
jordyvl/vit-tiny_tobacco3482
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_tobacco3482
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2012
- Accuracy: 0.825
- Brier Loss: 0.3279
- Nll: 1.1568
- F1 Micro: 0.825
- F1 Macro: 0.7904
- Ece: 0.2679
- Aurc: 0.0635
## Model description
More information needed
## Intended uses & limitations
More information needed
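Pending a fuller description, a lower-level inference sketch is shown below; the repository name is assumed from this card's title, `document.png` is a placeholder path, and `AutoImageProcessor` requires a reasonably recent `transformers` release.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "jordyvl/vit-tiny_tobacco3482"  # assumed repository name
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("document.png").convert("RGB")  # placeholder input
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```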
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 1.7837 | 0.1 | 1.0203 | 8.4259 | 0.1000 | 0.0850 | 0.3286 | 0.9000 |
| No log | 2.0 | 14 | 1.0637 | 0.25 | 0.8576 | 5.3606 | 0.25 | 0.1988 | 0.2806 | 0.7500 |
| No log | 3.0 | 21 | 0.7848 | 0.41 | 0.7314 | 3.9906 | 0.41 | 0.3109 | 0.2967 | 0.3924 |
| No log | 4.0 | 28 | 0.5442 | 0.565 | 0.5856 | 2.7941 | 0.565 | 0.4762 | 0.2765 | 0.2125 |
| No log | 5.0 | 35 | 0.4420 | 0.68 | 0.4947 | 1.7142 | 0.68 | 0.6058 | 0.2692 | 0.1565 |
| No log | 6.0 | 42 | 0.3671 | 0.7 | 0.4278 | 1.6759 | 0.7 | 0.6223 | 0.2685 | 0.1186 |
| No log | 7.0 | 49 | 0.3163 | 0.765 | 0.3959 | 1.6600 | 0.765 | 0.6862 | 0.2639 | 0.1117 |
| No log | 8.0 | 56 | 0.3540 | 0.77 | 0.3688 | 1.4505 | 0.7700 | 0.7148 | 0.2634 | 0.0998 |
| No log | 9.0 | 63 | 0.3875 | 0.725 | 0.3934 | 1.4205 | 0.7250 | 0.6918 | 0.2622 | 0.1161 |
| No log | 10.0 | 70 | 0.3546 | 0.775 | 0.3523 | 1.3197 | 0.775 | 0.7619 | 0.2558 | 0.0835 |
| No log | 11.0 | 77 | 0.3046 | 0.79 | 0.3515 | 1.2730 | 0.79 | 0.7594 | 0.2526 | 0.0978 |
| No log | 12.0 | 84 | 0.3059 | 0.8 | 0.3313 | 1.2700 | 0.8000 | 0.7796 | 0.2520 | 0.0783 |
| No log | 13.0 | 91 | 0.3075 | 0.775 | 0.3397 | 1.2980 | 0.775 | 0.7501 | 0.2632 | 0.0702 |
| No log | 14.0 | 98 | 0.2917 | 0.79 | 0.3618 | 1.3528 | 0.79 | 0.7715 | 0.2744 | 0.0843 |
| No log | 15.0 | 105 | 0.2393 | 0.825 | 0.3268 | 1.1212 | 0.825 | 0.8022 | 0.2498 | 0.0656 |
| No log | 16.0 | 112 | 0.2862 | 0.8 | 0.3444 | 1.2675 | 0.8000 | 0.7622 | 0.2611 | 0.0689 |
| No log | 17.0 | 119 | 0.2539 | 0.79 | 0.3390 | 1.2317 | 0.79 | 0.7632 | 0.2492 | 0.0739 |
| No log | 18.0 | 126 | 0.2338 | 0.82 | 0.3359 | 1.2455 | 0.82 | 0.7838 | 0.2865 | 0.0914 |
| No log | 19.0 | 133 | 0.2357 | 0.825 | 0.3197 | 1.0057 | 0.825 | 0.7935 | 0.2656 | 0.0727 |
| No log | 20.0 | 140 | 0.2711 | 0.81 | 0.3525 | 1.1392 | 0.81 | 0.7852 | 0.2961 | 0.0670 |
| No log | 21.0 | 147 | 0.2341 | 0.795 | 0.3534 | 1.3905 | 0.795 | 0.7645 | 0.2634 | 0.0999 |
| No log | 22.0 | 154 | 0.2635 | 0.795 | 0.3382 | 1.2001 | 0.795 | 0.7860 | 0.2625 | 0.0635 |
| No log | 23.0 | 161 | 0.2176 | 0.82 | 0.3271 | 0.9072 | 0.82 | 0.7972 | 0.2703 | 0.0680 |
| No log | 24.0 | 168 | 0.2512 | 0.835 | 0.3329 | 1.2192 | 0.835 | 0.8160 | 0.2980 | 0.0626 |
| No log | 25.0 | 175 | 0.2169 | 0.805 | 0.3414 | 1.1117 | 0.805 | 0.7912 | 0.2798 | 0.0776 |
| No log | 26.0 | 182 | 0.2227 | 0.84 | 0.3264 | 1.0267 | 0.8400 | 0.8267 | 0.2985 | 0.0669 |
| No log | 27.0 | 189 | 0.2302 | 0.79 | 0.3342 | 1.1603 | 0.79 | 0.7708 | 0.2589 | 0.0680 |
| No log | 28.0 | 196 | 0.2215 | 0.805 | 0.3324 | 1.1168 | 0.805 | 0.7786 | 0.2826 | 0.0655 |
| No log | 29.0 | 203 | 0.2022 | 0.82 | 0.3217 | 0.9587 | 0.82 | 0.7874 | 0.2865 | 0.0646 |
| No log | 30.0 | 210 | 0.2142 | 0.805 | 0.3287 | 1.1199 | 0.805 | 0.7855 | 0.2526 | 0.0599 |
| No log | 31.0 | 217 | 0.2035 | 0.795 | 0.3272 | 1.0385 | 0.795 | 0.7717 | 0.2599 | 0.0777 |
| No log | 32.0 | 224 | 0.2079 | 0.835 | 0.3246 | 0.9399 | 0.835 | 0.8045 | 0.2974 | 0.0586 |
| No log | 33.0 | 231 | 0.2071 | 0.81 | 0.3173 | 1.2784 | 0.81 | 0.7848 | 0.2520 | 0.0652 |
| No log | 34.0 | 238 | 0.2070 | 0.815 | 0.3217 | 1.1020 | 0.815 | 0.7855 | 0.2633 | 0.0634 |
| No log | 35.0 | 245 | 0.2128 | 0.82 | 0.3235 | 1.2763 | 0.82 | 0.7800 | 0.2771 | 0.0593 |
| No log | 36.0 | 252 | 0.2093 | 0.825 | 0.3221 | 1.1203 | 0.825 | 0.8030 | 0.2666 | 0.0580 |
| No log | 37.0 | 259 | 0.1995 | 0.815 | 0.3240 | 1.0387 | 0.815 | 0.7831 | 0.2712 | 0.0659 |
| No log | 38.0 | 266 | 0.1977 | 0.82 | 0.3207 | 1.0955 | 0.82 | 0.7846 | 0.2589 | 0.0629 |
| No log | 39.0 | 273 | 0.2062 | 0.82 | 0.3235 | 1.0691 | 0.82 | 0.7911 | 0.2666 | 0.0616 |
| No log | 40.0 | 280 | 0.1993 | 0.825 | 0.3266 | 1.0812 | 0.825 | 0.7973 | 0.2755 | 0.0671 |
| No log | 41.0 | 287 | 0.1976 | 0.82 | 0.3288 | 1.1043 | 0.82 | 0.7948 | 0.2646 | 0.0688 |
| No log | 42.0 | 294 | 0.2040 | 0.825 | 0.3308 | 1.2371 | 0.825 | 0.7964 | 0.2782 | 0.0629 |
| No log | 43.0 | 301 | 0.2000 | 0.835 | 0.3224 | 1.0857 | 0.835 | 0.8041 | 0.2882 | 0.0584 |
| No log | 44.0 | 308 | 0.1987 | 0.83 | 0.3222 | 1.0746 | 0.83 | 0.7959 | 0.2837 | 0.0631 |
| No log | 45.0 | 315 | 0.2026 | 0.82 | 0.3248 | 1.1471 | 0.82 | 0.7887 | 0.2843 | 0.0633 |
| No log | 46.0 | 322 | 0.2014 | 0.825 | 0.3258 | 1.1310 | 0.825 | 0.7916 | 0.2915 | 0.0627 |
| No log | 47.0 | 329 | 0.1988 | 0.83 | 0.3237 | 1.0291 | 0.83 | 0.7959 | 0.2811 | 0.0633 |
| No log | 48.0 | 336 | 0.1989 | 0.82 | 0.3273 | 1.1741 | 0.82 | 0.7871 | 0.2699 | 0.0640 |
| No log | 49.0 | 343 | 0.1995 | 0.82 | 0.3251 | 1.1518 | 0.82 | 0.7869 | 0.2742 | 0.0631 |
| No log | 50.0 | 350 | 0.1996 | 0.825 | 0.3241 | 1.0873 | 0.825 | 0.7900 | 0.2659 | 0.0616 |
| No log | 51.0 | 357 | 0.1995 | 0.83 | 0.3248 | 1.1532 | 0.83 | 0.7933 | 0.2651 | 0.0618 |
| No log | 52.0 | 364 | 0.1983 | 0.825 | 0.3251 | 1.2117 | 0.825 | 0.7904 | 0.2967 | 0.0634 |
| No log | 53.0 | 371 | 0.1984 | 0.825 | 0.3254 | 1.1566 | 0.825 | 0.7904 | 0.2695 | 0.0635 |
| No log | 54.0 | 378 | 0.1996 | 0.825 | 0.3249 | 1.1259 | 0.825 | 0.7904 | 0.2841 | 0.0625 |
| No log | 55.0 | 385 | 0.1996 | 0.825 | 0.3252 | 1.1424 | 0.825 | 0.7904 | 0.2790 | 0.0616 |
| No log | 56.0 | 392 | 0.2004 | 0.825 | 0.3243 | 1.1391 | 0.825 | 0.7904 | 0.2857 | 0.0623 |
| No log | 57.0 | 399 | 0.2004 | 0.825 | 0.3259 | 1.2109 | 0.825 | 0.7904 | 0.2788 | 0.0617 |
| No log | 58.0 | 406 | 0.2007 | 0.825 | 0.3262 | 1.1473 | 0.825 | 0.7900 | 0.2842 | 0.0630 |
| No log | 59.0 | 413 | 0.2000 | 0.82 | 0.3264 | 1.2066 | 0.82 | 0.7871 | 0.2698 | 0.0638 |
| No log | 60.0 | 420 | 0.1994 | 0.82 | 0.3263 | 1.1542 | 0.82 | 0.7871 | 0.2822 | 0.0640 |
| No log | 61.0 | 427 | 0.1994 | 0.825 | 0.3261 | 1.1506 | 0.825 | 0.7904 | 0.2683 | 0.0635 |
| No log | 62.0 | 434 | 0.2005 | 0.82 | 0.3276 | 1.1754 | 0.82 | 0.7871 | 0.2767 | 0.0639 |
| No log | 63.0 | 441 | 0.2006 | 0.82 | 0.3269 | 1.2127 | 0.82 | 0.7871 | 0.2811 | 0.0635 |
| No log | 64.0 | 448 | 0.2003 | 0.825 | 0.3265 | 1.1547 | 0.825 | 0.7904 | 0.2814 | 0.0630 |
| No log | 65.0 | 455 | 0.2005 | 0.825 | 0.3268 | 1.1078 | 0.825 | 0.7904 | 0.3069 | 0.0629 |
| No log | 66.0 | 462 | 0.2006 | 0.825 | 0.3268 | 1.0998 | 0.825 | 0.7904 | 0.3012 | 0.0627 |
| No log | 67.0 | 469 | 0.2009 | 0.825 | 0.3265 | 1.1526 | 0.825 | 0.7904 | 0.2946 | 0.0623 |
| No log | 68.0 | 476 | 0.2006 | 0.825 | 0.3269 | 1.1500 | 0.825 | 0.7904 | 0.2983 | 0.0631 |
| No log | 69.0 | 483 | 0.2007 | 0.825 | 0.3272 | 1.1005 | 0.825 | 0.7904 | 0.2756 | 0.0635 |
| No log | 70.0 | 490 | 0.2003 | 0.825 | 0.3271 | 1.0947 | 0.825 | 0.7904 | 0.2697 | 0.0635 |
| No log | 71.0 | 497 | 0.2006 | 0.825 | 0.3272 | 1.1587 | 0.825 | 0.7904 | 0.2770 | 0.0636 |
| 0.1375 | 72.0 | 504 | 0.2008 | 0.825 | 0.3272 | 1.1549 | 0.825 | 0.7904 | 0.2749 | 0.0632 |
| 0.1375 | 73.0 | 511 | 0.2009 | 0.825 | 0.3272 | 1.1528 | 0.825 | 0.7904 | 0.2748 | 0.0624 |
| 0.1375 | 74.0 | 518 | 0.2013 | 0.825 | 0.3275 | 1.1528 | 0.825 | 0.7904 | 0.2757 | 0.0626 |
| 0.1375 | 75.0 | 525 | 0.2009 | 0.825 | 0.3277 | 1.1529 | 0.825 | 0.7904 | 0.2763 | 0.0632 |
| 0.1375 | 76.0 | 532 | 0.2009 | 0.825 | 0.3271 | 1.1526 | 0.825 | 0.7904 | 0.2754 | 0.0633 |
| 0.1375 | 77.0 | 539 | 0.2006 | 0.825 | 0.3274 | 1.1559 | 0.825 | 0.7904 | 0.2699 | 0.0636 |
| 0.1375 | 78.0 | 546 | 0.2005 | 0.825 | 0.3272 | 1.1499 | 0.825 | 0.7904 | 0.2755 | 0.0634 |
| 0.1375 | 79.0 | 553 | 0.2009 | 0.825 | 0.3277 | 1.1539 | 0.825 | 0.7904 | 0.2833 | 0.0634 |
| 0.1375 | 80.0 | 560 | 0.2009 | 0.825 | 0.3275 | 1.1551 | 0.825 | 0.7904 | 0.2751 | 0.0632 |
| 0.1375 | 81.0 | 567 | 0.2013 | 0.825 | 0.3276 | 1.1563 | 0.825 | 0.7904 | 0.2809 | 0.0634 |
| 0.1375 | 82.0 | 574 | 0.2010 | 0.825 | 0.3277 | 1.1545 | 0.825 | 0.7904 | 0.2752 | 0.0633 |
| 0.1375 | 83.0 | 581 | 0.2009 | 0.825 | 0.3275 | 1.1565 | 0.825 | 0.7904 | 0.2753 | 0.0634 |
| 0.1375 | 84.0 | 588 | 0.2009 | 0.825 | 0.3277 | 1.1564 | 0.825 | 0.7904 | 0.2817 | 0.0636 |
| 0.1375 | 85.0 | 595 | 0.2010 | 0.825 | 0.3277 | 1.1560 | 0.825 | 0.7904 | 0.2686 | 0.0633 |
| 0.1375 | 86.0 | 602 | 0.2010 | 0.825 | 0.3278 | 1.1560 | 0.825 | 0.7904 | 0.2755 | 0.0633 |
| 0.1375 | 87.0 | 609 | 0.2010 | 0.825 | 0.3277 | 1.1544 | 0.825 | 0.7904 | 0.2661 | 0.0634 |
| 0.1375 | 88.0 | 616 | 0.2012 | 0.825 | 0.3278 | 1.1571 | 0.825 | 0.7904 | 0.2612 | 0.0633 |
| 0.1375 | 89.0 | 623 | 0.2010 | 0.825 | 0.3278 | 1.1558 | 0.825 | 0.7904 | 0.2747 | 0.0635 |
| 0.1375 | 90.0 | 630 | 0.2010 | 0.825 | 0.3278 | 1.1564 | 0.825 | 0.7904 | 0.2687 | 0.0634 |
| 0.1375 | 91.0 | 637 | 0.2011 | 0.825 | 0.3278 | 1.1551 | 0.825 | 0.7904 | 0.2678 | 0.0633 |
| 0.1375 | 92.0 | 644 | 0.2011 | 0.825 | 0.3278 | 1.1559 | 0.825 | 0.7904 | 0.2759 | 0.0635 |
| 0.1375 | 93.0 | 651 | 0.2011 | 0.825 | 0.3279 | 1.1565 | 0.825 | 0.7904 | 0.2750 | 0.0633 |
| 0.1375 | 94.0 | 658 | 0.2011 | 0.825 | 0.3278 | 1.1567 | 0.825 | 0.7904 | 0.2760 | 0.0635 |
| 0.1375 | 95.0 | 665 | 0.2011 | 0.825 | 0.3279 | 1.1572 | 0.825 | 0.7904 | 0.2679 | 0.0633 |
| 0.1375 | 96.0 | 672 | 0.2012 | 0.825 | 0.3279 | 1.1574 | 0.825 | 0.7904 | 0.2679 | 0.0634 |
| 0.1375 | 97.0 | 679 | 0.2012 | 0.825 | 0.3279 | 1.1563 | 0.825 | 0.7904 | 0.2678 | 0.0634 |
| 0.1375 | 98.0 | 686 | 0.2012 | 0.825 | 0.3279 | 1.1567 | 0.825 | 0.7904 | 0.2678 | 0.0634 |
| 0.1375 | 99.0 | 693 | 0.2012 | 0.825 | 0.3279 | 1.1569 | 0.825 | 0.7904 | 0.2679 | 0.0635 |
| 0.1375 | 100.0 | 700 | 0.2012 | 0.825 | 0.3279 | 1.1568 | 0.825 | 0.7904 | 0.2679 | 0.0635 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
jordyvl/vit-tiny_rvl_cdip_100_examples_per_class
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_rvl_cdip_100_examples_per_class
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1181
- Accuracy: 0.555
- Brier Loss: 0.5639
- Nll: 2.5058
- F1 Micro: 0.555
- F1 Macro: 0.5524
- Ece: 0.1705
- Aurc: 0.2031
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
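The list above is the card's record of the training configuration. A hedged sketch of how those values could be expressed with Hugging Face `TrainingArguments` follows; the original training script is not included in this card, so anything beyond the listed values (such as the output directory) is an assumption.

```python
# Hedged sketch mapping the hyperparameters listed above onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-tiny_rvl_cdip_100_examples_per_class",  # placeholder output path
    learning_rate=1e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```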
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 3.4972 | 0.045 | 1.0713 | 7.2880 | 0.045 | 0.0370 | 0.2764 | 0.9554 |
| No log | 2.0 | 14 | 2.5557 | 0.0925 | 0.9501 | 5.9529 | 0.0925 | 0.0781 | 0.1601 | 0.8830 |
| No log | 3.0 | 21 | 2.2287 | 0.215 | 0.8879 | 5.6730 | 0.2150 | 0.1651 | 0.1774 | 0.6393 |
| No log | 4.0 | 28 | 1.8874 | 0.29 | 0.8001 | 3.7085 | 0.29 | 0.2600 | 0.1619 | 0.4897 |
| No log | 5.0 | 35 | 1.6351 | 0.375 | 0.7226 | 3.1565 | 0.375 | 0.3592 | 0.1647 | 0.3632 |
| No log | 6.0 | 42 | 1.5371 | 0.4275 | 0.6865 | 3.0837 | 0.4275 | 0.4211 | 0.1661 | 0.3123 |
| No log | 7.0 | 49 | 1.5664 | 0.4575 | 0.6933 | 2.9854 | 0.4575 | 0.4296 | 0.2078 | 0.2973 |
| No log | 8.0 | 56 | 1.5244 | 0.44 | 0.6953 | 2.7931 | 0.44 | 0.4374 | 0.2067 | 0.3131 |
| No log | 9.0 | 63 | 1.5573 | 0.47 | 0.6933 | 2.7079 | 0.47 | 0.4583 | 0.2023 | 0.2977 |
| No log | 10.0 | 70 | 1.6359 | 0.455 | 0.7367 | 2.7425 | 0.455 | 0.4251 | 0.2519 | 0.3015 |
| No log | 11.0 | 77 | 1.6207 | 0.465 | 0.7274 | 2.6891 | 0.465 | 0.4426 | 0.2454 | 0.3086 |
| No log | 12.0 | 84 | 1.6268 | 0.4775 | 0.6996 | 2.7118 | 0.4775 | 0.4638 | 0.2370 | 0.2887 |
| No log | 13.0 | 91 | 1.6162 | 0.4725 | 0.7065 | 2.9655 | 0.4725 | 0.4522 | 0.2355 | 0.2868 |
| No log | 14.0 | 98 | 1.6040 | 0.4875 | 0.6796 | 2.8589 | 0.4875 | 0.4736 | 0.2148 | 0.2755 |
| No log | 15.0 | 105 | 1.6294 | 0.4825 | 0.7123 | 2.8243 | 0.4825 | 0.4714 | 0.2390 | 0.2867 |
| No log | 16.0 | 112 | 1.5734 | 0.49 | 0.6894 | 2.7436 | 0.49 | 0.4731 | 0.2306 | 0.2652 |
| No log | 17.0 | 119 | 1.5915 | 0.465 | 0.7045 | 2.6023 | 0.465 | 0.4547 | 0.2404 | 0.2905 |
| No log | 18.0 | 126 | 1.4637 | 0.4875 | 0.6723 | 2.7700 | 0.4875 | 0.4637 | 0.2162 | 0.2644 |
| No log | 19.0 | 133 | 1.4599 | 0.48 | 0.6739 | 2.5843 | 0.48 | 0.4625 | 0.2375 | 0.2671 |
| No log | 20.0 | 140 | 1.5233 | 0.4925 | 0.6821 | 2.8319 | 0.4925 | 0.4759 | 0.2331 | 0.2620 |
| No log | 21.0 | 147 | 1.4234 | 0.5025 | 0.6644 | 2.5679 | 0.5025 | 0.5051 | 0.2214 | 0.2746 |
| No log | 22.0 | 154 | 1.5117 | 0.5 | 0.6770 | 2.6852 | 0.5 | 0.4805 | 0.2201 | 0.2662 |
| No log | 23.0 | 161 | 1.4680 | 0.4875 | 0.6798 | 2.5928 | 0.4875 | 0.4799 | 0.1973 | 0.2714 |
| No log | 24.0 | 168 | 1.4762 | 0.4825 | 0.6827 | 2.7070 | 0.4825 | 0.4690 | 0.2411 | 0.2845 |
| No log | 25.0 | 175 | 1.4672 | 0.4875 | 0.6778 | 2.6189 | 0.4875 | 0.4746 | 0.2272 | 0.2733 |
| No log | 26.0 | 182 | 1.5126 | 0.5075 | 0.6671 | 2.8784 | 0.5075 | 0.5018 | 0.2404 | 0.2523 |
| No log | 27.0 | 189 | 1.4648 | 0.495 | 0.6722 | 2.6680 | 0.495 | 0.4959 | 0.2250 | 0.2773 |
| No log | 28.0 | 196 | 1.4679 | 0.5175 | 0.6463 | 2.7459 | 0.5175 | 0.4851 | 0.2134 | 0.2377 |
| No log | 29.0 | 203 | 1.4982 | 0.475 | 0.6930 | 2.7531 | 0.4750 | 0.4666 | 0.2446 | 0.2901 |
| No log | 30.0 | 210 | 1.3873 | 0.5075 | 0.6346 | 2.5718 | 0.5075 | 0.4857 | 0.1991 | 0.2456 |
| No log | 31.0 | 217 | 1.2963 | 0.54 | 0.6140 | 2.6319 | 0.54 | 0.5431 | 0.1806 | 0.2316 |
| No log | 32.0 | 224 | 1.3467 | 0.505 | 0.6361 | 2.7359 | 0.505 | 0.4792 | 0.2133 | 0.2409 |
| No log | 33.0 | 231 | 1.2819 | 0.55 | 0.6017 | 2.6009 | 0.55 | 0.5423 | 0.1700 | 0.2182 |
| No log | 34.0 | 238 | 1.3502 | 0.525 | 0.6402 | 2.4999 | 0.525 | 0.5166 | 0.1945 | 0.2492 |
| No log | 35.0 | 245 | 1.3020 | 0.5275 | 0.6155 | 2.6222 | 0.5275 | 0.5173 | 0.1844 | 0.2243 |
| No log | 36.0 | 252 | 1.4007 | 0.5175 | 0.6457 | 2.5934 | 0.5175 | 0.5160 | 0.2112 | 0.2588 |
| No log | 37.0 | 259 | 1.2637 | 0.5175 | 0.6187 | 2.6172 | 0.5175 | 0.5031 | 0.1879 | 0.2314 |
| No log | 38.0 | 266 | 1.3416 | 0.54 | 0.6142 | 2.6727 | 0.54 | 0.5300 | 0.1827 | 0.2262 |
| No log | 39.0 | 273 | 1.2449 | 0.5375 | 0.6086 | 2.6247 | 0.5375 | 0.5359 | 0.1847 | 0.2294 |
| No log | 40.0 | 280 | 1.2930 | 0.5325 | 0.6131 | 2.7111 | 0.5325 | 0.5141 | 0.1865 | 0.2262 |
| No log | 41.0 | 287 | 1.2288 | 0.535 | 0.5988 | 2.5900 | 0.535 | 0.5277 | 0.1827 | 0.2146 |
| No log | 42.0 | 294 | 1.2507 | 0.545 | 0.6031 | 2.5254 | 0.545 | 0.5289 | 0.1754 | 0.2224 |
| No log | 43.0 | 301 | 1.2154 | 0.5475 | 0.5986 | 2.6049 | 0.5475 | 0.5414 | 0.1730 | 0.2229 |
| No log | 44.0 | 308 | 1.1948 | 0.5575 | 0.5824 | 2.5753 | 0.5575 | 0.5492 | 0.1904 | 0.2047 |
| No log | 45.0 | 315 | 1.1773 | 0.555 | 0.5846 | 2.6682 | 0.555 | 0.5564 | 0.1918 | 0.2171 |
| No log | 46.0 | 322 | 1.1860 | 0.56 | 0.5763 | 2.5625 | 0.56 | 0.5549 | 0.1951 | 0.2018 |
| No log | 47.0 | 329 | 1.1970 | 0.5575 | 0.5850 | 2.5154 | 0.5575 | 0.5436 | 0.1819 | 0.2114 |
| No log | 48.0 | 336 | 1.1805 | 0.5425 | 0.5862 | 2.5828 | 0.5425 | 0.5423 | 0.1885 | 0.2132 |
| No log | 49.0 | 343 | 1.1902 | 0.555 | 0.5862 | 2.5043 | 0.555 | 0.5446 | 0.1705 | 0.2075 |
| No log | 50.0 | 350 | 1.1496 | 0.5725 | 0.5748 | 2.5429 | 0.5725 | 0.5752 | 0.1647 | 0.2034 |
| No log | 51.0 | 357 | 1.1749 | 0.57 | 0.5802 | 2.5185 | 0.57 | 0.5564 | 0.1713 | 0.2055 |
| No log | 52.0 | 364 | 1.1485 | 0.5625 | 0.5717 | 2.5957 | 0.5625 | 0.5570 | 0.1752 | 0.1993 |
| No log | 53.0 | 371 | 1.1721 | 0.5525 | 0.5853 | 2.6264 | 0.5525 | 0.5488 | 0.1632 | 0.2165 |
| No log | 54.0 | 378 | 1.1592 | 0.5525 | 0.5783 | 2.5631 | 0.5525 | 0.5433 | 0.1743 | 0.2057 |
| No log | 55.0 | 385 | 1.1473 | 0.5725 | 0.5731 | 2.6200 | 0.5725 | 0.5689 | 0.1667 | 0.2047 |
| No log | 56.0 | 392 | 1.1643 | 0.5575 | 0.5771 | 2.5922 | 0.5575 | 0.5458 | 0.1699 | 0.2098 |
| No log | 57.0 | 399 | 1.1510 | 0.5775 | 0.5740 | 2.5202 | 0.5775 | 0.5798 | 0.1633 | 0.2042 |
| No log | 58.0 | 406 | 1.1627 | 0.5725 | 0.5755 | 2.5974 | 0.5725 | 0.5619 | 0.1685 | 0.2053 |
| No log | 59.0 | 413 | 1.1524 | 0.56 | 0.5806 | 2.5180 | 0.56 | 0.5581 | 0.1826 | 0.2119 |
| No log | 60.0 | 420 | 1.1526 | 0.5675 | 0.5712 | 2.6118 | 0.5675 | 0.5624 | 0.1714 | 0.2025 |
| No log | 61.0 | 427 | 1.1861 | 0.5525 | 0.5845 | 2.5332 | 0.5525 | 0.5429 | 0.1634 | 0.2111 |
| No log | 62.0 | 434 | 1.1313 | 0.5675 | 0.5648 | 2.5085 | 0.5675 | 0.5690 | 0.1589 | 0.1952 |
| No log | 63.0 | 441 | 1.1651 | 0.5625 | 0.5796 | 2.5044 | 0.5625 | 0.5495 | 0.1666 | 0.2078 |
| No log | 64.0 | 448 | 1.1352 | 0.565 | 0.5689 | 2.4902 | 0.565 | 0.5652 | 0.1703 | 0.2032 |
| No log | 65.0 | 455 | 1.1379 | 0.5725 | 0.5723 | 2.5435 | 0.5725 | 0.5666 | 0.1593 | 0.2026 |
| No log | 66.0 | 462 | 1.1507 | 0.575 | 0.5763 | 2.4687 | 0.575 | 0.5697 | 0.1551 | 0.2056 |
| No log | 67.0 | 469 | 1.1354 | 0.5625 | 0.5708 | 2.5495 | 0.5625 | 0.5564 | 0.1754 | 0.2009 |
| No log | 68.0 | 476 | 1.1286 | 0.5775 | 0.5671 | 2.4870 | 0.5775 | 0.5746 | 0.1795 | 0.2008 |
| No log | 69.0 | 483 | 1.1371 | 0.5675 | 0.5701 | 2.5777 | 0.5675 | 0.5642 | 0.1769 | 0.2047 |
| No log | 70.0 | 490 | 1.1152 | 0.56 | 0.5613 | 2.5036 | 0.56 | 0.5599 | 0.1370 | 0.1985 |
| No log | 71.0 | 497 | 1.1289 | 0.5725 | 0.5657 | 2.5572 | 0.5725 | 0.5666 | 0.1496 | 0.2014 |
| 0.2584 | 72.0 | 504 | 1.1270 | 0.57 | 0.5653 | 2.5292 | 0.57 | 0.5688 | 0.1570 | 0.1973 |
| 0.2584 | 73.0 | 511 | 1.1297 | 0.57 | 0.5680 | 2.5565 | 0.57 | 0.5649 | 0.1467 | 0.2042 |
| 0.2584 | 74.0 | 518 | 1.1246 | 0.5625 | 0.5646 | 2.5033 | 0.5625 | 0.5618 | 0.1581 | 0.2004 |
| 0.2584 | 75.0 | 525 | 1.1186 | 0.57 | 0.5635 | 2.5465 | 0.57 | 0.5671 | 0.1454 | 0.1999 |
| 0.2584 | 76.0 | 532 | 1.1210 | 0.56 | 0.5654 | 2.5094 | 0.56 | 0.5587 | 0.1510 | 0.2031 |
| 0.2584 | 77.0 | 539 | 1.1212 | 0.5675 | 0.5635 | 2.5170 | 0.5675 | 0.5630 | 0.1631 | 0.1996 |
| 0.2584 | 78.0 | 546 | 1.1190 | 0.56 | 0.5642 | 2.5074 | 0.56 | 0.5592 | 0.1506 | 0.2027 |
| 0.2584 | 79.0 | 553 | 1.1215 | 0.5625 | 0.5643 | 2.5112 | 0.5625 | 0.5599 | 0.1573 | 0.2024 |
| 0.2584 | 80.0 | 560 | 1.1181 | 0.56 | 0.5635 | 2.5064 | 0.56 | 0.5595 | 0.1601 | 0.2009 |
| 0.2584 | 81.0 | 567 | 1.1201 | 0.5675 | 0.5639 | 2.5096 | 0.5675 | 0.5669 | 0.1602 | 0.2008 |
| 0.2584 | 82.0 | 574 | 1.1195 | 0.5625 | 0.5645 | 2.5054 | 0.5625 | 0.5610 | 0.1759 | 0.2025 |
| 0.2584 | 83.0 | 581 | 1.1182 | 0.5625 | 0.5641 | 2.5062 | 0.5625 | 0.5619 | 0.1830 | 0.2018 |
| 0.2584 | 84.0 | 588 | 1.1195 | 0.5575 | 0.5637 | 2.5121 | 0.5575 | 0.5556 | 0.1838 | 0.2026 |
| 0.2584 | 85.0 | 595 | 1.1192 | 0.56 | 0.5641 | 2.5058 | 0.56 | 0.5588 | 0.1716 | 0.2026 |
| 0.2584 | 86.0 | 602 | 1.1186 | 0.5625 | 0.5639 | 2.5060 | 0.5625 | 0.5619 | 0.1662 | 0.2022 |
| 0.2584 | 87.0 | 609 | 1.1181 | 0.5575 | 0.5637 | 2.5070 | 0.5575 | 0.5557 | 0.1596 | 0.2027 |
| 0.2584 | 88.0 | 616 | 1.1178 | 0.56 | 0.5637 | 2.5060 | 0.56 | 0.5577 | 0.1655 | 0.2017 |
| 0.2584 | 89.0 | 623 | 1.1183 | 0.56 | 0.5639 | 2.5057 | 0.56 | 0.5580 | 0.1542 | 0.2021 |
| 0.2584 | 90.0 | 630 | 1.1184 | 0.56 | 0.5640 | 2.5060 | 0.56 | 0.5581 | 0.1841 | 0.2021 |
| 0.2584 | 91.0 | 637 | 1.1183 | 0.5575 | 0.5638 | 2.5060 | 0.5575 | 0.5547 | 0.1672 | 0.2029 |
| 0.2584 | 92.0 | 644 | 1.1182 | 0.5575 | 0.5638 | 2.5059 | 0.5575 | 0.5547 | 0.1608 | 0.2029 |
| 0.2584 | 93.0 | 651 | 1.1183 | 0.56 | 0.5639 | 2.5057 | 0.56 | 0.5576 | 0.1648 | 0.2021 |
| 0.2584 | 94.0 | 658 | 1.1185 | 0.555 | 0.5639 | 2.5061 | 0.555 | 0.5524 | 0.1648 | 0.2031 |
| 0.2584 | 95.0 | 665 | 1.1183 | 0.555 | 0.5640 | 2.5058 | 0.555 | 0.5524 | 0.1614 | 0.2031 |
| 0.2584 | 96.0 | 672 | 1.1182 | 0.555 | 0.5639 | 2.5057 | 0.555 | 0.5524 | 0.1769 | 0.2030 |
| 0.2584 | 97.0 | 679 | 1.1182 | 0.555 | 0.5639 | 2.5057 | 0.555 | 0.5524 | 0.1733 | 0.2031 |
| 0.2584 | 98.0 | 686 | 1.1181 | 0.555 | 0.5639 | 2.5058 | 0.555 | 0.5524 | 0.1754 | 0.2031 |
| 0.2584 | 99.0 | 693 | 1.1181 | 0.555 | 0.5639 | 2.5058 | 0.555 | 0.5524 | 0.1705 | 0.2031 |
| 0.2584 | 100.0 | 700 | 1.1181 | 0.555 | 0.5639 | 2.5058 | 0.555 | 0.5524 | 0.1705 | 0.2031 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
jordyvl/vit-tiny_rvl_cdip
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_rvl_cdip
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1016
- Accuracy: 0.9025
- Brier Loss: 0.1427
- Nll: 1.7378
- F1 Micro: 0.9025
- F1 Macro: 0.9029
- Ece: 0.0142
- Aurc: 0.0141
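As a usage note not taken from the card itself, the fine-tuned checkpoint named above can presumably be loaded with the standard image-classification pipeline; the image path below is a placeholder, and the printed label is just an example drawn from this card's label set.

```python
# Hedged usage sketch for the checkpoint described in this card.
from transformers import pipeline

classifier = pipeline("image-classification", model="jordyvl/vit-tiny_rvl_cdip")
print(classifier("path/to/scanned_document.png"))  # e.g. [{'label': 'invoice', 'score': ...}]
```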
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| 0.3377 | 1.0 | 2500 | 0.2693 | 0.8397 | 0.2295 | 2.0316 | 0.8397 | 0.8399 | 0.0140 | 0.0337 |
| 0.1962 | 2.0 | 5000 | 0.1745 | 0.8717 | 0.1835 | 1.9452 | 0.8717 | 0.8739 | 0.0122 | 0.0222 |
| 0.1359 | 3.0 | 7500 | 0.1380 | 0.8869 | 0.1643 | 1.8585 | 0.8869 | 0.8871 | 0.0089 | 0.0181 |
| 0.099 | 4.0 | 10000 | 0.1297 | 0.8920 | 0.1567 | 1.8113 | 0.8920 | 0.8921 | 0.0128 | 0.0168 |
| 0.068 | 5.0 | 12500 | 0.1253 | 0.8966 | 0.1520 | 1.7963 | 0.8966 | 0.8969 | 0.0120 | 0.0160 |
| 0.0475 | 6.0 | 15000 | 0.1153 | 0.8972 | 0.1487 | 1.7849 | 0.8972 | 0.8979 | 0.0136 | 0.0151 |
| 0.0341 | 7.0 | 17500 | 0.1110 | 0.8995 | 0.1460 | 1.7557 | 0.8995 | 0.8997 | 0.0151 | 0.0146 |
| 0.0238 | 8.0 | 20000 | 0.1059 | 0.9013 | 0.1438 | 1.7503 | 0.9013 | 0.9015 | 0.0120 | 0.0143 |
| 0.017 | 9.0 | 22500 | 0.1034 | 0.9022 | 0.1440 | 1.7344 | 0.9022 | 0.9026 | 0.0142 | 0.0143 |
| 0.0128 | 10.0 | 25000 | 0.1016 | 0.9025 | 0.1427 | 1.7378 | 0.9025 | 0.9029 | 0.0142 | 0.0141 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
ALM-AHME/beit-large-patch16-224-finetuned-LungCancer-Classification-LC25000-AH-20-40-40
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# beit-large-patch16-224-finetuned-LungCancer-Classification-LC25000-AH-20-40-40
This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0075
- Accuracy: 0.9972
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.5
- num_epochs: 5
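The gradient-accumulation arithmetic above (16 samples per device times 4 accumulation steps gives the listed total train batch size of 64) maps onto `TrainingArguments` roughly as sketched below. The original training script is not published in this card, so this is an illustration of the listed values only.

```python
# Hedged sketch of the accumulation setup listed above; names follow TrainingArguments.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="beit-large-lc25000-finetune",  # placeholder output path
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,   # 16 * 4 = 64 effective train batch size
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.5,
    num_train_epochs=5,
)
```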
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1841 | 0.99 | 46 | 0.0968 | 0.9570 |
| 0.1017 | 2.0 | 93 | 0.0328 | 0.9887 |
| 0.1037 | 2.99 | 139 | 0.0281 | 0.9891 |
| 0.1074 | 4.0 | 186 | 0.0201 | 0.9924 |
| 0.0274 | 4.95 | 230 | 0.0075 | 0.9972 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"lung-benign_tissue",
"lung_adenocarcinoma",
"lung_squamous_cell_carcinoma"
] |
sbaner24/vit-base-patch16-224-Trial006-007-008-YEL_STEM
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-Trial006-007-008-YEL_STEM
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1837
- Accuracy: 0.9492
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 60
- eval_batch_size: 60
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 240
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
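A minimal sketch of the linear schedule with warmup implied by `lr_scheduler_type: linear` and `lr_scheduler_warmup_ratio: 0.1` follows. The model, optimizer, and total step count are stand-ins for illustration rather than values recovered from the original run; the step count simply reuses the roughly 200 optimizer steps visible in the results table below.

```python
# Hedged sketch: warmup_ratio 0.1 means the first 10% of optimizer steps ramp the
# learning rate up linearly, after which it decays linearly to zero.
import torch
from transformers import get_linear_schedule_with_warmup

model = torch.nn.Linear(10, 2)  # stand-in model for illustration only
optimizer = torch.optim.Adam(model.parameters(), lr=5e-5, betas=(0.9, 0.999), eps=1e-8)

total_steps = 200                          # e.g. the ~200 optimizer steps in the table below
warmup_steps = int(0.1 * total_steps)      # warmup_ratio 0.1 -> 20 warmup steps
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=warmup_steps, num_training_steps=total_steps
)
```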
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7399 | 0.89 | 4 | 0.7281 | 0.5085 |
| 0.6861 | 2.0 | 9 | 0.6326 | 0.6186 |
| 0.6362 | 2.89 | 13 | 0.6392 | 0.6610 |
| 0.5728 | 4.0 | 18 | 0.5497 | 0.6695 |
| 0.515 | 4.89 | 22 | 0.4464 | 0.8136 |
| 0.5437 | 6.0 | 27 | 0.4090 | 0.8136 |
| 0.4647 | 6.89 | 31 | 0.3873 | 0.8220 |
| 0.4616 | 8.0 | 36 | 0.4446 | 0.7712 |
| 0.4396 | 8.89 | 40 | 0.4029 | 0.8220 |
| 0.4354 | 10.0 | 45 | 0.3104 | 0.8559 |
| 0.4187 | 10.89 | 49 | 0.4200 | 0.8220 |
| 0.4639 | 12.0 | 54 | 0.2883 | 0.8898 |
| 0.3663 | 12.89 | 58 | 0.2667 | 0.9153 |
| 0.3169 | 14.0 | 63 | 0.3731 | 0.8644 |
| 0.4725 | 14.89 | 67 | 0.2759 | 0.8729 |
| 0.4438 | 16.0 | 72 | 0.3782 | 0.8559 |
| 0.361 | 16.89 | 76 | 0.2790 | 0.8983 |
| 0.3321 | 18.0 | 81 | 0.2890 | 0.8983 |
| 0.3071 | 18.89 | 85 | 0.2499 | 0.8814 |
| 0.333 | 20.0 | 90 | 0.2271 | 0.9068 |
| 0.2706 | 20.89 | 94 | 0.2631 | 0.8814 |
| 0.2963 | 22.0 | 99 | 0.2449 | 0.9068 |
| 0.323 | 22.89 | 103 | 0.1904 | 0.9322 |
| 0.2677 | 24.0 | 108 | 0.2341 | 0.9237 |
| 0.2473 | 24.89 | 112 | 0.2191 | 0.9237 |
| 0.2598 | 26.0 | 117 | 0.2106 | 0.9322 |
| 0.2733 | 26.89 | 121 | 0.1837 | 0.9492 |
| 0.2506 | 28.0 | 126 | 0.1828 | 0.9407 |
| 0.2315 | 28.89 | 130 | 0.2110 | 0.9153 |
| 0.2519 | 30.0 | 135 | 0.2288 | 0.9153 |
| 0.289 | 30.89 | 139 | 0.1781 | 0.9322 |
| 0.3309 | 32.0 | 144 | 0.1571 | 0.9322 |
| 0.2435 | 32.89 | 148 | 0.1778 | 0.9322 |
| 0.2643 | 34.0 | 153 | 0.2275 | 0.9153 |
| 0.2238 | 34.89 | 157 | 0.1777 | 0.9322 |
| 0.2773 | 36.0 | 162 | 0.1955 | 0.9322 |
| 0.2422 | 36.89 | 166 | 0.2019 | 0.9237 |
| 0.2812 | 38.0 | 171 | 0.1794 | 0.9492 |
| 0.2626 | 38.89 | 175 | 0.1884 | 0.9407 |
| 0.2059 | 40.0 | 180 | 0.1843 | 0.9322 |
| 0.2585 | 40.89 | 184 | 0.1721 | 0.9492 |
| 0.2022 | 42.0 | 189 | 0.1640 | 0.9492 |
| 0.2403 | 42.89 | 193 | 0.1660 | 0.9407 |
| 0.2771 | 44.0 | 198 | 0.1686 | 0.9407 |
| 0.2153 | 44.44 | 200 | 0.1684 | 0.9407 |
### Framework versions
- Transformers 4.30.0.dev0
- Pytorch 1.12.1
- Datasets 2.12.0
- Tokenizers 0.13.1
|
[
"plant2",
"plant3"
] |
jordyvl/vit-tiny_tobacco3482_simkd_CEKD_tNone_aNone_tNone_gNone
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-tiny_tobacco3482_simkd_CEKD_tNone_aNone_tNone_gNone
This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0396
- Accuracy: 0.735
- Brier Loss: 0.7729
- Nll: 1.4473
- F1 Micro: 0.735
- F1 Macro: 0.6948
- Ece: 0.5886
- Aurc: 0.0947
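The Brier loss and NLL values reported above are not defined in this card. The sketch below shows one standard way to compute a multi-class Brier score and a negative log-likelihood from softmax probabilities; it is an assumption about the usual definitions, not the card's own evaluation code.

```python
# Hedged sketch of multi-class Brier score and NLL from softmax outputs.
import numpy as np

def brier_and_nll(probs: np.ndarray, labels: np.ndarray, eps: float = 1e-12):
    """probs: (N, C) softmax probabilities, labels: (N,) integer class ids."""
    n, c = probs.shape
    onehot = np.eye(c)[labels]
    brier = np.mean(np.sum((probs - onehot) ** 2, axis=1))     # squared error against one-hot targets
    nll = -np.mean(np.log(probs[np.arange(n), labels] + eps))  # per-sample cross-entropy
    return brier, nll
```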
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 25 | 0.0553 | 0.085 | 0.8991 | 5.2518 | 0.085 | 0.0595 | 0.1614 | 0.8792 |
| No log | 2.0 | 50 | 0.0488 | 0.035 | 0.9007 | 7.4288 | 0.035 | 0.0069 | 0.1218 | 0.9410 |
| No log | 3.0 | 75 | 0.0481 | 0.045 | 0.8999 | 6.0525 | 0.045 | 0.0087 | 0.1349 | 0.9308 |
| No log | 4.0 | 100 | 0.0478 | 0.05 | 0.8991 | 5.6444 | 0.0500 | 0.0149 | 0.1378 | 0.9211 |
| No log | 5.0 | 125 | 0.0475 | 0.14 | 0.8981 | 5.8239 | 0.14 | 0.0863 | 0.1987 | 0.8452 |
| No log | 6.0 | 150 | 0.0471 | 0.305 | 0.8964 | 5.7469 | 0.305 | 0.1652 | 0.3097 | 0.5016 |
| No log | 7.0 | 175 | 0.0466 | 0.305 | 0.8950 | 4.9568 | 0.305 | 0.1899 | 0.3035 | 0.5165 |
| No log | 8.0 | 200 | 0.0461 | 0.315 | 0.8931 | 4.8214 | 0.315 | 0.1811 | 0.3152 | 0.4687 |
| No log | 9.0 | 225 | 0.0455 | 0.315 | 0.8907 | 4.7406 | 0.315 | 0.2028 | 0.3225 | 0.4671 |
| No log | 10.0 | 250 | 0.0449 | 0.35 | 0.8862 | 4.7538 | 0.35 | 0.1972 | 0.3364 | 0.4263 |
| No log | 11.0 | 275 | 0.0443 | 0.37 | 0.8793 | 4.6283 | 0.37 | 0.2106 | 0.3455 | 0.4084 |
| No log | 12.0 | 300 | 0.0438 | 0.4 | 0.8731 | 3.9664 | 0.4000 | 0.2443 | 0.3685 | 0.3731 |
| No log | 13.0 | 325 | 0.0434 | 0.425 | 0.8628 | 3.9702 | 0.425 | 0.2574 | 0.3842 | 0.3601 |
| No log | 14.0 | 350 | 0.0430 | 0.465 | 0.8586 | 3.8630 | 0.465 | 0.3226 | 0.4112 | 0.3024 |
| No log | 15.0 | 375 | 0.0428 | 0.46 | 0.8488 | 3.9046 | 0.46 | 0.2693 | 0.4082 | 0.2854 |
| No log | 16.0 | 400 | 0.0424 | 0.475 | 0.8430 | 3.2916 | 0.4750 | 0.2802 | 0.4183 | 0.2626 |
| No log | 17.0 | 425 | 0.0421 | 0.555 | 0.8439 | 2.7780 | 0.555 | 0.4109 | 0.4760 | 0.2123 |
| No log | 18.0 | 450 | 0.0418 | 0.575 | 0.8317 | 2.8629 | 0.575 | 0.4399 | 0.4869 | 0.2123 |
| No log | 19.0 | 475 | 0.0415 | 0.665 | 0.8329 | 2.5145 | 0.665 | 0.5077 | 0.5655 | 0.1361 |
| 0.0491 | 20.0 | 500 | 0.0412 | 0.635 | 0.8121 | 2.7489 | 0.635 | 0.5155 | 0.5235 | 0.1686 |
| 0.0491 | 21.0 | 525 | 0.0410 | 0.655 | 0.8221 | 1.7853 | 0.655 | 0.5182 | 0.5509 | 0.1545 |
| 0.0491 | 22.0 | 550 | 0.0406 | 0.685 | 0.8045 | 1.5894 | 0.685 | 0.5486 | 0.5627 | 0.1305 |
| 0.0491 | 23.0 | 575 | 0.0405 | 0.68 | 0.7984 | 1.7241 | 0.68 | 0.5489 | 0.5545 | 0.1296 |
| 0.0491 | 24.0 | 600 | 0.0402 | 0.725 | 0.7959 | 1.5667 | 0.7250 | 0.6156 | 0.5926 | 0.1055 |
| 0.0491 | 25.0 | 625 | 0.0402 | 0.68 | 0.7927 | 1.4334 | 0.68 | 0.5853 | 0.5453 | 0.1239 |
| 0.0491 | 26.0 | 650 | 0.0401 | 0.705 | 0.7808 | 1.8114 | 0.705 | 0.5856 | 0.5735 | 0.1109 |
| 0.0491 | 27.0 | 675 | 0.0399 | 0.71 | 0.7859 | 1.6101 | 0.7100 | 0.6176 | 0.5679 | 0.1034 |
| 0.0491 | 28.0 | 700 | 0.0399 | 0.715 | 0.7808 | 1.3423 | 0.715 | 0.6612 | 0.5582 | 0.1130 |
| 0.0491 | 29.0 | 725 | 0.0398 | 0.705 | 0.7789 | 1.3921 | 0.705 | 0.6477 | 0.5615 | 0.1175 |
| 0.0491 | 30.0 | 750 | 0.0397 | 0.73 | 0.7767 | 1.5801 | 0.7300 | 0.6758 | 0.5741 | 0.1069 |
| 0.0491 | 31.0 | 775 | 0.0397 | 0.72 | 0.7774 | 1.3193 | 0.72 | 0.6653 | 0.5790 | 0.1004 |
| 0.0491 | 32.0 | 800 | 0.0396 | 0.745 | 0.7729 | 1.4864 | 0.745 | 0.6931 | 0.5941 | 0.0933 |
| 0.0491 | 33.0 | 825 | 0.0396 | 0.74 | 0.7736 | 1.5161 | 0.74 | 0.6901 | 0.5828 | 0.0934 |
| 0.0491 | 34.0 | 850 | 0.0396 | 0.745 | 0.7754 | 1.5432 | 0.745 | 0.6963 | 0.5911 | 0.0857 |
| 0.0491 | 35.0 | 875 | 0.0396 | 0.74 | 0.7744 | 1.4773 | 0.74 | 0.6936 | 0.5966 | 0.0896 |
| 0.0491 | 36.0 | 900 | 0.0397 | 0.715 | 0.7762 | 1.3769 | 0.715 | 0.6827 | 0.5675 | 0.1048 |
| 0.0491 | 37.0 | 925 | 0.0396 | 0.72 | 0.7744 | 1.3882 | 0.72 | 0.6780 | 0.5689 | 0.0970 |
| 0.0491 | 38.0 | 950 | 0.0396 | 0.72 | 0.7762 | 1.4098 | 0.72 | 0.6874 | 0.5701 | 0.1016 |
| 0.0491 | 39.0 | 975 | 0.0395 | 0.74 | 0.7728 | 1.3890 | 0.74 | 0.6894 | 0.5861 | 0.0902 |
| 0.0386 | 40.0 | 1000 | 0.0396 | 0.74 | 0.7724 | 1.5265 | 0.74 | 0.6936 | 0.5906 | 0.0881 |
| 0.0386 | 41.0 | 1025 | 0.0396 | 0.725 | 0.7730 | 1.3516 | 0.7250 | 0.6768 | 0.5784 | 0.0942 |
| 0.0386 | 42.0 | 1050 | 0.0396 | 0.73 | 0.7728 | 1.3633 | 0.7300 | 0.6847 | 0.5899 | 0.0945 |
| 0.0386 | 43.0 | 1075 | 0.0396 | 0.735 | 0.7730 | 1.3670 | 0.735 | 0.6874 | 0.5830 | 0.0940 |
| 0.0386 | 44.0 | 1100 | 0.0395 | 0.73 | 0.7727 | 1.4707 | 0.7300 | 0.6850 | 0.5914 | 0.0930 |
| 0.0386 | 45.0 | 1125 | 0.0396 | 0.725 | 0.7721 | 1.4269 | 0.7250 | 0.6810 | 0.5770 | 0.0934 |
| 0.0386 | 46.0 | 1150 | 0.0396 | 0.72 | 0.7730 | 1.3567 | 0.72 | 0.6793 | 0.5717 | 0.0976 |
| 0.0386 | 47.0 | 1175 | 0.0396 | 0.715 | 0.7731 | 1.3708 | 0.715 | 0.6757 | 0.5717 | 0.0974 |
| 0.0386 | 48.0 | 1200 | 0.0396 | 0.735 | 0.7724 | 1.4118 | 0.735 | 0.6874 | 0.5791 | 0.0923 |
| 0.0386 | 49.0 | 1225 | 0.0396 | 0.72 | 0.7729 | 1.3647 | 0.72 | 0.6837 | 0.5711 | 0.0965 |
| 0.0386 | 50.0 | 1250 | 0.0396 | 0.725 | 0.7727 | 1.3773 | 0.7250 | 0.6820 | 0.5740 | 0.0963 |
| 0.0386 | 51.0 | 1275 | 0.0396 | 0.73 | 0.7736 | 1.3286 | 0.7300 | 0.6847 | 0.5766 | 0.0939 |
| 0.0386 | 52.0 | 1300 | 0.0396 | 0.725 | 0.7732 | 1.3810 | 0.7250 | 0.6817 | 0.5830 | 0.0944 |
| 0.0386 | 53.0 | 1325 | 0.0396 | 0.725 | 0.7725 | 1.3568 | 0.7250 | 0.6820 | 0.5763 | 0.0948 |
| 0.0386 | 54.0 | 1350 | 0.0396 | 0.73 | 0.7731 | 1.3693 | 0.7300 | 0.6847 | 0.5768 | 0.0941 |
| 0.0386 | 55.0 | 1375 | 0.0396 | 0.745 | 0.7728 | 1.3631 | 0.745 | 0.7112 | 0.5842 | 0.0928 |
| 0.0386 | 56.0 | 1400 | 0.0396 | 0.715 | 0.7731 | 1.4175 | 0.715 | 0.6712 | 0.5600 | 0.0976 |
| 0.0386 | 57.0 | 1425 | 0.0396 | 0.725 | 0.7725 | 1.3668 | 0.7250 | 0.6929 | 0.5738 | 0.0962 |
| 0.0386 | 58.0 | 1450 | 0.0396 | 0.73 | 0.7734 | 1.3903 | 0.7300 | 0.6958 | 0.5868 | 0.0963 |
| 0.0386 | 59.0 | 1475 | 0.0396 | 0.725 | 0.7729 | 1.4120 | 0.7250 | 0.6765 | 0.5756 | 0.0945 |
| 0.0373 | 60.0 | 1500 | 0.0396 | 0.725 | 0.7732 | 1.3655 | 0.7250 | 0.6820 | 0.5754 | 0.0951 |
| 0.0373 | 61.0 | 1525 | 0.0396 | 0.745 | 0.7727 | 1.3676 | 0.745 | 0.7038 | 0.5913 | 0.0921 |
| 0.0373 | 62.0 | 1550 | 0.0396 | 0.72 | 0.7729 | 1.3629 | 0.72 | 0.6797 | 0.5762 | 0.0969 |
| 0.0373 | 63.0 | 1575 | 0.0396 | 0.725 | 0.7730 | 1.4242 | 0.7250 | 0.6865 | 0.5811 | 0.0950 |
| 0.0373 | 64.0 | 1600 | 0.0396 | 0.725 | 0.7735 | 1.3658 | 0.7250 | 0.6923 | 0.5750 | 0.0959 |
| 0.0373 | 65.0 | 1625 | 0.0396 | 0.73 | 0.7731 | 1.4296 | 0.7300 | 0.6958 | 0.5769 | 0.0954 |
| 0.0373 | 66.0 | 1650 | 0.0396 | 0.735 | 0.7727 | 1.4780 | 0.735 | 0.6980 | 0.5851 | 0.0938 |
| 0.0373 | 67.0 | 1675 | 0.0396 | 0.725 | 0.7725 | 1.3669 | 0.7250 | 0.6824 | 0.5715 | 0.0938 |
| 0.0373 | 68.0 | 1700 | 0.0396 | 0.725 | 0.7730 | 1.4327 | 0.7250 | 0.6804 | 0.5741 | 0.0940 |
| 0.0373 | 69.0 | 1725 | 0.0396 | 0.73 | 0.7728 | 1.3811 | 0.7300 | 0.6961 | 0.5806 | 0.0963 |
| 0.0373 | 70.0 | 1750 | 0.0396 | 0.735 | 0.7727 | 1.3812 | 0.735 | 0.7081 | 0.5765 | 0.0952 |
| 0.0373 | 71.0 | 1775 | 0.0396 | 0.73 | 0.7730 | 1.4263 | 0.7300 | 0.6961 | 0.5739 | 0.0953 |
| 0.0373 | 72.0 | 1800 | 0.0396 | 0.73 | 0.7731 | 1.4280 | 0.7300 | 0.6953 | 0.5803 | 0.0956 |
| 0.0373 | 73.0 | 1825 | 0.0396 | 0.735 | 0.7729 | 1.3676 | 0.735 | 0.6988 | 0.5889 | 0.0953 |
| 0.0373 | 74.0 | 1850 | 0.0396 | 0.735 | 0.7727 | 1.4358 | 0.735 | 0.6985 | 0.5828 | 0.0940 |
| 0.0373 | 75.0 | 1875 | 0.0396 | 0.735 | 0.7727 | 1.4306 | 0.735 | 0.6965 | 0.5786 | 0.0940 |
| 0.0373 | 76.0 | 1900 | 0.0396 | 0.73 | 0.7729 | 1.4343 | 0.7300 | 0.6957 | 0.5802 | 0.0958 |
| 0.0373 | 77.0 | 1925 | 0.0396 | 0.73 | 0.7726 | 1.4259 | 0.7300 | 0.6961 | 0.5795 | 0.0962 |
| 0.0373 | 78.0 | 1950 | 0.0396 | 0.74 | 0.7731 | 1.4246 | 0.74 | 0.7080 | 0.5879 | 0.0941 |
| 0.0373 | 79.0 | 1975 | 0.0396 | 0.735 | 0.7730 | 1.4414 | 0.735 | 0.6980 | 0.5914 | 0.0945 |
| 0.0372 | 80.0 | 2000 | 0.0396 | 0.74 | 0.7727 | 1.4285 | 0.74 | 0.7103 | 0.5915 | 0.0939 |
| 0.0372 | 81.0 | 2025 | 0.0396 | 0.735 | 0.7731 | 1.4379 | 0.735 | 0.6980 | 0.5826 | 0.0942 |
| 0.0372 | 82.0 | 2050 | 0.0396 | 0.735 | 0.7729 | 1.4308 | 0.735 | 0.6963 | 0.5827 | 0.0942 |
| 0.0372 | 83.0 | 2075 | 0.0396 | 0.735 | 0.7728 | 1.4329 | 0.735 | 0.6968 | 0.5896 | 0.0946 |
| 0.0372 | 84.0 | 2100 | 0.0396 | 0.735 | 0.7728 | 1.4343 | 0.735 | 0.6948 | 0.5889 | 0.0947 |
| 0.0372 | 85.0 | 2125 | 0.0396 | 0.735 | 0.7727 | 1.4320 | 0.735 | 0.6948 | 0.5988 | 0.0945 |
| 0.0372 | 86.0 | 2150 | 0.0396 | 0.735 | 0.7730 | 1.4366 | 0.735 | 0.6963 | 0.5883 | 0.0949 |
| 0.0372 | 87.0 | 2175 | 0.0396 | 0.73 | 0.7728 | 1.4825 | 0.7300 | 0.6888 | 0.5878 | 0.0945 |
| 0.0372 | 88.0 | 2200 | 0.0396 | 0.735 | 0.7731 | 1.4339 | 0.735 | 0.6945 | 0.5828 | 0.0948 |
| 0.0372 | 89.0 | 2225 | 0.0396 | 0.735 | 0.7729 | 1.4383 | 0.735 | 0.6948 | 0.5917 | 0.0946 |
| 0.0372 | 90.0 | 2250 | 0.0396 | 0.735 | 0.7729 | 1.4471 | 0.735 | 0.6948 | 0.5867 | 0.0944 |
| 0.0372 | 91.0 | 2275 | 0.0396 | 0.735 | 0.7728 | 1.4402 | 0.735 | 0.6948 | 0.5892 | 0.0946 |
| 0.0372 | 92.0 | 2300 | 0.0396 | 0.735 | 0.7729 | 1.4412 | 0.735 | 0.6948 | 0.5952 | 0.0948 |
| 0.0372 | 93.0 | 2325 | 0.0396 | 0.735 | 0.7729 | 1.4709 | 0.735 | 0.6948 | 0.5917 | 0.0948 |
| 0.0372 | 94.0 | 2350 | 0.0396 | 0.735 | 0.7728 | 1.4413 | 0.735 | 0.6948 | 0.5858 | 0.0947 |
| 0.0372 | 95.0 | 2375 | 0.0396 | 0.735 | 0.7729 | 1.4422 | 0.735 | 0.6948 | 0.5917 | 0.0946 |
| 0.0372 | 96.0 | 2400 | 0.0396 | 0.735 | 0.7729 | 1.4527 | 0.735 | 0.6948 | 0.5917 | 0.0946 |
| 0.0372 | 97.0 | 2425 | 0.0396 | 0.735 | 0.7729 | 1.4441 | 0.735 | 0.6948 | 0.5917 | 0.0946 |
| 0.0372 | 98.0 | 2450 | 0.0396 | 0.735 | 0.7729 | 1.4423 | 0.735 | 0.6948 | 0.5917 | 0.0946 |
| 0.0372 | 99.0 | 2475 | 0.0396 | 0.735 | 0.7729 | 1.4457 | 0.735 | 0.6948 | 0.5886 | 0.0948 |
| 0.0372 | 100.0 | 2500 | 0.0396 | 0.735 | 0.7729 | 1.4473 | 0.735 | 0.6948 | 0.5886 | 0.0947 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
ALM-AHME/convnextv2-large-1k-224-finetuned-LungCancer-Classification-LC25000-AH-40-30-30
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# convnextv2-large-1k-224-finetuned-LungCancer-Classification-LC25000-AH-40-30-30
This model is a fine-tuned version of [facebook/convnextv2-large-1k-224](https://huggingface.co/facebook/convnextv2-large-1k-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1791
- Accuracy: 0.9433
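As a usage illustration not taken from the card, the checkpoint could be loaded with the Auto classes as sketched below; the image path is a placeholder and the predicted label is only an example drawn from this card's label set.

```python
# Hedged usage sketch for the checkpoint described in this card.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "ALM-AHME/convnextv2-large-1k-224-finetuned-LungCancer-Classification-LC25000-AH-40-30-30"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("path/to/histology_tile.png").convert("RGB")  # placeholder image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])  # e.g. "lung_adenocarcinoma"
```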
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.5
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.254 | 0.99 | 93 | 0.1791 | 0.9433 |
| 0.4225 | 1.99 | 187 | 0.4341 | 0.8297 |
| 0.4801 | 3.0 | 281 | 0.4158 | 0.8890 |
| 0.2558 | 4.0 | 375 | 0.2540 | 0.8952 |
| 0.1809 | 4.96 | 465 | 0.1753 | 0.9358 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
|
[
"lung-benign_tissue",
"lung_adenocarcinoma",
"lung_squamous_cell_carcinoma"
] |
jordyvl/vit-small_tobacco3482_kd_NKD_t1.0_g1.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-small_tobacco3482_kd_NKD_t1.0_g1.5
This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 3.9399
- Accuracy: 0.82
- Brier Loss: 0.3024
- Nll: 1.1952
- F1 Micro: 0.82
- F1 Macro: 0.7964
- Ece: 0.1494
- Aurc: 0.0548
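AURC (area under the risk-coverage curve) is reported above without a definition. The sketch below follows the common selective-prediction formulation, in which samples are ranked by confidence and the cumulative error rate is averaged over all coverage levels; it is an assumption about how the value was obtained, not the card's own code.

```python
# Hedged sketch of a standard AURC computation from softmax outputs.
import numpy as np

def aurc(probs: np.ndarray, labels: np.ndarray) -> float:
    """probs: (N, C) softmax probabilities, labels: (N,) integer class ids."""
    confidences = probs.max(axis=1)
    errors = (probs.argmax(axis=1) != labels).astype(float)
    order = np.argsort(-confidences)               # most confident samples first
    cumulative_errors = np.cumsum(errors[order])
    coverage_counts = np.arange(1, len(labels) + 1)
    risks = cumulative_errors / coverage_counts    # error rate at each coverage level
    return risks.mean()
```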
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 7 | 4.9730 | 0.205 | 0.8886 | 5.4337 | 0.205 | 0.1356 | 0.2736 | 0.7527 |
| No log | 2.0 | 14 | 4.6039 | 0.355 | 0.8122 | 3.5880 | 0.3550 | 0.2120 | 0.3197 | 0.5132 |
| No log | 3.0 | 21 | 4.2754 | 0.515 | 0.7054 | 2.0283 | 0.515 | 0.4046 | 0.3275 | 0.2966 |
| No log | 4.0 | 28 | 4.0263 | 0.6 | 0.5799 | 1.5709 | 0.6 | 0.5157 | 0.3062 | 0.1890 |
| No log | 5.0 | 35 | 3.8749 | 0.725 | 0.4857 | 1.5338 | 0.7250 | 0.6949 | 0.3181 | 0.1194 |
| No log | 6.0 | 42 | 3.7023 | 0.765 | 0.3925 | 1.2926 | 0.765 | 0.6948 | 0.2394 | 0.0908 |
| No log | 7.0 | 49 | 3.7728 | 0.78 | 0.3668 | 1.3007 | 0.78 | 0.7355 | 0.2478 | 0.0754 |
| No log | 8.0 | 56 | 3.7328 | 0.785 | 0.3459 | 1.2487 | 0.785 | 0.7501 | 0.2261 | 0.0804 |
| No log | 9.0 | 63 | 3.7092 | 0.77 | 0.3289 | 1.0921 | 0.7700 | 0.7672 | 0.2019 | 0.0767 |
| No log | 10.0 | 70 | 3.6273 | 0.795 | 0.3150 | 1.0342 | 0.795 | 0.7690 | 0.1927 | 0.0716 |
| No log | 11.0 | 77 | 3.5677 | 0.83 | 0.2754 | 1.3837 | 0.83 | 0.7933 | 0.1697 | 0.0532 |
| No log | 12.0 | 84 | 3.5668 | 0.815 | 0.2816 | 1.1304 | 0.815 | 0.7934 | 0.1563 | 0.0617 |
| No log | 13.0 | 91 | 3.6080 | 0.83 | 0.2723 | 0.9515 | 0.83 | 0.8088 | 0.1648 | 0.0543 |
| No log | 14.0 | 98 | 3.6095 | 0.815 | 0.3050 | 1.2020 | 0.815 | 0.8207 | 0.1523 | 0.0633 |
| No log | 15.0 | 105 | 3.6685 | 0.805 | 0.3060 | 1.2725 | 0.805 | 0.7920 | 0.1618 | 0.0676 |
| No log | 16.0 | 112 | 3.5523 | 0.825 | 0.2832 | 0.9447 | 0.825 | 0.8163 | 0.1614 | 0.0569 |
| No log | 17.0 | 119 | 3.5294 | 0.805 | 0.2752 | 0.9918 | 0.805 | 0.7636 | 0.1537 | 0.0549 |
| No log | 18.0 | 126 | 3.5382 | 0.8 | 0.2870 | 1.2294 | 0.8000 | 0.7885 | 0.1603 | 0.0583 |
| No log | 19.0 | 133 | 3.5541 | 0.82 | 0.2905 | 1.2181 | 0.82 | 0.8204 | 0.1400 | 0.0618 |
| No log | 20.0 | 140 | 3.4717 | 0.835 | 0.2606 | 1.1119 | 0.835 | 0.8146 | 0.1382 | 0.0531 |
| No log | 21.0 | 147 | 3.6074 | 0.79 | 0.3099 | 1.2144 | 0.79 | 0.7771 | 0.1419 | 0.0599 |
| No log | 22.0 | 154 | 3.5448 | 0.805 | 0.2868 | 1.2075 | 0.805 | 0.7761 | 0.1439 | 0.0581 |
| No log | 23.0 | 161 | 3.6070 | 0.805 | 0.3057 | 1.2908 | 0.805 | 0.7831 | 0.1393 | 0.0627 |
| No log | 24.0 | 168 | 3.5289 | 0.81 | 0.2716 | 1.1844 | 0.81 | 0.7879 | 0.1358 | 0.0550 |
| No log | 25.0 | 175 | 3.5502 | 0.82 | 0.2827 | 1.1141 | 0.82 | 0.7908 | 0.1460 | 0.0554 |
| No log | 26.0 | 182 | 3.5747 | 0.82 | 0.2829 | 1.1727 | 0.82 | 0.8027 | 0.1330 | 0.0565 |
| No log | 27.0 | 189 | 3.6091 | 0.83 | 0.2787 | 1.1040 | 0.83 | 0.8067 | 0.1347 | 0.0563 |
| No log | 28.0 | 196 | 3.5917 | 0.82 | 0.2837 | 1.1775 | 0.82 | 0.7975 | 0.1513 | 0.0564 |
| No log | 29.0 | 203 | 3.6087 | 0.815 | 0.2875 | 1.1448 | 0.815 | 0.7998 | 0.1339 | 0.0542 |
| No log | 30.0 | 210 | 3.6018 | 0.815 | 0.2819 | 1.1613 | 0.815 | 0.8027 | 0.1507 | 0.0535 |
| No log | 31.0 | 217 | 3.6350 | 0.815 | 0.2845 | 1.2278 | 0.815 | 0.7866 | 0.1401 | 0.0537 |
| No log | 32.0 | 224 | 3.6290 | 0.82 | 0.2815 | 1.1528 | 0.82 | 0.7950 | 0.1424 | 0.0520 |
| No log | 33.0 | 231 | 3.6642 | 0.815 | 0.2865 | 1.1504 | 0.815 | 0.7946 | 0.1379 | 0.0542 |
| No log | 34.0 | 238 | 3.6778 | 0.815 | 0.2929 | 1.2116 | 0.815 | 0.7890 | 0.1437 | 0.0538 |
| No log | 35.0 | 245 | 3.6867 | 0.82 | 0.2869 | 1.1547 | 0.82 | 0.7904 | 0.1404 | 0.0529 |
| No log | 36.0 | 252 | 3.6931 | 0.795 | 0.2946 | 1.1478 | 0.795 | 0.7694 | 0.1494 | 0.0543 |
| No log | 37.0 | 259 | 3.7166 | 0.82 | 0.2921 | 1.2109 | 0.82 | 0.7904 | 0.1489 | 0.0534 |
| No log | 38.0 | 266 | 3.7024 | 0.81 | 0.2889 | 1.1516 | 0.81 | 0.7888 | 0.1508 | 0.0536 |
| No log | 39.0 | 273 | 3.7353 | 0.81 | 0.2943 | 1.2088 | 0.81 | 0.7812 | 0.1466 | 0.0537 |
| No log | 40.0 | 280 | 3.7198 | 0.82 | 0.2891 | 1.1515 | 0.82 | 0.8014 | 0.1285 | 0.0536 |
| No log | 41.0 | 287 | 3.7413 | 0.815 | 0.2899 | 1.2124 | 0.815 | 0.7959 | 0.1471 | 0.0537 |
| No log | 42.0 | 294 | 3.7272 | 0.82 | 0.2896 | 1.2071 | 0.82 | 0.8002 | 0.1414 | 0.0532 |
| No log | 43.0 | 301 | 3.7609 | 0.815 | 0.2925 | 1.2100 | 0.815 | 0.7868 | 0.1486 | 0.0528 |
| No log | 44.0 | 308 | 3.7589 | 0.815 | 0.2922 | 1.2074 | 0.815 | 0.7877 | 0.1398 | 0.0537 |
| No log | 45.0 | 315 | 3.7820 | 0.815 | 0.2961 | 1.2078 | 0.815 | 0.7874 | 0.1499 | 0.0535 |
| No log | 46.0 | 322 | 3.7663 | 0.82 | 0.2926 | 1.2053 | 0.82 | 0.8014 | 0.1369 | 0.0532 |
| No log | 47.0 | 329 | 3.7850 | 0.82 | 0.2944 | 1.2079 | 0.82 | 0.7904 | 0.1374 | 0.0532 |
| No log | 48.0 | 336 | 3.7802 | 0.82 | 0.2935 | 1.2025 | 0.82 | 0.7981 | 0.1483 | 0.0537 |
| No log | 49.0 | 343 | 3.7954 | 0.82 | 0.2937 | 1.2068 | 0.82 | 0.7900 | 0.1354 | 0.0528 |
| No log | 50.0 | 350 | 3.7974 | 0.815 | 0.2954 | 1.2020 | 0.815 | 0.7907 | 0.1491 | 0.0534 |
| No log | 51.0 | 357 | 3.8081 | 0.815 | 0.2965 | 1.2035 | 0.815 | 0.7907 | 0.1533 | 0.0533 |
| No log | 52.0 | 364 | 3.8171 | 0.815 | 0.2982 | 1.2033 | 0.815 | 0.7907 | 0.1466 | 0.0537 |
| No log | 53.0 | 371 | 3.8136 | 0.815 | 0.2961 | 1.2035 | 0.815 | 0.7907 | 0.1399 | 0.0531 |
| No log | 54.0 | 378 | 3.8244 | 0.815 | 0.2977 | 1.2024 | 0.815 | 0.7907 | 0.1586 | 0.0538 |
| No log | 55.0 | 385 | 3.8265 | 0.815 | 0.2963 | 1.2004 | 0.815 | 0.7907 | 0.1506 | 0.0537 |
| No log | 56.0 | 392 | 3.8376 | 0.82 | 0.2980 | 1.2011 | 0.82 | 0.7964 | 0.1471 | 0.0536 |
| No log | 57.0 | 399 | 3.8428 | 0.82 | 0.2982 | 1.1994 | 0.82 | 0.7964 | 0.1562 | 0.0535 |
| No log | 58.0 | 406 | 3.8418 | 0.82 | 0.2973 | 1.2004 | 0.82 | 0.7964 | 0.1484 | 0.0537 |
| No log | 59.0 | 413 | 3.8507 | 0.82 | 0.2984 | 1.2009 | 0.82 | 0.7931 | 0.1563 | 0.0538 |
| No log | 60.0 | 420 | 3.8560 | 0.82 | 0.2989 | 1.2001 | 0.82 | 0.7964 | 0.1579 | 0.0540 |
| No log | 61.0 | 427 | 3.8563 | 0.82 | 0.2974 | 1.1997 | 0.82 | 0.7964 | 0.1560 | 0.0536 |
| No log | 62.0 | 434 | 3.8648 | 0.815 | 0.2986 | 1.1995 | 0.815 | 0.7907 | 0.1532 | 0.0540 |
| No log | 63.0 | 441 | 3.8682 | 0.82 | 0.2991 | 1.1991 | 0.82 | 0.7964 | 0.1570 | 0.0536 |
| No log | 64.0 | 448 | 3.8735 | 0.82 | 0.2989 | 1.1984 | 0.82 | 0.7964 | 0.1481 | 0.0539 |
| No log | 65.0 | 455 | 3.8794 | 0.82 | 0.3000 | 1.1981 | 0.82 | 0.7964 | 0.1496 | 0.0543 |
| No log | 66.0 | 462 | 3.8824 | 0.82 | 0.3002 | 1.1980 | 0.82 | 0.7964 | 0.1567 | 0.0539 |
| No log | 67.0 | 469 | 3.8842 | 0.82 | 0.3005 | 1.1983 | 0.82 | 0.7964 | 0.1438 | 0.0542 |
| No log | 68.0 | 476 | 3.8866 | 0.82 | 0.3001 | 1.1978 | 0.82 | 0.7964 | 0.1418 | 0.0540 |
| No log | 69.0 | 483 | 3.8912 | 0.82 | 0.3003 | 1.1977 | 0.82 | 0.7964 | 0.1570 | 0.0541 |
| No log | 70.0 | 490 | 3.8959 | 0.82 | 0.3008 | 1.1971 | 0.82 | 0.7964 | 0.1445 | 0.0544 |
| No log | 71.0 | 497 | 3.8964 | 0.82 | 0.3002 | 1.1977 | 0.82 | 0.7964 | 0.1366 | 0.0543 |
| 3.4649 | 72.0 | 504 | 3.9021 | 0.82 | 0.3009 | 1.1969 | 0.82 | 0.7964 | 0.1471 | 0.0543 |
| 3.4649 | 73.0 | 511 | 3.9052 | 0.82 | 0.3015 | 1.1976 | 0.82 | 0.7964 | 0.1532 | 0.0546 |
| 3.4649 | 74.0 | 518 | 3.9043 | 0.82 | 0.3002 | 1.1973 | 0.82 | 0.7964 | 0.1371 | 0.0544 |
| 3.4649 | 75.0 | 525 | 3.9096 | 0.82 | 0.3004 | 1.1966 | 0.82 | 0.7964 | 0.1417 | 0.0543 |
| 3.4649 | 76.0 | 532 | 3.9099 | 0.82 | 0.3010 | 1.1965 | 0.82 | 0.7964 | 0.1428 | 0.0545 |
| 3.4649 | 77.0 | 539 | 3.9151 | 0.82 | 0.3016 | 1.1963 | 0.82 | 0.7964 | 0.1460 | 0.0548 |
| 3.4649 | 78.0 | 546 | 3.9143 | 0.82 | 0.3010 | 1.1970 | 0.82 | 0.7964 | 0.1447 | 0.0543 |
| 3.4649 | 79.0 | 553 | 3.9164 | 0.82 | 0.3014 | 1.1966 | 0.82 | 0.7964 | 0.1436 | 0.0545 |
| 3.4649 | 80.0 | 560 | 3.9198 | 0.82 | 0.3018 | 1.1965 | 0.82 | 0.7964 | 0.1520 | 0.0545 |
| 3.4649 | 81.0 | 567 | 3.9218 | 0.82 | 0.3015 | 1.1959 | 0.82 | 0.7964 | 0.1440 | 0.0546 |
| 3.4649 | 82.0 | 574 | 3.9236 | 0.82 | 0.3018 | 1.1961 | 0.82 | 0.7964 | 0.1439 | 0.0546 |
| 3.4649 | 83.0 | 581 | 3.9248 | 0.82 | 0.3017 | 1.1959 | 0.82 | 0.7964 | 0.1440 | 0.0546 |
| 3.4649 | 84.0 | 588 | 3.9267 | 0.82 | 0.3018 | 1.1958 | 0.82 | 0.7964 | 0.1442 | 0.0545 |
| 3.4649 | 85.0 | 595 | 3.9286 | 0.82 | 0.3019 | 1.1959 | 0.82 | 0.7964 | 0.1443 | 0.0546 |
| 3.4649 | 86.0 | 602 | 3.9300 | 0.82 | 0.3020 | 1.1958 | 0.82 | 0.7964 | 0.1444 | 0.0545 |
| 3.4649 | 87.0 | 609 | 3.9320 | 0.82 | 0.3022 | 1.1956 | 0.82 | 0.7964 | 0.1446 | 0.0546 |
| 3.4649 | 88.0 | 616 | 3.9327 | 0.82 | 0.3022 | 1.1957 | 0.82 | 0.7964 | 0.1446 | 0.0545 |
| 3.4649 | 89.0 | 623 | 3.9340 | 0.82 | 0.3022 | 1.1955 | 0.82 | 0.7964 | 0.1436 | 0.0546 |
| 3.4649 | 90.0 | 630 | 3.9346 | 0.82 | 0.3022 | 1.1956 | 0.82 | 0.7964 | 0.1447 | 0.0546 |
| 3.4649 | 91.0 | 637 | 3.9360 | 0.82 | 0.3023 | 1.1953 | 0.82 | 0.7964 | 0.1438 | 0.0546 |
| 3.4649 | 92.0 | 644 | 3.9368 | 0.82 | 0.3023 | 1.1954 | 0.82 | 0.7964 | 0.1438 | 0.0546 |
| 3.4649 | 93.0 | 651 | 3.9374 | 0.82 | 0.3023 | 1.1954 | 0.82 | 0.7964 | 0.1437 | 0.0548 |
| 3.4649 | 94.0 | 658 | 3.9380 | 0.82 | 0.3023 | 1.1953 | 0.82 | 0.7964 | 0.1438 | 0.0548 |
| 3.4649 | 95.0 | 665 | 3.9385 | 0.82 | 0.3023 | 1.1953 | 0.82 | 0.7964 | 0.1494 | 0.0549 |
| 3.4649 | 96.0 | 672 | 3.9391 | 0.82 | 0.3024 | 1.1952 | 0.82 | 0.7964 | 0.1494 | 0.0548 |
| 3.4649 | 97.0 | 679 | 3.9393 | 0.82 | 0.3024 | 1.1952 | 0.82 | 0.7964 | 0.1495 | 0.0548 |
| 3.4649 | 98.0 | 686 | 3.9396 | 0.82 | 0.3024 | 1.1952 | 0.82 | 0.7964 | 0.1494 | 0.0548 |
| 3.4649 | 99.0 | 693 | 3.9398 | 0.82 | 0.3024 | 1.1952 | 0.82 | 0.7964 | 0.1494 | 0.0548 |
| 3.4649 | 100.0 | 700 | 3.9399 | 0.82 | 0.3024 | 1.1952 | 0.82 | 0.7964 | 0.1494 | 0.0548 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |