Dataset schema:
- model_id: string (length 7 to 105)
- model_card: string (length 1 to 130k)
- model_labels: list (length 2 to 80k)
jordyvl/dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5 This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.5443 - Accuracy: 0.8293 - Brier Loss: 0.2695 - Nll: 1.2500 - F1 Micro: 0.8293 - F1 Macro: 0.8301 - Ece: 0.0874 - Aurc: 0.0693 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 96 - eval_batch_size: 96 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 167 | 1.1672 | 0.6082 | 0.5198 | 2.4714 | 0.6082 | 0.6021 | 0.0681 | 0.1731 | | No log | 2.0 | 334 | 0.8928 | 0.7177 | 0.4028 | 2.0687 | 0.7178 | 0.7161 | 0.0715 | 0.0997 | | 1.1617 | 3.0 | 501 | 0.7584 | 0.7602 | 0.3454 | 1.8041 | 0.7602 | 0.7637 | 0.0762 | 0.0747 | | 1.1617 | 4.0 | 668 | 0.7048 | 0.768 | 0.3262 | 1.6695 | 0.768 | 0.7687 | 0.0550 | 0.0687 | | 1.1617 | 5.0 | 835 | 0.6921 | 0.7745 | 0.3214 | 1.6428 | 0.7745 | 0.7683 | 0.0477 | 0.0738 | | 0.4396 | 6.0 | 1002 | 0.6616 | 0.789 | 0.3105 | 1.5655 | 0.7890 | 0.7897 | 0.0586 | 0.0714 | | 0.4396 | 7.0 | 1169 | 0.6363 | 0.794 | 0.3001 | 1.5694 | 0.7940 | 0.7970 | 0.0549 | 
0.0674 | | 0.4396 | 8.0 | 1336 | 0.6753 | 0.7792 | 0.3259 | 1.4975 | 0.7792 | 0.7804 | 0.0647 | 0.0777 | | 0.2291 | 9.0 | 1503 | 0.6247 | 0.8025 | 0.2979 | 1.4968 | 0.8025 | 0.8037 | 0.0705 | 0.0669 | | 0.2291 | 10.0 | 1670 | 0.6347 | 0.799 | 0.3032 | 1.4834 | 0.799 | 0.8011 | 0.0720 | 0.0743 | | 0.2291 | 11.0 | 1837 | 0.6328 | 0.7975 | 0.3045 | 1.4998 | 0.7975 | 0.8031 | 0.0773 | 0.0659 | | 0.1575 | 12.0 | 2004 | 0.6442 | 0.7965 | 0.3097 | 1.4447 | 0.7965 | 0.7979 | 0.0714 | 0.0824 | | 0.1575 | 13.0 | 2171 | 0.6354 | 0.8013 | 0.3043 | 1.4874 | 0.8013 | 0.8035 | 0.0712 | 0.0741 | | 0.1575 | 14.0 | 2338 | 0.6443 | 0.799 | 0.3091 | 1.5848 | 0.799 | 0.8022 | 0.0791 | 0.0859 | | 0.1285 | 15.0 | 2505 | 0.6357 | 0.8017 | 0.3042 | 1.5670 | 0.8017 | 0.8002 | 0.0799 | 0.0685 | | 0.1285 | 16.0 | 2672 | 0.6166 | 0.807 | 0.2965 | 1.4806 | 0.807 | 0.8056 | 0.0720 | 0.0745 | | 0.1285 | 17.0 | 2839 | 0.6433 | 0.7993 | 0.3159 | 1.5024 | 0.7993 | 0.8023 | 0.0805 | 0.0857 | | 0.1121 | 18.0 | 3006 | 0.6102 | 0.8147 | 0.2960 | 1.4550 | 0.8148 | 0.8144 | 0.0775 | 0.0698 | | 0.1121 | 19.0 | 3173 | 0.6616 | 0.7995 | 0.3146 | 1.6009 | 0.7995 | 0.7962 | 0.0892 | 0.0883 | | 0.1121 | 20.0 | 3340 | 0.6163 | 0.8037 | 0.3029 | 1.4525 | 0.8037 | 0.8059 | 0.0920 | 0.0771 | | 0.1012 | 21.0 | 3507 | 0.6186 | 0.8093 | 0.3017 | 1.5539 | 0.8093 | 0.8111 | 0.0920 | 0.0712 | | 0.1012 | 22.0 | 3674 | 0.5982 | 0.8137 | 0.2930 | 1.4533 | 0.8137 | 0.8140 | 0.0815 | 0.0668 | | 0.1012 | 23.0 | 3841 | 0.5928 | 0.822 | 0.2864 | 1.4312 | 0.822 | 0.8218 | 0.0723 | 0.0818 | | 0.0888 | 24.0 | 4008 | 0.5931 | 0.8135 | 0.2900 | 1.4129 | 0.8135 | 0.8143 | 0.0894 | 0.0706 | | 0.0888 | 25.0 | 4175 | 0.5807 | 0.8183 | 0.2849 | 1.4241 | 0.8183 | 0.8203 | 0.0903 | 0.0683 | | 0.0888 | 26.0 | 4342 | 0.5859 | 0.8193 | 0.2869 | 1.4385 | 0.8193 | 0.8194 | 0.0879 | 0.0698 | | 0.0828 | 27.0 | 4509 | 0.5957 | 0.8147 | 0.2941 | 1.4132 | 0.8148 | 0.8151 | 0.0847 | 0.0732 | | 0.0828 | 28.0 | 4676 | 0.5791 | 0.818 | 0.2852 | 1.4231 | 
0.818 | 0.8185 | 0.0896 | 0.0612 | | 0.0828 | 29.0 | 4843 | 0.5888 | 0.8137 | 0.2895 | 1.3998 | 0.8137 | 0.8148 | 0.0925 | 0.0740 | | 0.0776 | 30.0 | 5010 | 0.5633 | 0.8225 | 0.2798 | 1.3391 | 0.8225 | 0.8234 | 0.0878 | 0.0760 | | 0.0776 | 31.0 | 5177 | 0.5635 | 0.8247 | 0.2785 | 1.3193 | 0.8247 | 0.8256 | 0.0900 | 0.0587 | | 0.0776 | 32.0 | 5344 | 0.5580 | 0.8223 | 0.2784 | 1.2970 | 0.8223 | 0.8241 | 0.0905 | 0.0704 | | 0.0727 | 33.0 | 5511 | 0.5502 | 0.826 | 0.2724 | 1.2733 | 0.826 | 0.8268 | 0.0865 | 0.0619 | | 0.0727 | 34.0 | 5678 | 0.5448 | 0.8293 | 0.2720 | 1.2237 | 0.8293 | 0.8303 | 0.0820 | 0.0639 | | 0.0727 | 35.0 | 5845 | 0.5480 | 0.8257 | 0.2729 | 1.2867 | 0.8257 | 0.8271 | 0.0928 | 0.0586 | | 0.0696 | 36.0 | 6012 | 0.5437 | 0.8293 | 0.2703 | 1.2427 | 0.8293 | 0.8298 | 0.0871 | 0.0630 | | 0.0696 | 37.0 | 6179 | 0.5460 | 0.8253 | 0.2712 | 1.2629 | 0.8253 | 0.8262 | 0.0912 | 0.0598 | | 0.0696 | 38.0 | 6346 | 0.5425 | 0.8295 | 0.2703 | 1.2440 | 0.8295 | 0.8303 | 0.0899 | 0.0611 | | 0.0677 | 39.0 | 6513 | 0.5421 | 0.8307 | 0.2690 | 1.2453 | 0.8308 | 0.8319 | 0.0835 | 0.0665 | | 0.0677 | 40.0 | 6680 | 0.5406 | 0.8287 | 0.2689 | 1.2465 | 0.8287 | 0.8296 | 0.0895 | 0.0612 | | 0.0677 | 41.0 | 6847 | 0.5423 | 0.8277 | 0.2696 | 1.2735 | 0.8277 | 0.8284 | 0.0893 | 0.0604 | | 0.0663 | 42.0 | 7014 | 0.5406 | 0.8297 | 0.2676 | 1.2403 | 0.8297 | 0.8306 | 0.0894 | 0.0657 | | 0.0663 | 43.0 | 7181 | 0.5410 | 0.8313 | 0.2686 | 1.2359 | 0.8313 | 0.8323 | 0.0895 | 0.0635 | | 0.0663 | 44.0 | 7348 | 0.5416 | 0.8287 | 0.2685 | 1.2308 | 0.8287 | 0.8295 | 0.0883 | 0.0647 | | 0.0652 | 45.0 | 7515 | 0.5431 | 0.8275 | 0.2697 | 1.2374 | 0.8275 | 0.8282 | 0.0932 | 0.0648 | | 0.0652 | 46.0 | 7682 | 0.5433 | 0.8295 | 0.2693 | 1.2347 | 0.8295 | 0.8303 | 0.0891 | 0.0681 | | 0.0652 | 47.0 | 7849 | 0.5441 | 0.8277 | 0.2696 | 1.2433 | 0.8277 | 0.8286 | 0.0882 | 0.0681 | | 0.0651 | 48.0 | 8016 | 0.5439 | 0.8293 | 0.2695 | 1.2358 | 0.8293 | 0.8301 | 0.0888 | 0.0692 | | 0.0651 | 49.0 | 8183 | 
0.5445 | 0.8287 | 0.2696 | 1.2499 | 0.8287 | 0.8296 | 0.0882 | 0.0695 | | 0.0651 | 50.0 | 8350 | 0.5443 | 0.8293 | 0.2695 | 1.2500 | 0.8293 | 0.8301 | 0.0874 | 0.0693 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
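The run name above (`kd_CEKD_t2.5_a0.5`) suggests knowledge distillation with a cross-entropy term, temperature 2.5, and mixing weight 0.5. The card itself does not spell out the objective, so the following is only a minimal pure-Python sketch of one common CE+KD formulation; the function names, the exact weighting, and the `t**2` scaling are assumptions, not taken from the training code.

```python
import math

def softmax(logits, t=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / t) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_ce_loss(student_logits, teacher_logits, label, t=2.5, alpha=0.5):
    """Hypothetical combined objective:
    alpha * CE(student, label) + (1 - alpha) * t^2 * KL(teacher_t || student_t).
    The t^2 factor keeps gradient magnitudes comparable across temperatures."""
    # Hard-label cross-entropy at temperature 1.
    ce = -math.log(softmax(student_logits)[label])
    # Soft-label KL divergence between temperature-softened distributions.
    ps_t = softmax(student_logits, t)
    pt_t = softmax(teacher_logits, t)
    kl = sum(pt * math.log(pt / ps) for pt, ps in zip(pt_t, ps_t))
    return alpha * ce + (1 - alpha) * (t ** 2) * kl
```

With identical student and teacher logits the KL term vanishes and only the weighted cross-entropy remains.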
loucad/mobilevit-xx-small-finetuned-eurosat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # mobilevit-xx-small-finetuned-eurosat This model is a fine-tuned version of [apple/mobilevit-xx-small](https://huggingface.co/apple/mobilevit-xx-small) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.1926 - Accuracy: 0.9507 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.5074 | 1.0 | 190 | 1.3433 | 0.7078 | | 0.9398 | 2.0 | 380 | 0.7177 | 0.85 | | 0.7035 | 3.0 | 570 | 0.4252 | 0.9070 | | 0.5435 | 4.0 | 760 | 0.3080 | 0.9281 | | 0.5007 | 5.0 | 950 | 0.2465 | 0.9389 | | 0.4533 | 6.0 | 1140 | 0.2291 | 0.9444 | | 0.3961 | 7.0 | 1330 | 0.1991 | 0.9496 | | 0.3949 | 8.0 | 1520 | 0.1926 | 0.9507 | | 0.4302 | 9.0 | 1710 | 0.1928 | 0.95 | | 0.4061 | 10.0 | 1900 | 0.1931 | 0.9463 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu117 - Datasets 2.14.1 - Tokenizers 0.13.3
[ "annualcrop", "forest", "herbaceousvegetation", "highway", "industrial", "pasture", "permanentcrop", "residential", "river", "sealake" ]
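The card above trains with `lr_scheduler_type: linear` and `lr_scheduler_warmup_ratio: 0.1` over 1900 steps. A minimal sketch of what that schedule computes — linear warmup to the base rate, then linear decay to zero — assuming the standard Trainer behaviour (the exact implementation may differ at the boundaries):

```python
def linear_schedule_lr(step, total_steps, base_lr=5e-5, warmup_ratio=0.1):
    """Learning rate at `step`: linear warmup over the first
    warmup_ratio fraction of steps, then linear decay to 0."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        # Ramp from 0 up to base_lr.
        return base_lr * step / max(1, warmup_steps)
    # Decay from base_lr down to 0 over the remaining steps.
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

For the 10-epoch run above (190 steps per epoch), the peak rate of 5e-5 is reached at step 190 and the rate returns to zero at step 1900.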
rdmpage/autotrain-page7-78316140914
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 78316140914 - CO2 Emissions (in grams): 0.8772 ## Validation Metrics - Loss: 0.296 - Accuracy: 0.888 - Macro F1: 0.805 - Micro F1: 0.888 - Weighted F1: 0.865 - Macro Precision: 0.799 - Micro Precision: 0.888 - Weighted Precision: 0.846 - Macro Recall: 0.812 - Micro Recall: 0.888 - Weighted Recall: 0.888
[ "blank", "content", "cover", "end", "endstart", "plate", "start" ]
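The AutoTrain card above reports both Macro F1 (unweighted mean of per-class F1) and Micro F1 (computed from global counts, which for single-label multi-class classification equals accuracy — hence Micro F1 matching Accuracy at 0.888). A generic sketch of the two averages, not AutoTrain's own code:

```python
def f1_scores(y_true, y_pred):
    """Return (macro_f1, micro_f1) for single-label multi-class predictions.
    Macro averages per-class F1 equally; micro pools counts over all classes."""
    labels = sorted(set(y_true) | set(y_pred))
    per_class_f1 = []
    tp_total = 0
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        tp_total += tp
        denom = 2 * tp + fp + fn
        per_class_f1.append(2 * tp / denom if denom else 0.0)
    macro = sum(per_class_f1) / len(labels)
    micro = tp_total / len(y_true)  # equals accuracy in this setting
    return macro, micro
```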
AmirAlahmedy/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-flower This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results ### Framework versions - Transformers 4.24.0 - Pytorch 2.0.1+cu118 - Datasets 2.7.1 - Tokenizers 0.13.3
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
jordyvl/dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_kd_MSE
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_kd_MSE This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.3468 - Accuracy: 0.8335 - Brier Loss: 0.2611 - Nll: 1.2696 - F1 Micro: 0.8335 - F1 Macro: 0.8338 - Ece: 0.0865 - Aurc: 0.0606 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 96 - eval_batch_size: 96 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 167 | 0.9313 | 0.5955 | 0.5896 | 2.2535 | 0.5955 | 0.5760 | 0.2103 | 0.1770 | | No log | 2.0 | 334 | 0.6941 | 0.7027 | 0.4541 | 1.7696 | 0.7027 | 0.6962 | 0.1756 | 0.1075 | | 0.9587 | 3.0 | 501 | 0.5806 | 0.7435 | 0.3758 | 1.8114 | 0.7435 | 0.7489 | 0.1225 | 0.0802 | | 0.9587 | 4.0 | 668 | 0.4847 | 0.7808 | 0.3232 | 1.6093 | 0.7808 | 0.7825 | 0.0897 | 0.0600 | | 0.9587 | 5.0 | 835 | 0.4888 | 0.775 | 0.3291 | 1.7398 | 0.775 | 0.7686 | 0.0629 | 0.0681 | | 0.3668 | 6.0 | 1002 | 0.4441 | 0.789 | 0.3136 | 1.5502 | 0.7890 | 0.7882 | 0.0720 | 0.0626 | | 0.3668 | 7.0 | 1169 | 0.4258 | 0.7993 | 0.3009 | 1.6320 | 0.7993 | 0.7994 | 0.0626 | 0.0658 | | 
0.3668 | 8.0 | 1336 | 0.4813 | 0.774 | 0.3395 | 1.8387 | 0.774 | 0.7788 | 0.0716 | 0.0811 | | 0.1959 | 9.0 | 1503 | 0.4289 | 0.7967 | 0.3111 | 1.7125 | 0.7967 | 0.7985 | 0.0799 | 0.0686 | | 0.1959 | 10.0 | 1670 | 0.4380 | 0.7897 | 0.3151 | 1.8001 | 0.7897 | 0.7920 | 0.0802 | 0.0730 | | 0.1959 | 11.0 | 1837 | 0.4414 | 0.796 | 0.3159 | 1.7450 | 0.796 | 0.7956 | 0.0795 | 0.0815 | | 0.1249 | 12.0 | 2004 | 0.4507 | 0.787 | 0.3243 | 1.7219 | 0.787 | 0.7840 | 0.0841 | 0.0767 | | 0.1249 | 13.0 | 2171 | 0.4209 | 0.802 | 0.3095 | 1.7111 | 0.802 | 0.8045 | 0.0825 | 0.0729 | | 0.1249 | 14.0 | 2338 | 0.4095 | 0.8007 | 0.3039 | 1.5961 | 0.8007 | 0.8018 | 0.0742 | 0.0743 | | 0.088 | 15.0 | 2505 | 0.4043 | 0.8125 | 0.2974 | 1.6100 | 0.8125 | 0.8158 | 0.0801 | 0.0740 | | 0.088 | 16.0 | 2672 | 0.4056 | 0.8083 | 0.2964 | 1.6402 | 0.8083 | 0.8080 | 0.0833 | 0.0681 | | 0.088 | 17.0 | 2839 | 0.4052 | 0.8103 | 0.2993 | 1.6074 | 0.8103 | 0.8105 | 0.0848 | 0.0780 | | 0.0638 | 18.0 | 3006 | 0.4207 | 0.8035 | 0.3066 | 1.6669 | 0.8035 | 0.8075 | 0.0826 | 0.0746 | | 0.0638 | 19.0 | 3173 | 0.3981 | 0.8125 | 0.2911 | 1.5687 | 0.8125 | 0.8128 | 0.0836 | 0.0762 | | 0.0638 | 20.0 | 3340 | 0.3828 | 0.8207 | 0.2803 | 1.5513 | 0.8207 | 0.8217 | 0.0800 | 0.0627 | | 0.0456 | 21.0 | 3507 | 0.3710 | 0.821 | 0.2802 | 1.4355 | 0.821 | 0.8218 | 0.0913 | 0.0662 | | 0.0456 | 22.0 | 3674 | 0.3672 | 0.8247 | 0.2744 | 1.4922 | 0.8247 | 0.8280 | 0.0774 | 0.0615 | | 0.0456 | 23.0 | 3841 | 0.3600 | 0.8255 | 0.2727 | 1.4413 | 0.8255 | 0.8256 | 0.0817 | 0.0675 | | 0.0289 | 24.0 | 4008 | 0.3650 | 0.8235 | 0.2767 | 1.3874 | 0.8235 | 0.8248 | 0.0818 | 0.0698 | | 0.0289 | 25.0 | 4175 | 0.3608 | 0.827 | 0.2706 | 1.3223 | 0.827 | 0.8279 | 0.0861 | 0.0597 | | 0.0289 | 26.0 | 4342 | 0.3572 | 0.829 | 0.2687 | 1.3947 | 0.8290 | 0.8300 | 0.0878 | 0.0650 | | 0.0176 | 27.0 | 4509 | 0.3516 | 0.8315 | 0.2655 | 1.3000 | 0.8315 | 0.8319 | 0.0866 | 0.0597 | | 0.0176 | 28.0 | 4676 | 0.3455 | 0.8337 | 0.2626 | 1.3070 | 0.8337 | 0.8351 | 
0.0870 | 0.0602 | | 0.0176 | 29.0 | 4843 | 0.3489 | 0.8337 | 0.2656 | 1.3027 | 0.8337 | 0.8347 | 0.0859 | 0.0587 | | 0.011 | 30.0 | 5010 | 0.3472 | 0.8327 | 0.2639 | 1.2879 | 0.8327 | 0.8336 | 0.0878 | 0.0599 | | 0.011 | 31.0 | 5177 | 0.3468 | 0.8335 | 0.2642 | 1.2955 | 0.8335 | 0.8341 | 0.0859 | 0.0650 | | 0.011 | 32.0 | 5344 | 0.3467 | 0.8333 | 0.2635 | 1.2911 | 0.8333 | 0.8341 | 0.0849 | 0.0588 | | 0.0076 | 33.0 | 5511 | 0.3430 | 0.834 | 0.2601 | 1.2738 | 0.834 | 0.8346 | 0.0831 | 0.0609 | | 0.0076 | 34.0 | 5678 | 0.3442 | 0.8345 | 0.2626 | 1.2921 | 0.8345 | 0.8353 | 0.0864 | 0.0629 | | 0.0076 | 35.0 | 5845 | 0.3431 | 0.8355 | 0.2596 | 1.2790 | 0.8355 | 0.8362 | 0.0860 | 0.0589 | | 0.0055 | 36.0 | 6012 | 0.3496 | 0.8297 | 0.2646 | 1.2985 | 0.8297 | 0.8305 | 0.0897 | 0.0642 | | 0.0055 | 37.0 | 6179 | 0.3445 | 0.8343 | 0.2605 | 1.2509 | 0.8343 | 0.8348 | 0.0862 | 0.0594 | | 0.0055 | 38.0 | 6346 | 0.3473 | 0.831 | 0.2628 | 1.2919 | 0.831 | 0.8314 | 0.0881 | 0.0616 | | 0.0041 | 39.0 | 6513 | 0.3445 | 0.8325 | 0.2625 | 1.2894 | 0.8325 | 0.8330 | 0.0880 | 0.0619 | | 0.0041 | 40.0 | 6680 | 0.3462 | 0.8317 | 0.2614 | 1.2840 | 0.8317 | 0.8323 | 0.0844 | 0.0599 | | 0.0041 | 41.0 | 6847 | 0.3437 | 0.833 | 0.2602 | 1.2694 | 0.833 | 0.8336 | 0.0871 | 0.0598 | | 0.003 | 42.0 | 7014 | 0.3456 | 0.8347 | 0.2605 | 1.2867 | 0.8347 | 0.8352 | 0.0844 | 0.0615 | | 0.003 | 43.0 | 7181 | 0.3454 | 0.8347 | 0.2607 | 1.2844 | 0.8347 | 0.8354 | 0.0868 | 0.0598 | | 0.003 | 44.0 | 7348 | 0.3451 | 0.8337 | 0.2599 | 1.2719 | 0.8337 | 0.8342 | 0.0832 | 0.0595 | | 0.0022 | 45.0 | 7515 | 0.3460 | 0.8343 | 0.2607 | 1.2750 | 0.8343 | 0.8346 | 0.0844 | 0.0591 | | 0.0022 | 46.0 | 7682 | 0.3461 | 0.8325 | 0.2607 | 1.2774 | 0.8325 | 0.8328 | 0.0861 | 0.0604 | | 0.0022 | 47.0 | 7849 | 0.3465 | 0.8335 | 0.2610 | 1.2776 | 0.8335 | 0.8338 | 0.0834 | 0.0600 | | 0.0018 | 48.0 | 8016 | 0.3468 | 0.8327 | 0.2609 | 1.2758 | 0.8327 | 0.8331 | 0.0865 | 0.0605 | | 0.0018 | 49.0 | 8183 | 0.3466 | 0.8333 | 0.2610 | 
1.2724 | 0.8333 | 0.8336 | 0.0858 | 0.0609 | | 0.0018 | 50.0 | 8350 | 0.3468 | 0.8335 | 0.2611 | 1.2696 | 0.8335 | 0.8338 | 0.0865 | 0.0606 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
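Both distillation cards above report calibration metrics (Brier Loss, ECE) alongside accuracy. The cards do not define them, so here is a small sketch using common conventions — per-example multi-class Brier score against a one-hot label, and equal-width-bin Expected Calibration Error; the evaluation code behind the tables may use different conventions:

```python
def brier_score(probs, label):
    """Squared error between a predicted distribution and the one-hot label."""
    return sum((p - (1.0 if i == label else 0.0)) ** 2
               for i, p in enumerate(probs))

def ece(conf_correct_pairs, n_bins=10):
    """Expected Calibration Error: bin predictions by confidence, then take
    the count-weighted mean of |average confidence - accuracy| per bin."""
    bins = [[] for _ in range(n_bins)]
    for conf, correct in conf_correct_pairs:
        bins[min(int(conf * n_bins), n_bins - 1)].append((conf, correct))
    total = len(conf_correct_pairs)
    err = 0.0
    for b in bins:
        if not b:
            continue
        avg_conf = sum(c for c, _ in b) / len(b)
        accuracy = sum(1 for _, ok in b if ok) / len(b)
        err += (len(b) / total) * abs(avg_conf - accuracy)
    return err
```

A model that is 90% confident but always right contributes |0.9 - 1.0| = 0.1 to ECE; perfectly calibrated confidences contribute zero.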
minatosnow/swinv2-tiny-patch4-window8-256-mineral
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swinv2-tiny-patch4-window8-256-mineral This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window8-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window8-256) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 5.2045 - Accuracy: 0.2467 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 500 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:------:|:----:|:---------------:|:--------:| | 5.692 | 0.96 | 18 | 5.6904 | 0.0017 | | 5.6759 | 1.97 | 37 | 5.6829 | 0.0017 | | 5.6699 | 2.99 | 56 | 5.6722 | 0.0033 | | 5.6484 | 4.0 | 75 | 5.6601 | 0.0033 | | 5.6355 | 4.96 | 93 | 5.6486 | 0.005 | | 5.6088 | 5.97 | 112 | 5.6375 | 0.0067 | | 5.5941 | 6.99 | 131 | 5.6268 | 0.0083 | | 5.5636 | 8.0 | 150 | 5.6131 | 0.0117 | | 5.5499 | 8.96 | 168 | 5.5987 | 0.0117 | | 5.5038 | 9.97 | 187 | 5.5798 | 0.0083 | | 5.437 | 10.99 | 206 | 5.5757 | 0.0133 | | 5.3642 | 12.0 | 225 | 5.5482 | 0.0133 | | 5.2877 | 12.96 | 243 | 5.5107 | 0.0183 | | 5.1992 | 13.97 | 262 | 5.4744 | 0.025 | | 5.0956 | 14.99 | 281 | 5.4006 | 0.0333 | | 4.9566 | 16.0 | 300 | 5.3490 | 0.0433 | | 4.8827 | 16.96 | 318 | 5.2856 | 0.0517 | | 4.7089 | 17.97 | 337 | 5.1565 | 0.075 | | 4.5187 | 18.99 | 356 | 5.0459 | 0.0817 | | 4.4164 | 20.0 | 375 | 
4.9614 | 0.0983 | | 4.2402 | 20.96 | 393 | 4.8814 | 0.0983 | | 4.0046 | 21.97 | 412 | 4.7763 | 0.1167 | | 3.8336 | 22.99 | 431 | 4.6775 | 0.125 | | 3.6379 | 24.0 | 450 | 4.6089 | 0.14 | | 3.4647 | 24.96 | 468 | 4.5512 | 0.1433 | | 3.3663 | 25.97 | 487 | 4.4674 | 0.1533 | | 3.1445 | 26.99 | 506 | 4.4247 | 0.15 | | 3.0248 | 28.0 | 525 | 4.3519 | 0.1633 | | 2.8415 | 28.96 | 543 | 4.3256 | 0.1633 | | 2.6664 | 29.97 | 562 | 4.3015 | 0.18 | | 2.5225 | 30.99 | 581 | 4.2571 | 0.1867 | | 2.4433 | 32.0 | 600 | 4.2507 | 0.1833 | | 2.3169 | 32.96 | 618 | 4.2097 | 0.1933 | | 2.145 | 33.97 | 637 | 4.1741 | 0.1883 | | 2.0045 | 34.99 | 656 | 4.1681 | 0.1967 | | 1.9272 | 36.0 | 675 | 4.1376 | 0.2083 | | 1.7761 | 36.96 | 693 | 4.1391 | 0.1933 | | 1.7038 | 37.97 | 712 | 4.0957 | 0.205 | | 1.5902 | 38.99 | 731 | 4.1309 | 0.2067 | | 1.4921 | 40.0 | 750 | 4.1324 | 0.2033 | | 1.3552 | 40.96 | 768 | 4.1510 | 0.2067 | | 1.2597 | 41.97 | 787 | 4.1636 | 0.205 | | 1.2909 | 42.99 | 806 | 4.1422 | 0.2267 | | 1.1661 | 44.0 | 825 | 4.1553 | 0.21 | | 1.1031 | 44.96 | 843 | 4.1976 | 0.2167 | | 1.0203 | 45.97 | 862 | 4.1673 | 0.2183 | | 0.9883 | 46.99 | 881 | 4.1628 | 0.2167 | | 0.976 | 48.0 | 900 | 4.1852 | 0.2267 | | 0.8848 | 48.96 | 918 | 4.2108 | 0.2167 | | 0.8616 | 49.97 | 937 | 4.2583 | 0.2133 | | 0.8079 | 50.99 | 956 | 4.1821 | 0.23 | | 0.8256 | 52.0 | 975 | 4.2021 | 0.2217 | | 0.7439 | 52.96 | 993 | 4.2577 | 0.2167 | | 0.6863 | 53.97 | 1012 | 4.2398 | 0.225 | | 0.715 | 54.99 | 1031 | 4.2934 | 0.21 | | 0.6924 | 56.0 | 1050 | 4.2675 | 0.225 | | 0.6454 | 56.96 | 1068 | 4.2916 | 0.2117 | | 0.5688 | 57.97 | 1087 | 4.3216 | 0.2217 | | 0.5958 | 58.99 | 1106 | 4.3202 | 0.2133 | | 0.5834 | 60.0 | 1125 | 4.3022 | 0.2267 | | 0.5583 | 60.96 | 1143 | 4.3746 | 0.225 | | 0.565 | 61.97 | 1162 | 4.3334 | 0.215 | | 0.55 | 62.99 | 1181 | 4.3330 | 0.2217 | | 0.4972 | 64.0 | 1200 | 4.3866 | 0.2183 | | 0.4895 | 64.96 | 1218 | 4.3797 | 0.2267 | | 0.4537 | 65.97 | 1237 | 4.3761 | 0.2267 | | 0.4941 | 66.99 | 1256 | 
4.4122 | 0.2317 | | 0.4422 | 68.0 | 1275 | 4.4636 | 0.2217 | | 0.3731 | 68.96 | 1293 | 4.4118 | 0.2217 | | 0.4338 | 69.97 | 1312 | 4.4002 | 0.2217 | | 0.4482 | 70.99 | 1331 | 4.4558 | 0.2233 | | 0.4206 | 72.0 | 1350 | 4.4238 | 0.23 | | 0.4009 | 72.96 | 1368 | 4.4717 | 0.2267 | | 0.4297 | 73.97 | 1387 | 4.5047 | 0.23 | | 0.4031 | 74.99 | 1406 | 4.4637 | 0.2233 | | 0.3489 | 76.0 | 1425 | 4.4878 | 0.2183 | | 0.4202 | 76.96 | 1443 | 4.4785 | 0.23 | | 0.3307 | 77.97 | 1462 | 4.4647 | 0.2267 | | 0.3919 | 78.99 | 1481 | 4.5011 | 0.2233 | | 0.3511 | 80.0 | 1500 | 4.4895 | 0.2367 | | 0.334 | 80.96 | 1518 | 4.4878 | 0.2283 | | 0.3958 | 81.97 | 1537 | 4.5025 | 0.2367 | | 0.316 | 82.99 | 1556 | 4.5792 | 0.2317 | | 0.3135 | 84.0 | 1575 | 4.4894 | 0.2333 | | 0.3292 | 84.96 | 1593 | 4.5495 | 0.235 | | 0.3116 | 85.97 | 1612 | 4.5069 | 0.2317 | | 0.3162 | 86.99 | 1631 | 4.5407 | 0.23 | | 0.3173 | 88.0 | 1650 | 4.5997 | 0.2217 | | 0.2857 | 88.96 | 1668 | 4.5476 | 0.2283 | | 0.2869 | 89.97 | 1687 | 4.5655 | 0.2217 | | 0.299 | 90.99 | 1706 | 4.5743 | 0.24 | | 0.3142 | 92.0 | 1725 | 4.5478 | 0.23 | | 0.3066 | 92.96 | 1743 | 4.5509 | 0.2367 | | 0.2542 | 93.97 | 1762 | 4.5452 | 0.225 | | 0.2707 | 94.99 | 1781 | 4.5017 | 0.225 | | 0.2781 | 96.0 | 1800 | 4.5425 | 0.23 | | 0.2611 | 96.96 | 1818 | 4.6214 | 0.2233 | | 0.2816 | 97.97 | 1837 | 4.5780 | 0.2233 | | 0.2698 | 98.99 | 1856 | 4.5237 | 0.225 | | 0.2568 | 100.0 | 1875 | 4.6262 | 0.2167 | | 0.2925 | 100.96 | 1893 | 4.6138 | 0.2133 | | 0.2542 | 101.97 | 1912 | 4.6329 | 0.22 | | 0.2276 | 102.99 | 1931 | 4.5854 | 0.23 | | 0.2701 | 104.0 | 1950 | 4.5662 | 0.2167 | | 0.2492 | 104.96 | 1968 | 4.6098 | 0.225 | | 0.2394 | 105.97 | 1987 | 4.6215 | 0.2333 | | 0.2409 | 106.99 | 2006 | 4.6277 | 0.2317 | | 0.2578 | 108.0 | 2025 | 4.6173 | 0.2267 | | 0.229 | 108.96 | 2043 | 4.6405 | 0.2283 | | 0.2438 | 109.97 | 2062 | 4.6041 | 0.2233 | | 0.2441 | 110.99 | 2081 | 4.6367 | 0.23 | | 0.2353 | 112.0 | 2100 | 4.5877 | 0.2467 | | 0.1876 | 112.96 | 2118 | 
4.6518 | 0.2333 | | 0.1996 | 113.97 | 2137 | 4.6810 | 0.2367 | | 0.1908 | 114.99 | 2156 | 4.6483 | 0.245 | | 0.1905 | 116.0 | 2175 | 4.6117 | 0.22 | | 0.1995 | 116.96 | 2193 | 4.6283 | 0.2183 | | 0.2044 | 117.97 | 2212 | 4.6130 | 0.2283 | | 0.2156 | 118.99 | 2231 | 4.6729 | 0.2167 | | 0.2003 | 120.0 | 2250 | 4.6124 | 0.24 | | 0.1861 | 120.96 | 2268 | 4.6839 | 0.23 | | 0.2072 | 121.97 | 2287 | 4.7217 | 0.2267 | | 0.1973 | 122.99 | 2306 | 4.7596 | 0.23 | | 0.2191 | 124.0 | 2325 | 4.7394 | 0.2283 | | 0.1738 | 124.96 | 2343 | 4.7356 | 0.23 | | 0.1669 | 125.97 | 2362 | 4.7210 | 0.235 | | 0.1971 | 126.99 | 2381 | 4.6826 | 0.23 | | 0.1972 | 128.0 | 2400 | 4.7256 | 0.2233 | | 0.1794 | 128.96 | 2418 | 4.6589 | 0.235 | | 0.1894 | 129.97 | 2437 | 4.7391 | 0.2317 | | 0.1854 | 130.99 | 2456 | 4.7441 | 0.23 | | 0.1551 | 132.0 | 2475 | 4.7559 | 0.2233 | | 0.179 | 132.96 | 2493 | 4.7555 | 0.235 | | 0.2235 | 133.97 | 2512 | 4.7686 | 0.2183 | | 0.179 | 134.99 | 2531 | 4.7334 | 0.2283 | | 0.1595 | 136.0 | 2550 | 4.7324 | 0.2267 | | 0.1598 | 136.96 | 2568 | 4.7099 | 0.245 | | 0.178 | 137.97 | 2587 | 4.7363 | 0.2317 | | 0.1976 | 138.99 | 2606 | 4.7806 | 0.2217 | | 0.153 | 140.0 | 2625 | 4.8128 | 0.2267 | | 0.1946 | 140.96 | 2643 | 4.7925 | 0.2233 | | 0.1597 | 141.97 | 2662 | 4.8279 | 0.2117 | | 0.1515 | 142.99 | 2681 | 4.8203 | 0.2233 | | 0.2013 | 144.0 | 2700 | 4.7393 | 0.23 | | 0.1431 | 144.96 | 2718 | 4.7866 | 0.2367 | | 0.1864 | 145.97 | 2737 | 4.7730 | 0.2317 | | 0.1676 | 146.99 | 2756 | 4.8112 | 0.2333 | | 0.1502 | 148.0 | 2775 | 4.8374 | 0.215 | | 0.1584 | 148.96 | 2793 | 4.8600 | 0.2317 | | 0.1901 | 149.97 | 2812 | 4.8647 | 0.2167 | | 0.1347 | 150.99 | 2831 | 4.8073 | 0.215 | | 0.1859 | 152.0 | 2850 | 4.8766 | 0.2283 | | 0.1483 | 152.96 | 2868 | 4.8392 | 0.23 | | 0.149 | 153.97 | 2887 | 4.8734 | 0.215 | | 0.1411 | 154.99 | 2906 | 4.9380 | 0.2083 | | 0.1589 | 156.0 | 2925 | 4.8365 | 0.2367 | | 0.1608 | 156.96 | 2943 | 4.8053 | 0.23 | | 0.1759 | 157.97 | 2962 | 4.8690 | 0.2383 | 
| 0.142 | 158.99 | 2981 | 4.8490 | 0.23 | | 0.1433 | 160.0 | 3000 | 4.8658 | 0.2267 | | 0.1407 | 160.96 | 3018 | 4.8936 | 0.2267 | | 0.1442 | 161.97 | 3037 | 4.8534 | 0.235 | | 0.1669 | 162.99 | 3056 | 4.9050 | 0.2317 | | 0.1384 | 164.0 | 3075 | 4.8717 | 0.235 | | 0.1366 | 164.96 | 3093 | 4.8951 | 0.2317 | | 0.1428 | 165.97 | 3112 | 4.9253 | 0.225 | | 0.1376 | 166.99 | 3131 | 4.9108 | 0.22 | | 0.1238 | 168.0 | 3150 | 4.8965 | 0.2267 | | 0.1129 | 168.96 | 3168 | 4.8446 | 0.2317 | | 0.1296 | 169.97 | 3187 | 4.9100 | 0.2217 | | 0.1383 | 170.99 | 3206 | 4.8886 | 0.2333 | | 0.1511 | 172.0 | 3225 | 4.8883 | 0.235 | | 0.1426 | 172.96 | 3243 | 4.8783 | 0.2333 | | 0.1112 | 173.97 | 3262 | 4.9305 | 0.225 | | 0.1456 | 174.99 | 3281 | 4.8354 | 0.2417 | | 0.1343 | 176.0 | 3300 | 4.8553 | 0.225 | | 0.1133 | 176.96 | 3318 | 4.8739 | 0.2317 | | 0.1213 | 177.97 | 3337 | 4.8865 | 0.2333 | | 0.1309 | 178.99 | 3356 | 4.9231 | 0.22 | | 0.1197 | 180.0 | 3375 | 4.8976 | 0.2383 | | 0.1619 | 180.96 | 3393 | 4.8812 | 0.2383 | | 0.1254 | 181.97 | 3412 | 4.8260 | 0.225 | | 0.0934 | 182.99 | 3431 | 4.8645 | 0.2283 | | 0.1156 | 184.0 | 3450 | 4.8253 | 0.24 | | 0.1008 | 184.96 | 3468 | 4.8692 | 0.2467 | | 0.1273 | 185.97 | 3487 | 4.9049 | 0.24 | | 0.1352 | 186.99 | 3506 | 4.8660 | 0.2333 | | 0.1411 | 188.0 | 3525 | 4.8252 | 0.2333 | | 0.124 | 188.96 | 3543 | 4.8715 | 0.2317 | | 0.1121 | 189.97 | 3562 | 4.8768 | 0.24 | | 0.1337 | 190.99 | 3581 | 4.9004 | 0.2433 | | 0.106 | 192.0 | 3600 | 4.8300 | 0.2333 | | 0.1044 | 192.96 | 3618 | 4.7903 | 0.25 | | 0.1259 | 193.97 | 3637 | 4.8227 | 0.2267 | | 0.1182 | 194.99 | 3656 | 4.8286 | 0.2283 | | 0.1269 | 196.0 | 3675 | 4.8475 | 0.2467 | | 0.1304 | 196.96 | 3693 | 4.8759 | 0.2367 | | 0.1086 | 197.97 | 3712 | 4.8773 | 0.2283 | | 0.1184 | 198.99 | 3731 | 4.8875 | 0.2317 | | 0.114 | 200.0 | 3750 | 4.9083 | 0.2283 | | 0.1028 | 200.96 | 3768 | 4.8888 | 0.2283 | | 0.1355 | 201.97 | 3787 | 4.8358 | 0.23 | | 0.1038 | 202.99 | 3806 | 4.8567 | 0.23 | | 0.1122 | 
204.0 | 3825 | 4.8578 | 0.2317 | | 0.0864 | 204.96 | 3843 | 4.9288 | 0.2333 | | 0.0969 | 205.97 | 3862 | 4.8940 | 0.2567 | | 0.0986 | 206.99 | 3881 | 4.8632 | 0.2483 | | 0.1291 | 208.0 | 3900 | 4.9114 | 0.2433 | | 0.1244 | 208.96 | 3918 | 4.9076 | 0.2483 | | 0.1112 | 209.97 | 3937 | 4.8954 | 0.245 | | 0.1032 | 210.99 | 3956 | 4.8885 | 0.2433 | | 0.0919 | 212.0 | 3975 | 4.9197 | 0.2433 | | 0.1153 | 212.96 | 3993 | 4.9020 | 0.25 | | 0.0978 | 213.97 | 4012 | 4.9312 | 0.2433 | | 0.1316 | 214.99 | 4031 | 4.9504 | 0.2417 | | 0.1222 | 216.0 | 4050 | 4.9403 | 0.2367 | | 0.1108 | 216.96 | 4068 | 4.8719 | 0.2483 | | 0.0996 | 217.97 | 4087 | 4.8812 | 0.24 | | 0.0907 | 218.99 | 4106 | 4.9203 | 0.25 | | 0.0974 | 220.0 | 4125 | 4.9728 | 0.2417 | | 0.1127 | 220.96 | 4143 | 4.9708 | 0.2467 | | 0.0891 | 221.97 | 4162 | 5.0154 | 0.2517 | | 0.0973 | 222.99 | 4181 | 4.9533 | 0.24 | | 0.0912 | 224.0 | 4200 | 4.9306 | 0.2333 | | 0.0971 | 224.96 | 4218 | 4.9873 | 0.2283 | | 0.1069 | 225.97 | 4237 | 4.9154 | 0.2433 | | 0.1163 | 226.99 | 4256 | 4.9178 | 0.2367 | | 0.1139 | 228.0 | 4275 | 4.9395 | 0.2383 | | 0.0816 | 228.96 | 4293 | 4.8986 | 0.2333 | | 0.0899 | 229.97 | 4312 | 4.9542 | 0.2517 | | 0.0984 | 230.99 | 4331 | 4.9305 | 0.2433 | | 0.0732 | 232.0 | 4350 | 4.9264 | 0.2417 | | 0.0947 | 232.96 | 4368 | 4.9676 | 0.235 | | 0.0826 | 233.97 | 4387 | 5.0179 | 0.235 | | 0.086 | 234.99 | 4406 | 4.9719 | 0.2283 | | 0.1055 | 236.0 | 4425 | 4.9457 | 0.235 | | 0.0872 | 236.96 | 4443 | 4.9355 | 0.2533 | | 0.0826 | 237.97 | 4462 | 4.9467 | 0.245 | | 0.081 | 238.99 | 4481 | 5.0036 | 0.245 | | 0.0746 | 240.0 | 4500 | 5.0125 | 0.245 | | 0.0725 | 240.96 | 4518 | 5.0061 | 0.2367 | | 0.1104 | 241.97 | 4537 | 5.0150 | 0.2483 | | 0.0885 | 242.99 | 4556 | 5.0063 | 0.2333 | | 0.0847 | 244.0 | 4575 | 4.9919 | 0.2383 | | 0.0751 | 244.96 | 4593 | 4.9881 | 0.2367 | | 0.0872 | 245.97 | 4612 | 5.0013 | 0.2383 | | 0.0771 | 246.99 | 4631 | 5.0273 | 0.235 | | 0.0941 | 248.0 | 4650 | 5.0507 | 0.2317 | | 0.116 | 
248.96 | 4668 | 5.0491 | 0.245 | | 0.0733 | 249.97 | 4687 | 5.0391 | 0.24 | | 0.0821 | 250.99 | 4706 | 5.0231 | 0.235 | | 0.075 | 252.0 | 4725 | 5.0388 | 0.2317 | | 0.0885 | 252.96 | 4743 | 4.9838 | 0.2383 | | 0.0759 | 253.97 | 4762 | 4.9536 | 0.2433 | | 0.0773 | 254.99 | 4781 | 5.0145 | 0.2383 | | 0.0707 | 256.0 | 4800 | 5.0352 | 0.2217 | | 0.0861 | 256.96 | 4818 | 5.0295 | 0.2483 | | 0.0739 | 257.97 | 4837 | 5.0354 | 0.2433 | | 0.0822 | 258.99 | 4856 | 5.0563 | 0.245 | | 0.0681 | 260.0 | 4875 | 5.0407 | 0.245 | | 0.0872 | 260.96 | 4893 | 5.0511 | 0.2367 | | 0.079 | 261.97 | 4912 | 5.1087 | 0.23 | | 0.0733 | 262.99 | 4931 | 5.0523 | 0.225 | | 0.097 | 264.0 | 4950 | 5.0368 | 0.2317 | | 0.0669 | 264.96 | 4968 | 5.0501 | 0.2283 | | 0.0801 | 265.97 | 4987 | 5.0515 | 0.235 | | 0.0894 | 266.99 | 5006 | 5.0182 | 0.245 | | 0.0815 | 268.0 | 5025 | 5.0713 | 0.23 | | 0.0934 | 268.96 | 5043 | 5.0082 | 0.2417 | | 0.0728 | 269.97 | 5062 | 5.0473 | 0.23 | | 0.088 | 270.99 | 5081 | 5.0689 | 0.2267 | | 0.0706 | 272.0 | 5100 | 5.0403 | 0.2383 | | 0.0931 | 272.96 | 5118 | 5.0298 | 0.235 | | 0.0784 | 273.97 | 5137 | 5.0141 | 0.2367 | | 0.0831 | 274.99 | 5156 | 5.0314 | 0.2417 | | 0.0624 | 276.0 | 5175 | 5.0445 | 0.2383 | | 0.0819 | 276.96 | 5193 | 5.0632 | 0.2517 | | 0.0714 | 277.97 | 5212 | 5.0520 | 0.255 | | 0.0893 | 278.99 | 5231 | 5.0075 | 0.2533 | | 0.0777 | 280.0 | 5250 | 5.0122 | 0.24 | | 0.0686 | 280.96 | 5268 | 5.0477 | 0.2333 | | 0.0849 | 281.97 | 5287 | 5.0238 | 0.2433 | | 0.0969 | 282.99 | 5306 | 5.0061 | 0.2383 | | 0.0906 | 284.0 | 5325 | 5.0771 | 0.2517 | | 0.0843 | 284.96 | 5343 | 5.0882 | 0.2417 | | 0.0538 | 285.97 | 5362 | 5.0800 | 0.2467 | | 0.0678 | 286.99 | 5381 | 5.0976 | 0.2367 | | 0.0729 | 288.0 | 5400 | 5.0817 | 0.2333 | | 0.0922 | 288.96 | 5418 | 5.0955 | 0.2433 | | 0.0684 | 289.97 | 5437 | 5.0873 | 0.2467 | | 0.082 | 290.99 | 5456 | 5.1424 | 0.2383 | | 0.0798 | 292.0 | 5475 | 5.0631 | 0.2433 | | 0.0781 | 292.96 | 5493 | 5.0474 | 0.2517 | | 0.0857 | 293.97 | 
5512 | 5.0502 | 0.2417 | | 0.0733 | 294.99 | 5531 | 5.0599 | 0.235 | | 0.072 | 296.0 | 5550 | 5.0549 | 0.2433 | | 0.101 | 296.96 | 5568 | 5.0783 | 0.24 | | 0.0982 | 297.97 | 5587 | 5.1048 | 0.24 | | 0.0782 | 298.99 | 5606 | 5.1096 | 0.235 | | 0.0654 | 300.0 | 5625 | 5.1158 | 0.2333 | | 0.0761 | 300.96 | 5643 | 5.0416 | 0.2383 | | 0.079 | 301.97 | 5662 | 5.1202 | 0.235 | | 0.0845 | 302.99 | 5681 | 5.1711 | 0.2333 | | 0.0718 | 304.0 | 5700 | 5.1572 | 0.2383 | | 0.0675 | 304.96 | 5718 | 5.1230 | 0.235 | | 0.0683 | 305.97 | 5737 | 5.1464 | 0.23 | | 0.0899 | 306.99 | 5756 | 5.0815 | 0.24 | | 0.0692 | 308.0 | 5775 | 5.0867 | 0.2367 | | 0.0843 | 308.96 | 5793 | 5.1006 | 0.2283 | | 0.0742 | 309.97 | 5812 | 5.0545 | 0.2317 | | 0.0689 | 310.99 | 5831 | 5.0509 | 0.2317 | | 0.0714 | 312.0 | 5850 | 5.0536 | 0.2367 | | 0.0669 | 312.96 | 5868 | 5.0488 | 0.2433 | | 0.0726 | 313.97 | 5887 | 5.0903 | 0.2433 | | 0.0509 | 314.99 | 5906 | 5.1158 | 0.2433 | | 0.062 | 316.0 | 5925 | 5.0651 | 0.2417 | | 0.0528 | 316.96 | 5943 | 5.0545 | 0.2467 | | 0.0627 | 317.97 | 5962 | 5.0730 | 0.2417 | | 0.0678 | 318.99 | 5981 | 5.0354 | 0.24 | | 0.0603 | 320.0 | 6000 | 5.0496 | 0.2483 | | 0.0703 | 320.96 | 6018 | 5.0788 | 0.2367 | | 0.0797 | 321.97 | 6037 | 5.1071 | 0.24 | | 0.0914 | 322.99 | 6056 | 5.0996 | 0.24 | | 0.0688 | 324.0 | 6075 | 5.0954 | 0.24 | | 0.0591 | 324.96 | 6093 | 5.1014 | 0.245 | | 0.0622 | 325.97 | 6112 | 5.0859 | 0.24 | | 0.0715 | 326.99 | 6131 | 5.0557 | 0.2533 | | 0.0483 | 328.0 | 6150 | 5.1148 | 0.2383 | | 0.0922 | 328.96 | 6168 | 5.1338 | 0.24 | | 0.0588 | 329.97 | 6187 | 5.1553 | 0.245 | | 0.0615 | 330.99 | 6206 | 5.1083 | 0.2483 | | 0.0508 | 332.0 | 6225 | 5.1167 | 0.245 | | 0.0691 | 332.96 | 6243 | 5.1116 | 0.2433 | | 0.0684 | 333.97 | 6262 | 5.1211 | 0.2417 | | 0.0519 | 334.99 | 6281 | 5.1481 | 0.2433 | | 0.0571 | 336.0 | 6300 | 5.1662 | 0.245 | | 0.038 | 336.96 | 6318 | 5.1616 | 0.245 | | 0.0589 | 337.97 | 6337 | 5.1507 | 0.2483 | | 0.0609 | 338.99 | 6356 | 5.1083 | 
0.245 | | 0.0484 | 340.0 | 6375 | 5.1392 | 0.2417 | | 0.0842 | 340.96 | 6393 | 5.1476 | 0.2483 | | 0.0569 | 341.97 | 6412 | 5.1547 | 0.245 | | 0.0626 | 342.99 | 6431 | 5.1824 | 0.2383 | | 0.0399 | 344.0 | 6450 | 5.1972 | 0.2417 | | 0.0803 | 344.96 | 6468 | 5.1678 | 0.2433 | | 0.0533 | 345.97 | 6487 | 5.1815 | 0.2417 | | 0.0542 | 346.99 | 6506 | 5.1768 | 0.2383 | | 0.0624 | 348.0 | 6525 | 5.1759 | 0.2333 | | 0.055 | 348.96 | 6543 | 5.1954 | 0.2433 | | 0.0678 | 349.97 | 6562 | 5.1478 | 0.2417 | | 0.0557 | 350.99 | 6581 | 5.1236 | 0.24 | | 0.0581 | 352.0 | 6600 | 5.1462 | 0.245 | | 0.0509 | 352.96 | 6618 | 5.1428 | 0.235 | | 0.0636 | 353.97 | 6637 | 5.1659 | 0.24 | | 0.0671 | 354.99 | 6656 | 5.1583 | 0.24 | | 0.0768 | 356.0 | 6675 | 5.1143 | 0.2283 | | 0.0719 | 356.96 | 6693 | 5.1047 | 0.235 | | 0.0653 | 357.97 | 6712 | 5.1262 | 0.2467 | | 0.0646 | 358.99 | 6731 | 5.1456 | 0.245 | | 0.0466 | 360.0 | 6750 | 5.1514 | 0.2517 | | 0.0447 | 360.96 | 6768 | 5.1479 | 0.24 | | 0.0671 | 361.97 | 6787 | 5.1508 | 0.2383 | | 0.0618 | 362.99 | 6806 | 5.1520 | 0.2367 | | 0.0559 | 364.0 | 6825 | 5.1495 | 0.2367 | | 0.0745 | 364.96 | 6843 | 5.1606 | 0.235 | | 0.0542 | 365.97 | 6862 | 5.1888 | 0.235 | | 0.0737 | 366.99 | 6881 | 5.2100 | 0.2417 | | 0.0524 | 368.0 | 6900 | 5.2309 | 0.2317 | | 0.0548 | 368.96 | 6918 | 5.2278 | 0.2283 | | 0.0742 | 369.97 | 6937 | 5.1995 | 0.2383 | | 0.0512 | 370.99 | 6956 | 5.2024 | 0.24 | | 0.0529 | 372.0 | 6975 | 5.1839 | 0.24 | | 0.0647 | 372.96 | 6993 | 5.2155 | 0.2433 | | 0.055 | 373.97 | 7012 | 5.1983 | 0.2417 | | 0.0586 | 374.99 | 7031 | 5.1605 | 0.2467 | | 0.0619 | 376.0 | 7050 | 5.1739 | 0.24 | | 0.0462 | 376.96 | 7068 | 5.1929 | 0.2467 | | 0.0406 | 377.97 | 7087 | 5.2178 | 0.2467 | | 0.0578 | 378.99 | 7106 | 5.2028 | 0.2433 | | 0.0643 | 380.0 | 7125 | 5.1912 | 0.245 | | 0.0442 | 380.96 | 7143 | 5.2069 | 0.2417 | | 0.0642 | 381.97 | 7162 | 5.1740 | 0.2417 | | 0.0423 | 382.99 | 7181 | 5.1597 | 0.2417 | | 0.0663 | 384.0 | 7200 | 5.1497 | 0.2467 | | 
0.0539 | 384.96 | 7218 | 5.1274 | 0.2467 | | 0.0426 | 385.97 | 7237 | 5.1322 | 0.24 | | 0.0489 | 386.99 | 7256 | 5.1582 | 0.2433 | | 0.0591 | 388.0 | 7275 | 5.1627 | 0.245 | | 0.0528 | 388.96 | 7293 | 5.1561 | 0.2483 | | 0.0666 | 389.97 | 7312 | 5.1548 | 0.2433 | | 0.053 | 390.99 | 7331 | 5.1537 | 0.2433 | | 0.0546 | 392.0 | 7350 | 5.1460 | 0.25 | | 0.0546 | 392.96 | 7368 | 5.1467 | 0.2467 | | 0.0334 | 393.97 | 7387 | 5.1394 | 0.25 | | 0.0548 | 394.99 | 7406 | 5.1488 | 0.2417 | | 0.0441 | 396.0 | 7425 | 5.1643 | 0.2467 | | 0.0476 | 396.96 | 7443 | 5.1578 | 0.255 | | 0.0602 | 397.97 | 7462 | 5.1754 | 0.2517 | | 0.0455 | 398.99 | 7481 | 5.1758 | 0.2533 | | 0.0364 | 400.0 | 7500 | 5.1471 | 0.255 | | 0.0479 | 400.96 | 7518 | 5.1424 | 0.2517 | | 0.052 | 401.97 | 7537 | 5.1626 | 0.245 | | 0.0521 | 402.99 | 7556 | 5.1680 | 0.2533 | | 0.0527 | 404.0 | 7575 | 5.1557 | 0.2517 | | 0.0534 | 404.96 | 7593 | 5.1781 | 0.2567 | | 0.0431 | 405.97 | 7612 | 5.1900 | 0.2533 | | 0.0506 | 406.99 | 7631 | 5.1651 | 0.2533 | | 0.043 | 408.0 | 7650 | 5.1529 | 0.2567 | | 0.0419 | 408.96 | 7668 | 5.1589 | 0.255 | | 0.0388 | 409.97 | 7687 | 5.1641 | 0.2567 | | 0.0601 | 410.99 | 7706 | 5.1679 | 0.2567 | | 0.0521 | 412.0 | 7725 | 5.1805 | 0.255 | | 0.0564 | 412.96 | 7743 | 5.1811 | 0.2517 | | 0.0395 | 413.97 | 7762 | 5.1670 | 0.2483 | | 0.0542 | 414.99 | 7781 | 5.1530 | 0.2517 | | 0.0482 | 416.0 | 7800 | 5.1599 | 0.2483 | | 0.0597 | 416.96 | 7818 | 5.1715 | 0.2517 | | 0.0579 | 417.97 | 7837 | 5.1748 | 0.2517 | | 0.0331 | 418.99 | 7856 | 5.1811 | 0.25 | | 0.045 | 420.0 | 7875 | 5.1903 | 0.25 | | 0.0424 | 420.96 | 7893 | 5.1830 | 0.2583 | | 0.0474 | 421.97 | 7912 | 5.1867 | 0.2567 | | 0.051 | 422.99 | 7931 | 5.1793 | 0.255 | | 0.0563 | 424.0 | 7950 | 5.1588 | 0.2517 | | 0.039 | 424.96 | 7968 | 5.1612 | 0.2483 | | 0.0722 | 425.97 | 7987 | 5.1692 | 0.2433 | | 0.0492 | 426.99 | 8006 | 5.1873 | 0.245 | | 0.0594 | 428.0 | 8025 | 5.1983 | 0.245 | | 0.0646 | 428.96 | 8043 | 5.2049 | 0.2483 | | 0.0412 | 
429.97 | 8062 | 5.1985 | 0.25 | | 0.0493 | 430.99 | 8081 | 5.2013 | 0.24 | | 0.0582 | 432.0 | 8100 | 5.1916 | 0.2467 | | 0.0528 | 432.96 | 8118 | 5.1852 | 0.2483 | | 0.0432 | 433.97 | 8137 | 5.1820 | 0.25 | | 0.0313 | 434.99 | 8156 | 5.1844 | 0.2483 | | 0.048 | 436.0 | 8175 | 5.1884 | 0.2467 | | 0.0591 | 436.96 | 8193 | 5.1952 | 0.2467 | | 0.0379 | 437.97 | 8212 | 5.2027 | 0.25 | | 0.0355 | 438.99 | 8231 | 5.2032 | 0.2517 | | 0.0499 | 440.0 | 8250 | 5.1997 | 0.2533 | | 0.0464 | 440.96 | 8268 | 5.1866 | 0.255 | | 0.0455 | 441.97 | 8287 | 5.1744 | 0.255 | | 0.0429 | 442.99 | 8306 | 5.1745 | 0.2533 | | 0.0532 | 444.0 | 8325 | 5.1885 | 0.2483 | | 0.053 | 444.96 | 8343 | 5.1893 | 0.2517 | | 0.0453 | 445.97 | 8362 | 5.1899 | 0.2467 | | 0.0414 | 446.99 | 8381 | 5.1957 | 0.2433 | | 0.0449 | 448.0 | 8400 | 5.2016 | 0.2433 | | 0.0424 | 448.96 | 8418 | 5.2080 | 0.245 | | 0.0377 | 449.97 | 8437 | 5.2015 | 0.2417 | | 0.0448 | 450.99 | 8456 | 5.1974 | 0.2433 | | 0.0647 | 452.0 | 8475 | 5.1928 | 0.24 | | 0.0549 | 452.96 | 8493 | 5.1991 | 0.24 | | 0.0455 | 453.97 | 8512 | 5.1975 | 0.245 | | 0.0531 | 454.99 | 8531 | 5.1890 | 0.2417 | | 0.0476 | 456.0 | 8550 | 5.1756 | 0.245 | | 0.0366 | 456.96 | 8568 | 5.1726 | 0.2467 | | 0.0534 | 457.97 | 8587 | 5.1726 | 0.2467 | | 0.0521 | 458.99 | 8606 | 5.1757 | 0.2467 | | 0.042 | 460.0 | 8625 | 5.1777 | 0.2467 | | 0.0513 | 460.96 | 8643 | 5.1781 | 0.2483 | | 0.0463 | 461.97 | 8662 | 5.1803 | 0.25 | | 0.0563 | 462.99 | 8681 | 5.1841 | 0.2483 | | 0.0366 | 464.0 | 8700 | 5.1888 | 0.245 | | 0.0439 | 464.96 | 8718 | 5.1896 | 0.2467 | | 0.0645 | 465.97 | 8737 | 5.1916 | 0.2483 | | 0.0358 | 466.99 | 8756 | 5.1958 | 0.25 | | 0.0397 | 468.0 | 8775 | 5.1982 | 0.2467 | | 0.0498 | 468.96 | 8793 | 5.2001 | 0.245 | | 0.0597 | 469.97 | 8812 | 5.2035 | 0.245 | | 0.031 | 470.99 | 8831 | 5.2055 | 0.245 | | 0.0466 | 472.0 | 8850 | 5.2071 | 0.245 | | 0.0406 | 472.96 | 8868 | 5.2064 | 0.245 | | 0.047 | 473.97 | 8887 | 5.2062 | 0.2467 | | 0.0472 | 474.99 | 8906 | 
5.2057 | 0.245 | | 0.0392 | 476.0 | 8925 | 5.2060 | 0.2467 | | 0.0314 | 476.96 | 8943 | 5.2054 | 0.2467 | | 0.0411 | 477.97 | 8962 | 5.2049 | 0.2467 | | 0.0427 | 478.99 | 8981 | 5.2046 | 0.2467 | | 0.0469 | 480.0 | 9000 | 5.2045 | 0.2467 |

### Framework versions

- Transformers 4.31.0
- Pytorch 1.13.1
- Datasets 2.14.0
- Tokenizers 0.13.3
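The (epoch, step) pairs logged in the table above imply a fixed rate of optimizer steps per epoch. A quick sketch (plain Python, using three checkpoints copied from the table) recovers it:

```python
# Recover the average number of optimizer steps per epoch from
# (epoch, step) pairs logged in the training table above.
checkpoints = [(320.0, 6000), (400.0, 7500), (480.0, 9000)]

steps_per_epoch = [step / epoch for epoch, step in checkpoints]

# All three checkpoints agree on the same rate: 18.75 steps per epoch.
assert all(abs(s - 18.75) < 1e-9 for s in steps_per_epoch)
```

At 18.75 optimizer steps per epoch with a batched loader, the fractional epoch values in the log (0.96, 1.97, …) are just step counts divided by this rate.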
[ "acanthite", "actinolite", "adamite", "aegirine", "afghanite", "agate", "alabandite", "albite", "almandine", "amethyst", "analcime", "anatase", "andalusite", "andradite", "anglesite", "anhydrite", "annabergite", "anorthite", "anorthoclase", "antimony", "apatite", "apophyllite", "aquamarine", "aragonite", "arfvedsonite", "arsenic", "arsenopyrite", "augite", "aurichalcite", "axinite", "azurite", "babingtonite", "barite", "bauxite", "benitoite", "beryl", "biotite", "bismuth", "bismuthinite", "bixbyite", "boehmite", "boleite", "boracite", "borax", "bornite", "boulangerite", "bournonite", "brochantite", "bromargyrite", "brookite", "brucite", "buergerite", "bustamite", "bytownite", "calaverite", "calcite", "calomel", "carletonite", "cassiterite", "celestine", "cerussite", "cervantite", "chabazite", "chalcanthite", "chalcedony", "chalcocite", "chalcopyrite", "chambersite", "chamosite", "chlorargyrite", "chlorite", "chondrodite", "chromite", "chrysoberyl", "chrysocolla", "cinnabar", "citrine", "clinochlore", "clinohumite", "clinozoisite", "cobaltite", "coesite", "colemanite", "columbite", "cookeite", "copper", "cordierite", "corundum", "covellite", "cristobalite", "crocoite", "cryolite", "cumengeite", "cuprite", "cyanotrichite", "danburite", "datolite", "diamond", "diaspore", "diopside", "dioptase", "dolomite", "dravite", "dumortierite", "edenite", "elbaite", "emerald", "enargite", "enstatite", "epidote", "epistilbite", "erythrite", "euclase", "ferberite", "fluorite", "franklinite", "galena", "garnet", "gaspeite", "gibbsite", "glauberite", "glaucophane", "gmelinite", "goethite", "gold", "goosecreekite", "graphite", "greenockite", "grossular", "gummite", "gypsum", "halite", "hastingsite", "hauyne", "hedenbergite", "hematite", "hemimorphite", "heulandite", "hornblende", "howlite", "huebnerite", "humite", "hydrozincite", "ice", "ilmenite", "inesite", "iodargyrite", "iron-nickel", "jadeite", "jamesonite", "jarosite", "jasper", "jeremejevite", "johannsenite", "kaolinite", 
"kernite", "krennerite", "kyanite", "labradorite", "laumontite", "lawsonite", "lazulite", "lazurite", "lead", "lepidolite", "leucite", "liddicoatite", "limonite", "linarite", "lithiophilite", "loellingite", "luzonite", "magnesiochromite", "magnesite", "magnetite", "malachite", "manganite", "manganoneptunite", "marcasite", "marialite", "meionite", "melanophlogite", "mercury", "mesolite", "metacinnabar", "metastibnite", "metavariscite", "miargyrite", "microcline", "millerite", "mimetite", "molybdenite", "monazite", "muscovite", "natrolite", "nepheline", "neptunite", "nickeline", "nitratine", "norbergite", "nosean", "oligoclase", "olivine", "olmiite", "opal", "orpiment", "orthoclase", "otavite", "paradamite", "pararealgar", "pargasite", "pearceite", "pectolite", "pentlandite", "pezzottaite", "phenakite", "phlogopite", "phosgenite", "plagioclase", "platinum", "polybasite", "powellite", "prehnite", "proustite", "psilomelane", "pumpellyite", "pyrargyrite", "pyrite", "pyrolusite", "pyromorphite", "pyrope", "pyrophyllite", "pyroxmangite", "pyrrhotite", "quartz", "rammelsbergite", "raspite", "realgar", "rheniite", "rhodochrosite", "rhodonite", "riebeckite", "romanechite", "rutile", "talc", "tantalite", "tellurium", "tennantite", "tephroite", "tetrahedrite", "thenardite", "thomsonite", "thorite", "tincalconite", "titanite", "topaz", "tourmaline", "tremolite", "tridymite", "triphylite", "turquoise", "ulexite", "uraninite", "uvarovite", "uvite", "vanadinite", "variscite", "vesuvianite", "vivianite", "water", "wavellite", "willemite", "witherite", "wolframite", "wollastonite", "wulfenite", "wurtzite", "xanthoconite", "xenotime", "zincite", "zircon", "zoisite" ]
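For image-classification checkpoints like these, an ordered label list such as the one above conventionally becomes the `id2label`/`label2id` mapping in the model config. A minimal sketch (plain Python; the label list is truncated here for brevity):

```python
# Build the id2label / label2id mappings that classification configs
# conventionally carry, from an ordered list of class names.
labels = ["acanthite", "actinolite", "adamite", "aegirine", "afghanite"]  # truncated

id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in enumerate(labels)}

print(id2label[0])          # acanthite
print(label2id["adamite"])  # 2
```

The index of each name in the list is its class id, so the list order must match the order used when the dataset was encoded.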
minatosnow/swinv2-small-patch4-window16-256-mineral
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# swinv2-small-patch4-window16-256-mineral

This model is a fine-tuned version of [microsoft/swinv2-small-patch4-window16-256](https://huggingface.co/microsoft/swinv2-small-patch4-window16-256) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 4.9130
- Accuracy: 0.24

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 500

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 5.6941 | 0.96 | 18 | 5.6921 | 0.005 |
| 5.6886 | 1.97 | 37 | 5.6825 | 0.005 |
| 5.6735 | 2.99 | 56 | 5.6691 | 0.005 |
| 5.6521 | 4.0 | 75 | 5.6549 | 0.0033 |
| 5.6394 | 4.96 | 93 | 5.6416 | 0.0033 |
| 5.6078 | 5.97 | 112 | 5.6278 | 0.0033 |
| 5.5743 | 6.99 | 131 | 5.6128 | 0.0017 |
| 5.5509 | 8.0 | 150 | 5.5918 | 0.0017 |
| 5.5115 | 8.96 | 168 | 5.5696 | 0.0067 |
| 5.4411 | 9.97 | 187 | 5.5440 | 0.01 |
| 5.3335 | 10.99 | 206 | 5.5135 | 0.0167 |
| 5.2413 | 12.0 | 225 | 5.4640 | 0.0217 |
| 5.1738 | 12.96 | 243 | 5.4084 | 0.0333 |
| 5.0222 | 13.97 | 262 | 5.3321 | 0.045 |
| 4.8594 | 14.99 | 281 | 5.2485 | 0.0533 |
| 4.7441 | 16.0 | 300 | 5.1509 | 0.065 |
| 4.5946 | 16.96 | 318 | 5.0701 | 0.0717 |
| 4.3382 | 17.97 | 337 | 4.9767 | 0.0867 |
| 4.2008 | 18.99 | 356 | 4.8622 | 0.105 |
| 4.0563 | 20.0 | 375 
| 4.7726 | 0.1033 | | 3.8064 | 20.96 | 393 | 4.6898 | 0.115 | | 3.5584 | 21.97 | 412 | 4.5997 | 0.125 | | 3.3377 | 22.99 | 431 | 4.4848 | 0.1367 | | 3.1119 | 24.0 | 450 | 4.4052 | 0.1533 | | 2.8686 | 24.96 | 468 | 4.3705 | 0.15 | | 2.7649 | 25.97 | 487 | 4.2980 | 0.165 | | 2.5698 | 26.99 | 506 | 4.2363 | 0.1767 | | 2.4344 | 28.0 | 525 | 4.1733 | 0.1767 | | 2.2186 | 28.96 | 543 | 4.1783 | 0.1733 | | 2.0227 | 29.97 | 562 | 4.1306 | 0.18 | | 1.9153 | 30.99 | 581 | 4.0948 | 0.175 | | 1.7363 | 32.0 | 600 | 4.0612 | 0.1783 | | 1.6171 | 32.96 | 618 | 4.0209 | 0.185 | | 1.4865 | 33.97 | 637 | 4.0194 | 0.185 | | 1.3194 | 34.99 | 656 | 3.9881 | 0.205 | | 1.2811 | 36.0 | 675 | 3.9862 | 0.215 | | 1.1703 | 36.96 | 693 | 3.9905 | 0.2033 | | 1.114 | 37.97 | 712 | 3.9514 | 0.2133 | | 0.9645 | 38.99 | 731 | 3.9678 | 0.2067 | | 0.8976 | 40.0 | 750 | 3.9874 | 0.2167 | | 0.8147 | 40.96 | 768 | 3.9257 | 0.2083 | | 0.7239 | 41.97 | 787 | 3.9394 | 0.2217 | | 0.7732 | 42.99 | 806 | 3.9473 | 0.215 | | 0.7009 | 44.0 | 825 | 3.9461 | 0.215 | | 0.5945 | 44.96 | 843 | 4.0207 | 0.2133 | | 0.555 | 45.97 | 862 | 4.0353 | 0.2083 | | 0.5241 | 46.99 | 881 | 4.0232 | 0.2167 | | 0.4789 | 48.0 | 900 | 4.0026 | 0.22 | | 0.4284 | 48.96 | 918 | 4.0031 | 0.22 | | 0.4701 | 49.97 | 937 | 4.0572 | 0.215 | | 0.4501 | 50.99 | 956 | 4.0877 | 0.215 | | 0.3966 | 52.0 | 975 | 4.0207 | 0.2167 | | 0.3564 | 52.96 | 993 | 4.0827 | 0.215 | | 0.3472 | 53.97 | 1012 | 4.0902 | 0.235 | | 0.3731 | 54.99 | 1031 | 4.0953 | 0.2417 | | 0.3161 | 56.0 | 1050 | 4.1660 | 0.2033 | | 0.3352 | 56.96 | 1068 | 4.1153 | 0.2217 | | 0.3317 | 57.97 | 1087 | 4.1096 | 0.2167 | | 0.294 | 58.99 | 1106 | 4.1856 | 0.215 | | 0.3299 | 60.0 | 1125 | 4.1476 | 0.2233 | | 0.2847 | 60.96 | 1143 | 4.2046 | 0.225 | | 0.2924 | 61.97 | 1162 | 4.1568 | 0.2183 | | 0.2818 | 62.99 | 1181 | 4.1519 | 0.2333 | | 0.2698 | 64.0 | 1200 | 4.2275 | 0.215 | | 0.2579 | 64.96 | 1218 | 4.1626 | 0.235 | | 0.2597 | 65.97 | 1237 | 4.2277 | 0.2217 | | 0.2443 | 66.99 | 1256 | 
4.1929 | 0.2367 | | 0.2532 | 68.0 | 1275 | 4.2779 | 0.2233 | | 0.2305 | 68.96 | 1293 | 4.2441 | 0.2367 | | 0.2423 | 69.97 | 1312 | 4.2583 | 0.2217 | | 0.222 | 70.99 | 1331 | 4.2935 | 0.23 | | 0.2096 | 72.0 | 1350 | 4.2714 | 0.23 | | 0.1776 | 72.96 | 1368 | 4.2348 | 0.225 | | 0.2009 | 73.97 | 1387 | 4.2930 | 0.2283 | | 0.2087 | 74.99 | 1406 | 4.3071 | 0.235 | | 0.1818 | 76.0 | 1425 | 4.2960 | 0.235 | | 0.2236 | 76.96 | 1443 | 4.2910 | 0.24 | | 0.1802 | 77.97 | 1462 | 4.2896 | 0.25 | | 0.2037 | 78.99 | 1481 | 4.3314 | 0.245 | | 0.1912 | 80.0 | 1500 | 4.2612 | 0.2333 | | 0.2305 | 80.96 | 1518 | 4.2790 | 0.2367 | | 0.2188 | 81.97 | 1537 | 4.3069 | 0.2217 | | 0.1639 | 82.99 | 1556 | 4.3539 | 0.2183 | | 0.1741 | 84.0 | 1575 | 4.3211 | 0.225 | | 0.1937 | 84.96 | 1593 | 4.3576 | 0.2117 | | 0.1712 | 85.97 | 1612 | 4.3434 | 0.2233 | | 0.1665 | 86.99 | 1631 | 4.3349 | 0.2117 | | 0.1846 | 88.0 | 1650 | 4.4170 | 0.235 | | 0.1827 | 88.96 | 1668 | 4.3350 | 0.23 | | 0.1591 | 89.97 | 1687 | 4.3397 | 0.215 | | 0.1508 | 90.99 | 1706 | 4.3273 | 0.2317 | | 0.1808 | 92.0 | 1725 | 4.3315 | 0.2317 | | 0.17 | 92.96 | 1743 | 4.2760 | 0.24 | | 0.14 | 93.97 | 1762 | 4.3144 | 0.2333 | | 0.1734 | 94.99 | 1781 | 4.3667 | 0.2283 | | 0.1593 | 96.0 | 1800 | 4.3903 | 0.225 | | 0.1523 | 96.96 | 1818 | 4.3314 | 0.24 | | 0.1599 | 97.97 | 1837 | 4.4115 | 0.23 | | 0.1352 | 98.99 | 1856 | 4.3626 | 0.2467 | | 0.1406 | 100.0 | 1875 | 4.3555 | 0.2383 | | 0.1486 | 100.96 | 1893 | 4.3116 | 0.2383 | | 0.149 | 101.97 | 1912 | 4.3894 | 0.23 | | 0.115 | 102.99 | 1931 | 4.3755 | 0.2233 | | 0.1301 | 104.0 | 1950 | 4.3765 | 0.2317 | | 0.1429 | 104.96 | 1968 | 4.4027 | 0.235 | | 0.1209 | 105.97 | 1987 | 4.3803 | 0.2317 | | 0.1287 | 106.99 | 2006 | 4.3235 | 0.2467 | | 0.1318 | 108.0 | 2025 | 4.3484 | 0.24 | | 0.1136 | 108.96 | 2043 | 4.3977 | 0.225 | | 0.1326 | 109.97 | 2062 | 4.3978 | 0.2267 | | 0.1415 | 110.99 | 2081 | 4.3214 | 0.2383 | | 0.1229 | 112.0 | 2100 | 4.3699 | 0.2467 | | 0.1004 | 112.96 | 2118 | 4.3828 | 
0.2583 | | 0.0961 | 113.97 | 2137 | 4.3564 | 0.2517 | | 0.1132 | 114.99 | 2156 | 4.3384 | 0.2533 | | 0.1166 | 116.0 | 2175 | 4.4152 | 0.2417 | | 0.1193 | 116.96 | 2193 | 4.3634 | 0.2417 | | 0.096 | 117.97 | 2212 | 4.3826 | 0.235 | | 0.1158 | 118.99 | 2231 | 4.4524 | 0.235 | | 0.099 | 120.0 | 2250 | 4.4978 | 0.2233 | | 0.1065 | 120.96 | 2268 | 4.4124 | 0.24 | | 0.129 | 121.97 | 2287 | 4.3814 | 0.235 | | 0.1047 | 122.99 | 2306 | 4.3663 | 0.2467 | | 0.101 | 124.0 | 2325 | 4.5113 | 0.23 | | 0.1076 | 124.96 | 2343 | 4.4553 | 0.2367 | | 0.1135 | 125.97 | 2362 | 4.4351 | 0.23 | | 0.1066 | 126.99 | 2381 | 4.4874 | 0.235 | | 0.1256 | 128.0 | 2400 | 4.4635 | 0.2333 | | 0.0932 | 128.96 | 2418 | 4.4576 | 0.2383 | | 0.1189 | 129.97 | 2437 | 4.5770 | 0.2267 | | 0.1096 | 130.99 | 2456 | 4.4921 | 0.2317 | | 0.0791 | 132.0 | 2475 | 4.5090 | 0.2267 | | 0.1152 | 132.96 | 2493 | 4.4572 | 0.2417 | | 0.1264 | 133.97 | 2512 | 4.5109 | 0.25 | | 0.1009 | 134.99 | 2531 | 4.5236 | 0.2283 | | 0.0956 | 136.0 | 2550 | 4.4783 | 0.245 | | 0.0919 | 136.96 | 2568 | 4.5484 | 0.2467 | | 0.1042 | 137.97 | 2587 | 4.5423 | 0.2433 | | 0.1039 | 138.99 | 2606 | 4.4918 | 0.245 | | 0.094 | 140.0 | 2625 | 4.5456 | 0.2467 | | 0.1056 | 140.96 | 2643 | 4.5219 | 0.245 | | 0.0918 | 141.97 | 2662 | 4.5255 | 0.245 | | 0.0877 | 142.99 | 2681 | 4.4923 | 0.2383 | | 0.105 | 144.0 | 2700 | 4.5352 | 0.235 | | 0.0892 | 144.96 | 2718 | 4.4715 | 0.245 | | 0.0963 | 145.97 | 2737 | 4.5060 | 0.245 | | 0.095 | 146.99 | 2756 | 4.5593 | 0.2433 | | 0.0997 | 148.0 | 2775 | 4.5804 | 0.24 | | 0.0839 | 148.96 | 2793 | 4.5917 | 0.23 | | 0.0924 | 149.97 | 2812 | 4.5931 | 0.2267 | | 0.0781 | 150.99 | 2831 | 4.5784 | 0.2317 | | 0.0986 | 152.0 | 2850 | 4.6546 | 0.2283 | | 0.0823 | 152.96 | 2868 | 4.5985 | 0.2367 | | 0.0887 | 153.97 | 2887 | 4.6148 | 0.23 | | 0.0671 | 154.99 | 2906 | 4.6397 | 0.2333 | | 0.0897 | 156.0 | 2925 | 4.5834 | 0.235 | | 0.093 | 156.96 | 2943 | 4.5397 | 0.2433 | | 0.0973 | 157.97 | 2962 | 4.5532 | 0.2333 | | 0.1001 | 
158.99 | 2981 | 4.5827 | 0.24 | | 0.0884 | 160.0 | 3000 | 4.5728 | 0.235 | | 0.084 | 160.96 | 3018 | 4.6542 | 0.235 | | 0.0902 | 161.97 | 3037 | 4.6366 | 0.2417 | | 0.0944 | 162.99 | 3056 | 4.5957 | 0.2383 | | 0.0828 | 164.0 | 3075 | 4.6521 | 0.23 | | 0.0812 | 164.96 | 3093 | 4.6761 | 0.2367 | | 0.0817 | 165.97 | 3112 | 4.6272 | 0.225 | | 0.07 | 166.99 | 3131 | 4.6536 | 0.2433 | | 0.0746 | 168.0 | 3150 | 4.5671 | 0.245 | | 0.0782 | 168.96 | 3168 | 4.5915 | 0.24 | | 0.0677 | 169.97 | 3187 | 4.6373 | 0.2433 | | 0.0626 | 170.99 | 3206 | 4.6723 | 0.2583 | | 0.0697 | 172.0 | 3225 | 4.6817 | 0.245 | | 0.077 | 172.96 | 3243 | 4.6793 | 0.23 | | 0.068 | 173.97 | 3262 | 4.7110 | 0.2417 | | 0.0875 | 174.99 | 3281 | 4.7012 | 0.2433 | | 0.0787 | 176.0 | 3300 | 4.7113 | 0.2367 | | 0.0779 | 176.96 | 3318 | 4.6998 | 0.24 | | 0.0823 | 177.97 | 3337 | 4.7092 | 0.24 | | 0.0685 | 178.99 | 3356 | 4.6763 | 0.245 | | 0.0698 | 180.0 | 3375 | 4.7181 | 0.2567 | | 0.0924 | 180.96 | 3393 | 4.7151 | 0.2483 | | 0.084 | 181.97 | 3412 | 4.7231 | 0.2417 | | 0.0508 | 182.99 | 3431 | 4.6856 | 0.2317 | | 0.0637 | 184.0 | 3450 | 4.7041 | 0.2417 | | 0.06 | 184.96 | 3468 | 4.7205 | 0.24 | | 0.0659 | 185.97 | 3487 | 4.7251 | 0.2433 | | 0.0842 | 186.99 | 3506 | 4.7215 | 0.23 | | 0.0733 | 188.0 | 3525 | 4.7068 | 0.24 | | 0.0647 | 188.96 | 3543 | 4.7594 | 0.2367 | | 0.0569 | 189.97 | 3562 | 4.7831 | 0.2233 | | 0.0883 | 190.99 | 3581 | 4.7212 | 0.235 | | 0.0622 | 192.0 | 3600 | 4.6878 | 0.2417 | | 0.057 | 192.96 | 3618 | 4.6654 | 0.2467 | | 0.0654 | 193.97 | 3637 | 4.6358 | 0.2517 | | 0.0868 | 194.99 | 3656 | 4.6621 | 0.2333 | | 0.0789 | 196.0 | 3675 | 4.6985 | 0.2333 | | 0.0657 | 196.96 | 3693 | 4.6636 | 0.2567 | | 0.0648 | 197.97 | 3712 | 4.7698 | 0.2467 | | 0.0635 | 198.99 | 3731 | 4.7226 | 0.2417 | | 0.0637 | 200.0 | 3750 | 4.7481 | 0.245 | | 0.0665 | 200.96 | 3768 | 4.7789 | 0.2483 | | 0.0799 | 201.97 | 3787 | 4.7014 | 0.235 | | 0.064 | 202.99 | 3806 | 4.7528 | 0.2417 | | 0.0772 | 204.0 | 3825 | 4.7401 
| 0.2383 | | 0.0438 | 204.96 | 3843 | 4.7678 | 0.2417 | | 0.0766 | 205.97 | 3862 | 4.7180 | 0.2367 | | 0.0687 | 206.99 | 3881 | 4.7058 | 0.2433 | | 0.0801 | 208.0 | 3900 | 4.7584 | 0.235 | | 0.0772 | 208.96 | 3918 | 4.7304 | 0.2433 | | 0.0663 | 209.97 | 3937 | 4.6940 | 0.2367 | | 0.0529 | 210.99 | 3956 | 4.6940 | 0.235 | | 0.0568 | 212.0 | 3975 | 4.7333 | 0.235 | | 0.0697 | 212.96 | 3993 | 4.6673 | 0.2367 | | 0.0394 | 213.97 | 4012 | 4.6733 | 0.245 | | 0.0625 | 214.99 | 4031 | 4.7383 | 0.225 | | 0.0588 | 216.0 | 4050 | 4.7674 | 0.24 | | 0.0594 | 216.96 | 4068 | 4.6873 | 0.2417 | | 0.0451 | 217.97 | 4087 | 4.6718 | 0.2433 | | 0.047 | 218.99 | 4106 | 4.7146 | 0.2283 | | 0.0445 | 220.0 | 4125 | 4.7174 | 0.2283 | | 0.0746 | 220.96 | 4143 | 4.6702 | 0.2367 | | 0.0697 | 221.97 | 4162 | 4.6462 | 0.2367 | | 0.0562 | 222.99 | 4181 | 4.6956 | 0.2333 | | 0.047 | 224.0 | 4200 | 4.7278 | 0.2383 | | 0.0612 | 224.96 | 4218 | 4.7307 | 0.235 | | 0.0625 | 225.97 | 4237 | 4.6670 | 0.2567 | | 0.0739 | 226.99 | 4256 | 4.7110 | 0.2317 | | 0.0637 | 228.0 | 4275 | 4.7039 | 0.22 | | 0.0461 | 228.96 | 4293 | 4.7119 | 0.2267 | | 0.0506 | 229.97 | 4312 | 4.7099 | 0.23 | | 0.0412 | 230.99 | 4331 | 4.6714 | 0.2317 | | 0.057 | 232.0 | 4350 | 4.6921 | 0.2367 | | 0.0402 | 232.96 | 4368 | 4.7545 | 0.2317 | | 0.058 | 233.97 | 4387 | 4.7573 | 0.225 | | 0.0661 | 234.99 | 4406 | 4.6800 | 0.2283 | | 0.0613 | 236.0 | 4425 | 4.6533 | 0.2433 | | 0.0462 | 236.96 | 4443 | 4.6748 | 0.2283 | | 0.0494 | 237.97 | 4462 | 4.6874 | 0.23 | | 0.0643 | 238.99 | 4481 | 4.7291 | 0.2333 | | 0.0422 | 240.0 | 4500 | 4.7088 | 0.23 | | 0.0376 | 240.96 | 4518 | 4.7422 | 0.225 | | 0.0696 | 241.97 | 4537 | 4.8011 | 0.2283 | | 0.0609 | 242.99 | 4556 | 4.8013 | 0.2217 | | 0.0637 | 244.0 | 4575 | 4.7603 | 0.225 | | 0.0529 | 244.96 | 4593 | 4.7895 | 0.2233 | | 0.0603 | 245.97 | 4612 | 4.7639 | 0.235 | | 0.0365 | 246.99 | 4631 | 4.7285 | 0.2433 | | 0.0732 | 248.0 | 4650 | 4.7252 | 0.2283 | | 0.0709 | 248.96 | 4668 | 4.7620 | 0.23 | 
| 0.0485 | 249.97 | 4687 | 4.7529 | 0.2367 | | 0.0449 | 250.99 | 4706 | 4.8006 | 0.2417 | | 0.0506 | 252.0 | 4725 | 4.8028 | 0.2333 | | 0.0455 | 252.96 | 4743 | 4.7778 | 0.2367 | | 0.0594 | 253.97 | 4762 | 4.7439 | 0.2383 | | 0.0551 | 254.99 | 4781 | 4.8069 | 0.2367 | | 0.0435 | 256.0 | 4800 | 4.8171 | 0.2383 | | 0.042 | 256.96 | 4818 | 4.7961 | 0.2383 | | 0.0403 | 257.97 | 4837 | 4.8172 | 0.2383 | | 0.0524 | 258.99 | 4856 | 4.8537 | 0.23 | | 0.0461 | 260.0 | 4875 | 4.7698 | 0.2283 | | 0.05 | 260.96 | 4893 | 4.8058 | 0.2483 | | 0.0545 | 261.97 | 4912 | 4.8398 | 0.2333 | | 0.0405 | 262.99 | 4931 | 4.8228 | 0.2367 | | 0.0615 | 264.0 | 4950 | 4.8395 | 0.2367 | | 0.0381 | 264.96 | 4968 | 4.8231 | 0.2233 | | 0.0464 | 265.97 | 4987 | 4.8180 | 0.2367 | | 0.058 | 266.99 | 5006 | 4.8744 | 0.235 | | 0.0553 | 268.0 | 5025 | 4.8866 | 0.2367 | | 0.0505 | 268.96 | 5043 | 4.8534 | 0.24 | | 0.049 | 269.97 | 5062 | 4.8702 | 0.2333 | | 0.0444 | 270.99 | 5081 | 4.8715 | 0.2267 | | 0.0457 | 272.0 | 5100 | 4.8274 | 0.225 | | 0.0546 | 272.96 | 5118 | 4.8441 | 0.225 | | 0.0378 | 273.97 | 5137 | 4.8229 | 0.225 | | 0.0374 | 274.99 | 5156 | 4.8053 | 0.2217 | | 0.047 | 276.0 | 5175 | 4.8619 | 0.2333 | | 0.0526 | 276.96 | 5193 | 4.8793 | 0.2417 | | 0.0503 | 277.97 | 5212 | 4.9060 | 0.2283 | | 0.0414 | 278.99 | 5231 | 4.8687 | 0.24 | | 0.0361 | 280.0 | 5250 | 4.8537 | 0.24 | | 0.0449 | 280.96 | 5268 | 4.8204 | 0.2383 | | 0.0596 | 281.97 | 5287 | 4.8030 | 0.2367 | | 0.0494 | 282.99 | 5306 | 4.8060 | 0.2483 | | 0.0483 | 284.0 | 5325 | 4.7878 | 0.235 | | 0.0338 | 284.96 | 5343 | 4.8254 | 0.2383 | | 0.0319 | 285.97 | 5362 | 4.8264 | 0.2383 | | 0.0454 | 286.99 | 5381 | 4.8426 | 0.2367 | | 0.0409 | 288.0 | 5400 | 4.8198 | 0.2483 | | 0.0435 | 288.96 | 5418 | 4.8339 | 0.2367 | | 0.0498 | 289.97 | 5437 | 4.8387 | 0.225 | | 0.0447 | 290.99 | 5456 | 4.8342 | 0.23 | | 0.0402 | 292.0 | 5475 | 4.8496 | 0.2333 | | 0.0366 | 292.96 | 5493 | 4.8671 | 0.2317 | | 0.0369 | 293.97 | 5512 | 4.8366 | 0.2467 | | 
0.0361 | 294.99 | 5531 | 4.7992 | 0.2433 | | 0.0448 | 296.0 | 5550 | 4.8486 | 0.2267 | | 0.055 | 296.96 | 5568 | 4.8979 | 0.2267 | | 0.0585 | 297.97 | 5587 | 4.8660 | 0.2367 | | 0.0477 | 298.99 | 5606 | 4.8717 | 0.2433 | | 0.0247 | 300.0 | 5625 | 4.8838 | 0.2283 | | 0.047 | 300.96 | 5643 | 4.8248 | 0.2383 | | 0.0608 | 301.97 | 5662 | 4.8330 | 0.2367 | | 0.0417 | 302.99 | 5681 | 4.8236 | 0.2317 | | 0.0494 | 304.0 | 5700 | 4.8070 | 0.2383 | | 0.0316 | 304.96 | 5718 | 4.8213 | 0.2267 | | 0.0421 | 305.97 | 5737 | 4.8634 | 0.2317 | | 0.0411 | 306.99 | 5756 | 4.8770 | 0.24 | | 0.0404 | 308.0 | 5775 | 4.9030 | 0.2383 | | 0.0397 | 308.96 | 5793 | 4.9433 | 0.2383 | | 0.053 | 309.97 | 5812 | 4.9301 | 0.2333 | | 0.0303 | 310.99 | 5831 | 4.8961 | 0.2283 | | 0.0369 | 312.0 | 5850 | 4.8560 | 0.2433 | | 0.0423 | 312.96 | 5868 | 4.9177 | 0.225 | | 0.0343 | 313.97 | 5887 | 4.8928 | 0.2233 | | 0.0216 | 314.99 | 5906 | 4.8958 | 0.23 | | 0.0287 | 316.0 | 5925 | 4.8803 | 0.235 | | 0.0286 | 316.96 | 5943 | 4.8615 | 0.23 | | 0.0304 | 317.97 | 5962 | 4.8736 | 0.2317 | | 0.0486 | 318.99 | 5981 | 4.8825 | 0.2233 | | 0.0404 | 320.0 | 6000 | 4.8618 | 0.2283 | | 0.0439 | 320.96 | 6018 | 4.8848 | 0.23 | | 0.0428 | 321.97 | 6037 | 4.8975 | 0.2267 | | 0.0498 | 322.99 | 6056 | 4.8614 | 0.2383 | | 0.0314 | 324.0 | 6075 | 4.8718 | 0.235 | | 0.0334 | 324.96 | 6093 | 4.9021 | 0.2383 | | 0.0431 | 325.97 | 6112 | 4.8973 | 0.2283 | | 0.0473 | 326.99 | 6131 | 4.8671 | 0.24 | | 0.0348 | 328.0 | 6150 | 4.9050 | 0.2333 | | 0.0718 | 328.96 | 6168 | 4.8869 | 0.2417 | | 0.0387 | 329.97 | 6187 | 4.8552 | 0.245 | | 0.0335 | 330.99 | 6206 | 4.8932 | 0.2367 | | 0.0355 | 332.0 | 6225 | 4.9195 | 0.245 | | 0.0407 | 332.96 | 6243 | 4.9163 | 0.2333 | | 0.0471 | 333.97 | 6262 | 4.8860 | 0.225 | | 0.0334 | 334.99 | 6281 | 4.8943 | 0.235 | | 0.0301 | 336.0 | 6300 | 4.9223 | 0.2367 | | 0.0281 | 336.96 | 6318 | 4.9101 | 0.2433 | | 0.0305 | 337.97 | 6337 | 4.8897 | 0.24 | | 0.0505 | 338.99 | 6356 | 4.9290 | 0.2417 | | 0.024 | 
340.0 | 6375 | 4.9442 | 0.2333 | | 0.0504 | 340.96 | 6393 | 4.9183 | 0.2367 | | 0.0259 | 341.97 | 6412 | 4.8832 | 0.235 | | 0.0313 | 342.99 | 6431 | 4.8958 | 0.2317 | | 0.0293 | 344.0 | 6450 | 4.8979 | 0.2433 | | 0.0427 | 344.96 | 6468 | 4.9055 | 0.2417 | | 0.0399 | 345.97 | 6487 | 4.8957 | 0.2433 | | 0.0273 | 346.99 | 6506 | 4.8989 | 0.24 | | 0.0388 | 348.0 | 6525 | 4.9087 | 0.2367 | | 0.0306 | 348.96 | 6543 | 4.9264 | 0.2283 | | 0.0411 | 349.97 | 6562 | 4.9219 | 0.2367 | | 0.0394 | 350.99 | 6581 | 4.8998 | 0.24 | | 0.0507 | 352.0 | 6600 | 4.9304 | 0.2317 | | 0.0263 | 352.96 | 6618 | 4.9232 | 0.23 | | 0.0395 | 353.97 | 6637 | 4.9241 | 0.2367 | | 0.0394 | 354.99 | 6656 | 4.9263 | 0.2433 | | 0.0391 | 356.0 | 6675 | 4.9273 | 0.26 | | 0.0647 | 356.96 | 6693 | 4.9034 | 0.2633 | | 0.038 | 357.97 | 6712 | 4.8910 | 0.2467 | | 0.0368 | 358.99 | 6731 | 4.8830 | 0.245 | | 0.0308 | 360.0 | 6750 | 4.8867 | 0.2367 | | 0.0346 | 360.96 | 6768 | 4.8657 | 0.2433 | | 0.0279 | 361.97 | 6787 | 4.8678 | 0.24 | | 0.0443 | 362.99 | 6806 | 4.8723 | 0.2433 | | 0.027 | 364.0 | 6825 | 4.8756 | 0.2433 | | 0.0447 | 364.96 | 6843 | 4.8742 | 0.235 | | 0.028 | 365.97 | 6862 | 4.9042 | 0.235 | | 0.0483 | 366.99 | 6881 | 4.9086 | 0.2367 | | 0.034 | 368.0 | 6900 | 4.8886 | 0.24 | | 0.0363 | 368.96 | 6918 | 4.8778 | 0.2467 | | 0.0417 | 369.97 | 6937 | 4.9051 | 0.2417 | | 0.0326 | 370.99 | 6956 | 4.9112 | 0.2367 | | 0.028 | 372.0 | 6975 | 4.9116 | 0.2333 | | 0.0343 | 372.96 | 6993 | 4.9104 | 0.245 | | 0.0229 | 373.97 | 7012 | 4.9401 | 0.2367 | | 0.0337 | 374.99 | 7031 | 4.9341 | 0.245 | | 0.0356 | 376.0 | 7050 | 4.9336 | 0.2317 | | 0.029 | 376.96 | 7068 | 4.9132 | 0.2333 | | 0.0272 | 377.97 | 7087 | 4.9102 | 0.2367 | | 0.0256 | 378.99 | 7106 | 4.9255 | 0.2317 | | 0.0276 | 380.0 | 7125 | 4.9282 | 0.2267 | | 0.026 | 380.96 | 7143 | 4.9527 | 0.22 | | 0.0385 | 381.97 | 7162 | 4.9411 | 0.2217 | | 0.026 | 382.99 | 7181 | 4.9530 | 0.2367 | | 0.0444 | 384.0 | 7200 | 4.9387 | 0.2383 | | 0.0369 | 384.96 | 7218 
| 4.9042 | 0.2333 | | 0.0203 | 385.97 | 7237 | 4.8860 | 0.23 | | 0.0238 | 386.99 | 7256 | 4.8775 | 0.2333 | | 0.0315 | 388.0 | 7275 | 4.8641 | 0.2333 | | 0.0349 | 388.96 | 7293 | 4.8677 | 0.2467 | | 0.038 | 389.97 | 7312 | 4.8688 | 0.24 | | 0.0301 | 390.99 | 7331 | 4.8932 | 0.245 | | 0.0363 | 392.0 | 7350 | 4.9023 | 0.2417 | | 0.0329 | 392.96 | 7368 | 4.8825 | 0.24 | | 0.0174 | 393.97 | 7387 | 4.8711 | 0.24 | | 0.0284 | 394.99 | 7406 | 4.8762 | 0.2433 | | 0.0178 | 396.0 | 7425 | 4.8684 | 0.2417 | | 0.0359 | 396.96 | 7443 | 4.8660 | 0.245 | | 0.029 | 397.97 | 7462 | 4.8799 | 0.2433 | | 0.0227 | 398.99 | 7481 | 4.8845 | 0.25 | | 0.0135 | 400.0 | 7500 | 4.8898 | 0.2383 | | 0.0297 | 400.96 | 7518 | 4.8967 | 0.2383 | | 0.0263 | 401.97 | 7537 | 4.8884 | 0.2333 | | 0.0386 | 402.99 | 7556 | 4.8719 | 0.24 | | 0.0298 | 404.0 | 7575 | 4.8609 | 0.2433 | | 0.0232 | 404.96 | 7593 | 4.8602 | 0.2483 | | 0.0232 | 405.97 | 7612 | 4.8667 | 0.2467 | | 0.032 | 406.99 | 7631 | 4.8684 | 0.2483 | | 0.0306 | 408.0 | 7650 | 4.8755 | 0.2433 | | 0.0299 | 408.96 | 7668 | 4.8687 | 0.245 | | 0.0307 | 409.97 | 7687 | 4.8724 | 0.24 | | 0.0304 | 410.99 | 7706 | 4.8798 | 0.25 | | 0.0293 | 412.0 | 7725 | 4.8901 | 0.2483 | | 0.0273 | 412.96 | 7743 | 4.9025 | 0.24 | | 0.0184 | 413.97 | 7762 | 4.8870 | 0.24 | | 0.0377 | 414.99 | 7781 | 4.8901 | 0.2417 | | 0.0278 | 416.0 | 7800 | 4.8895 | 0.2417 | | 0.0345 | 416.96 | 7818 | 4.9046 | 0.2533 | | 0.0301 | 417.97 | 7837 | 4.9002 | 0.2483 | | 0.0159 | 418.99 | 7856 | 4.8982 | 0.245 | | 0.0203 | 420.0 | 7875 | 4.9008 | 0.2483 | | 0.0182 | 420.96 | 7893 | 4.9113 | 0.2467 | | 0.0258 | 421.97 | 7912 | 4.9180 | 0.25 | | 0.0266 | 422.99 | 7931 | 4.9134 | 0.2433 | | 0.0304 | 424.0 | 7950 | 4.9005 | 0.2417 | | 0.0247 | 424.96 | 7968 | 4.8937 | 0.2417 | | 0.0493 | 425.97 | 7987 | 4.8835 | 0.245 | | 0.0286 | 426.99 | 8006 | 4.8968 | 0.24 | | 0.0228 | 428.0 | 8025 | 4.9066 | 0.2383 | | 0.0362 | 428.96 | 8043 | 4.9031 | 0.245 | | 0.0244 | 429.97 | 8062 | 4.8997 | 0.2467 
| | 0.0204 | 430.99 | 8081 | 4.9059 | 0.2433 | | 0.0344 | 432.0 | 8100 | 4.9052 | 0.2433 | | 0.0252 | 432.96 | 8118 | 4.8975 | 0.2433 | | 0.0242 | 433.97 | 8137 | 4.8961 | 0.2467 | | 0.0135 | 434.99 | 8156 | 4.9086 | 0.2467 | | 0.0296 | 436.0 | 8175 | 4.9135 | 0.2417 | | 0.0432 | 436.96 | 8193 | 4.9079 | 0.2433 | | 0.0242 | 437.97 | 8212 | 4.8981 | 0.24 | | 0.0227 | 438.99 | 8231 | 4.8857 | 0.24 | | 0.021 | 440.0 | 8250 | 4.8874 | 0.2383 | | 0.0244 | 440.96 | 8268 | 4.8847 | 0.24 | | 0.0234 | 441.97 | 8287 | 4.8964 | 0.2367 | | 0.0278 | 442.99 | 8306 | 4.9161 | 0.2383 | | 0.0322 | 444.0 | 8325 | 4.9212 | 0.2367 | | 0.038 | 444.96 | 8343 | 4.9251 | 0.24 | | 0.0327 | 445.97 | 8362 | 4.9340 | 0.24 | | 0.0256 | 446.99 | 8381 | 4.9246 | 0.2417 | | 0.0327 | 448.0 | 8400 | 4.9294 | 0.2367 | | 0.0246 | 448.96 | 8418 | 4.9311 | 0.2417 | | 0.0239 | 449.97 | 8437 | 4.9220 | 0.2383 | | 0.0219 | 450.99 | 8456 | 4.9205 | 0.24 | | 0.0287 | 452.0 | 8475 | 4.9249 | 0.2367 | | 0.0244 | 452.96 | 8493 | 4.9275 | 0.24 | | 0.0222 | 453.97 | 8512 | 4.9322 | 0.2417 | | 0.0277 | 454.99 | 8531 | 4.9318 | 0.2383 | | 0.0315 | 456.0 | 8550 | 4.9291 | 0.2383 | | 0.021 | 456.96 | 8568 | 4.9293 | 0.2367 | | 0.0288 | 457.97 | 8587 | 4.9233 | 0.2333 | | 0.0229 | 458.99 | 8606 | 4.9236 | 0.2383 | | 0.0257 | 460.0 | 8625 | 4.9225 | 0.2367 | | 0.0291 | 460.96 | 8643 | 4.9222 | 0.2383 | | 0.0325 | 461.97 | 8662 | 4.9216 | 0.2367 | | 0.0268 | 462.99 | 8681 | 4.9202 | 0.2367 | | 0.0156 | 464.0 | 8700 | 4.9175 | 0.2367 | | 0.0196 | 464.96 | 8718 | 4.9147 | 0.2333 | | 0.0448 | 465.97 | 8737 | 4.9100 | 0.2333 | | 0.0232 | 466.99 | 8756 | 4.9088 | 0.2333 | | 0.0274 | 468.0 | 8775 | 4.9096 | 0.2367 | | 0.029 | 468.96 | 8793 | 4.9105 | 0.2367 | | 0.0337 | 469.97 | 8812 | 4.9125 | 0.235 | | 0.0178 | 470.99 | 8831 | 4.9120 | 0.235 | | 0.0286 | 472.0 | 8850 | 4.9125 | 0.2367 | | 0.0159 | 472.96 | 8868 | 4.9102 | 0.2367 | | 0.0318 | 473.97 | 8887 | 4.9116 | 0.2383 | | 0.0302 | 474.99 | 8906 | 4.9113 | 0.24 | | 
0.0184 | 476.0 | 8925 | 4.9120 | 0.24 | | 0.025 | 476.96 | 8943 | 4.9128 | 0.24 | | 0.027 | 477.97 | 8962 | 4.9126 | 0.24 | | 0.0298 | 478.99 | 8981 | 4.9130 | 0.24 | | 0.0349 | 480.0 | 9000 | 4.9130 | 0.24 | ### Framework versions - Transformers 4.31.0 - Pytorch 1.13.1 - Datasets 2.14.0 - Tokenizers 0.13.3
[ "acanthite", "actinolite", "adamite", "aegirine", "afghanite", "agate", "alabandite", "albite", "almandine", "amethyst", "analcime", "anatase", "andalusite", "andradite", "anglesite", "anhydrite", "annabergite", "anorthite", "anorthoclase", "antimony", "apatite", "apophyllite", "aquamarine", "aragonite", "arfvedsonite", "arsenic", "arsenopyrite", "augite", "aurichalcite", "axinite", "azurite", "babingtonite", "barite", "bauxite", "benitoite", "beryl", "biotite", "bismuth", "bismuthinite", "bixbyite", "boehmite", "boleite", "boracite", "borax", "bornite", "boulangerite", "bournonite", "brochantite", "bromargyrite", "brookite", "brucite", "buergerite", "bustamite", "bytownite", "calaverite", "calcite", "calomel", "carletonite", "cassiterite", "celestine", "cerussite", "cervantite", "chabazite", "chalcanthite", "chalcedony", "chalcocite", "chalcopyrite", "chambersite", "chamosite", "chlorargyrite", "chlorite", "chondrodite", "chromite", "chrysoberyl", "chrysocolla", "cinnabar", "citrine", "clinochlore", "clinohumite", "clinozoisite", "cobaltite", "coesite", "colemanite", "columbite", "cookeite", "copper", "cordierite", "corundum", "covellite", "cristobalite", "crocoite", "cryolite", "cumengeite", "cuprite", "cyanotrichite", "danburite", "datolite", "diamond", "diaspore", "diopside", "dioptase", "dolomite", "dravite", "dumortierite", "edenite", "elbaite", "emerald", "enargite", "enstatite", "epidote", "epistilbite", "erythrite", "euclase", "ferberite", "fluorite", "franklinite", "galena", "garnet", "gaspeite", "gibbsite", "glauberite", "glaucophane", "gmelinite", "goethite", "gold", "goosecreekite", "graphite", "greenockite", "grossular", "gummite", "gypsum", "halite", "hastingsite", "hauyne", "hedenbergite", "hematite", "hemimorphite", "heulandite", "hornblende", "howlite", "huebnerite", "humite", "hydrozincite", "ice", "ilmenite", "inesite", "iodargyrite", "iron-nickel", "jadeite", "jamesonite", "jarosite", "jasper", "jeremejevite", "johannsenite", "kaolinite", 
"kernite", "krennerite", "kyanite", "labradorite", "laumontite", "lawsonite", "lazulite", "lazurite", "lead", "lepidolite", "leucite", "liddicoatite", "limonite", "linarite", "lithiophilite", "loellingite", "luzonite", "magnesiochromite", "magnesite", "magnetite", "malachite", "manganite", "manganoneptunite", "marcasite", "marialite", "meionite", "melanophlogite", "mercury", "mesolite", "metacinnabar", "metastibnite", "metavariscite", "miargyrite", "microcline", "millerite", "mimetite", "molybdenite", "monazite", "muscovite", "natrolite", "nepheline", "neptunite", "nickeline", "nitratine", "norbergite", "nosean", "oligoclase", "olivine", "olmiite", "opal", "orpiment", "orthoclase", "otavite", "paradamite", "pararealgar", "pargasite", "pearceite", "pectolite", "pentlandite", "pezzottaite", "phenakite", "phlogopite", "phosgenite", "plagioclase", "platinum", "polybasite", "powellite", "prehnite", "proustite", "psilomelane", "pumpellyite", "pyrargyrite", "pyrite", "pyrolusite", "pyromorphite", "pyrope", "pyrophyllite", "pyroxmangite", "pyrrhotite", "quartz", "rammelsbergite", "raspite", "realgar", "rheniite", "rhodochrosite", "rhodonite", "riebeckite", "romanechite", "rutile", "talc", "tantalite", "tellurium", "tennantite", "tephroite", "tetrahedrite", "thenardite", "thomsonite", "thorite", "tincalconite", "titanite", "topaz", "tourmaline", "tremolite", "tridymite", "triphylite", "turquoise", "ulexite", "uraninite", "uvarovite", "uvite", "vanadinite", "variscite", "vesuvianite", "vivianite", "water", "wavellite", "willemite", "witherite", "wolframite", "wollastonite", "wulfenite", "wurtzite", "xanthoconite", "xenotime", "zincite", "zircon", "zoisite" ]
jordyvl/dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_kd_NKD_t1.0_g1.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_kd_NKD_t1.0_g1.5 This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 5.3085 - Accuracy: 0.828 - Brier Loss: 0.3005 - Nll: 1.3339 - F1 Micro: 0.828 - F1 Macro: 0.8285 - Ece: 0.1391 - Aurc: 0.0474 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 96 - eval_batch_size: 96 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 167 | 5.3750 | 0.61 | 0.5591 | 2.2520 | 0.61 | 0.5922 | 0.1827 | 0.1664 | | No log | 2.0 | 334 | 5.0343 | 0.7117 | 0.4389 | 1.9483 | 0.7117 | 0.7096 | 0.1691 | 0.0962 | | 5.4927 | 3.0 | 501 | 4.8554 | 0.7472 | 0.3777 | 1.6689 | 0.7472 | 0.7474 | 0.1221 | 0.0780 | | 5.4927 | 4.0 | 668 | 4.7917 | 0.76 | 0.3524 | 1.5715 | 0.76 | 0.7644 | 0.0915 | 0.0699 | | 5.4927 | 5.0 | 835 | 4.7792 | 0.765 | 0.3461 | 1.5348 | 0.765 | 0.7590 | 0.0737 | 0.0704 | | 4.6216 | 6.0 | 1002 | 4.6378 | 0.7993 | 0.2954 | 1.3769 | 0.7993 | 0.7995 | 0.0546 | 0.0559 | | 4.6216 | 7.0 | 1169 | 4.8666 | 0.771 | 0.3359 | 1.5727 | 0.771 | 0.7728 | 0.0670 | 0.0666 | | 
4.6216 | 8.0 | 1336 | 4.6834 | 0.7897 | 0.3047 | 1.3537 | 0.7897 | 0.7914 | 0.0531 | 0.0564 | | 4.2978 | 9.0 | 1503 | 4.6558 | 0.7997 | 0.2912 | 1.3758 | 0.7997 | 0.7988 | 0.0521 | 0.0508 | | 4.2978 | 10.0 | 1670 | 4.8214 | 0.7923 | 0.3144 | 1.5316 | 0.7923 | 0.7928 | 0.0815 | 0.0561 | | 4.2978 | 11.0 | 1837 | 4.8908 | 0.7923 | 0.3201 | 1.4158 | 0.7923 | 0.7931 | 0.0988 | 0.0573 | | 4.1375 | 12.0 | 2004 | 4.7703 | 0.8093 | 0.2971 | 1.3642 | 0.8093 | 0.8097 | 0.0812 | 0.0514 | | 4.1375 | 13.0 | 2171 | 4.8126 | 0.806 | 0.3039 | 1.3759 | 0.806 | 0.8053 | 0.0916 | 0.0491 | | 4.1375 | 14.0 | 2338 | 4.7875 | 0.8063 | 0.2990 | 1.3712 | 0.8062 | 0.8065 | 0.0904 | 0.0481 | | 4.0665 | 15.0 | 2505 | 4.7995 | 0.805 | 0.3016 | 1.4133 | 0.805 | 0.8049 | 0.0909 | 0.0503 | | 4.0665 | 16.0 | 2672 | 4.7712 | 0.8075 | 0.2957 | 1.4018 | 0.8075 | 0.8082 | 0.0880 | 0.0484 | | 4.0665 | 17.0 | 2839 | 4.7245 | 0.812 | 0.2886 | 1.2816 | 0.8120 | 0.8139 | 0.0831 | 0.0464 | | 4.0204 | 18.0 | 3006 | 4.8990 | 0.8055 | 0.3079 | 1.3884 | 0.8055 | 0.8046 | 0.1117 | 0.0504 | | 4.0204 | 19.0 | 3173 | 4.9286 | 0.802 | 0.3147 | 1.3977 | 0.802 | 0.7995 | 0.1078 | 0.0522 | | 4.0204 | 20.0 | 3340 | 4.9510 | 0.8055 | 0.3121 | 1.4482 | 0.8055 | 0.8062 | 0.1125 | 0.0521 | | 3.9854 | 21.0 | 3507 | 4.8837 | 0.8033 | 0.3082 | 1.4528 | 0.8033 | 0.8022 | 0.1052 | 0.0502 | | 3.9854 | 22.0 | 3674 | 5.0103 | 0.813 | 0.3069 | 1.4217 | 0.813 | 0.8169 | 0.1207 | 0.0500 | | 3.9854 | 23.0 | 3841 | 4.9602 | 0.8093 | 0.3091 | 1.4672 | 0.8093 | 0.8103 | 0.1187 | 0.0494 | | 3.9599 | 24.0 | 4008 | 4.8980 | 0.8177 | 0.2953 | 1.3589 | 0.8178 | 0.8203 | 0.1083 | 0.0451 | | 3.9599 | 25.0 | 4175 | 4.8753 | 0.8145 | 0.2932 | 1.3219 | 0.8145 | 0.8140 | 0.1054 | 0.0460 | | 3.9599 | 26.0 | 4342 | 4.9644 | 0.8193 | 0.3000 | 1.4336 | 0.8193 | 0.8200 | 0.1173 | 0.0458 | | 3.9358 | 27.0 | 4509 | 4.9648 | 0.8203 | 0.2985 | 1.4117 | 0.8203 | 0.8197 | 0.1132 | 0.0471 | | 3.9358 | 28.0 | 4676 | 5.0130 | 0.8195 | 0.3014 | 1.4618 | 0.8195 | 
0.8201 | 0.1205 | 0.0456 | | 3.9358 | 29.0 | 4843 | 4.8974 | 0.8255 | 0.2874 | 1.3041 | 0.8255 | 0.8258 | 0.1097 | 0.0421 | | 3.9151 | 30.0 | 5010 | 4.9045 | 0.8255 | 0.2878 | 1.2801 | 0.8255 | 0.8250 | 0.1119 | 0.0426 | | 3.9151 | 31.0 | 5177 | 4.9720 | 0.823 | 0.2945 | 1.3551 | 0.823 | 0.8240 | 0.1212 | 0.0439 | | 3.9151 | 32.0 | 5344 | 4.9508 | 0.826 | 0.2913 | 1.2669 | 0.826 | 0.8268 | 0.1201 | 0.0422 | | 3.9003 | 33.0 | 5511 | 5.0336 | 0.8237 | 0.2991 | 1.3443 | 0.8237 | 0.8240 | 0.1243 | 0.0433 | | 3.9003 | 34.0 | 5678 | 4.9828 | 0.8237 | 0.2901 | 1.2843 | 0.8237 | 0.8239 | 0.1214 | 0.0440 | | 3.9003 | 35.0 | 5845 | 5.0256 | 0.8287 | 0.2920 | 1.2961 | 0.8287 | 0.8291 | 0.1232 | 0.0422 | | 3.89 | 36.0 | 6012 | 5.0229 | 0.8283 | 0.2922 | 1.2471 | 0.8283 | 0.8283 | 0.1236 | 0.0432 | | 3.89 | 37.0 | 6179 | 5.0835 | 0.8285 | 0.2936 | 1.2892 | 0.8285 | 0.8289 | 0.1254 | 0.0442 | | 3.89 | 38.0 | 6346 | 5.1148 | 0.8287 | 0.2946 | 1.3106 | 0.8287 | 0.8282 | 0.1287 | 0.0427 | | 3.8846 | 39.0 | 6513 | 5.1238 | 0.827 | 0.2954 | 1.2964 | 0.827 | 0.8275 | 0.1298 | 0.0441 | | 3.8846 | 40.0 | 6680 | 5.1481 | 0.8307 | 0.2950 | 1.3136 | 0.8308 | 0.8311 | 0.1277 | 0.0453 | | 3.8846 | 41.0 | 6847 | 5.1491 | 0.8293 | 0.2943 | 1.2841 | 0.8293 | 0.8294 | 0.1298 | 0.0451 | | 3.881 | 42.0 | 7014 | 5.1982 | 0.829 | 0.2969 | 1.3111 | 0.8290 | 0.8292 | 0.1331 | 0.0459 | | 3.881 | 43.0 | 7181 | 5.2041 | 0.8283 | 0.2970 | 1.3427 | 0.8283 | 0.8283 | 0.1327 | 0.0465 | | 3.881 | 44.0 | 7348 | 5.2310 | 0.8297 | 0.2985 | 1.3351 | 0.8297 | 0.8303 | 0.1346 | 0.0471 | | 3.8796 | 45.0 | 7515 | 5.2394 | 0.83 | 0.2999 | 1.3308 | 0.83 | 0.8305 | 0.1348 | 0.0467 | | 3.8796 | 46.0 | 7682 | 5.2632 | 0.83 | 0.2990 | 1.3350 | 0.83 | 0.8304 | 0.1355 | 0.0471 | | 3.8796 | 47.0 | 7849 | 5.2821 | 0.828 | 0.2998 | 1.3354 | 0.828 | 0.8286 | 0.1383 | 0.0470 | | 3.8753 | 48.0 | 8016 | 5.2949 | 0.829 | 0.2998 | 1.3341 | 0.8290 | 0.8294 | 0.1374 | 0.0472 | | 3.8753 | 49.0 | 8183 | 5.3026 | 0.8287 | 0.3004 | 1.3281 
| 0.8287 | 0.8293 | 0.1382 | 0.0474 | | 3.8753 | 50.0 | 8350 | 5.3085 | 0.828 | 0.3005 | 1.3339 | 0.828 | 0.8285 | 0.1391 | 0.0474 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
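The Ece column above is the expected calibration error over the model's confidence scores. As a rough illustration of what that number measures — this is a minimal equal-width-binning sketch in numpy, not the actual evaluation code behind the table:

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Equal-width-bin ECE: bin-weight-averaged |accuracy - mean confidence|."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            # weight each bin by the fraction of samples falling into it
            ece += mask.mean() * abs(correct[mask].mean() - confidences[mask].mean())
    return ece
```

For example, four predictions at 0.8 confidence of which three are correct give an ECE of |0.75 − 0.8| = 0.05. The bin count and binning scheme (equal-width vs. equal-mass) are assumptions; different evaluation libraries make different choices.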
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_hint
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_hint This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 2.5149 - Accuracy: 0.8438 - Brier Loss: 0.2886 - Nll: 1.9666 - F1 Micro: 0.8438 - F1 Macro: 0.8442 - Ece: 0.1400 - Aurc: 0.0499 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 2.8286 | 1.0 | 1000 | 2.6354 | 0.6627 | 0.4563 | 2.2971 | 0.6627 | 0.6604 | 0.0542 | 0.1330 | | 2.3 | 2.0 | 2000 | 2.2674 | 0.73 | 0.3761 | 2.0481 | 0.7300 | 0.7314 | 0.0509 | 0.0916 | | 2.0283 | 3.0 | 3000 | 2.0891 | 0.7602 | 0.3360 | 1.9964 | 0.7602 | 0.7626 | 0.0564 | 0.0728 | | 1.8552 | 4.0 | 4000 | 2.1367 | 0.746 | 0.3686 | 2.0430 | 0.746 | 0.7485 | 0.0911 | 0.0815 | | 1.7095 | 5.0 | 5000 | 2.0469 | 0.7725 | 0.3301 | 1.9740 | 0.7725 | 0.7715 | 0.0882 | 0.0683 | | 1.6118 | 6.0 | 6000 | 1.9706 | 0.7788 | 0.3199 | 1.9470 | 0.7788 | 0.7773 | 0.0816 | 0.0617 | | 1.4475 | 7.0 | 7000 | 2.0324 | 0.779 | 0.3364 | 2.0056 | 0.779 | 0.7789 | 0.1159 | 0.0640 | | 1.3546 
| 8.0 | 8000 | 2.0987 | 0.7955 | 0.3266 | 1.9823 | 0.7955 | 0.7965 | 0.1259 | 0.0597 | | 1.2711 | 9.0 | 9000 | 2.1830 | 0.7863 | 0.3487 | 2.0545 | 0.7863 | 0.7879 | 0.1418 | 0.0621 | | 1.1984 | 10.0 | 10000 | 2.2992 | 0.7923 | 0.3537 | 2.0028 | 0.7923 | 0.7925 | 0.1532 | 0.0612 | | 1.1503 | 11.0 | 11000 | 2.3319 | 0.795 | 0.3449 | 2.0241 | 0.795 | 0.7946 | 0.1527 | 0.0594 | | 1.0998 | 12.0 | 12000 | 2.4733 | 0.7973 | 0.3553 | 2.0856 | 0.7973 | 0.7964 | 0.1602 | 0.0589 | | 1.0752 | 13.0 | 13000 | 2.4884 | 0.7887 | 0.3655 | 2.0351 | 0.7887 | 0.7902 | 0.1679 | 0.0644 | | 1.0564 | 14.0 | 14000 | 2.4374 | 0.7963 | 0.3496 | 2.0512 | 0.7963 | 0.7985 | 0.1611 | 0.0570 | | 1.0227 | 15.0 | 15000 | 2.5464 | 0.7973 | 0.3582 | 2.1184 | 0.7973 | 0.7936 | 0.1676 | 0.0568 | | 1.0129 | 16.0 | 16000 | 2.5022 | 0.8027 | 0.3441 | 2.0449 | 0.8027 | 0.8036 | 0.1636 | 0.0560 | | 0.9895 | 17.0 | 17000 | 2.4877 | 0.811 | 0.3358 | 2.0303 | 0.811 | 0.8099 | 0.1578 | 0.0562 | | 0.9628 | 18.0 | 18000 | 2.4552 | 0.8107 | 0.3328 | 2.0399 | 0.8108 | 0.8114 | 0.1548 | 0.0527 | | 0.9466 | 19.0 | 19000 | 2.5208 | 0.818 | 0.3251 | 2.0761 | 0.818 | 0.8189 | 0.1524 | 0.0520 | | 0.9291 | 20.0 | 20000 | 2.5858 | 0.8137 | 0.3332 | 2.0634 | 0.8137 | 0.8141 | 0.1588 | 0.0538 | | 0.9177 | 21.0 | 21000 | 2.5647 | 0.8107 | 0.3383 | 2.0875 | 0.8108 | 0.8124 | 0.1601 | 0.0539 | | 0.9038 | 22.0 | 22000 | 2.6104 | 0.82 | 0.3301 | 2.1033 | 0.82 | 0.8198 | 0.1566 | 0.0559 | | 0.8874 | 23.0 | 23000 | 2.5864 | 0.8237 | 0.3188 | 2.0000 | 0.8237 | 0.8244 | 0.1517 | 0.0519 | | 0.8858 | 24.0 | 24000 | 2.5969 | 0.8185 | 0.3273 | 2.0714 | 0.8185 | 0.8191 | 0.1551 | 0.0527 | | 0.8653 | 25.0 | 25000 | 2.5529 | 0.828 | 0.3109 | 2.0179 | 0.828 | 0.8287 | 0.1505 | 0.0509 | | 0.8475 | 26.0 | 26000 | 2.5745 | 0.8265 | 0.3171 | 1.9994 | 0.8265 | 0.8272 | 0.1509 | 0.0526 | | 0.8569 | 27.0 | 27000 | 2.5906 | 0.8265 | 0.3142 | 2.0156 | 0.8265 | 0.8272 | 0.1499 | 0.0565 | | 0.8368 | 28.0 | 28000 | 2.7150 | 0.8225 | 0.3271 | 2.0439 | 
0.8225 | 0.8215 | 0.1561 | 0.0580 | | 0.8355 | 29.0 | 29000 | 2.6501 | 0.824 | 0.3211 | 1.9908 | 0.824 | 0.8260 | 0.1545 | 0.0541 | | 0.832 | 30.0 | 30000 | 2.5656 | 0.8315 | 0.3076 | 2.0091 | 0.8315 | 0.8328 | 0.1474 | 0.0540 | | 0.8191 | 31.0 | 31000 | 2.6891 | 0.827 | 0.3189 | 1.9819 | 0.827 | 0.8294 | 0.1529 | 0.0573 | | 0.8118 | 32.0 | 32000 | 2.6791 | 0.827 | 0.3175 | 2.0233 | 0.827 | 0.8268 | 0.1523 | 0.0575 | | 0.8098 | 33.0 | 33000 | 2.5437 | 0.8373 | 0.2992 | 1.9926 | 0.8373 | 0.8384 | 0.1435 | 0.0492 | | 0.8006 | 34.0 | 34000 | 2.5751 | 0.8415 | 0.2932 | 2.0036 | 0.8415 | 0.8410 | 0.1403 | 0.0501 | | 0.8033 | 35.0 | 35000 | 2.5944 | 0.8303 | 0.3113 | 2.0069 | 0.8303 | 0.8302 | 0.1492 | 0.0537 | | 0.7916 | 36.0 | 36000 | 2.4955 | 0.839 | 0.2922 | 1.9523 | 0.839 | 0.8391 | 0.1407 | 0.0493 | | 0.7919 | 37.0 | 37000 | 2.6199 | 0.8365 | 0.3003 | 1.9494 | 0.8365 | 0.8370 | 0.1458 | 0.0538 | | 0.7844 | 38.0 | 38000 | 2.5823 | 0.8365 | 0.3011 | 1.9960 | 0.8365 | 0.8368 | 0.1462 | 0.0511 | | 0.7795 | 39.0 | 39000 | 2.5626 | 0.8415 | 0.2928 | 1.9916 | 0.8415 | 0.8412 | 0.1406 | 0.0484 | | 0.7757 | 40.0 | 40000 | 2.5528 | 0.8415 | 0.2891 | 1.9512 | 0.8415 | 0.8420 | 0.1406 | 0.0502 | | 0.7788 | 41.0 | 41000 | 2.5829 | 0.8383 | 0.2983 | 1.9420 | 0.8383 | 0.8377 | 0.1438 | 0.0530 | | 0.7743 | 42.0 | 42000 | 2.5285 | 0.838 | 0.2948 | 1.9636 | 0.838 | 0.8383 | 0.1449 | 0.0486 | | 0.768 | 43.0 | 43000 | 2.5130 | 0.8405 | 0.2906 | 1.9448 | 0.8405 | 0.8404 | 0.1418 | 0.0479 | | 0.7681 | 44.0 | 44000 | 2.5347 | 0.8383 | 0.2951 | 1.9588 | 0.8383 | 0.8390 | 0.1446 | 0.0508 | | 0.7662 | 45.0 | 45000 | 2.5246 | 0.8413 | 0.2900 | 1.9314 | 0.8413 | 0.8416 | 0.1415 | 0.0508 | | 0.7629 | 46.0 | 46000 | 2.5246 | 0.8397 | 0.2913 | 1.9648 | 0.8397 | 0.8403 | 0.1433 | 0.0498 | | 0.7632 | 47.0 | 47000 | 2.5217 | 0.8425 | 0.2892 | 1.9648 | 0.8425 | 0.8429 | 0.1409 | 0.0503 | | 0.7598 | 48.0 | 48000 | 2.5163 | 0.8435 | 0.2881 | 1.9776 | 0.8435 | 0.8439 | 0.1407 | 0.0500 | | 0.7609 | 49.0 
| 49000 | 2.5187 | 0.8438 | 0.2885 | 1.9677 | 0.8438 | 0.8442 | 0.1401 | 0.0500 | | 0.759 | 50.0 | 50000 | 2.5149 | 0.8438 | 0.2886 | 1.9666 | 0.8438 | 0.8442 | 0.1400 | 0.0499 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
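All of these runs use `lr_scheduler_type: linear` with `lr_scheduler_warmup_ratio: 0.1`. A small self-contained sketch of the resulting learning-rate curve — intended to mirror what a linear-warmup/linear-decay scheduler computes, though the exact scheduler implementation used in training is not shown here:

```python
def linear_warmup_lr(step, total_steps, base_lr=1e-4, warmup_ratio=0.1):
    """LR rises linearly to base_lr over the warmup, then decays linearly to 0."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# For this run: 50 epochs x 1000 steps/epoch = 50000 steps, 5000 of them warmup.
```

So the peak learning rate 1e-4 is reached at step 5000 and decays back to zero by step 50000.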
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_simkd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_simkd This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.0864 - Accuracy: 0.842 - Brier Loss: 0.4773 - Nll: 1.3055 - F1 Micro: 0.842 - F1 Macro: 0.8427 - Ece: 0.4404 - Aurc: 0.0865 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 0.1387 | 1.0 | 1000 | 0.1293 | 0.4412 | 0.8613 | 3.7912 | 0.4412 | 0.3863 | 0.3246 | 0.3321 | | 0.1154 | 2.0 | 2000 | 0.1132 | 0.668 | 0.7441 | 2.1523 | 0.668 | 0.6478 | 0.4699 | 0.1467 | | 0.1074 | 3.0 | 3000 | 0.1075 | 0.73 | 0.7150 | 1.9686 | 0.7300 | 0.7323 | 0.5177 | 0.1104 | | 0.1036 | 4.0 | 4000 | 0.1068 | 0.7258 | 0.6943 | 2.0361 | 0.7258 | 0.7299 | 0.4903 | 0.1318 | | 0.0996 | 5.0 | 5000 | 0.1047 | 0.742 | 0.6897 | 2.1166 | 0.7420 | 0.7456 | 0.5061 | 0.1124 | | 0.0967 | 6.0 | 6000 | 0.0992 | 0.78 | 0.6307 | 1.8492 | 0.78 | 0.7875 | 0.5018 | 0.0924 | | 0.0923 | 7.0 | 7000 | 0.0985 | 0.774 | 0.6055 | 2.0449 | 0.774 | 0.7800 | 0.4698 | 0.1056 | | 0.0893 | 
8.0 | 8000 | 0.0982 | 0.7817 | 0.6012 | 1.9476 | 0.7817 | 0.7814 | 0.4696 | 0.1243 | | 0.0871 | 9.0 | 9000 | 0.0954 | 0.8043 | 0.5826 | 1.7573 | 0.8043 | 0.8065 | 0.4925 | 0.0811 | | 0.0857 | 10.0 | 10000 | 0.0969 | 0.784 | 0.5672 | 2.0878 | 0.7840 | 0.7846 | 0.4417 | 0.1076 | | 0.083 | 11.0 | 11000 | 0.0934 | 0.8067 | 0.5658 | 1.6628 | 0.8067 | 0.8062 | 0.4809 | 0.0788 | | 0.0819 | 12.0 | 12000 | 0.0930 | 0.8027 | 0.5499 | 1.6718 | 0.8027 | 0.8035 | 0.4592 | 0.0851 | | 0.081 | 13.0 | 13000 | 0.0937 | 0.7957 | 0.5579 | 1.7282 | 0.7957 | 0.7956 | 0.4544 | 0.0948 | | 0.079 | 14.0 | 14000 | 0.0939 | 0.794 | 0.5463 | 1.9406 | 0.7940 | 0.7974 | 0.4344 | 0.1142 | | 0.0779 | 15.0 | 15000 | 0.0912 | 0.81 | 0.5206 | 1.6776 | 0.81 | 0.8141 | 0.4371 | 0.0947 | | 0.0767 | 16.0 | 16000 | 0.0905 | 0.813 | 0.5165 | 1.6744 | 0.813 | 0.8140 | 0.4383 | 0.0864 | | 0.0766 | 17.0 | 17000 | 0.0911 | 0.8113 | 0.5239 | 1.7109 | 0.8113 | 0.8104 | 0.4428 | 0.0902 | | 0.0762 | 18.0 | 18000 | 0.0914 | 0.8093 | 0.5153 | 1.6778 | 0.8093 | 0.8098 | 0.4290 | 0.0998 | | 0.0759 | 19.0 | 19000 | 0.0904 | 0.8163 | 0.5076 | 1.6946 | 0.8163 | 0.8178 | 0.4333 | 0.0939 | | 0.075 | 20.0 | 20000 | 0.0897 | 0.8133 | 0.5062 | 1.5892 | 0.8133 | 0.8155 | 0.4300 | 0.0898 | | 0.0743 | 21.0 | 21000 | 0.0895 | 0.8147 | 0.5058 | 1.5900 | 0.8148 | 0.8149 | 0.4315 | 0.0917 | | 0.0745 | 22.0 | 22000 | 0.0898 | 0.8157 | 0.5014 | 1.5523 | 0.8157 | 0.8164 | 0.4287 | 0.0848 | | 0.0737 | 23.0 | 23000 | 0.0901 | 0.8127 | 0.5038 | 1.6625 | 0.8128 | 0.8146 | 0.4219 | 0.0978 | | 0.0735 | 24.0 | 24000 | 0.0907 | 0.8117 | 0.5082 | 1.6475 | 0.8117 | 0.8133 | 0.4231 | 0.1064 | | 0.0732 | 25.0 | 25000 | 0.0901 | 0.8103 | 0.5041 | 1.6830 | 0.8103 | 0.8105 | 0.4187 | 0.1017 | | 0.0727 | 26.0 | 26000 | 0.0899 | 0.8135 | 0.5015 | 1.6499 | 0.8135 | 0.8170 | 0.4197 | 0.1020 | | 0.0722 | 27.0 | 27000 | 0.0880 | 0.8265 | 0.4931 | 1.4651 | 0.8265 | 0.8273 | 0.4330 | 0.0975 | | 0.0718 | 28.0 | 28000 | 0.0876 | 0.8263 | 0.4917 | 1.4213 | 
0.8263 | 0.8275 | 0.4354 | 0.0858 | | 0.0725 | 29.0 | 29000 | 0.0891 | 0.8247 | 0.4930 | 1.5581 | 0.8247 | 0.8254 | 0.4288 | 0.0946 | | 0.0717 | 30.0 | 30000 | 0.0879 | 0.8327 | 0.4913 | 1.4417 | 0.8327 | 0.8326 | 0.4403 | 0.0888 | | 0.0715 | 31.0 | 31000 | 0.0872 | 0.8375 | 0.4866 | 1.3775 | 0.8375 | 0.8389 | 0.4435 | 0.0872 | | 0.0715 | 32.0 | 32000 | 0.0884 | 0.8297 | 0.4915 | 1.5136 | 0.8297 | 0.8305 | 0.4331 | 0.0946 | | 0.0717 | 33.0 | 33000 | 0.0877 | 0.8347 | 0.4851 | 1.4096 | 0.8347 | 0.8347 | 0.4375 | 0.0845 | | 0.0716 | 34.0 | 34000 | 0.0880 | 0.8323 | 0.4866 | 1.4547 | 0.8323 | 0.8333 | 0.4323 | 0.0926 | | 0.0713 | 35.0 | 35000 | 0.0873 | 0.8343 | 0.4833 | 1.3884 | 0.8343 | 0.8351 | 0.4375 | 0.0810 | | 0.0713 | 36.0 | 36000 | 0.0873 | 0.8365 | 0.4843 | 1.4168 | 0.8365 | 0.8372 | 0.4381 | 0.0913 | | 0.071 | 37.0 | 37000 | 0.0871 | 0.8393 | 0.4831 | 1.3524 | 0.8393 | 0.8399 | 0.4412 | 0.0882 | | 0.0709 | 38.0 | 38000 | 0.0877 | 0.834 | 0.4862 | 1.4457 | 0.834 | 0.8353 | 0.4371 | 0.0929 | | 0.071 | 39.0 | 39000 | 0.0870 | 0.836 | 0.4811 | 1.3954 | 0.836 | 0.8367 | 0.4360 | 0.0886 | | 0.0708 | 40.0 | 40000 | 0.0867 | 0.8387 | 0.4800 | 1.3687 | 0.8387 | 0.8403 | 0.4390 | 0.0867 | | 0.0706 | 41.0 | 41000 | 0.0866 | 0.8395 | 0.4802 | 1.3464 | 0.8395 | 0.8399 | 0.4412 | 0.0860 | | 0.0708 | 42.0 | 42000 | 0.0868 | 0.8363 | 0.4796 | 1.3828 | 0.8363 | 0.8371 | 0.4345 | 0.0886 | | 0.0709 | 43.0 | 43000 | 0.0866 | 0.838 | 0.4790 | 1.3503 | 0.838 | 0.8390 | 0.4382 | 0.0860 | | 0.0702 | 44.0 | 44000 | 0.0866 | 0.8415 | 0.4787 | 1.3679 | 0.8415 | 0.8425 | 0.4403 | 0.0899 | | 0.0703 | 45.0 | 45000 | 0.0866 | 0.8373 | 0.4788 | 1.3192 | 0.8373 | 0.8379 | 0.4374 | 0.0863 | | 0.0702 | 46.0 | 46000 | 0.0865 | 0.841 | 0.4776 | 1.3357 | 0.841 | 0.8417 | 0.4398 | 0.0871 | | 0.0703 | 47.0 | 47000 | 0.0864 | 0.8417 | 0.4772 | 1.3302 | 0.8417 | 0.8424 | 0.4406 | 0.0859 | | 0.0705 | 48.0 | 48000 | 0.0865 | 0.841 | 0.4776 | 1.3096 | 0.841 | 0.8417 | 0.4398 | 0.0877 | | 0.0703 | 49.0 
| 49000 | 0.0864 | 0.8413 | 0.4775 | 1.3022 | 0.8413 | 0.8419 | 0.4400 | 0.0865 | | 0.0703 | 50.0 | 50000 | 0.0864 | 0.842 | 0.4773 | 1.3055 | 0.842 | 0.8427 | 0.4404 | 0.0865 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
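The Brier Loss column is the multiclass Brier score: the mean squared distance between the predicted probability vector and the one-hot label. An illustrative numpy version — note that normalization conventions differ (some definitions average over classes as well), so the exact convention behind the table is an assumption:

```python
import numpy as np

def brier_score(probs, labels):
    """probs: (N, C) predicted probabilities; labels: (N,) integer class ids."""
    probs = np.asarray(probs, dtype=float)
    onehot = np.eye(probs.shape[1])[np.asarray(labels)]
    # per-sample squared distance to the one-hot target, averaged over samples
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))
```

A perfectly confident correct prediction contributes 0; a uniform two-class prediction contributes 0.5.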
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_og_simkd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_og_simkd This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 12194.7598 - Accuracy: 0.8558 - Brier Loss: 0.2688 - Nll: 1.9967 - F1 Micro: 0.8558 - F1 Macro: 0.8552 - Ece: 0.1305 - Aurc: 0.0490 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 12767.556 | 1.0 | 1000 | 12472.2930 | 0.5725 | 0.5598 | 2.8765 | 0.5725 | 0.5254 | 0.0980 | 0.1794 | | 12750.588 | 2.0 | 2000 | 12455.9170 | 0.6683 | 0.4822 | 2.7098 | 0.6683 | 0.6534 | 0.1247 | 0.1325 | | 12820.858 | 3.0 | 3000 | 12458.1924 | 0.7003 | 0.4459 | 2.6913 | 0.7003 | 0.7019 | 0.1303 | 0.0964 | | 12762.296 | 4.0 | 4000 | 12445.5703 | 0.7167 | 0.4202 | 2.8379 | 0.7168 | 0.7217 | 0.1078 | 0.0925 | | 12706.14 | 5.0 | 5000 | 12425.8330 | 0.753 | 0.3820 | 2.7925 | 0.753 | 0.7539 | 0.0997 | 0.0923 | | 12764.822 | 6.0 | 6000 | 12427.2080 | 0.7635 | 0.3603 | 2.5903 | 0.7635 | 0.7659 | 0.0950 | 0.0823 | | 12719.869 | 7.0 | 7000 | 12411.4668 | 0.769 | 
0.3469 | 2.7566 | 0.769 | 0.7761 | 0.0895 | 0.0681 | | 12628.481 | 8.0 | 8000 | 12412.3760 | 0.7738 | 0.3535 | 2.7667 | 0.7738 | 0.7832 | 0.1127 | 0.0699 | | 12624.542 | 9.0 | 9000 | 12396.7773 | 0.7933 | 0.3243 | 2.4484 | 0.7932 | 0.7954 | 0.1002 | 0.0664 | | 12681.642 | 10.0 | 10000 | 12391.2744 | 0.7943 | 0.3241 | 2.5709 | 0.7943 | 0.7979 | 0.1081 | 0.0592 | | 12656.593 | 11.0 | 11000 | 12383.5020 | 0.8015 | 0.3190 | 2.4516 | 0.8015 | 0.8065 | 0.1064 | 0.0597 | | 12638.155 | 12.0 | 12000 | 12372.9707 | 0.7957 | 0.3357 | 2.4891 | 0.7957 | 0.7956 | 0.1225 | 0.0679 | | 12698.474 | 13.0 | 13000 | 12370.7217 | 0.813 | 0.2988 | 2.1414 | 0.813 | 0.8125 | 0.1030 | 0.0494 | | 12574.549 | 14.0 | 14000 | 12361.6641 | 0.8045 | 0.3218 | 2.4610 | 0.8045 | 0.8043 | 0.1155 | 0.0560 | | 12589.537 | 15.0 | 15000 | 12345.1123 | 0.8193 | 0.3046 | 2.2566 | 0.8193 | 0.8184 | 0.1220 | 0.0524 | | 12592.604 | 16.0 | 16000 | 12354.9756 | 0.817 | 0.3078 | 2.3526 | 0.817 | 0.8207 | 0.1204 | 0.0527 | | 12660.709 | 17.0 | 17000 | 12334.7686 | 0.8293 | 0.2942 | 2.2857 | 0.8293 | 0.8284 | 0.1201 | 0.0482 | | 12591.369 | 18.0 | 18000 | 12334.4570 | 0.829 | 0.2948 | 2.1559 | 0.8290 | 0.8287 | 0.1211 | 0.0451 | | 12598.469 | 19.0 | 19000 | 12320.7510 | 0.826 | 0.2997 | 2.2348 | 0.826 | 0.8251 | 0.1240 | 0.0473 | | 12497.537 | 20.0 | 20000 | 12307.0811 | 0.8347 | 0.2833 | 2.2433 | 0.8347 | 0.8358 | 0.1200 | 0.0426 | | 12537.66 | 21.0 | 21000 | 12310.8438 | 0.8323 | 0.2965 | 2.1513 | 0.8323 | 0.8321 | 0.1287 | 0.0490 | | 12524.668 | 22.0 | 22000 | 12300.1055 | 0.8403 | 0.2776 | 2.1780 | 0.8403 | 0.8407 | 0.1207 | 0.0427 | | 12433.952 | 23.0 | 23000 | 12288.1221 | 0.8353 | 0.2898 | 2.2189 | 0.8353 | 0.8357 | 0.1346 | 0.0439 | | 12598.38 | 24.0 | 24000 | 12282.6680 | 0.8442 | 0.2765 | 2.1653 | 0.8443 | 0.8440 | 0.1264 | 0.0438 | | 12474.447 | 25.0 | 25000 | 12277.6797 | 0.8363 | 0.2925 | 2.1209 | 0.8363 | 0.8350 | 0.1366 | 0.0451 | | 12522.706 | 26.0 | 26000 | 12276.4502 | 0.8465 | 0.2764 | 2.0779 | 
0.8465 | 0.8469 | 0.1291 | 0.0432 | | 12502.289 | 27.0 | 27000 | 12268.1758 | 0.8445 | 0.2811 | 2.0839 | 0.8445 | 0.8442 | 0.1318 | 0.0465 | | 12465.994 | 28.0 | 28000 | 12252.7266 | 0.8433 | 0.2882 | 2.1410 | 0.8433 | 0.8431 | 0.1380 | 0.0479 | | 12467.13 | 29.0 | 29000 | 12260.4912 | 0.8442 | 0.2838 | 2.1129 | 0.8443 | 0.8430 | 0.1348 | 0.0487 | | 12540.006 | 30.0 | 30000 | 12249.1670 | 0.846 | 0.2811 | 2.1134 | 0.8460 | 0.8458 | 0.1349 | 0.0486 | | 12594.326 | 31.0 | 31000 | 12245.6699 | 0.8452 | 0.2850 | 2.0734 | 0.8452 | 0.8443 | 0.1363 | 0.0480 | | 12486.203 | 32.0 | 32000 | 12240.5479 | 0.8468 | 0.2813 | 2.0757 | 0.8468 | 0.8463 | 0.1353 | 0.0484 | | 12468.631 | 33.0 | 33000 | 12231.9600 | 0.852 | 0.2715 | 2.0178 | 0.852 | 0.8523 | 0.1309 | 0.0450 | | 12423.715 | 34.0 | 34000 | 12215.6680 | 0.8472 | 0.2843 | 2.0927 | 0.8472 | 0.8470 | 0.1389 | 0.0491 | | 12454.715 | 35.0 | 35000 | 12223.0361 | 0.8492 | 0.2772 | 2.0161 | 0.8492 | 0.8485 | 0.1340 | 0.0476 | | 12466.932 | 36.0 | 36000 | 12221.3887 | 0.8495 | 0.2776 | 2.0135 | 0.8495 | 0.8488 | 0.1343 | 0.0467 | | 12483.745 | 37.0 | 37000 | 12210.9414 | 0.8508 | 0.2748 | 2.0374 | 0.8508 | 0.8506 | 0.1350 | 0.0493 | | 12453.102 | 38.0 | 38000 | 12224.9482 | 0.852 | 0.2737 | 1.9699 | 0.852 | 0.8517 | 0.1308 | 0.0460 | | 12511.225 | 39.0 | 39000 | 12213.9756 | 0.8522 | 0.2763 | 1.9619 | 0.8522 | 0.8518 | 0.1342 | 0.0484 | | 12561.782 | 40.0 | 40000 | 12213.9297 | 0.852 | 0.2736 | 2.0481 | 0.852 | 0.8516 | 0.1326 | 0.0477 | | 12524.982 | 41.0 | 41000 | 12208.1758 | 0.8518 | 0.2745 | 1.9751 | 0.8518 | 0.8509 | 0.1346 | 0.0490 | | 12465.351 | 42.0 | 42000 | 12215.3604 | 0.8532 | 0.2730 | 2.0037 | 0.8532 | 0.8521 | 0.1314 | 0.0474 | | 12419.902 | 43.0 | 43000 | 12211.3701 | 0.8565 | 0.2680 | 2.0140 | 0.8565 | 0.8561 | 0.1297 | 0.0462 | | 12493.264 | 44.0 | 44000 | 12196.7217 | 0.8532 | 0.2717 | 1.9866 | 0.8532 | 0.8524 | 0.1336 | 0.0487 | | 12487.514 | 45.0 | 45000 | 12199.4902 | 0.8532 | 0.2700 | 1.9711 | 0.8532 | 
0.8523 | 0.1309 | 0.0478 | | 12321.575 | 46.0 | 46000 | 12189.0117 | 0.8535 | 0.2727 | 2.0220 | 0.8535 | 0.8528 | 0.1335 | 0.0492 | | 12423.494 | 47.0 | 47000 | 12198.2002 | 0.8542 | 0.2711 | 1.9648 | 0.8542 | 0.8536 | 0.1331 | 0.0478 | | 12535.605 | 48.0 | 48000 | 12192.7061 | 0.8565 | 0.2678 | 2.0098 | 0.8565 | 0.8560 | 0.1292 | 0.0489 | | 12319.588 | 49.0 | 49000 | 12185.3916 | 0.856 | 0.2691 | 2.0285 | 0.856 | 0.8554 | 0.1311 | 0.0506 | | 12470.527 | 50.0 | 50000 | 12194.7598 | 0.8558 | 0.2688 | 1.9967 | 0.8558 | 0.8552 | 0.1305 | 0.0490 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
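The Aurc column is the area under the risk-coverage curve: examples are ranked by confidence, and the cumulative error rate is averaged as coverage grows from the most-confident prediction to all predictions. A compact sketch of one common discrete estimator — the precise estimator used for the table is an assumption:

```python
import numpy as np

def aurc(confidences, correct):
    """Mean cumulative risk when examples are added most-confident first."""
    order = np.argsort(-np.asarray(confidences, dtype=float))
    errors = 1.0 - np.asarray(correct, dtype=float)[order]
    cum_risk = np.cumsum(errors) / np.arange(1, len(errors) + 1)
    return float(cum_risk.mean())
```

Lower is better: a model whose mistakes all sit at the low-confidence end of the ranking gets a small AURC even at the same accuracy.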
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
annazhong/vit-base-patch16-224-finetuned-attention-fixation-3-classes
# vit-base-patch16-224-finetuned-attention-fixation-3-classes

This model is a fine-tuned version of [annazhong/vit-base-patch16-224-finetuned-attention-fixation-3-classes](https://huggingface.co/annazhong/vit-base-patch16-224-finetuned-attention-fixation-3-classes) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1739
- Accuracy: 0.4127

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 150
- eval_batch_size: 150
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 600
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 1    | 1.3475          | 0.4444   |
| No log        | 2.0   | 2    | 1.3280          | 0.4127   |
| No log        | 3.0   | 3    | 1.1739          | 0.4127   |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.2
- Tokenizers 0.13.3
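This run reaches its total train batch size of 600 through gradient accumulation (150 per step × 4 accumulation steps). A minimal numpy sketch of the idea — accumulate micro-batch gradients, then apply one optimizer step on their average (a plain SGD-style step here, purely for illustration; the actual run uses Adam):

```python
import numpy as np

def accumulated_update(param, grads_per_microbatch, lr=5e-5, accum_steps=4):
    """Average accum_steps micro-batch gradients, then apply a single step."""
    accum = np.zeros_like(np.asarray(param, dtype=float))
    for g in grads_per_microbatch[:accum_steps]:
        accum += np.asarray(g, dtype=float)
    return param - lr * accum / accum_steps
```

The averaged gradient approximates what one large batch of 600 would produce, at the memory cost of a 150-sample batch.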
[ "label_0", "label_1", "label_2" ]
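Several of these cards use gradient accumulation — e.g. the card above combines a per-device batch of 150 with 4 accumulation steps for an effective (total) train batch of 600. A pure-Python sketch of the pattern, with a scalar parameter and toy gradient values standing in for a real model:

```python
def train_with_accumulation(micro_grads, accum_steps, lr=0.1):
    """Average gradients over `accum_steps` micro-batches, then apply
    one optimizer step per window. Returns (param, n_updates)."""
    w, buf, updates = 0.0, 0.0, 0
    for i, g in enumerate(micro_grads, 1):
        buf += g / accum_steps          # accumulate the averaged gradient
        if i % accum_steps == 0:
            w -= lr * buf               # one optimizer step per window
            buf, updates = 0.0, updates + 1
    return w, updates
```

Eight micro-batches with `accum_steps=4` therefore yield only two optimizer updates, which is why `total_train_batch_size` in these cards equals `train_batch_size * gradient_accumulation_steps`.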
minatosnow/swinv2-base-patch4-window8-256-mineral
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swinv2-base-patch4-window8-256-mineral This model is a fine-tuned version of [microsoft/swinv2-base-patch4-window8-256](https://huggingface.co/microsoft/swinv2-base-patch4-window8-256) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 5.1457 - Accuracy: 0.245 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 500 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:------:|:----:|:---------------:|:--------:| | 5.6939 | 0.96 | 18 | 5.6774 | 0.0067 | | 5.6713 | 1.97 | 37 | 5.6701 | 0.005 | | 5.665 | 2.99 | 56 | 5.6592 | 0.0033 | | 5.6497 | 4.0 | 75 | 5.6487 | 0.005 | | 5.6375 | 4.96 | 93 | 5.6375 | 0.0083 | | 5.6141 | 5.97 | 112 | 5.6255 | 0.0133 | | 5.5981 | 6.99 | 131 | 5.6123 | 0.0167 | | 5.5718 | 8.0 | 150 | 5.5961 | 0.0117 | | 5.5526 | 8.96 | 168 | 5.5777 | 0.0117 | | 5.5 | 9.97 | 187 | 5.5554 | 0.015 | | 5.4391 | 10.99 | 206 | 5.5290 | 0.0167 | | 5.3112 | 12.0 | 225 | 5.4914 | 0.0217 | | 5.25 | 12.96 | 243 | 5.4305 | 0.0283 | | 5.1361 | 13.97 | 262 | 5.3729 | 0.045 | | 4.985 | 14.99 | 281 | 5.2795 | 0.0483 | | 4.8031 | 16.0 | 300 | 5.1991 | 0.0633 | | 4.6843 | 16.96 | 318 | 5.0976 | 0.0683 | | 4.3959 | 17.97 | 337 | 4.9828 | 0.0783 | | 4.2277 | 18.99 | 356 | 4.8522 | 0.0883 | | 3.9594 | 20.0 | 375 | 4.7467 
| 0.1017 | | 3.7637 | 20.96 | 393 | 4.6450 | 0.1183 | | 3.4748 | 21.97 | 412 | 4.5736 | 0.1317 | | 3.2937 | 22.99 | 431 | 4.4707 | 0.13 | | 3.1059 | 24.0 | 450 | 4.4103 | 0.1567 | | 2.8278 | 24.96 | 468 | 4.3476 | 0.1417 | | 2.7093 | 25.97 | 487 | 4.2624 | 0.1517 | | 2.4771 | 26.99 | 506 | 4.2065 | 0.1667 | | 2.3164 | 28.0 | 525 | 4.1656 | 0.17 | | 2.1912 | 28.96 | 543 | 4.1606 | 0.1617 | | 1.9385 | 29.97 | 562 | 4.1411 | 0.1717 | | 1.7912 | 30.99 | 581 | 4.1030 | 0.1667 | | 1.6852 | 32.0 | 600 | 4.0695 | 0.1833 | | 1.478 | 32.96 | 618 | 4.0546 | 0.1717 | | 1.3769 | 33.97 | 637 | 4.0407 | 0.1917 | | 1.2175 | 34.99 | 656 | 4.0224 | 0.19 | | 1.1616 | 36.0 | 675 | 4.0421 | 0.1867 | | 1.0305 | 36.96 | 693 | 4.0470 | 0.1933 | | 0.9513 | 37.97 | 712 | 4.0161 | 0.195 | | 0.8428 | 38.99 | 731 | 4.0153 | 0.2033 | | 0.7385 | 40.0 | 750 | 4.0600 | 0.1867 | | 0.667 | 40.96 | 768 | 4.0568 | 0.215 | | 0.6017 | 41.97 | 787 | 4.0459 | 0.2033 | | 0.6252 | 42.99 | 806 | 4.0209 | 0.2183 | | 0.5604 | 44.0 | 825 | 4.1177 | 0.2133 | | 0.4691 | 44.96 | 843 | 4.0966 | 0.1967 | | 0.4445 | 45.97 | 862 | 4.1254 | 0.215 | | 0.4151 | 46.99 | 881 | 4.1253 | 0.2133 | | 0.3903 | 48.0 | 900 | 4.1026 | 0.195 | | 0.3551 | 48.96 | 918 | 4.1544 | 0.2033 | | 0.3656 | 49.97 | 937 | 4.2111 | 0.2117 | | 0.3651 | 50.99 | 956 | 4.2143 | 0.1983 | | 0.3282 | 52.0 | 975 | 4.1838 | 0.2217 | | 0.2737 | 52.96 | 993 | 4.1866 | 0.2083 | | 0.2867 | 53.97 | 1012 | 4.2628 | 0.1983 | | 0.3202 | 54.99 | 1031 | 4.2542 | 0.2117 | | 0.2761 | 56.0 | 1050 | 4.2476 | 0.2183 | | 0.262 | 56.96 | 1068 | 4.2466 | 0.21 | | 0.2779 | 57.97 | 1087 | 4.2886 | 0.215 | | 0.2278 | 58.99 | 1106 | 4.2835 | 0.22 | | 0.2597 | 60.0 | 1125 | 4.2966 | 0.2183 | | 0.2364 | 60.96 | 1143 | 4.3238 | 0.2167 | | 0.2523 | 61.97 | 1162 | 4.2324 | 0.2083 | | 0.2199 | 62.99 | 1181 | 4.3234 | 0.2033 | | 0.2264 | 64.0 | 1200 | 4.3092 | 0.2167 | | 0.2004 | 64.96 | 1218 | 4.3842 | 0.2017 | | 0.1958 | 65.97 | 1237 | 4.3319 | 0.2033 | | 0.1942 | 66.99 | 1256 | 
4.3161 | 0.225 | | 0.2094 | 68.0 | 1275 | 4.3120 | 0.2267 | | 0.1826 | 68.96 | 1293 | 4.3541 | 0.2267 | | 0.1984 | 69.97 | 1312 | 4.3720 | 0.2317 | | 0.1863 | 70.99 | 1331 | 4.3854 | 0.225 | | 0.2019 | 72.0 | 1350 | 4.3635 | 0.215 | | 0.1679 | 72.96 | 1368 | 4.3958 | 0.2183 | | 0.1765 | 73.97 | 1387 | 4.4110 | 0.2133 | | 0.1647 | 74.99 | 1406 | 4.4675 | 0.2167 | | 0.151 | 76.0 | 1425 | 4.4919 | 0.2183 | | 0.1621 | 76.96 | 1443 | 4.4519 | 0.2183 | | 0.1417 | 77.97 | 1462 | 4.4599 | 0.2017 | | 0.1685 | 78.99 | 1481 | 4.5555 | 0.2133 | | 0.1637 | 80.0 | 1500 | 4.4596 | 0.2117 | | 0.1692 | 80.96 | 1518 | 4.4194 | 0.2233 | | 0.1771 | 81.97 | 1537 | 4.4831 | 0.2217 | | 0.1433 | 82.99 | 1556 | 4.5271 | 0.1983 | | 0.1615 | 84.0 | 1575 | 4.4425 | 0.21 | | 0.1605 | 84.96 | 1593 | 4.4559 | 0.2183 | | 0.1402 | 85.97 | 1612 | 4.5123 | 0.21 | | 0.1639 | 86.99 | 1631 | 4.5368 | 0.2117 | | 0.1333 | 88.0 | 1650 | 4.5109 | 0.2217 | | 0.1405 | 88.96 | 1668 | 4.5220 | 0.2217 | | 0.1427 | 89.97 | 1687 | 4.5414 | 0.225 | | 0.1346 | 90.99 | 1706 | 4.4954 | 0.2317 | | 0.1494 | 92.0 | 1725 | 4.5996 | 0.2217 | | 0.1432 | 92.96 | 1743 | 4.5152 | 0.2167 | | 0.1276 | 93.97 | 1762 | 4.5623 | 0.2217 | | 0.1344 | 94.99 | 1781 | 4.5049 | 0.2233 | | 0.1157 | 96.0 | 1800 | 4.5413 | 0.22 | | 0.1324 | 96.96 | 1818 | 4.5660 | 0.225 | | 0.1262 | 97.97 | 1837 | 4.5358 | 0.2317 | | 0.1299 | 98.99 | 1856 | 4.6085 | 0.225 | | 0.1077 | 100.0 | 1875 | 4.5516 | 0.2267 | | 0.1386 | 100.96 | 1893 | 4.5751 | 0.23 | | 0.1463 | 101.97 | 1912 | 4.6149 | 0.2117 | | 0.0921 | 102.99 | 1931 | 4.5543 | 0.2383 | | 0.1094 | 104.0 | 1950 | 4.6144 | 0.2283 | | 0.1327 | 104.96 | 1968 | 4.5966 | 0.2283 | | 0.101 | 105.97 | 1987 | 4.5869 | 0.22 | | 0.1275 | 106.99 | 2006 | 4.5588 | 0.2367 | | 0.1175 | 108.0 | 2025 | 4.6091 | 0.2133 | | 0.1031 | 108.96 | 2043 | 4.6347 | 0.22 | | 0.1086 | 109.97 | 2062 | 4.5894 | 0.215 | | 0.1158 | 110.99 | 2081 | 4.6496 | 0.2267 | | 0.1091 | 112.0 | 2100 | 4.6218 | 0.22 | | 0.0885 | 112.96 | 
2118 | 4.6319 | 0.2133 | | 0.0732 | 113.97 | 2137 | 4.7100 | 0.21 | | 0.113 | 114.99 | 2156 | 4.6596 | 0.21 | | 0.0872 | 116.0 | 2175 | 4.6277 | 0.2217 | | 0.0862 | 116.96 | 2193 | 4.6262 | 0.2333 | | 0.0762 | 117.97 | 2212 | 4.6078 | 0.2267 | | 0.0916 | 118.99 | 2231 | 4.6198 | 0.215 | | 0.1158 | 120.0 | 2250 | 4.6560 | 0.225 | | 0.083 | 120.96 | 2268 | 4.5878 | 0.2267 | | 0.0855 | 121.97 | 2287 | 4.6666 | 0.24 | | 0.0963 | 122.99 | 2306 | 4.6186 | 0.2233 | | 0.088 | 124.0 | 2325 | 4.6250 | 0.2433 | | 0.0867 | 124.96 | 2343 | 4.6591 | 0.24 | | 0.0858 | 125.97 | 2362 | 4.6467 | 0.2267 | | 0.0928 | 126.99 | 2381 | 4.6457 | 0.2233 | | 0.1189 | 128.0 | 2400 | 4.7195 | 0.2183 | | 0.0816 | 128.96 | 2418 | 4.6935 | 0.2233 | | 0.09 | 129.97 | 2437 | 4.7062 | 0.2183 | | 0.0958 | 130.99 | 2456 | 4.6826 | 0.2267 | | 0.0583 | 132.0 | 2475 | 4.6903 | 0.23 | | 0.0878 | 132.96 | 2493 | 4.7640 | 0.2217 | | 0.0967 | 133.97 | 2512 | 4.7564 | 0.2333 | | 0.0902 | 134.99 | 2531 | 4.7551 | 0.225 | | 0.0709 | 136.0 | 2550 | 4.7972 | 0.23 | | 0.0724 | 136.96 | 2568 | 4.8073 | 0.22 | | 0.0901 | 137.97 | 2587 | 4.7568 | 0.2283 | | 0.0968 | 138.99 | 2606 | 4.7389 | 0.2383 | | 0.075 | 140.0 | 2625 | 4.7521 | 0.2283 | | 0.0955 | 140.96 | 2643 | 4.7091 | 0.2217 | | 0.0648 | 141.97 | 2662 | 4.7522 | 0.2283 | | 0.0792 | 142.99 | 2681 | 4.7606 | 0.2233 | | 0.0828 | 144.0 | 2700 | 4.7383 | 0.23 | | 0.0722 | 144.96 | 2718 | 4.7851 | 0.2383 | | 0.0958 | 145.97 | 2737 | 4.7559 | 0.235 | | 0.0761 | 146.99 | 2756 | 4.8086 | 0.23 | | 0.0926 | 148.0 | 2775 | 4.7643 | 0.2317 | | 0.0735 | 148.96 | 2793 | 4.7936 | 0.23 | | 0.0957 | 149.97 | 2812 | 4.7229 | 0.23 | | 0.0804 | 150.99 | 2831 | 4.7744 | 0.2133 | | 0.0807 | 152.0 | 2850 | 4.8072 | 0.2117 | | 0.0761 | 152.96 | 2868 | 4.7777 | 0.2283 | | 0.0752 | 153.97 | 2887 | 4.8695 | 0.2183 | | 0.0588 | 154.99 | 2906 | 4.8292 | 0.2283 | | 0.0735 | 156.0 | 2925 | 4.7753 | 0.2233 | | 0.0867 | 156.96 | 2943 | 4.7585 | 0.225 | | 0.094 | 157.97 | 2962 | 4.8235 | 
0.2317 | | 0.0908 | 158.99 | 2981 | 4.8720 | 0.23 | | 0.0697 | 160.0 | 3000 | 4.8698 | 0.22 | | 0.0701 | 160.96 | 3018 | 4.8097 | 0.2283 | | 0.0804 | 161.97 | 3037 | 4.7381 | 0.2217 | | 0.0874 | 162.99 | 3056 | 4.7458 | 0.23 | | 0.0661 | 164.0 | 3075 | 4.7571 | 0.225 | | 0.0715 | 164.96 | 3093 | 4.8221 | 0.2367 | | 0.0599 | 165.97 | 3112 | 4.6964 | 0.2467 | | 0.0708 | 166.99 | 3131 | 4.8396 | 0.2317 | | 0.0633 | 168.0 | 3150 | 4.8040 | 0.22 | | 0.0531 | 168.96 | 3168 | 4.8200 | 0.2217 | | 0.0689 | 169.97 | 3187 | 4.8330 | 0.2333 | | 0.0502 | 170.99 | 3206 | 4.8688 | 0.2283 | | 0.0607 | 172.0 | 3225 | 4.8696 | 0.2333 | | 0.0705 | 172.96 | 3243 | 4.8514 | 0.225 | | 0.0719 | 173.97 | 3262 | 4.8854 | 0.2267 | | 0.0852 | 174.99 | 3281 | 4.8463 | 0.2233 | | 0.0525 | 176.0 | 3300 | 4.8801 | 0.2383 | | 0.069 | 176.96 | 3318 | 4.9139 | 0.2383 | | 0.0716 | 177.97 | 3337 | 4.9129 | 0.225 | | 0.0681 | 178.99 | 3356 | 4.8785 | 0.2283 | | 0.0656 | 180.0 | 3375 | 4.8776 | 0.24 | | 0.0866 | 180.96 | 3393 | 4.8923 | 0.24 | | 0.0588 | 181.97 | 3412 | 4.8773 | 0.2317 | | 0.0542 | 182.99 | 3431 | 4.8637 | 0.2417 | | 0.0508 | 184.0 | 3450 | 4.8914 | 0.23 | | 0.0591 | 184.96 | 3468 | 4.8835 | 0.225 | | 0.0626 | 185.97 | 3487 | 4.9119 | 0.2317 | | 0.0708 | 186.99 | 3506 | 4.8882 | 0.2217 | | 0.0651 | 188.0 | 3525 | 4.8978 | 0.22 | | 0.0609 | 188.96 | 3543 | 4.8657 | 0.2267 | | 0.0519 | 189.97 | 3562 | 4.9336 | 0.21 | | 0.0687 | 190.99 | 3581 | 4.8825 | 0.2217 | | 0.0401 | 192.0 | 3600 | 4.9984 | 0.2133 | | 0.0532 | 192.96 | 3618 | 4.8797 | 0.24 | | 0.0515 | 193.97 | 3637 | 4.8716 | 0.2317 | | 0.0735 | 194.99 | 3656 | 4.9249 | 0.2217 | | 0.0708 | 196.0 | 3675 | 4.9456 | 0.2083 | | 0.0612 | 196.96 | 3693 | 4.9355 | 0.23 | | 0.0511 | 197.97 | 3712 | 4.9061 | 0.215 | | 0.0573 | 198.99 | 3731 | 4.9662 | 0.2117 | | 0.0482 | 200.0 | 3750 | 4.9992 | 0.2133 | | 0.0454 | 200.96 | 3768 | 5.0024 | 0.2217 | | 0.072 | 201.97 | 3787 | 4.9026 | 0.2283 | | 0.0583 | 202.99 | 3806 | 4.9121 | 0.2283 | | 
0.0655 | 204.0 | 3825 | 4.9426 | 0.23 | | 0.0375 | 204.96 | 3843 | 4.9971 | 0.2267 | | 0.0652 | 205.97 | 3862 | 4.9414 | 0.2283 | | 0.0514 | 206.99 | 3881 | 5.0227 | 0.2333 | | 0.0677 | 208.0 | 3900 | 5.0196 | 0.2217 | | 0.0805 | 208.96 | 3918 | 4.9773 | 0.2233 | | 0.0556 | 209.97 | 3937 | 4.9303 | 0.235 | | 0.0508 | 210.99 | 3956 | 4.8852 | 0.2333 | | 0.0471 | 212.0 | 3975 | 4.9395 | 0.24 | | 0.047 | 212.96 | 3993 | 4.9143 | 0.23 | | 0.0438 | 213.97 | 4012 | 4.8727 | 0.2333 | | 0.0769 | 214.99 | 4031 | 4.9319 | 0.235 | | 0.0516 | 216.0 | 4050 | 4.9909 | 0.225 | | 0.0413 | 216.96 | 4068 | 4.9651 | 0.2167 | | 0.0402 | 217.97 | 4087 | 4.9235 | 0.2233 | | 0.036 | 218.99 | 4106 | 4.9646 | 0.2267 | | 0.0375 | 220.0 | 4125 | 4.9655 | 0.2267 | | 0.0566 | 220.96 | 4143 | 4.9532 | 0.23 | | 0.0591 | 221.97 | 4162 | 4.9400 | 0.2317 | | 0.0602 | 222.99 | 4181 | 4.9306 | 0.2417 | | 0.0339 | 224.0 | 4200 | 4.9925 | 0.2383 | | 0.0628 | 224.96 | 4218 | 4.9416 | 0.24 | | 0.0495 | 225.97 | 4237 | 4.9398 | 0.25 | | 0.0545 | 226.99 | 4256 | 4.9522 | 0.2367 | | 0.045 | 228.0 | 4275 | 4.8830 | 0.2367 | | 0.036 | 228.96 | 4293 | 4.9543 | 0.235 | | 0.0477 | 229.97 | 4312 | 4.9769 | 0.2267 | | 0.0427 | 230.99 | 4331 | 4.9233 | 0.23 | | 0.0419 | 232.0 | 4350 | 4.9587 | 0.24 | | 0.0402 | 232.96 | 4368 | 4.9977 | 0.2233 | | 0.0468 | 233.97 | 4387 | 5.0033 | 0.24 | | 0.0554 | 234.99 | 4406 | 4.9908 | 0.24 | | 0.0503 | 236.0 | 4425 | 4.9143 | 0.2433 | | 0.0462 | 236.96 | 4443 | 4.8740 | 0.245 | | 0.0354 | 237.97 | 4462 | 4.9383 | 0.2333 | | 0.0381 | 238.99 | 4481 | 5.0228 | 0.2467 | | 0.0358 | 240.0 | 4500 | 5.0114 | 0.2417 | | 0.0264 | 240.96 | 4518 | 5.0244 | 0.23 | | 0.0512 | 241.97 | 4537 | 5.0987 | 0.22 | | 0.0555 | 242.99 | 4556 | 4.9946 | 0.2233 | | 0.0545 | 244.0 | 4575 | 5.0295 | 0.23 | | 0.0395 | 244.96 | 4593 | 5.0320 | 0.225 | | 0.0512 | 245.97 | 4612 | 4.9736 | 0.2317 | | 0.0343 | 246.99 | 4631 | 4.9499 | 0.2383 | | 0.0506 | 248.0 | 4650 | 4.9478 | 0.24 | | 0.0642 | 248.96 | 4668 | 
4.9233 | 0.2417 | | 0.035 | 249.97 | 4687 | 4.9348 | 0.2317 | | 0.0408 | 250.99 | 4706 | 4.9228 | 0.2483 | | 0.0431 | 252.0 | 4725 | 5.0074 | 0.24 | | 0.0273 | 252.96 | 4743 | 4.9480 | 0.235 | | 0.0495 | 253.97 | 4762 | 4.9539 | 0.2333 | | 0.0431 | 254.99 | 4781 | 5.0146 | 0.245 | | 0.0339 | 256.0 | 4800 | 5.0659 | 0.225 | | 0.0544 | 256.96 | 4818 | 5.0378 | 0.2417 | | 0.0365 | 257.97 | 4837 | 5.0133 | 0.2433 | | 0.047 | 258.99 | 4856 | 4.9964 | 0.2367 | | 0.048 | 260.0 | 4875 | 5.0102 | 0.2383 | | 0.0469 | 260.96 | 4893 | 4.9275 | 0.245 | | 0.0399 | 261.97 | 4912 | 4.9794 | 0.2483 | | 0.0372 | 262.99 | 4931 | 4.9892 | 0.2383 | | 0.0538 | 264.0 | 4950 | 4.9841 | 0.2333 | | 0.0402 | 264.96 | 4968 | 5.0038 | 0.2383 | | 0.0507 | 265.97 | 4987 | 5.0184 | 0.2333 | | 0.0472 | 266.99 | 5006 | 4.9750 | 0.245 | | 0.047 | 268.0 | 5025 | 5.0005 | 0.2533 | | 0.0436 | 268.96 | 5043 | 4.9527 | 0.25 | | 0.0323 | 269.97 | 5062 | 4.9495 | 0.2517 | | 0.0454 | 270.99 | 5081 | 4.9883 | 0.2433 | | 0.0477 | 272.0 | 5100 | 4.9656 | 0.25 | | 0.053 | 272.96 | 5118 | 4.9082 | 0.245 | | 0.0398 | 273.97 | 5137 | 4.9618 | 0.245 | | 0.0347 | 274.99 | 5156 | 4.9594 | 0.2467 | | 0.0447 | 276.0 | 5175 | 4.9556 | 0.2417 | | 0.0312 | 276.96 | 5193 | 4.9943 | 0.2283 | | 0.0521 | 277.97 | 5212 | 5.0603 | 0.2367 | | 0.0386 | 278.99 | 5231 | 5.0410 | 0.2183 | | 0.0276 | 280.0 | 5250 | 4.9705 | 0.2383 | | 0.0532 | 280.96 | 5268 | 4.9813 | 0.235 | | 0.0627 | 281.97 | 5287 | 4.9953 | 0.2317 | | 0.0455 | 282.99 | 5306 | 4.9607 | 0.2433 | | 0.0418 | 284.0 | 5325 | 4.9816 | 0.2417 | | 0.0355 | 284.96 | 5343 | 4.9670 | 0.245 | | 0.0307 | 285.97 | 5362 | 4.9994 | 0.2367 | | 0.0504 | 286.99 | 5381 | 5.0265 | 0.24 | | 0.0368 | 288.0 | 5400 | 5.0336 | 0.235 | | 0.035 | 288.96 | 5418 | 5.0561 | 0.2417 | | 0.0518 | 289.97 | 5437 | 5.0769 | 0.2433 | | 0.0383 | 290.99 | 5456 | 5.0629 | 0.235 | | 0.0381 | 292.0 | 5475 | 4.9694 | 0.2367 | | 0.0307 | 292.96 | 5493 | 4.9370 | 0.2417 | | 0.0447 | 293.97 | 5512 | 4.9549 | 
0.2333 | | 0.0499 | 294.99 | 5531 | 4.9883 | 0.2333 | | 0.045 | 296.0 | 5550 | 4.9923 | 0.24 | | 0.0579 | 296.96 | 5568 | 4.9793 | 0.24 | | 0.0527 | 297.97 | 5587 | 5.0027 | 0.2333 | | 0.0353 | 298.99 | 5606 | 5.0196 | 0.2283 | | 0.0247 | 300.0 | 5625 | 5.0492 | 0.24 | | 0.0339 | 300.96 | 5643 | 5.0539 | 0.2317 | | 0.0525 | 301.97 | 5662 | 5.1091 | 0.2233 | | 0.0339 | 302.99 | 5681 | 5.0659 | 0.2333 | | 0.0406 | 304.0 | 5700 | 5.0559 | 0.2233 | | 0.0252 | 304.96 | 5718 | 5.0228 | 0.2233 | | 0.0358 | 305.97 | 5737 | 5.0661 | 0.2333 | | 0.031 | 306.99 | 5756 | 5.0571 | 0.2383 | | 0.0334 | 308.0 | 5775 | 5.0452 | 0.2417 | | 0.0368 | 308.96 | 5793 | 5.0873 | 0.225 | | 0.042 | 309.97 | 5812 | 5.1043 | 0.24 | | 0.0375 | 310.99 | 5831 | 5.0588 | 0.2383 | | 0.026 | 312.0 | 5850 | 5.0150 | 0.2417 | | 0.0326 | 312.96 | 5868 | 5.0101 | 0.2433 | | 0.0286 | 313.97 | 5887 | 5.0106 | 0.245 | | 0.0164 | 314.99 | 5906 | 5.0428 | 0.24 | | 0.0274 | 316.0 | 5925 | 5.0479 | 0.2317 | | 0.0263 | 316.96 | 5943 | 4.9870 | 0.24 | | 0.0371 | 317.97 | 5962 | 4.9846 | 0.2417 | | 0.0355 | 318.99 | 5981 | 5.0133 | 0.2367 | | 0.0342 | 320.0 | 6000 | 5.0306 | 0.2467 | | 0.0377 | 320.96 | 6018 | 5.0073 | 0.235 | | 0.0369 | 321.97 | 6037 | 4.9949 | 0.2517 | | 0.0375 | 322.99 | 6056 | 5.0409 | 0.2483 | | 0.0287 | 324.0 | 6075 | 5.0325 | 0.24 | | 0.0319 | 324.96 | 6093 | 5.0241 | 0.245 | | 0.0315 | 325.97 | 6112 | 5.0320 | 0.235 | | 0.0404 | 326.99 | 6131 | 5.0403 | 0.2383 | | 0.0287 | 328.0 | 6150 | 5.0792 | 0.2383 | | 0.0611 | 328.96 | 6168 | 5.0378 | 0.2367 | | 0.0296 | 329.97 | 6187 | 5.0192 | 0.2333 | | 0.0259 | 330.99 | 6206 | 5.0362 | 0.2317 | | 0.0257 | 332.0 | 6225 | 5.0494 | 0.2433 | | 0.0294 | 332.96 | 6243 | 5.0462 | 0.2383 | | 0.0483 | 333.97 | 6262 | 5.0474 | 0.225 | | 0.0332 | 334.99 | 6281 | 5.0521 | 0.2233 | | 0.0278 | 336.0 | 6300 | 5.0483 | 0.2367 | | 0.0241 | 336.96 | 6318 | 5.0617 | 0.2367 | | 0.0356 | 337.97 | 6337 | 5.0549 | 0.2367 | | 0.0326 | 338.99 | 6356 | 5.0920 | 0.225 | | 
0.0255 | 340.0 | 6375 | 5.1311 | 0.2283 | | 0.0415 | 340.96 | 6393 | 5.1072 | 0.2267 | | 0.0241 | 341.97 | 6412 | 5.0731 | 0.2217 | | 0.0343 | 342.99 | 6431 | 5.0496 | 0.2283 | | 0.0196 | 344.0 | 6450 | 5.0186 | 0.2383 | | 0.0451 | 344.96 | 6468 | 5.0546 | 0.235 | | 0.0291 | 345.97 | 6487 | 5.0705 | 0.2367 | | 0.0244 | 346.99 | 6506 | 5.0857 | 0.2267 | | 0.0356 | 348.0 | 6525 | 5.0919 | 0.2367 | | 0.03 | 348.96 | 6543 | 5.0770 | 0.2333 | | 0.033 | 349.97 | 6562 | 5.0676 | 0.2333 | | 0.0231 | 350.99 | 6581 | 5.0522 | 0.2383 | | 0.0348 | 352.0 | 6600 | 5.0550 | 0.2283 | | 0.0234 | 352.96 | 6618 | 5.0709 | 0.2317 | | 0.0302 | 353.97 | 6637 | 5.0728 | 0.2283 | | 0.0269 | 354.99 | 6656 | 5.1108 | 0.2217 | | 0.0287 | 356.0 | 6675 | 5.1299 | 0.2283 | | 0.0459 | 356.96 | 6693 | 5.0675 | 0.2333 | | 0.0357 | 357.97 | 6712 | 5.0613 | 0.235 | | 0.0254 | 358.99 | 6731 | 5.0390 | 0.245 | | 0.0208 | 360.0 | 6750 | 5.0600 | 0.2333 | | 0.0264 | 360.96 | 6768 | 5.1158 | 0.24 | | 0.0248 | 361.97 | 6787 | 5.1090 | 0.2417 | | 0.0289 | 362.99 | 6806 | 5.1165 | 0.2417 | | 0.0234 | 364.0 | 6825 | 5.0914 | 0.24 | | 0.0354 | 364.96 | 6843 | 5.0802 | 0.2317 | | 0.0261 | 365.97 | 6862 | 5.0958 | 0.2433 | | 0.0413 | 366.99 | 6881 | 5.1149 | 0.2333 | | 0.0328 | 368.0 | 6900 | 5.1452 | 0.2333 | | 0.0285 | 368.96 | 6918 | 5.1633 | 0.2317 | | 0.035 | 369.97 | 6937 | 5.1334 | 0.2367 | | 0.0223 | 370.99 | 6956 | 5.1195 | 0.2483 | | 0.0216 | 372.0 | 6975 | 5.0994 | 0.24 | | 0.0246 | 372.96 | 6993 | 5.0976 | 0.2467 | | 0.0195 | 373.97 | 7012 | 5.1302 | 0.2383 | | 0.0323 | 374.99 | 7031 | 5.1269 | 0.245 | | 0.0402 | 376.0 | 7050 | 5.1254 | 0.23 | | 0.0249 | 376.96 | 7068 | 5.1320 | 0.2267 | | 0.0282 | 377.97 | 7087 | 5.1341 | 0.2233 | | 0.0284 | 378.99 | 7106 | 5.1457 | 0.2267 | | 0.0247 | 380.0 | 7125 | 5.1302 | 0.2283 | | 0.0254 | 380.96 | 7143 | 5.1563 | 0.2317 | | 0.0307 | 381.97 | 7162 | 5.1569 | 0.2283 | | 0.025 | 382.99 | 7181 | 5.1598 | 0.2267 | | 0.0438 | 384.0 | 7200 | 5.1443 | 0.23 | | 
0.0318 | 384.96 | 7218 | 5.1208 | 0.2367 | | 0.0131 | 385.97 | 7237 | 5.0810 | 0.2317 | | 0.0223 | 386.99 | 7256 | 5.0733 | 0.235 | | 0.0212 | 388.0 | 7275 | 5.1003 | 0.23 | | 0.0269 | 388.96 | 7293 | 5.1205 | 0.2283 | | 0.0334 | 389.97 | 7312 | 5.1224 | 0.2283 | | 0.0285 | 390.99 | 7331 | 5.1208 | 0.2283 | | 0.0427 | 392.0 | 7350 | 5.1224 | 0.2383 | | 0.0228 | 392.96 | 7368 | 5.0950 | 0.2333 | | 0.0144 | 393.97 | 7387 | 5.0910 | 0.235 | | 0.0293 | 394.99 | 7406 | 5.0855 | 0.2483 | | 0.0135 | 396.0 | 7425 | 5.0941 | 0.24 | | 0.0297 | 396.96 | 7443 | 5.1149 | 0.2367 | | 0.0242 | 397.97 | 7462 | 5.1468 | 0.24 | | 0.0188 | 398.99 | 7481 | 5.1389 | 0.24 | | 0.013 | 400.0 | 7500 | 5.1085 | 0.24 | | 0.0215 | 400.96 | 7518 | 5.1079 | 0.2417 | | 0.0246 | 401.97 | 7537 | 5.1225 | 0.2433 | | 0.0381 | 402.99 | 7556 | 5.1049 | 0.2333 | | 0.0268 | 404.0 | 7575 | 5.0964 | 0.2367 | | 0.022 | 404.96 | 7593 | 5.1123 | 0.2417 | | 0.0258 | 405.97 | 7612 | 5.1242 | 0.2367 | | 0.0234 | 406.99 | 7631 | 5.1538 | 0.2367 | | 0.0346 | 408.0 | 7650 | 5.1587 | 0.24 | | 0.0251 | 408.96 | 7668 | 5.1695 | 0.2367 | | 0.0319 | 409.97 | 7687 | 5.1242 | 0.2367 | | 0.0296 | 410.99 | 7706 | 5.1253 | 0.245 | | 0.0311 | 412.0 | 7725 | 5.1319 | 0.235 | | 0.0246 | 412.96 | 7743 | 5.1435 | 0.2383 | | 0.0148 | 413.97 | 7762 | 5.1314 | 0.2433 | | 0.0196 | 414.99 | 7781 | 5.1186 | 0.2433 | | 0.0206 | 416.0 | 7800 | 5.1153 | 0.24 | | 0.0377 | 416.96 | 7818 | 5.1252 | 0.23 | | 0.0246 | 417.97 | 7837 | 5.1409 | 0.235 | | 0.0115 | 418.99 | 7856 | 5.1439 | 0.235 | | 0.0139 | 420.0 | 7875 | 5.1273 | 0.2333 | | 0.0238 | 420.96 | 7893 | 5.1244 | 0.2333 | | 0.0228 | 421.97 | 7912 | 5.1404 | 0.2333 | | 0.0276 | 422.99 | 7931 | 5.1417 | 0.23 | | 0.0276 | 424.0 | 7950 | 5.1281 | 0.235 | | 0.0186 | 424.96 | 7968 | 5.1225 | 0.24 | | 0.0463 | 425.97 | 7987 | 5.1386 | 0.2333 | | 0.0192 | 426.99 | 8006 | 5.1411 | 0.2383 | | 0.0277 | 428.0 | 8025 | 5.1424 | 0.235 | | 0.0304 | 428.96 | 8043 | 5.1354 | 0.23 | | 0.0303 | 429.97 | 
8062 | 5.1346 | 0.2383 | | 0.0236 | 430.99 | 8081 | 5.1426 | 0.2367 | | 0.0279 | 432.0 | 8100 | 5.1394 | 0.24 | | 0.0218 | 432.96 | 8118 | 5.1427 | 0.2383 | | 0.0194 | 433.97 | 8137 | 5.1346 | 0.235 | | 0.0116 | 434.99 | 8156 | 5.1279 | 0.2333 | | 0.0201 | 436.0 | 8175 | 5.1297 | 0.2333 | | 0.0319 | 436.96 | 8193 | 5.1245 | 0.235 | | 0.0225 | 437.97 | 8212 | 5.1166 | 0.2367 | | 0.0176 | 438.99 | 8231 | 5.1184 | 0.2333 | | 0.015 | 440.0 | 8250 | 5.1266 | 0.235 | | 0.0226 | 440.96 | 8268 | 5.1210 | 0.2317 | | 0.0158 | 441.97 | 8287 | 5.1206 | 0.2383 | | 0.0182 | 442.99 | 8306 | 5.1153 | 0.2367 | | 0.0318 | 444.0 | 8325 | 5.1076 | 0.235 | | 0.0283 | 444.96 | 8343 | 5.1095 | 0.2333 | | 0.0265 | 445.97 | 8362 | 5.1310 | 0.2333 | | 0.0187 | 446.99 | 8381 | 5.1357 | 0.235 | | 0.02 | 448.0 | 8400 | 5.1346 | 0.2283 | | 0.0238 | 448.96 | 8418 | 5.1394 | 0.23 | | 0.0176 | 449.97 | 8437 | 5.1368 | 0.2333 | | 0.0193 | 450.99 | 8456 | 5.1376 | 0.235 | | 0.034 | 452.0 | 8475 | 5.1447 | 0.2383 | | 0.0204 | 452.96 | 8493 | 5.1470 | 0.235 | | 0.0188 | 453.97 | 8512 | 5.1478 | 0.235 | | 0.0256 | 454.99 | 8531 | 5.1441 | 0.2367 | | 0.026 | 456.0 | 8550 | 5.1448 | 0.235 | | 0.0179 | 456.96 | 8568 | 5.1457 | 0.2317 | | 0.0233 | 457.97 | 8587 | 5.1456 | 0.235 | | 0.0186 | 458.99 | 8606 | 5.1381 | 0.235 | | 0.0203 | 460.0 | 8625 | 5.1357 | 0.2333 | | 0.0266 | 460.96 | 8643 | 5.1313 | 0.2417 | | 0.0242 | 461.97 | 8662 | 5.1319 | 0.2417 | | 0.0234 | 462.99 | 8681 | 5.1338 | 0.24 | | 0.0184 | 464.0 | 8700 | 5.1343 | 0.2417 | | 0.016 | 464.96 | 8718 | 5.1374 | 0.2383 | | 0.0357 | 465.97 | 8737 | 5.1385 | 0.24 | | 0.021 | 466.99 | 8756 | 5.1406 | 0.2417 | | 0.0209 | 468.0 | 8775 | 5.1426 | 0.2417 | | 0.0186 | 468.96 | 8793 | 5.1428 | 0.2417 | | 0.027 | 469.97 | 8812 | 5.1442 | 0.2417 | | 0.0146 | 470.99 | 8831 | 5.1449 | 0.2433 | | 0.0237 | 472.0 | 8850 | 5.1456 | 0.2433 | | 0.0147 | 472.96 | 8868 | 5.1467 | 0.245 | | 0.0268 | 473.97 | 8887 | 5.1457 | 0.245 | | 0.015 | 474.99 | 8906 | 5.1465 | 
0.2433 | | 0.0116 | 476.0 | 8925 | 5.1462 | 0.2433 | | 0.0151 | 476.96 | 8943 | 5.1453 | 0.2433 | | 0.0225 | 477.97 | 8962 | 5.1453 | 0.245 | | 0.0267 | 478.99 | 8981 | 5.1456 | 0.245 | | 0.031 | 480.0 | 9000 | 5.1457 | 0.245 | ### Framework versions - Transformers 4.31.0 - Pytorch 1.13.1 - Datasets 2.14.0 - Tokenizers 0.13.3
[ "acanthite", "actinolite", "adamite", "aegirine", "afghanite", "agate", "alabandite", "albite", "almandine", "amethyst", "analcime", "anatase", "andalusite", "andradite", "anglesite", "anhydrite", "annabergite", "anorthite", "anorthoclase", "antimony", "apatite", "apophyllite", "aquamarine", "aragonite", "arfvedsonite", "arsenic", "arsenopyrite", "augite", "aurichalcite", "axinite", "azurite", "babingtonite", "barite", "bauxite", "benitoite", "beryl", "biotite", "bismuth", "bismuthinite", "bixbyite", "boehmite", "boleite", "boracite", "borax", "bornite", "boulangerite", "bournonite", "brochantite", "bromargyrite", "brookite", "brucite", "buergerite", "bustamite", "bytownite", "calaverite", "calcite", "calomel", "carletonite", "cassiterite", "celestine", "cerussite", "cervantite", "chabazite", "chalcanthite", "chalcedony", "chalcocite", "chalcopyrite", "chambersite", "chamosite", "chlorargyrite", "chlorite", "chondrodite", "chromite", "chrysoberyl", "chrysocolla", "cinnabar", "citrine", "clinochlore", "clinohumite", "clinozoisite", "cobaltite", "coesite", "colemanite", "columbite", "cookeite", "copper", "cordierite", "corundum", "covellite", "cristobalite", "crocoite", "cryolite", "cumengeite", "cuprite", "cyanotrichite", "danburite", "datolite", "diamond", "diaspore", "diopside", "dioptase", "dolomite", "dravite", "dumortierite", "edenite", "elbaite", "emerald", "enargite", "enstatite", "epidote", "epistilbite", "erythrite", "euclase", "ferberite", "fluorite", "franklinite", "galena", "garnet", "gaspeite", "gibbsite", "glauberite", "glaucophane", "gmelinite", "goethite", "gold", "goosecreekite", "graphite", "greenockite", "grossular", "gummite", "gypsum", "halite", "hastingsite", "hauyne", "hedenbergite", "hematite", "hemimorphite", "heulandite", "hornblende", "howlite", "huebnerite", "humite", "hydrozincite", "ice", "ilmenite", "inesite", "iodargyrite", "iron-nickel", "jadeite", "jamesonite", "jarosite", "jasper", "jeremejevite", "johannsenite", "kaolinite", 
"kernite", "krennerite", "kyanite", "labradorite", "laumontite", "lawsonite", "lazulite", "lazurite", "lead", "lepidolite", "leucite", "liddicoatite", "limonite", "linarite", "lithiophilite", "loellingite", "luzonite", "magnesiochromite", "magnesite", "magnetite", "malachite", "manganite", "manganoneptunite", "marcasite", "marialite", "meionite", "melanophlogite", "mercury", "mesolite", "metacinnabar", "metastibnite", "metavariscite", "miargyrite", "microcline", "millerite", "mimetite", "molybdenite", "monazite", "muscovite", "natrolite", "nepheline", "neptunite", "nickeline", "nitratine", "norbergite", "nosean", "oligoclase", "olivine", "olmiite", "opal", "orpiment", "orthoclase", "otavite", "paradamite", "pararealgar", "pargasite", "pearceite", "pectolite", "pentlandite", "pezzottaite", "phenakite", "phlogopite", "phosgenite", "plagioclase", "platinum", "polybasite", "powellite", "prehnite", "proustite", "psilomelane", "pumpellyite", "pyrargyrite", "pyrite", "pyrolusite", "pyromorphite", "pyrope", "pyrophyllite", "pyroxmangite", "pyrrhotite", "quartz", "rammelsbergite", "raspite", "realgar", "rheniite", "rhodochrosite", "rhodonite", "riebeckite", "romanechite", "rutile", "talc", "tantalite", "tellurium", "tennantite", "tephroite", "tetrahedrite", "thenardite", "thomsonite", "thorite", "tincalconite", "titanite", "topaz", "tourmaline", "tremolite", "tridymite", "triphylite", "turquoise", "ulexite", "uraninite", "uvarovite", "uvite", "vanadinite", "variscite", "vesuvianite", "vivianite", "water", "wavellite", "willemite", "witherite", "wolframite", "wollastonite", "wulfenite", "wurtzite", "xanthoconite", "xenotime", "zincite", "zircon", "zoisite" ]
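Each card's label array maps class indices to names in order, so decoding a model output is an argmax into that list. A minimal sketch — the three-label subset and logits here are illustrative only:

```python
labels = ["acanthite", "actinolite", "adamite"]  # illustrative subset of the list above

def decode(logits, labels):
    """Return the label name for the highest-scoring class index."""
    idx = max(range(len(logits)), key=logits.__getitem__)
    return labels[idx]
```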
omarhkh/resnet-50-finetuned-omars1
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # resnet-50-finetuned-omars1 This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 1.2536 - Accuracy: 0.6667 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0005 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 30 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.3877 | 1.0 | 11 | 1.3919 | 0.2564 | | 1.383 | 2.0 | 22 | 1.3813 | 0.3077 | | 1.366 | 3.0 | 33 | 1.3663 | 0.3077 | | 1.348 | 4.0 | 44 | 1.3393 | 0.4103 | | 1.3034 | 5.0 | 55 | 1.2699 | 0.5641 | | 1.2227 | 6.0 | 66 | 1.1615 | 0.6154 | | 1.0912 | 7.0 | 77 | 1.1262 | 0.6154 | | 0.9553 | 8.0 | 88 | 1.1313 | 0.5897 | | 0.8801 | 9.0 | 99 | 1.1711 | 0.6667 | | 0.8017 | 10.0 | 110 | 1.0136 | 0.6667 | | 0.7451 | 11.0 | 121 | 0.9310 | 0.6923 | | 0.6817 | 12.0 | 132 | 0.8635 | 0.6667 | | 0.6579 | 13.0 | 143 | 1.1545 | 0.6667 | | 0.6357 | 14.0 | 154 | 0.9239 | 0.6154 | | 0.6006 | 15.0 | 165 | 1.0271 | 0.6667 | | 0.5551 | 16.0 | 176 | 1.1781 | 0.5897 | | 0.5619 | 17.0 | 187 | 1.1831 | 0.6923 | | 0.5359 | 18.0 | 198 | 0.9667 | 0.6667 | | 0.5247 | 19.0 | 209 | 1.1237 | 0.6667 | | 0.5134 | 20.0 | 220 | 1.1176 | 0.6410 | | 0.4469 | 21.0 | 231 | 0.9955 | 0.7179 | | 0.4908 | 
22.0 | 242 | 1.1411 | 0.7179 | | 0.4112 | 23.0 | 253 | 1.2766 | 0.6410 | | 0.4225 | 24.0 | 264 | 1.1135 | 0.6923 | | 0.4786 | 25.0 | 275 | 1.2243 | 0.7179 | | 0.3908 | 26.0 | 286 | 1.1587 | 0.7179 | | 0.4706 | 27.0 | 297 | 1.2236 | 0.6923 | | 0.502 | 28.0 | 308 | 1.1733 | 0.7179 | | 0.4514 | 29.0 | 319 | 1.0931 | 0.7436 | | 0.4386 | 30.0 | 330 | 1.2536 | 0.6667 | ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu117 - Datasets 2.13.0 - Tokenizers 0.13.3
[ "depthread", "depthunread", "widthread", "widthunread" ]
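All of these cards use a linear learning-rate scheduler with `lr_scheduler_warmup_ratio: 0.1`. A sketch of that schedule as a function of step, assuming linear warmup from zero followed by linear decay to zero (the 5e-4 base rate matches the resnet-50 card above; the 330-step horizon is illustrative):

```python
def linear_schedule_with_warmup(step, total_steps, warmup_ratio=0.1, base_lr=5e-4):
    """Linear warmup over the first `warmup_ratio` of steps, then linear decay to 0."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```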
frncscp/patacoswin
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # patacoswin This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0203 - Accuracy: 1.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 0.96 | 6 | 0.4003 | 0.875 | | 0.5091 | 1.92 | 12 | 0.1308 | 0.9886 | | 0.5091 | 2.88 | 18 | 0.0522 | 0.9886 | | 0.1585 | 4.0 | 25 | 0.0203 | 1.0 | | 0.0925 | 4.96 | 31 | 0.0156 | 1.0 | | 0.0925 | 5.92 | 37 | 0.0196 | 1.0 | | 0.0539 | 6.88 | 43 | 0.0095 | 1.0 | | 0.0397 | 8.0 | 50 | 0.0089 | 1.0 | | 0.0397 | 8.96 | 56 | 0.0089 | 1.0 | | 0.0378 | 9.6 | 60 | 0.0090 | 1.0 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.1 - Tokenizers 0.13.3
[ "patacon-false", "patacon-true" ]
frncscp/focalnet-small-patacon
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # focal-patacotron This model is a fine-tuned version of [microsoft/focalnet-small](https://huggingface.co/microsoft/focalnet-small) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0897 - Accuracy: 0.9659 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0575 | 0.97 | 24 | 0.1208 | 0.9545 | | 0.0275 | 1.94 | 48 | 0.0897 | 0.9659 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.1 - Tokenizers 0.13.3
[ "patacon-false", "patacon-true" ]
jordyvl/vit-base_rvl_cdip_crl_softmax_rank1_fixed
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vit-base_rvl_cdip_crl_softmax_rank1_fixed

This model is a fine-tuned version of [jordyvl/vit-base_rvl-cdip](https://huggingface.co/jordyvl/vit-base_rvl-cdip) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7091
- Accuracy: 0.9032
- Brier Loss: 0.1756
- Nll: 1.0964
- F1 Micro: 0.9032
- F1 Macro: 0.9033
- Ece: 0.0854
- Aurc: 0.0188

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step   | Validation Loss | Accuracy | Brier Loss | Nll    | F1 Micro | F1 Macro | Ece    | Aurc   |
|:-------------:|:-----:|:------:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| 0.2289 | 1.0 | 10000 | 0.4298 | 0.8826 | 0.1794 | 1.2461 | 0.8826 | 0.8841 | 0.0553 | 0.0199 |
| 0.1972 | 2.0 | 20000 | 0.4350 | 0.8859 | 0.1769 | 1.3140 | 0.8859 | 0.8862 | 0.0558 | 0.0197 |
| 0.1414 | 3.0 | 30000 | 0.4423 | 0.8938 | 0.1702 | 1.2433 | 0.8938 | 0.8948 | 0.0639 | 0.0181 |
| 0.0903 | 4.0 | 40000 | 0.5076 | 0.8943 | 0.1753 | 1.2033 | 0.8943 | 0.8941 | 0.0766 | 0.0181 |
| 0.0684 | 5.0 | 50000 | 0.5592 | 0.8963 | 0.1783 | 1.2422 | 0.8963 | 0.8965 | 0.0811 | 0.0194 |
| 0.0313 | 6.0 | 60000 | 0.6384 | 0.8956 | 0.1836 | 1.2359 | 0.8957 | 0.8957 | 0.0861 | 0.0218 |
| 0.0163 | 7.0 | 70000 | 0.6673 | 0.9005 | 0.1788 | 1.1927 | 0.9005 | 0.9006 | 0.0855 | 0.0215 |
| 0.0104 | 8.0 | 80000 | 0.6929 | 0.9001 | 0.1791 | 1.1768 | 0.9001 | 0.9000 | 0.0860 | 0.0204 |
| 0.0036 | 9.0 | 90000 | 0.7131 | 0.9018 | 0.1780 | 1.1295 | 0.9018 | 0.9018 | 0.0866 | 0.0195 |
| 0.0023 | 10.0 | 100000 | 0.7091 | 0.9032 | 0.1756 | 1.0964 | 0.9032 | 0.9033 | 0.0854 | 0.0188 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
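The card above reports Brier Loss and Ece alongside accuracy. As a reference point, here is a minimal NumPy sketch of how these two calibration metrics are typically computed from softmax probabilities; the function names and the equal-width binning choice are illustrative, not taken from this model's evaluation code.

```python
import numpy as np

def brier_score(probs, labels):
    """Mean squared error between predicted probabilities and one-hot labels."""
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs, labels, n_bins=10):
    """Weighted average gap between mean confidence and accuracy per confidence bin."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    ece = 0.0
    for lo in np.linspace(0, 1, n_bins, endpoint=False):
        mask = (conf > lo) & (conf <= lo + 1.0 / n_bins)
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return float(ece)

probs = np.array([[0.9, 0.1], [0.6, 0.4], [0.2, 0.8]])
labels = np.array([0, 1, 1])
brier = brier_score(probs, labels)
ece = expected_calibration_error(probs, labels)
```

Lower is better for both: the Brier score penalizes probability mass placed on wrong classes, while ECE only measures the confidence/accuracy mismatch.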
rdmpage/autotrain-bwpages-78824141089
# Model Trained Using AutoTrain

- Problem type: Multi-class Classification
- Model ID: 78824141089
- CO2 Emissions (in grams): 0.4838

## Validation Metrics

- Loss: 0.136
- Accuracy: 0.925
- Macro F1: 0.930
- Micro F1: 0.925
- Weighted F1: 0.924
- Macro Precision: 0.948
- Micro Precision: 0.925
- Weighted Precision: 0.928
- Macro Recall: 0.918
- Micro Recall: 0.925
- Weighted Recall: 0.925
[ "blank", "content", "end", "start" ]
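The card above reports macro, micro, and weighted F1 side by side. A minimal NumPy sketch of how the three averages differ (the function name is illustrative, not AutoTrain's internal code): macro averages per-class F1 equally, weighted averages by class support, and micro (for single-label classification) reduces to plain accuracy.

```python
import numpy as np

def f1_scores(y_true, y_pred, n_classes):
    """Return (macro, micro, weighted) F1 for single-label predictions."""
    f1, support = [], []
    for c in range(n_classes):
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        denom = 2 * tp + fp + fn
        f1.append(2 * tp / denom if denom else 0.0)
        support.append(np.sum(y_true == c))
    f1, support = np.array(f1), np.array(support)
    macro = float(f1.mean())                          # every class counts equally
    weighted = float((f1 * support).sum() / support.sum())  # weighted by class size
    micro = float(np.mean(y_true == y_pred))          # == accuracy for single-label
    return macro, micro, weighted

y_true = np.array([0, 0, 0, 1, 1, 2])
y_pred = np.array([0, 0, 1, 1, 1, 0])
macro, micro, weighted = f1_scores(y_true, y_pred, 3)
```

A macro F1 above the micro F1, as on this card, indicates the model does comparatively well on the rarer classes.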
jordyvl/vit-base_rvl-cdip-small_rvl_cdip-NK1000_kd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vit-base_rvl-cdip-small_rvl_cdip-NK1000_kd

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6000
- Accuracy: 0.5805
- Brier Loss: 0.6398
- Nll: 2.9515
- F1 Micro: 0.5805
- F1 Macro: 0.5810
- Ece: 0.2379
- Aurc: 0.2036

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll    | F1 Micro | F1 Macro | Ece    | Aurc   |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 125 | 2.4623 | 0.1792 | 0.9035 | 7.3776 | 0.1792 | 0.1152 | 0.0726 | 0.7119 |
| No log | 2.0 | 250 | 2.2456 | 0.218 | 0.8696 | 3.5941 | 0.218 | 0.1605 | 0.0646 | 0.6549 |
| No log | 3.0 | 375 | 2.0626 | 0.285 | 0.8239 | 3.2463 | 0.285 | 0.2178 | 0.0561 | 0.5455 |
| 2.2649 | 4.0 | 500 | 1.8007 | 0.3902 | 0.7487 | 3.1874 | 0.3902 | 0.3500 | 0.0567 | 0.4166 |
| 2.2649 | 5.0 | 625 | 1.6948 | 0.4228 | 0.7156 | 3.1334 | 0.4228 | 0.3809 | 0.0669 | 0.3709 |
| 2.2649 | 6.0 | 750 | 1.5414 | 0.4725 | 0.6656 | 2.8978 | 0.4725 | 0.4410 | 0.0621 | 0.3146 |
| 2.2649 | 7.0 | 875 | 1.4740 | 0.4848 | 0.6464 | 2.6879 | 0.4848 | 0.4556 | 0.0645 | 0.2942 |
| 1.4861 | 8.0 | 1000 | 1.3662 | 0.5198 | 0.6079 | 2.6641 | 0.5198 | 0.4973 | 0.0653 | 0.2514 |
| 1.4861 | 9.0 | 1125 | 1.3400 | 0.5417 | 0.5949 | 2.6876 | 0.5417 | 0.5364 | 0.0613 | 0.2381 |
| 1.4861 | 10.0 | 1250 | 1.3414 | 0.542 | 0.5968 | 2.6382 | 0.542 | 0.5267 | 0.0917 | 0.2336 |
| 1.4861 | 11.0 | 1375 | 1.3402 | 0.5395 | 0.5935 | 2.6955 | 0.5395 | 0.5418 | 0.0774 | 0.2303 |
| 1.0134 | 12.0 | 1500 | 1.3721 | 0.537 | 0.6035 | 2.6887 | 0.537 | 0.5271 | 0.1148 | 0.2301 |
| 1.0134 | 13.0 | 1625 | 1.3683 | 0.5455 | 0.6005 | 2.7328 | 0.5455 | 0.5383 | 0.1229 | 0.2270 |
| 1.0134 | 14.0 | 1750 | 1.4969 | 0.5363 | 0.6360 | 2.9430 | 0.5363 | 0.5293 | 0.1733 | 0.2346 |
| 1.0134 | 15.0 | 1875 | 1.5422 | 0.5295 | 0.6487 | 2.9876 | 0.5295 | 0.5341 | 0.1774 | 0.2442 |
| 0.594 | 16.0 | 2000 | 1.5237 | 0.5543 | 0.6329 | 2.9785 | 0.5543 | 0.5550 | 0.1900 | 0.2242 |
| 0.594 | 17.0 | 2125 | 1.6365 | 0.5298 | 0.6667 | 3.1126 | 0.5298 | 0.5332 | 0.2148 | 0.2498 |
| 0.594 | 18.0 | 2250 | 1.6367 | 0.5413 | 0.6663 | 3.0856 | 0.5413 | 0.5429 | 0.2332 | 0.2313 |
| 0.594 | 19.0 | 2375 | 1.7407 | 0.543 | 0.6811 | 3.2768 | 0.543 | 0.5379 | 0.2478 | 0.2327 |
| 0.3116 | 20.0 | 2500 | 1.7899 | 0.5535 | 0.6816 | 3.4174 | 0.5535 | 0.5459 | 0.2524 | 0.2308 |
| 0.3116 | 21.0 | 2625 | 1.8270 | 0.545 | 0.6990 | 3.2131 | 0.545 | 0.5401 | 0.2683 | 0.2459 |
| 0.3116 | 22.0 | 2750 | 1.8178 | 0.538 | 0.7029 | 3.3342 | 0.538 | 0.5392 | 0.2646 | 0.2471 |
| 0.3116 | 23.0 | 2875 | 1.8589 | 0.5337 | 0.7086 | 3.4584 | 0.5337 | 0.5332 | 0.2668 | 0.2505 |
| 0.1975 | 24.0 | 3000 | 1.8554 | 0.5363 | 0.7072 | 3.3578 | 0.5363 | 0.5360 | 0.2754 | 0.2448 |
| 0.1975 | 25.0 | 3125 | 1.8389 | 0.5397 | 0.7023 | 3.2630 | 0.5397 | 0.5377 | 0.2724 | 0.2457 |
| 0.1975 | 26.0 | 3250 | 1.8596 | 0.5423 | 0.7076 | 3.3014 | 0.5423 | 0.5463 | 0.2804 | 0.2355 |
| 0.1975 | 27.0 | 3375 | 1.8342 | 0.55 | 0.6890 | 3.3997 | 0.55 | 0.5451 | 0.2646 | 0.2286 |
| 0.1448 | 28.0 | 3500 | 1.8707 | 0.548 | 0.7045 | 3.3058 | 0.548 | 0.5428 | 0.2805 | 0.2372 |
| 0.1448 | 29.0 | 3625 | 1.8214 | 0.546 | 0.6979 | 3.2599 | 0.546 | 0.5455 | 0.2674 | 0.2372 |
| 0.1448 | 30.0 | 3750 | 1.8021 | 0.5537 | 0.6896 | 3.2681 | 0.5537 | 0.5549 | 0.2664 | 0.2307 |
| 0.1448 | 31.0 | 3875 | 1.8335 | 0.551 | 0.6938 | 3.3393 | 0.551 | 0.5522 | 0.2740 | 0.2262 |
| 0.1165 | 32.0 | 4000 | 1.7620 | 0.5473 | 0.6851 | 3.1437 | 0.5473 | 0.5463 | 0.2626 | 0.2328 |
| 0.1165 | 33.0 | 4125 | 1.7496 | 0.5527 | 0.6850 | 3.1206 | 0.5527 | 0.5515 | 0.2678 | 0.2257 |
| 0.1165 | 34.0 | 4250 | 1.7095 | 0.56 | 0.6691 | 3.1142 | 0.56 | 0.5631 | 0.2511 | 0.2232 |
| 0.1165 | 35.0 | 4375 | 1.7775 | 0.543 | 0.6943 | 3.2500 | 0.543 | 0.5428 | 0.2719 | 0.2309 |
| 0.0964 | 36.0 | 4500 | 1.7212 | 0.5653 | 0.6715 | 3.1218 | 0.5653 | 0.5642 | 0.2513 | 0.2212 |
| 0.0964 | 37.0 | 4625 | 1.6819 | 0.5633 | 0.6612 | 3.0858 | 0.5633 | 0.5605 | 0.2447 | 0.2172 |
| 0.0964 | 38.0 | 4750 | 1.7017 | 0.5617 | 0.6726 | 3.0501 | 0.5617 | 0.5636 | 0.2596 | 0.2218 |
| 0.0964 | 39.0 | 4875 | 1.6995 | 0.564 | 0.6690 | 3.1110 | 0.564 | 0.5656 | 0.2471 | 0.2209 |
| 0.0805 | 40.0 | 5000 | 1.6639 | 0.566 | 0.6594 | 3.1202 | 0.566 | 0.5677 | 0.2405 | 0.2180 |
| 0.0805 | 41.0 | 5125 | 1.6265 | 0.57 | 0.6504 | 3.0491 | 0.57 | 0.5725 | 0.2368 | 0.2125 |
| 0.0805 | 42.0 | 5250 | 1.6325 | 0.568 | 0.6534 | 3.0176 | 0.568 | 0.5696 | 0.2429 | 0.2090 |
| 0.0805 | 43.0 | 5375 | 1.6029 | 0.5775 | 0.6418 | 2.9852 | 0.5775 | 0.5778 | 0.2330 | 0.2072 |
| 0.0678 | 44.0 | 5500 | 1.5963 | 0.5725 | 0.6417 | 2.9674 | 0.5725 | 0.5720 | 0.2378 | 0.2080 |
| 0.0678 | 45.0 | 5625 | 1.5820 | 0.58 | 0.6365 | 2.9070 | 0.58 | 0.5793 | 0.2312 | 0.2033 |
| 0.0678 | 46.0 | 5750 | 1.5828 | 0.5773 | 0.6368 | 2.9425 | 0.5773 | 0.5766 | 0.2367 | 0.2028 |
| 0.0678 | 47.0 | 5875 | 1.5854 | 0.5807 | 0.6368 | 2.9375 | 0.5807 | 0.5816 | 0.2341 | 0.2035 |
| 0.0566 | 48.0 | 6000 | 1.5948 | 0.58 | 0.6396 | 2.9457 | 0.58 | 0.5812 | 0.2372 | 0.2037 |
| 0.0566 | 49.0 | 6125 | 1.5972 | 0.5813 | 0.6393 | 2.9527 | 0.5813 | 0.5817 | 0.2360 | 0.2038 |
| 0.0566 | 50.0 | 6250 | 1.6000 | 0.5805 | 0.6398 | 2.9515 | 0.5805 | 0.5810 | 0.2379 | 0.2036 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
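The card above also tracks Aurc, the area under the risk-coverage curve, a selective-prediction metric: sort predictions from most to least confident, and average the error rate (risk) of the retained set as coverage grows. A minimal NumPy sketch of that computation, with illustrative names not taken from this model's evaluation code:

```python
import numpy as np

def aurc(confidences, correct):
    """Area under the risk-coverage curve: the error rate of the k most
    confident predictions, averaged over all coverage levels k."""
    order = np.argsort(-np.asarray(confidences))          # most confident first
    errors = 1.0 - np.asarray(correct, dtype=float)[order]
    cum_risk = np.cumsum(errors) / np.arange(1, len(errors) + 1)
    return float(cum_risk.mean())

conf = np.array([0.99, 0.9, 0.7, 0.6])
correct = np.array([1, 1, 0, 1])
risk_area = aurc(conf, correct)
```

Lower is better; a well-ranked model concentrates its errors among its least confident predictions, keeping risk low at small coverage.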
jordyvl/dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_kd
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# dit-base-finetuned-rvlcdip-small_rvl_cdip-NK1000_kd

This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5198
- Accuracy: 0.833
- Brier Loss: 0.2560
- Nll: 1.1465
- F1 Micro: 0.833
- F1 Macro: 0.8328
- Ece: 0.0719
- Aurc: 0.0425

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll    | F1 Micro | F1 Macro | Ece    | Aurc   |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 125 | 1.0816 | 0.622 | 0.5119 | 2.5026 | 0.622 | 0.6194 | 0.0740 | 0.1657 |
| No log | 2.0 | 250 | 0.8028 | 0.715 | 0.3936 | 2.1454 | 0.715 | 0.7158 | 0.0651 | 0.1017 |
| No log | 3.0 | 375 | 0.7104 | 0.7505 | 0.3455 | 2.0393 | 0.7505 | 0.7464 | 0.0456 | 0.0765 |
| 0.9841 | 4.0 | 500 | 0.6747 | 0.7682 | 0.3267 | 1.9784 | 0.7682 | 0.7703 | 0.0455 | 0.0682 |
| 0.9841 | 5.0 | 625 | 0.6619 | 0.7782 | 0.3169 | 1.9299 | 0.7782 | 0.7752 | 0.0391 | 0.0649 |
| 0.9841 | 6.0 | 750 | 0.6416 | 0.7897 | 0.3058 | 1.8240 | 0.7897 | 0.7923 | 0.0483 | 0.0683 |
| 0.9841 | 7.0 | 875 | 0.6481 | 0.786 | 0.3016 | 1.8855 | 0.786 | 0.7855 | 0.0501 | 0.0640 |
| 0.259 | 8.0 | 1000 | 0.6273 | 0.7963 | 0.2970 | 1.7135 | 0.7963 | 0.7970 | 0.0454 | 0.0633 |
| 0.259 | 9.0 | 1125 | 0.6484 | 0.7927 | 0.3044 | 1.7079 | 0.7927 | 0.7911 | 0.0601 | 0.0647 |
| 0.259 | 10.0 | 1250 | 0.6504 | 0.7925 | 0.3046 | 1.8241 | 0.7925 | 0.7931 | 0.0577 | 0.0674 |
| 0.259 | 11.0 | 1375 | 0.6137 | 0.7975 | 0.2914 | 1.6742 | 0.7975 | 0.7996 | 0.0567 | 0.0675 |
| 0.133 | 12.0 | 1500 | 0.6092 | 0.7993 | 0.2928 | 1.6077 | 0.7993 | 0.8023 | 0.0600 | 0.0654 |
| 0.133 | 13.0 | 1625 | 0.5905 | 0.805 | 0.2842 | 1.5790 | 0.805 | 0.8074 | 0.0589 | 0.0623 |
| 0.133 | 14.0 | 1750 | 0.5794 | 0.8077 | 0.2797 | 1.4947 | 0.8077 | 0.8090 | 0.0533 | 0.0579 |
| 0.133 | 15.0 | 1875 | 0.5683 | 0.8075 | 0.2777 | 1.4518 | 0.8075 | 0.8076 | 0.0594 | 0.0565 |
| 0.1032 | 16.0 | 2000 | 0.5762 | 0.8125 | 0.2794 | 1.3998 | 0.8125 | 0.8146 | 0.0633 | 0.0551 |
| 0.1032 | 17.0 | 2125 | 0.5529 | 0.8115 | 0.2748 | 1.3595 | 0.8115 | 0.8126 | 0.0638 | 0.0519 |
| 0.1032 | 18.0 | 2250 | 0.5669 | 0.8133 | 0.2759 | 1.3803 | 0.8133 | 0.8143 | 0.0603 | 0.0547 |
| 0.1032 | 19.0 | 2375 | 0.5549 | 0.8177 | 0.2716 | 1.3258 | 0.8178 | 0.8186 | 0.0625 | 0.0527 |
| 0.0832 | 20.0 | 2500 | 0.5576 | 0.8147 | 0.2737 | 1.3814 | 0.8148 | 0.8183 | 0.0627 | 0.0513 |
| 0.0832 | 21.0 | 2625 | 0.5336 | 0.8247 | 0.2609 | 1.2941 | 0.8247 | 0.8243 | 0.0626 | 0.0476 |
| 0.0832 | 22.0 | 2750 | 0.5276 | 0.8257 | 0.2595 | 1.2491 | 0.8257 | 0.8262 | 0.0633 | 0.0455 |
| 0.0832 | 23.0 | 2875 | 0.5313 | 0.8193 | 0.2603 | 1.2685 | 0.8193 | 0.8198 | 0.0618 | 0.0466 |
| 0.0715 | 24.0 | 3000 | 0.5208 | 0.826 | 0.2575 | 1.2280 | 0.826 | 0.8266 | 0.0644 | 0.0468 |
| 0.0715 | 25.0 | 3125 | 0.5205 | 0.8233 | 0.2591 | 1.2235 | 0.8233 | 0.8235 | 0.0615 | 0.0459 |
| 0.0715 | 26.0 | 3250 | 0.5067 | 0.8293 | 0.2536 | 1.2028 | 0.8293 | 0.8298 | 0.0630 | 0.0433 |
| 0.0715 | 27.0 | 3375 | 0.5207 | 0.8245 | 0.2591 | 1.2148 | 0.8245 | 0.8256 | 0.0692 | 0.0449 |
| 0.0647 | 28.0 | 3500 | 0.5197 | 0.824 | 0.2596 | 1.1765 | 0.824 | 0.8242 | 0.0690 | 0.0469 |
| 0.0647 | 29.0 | 3625 | 0.5086 | 0.8315 | 0.2531 | 1.1762 | 0.8315 | 0.8319 | 0.0704 | 0.0428 |
| 0.0647 | 30.0 | 3750 | 0.5025 | 0.8313 | 0.2509 | 1.1560 | 0.8313 | 0.8314 | 0.0687 | 0.0439 |
| 0.0647 | 31.0 | 3875 | 0.5073 | 0.832 | 0.2527 | 1.1743 | 0.832 | 0.8323 | 0.0662 | 0.0426 |
| 0.0618 | 32.0 | 4000 | 0.5068 | 0.8303 | 0.2526 | 1.1644 | 0.8303 | 0.8304 | 0.0679 | 0.0422 |
| 0.0618 | 33.0 | 4125 | 0.5086 | 0.8325 | 0.2526 | 1.1658 | 0.8325 | 0.8326 | 0.0671 | 0.0415 |
| 0.0618 | 34.0 | 4250 | 0.5114 | 0.833 | 0.2540 | 1.1694 | 0.833 | 0.8326 | 0.0649 | 0.0440 |
| 0.0618 | 35.0 | 4375 | 0.5104 | 0.8305 | 0.2541 | 1.1399 | 0.8305 | 0.8309 | 0.0666 | 0.0426 |
| 0.0601 | 36.0 | 4500 | 0.5122 | 0.8307 | 0.2547 | 1.1755 | 0.8308 | 0.8309 | 0.0689 | 0.0435 |
| 0.0601 | 37.0 | 4625 | 0.5122 | 0.8323 | 0.2543 | 1.1448 | 0.8323 | 0.8326 | 0.0698 | 0.0429 |
| 0.0601 | 38.0 | 4750 | 0.5144 | 0.8307 | 0.2554 | 1.1444 | 0.8308 | 0.8308 | 0.0699 | 0.0414 |
| 0.0601 | 39.0 | 4875 | 0.5155 | 0.8307 | 0.2553 | 1.1524 | 0.8308 | 0.8308 | 0.0722 | 0.0430 |
| 0.0593 | 40.0 | 5000 | 0.5132 | 0.8315 | 0.2543 | 1.1554 | 0.8315 | 0.8318 | 0.0721 | 0.0423 |
| 0.0593 | 41.0 | 5125 | 0.5153 | 0.8335 | 0.2551 | 1.1557 | 0.8335 | 0.8332 | 0.0700 | 0.0423 |
| 0.0593 | 42.0 | 5250 | 0.5141 | 0.8313 | 0.2545 | 1.1530 | 0.8313 | 0.8314 | 0.0728 | 0.0419 |
| 0.0593 | 43.0 | 5375 | 0.5159 | 0.8313 | 0.2551 | 1.1434 | 0.8313 | 0.8312 | 0.0756 | 0.0425 |
| 0.0587 | 44.0 | 5500 | 0.5164 | 0.833 | 0.2548 | 1.1469 | 0.833 | 0.8329 | 0.0688 | 0.0428 |
| 0.0587 | 45.0 | 5625 | 0.5170 | 0.8325 | 0.2553 | 1.1486 | 0.8325 | 0.8324 | 0.0723 | 0.0426 |
| 0.0587 | 46.0 | 5750 | 0.5188 | 0.8325 | 0.2559 | 1.1478 | 0.8325 | 0.8324 | 0.0731 | 0.0423 |
| 0.0587 | 47.0 | 5875 | 0.5188 | 0.8325 | 0.2557 | 1.1515 | 0.8325 | 0.8323 | 0.0702 | 0.0424 |
| 0.0583 | 48.0 | 6000 | 0.5195 | 0.8327 | 0.2559 | 1.1477 | 0.8327 | 0.8325 | 0.0702 | 0.0427 |
| 0.0583 | 49.0 | 6125 | 0.5194 | 0.8325 | 0.2559 | 1.1464 | 0.8325 | 0.8324 | 0.0713 | 0.0426 |
| 0.0583 | 50.0 | 6250 | 0.5198 | 0.833 | 0.2560 | 1.1465 | 0.833 | 0.8328 | 0.0719 | 0.0425 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
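These runs all use `lr_scheduler_type: linear` with `lr_scheduler_warmup_ratio: 0.1`. A minimal sketch of the resulting learning-rate curve (the function name is mine; the shape matches the linear warmup + linear decay schedule Transformers applies with these settings): the rate ramps linearly from 0 to the peak over the first 10% of steps, then decays linearly back to 0.

```python
def linear_warmup_lr(step, total_steps, peak_lr=1e-4, warmup_ratio=0.1):
    """Learning rate at a given optimizer step for a linear
    warmup + linear decay schedule."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        # Linear ramp from 0 up to peak_lr over the warmup phase.
        return peak_lr * step / max(1, warmup_steps)
    # Linear decay from peak_lr down to 0 over the remaining steps.
    return peak_lr * (total_steps - step) / max(1, total_steps - warmup_steps)

total = 6250  # 50 epochs x 125 steps per epoch, as in the tables above
peak = linear_warmup_lr(625, total)  # rate at the end of warmup
```

With 6250 total steps the peak rate of 1e-4 is reached at step 625, which lines up with the loss dropping fastest over the first few epochs in the tables above.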
vesteinn/vit-mae-inat21
Note that this model does not work directly with the stock Hugging Face `ViTForImageClassification` class; a modification that applies mean pooling over the patch tokens before the layernorm and classification head is needed.

```python
from transformers import (
    ViTForImageClassification,
    pipeline,
    AutoImageProcessor,
    ViTConfig,
    ViTModel,
)
from transformers.modeling_outputs import (
    ImageClassifierOutput,
    BaseModelOutputWithPooling,
)
from PIL import Image
import torch
from torch import nn
from typing import Optional, Union, Tuple


class CustomViTModel(ViTModel):
    def forward(
        self,
        pixel_values: Optional[torch.Tensor] = None,
        bool_masked_pos: Optional[torch.BoolTensor] = None,
        head_mask: Optional[torch.Tensor] = None,
        output_attentions: Optional[bool] = None,
        output_hidden_states: Optional[bool] = None,
        interpolate_pos_encoding: Optional[bool] = None,
        return_dict: Optional[bool] = None,
    ) -> Union[Tuple, BaseModelOutputWithPooling]:
        r"""
        bool_masked_pos (`torch.BoolTensor` of shape `(batch_size, num_patches)`, *optional*):
            Boolean masked positions. Indicates which patches are masked (1) and which aren't (0).
        """
        output_attentions = (
            output_attentions
            if output_attentions is not None
            else self.config.output_attentions
        )
        output_hidden_states = (
            output_hidden_states
            if output_hidden_states is not None
            else self.config.output_hidden_states
        )
        return_dict = (
            return_dict if return_dict is not None else self.config.use_return_dict
        )

        if pixel_values is None:
            raise ValueError("You have to specify pixel_values")

        # Prepare head mask if needed
        # 1.0 in head_mask indicate we keep the head
        # attention_probs has shape bsz x n_heads x N x N
        # input head_mask has shape [num_heads] or [num_hidden_layers x num_heads]
        # and head_mask is converted to shape [num_hidden_layers x batch x num_heads x seq_length x seq_length]
        head_mask = self.get_head_mask(head_mask, self.config.num_hidden_layers)

        # TODO: maybe have a cleaner way to cast the input (from `ImageProcessor` side?)
        expected_dtype = self.embeddings.patch_embeddings.projection.weight.dtype
        if pixel_values.dtype != expected_dtype:
            pixel_values = pixel_values.to(expected_dtype)

        embedding_output = self.embeddings(
            pixel_values,
            bool_masked_pos=bool_masked_pos,
            interpolate_pos_encoding=interpolate_pos_encoding,
        )

        encoder_outputs = self.encoder(
            embedding_output,
            head_mask=head_mask,
            output_attentions=output_attentions,
            output_hidden_states=output_hidden_states,
            return_dict=return_dict,
        )
        sequence_output = encoder_outputs[0]

        # Mean-pool over the patch tokens (dropping the CLS token at index 0),
        # then apply the layernorm -- this is the modification relative to ViTModel.
        sequence_output = sequence_output[:, 1:, :].mean(dim=1)
        sequence_output = self.layernorm(sequence_output)
        pooled_output = (
            self.pooler(sequence_output) if self.pooler is not None else None
        )

        if not return_dict:
            head_outputs = (
                (sequence_output, pooled_output)
                if pooled_output is not None
                else (sequence_output,)
            )
            return head_outputs + encoder_outputs[1:]

        return BaseModelOutputWithPooling(
            last_hidden_state=sequence_output,
            pooler_output=pooled_output,
            hidden_states=encoder_outputs.hidden_states,
            attentions=encoder_outputs.attentions,
        )


class CustomViTForImageClassification(ViTForImageClassification):
    def __init__(self, config: ViTConfig) -> None:
        super().__init__(config)

        self.num_labels = config.num_labels
        self.vit = CustomViTModel(config, add_pooling_layer=False)

        # Classifier head
        self.classifier = (
            nn.Linear(config.hidden_size, config.num_labels)
            if config.num_labels > 0
            else nn.Identity()
        )

        # Initialize weights and apply final processing
        self.post_init()

    def forward(
        self,
        pixel_values: Optional[torch.Tensor] = None,
        head_mask: Optional[torch.Tensor] = None,
        labels: Optional[torch.Tensor] = None,
        output_attentions: Optional[bool] = None,
        output_hidden_states: Optional[bool] = None,
        interpolate_pos_encoding: Optional[bool] = None,
        return_dict: Optional[bool] = None,
    ) -> Union[tuple, ImageClassifierOutput]:
        r"""
        labels (`torch.LongTensor` of shape `(batch_size,)`, *optional*):
            Labels for computing the image classification/regression loss. Indices should be in
            `[0, ..., config.num_labels - 1]`. If `config.num_labels == 1` a regression loss is
            computed (Mean-Square loss), if `config.num_labels > 1` a classification loss is
            computed (Cross-Entropy).
        """
        return_dict = (
            return_dict if return_dict is not None else self.config.use_return_dict
        )

        outputs = self.vit(
            pixel_values,
            head_mask=head_mask,
            output_attentions=output_attentions,
            output_hidden_states=output_hidden_states,
            interpolate_pos_encoding=interpolate_pos_encoding,
            return_dict=return_dict,
        )

        sequence_output = outputs[0]
        logits = self.classifier(sequence_output)

        # Loss computation is omitted; this wrapper is intended for inference.
        loss = None

        return ImageClassifierOutput(
            loss=loss,
            logits=logits,
            hidden_states=outputs.hidden_states,
            attentions=outputs.attentions,
        )


if __name__ == "__main__":
    model = CustomViTForImageClassification.from_pretrained("vesteinn/vit-mae-inat21")
    image_processor = AutoImageProcessor.from_pretrained("vesteinn/vit-mae-inat21")
    classifier = pipeline(
        "image-classification", model=model, image_processor=image_processor
    )
```
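The key change in the wrapper above is replacing the CLS-token readout with a mean over the patch tokens before the layernorm. A tiny standalone torch sketch of what that pooling step does to the hidden-state tensor (toy shapes, no pretrained weights involved):

```python
import torch

# Toy encoder output: batch of 2, 1 CLS token + 4 patch tokens, hidden size 8.
hidden_states = torch.randn(2, 5, 8)

# Drop the CLS token (index 0) and average the remaining patch tokens,
# collapsing the sequence dimension to one vector per image.
pooled = hidden_states[:, 1:, :].mean(dim=1)

print(pooled.shape)  # torch.Size([2, 8])
```

The pooled vector then goes through the layernorm and the linear classifier head, exactly as the CLS embedding would in the stock `ViTForImageClassification`.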
[ "common earthworm", "mediterranean fanworm", "serpula columbiana", "blue tube worm", "eastern bark centipede", "mediterranean banded centipede", "giant desert centipede", "common desert centipede", "house centipede", "portuguese millipede", "greenhouse millipede", "black-and-gold flat millipede", "yellow-spotted millipede", "bumblebee millipede", "rusty millipede", "pelagic gooseneck barnacle", "goose barnacle", "goose barnacle", "pacific barnacle", "northern rock barnacle", "thatched barnacle", "pink volcano barnacle", "signal crayfish", "virile crayfish", "red swamp crawfish", "jonah crab", "atlantic rock crab", "red rock crab", "dungeness crab", "pacific rock crab", "green shore crab", "caribbean land hermit crab", "ecuadorian hermit crab", "northern kelp crab", "blue land crab", "lady crab", "sally lightfoot crab", "thin-shelled rock crab", "purple rock crab", "striped shore crab", "marbled crab", "pacific sand crab", "atlantic sand crab", "american lobster", "mexican fiddler crab", "atlantic sand fiddler", "mud fiddler crab", "horned ghost crab", "painted ghost crab", "atlantic ghost crab", "bowed fiddler crab", "grainy hermit crab", "blueband hermit crab", "glass shrimp", "caribbean spiny lobster", "california spiny lobster", "red rock crab", "new zealand half crab", "speckled swimming crab", "common blue crab", "mangrove tree crab", "squareback marsh crab", "banded coral shrimp", "purple shore crab", "yellow shore crab", "asian shore crab", "nosy pillbug", "common pill-bug", "wharf roach", "western sea roach", "common shiny woodlouse", "common striped woodlouse", "smooth slater", "european woodlouse", "brickwork woodlouse", "rathke’s woodlouse", "horseshoe crab", "violet tunicate", "golden star tunicate", "whitetip reef shark", "leopard shark", "port jackson shark", "grey nurse shark", "spotted eagle ray", "smooth stingray", "southern stingray", "bluespotted ribbontail ray", "bat ray", "new zealand eagle ray", "common stingaree", "haller's round ray", 
"nurse shark", "banded wobbegong", "spotted wobbegong", "banjo shark", "beadlet anemone", "waratah anemone", "snakelocks anemone", "moonglow anemone", "aggregating anemone", "starburst anemone", "giant green anemone", "brooding anemone", "olive anemone", "speckled rock anemone", "wandering sea anemone", "painted anemone", "striped green sea anemone", "giant plumose anemone", "plumose anemone", "white-striped anemone", "strawberry anemone", "blue button", "by-the-wind sailor", "portuguese man o' war", "cannonball jelly", "pacific lion's mane jelly", "pacific sea nettle", "egg-yolk jelly", "moon jellyfish", "greater moon jellyfish", "purple starfish", "eleven-armed seastar", "mottled star", "giant sea star", "ochre sea star", "sunflower sea star", "pacific blood star", "bat star", "new zealand common cushion star", "leather star", "red cushion sea star", "indo-pacific rock-boring urchin", "kina", "red sea urchin", "green sea urchin", "pacific purple sea urchin", "eccentric sand dollar", "five-slotted sand dollar", "red sea cucumber", "giant california sea cucumber", "western spiny brittle star", "giant house spider", "california turret spider", "oak spider", "gorse orbweaver", "giant lichen orbweaver", "cross orbweaver", "marbled orb weaver", "four-spot orbweaver", "shamrock orbweaver", "six-spotted orbweaver", "hawaiian garden spider", "silver garden spider", "yellow garden spider", "common garden orbweb spinner", "wasp spider", "st. 
andrew's cross spider", "lobed argiope", "banded garden spider", "australian jewel spider", "conical orbweaver", "humped trashline orbweaver", "tropical tent-web spider", "garden orbweb spider", "tropical orbweaver spider", "spinybacked orbweaver", "black-and-white spiny spider", "furrow orbweaver", "cricket-bat orbweaver", "tuft-legged orbweaver", "basilica spider", "spined micrathena", "white micrathena", "arrow-shaped micrathena", "arabesque orbweaver", "hentz's orbweaver", "red-femured spotted orbweaver", "western spotted orbweaver", "giant golden orb-weaving spider", "walnut orbweaver", "asian spinybacked orbweaver", "jorō spider", "golden silk spider", "edible golden silk spider", "hairy golden orb-weaving spider", "tiger spider", "arrowhead orbweaver", "silver-sided sector spider", "yellow sac spider", "spotted ground swift", "grey house spider", "woodlouse spider", "southern house spider", "parson spider", "bowl and doily spider", "filmy dome spider", "radiated wolf spider", "rabid wolf spider", "striped lynx spider", "western lynx spider", "green lynx spider", "marbled cellar spider", "skull spider", "whitebanded fishing spider", "nurseryweb spider", "striped fishing spider", "dark fishing spider", "six-spotted fishing spider", "european nursery web spider", "american nursery web spider", "twinflagged jumping spider", "asiatic wall jumping spider", "colonus hesperus", "sylvan jumping spider", "bronze jumper", "adanson's house jumper", "bronze hopper", "white-jawed jumping spider", "common hentz jumper", "white banded house jumper", "magnolia green jumper", "dimorphic jumper", "fencepost jumping spider", "gray wall jumper", "menemerus semilimbatus", "ribbon jumping spider", "flea jumper", "golden jumping spider", "peppered jumper", "bold jumper", "brilliant jumper", "johnson's jumping spider", "red-bellied jumping spider", "tan jumping spider", "pantropical jumper", "zebra jumper", "buttonhook leaf-beetle jumping spider", "golden-brown jumping spider", 
"black-headed jumping spider", "brown recluse", "giant crab spider", "green huntsman spider", "giant crab spider", "leucauge argyra", "mabel orchard orbweaver", "orchard orbweaver", "desert blonde tarantula", "california ebony tarantula", "texas brown tarantula", "desert tarantula", "common candy-striped spider", "brown widow", "redback spider", "western black widow", "katipō", "southern black widow", "northern black widow", "common house spider", "brown house spider", "false black widow", "noble false widow", "triangulate comb-foot", "goldenrod crab spider", "white-banded crab spider", "trapezoid crab spider", "napoleon spider", "flower spider", "mediterranean spiny false wolf spider", "elm finger gall mite", "poison ivy leaf mite", "willow bead gall mite", "black cherry leaf gall", "red nail gall mite", "maple bladdergall mite", "lone star tick", "pacific coast tick", "american dog tick", "western blacklegged tick", "castor bean tick", "blacklegged tick", "saddleback harvestman", "canestrini's harvestman", "european harvestman", "leiobunum rotundum", "leiobunum townsendi", "eastern harvestman", "arizona bark scorpion", "striped bark scorpion", "desert hairy scorpion", "paravaejovis puritanus", "stripe-tailed scorpion", "california common scorpion", "western forest scorpion", "southern devil scorpion", "cape mountain cockroach", "banana cockroach", "surinam cockroach", "oriental cockroach", "american cockroach", "australian cockroach", "smoky-brown cockroach", "pale-bordered field cockroach", "oak timberworm", "emerald ash borer", "golden buprestid beetle", "ceiba borer beetle", "two-lined leatherwing", "dark sailor beetle", "cantharis livida", "rustic sailor beetle", "colorado soldier beetle", "margined leatherwing beetle", "goldenrod soldier beetle", "common red soldier beetle", "fiery searcher beetle", "carabus coriaceus", "granulated ground beetle", "wood ground-beetle", "golden-spotted tiger beetle", "green tiger beetle", "twelve-spotted tiger beetle", 
"beautiful tiger beetle", "hairy-necked tiger beetle", "northern dune tiger beetle", "boreal long-lipped tiger beetle", "ocellated tiger beetle", "oregon tiger beetle", "backroad tiger beetle", "purple tiger beetle", "bronzed tiger beetle", "festive tiger beetle", "six-spotted tiger beetle", "oblique-lined tiger beetle", "s-banded tiger beetle", "strawberry seed beetle", "narrow-collared snail-eating beetle", "big-headed ground beetle", "carolina metallic tiger beetle", "golden-bloomed longhorn", "analeptura lineola", "musk beetle", "capricorn beetle", "wasp beetle", "round-necked longhorn beetle", "long-jawed longhorn beetle", "elderberry borer", "ivory-marked borer", "spined oak borer", "banded graphisurus", "banded hickory borer", "four-banded longhorn beetle", "yellow velvet beetle", "hardwood stump borer", "locust borer", "spotted pine sawyer", "northeastern pine sawyer", "white-spotted sawyer", "morimus asper", "red-headed ash borer", "brown prionid", "cottonwood borer", "huhu", "california prionus", "the tanner", "tile-horned prionus", "broad-necked root borer", "fairy-ring longhorn beetle", "ribbed pine borer", "blackspotted longhorn beetle", "banded alder borer", "spotted longhorn", "double-banded bycid", "black-striped longhorn beetle", "red-shouldered pine borer", "red pine longhorn", "strangalepta flower longhorn", "yellow-horned lepture", "red-femured milkweed borer", "eastern milkweed longhorn beetle", "spined woodborer", "banded longhorn", "zebra longhorn", "rustic borer beetle", "striped cucumber beetle", "alder leaf beetle", "clay-colored leaf beetle", "spotted tortoise beetle", "thistle tortoise beetle", "bean leaf beetle", "golden tortoise beetle", "dogbane leaf beetle", "cobalt milkweed beetle", "rosemary beetle", "chrysolina bankii", "dead-nettle leaf beetle", "poplar leaf beetle", "cottonwood leaf beetle", "spotted willow leaf beetle", "clytra laeviuscula", "mottled tortoise beetle", "banded cucumber beetle", "spotted cucumber beetle", 
"pigweed flea beetle", "galeruca tanaceti", "green dock-beetle", "clavate tortoise beetle", "milkweed leaf beetle", "three-lined potato beetle", "colorado potato beetle", "false potato beetle", "lily leaf beetle", "eight-spotted flea beetle", "ragweed leaf beetle", "willow leaf beetle", "bloody-nosed beetle", "australian tortoise beetle", "elm leaf beetle", "zygogramma signatipennis", "trichodes alvearius", "trichodes apiarius", "common checkered beetle", "two-spotted ladybug", "ten-spotted ladybird beetle", "eyed ladybird beetle", "cream-spot ladybird", "lunate lady beetle", "six-spotted zigzag ladybird beetle", "twice-stabbed lady beetle", "california lady beetle", "seven-spotted ladybird beetle", "small transverse ladybird beetle", "three-banded ladybug", "eleven-spotted ladybird beetle", "common australian lady beetle", "spotted ladybeetle", "mealybug destroyer", "polished lady beetle", "western spotless lady beetle", "spotless ladybird", "epilachna mexicana", "pine ladybird", "steelblue ladybird", "orange ladybird beetle", "asian lady beetle", "large spotted ladybird beetle", "hadda beetle", "convergent lady beetle", "adonis ladybird beetle", "fungus-eating ladybird", "ashy gray ladybug", "fourteen-spotted ladybug", "22-spot ladybird beetle", "20-spotted ladybird beetle", "cactus weevil", "asiatic oak weevil", "diaprepes root weevil", "canada thistle bud weevil", "black vine weevil", "fuller's rose weevil", "phyllobius pomaceus", "green immigrant leaf weevil", "cocklebur weevil", "red palm weevil", "varied carpet beetle", "larder beetle", "agrypnus murinus", "texas eyed click beetle", "eastern eyed click beetle", "blue fungus beetle", "pleasing fungus beetle", "woodland dor beetle", "springtime dung beetle", "diurnal firefly", "common glowworm", "black firefly", "common eastern firefly", "lesser stag beetle", "reddish-brown stag beetle", "european stag beetle", "giant stag beetle", "giant stag beetle", "banded net-winged beetle", "end band net-winged beetle", 
"inflated beetle", "black blister beetle", "master blister beetle", "black oil beetle", "violet oil beetle", "common malachite-beetle", "oedemera femorata", "thick-legged flower beetle", "patent-leather beetle", "black-headed cardinal beetle", "common cardinal beetle", "summer chafer", "margined shining leaf chafer", "european rose chafer", "green fig beetle", "green june beetle", "gazelle scarab", "eastern hercules beetle", "euphoria basalis", "bumble flower beetle", "kern's flower scarab", "dark flower scarab", "oriental beetle", "american rose chafer", "common cockchafer", "european rhinoceros beetle", "rough hermit beetle", "mediterranean spotted chafer beetle", "spotted june beetle", "rainbow scarab", "garden chafer beetle", "ten-lined june beetle", "japanese beetle", "copper chafer", "aloeus ox beetle", "bee-mimic beetle", "texas flower scarab", "bee beetle", "delta flower scarab", "japanese rhinoceros beetle", "valgus hemipterus", "red-lined carrion beetle", "american carrion beetle", "roundneck sexton beetle", "tomentose burying beetle", "lesser vespillo burying beetle", "red-breasted carrion beetle", "black snail beetle", "devil's coach-horse beetle", "forked fungus beetle", "eleodes obscura", "wooly darkling beetle", "hirta beetle", "diabolical ironclad beetle", "zopherus nodulosus", "ring-legged earwig", "common earwig", "dioctria hyalipennis", "efferia aestuans", "laphria macquarti", "laphria thoracica", "mallophora fautrix", "beelzebub bee-eater", "giant prairie robber fly", "promachus hinei", "red-footed cannibalfly", "triorla interrupta", "common lovebug", "greater bee fly", "poecilanthrax lucifer", "tiger bee fly", "blue blowfly", "oriental latrine fly", "common greenbottle fly", "creosote gall midge", "willow pinecone gall midge", "coyote brush bud gall midge", "goldenrod bunch gall midge", "common eastern physocephala", "yellow fever mosquito", "asian tiger mosquito", "inland floodwater mosquito", "southern house mosquito", "gallinipper", 
"rainieria antennaepes", "noon fly", "housefly", "stable fly", "clubbed mydas fly", "filter fly", "eastern phantom crane fly", "golden-backed snipe fly", "yellow dung fly", "garden soldier fly", "black soldier fly", "compost fly", "oblique stripetail hoverfly", "mexican cactus fly", "four-speckled hover fly", "marmalade hover fly", "common lagoon fly", "band-eyed drone fly", "european drone fly", "black-shouldered drone fly", "orange-legged drone fly", "tapered drone fly", "yellow-shouldered drone fly", "european hoverfly", "transverse-banded flower fly", "migrant hover fly", "bird hover fly", "narrow-headed marsh fly", "dangling swamp-lover", "trivittate sunfly", "large hover fly", "narcissus bulb fly", "virginia flower fly", "yellow-haired sunfly", "dusky-winged hover fly", "northern plushback", "black-backed grass skimmer", "white-bowed smoothwing", "pied hover fly", "long hover fly", "eastern hornet fly", "thick-legged hoverfly", "eastern calligrapher", "margined calligrapher fly", "western calligrapher", "maize calligrapher fly", "bumblebee hoverfly", "lesser hornet hoverfly", "pellucid hover fly", "hornet mimic hover fly", "black horsefly", "tomato bristle fly", "tachina fera", "swift feather-legged fly", "australian leafroller tachinid", "goldenrod gall fly", "giant crane fly", "common picture-winged fly", "giant mayfly", "two-striped planthopper", "green cone-headed planthopper", "hawthorn shield bug", "red-cross shield bug", "birch shield bug", "parent bug", "hemlock woolly adelgid", "texas bow-legged bug", "riptortus pedestris", "red gum lerp psyllid", "hackberry nipple gall", "hackberry petiole gall psyllid", "milkweed aphid", "witch-hazel cone gall aphid", "sumac gall aphid", "giant willow aphid", "meadow spittlebug", "american giant water bug", "common froghopper", "two-lined spittlebug", "green leafhopper", "saddled leafhopper", "lateral-lined sharpshooter", "excultanus excultus", "blue-green sharpshooter", "red-banded leafhopper", "rhododendron 
leafhopper", "versute sharpshooter", "glassy-winged sharpshooter", "coppery leafhopper", "broad-headed sharpshooter", "japanese leafhopper", "speckled sharpshooter", "tylozygus bifidus", "chorus cicada", "ash cicada", "green grocer", "large brown cicada", "pharaoh cicada", "northern dusk singing cicada", "plains cicada", "western dusk singing cicada", "northern dog-day cicada", "lyric cicada", "scissors grinder cicada", "superb dog-day cicada", "swamp cicada", "little mesquite cicada", "black prince", "acanthocephala alata", "giant leaf-footed bug", "florida leaf-footed bug", "acanthocephala terminalis", "squash bug", "chelinidea vittiger", "dock bug", "helmeted squash bug", "box bug", "hypselonotus interruptus", "spot-sided coreid", "western leaf-footed bug", "western conifer seed bug", "leptoglossus oppositus", "eastern leaf-footed bug", "leptoglossus zonatus", "thasus gigas", "giant mesquite bug", "white-margined burrower bug", "northern flatid planthopper", "citrus flatid planthopper", "ormenoides venusta", "torpedo bug", "spotted lanternfly", "pyrops candelaria", "big-eyed toad bug", "common water strider", "california bordered plant bug", "largus bug", "stenomacra marginella", "birch catkin bug", "black-and-red bug", "common milkweed bug", "false milkweed bug", "charcoal seed bug", "white-crossed seed bug", "large milkweed bug", "spilostethus pandurus", "spilostethus saxatilis", "keeled treehopper", "mexican treehopper", "alfalfa plant bug", "potato mirid", "deraeocoris ruber", "meadow plant bug", "north american tarnished plant bug", "four-lined plant bug", "two-spotted grass bug", "water scorpion", "lime seed bug", "bishop's mitre shield bug", "bagrada bug", "green burgundy stink bug", "juniper stink bug", "carpocoris fuscispinus", "red shield bug", "black-shouldered shieldbug", "green stinkbug", "conchuela bug", "say's stink bug", "twice-stabbed stink bug", "sloe bug", "yellow-spotted stink bug", "rape bug", "red cabbage bug", "brown stink bug", "dusky 
stink bug", "florida predatory stink bug", "european striped shield bug", "semipunctated shield bug", "brown marmorated stink bug", "elf shoe stink bug", "mormidea lugens", "harlequin bug", "southern green stink bug", "rice stink bug", "green shield bug", "forest bug", "vernal shield bug", "spiny shield bug", "gorse shield bug", "black stink bug", "mottled stink bug", "anchor stink bug", "red-shouldered stink bug", "grape phylloxera", "bean plataspid bug", "harlequin red bug", "indian cotton stainer", "firebug", "mediterranean red bug", "california bee assassin", "apiomerus spissipes", "wheel bug", "black corsair", "jagged ambush bug", "orange assassin bug", "rasahus hamatus", "masked hunter", "red bull assassin", "milkweed assassin bug", "pale green assassin bug", "leafhopper assassin bug", "four-spurred assassin bug", "western box elder bug", "eastern boxelder bug", "spotted firebug", "red-shouldered bug", "long-necked seed bug", "rhyparochromus vulgaris", "passionvine hopper", "cantao ocellatus", "clown stink bug", "hibiscus harlequin bug", "lychee stink bug", "ashy mining bee", "tawny mining bee", "wilke's mining bee", "california anthophora", "urbane digger bee", "asian honey bee", "giant honey bee", "dwarf honey bee", "western honey bee", "black-and-gold bumble bee", "two-spotted bumblebee", "northern amber bumble bee", "california bumble bee", "lemon cuckoo bumblebee", "yellow bumble bee", "yellow-fronted bumble bee", "brown-belted bumblebee", "hunt's bumble bee", "tree bumblebee", "common eastern bumblebee", "large red-tailed bumblebee", "black-tailed bumblebee", "fuzzy-horned bumble bee", "nevada bumble bee", "common carder bumble bee", "american bumble bee", "perplexing bumble bee", "early bumblebee", "red-belted bumble bee", "sonoran bumble bee", "tricolored bumble bee", "buff-tailed bumblebee", "yellow-banded bumble bee", "half-black bumblebee", "bombus vancouverensis", "yellow-faced bumblebee", "two-spotted long-horned bee", "narrow stingless bee", 
"fox-colored stingless bee", "california carpenter bee", "southern carpenter bee", "valley carpenter bee", "horsefly-like carpenter bee", "violet carpenter bee", "eastern carpenter bee", "tomato hornworm parasitoid wasp", "chrysis angolensis", "elm sawfly", "unequal cellophane bee", "modest masked bee", "american sand wasp", "four-banded stink bug hunter wasp", "humped beewolf", "european beewolf", "eastern cicada-killer wasp", "organ-pipe mud-dauber wasp", "spongy oak apple gall wasp", "acorn plum gall wasp", "live oak apple gall wasp", "red cone gall wasp", "california gall wasp", "wool sower gall wasp", "spined turban gall wasp", "spiny leaf gall wasp", "rose bedeguar gall wasp", "introduced pine sawfly", "cockroach egg parasitoid wasp", "yellow crazy ant", "atta cephalotes", "chicatana leafcutter ant", "texas leafcutter ant", "chestnut carpenter ant", "florida carpenter ant", "karoo balbyter sugar ant", "western carpenter ant", "hairy sugar ant", "new york carpenter ant", "eastern black carpenter ant", "compact carpenter ant", "shimmering golden sugar ant", "cocktail ant", "mediterranean acrobat ant", "western thatching ant", "southern wood ant", "meat ant", "argentine ant", "velvety tree ant", "black jumper ant", "hairy panther ant", "green tree ant", "bullet ant", "longhorn crazy ant", "red harvester ant", "california harvester ant", "rough harvester ant", "small honey ant", "elongate twig ant", "red imported fire ant", "odorous house ant", "immigrant pavement ant", "black harvester ant", "brown-winged striped sweat bee", "bicolored striped sweat bee", "pure green sweat bee", "metallic epauletted-sweat bee", "ligated furrow bee", "poey's furrow bee", "orange-legged furrow bee", "great banded furrow bee", "tripartite sweat bee", "enicospilus purgatus", "black giant ichneumon wasp", "long-tailed giant ichneumonid wasp", "european wool carder bee", "oblong woolcarder", "giant resin bee", "carpenter-mimic leafcutter bee", "red mason bee", "european orchard bee", 
"dasymutilla aureola", "common eastern velvet ant", "american pelecinid wasp", "mammoth wasp", "double-banded scoliid wasp", "scolia dubia", "scolia hirta", "noble scoliid wasp", "pigeon tremex horntail", "common thread-waisted wasp", "common blue mud-dauber wasp", "steel-blue cricket-hunter wasp", "eremnophila aureonotata", "mexican grass-carrying wasp", "black-and-yellow mud-dauber wasp", "great golden digger wasp", "katydid wasp", "great black wasp", "willow apple gall sawfly", "five-banded thynnid wasp", "ancistrocerus adiabatus", "ancistrocerus antilope", "european tube wasp", "mexican honey wasp", "common aerial yellowjacket", "bald-faced hornet", "fraternal potter wasp", "euodynerus foraminatus", "euodynerus hidalgo", "western paper wasp", "four-toothed mason wasp", "red and black mason wasp", "jack spaniard wasp", "apache wasp", "golden paper wasp", "polistes bellicosus", "neotropical red paper wasp", "executioner paper wasp", "asian paper wasp", "comanche paper wasp", "european paper wasp", "polistes dorsalis", "common paper wasp", "northern paper wasp", "australian paper wasp", "polistes instabilis", "horse's paper wasp", "metricus paper wasp", "lesser banded hornet", "giant hornet", "asian predatory wasp", "alaska yellowjacket", "german wasp", "eastern yellowjacket", "western yellowjacket", "southern yellowjacket", "widow yellowjacket", "common yellowjacket", "green longhorn moth", "yellow-barred longhorn", "spotted apatelodes", "ailanthus webworm moth", "acorn moth", "shy cosmet moth", "goat moth", "carpenterworm moth", "leopard moth", "cotton web spinner", "garden webworm moth", "pearl veneer", "common grass-veneer", "vagabond sod webworm moth", "yellow-spotted webworm moth", "white-spotted sable", "small magpie", "checkered apogeshna", "milky urola moth", "hollow-spotted blepharomastix", "sooty-winged chalcoela moth", "garden grass-veneer", "topiary grass-veneer moth", "rice leaf-roller", "lesser rice-leafroller", "zebra conchylodes", "double-banded 
grass-veneer moth", "yellow satin veneer", "culladia cuneiferellus", "box tree moth", "grape leaffolder moth", "darker diacme moth", "melonworm moth", "cucumber moth", "fractured western snout moth", "julia's dicymolomia moth", "waterlily borer moth", "pondside pyralid moth", "waterlily leafcutter", "belted grass-veneer", "spotted peppergrass moth", "purple-backed cabbageworm", "white-roped glaphyria moth", "swan plant flower moth", "cabbage centre grub", "southern beet webworm moth", "grass webworm", "dusky herpetogramma", "magician moth", "pond moth", "spotted beet webworm", "victorian lamplighter moth", "poroporo fruit borer", "eggplant leafroller", "bog lygropia", "mung bean moth", "gold-stripe grass-veneer", "elegant grass-veneer", "yellow-veined moth", "whip-marked snout moth", "rufous-banded crambid moth", "nacoleia rhoeoalis", "celery stalkworm", "rush veneer", "common grass moth", "european corn borer", "splendid palpita moth", "four-spotted palpita", "jasmine moth", "basswood leafroller moth", "bluegrass webworm moth", "watermilfoil leafcutter moth", "chestnut-marked pondweed moth", "hydrilla leafcutter moth", "polymorphic pondweed moth", "mother of pearl", "titian peale's crambid", "two-banded petrophila", "jalisco petrophila", "mint-loving pyrausta moth", "mint moth", "bicolored pyrausta moth", "straw-barred pearl", "inornate pyrausta moth", "southern purple mint moth", "orange-spotted pyrausta", "common crimson-and-gold moth", "raspberry pyrausta moth", "coffee-loving pyrausta moth", "volupial pyrausta moth", "media moth", "assembly moth", "salvinia stem-borer", "sameodes cancellalis", "double-striped scoparia moth", "carrot seed moth", "hawaiian beet webworm", "orange-spotted flower moth", "rusty dot pearl", "celery leaftier", "genista broom moth", "snowy urola moth", "hemlock moth", "oak lantern", "gold-striped leaftier", "arched hooktip", "two-lined hooktip", "pebble hook-tip", "buff arches", "lettered habrosyne", "rose hooktip moth", "tufted 
thyatirid moth", "peach blossom moth", "oak hook-tip", "barred hook-tip", "castor semi-looper", "false underwing moth", "hübner's wasp moth", "nine-spotted moth", "velvet bean moth", "arge moth", "ornate tiger moth", "parthenice tiger moth", "harnessed moth", "mexican tiger moth", "virgin tiger moth", "little virgin tiger moth", "painted tiger moth", "garden tiger", "st. lawrence tiger moth", "wood tiger", "cream-spot tiger", "ranchman's tiger moth", "black witch", "tropical tiger moth", "asota heliconia", "red-necked footman", "bent-winged owlet", "brunia antica", "deduced graphic", "vetch looper moth", "clover looper moth", "forage looper moth", "scarlet tiger", "mother shipton", "pale tussock moth", "the sweetheart", "darling underwing", "pink underwing", "woody underwing", "ilia underwing", "little lined underwing", "sad underwing moth", "little nymph underwing", "the bride", "red underwing", "penitent underwing", "white underwing", "ultronia underwing", "widow underwing", "morbid owlet moth", "yellow-collared scape moth", "lead-colored lichen moth", "thin-banded lichen moth", "little white lichen moth", "pale lichen moth", "creatonotos gangis", "creatonotos transiens", "virginia ctenucha", "unexpected cycnia moth", "delicate cycnia moth", "cream-striped owl", "zhejiang tussock moth", "northern wattle moth", "clouded buff", "the passenger", "northern giant flag moth", "buff footman", "common footman", "orange footman", "erebus ephesperis", "salt marsh moth", "eubolina impartialis", "milkweed tussock moth", "toothed somberwing", "burnet companion", "common fruit-piercing moth", "locust underwing", "jersey tiger", "brown-tail moth", "green lattice", "bird dropping lichen moth", "sycamore tussock moth", "banded tussock moth", "clymene moth", "confused haploa moth", "leconte's haploa", "white-lined snout", "baltimore snout moth", "dimorphic bomolocha moth", "deceptive snout", "gray-edged bomolocha moth", "flowing-line hypena", "mottled bomolocha", "the snout", 
"green cloverworm moth", "hypercompe oslari", "giant leopard moth", "fall webworm", "painted lichen moth", "scarlet-winged lichen moth", "small necklace moth", "common idia moth", "american idia", "glossy black idia", "rotund idia moth", "thin-lined owlet", "ambiguous moth", "beautiful hook-tip", "white satin moth", "four-spotted footman", "santa ana tussock moth", "silver-spotted tiger moth", "hickory tussock moth", "spotted tussock moth", "black-and-yellow lichen moth", "gypsy moth", "black arches", "royal poinciana moth", "cellar melipotis", "indomitable melipotis", "merry melipotis", "common fungus moth", "live oak metria moth", "the rosy footman", "sugarcane looper", "small mocis moth", "withered mocis", "thin-winged owlet moth", "nyctemera adversata", "senecio moth", "new zealand magpie moth", "magpie moth", "vapourer", "definite tussock moth", "fir tussock moth", "white-marked tussock moth", "cocoa tussock moth", "western tussock moth", "orvasca subnotata", "dark-spotted palthis", "faint-spotted palthis moth", "decorated owlet", "brown panopoda moth", "red-lined panopoda", "pantydia sparsa", "maple looper moth", "yellow-winged pareuchaetes", "banyan tussock moth", "dark-banded owlet", "black-banded owlet", "philenora aspectalella", "common oak moth", "ruby tiger moth", "curve-lined owlet moth", "moonseed moth", "isabella tiger moth", "yellow-spotted renia", "slender owlet moth", "spotted grass moth", "straw dot", "deadwood borer moth", "the herald", "pale-edged selenisa", "swan moth", "agreeable tiger moth", "crimson tiger moth", "white ermine", "buff ermine", "virginian tiger moth", "polka-dot wasp moth", "handmaiden moth", "smoky tetanolita", "triangles", "cinnabar moth", "ornate bella moth", "crimson-speckled flunkey", "orange virbia", "joyful holomelina", "horrid zale", "lunate zale moth", "colorful zale moth", "variable zanclognatha", "grayish fan-foot", "dark marathyssa moth", "large paectes moth", "eyed paectes moth", "black-smudged chionodes moth", 
"cream-edged dichomeris moth", "red-necked peanutworm moth", "clouded magpie", "mottled beauty", "fall cankerworm moth", "common gray", "variable carpet moth", "st. john's wort inchworm", "pink arhodia", "oak besma", "pepper-and-salt geometer", "yellow-dusted cream", "common white wave", "light emerald", "pale beauty", "yellow shell moth", "gray spruce looper", "northern pine looper", "latticed heath", "chiasmia emersaria", "bicolored chloraspilates moth", "blackberry looper", "red-green carpet", "australian pug moth", "chloroclystis inductata", "chain-dotted geometer", "broken leaf moth", "the scribbler", "mottled gray carpet", "green carpet", "feathered thorn", "comostola laesaria", "barberry geometer", "bent-line carpet", "scalloped oak", "red-lined looper", "clay triple-lines", "dwarf tawny wave", "packard's wave", "sweetfern geometer moth", "forest semilooper", "showy emerald", "didymoctenia exsuperata", "broad-lined angle", "curve-lined angle", "hollow-spotted angle", "faint-spotted angle", "somber carpet moth", "false tiger moth", "the bad-wing", "northern marbled carpet", "orange-barred carpet moth", "marbled carpet moth", "small phoenix", "small engrailed", "common heath", "maple spanworm", "elm spanworm", "epidesmia tryxaria", "tulip-tree beauty", "common carpet", "autumnal moth", "new zealand looper", "mottled umber", "the beggar", "deep yellow euchlaena", "saw-wing", "white eulithis moth", "sharp-angled carpet moth", "common eupithecia", "confused eusarca", "curve-toothed geometer", "green broomweed looper moth", "waved-lined geometrid", "large emerald", "five-lined gray", "texas gray moth", "double-striped pug", "chickweed geometer", "common spring moth", "common emerald", "three-spotted fillip", "pistachio emerald", "brown bark carpet moth", "one-spotted variant", "pale oak beauty", "black looper", "riband wave", "small fan-footed wave", "portland ribbon wave", "red-bordered wave", "single-dotted wave", "least carpet", "small dusty wave", "dot-lined 
wave", "idiodes apicata", "black-dotted ruddy moth", "brown-shaded gray moth", "bent-line gray", "pale-veined isturgia", "hemlock looper", "pannaria wave moth", "scorched carpet", "drab brown wave", "powdered bigwing moth", "clouded border", "gray spring moth", "white spring moth", "common lytrosis", "common angle moth", "sharp-angled peacock moth", "red-headed inchworm moth", "tawny-barred angle moth", "lesser maple spanworm moth", "white-tipped black", "black geometrid", "canadian melanolophia moth", "western carpet", "orange wing", "white-ribboned carpet moth", "pale metanema", "milionia zonea", "horned spanworm moth", "red-fringed emerald", "red-bordered emerald", "white-fringed emerald", "brown-lined looper moth", "false hemlock looper", "orange magpie moth", "chimney-sweeper", "bruce spanworm", "winter moth", "brimstone moth", "the gem", "swallow-tailed moth", "spring cankerworm moth", "green pug", "juniper-twig geometer", "willow beauty", "small rivulet", "oak beauty", "small phigalia moth", "half-wing", "sinister moth", "apple looper", "hollow-spotted plagodis moth", "straight-lined plagodis", "barred umber", "lemon plagodis", "common tan wave", "common gum emerald", "alien probole moth", "large maple spanworm", "porcelain gray", "dot-lined angle", "common forest looper", "speckled yellow", "vestal", "omnivorous looper", "lewes wave", "large lace border moth", "plantain moth", "shaded broad-bar", "sharp-lined yellow", "insigillated pug", "black-veined moth", "wavy-lined emerald", "southern emerald moth", "white slant-line", "yellow slant-line", "cross-lined wave", "blood vein", "cross-line wave moth", "white-striped black moth", "tissue moth", "dark-barred twin-spot carpet", "garden carpet", "toothed brown carpet moth", "silver-ground carpet", "horse-chestnut leafminer", "common aspen leaf miner", "puriri moth", "common swift", "orange swift", "hoary edge", "pale sicklewing", "celia's roadside-skipper", "common bush hopper", "delaware skipper", "least 
skipper", "common spurwing", "monk skipper", "two-barred flasher", "sachem", "golden-banded skipper", "formosan swift", "white checkered-skipper", "common checkered skipper", "tropical checkered skipper", "orcus checkered-skipper", "desert checkered-skipper", "brazilian skipper", "mallow skipper", "chequered skipper", "white-striped longtail", "white-patched skipper", "orange skipperling", "southern skipperling", "daimio tethys", "sickle-winged skipper", "silver-spotted skipper", "wild indigo duskywing", "sleepy duskywing", "funereal duskywing", "horace's duskywing", "dreamy duskywing", "juvenal's duskywing", "propertius duskywing", "dingy skipper", "mournful duskywing", "dion skipper", "dun skipper", "northern white-skipper", "laviana white-skipper", "western branded skipper", "common branded skipper", "leonard's skipper", "fiery skipper", "chestnut bob", "clouded skipper", "eufala skipper", "hobomok skipper", "umber skipper", "taxiles skipper", "zabulon skipper", "restricted demon", "common banded demon", "rural skipper", "woodland skipper", "large skipper", "yellow-banded dart", "twin-spot skipper", "ocola skipper", "small branded swift", "guava skipper", "common sootywing", "broad-winged skipper", "long dash skipper", "peck's skipper", "sandhill skipper", "tawny-edged skipper", "whirlabout", "little glassywing", "lesser dart", "grizzled skipper", "southern grizzled skipper", "indian palm bob", "southern cloudywing", "northern cloudywing", "european skipper", "small skipper", "grass demon", "dorantes longtail", "brown longtail", "long-tailed skipper", "northern broken-dash", "southern broken dash", "speckled lactura", "dot-lined white", "drinker moth", "oak eggar", "fox moth", "eastern tent caterpillar moth", "western tent caterpillar", "forest tent caterpillar moth", "the lackey", "american lappet moth", "larch tolype", "large tolype moth", "rose myrtle lappet moth", "saddleback caterpillar moth", "shagreened slug moth", "yellow-collared slug moth", "spiny 
oak-slug moth", "green oak-slug moth", "crowned slug moth", "spun glass slug moth", "yellow-shouldered slug moth", "smaller parasa moth", "hag moth", "skiff moth", "early button slug moth", "common hedge blue", "common ciliate blue", "brown argus", "southern brown argus", "great purple hairstreak", "western pygmy blue", "common geranium-bronze", "brown elfin", "bramble green hairstreak", "western pine elfin", "juniper hairstreak", "henry's elfin", "eastern pine elfin", "green hairstreak", "red-banded hairstreak", "dusky-blue groundstreak", "common pierrot", "forget-me-not", "holly blue", "echo azure", "spring azure", "northern azure", "summer azure", "lime blue", "plains cupid", "long-banded silverline", "common silverline", "western tailed-blue", "short-tailed blue", "eastern tailed-blue", "small blue", "mazarine blue", "reakirt's blue", "branded imperial", "gram blue", "atala", "bernardino dotted-blue", "purple hairstreak", "harvester", "green-underside blue", "silvery blue", "golden hairstreak", "purple sapphire", "restricted purple sapphire", "ceraunus blue", "common tit", "acmon blue", "boisduval's blue", "lupine blue", "greenish blue", "metallic caerulean", "dark cerulean", "common caerulean", "long-tailed blue", "cassius blue", "marine blue", "lang's short-tailed blue", "zebra blue", "yamfly", "purple-shot copper", "tailed copper", "large copper", "gorgon copper", "purplish copper", "blue copper", "purple-edged copper", "bronze copper", "mariposa copper", "small copper", "sooty copper", "scarce copper", "great copper", "malayan", "transparent 6-line blue", "white m hairstreak", "silver-studded blue", "idas blue", "melissa blue", "karner blue", "amanda's blue", "adonis blue", "chalkhill blue", "common blue", "tailless line blue", "common line blue", "pale grass blue", "acadian hairstreak", "behr's hairstreak", "banded hairstreak", "california hairstreak", "edwards' hairstreak", "oak hairstreak", "striped hairstreak", "hedgerow hairstreak", "blue-spot 
hairstreak", "sylvan hairstreak", "coral hairstreak", "mallow scrub-hairstreak", "gray hairstreak", "red pierrot", "brown hairstreak", "dark grass blue", "dark grass blue", "common grass blue", "lesser grass blue", "tiny grass blue", "black-waved flannel moth", "southern flannel moth", "white flannel moth", "distinct quaker", "pale shoulder", "spotted sulphur", "sycamore moth", "american dagger moth", "green marvel", "yellow-haired dagger moth", "unmarked dagger moth", "fingered dagger", "cattail caterpillar", "smeared dagger moth", "ruddy dagger", "knot grass", "splendid dagger moth", "delightful dagger moth", "pale-banded dart", "heart and dart", "bogong moth", "black cutworm", "brown cutworm", "variable cutworm", "shuttle-shaped dart", "turnip moth", "venerable dart", "unspotted looper moth", "eight-spotted forester", "american copper underwing", "eight-spot moth", "celery looper", "green arches", "the nutmeg", "green cutworm moth", "yellow-headed cutworm moth", "dark arches", "bordered apamea moth", "the slowpoke", "alfalfa looper", "silver y", "common looper moth", "the flame", "three-lined balsa moth", "silver-spotted fern moth", "florida fern moth", "pink-shaded fern moth", "toadflax brocade moth", "tufted bird-dropping moth", "the laugher", "charadra dispulsa", "bent-line dart", "green garden looper", "soybean looper", "cloaked marvel", "nut-tree tussock", "closebanded yellowhorn moth", "dusky groundling", "white-dotted groundling", "grote's sallow moth", "american dun-bar moth", "dun-bar", "green blotched moth", "coronet", "tree-lichen beauty", "sharp-stigma looper moth", "asteroid moth", "brown-hooded owlet moth", "marbled white-spot", "dark-spotted looper moth", "hologram moth", "lantana moth", "chalcedony midget", "festive midget", "grateful midget", "variegated midget", "beloved emarginea moth", "beautiful wood-nymph", "pearly wood-nymph", "two-spot dart", "american angle shade", "great brocade", "master's dart", "dingy cutworm", "granulate cutworm", 
"comstock's sallow moth", "wedgling moth", "harris's three-spot", "cotton bollworm moth", "corn earworm", "creaky dart", "black wedge-spot", "vine's rustic", "common hyppa moth", "new zealand cutworm", "bright-line brown-eye", "laudable arches", "olive arches moth", "bristly cutworm", "double lobed moth", "adjutant wainscot", "dark-barred wainscot", "unknown wainscot", "green leuconycta", "marbled-green leuconycta moth", "small mossy lithacodia", "variable narrow-wing", "maliattha amorpha", "red-spotted lithacodia", "black-dotted maliattha", "black-bordered lemon moth", "bilobed looper", "hitched arches moth", "cloaked minor", "confused woodgrain moth", "fluid arches", "the white-point", "australian armyworm", "lesser wainscot", "white-speck", "gray half-spot", "bronzed cutworm moth", "lesser yellow underwing", "broad-bordered yellow underwing", "large yellow underwing", "flame-shouldered dart moth", "flame shoulder moth", "common pinkband", "rustic quaker", "common quaker", "hebrew character", "speckled green fruit-worm", "clouded drab", "black zigzag moth", "eastern panthea", "sensitive fern borer moth", "bracken borer", "variegated cutworm", "olive angle shades", "angle shades", "brown angle shades", "spotted phosphila moth", "turbulent phosphila moth", "stormy arches", "the hebrew", "olive-shaded bird-dropping moth", "small bird-dropping moth", "half-yellow moth", "proteuxoa hypochalchis", "pale glyph", "large mossy lithacodia moth", "pink-barred pseudeustrotia", "pink-spotted dart", "grapevine epimenis moth", "gray looper moth", "brother moth", "arcigera flower moth", "primrose moth", "ragweed flower moth", "beet armyworm", "fall armyworm", "oriental leafworm moth", "lawn armyworm", "yellow-striped armyworm", "spotted spragueia", "common spragueia", "western bean cutworm moth", "bicolored sallow moth", "salt-and-pepper looper moth", "exposed bird dropping moth", "four-patched bird-dropping moth", "slender burnished brass", "cabbage looper", "striped garden 
caterpillar moth", "harp-winged tripudia moth", "four-spotted moth", "wilson's wood-nymph moth", "setaceous hebrew character", "smith's dart", "square-spot rustic", "doubleday's baileya moth", "eyed baileya moth", "hieroglyphic moth", "sorghum webworm moth", "nola desmotes", "green silver-lines", "gum leaf skeletonizer", "sigmoid prominent moth", "apical prominent moth", "walnut caterpillar", "yellow-necked caterpillar moth", "destolmia lineata", "linden prominent moth", "white furcula moth", "common gluphisia", "wavy-lined heterocampa moth", "saddled prominent", "white-blotched heterocampa", "double-lined prominent moth", "mottled prominent moth", "white-dotted prominent moth", "double-toothed prominent moth", "bag-shelter moth", "georgian prominent moth", "angulose prominent moth", "oval-based prominent moth", "chocolate prominent", "buff-tip", "black-rimmed prominent moth", "elegant prominent moth", "california oak moth", "red-humped caterpillar", "morning-glory prominent moth", "unicorn prominent", "lobster moth", "pine processionary-moth", "oak processionary", "garden acraea", "yellow coster", "small orange acraea", "tawny coster", "california sister", "arizona sister", "band-celled sister", "iphicleola sister", "european peacock butterfly", "milbert's tortoiseshell", "small tortoiseshell", "gulf fritillary", "tropical leafwing butterfly", "goatweed leafwing butterfly", "brown peacock butterfly", "banded peacock", "white peacock", "texan crescent", "pale-banded crescent", "lesser purple emperor", "purple emperor", "ringlet", "map", "high brown fritillary", "tropical fritillary", "cardinal butterfly", "silver-washed fritillary", "angled castor", "hackberry emperor", "tawny emperor", "empress leilia", "staff sergeant", "red rim", "common bush brown", "american meadow fritillary", "purplish fritillary", "weaver's fritillary", "pacific fritillary", "pearl-bordered fritillary", "silver-bordered fritillary", "marbled fritillary", "lesser marbled fritillary", "great 
banded grayling", "pale owl-butterfly", "common wood nymph", "tawny rajah", "two-tailed pasha", "sagebrush checkerspot", "white-rayed patch", "gabb's checkerspot", "gorgone checkerspot", "harris's checkerspot", "crimson patch", "bordered patch", "silvery checkerspot", "northern checkerspot", "theona checkerspot", "pearly heath", "chestnut heath", "small heath", "large heath", "dirce beauty", "rustic", "gemmed satyr", "common mapwing", "african monarch", "tropical queen", "southern monarch", "common tiger", "queen butterfly", "lesser wanderer", "monarch butterfly", "anna’s eighty-eight", "juno silverspot", "mexican silverspot", "cape autumn widow", "autumn leaf", "silver emperor", "banded orange heliconian", "julia butterfly", "tiny checkerspot", "mexican sailor", "common palmfly", "scotch argus", "common alpine", "large ringlet", "arran brown", "woodland ringlet", "isabella's longwing", "dingy purplewing", "marsh fritillary", "variable checkerspot", "edith's checkerspot", "baltimore checkerspot", "common crow butterfly", "blue-spotted crow butterfly", "striped blue crow butterfly", "purple crow butterfly", "variegated fritillary", "mexican fritillary", "common baron", "large faun", "klug's xenica", "red cracker", "gray cracker", "variable cracker", "guatemalan cracker", "zebra longwing", "red postman", "hermes satyr", "carolina satyr", "red ring skirt", "common brown", "grayling", "tree grayling", "odius leafwing", "common brown ringlet", "great eggfly", "danaid eggfly", "ceylon blue glassy tiger", "queen of spain fritillary", "peacock pansy", "grey pansy", "common buckeye", "southern mangrove buckeye", "brown pansy", "yellow pansy", "chocolate pansy", "lemon pansy", "natal pansy", "dark blue pansy", "blue argus", "soldier pansy", "meadow argus", "blue admiral", "large wall brown", "wall", "knight", "northern pearly eye", "appalachian eyed brown", "banded treebrown", "bamboo treebrown", "eyed brown", "southern pearly-eye", "common archduke", "american snout", 
"viceroy", "red-spotted admiral", "white admiral", "lorquin's admiral", "poplar admiral", "southern white admiral", "weidemeyer's admiral", "meadow brown", "many-banded daggerwing", "ruddy daggerwing", "polymnia tigerwing", "little wood satyr", "red satyr", "marbled white", "iberian marbled white", "common evening brown", "dark evening brown", "heath fritillary", "glanville fritillary", "spotted fritillary", "knapweed fritillary", "common mestra", "elf butterfly", "dryad", "commander", "common morpho", "dark-branded bushbrown", "south china bushbrown", "mexican bluewing", "common sailor", "pallas' sailer", "mourning cloak", "california tortoiseshell", "false comma", "large tortoiseshell", "scarce tortoiseshell", "smooth-eyed bushbrown", "glassy tiger", "speckled wood butterfly", "clipper", "short banded sailer", "common leopard", "northern crescent", "vesta crescent", "mylitta crescent", "phaon crescent", "field crescent", "pearl crescent", "comma", "asian comma", "eastern comma", "green comma", "hoary comma", "question mark", "grey comma", "satyr comma", "spanish gatekeeper", "southern gatekeeper", "gatekeeper", "rusty-tipped page", "malachite", "blomfild's beauty", "dark green fritillary", "aphrodite fritillary", "atlantis fritillary", "callippe fritillary", "great spangled fritillary", "regal fritillary", "mormon fritillary", "common jester", "malay viscount", "claudina crescent", "elada checkerspot", "blue tiger", "dark blue tiger butterfly", "swordgrass brown", "west coast lady", "red admiral", "brazilian lady", "painted lady", "western painted lady", "new zealand red admiral", "asian admiral", "yellow admiral", "australian painted lady", "american lady", "common five-ring", "common four-ring", "white-shouldered house moth", "orange-headed epicallima moth", "brown house moth", "suzuki's promalactis moth", "pipevine swallowtail", "polydamas swallowtail", "zebra swallowtail", "tailed jay", "common jay", "common bluebottle", "iberian scarce swallowtail", "scarce 
swallowtail", "common rose swallowtail", "orchard swallowtail", "dainty swallowtail", "ruby-spotted swallowtail", "chinese peacock swallowtail", "canadian tiger swallowtail", "common mime swallowtail", "giant swallowtail", "mocker swallowtail", "citrus swallowtail", "citrus swallowtail", "pale swallowtail", "magnificent swallowtail", "eastern tiger swallowtail", "red helen swallowtail", "old world swallowtail", "great mormon swallowtail", "two-tailed swallowtail", "palamedes swallowtail", "paris peacock swallowtail", "blue mormon swallowtail", "common mormon swallowtail", "black swallowtail", "spangle swallowtail", "pink-spotted swallowtail", "western giant swallowtail", "western tiger swallowtail", "thoas swallowtail", "spicebush swallowtail", "asian swallowtail", "anise swallowtail", "apollo", "clodius parnassian", "clouded apollo", "rocky mountain parnassian", "mexican kite swallowtail", "southern festoon", "spanish festoon", "sleepy orange", "white angled-sulphur", "yellow angled sulphur", "orange tip", "falcate orangetip", "sara orangetip", "black-veined white", "striped albatross", "chocolate albatross", "great southern white", "pioneer white", "african caper", "caper white", "mexican dartwhite", "african migrant", "lemon migrant", "mottled emigrant", "common gull", "clouded yellow", "orange sulphur", "pink-edged sulphur", "clouded sulphur", "common jezebel", "painted jezebel", "red-based jezebel", "large marble", "three-spotted grass yellow", "broad-bordered grass yellow", "barred yellow", "common grass yellow", "mexican yellow", "giant white", "florida white", "cleopatra butterfly", "common brimstone", "great orange tip", "yellow orange tip", "lyside sulphur", "common green-eyed white", "psyche", "eastern dotted border", "dainty sulphur", "pine white", "indian wanderer", "large orange sulphur", "orange-barred sulphur", "cloudless sulphur", "large white butterfly", "indian cabbage white", "margined white", "green-veined white", "mustard white", "cabbage 
white", "west virginia white", "bath white", "eastern bath white", "western white", "checkered white", "little yellow", "mimosa yellow", "tailed orange", "southern dogface", "diamondback moth", "common bagworm moth", "evergreen bagworm", "morning-glory plume moth", "plain plume moth", "white plume moth", "sphenarches anisodactylus", "large tabby", "bee moth", "olive arta moth", "posturing arta moth", "trumpet vine moth", "rosy tabby", "endotricha mesenterialis", "endotricha pyrosalis", "dimorphic macalla moth", "broad-banded eulogia", "boxwood leaftier moth", "greater wax moth", "pink-fringed dolichomia moth", "clover hayworm", "yellow-fringed dolichomia", "darker moodna moth", "rosy-striped knot-horn", "pantry moth", "maple webworm moth", "meal moth", "spectrotrota fimbrialis", "long-legged tabby", "dimorphic tosale moth", "plum judy", "behr's metalmark", "fatal metalmark", "little metalmark", "red-bordered metalmark", "curve-winged metalmark", "duke of burgundy", "white-rayed metalmark", "red-bordered pixie", "long-tailed metalmark", "punchinello", "north american moon moth", "orange-tipped oakworm moth", "spiny oakworm moth", "pink-striped oakworm moth", "polyphemus moth", "io moth", "tulip-tree silkmoth", "promethea silkmoth", "regal moth", "pandora pinemoth", "rosy maple moth", "imperial moth", "western sheep moth", "buck moth", "cecropia moth", "columbia silk moth", "ceanothus silk moth", "gum emperor moth", "small emperor moth", "giant peacock moth", "banded scythris moth", "squash vine borer", "maple callus borer moth", "death's head hawkmoth", "pink-spotted hawkmoth", "convolvulus hawkmoth", "walnut sphinx moth", "nessus sphinx", "elm sphinx moth", "catalpa sphinx", "waved sphinx moth", "oleander hawk-moth", "azalea sphinx moth", "virginia creeper sphinx moth", "lettered sphinx moth", "elephant hawk-moth", "small elephant hawk-moth", "pawpaw sphinx moth", "mournful sphinx moth", "ello sphinx moth", "obscure sphinx moth", "achemon sphinx moth", "banded 
sphinx moth", "pandorus sphinx moth", "satellite sphinx moth", "vine sphinx moth", "snowberry clearwing", "broad-bordered bee hawk-moth", "hummingbird clearwing", "vine hawk-moth", "hippotion scrofa", "leafy spurge hawk moth", "bedstraw hawk-moth", "white-lined sphinx moth", "poplar hawk-moth", "northern pine sphinx moth", "hummingbird hawkmoth", "tomato hornworm", "rustic sphinx moth", "carolina sphinx moth", "lime hawk-moth", "fig sphinx moth", "modest sphinx moth", "blinded sphinx moth", "small-eyed sphinx moth", "tetrio sphinx moth", "one-eyed sphinx moth", "twin-spotted sphinx", "eyed hawkmoth", "smerinthus ophthalmica", "abbott's sphinx", "great ash sphinx moth", "laurel sphinx", "privet hawkmoth", "pine hawk-moth", "northern apple sphinx moth", "yam hawkmoth", "tersa sphinx", "spotted thyris moth", "mournful thyris", "arcane grass tubeworm moth", "clemens' grass tubeworm moth", "texas grass tubeworm moth", "maple leaftier", "fruit-tree leafroller moth", "omnivorous leafroller", "yellow-winged oak leafroller moth", "red-banded leafroller", "maple-basswood leafroller moth", "reticulated fruitworm moth", "spruce budworm", "oblique-banded leafroller moth", "black-patched clepsis moth", "garden tortrix", "white triangle tortrix", "greenish apple moth", "the batman moth", "cotton tipworm moth", "filbertworm moth", "codling moth", "verbena bud moth", "bidens borer moth", "goldenrod gall moth", "ragweed borer", "light brown apple moth", "white pinecone borer moth", "aster eucosma moth", "sculptured moth", "dotted ecdytolopha moth", "orange fruit borer", "three-lined leafroller", "exasperating platynota moth", "black-shaded platynota moth", "tufted apple bud moth", "omnivorous platynota moth", "sparganothis fruitworm", "lentiginos moth", "gray leafroller moth", "common marble", "brown scoopwing", "gray scoopwing", "tropical swallowtail moth", "micronia aculeata", "urania swallowtail moth", "european honeysuckle leafroller", "grapeleaf skeletonizer", "orange-patched 
smoky moth", "six-spot burnet", "narrow-bordered five-spot burnet", "zygaena transalpina", "european dwarf mantis", "conehead mantis", "mediterranean mantis", "stick mantis", "harabiro mantis", "praying mantis", "australian green mantis", "new zealand mantis", "false garden mantis", "carolina mantis", "arizona mantis", "chinese mantis", "south african praying mantis", "grass-like mantid", "common european scorpionfly", "nuptial scorpionfly", "summer fishfly", "spring fishfly", "eastern dobsonfly", "chrysopa oculata", "tasmanian lacewing", "wasp mantidfly", "four-spotted mantisfly", "say's mantidfly", "green mantidfly", "owly sulphur", "blue-spotted hawker", "canada darner", "lance-tipped darner", "southern hawker", "lake darner", "brown hawker", "variable darner", "sedge darner", "migrant hawker", "paddle-tailed darner", "black-tipped darner", "shadow darner", "green-striped darner", "emperor dragonfly", "common green darner", "comet darner", "lesser emperor", "springtime darner", "fawn darner", "swamp darner", "harlequin darner", "cyrano darner", "california darner", "blue-eyed darner", "spatterdock darner", "common flatwing", "river jewelwing", "copper demoiselle", "ebony jewelwing", "banded demoiselle", "beautiful demoiselle", "american rubyspot", "smoky rubyspot", "formosan jewelwing", "common blue jewel", "golden gem", "variable wisp", "wandering midget", "eastern red damsel", "paiute dancer", "blue-fronted dancer", "seepage dancer", "emma's dancer", "variable dancer", "kiowa dancer", "powdered dancer", "aztec dancer", "springwater dancer", "blue-ringed dancer", "blue-tipped dancer", "dusky dancer", "vivid dancer", "eastern billabongfly", "orange-tailed sprite", "orange-tailed marsh dart", "coromandel marsh dart", "common citril", "aurora damsel", "azure damselfly", "variable damselfly", "taiga bluet", "azure bluet", "double-striped bluet", "tule bluet", "familiar bluet", "common blue damselfly", "turquoise bluet", "atlantic bluet", "big bluet", "marsh bluet", 
"stream bluet", "skimming bluet", "hagen's bluet", "arroyo bluet", "orange bluet", "north american slender bluet", "vesper bluet", "red-eyed damselfly", "small red-eyed damselfly", "aurora bluetail", "pacific forktail", "black-fronted forktail", "blue-tailed damselfly", "citrine forktail", "common bluetail", "lilypad forktail", "western forktail", "fragile forktail", "rambur's forktail", "common bluetail", "eastern forktail", "sphagnum sprite", "sedge sprite", "blue riverdamsel", "pseudagrion pilidorsum", "large red damselfly", "desert firetail", "red and blue damsel", "red damselfly", "golden-ringed dragonfly", "delta-spotted spiketail", "pacific spiketail", "twin-spotted spiketail", "arrowhead spiketail", "downy emerald", "american emerald dragonfly", "racket-tailed emerald", "beaverpond baskettail", "common baskettail", "dot-winged baskettail", "prince baskettail", "spiny baskettail", "australian emerald dragonfly", "tau emerald", "clamp-tipped emerald", "black-banded gossamerwing", "euphaea formosa", "broad-striped forceptail", "lilypad clubtail", "jade clubtail", "unicorn clubtail", "black-shouldered spinyleg", "flag-tailed spinyleg", "eastern ringtail", "plains clubtail", "midland clubtail", "cobra clubtail", "common clubtail", "dragonhunter", "common clubtail", "small pincertail", "lancet clubtail", "ashy clubtail", "sulphur-tipped clubtail", "dusky clubtail", "five-striped leaftail", "four-striped leaftail", "common sanddragon", "eastern least clubtail", "russet-tipped clubtail", "california spreadwing", "great spreadwing", "slender ringtail", "blue damselfly", "wandering ringtail", "willow emerald damselfly", "plateau spreadwing", "southern spreadwing", "migrant spreadwing", "spotted spreadwing", "northern spreadwing", "emerald spreadwing", "amber-winged spreadwing", "sweetflag spreadwing", "elegant spreadwing", "slender spreadwing", "emerald damselfly", "lyre-tipped spreadwing", "swamp spreadwing", "small spreadwing", "common winter damselfly", "siberian 
winter damsel", "grizzled pintail", "grenadier", "oriental blue dasher", "red-tailed pennant", "four-spotted pennant", "ditch jewel", "southern banded groundling", "pale-faced clubskimmer", "calico pennant", "halloween pennant", "banded pennant", "martha's pennant", "broad scarlet", "scarlet skimmer", "wandering percher", "scarlet percher", "chalky percher", "checkered setwing", "black setwing", "swift setwing", "western pondhawk", "pin-tailed pondhawk", "eastern pondhawk", "great pondhawk", "plateau dragonlet", "seaside dragonlet", "black-winged dragonlet", "little blue dragonlet", "band-winged dragonlet", "blue corporal", "chalk-fronted corporal", "asiatic blood tail", "frosted whiteface", "crimson-ringed whiteface", "hudsonian whiteface", "dot-tailed whiteface", "belted whiteface", "golden-winged skimmer", "bar-winged skimmer", "comanche skimmer", "neon skimmer", "spangled skimmer", "broad-bodied chaser", "yellow-sided skimmer", "eight-spotted skimmer", "scarce chaser", "slaty skimmer", "widow skimmer", "needham's skimmer", "twelve-spotted skimmer", "four-spotted skimmer", "flame skimmer", "painted skimmer", "great blue skimmer", "marl pennant", "wandering pennant", "thornbush dasher", "pygmy percher", "elfin skimmer", "common parasol", "russet percher", "neurothemis taiwanensis", "pied paddy skimmer", "carmine skimmer", "roseate skimmer", "white-tailed skimmer", "southern skimmer", "blue skimmer", "black-lined skimmer", "brown-backed red marsh hawk", "epaulet skimmer", "keeled skipper", "blue marsh hawk", "julia skimmer", "slender blue skimmer", "crimson-tailed marsh hawk", "green marsh hawk", "scarlet skimmer", "blue-tailed forest hawk", "fiery skimmer", "blue dasher", "lucia widow", "red rock skimmer", "wandering glider", "spot-winged glider", "mexican amberwing", "eastern amberwing", "common whitetail", "swampwatcher", "filigree skimmer", "pied skimmer", "graphic flutterer", "yellow-striped flutterer", "common picture wing", "blue-faced meadowhawk", 
"variegated meadowhawk", "saffron-winged meadowhawk", "black meadowhawk", "yellow-winged darter", "red-veined darter", "cardinal meadowhawk", "cherry-faced meadowhawk", "southern darter", "white-faced meadowhawk", "striped meadowhawk", "banded darter", "ruddy darter", "band-winged meadowhawk", "common darter", "autumn meadowhawk", "vagrant darter", "twister", "carolina saddlebags", "black saddlebags", "red saddlebags", "violet dropwing", "red-veined dropwing", "crimson marsh glider", "black stream glider", "kirby's dropwing", "dancing dropwing", "scarlet basker", "stream cruiser", "swift river cruiser", "royal river cruiser", "gray petaltail", "yellow featherlegs", "orange threadtail", "white-legged damselfly", "abracris flavolineata", "garden locust", "oriental longheaded locust", "giant green slantface", "mediterranean slant-faced grasshopper", "aztec spur-throated grasshopper", "broad green-winged grasshopper", "egyptian locust", "clear-winged grasshopper", "painted meadow grasshopper", "common field grasshopper", "southern green-striped grasshopper", "green-striped grasshopper", "gold grasshopper", "rainbow grasshopper", "short-winged green grasshopper", "carolina grasshopper", "small gold grasshopper", "rufous grasshopper", "snakeweed grasshopper", "cattail toothpick grasshopper", "migratory locust", "two-striped grasshopper", "differential grasshopper", "red-legged grasshopper", "grizzly spur-throat grasshopper", "two-striped mermiria", "blue-winged grasshopper", "woodland grasshopper", "common green grasshopper", "common maquis grasshopper", "new zealand grasshopper", "wingless grasshopper", "marsh meadow grasshopper", "meadow grasshopper", "american bird grasshopper", "spotted bird grasshopper", "gray bird grasshopper", "obscure bird grasshopper", "large marsh grasshopper", "admirable grasshopper", "seaside grasshopper", "pallid-winged grasshopper", "crackling forest grasshopper", "giant grasshopper", "javanese grasshopper", "rufous-legged grasshopper", 
"wellington tree weta", "auckland tree weta", "tropical house cricket", "african field cricket", "european field cricket", "fall field cricket", "restless bush cricket", "jumping bush cricket", "two-spotted tree cricket", "nisitrus vittatus", "black-horned tree cricket", "narrow-winged tree cricket", "black field cricket", "japanese burrowing cricket", "koppie foam grasshopper", "elegant grasshopper", "greenhouse camel cricket", "plains lubber grasshopper", "devil's horse", "horse lubber grasshopper", "violet-winged grasshopper", "giant red-winged grasshopper", "mormon cricket", "common garden katydid", "short-winged meadow katydid", "slender meadow katydid", "long-winged conehead", "conocephalus melaenus", "white-faced bush-cricket", "wart-biter", "speckled bush-cricket", "southern oak bush-cricket", "drumming katydid", "lesser anglewing katydid", "greater anglewing", "broad-tipped conehead", "black-legged meadow katydid", "sickle-bearing bush-cricket", "mediterranean katydid", "dark bush-cricket", "common true katydid", "roesel's bush-cricket", "large conehead", "fork-tailed bush katydid", "mexican bush katydid", "northern bush katydid", "giant katydid", "brown-spotted bush-cricket", "upland green bush-cricket", "great green bush-cricket", "red-headed bush cricket", "northern walkingstick", "giant walkingstick", "smooth stick-insect", "southern two-striped walkingstick", "tree cattle", "four-lined silverfish", "gray silverfish", "common silverfish", "bowfin", "american eel", "green moray", "sergeant baker", "american gizzard shad", "white sucker", "central stoneroller", "goldfish", "common carp", "amur carp", "striped shiner", "common shiner", "golden shiner", "bluntnose minnow", "blacknose dace", "rudd", "creek chub", "chub", "sheepshead minnow", "banded killifish", "mummichog", "blackstripe topminnow", "western mosquitofish", "eastern mosquitofish", "atlantic tarpon", "northern pike", "chain pickerel", "brook stickleback", "three-spined stickleback", "northern 
clingfish", "spotted gar", "longnose gar", "florida gar", "flathead mullet", "atlantic blue tang", "brown surgeonfish", "orangeband surgeonfish", "convict surgeonfish", "orangespine unicornfish", "yellow tang", "sydney cardinalfish", "silver trevally", "yellowtail scad", "redeye bass", "redbreast sunfish", "green sunfish", "pumpkinseed sunfish", "warmouth sunfish", "bluegill", "longear sunfish", "redear sunfish", "smallmouth bass", "largemouth bass", "white crappie", "black crappie", "threadfin butterflyfish", "four-eyed butterflyfish", "red morwong", "magpie perch", "crested morwong", "texas cichlid", "mayan cichlid", "mozambique tilapia", "nile tilapia", "spotted hawkfish", "yellowfin pike", "old wife", "opaleye", "luderick", "stripey bream", "round goby", "french grunt", "bluestriped grunt", "australian mado", "stripey", "silver sweep", "moonlighter", "eastern blue groper", "mexican hogfish", "rainbow wrasse", "snakeskin wrasse", "slippery dick", "bluestreak cleaner wrasse", "crimson-banded wrasse", "southern maori wrasse", "senator wrasse", "günther's wrasse", "caribbean bluehead wrasse", "six-bar wrasse", "cortez rainbow wrasse", "moon wrasse", "ornate wrasse", "schoolmaster snapper", "grey snapper", "yellowtail snapper", "eastern pomfred", "white perch", "striped bass", "striped red mullet", "blackspot goatfish", "bluestriped goatfish", "rainbow darter", "yellow perch", "english perch", "walleye", "eastern hulafish", "french angelfish", "atlantic sergeant major", "scissortail sergeant major", "panamic sergeant major", "indo-pacific sergeant major", "clark's anemonefish", "mediterranean damselfish", "threespot damselfish", "garibaldi", "immaculate damsel", "yellowtail damselfish", "white-ear parma", "bluefish", "blue-barred parrotfish", "redband parrotfish", "stoplight parrotfish", "freshwater drum", "speckled trout", "atlantic croaker", "oyster drum", "red drum", "halfbanded seaperch", "kelp bass", "painted comber", "sea bream", "sheepshead", "silver 
seabream", "white bream", "common two-banded seabream", "pinfish", "saddled seabream", "salema porgy", "great barracuda", "eastern australian blackhead triplefin", "common triplefin", "moorish idol", "cutthroat trout", "coho salmon", "rainbow trout", "sockeye salmon", "king salmon", "brown trout", "brook trout", "tidepool sculpin", "dusky flathead", "red lionfish", "eastern red scorpionfish", "hardhead catfish", "black bullhead", "yellow bullhead", "brown bullhead catfish", "channel catfish", "striped eel catfish", "pacific trumpetfish", "trumpetfish", "bluespotted cornetfish", "pot-bellied seahorse", "spiny seahorse", "new holland seahorse", "leafy seadragon", "weedy seadragon", "eastern smooth boxfish", "striped burrfish", "three-barred porcupinefish", "spiny porcupinefish", "giant porcupinefish", "ocean sunfish", "southern pygmy leatherjacket", "six-spine leatherjacket", "yellowfin leatherjacket", "fanbelly leatherjacket", "smooth trunkfish", "yellow boxfish", "spotted boxfish", "stripebelly puffer", "guineafowl puffer", "smooth toadfish", "coastal tailed frog", "oriental fire-bellied toad", "yellow-bellied toad", "american toad", "western toad", "great plains toad", "fowler's toad", "red-spotted toad", "oak toad", "texas toad", "southern toad", "woodhouse's toad", "taiwan common toad", "european toad", "asiatic toad", "giant toad", "balearic green toad", "variable toad", "green toad", "common indian toad", "natterjack toad", "colorado river toad", "gulf coast toad", "central american gulf coast toad", "river toad", "leaf litter toad", "argentine toad", "cururu toad", "giant marine toad", "south american common toad", "eastern cane toad", "ranger's toad", "african common toad", "western leopard toad", "cachabi robber frog", "green and black dart-poison frog", "strawberry poison-dart frog", "alpine cricket frog", "rio grande chirping frog", "greenhouse frog", "blanchard's cricket frog", "northern cricket frog", "southern cricket frog", "rosenberg's gladiator 
frog", "european tree frog", "canyon tree frog", "cope's gray tree frog", "green tree frog", "mountain tree frog", "pine woods tree frog", "barking tree frog", "italian tree frog", "japanese tree frog", "mediterranean tree frog", "squirrel tree frog", "gray tree frog", "cuban treefrog", "california treefrog", "spring peeper", "baja california treefrog", "boreal chorus frog", "northern pacific treefrog", "sierran treefrog", "red snouted tree frog", "mexican smilisca", "masked tree frog", "veined frog", "painted reed frog", "guayaquil dwarf frog", "savage's thin-toed frog", "eastern narrow-mouthed toad", "western narrow-mouthed toad", "asian painted frog", "ornate chorus frog", "whistling tree frog", "eastern dwarf treefrog", "peron's tree frog", "green tree frog", "moore's frog", "red-eyed treefrog", "mexican giant tree frog", "african clawed frog", "cape river frog", "gray's stream frog", "schlegel's frog", "latouche's frog", "rio grande leopard frog", "plains leopard frog", "american bullfrog", "green frog", "pig frog", "pickerel frog", "northern leopard frog", "mink frog", "southern leopard frog", "wood frog", "swinhoe's frog", "black-spotted frog", "iberian green frog", "marsh frog", "moor frog", "northern red-legged frog", "foothill yellow-legged frog", "cascades frog", "agile frog", "california red-legged frog", "italian stream frog", "italian agile frog", "columbia spotted frog", "european common frog", "guenther's frog", "robust kajika frog", "temple tree frog", "common tree frog", "spot-legged treefrog", "couch's spadefoot", "eastern spadefoot", "western spadefoot", "mexican spadefoot", "northwestern salamander", "jefferson salamander", "blue-spotted salamander", "long-toed salamander", "spotted salamander", "western tiger salamander", "marbled salamander", "small-mouthed salamander", "tiger salamander", "california giant salamander", "coastal giant salamander", "northeastern china hynobiid salamander", "green salamander", "black salamander", "arboreal 
salamander", "california slender salamander", "garden slender salamander", "black-bellied slender salamander", "spotted dusky salamander", "dusky salamander", "seal salamander", "allegheny mountain dusky salamander", "black-bellied salamander", "ensatina", "northern two-lined salamander", "southern two-lined salamander", "long-tailed salamander", "cave salamander", "blue ridge two-lined salamander", "spring salamander", "four-toed salamander", "western slimy salamander", "eastern red-backed salamander", "white-spotted slimy salamander", "northern zigzag salamander", "northern slimy salamander", "southern red-backed salamander", "western red-backed salamander", "red salamander", "alpine newt", "smooth newt", "eastern newt", "alpine salamander", "lanza's alpine salamander", "fire salamander", "northern spectacled salamander", "rough-skinned newt", "sierra newt", "california newt", "italian crested newt", "great crested newt", "shikra", "cooper's hawk", "northern goshawk", "eurasian sparrowhawk", "sharp-shinned hawk", "crested goshawk", "cinereous vulture", "wedge-tailed eagle", "golden eagle", "imperial eagle", "steppe eagle", "tawny eagle", "black-collared hawk", "zone-tailed hawk", "short-tailed hawk", "common buzzard", "red-tailed hawk", "rough-legged hawk", "red-shouldered hawk", "gray hawk", "broad-winged hawk", "ferruginous hawk", "long-legged buzzard", "jackal buzzard", "swainson's hawk", "common black-hawk", "savanna hawk", "great black-hawk", "hook-billed kite", "brown snake eagle", "short-toed eagle", "western marsh-harrier", "swamp harrier", "hen harrier", "northern harrier", "pale harrier", "montagu's harrier", "greater spotted eagle", "american swallow-tailed kite", "black-shouldered kite", "black-winged kite", "white-tailed kite", "white-tailed hawk", "black-chested buzzard-eagle", "variable hawk", "crane hawk", "bearded vulture", "white-backed vulture", "griffon vulture", "white-tailed eagle", "bald eagle", "white-bellied sea-eagle", "african fish 
eagle", "brahminy kite", "whistling kite", "booted eagle", "mississippi kite", "plumbeous kite", "long-crested eagle", "pale chanting goshawk", "black kite", "red kite", "hooded vulture", "egyptian vulture", "crested hawk-eagle", "harris' hawk", "european honey-buzzard", "oriental honey-buzzard", "martial eagle", "african harrier-hawk", "snail kite", "roadside hawk", "crested serpent eagle", "bateleur", "osprey", "secretary bird", "mandarin duck", "wood duck", "egyptian goose", "brazilian teal", "northern pintail", "white-cheeked pintail", "chestnut teal", "common teal", "mexican duck", "red-billed teal", "speckled teal", "mottled duck", "yellow-billed pintail", "grey teal", "mallard", "indian spot-billed duck", "american black duck", "pacific black duck", "yellow-billed duck", "eastern spot-billed duck", "greater white-fronted goose", "greylag goose", "pink-footed goose", "snow goose", "swan goose", "taiga bean goose", "bar-headed goose", "ross's goose", "lesser scaup", "redhead", "white-eyed duck", "ring-necked duck", "common pochard", "tufted duck", "greater scaup", "new zealand scaup", "ferruginous duck", "canvasback", "brant", "canada goose", "cackling goose", "barnacle goose", "hawaiian goose", "bufflehead", "common goldeneye", "barrow's goldeneye", "muscovy duck", "cape barren goose", "australian wood duck", "upland goose", "long-tailed duck", "coscoroba swan", "black swan", "trumpeter swan", "tundra swan", "whooper swan", "black-necked swan", "mute swan", "black-bellied whistling-duck", "fulvous whistling-duck", "lesser whistling-duck", "white-faced whistling duck", "harlequin duck", "hooded merganser", "crested duck", "pink-eared duck", "american wigeon", "eurasian wigeon", "gadwall", "black scoter", "american white-winged scoter", "surf scoter", "smew", "common merganser", "red-breasted merganser", "rosybill pochard", "red-crested pochard", "ruddy duck", "spur-winged goose", "comb duck", "common eider", "northern shoveler", "cinnamon teal", "blue-winged 
teal", "garganey", "australasian shoveler", "ruddy shelduck", "common shelduck", "australian shelduck", "paradise shelduck", "southern screamer", "magpie goose", "oriental pied hornbill", "crowned hornbill", "african grey hornbill", "southern yellow-billed hornbill", "southern ground-hornbill", "green wood hoopoe", "common hoopoe", "white-throated swift", "common swift", "chimney swift", "vaux's swift", "european nightjar", "lesser nighthawk", "common nighthawk", "common pauraque", "common poorwill", "common potoo", "northern potoo", "tawny frogmouth", "berylline hummingbird", "azure-crowned hummingbird", "cinnamon hummingbird", "rufous-tailed hummingbird", "violet-crowned hummingbird", "buff-bellied hummingbird", "black-throated mango", "green-breasted mango", "black-chinned hummingbird", "ruby-throated hummingbird", "buff-tailed coronet", "lucifer hummingbird", "anna's hummingbird", "costa's hummingbird", "violet sabrewing", "canivet's emerald", "glittering-bellied emerald", "sparkling violetear", "lesser violetear", "broad-billed hummingbird", "rivoli's hummingbird", "swallow-tailed hummingbird", "white-necked jacobin", "green-crowned brilliant", "white-eared hummingbird", "blue-throated hummingbird", "booted racket-tail", "calliope hummingbird", "broad-tailed hummingbird", "rufous hummingbird", "allen's hummingbird", "violet-crowned woodnymph", "emu", "turkey vulture", "lesser yellow-headed vulture", "black vulture", "california condor", "king vulture", "andean condor", "razorbill", "marbled murrelet", "pigeon guillemot", "black guillemot", "rhinoceros auklet", "atlantic puffin", "common murre", "double-striped thick-knee", "spotted thick-knee", "bush stone-curlew", "water thick-knee", "kentish plover", "double-banded plover", "collared plover", "little ringed plover", "common ringed plover", "greater sand plover", "white-fronted plover", "piping plover", "lesser sand-plover", "snowy plover", "red-breasted dotterel", "kittlitz's plover", "red-capped plover", 
"semipalmated plover", "three-banded plover", "killdeer", "wilson's plover", "black-fronted dotterel", "european golden plover", "american golden-plover", "pacific golden-plover", "black-bellied plover", "blacksmith lapwing", "southern lapwing", "crowned lapwing", "red-wattled lapwing", "masked lapwing", "african wattled lapwing", "spur-winged lapwing", "northern lapwing", "collared pratincole", "black oystercatcher", "south island pied oystercatcher", "sooty oystercatcher", "pied oystercatcher", "african oystercatcher", "european oystercatcher", "american oystercatcher", "variable oystercatcher", "african jacana", "pheasant-tailed jacana", "comb-crested jacana", "wattled jacana", "northern jacana", "whiskered tern", "white-winged tern", "black tern", "black-billed gull", "grey-hooded gull", "slender-billed gull", "hartlaub's gull", "brown-hooded gull", "silver gull", "bonaparte's gull", "black-headed gull", "andean gull", "gull-billed tern", "little gull", "caspian tern", "pallas's gull", "mediterranean gull", "herring gull", "caspian gull", "california gull", "mew gull", "black-tailed gull", "ring-billed gull", "dominican gull", "lesser black-backed gull", "glaucous-winged gull", "iceland gull", "heermann's gull", "glaucous gull", "yellow-footed gull", "great black-backed gull", "yellow-legged gull", "western gull", "pacific gull", "laughing gull", "franklin's gull", "large-billed tern", "black-legged kittiwake", "black skimmer", "forster's tern", "common tern", "arctic tern", "white-fronted tern", "little tern", "least tern", "crested tern", "elegant tern", "royal tern", "sandwich tern", "black-winged stilt", "white-headed stilt", "black-necked stilt", "american avocet", "pied avocet", "greater painted-snipe", "common sandpiper", "spotted sandpiper", "ruddy turnstone", "black turnstone", "upland sandpiper", "sharp-tailed sandpiper", "sanderling", "dunlin", "baird's sandpiper", "red knot", "curlew sandpiper", "white-rumped sandpiper", "stilt sandpiper", "purple 
sandpiper", "western sandpiper", "pectoral sandpiper", "little stint", "least sandpiper", "ruff", "semipalmated sandpiper", "red-necked stint", "buff-breasted sandpiper", "temminck's stint", "surfbird", "wilson's snipe", "common snipe", "short-billed dowitcher", "long-billed dowitcher", "marbled godwit", "hudsonian godwit", "bar-tailed godwit", "black-tailed godwit", "long-billed curlew", "eurasian curlew", "far eastern curlew", "whimbrel", "red phalarope", "red-necked phalarope", "wilson's phalarope", "american woodcock", "grey-tailed tattler", "spotted redshank", "lesser yellowlegs", "wood sandpiper", "wandering tattler", "greater yellowlegs", "common greenshank", "green sandpiper", "willet", "solitary sandpiper", "marsh sandpiper", "common redshank", "terek sandpiper", "parasitic jaeger", "asian openbill stork", "white stork", "woolly-necked stork", "maguari stork", "black stork", "jabiru", "marabou stork", "wood stork", "yellow-billed stork", "painted stork", "speckled mousebird", "asian emerald dove", "speckled pigeon", "rock pigeon", "stock dove", "wood pigeon", "inca dove", "common ground-dove", "picuí ground-dove", "ruddy ground-dove", "bar-shouldered dove", "peaceful dove", "zebra dove", "new zealand pigeon", "white-tipped dove", "crested pigeon", "pale-vented pigeon", "band-tailed pigeon", "red-billed pigeon", "white-crowned pigeon", "spot-winged pigeon", "picazuro pigeon", "common bronzewing", "cape turtle dove", "spotted dove", "eurasian collared-dove", "oriental turtle dove", "red-eyed dove", "laughing dove", "red turtle dove", "european turtle dove", "african green pigeon", "pink-necked green-pigeon", "white-winged dove", "eared dove", "zenaida dove", "mourning dove", "common kingfisher", "pied kingfisher", "amazon kingfisher", "green kingfisher", "malachite kingfisher", "laughing kookaburra", "brown-hooded kingfisher", "woodland kingfisher", "white-throated kingfisher", "belted kingfisher", "giant kingfisher", "ringed kingfisher", "stork-billed 
kingfisher", "collared kingfisher", "forest kingfisher", "sacred kingfisher", "indian roller", "lilac-breasted roller", "european roller", "dollarbird", "european bee-eater", "white-fronted bee-eater", "green bee-eater", "rainbow bee-eater", "blue-cheeked bee-eater", "blue-tailed bee-eater", "little bee-eater", "turquoise-browed motmot", "andean motmot", "lesson's motmot", "russet-crowned motmot", "fan-tailed cuckoo", "greater coucal", "white-browed coucal", "yellow-billed cuckoo", "black-billed cuckoo", "smooth-billed ani", "greater ani", "groove-billed ani", "common cuckoo", "asian koel", "greater roadrunner", "guira cuckoo", "squirrel cuckoo", "crested caracara", "southern caracara", "brown falcon", "australian kestrel", "merlin", "aplomado falcon", "prairie falcon", "lesser kestrel", "new zealand falcon", "peregrine falcon", "bat falcon", "rock kestrel", "american kestrel", "eurasian hobby", "common kestrel", "red-footed falcon", "laughing falcon", "yellow-headed caracara", "chimango caracara", "rufous-tailed jacamar", "great curassow", "gray-headed chachalaca", "west mexican chachalaca", "plain chachalaca", "rufous-bellied chachalaca", "dusky-legged guan", "crested guan", "australian brush-turkey", "helmeted guineafowl", "california quail", "gambel's quail", "scaled quail", "northern bobwhite", "chukar", "red-legged partridge", "ruffed grouse", "sooty grouse", "dusky grouse", "spruce grouse", "gray francolin", "red junglefowl", "willow grouse", "wild turkey", "common peafowl", "gray partridge", "ring-necked pheasant", "cape spurfowl", "eurasian black grouse", "hazel grouse", "arctic loon", "common loon", "pacific loon", "red-throated loon", "limpkin", "blue crane", "demoiselle crane", "sandhill crane", "brolga", "whooping crane", "common crane", "white-breasted waterhen", "russet-naped wood-rail", "gray-cowled wood-rail", "american coot", "slate-colored coot", "red-gartered coot", "eurasian coot", "red-knobbed coot", "common moorhen", "common gallinule", 
"dusky moorhen", "weka", "buff-banded rail", "american purple gallinule", "south-west pacific swamphen", "gray-headed swamphen", "sora", "water rail", "clapper rail", "virginia rail", "ridgway's rail", "black crake", "grey go-away-bird", "hoatzin", "kori bustard", "rifleman", "yellow-rumped thornbill", "brown thornbill", "grey warbler", "white-browed scrubwren", "great reed-warbler", "australian reed warbler", "blyth's reed warbler", "sedge warbler", "eurasian reed-warbler", "booted warbler", "long-tailed tit", "bushtit", "common iora", "skylark", "horned lark", "crested lark", "woodlark", "dusky woodswallow", "white-breasted woodswallow", "pied butcherbird", "grey butcherbird", "australian magpie", "pied currawong", "cedar waxwing", "bohemian waxwing", "red-billed oxpecker", "lapland longspur", "snow bunting", "north island saddleback", "black-faced cuckoo-shrike", "scarlet minivet", "northern cardinal", "pyrrhuloxia", "blue bunting", "red-throated ant-tanager", "lazuli bunting", "blue grosbeak", "painted bunting", "indigo bunting", "orange-breasted bunting", "varied bunting", "yellow grosbeak", "rose-breasted grosbeak", "black-headed grosbeak", "flame-colored tanager", "hepatic tanager", "western tanager", "scarlet tanager", "summer tanager", "dickcissel", "brown creeper", "short-toed treecreeper", "treecreeper", "white-throated dipper", "american dipper", "golden-headed cisticola", "zitting cisticola", "common tailorbird", "plain prinia", "karoo prinia", "tawny-flanked prinia", "white-throated treecreeper", "white-winged chough", "california scrub-jay", "florida scrub jay", "mexican jay", "woodhouse's scrub-jay", "black-throated magpie-jay", "white-throated magpie-jay", "white-necked raven", "pied crow", "american crow", "northwestern crow", "common raven", "hooded crow", "carrion crow", "australian raven", "rook", "large-billed crow", "little raven", "eurasian jackdaw", "fish crow", "house crow", "blue jay", "steller's jay", "plush-crested jay", "green jay", 
"yucatán jay", "azure-winged magpie", "gray treepie", "rufous treepie", "eurasian jay", "eurasian nutcracker", "clark's nutcracker", "gray jay", "black-billed magpie", "yellow-billed magpie", "eurasian magpie", "oriental magpie", "brown jay", "alpine chough", "red-billed chough", "red-billed blue magpie", "mistletoebird", "fork-tailed drongo", "spangled drongo", "ashy drongo", "black drongo", "greater racket-tailed drongo", "corn bunting", "rock bunting", "cirl bunting", "yellowhammer", "yellow-throated bunting", "ortolan bunting", "rustic bunting", "reed bunting", "black-faced bunting", "common waxbill", "indian silverbill", "red-billed firefinch", "chestnut munia", "nutmeg mannikin", "white-rumped munia", "red-browed firetail", "bronze mannikin", "double-barred finch", "blue-breasted cordonbleu", "lesser redpoll", "common redpoll", "hoary redpoll", "eurasian goldfinch", "common rosefinch", "long-tailed rosefinch", "european greenfinch", "oriental greenfinch", "hawfinch", "evening grosbeak", "yellow-fronted canary", "scrub euphonia", "elegant euphonia", "yellow-throated euphonia", "thick-billed euphonia", "chaffinch", "brambling", "cassin's finch", "house finch", "purple finch", "gray-crowned rosy-finch", "linnet", "red crossbill", "white-winged crossbill", "pine grosbeak", "eurasian bullfinch", "european serin", "lawrence's goldfinch", "hooded siskin", "pine siskin", "lesser goldfinch", "eurasian siskin", "american goldfinch", "yellow-chinned spinetail", "pale-legged hornero", "rufous hornero", "narrow-billed woodcreeper", "ivory-billed woodcreeper", "lesser striped-swallow", "red-rumped swallow", "house martin", "white-throated swallow", "welcome swallow", "barn swallow", "wire-tailed swallow", "pacific swallow", "cave swallow", "cliff swallow", "gray-breasted martin", "purple martin", "brown-chested martin", "eurasian crag martin", "blue-and-white swallow", "bank swallow", "southern rough-winged swallow", "northern rough-winged swallow", "mangrove swallow", 
"white-winged swallow", "tree swallow", "violet-green swallow", "grayish baywing", "red-winged blackbird", "tricolored blackbird", "yellow-rumped cacique", "yellow-winged cacique", "yellow-hooded blackbird", "melodious blackbird", "bobolink", "rusty blackbird", "brewer's blackbird", "black-backed oriole", "bullock's oriole", "yellow-backed oriole", "hooded oriole", "baltimore oriole", "audubon's oriole", "altamira oriole", "yellow oriole", "scott's oriole", "spot-breasted oriole", "black-cowled oriole", "streak-backed oriole", "variable oriole", "orchard oriole", "black-vented oriole", "long-tailed meadowlark", "bronzed cowbird", "brown-headed cowbird", "shiny cowbird", "crested oropendola", "montezuma oropendola", "carib grackle", "boat-tailed grackle", "great-tailed grackle", "greater antillean grackle", "common grackle", "eastern meadowlark", "western meadowlark", "yellow-headed blackbird", "yellow-breasted chat", "northern shrike", "southern fiscal", "red-backed shrike", "brown shrike", "great grey shrike", "loggerhead shrike", "lesser grey shrike", "long-tailed shrike", "woodchat shrike", "masked laughingthrush", "jungle babbler", "black-backed puffback", "southern boubou", "bokmakierie", "superb fairywren", "red-backed fairy-wren", "splendid fairywren", "eastern spinebill", "red wattlebird", "little wattlebird", "new zealand bellbird", "yellow-faced honeyeater", "blue-faced honeyeater", "singing honeyeater", "brown honeyeater", "noisy miner", "lewin's honeyeater", "scarlet myzomela", "noisy friarbird", "white-cheeked honeyeater", "new holland honeyeater", "tūī", "white-plumed honeyeater", "superb lyrebird", "gray catbird", "blue mockingbird", "tropical mockingbird", "long-tailed mockingbird", "northern mockingbird", "chalk-browed mockingbird", "sage thrasher", "curve-billed thrasher", "long-billed thrasher", "california thrasher", "brown thrasher", "magpie-lark", "black-naped monarch", "indian paradise-flycatcher", "african paradise-flycatcher", "tawny 
pipit", "red-throated pipit", "grassveld pipit", "olive-backed pipit", "australasian pipit", "rock pipit", "meadow pipit", "american pipit", "oriental pipit", "water pipit", "tree pipit", "yellow-throated longclaw", "african pied wagtail", "white wagtail", "cape wagtail", "grey wagtail", "citrine wagtail", "western yellow wagtail", "eastern yellow wagtail", "indian robin", "white-rumped shama", "oriental magpie-robin", "cape robin-chat", "european robin", "verditer flycatcher", "european pied flycatcher", "thrush nightingale", "common nightingale", "bluethroat", "fiscal flycatcher", "blue rock-thrush", "african dusky flycatcher", "asian brown flycatcher", "spotted flycatcher", "blue whistling-thrush", "familiar chat", "northern wheatear", "daurian redstart", "plumbeous water redstart", "black redstart", "common redstart", "pied bushchat", "siberian stonechat", "whinchat", "european stonechat", "african stonechat", "red-flanked bluetail", "crimson sunbird", "orange-breasted sunbird", "brown-throated sunbird", "amethyst sunbird", "scarlet-chested sunbird", "greater double-collared sunbird", "purple sunbird", "southern double-collared sunbird", "olive-backed sunbird", "collared sunbird", "purple-rumped sunbird", "malachite sunbird", "black-naped oriole", "black-headed oriole", "eurasian golden oriole", "olive-backed oriole", "black-hooded oriole", "australasian figbird", "gray shrikethrush", "golden whistler", "rufous whistler", "bearded reedling", "spotted pardalote", "striated pardalote", "black-crested titmouse", "tufted titmouse", "oak titmouse", "juniper titmouse", "bridled titmouse", "blue tit", "azure tit", "crested tit", "great tit", "japanese tit", "coal tit", "black-capped chickadee", "carolina chickadee", "mountain chickadee", "boreal chickadee", "willow tit", "marsh tit", "chestnut-backed chickadee", "golden-crowned warbler", "rufous-capped warbler", "canada warbler", "wilson's warbler", "kentucky warbler", "mourning warbler", "macgillivray's warbler", 
"common yellowthroat", "worm-eating warbler", "orange-crowned warbler", "lucy's warbler", "tennessee warbler", "nashville warbler", "black-and-white warbler", "slate-throated redstart", "painted redstart", "crescent-chested warbler", "louisiana waterthrush", "northern waterthrush", "prothonotary warbler", "ovenbird", "northern parula", "black-throated blue warbler", "bay-breasted warbler", "cerulean warbler", "golden-cheeked warbler", "hooded warbler", "yellow-rumped warbler", "prairie warbler", "yellow-throated warbler", "blackburnian warbler", "grace's warbler", "magnolia warbler", "black-throated gray warbler", "hermit warbler", "palm warbler", "chestnut-sided warbler", "yellow warbler", "pine warbler", "tropical parula", "american redstart", "blackpoll warbler", "cape may warbler", "townsend's warbler", "black-throated green warbler", "golden-winged warbler", "blue-winged warbler", "rufous-crowned sparrow", "grasshopper sparrow", "leconte's sparrow", "nelson's sharp-tailed sparrow", "black-throated sparrow", "olive sparrow", "bell's sparrow", "sagebrush sparrow", "lark bunting", "henslow's sparrow", "common chlorospingus", "lark sparrow", "dark-eyed junco", "yellow-eyed junco", "swamp sparrow", "lincoln's sparrow", "song sparrow", "abert's towhee", "california towhee", "canyon towhee", "striped sparrow", "savannah sparrow", "fox sparrow", "rufous-winged sparrow", "cassin's sparrow", "stripe-headed sparrow", "green-tailed towhee", "eastern towhee", "spotted towhee", "vesper sparrow", "black-chinned sparrow", "brewer's sparrow", "clay-colored sparrow", "chipping sparrow", "field sparrow", "american tree sparrow", "white-throated sparrow", "golden-crowned sparrow", "rufous-collared sparrow", "white-crowned sparrow", "harris's sparrow", "southern gray-headed sparrow", "house sparrow", "spanish sparrow", "italian sparrow", "cape sparrow", "eurasian tree sparrow", "eastern yellow robin", "toutouwai", "scarlet robin", "north island robin", "tomtit", "olive warbler", 
"common chiffchaff", "wood warbler", "willow warbler", "grosbeak weaver", "red bishop", "white-browed sparrow-weaver", "cape weaver", "village weaver", "spectacled weaver", "baya weaver", "southern masked weaver", "red-billed quelea", "blue-gray gnatcatcher", "california gnatcatcher", "masked gnatcatcher", "black-tailed gnatcatcher", "cape sugarbird", "alpine accentor", "dunnock", "phainopepla", "gray silky-flycatcher", "satin bowerbird", "sombre greenbul", "brown-eared bulbul", "black bulbul", "sooty-headed bulbul", "common bulbul", "red-vented bulbul", "cape bulbul", "yellow-vented bulbul", "red-whiskered bulbul", "white-eared bulbul", "light-vented bulbul", "ruby-crowned kinglet", "common firecrest", "goldcrest", "golden-crowned kinglet", "verdin", "european penduline tit", "grey fantail", "new zealand fantail", "willie wagtail", "red-breasted nuthatch", "white-breasted nuthatch", "eurasian nuthatch", "brown-headed nuthatch", "pygmy nuthatch", "crested myna", "javan myna", "common mynah", "asian glossy starling", "violet-backed starling", "asian pied starling", "black-collared starling", "red-shouldered glossy-starling", "red-winged starling", "white-cheeked starling", "spotless starling", "european starling", "wrentit", "blackcap", "garden warbler", "whitethroat", "lesser whitethroat", "sardinian warbler", "barred antshrike", "green honeycreeper", "bananaquit", "red-legged honeycreeper", "blue dacnis", "cinnamon-bellied flowerpiercer", "great pampa-finch", "gray-headed tanager", "lesser antillean bullfinch", "black-faced grassquit", "yellow-billed cardinal", "red-crested cardinal", "blue-and-yellow tanager", "silver-beaked tanager", "crimson-backed tanager", "flame-rumped tanager", "scarlet-rumped tanager", "black-headed saltator", "golden-billed saltator", "grayish saltator", "buff-throated saltator", "saffron finch", "grassland yellow-finch", "double-collared seedeater", "variable seedeater", "morelet's seedeater", "yellow-bellied seedeater", "cinnamon-rumped 
seedeater", "blue-necked tanager", "golden-hooded tanager", "scrub tanager", "white-lined tanager", "golden tanager", "bay-headed tanager", "swallow tanager", "yellow-winged tanager", "blue-gray tanager", "palm tanager", "sayaca tanager", "yellow-faced grassquit", "blue-black grassquit", "rose-throated becard", "masked tityra", "cactus wren", "spotted wren", "rufous-naped wren", "band-backed wren", "canyon wren", "marsh wren", "sedge wren", "rock wren", "bewick's wren", "carolina wren", "house wren", "winter wren", "pacific wren", "eurasian wren", "veery", "hermit thrush", "gray-cheeked thrush", "swainson's thrush", "wood thrush", "varied thrush", "townsend's solitaire", "mountain bluebird", "western bluebird", "eastern bluebird", "creamy-bellied thrush", "chiguanco thrush", "austral thrush", "great thrush", "clay-colored thrush", "black-billed thrush", "redwing", "pale-breasted thrush", "eurasian blackbird", "american robin", "olive thrush", "song thrush", "fieldfare", "rufous-bellied thrush", "rufous-backed robin", "mistle thrush", "northern beardless-tyrannulet", "olive-sided flycatcher", "greater pewee", "western wood-pewee", "eastern wood-pewee", "yellow-bellied elaenia", "alder flycatcher", "pacific-slope flycatcher", "yellow-bellied flycatcher", "buff-breasted flycatcher", "hammond's flycatcher", "least flycatcher", "cordilleran flycatcher", "willow flycatcher", "acadian flycatcher", "gray flycatcher", "masked water-tyrant", "spectacled tyrant", "cattle tyrant", "boat-billed flycatcher", "tufted flycatcher", "ash-throated flycatcher", "great crested flycatcher", "dusky-capped flycatcher", "brown-crested flycatcher", "sulphur-bellied flycatcher", "streaked flycatcher", "rusty-margined flycatcher", "social flycatcher", "great kiskadee", "vermilion flycatcher", "black phoebe", "eastern phoebe", "say's phoebe", "common tody-flycatcher", "yellow-olive flycatcher", "couch's kingbird", "thick-billed kingbird", "gray kingbird", "scissor-tailed flycatcher", "tropical 
kingbird", "fork-tailed flycatcher", "eastern kingbird", "western kingbird", "cassin's kingbird", "pin-tailed whydah", "rufous-browed peppershrike", "bell's vireo", "cassin's vireo", "yellow-throated vireo", "yellow-green vireo", "warbling vireo", "white-eyed vireo", "hutton's vireo", "red-eyed vireo", "philadelphia vireo", "plumbeous vireo", "blue-headed vireo", "warbling white-eye", "silvereye", "indian white-eye", "swinhoe’s white-eye", "cape white-eye", "great egret", "grey heron", "cocoi heron", "goliath heron", "great blue heron", "intermediate egret", "black-headed heron", "white-necked heron", "purple heron", "chinese pond heron", "indian pond heron", "squacco heron", "american bittern", "eurasian bittern", "cattle egret", "striated heron", "green heron", "boat-billed heron", "little blue heron", "little egret", "white-faced heron", "reddish egret", "pacific reef heron", "snowy egret", "tricolored heron", "malayan night heron", "least bittern", "little bittern", "yellow bittern", "yellow-crowned night-heron", "nankeen night-heron", "black-crowned night heron", "whistling heron", "rufescent tiger heron", "bare-throated tiger heron", "australian pelican", "dalmatian pelican", "american white pelican", "brown pelican", "great white pelican", "grey pelican", "peruvian pelican", "hamerkop", "hadada", "white ibis", "green ibis", "bare-faced ibis", "roseate spoonbill", "african spoonbill", "yellow-billed spoonbill", "eurasian spoonbill", "black-faced spoonbill", "royal spoonbill", "white-faced ibis", "glossy ibis", "buff-necked ibis", "black-faced ibis", "african sacred ibis", "black-headed ibis", "australian white ibis", "straw-necked ibis", "lesser flamingo", "chilean flamingo", "greater flamingo", "american flamingo", "black-collared barbet", "crested barbet", "coppersmith barbet", "taiwan barbet", "pale-billed woodpecker", "crimson-crested woodpecker", "cardinal woodpecker", "northern flicker", "campo flicker", "gilded flicker", "green-barred woodpecker", 
"spot-breasted woodpecker", "golden-olive woodpecker", "white-backed woodpecker", "great spotted woodpecker", "syrian woodpecker", "middle spotted woodpecker", "black-rumped flameback", "white-headed woodpecker", "red-cockaded woodpecker", "lesser spotted woodpecker", "nuttall's woodpecker", "downy woodpecker", "ladder-backed woodpecker", "hairy woodpecker", "lineated woodpecker", "black woodpecker", "pileated woodpecker", "eurasian wryneck", "golden-fronted woodpecker", "red-bellied woodpecker", "golden-cheeked woodpecker", "red-headed woodpecker", "acorn woodpecker", "hoffmann's woodpecker", "lewis's woodpecker", "black-cheeked woodpecker", "red-crowned woodpecker", "gila woodpecker", "black-backed woodpecker", "american three-toed woodpecker", "grey-headed woodpecker", "green woodpecker", "red-naped sapsucker", "red-breasted sapsucker", "williamson's sapsucker", "yellow-bellied sapsucker", "northern emerald-toucanet", "chestnut-eared aracari", "collared aracari", "black-mandibled toucan", "keel-billed toucan", "toco toucan", "clark's grebe", "western grebe", "horned grebe", "great crested grebe", "red-necked grebe", "great grebe", "eared grebe", "pied-billed grebe", "hoary-headed grebe", "white-tufted grebe", "least grebe", "australasian grebe", "little grebe", "laysan albatross", "black-footed albatross", "black-browed albatross", "pink-footed shearwater", "sooty shearwater", "wedge-tailed shearwater", "cory's shearwater", "northern fulmar", "southern giant petrel", "sulphur-crested cockatoo", "little corella", "long-billed corella", "gang-gang cockatoo", "red-tailed black-cockatoo", "yellow-tailed black-cockatoo", "galah", "white-fronted parrot", "red-lored parrot", "lilac-crowned parrot", "red-crowned parrot", "blue-and-yellow macaw", "scarlet macaw", "nanday parakeet", "yellow-chevroned parakeet", "orange-chinned parakeet", "orange-fronted parakeet", "olive-throated parakeet", "brown-throated parakeet", "spectacled parrotlet", "monk parakeet", "blue-headed 
parrot", "red-masked parakeet", "green parakeet", "white-eyed parakeet", "mitred parakeet", "peach-faced lovebird", "australian king-parrot", "australian ringneck", "red-crowned parakeet", "musk lorikeet", "budgerigar", "pale-headed rosella", "crimson rosella", "eastern rosella", "red-rumped parrot", "rose-ringed parakeet", "scaly-breasted lorikeet", "rainbow lorikeet", "new zealand kaka", "kea", "little penguin", "gentoo penguin", "african penguin", "magellanic penguin", "northern saw-whet owl", "short-eared owl", "long-eared owl", "spotted owlet", "burrowing owl", "little owl", "spotted eagle-owl", "eurasian eagle-owl", "snowy owl", "great horned owl", "mottled owl", "ferruginous pygmy-owl", "northern pygmy-owl", "eastern screech-owl", "tropical screech-owl", "western screech-owl", "morepork", "tawny owl", "great gray owl", "ural owl", "barred owl", "northern hawk owl", "barn owl", "common ostrich", "anhinga", "oriental darter", "australian darter", "african darter", "magnificent frigatebird", "long-tailed cormorant", "little pied cormorant", "little cormorant", "pygmy cormorant", "european shag", "double-crested cormorant", "neotropic cormorant", "cape cormorant", "great cormorant", "pelagic cormorant", "brandt's cormorant", "spotted shag", "little black cormorant", "australian pied cormorant", "northern gannet", "australasian gannet", "masked booby", "brown booby", "blue-footed booby", "red-footed booby", "resplendent quetzal", "gartered trogon", "citreoline trogon", "collared trogon", "elegant trogon", "slaty-tailed trogon", "black-headed trogon", "black-throated trogon", "pronghorn", "minke whale", "humpback whale", "impala", "hartebeest", "springbok", "american bison", "domestic cattle", "nilgai", "water buffalo", "domestic goat", "alpine ibex", "blue wildebeest", "bontebok/blesbok", "mountain gazelle", "roan antelope", "waterbuck", "kob", "mountain goat", "klipspringer", "gemsbok", "domestic sheep", "bighorn sheep", "steenbok", "northern chamois", "bush 
duiker", "african buffalo", "nyala", "eland", "bushbuck", "greater kudu", "guanaco", "vicugna", "elk", "axis deer", "western roe deer", "elk", "red deer", "sika deer", "fallow deer", "mule deer", "white-tailed deer", "caribou", "sambar", "common dolphin", "risso's dolphin", "killer whale", "bottlenose dolphin", "gray whale", "giraffe", "hippopotamus", "common porpoise", "common warthog", "wild boar", "collared peccary", "asian jackal", "domestic dog", "coyote", "gray wolf", "black-backed jackal", "culpeo", "south american gray fox", "gray fox", "island fox", "red fox", "cheetah", "domestic cat", "canada lynx", "bobcat", "lion", "jaguar", "leopard", "mountain lion", "banded mongoose", "small indian mongoose", "spotted hyena", "common hog-nosed skunk", "hooded skunk", "striped skunk", "sea otter", "northern river otter", "eurasian otter", "american marten", "stone marten", "european badger", "stoat", "long-tailed weasel", "least weasel", "american mink", "fisher", "american badger", "southern fur seal", "afro-australian fur seal", "steller sea lion", "south american sealion", "new zealand sea lion", "california sea lion", "galápagos sea lion", "gray seal", "northern elephant seal", "southern elephant seal", "hawaiian monk seal", "harbor seal", "ringtail", "white-nosed coati", "south american coati", "common raccoon", "american black bear", "brown bear", "proboscis bat", "mexican free-tailed bat", "indian flying fox", "grey-headed flying-fox", "big brown bat", "silver-haired bat", "eastern red bat", "hoary bat", "tricolored bat", "nine-banded armadillo", "sunda flying lemur", "common opossum", "virginia opossum", "agile wallaby", "western grey kangaroo", "eastern grey kangaroo", "common wallaroo", "red-necked wallaby", "swamp wallaby", "common brushtail", "koala", "common ringtail possum", "common wombat", "common hedgehog", "northern white-breasted hedgehog", "northern short-tailed shrew", "star-nosed mole", "eastern mole", "broad-footed mole", "european mole", "rock 
hyrax", "snowshoe hare", "black-tailed jackrabbit", "brown hare", "white-tailed jackrabbit", "european rabbit", "swamp rabbit", "desert cottontail", "brush rabbit", "mexican cottontail", "eastern cottontail", "mountain cottontail", "marsh rabbit", "american pika", "duck-billed platypus", "short-beaked echidna", "donkey", "domestic horse", "plains zebra", "white rhinoceros", "south american tapir", "brown-throated sloth", "hoffmann's two-toed sloth", "northern tamandua", "mantled howler monkey", "mexican black howler monkey", "colombian red howler monkey", "central american spider monkey", "black-pencilled marmoset", "white-throated capuchin", "blue monkey", "vervet monkey", "formosan rock macaque", "crab-eating macaque", "rhesus macaque", "bonnet macaque", "olive baboon", "chacma baboon", "northern plains gray langur", "dusky leaf-monkey", "asian elephant", "african bush elephant", "american beaver", "eurasian beaver", "brazilian guinea pig", "capybara", "northern mountain viscacha", "california vole", "meadow vole", "bank vole", "dusky-footed woodrat", "muskrat", "white-footed mouse", "deer mouse", "hispid cotton rat", "central american agouti", "coypu", "common porcupine", "botta's pocket gopher", "cape porcupine", "crested porcupine", "house mouse", "norway rat", "black rat", "four-striped grass mouse", "harris' antelope squirrel", "white-tailed antelope squirrel", "pallas's squirrel", "plantain squirrel", "golden-mantled ground squirrel", "cascade golden-mantled ground squirrel", "black-tailed prairie dog", "siberian chipmunk", "indian palm squirrel", "northern palm squirrel", "south african ground squirrel", "southern flying squirrel", "thirteen-lined ground squirrel", "hoary marmot", "yellow-bellied marmot", "alpine marmot", "woodchuck", "yellow-pine chipmunk", "cliff chipmunk", "merriam's chipmunk", "least chipmunk", "townsend's chipmunk", "california ground squirrel", "rock squirrel", "abert's squirrel", "allen's squirrel", "red-bellied squirrel", "eastern 
gray squirrel", "red-tailed squirrel", "western gray squirrel", "fox squirrel", "variegated squirrel", "eurasian red squirrel", "yucatán squirrel", "eastern chipmunk", "douglas' squirrel", "american red squirrel", "belding's ground squirrel", "columbian ground squirrel", "richardson's ground squirrel", "round-tailed ground squirrel", "west indian manatee", "american alligator", "common caiman", "american crocodile", "belize crocodile", "nile crocodile", "saltwater crocodile", "tuatara", "southern tree agama", "southern rock agama", "african redhead agama", "jacky lashtail", "green crested lizard", "great crested canopy lizard", "common green forest lizard", "oriental garden lizard", "taiwan japalure", "australian water dragon", "bearded dragon", "roughtail rock agama", "slow worm", "italian slow worm", "transvolcanic alligator lizard", "northern alligator lizard", "arizona alligator lizard", "southern alligator lizard", "texas alligator lizard", "slender glass lizard", "eastern glass lizard", "boa constrictor", "central american boa", "northern rubber boa", "northern three-lined boa", "knysna dwarf chameleon", "cape dwarf chameleon", "common chameleon", "flap-necked chameleon", "oriental whipsnake", "glossy snake", "eastern worm snake", "scarlet snake", "western shovelnose snake", "north american racer", "sharp-tailed snake", "smooth snake", "red-lipped snake", "common bronze-back", "black treesnake", "ringneck snake", "boomslang", "central american indigo snake", "mud snake", "rough earthsnake", "plains hognose snake", "eastern hognose snake", "green whip snake", "desert nightsnake", "chihuahuan nightsnake", "coast night snake", "blunthead tree snake", "california king snake", "yellow-bellied kingsnake", "western milksnake", "common kingsnake", "speckled kingsnake", "black kingsnake", "desert kingsnake", "eastern milksnake", "northern cat-eyed snake", "giant parrot snake", "coachwhip", "striped racer", "neotropical whip snake", "striped whipsnake", "barred grass 
snake", "viperine snake", "grass snake", "tessellated water snake", "saltmarsh snake", "plain-bellied water snake", "banded water snake", "florida green watersnake", "diamondback watersnake", "northern watersnake", "brown watersnake", "rough green snake", "smooth greensnake", "brown vinesnake", "eastern rat snake", "great plains ratsnake", "corn snake", "western rat snake", "gray ratsnake", "eastern foxsnake", "western leaf-nosed snake", "gopher snake", "mexican bull snake", "graham's crayfish snake", "queen snake", "long-nosed snake", "eastern patch-nosed snake", "western patch-nosed snake", "western groundsnake", "chicken snake", "dekay's brownsnake", "redbelly snake", "flat-headed snake", "aquatic garter snake", "black-necked gartersnake", "western terrestrial garter snake", "two-striped garter snake", "checkered garter snake", "northwestern garter snake", "western ribbon snake", "plains garter snake", "eastern ribbonsnake", "common garter snake", "lined snake", "smooth earth snake", "aesculapian snake", "black girdled lizard", "common basilisk", "green basilisk", "brown basilisk", "helmeted iguana", "desert collared lizard", "eastern collared lizard", "long-nosed leopard lizard", "neotropical green anole", "green anole", "crested anole", "bark anole", "knight anole", "border anole", "clouded anole", "brown anole", "common gecko", "texas coralsnake", "eastern tiger snake", "red-bellied black snake", "western banded gecko", "marbled leaf-toed gecko", "marbled gecko", "mutilating gecko", "southern palm gecko", "tokay gecko", "common house gecko", "tropical house gecko", "flat-tailed house gecko", "mediterranean house gecko", "mourning gecko", "cape dwarf gecko", "gold dust day gecko", "gila monster", "marine iguana", "western spiny-tailed iguana", "black spiny-tailed iguana", "desert iguana", "green iguana", "common chuckwalla", "sand lizard", "western green lizard", "green lizard", "common wall lizard", "italian wall lizard", "large psammodromus", "ocellated 
lizard", "viviparous lizard", "common slug-eater", "common mock viper", "mole snake", "northern curly-tailed lizard", "texas blind snake", "thin tree iguana", "zebra-tailed lizard", "greater earless lizard", "western earless lizard", "keeled earless lizard", "coast horned lizard", "texas horned lizard", "greater short-horned lizard", "flat-tailed horned lizard", "roundtail horned lizard", "mountain horned lizard", "desert horned lizard", "regal horned lizard", "clark's spiny lizard", "prairie lizard", "southwestern fence lizard", "west gulf rough-scaled lizard", "sagebrush lizard", "graphic spiny lizard", "mountain spiny lizard", "desert spiny lizard", "emerald swift", "western fence lizard", "texas spiny lizard", "granite spiny lizard", "crevice spiny lizard", "longtail spiny lizard", "eastern spiny lizard", "crevice swift", "plateau fence lizard", "eastern fence lizard", "yellow-backed spiny lizard", "rose-bellied lizard", "small-scaled lizard", "ornate tree lizard", "common side-blotched lizard", "moorish gecko", "turniptail gecko", "carpet python", "eastern water skink", "longtail mabuya", "many-lined sun-skink", "rainbow skink", "pale-flecked garden sunskink", "common new zealand skink", "common five-lined skink", "gilbert's skink", "southeastern five-lined skink", "broadhead skink", "great plains skink", "western skink", "little brown skink", "shingleback lizard", "common bluetongue", "rainbow mabuya", "african striped skink", "white-throated gecko", "amazon racerunner", "chihuahuan spotted whiptail", "common spotted whiptail", "race-runner", "six-lined racerunner", "sonoran spotted whiptail", "gray-checkered whiptail", "western whiptail", "plateau striped whiptail", "middle american ameiva", "rainbow ameiva", "argentine black-and-white tegu", "khorat blind snake", "bengal monitor", "clouded monitor", "nile monitor", "common water monitor", "lace monitor", "florida cottonmouth", "eastern copperhead", "broad-banded copperhead", "northern cottonmouth", "puff 
adder", "eyelash viper", "fer-de-lance", "eastern diamondback rattlesnake", "western diamondback rattlesnake", "sidewinder", "timber rattlesnake", "western black-tailed rattlesnake", "western rattlesnake", "eastern black-tailed rattlesnake", "southwestern speckled rattlesnake", "red diamond rattlesnake", "mohave rattlesnake", "prairie rattlesnake", "massasauga", "pygmy rattlesnake", "western massasauga", "chinese green tree viper", "asp viper", "adder", "hilaire’s side-necked turtle", "loggerhead sea turtle", "green turtle", "hawksbill", "pacific ridley", "snapping turtle", "western pond turtle", "painted turtle", "spotted turtle", "blanding's turtle", "european pond turtle", "wood turtle", "northern map turtle", "ouachita map turtle", "false map turtle", "diamondback terrapin", "river cooter", "florida red-bellied cooter", "peninsular cooter", "northern red-bellied cooter", "texas cooter", "common box turtle", "western box turtle", "pond slider", "mesoamerican slider", "mediterranean turtle", "striped mud turtle", "yellow mud turtle", "mexican mud turtle", "eastern mud turtle", "razor-backed musk turtle", "common musk turtle", "angulate tortoise", "(californian) desert tortoise", "texas tortoise", "morafka’s desert tortoise", "gopher tortoise", "leopard tortoise", "common tortoise", "hermann's tortoise", "florida softshell turtle", "spiny softshell", "atlantic jackknife", "nuttall's cockle", "giant atlantic cockle", "small giant clam", "gould beanclam", "coquina", "zebra mussel", "soft-shelled clam", "angel wing", "atlantic ribbed mussel", "california mussel", "blue mussel", "new zealand green-lipped mussel", "little black mussel", "eastern oyster", "pacific oyster", "european flat oyster", "common jingle", "green falsejingle", "atlantic calico scallop", "atlantic bay scallop", "pacific calico scallop", "giant rock scallop", "eastern elliptio", "giant floater mussel", "asian clam", "atlantic surf clam", "new zealand cockle", "cross-barred venus", "pacific 
littleneck clam", "hard clam", "japanese littleneck", "pismo clam", "common sydney octopus", "common octopus", "australian giant cuttlefish", "european common cuttlefish", "ram's horn squid", "california seahare", "black sea hare", "channeled applesnail", "chinese mystery snail", "california aglaja", "globular drop snail", "rough keyhole limpet", "volcano keyhole limpet", "giant keyhole limpet", "common slipper snail", "marsh periwinkle", "common periwinkle", "flat periwinkle", "shark eye snail", "lewis' moon snail", "round-mouthed snail", "queen conch", "florida fighting conch", "ostrich foot snail", "scaled worm snail", "common whelk", "knobbed whelk", "speckled whelk", "mud whelk", "kellet's whelk", "lightning whelk", "california cone", "leafy hornmouth", "cart-rut shell", "wrinkled purple", "dog whelk", "striped dogwinkle", "eastern oyster drill", "eastern mud snail", "lettered olive", "purple olive snail", "arabic volute", "shag-rug nudibranch", "modest cadlina", "clown doris", "short-tailed ceratosoma", "anna's chromodoris", "black-margined nudibranch", "three-lined aeolid", "white-spotted dorid", "branched dendronotus", "white-lined dirona", "colorful dirona", "spotted leopard dorid", "san diego dorid", "heath's dorid", "sea lemon", "red dorid", "pancake aphelodoris", "monterey dorid", "hilton's aeolid", "australian blue dragon nudibranch", "blue dragon", "spanish shawl", "hopkins' rose nudibranch", "thick-horned nudibranch", "opalescent nudibranch", "orange-peel doris", "black-tipped spiny doris", "rough-mantled dorid", "ocellate phyllidia", "varicose wart slug", "pustulose wart slug", "cockerell's dorid", "orange-spike polycera", "sea clown triopha", "spotted triopha", "giant african land snail", "decollate snail", "milky slug", "button's banana slug", "california banana slug", "pacific banana slug", "black slug", "red slug", "dusky arion", "arboreal snail", "asian tramp snail", "slippery moss-snail", "rounded snail", "small pointed snail", "california 
lancetooth snail", "robust lancetooth snail", "copse snail", "green garden snail", "white-lipped snail", "brown-lipped snail", "garden snail", "turkish snail", "roman snail", "chocolate-band snail", "milk snail", "white italian snail", "yellow garden slug", "ash-black slug", "leopard slug", "jet slug", "florida tree snail", "draparnaud's glass-snail", "changeable mantleslug", "chinese slug", "southern flatcoil", "rosy wolfsnail", "southern california shoulderband snail", "pacific sideband", "redwood sideband", "cuban brown snail", "black-velvet leatherleaf", "tropical leatherleaf slug", "blue-ringed top snail", "norris' top-shell", "brown tegula", "banded tegula", "black tegula", "spotted top snail", "cat's eye snail", "wavy turban", "gumboot chiton", "west indian fuzzy chiton", "blue green chiton", "snakeskin chiton", "black katy chiton", "woody chiton", "mossy chiton", "california spiny chiton", "lined chiton", "christmas lichen", "gold dust lichen", "black knot", "yellow cobblestone lichen", "hooded rosette lichen", "hoary rosette lichen", "rosette lichen", "star rosette lichen", "candleflame lichen", "mealy pixie cup", "common powderhorn", "british soldier lichen", "evans' deer moss", "many-forked cladonia", "lipstick powderhorn", "gray reindeer lichen", "star-tipped reindeer lichen", "stonewall rim lichen", "boreal oakmoss", "oakmoss", "common greenshield lichen", "speckled greenshield", "hooded tube lichen", "powder-headed tube lichen", "brown-eyed wolf lichen", "wolf lichen", "shield lichen", "powdered ruffle lichen", "varied rag lichen", "tree moss", "rough speckled shield lichen", "old man's beard", "red beard lichen", "bushy beard lichen", "powdered sunshine lichen", "cartilage lichen", "farinose cartilage lichen", "lace lichen", "porpidia lichen", "common script lichen", "whitewash lichen", "lobaria anthraspis", "tree lungwort", "smooth lungwort", "membranous pelt lichen", "scaly pelt lichen", "dibaeis arcuata", "pink earth lichen", "candy lichen", "map 
lichen", "elegant sunburst lichen", "golden-eye lichen", "slender orange-bush", "golden hair-lichen", "hooded sunburst lichen", "maritime sunburst lichen", "common toadskin lichen", "smooth rock tripe", "green wood cup", "purple jellydisc", "lemon discos", "ochre jelly club", "black bulgar", "black tar spot", "spring orange peel fungus", "false morel", "saddle-shaped false morel", "white saddle", "western black elfin saddle", "white morel", "early false morel", "orange peel fungus", "scarlet elfcup", "scarlet cup", "stalked scarlet cup", "hairy rubber cup", "devil's urn", "bolete mould", "lobster mushroom", "coral spot", "cramp balls", "brittle cinder", "candlesnuff fungus", "dead man's fingers", "the prince", "meadow mushroom", "yellow stainer", "sandy stilt-puffball", "shaggy parasol", "green-spored parasol", "shaggy parasol", "shaggy mane", "reddening lepiota", "white dapperling", "flowerpot parasol", "fragile dapperling", "graceful parasol", "parasol", "desert shaggymane", "western yellow-veiled amanita", "eastern north american destroying angel", "false death-cap", "yellow patches", "tawny grisette", "jewelled amanita", "jackson's slender caesar", "fly agaric", "blushing bride amanita", "panthercap", "deathcap", "blusher", "grisette", "springtime amanita", "yellow fieldcap", "white dunce cap", "fairy fingers", "violet coral", "golden spindles", "viscid violet cort", "violet webcap", "peeling oysterling", "silverleaf fungus", "aborted entoloma", "entoloma hochstetteri", "beefsteak polypore", "western amethyst laccaria", "amethyst deceiver", "waxy laccaria", "purple laccaria", "parrot mushroom", "scarlet waxy cap", "witch's hat", "golden waxy cap", "western witch's hat", "lichenomphalia chromacea", "lichen agaric", "funeral bell", "spectacular rustgill", "pear-shaped puffball", "giant puffball", "warted puffball", "fairy ring marasmius", "red pinwheel", "collared parachute", "cruentomycena viscidocruenta", "orange pore fungus", "orange bonnet", "yellowleg 
bonnet", "bleeding fairy helmet", "pixie's parasol", "orange mycena", "lilac bonnet", "luminescent panellus", "roridomyces austrororidus", "late oyster", "common bird's nest fungus", "dung-loving bird's nest fungus", "splash cups", "common collybia", "fairy parachutes", "jack o'lantern mushroom", "western jack-o'-lantern mushroom", "honey mushroom", "ringless honey mushroom", "velvet foot", "fir-cone mushroom", "veiled oyster", "oyster mushroom", "summer oyster mushroom", "deer mushroom", "silky rosegill", "stubble rosegill", "eastern american platterful mushroom", "mock oyster mushroom", "trooping crumble cap", "mica cap", "common ink cap", "hare's foot inkcap", "magpie inkcap", "scaly ink cap", "weeping widow", "panaeolus antillarum", "mower's mushroom", "petticoat mottlegill", "pleated inkcap", "pale brittlestem", "splitgill mushroom", "common fieldcap", "spring fieldcap", "cyclocybe parasitica", "smoky-gilled hypholoma", "sulphur tuft", "brick cap", "red roundhead", "scarlet pouch", "mulch maids", "golden pholiota", "shaggy pholiota", "questionable stropharia", "wine-cap stropharia", "clouded agaric", "bitter false funnelcap", "totally tedious tubaria", "pinewood gingertail", "jelly ear", "ear fungus", "black witches' butter", "amber jelly fungus", "crystal brain fungus", "hygroscopic earthstar", "king bolete", "peppery bolete", "frost's bolete", "bay bolete", "red-capped scaber stalk", "brown birch-bolete", "ornate-stalked bolete", "old-man-of-the-woods", "liver bolete", "bitter bolete", "suede bolete", "ash-tree bolete", "false chanterelle", "brown roll-rim", "dyeball", "common earthball", "chicken fat mushroom", "fat jack", "dotted-stalked suillus", "purple-veiled slippery jack", "pungent slippery jack", "painted suillus", "velvet-footed pax", "california golden chanterelle", "golden chanterelle", "red chanterelle", "smooth chanterelle", "black trumpet", "yellowfoot", "white coral fungus", "wrinkled club", "sessile earthstar", "collared earthstar", "conifer 
mazegill", "upright coral fungus", "scaly chanterelle", "mustard yellow polypore", "chaga mushroom", "cracked cap polypore", "aspen bracket", "pine bracket", "oak bracket", "orange moss agaric", "anemone stinkhorn fungus", "devil's-fingers", "column stinkhorn", "red-cage fungus", "white basket fungus", "lantern stinkhorn", "devil's dipstick", "stinkhorn", "bridal veil stinkhorn", "ravenel's stinkhorn", "oak mazegill", "birch polypore", "northern red belt", "red belted conk", "resinous polypore", "green cheese polypore", "rosy conk", "milk-white toothed polypore", "white-pored chicken of the woods", "conifer chicken of the woods", "western hardwood sulphur shelf", "chicken of the woods", "dyer's polypore", "hen of the woods", "giant polypore", "giant polypore", "blushing rosette", "smoky polypore", "trembling crust", "coral-pink merulius", "wrinkled crust", "northern tooth", "pheasant's back mushroom", "veiled polypore", "thin-walled maze polypore", "hoof fungus", "artist's bracket", "ganoderma brownii", "golden reishi", "west coast reishi", "ganoderma sessile", "hemlock varnish shelf", "hairy hexagonia", "spring polypore", "microporus xanthopus", "hexagonal-pored polypore", "bay polypore", "gilled polypore", "northern cinnabar polypore", "southern cinnabar polypore", "little nest polypore", "lumpy bracket", "hairy bracket", "cinnabar bracket", "turkey-tail", "cauliflower mushroom", "crown-tipped coral", "earpick fungus", "berkeley's polypore", "bear's head tooth", "coral tooth fungus", "lion's mane mushroom", "golden milkcap", "saffron milkcap", "indigo milk cap", "candy cap", "yellow-staining milk cap", "fishy milkcap", "short-stemmed russula", "crowded parchment", "false turkey-tail", "false turkey-tail", "false turkey-tail", "ceramic parchment", "jellied false coral", "red-juice tooth", "club-like tuning fork", "jelly-antler", "orange jelly spot", "fan-shaped jelly-fungus", "alpine jelly cone", "mayapple rust", "cedar-apple rust", "hollyhock rust", "golden ear", 
"leafy brain", "snow fungus", "witch's butter", "silvery bryum", "ontario rhodobryum moss", "leucolepis umbrella moss", "woodsy thyme-moss", "badge moss", "fan moss", "broom fork-moss", "redshank", "pincushion moss", "bonfire moss", "common bladder moss", "grey-cushioned grimmia", "woolly fringe-moss", "hedwig's fringeleaf moss", "anomodon moss", "oregon beaked moss", "glittering wood-moss", "red-stemmed feather moss", "lanky moss", "square gooseneck moss", "big shaggy-moss", "cypress-leaved plait-moss", "cat's tail moss", "dendroalsia moss", "shingle moss", "waved silk-moss", "ostrich-plume moss", "delicate fern moss", "star moss", "wall screw-moss", "giant moss", "common haircap moss", "juniper haircap moss", "bristly haircap moss", "crome sphagnum", "sponge weed", "green pin-cushion alga", "gutweed", "sea lettuce", "greater whipwort", "flat-leaved scalewort", "crescent-cup liverwort", "great scented liverwort", "cat-tongue liverwort", "umbrella liverwort", "california asterella", "black pine seaweed", "common coralline", "scouring-pad alga", "turkish towel", "irish moss", "turkish washcloth", "sea sacks", "coontie", "longleaf ephedra", "green ephedra", "sweet-flag", "european water-plantain", "northern water plantain", "lanceleaf arrowhead", "duck-potato", "arrowhead", "asian taro", "green dragon", "jack-in-the-pulpit", "italian arum", "cuckoo-pint", "bog arum", "taro", "ivy-arum", "common duckweed", "ivy-leaved duckweed", "western skunk cabbage", "swiss cheese plant", "golden club", "green arrow arum", "water lettuce", "greater duckweed", "eastern skunk cabbage", "goosefoot-plant", "calla lily", "flowering-rush", "common pondweed", "frogbit", "water-soldier", "common arrowgrass", "crisp-leaved pondweed", "floating-leaved pondweed", "longleaf pondweed", "perfoliate pondweed", "american scheuchzeria", "sticky false asphodel", "surf grass", "eelgrass", "mediterranean fan palm", "coconut palm", "canary island palm", "date palm", "reclining date palm", "nikau palm", 
"dwarf palmetto", "carolina palmetto", "saw palmetto", "california fan palm", "mexican fan palm", "lily of the nile", "hooker's onion", "meadow garlic", "nodding onion", "drummond's onion", "naples garlic", "field garlic", "chives", "prairie onion", "small white leek", "three-cornered leek", "ramsons", "wild garlic", "belladonna lily", "poison-bulb", "candelabra lily", "southern swamp crinum", "tree crinum", "common snowdrop", "rio grande copper lily", "spotted april-fool", "april-fool", "spring spiderlily", "beach spider lily", "spring starflower", "summer snowflake", "spring snowflake", "red spider lily", "poet's narcissus", "wild daffodil", "bunch-flowered daffodil", "crowpoison", "sea daffodil", "paintbrush lily", "atamasco lily", "brazos rain-lily", "drummond's rain lily", "american century plant", "desert agave", "lechuguilla", "parry's agave", "pulque agave", "shaw's agave", "utah agave", "spiny asparagus", "sprenger's asparagus", "cape smilax", "asparagus", "climbing asparagus", "goldenstar", "harvest brodiaea", "dwarf brodiaea", "leichtlin's camassia", "small camas", "atlantic camas", "wavy-leafed soap plant", "european lily of the valley", "new zealand cabbage tree", "ti", "texas sotol", "sotol", "blue dicks", "fork-toothed ookow", "twining snakelily", "desert lily", "chaparral yucca", "spanish bluebell", "bluebell", "tufted grape hyacinth", "common starlily", "liriope", "may lily", "wild lily-of-the-valley", "false lily of the valley", "false solomon's seal", "star-flowered lily-of-the-valley", "three-leaved false solomon’s seal", "eastern false aloe", "mexican star", "sea muilla", "garden grape-hyacinth", "common grape hyacinth", "grape hyacinth", "parry's nolina", "beargrass", "common star-of-bethlehem", "solomon's seal", "solomon's-seal", "scented solomon's-seal", "hairy solomon's-seal", "autumn squill", "butcher's-broom", "alpine squill", "forbes' glory-of-the-snow", "siberian squill", "large-flowered triteleia", "white brodiaea", "prettyface", 
"ithuriel's spear", "arkansas yucca", "datil yucca", "joshua tree", "soaptree yucca", "common yucca", "tree yucca", "spanish bayonet", "pale yucca", "twisted-leaf yucca", "mojave yucca", "spanish dagger", "candelabra aloe", "cape aloe", "aloe vera", "onion-leafed asphodel", "branched asphodel", "ink berry", "orange day-lily", "red hot poker", "mountain flax", "new zealand flax", "bush flax", "tank lily", "mountain astelia", "yellow star grass", "stars", "purple pleatleaf", "african flag", "woodland crocus", "spring crocus", "field gladiolus", "prairie nymph", "dwarf crested iris", "blackberry lily", "douglas iris", "stinking iris", "dwarf lake iris", "western blue flag", "yellow flag", "pilgrim iris", "beach-head iris", "siberian iris", "oregon iris", "dwarf iris", "northern blue flag", "southern blue flag", "crisp tulp", "barbary-nut", "prairie pleatleaf", "douglas's blue-eyed grass", "rosy sandcrocus", "narrow-leaved blue-eyed grass", "western blue-eyed grass", "strict blue-eyed grass", "annual blue-eyed grass", "harlequin flower", "peacock flower", "mayfly orchid", "pixie cap", "bug orchid", "green-winged orchid", "pink-butterfly orchid", "pyramidal orchid", "putty root", "dragon's mouth", "bamboo orchid", "pink lady fingers", "tuberous grasspink", "fairy-slipper", "phantom orchid", "white helleborine", "narrow-leaved helleborine", "red helleborine", "green bird orchid", "spotted coralroot", "pacific coralroot", "striped coralroot", "yellow coralroot", "pink lady's slipper", "lady's-slipper", "small white lady's-slipper", "large-flowered cypripedium", "mountain lady's-slipper", "lesser yellow lady's slipper", "showy lady's slipper", "common spotted orchid", "early marsh-orchid", "heath spotted orchid", "broad-leaved marsh orchid", "elder-flowered orchid", "frog orchid", "winika", "scarlet lady's tresses", "rosy hyacinth orchid", "south african weed orchid", "wallflower orchid", "leopard orchid", "easter orchid", "new zealand bamboo orchid", "fire-star orchid", 
"dark-red helleborine", "stream orchid", "broad-leaved helleborine", "marsh helleborine", "showy orchis", "waxlip orchid", "western rattlesnake plantain", "downy rattlesnake plantain", "lesser rattlesnake plantain", "checkered rattlesnake plantain", "fragrant orchid", "dark vanilla orchid", "lizard orchid", "giant orchid", "violet bird's-nest", "large twayblade", "fen orchid", "maikaika", "three-toothed orchid", "burnt-tip orchid", "heartleaf twayblade", "bird's-nest orchid", "common twayblade", "monk orchid", "bee orchid", "late spider-orchid", "sombre bee-orchid", "fly orchid", "yellow bee-orchid", "woodcock orchid", "early spider-orchid", "sawfly orchid", "man orchid", "naked-man orchid", "early purple orchid", "military orchid", "lady orchid", "north wind bog orchid", "lesser butterfly-orchid", "white-fringed orchid", "greater butterfly-orchid", "orange-fringed orchid", "small green wood orchid", "scentbottle", "elegant rein orchid", "green bog orchid", "ragged fringed orchid", "round-leaved bog orchid", "lesser purple fringed orchid", "slender bog orchid", "flat spurred piperia", "rose pogonia", "ponerorchis cucullata", "leek orchid", "tutukiwi", "tall greenhood", "dwarf snail orchid", "nodding greenhood", "maroonhood", "fire orchid", "tongue orchid", "philippine ground orchid", "navasota ladies'-tresses", "sphinx ladies’ tresses", "slender ladies' tresses", "great plains ladies' tresses", "irish lady's-tresses", "spring ladies' tresses", "common sun orchid", "crane-fly orchid", "african yellow dayflower", "asiatic dayflower", "creeping spiderwort", "widow's tears", "virginia dayflower", "false dayflower", "wandering jew", "western spiderwort", "bluejacket", "purple heart", "moses-in-the-cradle", "spider lily", "inchplant", "carolina redroot", "spider-flower", "pickerelweed", "common water hyacinth", "air potato", "black bryony", "wild yam", "bog asphodel", "milkmaids", "autumn crocus", "flame lily", "largeflower bellwort", "perfoliate bellwort", "sessile 
bellwort", "white globe lily", "diogenes' lantern", "clay mariposa lily", "catalina mariposa lily", "clubhair mariposa lily", "gunnison's mariposa lily", "plain mariposa lily", "desert mariposa lily", "smokey mariposa", "yellow mariposa lily", "sagebrush mariposa lily", "yellow star-tulip", "sego lily", "plummer's mariposa lily", "mount diablo fairy-lantern", "splendid mariposa lily", "tolmie's pussy ears", "oakland mariposa lily", "butterfly mariposa lily", "weed's mariposa lily", "andrews' clintonia", "bluebead lily", "bride's bonnet", "white fawnlily", "yellow trout lily", "glacier lily", "avalanche lily", "giant white fawn lily", "siberian fawn lily", "dimpled trout lily", "checker lily", "spotted fritillary", "chocolate lily", "kamchatka fritillary", "yellow fritillary", "scarlet fritillary", "yellow star-of-bethlehem", "least gagea", "fire lily", "canada lily", "columbia lily", "formosa lily", "humboldt's lily", "tiger lily", "martagon lily", "michigan lily", "leopard lily", "sierra tiger lily", "wood lily", "turk's-cap lily", "washington lily", "indian cucumber root", "drops-of-gold", "largeflower fairybells", "rough-fruited fairybells", "california fetid adderstongue", "white twisted-stalk", "rose twisted-stalk", "wild tulip", "mountain deathcamas", "herb paris", "fremont's deathcamas", "death camas", "giant white wakerobin", "bashful wakerobin", "nodding trillium", "giant wakerobin", "little sweet betsy", "wake-robin", "nodding wakerobin", "large white trillium", "yellow wakerobin", "snow trillium", "pacific trillium", "prairie trillium", "toad trillium", "painted trillium", "california false hellebore", "white hellebore", "green false hellebore", "common beargrass", "supplejack", "rough bindweed", "earleaf greenbrier", "saw greenbrier", "sawbrier", "smooth carrionflower", "midwestern carrionflower", "laurel-leaf greenbrier", "sarsaparilla vine", "common greenbrier", "lanceleaf greenbrier", "bristly greenbrier", "kiekie", "thatch screwpine", "piñuela", 
"cardinal airplant", "small ballmoss", "spanish moss", "giant airplant", "sea clubrush", "white bear sedge", "golden sedge", "eastern woodland sedge", "hoary sedge", "cherokee sedge", "bristly sedge", "fringed sedge", "fingered sedge", "ebony sedge", "star sedge", "yellow-green sedge", "graceful sedge", "limestone meadow sedge", "gray's sedge", "hairy sedge", "porcupine sedge", "bladder sedge", "bristle-stalked sedge", "hop sedge", "sallow sedge", "big-head sedge", "tall bog-sedge", "slough sedge", "few-seeded sedge", "pale sedge", "long-stalked sedge", "hanging sedge", "pennsylvania sedge", "carex pilosa", "plantainleaf sedge", "cyperus sedge", "sand sedge", "rosy sedge", "beaked sedge", "spiked sedge", "longbeak sedge", "squarrose sedge", "spongy sedge", "wood sedge", "beaked sedge", "inflated sedge", "little green sedge", "fox sedge", "great fen-sedge", "shortleaf spikesedge", "globe flatsedge", "tall flatsedge", "chufa", "umbrella papyrus", "fragrant flatsedge", "purple nutsedge", "false nutsedge", "giant umbrella sedge", "three-way sedge", "common spike-rush", "common cotton-grass", "tussock cottongrass", "tawny cotton-grass", "knobby clubrush", "pingao", "white beak-sedge", "whitetop sedge", "tule", "california bulrush", "common three-square", "soft-stemmed bulrush", "dark green bulrush", "woolgrass", "panicled bulrush", "rufous bulrush", "wood club-rush", "tufted bulrush", "common pipewort", "spiny rush", "jointed rush", "baltic rush", "toad rush", "flattened rush", "soft rush", "slender path rush", "torrey's rush", "field woodrush", "heath wood-rush", "hairy woodrush", "indian ricegrass", "smilo grass", "crested wheatgrass", "short-awn foxtail", "meadow foxtail", "european beachgrass", "american beachgrass", "big bluestem", "bushy bluestem", "splitbeard bluestem", "broomsedge bluestem", "sweet vernal grass", "purple three-awn", "tall oat grass", "giant cane", "giant reed", "richards toetoe", "slender wild oat", "wild oat", "yellow bluestem", "silver 
bluestem", "side-oats grama", "buffalo grass", "blue grama", "hairy grama", "texas grama", "big quaking grass", "quaking-grass", "little quaking-grass", "rescuegrass", "ripgut brome", "soft brome", "smooth brome", "foxtail brome", "cheat grass", "bluejoint", "bushgrass", "buffel grass", "fountain grass", "river oats", "finger grass", "purple pampas grass", "pampas grass", "bermuda grass", "bristly dogtail grass", "orchard grass", "durban crowfoot", "poverty oatgrass", "fluffgrass", "tufted hair grass", "deertongue", "hairy crabgrass", "seashore saltgrass", "barnyardgrass", "panic veldtgrass", "goose grass", "canada wild rye", "squirreltail", "bottlebrush grass", "quack grass", "virginia wildrye", "rattlesnake mannagrass", "reed meadowgrass", "ridged manna grass", "big galleta", "yorkshire fog", "foxtail barley", "wall barley", "little barley", "bladey grass", "prairie junegrass", "hare's tail grass", "goldentop grass", "rice cutgrass", "giant wild rye", "american dune grass", "kentucky fescue", "perennial ryegrass", "spreading fescue", "mountain melick", "natal grass", "japanese stiltgrass", "wood millet", "chinese silver grass", "deergrass", "texas winter grass", "purple needlegrass", "basket grass", "white-grained mountain-ricegrass", "witch grass", "guineagrass", "switchgrass", "dallis grass", "bahia grass", "vasey grass", "kikuyu grass", "bulbous canarygrass", "reed canary grass", "alpine timothy", "timothy", "common reed", "annual meadow-grass", "bulbous bluegrass", "kentucky bluegrass", "annual beard grass", "little bluestem", "chinese foxtail", "palmgrass", "marsh bristlegrass", "yellow foxtail", "green bristle grass", "indiangrass", "johnson grass", "beach spinifex", "saltmarsh cord grass", "cordgrass", "saint augustine grass", "european feather-grass", "kangaroo grass", "white tridens", "purpletop tridens", "eastern gamagrass", "bread wheat", "sea oats", "jointed rush", "unbranched bur-reed", "branched bur-reed", "big bur-reed", "lesser reedmace", 
"southern cattail", "common cattail", "raupo", "indian-shot", "crêpe ginger", "expanded lobsterclaw", "parrot's beak", "hanging lobster claw heliconia", "alligator flag", "cavendish banana", "red ginger", "shell ginger", "torch ginger", "white ginger", "kahili ginger", "prickly tree-clubmoss", "hickey's tree-clubmoss", "princess pine", "ground-cedar", "fan clubmoss", "shining clubmoss", "northern firmoss", "nodding clubmoss", "inundated bog clubmoss", "running pine", "bushy clubmoss", "alpine clubmoss", "creeping clubmoss", "climbing clubmoss", "hanging clubmoss", "stiff clubmoss", "bigelow's spike moss", "krauss's clubmoss", "resurrection plant", "rock spikemoss", "goutweed", "garden angelica", "purplestem angelica", "ranger buttons", "wild angelica", "bur chervil", "cow parsley", "shore celery", "great masterwort", "prairie bishop", "hoary bowlesia", "caraway", "gotu cola", "broad-leaved chervil", "tainturier's chervil", "bulblet-bearing water hemlock", "poison parsnip", "cowbane", "poison hemlock", "rock samphire", "honewort", "slender celery", "queen anne's lace", "rattlesnake weed", "harbinger of spring", "field eryngo", "hooker's eryngo", "leavenworth's eryngo", "sea holly", "blue eryngo", "rattlesnake master", "longleaf", "giant fennel", "sweet fennel", "cartwheel flower", "common cowparsnip", "sosnowsky's hogweed", "hogweed", "scots lovage", "fernleaf biscuitroot", "ternate desert-parsley", "foothill desert-parsley", "fine-leaved water-dropwort", "water parsley", "mountain sweet cicely", "hairy sweet cicely", "aniseroot", "cowbane", "wild parsnip", "burnet-saxifrage", "texas prairie parsley", "alpine false springparsley", "footsteps of spring", "purple sanicle", "black snakeroot", "snakeroot", "sanicle", "clustered black snakeroot", "shepherd's-needle", "moon carrot", "water parsnip", "alexanders", "apiacean yellow pimpernel", "common hedge parsley", "upright hedge-parsley", "knotted hedgeparsley", "heart-leaf golden alexanders", "golden alexander", 
"california spikenard", "japanese angelica tree", "bristly sarsaparilla", "wild sarsaparilla", "american spikenard", "devil's walkingstick", "cabbage tree", "japanese aralia", "english ivy", "dollarweed", "hairy pennywort", "water pennywort", "manyflower marshpennywort", "devil's club", "american ginseng", "dwarf ginseng", "five finger", "mountain five finger", "lancewood", "houpara", "haumakaroa", "octopus tree", "miniature umbrella tree", "patē", "ivy tree", "new zealand broadleaf", "kaikomako", "australian blackthorn", "karo", "lemonwood", "kohuhu", "victorian box", "european holly", "dahoon", "chinese holly", "possumhaw", "inkberry", "mountain holly", "american holly", "winterberry holly", "yaupon holly", "alseuosmia macrophylla", "korokio", "common yarrow", "sneezewort", "blow wives", "oppositeleaf spotflower", "sacapellote", "american trailplant", "san felipe dogweed", "crofton weed", "white snakeroot", "shrubby boneset", "billygoat weed", "bluemink", "orange agoseris", "big bur-sage", "common ragweed", "silver burr ragweed", "triangle-leaf bursage", "white bur-sage", "western ragweed", "common burrobrush", "giant ragweed", "new zealand everlastingflower", "pearly everlasting", "woodland madia", "catsfoot", "field pussytoes", "pink everlasting", "mayweed", "greater burdock", "lesser burdock", "woolly burdock", "capeweed", "prostrate capeweed", "heartleaf arnica", "wolf's bane", "pale indian plantain", "prairie indian plantain", "wormwood", "sweet annie", "california sagebrush", "field wormwood", "california mugwort", "tarragon", "sand sagebrush", "fringed sagebrush", "white sagebrush", "beach wormwood", "hoary mugwort", "common sagebrush", "common mugwort", "alpine aster", "gravel ghost", "groundsel tree", "poverty weed", "coyote brush", "mule fat", "desert broom", "san diego county viguiera", "parish's goldeneye", "desert marigold", "arrowleaf balsamroot", "willow ragwort", "sweetbush", "lawn daisy", "lyre-leaf greeneyes", "tickseed beggar-ticks", "spanish 
needles", "nodding bur-marigold", "devil's beggarticks", "black-jack", "trifid bur-marigold", "sea ox-eye", "rangiora", "california brickellbush", "false boneset", "field marigold", "pot marigold", "white tack-stem", "straggler daisy", "broad-winged thistle", "welted thistle", "musk thistle", "italian thistle", "silver thistle", "carlina biebersteinii", "carline thistle", "celmisia spectabilis", "purple star-thistle", "cornflower", "brown knapweed", "napa star thistle", "perennial cornflower", "black knapweed", "wig knapweed", "greater knapweed", "yellow starthistle", "spotted knapweed", "white pincushion", "pebble pincushion", "douglas' dustymaiden", "fremont's pincushion", "yellow pincushion", "desert pincushion", "rose-heath", "skeleton weed", "siam weed", "green-and-gold", "yellow rabbitbrush", "chuquiragua", "bunk", "dwarf thistle", "tall thistle", "anderson's thistle", "arizona thistle", "field thistle", "field thistle", "woolly thistle", "melancholy thistle", "yellow thistle", "swamp thistle", "desert thistle", "cobwebby thistle", "cabbage thistle", "marsh thistle", "dune thistle", "meadow thistle", "texas thistle", "wavyleaf thistle", "bull thistle", "blue mistflower", "golden wave tickseed", "lance-leaved coreopsis", "greater tickseed", "prairie coreopsis", "plains coreopsis", "tall coreopsis", "california aster", "garden cosmos", "yellow cosmos", "golden marguerite", "australian waterbuttons", "brass buttons", "redflower ragleaf", "smooth hawksbeard", "narrow-leaved hawksbeard", "little ironweed", "marshelder", "artichoke thistle", "red dahlia", "clustered tarweed", "cape-ivy", "flathead rabbit tobacco", "many stem evax", "hoary tansyaster", "trailing african daisy", "stinkwort", "false yellowhead", "parasol whitetop", "narrow-leaved purple coneflower", "pale purple coneflower", "purple coneflower", "southern globe thistle", "glandular globe-thistle", "false daisy", "leafy elephant's-foot", "common elephant's-foot", "renosterbush", "red tasselflower", 
"lilac tasselflower", "acton brittlebush", "california brittlebush", "brittlebush", "engelmann daisy", "fireweed", "wedgeleaf goldenbush", "california goldenbush", "narrowleaf goldenbush", "rubber rabbitbrush", "bitter fleabane", "annual fleabane", "flax-leaved horseweed", "canadian fleabane", "cut-leaf fleabane", "leafy fleabane", "seaside daisy", "santa barbara daisy", "philadelphia fleabane", "robin's-plantain", "daisy fleabane", "tropical horseweed", "cape snow bush", "golden yarrow", "common woolly sunflower", "lizard tail", "wallace's woollydaisy", "tall boneset", "hemp agrimony", "dogfennel", "hyssopleaf thoroughwort", "common boneset", "round-leaved boneset", "late boneset", "white wood aster", "large-leaved aster", "siberian aster", "flat-topped goldenrod", "hollow joe-pye weed", "spotted joe-pye-weed", "sweetscented joe pye weed", "lanceleaf blanketflower", "great blanketflower", "red dome blanketflower", "indian blanket", "boar thistle", "gallant soldier", "shaggy soldier", "pennsylvania everlasting", "treasureflower", "beach gazanias", "desert-sunflower", "garland daisy", "marsh cudweed", "spanish gold", "curlycup gumweed", "oregon gumplant", "broom snakeweed", "saw-toothed goldenbush", "cretanweed", "bitterweed", "common sneezeweed", "bigelow's sneezeweed", "southern sneezeweed", "rosilla", "narrowleaf sunflower", "sunflower", "cucumberleaf sunflower", "woodland sunflower", "sawtooth sunflower", "maximilian sunflower", "ashy sunflower", "mcdowell's sunflower", "prairie sunflower", "jerusalem artichoke", "everlasting flower", "smooth oxeye", "bristly ox-tongue", "hayfield tarweed", "telegraphweed", "camphorweed", "hairy goldenaster", "white hawkweed", "canada hawkweed", "rattlesnakeweed", "old plainsman", "wooly white", "smooth cat's ear", "false dandelion", "meadow fleabane", "elecampane", "irish fleabane", "alkali goldenbush", "coastal goldenbush", "sumpweed", "marsh elder", "yellow bristle-hair ixeris", "silver ragwort", "tansy ragwort", "marsh 
jaumea", "two-flower dwarf-dandelion", "virginia dwarfdandelion", "tall blue lettuce", "canada wild lettuce", "woodland lettuce", "prickly lettuce", "blue lettuce", "nipplewort", "california goldfields", "common goldfields", "whitedaisy tidytips", "tidy tips", "edelweiss", "california broomsage", "giant coreopsis", "sea dahlia", "oxeye daisy", "rough blazing star", "slender blazingstar", "elegant gayfeather", "dotted gayfeather", "prairie blazing star", "dense blazing star", "texas yellow star", "texas skeleton plant", "tahoka daisy", "common madia", "grassy tarweed", "coast tarweed", "desert dandelion", "cliff aster", "barbara's-buttons", "german chamomile", "disc mayweed", "blackfoot daisy", "snow squarestem", "fynbos blombush", "white bristle bush", "q-tips", "mile-a-minute", "climbing hempvine", "desert star", "wall lettuce", "white rattlesnakeroot", "tall rattlesnakeroot", "three-leaved rattlesnakeroot", "whorled wood aster", "common tree daisy", "mangrove-leaved daisy-bush", "akiraho", "woodland cudweed", "stinknet", "cotton thistle", "bietou", "tauhinu", "small's ragwort", "golden ragwort", "butterweed", "roundleaf ragwort", "balsam ragwort", "great plains ragwort", "spanish needle", "small palafox", "spiny starwort", "parasenecio hastatus", "santa maria feverfew", "mariola", "wild quinine", "cinchweed", "emory's rock daisy", "white butterbur", "arctic sweet coltsfoot", "butterbur", "woolly butterbur", "pygmy cedar", "cape everlasting", "hawkweed oxtongue", "fox-and-cubs", "meadow hawkweed", "mouse-ear hawkweed", "white rock-lettuce", "sickleleaf silk-grass", "narrowleaf silkgrass", "american basketflower", "arrow-leaf", "rosy camphorweed", "camphor-weed", "sourbush", "marsh fleabane", "arrowweed", "leafcup", "odora", "purple lettuce", "two-color rabbit tobacco", "california cudweed", "jersey cudweed", "rabbit tobacco", "blackroot", "common fleabane", "false dandelion", "california chicory", "desert chicory", "upright prairie coneflower", "yellow 
coneflower", "clasping coneflower", "black-eyed-susan", "green-headed coneflower", "brown-eyed susan", "mexican creeping zinnia", "common golden thistle", "autumn hawkbit", "texas ragwort", "red-purple ragwort", "threadleaf groundsel", "woad-leaved ragwort", "cutleaf burnweed", "narrow-leaved ragwort", "tall western groundsel", "coastal burnweed", "wood ragwort", "seabeach groundsel", "cotton burnweed", "gravel groundsel", "arrowleaf ragwort", "eastern groundsel", "sticky groundsel", "common groundsel", "bankrupt bush", "white rosinweed", "prairie rosinweed", "compass plant", "cup plant", "prairie dock", "milk thistle", "bear's foot", "late goldenrod", "silverrod", "bluestem goldenrod", "canada goldenrod", "broad-leaved goldenrod", "tall goldenrod", "hairy goldenrod", "early goldenrod", "northern goldenrod", "field goldenrod", "white flat-topped goldenrod", "stiff-leaved goldenrod", "common wrinkle-leaved goldenrod", "northern seaside goldenrod", "showy goldenrod", "bog goldenrod", "velvety goldenrod", "european goldenrod", "common soliva", "perennial sow thistle", "prickly sowthistle", "common sow-thistle", "trailing daisy", "brownplume wirelettuce", "pacific aster", "lindley's aster", "heart-leaved aster", "yard aster", "drummond's aster", "white heath aster", "smooth blue aster", "panicled aster", "white woodland aster", "new england aster", "new york aster", "hairy white oldfield aster", "swamp aster", "western silver aster", "annual saltmarsh aster", "cape snow", "mexican marigold", "red-crescent marigold", "dune tansy", "feverfew", "tansy", "red-seeded dandelion", "common dandelion", "coastal camphor bush", "fineleaf fournerved daisy", "plains yellow four-nerve daisy", "greenthread", "rayless greenthread", "parralena", "mexican sunflower", "tree marigold", "yellow salsify", "oyster plant", "meadow salsify", "yellowhead", "tridax daisy", "scentless mayweed", "sea aster", "american threefold", "colt's-foot", "silverpuffs", "smooth golden fleece", "false 
hawkbit", "canyon sunflower", "wingstem", "cowpen daisy", "white crownbeard", "western ironweed", "smooth ironweed", "tall ironweed", "missouri ironweed", "new york ironweed", "golden-eye", "texas creeping-oxeye", "narrowleaf mule-ears", "smooth mule-ears", "woolly mule's ears", "cut-leaf ironplant", "xanthium orientale", "spiny cocklebur", "rough cocklebur", "orcutt's woody-aster", "mojave aster", "oriental false hawksbeard", "desert zinnia", "elegant zinnia", "rocky mountain zinnia", "peruvian zinnia", "marsh bellflower", "bearded bellflower", "clustered bellflower", "mountain harebell", "giant bellflower", "spreading bellflower", "peach-leaved bellflower", "california harebell", "creeping bellflower", "rampion", "harebell", "campanula sibirica", "nettle-leaved bellflower", "american bellflower", "star of bethlehem", "sheep's-bit", "punakuru", "panakeake", "cardinal flower", "water lobelia", "indian tobacco", "bog lobelia", "sierra madre lobelia", "pine-leaved lobelia", "downy lobelia", "great blue lobelia", "pale-spiked lobelia", "round-headed rampion", "spiked rampion", "venus' looking-glass", "clasping venus's looking glass", "new zealand harebell", "southern rockbell", "hop goodenia", "coastal inkberry", "sea lettuce", "selliera", "bog-bean", "deer-cabbage", "water snowflake", "marble leaf", "pacific hound's tongue", "purple gromwell", "common fiddleneck", "bristly fiddleneck", "small bugloss", "common alkanet", "wild comfrey", "borage", "corn gromwell", "texas wild olive", "scarlet cordia", "hound's-tongue", "pride of madeira", "salvation jane", "viper's-bugloss", "anacua", "whispering bells", "california yerba santa", "thick-leaved yerba santa", "hairy yerba santa", "spotted hideseed", "fourspike heliotrope", "pasture heliotrope", "stickseed", "scorpion's-tail", "tree heliotrope", "seaside heliotrope", "european heliotrope", "indian heliotrope", "great waterleaf", "broad-leaf waterleaf", "ballhead waterleaf", "pacific waterleaf", "virginia waterleaf", 
"narrow-leaved cryptantha", "hoary puccoon", "hairy puccoon", "fringed puccoon", "common gromwell", "western stoneseed", "mountain bluebells", "oysterplant", "tall bluebell", "virginia bluebells", "field forget-me-not", "changing forget-me-not", "broadleaf forget-me-not", "water forget-me-not", "wood forget-me-not", "purple mat", "purple mat", "white nemophila", "fivespot", "menzies' baby blue eyes", "small-flowered nemophila", "texas baby blue eyes", "brown monkwort", "nonea rossica", "green alkanet", "purple phacelia", "california phacelia", "desert blue bells", "caterpillar phacelia", "blue curls", "notch-leaf scorpion-weed", "distant phacelia", "frémont's phacelia", "large-flowered phacelia", "silverleaf phacelia", "california bluebell", "parry's phacelia", "miami mist", "branching phacelia", "silky phacelia", "lacy phacelia", "fiesta flower", "white fiesta flower", "pulmonaria mollis", "suffolk lungwort", "lungwort", "common comfrey", "palmer's crinklemat", "fan-leaved tiquilia", "sea rosemary", "poodle-dog bush", "fiberglass plant", "saltwort", "jack-by-the-hedge", "lyre-leaved rock cress", "mouse-ear cress", "horseradish", "american yellowrocket", "garden yellowrocket", "hoary alyssum", "smooth rockcress", "black mustard", "field mustard", "saharan mustard", "warty-cabbage", "american searocket", "european searocket", "shepherd's-purse", "large bittercress", "coralroot", "spring cress", "milkmaids", "cutleaf toothwort", "broadleaf toothwort", "purple cress", "wavy bittercress", "hairy bittercress", "narrow-leaved bittercress", "nuttall's toothwort", "cuckooflower", "california mustard", "crossflower", "sea kale", "western tansy mustard", "flixweed", "perennial wall-rocket", "california shieldpod", "wedge leaf whitlow-grass", "common whitlowgrass", "rocketsalad", "western wallflower", "wormseed wallflower", "wallflower", "dame's-rocket", "shortpod mustard", "field peppergrass", "common peppergrass", "lesser swine-cress", "heart-podded hoary cress", "hairypod 
pepperweed", "broadleaved pepperweed", "shining pepperweed", "virginia pepperweed", "sweet alyssum", "moneyplant", "perennial honesty", "watercress", "gordon's bladderpod", "wild radish", "wild radish", "annual bastard cabbage", "bog yellowcress", "wild mustard", "tumble mustard", "london rocket", "false london-rocket", "hedge mustard", "prince's plume", "bristly jewelflower", "mountain jewelflower", "field penny-cress", "sand fringepod", "tower mustard", "caper bush", "capparis zoharyi", "papaya", "purple cleome", "asian spiderflower", "bladderpod", "rocky mountain bee plant", "redwhisker clammyweed", "crown of thorns", "false mermaid", "douglas' meadowfoam", "yellow mignonette", "dyer's rocket", "nasturtium", "cone stompie", "japanese-spurge", "mountain horopito", "chilean sea fig", "hottentot fig", "pig's-root", "new zealand ice plant", "heart-leaf ice plant", "crystalline ice plant", "slender iceplant", "sea purslane", "kokihi", "warrigal greens", "devil's horsewhip", "iodinebush", "alligatorweed", "palmer's amaranth", "redroot pigweed", "four-wing saltbush", "desert holly", "big saltbush", "cattle saltbush", "fat-hen", "shining orache", "australian saltbush", "summer-cypress", "beet", "quail grass", "nettle-leaved goosefoot", "lambsquarters", "mexican tea", "clammy goosefoot", "plains snakecotton", "prostrate globe-amaranth", "gomphrena weed", "hopsage", "winterfat", "oak-leaved goosefoot", "pacific swampfire", "beaded samphire", "saltwort", "tumbleweed", "mignonette vine", "triangle cactus", "golden-spined cereus", "saguaro", "nipple beehive cactus", "grooved nipple cactus", "buckhorn cholla", "teddy bear cholla", "california cholla", "silver cholla", "jumping cholla", "gander's cholla", "chain cholla", "pencil cactus", "coast cholla", "branched pencil cholla", "cylindropuntia thurberi", "whipple cholla", "devilshead", "echinocactus platyacanthus", "echinocereus bakeri", "nylon hedgehog cactus", "scarlet hedgehog cactus", "spiny hedgehog cactus", "engelmann's 
hedgehog cactus", "strawberry cactus", "fendler's hedgehog cactus", "spiny hedgehog cactus", "ladyfinger cactus", "lace hedgehog cactus", "rainbow hedgehog cactus", "scarlet hedgehog cactus", "strawberry hedgehog cactus", "claret-cup hedgehog", "missouri foxtail cactus", "whitecolumn foxtail cactus", "viviparous foxtail cactus", "california barrel cactus", "emory's barrel cactus", "ferocactus histrix", "ferocactus latispinus", "san diego barrel cactus", "fishhook barrel cactus", "turk's head", "harem cactus", "horse crippler cactus", "senita cactus", "peyote", "strawberry cactus", "graham's fishhook cactus", "little nipple cactus", "mexican pincushion cactus", "common fishhook cactus", "blue myrtle-cactus", "chihuahuan beehive", "beavertail cactus", "eastern prickly-pear cactus", "pancake prickly-pear", "cochineal nopal cactus", "opuntia decumbens", "texas prickly-pear", "mission cactus", "brittle prickly-pear", "eastern prickly pear", "texas prickly pear", "coastal prickly pear", "black-spine prickly pear", "plains prickly pear", "bunny ears", "brown-spined prickly-pear", "starvation prickly-pear", "nopal tapón", "erect prickly pear", "woollyjoint pricklypear", "indian comb", "mexican giant cactus", "simpson's hedgehog cactus", "nightblooming cereus", "night-blooming cereus", "galloping cactus", "organ pipe cactus", "onyx flower", "thyme-leaved sandwort", "sandcarpet", "field chickweed", "common mouse-ear chickweed", "sticky mouse-ear chickweed", "deptford pink", "sweet-william", "carthusian pink", "chinese-pink", "maiden pink", "fringed pink", "whitesnow", "smooth rupturewort", "sea sandwort", "grove sandwort", "largeleaf sandwort", "three-nerved sandwort", "greenland stitchwort", "hairypink", "fourleaf manyseed", "greater stitchwort", "rock sandwort", "birdseye pearlwort", "rock soapwort", "common soapwort", "german knotgrass", "moss campion", "sleepy catchfly", "dusty-miller", "red campion", "ragged-robin", "common catchfly", "indian-pink", "white campion", 
"nottingham catchfly", "royal catchfly", "starry campion", "sea campion", "fire pink", "bladder campion", "corn spurrey", "red sand spurrey", "water chickweed", "lesser stitchwort", "common chickweed", "wood stitchwort", "star chickweed", "sticky catchfly", "elephant bush", "drosera aberrans", "great sundew", "drosera arcturi", "tall sundew", "dwarf sundew", "pink sundew", "poppy-flowered sundew", "spoonleaf sundew", "climbing sundew", "round-leaved sundew", "rosy sundew", "whittaker's sundew", "alkali seaheath", "green carpetweed", "curnow's curse", "common pussypaws", "one-seeded pussypaws", "mount hood pussypaws", "carolina spring-beauty", "western spring beauty", "streambank springbeauty", "miner's lettuce", "redstem springbeauty", "candy flower", "virginia spring beauty", "oregon bitterroot", "small-leaved blinks", "yellow sand verbena", "red sand-verbena", "pink sand verbena", "desert sand verbena", "trailing windmills", "scarlet spiderling", "marvel of peru", "wishbone bush", "colorado four o'clock", "wild four o'clock", "rougeplant", "pokeweed", "red inkplant", "inkweed", "sea thrift", "carolina sea lavender", "perez's sea lavender", "blue statice", "cape leadwort", "wild leadwort", "coral bells", "american bistort", "common bistort", "alpine bistort", "sagebrush chorizanthe", "fringed spineflower", "devil's spineflower", "sea grape", "coastal wild buckwheat", "skeleton weed", "california buckwheat", "desert trumpet", "seaside buckwheat", "longleaf buckwheat", "naked buckwheat", "cushion buckwheat", "seacliff wild buckwheat", "sulphur flower", "bastardsage", "fringed bindweed", "black-bindweed", "copse-bindweed", "climbing false buckwheat", "pohuehue", "muehlenbeckia axillaris", "maidenhair vine", "alpine sorrel", "water smartweed", "halberd-leaved tearthumb", "pink knotweed", "china knotweed", "waterpepper", "mild water-pepper", "smartweed", "low smartweed", "redshank", "pinkweed", "mile-a-minute weed", "dotted knotweed", "arrowleaf tearthumb", "american 
jumpseed", "prostrate knotweed", "beach knotweed", "woodland pterostegia", "asiatic knotweed", "giant knotweed", "common sorrel", "sheep's sorrel", "russian dock", "curled dock", "canaigre dock", "golden dock", "bitter dock", "purslane", "shaggy portulaca", "greasewood", "jojoba", "jewels of opar", "saltcedar", "crucifixion thorn", "oriental staff vine", "american bittersweet", "winged euonymus", "strawberry bush", "wahoo", "spindle", "wintercreeper", "japanese spindle tree", "running strawberry-bush", "warty-barked spindle", "fringed grass of parnassus", "fen grass of parnassus", "marsh grass-of-parnassus", "oregon boxwood", "candlewood", "coontail", "white dogwood", "pagoda dogwood", "silky dogwood", "canadian bunchberry", "roughleaf dogwood", "flowering dogwood", "european cornel", "mountain dogwood", "gray dogwood", "round-leaved dogwood", "bloodtwig dogwood", "red osier dogwood", "swedish cornel", "western bunchberry", "wild hydrangea", "mophead hydrangea", "oakleaf hydrangea", "piper's mock orange", "stinging serpent", "desert rock nettle", "streambank stickleaf", "sand blazingstar", "giant blazingstar", "sandpaper plant", "black tupelo", "american bladdernut", "tree tutu", "karaka", "desert star-vine", "white bryony", "buffalo gourd", "coyote melon", "squirting cucumber", "wild cucumber", "snake apple", "california manroot", "wild cucumber", "coastal manroot", "creeping cucumber", "bitter melon", "bur-cucumber", "red valerian", "northern bush honeysuckle", "common teasel", "cutleaf teasel", "indian teasel", "field scabious", "himalayan honeysuckle", "twinflower", "western white honeysuckle", "american fly-honeysuckle", "orange honeysuckle", "glaucous honeysuckle", "pink honeysuckle", "twinberry honeysuckle", "japanese honeysuckle", "amur honeysuckle", "morrow's honeysuckle", "common honeysuckle", "coral honeysuckle", "southern honeysuckle", "tatarian honeysuckle", "fly honeysuckle", "shortspur seablush", "small scabious", "cream scabiosa", "sweet scabious", 
"devil's-bit scabious", "common snowberry", "creeping snowberry", "western snowberry", "coral berry", "roundleaf snowberry", "orange-fruited horse gentian", "common valerian", "sitka valerian", "common cornsalad", "beaked cornsalad", "moschatel", "american black elderberry", "blue elder", "danewort", "chinese elder", "european black elderberry", "red elderberry", "mapleleaf viburnum", "southern arrowwood", "squashberry", "wayfaring-tree", "hobblebush", "nannyberry", "wild raisin", "cramp bark", "blackhaw", "downy arrowwood", "leatherleaf viburnum", "rusty blackhaw", "laurustinus viburnum", "kashmir balsam", "spotted touch-me-not", "himalayan balsam", "touch-me-not balsam", "pale touch-me-not", "small balsam", "impatiens", "sweet pepperbush", "swamp titi", "diapensia", "galax", "texas persimmon", "american persimmon", "sugarstick", "wild rosemary", "pacific madrone", "strawberry tree", "texas madrone", "big berry manzanita", "pinemat manzanita", "greenleaf manzanita", "pointleaf manzanita", "pinemat manzanita", "mountain bearberry", "common heather", "western moss-heather", "arctic bell-heather", "sand heath", "leatherleaf", "spotted wintergreen", "little prince's pine", "prince's pine", "inaka", "copperbush", "black crowberry", "common heath", "trailing arbutus", "honeysuckle heath", "tree heath", "fire heath", "bell heather", "red heath", "purpletip heath", "discoloring erica", "salt-and-pepper heath", "portuguese heath", "hangertjies", "cross-leaved heath", "roughpetal heath complex", "gaultheria antipoda", "mountain snowberry", "creeping snowberry", "eastern teaberry", "salal", "black huckleberry", "sheep laurel", "mountain laurel", "alpine bog laurel", "swamp laurel", "alpine azalea", "prickly mingimingi", "pukiawe", "mānuka-rauriki", "patotara", "mountain doghobble", "fetterbush lyonia", "one-flowered wintergreen", "yellow bird's-nest", "indian pipe", "one-sided wintergreen", "sourwood", "carpet heath", "purple mountainheath", "red mountain-heather", "woodland 
pinedrops", "rounded shinleaf", "bog wintergreen", "green-flowered wintergreen", "shinleaf", "lesser wintergreen", "white-veined wintergreen", "round-leaved wintergreen", "white-flowered rhododendron", "flame azalea", "rhodora", "southern pinxter azalea", "western labrador tea", "rusty-leaved alpenrose", "labrador tea", "lapland azalea", "pacific rhododendron", "great rhododendron", "mock azalea", "western azalea", "pinkster flower", "marsh labrador tea", "snowplant", "lowbush blueberry", "sparkleberry", "northern highbush blueberry", "american cranberry", "black huckleberry", "velvetleaf blueberry", "common bilberry", "oval-leaf blueberry", "california huckleberry", "small cranberry", "red huckleberry", "deerberry", "bog bilberry", "lingonberry", "mission manzanita", "ocotillo", "broad-leaved gilia", "grand collomia", "variableleaf collomia", "narrow-leaf mountain trumpet", "giant woollystar", "desert woollystar", "bluehead gilia", "star gilia", "bird's-eye gilia", "scarlet gilia", "flaxflowered ipomopsis", "standing cypress", "lilac sunbonnet", "true babystars", "whisker brush", "nuttall's linanthus", "variable linanthus", "prickly phlox", "fringed linanthus", "granite gilia", "mexican false calico", "desert calico", "slender phlox", "skunkweed", "spreading phlox", "blue phlox", "drummond's phlox", "spiny phlox", "longleaf phlox", "fall phlox", "prairie phlox", "goldeneye phlox", "moss phlox", "tall jacob's-ladder", "blue jacob's ladder", "california jacob's ladder", "pretty jacob's-ladder", "jacob's ladder", "sky pilot", "pygmy-flower rock-jasmine", "coralberry", "cudjoewood", "ivy-leaved cyclamen", "european cyclamen", "blue scarlet pimpernel", "starflower", "fringed loosestrife", "gooseneck loosestrife", "chickweed-wintergreen", "star flower", "sea milkwort", "yellow pimpernel", "creeping-jenny", "dotted loosestrife", "narrow-leaved loosestrife", "whorled loosestrife", "swamp candles", "tufted loosestrife", "yellow loosestrife", "red matipo", "weeping mapou", 
"toro", "padre's shootingstar", "oxlip", "bird's-eye primrose", "henderson's shooting star", "eastern shooting star", "parry's primrose", "dark-throated shooting star", "cowslip", "primrose", "seaside brookweed", "maakoako", "alpine snowbell", "white milkwood", "gum bumelia", "california pitcher-plant", "pale pitcherplant", "yellow pitcher plant", "white pitcher plant", "hooded pitcher plant", "purple pitcher plant", "sweetleaf", "rosary pea", "taiwan acacia", "western coastal wattle", "silver wattle", "longleaf wattle", "black wattle", "tasmanian blackwood", "myrtle wattle", "kangaroo thorn", "golden wattle", "golden wreath wattle", "spanish clover", "silver bird's-foot trefoil", "common deerweed", "shrubby deervetch", "desert lotus", "persian silk tree", "leadplant", "desert false indigo", "american hog-peanut", "common kidneyvetch", "american groundnut", "alpine milkvetch", "canadian milkvetch", "chickpea milkvetch", "ground-plum", "purple milk-vetch", "wild liquorice", "spotted locoweed", "woolly locoweed", "small-flowered milkvetch", "woollypod milkvetch", "santa barbara milkvetch", "white wild indigo", "blue wild indigo", "longbract wild indigo", "horsefly weed", "arabian pea", "red bird-of-paradise bush", "pink fairy-duster", "calliandra houstoniana", "beach bean", "siberian peashrub", "new zealand common broom", "golden shower tree", "butterfly pea", "carob tree", "eastern redbud", "western redbud", "partridge pea", "sensitive pea", "chamaecytisus ruthenicus", "pigeonwings", "blue pea", "showy rattlebox", "tree lucerne", "scotch broom", "golden prairie clover", "white prairie clover", "feather dalea", "purple prairie clover", "flamboyant", "texas mountain laurel", "illinois bundleflower", "showy tick-trefoil", "panicled ticktrefoil", "aroma", "texas ebony", "elephant ear tree", "cherokee bean", "yellow bird-of-paradise shrub", "mexican holdback", "texas kidneywood", "common milkpea", "french broom", "dyer's greenweed", "honey locust", "fence post tree", 
"wild licorice", "nickernut", "kentucky coffeetree", "happy wanderer", "alpine sweet-vetch", "boreal sweet-vetch", "pig-nut", "wand holdback", "witch's-teeth", "pointed-leaved tick-trefoil", "naked-flowered tick-trefoil", "coastal indigo", "running postman", "golden chain", "hairy vetchling", "seaside pea", "broad-leaved sweet pea", "pale vetchling", "marsh pea", "meadow pea", "narrow-leaved everlasting-pea", "tuberous pea", "spring vetch", "pacific pea", "round-headed bush clover", "chinese bushclover", "slender bush clover", "cancer bush", "white leadtree", "eggs-and-bacon", "greater bird's-foot-trefoil", "silver lupine", "narrow-leaved lupin", "yellow bush lupine", "arctic lupine", "silvery lupine", "arizona lupine", "miniature lupine", "bajada lupine", "stinging lupine", "dwarf lupine", "chick lupine", "sky lupine", "nootka lupine", "sundial lupine", "garden lupine", "coulter's lupine", "arroyo lupine", "texas bluebonnet", "collared annual lupine", "purple bush-bean", "variableleaf bushbean", "phasey bean", "spotted medick", "sickle alfalfa", "black medic", "little bur-clover", "bur clover", "alfalfa", "white sweet clover", "sweet clover", "yellow sweet clover", "catclaw mimosa", "catclaw brier", "sensitive plant", "fourvalve mimosa", "sunshine mimosa", "yellow puff", "tropical puff", "ironwood", "sainfoin", "spiny restharrow", "yellow oxytropis", "stemless point-vetch", "plume albizia", "mexican palo verde", "blue palo verde", "little-leaved palo verde", "silvery scurfpea", "slimflower scurfpea", "chaparral pea", "blackbead", "common flat-pea", "honey mesquite", "prosopis laevigata", "screwbean", "velvet mesquite", "emory's indigo bush", "indigo-bush", "smoke tree", "kudzu", "new mexico locust", "black locust", "forest scurfpea", "purple crownvetch", "catsclaw", "candelabra bush", "spiny senna", "coues' senna", "african wild cassia", "american senna", "lindheimer's senna", "maryland senna", "american sicklepod", "coffee senna", "two-leaved senna", 
"rattlebush", "bigpod sesbania", "scarlet sesban", "bladder pod", "eve's necklace", "small-leaved kowhai", "spanish broom", "trailing fuzzy-bean", "pink fuzzybean", "pencil flower", "tamarind", "goat's rue", "california goldenbanner", "false lupine", "narrow-leaved clover", "hare's-foot clover", "large hop clover", "hop trefoil", "cowbag clover", "lesser hop trefoil", "strawberry clover", "rose clover", "alsike clover", "crimson clover", "lupine clover", "zigzag clover", "mountain clover", "red clover", "white clover", "reversed clover", "subterranean clover", "tomcat clover", "gorse", "whitethorn acacia", "bullhorn acacia", "sweet acacia", "sweet thorn", "blackbrush acacia", "twisted acacia", "american vetch", "hairy vetch", "carolina vetch", "tufted vetch", "giant vetch", "hairy tare", "slender vetch", "common vetch", "bush vetch", "wood vetch", "smooth tare", "hairy vetch", "hairypod cowpea", "chinese wisteria", "showy milkwort", "prickly purplegorse", "tortoise berry", "white milkwort", "orange milkwort", "sweet pea shrub", "candyroot", "racemed milkwort", "yellow milkwort", "field milkwort", "seneca snakeroot", "shrubby milkwort", "fringed polygala", "california milkwort", "green alder", "common alder", "speckled alder", "white alder", "red alder", "smooth alder", "yellow birch", "sweet birch", "dwarf birch", "river birch", "white birch", "silver birch", "gray birch", "downy birch", "swamp birch", "hornbeam", "american hornbeam", "american hazelnut", "common hazel", "beaked hazelnut", "american hophornbeam", "ironwood", "american chestnut", "sweet chestnut", "giant chinquapin", "bush chinquapin", "american beech", "european beech", "tanoak", "coast live oak", "white oak", "california scrub oak", "swamp white oak", "buckley's oak", "canyon live oak", "kermes oak", "scarlet oak", "muller's oak", "blue oak", "leather oak", "emory oak", "engelmann oak", "southern red oak", "texas live oak", "gambel oak", "oregon oak", "sand live oak", "holly oak", "bear oak", 
"shingle oak", "california black oak", "turkey oak", "valley oak", "bur oak", "blackjack oak", "swamp chestnut oak", "chestnut oak", "chinkapin oak", "water oak", "pin oak", "willow oak", "english oak", "sweet acorn oak", "northern red oak", "shumard oak", "bastard oak", "post oak", "cork oak", "sonoran scrub oak", "black oak", "southern live oak", "interior live oak", "bitternut hickory", "pignut hickory", "pecan", "shagbark hickory", "mockernut", "southern california walnut", "butternut", "eastern black walnut", "persian walnut", "sweet-fern", "california wax myrtle", "wax myrtle", "northern bayberry", "bog myrtle", "mountain beech", "silver beech", "coast silk tassel", "lindheimer's silktassel", "fringed bluestar", "eastern bluestar", "spreading dogbane", "hemp dogbane", "white bladderflower", "white-stemmed milkweed", "blunt-leaved milkweed", "spider milkweed", "california milkweed", "heart-leaf milkweed", "bloodflower milkweed", "woollypod milkweed", "desert milkweed", "poke milkweed", "narrowleaf milkweed", "tall green milkweed", "sandhill milkweed", "swamp milkweed", "fewflower milkweed", "broadleaf milkweed", "pineneedle milkweed", "prairie milkweed", "aquatic milkweed", "purple milkweed", "four-leaved milkweed", "showy milkweed", "rush milkweed", "horsetail milkweed", "common milkweed", "butterfly milkweed", "redring milkweed", "whorled milkweed", "green milkweed", "green antelopehorns", "crown flower", "giant milkweed", "num-num", "natal plum", "be-still tree", "madagascar periwinkle", "sandvine", "climbing milkweed", "hartweg's climbing milkweed", "trailing townula", "wild cotton", "narrow-leaf cotton bush", "balloonplant", "anglepod", "star milkvine", "pearl milkweed", "common oleander", "new zealand jasmine", "mexican plumeria", "star jasmine", "greater periwinkle", "lesser periwinkle", "swallow-wort", "black swallow-wort", "european swallow-wort", "yellow jessamine", "yellow-wort", "common centaury", "lesser centaury", "christmas berry", "catchfly 
prairie gentian", "american columbo", "monument plant", "trumpet gentian", "pale gentian", "arctic gentian", "closed bottle gentian", "willow gentian", "explorer's gentian", "bottle gentian", "cross gentian", "narrowleaf gentian", "great yellow gentian", "marsh gentian", "gentiana sedifolia", "spring gentian", "autumn gentian", "fringe-flowered gentian", "greater fringed gentian", "pennywort", "rosepink", "meadow pink", "marsh pink", "lady bird's centaury", "california centaury", "hangehange", "indian pink", "firecrackerbush", "buttonbush", "coffee", "sand coprosma", "aruhe", "mikimiki", "coprosma dumosa", "stinkwood", "kanono", "shining karamu", "miki", "mirror bush", "coprosma rhamnoides", "karamu", "round leaved coprosma", "crosswort", "buttonweed", "hedge bedstraw", "narrow-leaved bedstraw", "catchweed bedstraw", "northern bedstraw", "licorice bedstraw", "hedge bedstraw", "sweet woodruff", "common marsh-bedstraw", "fragrant bedstraw", "lady's bedstraw", "firebush", "rough buttonweed", "azure bluet", "long-leaved bluets", "roundleaf bluet", "summer bluet", "tiny bluet", "jungle flame", "partridgeberry", "noni", "nertera depressa", "old world diamond flower", "skunk vine", "red psychotria", "shiny-leaved wild coffee", "hot lips", "largeflower mexican clover", "field madder", "shrubby false buttonweed", "diamond-flowers", "long-beaked stork's bill", "redstem filaree", "musky stork's bill", "texas stork's bill", "purple cluster geranium", "carolina crane's-bill", "cut-leaved crane's-bill", "woolly cranesbill", "shining crane's-bill", "spotted geranium", "dove's-foot crane's-bill", "marsh crane's-bill", "dusky crane's-bill", "meadow crane's-bill", "little-robin", "small geranium", "hedgerow crane's-bill", "richardson's geranium", "herb robert", "bloody crane's-bill", "siberian crane's-bill", "woodland geranium", "sticky geranium", "rose-scented geranium", "wild mallow", "cinnamon geranium", "chilean rhubarb", "bear's breeches", "coromandel", "black mangrove", "grey 
mangrove", "false mint", "snakeherb", "polka dot plant", "american water-willow", "chuparosa", "gregg's tube tongue", "golden shrimp plant", "browne's blechum", "carolina wild petunia", "hairy petunia", "common wild petunia", "mexican petunia", "smooth ruellia", "black-eyed susan vine", "bengal trumpet", "cross vine", "american trumpet vine", "northern catalpa", "desert willow", "calabash tree", "blue jacaranda", "african tulip tree", "pink poui", "yellow trumpet flower", "cape honeysuckle", "taurepo", "anise hyssop", "nettle-leaf giant hyssop", "ground-pine", "blue bugle", "carpet bugle", "black horehound", "common hedge-nettle", "hairy wood mint", "american beautyberry", "formosan beautyberry", "basil-thyme", "yerba buena", "wild basil", "citronella horse balm", "desert lavender", "common dittany", "bifid hemp-nettle", "large-flowered hemp-nettle", "common hemp-nettle", "ground-ivy", "slender hedeoma", "musky mint", "white deadnettle", "henbit deadnettle", "yellow archangel", "spotted deadnettle", "red deadnettle", "topped lavender", "wild dagga", "lion's ear", "common motherwort", "leonurus quinquelobatus", "california pitcher sage", "american bugleweed", "european bugleweed", "northern bugleweed", "white horehound", "lemon balm", "bastard balm", "watermint", "corn mint", "american cornmint", "horse mint", "pennyroyal", "spearmint", "lemon beebalm", "scarlet beebalm", "wild bergamot", "spotted horse mint", "mountain coyote mint", "coyote mint", "catnip", "oregano", "beefsteak plant", "phlomoides tuberosa", "obedient plant", "common selfheal", "narrowleaf mountainmint", "virginia mountain mint", "white sage", "brown sage", "giant blue sage", "tropical sage", "chia", "purple sage", "mealy blue sage", "sticky sage", "purple sage", "lyreleaf sage", "black sage", "baby sage", "meadow sage", "cedar sage", "rosemary", "hummingbird sage", "texas sage", "whorled clary", "drummond's skullcap", "marsh skullcap", "helmet skullcap", "side-flowering skullcap", "paperbag 
bush", "small skullcap", "blue skullcap", "california hedge nettle", "woolly hedgenettle", "coastal hedge-nettle", "scarlet hedgenettle", "florida hedgenettle", "marsh woundwort", "perennial yellow-woundwort", "hedge woundwort", "canada germander", "wall germander", "coastal germander", "woodland germander", "mother of thyme", "large thyme", "wild thyme", "blue curls", "woolly blue curls", "vinegar weed", "parish's bluecurls", "lilac chaste tree", "puriri", "beach vitex", "prairie brazoria", "alpine butterwort", "common butterwort", "horned bladderwort", "humped bladderwort", "intermediate bladderwort", "common bladderwort", "purple bladderwort", "zigzag bladderwort", "yellowseed false pimpernel", "brittle false pimpernel", "desert unicorn-plant", "ram's horn", "japanese mazus", "white fringetree", "stretchberry", "texas ash", "white ash", "european ash", "oregon ash", "black ash", "green ash", "velvet ash", "tree privet", "quihoui privet", "chinese privet", "common privet", "olive", "common lilac", "prairie false foxglove", "purple false foxglove", "slender false foxglove", "clustered broomrape", "oneflower broomrape", "one-flowered cancer-root", "velvetbells", "mediterranean lineseed", "yellow glandweed", "coast indian paintbrush", "wavyleaf paintbrush", "field indian paintbrush", "valley tassels", "desert paintbrush", "painted-cup paintbrush", "denseflower indian paintbrush", "purple owl's clover", "woolly indian paintbrush", "harsh indian paintbrush", "texas paintbrush", "wholeleaf indian paintbrush", "monterey indian paintbrush", "wyoming indian paintbrush", "giant red indian paintbrush", "mountain indian paintbrush", "purple paintbrush", "rhexia-leaf indian-paintbrush", "cream sacs", "downy indian-paintbrush", "santa catalina indian paintbrush", "wight's paintbrush", "alpine cancer-root", "bear corn", "stiffbranch bird's beak", "beechdrops", "common eyebright", "inkblom", "common toothwort", "field cow-wheat", "narrowleaf cow wheat", "wood cow-wheat", "common 
cow-wheat", "red odontites", "ivy broomrape", "common broomrape", "little elephant's head", "towering lousewort", "canadian wood betony", "indian warrior", "elephant's head", "labrador lousewort", "swamp lousewort", "leafy lousewort", "pinewoods lousewort", "lousewort", "whorled lousewort", "greater yellow rattle", "yellow rattle", "late-flowering yellow rattle", "johnnytuck", "dwarf orthocarpus", "princess tree", "orange bush monkeyflower", "bigelow's monkeyflower", "wide-throated yellow monkeyflower", "southern bush monkeyflower", "red bush monkeyflower", "wingstem monkeyflower", "scarlet monkeyflower", "seep monkeyflower", "lewis' monkeyflower", "muskflower", "primrose monkeyflower", "sharpwing monkeyflower", "allegheny monkeyflower", "american lopseed", "snapdragon", "herb-of-grace", "dwarf snapdragon", "white turtlehead", "purple chinese houses", "maiden blue-eyed mary", "spring blue-eyed mary", "ivy-leaved toadflax", "yellow foxglove", "purple foxglove", "common mare's tail", "chaparral beardtongue", "heartleaf keckiella", "scarlet keckiella", "narrow-leaved paleseed", "alpine toadflax", "balkan toadflax", "purple toadflax", "common toadflax", "snapdragon vine", "yellow-flowered waterhyssop", "ghost flower", "blue toadflax", "texas toadflax", "beardlip penstemon", "scarlet bugler", "cobaea beardtongue", "davidson's penstemon", "foxglove beardtongue", "firecracker penstemon", "large beardtongue", "bunchleaf penstemon", "hairy beardtongue", "mountain pride", "palmer's penstemon", "small-flower beardtongue", "bridges' penstemon", "showy penstemon", "whipple's penstemon", "largebracted plantain", "minutina", "erect plantain", "english plantain", "common plantain", "sea plantain", "hoary plantain", "desert plantain", "woolly plantain", "american plantain", "virginia plantain", "coulter's snapdragon", "nuttall's snapdragon", "scoparia weed", "american brooklime", "blue water-speedwell", "wall speedwell", "brooklime", "germander speedwell", "shore hebe", "slender 
speedwell", "ivy-leaved speedwell", "long-leaved speedwell", "heath speedwell", "purslane speedwell", "persian speedwell", "grey field-speedwell", "koromiko", "marsh speedwell", "thyme-leaved speedwell", "spiked speedwell", "koromiko", "broadleaf speedwell", "american alpine speedwell", "culver's root", "buddleja cordata", "butterfly bush", "texas barometer bush", "ngaio", "lazybush", "california figwort", "common figwort", "moth mullein", "white mullein", "dark mullein", "common mullein", "wand mullein", "african honeysuckle", "rust weed", "whitebrush", "golden dewdrop", "dakota mock vervain", "rose mock vervain", "dwarf verbena", "spanish flag", "button sage", "creeping lantana", "lanceleaf fogfruit", "turkey tangle", "jamaica snakeweed", "purpletop vervain", "prostrate vervain", "brazilian vervain", "texas vervain", "blue vervain", "western vervain", "macdougal verbena", "common vervain", "slender vervain", "hoary vervain", "white vervain", "pukatea", "carolina sweetshrub", "california sweetshrub", "beilschmiedia tawa", "dodder laurel", "camphor tree", "bay laurel", "common spicebush", "avocado", "sassafras", "california bay", "pigeonwood", "pond-apple", "common pawpaw", "tulip tree", "cucumber-tree", "fraser magnolia", "southern magnolia", "bigleaf magnolia", "umbrella magnolia", "sweetbay magnolia", "cocoplum", "hornbeam copperleaf", "common copperleaf", "spurge nettle", "texas bull nettle", "variegated croton", "california croton", "bush croton", "tropic croton", "lindheimer's doveweed", "prairie tea", "beach croton", "turkey-mullein", "lance-leaved ditaxis", "white-margined sandmat", "wood spurge", "candelilla", "snow-on-the-prairie", "medusa's-head", "mediterranean spurge", "flowering spurge", "painted leaf", "cypress spurge", "green poinsettia", "fendler's sandmat", "sun spurge", "painted spurge", "ara tanan", "graceful spurge", "hyssop spurge", "caper spurge", "spotted spurge", "snow-on-the-mountain", "pencil milkbush", "red-gland spurge", "cliff spurge", 
"donkey tail", "nodding spurge", "eggleaf spurge", "florida hammock sandmat", "sea spurge", "petty spurge", "smallseed sandmat", "mat euphorbia", "matted sandmat", "thyme-leafed spurge", "reticulate-seeded spurge", "geraldton carnation weed", "leafy spurge", "leatherstem", "bellyache bush", "elephant's ear", "turn-in-the-wind", "cassava", "annual mercury", "dog's mercury", "castor oil plant", "queen's delight", "texas stillingia", "chinese tallow", "bible-leaf", "great st. john's wort", "aaron's beard", "fraser's marsh st. john's-wort", "pineweed", "st. andrew's cross", "kalm's st. john's-wort", "imperforate st john's-wort", "dwarf st. john's wort", "klamath weed", "spotted st. john's wort", "fourpetal st. johnswort", "pale flax", "lewis flax", "meadow flax", "yellow flax", "nance", "bluecrown passionflower", "common passionfruit", "stinking passionflower", "purple passionflower", "yellow passionflower", "corkystem passionflower", "new zealand passionflower", "banana passionfruit", "grapeleaf passionflower", "yellow alder", "bishop wood", "hen and chickens", "chamberbitter", "red mangrove", "european white poplar", "balsam poplar", "eastern cottonwood", "alamo cottonwood", "bigtooth aspen", "black poplar", "european aspen", "trembling aspen", "black cottonwood", "white willow", "weeping willow", "bebb's willow", "goat willow", "grey willow", "american willow", "yellow willow", "narrowleaf willow", "goodding's willow", "interior sandbar willow", "red willow", "arroyo willow", "dark-leaved willow", "black willow", "net-leaved willow", "almond willow", "basket willow", "porcupine shrub", "whiteywood", "hookedspur violet", "european field pansy", "johnny-jump-up", "twoflower violet", "sweet white violet", "canada violet", "heath dog-violet", "stream violet", "halberd-leaved violet", "hairy violet", "labrador violet", "white bog violet", "pine violet", "small white violet", "miracle violet", "nuttall's violet", "western heart's ease", "sweet violet", "early blue 
violet", "birdfoot violet", "prairie violet", "california golden violet", "primrose-leaved violet", "downy yellow violet", "goosefoot violet", "early dog-violet", "common dog-violet", "long-spurred violet", "round-leaved violet", "arrowleaf violet", "redwood violet", "woolly blue violet", "cream violet", "heartsease", "lipsticktree", "cochlospermum vitifolium", "grey-leaved cistus", "hoary rock-rose", "gum cistus", "montpelier cistus", "sage-leaved rock-rose", "canada frostweed", "peak rushrose", "common rock-rose", "woolly beachheather", "sweet indian mallow", "velvetleaf", "baobab", "hollyhock", "chisos mountains false indianmallow", "violettas", "winecup mallow", "palmleaf winecup", "kapok tree", "whau", "desert five-spot", "california flannelbush", "upland cotton", "west indian elm", "curly herissantia", "rock hibiscus", "halberd-leaf rosemallow", "swamp rose mallow", "chinese hibiscus", "rose of sharon", "sea hibiscus", "flower-of-an-hour", "north island lacebark", "newberry's velvet-mallow", "saltmarsh mallow", "chaparral mallow", "tree mallow", "island mallow", "musk mallow", "common mallow", "cheeseweed mallow", "common mallow", "eastern tree-mallow", "three-lobe false mallow", "turk's cap", "alkali mallow", "pyramid flower", "gulf teabush", "modiola", "provision tree", "rock rose", "makaka", "lowland ribbonwood", "sida", "bracted fanpetals", "rhombus-leaved sida", "dwarf checkermallow", "oregon checker-mallow", "southern checkerbloom", "desert mallow", "narrowleaf globemallow", "scarlet globemallow", "cacao tree", "portia tree", "basswood", "little-leaf linden", "caesar weed", "sleepy morning", "cherry tree", "spurge flax", "spurge-laurel", "mezereon", "eastern leatherwood", "autetaranga", "buttonwood", "white mangrove", "country almond", "scarlet toothcup", "colombian cuphea", "willow-herb", "crepe myrtle", "winged loosestrife", "hyssop loosestrife", "purple loosestrife", "pomegranate", "water chestnut", "soapbush", "malabar melastome", "savannah 
meadowbeauty", "maryland meadowbeauty", "virginia meadow beauty", "river redgum", "tasmanian blue gum", "australian tea tree", "manuka", "ramarama", "white climbing rata", "pohutukawa", "scarlet rātā vine", "climbing rata", "ohia lehua", "northern rata", "southern rata", "myrtle", "strawberry-guava", "common guava", "california sun cup", "beach suncup", "fireweed", "dwarf fireweed", "yellow cups", "browneyes", "alpine enchanter's-nightshade", "broadleaf enchanter's nightshade", "enchanter's-nightshade", "farewell-to-spring", "red ribbons", "winecup clarkia", "diamond clarkia", "ruby chalice clarkia", "elegant clarkia", "panicled willowherb", "california fuchsia", "fringed willowherb", "codlins-and-cream", "broad-leaved willowherb", "booth's evening primrose", "california primrose", "lady's eardrops", "kotukutuku", "hardy fuchsia", "mosquito flower", "seedbox", "mexican primrose-willow", "water purslane", "floating primrose-willow", "peruvian primrose-willow", "berlandier's sundrops", "common evening-primrose", "fragrant evening primrose", "lizard-tail", "dune evening primrose", "beach evening-primrose", "hooker's evening primrose", "biennial beeblossom", "false gaura", "large-flowered evening-primrose", "hartweg's sundrops", "cutleaf evening primrose", "bigfruit evening primrose", "rose evening primrose", "oenothera rubricaulis", "pinkladies", "scarlet beeblossom", "roadside gaura", "stemless evening primrose", "sun cup", "noughts-and-crosses", "vliebos", "watershield", "spatterdock", "yellow water-lily", "american yellow pond-lily", "variegated pond-lily", "white water-lily", "nymphaea candida", "red and blue water lily", "american white waterlily", "river rose", "kamahi", "wineberry", "pokaka", "common woodsorrel", "pink-sorrel", "creeping woodsorrel", "largeflower pink-sorrel", "sussex yellow-sorrel", "drummond's woodsorrel", "pale pink-sorrel", "garden pink-sorrel", "golden sorrel", "mountain woodsorrel", "yelloweye woodsorrel", "redwood woodsorrel", 
"bermuda-buttercup", "manyleaf sorrel", "purple woodsorrel", "yellow woodsorrel", "violet woodsorrel", "california dutchman's pipe", "birthwort", "canadian wild ginger", "western wild ginger", "asarabacca", "virginia snakeroot", "little brown jug", "kawakawa", "yerba mansa", "chameleon plant", "lizard's tail", "american lotus", "sacred lotus", "american sycamore", "western sycamore", "arizona sycamore", "coastal banksia", "silver banksia", "bushy needlebush", "rewarewa", "golden sunshinebush", "spinning-top conebush", "common sunshine conebush", "tree pincushion", "wart-stemmed pincushion", "red pagoda", "common sugarbush", "king protea", "greyleaf sugarbush", "oleander-leaf protea", "wagon tree", "common sugarbush", "common pin spiderhead", "vanilla leaf", "oregon-grape", "leatherleaf mahonia", "darwin's barberry", "cascade oregon-grape", "creeping mahonia", "thunberg's barberry", "agarita", "european barberry", "early blue cohosh", "blue cohosh", "twinleaf", "heavenly bamboo", "mayapple", "white inside-out flower", "carolina snailseed", "moonseed", "white prickly poppy", "mexican prickly poppy", "flatbud prickly poppy", "mexican poppy", "thistle poppy", "rock harlequin", "greater celandine", "scrambled eggs", "hollowroot", "pale corydalis", "bird-in-a-bush", "bush poppy", "squirrel corn", "dutchman's breeches", "pacific bleeding heart", "golden eardrops", "tufted poppy", "californian poppy", "little gold poppy", "goldpoppy", "white ramping-fumitory", "common ramping-fumitory", "common fumitory", "sea poppy", "bleeding heart", "welsh poppy", "long-headed poppy", "windpoppy", "common poppy", "opium poppy", "creamcups", "yellow corydalis", "coulter's matilija poppy", "puccoon-root", "celandine poppy", "columbian monkshood", "larkspurleaf monkshood", "northern wolf's-bane", "monk's-hood", "aconitum septentrionale", "doll's eyes", "black cohosh", "red baneberry", "eurasian baneberry", "spring adonis", "meadow anemone", "tenpetal anemone", "poppy anemone", "cylindrical 
thimbleweed", "broad-leaved anemone", "cutleaf anemone", "tall thimbleweed", "anemonoides altaica", "wood anemone", "wood anemone", "yellow anemone", "snowdrop anemone", "dark columbine", "red columbine", "golden columbine", "colorado blue columbine", "yellow columbine", "western columbine", "common columbine", "white marsh marigold", "marsh-marigold", "curveseed butterwort", "alpine clematis", "swamp leather flower", "old man's beard", "pipestem clematis", "western virgin's bower", "purple clematis", "puawhananga", "pitcher's leatherflower", "autumn clematis", "scarlet leather flower", "virginia virgin's-bower", "old man's beard", "forking larkspur", "threeleaf goldthread", "scarlet larkspur", "wild blue larkspur", "mountain larkspur", "canyon larkspur", "nuttall's larkspur", "desert larkspur", "dwarf larkspur", "false rue anemone", "winter aconite", "lesser celandine", "stinking hellebore", "sharp-lobed hepatica", "roundlobe hepatica", "liverleaf", "goldenseal", "love-in-a-mist", "alpine pasqueflower", "eastern pasqueflower", "white pasqueflower", "eastern pasqueflower", "small-flowered crowfoot", "meadow buttercup", "common water-crowfoot", "goldilocks buttercup", "bulbous buttercup", "california buttercup", "kashubian buttercups", "early buttercup", "lesser spearwort", "sagebrush buttercup", "bristly buttercup", "rough-fruited buttercup", "western buttercup", "multi-flowered buttercup", "hooked buttercup", "creeping buttercup", "cursed crowfoot", "greater meadow-rue", "purple meadow-rue", "early meadow-rue", "fendler's meadow-rue", "shining meadow-rue", "lesser meadow-rue", "king of the meadow", "rue-anemone", "trollius asiaticus", "globeflower", "globe flower", "shrub yellowroot", "hemp", "sugarberry", "hackberry", "granjeno", "netleaf hackberry", "japanese hops", "common hops", "oriental trema", "russian olive", "silverberry", "thorny olive", "autumn olive", "sea-buckthorn", "canadian buffalo-berry", "breadfruit", "jackfruit", "paper mulberry", "hairy 
crabweed", "florida strangler fig", "common fig", "chinese banyan", "petiolate fig", "climbing fig", "sacred fig", "bois d'arc", "white mulberry", "korean mulberry", "red mulberry", "small leaved milk tree", "supplejack", "new jersey tea", "mountain whitethorn", "hoaryleaf ceanothus", "buckbrush", "fendler's ceanothus", "inland ceanothus", "deerbrush ceanothus", "chaparral whitethorn", "bigpod ceanothus", "hairy ceanothus", "cupped leaf ceanothus", "mahala mat", "greenbark ceanothus", "blueblossom", "tobacco brush", "wart-stemmed ceanothus", "hog-plum", "brasil", "matagouri", "alder buckthorn", "california buckthorn", "carolina buckthorn", "cascara", "jerusalem thorn", "mediterranean buckthorn", "alderleaf buckthorn", "common buckthorn", "redberry buckthorn", "coyotillo", "hollyleaf redberry", "dogsface", "lotebush", "bronze piri-piri-bur", "biddy-biddy", "chamise", "redshanks", "common agrimony", "common agrimony", "swamp agrimony", "hairy agrimony", "saskatoon", "shadblow", "dwarf serviceberry", "utah serviceberry", "common silverweed", "black chokeberry", "goatsbeard", "california mountain mahogany", "curl-leaf mountain-mahogany", "alder-leaved mountain-mahogany", "mountain misery", "fernbush", "blackbrush", "marsh cinquefoil", "parsley hawthorn", "common hawthorn", "shrubby cinquefoil", "yellow mountain-avens", "mountain avens", "tall cinquefoil", "sticky cinquefoil", "loquat", "apache plume", "queen of the prairie", "meadowsweet", "dropwort", "beach strawberry", "wood strawberry", "virginia strawberry", "green strawberry", "yellow avens", "white avens", "large-leaved avens", "water avens", "ross' avens", "prairie smoke", "wood avens", "toyon", "ocean spray", "japanese rose", "partridgefoot", "apple", "oregon crab apple", "domestic apple", "indian plum", "mat rock spiraea", "chinese photinia", "pacific ninebark", "common ninebark", "silverleaf cinquefoil", "dwarf cinquefoil", "tormentil", "slender cinquefoil", "indian strawberry", "rough cinquefoil", "sulphur 
cinquefoil", "creeping cinquefoil", "common cinquefoil", "american plum", "chickasaw plum", "sweet cherry", "carolina laurelcherry", "cherry-plum", "almond", "bitter cherry", "desert almond", "desert apricot", "hollyleaf cherry", "cherry laurel", "mexican plum", "european bird cherry", "fire cherry", "peach", "sand cherry", "black cherry", "blackthorn", "choke cherry", "stansbury's cliff rose", "antelope brush", "firethorn", "callery pear", "common pear", "indian hawthorn", "jetbead", "prickly wild rose", "prairie rose", "macartney's rose", "california wild rose", "dog-rose", "carolina rose", "baldhip rose", "cinnamon rose", "multiflora rose", "nootka rose", "swamp rose", "sweet-brier", "rugosa rose", "burnet rose", "woods' rose", "allegheny blackberry", "arctic raspberry", "himalayan blackberry", "himalayan-berry", "european dewberry", "cloudberry", "bush lawyer", "dewdrop", "common dewberry", "swamp dewberry", "red raspberry", "cutleaf blackberry", "whitebark raspberry", "black raspberry", "purple-flowered raspberry", "thimbleberry", "five-leaf dwarf bramble", "wineberry", "dwarf raspberry", "stone bramble", "rubus schmidelioides", "salmonberry", "coastal-plain dewberry", "elmleaf blackberry", "california blackberry", "canadian burnet", "salad burnet", "great burnet", "creeping sibbaldia", "three-toothed cinquefoil", "sorbaria", "american mountain ash", "rowan", "sitka mountain-ash", "white meadowsweet", "white spirea", "rose spiraea", "japanese spiraea", "mountain spiraea", "steeplebush", "appalachian barren-strawberry", "winged elm", "american elm", "cedar elm", "wych elm", "fluttering elm", "field elm", "lacebark elm", "siberian elm", "slippery elm", "rock elm", "false nettle", "parataniwha", "canadian wood nettle", "spreading pellitory", "pennsylvania pellitory", "rockweed", "canada clearweed", "heartleaf nettle", "stinging nettle", "tree nettle", "dwarf nettle", "green mistletoe", "western dwarf mistletoe", "cape sumach", "bastard toadflax", "cherry 
ballart", "northern comandra", "desert mistletoe", "oak mistletoe", "european mistletoe", "cashew", "smokebush", "laurel sumac", "indian mango", "chinese pistache", "mastic", "fragrant sumac", "shining sumac", "smooth sumac", "lemonade berry", "prairie flameleaf sumac", "little leaf sumac", "sugar bush", "staghorn sumach", "evergreen sumac", "peruvian peppertree", "brazilian pepper", "glossy currantrhus", "pacific poison oak", "poison ivy", "poison ivy", "poison sumac", "elephant tree", "gumbo limbo", "nim", "kohekohe", "umbrella tree", "trifoliate orange", "bushrue", "common correa", "orange jasmine", "common hoptree", "turpentine broom", "prickly ash", "hercules' club", "lime prickly-ash", "tickle tongue", "field maple", "vine maple", "southern sugar maple", "amur maple", "mountain maple", "bigtooth maple", "bigleaf maple", "boxelder maple", "black maple", "japanese maple", "striped maple", "norway maple", "sycamore maple", "red maple", "silver maple", "sugar maple", "mountain maple", "tartar maple", "california buckeye", "ohio buckeye", "horse-chestnut", "red buckeye", "titoki", "balloon vine", "broad leaf hopbush", "goldenrain tree", "western soapberry", "mexican buckeye", "tree of heaven", "american sweetgum", "round-leafed navel-wort", "pygmy stonecrop", "jade plant", "arizona chalk dudleya", "canyon live-forever", "fingertips", "powdery liveforever", "lanceleaf liveforever", "chalk dudleya", "panamint liveforever", "echeveria gibbiflora", "orpine", "mother of thousands", "air plant", "spiny pennywort", "western roseroot", "roseroot", "biting stonecrop", "white stonecrop", "thick-leaved stonecrop", "spearleaf stonecrop", "yellow stonecrop", "sierra stonecrop", "tasteless stonecrop", "colorado stonecrop", "wild stonecrop", "cobweb houseleek", "houseleek", "botterboom", "wall pennywort", "american black currant", "golden currant", "stink currant", "wax currant", "prickly gooseberry", "white-flowering currant", "swamp currant", "chaparral currant", "mountain 
pink currant", "blackcurrant", "sierra gooseberry", "red currant", "red-flowering currant", "fuchsiaflower gooseberry", "northern redcurrant", "goose-berry", "shrubby haloragis", "parrot's feather", "eurasian water-milfoil", "witchhazel", "virginia sweetspire", "brown's peony", "california peony", "ditch stonecrop", "alternate-leaved golden saxifrage", "american golden saxifrage", "indian rhubarb", "roundleaf alumroot", "crevice alumroot", "prairie alumroot", "leather-leaf saxifrage", "san francisco woodland-star", "hillside woodland star", "smallflower woodland star", "california saxifrage", "russethair saxifrage", "swamp saxifrage", "early saxifrage", "twoleaf miterwort", "naked bishop's cap", "yellow saxifrage", "matte saxifrage", "meadow saxifrage", "purple saxifrage", "livelong saxifrage", "prickly saxifrage", "fringe cups", "heartleaf foamflower", "threeleaf foamflower", "youth on age", "modesty", "island false bindweed", "hedge bindweed", "large bindweed", "beach morning-glory", "mallow bindweed", "field bindweed", "texas bindweed", "chaparral dodder", "common dodder", "goldenthread", "carolina ponysfoot", "kidney weed", "alamo vine", "tropical speedwell", "silky evolvulus", "moonflower", "sweet potato", "mile-a-minute vine", "bush morning glory", "red morning glory", "tievine", "ivy-leaved morning-glory", "scarlet creeper", "beach morning glory", "oceanblue morning glory", "white morning-glory", "obscure morning glory", "wild potato vine", "goat's foot convolvulus", "common morning-glory", "cypress vine", "saltmarsh morning-glory", "littlebell", "hairy cluster-vine", "blue waterleaf", "rattlebush", "japanese-lantern", "deadly nightshade", "sweet pepper", "desert thorn-apple", "angel's trumpet", "jimson weed", "sacred datura", "henbane", "anderson thornbush", "matrimony vine", "christmas berry", "african boxthorn", "apple-of-peru", "tree tobacco", "desert tobacco", "cutleaf groundcherry", "thickleaf groundcherry", "clammy groundcherry", "cape-gooseberry", 
"chinese lantern", "american nightshade", "carolina horse-nettle", "tall nightshade", "western horsenettle", "twoleaf nightshade", "greenspot nightshade", "woody nightshade", "white horse-nettle", "eastern black nightshade", "potato tree", "kangaroo-apple", "tomato", "bugtree", "black nightshade", "jerusalem cherry", "buffalo bur", "potato", "bluewitch nightshade", "purple nightshade", "pepper vine", "heart leaf peppervine", "porcelain berry", "bushkiller", "sorrelvine", "thicket creeper", "virginia creeper", "japanese creeper", "summer grape", "canyon wild grape", "california wild grape", "mustang grape", "riverbank grape", "muscadine", "white ratany", "pima rhatany", "prairie bur", "california fagonbush", "texas lignum-vitae", "guaiacum coulteri", "arizona poppy", "creosote bush", "puncture vine", "kauri", "california incense-cedar", "monterey cypress", "nootka cypress", "ashe juniper", "california juniper", "common juniper", "alligator juniper", "sierra juniper", "creeping juniper", "one-seed juniper", "western juniper", "utah juniper", "prickly juniper", "rocky mountain juniper", "eastern red cedar", "coast redwood", "giant sequoia", "pondcypress", "baldcypress", "montezuma cypress", "northern whitecedar", "western redcedar", "silver fir", "balsam fir", "white fir", "grand fir", "subalpine fir", "siberian fir", "european larch", "tamarack", "western larch", "siberian larch", "norway spruce", "engelmann spruce", "white spruce", "black spruce", "siberian spruce", "blue spruce", "red spruce", "sitka spruce", "whitebark pine", "jack pine", "mexican pinyon", "lodgepole pine", "coulter pine", "shortleaf pine", "twin-needle pinyon", "slash pine", "limber pine", "jeffrey's pine", "sugar pine", "great basin bristlecone pine", "singleleaf pinyon", "western white pine", "austrian pine", "longleaf pine", "maritime pine", "stone pine", "ponderosa pine", "monterey pine", "red pine", "pitch pine", "gray pine", "siberian pine", "eastern white pine", "scots pine", "loblolly 
pine", "torrey pine", "virginia pine", "common douglas-fir", "canada hemlock", "western hemlock", "mountain hemlock", "kahikatea", "red pine", "mountain toatoa", "celery pine", "hall's totara", "totara", "miro", "matai", "common yew", "pacific yew", "canadian yew", "california nutmeg", "silver fern", "black tree fern", "soft tree fern", "man fern", "golden tree fern", "new zealand tree fern", "scouring rush", "water horsetail", "rough horsetail", "smooth horsetail", "meadow horsetail", "branched horsetail", "dwarf horsetail", "wood horsetail", "giant horsetail", "variegated horsetail", "false staghorn fern", "tangle fern", "umbrella fern", "drooping filmy fern", "shiny filmy-fern", "kidney fern", "veined bristle-fern", "moonwort", "rattlesnake fern", "cut-leaved grape-fern", "leathery grapefern", "crepe fern", "interrupted fern", "royal fern", "cinnamon fern", "ground spleenwort", "hen and chicks fern", "rustyback", "necklace fern", "drooping spleenwort", "mother spleenwort", "hooker's spleenwort", "mountain spleenwort", "bird's nest fern", "huruhuruwhenua", "ebony spleenwort", "mare's tail fern", "walking fern", "wall-rue", "hart's-tongue fern", "maidenhair spleenwort", "green spleenwort", "northern lady fern", "lady fern", "silvery glade fern", "rereti", "thread fern", "palm-leaf fern", "rasp fern", "little hard fern", "blechnum procerum", "kiwakiwa", "crown fern", "deer fern", "toothed midsorus fern", "netted chain fern", "giant chain fern", "virginia chainfern", "bulblet fern", "brittle bladderfern", "lowland brittle fern", "northern oak fern", "hay-scented fern", "water fern", "common pig fern", "hard fern", "common bracken", "austral bracken", "pinewood bracken", "narrow-leaved glade fern", "house holly-fern", "coastal woodfern", "spinulose wood fern", "crested buckler-fern", "northern wood fern", "male fern", "fragrant wood fern", "intermediate wood fern", "marginal wood fern", "christmas fern", "holly fern", "western swordfern", "polystichum neozelandicum", 
"prickly shield fern", "leatherleaf fern", "tuber ladder fern", "ostrich fern", "sensitive fern", "kangaroo fern", "fragrant fern", "comb fern", "golden polypody", "resurrection fern", "california polypody", "licorice fern", "leathery polypody", "rock polypody", "common polypody", "leather-leaf fern", "giant leather fern", "five-fingered fern", "black maidenhair fern", "maidenhair", "rose maidenhair-fern", "california maidenhair fern", "northern maidenhair fern", "serpentine fern", "wavy scale cloakfern", "american parsley fern", "alabama lipfern", "golden lipfern", "parry's lip fern", "coffee fern", "purple-stem cliffbrake", "smooth cliffbrake", "bird's foot cliffbrake", "button fern", "goldback fern", "cretan brake", "sweet fern", "shaking brake", "ladder fern", "new york fern", "long beech fern", "broad beech fern", "gully fern", "marsh fern", "rusty woodsia", "blunt woodsia", "skeleton fork fern", "tmesipteris elongata", "water fern", "water spangles", "japanese climbing fern" ]
CharlesBointon/food_classifier
<!-- This model card has been generated automatically according to the information Keras had access to. You
should probably proofread and complete it, then remove this comment. -->

# CharlesBointon/food_classifier

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 1.2662
- Validation Loss: 1.1840
- Train Accuracy: 1.0
- Epoch: 4

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 2000, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32

### Training results

| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 3.6773     | 2.6774          | 1.0            | 0     |
| 2.1700     | 1.8405          | 1.0            | 1     |
| 1.6777     | 1.5476          | 1.0            | 2     |
| 1.4452     | 1.3513          | 1.0            | 3     |
| 1.2662     | 1.1840          | 1.0            | 4     |

### Framework versions

- Transformers 4.32.0.dev0
- TensorFlow 2.9.1
- Datasets 2.14.2
- Tokenizers 0.12.1
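The optimizer block above encodes a `PolynomialDecay` schedule with `power: 1.0`, which is simply a linear ramp from 3e-05 down to 0.0 over 2000 steps. A minimal sketch of that schedule (a standalone reimplementation for illustration, not the Keras class itself):

```python
def polynomial_decay_lr(step, initial_lr=3e-05, decay_steps=2000,
                        end_lr=0.0, power=1.0):
    """Learning rate at a given step under a cycle=False polynomial
    decay: the step is clipped to decay_steps, then the rate
    interpolates from initial_lr down to end_lr."""
    step = min(step, decay_steps)
    fraction = 1 - step / decay_steps
    return (initial_lr - end_lr) * fraction ** power + end_lr

# With power=1.0 the decay is linear:
print(polynomial_decay_lr(0))     # initial rate, 3e-05
print(polynomial_decay_lr(1000))  # halfway: 1.5e-05
print(polynomial_decay_lr(2500))  # past decay_steps the rate stays at end_lr
```

Note that `decay_steps: 2000` covers 5 epochs of 400 optimizer steps; after step 2000 the rate would simply sit at `end_learning_rate`.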
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
fadhlika14/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224-finetuned-flower

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

### Framework versions

- Transformers 4.24.0
- Pytorch 2.0.1+cu118
- Datasets 2.7.1
- Tokenizers 0.13.3
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
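Assuming the five labels above are listed in index order (the usual convention for the label array in a fine-tuned classifier's config), the `id2label`/`label2id` mappings the model would carry can be reconstructed as:

```python
labels = ["daisy", "dandelion", "roses", "sunflowers", "tulips"]

# Forward and reverse mappings between class index and class name,
# in the form image-classification model configs typically store them.
id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in enumerate(labels)}

print(id2label[3])         # sunflowers
print(label2id["tulips"])  # 4
```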
omarhkh/resnet-50-finetuned-omars3
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# resnet-50-finetuned-omars3

This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7345
- Accuracy: 0.7436

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.3928        | 1.0   | 11   | 1.3871          | 0.2308   |
| 1.3864        | 2.0   | 22   | 1.3820          | 0.3077   |
| 1.3694        | 3.0   | 33   | 1.3510          | 0.5385   |
| 1.3513        | 4.0   | 44   | 1.2942          | 0.4872   |
| 1.3067        | 5.0   | 55   | 1.1984          | 0.6154   |
| 1.2184        | 6.0   | 66   | 0.9974          | 0.6923   |
| 1.0967        | 7.0   | 77   | 0.7869          | 0.6667   |
| 0.9731        | 8.0   | 88   | 0.7923          | 0.7436   |
| 0.9506        | 9.0   | 99   | 0.7161          | 0.6667   |
| 0.7783        | 10.0  | 110  | 0.6736          | 0.6923   |
| 0.7072        | 11.0  | 121  | 0.6693          | 0.7436   |
| 0.6669        | 12.0  | 132  | 0.7203          | 0.6923   |
| 0.6579        | 13.0  | 143  | 0.6195          | 0.7949   |
| 0.6695        | 14.0  | 154  | 0.6395          | 0.7692   |
| 0.678         | 15.0  | 165  | 0.6870          | 0.7692   |
| 0.5919        | 16.0  | 176  | 0.6681          | 0.7692   |
| 0.5459        | 17.0  | 187  | 0.6895          | 0.7692   |
| 0.5635        | 18.0  | 198  | 0.6617          | 0.7692   |
| 0.5378        | 19.0  | 209  | 0.6401          | 0.7949   |
| 0.5105        | 20.0  | 220  | 0.7108          | 0.7692   |
| 0.4656        | 21.0  | 231  | 0.7267          | 0.7692   |
| 0.5338        | 22.0  | 242  | 0.7531          | 0.7436   |
| 0.4846        | 23.0  | 253  | 0.7103          | 0.7179   |
| 0.4212        | 24.0  | 264  | 0.7809          | 0.7436   |
| 0.4677        | 25.0  | 275  | 0.7825          | 0.7692   |
| 0.4496        | 26.0  | 286  | 0.8240          | 0.6923   |
| 0.3784        | 27.0  | 297  | 0.7563          | 0.7179   |
| 0.4949        | 28.0  | 308  | 0.6823          | 0.7692   |
| 0.4612        | 29.0  | 319  | 0.7542          | 0.6667   |
| 0.4491        | 30.0  | 330  | 0.7345          | 0.7436   |

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.0
- Tokenizers 0.13.3
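With `lr_scheduler_type: linear` and `lr_scheduler_warmup_ratio: 0.1`, the learning rate ramps up over the first 10% of the run and then decays linearly to zero. Using the step counts from the table above (11 optimizer steps per epoch over 30 epochs gives 330 total steps, so 33 warmup steps), a sketch of the schedule; the function name and structure are illustrative, not the Trainer's internals:

```python
def linear_schedule_with_warmup(step, total_steps=330, warmup_ratio=0.1,
                                base_lr=0.0005):
    """Linear warmup to base_lr, then linear decay to zero.
    total_steps comes from the card's table: 30 epochs * 11 steps."""
    warmup_steps = int(total_steps * warmup_ratio)  # 33 steps here
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

print(linear_schedule_with_warmup(0))    # 0.0: warmup starts from zero
print(linear_schedule_with_warmup(33))   # peak rate (~0.0005) at end of warmup
print(linear_schedule_with_warmup(330))  # 0.0: fully decayed at the last step
```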
[ "depthread", "depthunread", "widthread", "widthunread" ]
platzi/platzi-vit-model-cesar-vega
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# platzi-vit-model-cesar-vega

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0489
- Accuracy: 0.9850

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1466        | 3.85  | 500  | 0.0489          | 0.9850   |

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.2
- Tokenizers 0.13.3
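The single logged row (step 500 at epoch 3.85) is consistent with 130 optimizer steps per epoch at batch size 8. Back-of-the-envelope arithmetic, assuming the beans training split's commonly reported size of 1,034 images (an assumption, not a value read from the training logs):

```python
import math

# Hypothetical check: does a 1,034-image train split at batch size 8
# reproduce the logged "epoch 3.85 at step 500"?
batch_size = 8
train_examples = 1034  # assumed beans train-split size
steps_per_epoch = math.ceil(train_examples / batch_size)

print(steps_per_epoch)                  # 130
print(round(500 / steps_per_epoch, 2))  # 3.85, matching the table row
```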
[ "angular_leaf_spot", "bean_rust", "healthy" ]
dyvapandhu/vit-molecul
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-molecul This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.5737 - Accuracy: 0.71 - F1: 0.7086 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 3e-06 - train_batch_size: 50 - eval_batch_size: 50 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 20 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.723 | 1.0 | 8 | 0.6790 | 0.61 | 0.6096 | | 0.6915 | 2.0 | 16 | 0.6661 | 0.62 | 0.5924 | | 0.6689 | 3.0 | 24 | 0.6470 | 0.69 | 0.6892 | | 0.6517 | 4.0 | 32 | 0.6356 | 0.64 | 0.6377 | | 0.6368 | 5.0 | 40 | 0.6289 | 0.72 | 0.7199 | | 0.621 | 6.0 | 48 | 0.6217 | 0.73 | 0.7293 | | 0.6061 | 7.0 | 56 | 0.6197 | 0.69 | 0.6862 | | 0.5924 | 8.0 | 64 | 0.6087 | 0.73 | 0.7293 | | 0.5767 | 9.0 | 72 | 0.6003 | 0.72 | 0.7199 | | 0.5633 | 10.0 | 80 | 0.5953 | 0.72 | 0.7196 | | 0.5491 | 11.0 | 88 | 0.5885 | 0.72 | 0.7199 | | 0.5351 | 12.0 | 96 | 0.5869 | 0.71 | 0.7100 | | 0.5239 | 13.0 | 104 | 0.5867 | 0.7 | 0.6995 | | 0.5118 | 14.0 | 112 | 0.5804 | 0.71 | 0.7100 | | 0.502 | 15.0 | 120 | 0.5752 | 0.71 | 0.7100 | | 0.4942 | 16.0 | 128 | 0.5738 | 0.72 | 0.7199 | | 0.4885 | 17.0 | 136 | 0.5771 | 0.71 | 0.7086 | | 0.4831 | 18.0 | 144 | 0.5751 | 0.71 | 0.7086 | | 0.4793 | 19.0 | 152 | 0.5743 | 0.71 | 0.7086 | | 0.4774 | 20.0 | 160 | 0.5737 | 0.71 | 
0.7086 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu117 - Datasets 2.14.1 - Tokenizers 0.13.3
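The table above reports both accuracy and F1 on a two-class task, and the two can diverge. As a reminder of what the F1 column summarizes, here is a minimal, dependency-free sketch of per-class, macro-averaged, and micro-averaged F1; the card does not state which averaging its metric script used, so treat the averaging choice as an assumption.

```python
def f1_scores(y_true, y_pred, labels):
    """Per-class, macro-averaged, and micro-averaged F1 from label lists."""
    per_class = {}
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        per_class[c] = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    macro = sum(per_class.values()) / len(labels)
    # in single-label classification, micro F1 equals plain accuracy
    micro = sum(1 for t, p in zip(y_true, y_pred) if t == p) / len(y_true)
    return per_class, macro, micro
```

Macro F1 weights both classes equally, which is why it can sit slightly below accuracy (0.7086 vs 0.71 here) when one class is predicted a little worse than the other.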
[ "a", "c" ]
omarhkh/resnet-50-finetuned-omars5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # resnet-50-finetuned-omars5 This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.5844 - Accuracy: 0.8845 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0005 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 30 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.3431 | 0.99 | 92 | 1.2810 | 0.5836 | | 1.0465 | 2.0 | 185 | 0.8740 | 0.8176 | | 0.8755 | 2.99 | 277 | 0.6467 | 0.7994 | | 0.7459 | 4.0 | 370 | 0.5379 | 0.8480 | | 0.7983 | 4.99 | 462 | 0.4385 | 0.8207 | | 0.7692 | 6.0 | 555 | 0.5795 | 0.7842 | | 0.5158 | 6.99 | 647 | 0.4936 | 0.8207 | | 0.625 | 8.0 | 740 | 0.5316 | 0.8298 | | 0.511 | 8.99 | 832 | 0.5202 | 0.8845 | | 0.5025 | 10.0 | 925 | 0.5260 | 0.8784 | | 0.508 | 10.99 | 1017 | 0.5307 | 0.8632 | | 0.4652 | 12.0 | 1110 | 0.6060 | 0.8480 | | 0.4432 | 12.99 | 1202 | 0.5051 | 0.8845 | | 0.3373 | 14.0 | 1295 | 0.8695 | 0.8845 | | 0.3968 | 14.99 | 1387 | 0.6805 | 0.8571 | | 0.4268 | 16.0 | 1480 | 0.6541 | 0.8815 | | 0.3029 | 16.99 | 1572 | 0.5710 | 0.8906 | | 0.3801 | 18.0 | 1665 | 0.6499 | 0.8571 | | 0.3545 | 18.99 | 1757 | 0.6727 | 0.8419 | | 0.3526 | 20.0 | 1850 | 0.6542 | 0.8571 | | 0.3458 | 20.99 | 1942 | 
0.6625 | 0.8997 | | 0.3078 | 22.0 | 2035 | 0.6551 | 0.8784 | | 0.3677 | 22.99 | 2127 | 0.5953 | 0.8815 | | 0.3386 | 24.0 | 2220 | 0.6549 | 0.8693 | | 0.213 | 24.99 | 2312 | 0.5846 | 0.8997 | | 0.3778 | 26.0 | 2405 | 0.6746 | 0.8602 | | 0.3079 | 26.99 | 2497 | 0.6594 | 0.8997 | | 0.2943 | 28.0 | 2590 | 0.6246 | 0.8815 | | 0.2782 | 28.99 | 2682 | 0.6550 | 0.8906 | | 0.2931 | 29.84 | 2760 | 0.5844 | 0.8845 | ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu117 - Datasets 2.13.0 - Tokenizers 0.13.3
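The hyperparameters above combine `lr_scheduler_type: linear` with `lr_scheduler_warmup_ratio: 0.1`: the learning rate ramps linearly from 0 to the base rate over the first 10% of optimizer steps, then decays linearly to 0. A minimal sketch of that curve, using 2760 total steps (30 epochs × 92 optimizer steps per epoch, per the table); the Trainer's internal rounding may differ by a step.

```python
def linear_schedule_lr(step, base_lr, total_steps, warmup_ratio=0.1):
    """LR under linear warmup followed by linear decay to zero
    (the shape produced by lr_scheduler_type=linear with a warmup ratio)."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)  # warmup ramp
    # linear decay from base_lr at the end of warmup down to 0 at total_steps
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

base_lr, total_steps = 5e-4, 2760  # from the hyperparameters and table above
```

The peak rate of 5e-4 is therefore only reached around step 276, which is worth keeping in mind when reading the first few epochs of the loss column.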
[ "depthread", "depthunread", "widthread", "widthunread" ]
CharlesBointon/item_classifier
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. --> # CharlesBointon/item_classifier This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Train Loss: 0.8371 - Validation Loss: 0.7054 - Train Accuracy: 0.786 - Epoch: 4 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 20000, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01} - training_precision: float32 ### Training results | Train Loss | Validation Loss | Train Accuracy | Epoch | |:----------:|:---------------:|:--------------:|:-----:| | 1.5386 | 1.0983 | 0.74 | 0 | | 1.1263 | 0.8877 | 0.754 | 1 | | 0.9805 | 0.7840 | 0.77 | 2 | | 0.9209 | 0.7624 | 0.769 | 3 | | 0.8371 | 0.7054 | 0.786 | 4 | ### Framework versions - Transformers 4.32.0.dev0 - TensorFlow 2.9.1 - Datasets 2.14.3 - Tokenizers 0.12.1
[ "dresses", "pyjamas", "t-shirts", "swimsuits", "trousers", "shirts", "shoes", "socks" ]
elifm/swin-tiny-patch4-window7-224-finetuned-sar
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-sar This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0351 - Accuracy: 0.9880 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.3706 | 1.0 | 53 | 0.1639 | 0.9442 | | 0.3062 | 2.0 | 106 | 0.1337 | 0.9509 | | 0.264 | 3.0 | 159 | 0.0671 | 0.9748 | | 0.1861 | 4.0 | 212 | 0.0470 | 0.9854 | | 0.2131 | 5.0 | 265 | 0.0351 | 0.9880 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1 - Datasets 2.14.2 - Tokenizers 0.13.3
[ "alongside", "building", "road", "vegetation", "water" ]
jordyvl/vit-base_rvl_cdip_symce
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl_cdip_symce This model is a fine-tuned version of [jordyvl/vit-base_rvl-cdip](https://huggingface.co/jordyvl/vit-base_rvl-cdip) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.6253 - Accuracy: 0.8982 - Brier Loss: 0.1796 - Nll: 1.1468 - F1 Micro: 0.8982 - F1 Macro: 0.8984 - Ece: 0.0846 - Aurc: 0.0197 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 0.1665 | 1.0 | 2500 | 0.3898 | 0.8939 | 0.1621 | 1.1704 | 0.8939 | 0.8938 | 0.0463 | 0.0167 | | 0.1439 | 2.0 | 5000 | 0.3927 | 0.8949 | 0.1602 | 1.1860 | 0.8949 | 0.8954 | 0.0506 | 0.0165 | | 0.0889 | 3.0 | 7500 | 0.4389 | 0.8941 | 0.1684 | 1.1449 | 0.8941 | 0.8946 | 0.0637 | 0.0172 | | 0.0574 | 4.0 | 10000 | 0.4870 | 0.8953 | 0.1741 | 1.1605 | 0.8953 | 0.8952 | 0.0719 | 0.0179 | | 0.0372 | 5.0 | 12500 | 0.5259 | 0.8929 | 0.1792 | 1.1860 | 0.8929 | 0.8935 | 0.0775 | 0.0185 | | 0.0225 | 6.0 | 15000 | 0.5579 | 0.8959 | 0.1784 | 1.1504 | 0.8959 | 0.8963 | 0.0799 | 0.0196 | | 0.0126 | 7.0 | 17500 | 0.5905 | 0.8949 | 0.1811 | 1.1714 | 0.8949 | 
0.8950 | 0.0836 | 0.0197 | | 0.0081 | 8.0 | 20000 | 0.6011 | 0.8973 | 0.1791 | 1.1720 | 0.8973 | 0.8975 | 0.0828 | 0.0198 | | 0.0048 | 9.0 | 22500 | 0.6198 | 0.8975 | 0.1800 | 1.1518 | 0.8975 | 0.8977 | 0.0847 | 0.0198 | | 0.0038 | 10.0 | 25000 | 0.6253 | 0.8982 | 0.1796 | 1.1468 | 0.8982 | 0.8984 | 0.0846 | 0.0197 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
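Besides accuracy, this card tracks calibration via the Brier Loss and Ece columns. A minimal NumPy sketch of the usual definitions follows; the card's exact ECE binning (bin count, equal-width vs equal-mass) is not stated, so the 10 equal-width bins here are an assumption.

```python
import numpy as np

def brier_score(probs, labels, num_classes):
    """Multi-class Brier score: mean squared distance between the predicted
    distribution and the one-hot target."""
    onehot = np.eye(num_classes)[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE: per-bin |accuracy - mean confidence|, weighted by bin occupancy."""
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    correct = (pred == labels).astype(float)
    ece = 0.0
    for lo in np.linspace(0.0, 1.0, n_bins, endpoint=False):
        mask = (conf > lo) & (conf <= lo + 1.0 / n_bins)
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return float(ece)
```

Note how the table shows the trade-off this card is about: validation loss and ECE creep upward across epochs even as accuracy holds steady, the classic overconfidence pattern of late fine-tuning.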
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/vit-base_rvl_cdip_entropy2_softmax
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl_cdip_entropy2_softmax This model is a fine-tuned version of [jordyvl/vit-base_rvl-cdip](https://huggingface.co/jordyvl/vit-base_rvl-cdip) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.8809 - Accuracy: 0.8968 - Brier Loss: 0.1890 - Nll: 1.1526 - F1 Micro: 0.8968 - F1 Macro: 0.8969 - Ece: 0.0923 - Aurc: 0.0205 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 0.3547 | 1.0 | 2500 | 0.7036 | 0.8958 | 0.1806 | 0.9568 | 0.8958 | 0.8955 | 0.0815 | 0.0174 | | 0.3049 | 2.0 | 5000 | 0.7030 | 0.8972 | 0.1784 | 1.0077 | 0.8972 | 0.8975 | 0.0825 | 0.0168 | | 0.2103 | 3.0 | 7500 | 0.7465 | 0.8946 | 0.1857 | 1.0229 | 0.8946 | 0.8954 | 0.0883 | 0.0178 | | 0.1548 | 4.0 | 10000 | 0.7640 | 0.8957 | 0.1860 | 1.0530 | 0.8957 | 0.8960 | 0.0893 | 0.0182 | | 0.1077 | 5.0 | 12500 | 0.7964 | 0.8955 | 0.1877 | 1.0743 | 0.8955 | 0.8955 | 0.0903 | 0.0182 | | 0.0742 | 6.0 | 15000 | 0.8253 | 0.8959 | 0.1887 | 1.0996 | 0.8959 | 0.8967 | 0.0919 | 0.0202 | | 0.0495 | 7.0 | 17500 | 0.8505 | 0.8964 | 0.1884 | 1.1281 | 
0.8964 | 0.8963 | 0.0920 | 0.0201 | | 0.0352 | 8.0 | 20000 | 0.8645 | 0.8964 | 0.1895 | 1.1397 | 0.8964 | 0.8964 | 0.0931 | 0.0207 | | 0.0235 | 9.0 | 22500 | 0.8733 | 0.8984 | 0.1876 | 1.1365 | 0.8984 | 0.8986 | 0.0914 | 0.0204 | | 0.0176 | 10.0 | 25000 | 0.8809 | 0.8968 | 0.1890 | 1.1526 | 0.8968 | 0.8969 | 0.0923 | 0.0205 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
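The Nll column above is the standard negative log-likelihood: the average negative log-probability assigned to the true class. A minimal sketch of that definition, assuming the card uses the plain per-sample mean (any clipping or normalization in its metric script is not stated):

```python
import numpy as np

def nll(probs, labels, eps=1e-12):
    """Average negative log-probability of the true class."""
    p_true = probs[np.arange(len(labels)), labels]
    return float(-np.mean(np.log(np.clip(p_true, eps, 1.0))))
```

Unlike accuracy, NLL keeps penalizing a model for low confidence on correct answers, which is why it can rise across epochs (0.9568 → 1.1526 here) while accuracy barely moves.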
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
omarhkh/resnet-50-finetuned-omars6
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # resnet-50-finetuned-omars6 This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.4990 - Accuracy: 0.8328 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 30 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.3594 | 0.99 | 92 | 1.3630 | 0.5015 | | 1.3214 | 2.0 | 185 | 1.3252 | 0.5714 | | 1.2633 | 2.99 | 277 | 1.2851 | 0.6140 | | 1.2693 | 4.0 | 370 | 1.2385 | 0.6626 | | 1.1902 | 4.99 | 462 | 1.1837 | 0.6991 | | 1.1421 | 6.0 | 555 | 1.1255 | 0.7568 | | 1.1979 | 6.99 | 647 | 1.0094 | 0.8024 | | 0.9431 | 8.0 | 740 | 0.9544 | 0.8237 | | 0.9627 | 8.99 | 832 | 0.8864 | 0.8267 | | 0.8556 | 10.0 | 925 | 0.8365 | 0.8328 | | 0.7792 | 10.99 | 1017 | 0.7762 | 0.8359 | | 0.7941 | 12.0 | 1110 | 0.7467 | 0.8359 | | 0.8361 | 12.99 | 1202 | 0.7345 | 0.8237 | | 0.7757 | 14.0 | 1295 | 0.7228 | 0.8146 | | 0.6977 | 14.99 | 1387 | 0.6923 | 0.8267 | | 0.6874 | 16.0 | 1480 | 0.6540 | 0.8146 | | 0.6887 | 16.99 | 1572 | 0.6276 | 0.8298 | | 0.7204 | 18.0 | 1665 | 0.5989 | 0.8267 | | 0.8334 | 18.99 | 1757 | 0.6027 | 0.8237 | | 0.7654 | 20.0 | 1850 | 0.5699 | 0.8511 | | 0.7628 | 20.99 | 1942 | 
0.5465 | 0.8389 | | 0.7874 | 22.0 | 2035 | 0.5621 | 0.8298 | | 0.8149 | 22.99 | 2127 | 0.5474 | 0.8298 | | 0.7565 | 24.0 | 2220 | 0.5388 | 0.8480 | | 0.7241 | 24.99 | 2312 | 0.5351 | 0.8267 | | 0.7894 | 26.0 | 2405 | 0.5327 | 0.8389 | | 0.7664 | 26.99 | 2497 | 0.5065 | 0.8450 | | 0.6655 | 28.0 | 2590 | 0.5309 | 0.8359 | | 0.607 | 28.99 | 2682 | 0.5061 | 0.8541 | | 0.6462 | 29.84 | 2760 | 0.4990 | 0.8328 | ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu117 - Datasets 2.13.0 - Tokenizers 0.13.3
[ "depthread", "depthunread", "widthread", "widthunread" ]
omarhkh/swin-tiny-patch4-window7-224-finetuned-omars6
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-omars6 This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.5625 - Accuracy: 0.8815 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0005 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 30 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.9598 | 0.99 | 92 | 0.7744 | 0.6869 | | 0.7825 | 2.0 | 185 | 0.7336 | 0.7082 | | 0.9638 | 2.99 | 277 | 0.8202 | 0.7204 | | 1.0288 | 4.0 | 370 | 0.8621 | 0.7903 | | 0.9711 | 4.99 | 462 | 0.8212 | 0.6809 | | 1.0125 | 6.0 | 555 | 0.8700 | 0.7356 | | 0.945 | 6.99 | 647 | 0.7959 | 0.7781 | | 0.9851 | 8.0 | 740 | 0.8755 | 0.6140 | | 0.8078 | 8.99 | 832 | 0.6970 | 0.7781 | | 0.7377 | 10.0 | 925 | 0.6063 | 0.7386 | | 0.7934 | 10.99 | 1017 | 0.6121 | 0.8116 | | 0.7986 | 12.0 | 1110 | 0.6532 | 0.8116 | | 0.6129 | 12.99 | 1202 | 0.7250 | 0.8450 | | 0.7428 | 14.0 | 1295 | 0.6417 | 0.7264 | | 0.5661 | 14.99 | 1387 | 0.6847 | 0.7964 | | 0.6631 | 16.0 | 1480 | 0.5470 | 0.8298 | | 0.5787 | 16.99 | 1572 | 0.5696 | 0.8359 | | 0.6635 | 18.0 | 1665 | 0.6385 | 0.7872 | | 0.5251 | 18.99 | 1757 | 0.5842 | 0.8419 | | 0.6164 | 
20.0 | 1850 | 0.5506 | 0.8207 | | 0.4166 | 20.99 | 1942 | 0.8169 | 0.8055 | | 0.4189 | 22.0 | 2035 | 0.5882 | 0.8480 | | 0.699 | 22.99 | 2127 | 0.5767 | 0.8541 | | 0.6095 | 24.0 | 2220 | 0.6392 | 0.8845 | | 0.3837 | 24.99 | 2312 | 0.6109 | 0.8723 | | 0.4916 | 26.0 | 2405 | 0.4862 | 0.8754 | | 0.4536 | 26.99 | 2497 | 0.5625 | 0.8754 | | 0.3636 | 28.0 | 2590 | 0.5948 | 0.8663 | | 0.4004 | 28.99 | 2682 | 0.5735 | 0.8906 | | 0.4248 | 29.84 | 2760 | 0.5625 | 0.8815 | ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu117 - Datasets 2.13.0 - Tokenizers 0.13.3
[ "depthread", "depthunread", "widthread", "widthunread" ]
Andyrasika/vit-base-patch16-224-in21k-finetuned-lora-food101
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-in21k-finetuned-lora-food101 This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the food101 dataset. It achieves the following results on the evaluation set: - Loss: 0.5152 - Accuracy: 0.8560 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.005 - train_batch_size: 128 - eval_batch_size: 128 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 512 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.8353 | 1.0 | 133 | 0.6692 | 0.8168 | | 0.702 | 2.0 | 266 | 0.5892 | 0.8393 | | 0.6419 | 2.99 | 399 | 0.5615 | 0.8455 | | 0.5742 | 4.0 | 533 | 0.5297 | 0.8535 | | 0.4942 | 4.99 | 665 | 0.5152 | 0.8560 | ### Framework versions - PEFT 0.5.0.dev0 - Transformers 4.32.0.dev0 - Pytorch 2.0.0 - Datasets 2.1.0 - Tokenizers 0.13.3 [notebook](https://github.com/andysingal/CV_public/blob/main/Image-classification/notebooks/image_classification_peft_lora.ipynb)
[ "apple_pie", "baby_back_ribs", "baklava", "beef_carpaccio", "beef_tartare", "beet_salad", "beignets", "bibimbap", "bread_pudding", "breakfast_burrito", "bruschetta", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare", "waffles" ]
amiqinayat/swin-tiny-patch4-window7-224-finetuned
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.6899 - Accuracy: 0.7649 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.1453 | 1.0 | 68 | 0.9646 | 0.6969 | | 0.8615 | 1.99 | 136 | 0.7633 | 0.7340 | | 0.7551 | 2.99 | 204 | 0.6899 | 0.7649 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.2 - Tokenizers 0.13.3
[ "crack", "environment - ground", "environment - other", "environment - sky", "environment - vegetation", "joint defect", "loss of section", "spalling", "vegetation", "wall - grafitti", "wall - normal", "wall - other", "wall - stain" ]
platzi/platzi-vit-model-ruben-troche
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # platzi-vit-model-ruben-troche This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset. It achieves the following results on the evaluation set: - Loss: 0.0618 - Accuracy: 0.9850 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.1535 | 3.85 | 500 | 0.0618 | 0.9850 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.3 - Tokenizers 0.13.3
[ "angular_leaf_spot", "bean_rust", "healthy" ]
wuru330/378A1_results_coord
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 378A1_results_coord This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.4924 - Accuracy: 0.8946 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 30 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.2529 | 1.0 | 37 | 1.0701 | 0.6207 | | 0.6771 | 2.0 | 74 | 0.6678 | 0.7687 | | 0.4363 | 3.0 | 111 | 0.5622 | 0.8010 | | 0.2884 | 4.0 | 148 | 0.3808 | 0.8690 | | 0.2382 | 5.0 | 185 | 0.3492 | 0.8810 | | 0.1213 | 6.0 | 222 | 0.3485 | 0.8895 | | 0.1238 | 7.0 | 259 | 0.4012 | 0.8827 | | 0.0878 | 8.0 | 296 | 0.4311 | 0.8639 | | 0.0839 | 9.0 | 333 | 0.4417 | 0.8656 | | 0.0406 | 10.0 | 370 | 0.3993 | 0.8844 | | 0.0509 | 11.0 | 407 | 0.4922 | 0.8690 | | 0.0347 | 12.0 | 444 | 0.4840 | 0.8741 | | 0.033 | 13.0 | 481 | 0.4572 | 0.8827 | | 0.0222 | 14.0 | 518 | 0.4376 | 0.8861 | | 0.0197 | 15.0 | 555 | 0.4397 | 0.8912 | | 0.0179 | 16.0 | 592 | 0.4464 | 0.8946 | | 0.0167 | 17.0 | 629 | 0.4526 | 0.8946 | | 0.0154 | 18.0 | 666 | 0.4588 | 0.8929 | | 0.0148 | 19.0 | 703 | 0.4642 | 0.8929 | | 0.0135 | 20.0 | 740 | 0.4691 | 0.8929 | | 0.0131 | 21.0 | 777 | 
0.4732 | 0.8946 | | 0.0125 | 22.0 | 814 | 0.4776 | 0.8946 | | 0.0119 | 23.0 | 851 | 0.4809 | 0.8946 | | 0.0116 | 24.0 | 888 | 0.4841 | 0.8946 | | 0.0112 | 25.0 | 925 | 0.4863 | 0.8946 | | 0.0111 | 26.0 | 962 | 0.4885 | 0.8946 | | 0.0108 | 27.0 | 999 | 0.4903 | 0.8946 | | 0.0108 | 28.0 | 1036 | 0.4912 | 0.8946 | | 0.0105 | 29.0 | 1073 | 0.4921 | 0.8946 | | 0.0108 | 30.0 | 1110 | 0.4924 | 0.8946 | ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu117 - Datasets 2.13.1 - Tokenizers 0.13.3
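The step counts in the table follow directly from the hyperparameters above: with `gradient_accumulation_steps: 4` the effective batch is 64 (`total_train_batch_size`), giving the 37 optimizer steps per epoch and 1110 total steps shown. A sketch of that bookkeeping, assuming a training-set size of 2368 examples (any size in 2305–2368 yields 37 steps; the true size is not given in the card):

```python
import math

def training_step_counts(num_examples, per_device_batch, grad_accum, epochs, warmup_ratio):
    """Optimizer-step bookkeeping for the hyperparameters above (an
    illustration; the Trainer's own rounding may differ by a step)."""
    effective_batch = per_device_batch * grad_accum          # total_train_batch_size
    steps_per_epoch = math.ceil(num_examples / effective_batch)
    total_steps = steps_per_epoch * epochs
    warmup_steps = math.ceil(total_steps * warmup_ratio)     # lr_scheduler_warmup_ratio
    return effective_batch, steps_per_epoch, total_steps, warmup_steps
```

With `warmup_ratio: 0.1` this puts the end of LR warmup around step 111, i.e. inside epoch 3, which lines up with the sharp loss drop between epochs 3 and 4 in the table.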
[ "label_0", "label_1", "label_2", "label_3" ]
alandevkota/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-flower This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results ### Framework versions - Transformers 4.24.0 - Pytorch 2.0.1+cu118 - Datasets 2.7.1 - Tokenizers 0.13.3
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
rdmpage/autotrain-bwpages-start-only-79636141312
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 79636141312 - CO2 Emissions (in grams): 0.5096 ## Validation Metrics - Loss: 0.170 - Accuracy: 0.932 - Macro F1: 0.932 - Micro F1: 0.932 - Weighted F1: 0.931 - Macro Precision: 0.946 - Micro Precision: 0.932 - Weighted Precision: 0.936 - Macro Recall: 0.925 - Micro Recall: 0.932 - Weighted Recall: 0.932
[ "blank", "content", "end", "start" ]
maurope/vit_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit_model This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset. It achieves the following results on the evaluation set: - Loss: 0.0821 - Accuracy: 0.9774 ## Model description This model distinguishes between healthy and diseased bean leaves. It can also distinguish between two diseases: bean rust and angular leaf spot. Just upload a photo and the model will tell you the probability of these three categories. # Healthy ![Healthy](healty.jpg) # Bean Rust ![bean_rust](bean_rust.jpeg) # Angular Leaf Spot ![angular_leaf_spot](angular_leaf_spot.jpg) ## Intended uses & limitations It only classifies bean leaves. ## Training and evaluation data The model was trained with the beans dataset: https://huggingface.co/datasets/beans ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.1435 | 3.85 | 500 | 0.0821 | 0.9774 | ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.2 - Tokenizers 0.13.3
[ "angular_leaf_spot", "bean_rust", "healthy" ]
jordyvl/vit-base-patch16-224-in21k-tiny_rvl_cdip-NK1000_hint
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-in21k-tiny_rvl_cdip-NK1000_hint This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 75.3146 - Accuracy: 0.811 - Brier Loss: 0.3395 - Nll: 2.0856 - F1 Micro: 0.811 - F1 Macro: 0.8109 - Ece: 0.1625 - Aurc: 0.0586 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 78.9278 | 1.0 | 1000 | 78.3645 | 0.588 | 0.5476 | 2.4320 | 0.588 | 0.5825 | 0.0552 | 0.1928 | | 77.6109 | 2.0 | 2000 | 77.4909 | 0.684 | 0.4259 | 2.1686 | 0.684 | 0.6879 | 0.0496 | 0.1167 | | 77.1849 | 3.0 | 3000 | 77.1828 | 0.7185 | 0.3854 | 2.1032 | 0.7185 | 0.7200 | 0.0495 | 0.0932 | | 76.8526 | 4.0 | 4000 | 76.9800 | 0.748 | 0.3549 | 2.0716 | 0.748 | 0.7492 | 0.0768 | 0.0781 | | 76.5928 | 5.0 | 5000 | 76.7544 | 0.743 | 0.3576 | 2.0634 | 0.743 | 0.7461 | 0.0564 | 0.0846 | | 76.1507 | 6.0 | 6000 | 76.5850 | 0.7688 | 0.3354 | 2.0506 | 0.7688 | 0.7698 | 0.0857 | 0.0701 | | 75.8107 | 7.0 | 7000 | 76.5816 | 0.75 | 0.3766 | 2.1753 | 0.75 | 0.7542 | 0.1230 | 0.0815 | | 
75.868 | 8.0 | 8000 | 76.5048 | 0.785 | 0.3324 | 2.0865 | 0.785 | 0.7869 | 0.1244 | 0.0623 | | 75.6016 | 9.0 | 9000 | 76.3919 | 0.7827 | 0.3475 | 2.0858 | 0.7828 | 0.7849 | 0.1340 | 0.0657 | | 75.4883 | 10.0 | 10000 | 76.5121 | 0.7768 | 0.3644 | 2.1372 | 0.7768 | 0.7747 | 0.1550 | 0.0662 | | 75.2568 | 11.0 | 11000 | 76.4107 | 0.7857 | 0.3603 | 2.1229 | 0.7857 | 0.7863 | 0.1561 | 0.0619 | | 75.1623 | 12.0 | 12000 | 76.4517 | 0.771 | 0.3857 | 2.1118 | 0.771 | 0.7721 | 0.1681 | 0.0702 | | 75.0021 | 13.0 | 13000 | 76.3632 | 0.7885 | 0.3635 | 2.1178 | 0.7885 | 0.7870 | 0.1607 | 0.0621 | | 74.9056 | 14.0 | 14000 | 76.3074 | 0.7925 | 0.3533 | 2.1361 | 0.7925 | 0.7926 | 0.1626 | 0.0573 | | 74.9295 | 15.0 | 15000 | 76.3445 | 0.785 | 0.3730 | 2.0515 | 0.785 | 0.7861 | 0.1694 | 0.0661 | | 74.7288 | 16.0 | 16000 | 76.3441 | 0.7845 | 0.3776 | 2.1216 | 0.7845 | 0.7828 | 0.1731 | 0.0666 | | 74.5985 | 17.0 | 17000 | 76.1255 | 0.794 | 0.3593 | 2.0759 | 0.7940 | 0.7969 | 0.1640 | 0.0605 | | 74.471 | 18.0 | 18000 | 76.2140 | 0.7863 | 0.3721 | 2.1872 | 0.7863 | 0.7861 | 0.1705 | 0.0638 | | 74.4457 | 19.0 | 19000 | 76.1380 | 0.7925 | 0.3650 | 2.1106 | 0.7925 | 0.7940 | 0.1708 | 0.0634 | | 74.3675 | 20.0 | 20000 | 76.1423 | 0.7897 | 0.3684 | 2.0882 | 0.7897 | 0.7910 | 0.1731 | 0.0642 | | 74.3618 | 21.0 | 21000 | 76.0578 | 0.7987 | 0.3604 | 2.1007 | 0.7987 | 0.7982 | 0.1676 | 0.0622 | | 74.1398 | 22.0 | 22000 | 75.9928 | 0.7997 | 0.3578 | 2.0590 | 0.7997 | 0.8008 | 0.1672 | 0.0624 | | 74.0834 | 23.0 | 23000 | 75.8857 | 0.8013 | 0.3561 | 2.0986 | 0.8013 | 0.8010 | 0.1662 | 0.0602 | | 74.1467 | 24.0 | 24000 | 75.8767 | 0.8 | 0.3605 | 2.0794 | 0.8000 | 0.8014 | 0.1682 | 0.0608 | | 73.8823 | 25.0 | 25000 | 75.9471 | 0.799 | 0.3564 | 2.0934 | 0.799 | 0.7997 | 0.1684 | 0.0619 | | 73.9657 | 26.0 | 26000 | 75.8618 | 0.7987 | 0.3594 | 2.1020 | 0.7987 | 0.7991 | 0.1703 | 0.0599 | | 73.9721 | 27.0 | 27000 | 75.7331 | 0.8145 | 0.3347 | 2.0514 | 0.8145 | 0.8144 | 0.1569 | 0.0571 | | 73.8298 | 28.0 | 
28000 | 75.8175 | 0.8007 | 0.3582 | 2.0923 | 0.8007 | 0.7999 | 0.1714 | 0.0625 | | 73.8483 | 29.0 | 29000 | 75.7541 | 0.8023 | 0.3554 | 2.1075 | 0.8023 | 0.8002 | 0.1698 | 0.0603 | | 73.6726 | 30.0 | 30000 | 75.6642 | 0.8095 | 0.3454 | 2.0600 | 0.8095 | 0.8092 | 0.1638 | 0.0600 | | 73.7118 | 31.0 | 31000 | 75.5905 | 0.8105 | 0.3398 | 2.1354 | 0.8105 | 0.8106 | 0.1587 | 0.0595 | | 73.5938 | 32.0 | 32000 | 75.5721 | 0.8087 | 0.3429 | 2.0765 | 0.8087 | 0.8094 | 0.1640 | 0.0616 | | 73.5563 | 33.0 | 33000 | 75.7021 | 0.8085 | 0.3474 | 2.0825 | 0.8085 | 0.8092 | 0.1656 | 0.0633 | | 73.6469 | 34.0 | 34000 | 75.5322 | 0.8095 | 0.3406 | 2.0907 | 0.8095 | 0.8079 | 0.1632 | 0.0590 | | 73.4666 | 35.0 | 35000 | 75.4994 | 0.8105 | 0.3397 | 2.0839 | 0.8105 | 0.8102 | 0.1621 | 0.0590 | | 73.4144 | 36.0 | 36000 | 75.5095 | 0.8063 | 0.3476 | 2.1055 | 0.8062 | 0.8050 | 0.1666 | 0.0616 | | 73.2744 | 37.0 | 37000 | 75.4980 | 0.8117 | 0.3403 | 2.0693 | 0.8117 | 0.8123 | 0.1607 | 0.0569 | | 73.4358 | 38.0 | 38000 | 75.4824 | 0.809 | 0.3434 | 2.0996 | 0.809 | 0.8090 | 0.1645 | 0.0586 | | 73.2696 | 39.0 | 39000 | 75.5088 | 0.8085 | 0.3468 | 2.0697 | 0.8085 | 0.8089 | 0.1658 | 0.0589 | | 73.382 | 40.0 | 40000 | 75.4705 | 0.8095 | 0.3437 | 2.0738 | 0.8095 | 0.8104 | 0.1641 | 0.0621 | | 73.3006 | 41.0 | 41000 | 75.4697 | 0.809 | 0.3440 | 2.1203 | 0.809 | 0.8097 | 0.1624 | 0.0614 | | 73.4237 | 42.0 | 42000 | 75.3601 | 0.8093 | 0.3434 | 2.0736 | 0.8093 | 0.8094 | 0.1629 | 0.0575 | | 73.2571 | 43.0 | 43000 | 75.3364 | 0.8103 | 0.3398 | 2.0665 | 0.8103 | 0.8101 | 0.1630 | 0.0599 | | 73.2241 | 44.0 | 44000 | 75.3369 | 0.8135 | 0.3381 | 2.0609 | 0.8135 | 0.8136 | 0.1600 | 0.0581 | | 73.2271 | 45.0 | 45000 | 75.2917 | 0.814 | 0.3355 | 2.0906 | 0.8140 | 0.8142 | 0.1576 | 0.0582 | | 73.1427 | 46.0 | 46000 | 75.3108 | 0.8125 | 0.3377 | 2.0784 | 0.8125 | 0.8127 | 0.1613 | 0.0586 | | 73.2754 | 47.0 | 47000 | 75.3195 | 0.8127 | 0.3386 | 2.0860 | 0.8128 | 0.8127 | 0.1604 | 0.0586 | | 73.132 | 48.0 | 48000 
| 75.3168 | 0.812 | 0.3391 | 2.0853 | 0.8120 | 0.8118 | 0.1612 | 0.0582 | | 73.1482 | 49.0 | 49000 | 75.2943 | 0.8117 | 0.3395 | 2.0895 | 0.8117 | 0.8116 | 0.1615 | 0.0586 | | 73.1849 | 50.0 | 50000 | 75.3146 | 0.811 | 0.3395 | 2.0856 | 0.811 | 0.8109 | 0.1625 | 0.0586 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/dit-base-tiny_rvl_cdip-NK1000_hint
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-base-tiny_rvl_cdip-NK1000_hint This model is a fine-tuned version of [WinKawaks/vit-tiny-patch16-224](https://huggingface.co/WinKawaks/vit-tiny-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 2.5793 - Accuracy: 0.8185 - Brier Loss: 0.3293 - Nll: 2.0402 - F1 Micro: 0.8185 - F1 Macro: 0.8184 - Ece: 0.1589 - Aurc: 0.0584 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 2.7601 | 1.0 | 1000 | 2.6212 | 0.5952 | 0.5281 | 2.4218 | 0.5952 | 0.5899 | 0.0529 | 0.1801 | | 2.236 | 2.0 | 2000 | 2.1744 | 0.6847 | 0.4219 | 2.1650 | 0.6847 | 0.6827 | 0.0515 | 0.1135 | | 1.9525 | 3.0 | 3000 | 1.9941 | 0.7147 | 0.3889 | 2.0860 | 0.7147 | 0.7163 | 0.0616 | 0.0961 | | 1.729 | 4.0 | 4000 | 1.8880 | 0.7485 | 0.3585 | 2.1097 | 0.7485 | 0.7476 | 0.0649 | 0.0811 | | 1.6172 | 5.0 | 5000 | 1.8693 | 0.7455 | 0.3598 | 2.0514 | 0.7455 | 0.7448 | 0.0733 | 0.0811 | | 1.4385 | 6.0 | 6000 | 1.8350 | 0.747 | 0.3654 | 2.0917 | 0.747 | 0.7536 | 0.0828 | 0.0843 | | 1.3098 | 7.0 | 7000 | 1.8317 | 0.7625 | 0.3515 | 2.0367 | 0.7625 | 0.7668 | 0.1039 | 0.0794 | | 1.1854 | 8.0 | 8000 | 
1.8872 | 0.7675 | 0.3589 | 2.1085 | 0.7675 | 0.7661 | 0.1294 | 0.0694 | | 1.1066 | 9.0 | 9000 | 1.9843 | 0.7642 | 0.3707 | 2.1492 | 0.7642 | 0.7670 | 0.1457 | 0.0715 | | 1.0518 | 10.0 | 10000 | 1.9993 | 0.77 | 0.3660 | 2.0694 | 0.7700 | 0.7702 | 0.1440 | 0.0695 | | 0.9741 | 11.0 | 11000 | 2.2346 | 0.769 | 0.3870 | 2.0588 | 0.769 | 0.7704 | 0.1748 | 0.0735 | | 0.938 | 12.0 | 12000 | 2.2626 | 0.767 | 0.3918 | 2.1412 | 0.767 | 0.7679 | 0.1774 | 0.0723 | | 0.899 | 13.0 | 13000 | 2.3671 | 0.7698 | 0.3995 | 2.1064 | 0.7698 | 0.7725 | 0.1820 | 0.0708 | | 0.8708 | 14.0 | 14000 | 2.3386 | 0.7768 | 0.3838 | 2.1948 | 0.7768 | 0.7777 | 0.1724 | 0.0683 | | 0.8456 | 15.0 | 15000 | 2.3234 | 0.7827 | 0.3774 | 2.0937 | 0.7828 | 0.7830 | 0.1752 | 0.0638 | | 0.8021 | 16.0 | 16000 | 2.5700 | 0.7685 | 0.4092 | 2.1458 | 0.7685 | 0.7684 | 0.1911 | 0.0717 | | 0.7921 | 17.0 | 17000 | 2.5721 | 0.778 | 0.3934 | 2.1774 | 0.778 | 0.7756 | 0.1829 | 0.0708 | | 0.7779 | 18.0 | 18000 | 2.7204 | 0.772 | 0.4089 | 2.1145 | 0.772 | 0.7707 | 0.1926 | 0.0758 | | 0.7484 | 19.0 | 19000 | 2.7208 | 0.7752 | 0.4044 | 2.1020 | 0.7752 | 0.7760 | 0.1901 | 0.0746 | | 0.7469 | 20.0 | 20000 | 2.5898 | 0.7927 | 0.3711 | 2.2076 | 0.7927 | 0.7909 | 0.1763 | 0.0648 | | 0.7256 | 21.0 | 21000 | 2.5658 | 0.791 | 0.3727 | 2.0710 | 0.791 | 0.7920 | 0.1763 | 0.0650 | | 0.7137 | 22.0 | 22000 | 2.6782 | 0.7847 | 0.3851 | 2.1469 | 0.7847 | 0.7854 | 0.1824 | 0.0703 | | 0.6912 | 23.0 | 23000 | 2.5574 | 0.802 | 0.3539 | 2.1340 | 0.802 | 0.8009 | 0.1685 | 0.0615 | | 0.6894 | 24.0 | 24000 | 2.6331 | 0.7913 | 0.3760 | 2.0913 | 0.7913 | 0.7944 | 0.1807 | 0.0674 | | 0.6692 | 25.0 | 25000 | 2.6074 | 0.7955 | 0.3658 | 2.0837 | 0.7955 | 0.7975 | 0.1745 | 0.0645 | | 0.6541 | 26.0 | 26000 | 2.6059 | 0.7945 | 0.3672 | 2.0798 | 0.7945 | 0.7936 | 0.1751 | 0.0616 | | 0.6517 | 27.0 | 27000 | 2.7149 | 0.7965 | 0.3697 | 2.0842 | 0.7965 | 0.7987 | 0.1769 | 0.0663 | | 0.6484 | 28.0 | 28000 | 2.5700 | 0.8047 | 0.3542 | 2.0142 | 0.8047 | 0.8058 | 
0.1685 | 0.0597 | | 0.6342 | 29.0 | 29000 | 2.6774 | 0.7987 | 0.3660 | 2.1231 | 0.7987 | 0.7972 | 0.1759 | 0.0622 | | 0.6331 | 30.0 | 30000 | 2.6112 | 0.7973 | 0.3621 | 2.0740 | 0.7973 | 0.7981 | 0.1752 | 0.0621 | | 0.6204 | 31.0 | 31000 | 2.6470 | 0.807 | 0.3521 | 2.0337 | 0.807 | 0.8056 | 0.1683 | 0.0638 | | 0.612 | 32.0 | 32000 | 2.6265 | 0.8053 | 0.3549 | 2.0800 | 0.8053 | 0.8038 | 0.1678 | 0.0596 | | 0.6049 | 33.0 | 33000 | 2.5749 | 0.8107 | 0.3428 | 2.0235 | 0.8108 | 0.8114 | 0.1662 | 0.0576 | | 0.5984 | 34.0 | 34000 | 2.6667 | 0.804 | 0.3572 | 2.1015 | 0.804 | 0.8041 | 0.1726 | 0.0632 | | 0.5961 | 35.0 | 35000 | 2.6481 | 0.8067 | 0.3521 | 2.0652 | 0.8067 | 0.8068 | 0.1673 | 0.0610 | | 0.5958 | 36.0 | 36000 | 2.6831 | 0.8065 | 0.3523 | 2.0272 | 0.8065 | 0.8074 | 0.1681 | 0.0623 | | 0.5836 | 37.0 | 37000 | 2.6573 | 0.8067 | 0.3533 | 2.0149 | 0.8067 | 0.8050 | 0.1699 | 0.0624 | | 0.5855 | 38.0 | 38000 | 2.6730 | 0.8087 | 0.3510 | 2.0149 | 0.8087 | 0.8087 | 0.1693 | 0.0611 | | 0.581 | 39.0 | 39000 | 2.6464 | 0.815 | 0.3390 | 2.0609 | 0.815 | 0.8148 | 0.1628 | 0.0588 | | 0.5708 | 40.0 | 40000 | 2.7165 | 0.8077 | 0.3515 | 2.0489 | 0.8077 | 0.8078 | 0.1698 | 0.0644 | | 0.5727 | 41.0 | 41000 | 2.6264 | 0.8135 | 0.3402 | 2.0070 | 0.8135 | 0.8130 | 0.1643 | 0.0601 | | 0.5688 | 42.0 | 42000 | 2.6522 | 0.8077 | 0.3522 | 2.0064 | 0.8077 | 0.8076 | 0.1687 | 0.0615 | | 0.5648 | 43.0 | 43000 | 2.5806 | 0.8193 | 0.3295 | 2.0211 | 0.8193 | 0.8188 | 0.1593 | 0.0585 | | 0.5649 | 44.0 | 44000 | 2.6182 | 0.8125 | 0.3372 | 2.0292 | 0.8125 | 0.8122 | 0.1630 | 0.0598 | | 0.562 | 45.0 | 45000 | 2.6274 | 0.8157 | 0.3366 | 2.0047 | 0.8157 | 0.8158 | 0.1610 | 0.0577 | | 0.5592 | 46.0 | 46000 | 2.6069 | 0.8145 | 0.3370 | 2.0276 | 0.8145 | 0.8146 | 0.1632 | 0.0599 | | 0.5582 | 47.0 | 47000 | 2.5935 | 0.8187 | 0.3331 | 2.0291 | 0.8187 | 0.8193 | 0.1594 | 0.0588 | | 0.5565 | 48.0 | 48000 | 2.5780 | 0.8197 | 0.3302 | 2.0401 | 0.8197 | 0.8197 | 0.1580 | 0.0577 | | 0.5571 | 49.0 | 49000 | 
2.5781 | 0.8185 | 0.3294 | 2.0367 | 0.8185 | 0.8184 | 0.1590 | 0.0583 | | 0.5568 | 50.0 | 50000 | 2.5793 | 0.8185 | 0.3293 | 2.0402 | 0.8185 | 0.8184 | 0.1589 | 0.0584 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
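The Brier Loss column reported above is the multi-class Brier score: the mean squared error between the predicted probability vector and the one-hot true label. A minimal sketch of that metric (an illustration, not the exact evaluation code used to produce these numbers):

```python
from typing import List

def brier_loss(probs: List[List[float]], labels: List[int]) -> float:
    """Multi-class Brier score: mean over samples of the squared error
    between the predicted distribution and the one-hot true label."""
    total = 0.0
    for p, y in zip(probs, labels):
        onehot = [1.0 if c == y else 0.0 for c in range(len(p))]
        total += sum((pc - oc) ** 2 for pc, oc in zip(p, onehot))
    return total / len(labels)

# Confident correct predictions score near 0; confident wrong ones near 2.
print(brier_loss([[0.9, 0.1], [0.2, 0.8]], [0, 1]))  # ≈ 0.05
```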
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/dit-base-small_rvl_cdip-NK1000_hint
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-base-small_rvl_cdip-NK1000_hint This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 2.2813 - Accuracy: 0.841 - Brier Loss: 0.2948 - Nll: 1.9789 - F1 Micro: 0.841 - F1 Macro: 0.8409 - Ece: 0.1435 - Aurc: 0.0496 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 2.5147 | 1.0 | 1000 | 2.3332 | 0.659 | 0.4557 | 2.1792 | 0.659 | 0.6434 | 0.0525 | 0.1283 | | 2.0194 | 2.0 | 2000 | 1.9888 | 0.7225 | 0.3793 | 2.1426 | 0.7225 | 0.7275 | 0.0522 | 0.0910 | | 1.7429 | 3.0 | 3000 | 1.8097 | 0.759 | 0.3332 | 1.9995 | 0.7590 | 0.7599 | 0.0480 | 0.0713 | | 1.5665 | 4.0 | 4000 | 1.8189 | 0.7652 | 0.3417 | 2.0345 | 0.7652 | 0.7676 | 0.0920 | 0.0706 | | 1.4286 | 5.0 | 5000 | 1.7489 | 0.7588 | 0.3435 | 2.0233 | 0.7588 | 0.7639 | 0.0654 | 0.0745 | | 1.3258 | 6.0 | 6000 | 1.7223 | 0.7837 | 0.3212 | 1.9911 | 0.7837 | 0.7818 | 0.0939 | 0.0595 | | 1.1929 | 7.0 | 7000 | 1.6777 | 0.8015 | 0.3090 | 1.9764 | 0.8015 | 0.8045 | 0.0991 | 0.0554 | | 1.0707 | 8.0 | 8000 
| 1.8240 | 0.79 | 0.3365 | 1.9708 | 0.79 | 0.7907 | 0.1334 | 0.0588 | | 1.0054 | 9.0 | 9000 | 1.9355 | 0.7853 | 0.3482 | 2.0445 | 0.7853 | 0.7861 | 0.1404 | 0.0637 | | 0.924 | 10.0 | 10000 | 2.0125 | 0.799 | 0.3399 | 2.0334 | 0.799 | 0.7993 | 0.1471 | 0.0584 | | 0.9043 | 11.0 | 11000 | 1.9868 | 0.8075 | 0.3252 | 2.0171 | 0.8075 | 0.8078 | 0.1429 | 0.0526 | | 0.8442 | 12.0 | 12000 | 2.1626 | 0.8087 | 0.3339 | 2.0839 | 0.8087 | 0.8075 | 0.1523 | 0.0534 | | 0.8158 | 13.0 | 13000 | 2.1123 | 0.7903 | 0.3567 | 2.0193 | 0.7903 | 0.7922 | 0.1592 | 0.0580 | | 0.8087 | 14.0 | 14000 | 2.1296 | 0.814 | 0.3269 | 2.0417 | 0.8140 | 0.8152 | 0.1482 | 0.0533 | | 0.7924 | 15.0 | 15000 | 2.0768 | 0.816 | 0.3181 | 2.1244 | 0.816 | 0.8152 | 0.1435 | 0.0500 | | 0.7574 | 16.0 | 16000 | 2.3326 | 0.797 | 0.3596 | 2.1292 | 0.797 | 0.7960 | 0.1668 | 0.0590 | | 0.7501 | 17.0 | 17000 | 2.3190 | 0.8067 | 0.3475 | 2.0635 | 0.8067 | 0.8106 | 0.1617 | 0.0588 | | 0.7097 | 18.0 | 18000 | 2.3543 | 0.8067 | 0.3468 | 2.0987 | 0.8067 | 0.8063 | 0.1636 | 0.0561 | | 0.6939 | 19.0 | 19000 | 2.3151 | 0.8067 | 0.3405 | 2.0927 | 0.8067 | 0.8047 | 0.1617 | 0.0546 | | 0.6972 | 20.0 | 20000 | 2.3289 | 0.8087 | 0.3409 | 2.0291 | 0.8087 | 0.8113 | 0.1628 | 0.0550 | | 0.6869 | 21.0 | 21000 | 2.3878 | 0.797 | 0.3567 | 2.0736 | 0.797 | 0.7995 | 0.1727 | 0.0599 | | 0.6621 | 22.0 | 22000 | 2.3796 | 0.8137 | 0.3323 | 2.1080 | 0.8137 | 0.8138 | 0.1570 | 0.0553 | | 0.6496 | 23.0 | 23000 | 2.4264 | 0.8143 | 0.3342 | 2.0650 | 0.8143 | 0.8182 | 0.1598 | 0.0562 | | 0.6478 | 24.0 | 24000 | 2.3300 | 0.8167 | 0.3313 | 2.0509 | 0.8167 | 0.8170 | 0.1564 | 0.0552 | | 0.6184 | 25.0 | 25000 | 2.3143 | 0.8283 | 0.3178 | 1.9842 | 0.8283 | 0.8284 | 0.1498 | 0.0509 | | 0.6203 | 26.0 | 26000 | 2.3665 | 0.825 | 0.3181 | 2.0647 | 0.825 | 0.8246 | 0.1501 | 0.0530 | | 0.6151 | 27.0 | 27000 | 2.2673 | 0.8335 | 0.3042 | 1.9961 | 0.8335 | 0.8351 | 0.1431 | 0.0501 | | 0.6054 | 28.0 | 28000 | 2.2927 | 0.8283 | 0.3100 | 2.0229 | 0.8283 | 0.8281 | 
0.1490 | 0.0508 | | 0.5965 | 29.0 | 29000 | 2.4001 | 0.8223 | 0.3235 | 2.0393 | 0.8223 | 0.8266 | 0.1553 | 0.0536 | | 0.5869 | 30.0 | 30000 | 2.4167 | 0.821 | 0.3269 | 1.9890 | 0.821 | 0.8213 | 0.1576 | 0.0553 | | 0.5852 | 31.0 | 31000 | 2.4030 | 0.83 | 0.3120 | 2.0384 | 0.83 | 0.8304 | 0.1500 | 0.0539 | | 0.5741 | 32.0 | 32000 | 2.4800 | 0.8233 | 0.3259 | 2.1183 | 0.8233 | 0.8212 | 0.1561 | 0.0550 | | 0.5743 | 33.0 | 33000 | 2.4718 | 0.821 | 0.3269 | 2.0593 | 0.821 | 0.8191 | 0.1582 | 0.0578 | | 0.5694 | 34.0 | 34000 | 2.3638 | 0.8297 | 0.3135 | 2.0746 | 0.8297 | 0.8293 | 0.1484 | 0.0534 | | 0.5567 | 35.0 | 35000 | 2.3320 | 0.8313 | 0.3054 | 2.0498 | 0.8313 | 0.8311 | 0.1470 | 0.0524 | | 0.5547 | 36.0 | 36000 | 2.3902 | 0.8293 | 0.3138 | 2.0001 | 0.8293 | 0.8294 | 0.1506 | 0.0513 | | 0.55 | 37.0 | 37000 | 2.3812 | 0.822 | 0.3239 | 2.0567 | 0.822 | 0.8226 | 0.1560 | 0.0541 | | 0.5482 | 38.0 | 38000 | 2.3680 | 0.8333 | 0.3063 | 2.0447 | 0.8333 | 0.8332 | 0.1460 | 0.0496 | | 0.546 | 39.0 | 39000 | 2.3250 | 0.8323 | 0.3051 | 1.9994 | 0.8323 | 0.8315 | 0.1478 | 0.0504 | | 0.541 | 40.0 | 40000 | 2.3498 | 0.837 | 0.3003 | 2.0228 | 0.8370 | 0.8367 | 0.1456 | 0.0493 | | 0.5448 | 41.0 | 41000 | 2.3504 | 0.833 | 0.3053 | 1.9875 | 0.833 | 0.8323 | 0.1492 | 0.0532 | | 0.5365 | 42.0 | 42000 | 2.3421 | 0.8323 | 0.3077 | 2.0985 | 0.8323 | 0.8314 | 0.1477 | 0.0501 | | 0.5324 | 43.0 | 43000 | 2.2976 | 0.84 | 0.2929 | 1.9862 | 0.8400 | 0.8403 | 0.1418 | 0.0507 | | 0.5326 | 44.0 | 44000 | 2.3270 | 0.838 | 0.2993 | 2.0043 | 0.838 | 0.8384 | 0.1438 | 0.0521 | | 0.5303 | 45.0 | 45000 | 2.2919 | 0.839 | 0.2957 | 1.9625 | 0.839 | 0.8395 | 0.1430 | 0.0494 | | 0.5276 | 46.0 | 46000 | 2.2861 | 0.8397 | 0.2973 | 1.9382 | 0.8397 | 0.8402 | 0.1442 | 0.0512 | | 0.5279 | 47.0 | 47000 | 2.2930 | 0.8387 | 0.2962 | 1.9738 | 0.8387 | 0.8384 | 0.1456 | 0.0499 | | 0.5251 | 48.0 | 48000 | 2.2841 | 0.84 | 0.2950 | 1.9888 | 0.8400 | 0.8399 | 0.1441 | 0.0490 | | 0.5257 | 49.0 | 49000 | 2.2802 | 0.84 | 
0.2950 | 1.9827 | 0.8400 | 0.8400 | 0.1443 | 0.0491 | | 0.5232 | 50.0 | 50000 | 2.2813 | 0.841 | 0.2948 | 1.9789 | 0.841 | 0.8409 | 0.1435 | 0.0496 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
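The Ece column above is the expected calibration error: predictions are grouped into confidence buckets, and ECE is the weighted average gap between each bucket's accuracy and its mean confidence. A minimal equal-width-binning sketch (the exact binning behind these numbers is an assumption):

```python
from typing import List

def expected_calibration_error(confidences: List[float],
                               correct: List[int],
                               n_bins: int = 10) -> float:
    """ECE with equal-width confidence bins: sum over bins of
    (bin size / n) * |bin accuracy - bin mean confidence|."""
    n = len(confidences)
    ece = 0.0
    for b in range(n_bins):
        lo, hi = b / n_bins, (b + 1) / n_bins
        idx = [i for i, c in enumerate(confidences) if lo < c <= hi]
        if not idx:
            continue
        acc = sum(correct[i] for i in idx) / len(idx)
        conf = sum(confidences[i] for i in idx) / len(idx)
        ece += len(idx) / n * abs(acc - conf)
    return ece

# Two predictions at 0.95 confidence but only one correct: a gap of 0.45.
```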
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/vit-base-patch16-224-in21k-small_rvl_cdip-NK1000_hint
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-in21k-small_rvl_cdip-NK1000_hint This model is a fine-tuned version of [WinKawaks/vit-small-patch16-224](https://huggingface.co/WinKawaks/vit-small-patch16-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 75.2362 - Accuracy: 0.829 - Brier Loss: 0.3192 - Nll: 1.9987 - F1 Micro: 0.8290 - F1 Macro: 0.8288 - Ece: 0.1536 - Aurc: 0.0562 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 78.2197 | 1.0 | 1000 | 77.9227 | 0.6322 | 0.4884 | 2.2807 | 0.6322 | 0.6226 | 0.0652 | 0.1538 | | 77.1947 | 2.0 | 2000 | 77.1839 | 0.727 | 0.3752 | 2.1097 | 0.7270 | 0.7290 | 0.0480 | 0.0917 | | 76.83 | 3.0 | 3000 | 76.9771 | 0.7385 | 0.3662 | 2.0823 | 0.7385 | 0.7421 | 0.0612 | 0.0827 | | 76.6666 | 4.0 | 4000 | 76.7764 | 0.7725 | 0.3383 | 1.9757 | 0.7725 | 0.7763 | 0.0808 | 0.0697 | | 76.2784 | 5.0 | 5000 | 76.6028 | 0.773 | 0.3271 | 2.0074 | 0.7730 | 0.7737 | 0.0708 | 0.0668 | | 75.9985 | 6.0 | 6000 | 76.4455 | 0.7875 | 0.3138 | 1.9622 | 0.7875 | 0.7874 | 0.0868 | 0.0593 | | 75.7339 | 7.0 | 7000 | 76.2669 | 0.7973 | 0.3127 | 1.9661 | 0.7973 | 0.7978 | 0.1123 
| 0.0546 | | 75.504 | 8.0 | 8000 | 76.3972 | 0.7927 | 0.3373 | 2.0115 | 0.7927 | 0.7954 | 0.1318 | 0.0594 | | 75.3558 | 9.0 | 9000 | 76.4614 | 0.7785 | 0.3628 | 2.1225 | 0.7785 | 0.7793 | 0.1501 | 0.0648 | | 75.1485 | 10.0 | 10000 | 76.2131 | 0.795 | 0.3351 | 2.0123 | 0.795 | 0.7964 | 0.1467 | 0.0565 | | 75.1098 | 11.0 | 11000 | 76.3161 | 0.798 | 0.3452 | 2.0430 | 0.798 | 0.8002 | 0.1569 | 0.0587 | | 74.822 | 12.0 | 12000 | 76.2536 | 0.788 | 0.3637 | 2.0509 | 0.788 | 0.7877 | 0.1659 | 0.0607 | | 74.8787 | 13.0 | 13000 | 76.2025 | 0.7965 | 0.3493 | 2.0401 | 0.7965 | 0.7981 | 0.1538 | 0.0582 | | 74.7046 | 14.0 | 14000 | 76.1598 | 0.8075 | 0.3335 | 2.0388 | 0.8075 | 0.8058 | 0.1551 | 0.0525 | | 74.6157 | 15.0 | 15000 | 76.0894 | 0.8003 | 0.3496 | 1.9931 | 0.8003 | 0.8006 | 0.1615 | 0.0566 | | 74.6451 | 16.0 | 16000 | 76.0082 | 0.8065 | 0.3380 | 2.0005 | 0.8065 | 0.8060 | 0.1593 | 0.0530 | | 74.3042 | 17.0 | 17000 | 76.0281 | 0.8075 | 0.3398 | 2.0028 | 0.8075 | 0.8097 | 0.1592 | 0.0544 | | 74.3261 | 18.0 | 18000 | 75.9836 | 0.8063 | 0.3414 | 2.0447 | 0.8062 | 0.8066 | 0.1614 | 0.0566 | | 74.1196 | 19.0 | 19000 | 75.8935 | 0.8103 | 0.3347 | 2.0211 | 0.8103 | 0.8121 | 0.1592 | 0.0595 | | 74.2291 | 20.0 | 20000 | 75.9679 | 0.815 | 0.3329 | 2.0335 | 0.815 | 0.8141 | 0.1586 | 0.0572 | | 74.268 | 21.0 | 21000 | 76.0052 | 0.8073 | 0.3442 | 2.0847 | 0.8073 | 0.8067 | 0.1633 | 0.0589 | | 74.0436 | 22.0 | 22000 | 75.9529 | 0.8093 | 0.3454 | 2.1010 | 0.8093 | 0.8081 | 0.1625 | 0.0547 | | 73.9289 | 23.0 | 23000 | 75.8841 | 0.8103 | 0.3420 | 2.0569 | 0.8103 | 0.8104 | 0.1625 | 0.0580 | | 73.9519 | 24.0 | 24000 | 75.7295 | 0.8167 | 0.3320 | 2.0459 | 0.8167 | 0.8152 | 0.1575 | 0.0533 | | 73.9333 | 25.0 | 25000 | 75.6503 | 0.8165 | 0.3296 | 1.9681 | 0.8165 | 0.8174 | 0.1586 | 0.0523 | | 73.8239 | 26.0 | 26000 | 75.6156 | 0.8203 | 0.3245 | 2.0540 | 0.8203 | 0.8192 | 0.1546 | 0.0506 | | 73.7011 | 27.0 | 27000 | 75.7075 | 0.8183 | 0.3312 | 2.0996 | 0.8183 | 0.8193 | 0.1594 | 0.0562 | | 
73.4822 | 28.0 | 28000 | 75.5065 | 0.8247 | 0.3184 | 2.0404 | 0.8247 | 0.8254 | 0.1535 | 0.0548 | | 73.5787 | 29.0 | 29000 | 75.6063 | 0.8193 | 0.3295 | 2.0527 | 0.8193 | 0.8192 | 0.1591 | 0.0560 | | 73.519 | 30.0 | 30000 | 75.5828 | 0.8163 | 0.3351 | 2.0151 | 0.8163 | 0.8173 | 0.1621 | 0.0582 | | 73.6516 | 31.0 | 31000 | 75.4986 | 0.827 | 0.3147 | 2.0640 | 0.827 | 0.8272 | 0.1513 | 0.0539 | | 73.5156 | 32.0 | 32000 | 75.5884 | 0.8147 | 0.3355 | 2.0634 | 0.8148 | 0.8137 | 0.1631 | 0.0556 | | 73.4564 | 33.0 | 33000 | 75.3992 | 0.8233 | 0.3219 | 2.0498 | 0.8233 | 0.8227 | 0.1536 | 0.0526 | | 73.3286 | 34.0 | 34000 | 75.4277 | 0.8197 | 0.3256 | 2.0222 | 0.8197 | 0.8213 | 0.1594 | 0.0540 | | 73.3056 | 35.0 | 35000 | 75.3989 | 0.8285 | 0.3136 | 1.9681 | 0.8285 | 0.8303 | 0.1510 | 0.0566 | | 73.3272 | 36.0 | 36000 | 75.4398 | 0.8233 | 0.3247 | 2.0504 | 0.8233 | 0.8247 | 0.1583 | 0.0553 | | 73.2738 | 37.0 | 37000 | 75.3631 | 0.8207 | 0.3242 | 1.9921 | 0.8207 | 0.8211 | 0.1595 | 0.0546 | | 73.2657 | 38.0 | 38000 | 75.3613 | 0.8245 | 0.3231 | 2.0715 | 0.8245 | 0.8232 | 0.1569 | 0.0548 | | 73.2045 | 39.0 | 39000 | 75.3697 | 0.8223 | 0.3253 | 2.0207 | 0.8223 | 0.8213 | 0.1571 | 0.0557 | | 73.1701 | 40.0 | 40000 | 75.3138 | 0.8277 | 0.3174 | 2.0071 | 0.8277 | 0.8282 | 0.1525 | 0.0557 | | 73.1491 | 41.0 | 41000 | 75.3160 | 0.827 | 0.3183 | 2.0131 | 0.827 | 0.8271 | 0.1549 | 0.0573 | | 73.1466 | 42.0 | 42000 | 75.3052 | 0.8297 | 0.3166 | 1.9978 | 0.8297 | 0.8296 | 0.1535 | 0.0558 | | 73.0658 | 43.0 | 43000 | 75.3064 | 0.8293 | 0.3166 | 2.0293 | 0.8293 | 0.8292 | 0.1548 | 0.0570 | | 73.1394 | 44.0 | 44000 | 75.2527 | 0.8285 | 0.3179 | 2.0172 | 0.8285 | 0.8284 | 0.1540 | 0.0554 | | 73.2385 | 45.0 | 45000 | 75.2782 | 0.828 | 0.3181 | 2.0026 | 0.828 | 0.8280 | 0.1556 | 0.0570 | | 73.2207 | 46.0 | 46000 | 75.2624 | 0.827 | 0.3194 | 1.9884 | 0.827 | 0.8268 | 0.1552 | 0.0559 | | 73.2837 | 47.0 | 47000 | 75.2604 | 0.8285 | 0.3195 | 1.9982 | 0.8285 | 0.8283 | 0.1542 | 0.0566 | | 73.0848 
| 48.0 | 48000 | 75.2454 | 0.829 | 0.3188 | 1.9958 | 0.8290 | 0.8288 | 0.1535 | 0.0568 | | 73.111 | 49.0 | 49000 | 75.2438 | 0.8283 | 0.3196 | 2.0019 | 0.8283 | 0.8282 | 0.1542 | 0.0567 | | 73.0278 | 50.0 | 50000 | 75.2362 | 0.829 | 0.3192 | 1.9987 | 0.8290 | 0.8288 | 0.1536 | 0.0562 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
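The schedule implied by `lr_scheduler_type: linear` with `lr_scheduler_warmup_ratio: 0.1` ramps the learning rate up over the first 10% of training steps, then decays it linearly to zero. A minimal sketch of that shape (an illustration, not the Trainer's scheduler implementation):

```python
def linear_schedule_lr(step: int, total_steps: int, base_lr: float,
                       warmup_ratio: float = 0.1) -> float:
    """Linear warmup to base_lr, then linear decay to 0 over the rest."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# With 50 epochs x 1000 steps/epoch = 50000 steps, warmup spans the first 5000.
print(linear_schedule_lr(5000, 50000, 1e-4))   # peak: 0.0001
print(linear_schedule_lr(50000, 50000, 1e-4))  # end of training: 0.0
```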
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
daniejps10/platzi-vit-model-djps10
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# platzi-vit-model-djps10

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0022
- Accuracy: 1.0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0024        | 3.85  | 500  | 0.0022          | 1.0      |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.3
- Tokenizers 0.13.3
[ "angular_leaf_spot", "bean_rust", "healthy" ]
tommilyjones/resnet-50-finetuned-hateful-meme-restructured-lowerLR
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# resnet-50-finetuned-hateful-meme-restructured-lowerLR

This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6967
- Accuracy: 0.492

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-07
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7071        | 0.99  | 66   | 0.6967          | 0.492    |
| 0.7052        | 2.0   | 133  | 0.6969          | 0.484    |
| 0.7058        | 2.99  | 199  | 0.6961          | 0.484    |
| 0.7024        | 4.0   | 266  | 0.6953          | 0.47     |
| 0.7035        | 4.99  | 332  | 0.6962          | 0.488    |
| 0.7033        | 6.0   | 399  | 0.6962          | 0.488    |
| 0.7019        | 6.99  | 465  | 0.6958          | 0.472    |
| 0.7015        | 8.0   | 532  | 0.6962          | 0.472    |
| 0.7002        | 8.99  | 598  | 0.6958          | 0.472    |
| 0.7019        | 9.92  | 660  | 0.6961          | 0.474    |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
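With `train_batch_size: 32` and `gradient_accumulation_steps: 4`, gradients from four micro-batches are accumulated before each optimizer step, which is where the reported `total_train_batch_size: 128` comes from. A minimal sketch of that arithmetic:

```python
def total_train_batch_size(per_device_batch: int, grad_accum_steps: int,
                           n_devices: int = 1) -> int:
    """Effective number of examples consumed per optimizer step."""
    return per_device_batch * grad_accum_steps * n_devices

print(total_train_batch_size(32, 4))  # 128, matching the hyperparameters above
```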
[ "hate", "nohate" ]
tommilyjones/resnet-50-finetuned-hateful-meme-restructured-balanced
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# resnet-50-finetuned-hateful-meme-restructured-balanced

This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6946
- Accuracy: 0.522

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6941        | 0.98  | 47   | 0.6947          | 0.494    |
| 0.6906        | 1.99  | 95   | 0.6945          | 0.492    |
| 0.6885        | 2.99  | 143  | 0.6951          | 0.492    |
| 0.6873        | 4.0   | 191  | 0.6946          | 0.5      |
| 0.6851        | 4.98  | 238  | 0.6941          | 0.516    |
| 0.6813        | 5.99  | 286  | 0.6946          | 0.522    |
| 0.6817        | 6.99  | 334  | 0.6955          | 0.508    |
| 0.6849        | 8.0   | 382  | 0.6948          | 0.52     |
| 0.6834        | 8.98  | 429  | 0.6953          | 0.508    |
| 0.6758        | 9.84  | 470  | 0.6953          | 0.516    |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
[ "hate", "nohate" ]
tommilyjones/vit-base-patch16-224-finetuned-hateful-meme-restructured-balanced
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224-finetuned-hateful-meme-restructured-balanced

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7145
- Accuracy: 0.556

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7016        | 0.98  | 47   | 0.7243          | 0.512    |
| 0.6676        | 1.99  | 95   | 0.7139          | 0.544    |
| 0.626         | 2.99  | 143  | 0.7145          | 0.556    |
| 0.6042        | 4.0   | 191  | 0.7342          | 0.556    |
| 0.5672        | 4.98  | 238  | 0.7481          | 0.548    |
| 0.5339        | 5.99  | 286  | 0.7458          | 0.532    |
| 0.5266        | 6.99  | 334  | 0.7662          | 0.536    |
| 0.5102        | 8.0   | 382  | 0.7832          | 0.544    |
| 0.4808        | 8.98  | 429  | 0.7898          | 0.53     |
| 0.4698        | 9.84  | 470  | 0.7844          | 0.534    |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
[ "hate", "nohate" ]
carolinacalce/Mi_modelo_CatsDogs
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# Mi_modelo_CatsDogs

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 2

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.3
- Tokenizers 0.13.3
[ "cats", "dogs" ]
mikecamara/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224-finetuned-flower

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

### Framework versions

- Transformers 4.24.0
- Pytorch 2.0.1+cu118
- Datasets 2.7.1
- Tokenizers 0.13.3
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
rgarcia/my_awesome_food_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# my_awesome_food_model

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the food101 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5827
- Accuracy: 0.895

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.6833        | 0.99  | 62   | 2.4863          | 0.839    |
| 1.8076        | 2.0   | 125  | 1.7471          | 0.883    |
| 1.5823        | 2.98  | 186  | 1.5827          | 0.895    |

### Framework versions

- Transformers 4.32.0.dev0
- Pytorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
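Several cards above list both `gradient_accumulation_steps` and `total_train_batch_size` (e.g. per-device batch 16 with 4 accumulation steps giving 64). The relationship is simple multiplication; the sketch below is illustrative, with the function name and `num_devices` parameter being our own additions, not Trainer API:

```python
def total_train_batch_size(per_device_batch_size: int,
                           gradient_accumulation_steps: int,
                           num_devices: int = 1) -> int:
    """Examples seen per optimizer update: gradients are accumulated over
    `gradient_accumulation_steps` forward/backward passes before stepping."""
    return per_device_batch_size * gradient_accumulation_steps * num_devices

# Matches the food101 card above: 16 * 4 = 64
print(total_train_batch_size(16, 4))  # 64
```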
frncscp/patacoswin_v2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # patacoswin_v2 This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window16-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window16-256) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0328 - Accuracy: 0.9910 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.6055 | 0.95 | 13 | 0.2709 | 0.9615 | | 0.2812 | 1.96 | 27 | 0.0866 | 0.9683 | | 0.1426 | 2.98 | 41 | 0.0584 | 0.9796 | | 0.07 | 4.0 | 55 | 0.0268 | 0.9932 | | 0.0579 | 4.95 | 68 | 0.0451 | 0.9864 | | 0.091 | 5.96 | 82 | 0.0300 | 0.9887 | | 0.0247 | 6.98 | 96 | 0.0387 | 0.9864 | | 0.0323 | 8.0 | 110 | 0.0456 | 0.9887 | | 0.032 | 8.95 | 123 | 0.0475 | 0.9864 | | 0.0187 | 9.45 | 130 | 0.0328 | 0.9910 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.3 - Tokenizers 0.13.3
[ "patacon-false", "patacon-true" ]
jordyvl/vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5 This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset. It achieves the following results on the evaluation set: - Loss: 2.2802 - Accuracy: 0.5747 - Brier Loss: 0.6822 - Nll: 3.2886 - F1 Micro: 0.5747 - F1 Macro: 0.5757 - Ece: 0.2786 - Aurc: 0.2132 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 250 | 4.1512 | 0.1727 | 0.9045 | 5.5051 | 0.1727 | 0.0947 | 0.0704 | 0.7164 | | 4.2402 | 2.0 | 500 | 3.8933 | 0.216 | 0.8775 | 4.1816 | 0.216 | 0.1697 | 0.0699 | 0.6624 | | 4.2402 | 3.0 | 750 | 3.4256 | 0.3207 | 0.8113 | 3.6783 | 0.3207 | 0.2567 | 0.0645 | 0.5125 | | 3.5189 | 4.0 | 1000 | 3.1611 | 0.3673 | 0.7763 | 3.6447 | 0.3673 | 0.3039 | 0.0797 | 0.4450 | | 3.5189 | 5.0 | 1250 | 2.7791 | 0.4253 | 0.7216 | 3.1536 | 0.4253 | 0.3860 | 0.0982 | 0.3729 | | 2.7963 | 6.0 | 1500 | 2.6525 | 0.4323 | 0.7004 | 3.0187 | 0.4323 | 0.4117 | 0.0992 | 0.3440 | | 2.7963 | 7.0 | 1750 | 2.3623 | 0.5005 | 0.6489 | 2.8371 | 0.5005 | 0.4747 | 0.1076 | 0.2843 
| | 2.3741 | 8.0 | 2000 | 2.4259 | 0.4798 | 0.6704 | 2.9344 | 0.4798 | 0.4680 | 0.1164 | 0.3045 | | 2.3741 | 9.0 | 2250 | 2.3034 | 0.5005 | 0.6431 | 2.8598 | 0.5005 | 0.4892 | 0.1306 | 0.2683 | | 2.0855 | 10.0 | 2500 | 2.1550 | 0.5298 | 0.6264 | 2.6847 | 0.5298 | 0.5164 | 0.1413 | 0.2480 | | 2.0855 | 11.0 | 2750 | 2.0891 | 0.5455 | 0.6162 | 2.6978 | 0.5455 | 0.5330 | 0.1428 | 0.2343 | | 1.8265 | 12.0 | 3000 | 2.2045 | 0.5252 | 0.6627 | 2.7900 | 0.5252 | 0.5045 | 0.1997 | 0.2507 | | 1.8265 | 13.0 | 3250 | 2.0080 | 0.5597 | 0.5948 | 2.7128 | 0.5597 | 0.5564 | 0.1389 | 0.2145 | | 1.6099 | 14.0 | 3500 | 2.1966 | 0.5353 | 0.6594 | 2.8505 | 0.5353 | 0.5198 | 0.1984 | 0.2581 | | 1.6099 | 15.0 | 3750 | 2.0788 | 0.547 | 0.6191 | 2.7214 | 0.547 | 0.5419 | 0.1729 | 0.2294 | | 1.4149 | 16.0 | 4000 | 2.0634 | 0.5485 | 0.6235 | 2.7486 | 0.5485 | 0.5491 | 0.1872 | 0.2225 | | 1.4149 | 17.0 | 4250 | 2.0722 | 0.5597 | 0.6241 | 2.7989 | 0.5597 | 0.5574 | 0.1912 | 0.2189 | | 1.2282 | 18.0 | 4500 | 2.1226 | 0.557 | 0.6327 | 2.9138 | 0.557 | 0.5584 | 0.2016 | 0.2205 | | 1.2282 | 19.0 | 4750 | 2.1013 | 0.5577 | 0.6326 | 2.8846 | 0.5577 | 0.5574 | 0.2051 | 0.2200 | | 1.0543 | 20.0 | 5000 | 2.1902 | 0.5637 | 0.6519 | 2.9362 | 0.5637 | 0.5556 | 0.2261 | 0.2273 | | 1.0543 | 21.0 | 5250 | 2.2291 | 0.5603 | 0.6620 | 2.9256 | 0.5603 | 0.5532 | 0.2469 | 0.2350 | | 0.8882 | 22.0 | 5500 | 2.2152 | 0.5605 | 0.6613 | 3.0823 | 0.5605 | 0.5563 | 0.2397 | 0.2234 | | 0.8882 | 23.0 | 5750 | 2.2309 | 0.5617 | 0.6600 | 3.1164 | 0.5617 | 0.5571 | 0.2520 | 0.2252 | | 0.7308 | 24.0 | 6000 | 2.2332 | 0.5655 | 0.6631 | 3.1202 | 0.5655 | 0.5661 | 0.2502 | 0.2241 | | 0.7308 | 25.0 | 6250 | 2.3018 | 0.5663 | 0.6762 | 3.2623 | 0.5663 | 0.5652 | 0.2640 | 0.2265 | | 0.6001 | 26.0 | 6500 | 2.3505 | 0.5547 | 0.6923 | 3.3289 | 0.5547 | 0.5592 | 0.2790 | 0.2279 | | 0.6001 | 27.0 | 6750 | 2.3821 | 0.5555 | 0.6932 | 3.4374 | 0.5555 | 0.5538 | 0.2827 | 0.2275 | | 0.4912 | 28.0 | 7000 | 2.3788 | 0.5675 | 0.6915 | 3.3014 | 
0.5675 | 0.5637 | 0.2865 | 0.2324 | | 0.4912 | 29.0 | 7250 | 2.4068 | 0.556 | 0.7028 | 3.4904 | 0.556 | 0.5559 | 0.2906 | 0.2365 | | 0.4068 | 30.0 | 7500 | 2.4476 | 0.5557 | 0.7044 | 3.4350 | 0.5557 | 0.5572 | 0.2846 | 0.2387 | | 0.4068 | 31.0 | 7750 | 2.4179 | 0.562 | 0.7021 | 3.4782 | 0.562 | 0.5619 | 0.2911 | 0.2305 | | 0.3364 | 32.0 | 8000 | 2.3915 | 0.5615 | 0.6961 | 3.4704 | 0.5615 | 0.5623 | 0.2889 | 0.2294 | | 0.3364 | 33.0 | 8250 | 2.3860 | 0.568 | 0.6957 | 3.4578 | 0.568 | 0.5703 | 0.2869 | 0.2263 | | 0.2862 | 34.0 | 8500 | 2.4250 | 0.5647 | 0.7022 | 3.4923 | 0.5647 | 0.5638 | 0.2928 | 0.2282 | | 0.2862 | 35.0 | 8750 | 2.4453 | 0.5587 | 0.7106 | 3.6175 | 0.5587 | 0.5594 | 0.2970 | 0.2306 | | 0.2397 | 36.0 | 9000 | 2.3919 | 0.5653 | 0.6964 | 3.4399 | 0.5653 | 0.5675 | 0.2881 | 0.2197 | | 0.2397 | 37.0 | 9250 | 2.3870 | 0.5647 | 0.6995 | 3.4910 | 0.5647 | 0.5657 | 0.2941 | 0.2237 | | 0.2058 | 38.0 | 9500 | 2.4080 | 0.5663 | 0.7033 | 3.5314 | 0.5663 | 0.5673 | 0.2979 | 0.2271 | | 0.2058 | 39.0 | 9750 | 2.3727 | 0.5675 | 0.6975 | 3.3806 | 0.5675 | 0.5708 | 0.2930 | 0.2240 | | 0.1819 | 40.0 | 10000 | 2.3627 | 0.5745 | 0.6913 | 3.4237 | 0.5745 | 0.5751 | 0.2847 | 0.2217 | | 0.1819 | 41.0 | 10250 | 2.3497 | 0.564 | 0.6952 | 3.3908 | 0.564 | 0.5626 | 0.2931 | 0.2208 | | 0.1587 | 42.0 | 10500 | 2.3168 | 0.5705 | 0.6842 | 3.3858 | 0.5705 | 0.5725 | 0.2808 | 0.2181 | | 0.1587 | 43.0 | 10750 | 2.2910 | 0.5715 | 0.6768 | 3.3739 | 0.5715 | 0.5727 | 0.2777 | 0.2127 | | 0.1402 | 44.0 | 11000 | 2.3053 | 0.5713 | 0.6808 | 3.4128 | 0.5713 | 0.5724 | 0.2793 | 0.2133 | | 0.1402 | 45.0 | 11250 | 2.3029 | 0.5743 | 0.6848 | 3.3133 | 0.5743 | 0.5750 | 0.2771 | 0.2192 | | 0.1257 | 46.0 | 11500 | 2.2965 | 0.5695 | 0.6856 | 3.2338 | 0.5695 | 0.5697 | 0.2858 | 0.2158 | | 0.1257 | 47.0 | 11750 | 2.2823 | 0.5685 | 0.6847 | 3.2705 | 0.5685 | 0.5693 | 0.2828 | 0.2153 | | 0.1134 | 48.0 | 12000 | 2.2800 | 0.5753 | 0.6803 | 3.2797 | 0.5753 | 0.5759 | 0.2795 | 0.2139 | | 0.1134 | 49.0 | 
12250 | 2.2766 | 0.5733 | 0.6823 | 3.2828 | 0.5733 | 0.5751 | 0.2777 | 0.2135 | | 0.1039 | 50.0 | 12500 | 2.2802 | 0.5747 | 0.6822 | 3.2886 | 0.5747 | 0.5757 | 0.2786 | 0.2132 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
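The evaluation tables above report `Ece` (expected calibration error) alongside accuracy. As a reference for reading those columns, a common top-label ECE variant can be sketched in pure Python as below; this is illustrative and may differ in binning details from the exact implementation used to produce the tables:

```python
def expected_calibration_error(probs, labels, n_bins=10):
    """Top-label ECE: bin predictions by confidence, then average the
    per-bin |accuracy - confidence| gap, weighted by bin occupancy."""
    confidences = [max(p) for p in probs]
    predictions = [p.index(max(p)) for p in probs]
    correct = [float(pred == y) for pred, y in zip(predictions, labels)]
    n = len(probs)
    ece = 0.0
    for b in range(n_bins):
        lo, hi = b / n_bins, (b + 1) / n_bins
        idx = [i for i, c in enumerate(confidences) if lo < c <= hi]
        if idx:
            acc = sum(correct[i] for i in idx) / len(idx)
            conf = sum(confidences[i] for i in idx) / len(idx)
            ece += (len(idx) / n) * abs(acc - conf)
    return ece

# Two correct but under-confident predictions: gaps of 0.1 and 0.2,
# each weighted 0.5, so ECE is roughly 0.15.
print(expected_calibration_error([[0.9, 0.1], [0.2, 0.8]], [0, 1]))
```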
anmolgupta/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-flower This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results ### Framework versions - Transformers 4.24.0 - Pytorch 2.0.1+cu118 - Datasets 2.7.1 - Tokenizers 0.13.3
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
jordyvl/vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_kd_MSE
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_kd_MSE This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset. It achieves the following results on the evaluation set: - Loss: 2.1927 - Accuracy: 0.5835 - Brier Loss: 0.6740 - Nll: 3.1975 - F1 Micro: 0.5835 - F1 Macro: 0.5865 - Ece: 0.2742 - Aurc: 0.2074 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 250 | 4.2227 | 0.1325 | 0.9130 | 6.8924 | 0.1325 | 0.0728 | 0.0573 | 0.7519 | | 4.2305 | 2.0 | 500 | 3.9645 | 0.1638 | 0.8922 | 5.8361 | 0.1638 | 0.1235 | 0.0588 | 0.7012 | | 4.2305 | 3.0 | 750 | 3.6177 | 0.285 | 0.8227 | 4.3429 | 0.285 | 0.2289 | 0.0627 | 0.5424 | | 3.6208 | 4.0 | 1000 | 3.2220 | 0.3733 | 0.7617 | 3.5860 | 0.3733 | 0.3356 | 0.0606 | 0.4322 | | 3.6208 | 5.0 | 1250 | 3.0177 | 0.4045 | 0.7308 | 3.7807 | 0.4045 | 0.3770 | 0.0721 | 0.3835 | | 2.9674 | 6.0 | 1500 | 2.8203 | 0.4365 | 0.7032 | 3.3569 | 0.4365 | 0.4130 | 0.0969 | 0.3443 | | 2.9674 | 7.0 | 1750 | 2.6164 | 0.4557 | 0.6762 | 3.4281 | 0.4557 | 0.4413 | 0.0810 | 0.3058 | | 2.5154 
| 8.0 | 2000 | 2.4991 | 0.472 | 0.6651 | 3.3938 | 0.472 | 0.4524 | 0.1092 | 0.2846 | | 2.5154 | 9.0 | 2250 | 2.4375 | 0.4878 | 0.6826 | 3.1749 | 0.4878 | 0.4603 | 0.1631 | 0.2872 | | 2.2165 | 10.0 | 2500 | 2.3537 | 0.5018 | 0.6686 | 3.1767 | 0.5018 | 0.4855 | 0.1589 | 0.2743 | | 2.2165 | 11.0 | 2750 | 2.2613 | 0.515 | 0.6276 | 3.1281 | 0.515 | 0.5141 | 0.1101 | 0.2457 | | 1.9636 | 12.0 | 3000 | 2.2592 | 0.5242 | 0.6624 | 3.1164 | 0.5242 | 0.5131 | 0.1840 | 0.2515 | | 1.9636 | 13.0 | 3250 | 2.1751 | 0.5315 | 0.6190 | 3.2643 | 0.5315 | 0.5268 | 0.1349 | 0.2288 | | 1.7526 | 14.0 | 3500 | 2.2171 | 0.5248 | 0.6546 | 3.1179 | 0.5248 | 0.5162 | 0.1889 | 0.2537 | | 1.7526 | 15.0 | 3750 | 2.1185 | 0.5507 | 0.6126 | 3.1117 | 0.5507 | 0.5496 | 0.1578 | 0.2219 | | 1.5673 | 16.0 | 4000 | 2.0807 | 0.5537 | 0.6208 | 3.2624 | 0.5537 | 0.5459 | 0.1735 | 0.2151 | | 1.5673 | 17.0 | 4250 | 2.0743 | 0.5677 | 0.6095 | 3.2650 | 0.5677 | 0.5683 | 0.1628 | 0.2090 | | 1.3823 | 18.0 | 4500 | 2.1201 | 0.5605 | 0.6454 | 3.1499 | 0.5605 | 0.5558 | 0.2130 | 0.2316 | | 1.3823 | 19.0 | 4750 | 2.0835 | 0.5655 | 0.6312 | 3.2920 | 0.5655 | 0.5666 | 0.2015 | 0.2149 | | 1.2113 | 20.0 | 5000 | 2.0809 | 0.5675 | 0.6284 | 3.2923 | 0.5675 | 0.5675 | 0.2180 | 0.2047 | | 1.2113 | 21.0 | 5250 | 2.1507 | 0.5633 | 0.6608 | 3.2713 | 0.5633 | 0.5668 | 0.2380 | 0.2183 | | 1.0543 | 22.0 | 5500 | 2.1295 | 0.5683 | 0.6476 | 3.5120 | 0.5683 | 0.5672 | 0.2369 | 0.2105 | | 1.0543 | 23.0 | 5750 | 2.1610 | 0.5675 | 0.6564 | 3.3818 | 0.5675 | 0.5625 | 0.2393 | 0.2166 | | 0.9098 | 24.0 | 6000 | 2.0862 | 0.5735 | 0.6562 | 3.3228 | 0.5735 | 0.5782 | 0.2528 | 0.2047 | | 0.9098 | 25.0 | 6250 | 2.0680 | 0.5727 | 0.6439 | 3.2971 | 0.5727 | 0.5767 | 0.2357 | 0.2050 | | 0.7832 | 26.0 | 6500 | 2.1829 | 0.5763 | 0.6667 | 3.3547 | 0.5763 | 0.5792 | 0.2627 | 0.2084 | | 0.7832 | 27.0 | 6750 | 2.1163 | 0.586 | 0.6479 | 3.2468 | 0.586 | 0.5894 | 0.2509 | 0.2016 | | 0.6572 | 28.0 | 7000 | 2.1492 | 0.5715 | 0.6612 | 3.4268 | 0.5715 | 0.5780 
| 0.2642 | 0.2114 | | 0.6572 | 29.0 | 7250 | 2.1975 | 0.5723 | 0.6777 | 3.4662 | 0.5723 | 0.5739 | 0.2749 | 0.2202 | | 0.5632 | 30.0 | 7500 | 2.1733 | 0.5693 | 0.6767 | 3.3743 | 0.5693 | 0.5745 | 0.2737 | 0.2170 | | 0.5632 | 31.0 | 7750 | 2.1694 | 0.5807 | 0.6661 | 3.3917 | 0.5807 | 0.5814 | 0.2645 | 0.2193 | | 0.4827 | 32.0 | 8000 | 2.1585 | 0.5805 | 0.6671 | 3.3811 | 0.5805 | 0.5812 | 0.2692 | 0.2150 | | 0.4827 | 33.0 | 8250 | 2.1963 | 0.5767 | 0.6754 | 3.4575 | 0.5767 | 0.5835 | 0.2710 | 0.2160 | | 0.4134 | 34.0 | 8500 | 2.1720 | 0.581 | 0.6694 | 3.3663 | 0.581 | 0.5811 | 0.2672 | 0.2131 | | 0.4134 | 35.0 | 8750 | 2.1880 | 0.575 | 0.6759 | 3.4587 | 0.575 | 0.5790 | 0.2783 | 0.2105 | | 0.3541 | 36.0 | 9000 | 2.1482 | 0.581 | 0.6628 | 3.2956 | 0.581 | 0.5842 | 0.2712 | 0.2056 | | 0.3541 | 37.0 | 9250 | 2.1631 | 0.5885 | 0.6652 | 3.3217 | 0.5885 | 0.5915 | 0.2671 | 0.2069 | | 0.3078 | 38.0 | 9500 | 2.2036 | 0.577 | 0.6811 | 3.3564 | 0.577 | 0.5803 | 0.2849 | 0.2141 | | 0.3078 | 39.0 | 9750 | 2.1904 | 0.5753 | 0.6756 | 3.2783 | 0.5753 | 0.5765 | 0.2756 | 0.2135 | | 0.2671 | 40.0 | 10000 | 2.1774 | 0.5775 | 0.6685 | 3.3109 | 0.5775 | 0.5813 | 0.2700 | 0.2084 | | 0.2671 | 41.0 | 10250 | 2.1822 | 0.5807 | 0.6730 | 3.2139 | 0.5807 | 0.5842 | 0.2770 | 0.2100 | | 0.2331 | 42.0 | 10500 | 2.1673 | 0.5817 | 0.6705 | 3.2960 | 0.5817 | 0.5864 | 0.2757 | 0.2070 | | 0.2331 | 43.0 | 10750 | 2.1730 | 0.5765 | 0.6705 | 3.2195 | 0.5765 | 0.5807 | 0.2784 | 0.2072 | | 0.2038 | 44.0 | 11000 | 2.1709 | 0.585 | 0.6649 | 3.1928 | 0.585 | 0.5893 | 0.2627 | 0.2055 | | 0.2038 | 45.0 | 11250 | 2.1745 | 0.5783 | 0.6678 | 3.1900 | 0.5783 | 0.5811 | 0.2736 | 0.2061 | | 0.1792 | 46.0 | 11500 | 2.1824 | 0.5835 | 0.6682 | 3.1909 | 0.5835 | 0.5858 | 0.2719 | 0.2070 | | 0.1792 | 47.0 | 11750 | 2.1892 | 0.584 | 0.6716 | 3.2457 | 0.584 | 0.5864 | 0.2706 | 0.2082 | | 0.16 | 48.0 | 12000 | 2.1820 | 0.5835 | 0.6716 | 3.2011 | 0.5835 | 0.5857 | 0.2743 | 0.2073 | | 0.16 | 49.0 | 12250 | 2.1884 | 0.582 | 
0.6736 | 3.2114 | 0.582 | 0.5856 | 0.2755 | 0.2073 | | 0.1465 | 50.0 | 12500 | 2.1927 | 0.5835 | 0.6740 | 3.1975 | 0.5835 | 0.5865 | 0.2742 | 0.2074 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
jordyvl/vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_kd_NKD_t1.0_g1.5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_kd_NKD_t1.0_g1.5 This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset. It achieves the following results on the evaluation set: - Loss: 6.1672 - Accuracy: 0.5797 - Brier Loss: 0.6731 - Nll: 3.2703 - F1 Micro: 0.5797 - F1 Macro: 0.5841 - Ece: 0.2854 - Aurc: 0.1976 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 250 | 6.2188 | 0.1343 | 0.9122 | 5.7464 | 0.1343 | 0.0834 | 0.0536 | 0.7565 | | 6.3619 | 2.0 | 500 | 6.0878 | 0.1565 | 0.8959 | 5.2310 | 0.1565 | 0.1126 | 0.0670 | 0.7122 | | 6.3619 | 3.0 | 750 | 5.7358 | 0.2963 | 0.8276 | 3.6085 | 0.2963 | 0.2563 | 0.0948 | 0.5210 | | 5.9224 | 4.0 | 1000 | 5.5272 | 0.382 | 0.7742 | 3.2631 | 0.382 | 0.3481 | 0.1205 | 0.4212 | | 5.9224 | 5.0 | 1250 | 5.3271 | 0.4235 | 0.7257 | 3.1338 | 0.4235 | 0.4033 | 0.1172 | 0.3609 | | 5.4818 | 6.0 | 1500 | 5.2958 | 0.4343 | 0.7063 | 3.0800 | 0.4343 | 0.4119 | 0.0915 | 0.3431 | | 5.4818 | 7.0 | 1750 | 5.1042 | 0.4865 | 0.6655 | 2.9165 | 0.4865 | 0.4753 | 0.1281 | 0.2790 
| | 5.1995 | 8.0 | 2000 | 5.0990 | 0.4868 | 0.6566 | 2.9361 | 0.4868 | 0.4782 | 0.1000 | 0.2775 | | 5.1995 | 9.0 | 2250 | 4.9973 | 0.5008 | 0.6235 | 2.7450 | 0.5008 | 0.4878 | 0.0901 | 0.2533 | | 5.0048 | 10.0 | 2500 | 4.9471 | 0.516 | 0.6182 | 2.7522 | 0.516 | 0.5141 | 0.0855 | 0.2455 | | 5.0048 | 11.0 | 2750 | 4.9331 | 0.5225 | 0.6072 | 2.7517 | 0.5225 | 0.5198 | 0.0724 | 0.2397 | | 4.8157 | 12.0 | 3000 | 4.9154 | 0.5343 | 0.5948 | 2.8289 | 0.5343 | 0.5274 | 0.0614 | 0.2331 | | 4.8157 | 13.0 | 3250 | 4.9063 | 0.5252 | 0.5985 | 2.8356 | 0.5252 | 0.5193 | 0.0565 | 0.2343 | | 4.6678 | 14.0 | 3500 | 4.9772 | 0.536 | 0.5988 | 2.8902 | 0.536 | 0.5233 | 0.0580 | 0.2359 | | 4.6678 | 15.0 | 3750 | 4.8401 | 0.5517 | 0.5759 | 2.7486 | 0.5517 | 0.5526 | 0.0618 | 0.2150 | | 4.5289 | 16.0 | 4000 | 4.8798 | 0.5617 | 0.5704 | 2.7557 | 0.5617 | 0.5581 | 0.0618 | 0.2134 | | 4.5289 | 17.0 | 4250 | 4.8518 | 0.5527 | 0.5710 | 2.8619 | 0.5527 | 0.5556 | 0.0451 | 0.2103 | | 4.3805 | 18.0 | 4500 | 4.8751 | 0.5623 | 0.5696 | 2.7950 | 0.5623 | 0.5607 | 0.0577 | 0.2081 | | 4.3805 | 19.0 | 4750 | 4.9057 | 0.5593 | 0.5767 | 2.9991 | 0.5593 | 0.5611 | 0.0608 | 0.2145 | | 4.2463 | 20.0 | 5000 | 4.9515 | 0.5595 | 0.5730 | 2.9144 | 0.5595 | 0.5578 | 0.0792 | 0.2119 | | 4.2463 | 21.0 | 5250 | 4.9867 | 0.5625 | 0.5742 | 2.8184 | 0.5625 | 0.5635 | 0.0896 | 0.2121 | | 4.1211 | 22.0 | 5500 | 4.9772 | 0.5683 | 0.5703 | 3.0845 | 0.5683 | 0.5682 | 0.0771 | 0.2050 | | 4.1211 | 23.0 | 5750 | 4.9923 | 0.5667 | 0.5767 | 3.0160 | 0.5667 | 0.5699 | 0.1001 | 0.2041 | | 3.9862 | 24.0 | 6000 | 5.0275 | 0.5687 | 0.5772 | 3.0111 | 0.5687 | 0.5705 | 0.1119 | 0.2012 | | 3.9862 | 25.0 | 6250 | 5.1046 | 0.5607 | 0.5890 | 3.2599 | 0.5607 | 0.5623 | 0.1284 | 0.2060 | | 3.8573 | 26.0 | 6500 | 5.1868 | 0.5607 | 0.6002 | 3.1568 | 0.5607 | 0.5669 | 0.1427 | 0.2085 | | 3.8573 | 27.0 | 6750 | 5.1975 | 0.569 | 0.5962 | 3.1893 | 0.569 | 0.5729 | 0.1442 | 0.2037 | | 3.7598 | 28.0 | 7000 | 5.2735 | 0.561 | 0.6090 | 3.3290 | 0.561 
| 0.5674 | 0.1608 | 0.2087 | | 3.7598 | 29.0 | 7250 | 5.2898 | 0.5695 | 0.6063 | 3.2247 | 0.5695 | 0.5719 | 0.1744 | 0.2025 | | 3.6544 | 30.0 | 7500 | 5.3092 | 0.566 | 0.6142 | 3.2588 | 0.566 | 0.5725 | 0.1776 | 0.2064 | | 3.6544 | 31.0 | 7750 | 5.4251 | 0.564 | 0.6214 | 3.2408 | 0.564 | 0.5641 | 0.1938 | 0.2066 | | 3.5698 | 32.0 | 8000 | 5.4274 | 0.573 | 0.6217 | 3.3516 | 0.573 | 0.5780 | 0.1959 | 0.2036 | | 3.5698 | 33.0 | 8250 | 5.4650 | 0.5665 | 0.6301 | 3.3685 | 0.5665 | 0.5765 | 0.2088 | 0.2054 | | 3.4966 | 34.0 | 8500 | 5.4854 | 0.5733 | 0.6250 | 3.2985 | 0.5733 | 0.5754 | 0.2079 | 0.2027 | | 3.4966 | 35.0 | 8750 | 5.5474 | 0.5837 | 0.6261 | 3.2816 | 0.5837 | 0.5860 | 0.2134 | 0.1990 | | 3.4285 | 36.0 | 9000 | 5.5979 | 0.5725 | 0.6371 | 3.3105 | 0.5725 | 0.5763 | 0.2248 | 0.2023 | | 3.4285 | 37.0 | 9250 | 5.7002 | 0.576 | 0.6452 | 3.2637 | 0.576 | 0.5771 | 0.2396 | 0.2034 | | 3.377 | 38.0 | 9500 | 5.6932 | 0.5777 | 0.6448 | 3.3403 | 0.5777 | 0.5825 | 0.2362 | 0.2023 | | 3.377 | 39.0 | 9750 | 5.7180 | 0.5795 | 0.6409 | 3.2664 | 0.5795 | 0.5848 | 0.2382 | 0.1990 | | 3.3344 | 40.0 | 10000 | 5.7943 | 0.5765 | 0.6502 | 3.4052 | 0.5765 | 0.5810 | 0.2524 | 0.2001 | | 3.3344 | 41.0 | 10250 | 5.8347 | 0.5737 | 0.6562 | 3.3472 | 0.5737 | 0.5793 | 0.2555 | 0.2006 | | 3.2925 | 42.0 | 10500 | 5.9010 | 0.5835 | 0.6529 | 3.2352 | 0.5835 | 0.5867 | 0.2563 | 0.1987 | | 3.2925 | 43.0 | 10750 | 5.9119 | 0.5787 | 0.6550 | 3.2640 | 0.5787 | 0.5829 | 0.2611 | 0.1976 | | 3.2573 | 44.0 | 11000 | 5.9355 | 0.5765 | 0.6609 | 3.2903 | 0.5765 | 0.5811 | 0.2620 | 0.2004 | | 3.2573 | 45.0 | 11250 | 6.0046 | 0.58 | 0.6643 | 3.2248 | 0.58 | 0.5843 | 0.2691 | 0.1992 | | 3.2269 | 46.0 | 11500 | 6.0610 | 0.5847 | 0.6659 | 3.2719 | 0.5847 | 0.5888 | 0.2705 | 0.1974 | | 3.2269 | 47.0 | 11750 | 6.0938 | 0.5787 | 0.6718 | 3.2559 | 0.5787 | 0.5840 | 0.2801 | 0.1989 | | 3.2025 | 48.0 | 12000 | 6.1306 | 0.5787 | 0.6711 | 3.2546 | 0.5787 | 0.5823 | 0.2830 | 0.1974 | | 3.2025 | 49.0 | 12250 | 6.1521 | 
0.5823 | 0.6725 | 3.2590 | 0.5823 | 0.5867 | 0.2822 | 0.1976 | | 3.1849 | 50.0 | 12500 | 6.1672 | 0.5797 | 0.6731 | 3.2703 | 0.5797 | 0.5841 | 0.2854 | 0.1976 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
carolinacalce/MiModeloCatsDogs
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # MiModeloCatsDogs This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "cats", "dogs" ]
aksrad/autotrain-brain-80728141573
# Model Trained Using AutoTrain - Problem type: Binary Classification - Model ID: 80728141573 - CO2 Emissions (in grams): 0.4577 ## Validation Metrics - Loss: 0.054 - Accuracy: 0.979 - Precision: 1.000 - Recall: 0.958 - AUC: 1.000 - F1: 0.979
[ "abnormal", "normal" ]
jordyvl/vit-base_rvl-cdip_r2_32
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip_r2_32 This model is a fine-tuned version of [jordyvl/vit-base_rvl-cdip](https://huggingface.co/jordyvl/vit-base_rvl-cdip) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.6372 - Accuracy: 0.8985 - Brier Loss: 0.1792 - Nll: 1.1736 - F1 Micro: 0.8985 - F1 Macro: 0.8987 - Ece: 0.0847 - Aurc: 0.0201 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 96 - eval_batch_size: 96 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 0.1647 | 1.0 | 3334 | 0.4024 | 0.8887 | 0.1682 | 1.2086 | 0.8887 | 0.8891 | 0.0457 | 0.0178 | | 0.1418 | 2.0 | 6668 | 0.4075 | 0.8941 | 0.1646 | 1.2066 | 0.8941 | 0.8942 | 0.0522 | 0.0177 | | 0.0989 | 3.0 | 10002 | 0.4409 | 0.8932 | 0.1690 | 1.1966 | 0.8932 | 0.8932 | 0.0647 | 0.0175 | | 0.0614 | 4.0 | 13336 | 0.4781 | 0.8944 | 0.1730 | 1.2083 | 0.8944 | 0.8951 | 0.0694 | 0.0181 | | 0.0392 | 5.0 | 16670 | 0.5329 | 0.8959 | 0.1761 | 1.1777 | 0.8959 | 0.8958 | 0.0776 | 0.0187 | | 0.0231 | 6.0 | 20004 | 0.5714 | 0.8957 | 0.1799 | 1.2083 | 0.8957 | 0.8958 | 0.0813 | 0.0198 | | 0.0126 | 7.0 | 23338 | 0.6002 | 0.8966 | 0.1802 | 1.1732 | 0.8966 | 0.8972 | 0.0839 | 0.0197 | | 0.0079 | 8.0 | 26672 | 0.6193 | 
0.8984 | 0.1789 | 1.1849 | 0.8984 | 0.8985 | 0.0833 | 0.0200 | | 0.0049 | 9.0 | 30006 | 0.6333 | 0.8976 | 0.1798 | 1.1906 | 0.8976 | 0.8978 | 0.0851 | 0.0205 | | 0.0034 | 10.0 | 33340 | 0.6372 | 0.8985 | 0.1792 | 1.1736 | 0.8985 | 0.8987 | 0.0847 | 0.0201 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
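The tables above also track `Nll`. For reference, the average negative log-likelihood of the true class can be sketched as follows; this is an illustrative definition and may differ (e.g. in base or reduction) from the exact metric code behind the tables:

```python
import math

def negative_log_likelihood(probs, labels):
    """Average -log p(true class); lower means the model assigns more
    probability mass to the correct labels."""
    return -sum(math.log(p[y]) for p, y in zip(probs, labels)) / len(labels)

print(negative_log_likelihood([[1.0, 0.0]], [0]))  # 0.0: full confidence, correct
```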
himanshusrivastava/finetuned-indian-food-images
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # finetuned-indian-food-images This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the indian_food_images dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.3 - Tokenizers 0.13.3
[ "burger", "butter_naan", "kaathi_rolls", "kadai_paneer", "kulfi", "masala_dosa", "momos", "paani_puri", "pakode", "pav_bhaji", "pizza", "samosa", "chai", "chapati", "chole_bhature", "dal_makhani", "dhokla", "fried_rice", "idli", "jalebi" ]
jordyvl/vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_hint_rand
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_hint_rand This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset. It achieves the following results on the evaluation set: - Loss: 75.5808 - Accuracy: 0.583 - Brier Loss: 0.7311 - Nll: 3.9633 - F1 Micro: 0.583 - F1 Macro: 0.5838 - Ece: 0.3399 - Aurc: 0.2128 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | No log | 1.0 | 250 | 78.0119 | 0.1285 | 0.9098 | 6.7342 | 0.1285 | 0.0748 | 0.0496 | 0.7634 | | 77.7969 | 2.0 | 500 | 77.3633 | 0.1595 | 0.8985 | 5.2942 | 0.1595 | 0.1038 | 0.0509 | 0.7216 | | 77.7969 | 3.0 | 750 | 76.6773 | 0.2545 | 0.8551 | 3.9015 | 0.2545 | 0.2006 | 0.0741 | 0.5967 | | 76.735 | 4.0 | 1000 | 76.1721 | 0.312 | 0.8123 | 3.4141 | 0.312 | 0.2785 | 0.0855 | 0.5018 | | 76.735 | 5.0 | 1250 | 76.0027 | 0.3703 | 0.7573 | 3.2539 | 0.3703 | 0.3299 | 0.0764 | 0.4161 | | 75.8262 | 6.0 | 1500 | 76.3256 | 0.4143 | 0.7290 | 3.1129 | 0.4143 | 0.3995 | 0.0835 | 0.3792 | | 75.8262 | 7.0 | 1750 | 75.5753 | 0.4575 | 0.6838 | 2.8940 | 0.4575 | 0.4421 | 0.0595 | 
0.3262 | | 75.3656 | 8.0 | 2000 | 75.2875 | 0.475 | 0.6554 | 2.7996 | 0.4750 | 0.4596 | 0.0715 | 0.2976 | | 75.3656 | 9.0 | 2250 | 75.3849 | 0.4833 | 0.6446 | 2.7232 | 0.4833 | 0.4523 | 0.0651 | 0.2885 | | 75.0748 | 10.0 | 2500 | 75.3431 | 0.5172 | 0.6173 | 2.6664 | 0.5172 | 0.4905 | 0.0563 | 0.2606 | | 75.0748 | 11.0 | 2750 | 75.0478 | 0.5357 | 0.5982 | 2.7014 | 0.5357 | 0.5207 | 0.0550 | 0.2384 | | 74.821 | 12.0 | 3000 | 75.1324 | 0.5325 | 0.5973 | 2.6161 | 0.5325 | 0.5202 | 0.0569 | 0.2402 | | 74.821 | 13.0 | 3250 | 75.0049 | 0.528 | 0.5996 | 2.6859 | 0.528 | 0.5157 | 0.0657 | 0.2408 | | 74.613 | 14.0 | 3500 | 74.8702 | 0.5453 | 0.5881 | 2.7150 | 0.5453 | 0.5455 | 0.0661 | 0.2302 | | 74.613 | 15.0 | 3750 | 74.8427 | 0.5595 | 0.5697 | 2.5605 | 0.5595 | 0.5479 | 0.0765 | 0.2117 | | 74.421 | 16.0 | 4000 | 74.9157 | 0.5503 | 0.5829 | 2.7215 | 0.5503 | 0.5524 | 0.0765 | 0.2219 | | 74.421 | 17.0 | 4250 | 74.9051 | 0.5633 | 0.5816 | 2.6715 | 0.5633 | 0.5577 | 0.0924 | 0.2186 | | 74.2453 | 18.0 | 4500 | 74.9910 | 0.5733 | 0.5722 | 2.6963 | 0.5733 | 0.5717 | 0.0930 | 0.2107 | | 74.2453 | 19.0 | 4750 | 74.8632 | 0.5575 | 0.5892 | 2.6981 | 0.5575 | 0.5549 | 0.1073 | 0.2198 | | 74.0712 | 20.0 | 5000 | 74.8128 | 0.5757 | 0.5794 | 2.7227 | 0.5757 | 0.5697 | 0.1235 | 0.2083 | | 74.0712 | 21.0 | 5250 | 74.7545 | 0.575 | 0.5794 | 2.7000 | 0.575 | 0.5700 | 0.1372 | 0.2015 | | 73.9033 | 22.0 | 5500 | 74.7493 | 0.5737 | 0.5841 | 2.7996 | 0.5737 | 0.5806 | 0.1341 | 0.2073 | | 73.9033 | 23.0 | 5750 | 74.7641 | 0.582 | 0.5831 | 2.7846 | 0.582 | 0.5780 | 0.1576 | 0.1985 | | 73.7364 | 24.0 | 6000 | 74.8125 | 0.5807 | 0.5944 | 2.8725 | 0.5807 | 0.5767 | 0.1719 | 0.2015 | | 73.7364 | 25.0 | 6250 | 74.9721 | 0.573 | 0.6132 | 2.9232 | 0.573 | 0.5734 | 0.1920 | 0.2086 | | 73.5899 | 26.0 | 6500 | 74.8675 | 0.5823 | 0.6127 | 2.9200 | 0.5823 | 0.5788 | 0.1969 | 0.2059 | | 73.5899 | 27.0 | 6750 | 74.9213 | 0.5723 | 0.6234 | 3.0482 | 0.5723 | 0.5717 | 0.2138 | 0.2085 | | 73.4419 | 28.0 | 7000 | 
74.9436 | 0.5815 | 0.6324 | 3.0789 | 0.5815 | 0.5803 | 0.2223 | 0.2058 | | 73.4419 | 29.0 | 7250 | 74.8826 | 0.5747 | 0.6408 | 3.1380 | 0.5747 | 0.5711 | 0.2428 | 0.2044 | | 73.3198 | 30.0 | 7500 | 75.0310 | 0.5633 | 0.6722 | 3.2517 | 0.5633 | 0.5639 | 0.2571 | 0.2226 | | 73.3198 | 31.0 | 7750 | 75.0300 | 0.5577 | 0.6795 | 3.3520 | 0.5577 | 0.5627 | 0.2611 | 0.2255 | | 73.2086 | 32.0 | 8000 | 74.9569 | 0.5793 | 0.6614 | 3.3345 | 0.5793 | 0.5829 | 0.2623 | 0.2070 | | 73.2086 | 33.0 | 8250 | 75.1474 | 0.5655 | 0.6902 | 3.5319 | 0.5655 | 0.5656 | 0.2780 | 0.2260 | | 73.1102 | 34.0 | 8500 | 75.1176 | 0.5697 | 0.6926 | 3.5011 | 0.5697 | 0.5685 | 0.2891 | 0.2127 | | 73.1102 | 35.0 | 8750 | 75.2834 | 0.5673 | 0.7085 | 3.7150 | 0.5673 | 0.5688 | 0.2945 | 0.2210 | | 73.0239 | 36.0 | 9000 | 75.2426 | 0.566 | 0.7101 | 3.6822 | 0.566 | 0.5679 | 0.3029 | 0.2200 | | 73.0239 | 37.0 | 9250 | 75.3049 | 0.5743 | 0.7082 | 3.6300 | 0.5743 | 0.5758 | 0.3044 | 0.2185 | | 72.9631 | 38.0 | 9500 | 75.3404 | 0.5695 | 0.7220 | 3.7386 | 0.5695 | 0.5741 | 0.3177 | 0.2210 | | 72.9631 | 39.0 | 9750 | 75.4376 | 0.5775 | 0.7181 | 3.8412 | 0.5775 | 0.5784 | 0.3148 | 0.2191 | | 72.9028 | 40.0 | 10000 | 75.4664 | 0.5777 | 0.7178 | 3.9272 | 0.5777 | 0.5775 | 0.3178 | 0.2233 | | 72.9028 | 41.0 | 10250 | 75.5305 | 0.5737 | 0.7279 | 3.8240 | 0.5737 | 0.5761 | 0.3271 | 0.2215 | | 72.8505 | 42.0 | 10500 | 75.4606 | 0.5783 | 0.7225 | 3.8401 | 0.5783 | 0.5805 | 0.3261 | 0.2156 | | 72.8505 | 43.0 | 10750 | 75.5084 | 0.5793 | 0.7242 | 3.8552 | 0.5793 | 0.5791 | 0.3308 | 0.2115 | | 72.8091 | 44.0 | 11000 | 75.4797 | 0.5817 | 0.7256 | 3.8946 | 0.5817 | 0.5825 | 0.3340 | 0.2112 | | 72.8091 | 45.0 | 11250 | 75.5695 | 0.5793 | 0.7297 | 3.9742 | 0.5793 | 0.5809 | 0.3379 | 0.2150 | | 72.7801 | 46.0 | 11500 | 75.5592 | 0.5807 | 0.7331 | 3.9445 | 0.5807 | 0.5830 | 0.3378 | 0.2151 | | 72.7801 | 47.0 | 11750 | 75.5976 | 0.5833 | 0.7303 | 3.9669 | 0.5833 | 0.5840 | 0.3380 | 0.2145 | | 72.7606 | 48.0 | 12000 | 75.5952 | 
0.5833 | 0.7320 | 3.9813 | 0.5833 | 0.5847 | 0.3380 | 0.2148 | | 72.7606 | 49.0 | 12250 | 75.5621 | 0.5843 | 0.7309 | 3.9491 | 0.5843 | 0.5851 | 0.3385 | 0.2127 | | 72.7486 | 50.0 | 12500 | 75.5808 | 0.583 | 0.7311 | 3.9633 | 0.583 | 0.5838 | 0.3399 | 0.2128 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
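The card above reports a multi-class Brier loss next to accuracy and ECE. As a reference for readers, here is a minimal NumPy sketch of that metric (the toy probabilities are illustrative; the card does not include its evaluation code):

```python
import numpy as np

def brier_loss(probs, labels):
    """Multi-class Brier score: mean squared distance between the
    predicted probability vector and the one-hot target."""
    probs = np.asarray(probs, dtype=float)
    onehot = np.eye(probs.shape[1])[np.asarray(labels)]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

# A confident correct prediction scores near 0; a uniform guess scores higher.
confident = brier_loss([[0.9, 0.05, 0.05]], [0])       # 0.015
uniform = brier_loss([[1 / 3, 1 / 3, 1 / 3]], [0])     # 2/3
```

Like the validation Brier loss in the table, the score is bounded by 2 and falls as predictions become both accurate and confident.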
ongkn/attraction-classifier
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# attraction-classifier

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4274
- Accuracy: 0.8243

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 69
- gradient_accumulation_steps: 16
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 25
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6782        | 1.78  | 15   | 0.5922          | 0.7008   |
| 0.5096        | 3.56  | 30   | 0.5153          | 0.7552   |
| 0.4434        | 5.33  | 45   | 0.4520          | 0.7762   |
| 0.3844        | 7.11  | 60   | 0.4381          | 0.8013   |
| 0.3642        | 8.89  | 75   | 0.4359          | 0.8054   |
| 0.322         | 10.67 | 90   | 0.4086          | 0.8138   |
| 0.2845        | 12.44 | 105  | 0.4111          | 0.8201   |
| 0.2588        | 14.22 | 120  | 0.4100          | 0.8159   |
| 0.2516        | 16.0  | 135  | 0.4122          | 0.8389   |
| 0.2375        | 17.78 | 150  | 0.4085          | 0.8243   |
| 0.2309        | 19.56 | 165  | 0.4149          | 0.8117   |
| 0.2175        | 21.33 | 180  | 0.4274          | 0.8243   |

### Framework versions

- Transformers 4.37.2
- Pytorch 2.0.1+cu117
- Datasets 2.15.0
- Tokenizers 0.15.0
[ "neg", "pos" ]
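The attraction-classifier card lists a cosine scheduler with a 5% warmup ratio and an effective batch of 512 (32 × 16 accumulation steps). A rough sketch of what such a schedule looks like, using the standard warmup-then-cosine formula (an approximation of the usual `transformers` cosine schedule, assumed rather than taken from the training code):

```python
import math

def cosine_lr(step, total_steps, base_lr=5e-5, warmup_ratio=0.05):
    """Linear warmup for the first 5% of steps, then cosine decay to 0,
    matching the hyperparameters listed in the card (assumed formula)."""
    warmup_steps = max(1, int(total_steps * warmup_ratio))
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

# The LR ramps up from 0, peaks at base_lr right after warmup,
# and decays smoothly back to 0 by the final step.
```

The rising validation accuracy late in training (e.g. 0.8389 at epoch 16) is consistent with this kind of slowly decaying tail.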
jordyvl/vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_simkd_rand
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_simkd_rand This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.0667 - Accuracy: 0.5865 - Brier Loss: 0.5908 - Nll: 3.0393 - F1 Micro: 0.5865 - F1 Macro: 0.5890 - Ece: 0.1479 - Aurc: 0.2054 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 0.0807 | 1.0 | 1000 | 0.0798 | 0.095 | 0.9362 | 7.0778 | 0.095 | 0.0517 | 0.0524 | 0.8510 | | 0.0785 | 2.0 | 2000 | 0.0782 | 0.142 | 0.9268 | 6.5000 | 0.142 | 0.0892 | 0.0843 | 0.7446 | | 0.0768 | 3.0 | 3000 | 0.0761 | 0.253 | 0.8945 | 4.3268 | 0.253 | 0.1827 | 0.1545 | 0.5697 | | 0.0753 | 4.0 | 4000 | 0.0747 | 0.327 | 0.8672 | 3.7313 | 0.327 | 0.2733 | 0.2052 | 0.4558 | | 0.074 | 5.0 | 5000 | 0.0739 | 0.359 | 0.8410 | 3.6965 | 0.359 | 0.2941 | 0.2102 | 0.4159 | | 0.0729 | 6.0 | 6000 | 0.0725 | 0.3795 | 0.8104 | 3.2323 | 0.3795 | 0.3340 | 0.2147 | 0.3672 | | 0.0718 | 7.0 | 7000 | 0.0717 | 0.4165 | 0.7806 | 3.1185 | 0.4165 | 0.3770 | 0.2186 | 0.3378 | | 0.071 | 
8.0 | 8000 | 0.0714 | 0.4175 | 0.7785 | 3.1984 | 0.4175 | 0.3999 | 0.2170 | 0.3408 | | 0.0703 | 9.0 | 9000 | 0.0707 | 0.457 | 0.7563 | 2.8932 | 0.457 | 0.4310 | 0.2437 | 0.2965 | | 0.0696 | 10.0 | 10000 | 0.0699 | 0.4665 | 0.7452 | 2.7889 | 0.4665 | 0.4529 | 0.2456 | 0.2828 | | 0.0691 | 11.0 | 11000 | 0.0693 | 0.499 | 0.7219 | 2.7292 | 0.499 | 0.4756 | 0.2543 | 0.2579 | | 0.0685 | 12.0 | 12000 | 0.0691 | 0.4955 | 0.7144 | 2.8807 | 0.4955 | 0.4734 | 0.2443 | 0.2515 | | 0.068 | 13.0 | 13000 | 0.0688 | 0.5072 | 0.7096 | 2.6737 | 0.5072 | 0.4944 | 0.2525 | 0.2468 | | 0.0675 | 14.0 | 14000 | 0.0685 | 0.513 | 0.6952 | 2.7492 | 0.513 | 0.5001 | 0.2404 | 0.2453 | | 0.0669 | 15.0 | 15000 | 0.0682 | 0.5232 | 0.6855 | 2.7789 | 0.5232 | 0.5048 | 0.2441 | 0.2379 | | 0.0664 | 16.0 | 16000 | 0.0680 | 0.529 | 0.6790 | 2.8249 | 0.529 | 0.5182 | 0.2366 | 0.2340 | | 0.0658 | 17.0 | 17000 | 0.0678 | 0.5347 | 0.6668 | 2.7035 | 0.5347 | 0.5237 | 0.2338 | 0.2228 | | 0.0652 | 18.0 | 18000 | 0.0676 | 0.5335 | 0.6673 | 2.8630 | 0.5335 | 0.5249 | 0.2319 | 0.2252 | | 0.0651 | 19.0 | 19000 | 0.0675 | 0.5385 | 0.6524 | 2.7522 | 0.5385 | 0.5286 | 0.2172 | 0.2256 | | 0.0645 | 20.0 | 20000 | 0.0671 | 0.5593 | 0.6454 | 2.7445 | 0.5593 | 0.5563 | 0.2324 | 0.2122 | | 0.0639 | 21.0 | 21000 | 0.0672 | 0.5453 | 0.6541 | 2.9011 | 0.5453 | 0.5451 | 0.2236 | 0.2204 | | 0.0634 | 22.0 | 22000 | 0.0668 | 0.5617 | 0.6398 | 2.8668 | 0.5617 | 0.5604 | 0.2264 | 0.2108 | | 0.0629 | 23.0 | 23000 | 0.0670 | 0.5577 | 0.6295 | 2.8351 | 0.5577 | 0.5521 | 0.1984 | 0.2180 | | 0.0625 | 24.0 | 24000 | 0.0666 | 0.5765 | 0.6201 | 2.7133 | 0.5765 | 0.5754 | 0.2138 | 0.2035 | | 0.0618 | 25.0 | 25000 | 0.0666 | 0.565 | 0.6219 | 2.8775 | 0.565 | 0.5614 | 0.2010 | 0.2078 | | 0.0613 | 26.0 | 26000 | 0.0664 | 0.5795 | 0.6121 | 2.8665 | 0.5795 | 0.5805 | 0.1996 | 0.2024 | | 0.0606 | 27.0 | 27000 | 0.0667 | 0.5723 | 0.6101 | 2.9450 | 0.5723 | 0.5711 | 0.1804 | 0.2113 | | 0.0603 | 28.0 | 28000 | 0.0664 | 0.583 | 0.6106 | 2.9126 | 
0.583 | 0.5845 | 0.2004 | 0.2006 | | 0.0597 | 29.0 | 29000 | 0.0665 | 0.5857 | 0.6050 | 2.9881 | 0.5857 | 0.5862 | 0.1912 | 0.2006 | | 0.0594 | 30.0 | 30000 | 0.0665 | 0.5775 | 0.6043 | 2.9735 | 0.5775 | 0.5797 | 0.1823 | 0.2029 | | 0.0589 | 31.0 | 31000 | 0.0666 | 0.5733 | 0.6080 | 2.9942 | 0.5733 | 0.5739 | 0.1721 | 0.2129 | | 0.0585 | 32.0 | 32000 | 0.0667 | 0.5803 | 0.6066 | 3.0341 | 0.5803 | 0.5826 | 0.1748 | 0.2114 | | 0.0583 | 33.0 | 33000 | 0.0665 | 0.5827 | 0.6033 | 3.0209 | 0.5827 | 0.5880 | 0.1799 | 0.2029 | | 0.0578 | 34.0 | 34000 | 0.0667 | 0.577 | 0.6020 | 3.0483 | 0.577 | 0.5816 | 0.1636 | 0.2081 | | 0.0576 | 35.0 | 35000 | 0.0667 | 0.577 | 0.6029 | 3.0263 | 0.577 | 0.5840 | 0.1573 | 0.2117 | | 0.0574 | 36.0 | 36000 | 0.0667 | 0.5803 | 0.6006 | 3.0578 | 0.5803 | 0.5851 | 0.1627 | 0.2082 | | 0.057 | 37.0 | 37000 | 0.0666 | 0.582 | 0.5997 | 3.1133 | 0.582 | 0.5867 | 0.1612 | 0.2094 | | 0.0567 | 38.0 | 38000 | 0.0667 | 0.5817 | 0.5951 | 3.0727 | 0.5817 | 0.5836 | 0.1552 | 0.2091 | | 0.0566 | 39.0 | 39000 | 0.0666 | 0.5815 | 0.5951 | 3.0308 | 0.5815 | 0.5853 | 0.1559 | 0.2049 | | 0.0564 | 40.0 | 40000 | 0.0666 | 0.5853 | 0.5940 | 3.0629 | 0.5853 | 0.5880 | 0.1564 | 0.2057 | | 0.0562 | 41.0 | 41000 | 0.0666 | 0.5845 | 0.5949 | 3.0956 | 0.5845 | 0.5881 | 0.1585 | 0.2055 | | 0.0561 | 42.0 | 42000 | 0.0666 | 0.5827 | 0.5960 | 3.0679 | 0.5827 | 0.5876 | 0.1540 | 0.2098 | | 0.0559 | 43.0 | 43000 | 0.0666 | 0.5833 | 0.5909 | 2.9904 | 0.5833 | 0.5854 | 0.1491 | 0.2049 | | 0.0559 | 44.0 | 44000 | 0.0665 | 0.585 | 0.5915 | 3.0150 | 0.585 | 0.5876 | 0.1543 | 0.2032 | | 0.0557 | 45.0 | 45000 | 0.0667 | 0.583 | 0.5923 | 3.0501 | 0.583 | 0.5851 | 0.1501 | 0.2056 | | 0.0557 | 46.0 | 46000 | 0.0666 | 0.5905 | 0.5914 | 3.0110 | 0.5905 | 0.5940 | 0.1550 | 0.2045 | | 0.0555 | 47.0 | 47000 | 0.0667 | 0.584 | 0.5922 | 3.0464 | 0.584 | 0.5872 | 0.1497 | 0.2069 | | 0.0555 | 48.0 | 48000 | 0.0667 | 0.588 | 0.5917 | 3.0408 | 0.588 | 0.5919 | 0.1489 | 0.2051 | | 0.0554 | 49.0 | 
49000 | 0.0667 | 0.589 | 0.5908 | 3.0433 | 0.589 | 0.5923 | 0.1496 | 0.2044 | | 0.0554 | 50.0 | 50000 | 0.0667 | 0.5865 | 0.5908 | 3.0393 | 0.5865 | 0.5890 | 0.1479 | 0.2054 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
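The simkd card tracks ECE across epochs. One common way to compute it is equal-width confidence binning; a small sketch follows (the card does not state which binning variant the trainer uses, so this is illustrative):

```python
import numpy as np

def ece(confidences, correct, n_bins=10):
    """Expected calibration error with equal-width bins: the
    coverage-weighted gap between each bin's mean confidence
    and its empirical accuracy."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            gap = abs(confidences[in_bin].mean() - correct[in_bin].mean())
            total += in_bin.mean() * gap
    return float(total)

# Overconfident predictions (95% confidence, 50% accuracy) yield a large ECE.
overconfident = ece([0.95, 0.95, 0.95, 0.95], [1, 1, 0, 0])  # 0.45
```

By this reading, the fall from ~0.24 to ~0.15 over training in the table means the model's confidences drift closer to its actual accuracy.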
Deexit/swin-tiny-patch4-window7-224-finetuned-eurosat
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. -->

# Deexit/swin-tiny-patch4-window7-224-finetuned-eurosat

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 1.9176
- Validation Loss: 3.2903
- Validation Accuracy: 0.0
- Epoch: 13

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 5e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32

### Training results

| Train Loss | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:---------------:|:-------------------:|:-----:|
| 2.7734     | 2.7408          | 0.0                 | 0     |
| 2.4056     | 2.7463          | 0.0                 | 1     |
| 2.1880     | 2.7762          | 0.0                 | 2     |
| 2.0477     | 2.8285          | 0.0                 | 3     |
| 2.1556     | 2.8884          | 0.0                 | 4     |
| 2.0269     | 2.9569          | 0.0                 | 5     |
| 1.7258     | 3.0337          | 0.0                 | 6     |
| 2.3555     | 3.1071          | 0.0                 | 7     |
| 1.8657     | 3.1494          | 0.0                 | 8     |
| 1.8121     | 3.1848          | 0.0                 | 9     |
| 1.9192     | 3.2109          | 0.0                 | 10    |
| 1.9925     | 3.2335          | 0.0                 | 11    |
| 2.0157     | 3.2654          | 0.0                 | 12    |
| 1.9176     | 3.2903          | 0.0                 | 13    |

### Framework versions

- Transformers 4.31.0
- TensorFlow 2.13.0
- Datasets 2.14.3
- Tokenizers 0.13.3
[ "boat", "children", "dogs", "fireman", "firetruck", "mountains", "people", "river", "snow", "stairs", "water" ]
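The Keras run above uses `AdamWeightDecay`, i.e. AdamW-style decoupled weight decay with the listed betas, epsilon, and `weight_decay_rate`. A scalar sketch of one such update using the standard AdamW formula (this is not the Keras implementation itself):

```python
def adamw_step(param, grad, m, v, t, lr=5e-5, beta1=0.9, beta2=0.999,
               eps=1e-7, weight_decay=0.01):
    """One decoupled-weight-decay Adam update on a scalar parameter.
    The decay term is applied to the parameter directly rather than
    folded into the gradient (the defining trait of AdamW)."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    m_hat = m / (1 - beta1 ** t)   # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)   # bias-corrected second moment
    param -= lr * (m_hat / (v_hat ** 0.5 + eps) + weight_decay * param)
    return param, m, v

# First step on a unit gradient nudges the parameter down by roughly lr.
p, m, v = adamw_step(1.0, 1.0, 0.0, 0.0, t=1)
```

The hyperparameter defaults here mirror the card's optimizer dict (learning_rate 5e-05, beta_1 0.9, beta_2 0.999, epsilon 1e-07, weight_decay_rate 0.01).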
jordyvl/vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_og_simkd_rand
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-tiny_rvl_cdip-NK1000_og_simkd_rand This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset. It achieves the following results on the evaluation set: - Loss: 267.6730 - Accuracy: 0.6705 - Brier Loss: 0.6262 - Nll: 2.7104 - F1 Micro: 0.6705 - F1 Macro: 0.6721 - Ece: 0.3087 - Aurc: 0.1976 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 286.7271 | 1.0 | 1000 | 285.5399 | 0.2112 | 1.1285 | 5.2382 | 0.2112 | 0.1362 | 0.4400 | 0.6668 | | 284.6535 | 2.0 | 2000 | 284.8639 | 0.2365 | 1.1876 | 6.1414 | 0.2365 | 0.1846 | 0.5026 | 0.6043 | | 283.982 | 3.0 | 3000 | 284.8751 | 0.2555 | 1.2913 | 6.7626 | 0.2555 | 0.2072 | 0.5840 | 0.6111 | | 283.8947 | 4.0 | 4000 | 283.0353 | 0.3585 | 1.0748 | 4.2918 | 0.3585 | 0.3100 | 0.4921 | 0.4239 | | 282.5615 | 5.0 | 5000 | 282.0369 | 0.3852 | 1.0142 | 4.7413 | 0.3852 | 0.3432 | 0.4558 | 0.3983 | | 281.6467 | 6.0 | 6000 | 280.8857 | 0.428 | 0.9539 | 4.1971 | 0.428 | 0.3797 | 0.4329 | 0.3427 | | 280.8835 | 7.0 | 7000 | 279.7836 | 0.4288 | 1.0391 | 3.9288 | 
0.4288 | 0.4012 | 0.4994 | 0.3565 | | 279.5518 | 8.0 | 8000 | 278.7849 | 0.5198 | 0.8045 | 3.0811 | 0.5198 | 0.4977 | 0.3699 | 0.2454 | | 278.6091 | 9.0 | 9000 | 278.3536 | 0.5155 | 0.8487 | 3.1204 | 0.5155 | 0.4977 | 0.4004 | 0.2587 | | 277.9435 | 10.0 | 10000 | 277.6002 | 0.5258 | 0.8346 | 3.3232 | 0.5258 | 0.4899 | 0.3923 | 0.2693 | | 277.646 | 11.0 | 11000 | 276.9034 | 0.5285 | 0.8510 | 3.1019 | 0.5285 | 0.5010 | 0.4079 | 0.2804 | | 276.6211 | 12.0 | 12000 | 276.8536 | 0.5555 | 0.7899 | 3.0560 | 0.5555 | 0.5446 | 0.3760 | 0.2266 | | 276.1643 | 13.0 | 13000 | 275.8300 | 0.5685 | 0.7767 | 3.1275 | 0.5685 | 0.5412 | 0.3730 | 0.2267 | | 275.7773 | 14.0 | 14000 | 275.0154 | 0.5833 | 0.7536 | 2.9981 | 0.5833 | 0.5645 | 0.3603 | 0.2357 | | 274.971 | 15.0 | 15000 | 275.1284 | 0.6008 | 0.7210 | 2.8953 | 0.6008 | 0.5920 | 0.3414 | 0.2059 | | 274.6605 | 16.0 | 16000 | 273.9564 | 0.6132 | 0.7168 | 2.8476 | 0.6132 | 0.5968 | 0.3479 | 0.2272 | | 273.7713 | 17.0 | 17000 | 273.3493 | 0.5995 | 0.7409 | 2.8991 | 0.5995 | 0.5901 | 0.3607 | 0.2272 | | 272.7905 | 18.0 | 18000 | 273.5748 | 0.598 | 0.7367 | 2.7778 | 0.598 | 0.5858 | 0.3565 | 0.2102 | | 273.134 | 19.0 | 19000 | 272.6561 | 0.6158 | 0.7128 | 2.8084 | 0.6158 | 0.6023 | 0.3494 | 0.2132 | | 271.8558 | 20.0 | 20000 | 272.4530 | 0.618 | 0.7139 | 2.9767 | 0.618 | 0.6077 | 0.3480 | 0.2177 | | 271.9448 | 21.0 | 21000 | 272.1698 | 0.619 | 0.7164 | 2.9459 | 0.619 | 0.6133 | 0.3510 | 0.2256 | | 270.9343 | 22.0 | 22000 | 272.2906 | 0.6235 | 0.7087 | 2.9843 | 0.6235 | 0.6181 | 0.3452 | 0.2248 | | 270.6012 | 23.0 | 23000 | 271.5266 | 0.6382 | 0.6781 | 2.9158 | 0.6382 | 0.6352 | 0.3324 | 0.2110 | | 270.3184 | 24.0 | 24000 | 271.1095 | 0.634 | 0.6922 | 2.9734 | 0.634 | 0.6287 | 0.3348 | 0.2162 | | 269.5019 | 25.0 | 25000 | 270.8806 | 0.644 | 0.6683 | 2.8735 | 0.644 | 0.6359 | 0.3258 | 0.2123 | | 269.5113 | 26.0 | 26000 | 270.6180 | 0.6445 | 0.6650 | 2.6933 | 0.6445 | 0.6418 | 0.3271 | 0.2032 | | 269.1238 | 27.0 | 27000 | 270.1308 | 
0.6445 | 0.6712 | 2.8097 | 0.6445 | 0.6462 | 0.3290 | 0.2128 | | 268.424 | 28.0 | 28000 | 269.7667 | 0.6352 | 0.6872 | 2.9166 | 0.6352 | 0.6314 | 0.3371 | 0.2231 | | 268.4034 | 29.0 | 29000 | 270.0039 | 0.6455 | 0.6685 | 2.7765 | 0.6455 | 0.6459 | 0.3273 | 0.2097 | | 268.3632 | 30.0 | 30000 | 270.0340 | 0.6448 | 0.6741 | 2.8602 | 0.6448 | 0.6455 | 0.3291 | 0.2178 | | 268.1831 | 31.0 | 31000 | 269.3010 | 0.6597 | 0.6467 | 2.7502 | 0.6597 | 0.6571 | 0.3176 | 0.2053 | | 268.0006 | 32.0 | 32000 | 269.4335 | 0.652 | 0.6583 | 2.8213 | 0.652 | 0.6457 | 0.3236 | 0.2081 | | 267.5016 | 33.0 | 33000 | 269.2711 | 0.654 | 0.6530 | 2.8720 | 0.654 | 0.6517 | 0.3199 | 0.2090 | | 267.177 | 34.0 | 34000 | 268.7774 | 0.661 | 0.6402 | 2.7718 | 0.661 | 0.6589 | 0.3137 | 0.1979 | | 266.8408 | 35.0 | 35000 | 268.8279 | 0.6478 | 0.6640 | 2.8626 | 0.6478 | 0.6472 | 0.3271 | 0.2204 | | 266.1984 | 36.0 | 36000 | 268.3442 | 0.6635 | 0.6378 | 2.7999 | 0.6635 | 0.6611 | 0.3128 | 0.2079 | | 266.1338 | 37.0 | 37000 | 268.5704 | 0.66 | 0.6430 | 2.8314 | 0.66 | 0.6576 | 0.3165 | 0.2039 | | 266.6958 | 38.0 | 38000 | 268.1453 | 0.6635 | 0.6415 | 2.7881 | 0.6635 | 0.6627 | 0.3147 | 0.2106 | | 265.6171 | 39.0 | 39000 | 268.1818 | 0.6635 | 0.6398 | 2.7602 | 0.6635 | 0.6641 | 0.3142 | 0.2025 | | 265.8238 | 40.0 | 40000 | 268.1265 | 0.6637 | 0.6390 | 2.8178 | 0.6637 | 0.6648 | 0.3151 | 0.2016 | | 265.4164 | 41.0 | 41000 | 267.8777 | 0.6663 | 0.6304 | 2.7649 | 0.6663 | 0.6664 | 0.3113 | 0.2012 | | 265.6293 | 42.0 | 42000 | 267.8370 | 0.6683 | 0.6285 | 2.7730 | 0.6683 | 0.6677 | 0.3108 | 0.2023 | | 265.6068 | 43.0 | 43000 | 267.7586 | 0.665 | 0.6348 | 2.7612 | 0.665 | 0.6649 | 0.3126 | 0.1992 | | 265.2131 | 44.0 | 44000 | 268.0432 | 0.667 | 0.6293 | 2.7217 | 0.667 | 0.6669 | 0.3094 | 0.1885 | | 265.1312 | 45.0 | 45000 | 267.6967 | 0.6653 | 0.6316 | 2.6899 | 0.6653 | 0.6637 | 0.3127 | 0.2000 | | 265.371 | 46.0 | 46000 | 267.5307 | 0.668 | 0.6317 | 2.7472 | 0.668 | 0.6684 | 0.3105 | 0.2000 | | 264.9213 | 47.0 
| 47000 | 267.5887 | 0.672 | 0.6214 | 2.6635 | 0.672 | 0.6720 | 0.3063 | 0.1935 | | 265.1304 | 48.0 | 48000 | 267.4995 | 0.6735 | 0.6220 | 2.7437 | 0.6735 | 0.6730 | 0.3049 | 0.1958 | | 264.6242 | 49.0 | 49000 | 267.2600 | 0.6723 | 0.6236 | 2.8222 | 0.6723 | 0.6713 | 0.3074 | 0.1974 | | 265.1563 | 50.0 | 50000 | 267.6730 | 0.6705 | 0.6262 | 2.7104 | 0.6705 | 0.6721 | 0.3087 | 0.1976 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
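The og_simkd card also reports AURC (area under the risk–coverage curve), a selective-prediction metric. A compact sketch of one common definition: rank by confidence and average the running error rate (the exact variant the trainer computes is not given):

```python
import numpy as np

def aurc(confidences, correct):
    """Rank predictions from most to least confident and average the
    cumulative error rate over all coverage levels. Lower is better:
    0 means every mistake ranks below every correct prediction."""
    order = np.argsort(-np.asarray(confidences, dtype=float))
    errors = 1.0 - np.asarray(correct, dtype=float)[order]
    coverage_risk = np.cumsum(errors) / np.arange(1, len(errors) + 1)
    return float(coverage_risk.mean())

# The single error is the least confident prediction, so risk stays low.
score = aurc([0.9, 0.8, 0.7], [1, 1, 0])  # (0 + 0 + 1/3) / 3 = 1/9
```

Under this definition, the AURC of ~0.20 in the table reflects both the model's error rate and how well its confidences rank errors last.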
drewglass/results
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# results

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the food101 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5746
- Accuracy: 0.894

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.6455        | 0.99  | 62   | 2.4448          | 0.841    |
| 1.751         | 2.0   | 125  | 1.7416          | 0.879    |
| 1.5345        | 2.98  | 186  | 1.5746          | 0.894    |

### Framework versions

- Transformers 4.28.1
- Pytorch 1.12.1+cu116
- Datasets 2.4.0
- Tokenizers 0.12.1
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
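This card's effective batch of 64 comes from accumulating gradients over 4 micro-batches of 16 before each optimizer step. The arithmetic of that trick in miniature (illustrative scalar "gradients"):

```python
def accumulate(micro_batch_grads):
    """Average the gradients from several micro-batches so one optimizer
    step sees the same mean gradient a single large batch would produce.
    With 4 micro-batches of 16 samples this emulates a batch of 64."""
    return sum(micro_batch_grads) / len(micro_batch_grads)

# Four micro-batch gradients collapse into one update direction.
update = accumulate([1.0, 2.0, 3.0, 2.0])  # 2.0
```

Accumulation trades wall-clock time for memory: the 64-sample batch never has to fit on the device at once.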
Thamer/beit_large512_fine_tuned
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# beit_large512_fine_tuned

This model is a fine-tuned version of [microsoft/beit-base-patch16-384](https://huggingface.co/microsoft/beit-base-patch16-384) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0353
- Accuracy: 0.9925

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 4.6571        | 0.98  | 16   | 0.3870          | 0.8722   |
| 0.2299        | 1.97  | 32   | 0.0632          | 0.9850   |
| 0.1435        | 2.95  | 48   | 0.0353          | 0.9925   |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cpu
- Datasets 2.13.1
- Tokenizers 0.13.3
[ "tench, tinca tinca", "goldfish, carassius auratus", "great white shark, white shark, man-eater, man-eating shark, carcharodon carcharias", "tiger shark, galeocerdo cuvieri", "hammerhead, hammerhead shark", "electric ray, crampfish, numbfish, torpedo", "stingray", "cock", "hen", "ostrich, struthio camelus", "brambling, fringilla montifringilla", "goldfinch, carduelis carduelis", "house finch, linnet, carpodacus mexicanus", "junco, snowbird", "indigo bunting, indigo finch, indigo bird, passerina cyanea", "robin, american robin, turdus migratorius", "bulbul", "jay", "magpie", "chickadee", "water ouzel, dipper", "kite", "bald eagle, american eagle, haliaeetus leucocephalus", "vulture", "great grey owl, great gray owl, strix nebulosa", "european fire salamander, salamandra salamandra", "common newt, triturus vulgaris", "eft", "spotted salamander, ambystoma maculatum", "axolotl, mud puppy, ambystoma mexicanum", "bullfrog, rana catesbeiana", "tree frog, tree-frog", "tailed frog, bell toad, ribbed toad, tailed toad, ascaphus trui", "loggerhead, loggerhead turtle, caretta caretta", "leatherback turtle, leatherback, leathery turtle, dermochelys coriacea", "mud turtle", "terrapin", "box turtle, box tortoise", "banded gecko", "common iguana, iguana, iguana iguana", "american chameleon, anole, anolis carolinensis", "whiptail, whiptail lizard", "agama", "frilled lizard, chlamydosaurus kingi", "alligator lizard", "gila monster, heloderma suspectum", "green lizard, lacerta viridis", "african chameleon, chamaeleo chamaeleon", "komodo dragon, komodo lizard, dragon lizard, giant lizard, varanus komodoensis", "african crocodile, nile crocodile, crocodylus niloticus", "american alligator, alligator mississipiensis", "triceratops", "thunder snake, worm snake, carphophis amoenus", "ringneck snake, ring-necked snake, ring snake", "hognose snake, puff adder, sand viper", "green snake, grass snake", "king snake, kingsnake", "garter snake, grass snake", "water snake", "vine snake", "night 
snake, hypsiglena torquata", "boa constrictor, constrictor constrictor", "rock python, rock snake, python sebae", "indian cobra, naja naja", "green mamba", "sea snake", "horned viper, cerastes, sand viper, horned asp, cerastes cornutus", "diamondback, diamondback rattlesnake, crotalus adamanteus", "sidewinder, horned rattlesnake, crotalus cerastes", "trilobite", "harvestman, daddy longlegs, phalangium opilio", "scorpion", "black and gold garden spider, argiope aurantia", "barn spider, araneus cavaticus", "garden spider, aranea diademata", "black widow, latrodectus mactans", "tarantula", "wolf spider, hunting spider", "tick", "centipede", "black grouse", "ptarmigan", "ruffed grouse, partridge, bonasa umbellus", "prairie chicken, prairie grouse, prairie fowl", "peacock", "quail", "partridge", "african grey, african gray, psittacus erithacus", "macaw", "sulphur-crested cockatoo, kakatoe galerita, cacatua galerita", "lorikeet", "coucal", "bee eater", "hornbill", "hummingbird", "jacamar", "toucan", "drake", "red-breasted merganser, mergus serrator", "goose", "black swan, cygnus atratus", "tusker", "echidna, spiny anteater, anteater", "platypus, duckbill, duckbilled platypus, duck-billed platypus, ornithorhynchus anatinus", "wallaby, brush kangaroo", "koala, koala bear, kangaroo bear, native bear, phascolarctos cinereus", "wombat", "jellyfish", "sea anemone, anemone", "brain coral", "flatworm, platyhelminth", "nematode, nematode worm, roundworm", "conch", "snail", "slug", "sea slug, nudibranch", "chiton, coat-of-mail shell, sea cradle, polyplacophore", "chambered nautilus, pearly nautilus, nautilus", "dungeness crab, cancer magister", "rock crab, cancer irroratus", "fiddler crab", "king crab, alaska crab, alaskan king crab, alaska king crab, paralithodes camtschatica", "american lobster, northern lobster, maine lobster, homarus americanus", "spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish", "crayfish, crawfish, crawdad, crawdaddy", "hermit crab", 
"isopod", "white stork, ciconia ciconia", "black stork, ciconia nigra", "spoonbill", "flamingo", "little blue heron, egretta caerulea", "american egret, great white heron, egretta albus", "bittern", "crane", "limpkin, aramus pictus", "european gallinule, porphyrio porphyrio", "american coot, marsh hen, mud hen, water hen, fulica americana", "bustard", "ruddy turnstone, arenaria interpres", "red-backed sandpiper, dunlin, erolia alpina", "redshank, tringa totanus", "dowitcher", "oystercatcher, oyster catcher", "pelican", "king penguin, aptenodytes patagonica", "albatross, mollymawk", "grey whale, gray whale, devilfish, eschrichtius gibbosus, eschrichtius robustus", "killer whale, killer, orca, grampus, sea wolf, orcinus orca", "dugong, dugong dugon", "sea lion", "chihuahua", "japanese spaniel", "maltese dog, maltese terrier, maltese", "pekinese, pekingese, peke", "shih-tzu", "blenheim spaniel", "papillon", "toy terrier", "rhodesian ridgeback", "afghan hound, afghan", "basset, basset hound", "beagle", "bloodhound, sleuthhound", "bluetick", "black-and-tan coonhound", "walker hound, walker foxhound", "english foxhound", "redbone", "borzoi, russian wolfhound", "irish wolfhound", "italian greyhound", "whippet", "ibizan hound, ibizan podenco", "norwegian elkhound, elkhound", "otterhound, otter hound", "saluki, gazelle hound", "scottish deerhound, deerhound", "weimaraner", "staffordshire bullterrier, staffordshire bull terrier", "american staffordshire terrier, staffordshire terrier, american pit bull terrier, pit bull terrier", "bedlington terrier", "border terrier", "kerry blue terrier", "irish terrier", "norfolk terrier", "norwich terrier", "yorkshire terrier", "wire-haired fox terrier", "lakeland terrier", "sealyham terrier, sealyham", "airedale, airedale terrier", "cairn, cairn terrier", "australian terrier", "dandie dinmont, dandie dinmont terrier", "boston bull, boston terrier", "miniature schnauzer", "giant schnauzer", "standard schnauzer", "scotch terrier, scottish 
terrier, scottie", "tibetan terrier, chrysanthemum dog", "silky terrier, sydney silky", "soft-coated wheaten terrier", "west highland white terrier", "lhasa, lhasa apso", "flat-coated retriever", "curly-coated retriever", "golden retriever", "labrador retriever", "chesapeake bay retriever", "german short-haired pointer", "vizsla, hungarian pointer", "english setter", "irish setter, red setter", "gordon setter", "brittany spaniel", "clumber, clumber spaniel", "english springer, english springer spaniel", "welsh springer spaniel", "cocker spaniel, english cocker spaniel, cocker", "sussex spaniel", "irish water spaniel", "kuvasz", "schipperke", "groenendael", "malinois", "briard", "kelpie", "komondor", "old english sheepdog, bobtail", "shetland sheepdog, shetland sheep dog, shetland", "collie", "border collie", "bouvier des flandres, bouviers des flandres", "rottweiler", "german shepherd, german shepherd dog, german police dog, alsatian", "doberman, doberman pinscher", "miniature pinscher", "greater swiss mountain dog", "bernese mountain dog", "appenzeller", "entlebucher", "boxer", "bull mastiff", "tibetan mastiff", "french bulldog", "great dane", "saint bernard, st bernard", "eskimo dog, husky", "malamute, malemute, alaskan malamute", "siberian husky", "dalmatian, coach dog, carriage dog", "affenpinscher, monkey pinscher, monkey dog", "basenji", "pug, pug-dog", "leonberg", "newfoundland, newfoundland dog", "great pyrenees", "samoyed, samoyede", "pomeranian", "chow, chow chow", "keeshond", "brabancon griffon", "pembroke, pembroke welsh corgi", "cardigan, cardigan welsh corgi", "toy poodle", "miniature poodle", "standard poodle", "mexican hairless", "timber wolf, grey wolf, gray wolf, canis lupus", "white wolf, arctic wolf, canis lupus tundrarum", "red wolf, maned wolf, canis rufus, canis niger", "coyote, prairie wolf, brush wolf, canis latrans", "dingo, warrigal, warragal, canis dingo", "dhole, cuon alpinus", "african hunting dog, hyena dog, cape hunting dog, lycaon 
pictus", "hyena, hyaena", "red fox, vulpes vulpes", "kit fox, vulpes macrotis", "arctic fox, white fox, alopex lagopus", "grey fox, gray fox, urocyon cinereoargenteus", "tabby, tabby cat", "tiger cat", "persian cat", "siamese cat, siamese", "egyptian cat", "cougar, puma, catamount, mountain lion, painter, panther, felis concolor", "lynx, catamount", "leopard, panthera pardus", "snow leopard, ounce, panthera uncia", "jaguar, panther, panthera onca, felis onca", "lion, king of beasts, panthera leo", "tiger, panthera tigris", "cheetah, chetah, acinonyx jubatus", "brown bear, bruin, ursus arctos", "american black bear, black bear, ursus americanus, euarctos americanus", "ice bear, polar bear, ursus maritimus, thalarctos maritimus", "sloth bear, melursus ursinus, ursus ursinus", "mongoose", "meerkat, mierkat", "tiger beetle", "ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle", "ground beetle, carabid beetle", "long-horned beetle, longicorn, longicorn beetle", "leaf beetle, chrysomelid", "dung beetle", "rhinoceros beetle", "weevil", "fly", "bee", "ant, emmet, pismire", "grasshopper, hopper", "cricket", "walking stick, walkingstick, stick insect", "cockroach, roach", "mantis, mantid", "cicada, cicala", "leafhopper", "lacewing, lacewing fly", "dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk", "damselfly", "admiral", "ringlet, ringlet butterfly", "monarch, monarch butterfly, milkweed butterfly, danaus plexippus", "cabbage butterfly", "sulphur butterfly, sulfur butterfly", "lycaenid, lycaenid butterfly", "starfish, sea star", "sea urchin", "sea cucumber, holothurian", "wood rabbit, cottontail, cottontail rabbit", "hare", "angora, angora rabbit", "hamster", "porcupine, hedgehog", "fox squirrel, eastern fox squirrel, sciurus niger", "marmot", "beaver", "guinea pig, cavia cobaya", "sorrel", "zebra", "hog, pig, grunter, squealer, sus scrofa", "wild boar, boar, sus scrofa", "warthog", 
"hippopotamus, hippo, river horse, hippopotamus amphibius", "ox", "water buffalo, water ox, asiatic buffalo, bubalus bubalis", "bison", "ram, tup", "bighorn, bighorn sheep, cimarron, rocky mountain bighorn, rocky mountain sheep, ovis canadensis", "ibex, capra ibex", "hartebeest", "impala, aepyceros melampus", "gazelle", "arabian camel, dromedary, camelus dromedarius", "llama", "weasel", "mink", "polecat, fitch, foulmart, foumart, mustela putorius", "black-footed ferret, ferret, mustela nigripes", "otter", "skunk, polecat, wood pussy", "badger", "armadillo", "three-toed sloth, ai, bradypus tridactylus", "orangutan, orang, orangutang, pongo pygmaeus", "gorilla, gorilla gorilla", "chimpanzee, chimp, pan troglodytes", "gibbon, hylobates lar", "siamang, hylobates syndactylus, symphalangus syndactylus", "guenon, guenon monkey", "patas, hussar monkey, erythrocebus patas", "baboon", "macaque", "langur", "colobus, colobus monkey", "proboscis monkey, nasalis larvatus", "marmoset", "capuchin, ringtail, cebus capucinus", "howler monkey, howler", "titi, titi monkey", "spider monkey, ateles geoffroyi", "squirrel monkey, saimiri sciureus", "madagascar cat, ring-tailed lemur, lemur catta", "indri, indris, indri indri, indri brevicaudatus", "indian elephant, elephas maximus", "african elephant, loxodonta africana", "lesser panda, red panda, panda, bear cat, cat bear, ailurus fulgens", "giant panda, panda, panda bear, coon bear, ailuropoda melanoleuca", "barracouta, snoek", "eel", "coho, cohoe, coho salmon, blue jack, silver salmon, oncorhynchus kisutch", "rock beauty, holocanthus tricolor", "anemone fish", "sturgeon", "gar, garfish, garpike, billfish, lepisosteus osseus", "lionfish", "puffer, pufferfish, blowfish, globefish", "abacus", "abaya", "academic gown, academic robe, judge's robe", "accordion, piano accordion, squeeze box", "acoustic guitar", "aircraft carrier, carrier, flattop, attack aircraft carrier", "airliner", "airship, dirigible", "altar", "ambulance", "amphibian, 
amphibious vehicle", "analog clock", "apiary, bee house", "apron", "ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin", "assault rifle, assault gun", "backpack, back pack, knapsack, packsack, rucksack, haversack", "bakery, bakeshop, bakehouse", "balance beam, beam", "balloon", "ballpoint, ballpoint pen, ballpen, biro", "band aid", "banjo", "bannister, banister, balustrade, balusters, handrail", "barbell", "barber chair", "barbershop", "barn", "barometer", "barrel, cask", "barrow, garden cart, lawn cart, wheelbarrow", "baseball", "basketball", "bassinet", "bassoon", "bathing cap, swimming cap", "bath towel", "bathtub, bathing tub, bath, tub", "beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon", "beacon, lighthouse, beacon light, pharos", "beaker", "bearskin, busby, shako", "beer bottle", "beer glass", "bell cote, bell cot", "bib", "bicycle-built-for-two, tandem bicycle, tandem", "bikini, two-piece", "binder, ring-binder", "binoculars, field glasses, opera glasses", "birdhouse", "boathouse", "bobsled, bobsleigh, bob", "bolo tie, bolo, bola tie, bola", "bonnet, poke bonnet", "bookcase", "bookshop, bookstore, bookstall", "bottlecap", "bow", "bow tie, bow-tie, bowtie", "brass, memorial tablet, plaque", "brassiere, bra, bandeau", "breakwater, groin, groyne, mole, bulwark, seawall, jetty", "breastplate, aegis, egis", "broom", "bucket, pail", "buckle", "bulletproof vest", "bullet train, bullet", "butcher shop, meat market", "cab, hack, taxi, taxicab", "caldron, cauldron", "candle, taper, wax light", "cannon", "canoe", "can opener, tin opener", "cardigan", "car mirror", "carousel, carrousel, merry-go-round, roundabout, whirligig", "carpenter's kit, tool kit", "carton", "car wheel", "cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, atm", "cassette", "cassette player", "castle", "catamaran", "cd player", "cello, 
violoncello", "cellular telephone, cellular phone, cellphone, cell, mobile phone", "chain", "chainlink fence", "chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour", "chain saw, chainsaw", "chest", "chiffonier, commode", "chime, bell, gong", "china cabinet, china closet", "christmas stocking", "church, church building", "cinema, movie theater, movie theatre, movie house, picture palace", "cleaver, meat cleaver, chopper", "cliff dwelling", "cloak", "clog, geta, patten, sabot", "cocktail shaker", "coffee mug", "coffeepot", "coil, spiral, volute, whorl, helix", "combination lock", "computer keyboard, keypad", "confectionery, confectionary, candy store", "container ship, containership, container vessel", "convertible", "corkscrew, bottle screw", "cornet, horn, trumpet, trump", "cowboy boot", "cowboy hat, ten-gallon hat", "cradle", "crane", "crash helmet", "crate", "crib, cot", "crock pot", "croquet ball", "crutch", "cuirass", "dam, dike, dyke", "desk", "desktop computer", "dial telephone, dial phone", "diaper, nappy, napkin", "digital clock", "digital watch", "dining table, board", "dishrag, dishcloth", "dishwasher, dish washer, dishwashing machine", "disk brake, disc brake", "dock, dockage, docking facility", "dogsled, dog sled, dog sleigh", "dome", "doormat, welcome mat", "drilling platform, offshore rig", "drum, membranophone, tympan", "drumstick", "dumbbell", "dutch oven", "electric fan, blower", "electric guitar", "electric locomotive", "entertainment center", "envelope", "espresso maker", "face powder", "feather boa, boa", "file, file cabinet, filing cabinet", "fireboat", "fire engine, fire truck", "fire screen, fireguard", "flagpole, flagstaff", "flute, transverse flute", "folding chair", "football helmet", "forklift", "fountain", "fountain pen", "four-poster", "freight car", "french horn, horn", "frying pan, frypan, skillet", "fur coat", "garbage truck, dustcart", "gasmask, respirator, gas helmet", "gas pump, gasoline pump, petrol 
pump, island dispenser", "goblet", "go-kart", "golf ball", "golfcart, golf cart", "gondola", "gong, tam-tam", "gown", "grand piano, grand", "greenhouse, nursery, glasshouse", "grille, radiator grille", "grocery store, grocery, food market, market", "guillotine", "hair slide", "hair spray", "half track", "hammer", "hamper", "hand blower, blow dryer, blow drier, hair dryer, hair drier", "hand-held computer, hand-held microcomputer", "handkerchief, hankie, hanky, hankey", "hard disc, hard disk, fixed disk", "harmonica, mouth organ, harp, mouth harp", "harp", "harvester, reaper", "hatchet", "holster", "home theater, home theatre", "honeycomb", "hook, claw", "hoopskirt, crinoline", "horizontal bar, high bar", "horse cart, horse-cart", "hourglass", "ipod", "iron, smoothing iron", "jack-o'-lantern", "jean, blue jean, denim", "jeep, landrover", "jersey, t-shirt, tee shirt", "jigsaw puzzle", "jinrikisha, ricksha, rickshaw", "joystick", "kimono", "knee pad", "knot", "lab coat, laboratory coat", "ladle", "lampshade, lamp shade", "laptop, laptop computer", "lawn mower, mower", "lens cap, lens cover", "letter opener, paper knife, paperknife", "library", "lifeboat", "lighter, light, igniter, ignitor", "limousine, limo", "liner, ocean liner", "lipstick, lip rouge", "loafer", "lotion", "loudspeaker, speaker, speaker unit, loudspeaker system, speaker system", "loupe, jeweler's loupe", "lumbermill, sawmill", "magnetic compass", "mailbag, postbag", "mailbox, letter box", "maillot", "maillot, tank suit", "manhole cover", "maraca", "marimba, xylophone", "mask", "matchstick", "maypole", "maze, labyrinth", "measuring cup", "medicine chest, medicine cabinet", "megalith, megalithic structure", "microphone, mike", "microwave, microwave oven", "military uniform", "milk can", "minibus", "miniskirt, mini", "minivan", "missile", "mitten", "mixing bowl", "mobile home, manufactured home", "model t", "modem", "monastery", "monitor", "moped", "mortar", "mortarboard", "mosque", "mosquito net", 
"motor scooter, scooter", "mountain bike, all-terrain bike, off-roader", "mountain tent", "mouse, computer mouse", "mousetrap", "moving van", "muzzle", "nail", "neck brace", "necklace", "nipple", "notebook, notebook computer", "obelisk", "oboe, hautboy, hautbois", "ocarina, sweet potato", "odometer, hodometer, mileometer, milometer", "oil filter", "organ, pipe organ", "oscilloscope, scope, cathode-ray oscilloscope, cro", "overskirt", "oxcart", "oxygen mask", "packet", "paddle, boat paddle", "paddlewheel, paddle wheel", "padlock", "paintbrush", "pajama, pyjama, pj's, jammies", "palace", "panpipe, pandean pipe, syrinx", "paper towel", "parachute, chute", "parallel bars, bars", "park bench", "parking meter", "passenger car, coach, carriage", "patio, terrace", "pay-phone, pay-station", "pedestal, plinth, footstall", "pencil box, pencil case", "pencil sharpener", "perfume, essence", "petri dish", "photocopier", "pick, plectrum, plectron", "pickelhaube", "picket fence, paling", "pickup, pickup truck", "pier", "piggy bank, penny bank", "pill bottle", "pillow", "ping-pong ball", "pinwheel", "pirate, pirate ship", "pitcher, ewer", "plane, carpenter's plane, woodworking plane", "planetarium", "plastic bag", "plate rack", "plow, plough", "plunger, plumber's helper", "polaroid camera, polaroid land camera", "pole", "police van, police wagon, paddy wagon, patrol wagon, wagon, black maria", "poncho", "pool table, billiard table, snooker table", "pop bottle, soda bottle", "pot, flowerpot", "potter's wheel", "power drill", "prayer rug, prayer mat", "printer", "prison, prison house", "projectile, missile", "projector", "puck, hockey puck", "punching bag, punch bag, punching ball, punchball", "purse", "quill, quill pen", "quilt, comforter, comfort, puff", "racer, race car, racing car", "racket, racquet", "radiator", "radio, wireless", "radio telescope, radio reflector", "rain barrel", "recreational vehicle, rv, r.v.", "reel", "reflex camera", "refrigerator, icebox", "remote control, 
remote", "restaurant, eating house, eating place, eatery", "revolver, six-gun, six-shooter", "rifle", "rocking chair, rocker", "rotisserie", "rubber eraser, rubber, pencil eraser", "rugby ball", "rule, ruler", "running shoe", "safe", "safety pin", "saltshaker, salt shaker", "sandal", "sarong", "sax, saxophone", "scabbard", "scale, weighing machine", "school bus", "schooner", "scoreboard", "screen, crt screen", "screw", "screwdriver", "seat belt, seatbelt", "sewing machine", "shield, buckler", "shoe shop, shoe-shop, shoe store", "shoji", "shopping basket", "shopping cart", "shovel", "shower cap", "shower curtain", "ski", "ski mask", "sleeping bag", "slide rule, slipstick", "sliding door", "slot, one-armed bandit", "snorkel", "snowmobile", "snowplow, snowplough", "soap dispenser", "soccer ball", "sock", "solar dish, solar collector, solar furnace", "sombrero", "soup bowl", "space bar", "space heater", "space shuttle", "spatula", "speedboat", "spider web, spider's web", "spindle", "sports car, sport car", "spotlight, spot", "stage", "steam locomotive", "steel arch bridge", "steel drum", "stethoscope", "stole", "stone wall", "stopwatch, stop watch", "stove", "strainer", "streetcar, tram, tramcar, trolley, trolley car", "stretcher", "studio couch, day bed", "stupa, tope", "submarine, pigboat, sub, u-boat", "suit, suit of clothes", "sundial", "sunglass", "sunglasses, dark glasses, shades", "sunscreen, sunblock, sun blocker", "suspension bridge", "swab, swob, mop", "sweatshirt", "swimming trunks, bathing trunks", "swing", "switch, electric switch, electrical switch", "syringe", "table lamp", "tank, army tank, armored combat vehicle, armoured combat vehicle", "tape player", "teapot", "teddy, teddy bear", "television, television system", "tennis ball", "thatch, thatched roof", "theater curtain, theatre curtain", "thimble", "thresher, thrasher, threshing machine", "throne", "tile roof", "toaster", "tobacco shop, tobacconist shop, tobacconist", "toilet seat", "torch", "totem 
pole", "tow truck, tow car, wrecker", "toyshop", "tractor", "trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi", "tray", "trench coat", "tricycle, trike, velocipede", "trimaran", "tripod", "triumphal arch", "trolleybus, trolley coach, trackless trolley", "trombone", "tub, vat", "turnstile", "typewriter keyboard", "umbrella", "unicycle, monocycle", "upright, upright piano", "vacuum, vacuum cleaner", "vase", "vault", "velvet", "vending machine", "vestment", "viaduct", "violin, fiddle", "volleyball", "waffle iron", "wall clock", "wallet, billfold, notecase, pocketbook", "wardrobe, closet, press", "warplane, military plane", "washbasin, handbasin, washbowl, lavabo, wash-hand basin", "washer, automatic washer, washing machine", "water bottle", "water jug", "water tower", "whiskey jug", "whistle", "wig", "window screen", "window shade", "windsor tie", "wine bottle", "wing", "wok", "wooden spoon", "wool, woolen, woollen", "worm fence, snake fence, snake-rail fence, virginia fence", "wreck", "yawl", "yurt", "web site, website, internet site, site", "comic book", "crossword puzzle, crossword", "street sign", "traffic light, traffic signal, stoplight", "book jacket, dust cover, dust jacket, dust wrapper", "menu", "plate", "guacamole", "consomme", "hot pot, hotpot", "trifle", "ice cream, icecream", "ice lolly, lolly, lollipop, popsicle", "french loaf", "bagel, beigel", "pretzel", "cheeseburger", "hotdog, hot dog, red hot", "mashed potato", "head cabbage", "broccoli", "cauliflower", "zucchini, courgette", "spaghetti squash", "acorn squash", "butternut squash", "cucumber, cuke", "artichoke, globe artichoke", "bell pepper", "cardoon", "mushroom", "granny smith", "strawberry", "orange", "lemon", "fig", "pineapple, ananas", "banana", "jackfruit, jak, jack", "custard apple", "pomegranate", "hay", "carbonara", "chocolate sauce, chocolate syrup", "dough", "meat loaf, meatloaf", "pizza, pizza pie", "potpie", "burrito", "red wine", "espresso", "cup", "eggnog", 
"alp", "bubble", "cliff, drop, drop-off", "coral reef", "geyser", "lakeside, lakeshore", "promontory, headland, head, foreland", "sandbar, sand bar", "seashore, coast, seacoast, sea-coast", "valley, vale", "volcano", "ballplayer, baseball player", "groom, bridegroom", "scuba diver", "rapeseed", "daisy", "yellow lady's slipper, yellow lady-slipper, cypripedium calceolus, cypripedium parviflorum", "corn", "acorn", "hip, rose hip, rosehip", "buckeye, horse chestnut, conker", "coral fungus", "agaric", "gyromitra", "stinkhorn, carrion fungus", "earthstar", "hen-of-the-woods, hen of the woods, polyporus frondosus, grifola frondosa", "bolete", "ear, spike, capitulum", "toilet tissue, toilet paper, bathroom tissue" ]
jordyvl/vit-base_rvl-cdip-small_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5_rand
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vit-base_rvl-cdip-small_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5_rand

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.9130
- Accuracy: 0.6115
- Brier Loss: 0.5924
- Nll: 2.8591
- F1 Micro: 0.6115
- F1 Macro: 0.6137
- Ece: 0.2201
- Aurc: 0.1762

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Brier Loss | Nll    | F1 Micro | F1 Macro | Ece    | Aurc   |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| 4.0737        | 1.0   | 1000  | 3.8362          | 0.1948   | 0.8785     | 4.2153 | 0.1948   | 0.1339   | 0.0877 | 0.6770 |
| 3.2534        | 2.0   | 2000  | 3.0129          | 0.3772   | 0.7858     | 3.2453 | 0.3773   | 0.3171   | 0.1532 | 0.4263 |
| 2.802         | 3.0   | 3000  | 2.6716          | 0.4472   | 0.7236     | 3.0740 | 0.4472   | 0.4150   | 0.1479 | 0.3537 |
| 2.5803        | 4.0   | 4000  | 2.4882          | 0.4775   | 0.6581     | 2.9147 | 0.4775   | 0.4658   | 0.0774 | 0.2981 |
| 2.4445        | 5.0   | 5000  | 2.3888          | 0.4753   | 0.6741     | 2.8504 | 0.4753   | 0.4562   | 0.1529 | 0.2912 |
| 2.2315        | 6.0   | 6000  | 2.2439          | 0.5132   | 0.6404     | 2.7536 | 0.5132   | 0.5007   | 0.1179 | 0.2659 |
| 2.0849        | 7.0   | 7000  | 2.2140          | 0.5125   | 0.6441     | 2.8160 | 0.5125   | 0.5119   | 0.1522 | 0.2634 |
| 1.922         | 8.0   | 8000  | 2.0941          | 0.5298   | 0.6202     | 2.7843 | 0.5298   | 0.5267   | 0.1511 | 0.2384 |
| 1.8235        | 9.0   | 9000  | 2.0477          | 0.5453   | 0.5974     | 2.6273 | 0.5453   | 0.5505   | 0.1131 | 0.2252 |
| 1.6864        | 10.0  | 10000 | 2.0429          | 0.5575   | 0.6181     | 2.7309 | 0.5575   | 0.5391   | 0.1828 | 0.2172 |
| 1.5498        | 11.0  | 11000 | 2.0547          | 0.5597   | 0.6272     | 2.6772 | 0.5597   | 0.5444   | 0.1931 | 0.2237 |
| 1.4502        | 12.0  | 12000 | 2.0086          | 0.5707   | 0.6175     | 2.7419 | 0.5707   | 0.5592   | 0.1935 | 0.2153 |
| 1.3331        | 13.0  | 13000 | 2.0399          | 0.566    | 0.6150     | 2.8462 | 0.566    | 0.5660   | 0.1786 | 0.2175 |
| 1.19          | 14.0  | 14000 | 2.0207          | 0.5805   | 0.6209     | 2.7599 | 0.5805   | 0.5793   | 0.2133 | 0.2081 |
| 1.0615        | 15.0  | 15000 | 2.0089          | 0.5793   | 0.6224     | 2.9014 | 0.5793   | 0.5787   | 0.2157 | 0.2077 |
| 0.9517        | 16.0  | 16000 | 2.0644          | 0.5765   | 0.6300     | 2.9717 | 0.5765   | 0.5796   | 0.2264 | 0.2047 |
| 0.8343        | 17.0  | 17000 | 2.0990          | 0.5745   | 0.6473     | 3.0444 | 0.5745   | 0.5712   | 0.2502 | 0.2050 |
| 0.7191        | 18.0  | 18000 | 2.1704          | 0.5763   | 0.6527     | 3.0809 | 0.5763   | 0.5721   | 0.2539 | 0.2125 |
| 0.638         | 19.0  | 19000 | 2.1930          | 0.5807   | 0.6572     | 3.1618 | 0.5807   | 0.5792   | 0.2620 | 0.2120 |
| 0.5528        | 20.0  | 20000 | 2.1731          | 0.5885   | 0.6542     | 3.1412 | 0.5885   | 0.5898   | 0.2627 | 0.2067 |
| 0.4957        | 21.0  | 21000 | 2.2492          | 0.5763   | 0.6708     | 3.1701 | 0.5763   | 0.5760   | 0.2700 | 0.2232 |
| 0.4096        | 22.0  | 22000 | 2.3164          | 0.5707   | 0.6837     | 3.3874 | 0.5707   | 0.5706   | 0.2761 | 0.2260 |
| 0.3915        | 23.0  | 23000 | 2.3277          | 0.58     | 0.6813     | 3.4230 | 0.58     | 0.5814   | 0.2857 | 0.2165 |
| 0.3412        | 24.0  | 24000 | 2.2947          | 0.5813   | 0.6779     | 3.3373 | 0.5813   | 0.5854   | 0.2870 | 0.2076 |
| 0.3171        | 25.0  | 25000 | 2.2743          | 0.586    | 0.6720     | 3.3310 | 0.586    | 0.5848   | 0.2767 | 0.2104 |
| 0.2734        | 26.0  | 26000 | 2.2762          | 0.593    | 0.6696     | 3.3439 | 0.593    | 0.5977   | 0.2819 | 0.2044 |
| 0.2535        | 27.0  | 27000 | 2.2205          | 0.5845   | 0.6605     | 3.2712 | 0.5845   | 0.5864   | 0.2684 | 0.2060 |
| 0.229         | 28.0  | 28000 | 2.2961          | 0.5845   | 0.6821     | 3.2738 | 0.5845   | 0.5902   | 0.2896 | 0.2106 |
| 0.2202        | 29.0  | 29000 | 2.2698          | 0.5845   | 0.6752     | 3.2127 | 0.5845   | 0.5845   | 0.2835 | 0.2128 |
| 0.1922        | 30.0  | 30000 | 2.2511          | 0.5787   | 0.6731     | 3.3305 | 0.5787   | 0.5819   | 0.2770 | 0.2065 |
| 0.1857        | 31.0  | 31000 | 2.1847          | 0.5863   | 0.6598     | 3.2211 | 0.5863   | 0.5889   | 0.2692 | 0.2036 |
| 0.1678        | 32.0  | 32000 | 2.1752          | 0.5913   | 0.6551     | 3.0691 | 0.5913   | 0.5926   | 0.2680 | 0.2010 |
| 0.1587        | 33.0  | 33000 | 2.1107          | 0.5972   | 0.6392     | 3.0412 | 0.5972   | 0.6002   | 0.2495 | 0.2002 |
| 0.1432        | 34.0  | 34000 | 2.2079          | 0.5893   | 0.6593     | 3.1988 | 0.5893   | 0.5917   | 0.2640 | 0.2092 |
| 0.1291        | 35.0  | 35000 | 2.0788          | 0.592    | 0.6388     | 2.9891 | 0.592    | 0.5942   | 0.2587 | 0.1887 |
| 0.1245        | 36.0  | 36000 | 2.0521          | 0.601    | 0.6297     | 2.9432 | 0.601    | 0.6018   | 0.2464 | 0.1923 |
| 0.1169        | 37.0  | 37000 | 2.0669          | 0.5935   | 0.6294     | 3.0364 | 0.5935   | 0.5932   | 0.2425 | 0.1878 |
| 0.1068        | 38.0  | 38000 | 2.0197          | 0.606    | 0.6183     | 2.9448 | 0.606    | 0.6054   | 0.2375 | 0.1897 |
| 0.1031        | 39.0  | 39000 | 2.0022          | 0.6032   | 0.6163     | 2.8697 | 0.6032   | 0.6051   | 0.2429 | 0.1880 |
| 0.0961        | 40.0  | 40000 | 1.9965          | 0.6058   | 0.6147     | 2.9143 | 0.6058   | 0.6091   | 0.2393 | 0.1859 |
| 0.093         | 41.0  | 41000 | 1.9735          | 0.6082   | 0.6080     | 2.9346 | 0.6082   | 0.6100   | 0.2331 | 0.1834 |
| 0.0859        | 42.0  | 42000 | 1.9383          | 0.6092   | 0.6011     | 2.8692 | 0.6092   | 0.6105   | 0.2281 | 0.1792 |
| 0.0817        | 43.0  | 43000 | 1.9413          | 0.6115   | 0.5999     | 2.8908 | 0.6115   | 0.6124   | 0.2264 | 0.1802 |
| 0.0779        | 44.0  | 44000 | 1.9271          | 0.6112   | 0.6001     | 2.8695 | 0.6112   | 0.6127   | 0.2274 | 0.1791 |
| 0.0745        | 45.0  | 45000 | 1.9259          | 0.6135   | 0.5983     | 2.8467 | 0.6135   | 0.6154   | 0.2206 | 0.1786 |
| 0.0723        | 46.0  | 46000 | 1.9214          | 0.612    | 0.5964     | 2.8289 | 0.612    | 0.6136   | 0.2233 | 0.1778 |
| 0.0673        | 47.0  | 47000 | 1.9117          | 0.6128   | 0.5923     | 2.8226 | 0.6128   | 0.6148   | 0.2197 | 0.1768 |
| 0.0649        | 48.0  | 48000 | 1.9111          | 0.6152   | 0.5911     | 2.8575 | 0.6152   | 0.6164   | 0.2166 | 0.1757 |
| 0.0651        | 49.0  | 49000 | 1.9106          | 0.6148   | 0.5914     | 2.8675 | 0.6148   | 0.6169   | 0.2164 | 0.1761 |
| 0.0629        | 50.0  | 50000 | 1.9130          | 0.6115   | 0.5924     | 2.8591 | 0.6115   | 0.6137   | 0.2201 | 0.1762 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
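The card above reports Brier Loss and Ece alongside accuracy. As a reference point, here is a minimal sketch of how these calibration metrics are commonly computed — standard definitions with a 10-bin equal-width ECE; this is an illustration, not necessarily the exact evaluation code behind this card:

```python
import numpy as np

def brier_loss(probs, labels):
    # probs: (n, k) softmax outputs; labels: (n,) integer class ids.
    # Mean squared error between the probability vector and the one-hot label.
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=10):
    conf = probs.max(axis=1)        # top-1 confidence per sample
    pred = probs.argmax(axis=1)     # top-1 prediction per sample
    correct = (pred == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            # |accuracy - confidence| in the bin, weighted by bin size
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return ece

probs = np.array([[0.9, 0.1], [0.6, 0.4], [0.2, 0.8]])
labels = np.array([0, 1, 1])
print(brier_loss(probs, labels))                  # ≈ 0.2733
print(expected_calibration_error(probs, labels))  # 0.3
```

With definitions like these, a lower Brier loss reflects both accuracy and confidence quality, while ECE isolates the gap between stated confidence and observed accuracy.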
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
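The 16-class RVL-CDIP label list above gives the class-index order by position. A sketch of the `id2label`/`label2id` dictionaries that an image-classification config is typically built from (the dict construction here is illustrative, not this repository's actual config code):

```python
# Class names in index order, as in the label list above.
labels = [
    "letter", "form", "email", "handwritten", "advertisement",
    "scientific_report", "scientific_publication", "specification",
    "file_folder", "news_article", "budget", "invoice", "presentation",
    "questionnaire", "resume", "memo",
]

# Forward and reverse mappings between class index and class name.
id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in enumerate(labels)}

print(id2label[0], label2id["memo"])  # letter 15
```

These mappings are what turn a raw logit index from the classifier head back into a human-readable document category.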
rriverar75/vit-model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vit-model

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0189
- Accuracy: 1.0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1527        | 3.85  | 500  | 0.0189          | 1.0      |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
[ "angular_leaf_spot", "bean_rust", "healthy" ]
jordyvl/vit-base_rvl-cdip-small_rvl_cdip-NK1000_kd_MSE_rand
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vit-base_rvl-cdip-small_rvl_cdip-NK1000_kd_MSE_rand

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8136
- Accuracy: 0.6262
- Brier Loss: 0.5539
- Nll: 2.6914
- F1 Micro: 0.6262
- F1 Macro: 0.6298
- Ece: 0.1916
- Aurc: 0.1624

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Brier Loss | Nll    | F1 Micro | F1 Macro | Ece    | Aurc   |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| 4.1455        | 1.0   | 1000  | 3.9403          | 0.1898   | 0.8859     | 5.5812 | 0.1898   | 0.1327   | 0.0656 | 0.6790 |
| 3.4397        | 2.0   | 2000  | 3.3498          | 0.346    | 0.8036     | 4.5993 | 0.346    | 0.2909   | 0.1060 | 0.4698 |
| 2.983         | 3.0   | 3000  | 3.1592          | 0.384    | 0.8070     | 3.7537 | 0.384    | 0.3451   | 0.1795 | 0.4280 |
| 2.7406        | 4.0   | 4000  | 2.7614          | 0.4447   | 0.6956     | 3.6359 | 0.4447   | 0.4361   | 0.0840 | 0.3307 |
| 2.5937        | 5.0   | 5000  | 2.7073          | 0.4462   | 0.7000     | 3.5795 | 0.4462   | 0.4321   | 0.1292 | 0.3212 |
| 2.3878        | 6.0   | 6000  | 2.3935          | 0.5012   | 0.6651     | 3.1888 | 0.5012   | 0.4842   | 0.1429 | 0.2829 |
| 2.2284        | 7.0   | 7000  | 2.3189          | 0.5022   | 0.6378     | 3.0628 | 0.5022   | 0.5109   | 0.1027 | 0.2630 |
| 2.0759        | 8.0   | 8000  | 2.3408          | 0.4993   | 0.6550     | 3.3921 | 0.4993   | 0.4923   | 0.1398 | 0.2640 |
| 1.9764        | 9.0   | 9000  | 2.0531          | 0.5563   | 0.5946     | 2.9188 | 0.5563   | 0.5619   | 0.1202 | 0.2170 |
| 1.8232        | 10.0  | 10000 | 2.1083          | 0.5505   | 0.6295     | 3.0945 | 0.5505   | 0.5445   | 0.1794 | 0.2322 |
| 1.7049        | 11.0  | 11000 | 2.0447          | 0.5653   | 0.6142     | 2.9718 | 0.5653   | 0.5605   | 0.1684 | 0.2207 |
| 1.6182        | 12.0  | 12000 | 2.0684          | 0.5637   | 0.6462     | 3.0100 | 0.5637   | 0.5595   | 0.2095 | 0.2272 |
| 1.4886        | 13.0  | 13000 | 1.9374          | 0.5735   | 0.6132     | 2.9415 | 0.5735   | 0.5806   | 0.1874 | 0.2042 |
| 1.3538        | 14.0  | 14000 | 2.0147          | 0.5895   | 0.6174     | 3.0835 | 0.5895   | 0.5851   | 0.1966 | 0.2109 |
| 1.2304        | 15.0  | 15000 | 1.9766          | 0.5867   | 0.6203     | 3.1471 | 0.5867   | 0.5846   | 0.2229 | 0.2091 |
| 1.1124        | 16.0  | 16000 | 1.8998          | 0.6008   | 0.6044     | 2.9169 | 0.6008   | 0.5943   | 0.2144 | 0.1911 |
| 1.0197        | 17.0  | 17000 | 1.9309          | 0.5955   | 0.6123     | 3.1166 | 0.5955   | 0.5979   | 0.2299 | 0.1876 |
| 0.8763        | 18.0  | 18000 | 1.9741          | 0.5952   | 0.6316     | 3.2227 | 0.5952   | 0.5971   | 0.2439 | 0.1957 |
| 0.8042        | 19.0  | 19000 | 1.9944          | 0.592    | 0.6318     | 3.1537 | 0.592    | 0.5898   | 0.2439 | 0.2024 |
| 0.7059        | 20.0  | 20000 | 1.9259          | 0.6082   | 0.6124     | 3.0665 | 0.6082   | 0.6093   | 0.2344 | 0.1889 |
| 0.632         | 21.0  | 21000 | 1.9444          | 0.6095   | 0.6148     | 3.0133 | 0.6095   | 0.6111   | 0.2281 | 0.1917 |
| 0.5641        | 22.0  | 22000 | 1.9830          | 0.5968   | 0.6282     | 3.0999 | 0.5968   | 0.5984   | 0.2442 | 0.1913 |
| 0.5138        | 23.0  | 23000 | 2.0190          | 0.5962   | 0.6331     | 3.1937 | 0.5962   | 0.5966   | 0.2501 | 0.2033 |
| 0.457         | 24.0  | 24000 | 1.9340          | 0.6075   | 0.6151     | 2.9559 | 0.6075   | 0.6096   | 0.2333 | 0.1888 |
| 0.3999        | 25.0  | 25000 | 1.9742          | 0.6048   | 0.6285     | 3.0455 | 0.6048   | 0.6080   | 0.2461 | 0.1939 |
| 0.3629        | 26.0  | 26000 | 1.9308          | 0.6142   | 0.6027     | 3.1686 | 0.6142   | 0.6169   | 0.2244 | 0.1850 |
| 0.3132        | 27.0  | 27000 | 1.9468          | 0.6175   | 0.6076     | 3.0271 | 0.6175   | 0.6189   | 0.2374 | 0.1863 |
| 0.2818        | 28.0  | 28000 | 1.9392          | 0.6095   | 0.6152     | 3.0499 | 0.6095   | 0.6079   | 0.2422 | 0.1894 |
| 0.2584        | 29.0  | 29000 | 1.8976          | 0.6202   | 0.6040     | 2.9355 | 0.6202   | 0.6204   | 0.2340 | 0.1834 |
| 0.228         | 30.0  | 30000 | 1.9111          | 0.617    | 0.6020     | 3.0272 | 0.617    | 0.6192   | 0.2336 | 0.1780 |
| 0.2041        | 31.0  | 31000 | 1.8513          | 0.6272   | 0.5835     | 2.8808 | 0.6272   | 0.6293   | 0.2222 | 0.1733 |
| 0.1834        | 32.0  | 32000 | 1.8501          | 0.6262   | 0.5782     | 2.8280 | 0.6262   | 0.6275   | 0.2142 | 0.1702 |
| 0.1613        | 33.0  | 33000 | 1.8250          | 0.6292   | 0.5712     | 2.8863 | 0.6292   | 0.6338   | 0.2021 | 0.1691 |
| 0.1437        | 34.0  | 34000 | 1.8457          | 0.6228   | 0.5773     | 2.9046 | 0.6228   | 0.6232   | 0.2114 | 0.1717 |
| 0.1275        | 35.0  | 35000 | 1.8088          | 0.6315   | 0.5646     | 2.8124 | 0.6315   | 0.6328   | 0.2039 | 0.1638 |
| 0.1127        | 36.0  | 36000 | 1.8204          | 0.6335   | 0.5647     | 2.7943 | 0.6335   | 0.6373   | 0.1993 | 0.1661 |
| 0.1026        | 37.0  | 37000 | 1.8070          | 0.631    | 0.5641     | 2.7537 | 0.631    | 0.6326   | 0.2015 | 0.1634 |
| 0.0894        | 38.0  | 38000 | 1.8068          | 0.63     | 0.5606     | 2.7461 | 0.63     | 0.6317   | 0.1998 | 0.1634 |
| 0.0785        | 39.0  | 39000 | 1.7894          | 0.6312   | 0.5550     | 2.7333 | 0.6312   | 0.6351   | 0.1963 | 0.1599 |
| 0.0696        | 40.0  | 40000 | 1.7996          | 0.6288   | 0.5607     | 2.7489 | 0.6288   | 0.6334   | 0.1986 | 0.1645 |
| 0.0626        | 41.0  | 41000 | 1.7963          | 0.6328   | 0.5532     | 2.7232 | 0.6328   | 0.6349   | 0.1933 | 0.1632 |
| 0.055         | 42.0  | 42000 | 1.7959          | 0.6268   | 0.5556     | 2.6877 | 0.6268   | 0.6298   | 0.1957 | 0.1617 |
| 0.0475        | 43.0  | 43000 | 1.8018          | 0.632    | 0.5522     | 2.7232 | 0.632    | 0.6354   | 0.1934 | 0.1598 |
| 0.0419        | 44.0  | 44000 | 1.7930          | 0.6325   | 0.5507     | 2.6842 | 0.6325   | 0.6361   | 0.1906 | 0.1612 |
| 0.0367        | 45.0  | 45000 | 1.8064          | 0.6265   | 0.5577     | 2.6772 | 0.6265   | 0.6299   | 0.1994 | 0.1632 |
| 0.0328        | 46.0  | 46000 | 1.8044          | 0.6228   | 0.5524     | 2.6611 | 0.6228   | 0.6263   | 0.1971 | 0.1620 |
| 0.0289        | 47.0  | 47000 | 1.8101          | 0.6248   | 0.5544     | 2.6841 | 0.6248   | 0.6284   | 0.1943 | 0.1624 |
| 0.0265        | 48.0  | 48000 | 1.8088          | 0.6242   | 0.5531     | 2.6870 | 0.6242   | 0.6283   | 0.1943 | 0.1622 |
| 0.0238        | 49.0  | 49000 | 1.8107          | 0.6255   | 0.5533     | 2.7007 | 0.6255   | 0.6292   | 0.1923 | 0.1621 |
| 0.022         | 50.0  | 50000 | 1.8136          | 0.6262   | 0.5539     | 2.6914 | 0.6262   | 0.6298   | 0.1916 | 0.1624 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
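The hyperparameters above specify a linear scheduler with `lr_scheduler_warmup_ratio: 0.1` over 50 epochs of 1000 steps each. A small sketch of the resulting learning-rate shape, assuming the usual linear-warmup-then-linear-decay behavior (an illustration, not the trainer's actual scheduler code):

```python
def linear_schedule_with_warmup(step, total_steps=50_000,
                                warmup_ratio=0.1, peak_lr=1e-4):
    """Linear ramp from 0 to peak_lr over the first warmup_ratio of
    training, then linear decay back to 0 at the final step."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

print(linear_schedule_with_warmup(2_500))   # mid-warmup: half of peak (5e-05)
print(linear_schedule_with_warmup(5_000))   # end of warmup: peak (1e-04)
print(linear_schedule_with_warmup(50_000))  # final step: 0.0
```

With these settings the peak learning rate of 1e-4 is reached at step 5000 (10% of the 50,000 training steps) and decays to zero by the last step.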
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
TirathP/vit-base-patch16-224-finetuned-customData
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. -->

# TirathP/vit-base-patch16-224-finetuned-customData

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.2775
- Validation Loss: 0.3297
- Validation Accuracy: 0.8571
- Epoch: 19

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 5e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32

### Training results

| Train Loss | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:---------------:|:-------------------:|:-----:|
| 1.1397     | 1.0223          | 0.5714              | 0     |
| 0.8312     | 0.8338          | 0.5714              | 1     |
| 0.7131     | 0.7099          | 0.5714              | 2     |
| 0.5754     | 0.6120          | 0.7143              | 3     |
| 0.4804     | 0.5374          | 0.7143              | 4     |
| 0.3934     | 0.4630          | 0.8571              | 5     |
| 0.4258     | 0.3979          | 0.8571              | 6     |
| 0.3739     | 0.3455          | 1.0                 | 7     |
| 0.3143     | 0.2909          | 1.0                 | 8     |
| 0.3113     | 0.2572          | 0.8571              | 9     |
| 0.3327     | 0.2623          | 0.8571              | 10    |
| 0.2227     | 0.2993          | 0.8571              | 11    |
| 0.2860     | 0.3299          | 0.8571              | 12    |
| 0.2081     | 0.3553          | 0.8571              | 13    |
| 0.2243     | 0.3360          | 0.8571              | 14    |
| 0.2246     | 0.2942          | 0.8571              | 15    |
| 0.2570     | 0.2131          | 0.8571              | 16    |
| 0.3173     | 0.1850          | 0.8571              | 17    |
| 0.1572     | 0.2134          | 0.8571              | 18    |
| 0.2775     | 0.3297          | 0.8571              | 19    |

### Framework versions

- Transformers 4.31.0
- TensorFlow 2.12.0
- Datasets 2.14.4
- Tokenizers 0.13.3
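The card above trains with `AdamWeightDecay` (`weight_decay_rate: 0.01`). A toy numpy sketch of a single decoupled-weight-decay Adam step — the assumed standard AdamW update, not the Keras implementation — illustrating that the decay acts on the weights directly rather than being folded into the gradient:

```python
import numpy as np

def adamw_step(w, g, m, v, t, lr=5e-5, beta1=0.9, beta2=0.999,
               eps=1e-7, weight_decay=0.01):
    """One Adam step with decoupled weight decay (sketch)."""
    m = beta1 * m + (1 - beta1) * g          # first-moment estimate
    v = beta2 * v + (1 - beta2) * g * g      # second-moment estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction
    v_hat = v / (1 - beta2 ** t)
    # The decay term weight_decay * w is added outside the moment
    # estimates -- this is what distinguishes AdamW from Adam + L2.
    w = w - lr * (m_hat / (np.sqrt(v_hat) + eps) + weight_decay * w)
    return w, m, v

# With a zero gradient, only the decay term moves the weight:
w, m, v = adamw_step(np.array(1.0), np.array(0.0), 0.0, 0.0, t=1)
print(w)  # 1 - 5e-5 * 0.01 * 1 = 0.9999995
```

Because the decay bypasses the adaptive moment scaling, its effective strength does not shrink for parameters with large gradient variance, which is the usual motivation for the decoupled form.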
[ "highway", "industrial", "pasture" ]
TirathP/Classifier
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. -->

# TirathP/food_classifier

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.6822
- Validation Loss: 0.6966
- Train Accuracy: 1.0
- Epoch: 4

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 20, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32

### Training results

| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 1.0773     | 0.9665          | 1.0            | 0     |
| 0.9585     | 0.8375          | 1.0            | 1     |
| 0.8571     | 0.7712          | 1.0            | 2     |
| 0.7833     | 0.7278          | 1.0            | 3     |
| 0.6822     | 0.6966          | 1.0            | 4     |

### Framework versions

- Transformers 4.31.0
- TensorFlow 2.12.0
- Datasets 2.14.4
- Tokenizers 0.13.3
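The optimizer above wraps a `PolynomialDecay` schedule (`initial_learning_rate: 3e-05`, `decay_steps: 20`, `end_learning_rate: 0.0`, `power: 1.0`, `cycle: False`), which with power 1.0 reduces to a straight linear decay over 20 steps. A sketch under those assumptions (illustrative, not the Keras source):

```python
def polynomial_decay(step, initial_lr=3e-5, decay_steps=20,
                     end_lr=0.0, power=1.0):
    """Polynomial LR decay with cycle=False: clamp the step to
    decay_steps, then interpolate from initial_lr down to end_lr."""
    step = min(step, decay_steps)
    frac = 1.0 - step / decay_steps
    return (initial_lr - end_lr) * frac ** power + end_lr

print(polynomial_decay(0))   # 3e-05 (start)
print(polynomial_decay(10))  # 1.5e-05 (halfway)
print(polynomial_decay(20))  # 0.0 (fully decayed, stays there)
```

With only 20 decay steps, the learning rate reaches zero very early; any steps beyond that are taken at `end_learning_rate`, which may explain the flat late-epoch metrics in the table above.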
[ "highway", "industrial", "pasture" ]
jordyvl/vit-base_rvl-cdip-small_rvl_cdip-NK1000_kd_NKD_t1.0_g1.5_rand
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-small_rvl_cdip-NK1000_kd_NKD_t1.0_g1.5_rand This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset. It achieves the following results on the evaluation set: - Loss: 6.1637 - Accuracy: 0.6275 - Brier Loss: 0.6026 - Nll: 2.9068 - F1 Micro: 0.6275 - F1 Macro: 0.6313 - Ece: 0.2499 - Aurc: 0.1609 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 6.3322 | 1.0 | 1000 | 6.0794 | 0.1835 | 0.8928 | 6.5679 | 0.1835 | 0.1322 | 0.0627 | 0.6846 | | 5.8198 | 2.0 | 2000 | 5.5963 | 0.3668 | 0.7821 | 3.5543 | 0.3668 | 0.3217 | 0.0967 | 0.4448 | | 5.53 | 3.0 | 3000 | 5.4184 | 0.4225 | 0.7382 | 3.4217 | 0.4225 | 0.3848 | 0.1087 | 0.3778 | | 5.3449 | 4.0 | 4000 | 5.1895 | 0.4655 | 0.6813 | 3.0794 | 0.4655 | 0.4562 | 0.1076 | 0.3029 | | 5.2467 | 5.0 | 5000 | 5.1813 | 0.4592 | 0.6845 | 2.9944 | 0.4592 | 0.4430 | 0.1009 | 0.3125 | | 5.1382 | 6.0 | 6000 | 5.0102 | 0.4998 | 0.6423 | 2.7804 | 0.4998 | 0.4926 | 0.1013 | 0.2660 | | 5.0255 | 7.0 | 7000 | 4.9611 | 0.501 | 0.6350 | 2.7692 | 0.501 | 0.5085 | 0.0795 | 
0.2690 | | 4.9089 | 8.0 | 8000 | 4.9327 | 0.508 | 0.6204 | 2.6580 | 0.508 | 0.5068 | 0.0622 | 0.2565 | | 4.8337 | 9.0 | 9000 | 4.8324 | 0.5467 | 0.5866 | 2.5636 | 0.5467 | 0.5419 | 0.0642 | 0.2274 | | 4.747 | 10.0 | 10000 | 5.0170 | 0.5302 | 0.6080 | 2.7672 | 0.5302 | 0.5193 | 0.0622 | 0.2452 | | 4.622 | 11.0 | 11000 | 4.8259 | 0.5593 | 0.5709 | 2.6791 | 0.5593 | 0.5520 | 0.0619 | 0.2090 | | 4.5449 | 12.0 | 12000 | 4.7696 | 0.5675 | 0.5583 | 2.5273 | 0.5675 | 0.5678 | 0.0541 | 0.2016 | | 4.447 | 13.0 | 13000 | 4.8718 | 0.5575 | 0.5775 | 2.7597 | 0.5575 | 0.5557 | 0.0575 | 0.2142 | | 4.341 | 14.0 | 14000 | 4.7644 | 0.5897 | 0.5368 | 2.5797 | 0.5897 | 0.5930 | 0.0560 | 0.1835 | | 4.2476 | 15.0 | 15000 | 4.8339 | 0.5905 | 0.5485 | 2.6684 | 0.5905 | 0.5903 | 0.0719 | 0.1872 | | 4.1592 | 16.0 | 16000 | 4.7828 | 0.5877 | 0.5456 | 2.7300 | 0.5877 | 0.5877 | 0.0784 | 0.1832 | | 4.0513 | 17.0 | 17000 | 4.8771 | 0.5885 | 0.5533 | 2.9097 | 0.5885 | 0.5930 | 0.0965 | 0.1867 | | 3.9646 | 18.0 | 18000 | 4.8980 | 0.596 | 0.5499 | 2.8383 | 0.596 | 0.5948 | 0.1025 | 0.1797 | | 3.8768 | 19.0 | 19000 | 4.9787 | 0.605 | 0.5551 | 2.8903 | 0.605 | 0.6050 | 0.1302 | 0.1765 | | 3.7739 | 20.0 | 20000 | 5.1202 | 0.5945 | 0.5727 | 3.0393 | 0.5945 | 0.5935 | 0.1493 | 0.1821 | | 3.7023 | 21.0 | 21000 | 5.1879 | 0.5998 | 0.5785 | 2.9570 | 0.5998 | 0.5991 | 0.1690 | 0.1807 | | 3.6301 | 22.0 | 22000 | 5.2707 | 0.5933 | 0.5908 | 3.1177 | 0.5933 | 0.5971 | 0.1863 | 0.1829 | | 3.5857 | 23.0 | 23000 | 5.2522 | 0.5887 | 0.5994 | 3.2051 | 0.5887 | 0.5949 | 0.1928 | 0.1857 | | 3.5256 | 24.0 | 24000 | 5.3443 | 0.6102 | 0.5857 | 2.9687 | 0.6102 | 0.6084 | 0.1953 | 0.1760 | | 3.4954 | 25.0 | 25000 | 5.3010 | 0.6045 | 0.5874 | 3.0184 | 0.6045 | 0.6053 | 0.1851 | 0.1807 | | 3.46 | 26.0 | 26000 | 5.4451 | 0.5992 | 0.5994 | 3.0539 | 0.5992 | 0.6033 | 0.2053 | 0.1819 | | 3.4086 | 27.0 | 27000 | 5.4299 | 0.608 | 0.5913 | 3.1127 | 0.608 | 0.6082 | 0.2027 | 0.1751 | | 3.3769 | 28.0 | 28000 | 5.6979 | 0.601 | 
0.6236 | 3.1077 | 0.601 | 0.6024 | 0.2396 | 0.1777 | | 3.3238 | 29.0 | 29000 | 5.6090 | 0.611 | 0.6013 | 3.0875 | 0.611 | 0.6114 | 0.2238 | 0.1729 | | 3.3011 | 30.0 | 30000 | 5.6356 | 0.6105 | 0.5991 | 2.9450 | 0.6105 | 0.6123 | 0.2243 | 0.1719 | | 3.2708 | 31.0 | 31000 | 5.7634 | 0.604 | 0.6181 | 2.9119 | 0.604 | 0.6075 | 0.2402 | 0.1771 | | 3.2556 | 32.0 | 32000 | 5.7042 | 0.617 | 0.6002 | 2.9324 | 0.617 | 0.6199 | 0.2263 | 0.1740 | | 3.2213 | 33.0 | 33000 | 5.7388 | 0.603 | 0.6121 | 2.9240 | 0.603 | 0.6108 | 0.2345 | 0.1782 | | 3.2138 | 34.0 | 34000 | 5.8008 | 0.6218 | 0.6001 | 2.9209 | 0.6218 | 0.6206 | 0.2284 | 0.1701 | | 3.1994 | 35.0 | 35000 | 5.7350 | 0.6142 | 0.5967 | 2.9021 | 0.6142 | 0.6147 | 0.2294 | 0.1688 | | 3.1776 | 36.0 | 36000 | 5.7487 | 0.609 | 0.6032 | 2.8651 | 0.609 | 0.6121 | 0.2329 | 0.1689 | | 3.1606 | 37.0 | 37000 | 5.8022 | 0.6165 | 0.6075 | 2.8604 | 0.6165 | 0.6189 | 0.2398 | 0.1677 | | 3.1405 | 38.0 | 38000 | 5.8133 | 0.6235 | 0.5949 | 2.8775 | 0.6235 | 0.6272 | 0.2319 | 0.1640 | | 3.132 | 39.0 | 39000 | 5.8934 | 0.6232 | 0.5974 | 2.9324 | 0.6232 | 0.6274 | 0.2389 | 0.1639 | | 3.1303 | 40.0 | 40000 | 5.8902 | 0.6288 | 0.5947 | 2.9049 | 0.6288 | 0.6322 | 0.2344 | 0.1634 | | 3.1187 | 41.0 | 41000 | 5.9076 | 0.6215 | 0.5987 | 2.8584 | 0.6215 | 0.6261 | 0.2394 | 0.1630 | | 3.0969 | 42.0 | 42000 | 5.9469 | 0.6265 | 0.5984 | 2.8509 | 0.6265 | 0.6309 | 0.2375 | 0.1631 | | 3.0964 | 43.0 | 43000 | 5.9442 | 0.6252 | 0.5951 | 2.9309 | 0.6252 | 0.6291 | 0.2397 | 0.1607 | | 3.0953 | 44.0 | 44000 | 6.0126 | 0.6238 | 0.5998 | 2.8956 | 0.6238 | 0.6274 | 0.2419 | 0.1630 | | 3.0904 | 45.0 | 45000 | 6.0602 | 0.6295 | 0.5991 | 2.8669 | 0.6295 | 0.6334 | 0.2417 | 0.1609 | | 3.0794 | 46.0 | 46000 | 6.0782 | 0.6282 | 0.6027 | 2.8830 | 0.6282 | 0.6321 | 0.2442 | 0.1616 | | 3.0788 | 47.0 | 47000 | 6.1062 | 0.6275 | 0.6003 | 2.8472 | 0.6275 | 0.6316 | 0.2471 | 0.1610 | | 3.0802 | 48.0 | 48000 | 6.1079 | 0.6285 | 0.5998 | 2.8916 | 0.6285 | 0.6322 | 0.2465 | 0.1600 
| | 3.0644 | 49.0 | 49000 | 6.1569 | 0.6275 | 0.6025 | 2.8941 | 0.6275 | 0.6314 | 0.2497 | 0.1610 | | 3.0751 | 50.0 | 50000 | 6.1637 | 0.6275 | 0.6026 | 2.9068 | 0.6275 | 0.6313 | 0.2499 | 0.1609 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
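The Brier loss reported in the tables above is the mean squared difference between predicted class probabilities and the one-hot target. A minimal sketch of the multi-class version on toy data (assuming the common formulation that sums squared errors over classes and averages over examples; the evaluation code may differ in details):

```python
def brier_loss(probs, labels):
    """Multi-class Brier score: mean over examples of sum_k (p_k - y_k)^2."""
    total = 0.0
    for p, label in zip(probs, labels):
        total += sum((p_k - (1.0 if k == label else 0.0)) ** 2
                     for k, p_k in enumerate(p))
    return total / len(probs)

# A confident correct prediction scores near 0; a confident wrong one approaches 2.
print(brier_loss([[0.9, 0.05, 0.05]], [0]))  # ≈ 0.015
print(brier_loss([[0.9, 0.05, 0.05]], [1]))  # ≈ 1.715
```

Unlike accuracy, the score penalizes overconfident mistakes, which is why Brier loss can worsen late in training even while accuracy holds steady, as in the later epochs above.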
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
prasankumar93/nateraw-vit-base-beans-onnx
https://huggingface.co/nateraw/vit-base-beans with ONNX weights ```sh # command for conversion optimum-cli export onnx --model nateraw/vit-base-beans nateraw-vit-base-beans-onnx/ ```
[ "angular_leaf_spot", "bean_rust", "healthy" ]
jordyvl/vit-base_rvl-cdip-small_rvl_cdip-NK1000_hint_rand
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-small_rvl_cdip-NK1000_hint_rand This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset. It achieves the following results on the evaluation set: - Loss: 74.2895 - Accuracy: 0.6627 - Brier Loss: 0.6224 - Nll: 3.3689 - F1 Micro: 0.6627 - F1 Macro: 0.6637 - Ece: 0.3019 - Aurc: 0.1471 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 77.1398 | 1.0 | 1000 | 76.8452 | 0.1923 | 0.8714 | 4.4881 | 0.1923 | 0.1046 | 0.0814 | 0.6378 | | 75.9619 | 2.0 | 2000 | 75.9373 | 0.3513 | 0.7709 | 3.0790 | 0.3513 | 0.3110 | 0.0596 | 0.4537 | | 75.5047 | 3.0 | 3000 | 75.7291 | 0.4233 | 0.7112 | 3.0280 | 0.4233 | 0.3913 | 0.0610 | 0.3648 | | 75.4727 | 4.0 | 4000 | 75.5639 | 0.4163 | 0.7147 | 3.0030 | 0.4163 | 0.3863 | 0.0669 | 0.3662 | | 75.146 | 5.0 | 5000 | 75.4176 | 0.467 | 0.6695 | 2.8545 | 0.467 | 0.4530 | 0.0563 | 0.3180 | | 74.8201 | 6.0 | 6000 | 74.8222 | 0.5275 | 0.6023 | 2.6409 | 0.5275 | 0.5201 | 0.0587 | 0.2448 | | 74.4727 | 7.0 | 7000 | 74.6341 | 0.5403 | 0.5930 | 2.5700 | 0.5403 | 0.5312 | 
0.0707 | 0.2393 | | 74.1392 | 8.0 | 8000 | 74.6029 | 0.5615 | 0.5669 | 2.5716 | 0.5615 | 0.5496 | 0.0666 | 0.2142 | | 74.165 | 9.0 | 9000 | 74.4072 | 0.5863 | 0.5479 | 2.5087 | 0.5863 | 0.5793 | 0.0689 | 0.1969 | | 73.8821 | 10.0 | 10000 | 74.2595 | 0.5817 | 0.5517 | 2.4910 | 0.5817 | 0.5802 | 0.0733 | 0.1973 | | 73.6199 | 11.0 | 11000 | 74.2044 | 0.61 | 0.5233 | 2.4183 | 0.61 | 0.6001 | 0.0853 | 0.1722 | | 73.4772 | 12.0 | 12000 | 73.9341 | 0.593 | 0.5520 | 2.4592 | 0.593 | 0.5873 | 0.1244 | 0.1847 | | 73.2445 | 13.0 | 13000 | 73.9870 | 0.614 | 0.5368 | 2.5577 | 0.614 | 0.6093 | 0.1303 | 0.1706 | | 73.1468 | 14.0 | 14000 | 73.9027 | 0.6212 | 0.5368 | 2.6082 | 0.6212 | 0.6192 | 0.1340 | 0.1691 | | 72.9154 | 15.0 | 15000 | 73.7745 | 0.6298 | 0.5353 | 2.5866 | 0.6298 | 0.6260 | 0.1564 | 0.1598 | | 72.7416 | 16.0 | 16000 | 73.8946 | 0.6225 | 0.5528 | 2.5893 | 0.6225 | 0.6245 | 0.1810 | 0.1679 | | 72.4708 | 17.0 | 17000 | 73.9445 | 0.62 | 0.5757 | 2.7436 | 0.62 | 0.6217 | 0.2044 | 0.1714 | | 72.5169 | 18.0 | 18000 | 73.7757 | 0.6262 | 0.5741 | 2.6894 | 0.6262 | 0.6292 | 0.2139 | 0.1655 | | 72.2021 | 19.0 | 19000 | 73.9482 | 0.6192 | 0.6063 | 2.8813 | 0.6192 | 0.6175 | 0.2534 | 0.1706 | | 72.1296 | 20.0 | 20000 | 73.9725 | 0.6185 | 0.6135 | 2.9223 | 0.6185 | 0.6223 | 0.2495 | 0.1736 | | 72.1903 | 21.0 | 21000 | 74.0277 | 0.6285 | 0.6091 | 2.8760 | 0.6285 | 0.6307 | 0.2588 | 0.1638 | | 71.9868 | 22.0 | 22000 | 74.1811 | 0.6218 | 0.6317 | 3.0858 | 0.6218 | 0.6229 | 0.2792 | 0.1698 | | 71.9677 | 23.0 | 23000 | 74.1227 | 0.6222 | 0.6442 | 3.0329 | 0.6222 | 0.6214 | 0.2872 | 0.1764 | | 71.8254 | 24.0 | 24000 | 74.2927 | 0.6282 | 0.6412 | 3.1773 | 0.6282 | 0.6220 | 0.2928 | 0.1739 | | 71.7948 | 25.0 | 25000 | 74.1580 | 0.626 | 0.6498 | 3.1230 | 0.626 | 0.6286 | 0.3007 | 0.1703 | | 71.6915 | 26.0 | 26000 | 74.1776 | 0.6335 | 0.6367 | 3.1272 | 0.6335 | 0.6351 | 0.2937 | 0.1655 | | 71.4526 | 27.0 | 27000 | 74.4076 | 0.6335 | 0.6519 | 3.3331 | 0.6335 | 0.6318 | 0.3023 | 0.1749 | 
| 71.2967 | 28.0 | 28000 | 74.1954 | 0.6402 | 0.6361 | 3.1669 | 0.6402 | 0.6392 | 0.2995 | 0.1618 | | 71.4139 | 29.0 | 29000 | 74.2737 | 0.6342 | 0.6454 | 3.0744 | 0.6342 | 0.6347 | 0.3070 | 0.1626 | | 71.3204 | 30.0 | 30000 | 74.2779 | 0.652 | 0.6277 | 3.2286 | 0.652 | 0.6550 | 0.2956 | 0.1572 | | 71.4168 | 31.0 | 31000 | 74.3630 | 0.6458 | 0.6386 | 3.2327 | 0.6458 | 0.6463 | 0.3032 | 0.1594 | | 71.387 | 32.0 | 32000 | 74.4710 | 0.6522 | 0.6383 | 3.3193 | 0.6522 | 0.6526 | 0.3016 | 0.1610 | | 71.2382 | 33.0 | 33000 | 74.4096 | 0.652 | 0.6275 | 3.3440 | 0.652 | 0.6522 | 0.2977 | 0.1584 | | 71.1387 | 34.0 | 34000 | 74.2451 | 0.6512 | 0.6316 | 3.2834 | 0.6512 | 0.6525 | 0.3022 | 0.1555 | | 71.0904 | 35.0 | 35000 | 74.2640 | 0.6525 | 0.6341 | 3.1942 | 0.6525 | 0.6518 | 0.3023 | 0.1521 | | 70.9615 | 36.0 | 36000 | 74.1828 | 0.6565 | 0.6239 | 3.1805 | 0.6565 | 0.6568 | 0.3014 | 0.1516 | | 71.0673 | 37.0 | 37000 | 74.3405 | 0.6498 | 0.6341 | 3.3365 | 0.6498 | 0.6518 | 0.3071 | 0.1556 | | 71.0009 | 38.0 | 38000 | 74.2596 | 0.6595 | 0.6296 | 3.3359 | 0.6595 | 0.6622 | 0.2991 | 0.1512 | | 70.8441 | 39.0 | 39000 | 74.2837 | 0.6593 | 0.6254 | 3.3852 | 0.6593 | 0.6609 | 0.3005 | 0.1537 | | 70.8273 | 40.0 | 40000 | 74.3321 | 0.6567 | 0.6342 | 3.3111 | 0.6567 | 0.6589 | 0.3068 | 0.1544 | | 70.8931 | 41.0 | 41000 | 74.3478 | 0.662 | 0.6253 | 3.3022 | 0.662 | 0.6604 | 0.3029 | 0.1474 | | 70.8954 | 42.0 | 42000 | 74.2638 | 0.6613 | 0.6275 | 3.3811 | 0.6613 | 0.6612 | 0.3033 | 0.1499 | | 70.7389 | 43.0 | 43000 | 74.2531 | 0.6633 | 0.6221 | 3.3627 | 0.6633 | 0.6650 | 0.2998 | 0.1489 | | 70.7911 | 44.0 | 44000 | 74.3263 | 0.6587 | 0.6299 | 3.3918 | 0.6587 | 0.6588 | 0.3037 | 0.1496 | | 70.8719 | 45.0 | 45000 | 74.2778 | 0.6627 | 0.6236 | 3.3826 | 0.6627 | 0.6641 | 0.3009 | 0.1480 | | 70.7289 | 46.0 | 46000 | 74.2760 | 0.6625 | 0.6201 | 3.3467 | 0.6625 | 0.6635 | 0.3016 | 0.1469 | | 70.8773 | 47.0 | 47000 | 74.2709 | 0.6643 | 0.6185 | 3.3370 | 0.6643 | 0.6660 | 0.2989 | 0.1476 | | 
70.6951 | 48.0 | 48000 | 74.2857 | 0.6643 | 0.6218 | 3.3545 | 0.6643 | 0.6648 | 0.2995 | 0.1477 | | 70.8059 | 49.0 | 49000 | 74.3124 | 0.6623 | 0.6228 | 3.3592 | 0.6623 | 0.6634 | 0.3020 | 0.1470 | | 70.6955 | 50.0 | 50000 | 74.2895 | 0.6627 | 0.6224 | 3.3689 | 0.6627 | 0.6637 | 0.3019 | 0.1471 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
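The cards in this series train with `lr_scheduler_type: linear` and `lr_scheduler_warmup_ratio: 0.1`: the learning rate ramps linearly from 0 to the peak over the first 10% of steps, then decays linearly back to 0. A pure-Python sketch of that shape (the function name and the 50,000-step total are illustrative, taken from the table above):

```python
def linear_schedule_with_warmup(step, total_steps, peak_lr=1e-4,
                                warmup_ratio=0.1):
    """Linear warmup for the first warmup_ratio of training, then linear decay to 0."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

total = 50000  # 50 epochs x 1000 steps, matching the training log above
print(linear_schedule_with_warmup(0, total))      # 0.0
print(linear_schedule_with_warmup(5000, total))   # 0.0001 (peak, at the end of warmup)
print(linear_schedule_with_warmup(50000, total))  # 0.0 (fully decayed)
```

The peak of 1e-4 matches the `learning_rate` hyperparameter, and the decay to 0 by step 50,000 explains the flattening validation metrics in the final epochs.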
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
platzi/platzi-vit-model-mauricio-rodriguez
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # platzi-vit-model-mauricio-rodriguez This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset. It achieves the following results on the evaluation set: - Loss: 0.0213 - Accuracy: 0.9925 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.1425 | 3.85 | 500 | 0.0213 | 0.9925 | ### Framework versions - Transformers 4.29.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "angular_leaf_spot", "bean_rust", "healthy" ]
jordyvl/vit-base_rvl-cdip-small_rvl_cdip-NK1000_simkd_rand
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-small_rvl_cdip-NK1000_simkd_rand This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.0648 - Accuracy: 0.6072 - Brier Loss: 0.5503 - Nll: 2.7228 - F1 Micro: 0.6072 - F1 Macro: 0.6102 - Ece: 0.1175 - Aurc: 0.1871 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 0.0792 | 1.0 | 1000 | 0.0787 | 0.1103 | 0.9330 | 7.5575 | 0.1103 | 0.0531 | 0.0745 | 0.7817 | | 0.0778 | 2.0 | 2000 | 0.0772 | 0.1867 | 0.9126 | 5.0754 | 0.1867 | 0.1367 | 0.1099 | 0.6583 | | 0.0753 | 3.0 | 3000 | 0.0748 | 0.3185 | 0.8694 | 3.7575 | 0.3185 | 0.2711 | 0.2000 | 0.4734 | | 0.0738 | 4.0 | 4000 | 0.0733 | 0.3633 | 0.8431 | 3.5732 | 0.3633 | 0.3239 | 0.2222 | 0.4034 | | 0.0724 | 5.0 | 5000 | 0.0721 | 0.4083 | 0.8051 | 3.1556 | 0.4083 | 0.3747 | 0.2309 | 0.3543 | | 0.0712 | 6.0 | 6000 | 0.0716 | 0.422 | 0.7734 | 3.1574 | 0.422 | 0.3887 | 0.1999 | 0.3595 | | 0.07 | 7.0 | 7000 | 0.0703 | 0.4718 | 0.7584 | 2.9015 | 0.4718 | 0.4495 | 0.2548 | 0.2992 | | 
0.0693 | 8.0 | 8000 | 0.0696 | 0.493 | 0.7247 | 3.0406 | 0.493 | 0.4634 | 0.2430 | 0.2667 | | 0.0683 | 9.0 | 9000 | 0.0693 | 0.4955 | 0.7122 | 3.1754 | 0.4955 | 0.4725 | 0.2329 | 0.2620 | | 0.0675 | 10.0 | 10000 | 0.0682 | 0.528 | 0.6893 | 2.7012 | 0.528 | 0.5188 | 0.2477 | 0.2418 | | 0.0668 | 11.0 | 11000 | 0.0680 | 0.5035 | 0.6961 | 2.9081 | 0.5035 | 0.4967 | 0.2243 | 0.2569 | | 0.066 | 12.0 | 12000 | 0.0669 | 0.5595 | 0.6626 | 2.6583 | 0.5595 | 0.5541 | 0.2550 | 0.2169 | | 0.0649 | 13.0 | 13000 | 0.0668 | 0.5493 | 0.6559 | 2.7433 | 0.5493 | 0.5525 | 0.2334 | 0.2165 | | 0.0645 | 14.0 | 14000 | 0.0663 | 0.5645 | 0.6334 | 2.6699 | 0.5645 | 0.5605 | 0.2191 | 0.2099 | | 0.0636 | 15.0 | 15000 | 0.0659 | 0.5765 | 0.6160 | 2.6406 | 0.5765 | 0.5664 | 0.2084 | 0.2012 | | 0.0629 | 16.0 | 16000 | 0.0661 | 0.56 | 0.6311 | 2.7536 | 0.56 | 0.5686 | 0.2129 | 0.2026 | | 0.0622 | 17.0 | 17000 | 0.0660 | 0.5725 | 0.6108 | 2.8055 | 0.5725 | 0.5707 | 0.1866 | 0.2046 | | 0.0617 | 18.0 | 18000 | 0.0656 | 0.5697 | 0.6081 | 2.8309 | 0.5697 | 0.5730 | 0.1810 | 0.2024 | | 0.0608 | 19.0 | 19000 | 0.0654 | 0.585 | 0.5982 | 2.6432 | 0.585 | 0.5867 | 0.1834 | 0.1975 | | 0.06 | 20.0 | 20000 | 0.0656 | 0.584 | 0.5959 | 2.8363 | 0.584 | 0.5856 | 0.1662 | 0.2067 | | 0.0592 | 21.0 | 21000 | 0.0657 | 0.5875 | 0.5896 | 2.8259 | 0.5875 | 0.5892 | 0.1575 | 0.2059 | | 0.0584 | 22.0 | 22000 | 0.0655 | 0.5887 | 0.5832 | 2.8147 | 0.5887 | 0.5895 | 0.1531 | 0.1998 | | 0.058 | 23.0 | 23000 | 0.0654 | 0.5945 | 0.5829 | 2.9399 | 0.5945 | 0.5955 | 0.1475 | 0.2007 | | 0.0571 | 24.0 | 24000 | 0.0654 | 0.5962 | 0.5779 | 2.8266 | 0.5962 | 0.5982 | 0.1460 | 0.1996 | | 0.0566 | 25.0 | 25000 | 0.0655 | 0.596 | 0.5815 | 2.9480 | 0.596 | 0.5975 | 0.1447 | 0.2099 | | 0.0561 | 26.0 | 26000 | 0.0660 | 0.5883 | 0.5840 | 2.9985 | 0.5883 | 0.5903 | 0.1146 | 0.2202 | | 0.0556 | 27.0 | 27000 | 0.0654 | 0.6042 | 0.5713 | 2.8775 | 0.6042 | 0.6052 | 0.1353 | 0.2020 | | 0.055 | 28.0 | 28000 | 0.0655 | 0.5945 | 0.5750 | 3.0404 | 
0.5945 | 0.5965 | 0.1215 | 0.2051 | | 0.0546 | 29.0 | 29000 | 0.0655 | 0.5978 | 0.5740 | 2.9173 | 0.5978 | 0.6012 | 0.1226 | 0.2046 | | 0.0543 | 30.0 | 30000 | 0.0657 | 0.588 | 0.5813 | 3.0493 | 0.588 | 0.5915 | 0.1210 | 0.2104 | | 0.054 | 31.0 | 31000 | 0.0652 | 0.597 | 0.5715 | 2.9423 | 0.597 | 0.5989 | 0.1207 | 0.2055 | | 0.0537 | 32.0 | 32000 | 0.0650 | 0.6075 | 0.5618 | 2.8731 | 0.6075 | 0.6080 | 0.1209 | 0.1987 | | 0.0534 | 33.0 | 33000 | 0.0650 | 0.602 | 0.5651 | 2.7807 | 0.602 | 0.6046 | 0.1254 | 0.1988 | | 0.0535 | 34.0 | 34000 | 0.0652 | 0.602 | 0.5661 | 3.0050 | 0.602 | 0.6068 | 0.1187 | 0.1977 | | 0.053 | 35.0 | 35000 | 0.0649 | 0.6008 | 0.5603 | 2.8814 | 0.6008 | 0.6028 | 0.1172 | 0.1981 | | 0.0527 | 36.0 | 36000 | 0.0649 | 0.5988 | 0.5575 | 2.8419 | 0.5988 | 0.5974 | 0.1156 | 0.1917 | | 0.0526 | 37.0 | 37000 | 0.0649 | 0.598 | 0.5586 | 2.7982 | 0.598 | 0.5986 | 0.1173 | 0.1900 | | 0.0524 | 38.0 | 38000 | 0.0646 | 0.604 | 0.5546 | 2.8202 | 0.604 | 0.6060 | 0.1244 | 0.1908 | | 0.0524 | 39.0 | 39000 | 0.0651 | 0.5965 | 0.5627 | 2.8458 | 0.5965 | 0.6010 | 0.1125 | 0.1949 | | 0.0522 | 40.0 | 40000 | 0.0649 | 0.6072 | 0.5515 | 2.7872 | 0.6072 | 0.6100 | 0.1211 | 0.1881 | | 0.0521 | 41.0 | 41000 | 0.0648 | 0.6078 | 0.5542 | 2.7802 | 0.6078 | 0.6108 | 0.1199 | 0.1868 | | 0.052 | 42.0 | 42000 | 0.0648 | 0.6 | 0.5557 | 2.7968 | 0.6 | 0.6029 | 0.1190 | 0.1940 | | 0.0519 | 43.0 | 43000 | 0.0647 | 0.604 | 0.5503 | 2.7110 | 0.604 | 0.6060 | 0.1178 | 0.1896 | | 0.0516 | 44.0 | 44000 | 0.0647 | 0.6065 | 0.5515 | 2.7595 | 0.6065 | 0.6089 | 0.1170 | 0.1870 | | 0.0516 | 45.0 | 45000 | 0.0646 | 0.611 | 0.5496 | 2.7426 | 0.611 | 0.6129 | 0.1212 | 0.1873 | | 0.0515 | 46.0 | 46000 | 0.0648 | 0.6082 | 0.5510 | 2.7436 | 0.6082 | 0.6120 | 0.1227 | 0.1876 | | 0.0514 | 47.0 | 47000 | 0.0647 | 0.6088 | 0.5511 | 2.7379 | 0.6088 | 0.6115 | 0.1240 | 0.1874 | | 0.0514 | 48.0 | 48000 | 0.0647 | 0.6095 | 0.5501 | 2.7369 | 0.6095 | 0.6122 | 0.1193 | 0.1868 | | 0.0513 | 49.0 | 49000 | 
0.0647 | 0.6095 | 0.5508 | 2.7295 | 0.6095 | 0.6122 | 0.1218 | 0.1870 | | 0.0513 | 50.0 | 50000 | 0.0648 | 0.6072 | 0.5503 | 2.7228 | 0.6072 | 0.6102 | 0.1175 | 0.1871 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
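The ECE column above (expected calibration error) measures the gap between a model's confidence and its actual accuracy. A minimal equal-width-bin sketch on toy data (assuming the standard 10-bin formulation; the evaluation code may differ in binning details):

```python
def expected_calibration_error(confidences, correct, n_bins=10):
    """Weighted average over bins of |bin accuracy - bin mean confidence|."""
    n = len(confidences)
    ece = 0.0
    for b in range(n_bins):
        lo, hi = b / n_bins, (b + 1) / n_bins
        idx = [i for i, c in enumerate(confidences) if lo < c <= hi]
        if not idx:
            continue  # empty bins contribute nothing
        acc = sum(correct[i] for i in idx) / len(idx)
        conf = sum(confidences[i] for i in idx) / len(idx)
        ece += (len(idx) / n) * abs(acc - conf)
    return ece

# Two predictions at 0.95 confidence but only one correct: the model is overconfident.
print(expected_calibration_error([0.95, 0.95], [1, 0]))  # ≈ 0.45
```

A low ECE means confidence tracks accuracy; the rising ECE over the later epochs above indicates the distilled student grows overconfident as training proceeds.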
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
julienmercier/vit-base-patch16-224-in21k-mobile-eye-tracking-dataset-v1
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-in21k-mobile-eye-tracking-dataset-v1 This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0604 - Accuracy: 0.9911 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 24 - eval_batch_size: 24 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 96 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.1179 | 0.99 | 73 | 0.0977 | 0.9885 | | 0.06 | 1.99 | 147 | 0.0693 | 0.9898 | | 0.0376 | 2.97 | 219 | 0.0604 | 0.9911 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
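This card trains with `gradient_accumulation_steps: 4` on a per-device batch of 24, so each optimizer step sees an effective batch of 24 x 4 = 96 examples, matching `total_train_batch_size`. A framework-free sketch of the accumulation loop on a one-parameter least-squares model (all names and data here are illustrative, not the actual training code):

```python
def accumulate_and_step(w, micro_batches, lr=0.01, accum_steps=4):
    """One optimizer step: average gradients over accum_steps micro-batches."""
    grad_sum = 0.0
    for xs, ts in micro_batches[:accum_steps]:
        # mean gradient of (w*x - t)^2 over this micro-batch
        grad_sum += sum(2 * (w * x - t) * x for x, t in zip(xs, ts)) / len(xs)
    return w - lr * grad_sum / accum_steps

per_device, accum = 24, 4
assert per_device * accum == 96  # matches total_train_batch_size above

# 4 micro-batches of 24 examples each; the target relation is t = 2*x
batches = [([1.0] * per_device, [2.0] * per_device) for _ in range(accum)]
w = 0.0
for _ in range(500):
    w = accumulate_and_step(w, batches)
print(round(w, 3))  # 2.0
```

Accumulation trades memory for time: the gradient of a 96-example batch is computed 24 examples at a time, which is why the per-device batch can stay small on limited GPU memory.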
[ "in", "none", "out" ]
julienmercier/vit-base-patch16-224-in21k-mobile-eye-tracking-dataset-v2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-in21k-mobile-eye-tracking-dataset-v2 This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0542 - Accuracy: 0.9898 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 24 - eval_batch_size: 24 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 96 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.024 | 0.99 | 73 | 0.0769 | 0.9809 | | 0.0236 | 1.99 | 147 | 0.1111 | 0.9745 | | 0.0172 | 3.0 | 221 | 0.0542 | 0.9898 | | 0.0114 | 4.0 | 295 | 0.0630 | 0.9885 | | 0.0051 | 4.99 | 368 | 0.0674 | 0.9860 | | 0.0044 | 5.99 | 442 | 0.0640 | 0.9885 | | 0.0037 | 7.0 | 516 | 0.0646 | 0.9885 | | 0.0034 | 8.0 | 590 | 0.0652 | 0.9885 | | 0.0032 | 8.99 | 663 | 0.0656 | 0.9885 | | 0.0032 | 9.9 | 730 | 0.0657 | 0.9885 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "in", "none", "out" ]
benguerrieri/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-flower This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results ### Framework versions - Transformers 4.24.0 - Pytorch 2.0.1+cu118 - Datasets 2.7.1 - Tokenizers 0.13.3
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
zetazlife/fisura-hormigon
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # fisura-hormigon This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0123 - Accuracy: 0.9978 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 0.0989 | 0.17 | 500 | 0.0477 | 0.9871 | | 0.0531 | 0.33 | 1000 | 0.0493 | 0.9865 | | 0.046 | 0.5 | 1500 | 0.0381 | 0.9894 | | 0.0392 | 0.67 | 2000 | 0.1129 | 0.9734 | | 0.0459 | 0.83 | 2500 | 0.0364 | 0.9904 | | 0.0255 | 1.0 | 3000 | 0.0305 | 0.9934 | | 0.0188 | 1.17 | 3500 | 0.0247 | 0.9949 | | 0.0222 | 1.33 | 4000 | 0.0206 | 0.9921 | | 0.02 | 1.5 | 4500 | 0.0154 | 0.9952 | | 0.0191 | 1.67 | 5000 | 0.0132 | 0.9952 | | 0.0141 | 1.83 | 5500 | 0.0294 | 0.9905 | | 0.0201 | 2.0 | 6000 | 0.0155 | 0.9968 | | 0.0114 | 2.17 | 6500 | 0.0161 | 0.9965 | | 0.0071 | 2.33 | 7000 | 0.0124 | 0.9975 | | 0.0083 | 2.5 | 7500 | 0.0141 | 0.9969 | | 0.0143 | 2.67 | 8000 | 0.0242 | 0.9932 | | 0.0088 | 2.83 | 8500 | 0.0123 | 0.9972 | | 0.0034 | 3.0 | 9000 | 0.0120 | 0.9972 | | 0.0064 | 3.17 | 9500 | 0.0100 | 0.9978 | | 0.0012 | 3.33 | 10000 | 0.0166 | 0.996 | | 0.006 | 3.5 | 10500 | 0.0110 | 0.998 | | 0.0007 | 3.67 | 11000 | 0.0126 | 0.9972 | | 0.0034 | 3.83 | 
11500 | 0.0122 | 0.9979 | | 0.0057 | 4.0 | 12000 | 0.0123 | 0.9978 | ### Framework versions - Transformers 4.29.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "negative", "positive" ]
jordyvl/vit-base_rvl-cdip-small_rvl_cdip-NK1000_og_simkd_rand
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base_rvl-cdip-small_rvl_cdip-NK1000_og_simkd_rand This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset. It achieves the following results on the evaluation set: - Loss: 267.1454 - Accuracy: 0.6807 - Brier Loss: 0.6059 - Nll: 2.5092 - F1 Micro: 0.6807 - F1 Macro: 0.6792 - Ece: 0.2988 - Aurc: 0.1779 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:| | 286.6037 | 1.0 | 1000 | 286.3978 | 0.242 | 1.0585 | 5.0180 | 0.242 | 0.1919 | 0.3885 | 0.6070 | | 284.5917 | 2.0 | 2000 | 285.0526 | 0.235 | 1.4192 | 6.2048 | 0.235 | 0.1678 | 0.6914 | 0.6366 | | 284.1567 | 3.0 | 3000 | 283.4989 | 0.3705 | 1.0456 | 4.7503 | 0.3705 | 0.2880 | 0.4669 | 0.4145 | | 282.6679 | 4.0 | 4000 | 282.5618 | 0.4042 | 0.8940 | 4.1927 | 0.4042 | 0.3644 | 0.3629 | 0.3572 | | 282.2283 | 5.0 | 5000 | 281.9135 | 0.418 | 0.9976 | 3.8856 | 0.418 | 0.3686 | 0.4631 | 0.3778 | | 281.3193 | 6.0 | 6000 | 279.9180 | 0.4723 | 0.8755 | 3.4852 | 0.4723 | 0.4312 | 0.3960 | 0.2962 | | 280.7993 | 7.0 | 7000 | 279.2325 | 0.5038 | 0.8411 | 3.3760 | 0.5038 
| 0.4635 | 0.3844 | 0.2753 | | 279.8249 | 8.0 | 8000 | 278.4682 | 0.5268 | 0.8078 | 3.1572 | 0.5268 | 0.4894 | 0.3705 | 0.2620 | | 278.8243 | 9.0 | 9000 | 278.2146 | 0.5268 | 0.8245 | 3.2631 | 0.5268 | 0.5043 | 0.3819 | 0.2729 | | 278.1676 | 10.0 | 10000 | 276.9399 | 0.5607 | 0.7853 | 3.0151 | 0.5607 | 0.5390 | 0.3741 | 0.2275 | | 276.8185 | 11.0 | 11000 | 276.3879 | 0.5697 | 0.7659 | 2.9137 | 0.5697 | 0.5520 | 0.3660 | 0.2221 | | 276.0937 | 12.0 | 12000 | 275.9589 | 0.5777 | 0.7626 | 2.9855 | 0.5777 | 0.5643 | 0.3606 | 0.2360 | | 276.0743 | 13.0 | 13000 | 275.6118 | 0.5675 | 0.7938 | 3.2975 | 0.5675 | 0.5545 | 0.3852 | 0.2320 | | 275.008 | 14.0 | 14000 | 275.0585 | 0.6 | 0.7359 | 2.8607 | 0.6 | 0.5861 | 0.3517 | 0.2142 | | 274.483 | 15.0 | 15000 | 274.0515 | 0.6292 | 0.6738 | 2.7667 | 0.6292 | 0.6262 | 0.3215 | 0.1904 | | 273.261 | 16.0 | 16000 | 273.7844 | 0.6312 | 0.6819 | 2.7219 | 0.6312 | 0.6296 | 0.3286 | 0.2048 | | 272.9319 | 17.0 | 17000 | 273.4691 | 0.6198 | 0.7009 | 2.8745 | 0.6198 | 0.6160 | 0.3410 | 0.2134 | | 272.456 | 18.0 | 18000 | 273.1716 | 0.6195 | 0.7071 | 2.8631 | 0.6195 | 0.6223 | 0.3440 | 0.2140 | | 272.0481 | 19.0 | 19000 | 272.5084 | 0.6322 | 0.6864 | 2.7598 | 0.6322 | 0.6292 | 0.3362 | 0.2119 | | 271.0429 | 20.0 | 20000 | 272.1741 | 0.6365 | 0.6830 | 2.8104 | 0.6365 | 0.6300 | 0.3345 | 0.2185 | | 271.0098 | 21.0 | 21000 | 271.8972 | 0.649 | 0.6569 | 2.8558 | 0.649 | 0.6477 | 0.3221 | 0.2076 | | 270.1226 | 22.0 | 22000 | 271.3564 | 0.639 | 0.6850 | 3.0353 | 0.639 | 0.6326 | 0.3372 | 0.2275 | | 269.8644 | 23.0 | 23000 | 271.2604 | 0.6332 | 0.6903 | 2.9472 | 0.6332 | 0.6330 | 0.3400 | 0.2367 | | 269.6737 | 24.0 | 24000 | 270.9163 | 0.6485 | 0.6622 | 2.8937 | 0.6485 | 0.6477 | 0.3258 | 0.2139 | | 268.3083 | 25.0 | 25000 | 270.3471 | 0.6528 | 0.6590 | 2.7873 | 0.6528 | 0.6550 | 0.3231 | 0.2228 | | 268.6058 | 26.0 | 26000 | 270.2531 | 0.659 | 0.6377 | 2.7500 | 0.659 | 0.6599 | 0.3125 | 0.1980 | | 268.5694 | 27.0 | 27000 | 270.0281 | 0.6535 | 
0.6510 | 2.7183 | 0.6535 | 0.6502 | 0.3210 | 0.2112 | | 267.5742 | 28.0 | 28000 | 269.6303 | 0.664 | 0.6327 | 2.6630 | 0.664 | 0.6619 | 0.3109 | 0.1974 | | 267.4235 | 29.0 | 29000 | 269.3493 | 0.6607 | 0.6417 | 2.7860 | 0.6607 | 0.6568 | 0.3162 | 0.2074 | | 267.1017 | 30.0 | 30000 | 269.1249 | 0.675 | 0.6152 | 2.6205 | 0.675 | 0.6760 | 0.3013 | 0.1923 | | 266.7395 | 31.0 | 31000 | 268.8958 | 0.6685 | 0.6281 | 2.7126 | 0.6685 | 0.6638 | 0.3086 | 0.1943 | | 266.3374 | 32.0 | 32000 | 268.6245 | 0.6703 | 0.6224 | 2.7028 | 0.6703 | 0.6686 | 0.3065 | 0.1900 | | 266.3529 | 33.0 | 33000 | 268.4537 | 0.6697 | 0.6240 | 2.6593 | 0.6697 | 0.6683 | 0.3066 | 0.1964 | | 266.1322 | 34.0 | 34000 | 268.1314 | 0.678 | 0.6096 | 2.6485 | 0.678 | 0.6784 | 0.3008 | 0.1857 | | 265.3824 | 35.0 | 35000 | 268.1505 | 0.6707 | 0.6242 | 2.5832 | 0.6707 | 0.6696 | 0.3058 | 0.1916 | | 265.5754 | 36.0 | 36000 | 267.9319 | 0.676 | 0.6155 | 2.6208 | 0.676 | 0.6761 | 0.3014 | 0.1908 | | 265.6115 | 37.0 | 37000 | 268.0886 | 0.679 | 0.6093 | 2.6068 | 0.679 | 0.6795 | 0.2991 | 0.1796 | | 264.8437 | 38.0 | 38000 | 267.9896 | 0.6783 | 0.6113 | 2.5873 | 0.6783 | 0.6765 | 0.3000 | 0.1805 | | 264.8028 | 39.0 | 39000 | 267.5381 | 0.68 | 0.6048 | 2.5007 | 0.68 | 0.6771 | 0.2974 | 0.1771 | | 264.8063 | 40.0 | 40000 | 267.6070 | 0.6763 | 0.6127 | 2.5359 | 0.6763 | 0.6751 | 0.3030 | 0.1821 | | 264.7481 | 41.0 | 41000 | 267.4914 | 0.6837 | 0.6000 | 2.5214 | 0.6837 | 0.6809 | 0.2942 | 0.1830 | | 264.6455 | 42.0 | 42000 | 267.6581 | 0.6857 | 0.5968 | 2.5211 | 0.6857 | 0.6856 | 0.2919 | 0.1741 | | 264.0388 | 43.0 | 43000 | 267.3815 | 0.6797 | 0.6035 | 2.5123 | 0.6797 | 0.6795 | 0.2973 | 0.1773 | | 264.3585 | 44.0 | 44000 | 267.3548 | 0.6847 | 0.5997 | 2.5583 | 0.6847 | 0.6851 | 0.2943 | 0.1769 | | 263.7822 | 45.0 | 45000 | 267.0005 | 0.682 | 0.6043 | 2.5023 | 0.682 | 0.6793 | 0.2966 | 0.1788 | | 263.9765 | 46.0 | 46000 | 267.2113 | 0.6853 | 0.5955 | 2.5256 | 0.6853 | 0.6816 | 0.2922 | 0.1737 | | 264.1576 | 47.0 | 
47000 | 267.1731 | 0.6833 | 0.6002 | 2.5071 | 0.6833 | 0.6825 | 0.2951 | 0.1768 | | 263.8688 | 48.0 | 48000 | 267.0122 | 0.6843 | 0.5980 | 2.5328 | 0.6843 | 0.6830 | 0.2942 | 0.1781 | | 263.8963 | 49.0 | 49000 | 266.8628 | 0.6843 | 0.6021 | 2.5231 | 0.6843 | 0.6831 | 0.2957 | 0.1782 | | 264.2061 | 50.0 | 50000 | 267.1454 | 0.6807 | 0.6059 | 2.5092 | 0.6807 | 0.6792 | 0.2988 | 0.1779 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1.post200 - Datasets 2.9.0 - Tokenizers 0.13.2
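The evaluation above reports Brier loss and ECE alongside accuracy. As a rough sketch of what these calibration metrics measure, here are the standard definitions in plain Python (an illustration on toy data; the exact implementation behind this card is not shown here):

```python
def brier_score(probs, labels, n_classes):
    """Multi-class Brier score: mean squared distance between the
    predicted probability vector and the one-hot true label."""
    total = 0.0
    for p, y in zip(probs, labels):
        onehot = [1.0 if c == y else 0.0 for c in range(n_classes)]
        total += sum((pi - oi) ** 2 for pi, oi in zip(p, onehot))
    return total / len(probs)

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE: bin predictions by confidence, then average the gap between
    mean confidence and accuracy per bin, weighted by bin size."""
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(probs, labels):
        conf = max(p)
        pred = p.index(conf)
        idx = min(int(conf * n_bins), n_bins - 1)
        bins[idx].append((conf, float(pred == y)))
    ece, n = 0.0, len(probs)
    for b in bins:
        if b:
            avg_conf = sum(c for c, _ in b) / len(b)
            acc = sum(a for _, a in b) / len(b)
            ece += (len(b) / n) * abs(avg_conf - acc)
    return ece

# Toy 3-class batch (hypothetical values, not this model's outputs)
probs = [[0.8, 0.1, 0.1], [0.3, 0.6, 0.1], [0.2, 0.2, 0.6]]
labels = [0, 1, 2]
print(round(brier_score(probs, labels, 3), 4))          # -> 0.1867
print(round(expected_calibration_error(probs, labels), 4))  # -> 0.3333
```

Lower is better for both: the Brier score rewards sharp, correct probabilities, while ECE penalizes confidence that outruns accuracy.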
[ "letter", "form", "email", "handwritten", "advertisement", "scientific_report", "scientific_publication", "specification", "file_folder", "news_article", "budget", "invoice", "presentation", "questionnaire", "resume", "memo" ]
Thamer/resnet-fine_tuned
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # resnet-fine_tuned This model is a fine-tuned version of [microsoft/resnet-34](https://huggingface.co/microsoft/resnet-34) on the Falah/Alzheimer_MRI dataset. It achieves the following results on the evaluation set: - Loss: 0.1983 - Accuracy: 0.9219 ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 15 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.9041 | 1.0 | 80 | 0.9659 | 0.5352 | | 0.8743 | 2.0 | 160 | 0.9348 | 0.5797 | | 0.7723 | 3.0 | 240 | 0.7793 | 0.6594 | | 0.6864 | 4.0 | 320 | 0.6799 | 0.7031 | | 0.5347 | 5.0 | 400 | 0.5596 | 0.7703 | | 0.4282 | 6.0 | 480 | 0.5078 | 0.7766 | | 0.4315 | 7.0 | 560 | 0.5455 | 0.7680 | | 0.3747 | 8.0 | 640 | 0.4203 | 0.8266 | | 0.2977 | 9.0 | 720 | 0.3926 | 0.8469 | | 0.2252 | 10.0 | 800 | 0.3024 | 0.8742 | | 0.2675 | 11.0 | 880 | 0.2731 | 0.8906 | | 0.2136 | 12.0 | 960 | 0.3045 | 0.875 | | 0.1998 | 13.0 | 1040 | 0.2370 | 0.9 | | 0.2406 | 14.0 | 1120 | 0.2387 | 0.9086 | | 0.1873 | 15.0 | 1200 | 0.1983 | 0.9219 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cpu - Datasets 2.13.1 - Tokenizers 0.13.3
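The scheduler settings above (`linear` with `lr_scheduler_warmup_ratio: 0.1`, over the 1200 optimizer steps shown in the results table) imply a triangular learning-rate profile. A minimal sketch, assuming the standard warmup-then-linear-decay behavior of the `linear` scheduler type (not the Trainer's actual code):

```python
def linear_lr(step, total_steps=1200, base_lr=2e-4, warmup_ratio=0.1):
    """Linear warmup to base_lr, then linear decay to 0, mirroring
    lr_scheduler_type=linear with warmup_ratio=0.1."""
    warmup_steps = int(total_steps * warmup_ratio)  # 120 steps here
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

print(linear_lr(60))    # halfway through warmup: 1e-4
print(linear_lr(120))   # peak: the configured 2e-4
print(linear_lr(1200))  # fully decayed: 0.0
```

With 80 steps per epoch (per the table), warmup covers the first one and a half epochs of the 15-epoch run.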
[ "mild_demented", "moderate_demented", "non_demented", "very_mild_demented" ]
hdduytran/autotrain-cat-dog-testing-81779141856
# Model Trained Using AutoTrain - Problem type: Binary Classification - Model ID: 81779141856 - CO2 Emissions (in grams): 0.2783 ## Validation Metrics - Loss: 0.030 - Accuracy: 1.000 - Precision: 1.000 - Recall: 1.000 - AUC: 1.000 - F1: 1.000
[ "cat", "dog" ]
hdduytran/autotrain-cat-dog-demo-ex-81819141862
# Model Trained Using AutoTrain - Problem type: Binary Classification - Model ID: 81819141862 - CO2 Emissions (in grams): 0.2810 ## Validation Metrics - Loss: 0.108 - Accuracy: 0.962 - Precision: 0.929 - Recall: 1.000 - AUC: 0.988 - F1: 0.963
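The reported F1 is consistent with the precision and recall above; as a quick check, F1 is their harmonic mean:

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Validation metrics reported above
print(round(f1(0.929, 1.000), 3))  # -> 0.963
```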
[ "cat", "dog" ]
realzdlegend/autotrain-pneumonia-81787141863
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 81787141863 - CO2 Emissions (in grams): 1.7413 ## Validation Metrics - Loss: 0.550 - Accuracy: 0.729 - Macro F1: 0.763 - Micro F1: 0.729 - Weighted F1: 0.726 - Macro Precision: 0.832 - Micro Precision: 0.729 - Weighted Precision: 0.785 - Macro Recall: 0.748 - Micro Recall: 0.729 - Weighted Recall: 0.729
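The card distinguishes macro, micro, and weighted averages for F1, precision, and recall. A small pure-Python sketch of the macro/micro difference, using toy labels rather than this model's actual predictions: macro averages per-class F1 scores equally, while micro pools counts across classes (and, for single-label multi-class data, equals plain accuracy).

```python
def per_class_f1(y_true, y_pred, classes):
    """One-vs-rest F1 for each class."""
    f1s = []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return f1s

# Hypothetical 3-class predictions
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
f1s = per_class_f1(y_true, y_pred, [0, 1, 2])
macro_f1 = sum(f1s) / len(f1s)
# For single-label multi-class data, micro F1 equals plain accuracy
micro_f1 = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
print(round(macro_f1, 3))  # -> 0.656
print(round(micro_f1, 3))  # -> 0.667
```

This is why the card's micro F1 (0.729) matches its accuracy exactly, while macro F1 (0.763) weights the three classes equally regardless of support.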
[ "0", "1", "2" ]
platzi/platzi_vit_model-danae-martinez
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # platzi_vit_model-danae-martinez This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset. It achieves the following results on the evaluation set: - Loss: 0.0241 - Accuracy: 0.9925 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.1387 | 3.85 | 500 | 0.0241 | 0.9925 | ### Framework versions - Transformers 4.29.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "angular_leaf_spot", "bean_rust", "healthy" ]
platzi/platzi-vit-model-andres-grimaldos
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # platzi-vit-model-andres-grimaldos This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset. It achieves the following results on the evaluation set: - Loss: 0.0166 - Accuracy: 0.9925 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.1458 | 3.85 | 500 | 0.0166 | 0.9925 | ### Framework versions - Transformers 4.30.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "angular_leaf_spot", "bean_rust", "healthy" ]
lizsergeeva/vit-base-patch16-224-finetuned-vit
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-vit This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.2549 - Accuracy: 0.9161 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.6065 | 0.99 | 47 | 0.4006 | 0.8748 | | 0.335 | 2.0 | 95 | 0.2745 | 0.9175 | | 0.2707 | 2.97 | 141 | 0.2549 | 0.9161 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "apple_pie", "baby_back_ribs", "baklava", "beef_carpaccio", "beef_tartare", "beet_salad", "beignets", "bibimbap", "bread_pudding", "breakfast_burrito" ]
devboop/vit-base-patch16-224-cl-v1
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-cl-v1 This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 2.4053 - Accuracy: 0.5027 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 3.6762 | 1.0 | 353 | 3.4472 | 0.3691 | | 2.8516 | 2.0 | 706 | 2.5892 | 0.4738 | | 2.6887 | 3.0 | 1059 | 2.4053 | 0.5027 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.0 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "affenpinscher", "afghan_hound", "aidi", "airedale_terrier", "akbash", "akita", "alano_espanol", "alaskan_klee_kai", "alaskan_malamute", "alpine_dachsbracke", "american_bulldog", "american_bully", "american_cocker_spaniel", "american_english_coonhound", "american_eskimo_dog", "american_foxhound", "american_hairless_terrier", "american_pit_bull_terrier", "american_staffordshire_terrier", "american_water_spaniel", "anatolian_shepherd_dog", "anglo_francais_de_petite_venerie", "appenzeller_sennenhund", "ariegeois", "armenian_gampr", "artois_hound", "australian_cattle_dog", "australian_kelpie", "australian_shepherd", "australian_stumpy_tail_cattle_dog", "australian_terrier", "austrian_black_and_tan_hound", "austrian_pinscher", "azawakh", "bankhar_dog", "barak_hound", "barbado_da_terceira", "barbet", "basenji", "basque_shepherd_dog", "basset_artesien_normand", "basset_bleu_de_gascogne", "basset_fauve_de_bretagne", "basset_hound", "bavarian_mountain_hound", "beagle", "beagle_harrier", "bearded_collie", "beauceron", "bedlington_terrier", "belgian_groenendael", "belgian_laekenois", "belgian_malinois", "belgian_tervuren", "bergamasco_shepherd", "berger_picard", "bernese_mountain_dog", "bichon_frise", "biewer_terrier", "black_and_tan_coonhound", "black_mouth_cur", "black_norwegian_elkhound", "black_russian_terrier", "bloodhound", "blue_lacy", "blue_picardy_spaniel", "bluetick_coonhound", "boerboel", "bohemian_shepherd", "bolognese_dog", "border_collie", "border_terrier", "borzoi", "boston_terrier", "bouvier_des_ardennes", "bouvier_des_flandres", "boxer", "boykin_spaniel", "bracco_italiano", "braque_dauvergne", "braque_du_bourbonnais", "braque_saint_germain", "brazilian_terrier", "briard", "briquet_griffon_vendeen", "brittany_spaniel", "broholmer", "bruno_jura_hound", "brussels_griffon", "bucovina_shepherd_dog", "bull_terrier", "bulldog", "bullmastiff", "bully_kutta", "burgos_pointer", "ca_de_bou", "cairn_terrier", "calupoh", "campeiro_bulldog", "can_de_palleiro", 
"canaan_dog", "canadian_eskimo_dog", "cane_corso", "cao_de_castro_laboreiro", "cao_de_gado_transmontano", "cao_fila_de_sao_miguel", "cardigan_welsh_corgi", "carolina_dog", "catahoula_leopard_dog", "catalan_sheepdog", "cavalier_king_charles_spaniel", "central_asian_shepherd_dog", "cesky_fousek", "cesky_terrier", "chesapeake_bay_retriever", "chien_francais_blanc_et_noir", "chihuahua", "chinese_crested_dog", "chongqing_dog", "chow_chow", "clumber_spaniel", "continental_bulldog", "coton_de_tulear", "cretan_hound", "croatian_sheepdog", "curly_coated_retriever", "czechoslovakian_wolfdog", "dachshund", "dalmatian", "dandie_dinmont_terrier", "danish_spitz", "danish_swedish_farmdog", "denmark_feist", "dingo", "dobermann", "dogo_argentino", "dogo_guatemalteco", "drentse_patrijshond", "drever", "dunker", "dutch_shepherd", "dutch_smoushond", "east_european_shepherd", "east_siberian_laika", "ecuadorian_hairless_dog", "elo", "english_cocker_spaniel", "english_foxhound", "english_mastiff", "english_setter", "english_shepherd", "english_springer_spaniel", "english_toy_terrier", "entlebucher_mountain_dog", "estonian_hound", "estrela_mountain_dog", "eurasier", "field_spaniel", "fila_brasileiro", "finnish_hound", "finnish_lapphund", "finnish_spitz", "flat_coated_retriever", "french_bulldog", "french_spaniel", "galgo_espanol", "garafian_shepherd", "gascon_saintongeois", "georgian_shepherd", "german_hound", "german_longhaired_pointer", "german_pinscher", "german_roughhaired_pointer", "german_shepherd", "german_shorthaired_pointer", "german_spaniel", "german_spitz", "german_wirehaired_pointer", "giant_schnauzer", "glen_of_imaal_terrier", "golden_retriever", "gonczy_polski", "gordon_setter", "grand_anglo_francais_tricolore", "grand_griffon_vendeen", "great_dane", "greater_swiss_mountain_dog", "greek_harehound", "greek_shepherd", "greenland_dog", "greyhound", "griffon_bleu_de_gascogne", "griffon_fauve_de_bretagne", "griffon_nivernais", "gull_terrier", "halden_hound", "hallefors_elkhound", 
"hamiltonstovare", "hanover_hound", "harrier", "havanese", "hierran_wolfdog", "himalayan_sheepdog", "hmong_bobtail_dog", "hokkaido", "hovawart", "huntaway", "hygen_hound", "ibizan_hound", "icelandic_sheepdog", "indian_pariah_dog", "indian_spitz", "irish_red_and_white_setter", "irish_setter", "irish_terrier", "irish_water_spaniel", "irish_wolfhound", "istrian_coarse_haired_hound", "istrian_shorthaired_hound", "italian_greyhound", "jack_russell_terrier", "jagdterrier", "jamthund", "japanese_chin", "japanese_spitz", "japanese_terrier", "jindo", "jonangi", "kai_ken", "kaikadi", "kangal_shepherd_dog", "kanni_dog", "karakachan_dog", "karelian_bear_dog", "karst_shepherd", "keeshond", "kerry_beagle", "kerry_blue_terrier", "king_charles_spaniel", "king_shepherd", "kintamani", "kishu", "kokoni", "komondor", "kooikerhondje", "koolie_dog", "koyun_dog", "kromfohrlander", "kuchi_dog", "kunming_dog", "kurdish_mastiff", "kuvasz", "labrador_retriever", "lagotto_romagnolo", "lakeland_terrier", "lancashire_heeler", "landseer", "lapponian_herder", "large_munsterlander", "leonberger", "levriero_sardo", "lhasa_apso", "lithuanian_hound", "lowchen", "lupo_italiano", "magyar_agar", "mahratta_greyhound", "maltese_dog", "manchester_terrier", "maneto_dog", "maremmano_abruzzese_sheepdog", "mcnab_dog", "miki_dog", "miniature_american_shepherd", "miniature_bull_terrier", "miniature_fox_terrier", "miniature_pinscher", "miniature_poodle", "miniature_schnauzer", "molossus_of_epirus", "montenegrin_mountain_hound", "mountain_cur", "mountain_feist", "mudhol_hound", "mudi_dog", "murray_river_retriever", "neapolitan_mastiff", "nenets_herding_laika", "new_guinea_singing_dog", "new_zealand_heading_dog", "newfoundland", "norfolk_terrier", "norrbottenspets", "northern_inuit_dog", "norwegian_buhund", "norwegian_elkhound", "norwegian_lundehund", "norwich_terrier", "nova_scotia_duck_tolling_retriever", "old_danish_pointer", "old_english_sheepdog", "old_english_terrier", "olde_english_bulldogge", "otterhound", 
"pachon_navarro", "pampas_deerhound", "papillon", "parson_russell_terrier", "pastore_della_lessinia_e_del_lagorai", "patagonian_sheepdog", "patterdale_terrier", "pekingese", "pembroke_welsh_corgi", "perro_de_pastor_mallorquin", "perro_de_presa_canario", "perro_de_presa_mallorquin", "perro_majorero", "peruvian_inca_orchid", "petit_basset_griffon_vendeen", "phalene", "pharaoh_hound", "phu_quoc_ridgeback", "picardy_spaniel", "plott_hound", "plummer_terrier", "podenco_andaluz", "podenco_canario", "podenco_valenciano", "pointer", "poitevin", "polish_greyhound", "polish_hound", "polish_lowland_sheepdog", "polish_tatra_sheepdog", "pomeranian", "pont_audemer_spaniel", "porcelaine", "portuguese_podengo", "portuguese_pointer", "portuguese_sheepdog", "portuguese_water_dog", "posavac_hound", "prazsky_krysarik", "pudelpointer", "pug", "puli", "pumi", "pungsan_dog", "pyrenean_mastiff", "pyrenean_mountain_dog", "pyrenean_sheepdog", "rafeiro_do_alentejo", "rajapalayam", "rampur_greyhound", "rat_terrier", "ratonero_bodeguero_andaluz", "ratonero_mallorquin", "ratonero_murciano", "ratonero_valenciano", "redbone_coonhound", "rhodesian_ridgeback", "rottweiler", "rough_collie", "russian_hound", "russian_spaniel", "russian_toy", "russo_european_laika", "ryukyu_inu", "saarloos_wolfdog", "sabueso_espanol", "saint_bernard", "saint_hubert_jura_hound", "saint_miguel_cattle_dog", "saint_usuge_spaniel", "saluki", "samoyed", "sapsali", "sarabi_dog", "sardinian_shepherd_dog", "sarplaninac", "schapendoes", "schillerstovare", "schipperke", "schweizer_laufhund", "schweizerischer_niederlaufhund", "scottish_deerhound", "scottish_terrier", "sealyham_terrier", "segugio_dell_appennino", "segugio_italiano", "segugio_maremmano", "serbian_hound", "serbian_tricolour_hound", "serrano_bulldog", "shar_pei", "shetland_sheepdog", "shiba_inu", "shih_tzu", "shikoku", "shiloh_shepherd", "siberian_husky", "silken_windhound", "silky_terrier", "sinhala_hound", "skye_terrier", "sloughi", "slovakian_wirehaired_pointer", 
"slovensky_cuvac", "slovensky_kopov", "small_greek_domestic_dog", "small_munsterlander", "smooth_collie", "smooth_fox_terrier", "soft_coated_wheaten_terrier", "south_russian_ovcharka", "spanish_mastiff", "spanish_water_dog", "spinone_italiano", "sporting_lucas_terrier", "stabyhoun", "staffordshire_bull_terrier", "standard_poodle", "standard_schnauzer", "stephens_stock", "styrian_coarse_haired_hound", "sussex_spaniel", "swedish_lapphund", "swedish_vallhund", "swinford_bandog", "taigan", "taiwan_dog", "tamaskan_dog", "tang_dog", "tazy", "teddy_roosevelt_terrier", "telomian", "tenterfield_terrier", "terrier_brasileiro", "thai_bangkaew_dog", "thai_ridgeback", "tibetan_kyi_apso", "tibetan_mastiff", "tibetan_spaniel", "tibetan_terrier", "tornjak", "tosa_inu", "toy_fox_terrier", "toy_manchester_terrier", "toy_poodle", "transylvanian_hound", "treeing_cur", "treeing_feist", "treeing_tennessee_brindle", "treeing_walker_coonhound", "trigg_hound", "tyrolean_hound", "vikhan", "vizsla", "volpino_italiano", "weimaraner", "welsh_hound", "welsh_sheepdog", "welsh_springer_spaniel", "welsh_terrier", "west_highland_white_terrier", "west_siberian_laika", "westphalian_dachsbracke", "wetterhoun", "whippet", "white_shepherd", "white_swiss_shepherd_dog", "wire_fox_terrier", "wirehaired_pointing_griffon", "wirehaired_vizsla", "wolf", "xiasi_dog", "xoloitzcuintle", "yakutian_laika", "yorkshire_terrier", "zerdava" ]
DHEIVER/Brain_Tumor_Classification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Brain_Tumor_Classification This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.1012 - Accuracy: 0.9647 - F1: 0.9647 - Recall: 0.9647 - Precision: 0.9647 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Recall | Precision | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:|:---------:| | 0.4856 | 0.99 | 83 | 0.3771 | 0.8444 | 0.8444 | 0.8444 | 0.8444 | | 0.3495 | 1.99 | 166 | 0.2608 | 0.8949 | 0.8949 | 0.8949 | 0.8949 | | 0.252 | 2.99 | 249 | 0.1445 | 0.9487 | 0.9487 | 0.9487 | 0.9487 | | 0.2364 | 3.99 | 332 | 0.1029 | 0.9588 | 0.9588 | 0.9588 | 0.9588 | | 0.2178 | 4.99 | 415 | 0.1012 | 0.9647 | 0.9647 | 0.9647 | 0.9647 | ### Framework versions - Transformers 4.23.1 - Pytorch 1.12.1 - Datasets 2.6.1 - Tokenizers 0.13.1
[ "glioma_tumor", "meningioma_tumor", "no_tumor", "pituitary_tumor" ]
DHEIVER/Classificacao_de_Tumores_Cerebrais_usando_transformer_swin
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Classificação_de_Tumores_Cerebrais_usando_transformer_swin This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224-in22k](https://huggingface.co/microsoft/swin-base-patch4-window7-224-in22k) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0118 - Accuracy: 0.9949 - F1: 0.9949 - Recall: 0.9949 - Precision: 0.9949 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Recall | Precision | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:|:---------:| | 0.081 | 1.0 | 180 | 0.0557 | 0.9832 | 0.9832 | 0.9832 | 0.9832 | | 0.0816 | 2.0 | 360 | 0.0187 | 0.9937 | 0.9937 | 0.9937 | 0.9937 | | 0.0543 | 3.0 | 540 | 0.0118 | 0.9949 | 0.9949 | 0.9949 | 0.9949 | ### Framework versions - Transformers 4.23.1 - Pytorch 1.13.0 - Datasets 2.6.1 - Tokenizers 0.13.1
[ "gliomas tumor", "meningiomas tumor", "pituitary tumor" ]
DHEIVER/Diagnostico-de-Cancer-de-Pele-Aperfeicoado
- Loss: 0.7695 - Accuracy: 0.7275 ## Model description This model was created by importing a skin-cancer photo dataset into Google Colab from Kaggle [here](https://www.kaggle.com/datasets/kmader/skin-cancer-mnist-ham10000). An image-classification tutorial [here](https://colab.research.google.com/github/huggingface/notebooks/blob/main/examples/image_classification.ipynb) was then used to train the model. The training notebook is available [here](https://colab.research.google.com/drive/1bMkXnAvAqjX3J2YJ8wXTNw2Z2pt5KCjy?usp=sharing). The possible diseases classified by this model are: 'Actinic-keratoses', 'Basal-cell-carcinoma', 'Benign-keratosis-like-lesions', 'Dermatofibroma', 'Melanocytic-nevi', 'Melanoma', 'Vascular-lesions'. ## Intended uses & limitations More information is needed to fully understand this model's intended uses and specific limitations. ## Training and evaluation data More information is needed about the datasets used to train and evaluate this model.
## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 1 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.6911 | 0.99 | 70 | 0.7695 | 0.7275 | ### Framework versions - Transformers 4.20.1 - Pytorch 1.11.0+cu113 - Datasets 2.3.2 - Tokenizers 0.12.1
[ "actinic-keratoses", "basal-cell-carcinoma", "benign-keratosis-like-lesions", "dermatofibroma", "melanocytic-nevi", "melanoma", "vascular-lesions" ]
Woleek/bg-classif
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.3032 - Accuracy: 0.9231 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 4 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0254 | 2.94 | 50 | 0.4310 | 0.8974 | | 0.001 | 5.88 | 100 | 0.3017 | 0.9231 | | 0.0007 | 8.82 | 150 | 0.3032 | 0.9231 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu117 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "airport", "bathroom", "skatepark", "warehouse", "zoo", "beach", "building_site", "factory", "food", "harbour", "office", "railway", "road" ]
ArthurMor4is/vit-base-patch16-224-finetuned-covid_ct_set_resumed
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-covid_ct_set_resumed This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.6175 - Accuracy: 0.6111 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 1 | 0.6175 | 0.6111 | | No log | 2.0 | 2 | 0.6285 | 0.5556 | | No log | 3.0 | 3 | 0.6700 | 0.5556 | | No log | 4.0 | 4 | 0.7347 | 0.5556 | | No log | 5.0 | 5 | 0.7754 | 0.5556 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "covid", "normal" ]
ArthurMor4is/vit-base-patch16-224-finetuned-covid_ct_set_full
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-covid_ct_set_full This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.1225 - Accuracy: 0.9627 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.4343 | 0.99 | 29 | 0.1945 | 0.9298 | | 0.2353 | 1.98 | 58 | 0.2052 | 0.9290 | | 0.1395 | 2.97 | 87 | 0.2567 | 0.9075 | | 0.1399 | 4.0 | 117 | 0.1225 | 0.9627 | | 0.1186 | 4.96 | 145 | 0.1531 | 0.9521 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
[ "covid", "normal" ]
AhmadHakami/alzheimer-image-classification-google-vit-base-patch16
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # alzheimer-image-classification-google-vit-base-patch16 This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the [Alzheimer MRI data](https://www.kaggle.com/datasets/sachinkumar413/alzheimer-mri-dataset). It achieves the following results on the evaluation set: - Loss: 0.2127 - Accuracy: 0.9261 ## Model description The Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a supervised fashion, namely ImageNet-21k, at a resolution of 224x224 pixels. Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. One also adds a [CLS] token to the beginning of a sequence to use it for classification tasks, as well as absolute position embeddings before feeding the sequence to the layers of the Transformer encoder. Note that this model does not provide any fine-tuned heads, as these were zeroed by Google researchers. However, the model does include the pre-trained pooler, which can be used for downstream tasks (such as image classification). By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image. ## Intended uses & limitations You can use the raw model for image classification.
See the [model hub](https://huggingface.co/models?search=google/vit) to look for fine-tuned versions on a task that interests you. ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.8167 | 1.0 | 715 | 0.7520 | 0.6494 | | 0.6264 | 2.0 | 1431 | 0.6467 | 0.7091 | | 0.5003 | 3.0 | 2146 | 0.5430 | 0.7594 | | 0.3543 | 4.0 | 2862 | 0.4372 | 0.8145 | | 0.3816 | 5.0 | 3577 | 0.3681 | 0.8428 | | 0.2055 | 6.0 | 4293 | 0.3746 | 0.8514 | | 0.2526 | 7.0 | 5008 | 0.2836 | 0.8907 | | 0.1262 | 8.0 | 5724 | 0.2798 | 0.8954 | | 0.1332 | 9.0 | 6439 | 0.2301 | 0.9159 | | 0.0702 | 9.99 | 7150 | 0.2127 | 0.9261 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1 - Datasets 2.14.3 - Tokenizers 0.13.3
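The patch arithmetic in the model description above pins down the encoder's sequence length: a 224x224 image cut into 16x16 patches yields 14x14 = 196 patch embeddings, plus one [CLS] token prepended for classification. A small sketch of that calculation:

```python
def vit_sequence_length(image_size: int = 224, patch_size: int = 16) -> int:
    # The image is split into non-overlapping patch_size x patch_size patches,
    # each linearly embedded; one [CLS] token is prepended for classification.
    assert image_size % patch_size == 0, "image size must be divisible by patch size"
    patches_per_side = image_size // patch_size
    return patches_per_side ** 2 + 1

print(vit_sequence_length())  # 197 tokens for ViT-Base/16 at 224x224
```

The same formula explains why fine-tuning at a higher resolution (e.g. 384x384) changes the sequence length (to 577) and requires interpolating the position embeddings.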
[ "mild_demented", "moderate_demented", "non_demented", "very_mild_demented" ]
Onno/hotels_classifier
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. --> # Onno/hotels_classifier This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Train Loss: 0.4492 - Validation Loss: 0.5853 - Train Accuracy: 0.6548 - Epoch: 14 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 5025, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01} - training_precision: float32 ### Training results | Train Loss | Validation Loss | Train Accuracy | Epoch | |:----------:|:---------------:|:--------------:|:-----:| | 0.6757 | 0.6910 | 0.5119 | 0 | | 0.6569 | 0.6739 | 0.5357 | 1 | | 0.6395 | 0.6663 | 0.5357 | 2 | | 0.6161 | 0.6465 | 0.6071 | 3 | | 0.5919 | 0.6299 | 0.6548 | 4 | | 0.5801 | 0.6173 | 0.6429 | 5 | | 0.5518 | 0.6039 | 0.6310 | 6 | | 0.5414 | 0.6205 | 0.6905 | 7 | | 0.5181 | 0.6138 | 0.6548 | 8 | | 0.4902 | 0.6300 | 0.6667 | 9 | | 0.4824 | 0.6672 | 0.6667 | 10 | | 0.4493 | 0.6038 | 0.6071 | 11 | | 0.4287 | 0.6329 | 0.6667 | 12 | | 0.4668 | 0.6371 | 0.6548 | 13 | | 0.4492 | 0.5853 | 0.6548 | 14 | ### Framework versions - Transformers 4.32.0 - TensorFlow 2.12.0 - Datasets 2.14.4 - Tokenizers 0.13.3
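The `PolynomialDecay` schedule in the optimizer config above (power 1.0, `cycle: False`) is simply a linear ramp from 3e-05 down to 0 over 5025 steps. A pure-Python sketch of the Keras formula, with the step clipped at `decay_steps`:

```python
def polynomial_decay(step: int,
                     initial_lr: float = 3e-05,
                     decay_steps: int = 5025,
                     end_lr: float = 0.0,
                     power: float = 1.0) -> float:
    # Keras PolynomialDecay (cycle=False): clip the step at decay_steps,
    # then interpolate between initial_lr and end_lr.
    step = min(step, decay_steps)
    frac = 1.0 - step / decay_steps
    return (initial_lr - end_lr) * frac ** power + end_lr

print(polynomial_decay(0))      # 3e-05 at the first step
print(polynomial_decay(5025))   # 0.0 once decay_steps is reached
```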
[ "negative", "positive" ]
minchiosa/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-flower This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results ### Framework versions - Transformers 4.24.0 - Pytorch 2.0.1+cu118 - Datasets 2.7.1 - Tokenizers 0.13.3
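Auto-generated cards like this one omit the label mapping, but for an `imagefolder` dataset the `id2label`/`label2id` dicts stored in the model's `config.json` are built from the class-folder names. A sketch, assuming the five flower classes listed with this model and alphabetical folder order:

```python
labels = ["daisy", "dandelion", "roses", "sunflowers", "tulips"]  # assumed folder names

# The mappings stored in config.json are plain index <-> name dicts.
id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in enumerate(labels)}

print(id2label[0], label2id["tulips"])  # daisy 4
```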
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
ZachBeesley/food-classifier
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. --> # ZachBeesley/food-classifier This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Train Loss: 0.3376 - Validation Loss: 0.3213 - Train Accuracy: 0.921 - Epoch: 4 ## Model description Image-classification model that can identify foods based on pictures ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 20000, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01} - training_precision: float32 ### Training results | Train Loss | Validation Loss | Train Accuracy | Epoch | |:----------:|:---------------:|:--------------:|:-----:| | 2.6919 | 1.5372 | 0.848 | 0 | | 1.1404 | 0.8059 | 0.881 | 1 | | 0.6375 | 0.6164 | 0.865 | 2 | | 0.4379 | 0.3822 | 0.915 | 3 | | 0.3376 | 0.3213 | 0.921 | 4 | ### Framework versions - Transformers 4.31.0 - TensorFlow 2.12.0 - Datasets 2.14.4 - Tokenizers 0.13.3
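At inference time a classifier head like this one emits one logit per food class; turning those logits into the ranked scores shown by the `image-classification` pipeline amounts to a softmax plus a top-k sort. A dependency-free sketch with hypothetical logits over a tiny subset of the labels:

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def top_k(labels, logits, k=3):
    # Pair each label with its probability and keep the k highest.
    probs = softmax(logits)
    ranked = sorted(zip(labels, probs), key=lambda pair: pair[1], reverse=True)
    return ranked[:k]

demo_labels = ["pizza", "sushi", "ramen", "tacos"]   # tiny subset for illustration
demo_logits = [2.0, 0.5, 1.0, -1.0]                  # hypothetical model outputs
print(top_k(demo_labels, demo_logits, k=2))          # pizza first, then ramen
```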
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]