| model_id | model_card | model_labels |
|---|---|---|
bdpc/resnet101_rvl-cdip-_rvl_cdip-NK1000__CEKD_t2.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet101_rvl-cdip-_rvl_cdip-NK1000__CEKD_t2.5_a0.5
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6065
- Accuracy: 0.7915
- Brier Loss: 0.3054
- NLL: 1.9957
- F1 Micro: 0.7915
- F1 Macro: 0.7910
- ECE: 0.0453
- AURC: 0.0607
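ECE is read here as the expected calibration error of the predicted probabilities. A minimal NumPy sketch under the usual equal-width-bin definition follows; the bin count and binning scheme are assumptions, not taken from this card:

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=15):
    """ECE over equal-width confidence bins (binning scheme assumed)."""
    confidences = probs.max(axis=1)  # top predicted probability per sample
    accuracies = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            # gap between average accuracy and average confidence, weighted by bin mass
            ece += in_bin.mean() * abs(accuracies[in_bin].mean() - confidences[in_bin].mean())
    return ece
```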
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
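The `CEKD_t2.5_a0.5` suffix in the model name suggests a cross-entropy plus knowledge-distillation objective with temperature 2.5 and mixing weight 0.5. A hedged PyTorch sketch of that standard combined loss follows; the exact weighting convention used for this run is an assumption:

```python
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, T=2.5, alpha=0.5):
    """Hard-label cross-entropy mixed with temperature-scaled KL distillation."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # T^2 rescaling keeps gradient magnitudes comparable to the CE term
    return alpha * ce + (1.0 - alpha) * kd
```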
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 250 | 4.1565 | 0.1378 | 0.9318 | 7.9039 | 0.1378 | 0.1073 | 0.0673 | 0.8326 |
| 4.1485 | 2.0 | 500 | 3.6932 | 0.3235 | 0.8832 | 5.1525 | 0.3235 | 0.2725 | 0.2044 | 0.5507 |
| 4.1485 | 3.0 | 750 | 2.3374 | 0.4725 | 0.6611 | 3.3127 | 0.4725 | 0.4311 | 0.0839 | 0.2921 |
| 2.392 | 4.0 | 1000 | 1.6516 | 0.588 | 0.5470 | 2.8681 | 0.588 | 0.5789 | 0.0620 | 0.1929 |
| 2.392 | 5.0 | 1250 | 1.3260 | 0.6488 | 0.4782 | 2.6378 | 0.6488 | 0.6444 | 0.0486 | 0.1458 |
| 1.1422 | 6.0 | 1500 | 1.0390 | 0.702 | 0.4156 | 2.4086 | 0.702 | 0.7029 | 0.0576 | 0.1097 |
| 1.1422 | 7.0 | 1750 | 0.8420 | 0.7288 | 0.3738 | 2.2222 | 0.7288 | 0.7300 | 0.0553 | 0.0888 |
| 0.708 | 8.0 | 2000 | 0.7753 | 0.7398 | 0.3586 | 2.1518 | 0.7398 | 0.7396 | 0.0587 | 0.0826 |
| 0.708 | 9.0 | 2250 | 0.7797 | 0.7462 | 0.3580 | 2.1095 | 0.7462 | 0.7457 | 0.0581 | 0.0820 |
| 0.5195 | 10.0 | 2500 | 0.7101 | 0.7602 | 0.3404 | 2.0711 | 0.7602 | 0.7612 | 0.0473 | 0.0733 |
| 0.5195 | 11.0 | 2750 | 0.6971 | 0.7645 | 0.3338 | 2.0649 | 0.7645 | 0.7653 | 0.0541 | 0.0715 |
| 0.4176 | 12.0 | 3000 | 0.6936 | 0.7712 | 0.3302 | 2.0265 | 0.7712 | 0.7708 | 0.0515 | 0.0702 |
| 0.4176 | 13.0 | 3250 | 0.6991 | 0.7662 | 0.3346 | 2.0582 | 0.7663 | 0.7657 | 0.0581 | 0.0723 |
| 0.3573 | 14.0 | 3500 | 0.6672 | 0.7722 | 0.3246 | 2.0053 | 0.7722 | 0.7723 | 0.0551 | 0.0683 |
| 0.3573 | 15.0 | 3750 | 0.6735 | 0.777 | 0.3244 | 2.0387 | 0.777 | 0.7782 | 0.0488 | 0.0671 |
| 0.3193 | 16.0 | 4000 | 0.6567 | 0.776 | 0.3216 | 2.0256 | 0.776 | 0.7773 | 0.0499 | 0.0678 |
| 0.3193 | 17.0 | 4250 | 0.6498 | 0.78 | 0.3184 | 1.9865 | 0.78 | 0.7802 | 0.0477 | 0.0662 |
| 0.2893 | 18.0 | 4500 | 0.6763 | 0.7755 | 0.3264 | 2.0844 | 0.7755 | 0.7755 | 0.0531 | 0.0697 |
| 0.2893 | 19.0 | 4750 | 0.6519 | 0.7815 | 0.3183 | 2.0458 | 0.7815 | 0.7817 | 0.0513 | 0.0658 |
| 0.271 | 20.0 | 5000 | 0.6432 | 0.7823 | 0.3147 | 2.0291 | 0.7823 | 0.7827 | 0.0440 | 0.0645 |
| 0.271 | 21.0 | 5250 | 0.6456 | 0.781 | 0.3156 | 2.0493 | 0.7810 | 0.7813 | 0.0487 | 0.0652 |
| 0.2516 | 22.0 | 5500 | 0.6336 | 0.7823 | 0.3144 | 1.9829 | 0.7823 | 0.7822 | 0.0522 | 0.0642 |
| 0.2516 | 23.0 | 5750 | 0.6333 | 0.7837 | 0.3128 | 2.0196 | 0.7837 | 0.7836 | 0.0492 | 0.0641 |
| 0.2397 | 24.0 | 6000 | 0.6337 | 0.7817 | 0.3147 | 2.0180 | 0.7817 | 0.7815 | 0.0494 | 0.0644 |
| 0.2397 | 25.0 | 6250 | 0.6347 | 0.7857 | 0.3145 | 2.0187 | 0.7857 | 0.7856 | 0.0510 | 0.0641 |
| 0.23 | 26.0 | 6500 | 0.6311 | 0.7815 | 0.3129 | 2.0132 | 0.7815 | 0.7819 | 0.0495 | 0.0637 |
| 0.23 | 27.0 | 6750 | 0.6329 | 0.7853 | 0.3125 | 2.0708 | 0.7853 | 0.7852 | 0.0502 | 0.0635 |
| 0.2191 | 28.0 | 7000 | 0.6222 | 0.786 | 0.3109 | 2.0022 | 0.786 | 0.7856 | 0.0483 | 0.0638 |
| 0.2191 | 29.0 | 7250 | 0.6195 | 0.7863 | 0.3096 | 2.0028 | 0.7863 | 0.7859 | 0.0550 | 0.0620 |
| 0.2155 | 30.0 | 7500 | 0.6196 | 0.7883 | 0.3090 | 1.9972 | 0.7883 | 0.7883 | 0.0486 | 0.0624 |
| 0.2155 | 31.0 | 7750 | 0.6167 | 0.787 | 0.3080 | 2.0173 | 0.787 | 0.7871 | 0.0443 | 0.0623 |
| 0.2074 | 32.0 | 8000 | 0.6143 | 0.7897 | 0.3073 | 2.0223 | 0.7897 | 0.7893 | 0.0443 | 0.0614 |
| 0.2074 | 33.0 | 8250 | 0.6123 | 0.787 | 0.3078 | 1.9869 | 0.787 | 0.7866 | 0.0458 | 0.0619 |
| 0.2028 | 34.0 | 8500 | 0.6137 | 0.7873 | 0.3070 | 1.9883 | 0.7873 | 0.7868 | 0.0457 | 0.0623 |
| 0.2028 | 35.0 | 8750 | 0.6152 | 0.786 | 0.3085 | 2.0108 | 0.786 | 0.7863 | 0.0497 | 0.0626 |
| 0.1982 | 36.0 | 9000 | 0.6133 | 0.7863 | 0.3077 | 2.0205 | 0.7863 | 0.7862 | 0.0515 | 0.0615 |
| 0.1982 | 37.0 | 9250 | 0.6145 | 0.7877 | 0.3081 | 1.9930 | 0.7877 | 0.7879 | 0.0444 | 0.0621 |
| 0.1948 | 38.0 | 9500 | 0.6116 | 0.7857 | 0.3078 | 2.0072 | 0.7857 | 0.7854 | 0.0508 | 0.0619 |
| 0.1948 | 39.0 | 9750 | 0.6090 | 0.788 | 0.3059 | 1.9954 | 0.788 | 0.7882 | 0.0430 | 0.0614 |
| 0.1933 | 40.0 | 10000 | 0.6143 | 0.7897 | 0.3072 | 1.9943 | 0.7897 | 0.7899 | 0.0462 | 0.0618 |
| 0.1933 | 41.0 | 10250 | 0.6061 | 0.7887 | 0.3041 | 1.9900 | 0.7887 | 0.7889 | 0.0439 | 0.0606 |
| 0.1882 | 42.0 | 10500 | 0.6070 | 0.7865 | 0.3058 | 1.9907 | 0.7865 | 0.7868 | 0.0438 | 0.0607 |
| 0.1882 | 43.0 | 10750 | 0.6083 | 0.788 | 0.3054 | 2.0095 | 0.788 | 0.7877 | 0.0489 | 0.0608 |
| 0.1871 | 44.0 | 11000 | 0.6083 | 0.787 | 0.3054 | 1.9828 | 0.787 | 0.7872 | 0.0469 | 0.0607 |
| 0.1871 | 45.0 | 11250 | 0.6092 | 0.7893 | 0.3057 | 2.0140 | 0.7893 | 0.7891 | 0.0483 | 0.0608 |
| 0.1862 | 46.0 | 11500 | 0.6057 | 0.7893 | 0.3053 | 2.0064 | 0.7893 | 0.7890 | 0.0450 | 0.0609 |
| 0.1862 | 47.0 | 11750 | 0.6042 | 0.79 | 0.3044 | 1.9691 | 0.79 | 0.7899 | 0.0435 | 0.0607 |
| 0.1845 | 48.0 | 12000 | 0.6068 | 0.79 | 0.3053 | 2.0052 | 0.79 | 0.7899 | 0.0438 | 0.0608 |
| 0.1845 | 49.0 | 12250 | 0.6081 | 0.7893 | 0.3062 | 2.0117 | 0.7893 | 0.7890 | 0.0485 | 0.0612 |
| 0.1836 | 50.0 | 12500 | 0.6065 | 0.7915 | 0.3054 | 1.9957 | 0.7915 | 0.7910 | 0.0453 | 0.0607 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
hansin91/activity_classification
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# activity_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7631
- Accuracy: 0.7710
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
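A minimal sketch of how the hyperparameters above would map onto `transformers.TrainingArguments`; the output path is a placeholder and the model/dataset setup is omitted:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="./activity_classification",  # placeholder path
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    adam_beta1=0.9,   # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```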
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.1235 | 1.0 | 315 | 1.3182 | 0.7099 |
| 1.0404 | 2.0 | 630 | 0.9831 | 0.7278 |
| 0.7899 | 3.0 | 945 | 0.9509 | 0.7175 |
| 0.6961 | 4.0 | 1260 | 0.8258 | 0.7460 |
| 0.615 | 5.0 | 1575 | 0.7890 | 0.7667 |
| 0.5534 | 6.0 | 1890 | 0.7876 | 0.7591 |
| 0.524 | 7.0 | 2205 | 0.7627 | 0.7663 |
| 0.4588 | 8.0 | 2520 | 0.8256 | 0.7468 |
| 0.4407 | 9.0 | 2835 | 0.8041 | 0.7615 |
| 0.4039 | 10.0 | 3150 | 0.8367 | 0.7540 |
| 0.3966 | 11.0 | 3465 | 0.8708 | 0.7492 |
| 0.366 | 12.0 | 3780 | 0.8410 | 0.7544 |
| 0.3522 | 13.0 | 4095 | 0.9019 | 0.7365 |
| 0.3495 | 14.0 | 4410 | 0.8240 | 0.7567 |
| 0.3206 | 15.0 | 4725 | 0.8428 | 0.7607 |
| 0.3172 | 16.0 | 5040 | 0.8626 | 0.7607 |
| 0.2931 | 17.0 | 5355 | 1.0311 | 0.7306 |
| 0.2943 | 18.0 | 5670 | 0.9393 | 0.7544 |
| 0.2886 | 19.0 | 5985 | 0.9379 | 0.7472 |
| 0.2785 | 20.0 | 6300 | 0.8911 | 0.7552 |
| 0.274 | 21.0 | 6615 | 0.9730 | 0.7484 |
| 0.2716 | 22.0 | 6930 | 0.9546 | 0.7504 |
| 0.2686 | 23.0 | 7245 | 0.8939 | 0.7651 |
| 0.2489 | 24.0 | 7560 | 0.9397 | 0.7480 |
| 0.257 | 25.0 | 7875 | 0.9298 | 0.7552 |
| 0.244 | 26.0 | 8190 | 0.9977 | 0.7437 |
| 0.2333 | 27.0 | 8505 | 0.9967 | 0.75 |
| 0.2376 | 28.0 | 8820 | 1.0012 | 0.7508 |
| 0.2428 | 29.0 | 9135 | 0.9674 | 0.7421 |
| 0.224 | 30.0 | 9450 | 1.0239 | 0.7361 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
[
"calling",
"clapping",
"running",
"sitting",
"sleeping",
"texting",
"using_laptop",
"cycling",
"dancing",
"drinking",
"eating",
"fighting",
"hugging",
"laughing",
"listening_to_music"
] |
anggtpd/snacks_classifier
|
# snacks_classifier
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0090
- Accuracy: 0.8908
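A minimal inference sketch using the `transformers` image-classification pipeline; the image path is a placeholder:

```python
from transformers import pipeline

classifier = pipeline("image-classification", model="anggtpd/snacks_classifier")
print(classifier("snack.jpg"))  # list of {'label': ..., 'score': ...} dicts
```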
|
[
"tench, tinca tinca",
"goldfish, carassius auratus",
"great white shark, white shark, man-eater, man-eating shark, carcharodon carcharias",
"tiger shark, galeocerdo cuvieri",
"hammerhead, hammerhead shark",
"electric ray, crampfish, numbfish, torpedo",
"stingray",
"cock",
"hen",
"ostrich, struthio camelus",
"brambling, fringilla montifringilla",
"goldfinch, carduelis carduelis",
"house finch, linnet, carpodacus mexicanus",
"junco, snowbird",
"indigo bunting, indigo finch, indigo bird, passerina cyanea",
"robin, american robin, turdus migratorius",
"bulbul",
"jay",
"magpie",
"chickadee",
"water ouzel, dipper",
"kite",
"bald eagle, american eagle, haliaeetus leucocephalus",
"vulture",
"great grey owl, great gray owl, strix nebulosa",
"european fire salamander, salamandra salamandra",
"common newt, triturus vulgaris",
"eft",
"spotted salamander, ambystoma maculatum",
"axolotl, mud puppy, ambystoma mexicanum",
"bullfrog, rana catesbeiana",
"tree frog, tree-frog",
"tailed frog, bell toad, ribbed toad, tailed toad, ascaphus trui",
"loggerhead, loggerhead turtle, caretta caretta",
"leatherback turtle, leatherback, leathery turtle, dermochelys coriacea",
"mud turtle",
"terrapin",
"box turtle, box tortoise",
"banded gecko",
"common iguana, iguana, iguana iguana",
"american chameleon, anole, anolis carolinensis",
"whiptail, whiptail lizard",
"agama",
"frilled lizard, chlamydosaurus kingi",
"alligator lizard",
"gila monster, heloderma suspectum",
"green lizard, lacerta viridis",
"african chameleon, chamaeleo chamaeleon",
"komodo dragon, komodo lizard, dragon lizard, giant lizard, varanus komodoensis",
"african crocodile, nile crocodile, crocodylus niloticus",
"american alligator, alligator mississipiensis",
"triceratops",
"thunder snake, worm snake, carphophis amoenus",
"ringneck snake, ring-necked snake, ring snake",
"hognose snake, puff adder, sand viper",
"green snake, grass snake",
"king snake, kingsnake",
"garter snake, grass snake",
"water snake",
"vine snake",
"night snake, hypsiglena torquata",
"boa constrictor, constrictor constrictor",
"rock python, rock snake, python sebae",
"indian cobra, naja naja",
"green mamba",
"sea snake",
"horned viper, cerastes, sand viper, horned asp, cerastes cornutus",
"diamondback, diamondback rattlesnake, crotalus adamanteus",
"sidewinder, horned rattlesnake, crotalus cerastes",
"trilobite",
"harvestman, daddy longlegs, phalangium opilio",
"scorpion",
"black and gold garden spider, argiope aurantia",
"barn spider, araneus cavaticus",
"garden spider, aranea diademata",
"black widow, latrodectus mactans",
"tarantula",
"wolf spider, hunting spider",
"tick",
"centipede",
"black grouse",
"ptarmigan",
"ruffed grouse, partridge, bonasa umbellus",
"prairie chicken, prairie grouse, prairie fowl",
"peacock",
"quail",
"partridge",
"african grey, african gray, psittacus erithacus",
"macaw",
"sulphur-crested cockatoo, kakatoe galerita, cacatua galerita",
"lorikeet",
"coucal",
"bee eater",
"hornbill",
"hummingbird",
"jacamar",
"toucan",
"drake",
"red-breasted merganser, mergus serrator",
"goose",
"black swan, cygnus atratus",
"tusker",
"echidna, spiny anteater, anteater",
"platypus, duckbill, duckbilled platypus, duck-billed platypus, ornithorhynchus anatinus",
"wallaby, brush kangaroo",
"koala, koala bear, kangaroo bear, native bear, phascolarctos cinereus",
"wombat",
"jellyfish",
"sea anemone, anemone",
"brain coral",
"flatworm, platyhelminth",
"nematode, nematode worm, roundworm",
"conch",
"snail",
"slug",
"sea slug, nudibranch",
"chiton, coat-of-mail shell, sea cradle, polyplacophore",
"chambered nautilus, pearly nautilus, nautilus",
"dungeness crab, cancer magister",
"rock crab, cancer irroratus",
"fiddler crab",
"king crab, alaska crab, alaskan king crab, alaska king crab, paralithodes camtschatica",
"american lobster, northern lobster, maine lobster, homarus americanus",
"spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish",
"crayfish, crawfish, crawdad, crawdaddy",
"hermit crab",
"isopod",
"white stork, ciconia ciconia",
"black stork, ciconia nigra",
"spoonbill",
"flamingo",
"little blue heron, egretta caerulea",
"american egret, great white heron, egretta albus",
"bittern",
"crane",
"limpkin, aramus pictus",
"european gallinule, porphyrio porphyrio",
"american coot, marsh hen, mud hen, water hen, fulica americana",
"bustard",
"ruddy turnstone, arenaria interpres",
"red-backed sandpiper, dunlin, erolia alpina",
"redshank, tringa totanus",
"dowitcher",
"oystercatcher, oyster catcher",
"pelican",
"king penguin, aptenodytes patagonica",
"albatross, mollymawk",
"grey whale, gray whale, devilfish, eschrichtius gibbosus, eschrichtius robustus",
"killer whale, killer, orca, grampus, sea wolf, orcinus orca",
"dugong, dugong dugon",
"sea lion",
"chihuahua",
"japanese spaniel",
"maltese dog, maltese terrier, maltese",
"pekinese, pekingese, peke",
"shih-tzu",
"blenheim spaniel",
"papillon",
"toy terrier",
"rhodesian ridgeback",
"afghan hound, afghan",
"basset, basset hound",
"beagle",
"bloodhound, sleuthhound",
"bluetick",
"black-and-tan coonhound",
"walker hound, walker foxhound",
"english foxhound",
"redbone",
"borzoi, russian wolfhound",
"irish wolfhound",
"italian greyhound",
"whippet",
"ibizan hound, ibizan podenco",
"norwegian elkhound, elkhound",
"otterhound, otter hound",
"saluki, gazelle hound",
"scottish deerhound, deerhound",
"weimaraner",
"staffordshire bullterrier, staffordshire bull terrier",
"american staffordshire terrier, staffordshire terrier, american pit bull terrier, pit bull terrier",
"bedlington terrier",
"border terrier",
"kerry blue terrier",
"irish terrier",
"norfolk terrier",
"norwich terrier",
"yorkshire terrier",
"wire-haired fox terrier",
"lakeland terrier",
"sealyham terrier, sealyham",
"airedale, airedale terrier",
"cairn, cairn terrier",
"australian terrier",
"dandie dinmont, dandie dinmont terrier",
"boston bull, boston terrier",
"miniature schnauzer",
"giant schnauzer",
"standard schnauzer",
"scotch terrier, scottish terrier, scottie",
"tibetan terrier, chrysanthemum dog",
"silky terrier, sydney silky",
"soft-coated wheaten terrier",
"west highland white terrier",
"lhasa, lhasa apso",
"flat-coated retriever",
"curly-coated retriever",
"golden retriever",
"labrador retriever",
"chesapeake bay retriever",
"german short-haired pointer",
"vizsla, hungarian pointer",
"english setter",
"irish setter, red setter",
"gordon setter",
"brittany spaniel",
"clumber, clumber spaniel",
"english springer, english springer spaniel",
"welsh springer spaniel",
"cocker spaniel, english cocker spaniel, cocker",
"sussex spaniel",
"irish water spaniel",
"kuvasz",
"schipperke",
"groenendael",
"malinois",
"briard",
"kelpie",
"komondor",
"old english sheepdog, bobtail",
"shetland sheepdog, shetland sheep dog, shetland",
"collie",
"border collie",
"bouvier des flandres, bouviers des flandres",
"rottweiler",
"german shepherd, german shepherd dog, german police dog, alsatian",
"doberman, doberman pinscher",
"miniature pinscher",
"greater swiss mountain dog",
"bernese mountain dog",
"appenzeller",
"entlebucher",
"boxer",
"bull mastiff",
"tibetan mastiff",
"french bulldog",
"great dane",
"saint bernard, st bernard",
"eskimo dog, husky",
"malamute, malemute, alaskan malamute",
"siberian husky",
"dalmatian, coach dog, carriage dog",
"affenpinscher, monkey pinscher, monkey dog",
"basenji",
"pug, pug-dog",
"leonberg",
"newfoundland, newfoundland dog",
"great pyrenees",
"samoyed, samoyede",
"pomeranian",
"chow, chow chow",
"keeshond",
"brabancon griffon",
"pembroke, pembroke welsh corgi",
"cardigan, cardigan welsh corgi",
"toy poodle",
"miniature poodle",
"standard poodle",
"mexican hairless",
"timber wolf, grey wolf, gray wolf, canis lupus",
"white wolf, arctic wolf, canis lupus tundrarum",
"red wolf, maned wolf, canis rufus, canis niger",
"coyote, prairie wolf, brush wolf, canis latrans",
"dingo, warrigal, warragal, canis dingo",
"dhole, cuon alpinus",
"african hunting dog, hyena dog, cape hunting dog, lycaon pictus",
"hyena, hyaena",
"red fox, vulpes vulpes",
"kit fox, vulpes macrotis",
"arctic fox, white fox, alopex lagopus",
"grey fox, gray fox, urocyon cinereoargenteus",
"tabby, tabby cat",
"tiger cat",
"persian cat",
"siamese cat, siamese",
"egyptian cat",
"cougar, puma, catamount, mountain lion, painter, panther, felis concolor",
"lynx, catamount",
"leopard, panthera pardus",
"snow leopard, ounce, panthera uncia",
"jaguar, panther, panthera onca, felis onca",
"lion, king of beasts, panthera leo",
"tiger, panthera tigris",
"cheetah, chetah, acinonyx jubatus",
"brown bear, bruin, ursus arctos",
"american black bear, black bear, ursus americanus, euarctos americanus",
"ice bear, polar bear, ursus maritimus, thalarctos maritimus",
"sloth bear, melursus ursinus, ursus ursinus",
"mongoose",
"meerkat, mierkat",
"tiger beetle",
"ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle",
"ground beetle, carabid beetle",
"long-horned beetle, longicorn, longicorn beetle",
"leaf beetle, chrysomelid",
"dung beetle",
"rhinoceros beetle",
"weevil",
"fly",
"bee",
"ant, emmet, pismire",
"grasshopper, hopper",
"cricket",
"walking stick, walkingstick, stick insect",
"cockroach, roach",
"mantis, mantid",
"cicada, cicala",
"leafhopper",
"lacewing, lacewing fly",
"dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk",
"damselfly",
"admiral",
"ringlet, ringlet butterfly",
"monarch, monarch butterfly, milkweed butterfly, danaus plexippus",
"cabbage butterfly",
"sulphur butterfly, sulfur butterfly",
"lycaenid, lycaenid butterfly",
"starfish, sea star",
"sea urchin",
"sea cucumber, holothurian",
"wood rabbit, cottontail, cottontail rabbit",
"hare",
"angora, angora rabbit",
"hamster",
"porcupine, hedgehog",
"fox squirrel, eastern fox squirrel, sciurus niger",
"marmot",
"beaver",
"guinea pig, cavia cobaya",
"sorrel",
"zebra",
"hog, pig, grunter, squealer, sus scrofa",
"wild boar, boar, sus scrofa",
"warthog",
"hippopotamus, hippo, river horse, hippopotamus amphibius",
"ox",
"water buffalo, water ox, asiatic buffalo, bubalus bubalis",
"bison",
"ram, tup",
"bighorn, bighorn sheep, cimarron, rocky mountain bighorn, rocky mountain sheep, ovis canadensis",
"ibex, capra ibex",
"hartebeest",
"impala, aepyceros melampus",
"gazelle",
"arabian camel, dromedary, camelus dromedarius",
"llama",
"weasel",
"mink",
"polecat, fitch, foulmart, foumart, mustela putorius",
"black-footed ferret, ferret, mustela nigripes",
"otter",
"skunk, polecat, wood pussy",
"badger",
"armadillo",
"three-toed sloth, ai, bradypus tridactylus",
"orangutan, orang, orangutang, pongo pygmaeus",
"gorilla, gorilla gorilla",
"chimpanzee, chimp, pan troglodytes",
"gibbon, hylobates lar",
"siamang, hylobates syndactylus, symphalangus syndactylus",
"guenon, guenon monkey",
"patas, hussar monkey, erythrocebus patas",
"baboon",
"macaque",
"langur",
"colobus, colobus monkey",
"proboscis monkey, nasalis larvatus",
"marmoset",
"capuchin, ringtail, cebus capucinus",
"howler monkey, howler",
"titi, titi monkey",
"spider monkey, ateles geoffroyi",
"squirrel monkey, saimiri sciureus",
"madagascar cat, ring-tailed lemur, lemur catta",
"indri, indris, indri indri, indri brevicaudatus",
"indian elephant, elephas maximus",
"african elephant, loxodonta africana",
"lesser panda, red panda, panda, bear cat, cat bear, ailurus fulgens",
"giant panda, panda, panda bear, coon bear, ailuropoda melanoleuca",
"barracouta, snoek",
"eel",
"coho, cohoe, coho salmon, blue jack, silver salmon, oncorhynchus kisutch",
"rock beauty, holocanthus tricolor",
"anemone fish",
"sturgeon",
"gar, garfish, garpike, billfish, lepisosteus osseus",
"lionfish",
"puffer, pufferfish, blowfish, globefish",
"abacus",
"abaya",
"academic gown, academic robe, judge's robe",
"accordion, piano accordion, squeeze box",
"acoustic guitar",
"aircraft carrier, carrier, flattop, attack aircraft carrier",
"airliner",
"airship, dirigible",
"altar",
"ambulance",
"amphibian, amphibious vehicle",
"analog clock",
"apiary, bee house",
"apron",
"ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin",
"assault rifle, assault gun",
"backpack, back pack, knapsack, packsack, rucksack, haversack",
"bakery, bakeshop, bakehouse",
"balance beam, beam",
"balloon",
"ballpoint, ballpoint pen, ballpen, biro",
"band aid",
"banjo",
"bannister, banister, balustrade, balusters, handrail",
"barbell",
"barber chair",
"barbershop",
"barn",
"barometer",
"barrel, cask",
"barrow, garden cart, lawn cart, wheelbarrow",
"baseball",
"basketball",
"bassinet",
"bassoon",
"bathing cap, swimming cap",
"bath towel",
"bathtub, bathing tub, bath, tub",
"beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon",
"beacon, lighthouse, beacon light, pharos",
"beaker",
"bearskin, busby, shako",
"beer bottle",
"beer glass",
"bell cote, bell cot",
"bib",
"bicycle-built-for-two, tandem bicycle, tandem",
"bikini, two-piece",
"binder, ring-binder",
"binoculars, field glasses, opera glasses",
"birdhouse",
"boathouse",
"bobsled, bobsleigh, bob",
"bolo tie, bolo, bola tie, bola",
"bonnet, poke bonnet",
"bookcase",
"bookshop, bookstore, bookstall",
"bottlecap",
"bow",
"bow tie, bow-tie, bowtie",
"brass, memorial tablet, plaque",
"brassiere, bra, bandeau",
"breakwater, groin, groyne, mole, bulwark, seawall, jetty",
"breastplate, aegis, egis",
"broom",
"bucket, pail",
"buckle",
"bulletproof vest",
"bullet train, bullet",
"butcher shop, meat market",
"cab, hack, taxi, taxicab",
"caldron, cauldron",
"candle, taper, wax light",
"cannon",
"canoe",
"can opener, tin opener",
"cardigan",
"car mirror",
"carousel, carrousel, merry-go-round, roundabout, whirligig",
"carpenter's kit, tool kit",
"carton",
"car wheel",
"cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, atm",
"cassette",
"cassette player",
"castle",
"catamaran",
"cd player",
"cello, violoncello",
"cellular telephone, cellular phone, cellphone, cell, mobile phone",
"chain",
"chainlink fence",
"chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour",
"chain saw, chainsaw",
"chest",
"chiffonier, commode",
"chime, bell, gong",
"china cabinet, china closet",
"christmas stocking",
"church, church building",
"cinema, movie theater, movie theatre, movie house, picture palace",
"cleaver, meat cleaver, chopper",
"cliff dwelling",
"cloak",
"clog, geta, patten, sabot",
"cocktail shaker",
"coffee mug",
"coffeepot",
"coil, spiral, volute, whorl, helix",
"combination lock",
"computer keyboard, keypad",
"confectionery, confectionary, candy store",
"container ship, containership, container vessel",
"convertible",
"corkscrew, bottle screw",
"cornet, horn, trumpet, trump",
"cowboy boot",
"cowboy hat, ten-gallon hat",
"cradle",
"crane",
"crash helmet",
"crate",
"crib, cot",
"crock pot",
"croquet ball",
"crutch",
"cuirass",
"dam, dike, dyke",
"desk",
"desktop computer",
"dial telephone, dial phone",
"diaper, nappy, napkin",
"digital clock",
"digital watch",
"dining table, board",
"dishrag, dishcloth",
"dishwasher, dish washer, dishwashing machine",
"disk brake, disc brake",
"dock, dockage, docking facility",
"dogsled, dog sled, dog sleigh",
"dome",
"doormat, welcome mat",
"drilling platform, offshore rig",
"drum, membranophone, tympan",
"drumstick",
"dumbbell",
"dutch oven",
"electric fan, blower",
"electric guitar",
"electric locomotive",
"entertainment center",
"envelope",
"espresso maker",
"face powder",
"feather boa, boa",
"file, file cabinet, filing cabinet",
"fireboat",
"fire engine, fire truck",
"fire screen, fireguard",
"flagpole, flagstaff",
"flute, transverse flute",
"folding chair",
"football helmet",
"forklift",
"fountain",
"fountain pen",
"four-poster",
"freight car",
"french horn, horn",
"frying pan, frypan, skillet",
"fur coat",
"garbage truck, dustcart",
"gasmask, respirator, gas helmet",
"gas pump, gasoline pump, petrol pump, island dispenser",
"goblet",
"go-kart",
"golf ball",
"golfcart, golf cart",
"gondola",
"gong, tam-tam",
"gown",
"grand piano, grand",
"greenhouse, nursery, glasshouse",
"grille, radiator grille",
"grocery store, grocery, food market, market",
"guillotine",
"hair slide",
"hair spray",
"half track",
"hammer",
"hamper",
"hand blower, blow dryer, blow drier, hair dryer, hair drier",
"hand-held computer, hand-held microcomputer",
"handkerchief, hankie, hanky, hankey",
"hard disc, hard disk, fixed disk",
"harmonica, mouth organ, harp, mouth harp",
"harp",
"harvester, reaper",
"hatchet",
"holster",
"home theater, home theatre",
"honeycomb",
"hook, claw",
"hoopskirt, crinoline",
"horizontal bar, high bar",
"horse cart, horse-cart",
"hourglass",
"ipod",
"iron, smoothing iron",
"jack-o'-lantern",
"jean, blue jean, denim",
"jeep, landrover",
"jersey, t-shirt, tee shirt",
"jigsaw puzzle",
"jinrikisha, ricksha, rickshaw",
"joystick",
"kimono",
"knee pad",
"knot",
"lab coat, laboratory coat",
"ladle",
"lampshade, lamp shade",
"laptop, laptop computer",
"lawn mower, mower",
"lens cap, lens cover",
"letter opener, paper knife, paperknife",
"library",
"lifeboat",
"lighter, light, igniter, ignitor",
"limousine, limo",
"liner, ocean liner",
"lipstick, lip rouge",
"loafer",
"lotion",
"loudspeaker, speaker, speaker unit, loudspeaker system, speaker system",
"loupe, jeweler's loupe",
"lumbermill, sawmill",
"magnetic compass",
"mailbag, postbag",
"mailbox, letter box",
"maillot",
"maillot, tank suit",
"manhole cover",
"maraca",
"marimba, xylophone",
"mask",
"matchstick",
"maypole",
"maze, labyrinth",
"measuring cup",
"medicine chest, medicine cabinet",
"megalith, megalithic structure",
"microphone, mike",
"microwave, microwave oven",
"military uniform",
"milk can",
"minibus",
"miniskirt, mini",
"minivan",
"missile",
"mitten",
"mixing bowl",
"mobile home, manufactured home",
"model t",
"modem",
"monastery",
"monitor",
"moped",
"mortar",
"mortarboard",
"mosque",
"mosquito net",
"motor scooter, scooter",
"mountain bike, all-terrain bike, off-roader",
"mountain tent",
"mouse, computer mouse",
"mousetrap",
"moving van",
"muzzle",
"nail",
"neck brace",
"necklace",
"nipple",
"notebook, notebook computer",
"obelisk",
"oboe, hautboy, hautbois",
"ocarina, sweet potato",
"odometer, hodometer, mileometer, milometer",
"oil filter",
"organ, pipe organ",
"oscilloscope, scope, cathode-ray oscilloscope, cro",
"overskirt",
"oxcart",
"oxygen mask",
"packet",
"paddle, boat paddle",
"paddlewheel, paddle wheel",
"padlock",
"paintbrush",
"pajama, pyjama, pj's, jammies",
"palace",
"panpipe, pandean pipe, syrinx",
"paper towel",
"parachute, chute",
"parallel bars, bars",
"park bench",
"parking meter",
"passenger car, coach, carriage",
"patio, terrace",
"pay-phone, pay-station",
"pedestal, plinth, footstall",
"pencil box, pencil case",
"pencil sharpener",
"perfume, essence",
"petri dish",
"photocopier",
"pick, plectrum, plectron",
"pickelhaube",
"picket fence, paling",
"pickup, pickup truck",
"pier",
"piggy bank, penny bank",
"pill bottle",
"pillow",
"ping-pong ball",
"pinwheel",
"pirate, pirate ship",
"pitcher, ewer",
"plane, carpenter's plane, woodworking plane",
"planetarium",
"plastic bag",
"plate rack",
"plow, plough",
"plunger, plumber's helper",
"polaroid camera, polaroid land camera",
"pole",
"police van, police wagon, paddy wagon, patrol wagon, wagon, black maria",
"poncho",
"pool table, billiard table, snooker table",
"pop bottle, soda bottle",
"pot, flowerpot",
"potter's wheel",
"power drill",
"prayer rug, prayer mat",
"printer",
"prison, prison house",
"projectile, missile",
"projector",
"puck, hockey puck",
"punching bag, punch bag, punching ball, punchball",
"purse",
"quill, quill pen",
"quilt, comforter, comfort, puff",
"racer, race car, racing car",
"racket, racquet",
"radiator",
"radio, wireless",
"radio telescope, radio reflector",
"rain barrel",
"recreational vehicle, rv, r.v.",
"reel",
"reflex camera",
"refrigerator, icebox",
"remote control, remote",
"restaurant, eating house, eating place, eatery",
"revolver, six-gun, six-shooter",
"rifle",
"rocking chair, rocker",
"rotisserie",
"rubber eraser, rubber, pencil eraser",
"rugby ball",
"rule, ruler",
"running shoe",
"safe",
"safety pin",
"saltshaker, salt shaker",
"sandal",
"sarong",
"sax, saxophone",
"scabbard",
"scale, weighing machine",
"school bus",
"schooner",
"scoreboard",
"screen, crt screen",
"screw",
"screwdriver",
"seat belt, seatbelt",
"sewing machine",
"shield, buckler",
"shoe shop, shoe-shop, shoe store",
"shoji",
"shopping basket",
"shopping cart",
"shovel",
"shower cap",
"shower curtain",
"ski",
"ski mask",
"sleeping bag",
"slide rule, slipstick",
"sliding door",
"slot, one-armed bandit",
"snorkel",
"snowmobile",
"snowplow, snowplough",
"soap dispenser",
"soccer ball",
"sock",
"solar dish, solar collector, solar furnace",
"sombrero",
"soup bowl",
"space bar",
"space heater",
"space shuttle",
"spatula",
"speedboat",
"spider web, spider's web",
"spindle",
"sports car, sport car",
"spotlight, spot",
"stage",
"steam locomotive",
"steel arch bridge",
"steel drum",
"stethoscope",
"stole",
"stone wall",
"stopwatch, stop watch",
"stove",
"strainer",
"streetcar, tram, tramcar, trolley, trolley car",
"stretcher",
"studio couch, day bed",
"stupa, tope",
"submarine, pigboat, sub, u-boat",
"suit, suit of clothes",
"sundial",
"sunglass",
"sunglasses, dark glasses, shades",
"sunscreen, sunblock, sun blocker",
"suspension bridge",
"swab, swob, mop",
"sweatshirt",
"swimming trunks, bathing trunks",
"swing",
"switch, electric switch, electrical switch",
"syringe",
"table lamp",
"tank, army tank, armored combat vehicle, armoured combat vehicle",
"tape player",
"teapot",
"teddy, teddy bear",
"television, television system",
"tennis ball",
"thatch, thatched roof",
"theater curtain, theatre curtain",
"thimble",
"thresher, thrasher, threshing machine",
"throne",
"tile roof",
"toaster",
"tobacco shop, tobacconist shop, tobacconist",
"toilet seat",
"torch",
"totem pole",
"tow truck, tow car, wrecker",
"toyshop",
"tractor",
"trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi",
"tray",
"trench coat",
"tricycle, trike, velocipede",
"trimaran",
"tripod",
"triumphal arch",
"trolleybus, trolley coach, trackless trolley",
"trombone",
"tub, vat",
"turnstile",
"typewriter keyboard",
"umbrella",
"unicycle, monocycle",
"upright, upright piano",
"vacuum, vacuum cleaner",
"vase",
"vault",
"velvet",
"vending machine",
"vestment",
"viaduct",
"violin, fiddle",
"volleyball",
"waffle iron",
"wall clock",
"wallet, billfold, notecase, pocketbook",
"wardrobe, closet, press",
"warplane, military plane",
"washbasin, handbasin, washbowl, lavabo, wash-hand basin",
"washer, automatic washer, washing machine",
"water bottle",
"water jug",
"water tower",
"whiskey jug",
"whistle",
"wig",
"window screen",
"window shade",
"windsor tie",
"wine bottle",
"wing",
"wok",
"wooden spoon",
"wool, woolen, woollen",
"worm fence, snake fence, snake-rail fence, virginia fence",
"wreck",
"yawl",
"yurt",
"web site, website, internet site, site",
"comic book",
"crossword puzzle, crossword",
"street sign",
"traffic light, traffic signal, stoplight",
"book jacket, dust cover, dust jacket, dust wrapper",
"menu",
"plate",
"guacamole",
"consomme",
"hot pot, hotpot",
"trifle",
"ice cream, icecream",
"ice lolly, lolly, lollipop, popsicle",
"french loaf",
"bagel, beigel",
"pretzel",
"cheeseburger",
"hotdog, hot dog, red hot",
"mashed potato",
"head cabbage",
"broccoli",
"cauliflower",
"zucchini, courgette",
"spaghetti squash",
"acorn squash",
"butternut squash",
"cucumber, cuke",
"artichoke, globe artichoke",
"bell pepper",
"cardoon",
"mushroom",
"granny smith",
"strawberry",
"orange",
"lemon",
"fig",
"pineapple, ananas",
"banana",
"jackfruit, jak, jack",
"custard apple",
"pomegranate",
"hay",
"carbonara",
"chocolate sauce, chocolate syrup",
"dough",
"meat loaf, meatloaf",
"pizza, pizza pie",
"potpie",
"burrito",
"red wine",
"espresso",
"cup",
"eggnog",
"alp",
"bubble",
"cliff, drop, drop-off",
"coral reef",
"geyser",
"lakeside, lakeshore",
"promontory, headland, head, foreland",
"sandbar, sand bar",
"seashore, coast, seacoast, sea-coast",
"valley, vale",
"volcano",
"ballplayer, baseball player",
"groom, bridegroom",
"scuba diver",
"rapeseed",
"daisy",
"yellow lady's slipper, yellow lady-slipper, cypripedium calceolus, cypripedium parviflorum",
"corn",
"acorn",
"hip, rose hip, rosehip",
"buckeye, horse chestnut, conker",
"coral fungus",
"agaric",
"gyromitra",
"stinkhorn, carrion fungus",
"earthstar",
"hen-of-the-woods, hen of the woods, polyporus frondosus, grifola frondosa",
"bolete",
"ear, spike, capitulum",
"toilet tissue, toilet paper, bathroom tissue"
] |
purabp1249/swin-tiny-patch4-window7-224-finetuned-herbify_plants
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-herbify_plants
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0703
- Accuracy: 0.9868
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 4
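With a per-device batch size of 8 and 4 accumulation steps, each optimizer update averages gradients over 8 × 4 = 32 samples, matching the `total_train_batch_size` above. A minimal PyTorch sketch of that accumulation pattern; the model, data, and learning rate are stand-ins:

```python
import torch
from torch import nn

model = nn.Linear(10, 5)  # stand-in for the Swin classifier
optimizer = torch.optim.Adam(model.parameters(), lr=5e-5)
loss_fn = nn.CrossEntropyLoss()
accum_steps = 4           # gradient_accumulation_steps from the card

optimizer.zero_grad()
for step in range(12):    # stand-in for iterating micro-batches of 8
    x, y = torch.randn(8, 10), torch.randint(0, 5, (8,))
    loss = loss_fn(model(x), y) / accum_steps  # scale so gradients average over 32 samples
    loss.backward()
    if (step + 1) % accum_steps == 0:          # one optimizer update per 4 micro-batches
        optimizer.step()
        optimizer.zero_grad()
```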
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7849 | 0.99 | 21 | 0.4281 | 0.8553 |
| 0.2498 | 1.98 | 42 | 0.1295 | 0.9737 |
| 0.1387 | 2.96 | 63 | 0.0703 | 0.9868 |
| 0.1039 | 3.95 | 84 | 0.0741 | 0.9737 |
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cpu
- Datasets 2.14.5
- Tokenizers 0.13.3
|
[
"aloevera",
"amruta_balli",
"avacado",
"bamboo",
"betel"
] |
grahmatagung/flower_classifier
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# flower_classifier
This model is a fine-tuned version of [jonathanfernandes/vit-base-patch16-224-finetuned-flower](https://huggingface.co/jonathanfernandes/vit-base-patch16-224-finetuned-flower) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2238
- Accuracy: 0.9878
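The label set for this model (listed after the card) consists of numeric class ids rather than flower names, so predictions come back as strings like `"42"`. Assuming the repo stores the usual `id2label` mapping, it can be inspected like this:

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("grahmatagung/flower_classifier")
# Maps the model's output index to the numeric string label, e.g. 0 -> "1"
print(config.id2label[0])
```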
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.0568 | 1.0 | 102 | 0.9734 | 0.9322 |
| 0.3193 | 2.0 | 205 | 0.3169 | 0.9805 |
| 0.1862 | 2.99 | 306 | 0.2250 | 0.9853 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
[
"1",
"10",
"16",
"98",
"99",
"17",
"18",
"19",
"2",
"20",
"21",
"22",
"23",
"24",
"100",
"25",
"26",
"27",
"28",
"29",
"3",
"30",
"31",
"32",
"33",
"101",
"34",
"35",
"36",
"37",
"38",
"39",
"4",
"40",
"41",
"42",
"102",
"43",
"44",
"45",
"46",
"47",
"48",
"49",
"5",
"50",
"51",
"11",
"52",
"53",
"54",
"55",
"56",
"57",
"58",
"59",
"6",
"60",
"12",
"61",
"62",
"63",
"64",
"65",
"66",
"67",
"68",
"69",
"7",
"13",
"70",
"71",
"72",
"73",
"74",
"75",
"76",
"77",
"78",
"79",
"14",
"8",
"80",
"81",
"82",
"83",
"84",
"85",
"86",
"87",
"88",
"15",
"89",
"9",
"90",
"91",
"92",
"93",
"94",
"95",
"96",
"97"
] |
LucyintheSky/pose-estimation-front-side-back
|
# Pose Estimation: front, side, back
## Model description
This model predicts the person's body position relative to the camera: **front, side, back**. It was trained on [Lucy in the Sky](https://www.lucyinthesky.com/shop) images.
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k).
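A minimal inference sketch with the lower-level `transformers` API; the image path is a placeholder:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "LucyintheSky/pose-estimation-front-side-back"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("look.jpg")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])  # "front", "side", or "back"
```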
## Training and evaluation data
It achieves the following results on the evaluation set:
- Loss: 0.2524
- Accuracy: 0.9355
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"back",
"front",
"side"
] |
stevanojs/pokemon_classification
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# pokemon_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0586
- Accuracy: 0.9071
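With a label set this large, the top few scores are more informative than a single argmax; a minimal pipeline sketch (the image path is a placeholder):

```python
from transformers import pipeline

classifier = pipeline("image-classification", model="stevanojs/pokemon_classification")
for pred in classifier("pokemon.png", top_k=5):
    print(f"{pred['label']}: {pred['score']:.3f}")
```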
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 4.3925 | 1.0 | 350 | 4.0653 | 0.6705 |
| 3.2005 | 2.0 | 700 | 3.1602 | 0.8227 |
| 2.3615 | 3.0 | 1050 | 2.4281 | 0.8656 |
| 1.5369 | 4.0 | 1400 | 1.8786 | 0.8821 |
| 1.0741 | 5.0 | 1750 | 1.4818 | 0.9014 |
| 0.7094 | 6.0 | 2100 | 1.2335 | 0.9014 |
| 0.544 | 7.0 | 2450 | 1.0976 | 0.9042 |
| 0.4622 | 8.0 | 2800 | 1.0586 | 0.9071 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"golbat",
"machoke",
"raichu",
"dragonite",
"fearow",
"slowpoke",
"weezing",
"beedrill",
"weedle",
"cloyster",
"vaporeon",
"gyarados",
"golduck",
"zapdos",
"machamp",
"hitmonlee",
"primeape",
"cubone",
"sandslash",
"scyther",
"haunter",
"metapod",
"tentacruel",
"aerodactyl",
"raticate",
"kabutops",
"ninetales",
"zubat",
"rhydon",
"mew",
"pinsir",
"ditto",
"victreebel",
"omanyte",
"horsea",
"magnemite",
"pikachu",
"blastoise",
"venomoth",
"charizard",
"seadra",
"muk",
"spearow",
"bulbasaur",
"bellsprout",
"electrode",
"ivysaur",
"gloom",
"poliwhirl",
"flareon",
"seaking",
"hypno",
"wartortle",
"mankey",
"tentacool",
"exeggcute",
"meowth",
"growlithe",
"tangela",
"drowzee",
"rapidash",
"venonat",
"omastar",
"pidgeot",
"nidorino",
"porygon",
"lickitung",
"rattata",
"machop",
"charmeleon",
"slowbro",
"parasect",
"eevee",
"diglett",
"starmie",
"staryu",
"psyduck",
"dragonair",
"magikarp",
"vileplume",
"marowak",
"pidgeotto",
"shellder",
"mewtwo",
"lapras",
"farfetchd",
"kingler",
"seel",
"kakuna",
"doduo",
"electabuzz",
"charmander",
"rhyhorn",
"tauros",
"dugtrio",
"kabuto",
"poliwrath",
"gengar",
"exeggutor",
"dewgong",
"jigglypuff",
"geodude",
"kadabra",
"nidorina",
"sandshrew",
"grimer",
"persian",
"mrmime",
"pidgey",
"koffing",
"ekans",
"alolan sandslash",
"venusaur",
"snorlax",
"paras",
"jynx",
"chansey",
"weepinbell",
"hitmonchan",
"gastly",
"kangaskhan",
"oddish",
"wigglytuff",
"graveler",
"arcanine",
"clefairy",
"articuno",
"poliwag",
"golem",
"abra",
"squirtle",
"voltorb",
"ponyta",
"moltres",
"nidoqueen",
"magmar",
"onix",
"vulpix",
"butterfree",
"dodrio",
"krabby",
"arbok",
"clefable",
"goldeen",
"magneton",
"dratini",
"caterpie",
"jolteon",
"nidoking",
"alakazam"
] |
bdpc/resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_kd_CEKD_t2.5_a0.5
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5837
- Accuracy: 0.7867
- Brier Loss: 0.3013
- NLL: 1.9882
- F1 Micro: 0.7868
- F1 Macro: 0.7860
- ECE: 0.0529
- AURC: 0.0581
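AURC is read here as the area under the risk-coverage curve from selective prediction: rank predictions by confidence, then average the error rate over all coverage levels. A minimal NumPy sketch under that assumption:

```python
import numpy as np

def aurc(probs, labels):
    """Area under the risk-coverage curve (lower is better)."""
    confidences = probs.max(axis=1)
    errors = (probs.argmax(axis=1) != labels).astype(float)
    order = np.argsort(-confidences)  # most confident predictions first
    # risk (error rate) among the top-k most confident predictions, for every k
    risks = np.cumsum(errors[order]) / np.arange(1, len(errors) + 1)
    return risks.mean()
```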
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
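Per the table below, training ran 250 steps per epoch for 100 epochs (25,000 steps total), so a warmup ratio of 0.1 corresponds to 2,500 warmup steps before the linear decay. A sketch with the `transformers` scheduler helper; the optimizer is a stand-in:

```python
import torch
from transformers import get_linear_schedule_with_warmup

optimizer = torch.optim.Adam([torch.nn.Parameter(torch.zeros(1))], lr=1e-4)  # stand-in
total_steps = 250 * 100  # steps per epoch x num_epochs
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=int(0.1 * total_steps),  # warmup_ratio 0.1 -> 2500 steps
    num_training_steps=total_steps,
)
```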
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 250 | 4.1958 | 0.1035 | 0.9350 | 9.1004 | 0.1035 | 0.0792 | 0.0472 | 0.9013 |
| 4.2322 | 2.0 | 500 | 4.0778 | 0.173 | 0.9251 | 6.5742 | 0.173 | 0.1393 | 0.0993 | 0.7501 |
| 4.2322 | 3.0 | 750 | 3.6484 | 0.339 | 0.8778 | 4.9108 | 0.339 | 0.2957 | 0.2172 | 0.5305 |
| 3.5256 | 4.0 | 1000 | 2.5967 | 0.4592 | 0.6991 | 3.3640 | 0.4592 | 0.4220 | 0.1274 | 0.3285 |
| 3.5256 | 5.0 | 1250 | 2.0345 | 0.5417 | 0.6078 | 3.0118 | 0.5417 | 0.5180 | 0.0976 | 0.2447 |
| 1.9172 | 6.0 | 1500 | 1.4417 | 0.625 | 0.5029 | 2.7890 | 0.625 | 0.6123 | 0.0549 | 0.1623 |
| 1.9172 | 7.0 | 1750 | 1.3298 | 0.639 | 0.4852 | 2.6110 | 0.639 | 0.6320 | 0.0558 | 0.1501 |
| 1.1801 | 8.0 | 2000 | 1.1697 | 0.674 | 0.4473 | 2.4787 | 0.674 | 0.6712 | 0.0466 | 0.1283 |
| 1.1801 | 9.0 | 2250 | 0.9625 | 0.7093 | 0.4020 | 2.3242 | 0.7093 | 0.7085 | 0.0526 | 0.1017 |
| 0.8029 | 10.0 | 2500 | 0.9477 | 0.7215 | 0.3893 | 2.3193 | 0.7215 | 0.7228 | 0.0515 | 0.0971 |
| 0.8029 | 11.0 | 2750 | 0.8527 | 0.7375 | 0.3692 | 2.2785 | 0.7375 | 0.7377 | 0.0490 | 0.0870 |
| 0.5717 | 12.0 | 3000 | 0.7377 | 0.7515 | 0.3470 | 2.1475 | 0.7515 | 0.7529 | 0.0552 | 0.0757 |
| 0.5717 | 13.0 | 3250 | 0.7309 | 0.7498 | 0.3469 | 2.1250 | 0.7498 | 0.7494 | 0.0589 | 0.0758 |
| 0.4414 | 14.0 | 3500 | 0.7165 | 0.7558 | 0.3427 | 2.1045 | 0.7558 | 0.7576 | 0.0582 | 0.0721 |
| 0.4414 | 15.0 | 3750 | 0.6865 | 0.7678 | 0.3319 | 2.0457 | 0.7678 | 0.7688 | 0.0551 | 0.0697 |
| 0.3691 | 16.0 | 4000 | 0.7002 | 0.7662 | 0.3348 | 2.1280 | 0.7663 | 0.7664 | 0.0567 | 0.0698 |
| 0.3691 | 17.0 | 4250 | 0.6896 | 0.7628 | 0.3326 | 2.0750 | 0.7628 | 0.7631 | 0.0608 | 0.0691 |
| 0.3214 | 18.0 | 4500 | 0.6666 | 0.7715 | 0.3258 | 2.0468 | 0.7715 | 0.7707 | 0.0544 | 0.0680 |
| 0.3214 | 19.0 | 4750 | 0.6735 | 0.7702 | 0.3277 | 2.0544 | 0.7702 | 0.7700 | 0.0571 | 0.0681 |
| 0.2914 | 20.0 | 5000 | 0.6607 | 0.772 | 0.3241 | 2.0364 | 0.772 | 0.7729 | 0.0525 | 0.0659 |
| 0.2914 | 21.0 | 5250 | 0.6625 | 0.7688 | 0.3217 | 2.0387 | 0.7688 | 0.7703 | 0.0455 | 0.0664 |
| 0.2653 | 22.0 | 5500 | 0.6543 | 0.775 | 0.3200 | 2.0560 | 0.775 | 0.7752 | 0.0507 | 0.0647 |
| 0.2653 | 23.0 | 5750 | 0.6409 | 0.7725 | 0.3188 | 2.0091 | 0.7725 | 0.7733 | 0.0554 | 0.0647 |
| 0.2482 | 24.0 | 6000 | 0.6452 | 0.7758 | 0.3191 | 2.0256 | 0.7758 | 0.7756 | 0.0502 | 0.0655 |
| 0.2482 | 25.0 | 6250 | 0.6401 | 0.7742 | 0.3196 | 2.0668 | 0.7742 | 0.7745 | 0.0528 | 0.0648 |
| 0.2354 | 26.0 | 6500 | 0.6316 | 0.775 | 0.3171 | 2.0150 | 0.775 | 0.7755 | 0.0555 | 0.0634 |
| 0.2354 | 27.0 | 6750 | 0.6257 | 0.7808 | 0.3147 | 2.0129 | 0.7808 | 0.7808 | 0.0503 | 0.0624 |
| 0.2229 | 28.0 | 7000 | 0.6343 | 0.7778 | 0.3144 | 2.0910 | 0.7778 | 0.7776 | 0.0510 | 0.0624 |
| 0.2229 | 29.0 | 7250 | 0.6206 | 0.781 | 0.3115 | 2.0399 | 0.7810 | 0.7798 | 0.0555 | 0.0606 |
| 0.2147 | 30.0 | 7500 | 0.6262 | 0.777 | 0.3124 | 2.0603 | 0.777 | 0.7772 | 0.0539 | 0.0616 |
| 0.2147 | 31.0 | 7750 | 0.6265 | 0.7788 | 0.3137 | 2.0833 | 0.7788 | 0.7777 | 0.0532 | 0.0614 |
| 0.2058 | 32.0 | 8000 | 0.6134 | 0.7815 | 0.3119 | 2.0369 | 0.7815 | 0.7815 | 0.0514 | 0.0615 |
| 0.2058 | 33.0 | 8250 | 0.6153 | 0.7772 | 0.3133 | 2.0513 | 0.7773 | 0.7772 | 0.0534 | 0.0623 |
| 0.1994 | 34.0 | 8500 | 0.6143 | 0.7853 | 0.3098 | 2.0188 | 0.7853 | 0.7857 | 0.0508 | 0.0611 |
| 0.1994 | 35.0 | 8750 | 0.6096 | 0.7827 | 0.3086 | 2.0134 | 0.7828 | 0.7828 | 0.0512 | 0.0606 |
| 0.1932 | 36.0 | 9000 | 0.6094 | 0.784 | 0.3067 | 2.0151 | 0.7840 | 0.7847 | 0.0471 | 0.0602 |
| 0.1932 | 37.0 | 9250 | 0.6142 | 0.7833 | 0.3111 | 2.0213 | 0.7833 | 0.7829 | 0.0542 | 0.0608 |
| 0.1895 | 38.0 | 9500 | 0.6103 | 0.7812 | 0.3094 | 2.0594 | 0.7812 | 0.7799 | 0.0529 | 0.0603 |
| 0.1895 | 39.0 | 9750 | 0.6059 | 0.781 | 0.3078 | 2.0386 | 0.7810 | 0.7806 | 0.0545 | 0.0607 |
| 0.1848 | 40.0 | 10000 | 0.6042 | 0.782 | 0.3072 | 2.0133 | 0.782 | 0.7824 | 0.0527 | 0.0603 |
| 0.1848 | 41.0 | 10250 | 0.5991 | 0.785 | 0.3043 | 2.0124 | 0.785 | 0.7853 | 0.0496 | 0.0594 |
| 0.1793 | 42.0 | 10500 | 0.6034 | 0.784 | 0.3058 | 2.0607 | 0.7840 | 0.7838 | 0.0490 | 0.0599 |
| 0.1793 | 43.0 | 10750 | 0.6047 | 0.7827 | 0.3068 | 2.0139 | 0.7828 | 0.7819 | 0.0492 | 0.0595 |
| 0.1768 | 44.0 | 11000 | 0.5982 | 0.785 | 0.3057 | 2.0303 | 0.785 | 0.7843 | 0.0473 | 0.0596 |
| 0.1768 | 45.0 | 11250 | 0.6036 | 0.7795 | 0.3087 | 2.0173 | 0.7795 | 0.7788 | 0.0549 | 0.0607 |
| 0.1743 | 46.0 | 11500 | 0.5974 | 0.785 | 0.3060 | 2.0026 | 0.785 | 0.7839 | 0.0478 | 0.0596 |
| 0.1743 | 47.0 | 11750 | 0.5996 | 0.782 | 0.3068 | 2.0144 | 0.782 | 0.7825 | 0.0480 | 0.0598 |
| 0.1707 | 48.0 | 12000 | 0.5958 | 0.7833 | 0.3079 | 2.0344 | 0.7833 | 0.7827 | 0.0500 | 0.0598 |
| 0.1707 | 49.0 | 12250 | 0.5969 | 0.782 | 0.3060 | 2.0162 | 0.782 | 0.7820 | 0.0482 | 0.0597 |
| 0.1683 | 50.0 | 12500 | 0.5933 | 0.784 | 0.3043 | 1.9897 | 0.7840 | 0.7836 | 0.0496 | 0.0589 |
| 0.1683 | 51.0 | 12750 | 0.5935 | 0.7833 | 0.3042 | 2.0142 | 0.7833 | 0.7829 | 0.0501 | 0.0586 |
| 0.1649 | 52.0 | 13000 | 0.5950 | 0.7847 | 0.3050 | 2.0125 | 0.7847 | 0.7851 | 0.0475 | 0.0591 |
| 0.1649 | 53.0 | 13250 | 0.5904 | 0.7837 | 0.3020 | 1.9830 | 0.7837 | 0.7837 | 0.0504 | 0.0584 |
| 0.1636 | 54.0 | 13500 | 0.5926 | 0.785 | 0.3042 | 2.0006 | 0.785 | 0.7845 | 0.0493 | 0.0588 |
| 0.1636 | 55.0 | 13750 | 0.5885 | 0.7847 | 0.3029 | 2.0025 | 0.7847 | 0.7843 | 0.0505 | 0.0585 |
| 0.1616 | 56.0 | 14000 | 0.5920 | 0.788 | 0.3041 | 2.0174 | 0.788 | 0.7878 | 0.0520 | 0.0591 |
| 0.1616 | 57.0 | 14250 | 0.5927 | 0.7863 | 0.3033 | 2.0321 | 0.7863 | 0.7858 | 0.0521 | 0.0588 |
| 0.1592 | 58.0 | 14500 | 0.5878 | 0.787 | 0.3017 | 1.9751 | 0.787 | 0.7874 | 0.0461 | 0.0584 |
| 0.1592 | 59.0 | 14750 | 0.5888 | 0.7867 | 0.3030 | 1.9996 | 0.7868 | 0.7864 | 0.0494 | 0.0582 |
| 0.1585 | 60.0 | 15000 | 0.5929 | 0.786 | 0.3052 | 2.0237 | 0.786 | 0.7857 | 0.0512 | 0.0584 |
| 0.1585 | 61.0 | 15250 | 0.5894 | 0.7865 | 0.3026 | 1.9895 | 0.7865 | 0.7864 | 0.0548 | 0.0585 |
| 0.1562 | 62.0 | 15500 | 0.5903 | 0.7873 | 0.3033 | 1.9670 | 0.7873 | 0.7870 | 0.0481 | 0.0584 |
| 0.1562 | 63.0 | 15750 | 0.5896 | 0.7853 | 0.3023 | 1.9681 | 0.7853 | 0.7850 | 0.0520 | 0.0587 |
| 0.1548 | 64.0 | 16000 | 0.5903 | 0.7847 | 0.3027 | 1.9865 | 0.7847 | 0.7846 | 0.0506 | 0.0587 |
| 0.1548 | 65.0 | 16250 | 0.5910 | 0.7853 | 0.3039 | 2.0009 | 0.7853 | 0.7849 | 0.0515 | 0.0593 |
| 0.1537 | 66.0 | 16500 | 0.5866 | 0.7883 | 0.3012 | 1.9561 | 0.7883 | 0.7881 | 0.0447 | 0.0581 |
| 0.1537 | 67.0 | 16750 | 0.5858 | 0.7867 | 0.3009 | 1.9868 | 0.7868 | 0.7861 | 0.0486 | 0.0577 |
| 0.1526 | 68.0 | 17000 | 0.5886 | 0.7867 | 0.3024 | 2.0009 | 0.7868 | 0.7862 | 0.0530 | 0.0587 |
| 0.1526 | 69.0 | 17250 | 0.5850 | 0.7863 | 0.3010 | 2.0095 | 0.7863 | 0.7860 | 0.0510 | 0.0581 |
| 0.1508 | 70.0 | 17500 | 0.5867 | 0.7865 | 0.3019 | 2.0304 | 0.7865 | 0.7861 | 0.0525 | 0.0583 |
| 0.1508 | 71.0 | 17750 | 0.5895 | 0.7857 | 0.3038 | 2.0013 | 0.7857 | 0.7853 | 0.0478 | 0.0586 |
| 0.15 | 72.0 | 18000 | 0.5894 | 0.7847 | 0.3025 | 2.0051 | 0.7847 | 0.7845 | 0.0500 | 0.0586 |
| 0.15 | 73.0 | 18250 | 0.5867 | 0.7865 | 0.3022 | 1.9634 | 0.7865 | 0.7860 | 0.0489 | 0.0582 |
| 0.149 | 74.0 | 18500 | 0.5888 | 0.7857 | 0.3026 | 1.9817 | 0.7857 | 0.7851 | 0.0497 | 0.0584 |
| 0.149 | 75.0 | 18750 | 0.5823 | 0.7885 | 0.2994 | 1.9873 | 0.7885 | 0.7880 | 0.0476 | 0.0577 |
| 0.1483 | 76.0 | 19000 | 0.5866 | 0.7853 | 0.3025 | 1.9870 | 0.7853 | 0.7849 | 0.0531 | 0.0583 |
| 0.1483 | 77.0 | 19250 | 0.5866 | 0.7867 | 0.3013 | 1.9933 | 0.7868 | 0.7862 | 0.0498 | 0.0577 |
| 0.1478 | 78.0 | 19500 | 0.5844 | 0.787 | 0.3010 | 1.9793 | 0.787 | 0.7868 | 0.0465 | 0.0579 |
| 0.1478 | 79.0 | 19750 | 0.5850 | 0.7857 | 0.3005 | 1.9856 | 0.7857 | 0.7855 | 0.0489 | 0.0580 |
| 0.1463 | 80.0 | 20000 | 0.5829 | 0.7893 | 0.2999 | 2.0003 | 0.7893 | 0.7890 | 0.0543 | 0.0578 |
| 0.1463 | 81.0 | 20250 | 0.5845 | 0.7867 | 0.3011 | 2.0178 | 0.7868 | 0.7864 | 0.0494 | 0.0580 |
| 0.1457 | 82.0 | 20500 | 0.5878 | 0.7865 | 0.3022 | 2.0108 | 0.7865 | 0.7861 | 0.0507 | 0.0583 |
| 0.1457 | 83.0 | 20750 | 0.5862 | 0.7865 | 0.3016 | 1.9996 | 0.7865 | 0.7865 | 0.0505 | 0.0585 |
| 0.1452 | 84.0 | 21000 | 0.5851 | 0.7863 | 0.3011 | 2.0002 | 0.7863 | 0.7859 | 0.0481 | 0.0582 |
| 0.1452 | 85.0 | 21250 | 0.5850 | 0.787 | 0.3013 | 1.9659 | 0.787 | 0.7867 | 0.0524 | 0.0582 |
| 0.1449 | 86.0 | 21500 | 0.5878 | 0.7867 | 0.3023 | 1.9837 | 0.7868 | 0.7866 | 0.0526 | 0.0581 |
| 0.1449 | 87.0 | 21750 | 0.5844 | 0.7873 | 0.3010 | 1.9807 | 0.7873 | 0.7865 | 0.0522 | 0.0577 |
| 0.1437 | 88.0 | 22000 | 0.5846 | 0.7877 | 0.3012 | 1.9947 | 0.7877 | 0.7869 | 0.0464 | 0.0580 |
| 0.1437 | 89.0 | 22250 | 0.5859 | 0.787 | 0.3016 | 2.0002 | 0.787 | 0.7867 | 0.0503 | 0.0581 |
| 0.143 | 90.0 | 22500 | 0.5838 | 0.7865 | 0.3010 | 1.9996 | 0.7865 | 0.7859 | 0.0496 | 0.0576 |
| 0.143 | 91.0 | 22750 | 0.5843 | 0.7837 | 0.3011 | 1.9683 | 0.7837 | 0.7834 | 0.0501 | 0.0583 |
| 0.1426 | 92.0 | 23000 | 0.5843 | 0.7873 | 0.3010 | 1.9960 | 0.7873 | 0.7870 | 0.0524 | 0.0578 |
| 0.1426 | 93.0 | 23250 | 0.5827 | 0.7847 | 0.3005 | 1.9719 | 0.7847 | 0.7844 | 0.0506 | 0.0579 |
| 0.1428 | 94.0 | 23500 | 0.5831 | 0.7865 | 0.3009 | 1.9781 | 0.7865 | 0.7862 | 0.0517 | 0.0579 |
| 0.1428 | 95.0 | 23750 | 0.5821 | 0.784 | 0.3001 | 1.9641 | 0.7840 | 0.7838 | 0.0505 | 0.0579 |
| 0.1424 | 96.0 | 24000 | 0.5850 | 0.7845 | 0.3020 | 1.9667 | 0.7845 | 0.7842 | 0.0526 | 0.0584 |
| 0.1424 | 97.0 | 24250 | 0.5850 | 0.7847 | 0.3012 | 1.9776 | 0.7847 | 0.7844 | 0.0508 | 0.0579 |
| 0.142 | 98.0 | 24500 | 0.5845 | 0.7877 | 0.3011 | 1.9745 | 0.7877 | 0.7870 | 0.0491 | 0.0579 |
| 0.142 | 99.0 | 24750 | 0.5834 | 0.7853 | 0.3010 | 1.9679 | 0.7853 | 0.7852 | 0.0506 | 0.0581 |
| 0.1416 | 100.0 | 25000 | 0.5837 | 0.7867 | 0.3013 | 1.9882 | 0.7868 | 0.7860 | 0.0529 | 0.0581 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
bdpc/resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_kd_MSE
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_kd_MSE
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7429
- Accuracy: 0.7853
- Brier Loss: 0.3044
- NLL: 2.0364
- F1 Micro: 0.7853
- F1 Macro: 0.7862
- ECE: 0.0430
- AURC: 0.0599
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
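The `kd_MSE` suffix suggests distillation by mean-squared error between student and teacher logits, in contrast to the KL-based objective of the sibling `CEKD` runs. A hedged PyTorch sketch; whether a cross-entropy term was mixed in here is unknown, so it is left out:

```python
import torch
import torch.nn.functional as F

def mse_kd_loss(student_logits, teacher_logits):
    """Logit-matching distillation: regress student logits onto the (frozen) teacher's."""
    return F.mse_loss(student_logits, teacher_logits.detach())

student = torch.randn(4, 16, requires_grad=True)  # 16 RVL-CDIP classes
teacher = torch.randn(4, 16)
mse_kd_loss(student, teacher).backward()
```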
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 250 | 9.5443 | 0.0765 | 0.9365 | 3.7373 | 0.0765 | 0.0522 | 0.0360 | 0.9336 |
| 9.4095 | 2.0 | 500 | 7.4542 | 0.0757 | 0.9312 | 2.8468 | 0.0757 | 0.0316 | 0.0425 | 0.8840 |
| 9.4095 | 3.0 | 750 | 5.8933 | 0.0975 | 0.9356 | 3.2058 | 0.0975 | 0.0408 | 0.0798 | 0.8593 |
| 5.9994 | 4.0 | 1000 | 4.3665 | 0.2125 | 0.8700 | 5.3759 | 0.2125 | 0.1290 | 0.0743 | 0.7029 |
| 5.9994 | 5.0 | 1250 | 3.0367 | 0.4415 | 0.6924 | 4.9073 | 0.4415 | 0.4283 | 0.0806 | 0.3570 |
| 3.2184 | 6.0 | 1500 | 2.1589 | 0.579 | 0.5587 | 3.7412 | 0.579 | 0.5771 | 0.0572 | 0.2172 |
| 3.2184 | 7.0 | 1750 | 1.5582 | 0.652 | 0.4673 | 3.0701 | 0.652 | 0.6456 | 0.0517 | 0.1478 |
| 1.6737 | 8.0 | 2000 | 1.3502 | 0.6893 | 0.4266 | 2.8575 | 0.6893 | 0.6860 | 0.0544 | 0.1175 |
| 1.6737 | 9.0 | 2250 | 1.1389 | 0.7188 | 0.3914 | 2.5937 | 0.7188 | 0.7195 | 0.0544 | 0.1006 |
| 1.0789 | 10.0 | 2500 | 1.0563 | 0.7302 | 0.3742 | 2.5043 | 0.7302 | 0.7305 | 0.0618 | 0.0912 |
| 1.0789 | 11.0 | 2750 | 1.0035 | 0.7428 | 0.3604 | 2.4375 | 0.7428 | 0.7441 | 0.0587 | 0.0823 |
| 0.7934 | 12.0 | 3000 | 0.9169 | 0.7548 | 0.3472 | 2.2921 | 0.7548 | 0.7555 | 0.0547 | 0.0762 |
| 0.7934 | 13.0 | 3250 | 0.8628 | 0.7598 | 0.3386 | 2.2849 | 0.7598 | 0.7600 | 0.0550 | 0.0739 |
| 0.6268 | 14.0 | 3500 | 0.8773 | 0.7675 | 0.3362 | 2.2170 | 0.7675 | 0.7692 | 0.0490 | 0.0718 |
| 0.6268 | 15.0 | 3750 | 0.8263 | 0.7682 | 0.3306 | 2.1617 | 0.7682 | 0.7702 | 0.0534 | 0.0704 |
| 0.5269 | 16.0 | 4000 | 0.8422 | 0.7708 | 0.3289 | 2.1907 | 0.7707 | 0.7717 | 0.0524 | 0.0687 |
| 0.5269 | 17.0 | 4250 | 0.8100 | 0.7745 | 0.3241 | 2.1664 | 0.7745 | 0.7761 | 0.0509 | 0.0667 |
| 0.4516 | 18.0 | 4500 | 0.8013 | 0.7778 | 0.3215 | 2.1216 | 0.7778 | 0.7790 | 0.0473 | 0.0669 |
| 0.4516 | 19.0 | 4750 | 0.7911 | 0.7802 | 0.3183 | 2.1224 | 0.7802 | 0.7812 | 0.0476 | 0.0648 |
| 0.4039 | 20.0 | 5000 | 0.7900 | 0.7775 | 0.3197 | 2.0969 | 0.7775 | 0.7797 | 0.0473 | 0.0647 |
| 0.4039 | 21.0 | 5250 | 0.7919 | 0.7792 | 0.3191 | 2.1445 | 0.7792 | 0.7810 | 0.0531 | 0.0652 |
| 0.3563 | 22.0 | 5500 | 0.7960 | 0.7802 | 0.3166 | 2.0849 | 0.7802 | 0.7818 | 0.0478 | 0.0649 |
| 0.3563 | 23.0 | 5750 | 0.7615 | 0.7825 | 0.3128 | 2.0834 | 0.7825 | 0.7833 | 0.0478 | 0.0638 |
| 0.3251 | 24.0 | 6000 | 0.7840 | 0.7792 | 0.3151 | 2.0841 | 0.7792 | 0.7800 | 0.0513 | 0.0648 |
| 0.3251 | 25.0 | 6250 | 0.7837 | 0.7792 | 0.3159 | 2.0889 | 0.7792 | 0.7808 | 0.0485 | 0.0643 |
| 0.2949 | 26.0 | 6500 | 0.7827 | 0.7802 | 0.3158 | 2.0416 | 0.7802 | 0.7819 | 0.0548 | 0.0648 |
| 0.2949 | 27.0 | 6750 | 0.7650 | 0.78 | 0.3130 | 2.0411 | 0.78 | 0.7807 | 0.0506 | 0.0629 |
| 0.2669 | 28.0 | 7000 | 0.7787 | 0.7802 | 0.3133 | 2.0843 | 0.7802 | 0.7810 | 0.0454 | 0.0627 |
| 0.2669 | 29.0 | 7250 | 0.7892 | 0.782 | 0.3163 | 2.0953 | 0.782 | 0.7826 | 0.0508 | 0.0635 |
| 0.2512 | 30.0 | 7500 | 0.7775 | 0.7825 | 0.3126 | 2.0904 | 0.7825 | 0.7837 | 0.0451 | 0.0633 |
| 0.2512 | 31.0 | 7750 | 0.7601 | 0.7817 | 0.3124 | 2.0251 | 0.7817 | 0.7827 | 0.0485 | 0.0627 |
| 0.231 | 32.0 | 8000 | 0.7669 | 0.7833 | 0.3120 | 2.0685 | 0.7833 | 0.7842 | 0.0472 | 0.0629 |
| 0.231 | 33.0 | 8250 | 0.7652 | 0.7847 | 0.3116 | 2.0661 | 0.7847 | 0.7858 | 0.0519 | 0.0625 |
| 0.2172 | 34.0 | 8500 | 0.7637 | 0.7837 | 0.3107 | 2.0264 | 0.7837 | 0.7852 | 0.0487 | 0.0628 |
| 0.2172 | 35.0 | 8750 | 0.7691 | 0.783 | 0.3120 | 2.0535 | 0.7830 | 0.7844 | 0.0438 | 0.0634 |
| 0.2032 | 36.0 | 9000 | 0.7647 | 0.7845 | 0.3093 | 2.0480 | 0.7845 | 0.7852 | 0.0471 | 0.0620 |
| 0.2032 | 37.0 | 9250 | 0.7727 | 0.782 | 0.3122 | 2.0610 | 0.782 | 0.7830 | 0.0493 | 0.0628 |
| 0.1925 | 38.0 | 9500 | 0.7563 | 0.7843 | 0.3085 | 2.0267 | 0.7843 | 0.7849 | 0.0459 | 0.0608 |
| 0.1925 | 39.0 | 9750 | 0.7597 | 0.7835 | 0.3087 | 2.0062 | 0.7835 | 0.7845 | 0.0485 | 0.0614 |
| 0.1823 | 40.0 | 10000 | 0.7611 | 0.7833 | 0.3107 | 2.0007 | 0.7833 | 0.7853 | 0.0479 | 0.0625 |
| 0.1823 | 41.0 | 10250 | 0.7608 | 0.7843 | 0.3076 | 2.0335 | 0.7843 | 0.7854 | 0.0486 | 0.0602 |
| 0.17 | 42.0 | 10500 | 0.7535 | 0.7833 | 0.3096 | 2.0121 | 0.7833 | 0.7844 | 0.0505 | 0.0613 |
| 0.17 | 43.0 | 10750 | 0.7524 | 0.7845 | 0.3066 | 2.0425 | 0.7845 | 0.7856 | 0.0476 | 0.0605 |
| 0.1639 | 44.0 | 11000 | 0.7608 | 0.7808 | 0.3108 | 2.0739 | 0.7808 | 0.7816 | 0.0503 | 0.0618 |
| 0.1639 | 45.0 | 11250 | 0.7560 | 0.786 | 0.3063 | 1.9876 | 0.786 | 0.7868 | 0.0496 | 0.0607 |
| 0.1575 | 46.0 | 11500 | 0.7494 | 0.784 | 0.3063 | 2.0311 | 0.7840 | 0.7846 | 0.0416 | 0.0601 |
| 0.1575 | 47.0 | 11750 | 0.7515 | 0.7857 | 0.3069 | 2.0539 | 0.7857 | 0.7866 | 0.0456 | 0.0609 |
| 0.1493 | 48.0 | 12000 | 0.7511 | 0.7843 | 0.3086 | 2.0325 | 0.7843 | 0.7852 | 0.0552 | 0.0612 |
| 0.1493 | 49.0 | 12250 | 0.7495 | 0.787 | 0.3067 | 2.0231 | 0.787 | 0.7880 | 0.0475 | 0.0605 |
| 0.1425 | 50.0 | 12500 | 0.7538 | 0.7867 | 0.3052 | 2.0267 | 0.7868 | 0.7870 | 0.0507 | 0.0603 |
| 0.1425 | 51.0 | 12750 | 0.7529 | 0.7847 | 0.3081 | 2.0592 | 0.7847 | 0.7859 | 0.0467 | 0.0604 |
| 0.1356 | 52.0 | 13000 | 0.7527 | 0.7808 | 0.3071 | 2.0349 | 0.7808 | 0.7818 | 0.0473 | 0.0607 |
| 0.1356 | 53.0 | 13250 | 0.7451 | 0.7865 | 0.3049 | 2.0368 | 0.7865 | 0.7879 | 0.0484 | 0.0595 |
| 0.1325 | 54.0 | 13500 | 0.7481 | 0.7857 | 0.3056 | 2.0223 | 0.7857 | 0.7869 | 0.0468 | 0.0603 |
| 0.1325 | 55.0 | 13750 | 0.7470 | 0.7835 | 0.3057 | 2.0306 | 0.7835 | 0.7844 | 0.0492 | 0.0601 |
| 0.1264 | 56.0 | 14000 | 0.7471 | 0.7873 | 0.3053 | 2.0336 | 0.7873 | 0.7880 | 0.0519 | 0.0601 |
| 0.1264 | 57.0 | 14250 | 0.7429 | 0.7895 | 0.3032 | 2.0149 | 0.7895 | 0.7903 | 0.0468 | 0.0595 |
| 0.1208 | 58.0 | 14500 | 0.7399 | 0.7885 | 0.3035 | 2.0147 | 0.7885 | 0.7895 | 0.0433 | 0.0596 |
| 0.1208 | 59.0 | 14750 | 0.7518 | 0.786 | 0.3076 | 2.0481 | 0.786 | 0.7873 | 0.0403 | 0.0607 |
| 0.119 | 60.0 | 15000 | 0.7483 | 0.7903 | 0.3058 | 2.0138 | 0.7903 | 0.7914 | 0.0471 | 0.0601 |
| 0.119 | 61.0 | 15250 | 0.7463 | 0.7845 | 0.3043 | 2.0617 | 0.7845 | 0.7855 | 0.0458 | 0.0599 |
| 0.1128 | 62.0 | 15500 | 0.7478 | 0.7875 | 0.3056 | 2.0187 | 0.7875 | 0.7888 | 0.0452 | 0.0604 |
| 0.1128 | 63.0 | 15750 | 0.7510 | 0.784 | 0.3061 | 2.0204 | 0.7840 | 0.7850 | 0.0495 | 0.0605 |
| 0.1109 | 64.0 | 16000 | 0.7424 | 0.786 | 0.3053 | 2.0167 | 0.786 | 0.7871 | 0.0449 | 0.0603 |
| 0.1109 | 65.0 | 16250 | 0.7473 | 0.7885 | 0.3054 | 2.0200 | 0.7885 | 0.7893 | 0.0471 | 0.0600 |
| 0.1078 | 66.0 | 16500 | 0.7467 | 0.7873 | 0.3054 | 2.0224 | 0.7873 | 0.7883 | 0.0482 | 0.0599 |
| 0.1078 | 67.0 | 16750 | 0.7445 | 0.7893 | 0.3039 | 2.0082 | 0.7893 | 0.7895 | 0.0456 | 0.0593 |
| 0.1051 | 68.0 | 17000 | 0.7490 | 0.7873 | 0.3063 | 2.0152 | 0.7873 | 0.7883 | 0.0505 | 0.0602 |
| 0.1051 | 69.0 | 17250 | 0.7490 | 0.785 | 0.3061 | 2.0103 | 0.785 | 0.7861 | 0.0465 | 0.0602 |
| 0.1009 | 70.0 | 17500 | 0.7445 | 0.7875 | 0.3049 | 2.0308 | 0.7875 | 0.7884 | 0.0483 | 0.0598 |
| 0.1009 | 71.0 | 17750 | 0.7490 | 0.7863 | 0.3068 | 2.0260 | 0.7863 | 0.7875 | 0.0495 | 0.0604 |
| 0.0984 | 72.0 | 18000 | 0.7465 | 0.7893 | 0.3059 | 2.0161 | 0.7893 | 0.7906 | 0.0427 | 0.0601 |
| 0.0984 | 73.0 | 18250 | 0.7451 | 0.7873 | 0.3058 | 2.0204 | 0.7873 | 0.7882 | 0.0511 | 0.0605 |
| 0.0966 | 74.0 | 18500 | 0.7445 | 0.7875 | 0.3042 | 2.0227 | 0.7875 | 0.7886 | 0.0495 | 0.0599 |
| 0.0966 | 75.0 | 18750 | 0.7443 | 0.7863 | 0.3040 | 2.0138 | 0.7863 | 0.7872 | 0.0442 | 0.0598 |
| 0.0947 | 76.0 | 19000 | 0.7448 | 0.7865 | 0.3054 | 2.0234 | 0.7865 | 0.7873 | 0.0457 | 0.0598 |
| 0.0947 | 77.0 | 19250 | 0.7448 | 0.7865 | 0.3041 | 2.0110 | 0.7865 | 0.7875 | 0.0508 | 0.0596 |
| 0.0931 | 78.0 | 19500 | 0.7460 | 0.7883 | 0.3040 | 2.0125 | 0.7883 | 0.7895 | 0.0467 | 0.0595 |
| 0.0931 | 79.0 | 19750 | 0.7456 | 0.7883 | 0.3038 | 2.0302 | 0.7883 | 0.7894 | 0.0455 | 0.0596 |
| 0.0899 | 80.0 | 20000 | 0.7469 | 0.788 | 0.3040 | 2.0188 | 0.788 | 0.7892 | 0.0487 | 0.0597 |
| 0.0899 | 81.0 | 20250 | 0.7421 | 0.788 | 0.3041 | 2.0359 | 0.788 | 0.7888 | 0.0427 | 0.0595 |
| 0.0882 | 82.0 | 20500 | 0.7444 | 0.7865 | 0.3051 | 2.0219 | 0.7865 | 0.7875 | 0.0479 | 0.0600 |
| 0.0882 | 83.0 | 20750 | 0.7439 | 0.788 | 0.3039 | 2.0197 | 0.788 | 0.7894 | 0.0439 | 0.0597 |
| 0.0871 | 84.0 | 21000 | 0.7421 | 0.7865 | 0.3040 | 1.9910 | 0.7865 | 0.7876 | 0.0445 | 0.0598 |
| 0.0871 | 85.0 | 21250 | 0.7429 | 0.7887 | 0.3043 | 2.0253 | 0.7887 | 0.7898 | 0.0426 | 0.0597 |
| 0.0869 | 86.0 | 21500 | 0.7442 | 0.7873 | 0.3041 | 2.0156 | 0.7873 | 0.7885 | 0.0488 | 0.0596 |
| 0.0869 | 87.0 | 21750 | 0.7439 | 0.7857 | 0.3051 | 2.0099 | 0.7857 | 0.7867 | 0.0465 | 0.0599 |
| 0.084 | 88.0 | 22000 | 0.7434 | 0.786 | 0.3040 | 1.9926 | 0.786 | 0.7869 | 0.0469 | 0.0598 |
| 0.084 | 89.0 | 22250 | 0.7431 | 0.7873 | 0.3048 | 2.0028 | 0.7873 | 0.7880 | 0.0442 | 0.0599 |
| 0.0821 | 90.0 | 22500 | 0.7447 | 0.7867 | 0.3040 | 2.0349 | 0.7868 | 0.7876 | 0.0477 | 0.0596 |
| 0.0821 | 91.0 | 22750 | 0.7443 | 0.7877 | 0.3051 | 2.0356 | 0.7877 | 0.7887 | 0.0486 | 0.0601 |
| 0.0813 | 92.0 | 23000 | 0.7500 | 0.7873 | 0.3053 | 2.0465 | 0.7873 | 0.7880 | 0.0484 | 0.0601 |
| 0.0813 | 93.0 | 23250 | 0.7449 | 0.788 | 0.3037 | 1.9966 | 0.788 | 0.7890 | 0.0441 | 0.0594 |
| 0.0811 | 94.0 | 23500 | 0.7466 | 0.7897 | 0.3048 | 2.0297 | 0.7897 | 0.7907 | 0.0429 | 0.0600 |
| 0.0811 | 95.0 | 23750 | 0.7482 | 0.7875 | 0.3058 | 2.0319 | 0.7875 | 0.7885 | 0.0464 | 0.0601 |
| 0.0808 | 96.0 | 24000 | 0.7473 | 0.7863 | 0.3055 | 2.0219 | 0.7863 | 0.7874 | 0.0477 | 0.0603 |
| 0.0808 | 97.0 | 24250 | 0.7451 | 0.7855 | 0.3044 | 2.0356 | 0.7855 | 0.7865 | 0.0481 | 0.0594 |
| 0.08 | 98.0 | 24500 | 0.7442 | 0.7857 | 0.3042 | 2.0213 | 0.7857 | 0.7868 | 0.0475 | 0.0595 |
| 0.08 | 99.0 | 24750 | 0.7462 | 0.7863 | 0.3053 | 2.0354 | 0.7863 | 0.7874 | 0.0425 | 0.0599 |
| 0.079 | 100.0 | 25000 | 0.7429 | 0.7853 | 0.3044 | 2.0364 | 0.7853 | 0.7862 | 0.0430 | 0.0599 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
bdpc/resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_kd_NKD_t1.0_g1.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_kd_NKD_t1.0_g1.5
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the None dataset.
It achieves the following results on the evaluation set (a sketch of the ECE computation follows the list):
- Loss: 2.9013
- Accuracy: 0.7933
- Brier Loss: 0.3080
- Nll: 1.8102
- F1 Micro: 0.7932
- F1 Macro: 0.7937
- Ece: 0.0719
- Aurc: 0.0635
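ECE here is the expected calibration error: the confidence/accuracy gap averaged over confidence bins. A minimal equal-width-binning sketch (the bin count used for this card is not stated; 15 is a common default), with the same hypothetical `probs`/`labels` arrays as above:
```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins: int = 15) -> float:
    conf = probs.max(axis=1)                      # predicted confidence
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():                            # weight each bin by its share
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return float(ece)
```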
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 250 | 6.0054 | 0.098 | 0.9327 | 9.3196 | 0.0980 | 0.0481 | 0.0462 | 0.8670 |
| 6.0141 | 2.0 | 500 | 5.4713 | 0.2195 | 0.8933 | 5.2235 | 0.2195 | 0.1452 | 0.1046 | 0.7129 |
| 6.0141 | 3.0 | 750 | 4.4006 | 0.4535 | 0.7034 | 3.0178 | 0.4535 | 0.4351 | 0.1373 | 0.3334 |
| 4.5079 | 4.0 | 1000 | 3.8431 | 0.59 | 0.5686 | 2.5843 | 0.59 | 0.5822 | 0.1309 | 0.2072 |
| 4.5079 | 5.0 | 1250 | 3.5315 | 0.6552 | 0.4864 | 2.4330 | 0.6552 | 0.6537 | 0.1048 | 0.1504 |
| 3.5028 | 6.0 | 1500 | 3.2850 | 0.707 | 0.4163 | 2.2375 | 0.707 | 0.7082 | 0.0790 | 0.1111 |
| 3.5028 | 7.0 | 1750 | 3.0974 | 0.7312 | 0.3721 | 2.0933 | 0.7312 | 0.7328 | 0.0452 | 0.0899 |
| 3.0599 | 8.0 | 2000 | 3.0385 | 0.7455 | 0.3561 | 2.0148 | 0.7455 | 0.7456 | 0.0432 | 0.0838 |
| 3.0599 | 9.0 | 2250 | 2.9978 | 0.7565 | 0.3432 | 1.9780 | 0.7565 | 0.7572 | 0.0437 | 0.0777 |
| 2.8562 | 10.0 | 2500 | 2.9853 | 0.7622 | 0.3397 | 1.9176 | 0.7622 | 0.7619 | 0.0495 | 0.0751 |
| 2.8562 | 11.0 | 2750 | 2.9803 | 0.7615 | 0.3385 | 1.9327 | 0.7615 | 0.7627 | 0.0547 | 0.0760 |
| 2.7414 | 12.0 | 3000 | 2.9711 | 0.7658 | 0.3322 | 1.9439 | 0.7658 | 0.7661 | 0.0495 | 0.0740 |
| 2.7414 | 13.0 | 3250 | 2.9618 | 0.771 | 0.3276 | 1.8599 | 0.771 | 0.7718 | 0.0548 | 0.0704 |
| 2.6658 | 14.0 | 3500 | 2.9534 | 0.7762 | 0.3252 | 1.8935 | 0.7762 | 0.7770 | 0.0581 | 0.0699 |
| 2.6658 | 15.0 | 3750 | 2.9568 | 0.776 | 0.3248 | 1.8836 | 0.776 | 0.7776 | 0.0588 | 0.0699 |
| 2.6197 | 16.0 | 4000 | 2.9196 | 0.7812 | 0.3169 | 1.8338 | 0.7812 | 0.7814 | 0.0601 | 0.0655 |
| 2.6197 | 17.0 | 4250 | 2.9267 | 0.7785 | 0.3202 | 1.8430 | 0.7785 | 0.7783 | 0.0647 | 0.0677 |
| 2.5794 | 18.0 | 4500 | 2.9189 | 0.779 | 0.3155 | 1.8279 | 0.779 | 0.7794 | 0.0631 | 0.0661 |
| 2.5794 | 19.0 | 4750 | 2.9324 | 0.7823 | 0.3177 | 1.8508 | 0.7823 | 0.7823 | 0.0665 | 0.0669 |
| 2.5553 | 20.0 | 5000 | 2.9192 | 0.7837 | 0.3146 | 1.8312 | 0.7837 | 0.7840 | 0.0641 | 0.0654 |
| 2.5553 | 21.0 | 5250 | 2.9160 | 0.7817 | 0.3140 | 1.8366 | 0.7817 | 0.7828 | 0.0682 | 0.0658 |
| 2.53 | 22.0 | 5500 | 2.9172 | 0.7837 | 0.3139 | 1.8138 | 0.7837 | 0.7842 | 0.0602 | 0.0652 |
| 2.53 | 23.0 | 5750 | 2.9132 | 0.7875 | 0.3134 | 1.8254 | 0.7875 | 0.7877 | 0.0656 | 0.0646 |
| 2.5127 | 24.0 | 6000 | 2.9108 | 0.7875 | 0.3132 | 1.8367 | 0.7875 | 0.7869 | 0.0669 | 0.0652 |
| 2.5127 | 25.0 | 6250 | 2.9272 | 0.7837 | 0.3139 | 1.8551 | 0.7837 | 0.7843 | 0.0632 | 0.0653 |
| 2.4979 | 26.0 | 6500 | 2.9157 | 0.7867 | 0.3128 | 1.8101 | 0.7868 | 0.7876 | 0.0655 | 0.0647 |
| 2.4979 | 27.0 | 6750 | 2.9031 | 0.785 | 0.3112 | 1.8089 | 0.785 | 0.7856 | 0.0688 | 0.0639 |
| 2.4814 | 28.0 | 7000 | 2.9094 | 0.7875 | 0.3110 | 1.8594 | 0.7875 | 0.7880 | 0.0677 | 0.0646 |
| 2.4814 | 29.0 | 7250 | 2.9110 | 0.7885 | 0.3116 | 1.8150 | 0.7885 | 0.7891 | 0.0696 | 0.0639 |
| 2.4741 | 30.0 | 7500 | 2.9039 | 0.7877 | 0.3091 | 1.8471 | 0.7877 | 0.7887 | 0.0694 | 0.0632 |
| 2.4741 | 31.0 | 7750 | 2.9029 | 0.7907 | 0.3087 | 1.7604 | 0.7907 | 0.7917 | 0.0691 | 0.0633 |
| 2.4626 | 32.0 | 8000 | 2.8983 | 0.7877 | 0.3094 | 1.8191 | 0.7877 | 0.7884 | 0.0677 | 0.0625 |
| 2.4626 | 33.0 | 8250 | 2.9024 | 0.7897 | 0.3088 | 1.8025 | 0.7897 | 0.7905 | 0.0720 | 0.0635 |
| 2.4558 | 34.0 | 8500 | 2.9055 | 0.792 | 0.3070 | 1.7869 | 0.792 | 0.7920 | 0.0667 | 0.0628 |
| 2.4558 | 35.0 | 8750 | 2.9055 | 0.788 | 0.3104 | 1.8349 | 0.788 | 0.7883 | 0.0733 | 0.0645 |
| 2.4481 | 36.0 | 9000 | 2.9061 | 0.7887 | 0.3078 | 1.7840 | 0.7887 | 0.7898 | 0.0676 | 0.0642 |
| 2.4481 | 37.0 | 9250 | 2.9086 | 0.7917 | 0.3102 | 1.7942 | 0.7917 | 0.7923 | 0.0716 | 0.0644 |
| 2.4422 | 38.0 | 9500 | 2.9067 | 0.7897 | 0.3084 | 1.7915 | 0.7897 | 0.7900 | 0.0704 | 0.0637 |
| 2.4422 | 39.0 | 9750 | 2.9080 | 0.7927 | 0.3092 | 1.7951 | 0.7927 | 0.7930 | 0.0709 | 0.0631 |
| 2.4386 | 40.0 | 10000 | 2.9064 | 0.7943 | 0.3084 | 1.8079 | 0.7943 | 0.7949 | 0.0734 | 0.0635 |
| 2.4386 | 41.0 | 10250 | 2.8990 | 0.792 | 0.3056 | 1.7918 | 0.792 | 0.7924 | 0.0699 | 0.0623 |
| 2.4312 | 42.0 | 10500 | 2.9057 | 0.7893 | 0.3090 | 1.7892 | 0.7893 | 0.7901 | 0.0735 | 0.0641 |
| 2.4312 | 43.0 | 10750 | 2.8998 | 0.7923 | 0.3079 | 1.7909 | 0.7923 | 0.7932 | 0.0707 | 0.0630 |
| 2.4294 | 44.0 | 11000 | 2.9108 | 0.7905 | 0.3090 | 1.8220 | 0.7905 | 0.7916 | 0.0773 | 0.0636 |
| 2.4294 | 45.0 | 11250 | 2.9030 | 0.7927 | 0.3086 | 1.8126 | 0.7927 | 0.7932 | 0.0710 | 0.0631 |
| 2.4282 | 46.0 | 11500 | 2.9033 | 0.7915 | 0.3077 | 1.8234 | 0.7915 | 0.7920 | 0.0712 | 0.0631 |
| 2.4282 | 47.0 | 11750 | 2.8975 | 0.7957 | 0.3063 | 1.8070 | 0.7957 | 0.7968 | 0.0702 | 0.0630 |
| 2.4246 | 48.0 | 12000 | 2.9049 | 0.7935 | 0.3085 | 1.8090 | 0.7935 | 0.7944 | 0.0722 | 0.0635 |
| 2.4246 | 49.0 | 12250 | 2.9020 | 0.792 | 0.3075 | 1.8233 | 0.792 | 0.7927 | 0.0700 | 0.0638 |
| 2.4227 | 50.0 | 12500 | 2.9013 | 0.7933 | 0.3080 | 1.8102 | 0.7932 | 0.7937 | 0.0719 | 0.0635 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
bdpc/resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t2.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t2.5_a0.5
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6481
- Accuracy: 0.69
- Brier Loss: 0.4919
- Nll: 2.4969
- F1 Micro: 0.69
- F1 Macro: 0.6317
- Ece: 0.3029
- Aurc: 0.1260
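The `CEKD_t2.5_a0.5` suffix plausibly denotes a combined cross-entropy/knowledge-distillation objective with temperature t = 2.5 and mixing weight a = 0.5. The card does not spell the formulation out, so the following is only a sketch of the standard Hinton-style variant under that assumption:
```python
import torch.nn.functional as F

def cekd_loss(student_logits, teacher_logits, labels, t: float = 2.5, a: float = 0.5):
    # Assumed objective: a * CE(student, labels)
    # + (1 - a) * t^2 * KL(teacher_soft || student_soft).
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / t, dim=-1),
        F.softmax(teacher_logits / t, dim=-1),
        reduction="batchmean",
    ) * (t * t)
    return a * ce + (1.0 - a) * kd
```
The `t * t` factor is the usual rescaling that keeps gradient magnitudes comparable across temperatures.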
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 13 | 1.4796 | 0.165 | 0.8965 | 8.4885 | 0.165 | 0.1123 | 0.2151 | 0.8341 |
| No log | 2.0 | 26 | 1.4679 | 0.165 | 0.8954 | 8.3391 | 0.165 | 0.1066 | 0.2136 | 0.8332 |
| No log | 3.0 | 39 | 1.4170 | 0.21 | 0.8858 | 6.1941 | 0.2100 | 0.0969 | 0.2433 | 0.7991 |
| No log | 4.0 | 52 | 1.3472 | 0.21 | 0.8711 | 6.0602 | 0.2100 | 0.0728 | 0.2320 | 0.7271 |
| No log | 5.0 | 65 | 1.2776 | 0.19 | 0.8572 | 6.1293 | 0.19 | 0.0537 | 0.2422 | 0.7473 |
| No log | 6.0 | 78 | 1.1840 | 0.245 | 0.8353 | 6.2405 | 0.245 | 0.1060 | 0.2810 | 0.6690 |
| No log | 7.0 | 91 | 1.0740 | 0.365 | 0.7936 | 6.3617 | 0.3650 | 0.1739 | 0.3136 | 0.3646 |
| No log | 8.0 | 104 | 1.1102 | 0.345 | 0.8081 | 5.8896 | 0.345 | 0.1812 | 0.3046 | 0.4292 |
| No log | 9.0 | 117 | 1.0735 | 0.34 | 0.7963 | 5.9970 | 0.34 | 0.1842 | 0.3028 | 0.4286 |
| No log | 10.0 | 130 | 1.1145 | 0.265 | 0.8110 | 5.9054 | 0.265 | 0.1300 | 0.2511 | 0.6350 |
| No log | 11.0 | 143 | 0.9981 | 0.325 | 0.7659 | 5.3834 | 0.325 | 0.1655 | 0.2790 | 0.4860 |
| No log | 12.0 | 156 | 1.0500 | 0.285 | 0.7898 | 4.9696 | 0.285 | 0.1594 | 0.2604 | 0.6636 |
| No log | 13.0 | 169 | 0.8764 | 0.445 | 0.6976 | 4.6456 | 0.445 | 0.2647 | 0.2779 | 0.3020 |
| No log | 14.0 | 182 | 0.9147 | 0.48 | 0.7108 | 4.4793 | 0.48 | 0.2942 | 0.3262 | 0.2862 |
| No log | 15.0 | 195 | 0.9776 | 0.38 | 0.7434 | 4.4065 | 0.38 | 0.2269 | 0.2938 | 0.5297 |
| No log | 16.0 | 208 | 0.8066 | 0.47 | 0.6494 | 3.9671 | 0.47 | 0.2966 | 0.2791 | 0.2907 |
| No log | 17.0 | 221 | 0.7766 | 0.535 | 0.6305 | 3.5250 | 0.535 | 0.3866 | 0.3003 | 0.2424 |
| No log | 18.0 | 234 | 0.8186 | 0.535 | 0.6458 | 3.3670 | 0.535 | 0.3792 | 0.3005 | 0.2311 |
| No log | 19.0 | 247 | 0.8156 | 0.52 | 0.6430 | 3.1633 | 0.52 | 0.3675 | 0.3072 | 0.2667 |
| No log | 20.0 | 260 | 0.8386 | 0.55 | 0.6462 | 3.2549 | 0.55 | 0.4251 | 0.3103 | 0.2703 |
| No log | 21.0 | 273 | 0.7996 | 0.515 | 0.6342 | 3.1396 | 0.515 | 0.3969 | 0.3177 | 0.2867 |
| No log | 22.0 | 286 | 0.8605 | 0.6 | 0.6472 | 3.2563 | 0.6 | 0.4717 | 0.3810 | 0.2113 |
| No log | 23.0 | 299 | 0.7138 | 0.595 | 0.5713 | 3.1171 | 0.595 | 0.4657 | 0.2773 | 0.2034 |
| No log | 24.0 | 312 | 0.7212 | 0.665 | 0.5740 | 2.9688 | 0.665 | 0.5474 | 0.3366 | 0.1754 |
| No log | 25.0 | 325 | 0.7463 | 0.63 | 0.5843 | 2.8998 | 0.63 | 0.5502 | 0.3432 | 0.2072 |
| No log | 26.0 | 338 | 0.7231 | 0.67 | 0.5626 | 3.1334 | 0.67 | 0.5564 | 0.3160 | 0.1521 |
| No log | 27.0 | 351 | 0.6913 | 0.68 | 0.5427 | 2.8906 | 0.68 | 0.5702 | 0.3354 | 0.1406 |
| No log | 28.0 | 364 | 0.6825 | 0.66 | 0.5342 | 2.8619 | 0.66 | 0.5615 | 0.2902 | 0.1625 |
| No log | 29.0 | 377 | 0.7015 | 0.665 | 0.5549 | 2.7315 | 0.665 | 0.5741 | 0.3305 | 0.1769 |
| No log | 30.0 | 390 | 0.6939 | 0.67 | 0.5406 | 2.7114 | 0.67 | 0.5720 | 0.3353 | 0.1420 |
| No log | 31.0 | 403 | 0.6836 | 0.69 | 0.5265 | 2.7567 | 0.69 | 0.5982 | 0.3216 | 0.1455 |
| No log | 32.0 | 416 | 0.6728 | 0.69 | 0.5211 | 2.6858 | 0.69 | 0.6056 | 0.3124 | 0.1453 |
| No log | 33.0 | 429 | 0.6926 | 0.675 | 0.5403 | 2.5815 | 0.675 | 0.6095 | 0.3258 | 0.1683 |
| No log | 34.0 | 442 | 0.6673 | 0.66 | 0.5090 | 2.5591 | 0.66 | 0.5722 | 0.2950 | 0.1385 |
| No log | 35.0 | 455 | 0.6811 | 0.675 | 0.5207 | 2.5813 | 0.675 | 0.5841 | 0.3324 | 0.1273 |
| No log | 36.0 | 468 | 0.6648 | 0.69 | 0.5119 | 2.5745 | 0.69 | 0.6225 | 0.3433 | 0.1320 |
| No log | 37.0 | 481 | 0.6623 | 0.67 | 0.5092 | 2.6134 | 0.67 | 0.6129 | 0.3204 | 0.1471 |
| No log | 38.0 | 494 | 0.6635 | 0.69 | 0.5088 | 2.3862 | 0.69 | 0.6192 | 0.3201 | 0.1311 |
| 0.7628 | 39.0 | 507 | 0.6554 | 0.685 | 0.5008 | 2.5849 | 0.685 | 0.6210 | 0.3179 | 0.1377 |
| 0.7628 | 40.0 | 520 | 0.6567 | 0.685 | 0.5022 | 2.6498 | 0.685 | 0.6310 | 0.3127 | 0.1414 |
| 0.7628 | 41.0 | 533 | 0.6558 | 0.695 | 0.4996 | 2.5917 | 0.695 | 0.6347 | 0.3115 | 0.1321 |
| 0.7628 | 42.0 | 546 | 0.6578 | 0.695 | 0.5021 | 2.4864 | 0.695 | 0.6259 | 0.3098 | 0.1306 |
| 0.7628 | 43.0 | 559 | 0.6544 | 0.685 | 0.4969 | 2.5757 | 0.685 | 0.6175 | 0.2955 | 0.1342 |
| 0.7628 | 44.0 | 572 | 0.6507 | 0.685 | 0.4944 | 2.5057 | 0.685 | 0.6257 | 0.3144 | 0.1304 |
| 0.7628 | 45.0 | 585 | 0.6501 | 0.675 | 0.4937 | 2.4903 | 0.675 | 0.6208 | 0.3091 | 0.1301 |
| 0.7628 | 46.0 | 598 | 0.6518 | 0.685 | 0.4949 | 2.4732 | 0.685 | 0.6254 | 0.3164 | 0.1235 |
| 0.7628 | 47.0 | 611 | 0.6499 | 0.685 | 0.4936 | 2.4924 | 0.685 | 0.6273 | 0.3124 | 0.1323 |
| 0.7628 | 48.0 | 624 | 0.6490 | 0.7 | 0.4925 | 2.4999 | 0.7 | 0.6353 | 0.3147 | 0.1243 |
| 0.7628 | 49.0 | 637 | 0.6510 | 0.685 | 0.4933 | 2.5758 | 0.685 | 0.6242 | 0.3206 | 0.1281 |
| 0.7628 | 50.0 | 650 | 0.6481 | 0.69 | 0.4919 | 2.4969 | 0.69 | 0.6317 | 0.3029 | 0.1260 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
bdpc/resnet101-base_tobacco-cnn_tobacco3482_kd_MSE
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet101-base_tobacco-cnn_tobacco3482_kd_MSE
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0899
- Accuracy: 0.395
- Brier Loss: 0.6867
- Nll: 4.7352
- F1 Micro: 0.395
- F1 Macro: 0.2347
- Ece: 0.2366
- Aurc: 0.3626
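The `kd_MSE` suffix plausibly means the student regresses the teacher's logits under a mean-squared-error objective; one common reading, not a confirmed recipe:
```python
import torch.nn.functional as F

def mse_kd_loss(student_logits, teacher_logits):
    # Assumed logit-matching objective: MSE between student and teacher logits.
    return F.mse_loss(student_logits, teacher_logits)
```
The small evaluation Loss above (0.0899) relative to the cross-entropy-based cards is consistent with an MSE-scale objective, though that is an inference from the naming only.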
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 13 | 1.1202 | 0.17 | 0.8964 | 8.4790 | 0.17 | 0.1089 | 0.2136 | 0.8244 |
| No log | 2.0 | 26 | 1.0772 | 0.165 | 0.8950 | 8.2397 | 0.165 | 0.0929 | 0.2120 | 0.8534 |
| No log | 3.0 | 39 | 0.9427 | 0.2 | 0.8847 | 7.1036 | 0.2000 | 0.0796 | 0.2384 | 0.7748 |
| No log | 4.0 | 52 | 0.7947 | 0.21 | 0.8720 | 6.5481 | 0.2100 | 0.0649 | 0.2432 | 0.7270 |
| No log | 5.0 | 65 | 0.5378 | 0.205 | 0.8432 | 6.3064 | 0.205 | 0.0544 | 0.2367 | 0.6763 |
| No log | 6.0 | 78 | 0.4557 | 0.18 | 0.8402 | 6.3878 | 0.18 | 0.0308 | 0.2384 | 0.7467 |
| No log | 7.0 | 91 | 0.4326 | 0.18 | 0.8383 | 6.3386 | 0.18 | 0.0308 | 0.2385 | 0.7234 |
| No log | 8.0 | 104 | 0.2832 | 0.265 | 0.8085 | 6.3561 | 0.265 | 0.1012 | 0.2570 | 0.6272 |
| No log | 9.0 | 117 | 0.2672 | 0.255 | 0.8124 | 6.2296 | 0.255 | 0.0981 | 0.2569 | 0.6567 |
| No log | 10.0 | 130 | 0.2452 | 0.29 | 0.7953 | 6.3199 | 0.29 | 0.1153 | 0.2717 | 0.5884 |
| No log | 11.0 | 143 | 0.2155 | 0.31 | 0.7764 | 6.3618 | 0.31 | 0.1231 | 0.2728 | 0.4803 |
| No log | 12.0 | 156 | 0.1315 | 0.31 | 0.7371 | 6.2610 | 0.31 | 0.1231 | 0.2343 | 0.4419 |
| No log | 13.0 | 169 | 0.1803 | 0.3 | 0.7665 | 6.1189 | 0.3 | 0.1187 | 0.2587 | 0.4579 |
| No log | 14.0 | 182 | 0.1426 | 0.31 | 0.7386 | 6.1115 | 0.31 | 0.1236 | 0.2502 | 0.4341 |
| No log | 15.0 | 195 | 0.1431 | 0.31 | 0.7334 | 5.9353 | 0.31 | 0.1274 | 0.2624 | 0.4233 |
| No log | 16.0 | 208 | 0.1540 | 0.32 | 0.7318 | 5.7102 | 0.32 | 0.1432 | 0.2493 | 0.4322 |
| No log | 17.0 | 221 | 0.2603 | 0.305 | 0.7784 | 5.6776 | 0.305 | 0.1361 | 0.2751 | 0.5118 |
| No log | 18.0 | 234 | 0.1000 | 0.35 | 0.7074 | 5.4636 | 0.35 | 0.1574 | 0.2420 | 0.4027 |
| No log | 19.0 | 247 | 0.1014 | 0.33 | 0.7131 | 5.5297 | 0.33 | 0.1413 | 0.2439 | 0.4245 |
| No log | 20.0 | 260 | 0.2862 | 0.265 | 0.8013 | 5.5041 | 0.265 | 0.1126 | 0.2762 | 0.6324 |
| No log | 21.0 | 273 | 0.1224 | 0.34 | 0.7183 | 5.2027 | 0.34 | 0.1544 | 0.2673 | 0.4222 |
| No log | 22.0 | 286 | 0.1406 | 0.345 | 0.7173 | 5.1426 | 0.345 | 0.1612 | 0.2710 | 0.4019 |
| No log | 23.0 | 299 | 0.1509 | 0.34 | 0.7270 | 5.0281 | 0.34 | 0.1565 | 0.2641 | 0.4178 |
| No log | 24.0 | 312 | 0.0994 | 0.37 | 0.6996 | 5.1278 | 0.37 | 0.1771 | 0.2390 | 0.3930 |
| No log | 25.0 | 325 | 0.1965 | 0.35 | 0.7474 | 5.0356 | 0.35 | 0.1707 | 0.2774 | 0.4503 |
| No log | 26.0 | 338 | 0.1104 | 0.37 | 0.7085 | 5.0275 | 0.37 | 0.1984 | 0.2663 | 0.3927 |
| No log | 27.0 | 351 | 0.1674 | 0.34 | 0.7299 | 4.9200 | 0.34 | 0.1739 | 0.2787 | 0.4257 |
| No log | 28.0 | 364 | 0.2424 | 0.335 | 0.7626 | 5.0286 | 0.335 | 0.1693 | 0.2905 | 0.5297 |
| No log | 29.0 | 377 | 0.1261 | 0.345 | 0.7185 | 5.0591 | 0.345 | 0.1730 | 0.2892 | 0.4142 |
| No log | 30.0 | 390 | 0.1574 | 0.365 | 0.7213 | 4.8809 | 0.3650 | 0.1951 | 0.2983 | 0.4062 |
| No log | 31.0 | 403 | 0.1227 | 0.365 | 0.7098 | 4.8152 | 0.3650 | 0.1996 | 0.2802 | 0.3992 |
| No log | 32.0 | 416 | 0.1114 | 0.355 | 0.7010 | 4.8224 | 0.3550 | 0.1915 | 0.2657 | 0.3958 |
| No log | 33.0 | 429 | 0.1027 | 0.39 | 0.6934 | 4.7755 | 0.39 | 0.2245 | 0.2653 | 0.3695 |
| No log | 34.0 | 442 | 0.0959 | 0.385 | 0.6875 | 4.8715 | 0.3850 | 0.2299 | 0.2591 | 0.3699 |
| No log | 35.0 | 455 | 0.0905 | 0.395 | 0.6897 | 4.8649 | 0.395 | 0.2367 | 0.2519 | 0.3627 |
| No log | 36.0 | 468 | 0.0879 | 0.365 | 0.6911 | 4.8472 | 0.3650 | 0.2132 | 0.2437 | 0.3910 |
| No log | 37.0 | 481 | 0.0867 | 0.39 | 0.6881 | 4.7379 | 0.39 | 0.2335 | 0.2576 | 0.3680 |
| No log | 38.0 | 494 | 0.0934 | 0.4 | 0.6916 | 4.6797 | 0.4000 | 0.2490 | 0.2578 | 0.3628 |
| 0.2032 | 39.0 | 507 | 0.0928 | 0.38 | 0.6901 | 4.6734 | 0.38 | 0.2268 | 0.2432 | 0.3783 |
| 0.2032 | 40.0 | 520 | 0.0995 | 0.39 | 0.6875 | 4.8180 | 0.39 | 0.2323 | 0.2647 | 0.3730 |
| 0.2032 | 41.0 | 533 | 0.0944 | 0.37 | 0.6892 | 4.8193 | 0.37 | 0.2174 | 0.2536 | 0.3862 |
| 0.2032 | 42.0 | 546 | 0.0904 | 0.415 | 0.6885 | 4.5644 | 0.415 | 0.2556 | 0.2729 | 0.3573 |
| 0.2032 | 43.0 | 559 | 0.0951 | 0.39 | 0.6899 | 4.6549 | 0.39 | 0.2417 | 0.2525 | 0.3692 |
| 0.2032 | 44.0 | 572 | 0.0884 | 0.4 | 0.6860 | 4.6572 | 0.4000 | 0.2402 | 0.2587 | 0.3557 |
| 0.2032 | 45.0 | 585 | 0.0867 | 0.38 | 0.6874 | 4.6558 | 0.38 | 0.2278 | 0.2526 | 0.3738 |
| 0.2032 | 46.0 | 598 | 0.0861 | 0.405 | 0.6844 | 4.5777 | 0.405 | 0.2537 | 0.2548 | 0.3628 |
| 0.2032 | 47.0 | 611 | 0.0874 | 0.385 | 0.6853 | 4.4946 | 0.3850 | 0.2380 | 0.2570 | 0.3743 |
| 0.2032 | 48.0 | 624 | 0.0880 | 0.405 | 0.6857 | 4.5605 | 0.405 | 0.2500 | 0.2489 | 0.3555 |
| 0.2032 | 49.0 | 637 | 0.0884 | 0.4 | 0.6853 | 4.6057 | 0.4000 | 0.2481 | 0.2401 | 0.3616 |
| 0.2032 | 50.0 | 650 | 0.0899 | 0.395 | 0.6867 | 4.7352 | 0.395 | 0.2347 | 0.2366 | 0.3626 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
bdpc/resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.5_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.5_a0.5
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the None dataset.
It achieves the following results on the evaluation set (a sketch of the AURC computation follows the list):
- Loss: 0.6500
- Accuracy: 0.69
- Brier Loss: 0.5003
- Nll: 2.5629
- F1 Micro: 0.69
- F1 Macro: 0.6350
- Ece: 0.3098
- Aurc: 0.1329
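AURC is the area under the risk–coverage curve: predictions are sorted by confidence and the error rate of the retained set is averaged as coverage grows. A minimal sketch, with the same hypothetical `probs`/`labels` arrays:
```python
import numpy as np

def aurc(probs, labels) -> float:
    conf = probs.max(axis=1)
    errors = (probs.argmax(axis=1) != labels).astype(float)
    order = np.argsort(-conf)                       # most confident first
    coverage_sizes = np.arange(1, len(labels) + 1)
    selective_risk = np.cumsum(errors[order]) / coverage_sizes
    return float(selective_risk.mean())             # average risk over coverages
```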
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 13 | 1.4712 | 0.165 | 0.8966 | 8.4652 | 0.165 | 0.1101 | 0.2129 | 0.8342 |
| No log | 2.0 | 26 | 1.4590 | 0.165 | 0.8951 | 8.1097 | 0.165 | 0.1059 | 0.2034 | 0.8021 |
| No log | 3.0 | 39 | 1.4178 | 0.175 | 0.8873 | 6.8095 | 0.175 | 0.0813 | 0.2150 | 0.7994 |
| No log | 4.0 | 52 | 1.3342 | 0.18 | 0.8702 | 6.4137 | 0.18 | 0.0475 | 0.2314 | 0.7558 |
| No log | 5.0 | 65 | 1.2828 | 0.2 | 0.8587 | 6.1547 | 0.2000 | 0.0642 | 0.2429 | 0.7009 |
| No log | 6.0 | 78 | 1.2675 | 0.205 | 0.8548 | 6.1395 | 0.205 | 0.0612 | 0.2348 | 0.7022 |
| No log | 7.0 | 91 | 1.0716 | 0.31 | 0.7962 | 6.4589 | 0.31 | 0.1241 | 0.2787 | 0.4433 |
| No log | 8.0 | 104 | 1.1184 | 0.29 | 0.8126 | 6.2585 | 0.29 | 0.1394 | 0.2863 | 0.5819 |
| No log | 9.0 | 117 | 1.1021 | 0.31 | 0.8075 | 6.0370 | 0.31 | 0.1697 | 0.2834 | 0.5458 |
| No log | 10.0 | 130 | 1.0268 | 0.33 | 0.7815 | 6.1370 | 0.33 | 0.1921 | 0.2856 | 0.5395 |
| No log | 11.0 | 143 | 1.0290 | 0.355 | 0.7759 | 5.3640 | 0.3550 | 0.2143 | 0.2795 | 0.4697 |
| No log | 12.0 | 156 | 0.9169 | 0.36 | 0.7262 | 5.2997 | 0.36 | 0.1995 | 0.2761 | 0.4070 |
| No log | 13.0 | 169 | 0.9903 | 0.36 | 0.7586 | 4.9404 | 0.36 | 0.2200 | 0.2832 | 0.5343 |
| No log | 14.0 | 182 | 0.9128 | 0.425 | 0.7082 | 4.5862 | 0.425 | 0.2706 | 0.2834 | 0.3542 |
| No log | 15.0 | 195 | 1.0046 | 0.405 | 0.7441 | 3.9763 | 0.405 | 0.2759 | 0.3142 | 0.4602 |
| No log | 16.0 | 208 | 0.9277 | 0.41 | 0.7146 | 4.3670 | 0.41 | 0.2763 | 0.2695 | 0.4409 |
| No log | 17.0 | 221 | 0.9726 | 0.505 | 0.7208 | 3.5350 | 0.505 | 0.3736 | 0.3332 | 0.3469 |
| No log | 18.0 | 234 | 0.7717 | 0.505 | 0.6280 | 3.4386 | 0.505 | 0.3412 | 0.2564 | 0.2567 |
| No log | 19.0 | 247 | 0.7723 | 0.58 | 0.6143 | 3.6207 | 0.58 | 0.4125 | 0.3178 | 0.1847 |
| No log | 20.0 | 260 | 0.8182 | 0.57 | 0.6419 | 3.1633 | 0.57 | 0.4855 | 0.3517 | 0.2530 |
| No log | 21.0 | 273 | 0.7333 | 0.58 | 0.5891 | 3.3014 | 0.58 | 0.4512 | 0.2718 | 0.2137 |
| No log | 22.0 | 286 | 0.7374 | 0.665 | 0.5856 | 3.0299 | 0.665 | 0.5432 | 0.3459 | 0.1657 |
| No log | 23.0 | 299 | 0.7083 | 0.645 | 0.5564 | 3.0874 | 0.645 | 0.5180 | 0.3112 | 0.1608 |
| No log | 24.0 | 312 | 0.7480 | 0.64 | 0.5901 | 3.0218 | 0.64 | 0.5410 | 0.3701 | 0.1976 |
| No log | 25.0 | 325 | 0.7547 | 0.68 | 0.5894 | 2.9002 | 0.68 | 0.5801 | 0.3817 | 0.1559 |
| No log | 26.0 | 338 | 0.6998 | 0.65 | 0.5474 | 2.9402 | 0.65 | 0.5468 | 0.2875 | 0.1707 |
| No log | 27.0 | 351 | 0.6967 | 0.66 | 0.5506 | 2.8344 | 0.66 | 0.5578 | 0.3105 | 0.1707 |
| No log | 28.0 | 364 | 0.6733 | 0.655 | 0.5332 | 2.6492 | 0.655 | 0.5719 | 0.2935 | 0.1554 |
| No log | 29.0 | 377 | 0.7162 | 0.67 | 0.5596 | 2.7250 | 0.67 | 0.5721 | 0.3388 | 0.1423 |
| No log | 30.0 | 390 | 0.6826 | 0.665 | 0.5291 | 2.7460 | 0.665 | 0.5797 | 0.3353 | 0.1469 |
| No log | 31.0 | 403 | 0.6761 | 0.665 | 0.5195 | 2.7938 | 0.665 | 0.5647 | 0.3096 | 0.1485 |
| No log | 32.0 | 416 | 0.6745 | 0.695 | 0.5295 | 2.6172 | 0.695 | 0.6160 | 0.3171 | 0.1636 |
| No log | 33.0 | 429 | 0.6785 | 0.695 | 0.5242 | 2.5816 | 0.695 | 0.6115 | 0.3475 | 0.1349 |
| No log | 34.0 | 442 | 0.6688 | 0.665 | 0.5174 | 2.6401 | 0.665 | 0.5833 | 0.2988 | 0.1427 |
| No log | 35.0 | 455 | 0.6767 | 0.675 | 0.5275 | 2.6364 | 0.675 | 0.6027 | 0.3285 | 0.1483 |
| No log | 36.0 | 468 | 0.6605 | 0.695 | 0.5076 | 2.6483 | 0.695 | 0.6252 | 0.3127 | 0.1372 |
| No log | 37.0 | 481 | 0.6538 | 0.705 | 0.5029 | 2.6284 | 0.705 | 0.6340 | 0.3173 | 0.1220 |
| No log | 38.0 | 494 | 0.6610 | 0.695 | 0.5102 | 2.5052 | 0.695 | 0.6375 | 0.3128 | 0.1298 |
| 0.7532 | 39.0 | 507 | 0.6618 | 0.695 | 0.5110 | 2.5663 | 0.695 | 0.6268 | 0.3297 | 0.1367 |
| 0.7532 | 40.0 | 520 | 0.6749 | 0.69 | 0.5235 | 2.5343 | 0.69 | 0.6341 | 0.3256 | 0.1332 |
| 0.7532 | 41.0 | 533 | 0.6574 | 0.695 | 0.5062 | 2.4223 | 0.695 | 0.6338 | 0.3292 | 0.1469 |
| 0.7532 | 42.0 | 546 | 0.6530 | 0.695 | 0.5026 | 2.6189 | 0.695 | 0.6390 | 0.2950 | 0.1391 |
| 0.7532 | 43.0 | 559 | 0.6509 | 0.685 | 0.5003 | 2.5417 | 0.685 | 0.6299 | 0.3150 | 0.1368 |
| 0.7532 | 44.0 | 572 | 0.6520 | 0.71 | 0.5030 | 2.4796 | 0.7100 | 0.6453 | 0.3251 | 0.1286 |
| 0.7532 | 45.0 | 585 | 0.6494 | 0.69 | 0.4994 | 2.5431 | 0.69 | 0.6327 | 0.3138 | 0.1279 |
| 0.7532 | 46.0 | 598 | 0.6515 | 0.71 | 0.5007 | 2.5295 | 0.7100 | 0.6541 | 0.3307 | 0.1208 |
| 0.7532 | 47.0 | 611 | 0.6477 | 0.69 | 0.4979 | 2.5971 | 0.69 | 0.6323 | 0.3263 | 0.1281 |
| 0.7532 | 48.0 | 624 | 0.6495 | 0.7 | 0.5007 | 2.6162 | 0.7 | 0.6395 | 0.3412 | 0.1272 |
| 0.7532 | 49.0 | 637 | 0.6478 | 0.7 | 0.4968 | 2.4946 | 0.7 | 0.6386 | 0.3191 | 0.1309 |
| 0.7532 | 50.0 | 650 | 0.6500 | 0.69 | 0.5003 | 2.5629 | 0.69 | 0.6350 | 0.3098 | 0.1329 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
bdpc/resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.5_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.5_a0.7
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the None dataset.
It achieves the following results on the evaluation set (see the note after this list on the micro/macro F1 gap):
- Loss: 0.8009
- Accuracy: 0.695
- Brier Loss: 0.4518
- Nll: 2.3840
- F1 Micro: 0.695
- F1 Macro: 0.6406
- Ece: 0.2661
- Aurc: 0.1211
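The gap between F1 Micro (0.695) and F1 Macro (0.6406) points to uneven per-class performance: micro-F1 pools all decisions (and equals accuracy for single-label classification), while macro-F1 averages per-class scores, so weak minority classes pull it down. A self-contained toy check with scikit-learn:
```python
from sklearn.metrics import f1_score

# Class 2 is rare and always missed, which hurts macro-F1 far more than micro-F1.
y_true = [0, 0, 0, 1, 1, 1, 2, 2]
y_pred = [0, 0, 0, 1, 1, 1, 0, 1]

print(f1_score(y_true, y_pred, average="micro"))  # 0.75 (pools all decisions)
print(f1_score(y_true, y_pred, average="macro"))  # ~0.57 (unweighted class mean)
```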
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 13 | 1.7971 | 0.17 | 0.8966 | 8.4593 | 0.17 | 0.1148 | 0.2202 | 0.8308 |
| No log | 2.0 | 26 | 1.7887 | 0.13 | 0.8956 | 8.3211 | 0.13 | 0.0772 | 0.2024 | 0.8359 |
| No log | 3.0 | 39 | 1.7450 | 0.225 | 0.8868 | 6.4554 | 0.225 | 0.1165 | 0.2502 | 0.7572 |
| No log | 4.0 | 52 | 1.6811 | 0.24 | 0.8733 | 5.9510 | 0.24 | 0.0953 | 0.2651 | 0.6944 |
| No log | 5.0 | 65 | 1.6411 | 0.19 | 0.8649 | 6.0993 | 0.19 | 0.0493 | 0.2422 | 0.7783 |
| No log | 6.0 | 78 | 1.5475 | 0.195 | 0.8429 | 6.2065 | 0.195 | 0.0630 | 0.2472 | 0.7110 |
| No log | 7.0 | 91 | 1.4688 | 0.3 | 0.8197 | 6.0345 | 0.3 | 0.1481 | 0.2936 | 0.5379 |
| No log | 8.0 | 104 | 1.5036 | 0.285 | 0.8294 | 5.6660 | 0.285 | 0.1428 | 0.2869 | 0.6535 |
| No log | 9.0 | 117 | 1.3901 | 0.34 | 0.7934 | 5.9107 | 0.34 | 0.1853 | 0.2894 | 0.5277 |
| No log | 10.0 | 130 | 1.3484 | 0.37 | 0.7760 | 5.6441 | 0.37 | 0.2175 | 0.3177 | 0.5266 |
| No log | 11.0 | 143 | 1.3375 | 0.34 | 0.7734 | 5.0872 | 0.34 | 0.2083 | 0.2902 | 0.5557 |
| No log | 12.0 | 156 | 1.3639 | 0.305 | 0.7834 | 4.5070 | 0.305 | 0.1885 | 0.2674 | 0.6177 |
| No log | 13.0 | 169 | 1.2321 | 0.415 | 0.7225 | 4.3464 | 0.415 | 0.2751 | 0.2943 | 0.3825 |
| No log | 14.0 | 182 | 1.1453 | 0.44 | 0.6767 | 4.4158 | 0.44 | 0.2864 | 0.2617 | 0.3413 |
| No log | 15.0 | 195 | 1.1830 | 0.43 | 0.6965 | 3.8251 | 0.4300 | 0.2972 | 0.2912 | 0.4239 |
| No log | 16.0 | 208 | 1.0572 | 0.535 | 0.6230 | 3.5943 | 0.535 | 0.3758 | 0.2861 | 0.2291 |
| No log | 17.0 | 221 | 1.0532 | 0.585 | 0.6151 | 3.3834 | 0.585 | 0.4331 | 0.3278 | 0.1879 |
| No log | 18.0 | 234 | 1.0940 | 0.565 | 0.6374 | 3.2290 | 0.565 | 0.4431 | 0.3313 | 0.2415 |
| No log | 19.0 | 247 | 0.9877 | 0.585 | 0.5886 | 3.1068 | 0.585 | 0.4564 | 0.2896 | 0.2110 |
| No log | 20.0 | 260 | 1.0405 | 0.61 | 0.6056 | 3.1786 | 0.61 | 0.5038 | 0.3428 | 0.1962 |
| No log | 21.0 | 273 | 0.9728 | 0.635 | 0.5634 | 2.9133 | 0.635 | 0.5293 | 0.3333 | 0.1664 |
| No log | 22.0 | 286 | 0.9425 | 0.635 | 0.5527 | 2.8909 | 0.635 | 0.5237 | 0.3131 | 0.1796 |
| No log | 23.0 | 299 | 0.9549 | 0.65 | 0.5605 | 2.8074 | 0.65 | 0.5539 | 0.3283 | 0.1914 |
| No log | 24.0 | 312 | 1.0085 | 0.67 | 0.5733 | 2.8377 | 0.67 | 0.5543 | 0.3525 | 0.1571 |
| No log | 25.0 | 325 | 0.9140 | 0.655 | 0.5257 | 2.5878 | 0.655 | 0.5603 | 0.3171 | 0.1495 |
| No log | 26.0 | 338 | 0.8979 | 0.65 | 0.5249 | 2.7723 | 0.65 | 0.5563 | 0.2843 | 0.1646 |
| No log | 27.0 | 351 | 0.8912 | 0.675 | 0.5082 | 2.6562 | 0.675 | 0.5837 | 0.2871 | 0.1380 |
| No log | 28.0 | 364 | 0.8966 | 0.66 | 0.5242 | 2.3150 | 0.66 | 0.5890 | 0.3180 | 0.1777 |
| No log | 29.0 | 377 | 0.8602 | 0.67 | 0.4959 | 2.5813 | 0.67 | 0.5866 | 0.3023 | 0.1319 |
| No log | 30.0 | 390 | 0.8434 | 0.69 | 0.4779 | 2.5451 | 0.69 | 0.6130 | 0.3061 | 0.1188 |
| No log | 31.0 | 403 | 0.8406 | 0.715 | 0.4782 | 2.3339 | 0.715 | 0.6438 | 0.3241 | 0.1092 |
| No log | 32.0 | 416 | 0.8294 | 0.71 | 0.4726 | 2.5394 | 0.7100 | 0.6308 | 0.2922 | 0.1218 |
| No log | 33.0 | 429 | 0.8329 | 0.68 | 0.4763 | 2.4520 | 0.68 | 0.6166 | 0.2592 | 0.1396 |
| No log | 34.0 | 442 | 0.8937 | 0.69 | 0.5015 | 2.5649 | 0.69 | 0.6357 | 0.3293 | 0.1279 |
| No log | 35.0 | 455 | 0.8358 | 0.665 | 0.4807 | 2.4437 | 0.665 | 0.6178 | 0.2380 | 0.1473 |
| No log | 36.0 | 468 | 0.8283 | 0.685 | 0.4747 | 2.5408 | 0.685 | 0.6304 | 0.3126 | 0.1361 |
| No log | 37.0 | 481 | 0.8235 | 0.685 | 0.4707 | 2.4620 | 0.685 | 0.6300 | 0.2757 | 0.1343 |
| No log | 38.0 | 494 | 0.8289 | 0.68 | 0.4778 | 2.5443 | 0.68 | 0.6305 | 0.2935 | 0.1469 |
| 0.9462 | 39.0 | 507 | 0.8373 | 0.69 | 0.4728 | 2.5775 | 0.69 | 0.6281 | 0.3028 | 0.1149 |
| 0.9462 | 40.0 | 520 | 0.8062 | 0.715 | 0.4548 | 2.3673 | 0.715 | 0.6587 | 0.2776 | 0.1133 |
| 0.9462 | 41.0 | 533 | 0.7990 | 0.705 | 0.4517 | 2.3284 | 0.705 | 0.6463 | 0.2716 | 0.1185 |
| 0.9462 | 42.0 | 546 | 0.8210 | 0.7 | 0.4650 | 2.5646 | 0.7 | 0.6432 | 0.2690 | 0.1199 |
| 0.9462 | 43.0 | 559 | 0.8102 | 0.695 | 0.4558 | 2.5651 | 0.695 | 0.6442 | 0.2656 | 0.1184 |
| 0.9462 | 44.0 | 572 | 0.8061 | 0.69 | 0.4566 | 2.5154 | 0.69 | 0.6356 | 0.2816 | 0.1267 |
| 0.9462 | 45.0 | 585 | 0.8018 | 0.7 | 0.4531 | 2.4982 | 0.7 | 0.6419 | 0.2696 | 0.1192 |
| 0.9462 | 46.0 | 598 | 0.8040 | 0.7 | 0.4521 | 2.5309 | 0.7 | 0.6448 | 0.2797 | 0.1166 |
| 0.9462 | 47.0 | 611 | 0.8062 | 0.68 | 0.4560 | 2.5452 | 0.68 | 0.6370 | 0.2744 | 0.1217 |
| 0.9462 | 48.0 | 624 | 0.8011 | 0.69 | 0.4529 | 2.4281 | 0.69 | 0.6402 | 0.2594 | 0.1224 |
| 0.9462 | 49.0 | 637 | 0.8017 | 0.69 | 0.4532 | 2.4239 | 0.69 | 0.6400 | 0.2613 | 0.1261 |
| 0.9462 | 50.0 | 650 | 0.8009 | 0.695 | 0.4518 | 2.3840 | 0.695 | 0.6406 | 0.2661 | 0.1211 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
bdpc/resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.5_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.5_a0.9
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8831
- Accuracy: 0.695
- Brier Loss: 0.4126
- Nll: 2.4628
- F1 Micro: 0.695
- F1 Macro: 0.6387
- Ece: 0.2426
- Aurc: 0.1068
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 13 | 2.1233 | 0.16 | 0.8967 | 8.5697 | 0.16 | 0.1066 | 0.2078 | 0.8316 |
| No log | 2.0 | 26 | 2.1188 | 0.14 | 0.8961 | 8.2960 | 0.14 | 0.0886 | 0.1947 | 0.8419 |
| No log | 3.0 | 39 | 2.0764 | 0.195 | 0.8873 | 6.4713 | 0.195 | 0.1159 | 0.2335 | 0.7665 |
| No log | 4.0 | 52 | 2.0365 | 0.21 | 0.8787 | 5.7752 | 0.2100 | 0.0930 | 0.2376 | 0.7548 |
| No log | 5.0 | 65 | 1.9888 | 0.2 | 0.8682 | 5.8737 | 0.2000 | 0.0775 | 0.2417 | 0.7314 |
| No log | 6.0 | 78 | 1.8998 | 0.215 | 0.8465 | 5.8553 | 0.2150 | 0.0970 | 0.2586 | 0.7063 |
| No log | 7.0 | 91 | 1.8351 | 0.33 | 0.8289 | 5.7781 | 0.33 | 0.1904 | 0.3089 | 0.6103 |
| No log | 8.0 | 104 | 1.7342 | 0.4 | 0.7968 | 5.5366 | 0.4000 | 0.2476 | 0.3457 | 0.4276 |
| No log | 9.0 | 117 | 1.6787 | 0.36 | 0.7757 | 5.7414 | 0.36 | 0.2148 | 0.3062 | 0.4324 |
| No log | 10.0 | 130 | 1.6942 | 0.4 | 0.7870 | 5.2615 | 0.4000 | 0.2831 | 0.3168 | 0.5227 |
| No log | 11.0 | 143 | 1.5992 | 0.4 | 0.7489 | 4.7833 | 0.4000 | 0.2649 | 0.3053 | 0.4679 |
| No log | 12.0 | 156 | 1.6071 | 0.425 | 0.7532 | 4.2803 | 0.425 | 0.2906 | 0.3196 | 0.4646 |
| No log | 13.0 | 169 | 1.4727 | 0.48 | 0.6925 | 4.1911 | 0.48 | 0.3239 | 0.2957 | 0.3081 |
| No log | 14.0 | 182 | 1.4275 | 0.515 | 0.6705 | 3.7980 | 0.515 | 0.3569 | 0.3211 | 0.2626 |
| No log | 15.0 | 195 | 1.3282 | 0.56 | 0.6200 | 3.6359 | 0.56 | 0.4163 | 0.2990 | 0.2213 |
| No log | 16.0 | 208 | 1.3280 | 0.565 | 0.6263 | 3.4960 | 0.565 | 0.4177 | 0.3217 | 0.2346 |
| No log | 17.0 | 221 | 1.3220 | 0.595 | 0.6196 | 3.2202 | 0.595 | 0.4639 | 0.3322 | 0.1992 |
| No log | 18.0 | 234 | 1.2359 | 0.595 | 0.5840 | 3.3332 | 0.595 | 0.4780 | 0.3042 | 0.2011 |
| No log | 19.0 | 247 | 1.1690 | 0.625 | 0.5531 | 3.2423 | 0.625 | 0.5233 | 0.2940 | 0.1807 |
| No log | 20.0 | 260 | 1.1644 | 0.64 | 0.5532 | 3.0542 | 0.64 | 0.5429 | 0.3019 | 0.1821 |
| No log | 21.0 | 273 | 1.1611 | 0.62 | 0.5516 | 2.9412 | 0.62 | 0.5193 | 0.2865 | 0.2160 |
| No log | 22.0 | 286 | 1.3427 | 0.585 | 0.6361 | 3.0936 | 0.585 | 0.5089 | 0.3442 | 0.2922 |
| No log | 23.0 | 299 | 1.1238 | 0.62 | 0.5440 | 2.7924 | 0.62 | 0.5458 | 0.2654 | 0.2088 |
| No log | 24.0 | 312 | 1.2008 | 0.685 | 0.5615 | 2.5918 | 0.685 | 0.5890 | 0.3907 | 0.1516 |
| No log | 25.0 | 325 | 1.0764 | 0.695 | 0.5000 | 2.6354 | 0.695 | 0.6107 | 0.3126 | 0.1397 |
| No log | 26.0 | 338 | 1.0268 | 0.675 | 0.4822 | 2.4798 | 0.675 | 0.5992 | 0.2775 | 0.1229 |
| No log | 27.0 | 351 | 1.0340 | 0.67 | 0.4893 | 2.4316 | 0.67 | 0.5997 | 0.2763 | 0.1638 |
| No log | 28.0 | 364 | 1.0154 | 0.665 | 0.4769 | 2.6487 | 0.665 | 0.6034 | 0.2590 | 0.1487 |
| No log | 29.0 | 377 | 1.0013 | 0.64 | 0.4814 | 2.5899 | 0.64 | 0.5771 | 0.2429 | 0.1593 |
| No log | 30.0 | 390 | 1.0173 | 0.685 | 0.4714 | 2.6922 | 0.685 | 0.6178 | 0.2898 | 0.1423 |
| No log | 31.0 | 403 | 0.9630 | 0.695 | 0.4509 | 2.6349 | 0.695 | 0.6206 | 0.2746 | 0.1248 |
| No log | 32.0 | 416 | 0.9950 | 0.68 | 0.4648 | 2.4144 | 0.68 | 0.6362 | 0.3020 | 0.1725 |
| No log | 33.0 | 429 | 0.9711 | 0.72 | 0.4502 | 2.6651 | 0.72 | 0.6571 | 0.2892 | 0.1268 |
| No log | 34.0 | 442 | 0.9491 | 0.705 | 0.4425 | 2.7169 | 0.705 | 0.6425 | 0.2541 | 0.1145 |
| No log | 35.0 | 455 | 0.9213 | 0.685 | 0.4309 | 2.5736 | 0.685 | 0.6174 | 0.2380 | 0.1161 |
| No log | 36.0 | 468 | 0.9144 | 0.695 | 0.4297 | 2.4141 | 0.695 | 0.6308 | 0.2502 | 0.1154 |
| No log | 37.0 | 481 | 0.9242 | 0.715 | 0.4264 | 2.7191 | 0.715 | 0.6429 | 0.2386 | 0.1030 |
| No log | 38.0 | 494 | 0.9290 | 0.695 | 0.4346 | 2.6515 | 0.695 | 0.6367 | 0.2432 | 0.1189 |
| 1.0953 | 39.0 | 507 | 0.9110 | 0.69 | 0.4262 | 2.6615 | 0.69 | 0.6328 | 0.2368 | 0.1112 |
| 1.0953 | 40.0 | 520 | 0.9000 | 0.695 | 0.4186 | 2.4590 | 0.695 | 0.6417 | 0.2453 | 0.1070 |
| 1.0953 | 41.0 | 533 | 0.8961 | 0.69 | 0.4189 | 2.4170 | 0.69 | 0.6368 | 0.2349 | 0.1090 |
| 1.0953 | 42.0 | 546 | 0.9103 | 0.675 | 0.4286 | 2.6129 | 0.675 | 0.6193 | 0.2318 | 0.1190 |
| 1.0953 | 43.0 | 559 | 0.8858 | 0.715 | 0.4131 | 2.5243 | 0.715 | 0.6517 | 0.2462 | 0.1053 |
| 1.0953 | 44.0 | 572 | 0.8872 | 0.705 | 0.4135 | 2.3272 | 0.705 | 0.6542 | 0.2596 | 0.1051 |
| 1.0953 | 45.0 | 585 | 0.8897 | 0.715 | 0.4136 | 2.3788 | 0.715 | 0.6532 | 0.2560 | 0.1035 |
| 1.0953 | 46.0 | 598 | 0.8842 | 0.7 | 0.4117 | 2.5325 | 0.7 | 0.6446 | 0.2327 | 0.1075 |
| 1.0953 | 47.0 | 611 | 0.8857 | 0.675 | 0.4141 | 2.5451 | 0.675 | 0.6203 | 0.2473 | 0.1125 |
| 1.0953 | 48.0 | 624 | 0.8875 | 0.69 | 0.4164 | 2.4696 | 0.69 | 0.6352 | 0.2542 | 0.1109 |
| 1.0953 | 49.0 | 637 | 0.8842 | 0.69 | 0.4153 | 2.5338 | 0.69 | 0.6358 | 0.2302 | 0.1112 |
| 1.0953 | 50.0 | 650 | 0.8831 | 0.695 | 0.4126 | 2.4628 | 0.695 | 0.6387 | 0.2426 | 0.1068 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
bdpc/resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_kd_CEKD_t1.0_a1.0
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_kd_CEKD_t1.0_a1.0
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7844
- Accuracy: 0.742
- Brier Loss: 0.4405
- Nll: 2.8680
- F1 Micro: 0.7420
- F1 Macro: 0.7411
- Ece: 0.1946
- Aurc: 0.1002
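A minimal inference sketch, assuming the checkpoint is published under the repository id above and that a document image exists at the placeholder path:
```python
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "bdpc/resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_kd_CEKD_t1.0_a1.0"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("document.png").convert("RGB")  # placeholder path
inputs = processor(images=image, return_tensors="pt")
logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```
Note that under the CE+KD reading sketched earlier, a = 1.0 would weight the distillation term to zero, leaving plain cross-entropy; the card itself does not confirm this.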
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 250 | 2.7345 | 0.153 | 0.9327 | 8.3371 | 0.153 | 0.1246 | 0.0866 | 0.7933 |
| 2.6983 | 2.0 | 500 | 2.4500 | 0.4213 | 0.8816 | 4.7062 | 0.4213 | 0.3924 | 0.3073 | 0.4444 |
| 2.6983 | 3.0 | 750 | 1.7959 | 0.5012 | 0.7003 | 3.3576 | 0.5012 | 0.4758 | 0.1869 | 0.3051 |
| 1.7341 | 4.0 | 1000 | 1.3637 | 0.5985 | 0.5511 | 2.8818 | 0.5985 | 0.5868 | 0.1005 | 0.1935 |
| 1.7341 | 5.0 | 1250 | 1.1978 | 0.6498 | 0.4862 | 2.7546 | 0.6498 | 0.6471 | 0.0826 | 0.1500 |
| 1.0818 | 6.0 | 1500 | 1.0812 | 0.6853 | 0.4364 | 2.6325 | 0.6853 | 0.6845 | 0.0522 | 0.1217 |
| 1.0818 | 7.0 | 1750 | 1.0276 | 0.7013 | 0.4149 | 2.5542 | 0.7013 | 0.7003 | 0.0397 | 0.1108 |
| 0.7498 | 8.0 | 2000 | 0.9724 | 0.7133 | 0.3944 | 2.4773 | 0.7133 | 0.7129 | 0.0505 | 0.1040 |
| 0.7498 | 9.0 | 2250 | 0.9777 | 0.7248 | 0.3924 | 2.4916 | 0.7248 | 0.7242 | 0.0628 | 0.0992 |
| 0.5034 | 10.0 | 2500 | 1.0027 | 0.724 | 0.3976 | 2.4974 | 0.724 | 0.7250 | 0.0751 | 0.1032 |
| 0.5034 | 11.0 | 2750 | 0.9979 | 0.729 | 0.3913 | 2.5344 | 0.729 | 0.7295 | 0.0805 | 0.0988 |
| 0.3237 | 12.0 | 3000 | 1.0553 | 0.7192 | 0.4075 | 2.6242 | 0.7192 | 0.7193 | 0.0963 | 0.1072 |
| 0.3237 | 13.0 | 3250 | 1.1162 | 0.7175 | 0.4139 | 2.6543 | 0.7175 | 0.7185 | 0.1295 | 0.1093 |
| 0.2023 | 14.0 | 3500 | 1.1259 | 0.725 | 0.4140 | 2.6758 | 0.7250 | 0.7246 | 0.1237 | 0.1055 |
| 0.2023 | 15.0 | 3750 | 1.2728 | 0.7115 | 0.4381 | 2.8308 | 0.7115 | 0.7147 | 0.1464 | 0.1168 |
| 0.1264 | 16.0 | 4000 | 1.2664 | 0.7222 | 0.4296 | 2.8434 | 0.7223 | 0.7236 | 0.1523 | 0.1107 |
| 0.1264 | 17.0 | 4250 | 1.2620 | 0.724 | 0.4252 | 2.7990 | 0.724 | 0.7252 | 0.1563 | 0.1066 |
| 0.0802 | 18.0 | 4500 | 1.3362 | 0.727 | 0.4293 | 2.8642 | 0.7270 | 0.7267 | 0.1653 | 0.1090 |
| 0.0802 | 19.0 | 4750 | 1.3608 | 0.7302 | 0.4288 | 2.7893 | 0.7302 | 0.7318 | 0.1637 | 0.1059 |
| 0.0553 | 20.0 | 5000 | 1.3757 | 0.7308 | 0.4303 | 2.8861 | 0.7308 | 0.7300 | 0.1670 | 0.1073 |
| 0.0553 | 21.0 | 5250 | 1.4947 | 0.7295 | 0.4420 | 2.8306 | 0.7295 | 0.7300 | 0.1770 | 0.1128 |
| 0.0329 | 22.0 | 5500 | 1.5338 | 0.7265 | 0.4416 | 2.8729 | 0.7265 | 0.7273 | 0.1808 | 0.1097 |
| 0.0329 | 23.0 | 5750 | 1.5127 | 0.7355 | 0.4362 | 2.8574 | 0.7355 | 0.7366 | 0.1774 | 0.1045 |
| 0.0258 | 24.0 | 6000 | 1.5189 | 0.7352 | 0.4360 | 2.8435 | 0.7353 | 0.7344 | 0.1784 | 0.1030 |
| 0.0258 | 25.0 | 6250 | 1.5802 | 0.7362 | 0.4404 | 2.8399 | 0.7362 | 0.7362 | 0.1847 | 0.1013 |
| 0.0193 | 26.0 | 6500 | 1.5869 | 0.737 | 0.4378 | 2.8237 | 0.737 | 0.7362 | 0.1846 | 0.1022 |
| 0.0193 | 27.0 | 6750 | 1.6160 | 0.7365 | 0.4373 | 2.7928 | 0.7365 | 0.7360 | 0.1864 | 0.1049 |
| 0.014 | 28.0 | 7000 | 1.6775 | 0.7372 | 0.4426 | 2.9236 | 0.7372 | 0.7373 | 0.1909 | 0.1039 |
| 0.014 | 29.0 | 7250 | 1.6391 | 0.736 | 0.4370 | 2.8717 | 0.736 | 0.7358 | 0.1905 | 0.0999 |
| 0.0132 | 30.0 | 7500 | 1.6804 | 0.7355 | 0.4434 | 2.8397 | 0.7355 | 0.7360 | 0.1903 | 0.1067 |
| 0.0132 | 31.0 | 7750 | 1.6809 | 0.738 | 0.4386 | 2.8853 | 0.738 | 0.7387 | 0.1920 | 0.1015 |
| 0.0121 | 32.0 | 8000 | 1.6953 | 0.734 | 0.4443 | 2.8451 | 0.734 | 0.7342 | 0.1961 | 0.1013 |
| 0.0121 | 33.0 | 8250 | 1.7184 | 0.7425 | 0.4344 | 2.8180 | 0.7425 | 0.7423 | 0.1910 | 0.1014 |
| 0.0098 | 34.0 | 8500 | 1.7151 | 0.735 | 0.4445 | 2.8532 | 0.735 | 0.7337 | 0.1952 | 0.1000 |
| 0.0098 | 35.0 | 8750 | 1.7781 | 0.7338 | 0.4484 | 2.8133 | 0.7338 | 0.7351 | 0.1999 | 0.1052 |
| 0.0086 | 36.0 | 9000 | 1.7540 | 0.7372 | 0.4443 | 2.8388 | 0.7372 | 0.7388 | 0.1954 | 0.1039 |
| 0.0086 | 37.0 | 9250 | 1.7744 | 0.738 | 0.4474 | 2.8600 | 0.738 | 0.7390 | 0.1953 | 0.1057 |
| 0.0079 | 38.0 | 9500 | 1.7446 | 0.7368 | 0.4417 | 2.8485 | 0.7367 | 0.7374 | 0.1972 | 0.1016 |
| 0.0079 | 39.0 | 9750 | 1.7700 | 0.739 | 0.4398 | 2.8826 | 0.739 | 0.7395 | 0.1970 | 0.1023 |
| 0.0076 | 40.0 | 10000 | 1.7896 | 0.7368 | 0.4442 | 2.8449 | 0.7367 | 0.7376 | 0.1988 | 0.1033 |
| 0.0076 | 41.0 | 10250 | 1.7435 | 0.7402 | 0.4387 | 2.8390 | 0.7402 | 0.7405 | 0.1926 | 0.1031 |
| 0.0074 | 42.0 | 10500 | 1.7837 | 0.7338 | 0.4470 | 2.8191 | 0.7338 | 0.7339 | 0.2018 | 0.1035 |
| 0.0074 | 43.0 | 10750 | 1.8015 | 0.7392 | 0.4427 | 2.8093 | 0.7392 | 0.7401 | 0.1981 | 0.1017 |
| 0.0061 | 44.0 | 11000 | 1.8155 | 0.739 | 0.4449 | 2.8333 | 0.739 | 0.7406 | 0.1983 | 0.1022 |
| 0.0061 | 45.0 | 11250 | 1.7958 | 0.7392 | 0.4426 | 2.8161 | 0.7392 | 0.7385 | 0.1963 | 0.1039 |
| 0.0059 | 46.0 | 11500 | 1.8089 | 0.7422 | 0.4411 | 2.8174 | 0.7422 | 0.7422 | 0.1955 | 0.1011 |
| 0.0059 | 47.0 | 11750 | 1.8125 | 0.743 | 0.4386 | 2.8184 | 0.743 | 0.7435 | 0.1939 | 0.1012 |
| 0.0053 | 48.0 | 12000 | 1.8004 | 0.7372 | 0.4432 | 2.8413 | 0.7372 | 0.7371 | 0.1995 | 0.1023 |
| 0.0053 | 49.0 | 12250 | 1.8075 | 0.7405 | 0.4392 | 2.8569 | 0.7405 | 0.7397 | 0.1962 | 0.1015 |
| 0.0055 | 50.0 | 12500 | 1.7844 | 0.742 | 0.4405 | 2.8680 | 0.7420 | 0.7411 | 0.1946 | 0.1002 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"letter",
"form",
"email",
"handwritten",
"advertisement",
"scientific_report",
"scientific_publication",
"specification",
"file_folder",
"news_article",
"budget",
"invoice",
"presentation",
"questionnaire",
"resume",
"memo"
] |
bdpc/resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t2.5_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t2.5_a0.7
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8012
- Accuracy: 0.7
- Brier Loss: 0.4467
- Nll: 2.5682
- F1 Micro: 0.7
- F1 Macro: 0.6313
- Ece: 0.2684
- Aurc: 0.1170
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 13 | 1.8024 | 0.16 | 0.8966 | 8.5001 | 0.16 | 0.1073 | 0.2079 | 0.8334 |
| No log | 2.0 | 26 | 1.7941 | 0.145 | 0.8957 | 8.3207 | 0.145 | 0.0843 | 0.2022 | 0.8435 |
| No log | 3.0 | 39 | 1.7486 | 0.2 | 0.8868 | 6.2015 | 0.2000 | 0.1007 | 0.2209 | 0.7900 |
| No log | 4.0 | 52 | 1.6854 | 0.205 | 0.8738 | 6.0142 | 0.205 | 0.0707 | 0.2453 | 0.7584 |
| No log | 5.0 | 65 | 1.6162 | 0.2 | 0.8594 | 6.2364 | 0.2000 | 0.0552 | 0.2466 | 0.7717 |
| No log | 6.0 | 78 | 1.5412 | 0.235 | 0.8416 | 6.0423 | 0.235 | 0.0902 | 0.2589 | 0.7006 |
| No log | 7.0 | 91 | 1.5011 | 0.295 | 0.8304 | 6.1420 | 0.295 | 0.1272 | 0.2803 | 0.6124 |
| No log | 8.0 | 104 | 1.4415 | 0.3 | 0.8114 | 6.0440 | 0.3 | 0.1296 | 0.2870 | 0.5641 |
| No log | 9.0 | 117 | 1.3257 | 0.38 | 0.7625 | 5.6923 | 0.38 | 0.2198 | 0.3136 | 0.3675 |
| No log | 10.0 | 130 | 1.3748 | 0.33 | 0.7905 | 5.5276 | 0.33 | 0.1870 | 0.2947 | 0.5985 |
| No log | 11.0 | 143 | 1.3294 | 0.39 | 0.7683 | 4.9632 | 0.39 | 0.2573 | 0.2940 | 0.4639 |
| No log | 12.0 | 156 | 1.2444 | 0.385 | 0.7297 | 4.8431 | 0.3850 | 0.2330 | 0.2849 | 0.4173 |
| No log | 13.0 | 169 | 1.2212 | 0.45 | 0.7153 | 4.5819 | 0.45 | 0.3051 | 0.3143 | 0.3379 |
| No log | 14.0 | 182 | 1.1835 | 0.495 | 0.6888 | 3.6108 | 0.495 | 0.3412 | 0.3316 | 0.2873 |
| No log | 15.0 | 195 | 1.1203 | 0.47 | 0.6559 | 3.6500 | 0.47 | 0.3348 | 0.2935 | 0.3061 |
| No log | 16.0 | 208 | 1.1520 | 0.495 | 0.6707 | 3.8106 | 0.495 | 0.3632 | 0.2938 | 0.3604 |
| No log | 17.0 | 221 | 1.0261 | 0.565 | 0.6021 | 3.3382 | 0.565 | 0.4214 | 0.2840 | 0.2047 |
| No log | 18.0 | 234 | 1.0080 | 0.61 | 0.5914 | 3.2936 | 0.61 | 0.4748 | 0.3240 | 0.1806 |
| No log | 19.0 | 247 | 1.0696 | 0.58 | 0.6253 | 3.2354 | 0.58 | 0.4686 | 0.3152 | 0.2626 |
| No log | 20.0 | 260 | 0.9733 | 0.615 | 0.5722 | 3.1019 | 0.615 | 0.4968 | 0.3259 | 0.2066 |
| No log | 21.0 | 273 | 0.9266 | 0.625 | 0.5423 | 3.0239 | 0.625 | 0.5202 | 0.2834 | 0.1782 |
| No log | 22.0 | 286 | 0.9364 | 0.66 | 0.5461 | 2.9031 | 0.66 | 0.5461 | 0.3128 | 0.1601 |
| No log | 23.0 | 299 | 0.9181 | 0.675 | 0.5307 | 2.8416 | 0.675 | 0.5584 | 0.3106 | 0.1462 |
| No log | 24.0 | 312 | 0.9739 | 0.665 | 0.5539 | 2.8798 | 0.665 | 0.5634 | 0.3325 | 0.1610 |
| No log | 25.0 | 325 | 0.8851 | 0.69 | 0.5099 | 2.7336 | 0.69 | 0.6013 | 0.3064 | 0.1437 |
| No log | 26.0 | 338 | 0.8755 | 0.71 | 0.4979 | 2.7400 | 0.7100 | 0.6032 | 0.3162 | 0.1211 |
| No log | 27.0 | 351 | 0.8653 | 0.675 | 0.4964 | 2.8339 | 0.675 | 0.5705 | 0.2977 | 0.1386 |
| No log | 28.0 | 364 | 0.8838 | 0.675 | 0.5055 | 2.7456 | 0.675 | 0.5816 | 0.2969 | 0.1524 |
| No log | 29.0 | 377 | 0.8805 | 0.68 | 0.5025 | 2.6942 | 0.68 | 0.5855 | 0.3099 | 0.1380 |
| No log | 30.0 | 390 | 0.8585 | 0.665 | 0.4891 | 2.7511 | 0.665 | 0.5737 | 0.2627 | 0.1370 |
| No log | 31.0 | 403 | 0.8410 | 0.675 | 0.4736 | 2.6431 | 0.675 | 0.5985 | 0.2670 | 0.1335 |
| No log | 32.0 | 416 | 0.8378 | 0.71 | 0.4724 | 2.7320 | 0.7100 | 0.6236 | 0.2885 | 0.1153 |
| No log | 33.0 | 429 | 0.8421 | 0.705 | 0.4718 | 2.6331 | 0.705 | 0.6326 | 0.2644 | 0.1147 |
| No log | 34.0 | 442 | 0.8350 | 0.685 | 0.4697 | 2.8035 | 0.685 | 0.6062 | 0.2831 | 0.1291 |
| No log | 35.0 | 455 | 0.8377 | 0.7 | 0.4708 | 2.4611 | 0.7 | 0.6376 | 0.3173 | 0.1195 |
| No log | 36.0 | 468 | 0.8126 | 0.69 | 0.4562 | 2.3909 | 0.69 | 0.6154 | 0.2433 | 0.1177 |
| No log | 37.0 | 481 | 0.8299 | 0.685 | 0.4673 | 2.5695 | 0.685 | 0.6080 | 0.2802 | 0.1261 |
| No log | 38.0 | 494 | 0.8197 | 0.685 | 0.4597 | 2.6388 | 0.685 | 0.6187 | 0.2690 | 0.1229 |
| 0.9314 | 39.0 | 507 | 0.8137 | 0.695 | 0.4547 | 2.7263 | 0.695 | 0.6332 | 0.2581 | 0.1207 |
| 0.9314 | 40.0 | 520 | 0.8168 | 0.69 | 0.4583 | 2.6230 | 0.69 | 0.6267 | 0.2696 | 0.1161 |
| 0.9314 | 41.0 | 533 | 0.8090 | 0.7 | 0.4529 | 2.6449 | 0.7 | 0.6236 | 0.2445 | 0.1187 |
| 0.9314 | 42.0 | 546 | 0.8168 | 0.68 | 0.4586 | 2.5516 | 0.68 | 0.6162 | 0.2722 | 0.1275 |
| 0.9314 | 43.0 | 559 | 0.8100 | 0.7 | 0.4523 | 2.5565 | 0.7 | 0.6347 | 0.2869 | 0.1192 |
| 0.9314 | 44.0 | 572 | 0.8078 | 0.7 | 0.4514 | 2.5734 | 0.7 | 0.6344 | 0.2583 | 0.1172 |
| 0.9314 | 45.0 | 585 | 0.8022 | 0.715 | 0.4472 | 2.4971 | 0.715 | 0.6534 | 0.2890 | 0.1165 |
| 0.9314 | 46.0 | 598 | 0.8049 | 0.695 | 0.4484 | 2.4891 | 0.695 | 0.6423 | 0.2722 | 0.1189 |
| 0.9314 | 47.0 | 611 | 0.8025 | 0.705 | 0.4481 | 2.4929 | 0.705 | 0.6393 | 0.2650 | 0.1124 |
| 0.9314 | 48.0 | 624 | 0.7973 | 0.7 | 0.4439 | 2.5000 | 0.7 | 0.6292 | 0.2718 | 0.1142 |
| 0.9314 | 49.0 | 637 | 0.8011 | 0.7 | 0.4464 | 2.5713 | 0.7 | 0.6303 | 0.2400 | 0.1183 |
| 0.9314 | 50.0 | 650 | 0.8012 | 0.7 | 0.4467 | 2.5682 | 0.7 | 0.6313 | 0.2684 | 0.1170 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
bdpc/resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t2.5_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t2.5_a0.9
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8672
- Accuracy: 0.71
- Brier Loss: 0.4047
- Nll: 2.1924
- F1 Micro: 0.7100
- F1 Macro: 0.6463
- Ece: 0.2420
- Aurc: 0.1050
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 13 | 2.1239 | 0.16 | 0.8967 | 8.4233 | 0.16 | 0.1062 | 0.2101 | 0.8304 |
| No log | 2.0 | 26 | 2.1201 | 0.14 | 0.8961 | 8.2220 | 0.14 | 0.0876 | 0.1970 | 0.8491 |
| No log | 3.0 | 39 | 2.0724 | 0.215 | 0.8865 | 6.2039 | 0.2150 | 0.1169 | 0.2432 | 0.7837 |
| No log | 4.0 | 52 | 2.0291 | 0.185 | 0.8773 | 5.6169 | 0.185 | 0.0792 | 0.2329 | 0.7651 |
| No log | 5.0 | 65 | 1.9592 | 0.215 | 0.8614 | 6.0237 | 0.2150 | 0.0835 | 0.2493 | 0.7373 |
| No log | 6.0 | 78 | 1.9039 | 0.205 | 0.8483 | 5.9575 | 0.205 | 0.0619 | 0.2493 | 0.7526 |
| No log | 7.0 | 91 | 1.8651 | 0.26 | 0.8381 | 5.6215 | 0.26 | 0.1490 | 0.2663 | 0.6747 |
| No log | 8.0 | 104 | 1.8342 | 0.225 | 0.8311 | 5.7631 | 0.225 | 0.1071 | 0.2425 | 0.6919 |
| No log | 9.0 | 117 | 1.8057 | 0.31 | 0.8218 | 5.2969 | 0.31 | 0.2118 | 0.2795 | 0.6489 |
| No log | 10.0 | 130 | 1.5737 | 0.46 | 0.7277 | 5.1748 | 0.46 | 0.2853 | 0.3279 | 0.2977 |
| No log | 11.0 | 143 | 1.5629 | 0.415 | 0.7331 | 4.8259 | 0.415 | 0.2846 | 0.2924 | 0.3880 |
| No log | 12.0 | 156 | 1.5283 | 0.45 | 0.7135 | 4.0012 | 0.45 | 0.3122 | 0.3298 | 0.3197 |
| No log | 13.0 | 169 | 1.4200 | 0.51 | 0.6674 | 3.9849 | 0.51 | 0.3400 | 0.3259 | 0.2549 |
| No log | 14.0 | 182 | 1.4334 | 0.535 | 0.6710 | 3.7006 | 0.535 | 0.3840 | 0.3291 | 0.2584 |
| No log | 15.0 | 195 | 1.4306 | 0.45 | 0.6854 | 3.8260 | 0.45 | 0.3120 | 0.3055 | 0.4297 |
| No log | 16.0 | 208 | 1.3175 | 0.585 | 0.6174 | 3.3484 | 0.585 | 0.4401 | 0.3406 | 0.1916 |
| No log | 17.0 | 221 | 1.2680 | 0.57 | 0.5998 | 3.1408 | 0.57 | 0.4356 | 0.2903 | 0.2136 |
| No log | 18.0 | 234 | 1.2605 | 0.58 | 0.6020 | 3.2085 | 0.58 | 0.4711 | 0.2915 | 0.2355 |
| No log | 19.0 | 247 | 1.2292 | 0.61 | 0.5791 | 3.0633 | 0.61 | 0.5021 | 0.2929 | 0.2082 |
| No log | 20.0 | 260 | 1.3872 | 0.54 | 0.6604 | 3.2778 | 0.54 | 0.4604 | 0.3284 | 0.3506 |
| No log | 21.0 | 273 | 1.1646 | 0.625 | 0.5520 | 2.8539 | 0.625 | 0.5193 | 0.2828 | 0.1885 |
| No log | 22.0 | 286 | 1.1565 | 0.655 | 0.5438 | 2.6915 | 0.655 | 0.5437 | 0.3430 | 0.1549 |
| No log | 23.0 | 299 | 1.1041 | 0.625 | 0.5298 | 2.9930 | 0.625 | 0.5241 | 0.2423 | 0.1906 |
| No log | 24.0 | 312 | 1.0448 | 0.685 | 0.4895 | 2.8196 | 0.685 | 0.5846 | 0.2701 | 0.1411 |
| No log | 25.0 | 325 | 1.0623 | 0.695 | 0.4904 | 2.6903 | 0.695 | 0.6086 | 0.2762 | 0.1435 |
| No log | 26.0 | 338 | 0.9872 | 0.695 | 0.4607 | 2.6336 | 0.695 | 0.5953 | 0.2728 | 0.1180 |
| No log | 27.0 | 351 | 0.9789 | 0.705 | 0.4580 | 2.6326 | 0.705 | 0.6127 | 0.2579 | 0.1171 |
| No log | 28.0 | 364 | 1.0033 | 0.685 | 0.4707 | 2.5747 | 0.685 | 0.5906 | 0.2747 | 0.1291 |
| No log | 29.0 | 377 | 1.0152 | 0.7 | 0.4789 | 2.4333 | 0.7 | 0.6260 | 0.2951 | 0.1739 |
| No log | 30.0 | 390 | 1.0107 | 0.715 | 0.4684 | 2.5194 | 0.715 | 0.6401 | 0.3197 | 0.1389 |
| No log | 31.0 | 403 | 0.9511 | 0.69 | 0.4445 | 2.5648 | 0.69 | 0.6131 | 0.2648 | 0.1298 |
| No log | 32.0 | 416 | 0.9586 | 0.735 | 0.4448 | 2.3342 | 0.735 | 0.6578 | 0.2941 | 0.1275 |
| No log | 33.0 | 429 | 1.0010 | 0.73 | 0.4625 | 2.4748 | 0.7300 | 0.6613 | 0.3307 | 0.1202 |
| No log | 34.0 | 442 | 0.9481 | 0.71 | 0.4361 | 2.4986 | 0.7100 | 0.6456 | 0.2856 | 0.1228 |
| No log | 35.0 | 455 | 0.9190 | 0.69 | 0.4323 | 2.6586 | 0.69 | 0.6265 | 0.2538 | 0.1250 |
| No log | 36.0 | 468 | 0.9226 | 0.715 | 0.4350 | 2.2652 | 0.715 | 0.6507 | 0.2868 | 0.1328 |
| No log | 37.0 | 481 | 0.9017 | 0.725 | 0.4182 | 2.5141 | 0.7250 | 0.6590 | 0.2547 | 0.1013 |
| No log | 38.0 | 494 | 0.9092 | 0.72 | 0.4218 | 2.5171 | 0.72 | 0.6495 | 0.2677 | 0.1055 |
| 1.0958 | 39.0 | 507 | 0.9093 | 0.71 | 0.4221 | 2.6479 | 0.7100 | 0.6456 | 0.2567 | 0.1185 |
| 1.0958 | 40.0 | 520 | 0.8926 | 0.71 | 0.4204 | 2.3785 | 0.7100 | 0.6522 | 0.2396 | 0.1153 |
| 1.0958 | 41.0 | 533 | 0.8928 | 0.715 | 0.4157 | 2.5719 | 0.715 | 0.6487 | 0.2708 | 0.1067 |
| 1.0958 | 42.0 | 546 | 0.8967 | 0.715 | 0.4247 | 2.6422 | 0.715 | 0.6495 | 0.2525 | 0.1174 |
| 1.0958 | 43.0 | 559 | 0.8773 | 0.695 | 0.4116 | 2.5548 | 0.695 | 0.6400 | 0.2491 | 0.1142 |
| 1.0958 | 44.0 | 572 | 0.8660 | 0.71 | 0.4036 | 2.2950 | 0.7100 | 0.6535 | 0.2401 | 0.1009 |
| 1.0958 | 45.0 | 585 | 0.8718 | 0.72 | 0.4057 | 2.4922 | 0.72 | 0.6551 | 0.2624 | 0.0998 |
| 1.0958 | 46.0 | 598 | 0.8737 | 0.7 | 0.4070 | 2.4455 | 0.7 | 0.6416 | 0.2360 | 0.1052 |
| 1.0958 | 47.0 | 611 | 0.8707 | 0.715 | 0.4094 | 2.3519 | 0.715 | 0.6494 | 0.2514 | 0.1086 |
| 1.0958 | 48.0 | 624 | 0.8640 | 0.705 | 0.4039 | 2.3765 | 0.705 | 0.6430 | 0.2538 | 0.1041 |
| 1.0958 | 49.0 | 637 | 0.8702 | 0.7 | 0.4066 | 2.5524 | 0.7 | 0.6423 | 0.2160 | 0.1080 |
| 1.0958 | 50.0 | 650 | 0.8672 | 0.71 | 0.4047 | 2.1924 | 0.7100 | 0.6463 | 0.2420 | 0.1050 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
bdpc/resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t5.0_a0.5
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t5.0_a0.5
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6454
- Accuracy: 0.685
- Brier Loss: 0.4931
- Nll: 2.5040
- F1 Micro: 0.685
- F1 Macro: 0.6171
- Ece: 0.2996
- Aurc: 0.1499
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 13 | 1.4475 | 0.17 | 0.8966 | 8.4781 | 0.17 | 0.1126 | 0.2169 | 0.8299 |
| No log | 2.0 | 26 | 1.4360 | 0.165 | 0.8955 | 8.4118 | 0.165 | 0.1097 | 0.2115 | 0.8359 |
| No log | 3.0 | 39 | 1.3776 | 0.16 | 0.8842 | 6.1685 | 0.16 | 0.0633 | 0.2066 | 0.7780 |
| No log | 4.0 | 52 | 1.3085 | 0.2 | 0.8701 | 6.0521 | 0.2000 | 0.0728 | 0.2332 | 0.7424 |
| No log | 5.0 | 65 | 1.2551 | 0.18 | 0.8597 | 6.1887 | 0.18 | 0.0491 | 0.2265 | 0.7890 |
| No log | 6.0 | 78 | 1.2118 | 0.2 | 0.8489 | 6.1706 | 0.2000 | 0.0631 | 0.2324 | 0.7179 |
| No log | 7.0 | 91 | 1.1759 | 0.19 | 0.8418 | 6.1310 | 0.19 | 0.0428 | 0.2384 | 0.7431 |
| No log | 8.0 | 104 | 1.1577 | 0.195 | 0.8339 | 5.7305 | 0.195 | 0.0535 | 0.2399 | 0.7126 |
| No log | 9.0 | 117 | 0.9905 | 0.34 | 0.7692 | 6.1092 | 0.34 | 0.1567 | 0.2772 | 0.4320 |
| No log | 10.0 | 130 | 0.9603 | 0.355 | 0.7541 | 5.7998 | 0.3550 | 0.1969 | 0.3021 | 0.4111 |
| No log | 11.0 | 143 | 1.0839 | 0.255 | 0.8087 | 5.1464 | 0.255 | 0.1242 | 0.2389 | 0.6769 |
| No log | 12.0 | 156 | 0.9374 | 0.39 | 0.7410 | 4.8415 | 0.39 | 0.2220 | 0.3037 | 0.4194 |
| No log | 13.0 | 169 | 0.9974 | 0.33 | 0.7720 | 4.9023 | 0.33 | 0.1732 | 0.2863 | 0.6049 |
| No log | 14.0 | 182 | 0.9393 | 0.435 | 0.7251 | 4.3102 | 0.435 | 0.2455 | 0.3276 | 0.3645 |
| No log | 15.0 | 195 | 0.9554 | 0.39 | 0.7416 | 4.0361 | 0.39 | 0.2535 | 0.2721 | 0.5075 |
| No log | 16.0 | 208 | 0.8012 | 0.465 | 0.6445 | 4.0129 | 0.465 | 0.2935 | 0.2551 | 0.2839 |
| No log | 17.0 | 221 | 0.8033 | 0.53 | 0.6418 | 3.4959 | 0.53 | 0.3816 | 0.3242 | 0.2458 |
| No log | 18.0 | 234 | 0.7740 | 0.57 | 0.6204 | 3.4062 | 0.57 | 0.4297 | 0.3139 | 0.2245 |
| No log | 19.0 | 247 | 0.7736 | 0.6 | 0.6124 | 3.3460 | 0.6 | 0.4408 | 0.3017 | 0.1919 |
| No log | 20.0 | 260 | 0.9105 | 0.555 | 0.6919 | 3.2115 | 0.555 | 0.4524 | 0.3604 | 0.3099 |
| No log | 21.0 | 273 | 0.7416 | 0.61 | 0.5948 | 3.1349 | 0.61 | 0.5093 | 0.3176 | 0.2233 |
| No log | 22.0 | 286 | 0.7318 | 0.655 | 0.5815 | 3.1259 | 0.655 | 0.5433 | 0.3478 | 0.1672 |
| No log | 23.0 | 299 | 0.7799 | 0.59 | 0.6079 | 3.0590 | 0.59 | 0.4963 | 0.3340 | 0.2455 |
| No log | 24.0 | 312 | 0.7886 | 0.665 | 0.6038 | 2.9965 | 0.665 | 0.5575 | 0.3773 | 0.1623 |
| No log | 25.0 | 325 | 0.7083 | 0.66 | 0.5602 | 3.0752 | 0.66 | 0.5582 | 0.3283 | 0.1772 |
| No log | 26.0 | 338 | 0.6882 | 0.63 | 0.5507 | 2.9022 | 0.63 | 0.5404 | 0.2963 | 0.1851 |
| No log | 27.0 | 351 | 0.6774 | 0.66 | 0.5348 | 2.7876 | 0.66 | 0.5674 | 0.3095 | 0.1662 |
| No log | 28.0 | 364 | 0.8111 | 0.675 | 0.6067 | 2.7578 | 0.675 | 0.5800 | 0.3905 | 0.1923 |
| No log | 29.0 | 377 | 0.6803 | 0.645 | 0.5338 | 2.8666 | 0.645 | 0.5486 | 0.3054 | 0.1646 |
| No log | 30.0 | 390 | 0.6835 | 0.685 | 0.5336 | 2.5944 | 0.685 | 0.5840 | 0.3119 | 0.1595 |
| No log | 31.0 | 403 | 0.6810 | 0.655 | 0.5309 | 2.7112 | 0.655 | 0.5625 | 0.2879 | 0.1786 |
| No log | 32.0 | 416 | 0.6848 | 0.685 | 0.5194 | 2.6456 | 0.685 | 0.5893 | 0.3314 | 0.1350 |
| No log | 33.0 | 429 | 0.6631 | 0.695 | 0.5063 | 2.6286 | 0.695 | 0.5980 | 0.3198 | 0.1314 |
| No log | 34.0 | 442 | 0.6639 | 0.69 | 0.5126 | 2.3890 | 0.69 | 0.5834 | 0.2990 | 0.1376 |
| No log | 35.0 | 455 | 0.6736 | 0.675 | 0.5172 | 2.3291 | 0.675 | 0.6014 | 0.3148 | 0.1646 |
| No log | 36.0 | 468 | 0.6648 | 0.68 | 0.5137 | 2.4549 | 0.68 | 0.6156 | 0.3316 | 0.1492 |
| No log | 37.0 | 481 | 0.6543 | 0.7 | 0.5006 | 2.4275 | 0.7 | 0.6130 | 0.3041 | 0.1342 |
| No log | 38.0 | 494 | 0.6514 | 0.675 | 0.5001 | 2.4064 | 0.675 | 0.5984 | 0.2963 | 0.1491 |
| 0.7462 | 39.0 | 507 | 0.6498 | 0.71 | 0.4988 | 2.5772 | 0.7100 | 0.6405 | 0.2980 | 0.1335 |
| 0.7462 | 40.0 | 520 | 0.6496 | 0.705 | 0.4964 | 2.5649 | 0.705 | 0.6386 | 0.3060 | 0.1380 |
| 0.7462 | 41.0 | 533 | 0.6562 | 0.68 | 0.5027 | 2.5816 | 0.68 | 0.6026 | 0.3100 | 0.1467 |
| 0.7462 | 42.0 | 546 | 0.6632 | 0.68 | 0.5089 | 2.4570 | 0.68 | 0.6112 | 0.2989 | 0.1500 |
| 0.7462 | 43.0 | 559 | 0.6437 | 0.7 | 0.4885 | 2.3648 | 0.7 | 0.6331 | 0.2741 | 0.1427 |
| 0.7462 | 44.0 | 572 | 0.6435 | 0.705 | 0.4894 | 2.4253 | 0.705 | 0.6370 | 0.3043 | 0.1390 |
| 0.7462 | 45.0 | 585 | 0.6457 | 0.695 | 0.4929 | 2.3611 | 0.695 | 0.6314 | 0.3021 | 0.1449 |
| 0.7462 | 46.0 | 598 | 0.6437 | 0.695 | 0.4912 | 2.3639 | 0.695 | 0.6370 | 0.2984 | 0.1436 |
| 0.7462 | 47.0 | 611 | 0.6466 | 0.685 | 0.4933 | 2.4859 | 0.685 | 0.6306 | 0.2936 | 0.1474 |
| 0.7462 | 48.0 | 624 | 0.6470 | 0.67 | 0.4950 | 2.3782 | 0.67 | 0.6070 | 0.3139 | 0.1547 |
| 0.7462 | 49.0 | 637 | 0.6477 | 0.675 | 0.4945 | 2.4509 | 0.675 | 0.6092 | 0.2852 | 0.1527 |
| 0.7462 | 50.0 | 650 | 0.6454 | 0.685 | 0.4931 | 2.5040 | 0.685 | 0.6171 | 0.2996 | 0.1499 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
bdpc/resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t5.0_a0.7
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t5.0_a0.7
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7861
- Accuracy: 0.705
- Brier Loss: 0.4410
- Nll: 2.6519
- F1 Micro: 0.705
- F1 Macro: 0.6403
- Ece: 0.2724
- Aurc: 0.1188
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 13 | 1.7831 | 0.165 | 0.8966 | 8.4414 | 0.165 | 0.1121 | 0.2151 | 0.8335 |
| No log | 2.0 | 26 | 1.7753 | 0.145 | 0.8958 | 8.5715 | 0.145 | 0.0954 | 0.1998 | 0.8332 |
| No log | 3.0 | 39 | 1.7334 | 0.175 | 0.8877 | 6.4682 | 0.175 | 0.0756 | 0.2069 | 0.7896 |
| No log | 4.0 | 52 | 1.6604 | 0.185 | 0.8723 | 6.0351 | 0.185 | 0.0505 | 0.2328 | 0.7549 |
| No log | 5.0 | 65 | 1.5874 | 0.18 | 0.8560 | 6.0732 | 0.18 | 0.0431 | 0.2285 | 0.7506 |
| No log | 6.0 | 78 | 1.5223 | 0.185 | 0.8415 | 6.1638 | 0.185 | 0.0479 | 0.2419 | 0.7530 |
| No log | 7.0 | 91 | 1.4642 | 0.35 | 0.8239 | 6.0328 | 0.35 | 0.1696 | 0.3081 | 0.5219 |
| No log | 8.0 | 104 | 1.3599 | 0.35 | 0.7825 | 6.2102 | 0.35 | 0.1908 | 0.2977 | 0.4172 |
| No log | 9.0 | 117 | 1.3083 | 0.385 | 0.7566 | 5.7128 | 0.3850 | 0.2203 | 0.3012 | 0.3842 |
| No log | 10.0 | 130 | 1.3151 | 0.365 | 0.7670 | 5.1073 | 0.3650 | 0.2150 | 0.2923 | 0.4891 |
| No log | 11.0 | 143 | 1.3736 | 0.295 | 0.7950 | 5.3584 | 0.295 | 0.1747 | 0.2716 | 0.6360 |
| No log | 12.0 | 156 | 1.2655 | 0.425 | 0.7380 | 4.0312 | 0.425 | 0.2789 | 0.3273 | 0.3366 |
| No log | 13.0 | 169 | 1.1696 | 0.475 | 0.6901 | 3.9627 | 0.4750 | 0.3083 | 0.3011 | 0.2825 |
| No log | 14.0 | 182 | 1.2992 | 0.355 | 0.7473 | 3.9098 | 0.3550 | 0.2292 | 0.2675 | 0.4929 |
| No log | 15.0 | 195 | 1.1698 | 0.51 | 0.6881 | 3.7143 | 0.51 | 0.3691 | 0.3333 | 0.3278 |
| No log | 16.0 | 208 | 1.0624 | 0.515 | 0.6274 | 3.8387 | 0.515 | 0.3631 | 0.2821 | 0.2583 |
| No log | 17.0 | 221 | 1.0970 | 0.565 | 0.6421 | 3.3302 | 0.565 | 0.4493 | 0.3362 | 0.2373 |
| No log | 18.0 | 234 | 1.0029 | 0.625 | 0.5883 | 3.3820 | 0.625 | 0.4675 | 0.3005 | 0.1660 |
| No log | 19.0 | 247 | 1.0384 | 0.605 | 0.6093 | 3.3183 | 0.605 | 0.4863 | 0.3252 | 0.2145 |
| No log | 20.0 | 260 | 1.0686 | 0.62 | 0.6234 | 3.0246 | 0.62 | 0.5155 | 0.3625 | 0.2334 |
| No log | 21.0 | 273 | 0.9641 | 0.62 | 0.5685 | 2.9225 | 0.62 | 0.5259 | 0.3103 | 0.2063 |
| No log | 22.0 | 286 | 1.0054 | 0.665 | 0.5849 | 3.0792 | 0.665 | 0.5614 | 0.3636 | 0.1863 |
| No log | 23.0 | 299 | 0.9959 | 0.675 | 0.5734 | 2.9829 | 0.675 | 0.5577 | 0.3619 | 0.1806 |
| No log | 24.0 | 312 | 0.9044 | 0.675 | 0.5267 | 2.8952 | 0.675 | 0.5712 | 0.2989 | 0.1475 |
| No log | 25.0 | 325 | 0.9803 | 0.655 | 0.5627 | 2.7501 | 0.655 | 0.5418 | 0.3415 | 0.1919 |
| No log | 26.0 | 338 | 0.8814 | 0.65 | 0.5176 | 2.8421 | 0.65 | 0.5619 | 0.2665 | 0.1694 |
| No log | 27.0 | 351 | 0.8555 | 0.69 | 0.4928 | 2.7870 | 0.69 | 0.5831 | 0.3091 | 0.1279 |
| No log | 28.0 | 364 | 0.8290 | 0.69 | 0.4777 | 2.6377 | 0.69 | 0.5976 | 0.2551 | 0.1290 |
| No log | 29.0 | 377 | 0.8593 | 0.685 | 0.4949 | 2.5880 | 0.685 | 0.5776 | 0.3083 | 0.1279 |
| No log | 30.0 | 390 | 0.8226 | 0.685 | 0.4678 | 2.8938 | 0.685 | 0.5884 | 0.2820 | 0.1249 |
| No log | 31.0 | 403 | 0.8578 | 0.69 | 0.4857 | 2.6150 | 0.69 | 0.6024 | 0.3109 | 0.1344 |
| No log | 32.0 | 416 | 0.8330 | 0.685 | 0.4753 | 2.5999 | 0.685 | 0.6047 | 0.2688 | 0.1407 |
| No log | 33.0 | 429 | 0.8268 | 0.7 | 0.4683 | 2.6138 | 0.7 | 0.6193 | 0.2913 | 0.1315 |
| No log | 34.0 | 442 | 0.8535 | 0.715 | 0.4749 | 2.5059 | 0.715 | 0.6450 | 0.2931 | 0.1190 |
| No log | 35.0 | 455 | 0.8334 | 0.665 | 0.4752 | 2.3839 | 0.665 | 0.5950 | 0.2762 | 0.1397 |
| No log | 36.0 | 468 | 0.8025 | 0.71 | 0.4553 | 2.4803 | 0.7100 | 0.6302 | 0.2889 | 0.1178 |
| No log | 37.0 | 481 | 0.8142 | 0.715 | 0.4563 | 2.6785 | 0.715 | 0.6426 | 0.2989 | 0.1048 |
| No log | 38.0 | 494 | 0.8124 | 0.7 | 0.4538 | 2.5320 | 0.7 | 0.6332 | 0.2594 | 0.1132 |
| 0.9303 | 39.0 | 507 | 0.7888 | 0.69 | 0.4452 | 2.6427 | 0.69 | 0.6269 | 0.2583 | 0.1224 |
| 0.9303 | 40.0 | 520 | 0.7907 | 0.705 | 0.4458 | 2.6942 | 0.705 | 0.6367 | 0.2688 | 0.1155 |
| 0.9303 | 41.0 | 533 | 0.7918 | 0.71 | 0.4442 | 2.4378 | 0.7100 | 0.6558 | 0.2816 | 0.1132 |
| 0.9303 | 42.0 | 546 | 0.8005 | 0.725 | 0.4479 | 2.6088 | 0.7250 | 0.6576 | 0.2914 | 0.1049 |
| 0.9303 | 43.0 | 559 | 0.7879 | 0.72 | 0.4421 | 2.7052 | 0.72 | 0.6592 | 0.2741 | 0.1122 |
| 0.9303 | 44.0 | 572 | 0.7910 | 0.71 | 0.4461 | 2.6463 | 0.7100 | 0.6463 | 0.3119 | 0.1188 |
| 0.9303 | 45.0 | 585 | 0.7922 | 0.705 | 0.4450 | 2.6453 | 0.705 | 0.6481 | 0.2753 | 0.1211 |
| 0.9303 | 46.0 | 598 | 0.7915 | 0.715 | 0.4429 | 2.6970 | 0.715 | 0.6526 | 0.2741 | 0.1107 |
| 0.9303 | 47.0 | 611 | 0.7809 | 0.705 | 0.4370 | 2.6841 | 0.705 | 0.6453 | 0.2734 | 0.1158 |
| 0.9303 | 48.0 | 624 | 0.7771 | 0.705 | 0.4350 | 2.6168 | 0.705 | 0.6423 | 0.2652 | 0.1139 |
| 0.9303 | 49.0 | 637 | 0.7826 | 0.705 | 0.4377 | 2.5091 | 0.705 | 0.6423 | 0.2758 | 0.1202 |
| 0.9303 | 50.0 | 650 | 0.7861 | 0.705 | 0.4410 | 2.6519 | 0.705 | 0.6403 | 0.2724 | 0.1188 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
dima806/man_woman_face_image_detection
|
Returns, with about 98.7% accuracy, whether a face image shows a man or a woman.
See https://www.kaggle.com/code/dima806/man-woman-face-image-detection-vit for more details.

```
Classification report:
precision recall f1-score support
man 0.9885 0.9857 0.9871 51062
woman 0.9857 0.9885 0.9871 51062
accuracy 0.9871 102124
macro avg 0.9871 0.9871 0.9871 102124
weighted avg 0.9871 0.9871 0.9871 102124
```
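For a quick check, the model can be loaded through the 🤗 `pipeline` API. A minimal sketch (the image path is a placeholder):
```python
from transformers import pipeline

# Minimal inference sketch; "face.jpg" is a placeholder for a cropped face image.
classifier = pipeline("image-classification", model="dima806/man_woman_face_image_detection")
print(classifier("face.jpg"))  # e.g. [{'label': 'man', 'score': ...}, {'label': 'woman', 'score': ...}]
```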
|
[
"man",
"woman"
] |
bdpc/resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t5.0_a0.9
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t5.0_a0.9
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8809
- Accuracy: 0.7
- Brier Loss: 0.4126
- Nll: 2.4279
- F1 Micro: 0.7
- F1 Macro: 0.6279
- Ece: 0.2569
- Aurc: 0.1111
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 13 | 2.1185 | 0.165 | 0.8967 | 8.5399 | 0.165 | 0.1130 | 0.2151 | 0.8331 |
| No log | 2.0 | 26 | 2.1127 | 0.13 | 0.8958 | 8.1152 | 0.13 | 0.0842 | 0.1816 | 0.8392 |
| No log | 3.0 | 39 | 2.0781 | 0.165 | 0.8888 | 6.8828 | 0.165 | 0.0878 | 0.2150 | 0.8082 |
| No log | 4.0 | 52 | 2.0197 | 0.22 | 0.8762 | 5.7578 | 0.22 | 0.1155 | 0.2521 | 0.7521 |
| No log | 5.0 | 65 | 1.9499 | 0.205 | 0.8601 | 6.0641 | 0.205 | 0.0951 | 0.2567 | 0.7355 |
| No log | 6.0 | 78 | 1.9019 | 0.25 | 0.8483 | 5.8930 | 0.25 | 0.1178 | 0.2728 | 0.6862 |
| No log | 7.0 | 91 | 1.8252 | 0.28 | 0.8301 | 5.8062 | 0.28 | 0.1660 | 0.2890 | 0.6982 |
| No log | 8.0 | 104 | 1.8194 | 0.28 | 0.8275 | 5.2642 | 0.28 | 0.1625 | 0.2874 | 0.6935 |
| No log | 9.0 | 117 | 1.7671 | 0.355 | 0.8109 | 5.1326 | 0.3550 | 0.2211 | 0.3018 | 0.5678 |
| No log | 10.0 | 130 | 1.6582 | 0.355 | 0.7774 | 5.2226 | 0.3550 | 0.2200 | 0.2991 | 0.5305 |
| No log | 11.0 | 143 | 1.5849 | 0.395 | 0.7422 | 5.0239 | 0.395 | 0.2436 | 0.2979 | 0.3974 |
| No log | 12.0 | 156 | 1.4908 | 0.46 | 0.7001 | 4.2790 | 0.46 | 0.3169 | 0.3091 | 0.3003 |
| No log | 13.0 | 169 | 1.6016 | 0.395 | 0.7496 | 4.2149 | 0.395 | 0.2793 | 0.2929 | 0.4640 |
| No log | 14.0 | 182 | 1.4714 | 0.475 | 0.6971 | 4.0742 | 0.4750 | 0.3299 | 0.3177 | 0.3613 |
| No log | 15.0 | 195 | 1.5007 | 0.46 | 0.7119 | 3.8252 | 0.46 | 0.3145 | 0.3111 | 0.3954 |
| No log | 16.0 | 208 | 1.4352 | 0.515 | 0.6776 | 3.4028 | 0.515 | 0.3948 | 0.3376 | 0.2993 |
| No log | 17.0 | 221 | 1.2890 | 0.575 | 0.6104 | 3.4453 | 0.575 | 0.4478 | 0.2940 | 0.2119 |
| No log | 18.0 | 234 | 1.2190 | 0.595 | 0.5719 | 3.2413 | 0.595 | 0.4662 | 0.2608 | 0.1981 |
| No log | 19.0 | 247 | 1.2287 | 0.59 | 0.5764 | 3.2303 | 0.59 | 0.4857 | 0.2811 | 0.2020 |
| No log | 20.0 | 260 | 1.1726 | 0.64 | 0.5494 | 2.9544 | 0.64 | 0.5307 | 0.2993 | 0.1708 |
| No log | 21.0 | 273 | 1.1305 | 0.61 | 0.5384 | 2.9557 | 0.61 | 0.5170 | 0.2771 | 0.1949 |
| No log | 22.0 | 286 | 1.1256 | 0.645 | 0.5295 | 2.7934 | 0.645 | 0.5381 | 0.3181 | 0.1629 |
| No log | 23.0 | 299 | 1.1209 | 0.645 | 0.5217 | 2.8697 | 0.645 | 0.5432 | 0.3055 | 0.1687 |
| No log | 24.0 | 312 | 1.2513 | 0.685 | 0.5917 | 2.7262 | 0.685 | 0.5639 | 0.3779 | 0.1833 |
| No log | 25.0 | 325 | 1.0321 | 0.695 | 0.4819 | 2.7202 | 0.695 | 0.5896 | 0.2810 | 0.1280 |
| No log | 26.0 | 338 | 1.0405 | 0.645 | 0.4957 | 2.6116 | 0.645 | 0.5661 | 0.2515 | 0.1700 |
| No log | 27.0 | 351 | 1.0580 | 0.695 | 0.4933 | 2.7436 | 0.695 | 0.5996 | 0.2967 | 0.1339 |
| No log | 28.0 | 364 | 0.9740 | 0.65 | 0.4575 | 2.5682 | 0.65 | 0.5731 | 0.2513 | 0.1384 |
| No log | 29.0 | 377 | 0.9934 | 0.695 | 0.4651 | 2.5753 | 0.695 | 0.6108 | 0.2775 | 0.1171 |
| No log | 30.0 | 390 | 0.9900 | 0.645 | 0.4695 | 2.6280 | 0.645 | 0.5668 | 0.2459 | 0.1558 |
| No log | 31.0 | 403 | 0.9671 | 0.695 | 0.4504 | 2.8174 | 0.695 | 0.6094 | 0.2505 | 0.1188 |
| No log | 32.0 | 416 | 0.9327 | 0.715 | 0.4324 | 2.5285 | 0.715 | 0.6415 | 0.2565 | 0.1086 |
| No log | 33.0 | 429 | 0.9628 | 0.71 | 0.4464 | 2.5876 | 0.7100 | 0.6435 | 0.2709 | 0.1152 |
| No log | 34.0 | 442 | 0.9316 | 0.715 | 0.4353 | 2.7111 | 0.715 | 0.6334 | 0.2361 | 0.1078 |
| No log | 35.0 | 455 | 0.9275 | 0.7 | 0.4364 | 2.5226 | 0.7 | 0.6251 | 0.2586 | 0.1207 |
| No log | 36.0 | 468 | 0.9301 | 0.7 | 0.4346 | 2.6464 | 0.7 | 0.6232 | 0.2482 | 0.1142 |
| No log | 37.0 | 481 | 0.9013 | 0.695 | 0.4194 | 2.5575 | 0.695 | 0.6197 | 0.2554 | 0.1098 |
| No log | 38.0 | 494 | 0.9008 | 0.695 | 0.4196 | 2.6270 | 0.695 | 0.6156 | 0.2246 | 0.1063 |
| 1.0903 | 39.0 | 507 | 0.9185 | 0.71 | 0.4311 | 2.6290 | 0.7100 | 0.6362 | 0.2626 | 0.1165 |
| 1.0903 | 40.0 | 520 | 0.9053 | 0.685 | 0.4254 | 2.5057 | 0.685 | 0.6239 | 0.2210 | 0.1171 |
| 1.0903 | 41.0 | 533 | 0.8955 | 0.7 | 0.4189 | 2.4823 | 0.7 | 0.6291 | 0.1995 | 0.1103 |
| 1.0903 | 42.0 | 546 | 0.9012 | 0.69 | 0.4223 | 2.5377 | 0.69 | 0.6195 | 0.2486 | 0.1119 |
| 1.0903 | 43.0 | 559 | 0.8894 | 0.71 | 0.4138 | 2.6167 | 0.7100 | 0.6382 | 0.2459 | 0.1022 |
| 1.0903 | 44.0 | 572 | 0.8846 | 0.695 | 0.4132 | 2.5130 | 0.695 | 0.6265 | 0.2198 | 0.1093 |
| 1.0903 | 45.0 | 585 | 0.8946 | 0.69 | 0.4190 | 2.6357 | 0.69 | 0.6230 | 0.2375 | 0.1145 |
| 1.0903 | 46.0 | 598 | 0.8931 | 0.705 | 0.4168 | 2.6306 | 0.705 | 0.6342 | 0.2555 | 0.1102 |
| 1.0903 | 47.0 | 611 | 0.8842 | 0.71 | 0.4160 | 2.3021 | 0.7100 | 0.6347 | 0.2096 | 0.1120 |
| 1.0903 | 48.0 | 624 | 0.8805 | 0.695 | 0.4140 | 2.3447 | 0.695 | 0.6237 | 0.2181 | 0.1128 |
| 1.0903 | 49.0 | 637 | 0.8816 | 0.7 | 0.4142 | 2.4358 | 0.7 | 0.6295 | 0.2550 | 0.1112 |
| 1.0903 | 50.0 | 650 | 0.8809 | 0.7 | 0.4126 | 2.4279 | 0.7 | 0.6279 | 0.2569 | 0.1111 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3
|
[
"adve",
"email",
"form",
"letter",
"memo",
"news",
"note",
"report",
"resume",
"scientific"
] |
yfh/food
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# food
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the food101 dataset.
It achieves the following results on the evaluation set:
- eval_loss: 0.6313
- eval_accuracy: 0.856
- eval_runtime: 739.9774
- eval_samples_per_second: 1.351
- eval_steps_per_second: 0.085
- epoch: 0.15
- step: 38
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"apple_pie",
"baby_back_ribs",
"bruschetta",
"waffles",
"caesar_salad",
"cannoli",
"caprese_salad",
"carrot_cake",
"ceviche",
"cheesecake",
"cheese_plate",
"chicken_curry",
"chicken_quesadilla",
"baklava",
"chicken_wings",
"chocolate_cake",
"chocolate_mousse",
"churros",
"clam_chowder",
"club_sandwich",
"crab_cakes",
"creme_brulee",
"croque_madame",
"cup_cakes",
"beef_carpaccio",
"deviled_eggs",
"donuts",
"dumplings",
"edamame",
"eggs_benedict",
"escargots",
"falafel",
"filet_mignon",
"fish_and_chips",
"foie_gras",
"beef_tartare",
"french_fries",
"french_onion_soup",
"french_toast",
"fried_calamari",
"fried_rice",
"frozen_yogurt",
"garlic_bread",
"gnocchi",
"greek_salad",
"grilled_cheese_sandwich",
"beet_salad",
"grilled_salmon",
"guacamole",
"gyoza",
"hamburger",
"hot_and_sour_soup",
"hot_dog",
"huevos_rancheros",
"hummus",
"ice_cream",
"lasagna",
"beignets",
"lobster_bisque",
"lobster_roll_sandwich",
"macaroni_and_cheese",
"macarons",
"miso_soup",
"mussels",
"nachos",
"omelette",
"onion_rings",
"oysters",
"bibimbap",
"pad_thai",
"paella",
"pancakes",
"panna_cotta",
"peking_duck",
"pho",
"pizza",
"pork_chop",
"poutine",
"prime_rib",
"bread_pudding",
"pulled_pork_sandwich",
"ramen",
"ravioli",
"red_velvet_cake",
"risotto",
"samosa",
"sashimi",
"scallops",
"seaweed_salad",
"shrimp_and_grits",
"breakfast_burrito",
"spaghetti_bolognese",
"spaghetti_carbonara",
"spring_rolls",
"steak",
"strawberry_shortcake",
"sushi",
"tacos",
"takoyaki",
"tiramisu",
"tuna_tartare"
] |
zkdeng/swin-tiny-patch4-window7-224-finetuned-eurosat
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-eurosat
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2188
- Accuracy: 0.92
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6077 | 0.96 | 12 | 0.3408 | 0.895 |
| 0.3469 | 2.0 | 25 | 0.2188 | 0.92 |
| 0.2627 | 2.88 | 36 | 0.2183 | 0.915 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"lactrodectus_hesperus",
"parasteatoda_tepidariorum"
] |
zkdeng/swin-tiny-patch4-window7-224-finetuned-black_widow
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-black_widow
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1422
- Accuracy: 0.945
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1728 | 0.96 | 12 | 0.1506 | 0.935 |
| 0.1408 | 2.0 | 25 | 0.1422 | 0.945 |
| 0.1669 | 2.96 | 37 | 0.1289 | 0.945 |
| 0.1618 | 4.0 | 50 | 0.1126 | 0.945 |
| 0.1383 | 4.8 | 60 | 0.1200 | 0.94 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"lactrodectus_hesperus",
"parasteatoda_tepidariorum"
] |
aichoux/swin-tiny-patch4-window7-224-finetuned-eurosat
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-eurosat
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3209
- Accuracy: 0.8902
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 8 | 2.7448 | 0.0314 |
| 2.7716 | 2.0 | 16 | 2.5834 | 0.1765 |
| 2.5974 | 3.0 | 24 | 2.3608 | 0.3020 |
| 2.3426 | 4.0 | 32 | 2.1157 | 0.3333 |
| 1.9747 | 5.0 | 40 | 1.7539 | 0.4627 |
| 1.9747 | 6.0 | 48 | 1.3641 | 0.6078 |
| 1.5182 | 7.0 | 56 | 1.0755 | 0.6471 |
| 1.198 | 8.0 | 64 | 0.8743 | 0.7216 |
| 1.0206 | 9.0 | 72 | 0.7666 | 0.7294 |
| 0.8731 | 10.0 | 80 | 0.7035 | 0.7490 |
| 0.8731 | 11.0 | 88 | 0.6122 | 0.7608 |
| 0.7938 | 12.0 | 96 | 0.6508 | 0.7490 |
| 0.7286 | 13.0 | 104 | 0.5081 | 0.7961 |
| 0.659 | 14.0 | 112 | 0.5536 | 0.7961 |
| 0.6232 | 15.0 | 120 | 0.5079 | 0.8 |
| 0.6232 | 16.0 | 128 | 0.4483 | 0.8314 |
| 0.6028 | 17.0 | 136 | 0.4096 | 0.8157 |
| 0.5333 | 18.0 | 144 | 0.3710 | 0.8510 |
| 0.5053 | 19.0 | 152 | 0.4810 | 0.8039 |
| 0.4717 | 20.0 | 160 | 0.4121 | 0.8235 |
| 0.4717 | 21.0 | 168 | 0.4021 | 0.8392 |
| 0.4728 | 22.0 | 176 | 0.3780 | 0.8588 |
| 0.4347 | 23.0 | 184 | 0.3374 | 0.8745 |
| 0.4545 | 24.0 | 192 | 0.4056 | 0.8431 |
| 0.3954 | 25.0 | 200 | 0.4088 | 0.8745 |
| 0.3954 | 26.0 | 208 | 0.4169 | 0.8392 |
| 0.4145 | 27.0 | 216 | 0.3262 | 0.8706 |
| 0.3895 | 28.0 | 224 | 0.4235 | 0.8706 |
| 0.4185 | 29.0 | 232 | 0.3482 | 0.8706 |
| 0.3686 | 30.0 | 240 | 0.3088 | 0.8824 |
| 0.3686 | 31.0 | 248 | 0.3230 | 0.8902 |
| 0.3617 | 32.0 | 256 | 0.3473 | 0.8824 |
| 0.3136 | 33.0 | 264 | 0.3793 | 0.8627 |
| 0.3482 | 34.0 | 272 | 0.3477 | 0.8588 |
| 0.3519 | 35.0 | 280 | 0.3692 | 0.8667 |
| 0.3519 | 36.0 | 288 | 0.3611 | 0.8627 |
| 0.3311 | 37.0 | 296 | 0.3233 | 0.8745 |
| 0.3222 | 38.0 | 304 | 0.3416 | 0.8627 |
| 0.3013 | 39.0 | 312 | 0.3198 | 0.8824 |
| 0.2871 | 40.0 | 320 | 0.3308 | 0.8667 |
| 0.2871 | 41.0 | 328 | 0.3246 | 0.8667 |
| 0.3154 | 42.0 | 336 | 0.3943 | 0.8667 |
| 0.2735 | 43.0 | 344 | 0.3186 | 0.8784 |
| 0.2911 | 44.0 | 352 | 0.3132 | 0.8824 |
| 0.266 | 45.0 | 360 | 0.3204 | 0.8980 |
| 0.266 | 46.0 | 368 | 0.3097 | 0.8784 |
| 0.2686 | 47.0 | 376 | 0.3075 | 0.8902 |
| 0.2818 | 48.0 | 384 | 0.3192 | 0.8902 |
| 0.2492 | 49.0 | 392 | 0.3434 | 0.8745 |
| 0.276 | 50.0 | 400 | 0.3237 | 0.8824 |
| 0.276 | 51.0 | 408 | 0.3450 | 0.8745 |
| 0.245 | 52.0 | 416 | 0.3284 | 0.8706 |
| 0.2292 | 53.0 | 424 | 0.3263 | 0.8902 |
| 0.2252 | 54.0 | 432 | 0.3216 | 0.8745 |
| 0.2483 | 55.0 | 440 | 0.3359 | 0.8863 |
| 0.2483 | 56.0 | 448 | 0.3314 | 0.8902 |
| 0.2549 | 57.0 | 456 | 0.3932 | 0.8745 |
| 0.2247 | 58.0 | 464 | 0.3189 | 0.8745 |
| 0.2344 | 59.0 | 472 | 0.3251 | 0.8745 |
| 0.2315 | 60.0 | 480 | 0.3289 | 0.8824 |
| 0.2315 | 61.0 | 488 | 0.3058 | 0.8745 |
| 0.2109 | 62.0 | 496 | 0.2999 | 0.8863 |
| 0.2325 | 63.0 | 504 | 0.3078 | 0.8980 |
| 0.2126 | 64.0 | 512 | 0.3531 | 0.8784 |
| 0.1975 | 65.0 | 520 | 0.3394 | 0.8902 |
| 0.1975 | 66.0 | 528 | 0.3113 | 0.8902 |
| 0.1998 | 67.0 | 536 | 0.3365 | 0.8941 |
| 0.2208 | 68.0 | 544 | 0.2854 | 0.9020 |
| 0.2126 | 69.0 | 552 | 0.3170 | 0.8941 |
| 0.2352 | 70.0 | 560 | 0.3155 | 0.8824 |
| 0.2352 | 71.0 | 568 | 0.3327 | 0.8824 |
| 0.1724 | 72.0 | 576 | 0.3503 | 0.8902 |
| 0.2038 | 73.0 | 584 | 0.3309 | 0.8824 |
| 0.1919 | 74.0 | 592 | 0.3299 | 0.8902 |
| 0.2199 | 75.0 | 600 | 0.3347 | 0.8863 |
| 0.2199 | 76.0 | 608 | 0.3471 | 0.8824 |
| 0.2075 | 77.0 | 616 | 0.3437 | 0.8863 |
| 0.2206 | 78.0 | 624 | 0.3161 | 0.8824 |
| 0.1655 | 79.0 | 632 | 0.3227 | 0.8784 |
| 0.1765 | 80.0 | 640 | 0.3302 | 0.8784 |
| 0.1765 | 81.0 | 648 | 0.3153 | 0.8745 |
| 0.1832 | 82.0 | 656 | 0.3010 | 0.8745 |
| 0.185 | 83.0 | 664 | 0.3266 | 0.8941 |
| 0.1627 | 84.0 | 672 | 0.3192 | 0.8941 |
| 0.176 | 85.0 | 680 | 0.3125 | 0.8863 |
| 0.176 | 86.0 | 688 | 0.3241 | 0.8745 |
| 0.1723 | 87.0 | 696 | 0.3124 | 0.8784 |
| 0.1477 | 88.0 | 704 | 0.3109 | 0.8745 |
| 0.1703 | 89.0 | 712 | 0.3196 | 0.8824 |
| 0.1919 | 90.0 | 720 | 0.3186 | 0.8980 |
| 0.1919 | 91.0 | 728 | 0.3178 | 0.8902 |
| 0.1465 | 92.0 | 736 | 0.3241 | 0.8824 |
| 0.155 | 93.0 | 744 | 0.3281 | 0.8784 |
| 0.1829 | 94.0 | 752 | 0.3263 | 0.8824 |
| 0.167 | 95.0 | 760 | 0.3282 | 0.8824 |
| 0.167 | 96.0 | 768 | 0.3290 | 0.8824 |
| 0.166 | 97.0 | 776 | 0.3253 | 0.8902 |
| 0.1756 | 98.0 | 784 | 0.3231 | 0.8863 |
| 0.157 | 99.0 | 792 | 0.3215 | 0.8902 |
| 0.1492 | 100.0 | 800 | 0.3209 | 0.8902 |
### Framework versions
- Transformers 4.33.3
- Pytorch 1.11.0+cu113
- Datasets 2.14.5
- Tokenizers 0.13.3
|
[
"beijing",
"biaoend",
"biaomiddle",
"biaoout",
"biaoxie",
"biaozheng",
"dazhe",
"kaishi",
"loufeng",
"panend",
"panout",
"panxie",
"panzheng",
"weibu"
] |
GayatriC/vit-base-patch16-224-finetuned-flower
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-finetuned-flower
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
[
"daisy",
"dandelion",
"roses",
"sunflowers",
"tulips"
] |
DamarJati/Face-Mask-Detection
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Face-Mask-Detection
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0051
- Accuracy: 0.9992
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0344 | 1.0 | 83 | 0.0051 | 0.9992 |
| 0.0112 | 2.0 | 166 | 0.0052 | 0.9983 |
| 0.0146 | 3.0 | 249 | 0.0045 | 0.9992 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"withmask",
"withoutmask"
] |
navradio/swin-tiny-patch4-window7-224-PE
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-PE
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4489
- Accuracy: 0.7980
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0025
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6872 | 1.0 | 11 | 0.6535 | 0.6061 |
| 0.7287 | 2.0 | 22 | 0.6601 | 0.6397 |
| 0.7212 | 3.0 | 33 | 0.6740 | 0.5657 |
| 0.6947 | 4.0 | 44 | 0.6531 | 0.6532 |
| 0.6783 | 5.0 | 55 | 0.6739 | 0.5724 |
| 0.6816 | 6.0 | 66 | 0.6274 | 0.6599 |
| 0.6428 | 7.0 | 77 | 0.6671 | 0.6330 |
| 0.6928 | 8.0 | 88 | 0.6380 | 0.6498 |
| 0.6767 | 9.0 | 99 | 0.6875 | 0.6061 |
| 0.6918 | 10.0 | 110 | 0.6859 | 0.5690 |
| 0.6845 | 11.0 | 121 | 0.6810 | 0.5657 |
| 0.6826 | 12.0 | 132 | 0.6919 | 0.5185 |
| 0.6877 | 13.0 | 143 | 0.6693 | 0.6061 |
| 0.6709 | 14.0 | 154 | 0.6660 | 0.5690 |
| 0.6707 | 15.0 | 165 | 0.6764 | 0.5690 |
| 0.6703 | 16.0 | 176 | 0.6467 | 0.6296 |
| 0.6629 | 17.0 | 187 | 0.6471 | 0.6431 |
| 0.6557 | 18.0 | 198 | 0.6597 | 0.6229 |
| 0.659 | 19.0 | 209 | 0.6451 | 0.6027 |
| 0.65 | 20.0 | 220 | 0.6638 | 0.6094 |
| 0.6453 | 21.0 | 231 | 0.6544 | 0.6162 |
| 0.6426 | 22.0 | 242 | 0.6565 | 0.5825 |
| 0.6339 | 23.0 | 253 | 0.6743 | 0.6296 |
| 0.6236 | 24.0 | 264 | 0.6669 | 0.5960 |
| 0.6427 | 25.0 | 275 | 0.6379 | 0.6532 |
| 0.6439 | 26.0 | 286 | 0.6361 | 0.6263 |
| 0.6212 | 27.0 | 297 | 0.6540 | 0.6465 |
| 0.6186 | 28.0 | 308 | 0.5925 | 0.6700 |
| 0.6162 | 29.0 | 319 | 0.6224 | 0.6734 |
| 0.6237 | 30.0 | 330 | 0.6018 | 0.6667 |
| 0.6061 | 31.0 | 341 | 0.5735 | 0.6801 |
| 0.6138 | 32.0 | 352 | 0.6425 | 0.6566 |
| 0.595 | 33.0 | 363 | 0.5827 | 0.6768 |
| 0.5869 | 34.0 | 374 | 0.5956 | 0.7172 |
| 0.577 | 35.0 | 385 | 0.5458 | 0.7003 |
| 0.5766 | 36.0 | 396 | 0.5603 | 0.6869 |
| 0.5726 | 37.0 | 407 | 0.5339 | 0.7340 |
| 0.5702 | 38.0 | 418 | 0.5577 | 0.7138 |
| 0.5762 | 39.0 | 429 | 0.5262 | 0.7374 |
| 0.5543 | 40.0 | 440 | 0.5091 | 0.7441 |
| 0.5339 | 41.0 | 451 | 0.5185 | 0.7542 |
| 0.5428 | 42.0 | 462 | 0.5023 | 0.7542 |
| 0.5349 | 43.0 | 473 | 0.5439 | 0.7306 |
| 0.5319 | 44.0 | 484 | 0.4745 | 0.7811 |
| 0.5294 | 45.0 | 495 | 0.5432 | 0.7172 |
| 0.5314 | 46.0 | 506 | 0.4511 | 0.7912 |
| 0.5073 | 47.0 | 517 | 0.4379 | 0.8047 |
| 0.5028 | 48.0 | 528 | 0.4487 | 0.7980 |
| 0.4985 | 49.0 | 539 | 0.4550 | 0.7946 |
| 0.4826 | 50.0 | 550 | 0.4489 | 0.7980 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3
|
[
"cm",
"non_cm"
] |
fahmindra/activity_classification
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# activity_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7087
- Accuracy: 0.8012
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.7167 | 1.0 | 157 | 1.6188 | 0.6964 |
| 1.0511 | 2.0 | 315 | 1.0981 | 0.7381 |
| 0.9184 | 3.0 | 472 | 0.9225 | 0.7710 |
| 0.7396 | 4.0 | 630 | 0.8333 | 0.7802 |
| 0.6873 | 5.0 | 787 | 0.7917 | 0.7849 |
| 0.6579 | 6.0 | 945 | 0.7510 | 0.7845 |
| 0.5857 | 7.0 | 1102 | 0.7672 | 0.7845 |
| 0.4968 | 8.0 | 1260 | 0.7467 | 0.7857 |
| 0.513 | 9.0 | 1417 | 0.7156 | 0.7940 |
| 0.4957 | 9.97 | 1570 | 0.7073 | 0.8024 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"calling",
"clapping",
"running",
"sitting",
"sleeping",
"texting",
"using_laptop",
"cycling",
"dancing",
"drinking",
"eating",
"fighting",
"hugging",
"laughing",
"listening_to_music"
] |
awrysfab/human_action_classification
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# human_action_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.3689
- Accuracy: 0.0728
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.3354 | 1.0 | 197 | 2.9994 | 0.0717 |
| 0.9519 | 2.0 | 394 | 3.3635 | 0.0778 |
| 0.8178 | 3.0 | 591 | 3.5103 | 0.0763 |
| 0.7122 | 4.0 | 788 | 3.7261 | 0.0683 |
| 0.7532 | 5.0 | 985 | 3.7279 | 0.0661 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"calling",
"clapping",
"running",
"sitting",
"sleeping",
"texting",
"using_laptop",
"cycling",
"dancing",
"drinking",
"eating",
"fighting",
"hugging",
"laughing",
"listening_to_music"
] |
michaelsinanta/smoke_detector
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# smoke_detector
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the smokedataset dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0187
- Accuracy: 0.9951
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1404 | 1.0 | 716 | 0.0396 | 0.9902 |
| 0.0493 | 2.0 | 1432 | 0.0337 | 0.9920 |
| 0.0237 | 3.0 | 2148 | 0.0263 | 0.9934 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"cloud",
"other",
"smoke"
] |
mmuratarat/kvasir-v2-classifier
|
This repo contains the artifacts of a model built by fine-tuning a pre-trained Vision Transformer on a custom dataset.
There are significant benefits to using a pretrained model: it reduces computation costs and your carbon footprint, and it lets you use state-of-the-art models without having to train one from scratch.
🤗 Hugging Face Transformers provides access to thousands of pretrained models for a wide range of tasks. When you use a pretrained model, you train it on a dataset specific to your task. This is known as fine-tuning, an incredibly powerful training technique.
## Dataset
Here, we use the Kvasir v2 dataset. It consists of images, annotated and verified by medical doctors (experienced endoscopists), covering several classes of anatomical landmarks, pathological findings, and endoscopic procedures in the gastrointestinal tract.
It is a multi-class dataset with 1,000 images per class, for a total of 8,000 images across eight classes. These classes consist of pathological findings (esophagitis, polyps, ulcerative colitis), anatomical landmarks (z-line, pylorus, cecum), normal and regular findings (normal colon mucosa, stool), and polyp removal cases (dyed and lifted polyps, dyed resection margins).
The dataset can be downloaded from [here](https://datasets.simula.no/kvasir/); it weighs around 2.3 GB and is free for research and educational purposes only.

## Model
The [Hugging Face transformers](https://huggingface.co/docs/transformers/index) package is a very popular Python library that provides access to the Hugging Face Hub, where we can find many pretrained models and pipelines for a variety of tasks in domains such as Natural Language Processing (NLP), Computer Vision (CV) and Automatic Speech Recognition (ASR).
A Vision Transformer-based model is used in this experiment. The Vision Transformer (ViT) was introduced in October 2020 by a team of researchers at Google Brain (https://arxiv.org/abs/2010.11929).
The Vision Transformer (ViT) model was proposed in "An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale" by Alexey Dosovitskiy, Lucas Beyer, Alexander Kolesnikov, Dirk Weissenborn, Xiaohua Zhai, Thomas Unterthiner, Mostafa Dehghani, Matthias Minderer, Georg Heigold, Sylvain Gelly, Jakob Uszkoreit, Neil Houlsby. It’s the first paper that successfully trains a Transformer encoder on ImageNet, attaining very good results compared to familiar convolutional architectures.
The Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a supervised fashion, namely ImageNet-21k, at a resolution of 224x224 pixels. Next, the model was fine-tuned on ImageNet (also referred to as ILSVRC2012), a dataset comprising 1 million images and 1,000 classes, also at resolution 224x224.

In this case, we'll be using [the google/vit-base-patch16-224-in21k model](https://huggingface.co/google/vit-base-patch16-224-in21k) from Hugging Face.
## Training hyperparameters
The following hyperparameters were used during training:
* learning_rate: 2e-5
* lr_scheduler_type: linear
* warmup_steps = 500
* weight_decay = 0.01
* warmup_ratio=0.0
* train_batch_size: 16
* eval_batch_size: 32
* seed: 42
* num_epochs: 5
* optimizer: Adam
* adam_beta1: 0.9
* adam_beta2: 0.999
* adam_epsilon: 1e-08
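For reference, these settings correspond roughly to the following 🤗 `TrainingArguments`; this is a hedged sketch, and the exact call in the original notebook may differ (`output_dir` is a placeholder):
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="kvasir-v2-classifier",  # placeholder
    learning_rate=2e-5,
    lr_scheduler_type="linear",
    warmup_steps=500,
    weight_decay=0.01,
    warmup_ratio=0.0,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=5,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
)
```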
## Evaluation Metrics
We use the usual evaluation metrics for this image classification task: weighted average precision, recall and F1 score, plus overall accuracy. In multi-class classification problems, the weighted average method adjusts for class imbalance by assigning each class a weight proportional to its number of instances.
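As an illustration, the weighted variants can be computed with scikit-learn; the toy labels below stand in for the real validation set:
```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Toy labels for illustration only; in practice these come from the validation set.
y_true = [0, 0, 1, 2, 2, 2]
y_pred = [0, 1, 1, 2, 2, 1]

precision, recall, f1, _ = precision_recall_fscore_support(y_true, y_pred, average="weighted")
print(accuracy_score(y_true, y_pred), precision, recall, f1)
```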
## Training results
| **Epoch** | **Training Loss** | **Validation Loss** | **Accuracy** | **F1** | **Precision** | **Recall** |
|:---------: |:-----------------: |:-------------------: |:------------: |:-------: |:-------------: |:----------: |
| 1 | 1.4341 | 0.62736 | 0.89417 | 0.89285 | 0.90208 | 0.89417 |
| 2 | 0.4203 | 0.3135 | 0.92917 | 0.929 | 0.93058 | 0.92917 |
| 3 | 0.2272 | 0.251 | 0.9375 | 0.93745 | 0.938 | 0.9375 |
| 4 | 0.146 | 0.24937 | 0.93833 | 0.93814 | 0.94072 | 0.93833 |
| 5 | 0.1034 | 0.2383 | 0.93917 | 0.9391 | 0.93992 | 0.93917 |

## Training Metrics
* epoch = 5.0
* total_flos = 2.634869332279296e+18
* train_loss = 0.46618968290441176
* train_runtime = 0:01:51.45
* train_samples_per_second = 5.07
* train_steps_per_second = 0.317
* global_step = 2125
## Fine-tuned model
The pre-trained model has been pushed to Hugging Face Hub and can be found on https://huggingface.co/mmuratarat/kvasir-v2-classifier.
You can make inferences by either using "Hosted Inference API" on the Hub or locally pulling the model from the Hub.
## How to use
Here is how to use this pre-trained model to classify an image of GI tract:
```python
from transformers import AutoModelForImageClassification, AutoFeatureExtractor
from PIL import Image
import requests

# Download an example image from the "polyps" class
url = 'https://github.com/mmuratarat/turkish/blob/master/_posts/images/example_polyps_image.jpg?raw=true'
image = Image.open(requests.get(url, stream=True).raw)

# Load the fine-tuned model and its feature extractor from the Hub
model = AutoModelForImageClassification.from_pretrained("mmuratarat/kvasir-v2-classifier")
feature_extractor = AutoFeatureExtractor.from_pretrained("mmuratarat/kvasir-v2-classifier")

# Preprocess the image into pixel values
inputs = feature_extractor(image, return_tensors="pt")

# Mapping from class index to class name
id2label = {'0': 'dyed-lifted-polyps',
            '1': 'dyed-resection-margins',
            '2': 'esophagitis',
            '3': 'normal-cecum',
            '4': 'normal-pylorus',
            '5': 'normal-z-line',
            '6': 'polyps',
            '7': 'ulcerative-colitis'}

# Forward pass; take the highest-scoring class
logits = model(**inputs).logits
predicted_label = logits.argmax(-1).item()
predicted_class = id2label[str(predicted_label)]
print(predicted_class)
```
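As a shorter alternative, a sketch using the generic `pipeline` API rather than the manual preprocessing above; note that if the label mapping was not saved in the model config, the pipeline will report generic `LABEL_i` names and the `id2label` dictionary above is still needed:
```python
from transformers import pipeline

# Same example image as above; the pipeline accepts a URL, a local path, or a PIL image.
url = 'https://github.com/mmuratarat/turkish/blob/master/_posts/images/example_polyps_image.jpg?raw=true'
classifier = pipeline("image-classification", model="mmuratarat/kvasir-v2-classifier")
print(classifier(url))
```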
## Framework versions
* Transformers 4.34.0
* Pytorch 2.0.1+cu118
* Datasets 2.14.5
* Tokenizers 0.14.0
* scikit-learn 1.2.2
* scipy 1.11.3
* numpy 1.23.5
* accelerate 0.23.0
* pandas 1.5.3
## Contact
Please reach out to [email protected] if you have any questions or feedback.
## Source Code
You can find the source code for this pre-trained model in the [mmuratarat/kvasir-v2-ViT-classifier](https://github.com/mmuratarat/kvasir-v2-ViT-classifier) repository on GitHub.
Note that the `Kvasir_ViT.ipynb` file contains Turkish commentary, but the code itself is self-explanatory.
## Citation
In all documents and papers that use or refer to this pre-trained model or report benchmarking results, a reference to this model has to be included.
|
[
"dyed-lifted-polyps",
"dyed-resection-margins",
"esophagitis",
"normal-cecum",
"normal-pylorus",
"normal-z-line",
"polyps",
"ulcerative-colitis"
] |
CHMD08/vit-base-patch16-224-finetuned-flower
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-finetuned-flower
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
[
"daisy",
"dandelion",
"roses",
"sunflowers",
"tulips"
] |
dima806/top_15_anime_characters_image_detection
|
Returns the anime character's name for a given image with about 98% accuracy.
See https://www.kaggle.com/code/dima806/anime-character-image-detection-vit for more details.
```
Classification report:
precision recall f1-score support
Killua 1.0000 1.0000 1.0000 57
Sakata Gintoki 1.0000 0.9655 0.9825 58
Eren Yeager 0.9649 0.9649 0.9649 57
Ichigo 0.9825 0.9825 0.9825 57
Lelouch Lamperouge 1.0000 1.0000 1.0000 58
Naruto 1.0000 1.0000 1.0000 58
Goku 0.9655 0.9825 0.9739 57
Vegeta 0.9649 0.9649 0.9649 57
Zoro 0.9355 1.0000 0.9667 58
Natsu Dragneel 1.0000 1.0000 1.0000 58
Gon 1.0000 0.9310 0.9643 58
Sasuke 0.9333 0.9655 0.9492 58
Elric Edward 1.0000 0.9825 0.9912 57
Light Yagami 0.9828 0.9828 0.9828 58
Luffy 1.0000 1.0000 1.0000 58
accuracy 0.9815 864
macro avg 0.9820 0.9815 0.9815 864
weighted avg 0.9820 0.9815 0.9815 864
```
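For quick local inference, here is a minimal sketch using the `transformers` image-classification pipeline (the image path `some_anime_character.jpg` is a placeholder):
```python
from transformers import pipeline
from PIL import Image

# load the classifier from the Hub
pipe = pipeline("image-classification", model="dima806/top_15_anime_characters_image_detection")

# classify a local image; top_k limits the number of returned labels
image = Image.open("some_anime_character.jpg")  # placeholder path
for pred in pipe(image, top_k=3):
    print(f"{pred['label']}: {pred['score']:.4f}")
```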
|
[
"killua",
"sakata gintoki",
"eren yeager",
"ichigo",
"lelouch lamperouge",
"naruto",
"goku",
"vegeta",
"zoro",
"natsu dragneel",
"gon",
"sasuke",
"elric edward",
"light yagami",
"luffy"
] |
farhanyh/food_model
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# food_model
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the food101 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6470
- Accuracy: 0.909
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
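Note that `total_train_batch_size` is derived rather than set directly: it is the per-device batch size times the gradient accumulation steps (16 × 4 = 64). As a minimal sketch, the values above map onto `TrainingArguments` roughly as follows (`output_dir` is a placeholder):
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="food_model",        # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,  # effective train batch size: 16 * 4 = 64
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=3,
    seed=42,
)
```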
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.718 | 0.99 | 62 | 2.5596 | 0.842 |
| 1.8555 | 2.0 | 125 | 1.8344 | 0.873 |
| 1.6437 | 2.98 | 186 | 1.6470 | 0.909 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"apple_pie",
"baby_back_ribs",
"bruschetta",
"waffles",
"caesar_salad",
"cannoli",
"caprese_salad",
"carrot_cake",
"ceviche",
"cheesecake",
"cheese_plate",
"chicken_curry",
"chicken_quesadilla",
"baklava",
"chicken_wings",
"chocolate_cake",
"chocolate_mousse",
"churros",
"clam_chowder",
"club_sandwich",
"crab_cakes",
"creme_brulee",
"croque_madame",
"cup_cakes",
"beef_carpaccio",
"deviled_eggs",
"donuts",
"dumplings",
"edamame",
"eggs_benedict",
"escargots",
"falafel",
"filet_mignon",
"fish_and_chips",
"foie_gras",
"beef_tartare",
"french_fries",
"french_onion_soup",
"french_toast",
"fried_calamari",
"fried_rice",
"frozen_yogurt",
"garlic_bread",
"gnocchi",
"greek_salad",
"grilled_cheese_sandwich",
"beet_salad",
"grilled_salmon",
"guacamole",
"gyoza",
"hamburger",
"hot_and_sour_soup",
"hot_dog",
"huevos_rancheros",
"hummus",
"ice_cream",
"lasagna",
"beignets",
"lobster_bisque",
"lobster_roll_sandwich",
"macaroni_and_cheese",
"macarons",
"miso_soup",
"mussels",
"nachos",
"omelette",
"onion_rings",
"oysters",
"bibimbap",
"pad_thai",
"paella",
"pancakes",
"panna_cotta",
"peking_duck",
"pho",
"pizza",
"pork_chop",
"poutine",
"prime_rib",
"bread_pudding",
"pulled_pork_sandwich",
"ramen",
"ravioli",
"red_velvet_cake",
"risotto",
"samosa",
"sashimi",
"scallops",
"seaweed_salad",
"shrimp_and_grits",
"breakfast_burrito",
"spaghetti_bolognese",
"spaghetti_carbonara",
"spring_rolls",
"steak",
"strawberry_shortcake",
"sushi",
"tacos",
"takoyaki",
"tiramisu",
"tuna_tartare"
] |
crangana/trained-race
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# trained-race
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the fair_face dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9830
- Accuracy: 0.6258
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.3923 | 0.18 | 1000 | 1.3550 | 0.4712 |
| 1.1517 | 0.37 | 2000 | 1.1854 | 0.5429 |
| 1.2405 | 0.55 | 3000 | 1.1001 | 0.5754 |
| 1.0752 | 0.74 | 4000 | 1.0330 | 0.6018 |
| 1.0986 | 0.92 | 5000 | 0.9973 | 0.6173 |
| 1.0007 | 1.11 | 6000 | 0.9735 | 0.6279 |
| 0.9851 | 1.29 | 7000 | 0.9830 | 0.6258 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"east asian",
"indian",
"black",
"white",
"middle eastern",
"latino_hispanic",
"southeast asian"
] |
ahyar002/vit-pneumonia-classification
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-pneumonia-classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the chest-xray-classification dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1301
- Accuracy: 0.9561
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.4786 | 1.0 | 32 | 0.3081 | 0.8609 |
| 0.213 | 2.0 | 64 | 0.1645 | 0.9399 |
| 0.1724 | 3.0 | 96 | 0.1419 | 0.9502 |
| 0.1438 | 4.0 | 128 | 0.0950 | 0.9734 |
| 0.1267 | 5.0 | 160 | 0.1225 | 0.9579 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"normal",
"pneumonia"
] |
crangana/trained-age
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# trained-age
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the fair_face dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1340
- Accuracy: 0.5164
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.3347 | 0.18 | 1000 | 1.3819 | 0.4296 |
| 1.3071 | 0.37 | 2000 | 1.2799 | 0.4642 |
| 1.297 | 0.55 | 3000 | 1.2503 | 0.4721 |
| 1.3121 | 0.74 | 4000 | 1.1661 | 0.4995 |
| 1.1806 | 0.92 | 5000 | 1.1137 | 0.5240 |
| 1.0839 | 1.11 | 6000 | 1.1340 | 0.5164 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"0-2",
"3-9",
"10-19",
"20-29",
"30-39",
"40-49",
"50-59",
"60-69",
"more than 70"
] |
dima806/ball_types_image_detection
|
Returns the ball type for a given image.
See https://www.kaggle.com/code/dima806/ball-types-image-detection for more details.
```
Classification report:
precision recall f1-score support
wiffle ball 1.0000 1.0000 1.0000 39
puffballs 1.0000 1.0000 1.0000 38
chrochet ball 1.0000 1.0000 1.0000 39
golf ball 1.0000 1.0000 1.0000 39
pokeman balls 1.0000 1.0000 1.0000 39
water polo ball 1.0000 1.0000 1.0000 39
football 1.0000 1.0000 1.0000 39
marble 1.0000 1.0000 1.0000 38
medicine ball 1.0000 1.0000 1.0000 39
tether ball 1.0000 1.0000 1.0000 38
billiard ball 1.0000 1.0000 1.0000 38
cannon ball 1.0000 1.0000 1.0000 39
crystal ball 1.0000 1.0000 1.0000 38
cricket ball 1.0000 1.0000 1.0000 39
sepak takraw ball 1.0000 1.0000 1.0000 39
tennis ball 1.0000 1.0000 1.0000 39
wrecking ball 1.0000 1.0000 1.0000 38
rubberband ball 1.0000 1.0000 1.0000 39
buckeyballs 1.0000 1.0000 1.0000 39
bowling ball 1.0000 1.0000 1.0000 38
eyeballs 1.0000 1.0000 1.0000 38
meat ball 1.0000 1.0000 1.0000 38
brass 1.0000 1.0000 1.0000 39
screwballs 1.0000 1.0000 1.0000 38
baseball 1.0000 1.0000 1.0000 38
beachballs 1.0000 1.0000 1.0000 39
soccer ball 1.0000 1.0000 1.0000 38
basketball 1.0000 1.0000 1.0000 39
volley ball 1.0000 1.0000 1.0000 39
paint balls 1.0000 1.0000 1.0000 39
accuracy 1.0000 1158
macro avg 1.0000 1.0000 1.0000 1158
weighted avg 1.0000 1.0000 1.0000 1158
```
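A minimal inference sketch using the model directly (assuming the checkpoint ships an image processor; the image path is a placeholder):
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "dima806/ball_types_image_detection"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("some_ball.jpg")  # placeholder path
inputs = processor(image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits
probs = logits.softmax(-1)[0]

# report the highest-probability ball type
top = probs.argmax().item()
print(model.config.id2label[top], f"{probs[top].item():.4f}")
```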
|
[
"wiffle ball",
"puffballs",
"chrochet ball",
"golf ball",
"pokeman balls",
"water polo ball",
"football",
"marble",
"medicine ball",
"tether ball",
"billiard ball",
"cannon ball",
"crystal ball",
"cricket ball",
"sepak takraw ball",
"tennis ball",
"wrecking ball",
"rubberband ball",
"buckeyballs",
"bowling ball",
"eyeballs",
"meat ball",
"brass",
"screwballs",
"baseball",
"beachballs",
"soccer ball",
"basketball",
"volley ball",
"paint balls"
] |
crangana/trained-gender
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# trained-gender
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the fair_face dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2437
- Accuracy: 0.8986
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.4277 | 0.18 | 1000 | 0.4054 | 0.8089 |
| 0.315 | 0.37 | 2000 | 0.3487 | 0.8318 |
| 0.3082 | 0.55 | 3000 | 0.3052 | 0.8633 |
| 0.3235 | 0.74 | 4000 | 0.2899 | 0.8684 |
| 0.2505 | 0.92 | 5000 | 0.2693 | 0.8785 |
| 0.2484 | 1.11 | 6000 | 0.2547 | 0.8889 |
| 0.1933 | 1.29 | 7000 | 0.2521 | 0.8901 |
| 0.1497 | 1.48 | 8000 | 0.2443 | 0.8929 |
| 0.326 | 1.66 | 9000 | 0.2406 | 0.8958 |
| 0.215 | 1.84 | 10000 | 0.2381 | 0.9007 |
| 0.2035 | 2.03 | 11000 | 0.2437 | 0.8986 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"male",
"female"
] |
Manixtox/swin-tiny-patch4-window7-224-finetuned-shotclass
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-shotclass
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2648
- Accuracy: 0.9275
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 0.97 | 8 | 0.3702 | 0.8855 |
| 0.3802 | 1.94 | 16 | 0.2648 | 0.9275 |
| 0.2783 | 2.91 | 24 | 0.3685 | 0.8740 |
| 0.2403 | 4.0 | 33 | 0.3235 | 0.8893 |
| 0.1864 | 4.85 | 40 | 0.3339 | 0.8740 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
|
[
"plano_entero",
"plano_medio",
"primer_plano",
"xlongshots"
] |
platzi/platzi-vit-model-gabriel-salazar
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# platzi-vit-model-gabriel-salazar
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1267
- Accuracy: 0.9774
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0535 | 3.85 | 500 | 0.1267 | 0.9774 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"angular_leaf_spot",
"bean_rust",
"healthy"
] |
dima806/shoe_types_image_detection
|
Returns the shoe type for a given image.

See https://www.kaggle.com/code/dima806/shoe-type-image-detection-vit for more details.
```
Classification report:
precision recall f1-score support
Clog 0.9748 0.9598 0.9672 1169
Brogue 0.9804 0.9812 0.9808 1170
Sneaker 0.9718 0.9735 0.9727 1170
Boat 0.9642 0.9658 0.9650 1170
Ballet Flat 0.9729 0.9837 0.9783 1169
accuracy 0.9728 5848
macro avg 0.9728 0.9728 0.9728 5848
weighted avg 0.9728 0.9728 0.9728 5848
```
|
[
"clog",
"brogue",
"sneaker",
"boat",
"ballet flat"
] |
flatmoon102/fruits_and_vegetables_image_classification
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fruits_and_vegetables_image_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3835
- Accuracy: 0.9159
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 8e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 87 | 1.6751 | 0.8768 |
| No log | 2.0 | 174 | 1.0260 | 0.8957 |
| No log | 3.0 | 261 | 0.6767 | 0.8957 |
| No log | 4.0 | 348 | 0.5445 | 0.8986 |
| No log | 5.0 | 435 | 0.4685 | 0.9072 |
| 0.8955 | 6.0 | 522 | 0.4328 | 0.9072 |
| 0.8955 | 7.0 | 609 | 0.4028 | 0.9 |
| 0.8955 | 8.0 | 696 | 0.3958 | 0.9145 |
| 0.8955 | 9.0 | 783 | 0.3835 | 0.9159 |
| 0.8955 | 10.0 | 870 | 0.3842 | 0.9145 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"pineapple",
"lettuce",
"potato",
"bell pepper",
"pomegranate",
"chilli pepper",
"eggplant",
"mango",
"cabbage",
"lemon",
"capsicum",
"spinach",
"corn",
"watermelon",
"apple",
"garlic",
"sweetcorn",
"grapes",
"ginger",
"sweetpotato",
"raddish",
"tomato",
"paprika",
"carrot",
"cucumber",
"cauliflower",
"kiwi",
"orange",
"banana",
"soy beans",
"beetroot",
"jalepeno",
"onion",
"peas",
"pear",
"turnip"
] |
ammardaffa/fruit_veg_detection
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fruit_veg_detection
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6689
- Accuracy: 0.9116
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 87 | 0.8126 | 0.8913 |
| No log | 2.0 | 174 | 0.6689 | 0.9116 |
| No log | 3.0 | 261 | 0.5979 | 0.9087 |
| No log | 4.0 | 348 | 0.5629 | 0.9116 |
| No log | 5.0 | 435 | 0.5583 | 0.9014 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"spinach",
"lettuce",
"raddish",
"capsicum",
"cucumber",
"kiwi",
"tomato",
"jalepeno",
"chilli pepper",
"mango",
"garlic",
"paprika",
"sweetpotato",
"potato",
"pear",
"ginger",
"peas",
"pineapple",
"corn",
"sweetcorn",
"bell pepper",
"eggplant",
"lemon",
"banana",
"soy beans",
"watermelon",
"apple",
"carrot",
"cabbage",
"orange",
"onion",
"beetroot",
"pomegranate",
"cauliflower",
"turnip",
"grapes"
] |
dima806/face_obstruction_image_detection
|
Returns the face obstruction type for a given facial image, with about 91% accuracy.
See https://www.kaggle.com/code/dima806/face-obstruction-image-detection-vit for more details.
```
Classification report:
precision recall f1-score support
sunglasses 0.9974 0.9985 0.9980 3422
glasses 0.9896 0.9968 0.9932 3422
other 0.7198 0.7613 0.7400 3422
mask 0.9971 0.9985 0.9978 3422
hand 0.7505 0.7086 0.7290 3422
none 0.9976 0.9860 0.9918 3422
accuracy 0.9083 20532
macro avg 0.9087 0.9083 0.9083 20532
weighted avg 0.9087 0.9083 0.9083 20532
```
|
[
"sunglasses",
"glasses",
"other",
"mask",
"hand",
"none"
] |
tejp/fine-tuned
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fine-tuned
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the custom_dataset dataset.
It achieves the following results on the evaluation set:
- Loss: 2.0068
- Accuracy: 0.2857
- F1: 0.2030
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"boat",
"children",
"water",
"dogs",
"fireman",
"firetruck",
"mountains",
"people",
"river",
"snow",
"stairs"
] |
Ahmeng/image_classification
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# image_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1943
- Accuracy: 0.9469
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 52 | 0.2654 | 0.9324 |
| No log | 2.0 | 104 | 0.1310 | 0.9807 |
| No log | 3.0 | 156 | 0.1485 | 0.9662 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.13.1+cu117
- Datasets 2.13.2
- Tokenizers 0.13.3
|
[
"angular_leaf_spot",
"bean_rust",
"healthy"
] |
lucascruz/CheXpert-ViT-U-MultiClass
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# CheXpert-ViT-U-MultiClass
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 64
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 5
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.14.1
|
[
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11",
"label_12",
"label_13",
"label_14"
] |
zguo0525/myshell_nsfw_filter
|
## Example Usage
Here's a step-by-step Python code example to classify an image:
```python
from transformers import pipeline
import requests
from PIL import Image
from io import BytesIO
# 1. Load the pipeline for image classification
pipe = pipeline("image-classification", model="zguo0525/myshell_nsfw_filter")
# 2. Load the image into memory (assuming you have the URL for the image)
image_url = 'https://img-myshell.net/meinamix/red_hair_girl'
response = requests.get(image_url)
image = Image.open(BytesIO(response.content))
# 3. Use the pipeline to classify the image
results = pipe(image)
# 4. Print the results
label = results[0]['label']
score = results[0]['score']
print(f"{label}: {score:.4f}")
```
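The pipeline returns a list of `{'label', 'score'}` dictionaries sorted by score in descending order, so `results[0]` above is the top prediction.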
|
[
"ero",
"porn",
"safe"
] |
qzheng75/swin-tiny-patch4-window7-224-finetuned-plot-images
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-plot-images
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0093
- Accuracy: 0.9960
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0333 | 1.0 | 426 | 0.0261 | 0.9937 |
| 0.0223 | 2.0 | 852 | 0.0072 | 0.9975 |
| 0.0103 | 3.0 | 1278 | 0.0093 | 0.9960 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
[
"dot",
"horizontal_bar",
"line",
"scatter",
"vertical_bar"
] |
Antdochi/results
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# results
This model is a fine-tuned version of [microsoft/resnet-101](https://huggingface.co/microsoft/resnet-101) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1744
- Accuracy: 0.9428
- F1: 0.9428
- Precision: 0.9428
- Recall: 0.9428
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.01
- train_batch_size: 128
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 16
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 0.2883 | 1.0 | 99 | 0.3757 | 0.8563 | 0.8563 | 0.8563 | 0.8563 |
| 0.2195 | 2.0 | 198 | 0.2293 | 0.9178 | 0.9178 | 0.9178 | 0.9178 |
| 0.1936 | 3.0 | 297 | 0.2120 | 0.9149 | 0.9149 | 0.9149 | 0.9149 |
| 0.163 | 4.0 | 396 | 0.1744 | 0.9428 | 0.9428 | 0.9428 | 0.9428 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
|
[
"tench, tinca tinca",
"goldfish, carassius auratus",
"great white shark, white shark, man-eater, man-eating shark, carcharodon carcharias",
"tiger shark, galeocerdo cuvieri",
"hammerhead, hammerhead shark",
"electric ray, crampfish, numbfish, torpedo",
"stingray",
"cock",
"hen",
"ostrich, struthio camelus",
"brambling, fringilla montifringilla",
"goldfinch, carduelis carduelis",
"house finch, linnet, carpodacus mexicanus",
"junco, snowbird",
"indigo bunting, indigo finch, indigo bird, passerina cyanea",
"robin, american robin, turdus migratorius",
"bulbul",
"jay",
"magpie",
"chickadee",
"water ouzel, dipper",
"kite",
"bald eagle, american eagle, haliaeetus leucocephalus",
"vulture",
"great grey owl, great gray owl, strix nebulosa",
"european fire salamander, salamandra salamandra",
"common newt, triturus vulgaris",
"eft",
"spotted salamander, ambystoma maculatum",
"axolotl, mud puppy, ambystoma mexicanum",
"bullfrog, rana catesbeiana",
"tree frog, tree-frog",
"tailed frog, bell toad, ribbed toad, tailed toad, ascaphus trui",
"loggerhead, loggerhead turtle, caretta caretta",
"leatherback turtle, leatherback, leathery turtle, dermochelys coriacea",
"mud turtle",
"terrapin",
"box turtle, box tortoise",
"banded gecko",
"common iguana, iguana, iguana iguana",
"american chameleon, anole, anolis carolinensis",
"whiptail, whiptail lizard",
"agama",
"frilled lizard, chlamydosaurus kingi",
"alligator lizard",
"gila monster, heloderma suspectum",
"green lizard, lacerta viridis",
"african chameleon, chamaeleo chamaeleon",
"komodo dragon, komodo lizard, dragon lizard, giant lizard, varanus komodoensis",
"african crocodile, nile crocodile, crocodylus niloticus",
"american alligator, alligator mississipiensis",
"triceratops",
"thunder snake, worm snake, carphophis amoenus",
"ringneck snake, ring-necked snake, ring snake",
"hognose snake, puff adder, sand viper",
"green snake, grass snake",
"king snake, kingsnake",
"garter snake, grass snake",
"water snake",
"vine snake",
"night snake, hypsiglena torquata",
"boa constrictor, constrictor constrictor",
"rock python, rock snake, python sebae",
"indian cobra, naja naja",
"green mamba",
"sea snake",
"horned viper, cerastes, sand viper, horned asp, cerastes cornutus",
"diamondback, diamondback rattlesnake, crotalus adamanteus",
"sidewinder, horned rattlesnake, crotalus cerastes",
"trilobite",
"harvestman, daddy longlegs, phalangium opilio",
"scorpion",
"black and gold garden spider, argiope aurantia",
"barn spider, araneus cavaticus",
"garden spider, aranea diademata",
"black widow, latrodectus mactans",
"tarantula",
"wolf spider, hunting spider",
"tick",
"centipede",
"black grouse",
"ptarmigan",
"ruffed grouse, partridge, bonasa umbellus",
"prairie chicken, prairie grouse, prairie fowl",
"peacock",
"quail",
"partridge",
"african grey, african gray, psittacus erithacus",
"macaw",
"sulphur-crested cockatoo, kakatoe galerita, cacatua galerita",
"lorikeet",
"coucal",
"bee eater",
"hornbill",
"hummingbird",
"jacamar",
"toucan",
"drake",
"red-breasted merganser, mergus serrator",
"goose",
"black swan, cygnus atratus",
"tusker",
"echidna, spiny anteater, anteater",
"platypus, duckbill, duckbilled platypus, duck-billed platypus, ornithorhynchus anatinus",
"wallaby, brush kangaroo",
"koala, koala bear, kangaroo bear, native bear, phascolarctos cinereus",
"wombat",
"jellyfish",
"sea anemone, anemone",
"brain coral",
"flatworm, platyhelminth",
"nematode, nematode worm, roundworm",
"conch",
"snail",
"slug",
"sea slug, nudibranch",
"chiton, coat-of-mail shell, sea cradle, polyplacophore",
"chambered nautilus, pearly nautilus, nautilus",
"dungeness crab, cancer magister",
"rock crab, cancer irroratus",
"fiddler crab",
"king crab, alaska crab, alaskan king crab, alaska king crab, paralithodes camtschatica",
"american lobster, northern lobster, maine lobster, homarus americanus",
"spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish",
"crayfish, crawfish, crawdad, crawdaddy",
"hermit crab",
"isopod",
"white stork, ciconia ciconia",
"black stork, ciconia nigra",
"spoonbill",
"flamingo",
"little blue heron, egretta caerulea",
"american egret, great white heron, egretta albus",
"bittern",
"crane",
"limpkin, aramus pictus",
"european gallinule, porphyrio porphyrio",
"american coot, marsh hen, mud hen, water hen, fulica americana",
"bustard",
"ruddy turnstone, arenaria interpres",
"red-backed sandpiper, dunlin, erolia alpina",
"redshank, tringa totanus",
"dowitcher",
"oystercatcher, oyster catcher",
"pelican",
"king penguin, aptenodytes patagonica",
"albatross, mollymawk",
"grey whale, gray whale, devilfish, eschrichtius gibbosus, eschrichtius robustus",
"killer whale, killer, orca, grampus, sea wolf, orcinus orca",
"dugong, dugong dugon",
"sea lion",
"chihuahua",
"japanese spaniel",
"maltese dog, maltese terrier, maltese",
"pekinese, pekingese, peke",
"shih-tzu",
"blenheim spaniel",
"papillon",
"toy terrier",
"rhodesian ridgeback",
"afghan hound, afghan",
"basset, basset hound",
"beagle",
"bloodhound, sleuthhound",
"bluetick",
"black-and-tan coonhound",
"walker hound, walker foxhound",
"english foxhound",
"redbone",
"borzoi, russian wolfhound",
"irish wolfhound",
"italian greyhound",
"whippet",
"ibizan hound, ibizan podenco",
"norwegian elkhound, elkhound",
"otterhound, otter hound",
"saluki, gazelle hound",
"scottish deerhound, deerhound",
"weimaraner",
"staffordshire bullterrier, staffordshire bull terrier",
"american staffordshire terrier, staffordshire terrier, american pit bull terrier, pit bull terrier",
"bedlington terrier",
"border terrier",
"kerry blue terrier",
"irish terrier",
"norfolk terrier",
"norwich terrier",
"yorkshire terrier",
"wire-haired fox terrier",
"lakeland terrier",
"sealyham terrier, sealyham",
"airedale, airedale terrier",
"cairn, cairn terrier",
"australian terrier",
"dandie dinmont, dandie dinmont terrier",
"boston bull, boston terrier",
"miniature schnauzer",
"giant schnauzer",
"standard schnauzer",
"scotch terrier, scottish terrier, scottie",
"tibetan terrier, chrysanthemum dog",
"silky terrier, sydney silky",
"soft-coated wheaten terrier",
"west highland white terrier",
"lhasa, lhasa apso",
"flat-coated retriever",
"curly-coated retriever",
"golden retriever",
"labrador retriever",
"chesapeake bay retriever",
"german short-haired pointer",
"vizsla, hungarian pointer",
"english setter",
"irish setter, red setter",
"gordon setter",
"brittany spaniel",
"clumber, clumber spaniel",
"english springer, english springer spaniel",
"welsh springer spaniel",
"cocker spaniel, english cocker spaniel, cocker",
"sussex spaniel",
"irish water spaniel",
"kuvasz",
"schipperke",
"groenendael",
"malinois",
"briard",
"kelpie",
"komondor",
"old english sheepdog, bobtail",
"shetland sheepdog, shetland sheep dog, shetland",
"collie",
"border collie",
"bouvier des flandres, bouviers des flandres",
"rottweiler",
"german shepherd, german shepherd dog, german police dog, alsatian",
"doberman, doberman pinscher",
"miniature pinscher",
"greater swiss mountain dog",
"bernese mountain dog",
"appenzeller",
"entlebucher",
"boxer",
"bull mastiff",
"tibetan mastiff",
"french bulldog",
"great dane",
"saint bernard, st bernard",
"eskimo dog, husky",
"malamute, malemute, alaskan malamute",
"siberian husky",
"dalmatian, coach dog, carriage dog",
"affenpinscher, monkey pinscher, monkey dog",
"basenji",
"pug, pug-dog",
"leonberg",
"newfoundland, newfoundland dog",
"great pyrenees",
"samoyed, samoyede",
"pomeranian",
"chow, chow chow",
"keeshond",
"brabancon griffon",
"pembroke, pembroke welsh corgi",
"cardigan, cardigan welsh corgi",
"toy poodle",
"miniature poodle",
"standard poodle",
"mexican hairless",
"timber wolf, grey wolf, gray wolf, canis lupus",
"white wolf, arctic wolf, canis lupus tundrarum",
"red wolf, maned wolf, canis rufus, canis niger",
"coyote, prairie wolf, brush wolf, canis latrans",
"dingo, warrigal, warragal, canis dingo",
"dhole, cuon alpinus",
"african hunting dog, hyena dog, cape hunting dog, lycaon pictus",
"hyena, hyaena",
"red fox, vulpes vulpes",
"kit fox, vulpes macrotis",
"arctic fox, white fox, alopex lagopus",
"grey fox, gray fox, urocyon cinereoargenteus",
"tabby, tabby cat",
"tiger cat",
"persian cat",
"siamese cat, siamese",
"egyptian cat",
"cougar, puma, catamount, mountain lion, painter, panther, felis concolor",
"lynx, catamount",
"leopard, panthera pardus",
"snow leopard, ounce, panthera uncia",
"jaguar, panther, panthera onca, felis onca",
"lion, king of beasts, panthera leo",
"tiger, panthera tigris",
"cheetah, chetah, acinonyx jubatus",
"brown bear, bruin, ursus arctos",
"american black bear, black bear, ursus americanus, euarctos americanus",
"ice bear, polar bear, ursus maritimus, thalarctos maritimus",
"sloth bear, melursus ursinus, ursus ursinus",
"mongoose",
"meerkat, mierkat",
"tiger beetle",
"ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle",
"ground beetle, carabid beetle",
"long-horned beetle, longicorn, longicorn beetle",
"leaf beetle, chrysomelid",
"dung beetle",
"rhinoceros beetle",
"weevil",
"fly",
"bee",
"ant, emmet, pismire",
"grasshopper, hopper",
"cricket",
"walking stick, walkingstick, stick insect",
"cockroach, roach",
"mantis, mantid",
"cicada, cicala",
"leafhopper",
"lacewing, lacewing fly",
"dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk",
"damselfly",
"admiral",
"ringlet, ringlet butterfly",
"monarch, monarch butterfly, milkweed butterfly, danaus plexippus",
"cabbage butterfly",
"sulphur butterfly, sulfur butterfly",
"lycaenid, lycaenid butterfly",
"starfish, sea star",
"sea urchin",
"sea cucumber, holothurian",
"wood rabbit, cottontail, cottontail rabbit",
"hare",
"angora, angora rabbit",
"hamster",
"porcupine, hedgehog",
"fox squirrel, eastern fox squirrel, sciurus niger",
"marmot",
"beaver",
"guinea pig, cavia cobaya",
"sorrel",
"zebra",
"hog, pig, grunter, squealer, sus scrofa",
"wild boar, boar, sus scrofa",
"warthog",
"hippopotamus, hippo, river horse, hippopotamus amphibius",
"ox",
"water buffalo, water ox, asiatic buffalo, bubalus bubalis",
"bison",
"ram, tup",
"bighorn, bighorn sheep, cimarron, rocky mountain bighorn, rocky mountain sheep, ovis canadensis",
"ibex, capra ibex",
"hartebeest",
"impala, aepyceros melampus",
"gazelle",
"arabian camel, dromedary, camelus dromedarius",
"llama",
"weasel",
"mink",
"polecat, fitch, foulmart, foumart, mustela putorius",
"black-footed ferret, ferret, mustela nigripes",
"otter",
"skunk, polecat, wood pussy",
"badger",
"armadillo",
"three-toed sloth, ai, bradypus tridactylus",
"orangutan, orang, orangutang, pongo pygmaeus",
"gorilla, gorilla gorilla",
"chimpanzee, chimp, pan troglodytes",
"gibbon, hylobates lar",
"siamang, hylobates syndactylus, symphalangus syndactylus",
"guenon, guenon monkey",
"patas, hussar monkey, erythrocebus patas",
"baboon",
"macaque",
"langur",
"colobus, colobus monkey",
"proboscis monkey, nasalis larvatus",
"marmoset",
"capuchin, ringtail, cebus capucinus",
"howler monkey, howler",
"titi, titi monkey",
"spider monkey, ateles geoffroyi",
"squirrel monkey, saimiri sciureus",
"madagascar cat, ring-tailed lemur, lemur catta",
"indri, indris, indri indri, indri brevicaudatus",
"indian elephant, elephas maximus",
"african elephant, loxodonta africana",
"lesser panda, red panda, panda, bear cat, cat bear, ailurus fulgens",
"giant panda, panda, panda bear, coon bear, ailuropoda melanoleuca",
"barracouta, snoek",
"eel",
"coho, cohoe, coho salmon, blue jack, silver salmon, oncorhynchus kisutch",
"rock beauty, holocanthus tricolor",
"anemone fish",
"sturgeon",
"gar, garfish, garpike, billfish, lepisosteus osseus",
"lionfish",
"puffer, pufferfish, blowfish, globefish",
"abacus",
"abaya",
"academic gown, academic robe, judge's robe",
"accordion, piano accordion, squeeze box",
"acoustic guitar",
"aircraft carrier, carrier, flattop, attack aircraft carrier",
"airliner",
"airship, dirigible",
"altar",
"ambulance",
"amphibian, amphibious vehicle",
"analog clock",
"apiary, bee house",
"apron",
"ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin",
"assault rifle, assault gun",
"backpack, back pack, knapsack, packsack, rucksack, haversack",
"bakery, bakeshop, bakehouse",
"balance beam, beam",
"balloon",
"ballpoint, ballpoint pen, ballpen, biro",
"band aid",
"banjo",
"bannister, banister, balustrade, balusters, handrail",
"barbell",
"barber chair",
"barbershop",
"barn",
"barometer",
"barrel, cask",
"barrow, garden cart, lawn cart, wheelbarrow",
"baseball",
"basketball",
"bassinet",
"bassoon",
"bathing cap, swimming cap",
"bath towel",
"bathtub, bathing tub, bath, tub",
"beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon",
"beacon, lighthouse, beacon light, pharos",
"beaker",
"bearskin, busby, shako",
"beer bottle",
"beer glass",
"bell cote, bell cot",
"bib",
"bicycle-built-for-two, tandem bicycle, tandem",
"bikini, two-piece",
"binder, ring-binder",
"binoculars, field glasses, opera glasses",
"birdhouse",
"boathouse",
"bobsled, bobsleigh, bob",
"bolo tie, bolo, bola tie, bola",
"bonnet, poke bonnet",
"bookcase",
"bookshop, bookstore, bookstall",
"bottlecap",
"bow",
"bow tie, bow-tie, bowtie",
"brass, memorial tablet, plaque",
"brassiere, bra, bandeau",
"breakwater, groin, groyne, mole, bulwark, seawall, jetty",
"breastplate, aegis, egis",
"broom",
"bucket, pail",
"buckle",
"bulletproof vest",
"bullet train, bullet",
"butcher shop, meat market",
"cab, hack, taxi, taxicab",
"caldron, cauldron",
"candle, taper, wax light",
"cannon",
"canoe",
"can opener, tin opener",
"cardigan",
"car mirror",
"carousel, carrousel, merry-go-round, roundabout, whirligig",
"carpenter's kit, tool kit",
"carton",
"car wheel",
"cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, atm",
"cassette",
"cassette player",
"castle",
"catamaran",
"cd player",
"cello, violoncello",
"cellular telephone, cellular phone, cellphone, cell, mobile phone",
"chain",
"chainlink fence",
"chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour",
"chain saw, chainsaw",
"chest",
"chiffonier, commode",
"chime, bell, gong",
"china cabinet, china closet",
"christmas stocking",
"church, church building",
"cinema, movie theater, movie theatre, movie house, picture palace",
"cleaver, meat cleaver, chopper",
"cliff dwelling",
"cloak",
"clog, geta, patten, sabot",
"cocktail shaker",
"coffee mug",
"coffeepot",
"coil, spiral, volute, whorl, helix",
"combination lock",
"computer keyboard, keypad",
"confectionery, confectionary, candy store",
"container ship, containership, container vessel",
"convertible",
"corkscrew, bottle screw",
"cornet, horn, trumpet, trump",
"cowboy boot",
"cowboy hat, ten-gallon hat",
"cradle",
"crane",
"crash helmet",
"crate",
"crib, cot",
"crock pot",
"croquet ball",
"crutch",
"cuirass",
"dam, dike, dyke",
"desk",
"desktop computer",
"dial telephone, dial phone",
"diaper, nappy, napkin",
"digital clock",
"digital watch",
"dining table, board",
"dishrag, dishcloth",
"dishwasher, dish washer, dishwashing machine",
"disk brake, disc brake",
"dock, dockage, docking facility",
"dogsled, dog sled, dog sleigh",
"dome",
"doormat, welcome mat",
"drilling platform, offshore rig",
"drum, membranophone, tympan",
"drumstick",
"dumbbell",
"dutch oven",
"electric fan, blower",
"electric guitar",
"electric locomotive",
"entertainment center",
"envelope",
"espresso maker",
"face powder",
"feather boa, boa",
"file, file cabinet, filing cabinet",
"fireboat",
"fire engine, fire truck",
"fire screen, fireguard",
"flagpole, flagstaff",
"flute, transverse flute",
"folding chair",
"football helmet",
"forklift",
"fountain",
"fountain pen",
"four-poster",
"freight car",
"french horn, horn",
"frying pan, frypan, skillet",
"fur coat",
"garbage truck, dustcart",
"gasmask, respirator, gas helmet",
"gas pump, gasoline pump, petrol pump, island dispenser",
"goblet",
"go-kart",
"golf ball",
"golfcart, golf cart",
"gondola",
"gong, tam-tam",
"gown",
"grand piano, grand",
"greenhouse, nursery, glasshouse",
"grille, radiator grille",
"grocery store, grocery, food market, market",
"guillotine",
"hair slide",
"hair spray",
"half track",
"hammer",
"hamper",
"hand blower, blow dryer, blow drier, hair dryer, hair drier",
"hand-held computer, hand-held microcomputer",
"handkerchief, hankie, hanky, hankey",
"hard disc, hard disk, fixed disk",
"harmonica, mouth organ, harp, mouth harp",
"harp",
"harvester, reaper",
"hatchet",
"holster",
"home theater, home theatre",
"honeycomb",
"hook, claw",
"hoopskirt, crinoline",
"horizontal bar, high bar",
"horse cart, horse-cart",
"hourglass",
"ipod",
"iron, smoothing iron",
"jack-o'-lantern",
"jean, blue jean, denim",
"jeep, landrover",
"jersey, t-shirt, tee shirt",
"jigsaw puzzle",
"jinrikisha, ricksha, rickshaw",
"joystick",
"kimono",
"knee pad",
"knot",
"lab coat, laboratory coat",
"ladle",
"lampshade, lamp shade",
"laptop, laptop computer",
"lawn mower, mower",
"lens cap, lens cover",
"letter opener, paper knife, paperknife",
"library",
"lifeboat",
"lighter, light, igniter, ignitor",
"limousine, limo",
"liner, ocean liner",
"lipstick, lip rouge",
"loafer",
"lotion",
"loudspeaker, speaker, speaker unit, loudspeaker system, speaker system",
"loupe, jeweler's loupe",
"lumbermill, sawmill",
"magnetic compass",
"mailbag, postbag",
"mailbox, letter box",
"maillot",
"maillot, tank suit",
"manhole cover",
"maraca",
"marimba, xylophone",
"mask",
"matchstick",
"maypole",
"maze, labyrinth",
"measuring cup",
"medicine chest, medicine cabinet",
"megalith, megalithic structure",
"microphone, mike",
"microwave, microwave oven",
"military uniform",
"milk can",
"minibus",
"miniskirt, mini",
"minivan",
"missile",
"mitten",
"mixing bowl",
"mobile home, manufactured home",
"model t",
"modem",
"monastery",
"monitor",
"moped",
"mortar",
"mortarboard",
"mosque",
"mosquito net",
"motor scooter, scooter",
"mountain bike, all-terrain bike, off-roader",
"mountain tent",
"mouse, computer mouse",
"mousetrap",
"moving van",
"muzzle",
"nail",
"neck brace",
"necklace",
"nipple",
"notebook, notebook computer",
"obelisk",
"oboe, hautboy, hautbois",
"ocarina, sweet potato",
"odometer, hodometer, mileometer, milometer",
"oil filter",
"organ, pipe organ",
"oscilloscope, scope, cathode-ray oscilloscope, cro",
"overskirt",
"oxcart",
"oxygen mask",
"packet",
"paddle, boat paddle",
"paddlewheel, paddle wheel",
"padlock",
"paintbrush",
"pajama, pyjama, pj's, jammies",
"palace",
"panpipe, pandean pipe, syrinx",
"paper towel",
"parachute, chute",
"parallel bars, bars",
"park bench",
"parking meter",
"passenger car, coach, carriage",
"patio, terrace",
"pay-phone, pay-station",
"pedestal, plinth, footstall",
"pencil box, pencil case",
"pencil sharpener",
"perfume, essence",
"petri dish",
"photocopier",
"pick, plectrum, plectron",
"pickelhaube",
"picket fence, paling",
"pickup, pickup truck",
"pier",
"piggy bank, penny bank",
"pill bottle",
"pillow",
"ping-pong ball",
"pinwheel",
"pirate, pirate ship",
"pitcher, ewer",
"plane, carpenter's plane, woodworking plane",
"planetarium",
"plastic bag",
"plate rack",
"plow, plough",
"plunger, plumber's helper",
"polaroid camera, polaroid land camera",
"pole",
"police van, police wagon, paddy wagon, patrol wagon, wagon, black maria",
"poncho",
"pool table, billiard table, snooker table",
"pop bottle, soda bottle",
"pot, flowerpot",
"potter's wheel",
"power drill",
"prayer rug, prayer mat",
"printer",
"prison, prison house",
"projectile, missile",
"projector",
"puck, hockey puck",
"punching bag, punch bag, punching ball, punchball",
"purse",
"quill, quill pen",
"quilt, comforter, comfort, puff",
"racer, race car, racing car",
"racket, racquet",
"radiator",
"radio, wireless",
"radio telescope, radio reflector",
"rain barrel",
"recreational vehicle, rv, r.v.",
"reel",
"reflex camera",
"refrigerator, icebox",
"remote control, remote",
"restaurant, eating house, eating place, eatery",
"revolver, six-gun, six-shooter",
"rifle",
"rocking chair, rocker",
"rotisserie",
"rubber eraser, rubber, pencil eraser",
"rugby ball",
"rule, ruler",
"running shoe",
"safe",
"safety pin",
"saltshaker, salt shaker",
"sandal",
"sarong",
"sax, saxophone",
"scabbard",
"scale, weighing machine",
"school bus",
"schooner",
"scoreboard",
"screen, crt screen",
"screw",
"screwdriver",
"seat belt, seatbelt",
"sewing machine",
"shield, buckler",
"shoe shop, shoe-shop, shoe store",
"shoji",
"shopping basket",
"shopping cart",
"shovel",
"shower cap",
"shower curtain",
"ski",
"ski mask",
"sleeping bag",
"slide rule, slipstick",
"sliding door",
"slot, one-armed bandit",
"snorkel",
"snowmobile",
"snowplow, snowplough",
"soap dispenser",
"soccer ball",
"sock",
"solar dish, solar collector, solar furnace",
"sombrero",
"soup bowl",
"space bar",
"space heater",
"space shuttle",
"spatula",
"speedboat",
"spider web, spider's web",
"spindle",
"sports car, sport car",
"spotlight, spot",
"stage",
"steam locomotive",
"steel arch bridge",
"steel drum",
"stethoscope",
"stole",
"stone wall",
"stopwatch, stop watch",
"stove",
"strainer",
"streetcar, tram, tramcar, trolley, trolley car",
"stretcher",
"studio couch, day bed",
"stupa, tope",
"submarine, pigboat, sub, u-boat",
"suit, suit of clothes",
"sundial",
"sunglass",
"sunglasses, dark glasses, shades",
"sunscreen, sunblock, sun blocker",
"suspension bridge",
"swab, swob, mop",
"sweatshirt",
"swimming trunks, bathing trunks",
"swing",
"switch, electric switch, electrical switch",
"syringe",
"table lamp",
"tank, army tank, armored combat vehicle, armoured combat vehicle",
"tape player",
"teapot",
"teddy, teddy bear",
"television, television system",
"tennis ball",
"thatch, thatched roof",
"theater curtain, theatre curtain",
"thimble",
"thresher, thrasher, threshing machine",
"throne",
"tile roof",
"toaster",
"tobacco shop, tobacconist shop, tobacconist",
"toilet seat",
"torch",
"totem pole",
"tow truck, tow car, wrecker",
"toyshop",
"tractor",
"trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi",
"tray",
"trench coat",
"tricycle, trike, velocipede",
"trimaran",
"tripod",
"triumphal arch",
"trolleybus, trolley coach, trackless trolley",
"trombone",
"tub, vat",
"turnstile",
"typewriter keyboard",
"umbrella",
"unicycle, monocycle",
"upright, upright piano",
"vacuum, vacuum cleaner",
"vase",
"vault",
"velvet",
"vending machine",
"vestment",
"viaduct",
"violin, fiddle",
"volleyball",
"waffle iron",
"wall clock",
"wallet, billfold, notecase, pocketbook",
"wardrobe, closet, press",
"warplane, military plane",
"washbasin, handbasin, washbowl, lavabo, wash-hand basin",
"washer, automatic washer, washing machine",
"water bottle",
"water jug",
"water tower",
"whiskey jug",
"whistle",
"wig",
"window screen",
"window shade",
"windsor tie",
"wine bottle",
"wing",
"wok",
"wooden spoon",
"wool, woolen, woollen",
"worm fence, snake fence, snake-rail fence, virginia fence",
"wreck",
"yawl",
"yurt",
"web site, website, internet site, site",
"comic book",
"crossword puzzle, crossword",
"street sign",
"traffic light, traffic signal, stoplight",
"book jacket, dust cover, dust jacket, dust wrapper",
"menu",
"plate",
"guacamole",
"consomme",
"hot pot, hotpot",
"trifle",
"ice cream, icecream",
"ice lolly, lolly, lollipop, popsicle",
"french loaf",
"bagel, beigel",
"pretzel",
"cheeseburger",
"hotdog, hot dog, red hot",
"mashed potato",
"head cabbage",
"broccoli",
"cauliflower",
"zucchini, courgette",
"spaghetti squash",
"acorn squash",
"butternut squash",
"cucumber, cuke",
"artichoke, globe artichoke",
"bell pepper",
"cardoon",
"mushroom",
"granny smith",
"strawberry",
"orange",
"lemon",
"fig",
"pineapple, ananas",
"banana",
"jackfruit, jak, jack",
"custard apple",
"pomegranate",
"hay",
"carbonara",
"chocolate sauce, chocolate syrup",
"dough",
"meat loaf, meatloaf",
"pizza, pizza pie",
"potpie",
"burrito",
"red wine",
"espresso",
"cup",
"eggnog",
"alp",
"bubble",
"cliff, drop, drop-off",
"coral reef",
"geyser",
"lakeside, lakeshore",
"promontory, headland, head, foreland",
"sandbar, sand bar",
"seashore, coast, seacoast, sea-coast",
"valley, vale",
"volcano",
"ballplayer, baseball player",
"groom, bridegroom",
"scuba diver",
"rapeseed",
"daisy",
"yellow lady's slipper, yellow lady-slipper, cypripedium calceolus, cypripedium parviflorum",
"corn",
"acorn",
"hip, rose hip, rosehip",
"buckeye, horse chestnut, conker",
"coral fungus",
"agaric",
"gyromitra",
"stinkhorn, carrion fungus",
"earthstar",
"hen-of-the-woods, hen of the woods, polyporus frondosus, grifola frondosa",
"bolete",
"ear, spike, capitulum",
"toilet tissue, toilet paper, bathroom tissue"
] |
lantian-chen/my_awesome_food_model
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_food_model
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the food101 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6473
- Accuracy: 0.874
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.7147 | 0.99 | 62 | 2.5361 | 0.804 |
| 1.8577 | 2.0 | 125 | 1.8141 | 0.852 |
| 1.6359 | 2.98 | 186 | 1.6473 | 0.874 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
|
[
"apple_pie",
"baby_back_ribs",
"bruschetta",
"waffles",
"caesar_salad",
"cannoli",
"caprese_salad",
"carrot_cake",
"ceviche",
"cheesecake",
"cheese_plate",
"chicken_curry",
"chicken_quesadilla",
"baklava",
"chicken_wings",
"chocolate_cake",
"chocolate_mousse",
"churros",
"clam_chowder",
"club_sandwich",
"crab_cakes",
"creme_brulee",
"croque_madame",
"cup_cakes",
"beef_carpaccio",
"deviled_eggs",
"donuts",
"dumplings",
"edamame",
"eggs_benedict",
"escargots",
"falafel",
"filet_mignon",
"fish_and_chips",
"foie_gras",
"beef_tartare",
"french_fries",
"french_onion_soup",
"french_toast",
"fried_calamari",
"fried_rice",
"frozen_yogurt",
"garlic_bread",
"gnocchi",
"greek_salad",
"grilled_cheese_sandwich",
"beet_salad",
"grilled_salmon",
"guacamole",
"gyoza",
"hamburger",
"hot_and_sour_soup",
"hot_dog",
"huevos_rancheros",
"hummus",
"ice_cream",
"lasagna",
"beignets",
"lobster_bisque",
"lobster_roll_sandwich",
"macaroni_and_cheese",
"macarons",
"miso_soup",
"mussels",
"nachos",
"omelette",
"onion_rings",
"oysters",
"bibimbap",
"pad_thai",
"paella",
"pancakes",
"panna_cotta",
"peking_duck",
"pho",
"pizza",
"pork_chop",
"poutine",
"prime_rib",
"bread_pudding",
"pulled_pork_sandwich",
"ramen",
"ravioli",
"red_velvet_cake",
"risotto",
"samosa",
"sashimi",
"scallops",
"seaweed_salad",
"shrimp_and_grits",
"breakfast_burrito",
"spaghetti_bolognese",
"spaghetti_carbonara",
"spring_rolls",
"steak",
"strawberry_shortcake",
"sushi",
"tacos",
"takoyaki",
"tiramisu",
"tuna_tartare"
] |
luminoussg/autotrain-xray-93756145904
|
# Model Trained Using AutoTrain
- Problem type: Binary Classification
- Model ID: 93756145904
- CO2 Emissions (in grams): 5.1814
## Validation Metrics
- Loss: 0.062
- Accuracy: 0.977
- Precision: 0.983
- Recall: 0.986
- AUC: 0.997
- F1: 0.984
|
[
"normal",
"pneumonia"
] |
qzheng75/swin-tiny-patch4-window7-224-finetuned-image-is-plot-or-not
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-image-is-plot-or-not
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0000
- Accuracy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0006 | 1.0 | 448 | 0.0000 | 1.0 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
[
"is_not_plot",
"is_plot"
] |
qzheng75/swin-tiny-patch4-window7-224-finetuned-image-is-plot-or-not-finetuned-image-is-plot-or-not
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-image-is-plot-or-not-finetuned-image-is-plot-or-not
This model is a fine-tuned version of [qzheng75/swin-tiny-patch4-window7-224-finetuned-image-is-plot-or-not](https://huggingface.co/qzheng75/swin-tiny-patch4-window7-224-finetuned-image-is-plot-or-not) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0012
- Accuracy: 0.9997
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0001 | 1.0 | 90 | 0.0009 | 0.9997 |
| 0.0014 | 1.99 | 180 | 0.0022 | 0.9993 |
| 0.0054 | 2.99 | 270 | 0.0012 | 0.9997 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
[
"is_not_plot",
"is_plot"
] |
dvs/swin-tiny-patch4-window7-224-uploads-classifier-v2
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-uploads-classifier-v2
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0745
- Accuracy: 0.9843
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.2482 | 1.0 | 18 | 0.4781 | 0.8824 |
| 0.3036 | 2.0 | 36 | 0.0936 | 0.9804 |
| 0.1687 | 3.0 | 54 | 0.0745 | 0.9843 |
| 0.1392 | 4.0 | 72 | 0.0980 | 0.9725 |
| 0.14 | 5.0 | 90 | 0.0778 | 0.9765 |
| 0.1186 | 6.0 | 108 | 0.0837 | 0.9725 |
| 0.1088 | 7.0 | 126 | 0.0645 | 0.9804 |
| 0.0789 | 8.0 | 144 | 0.0675 | 0.9765 |
| 0.0644 | 9.0 | 162 | 0.0940 | 0.9686 |
| 0.0582 | 10.0 | 180 | 0.0879 | 0.9725 |
| 0.0591 | 11.0 | 198 | 0.0935 | 0.9686 |
| 0.0538 | 12.0 | 216 | 0.0540 | 0.9804 |
| 0.0588 | 13.0 | 234 | 0.0725 | 0.9686 |
| 0.0538 | 14.0 | 252 | 0.0637 | 0.9765 |
| 0.0462 | 15.0 | 270 | 0.0694 | 0.9725 |
| 0.0352 | 16.0 | 288 | 0.0771 | 0.9686 |
| 0.0536 | 17.0 | 306 | 0.0629 | 0.9804 |
| 0.0403 | 18.0 | 324 | 0.0933 | 0.9686 |
| 0.0412 | 19.0 | 342 | 0.0848 | 0.9725 |
| 0.0305 | 20.0 | 360 | 0.0820 | 0.9725 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
|
[
"full-bleed",
"illustration",
"logo",
"photo",
"qr-code"
] |
tejp/fine-tuned-augmented
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fine-tuned-augmented
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the custom_dataset_augmented dataset.
It achieves the following results on the evaluation set:
- Loss: 2.2134
- Accuracy: 0.2333
- F1: 0.0455
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
|
[
"aeroplane",
"car",
"people",
"river",
"ski",
"snow",
"deer",
"dogsledge",
"factory",
"fireman",
"firetruck",
"ladder",
"lake",
"mountain"
] |
kenghweetan/clothing_category_model
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# clothing_category_model
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 4.4070
- Accuracy: 0.2103
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.695 | 1.0 | 551 | 4.4070 | 0.2103 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.0.1+cpu
- Datasets 2.14.5
- Tokenizers 0.13.3
|
[
"accessories",
"apparel set",
"eyewear",
"flip flops",
"fragrance",
"free gifts",
"gloves",
"hair",
"headwear",
"home furnishing",
"innerwear",
"jewellery",
"bags",
"lips",
"loungewear and nightwear",
"makeup",
"mufflers",
"nails",
"perfumes",
"sandal",
"saree",
"scarves",
"shoe accessories",
"bath and body",
"shoes",
"skin",
"skin care",
"socks",
"sports accessories",
"sports equipment",
"stoles",
"ties",
"topwear",
"umbrellas",
"beauty accessories",
"wallets",
"watches",
"water bottle",
"wristbands",
"belts",
"bottomwear",
"cufflinks",
"dress",
"eyes"
] |
Frank0930/swin-tiny-patch4-window7-224-finetuned-eurosat
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-eurosat
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the cifar10 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0864
- Accuracy: 0.9726
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.4809 | 1.0 | 351 | 0.1388 | 0.9528 |
| 0.3489 | 2.0 | 703 | 0.0945 | 0.9692 |
| 0.3528 | 2.99 | 1053 | 0.0864 | 0.9726 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
|
[
"airplane",
"automobile",
"bird",
"cat",
"deer",
"dog",
"frog",
"horse",
"ship",
"truck"
] |
Devarshi/Armature_Defect_Detection_Resin
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Armature_Defect_Detection_Resin
This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224-in22k](https://huggingface.co/microsoft/swin-base-patch4-window7-224-in22k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4978
- Accuracy: 0.76
- F1: 0.76
- Recall: 0.76
- Precision: 0.76
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Recall | Precision |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----:|:------:|:---------:|
| No log | 0.57 | 1 | 0.7205 | 0.44 | 0.44 | 0.44 | 0.44 |
| No log | 1.57 | 2 | 0.5926 | 0.6 | 0.6 | 0.6 | 0.6 |
| No log | 2.57 | 3 | 0.4978 | 0.76 | 0.76 | 0.76 | 0.76 |
### Framework versions
- Transformers 4.23.1
- Pytorch 1.13.0
- Datasets 2.6.1
- Tokenizers 0.13.1
|
[
"resin under fill",
"good pieces"
] |
romitbarua/autotrain-deepfakeface-94074145984
|
# Model Trained Using AutoTrain
- Problem type: Image Classification
- CO2 Emissions (in grams): 32.1392
## Validation Metrics
- Loss: 0.23420386016368866
- Macro F1: 0.9410988547155245
- Micro F1: 0.941
- Weighted F1: 0.9410988547155245
- Macro Precision: 0.9415975677612235
- Micro Precision: 0.941
- Weighted Precision: 0.9415975677612235
- Macro Recall: 0.941
- Micro Recall: 0.941
- Weighted Recall: 0.941
- Accuracy: 0.941
|
[
"inpainting",
"insight",
"text2img",
"wiki"
] |
EscvNcl/MobileNet-V2-Retinopathy
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# MobileNet-V2-Retinopathy
This model is a fine-tuned version of [google/mobilenet_v2_1.4_224](https://huggingface.co/google/mobilenet_v2_1.4_224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2044
- Accuracy: 0.9307
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 15
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.4403 | 1.0 | 113 | 0.5330 | 0.7079 |
| 0.5538 | 2.0 | 227 | 0.4312 | 0.7723 |
| 0.542 | 3.0 | 340 | 0.5137 | 0.7426 |
| 0.4776 | 4.0 | 454 | 0.4656 | 0.7723 |
| 0.4244 | 5.0 | 567 | 1.0400 | 0.5990 |
| 0.4694 | 6.0 | 681 | 0.5936 | 0.7228 |
| 0.4494 | 7.0 | 794 | 0.4667 | 0.7822 |
| 0.4647 | 8.0 | 908 | 0.2629 | 0.8960 |
| 0.3646 | 9.0 | 1021 | 0.2287 | 0.8861 |
| 0.4827 | 10.0 | 1135 | 1.7967 | 0.5149 |
| 0.3679 | 11.0 | 1248 | 0.4184 | 0.8267 |
| 0.3454 | 12.0 | 1362 | 0.1885 | 0.9406 |
| 0.3562 | 13.0 | 1475 | 0.2798 | 0.9059 |
| 0.3397 | 14.0 | 1589 | 1.6444 | 0.5891 |
| 0.4047 | 14.93 | 1695 | 0.2044 | 0.9307 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
|
[
"nrdr",
"rdr"
] |
chanelcolgate/vit-base-image-classification-yenthienviet
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-image-classification-yenthienviet
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the image-classification-yenthienviet dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2380
- Accuracy: 0.9344
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6118 | 0.56 | 100 | 0.4854 | 0.8616 |
| 0.329 | 1.11 | 200 | 0.4473 | 0.8616 |
| 0.3002 | 1.67 | 300 | 0.4167 | 0.8637 |
| 0.1549 | 2.22 | 400 | 0.2911 | 0.9178 |
| 0.1993 | 2.78 | 500 | 0.2934 | 0.9168 |
| 0.1071 | 3.33 | 600 | 0.2389 | 0.9324 |
| 0.1027 | 3.89 | 700 | 0.2380 | 0.9344 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
|
[
"botkhi",
"thuytinh",
"ocvit",
"ban",
"contrung",
"kimloai",
"toc"
] |
wasifh/model
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# model
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8671
- Accuracy: 0.8235
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 6
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.9738 | 0.94 | 8 | 1.1530 | 0.5882 |
| 0.8674 | 2.0 | 17 | 1.0818 | 0.5882 |
| 0.708 | 2.94 | 25 | 1.0412 | 0.5882 |
| 0.7004 | 4.0 | 34 | 0.9774 | 0.7647 |
| 0.5957 | 4.94 | 42 | 1.0344 | 0.6471 |
| 0.5273 | 5.65 | 48 | 0.8671 | 0.8235 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
|
[
"blistering",
"flatspot",
"graining",
"none"
] |
superdinmc/autotrain-orbit-millets-94211146034
|
# Model Trained Using AutoTrain
- Problem type: Multi-class Classification
- Model ID: 94211146034
- CO2 Emissions (in grams): 0.0110
## Validation Metrics
- Loss: 1.742
- Accuracy: 0.357
- Macro F1: 0.314
- Micro F1: 0.357
- Weighted F1: 0.314
- Macro Precision: 0.321
- Micro Precision: 0.357
- Weighted Precision: 0.321
- Macro Recall: 0.357
- Micro Recall: 0.357
- Weighted Recall: 0.357
|
[
"bajra",
"barri",
"jhangora",
"jowar",
"kangni",
"kodra",
"ragi"
] |
zkdeng/deit-base-patch16-224-finetuned-dangerousSpiders
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# deit-base-patch16-224-finetuned-dangerousSpiders
This model is a fine-tuned version of [facebook/deit-base-patch16-224](https://huggingface.co/facebook/deit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- eval_loss: 0.1957
- eval_accuracy: 0.915
- eval_precision: 0.8899
- eval_recall: 0.9510
- eval_f1: 0.9194
- eval_runtime: 5.3671
- eval_samples_per_second: 37.264
- eval_steps_per_second: 2.422
- step: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 2
### Framework versions
- Transformers 4.33.2
- Pytorch 2.2.0.dev20230921
- Datasets 2.14.5
- Tokenizers 0.13.3
|
[
"lactrodectus_hesperus",
"parasteatoda_tepidariorum"
] |
dima806/oxford_flowers_image_detection
|
Returns the flower type for a given image.
See https://www.kaggle.com/code/dima806/oxford-flowers-image-detection-vit for more details.
```
Classification report:
precision recall f1-score support
bolero deep blue 1.0000 1.0000 1.0000 94
toad lily 1.0000 1.0000 1.0000 94
bougainvillea 1.0000 1.0000 1.0000 94
blanket flower 1.0000 1.0000 1.0000 93
prince of wales feathers 1.0000 1.0000 1.0000 94
english marigold 1.0000 1.0000 1.0000 93
common dandelion 1.0000 1.0000 1.0000 94
mallow 1.0000 1.0000 1.0000 94
barbeton daisy 1.0000 1.0000 1.0000 94
desert-rose 1.0000 1.0000 1.0000 94
anthurium 1.0000 1.0000 1.0000 94
cyclamen 1.0000 1.0000 1.0000 94
marigold 1.0000 1.0000 1.0000 93
spring crocus 1.0000 1.0000 1.0000 94
petunia 1.0000 1.0000 1.0000 94
foxglove 1.0000 1.0000 1.0000 94
primula 1.0000 1.0000 1.0000 94
cape flower 1.0000 1.0000 1.0000 94
colt's foot 1.0000 1.0000 1.0000 93
osteospermum 1.0000 1.0000 1.0000 93
buttercup 1.0000 1.0000 1.0000 94
balloon flower 1.0000 1.0000 1.0000 94
fire lily 1.0000 1.0000 1.0000 93
bromelia 1.0000 1.0000 1.0000 93
artichoke 1.0000 1.0000 1.0000 93
daffodil 1.0000 1.0000 1.0000 94
pink-yellow dahlia 1.0000 1.0000 1.0000 93
geranium 1.0000 1.0000 1.0000 94
peruvian lily 1.0000 1.0000 1.0000 93
king protea 1.0000 1.0000 1.0000 94
silverbush 1.0000 1.0000 1.0000 94
alpine sea holly 1.0000 1.0000 1.0000 94
hibiscus 1.0000 1.0000 1.0000 93
giant white arum lily 1.0000 1.0000 1.0000 94
canna lily 1.0000 1.0000 1.0000 94
sunflower 1.0000 1.0000 1.0000 94
sweet pea 1.0000 1.0000 1.0000 94
mexican aster 1.0000 1.0000 1.0000 93
californian poppy 1.0000 1.0000 1.0000 94
pincushion flower 1.0000 1.0000 1.0000 93
black-eyed susan 1.0000 1.0000 1.0000 94
blackberry lily 1.0000 1.0000 1.0000 93
gaura 1.0000 1.0000 1.0000 94
love in the mist 1.0000 1.0000 1.0000 93
spear thistle 1.0000 1.0000 1.0000 94
orange dahlia 1.0000 1.0000 1.0000 93
wallflower 1.0000 1.0000 1.0000 93
tiger lily 1.0000 1.0000 1.0000 94
stemless gentian 1.0000 1.0000 1.0000 93
morning glory 1.0000 1.0000 1.0000 93
frangipani 1.0000 1.0000 1.0000 94
lotus lotus 1.0000 1.0000 1.0000 93
red ginger 1.0000 1.0000 1.0000 94
oxeye daisy 1.0000 1.0000 1.0000 94
windflower 1.0000 1.0000 1.0000 93
monkshood 1.0000 1.0000 1.0000 94
bishop of llandaff 1.0000 1.0000 1.0000 93
globe-flower 1.0000 1.0000 1.0000 93
globe thistle 1.0000 1.0000 1.0000 93
poinsettia 1.0000 1.0000 1.0000 94
wild pansy 1.0000 1.0000 1.0000 93
water lily 1.0000 1.0000 1.0000 94
watercress 1.0000 1.0000 1.0000 93
mexican petunia 1.0000 1.0000 1.0000 94
corn poppy 1.0000 1.0000 1.0000 93
bearded iris 1.0000 1.0000 1.0000 93
azalea 1.0000 1.0000 1.0000 93
camellia 1.0000 1.0000 1.0000 94
tree poppy 1.0000 1.0000 1.0000 93
moon orchid 1.0000 1.0000 1.0000 94
magnolia 1.0000 1.0000 1.0000 94
bee balm 1.0000 1.0000 1.0000 94
lenten rose 1.0000 1.0000 1.0000 94
trumpet creeper 1.0000 1.0000 1.0000 94
passion flower 1.0000 1.0000 1.0000 94
yellow iris 1.0000 1.0000 1.0000 93
pelargonium 1.0000 1.0000 1.0000 93
tree mallow 1.0000 1.0000 1.0000 94
thorn apple 1.0000 1.0000 1.0000 94
garden phlox 1.0000 1.0000 1.0000 94
sword lily 1.0000 1.0000 1.0000 94
carnation 1.0000 1.0000 1.0000 94
ruby-lipped cattleya 1.0000 1.0000 1.0000 94
ball moss 1.0000 1.0000 1.0000 94
columbine 1.0000 1.0000 1.0000 93
siam tulip 1.0000 1.0000 1.0000 94
snapdragon 1.0000 1.0000 1.0000 94
cautleya spicata 1.0000 1.0000 1.0000 94
hard-leaved pocket orchid 1.0000 1.0000 1.0000 93
pink primrose 1.0000 1.0000 1.0000 94
gazania 1.0000 1.0000 1.0000 93
hippeastrum 1.0000 1.0000 1.0000 93
fritillary 1.0000 1.0000 1.0000 93
canterbury bells 1.0000 1.0000 1.0000 94
great masterwort 1.0000 1.0000 1.0000 93
sweet william 1.0000 1.0000 1.0000 94
clematis 1.0000 1.0000 1.0000 93
purple coneflower 1.0000 1.0000 1.0000 94
japanese anemone 1.0000 1.0000 1.0000 94
bird of paradise 1.0000 1.0000 1.0000 93
rose 1.0000 1.0000 1.0000 94
grape hyacinth 1.0000 1.0000 1.0000 94
accuracy 1.0000 9548
macro avg 1.0000 1.0000 1.0000 9548
weighted avg 1.0000 1.0000 1.0000 9548
```
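A minimal inference sketch with the `transformers` image-classification pipeline (the image path is hypothetical):
```
from transformers import pipeline

# Any PIL-compatible path or URL works as input.
p = pipeline("image-classification", model="dima806/oxford_flowers_image_detection")
print(p("flower.jpg"))  # hypothetical local image
```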
|
[
"bolero deep blue",
"toad lily",
"bougainvillea",
"blanket flower",
"prince of wales feathers",
"english marigold",
"common dandelion",
"mallow",
"barbeton daisy",
"desert-rose",
"anthurium",
"cyclamen",
"marigold",
"spring crocus",
"petunia",
"foxglove",
"primula",
"cape flower",
"colt's foot",
"osteospermum",
"buttercup",
"balloon flower",
"fire lily",
"bromelia",
"artichoke",
"daffodil",
"pink-yellow dahlia",
"geranium",
"peruvian lily",
"king protea",
"silverbush",
"alpine sea holly",
"hibiscus",
"giant white arum lily",
"canna lily",
"sunflower",
"sweet pea",
"mexican aster",
"californian poppy",
"pincushion flower",
"black-eyed susan",
"blackberry lily",
"gaura",
"love in the mist",
"spear thistle",
"orange dahlia",
"wallflower",
"tiger lily",
"stemless gentian",
"morning glory",
"frangipani",
"lotus lotus",
"red ginger",
"oxeye daisy",
"windflower",
"monkshood",
"bishop of llandaff",
"globe-flower",
"globe thistle",
"poinsettia",
"wild pansy",
"water lily",
"watercress",
"mexican petunia",
"corn poppy",
"bearded iris",
"azalea",
"camellia",
"tree poppy",
"moon orchid",
"magnolia",
"bee balm",
"lenten rose",
"trumpet creeper",
"passion flower",
"yellow iris",
"pelargonium",
"tree mallow",
"thorn apple",
"garden phlox",
"sword lily",
"carnation",
"ruby-lipped cattleya",
"ball moss",
"columbine",
"siam tulip",
"snapdragon",
"cautleya spicata",
"hard-leaved pocket orchid",
"pink primrose",
"gazania",
"hippeastrum",
"fritillary",
"canterbury bells",
"great masterwort",
"sweet william",
"clematis",
"purple coneflower",
"japanese anemone",
"bird of paradise",
"rose",
"grape hyacinth"
] |
nandyc/swin-tiny-patch4-window7-224-finetuned_ASL_Isolated_Swin_dataset2
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned_ASL_Isolated_Swin_dataset2
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the ASL_Isolated_Swin_dataset dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1269
- Accuracy: 0.9769
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.5439 | 1.09 | 100 | 1.4188 | 0.5538 |
| 0.8646 | 2.17 | 200 | 0.4542 | 0.8885 |
| 0.5485 | 3.26 | 300 | 0.4103 | 0.8538 |
| 0.5082 | 4.35 | 400 | 0.2925 | 0.8962 |
| 0.5302 | 5.43 | 500 | 0.2471 | 0.9269 |
| 0.4072 | 6.52 | 600 | 0.2676 | 0.9231 |
| 0.4424 | 7.61 | 700 | 0.4150 | 0.9038 |
| 0.3409 | 8.7 | 800 | 0.1922 | 0.9538 |
| 0.3046 | 9.78 | 900 | 0.1917 | 0.9462 |
| 0.2911 | 10.87 | 1000 | 0.2272 | 0.9423 |
| 0.269 | 11.96 | 1100 | 0.0722 | 0.9692 |
| 0.3709 | 13.04 | 1200 | 0.1473 | 0.9654 |
| 0.3443 | 14.13 | 1300 | 0.1545 | 0.9615 |
| 0.187 | 15.22 | 1400 | 0.1060 | 0.9731 |
| 0.1879 | 16.3 | 1500 | 0.1124 | 0.9692 |
| 0.2183 | 17.39 | 1600 | 0.1377 | 0.9615 |
| 0.1478 | 18.48 | 1700 | 0.1269 | 0.9769 |
| 0.1944 | 19.57 | 1800 | 0.0909 | 0.9769 |
### Framework versions
- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.14.1
|
[
"a",
"b",
"c",
"d",
"e",
"f",
"g",
"h",
"i",
"j",
"k",
"l",
"m",
"n",
"o",
"p",
"q",
"r",
"s",
"t",
"u",
"v",
"w",
"x",
"y",
"z"
] |
superdinmc/autotrain-orbit-millets-2-94372146064
|
# Model Trained Using AutoTrain
- Problem type: Multi-class Classification
- Model ID: 94372146064
- CO2 Emissions (in grams): 0.0330
## Validation Metrics
- Loss: 1.710
- Accuracy: 0.327
- Macro F1: 0.262
- Micro F1: 0.327
- Weighted F1: 0.309
- Macro Precision: 0.277
- Micro Precision: 0.327
- Weighted Precision: 0.312
- Macro Recall: 0.270
- Micro Recall: 0.327
- Weighted Recall: 0.327
|
[
"bajra",
"barri",
"jhangora",
"jowar",
"kangni",
"kodra",
"ragi"
] |
EscvNcl/ConvNext-V2-Retinopathy
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ConvNext-V2-Retinopathy
This model is a fine-tuned version of [syedmuhammad/ConvNextV2-Diabetec-Retinopathy](https://huggingface.co/syedmuhammad/ConvNextV2-Diabetec-Retinopathy) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0219
- Accuracy: 0.9901
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 15
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.125 | 1.0 | 113 | 0.0339 | 0.9901 |
| 0.2206 | 2.0 | 227 | 0.0139 | 0.9901 |
| 0.1751 | 3.0 | 340 | 0.0114 | 0.9950 |
| 0.0599 | 4.0 | 454 | 0.0277 | 0.9950 |
| 0.1122 | 5.0 | 567 | 0.0328 | 0.9950 |
| 0.093 | 6.0 | 681 | 0.0240 | 0.9901 |
| 0.0673 | 7.0 | 794 | 0.0251 | 0.9950 |
| 0.0718 | 8.0 | 908 | 0.0458 | 0.9851 |
| 0.0632 | 9.0 | 1021 | 0.0477 | 0.9901 |
| 0.0263 | 10.0 | 1135 | 0.0399 | 0.9950 |
| 0.0304 | 11.0 | 1248 | 0.0295 | 0.9901 |
| 0.0892 | 12.0 | 1362 | 0.0330 | 0.9950 |
| 0.0227 | 13.0 | 1475 | 0.0287 | 0.9901 |
| 0.0253 | 14.0 | 1589 | 0.0262 | 0.9901 |
| 0.1242 | 14.93 | 1695 | 0.0219 | 0.9901 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
|
[
"nrdr",
"rdr"
] |
wang1215/my_awesome_food_model
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_food_model
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the food101 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6235
- Accuracy: 0.892
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.7037 | 0.99 | 62 | 2.5304 | 0.832 |
| 1.8523 | 2.0 | 125 | 1.8095 | 0.865 |
| 1.5914 | 2.98 | 186 | 1.6235 | 0.892 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.0
- Datasets 2.14.5
- Tokenizers 0.14.1
|
[
"apple_pie",
"baby_back_ribs",
"bruschetta",
"waffles",
"caesar_salad",
"cannoli",
"caprese_salad",
"carrot_cake",
"ceviche",
"cheesecake",
"cheese_plate",
"chicken_curry",
"chicken_quesadilla",
"baklava",
"chicken_wings",
"chocolate_cake",
"chocolate_mousse",
"churros",
"clam_chowder",
"club_sandwich",
"crab_cakes",
"creme_brulee",
"croque_madame",
"cup_cakes",
"beef_carpaccio",
"deviled_eggs",
"donuts",
"dumplings",
"edamame",
"eggs_benedict",
"escargots",
"falafel",
"filet_mignon",
"fish_and_chips",
"foie_gras",
"beef_tartare",
"french_fries",
"french_onion_soup",
"french_toast",
"fried_calamari",
"fried_rice",
"frozen_yogurt",
"garlic_bread",
"gnocchi",
"greek_salad",
"grilled_cheese_sandwich",
"beet_salad",
"grilled_salmon",
"guacamole",
"gyoza",
"hamburger",
"hot_and_sour_soup",
"hot_dog",
"huevos_rancheros",
"hummus",
"ice_cream",
"lasagna",
"beignets",
"lobster_bisque",
"lobster_roll_sandwich",
"macaroni_and_cheese",
"macarons",
"miso_soup",
"mussels",
"nachos",
"omelette",
"onion_rings",
"oysters",
"bibimbap",
"pad_thai",
"paella",
"pancakes",
"panna_cotta",
"peking_duck",
"pho",
"pizza",
"pork_chop",
"poutine",
"prime_rib",
"bread_pudding",
"pulled_pork_sandwich",
"ramen",
"ravioli",
"red_velvet_cake",
"risotto",
"samosa",
"sashimi",
"scallops",
"seaweed_salad",
"shrimp_and_grits",
"breakfast_burrito",
"spaghetti_bolognese",
"spaghetti_carbonara",
"spring_rolls",
"steak",
"strawberry_shortcake",
"sushi",
"tacos",
"takoyaki",
"tiramisu",
"tuna_tartare"
] |
andriydovgal/mvp_flowers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mvp_flowers
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.0181
- Accuracy: 0.907
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 3.8355 | 0.99 | 62 | 3.7493 | 0.711 |
| 3.2592 | 2.0 | 125 | 3.1841 | 0.886 |
| 2.9952 | 2.98 | 186 | 3.0181 | 0.907 |
### Framework versions
- Transformers 4.33.2
- Pytorch 2.1.0+cu121
- Datasets 2.14.5
- Tokenizers 0.13.3
|
[
"1",
"10",
"16",
"98",
"99",
"17",
"18",
"19",
"2",
"20",
"21",
"22",
"23",
"24",
"100",
"25",
"26",
"27",
"28",
"29",
"3",
"30",
"31",
"32",
"33",
"101",
"34",
"35",
"36",
"37",
"38",
"39",
"4",
"40",
"41",
"42",
"102",
"43",
"44",
"45",
"46",
"47",
"48",
"49",
"5",
"50",
"51",
"11",
"52",
"53",
"54",
"55",
"56",
"57",
"58",
"59",
"6",
"60",
"12",
"61",
"62",
"63",
"64",
"65",
"66",
"67",
"68",
"69",
"7",
"13",
"70",
"71",
"72",
"73",
"74",
"75",
"76",
"77",
"78",
"79",
"14",
"8",
"80",
"81",
"82",
"83",
"84",
"85",
"86",
"87",
"88",
"15",
"89",
"9",
"90",
"91",
"92",
"93",
"94",
"95",
"96",
"97"
] |
galbitang/autotrain-sofa_style_classification-94412146080
|
# Model Trained Using AutoTrain
- Problem type: Multi-class Classification
- Model ID: 94412146080
- CO2 Emissions (in grams): 3.4207
## Validation Metrics
- Loss: 0.863
- Accuracy: 0.722
- Macro F1: 0.660
- Micro F1: 0.722
- Weighted F1: 0.711
- Macro Precision: 0.720
- Micro Precision: 0.722
- Weighted Precision: 0.735
- Macro Recall: 0.667
- Micro Recall: 0.722
- Weighted Recall: 0.722
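## Usage
A usage sketch in the same style as the other AutoTrain cards in this dump (the image path is hypothetical):
```
from transformers import pipeline

p = pipeline("image-classification", model="galbitang/autotrain-sofa_style_classification-94412146080")
result = p("sofa.jpg")  # hypothetical input image
```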
|
[
"natural",
"lovelyromantic",
"koreanasia",
"modern",
"minimalsimple",
"northerneurope",
"vintageretro",
"unique",
"industrial",
"classicantique",
"frenchprovence"
] |
galbitang/autotrain-bed_frame_style_classification-94482146114
|
# Model Trained Using AutoTrain
- Problem type: Multi-class Classification
- Model ID: 94482146114
- CO2 Emissions (in grams): 0.0920
## Validation Metrics
- Loss: 0.544
- Accuracy: 0.824
- Macro F1: 0.820
- Micro F1: 0.824
- Weighted F1: 0.822
- Macro Precision: 0.829
- Micro Precision: 0.824
- Weighted Precision: 0.825
- Macro Recall: 0.816
- Micro Recall: 0.824
- Weighted Recall: 0.824
|
[
"natural",
"lovelyromantic",
"koreanasia",
"modern",
"minimalsimple",
"northerneurope",
"vintageretro",
"unique",
"industrial",
"classicantique",
"frenchprovence"
] |
galbitang/autotrain-chair_style_classification-94502146123
|
# Model Trained Using AutoTrain
- Problem type: Multi-class Classification
- Model ID: 94502146123
- CO2 Emissions (in grams): 0.0613
## Validation Metrics
- Loss: 0.736
- Accuracy: 0.760
- Macro F1: 0.688
- Micro F1: 0.760
- Weighted F1: 0.753
- Macro Precision: 0.764
- Micro Precision: 0.760
- Weighted Precision: 0.766
- Macro Recall: 0.671
- Micro Recall: 0.760
- Weighted Recall: 0.760
|
[
"natural",
"lovelyromantic",
"koreanasia",
"modern",
"minimalsimple",
"northerneurope",
"vintageretro",
"unique",
"industrial",
"classicantique",
"frenchprovence"
] |
hongerzh/my_NFT_sale_classifier
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_NFT_sale_classifier
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6323
- Accuracy: 0.6560
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6234 | 1.0 | 112 | 0.6335 | 0.6565 |
| 0.6077 | 2.0 | 225 | 0.6335 | 0.6583 |
| 0.5896 | 2.99 | 336 | 0.6323 | 0.6560 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1
- Datasets 2.14.5
- Tokenizers 0.14.0
|
[
"notsale",
"sale"
] |
Tushar86/yolo-testing
|
# Usage
```
from transformers import pipeline
p = pipeline("image-classification", model="juliensimon/autotrain-food101-1471154053")
result = p("my_image.jpg")
```
# Model Trained Using AutoTrain
- Problem type: Multi-class Classification
- Model ID: 1471154053
- CO2 Emissions (in grams): 179.1154
## Validation Metrics
- Loss: 0.301
- Accuracy: 0.915
- Macro F1: 0.915
- Micro F1: 0.915
- Weighted F1: 0.915
- Macro Precision: 0.917
- Micro Precision: 0.915
- Weighted Precision: 0.917
- Macro Recall: 0.915
- Micro Recall: 0.915
- Weighted Recall: 0.915
|
[
"apple_pie",
"baby_back_ribs",
"bruschetta",
"waffles",
"caesar_salad",
"cannoli",
"caprese_salad",
"carrot_cake",
"ceviche",
"cheese_plate",
"cheesecake",
"chicken_curry",
"chicken_quesadilla",
"baklava",
"chicken_wings",
"chocolate_cake",
"chocolate_mousse",
"churros",
"clam_chowder",
"club_sandwich",
"crab_cakes",
"creme_brulee",
"croque_madame",
"cup_cakes",
"beef_carpaccio",
"deviled_eggs",
"donuts",
"dumplings",
"edamame",
"eggs_benedict",
"escargots",
"falafel",
"filet_mignon",
"fish_and_chips",
"foie_gras",
"beef_tartare",
"french_fries",
"french_onion_soup",
"french_toast",
"fried_calamari",
"fried_rice",
"frozen_yogurt",
"garlic_bread",
"gnocchi",
"greek_salad",
"grilled_cheese_sandwich",
"beet_salad",
"grilled_salmon",
"guacamole",
"gyoza",
"hamburger",
"hot_and_sour_soup",
"hot_dog",
"huevos_rancheros",
"hummus",
"ice_cream",
"lasagna",
"beignets",
"lobster_bisque",
"lobster_roll_sandwich",
"macaroni_and_cheese",
"macarons",
"miso_soup",
"mussels",
"nachos",
"omelette",
"onion_rings",
"oysters",
"bibimbap",
"pad_thai",
"paella",
"pancakes",
"panna_cotta",
"peking_duck",
"pho",
"pizza",
"pork_chop",
"poutine",
"prime_rib",
"bread_pudding",
"pulled_pork_sandwich",
"ramen",
"ravioli",
"red_velvet_cake",
"risotto",
"samosa",
"sashimi",
"scallops",
"seaweed_salad",
"shrimp_and_grits",
"breakfast_burrito",
"spaghetti_bolognese",
"spaghetti_carbonara",
"spring_rolls",
"steak",
"strawberry_shortcake",
"sushi",
"tacos",
"takoyaki",
"tiramisu",
"tuna_tartare"
] |
galbitang/autotrain-table_style_classification2-94510146124
|
# Model Trained Using AutoTrain
- Problem type: Multi-class Classification
- Model ID: 94510146124
- CO2 Emissions (in grams): 0.0786
## Validation Metrics
- Loss: 0.806
- Accuracy: 0.766
- Macro F1: 0.683
- Micro F1: 0.766
- Weighted F1: 0.750
- Macro Precision: 0.710
- Micro Precision: 0.766
- Weighted Precision: 0.744
- Macro Recall: 0.676
- Micro Recall: 0.766
- Weighted Recall: 0.766
|
[
"natural",
"lovelyromantic",
"koreanasia",
"modern",
"minimalsimple",
"northerneurope",
"vintageretro",
"unique",
"industrial",
"classicantique",
"frenchprovence"
] |
fengdavid/my_awesome_model
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_model
This model is a fine-tuned version of [distbert](https://huggingface.co/distbert) on the blala dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1908
- Accuracy: 0.929
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 391 | 0.1959 | 0.9243 |
| 0.2489 | 2.0 | 782 | 0.1908 | 0.929 |
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3
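## Usage
Since this is a binary positive/negative classifier, inference would presumably go through the text-classification pipeline (a hedged sketch; the input sentence is made up):
```
from transformers import pipeline

clf = pipeline("text-classification", model="fengdavid/my_awesome_model")
print(clf("This was a great movie."))  # hypothetical input text
```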
|
[
"negative",
"positive"
] |
lucascruz/CheXpert-ViT-U-SelfTrained
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# CheXpert-ViT-U-SelfTrained
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 64
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 5
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.14.1
|
[
"label_0",
"label_1",
"label_2",
"label_3",
"label_4"
] |
hyeongjin99/resnet_50_base_aihub_model_py
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet_50_base_aihub_model_py
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0987
- Accuracy: 0.9681
- Precision: 0.9712
- Recall: 0.9624
- F1: 0.9667
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 0.5577 | 1.0 | 149 | 0.4027 | 0.8453 | 0.8514 | 0.8415 | 0.8435 |
| 0.323 | 2.0 | 299 | 0.2346 | 0.9097 | 0.9208 | 0.8962 | 0.9074 |
| 0.2467 | 3.0 | 448 | 0.1786 | 0.9303 | 0.9465 | 0.9216 | 0.9326 |
| 0.1953 | 4.0 | 598 | 0.1266 | 0.9573 | 0.9591 | 0.9483 | 0.9535 |
| 0.1456 | 4.98 | 745 | 0.0987 | 0.9681 | 0.9712 | 0.9624 | 0.9667 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3
|
[
"cloudy",
"normal",
"rainy",
"snowy"
] |
ombhojane/healthyPlantsModel
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# healthy-plant-disease-identification
This model is a fine-tuned version of [google/mobilenet_v2_1.0_224](https://huggingface.co/google/mobilenet_v2_1.0_224) on the [Kaggle version](https://www.kaggle.com/datasets/vipoooool/new-plant-diseases-dataset) of the [Plant Village dataset](https://github.com/spMohanty/PlantVillage-Dataset).
It achieves the following results on the evaluation set:
- Cross Entropy Loss: 0.15
- Accuracy: 0.9541
## Intended uses & limitations
For identifying common diseases in crops and assessing plant health. Not to be used as a replacement for an actual diagnosis from experts.
## Training and evaluation data
The Plant Village dataset consists of 38 classes of diseases in common crops (including healthy/normal crops).
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-5
- train_batch_size: 256
- eval_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 6
### Framework versions
- Transformers 4.27.3
- Pytorch 1.13.0
- Datasets 2.1.0
- Tokenizers 0.13.2
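## Usage
A minimal inference sketch, assuming the standard image-classification pipeline (the leaf image path is hypothetical):
```
from PIL import Image
from transformers import pipeline

p = pipeline("image-classification", model="ombhojane/healthyPlantsModel")
image = Image.open("leaf.jpg")  # hypothetical crop-leaf photo
print(p(image))
```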
|
[
"apple scab",
"apple with black rot",
"cedar apple rust",
"healthy apple",
"healthy blueberry plant",
"cherry with powdery mildew",
"healthy cherry plant",
"corn (maize) with cercospora and gray leaf spot",
"corn (maize) with common rust",
"corn (maize) with northern leaf blight",
"healthy corn (maize) plant",
"grape with black rot",
"grape with esca (black measles)",
"grape with isariopsis leaf spot",
"healthy grape plant",
"orange with citrus greening",
"peach with bacterial spot",
"healthy peach plant",
"bell pepper with bacterial spot",
"healthy bell pepper plant",
"potato with early blight",
"potato with late blight",
"healthy potato plant",
"healthy raspberry plant",
"healthy soybean plant",
"squash with powdery mildew",
"strawberry with leaf scorch",
"healthy strawberry plant",
"tomato with bacterial spot",
"tomato with early blight",
"tomato with late blight",
"tomato with leaf mold",
"tomato with septoria leaf spot",
"tomato with spider mites or two-spotted spider mite",
"tomato with target spot",
"tomato yellow leaf curl virus",
"tomato mosaic virus",
"healthy tomato plant"
] |
merve/beans-vit-224
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# beans-vit-224
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3256
- Accuracy: 0.9375
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.0032 | 0.98 | 16 | 0.6540 | 0.8828 |
| 0.4711 | 1.97 | 32 | 0.4180 | 0.9297 |
| 0.3711 | 2.95 | 48 | 0.3256 | 0.9375 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
|
[
"angular_leaf_spot",
"bean_rust",
"healthy"
] |
platzi/platzi-vit-model_JPLC
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# platzi-vit-model_JPLC
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0400
- Accuracy: 0.9850
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1287 | 3.85 | 500 | 0.0400 | 0.9850 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
|
[
"angular_leaf_spot",
"bean_rust",
"healthy"
] |
dima806/surface_crack_image_detection
|
Checks whether there is a surface crack in a given surface image.
See https://www.kaggle.com/code/dima806/surface-crack-image-detection-vit for more details.
```
Classification report:
precision recall f1-score support
Positive 0.9988 0.9995 0.9991 4000
Negative 0.9995 0.9988 0.9991 4000
accuracy 0.9991 8000
macro avg 0.9991 0.9991 0.9991 8000
weighted avg 0.9991 0.9991 0.9991 8000
```
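A lower-level inference sketch with the Auto classes (a hedged example; the image path is hypothetical):
```
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

processor = AutoImageProcessor.from_pretrained("dima806/surface_crack_image_detection")
model = AutoModelForImageClassification.from_pretrained("dima806/surface_crack_image_detection")

image = Image.open("surface.jpg")  # hypothetical surface photo
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```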
|
[
"positive",
"negative"
] |
romitbarua/autotrain-deepfakeface_only_faces-94737146192
|
# Model Trained Using AutoTrain
- Problem type: Image Classification
- CO2 Emissions (in grams): 33.9961
## Validation Metrics
- Loss: 0.479949951171875
- Macro F1: 0.813704872275267
- Micro F1: 0.8058823529411765
- Weighted F1: 0.8066289996092723
- Macro Precision: 0.8195022394186146
- Micro Precision: 0.8058823529411765
- Weighted Precision: 0.808552105878264
- Macro Recall: 0.8091827901264657
- Micro Recall: 0.8058823529411765
- Weighted Recall: 0.8058823529411765
- Accuracy: 0.8058823529411765
|
[
"inpainting",
"insight",
"real",
"text2img"
] |
sweetzinc/swin-tiny-patch4-window7-224-finetuned-eurosat
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-eurosat
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1113
- Accuracy: 0.9641
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1937 | 1.0 | 190 | 0.1113 | 0.9641 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.1.0+cpu
- Datasets 2.14.5
- Tokenizers 0.14.1
|
[
"annualcrop",
"forest",
"herbaceousvegetation",
"highway",
"industrial",
"pasture",
"permanentcrop",
"residential",
"river",
"sealake"
] |
dima806/buscuit_wrappers_image_detection
|
Returns the biscuit wrapper type for a given image with about 93% accuracy.
See https://www.kaggle.com/code/dima806/biscuit-wrappers-image-detection-vit for more details.
```
Classification report:
precision recall f1-score support
Americana Coconut Cookies 0.9677 0.9677 0.9677 31
Amul Chocolate Cookies 0.9688 1.0000 0.9841 31
Amul Elaichi Rusk 0.9143 1.0000 0.9552 32
Bhagwati Choco Vanilla Puff Biscuits 1.0000 1.0000 1.0000 32
Bhagwati Lemony Puff Biscuits 1.0000 1.0000 1.0000 31
Bisk Farm Sugar Free Biscuits 0.9688 1.0000 0.9841 31
Bonn Jeera Bite Biscuits 1.0000 1.0000 1.0000 31
Britannia 50-50 Maska Chaska 0.8750 0.4516 0.5957 31
Britannia 50-50 Maska Chaska Salted Biscuits 0.5111 0.7419 0.6053 31
Britannia 50-50 Potazos - Masti Masala 1.0000 1.0000 1.0000 31
Britannia 50-50 Sweet and Salty Biscuits 1.0000 0.8387 0.9123 31
Britannia 50-50 Timepass Classic Salted Biscuit 1.0000 1.0000 1.0000 31
Britannia Biscafe Coffee Cracker 0.8333 0.6452 0.7273 31
Britannia Bourbon 1.0000 0.8710 0.9310 31
Britannia Bourbon The Original Cream Biscuits 0.8889 1.0000 0.9412 32
Britannia Chocolush - Pure Magic 0.7045 1.0000 0.8267 31
Britannia Good Day - Chocochip Cookies 1.0000 0.9677 0.9836 31
Britannia Good Day Cashew Almond Cookies 0.6944 0.8065 0.7463 31
Britannia Good Day Harmony Biscuit 1.0000 0.7812 0.8772 32
Britannia Good Day Pista Badam Cookies 0.8378 1.0000 0.9118 31
Britannia Little Hearts 0.9688 1.0000 0.9841 31
Britannia Marie Gold Biscuit 1.0000 0.9688 0.9841 32
Britannia Milk Bikis Milk Biscuits 0.7381 1.0000 0.8493 31
Britannia Nice Time - Coconut Biscuits 0.8889 1.0000 0.9412 32
Britannia Nutri Choice Oats Cookies - Chocolate and Almonds 0.7500 0.8710 0.8060 31
Britannia Nutri Choice Oats Cookies - Orange With Almonds 1.0000 0.7097 0.8302 31
Britannia Nutri Choice Seed Biscuits 1.0000 0.9032 0.9492 31
Britannia Nutri Choice Sugar Free Cream Cracker Biscuits 1.0000 1.0000 1.0000 31
Britannia Nutrichoice Herbs Biscuits 1.0000 1.0000 1.0000 31
Britannia Tiger Glucose Biscuit 0.9667 0.9355 0.9508 31
Britannia Tiger Kreemz - Chocolate Cream Biscuits 0.9091 0.9375 0.9231 32
Britannia Tiger Kreemz - Elaichi Cream Biscuits 0.9688 1.0000 0.9841 31
Britannia Tiger Kreemz - Orange Cream Biscuits 0.8889 0.7742 0.8276 31
Britannia Tiger Krunch Chocochips Biscuit 0.8710 0.8710 0.8710 31
Britannia Treat Chocolate Cream Biscuits 1.0000 0.9032 0.9492 31
Britannia Treat Crazy Pineapple Cream Biscuit 0.9697 1.0000 0.9846 32
Britannia Treat Jim Jam Cream Biscuit 1.0000 1.0000 1.0000 31
Britannia Treat Osom Orange Cream Biscuit 0.9667 0.9355 0.9508 31
Britannia Vita Marie Gold Biscuits 1.0000 1.0000 1.0000 31
Cadbury Bournvita Biscuits 0.9667 0.9062 0.9355 32
Cadbury Chocobakes Choc Filled Cookies 1.0000 1.0000 1.0000 32
Cadbury Oreo Chocolate Flavour Biscuit Cream Sandwich 1.0000 0.8065 0.8929 31
Cadbury Oreo Strawberry Flavour Creme Sandwich Biscuit 1.0000 0.9677 0.9836 31
Canberra Big Orange Cream Biscuits 1.0000 0.8125 0.8966 32
CookieMan Hand Pound Chocolate Cookies 0.9394 1.0000 0.9688 31
Cremica Coconut Cookies 1.0000 1.0000 1.0000 31
Cremica Elaichi Sandwich Biscuits 1.0000 1.0000 1.0000 31
Cremica Jeera Lite 1.0000 0.9677 0.9836 31
Cremica Non-Stop Thin Potato Crackers - Baked, Crunchy Masala 1.0000 0.9355 0.9667 31
Cremica Orange Sandwich Biscuits 1.0000 0.8710 0.9310 31
Krown Black Magic Cream Biscuits 0.9655 0.9032 0.9333 31
MARIO Coconut Crunchy Biscuits 0.8378 1.0000 0.9118 31
McVities Bourbon Cream Biscuits 0.9688 0.9688 0.9688 32
McVities Dark Cookie Cream 1.0000 0.8065 0.8929 31
McVities Marie Biscuit 0.8710 0.8710 0.8710 31
Parle 20-20 Cashew Cookies 1.0000 1.0000 1.0000 32
Parle 20-20 Nice Biscuits 1.0000 1.0000 1.0000 32
Parle Happy Happy Choco-Chip Cookies 0.9394 1.0000 0.9688 31
Parle Hide and Seek 0.9333 0.9032 0.9180 31
Parle Hide and Seek - Black Bourbon Choco 0.9032 0.9032 0.9032 31
Parle Hide and Seek - Milano Choco Chip Cookies 1.0000 0.9677 0.9836 31
Parle Hide and Seek Caffe Mocha Cookies 0.9565 0.7097 0.8148 31
Parle Hide and Seek Chocolate and Almonds 0.9655 0.8750 0.9180 32
Parle Krack Jack Original Sweet and Salty Cracker Biscuit 0.9333 0.9032 0.9180 31
Parle Krackjack Biscuits 0.9643 0.8710 0.9153 31
Parle Magix Sandwich Biscuits - Chocolate 0.9375 0.9677 0.9524 31
Parle Milk Shakti Biscuits 0.9091 0.9677 0.9375 31
Parle Monaco Biscuit - Classic Regular 1.0000 0.9688 0.9841 32
Parle Monaco Piri Piri 1.0000 0.9062 0.9508 32
Parle Platina Hide and Seek Creme Sandwich - Vanilla 0.9412 1.0000 0.9697 32
Parle-G Gold Gluco Biscuits 0.9677 0.9677 0.9677 31
Parle-G Original Gluco Biscuits 1.0000 0.9677 0.9836 31
Patanjali Doodh Biscuit 1.0000 0.9688 0.9841 32
Priyagold Butter Delite Biscuits 1.0000 1.0000 1.0000 31
Priyagold CNC Biscuits 1.0000 0.8065 0.8929 31
Priyagold Cheese Chacker Biscuits 0.9333 0.9032 0.9180 31
Priyagold Snacks Zig Zag Biscuits 0.9688 1.0000 0.9841 31
Richlite Rich Butter Cookies 0.9688 1.0000 0.9841 31
RiteBite Max Protein 7 Grain Breakfast Cookies - Cashew Delite 1.0000 1.0000 1.0000 31
Sagar Coconut Munch Biscuits 1.0000 1.0000 1.0000 31
Sri Sri Tattva Cashew Nut Cookies 1.0000 1.0000 1.0000 31
Sri Sri Tattva Choco Hazelnut Cookies 0.8056 0.9355 0.8657 31
Sri Sri Tattva Coconut Cookies 0.8378 1.0000 0.9118 31
Sri Sri Tattva Digestive Cookies 1.0000 0.8710 0.9310 31
Sunfeast All Rounder - Cream and Herb 1.0000 0.9355 0.9667 31
Sunfeast All Rounder - Thin, Light and Crunchy Potato Biscuit With Chatpata Masala Flavour 1.0000 0.8387 0.9123 31
Sunfeast Bounce Creme Biscuits 0.9259 0.8065 0.8621 31
Sunfeast Bounce Creme Biscuits - Elaichi 0.7949 1.0000 0.8857 31
Sunfeast Bounce Creme Biscuits - Pineapple Zing 0.7949 1.0000 0.8857 31
Sunfeast Dark Fantasy - Choco Creme 0.7949 1.0000 0.8857 31
Sunfeast Dark Fantasy Bourbon Biscuits 0.6889 1.0000 0.8158 31
Sunfeast Dark Fantasy Choco Fills 1.0000 0.8065 0.8929 31
Sunfeast Glucose Biscuits 0.9310 0.8710 0.9000 31
Sunfeast Moms Magic - Fruit and Milk Cookies 0.8158 1.0000 0.8986 31
Sunfeast Moms Magic - Rich Butter Cookies 1.0000 0.9677 0.9836 31
Sunfeast Moms Magic - Rich Cashew and Almond Cookies 1.0000 0.9062 0.9508 32
Tasties Chocochip Cookies 1.0000 1.0000 1.0000 31
Tasties Coconut Cookies 1.0000 0.8750 0.9333 32
UNIBIC Choco Chip Cookies 0.8333 0.9677 0.8955 31
UNIBIC Pista Badam Cookies 0.8857 1.0000 0.9394 31
UNIBIC Snappers Potato Crackers 0.9667 0.9355 0.9508 31
accuracy 0.9305 3152
macro avg 0.9396 0.9304 0.9306 3152
weighted avg 0.9398 0.9305 0.9307 3152
```
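For reference, each f1-score above is the harmonic mean of that row's precision and recall. A quick sketch verifying one row (Cadbury Bournvita Biscuits), assuming the report rounds to four decimals:
```python
# Verify f1 = 2PR / (P + R) for the Cadbury Bournvita Biscuits row.
precision, recall = 0.9667, 0.9062
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.9355, matching the report
```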
|
[
"americana coconut cookies",
"amul chocolate cookies",
"amul elaichi rusk",
"bhagwati choco vanilla puff biscuits",
"bhagwati lemony puff biscuits",
"bisk farm sugar free biscuits",
"bonn jeera bite biscuits",
"britannia 50-50 maska chaska",
"britannia 50-50 maska chaska salted biscuits",
"britannia 50-50 potazos - masti masala",
"britannia 50-50 sweet and salty biscuits",
"britannia 50-50 timepass classic salted biscuit",
"britannia biscafe coffee cracker",
"britannia bourbon",
"britannia bourbon the original cream biscuits",
"britannia chocolush - pure magic",
"britannia good day - chocochip cookies",
"britannia good day cashew almond cookies",
"britannia good day harmony biscuit",
"britannia good day pista badam cookies",
"britannia little hearts",
"britannia marie gold biscuit",
"britannia milk bikis milk biscuits",
"britannia nice time - coconut biscuits",
"britannia nutri choice oats cookies - chocolate and almonds",
"britannia nutri choice oats cookies - orange with almonds",
"britannia nutri choice seed biscuits",
"britannia nutri choice sugar free cream cracker biscuits",
"britannia nutrichoice herbs biscuits",
"britannia tiger glucose biscuit",
"britannia tiger kreemz - chocolate cream biscuits",
"britannia tiger kreemz - elaichi cream biscuits",
"britannia tiger kreemz - orange cream biscuits",
"britannia tiger krunch chocochips biscuit",
"britannia treat chocolate cream biscuits",
"britannia treat crazy pineapple cream biscuit",
"britannia treat jim jam cream biscuit",
"britannia treat osom orange cream biscuit",
"britannia vita marie gold biscuits",
"cadbury bournvita biscuits",
"cadbury chocobakes choc filled cookies",
"cadbury oreo chocolate flavour biscuit cream sandwich",
"cadbury oreo strawberry flavour creme sandwich biscuit",
"canberra big orange cream biscuits",
"cookieman hand pound chocolate cookies",
"cremica coconut cookies",
"cremica elaichi sandwich biscuits",
"cremica jeera lite",
"cremica non-stop thin potato crackers - baked, crunchy masala",
"cremica orange sandwich biscuits",
"krown black magic cream biscuits",
"mario coconut crunchy biscuits",
"mcvities bourbon cream biscuits",
"mcvities dark cookie cream",
"mcvities marie biscuit",
"parle 20-20 cashew cookies",
"parle 20-20 nice biscuits",
"parle happy happy choco-chip cookies",
"parle hide and seek",
"parle hide and seek - black bourbon choco",
"parle hide and seek - milano choco chip cookies",
"parle hide and seek caffe mocha cookies",
"parle hide and seek chocolate and almonds",
"parle krack jack original sweet and salty cracker biscuit",
"parle krackjack biscuits",
"parle magix sandwich biscuits - chocolate",
"parle milk shakti biscuits",
"parle monaco biscuit - classic regular",
"parle monaco piri piri",
"parle platina hide and seek creme sandwich - vanilla",
"parle-g gold gluco biscuits",
"parle-g original gluco biscuits",
"patanjali doodh biscuit",
"priyagold butter delite biscuits",
"priyagold cnc biscuits",
"priyagold cheese chacker biscuits",
"priyagold snacks zig zag biscuits",
"richlite rich butter cookies",
"ritebite max protein 7 grain breakfast cookies - cashew delite",
"sagar coconut munch biscuits",
"sri sri tattva cashew nut cookies",
"sri sri tattva choco hazelnut cookies",
"sri sri tattva coconut cookies",
"sri sri tattva digestive cookies",
"sunfeast all rounder - cream and herb",
"sunfeast all rounder - thin, light and crunchy potato biscuit with chatpata masala flavour",
"sunfeast bounce creme biscuits",
"sunfeast bounce creme biscuits - elaichi",
"sunfeast bounce creme biscuits - pineapple zing",
"sunfeast dark fantasy - choco creme",
"sunfeast dark fantasy bourbon biscuits",
"sunfeast dark fantasy choco fills",
"sunfeast glucose biscuits",
"sunfeast moms magic - fruit and milk cookies",
"sunfeast moms magic - rich butter cookies",
"sunfeast moms magic - rich cashew and almond cookies",
"tasties chocochip cookies",
"tasties coconut cookies",
"unibic choco chip cookies",
"unibic pista badam cookies",
"unibic snappers potato crackers"
] |
ericrong888/logo_classifier
|
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# ericrong888/logo_classifier
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.7196
- Validation Loss: 0.8069
- Train Accuracy: 1.0
- Epoch: 4
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 75, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 1.1054 | 1.0410 | 0.8333 | 0 |
| 0.9869 | 0.9692 | 0.8333 | 1 |
| 0.8856 | 0.9035 | 1.0 | 2 |
| 0.8117 | 0.8585 | 1.0 | 3 |
| 0.7196 | 0.8069 | 1.0 | 4 |
### Framework versions
- Transformers 4.34.1
- TensorFlow 2.14.0
- Datasets 2.14.6
- Tokenizers 0.14.1
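A minimal TensorFlow inference sketch, assuming the checkpoint is loadable from the Hub and `logo.png` is a hypothetical local image:
```python
import tensorflow as tf
from PIL import Image
from transformers import AutoImageProcessor, TFViTForImageClassification

# Load the fine-tuned ViT checkpoint and its matching image processor.
processor = AutoImageProcessor.from_pretrained("ericrong888/logo_classifier")
model = TFViTForImageClassification.from_pretrained("ericrong888/logo_classifier")

image = Image.open("logo.png")  # hypothetical input image
inputs = processor(images=image, return_tensors="tf")
logits = model(**inputs).logits

# Map the top logit back to a label (amazon / starbucks / wellsfargo).
predicted = int(tf.math.argmax(logits, axis=-1)[0])
print(model.config.id2label[predicted])
```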
|
[
"amazon",
"starbucks",
"wellsfargo"
] |
wangyk22/histoSwin-base
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# histoSwin-base
This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the nctcrche100_k dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1260
- Accuracy: 0.9709
## Model description
More information needed
## Intended uses & limitations
For TMA (tissue microarray) classification.
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
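These settings map directly onto the standard `TrainingArguments`; a minimal sketch of the equivalent configuration (the Adam betas and epsilon listed above are the Trainer defaults, and the output directory is hypothetical):
```python
from transformers import TrainingArguments

# Reconstruction of the hyperparameters listed above; the effective
# train batch size is 32 * 2 gradient-accumulation steps = 64.
training_args = TrainingArguments(
    output_dir="histoSwin-base",  # hypothetical output directory
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=2,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=5,
)
```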
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0386 | 1.0 | 1562 | 0.1740 | 0.9592 |
| 0.05 | 2.0 | 3125 | 0.2313 | 0.9524 |
| 0.0324 | 3.0 | 4687 | 0.2606 | 0.9604 |
| 0.005 | 4.0 | 6250 | 0.1724 | 0.9660 |
| 0.0037 | 5.0 | 7810 | 0.1260 | 0.9709 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.0.1
- Datasets 2.14.5
- Tokenizers 0.13.3
|
[
"str",
"mus",
"adi",
"tum",
"deb",
"lym",
"back",
"muc",
"norm"
] |
dima806/beard_face_image_detection
|
Predicts the presence of a beard given a facial image.
See https://www.kaggle.com/code/dima806/beard-face-image-detection-vit for more details.
```
Classification report:
precision recall f1-score support
Beard 1.0000 1.0000 1.0000 34
No Beard 1.0000 1.0000 1.0000 34
accuracy 1.0000 68
macro avg 1.0000 1.0000 1.0000 68
weighted avg 1.0000 1.0000 1.0000 68
```
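A minimal usage sketch with the `transformers` pipeline API (`face.jpg` is a hypothetical input path):
```python
from transformers import pipeline

# Image-classification pipeline over the fine-tuned checkpoint.
classifier = pipeline("image-classification", model="dima806/beard_face_image_detection")
print(classifier("face.jpg"))  # e.g. [{'label': 'Beard', 'score': ...}, ...]
```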
|
[
"beard",
"no beard"
] |
dima806/food_beverages_japan_image_detection
|
Classifies an image as showing Japanese food or a Japanese beverage with about 89% accuracy.
See https://www.kaggle.com/code/dima806/food-beverages-japan-image-detection-vit for more details.
```
Classification report:
precision recall f1-score support
food 0.8898 0.8879 0.8889 473
beverage 0.8882 0.8901 0.8891 473
accuracy 0.8890 946
macro avg 0.8890 0.8890 0.8890 946
weighted avg 0.8890 0.8890 0.8890 946
```
|
[
"food",
"beverage"
] |
dima806/tyre_quality_image_detection
|
Returns tyre quality (defective or good) given a tyre image with about 99.3% accuracy.
See https://www.kaggle.com/code/dima806/tyre-quality-image-detection-vit for more details.
```
Classification report:
precision recall f1-score support
defective 1.0000 0.9854 0.9926 411
good 0.9856 1.0000 0.9928 412
accuracy 0.9927 823
macro avg 0.9928 0.9927 0.9927 823
weighted avg 0.9928 0.9927 0.9927 823
```
|
[
"defective",
"good"
] |
dima806/full_flat_tyre_image_detection
|
Checks whether a tyre in an image is flat or full, or whether no tyre is present.
See https://www.kaggle.com/code/dima806/full-flat-tyre-image-detection-vit for more details.
```
Classification report:
precision recall f1-score support
flat 1.0000 1.0000 1.0000 60
no-tire 1.0000 1.0000 1.0000 60
full 1.0000 1.0000 1.0000 60
accuracy 1.0000 180
macro avg 1.0000 1.0000 1.0000 180
weighted avg 1.0000 1.0000 1.0000 180
```
|
[
"flat",
"no-tire",
"full"
] |
romitbarua/autotrain-deepfakeface_only_faces_insightface-94902146221
|
# Model Trained Using AutoTrain
- Problem type: Image Classification
- CO2 Emissions (in grams): 0.3674
## Validation Metrics
- Loss: 0.39755979180336
- F1: 0.8185757948998676
- Precision: 0.8184637839810254
- Recall: 0.8186878364814308
- AUC: 0.9113633194857089
- Accuracy: 0.8150662636596141
|
[
"insight",
"real"
] |
galbitang/autotrain-jeongmi_lamp-94917146228
|
# Model Trained Using AutoTrain
- Problem type: Multi-class Classification
- Model ID: 94917146228
- CO2 Emissions (in grams): 0.0453
## Validation Metrics
- Loss: 1.114
- Accuracy: 0.642
- Macro F1: 0.465
- Micro F1: 0.642
- Weighted F1: 0.612
- Macro Precision: 0.482
- Micro Precision: 0.642
- Weighted Precision: 0.595
- Macro Recall: 0.468
- Micro Recall: 0.642
- Weighted Recall: 0.642
|
[
"classicantique",
"frenchprovence",
"vintageretro",
"industrial",
"koreaaisa",
"lovelyromantic",
"minimalsimple",
"modern",
"natural",
"notherneurope",
"unique"
] |
galbitang/autotrain-jin0_table-94921146229
|
# Model Trained Using AutoTrain
- Problem type: Multi-class Classification
- Model ID: 94921146229
- CO2 Emissions (in grams): 0.1008
## Validation Metrics
- Loss: 0.892
- Accuracy: 0.727
- Macro F1: 0.672
- Micro F1: 0.727
- Weighted F1: 0.715
- Macro Precision: 0.682
- Micro Precision: 0.727
- Weighted Precision: 0.709
- Macro Recall: 0.675
- Micro Recall: 0.727
- Weighted Recall: 0.727
|
[
"classicantique",
"frenchprovence",
"vintageretro",
"industrial",
"koreaaisa",
"lovelyromantic",
"minimalsimple",
"modern",
"natural",
"notherneurope",
"unique"
] |
galbitang/autotrain-jeongmi_chair-94919146230
|
# Model Trained Using AutoTrain
- Problem type: Multi-class Classification
- Model ID: 94919146230
- CO2 Emissions (in grams): 0.0582
## Validation Metrics
- Loss: 0.714
- Accuracy: 0.735
- Macro F1: 0.622
- Micro F1: 0.735
- Weighted F1: 0.731
- Macro Precision: 0.678
- Micro Precision: 0.735
- Weighted Precision: 0.745
- Macro Recall: 0.597
- Micro Recall: 0.735
- Weighted Recall: 0.735
|
[
"classsicantique",
"frenchprovence",
"vintatageretro",
"industrial",
"koreaaisa",
"lovelyromantic",
"minimalsimple",
"modern",
"natural",
"notherneurope",
"unique"
] |