Columns: model_id — string, 7 to 105 characters; model_card — string, 1 to 130k characters; model_labels — list, 2 to 80k items.
ParichatS/food_classifier
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. --> # ParichatS/food_classifier This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Train Loss: 1.3064 - Validation Loss: 1.2223 - Train Accuracy: 1.0 - Epoch: 4 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 2000, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01} - training_precision: float32 ### Training results | Train Loss | Validation Loss | Train Accuracy | Epoch | |:----------:|:---------------:|:--------------:|:-----:| | 3.6307 | 2.6657 | 1.0 | 0 | | 2.1929 | 1.8806 | 1.0 | 1 | | 1.7192 | 1.5920 | 1.0 | 2 | | 1.4896 | 1.3940 | 1.0 | 3 | | 1.3064 | 1.2223 | 1.0 | 4 | ### Framework versions - Transformers 4.41.2 - TensorFlow 2.15.0 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
Alan1402/vit-base-cifar10
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-cifar10 This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the cifar dataset. It achieves the following results on the evaluation set: - Loss: 0.0803 - Accuracy: 0.9773 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:------:|:----:|:---------------:|:--------:| | 0.1043 | 0.0457 | 100 | 0.2855 | 0.919 | | 0.2671 | 0.0914 | 200 | 0.3650 | 0.9015 | | 0.2935 | 0.1371 | 300 | 0.3167 | 0.9067 | | 0.27 | 0.1828 | 400 | 0.3518 | 0.8922 | | 0.3634 | 0.2285 | 500 | 0.3660 | 0.8953 | | 0.2559 | 0.2742 | 600 | 0.3964 | 0.8901 | | 0.197 | 0.3199 | 700 | 0.2481 | 0.9253 | | 0.2594 | 0.3656 | 800 | 0.2486 | 0.923 | | 0.4545 | 0.4113 | 900 | 0.3271 | 0.9 | | 0.1243 | 0.4570 | 1000 | 0.2448 | 0.9269 | | 0.3593 | 0.5027 | 1100 | 0.2118 | 0.9354 | | 0.1375 | 0.5484 | 1200 | 0.2205 | 0.9349 | | 0.1521 | 0.5941 | 1300 | 0.2009 | 0.9376 | | 0.1237 | 0.6399 | 1400 | 0.1803 | 0.9445 | | 0.2214 | 0.6856 | 1500 | 0.2026 | 0.9395 | | 0.1324 | 0.7313 | 1600 | 0.1635 | 0.9493 | | 0.1864 | 0.7770 | 1700 | 0.1672 | 0.9493 | | 0.128 | 0.8227 | 1800 | 0.2015 | 0.9409 | | 0.121 | 0.8684 | 1900 | 0.1753 | 0.9451 | | 0.1918 | 0.9141 | 2000 | 0.1370 | 0.9588 | | 0.1658 | 0.9598 | 2100 | 0.1543 | 0.9535 | | 0.1088 | 1.0055 | 2200 | 0.1361 | 0.9577 | | 0.0916 | 1.0512 | 2300 | 0.1393 | 0.9597 | | 0.005 | 1.0969 | 2400 | 0.1295 | 0.9621 | | 0.0294 | 1.1426 | 2500 | 0.1327 | 0.9639 | | 0.0939 | 1.1883 | 2600 | 0.1409 | 0.9621 | | 0.0756 | 1.2340 | 2700 | 0.1202 | 0.9682 | | 0.0466 | 1.2797 | 2800 | 0.1274 | 0.964 | | 0.0565 | 1.3254 | 2900 | 0.1250 | 0.9663 | | 0.0609 | 1.3711 | 3000 | 0.1299 | 0.9657 | | 0.0201 | 1.4168 | 3100 | 0.1203 | 0.9685 | | 0.0258 | 1.4625 | 3200 | 0.1166 | 0.9693 | | 0.0913 | 1.5082 | 3300 | 0.1009 | 0.9736 | | 0.0235 | 1.5539 | 3400 | 0.0964 | 0.9732 | | 0.0089 | 1.5996 | 3500 | 0.0966 | 0.9747 | | 0.0455 | 1.6453 | 3600 | 0.0963 | 0.9748 | | 0.0271 | 1.6910 | 3700 | 0.0874 | 0.9763 | | 0.0407 | 1.7367 | 3800 | 0.0898 | 0.9761 | | 0.1095 | 1.7824 | 3900 | 0.0849 | 0.976 | | 0.0327 | 1.8282 | 4000 | 0.0926 | 0.9745 | | 0.0427 | 1.8739 | 4100 | 0.0811 | 0.9769 | | 0.003 | 1.9196 | 4200 | 0.0821 | 0.9761 | | 0.0182 | 1.9653 | 4300 | 0.0803 | 0.9773 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.3.0+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "airplane", "automobile", "bird", "cat", "deer", "dog", "frog", "horse", "ship", "truck" ]
habibi26/document-spoof
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # document-spoof This model is a fine-tuned version of [openai/clip-vit-base-patch32](https://huggingface.co/openai/clip-vit-base-patch32) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.1105 - Accuracy: 0.9767 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 25 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-------:|:----:|:---------------:|:--------:| | No log | 0.9524 | 5 | 0.5211 | 0.8837 | | No log | 1.9048 | 10 | 0.2271 | 0.8837 | | 0.545 | 2.8571 | 15 | 0.0975 | 0.9884 | | 0.545 | 4.0 | 21 | 0.1020 | 0.9767 | | 0.545 | 4.9524 | 26 | 0.3087 | 0.9535 | | 0.472 | 5.9048 | 31 | 0.3385 | 0.8023 | | 0.472 | 6.8571 | 36 | 0.2358 | 0.8605 | | 0.472 | 8.0 | 42 | 0.3675 | 0.8605 | | 0.3762 | 8.9524 | 47 | 0.1460 | 0.9535 | | 0.3762 | 9.9048 | 52 | 0.6158 | 0.8140 | | 0.3762 | 10.8571 | 57 | 0.3228 | 0.9186 | | 0.1586 | 12.0 | 63 | 0.0248 | 0.9884 | | 0.1586 | 12.9524 | 68 | 0.0639 | 0.9651 | | 0.1586 | 13.9048 | 73 | 0.5674 | 0.8488 | | 0.1159 | 14.8571 | 78 | 0.0291 | 0.9884 | | 0.1159 | 16.0 | 84 | 0.0539 | 0.9884 | | 0.1159 | 16.9524 | 89 | 0.0772 | 0.9767 | | 0.0366 | 17.9048 | 94 | 0.0031 | 1.0 | | 0.0366 | 18.8571 | 99 | 0.1506 | 0.9535 | | 0.0179 | 20.0 | 105 | 0.0007 | 1.0 | | 0.0179 | 20.9524 | 110 | 0.1427 | 0.9535 | | 0.0179 | 21.9048 | 115 | 0.2299 | 0.9419 | | 0.0036 | 22.8571 | 120 | 0.1373 | 0.9767 | | 0.0036 | 23.8095 | 125 | 0.1105 | 0.9767 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.1.2 - Datasets 2.19.2 - Tokenizers 0.19.1
[ "attack", "real" ]
hanslab37/architectural_styles_classifier
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # architectural_styles_classifier This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the Architectural styles dataset, retrieved from https://www.kaggle.com/datasets/dumitrux/architectural-styles-dataset. It achieves the following results on the evaluation set: - Loss: 0.9412 - Accuracy: 0.7223 ## Model description Presentation about the model: https://www.canva.com/design/DAGLBMAs1K4/d8qvLN2nchSYVmnrwYzx0w/edit?utm_content=DAGLBMAs1K4&utm_campaign=designshare&utm_medium=link2&utm_source=sharebutton You can try the model in its Hugging Face Space at this link: https://huggingface.co/spaces/hanslab37/technospire ## Intended uses & limitations The model was developed as part of an experiment in training a model and building an image-classification app with Gradio on Hugging Face. Use it for experimentation only; it is not recommended for production use. ## Training and evaluation data https://www.kaggle.com/datasets/dumitrux/architectural-styles-dataset ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0005 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=0.0003 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 20 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-------:|:----:|:---------------:|:--------:| | 1.8855 | 0.9960 | 110 | 1.7753 | 0.4457 | | 1.583 | 1.9921 | 220 | 1.6208 | 0.4829 | | 1.4343 | 2.9972 | 331 | 1.3291 | 0.5851 | | 1.2836 | 3.9932 | 441 | 1.2550 | 0.6005 | | 1.2885 | 4.9983 | 552 | 1.1483 | 0.6298 | | 1.1226 | 5.9943 | 662 | 1.1245 | 0.6491 | | 0.985 | 6.9994 | 773 | 1.1381 | 0.6397 | | 0.9963 | 7.9955 | 883 | 1.0964 | 0.6605 | | 0.88 | 8.9915 | 993 | 1.0407 | 0.6739 | | 0.7688 | 9.9966 | 1104 | 1.0288 | 0.6918 | | 0.763 | 10.9926 | 1214 | 0.9835 | 0.6898 | | 0.6287 | 11.9977 | 1325 | 1.0049 | 0.7037 | | 0.6229 | 12.9938 | 1435 | 1.1010 | 0.6918 | | 0.5731 | 13.9989 | 1546 | 0.9910 | 0.7082 | | 0.5076 | 14.9949 | 1656 | 1.0457 | 0.7112 | | 0.554 | 16.0 | 1767 | 1.0141 | 0.7007 | | 0.382 | 16.9960 | 1877 | 1.0606 | 0.6928 | | 0.459 | 17.9921 | 1987 | 1.0091 | 0.7161 | | 0.4018 | 18.9972 | 2098 | 1.0011 | 0.7072 | | 0.3981 | 19.9208 | 2200 | 0.9821 | 0.7310 | ### Framework versions - Transformers 4.41.1 - Pytorch 2.3.0 - Datasets 2.19.1 - Tokenizers 0.19.1
[ "achaemenid architecture", "american foursquare architecture", "chicago school architecture", "colonial architecture", "deconstructivism", "edwardian architecture", "georgian architecture", "gothic architecture", "greek revival architecture", "international style", "novelty architecture", "palladian architecture", "american craftsman style", "postmodern architecture", "queen anne architecture", "romanesque architecture", "russian revival architecture", "tudor revival architecture", "ancient egyptian architecture", "art deco architecture", "art nouveau architecture", "baroque architecture", "bauhaus architecture", "beaux-arts architecture", "byzantine architecture" ]
Yasiru2002/swin-tiny-patch4-window7-224-finetuned-skin-cancer
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-skin-cancer This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.3821 - Accuracy: 0.8613 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 256 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:------:|:----:|:---------------:|:--------:| | 0.9232 | 0.9929 | 35 | 0.8263 | 0.7116 | | 0.6427 | 1.9858 | 70 | 0.6465 | 0.7645 | | 0.5475 | 2.9787 | 105 | 0.5370 | 0.8054 | | 0.4539 | 4.0 | 141 | 0.4634 | 0.8234 | | 0.4547 | 4.9929 | 176 | 0.4462 | 0.8353 | | 0.4009 | 5.9858 | 211 | 0.4218 | 0.8483 | | 0.3722 | 6.9787 | 246 | 0.3942 | 0.8533 | | 0.3042 | 8.0 | 282 | 0.4198 | 0.8453 | | 0.3071 | 8.9929 | 317 | 0.4136 | 0.8493 | | 0.3103 | 9.9291 | 350 | 0.3821 | 0.8613 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.1.2 - Datasets 2.19.2 - Tokenizers 0.19.1
[ "actinic-keratoses", "basal-cell-carcinoma", "benign-keratosis-like-lesions", "dermatofibroma", "melanocytic-nevi", "melanoma", "vascular-lesions" ]
wasmdashai/vit-face-expression-v1
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "angry", "disgust", "fear", "happy", "neutral", "sad", "surprise" ]
wendys-llc/yet-another-amber-mines
# Model Trained Using AutoTrain - Problem type: Image Classification ## Validation Metrics loss: 0.17235368490219116 f1: 0.94 precision: 0.94 recall: 0.94 auc: 0.9796 accuracy: 0.94
[ "negative", "positive" ]
sciarrilli/my_awesome_food_model_v2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/sciarrilli/huggingface/runs/trgtu68a) # my_awesome_food_model_v2 This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.8053 - Accuracy: 0.8083 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 128 - eval_batch_size: 128 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 512 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 30 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-------:|:----:|:---------------:|:--------:| | 4.4448 | 0.9932 | 110 | 4.4236 | 0.0914 | | 3.8312 | 1.9955 | 221 | 3.8007 | 0.4096 | | 3.1568 | 2.9977 | 332 | 3.1221 | 0.5435 | | 2.4967 | 4.0 | 443 | 2.4920 | 0.6308 | | 2.0432 | 4.9932 | 553 | 2.0252 | 0.6825 | | 1.6512 | 5.9955 | 664 | 1.6771 | 0.7184 | | 1.388 | 6.9977 | 775 | 1.4464 | 0.7367 | | 1.1677 | 8.0 | 886 | 1.2782 | 0.7533 | | 1.0307 | 8.9932 | 996 | 1.1741 | 0.7625 | | 0.9156 | 9.9955 | 1107 | 1.0900 | 0.7741 | | 0.8283 | 10.9977 | 1218 | 1.0295 | 0.7771 | | 0.8078 | 12.0 | 1329 | 0.9949 | 0.7776 | | 0.7643 | 12.9932 | 1439 | 0.9656 | 0.7817 | | 0.6578 | 13.9955 | 1550 | 0.9274 | 0.7868 | | 0.611 | 14.9977 | 1661 | 0.9051 | 0.7921 | | 0.6016 | 16.0 | 1772 | 0.9009 | 0.7912 | | 0.5652 | 16.9932 | 1882 | 0.8772 | 0.7963 | | 0.5492 | 17.9955 | 1993 | 0.8559 | 0.7992 | | 0.5054 | 18.9977 | 2104 | 0.8734 | 0.7956 | | 0.5351 | 20.0 | 2215 | 0.8617 | 0.7999 | | 0.4949 | 20.9932 | 2325 | 0.8487 | 0.8013 | | 0.4701 | 21.9955 | 2436 | 0.8437 | 0.8013 | | 0.4576 | 22.9977 | 2547 | 0.8430 | 0.8008 | | 0.4573 | 24.0 | 2658 | 0.8195 | 0.8071 | | 0.4399 | 24.9932 | 2768 | 0.8206 | 0.8071 | | 0.424 | 25.9955 | 2879 | 0.8212 | 0.8068 | | 0.4031 | 26.9977 | 2990 | 0.8202 | 0.8069 | | 0.4031 | 28.0 | 3101 | 0.8173 | 0.8080 | | 0.407 | 28.9932 | 3211 | 0.8051 | 0.8069 | | 0.4194 | 29.7968 | 3300 | 0.8053 | 0.8083 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.3.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
Abhiram4/PlantDiseaseDetectorSwin
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # PlantDiseaseDetectorSwin This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window8-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window8-256) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0207 - Accuracy: 0.9934 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 256 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0717 | 1.0 | 293 | 0.0393 | 0.9863 | | 0.0384 | 2.0 | 586 | 0.0207 | 0.9934 | ### Framework versions - Transformers 4.42.4 - Pytorch 2.3.0+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "apple___apple_scab", "apple___black_rot", "apple___cedar_apple_rust", "apple___healthy", "blueberry___healthy", "cherry_(including_sour)___powdery_mildew", "cherry_(including_sour)___healthy", "corn_(maize)___cercospora_leaf_spot gray_leaf_spot", "corn_(maize)___common_rust_", "corn_(maize)___northern_leaf_blight", "corn_(maize)___healthy", "grape___black_rot", "grape___esca_(black_measles)", "grape___leaf_blight_(isariopsis_leaf_spot)", "grape___healthy", "orange___haunglongbing_(citrus_greening)", "peach___bacterial_spot", "peach___healthy", "pepper,_bell___bacterial_spot", "pepper,_bell___healthy", "potato___early_blight", "potato___late_blight", "potato___healthy", "raspberry___healthy", "soybean___healthy", "squash___powdery_mildew", "strawberry___leaf_scorch", "strawberry___healthy", "tomato___bacterial_spot", "tomato___early_blight", "tomato___late_blight", "tomato___leaf_mold", "tomato___septoria_leaf_spot", "tomato___spider_mites two-spotted_spider_mite", "tomato___target_spot", "tomato___tomato_yellow_leaf_curl_virus", "tomato___tomato_mosaic_virus", "tomato___healthy", "algal_spot", "brown_blight", "gray_blight", "healthy", "helopeltis", "red_spot" ]
Abhiram4/PlantDiseaseDetectorVit2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # PlantDiseaseDetectorVit2 This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.1892 - Accuracy: 0.9959 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 256 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.5919 | 1.0 | 293 | 0.5612 | 0.9870 | | 0.2529 | 2.0 | 586 | 0.2436 | 0.9952 | | 0.1886 | 3.0 | 879 | 0.1892 | 0.9959 | ### Framework versions - Transformers 4.42.4 - Pytorch 2.3.0+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "apple___apple_scab", "apple___black_rot", "apple___cedar_apple_rust", "apple___healthy", "blueberry___healthy", "cherry_(including_sour)___powdery_mildew", "cherry_(including_sour)___healthy", "corn_(maize)___cercospora_leaf_spot gray_leaf_spot", "corn_(maize)___common_rust_", "corn_(maize)___northern_leaf_blight", "corn_(maize)___healthy", "grape___black_rot", "grape___esca_(black_measles)", "grape___leaf_blight_(isariopsis_leaf_spot)", "grape___healthy", "orange___haunglongbing_(citrus_greening)", "peach___bacterial_spot", "peach___healthy", "pepper,_bell___bacterial_spot", "pepper,_bell___healthy", "potato___early_blight", "potato___late_blight", "potato___healthy", "raspberry___healthy", "soybean___healthy", "squash___powdery_mildew", "strawberry___leaf_scorch", "strawberry___healthy", "tomato___bacterial_spot", "tomato___early_blight", "tomato___late_blight", "tomato___leaf_mold", "tomato___septoria_leaf_spot", "tomato___spider_mites two-spotted_spider_mite", "tomato___target_spot", "tomato___tomato_yellow_leaf_curl_virus", "tomato___tomato_mosaic_virus", "tomato___healthy", "algal_spot", "brown_blight", "gray_blight", "healthy", "helopeltis", "red_spot" ]
Abhiram4/PlantDiseaseDetectorSwinv2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # PlantDiseaseDetectorSwinv2 This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window8-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window8-256) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0075 - Accuracy: 0.9975 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 256 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.082 | 1.0 | 293 | 0.0308 | 0.9900 | | 0.0427 | 2.0 | 586 | 0.0114 | 0.9955 | | 0.0341 | 3.0 | 879 | 0.0075 | 0.9975 | ### Framework versions - Transformers 4.42.4 - Pytorch 2.3.0+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "apple___apple_scab", "apple___black_rot", "apple___cedar_apple_rust", "apple___healthy", "blueberry___healthy", "cherry_(including_sour)___powdery_mildew", "cherry_(including_sour)___healthy", "corn_(maize)___cercospora_leaf_spot gray_leaf_spot", "corn_(maize)___common_rust_", "corn_(maize)___northern_leaf_blight", "corn_(maize)___healthy", "grape___black_rot", "grape___esca_(black_measles)", "grape___leaf_blight_(isariopsis_leaf_spot)", "grape___healthy", "orange___haunglongbing_(citrus_greening)", "peach___bacterial_spot", "peach___healthy", "pepper,_bell___bacterial_spot", "pepper,_bell___healthy", "potato___early_blight", "potato___late_blight", "potato___healthy", "raspberry___healthy", "soybean___healthy", "squash___powdery_mildew", "strawberry___leaf_scorch", "strawberry___healthy", "tomato___bacterial_spot", "tomato___early_blight", "tomato___late_blight", "tomato___leaf_mold", "tomato___septoria_leaf_spot", "tomato___spider_mites two-spotted_spider_mite", "tomato___target_spot", "tomato___tomato_yellow_leaf_curl_virus", "tomato___tomato_mosaic_virus", "tomato___healthy", "algal_spot", "brown_blight", "gray_blight", "healthy", "helopeltis", "red_spot" ]
franzzepol/swin-base-patch4-window7-224-in22k-finetuned-CT-V2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-base-patch4-window7-224-in22k-finetuned-CT-V2 This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224-in22k](https://huggingface.co/microsoft/swin-base-patch4-window7-224-in22k) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0068 - Accuracy: 0.9983 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 8 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 1147 | 0.0708 | 0.9768 | | No log | 2.0 | 2294 | 0.0423 | 0.9871 | | No log | 3.0 | 3441 | 0.0128 | 0.9963 | | No log | 4.0 | 4588 | 0.0155 | 0.9951 | | No log | 5.0 | 5735 | 0.0075 | 0.9980 | | No log | 6.0 | 6882 | 0.0090 | 0.9979 | | No log | 7.0 | 8029 | 0.0075 | 0.9976 | | 0.1295 | 8.0 | 9176 | 0.0068 | 0.9983 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.3.1+cpu - Datasets 2.20.0 - Tokenizers 0.19.1
[ "meningioma_tumor", "glioma_tumor", "no_tumor", "pituitary_tumor" ]
jerlawson13/vit-base-gpu
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-gpu This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.1093 - Accuracy: 0.9736 - Confusion Matrix: [[60, 6], [0, 161]] ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - training_steps: 285 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Confusion Matrix | |:-------------:|:------:|:----:|:---------------:|:--------:|:-------------------:| | 0.1208 | 1.7544 | 100 | 0.1628 | 0.9648 | [[58, 8], [0, 161]] | | 0.0908 | 3.5088 | 200 | 0.1093 | 0.9736 | [[60, 6], [0, 161]] | ### Framework versions - Transformers 4.42.4 - Pytorch 2.4.0 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "normal", "glaucoma" ]
qipchip31/electronic-components-model
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> The fine-tuned Vision Transformer (ViT) model, initialized from `google/vit-base-patch16-224` and named `electronic-components-model`, is specialized for classifying electronic components such as resistors, capacitors, inductors, and transistors. Initially pretrained on broader datasets, the fine-tuning process adjusts model parameters specifically for this custom dataset. This adaptation enhances the `electronic-components-model`'s ability to accurately identify and classify intricate visual features unique to electronic components, improving its efficacy in practical applications requiring automated component recognition based on visual inputs. - **Developed by:** Chirag Pradhan - **Funded by [optional]:** Fatima Al-Fihri Predoctoral Fellowship - **Shared by [optional]:** Chirag Pradhan - **Model type:** Vision Transformer (ViT) for image classification - **Language(s) (NLP):** Not applicable (Image classification) - **License:** Apache License 2.0 - **Finetuned from model [optional]:** google/vit-base-patch16-224 ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. 
--> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2", "label_3" ]
hannguyen2880/massp-challenge
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # massp-challenge This model is a fine-tuned version of [google/vit-large-patch16-224-in21k](https://huggingface.co/google/vit-large-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.1842 - Accuracy: 0.9512 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.1978 | 1.0 | 18 | 0.3973 | 0.8449 | | 0.2896 | 2.0 | 36 | 0.2688 | 0.9164 | | 0.1986 | 3.0 | 54 | 0.1842 | 0.9512 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.3.0+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "glioma_tumor", "meningioma_tumor", "no_tumor", "pituitary_tumor" ]
asterismer/my_awesome_food_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # my_awesome_food_model This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 2.5959 - Accuracy: 0.5666 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:------:|:----:|:---------------:|:--------:| | 3.2149 | 0.9928 | 69 | 3.0859 | 0.4351 | | 2.7832 | 2.0 | 139 | 2.7217 | 0.5446 | | 2.5928 | 2.9784 | 207 | 2.5959 | 0.5666 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.3.1+cu118 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "d01", "d02", "d03", "d04", "d05", "d06", "d07", "d08", "d09", "d10", "d11", "d12", "d13", "d14", "d15", "d16", "d17", "d18", "d19", "d20", "d21", "d22", "d23", "d24", "d25", "d26", "d27", "d28", "d29", "d30", "d31", "d32", "d33", "d34", "d35" ]
jhoppanne/SkinCancerClassifier_Plain-V0
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/jhoppanne-myself/finalProject/runs/yef3opr5) # SkinCancerClassifier_Plain-V0 This model is a fine-tuned version of [microsoft/resnet-152](https://huggingface.co/microsoft/resnet-152) on an unknown dataset. It achieves the following results on the evaluation set: - eval_loss: 3.2075 - eval_accuracy: 0.6083 - eval_runtime: 1.756 - eval_samples_per_second: 136.671 - eval_steps_per_second: 4.556 - epoch: 250.6 - step: 7518 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 500 ### Framework versions - Transformers 4.42.2 - Pytorch 2.3.0 - Datasets 2.15.0 - Tokenizers 0.19.1
[ "2", "2", "1", "0", "0", "1", "2", "1", "2", "2", "0", "2", "1", "1", "2", "2", "0", "2", "2", "1", "1", "0", "2", "0", "1", "0", "2", "0", "2", "0", "1", "0", "2", "2", "1", "1", "0", "0", "0", "0", "0", "2", "2", "0", "1", "0", "0", "2", "0", "1", "0", "0", "1", "1", "1", "2", "1", "1", "0", "0", "1", "1", "1", "1", "2", "0", "0", "1", "2", "2", "1", "0", "0", "0", "2", "1", "2", "2", "0", "2", "0", "1", "1", "1", "0", "1", "2", "0", "2", "1", "0", "1", "2", "1", "0", "1", "2", "2", "2", "2", "2", "2", "0", "2", "0", "0", "0", "0", "1", "0", "0", "0", "2", "1", "2", "1", "0", "2", "2", "0", "1", "2", "1", "0", "0", "1", "2", "0", "0", "1", "0", "0", "2", "2", "1", "0", "0", "2", "0", "1", "2", "1", "0", "1", "0", "0", "2", "0", "2", "1", "0", "2", "0", "1", "2", "0", "2", "0", "1", "1", "1", "2", "1", "2", "1", "1", "1", "0", "1", "2", "2", "2", "0", "1", "0", "2", "0", "2", "1", "2", "2", "2", "1", "2", "0", "2", "2", "2", "0", "2", "0", "1", "0", "2", "1", "2", "0", "1", "0", "0", "2", "0", "0", "1", "1", "0", "2", "0", "0", "2", "1", "2", "0", "1", "1", "2", "1", "1", "1", "2", "2", "0", "1", "1", "2", "1", "2", "0", "0", "1", "1", "0", "1", "2", "2", "0", "2", "2", "1", "2", "1", "1", "1", "2", "0", "2", "2", "0", "1", "1", "2", "0", "2", "2", "1", "0", "1", "1", "1", "1", "2", "0", "2", "0", "2", "2", "1", "2", "2", "0", "2", "0", "2", "0", "2", "1", "0", "2", "1", "2", "1", "0", "0", "0", "1", "2", "0", "2", "2", "2", "1", "1", "1", "1", "0", "2", "2", "1", "0", "0", "1", "1", "1", "0", "2", "0", "0", "2", "2", "1", "2", "2", "0", "1", "1", "0", "2", "2", "1", "2", "1", "1", "2", "1", "2", "0", "0", "0", "1", "0", "2", "1", "2", "2", "1", "2", "2", "1", "1", "0", "0", "0", "1", "0", "1", "0", "0", "0", "1", "1", "2", "1", "2", "2", "0", "1", "0", "1", "2", "0", "2", "0", "2", "1", "0", "2", "0", "2", "2", "1", "2", "1", "2", "1", "2", "1", "0", "0", "1", "0", "0", "1", "2", "1", "1", "2", "0", "0", "2", "1", "1", "2", "0", "2", "2", "2", "1", "0", "1", "0", "0", "1", "1", "2", "1", "2", "1", "1", "0", "0", "1", "2", "0", "1", "0", "1", "0", "2", "2", "2", "0", "1", "2", "1", "2", "1", "0", "2", "0", "1", "0", "2", "2", "2", "1", "2", "1", "2", "1", "0", "0", "0", "0", "1", "1", "1", "0", "1", "1", "1", "2", "0", "1", "1", "1", "1", "1", "1", "1", "2", "2", "2", "1", "1", "0", "2", "1", "0", "0", "2", "1", "2", "1", "1", "2", "2", "0", "2", "2", "0", "1", "1", "1", "1", "2", "0", "2", "2", "0", "1", "1", "0", "0", "2", "0", "2", "0", "1", "1", "2", "1", "2", "2", "1", "2", "1", "1", "2", "0", "0", "0", "0", "0", "0", "1", "2", "1", "0", "1", "2", "2", "1", "1", "1", "2", "1", "2", "0", "2", "2", "2", "2", "1", "0", "1", "2", "2", "0", "0", "0", "1", "2", "0", "2", "1", "2", "2", "0", "1", "2", "0", "0", "1", "2", "2", "2", "0", "2", "2", "1", "0", "0", "1", "1", "2", "1", "2", "2", "0", "1", "2", "1", "2", "1", "2", "2", "2", "1", "1", "0", "2", "0", "1", "1", "2", "0", "1", "2", "2", "0", "0", "0", "2", "2", "1", "1", "0", "0", "1", "2", "1", "2", "1", "0", "1", "1", "2", "2", "1", "2", "0", "1", "2", "1", "1", "2", "2", "1", "0", "1", "1", "0", "0", "1", "0", "1", "0", "0", "2", "0", "0", "1", "0", "1", "0", "1", "0", "2", "2", "1", "0", "1", "1", "2", "1", "0", "1", "0", "0", "1", "2", "1", "1", "1", "2", "0", "0", "2", "0", "1", "2", "0", "0", "1", "1", "2", "1", "1", "0", "0", "0", "0", "0", "0", "0", "0", "2", "0", "1", "2", "1", "2", "1", "1", "2", "2", "2", "0", "0", "0", "2", "2", "0", "2", "0", "1", "0", "2", "0", "2", "0", "2", "1", "2", "1", "0", "1", "2", "2", "0", 
"2", "2", "0", "0", "0", "2", "1", "1", "2", "0", "0", "0", "2", "2", "0", "2", "2", "2", "1", "2", "1", "2", "0", "2", "1", "1", "0", "1", "1", "0", "1", "0", "0", "2", "0", "2", "1", "0", "0", "2", "0", "0", "0", "0", "1", "2", "1", "0", "1", "2", "0", "0", "0", "1", "0", "0", "1", "0", "2", "0", "1", "0", "2", "0", "2", "0", "0", "0", "1", "1", "0", "1", "0", "2", "1", "1", "1", "0", "2", "0", "0", "0", "1", "1", "0", "0", "2", "2", "2", "1", "0", "0", "2", "1", "2", "0", "2", "2", "2", "1", "1", "1", "0", "1", "1", "2", "0", "0", "0", "2", "1", "0", "0", "0", "2", "0", "2", "2", "1", "2", "0", "1", "0", "0", "1", "1", "1", "1", "0", "0", "2", "1", "2", "2", "2", "1", "1", "2", "2", "2", "1", "1", "1", "2", "0", "1", "1", "1", "0", "1", "2", "2", "2", "0", "0", "2", "0", "0", "1", "2", "1", "0", "2", "1", "2", "1", "2", "1", "2", "0", "1", "1", "0", "0", "2", "2", "2", "2", "1", "0", "1", "2", "2", "2", "2", "0", "1", "1", "0", "2", "2", "2", "1", "0", "0", "2", "0", "1", "2", "2", "1", "1", "1", "0", "1", "0", "1", "2", "0", "1", "0", "1", "2", "0", "2", "0", "1", "2", "2", "1", "0", "2", "0", "2", "0", "1", "0", "2", "1", "2", "0", "0", "0", "1", "0", "1", "1", "2", "1", "2", "1", "0", "2", "0", "2", "1", "1", "2", "1", "2" ]
jhoppanne/SkinCancerClassifier_Plain-V1
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/jhoppanne-myself/finalProject/runs/ow4dk70u) [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/jhoppanne-myself/finalProject/runs/8hxuz0bh) # SkinCancerClassifier_Plain-V1 This model is a fine-tuned version of [google/efficientnet-b0](https://huggingface.co/google/efficientnet-b0) on the imagefolder dataset. It achieves the following results on the evaluation set: - eval_loss: 1.0790 - eval_accuracy: 0.7792 - eval_runtime: 1.5074 - eval_samples_per_second: 159.219 - eval_steps_per_second: 5.307 - epoch: 104.5667 - step: 3137 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-06 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2000 ### Framework versions - Transformers 4.42.2 - Pytorch 2.3.0 - Datasets 2.15.0 - Tokenizers 0.19.1
[ "benign", "indeterminate", "malignant" ]
jhoppanne/SkinCancerClassifier_smote-V0
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/jhoppanne-myself/finalProject/runs/l2ftprb7) # SkinCancerClassifier_smote-V0 This model is a fine-tuned version of [google/efficientnet-b0](https://huggingface.co/google/efficientnet-b0) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.6170 - Accuracy: 0.8042 ## Model description This model is designed for the classification of skin cancer images and was trained using the SMOTE technique to handle class imbalance. The dataset used for training is from the [ISIC 2024 - Skin Cancer Detection with 3D-TBP](https://www.kaggle.com/competitions/isic-2024-challenge/data) competition. ## Intended uses & limitations This model is intended for educational purposes and research in skin cancer detection. It should not be used for actual medical diagnosis or treatment planning. Further validation on clinical data is required before any real-world application. ## Training and evaluation data The model was trained on the ISIC 2024 dataset, which contains a diverse set of skin images labeled with different types of skin cancer. The training data was augmented using the SMOTE technique to address class imbalance. ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-06 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2000 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:------:|:-----:|:---------------:|:--------:| | No log | 1.0 | 30 | 6.7943 | 0.0042 | | No log | 2.0 | 60 | 6.7825 | 0.0042 | | No log | 3.0 | 90 | 6.7019 | 0.0042 | | No log | 4.0 | 120 | 6.6665 | 0.0042 | | No log | 5.0 | 150 | 6.5872 | 0.0167 | | No log | 6.0 | 180 | 6.5684 | 0.0292 | | No log | 7.0 | 210 | 6.5135 | 0.0625 | | No log | 8.0 | 240 | 6.4748 | 0.0917 | | No log | 9.0 | 270 | 6.3794 | 0.1458 | | No log | 10.0 | 300 | 6.3851 | 0.1583 | | No log | 11.0 | 330 | 6.3323 | 0.2208 | | No log | 12.0 | 360 | 6.3074 | 0.2208 | | No log | 13.0 | 390 | 6.2001 | 0.2708 | | No log | 14.0 | 420 | 6.1662 | 0.3375 | | No log | 15.0 | 450 | 6.2537 | 0.275 | | No log | 16.0 | 480 | 6.0434 | 0.3583 | | 6.3833 | 17.0 | 510 | 5.9091 | 0.4083 | | 6.3833 | 18.0 | 540 | 5.9385 | 0.3625 | | 6.3833 | 19.0 | 570 | 5.7777 | 0.4792 | | 6.3833 | 20.0 | 600 | 5.8297 | 0.3708 | | 6.3833 | 21.0 | 630 | 5.5555 | 0.5125 | | 6.3833 | 22.0 | 660 | 5.7096 | 0.4417 | | 6.3833 | 23.0 | 690 | 5.6401 | 0.4167 | | 6.3833 | 24.0 | 720 | 5.3735 | 0.5458 | | 6.3833 | 25.0 | 750 | 5.4768 | 0.5375 | | 6.3833 | 26.0 | 780 | 5.1468 | 0.55 | | 6.3833 | 27.0 | 810 | 5.0193 | 0.5167 | | 6.3833 | 28.0 | 840 | 4.7966 | 0.5667 | | 6.3833 | 29.0 | 870 | 4.8825 | 0.5625 | | 6.3833 | 30.0 | 900 | 4.9566 | 0.5042 | | 6.3833 | 31.0 | 930 | 4.6407 | 0.5625 | | 6.3833 | 32.0 | 960 | 4.7814 | 0.5417 | | 6.3833 | 33.0 | 990 | 4.4359 | 0.5667 | | 5.0833 | 34.0 | 1020 | 4.0800 | 0.6 | | 5.0833 | 35.0 | 1050 | 4.0507 | 0.5708 | | 5.0833 | 36.0 | 1080 | 4.1396 | 0.5542 | | 5.0833 | 37.0 | 1110 | 3.8816 | 0.6083 | | 5.0833 | 38.0 | 1140 | 3.8375 | 0.625 | | 5.0833 | 39.0 
| 1170 | 3.7560 | 0.6125 | | 5.0833 | 40.0 | 1200 | 3.5860 | 0.6083 | | 5.0833 | 41.0 | 1230 | 3.4339 | 0.5792 | | 5.0833 | 42.0 | 1260 | 3.2689 | 0.6042 | | 5.0833 | 43.0 | 1290 | 3.2131 | 0.5917 | | 5.0833 | 44.0 | 1320 | 3.0630 | 0.6375 | | 5.0833 | 45.0 | 1350 | 2.7808 | 0.6292 | | 5.0833 | 46.0 | 1380 | 2.7642 | 0.6417 | | 5.0833 | 47.0 | 1410 | 2.5321 | 0.6667 | | 5.0833 | 48.0 | 1440 | 2.3647 | 0.6083 | | 5.0833 | 49.0 | 1470 | 2.4837 | 0.625 | | 3.1484 | 50.0 | 1500 | 2.1881 | 0.6375 | | 3.1484 | 51.0 | 1530 | 2.1573 | 0.6375 | | 3.1484 | 52.0 | 1560 | 1.9715 | 0.65 | | 3.1484 | 53.0 | 1590 | 2.1185 | 0.675 | | 3.1484 | 54.0 | 1620 | 2.1301 | 0.6542 | | 3.1484 | 55.0 | 1650 | 1.7292 | 0.6375 | | 3.1484 | 56.0 | 1680 | 1.9750 | 0.625 | | 3.1484 | 57.0 | 1710 | 1.6784 | 0.6667 | | 3.1484 | 58.0 | 1740 | 1.5947 | 0.65 | | 3.1484 | 59.0 | 1770 | 1.5680 | 0.6667 | | 3.1484 | 60.0 | 1800 | 1.5720 | 0.6625 | | 3.1484 | 61.0 | 1830 | 1.4629 | 0.7042 | | 3.1484 | 62.0 | 1860 | 1.4647 | 0.6542 | | 3.1484 | 63.0 | 1890 | 1.4588 | 0.6458 | | 3.1484 | 64.0 | 1920 | 1.3771 | 0.7 | | 3.1484 | 65.0 | 1950 | 1.4022 | 0.6458 | | 3.1484 | 66.0 | 1980 | 1.3628 | 0.6708 | | 1.6168 | 67.0 | 2010 | 1.2906 | 0.6625 | | 1.6168 | 68.0 | 2040 | 1.5474 | 0.6208 | | 1.6168 | 69.0 | 2070 | 1.2712 | 0.6708 | | 1.6168 | 70.0 | 2100 | 1.3176 | 0.6417 | | 1.6168 | 71.0 | 2130 | 1.3137 | 0.6667 | | 1.6168 | 72.0 | 2160 | 1.1199 | 0.6792 | | 1.6168 | 73.0 | 2190 | 1.1273 | 0.6583 | | 1.6168 | 74.0 | 2220 | 1.1283 | 0.6875 | | 1.6168 | 75.0 | 2250 | 1.0464 | 0.6875 | | 1.6168 | 76.0 | 2280 | 1.0722 | 0.6833 | | 1.6168 | 77.0 | 2310 | 0.9783 | 0.725 | | 1.6168 | 78.0 | 2340 | 1.0548 | 0.675 | | 1.6168 | 79.0 | 2370 | 1.0064 | 0.6708 | | 1.6168 | 80.0 | 2400 | 0.9031 | 0.7083 | | 1.6168 | 81.0 | 2430 | 0.9418 | 0.6542 | | 1.6168 | 82.0 | 2460 | 0.9195 | 0.7125 | | 1.6168 | 83.0 | 2490 | 0.9451 | 0.6667 | | 1.0004 | 84.0 | 2520 | 1.0035 | 0.6792 | | 1.0004 | 85.0 | 2550 | 0.9345 | 0.6875 | | 1.0004 | 86.0 | 2580 | 0.8992 | 0.6708 | | 1.0004 | 87.0 | 2610 | 0.8348 | 0.6583 | | 1.0004 | 88.0 | 2640 | 0.8885 | 0.7042 | | 1.0004 | 89.0 | 2670 | 0.8627 | 0.7208 | | 1.0004 | 90.0 | 2700 | 0.8333 | 0.7042 | | 1.0004 | 91.0 | 2730 | 0.8811 | 0.6917 | | 1.0004 | 92.0 | 2760 | 0.8792 | 0.7167 | | 1.0004 | 93.0 | 2790 | 0.8650 | 0.6833 | | 1.0004 | 94.0 | 2820 | 0.7681 | 0.7208 | | 1.0004 | 95.0 | 2850 | 0.8514 | 0.7292 | | 1.0004 | 96.0 | 2880 | 0.8744 | 0.6792 | | 1.0004 | 97.0 | 2910 | 0.8674 | 0.7083 | | 1.0004 | 98.0 | 2940 | 0.8486 | 0.7167 | | 1.0004 | 99.0 | 2970 | 0.8676 | 0.6625 | | 0.7749 | 100.0 | 3000 | 0.7971 | 0.7208 | | 0.7749 | 101.0 | 3030 | 0.8681 | 0.6917 | | 0.7749 | 102.0 | 3060 | 0.8081 | 0.7083 | | 0.7749 | 103.0 | 3090 | 0.9258 | 0.6875 | | 0.7749 | 104.0 | 3120 | 0.8388 | 0.7 | | 0.7749 | 105.0 | 3150 | 0.8101 | 0.7 | | 0.7749 | 106.0 | 3180 | 0.7663 | 0.7375 | | 0.7749 | 107.0 | 3210 | 0.8520 | 0.6833 | | 0.7749 | 108.0 | 3240 | 0.7547 | 0.7375 | | 0.7749 | 109.0 | 3270 | 0.7945 | 0.725 | | 0.7749 | 110.0 | 3300 | 0.7452 | 0.7333 | | 0.7749 | 111.0 | 3330 | 0.8002 | 0.7125 | | 0.7749 | 112.0 | 3360 | 0.7156 | 0.7417 | | 0.7749 | 113.0 | 3390 | 0.7751 | 0.725 | | 0.7749 | 114.0 | 3420 | 0.7902 | 0.7125 | | 0.7749 | 115.0 | 3450 | 0.7980 | 0.725 | | 0.7749 | 116.0 | 3480 | 0.7548 | 0.7042 | | 0.6543 | 117.0 | 3510 | 0.6958 | 0.7333 | | 0.6543 | 118.0 | 3540 | 0.6582 | 0.7375 | | 0.6543 | 119.0 | 3570 | 0.7922 | 0.7 | | 0.6543 | 120.0 | 3600 | 0.7022 | 0.7167 | | 0.6543 | 121.0 | 3630 | 0.7479 | 0.6958 | | 
0.6543 | 122.0 | 3660 | 0.7118 | 0.75 | | 0.6543 | 123.0 | 3690 | 0.7521 | 0.7292 | | 0.6543 | 124.0 | 3720 | 0.7132 | 0.7333 | | 0.6543 | 125.0 | 3750 | 0.7345 | 0.7333 | | 0.6543 | 126.0 | 3780 | 0.7896 | 0.6625 | | 0.6543 | 127.0 | 3810 | 0.7449 | 0.7167 | | 0.6543 | 128.0 | 3840 | 0.6486 | 0.7125 | | 0.6543 | 129.0 | 3870 | 0.8283 | 0.7375 | | 0.6543 | 130.0 | 3900 | 0.7138 | 0.75 | | 0.6543 | 131.0 | 3930 | 0.7498 | 0.7083 | | 0.6543 | 132.0 | 3960 | 0.6947 | 0.7458 | | 0.6543 | 133.0 | 3990 | 0.7423 | 0.7375 | | 0.5814 | 134.0 | 4020 | 0.7298 | 0.7083 | | 0.5814 | 135.0 | 4050 | 0.6576 | 0.75 | | 0.5814 | 136.0 | 4080 | 0.7127 | 0.7417 | | 0.5814 | 137.0 | 4110 | 0.6741 | 0.7583 | | 0.5814 | 138.0 | 4140 | 0.8170 | 0.6875 | | 0.5814 | 139.0 | 4170 | 0.6322 | 0.7542 | | 0.5814 | 140.0 | 4200 | 0.6532 | 0.7292 | | 0.5814 | 141.0 | 4230 | 0.7523 | 0.7375 | | 0.5814 | 142.0 | 4260 | 0.7976 | 0.7042 | | 0.5814 | 143.0 | 4290 | 0.6449 | 0.7458 | | 0.5814 | 144.0 | 4320 | 0.7749 | 0.725 | | 0.5814 | 145.0 | 4350 | 0.6823 | 0.7333 | | 0.5814 | 146.0 | 4380 | 0.7293 | 0.6833 | | 0.5814 | 147.0 | 4410 | 0.7043 | 0.7708 | | 0.5814 | 148.0 | 4440 | 0.7280 | 0.725 | | 0.5814 | 149.0 | 4470 | 0.6766 | 0.7167 | | 0.5041 | 150.0 | 4500 | 0.6659 | 0.7583 | | 0.5041 | 151.0 | 4530 | 0.6706 | 0.75 | | 0.5041 | 152.0 | 4560 | 0.8339 | 0.6958 | | 0.5041 | 153.0 | 4590 | 0.6291 | 0.7458 | | 0.5041 | 154.0 | 4620 | 0.6723 | 0.725 | | 0.5041 | 155.0 | 4650 | 0.6450 | 0.775 | | 0.5041 | 156.0 | 4680 | 0.7696 | 0.7042 | | 0.5041 | 157.0 | 4710 | 0.6793 | 0.725 | | 0.5041 | 158.0 | 4740 | 0.7299 | 0.7042 | | 0.5041 | 159.0 | 4770 | 0.6984 | 0.7542 | | 0.5041 | 160.0 | 4800 | 0.6853 | 0.7042 | | 0.5041 | 161.0 | 4830 | 0.6883 | 0.7667 | | 0.5041 | 162.0 | 4860 | 0.6910 | 0.7292 | | 0.5041 | 163.0 | 4890 | 0.6833 | 0.7417 | | 0.5041 | 164.0 | 4920 | 0.6117 | 0.7542 | | 0.5041 | 165.0 | 4950 | 0.6456 | 0.775 | | 0.5041 | 166.0 | 4980 | 0.6448 | 0.7375 | | 0.4627 | 167.0 | 5010 | 0.6675 | 0.75 | | 0.4627 | 168.0 | 5040 | 0.7636 | 0.725 | | 0.4627 | 169.0 | 5070 | 0.6608 | 0.7417 | | 0.4627 | 170.0 | 5100 | 0.6971 | 0.7458 | | 0.4627 | 171.0 | 5130 | 0.6536 | 0.7458 | | 0.4627 | 172.0 | 5160 | 0.6441 | 0.7458 | | 0.4627 | 173.0 | 5190 | 0.6241 | 0.7542 | | 0.4627 | 174.0 | 5220 | 0.6801 | 0.7458 | | 0.4627 | 175.0 | 5250 | 0.6454 | 0.7417 | | 0.4627 | 176.0 | 5280 | 0.7308 | 0.7208 | | 0.4627 | 177.0 | 5310 | 0.6902 | 0.7292 | | 0.4627 | 178.0 | 5340 | 0.6786 | 0.7125 | | 0.4627 | 179.0 | 5370 | 0.7752 | 0.7208 | | 0.4627 | 180.0 | 5400 | 0.7500 | 0.7333 | | 0.4627 | 181.0 | 5430 | 0.7047 | 0.7042 | | 0.4627 | 182.0 | 5460 | 0.6155 | 0.75 | | 0.4627 | 183.0 | 5490 | 0.6885 | 0.7125 | | 0.4095 | 184.0 | 5520 | 0.7181 | 0.7083 | | 0.4095 | 185.0 | 5550 | 0.7138 | 0.7292 | | 0.4095 | 186.0 | 5580 | 0.7649 | 0.725 | | 0.4095 | 187.0 | 5610 | 0.6874 | 0.7417 | | 0.4095 | 188.0 | 5640 | 0.7002 | 0.7292 | | 0.4095 | 189.0 | 5670 | 0.6796 | 0.7125 | | 0.4095 | 190.0 | 5700 | 0.5897 | 0.7667 | | 0.4095 | 191.0 | 5730 | 0.7148 | 0.7542 | | 0.4095 | 192.0 | 5760 | 0.5560 | 0.7708 | | 0.4095 | 193.0 | 5790 | 0.7173 | 0.7042 | | 0.4095 | 194.0 | 5820 | 0.6560 | 0.7417 | | 0.4095 | 195.0 | 5850 | 0.7184 | 0.7208 | | 0.4095 | 196.0 | 5880 | 0.6937 | 0.7458 | | 0.4095 | 197.0 | 5910 | 0.7623 | 0.7125 | | 0.4095 | 198.0 | 5940 | 0.6437 | 0.7542 | | 0.4095 | 199.0 | 5970 | 0.6408 | 0.7708 | | 0.356 | 200.0 | 6000 | 0.6752 | 0.75 | | 0.356 | 201.0 | 6030 | 0.6156 | 0.75 | | 0.356 | 202.0 | 6060 | 0.6467 | 0.7792 | | 0.356 | 203.0 | 
6090 | 0.6800 | 0.7375 | | 0.356 | 204.0 | 6120 | 0.6864 | 0.7625 | | 0.356 | 205.0 | 6150 | 0.7565 | 0.7375 | | 0.356 | 206.0 | 6180 | 0.7232 | 0.725 | | 0.356 | 207.0 | 6210 | 0.6148 | 0.7542 | | 0.356 | 208.0 | 6240 | 0.7386 | 0.75 | | 0.356 | 209.0 | 6270 | 0.6825 | 0.75 | | 0.356 | 210.0 | 6300 | 0.6789 | 0.75 | | 0.356 | 211.0 | 6330 | 0.6893 | 0.7625 | | 0.356 | 212.0 | 6360 | 0.7192 | 0.75 | | 0.356 | 213.0 | 6390 | 0.6908 | 0.7292 | | 0.356 | 214.0 | 6420 | 0.6376 | 0.7458 | | 0.356 | 215.0 | 6450 | 0.6130 | 0.7792 | | 0.356 | 216.0 | 6480 | 0.6527 | 0.7708 | | 0.313 | 217.0 | 6510 | 0.6995 | 0.7625 | | 0.313 | 218.0 | 6540 | 0.6987 | 0.7583 | | 0.313 | 219.0 | 6570 | 0.6439 | 0.7833 | | 0.313 | 220.0 | 6600 | 0.6347 | 0.7875 | | 0.313 | 221.0 | 6630 | 0.6133 | 0.7667 | | 0.313 | 222.0 | 6660 | 0.7384 | 0.7208 | | 0.313 | 223.0 | 6690 | 0.6899 | 0.7375 | | 0.313 | 224.0 | 6720 | 0.6132 | 0.7958 | | 0.313 | 225.0 | 6750 | 0.6579 | 0.7292 | | 0.313 | 226.0 | 6780 | 0.7425 | 0.725 | | 0.313 | 227.0 | 6810 | 0.6475 | 0.75 | | 0.313 | 228.0 | 6840 | 0.7864 | 0.7417 | | 0.313 | 229.0 | 6870 | 0.7130 | 0.775 | | 0.313 | 230.0 | 6900 | 0.6899 | 0.75 | | 0.313 | 231.0 | 6930 | 0.6144 | 0.7625 | | 0.313 | 232.0 | 6960 | 0.6892 | 0.75 | | 0.313 | 233.0 | 6990 | 0.7176 | 0.7417 | | 0.2888 | 234.0 | 7020 | 0.7361 | 0.7083 | | 0.2888 | 235.0 | 7050 | 0.6285 | 0.7708 | | 0.2888 | 236.0 | 7080 | 0.7158 | 0.7208 | | 0.2888 | 237.0 | 7110 | 0.6461 | 0.75 | | 0.2888 | 238.0 | 7140 | 0.7060 | 0.75 | | 0.2888 | 239.0 | 7170 | 0.6301 | 0.7583 | | 0.2888 | 240.0 | 7200 | 0.5899 | 0.7583 | | 0.2888 | 241.0 | 7230 | 0.6283 | 0.7542 | | 0.2888 | 242.0 | 7260 | 0.6840 | 0.7542 | | 0.2888 | 243.0 | 7290 | 0.7016 | 0.7583 | | 0.2888 | 244.0 | 7320 | 0.6058 | 0.7708 | | 0.2888 | 245.0 | 7350 | 0.6432 | 0.7375 | | 0.2888 | 246.0 | 7380 | 0.6569 | 0.7792 | | 0.2888 | 247.0 | 7410 | 0.6168 | 0.7792 | | 0.2888 | 248.0 | 7440 | 0.7352 | 0.7333 | | 0.2888 | 249.0 | 7470 | 0.6468 | 0.75 | | 0.2554 | 250.0 | 7500 | 0.6646 | 0.7542 | | 0.2554 | 251.0 | 7530 | 0.7433 | 0.75 | | 0.2554 | 252.0 | 7560 | 0.7793 | 0.7625 | | 0.2554 | 253.0 | 7590 | 0.6465 | 0.7458 | | 0.2554 | 254.0 | 7620 | 0.6032 | 0.7833 | | 0.2554 | 255.0 | 7650 | 0.7673 | 0.7167 | | 0.2554 | 256.0 | 7680 | 0.7349 | 0.7083 | | 0.2554 | 257.0 | 7710 | 0.7279 | 0.7458 | | 0.2554 | 258.0 | 7740 | 0.6159 | 0.7542 | | 0.2554 | 259.0 | 7770 | 0.7130 | 0.7625 | | 0.2554 | 260.0 | 7800 | 0.6907 | 0.7708 | | 0.2554 | 261.0 | 7830 | 0.6460 | 0.725 | | 0.2554 | 262.0 | 7860 | 0.7923 | 0.725 | | 0.2554 | 263.0 | 7890 | 0.7290 | 0.7125 | | 0.2554 | 264.0 | 7920 | 0.7246 | 0.7542 | | 0.2554 | 265.0 | 7950 | 0.6314 | 0.7792 | | 0.2554 | 266.0 | 7980 | 0.7331 | 0.7708 | | 0.2271 | 267.0 | 8010 | 0.7471 | 0.775 | | 0.2271 | 268.0 | 8040 | 0.7945 | 0.7208 | | 0.2271 | 269.0 | 8070 | 0.8326 | 0.7208 | | 0.2271 | 270.0 | 8100 | 0.6376 | 0.7833 | | 0.2271 | 271.0 | 8130 | 0.7209 | 0.7542 | | 0.2271 | 272.0 | 8160 | 0.7119 | 0.7458 | | 0.2271 | 273.0 | 8190 | 0.7579 | 0.7375 | | 0.2271 | 274.0 | 8220 | 0.6190 | 0.7958 | | 0.2271 | 275.0 | 8250 | 0.7346 | 0.75 | | 0.2271 | 276.0 | 8280 | 0.5725 | 0.7875 | | 0.2271 | 277.0 | 8310 | 0.6839 | 0.7542 | | 0.2271 | 278.0 | 8340 | 0.7181 | 0.7833 | | 0.2271 | 279.0 | 8370 | 0.7686 | 0.7625 | | 0.2271 | 280.0 | 8400 | 0.7098 | 0.75 | | 0.2271 | 281.0 | 8430 | 0.7520 | 0.7542 | | 0.2271 | 282.0 | 8460 | 0.7100 | 0.7958 | | 0.2271 | 283.0 | 8490 | 0.7457 | 0.7625 | | 0.1968 | 284.0 | 8520 | 0.7145 | 0.7542 | | 0.1968 | 285.0 | 8550 | 
0.6266 | 0.7917 | | 0.1968 | 286.0 | 8580 | 0.6804 | 0.7917 | | 0.1968 | 287.0 | 8610 | 0.6829 | 0.7542 | | 0.1968 | 288.0 | 8640 | 0.6663 | 0.7625 | | 0.1968 | 289.0 | 8670 | 0.8079 | 0.7417 | | 0.1968 | 290.0 | 8700 | 0.7336 | 0.7792 | | 0.1968 | 291.0 | 8730 | 0.7564 | 0.7375 | | 0.1968 | 292.0 | 8760 | 0.6543 | 0.7792 | | 0.1968 | 293.0 | 8790 | 0.6996 | 0.75 | | 0.1968 | 294.0 | 8820 | 0.6303 | 0.8042 | | 0.1968 | 295.0 | 8850 | 0.8041 | 0.725 | | 0.1968 | 296.0 | 8880 | 0.8126 | 0.7417 | | 0.1968 | 297.0 | 8910 | 0.8852 | 0.7042 | | 0.1968 | 298.0 | 8940 | 0.7629 | 0.7292 | | 0.1968 | 299.0 | 8970 | 0.7488 | 0.7708 | | 0.1705 | 300.0 | 9000 | 0.8827 | 0.7417 | | 0.1705 | 301.0 | 9030 | 0.6436 | 0.775 | | 0.1705 | 302.0 | 9060 | 0.7171 | 0.7417 | | 0.1705 | 303.0 | 9090 | 0.8214 | 0.7417 | | 0.1705 | 304.0 | 9120 | 0.7368 | 0.7375 | | 0.1705 | 305.0 | 9150 | 0.6851 | 0.775 | | 0.1705 | 306.0 | 9180 | 0.6864 | 0.7667 | | 0.1705 | 307.0 | 9210 | 0.6693 | 0.7833 | | 0.1705 | 308.0 | 9240 | 0.7735 | 0.7625 | | 0.1705 | 309.0 | 9270 | 0.7294 | 0.7417 | | 0.1705 | 310.0 | 9300 | 0.7177 | 0.7833 | | 0.1705 | 311.0 | 9330 | 0.7042 | 0.7625 | | 0.1705 | 312.0 | 9360 | 0.6943 | 0.7875 | | 0.1705 | 313.0 | 9390 | 0.6023 | 0.7917 | | 0.1705 | 314.0 | 9420 | 0.8149 | 0.7417 | | 0.1705 | 315.0 | 9450 | 0.8316 | 0.7542 | | 0.1705 | 316.0 | 9480 | 0.7498 | 0.7583 | | 0.1559 | 317.0 | 9510 | 0.7777 | 0.75 | | 0.1559 | 318.0 | 9540 | 0.7214 | 0.7458 | | 0.1559 | 319.0 | 9570 | 0.8629 | 0.7375 | | 0.1559 | 320.0 | 9600 | 0.7550 | 0.7542 | | 0.1559 | 321.0 | 9630 | 0.7448 | 0.7583 | | 0.1559 | 322.0 | 9660 | 0.6344 | 0.8208 | | 0.1559 | 323.0 | 9690 | 0.8020 | 0.7542 | | 0.1559 | 324.0 | 9720 | 0.7172 | 0.7792 | | 0.1559 | 325.0 | 9750 | 0.7229 | 0.7833 | | 0.1559 | 326.0 | 9780 | 0.8375 | 0.7583 | | 0.1559 | 327.0 | 9810 | 0.7374 | 0.7583 | | 0.1559 | 328.0 | 9840 | 0.7407 | 0.7458 | | 0.1559 | 329.0 | 9870 | 0.7342 | 0.7875 | | 0.1559 | 330.0 | 9900 | 0.7588 | 0.7917 | | 0.1559 | 331.0 | 9930 | 0.5773 | 0.7917 | | 0.1559 | 332.0 | 9960 | 0.7909 | 0.7583 | | 0.1559 | 333.0 | 9990 | 0.7641 | 0.75 | | 0.1436 | 334.0 | 10020 | 0.7121 | 0.775 | | 0.1436 | 335.0 | 10050 | 0.8134 | 0.7542 | | 0.1436 | 336.0 | 10080 | 0.6390 | 0.7917 | | 0.1436 | 337.0 | 10110 | 0.7637 | 0.7708 | | 0.1436 | 338.0 | 10140 | 0.8571 | 0.7625 | | 0.1436 | 339.0 | 10170 | 0.6932 | 0.7917 | | 0.1436 | 340.0 | 10200 | 0.7662 | 0.7458 | | 0.1436 | 341.0 | 10230 | 0.7340 | 0.7708 | | 0.1436 | 342.0 | 10260 | 0.7839 | 0.7625 | | 0.1436 | 343.0 | 10290 | 0.8955 | 0.7708 | | 0.1436 | 344.0 | 10320 | 0.7302 | 0.7792 | | 0.1436 | 345.0 | 10350 | 0.7184 | 0.7833 | | 0.1436 | 346.0 | 10380 | 0.8096 | 0.7875 | | 0.1436 | 347.0 | 10410 | 0.6618 | 0.8042 | | 0.1436 | 348.0 | 10440 | 0.8814 | 0.725 | | 0.1436 | 349.0 | 10470 | 0.7115 | 0.7833 | | 0.1253 | 350.0 | 10500 | 0.7655 | 0.775 | | 0.1253 | 351.0 | 10530 | 1.0353 | 0.7042 | | 0.1253 | 352.0 | 10560 | 0.7779 | 0.75 | | 0.1253 | 353.0 | 10590 | 0.7690 | 0.7708 | | 0.1253 | 354.0 | 10620 | 0.8513 | 0.7708 | | 0.1253 | 355.0 | 10650 | 0.8023 | 0.7625 | | 0.1253 | 356.0 | 10680 | 0.8654 | 0.7375 | | 0.1253 | 357.0 | 10710 | 0.7158 | 0.7833 | | 0.1253 | 358.0 | 10740 | 0.7129 | 0.8083 | | 0.1253 | 359.0 | 10770 | 0.7965 | 0.7833 | | 0.1253 | 360.0 | 10800 | 0.6912 | 0.775 | | 0.1253 | 361.0 | 10830 | 0.8931 | 0.775 | | 0.1253 | 362.0 | 10860 | 0.7996 | 0.7792 | | 0.1253 | 363.0 | 10890 | 0.8954 | 0.7417 | | 0.1253 | 364.0 | 10920 | 0.7788 | 0.7583 | | 0.1253 | 365.0 | 10950 | 0.8762 | 0.7417 | 
| 0.1253 | 366.0 | 10980 | 0.7141 | 0.7792 | | 0.1137 | 367.0 | 11010 | 0.7754 | 0.7667 | | 0.1137 | 368.0 | 11040 | 0.8462 | 0.8 | | 0.1137 | 369.0 | 11070 | 0.7514 | 0.7833 | | 0.1137 | 370.0 | 11100 | 0.8935 | 0.7625 | | 0.1137 | 371.0 | 11130 | 0.6772 | 0.7875 | | 0.1137 | 372.0 | 11160 | 0.6557 | 0.8458 | | 0.1137 | 373.0 | 11190 | 0.7456 | 0.7917 | | 0.1137 | 374.0 | 11220 | 0.6444 | 0.8125 | | 0.1137 | 375.0 | 11250 | 0.8195 | 0.75 | | 0.1137 | 376.0 | 11280 | 0.7144 | 0.8083 | | 0.1137 | 377.0 | 11310 | 0.7302 | 0.7875 | | 0.1137 | 378.0 | 11340 | 0.8689 | 0.8 | | 0.1137 | 379.0 | 11370 | 0.7004 | 0.7625 | | 0.1137 | 380.0 | 11400 | 0.8927 | 0.7583 | | 0.1137 | 381.0 | 11430 | 0.8333 | 0.7583 | | 0.1137 | 382.0 | 11460 | 0.8921 | 0.7375 | | 0.1137 | 383.0 | 11490 | 0.8607 | 0.7833 | | 0.101 | 384.0 | 11520 | 0.7740 | 0.7917 | | 0.101 | 385.0 | 11550 | 0.7396 | 0.7833 | | 0.101 | 386.0 | 11580 | 0.7352 | 0.775 | | 0.101 | 387.0 | 11610 | 0.8536 | 0.7542 | | 0.101 | 388.0 | 11640 | 0.7138 | 0.7708 | | 0.101 | 389.0 | 11670 | 0.9080 | 0.7375 | | 0.101 | 390.0 | 11700 | 0.8300 | 0.7583 | | 0.101 | 391.0 | 11730 | 0.8436 | 0.7417 | | 0.101 | 392.0 | 11760 | 0.8957 | 0.75 | | 0.101 | 393.0 | 11790 | 0.7883 | 0.7833 | | 0.101 | 394.0 | 11820 | 0.8817 | 0.7583 | | 0.101 | 395.0 | 11850 | 0.8368 | 0.7583 | | 0.101 | 396.0 | 11880 | 0.7682 | 0.775 | | 0.101 | 397.0 | 11910 | 0.7059 | 0.8167 | | 0.101 | 398.0 | 11940 | 0.7397 | 0.7833 | | 0.101 | 399.0 | 11970 | 0.7836 | 0.7667 | | 0.0931 | 400.0 | 12000 | 0.8686 | 0.7667 | | 0.0931 | 401.0 | 12030 | 0.6950 | 0.7708 | | 0.0931 | 402.0 | 12060 | 0.8184 | 0.7667 | | 0.0931 | 403.0 | 12090 | 0.8252 | 0.8083 | | 0.0931 | 404.0 | 12120 | 0.7665 | 0.7792 | | 0.0931 | 405.0 | 12150 | 0.7455 | 0.8125 | | 0.0931 | 406.0 | 12180 | 0.9008 | 0.7958 | | 0.0931 | 407.0 | 12210 | 0.8553 | 0.7667 | | 0.0931 | 408.0 | 12240 | 0.9953 | 0.7625 | | 0.0931 | 409.0 | 12270 | 0.6862 | 0.8 | | 0.0931 | 410.0 | 12300 | 0.8894 | 0.7833 | | 0.0931 | 411.0 | 12330 | 0.8286 | 0.7792 | | 0.0931 | 412.0 | 12360 | 0.9056 | 0.7958 | | 0.0931 | 413.0 | 12390 | 0.7168 | 0.7958 | | 0.0931 | 414.0 | 12420 | 0.8387 | 0.7958 | | 0.0931 | 415.0 | 12450 | 0.7707 | 0.7958 | | 0.0931 | 416.0 | 12480 | 0.8852 | 0.7708 | | 0.0883 | 417.0 | 12510 | 0.7532 | 0.7833 | | 0.0883 | 418.0 | 12540 | 0.7912 | 0.7875 | | 0.0883 | 419.0 | 12570 | 0.7997 | 0.7708 | | 0.0883 | 420.0 | 12600 | 0.7568 | 0.7833 | | 0.0883 | 421.0 | 12630 | 0.8160 | 0.7792 | | 0.0883 | 422.0 | 12660 | 0.9379 | 0.7625 | | 0.0883 | 423.0 | 12690 | 0.8172 | 0.775 | | 0.0883 | 424.0 | 12720 | 0.9567 | 0.7458 | | 0.0883 | 425.0 | 12750 | 0.7185 | 0.8 | | 0.0883 | 426.0 | 12780 | 0.9086 | 0.7583 | | 0.0883 | 427.0 | 12810 | 0.8753 | 0.7792 | | 0.0883 | 428.0 | 12840 | 0.9000 | 0.7708 | | 0.0883 | 429.0 | 12870 | 0.7688 | 0.7708 | | 0.0883 | 430.0 | 12900 | 0.9019 | 0.7708 | | 0.0883 | 431.0 | 12930 | 0.8712 | 0.7833 | | 0.0883 | 432.0 | 12960 | 0.8873 | 0.7667 | | 0.0883 | 433.0 | 12990 | 0.8546 | 0.7667 | | 0.0767 | 434.0 | 13020 | 0.8754 | 0.7792 | | 0.0767 | 435.0 | 13050 | 0.8427 | 0.775 | | 0.0767 | 436.0 | 13080 | 0.8272 | 0.7958 | | 0.0767 | 437.0 | 13110 | 0.9191 | 0.7625 | | 0.0767 | 438.0 | 13140 | 0.9184 | 0.7583 | | 0.0767 | 439.0 | 13170 | 0.9290 | 0.775 | | 0.0767 | 440.0 | 13200 | 0.8154 | 0.7917 | | 0.0767 | 441.0 | 13230 | 0.9080 | 0.7792 | | 0.0767 | 442.0 | 13260 | 0.7309 | 0.7875 | | 0.0767 | 443.0 | 13290 | 0.9137 | 0.7792 | | 0.0767 | 444.0 | 13320 | 0.8898 | 0.775 | | 0.0767 | 445.0 | 13350 | 0.8037 | 
0.7708 | | 0.0767 | 446.0 | 13380 | 0.8286 | 0.7708 | | 0.0767 | 447.0 | 13410 | 0.7343 | 0.8292 | | 0.0767 | 448.0 | 13440 | 0.9386 | 0.7708 | | 0.0767 | 449.0 | 13470 | 0.9273 | 0.7625 | | 0.0765 | 450.0 | 13500 | 0.8368 | 0.8167 | | 0.0765 | 451.0 | 13530 | 0.9534 | 0.7875 | | 0.0765 | 452.0 | 13560 | 0.9879 | 0.7708 | | 0.0765 | 453.0 | 13590 | 0.8020 | 0.7792 | | 0.0765 | 454.0 | 13620 | 0.8479 | 0.7792 | | 0.0765 | 455.0 | 13650 | 0.7941 | 0.775 | | 0.0765 | 456.0 | 13680 | 0.8531 | 0.7875 | | 0.0765 | 457.0 | 13710 | 0.8524 | 0.775 | | 0.0765 | 458.0 | 13740 | 0.8314 | 0.7542 | | 0.0765 | 459.0 | 13770 | 1.0210 | 0.75 | | 0.0765 | 460.0 | 13800 | 0.7902 | 0.775 | | 0.0765 | 461.0 | 13830 | 0.8209 | 0.8042 | | 0.0765 | 462.0 | 13860 | 0.9832 | 0.7542 | | 0.0765 | 463.0 | 13890 | 0.8477 | 0.7875 | | 0.0765 | 464.0 | 13920 | 0.8430 | 0.7875 | | 0.0765 | 465.0 | 13950 | 0.9338 | 0.75 | | 0.0765 | 466.0 | 13980 | 0.8497 | 0.8042 | | 0.0624 | 467.0 | 14010 | 0.7791 | 0.8042 | | 0.0624 | 468.0 | 14040 | 0.8319 | 0.7958 | | 0.0624 | 469.0 | 14070 | 0.9360 | 0.7708 | | 0.0624 | 470.0 | 14100 | 0.7524 | 0.7833 | | 0.0624 | 471.0 | 14130 | 0.9504 | 0.7833 | | 0.0624 | 472.0 | 14160 | 1.0148 | 0.75 | | 0.0624 | 473.0 | 14190 | 0.9203 | 0.7708 | | 0.0624 | 474.0 | 14220 | 0.7555 | 0.7833 | | 0.0624 | 475.0 | 14250 | 0.8002 | 0.7917 | | 0.0624 | 476.0 | 14280 | 1.1156 | 0.7875 | | 0.0624 | 477.0 | 14310 | 0.8743 | 0.7917 | | 0.0624 | 478.0 | 14340 | 0.8853 | 0.7667 | | 0.0624 | 479.0 | 14370 | 0.9660 | 0.7792 | | 0.0624 | 480.0 | 14400 | 0.8608 | 0.7667 | | 0.0624 | 481.0 | 14430 | 0.8446 | 0.7875 | | 0.0624 | 482.0 | 14460 | 0.8183 | 0.7875 | | 0.0624 | 483.0 | 14490 | 1.0341 | 0.7958 | | 0.0601 | 484.0 | 14520 | 0.8423 | 0.7833 | | 0.0601 | 485.0 | 14550 | 1.0109 | 0.7542 | | 0.0601 | 486.0 | 14580 | 0.8732 | 0.7875 | | 0.0601 | 487.0 | 14610 | 0.8673 | 0.7583 | | 0.0601 | 488.0 | 14640 | 0.8886 | 0.7875 | | 0.0601 | 489.0 | 14670 | 0.9933 | 0.7875 | | 0.0601 | 490.0 | 14700 | 1.0210 | 0.7458 | | 0.0601 | 491.0 | 14730 | 0.9674 | 0.7583 | | 0.0601 | 492.0 | 14760 | 0.8121 | 0.7708 | | 0.0601 | 493.0 | 14790 | 0.9943 | 0.7542 | | 0.0601 | 494.0 | 14820 | 0.7857 | 0.7917 | | 0.0601 | 495.0 | 14850 | 1.0005 | 0.775 | | 0.0601 | 496.0 | 14880 | 1.0947 | 0.7833 | | 0.0601 | 497.0 | 14910 | 0.9080 | 0.7833 | | 0.0601 | 498.0 | 14940 | 0.8516 | 0.8167 | | 0.0601 | 499.0 | 14970 | 1.0218 | 0.7708 | | 0.0519 | 500.0 | 15000 | 1.0543 | 0.7458 | | 0.0519 | 501.0 | 15030 | 0.8406 | 0.8125 | | 0.0519 | 502.0 | 15060 | 0.8366 | 0.8042 | | 0.0519 | 503.0 | 15090 | 0.9075 | 0.8083 | | 0.0519 | 504.0 | 15120 | 1.0783 | 0.775 | | 0.0519 | 505.0 | 15150 | 0.9681 | 0.7667 | | 0.0519 | 506.0 | 15180 | 1.1011 | 0.775 | | 0.0519 | 507.0 | 15210 | 0.9403 | 0.7917 | | 0.0519 | 508.0 | 15240 | 0.8061 | 0.7917 | | 0.0519 | 509.0 | 15270 | 0.8726 | 0.8 | | 0.0519 | 510.0 | 15300 | 1.1059 | 0.7417 | | 0.0519 | 511.0 | 15330 | 0.8516 | 0.8125 | | 0.0519 | 512.0 | 15360 | 1.1448 | 0.75 | | 0.0519 | 513.0 | 15390 | 0.8929 | 0.7583 | | 0.0519 | 514.0 | 15420 | 0.8666 | 0.8083 | | 0.0519 | 515.0 | 15450 | 0.9064 | 0.7875 | | 0.0519 | 516.0 | 15480 | 0.8795 | 0.8083 | | 0.051 | 517.0 | 15510 | 1.0771 | 0.75 | | 0.051 | 518.0 | 15540 | 0.9753 | 0.7875 | | 0.051 | 519.0 | 15570 | 0.8163 | 0.825 | | 0.051 | 520.0 | 15600 | 0.8755 | 0.7917 | | 0.051 | 521.0 | 15630 | 0.9051 | 0.7833 | | 0.051 | 522.0 | 15660 | 1.1984 | 0.7625 | | 0.051 | 523.0 | 15690 | 1.0198 | 0.775 | | 0.051 | 524.0 | 15720 | 1.0167 | 0.7958 | | 0.051 | 525.0 | 
15750 | 1.0155 | 0.7792 | | 0.051 | 526.0 | 15780 | 1.0082 | 0.7458 | | 0.051 | 527.0 | 15810 | 1.0284 | 0.75 | | 0.051 | 528.0 | 15840 | 1.1984 | 0.7542 | | 0.051 | 529.0 | 15870 | 0.7895 | 0.7875 | | 0.051 | 530.0 | 15900 | 1.1221 | 0.7625 | | 0.051 | 531.0 | 15930 | 0.9090 | 0.7875 | | 0.051 | 532.0 | 15960 | 0.9495 | 0.7875 | | 0.051 | 533.0 | 15990 | 0.9953 | 0.775 | | 0.0443 | 534.0 | 16020 | 0.9892 | 0.7708 | | 0.0443 | 535.0 | 16050 | 0.9191 | 0.7917 | | 0.0443 | 536.0 | 16080 | 0.9072 | 0.7708 | | 0.0443 | 537.0 | 16110 | 0.9554 | 0.7625 | | 0.0443 | 538.0 | 16140 | 1.0217 | 0.7667 | | 0.0443 | 539.0 | 16170 | 0.9230 | 0.7667 | | 0.0443 | 540.0 | 16200 | 1.0109 | 0.7708 | | 0.0443 | 541.0 | 16230 | 0.9582 | 0.7667 | | 0.0443 | 542.0 | 16260 | 0.8373 | 0.7917 | | 0.0443 | 543.0 | 16290 | 1.0174 | 0.7833 | | 0.0443 | 544.0 | 16320 | 0.9116 | 0.7833 | | 0.0443 | 545.0 | 16350 | 0.8073 | 0.7625 | | 0.0443 | 546.0 | 16380 | 0.8774 | 0.8125 | | 0.0443 | 547.0 | 16410 | 0.9941 | 0.8 | | 0.0443 | 548.0 | 16440 | 1.0271 | 0.7708 | | 0.0443 | 549.0 | 16470 | 1.1320 | 0.8 | | 0.0482 | 550.0 | 16500 | 0.9242 | 0.7833 | | 0.0482 | 551.0 | 16530 | 1.1143 | 0.7875 | | 0.0482 | 552.0 | 16560 | 0.8519 | 0.7958 | | 0.0482 | 553.0 | 16590 | 0.9047 | 0.7667 | | 0.0482 | 554.0 | 16620 | 1.0526 | 0.7625 | | 0.0482 | 555.0 | 16650 | 1.0065 | 0.7958 | | 0.0482 | 556.0 | 16680 | 0.7897 | 0.8083 | | 0.0482 | 557.0 | 16710 | 0.9977 | 0.7708 | | 0.0482 | 558.0 | 16740 | 0.9471 | 0.7958 | | 0.0482 | 559.0 | 16770 | 0.8924 | 0.8083 | | 0.0482 | 560.0 | 16800 | 1.0979 | 0.8 | | 0.0482 | 561.0 | 16830 | 0.9277 | 0.7792 | | 0.0482 | 562.0 | 16860 | 0.9281 | 0.7917 | | 0.0482 | 563.0 | 16890 | 0.9837 | 0.7917 | | 0.0482 | 564.0 | 16920 | 1.0288 | 0.7833 | | 0.0482 | 565.0 | 16950 | 0.9448 | 0.7833 | | 0.0482 | 566.0 | 16980 | 0.9409 | 0.7542 | | 0.0425 | 567.0 | 17010 | 0.9580 | 0.8 | | 0.0425 | 568.0 | 17040 | 0.9897 | 0.8208 | | 0.0425 | 569.0 | 17070 | 0.9118 | 0.7917 | | 0.0425 | 570.0 | 17100 | 0.8612 | 0.8042 | | 0.0425 | 571.0 | 17130 | 1.0763 | 0.7792 | | 0.0425 | 572.0 | 17160 | 1.2041 | 0.8042 | | 0.0425 | 573.0 | 17190 | 1.0140 | 0.8 | | 0.0425 | 574.0 | 17220 | 1.0150 | 0.775 | | 0.0425 | 575.0 | 17250 | 0.9961 | 0.7958 | | 0.0425 | 576.0 | 17280 | 0.9425 | 0.7958 | | 0.0425 | 577.0 | 17310 | 1.0492 | 0.7958 | | 0.0425 | 578.0 | 17340 | 1.1043 | 0.7667 | | 0.0425 | 579.0 | 17370 | 1.0070 | 0.7792 | | 0.0425 | 580.0 | 17400 | 0.9145 | 0.7917 | | 0.0425 | 581.0 | 17430 | 0.9027 | 0.7667 | | 0.0425 | 582.0 | 17460 | 1.0716 | 0.7417 | | 0.0425 | 583.0 | 17490 | 1.0289 | 0.7708 | | 0.04 | 584.0 | 17520 | 1.1259 | 0.7417 | | 0.04 | 585.0 | 17550 | 1.0531 | 0.775 | | 0.04 | 586.0 | 17580 | 1.1063 | 0.7583 | | 0.04 | 587.0 | 17610 | 0.8817 | 0.7958 | | 0.04 | 588.0 | 17640 | 1.1537 | 0.775 | | 0.04 | 589.0 | 17670 | 0.9091 | 0.7708 | | 0.04 | 590.0 | 17700 | 1.0775 | 0.775 | | 0.04 | 591.0 | 17730 | 1.2687 | 0.7292 | | 0.04 | 592.0 | 17760 | 1.1212 | 0.7708 | | 0.04 | 593.0 | 17790 | 1.1881 | 0.7667 | | 0.04 | 594.0 | 17820 | 0.8995 | 0.7667 | | 0.04 | 595.0 | 17850 | 1.1407 | 0.75 | | 0.04 | 596.0 | 17880 | 1.1037 | 0.775 | | 0.04 | 597.0 | 17910 | 1.0381 | 0.7833 | | 0.04 | 598.0 | 17940 | 1.0942 | 0.7708 | | 0.04 | 599.0 | 17970 | 1.0618 | 0.7625 | | 0.0424 | 600.0 | 18000 | 1.1417 | 0.7958 | | 0.0424 | 601.0 | 18030 | 0.9643 | 0.8 | | 0.0424 | 602.0 | 18060 | 1.0011 | 0.7958 | | 0.0424 | 603.0 | 18090 | 1.1746 | 0.7667 | | 0.0424 | 604.0 | 18120 | 1.0480 | 0.8042 | | 0.0424 | 605.0 | 18150 | 1.0217 | 
0.7875 | | 0.0424 | 606.0 | 18180 | 1.1289 | 0.7875 | | 0.0424 | 607.0 | 18210 | 1.0391 | 0.7958 | | 0.0424 | 608.0 | 18240 | 0.9223 | 0.7833 | | 0.0424 | 609.0 | 18270 | 0.9656 | 0.7792 | | 0.0424 | 610.0 | 18300 | 0.9802 | 0.8042 | | 0.0424 | 611.0 | 18330 | 0.8627 | 0.7917 | | 0.0424 | 612.0 | 18360 | 1.0566 | 0.7583 | | 0.0424 | 613.0 | 18390 | 0.9372 | 0.8042 | | 0.0424 | 614.0 | 18420 | 1.0899 | 0.7667 | | 0.0424 | 615.0 | 18450 | 0.9943 | 0.8 | | 0.0424 | 616.0 | 18480 | 1.0667 | 0.7583 | | 0.0384 | 617.0 | 18510 | 0.9348 | 0.7833 | | 0.0384 | 618.0 | 18540 | 0.9068 | 0.8042 | | 0.0384 | 619.0 | 18570 | 0.9632 | 0.8083 | | 0.0384 | 620.0 | 18600 | 0.9791 | 0.8083 | | 0.0384 | 621.0 | 18630 | 1.0912 | 0.7917 | | 0.0384 | 622.0 | 18660 | 0.9291 | 0.8 | | 0.0384 | 623.0 | 18690 | 1.0545 | 0.775 | | 0.0384 | 624.0 | 18720 | 0.9779 | 0.7833 | | 0.0384 | 625.0 | 18750 | 1.1038 | 0.7792 | | 0.0384 | 626.0 | 18780 | 0.7907 | 0.8208 | | 0.0384 | 627.0 | 18810 | 1.0893 | 0.7958 | | 0.0384 | 628.0 | 18840 | 0.8459 | 0.825 | | 0.0384 | 629.0 | 18870 | 1.1719 | 0.7708 | | 0.0384 | 630.0 | 18900 | 0.9541 | 0.7792 | | 0.0384 | 631.0 | 18930 | 1.2298 | 0.775 | | 0.0384 | 632.0 | 18960 | 1.1361 | 0.75 | | 0.0384 | 633.0 | 18990 | 1.1265 | 0.7667 | | 0.0338 | 634.0 | 19020 | 0.9982 | 0.8042 | | 0.0338 | 635.0 | 19050 | 0.9905 | 0.8042 | | 0.0338 | 636.0 | 19080 | 1.3666 | 0.7625 | | 0.0338 | 637.0 | 19110 | 1.1413 | 0.8083 | | 0.0338 | 638.0 | 19140 | 1.0829 | 0.7875 | | 0.0338 | 639.0 | 19170 | 1.0280 | 0.7833 | | 0.0338 | 640.0 | 19200 | 0.9060 | 0.8042 | | 0.0338 | 641.0 | 19230 | 1.1426 | 0.7583 | | 0.0338 | 642.0 | 19260 | 1.0235 | 0.775 | | 0.0338 | 643.0 | 19290 | 1.3322 | 0.7542 | | 0.0338 | 644.0 | 19320 | 1.2293 | 0.7708 | | 0.0338 | 645.0 | 19350 | 0.9707 | 0.8 | | 0.0338 | 646.0 | 19380 | 1.1312 | 0.7583 | | 0.0338 | 647.0 | 19410 | 1.1248 | 0.7833 | | 0.0338 | 648.0 | 19440 | 1.0716 | 0.7792 | | 0.0338 | 649.0 | 19470 | 1.0553 | 0.8042 | | 0.0385 | 650.0 | 19500 | 1.0426 | 0.8083 | | 0.0385 | 651.0 | 19530 | 1.2204 | 0.7458 | | 0.0385 | 652.0 | 19560 | 0.9107 | 0.8 | | 0.0385 | 653.0 | 19590 | 1.3038 | 0.7583 | | 0.0385 | 654.0 | 19620 | 1.2031 | 0.7417 | | 0.0385 | 655.0 | 19650 | 1.1108 | 0.7458 | | 0.0385 | 656.0 | 19680 | 1.2788 | 0.7917 | | 0.0385 | 657.0 | 19710 | 0.9632 | 0.8292 | | 0.0385 | 658.0 | 19740 | 1.0638 | 0.7958 | | 0.0385 | 659.0 | 19770 | 0.8664 | 0.8 | | 0.0385 | 660.0 | 19800 | 1.0600 | 0.7708 | | 0.0385 | 661.0 | 19830 | 1.1436 | 0.7625 | | 0.0385 | 662.0 | 19860 | 1.2493 | 0.8042 | | 0.0385 | 663.0 | 19890 | 1.1268 | 0.775 | | 0.0385 | 664.0 | 19920 | 1.0522 | 0.7708 | | 0.0385 | 665.0 | 19950 | 1.0047 | 0.7958 | | 0.0385 | 666.0 | 19980 | 1.1241 | 0.75 | | 0.0323 | 667.0 | 20010 | 1.0234 | 0.7875 | | 0.0323 | 668.0 | 20040 | 0.9342 | 0.8167 | | 0.0323 | 669.0 | 20070 | 1.2394 | 0.7875 | | 0.0323 | 670.0 | 20100 | 1.1028 | 0.775 | | 0.0323 | 671.0 | 20130 | 0.9562 | 0.8042 | | 0.0323 | 672.0 | 20160 | 1.2145 | 0.7583 | | 0.0323 | 673.0 | 20190 | 1.0855 | 0.7833 | | 0.0323 | 674.0 | 20220 | 1.3850 | 0.7708 | | 0.0323 | 675.0 | 20250 | 0.9828 | 0.7958 | | 0.0323 | 676.0 | 20280 | 1.1585 | 0.8167 | | 0.0323 | 677.0 | 20310 | 0.9875 | 0.7625 | | 0.0323 | 678.0 | 20340 | 1.0551 | 0.7792 | | 0.0323 | 679.0 | 20370 | 1.0060 | 0.7792 | | 0.0323 | 680.0 | 20400 | 1.1597 | 0.7875 | | 0.0323 | 681.0 | 20430 | 1.3096 | 0.7792 | | 0.0323 | 682.0 | 20460 | 1.1257 | 0.7833 | | 0.0323 | 683.0 | 20490 | 1.2330 | 0.7708 | | 0.0283 | 684.0 | 20520 | 1.0005 | 0.7917 | | 0.0283 | 
685.0 | 20550 | 1.0816 | 0.7625 | | 0.0283 | 686.0 | 20580 | 1.1316 | 0.7917 | | 0.0283 | 687.0 | 20610 | 1.0168 | 0.7958 | | 0.0283 | 688.0 | 20640 | 1.0353 | 0.8208 | | 0.0283 | 689.0 | 20670 | 0.9711 | 0.7875 | | 0.0283 | 690.0 | 20700 | 1.0280 | 0.7917 | | 0.0283 | 691.0 | 20730 | 1.0708 | 0.8 | | 0.0283 | 692.0 | 20760 | 1.2052 | 0.7583 | | 0.0283 | 693.0 | 20790 | 1.1467 | 0.7625 | | 0.0283 | 694.0 | 20820 | 0.9665 | 0.8 | | 0.0283 | 695.0 | 20850 | 1.2117 | 0.7667 | | 0.0283 | 696.0 | 20880 | 0.9303 | 0.8042 | | 0.0283 | 697.0 | 20910 | 1.0587 | 0.7833 | | 0.0283 | 698.0 | 20940 | 1.0979 | 0.7833 | | 0.0283 | 699.0 | 20970 | 0.9975 | 0.8042 | | 0.0326 | 700.0 | 21000 | 1.1632 | 0.7875 | | 0.0326 | 701.0 | 21030 | 0.9511 | 0.8167 | | 0.0326 | 702.0 | 21060 | 1.1559 | 0.7542 | | 0.0326 | 703.0 | 21090 | 1.2330 | 0.7583 | | 0.0326 | 704.0 | 21120 | 0.9977 | 0.7833 | | 0.0326 | 705.0 | 21150 | 1.1061 | 0.775 | | 0.0326 | 706.0 | 21180 | 1.1159 | 0.7917 | | 0.0326 | 707.0 | 21210 | 0.9022 | 0.8292 | | 0.0326 | 708.0 | 21240 | 1.1994 | 0.75 | | 0.0326 | 709.0 | 21270 | 0.9543 | 0.8208 | | 0.0326 | 710.0 | 21300 | 0.9715 | 0.8125 | | 0.0326 | 711.0 | 21330 | 1.3772 | 0.7708 | | 0.0326 | 712.0 | 21360 | 1.0394 | 0.8083 | | 0.0326 | 713.0 | 21390 | 0.9502 | 0.7958 | | 0.0326 | 714.0 | 21420 | 1.1898 | 0.7583 | | 0.0326 | 715.0 | 21450 | 1.0820 | 0.8 | | 0.0326 | 716.0 | 21480 | 1.3499 | 0.7375 | | 0.0301 | 717.0 | 21510 | 1.2789 | 0.7667 | | 0.0301 | 718.0 | 21540 | 1.0876 | 0.7833 | | 0.0301 | 719.0 | 21570 | 1.1392 | 0.775 | | 0.0301 | 720.0 | 21600 | 1.2245 | 0.7667 | | 0.0301 | 721.0 | 21630 | 1.1443 | 0.8042 | | 0.0301 | 722.0 | 21660 | 1.1618 | 0.7917 | | 0.0301 | 723.0 | 21690 | 1.0548 | 0.7958 | | 0.0301 | 724.0 | 21720 | 1.0702 | 0.8083 | | 0.0301 | 725.0 | 21750 | 1.0840 | 0.7833 | | 0.0301 | 726.0 | 21780 | 1.2450 | 0.8125 | | 0.0301 | 727.0 | 21810 | 1.1175 | 0.7917 | | 0.0301 | 728.0 | 21840 | 1.3475 | 0.7833 | | 0.0301 | 729.0 | 21870 | 1.1527 | 0.7625 | | 0.0301 | 730.0 | 21900 | 1.0586 | 0.7625 | | 0.0301 | 731.0 | 21930 | 1.0848 | 0.7917 | | 0.0301 | 732.0 | 21960 | 1.1244 | 0.7708 | | 0.0301 | 733.0 | 21990 | 1.1275 | 0.775 | | 0.0272 | 734.0 | 22020 | 0.9909 | 0.8042 | | 0.0272 | 735.0 | 22050 | 1.2026 | 0.7625 | | 0.0272 | 736.0 | 22080 | 1.0991 | 0.8125 | | 0.0272 | 737.0 | 22110 | 1.2394 | 0.7708 | | 0.0272 | 738.0 | 22140 | 1.0878 | 0.7917 | | 0.0272 | 739.0 | 22170 | 1.1101 | 0.8125 | | 0.0272 | 740.0 | 22200 | 1.1070 | 0.8167 | | 0.0272 | 741.0 | 22230 | 1.0441 | 0.8125 | | 0.0272 | 742.0 | 22260 | 1.0896 | 0.7958 | | 0.0272 | 743.0 | 22290 | 1.0720 | 0.8333 | | 0.0272 | 744.0 | 22320 | 1.0240 | 0.8 | | 0.0272 | 745.0 | 22350 | 1.0434 | 0.7875 | | 0.0272 | 746.0 | 22380 | 1.3495 | 0.7708 | | 0.0272 | 747.0 | 22410 | 1.0755 | 0.775 | | 0.0272 | 748.0 | 22440 | 1.2062 | 0.7917 | | 0.0272 | 749.0 | 22470 | 1.1181 | 0.8167 | | 0.0249 | 750.0 | 22500 | 0.8874 | 0.8 | | 0.0249 | 751.0 | 22530 | 1.0670 | 0.8042 | | 0.0249 | 752.0 | 22560 | 1.0972 | 0.8042 | | 0.0249 | 753.0 | 22590 | 1.2970 | 0.7583 | | 0.0249 | 754.0 | 22620 | 1.1709 | 0.7875 | | 0.0249 | 755.0 | 22650 | 1.0521 | 0.8125 | | 0.0249 | 756.0 | 22680 | 1.1236 | 0.8042 | | 0.0249 | 757.0 | 22710 | 1.2493 | 0.7792 | | 0.0249 | 758.0 | 22740 | 1.2774 | 0.8042 | | 0.0249 | 759.0 | 22770 | 0.9895 | 0.8167 | | 0.0249 | 760.0 | 22800 | 1.1395 | 0.7833 | | 0.0249 | 761.0 | 22830 | 1.0139 | 0.8083 | | 0.0249 | 762.0 | 22860 | 1.2967 | 0.8 | | 0.0249 | 763.0 | 22890 | 1.1264 | 0.7833 | | 0.0249 | 764.0 | 22920 | 1.1385 
| 0.775 | | 0.0249 | 765.0 | 22950 | 1.0383 | 0.7958 | | 0.0249 | 766.0 | 22980 | 1.2166 | 0.8 | | 0.0262 | 767.0 | 23010 | 1.0318 | 0.8042 | | 0.0262 | 768.0 | 23040 | 1.2273 | 0.7792 | | 0.0262 | 769.0 | 23070 | 1.1309 | 0.8167 | | 0.0262 | 770.0 | 23100 | 1.2404 | 0.7792 | | 0.0262 | 771.0 | 23130 | 1.2287 | 0.7917 | | 0.0262 | 772.0 | 23160 | 1.0488 | 0.8208 | | 0.0262 | 773.0 | 23190 | 1.0266 | 0.7833 | | 0.0262 | 774.0 | 23220 | 1.1560 | 0.7792 | | 0.0262 | 775.0 | 23250 | 1.2589 | 0.775 | | 0.0262 | 776.0 | 23280 | 1.2857 | 0.7542 | | 0.0262 | 777.0 | 23310 | 1.0205 | 0.7958 | | 0.0262 | 778.0 | 23340 | 0.9728 | 0.8292 | | 0.0262 | 779.0 | 23370 | 1.1494 | 0.7958 | | 0.0262 | 780.0 | 23400 | 1.1652 | 0.7708 | | 0.0262 | 781.0 | 23430 | 1.1576 | 0.8 | | 0.0262 | 782.0 | 23460 | 1.1547 | 0.8042 | | 0.0262 | 783.0 | 23490 | 1.0655 | 0.7792 | | 0.0253 | 784.0 | 23520 | 1.0329 | 0.8 | | 0.0253 | 785.0 | 23550 | 1.2010 | 0.7875 | | 0.0253 | 786.0 | 23580 | 1.3480 | 0.7917 | | 0.0253 | 787.0 | 23610 | 1.0791 | 0.775 | | 0.0253 | 788.0 | 23640 | 1.1344 | 0.7875 | | 0.0253 | 789.0 | 23670 | 1.3103 | 0.7958 | | 0.0253 | 790.0 | 23700 | 1.2800 | 0.7917 | | 0.0253 | 791.0 | 23730 | 1.3279 | 0.7792 | | 0.0253 | 792.0 | 23760 | 1.2755 | 0.825 | | 0.0253 | 793.0 | 23790 | 1.2713 | 0.7792 | | 0.0253 | 794.0 | 23820 | 1.2899 | 0.8 | | 0.0253 | 795.0 | 23850 | 1.2386 | 0.7833 | | 0.0253 | 796.0 | 23880 | 1.1535 | 0.8 | | 0.0253 | 797.0 | 23910 | 1.0342 | 0.8125 | | 0.0253 | 798.0 | 23940 | 1.0722 | 0.7958 | | 0.0253 | 799.0 | 23970 | 1.2387 | 0.8 | | 0.0207 | 800.0 | 24000 | 1.0833 | 0.8167 | | 0.0207 | 801.0 | 24030 | 1.1337 | 0.7792 | | 0.0207 | 802.0 | 24060 | 1.1895 | 0.7667 | | 0.0207 | 803.0 | 24090 | 0.9990 | 0.8125 | | 0.0207 | 804.0 | 24120 | 1.0142 | 0.8083 | | 0.0207 | 805.0 | 24150 | 1.1297 | 0.7958 | | 0.0207 | 806.0 | 24180 | 0.9841 | 0.7917 | | 0.0207 | 807.0 | 24210 | 1.3211 | 0.7625 | | 0.0207 | 808.0 | 24240 | 1.3687 | 0.7792 | | 0.0207 | 809.0 | 24270 | 1.4000 | 0.8083 | | 0.0207 | 810.0 | 24300 | 1.0097 | 0.7833 | | 0.0207 | 811.0 | 24330 | 0.9740 | 0.8083 | | 0.0207 | 812.0 | 24360 | 1.4046 | 0.7708 | | 0.0207 | 813.0 | 24390 | 1.1827 | 0.7833 | | 0.0207 | 814.0 | 24420 | 0.9860 | 0.8 | | 0.0207 | 815.0 | 24450 | 1.1385 | 0.775 | | 0.0207 | 816.0 | 24480 | 1.2281 | 0.8125 | | 0.0231 | 817.0 | 24510 | 1.3088 | 0.8042 | | 0.0231 | 818.0 | 24540 | 1.1096 | 0.8 | | 0.0231 | 819.0 | 24570 | 0.9547 | 0.7833 | | 0.0231 | 820.0 | 24600 | 1.0840 | 0.7625 | | 0.0231 | 821.0 | 24630 | 1.1569 | 0.7875 | | 0.0231 | 822.0 | 24660 | 1.1002 | 0.7792 | | 0.0231 | 823.0 | 24690 | 1.1376 | 0.7917 | | 0.0231 | 824.0 | 24720 | 1.0571 | 0.8 | | 0.0231 | 825.0 | 24750 | 1.0773 | 0.8083 | | 0.0231 | 826.0 | 24780 | 1.1077 | 0.7833 | | 0.0231 | 827.0 | 24810 | 1.2458 | 0.775 | | 0.0231 | 828.0 | 24840 | 1.1643 | 0.7625 | | 0.0231 | 829.0 | 24870 | 1.3109 | 0.8042 | | 0.0231 | 830.0 | 24900 | 1.2572 | 0.7833 | | 0.0231 | 831.0 | 24930 | 1.1068 | 0.8083 | | 0.0231 | 832.0 | 24960 | 1.1389 | 0.8 | | 0.0231 | 833.0 | 24990 | 1.0897 | 0.7958 | | 0.0207 | 834.0 | 25020 | 1.2637 | 0.8042 | | 0.0207 | 835.0 | 25050 | 1.3100 | 0.7833 | | 0.0207 | 836.0 | 25080 | 0.9977 | 0.8 | | 0.0207 | 837.0 | 25110 | 1.1378 | 0.8083 | | 0.0207 | 838.0 | 25140 | 1.3694 | 0.7917 | | 0.0207 | 839.0 | 25170 | 1.0631 | 0.8167 | | 0.0207 | 840.0 | 25200 | 1.1366 | 0.7875 | | 0.0207 | 841.0 | 25230 | 1.2233 | 0.7625 | | 0.0207 | 842.0 | 25260 | 1.2924 | 0.7792 | | 0.0207 | 843.0 | 25290 | 1.2211 | 0.8083 | | 0.0207 | 844.0 | 25320 | 
1.1005 | 0.8 | | 0.0207 | 845.0 | 25350 | 1.2151 | 0.825 | | 0.0207 | 846.0 | 25380 | 1.1359 | 0.8 | | 0.0207 | 847.0 | 25410 | 0.8798 | 0.8208 | | 0.0207 | 848.0 | 25440 | 1.1880 | 0.7833 | | 0.0207 | 849.0 | 25470 | 0.9717 | 0.825 | | 0.0215 | 850.0 | 25500 | 1.5594 | 0.7625 | | 0.0215 | 851.0 | 25530 | 1.1872 | 0.8083 | | 0.0215 | 852.0 | 25560 | 1.1898 | 0.8042 | | 0.0215 | 853.0 | 25590 | 1.1967 | 0.7958 | | 0.0215 | 854.0 | 25620 | 1.1443 | 0.7667 | | 0.0215 | 855.0 | 25650 | 1.1826 | 0.7708 | | 0.0215 | 856.0 | 25680 | 1.1462 | 0.7875 | | 0.0215 | 857.0 | 25710 | 1.4278 | 0.7625 | | 0.0215 | 858.0 | 25740 | 1.2068 | 0.8 | | 0.0215 | 859.0 | 25770 | 1.2596 | 0.7875 | | 0.0215 | 860.0 | 25800 | 1.1978 | 0.7958 | | 0.0215 | 861.0 | 25830 | 1.2896 | 0.7958 | | 0.0215 | 862.0 | 25860 | 1.2299 | 0.7542 | | 0.0215 | 863.0 | 25890 | 1.2064 | 0.7583 | | 0.0215 | 864.0 | 25920 | 1.1525 | 0.7875 | | 0.0215 | 865.0 | 25950 | 1.2343 | 0.7625 | | 0.0215 | 866.0 | 25980 | 1.0024 | 0.7875 | | 0.0204 | 867.0 | 26010 | 1.0836 | 0.8125 | | 0.0204 | 868.0 | 26040 | 1.1291 | 0.7792 | | 0.0204 | 869.0 | 26070 | 1.0985 | 0.8125 | | 0.0204 | 870.0 | 26100 | 0.9606 | 0.8125 | | 0.0204 | 871.0 | 26130 | 1.0972 | 0.7917 | | 0.0204 | 872.0 | 26160 | 1.2781 | 0.7833 | | 0.0204 | 873.0 | 26190 | 1.2422 | 0.7792 | | 0.0204 | 874.0 | 26220 | 1.1804 | 0.7917 | | 0.0204 | 875.0 | 26250 | 1.1486 | 0.7958 | | 0.0204 | 876.0 | 26280 | 1.2045 | 0.7583 | | 0.0204 | 877.0 | 26310 | 1.3727 | 0.8 | | 0.0204 | 878.0 | 26340 | 1.3952 | 0.7958 | | 0.0204 | 879.0 | 26370 | 1.3071 | 0.7458 | | 0.0204 | 880.0 | 26400 | 1.2558 | 0.7667 | | 0.0204 | 881.0 | 26430 | 0.9884 | 0.825 | | 0.0204 | 882.0 | 26460 | 1.1677 | 0.8083 | | 0.0204 | 883.0 | 26490 | 1.0072 | 0.8208 | | 0.0191 | 884.0 | 26520 | 1.2943 | 0.8042 | | 0.0191 | 885.0 | 26550 | 1.0277 | 0.8042 | | 0.0191 | 886.0 | 26580 | 1.0072 | 0.8125 | | 0.0191 | 887.0 | 26610 | 1.1419 | 0.775 | | 0.0191 | 888.0 | 26640 | 1.2148 | 0.8083 | | 0.0191 | 889.0 | 26670 | 1.2696 | 0.7583 | | 0.0191 | 890.0 | 26700 | 0.9889 | 0.8167 | | 0.0191 | 891.0 | 26730 | 0.9741 | 0.7958 | | 0.0191 | 892.0 | 26760 | 1.1884 | 0.8208 | | 0.0191 | 893.0 | 26790 | 1.1508 | 0.7833 | | 0.0191 | 894.0 | 26820 | 1.2028 | 0.8208 | | 0.0191 | 895.0 | 26850 | 1.2590 | 0.8208 | | 0.0191 | 896.0 | 26880 | 0.9745 | 0.8208 | | 0.0191 | 897.0 | 26910 | 1.4040 | 0.7667 | | 0.0191 | 898.0 | 26940 | 1.0628 | 0.7667 | | 0.0191 | 899.0 | 26970 | 1.1360 | 0.775 | | 0.0166 | 900.0 | 27000 | 1.1260 | 0.8167 | | 0.0166 | 901.0 | 27030 | 1.0662 | 0.8125 | | 0.0166 | 902.0 | 27060 | 1.2636 | 0.7625 | | 0.0166 | 903.0 | 27090 | 1.2996 | 0.7792 | | 0.0166 | 904.0 | 27120 | 1.2951 | 0.7792 | | 0.0166 | 905.0 | 27150 | 0.9613 | 0.8042 | | 0.0166 | 906.0 | 27180 | 1.0893 | 0.8125 | | 0.0166 | 907.0 | 27210 | 1.3048 | 0.7958 | | 0.0166 | 908.0 | 27240 | 1.0748 | 0.7875 | | 0.0166 | 909.0 | 27270 | 1.3204 | 0.7708 | | 0.0166 | 910.0 | 27300 | 1.2628 | 0.8083 | | 0.0166 | 911.0 | 27330 | 1.2394 | 0.7833 | | 0.0166 | 912.0 | 27360 | 1.1889 | 0.8125 | | 0.0166 | 913.0 | 27390 | 1.0724 | 0.8167 | | 0.0166 | 914.0 | 27420 | 1.2091 | 0.7583 | | 0.0166 | 915.0 | 27450 | 1.2722 | 0.7583 | | 0.0166 | 916.0 | 27480 | 1.0382 | 0.7917 | | 0.0184 | 917.0 | 27510 | 1.1779 | 0.7833 | | 0.0184 | 918.0 | 27540 | 1.1789 | 0.7833 | | 0.0184 | 919.0 | 27570 | 1.2081 | 0.7833 | | 0.0184 | 920.0 | 27600 | 1.2754 | 0.7917 | | 0.0184 | 921.0 | 27630 | 1.2520 | 0.8 | | 0.0184 | 922.0 | 27660 | 1.2760 | 0.7583 | | 0.0184 | 923.0 | 27690 | 1.1261 | 0.8042 | | 
0.0184 | 924.0 | 27720 | 1.0372 | 0.8083 | | 0.0184 | 925.0 | 27750 | 0.9795 | 0.8167 | | 0.0184 | 926.0 | 27780 | 1.3099 | 0.775 | | 0.0184 | 927.0 | 27810 | 1.2417 | 0.7625 | | 0.0184 | 928.0 | 27840 | 1.2214 | 0.75 | | 0.0184 | 929.0 | 27870 | 1.3354 | 0.7917 | | 0.0184 | 930.0 | 27900 | 1.2369 | 0.7958 | | 0.0184 | 931.0 | 27930 | 1.2308 | 0.7958 | | 0.0184 | 932.0 | 27960 | 1.0644 | 0.8042 | | 0.0184 | 933.0 | 27990 | 1.2438 | 0.8 | | 0.0152 | 934.0 | 28020 | 1.1781 | 0.7875 | | 0.0152 | 935.0 | 28050 | 1.0581 | 0.8042 | | 0.0152 | 936.0 | 28080 | 1.3594 | 0.7875 | | 0.0152 | 937.0 | 28110 | 1.3270 | 0.7625 | | 0.0152 | 938.0 | 28140 | 1.4216 | 0.7833 | | 0.0152 | 939.0 | 28170 | 1.4995 | 0.7542 | | 0.0152 | 940.0 | 28200 | 0.9541 | 0.825 | | 0.0152 | 941.0 | 28230 | 1.1239 | 0.7667 | | 0.0152 | 942.0 | 28260 | 1.1585 | 0.7792 | | 0.0152 | 943.0 | 28290 | 1.2934 | 0.7625 | | 0.0152 | 944.0 | 28320 | 1.0039 | 0.7875 | | 0.0152 | 945.0 | 28350 | 1.0728 | 0.7833 | | 0.0152 | 946.0 | 28380 | 1.2731 | 0.8 | | 0.0152 | 947.0 | 28410 | 1.2042 | 0.8125 | | 0.0152 | 948.0 | 28440 | 1.3425 | 0.7667 | | 0.0152 | 949.0 | 28470 | 1.2167 | 0.7958 | | 0.0154 | 950.0 | 28500 | 1.1224 | 0.8 | | 0.0154 | 951.0 | 28530 | 1.0493 | 0.8167 | | 0.0154 | 952.0 | 28560 | 1.4145 | 0.7708 | | 0.0154 | 953.0 | 28590 | 1.2137 | 0.7917 | | 0.0154 | 954.0 | 28620 | 1.2327 | 0.8125 | | 0.0154 | 955.0 | 28650 | 1.2070 | 0.8167 | | 0.0154 | 956.0 | 28680 | 1.0862 | 0.8042 | | 0.0154 | 957.0 | 28710 | 1.0016 | 0.8083 | | 0.0154 | 958.0 | 28740 | 1.2085 | 0.8042 | | 0.0154 | 959.0 | 28770 | 1.0550 | 0.8083 | | 0.0154 | 960.0 | 28800 | 1.1622 | 0.7917 | | 0.0154 | 961.0 | 28830 | 1.1003 | 0.8292 | | 0.0154 | 962.0 | 28860 | 1.1365 | 0.7958 | | 0.0154 | 963.0 | 28890 | 1.3100 | 0.7667 | | 0.0154 | 964.0 | 28920 | 1.2175 | 0.7792 | | 0.0154 | 965.0 | 28950 | 1.2421 | 0.7833 | | 0.0154 | 966.0 | 28980 | 1.2706 | 0.7875 | | 0.0155 | 967.0 | 29010 | 0.9261 | 0.8125 | | 0.0155 | 968.0 | 29040 | 1.5476 | 0.7917 | | 0.0155 | 969.0 | 29070 | 1.3698 | 0.7917 | | 0.0155 | 970.0 | 29100 | 1.3555 | 0.8292 | | 0.0155 | 971.0 | 29130 | 1.3758 | 0.7875 | | 0.0155 | 972.0 | 29160 | 1.1446 | 0.8083 | | 0.0155 | 973.0 | 29190 | 1.1567 | 0.825 | | 0.0155 | 974.0 | 29220 | 1.2239 | 0.7958 | | 0.0155 | 975.0 | 29250 | 1.2984 | 0.775 | | 0.0155 | 976.0 | 29280 | 1.3331 | 0.8042 | | 0.0155 | 977.0 | 29310 | 1.3698 | 0.7708 | | 0.0155 | 978.0 | 29340 | 1.3179 | 0.7917 | | 0.0155 | 979.0 | 29370 | 1.2027 | 0.8 | | 0.0155 | 980.0 | 29400 | 0.9489 | 0.8208 | | 0.0155 | 981.0 | 29430 | 1.1286 | 0.7708 | | 0.0155 | 982.0 | 29460 | 1.2550 | 0.8125 | | 0.0155 | 983.0 | 29490 | 1.3201 | 0.7833 | | 0.0176 | 984.0 | 29520 | 1.1823 | 0.8 | | 0.0176 | 985.0 | 29550 | 1.6433 | 0.7875 | | 0.0176 | 986.0 | 29580 | 1.1672 | 0.7792 | | 0.0176 | 987.0 | 29610 | 1.3200 | 0.7917 | | 0.0176 | 988.0 | 29640 | 1.1838 | 0.8 | | 0.0176 | 989.0 | 29670 | 1.2769 | 0.7708 | | 0.0176 | 990.0 | 29700 | 1.1194 | 0.8333 | | 0.0176 | 991.0 | 29730 | 1.1779 | 0.7708 | | 0.0176 | 992.0 | 29760 | 1.4575 | 0.7833 | | 0.0176 | 993.0 | 29790 | 1.3840 | 0.775 | | 0.0176 | 994.0 | 29820 | 1.5852 | 0.7833 | | 0.0176 | 995.0 | 29850 | 1.2423 | 0.8 | | 0.0176 | 996.0 | 29880 | 1.2358 | 0.8167 | | 0.0176 | 997.0 | 29910 | 1.3483 | 0.7792 | | 0.0176 | 998.0 | 29940 | 1.4198 | 0.75 | | 0.0176 | 999.0 | 29970 | 1.1185 | 0.7917 | | 0.015 | 1000.0 | 30000 | 1.4292 | 0.8 | | 0.015 | 1001.0 | 30030 | 1.3571 | 0.8 | | 0.015 | 1002.0 | 30060 | 1.4920 | 0.7583 | | 0.015 | 1003.0 | 30090 | 1.3661 | 
0.8083 | | 0.015 | 1004.0 | 30120 | 1.4176 | 0.7708 | | 0.015 | 1005.0 | 30150 | 1.3307 | 0.825 | | 0.015 | 1006.0 | 30180 | 1.4701 | 0.7792 | | 0.015 | 1007.0 | 30210 | 1.2148 | 0.7875 | | 0.015 | 1008.0 | 30240 | 1.2507 | 0.7667 | | 0.015 | 1009.0 | 30270 | 1.0981 | 0.775 | | 0.015 | 1010.0 | 30300 | 1.1084 | 0.8125 | | 0.015 | 1011.0 | 30330 | 1.1015 | 0.8 | | 0.015 | 1012.0 | 30360 | 1.3804 | 0.7917 | | 0.015 | 1013.0 | 30390 | 1.2149 | 0.8125 | | 0.015 | 1014.0 | 30420 | 1.0234 | 0.7958 | | 0.015 | 1015.0 | 30450 | 1.2014 | 0.7958 | | 0.015 | 1016.0 | 30480 | 1.0906 | 0.8125 | | 0.0172 | 1017.0 | 30510 | 1.1625 | 0.8083 | | 0.0172 | 1018.0 | 30540 | 1.1962 | 0.8167 | | 0.0172 | 1019.0 | 30570 | 1.2694 | 0.8 | | 0.0172 | 1020.0 | 30600 | 1.1312 | 0.8167 | | 0.0172 | 1021.0 | 30630 | 1.6004 | 0.7125 | | 0.0172 | 1022.0 | 30660 | 0.9532 | 0.8292 | | 0.0172 | 1023.0 | 30690 | 1.2767 | 0.7833 | | 0.0172 | 1024.0 | 30720 | 1.2132 | 0.8083 | | 0.0172 | 1025.0 | 30750 | 1.4019 | 0.8 | | 0.0172 | 1026.0 | 30780 | 1.1299 | 0.8167 | | 0.0172 | 1027.0 | 30810 | 1.1287 | 0.7917 | | 0.0172 | 1028.0 | 30840 | 1.1774 | 0.7667 | | 0.0172 | 1029.0 | 30870 | 1.1074 | 0.8 | | 0.0172 | 1030.0 | 30900 | 1.2399 | 0.8042 | | 0.0172 | 1031.0 | 30930 | 1.1009 | 0.8 | | 0.0172 | 1032.0 | 30960 | 1.2207 | 0.7667 | | 0.0172 | 1033.0 | 30990 | 1.4750 | 0.7875 | | 0.0168 | 1034.0 | 31020 | 1.1526 | 0.7875 | | 0.0168 | 1035.0 | 31050 | 1.2182 | 0.8167 | | 0.0168 | 1036.0 | 31080 | 1.1382 | 0.7917 | | 0.0168 | 1037.0 | 31110 | 1.1549 | 0.7958 | | 0.0168 | 1038.0 | 31140 | 1.1373 | 0.8375 | | 0.0168 | 1039.0 | 31170 | 1.1892 | 0.775 | | 0.0168 | 1040.0 | 31200 | 1.0540 | 0.8167 | | 0.0168 | 1041.0 | 31230 | 1.0543 | 0.7875 | | 0.0168 | 1042.0 | 31260 | 1.7781 | 0.7 | | 0.0168 | 1043.0 | 31290 | 1.2750 | 0.7958 | | 0.0168 | 1044.0 | 31320 | 1.0537 | 0.7958 | | 0.0168 | 1045.0 | 31350 | 1.0793 | 0.7958 | | 0.0168 | 1046.0 | 31380 | 1.2879 | 0.8 | | 0.0168 | 1047.0 | 31410 | 1.2407 | 0.7917 | | 0.0168 | 1048.0 | 31440 | 1.2887 | 0.7917 | | 0.0168 | 1049.0 | 31470 | 1.1768 | 0.7917 | | 0.0171 | 1050.0 | 31500 | 1.2488 | 0.8167 | | 0.0171 | 1051.0 | 31530 | 1.2811 | 0.7667 | | 0.0171 | 1052.0 | 31560 | 1.4288 | 0.7833 | | 0.0171 | 1053.0 | 31590 | 1.1834 | 0.8208 | | 0.0171 | 1054.0 | 31620 | 1.4607 | 0.7958 | | 0.0171 | 1055.0 | 31650 | 1.2909 | 0.7875 | | 0.0171 | 1056.0 | 31680 | 1.4200 | 0.7917 | | 0.0171 | 1057.0 | 31710 | 1.1359 | 0.8167 | | 0.0171 | 1058.0 | 31740 | 1.2413 | 0.8 | | 0.0171 | 1059.0 | 31770 | 1.1094 | 0.8125 | | 0.0171 | 1060.0 | 31800 | 1.1526 | 0.7833 | | 0.0171 | 1061.0 | 31830 | 1.3996 | 0.7625 | | 0.0171 | 1062.0 | 31860 | 1.2302 | 0.8083 | | 0.0171 | 1063.0 | 31890 | 1.2330 | 0.775 | | 0.0171 | 1064.0 | 31920 | 1.2275 | 0.8 | | 0.0171 | 1065.0 | 31950 | 1.2238 | 0.8208 | | 0.0171 | 1066.0 | 31980 | 1.2760 | 0.8083 | | 0.015 | 1067.0 | 32010 | 1.1757 | 0.7917 | | 0.015 | 1068.0 | 32040 | 1.3817 | 0.7833 | | 0.015 | 1069.0 | 32070 | 1.3511 | 0.7625 | | 0.015 | 1070.0 | 32100 | 1.2074 | 0.8042 | | 0.015 | 1071.0 | 32130 | 1.3793 | 0.7792 | | 0.015 | 1072.0 | 32160 | 1.0829 | 0.7875 | | 0.015 | 1073.0 | 32190 | 1.1831 | 0.8042 | | 0.015 | 1074.0 | 32220 | 1.3478 | 0.8083 | | 0.015 | 1075.0 | 32250 | 1.2172 | 0.8 | | 0.015 | 1076.0 | 32280 | 1.1870 | 0.7833 | | 0.015 | 1077.0 | 32310 | 1.3886 | 0.7833 | | 0.015 | 1078.0 | 32340 | 1.3042 | 0.8 | | 0.015 | 1079.0 | 32370 | 1.2325 | 0.7958 | | 0.015 | 1080.0 | 32400 | 1.5088 | 0.775 | | 0.015 | 1081.0 | 32430 | 1.1727 | 0.8 | | 0.015 | 1082.0 | 32460 | 
1.2728 | 0.8125 | | 0.015 | 1083.0 | 32490 | 1.3919 | 0.7667 | | 0.0177 | 1084.0 | 32520 | 1.3616 | 0.7958 | | 0.0177 | 1085.0 | 32550 | 1.3399 | 0.7958 | | 0.0177 | 1086.0 | 32580 | 1.3119 | 0.775 | | 0.0177 | 1087.0 | 32610 | 1.2765 | 0.7958 | | 0.0177 | 1088.0 | 32640 | 1.1675 | 0.7708 | | 0.0177 | 1089.0 | 32670 | 1.2446 | 0.8083 | | 0.0177 | 1090.0 | 32700 | 1.3678 | 0.7917 | | 0.0177 | 1091.0 | 32730 | 1.4348 | 0.7542 | | 0.0177 | 1092.0 | 32760 | 1.4077 | 0.75 | | 0.0177 | 1093.0 | 32790 | 1.0270 | 0.8208 | | 0.0177 | 1094.0 | 32820 | 1.1600 | 0.8167 | | 0.0177 | 1095.0 | 32850 | 1.2207 | 0.7917 | | 0.0177 | 1096.0 | 32880 | 1.4131 | 0.8042 | | 0.0177 | 1097.0 | 32910 | 1.3109 | 0.7792 | | 0.0177 | 1098.0 | 32940 | 1.2776 | 0.7417 | | 0.0177 | 1099.0 | 32970 | 1.1072 | 0.8042 | | 0.0142 | 1100.0 | 33000 | 1.1199 | 0.825 | | 0.0142 | 1101.0 | 33030 | 1.2923 | 0.7833 | | 0.0142 | 1102.0 | 33060 | 1.3257 | 0.8 | | 0.0142 | 1103.0 | 33090 | 1.2255 | 0.8125 | | 0.0142 | 1104.0 | 33120 | 1.4188 | 0.775 | | 0.0142 | 1105.0 | 33150 | 1.3203 | 0.7833 | | 0.0142 | 1106.0 | 33180 | 1.3513 | 0.7792 | | 0.0142 | 1107.0 | 33210 | 1.1060 | 0.8125 | | 0.0142 | 1108.0 | 33240 | 1.2840 | 0.7875 | | 0.0142 | 1109.0 | 33270 | 1.2345 | 0.8042 | | 0.0142 | 1110.0 | 33300 | 1.2918 | 0.8125 | | 0.0142 | 1111.0 | 33330 | 1.2811 | 0.7875 | | 0.0142 | 1112.0 | 33360 | 1.2805 | 0.8042 | | 0.0142 | 1113.0 | 33390 | 1.2860 | 0.7875 | | 0.0142 | 1114.0 | 33420 | 1.2186 | 0.8042 | | 0.0142 | 1115.0 | 33450 | 1.1594 | 0.825 | | 0.0142 | 1116.0 | 33480 | 1.2668 | 0.7833 | | 0.0148 | 1117.0 | 33510 | 1.2472 | 0.8208 | | 0.0148 | 1118.0 | 33540 | 1.2199 | 0.8125 | | 0.0148 | 1119.0 | 33570 | 1.2872 | 0.7875 | | 0.0148 | 1120.0 | 33600 | 1.2223 | 0.775 | | 0.0148 | 1121.0 | 33630 | 1.1484 | 0.8125 | | 0.0148 | 1122.0 | 33660 | 1.3649 | 0.7958 | | 0.0148 | 1123.0 | 33690 | 1.0882 | 0.7958 | | 0.0148 | 1124.0 | 33720 | 1.1841 | 0.7625 | | 0.0148 | 1125.0 | 33750 | 1.1985 | 0.7958 | | 0.0148 | 1126.0 | 33780 | 1.2063 | 0.7875 | | 0.0148 | 1127.0 | 33810 | 1.4187 | 0.7958 | | 0.0148 | 1128.0 | 33840 | 1.3637 | 0.7833 | | 0.0148 | 1129.0 | 33870 | 1.3114 | 0.7875 | | 0.0148 | 1130.0 | 33900 | 1.0851 | 0.825 | | 0.0148 | 1131.0 | 33930 | 1.4881 | 0.7667 | | 0.0148 | 1132.0 | 33960 | 1.5678 | 0.7792 | | 0.0148 | 1133.0 | 33990 | 1.4227 | 0.8125 | | 0.0139 | 1134.0 | 34020 | 1.1429 | 0.8292 | | 0.0139 | 1135.0 | 34050 | 1.2332 | 0.8 | | 0.0139 | 1136.0 | 34080 | 1.3828 | 0.7583 | | 0.0139 | 1137.0 | 34110 | 1.2947 | 0.8083 | | 0.0139 | 1138.0 | 34140 | 1.1120 | 0.8208 | | 0.0139 | 1139.0 | 34170 | 1.2132 | 0.7875 | | 0.0139 | 1140.0 | 34200 | 1.4537 | 0.7667 | | 0.0139 | 1141.0 | 34230 | 1.3759 | 0.7875 | | 0.0139 | 1142.0 | 34260 | 1.2975 | 0.8042 | | 0.0139 | 1143.0 | 34290 | 1.3009 | 0.7792 | | 0.0139 | 1144.0 | 34320 | 1.1644 | 0.8125 | | 0.0139 | 1145.0 | 34350 | 1.3290 | 0.7958 | | 0.0139 | 1146.0 | 34380 | 1.3580 | 0.8042 | | 0.0139 | 1147.0 | 34410 | 1.1816 | 0.7958 | | 0.0139 | 1148.0 | 34440 | 1.2178 | 0.8167 | | 0.0139 | 1149.0 | 34470 | 1.5208 | 0.7625 | | 0.0131 | 1150.0 | 34500 | 1.2757 | 0.7875 | | 0.0131 | 1151.0 | 34530 | 1.4133 | 0.7917 | | 0.0131 | 1152.0 | 34560 | 1.1700 | 0.8125 | | 0.0131 | 1153.0 | 34590 | 1.3750 | 0.7958 | | 0.0131 | 1154.0 | 34620 | 1.3490 | 0.8 | | 0.0131 | 1155.0 | 34650 | 1.3869 | 0.775 | | 0.0131 | 1156.0 | 34680 | 1.4270 | 0.8042 | | 0.0131 | 1157.0 | 34710 | 1.1555 | 0.8083 | | 0.0131 | 1158.0 | 34740 | 1.2426 | 0.775 | | 0.0131 | 1159.0 | 34770 | 1.2148 | 0.7833 | | 0.0131 | 
1160.0 | 34800 | 1.2437 | 0.825 | | 0.0131 | 1161.0 | 34830 | 1.4501 | 0.7583 | | 0.0131 | 1162.0 | 34860 | 1.1728 | 0.7833 | | 0.0131 | 1163.0 | 34890 | 1.4399 | 0.7792 | | 0.0131 | 1164.0 | 34920 | 1.3135 | 0.7875 | | 0.0131 | 1165.0 | 34950 | 1.1913 | 0.8333 | | 0.0131 | 1166.0 | 34980 | 1.2604 | 0.7875 | | 0.0141 | 1167.0 | 35010 | 1.4779 | 0.7875 | | 0.0141 | 1168.0 | 35040 | 1.0426 | 0.8125 | | 0.0141 | 1169.0 | 35070 | 1.4297 | 0.775 | | 0.0141 | 1170.0 | 35100 | 1.2091 | 0.7958 | | 0.0141 | 1171.0 | 35130 | 1.2682 | 0.7833 | | 0.0141 | 1172.0 | 35160 | 1.4470 | 0.775 | | 0.0141 | 1173.0 | 35190 | 1.1855 | 0.8167 | | 0.0141 | 1174.0 | 35220 | 1.3932 | 0.7667 | | 0.0141 | 1175.0 | 35250 | 1.2889 | 0.7917 | | 0.0141 | 1176.0 | 35280 | 1.1186 | 0.8208 | | 0.0141 | 1177.0 | 35310 | 1.4161 | 0.775 | | 0.0141 | 1178.0 | 35340 | 1.2205 | 0.8 | | 0.0141 | 1179.0 | 35370 | 1.4083 | 0.7792 | | 0.0141 | 1180.0 | 35400 | 1.4713 | 0.7958 | | 0.0141 | 1181.0 | 35430 | 1.2595 | 0.7792 | | 0.0141 | 1182.0 | 35460 | 1.2736 | 0.7667 | | 0.0141 | 1183.0 | 35490 | 1.1907 | 0.7917 | | 0.016 | 1184.0 | 35520 | 1.3182 | 0.7833 | | 0.016 | 1185.0 | 35550 | 1.2241 | 0.8333 | | 0.016 | 1186.0 | 35580 | 1.0391 | 0.8125 | | 0.016 | 1187.0 | 35610 | 1.4469 | 0.775 | | 0.016 | 1188.0 | 35640 | 1.2545 | 0.8 | | 0.016 | 1189.0 | 35670 | 1.3608 | 0.7833 | | 0.016 | 1190.0 | 35700 | 1.2944 | 0.8083 | | 0.016 | 1191.0 | 35730 | 1.4708 | 0.775 | | 0.016 | 1192.0 | 35760 | 1.1792 | 0.8208 | | 0.016 | 1193.0 | 35790 | 1.4941 | 0.7708 | | 0.016 | 1194.0 | 35820 | 1.3757 | 0.7875 | | 0.016 | 1195.0 | 35850 | 1.5045 | 0.8083 | | 0.016 | 1196.0 | 35880 | 1.1618 | 0.8 | | 0.016 | 1197.0 | 35910 | 1.3231 | 0.7875 | | 0.016 | 1198.0 | 35940 | 1.6181 | 0.7625 | | 0.016 | 1199.0 | 35970 | 1.4816 | 0.7583 | | 0.0108 | 1200.0 | 36000 | 1.4468 | 0.7625 | | 0.0108 | 1201.0 | 36030 | 1.4746 | 0.7875 | | 0.0108 | 1202.0 | 36060 | 1.2116 | 0.8042 | | 0.0108 | 1203.0 | 36090 | 1.1604 | 0.7958 | | 0.0108 | 1204.0 | 36120 | 1.3660 | 0.7917 | | 0.0108 | 1205.0 | 36150 | 1.2539 | 0.8125 | | 0.0108 | 1206.0 | 36180 | 1.6884 | 0.7917 | | 0.0108 | 1207.0 | 36210 | 1.2239 | 0.8083 | | 0.0108 | 1208.0 | 36240 | 1.3461 | 0.8083 | | 0.0108 | 1209.0 | 36270 | 1.3824 | 0.7792 | | 0.0108 | 1210.0 | 36300 | 1.2422 | 0.8125 | | 0.0108 | 1211.0 | 36330 | 1.4409 | 0.7875 | | 0.0108 | 1212.0 | 36360 | 0.9312 | 0.8417 | | 0.0108 | 1213.0 | 36390 | 1.1455 | 0.7875 | | 0.0108 | 1214.0 | 36420 | 1.2086 | 0.7875 | | 0.0108 | 1215.0 | 36450 | 1.2418 | 0.8208 | | 0.0108 | 1216.0 | 36480 | 1.2790 | 0.8 | | 0.0121 | 1217.0 | 36510 | 1.3987 | 0.7958 | | 0.0121 | 1218.0 | 36540 | 1.5869 | 0.7417 | | 0.0121 | 1219.0 | 36570 | 1.1666 | 0.7792 | | 0.0121 | 1220.0 | 36600 | 1.2105 | 0.8208 | | 0.0121 | 1221.0 | 36630 | 1.2959 | 0.775 | | 0.0121 | 1222.0 | 36660 | 1.3241 | 0.7667 | | 0.0121 | 1223.0 | 36690 | 1.2996 | 0.8042 | | 0.0121 | 1224.0 | 36720 | 1.3784 | 0.75 | | 0.0121 | 1225.0 | 36750 | 1.4736 | 0.7917 | | 0.0121 | 1226.0 | 36780 | 1.2993 | 0.775 | | 0.0121 | 1227.0 | 36810 | 1.3094 | 0.7958 | | 0.0121 | 1228.0 | 36840 | 1.3072 | 0.7875 | | 0.0121 | 1229.0 | 36870 | 1.2921 | 0.7667 | | 0.0121 | 1230.0 | 36900 | 1.3255 | 0.8042 | | 0.0121 | 1231.0 | 36930 | 1.3001 | 0.7833 | | 0.0121 | 1232.0 | 36960 | 1.3200 | 0.8042 | | 0.0121 | 1233.0 | 36990 | 1.4908 | 0.7792 | | 0.0126 | 1234.0 | 37020 | 1.2452 | 0.8083 | | 0.0126 | 1235.0 | 37050 | 1.3089 | 0.8 | | 0.0126 | 1236.0 | 37080 | 1.4038 | 0.8083 | | 0.0126 | 1237.0 | 37110 | 1.1996 | 0.7875 | | 0.0126 | 1238.0 
| 37140 | 1.4119 | 0.8292 | | 0.0126 | 1239.0 | 37170 | 1.2332 | 0.8167 | | 0.0126 | 1240.0 | 37200 | 1.1774 | 0.7917 | | 0.0126 | 1241.0 | 37230 | 1.3389 | 0.8 | | 0.0126 | 1242.0 | 37260 | 1.4678 | 0.7667 | | 0.0126 | 1243.0 | 37290 | 1.4133 | 0.775 | | 0.0126 | 1244.0 | 37320 | 1.3449 | 0.8042 | | 0.0126 | 1245.0 | 37350 | 1.5209 | 0.7875 | | 0.0126 | 1246.0 | 37380 | 1.2873 | 0.8125 | | 0.0126 | 1247.0 | 37410 | 1.4443 | 0.7833 | | 0.0126 | 1248.0 | 37440 | 1.2808 | 0.8125 | | 0.0126 | 1249.0 | 37470 | 1.5125 | 0.775 | | 0.0139 | 1250.0 | 37500 | 1.2931 | 0.8042 | | 0.0139 | 1251.0 | 37530 | 1.1329 | 0.8167 | | 0.0139 | 1252.0 | 37560 | 1.3924 | 0.8042 | | 0.0139 | 1253.0 | 37590 | 1.5617 | 0.775 | | 0.0139 | 1254.0 | 37620 | 1.3317 | 0.8042 | | 0.0139 | 1255.0 | 37650 | 1.4886 | 0.7917 | | 0.0139 | 1256.0 | 37680 | 1.2164 | 0.8125 | | 0.0139 | 1257.0 | 37710 | 1.3023 | 0.8 | | 0.0139 | 1258.0 | 37740 | 1.3687 | 0.8 | | 0.0139 | 1259.0 | 37770 | 1.1631 | 0.8125 | | 0.0139 | 1260.0 | 37800 | 1.2467 | 0.8 | | 0.0139 | 1261.0 | 37830 | 1.1801 | 0.8375 | | 0.0139 | 1262.0 | 37860 | 1.1806 | 0.7917 | | 0.0139 | 1263.0 | 37890 | 1.4988 | 0.7833 | | 0.0139 | 1264.0 | 37920 | 1.3505 | 0.7875 | | 0.0139 | 1265.0 | 37950 | 1.2264 | 0.8167 | | 0.0139 | 1266.0 | 37980 | 1.5791 | 0.8125 | | 0.014 | 1267.0 | 38010 | 1.2901 | 0.825 | | 0.014 | 1268.0 | 38040 | 1.2634 | 0.8 | | 0.014 | 1269.0 | 38070 | 1.1945 | 0.7833 | | 0.014 | 1270.0 | 38100 | 1.2704 | 0.7625 | | 0.014 | 1271.0 | 38130 | 1.5594 | 0.7667 | | 0.014 | 1272.0 | 38160 | 1.3672 | 0.8042 | | 0.014 | 1273.0 | 38190 | 1.4381 | 0.775 | | 0.014 | 1274.0 | 38220 | 1.8510 | 0.7208 | | 0.014 | 1275.0 | 38250 | 1.3156 | 0.7917 | | 0.014 | 1276.0 | 38280 | 1.2853 | 0.8042 | | 0.014 | 1277.0 | 38310 | 1.3247 | 0.8 | | 0.014 | 1278.0 | 38340 | 1.1573 | 0.8167 | | 0.014 | 1279.0 | 38370 | 1.2918 | 0.8083 | | 0.014 | 1280.0 | 38400 | 1.3332 | 0.7708 | | 0.014 | 1281.0 | 38430 | 1.5114 | 0.7583 | | 0.014 | 1282.0 | 38460 | 1.6017 | 0.7583 | | 0.014 | 1283.0 | 38490 | 1.2733 | 0.825 | | 0.0132 | 1284.0 | 38520 | 1.4314 | 0.7875 | | 0.0132 | 1285.0 | 38550 | 1.3916 | 0.7708 | | 0.0132 | 1286.0 | 38580 | 1.1371 | 0.8167 | | 0.0132 | 1287.0 | 38610 | 1.3006 | 0.7833 | | 0.0132 | 1288.0 | 38640 | 1.2554 | 0.7833 | | 0.0132 | 1289.0 | 38670 | 1.4035 | 0.7917 | | 0.0132 | 1290.0 | 38700 | 1.0704 | 0.7875 | | 0.0132 | 1291.0 | 38730 | 1.3593 | 0.7542 | | 0.0132 | 1292.0 | 38760 | 1.2005 | 0.8042 | | 0.0132 | 1293.0 | 38790 | 1.3436 | 0.8042 | | 0.0132 | 1294.0 | 38820 | 1.3618 | 0.7833 | | 0.0132 | 1295.0 | 38850 | 1.4030 | 0.8125 | | 0.0132 | 1296.0 | 38880 | 1.2990 | 0.8167 | | 0.0132 | 1297.0 | 38910 | 1.0381 | 0.8292 | | 0.0132 | 1298.0 | 38940 | 1.3280 | 0.7917 | | 0.0132 | 1299.0 | 38970 | 1.2085 | 0.8042 | | 0.0135 | 1300.0 | 39000 | 1.1258 | 0.8042 | | 0.0135 | 1301.0 | 39030 | 1.4681 | 0.7833 | | 0.0135 | 1302.0 | 39060 | 1.2293 | 0.8083 | | 0.0135 | 1303.0 | 39090 | 1.3352 | 0.8125 | | 0.0135 | 1304.0 | 39120 | 1.2390 | 0.8042 | | 0.0135 | 1305.0 | 39150 | 1.3284 | 0.7958 | | 0.0135 | 1306.0 | 39180 | 1.3757 | 0.7833 | | 0.0135 | 1307.0 | 39210 | 1.5578 | 0.775 | | 0.0135 | 1308.0 | 39240 | 1.2721 | 0.7792 | | 0.0135 | 1309.0 | 39270 | 1.4933 | 0.8167 | | 0.0135 | 1310.0 | 39300 | 1.4297 | 0.8042 | | 0.0135 | 1311.0 | 39330 | 1.1987 | 0.775 | | 0.0135 | 1312.0 | 39360 | 1.2935 | 0.8083 | | 0.0135 | 1313.0 | 39390 | 1.2500 | 0.775 | | 0.0135 | 1314.0 | 39420 | 1.3369 | 0.7917 | | 0.0135 | 1315.0 | 39450 | 1.2369 | 0.8042 | | 0.0135 | 1316.0 | 39480 | 
1.3597 | 0.8042 | | 0.0127 | 1317.0 | 39510 | 1.5542 | 0.8042 | | 0.0127 | 1318.0 | 39540 | 1.2995 | 0.7833 | | 0.0127 | 1319.0 | 39570 | 1.4383 | 0.7958 | | 0.0127 | 1320.0 | 39600 | 1.2405 | 0.8083 | | 0.0127 | 1321.0 | 39630 | 1.4988 | 0.7792 | | 0.0127 | 1322.0 | 39660 | 1.4485 | 0.7833 | | 0.0127 | 1323.0 | 39690 | 1.2489 | 0.7875 | | 0.0127 | 1324.0 | 39720 | 1.1591 | 0.7833 | | 0.0127 | 1325.0 | 39750 | 1.2731 | 0.7917 | | 0.0127 | 1326.0 | 39780 | 1.4482 | 0.7958 | | 0.0127 | 1327.0 | 39810 | 1.2610 | 0.7875 | | 0.0127 | 1328.0 | 39840 | 1.4610 | 0.7917 | | 0.0127 | 1329.0 | 39870 | 1.5360 | 0.7708 | | 0.0127 | 1330.0 | 39900 | 1.3631 | 0.7667 | | 0.0127 | 1331.0 | 39930 | 1.3288 | 0.7958 | | 0.0127 | 1332.0 | 39960 | 1.3469 | 0.7958 | | 0.0127 | 1333.0 | 39990 | 1.4929 | 0.7833 | | 0.0119 | 1334.0 | 40020 | 1.1060 | 0.7958 | | 0.0119 | 1335.0 | 40050 | 1.1872 | 0.7958 | | 0.0119 | 1336.0 | 40080 | 1.3432 | 0.7917 | | 0.0119 | 1337.0 | 40110 | 1.4642 | 0.7958 | | 0.0119 | 1338.0 | 40140 | 1.0872 | 0.825 | | 0.0119 | 1339.0 | 40170 | 1.0611 | 0.8042 | | 0.0119 | 1340.0 | 40200 | 1.2117 | 0.8083 | | 0.0119 | 1341.0 | 40230 | 1.1744 | 0.8 | | 0.0119 | 1342.0 | 40260 | 1.4946 | 0.7875 | | 0.0119 | 1343.0 | 40290 | 1.4924 | 0.7667 | | 0.0119 | 1344.0 | 40320 | 1.6512 | 0.7958 | | 0.0119 | 1345.0 | 40350 | 1.3300 | 0.8333 | | 0.0119 | 1346.0 | 40380 | 1.4894 | 0.7833 | | 0.0119 | 1347.0 | 40410 | 1.1712 | 0.7833 | | 0.0119 | 1348.0 | 40440 | 1.4388 | 0.7792 | | 0.0119 | 1349.0 | 40470 | 1.3988 | 0.775 | | 0.0117 | 1350.0 | 40500 | 1.2460 | 0.8042 | | 0.0117 | 1351.0 | 40530 | 1.1276 | 0.8125 | | 0.0117 | 1352.0 | 40560 | 1.2683 | 0.7917 | | 0.0117 | 1353.0 | 40590 | 1.3402 | 0.8167 | | 0.0117 | 1354.0 | 40620 | 1.1061 | 0.8375 | | 0.0117 | 1355.0 | 40650 | 1.2443 | 0.8042 | | 0.0117 | 1356.0 | 40680 | 1.5169 | 0.7542 | | 0.0117 | 1357.0 | 40710 | 1.1183 | 0.8 | | 0.0117 | 1358.0 | 40740 | 1.1952 | 0.7917 | | 0.0117 | 1359.0 | 40770 | 1.2697 | 0.7958 | | 0.0117 | 1360.0 | 40800 | 1.3521 | 0.7833 | | 0.0117 | 1361.0 | 40830 | 1.4172 | 0.7917 | | 0.0117 | 1362.0 | 40860 | 1.4201 | 0.7917 | | 0.0117 | 1363.0 | 40890 | 1.0224 | 0.8333 | | 0.0117 | 1364.0 | 40920 | 1.5423 | 0.7833 | | 0.0117 | 1365.0 | 40950 | 1.2901 | 0.7917 | | 0.0117 | 1366.0 | 40980 | 1.2830 | 0.7875 | | 0.0119 | 1367.0 | 41010 | 1.0776 | 0.7958 | | 0.0119 | 1368.0 | 41040 | 1.2689 | 0.8 | | 0.0119 | 1369.0 | 41070 | 1.1735 | 0.8208 | | 0.0119 | 1370.0 | 41100 | 1.3600 | 0.7917 | | 0.0119 | 1371.0 | 41130 | 1.4670 | 0.8 | | 0.0119 | 1372.0 | 41160 | 1.3187 | 0.7792 | | 0.0119 | 1373.0 | 41190 | 1.3816 | 0.8125 | | 0.0119 | 1374.0 | 41220 | 1.3887 | 0.7917 | | 0.0119 | 1375.0 | 41250 | 1.3960 | 0.8083 | | 0.0119 | 1376.0 | 41280 | 1.8548 | 0.7667 | | 0.0119 | 1377.0 | 41310 | 1.5282 | 0.7875 | | 0.0119 | 1378.0 | 41340 | 1.2598 | 0.7875 | | 0.0119 | 1379.0 | 41370 | 1.2086 | 0.7792 | | 0.0119 | 1380.0 | 41400 | 1.0611 | 0.8042 | | 0.0119 | 1381.0 | 41430 | 1.4254 | 0.7917 | | 0.0119 | 1382.0 | 41460 | 1.4113 | 0.7917 | | 0.0119 | 1383.0 | 41490 | 1.5810 | 0.775 | | 0.0122 | 1384.0 | 41520 | 1.2773 | 0.7667 | | 0.0122 | 1385.0 | 41550 | 1.4357 | 0.7875 | | 0.0122 | 1386.0 | 41580 | 1.5143 | 0.8042 | | 0.0122 | 1387.0 | 41610 | 1.3586 | 0.7917 | | 0.0122 | 1388.0 | 41640 | 1.3677 | 0.7542 | | 0.0122 | 1389.0 | 41670 | 1.4306 | 0.7833 | | 0.0122 | 1390.0 | 41700 | 1.4094 | 0.775 | | 0.0122 | 1391.0 | 41730 | 1.2797 | 0.7875 | | 0.0122 | 1392.0 | 41760 | 1.1697 | 0.7875 | | 0.0122 | 1393.0 | 41790 | 1.1379 | 0.8125 | | 0.0122 | 
1394.0 | 41820 | 1.2648 | 0.7833 | | 0.0122 | 1395.0 | 41850 | 1.3288 | 0.7917 | | 0.0122 | 1396.0 | 41880 | 1.3606 | 0.7958 | | 0.0122 | 1397.0 | 41910 | 1.2689 | 0.7792 | | 0.0122 | 1398.0 | 41940 | 1.2074 | 0.8292 | | 0.0122 | 1399.0 | 41970 | 1.1529 | 0.8083 | | 0.0117 | 1400.0 | 42000 | 1.2803 | 0.7875 | | 0.0117 | 1401.0 | 42030 | 1.3206 | 0.8 | | 0.0117 | 1402.0 | 42060 | 1.3851 | 0.7792 | | 0.0117 | 1403.0 | 42090 | 1.1094 | 0.8042 | | 0.0117 | 1404.0 | 42120 | 1.5090 | 0.7875 | | 0.0117 | 1405.0 | 42150 | 1.3904 | 0.7833 | | 0.0117 | 1406.0 | 42180 | 1.3749 | 0.8 | | 0.0117 | 1407.0 | 42210 | 1.2993 | 0.7917 | | 0.0117 | 1408.0 | 42240 | 1.3366 | 0.7875 | | 0.0117 | 1409.0 | 42270 | 1.3067 | 0.8083 | | 0.0117 | 1410.0 | 42300 | 1.4128 | 0.775 | | 0.0117 | 1411.0 | 42330 | 1.2120 | 0.8208 | | 0.0117 | 1412.0 | 42360 | 1.1774 | 0.8125 | | 0.0117 | 1413.0 | 42390 | 1.1445 | 0.8125 | | 0.0117 | 1414.0 | 42420 | 1.5141 | 0.7833 | | 0.0117 | 1415.0 | 42450 | 1.4082 | 0.7833 | | 0.0117 | 1416.0 | 42480 | 1.3647 | 0.7833 | | 0.0109 | 1417.0 | 42510 | 1.3435 | 0.7958 | | 0.0109 | 1418.0 | 42540 | 1.4641 | 0.8 | | 0.0109 | 1419.0 | 42570 | 1.2633 | 0.8125 | | 0.0109 | 1420.0 | 42600 | 1.3394 | 0.8042 | | 0.0109 | 1421.0 | 42630 | 1.4154 | 0.8042 | | 0.0109 | 1422.0 | 42660 | 1.2684 | 0.7958 | | 0.0109 | 1423.0 | 42690 | 1.3496 | 0.8042 | | 0.0109 | 1424.0 | 42720 | 1.3884 | 0.8042 | | 0.0109 | 1425.0 | 42750 | 1.2356 | 0.7708 | | 0.0109 | 1426.0 | 42780 | 1.2875 | 0.7958 | | 0.0109 | 1427.0 | 42810 | 1.3178 | 0.8042 | | 0.0109 | 1428.0 | 42840 | 1.4535 | 0.7917 | | 0.0109 | 1429.0 | 42870 | 1.2387 | 0.8083 | | 0.0109 | 1430.0 | 42900 | 1.6628 | 0.7458 | | 0.0109 | 1431.0 | 42930 | 1.4390 | 0.8125 | | 0.0109 | 1432.0 | 42960 | 1.2688 | 0.7833 | | 0.0109 | 1433.0 | 42990 | 1.3232 | 0.8 | | 0.0133 | 1434.0 | 43020 | 1.1935 | 0.8125 | | 0.0133 | 1435.0 | 43050 | 1.4634 | 0.7917 | | 0.0133 | 1436.0 | 43080 | 1.0387 | 0.8167 | | 0.0133 | 1437.0 | 43110 | 1.5935 | 0.8 | | 0.0133 | 1438.0 | 43140 | 1.0836 | 0.8 | | 0.0133 | 1439.0 | 43170 | 1.2449 | 0.8125 | | 0.0133 | 1440.0 | 43200 | 1.2790 | 0.7958 | | 0.0133 | 1441.0 | 43230 | 1.3732 | 0.8125 | | 0.0133 | 1442.0 | 43260 | 1.0954 | 0.8292 | | 0.0133 | 1443.0 | 43290 | 1.3387 | 0.7917 | | 0.0133 | 1444.0 | 43320 | 1.4839 | 0.7833 | | 0.0133 | 1445.0 | 43350 | 1.4672 | 0.7875 | | 0.0133 | 1446.0 | 43380 | 1.3551 | 0.7667 | | 0.0133 | 1447.0 | 43410 | 1.5240 | 0.7458 | | 0.0133 | 1448.0 | 43440 | 1.0737 | 0.8 | | 0.0133 | 1449.0 | 43470 | 1.3365 | 0.7958 | | 0.0117 | 1450.0 | 43500 | 1.5471 | 0.8125 | | 0.0117 | 1451.0 | 43530 | 1.2186 | 0.7875 | | 0.0117 | 1452.0 | 43560 | 1.4005 | 0.7833 | | 0.0117 | 1453.0 | 43590 | 1.3964 | 0.8083 | | 0.0117 | 1454.0 | 43620 | 1.3173 | 0.8042 | | 0.0117 | 1455.0 | 43650 | 1.1150 | 0.8417 | | 0.0117 | 1456.0 | 43680 | 1.1728 | 0.7917 | | 0.0117 | 1457.0 | 43710 | 1.4196 | 0.7792 | | 0.0117 | 1458.0 | 43740 | 1.4560 | 0.8 | | 0.0117 | 1459.0 | 43770 | 1.3794 | 0.8167 | | 0.0117 | 1460.0 | 43800 | 1.3478 | 0.7792 | | 0.0117 | 1461.0 | 43830 | 1.3383 | 0.8042 | | 0.0117 | 1462.0 | 43860 | 1.2820 | 0.8125 | | 0.0117 | 1463.0 | 43890 | 1.1542 | 0.7875 | | 0.0117 | 1464.0 | 43920 | 1.4135 | 0.7792 | | 0.0117 | 1465.0 | 43950 | 1.3492 | 0.8 | | 0.0117 | 1466.0 | 43980 | 2.0121 | 0.7 | | 0.0086 | 1467.0 | 44010 | 1.5258 | 0.7792 | | 0.0086 | 1468.0 | 44040 | 1.3612 | 0.8125 | | 0.0086 | 1469.0 | 44070 | 1.4076 | 0.7917 | | 0.0086 | 1470.0 | 44100 | 1.2031 | 0.8208 | | 0.0086 | 1471.0 | 44130 | 1.4157 | 0.7792 | | 0.0086 
| 1472.0 | 44160 | 1.4416 | 0.8083 | | 0.0086 | 1473.0 | 44190 | 1.4480 | 0.7833 | | 0.0086 | 1474.0 | 44220 | 1.1862 | 0.8167 | | 0.0086 | 1475.0 | 44250 | 1.3431 | 0.7792 | | 0.0086 | 1476.0 | 44280 | 1.2668 | 0.8208 | | 0.0086 | 1477.0 | 44310 | 1.5297 | 0.775 | | 0.0086 | 1478.0 | 44340 | 1.4872 | 0.7833 | | 0.0086 | 1479.0 | 44370 | 1.2053 | 0.8333 | | 0.0086 | 1480.0 | 44400 | 1.2900 | 0.7833 | | 0.0086 | 1481.0 | 44430 | 1.4907 | 0.775 | | 0.0086 | 1482.0 | 44460 | 1.2355 | 0.8083 | | 0.0086 | 1483.0 | 44490 | 1.1396 | 0.775 | | 0.0088 | 1484.0 | 44520 | 1.5673 | 0.7833 | | 0.0088 | 1485.0 | 44550 | 1.4772 | 0.8042 | | 0.0088 | 1486.0 | 44580 | 1.5385 | 0.7583 | | 0.0088 | 1487.0 | 44610 | 1.5553 | 0.8125 | | 0.0088 | 1488.0 | 44640 | 1.4577 | 0.7917 | | 0.0088 | 1489.0 | 44670 | 0.9210 | 0.8333 | | 0.0088 | 1490.0 | 44700 | 1.1163 | 0.8083 | | 0.0088 | 1491.0 | 44730 | 1.2678 | 0.8 | | 0.0088 | 1492.0 | 44760 | 1.2810 | 0.7958 | | 0.0088 | 1493.0 | 44790 | 1.5296 | 0.7875 | | 0.0088 | 1494.0 | 44820 | 1.1364 | 0.775 | | 0.0088 | 1495.0 | 44850 | 1.1587 | 0.8208 | | 0.0088 | 1496.0 | 44880 | 1.3942 | 0.7917 | | 0.0088 | 1497.0 | 44910 | 1.4652 | 0.8083 | | 0.0088 | 1498.0 | 44940 | 1.5409 | 0.7875 | | 0.0088 | 1499.0 | 44970 | 1.2924 | 0.8 | | 0.0129 | 1500.0 | 45000 | 1.4110 | 0.8125 | | 0.0129 | 1501.0 | 45030 | 1.3966 | 0.8042 | | 0.0129 | 1502.0 | 45060 | 1.3567 | 0.8042 | | 0.0129 | 1503.0 | 45090 | 1.2921 | 0.775 | | 0.0129 | 1504.0 | 45120 | 1.1863 | 0.8083 | | 0.0129 | 1505.0 | 45150 | 1.5501 | 0.7792 | | 0.0129 | 1506.0 | 45180 | 1.5077 | 0.7792 | | 0.0129 | 1507.0 | 45210 | 1.4453 | 0.8167 | | 0.0129 | 1508.0 | 45240 | 1.7031 | 0.8042 | | 0.0129 | 1509.0 | 45270 | 1.4184 | 0.7958 | | 0.0129 | 1510.0 | 45300 | 1.4755 | 0.7917 | | 0.0129 | 1511.0 | 45330 | 1.3509 | 0.7875 | | 0.0129 | 1512.0 | 45360 | 1.3863 | 0.775 | | 0.0129 | 1513.0 | 45390 | 1.1137 | 0.8 | | 0.0129 | 1514.0 | 45420 | 1.6185 | 0.7792 | | 0.0129 | 1515.0 | 45450 | 1.5356 | 0.7875 | | 0.0129 | 1516.0 | 45480 | 1.3424 | 0.8042 | | 0.0091 | 1517.0 | 45510 | 1.3850 | 0.7875 | | 0.0091 | 1518.0 | 45540 | 1.3671 | 0.7792 | | 0.0091 | 1519.0 | 45570 | 1.5930 | 0.7917 | | 0.0091 | 1520.0 | 45600 | 1.2088 | 0.8 | | 0.0091 | 1521.0 | 45630 | 1.4669 | 0.8042 | | 0.0091 | 1522.0 | 45660 | 1.3986 | 0.7875 | | 0.0091 | 1523.0 | 45690 | 1.7112 | 0.75 | | 0.0091 | 1524.0 | 45720 | 1.1584 | 0.8208 | | 0.0091 | 1525.0 | 45750 | 1.3584 | 0.7792 | | 0.0091 | 1526.0 | 45780 | 1.5027 | 0.7542 | | 0.0091 | 1527.0 | 45810 | 1.3276 | 0.8125 | | 0.0091 | 1528.0 | 45840 | 1.2703 | 0.8167 | | 0.0091 | 1529.0 | 45870 | 1.6849 | 0.7125 | | 0.0091 | 1530.0 | 45900 | 1.4520 | 0.7833 | | 0.0091 | 1531.0 | 45930 | 1.3507 | 0.7833 | | 0.0091 | 1532.0 | 45960 | 1.2636 | 0.8125 | | 0.0091 | 1533.0 | 45990 | 1.3649 | 0.7625 | | 0.0112 | 1534.0 | 46020 | 1.4273 | 0.7542 | | 0.0112 | 1535.0 | 46050 | 1.3029 | 0.8 | | 0.0112 | 1536.0 | 46080 | 1.1719 | 0.8208 | | 0.0112 | 1537.0 | 46110 | 1.3519 | 0.8083 | | 0.0112 | 1538.0 | 46140 | 1.0847 | 0.8083 | | 0.0112 | 1539.0 | 46170 | 1.4050 | 0.8 | | 0.0112 | 1540.0 | 46200 | 1.6150 | 0.7958 | | 0.0112 | 1541.0 | 46230 | 1.2510 | 0.7917 | | 0.0112 | 1542.0 | 46260 | 1.7382 | 0.7833 | | 0.0112 | 1543.0 | 46290 | 1.2740 | 0.8 | | 0.0112 | 1544.0 | 46320 | 1.1606 | 0.8167 | | 0.0112 | 1545.0 | 46350 | 1.0645 | 0.8333 | | 0.0112 | 1546.0 | 46380 | 1.4396 | 0.7917 | | 0.0112 | 1547.0 | 46410 | 1.3832 | 0.7875 | | 0.0112 | 1548.0 | 46440 | 1.0985 | 0.825 | | 0.0112 | 1549.0 | 46470 | 1.2882 | 0.7792 | | 
0.0096 | 1550.0 | 46500 | 1.3089 | 0.7875 | | 0.0096 | 1551.0 | 46530 | 1.2983 | 0.8167 | | 0.0096 | 1552.0 | 46560 | 1.6504 | 0.7583 | | 0.0096 | 1553.0 | 46590 | 1.4972 | 0.7875 | | 0.0096 | 1554.0 | 46620 | 1.6400 | 0.7958 | | 0.0096 | 1555.0 | 46650 | 1.8197 | 0.7667 | | 0.0096 | 1556.0 | 46680 | 1.3075 | 0.8083 | | 0.0096 | 1557.0 | 46710 | 1.4223 | 0.7958 | | 0.0096 | 1558.0 | 46740 | 1.5494 | 0.8292 | | 0.0096 | 1559.0 | 46770 | 1.3085 | 0.7875 | | 0.0096 | 1560.0 | 46800 | 1.4508 | 0.7708 | | 0.0096 | 1561.0 | 46830 | 1.4659 | 0.7958 | | 0.0096 | 1562.0 | 46860 | 1.3944 | 0.8083 | | 0.0096 | 1563.0 | 46890 | 1.2353 | 0.7833 | | 0.0096 | 1564.0 | 46920 | 1.6323 | 0.7417 | | 0.0096 | 1565.0 | 46950 | 1.1290 | 0.8 | | 0.0096 | 1566.0 | 46980 | 1.3444 | 0.8208 | | 0.0126 | 1567.0 | 47010 | 1.1851 | 0.8292 | | 0.0126 | 1568.0 | 47040 | 1.4688 | 0.8125 | | 0.0126 | 1569.0 | 47070 | 1.4655 | 0.7833 | | 0.0126 | 1570.0 | 47100 | 1.0312 | 0.7958 | | 0.0126 | 1571.0 | 47130 | 1.3962 | 0.7542 | | 0.0126 | 1572.0 | 47160 | 1.3534 | 0.7917 | | 0.0126 | 1573.0 | 47190 | 1.6330 | 0.8208 | | 0.0126 | 1574.0 | 47220 | 1.2064 | 0.7917 | | 0.0126 | 1575.0 | 47250 | 1.6004 | 0.7792 | | 0.0126 | 1576.0 | 47280 | 1.4152 | 0.7833 | | 0.0126 | 1577.0 | 47310 | 1.2253 | 0.8167 | | 0.0126 | 1578.0 | 47340 | 1.5881 | 0.7833 | | 0.0126 | 1579.0 | 47370 | 1.6793 | 0.7625 | | 0.0126 | 1580.0 | 47400 | 1.1328 | 0.8083 | | 0.0126 | 1581.0 | 47430 | 1.3365 | 0.8083 | | 0.0126 | 1582.0 | 47460 | 1.0486 | 0.8167 | | 0.0126 | 1583.0 | 47490 | 1.3892 | 0.7875 | | 0.0081 | 1584.0 | 47520 | 1.4876 | 0.7833 | | 0.0081 | 1585.0 | 47550 | 1.2344 | 0.8125 | | 0.0081 | 1586.0 | 47580 | 1.3279 | 0.7875 | | 0.0081 | 1587.0 | 47610 | 1.1887 | 0.8208 | | 0.0081 | 1588.0 | 47640 | 1.4643 | 0.8083 | | 0.0081 | 1589.0 | 47670 | 1.2302 | 0.8042 | | 0.0081 | 1590.0 | 47700 | 1.4154 | 0.8 | | 0.0081 | 1591.0 | 47730 | 1.2831 | 0.7958 | | 0.0081 | 1592.0 | 47760 | 1.6219 | 0.8125 | | 0.0081 | 1593.0 | 47790 | 1.3057 | 0.8 | | 0.0081 | 1594.0 | 47820 | 1.7437 | 0.7417 | | 0.0081 | 1595.0 | 47850 | 1.1653 | 0.7917 | | 0.0081 | 1596.0 | 47880 | 1.2753 | 0.8167 | | 0.0081 | 1597.0 | 47910 | 1.3407 | 0.8125 | | 0.0081 | 1598.0 | 47940 | 1.3949 | 0.8 | | 0.0081 | 1599.0 | 47970 | 1.1920 | 0.825 | | 0.0087 | 1600.0 | 48000 | 1.4784 | 0.7375 | | 0.0087 | 1601.0 | 48030 | 1.0188 | 0.8292 | | 0.0087 | 1602.0 | 48060 | 1.4276 | 0.8042 | | 0.0087 | 1603.0 | 48090 | 1.5804 | 0.775 | | 0.0087 | 1604.0 | 48120 | 1.4382 | 0.7792 | | 0.0087 | 1605.0 | 48150 | 1.4162 | 0.7792 | | 0.0087 | 1606.0 | 48180 | 1.3996 | 0.7958 | | 0.0087 | 1607.0 | 48210 | 1.5530 | 0.775 | | 0.0087 | 1608.0 | 48240 | 1.3044 | 0.7958 | | 0.0087 | 1609.0 | 48270 | 1.1829 | 0.8083 | | 0.0087 | 1610.0 | 48300 | 1.1712 | 0.825 | | 0.0087 | 1611.0 | 48330 | 1.3876 | 0.8 | | 0.0087 | 1612.0 | 48360 | 1.5846 | 0.7917 | | 0.0087 | 1613.0 | 48390 | 1.4043 | 0.7917 | | 0.0087 | 1614.0 | 48420 | 1.5841 | 0.7625 | | 0.0087 | 1615.0 | 48450 | 1.2773 | 0.7875 | | 0.0087 | 1616.0 | 48480 | 1.5338 | 0.7667 | | 0.0091 | 1617.0 | 48510 | 1.2505 | 0.7958 | | 0.0091 | 1618.0 | 48540 | 1.5483 | 0.7333 | | 0.0091 | 1619.0 | 48570 | 1.4355 | 0.7833 | | 0.0091 | 1620.0 | 48600 | 1.2665 | 0.8125 | | 0.0091 | 1621.0 | 48630 | 1.1949 | 0.8083 | | 0.0091 | 1622.0 | 48660 | 1.3095 | 0.8 | | 0.0091 | 1623.0 | 48690 | 1.3807 | 0.7958 | | 0.0091 | 1624.0 | 48720 | 1.2542 | 0.775 | | 0.0091 | 1625.0 | 48750 | 1.5920 | 0.7625 | | 0.0091 | 1626.0 | 48780 | 1.3928 | 0.8042 | | 0.0091 | 1627.0 | 48810 | 1.1826 | 
0.8083 | | 0.0091 | 1628.0 | 48840 | 1.3381 | 0.8208 | | 0.0091 | 1629.0 | 48870 | 1.5823 | 0.7625 | | 0.0091 | 1630.0 | 48900 | 1.4224 | 0.7708 | | 0.0091 | 1631.0 | 48930 | 1.2238 | 0.825 | | 0.0091 | 1632.0 | 48960 | 1.3098 | 0.8167 | | 0.0091 | 1633.0 | 48990 | 1.5624 | 0.7875 | | 0.0102 | 1634.0 | 49020 | 1.3817 | 0.7708 | | 0.0102 | 1635.0 | 49050 | 1.3306 | 0.7792 | | 0.0102 | 1636.0 | 49080 | 1.3142 | 0.8208 | | 0.0102 | 1637.0 | 49110 | 1.3704 | 0.8042 | | 0.0102 | 1638.0 | 49140 | 1.3550 | 0.7833 | | 0.0102 | 1639.0 | 49170 | 1.2677 | 0.7875 | | 0.0102 | 1640.0 | 49200 | 1.3254 | 0.7917 | | 0.0102 | 1641.0 | 49230 | 1.3160 | 0.8125 | | 0.0102 | 1642.0 | 49260 | 1.2922 | 0.7958 | | 0.0102 | 1643.0 | 49290 | 1.4094 | 0.8042 | | 0.0102 | 1644.0 | 49320 | 1.3884 | 0.7917 | | 0.0102 | 1645.0 | 49350 | 1.1850 | 0.7875 | | 0.0102 | 1646.0 | 49380 | 1.3226 | 0.7792 | | 0.0102 | 1647.0 | 49410 | 1.4120 | 0.8167 | | 0.0102 | 1648.0 | 49440 | 1.4296 | 0.8 | | 0.0102 | 1649.0 | 49470 | 1.4205 | 0.8 | | 0.0114 | 1650.0 | 49500 | 1.2401 | 0.8292 | | 0.0114 | 1651.0 | 49530 | 1.5770 | 0.7458 | | 0.0114 | 1652.0 | 49560 | 1.2140 | 0.8125 | | 0.0114 | 1653.0 | 49590 | 1.3356 | 0.7833 | | 0.0114 | 1654.0 | 49620 | 1.5649 | 0.7833 | | 0.0114 | 1655.0 | 49650 | 1.5667 | 0.775 | | 0.0114 | 1656.0 | 49680 | 1.4604 | 0.7833 | | 0.0114 | 1657.0 | 49710 | 1.5255 | 0.7833 | | 0.0114 | 1658.0 | 49740 | 1.4988 | 0.8042 | | 0.0114 | 1659.0 | 49770 | 1.2903 | 0.7708 | | 0.0114 | 1660.0 | 49800 | 1.4512 | 0.7958 | | 0.0114 | 1661.0 | 49830 | 1.4526 | 0.8208 | | 0.0114 | 1662.0 | 49860 | 1.2664 | 0.8 | | 0.0114 | 1663.0 | 49890 | 1.2225 | 0.8125 | | 0.0114 | 1664.0 | 49920 | 1.3423 | 0.8 | | 0.0114 | 1665.0 | 49950 | 1.2811 | 0.8125 | | 0.0114 | 1666.0 | 49980 | 1.4179 | 0.8083 | | 0.0084 | 1667.0 | 50010 | 1.5394 | 0.8042 | | 0.0084 | 1668.0 | 50040 | 1.3144 | 0.825 | | 0.0084 | 1669.0 | 50070 | 1.4261 | 0.7833 | | 0.0084 | 1670.0 | 50100 | 1.4840 | 0.8042 | | 0.0084 | 1671.0 | 50130 | 1.1391 | 0.825 | | 0.0084 | 1672.0 | 50160 | 1.4506 | 0.8083 | | 0.0084 | 1673.0 | 50190 | 1.2468 | 0.8167 | | 0.0084 | 1674.0 | 50220 | 1.3354 | 0.8 | | 0.0084 | 1675.0 | 50250 | 1.1121 | 0.8333 | | 0.0084 | 1676.0 | 50280 | 1.2781 | 0.8125 | | 0.0084 | 1677.0 | 50310 | 1.2844 | 0.8083 | | 0.0084 | 1678.0 | 50340 | 1.1945 | 0.825 | | 0.0084 | 1679.0 | 50370 | 1.3404 | 0.8125 | | 0.0084 | 1680.0 | 50400 | 1.2188 | 0.8167 | | 0.0084 | 1681.0 | 50430 | 1.5161 | 0.8083 | | 0.0084 | 1682.0 | 50460 | 1.3913 | 0.775 | | 0.0084 | 1683.0 | 50490 | 1.3343 | 0.8167 | | 0.0098 | 1684.0 | 50520 | 1.4482 | 0.8125 | | 0.0098 | 1685.0 | 50550 | 1.3223 | 0.775 | | 0.0098 | 1686.0 | 50580 | 1.2461 | 0.8083 | | 0.0098 | 1687.0 | 50610 | 1.2945 | 0.8042 | | 0.0098 | 1688.0 | 50640 | 1.3086 | 0.7958 | | 0.0098 | 1689.0 | 50670 | 1.3121 | 0.8208 | | 0.0098 | 1690.0 | 50700 | 1.4512 | 0.7875 | | 0.0098 | 1691.0 | 50730 | 1.2236 | 0.7958 | | 0.0098 | 1692.0 | 50760 | 1.3813 | 0.8042 | | 0.0098 | 1693.0 | 50790 | 1.1950 | 0.8208 | | 0.0098 | 1694.0 | 50820 | 1.3170 | 0.7792 | | 0.0098 | 1695.0 | 50850 | 1.4470 | 0.7917 | | 0.0098 | 1696.0 | 50880 | 1.5317 | 0.7958 | | 0.0098 | 1697.0 | 50910 | 1.0926 | 0.8292 | | 0.0098 | 1698.0 | 50940 | 1.3273 | 0.8167 | | 0.0098 | 1699.0 | 50970 | 1.1896 | 0.8292 | | 0.0103 | 1700.0 | 51000 | 1.2892 | 0.8083 | | 0.0103 | 1701.0 | 51030 | 1.3922 | 0.7833 | | 0.0103 | 1702.0 | 51060 | 1.6931 | 0.7917 | | 0.0103 | 1703.0 | 51090 | 1.4755 | 0.7583 | | 0.0103 | 1704.0 | 51120 | 1.2349 | 0.8208 | | 0.0103 | 1705.0 | 51150 
| 1.4495 | 0.7833 | | 0.0103 | 1706.0 | 51180 | 1.5563 | 0.7917 | | 0.0103 | 1707.0 | 51210 | 1.1578 | 0.7958 | | 0.0103 | 1708.0 | 51240 | 1.4043 | 0.8125 | | 0.0103 | 1709.0 | 51270 | 1.4009 | 0.8292 | | 0.0103 | 1710.0 | 51300 | 1.3301 | 0.8 | | 0.0103 | 1711.0 | 51330 | 1.6638 | 0.7917 | | 0.0103 | 1712.0 | 51360 | 1.1773 | 0.8167 | | 0.0103 | 1713.0 | 51390 | 1.0919 | 0.8208 | | 0.0103 | 1714.0 | 51420 | 1.2450 | 0.7917 | | 0.0103 | 1715.0 | 51450 | 1.3397 | 0.8292 | | 0.0103 | 1716.0 | 51480 | 1.3613 | 0.7792 | | 0.0105 | 1717.0 | 51510 | 1.2802 | 0.8 | | 0.0105 | 1718.0 | 51540 | 1.2437 | 0.8 | | 0.0105 | 1719.0 | 51570 | 1.3201 | 0.7875 | | 0.0105 | 1720.0 | 51600 | 1.5726 | 0.7708 | | 0.0105 | 1721.0 | 51630 | 1.3656 | 0.8042 | | 0.0105 | 1722.0 | 51660 | 1.7074 | 0.7792 | | 0.0105 | 1723.0 | 51690 | 1.2694 | 0.8375 | | 0.0105 | 1724.0 | 51720 | 1.4550 | 0.7792 | | 0.0105 | 1725.0 | 51750 | 1.4635 | 0.7917 | | 0.0105 | 1726.0 | 51780 | 1.4889 | 0.7625 | | 0.0105 | 1727.0 | 51810 | 1.3134 | 0.8042 | | 0.0105 | 1728.0 | 51840 | 1.4418 | 0.775 | | 0.0105 | 1729.0 | 51870 | 1.5681 | 0.7583 | | 0.0105 | 1730.0 | 51900 | 1.1054 | 0.8292 | | 0.0105 | 1731.0 | 51930 | 1.2462 | 0.8083 | | 0.0105 | 1732.0 | 51960 | 1.3583 | 0.7792 | | 0.0105 | 1733.0 | 51990 | 1.3378 | 0.775 | | 0.0091 | 1734.0 | 52020 | 1.4520 | 0.7583 | | 0.0091 | 1735.0 | 52050 | 1.3221 | 0.8125 | | 0.0091 | 1736.0 | 52080 | 1.4677 | 0.8042 | | 0.0091 | 1737.0 | 52110 | 1.4403 | 0.775 | | 0.0091 | 1738.0 | 52140 | 1.2008 | 0.8333 | | 0.0091 | 1739.0 | 52170 | 1.2424 | 0.8167 | | 0.0091 | 1740.0 | 52200 | 1.3899 | 0.7833 | | 0.0091 | 1741.0 | 52230 | 1.3449 | 0.8292 | | 0.0091 | 1742.0 | 52260 | 1.2990 | 0.8167 | | 0.0091 | 1743.0 | 52290 | 1.5920 | 0.7833 | | 0.0091 | 1744.0 | 52320 | 1.3588 | 0.7792 | | 0.0091 | 1745.0 | 52350 | 1.2354 | 0.8042 | | 0.0091 | 1746.0 | 52380 | 1.1257 | 0.8333 | | 0.0091 | 1747.0 | 52410 | 1.3577 | 0.7958 | | 0.0091 | 1748.0 | 52440 | 1.4216 | 0.8125 | | 0.0091 | 1749.0 | 52470 | 1.1096 | 0.8 | | 0.0074 | 1750.0 | 52500 | 1.5919 | 0.7708 | | 0.0074 | 1751.0 | 52530 | 1.3637 | 0.7792 | | 0.0074 | 1752.0 | 52560 | 1.5916 | 0.775 | | 0.0074 | 1753.0 | 52590 | 1.4799 | 0.7833 | | 0.0074 | 1754.0 | 52620 | 1.4437 | 0.7875 | | 0.0074 | 1755.0 | 52650 | 1.5032 | 0.8 | | 0.0074 | 1756.0 | 52680 | 1.1719 | 0.8125 | | 0.0074 | 1757.0 | 52710 | 1.0309 | 0.8208 | | 0.0074 | 1758.0 | 52740 | 1.3773 | 0.7917 | | 0.0074 | 1759.0 | 52770 | 1.4090 | 0.7958 | | 0.0074 | 1760.0 | 52800 | 1.3923 | 0.8 | | 0.0074 | 1761.0 | 52830 | 1.3306 | 0.8167 | | 0.0074 | 1762.0 | 52860 | 1.2398 | 0.8 | | 0.0074 | 1763.0 | 52890 | 1.3714 | 0.8167 | | 0.0074 | 1764.0 | 52920 | 1.3510 | 0.8042 | | 0.0074 | 1765.0 | 52950 | 1.4409 | 0.7917 | | 0.0074 | 1766.0 | 52980 | 1.4872 | 0.7792 | | 0.0069 | 1767.0 | 53010 | 1.2909 | 0.7833 | | 0.0069 | 1768.0 | 53040 | 1.1229 | 0.7958 | | 0.0069 | 1769.0 | 53070 | 1.1504 | 0.8125 | | 0.0069 | 1770.0 | 53100 | 1.3935 | 0.7958 | | 0.0069 | 1771.0 | 53130 | 1.3097 | 0.8083 | | 0.0069 | 1772.0 | 53160 | 1.0690 | 0.8458 | | 0.0069 | 1773.0 | 53190 | 1.2158 | 0.8083 | | 0.0069 | 1774.0 | 53220 | 1.3868 | 0.7708 | | 0.0069 | 1775.0 | 53250 | 1.4253 | 0.775 | | 0.0069 | 1776.0 | 53280 | 1.3567 | 0.8292 | | 0.0069 | 1777.0 | 53310 | 1.4256 | 0.8 | | 0.0069 | 1778.0 | 53340 | 1.4184 | 0.7833 | | 0.0069 | 1779.0 | 53370 | 1.6955 | 0.7667 | | 0.0069 | 1780.0 | 53400 | 1.4366 | 0.7833 | | 0.0069 | 1781.0 | 53430 | 1.2100 | 0.7917 | | 0.0069 | 1782.0 | 53460 | 1.3288 | 0.8208 | | 0.0069 | 1783.0 | 
53490 | 1.4486 | 0.7958 | | 0.0092 | 1784.0 | 53520 | 1.4399 | 0.7875 | | 0.0092 | 1785.0 | 53550 | 1.4844 | 0.75 | | 0.0092 | 1786.0 | 53580 | 1.4153 | 0.8 | | 0.0092 | 1787.0 | 53610 | 1.4012 | 0.7875 | | 0.0092 | 1788.0 | 53640 | 1.4394 | 0.7708 | | 0.0092 | 1789.0 | 53670 | 1.3736 | 0.8125 | | 0.0092 | 1790.0 | 53700 | 1.8755 | 0.7458 | | 0.0092 | 1791.0 | 53730 | 1.3583 | 0.8042 | | 0.0092 | 1792.0 | 53760 | 1.2741 | 0.7833 | | 0.0092 | 1793.0 | 53790 | 1.7958 | 0.75 | | 0.0092 | 1794.0 | 53820 | 1.2800 | 0.8208 | | 0.0092 | 1795.0 | 53850 | 1.6146 | 0.8042 | | 0.0092 | 1796.0 | 53880 | 1.5070 | 0.7625 | | 0.0092 | 1797.0 | 53910 | 1.4889 | 0.7875 | | 0.0092 | 1798.0 | 53940 | 1.5039 | 0.8208 | | 0.0092 | 1799.0 | 53970 | 1.3737 | 0.7958 | | 0.0087 | 1800.0 | 54000 | 1.5132 | 0.7875 | | 0.0087 | 1801.0 | 54030 | 1.4327 | 0.8167 | | 0.0087 | 1802.0 | 54060 | 1.3060 | 0.8208 | | 0.0087 | 1803.0 | 54090 | 1.3390 | 0.775 | | 0.0087 | 1804.0 | 54120 | 1.4776 | 0.8042 | | 0.0087 | 1805.0 | 54150 | 1.4412 | 0.8 | | 0.0087 | 1806.0 | 54180 | 1.4395 | 0.775 | | 0.0087 | 1807.0 | 54210 | 1.7166 | 0.7708 | | 0.0087 | 1808.0 | 54240 | 1.3614 | 0.7875 | | 0.0087 | 1809.0 | 54270 | 1.2557 | 0.8292 | | 0.0087 | 1810.0 | 54300 | 1.5233 | 0.7833 | | 0.0087 | 1811.0 | 54330 | 1.1652 | 0.7917 | | 0.0087 | 1812.0 | 54360 | 1.0636 | 0.8375 | | 0.0087 | 1813.0 | 54390 | 1.2284 | 0.8042 | | 0.0087 | 1814.0 | 54420 | 1.2259 | 0.8167 | | 0.0087 | 1815.0 | 54450 | 1.4544 | 0.8083 | | 0.0087 | 1816.0 | 54480 | 1.5231 | 0.7833 | | 0.0092 | 1817.0 | 54510 | 1.4474 | 0.7792 | | 0.0092 | 1818.0 | 54540 | 1.5503 | 0.8083 | | 0.0092 | 1819.0 | 54570 | 1.2062 | 0.8083 | | 0.0092 | 1820.0 | 54600 | 1.2325 | 0.7875 | | 0.0092 | 1821.0 | 54630 | 1.3989 | 0.8042 | | 0.0092 | 1822.0 | 54660 | 1.4384 | 0.8 | | 0.0092 | 1823.0 | 54690 | 1.2723 | 0.7875 | | 0.0092 | 1824.0 | 54720 | 1.2214 | 0.8125 | | 0.0092 | 1825.0 | 54750 | 1.4630 | 0.8125 | | 0.0092 | 1826.0 | 54780 | 1.2539 | 0.7833 | | 0.0092 | 1827.0 | 54810 | 1.6401 | 0.7667 | | 0.0092 | 1828.0 | 54840 | 1.4556 | 0.8042 | | 0.0092 | 1829.0 | 54870 | 1.4111 | 0.7708 | | 0.0092 | 1830.0 | 54900 | 1.5331 | 0.7875 | | 0.0092 | 1831.0 | 54930 | 1.4464 | 0.8042 | | 0.0092 | 1832.0 | 54960 | 1.3866 | 0.8042 | | 0.0092 | 1833.0 | 54990 | 1.4990 | 0.7667 | | 0.011 | 1834.0 | 55020 | 1.1754 | 0.8333 | | 0.011 | 1835.0 | 55050 | 1.2265 | 0.8333 | | 0.011 | 1836.0 | 55080 | 1.2726 | 0.7958 | | 0.011 | 1837.0 | 55110 | 1.4070 | 0.7958 | | 0.011 | 1838.0 | 55140 | 1.5753 | 0.7667 | | 0.011 | 1839.0 | 55170 | 1.3695 | 0.7667 | | 0.011 | 1840.0 | 55200 | 1.5525 | 0.7875 | | 0.011 | 1841.0 | 55230 | 1.1199 | 0.7917 | | 0.011 | 1842.0 | 55260 | 1.5401 | 0.8083 | | 0.011 | 1843.0 | 55290 | 1.1869 | 0.8167 | | 0.011 | 1844.0 | 55320 | 1.3938 | 0.7958 | | 0.011 | 1845.0 | 55350 | 1.2166 | 0.8333 | | 0.011 | 1846.0 | 55380 | 1.4912 | 0.7917 | | 0.011 | 1847.0 | 55410 | 2.0722 | 0.7583 | | 0.011 | 1848.0 | 55440 | 1.4814 | 0.7792 | | 0.011 | 1849.0 | 55470 | 1.2156 | 0.7917 | | 0.0093 | 1850.0 | 55500 | 1.4209 | 0.8292 | | 0.0093 | 1851.0 | 55530 | 1.4285 | 0.7917 | | 0.0093 | 1852.0 | 55560 | 1.4826 | 0.7958 | | 0.0093 | 1853.0 | 55590 | 1.3639 | 0.7708 | | 0.0093 | 1854.0 | 55620 | 1.5322 | 0.7875 | | 0.0093 | 1855.0 | 55650 | 1.2177 | 0.8083 | | 0.0093 | 1856.0 | 55680 | 1.3090 | 0.7792 | | 0.0093 | 1857.0 | 55710 | 1.3316 | 0.7917 | | 0.0093 | 1858.0 | 55740 | 1.3497 | 0.8 | | 0.0093 | 1859.0 | 55770 | 1.4211 | 0.7583 | | 0.0093 | 1860.0 | 55800 | 1.2918 | 0.8083 | | 0.0093 | 1861.0 | 
55830 | 1.3925 | 0.7917 | | 0.0093 | 1862.0 | 55860 | 1.4005 | 0.7958 | | 0.0093 | 1863.0 | 55890 | 1.5641 | 0.7875 | | 0.0093 | 1864.0 | 55920 | 1.5612 | 0.7875 | | 0.0093 | 1865.0 | 55950 | 1.3880 | 0.7833 | | 0.0093 | 1866.0 | 55980 | 1.2307 | 0.825 | | 0.01 | 1867.0 | 56010 | 1.3204 | 0.8292 | | 0.01 | 1868.0 | 56040 | 1.6270 | 0.775 | | 0.01 | 1869.0 | 56070 | 1.3512 | 0.775 | | 0.01 | 1870.0 | 56100 | 1.2808 | 0.8 | | 0.01 | 1871.0 | 56130 | 1.5210 | 0.7667 | | 0.01 | 1872.0 | 56160 | 1.2638 | 0.8083 | | 0.01 | 1873.0 | 56190 | 1.4639 | 0.7833 | | 0.01 | 1874.0 | 56220 | 1.7515 | 0.7292 | | 0.01 | 1875.0 | 56250 | 1.5528 | 0.7875 | | 0.01 | 1876.0 | 56280 | 1.4582 | 0.7875 | | 0.01 | 1877.0 | 56310 | 1.5549 | 0.7875 | | 0.01 | 1878.0 | 56340 | 1.1945 | 0.8208 | | 0.01 | 1879.0 | 56370 | 1.5149 | 0.8 | | 0.01 | 1880.0 | 56400 | 1.3831 | 0.7667 | | 0.01 | 1881.0 | 56430 | 1.5024 | 0.7792 | | 0.01 | 1882.0 | 56460 | 1.3748 | 0.8167 | | 0.01 | 1883.0 | 56490 | 1.4295 | 0.8 | | 0.0075 | 1884.0 | 56520 | 1.4949 | 0.7833 | | 0.0075 | 1885.0 | 56550 | 1.6528 | 0.7667 | | 0.0075 | 1886.0 | 56580 | 1.3022 | 0.8042 | | 0.0075 | 1887.0 | 56610 | 1.4377 | 0.7792 | | 0.0075 | 1888.0 | 56640 | 1.3229 | 0.8208 | | 0.0075 | 1889.0 | 56670 | 1.8277 | 0.8 | | 0.0075 | 1890.0 | 56700 | 1.1062 | 0.825 | | 0.0075 | 1891.0 | 56730 | 1.5968 | 0.7667 | | 0.0075 | 1892.0 | 56760 | 1.2030 | 0.8208 | | 0.0075 | 1893.0 | 56790 | 1.1057 | 0.8208 | | 0.0075 | 1894.0 | 56820 | 1.1445 | 0.7917 | | 0.0075 | 1895.0 | 56850 | 1.5530 | 0.8042 | | 0.0075 | 1896.0 | 56880 | 1.2015 | 0.7958 | | 0.0075 | 1897.0 | 56910 | 1.3523 | 0.8125 | | 0.0075 | 1898.0 | 56940 | 1.4890 | 0.7875 | | 0.0075 | 1899.0 | 56970 | 1.2990 | 0.8333 | | 0.0073 | 1900.0 | 57000 | 1.3289 | 0.8208 | | 0.0073 | 1901.0 | 57030 | 1.3369 | 0.7917 | | 0.0073 | 1902.0 | 57060 | 1.1434 | 0.8375 | | 0.0073 | 1903.0 | 57090 | 1.4121 | 0.7917 | | 0.0073 | 1904.0 | 57120 | 1.3132 | 0.7708 | | 0.0073 | 1905.0 | 57150 | 1.4352 | 0.7958 | | 0.0073 | 1906.0 | 57180 | 1.4682 | 0.8125 | | 0.0073 | 1907.0 | 57210 | 1.2368 | 0.7917 | | 0.0073 | 1908.0 | 57240 | 1.2184 | 0.8458 | | 0.0073 | 1909.0 | 57270 | 1.5381 | 0.8042 | | 0.0073 | 1910.0 | 57300 | 1.2602 | 0.8 | | 0.0073 | 1911.0 | 57330 | 1.4832 | 0.8042 | | 0.0073 | 1912.0 | 57360 | 1.3544 | 0.7833 | | 0.0073 | 1913.0 | 57390 | 1.3065 | 0.8125 | | 0.0073 | 1914.0 | 57420 | 1.3738 | 0.7708 | | 0.0073 | 1915.0 | 57450 | 1.4057 | 0.7875 | | 0.0073 | 1916.0 | 57480 | 1.4792 | 0.7917 | | 0.0096 | 1917.0 | 57510 | 1.4106 | 0.8083 | | 0.0096 | 1918.0 | 57540 | 1.8933 | 0.7625 | | 0.0096 | 1919.0 | 57570 | 1.2972 | 0.8083 | | 0.0096 | 1920.0 | 57600 | 1.5340 | 0.775 | | 0.0096 | 1921.0 | 57630 | 1.3333 | 0.8125 | | 0.0096 | 1922.0 | 57660 | 1.6496 | 0.7542 | | 0.0096 | 1923.0 | 57690 | 1.6254 | 0.7917 | | 0.0096 | 1924.0 | 57720 | 1.3081 | 0.8 | | 0.0096 | 1925.0 | 57750 | 1.4128 | 0.7958 | | 0.0096 | 1926.0 | 57780 | 1.3969 | 0.7958 | | 0.0096 | 1927.0 | 57810 | 1.3021 | 0.8333 | | 0.0096 | 1928.0 | 57840 | 1.6577 | 0.8083 | | 0.0096 | 1929.0 | 57870 | 1.3530 | 0.8458 | | 0.0096 | 1930.0 | 57900 | 1.6314 | 0.7667 | | 0.0096 | 1931.0 | 57930 | 1.2392 | 0.7958 | | 0.0096 | 1932.0 | 57960 | 1.5000 | 0.7833 | | 0.0096 | 1933.0 | 57990 | 1.5883 | 0.7958 | | 0.0092 | 1934.0 | 58020 | 1.0950 | 0.8083 | | 0.0092 | 1935.0 | 58050 | 1.6003 | 0.7875 | | 0.0092 | 1936.0 | 58080 | 1.2340 | 0.8042 | | 0.0092 | 1937.0 | 58110 | 1.5221 | 0.7958 | | 0.0092 | 1938.0 | 58140 | 1.5097 | 0.8042 | | 0.0092 | 1939.0 | 58170 | 1.2358 | 0.7917 
| | 0.0092 | 1940.0 | 58200 | 1.3919 | 0.8042 | | 0.0092 | 1941.0 | 58230 | 1.2941 | 0.8125 | | 0.0092 | 1942.0 | 58260 | 1.3583 | 0.7875 | | 0.0092 | 1943.0 | 58290 | 1.2504 | 0.775 | | 0.0092 | 1944.0 | 58320 | 1.2353 | 0.8042 | | 0.0092 | 1945.0 | 58350 | 1.5725 | 0.7708 | | 0.0092 | 1946.0 | 58380 | 1.2969 | 0.8042 | | 0.0092 | 1947.0 | 58410 | 1.3742 | 0.7958 | | 0.0092 | 1948.0 | 58440 | 1.4705 | 0.8167 | | 0.0092 | 1949.0 | 58470 | 1.1498 | 0.825 | | 0.0091 | 1950.0 | 58500 | 1.4432 | 0.7917 | | 0.0091 | 1951.0 | 58530 | 1.3812 | 0.825 | | 0.0091 | 1952.0 | 58560 | 1.3279 | 0.8125 | | 0.0091 | 1953.0 | 58590 | 1.3954 | 0.8 | | 0.0091 | 1954.0 | 58620 | 1.3180 | 0.8208 | | 0.0091 | 1955.0 | 58650 | 1.2914 | 0.825 | | 0.0091 | 1956.0 | 58680 | 1.4334 | 0.8292 | | 0.0091 | 1957.0 | 58710 | 1.4698 | 0.8083 | | 0.0091 | 1958.0 | 58740 | 1.3515 | 0.8125 | | 0.0091 | 1959.0 | 58770 | 1.2476 | 0.8042 | | 0.0091 | 1960.0 | 58800 | 1.5237 | 0.7792 | | 0.0091 | 1961.0 | 58830 | 1.4550 | 0.7958 | | 0.0091 | 1962.0 | 58860 | 1.3967 | 0.8167 | | 0.0091 | 1963.0 | 58890 | 1.6567 | 0.7542 | | 0.0091 | 1964.0 | 58920 | 1.5198 | 0.8042 | | 0.0091 | 1965.0 | 58950 | 1.2391 | 0.8083 | | 0.0091 | 1966.0 | 58980 | 1.4267 | 0.7625 | | 0.0078 | 1967.0 | 59010 | 1.2633 | 0.7625 | | 0.0078 | 1968.0 | 59040 | 1.6378 | 0.8 | | 0.0078 | 1969.0 | 59070 | 1.2498 | 0.7833 | | 0.0078 | 1970.0 | 59100 | 1.2138 | 0.7667 | | 0.0078 | 1971.0 | 59130 | 1.3609 | 0.7667 | | 0.0078 | 1972.0 | 59160 | 1.1123 | 0.8083 | | 0.0078 | 1973.0 | 59190 | 1.3055 | 0.8125 | | 0.0078 | 1974.0 | 59220 | 1.2567 | 0.8375 | | 0.0078 | 1975.0 | 59250 | 1.2136 | 0.8 | | 0.0078 | 1976.0 | 59280 | 1.4452 | 0.7667 | | 0.0078 | 1977.0 | 59310 | 1.3163 | 0.8458 | | 0.0078 | 1978.0 | 59340 | 1.4422 | 0.7708 | | 0.0078 | 1979.0 | 59370 | 1.1028 | 0.8083 | | 0.0078 | 1980.0 | 59400 | 1.2743 | 0.8042 | | 0.0078 | 1981.0 | 59430 | 1.1929 | 0.8167 | | 0.0078 | 1982.0 | 59460 | 1.2579 | 0.8083 | | 0.0078 | 1983.0 | 59490 | 1.3026 | 0.7958 | | 0.0083 | 1984.0 | 59520 | 1.4051 | 0.8042 | | 0.0083 | 1985.0 | 59550 | 1.2011 | 0.8083 | | 0.0083 | 1986.0 | 59580 | 1.3644 | 0.7583 | | 0.0083 | 1987.0 | 59610 | 1.7503 | 0.7833 | | 0.0083 | 1988.0 | 59640 | 1.1952 | 0.8042 | | 0.0083 | 1989.0 | 59670 | 1.5411 | 0.7833 | | 0.0083 | 1990.0 | 59700 | 1.2093 | 0.8167 | | 0.0083 | 1991.0 | 59730 | 1.3150 | 0.7708 | | 0.0083 | 1992.0 | 59760 | 1.5014 | 0.775 | | 0.0083 | 1993.0 | 59790 | 1.3844 | 0.8042 | | 0.0083 | 1994.0 | 59820 | 1.4835 | 0.8 | | 0.0083 | 1995.0 | 59850 | 1.6885 | 0.7708 | | 0.0083 | 1996.0 | 59880 | 1.6286 | 0.7625 | | 0.0083 | 1997.0 | 59910 | 1.4851 | 0.8 | | 0.0083 | 1998.0 | 59940 | 1.3521 | 0.7792 | | 0.0083 | 1999.0 | 59970 | 1.1506 | 0.7917 | | 0.0072 | 2000.0 | 60000 | 1.2296 | 0.8125 | ### Framework versions - Transformers 4.42.2 - Pytorch 2.3.0 - Datasets 2.15.0 - Tokenizers 0.19.1 ## Limitations and future work - The model's performance needs further validation on more diverse and clinical datasets. - The use of SMOTE for balancing classes can introduce some artifacts, so other balancing techniques could be explored. - Future work could include experimenting with different architectures and fine-tuning strategies to improve performance.
[ "0", "1", "0", "1", "2", "0", "0", "0", "1", "1", "1", "1", "1", "1", "1", "2", "0", "1", "0", "2", "1", "0", "2", "2", "1", "2", "1", "1", "1", "2", "0", "1", "0", "0", "1", "2", "0", "0", "2", "0", "0", "1", "2", "0", "1", "1", "2", "1", "2", "1", "0", "0", "2", "1", "1", "2", "0", "0", "1", "0", "0", "1", "2", "0", "2", "2", "2", "0", "2", "0", "0", "1", "0", "2", "1", "0", "0", "0", "0", "1", "0", "0", "0", "0", "0", "0", "2", "0", "1", "1", "2", "2", "2", "2", "1", "2", "1", "0", "2", "2", "1", "1", "2", "0", "0", "1", "2", "2", "1", "0", "1", "2", "1", "2", "1", "1", "2", "0", "1", "1", "0", "2", "0", "0", "1", "2", "0", "1", "2", "0", "1", "2", "1", "1", "0", "1", "0", "2", "2", "2", "1", "1", "0", "0", "0", "1", "1", "2", "2", "0", "2", "2", "2", "2", "0", "1", "2", "2", "2", "0", "0", "1", "2", "2", "1", "0", "1", "1", "2", "1", "0", "2", "2", "2", "1", "1", "1", "2", "1", "2", "0", "2", "1", "0", "0", "1", "0", "2", "1", "2", "1", "2", "2", "0", "0", "1", "1", "1", "0", "0", "0", "0", "1", "2", "0", "1", "0", "0", "2", "1", "1", "2", "2", "2", "0", "0", "0", "2", "1", "1", "2", "1", "0", "0", "2", "2", "2", "1", "1", "2", "0", "2", "1", "0", "2", "1", "2", "0", "0", "0", "2", "0", "0", "2", "0", "1", "2", "2", "2", "1", "0", "1", "1", "1", "2", "0", "1", "0", "0", "0", "0", "1", "1", "1", "0", "2", "1", "0", "0", "0", "0", "2", "2", "1", "1", "2", "1", "2", "1", "1", "1", "2", "2", "1", "2", "1", "1", "1", "2", "0", "2", "1", "1", "1", "0", "1", "2", "0", "0", "0", "0", "0", "2", "0", "2", "2", "2", "0", "2", "1", "1", "2", "1", "0", "2", "1", "2", "0", "1", "0", "0", "2", "2", "2", "0", "0", "1", "2", "1", "2", "1", "2", "1", "1", "0", "1", "1", "0", "2", "0", "2", "2", "1", "1", "2", "0", "1", "2", "1", "2", "0", "0", "2", "0", "1", "2", "1", "1", "1", "2", "0", "1", "1", "1", "1", "0", "1", "0", "1", "0", "2", "2", "2", "1", "1", "2", "1", "2", "2", "2", "2", "2", "2", "0", "1", "0", "2", "2", "1", "2", "1", "2", "0", "1", "1", "2", "2", "0", "0", "2", "2", "2", "1", "1", "1", "2", "2", "1", "1", "2", "1", "0", "1", "0", "2", "2", "2", "2", "0", "1", "0", "1", "2", "0", "0", "0", "0", "1", "1", "0", "2", "2", "1", "2", "0", "1", "2", "0", "2", "1", "0", "1", "1", "1", "0", "0", "1", "0", "2", "2", "2", "0", "2", "1", "0", "2", "2", "1", "2", "0", "1", "1", "1", "0", "0", "2", "2", "0", "1", "1", "0", "1", "0", "2", "2", "0", "2", "0", "1", "0", "0", "1", "2", "0", "0", "0", "2", "1", "0", "0", "1", "0", "0", "0", "1", "0", "2", "1", "2", "2", "2", "2", "2", "1", "2", "1", "2", "1", "1", "1", "0", "0", "2", "1", "2", "2", "2", "1", "1", "2", "1", "2", "2", "0", "2", "1", "0", "1", "1", "0", "0", "2", "1", "2", "0", "2", "1", "2", "1", "0", "1", "2", "0", "0", "2", "1", "0", "0", "2", "0", "2", "0", "2", "2", "2", "1", "2", "0", "1", "2", "1", "1", "0", "0", "0", "2", "0", "2", "0", "2", "1", "0", "0", "0", "1", "0", "2", "1", "2", "1", "2", "1", "0", "2", "2", "2", "2", "2", "1", "2", "0", "1", "1", "2", "1", "1", "1", "2", "2", "0", "0", "0", "2", "2", "1", "0", "1", "0", "0", "0", "2", "2", "2", "2", "0", "2", "2", "1", "0", "2", "0", "0", "1", "0", "0", "1", "0", "1", "2", "2", "0", "1", "0", "1", "1", "2", "0", "2", "1", "2", "0", "1", "2", "2", "1", "0", "1", "1", "1", "2", "2", "1", "1", "1", "1", "0", "2", "1", "1", "2", "0", "2", "1", "2", "2", "0", "0", "1", "1", "2", "2", "1", "0", "0", "2", "2", "1", "1", "0", "0", "2", "1", "0", "0", "1", "0", "2", "0", "2", "1", "1", "1", "2", "1", "2", "0", "2", "2", "0", "1", "2", "1", "1", "1", "2", "0", "1", "1", "0", "2", 
"0", "1", "1", "1", "2", "1", "0", "0", "2", "2", "2", "1", "0", "0", "0", "2", "2", "1", "0", "0", "0", "1", "2", "1", "1", "1", "2", "0", "0", "1", "0", "0", "2", "1", "2", "0", "2", "2", "1", "1", "0", "0", "2", "2", "2", "1", "1", "1", "0", "0", "1", "1", "1", "1", "1", "1", "0", "0", "2", "2", "2", "0", "2", "1", "2", "1", "0", "2", "2", "1", "1", "1", "0", "0", "2", "2", "2", "0", "0", "2", "1", "1", "0", "2", "1", "0", "1", "2", "2", "2", "0", "2", "2", "1", "2", "1", "0", "0", "0", "2", "0", "2", "0", "0", "2", "2", "1", "1", "0", "1", "0", "0", "2", "0", "1", "2", "1", "0", "1", "2", "2", "0", "0", "2", "2", "2", "0", "0", "1", "1", "1", "0", "1", "1", "2", "2", "0", "2", "1", "2", "1", "0", "2", "1", "2", "1", "2", "2", "1", "2", "0", "1", "2", "1", "0", "1", "1", "0", "2", "0", "0", "0", "0", "0", "0", "0", "1", "0", "0", "0", "1", "2", "1", "2", "1", "1", "0", "1", "2", "0", "0", "1", "2", "2", "2", "2", "1", "0", "0", "1" ]
crocutacrocuto/convnext-base-224-MEG0-3
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "aardvark", "bird", "black-and-white colobus", "blue duiker", "blue monkey", "buffalo", "bushbuck", "bushpig", "cattle", "chimpanzee", "civet_genet", "dog", "elephant", "galago_potto", "goat", "golden cat", "gorilla", "grey duiker", "grey-cheeked mangabey", "guineafowl", "honey badger", "hyrax", "leopard", "lhoest monkey", "mandrill", "mongoose", "monkey", "olive baboon", "otter", "pangolin", "porcupine", "red colobus", "red duiker", "red-capped mangabey", "rodent", "serval", "side-striped jackal", "spotted hyena", "squirel", "vervet monkey", "water chevrotain" ]
geminiZzz/image_classification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # image_classification This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 1.5890 - Accuracy: 0.5062 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 3e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 15 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 10 | 2.0332 | 0.25 | | No log | 2.0 | 20 | 1.9720 | 0.3125 | | No log | 3.0 | 30 | 1.8937 | 0.3688 | | No log | 4.0 | 40 | 1.8265 | 0.375 | | No log | 5.0 | 50 | 1.7561 | 0.3937 | | No log | 6.0 | 60 | 1.7083 | 0.45 | | No log | 7.0 | 70 | 1.6719 | 0.4375 | | No log | 8.0 | 80 | 1.6415 | 0.4688 | | No log | 9.0 | 90 | 1.6237 | 0.4813 | | No log | 10.0 | 100 | 1.6041 | 0.4938 | | No log | 11.0 | 110 | 1.5890 | 0.5062 | | No log | 12.0 | 120 | 1.5774 | 0.5 | | No log | 13.0 | 130 | 1.5700 | 0.5 | | No log | 14.0 | 140 | 1.5659 | 0.5062 | | No log | 15.0 | 150 | 1.5643 | 0.5062 | ### Framework versions - Transformers 4.42.4 - Pytorch 2.3.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "anger", "contempt", "disgust", "fear", "happy", "neutral", "sad", "surprise" ]
ashaduzzaman/vit-base-oxford-iiit-pets
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-oxford-iiit-pets This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset. It achieves the following results on the evaluation set: - Loss: 2.4728 - Accuracy: 0.6067 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 3.588 | 1.0 | 184 | 3.2349 | 0.1522 | | 3.0928 | 2.0 | 368 | 2.8819 | 0.3478 | | 2.7571 | 3.0 | 552 | 2.6433 | 0.5149 | | 2.5459 | 4.0 | 736 | 2.5048 | 0.6019 | | 2.4484 | 5.0 | 920 | 2.4601 | 0.6155 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.3.0+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "20", "1", "18", "16", "14", "3", "8", "22", "19", "25", "9", "12", "32", "33", "21", "35", "6", "26", "15", "10", "24", "17", "27", "36", "11", "13", "5", "30", "31", "4", "0", "28", "29", "2", "23", "34", "7" ]
nvidia/MambaVision-T-1K
[**MambaVision: A Hybrid Mamba-Transformer Vision Backbone**](https://huggingface.co/papers/2407.08083) ## Model Overview We have developed the first hybrid model for computer vision which leverages the strengths of Mamba and Transformers. Specifically, our core contribution includes redesigning the Mamba formulation to enhance its capability for efficient modeling of visual features. In addition, we conducted a comprehensive ablation study on the feasibility of integrating Vision Transformers (ViT) with Mamba. Our results demonstrate that equipping the Mamba architecture with several self-attention blocks at the final layers greatly improves the modeling capacity to capture long-range spatial dependencies. Based on our findings, we introduce a family of MambaVision models with a hierarchical architecture to meet various design criteria. ## Model Performance MambaVision demonstrates a strong performance by achieving a new SOTA Pareto-front in terms of Top-1 accuracy and throughput. <p align="center"> <img src="https://github.com/NVlabs/MambaVision/assets/26806394/79dcf841-3966-4b77-883d-76cd5e1d4320" width=70% height=70% class="center"> </p> ## Model Usage It is highly recommended to install the requirements for MambaVision by running the following: ```Bash pip install mambavision ``` For each model, we offer two variants for image classification and feature extraction that can be imported with 1 line of code. ### Image Classification In the following example, we demonstrate how MambaVision can be used for image classification. Given the following image from [COCO dataset](https://cocodataset.org/#home) val set as an input: <p align="center"> <img src="https://hf.fast360.xyz/production/uploads/64414b62603214724ebd2636/4duSnqLf4lrNiAHczSmAN.jpeg" width=70% height=70% class="center"> </p> The following snippet can be used for image classification: ```Python from transformers import AutoModelForImageClassification from PIL import Image from timm.data.transforms_factory import create_transform import requests model = AutoModelForImageClassification.from_pretrained("nvidia/MambaVision-T-1K", trust_remote_code=True) # eval mode for inference model.cuda().eval() # prepare image for the model url = 'http://images.cocodataset.org/val2017/000000020247.jpg' image = Image.open(requests.get(url, stream=True).raw) input_resolution = (3, 224, 224) # MambaVision supports any input resolutions transform = create_transform(input_size=input_resolution, is_training=False, mean=model.config.mean, std=model.config.std, crop_mode=model.config.crop_mode, crop_pct=model.config.crop_pct) inputs = transform(image).unsqueeze(0).cuda() # model inference outputs = model(inputs) logits = outputs['logits'] predicted_class_idx = logits.argmax(-1).item() print("Predicted class:", model.config.id2label[predicted_class_idx]) ``` The predicted label is ```brown bear, bruin, Ursus arctos.``` ### Feature Extraction MambaVision can also be used as a generic feature extractor. Specifically, we can extract the outputs of each stage of model (4 stages) as well as the final averaged-pool features that are flattened. 
The following snippet can be used for feature extraction: ```Python from transformers import AutoModel from PIL import Image from timm.data.transforms_factory import create_transform import requests model = AutoModel.from_pretrained("nvidia/MambaVision-T-1K", trust_remote_code=True) # eval mode for inference model.cuda().eval() # prepare image for the model url = 'http://images.cocodataset.org/val2017/000000020247.jpg' image = Image.open(requests.get(url, stream=True).raw) input_resolution = (3, 224, 224) # MambaVision supports any input resolutions transform = create_transform(input_size=input_resolution, is_training=False, mean=model.config.mean, std=model.config.std, crop_mode=model.config.crop_mode, crop_pct=model.config.crop_pct) inputs = transform(image).unsqueeze(0).cuda() # model inference out_avg_pool, features = model(inputs) print("Size of the averaged pool features:", out_avg_pool.size()) # torch.Size([1, 640]) print("Number of stages in extracted features:", len(features)) # 4 stages print("Size of extracted features in stage 1:", features[0].size()) # torch.Size([1, 80, 56, 56]) print("Size of extracted features in stage 4:", features[3].size()) # torch.Size([1, 640, 7, 7]) ``` ### License: [NVIDIA Source Code License-NC](https://huggingface.co/nvidia/MambaVision-T-1K/blob/main/LICENSE)
[ "tench, tinca tinca", "goldfish, carassius auratus", "great white shark, white shark, man-eater, man-eating shark, carcharodon carcharias", "tiger shark, galeocerdo cuvieri", "hammerhead, hammerhead shark", "electric ray, crampfish, numbfish, torpedo", "stingray", "cock", "hen", "ostrich, struthio camelus", "brambling, fringilla montifringilla", "goldfinch, carduelis carduelis", "house finch, linnet, carpodacus mexicanus", "junco, snowbird", "indigo bunting, indigo finch, indigo bird, passerina cyanea", "robin, american robin, turdus migratorius", "bulbul", "jay", "magpie", "chickadee", "water ouzel, dipper", "kite", "bald eagle, american eagle, haliaeetus leucocephalus", "vulture", "great grey owl, great gray owl, strix nebulosa", "european fire salamander, salamandra salamandra", "common newt, triturus vulgaris", "eft", "spotted salamander, ambystoma maculatum", "axolotl, mud puppy, ambystoma mexicanum", "bullfrog, rana catesbeiana", "tree frog, tree-frog", "tailed frog, bell toad, ribbed toad, tailed toad, ascaphus trui", "loggerhead, loggerhead turtle, caretta caretta", "leatherback turtle, leatherback, leathery turtle, dermochelys coriacea", "mud turtle", "terrapin", "box turtle, box tortoise", "banded gecko", "common iguana, iguana, iguana iguana", "american chameleon, anole, anolis carolinensis", "whiptail, whiptail lizard", "agama", "frilled lizard, chlamydosaurus kingi", "alligator lizard", "gila monster, heloderma suspectum", "green lizard, lacerta viridis", "african chameleon, chamaeleo chamaeleon", "komodo dragon, komodo lizard, dragon lizard, giant lizard, varanus komodoensis", "african crocodile, nile crocodile, crocodylus niloticus", "american alligator, alligator mississipiensis", "triceratops", "thunder snake, worm snake, carphophis amoenus", "ringneck snake, ring-necked snake, ring snake", "hognose snake, puff adder, sand viper", "green snake, grass snake", "king snake, kingsnake", "garter snake, grass snake", "water snake", "vine snake", "night snake, hypsiglena torquata", "boa constrictor, constrictor constrictor", "rock python, rock snake, python sebae", "indian cobra, naja naja", "green mamba", "sea snake", "horned viper, cerastes, sand viper, horned asp, cerastes cornutus", "diamondback, diamondback rattlesnake, crotalus adamanteus", "sidewinder, horned rattlesnake, crotalus cerastes", "trilobite", "harvestman, daddy longlegs, phalangium opilio", "scorpion", "black and gold garden spider, argiope aurantia", "barn spider, araneus cavaticus", "garden spider, aranea diademata", "black widow, latrodectus mactans", "tarantula", "wolf spider, hunting spider", "tick", "centipede", "black grouse", "ptarmigan", "ruffed grouse, partridge, bonasa umbellus", "prairie chicken, prairie grouse, prairie fowl", "peacock", "quail", "partridge", "african grey, african gray, psittacus erithacus", "macaw", "sulphur-crested cockatoo, kakatoe galerita, cacatua galerita", "lorikeet", "coucal", "bee eater", "hornbill", "hummingbird", "jacamar", "toucan", "drake", "red-breasted merganser, mergus serrator", "goose", "black swan, cygnus atratus", "tusker", "echidna, spiny anteater, anteater", "platypus, duckbill, duckbilled platypus, duck-billed platypus, ornithorhynchus anatinus", "wallaby, brush kangaroo", "koala, koala bear, kangaroo bear, native bear, phascolarctos cinereus", "wombat", "jellyfish", "sea anemone, anemone", "brain coral", "flatworm, platyhelminth", "nematode, nematode worm, roundworm", "conch", "snail", "slug", "sea slug, nudibranch", "chiton, coat-of-mail shell, sea 
cradle, polyplacophore", "chambered nautilus, pearly nautilus, nautilus", "dungeness crab, cancer magister", "rock crab, cancer irroratus", "fiddler crab", "king crab, alaska crab, alaskan king crab, alaska king crab, paralithodes camtschatica", "american lobster, northern lobster, maine lobster, homarus americanus", "spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish", "crayfish, crawfish, crawdad, crawdaddy", "hermit crab", "isopod", "white stork, ciconia ciconia", "black stork, ciconia nigra", "spoonbill", "flamingo", "little blue heron, egretta caerulea", "american egret, great white heron, egretta albus", "bittern", "crane", "limpkin, aramus pictus", "european gallinule, porphyrio porphyrio", "american coot, marsh hen, mud hen, water hen, fulica americana", "bustard", "ruddy turnstone, arenaria interpres", "red-backed sandpiper, dunlin, erolia alpina", "redshank, tringa totanus", "dowitcher", "oystercatcher, oyster catcher", "pelican", "king penguin, aptenodytes patagonica", "albatross, mollymawk", "grey whale, gray whale, devilfish, eschrichtius gibbosus, eschrichtius robustus", "killer whale, killer, orca, grampus, sea wolf, orcinus orca", "dugong, dugong dugon", "sea lion", "chihuahua", "japanese spaniel", "maltese dog, maltese terrier, maltese", "pekinese, pekingese, peke", "shih-tzu", "blenheim spaniel", "papillon", "toy terrier", "rhodesian ridgeback", "afghan hound, afghan", "basset, basset hound", "beagle", "bloodhound, sleuthhound", "bluetick", "black-and-tan coonhound", "walker hound, walker foxhound", "english foxhound", "redbone", "borzoi, russian wolfhound", "irish wolfhound", "italian greyhound", "whippet", "ibizan hound, ibizan podenco", "norwegian elkhound, elkhound", "otterhound, otter hound", "saluki, gazelle hound", "scottish deerhound, deerhound", "weimaraner", "staffordshire bullterrier, staffordshire bull terrier", "american staffordshire terrier, staffordshire terrier, american pit bull terrier, pit bull terrier", "bedlington terrier", "border terrier", "kerry blue terrier", "irish terrier", "norfolk terrier", "norwich terrier", "yorkshire terrier", "wire-haired fox terrier", "lakeland terrier", "sealyham terrier, sealyham", "airedale, airedale terrier", "cairn, cairn terrier", "australian terrier", "dandie dinmont, dandie dinmont terrier", "boston bull, boston terrier", "miniature schnauzer", "giant schnauzer", "standard schnauzer", "scotch terrier, scottish terrier, scottie", "tibetan terrier, chrysanthemum dog", "silky terrier, sydney silky", "soft-coated wheaten terrier", "west highland white terrier", "lhasa, lhasa apso", "flat-coated retriever", "curly-coated retriever", "golden retriever", "labrador retriever", "chesapeake bay retriever", "german short-haired pointer", "vizsla, hungarian pointer", "english setter", "irish setter, red setter", "gordon setter", "brittany spaniel", "clumber, clumber spaniel", "english springer, english springer spaniel", "welsh springer spaniel", "cocker spaniel, english cocker spaniel, cocker", "sussex spaniel", "irish water spaniel", "kuvasz", "schipperke", "groenendael", "malinois", "briard", "kelpie", "komondor", "old english sheepdog, bobtail", "shetland sheepdog, shetland sheep dog, shetland", "collie", "border collie", "bouvier des flandres, bouviers des flandres", "rottweiler", "german shepherd, german shepherd dog, german police dog, alsatian", "doberman, doberman pinscher", "miniature pinscher", "greater swiss mountain dog", "bernese mountain dog", "appenzeller", "entlebucher", "boxer", "bull 
mastiff", "tibetan mastiff", "french bulldog", "great dane", "saint bernard, st bernard", "eskimo dog, husky", "malamute, malemute, alaskan malamute", "siberian husky", "dalmatian, coach dog, carriage dog", "affenpinscher, monkey pinscher, monkey dog", "basenji", "pug, pug-dog", "leonberg", "newfoundland, newfoundland dog", "great pyrenees", "samoyed, samoyede", "pomeranian", "chow, chow chow", "keeshond", "brabancon griffon", "pembroke, pembroke welsh corgi", "cardigan, cardigan welsh corgi", "toy poodle", "miniature poodle", "standard poodle", "mexican hairless", "timber wolf, grey wolf, gray wolf, canis lupus", "white wolf, arctic wolf, canis lupus tundrarum", "red wolf, maned wolf, canis rufus, canis niger", "coyote, prairie wolf, brush wolf, canis latrans", "dingo, warrigal, warragal, canis dingo", "dhole, cuon alpinus", "african hunting dog, hyena dog, cape hunting dog, lycaon pictus", "hyena, hyaena", "red fox, vulpes vulpes", "kit fox, vulpes macrotis", "arctic fox, white fox, alopex lagopus", "grey fox, gray fox, urocyon cinereoargenteus", "tabby, tabby cat", "tiger cat", "persian cat", "siamese cat, siamese", "egyptian cat", "cougar, puma, catamount, mountain lion, painter, panther, felis concolor", "lynx, catamount", "leopard, panthera pardus", "snow leopard, ounce, panthera uncia", "jaguar, panther, panthera onca, felis onca", "lion, king of beasts, panthera leo", "tiger, panthera tigris", "cheetah, chetah, acinonyx jubatus", "brown bear, bruin, ursus arctos", "american black bear, black bear, ursus americanus, euarctos americanus", "ice bear, polar bear, ursus maritimus, thalarctos maritimus", "sloth bear, melursus ursinus, ursus ursinus", "mongoose", "meerkat, mierkat", "tiger beetle", "ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle", "ground beetle, carabid beetle", "long-horned beetle, longicorn, longicorn beetle", "leaf beetle, chrysomelid", "dung beetle", "rhinoceros beetle", "weevil", "fly", "bee", "ant, emmet, pismire", "grasshopper, hopper", "cricket", "walking stick, walkingstick, stick insect", "cockroach, roach", "mantis, mantid", "cicada, cicala", "leafhopper", "lacewing, lacewing fly", "dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk", "damselfly", "admiral", "ringlet, ringlet butterfly", "monarch, monarch butterfly, milkweed butterfly, danaus plexippus", "cabbage butterfly", "sulphur butterfly, sulfur butterfly", "lycaenid, lycaenid butterfly", "starfish, sea star", "sea urchin", "sea cucumber, holothurian", "wood rabbit, cottontail, cottontail rabbit", "hare", "angora, angora rabbit", "hamster", "porcupine, hedgehog", "fox squirrel, eastern fox squirrel, sciurus niger", "marmot", "beaver", "guinea pig, cavia cobaya", "sorrel", "zebra", "hog, pig, grunter, squealer, sus scrofa", "wild boar, boar, sus scrofa", "warthog", "hippopotamus, hippo, river horse, hippopotamus amphibius", "ox", "water buffalo, water ox, asiatic buffalo, bubalus bubalis", "bison", "ram, tup", "bighorn, bighorn sheep, cimarron, rocky mountain bighorn, rocky mountain sheep, ovis canadensis", "ibex, capra ibex", "hartebeest", "impala, aepyceros melampus", "gazelle", "arabian camel, dromedary, camelus dromedarius", "llama", "weasel", "mink", "polecat, fitch, foulmart, foumart, mustela putorius", "black-footed ferret, ferret, mustela nigripes", "otter", "skunk, polecat, wood pussy", "badger", "armadillo", "three-toed sloth, ai, bradypus tridactylus", "orangutan, orang, orangutang, pongo pygmaeus", "gorilla, 
gorilla gorilla", "chimpanzee, chimp, pan troglodytes", "gibbon, hylobates lar", "siamang, hylobates syndactylus, symphalangus syndactylus", "guenon, guenon monkey", "patas, hussar monkey, erythrocebus patas", "baboon", "macaque", "langur", "colobus, colobus monkey", "proboscis monkey, nasalis larvatus", "marmoset", "capuchin, ringtail, cebus capucinus", "howler monkey, howler", "titi, titi monkey", "spider monkey, ateles geoffroyi", "squirrel monkey, saimiri sciureus", "madagascar cat, ring-tailed lemur, lemur catta", "indri, indris, indri indri, indri brevicaudatus", "indian elephant, elephas maximus", "african elephant, loxodonta africana", "lesser panda, red panda, panda, bear cat, cat bear, ailurus fulgens", "giant panda, panda, panda bear, coon bear, ailuropoda melanoleuca", "barracouta, snoek", "eel", "coho, cohoe, coho salmon, blue jack, silver salmon, oncorhynchus kisutch", "rock beauty, holocanthus tricolor", "anemone fish", "sturgeon", "gar, garfish, garpike, billfish, lepisosteus osseus", "lionfish", "puffer, pufferfish, blowfish, globefish", "abacus", "abaya", "academic gown, academic robe, judge's robe", "accordion, piano accordion, squeeze box", "acoustic guitar", "aircraft carrier, carrier, flattop, attack aircraft carrier", "airliner", "airship, dirigible", "altar", "ambulance", "amphibian, amphibious vehicle", "analog clock", "apiary, bee house", "apron", "ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin", "assault rifle, assault gun", "backpack, back pack, knapsack, packsack, rucksack, haversack", "bakery, bakeshop, bakehouse", "balance beam, beam", "balloon", "ballpoint, ballpoint pen, ballpen, biro", "band aid", "banjo", "bannister, banister, balustrade, balusters, handrail", "barbell", "barber chair", "barbershop", "barn", "barometer", "barrel, cask", "barrow, garden cart, lawn cart, wheelbarrow", "baseball", "basketball", "bassinet", "bassoon", "bathing cap, swimming cap", "bath towel", "bathtub, bathing tub, bath, tub", "beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon", "beacon, lighthouse, beacon light, pharos", "beaker", "bearskin, busby, shako", "beer bottle", "beer glass", "bell cote, bell cot", "bib", "bicycle-built-for-two, tandem bicycle, tandem", "bikini, two-piece", "binder, ring-binder", "binoculars, field glasses, opera glasses", "birdhouse", "boathouse", "bobsled, bobsleigh, bob", "bolo tie, bolo, bola tie, bola", "bonnet, poke bonnet", "bookcase", "bookshop, bookstore, bookstall", "bottlecap", "bow", "bow tie, bow-tie, bowtie", "brass, memorial tablet, plaque", "brassiere, bra, bandeau", "breakwater, groin, groyne, mole, bulwark, seawall, jetty", "breastplate, aegis, egis", "broom", "bucket, pail", "buckle", "bulletproof vest", "bullet train, bullet", "butcher shop, meat market", "cab, hack, taxi, taxicab", "caldron, cauldron", "candle, taper, wax light", "cannon", "canoe", "can opener, tin opener", "cardigan", "car mirror", "carousel, carrousel, merry-go-round, roundabout, whirligig", "carpenter's kit, tool kit", "carton", "car wheel", "cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, atm", "cassette", "cassette player", "castle", "catamaran", "cd player", "cello, violoncello", "cellular telephone, cellular phone, cellphone, cell, mobile phone", "chain", "chainlink fence", "chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour", "chain saw, chainsaw", "chest", "chiffonier, 
commode", "chime, bell, gong", "china cabinet, china closet", "christmas stocking", "church, church building", "cinema, movie theater, movie theatre, movie house, picture palace", "cleaver, meat cleaver, chopper", "cliff dwelling", "cloak", "clog, geta, patten, sabot", "cocktail shaker", "coffee mug", "coffeepot", "coil, spiral, volute, whorl, helix", "combination lock", "computer keyboard, keypad", "confectionery, confectionary, candy store", "container ship, containership, container vessel", "convertible", "corkscrew, bottle screw", "cornet, horn, trumpet, trump", "cowboy boot", "cowboy hat, ten-gallon hat", "cradle", "crane", "crash helmet", "crate", "crib, cot", "crock pot", "croquet ball", "crutch", "cuirass", "dam, dike, dyke", "desk", "desktop computer", "dial telephone, dial phone", "diaper, nappy, napkin", "digital clock", "digital watch", "dining table, board", "dishrag, dishcloth", "dishwasher, dish washer, dishwashing machine", "disk brake, disc brake", "dock, dockage, docking facility", "dogsled, dog sled, dog sleigh", "dome", "doormat, welcome mat", "drilling platform, offshore rig", "drum, membranophone, tympan", "drumstick", "dumbbell", "dutch oven", "electric fan, blower", "electric guitar", "electric locomotive", "entertainment center", "envelope", "espresso maker", "face powder", "feather boa, boa", "file, file cabinet, filing cabinet", "fireboat", "fire engine, fire truck", "fire screen, fireguard", "flagpole, flagstaff", "flute, transverse flute", "folding chair", "football helmet", "forklift", "fountain", "fountain pen", "four-poster", "freight car", "french horn, horn", "frying pan, frypan, skillet", "fur coat", "garbage truck, dustcart", "gasmask, respirator, gas helmet", "gas pump, gasoline pump, petrol pump, island dispenser", "goblet", "go-kart", "golf ball", "golfcart, golf cart", "gondola", "gong, tam-tam", "gown", "grand piano, grand", "greenhouse, nursery, glasshouse", "grille, radiator grille", "grocery store, grocery, food market, market", "guillotine", "hair slide", "hair spray", "half track", "hammer", "hamper", "hand blower, blow dryer, blow drier, hair dryer, hair drier", "hand-held computer, hand-held microcomputer", "handkerchief, hankie, hanky, hankey", "hard disc, hard disk, fixed disk", "harmonica, mouth organ, harp, mouth harp", "harp", "harvester, reaper", "hatchet", "holster", "home theater, home theatre", "honeycomb", "hook, claw", "hoopskirt, crinoline", "horizontal bar, high bar", "horse cart, horse-cart", "hourglass", "ipod", "iron, smoothing iron", "jack-o'-lantern", "jean, blue jean, denim", "jeep, landrover", "jersey, t-shirt, tee shirt", "jigsaw puzzle", "jinrikisha, ricksha, rickshaw", "joystick", "kimono", "knee pad", "knot", "lab coat, laboratory coat", "ladle", "lampshade, lamp shade", "laptop, laptop computer", "lawn mower, mower", "lens cap, lens cover", "letter opener, paper knife, paperknife", "library", "lifeboat", "lighter, light, igniter, ignitor", "limousine, limo", "liner, ocean liner", "lipstick, lip rouge", "loafer", "lotion", "loudspeaker, speaker, speaker unit, loudspeaker system, speaker system", "loupe, jeweler's loupe", "lumbermill, sawmill", "magnetic compass", "mailbag, postbag", "mailbox, letter box", "maillot", "maillot, tank suit", "manhole cover", "maraca", "marimba, xylophone", "mask", "matchstick", "maypole", "maze, labyrinth", "measuring cup", "medicine chest, medicine cabinet", "megalith, megalithic structure", "microphone, mike", "microwave, microwave oven", "military uniform", "milk can", "minibus", 
"miniskirt, mini", "minivan", "missile", "mitten", "mixing bowl", "mobile home, manufactured home", "model t", "modem", "monastery", "monitor", "moped", "mortar", "mortarboard", "mosque", "mosquito net", "motor scooter, scooter", "mountain bike, all-terrain bike, off-roader", "mountain tent", "mouse, computer mouse", "mousetrap", "moving van", "muzzle", "nail", "neck brace", "necklace", "nipple", "notebook, notebook computer", "obelisk", "oboe, hautboy, hautbois", "ocarina, sweet potato", "odometer, hodometer, mileometer, milometer", "oil filter", "organ, pipe organ", "oscilloscope, scope, cathode-ray oscilloscope, cro", "overskirt", "oxcart", "oxygen mask", "packet", "paddle, boat paddle", "paddlewheel, paddle wheel", "padlock", "paintbrush", "pajama, pyjama, pj's, jammies", "palace", "panpipe, pandean pipe, syrinx", "paper towel", "parachute, chute", "parallel bars, bars", "park bench", "parking meter", "passenger car, coach, carriage", "patio, terrace", "pay-phone, pay-station", "pedestal, plinth, footstall", "pencil box, pencil case", "pencil sharpener", "perfume, essence", "petri dish", "photocopier", "pick, plectrum, plectron", "pickelhaube", "picket fence, paling", "pickup, pickup truck", "pier", "piggy bank, penny bank", "pill bottle", "pillow", "ping-pong ball", "pinwheel", "pirate, pirate ship", "pitcher, ewer", "plane, carpenter's plane, woodworking plane", "planetarium", "plastic bag", "plate rack", "plow, plough", "plunger, plumber's helper", "polaroid camera, polaroid land camera", "pole", "police van, police wagon, paddy wagon, patrol wagon, wagon, black maria", "poncho", "pool table, billiard table, snooker table", "pop bottle, soda bottle", "pot, flowerpot", "potter's wheel", "power drill", "prayer rug, prayer mat", "printer", "prison, prison house", "projectile, missile", "projector", "puck, hockey puck", "punching bag, punch bag, punching ball, punchball", "purse", "quill, quill pen", "quilt, comforter, comfort, puff", "racer, race car, racing car", "racket, racquet", "radiator", "radio, wireless", "radio telescope, radio reflector", "rain barrel", "recreational vehicle, rv, r.v.", "reel", "reflex camera", "refrigerator, icebox", "remote control, remote", "restaurant, eating house, eating place, eatery", "revolver, six-gun, six-shooter", "rifle", "rocking chair, rocker", "rotisserie", "rubber eraser, rubber, pencil eraser", "rugby ball", "rule, ruler", "running shoe", "safe", "safety pin", "saltshaker, salt shaker", "sandal", "sarong", "sax, saxophone", "scabbard", "scale, weighing machine", "school bus", "schooner", "scoreboard", "screen, crt screen", "screw", "screwdriver", "seat belt, seatbelt", "sewing machine", "shield, buckler", "shoe shop, shoe-shop, shoe store", "shoji", "shopping basket", "shopping cart", "shovel", "shower cap", "shower curtain", "ski", "ski mask", "sleeping bag", "slide rule, slipstick", "sliding door", "slot, one-armed bandit", "snorkel", "snowmobile", "snowplow, snowplough", "soap dispenser", "soccer ball", "sock", "solar dish, solar collector, solar furnace", "sombrero", "soup bowl", "space bar", "space heater", "space shuttle", "spatula", "speedboat", "spider web, spider's web", "spindle", "sports car, sport car", "spotlight, spot", "stage", "steam locomotive", "steel arch bridge", "steel drum", "stethoscope", "stole", "stone wall", "stopwatch, stop watch", "stove", "strainer", "streetcar, tram, tramcar, trolley, trolley car", "stretcher", "studio couch, day bed", "stupa, tope", "submarine, pigboat, sub, u-boat", "suit, suit of clothes", 
"sundial", "sunglass", "sunglasses, dark glasses, shades", "sunscreen, sunblock, sun blocker", "suspension bridge", "swab, swob, mop", "sweatshirt", "swimming trunks, bathing trunks", "swing", "switch, electric switch, electrical switch", "syringe", "table lamp", "tank, army tank, armored combat vehicle, armoured combat vehicle", "tape player", "teapot", "teddy, teddy bear", "television, television system", "tennis ball", "thatch, thatched roof", "theater curtain, theatre curtain", "thimble", "thresher, thrasher, threshing machine", "throne", "tile roof", "toaster", "tobacco shop, tobacconist shop, tobacconist", "toilet seat", "torch", "totem pole", "tow truck, tow car, wrecker", "toyshop", "tractor", "trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi", "tray", "trench coat", "tricycle, trike, velocipede", "trimaran", "tripod", "triumphal arch", "trolleybus, trolley coach, trackless trolley", "trombone", "tub, vat", "turnstile", "typewriter keyboard", "umbrella", "unicycle, monocycle", "upright, upright piano", "vacuum, vacuum cleaner", "vase", "vault", "velvet", "vending machine", "vestment", "viaduct", "violin, fiddle", "volleyball", "waffle iron", "wall clock", "wallet, billfold, notecase, pocketbook", "wardrobe, closet, press", "warplane, military plane", "washbasin, handbasin, washbowl, lavabo, wash-hand basin", "washer, automatic washer, washing machine", "water bottle", "water jug", "water tower", "whiskey jug", "whistle", "wig", "window screen", "window shade", "windsor tie", "wine bottle", "wing", "wok", "wooden spoon", "wool, woolen, woollen", "worm fence, snake fence, snake-rail fence, virginia fence", "wreck", "yawl", "yurt", "web site, website, internet site, site", "comic book", "crossword puzzle, crossword", "street sign", "traffic light, traffic signal, stoplight", "book jacket, dust cover, dust jacket, dust wrapper", "menu", "plate", "guacamole", "consomme", "hot pot, hotpot", "trifle", "ice cream, icecream", "ice lolly, lolly, lollipop, popsicle", "french loaf", "bagel, beigel", "pretzel", "cheeseburger", "hotdog, hot dog, red hot", "mashed potato", "head cabbage", "broccoli", "cauliflower", "zucchini, courgette", "spaghetti squash", "acorn squash", "butternut squash", "cucumber, cuke", "artichoke, globe artichoke", "bell pepper", "cardoon", "mushroom", "granny smith", "strawberry", "orange", "lemon", "fig", "pineapple, ananas", "banana", "jackfruit, jak, jack", "custard apple", "pomegranate", "hay", "carbonara", "chocolate sauce, chocolate syrup", "dough", "meat loaf, meatloaf", "pizza, pizza pie", "potpie", "burrito", "red wine", "espresso", "cup", "eggnog", "alp", "bubble", "cliff, drop, drop-off", "coral reef", "geyser", "lakeside, lakeshore", "promontory, headland, head, foreland", "sandbar, sand bar", "seashore, coast, seacoast, sea-coast", "valley, vale", "volcano", "ballplayer, baseball player", "groom, bridegroom", "scuba diver", "rapeseed", "daisy", "yellow lady's slipper, yellow lady-slipper, cypripedium calceolus, cypripedium parviflorum", "corn", "acorn", "hip, rose hip, rosehip", "buckeye, horse chestnut, conker", "coral fungus", "agaric", "gyromitra", "stinkhorn, carrion fungus", "earthstar", "hen-of-the-woods, hen of the woods, polyporus frondosus, grifola frondosa", "bolete", "ear, spike, capitulum", "toilet tissue, toilet paper, bathroom tissue" ]
nvidia/MambaVision-L-1K
[**MambaVision: A Hybrid Mamba-Transformer Vision Backbone**](https://arxiv.org/abs/2407.08083).

Code: https://github.com/NVlabs/MambaVision

## Model Overview

We have developed the first hybrid model for computer vision that leverages the strengths of Mamba and Transformers. Specifically, our core contribution is a redesign of the Mamba formulation that enhances its capability for efficient modeling of visual features. In addition, we conducted a comprehensive ablation study on the feasibility of integrating Vision Transformers (ViT) with Mamba. Our results demonstrate that equipping the Mamba architecture with several self-attention blocks at the final layers greatly improves its capacity to capture long-range spatial dependencies. Based on these findings, we introduce a family of MambaVision models with a hierarchical architecture to meet various design criteria.

## Model Performance

MambaVision demonstrates strong performance, achieving a new SOTA Pareto front for Top-1 accuracy versus throughput.

<p align="center">
<img src="https://github.com/NVlabs/MambaVision/assets/26806394/79dcf841-3966-4b77-883d-76cd5e1d4320" width=70% height=70% class="center">
</p>

## Model Usage

It is highly recommended to install the requirements for MambaVision by running the following:

```Bash
pip install mambavision
```

For each model, we offer two variants, for image classification and feature extraction, each of which can be imported with one line of code.

### Image Classification

In the following example, we demonstrate how MambaVision can be used for image classification. Given the following image from the [COCO dataset](https://cocodataset.org/#home) validation set as input:

<p align="center">
<img src="https://hf.fast360.xyz/production/uploads/64414b62603214724ebd2636/4duSnqLf4lrNiAHczSmAN.jpeg" width=70% height=70% class="center">
</p>

The following snippet can be used for image classification:

```Python
from transformers import AutoModelForImageClassification
from PIL import Image
from timm.data.transforms_factory import create_transform
import requests

model = AutoModelForImageClassification.from_pretrained("nvidia/MambaVision-L-1K", trust_remote_code=True)

# eval mode for inference
model.cuda().eval()

# prepare image for the model
url = 'http://images.cocodataset.org/val2017/000000020247.jpg'
image = Image.open(requests.get(url, stream=True).raw)
input_resolution = (3, 224, 224)  # MambaVision supports any input resolution

transform = create_transform(input_size=input_resolution,
                             is_training=False,
                             mean=model.config.mean,
                             std=model.config.std,
                             crop_mode=model.config.crop_mode,
                             crop_pct=model.config.crop_pct)
inputs = transform(image).unsqueeze(0).cuda()

# model inference
outputs = model(inputs)
logits = outputs['logits']
predicted_class_idx = logits.argmax(-1).item()
print("Predicted class:", model.config.id2label[predicted_class_idx])
```

The predicted label is ```brown bear, bruin, Ursus arctos```.
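Beyond the single top prediction, the logits can be turned into probabilities and mapped to the top-5 ImageNet labels. The following is a minimal sketch, not part of the original card; it assumes the `model` and `logits` objects from the classification example above are still in scope:

```Python
import torch

# a minimal sketch (not part of the original card): map logits to the
# top-5 ImageNet classes; assumes `model` and `logits` from the
# classification example above are in scope
probs = torch.softmax(logits, dim=-1)
top5_probs, top5_idx = probs.topk(5, dim=-1)
for p, idx in zip(top5_probs[0].tolist(), top5_idx[0].tolist()):
    print(f"{model.config.id2label[idx]}: {p:.4f}")
```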
### Feature Extraction

MambaVision can also be used as a generic feature extractor. Specifically, we can extract the outputs of each of the model's four stages, as well as the final average-pooled features, flattened into a vector.

The following snippet can be used for feature extraction:

```Python
from transformers import AutoModel
from PIL import Image
from timm.data.transforms_factory import create_transform
import requests

model = AutoModel.from_pretrained("nvidia/MambaVision-L-1K", trust_remote_code=True)

# eval mode for inference
model.cuda().eval()

# prepare image for the model
url = 'http://images.cocodataset.org/val2017/000000020247.jpg'
image = Image.open(requests.get(url, stream=True).raw)
input_resolution = (3, 224, 224)  # MambaVision supports any input resolution

transform = create_transform(input_size=input_resolution,
                             is_training=False,
                             mean=model.config.mean,
                             std=model.config.std,
                             crop_mode=model.config.crop_mode,
                             crop_pct=model.config.crop_pct)
inputs = transform(image).unsqueeze(0).cuda()

# model inference
out_avg_pool, features = model(inputs)
print("Size of the averaged pool features:", out_avg_pool.size())  # [1, C], where C is the final channel width of this variant
print("Number of stages in extracted features:", len(features))  # 4 stages
print("Size of extracted features in stage 1:", features[0].size())  # [1, C1, 56, 56] for a 224x224 input
print("Size of extracted features in stage 4:", features[3].size())  # [1, C, 7, 7] for a 224x224 input
```

### License: [NVIDIA Source Code License-NC](https://huggingface.co/nvidia/MambaVision-T-1K/blob/main/LICENSE)
[ "tench, tinca tinca", "goldfish, carassius auratus", "great white shark, white shark, man-eater, man-eating shark, carcharodon carcharias", "tiger shark, galeocerdo cuvieri", "hammerhead, hammerhead shark", "electric ray, crampfish, numbfish, torpedo", "stingray", "cock", "hen", "ostrich, struthio camelus", "brambling, fringilla montifringilla", "goldfinch, carduelis carduelis", "house finch, linnet, carpodacus mexicanus", "junco, snowbird", "indigo bunting, indigo finch, indigo bird, passerina cyanea", "robin, american robin, turdus migratorius", "bulbul", "jay", "magpie", "chickadee", "water ouzel, dipper", "kite", "bald eagle, american eagle, haliaeetus leucocephalus", "vulture", "great grey owl, great gray owl, strix nebulosa", "european fire salamander, salamandra salamandra", "common newt, triturus vulgaris", "eft", "spotted salamander, ambystoma maculatum", "axolotl, mud puppy, ambystoma mexicanum", "bullfrog, rana catesbeiana", "tree frog, tree-frog", "tailed frog, bell toad, ribbed toad, tailed toad, ascaphus trui", "loggerhead, loggerhead turtle, caretta caretta", "leatherback turtle, leatherback, leathery turtle, dermochelys coriacea", "mud turtle", "terrapin", "box turtle, box tortoise", "banded gecko", "common iguana, iguana, iguana iguana", "american chameleon, anole, anolis carolinensis", "whiptail, whiptail lizard", "agama", "frilled lizard, chlamydosaurus kingi", "alligator lizard", "gila monster, heloderma suspectum", "green lizard, lacerta viridis", "african chameleon, chamaeleo chamaeleon", "komodo dragon, komodo lizard, dragon lizard, giant lizard, varanus komodoensis", "african crocodile, nile crocodile, crocodylus niloticus", "american alligator, alligator mississipiensis", "triceratops", "thunder snake, worm snake, carphophis amoenus", "ringneck snake, ring-necked snake, ring snake", "hognose snake, puff adder, sand viper", "green snake, grass snake", "king snake, kingsnake", "garter snake, grass snake", "water snake", "vine snake", "night snake, hypsiglena torquata", "boa constrictor, constrictor constrictor", "rock python, rock snake, python sebae", "indian cobra, naja naja", "green mamba", "sea snake", "horned viper, cerastes, sand viper, horned asp, cerastes cornutus", "diamondback, diamondback rattlesnake, crotalus adamanteus", "sidewinder, horned rattlesnake, crotalus cerastes", "trilobite", "harvestman, daddy longlegs, phalangium opilio", "scorpion", "black and gold garden spider, argiope aurantia", "barn spider, araneus cavaticus", "garden spider, aranea diademata", "black widow, latrodectus mactans", "tarantula", "wolf spider, hunting spider", "tick", "centipede", "black grouse", "ptarmigan", "ruffed grouse, partridge, bonasa umbellus", "prairie chicken, prairie grouse, prairie fowl", "peacock", "quail", "partridge", "african grey, african gray, psittacus erithacus", "macaw", "sulphur-crested cockatoo, kakatoe galerita, cacatua galerita", "lorikeet", "coucal", "bee eater", "hornbill", "hummingbird", "jacamar", "toucan", "drake", "red-breasted merganser, mergus serrator", "goose", "black swan, cygnus atratus", "tusker", "echidna, spiny anteater, anteater", "platypus, duckbill, duckbilled platypus, duck-billed platypus, ornithorhynchus anatinus", "wallaby, brush kangaroo", "koala, koala bear, kangaroo bear, native bear, phascolarctos cinereus", "wombat", "jellyfish", "sea anemone, anemone", "brain coral", "flatworm, platyhelminth", "nematode, nematode worm, roundworm", "conch", "snail", "slug", "sea slug, nudibranch", "chiton, coat-of-mail shell, sea 
cradle, polyplacophore", "chambered nautilus, pearly nautilus, nautilus", "dungeness crab, cancer magister", "rock crab, cancer irroratus", "fiddler crab", "king crab, alaska crab, alaskan king crab, alaska king crab, paralithodes camtschatica", "american lobster, northern lobster, maine lobster, homarus americanus", "spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish", "crayfish, crawfish, crawdad, crawdaddy", "hermit crab", "isopod", "white stork, ciconia ciconia", "black stork, ciconia nigra", "spoonbill", "flamingo", "little blue heron, egretta caerulea", "american egret, great white heron, egretta albus", "bittern", "crane", "limpkin, aramus pictus", "european gallinule, porphyrio porphyrio", "american coot, marsh hen, mud hen, water hen, fulica americana", "bustard", "ruddy turnstone, arenaria interpres", "red-backed sandpiper, dunlin, erolia alpina", "redshank, tringa totanus", "dowitcher", "oystercatcher, oyster catcher", "pelican", "king penguin, aptenodytes patagonica", "albatross, mollymawk", "grey whale, gray whale, devilfish, eschrichtius gibbosus, eschrichtius robustus", "killer whale, killer, orca, grampus, sea wolf, orcinus orca", "dugong, dugong dugon", "sea lion", "chihuahua", "japanese spaniel", "maltese dog, maltese terrier, maltese", "pekinese, pekingese, peke", "shih-tzu", "blenheim spaniel", "papillon", "toy terrier", "rhodesian ridgeback", "afghan hound, afghan", "basset, basset hound", "beagle", "bloodhound, sleuthhound", "bluetick", "black-and-tan coonhound", "walker hound, walker foxhound", "english foxhound", "redbone", "borzoi, russian wolfhound", "irish wolfhound", "italian greyhound", "whippet", "ibizan hound, ibizan podenco", "norwegian elkhound, elkhound", "otterhound, otter hound", "saluki, gazelle hound", "scottish deerhound, deerhound", "weimaraner", "staffordshire bullterrier, staffordshire bull terrier", "american staffordshire terrier, staffordshire terrier, american pit bull terrier, pit bull terrier", "bedlington terrier", "border terrier", "kerry blue terrier", "irish terrier", "norfolk terrier", "norwich terrier", "yorkshire terrier", "wire-haired fox terrier", "lakeland terrier", "sealyham terrier, sealyham", "airedale, airedale terrier", "cairn, cairn terrier", "australian terrier", "dandie dinmont, dandie dinmont terrier", "boston bull, boston terrier", "miniature schnauzer", "giant schnauzer", "standard schnauzer", "scotch terrier, scottish terrier, scottie", "tibetan terrier, chrysanthemum dog", "silky terrier, sydney silky", "soft-coated wheaten terrier", "west highland white terrier", "lhasa, lhasa apso", "flat-coated retriever", "curly-coated retriever", "golden retriever", "labrador retriever", "chesapeake bay retriever", "german short-haired pointer", "vizsla, hungarian pointer", "english setter", "irish setter, red setter", "gordon setter", "brittany spaniel", "clumber, clumber spaniel", "english springer, english springer spaniel", "welsh springer spaniel", "cocker spaniel, english cocker spaniel, cocker", "sussex spaniel", "irish water spaniel", "kuvasz", "schipperke", "groenendael", "malinois", "briard", "kelpie", "komondor", "old english sheepdog, bobtail", "shetland sheepdog, shetland sheep dog, shetland", "collie", "border collie", "bouvier des flandres, bouviers des flandres", "rottweiler", "german shepherd, german shepherd dog, german police dog, alsatian", "doberman, doberman pinscher", "miniature pinscher", "greater swiss mountain dog", "bernese mountain dog", "appenzeller", "entlebucher", "boxer", "bull 
mastiff", "tibetan mastiff", "french bulldog", "great dane", "saint bernard, st bernard", "eskimo dog, husky", "malamute, malemute, alaskan malamute", "siberian husky", "dalmatian, coach dog, carriage dog", "affenpinscher, monkey pinscher, monkey dog", "basenji", "pug, pug-dog", "leonberg", "newfoundland, newfoundland dog", "great pyrenees", "samoyed, samoyede", "pomeranian", "chow, chow chow", "keeshond", "brabancon griffon", "pembroke, pembroke welsh corgi", "cardigan, cardigan welsh corgi", "toy poodle", "miniature poodle", "standard poodle", "mexican hairless", "timber wolf, grey wolf, gray wolf, canis lupus", "white wolf, arctic wolf, canis lupus tundrarum", "red wolf, maned wolf, canis rufus, canis niger", "coyote, prairie wolf, brush wolf, canis latrans", "dingo, warrigal, warragal, canis dingo", "dhole, cuon alpinus", "african hunting dog, hyena dog, cape hunting dog, lycaon pictus", "hyena, hyaena", "red fox, vulpes vulpes", "kit fox, vulpes macrotis", "arctic fox, white fox, alopex lagopus", "grey fox, gray fox, urocyon cinereoargenteus", "tabby, tabby cat", "tiger cat", "persian cat", "siamese cat, siamese", "egyptian cat", "cougar, puma, catamount, mountain lion, painter, panther, felis concolor", "lynx, catamount", "leopard, panthera pardus", "snow leopard, ounce, panthera uncia", "jaguar, panther, panthera onca, felis onca", "lion, king of beasts, panthera leo", "tiger, panthera tigris", "cheetah, chetah, acinonyx jubatus", "brown bear, bruin, ursus arctos", "american black bear, black bear, ursus americanus, euarctos americanus", "ice bear, polar bear, ursus maritimus, thalarctos maritimus", "sloth bear, melursus ursinus, ursus ursinus", "mongoose", "meerkat, mierkat", "tiger beetle", "ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle", "ground beetle, carabid beetle", "long-horned beetle, longicorn, longicorn beetle", "leaf beetle, chrysomelid", "dung beetle", "rhinoceros beetle", "weevil", "fly", "bee", "ant, emmet, pismire", "grasshopper, hopper", "cricket", "walking stick, walkingstick, stick insect", "cockroach, roach", "mantis, mantid", "cicada, cicala", "leafhopper", "lacewing, lacewing fly", "dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk", "damselfly", "admiral", "ringlet, ringlet butterfly", "monarch, monarch butterfly, milkweed butterfly, danaus plexippus", "cabbage butterfly", "sulphur butterfly, sulfur butterfly", "lycaenid, lycaenid butterfly", "starfish, sea star", "sea urchin", "sea cucumber, holothurian", "wood rabbit, cottontail, cottontail rabbit", "hare", "angora, angora rabbit", "hamster", "porcupine, hedgehog", "fox squirrel, eastern fox squirrel, sciurus niger", "marmot", "beaver", "guinea pig, cavia cobaya", "sorrel", "zebra", "hog, pig, grunter, squealer, sus scrofa", "wild boar, boar, sus scrofa", "warthog", "hippopotamus, hippo, river horse, hippopotamus amphibius", "ox", "water buffalo, water ox, asiatic buffalo, bubalus bubalis", "bison", "ram, tup", "bighorn, bighorn sheep, cimarron, rocky mountain bighorn, rocky mountain sheep, ovis canadensis", "ibex, capra ibex", "hartebeest", "impala, aepyceros melampus", "gazelle", "arabian camel, dromedary, camelus dromedarius", "llama", "weasel", "mink", "polecat, fitch, foulmart, foumart, mustela putorius", "black-footed ferret, ferret, mustela nigripes", "otter", "skunk, polecat, wood pussy", "badger", "armadillo", "three-toed sloth, ai, bradypus tridactylus", "orangutan, orang, orangutang, pongo pygmaeus", "gorilla, 
gorilla gorilla", "chimpanzee, chimp, pan troglodytes", "gibbon, hylobates lar", "siamang, hylobates syndactylus, symphalangus syndactylus", "guenon, guenon monkey", "patas, hussar monkey, erythrocebus patas", "baboon", "macaque", "langur", "colobus, colobus monkey", "proboscis monkey, nasalis larvatus", "marmoset", "capuchin, ringtail, cebus capucinus", "howler monkey, howler", "titi, titi monkey", "spider monkey, ateles geoffroyi", "squirrel monkey, saimiri sciureus", "madagascar cat, ring-tailed lemur, lemur catta", "indri, indris, indri indri, indri brevicaudatus", "indian elephant, elephas maximus", "african elephant, loxodonta africana", "lesser panda, red panda, panda, bear cat, cat bear, ailurus fulgens", "giant panda, panda, panda bear, coon bear, ailuropoda melanoleuca", "barracouta, snoek", "eel", "coho, cohoe, coho salmon, blue jack, silver salmon, oncorhynchus kisutch", "rock beauty, holocanthus tricolor", "anemone fish", "sturgeon", "gar, garfish, garpike, billfish, lepisosteus osseus", "lionfish", "puffer, pufferfish, blowfish, globefish", "abacus", "abaya", "academic gown, academic robe, judge's robe", "accordion, piano accordion, squeeze box", "acoustic guitar", "aircraft carrier, carrier, flattop, attack aircraft carrier", "airliner", "airship, dirigible", "altar", "ambulance", "amphibian, amphibious vehicle", "analog clock", "apiary, bee house", "apron", "ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin", "assault rifle, assault gun", "backpack, back pack, knapsack, packsack, rucksack, haversack", "bakery, bakeshop, bakehouse", "balance beam, beam", "balloon", "ballpoint, ballpoint pen, ballpen, biro", "band aid", "banjo", "bannister, banister, balustrade, balusters, handrail", "barbell", "barber chair", "barbershop", "barn", "barometer", "barrel, cask", "barrow, garden cart, lawn cart, wheelbarrow", "baseball", "basketball", "bassinet", "bassoon", "bathing cap, swimming cap", "bath towel", "bathtub, bathing tub, bath, tub", "beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon", "beacon, lighthouse, beacon light, pharos", "beaker", "bearskin, busby, shako", "beer bottle", "beer glass", "bell cote, bell cot", "bib", "bicycle-built-for-two, tandem bicycle, tandem", "bikini, two-piece", "binder, ring-binder", "binoculars, field glasses, opera glasses", "birdhouse", "boathouse", "bobsled, bobsleigh, bob", "bolo tie, bolo, bola tie, bola", "bonnet, poke bonnet", "bookcase", "bookshop, bookstore, bookstall", "bottlecap", "bow", "bow tie, bow-tie, bowtie", "brass, memorial tablet, plaque", "brassiere, bra, bandeau", "breakwater, groin, groyne, mole, bulwark, seawall, jetty", "breastplate, aegis, egis", "broom", "bucket, pail", "buckle", "bulletproof vest", "bullet train, bullet", "butcher shop, meat market", "cab, hack, taxi, taxicab", "caldron, cauldron", "candle, taper, wax light", "cannon", "canoe", "can opener, tin opener", "cardigan", "car mirror", "carousel, carrousel, merry-go-round, roundabout, whirligig", "carpenter's kit, tool kit", "carton", "car wheel", "cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, atm", "cassette", "cassette player", "castle", "catamaran", "cd player", "cello, violoncello", "cellular telephone, cellular phone, cellphone, cell, mobile phone", "chain", "chainlink fence", "chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour", "chain saw, chainsaw", "chest", "chiffonier, 
commode", "chime, bell, gong", "china cabinet, china closet", "christmas stocking", "church, church building", "cinema, movie theater, movie theatre, movie house, picture palace", "cleaver, meat cleaver, chopper", "cliff dwelling", "cloak", "clog, geta, patten, sabot", "cocktail shaker", "coffee mug", "coffeepot", "coil, spiral, volute, whorl, helix", "combination lock", "computer keyboard, keypad", "confectionery, confectionary, candy store", "container ship, containership, container vessel", "convertible", "corkscrew, bottle screw", "cornet, horn, trumpet, trump", "cowboy boot", "cowboy hat, ten-gallon hat", "cradle", "crane", "crash helmet", "crate", "crib, cot", "crock pot", "croquet ball", "crutch", "cuirass", "dam, dike, dyke", "desk", "desktop computer", "dial telephone, dial phone", "diaper, nappy, napkin", "digital clock", "digital watch", "dining table, board", "dishrag, dishcloth", "dishwasher, dish washer, dishwashing machine", "disk brake, disc brake", "dock, dockage, docking facility", "dogsled, dog sled, dog sleigh", "dome", "doormat, welcome mat", "drilling platform, offshore rig", "drum, membranophone, tympan", "drumstick", "dumbbell", "dutch oven", "electric fan, blower", "electric guitar", "electric locomotive", "entertainment center", "envelope", "espresso maker", "face powder", "feather boa, boa", "file, file cabinet, filing cabinet", "fireboat", "fire engine, fire truck", "fire screen, fireguard", "flagpole, flagstaff", "flute, transverse flute", "folding chair", "football helmet", "forklift", "fountain", "fountain pen", "four-poster", "freight car", "french horn, horn", "frying pan, frypan, skillet", "fur coat", "garbage truck, dustcart", "gasmask, respirator, gas helmet", "gas pump, gasoline pump, petrol pump, island dispenser", "goblet", "go-kart", "golf ball", "golfcart, golf cart", "gondola", "gong, tam-tam", "gown", "grand piano, grand", "greenhouse, nursery, glasshouse", "grille, radiator grille", "grocery store, grocery, food market, market", "guillotine", "hair slide", "hair spray", "half track", "hammer", "hamper", "hand blower, blow dryer, blow drier, hair dryer, hair drier", "hand-held computer, hand-held microcomputer", "handkerchief, hankie, hanky, hankey", "hard disc, hard disk, fixed disk", "harmonica, mouth organ, harp, mouth harp", "harp", "harvester, reaper", "hatchet", "holster", "home theater, home theatre", "honeycomb", "hook, claw", "hoopskirt, crinoline", "horizontal bar, high bar", "horse cart, horse-cart", "hourglass", "ipod", "iron, smoothing iron", "jack-o'-lantern", "jean, blue jean, denim", "jeep, landrover", "jersey, t-shirt, tee shirt", "jigsaw puzzle", "jinrikisha, ricksha, rickshaw", "joystick", "kimono", "knee pad", "knot", "lab coat, laboratory coat", "ladle", "lampshade, lamp shade", "laptop, laptop computer", "lawn mower, mower", "lens cap, lens cover", "letter opener, paper knife, paperknife", "library", "lifeboat", "lighter, light, igniter, ignitor", "limousine, limo", "liner, ocean liner", "lipstick, lip rouge", "loafer", "lotion", "loudspeaker, speaker, speaker unit, loudspeaker system, speaker system", "loupe, jeweler's loupe", "lumbermill, sawmill", "magnetic compass", "mailbag, postbag", "mailbox, letter box", "maillot", "maillot, tank suit", "manhole cover", "maraca", "marimba, xylophone", "mask", "matchstick", "maypole", "maze, labyrinth", "measuring cup", "medicine chest, medicine cabinet", "megalith, megalithic structure", "microphone, mike", "microwave, microwave oven", "military uniform", "milk can", "minibus", 
"miniskirt, mini", "minivan", "missile", "mitten", "mixing bowl", "mobile home, manufactured home", "model t", "modem", "monastery", "monitor", "moped", "mortar", "mortarboard", "mosque", "mosquito net", "motor scooter, scooter", "mountain bike, all-terrain bike, off-roader", "mountain tent", "mouse, computer mouse", "mousetrap", "moving van", "muzzle", "nail", "neck brace", "necklace", "nipple", "notebook, notebook computer", "obelisk", "oboe, hautboy, hautbois", "ocarina, sweet potato", "odometer, hodometer, mileometer, milometer", "oil filter", "organ, pipe organ", "oscilloscope, scope, cathode-ray oscilloscope, cro", "overskirt", "oxcart", "oxygen mask", "packet", "paddle, boat paddle", "paddlewheel, paddle wheel", "padlock", "paintbrush", "pajama, pyjama, pj's, jammies", "palace", "panpipe, pandean pipe, syrinx", "paper towel", "parachute, chute", "parallel bars, bars", "park bench", "parking meter", "passenger car, coach, carriage", "patio, terrace", "pay-phone, pay-station", "pedestal, plinth, footstall", "pencil box, pencil case", "pencil sharpener", "perfume, essence", "petri dish", "photocopier", "pick, plectrum, plectron", "pickelhaube", "picket fence, paling", "pickup, pickup truck", "pier", "piggy bank, penny bank", "pill bottle", "pillow", "ping-pong ball", "pinwheel", "pirate, pirate ship", "pitcher, ewer", "plane, carpenter's plane, woodworking plane", "planetarium", "plastic bag", "plate rack", "plow, plough", "plunger, plumber's helper", "polaroid camera, polaroid land camera", "pole", "police van, police wagon, paddy wagon, patrol wagon, wagon, black maria", "poncho", "pool table, billiard table, snooker table", "pop bottle, soda bottle", "pot, flowerpot", "potter's wheel", "power drill", "prayer rug, prayer mat", "printer", "prison, prison house", "projectile, missile", "projector", "puck, hockey puck", "punching bag, punch bag, punching ball, punchball", "purse", "quill, quill pen", "quilt, comforter, comfort, puff", "racer, race car, racing car", "racket, racquet", "radiator", "radio, wireless", "radio telescope, radio reflector", "rain barrel", "recreational vehicle, rv, r.v.", "reel", "reflex camera", "refrigerator, icebox", "remote control, remote", "restaurant, eating house, eating place, eatery", "revolver, six-gun, six-shooter", "rifle", "rocking chair, rocker", "rotisserie", "rubber eraser, rubber, pencil eraser", "rugby ball", "rule, ruler", "running shoe", "safe", "safety pin", "saltshaker, salt shaker", "sandal", "sarong", "sax, saxophone", "scabbard", "scale, weighing machine", "school bus", "schooner", "scoreboard", "screen, crt screen", "screw", "screwdriver", "seat belt, seatbelt", "sewing machine", "shield, buckler", "shoe shop, shoe-shop, shoe store", "shoji", "shopping basket", "shopping cart", "shovel", "shower cap", "shower curtain", "ski", "ski mask", "sleeping bag", "slide rule, slipstick", "sliding door", "slot, one-armed bandit", "snorkel", "snowmobile", "snowplow, snowplough", "soap dispenser", "soccer ball", "sock", "solar dish, solar collector, solar furnace", "sombrero", "soup bowl", "space bar", "space heater", "space shuttle", "spatula", "speedboat", "spider web, spider's web", "spindle", "sports car, sport car", "spotlight, spot", "stage", "steam locomotive", "steel arch bridge", "steel drum", "stethoscope", "stole", "stone wall", "stopwatch, stop watch", "stove", "strainer", "streetcar, tram, tramcar, trolley, trolley car", "stretcher", "studio couch, day bed", "stupa, tope", "submarine, pigboat, sub, u-boat", "suit, suit of clothes", 
"sundial", "sunglass", "sunglasses, dark glasses, shades", "sunscreen, sunblock, sun blocker", "suspension bridge", "swab, swob, mop", "sweatshirt", "swimming trunks, bathing trunks", "swing", "switch, electric switch, electrical switch", "syringe", "table lamp", "tank, army tank, armored combat vehicle, armoured combat vehicle", "tape player", "teapot", "teddy, teddy bear", "television, television system", "tennis ball", "thatch, thatched roof", "theater curtain, theatre curtain", "thimble", "thresher, thrasher, threshing machine", "throne", "tile roof", "toaster", "tobacco shop, tobacconist shop, tobacconist", "toilet seat", "torch", "totem pole", "tow truck, tow car, wrecker", "toyshop", "tractor", "trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi", "tray", "trench coat", "tricycle, trike, velocipede", "trimaran", "tripod", "triumphal arch", "trolleybus, trolley coach, trackless trolley", "trombone", "tub, vat", "turnstile", "typewriter keyboard", "umbrella", "unicycle, monocycle", "upright, upright piano", "vacuum, vacuum cleaner", "vase", "vault", "velvet", "vending machine", "vestment", "viaduct", "violin, fiddle", "volleyball", "waffle iron", "wall clock", "wallet, billfold, notecase, pocketbook", "wardrobe, closet, press", "warplane, military plane", "washbasin, handbasin, washbowl, lavabo, wash-hand basin", "washer, automatic washer, washing machine", "water bottle", "water jug", "water tower", "whiskey jug", "whistle", "wig", "window screen", "window shade", "windsor tie", "wine bottle", "wing", "wok", "wooden spoon", "wool, woolen, woollen", "worm fence, snake fence, snake-rail fence, virginia fence", "wreck", "yawl", "yurt", "web site, website, internet site, site", "comic book", "crossword puzzle, crossword", "street sign", "traffic light, traffic signal, stoplight", "book jacket, dust cover, dust jacket, dust wrapper", "menu", "plate", "guacamole", "consomme", "hot pot, hotpot", "trifle", "ice cream, icecream", "ice lolly, lolly, lollipop, popsicle", "french loaf", "bagel, beigel", "pretzel", "cheeseburger", "hotdog, hot dog, red hot", "mashed potato", "head cabbage", "broccoli", "cauliflower", "zucchini, courgette", "spaghetti squash", "acorn squash", "butternut squash", "cucumber, cuke", "artichoke, globe artichoke", "bell pepper", "cardoon", "mushroom", "granny smith", "strawberry", "orange", "lemon", "fig", "pineapple, ananas", "banana", "jackfruit, jak, jack", "custard apple", "pomegranate", "hay", "carbonara", "chocolate sauce, chocolate syrup", "dough", "meat loaf, meatloaf", "pizza, pizza pie", "potpie", "burrito", "red wine", "espresso", "cup", "eggnog", "alp", "bubble", "cliff, drop, drop-off", "coral reef", "geyser", "lakeside, lakeshore", "promontory, headland, head, foreland", "sandbar, sand bar", "seashore, coast, seacoast, sea-coast", "valley, vale", "volcano", "ballplayer, baseball player", "groom, bridegroom", "scuba diver", "rapeseed", "daisy", "yellow lady's slipper, yellow lady-slipper, cypripedium calceolus, cypripedium parviflorum", "corn", "acorn", "hip, rose hip, rosehip", "buckeye, horse chestnut, conker", "coral fungus", "agaric", "gyromitra", "stinkhorn, carrion fungus", "earthstar", "hen-of-the-woods, hen of the woods, polyporus frondosus, grifola frondosa", "bolete", "ear, spike, capitulum", "toilet tissue, toilet paper, bathroom tissue" ]
nvidia/MambaVision-L2-1K
[**MambaVision: A Hybrid Mamba-Transformer Vision Backbone**](https://arxiv.org/abs/2407.08083).

Code: https://github.com/NVlabs/MambaVision

## Model Overview

We have developed the first hybrid model for computer vision that leverages the strengths of Mamba and Transformers. Specifically, our core contribution is a redesign of the Mamba formulation that enhances its capability for efficient modeling of visual features. In addition, we conducted a comprehensive ablation study on the feasibility of integrating Vision Transformers (ViT) with Mamba. Our results demonstrate that equipping the Mamba architecture with several self-attention blocks at the final layers greatly improves its capacity to capture long-range spatial dependencies. Based on these findings, we introduce a family of MambaVision models with a hierarchical architecture to meet various design criteria.

## Model Performance

MambaVision demonstrates strong performance, achieving a new SOTA Pareto front for Top-1 accuracy versus throughput.

<p align="center">
<img src="https://github.com/NVlabs/MambaVision/assets/26806394/79dcf841-3966-4b77-883d-76cd5e1d4320" width=70% height=70% class="center">
</p>

## Model Usage

It is highly recommended to install the requirements for MambaVision by running the following:

```Bash
pip install mambavision
```

For each model, we offer two variants, for image classification and feature extraction, each of which can be imported with one line of code.

### Image Classification

In the following example, we demonstrate how MambaVision can be used for image classification. Given the following image from the [COCO dataset](https://cocodataset.org/#home) validation set as input:

<p align="center">
<img src="https://hf.fast360.xyz/production/uploads/64414b62603214724ebd2636/4duSnqLf4lrNiAHczSmAN.jpeg" width=70% height=70% class="center">
</p>

The following snippet can be used for image classification:

```Python
from transformers import AutoModelForImageClassification
from PIL import Image
from timm.data.transforms_factory import create_transform
import requests

model = AutoModelForImageClassification.from_pretrained("nvidia/MambaVision-L2-1K", trust_remote_code=True)

# eval mode for inference
model.cuda().eval()

# prepare image for the model
url = 'http://images.cocodataset.org/val2017/000000020247.jpg'
image = Image.open(requests.get(url, stream=True).raw)
input_resolution = (3, 224, 224)  # MambaVision supports any input resolution

transform = create_transform(input_size=input_resolution,
                             is_training=False,
                             mean=model.config.mean,
                             std=model.config.std,
                             crop_mode=model.config.crop_mode,
                             crop_pct=model.config.crop_pct)
inputs = transform(image).unsqueeze(0).cuda()

# model inference
outputs = model(inputs)
logits = outputs['logits']
predicted_class_idx = logits.argmax(-1).item()
print("Predicted class:", model.config.id2label[predicted_class_idx])
```

The predicted label is ```brown bear, bruin, Ursus arctos```.
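When classifying many images, the per-image transform and forward pass can be batched. The following is a minimal sketch, not part of the original card; it assumes the `model`, `transform`, and `image` objects from the classification example above are in scope:

```Python
import torch

# a minimal sketch (not part of the original card) of batched inference;
# assumes `model` and `transform` from the classification example above
# are in scope, and that the caller supplies a list of PIL images
def classify_batch(images):
    batch = torch.stack([transform(img) for img in images]).cuda()
    with torch.no_grad():
        logits = model(batch)['logits']
    return [model.config.id2label[i] for i in logits.argmax(-1).tolist()]

# example: classify the single downloaded image as a batch of one
print(classify_batch([image]))
```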
### Feature Extraction

MambaVision can also be used as a generic feature extractor. Specifically, we can extract the outputs of each of the model's four stages, as well as the final average-pooled features, flattened into a vector.

The following snippet can be used for feature extraction:

```Python
from transformers import AutoModel
from PIL import Image
from timm.data.transforms_factory import create_transform
import requests

model = AutoModel.from_pretrained("nvidia/MambaVision-L2-1K", trust_remote_code=True)

# eval mode for inference
model.cuda().eval()

# prepare image for the model
url = 'http://images.cocodataset.org/val2017/000000020247.jpg'
image = Image.open(requests.get(url, stream=True).raw)
input_resolution = (3, 224, 224)  # MambaVision supports any input resolution

transform = create_transform(input_size=input_resolution,
                             is_training=False,
                             mean=model.config.mean,
                             std=model.config.std,
                             crop_mode=model.config.crop_mode,
                             crop_pct=model.config.crop_pct)
inputs = transform(image).unsqueeze(0).cuda()

# model inference
out_avg_pool, features = model(inputs)
print("Size of the averaged pool features:", out_avg_pool.size())  # [1, C], where C is the final channel width of this variant
print("Number of stages in extracted features:", len(features))  # 4 stages
print("Size of extracted features in stage 1:", features[0].size())  # [1, C1, 56, 56] for a 224x224 input
print("Size of extracted features in stage 4:", features[3].size())  # [1, C, 7, 7] for a 224x224 input
```

### License: [NVIDIA Source Code License-NC](https://huggingface.co/nvidia/MambaVision-T-1K/blob/main/LICENSE)
[ "tench, tinca tinca", "goldfish, carassius auratus", "great white shark, white shark, man-eater, man-eating shark, carcharodon carcharias", "tiger shark, galeocerdo cuvieri", "hammerhead, hammerhead shark", "electric ray, crampfish, numbfish, torpedo", "stingray", "cock", "hen", "ostrich, struthio camelus", "brambling, fringilla montifringilla", "goldfinch, carduelis carduelis", "house finch, linnet, carpodacus mexicanus", "junco, snowbird", "indigo bunting, indigo finch, indigo bird, passerina cyanea", "robin, american robin, turdus migratorius", "bulbul", "jay", "magpie", "chickadee", "water ouzel, dipper", "kite", "bald eagle, american eagle, haliaeetus leucocephalus", "vulture", "great grey owl, great gray owl, strix nebulosa", "european fire salamander, salamandra salamandra", "common newt, triturus vulgaris", "eft", "spotted salamander, ambystoma maculatum", "axolotl, mud puppy, ambystoma mexicanum", "bullfrog, rana catesbeiana", "tree frog, tree-frog", "tailed frog, bell toad, ribbed toad, tailed toad, ascaphus trui", "loggerhead, loggerhead turtle, caretta caretta", "leatherback turtle, leatherback, leathery turtle, dermochelys coriacea", "mud turtle", "terrapin", "box turtle, box tortoise", "banded gecko", "common iguana, iguana, iguana iguana", "american chameleon, anole, anolis carolinensis", "whiptail, whiptail lizard", "agama", "frilled lizard, chlamydosaurus kingi", "alligator lizard", "gila monster, heloderma suspectum", "green lizard, lacerta viridis", "african chameleon, chamaeleo chamaeleon", "komodo dragon, komodo lizard, dragon lizard, giant lizard, varanus komodoensis", "african crocodile, nile crocodile, crocodylus niloticus", "american alligator, alligator mississipiensis", "triceratops", "thunder snake, worm snake, carphophis amoenus", "ringneck snake, ring-necked snake, ring snake", "hognose snake, puff adder, sand viper", "green snake, grass snake", "king snake, kingsnake", "garter snake, grass snake", "water snake", "vine snake", "night snake, hypsiglena torquata", "boa constrictor, constrictor constrictor", "rock python, rock snake, python sebae", "indian cobra, naja naja", "green mamba", "sea snake", "horned viper, cerastes, sand viper, horned asp, cerastes cornutus", "diamondback, diamondback rattlesnake, crotalus adamanteus", "sidewinder, horned rattlesnake, crotalus cerastes", "trilobite", "harvestman, daddy longlegs, phalangium opilio", "scorpion", "black and gold garden spider, argiope aurantia", "barn spider, araneus cavaticus", "garden spider, aranea diademata", "black widow, latrodectus mactans", "tarantula", "wolf spider, hunting spider", "tick", "centipede", "black grouse", "ptarmigan", "ruffed grouse, partridge, bonasa umbellus", "prairie chicken, prairie grouse, prairie fowl", "peacock", "quail", "partridge", "african grey, african gray, psittacus erithacus", "macaw", "sulphur-crested cockatoo, kakatoe galerita, cacatua galerita", "lorikeet", "coucal", "bee eater", "hornbill", "hummingbird", "jacamar", "toucan", "drake", "red-breasted merganser, mergus serrator", "goose", "black swan, cygnus atratus", "tusker", "echidna, spiny anteater, anteater", "platypus, duckbill, duckbilled platypus, duck-billed platypus, ornithorhynchus anatinus", "wallaby, brush kangaroo", "koala, koala bear, kangaroo bear, native bear, phascolarctos cinereus", "wombat", "jellyfish", "sea anemone, anemone", "brain coral", "flatworm, platyhelminth", "nematode, nematode worm, roundworm", "conch", "snail", "slug", "sea slug, nudibranch", "chiton, coat-of-mail shell, sea 
cradle, polyplacophore", "chambered nautilus, pearly nautilus, nautilus", "dungeness crab, cancer magister", "rock crab, cancer irroratus", "fiddler crab", "king crab, alaska crab, alaskan king crab, alaska king crab, paralithodes camtschatica", "american lobster, northern lobster, maine lobster, homarus americanus", "spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish", "crayfish, crawfish, crawdad, crawdaddy", "hermit crab", "isopod", "white stork, ciconia ciconia", "black stork, ciconia nigra", "spoonbill", "flamingo", "little blue heron, egretta caerulea", "american egret, great white heron, egretta albus", "bittern", "crane", "limpkin, aramus pictus", "european gallinule, porphyrio porphyrio", "american coot, marsh hen, mud hen, water hen, fulica americana", "bustard", "ruddy turnstone, arenaria interpres", "red-backed sandpiper, dunlin, erolia alpina", "redshank, tringa totanus", "dowitcher", "oystercatcher, oyster catcher", "pelican", "king penguin, aptenodytes patagonica", "albatross, mollymawk", "grey whale, gray whale, devilfish, eschrichtius gibbosus, eschrichtius robustus", "killer whale, killer, orca, grampus, sea wolf, orcinus orca", "dugong, dugong dugon", "sea lion", "chihuahua", "japanese spaniel", "maltese dog, maltese terrier, maltese", "pekinese, pekingese, peke", "shih-tzu", "blenheim spaniel", "papillon", "toy terrier", "rhodesian ridgeback", "afghan hound, afghan", "basset, basset hound", "beagle", "bloodhound, sleuthhound", "bluetick", "black-and-tan coonhound", "walker hound, walker foxhound", "english foxhound", "redbone", "borzoi, russian wolfhound", "irish wolfhound", "italian greyhound", "whippet", "ibizan hound, ibizan podenco", "norwegian elkhound, elkhound", "otterhound, otter hound", "saluki, gazelle hound", "scottish deerhound, deerhound", "weimaraner", "staffordshire bullterrier, staffordshire bull terrier", "american staffordshire terrier, staffordshire terrier, american pit bull terrier, pit bull terrier", "bedlington terrier", "border terrier", "kerry blue terrier", "irish terrier", "norfolk terrier", "norwich terrier", "yorkshire terrier", "wire-haired fox terrier", "lakeland terrier", "sealyham terrier, sealyham", "airedale, airedale terrier", "cairn, cairn terrier", "australian terrier", "dandie dinmont, dandie dinmont terrier", "boston bull, boston terrier", "miniature schnauzer", "giant schnauzer", "standard schnauzer", "scotch terrier, scottish terrier, scottie", "tibetan terrier, chrysanthemum dog", "silky terrier, sydney silky", "soft-coated wheaten terrier", "west highland white terrier", "lhasa, lhasa apso", "flat-coated retriever", "curly-coated retriever", "golden retriever", "labrador retriever", "chesapeake bay retriever", "german short-haired pointer", "vizsla, hungarian pointer", "english setter", "irish setter, red setter", "gordon setter", "brittany spaniel", "clumber, clumber spaniel", "english springer, english springer spaniel", "welsh springer spaniel", "cocker spaniel, english cocker spaniel, cocker", "sussex spaniel", "irish water spaniel", "kuvasz", "schipperke", "groenendael", "malinois", "briard", "kelpie", "komondor", "old english sheepdog, bobtail", "shetland sheepdog, shetland sheep dog, shetland", "collie", "border collie", "bouvier des flandres, bouviers des flandres", "rottweiler", "german shepherd, german shepherd dog, german police dog, alsatian", "doberman, doberman pinscher", "miniature pinscher", "greater swiss mountain dog", "bernese mountain dog", "appenzeller", "entlebucher", "boxer", "bull 
mastiff", "tibetan mastiff", "french bulldog", "great dane", "saint bernard, st bernard", "eskimo dog, husky", "malamute, malemute, alaskan malamute", "siberian husky", "dalmatian, coach dog, carriage dog", "affenpinscher, monkey pinscher, monkey dog", "basenji", "pug, pug-dog", "leonberg", "newfoundland, newfoundland dog", "great pyrenees", "samoyed, samoyede", "pomeranian", "chow, chow chow", "keeshond", "brabancon griffon", "pembroke, pembroke welsh corgi", "cardigan, cardigan welsh corgi", "toy poodle", "miniature poodle", "standard poodle", "mexican hairless", "timber wolf, grey wolf, gray wolf, canis lupus", "white wolf, arctic wolf, canis lupus tundrarum", "red wolf, maned wolf, canis rufus, canis niger", "coyote, prairie wolf, brush wolf, canis latrans", "dingo, warrigal, warragal, canis dingo", "dhole, cuon alpinus", "african hunting dog, hyena dog, cape hunting dog, lycaon pictus", "hyena, hyaena", "red fox, vulpes vulpes", "kit fox, vulpes macrotis", "arctic fox, white fox, alopex lagopus", "grey fox, gray fox, urocyon cinereoargenteus", "tabby, tabby cat", "tiger cat", "persian cat", "siamese cat, siamese", "egyptian cat", "cougar, puma, catamount, mountain lion, painter, panther, felis concolor", "lynx, catamount", "leopard, panthera pardus", "snow leopard, ounce, panthera uncia", "jaguar, panther, panthera onca, felis onca", "lion, king of beasts, panthera leo", "tiger, panthera tigris", "cheetah, chetah, acinonyx jubatus", "brown bear, bruin, ursus arctos", "american black bear, black bear, ursus americanus, euarctos americanus", "ice bear, polar bear, ursus maritimus, thalarctos maritimus", "sloth bear, melursus ursinus, ursus ursinus", "mongoose", "meerkat, mierkat", "tiger beetle", "ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle", "ground beetle, carabid beetle", "long-horned beetle, longicorn, longicorn beetle", "leaf beetle, chrysomelid", "dung beetle", "rhinoceros beetle", "weevil", "fly", "bee", "ant, emmet, pismire", "grasshopper, hopper", "cricket", "walking stick, walkingstick, stick insect", "cockroach, roach", "mantis, mantid", "cicada, cicala", "leafhopper", "lacewing, lacewing fly", "dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk", "damselfly", "admiral", "ringlet, ringlet butterfly", "monarch, monarch butterfly, milkweed butterfly, danaus plexippus", "cabbage butterfly", "sulphur butterfly, sulfur butterfly", "lycaenid, lycaenid butterfly", "starfish, sea star", "sea urchin", "sea cucumber, holothurian", "wood rabbit, cottontail, cottontail rabbit", "hare", "angora, angora rabbit", "hamster", "porcupine, hedgehog", "fox squirrel, eastern fox squirrel, sciurus niger", "marmot", "beaver", "guinea pig, cavia cobaya", "sorrel", "zebra", "hog, pig, grunter, squealer, sus scrofa", "wild boar, boar, sus scrofa", "warthog", "hippopotamus, hippo, river horse, hippopotamus amphibius", "ox", "water buffalo, water ox, asiatic buffalo, bubalus bubalis", "bison", "ram, tup", "bighorn, bighorn sheep, cimarron, rocky mountain bighorn, rocky mountain sheep, ovis canadensis", "ibex, capra ibex", "hartebeest", "impala, aepyceros melampus", "gazelle", "arabian camel, dromedary, camelus dromedarius", "llama", "weasel", "mink", "polecat, fitch, foulmart, foumart, mustela putorius", "black-footed ferret, ferret, mustela nigripes", "otter", "skunk, polecat, wood pussy", "badger", "armadillo", "three-toed sloth, ai, bradypus tridactylus", "orangutan, orang, orangutang, pongo pygmaeus", "gorilla, 
gorilla gorilla", "chimpanzee, chimp, pan troglodytes", "gibbon, hylobates lar", "siamang, hylobates syndactylus, symphalangus syndactylus", "guenon, guenon monkey", "patas, hussar monkey, erythrocebus patas", "baboon", "macaque", "langur", "colobus, colobus monkey", "proboscis monkey, nasalis larvatus", "marmoset", "capuchin, ringtail, cebus capucinus", "howler monkey, howler", "titi, titi monkey", "spider monkey, ateles geoffroyi", "squirrel monkey, saimiri sciureus", "madagascar cat, ring-tailed lemur, lemur catta", "indri, indris, indri indri, indri brevicaudatus", "indian elephant, elephas maximus", "african elephant, loxodonta africana", "lesser panda, red panda, panda, bear cat, cat bear, ailurus fulgens", "giant panda, panda, panda bear, coon bear, ailuropoda melanoleuca", "barracouta, snoek", "eel", "coho, cohoe, coho salmon, blue jack, silver salmon, oncorhynchus kisutch", "rock beauty, holocanthus tricolor", "anemone fish", "sturgeon", "gar, garfish, garpike, billfish, lepisosteus osseus", "lionfish", "puffer, pufferfish, blowfish, globefish", "abacus", "abaya", "academic gown, academic robe, judge's robe", "accordion, piano accordion, squeeze box", "acoustic guitar", "aircraft carrier, carrier, flattop, attack aircraft carrier", "airliner", "airship, dirigible", "altar", "ambulance", "amphibian, amphibious vehicle", "analog clock", "apiary, bee house", "apron", "ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin", "assault rifle, assault gun", "backpack, back pack, knapsack, packsack, rucksack, haversack", "bakery, bakeshop, bakehouse", "balance beam, beam", "balloon", "ballpoint, ballpoint pen, ballpen, biro", "band aid", "banjo", "bannister, banister, balustrade, balusters, handrail", "barbell", "barber chair", "barbershop", "barn", "barometer", "barrel, cask", "barrow, garden cart, lawn cart, wheelbarrow", "baseball", "basketball", "bassinet", "bassoon", "bathing cap, swimming cap", "bath towel", "bathtub, bathing tub, bath, tub", "beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon", "beacon, lighthouse, beacon light, pharos", "beaker", "bearskin, busby, shako", "beer bottle", "beer glass", "bell cote, bell cot", "bib", "bicycle-built-for-two, tandem bicycle, tandem", "bikini, two-piece", "binder, ring-binder", "binoculars, field glasses, opera glasses", "birdhouse", "boathouse", "bobsled, bobsleigh, bob", "bolo tie, bolo, bola tie, bola", "bonnet, poke bonnet", "bookcase", "bookshop, bookstore, bookstall", "bottlecap", "bow", "bow tie, bow-tie, bowtie", "brass, memorial tablet, plaque", "brassiere, bra, bandeau", "breakwater, groin, groyne, mole, bulwark, seawall, jetty", "breastplate, aegis, egis", "broom", "bucket, pail", "buckle", "bulletproof vest", "bullet train, bullet", "butcher shop, meat market", "cab, hack, taxi, taxicab", "caldron, cauldron", "candle, taper, wax light", "cannon", "canoe", "can opener, tin opener", "cardigan", "car mirror", "carousel, carrousel, merry-go-round, roundabout, whirligig", "carpenter's kit, tool kit", "carton", "car wheel", "cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, atm", "cassette", "cassette player", "castle", "catamaran", "cd player", "cello, violoncello", "cellular telephone, cellular phone, cellphone, cell, mobile phone", "chain", "chainlink fence", "chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour", "chain saw, chainsaw", "chest", "chiffonier, 
commode", "chime, bell, gong", "china cabinet, china closet", "christmas stocking", "church, church building", "cinema, movie theater, movie theatre, movie house, picture palace", "cleaver, meat cleaver, chopper", "cliff dwelling", "cloak", "clog, geta, patten, sabot", "cocktail shaker", "coffee mug", "coffeepot", "coil, spiral, volute, whorl, helix", "combination lock", "computer keyboard, keypad", "confectionery, confectionary, candy store", "container ship, containership, container vessel", "convertible", "corkscrew, bottle screw", "cornet, horn, trumpet, trump", "cowboy boot", "cowboy hat, ten-gallon hat", "cradle", "crane", "crash helmet", "crate", "crib, cot", "crock pot", "croquet ball", "crutch", "cuirass", "dam, dike, dyke", "desk", "desktop computer", "dial telephone, dial phone", "diaper, nappy, napkin", "digital clock", "digital watch", "dining table, board", "dishrag, dishcloth", "dishwasher, dish washer, dishwashing machine", "disk brake, disc brake", "dock, dockage, docking facility", "dogsled, dog sled, dog sleigh", "dome", "doormat, welcome mat", "drilling platform, offshore rig", "drum, membranophone, tympan", "drumstick", "dumbbell", "dutch oven", "electric fan, blower", "electric guitar", "electric locomotive", "entertainment center", "envelope", "espresso maker", "face powder", "feather boa, boa", "file, file cabinet, filing cabinet", "fireboat", "fire engine, fire truck", "fire screen, fireguard", "flagpole, flagstaff", "flute, transverse flute", "folding chair", "football helmet", "forklift", "fountain", "fountain pen", "four-poster", "freight car", "french horn, horn", "frying pan, frypan, skillet", "fur coat", "garbage truck, dustcart", "gasmask, respirator, gas helmet", "gas pump, gasoline pump, petrol pump, island dispenser", "goblet", "go-kart", "golf ball", "golfcart, golf cart", "gondola", "gong, tam-tam", "gown", "grand piano, grand", "greenhouse, nursery, glasshouse", "grille, radiator grille", "grocery store, grocery, food market, market", "guillotine", "hair slide", "hair spray", "half track", "hammer", "hamper", "hand blower, blow dryer, blow drier, hair dryer, hair drier", "hand-held computer, hand-held microcomputer", "handkerchief, hankie, hanky, hankey", "hard disc, hard disk, fixed disk", "harmonica, mouth organ, harp, mouth harp", "harp", "harvester, reaper", "hatchet", "holster", "home theater, home theatre", "honeycomb", "hook, claw", "hoopskirt, crinoline", "horizontal bar, high bar", "horse cart, horse-cart", "hourglass", "ipod", "iron, smoothing iron", "jack-o'-lantern", "jean, blue jean, denim", "jeep, landrover", "jersey, t-shirt, tee shirt", "jigsaw puzzle", "jinrikisha, ricksha, rickshaw", "joystick", "kimono", "knee pad", "knot", "lab coat, laboratory coat", "ladle", "lampshade, lamp shade", "laptop, laptop computer", "lawn mower, mower", "lens cap, lens cover", "letter opener, paper knife, paperknife", "library", "lifeboat", "lighter, light, igniter, ignitor", "limousine, limo", "liner, ocean liner", "lipstick, lip rouge", "loafer", "lotion", "loudspeaker, speaker, speaker unit, loudspeaker system, speaker system", "loupe, jeweler's loupe", "lumbermill, sawmill", "magnetic compass", "mailbag, postbag", "mailbox, letter box", "maillot", "maillot, tank suit", "manhole cover", "maraca", "marimba, xylophone", "mask", "matchstick", "maypole", "maze, labyrinth", "measuring cup", "medicine chest, medicine cabinet", "megalith, megalithic structure", "microphone, mike", "microwave, microwave oven", "military uniform", "milk can", "minibus", 
"miniskirt, mini", "minivan", "missile", "mitten", "mixing bowl", "mobile home, manufactured home", "model t", "modem", "monastery", "monitor", "moped", "mortar", "mortarboard", "mosque", "mosquito net", "motor scooter, scooter", "mountain bike, all-terrain bike, off-roader", "mountain tent", "mouse, computer mouse", "mousetrap", "moving van", "muzzle", "nail", "neck brace", "necklace", "nipple", "notebook, notebook computer", "obelisk", "oboe, hautboy, hautbois", "ocarina, sweet potato", "odometer, hodometer, mileometer, milometer", "oil filter", "organ, pipe organ", "oscilloscope, scope, cathode-ray oscilloscope, cro", "overskirt", "oxcart", "oxygen mask", "packet", "paddle, boat paddle", "paddlewheel, paddle wheel", "padlock", "paintbrush", "pajama, pyjama, pj's, jammies", "palace", "panpipe, pandean pipe, syrinx", "paper towel", "parachute, chute", "parallel bars, bars", "park bench", "parking meter", "passenger car, coach, carriage", "patio, terrace", "pay-phone, pay-station", "pedestal, plinth, footstall", "pencil box, pencil case", "pencil sharpener", "perfume, essence", "petri dish", "photocopier", "pick, plectrum, plectron", "pickelhaube", "picket fence, paling", "pickup, pickup truck", "pier", "piggy bank, penny bank", "pill bottle", "pillow", "ping-pong ball", "pinwheel", "pirate, pirate ship", "pitcher, ewer", "plane, carpenter's plane, woodworking plane", "planetarium", "plastic bag", "plate rack", "plow, plough", "plunger, plumber's helper", "polaroid camera, polaroid land camera", "pole", "police van, police wagon, paddy wagon, patrol wagon, wagon, black maria", "poncho", "pool table, billiard table, snooker table", "pop bottle, soda bottle", "pot, flowerpot", "potter's wheel", "power drill", "prayer rug, prayer mat", "printer", "prison, prison house", "projectile, missile", "projector", "puck, hockey puck", "punching bag, punch bag, punching ball, punchball", "purse", "quill, quill pen", "quilt, comforter, comfort, puff", "racer, race car, racing car", "racket, racquet", "radiator", "radio, wireless", "radio telescope, radio reflector", "rain barrel", "recreational vehicle, rv, r.v.", "reel", "reflex camera", "refrigerator, icebox", "remote control, remote", "restaurant, eating house, eating place, eatery", "revolver, six-gun, six-shooter", "rifle", "rocking chair, rocker", "rotisserie", "rubber eraser, rubber, pencil eraser", "rugby ball", "rule, ruler", "running shoe", "safe", "safety pin", "saltshaker, salt shaker", "sandal", "sarong", "sax, saxophone", "scabbard", "scale, weighing machine", "school bus", "schooner", "scoreboard", "screen, crt screen", "screw", "screwdriver", "seat belt, seatbelt", "sewing machine", "shield, buckler", "shoe shop, shoe-shop, shoe store", "shoji", "shopping basket", "shopping cart", "shovel", "shower cap", "shower curtain", "ski", "ski mask", "sleeping bag", "slide rule, slipstick", "sliding door", "slot, one-armed bandit", "snorkel", "snowmobile", "snowplow, snowplough", "soap dispenser", "soccer ball", "sock", "solar dish, solar collector, solar furnace", "sombrero", "soup bowl", "space bar", "space heater", "space shuttle", "spatula", "speedboat", "spider web, spider's web", "spindle", "sports car, sport car", "spotlight, spot", "stage", "steam locomotive", "steel arch bridge", "steel drum", "stethoscope", "stole", "stone wall", "stopwatch, stop watch", "stove", "strainer", "streetcar, tram, tramcar, trolley, trolley car", "stretcher", "studio couch, day bed", "stupa, tope", "submarine, pigboat, sub, u-boat", "suit, suit of clothes", 
"sundial", "sunglass", "sunglasses, dark glasses, shades", "sunscreen, sunblock, sun blocker", "suspension bridge", "swab, swob, mop", "sweatshirt", "swimming trunks, bathing trunks", "swing", "switch, electric switch, electrical switch", "syringe", "table lamp", "tank, army tank, armored combat vehicle, armoured combat vehicle", "tape player", "teapot", "teddy, teddy bear", "television, television system", "tennis ball", "thatch, thatched roof", "theater curtain, theatre curtain", "thimble", "thresher, thrasher, threshing machine", "throne", "tile roof", "toaster", "tobacco shop, tobacconist shop, tobacconist", "toilet seat", "torch", "totem pole", "tow truck, tow car, wrecker", "toyshop", "tractor", "trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi", "tray", "trench coat", "tricycle, trike, velocipede", "trimaran", "tripod", "triumphal arch", "trolleybus, trolley coach, trackless trolley", "trombone", "tub, vat", "turnstile", "typewriter keyboard", "umbrella", "unicycle, monocycle", "upright, upright piano", "vacuum, vacuum cleaner", "vase", "vault", "velvet", "vending machine", "vestment", "viaduct", "violin, fiddle", "volleyball", "waffle iron", "wall clock", "wallet, billfold, notecase, pocketbook", "wardrobe, closet, press", "warplane, military plane", "washbasin, handbasin, washbowl, lavabo, wash-hand basin", "washer, automatic washer, washing machine", "water bottle", "water jug", "water tower", "whiskey jug", "whistle", "wig", "window screen", "window shade", "windsor tie", "wine bottle", "wing", "wok", "wooden spoon", "wool, woolen, woollen", "worm fence, snake fence, snake-rail fence, virginia fence", "wreck", "yawl", "yurt", "web site, website, internet site, site", "comic book", "crossword puzzle, crossword", "street sign", "traffic light, traffic signal, stoplight", "book jacket, dust cover, dust jacket, dust wrapper", "menu", "plate", "guacamole", "consomme", "hot pot, hotpot", "trifle", "ice cream, icecream", "ice lolly, lolly, lollipop, popsicle", "french loaf", "bagel, beigel", "pretzel", "cheeseburger", "hotdog, hot dog, red hot", "mashed potato", "head cabbage", "broccoli", "cauliflower", "zucchini, courgette", "spaghetti squash", "acorn squash", "butternut squash", "cucumber, cuke", "artichoke, globe artichoke", "bell pepper", "cardoon", "mushroom", "granny smith", "strawberry", "orange", "lemon", "fig", "pineapple, ananas", "banana", "jackfruit, jak, jack", "custard apple", "pomegranate", "hay", "carbonara", "chocolate sauce, chocolate syrup", "dough", "meat loaf, meatloaf", "pizza, pizza pie", "potpie", "burrito", "red wine", "espresso", "cup", "eggnog", "alp", "bubble", "cliff, drop, drop-off", "coral reef", "geyser", "lakeside, lakeshore", "promontory, headland, head, foreland", "sandbar, sand bar", "seashore, coast, seacoast, sea-coast", "valley, vale", "volcano", "ballplayer, baseball player", "groom, bridegroom", "scuba diver", "rapeseed", "daisy", "yellow lady's slipper, yellow lady-slipper, cypripedium calceolus, cypripedium parviflorum", "corn", "acorn", "hip, rose hip, rosehip", "buckeye, horse chestnut, conker", "coral fungus", "agaric", "gyromitra", "stinkhorn, carrion fungus", "earthstar", "hen-of-the-woods, hen of the woods, polyporus frondosus, grifola frondosa", "bolete", "ear, spike, capitulum", "toilet tissue, toilet paper, bathroom tissue" ]
dhritic9/vit-base-brain-tumor-detection
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# vit-base-brain-tumor-detection

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5832
- Accuracy: 0.785

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.9535 | 0.4 | 100 | 0.8966 | 0.618 |
| 0.862 | 0.8 | 200 | 1.1149 | 0.561 |
| 0.7373 | 1.2 | 300 | 0.8543 | 0.605 |
| 0.6476 | 1.6 | 400 | 0.7307 | 0.666 |
| 0.6712 | 2.0 | 500 | 0.6954 | 0.694 |
| 0.4892 | 2.4 | 600 | 0.6391 | 0.707 |
| 0.5801 | 2.8 | 700 | 0.6247 | 0.708 |
| 0.3505 | 3.2 | 800 | 0.6056 | 0.778 |
| 0.3503 | 3.6 | 900 | 0.6264 | 0.743 |
| 0.3416 | 4.0 | 1000 | 0.5832 | 0.785 |
| 0.1427 | 4.4 | 1100 | 0.7297 | 0.769 |
| 0.1982 | 4.8 | 1200 | 0.7761 | 0.73 |
| 0.193 | 5.2 | 1300 | 0.8467 | 0.741 |
| 0.1831 | 5.6 | 1400 | 0.6975 | 0.774 |
| 0.2612 | 6.0 | 1500 | 0.8719 | 0.775 |
| 0.102 | 6.4 | 1600 | 0.9045 | 0.788 |
| 0.1029 | 6.8 | 1700 | 0.9655 | 0.783 |
| 0.0735 | 7.2 | 1800 | 0.9906 | 0.78 |
| 0.0715 | 7.6 | 1900 | 0.8893 | 0.787 |
| 0.1254 | 8.0 | 2000 | 1.1221 | 0.761 |
| 0.021 | 8.4 | 2100 | 1.1648 | 0.779 |
| 0.0133 | 8.8 | 2200 | 0.9857 | 0.806 |
| 0.0086 | 9.2 | 2300 | 1.0365 | 0.799 |
| 0.0223 | 9.6 | 2400 | 0.9826 | 0.812 |
| 0.0023 | 10.0 | 2500 | 1.0697 | 0.795 |
| 0.0021 | 10.4 | 2600 | 1.0490 | 0.815 |
| 0.0401 | 10.8 | 2700 | 1.1594 | 0.8 |
| 0.0012 | 11.2 | 2800 | 1.0811 | 0.817 |
| 0.0034 | 11.6 | 2900 | 1.0956 | 0.825 |
| 0.0012 | 12.0 | 3000 | 1.2010 | 0.808 |
| 0.0011 | 12.4 | 3100 | 1.1712 | 0.81 |
| 0.0092 | 12.8 | 3200 | 1.1814 | 0.813 |
| 0.0007 | 13.2 | 3300 | 1.1677 | 0.818 |
| 0.0007 | 13.6 | 3400 | 1.1723 | 0.818 |
| 0.0006 | 14.0 | 3500 | 1.1852 | 0.821 |
| 0.0005 | 14.4 | 3600 | 1.1928 | 0.82 |
| 0.0005 | 14.8 | 3700 | 1.2030 | 0.819 |
| 0.0005 | 15.2 | 3800 | 1.2093 | 0.818 |
| 0.0005 | 15.6 | 3900 | 1.2160 | 0.818 |
| 0.0004 | 16.0 | 4000 | 1.2232 | 0.819 |
| 0.0004 | 16.4 | 4100 | 1.2302 | 0.819 |
| 0.0004 | 16.8 | 4200 | 1.2350 | 0.819 |
| 0.0004 | 17.2 | 4300 | 1.2400 | 0.82 |
| 0.0004 | 17.6 | 4400 | 1.2442 | 0.821 |
| 0.0004 | 18.0 | 4500 | 1.2483 | 0.821 |
| 0.0004 | 18.4 | 4600 | 1.2518 | 0.821 |
| 0.0004 | 18.8 | 4700 | 1.2546 | 0.821 |
| 0.0004 | 19.2 | 4800 | 1.2561 | 0.821 |
| 0.0004 | 19.6 | 4900 | 1.2574 | 0.82 |
| 0.0004 | 20.0 | 5000 | 1.2577 | 0.82 |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
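The card gives no usage example, so here is a minimal inference sketch, not taken from the original card: it assumes the checkpoint is published on the Hub under this model id and follows the standard transformers image-classification layout. Note the label strings `"0"`–`"3"` (see the label list below) are not documented, so the clinical meaning of each class is unknown.

```python
# Hedged usage sketch -- assumptions: checkpoint is on the Hub as
# "dhritic9/vit-base-brain-tumor-detection" with a standard
# image-classification head; "scan.png" is a hypothetical local image path.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="dhritic9/vit-base-brain-tumor-detection",
)

for pred in classifier("scan.png"):
    print(f"{pred['label']}: {pred['score']:.3f}")
```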
[ "0", "1", "2", "3" ]
prakhardixit24/urinary_carcinoma_classifier
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# urinary_carcinoma_classifier

This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 145167345715929860710353977110167552.0000
- Accuracy: 0.5

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:-----------------------------------------:|:--------:|
| No log | 1.0 | 1 | 171619846545786152085242447388475392.0000 | 0.5 |
| No log | 2.0 | 2 | 216416222105935722637923733961965568.0000 | 0.75 |
| No log | 3.0 | 3 | 145167345715929860710353977110167552.0000 | 0.5 |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1
- Datasets 2.20.0
- Tokenizers 0.19.1
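The hyperparameter list above maps one-to-one onto 🤗 `TrainingArguments`. Purely as an illustration (the author's actual training script is not shown, and `output_dir` is a hypothetical choice), a sketch of that configuration:

```python
# Sketch of the listed hyperparameters as TrainingArguments (assumption:
# the card was produced via the Trainer API, as its auto-generated header
# suggests). Not the author's code.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="urinary_carcinoma_classifier",  # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,  # 16 x 4 = total_train_batch_size 64
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=3,
)
```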
[ "negative", "positive" ]
prakhardixit24/urinary_carcinoma_classifier_m_rs_50
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# urinary_carcinoma_classifier_m_rs_50

This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6941
- Accuracy: 0.5

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 1 | 0.6931 | 0.5 |
| No log | 2.0 | 2 | 0.6935 | 0.5 |
| No log | 3.0 | 3 | 0.6941 | 0.5 |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1
- Datasets 2.20.0
- Tokenizers 0.19.1
[ "negative", "positive" ]
prakhardixit24/urinary_carcinoma_classifier_g
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# urinary_carcinoma_classifier_g

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1879
- Accuracy: 1.0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 1 | 0.6847 | 0.5 |
| No log | 2.0 | 2 | 0.6741 | 0.75 |
| No log | 3.0 | 3 | 0.6468 | 0.75 |
| No log | 4.0 | 4 | 0.6268 | 0.75 |
| No log | 5.0 | 5 | 0.6053 | 0.75 |
| No log | 6.0 | 6 | 0.5515 | 0.75 |
| No log | 7.0 | 7 | 0.5259 | 1.0 |
| No log | 8.0 | 8 | 0.4513 | 1.0 |
| No log | 9.0 | 9 | 0.4493 | 1.0 |
| 0.1427 | 10.0 | 10 | 0.3979 | 1.0 |
| 0.1427 | 11.0 | 11 | 0.4203 | 1.0 |
| 0.1427 | 12.0 | 12 | 0.3690 | 1.0 |
| 0.1427 | 13.0 | 13 | 0.2793 | 1.0 |
| 0.1427 | 14.0 | 14 | 0.3143 | 1.0 |
| 0.1427 | 15.0 | 15 | 0.2536 | 1.0 |
| 0.1427 | 16.0 | 16 | 0.2509 | 1.0 |
| 0.1427 | 17.0 | 17 | 0.2619 | 1.0 |
| 0.1427 | 18.0 | 18 | 0.2187 | 1.0 |
| 0.1427 | 19.0 | 19 | 0.3027 | 1.0 |
| 0.055 | 20.0 | 20 | 0.2662 | 1.0 |
| 0.055 | 21.0 | 21 | 0.3630 | 0.75 |
| 0.055 | 22.0 | 22 | 0.4297 | 0.75 |
| 0.055 | 23.0 | 23 | 0.3473 | 0.75 |
| 0.055 | 24.0 | 24 | 0.4058 | 0.75 |
| 0.055 | 25.0 | 25 | 0.3959 | 0.75 |
| 0.055 | 26.0 | 26 | 0.2548 | 1.0 |
| 0.055 | 27.0 | 27 | 0.1835 | 1.0 |
| 0.055 | 28.0 | 28 | 0.1909 | 1.0 |
| 0.055 | 29.0 | 29 | 0.4000 | 0.75 |
| 0.029 | 30.0 | 30 | 0.1879 | 1.0 |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1
- Datasets 2.20.0
- Tokenizers 0.19.1
[ "negative", "positive" ]
KCAZAR/clasificador-de-bananas
# Model Trained Using AutoTrain

- Problem type: Image Classification

## Validation Metrics

loss: 0.5388542413711548

f1: 1.0

precision: 1.0

recall: 1.0

auc: 1.0

accuracy: 1.0
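AutoTrain cards like this one omit usage code. As a hedged sketch (assuming the repo hosts a standard transformers image-classification checkpoint; `banana.jpg` is a hypothetical input), the model can be loaded explicitly rather than via the pipeline, which also exposes the label mapping:

```python
# Sketch: explicit processor + model loading for an AutoTrain image
# classifier (assumption: standard transformers checkpoint layout).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "KCAZAR/clasificador-de-bananas"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("banana.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Expected to print "maduras" or "normales" per the label list below.
print(model.config.id2label[logits.argmax(-1).item()])
```

The same pattern applies to the other AutoTrain banana classifiers in this collection.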
[ "maduras", "normales" ]
CRISMARHO/clasificador-de-bananas
# Model Trained Using AutoTrain

- Problem type: Image Classification

## Validation Metrics

loss: 0.38848769664764404

f1: 1.0

precision: 1.0

recall: 1.0

auc: 1.0

accuracy: 1.0
[ "maduras", "normales" ]
amiune/clasificacion-bananas
# Model Trained Using AutoTrain

- Problem type: Image Classification

## Validation Metrics

loss: 0.32454729080200195

f1: 1.0

precision: 1.0

recall: 1.0

auc: 1.0

accuracy: 1.0
[ "maduras", "normales" ]
KCAZAR/mi-banana-variedades
# Model Trained Using AutoTrain

- Problem type: Image Classification

## Validation Metrics

loss: 0.39871343970298767

f1: 1.0

precision: 1.0

recall: 1.0

auc: 1.0

accuracy: 1.0
[ "maduras", "normales" ]
CRISMARHO/prueba-banana-imagenes-nuevas
# Model Trained Using AutoTrain

- Problem type: Image Classification

## Validation Metrics

loss: 0.38206303119659424

f1: 1.0

precision: 1.0

recall: 1.0

auc: 1.0

accuracy: 1.0
[ "maduras", "normales" ]
prakhardixit24/urinary_carcinoma_classifier_g001
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# urinary_carcinoma_classifier_g001

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3828
- Accuracy: 0.8571

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 1 | 0.6854 | 0.7143 |
| No log | 2.0 | 2 | 0.6759 | 0.5714 |
| No log | 3.0 | 3 | 0.6738 | 0.5714 |
| No log | 4.0 | 4 | 0.6571 | 0.5714 |
| No log | 5.0 | 5 | 0.6342 | 0.5714 |
| No log | 6.0 | 6 | 0.6339 | 0.5714 |
| No log | 7.0 | 7 | 0.5402 | 0.5714 |
| No log | 8.0 | 8 | 0.5827 | 0.5714 |
| No log | 9.0 | 9 | 0.5439 | 0.7143 |
| 0.2718 | 10.0 | 10 | 0.5553 | 0.7143 |
| 0.2718 | 11.0 | 11 | 0.4241 | 1.0 |
| 0.2718 | 12.0 | 12 | 0.5177 | 0.8571 |
| 0.2718 | 13.0 | 13 | 0.4088 | 0.8571 |
| 0.2718 | 14.0 | 14 | 0.4763 | 0.7143 |
| 0.2718 | 15.0 | 15 | 0.3164 | 1.0 |
| 0.2718 | 16.0 | 16 | 0.3087 | 1.0 |
| 0.2718 | 17.0 | 17 | 0.3457 | 0.8571 |
| 0.2718 | 18.0 | 18 | 0.2585 | 1.0 |
| 0.2718 | 19.0 | 19 | 0.3642 | 0.8571 |
| 0.1299 | 20.0 | 20 | 0.4421 | 0.7143 |
| 0.1299 | 21.0 | 21 | 0.3558 | 0.8571 |
| 0.1299 | 22.0 | 22 | 0.3611 | 0.8571 |
| 0.1299 | 23.0 | 23 | 0.5796 | 0.7143 |
| 0.1299 | 24.0 | 24 | 0.4137 | 0.8571 |
| 0.1299 | 25.0 | 25 | 0.4281 | 0.8571 |
| 0.1299 | 26.0 | 26 | 0.2066 | 1.0 |
| 0.1299 | 27.0 | 27 | 0.2251 | 1.0 |
| 0.1299 | 28.0 | 28 | 0.2459 | 1.0 |
| 0.1299 | 29.0 | 29 | 0.4450 | 0.8571 |
| 0.0743 | 30.0 | 30 | 0.3828 | 0.8571 |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1
- Datasets 2.20.0
- Tokenizers 0.19.1
[ "negative", "positive" ]
ychoikr/resnet-18
# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
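The "How to Get Started with the Model" section above is an unfilled placeholder. As a hedged illustration only (not the author's code), a minimal getting-started sketch, assuming `ychoikr/resnet-18` follows the standard transformers image-classification layout with the ImageNet-1k label set listed below:

```python
# Hedged getting-started sketch -- assumption: standard checkpoint layout;
# "cat.jpg" is a hypothetical local image.
from transformers import pipeline

clf = pipeline("image-classification", model="ychoikr/resnet-18")
print(clf("cat.jpg")[0])  # e.g. {'label': 'tabby, tabby cat', 'score': ...}
```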
[ "tench, tinca tinca", "goldfish, carassius auratus", "great white shark, white shark, man-eater, man-eating shark, carcharodon carcharias", "tiger shark, galeocerdo cuvieri", "hammerhead, hammerhead shark", "electric ray, crampfish, numbfish, torpedo", "stingray", "cock", "hen", "ostrich, struthio camelus", "brambling, fringilla montifringilla", "goldfinch, carduelis carduelis", "house finch, linnet, carpodacus mexicanus", "junco, snowbird", "indigo bunting, indigo finch, indigo bird, passerina cyanea", "robin, american robin, turdus migratorius", "bulbul", "jay", "magpie", "chickadee", "water ouzel, dipper", "kite", "bald eagle, american eagle, haliaeetus leucocephalus", "vulture", "great grey owl, great gray owl, strix nebulosa", "european fire salamander, salamandra salamandra", "common newt, triturus vulgaris", "eft", "spotted salamander, ambystoma maculatum", "axolotl, mud puppy, ambystoma mexicanum", "bullfrog, rana catesbeiana", "tree frog, tree-frog", "tailed frog, bell toad, ribbed toad, tailed toad, ascaphus trui", "loggerhead, loggerhead turtle, caretta caretta", "leatherback turtle, leatherback, leathery turtle, dermochelys coriacea", "mud turtle", "terrapin", "box turtle, box tortoise", "banded gecko", "common iguana, iguana, iguana iguana", "american chameleon, anole, anolis carolinensis", "whiptail, whiptail lizard", "agama", "frilled lizard, chlamydosaurus kingi", "alligator lizard", "gila monster, heloderma suspectum", "green lizard, lacerta viridis", "african chameleon, chamaeleo chamaeleon", "komodo dragon, komodo lizard, dragon lizard, giant lizard, varanus komodoensis", "african crocodile, nile crocodile, crocodylus niloticus", "american alligator, alligator mississipiensis", "triceratops", "thunder snake, worm snake, carphophis amoenus", "ringneck snake, ring-necked snake, ring snake", "hognose snake, puff adder, sand viper", "green snake, grass snake", "king snake, kingsnake", "garter snake, grass snake", "water snake", "vine snake", "night snake, hypsiglena torquata", "boa constrictor, constrictor constrictor", "rock python, rock snake, python sebae", "indian cobra, naja naja", "green mamba", "sea snake", "horned viper, cerastes, sand viper, horned asp, cerastes cornutus", "diamondback, diamondback rattlesnake, crotalus adamanteus", "sidewinder, horned rattlesnake, crotalus cerastes", "trilobite", "harvestman, daddy longlegs, phalangium opilio", "scorpion", "black and gold garden spider, argiope aurantia", "barn spider, araneus cavaticus", "garden spider, aranea diademata", "black widow, latrodectus mactans", "tarantula", "wolf spider, hunting spider", "tick", "centipede", "black grouse", "ptarmigan", "ruffed grouse, partridge, bonasa umbellus", "prairie chicken, prairie grouse, prairie fowl", "peacock", "quail", "partridge", "african grey, african gray, psittacus erithacus", "macaw", "sulphur-crested cockatoo, kakatoe galerita, cacatua galerita", "lorikeet", "coucal", "bee eater", "hornbill", "hummingbird", "jacamar", "toucan", "drake", "red-breasted merganser, mergus serrator", "goose", "black swan, cygnus atratus", "tusker", "echidna, spiny anteater, anteater", "platypus, duckbill, duckbilled platypus, duck-billed platypus, ornithorhynchus anatinus", "wallaby, brush kangaroo", "koala, koala bear, kangaroo bear, native bear, phascolarctos cinereus", "wombat", "jellyfish", "sea anemone, anemone", "brain coral", "flatworm, platyhelminth", "nematode, nematode worm, roundworm", "conch", "snail", "slug", "sea slug, nudibranch", "chiton, coat-of-mail shell, sea 
cradle, polyplacophore", "chambered nautilus, pearly nautilus, nautilus", "dungeness crab, cancer magister", "rock crab, cancer irroratus", "fiddler crab", "king crab, alaska crab, alaskan king crab, alaska king crab, paralithodes camtschatica", "american lobster, northern lobster, maine lobster, homarus americanus", "spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish", "crayfish, crawfish, crawdad, crawdaddy", "hermit crab", "isopod", "white stork, ciconia ciconia", "black stork, ciconia nigra", "spoonbill", "flamingo", "little blue heron, egretta caerulea", "american egret, great white heron, egretta albus", "bittern", "crane", "limpkin, aramus pictus", "european gallinule, porphyrio porphyrio", "american coot, marsh hen, mud hen, water hen, fulica americana", "bustard", "ruddy turnstone, arenaria interpres", "red-backed sandpiper, dunlin, erolia alpina", "redshank, tringa totanus", "dowitcher", "oystercatcher, oyster catcher", "pelican", "king penguin, aptenodytes patagonica", "albatross, mollymawk", "grey whale, gray whale, devilfish, eschrichtius gibbosus, eschrichtius robustus", "killer whale, killer, orca, grampus, sea wolf, orcinus orca", "dugong, dugong dugon", "sea lion", "chihuahua", "japanese spaniel", "maltese dog, maltese terrier, maltese", "pekinese, pekingese, peke", "shih-tzu", "blenheim spaniel", "papillon", "toy terrier", "rhodesian ridgeback", "afghan hound, afghan", "basset, basset hound", "beagle", "bloodhound, sleuthhound", "bluetick", "black-and-tan coonhound", "walker hound, walker foxhound", "english foxhound", "redbone", "borzoi, russian wolfhound", "irish wolfhound", "italian greyhound", "whippet", "ibizan hound, ibizan podenco", "norwegian elkhound, elkhound", "otterhound, otter hound", "saluki, gazelle hound", "scottish deerhound, deerhound", "weimaraner", "staffordshire bullterrier, staffordshire bull terrier", "american staffordshire terrier, staffordshire terrier, american pit bull terrier, pit bull terrier", "bedlington terrier", "border terrier", "kerry blue terrier", "irish terrier", "norfolk terrier", "norwich terrier", "yorkshire terrier", "wire-haired fox terrier", "lakeland terrier", "sealyham terrier, sealyham", "airedale, airedale terrier", "cairn, cairn terrier", "australian terrier", "dandie dinmont, dandie dinmont terrier", "boston bull, boston terrier", "miniature schnauzer", "giant schnauzer", "standard schnauzer", "scotch terrier, scottish terrier, scottie", "tibetan terrier, chrysanthemum dog", "silky terrier, sydney silky", "soft-coated wheaten terrier", "west highland white terrier", "lhasa, lhasa apso", "flat-coated retriever", "curly-coated retriever", "golden retriever", "labrador retriever", "chesapeake bay retriever", "german short-haired pointer", "vizsla, hungarian pointer", "english setter", "irish setter, red setter", "gordon setter", "brittany spaniel", "clumber, clumber spaniel", "english springer, english springer spaniel", "welsh springer spaniel", "cocker spaniel, english cocker spaniel, cocker", "sussex spaniel", "irish water spaniel", "kuvasz", "schipperke", "groenendael", "malinois", "briard", "kelpie", "komondor", "old english sheepdog, bobtail", "shetland sheepdog, shetland sheep dog, shetland", "collie", "border collie", "bouvier des flandres, bouviers des flandres", "rottweiler", "german shepherd, german shepherd dog, german police dog, alsatian", "doberman, doberman pinscher", "miniature pinscher", "greater swiss mountain dog", "bernese mountain dog", "appenzeller", "entlebucher", "boxer", "bull 
mastiff", "tibetan mastiff", "french bulldog", "great dane", "saint bernard, st bernard", "eskimo dog, husky", "malamute, malemute, alaskan malamute", "siberian husky", "dalmatian, coach dog, carriage dog", "affenpinscher, monkey pinscher, monkey dog", "basenji", "pug, pug-dog", "leonberg", "newfoundland, newfoundland dog", "great pyrenees", "samoyed, samoyede", "pomeranian", "chow, chow chow", "keeshond", "brabancon griffon", "pembroke, pembroke welsh corgi", "cardigan, cardigan welsh corgi", "toy poodle", "miniature poodle", "standard poodle", "mexican hairless", "timber wolf, grey wolf, gray wolf, canis lupus", "white wolf, arctic wolf, canis lupus tundrarum", "red wolf, maned wolf, canis rufus, canis niger", "coyote, prairie wolf, brush wolf, canis latrans", "dingo, warrigal, warragal, canis dingo", "dhole, cuon alpinus", "african hunting dog, hyena dog, cape hunting dog, lycaon pictus", "hyena, hyaena", "red fox, vulpes vulpes", "kit fox, vulpes macrotis", "arctic fox, white fox, alopex lagopus", "grey fox, gray fox, urocyon cinereoargenteus", "tabby, tabby cat", "tiger cat", "persian cat", "siamese cat, siamese", "egyptian cat", "cougar, puma, catamount, mountain lion, painter, panther, felis concolor", "lynx, catamount", "leopard, panthera pardus", "snow leopard, ounce, panthera uncia", "jaguar, panther, panthera onca, felis onca", "lion, king of beasts, panthera leo", "tiger, panthera tigris", "cheetah, chetah, acinonyx jubatus", "brown bear, bruin, ursus arctos", "american black bear, black bear, ursus americanus, euarctos americanus", "ice bear, polar bear, ursus maritimus, thalarctos maritimus", "sloth bear, melursus ursinus, ursus ursinus", "mongoose", "meerkat, mierkat", "tiger beetle", "ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle", "ground beetle, carabid beetle", "long-horned beetle, longicorn, longicorn beetle", "leaf beetle, chrysomelid", "dung beetle", "rhinoceros beetle", "weevil", "fly", "bee", "ant, emmet, pismire", "grasshopper, hopper", "cricket", "walking stick, walkingstick, stick insect", "cockroach, roach", "mantis, mantid", "cicada, cicala", "leafhopper", "lacewing, lacewing fly", "dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk", "damselfly", "admiral", "ringlet, ringlet butterfly", "monarch, monarch butterfly, milkweed butterfly, danaus plexippus", "cabbage butterfly", "sulphur butterfly, sulfur butterfly", "lycaenid, lycaenid butterfly", "starfish, sea star", "sea urchin", "sea cucumber, holothurian", "wood rabbit, cottontail, cottontail rabbit", "hare", "angora, angora rabbit", "hamster", "porcupine, hedgehog", "fox squirrel, eastern fox squirrel, sciurus niger", "marmot", "beaver", "guinea pig, cavia cobaya", "sorrel", "zebra", "hog, pig, grunter, squealer, sus scrofa", "wild boar, boar, sus scrofa", "warthog", "hippopotamus, hippo, river horse, hippopotamus amphibius", "ox", "water buffalo, water ox, asiatic buffalo, bubalus bubalis", "bison", "ram, tup", "bighorn, bighorn sheep, cimarron, rocky mountain bighorn, rocky mountain sheep, ovis canadensis", "ibex, capra ibex", "hartebeest", "impala, aepyceros melampus", "gazelle", "arabian camel, dromedary, camelus dromedarius", "llama", "weasel", "mink", "polecat, fitch, foulmart, foumart, mustela putorius", "black-footed ferret, ferret, mustela nigripes", "otter", "skunk, polecat, wood pussy", "badger", "armadillo", "three-toed sloth, ai, bradypus tridactylus", "orangutan, orang, orangutang, pongo pygmaeus", "gorilla, 
gorilla gorilla", "chimpanzee, chimp, pan troglodytes", "gibbon, hylobates lar", "siamang, hylobates syndactylus, symphalangus syndactylus", "guenon, guenon monkey", "patas, hussar monkey, erythrocebus patas", "baboon", "macaque", "langur", "colobus, colobus monkey", "proboscis monkey, nasalis larvatus", "marmoset", "capuchin, ringtail, cebus capucinus", "howler monkey, howler", "titi, titi monkey", "spider monkey, ateles geoffroyi", "squirrel monkey, saimiri sciureus", "madagascar cat, ring-tailed lemur, lemur catta", "indri, indris, indri indri, indri brevicaudatus", "indian elephant, elephas maximus", "african elephant, loxodonta africana", "lesser panda, red panda, panda, bear cat, cat bear, ailurus fulgens", "giant panda, panda, panda bear, coon bear, ailuropoda melanoleuca", "barracouta, snoek", "eel", "coho, cohoe, coho salmon, blue jack, silver salmon, oncorhynchus kisutch", "rock beauty, holocanthus tricolor", "anemone fish", "sturgeon", "gar, garfish, garpike, billfish, lepisosteus osseus", "lionfish", "puffer, pufferfish, blowfish, globefish", "abacus", "abaya", "academic gown, academic robe, judge's robe", "accordion, piano accordion, squeeze box", "acoustic guitar", "aircraft carrier, carrier, flattop, attack aircraft carrier", "airliner", "airship, dirigible", "altar", "ambulance", "amphibian, amphibious vehicle", "analog clock", "apiary, bee house", "apron", "ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin", "assault rifle, assault gun", "backpack, back pack, knapsack, packsack, rucksack, haversack", "bakery, bakeshop, bakehouse", "balance beam, beam", "balloon", "ballpoint, ballpoint pen, ballpen, biro", "band aid", "banjo", "bannister, banister, balustrade, balusters, handrail", "barbell", "barber chair", "barbershop", "barn", "barometer", "barrel, cask", "barrow, garden cart, lawn cart, wheelbarrow", "baseball", "basketball", "bassinet", "bassoon", "bathing cap, swimming cap", "bath towel", "bathtub, bathing tub, bath, tub", "beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon", "beacon, lighthouse, beacon light, pharos", "beaker", "bearskin, busby, shako", "beer bottle", "beer glass", "bell cote, bell cot", "bib", "bicycle-built-for-two, tandem bicycle, tandem", "bikini, two-piece", "binder, ring-binder", "binoculars, field glasses, opera glasses", "birdhouse", "boathouse", "bobsled, bobsleigh, bob", "bolo tie, bolo, bola tie, bola", "bonnet, poke bonnet", "bookcase", "bookshop, bookstore, bookstall", "bottlecap", "bow", "bow tie, bow-tie, bowtie", "brass, memorial tablet, plaque", "brassiere, bra, bandeau", "breakwater, groin, groyne, mole, bulwark, seawall, jetty", "breastplate, aegis, egis", "broom", "bucket, pail", "buckle", "bulletproof vest", "bullet train, bullet", "butcher shop, meat market", "cab, hack, taxi, taxicab", "caldron, cauldron", "candle, taper, wax light", "cannon", "canoe", "can opener, tin opener", "cardigan", "car mirror", "carousel, carrousel, merry-go-round, roundabout, whirligig", "carpenter's kit, tool kit", "carton", "car wheel", "cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, atm", "cassette", "cassette player", "castle", "catamaran", "cd player", "cello, violoncello", "cellular telephone, cellular phone, cellphone, cell, mobile phone", "chain", "chainlink fence", "chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour", "chain saw, chainsaw", "chest", "chiffonier, 
commode", "chime, bell, gong", "china cabinet, china closet", "christmas stocking", "church, church building", "cinema, movie theater, movie theatre, movie house, picture palace", "cleaver, meat cleaver, chopper", "cliff dwelling", "cloak", "clog, geta, patten, sabot", "cocktail shaker", "coffee mug", "coffeepot", "coil, spiral, volute, whorl, helix", "combination lock", "computer keyboard, keypad", "confectionery, confectionary, candy store", "container ship, containership, container vessel", "convertible", "corkscrew, bottle screw", "cornet, horn, trumpet, trump", "cowboy boot", "cowboy hat, ten-gallon hat", "cradle", "crane", "crash helmet", "crate", "crib, cot", "crock pot", "croquet ball", "crutch", "cuirass", "dam, dike, dyke", "desk", "desktop computer", "dial telephone, dial phone", "diaper, nappy, napkin", "digital clock", "digital watch", "dining table, board", "dishrag, dishcloth", "dishwasher, dish washer, dishwashing machine", "disk brake, disc brake", "dock, dockage, docking facility", "dogsled, dog sled, dog sleigh", "dome", "doormat, welcome mat", "drilling platform, offshore rig", "drum, membranophone, tympan", "drumstick", "dumbbell", "dutch oven", "electric fan, blower", "electric guitar", "electric locomotive", "entertainment center", "envelope", "espresso maker", "face powder", "feather boa, boa", "file, file cabinet, filing cabinet", "fireboat", "fire engine, fire truck", "fire screen, fireguard", "flagpole, flagstaff", "flute, transverse flute", "folding chair", "football helmet", "forklift", "fountain", "fountain pen", "four-poster", "freight car", "french horn, horn", "frying pan, frypan, skillet", "fur coat", "garbage truck, dustcart", "gasmask, respirator, gas helmet", "gas pump, gasoline pump, petrol pump, island dispenser", "goblet", "go-kart", "golf ball", "golfcart, golf cart", "gondola", "gong, tam-tam", "gown", "grand piano, grand", "greenhouse, nursery, glasshouse", "grille, radiator grille", "grocery store, grocery, food market, market", "guillotine", "hair slide", "hair spray", "half track", "hammer", "hamper", "hand blower, blow dryer, blow drier, hair dryer, hair drier", "hand-held computer, hand-held microcomputer", "handkerchief, hankie, hanky, hankey", "hard disc, hard disk, fixed disk", "harmonica, mouth organ, harp, mouth harp", "harp", "harvester, reaper", "hatchet", "holster", "home theater, home theatre", "honeycomb", "hook, claw", "hoopskirt, crinoline", "horizontal bar, high bar", "horse cart, horse-cart", "hourglass", "ipod", "iron, smoothing iron", "jack-o'-lantern", "jean, blue jean, denim", "jeep, landrover", "jersey, t-shirt, tee shirt", "jigsaw puzzle", "jinrikisha, ricksha, rickshaw", "joystick", "kimono", "knee pad", "knot", "lab coat, laboratory coat", "ladle", "lampshade, lamp shade", "laptop, laptop computer", "lawn mower, mower", "lens cap, lens cover", "letter opener, paper knife, paperknife", "library", "lifeboat", "lighter, light, igniter, ignitor", "limousine, limo", "liner, ocean liner", "lipstick, lip rouge", "loafer", "lotion", "loudspeaker, speaker, speaker unit, loudspeaker system, speaker system", "loupe, jeweler's loupe", "lumbermill, sawmill", "magnetic compass", "mailbag, postbag", "mailbox, letter box", "maillot", "maillot, tank suit", "manhole cover", "maraca", "marimba, xylophone", "mask", "matchstick", "maypole", "maze, labyrinth", "measuring cup", "medicine chest, medicine cabinet", "megalith, megalithic structure", "microphone, mike", "microwave, microwave oven", "military uniform", "milk can", "minibus", 
"miniskirt, mini", "minivan", "missile", "mitten", "mixing bowl", "mobile home, manufactured home", "model t", "modem", "monastery", "monitor", "moped", "mortar", "mortarboard", "mosque", "mosquito net", "motor scooter, scooter", "mountain bike, all-terrain bike, off-roader", "mountain tent", "mouse, computer mouse", "mousetrap", "moving van", "muzzle", "nail", "neck brace", "necklace", "nipple", "notebook, notebook computer", "obelisk", "oboe, hautboy, hautbois", "ocarina, sweet potato", "odometer, hodometer, mileometer, milometer", "oil filter", "organ, pipe organ", "oscilloscope, scope, cathode-ray oscilloscope, cro", "overskirt", "oxcart", "oxygen mask", "packet", "paddle, boat paddle", "paddlewheel, paddle wheel", "padlock", "paintbrush", "pajama, pyjama, pj's, jammies", "palace", "panpipe, pandean pipe, syrinx", "paper towel", "parachute, chute", "parallel bars, bars", "park bench", "parking meter", "passenger car, coach, carriage", "patio, terrace", "pay-phone, pay-station", "pedestal, plinth, footstall", "pencil box, pencil case", "pencil sharpener", "perfume, essence", "petri dish", "photocopier", "pick, plectrum, plectron", "pickelhaube", "picket fence, paling", "pickup, pickup truck", "pier", "piggy bank, penny bank", "pill bottle", "pillow", "ping-pong ball", "pinwheel", "pirate, pirate ship", "pitcher, ewer", "plane, carpenter's plane, woodworking plane", "planetarium", "plastic bag", "plate rack", "plow, plough", "plunger, plumber's helper", "polaroid camera, polaroid land camera", "pole", "police van, police wagon, paddy wagon, patrol wagon, wagon, black maria", "poncho", "pool table, billiard table, snooker table", "pop bottle, soda bottle", "pot, flowerpot", "potter's wheel", "power drill", "prayer rug, prayer mat", "printer", "prison, prison house", "projectile, missile", "projector", "puck, hockey puck", "punching bag, punch bag, punching ball, punchball", "purse", "quill, quill pen", "quilt, comforter, comfort, puff", "racer, race car, racing car", "racket, racquet", "radiator", "radio, wireless", "radio telescope, radio reflector", "rain barrel", "recreational vehicle, rv, r.v.", "reel", "reflex camera", "refrigerator, icebox", "remote control, remote", "restaurant, eating house, eating place, eatery", "revolver, six-gun, six-shooter", "rifle", "rocking chair, rocker", "rotisserie", "rubber eraser, rubber, pencil eraser", "rugby ball", "rule, ruler", "running shoe", "safe", "safety pin", "saltshaker, salt shaker", "sandal", "sarong", "sax, saxophone", "scabbard", "scale, weighing machine", "school bus", "schooner", "scoreboard", "screen, crt screen", "screw", "screwdriver", "seat belt, seatbelt", "sewing machine", "shield, buckler", "shoe shop, shoe-shop, shoe store", "shoji", "shopping basket", "shopping cart", "shovel", "shower cap", "shower curtain", "ski", "ski mask", "sleeping bag", "slide rule, slipstick", "sliding door", "slot, one-armed bandit", "snorkel", "snowmobile", "snowplow, snowplough", "soap dispenser", "soccer ball", "sock", "solar dish, solar collector, solar furnace", "sombrero", "soup bowl", "space bar", "space heater", "space shuttle", "spatula", "speedboat", "spider web, spider's web", "spindle", "sports car, sport car", "spotlight, spot", "stage", "steam locomotive", "steel arch bridge", "steel drum", "stethoscope", "stole", "stone wall", "stopwatch, stop watch", "stove", "strainer", "streetcar, tram, tramcar, trolley, trolley car", "stretcher", "studio couch, day bed", "stupa, tope", "submarine, pigboat, sub, u-boat", "suit, suit of clothes", 
"sundial", "sunglass", "sunglasses, dark glasses, shades", "sunscreen, sunblock, sun blocker", "suspension bridge", "swab, swob, mop", "sweatshirt", "swimming trunks, bathing trunks", "swing", "switch, electric switch, electrical switch", "syringe", "table lamp", "tank, army tank, armored combat vehicle, armoured combat vehicle", "tape player", "teapot", "teddy, teddy bear", "television, television system", "tennis ball", "thatch, thatched roof", "theater curtain, theatre curtain", "thimble", "thresher, thrasher, threshing machine", "throne", "tile roof", "toaster", "tobacco shop, tobacconist shop, tobacconist", "toilet seat", "torch", "totem pole", "tow truck, tow car, wrecker", "toyshop", "tractor", "trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi", "tray", "trench coat", "tricycle, trike, velocipede", "trimaran", "tripod", "triumphal arch", "trolleybus, trolley coach, trackless trolley", "trombone", "tub, vat", "turnstile", "typewriter keyboard", "umbrella", "unicycle, monocycle", "upright, upright piano", "vacuum, vacuum cleaner", "vase", "vault", "velvet", "vending machine", "vestment", "viaduct", "violin, fiddle", "volleyball", "waffle iron", "wall clock", "wallet, billfold, notecase, pocketbook", "wardrobe, closet, press", "warplane, military plane", "washbasin, handbasin, washbowl, lavabo, wash-hand basin", "washer, automatic washer, washing machine", "water bottle", "water jug", "water tower", "whiskey jug", "whistle", "wig", "window screen", "window shade", "windsor tie", "wine bottle", "wing", "wok", "wooden spoon", "wool, woolen, woollen", "worm fence, snake fence, snake-rail fence, virginia fence", "wreck", "yawl", "yurt", "web site, website, internet site, site", "comic book", "crossword puzzle, crossword", "street sign", "traffic light, traffic signal, stoplight", "book jacket, dust cover, dust jacket, dust wrapper", "menu", "plate", "guacamole", "consomme", "hot pot, hotpot", "trifle", "ice cream, icecream", "ice lolly, lolly, lollipop, popsicle", "french loaf", "bagel, beigel", "pretzel", "cheeseburger", "hotdog, hot dog, red hot", "mashed potato", "head cabbage", "broccoli", "cauliflower", "zucchini, courgette", "spaghetti squash", "acorn squash", "butternut squash", "cucumber, cuke", "artichoke, globe artichoke", "bell pepper", "cardoon", "mushroom", "granny smith", "strawberry", "orange", "lemon", "fig", "pineapple, ananas", "banana", "jackfruit, jak, jack", "custard apple", "pomegranate", "hay", "carbonara", "chocolate sauce, chocolate syrup", "dough", "meat loaf, meatloaf", "pizza, pizza pie", "potpie", "burrito", "red wine", "espresso", "cup", "eggnog", "alp", "bubble", "cliff, drop, drop-off", "coral reef", "geyser", "lakeside, lakeshore", "promontory, headland, head, foreland", "sandbar, sand bar", "seashore, coast, seacoast, sea-coast", "valley, vale", "volcano", "ballplayer, baseball player", "groom, bridegroom", "scuba diver", "rapeseed", "daisy", "yellow lady's slipper, yellow lady-slipper, cypripedium calceolus, cypripedium parviflorum", "corn", "acorn", "hip, rose hip, rosehip", "buckeye, horse chestnut, conker", "coral fungus", "agaric", "gyromitra", "stinkhorn, carrion fungus", "earthstar", "hen-of-the-woods, hen of the woods, polyporus frondosus, grifola frondosa", "bolete", "ear, spike, capitulum", "toilet tissue, toilet paper, bathroom tissue" ]
mohammadAbdeh/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224-finetuned-flower

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

### Framework versions

- Transformers 4.24.0
- Pytorch 2.3.0+cu121
- Datasets 2.7.1
- Tokenizers 0.13.3
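Since this card reports no results or usage, here is a hedged inference sketch assuming the checkpoint exposes the five flower labels listed below via `config.id2label` (`flower.jpg` is a hypothetical input):

```python
# Sketch: explicit ViT loading with class probabilities over the five
# flower labels (assumption: standard ViT checkpoint layout).
import torch
from PIL import Image
from transformers import ViTImageProcessor, ViTForImageClassification

repo = "mohammadAbdeh/vit-base-patch16-224-finetuned-flower"
processor = ViTImageProcessor.from_pretrained(repo)
model = ViTForImageClassification.from_pretrained(repo)

inputs = processor(images=Image.open("flower.jpg"), return_tensors="pt")
with torch.no_grad():
    probs = torch.softmax(model(**inputs).logits, dim=-1)[0]

for i, p in sorted(enumerate(probs.tolist()), key=lambda x: -x[1]):
    print(model.config.id2label[i], round(p, 3))
```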
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
ArrayDice/food_image_classification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# food_image_classification

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6365
- Accuracy: 0.886

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.7201 | 0.992 | 62 | 2.5420 | 0.825 |
| 1.8371 | 2.0 | 125 | 1.7893 | 0.89 |
| 1.5974 | 2.976 | 186 | 1.6365 | 0.886 |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
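With 101 food classes (listed below), top-k predictions are often more useful than a single argmax. A hedged sketch, assuming standard pipeline usage (`dish.jpg` is a hypothetical input):

```python
# Top-k inference sketch for the 101-class food model.
from transformers import pipeline

clf = pipeline("image-classification", model="ArrayDice/food_image_classification")
for pred in clf("dish.jpg", top_k=5):
    print(f"{pred['label']}: {pred['score']:.3f}")
```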
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
ArrayDice/car_orientation_classification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# car_orientation_classification2

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6800
- Accuracy: 0.6926

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 40

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.9933 | 1.0 | 68 | 1.9084 | 0.4099 |
| 1.4721 | 2.0 | 136 | 1.2870 | 0.5124 |
| 1.1677 | 3.0 | 204 | 1.0780 | 0.5265 |
| 0.9919 | 4.0 | 272 | 0.9454 | 0.5760 |
| 0.8392 | 5.0 | 340 | 0.8184 | 0.6926 |
| 0.7778 | 6.0 | 408 | 0.8311 | 0.6431 |
| 0.7341 | 7.0 | 476 | 0.7425 | 0.6572 |
| 0.6695 | 8.0 | 544 | 0.6800 | 0.6926 |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
[ "n", "ne", "e", "se", "s", "sw", "w", "nw" ]
heisenberg3376/vit-base-food-items-v1
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# vit-base-food-items-v1

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4524
- Accuracy: 0.9091

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.1773        | 0.6579 | 100  | 0.7280          | 0.8473   |
| 0.0589        | 1.3158 | 200  | 0.5529          | 0.8873   |
| 0.043         | 1.9737 | 300  | 0.4524          | 0.9091   |
| 0.0022        | 2.6316 | 400  | 0.5150          | 0.8909   |
| 0.0018        | 3.2895 | 500  | 0.4925          | 0.9018   |
| 0.0017        | 3.9474 | 600  | 0.4941          | 0.9018   |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
[ "bread", "dairy product", "vegetable-fruit", "dessert", "egg", "fried food", "meat", "noodles-pasta", "rice", "seafood", "soup" ]
crapthings/beans
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# beans

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0677
- Accuracy: 0.9925

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5.0

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2825        | 1.0   | 130  | 0.2142          | 0.9624   |
| 0.1331        | 2.0   | 260  | 0.1347          | 0.9699   |
| 0.1427        | 3.0   | 390  | 0.0999          | 0.9774   |
| 0.0808        | 4.0   | 520  | 0.0677          | 0.9925   |
| 0.1156        | 5.0   | 650  | 0.0839          | 0.9774   |

### Framework versions

- Transformers 4.43.0.dev0
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
[ "angular_leaf_spot", "bean_rust", "healthy" ]
prakhardixit24/urinary_carcinoma_classifier_g002
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# urinary_carcinoma_classifier_g002

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3544
- Accuracy: 0.9231

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 1    | 0.6814          | 0.5385   |
| No log        | 2.0   | 2    | 0.6743          | 0.6923   |
| No log        | 3.0   | 3    | 0.6449          | 0.7692   |
| No log        | 4.0   | 4    | 0.6149          | 0.7692   |
| No log        | 5.0   | 5    | 0.5980          | 0.7692   |
| No log        | 6.0   | 6    | 0.5855          | 0.7692   |
| No log        | 7.0   | 7    | 0.5663          | 0.7692   |
| No log        | 8.0   | 8    | 0.5675          | 0.7692   |
| No log        | 9.0   | 9    | 0.5530          | 0.7692   |
| 0.637         | 10.0  | 10   | 0.5246          | 0.8462   |
| 0.637         | 11.0  | 11   | 0.5135          | 0.7692   |
| 0.637         | 12.0  | 12   | 0.5296          | 0.8462   |
| 0.637         | 13.0  | 13   | 0.5340          | 0.8462   |
| 0.637         | 14.0  | 14   | 0.4781          | 0.9231   |
| 0.637         | 15.0  | 15   | 0.4870          | 0.8462   |
| 0.637         | 16.0  | 16   | 0.4701          | 0.8462   |
| 0.637         | 17.0  | 17   | 0.4521          | 1.0      |
| 0.637         | 18.0  | 18   | 0.4266          | 0.9231   |
| 0.637         | 19.0  | 19   | 0.4220          | 0.9231   |
| 0.4474        | 20.0  | 20   | 0.3837          | 0.9231   |
| 0.4474        | 21.0  | 21   | 0.4257          | 0.8462   |
| 0.4474        | 22.0  | 22   | 0.4093          | 0.9231   |
| 0.4474        | 23.0  | 23   | 0.4019          | 1.0      |
| 0.4474        | 24.0  | 24   | 0.4578          | 0.8462   |
| 0.4474        | 25.0  | 25   | 0.3932          | 1.0      |
| 0.4474        | 26.0  | 26   | 0.3838          | 1.0      |
| 0.4474        | 27.0  | 27   | 0.3627          | 1.0      |
| 0.4474        | 28.0  | 28   | 0.3862          | 0.9231   |
| 0.4474        | 29.0  | 29   | 0.3624          | 0.9231   |
| 0.3102        | 30.0  | 30   | 0.3544          | 0.9231   |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1
- Datasets 2.20.0
- Tokenizers 0.19.1
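The card notes the data was loaded as an `imagefolder` dataset. Below is a minimal sketch of that loader, assuming a placeholder `data/` directory with one subfolder per class (`negative`, `positive`); the actual directory layout is not part of the card.

```python
# Sketch of the "imagefolder" loading step the card refers to.
# "data" is a placeholder directory expected to look like:
#   data/negative/*.png
#   data/positive/*.png
from datasets import load_dataset

dataset = load_dataset("imagefolder", data_dir="data")

# Class names are inferred from the subfolder names.
print(dataset["train"].features["label"].names)  # e.g. ['negative', 'positive']
```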
[ "negative", "positive" ]
prakhardixit24/urinary_carcinoma_classifier_g004
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# urinary_carcinoma_classifier_g004

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5556
- Accuracy: 0.7778

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 15

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 0.8   | 1    | 0.6989          | 0.6111   |
| No log        | 1.6   | 2    | 0.6758          | 0.6111   |
| No log        | 2.4   | 3    | 0.6409          | 0.6667   |
| No log        | 4.0   | 5    | 0.6102          | 0.7222   |
| No log        | 4.8   | 6    | 0.6065          | 0.7778   |
| No log        | 5.6   | 7    | 0.6030          | 0.7778   |
| No log        | 6.4   | 8    | 0.6254          | 0.5556   |
| 0.6126        | 8.0   | 10   | 0.5948          | 0.7222   |
| 0.6126        | 8.8   | 11   | 0.5967          | 0.6667   |
| 0.6126        | 9.6   | 12   | 0.5784          | 0.7778   |
| 0.6126        | 10.4  | 13   | 0.5751          | 0.6667   |
| 0.6126        | 12.0  | 15   | 0.5556          | 0.7778   |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1
- Datasets 2.20.0
- Tokenizers 0.19.1
[ "negative", "positive" ]
DS-LZY/swin-tiny-patch4-window7-224-finetuned-eurosat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# swin-tiny-patch4-window7-224-finetuned-eurosat

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0935
- Accuracy: 0.9694

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.5789        | 0.9979 | 351  | 0.1564          | 0.9506   |
| 0.3985        | 1.9986 | 703  | 0.1047          | 0.9652   |
| 0.3306        | 2.9936 | 1053 | 0.0935          | 0.9694   |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
[ "airplane", "automobile", "bird", "cat", "deer", "dog", "frog", "horse", "ship", "truck" ]
typorch/secondlife-detect
# SecondLife-detect

This one's for all my Flickr scrapers :) ~0.95 accuracy.
[ "reallife", "secondlife" ]
Veda0718/vit-base-patch16-224-finetuned-brain-tumor-classification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224-finetuned-brain-tumor-classification

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4348
- Accuracy: 0.8905

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 3.1659        | 0.9897 | 48   | 2.4060          | 0.4086   |
| 1.8381        | 2.0    | 97   | 1.2904          | 0.6772   |
| 1.0781        | 2.9897 | 145  | 0.9211          | 0.7573   |
| 0.8049        | 4.0    | 194  | 0.7274          | 0.8036   |
| 0.6091        | 4.9897 | 242  | 0.6427          | 0.8330   |
| 0.4985        | 6.0    | 291  | 0.5519          | 0.8510   |
| 0.4077        | 6.9897 | 339  | 0.4921          | 0.8792   |
| 0.3583        | 8.0    | 388  | 0.4756          | 0.8826   |
| 0.3292        | 8.9897 | 436  | 0.4472          | 0.8883   |
| 0.338         | 9.8969 | 480  | 0.4348          | 0.8905   |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
[ "astrocitoma t1", "astrocitoma t1c+", "astrocitoma t2", "carcinoma t1", "carcinoma t1c+", "carcinoma t2", "ependimoma t1", "ependimoma t1c+", "ependimoma t2", "ganglioglioma t1", "ganglioglioma t1c+", "ganglioglioma t2", "germinoma t1", "germinoma t1c+", "germinoma t2", "glioblastoma t1", "glioblastoma t1c+", "glioblastoma t2", "granuloma t1", "granuloma t1c+", "granuloma t2", "meduloblastoma t1", "meduloblastoma t1c+", "meduloblastoma t2", "meningioma t1", "meningioma t1c+", "meningioma t2", "neurocitoma t1", "neurocitoma t1c+", "neurocitoma t2", "oligodendroglioma t1", "oligodendroglioma t1c+", "oligodendroglioma t2", "papiloma t1", "papiloma t1c+", "papiloma t2", "schwannoma t1", "schwannoma t1c+", "schwannoma t2", "t1 tuberculoma", "t1 _normal", "t1c+ tuberculoma", "t2 tuberculoma", "t2 _normal" ]
mjbmjb/vit-base-oxford-iiit-pets
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# vit-base-oxford-iiit-pets

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1861
- Accuracy: 0.9459

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.391         | 1.0   | 370  | 0.3147          | 0.9188   |
| 0.2372        | 2.0   | 740  | 0.2336          | 0.9296   |
| 0.1759        | 3.0   | 1110 | 0.2081          | 0.9364   |
| 0.1369        | 4.0   | 1480 | 0.1964          | 0.9378   |
| 0.1154        | 5.0   | 1850 | 0.1951          | 0.9391   |

### Framework versions

- Transformers 4.42.3
- Pytorch 2.2.1+cu118
- Datasets 2.16.1
- Tokenizers 0.19.1
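Since [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) ships a 1000-class ImageNet head, fine-tuning it on the 37 Oxford-IIIT pet breeds requires swapping that head. The sketch below shows the common `transformers` pattern for doing so; it is an illustration, not necessarily the exact script behind this card.

```python
# Common head-swap pattern when fine-tuning an ImageNet-classified ViT on a
# new label set; mirrors standard transformers usage, not the author's script.
from transformers import ViTForImageClassification

model = ViTForImageClassification.from_pretrained(
    "google/vit-base-patch16-224",
    num_labels=37,                 # 37 breeds in pcuenq/oxford-pets
    ignore_mismatched_sizes=True,  # discard the 1000-class ImageNet head
)
```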
[ "siamese", "birman", "shiba inu", "staffordshire bull terrier", "basset hound", "bombay", "japanese chin", "chihuahua", "german shorthaired", "pomeranian", "beagle", "english cocker spaniel", "american pit bull terrier", "ragdoll", "persian", "egyptian mau", "miniature pinscher", "sphynx", "maine coon", "keeshond", "yorkshire terrier", "havanese", "leonberger", "wheaten terrier", "american bulldog", "english setter", "boxer", "newfoundland", "bengal", "samoyed", "british shorthair", "great pyrenees", "abyssinian", "pug", "saint bernard", "russian blue", "scottish terrier" ]
Ganesh-KSV/vit-face-recognition-1
## Model Details

- Model type: Vision Transformer (ViT) for image classification
- Finetuned from model: google/vit-base-patch16-384

## Uses

Image classification based on facial features from the dataset. Link: https://www.kaggle.com/datasets/bhaveshmittal/celebrity-face-recognition-dataset

### Downstream Use

Fine-tuning for other image classification tasks. Transfer learning for related vision tasks.

### Out-of-Scope Use

Tasks unrelated to image classification. Sensitive applications without proper evaluation of biases and limitations.

## Bias, Risks, and Limitations

Potential biases in the training dataset may affect model predictions. The model may not generalize to populations or image conditions not represented in the training data, and misclassification carries risk in critical applications.

### Recommendations

Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. It is recommended to evaluate the model's performance on your specific data before deploying it in a production environment.

## How to Get Started with the Model

Use the code below to get started with the model.

```python
import torch
from transformers import ViTForImageClassification, ViTImageProcessor

# Load the fine-tuned checkpoint and its matching image processor.
model = ViTForImageClassification.from_pretrained("Ganesh-KSV/face-recognition-version1")
processor = ViTImageProcessor.from_pretrained("Ganesh-KSV/face-recognition-version1")

def predict(image):
    # Resize and normalize the image, then return the predicted class index.
    inputs = processor(images=image, return_tensors="pt")
    outputs = model(**inputs)
    logits = outputs.logits
    predictions = torch.argmax(logits, dim=-1)
    return predictions
```

## Training Details

### Training Procedure

Preprocessing: images were resized, augmented (rotation, color jitter, etc.), and normalized.

### Training Hyperparameters

- Optimizer: Adam with learning rate 2e-5 and weight decay 1e-2
- Scheduler: StepLR with step size 2 and gamma 0.5
- Loss function: CrossEntropyLoss
- Epochs: 40
- Batch size: 4

## Evaluation

### Testing Data, Factors & Metrics

#### Testing Data

Validation split of the VGGFace dataset.

#### Factors

Performance evaluated based on loss and accuracy on the validation set.

#### Metrics

Loss and accuracy metrics for each epoch.

### Results

Training and validation loss and accuracy plotted for 40 epochs. Confusion matrix generated for the final validation results.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6667e064e83d48c18fdaa52a/b6O3VQbi49cHe7iQNPmDy.png)

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6667e064e83d48c18fdaa52a/MAqqm9lYhUgxKI8kC95Rc.png)

## Model Examination

Model performance was examined through loss and accuracy plots and a confusion matrix.

#### Glossary

- ViT: Vision Transformer
- CrossEntropyLoss: a loss function used for classification tasks
- Adam: an optimization algorithm
- StepLR: a learning rate scheduler that decays the learning rate by a factor every few epochs
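As a usage note (not part of the original card), the `predict` helper from the How to Get Started section can be applied to a single image as follows; the file name is a placeholder, and `id2label` comes from the checkpoint's config.

```python
# Hedged usage sketch for the predict() helper defined above.
# "celebrity.jpg" is a placeholder path.
from PIL import Image

image = Image.open("celebrity.jpg").convert("RGB")
pred_id = predict(image).item()           # single image -> single class index
print(model.config.id2label[pred_id])     # map the index back to a name
```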
[ "angelina jolie", "brad pitt", "chris evans", "chris hemsworth", "denzel washington", "hugh jackman", "jennifer lawrence", "johnny depp", "kate winslet", "leonardo dicaprio", "megan fox", "natalie portman", "nicole kidman", "robert downey jr", "sandra bullock", "scarlett johansson", "tom cruise", "tom hanks", "will smith" ]
skutaada/my_awesome_food_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# my_awesome_food_model

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6227
- Accuracy: 0.893

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.7022        | 0.992 | 62   | 2.5266          | 0.832    |
| 1.8737        | 2.0   | 125  | 1.7892          | 0.869    |
| 1.6387        | 2.976 | 186  | 1.6227          | 0.893    |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
skutaada/VIT-VGGFace
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ViT-VGGFace

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8361
- Accuracy: 0.8306

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 96
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 1.6402        | 0.9982 | 416  | 1.7366          | 0.7127   |
| 1.1955        | 1.9988 | 833  | 1.2342          | 0.7782   |
| 0.9051        | 2.9994 | 1250 | 1.0314          | 0.8023   |
| 0.7446        | 4.0    | 1667 | 0.9074          | 0.8172   |
| 0.8081        | 4.9910 | 2080 | 0.8361          | 0.8306   |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+rocm6.0
- Datasets 2.20.0
- Tokenizers 0.19.1
[ "n000001", "n000002", "n000011", "n000101", "n001011", "n001012", "n001014", "n001015", "n001016", "n001017", "n001018", "n001019", "n001021", "n001023", "n000102", "n001024", "n001025", "n001026", "n001027", "n001028", "n001029", "n001030", "n001031", "n001032", "n001033", "n000103", "n001034", "n001035", "n001036", "n001037", "n001038", "n001039", "n001040", "n001041", "n001042", "n001043", "n000104", "n001044", "n001045", "n001046", "n001047", "n001048", "n001049", "n001050", "n001051", "n001052", "n001053", "n000105", "n001054", "n001055", "n001056", "n001057", "n001058", "n001059", "n001060", "n001061", "n001062", "n001063", "n000106", "n001064", "n001065", "n001066", "n001067", "n001068", "n001069", "n001070", "n001071", "n001072", "n001073", "n000107", "n001074", "n001075", "n001076", "n001077", "n001078", "n001079", "n001080", "n001081", "n001082", "n001083", "n000108", "n001084", "n001085", "n001086", "n001087", "n001088", "n001089", "n001090", "n001091", "n001092", "n001093", "n000109", "n001094", "n001095", "n001096", "n001097", "n001098", "n001099", "n001100", "n001101", "n001102", "n001103", "n000110", "n001104", "n001105", "n001106", "n001108", "n001109", "n001110", "n001111", "n001112", "n001113", "n001114", "n000012", "n000111", "n001115", "n001116", "n001117", "n001118", "n001119", "n001120", "n001121", "n001122", "n001123", "n001124", "n000112", "n001125", "n001126", "n001127", "n001128", "n001129", "n001130", "n001131", "n001132", "n001133", "n001134", "n000113", "n001135", "n001136", "n001137", "n001138", "n001139", "n001140", "n001141", "n001142", "n001143", "n001144", "n000114", "n001145", "n001146", "n001147", "n001148", "n001149", "n001150", "n001151", "n001152", "n001153", "n001154", "n000115", "n001155", "n001156", "n001157", "n001158", "n001159", "n001160", "n001161", "n001162", "n001163", "n001164", "n000116", "n001165", "n001166", "n001167", "n001168", "n001169", "n001170", "n001171", "n001172", "n001173", "n001174", "n000117", "n001175", "n001176", "n001177", "n001178", "n001179", "n001180", "n001181", "n001182", "n001183", "n001184", "n000118", "n001185", "n001186", "n001187", "n001188", "n001189", "n001190", "n001191", "n001192", "n001193", "n001194", "n000119", "n001195", "n001196", "n001197", "n001198", "n001199", "n001200", "n001201", "n001202", "n001203", "n001204", "n000120", "n001205", "n001206", "n001207", "n001208", "n001209", "n001210", "n001211", "n001212", "n001213", "n001214", "n000013", "n000121", "n001215", "n001216", "n001217", "n001218", "n001219", "n001220", "n001221", "n001222", "n001223", "n001224", "n000122", "n001225", "n001226", "n001227", "n001228", "n001229", "n001230", "n001231", "n001232", "n001233", "n001234", "n000123", "n001235", "n001236", "n001237", "n001238", "n001239", "n001240", "n001241", "n001242", "n001243", "n001244", "n000124", "n001245", "n001246", "n001247", "n001248", "n001249", "n001250", "n001251", "n001252", "n001253", "n001254", "n000125", "n001255", "n001256", "n001257", "n001258", "n001259", "n001260", "n001261", "n001262", "n001263", "n001264", "n000126", "n001265", "n001266", "n001267", "n001268", "n001269", "n001270", "n001271", "n001272", "n001273", "n001274", "n000127", "n001275", "n001276", "n001277", "n001278", "n001279", "n001280", "n001281", "n001282", "n001283", "n001284", "n000128", "n001285", "n001286", "n001287", "n001288", "n001289", "n001290", "n001291", "n001292", "n001293", "n001294", "n000129", "n001295", "n001296", "n001297", "n001298", "n001299", "n001300", "n001301", "n001302", "n001303", 
"n001304", "n000130", "n001305", "n001306", "n001307", "n001308", "n001309", "n001310", "n001311", "n001312", "n001313", "n001314", "n000014", "n000131", "n001315", "n001316", "n001317", "n001319", "n001320", "n001321", "n001322", "n001323", "n001325", "n001326", "n000132", "n001327", "n001328", "n001329", "n001330", "n001331", "n001332", "n001333", "n001334", "n001335", "n001336", "n000133", "n001337", "n001338", "n001339", "n001340", "n001341", "n001342", "n001343", "n001344", "n001345", "n001346", "n000134", "n001347", "n001348", "n001349", "n001350", "n001351", "n001352", "n001353", "n001354", "n001355", "n001356", "n000135", "n001357", "n001358", "n001359", "n001360", "n001361", "n001362", "n001363", "n001364", "n001365", "n001366", "n000136", "n001367", "n001368", "n001369", "n001370", "n001371", "n001372", "n001373", "n001374", "n001375", "n001376", "n000137", "n001377", "n001378", "n001379", "n001380", "n001381", "n001382", "n001383", "n001384", "n001385", "n001386", "n000138", "n001387", "n001388", "n001389", "n001390", "n001391", "n001392", "n001393", "n001394", "n001395", "n001396", "n000139", "n001397", "n001398", "n001399", "n001400", "n001401", "n001402", "n001403", "n001404", "n001405", "n001406", "n000140", "n001407", "n001408", "n001409", "n001410", "n001411", "n001412", "n001413", "n001414", "n001415", "n001416", "n000015", "n000141", "n001417", "n001418", "n001419", "n001420", "n001421", "n001422", "n001423", "n001424", "n001425", "n001426", "n000142", "n001427", "n001428", "n001429", "n001430", "n001431", "n001432", "n001433", "n001434", "n001435", "n001436", "n000143", "n001437", "n001438", "n001439", "n001440", "n001441", "n001442", "n001443", "n001444", "n001445", "n001446", "n000144", "n001447", "n001448", "n001449", "n001450", "n001451", "n001452", "n001453", "n001454", "n001455", "n001456", "n000145", "n001457", "n001458", "n001459", "n001460", "n001461", "n001462", "n001463", "n001464", "n001465", "n001466", "n000146", "n001467", "n001468", "n001469", "n001470", "n001471", "n001472", "n001473", "n001474", "n001475", "n001476", "n000148", "n001477", "n001478", "n001479", "n001480", "n001482", "n001483", "n001484", "n001485", "n001486", "n001487", "n000149", "n001488", "n001489", "n001490", "n001491", "n001492", "n001493", "n001494", "n001495", "n001496", "n001497", "n000150", "n001498", "n001499", "n001500", "n001501", "n001502", "n001503", "n001504", "n001505", "n001506", "n001507", "n000151", "n001508", "n001509", "n001510", "n001511", "n001512", "n001513", "n001514", "n001515", "n001516", "n001517", "n000016", "n000152", "n001518", "n001519", "n001520", "n001521", "n001522", "n001523", "n001525", "n001526", "n001528", "n001529", "n000154", "n001530", "n001531", "n001532", "n001533", "n001534", "n001535", "n001536", "n001537", "n001538", "n001539", "n000155", "n001540", "n001541", "n001542", "n001543", "n001544", "n001545", "n001546", "n001547", "n001548", "n001549", "n000156", "n001550", "n001551", "n001552", "n001553", "n001554", "n001555", "n001556", "n001557", "n001558", "n001559", "n000157", "n001560", "n001561", "n001562", "n001563", "n001564", "n001565", "n001566", "n001567", "n001568", "n001569", "n000158", "n001570", "n001571", "n001572", "n001573", "n001574", "n001575", "n001576", "n001577", "n001578", "n001579", "n000159", "n001580", "n001581", "n001582", "n001583", "n001584", "n001585", "n001586", "n001587", "n001588", "n001589", "n000160", "n001590", "n001591", "n001592", "n001593", "n001594", "n001595", "n001596", "n001597", "n001598", "n001599", 
"n000161", "n001600", "n001601", "n001602", "n001603", "n001604", "n001605", "n001606", "n001607", "n001608", "n001609", "n000162", "n001610", "n001611", "n001612", "n001613", "n001614", "n001615", "n001616", "n001617", "n001618", "n001619", "n000017", "n000163", "n001620", "n001621", "n001622", "n001623", "n001624", "n001625", "n001626", "n001627", "n001628", "n001629", "n000164", "n001630", "n001631", "n001632", "n001633", "n001634", "n001635", "n001636", "n001637", "n001638", "n001639", "n000165", "n001640", "n001641", "n001642", "n001643", "n001644", "n001645", "n001646", "n001647", "n001648", "n001649", "n000166", "n001650", "n001651", "n001652", "n001653", "n001654", "n001655", "n001656", "n001657", "n001658", "n001659", "n000167", "n001660", "n001661", "n001662", "n001663", "n001664", "n001665", "n001666", "n001667", "n001668", "n001670", "n000168", "n001671", "n001672", "n001673", "n001674", "n001675", "n001676", "n001677", "n001678", "n001679", "n001680", "n000169", "n001681", "n001682", "n001683", "n001684", "n001685", "n001686", "n001687", "n001688", "n001689", "n001690", "n000170", "n001691", "n001692", "n001693", "n001694", "n001695", "n001696", "n001697", "n001698", "n001699", "n001700", "n000171", "n001701", "n001702", "n001703", "n001704", "n001705", "n001706", "n001707", "n001708", "n001709", "n001710", "n000172", "n001711", "n001712", "n001713", "n001714", "n001715", "n001716", "n001717", "n001718", "n001719", "n001720", "n000018", "n000173", "n001721", "n001722", "n001723", "n001724", "n001725", "n001726", "n001727", "n001728", "n001729", "n001730", "n000174", "n001731", "n001732", "n001733", "n001734", "n001735", "n001736", "n001737", "n001738", "n001739", "n001740", "n000175", "n001741", "n001742", "n001743", "n001744", "n001745", "n001746", "n001747", "n001748", "n001749", "n001750", "n000176", "n001751", "n001752", "n001753", "n001754", "n001755", "n001756", "n001757", "n001758", "n001759", "n001760", "n000177", "n001761", "n001762", "n001763", "n001764", "n001765", "n001766", "n001767", "n001768", "n001769", "n001770", "n000178", "n001771", "n001772", "n001773", "n001774", "n001775", "n001776", "n001777", "n001778", "n001779", "n001780", "n000179", "n001781", "n001782", "n001783", "n001784", "n001785", "n001786", "n001787", "n001788", "n001789", "n001790", "n000180", "n001791", "n001792", "n001793", "n001794", "n001795", "n001796", "n001797", "n001798", "n001799", "n001800", "n000181", "n001801", "n001802", "n001803", "n001804", "n001805", "n001806", "n001807", "n001808", "n001809", "n001810", "n000182", "n001811", "n001812", "n001813", "n001814", "n001815", "n001816", "n001817", "n001818", "n001820", "n001821", "n000019", "n000183", "n001822", "n001823", "n001824", "n001825", "n001826", "n001827", "n001828", "n001829", "n001830", "n001831", "n000184", "n001832", "n001833", "n001834", "n001835", "n001836", "n001837", "n001838", "n001839", "n001840", "n001841", "n000185", "n001842", "n001843", "n001844", "n001845", "n001846", "n001847", "n001848", "n001849", "n001850", "n001851", "n000186", "n001852", "n001853", "n001854", "n001855", "n001856", "n001857", "n001858", "n001859", "n001860", "n001861", "n000187", "n001862", "n001863", "n001864", "n001865", "n001866", "n001867", "n001868", "n001869", "n001870", "n001871", "n000188", "n001872", "n001873", "n001874", "n001875", "n001876", "n001877", "n001878", "n001879", "n001880", "n001881", "n000189", "n001882", "n001883", "n001884", "n001885", "n001886", "n001887", "n001888", "n001889", "n001890", "n001891", "n000190", 
"n001892", "n001893", "n001894", "n001895", "n001896", "n001897", "n001898", "n001899", "n001900", "n001901", "n000191", "n001902", "n001903", "n001904", "n001905", "n001906", "n001907", "n001908", "n001909", "n001910", "n001911", "n000192", "n001912", "n001913", "n001914", "n001915", "n001916", "n001917", "n001918", "n001919", "n001920", "n001921", "n000020", "n000193", "n001922", "n001923", "n001924", "n001925", "n001926", "n001927", "n001928", "n001929", "n001930", "n001931", "n000194", "n001932", "n001933", "n001934", "n001935", "n001936", "n001937", "n001938", "n001939", "n001940", "n001941", "n000195", "n001942", "n001943", "n001944", "n001945", "n001946", "n001947", "n001948", "n001949", "n001950", "n001951", "n000196", "n001952", "n001953", "n001954", "n001955", "n001956", "n001957", "n001958", "n001959", "n001960", "n001961", "n000197", "n001962", "n001963", "n001964", "n001965", "n001966", "n001967", "n001968", "n001970", "n001971", "n001972", "n000198", "n001973", "n001974", "n001975", "n001976", "n001978", "n001979", "n001980", "n001981", "n001982", "n001983", "n000199", "n001984", "n001985", "n001986", "n001987", "n001988", "n001989", "n001990", "n001991", "n001992", "n001993", "n000201", "n001994", "n001995", "n001996", "n001998", "n001999", "n002000", "n002001", "n002002", "n002003", "n002004", "n000202", "n002005", "n002006", "n002007", "n002008", "n002009", "n002010", "n002011", "n002012", "n002013", "n002014", "n000203", "n002015", "n002016", "n002017", "n002018", "n002019", "n002020", "n002021", "n002022", "n002023", "n002024", "n000003", "n000021", "n000204", "n002025", "n002026", "n002027", "n002028", "n002029", "n002031", "n002032", "n002033", "n002034", "n002035", "n000205", "n002036", "n002037", "n002038", "n002039", "n002040", "n002041", "n002042", "n002043", "n002044", "n002045", "n000206", "n002046", "n002047", "n002048", "n002049", "n002050", "n002051", "n002052", "n002053", "n002054", "n002055", "n000207", "n002056", "n002057", "n002058", "n002059", "n002060", "n002061", "n002062", "n002063", "n002064", "n002065", "n000208", "n002066", "n002067", "n002068", "n002069", "n002070", "n002071", "n002072", "n002073", "n002074", "n002075", "n000209", "n002076", "n002077", "n002078", "n002079", "n002080", "n002081", "n002082", "n002083", "n002084", "n002085", "n000210", "n002086", "n002087", "n002088", "n002089", "n002090", "n002091", "n002092", "n002093", "n002094", "n002095", "n000211", "n002096", "n002097", "n002098", "n002099", "n002100", "n002101", "n002102", "n002103", "n002104", "n002105", "n000212", "n002106", "n002107", "n002108", "n002109", "n002110", "n002111", "n002112", "n002113", "n002114", "n002115", "n000213", "n002116", "n002117", "n002118", "n002119", "n002120", "n002121", "n002122", "n002123", "n002124", "n002125", "n000022", "n000214", "n002126", "n002127", "n002128", "n002129", "n002130", "n002131", "n002132", "n002133", "n002134", "n002135", "n000215", "n002136", "n002137", "n002138", "n002139", "n002140", "n002141", "n002142", "n002143", "n002144", "n002145", "n000216", "n002146", "n002147", "n002148", "n002149", "n002150", "n002151", "n002152", "n002153", "n002154", "n002155", "n000217", "n002156", "n002157", "n002158", "n002159", "n002160", "n002161", "n002162", "n002163", "n002164", "n002165", "n000218", "n002166", "n002167", "n002168", "n002169", "n002170", "n002171", "n002172", "n002173", "n002174", "n002175", "n000219", "n002176", "n002177", "n002178", "n002179", "n002180", "n002182", "n002183", "n002184", "n002185", "n002186", "n000220", 
"n002187", "n002188", "n002189", "n002190", "n002191", "n002192", "n002193", "n002194", "n002195", "n002196", "n000221", "n002197", "n002198", "n002199", "n002200", "n002201", "n002202", "n002203", "n002204", "n002205", "n002206", "n000222", "n002207", "n002208", "n002209", "n002210", "n002211", "n002212", "n002213", "n002214", "n002215", "n002216", "n000223", "n002217", "n002218", "n002219", "n002220", "n002221", "n002222", "n002223", "n002224", "n002225", "n002226", "n000023", "n000224", "n002227", "n002228", "n002229", "n002230", "n002231", "n002232", "n002233", "n002234", "n002235", "n002236", "n000225", "n002237", "n002238", "n002239", "n002240", "n002241", "n002242", "n002243", "n002244", "n002245", "n002246", "n000226", "n002247", "n002248", "n002249", "n002250", "n002251", "n002252", "n002253", "n002254", "n002255", "n002256", "n000227", "n002257", "n002258", "n002259", "n002260", "n002261", "n002262", "n002263", "n002265", "n002266", "n002267", "n000228", "n002268", "n002269", "n002270", "n002271", "n002272", "n002273", "n002274", "n002275", "n002276", "n002277", "n000229", "n002278", "n002279", "n002280", "n002281", "n002282", "n002283", "n002284", "n002285", "n002286", "n002287", "n000230", "n002288", "n002289", "n002290", "n002291", "n002292", "n002293", "n002294", "n002295", "n002296", "n002297", "n000231", "n002298", "n002299", "n002300", "n002301", "n002302", "n002303", "n002304", "n002305", "n002306", "n002307", "n000232", "n002308", "n002309", "n002310", "n002311", "n002312", "n002313", "n002314", "n002315", "n002316", "n002317", "n000233", "n002318", "n002319", "n002320", "n002321", "n002322", "n002323", "n002324", "n002325", "n002326", "n002327", "n000024", "n000234", "n002328", "n002329", "n002330", "n002331", "n002332", "n002333", "n002334", "n002335", "n002336", "n002337", "n000235", "n002338", "n002339", "n002340", "n002341", "n002342", "n002343", "n002344", "n002345", "n002346", "n002347", "n000236", "n002348", "n002349", "n002350", "n002351", "n002352", "n002353", "n002354", "n002355", "n002356", "n002357", "n000237", "n002358", "n002359", "n002360", "n002361", "n002362", "n002363", "n002364", "n002365", "n002366", "n002367", "n000238", "n002368", "n002369", "n002370", "n002371", "n002372", "n002373", "n002374", "n002375", "n002376", "n002377", "n000239", "n002378", "n002379", "n002380", "n002381", "n002382", "n002383", "n002384", "n002386", "n002387", "n002388", "n000240", "n002389", "n002390", "n002391", "n002392", "n002393", "n002394", "n002395", "n002396", "n002397", "n002398", "n000241", "n002399", "n002400", "n002401", "n002402", "n002403", "n002404", "n002405", "n002406", "n002407", "n002408", "n000242", "n002409", "n002410", "n002411", "n002412", "n002413", "n002414", "n002415", "n002416", "n002417", "n002418", "n000243", "n002419", "n002420", "n002421", "n002422", "n002423", "n002424", "n002425", "n002426", "n002427", "n002428", "n000025", "n000244", "n002430", "n002431", "n002432", "n002433", "n002434", "n002435", "n002436", "n002437", "n002438", "n002439", "n000245", "n002440", "n002441", "n002442", "n002443", "n002444", "n002445", "n002446", "n002447", "n002448", "n002449", "n000246", "n002450", "n002451", "n002452", "n002453", "n002454", "n002455", "n002456", "n002457", "n002458", "n002459", "n000247", "n002460", "n002461", "n002462", "n002463", "n002464", "n002465", "n002466", "n002467", "n002468", "n002469", "n000248", "n002470", "n002471", "n002472", "n002473", "n002474", "n002475", "n002476", "n002477", "n002478", "n002479", "n000249", "n002480", 
"n002481", "n002482", "n002483", "n002484", "n002485", "n002486", "n002487", "n002488", "n002489", "n000250", "n002490", "n002491", "n002492", "n002493", "n002494", "n002495", "n002496", "n002497", "n002498", "n002499", "n000251", "n002500", "n002501", "n002502", "n002503", "n002504", "n002505", "n002506", "n002507", "n002508", "n002509", "n000252", "n002510", "n002512", "n002513", "n002514", "n002515", "n002516", "n002517", "n002518", "n002519", "n002520", "n000253", "n002521", "n002522", "n002523", "n002524", "n002525", "n002526", "n002527", "n002528", "n002529", "n002530", "n000026", "n000254", "n002531", "n002532", "n002533", "n002534", "n002535", "n002536", "n002537", "n002538", "n002539", "n002541", "n000255", "n002542", "n002543", "n002544", "n002545", "n002546", "n002547", "n002548", "n002549", "n002550", "n002551", "n000256", "n002552", "n002553", "n002554", "n002555", "n002556", "n002557", "n002558", "n002559", "n002560", "n002561", "n000257", "n002562", "n002563", "n002564", "n002565", "n002566", "n002567", "n002568", "n002569", "n002570", "n002571", "n000258", "n002572", "n002573", "n002574", "n002575", "n002576", "n002577", "n002578", "n002579", "n002580", "n002581", "n000259", "n002582", "n002583", "n002584", "n002585", "n002586", "n002588", "n002589", "n002590", "n002591", "n002592", "n000260", "n002593", "n002594", "n002595", "n002596", "n002597", "n002598", "n002599", "n002600", "n002601", "n002602", "n000261", "n002603", "n002604", "n002605", "n002606", "n002607", "n002608", "n002609", "n002610", "n002611", "n002612", "n000262", "n002613", "n002614", "n002615", "n002616", "n002617", "n002618", "n002619", "n002620", "n002621", "n002622", "n000263", "n002623", "n002624", "n002625", "n002626", "n002627", "n002628", "n002629", "n002630", "n002631", "n002632", "n000027", "n000264", "n002633", "n002634", "n002635", "n002636", "n002637", "n002638", "n002639", "n002640", "n002641", "n002642", "n000265", "n002643", "n002644", "n002645", "n002646", "n002647", "n002648", "n002649", "n002650", "n002651", "n002652", "n000266", "n002653", "n002654", "n002655", "n002656", "n002657", "n002659", "n002660", "n002661", "n002662", "n002663", "n000267", "n002664", "n002665", "n002666", "n002667", "n002668", "n002669", "n002670", "n002671", "n002672", "n002673", "n000268", "n002674", "n002675", "n002676", "n002677", "n002678", "n002679", "n002680", "n002681", "n002682", "n002683", "n000269", "n002684", "n002685", "n002686", "n002687", "n002688", "n002689", "n002690", "n002691", "n002692", "n002693", "n000270", "n002694", "n002695", "n002696", "n002697", "n002698", "n002699", "n002700", "n002701", "n002702", "n002703", "n000271", "n002704", "n002705", "n002706", "n002707", "n002708", "n002709", "n002710", "n002711", "n002712", "n002713", "n000272", "n002714", "n002715", "n002716", "n002717", "n002718", "n002719", "n002720", "n002721", "n002722", "n002723", "n000273", "n002724", "n002725", "n002726", "n002727", "n002728", "n002729", "n002730", "n002731", "n002732", "n002733", "n000028", "n000274", "n002734", "n002735", "n002736", "n002737", "n002738", "n002739", "n002740", "n002741", "n002742", "n002743", "n000275", "n002744", "n002745", "n002746", "n002747", "n002748", "n002749", "n002750", "n002751", "n002752", "n002753", "n000276", "n002754", "n002755", "n002756", "n002757", "n002758", "n002759", "n002760", "n002761", "n002762", "n002763", "n000277", "n002764", "n002765", "n002766", "n002767", "n002769", "n002770", "n002771", "n002772", "n002773", "n002774", "n000278", "n002775", "n002776", 
"n002777", "n002778", "n002779", "n002780", "n002781", "n002782", "n002783", "n002784", "n000279", "n002785", "n002786", "n002787", "n002788", "n002789", "n002790", "n002791", "n002792", "n002793", "n002794", "n000280", "n002795", "n002796", "n002797", "n002798", "n002799", "n002800", "n002801", "n002802", "n002803", "n002804", "n000281", "n002805", "n002806", "n002807", "n002808", "n002809", "n002811", "n002812", "n002813", "n002814", "n002815", "n000282", "n002816", "n002817", "n002818", "n002819", "n002820", "n002821", "n002822", "n002823", "n002824", "n002825", "n000283", "n002826", "n002827", "n002828", "n002829", "n002830", "n002831", "n002832", "n002833", "n002834", "n002835", "n000029", "n000284", "n002836", "n002837", "n002838", "n002839", "n002840", "n002841", "n002842", "n002843", "n002844", "n002845", "n000285", "n002846", "n002847", "n002848", "n002849", "n002850", "n002851", "n002852", "n002853", "n002854", "n002855", "n000286", "n002856", "n002857", "n002858", "n002859", "n002860", "n002861", "n002862", "n002863", "n002864", "n002865", "n000287", "n002866", "n002867", "n002868", "n002869", "n002870", "n002871", "n002872", "n002873", "n002874", "n002875", "n000288", "n002876", "n002877", "n002878", "n002879", "n002880", "n002881", "n002882", "n002883", "n002884", "n002885", "n000289", "n002886", "n002887", "n002888", "n002889", "n002890", "n002891", "n002892", "n002893", "n002894", "n002895", "n000290", "n002896", "n002897", "n002898", "n002899", "n002900", "n002901", "n002902", "n002903", "n002904", "n002905", "n000291", "n002906", "n002907", "n002908", "n002909", "n002910", "n002911", "n002912", "n002913", "n002914", "n002915", "n000292", "n002916", "n002917", "n002918", "n002919", "n002920", "n002921", "n002922", "n002923", "n002924", "n002925", "n000293", "n002926", "n002927", "n002928", "n002929", "n002930", "n002931", "n002932", "n002933", "n002934", "n002935", "n000030", "n000294", "n002936", "n002937", "n002938", "n002939", "n002940", "n002941", "n002942", "n002943", "n002944", "n002945", "n000295", "n002946", "n002947", "n002948", "n002949", "n002950", "n002951", "n002952", "n002953", "n002954", "n002955", "n000296", "n002956", "n002957", "n002958", "n002959", "n002960", "n002961", "n002962", "n002963", "n002964", "n002965", "n000297", "n002966", "n002967", "n002968", "n002969", "n002970", "n002971", "n002972", "n002973", "n002974", "n002975", "n000298", "n002976", "n002977", "n002978", "n002979", "n002980", "n002981", "n002982", "n002983", "n002984", "n002985", "n000299", "n002986", "n002987", "n002988", "n002989", "n002990", "n002991", "n002992", "n002993", "n002994", "n002995", "n000300", "n002997", "n002998", "n002999", "n003000", "n003001", "n003002", "n003003", "n003004", "n003005", "n003006", "n000301", "n003007", "n003008", "n003009", "n003010", "n003011", "n003012", "n003013", "n003014", "n003015", "n003016", "n000302", "n003017", "n003018", "n003019", "n003020", "n003021", "n003022", "n003023", "n003024", "n003025", "n003026", "n000303", "n003027", "n003028", "n003029", "n003030", "n003031", "n003032", "n003033", "n003034", "n003035", "n003036", "n000004", "n000031", "n000304", "n003037", "n003038", "n003039", "n003040", "n003041", "n003042", "n003043", "n003044", "n003045", "n003046", "n000305", "n003047", "n003048", "n003049", "n003050", "n003051", "n003052", "n003053", "n003054", "n003055", "n003056", "n000306", "n003057", "n003058", "n003059", "n003060", "n003061", "n003062", "n003063", "n003064", "n003065", "n003066", "n000307", "n003067", "n003068", 
"n003069", "n003070", "n003071", "n003072", "n003073", "n003074", "n003075", "n003076", "n000308", "n003077", "n003078", "n003079", "n003080", "n003081", "n003082", "n003083", "n003084", "n003085", "n003086", "n000309", "n003087", "n003088", "n003089", "n003090", "n003091", "n003092", "n003093", "n003094", "n003095", "n003096", "n000310", "n003097", "n003098", "n003099", "n003100", "n003101", "n003102", "n003103", "n003104", "n003105", "n003106", "n000311", "n003107", "n003108", "n003109", "n003110", "n003111", "n003112", "n003113", "n003114", "n003115", "n003116", "n000312", "n003117", "n003118", "n003119", "n003120", "n003121", "n003122", "n003123", "n003124", "n003125", "n003126", "n000313", "n003127", "n003128", "n003129", "n003130", "n003131", "n003132", "n003133", "n003134", "n003135", "n003136", "n000032", "n000314", "n003137", "n003138", "n003139", "n003140", "n003142", "n003143", "n003144", "n003145", "n003146", "n003147", "n000315", "n003148", "n003149", "n003150", "n003151", "n003152", "n003153", "n003154", "n003155", "n003156", "n003157", "n000316", "n003158", "n003159", "n003160", "n003161", "n003162", "n003163", "n003164", "n003165", "n003166", "n003167", "n000317", "n003168", "n003169", "n003170", "n003171", "n003172", "n003173", "n003174", "n003175", "n003176", "n003177", "n000318", "n003178", "n003179", "n003180", "n003181", "n003182", "n003183", "n003184", "n003185", "n003186", "n003187", "n000319", "n003188", "n003189", "n003190", "n003191", "n003192", "n003193", "n003194", "n003195", "n003196", "n003197", "n000320", "n003198", "n003199", "n003200", "n003201", "n003202", "n003203", "n003204", "n003205", "n003206", "n003207", "n000321", "n003208", "n003209", "n003210", "n003211", "n003212", "n003213", "n003214", "n003215", "n003216", "n003217", "n000322", "n003218", "n003219", "n003220", "n003221", "n003222", "n003223", "n003224", "n003225", "n003226", "n003227", "n000323", "n003228", "n003229", "n003231", "n003232", "n003233", "n003234", "n003235", "n003236", "n003237", "n003238", "n000033", "n000324", "n003239", "n003240", "n003241", "n003242", "n003243", "n003244", "n003245", "n003246", "n003247", "n003248", "n000325", "n003249", "n003250", "n003251", "n003252", "n003253", "n003254", "n003255", "n003256", "n003257", "n003258", "n000326", "n003259", "n003260", "n003261", "n003262", "n003263", "n003264", "n003265", "n003266", "n003267", "n003268", "n000327", "n003269", "n003270", "n003271", "n003272", "n003273", "n003274", "n003275", "n003276", "n003277", "n003278", "n000328", "n003279", "n003280", "n003281", "n003282", "n003283", "n003284", "n003285", "n003286", "n003287", "n003288", "n000329", "n003289", "n003290", "n003291", "n003292", "n003293", "n003294", "n003295", "n003296", "n003297", "n003298", "n000330", "n003299", "n003300", "n003301", "n003302", "n003303", "n003304", "n003305", "n003306", "n003307", "n003308", "n000331", "n003310", "n003311", "n003312", "n003313", "n003314", "n003315", "n003316", "n003317", "n003318", "n003319", "n000332", "n003320", "n003321", "n003322", "n003323", "n003324", "n003325", "n003326", "n003327", "n003328", "n003329", "n000333", "n003330", "n003331", "n003332", "n003333", "n003334", "n003335", "n003336", "n003337", "n003338", "n003339", "n000034", "n000334", "n003340", "n003341", "n003342", "n003343", "n003344", "n003345", "n003346", "n003347", "n003348", "n003349", "n000335", "n003350", "n003351", "n003352", "n003353", "n003354", "n003355", "n003356", "n003357", "n003358", "n003359", "n000336", "n003360", "n003361", "n003362", 
"n003363", "n003364", "n003365", "n003366", "n003367", "n003368", "n003369", "n000337", "n003370", "n003371", "n003372", "n003373", "n003374", "n003375", "n003376", "n003377", "n003378", "n003379", "n000338", "n003380", "n003381", "n003382", "n003383", "n003384", "n003385", "n003386", "n003387", "n003388", "n003389", "n000339", "n003390", "n003391", "n003392", "n003393", "n003394", "n003395", "n003396", "n003397", "n003398", "n003399", "n000340", "n003400", "n003401", "n003402", "n003403", "n003404", "n003405", "n003406", "n003407", "n003408", "n003409", "n000341", "n003410", "n003411", "n003412", "n003413", "n003414", "n003415", "n003416", "n003417", "n003418", "n003419", "n000342", "n003420", "n003421", "n003422", "n003423", "n003424", "n003425", "n003426", "n003427", "n003428", "n003429", "n000343", "n003430", "n003431", "n003432", "n003433", "n003434", "n003435", "n003436", "n003437", "n003438", "n003439", "n000035", "n000344", "n003440", "n003441", "n003442", "n003443", "n003444", "n003445", "n003446", "n003447", "n003448", "n003449", "n000345", "n003450", "n003451", "n003452", "n003453", "n003454", "n003455", "n003456", "n003457", "n003458", "n003459", "n000346", "n003460", "n003461", "n003462", "n003463", "n003464", "n003465", "n003466", "n003467", "n003468", "n003469", "n000347", "n003470", "n003471", "n003472", "n003473", "n003474", "n003475", "n003476", "n003477", "n003478", "n003479", "n000348", "n003481", "n003482", "n003483", "n003484", "n003485", "n003486", "n003487", "n003488", "n003489", "n003490", "n000349", "n003491", "n003492", "n003493", "n003494", "n003495", "n003496", "n003497", "n003498", "n003499", "n003500", "n000350", "n003501", "n003502", "n003503", "n003504", "n003505", "n003506", "n003507", "n003508", "n003509", "n003510", "n000351", "n003511", "n003512", "n003513", "n003514", "n003515", "n003516", "n003517", "n003518", "n003519", "n003520", "n000352", "n003521", "n003522", "n003523", "n003524", "n003525", "n003526", "n003527", "n003528", "n003529", "n003530", "n000353", "n003531", "n003532", "n003533", "n003534", "n003536", "n003537", "n003538", "n003539", "n003540", "n003541", "n000036", "n000354", "n003542", "n003543", "n003544", "n003545", "n003546", "n003547", "n003548", "n003549", "n003550", "n003551", "n000355", "n003552", "n003553", "n003554", "n003555", "n003556", "n003557", "n003558", "n003559", "n003560", "n003561", "n000356", "n003562", "n003563", "n003564", "n003565", "n003566", "n003567", "n003568", "n003569", "n003570", "n003571", "n000357", "n003572", "n003573", "n003574", "n003575", "n003576", "n003577", "n003578", "n003579", "n003580", "n003581", "n000358", "n003582", "n003583", "n003584", "n003585", "n003586", "n003587", "n003588", "n003590", "n003591", "n003593", "n000359", "n003594", "n003595", "n003596", "n003597", "n003598", "n003599", "n003600", "n003601", "n003602", "n003603", "n000360", "n003604", "n003605", "n003606", "n003607", "n003608", "n003609", "n003610", "n003612", "n003613", "n003614", "n000361", "n003615", "n003616", "n003617", "n003618", "n003619", "n003620", "n003621", "n003622", "n003623", "n003624", "n000362", "n003625", "n003626", "n003627", "n003628", "n003629", "n003630", "n003631", "n003632", "n003633", "n003634", "n000363", "n003635", "n003636", "n003637", "n003638", "n003639", "n003640", "n003641", "n003642", "n003643", "n003644", "n000037", "n000364", "n003645", "n003646", "n003647", "n003648", "n003649", "n003650", "n003651", "n003652", "n003653", "n003654", "n000365", "n003655", "n003656", "n003657", "n003658", 
"n003659", "n003660", "n003661", "n003663", "n003664", "n003666", "n000366", "n003667", "n003668", "n003669", "n003670", "n003671", "n003672", "n003673", "n003674", "n003675", "n003676", "n000367", "n003677", "n003678", "n003679", "n003680", "n003681", "n003682", "n003683", "n003684", "n003685", "n003686", "n000368", "n003687", "n003688", "n003689", "n003690", "n003691", "n003692", "n003693", "n003694", "n003695", "n003696", "n000369", "n003697", "n003698", "n003699", "n003700", "n003701", "n003702", "n003703", "n003704", "n003705", "n003706", "n000370", "n003707", "n003708", "n003709", "n003710", "n003711", "n003712", "n003713", "n003714", "n003715", "n003716", "n000371", "n003717", "n003718", "n003719", "n003720", "n003721", "n003722", "n003723", "n003724", "n003725", "n003726", "n000372", "n003727", "n003728", "n003729", "n003730", "n003731", "n003732", "n003733", "n003734", "n003735", "n003736", "n000373", "n003737", "n003738", "n003739", "n003740", "n003741", "n003742", "n003743", "n003744", "n003745", "n003746", "n000038", "n000374", "n003747", "n003748", "n003749", "n003750", "n003751", "n003752", "n003753", "n003754", "n003755", "n003756", "n000375", "n003757", "n003758", "n003759", "n003760", "n003761", "n003762", "n003763", "n003764", "n003766", "n003767", "n000376", "n003768", "n003769", "n003770", "n003771", "n003772", "n003773", "n003774", "n003775", "n003776", "n003777", "n000377", "n003778", "n003779", "n003780", "n003781", "n003782", "n003783", "n003784", "n003785", "n003786", "n003787", "n000378", "n003788", "n003789", "n003790", "n003791", "n003792", "n003793", "n003794", "n003795", "n003796", "n003797", "n000379", "n003798", "n003799", "n003800", "n003801", "n003802", "n003803", "n003805", "n003806", "n003807", "n003808", "n000380", "n003809", "n003810", "n003811", "n003812", "n003813", "n003814", "n003815", "n003816", "n003817", "n003818", "n000381", "n003819", "n003820", "n003821", "n003822", "n003823", "n003824", "n003825", "n003826", "n003827", "n003828", "n000382", "n003829", "n003830", "n003831", "n003832", "n003833", "n003834", "n003835", "n003836", "n003837", "n003838", "n000383", "n003839", "n003840", "n003841", "n003842", "n003843", "n003844", "n003845", "n003846", "n003847", "n003848", "n000039", "n000384", "n003849", "n003850", "n003851", "n003852", "n003853", "n003854", "n003855", "n003856", "n003857", "n003858", "n000385", "n003859", "n003860", "n003861", "n003862", "n003863", "n003864", "n003865", "n003866", "n003867", "n003868", "n000386", "n003869", "n003870", "n003871", "n003872", "n003873", "n003874", "n003875", "n003876", "n003877", "n003878", "n000387", "n003879", "n003880", "n003881", "n003882", "n003883", "n003884", "n003885", "n003886", "n003887", "n003888", "n000388", "n003889", "n003890", "n003891", "n003892", "n003893", "n003894", "n003895", "n003896", "n003897", "n003898", "n000389", "n003899", "n003900", "n003901", "n003902", "n003903", "n003904", "n003905", "n003906", "n003907", "n003908", "n000390", "n003909", "n003910", "n003911", "n003912", "n003913", "n003914", "n003915", "n003916", "n003917", "n003918", "n000391", "n003919", "n003920", "n003921", "n003922", "n003923", "n003924", "n003925", "n003926", "n003927", "n003928", "n000392", "n003929", "n003930", "n003931", "n003932", "n003933", "n003934", "n003935", "n003936", "n003937", "n003938", "n000393", "n003939", "n003940", "n003941", "n003942", "n003943", "n003944", "n003945", "n003946", "n003947", "n003948", "n000040", "n000394", "n003949", "n003950", "n003951", "n003952", "n003953", 
"n003954", "n003955", "n003956", "n003957", "n003958", "n000395", "n003959", "n003960", "n003961", "n003962", "n003963", "n003964", "n003965", "n003966", "n003967", "n003968", "n000396", "n003969", "n003970", "n003971", "n003972", "n003973", "n003974", "n003975", "n003976", "n003977", "n003978", "n000397", "n003979", "n003980", "n003981", "n003982", "n003983", "n003984", "n003985", "n003986", "n003987", "n003988", "n000398", "n003989", "n003990", "n003991", "n003992", "n003993", "n003994", "n003995", "n003996", "n003997", "n003998", "n000399", "n003999", "n004000", "n004001", "n004002", "n004003", "n004004", "n004005", "n004006", "n004007", "n004008", "n000400", "n004009", "n004011", "n004012", "n004013", "n004014", "n004015", "n004016", "n004017", "n004018", "n004019", "n000401", "n004020", "n004021", "n004022", "n004023", "n004024", "n004025", "n004026", "n004027", "n004028", "n004029", "n000402", "n004030", "n004031", "n004032", "n004033", "n004034", "n004035", "n004036", "n004037", "n004038", "n004039", "n000403", "n004040", "n004041", "n004042", "n004043", "n004044", "n004045", "n004046", "n004047", "n004048", "n004049", "n000005", "n000041", "n000405", "n004050", "n004051", "n004052", "n004053", "n004054", "n004055", "n004056", "n004057", "n004058", "n004059", "n000406", "n004060", "n004061", "n004062", "n004063", "n004064", "n004065", "n004066", "n004067", "n004068", "n004069", "n000407", "n004070", "n004071", "n004072", "n004073", "n004074", "n004075", "n004076", "n004077", "n004078", "n004079", "n000408", "n004080", "n004081", "n004082", "n004083", "n004084", "n004085", "n004087", "n004088", "n004089", "n004090", "n000409", "n004091", "n004092", "n004093", "n004094", "n004095", "n004096", "n004097", "n004098", "n004099", "n004100", "n000410", "n004101", "n004102", "n004103", "n004104", "n004105", "n004106", "n004107", "n004108", "n004109", "n004110", "n000411", "n004111", "n004112", "n004113", "n004114", "n004115", "n004116", "n004117", "n004119", "n004120", "n004121", "n000412", "n004122", "n004123", "n004124", "n004125", "n004126", "n004127", "n004128", "n004129", "n004130", "n004131", "n000413", "n004132", "n004133", "n004134", "n004135", "n004136", "n004137", "n004138", "n004139", "n004140", "n004141", "n000414", "n004142", "n004143", "n004144", "n004145", "n004146", "n004147", "n004148", "n004149", "n004150", "n004151", "n000042", "n000415", "n004152", "n004153", "n004154", "n004155", "n004156", "n004157", "n004158", "n004159", "n004160", "n004161", "n000416", "n004162", "n004163", "n004164", "n004165", "n004166", "n004167", "n004168", "n004169", "n004170", "n004171", "n000417", "n004172", "n004173", "n004174", "n004175", "n004176", "n004177", "n004178", "n004179", "n004180", "n004181", "n000418", "n004182", "n004183", "n004184", "n004185", "n004186", "n004187", "n004188", "n004189", "n004190", "n004191", "n000419", "n004192", "n004193", "n004194", "n004195", "n004196", "n004197", "n004198", "n004201", "n004202", "n004203", "n000420", "n004204", "n004205", "n004206", "n004208", "n004209", "n004210", "n004211", "n004212", "n004213", "n004214", "n000421", "n004215", "n004216", "n004217", "n004218", "n004220", "n004221", "n004222", "n004223", "n004224", "n004225", "n000422", "n004226", "n004227", "n004228", "n004229", "n004230", "n004231", "n004232", "n004233", "n004234", "n004235", "n000423", "n004236", "n004237", "n004238", "n004239", "n004240", "n004241", "n004242", "n004243", "n004244", "n004245", "n000424", "n004246", "n004247", "n004248", "n004249", "n004250", "n004251", 
"n004252", "n004253", "n004254", "n004255", "n000043", "n000425", "n004256", "n004257", "n004258", "n004259", "n004260", "n004261", "n004262", "n004263", "n004264", "n004265", "n000426", "n004266", "n004267", "n004268", "n004269", "n004270", "n004271", "n004272", "n004273", "n004274", "n004275", "n000427", "n004276", "n004277", "n004278", "n004279", "n004280", "n004281", "n004282", "n004283", "n004284", "n004285", "n000428", "n004286", "n004287", "n004288", "n004290", "n004291", "n004292", "n004293", "n004294", "n004295", "n004296", "n000429", "n004297", "n004299", "n004300", "n004301", "n004302", "n004303", "n004304", "n004305", "n004306", "n004307", "n000430", "n004308", "n004309", "n004310", "n004311", "n004312", "n004313", "n004314", "n004315", "n004316", "n004317", "n000431", "n004318", "n004319", "n004320", "n004321", "n004322", "n004323", "n004324", "n004325", "n004326", "n004327", "n000432", "n004328", "n004329", "n004330", "n004331", "n004332", "n004333", "n004334", "n004335", "n004336", "n004337", "n000433", "n004338", "n004339", "n004340", "n004341", "n004342", "n004343", "n004344", "n004345", "n004347", "n004348", "n000434", "n004349", "n004350", "n004351", "n004352", "n004353", "n004354", "n004355", "n004356", "n004357", "n004358", "n000044", "n000435", "n004359", "n004360", "n004361", "n004362", "n004363", "n004364", "n004365", "n004366", "n004367", "n004368", "n000436", "n004369", "n004370", "n004371", "n004372", "n004373", "n004374", "n004375", "n004376", "n004377", "n004378", "n000437", "n004379", "n004380", "n004381", "n004382", "n004383", "n004384", "n004385", "n004386", "n004387", "n004388", "n000438", "n004389", "n004390", "n004391", "n004392", "n004393", "n004394", "n004395", "n004396", "n004397", "n004398", "n000439", "n004399", "n004401", "n004402", "n004403", "n004404", "n004405", "n004406", "n004407", "n004408", "n004409", "n000440", "n004410", "n004411", "n004412", "n004413", "n004414", "n004415", "n004416", "n004417", "n004418", "n004419", "n000441", "n004420", "n004421", "n004422", "n004423", "n004424", "n004425", "n004426", "n004427", "n004428", "n004429", "n000442", "n004430", "n004431", "n004432", "n004433", "n004434", "n004435", "n004436", "n004437", "n004438", "n004439", "n000443", "n004440", "n004441", "n004442", "n004443", "n004444", "n004445", "n004446", "n004447", "n004448", "n004449", "n000444", "n004450", "n004451", "n004452", "n004453", "n004454", "n004455", "n004456", "n004457", "n004458", "n004459", "n000045", "n000445", "n004460", "n004461", "n004462", "n004463", "n004464", "n004465", "n004466", "n004467", "n004468", "n004469", "n000446", "n004470", "n004471", "n004472", "n004473", "n004474", "n004475", "n004476", "n004477", "n004478", "n004479", "n000447", "n004480", "n004481", "n004482", "n004483", "n004484", "n004485", "n004486", "n004487", "n004488", "n004489", "n000448", "n004490", "n004491", "n004492", "n004493", "n004494", "n004495", "n004496", "n004497", "n004498", "n004499", "n000449", "n004500", "n004501", "n004502", "n004503", "n004504", "n004505", "n004506", "n004507", "n004508", "n004509", "n000450", "n004510", "n004511", "n004512", "n004513", "n004514", "n004515", "n004516", "n004517", "n004518", "n004519", "n000451", "n004520", "n004521", "n004522", "n004523", "n004524", "n004525", "n004526", "n004527", "n004528", "n004529", "n000452", "n004530", "n004531", "n004532", "n004533", "n004534", "n004535", "n004536", "n004537", "n004538", "n004539", "n000453", "n004540", "n004541", "n004542", "n004543", "n004544", "n004545", "n004546", 
"n004547", "n004548", "n004549", "n000454", "n004550", "n004551", "n004552", "n004553", "n004554", "n004555", "n004556", "n004557", "n004558", "n004559", "n000046", "n000455", "n004560", "n004561", "n004562", "n004563", "n004564", "n004565", "n004566", "n004567", "n004568", "n004569", "n000456", "n004570", "n004571", "n004572", "n004573", "n004574", "n004575", "n004576", "n004577", "n004578", "n004579", "n000457", "n004580", "n004581", "n004582", "n004583", "n004584", "n004585", "n004586", "n004587", "n004588", "n004589", "n000458", "n004590", "n004591", "n004592", "n004593", "n004594", "n004595", "n004596", "n004597", "n004598", "n004599", "n000459", "n004600", "n004601", "n004602", "n004603", "n004604", "n004605", "n004606", "n004607", "n004608", "n004609", "n000460", "n004610", "n004611", "n004612", "n004613", "n004614", "n004615", "n004616", "n004617", "n004618", "n004620", "n000461", "n004621", "n004622", "n004623", "n004624", "n004625", "n004626", "n004627", "n004628", "n004629", "n004630", "n000462", "n004631", "n004632", "n004633", "n004634", "n004635", "n004636", "n004637", "n004638", "n004639", "n004640", "n000463", "n004641", "n004642", "n004643", "n004644", "n004645", "n004646", "n004647", "n004648", "n004649", "n004650", "n000464", "n004651", "n004652", "n004653", "n004654", "n004655", "n004656", "n004657", "n004658", "n004659", "n004661", "n000047", "n000465", "n004662", "n004663", "n004664", "n004665", "n004666", "n004667", "n004668", "n004669", "n004670", "n004671", "n000466", "n004672", "n004673", "n004674", "n004675", "n004676", "n004677", "n004678", "n004679", "n004680", "n004681", "n000467", "n004682", "n004683", "n004684", "n004685", "n004686", "n004687", "n004688", "n004689", "n004690", "n004691", "n000468", "n004692", "n004693", "n004694", "n004695", "n004696", "n004697", "n004698", "n004699", "n004700", "n004701", "n000469", "n004702", "n004703", "n004704", "n004705", "n004706", "n004707", "n004708", "n004709", "n004710", "n004711", "n000470", "n004712", "n004713", "n004714", "n004715", "n004716", "n004717", "n004718", "n004719", "n004720", "n004721", "n000471", "n004722", "n004724", "n004725", "n004726", "n004727", "n004728", "n004729", "n004730", "n004731", "n004732", "n000472", "n004733", "n004734", "n004735", "n004736", "n004737", "n004738", "n004739", "n004740", "n004741", "n004742", "n000473", "n004743", "n004744", "n004745", "n004746", "n004747", "n004748", "n004749", "n004750", "n004751", "n004752", "n000474", "n004753", "n004754", "n004755", "n004756", "n004757", "n004758", "n004759", "n004760", "n004761", "n004762", "n000048", "n000475", "n004763", "n004764", "n004765", "n004766", "n004767", "n004768", "n004769", "n004770", "n004771", "n004772", "n000476", "n004773", "n004774", "n004775", "n004776", "n004777", "n004778", "n004779", "n004780", "n004781", "n004782", "n000477", "n004783", "n004784", "n004785", "n004786", "n004787", "n004788", "n004790", "n004791", "n004792", "n004793", "n000478", "n004794", "n004796", "n004797", "n004798", "n004799", "n004800", "n004801", "n004802", "n004804", "n004805", "n000479", "n004806", "n004807", "n004808", "n004809", "n004810", "n004812", "n004813", "n004814", "n004816", "n004817", "n000480", "n004818", "n004819", "n004820", "n004821", "n004822", "n004823", "n004824", "n004825", "n004826", "n004827", "n000481", "n004828", "n004829", "n004830", "n004831", "n004832", "n004833", "n004834", "n004835", "n004836", "n004837", "n000482", "n004838", "n004839", "n004840", "n004841", "n004842", "n004843", "n004844", "n004845", 
"n004846", "n004847", "n000483", "n004848", "n004849", "n004850", "n004851", "n004852", "n004853", "n004854", "n004855", "n004856", "n004857", "n000484", "n004858", "n004859", "n004860", "n004861", "n004862", "n004863", "n004864", "n004865", "n004866", "n004867", "n000049", "n000485", "n004868", "n004869", "n004870", "n004871", "n004872", "n004873", "n004874", "n004875", "n004876", "n004877", "n000486", "n004878", "n004879", "n004880", "n004881", "n004882", "n004883", "n004884", "n004885", "n004886", "n004887", "n000487", "n004888", "n004889", "n004890", "n004891", "n004892", "n004893", "n004894", "n004895", "n004896", "n004897", "n000488", "n004898", "n004899", "n004900", "n004901", "n004902", "n004903", "n004904", "n004905", "n004906", "n004907", "n000489", "n004908", "n004909", "n004910", "n004911", "n004912", "n004913", "n004914", "n004915", "n004916", "n004917", "n000490", "n004918", "n004919", "n004921", "n004922", "n004923", "n004924", "n004925", "n004926", "n004927", "n004928", "n000491", "n004929", "n004930", "n004931", "n004932", "n004933", "n004934", "n004935", "n004936", "n004937", "n004938", "n000492", "n004939", "n004940", "n004941", "n004942", "n004943", "n004944", "n004945", "n004946", "n004947", "n004948", "n000493", "n004949", "n004950", "n004951", "n004952", "n004953", "n004954", "n004955", "n004956", "n004957", "n004958", "n000494", "n004959", "n004960", "n004961", "n004962", "n004963", "n004964", "n004965", "n004966", "n004967", "n004968", "n000050", "n000495", "n004969", "n004970", "n004971", "n004972", "n004973", "n004974", "n004975", "n004976", "n004977", "n004978", "n000496", "n004979", "n004980", "n004981", "n004982", "n004983", "n004984", "n004985", "n004986", "n004987", "n004988", "n000497", "n004989", "n004990", "n004991", "n004992", "n004993", "n004994", "n004995", "n004996", "n004997", "n004998", "n000498", "n004999", "n005000", "n005001", "n005002", "n005003", "n005004", "n005005", "n005006", "n005007", "n005008", "n000499", "n005009", "n005010", "n005011", "n005012", "n005013", "n005014", "n005015", "n005016", "n005017", "n005018", "n000500", "n005019", "n005020", "n005021", "n005022", "n005023", "n005024", "n005025", "n005026", "n005027", "n005028", "n000501", "n005029", "n005030", "n005031", "n005032", "n005033", "n005034", "n005035", "n005036", "n005037", "n005038", "n000502", "n005039", "n005040", "n005041", "n005042", "n005043", "n005044", "n005045", "n005046", "n005047", "n005048", "n000503", "n005050", "n005051", "n005052", "n005053", "n005054", "n005055", "n005056", "n005057", "n005058", "n005059", "n000504", "n005060", "n005061", "n005062", "n005063", "n005064", "n005065", "n005066", "n005067", "n005069", "n005070", "n000006", "n000051", "n000505", "n005071", "n005072", "n005073", "n005074", "n005075", "n005076", "n005077", "n005078", "n005079", "n005080", "n000506", "n005081", "n005082", "n005083", "n005084", "n005085", "n005086", "n005087", "n005088", "n005089", "n005090", "n000507", "n005091", "n005092", "n005093", "n005094", "n005095", "n005096", "n005097", "n005098", "n005099", "n005100", "n000508", "n005101", "n005102", "n005103", "n005104", "n005105", "n005106", "n005107", "n005108", "n005109", "n005110", "n000509", "n005111", "n005112", "n005113", "n005114", "n005115", "n005116", "n005117", "n005118", "n005119", "n005120", "n000510", "n005121", "n005122", "n005123", "n005124", "n005125", "n005126", "n005127", "n005128", "n005129", "n005130", "n000511", "n005131", "n005132", "n005133", "n005134", "n005135", "n005136", "n005137", "n005138", 
"n005139", "n005140", "n000512", "n005141", "n005142", "n005143", "n005144", "n005145", "n005146", "n005147", "n005148", "n005149", "n005150", "n000513", "n005151", "n005152", "n005153", "n005157", "n005158", "n005159", "n005160", "n005161", "n005162", "n005163", "n000514", "n005164", "n005165", "n005166", "n005167", "n005168", "n005169", "n005170", "n005171", "n005172", "n005173", "n000052", "n000515", "n005174", "n005175", "n005176", "n005177", "n005178", "n005179", "n005180", "n005181", "n005182", "n005183", "n000516", "n005184", "n005185", "n005186", "n005187", "n005188", "n005189", "n005190", "n005191", "n005192", "n005193", "n000517", "n005194", "n005195", "n005196", "n005197", "n005198", "n005199", "n005200", "n005201", "n005202", "n005203", "n000518", "n005204", "n005205", "n005206", "n005207", "n005208", "n005209", "n005210", "n005211", "n005212", "n005213", "n000519", "n005214", "n005215", "n005216", "n005217", "n005218", "n005219", "n005220", "n005221", "n005222", "n005223", "n000520", "n005224", "n005225", "n005226", "n005227", "n005228", "n005229", "n005230", "n005231", "n005232", "n005233", "n000521", "n005234", "n005235", "n005236", "n005237", "n005238", "n005239", "n005240", "n005241", "n005242", "n005243", "n000522", "n005244", "n005245", "n005246", "n005247", "n005248", "n005249", "n005250", "n005251", "n005252", "n005253", "n000524", "n005254", "n005255", "n005256", "n005257", "n005258", "n005259", "n005260", "n005261", "n005262", "n005263", "n000525", "n005264", "n005265", "n005266", "n005267", "n005268", "n005269", "n005270", "n005271", "n005272", "n005273", "n000053", "n000526", "n005274", "n005275", "n005276", "n005277", "n005278", "n005279", "n005280", "n005281", "n005282", "n005283", "n000527", "n005284", "n005285", "n005286", "n005287", "n005288", "n005289", "n005290", "n005291", "n005292", "n005293", "n000528", "n005294", "n005295", "n005296", "n005297", "n005298", "n005299", "n005300", "n005301", "n005302", "n005303", "n000529", "n005304", "n005305", "n005306", "n005307", "n005308", "n005309", "n005310", "n005311", "n005312", "n005313", "n000530", "n005314", "n005315", "n005316", "n005317", "n005318", "n005319", "n005320", "n005321", "n005322", "n005323", "n000531", "n005324", "n005325", "n005326", "n005327", "n005328", "n005329", "n005330", "n005331", "n005332", "n005333", "n000532", "n005334", "n005335", "n005336", "n005337", "n005338", "n005339", "n005340", "n005341", "n005342", "n005343", "n000533", "n005344", "n005345", "n005346", "n005347", "n005348", "n005349", "n005350", "n005351", "n005352", "n005353", "n000534", "n005354", "n005355", "n005356", "n005357", "n005358", "n005359", "n005360", "n005361", "n005362", "n005363", "n000535", "n005364", "n005365", "n005366", "n005367", "n005368", "n005369", "n005370", "n005371", "n005372", "n005373", "n000054", "n000536", "n005374", "n005375", "n005376", "n005377", "n005378", "n005379", "n005380", "n005381", "n005382", "n005383", "n000537", "n005384", "n005385", "n005386", "n005387", "n005388", "n005389", "n005390", "n005391", "n005392", "n005393", "n000538", "n005394", "n005395", "n005396", "n005397", "n005398", "n005399", "n005400", "n005401", "n005402", "n005403", "n000539", "n005404", "n005405", "n005406", "n005407", "n005408", "n005409", "n005410", "n005411", "n005412", "n005413", "n000540", "n005414", "n005415", "n005416", "n005417", "n005418", "n005419", "n005420", "n005421", "n005422", "n005423", "n000541", "n005424", "n005425", "n005426", "n005427", "n005428", "n005429", "n005430", "n005431", "n005432", 
"n005433", "n000542", "n005434", "n005435", "n005436", "n005437", "n005438", "n005439", "n005440", "n005441", "n005442", "n005443", "n000543", "n005444", "n005445", "n005446", "n005447", "n005448", "n005449", "n005450", "n005451", "n005452", "n005453", "n000544", "n005454", "n005455", "n005456", "n005457", "n005458", "n005459", "n005460", "n005461", "n005462", "n005463", "n000545", "n005464", "n005465", "n005466", "n005467", "n005468", "n005469", "n005470", "n005471", "n005472", "n005473", "n000055", "n000546", "n005474", "n005475", "n005476", "n005477", "n005478", "n005479", "n005480", "n005481", "n005482", "n005483", "n000547", "n005484", "n005485", "n005486", "n005487", "n005488", "n005489", "n005490", "n005491", "n005492", "n005493", "n000548", "n005494", "n005495", "n005496", "n005497", "n005498", "n005499", "n005500", "n005501", "n005502", "n005503", "n000549", "n005504", "n005505", "n005506", "n005507", "n005508", "n005509", "n005510", "n005511", "n005512", "n005513", "n000550", "n005514", "n005515", "n005516", "n005517", "n005518", "n005519", "n005520", "n005521", "n005522", "n005523", "n000551", "n005524", "n005525", "n005526", "n005527", "n005528", "n005529", "n005530", "n005531", "n005532", "n005533", "n000552", "n005534", "n005535", "n005536", "n005537", "n005538", "n005539", "n005540", "n005541", "n005542", "n005543", "n000553", "n005544", "n005545", "n005546", "n005547", "n005548", "n005549", "n005550", "n005551", "n005552", "n005553", "n000554", "n005554", "n005555", "n005556", "n005557", "n005558", "n005559", "n005560", "n005561", "n005562", "n005563", "n000555", "n005565", "n005566", "n005567", "n005568", "n005569", "n005570", "n005571", "n005572", "n005573", "n005574", "n000056", "n000556", "n005575", "n005576", "n005577", "n005578", "n005579", "n005580", "n005581", "n005582", "n005583", "n005584", "n000557", "n005585", "n005586", "n005587", "n005588", "n005589", "n005590", "n005591", "n005592", "n005593", "n005594", "n000558", "n005595", "n005596", "n005597", "n005598", "n005599", "n005600", "n005601", "n005602", "n005604", "n005605", "n000559", "n005606", "n005607", "n005608", "n005609", "n005610", "n005611", "n005612", "n005613", "n005614", "n005615", "n000560", "n005616", "n005617", "n005618", "n005619", "n005620", "n005621", "n005622", "n005623", "n005624", "n005625", "n000561", "n005626", "n005627", "n005628", "n005629", "n005630", "n005631", "n005632", "n005633", "n005634", "n005635", "n000562", "n005636", "n005637", "n005638", "n005639", "n005640", "n005641", "n005642", "n005643", "n005644", "n005645", "n000563", "n005646", "n005647", "n005648", "n005649", "n005650", "n005651", "n005652", "n005653", "n005654", "n005655", "n000564", "n005656", "n005657", "n005658", "n005659", "n005660", "n005661", "n005662", "n005663", "n005664", "n005665", "n000565", "n005666", "n005667", "n005668", "n005669", "n005670", "n005671", "n005672", "n005673", "n005674", "n005675", "n000057", "n000566", "n005676", "n005677", "n005678", "n005679", "n005680", "n005681", "n005682", "n005683", "n005684", "n005685", "n000567", "n005686", "n005687", "n005688", "n005689", "n005690", "n005691", "n005692", "n005693", "n005694", "n005695", "n000568", "n005696", "n005697", "n005698", "n005699", "n005700", "n005701", "n005702", "n005703", "n005704", "n005705", "n000569", "n005706", "n005707", "n005708", "n005709", "n005710", "n005711", "n005712", "n005713", "n005714", "n005715", "n000570", "n005716", "n005717", "n005718", "n005719", "n005720", "n005721", "n005722", "n005723", "n005724", "n005725", 
"n000571", "n005726", "n005727", "n005728", "n005729", "n005730", "n005731", "n005732", "n005733", "n005734", "n005735", "n000573", "n005736", "n005737", "n005738", "n005739", "n005740", "n005741", "n005742", "n005743", "n005744", "n005745", "n000574", "n005746", "n005747", "n005748", "n005749", "n005750", "n005751", "n005752", "n005753", "n005754", "n005755", "n000575", "n005756", "n005757", "n005758", "n005759", "n005760", "n005761", "n005762", "n005763", "n005764", "n005765", "n000576", "n005766", "n005767", "n005768", "n005769", "n005770", "n005771", "n005772", "n005773", "n005774", "n005775", "n000058", "n000577", "n005776", "n005777", "n005778", "n005779", "n005780", "n005782", "n005783", "n005785", "n005787", "n005788", "n000578", "n005789", "n005790", "n005791", "n005792", "n005793", "n005795", "n005796", "n005797", "n005798", "n005799", "n000579", "n005800", "n005801", "n005802", "n005803", "n005804", "n005805", "n005806", "n005807", "n005808", "n005809", "n000580", "n005810", "n005811", "n005812", "n005813", "n005814", "n005815", "n005816", "n005817", "n005818", "n005819", "n000581", "n005820", "n005821", "n005822", "n005823", "n005825", "n005826", "n005827", "n005828", "n005829", "n005830", "n000582", "n005831", "n005832", "n005833", "n005834", "n005835", "n005836", "n005837", "n005838", "n005839", "n005840", "n000583", "n005841", "n005842", "n005843", "n005844", "n005845", "n005846", "n005847", "n005848", "n005849", "n005850", "n000584", "n005851", "n005852", "n005853", "n005854", "n005855", "n005857", "n005858", "n005859", "n005860", "n005861", "n000585", "n005862", "n005863", "n005865", "n005866", "n005867", "n005868", "n005869", "n005870", "n005871", "n005872", "n000586", "n005873", "n005874", "n005875", "n005876", "n005877", "n005878", "n005879", "n005880", "n005881", "n005882", "n000059", "n000587", "n005883", "n005884", "n005885", "n005886", "n005887", "n005888", "n005889", "n005890", "n005891", "n005892", "n000588", "n005893", "n005894", "n005895", "n005896", "n005897", "n005898", "n005899", "n005900", "n005901", "n005902", "n000589", "n005903", "n005904", "n005905", "n005906", "n005907", "n005908", "n005909", "n005910", "n005911", "n005912", "n000590", "n005913", "n005914", "n005915", "n005916", "n005917", "n005918", "n005919", "n005920", "n005921", "n005922", "n000591", "n005923", "n005924", "n005925", "n005926", "n005927", "n005928", "n005929", "n005930", "n005931", "n005932", "n000592", "n005933", "n005934", "n005935", "n005936", "n005937", "n005938", "n005939", "n005940", "n005941", "n005942", "n000593", "n005943", "n005944", "n005945", "n005946", "n005947", "n005948", "n005949", "n005950", "n005951", "n005952", "n000594", "n005953", "n005954", "n005955", "n005956", "n005957", "n005958", "n005959", "n005960", "n005961", "n005962", "n000595", "n005963", "n005965", "n005966", "n005967", "n005968", "n005969", "n005970", "n005971", "n005972", "n005973", "n000596", "n005974", "n005975", "n005976", "n005977", "n005978", "n005979", "n005980", "n005981", "n005982", "n005983", "n000060", "n000597", "n005984", "n005985", "n005986", "n005987", "n005988", "n005989", "n005990", "n005991", "n005992", "n005993", "n000598", "n005995", "n005996", "n005997", "n005998", "n005999", "n006000", "n006001", "n006002", "n006003", "n006004", "n000599", "n006005", "n006006", "n006007", "n006008", "n006009", "n006010", "n006011", "n006012", "n006013", "n006014", "n000600", "n006015", "n006016", "n006017", "n006018", "n006019", "n006020", "n006021", "n006022", "n006023", "n006024", "n000601", 
"n006025", "n006026", "n006027", "n006028", "n006029", "n006030", "n006031", "n006032", "n006033", "n006034", "n000602", "n006035", "n006036", "n006037", "n006038", "n006039", "n006040", "n006041", "n006042", "n006043", "n006044", "n000603", "n006045", "n006047", "n006048", "n006049", "n006050", "n006051", "n006052", "n006053", "n006054", "n006055", "n000604", "n006056", "n006057", "n006058", "n006059", "n006060", "n006061", "n006062", "n006063", "n006064", "n006065", "n000605", "n006066", "n006067", "n006068", "n006069", "n006070", "n006071", "n006072", "n006073", "n006074", "n006075", "n000606", "n006076", "n006077", "n006078", "n006079", "n006080", "n006081", "n006082", "n006083", "n006084", "n006085", "n000007", "n000061", "n000607", "n006086", "n006087", "n006088", "n006089", "n006090", "n006091", "n006092", "n006093", "n006094", "n006095", "n000608", "n006096", "n006098", "n006099", "n006100", "n006101", "n006102", "n006103", "n006104", "n006105", "n006106", "n000609", "n006107", "n006108", "n006109", "n006110", "n006111", "n006112", "n006113", "n006114", "n006115", "n006116", "n000610", "n006117", "n006118", "n006119", "n006120", "n006121", "n006122", "n006123", "n006124", "n006125", "n006126", "n000611", "n006127", "n006128", "n006129", "n006130", "n006131", "n006132", "n006133", "n006134", "n006135", "n006136", "n000612", "n006137", "n006138", "n006139", "n006140", "n006141", "n006142", "n006143", "n006144", "n006145", "n006146", "n000613", "n006147", "n006148", "n006149", "n006150", "n006151", "n006152", "n006153", "n006154", "n006155", "n006156", "n000614", "n006157", "n006158", "n006159", "n006160", "n006161", "n006162", "n006163", "n006164", "n006165", "n006166", "n000615", "n006167", "n006169", "n006170", "n006171", "n006172", "n006173", "n006174", "n006175", "n006176", "n006177", "n000616", "n006178", "n006179", "n006180", "n006181", "n006182", "n006183", "n006184", "n006185", "n006186", "n006187", "n000062", "n000618", "n006188", "n006189", "n006190", "n006191", "n006192", "n006193", "n006194", "n006195", "n006196", "n006197", "n000619", "n006198", "n006199", "n006200", "n006201", "n006202", "n006203", "n006204", "n006205", "n006206", "n006207", "n000620", "n006208", "n006209", "n006210", "n006211", "n006212", "n006213", "n006214", "n006215", "n006216", "n006217", "n000621", "n006218", "n006219", "n006220", "n006221", "n006222", "n006223", "n006224", "n006225", "n006226", "n006227", "n000622", "n006228", "n006229", "n006230", "n006231", "n006232", "n006233", "n006234", "n006235", "n006236", "n006237", "n000623", "n006238", "n006239", "n006240", "n006241", "n006242", "n006243", "n006244", "n006245", "n006246", "n006247", "n000624", "n006248", "n006249", "n006250", "n006251", "n006252", "n006253", "n006254", "n006255", "n006256", "n006257", "n000625", "n006258", "n006259", "n006260", "n006261", "n006262", "n006263", "n006264", "n006265", "n006266", "n006267", "n000626", "n006268", "n006269", "n006270", "n006271", "n006272", "n006273", "n006274", "n006275", "n006276", "n006277", "n000627", "n006278", "n006279", "n006280", "n006281", "n006282", "n006283", "n006284", "n006285", "n006286", "n006287", "n000063", "n000628", "n006288", "n006289", "n006290", "n006291", "n006292", "n006293", "n006294", "n006296", "n006297", "n006298", "n000629", "n006300", "n006301", "n006302", "n006303", "n006304", "n006305", "n006306", "n006307", "n006308", "n006309", "n000630", "n006310", "n006311", "n006312", "n006313", "n006314", "n006315", "n006316", "n006317", "n006318", "n006319", "n000631", 
"n006320", "n006321", "n006322", "n006323", "n006324", "n006325", "n006326", "n006327", "n006328", "n006329", "n000632", "n006330", "n006331", "n006332", "n006333", "n006334", "n006335", "n006336", "n006337", "n006338", "n006339", "n000633", "n006340", "n006341", "n006342", "n006343", "n006344", "n006345", "n006346", "n006347", "n006348", "n006349", "n000634", "n006350", "n006351", "n006352", "n006353", "n006354", "n006355", "n006356", "n006357", "n006358", "n006359", "n000635", "n006360", "n006361", "n006362", "n006363", "n006364", "n006365", "n006366", "n006367", "n006368", "n006369", "n000636", "n006370", "n006371", "n006372", "n006373", "n006374", "n006375", "n006376", "n006377", "n006378", "n006379", "n000637", "n006380", "n006381", "n006382", "n006383", "n006384", "n006385", "n006386", "n006387", "n006388", "n006389", "n000064", "n000638", "n006390", "n006391", "n006392", "n006393", "n006394", "n006395", "n006396", "n006397", "n006398", "n006399", "n000639", "n006400", "n006401", "n006402", "n006403", "n006404", "n006405", "n006406", "n006407", "n006408", "n006410", "n000640", "n006411", "n006412", "n006413", "n006414", "n006415", "n006416", "n006417", "n006418", "n006420", "n006421", "n000641", "n006422", "n006423", "n006424", "n006425", "n006426", "n006427", "n006428", "n006429", "n006430", "n006431", "n000642", "n006432", "n006433", "n006434", "n006435", "n006436", "n006437", "n006438", "n006439", "n006440", "n006441", "n000643", "n006442", "n006443", "n006444", "n006445", "n006446", "n006447", "n006448", "n006449", "n006450", "n006451", "n000644", "n006452", "n006453", "n006454", "n006455", "n006456", "n006457", "n006458", "n006459", "n006460", "n006461", "n000645", "n006462", "n006463", "n006464", "n006465", "n006466", "n006467", "n006468", "n006469", "n006470", "n006471", "n000646", "n006473", "n006474", "n006475", "n006476", "n006477", "n006478", "n006479", "n006480", "n006481", "n006482", "n000647", "n006483", "n006484", "n006485", "n006486", "n006487", "n006488", "n006489", "n006490", "n006491", "n006492", "n000065", "n000648", "n006493", "n006494", "n006495", "n006496", "n006497", "n006498", "n006499", "n006500", "n006501", "n006502", "n000649", "n006503", "n006504", "n006505", "n006506", "n006507", "n006508", "n006509", "n006510", "n006511", "n006512", "n000650", "n006513", "n006514", "n006515", "n006516", "n006517", "n006518", "n006519", "n006520", "n006521", "n006522", "n000651", "n006523", "n006524", "n006525", "n006526", "n006527", "n006528", "n006529", "n006530", "n006531", "n006532", "n000652", "n006533", "n006534", "n006535", "n006536", "n006537", "n006538", "n006539", "n006540", "n006541", "n006542", "n000653", "n006543", "n006544", "n006545", "n006546", "n006547", "n006548", "n006549", "n006550", "n006551", "n006552", "n000654", "n006553", "n006554", "n006555", "n006556", "n006557", "n006558", "n006559", "n006560", "n006561", "n006562", "n000655", "n006563", "n006564", "n006565", "n006566", "n006567", "n006568", "n006569", "n006570", "n006571", "n006573", "n000656", "n006574", "n006575", "n006576", "n006577", "n006578", "n006579", "n006580", "n006581", "n006582", "n006583", "n000657", "n006584", "n006585", "n006587", "n006588", "n006589", "n006590", "n006591", "n006592", "n006593", "n006594", "n000066", "n000658", "n006595", "n006596", "n006597", "n006598", "n006599", "n006600", "n006601", "n006602", "n006603", "n006604", "n000659", "n006605", "n006606", "n006607", "n006608", "n006609", "n006610", "n006611", "n006612", "n006613", "n006614", "n000660", "n006615", 
"n006616", "n006617", "n006618", "n006619", "n006620", "n006621", "n006622", "n006623", "n006624", "n000661", "n006625", "n006626", "n006627", "n006628", "n006629", "n006630", "n006631", "n006632", "n006633", "n006634", "n000662", "n006635", "n006636", "n006637", "n006638", "n006639", "n006640", "n006641", "n006642", "n006643", "n006644", "n000663", "n006645", "n006646", "n006647", "n006648", "n006649", "n006650", "n006651", "n006652", "n006653", "n006654", "n000664", "n006655", "n006656", "n006657", "n006658", "n006659", "n006660", "n006661", "n006662", "n006663", "n006665", "n000665", "n006666", "n006667", "n006668", "n006669", "n006670", "n006671", "n006672", "n006673", "n006674", "n006675", "n000666", "n006677", "n006678", "n006679", "n006680", "n006682", "n006683", "n006684", "n006685", "n006686", "n006687", "n000667", "n006688", "n006689", "n006690", "n006691", "n006692", "n006693", "n006694", "n006695", "n006696", "n006697", "n000067", "n000668", "n006698", "n006699", "n006700", "n006701", "n006702", "n006703", "n006704", "n006705", "n006707", "n006708", "n000669", "n006709", "n006710", "n006711", "n006712", "n006713", "n006714", "n006715", "n006716", "n006717", "n006718", "n000670", "n006719", "n006720", "n006721", "n006722", "n006723", "n006724", "n006725", "n006726", "n006727", "n006728", "n000671", "n006729", "n006730", "n006731", "n006732", "n006733", "n006734", "n006735", "n006736", "n006737", "n006738", "n000672", "n006739", "n006740", "n006741", "n006742", "n006743", "n006744", "n006745", "n006746", "n006747", "n006748", "n000673", "n006749", "n006750", "n006751", "n006752", "n006753", "n006754", "n006755", "n006756", "n006757", "n006758", "n000674", "n006759", "n006760", "n006761", "n006762", "n006763", "n006764", "n006765", "n006766", "n006767", "n006768", "n000675", "n006769", "n006770", "n006771", "n006772", "n006773", "n006774", "n006775", "n006776", "n006777", "n006778", "n000676", "n006779", "n006780", "n006781", "n006782", "n006783", "n006784", "n006785", "n006786", "n006787", "n006788", "n000677", "n006789", "n006790", "n006791", "n006792", "n006793", "n006794", "n006795", "n006796", "n006797", "n006798", "n000068", "n000678", "n006799", "n006800", "n006801", "n006802", "n006803", "n006804", "n006805", "n006806", "n006807", "n006808", "n000679", "n006809", "n006810", "n006811", "n006812", "n006813", "n006814", "n006815", "n006816", "n006817", "n006818", "n000680", "n006819", "n006820", "n006821", "n006822", "n006823", "n006824", "n006826", "n006827", "n006828", "n006829", "n000681", "n006830", "n006831", "n006832", "n006833", "n006834", "n006835", "n006836", "n006837", "n006838", "n006839", "n000682", "n006840", "n006841", "n006842", "n006843", "n006844", "n006845", "n006846", "n006847", "n006848", "n006849", "n000683", "n006850", "n006851", "n006852", "n006853", "n006854", "n006855", "n006856", "n006857", "n006858", "n006859", "n000684", "n006860", "n006861", "n006862", "n006863", "n006864", "n006865", "n006866", "n006867", "n006868", "n006869", "n000685", "n006870", "n006872", "n006873", "n006874", "n006875", "n006876", "n006877", "n006878", "n006879", "n006880", "n000686", "n006881", "n006882", "n006883", "n006884", "n006885", "n006886", "n006887", "n006888", "n006889", "n006890", "n000687", "n006891", "n006892", "n006893", "n006894", "n006895", "n006896", "n006897", "n006898", "n006899", "n006900", "n000069", "n000688", "n006901", "n006902", "n006903", "n006904", "n006905", "n006906", "n006907", "n006908", "n006909", "n006910", "n000689", "n006911", "n006912", 
"n006913", "n006914", "n006915", "n006916", "n006917", "n006918", "n006919", "n006920", "n000690", "n006921", "n006922", "n006923", "n006924", "n006925", "n006926", "n006927", "n006928", "n006929", "n006930", "n000691", "n006931", "n006932", "n006933", "n006934", "n006935", "n006936", "n006937", "n006938", "n006939", "n006940", "n000692", "n006941", "n006943", "n006944", "n006945", "n006946", "n006947", "n006948", "n006949", "n006950", "n006951", "n000693", "n006952", "n006953", "n006954", "n006955", "n006956", "n006957", "n006958", "n006959", "n006960", "n006961", "n000694", "n006962", "n006963", "n006965", "n006966", "n006967", "n006968", "n006969", "n006970", "n006971", "n006972", "n000695", "n006973", "n006974", "n006975", "n006976", "n006977", "n006978", "n006979", "n006980", "n006981", "n006982", "n000696", "n006983", "n006984", "n006985", "n006986", "n006987", "n006988", "n006989", "n006990", "n006991", "n006992", "n000697", "n006993", "n006994", "n006995", "n006996", "n006997", "n006998", "n006999", "n007000", "n007001", "n007002", "n000070", "n000698", "n007003", "n007004", "n007005", "n007006", "n007007", "n007008", "n007009", "n007010", "n007011", "n007012", "n000699", "n007013", "n007015", "n007016", "n007017", "n007018", "n007019", "n007020", "n007021", "n007022", "n007023", "n000700", "n007024", "n007025", "n007026", "n007027", "n007028", "n007029", "n007030", "n007031", "n007032", "n007033", "n000701", "n007034", "n007035", "n007036", "n007037", "n007038", "n007039", "n007040", "n007041", "n007042", "n007043", "n000702", "n007044", "n007045", "n007046", "n007047", "n007048", "n007049", "n007050", "n007051", "n007052", "n007053", "n000703", "n007054", "n007055", "n007056", "n007057", "n007058", "n007059", "n007060", "n007061", "n007062", "n007063", "n000704", "n007064", "n007065", "n007066", "n007067", "n007068", "n007069", "n007070", "n007071", "n007072", "n007073", "n000705", "n007074", "n007075", "n007076", "n007077", "n007078", "n007079", "n007080", "n007081", "n007082", "n007083", "n000706", "n007084", "n007085", "n007086", "n007087", "n007088", "n007089", "n007090", "n007091", "n007092", "n007093", "n000707", "n007094", "n007095", "n007096", "n007097", "n007098", "n007099", "n007100", "n007101", "n007102", "n007103", "n000008", "n000071", "n000708", "n007104", "n007105", "n007106", "n007107", "n007108", "n007109", "n007110", "n007111", "n007112", "n007113", "n000709", "n007114", "n007115", "n007116", "n007117", "n007118", "n007119", "n007120", "n007121", "n007122", "n007123", "n000710", "n007124", "n007125", "n007126", "n007127", "n007128", "n007129", "n007130", "n007131", "n007132", "n007133", "n000711", "n007134", "n007135", "n007136", "n007137", "n007138", "n007139", "n007140", "n007141", "n007142", "n007143", "n000712", "n007144", "n007145", "n007146", "n007147", "n007148", "n007149", "n007150", "n007151", "n007152", "n007153", "n000713", "n007154", "n007155", "n007156", "n007157", "n007158", "n007159", "n007160", "n007161", "n007162", "n007163", "n000714", "n007164", "n007165", "n007166", "n007167", "n007168", "n007169", "n007170", "n007171", "n007172", "n007173", "n000715", "n007174", "n007175", "n007176", "n007177", "n007178", "n007179", "n007180", "n007181", "n007182", "n007184", "n000716", "n007185", "n007186", "n007187", "n007188", "n007189", "n007190", "n007191", "n007192", "n007193", "n007194", "n000717", "n007195", "n007196", "n007197", "n007198", "n007199", "n007200", "n007201", "n007202", "n007203", "n007204", "n000072", "n000718", "n007205", "n007206", 
"n007207", "n007208", "n007209", "n007210", "n007211", "n007212", "n007213", "n007214", "n000719", "n007215", "n007216", "n007217", "n007218", "n007219", "n007220", "n007221", "n007222", "n007223", "n007224", "n000720", "n007225", "n007226", "n007227", "n007228", "n007229", "n007230", "n007231", "n007232", "n007233", "n007234", "n000721", "n007235", "n007236", "n007237", "n007238", "n007239", "n007240", "n007241", "n007242", "n007243", "n007244", "n000722", "n007245", "n007246", "n007247", "n007248", "n007249", "n007250", "n007251", "n007252", "n007253", "n007254", "n000723", "n007255", "n007256", "n007257", "n007258", "n007259", "n007261", "n007262", "n007263", "n007264", "n007265", "n000724", "n007266", "n007267", "n007268", "n007269", "n007270", "n007271", "n007272", "n007273", "n007274", "n007275", "n000725", "n007276", "n007277", "n007278", "n007279", "n007280", "n007281", "n007282", "n007283", "n007284", "n007285", "n000726", "n007286", "n007287", "n007288", "n007289", "n007290", "n007291", "n007292", "n007293", "n007294", "n007295", "n000727", "n007296", "n007297", "n007298", "n007299", "n007300", "n007301", "n007302", "n007303", "n007304", "n007305", "n000073", "n000728", "n007306", "n007307", "n007308", "n007309", "n007310", "n007311", "n007312", "n007313", "n007314", "n007315", "n000729", "n007316", "n007317", "n007318", "n007319", "n007320", "n007321", "n007322", "n007323", "n007324", "n007325", "n000730", "n007326", "n007327", "n007328", "n007329", "n007330", "n007331", "n007332", "n007333", "n007334", "n007335", "n000731", "n007336", "n007337", "n007338", "n007339", "n007340", "n007341", "n007342", "n007343", "n007344", "n007345", "n000732", "n007346", "n007347", "n007348", "n007349", "n007350", "n007351", "n007352", "n007353", "n007354", "n007355", "n000733", "n007356", "n007357", "n007358", "n007359", "n007360", "n007361", "n007362", "n007363", "n007364", "n007365", "n000734", "n007366", "n007367", "n007368", "n007369", "n007370", "n007371", "n007372", "n007373", "n007374", "n007375", "n000735", "n007376", "n007377", "n007378", "n007379", "n007380", "n007381", "n007382", "n007383", "n007384", "n007385", "n000736", "n007386", "n007387", "n007388", "n007389", "n007391", "n007392", "n007393", "n007394", "n007395", "n007396", "n000737", "n007397", "n007398", "n007399", "n007400", "n007401", "n007402", "n007403", "n007404", "n007405", "n007406", "n000074", "n000739", "n007407", "n007408", "n007409", "n007410", "n007411", "n007412", "n007413", "n007414", "n007415", "n007416", "n000740", "n007417", "n007418", "n007419", "n007420", "n007421", "n007422", "n007423", "n007425", "n007426", "n007427", "n000741", "n007428", "n007429", "n007430", "n007431", "n007432", "n007433", "n007434", "n007435", "n007436", "n007437", "n000742", "n007438", "n007439", "n007440", "n007441", "n007442", "n007443", "n007444", "n007445", "n007446", "n007447", "n000743", "n007448", "n007449", "n007450", "n007451", "n007452", "n007453", "n007454", "n007455", "n007456", "n007457", "n000744", "n007458", "n007459", "n007460", "n007461", "n007462", "n007463", "n007464", "n007465", "n007466", "n007467", "n000745", "n007468", "n007469", "n007470", "n007471", "n007472", "n007473", "n007474", "n007475", "n007476", "n007477", "n000746", "n007478", "n007479", "n007480", "n007481", "n007482", "n007483", "n007484", "n007485", "n007486", "n007487", "n000747", "n007488", "n007489", "n007490", "n007491", "n007492", "n007493", "n007494", "n007495", "n007496", "n007497", "n000748", "n007498", "n007499", "n007501", "n007502", 
"n007503", "n007504", "n007505", "n007506", "n007507", "n007508", "n000075", "n000749", "n007509", "n007510", "n007511", "n007512", "n007513", "n007514", "n007515", "n007516", "n007517", "n007518", "n000750", "n007519", "n007520", "n007521", "n007522", "n007523", "n007524", "n007525", "n007526", "n007527", "n007528", "n000751", "n007529", "n007530", "n007531", "n007532", "n007533", "n007534", "n007535", "n007536", "n007537", "n007538", "n000752", "n007539", "n007540", "n007542", "n007543", "n007544", "n007545", "n007546", "n007547", "n007548", "n007549", "n000753", "n007550", "n007551", "n007552", "n007553", "n007554", "n007555", "n007557", "n007558", "n007559", "n007560", "n000754", "n007561", "n007562", "n007563", "n007564", "n007565", "n007566", "n007567", "n007568", "n007569", "n007570", "n000755", "n007571", "n007572", "n007573", "n007574", "n007575", "n007576", "n007577", "n007578", "n007579", "n007580", "n000756", "n007581", "n007582", "n007583", "n007584", "n007585", "n007586", "n007587", "n007588", "n007589", "n007590", "n000757", "n007591", "n007592", "n007593", "n007594", "n007595", "n007596", "n007597", "n007598", "n007599", "n007600", "n000758", "n007601", "n007603", "n007604", "n007605", "n007606", "n007607", "n007608", "n007609", "n007610", "n007611", "n000076", "n000759", "n007612", "n007613", "n007614", "n007615", "n007616", "n007617", "n007618", "n007619", "n007620", "n007621", "n000760", "n007622", "n007623", "n007624", "n007625", "n007626", "n007627", "n007628", "n007629", "n007630", "n007632", "n000761", "n007633", "n007634", "n007635", "n007636", "n007637", "n007638", "n007639", "n007640", "n007641", "n007642", "n000762", "n007643", "n007644", "n007645", "n007646", "n007647", "n007648", "n007649", "n007650", "n007651", "n007652", "n000763", "n007653", "n007654", "n007655", "n007656", "n007657", "n007658", "n007659", "n007660", "n007661", "n007662", "n000764", "n007663", "n007664", "n007665", "n007666", "n007667", "n007668", "n007669", "n007670", "n007671", "n007672", "n000765", "n007674", "n007675", "n007676", "n007677", "n007678", "n007679", "n007680", "n007681", "n007682", "n007683", "n000766", "n007684", "n007685", "n007686", "n007687", "n007688", "n007689", "n007690", "n007691", "n007692", "n007693", "n000767", "n007694", "n007695", "n007696", "n007697", "n007698", "n007699", "n007700", "n007701", "n007702", "n007703", "n000768", "n007704", "n007705", "n007706", "n007707", "n007708", "n007710", "n007711", "n007712", "n007713", "n007714", "n000077", "n000769", "n007715", "n007716", "n007717", "n007718", "n007719", "n007720", "n007721", "n007722", "n007723", "n007724", "n000770", "n007725", "n007726", "n007727", "n007728", "n007729", "n007730", "n007731", "n007732", "n007733", "n007734", "n000771", "n007735", "n007736", "n007737", "n007738", "n007739", "n007740", "n007741", "n007742", "n007743", "n007744", "n000772", "n007745", "n007746", "n007747", "n007748", "n007749", "n007750", "n007751", "n007752", "n007753", "n007754", "n000773", "n007755", "n007756", "n007757", "n007758", "n007759", "n007760", "n007761", "n007762", "n007763", "n007764", "n000774", "n007765", "n007767", "n007768", "n007769", "n007770", "n007771", "n007772", "n007773", "n007774", "n007775", "n000775", "n007776", "n007777", "n007778", "n007779", "n007780", "n007781", "n007782", "n007783", "n007784", "n007785", "n000776", "n007786", "n007787", "n007788", "n007789", "n007790", "n007791", "n007792", "n007793", "n007794", "n007795", "n000777", "n007796", "n007797", "n007798", "n007799", "n007800", 
"n007801", "n007802", "n007803", "n007804", "n007805", "n000778", "n007806", "n007807", "n007808", "n007809", "n007810", "n007811", "n007812", "n007813", "n007814", "n007815", "n000078", "n000779", "n007816", "n007817", "n007818", "n007819", "n007821", "n007822", "n007823", "n007824", "n007825", "n007826", "n000780", "n007827", "n007828", "n007830", "n007831", "n007832", "n007833", "n007834", "n007835", "n007836", "n007837", "n000781", "n007838", "n007839", "n007840", "n007841", "n007842", "n007843", "n007844", "n007845", "n007846", "n007847", "n000782", "n007848", "n007849", "n007850", "n007851", "n007852", "n007853", "n007854", "n007855", "n007856", "n007857", "n000783", "n007858", "n007859", "n007860", "n007861", "n007862", "n007863", "n007864", "n007865", "n007866", "n007867", "n000784", "n007868", "n007869", "n007870", "n007871", "n007873", "n007874", "n007875", "n007876", "n007877", "n007878", "n000785", "n007879", "n007880", "n007881", "n007882", "n007883", "n007884", "n007885", "n007886", "n007887", "n007888", "n000786", "n007889", "n007890", "n007891", "n007892", "n007893", "n007894", "n007895", "n007896", "n007897", "n007899", "n000787", "n007900", "n007901", "n007902", "n007904", "n007906", "n007907", "n007908", "n007909", "n007910", "n007911", "n000788", "n007912", "n007913", "n007915", "n007916", "n007917", "n007918", "n007919", "n007920", "n007921", "n007922", "n000079", "n000789", "n007923", "n007924", "n007925", "n007926", "n007927", "n007928", "n007929", "n007930", "n007931", "n007932", "n000790", "n007933", "n007934", "n007935", "n007936", "n007937", "n007938", "n007939", "n007940", "n007941", "n007942", "n000791", "n007943", "n007944", "n007945", "n007946", "n007947", "n007948", "n007949", "n007950", "n007951", "n007952", "n000792", "n007953", "n007954", "n007955", "n007956", "n007957", "n007958", "n007959", "n007960", "n007961", "n007962", "n000793", "n007963", "n007966", "n007967", "n007968", "n007969", "n007970", "n007971", "n007972", "n007973", "n007974", "n000794", "n007975", "n007976", "n007977", "n007978", "n007979", "n007980", "n007981", "n007982", "n007983", "n007984", "n000795", "n007985", "n007986", "n007987", "n007988", "n007989", "n007990", "n007991", "n007992", "n007993", "n007994", "n000796", "n007995", "n007996", "n007997", "n007998", "n007999", "n008000", "n008001", "n008002", "n008003", "n008004", "n000797", "n008005", "n008006", "n008007", "n008008", "n008009", "n008010", "n008011", "n008012", "n008013", "n008014", "n000798", "n008015", "n008016", "n008017", "n008018", "n008019", "n008020", "n008021", "n008022", "n008023", "n008024", "n000080", "n000799", "n008025", "n008026", "n008027", "n008028", "n008030", "n008032", "n008033", "n008034", "n008036", "n008037", "n000800", "n008038", "n008039", "n008040", "n008041", "n008042", "n008043", "n008044", "n008045", "n008046", "n008047", "n000801", "n008048", "n008049", "n008050", "n008051", "n008052", "n008053", "n008054", "n008055", "n008056", "n008057", "n000802", "n008058", "n008059", "n008060", "n008061", "n008062", "n008063", "n008064", "n008065", "n008066", "n008067", "n000803", "n008068", "n008069", "n008070", "n008071", "n008072", "n008073", "n008074", "n008075", "n008076", "n008077", "n000804", "n008078", "n008079", "n008080", "n008081", "n008082", "n008083", "n008084", "n008085", "n008086", "n008087", "n000805", "n008088", "n008089", "n008090", "n008091", "n008092", "n008093", "n008094", "n008095", "n008096", "n008097", "n000806", "n008098", "n008099", "n008100", "n008101", "n008102", "n008103", 
"n008104", "n008105", "n008107", "n008108", "n000807", "n008109", "n008110", "n008111", "n008112", "n008113", "n008114", "n008115", "n008116", "n008117", "n008118", "n000808", "n008119", "n008120", "n008121", "n008122", "n008123", "n008124", "n008125", "n008126", "n008127", "n008128", "n000009", "n000081", "n000809", "n008129", "n008130", "n008131", "n008132", "n008133", "n008134", "n008135", "n008136", "n008137", "n008138", "n000810", "n008139", "n008140", "n008141", "n008142", "n008143", "n008144", "n008145", "n008146", "n008147", "n008148", "n000811", "n008149", "n008150", "n008151", "n008152", "n008153", "n008154", "n008155", "n008156", "n008157", "n008158", "n000812", "n008159", "n008160", "n008161", "n008162", "n008163", "n008164", "n008165", "n008166", "n008168", "n008169", "n000813", "n008170", "n008171", "n008172", "n008173", "n008174", "n008175", "n008176", "n008177", "n008178", "n008179", "n000814", "n008180", "n008181", "n008182", "n008183", "n008184", "n008185", "n008186", "n008187", "n008188", "n008189", "n000815", "n008190", "n008191", "n008192", "n008193", "n008194", "n008196", "n008197", "n008198", "n008199", "n008200", "n000816", "n008201", "n008202", "n008203", "n008204", "n008205", "n008206", "n008207", "n008208", "n008209", "n008210", "n000817", "n008211", "n008212", "n008213", "n008214", "n008215", "n008216", "n008217", "n008218", "n008219", "n008220", "n000818", "n008221", "n008223", "n008224", "n008225", "n008226", "n008227", "n008228", "n008229", "n008230", "n008231", "n000082", "n000819", "n008232", "n008233", "n008234", "n008235", "n008236", "n008237", "n008238", "n008239", "n008240", "n008241", "n000820", "n008242", "n008243", "n008244", "n008245", "n008246", "n008247", "n008248", "n008249", "n008250", "n008252", "n000821", "n008253", "n008254", "n008255", "n008256", "n008257", "n008258", "n008259", "n008260", "n008261", "n008262", "n000822", "n008263", "n008265", "n008266", "n008267", "n008268", "n008270", "n008272", "n008273", "n008274", "n008275", "n000823", "n008276", "n008277", "n008278", "n008279", "n008280", "n008281", "n008282", "n008283", "n008284", "n008285", "n000824", "n008286", "n008287", "n008288", "n008289", "n008290", "n008291", "n008292", "n008293", "n008294", "n008295", "n000825", "n008296", "n008297", "n008298", "n008299", "n008300", "n008301", "n008302", "n008303", "n008304", "n008305", "n000826", "n008306", "n008307", "n008308", "n008309", "n008310", "n008311", "n008312", "n008313", "n008314", "n008316", "n000827", "n008317", "n008318", "n008319", "n008320", "n008321", "n008322", "n008323", "n008324", "n008325", "n008326", "n000828", "n008327", "n008328", "n008329", "n008330", "n008332", "n008333", "n008334", "n008335", "n008336", "n008337", "n000083", "n000829", "n008338", "n008339", "n008340", "n008341", "n008342", "n008343", "n008344", "n008345", "n008346", "n008347", "n000830", "n008348", "n008349", "n008350", "n008351", "n008352", "n008353", "n008354", "n008355", "n008356", "n008357", "n000831", "n008358", "n008359", "n008360", "n008361", "n008362", "n008363", "n008364", "n008365", "n008366", "n008367", "n000832", "n008368", "n008369", "n008370", "n008371", "n008372", "n008373", "n008374", "n008375", "n008376", "n008377", "n000833", "n008378", "n008379", "n008380", "n008381", "n008382", "n008383", "n008384", "n008386", "n008387", "n008388", "n000834", "n008389", "n008390", "n008391", "n008392", "n008393", "n008394", "n008395", "n008396", "n008397", "n008398", "n000835", "n008399", "n008400", "n008401", "n008402", "n008403", "n008404", 
"n008405", "n008406", "n008407", "n008408", "n000836", "n008409", "n008410", "n008411", "n008412", "n008413", "n008414", "n008415", "n008416", "n008417", "n008418", "n000837", "n008419", "n008420", "n008421", "n008422", "n008423", "n008424", "n008425", "n008427", "n008428", "n008429", "n000838", "n008430", "n008431", "n008432", "n008433", "n008434", "n008436", "n008437", "n008438", "n008439", "n008440", "n000084", "n000839", "n008441", "n008442", "n008443", "n008444", "n008445", "n008446", "n008447", "n008448", "n008449", "n008450", "n000840", "n008451", "n008452", "n008453", "n008454", "n008455", "n008456", "n008457", "n008458", "n008459", "n008460", "n000841", "n008461", "n008462", "n008463", "n008464", "n008465", "n008466", "n008467", "n008468", "n008469", "n008470", "n000842", "n008471", "n008472", "n008473", "n008474", "n008475", "n008476", "n008477", "n008479", "n008480", "n008481", "n000843", "n008482", "n008483", "n008484", "n008485", "n008487", "n008488", "n008489", "n008490", "n008491", "n008492", "n000844", "n008493", "n008494", "n008495", "n008496", "n008497", "n008498", "n008499", "n008500", "n008501", "n008502", "n000845", "n008503", "n008504", "n008505", "n008506", "n008507", "n008508", "n008509", "n008510", "n008511", "n008512", "n000846", "n008513", "n008514", "n008515", "n008516", "n008517", "n008519", "n008520", "n008521", "n008522", "n008523", "n000847", "n008524", "n008525", "n008526", "n008527", "n008528", "n008529", "n008530", "n008531", "n008532", "n008533", "n000848", "n008534", "n008535", "n008536", "n008537", "n008538", "n008539", "n008540", "n008541", "n008542", "n008543", "n000085", "n000849", "n008544", "n008545", "n008546", "n008547", "n008548", "n008549", "n008550", "n008551", "n008552", "n008553", "n000850", "n008554", "n008555", "n008556", "n008557", "n008558", "n008560", "n008561", "n008562", "n008563", "n008564", "n000851", "n008565", "n008566", "n008567", "n008568", "n008569", "n008570", "n008571", "n008572", "n008573", "n008574", "n000852", "n008575", "n008576", "n008577", "n008578", "n008579", "n008580", "n008581", "n008582", "n008583", "n008584", "n000853", "n008585", "n008586", "n008587", "n008588", "n008589", "n008590", "n008591", "n008592", "n008593", "n008594", "n000854", "n008595", "n008596", "n008597", "n008598", "n008599", "n008600", "n008601", "n008602", "n008603", "n008604", "n000855", "n008605", "n008606", "n008607", "n008608", "n008609", "n008610", "n008611", "n008612", "n008613", "n008614", "n000856", "n008615", "n008616", "n008617", "n008618", "n008619", "n008620", "n008621", "n008622", "n008623", "n008624", "n000857", "n008625", "n008626", "n008627", "n008628", "n008630", "n008631", "n008632", "n008633", "n008634", "n008635", "n000858", "n008636", "n008637", "n008638", "n008639", "n008640", "n008641", "n008642", "n008643", "n008644", "n008645", "n000086", "n000859", "n008646", "n008647", "n008648", "n008649", "n008650", "n008651", "n008652", "n008653", "n008654", "n008655", "n000860", "n008656", "n008657", "n008658", "n008659", "n008660", "n008661", "n008662", "n008663", "n008664", "n008665", "n000861", "n008666", "n008667", "n008668", "n008669", "n008670", "n008671", "n008672", "n008673", "n008674", "n008675", "n000862", "n008676", "n008677", "n008678", "n008679", "n008680", "n008681", "n008682", "n008683", "n008684", "n008685", "n000863", "n008686", "n008687", "n008688", "n008689", "n008690", "n008691", "n008692", "n008693", "n008694", "n008695", "n000864", "n008696", "n008697", "n008698", "n008699", "n008700", "n008701", "n008702", 
"n008703", "n008704", "n008705", "n000865", "n008706", "n008707", "n008708", "n008709", "n008710", "n008711", "n008712", "n008713", "n008714", "n008715", "n000866", "n008716", "n008718", "n008719", "n008720", "n008721", "n008722", "n008723", "n008724", "n008725", "n008726", "n000867", "n008727", "n008728", "n008729", "n008730", "n008731", "n008732", "n008733", "n008734", "n008735", "n008736", "n000868", "n008737", "n008738", "n008739", "n008740", "n008741", "n008742", "n008743", "n008744", "n008745", "n008746", "n000087", "n000869", "n008747", "n008748", "n008749", "n008750", "n008751", "n008752", "n008753", "n008754", "n008755", "n008756", "n000870", "n008757", "n008758", "n008759", "n008760", "n008761", "n008762", "n008763", "n008764", "n008765", "n008766", "n000871", "n008767", "n008768", "n008769", "n008770", "n008771", "n008772", "n008773", "n008774", "n008775", "n008776", "n000872", "n008778", "n008779", "n008780", "n008781", "n008782", "n008783", "n008784", "n008785", "n008786", "n008787", "n000873", "n008788", "n008789", "n008790", "n008791", "n008792", "n008793", "n008794", "n008795", "n008796", "n008797", "n000874", "n008798", "n008799", "n008800", "n008801", "n008802", "n008803", "n008804", "n008805", "n008806", "n008807", "n000875", "n008808", "n008809", "n008810", "n008811", "n008812", "n008813", "n008814", "n008815", "n008816", "n008817", "n000876", "n008818", "n008819", "n008820", "n008821", "n008822", "n008823", "n008824", "n008825", "n008826", "n008827", "n000878", "n008829", "n008830", "n008831", "n008832", "n008833", "n008834", "n008835", "n008836", "n008837", "n008838", "n000879", "n008839", "n008840", "n008841", "n008842", "n008843", "n008844", "n008845", "n008846", "n008847", "n008848", "n000088", "n000880", "n008849", "n008850", "n008851", "n008852", "n008853", "n008854", "n008855", "n008856", "n008857", "n008858", "n000881", "n008859", "n008860", "n008861", "n008862", "n008863", "n008864", "n008865", "n008866", "n008867", "n008868", "n000882", "n008869", "n008870", "n008871", "n008872", "n008873", "n008874", "n008875", "n008877", "n008878", "n008879", "n000883", "n008880", "n008881", "n008882", "n008883", "n008884", "n008885", "n008886", "n008887", "n008890", "n008891", "n000884", "n008892", "n008893", "n008894", "n008895", "n008896", "n008897", "n008898", "n008899", "n008900", "n008901", "n000885", "n008902", "n008903", "n008904", "n008905", "n008906", "n008907", "n008908", "n008909", "n008910", "n008911", "n000886", "n008912", "n008913", "n008914", "n008915", "n008916", "n008917", "n008918", "n008919", "n008920", "n008921", "n000887", "n008922", "n008923", "n008924", "n008925", "n008926", "n008927", "n008928", "n008929", "n008930", "n008931", "n000888", "n008932", "n008933", "n008934", "n008935", "n008936", "n008937", "n008938", "n008939", "n008940", "n008941", "n000889", "n008942", "n008943", "n008944", "n008945", "n008946", "n008947", "n008948", "n008949", "n008950", "n008951", "n000089", "n000890", "n008952", "n008953", "n008954", "n008955", "n008956", "n008957", "n008958", "n008959", "n008960", "n008961", "n000891", "n008962", "n008963", "n008964", "n008965", "n008966", "n008967", "n008968", "n008969", "n008970", "n008971", "n000892", "n008972", "n008973", "n008974", "n008975", "n008976", "n008977", "n008978", "n008979", "n008980", "n008981", "n000893", "n008982", "n008983", "n008984", "n008985", "n008986", "n008987", "n008988", "n008989", "n008990", "n008991", "n000894", "n008992", "n008993", "n008994", "n008995", "n008996", "n008997", "n008998", "n008999", 
"n009000", "n009001", "n000895", "n009002", "n009003", "n009004", "n009005", "n009006", "n009007", "n009008", "n009009", "n009010", "n009011", "n000896", "n009012", "n009013", "n009015", "n009016", "n009017", "n009018", "n009019", "n009020", "n009021", "n009022", "n000897", "n009023", "n009024", "n009025", "n009026", "n009027", "n009028", "n009029", "n009030", "n009031", "n009032", "n000898", "n009033", "n009034", "n009035", "n009036", "n009037", "n009038", "n009039", "n009040", "n009041", "n009042", "n000899", "n009043", "n009044", "n009045", "n009046", "n009047", "n009048", "n009049", "n009050", "n009051", "n009052", "n000090", "n000900", "n009053", "n009054", "n009055", "n009056", "n009057", "n009058", "n009059", "n009060", "n009061", "n009062", "n000901", "n009063", "n009064", "n009065", "n009066", "n009067", "n009068", "n009069", "n009070", "n009071", "n009072", "n000902", "n009073", "n009074", "n009075", "n009076", "n009077", "n009078", "n009079", "n009080", "n009081", "n009082", "n000903", "n009083", "n009084", "n009085", "n009086", "n009087", "n009088", "n009089", "n009091", "n009092", "n009093", "n000904", "n009094", "n009095", "n009096", "n009097", "n009098", "n009099", "n009100", "n009101", "n009102", "n009103", "n000905", "n009104", "n009105", "n009106", "n009107", "n009108", "n009109", "n009110", "n009111", "n009112", "n009113", "n000906", "n009114", "n009115", "n009116", "n009117", "n009118", "n009119", "n009120", "n009121", "n009122", "n009123", "n000907", "n009124", "n009125", "n009126", "n009127", "n009129", "n009130", "n009131", "n009132", "n009133", "n009134", "n000908", "n009135", "n009136", "n009137", "n009138", "n009139", "n009140", "n009141", "n009142", "n009143", "n009144", "n000909", "n009145", "n009146", "n009147", "n009148", "n009149", "n009150", "n009151", "n009152", "n009153", "n009154", "n000010", "n000091", "n000910", "n009155", "n009156", "n009157", "n009158", "n009159", "n009160", "n009161", "n009162", "n009163", "n009164", "n000911", "n009165", "n009166", "n009167", "n009168", "n009169", "n009170", "n009171", "n009172", "n009173", "n009174", "n000912", "n009175", "n009176", "n009177", "n009179", "n009180", "n009181", "n009182", "n009183", "n009184", "n009185", "n000913", "n009186", "n009187", "n009188", "n009189", "n009190", "n009191", "n009192", "n009193", "n009194", "n009195", "n000914", "n009196", "n009197", "n009198", "n009199", "n009200", "n009201", "n009202", "n009203", "n009204", "n009205", "n000915", "n009207", "n009208", "n009209", "n009210", "n009211", "n009212", "n009213", "n009214", "n009215", "n009216", "n000916", "n009217", "n009218", "n009219", "n009220", "n009221", "n009222", "n009223", "n009224", "n009225", "n009226", "n000917", "n009227", "n009228", "n009229", "n009230", "n009231", "n009232", "n009233", "n009234", "n009235", "n009236", "n000918", "n009237", "n009238", "n009239", "n009240", "n009241", "n009242", "n009243", "n009244", "n009245", "n009246", "n000919", "n009247", "n009248", "n009249", "n009250", "n009251", "n009252", "n009253", "n009254", "n009255", "n009256", "n000092", "n000920", "n009257", "n009258", "n009259", "n009260", "n009261", "n009262", "n009263", "n009264", "n009265", "n009266", "n000921", "n009267", "n009268", "n009269", "n009270", "n009271", "n009272", "n009273", "n009274", "n009275", "n009276", "n000922", "n009277", "n009278", "n009279", "n009283", "n009285", "n009286", "n009287", "n009288", "n009289", "n009291", "n000923", "n009294", "n000924", "n000925", "n000926", "n000927", "n000928", "n000929", "n000093", 
"n000930", "n000931", "n000932", "n000933", "n000934", "n000935", "n000937", "n000938", "n000939", "n000940", "n000094", "n000941", "n000942", "n000943", "n000944", "n000945", "n000946", "n000947", "n000948", "n000949", "n000950", "n000095", "n000951", "n000952", "n000953", "n000954", "n000955", "n000956", "n000957", "n000958", "n000959", "n000960", "n000096", "n000961", "n000962", "n000963", "n000964", "n000965", "n000966", "n000967", "n000968", "n000969", "n000970", "n000097", "n000971", "n000972", "n000973", "n000974", "n000975", "n000976", "n000977", "n000978", "n000979", "n000980", "n000098", "n000981", "n000982", "n000983", "n000984", "n000985", "n000986", "n000987", "n000988", "n000989", "n000990", "n000099", "n000991", "n000992", "n000993", "n000994", "n000995", "n000996", "n000997", "n000998", "n000999", "n001000", "n000100", "n001001", "n001002", "n001003", "n001004", "n001005", "n001006", "n001007", "n001008", "n001009", "n001010" ]
amiguel/mri_classifier
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. -->

# amiguel/mri_classifier

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0075
- Validation Loss: 0.0023
- Train Accuracy: 1.0
- Epoch: 14

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 3e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32

### Training results

| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 0.1501     | 0.0619          | 0.9845         | 0     |
| 0.0524     | 0.0825          | 0.9733         | 1     |
| 0.0324     | 0.1416          | 0.9494         | 2     |
| 0.0243     | 0.0327          | 0.9887         | 3     |
| 0.0258     | 0.0095          | 0.9986         | 4     |
| 0.0166     | 0.0069          | 0.9986         | 5     |
| 0.0342     | 0.0126          | 0.9958         | 6     |
| 0.0131     | 0.0057          | 0.9986         | 7     |
| 0.0120     | 0.0037          | 0.9986         | 8     |
| 0.0163     | 0.0055          | 0.9972         | 9     |
| 0.0083     | 0.0018          | 1.0            | 10    |
| 0.0128     | 0.0027          | 0.9986         | 11    |
| 0.0070     | 0.0020          | 1.0            | 12    |
| 0.0083     | 0.0014          | 1.0            | 13    |
| 0.0075     | 0.0023          | 1.0            | 14    |

### Framework versions

- Transformers 4.42.4
- TensorFlow 2.15.0
- Datasets 2.20.0
- Tokenizers 0.19.1
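### Example inference (sketch)

The card does not show how to run the model, so the following is only a minimal sketch: it assumes the checkpoint is hosted on the Hub under this repo id, and the input file name is hypothetical.

```python
# Minimal inference sketch for the fine-tuned ViT checkpoint (assumes the
# checkpoint is available on the Hugging Face Hub under this repo id).
import tensorflow as tf
from PIL import Image
from transformers import AutoImageProcessor, TFAutoModelForImageClassification

repo_id = "amiguel/mri_classifier"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = TFAutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("scan.png").convert("RGB")  # hypothetical input file
inputs = processor(images=image, return_tensors="tf")
logits = model(**inputs).logits

# Pick the highest-scoring class and map it back to its label name.
pred = int(tf.argmax(logits, axis=-1)[0])
print(model.config.id2label[pred])  # e.g. "notumor" or "tumor"
```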
[ "notumor", "tumor" ]
MattyB95/VIT-ASVspoof5-ConstantQ-Synthetic-Voice-Detection
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# VIT-ASVspoof5-ConstantQ-Synthetic-Voice-Detection

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1171
- Accuracy: 0.7590
- F1: 0.8126
- Precision: 0.8285
- Recall: 0.7974

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 0.0145        | 1.0   | 22795 | 1.4281          | 0.7289   | 0.8193 | 0.7275    | 0.9377 |
| 0.0084        | 2.0   | 45590 | 1.7308          | 0.7343   | 0.7730 | 0.8787    | 0.6899 |
| 0.0           | 3.0   | 68385 | 2.1171          | 0.7590   | 0.8126 | 0.8285    | 0.7974 |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
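### Metric computation (sketch)

The accuracy, F1, precision, and recall above are the kind of values a `compute_metrics` callback returns to the `Trainer` at evaluation time. The original callback is not included in the card, so the following is only a plausible sketch built on `scikit-learn`:

```python
# Hypothetical compute_metrics callback in the style of the values reported
# above; the original training script is not part of the card.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="binary"  # two classes: bonafide vs. spoof
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```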
[ "bonafide", "spoof" ]
lalla123/resnet-50-finetuned-eurosat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# resnet-50-finetuned-eurosat

This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.2964
- Accuracy: 0.2929

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| No log        | 0.8889 | 4    | 2.3017          | 0.2643   |
| No log        | 2.0    | 9    | 2.2989          | 0.2286   |
| 2.2991        | 2.6667 | 12   | 2.2964          | 0.2929   |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
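### Hyperparameters as code (sketch)

Expressed through the standard `TrainingArguments` API, the hyperparameters listed above might be configured roughly as follows; this is an assumption about the training script, which is not included in the card. Note how the total train batch size of 128 follows from 32 per device times 4 gradient-accumulation steps.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# With per_device_train_batch_size=32 and gradient_accumulation_steps=4,
# the effective (total) train batch size is 32 * 4 = 128.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="resnet-50-finetuned-eurosat",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=3,
    seed=42,
)
```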
[ "annualcrop", "forest", "herbaceousvegetation", "highway", "industrial", "pasture", "permanentcrop", "residential", "river", "sealake" ]
skutaada/MobileNetV2-KD-VGGFace
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# MobileNetV2-KD-VGGFace

This model was trained via knowledge distillation (KD) from [ViT](https://huggingface.co/skutaada/VIT-VGGFace) on the first 50k samples of the VGGFace dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4919
- Accuracy: 0.7836

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 2.7506        | 1.0   | 1667  | 2.2449          | 0.0726   |
| 2.105         | 2.0   | 3334  | 1.6904          | 0.2493   |
| 1.6544        | 3.0   | 5001  | 1.3206          | 0.4043   |
| 1.3357        | 4.0   | 6668  | 1.0675          | 0.5078   |
| 1.1104        | 5.0   | 8335  | 0.9302          | 0.5582   |
| 0.9287        | 6.0   | 10002 | 0.8738          | 0.5972   |
| 0.7899        | 7.0   | 11669 | 0.7972          | 0.6388   |
| 0.6738        | 8.0   | 13336 | 0.7074          | 0.6822   |
| 0.5803        | 9.0   | 15003 | 0.6630          | 0.7009   |
| 0.5038        | 10.0  | 16670 | 0.5855          | 0.735    |
| 0.4366        | 11.0  | 18337 | 0.5761          | 0.7415   |
| 0.3762        | 12.0  | 20004 | 0.5642          | 0.7496   |
| 0.3321        | 13.0  | 21671 | 0.5373          | 0.7652   |
| 0.2916        | 14.0  | 23338 | 0.5314          | 0.7625   |
| 0.2615        | 15.0  | 25005 | 0.6206          | 0.7281   |
| 0.2357        | 16.0  | 26672 | 0.5437          | 0.763    |
| 0.2153        | 17.0  | 28339 | 0.5335          | 0.763    |
| 0.1986        | 18.0  | 30006 | 0.4892          | 0.7869   |
| 0.1866        | 19.0  | 31673 | 0.5368          | 0.7645   |
| 0.1765        | 20.0  | 33340 | 0.4919          | 0.7836   |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+rocm6.0
- Datasets 2.20.0
- Tokenizers 0.19.1
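### Distillation objective (sketch)

The card states the model was distilled from a ViT teacher but does not include the distillation code. A typical KD objective blends the hard-label cross-entropy with a KL term on temperature-softened teacher logits; the sketch below is an assumption (including the temperature and weighting values), not the original recipe.

```python
# Minimal knowledge-distillation loss sketch: the student (MobileNetV2)
# mimics the teacher (ViT) via KL divergence on temperature-softened
# logits, blended with ordinary cross-entropy on the ground-truth labels.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=4.0, alpha=0.5):
    # Soft targets: KL divergence at temperature T, scaled by T^2 so the
    # gradient magnitude stays comparable to the hard-label term.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard targets: standard cross-entropy against the true class labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```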
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5", "label_6", "label_7", "label_8", "label_9", "label_10", "label_11", "label_12", "label_13", "label_14", "label_15", "label_16", "label_17", "label_18", "label_19", "label_20", "label_21", "label_22", "label_23", "label_24", "label_25", "label_26", "label_27", "label_28", "label_29", "label_30", "label_31", "label_32", "label_33", "label_34", "label_35", "label_36", "label_37", "label_38", "label_39", "label_40", "label_41", "label_42", "label_43", "label_44", "label_45", "label_46", "label_47", "label_48", "label_49", "label_50", "label_51", "label_52", "label_53", "label_54", "label_55", "label_56", "label_57", "label_58", "label_59", "label_60", "label_61", "label_62", "label_63", "label_64", "label_65", "label_66", "label_67", "label_68", "label_69", "label_70", "label_71", "label_72", "label_73", "label_74", "label_75", "label_76", "label_77", "label_78", "label_79", "label_80", "label_81", "label_82", "label_83", "label_84", "label_85", "label_86", "label_87", "label_88", "label_89", "label_90", "label_91", "label_92", "label_93", "label_94", "label_95", "label_96", "label_97", "label_98", "label_99", "label_100", "label_101", "label_102", "label_103", "label_104", "label_105", "label_106", "label_107", "label_108", "label_109", "label_110", "label_111", "label_112", "label_113", "label_114", "label_115", "label_116", "label_117", "label_118", "label_119", "label_120", "label_121", "label_122", "label_123", "label_124", "label_125", "label_126", "label_127", "label_128", "label_129", "label_130", "label_131", "label_132", "label_133", "label_134", "label_135", "label_136", "label_137", "label_138", "label_139", "label_140", "label_141", "label_142", "label_143", "label_144", "label_145", "label_146", "label_147", "label_148", "label_149", "label_150", "label_151", "label_152", "label_153", "label_154", "label_155", "label_156", "label_157", "label_158", "label_159", "label_160", "label_161", "label_162", "label_163", "label_164", "label_165", "label_166", "label_167", "label_168", "label_169", "label_170", "label_171", "label_172", "label_173", "label_174", "label_175", "label_176", "label_177", "label_178", "label_179", "label_180", "label_181", "label_182", "label_183", "label_184", "label_185", "label_186", "label_187", "label_188", "label_189", "label_190", "label_191", "label_192", "label_193", "label_194", "label_195", "label_196", "label_197", "label_198", "label_199", "label_200", "label_201", "label_202", "label_203", "label_204", "label_205", "label_206", "label_207", "label_208", "label_209", "label_210", "label_211", "label_212", "label_213", "label_214", "label_215", "label_216", "label_217", "label_218", "label_219", "label_220", "label_221", "label_222", "label_223", "label_224", "label_225", "label_226", "label_227", "label_228", "label_229", "label_230", "label_231", "label_232", "label_233", "label_234", "label_235", "label_236", "label_237", "label_238", "label_239", "label_240", "label_241", "label_242", "label_243", "label_244", "label_245", "label_246", "label_247", "label_248", "label_249", "label_250", "label_251", "label_252", "label_253", "label_254", "label_255", "label_256", "label_257", "label_258", "label_259", "label_260", "label_261", "label_262", "label_263", "label_264", "label_265", "label_266", "label_267", "label_268", "label_269", "label_270", "label_271", "label_272", "label_273", "label_274", "label_275", "label_276", "label_277", "label_278", "label_279", "label_280", 
"label_281", "label_282", "label_283", "label_284", "label_285", "label_286", "label_287", "label_288", "label_289", "label_290", "label_291", "label_292", "label_293", "label_294", "label_295", "label_296", "label_297", "label_298", "label_299", "label_300", "label_301", "label_302", "label_303", "label_304", "label_305", "label_306", "label_307", "label_308", "label_309", "label_310", "label_311", "label_312", "label_313", "label_314", "label_315", "label_316", "label_317", "label_318", "label_319", "label_320", "label_321", "label_322", "label_323", "label_324", "label_325", "label_326", "label_327", "label_328", "label_329", "label_330", "label_331", "label_332", "label_333", "label_334", "label_335", "label_336", "label_337", "label_338", "label_339", "label_340", "label_341", "label_342", "label_343", "label_344", "label_345", "label_346", "label_347", "label_348", "label_349", "label_350", "label_351", "label_352", "label_353", "label_354", "label_355", "label_356", "label_357", "label_358", "label_359", "label_360", "label_361", "label_362", "label_363", "label_364", "label_365", "label_366", "label_367", "label_368", "label_369", "label_370", "label_371", "label_372", "label_373", "label_374", "label_375", "label_376", "label_377", "label_378", "label_379", "label_380", "label_381", "label_382", "label_383", "label_384", "label_385", "label_386", "label_387", "label_388", "label_389", "label_390", "label_391", "label_392", "label_393", "label_394", "label_395", "label_396", "label_397", "label_398", "label_399", "label_400", "label_401", "label_402", "label_403", "label_404", "label_405", "label_406", "label_407", "label_408", "label_409", "label_410", "label_411", "label_412", "label_413", "label_414", "label_415", "label_416", "label_417", "label_418", "label_419", "label_420", "label_421", "label_422", "label_423", "label_424", "label_425", "label_426", "label_427", "label_428", "label_429", "label_430", "label_431", "label_432", "label_433", "label_434", "label_435", "label_436", "label_437", "label_438", "label_439", "label_440", "label_441", "label_442", "label_443", "label_444", "label_445", "label_446", "label_447", "label_448", "label_449", "label_450", "label_451", "label_452", "label_453", "label_454", "label_455", "label_456", "label_457", "label_458", "label_459", "label_460", "label_461", "label_462", "label_463", "label_464", "label_465", "label_466", "label_467", "label_468", "label_469", "label_470", "label_471", "label_472", "label_473", "label_474", "label_475", "label_476", "label_477", "label_478", "label_479", "label_480", "label_481", "label_482", "label_483", "label_484", "label_485", "label_486", "label_487", "label_488", "label_489", "label_490", "label_491", "label_492", "label_493", "label_494", "label_495", "label_496", "label_497", "label_498", "label_499", "label_500", "label_501", "label_502", "label_503", "label_504", "label_505", "label_506", "label_507", "label_508", "label_509", "label_510", "label_511", "label_512", "label_513", "label_514", "label_515", "label_516", "label_517", "label_518", "label_519", "label_520", "label_521", "label_522", "label_523", "label_524", "label_525", "label_526", "label_527", "label_528", "label_529", "label_530", "label_531", "label_532", "label_533", "label_534", "label_535", "label_536", "label_537", "label_538", "label_539", "label_540", "label_541", "label_542", "label_543", "label_544", "label_545", "label_546", "label_547", "label_548", "label_549", "label_550", "label_551", "label_552", "label_553", 
"label_554", "label_555", "label_556", "label_557", "label_558", "label_559", "label_560", "label_561", "label_562", "label_563", "label_564", "label_565", "label_566", "label_567", "label_568", "label_569", "label_570", "label_571", "label_572", "label_573", "label_574", "label_575", "label_576", "label_577", "label_578", "label_579", "label_580", "label_581", "label_582", "label_583", "label_584", "label_585", "label_586", "label_587", "label_588", "label_589", "label_590", "label_591", "label_592", "label_593", "label_594", "label_595", "label_596", "label_597", "label_598", "label_599", "label_600", "label_601", "label_602", "label_603", "label_604", "label_605", "label_606", "label_607", "label_608", "label_609", "label_610", "label_611", "label_612", "label_613", "label_614", "label_615", "label_616", "label_617", "label_618", "label_619", "label_620", "label_621", "label_622", "label_623", "label_624", "label_625", "label_626", "label_627", "label_628", "label_629", "label_630", "label_631", "label_632", "label_633", "label_634", "label_635", "label_636", "label_637", "label_638", "label_639", "label_640", "label_641", "label_642", "label_643", "label_644", "label_645", "label_646", "label_647", "label_648", "label_649", "label_650", "label_651", "label_652", "label_653", "label_654", "label_655", "label_656", "label_657", "label_658", "label_659", "label_660", "label_661", "label_662", "label_663", "label_664", "label_665", "label_666", "label_667", "label_668", "label_669", "label_670", "label_671", "label_672", "label_673", "label_674", "label_675", "label_676", "label_677", "label_678", "label_679", "label_680", "label_681", "label_682", "label_683", "label_684", "label_685", "label_686", "label_687", "label_688", "label_689", "label_690", "label_691", "label_692", "label_693", "label_694", "label_695", "label_696", "label_697", "label_698", "label_699", "label_700", "label_701", "label_702", "label_703", "label_704", "label_705", "label_706", "label_707", "label_708", "label_709", "label_710", "label_711", "label_712", "label_713", "label_714", "label_715", "label_716", "label_717", "label_718", "label_719", "label_720", "label_721", "label_722", "label_723", "label_724", "label_725", "label_726", "label_727", "label_728", "label_729", "label_730", "label_731", "label_732", "label_733", "label_734", "label_735", "label_736", "label_737", "label_738", "label_739", "label_740", "label_741", "label_742", "label_743", "label_744", "label_745", "label_746", "label_747", "label_748", "label_749", "label_750", "label_751", "label_752", "label_753", "label_754", "label_755", "label_756", "label_757", "label_758", "label_759", "label_760", "label_761", "label_762", "label_763", "label_764", "label_765", "label_766", "label_767", "label_768", "label_769", "label_770", "label_771", "label_772", "label_773", "label_774", "label_775", "label_776", "label_777", "label_778", "label_779", "label_780", "label_781", "label_782", "label_783", "label_784", "label_785", "label_786", "label_787", "label_788", "label_789", "label_790", "label_791", "label_792", "label_793", "label_794", "label_795", "label_796", "label_797", "label_798", "label_799", "label_800", "label_801", "label_802", "label_803", "label_804", "label_805", "label_806", "label_807", "label_808", "label_809", "label_810", "label_811", "label_812", "label_813", "label_814", "label_815", "label_816", "label_817", "label_818", "label_819", "label_820", "label_821", "label_822", "label_823", "label_824", "label_825", "label_826", 
"label_827", "label_828", "label_829", "label_830", "label_831", "label_832", "label_833", "label_834", "label_835", "label_836", "label_837", "label_838", "label_839", "label_840", "label_841", "label_842", "label_843", "label_844", "label_845", "label_846", "label_847", "label_848", "label_849", "label_850", "label_851", "label_852", "label_853", "label_854", "label_855", "label_856", "label_857", "label_858", "label_859", "label_860", "label_861", "label_862", "label_863", "label_864", "label_865", "label_866", "label_867", "label_868", "label_869", "label_870", "label_871", "label_872", "label_873", "label_874", "label_875", "label_876", "label_877", "label_878", "label_879", "label_880", "label_881", "label_882", "label_883", "label_884", "label_885", "label_886", "label_887", "label_888", "label_889", "label_890", "label_891", "label_892", "label_893", "label_894", "label_895", "label_896", "label_897", "label_898", "label_899", "label_900", "label_901", "label_902", "label_903", "label_904", "label_905", "label_906", "label_907", "label_908", "label_909", "label_910", "label_911", "label_912", "label_913", "label_914", "label_915", "label_916", "label_917", "label_918", "label_919", "label_920", "label_921", "label_922", "label_923", "label_924", "label_925", "label_926", "label_927", "label_928", "label_929", "label_930", "label_931", "label_932", "label_933", "label_934", "label_935", "label_936", "label_937", "label_938", "label_939", "label_940", "label_941", "label_942", "label_943", "label_944", "label_945", "label_946", "label_947", "label_948", "label_949", "label_950", "label_951", "label_952", "label_953", "label_954", "label_955", "label_956", "label_957", "label_958", "label_959", "label_960", "label_961", "label_962", "label_963", "label_964", "label_965", "label_966", "label_967", "label_968", "label_969", "label_970", "label_971", "label_972", "label_973", "label_974", "label_975", "label_976", "label_977", "label_978", "label_979", "label_980", "label_981", "label_982", "label_983", "label_984", "label_985", "label_986", "label_987", "label_988", "label_989", "label_990", "label_991", "label_992", "label_993", "label_994", "label_995", "label_996", "label_997", "label_998", "label_999", "label_1000", "label_1001", "label_1002", "label_1003", "label_1004", "label_1005", "label_1006", "label_1007", "label_1008", "label_1009", "label_1010", "label_1011", "label_1012", "label_1013", "label_1014", "label_1015", "label_1016", "label_1017", "label_1018", "label_1019", "label_1020", "label_1021", "label_1022", "label_1023", "label_1024", "label_1025", "label_1026", "label_1027", "label_1028", "label_1029", "label_1030", "label_1031", "label_1032", "label_1033", "label_1034", "label_1035", "label_1036", "label_1037", "label_1038", "label_1039", "label_1040", "label_1041", "label_1042", "label_1043", "label_1044", "label_1045", "label_1046", "label_1047", "label_1048", "label_1049", "label_1050", "label_1051", "label_1052", "label_1053", "label_1054", "label_1055", "label_1056", "label_1057", "label_1058", "label_1059", "label_1060", "label_1061", "label_1062", "label_1063", "label_1064", "label_1065", "label_1066", "label_1067", "label_1068", "label_1069", "label_1070", "label_1071", "label_1072", "label_1073", "label_1074", "label_1075", "label_1076", "label_1077", "label_1078", "label_1079", "label_1080", "label_1081", "label_1082", "label_1083", "label_1084", "label_1085", "label_1086", "label_1087", "label_1088", "label_1089", "label_1090", "label_1091", "label_1092", 
"label_1093", "label_1094", "label_1095", "label_1096", "label_1097", "label_1098", "label_1099", "label_1100", "label_1101", "label_1102", "label_1103", "label_1104", "label_1105", "label_1106", "label_1107", "label_1108", "label_1109", "label_1110", "label_1111", "label_1112", "label_1113", "label_1114", "label_1115", "label_1116", "label_1117", "label_1118", "label_1119", "label_1120", "label_1121", "label_1122", "label_1123", "label_1124", "label_1125", "label_1126", "label_1127", "label_1128", "label_1129", "label_1130", "label_1131", "label_1132", "label_1133", "label_1134", "label_1135", "label_1136", "label_1137", "label_1138", "label_1139", "label_1140", "label_1141", "label_1142", "label_1143", "label_1144", "label_1145", "label_1146", "label_1147", "label_1148", "label_1149", "label_1150", "label_1151", "label_1152", "label_1153", "label_1154", "label_1155", "label_1156", "label_1157", "label_1158", "label_1159", "label_1160", "label_1161", "label_1162", "label_1163", "label_1164", "label_1165", "label_1166", "label_1167", "label_1168", "label_1169", "label_1170", "label_1171", "label_1172", "label_1173", "label_1174", "label_1175", "label_1176", "label_1177", "label_1178", "label_1179", "label_1180", "label_1181", "label_1182", "label_1183", "label_1184", "label_1185", "label_1186", "label_1187", "label_1188", "label_1189", "label_1190", "label_1191", "label_1192", "label_1193", "label_1194", "label_1195", "label_1196", "label_1197", "label_1198", "label_1199", "label_1200", "label_1201", "label_1202", "label_1203", "label_1204", "label_1205", "label_1206", "label_1207", "label_1208", "label_1209", "label_1210", "label_1211", "label_1212", "label_1213", "label_1214", "label_1215", "label_1216", "label_1217", "label_1218", "label_1219", "label_1220", "label_1221", "label_1222", "label_1223", "label_1224", "label_1225", "label_1226", "label_1227", "label_1228", "label_1229", "label_1230", "label_1231", "label_1232", "label_1233", "label_1234", "label_1235", "label_1236", "label_1237", "label_1238", "label_1239", "label_1240", "label_1241", "label_1242", "label_1243", "label_1244", "label_1245", "label_1246", "label_1247", "label_1248", "label_1249", "label_1250", "label_1251", "label_1252", "label_1253", "label_1254", "label_1255", "label_1256", "label_1257", "label_1258", "label_1259", "label_1260", "label_1261", "label_1262", "label_1263", "label_1264", "label_1265", "label_1266", "label_1267", "label_1268", "label_1269", "label_1270", "label_1271", "label_1272", "label_1273", "label_1274", "label_1275", "label_1276", "label_1277", "label_1278", "label_1279", "label_1280", "label_1281", "label_1282", "label_1283", "label_1284", "label_1285", "label_1286", "label_1287", "label_1288", "label_1289", "label_1290", "label_1291", "label_1292", "label_1293", "label_1294", "label_1295", "label_1296", "label_1297", "label_1298", "label_1299", "label_1300", "label_1301", "label_1302", "label_1303", "label_1304", "label_1305", "label_1306", "label_1307", "label_1308", "label_1309", "label_1310", "label_1311", "label_1312", "label_1313", "label_1314", "label_1315", "label_1316", "label_1317", "label_1318", "label_1319", "label_1320", "label_1321", "label_1322", "label_1323", "label_1324", "label_1325", "label_1326", "label_1327", "label_1328", "label_1329", "label_1330", "label_1331", "label_1332", "label_1333", "label_1334", "label_1335", "label_1336", "label_1337", "label_1338", "label_1339", "label_1340", "label_1341", "label_1342", "label_1343", "label_1344", "label_1345", 
"label_1346", "label_1347", "label_1348", "label_1349", "label_1350", "label_1351", "label_1352", "label_1353", "label_1354", "label_1355", "label_1356", "label_1357", "label_1358", "label_1359", "label_1360", "label_1361", "label_1362", "label_1363", "label_1364", "label_1365", "label_1366", "label_1367", "label_1368", "label_1369", "label_1370", "label_1371", "label_1372", "label_1373", "label_1374", "label_1375", "label_1376", "label_1377", "label_1378", "label_1379", "label_1380", "label_1381", "label_1382", "label_1383", "label_1384", "label_1385", "label_1386", "label_1387", "label_1388", "label_1389", "label_1390", "label_1391", "label_1392", "label_1393", "label_1394", "label_1395", "label_1396", "label_1397", "label_1398", "label_1399", "label_1400", "label_1401", "label_1402", "label_1403", "label_1404", "label_1405", "label_1406", "label_1407", "label_1408", "label_1409", "label_1410", "label_1411", "label_1412", "label_1413", "label_1414", "label_1415", "label_1416", "label_1417", "label_1418", "label_1419", "label_1420", "label_1421", "label_1422", "label_1423", "label_1424", "label_1425", "label_1426", "label_1427", "label_1428", "label_1429", "label_1430", "label_1431", "label_1432", "label_1433", "label_1434", "label_1435", "label_1436", "label_1437", "label_1438", "label_1439", "label_1440", "label_1441", "label_1442", "label_1443", "label_1444", "label_1445", "label_1446", "label_1447", "label_1448", "label_1449", "label_1450", "label_1451", "label_1452", "label_1453", "label_1454", "label_1455", "label_1456", "label_1457", "label_1458", "label_1459", "label_1460", "label_1461", "label_1462", "label_1463", "label_1464", "label_1465", "label_1466", "label_1467", "label_1468", "label_1469", "label_1470", "label_1471", "label_1472", "label_1473", "label_1474", "label_1475", "label_1476", "label_1477", "label_1478", "label_1479", "label_1480", "label_1481", "label_1482", "label_1483", "label_1484", "label_1485", "label_1486", "label_1487", "label_1488", "label_1489", "label_1490", "label_1491", "label_1492", "label_1493", "label_1494", "label_1495", "label_1496", "label_1497", "label_1498", "label_1499", "label_1500", "label_1501", "label_1502", "label_1503", "label_1504", "label_1505", "label_1506", "label_1507", "label_1508", "label_1509", "label_1510", "label_1511", "label_1512", "label_1513", "label_1514", "label_1515", "label_1516", "label_1517", "label_1518", "label_1519", "label_1520", "label_1521", "label_1522", "label_1523", "label_1524", "label_1525", "label_1526", "label_1527", "label_1528", "label_1529", "label_1530", "label_1531", "label_1532", "label_1533", "label_1534", "label_1535", "label_1536", "label_1537", "label_1538", "label_1539", "label_1540", "label_1541", "label_1542", "label_1543", "label_1544", "label_1545", "label_1546", "label_1547", "label_1548", "label_1549", "label_1550", "label_1551", "label_1552", "label_1553", "label_1554", "label_1555", "label_1556", "label_1557", "label_1558", "label_1559", "label_1560", "label_1561", "label_1562", "label_1563", "label_1564", "label_1565", "label_1566", "label_1567", "label_1568", "label_1569", "label_1570", "label_1571", "label_1572", "label_1573", "label_1574", "label_1575", "label_1576", "label_1577", "label_1578", "label_1579", "label_1580", "label_1581", "label_1582", "label_1583", "label_1584", "label_1585", "label_1586", "label_1587", "label_1588", "label_1589", "label_1590", "label_1591", "label_1592", "label_1593", "label_1594", "label_1595", "label_1596", "label_1597", "label_1598", 
"label_1599", "label_1600", "label_1601", "label_1602", "label_1603", "label_1604", "label_1605", "label_1606", "label_1607", "label_1608", "label_1609", "label_1610", "label_1611", "label_1612", "label_1613", "label_1614", "label_1615", "label_1616", "label_1617", "label_1618", "label_1619", "label_1620", "label_1621", "label_1622", "label_1623", "label_1624", "label_1625", "label_1626", "label_1627", "label_1628", "label_1629", "label_1630", "label_1631", "label_1632", "label_1633", "label_1634", "label_1635", "label_1636", "label_1637", "label_1638", "label_1639", "label_1640", "label_1641", "label_1642", "label_1643", "label_1644", "label_1645", "label_1646", "label_1647", "label_1648", "label_1649", "label_1650", "label_1651", "label_1652", "label_1653", "label_1654", "label_1655", "label_1656", "label_1657", "label_1658", "label_1659", "label_1660", "label_1661", "label_1662", "label_1663", "label_1664", "label_1665", "label_1666", "label_1667", "label_1668", "label_1669", "label_1670", "label_1671", "label_1672", "label_1673", "label_1674", "label_1675", "label_1676", "label_1677", "label_1678", "label_1679", "label_1680", "label_1681", "label_1682", "label_1683", "label_1684", "label_1685", "label_1686", "label_1687", "label_1688", "label_1689", "label_1690", "label_1691", "label_1692", "label_1693", "label_1694", "label_1695", "label_1696", "label_1697", "label_1698", "label_1699", "label_1700", "label_1701", "label_1702", "label_1703", "label_1704", "label_1705", "label_1706", "label_1707", "label_1708", "label_1709", "label_1710", "label_1711", "label_1712", "label_1713", "label_1714", "label_1715", "label_1716", "label_1717", "label_1718", "label_1719", "label_1720", "label_1721", "label_1722", "label_1723", "label_1724", "label_1725", "label_1726", "label_1727", "label_1728", "label_1729", "label_1730", "label_1731", "label_1732", "label_1733", "label_1734", "label_1735", "label_1736", "label_1737", "label_1738", "label_1739", "label_1740", "label_1741", "label_1742", "label_1743", "label_1744", "label_1745", "label_1746", "label_1747", "label_1748", "label_1749", "label_1750", "label_1751", "label_1752", "label_1753", "label_1754", "label_1755", "label_1756", "label_1757", "label_1758", "label_1759", "label_1760", "label_1761", "label_1762", "label_1763", "label_1764", "label_1765", "label_1766", "label_1767", "label_1768", "label_1769", "label_1770", "label_1771", "label_1772", "label_1773", "label_1774", "label_1775", "label_1776", "label_1777", "label_1778", "label_1779", "label_1780", "label_1781", "label_1782", "label_1783", "label_1784", "label_1785", "label_1786", "label_1787", "label_1788", "label_1789", "label_1790", "label_1791", "label_1792", "label_1793", "label_1794", "label_1795", "label_1796", "label_1797", "label_1798", "label_1799", "label_1800", "label_1801", "label_1802", "label_1803", "label_1804", "label_1805", "label_1806", "label_1807", "label_1808", "label_1809", "label_1810", "label_1811", "label_1812", "label_1813", "label_1814", "label_1815", "label_1816", "label_1817", "label_1818", "label_1819", "label_1820", "label_1821", "label_1822", "label_1823", "label_1824", "label_1825", "label_1826", "label_1827", "label_1828", "label_1829", "label_1830", "label_1831", "label_1832", "label_1833", "label_1834", "label_1835", "label_1836", "label_1837", "label_1838", "label_1839", "label_1840", "label_1841", "label_1842", "label_1843", "label_1844", "label_1845", "label_1846", "label_1847", "label_1848", "label_1849", "label_1850", "label_1851", 
"label_1852", "label_1853", "label_1854", "label_1855", "label_1856", "label_1857", "label_1858", "label_1859", "label_1860", "label_1861", "label_1862", "label_1863", "label_1864", "label_1865", "label_1866", "label_1867", "label_1868", "label_1869", "label_1870", "label_1871", "label_1872", "label_1873", "label_1874", "label_1875", "label_1876", "label_1877", "label_1878", "label_1879", "label_1880", "label_1881", "label_1882", "label_1883", "label_1884", "label_1885", "label_1886", "label_1887", "label_1888", "label_1889", "label_1890", "label_1891", "label_1892", "label_1893", "label_1894", "label_1895", "label_1896", "label_1897", "label_1898", "label_1899", "label_1900", "label_1901", "label_1902", "label_1903", "label_1904", "label_1905", "label_1906", "label_1907", "label_1908", "label_1909", "label_1910", "label_1911", "label_1912", "label_1913", "label_1914", "label_1915", "label_1916", "label_1917", "label_1918", "label_1919", "label_1920", "label_1921", "label_1922", "label_1923", "label_1924", "label_1925", "label_1926", "label_1927", "label_1928", "label_1929", "label_1930", "label_1931", "label_1932", "label_1933", "label_1934", "label_1935", "label_1936", "label_1937", "label_1938", "label_1939", "label_1940", "label_1941", "label_1942", "label_1943", "label_1944", "label_1945", "label_1946", "label_1947", "label_1948", "label_1949", "label_1950", "label_1951", "label_1952", "label_1953", "label_1954", "label_1955", "label_1956", "label_1957", "label_1958", "label_1959", "label_1960", "label_1961", "label_1962", "label_1963", "label_1964", "label_1965", "label_1966", "label_1967", "label_1968", "label_1969", "label_1970", "label_1971", "label_1972", "label_1973", "label_1974", "label_1975", "label_1976", "label_1977", "label_1978", "label_1979", "label_1980", "label_1981", "label_1982", "label_1983", "label_1984", "label_1985", "label_1986", "label_1987", "label_1988", "label_1989", "label_1990", "label_1991", "label_1992", "label_1993", "label_1994", "label_1995", "label_1996", "label_1997", "label_1998", "label_1999", "label_2000", "label_2001", "label_2002", "label_2003", "label_2004", "label_2005", "label_2006", "label_2007", "label_2008", "label_2009", "label_2010", "label_2011", "label_2012", "label_2013", "label_2014", "label_2015", "label_2016", "label_2017", "label_2018", "label_2019", "label_2020", "label_2021", "label_2022", "label_2023", "label_2024", "label_2025", "label_2026", "label_2027", "label_2028", "label_2029", "label_2030", "label_2031", "label_2032", "label_2033", "label_2034", "label_2035", "label_2036", "label_2037", "label_2038", "label_2039", "label_2040", "label_2041", "label_2042", "label_2043", "label_2044", "label_2045", "label_2046", "label_2047", "label_2048", "label_2049", "label_2050", "label_2051", "label_2052", "label_2053", "label_2054", "label_2055", "label_2056", "label_2057", "label_2058", "label_2059", "label_2060", "label_2061", "label_2062", "label_2063", "label_2064", "label_2065", "label_2066", "label_2067", "label_2068", "label_2069", "label_2070", "label_2071", "label_2072", "label_2073", "label_2074", "label_2075", "label_2076", "label_2077", "label_2078", "label_2079", "label_2080", "label_2081", "label_2082", "label_2083", "label_2084", "label_2085", "label_2086", "label_2087", "label_2088", "label_2089", "label_2090", "label_2091", "label_2092", "label_2093", "label_2094", "label_2095", "label_2096", "label_2097", "label_2098", "label_2099", "label_2100", "label_2101", "label_2102", "label_2103", "label_2104", 
"label_2105", "label_2106", "label_2107", "label_2108", "label_2109", "label_2110", "label_2111", "label_2112", "label_2113", "label_2114", "label_2115", "label_2116", "label_2117", "label_2118", "label_2119", "label_2120", "label_2121", "label_2122", "label_2123", "label_2124", "label_2125", "label_2126", "label_2127", "label_2128", "label_2129", "label_2130", "label_2131", "label_2132", "label_2133", "label_2134", "label_2135", "label_2136", "label_2137", "label_2138", "label_2139", "label_2140", "label_2141", "label_2142", "label_2143", "label_2144", "label_2145", "label_2146", "label_2147", "label_2148", "label_2149", "label_2150", "label_2151", "label_2152", "label_2153", "label_2154", "label_2155", "label_2156", "label_2157", "label_2158", "label_2159", "label_2160", "label_2161", "label_2162", "label_2163", "label_2164", "label_2165", "label_2166", "label_2167", "label_2168", "label_2169", "label_2170", "label_2171", "label_2172", "label_2173", "label_2174", "label_2175", "label_2176", "label_2177", "label_2178", "label_2179", "label_2180", "label_2181", "label_2182", "label_2183", "label_2184", "label_2185", "label_2186", "label_2187", "label_2188", "label_2189", "label_2190", "label_2191", "label_2192", "label_2193", "label_2194", "label_2195", "label_2196", "label_2197", "label_2198", "label_2199", "label_2200", "label_2201", "label_2202", "label_2203", "label_2204", "label_2205", "label_2206", "label_2207", "label_2208", "label_2209", "label_2210", "label_2211", "label_2212", "label_2213", "label_2214", "label_2215", "label_2216", "label_2217", "label_2218", "label_2219", "label_2220", "label_2221", "label_2222", "label_2223", "label_2224", "label_2225", "label_2226", "label_2227", "label_2228", "label_2229", "label_2230", "label_2231", "label_2232", "label_2233", "label_2234", "label_2235", "label_2236", "label_2237", "label_2238", "label_2239", "label_2240", "label_2241", "label_2242", "label_2243", "label_2244", "label_2245", "label_2246", "label_2247", "label_2248", "label_2249", "label_2250", "label_2251", "label_2252", "label_2253", "label_2254", "label_2255", "label_2256", "label_2257", "label_2258", "label_2259", "label_2260", "label_2261", "label_2262", "label_2263", "label_2264", "label_2265", "label_2266", "label_2267", "label_2268", "label_2269", "label_2270", "label_2271", "label_2272", "label_2273", "label_2274", "label_2275", "label_2276", "label_2277", "label_2278", "label_2279", "label_2280", "label_2281", "label_2282", "label_2283", "label_2284", "label_2285", "label_2286", "label_2287", "label_2288", "label_2289", "label_2290", "label_2291", "label_2292", "label_2293", "label_2294", "label_2295", "label_2296", "label_2297", "label_2298", "label_2299", "label_2300", "label_2301", "label_2302", "label_2303", "label_2304", "label_2305", "label_2306", "label_2307", "label_2308", "label_2309", "label_2310", "label_2311", "label_2312", "label_2313", "label_2314", "label_2315", "label_2316", "label_2317", "label_2318", "label_2319", "label_2320", "label_2321", "label_2322", "label_2323", "label_2324", "label_2325", "label_2326", "label_2327", "label_2328", "label_2329", "label_2330", "label_2331", "label_2332", "label_2333", "label_2334", "label_2335", "label_2336", "label_2337", "label_2338", "label_2339", "label_2340", "label_2341", "label_2342", "label_2343", "label_2344", "label_2345", "label_2346", "label_2347", "label_2348", "label_2349", "label_2350", "label_2351", "label_2352", "label_2353", "label_2354", "label_2355", "label_2356", "label_2357", 
"label_2358", "label_2359", "label_2360", "label_2361", "label_2362", "label_2363", "label_2364", "label_2365", "label_2366", "label_2367", "label_2368", "label_2369", "label_2370", "label_2371", "label_2372", "label_2373", "label_2374", "label_2375", "label_2376", "label_2377", "label_2378", "label_2379", "label_2380", "label_2381", "label_2382", "label_2383", "label_2384", "label_2385", "label_2386", "label_2387", "label_2388", "label_2389", "label_2390", "label_2391", "label_2392", "label_2393", "label_2394", "label_2395", "label_2396", "label_2397", "label_2398", "label_2399", "label_2400", "label_2401", "label_2402", "label_2403", "label_2404", "label_2405", "label_2406", "label_2407", "label_2408", "label_2409", "label_2410", "label_2411", "label_2412", "label_2413", "label_2414", "label_2415", "label_2416", "label_2417", "label_2418", "label_2419", "label_2420", "label_2421", "label_2422", "label_2423", "label_2424", "label_2425", "label_2426", "label_2427", "label_2428", "label_2429", "label_2430", "label_2431", "label_2432", "label_2433", "label_2434", "label_2435", "label_2436", "label_2437", "label_2438", "label_2439", "label_2440", "label_2441", "label_2442", "label_2443", "label_2444", "label_2445", "label_2446", "label_2447", "label_2448", "label_2449", "label_2450", "label_2451", "label_2452", "label_2453", "label_2454", "label_2455", "label_2456", "label_2457", "label_2458", "label_2459", "label_2460", "label_2461", "label_2462", "label_2463", "label_2464", "label_2465", "label_2466", "label_2467", "label_2468", "label_2469", "label_2470", "label_2471", "label_2472", "label_2473", "label_2474", "label_2475", "label_2476", "label_2477", "label_2478", "label_2479", "label_2480", "label_2481", "label_2482", "label_2483", "label_2484", "label_2485", "label_2486", "label_2487", "label_2488", "label_2489", "label_2490", "label_2491", "label_2492", "label_2493", "label_2494", "label_2495", "label_2496", "label_2497", "label_2498", "label_2499", "label_2500", "label_2501", "label_2502", "label_2503", "label_2504", "label_2505", "label_2506", "label_2507", "label_2508", "label_2509", "label_2510", "label_2511", "label_2512", "label_2513", "label_2514", "label_2515", "label_2516", "label_2517", "label_2518", "label_2519", "label_2520", "label_2521", "label_2522", "label_2523", "label_2524", "label_2525", "label_2526", "label_2527", "label_2528", "label_2529", "label_2530", "label_2531", "label_2532", "label_2533", "label_2534", "label_2535", "label_2536", "label_2537", "label_2538", "label_2539", "label_2540", "label_2541", "label_2542", "label_2543", "label_2544", "label_2545", "label_2546", "label_2547", "label_2548", "label_2549", "label_2550", "label_2551", "label_2552", "label_2553", "label_2554", "label_2555", "label_2556", "label_2557", "label_2558", "label_2559", "label_2560", "label_2561", "label_2562", "label_2563", "label_2564", "label_2565", "label_2566", "label_2567", "label_2568", "label_2569", "label_2570", "label_2571", "label_2572", "label_2573", "label_2574", "label_2575", "label_2576", "label_2577", "label_2578", "label_2579", "label_2580", "label_2581", "label_2582", "label_2583", "label_2584", "label_2585", "label_2586", "label_2587", "label_2588", "label_2589", "label_2590", "label_2591", "label_2592", "label_2593", "label_2594", "label_2595", "label_2596", "label_2597", "label_2598", "label_2599", "label_2600", "label_2601", "label_2602", "label_2603", "label_2604", "label_2605", "label_2606", "label_2607", "label_2608", "label_2609", "label_2610", 
"label_2611", "label_2612", "label_2613", "label_2614", "label_2615", "label_2616", "label_2617", "label_2618", "label_2619", "label_2620", "label_2621", "label_2622", "label_2623", "label_2624", "label_2625", "label_2626", "label_2627", "label_2628", "label_2629", "label_2630", "label_2631", "label_2632", "label_2633", "label_2634", "label_2635", "label_2636", "label_2637", "label_2638", "label_2639", "label_2640", "label_2641", "label_2642", "label_2643", "label_2644", "label_2645", "label_2646", "label_2647", "label_2648", "label_2649", "label_2650", "label_2651", "label_2652", "label_2653", "label_2654", "label_2655", "label_2656", "label_2657", "label_2658", "label_2659", "label_2660", "label_2661", "label_2662", "label_2663", "label_2664", "label_2665", "label_2666", "label_2667", "label_2668", "label_2669", "label_2670", "label_2671", "label_2672", "label_2673", "label_2674", "label_2675", "label_2676", "label_2677", "label_2678", "label_2679", "label_2680", "label_2681", "label_2682", "label_2683", "label_2684", "label_2685", "label_2686", "label_2687", "label_2688", "label_2689", "label_2690", "label_2691", "label_2692", "label_2693", "label_2694", "label_2695", "label_2696", "label_2697", "label_2698", "label_2699", "label_2700", "label_2701", "label_2702", "label_2703", "label_2704", "label_2705", "label_2706", "label_2707", "label_2708", "label_2709", "label_2710", "label_2711", "label_2712", "label_2713", "label_2714", "label_2715", "label_2716", "label_2717", "label_2718", "label_2719", "label_2720", "label_2721", "label_2722", "label_2723", "label_2724", "label_2725", "label_2726", "label_2727", "label_2728", "label_2729", "label_2730", "label_2731", "label_2732", "label_2733", "label_2734", "label_2735", "label_2736", "label_2737", "label_2738", "label_2739", "label_2740", "label_2741", "label_2742", "label_2743", "label_2744", "label_2745", "label_2746", "label_2747", "label_2748", "label_2749", "label_2750", "label_2751", "label_2752", "label_2753", "label_2754", "label_2755", "label_2756", "label_2757", "label_2758", "label_2759", "label_2760", "label_2761", "label_2762", "label_2763", "label_2764", "label_2765", "label_2766", "label_2767", "label_2768", "label_2769", "label_2770", "label_2771", "label_2772", "label_2773", "label_2774", "label_2775", "label_2776", "label_2777", "label_2778", "label_2779", "label_2780", "label_2781", "label_2782", "label_2783", "label_2784", "label_2785", "label_2786", "label_2787", "label_2788", "label_2789", "label_2790", "label_2791", "label_2792", "label_2793", "label_2794", "label_2795", "label_2796", "label_2797", "label_2798", "label_2799", "label_2800", "label_2801", "label_2802", "label_2803", "label_2804", "label_2805", "label_2806", "label_2807", "label_2808", "label_2809", "label_2810", "label_2811", "label_2812", "label_2813", "label_2814", "label_2815", "label_2816", "label_2817", "label_2818", "label_2819", "label_2820", "label_2821", "label_2822", "label_2823", "label_2824", "label_2825", "label_2826", "label_2827", "label_2828", "label_2829", "label_2830", "label_2831", "label_2832", "label_2833", "label_2834", "label_2835", "label_2836", "label_2837", "label_2838", "label_2839", "label_2840", "label_2841", "label_2842", "label_2843", "label_2844", "label_2845", "label_2846", "label_2847", "label_2848", "label_2849", "label_2850", "label_2851", "label_2852", "label_2853", "label_2854", "label_2855", "label_2856", "label_2857", "label_2858", "label_2859", "label_2860", "label_2861", "label_2862", "label_2863", 
"label_2864", "label_2865", "label_2866", "label_2867", "label_2868", "label_2869", "label_2870", "label_2871", "label_2872", "label_2873", "label_2874", "label_2875", "label_2876", "label_2877", "label_2878", "label_2879", "label_2880", "label_2881", "label_2882", "label_2883", "label_2884", "label_2885", "label_2886", "label_2887", "label_2888", "label_2889", "label_2890", "label_2891", "label_2892", "label_2893", "label_2894", "label_2895", "label_2896", "label_2897", "label_2898", "label_2899", "label_2900", "label_2901", "label_2902", "label_2903", "label_2904", "label_2905", "label_2906", "label_2907", "label_2908", "label_2909", "label_2910", "label_2911", "label_2912", "label_2913", "label_2914", "label_2915", "label_2916", "label_2917", "label_2918", "label_2919", "label_2920", "label_2921", "label_2922", "label_2923", "label_2924", "label_2925", "label_2926", "label_2927", "label_2928", "label_2929", "label_2930", "label_2931", "label_2932", "label_2933", "label_2934", "label_2935", "label_2936", "label_2937", "label_2938", "label_2939", "label_2940", "label_2941", "label_2942", "label_2943", "label_2944", "label_2945", "label_2946", "label_2947", "label_2948", "label_2949", "label_2950", "label_2951", "label_2952", "label_2953", "label_2954", "label_2955", "label_2956", "label_2957", "label_2958", "label_2959", "label_2960", "label_2961", "label_2962", "label_2963", "label_2964", "label_2965", "label_2966", "label_2967", "label_2968", "label_2969", "label_2970", "label_2971", "label_2972", "label_2973", "label_2974", "label_2975", "label_2976", "label_2977", "label_2978", "label_2979", "label_2980", "label_2981", "label_2982", "label_2983", "label_2984", "label_2985", "label_2986", "label_2987", "label_2988", "label_2989", "label_2990", "label_2991", "label_2992", "label_2993", "label_2994", "label_2995", "label_2996", "label_2997", "label_2998", "label_2999", "label_3000", "label_3001", "label_3002", "label_3003", "label_3004", "label_3005", "label_3006", "label_3007", "label_3008", "label_3009", "label_3010", "label_3011", "label_3012", "label_3013", "label_3014", "label_3015", "label_3016", "label_3017", "label_3018", "label_3019", "label_3020", "label_3021", "label_3022", "label_3023", "label_3024", "label_3025", "label_3026", "label_3027", "label_3028", "label_3029", "label_3030", "label_3031", "label_3032", "label_3033", "label_3034", "label_3035", "label_3036", "label_3037", "label_3038", "label_3039", "label_3040", "label_3041", "label_3042", "label_3043", "label_3044", "label_3045", "label_3046", "label_3047", "label_3048", "label_3049", "label_3050", "label_3051", "label_3052", "label_3053", "label_3054", "label_3055", "label_3056", "label_3057", "label_3058", "label_3059", "label_3060", "label_3061", "label_3062", "label_3063", "label_3064", "label_3065", "label_3066", "label_3067", "label_3068", "label_3069", "label_3070", "label_3071", "label_3072", "label_3073", "label_3074", "label_3075", "label_3076", "label_3077", "label_3078", "label_3079", "label_3080", "label_3081", "label_3082", "label_3083", "label_3084", "label_3085", "label_3086", "label_3087", "label_3088", "label_3089", "label_3090", "label_3091", "label_3092", "label_3093", "label_3094", "label_3095", "label_3096", "label_3097", "label_3098", "label_3099", "label_3100", "label_3101", "label_3102", "label_3103", "label_3104", "label_3105", "label_3106", "label_3107", "label_3108", "label_3109", "label_3110", "label_3111", "label_3112", "label_3113", "label_3114", "label_3115", "label_3116", 
"label_3117", "label_3118", "label_3119", "label_3120", "label_3121", "label_3122", "label_3123", "label_3124", "label_3125", "label_3126", "label_3127", "label_3128", "label_3129", "label_3130", "label_3131", "label_3132", "label_3133", "label_3134", "label_3135", "label_3136", "label_3137", "label_3138", "label_3139", "label_3140", "label_3141", "label_3142", "label_3143", "label_3144", "label_3145", "label_3146", "label_3147", "label_3148", "label_3149", "label_3150", "label_3151", "label_3152", "label_3153", "label_3154", "label_3155", "label_3156", "label_3157", "label_3158", "label_3159", "label_3160", "label_3161", "label_3162", "label_3163", "label_3164", "label_3165", "label_3166", "label_3167", "label_3168", "label_3169", "label_3170", "label_3171", "label_3172", "label_3173", "label_3174", "label_3175", "label_3176", "label_3177", "label_3178", "label_3179", "label_3180", "label_3181", "label_3182", "label_3183", "label_3184", "label_3185", "label_3186", "label_3187", "label_3188", "label_3189", "label_3190", "label_3191", "label_3192", "label_3193", "label_3194", "label_3195", "label_3196", "label_3197", "label_3198", "label_3199", "label_3200", "label_3201", "label_3202", "label_3203", "label_3204", "label_3205", "label_3206", "label_3207", "label_3208", "label_3209", "label_3210", "label_3211", "label_3212", "label_3213", "label_3214", "label_3215", "label_3216", "label_3217", "label_3218", "label_3219", "label_3220", "label_3221", "label_3222", "label_3223", "label_3224", "label_3225", "label_3226", "label_3227", "label_3228", "label_3229", "label_3230", "label_3231", "label_3232", "label_3233", "label_3234", "label_3235", "label_3236", "label_3237", "label_3238", "label_3239", "label_3240", "label_3241", "label_3242", "label_3243", "label_3244", "label_3245", "label_3246", "label_3247", "label_3248", "label_3249", "label_3250", "label_3251", "label_3252", "label_3253", "label_3254", "label_3255", "label_3256", "label_3257", "label_3258", "label_3259", "label_3260", "label_3261", "label_3262", "label_3263", "label_3264", "label_3265", "label_3266", "label_3267", "label_3268", "label_3269", "label_3270", "label_3271", "label_3272", "label_3273", "label_3274", "label_3275", "label_3276", "label_3277", "label_3278", "label_3279", "label_3280", "label_3281", "label_3282", "label_3283", "label_3284", "label_3285", "label_3286", "label_3287", "label_3288", "label_3289", "label_3290", "label_3291", "label_3292", "label_3293", "label_3294", "label_3295", "label_3296", "label_3297", "label_3298", "label_3299", "label_3300", "label_3301", "label_3302", "label_3303", "label_3304", "label_3305", "label_3306", "label_3307", "label_3308", "label_3309", "label_3310", "label_3311", "label_3312", "label_3313", "label_3314", "label_3315", "label_3316", "label_3317", "label_3318", "label_3319", "label_3320", "label_3321", "label_3322", "label_3323", "label_3324", "label_3325", "label_3326", "label_3327", "label_3328", "label_3329", "label_3330", "label_3331", "label_3332", "label_3333", "label_3334", "label_3335", "label_3336", "label_3337", "label_3338", "label_3339", "label_3340", "label_3341", "label_3342", "label_3343", "label_3344", "label_3345", "label_3346", "label_3347", "label_3348", "label_3349", "label_3350", "label_3351", "label_3352", "label_3353", "label_3354", "label_3355", "label_3356", "label_3357", "label_3358", "label_3359", "label_3360", "label_3361", "label_3362", "label_3363", "label_3364", "label_3365", "label_3366", "label_3367", "label_3368", "label_3369", 
"label_3370", "label_3371", "label_3372", "label_3373", "label_3374", "label_3375", "label_3376", "label_3377", "label_3378", "label_3379", "label_3380", "label_3381", "label_3382", "label_3383", "label_3384", "label_3385", "label_3386", "label_3387", "label_3388", "label_3389", "label_3390", "label_3391", "label_3392", "label_3393", "label_3394", "label_3395", "label_3396", "label_3397", "label_3398", "label_3399", "label_3400", "label_3401", "label_3402", "label_3403", "label_3404", "label_3405", "label_3406", "label_3407", "label_3408", "label_3409", "label_3410", "label_3411", "label_3412", "label_3413", "label_3414", "label_3415", "label_3416", "label_3417", "label_3418", "label_3419", "label_3420", "label_3421", "label_3422", "label_3423", "label_3424", "label_3425", "label_3426", "label_3427", "label_3428", "label_3429", "label_3430", "label_3431", "label_3432", "label_3433", "label_3434", "label_3435", "label_3436", "label_3437", "label_3438", "label_3439", "label_3440", "label_3441", "label_3442", "label_3443", "label_3444", "label_3445", "label_3446", "label_3447", "label_3448", "label_3449", "label_3450", "label_3451", "label_3452", "label_3453", "label_3454", "label_3455", "label_3456", "label_3457", "label_3458", "label_3459", "label_3460", "label_3461", "label_3462", "label_3463", "label_3464", "label_3465", "label_3466", "label_3467", "label_3468", "label_3469", "label_3470", "label_3471", "label_3472", "label_3473", "label_3474", "label_3475", "label_3476", "label_3477", "label_3478", "label_3479", "label_3480", "label_3481", "label_3482", "label_3483", "label_3484", "label_3485", "label_3486", "label_3487", "label_3488", "label_3489", "label_3490", "label_3491", "label_3492", "label_3493", "label_3494", "label_3495", "label_3496", "label_3497", "label_3498", "label_3499", "label_3500", "label_3501", "label_3502", "label_3503", "label_3504", "label_3505", "label_3506", "label_3507", "label_3508", "label_3509", "label_3510", "label_3511", "label_3512", "label_3513", "label_3514", "label_3515", "label_3516", "label_3517", "label_3518", "label_3519", "label_3520", "label_3521", "label_3522", "label_3523", "label_3524", "label_3525", "label_3526", "label_3527", "label_3528", "label_3529", "label_3530", "label_3531", "label_3532", "label_3533", "label_3534", "label_3535", "label_3536", "label_3537", "label_3538", "label_3539", "label_3540", "label_3541", "label_3542", "label_3543", "label_3544", "label_3545", "label_3546", "label_3547", "label_3548", "label_3549", "label_3550", "label_3551", "label_3552", "label_3553", "label_3554", "label_3555", "label_3556", "label_3557", "label_3558", "label_3559", "label_3560", "label_3561", "label_3562", "label_3563", "label_3564", "label_3565", "label_3566", "label_3567", "label_3568", "label_3569", "label_3570", "label_3571", "label_3572", "label_3573", "label_3574", "label_3575", "label_3576", "label_3577", "label_3578", "label_3579", "label_3580", "label_3581", "label_3582", "label_3583", "label_3584", "label_3585", "label_3586", "label_3587", "label_3588", "label_3589", "label_3590", "label_3591", "label_3592", "label_3593", "label_3594", "label_3595", "label_3596", "label_3597", "label_3598", "label_3599", "label_3600", "label_3601", "label_3602", "label_3603", "label_3604", "label_3605", "label_3606", "label_3607", "label_3608", "label_3609", "label_3610", "label_3611", "label_3612", "label_3613", "label_3614", "label_3615", "label_3616", "label_3617", "label_3618", "label_3619", "label_3620", "label_3621", "label_3622", 
"label_3623", "label_3624", "label_3625", "label_3626", "label_3627", "label_3628", "label_3629", "label_3630", "label_3631", "label_3632", "label_3633", "label_3634", "label_3635", "label_3636", "label_3637", "label_3638", "label_3639", "label_3640", "label_3641", "label_3642", "label_3643", "label_3644", "label_3645", "label_3646", "label_3647", "label_3648", "label_3649", "label_3650", "label_3651", "label_3652", "label_3653", "label_3654", "label_3655", "label_3656", "label_3657", "label_3658", "label_3659", "label_3660", "label_3661", "label_3662", "label_3663", "label_3664", "label_3665", "label_3666", "label_3667", "label_3668", "label_3669", "label_3670", "label_3671", "label_3672", "label_3673", "label_3674", "label_3675", "label_3676", "label_3677", "label_3678", "label_3679", "label_3680", "label_3681", "label_3682", "label_3683", "label_3684", "label_3685", "label_3686", "label_3687", "label_3688", "label_3689", "label_3690", "label_3691", "label_3692", "label_3693", "label_3694", "label_3695", "label_3696", "label_3697", "label_3698", "label_3699", "label_3700", "label_3701", "label_3702", "label_3703", "label_3704", "label_3705", "label_3706", "label_3707", "label_3708", "label_3709", "label_3710", "label_3711", "label_3712", "label_3713", "label_3714", "label_3715", "label_3716", "label_3717", "label_3718", "label_3719", "label_3720", "label_3721", "label_3722", "label_3723", "label_3724", "label_3725", "label_3726", "label_3727", "label_3728", "label_3729", "label_3730", "label_3731", "label_3732", "label_3733", "label_3734", "label_3735", "label_3736", "label_3737", "label_3738", "label_3739", "label_3740", "label_3741", "label_3742", "label_3743", "label_3744", "label_3745", "label_3746", "label_3747", "label_3748", "label_3749", "label_3750", "label_3751", "label_3752", "label_3753", "label_3754", "label_3755", "label_3756", "label_3757", "label_3758", "label_3759", "label_3760", "label_3761", "label_3762", "label_3763", "label_3764", "label_3765", "label_3766", "label_3767", "label_3768", "label_3769", "label_3770", "label_3771", "label_3772", "label_3773", "label_3774", "label_3775", "label_3776", "label_3777", "label_3778", "label_3779", "label_3780", "label_3781", "label_3782", "label_3783", "label_3784", "label_3785", "label_3786", "label_3787", "label_3788", "label_3789", "label_3790", "label_3791", "label_3792", "label_3793", "label_3794", "label_3795", "label_3796", "label_3797", "label_3798", "label_3799", "label_3800", "label_3801", "label_3802", "label_3803", "label_3804", "label_3805", "label_3806", "label_3807", "label_3808", "label_3809", "label_3810", "label_3811", "label_3812", "label_3813", "label_3814", "label_3815", "label_3816", "label_3817", "label_3818", "label_3819", "label_3820", "label_3821", "label_3822", "label_3823", "label_3824", "label_3825", "label_3826", "label_3827", "label_3828", "label_3829", "label_3830", "label_3831", "label_3832", "label_3833", "label_3834", "label_3835", "label_3836", "label_3837", "label_3838", "label_3839", "label_3840", "label_3841", "label_3842", "label_3843", "label_3844", "label_3845", "label_3846", "label_3847", "label_3848", "label_3849", "label_3850", "label_3851", "label_3852", "label_3853", "label_3854", "label_3855", "label_3856", "label_3857", "label_3858", "label_3859", "label_3860", "label_3861", "label_3862", "label_3863", "label_3864", "label_3865", "label_3866", "label_3867", "label_3868", "label_3869", "label_3870", "label_3871", "label_3872", "label_3873", "label_3874", "label_3875", 
"label_3876", "label_3877", "label_3878", "label_3879", "label_3880", "label_3881", "label_3882", "label_3883", "label_3884", "label_3885", "label_3886", "label_3887", "label_3888", "label_3889", "label_3890", "label_3891", "label_3892", "label_3893", "label_3894", "label_3895", "label_3896", "label_3897", "label_3898", "label_3899", "label_3900", "label_3901", "label_3902", "label_3903", "label_3904", "label_3905", "label_3906", "label_3907", "label_3908", "label_3909", "label_3910", "label_3911", "label_3912", "label_3913", "label_3914", "label_3915", "label_3916", "label_3917", "label_3918", "label_3919", "label_3920", "label_3921", "label_3922", "label_3923", "label_3924", "label_3925", "label_3926", "label_3927", "label_3928", "label_3929", "label_3930", "label_3931", "label_3932", "label_3933", "label_3934", "label_3935", "label_3936", "label_3937", "label_3938", "label_3939", "label_3940", "label_3941", "label_3942", "label_3943", "label_3944", "label_3945", "label_3946", "label_3947", "label_3948", "label_3949", "label_3950", "label_3951", "label_3952", "label_3953", "label_3954", "label_3955", "label_3956", "label_3957", "label_3958", "label_3959", "label_3960", "label_3961", "label_3962", "label_3963", "label_3964", "label_3965", "label_3966", "label_3967", "label_3968", "label_3969", "label_3970", "label_3971", "label_3972", "label_3973", "label_3974", "label_3975", "label_3976", "label_3977", "label_3978", "label_3979", "label_3980", "label_3981", "label_3982", "label_3983", "label_3984", "label_3985", "label_3986", "label_3987", "label_3988", "label_3989", "label_3990", "label_3991", "label_3992", "label_3993", "label_3994", "label_3995", "label_3996", "label_3997", "label_3998", "label_3999", "label_4000", "label_4001", "label_4002", "label_4003", "label_4004", "label_4005", "label_4006", "label_4007", "label_4008", "label_4009", "label_4010", "label_4011", "label_4012", "label_4013", "label_4014", "label_4015", "label_4016", "label_4017", "label_4018", "label_4019", "label_4020", "label_4021", "label_4022", "label_4023", "label_4024", "label_4025", "label_4026", "label_4027", "label_4028", "label_4029", "label_4030", "label_4031", "label_4032", "label_4033", "label_4034", "label_4035", "label_4036", "label_4037", "label_4038", "label_4039", "label_4040", "label_4041", "label_4042", "label_4043", "label_4044", "label_4045", "label_4046", "label_4047", "label_4048", "label_4049", "label_4050", "label_4051", "label_4052", "label_4053", "label_4054", "label_4055", "label_4056", "label_4057", "label_4058", "label_4059", "label_4060", "label_4061", "label_4062", "label_4063", "label_4064", "label_4065", "label_4066", "label_4067", "label_4068", "label_4069", "label_4070", "label_4071", "label_4072", "label_4073", "label_4074", "label_4075", "label_4076", "label_4077", "label_4078", "label_4079", "label_4080", "label_4081", "label_4082", "label_4083", "label_4084", "label_4085", "label_4086", "label_4087", "label_4088", "label_4089", "label_4090", "label_4091", "label_4092", "label_4093", "label_4094", "label_4095", "label_4096", "label_4097", "label_4098", "label_4099", "label_4100", "label_4101", "label_4102", "label_4103", "label_4104", "label_4105", "label_4106", "label_4107", "label_4108", "label_4109", "label_4110", "label_4111", "label_4112", "label_4113", "label_4114", "label_4115", "label_4116", "label_4117", "label_4118", "label_4119", "label_4120", "label_4121", "label_4122", "label_4123", "label_4124", "label_4125", "label_4126", "label_4127", "label_4128", 
"label_4129", "label_4130", "label_4131", "label_4132", "label_4133", "label_4134", "label_4135", "label_4136", "label_4137", "label_4138", "label_4139", "label_4140", "label_4141", "label_4142", "label_4143", "label_4144", "label_4145", "label_4146", "label_4147", "label_4148", "label_4149", "label_4150", "label_4151", "label_4152", "label_4153", "label_4154", "label_4155", "label_4156", "label_4157", "label_4158", "label_4159", "label_4160", "label_4161", "label_4162", "label_4163", "label_4164", "label_4165", "label_4166", "label_4167", "label_4168", "label_4169", "label_4170", "label_4171", "label_4172", "label_4173", "label_4174", "label_4175", "label_4176", "label_4177", "label_4178", "label_4179", "label_4180", "label_4181", "label_4182", "label_4183", "label_4184", "label_4185", "label_4186", "label_4187", "label_4188", "label_4189", "label_4190", "label_4191", "label_4192", "label_4193", "label_4194", "label_4195", "label_4196", "label_4197", "label_4198", "label_4199", "label_4200", "label_4201", "label_4202", "label_4203", "label_4204", "label_4205", "label_4206", "label_4207", "label_4208", "label_4209", "label_4210", "label_4211", "label_4212", "label_4213", "label_4214", "label_4215", "label_4216", "label_4217", "label_4218", "label_4219", "label_4220", "label_4221", "label_4222", "label_4223", "label_4224", "label_4225", "label_4226", "label_4227", "label_4228", "label_4229", "label_4230", "label_4231", "label_4232", "label_4233", "label_4234", "label_4235", "label_4236", "label_4237", "label_4238", "label_4239", "label_4240", "label_4241", "label_4242", "label_4243", "label_4244", "label_4245", "label_4246", "label_4247", "label_4248", "label_4249", "label_4250", "label_4251", "label_4252", "label_4253", "label_4254", "label_4255", "label_4256", "label_4257", "label_4258", "label_4259", "label_4260", "label_4261", "label_4262", "label_4263", "label_4264", "label_4265", "label_4266", "label_4267", "label_4268", "label_4269", "label_4270", "label_4271", "label_4272", "label_4273", "label_4274", "label_4275", "label_4276", "label_4277", "label_4278", "label_4279", "label_4280", "label_4281", "label_4282", "label_4283", "label_4284", "label_4285", "label_4286", "label_4287", "label_4288", "label_4289", "label_4290", "label_4291", "label_4292", "label_4293", "label_4294", "label_4295", "label_4296", "label_4297", "label_4298", "label_4299", "label_4300", "label_4301", "label_4302", "label_4303", "label_4304", "label_4305", "label_4306", "label_4307", "label_4308", "label_4309", "label_4310", "label_4311", "label_4312", "label_4313", "label_4314", "label_4315", "label_4316", "label_4317", "label_4318", "label_4319", "label_4320", "label_4321", "label_4322", "label_4323", "label_4324", "label_4325", "label_4326", "label_4327", "label_4328", "label_4329", "label_4330", "label_4331", "label_4332", "label_4333", "label_4334", "label_4335", "label_4336", "label_4337", "label_4338", "label_4339", "label_4340", "label_4341", "label_4342", "label_4343", "label_4344", "label_4345", "label_4346", "label_4347", "label_4348", "label_4349", "label_4350", "label_4351", "label_4352", "label_4353", "label_4354", "label_4355", "label_4356", "label_4357", "label_4358", "label_4359", "label_4360", "label_4361", "label_4362", "label_4363", "label_4364", "label_4365", "label_4366", "label_4367", "label_4368", "label_4369", "label_4370", "label_4371", "label_4372", "label_4373", "label_4374", "label_4375", "label_4376", "label_4377", "label_4378", "label_4379", "label_4380", "label_4381", 
"label_4382", "label_4383", "label_4384", "label_4385", "label_4386", "label_4387", "label_4388", "label_4389", "label_4390", "label_4391", "label_4392", "label_4393", "label_4394", "label_4395", "label_4396", "label_4397", "label_4398", "label_4399", "label_4400", "label_4401", "label_4402", "label_4403", "label_4404", "label_4405", "label_4406", "label_4407", "label_4408", "label_4409", "label_4410", "label_4411", "label_4412", "label_4413", "label_4414", "label_4415", "label_4416", "label_4417", "label_4418", "label_4419", "label_4420", "label_4421", "label_4422", "label_4423", "label_4424", "label_4425", "label_4426", "label_4427", "label_4428", "label_4429", "label_4430", "label_4431", "label_4432", "label_4433", "label_4434", "label_4435", "label_4436", "label_4437", "label_4438", "label_4439", "label_4440", "label_4441", "label_4442", "label_4443", "label_4444", "label_4445", "label_4446", "label_4447", "label_4448", "label_4449", "label_4450", "label_4451", "label_4452", "label_4453", "label_4454", "label_4455", "label_4456", "label_4457", "label_4458", "label_4459", "label_4460", "label_4461", "label_4462", "label_4463", "label_4464", "label_4465", "label_4466", "label_4467", "label_4468", "label_4469", "label_4470", "label_4471", "label_4472", "label_4473", "label_4474", "label_4475", "label_4476", "label_4477", "label_4478", "label_4479", "label_4480", "label_4481", "label_4482", "label_4483", "label_4484", "label_4485", "label_4486", "label_4487", "label_4488", "label_4489", "label_4490", "label_4491", "label_4492", "label_4493", "label_4494", "label_4495", "label_4496", "label_4497", "label_4498", "label_4499", "label_4500", "label_4501", "label_4502", "label_4503", "label_4504", "label_4505", "label_4506", "label_4507", "label_4508", "label_4509", "label_4510", "label_4511", "label_4512", "label_4513", "label_4514", "label_4515", "label_4516", "label_4517", "label_4518", "label_4519", "label_4520", "label_4521", "label_4522", "label_4523", "label_4524", "label_4525", "label_4526", "label_4527", "label_4528", "label_4529", "label_4530", "label_4531", "label_4532", "label_4533", "label_4534", "label_4535", "label_4536", "label_4537", "label_4538", "label_4539", "label_4540", "label_4541", "label_4542", "label_4543", "label_4544", "label_4545", "label_4546", "label_4547", "label_4548", "label_4549", "label_4550", "label_4551", "label_4552", "label_4553", "label_4554", "label_4555", "label_4556", "label_4557", "label_4558", "label_4559", "label_4560", "label_4561", "label_4562", "label_4563", "label_4564", "label_4565", "label_4566", "label_4567", "label_4568", "label_4569", "label_4570", "label_4571", "label_4572", "label_4573", "label_4574", "label_4575", "label_4576", "label_4577", "label_4578", "label_4579", "label_4580", "label_4581", "label_4582", "label_4583", "label_4584", "label_4585", "label_4586", "label_4587", "label_4588", "label_4589", "label_4590", "label_4591", "label_4592", "label_4593", "label_4594", "label_4595", "label_4596", "label_4597", "label_4598", "label_4599", "label_4600", "label_4601", "label_4602", "label_4603", "label_4604", "label_4605", "label_4606", "label_4607", "label_4608", "label_4609", "label_4610", "label_4611", "label_4612", "label_4613", "label_4614", "label_4615", "label_4616", "label_4617", "label_4618", "label_4619", "label_4620", "label_4621", "label_4622", "label_4623", "label_4624", "label_4625", "label_4626", "label_4627", "label_4628", "label_4629", "label_4630", "label_4631", "label_4632", "label_4633", "label_4634", 
"label_4635", "label_4636", "label_4637", "label_4638", "label_4639", "label_4640", "label_4641", "label_4642", "label_4643", "label_4644", "label_4645", "label_4646", "label_4647", "label_4648", "label_4649", "label_4650", "label_4651", "label_4652", "label_4653", "label_4654", "label_4655", "label_4656", "label_4657", "label_4658", "label_4659", "label_4660", "label_4661", "label_4662", "label_4663", "label_4664", "label_4665", "label_4666", "label_4667", "label_4668", "label_4669", "label_4670", "label_4671", "label_4672", "label_4673", "label_4674", "label_4675", "label_4676", "label_4677", "label_4678", "label_4679", "label_4680", "label_4681", "label_4682", "label_4683", "label_4684", "label_4685", "label_4686", "label_4687", "label_4688", "label_4689", "label_4690", "label_4691", "label_4692", "label_4693", "label_4694", "label_4695", "label_4696", "label_4697", "label_4698", "label_4699", "label_4700", "label_4701", "label_4702", "label_4703", "label_4704", "label_4705", "label_4706", "label_4707", "label_4708", "label_4709", "label_4710", "label_4711", "label_4712", "label_4713", "label_4714", "label_4715", "label_4716", "label_4717", "label_4718", "label_4719", "label_4720", "label_4721", "label_4722", "label_4723", "label_4724", "label_4725", "label_4726", "label_4727", "label_4728", "label_4729", "label_4730", "label_4731", "label_4732", "label_4733", "label_4734", "label_4735", "label_4736", "label_4737", "label_4738", "label_4739", "label_4740", "label_4741", "label_4742", "label_4743", "label_4744", "label_4745", "label_4746", "label_4747", "label_4748", "label_4749", "label_4750", "label_4751", "label_4752", "label_4753", "label_4754", "label_4755", "label_4756", "label_4757", "label_4758", "label_4759", "label_4760", "label_4761", "label_4762", "label_4763", "label_4764", "label_4765", "label_4766", "label_4767", "label_4768", "label_4769", "label_4770", "label_4771", "label_4772", "label_4773", "label_4774", "label_4775", "label_4776", "label_4777", "label_4778", "label_4779", "label_4780", "label_4781", "label_4782", "label_4783", "label_4784", "label_4785", "label_4786", "label_4787", "label_4788", "label_4789", "label_4790", "label_4791", "label_4792", "label_4793", "label_4794", "label_4795", "label_4796", "label_4797", "label_4798", "label_4799", "label_4800", "label_4801", "label_4802", "label_4803", "label_4804", "label_4805", "label_4806", "label_4807", "label_4808", "label_4809", "label_4810", "label_4811", "label_4812", "label_4813", "label_4814", "label_4815", "label_4816", "label_4817", "label_4818", "label_4819", "label_4820", "label_4821", "label_4822", "label_4823", "label_4824", "label_4825", "label_4826", "label_4827", "label_4828", "label_4829", "label_4830", "label_4831", "label_4832", "label_4833", "label_4834", "label_4835", "label_4836", "label_4837", "label_4838", "label_4839", "label_4840", "label_4841", "label_4842", "label_4843", "label_4844", "label_4845", "label_4846", "label_4847", "label_4848", "label_4849", "label_4850", "label_4851", "label_4852", "label_4853", "label_4854", "label_4855", "label_4856", "label_4857", "label_4858", "label_4859", "label_4860", "label_4861", "label_4862", "label_4863", "label_4864", "label_4865", "label_4866", "label_4867", "label_4868", "label_4869", "label_4870", "label_4871", "label_4872", "label_4873", "label_4874", "label_4875", "label_4876", "label_4877", "label_4878", "label_4879", "label_4880", "label_4881", "label_4882", "label_4883", "label_4884", "label_4885", "label_4886", "label_4887", 
"label_4888", "label_4889", "label_4890", "label_4891", "label_4892", "label_4893", "label_4894", "label_4895", "label_4896", "label_4897", "label_4898", "label_4899", "label_4900", "label_4901", "label_4902", "label_4903", "label_4904", "label_4905", "label_4906", "label_4907", "label_4908", "label_4909", "label_4910", "label_4911", "label_4912", "label_4913", "label_4914", "label_4915", "label_4916", "label_4917", "label_4918", "label_4919", "label_4920", "label_4921", "label_4922", "label_4923", "label_4924", "label_4925", "label_4926", "label_4927", "label_4928", "label_4929", "label_4930", "label_4931", "label_4932", "label_4933", "label_4934", "label_4935", "label_4936", "label_4937", "label_4938", "label_4939", "label_4940", "label_4941", "label_4942", "label_4943", "label_4944", "label_4945", "label_4946", "label_4947", "label_4948", "label_4949", "label_4950", "label_4951", "label_4952", "label_4953", "label_4954", "label_4955", "label_4956", "label_4957", "label_4958", "label_4959", "label_4960", "label_4961", "label_4962", "label_4963", "label_4964", "label_4965", "label_4966", "label_4967", "label_4968", "label_4969", "label_4970", "label_4971", "label_4972", "label_4973", "label_4974", "label_4975", "label_4976", "label_4977", "label_4978", "label_4979", "label_4980", "label_4981", "label_4982", "label_4983", "label_4984", "label_4985", "label_4986", "label_4987", "label_4988", "label_4989", "label_4990", "label_4991", "label_4992", "label_4993", "label_4994", "label_4995", "label_4996", "label_4997", "label_4998", "label_4999", "label_5000", "label_5001", "label_5002", "label_5003", "label_5004", "label_5005", "label_5006", "label_5007", "label_5008", "label_5009", "label_5010", "label_5011", "label_5012", "label_5013", "label_5014", "label_5015", "label_5016", "label_5017", "label_5018", "label_5019", "label_5020", "label_5021", "label_5022", "label_5023", "label_5024", "label_5025", "label_5026", "label_5027", "label_5028", "label_5029", "label_5030", "label_5031", "label_5032", "label_5033", "label_5034", "label_5035", "label_5036", "label_5037", "label_5038", "label_5039", "label_5040", "label_5041", "label_5042", "label_5043", "label_5044", "label_5045", "label_5046", "label_5047", "label_5048", "label_5049", "label_5050", "label_5051", "label_5052", "label_5053", "label_5054", "label_5055", "label_5056", "label_5057", "label_5058", "label_5059", "label_5060", "label_5061", "label_5062", "label_5063", "label_5064", "label_5065", "label_5066", "label_5067", "label_5068", "label_5069", "label_5070", "label_5071", "label_5072", "label_5073", "label_5074", "label_5075", "label_5076", "label_5077", "label_5078", "label_5079", "label_5080", "label_5081", "label_5082", "label_5083", "label_5084", "label_5085", "label_5086", "label_5087", "label_5088", "label_5089", "label_5090", "label_5091", "label_5092", "label_5093", "label_5094", "label_5095", "label_5096", "label_5097", "label_5098", "label_5099", "label_5100", "label_5101", "label_5102", "label_5103", "label_5104", "label_5105", "label_5106", "label_5107", "label_5108", "label_5109", "label_5110", "label_5111", "label_5112", "label_5113", "label_5114", "label_5115", "label_5116", "label_5117", "label_5118", "label_5119", "label_5120", "label_5121", "label_5122", "label_5123", "label_5124", "label_5125", "label_5126", "label_5127", "label_5128", "label_5129", "label_5130", "label_5131", "label_5132", "label_5133", "label_5134", "label_5135", "label_5136", "label_5137", "label_5138", "label_5139", "label_5140", 
"label_5141", "label_5142", "label_5143", "label_5144", "label_5145", "label_5146", "label_5147", "label_5148", "label_5149", "label_5150", "label_5151", "label_5152", "label_5153", "label_5154", "label_5155", "label_5156", "label_5157", "label_5158", "label_5159", "label_5160", "label_5161", "label_5162", "label_5163", "label_5164", "label_5165", "label_5166", "label_5167", "label_5168", "label_5169", "label_5170", "label_5171", "label_5172", "label_5173", "label_5174", "label_5175", "label_5176", "label_5177", "label_5178", "label_5179", "label_5180", "label_5181", "label_5182", "label_5183", "label_5184", "label_5185", "label_5186", "label_5187", "label_5188", "label_5189", "label_5190", "label_5191", "label_5192", "label_5193", "label_5194", "label_5195", "label_5196", "label_5197", "label_5198", "label_5199", "label_5200", "label_5201", "label_5202", "label_5203", "label_5204", "label_5205", "label_5206", "label_5207", "label_5208", "label_5209", "label_5210", "label_5211", "label_5212", "label_5213", "label_5214", "label_5215", "label_5216", "label_5217", "label_5218", "label_5219", "label_5220", "label_5221", "label_5222", "label_5223", "label_5224", "label_5225", "label_5226", "label_5227", "label_5228", "label_5229", "label_5230", "label_5231", "label_5232", "label_5233", "label_5234", "label_5235", "label_5236", "label_5237", "label_5238", "label_5239", "label_5240", "label_5241", "label_5242", "label_5243", "label_5244", "label_5245", "label_5246", "label_5247", "label_5248", "label_5249", "label_5250", "label_5251", "label_5252", "label_5253", "label_5254", "label_5255", "label_5256", "label_5257", "label_5258", "label_5259", "label_5260", "label_5261", "label_5262", "label_5263", "label_5264", "label_5265", "label_5266", "label_5267", "label_5268", "label_5269", "label_5270", "label_5271", "label_5272", "label_5273", "label_5274", "label_5275", "label_5276", "label_5277", "label_5278", "label_5279", "label_5280", "label_5281", "label_5282", "label_5283", "label_5284", "label_5285", "label_5286", "label_5287", "label_5288", "label_5289", "label_5290", "label_5291", "label_5292", "label_5293", "label_5294", "label_5295", "label_5296", "label_5297", "label_5298", "label_5299", "label_5300", "label_5301", "label_5302", "label_5303", "label_5304", "label_5305", "label_5306", "label_5307", "label_5308", "label_5309", "label_5310", "label_5311", "label_5312", "label_5313", "label_5314", "label_5315", "label_5316", "label_5317", "label_5318", "label_5319", "label_5320", "label_5321", "label_5322", "label_5323", "label_5324", "label_5325", "label_5326", "label_5327", "label_5328", "label_5329", "label_5330", "label_5331", "label_5332", "label_5333", "label_5334", "label_5335", "label_5336", "label_5337", "label_5338", "label_5339", "label_5340", "label_5341", "label_5342", "label_5343", "label_5344", "label_5345", "label_5346", "label_5347", "label_5348", "label_5349", "label_5350", "label_5351", "label_5352", "label_5353", "label_5354", "label_5355", "label_5356", "label_5357", "label_5358", "label_5359", "label_5360", "label_5361", "label_5362", "label_5363", "label_5364", "label_5365", "label_5366", "label_5367", "label_5368", "label_5369", "label_5370", "label_5371", "label_5372", "label_5373", "label_5374", "label_5375", "label_5376", "label_5377", "label_5378", "label_5379", "label_5380", "label_5381", "label_5382", "label_5383", "label_5384", "label_5385", "label_5386", "label_5387", "label_5388", "label_5389", "label_5390", "label_5391", "label_5392", "label_5393", 
"label_5394", "label_5395", "label_5396", "label_5397", "label_5398", "label_5399", "label_5400", "label_5401", "label_5402", "label_5403", "label_5404", "label_5405", "label_5406", "label_5407", "label_5408", "label_5409", "label_5410", "label_5411", "label_5412", "label_5413", "label_5414", "label_5415", "label_5416", "label_5417", "label_5418", "label_5419", "label_5420", "label_5421", "label_5422", "label_5423", "label_5424", "label_5425", "label_5426", "label_5427", "label_5428", "label_5429", "label_5430", "label_5431", "label_5432", "label_5433", "label_5434", "label_5435", "label_5436", "label_5437", "label_5438", "label_5439", "label_5440", "label_5441", "label_5442", "label_5443", "label_5444", "label_5445", "label_5446", "label_5447", "label_5448", "label_5449", "label_5450", "label_5451", "label_5452", "label_5453", "label_5454", "label_5455", "label_5456", "label_5457", "label_5458", "label_5459", "label_5460", "label_5461", "label_5462", "label_5463", "label_5464", "label_5465", "label_5466", "label_5467", "label_5468", "label_5469", "label_5470", "label_5471", "label_5472", "label_5473", "label_5474", "label_5475", "label_5476", "label_5477", "label_5478", "label_5479", "label_5480", "label_5481", "label_5482", "label_5483", "label_5484", "label_5485", "label_5486", "label_5487", "label_5488", "label_5489", "label_5490", "label_5491", "label_5492", "label_5493", "label_5494", "label_5495", "label_5496", "label_5497", "label_5498", "label_5499", "label_5500", "label_5501", "label_5502", "label_5503", "label_5504", "label_5505", "label_5506", "label_5507", "label_5508", "label_5509", "label_5510", "label_5511", "label_5512", "label_5513", "label_5514", "label_5515", "label_5516", "label_5517", "label_5518", "label_5519", "label_5520", "label_5521", "label_5522", "label_5523", "label_5524", "label_5525", "label_5526", "label_5527", "label_5528", "label_5529", "label_5530", "label_5531", "label_5532", "label_5533", "label_5534", "label_5535", "label_5536", "label_5537", "label_5538", "label_5539", "label_5540", "label_5541", "label_5542", "label_5543", "label_5544", "label_5545", "label_5546", "label_5547", "label_5548", "label_5549", "label_5550", "label_5551", "label_5552", "label_5553", "label_5554", "label_5555", "label_5556", "label_5557", "label_5558", "label_5559", "label_5560", "label_5561", "label_5562", "label_5563", "label_5564", "label_5565", "label_5566", "label_5567", "label_5568", "label_5569", "label_5570", "label_5571", "label_5572", "label_5573", "label_5574", "label_5575", "label_5576", "label_5577", "label_5578", "label_5579", "label_5580", "label_5581", "label_5582", "label_5583", "label_5584", "label_5585", "label_5586", "label_5587", "label_5588", "label_5589", "label_5590", "label_5591", "label_5592", "label_5593", "label_5594", "label_5595", "label_5596", "label_5597", "label_5598", "label_5599", "label_5600", "label_5601", "label_5602", "label_5603", "label_5604", "label_5605", "label_5606", "label_5607", "label_5608", "label_5609", "label_5610", "label_5611", "label_5612", "label_5613", "label_5614", "label_5615", "label_5616", "label_5617", "label_5618", "label_5619", "label_5620", "label_5621", "label_5622", "label_5623", "label_5624", "label_5625", "label_5626", "label_5627", "label_5628", "label_5629", "label_5630", "label_5631", "label_5632", "label_5633", "label_5634", "label_5635", "label_5636", "label_5637", "label_5638", "label_5639", "label_5640", "label_5641", "label_5642", "label_5643", "label_5644", "label_5645", "label_5646", 
"label_5647", "label_5648", "label_5649", "label_5650", "label_5651", "label_5652", "label_5653", "label_5654", "label_5655", "label_5656", "label_5657", "label_5658", "label_5659", "label_5660", "label_5661", "label_5662", "label_5663", "label_5664", "label_5665", "label_5666", "label_5667", "label_5668", "label_5669", "label_5670", "label_5671", "label_5672", "label_5673", "label_5674", "label_5675", "label_5676", "label_5677", "label_5678", "label_5679", "label_5680", "label_5681", "label_5682", "label_5683", "label_5684", "label_5685", "label_5686", "label_5687", "label_5688", "label_5689", "label_5690", "label_5691", "label_5692", "label_5693", "label_5694", "label_5695", "label_5696", "label_5697", "label_5698", "label_5699", "label_5700", "label_5701", "label_5702", "label_5703", "label_5704", "label_5705", "label_5706", "label_5707", "label_5708", "label_5709", "label_5710", "label_5711", "label_5712", "label_5713", "label_5714", "label_5715", "label_5716", "label_5717", "label_5718", "label_5719", "label_5720", "label_5721", "label_5722", "label_5723", "label_5724", "label_5725", "label_5726", "label_5727", "label_5728", "label_5729", "label_5730", "label_5731", "label_5732", "label_5733", "label_5734", "label_5735", "label_5736", "label_5737", "label_5738", "label_5739", "label_5740", "label_5741", "label_5742", "label_5743", "label_5744", "label_5745", "label_5746", "label_5747", "label_5748", "label_5749", "label_5750", "label_5751", "label_5752", "label_5753", "label_5754", "label_5755", "label_5756", "label_5757", "label_5758", "label_5759", "label_5760", "label_5761", "label_5762", "label_5763", "label_5764", "label_5765", "label_5766", "label_5767", "label_5768", "label_5769", "label_5770", "label_5771", "label_5772", "label_5773", "label_5774", "label_5775", "label_5776", "label_5777", "label_5778", "label_5779", "label_5780", "label_5781", "label_5782", "label_5783", "label_5784", "label_5785", "label_5786", "label_5787", "label_5788", "label_5789", "label_5790", "label_5791", "label_5792", "label_5793", "label_5794", "label_5795", "label_5796", "label_5797", "label_5798", "label_5799", "label_5800", "label_5801", "label_5802", "label_5803", "label_5804", "label_5805", "label_5806", "label_5807", "label_5808", "label_5809", "label_5810", "label_5811", "label_5812", "label_5813", "label_5814", "label_5815", "label_5816", "label_5817", "label_5818", "label_5819", "label_5820", "label_5821", "label_5822", "label_5823", "label_5824", "label_5825", "label_5826", "label_5827", "label_5828", "label_5829", "label_5830", "label_5831", "label_5832", "label_5833", "label_5834", "label_5835", "label_5836", "label_5837", "label_5838", "label_5839", "label_5840", "label_5841", "label_5842", "label_5843", "label_5844", "label_5845", "label_5846", "label_5847", "label_5848", "label_5849", "label_5850", "label_5851", "label_5852", "label_5853", "label_5854", "label_5855", "label_5856", "label_5857", "label_5858", "label_5859", "label_5860", "label_5861", "label_5862", "label_5863", "label_5864", "label_5865", "label_5866", "label_5867", "label_5868", "label_5869", "label_5870", "label_5871", "label_5872", "label_5873", "label_5874", "label_5875", "label_5876", "label_5877", "label_5878", "label_5879", "label_5880", "label_5881", "label_5882", "label_5883", "label_5884", "label_5885", "label_5886", "label_5887", "label_5888", "label_5889", "label_5890", "label_5891", "label_5892", "label_5893", "label_5894", "label_5895", "label_5896", "label_5897", "label_5898", "label_5899", 
"label_5900", "label_5901", "label_5902", "label_5903", "label_5904", "label_5905", "label_5906", "label_5907", "label_5908", "label_5909", "label_5910", "label_5911", "label_5912", "label_5913", "label_5914", "label_5915", "label_5916", "label_5917", "label_5918", "label_5919", "label_5920", "label_5921", "label_5922", "label_5923", "label_5924", "label_5925", "label_5926", "label_5927", "label_5928", "label_5929", "label_5930", "label_5931", "label_5932", "label_5933", "label_5934", "label_5935", "label_5936", "label_5937", "label_5938", "label_5939", "label_5940", "label_5941", "label_5942", "label_5943", "label_5944", "label_5945", "label_5946", "label_5947", "label_5948", "label_5949", "label_5950", "label_5951", "label_5952", "label_5953", "label_5954", "label_5955", "label_5956", "label_5957", "label_5958", "label_5959", "label_5960", "label_5961", "label_5962", "label_5963", "label_5964", "label_5965", "label_5966", "label_5967", "label_5968", "label_5969", "label_5970", "label_5971", "label_5972", "label_5973", "label_5974", "label_5975", "label_5976", "label_5977", "label_5978", "label_5979", "label_5980", "label_5981", "label_5982", "label_5983", "label_5984", "label_5985", "label_5986", "label_5987", "label_5988", "label_5989", "label_5990", "label_5991", "label_5992", "label_5993", "label_5994", "label_5995", "label_5996", "label_5997", "label_5998", "label_5999", "label_6000", "label_6001", "label_6002", "label_6003", "label_6004", "label_6005", "label_6006", "label_6007", "label_6008", "label_6009", "label_6010", "label_6011", "label_6012", "label_6013", "label_6014", "label_6015", "label_6016", "label_6017", "label_6018", "label_6019", "label_6020", "label_6021", "label_6022", "label_6023", "label_6024", "label_6025", "label_6026", "label_6027", "label_6028", "label_6029", "label_6030", "label_6031", "label_6032", "label_6033", "label_6034", "label_6035", "label_6036", "label_6037", "label_6038", "label_6039", "label_6040", "label_6041", "label_6042", "label_6043", "label_6044", "label_6045", "label_6046", "label_6047", "label_6048", "label_6049", "label_6050", "label_6051", "label_6052", "label_6053", "label_6054", "label_6055", "label_6056", "label_6057", "label_6058", "label_6059", "label_6060", "label_6061", "label_6062", "label_6063", "label_6064", "label_6065", "label_6066", "label_6067", "label_6068", "label_6069", "label_6070", "label_6071", "label_6072", "label_6073", "label_6074", "label_6075", "label_6076", "label_6077", "label_6078", "label_6079", "label_6080", "label_6081", "label_6082", "label_6083", "label_6084", "label_6085", "label_6086", "label_6087", "label_6088", "label_6089", "label_6090", "label_6091", "label_6092", "label_6093", "label_6094", "label_6095", "label_6096", "label_6097", "label_6098", "label_6099", "label_6100", "label_6101", "label_6102", "label_6103", "label_6104", "label_6105", "label_6106", "label_6107", "label_6108", "label_6109", "label_6110", "label_6111", "label_6112", "label_6113", "label_6114", "label_6115", "label_6116", "label_6117", "label_6118", "label_6119", "label_6120", "label_6121", "label_6122", "label_6123", "label_6124", "label_6125", "label_6126", "label_6127", "label_6128", "label_6129", "label_6130", "label_6131", "label_6132", "label_6133", "label_6134", "label_6135", "label_6136", "label_6137", "label_6138", "label_6139", "label_6140", "label_6141", "label_6142", "label_6143", "label_6144", "label_6145", "label_6146", "label_6147", "label_6148", "label_6149", "label_6150", "label_6151", "label_6152", 
"label_6153", "label_6154", "label_6155", "label_6156", "label_6157", "label_6158", "label_6159", "label_6160", "label_6161", "label_6162", "label_6163", "label_6164", "label_6165", "label_6166", "label_6167", "label_6168", "label_6169", "label_6170", "label_6171", "label_6172", "label_6173", "label_6174", "label_6175", "label_6176", "label_6177", "label_6178", "label_6179", "label_6180", "label_6181", "label_6182", "label_6183", "label_6184", "label_6185", "label_6186", "label_6187", "label_6188", "label_6189", "label_6190", "label_6191", "label_6192", "label_6193", "label_6194", "label_6195", "label_6196", "label_6197", "label_6198", "label_6199", "label_6200", "label_6201", "label_6202", "label_6203", "label_6204", "label_6205", "label_6206", "label_6207", "label_6208", "label_6209", "label_6210", "label_6211", "label_6212", "label_6213", "label_6214", "label_6215", "label_6216", "label_6217", "label_6218", "label_6219", "label_6220", "label_6221", "label_6222", "label_6223", "label_6224", "label_6225", "label_6226", "label_6227", "label_6228", "label_6229", "label_6230", "label_6231", "label_6232", "label_6233", "label_6234", "label_6235", "label_6236", "label_6237", "label_6238", "label_6239", "label_6240", "label_6241", "label_6242", "label_6243", "label_6244", "label_6245", "label_6246", "label_6247", "label_6248", "label_6249", "label_6250", "label_6251", "label_6252", "label_6253", "label_6254", "label_6255", "label_6256", "label_6257", "label_6258", "label_6259", "label_6260", "label_6261", "label_6262", "label_6263", "label_6264", "label_6265", "label_6266", "label_6267", "label_6268", "label_6269", "label_6270", "label_6271", "label_6272", "label_6273", "label_6274", "label_6275", "label_6276", "label_6277", "label_6278", "label_6279", "label_6280", "label_6281", "label_6282", "label_6283", "label_6284", "label_6285", "label_6286", "label_6287", "label_6288", "label_6289", "label_6290", "label_6291", "label_6292", "label_6293", "label_6294", "label_6295", "label_6296", "label_6297", "label_6298", "label_6299", "label_6300", "label_6301", "label_6302", "label_6303", "label_6304", "label_6305", "label_6306", "label_6307", "label_6308", "label_6309", "label_6310", "label_6311", "label_6312", "label_6313", "label_6314", "label_6315", "label_6316", "label_6317", "label_6318", "label_6319", "label_6320", "label_6321", "label_6322", "label_6323", "label_6324", "label_6325", "label_6326", "label_6327", "label_6328", "label_6329", "label_6330", "label_6331", "label_6332", "label_6333", "label_6334", "label_6335", "label_6336", "label_6337", "label_6338", "label_6339", "label_6340", "label_6341", "label_6342", "label_6343", "label_6344", "label_6345", "label_6346", "label_6347", "label_6348", "label_6349", "label_6350", "label_6351", "label_6352", "label_6353", "label_6354", "label_6355", "label_6356", "label_6357", "label_6358", "label_6359", "label_6360", "label_6361", "label_6362", "label_6363", "label_6364", "label_6365", "label_6366", "label_6367", "label_6368", "label_6369", "label_6370", "label_6371", "label_6372", "label_6373", "label_6374", "label_6375", "label_6376", "label_6377", "label_6378", "label_6379", "label_6380", "label_6381", "label_6382", "label_6383", "label_6384", "label_6385", "label_6386", "label_6387", "label_6388", "label_6389", "label_6390", "label_6391", "label_6392", "label_6393", "label_6394", "label_6395", "label_6396", "label_6397", "label_6398", "label_6399", "label_6400", "label_6401", "label_6402", "label_6403", "label_6404", "label_6405", 
"label_6406", "label_6407", "label_6408", "label_6409", "label_6410", "label_6411", "label_6412", "label_6413", "label_6414", "label_6415", "label_6416", "label_6417", "label_6418", "label_6419", "label_6420", "label_6421", "label_6422", "label_6423", "label_6424", "label_6425", "label_6426", "label_6427", "label_6428", "label_6429", "label_6430", "label_6431", "label_6432", "label_6433", "label_6434", "label_6435", "label_6436", "label_6437", "label_6438", "label_6439", "label_6440", "label_6441", "label_6442", "label_6443", "label_6444", "label_6445", "label_6446", "label_6447", "label_6448", "label_6449", "label_6450", "label_6451", "label_6452", "label_6453", "label_6454", "label_6455", "label_6456", "label_6457", "label_6458", "label_6459", "label_6460", "label_6461", "label_6462", "label_6463", "label_6464", "label_6465", "label_6466", "label_6467", "label_6468", "label_6469", "label_6470", "label_6471", "label_6472", "label_6473", "label_6474", "label_6475", "label_6476", "label_6477", "label_6478", "label_6479", "label_6480", "label_6481", "label_6482", "label_6483", "label_6484", "label_6485", "label_6486", "label_6487", "label_6488", "label_6489", "label_6490", "label_6491", "label_6492", "label_6493", "label_6494", "label_6495", "label_6496", "label_6497", "label_6498", "label_6499", "label_6500", "label_6501", "label_6502", "label_6503", "label_6504", "label_6505", "label_6506", "label_6507", "label_6508", "label_6509", "label_6510", "label_6511", "label_6512", "label_6513", "label_6514", "label_6515", "label_6516", "label_6517", "label_6518", "label_6519", "label_6520", "label_6521", "label_6522", "label_6523", "label_6524", "label_6525", "label_6526", "label_6527", "label_6528", "label_6529", "label_6530", "label_6531", "label_6532", "label_6533", "label_6534", "label_6535", "label_6536", "label_6537", "label_6538", "label_6539", "label_6540", "label_6541", "label_6542", "label_6543", "label_6544", "label_6545", "label_6546", "label_6547", "label_6548", "label_6549", "label_6550", "label_6551", "label_6552", "label_6553", "label_6554", "label_6555", "label_6556", "label_6557", "label_6558", "label_6559", "label_6560", "label_6561", "label_6562", "label_6563", "label_6564", "label_6565", "label_6566", "label_6567", "label_6568", "label_6569", "label_6570", "label_6571", "label_6572", "label_6573", "label_6574", "label_6575", "label_6576", "label_6577", "label_6578", "label_6579", "label_6580", "label_6581", "label_6582", "label_6583", "label_6584", "label_6585", "label_6586", "label_6587", "label_6588", "label_6589", "label_6590", "label_6591", "label_6592", "label_6593", "label_6594", "label_6595", "label_6596", "label_6597", "label_6598", "label_6599", "label_6600", "label_6601", "label_6602", "label_6603", "label_6604", "label_6605", "label_6606", "label_6607", "label_6608", "label_6609", "label_6610", "label_6611", "label_6612", "label_6613", "label_6614", "label_6615", "label_6616", "label_6617", "label_6618", "label_6619", "label_6620", "label_6621", "label_6622", "label_6623", "label_6624", "label_6625", "label_6626", "label_6627", "label_6628", "label_6629", "label_6630", "label_6631", "label_6632", "label_6633", "label_6634", "label_6635", "label_6636", "label_6637", "label_6638", "label_6639", "label_6640", "label_6641", "label_6642", "label_6643", "label_6644", "label_6645", "label_6646", "label_6647", "label_6648", "label_6649", "label_6650", "label_6651", "label_6652", "label_6653", "label_6654", "label_6655", "label_6656", "label_6657", "label_6658", 
"label_6659", "label_6660", "label_6661", "label_6662", "label_6663", "label_6664", "label_6665", "label_6666", "label_6667", "label_6668", "label_6669", "label_6670", "label_6671", "label_6672", "label_6673", "label_6674", "label_6675", "label_6676", "label_6677", "label_6678", "label_6679", "label_6680", "label_6681", "label_6682", "label_6683", "label_6684", "label_6685", "label_6686", "label_6687", "label_6688", "label_6689", "label_6690", "label_6691", "label_6692", "label_6693", "label_6694", "label_6695", "label_6696", "label_6697", "label_6698", "label_6699", "label_6700", "label_6701", "label_6702", "label_6703", "label_6704", "label_6705", "label_6706", "label_6707", "label_6708", "label_6709", "label_6710", "label_6711", "label_6712", "label_6713", "label_6714", "label_6715", "label_6716", "label_6717", "label_6718", "label_6719", "label_6720", "label_6721", "label_6722", "label_6723", "label_6724", "label_6725", "label_6726", "label_6727", "label_6728", "label_6729", "label_6730", "label_6731", "label_6732", "label_6733", "label_6734", "label_6735", "label_6736", "label_6737", "label_6738", "label_6739", "label_6740", "label_6741", "label_6742", "label_6743", "label_6744", "label_6745", "label_6746", "label_6747", "label_6748", "label_6749", "label_6750", "label_6751", "label_6752", "label_6753", "label_6754", "label_6755", "label_6756", "label_6757", "label_6758", "label_6759", "label_6760", "label_6761", "label_6762", "label_6763", "label_6764", "label_6765", "label_6766", "label_6767", "label_6768", "label_6769", "label_6770", "label_6771", "label_6772", "label_6773", "label_6774", "label_6775", "label_6776", "label_6777", "label_6778", "label_6779", "label_6780", "label_6781", "label_6782", "label_6783", "label_6784", "label_6785", "label_6786", "label_6787", "label_6788", "label_6789", "label_6790", "label_6791", "label_6792", "label_6793", "label_6794", "label_6795", "label_6796", "label_6797", "label_6798", "label_6799", "label_6800", "label_6801", "label_6802", "label_6803", "label_6804", "label_6805", "label_6806", "label_6807", "label_6808", "label_6809", "label_6810", "label_6811", "label_6812", "label_6813", "label_6814", "label_6815", "label_6816", "label_6817", "label_6818", "label_6819", "label_6820", "label_6821", "label_6822", "label_6823", "label_6824", "label_6825", "label_6826", "label_6827", "label_6828", "label_6829", "label_6830", "label_6831", "label_6832", "label_6833", "label_6834", "label_6835", "label_6836", "label_6837", "label_6838", "label_6839", "label_6840", "label_6841", "label_6842", "label_6843", "label_6844", "label_6845", "label_6846", "label_6847", "label_6848", "label_6849", "label_6850", "label_6851", "label_6852", "label_6853", "label_6854", "label_6855", "label_6856", "label_6857", "label_6858", "label_6859", "label_6860", "label_6861", "label_6862", "label_6863", "label_6864", "label_6865", "label_6866", "label_6867", "label_6868", "label_6869", "label_6870", "label_6871", "label_6872", "label_6873", "label_6874", "label_6875", "label_6876", "label_6877", "label_6878", "label_6879", "label_6880", "label_6881", "label_6882", "label_6883", "label_6884", "label_6885", "label_6886", "label_6887", "label_6888", "label_6889", "label_6890", "label_6891", "label_6892", "label_6893", "label_6894", "label_6895", "label_6896", "label_6897", "label_6898", "label_6899", "label_6900", "label_6901", "label_6902", "label_6903", "label_6904", "label_6905", "label_6906", "label_6907", "label_6908", "label_6909", "label_6910", "label_6911", 
"label_6912", "label_6913", "label_6914", "label_6915", "label_6916", "label_6917", "label_6918", "label_6919", "label_6920", "label_6921", "label_6922", "label_6923", "label_6924", "label_6925", "label_6926", "label_6927", "label_6928", "label_6929", "label_6930", "label_6931", "label_6932", "label_6933", "label_6934", "label_6935", "label_6936", "label_6937", "label_6938", "label_6939", "label_6940", "label_6941", "label_6942", "label_6943", "label_6944", "label_6945", "label_6946", "label_6947", "label_6948", "label_6949", "label_6950", "label_6951", "label_6952", "label_6953", "label_6954", "label_6955", "label_6956", "label_6957", "label_6958", "label_6959", "label_6960", "label_6961", "label_6962", "label_6963", "label_6964", "label_6965", "label_6966", "label_6967", "label_6968", "label_6969", "label_6970", "label_6971", "label_6972", "label_6973", "label_6974", "label_6975", "label_6976", "label_6977", "label_6978", "label_6979", "label_6980", "label_6981", "label_6982", "label_6983", "label_6984", "label_6985", "label_6986", "label_6987", "label_6988", "label_6989", "label_6990", "label_6991", "label_6992", "label_6993", "label_6994", "label_6995", "label_6996", "label_6997", "label_6998", "label_6999", "label_7000", "label_7001", "label_7002", "label_7003", "label_7004", "label_7005", "label_7006", "label_7007", "label_7008", "label_7009", "label_7010", "label_7011", "label_7012", "label_7013", "label_7014", "label_7015", "label_7016", "label_7017", "label_7018", "label_7019", "label_7020", "label_7021", "label_7022", "label_7023", "label_7024", "label_7025", "label_7026", "label_7027", "label_7028", "label_7029", "label_7030", "label_7031", "label_7032", "label_7033", "label_7034", "label_7035", "label_7036", "label_7037", "label_7038", "label_7039", "label_7040", "label_7041", "label_7042", "label_7043", "label_7044", "label_7045", "label_7046", "label_7047", "label_7048", "label_7049", "label_7050", "label_7051", "label_7052", "label_7053", "label_7054", "label_7055", "label_7056", "label_7057", "label_7058", "label_7059", "label_7060", "label_7061", "label_7062", "label_7063", "label_7064", "label_7065", "label_7066", "label_7067", "label_7068", "label_7069", "label_7070", "label_7071", "label_7072", "label_7073", "label_7074", "label_7075", "label_7076", "label_7077", "label_7078", "label_7079", "label_7080", "label_7081", "label_7082", "label_7083", "label_7084", "label_7085", "label_7086", "label_7087", "label_7088", "label_7089", "label_7090", "label_7091", "label_7092", "label_7093", "label_7094", "label_7095", "label_7096", "label_7097", "label_7098", "label_7099", "label_7100", "label_7101", "label_7102", "label_7103", "label_7104", "label_7105", "label_7106", "label_7107", "label_7108", "label_7109", "label_7110", "label_7111", "label_7112", "label_7113", "label_7114", "label_7115", "label_7116", "label_7117", "label_7118", "label_7119", "label_7120", "label_7121", "label_7122", "label_7123", "label_7124", "label_7125", "label_7126", "label_7127", "label_7128", "label_7129", "label_7130", "label_7131", "label_7132", "label_7133", "label_7134", "label_7135", "label_7136", "label_7137", "label_7138", "label_7139", "label_7140", "label_7141", "label_7142", "label_7143", "label_7144", "label_7145", "label_7146", "label_7147", "label_7148", "label_7149", "label_7150", "label_7151", "label_7152", "label_7153", "label_7154", "label_7155", "label_7156", "label_7157", "label_7158", "label_7159", "label_7160", "label_7161", "label_7162", "label_7163", "label_7164", 
"label_7165", "label_7166", "label_7167", "label_7168", "label_7169", "label_7170", "label_7171", "label_7172", "label_7173", "label_7174", "label_7175", "label_7176", "label_7177", "label_7178", "label_7179", "label_7180", "label_7181", "label_7182", "label_7183", "label_7184", "label_7185", "label_7186", "label_7187", "label_7188", "label_7189", "label_7190", "label_7191", "label_7192", "label_7193", "label_7194", "label_7195", "label_7196", "label_7197", "label_7198", "label_7199", "label_7200", "label_7201", "label_7202", "label_7203", "label_7204", "label_7205", "label_7206", "label_7207", "label_7208", "label_7209", "label_7210", "label_7211", "label_7212", "label_7213", "label_7214", "label_7215", "label_7216", "label_7217", "label_7218", "label_7219", "label_7220", "label_7221", "label_7222", "label_7223", "label_7224", "label_7225", "label_7226", "label_7227", "label_7228", "label_7229", "label_7230", "label_7231", "label_7232", "label_7233", "label_7234", "label_7235", "label_7236", "label_7237", "label_7238", "label_7239", "label_7240", "label_7241", "label_7242", "label_7243", "label_7244", "label_7245", "label_7246", "label_7247", "label_7248", "label_7249", "label_7250", "label_7251", "label_7252", "label_7253", "label_7254", "label_7255", "label_7256", "label_7257", "label_7258", "label_7259", "label_7260", "label_7261", "label_7262", "label_7263", "label_7264", "label_7265", "label_7266", "label_7267", "label_7268", "label_7269", "label_7270", "label_7271", "label_7272", "label_7273", "label_7274", "label_7275", "label_7276", "label_7277", "label_7278", "label_7279", "label_7280", "label_7281", "label_7282", "label_7283", "label_7284", "label_7285", "label_7286", "label_7287", "label_7288", "label_7289", "label_7290", "label_7291", "label_7292", "label_7293", "label_7294", "label_7295", "label_7296", "label_7297", "label_7298", "label_7299", "label_7300", "label_7301", "label_7302", "label_7303", "label_7304", "label_7305", "label_7306", "label_7307", "label_7308", "label_7309", "label_7310", "label_7311", "label_7312", "label_7313", "label_7314", "label_7315", "label_7316", "label_7317", "label_7318", "label_7319", "label_7320", "label_7321", "label_7322", "label_7323", "label_7324", "label_7325", "label_7326", "label_7327", "label_7328", "label_7329", "label_7330", "label_7331", "label_7332", "label_7333", "label_7334", "label_7335", "label_7336", "label_7337", "label_7338", "label_7339", "label_7340", "label_7341", "label_7342", "label_7343", "label_7344", "label_7345", "label_7346", "label_7347", "label_7348", "label_7349", "label_7350", "label_7351", "label_7352", "label_7353", "label_7354", "label_7355", "label_7356", "label_7357", "label_7358", "label_7359", "label_7360", "label_7361", "label_7362", "label_7363", "label_7364", "label_7365", "label_7366", "label_7367", "label_7368", "label_7369", "label_7370", "label_7371", "label_7372", "label_7373", "label_7374", "label_7375", "label_7376", "label_7377", "label_7378", "label_7379", "label_7380", "label_7381", "label_7382", "label_7383", "label_7384", "label_7385", "label_7386", "label_7387", "label_7388", "label_7389", "label_7390", "label_7391", "label_7392", "label_7393", "label_7394", "label_7395", "label_7396", "label_7397", "label_7398", "label_7399", "label_7400", "label_7401", "label_7402", "label_7403", "label_7404", "label_7405", "label_7406", "label_7407", "label_7408", "label_7409", "label_7410", "label_7411", "label_7412", "label_7413", "label_7414", "label_7415", "label_7416", "label_7417", 
"label_7418", "label_7419", "label_7420", "label_7421", "label_7422", "label_7423", "label_7424", "label_7425", "label_7426", "label_7427", "label_7428", "label_7429", "label_7430", "label_7431", "label_7432", "label_7433", "label_7434", "label_7435", "label_7436", "label_7437", "label_7438", "label_7439", "label_7440", "label_7441", "label_7442", "label_7443", "label_7444", "label_7445", "label_7446", "label_7447", "label_7448", "label_7449", "label_7450", "label_7451", "label_7452", "label_7453", "label_7454", "label_7455", "label_7456", "label_7457", "label_7458", "label_7459", "label_7460", "label_7461", "label_7462", "label_7463", "label_7464", "label_7465", "label_7466", "label_7467", "label_7468", "label_7469", "label_7470", "label_7471", "label_7472", "label_7473", "label_7474", "label_7475", "label_7476", "label_7477", "label_7478", "label_7479", "label_7480", "label_7481", "label_7482", "label_7483", "label_7484", "label_7485", "label_7486", "label_7487", "label_7488", "label_7489", "label_7490", "label_7491", "label_7492", "label_7493", "label_7494", "label_7495", "label_7496", "label_7497", "label_7498", "label_7499", "label_7500", "label_7501", "label_7502", "label_7503", "label_7504", "label_7505", "label_7506", "label_7507", "label_7508", "label_7509", "label_7510", "label_7511", "label_7512", "label_7513", "label_7514", "label_7515", "label_7516", "label_7517", "label_7518", "label_7519", "label_7520", "label_7521", "label_7522", "label_7523", "label_7524", "label_7525", "label_7526", "label_7527", "label_7528", "label_7529", "label_7530", "label_7531", "label_7532", "label_7533", "label_7534", "label_7535", "label_7536", "label_7537", "label_7538", "label_7539", "label_7540", "label_7541", "label_7542", "label_7543", "label_7544", "label_7545", "label_7546", "label_7547", "label_7548", "label_7549", "label_7550", "label_7551", "label_7552", "label_7553", "label_7554", "label_7555", "label_7556", "label_7557", "label_7558", "label_7559", "label_7560", "label_7561", "label_7562", "label_7563", "label_7564", "label_7565", "label_7566", "label_7567", "label_7568", "label_7569", "label_7570", "label_7571", "label_7572", "label_7573", "label_7574", "label_7575", "label_7576", "label_7577", "label_7578", "label_7579", "label_7580", "label_7581", "label_7582", "label_7583", "label_7584", "label_7585", "label_7586", "label_7587", "label_7588", "label_7589", "label_7590", "label_7591", "label_7592", "label_7593", "label_7594", "label_7595", "label_7596", "label_7597", "label_7598", "label_7599", "label_7600", "label_7601", "label_7602", "label_7603", "label_7604", "label_7605", "label_7606", "label_7607", "label_7608", "label_7609", "label_7610", "label_7611", "label_7612", "label_7613", "label_7614", "label_7615", "label_7616", "label_7617", "label_7618", "label_7619", "label_7620", "label_7621", "label_7622", "label_7623", "label_7624", "label_7625", "label_7626", "label_7627", "label_7628", "label_7629", "label_7630", "label_7631", "label_7632", "label_7633", "label_7634", "label_7635", "label_7636", "label_7637", "label_7638", "label_7639", "label_7640", "label_7641", "label_7642", "label_7643", "label_7644", "label_7645", "label_7646", "label_7647", "label_7648", "label_7649", "label_7650", "label_7651", "label_7652", "label_7653", "label_7654", "label_7655", "label_7656", "label_7657", "label_7658", "label_7659", "label_7660", "label_7661", "label_7662", "label_7663", "label_7664", "label_7665", "label_7666", "label_7667", "label_7668", "label_7669", "label_7670", 
"label_7671", "label_7672", "label_7673", "label_7674", "label_7675", "label_7676", "label_7677", "label_7678", "label_7679", "label_7680", "label_7681", "label_7682", "label_7683", "label_7684", "label_7685", "label_7686", "label_7687", "label_7688", "label_7689", "label_7690", "label_7691", "label_7692", "label_7693", "label_7694", "label_7695", "label_7696", "label_7697", "label_7698", "label_7699", "label_7700", "label_7701", "label_7702", "label_7703", "label_7704", "label_7705", "label_7706", "label_7707", "label_7708", "label_7709", "label_7710", "label_7711", "label_7712", "label_7713", "label_7714", "label_7715", "label_7716", "label_7717", "label_7718", "label_7719", "label_7720", "label_7721", "label_7722", "label_7723", "label_7724", "label_7725", "label_7726", "label_7727", "label_7728", "label_7729", "label_7730", "label_7731", "label_7732", "label_7733", "label_7734", "label_7735", "label_7736", "label_7737", "label_7738", "label_7739", "label_7740", "label_7741", "label_7742", "label_7743", "label_7744", "label_7745", "label_7746", "label_7747", "label_7748", "label_7749", "label_7750", "label_7751", "label_7752", "label_7753", "label_7754", "label_7755", "label_7756", "label_7757", "label_7758", "label_7759", "label_7760", "label_7761", "label_7762", "label_7763", "label_7764", "label_7765", "label_7766", "label_7767", "label_7768", "label_7769", "label_7770", "label_7771", "label_7772", "label_7773", "label_7774", "label_7775", "label_7776", "label_7777", "label_7778", "label_7779", "label_7780", "label_7781", "label_7782", "label_7783", "label_7784", "label_7785", "label_7786", "label_7787", "label_7788", "label_7789", "label_7790", "label_7791", "label_7792", "label_7793", "label_7794", "label_7795", "label_7796", "label_7797", "label_7798", "label_7799", "label_7800", "label_7801", "label_7802", "label_7803", "label_7804", "label_7805", "label_7806", "label_7807", "label_7808", "label_7809", "label_7810", "label_7811", "label_7812", "label_7813", "label_7814", "label_7815", "label_7816", "label_7817", "label_7818", "label_7819", "label_7820", "label_7821", "label_7822", "label_7823", "label_7824", "label_7825", "label_7826", "label_7827", "label_7828", "label_7829", "label_7830", "label_7831", "label_7832", "label_7833", "label_7834", "label_7835", "label_7836", "label_7837", "label_7838", "label_7839", "label_7840", "label_7841", "label_7842", "label_7843", "label_7844", "label_7845", "label_7846", "label_7847", "label_7848", "label_7849", "label_7850", "label_7851", "label_7852", "label_7853", "label_7854", "label_7855", "label_7856", "label_7857", "label_7858", "label_7859", "label_7860", "label_7861", "label_7862", "label_7863", "label_7864", "label_7865", "label_7866", "label_7867", "label_7868", "label_7869", "label_7870", "label_7871", "label_7872", "label_7873", "label_7874", "label_7875", "label_7876", "label_7877", "label_7878", "label_7879", "label_7880", "label_7881", "label_7882", "label_7883", "label_7884", "label_7885", "label_7886", "label_7887", "label_7888", "label_7889", "label_7890", "label_7891", "label_7892", "label_7893", "label_7894", "label_7895", "label_7896", "label_7897", "label_7898", "label_7899", "label_7900", "label_7901", "label_7902", "label_7903", "label_7904", "label_7905", "label_7906", "label_7907", "label_7908", "label_7909", "label_7910", "label_7911", "label_7912", "label_7913", "label_7914", "label_7915", "label_7916", "label_7917", "label_7918", "label_7919", "label_7920", "label_7921", "label_7922", "label_7923", 
"label_7924", "label_7925", "label_7926", "label_7927", "label_7928", "label_7929", "label_7930", "label_7931", "label_7932", "label_7933", "label_7934", "label_7935", "label_7936", "label_7937", "label_7938", "label_7939", "label_7940", "label_7941", "label_7942", "label_7943", "label_7944", "label_7945", "label_7946", "label_7947", "label_7948", "label_7949", "label_7950", "label_7951", "label_7952", "label_7953", "label_7954", "label_7955", "label_7956", "label_7957", "label_7958", "label_7959", "label_7960", "label_7961", "label_7962", "label_7963", "label_7964", "label_7965", "label_7966", "label_7967", "label_7968", "label_7969", "label_7970", "label_7971", "label_7972", "label_7973", "label_7974", "label_7975", "label_7976", "label_7977", "label_7978", "label_7979", "label_7980", "label_7981", "label_7982", "label_7983", "label_7984", "label_7985", "label_7986", "label_7987", "label_7988", "label_7989", "label_7990", "label_7991", "label_7992", "label_7993", "label_7994", "label_7995", "label_7996", "label_7997", "label_7998", "label_7999", "label_8000", "label_8001", "label_8002", "label_8003", "label_8004", "label_8005", "label_8006", "label_8007", "label_8008", "label_8009", "label_8010", "label_8011", "label_8012", "label_8013", "label_8014", "label_8015", "label_8016", "label_8017", "label_8018", "label_8019", "label_8020", "label_8021", "label_8022", "label_8023", "label_8024", "label_8025", "label_8026", "label_8027", "label_8028", "label_8029", "label_8030", "label_8031", "label_8032", "label_8033", "label_8034", "label_8035", "label_8036", "label_8037", "label_8038", "label_8039", "label_8040", "label_8041", "label_8042", "label_8043", "label_8044", "label_8045", "label_8046", "label_8047", "label_8048", "label_8049", "label_8050", "label_8051", "label_8052", "label_8053", "label_8054", "label_8055", "label_8056", "label_8057", "label_8058", "label_8059", "label_8060", "label_8061", "label_8062", "label_8063", "label_8064", "label_8065", "label_8066", "label_8067", "label_8068", "label_8069", "label_8070", "label_8071", "label_8072", "label_8073", "label_8074", "label_8075", "label_8076", "label_8077", "label_8078", "label_8079", "label_8080", "label_8081", "label_8082", "label_8083", "label_8084", "label_8085", "label_8086", "label_8087", "label_8088", "label_8089", "label_8090", "label_8091", "label_8092", "label_8093", "label_8094", "label_8095", "label_8096", "label_8097", "label_8098", "label_8099", "label_8100", "label_8101", "label_8102", "label_8103", "label_8104", "label_8105", "label_8106", "label_8107", "label_8108", "label_8109", "label_8110", "label_8111", "label_8112", "label_8113", "label_8114", "label_8115", "label_8116", "label_8117", "label_8118", "label_8119", "label_8120", "label_8121", "label_8122", "label_8123", "label_8124", "label_8125", "label_8126", "label_8127", "label_8128", "label_8129", "label_8130", "label_8131", "label_8132", "label_8133", "label_8134", "label_8135", "label_8136", "label_8137", "label_8138", "label_8139", "label_8140", "label_8141", "label_8142", "label_8143", "label_8144", "label_8145", "label_8146", "label_8147", "label_8148", "label_8149", "label_8150", "label_8151", "label_8152", "label_8153", "label_8154", "label_8155", "label_8156", "label_8157", "label_8158", "label_8159", "label_8160", "label_8161", "label_8162", "label_8163", "label_8164", "label_8165", "label_8166", "label_8167", "label_8168", "label_8169", "label_8170", "label_8171", "label_8172", "label_8173", "label_8174", "label_8175", "label_8176", 
"label_8177", "label_8178", "label_8179", "label_8180", "label_8181", "label_8182", "label_8183", "label_8184", "label_8185", "label_8186", "label_8187", "label_8188", "label_8189", "label_8190", "label_8191", "label_8192", "label_8193", "label_8194", "label_8195", "label_8196", "label_8197", "label_8198", "label_8199", "label_8200", "label_8201", "label_8202", "label_8203", "label_8204", "label_8205", "label_8206", "label_8207", "label_8208", "label_8209", "label_8210", "label_8211", "label_8212", "label_8213", "label_8214", "label_8215", "label_8216", "label_8217", "label_8218", "label_8219", "label_8220", "label_8221", "label_8222", "label_8223", "label_8224", "label_8225", "label_8226", "label_8227", "label_8228", "label_8229", "label_8230", "label_8231", "label_8232", "label_8233", "label_8234", "label_8235", "label_8236", "label_8237", "label_8238", "label_8239", "label_8240", "label_8241", "label_8242", "label_8243", "label_8244", "label_8245", "label_8246", "label_8247", "label_8248", "label_8249", "label_8250", "label_8251", "label_8252", "label_8253", "label_8254", "label_8255", "label_8256", "label_8257", "label_8258", "label_8259", "label_8260", "label_8261", "label_8262", "label_8263", "label_8264", "label_8265", "label_8266", "label_8267", "label_8268", "label_8269", "label_8270", "label_8271", "label_8272", "label_8273", "label_8274", "label_8275", "label_8276", "label_8277", "label_8278", "label_8279", "label_8280", "label_8281", "label_8282", "label_8283", "label_8284", "label_8285", "label_8286", "label_8287", "label_8288", "label_8289", "label_8290", "label_8291", "label_8292", "label_8293", "label_8294", "label_8295", "label_8296", "label_8297", "label_8298", "label_8299", "label_8300", "label_8301", "label_8302", "label_8303", "label_8304", "label_8305", "label_8306", "label_8307", "label_8308", "label_8309", "label_8310", "label_8311", "label_8312", "label_8313", "label_8314", "label_8315", "label_8316", "label_8317", "label_8318", "label_8319", "label_8320", "label_8321", "label_8322", "label_8323", "label_8324", "label_8325", "label_8326", "label_8327", "label_8328", "label_8329", "label_8330", "label_8331", "label_8332", "label_8333", "label_8334", "label_8335", "label_8336", "label_8337", "label_8338", "label_8339", "label_8340", "label_8341", "label_8342", "label_8343", "label_8344", "label_8345", "label_8346", "label_8347", "label_8348", "label_8349", "label_8350", "label_8351", "label_8352", "label_8353", "label_8354", "label_8355", "label_8356", "label_8357", "label_8358", "label_8359", "label_8360", "label_8361", "label_8362", "label_8363", "label_8364", "label_8365", "label_8366", "label_8367", "label_8368", "label_8369", "label_8370", "label_8371", "label_8372", "label_8373", "label_8374", "label_8375", "label_8376", "label_8377", "label_8378", "label_8379", "label_8380", "label_8381", "label_8382", "label_8383", "label_8384", "label_8385", "label_8386", "label_8387", "label_8388", "label_8389", "label_8390", "label_8391", "label_8392", "label_8393", "label_8394", "label_8395", "label_8396", "label_8397", "label_8398", "label_8399", "label_8400", "label_8401", "label_8402", "label_8403", "label_8404", "label_8405", "label_8406", "label_8407", "label_8408", "label_8409", "label_8410", "label_8411", "label_8412", "label_8413", "label_8414", "label_8415", "label_8416", "label_8417", "label_8418", "label_8419", "label_8420", "label_8421", "label_8422", "label_8423", "label_8424", "label_8425", "label_8426", "label_8427", "label_8428", "label_8429", 
"label_8430", "label_8431", "label_8432", "label_8433", "label_8434", "label_8435", "label_8436", "label_8437", "label_8438", "label_8439", "label_8440", "label_8441", "label_8442", "label_8443", "label_8444", "label_8445", "label_8446", "label_8447", "label_8448", "label_8449", "label_8450", "label_8451", "label_8452", "label_8453", "label_8454", "label_8455", "label_8456", "label_8457", "label_8458", "label_8459", "label_8460", "label_8461", "label_8462", "label_8463", "label_8464", "label_8465", "label_8466", "label_8467", "label_8468", "label_8469", "label_8470", "label_8471", "label_8472", "label_8473", "label_8474", "label_8475", "label_8476", "label_8477", "label_8478", "label_8479", "label_8480", "label_8481", "label_8482", "label_8483", "label_8484", "label_8485", "label_8486", "label_8487", "label_8488", "label_8489", "label_8490", "label_8491", "label_8492", "label_8493", "label_8494", "label_8495", "label_8496", "label_8497", "label_8498", "label_8499", "label_8500", "label_8501", "label_8502", "label_8503", "label_8504", "label_8505", "label_8506", "label_8507", "label_8508", "label_8509", "label_8510", "label_8511", "label_8512", "label_8513", "label_8514", "label_8515", "label_8516", "label_8517", "label_8518", "label_8519", "label_8520", "label_8521", "label_8522", "label_8523", "label_8524", "label_8525", "label_8526", "label_8527", "label_8528", "label_8529", "label_8530", "label_8531", "label_8532", "label_8533", "label_8534", "label_8535", "label_8536", "label_8537", "label_8538", "label_8539", "label_8540", "label_8541", "label_8542", "label_8543", "label_8544", "label_8545", "label_8546", "label_8547", "label_8548", "label_8549", "label_8550", "label_8551", "label_8552", "label_8553", "label_8554", "label_8555", "label_8556", "label_8557", "label_8558", "label_8559", "label_8560", "label_8561", "label_8562", "label_8563", "label_8564", "label_8565", "label_8566", "label_8567", "label_8568", "label_8569", "label_8570", "label_8571", "label_8572", "label_8573", "label_8574", "label_8575", "label_8576", "label_8577", "label_8578", "label_8579", "label_8580", "label_8581", "label_8582", "label_8583", "label_8584", "label_8585", "label_8586", "label_8587", "label_8588", "label_8589", "label_8590", "label_8591", "label_8592", "label_8593", "label_8594", "label_8595", "label_8596", "label_8597", "label_8598", "label_8599", "label_8600", "label_8601", "label_8602", "label_8603", "label_8604", "label_8605", "label_8606", "label_8607", "label_8608", "label_8609", "label_8610", "label_8611", "label_8612", "label_8613", "label_8614", "label_8615", "label_8616", "label_8617", "label_8618", "label_8619", "label_8620", "label_8621", "label_8622", "label_8623", "label_8624", "label_8625", "label_8626", "label_8627", "label_8628", "label_8629", "label_8630", "label_8631", "label_8632", "label_8633", "label_8634", "label_8635", "label_8636", "label_8637", "label_8638", "label_8639", "label_8640", "label_8641", "label_8642", "label_8643", "label_8644", "label_8645", "label_8646", "label_8647", "label_8648", "label_8649", "label_8650", "label_8651", "label_8652", "label_8653", "label_8654", "label_8655", "label_8656", "label_8657", "label_8658", "label_8659", "label_8660", "label_8661", "label_8662", "label_8663", "label_8664", "label_8665", "label_8666", "label_8667", "label_8668", "label_8669", "label_8670", "label_8671", "label_8672", "label_8673", "label_8674", "label_8675", "label_8676", "label_8677", "label_8678", "label_8679", "label_8680", "label_8681", "label_8682", 
"label_8683", "label_8684", "label_8685", "label_8686", "label_8687", "label_8688", "label_8689", "label_8690", "label_8691", "label_8692", "label_8693", "label_8694", "label_8695", "label_8696", "label_8697", "label_8698", "label_8699", "label_8700", "label_8701", "label_8702", "label_8703", "label_8704", "label_8705", "label_8706", "label_8707", "label_8708", "label_8709", "label_8710", "label_8711", "label_8712", "label_8713", "label_8714", "label_8715", "label_8716", "label_8717", "label_8718", "label_8719", "label_8720", "label_8721", "label_8722", "label_8723", "label_8724", "label_8725", "label_8726", "label_8727", "label_8728", "label_8729", "label_8730", "label_8731", "label_8732", "label_8733", "label_8734", "label_8735", "label_8736", "label_8737", "label_8738", "label_8739", "label_8740", "label_8741", "label_8742", "label_8743", "label_8744", "label_8745", "label_8746", "label_8747", "label_8748", "label_8749", "label_8750", "label_8751", "label_8752", "label_8753", "label_8754", "label_8755", "label_8756", "label_8757", "label_8758", "label_8759", "label_8760", "label_8761", "label_8762", "label_8763", "label_8764", "label_8765", "label_8766", "label_8767", "label_8768", "label_8769", "label_8770", "label_8771", "label_8772", "label_8773", "label_8774", "label_8775", "label_8776", "label_8777", "label_8778", "label_8779", "label_8780", "label_8781", "label_8782", "label_8783", "label_8784", "label_8785", "label_8786", "label_8787", "label_8788", "label_8789", "label_8790", "label_8791", "label_8792", "label_8793", "label_8794", "label_8795", "label_8796", "label_8797", "label_8798", "label_8799", "label_8800", "label_8801", "label_8802", "label_8803", "label_8804", "label_8805", "label_8806", "label_8807", "label_8808", "label_8809", "label_8810", "label_8811", "label_8812", "label_8813", "label_8814", "label_8815", "label_8816", "label_8817", "label_8818", "label_8819", "label_8820", "label_8821", "label_8822", "label_8823", "label_8824", "label_8825", "label_8826", "label_8827", "label_8828", "label_8829", "label_8830", "label_8831", "label_8832", "label_8833", "label_8834", "label_8835", "label_8836", "label_8837", "label_8838", "label_8839", "label_8840", "label_8841", "label_8842", "label_8843", "label_8844", "label_8845", "label_8846", "label_8847", "label_8848", "label_8849", "label_8850", "label_8851", "label_8852", "label_8853", "label_8854", "label_8855", "label_8856", "label_8857", "label_8858", "label_8859", "label_8860", "label_8861", "label_8862", "label_8863", "label_8864", "label_8865", "label_8866", "label_8867", "label_8868", "label_8869", "label_8870", "label_8871", "label_8872", "label_8873", "label_8874", "label_8875", "label_8876", "label_8877", "label_8878", "label_8879", "label_8880", "label_8881", "label_8882", "label_8883", "label_8884", "label_8885", "label_8886", "label_8887", "label_8888", "label_8889", "label_8890", "label_8891", "label_8892", "label_8893", "label_8894", "label_8895", "label_8896", "label_8897", "label_8898", "label_8899", "label_8900", "label_8901", "label_8902", "label_8903", "label_8904", "label_8905", "label_8906", "label_8907", "label_8908", "label_8909", "label_8910", "label_8911", "label_8912", "label_8913", "label_8914", "label_8915", "label_8916", "label_8917", "label_8918", "label_8919", "label_8920", "label_8921", "label_8922", "label_8923", "label_8924", "label_8925", "label_8926", "label_8927", "label_8928", "label_8929", "label_8930", "label_8931", "label_8932", "label_8933", "label_8934", "label_8935", 
"label_8936", "label_8937", "label_8938", "label_8939", "label_8940", "label_8941", "label_8942", "label_8943", "label_8944", "label_8945", "label_8946", "label_8947", "label_8948", "label_8949", "label_8950", "label_8951", "label_8952", "label_8953", "label_8954", "label_8955", "label_8956", "label_8957", "label_8958", "label_8959", "label_8960", "label_8961", "label_8962", "label_8963", "label_8964", "label_8965", "label_8966", "label_8967", "label_8968", "label_8969", "label_8970", "label_8971", "label_8972", "label_8973", "label_8974", "label_8975", "label_8976", "label_8977", "label_8978", "label_8979", "label_8980", "label_8981", "label_8982", "label_8983", "label_8984", "label_8985", "label_8986", "label_8987", "label_8988", "label_8989", "label_8990", "label_8991", "label_8992", "label_8993", "label_8994", "label_8995", "label_8996", "label_8997", "label_8998", "label_8999", "label_9000", "label_9001", "label_9002", "label_9003", "label_9004", "label_9005", "label_9006", "label_9007", "label_9008", "label_9009", "label_9010", "label_9011", "label_9012", "label_9013", "label_9014", "label_9015", "label_9016", "label_9017", "label_9018", "label_9019", "label_9020", "label_9021", "label_9022", "label_9023", "label_9024", "label_9025", "label_9026", "label_9027", "label_9028", "label_9029", "label_9030", "label_9031", "label_9032", "label_9033", "label_9034", "label_9035", "label_9036", "label_9037", "label_9038", "label_9039", "label_9040", "label_9041", "label_9042", "label_9043", "label_9044", "label_9045", "label_9046", "label_9047", "label_9048", "label_9049", "label_9050", "label_9051", "label_9052", "label_9053", "label_9054", "label_9055", "label_9056", "label_9057", "label_9058", "label_9059", "label_9060", "label_9061", "label_9062", "label_9063", "label_9064", "label_9065", "label_9066", "label_9067", "label_9068", "label_9069", "label_9070", "label_9071", "label_9072", "label_9073", "label_9074", "label_9075", "label_9076", "label_9077", "label_9078", "label_9079", "label_9080", "label_9081", "label_9082", "label_9083", "label_9084", "label_9085", "label_9086", "label_9087", "label_9088", "label_9089", "label_9090", "label_9091", "label_9092", "label_9093", "label_9094", "label_9095", "label_9096", "label_9097", "label_9098", "label_9099", "label_9100", "label_9101", "label_9102", "label_9103", "label_9104", "label_9105", "label_9106", "label_9107", "label_9108", "label_9109", "label_9110", "label_9111", "label_9112", "label_9113", "label_9114", "label_9115", "label_9116", "label_9117", "label_9118", "label_9119", "label_9120", "label_9121", "label_9122", "label_9123", "label_9124", "label_9125", "label_9126", "label_9127", "label_9128", "label_9129", "label_9130" ]
amiguel/cmm560_surface_corrosion_classifier
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->

# amiguel/cmm560_surface_corrosion_classifier

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0340
- Validation Loss: 0.0958
- Train Accuracy: 0.9731
- Epoch: 14

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 3e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32

### Training results

| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 0.3646     | 0.2100          | 0.8879         | 0     |
| 0.1923     | 0.1380          | 0.9731         | 1     |
| 0.1445     | 0.1082          | 0.9731         | 2     |
| 0.1066     | 0.0847          | 0.9776         | 3     |
| 0.0779     | 0.0656          | 0.9731         | 4     |
| 0.0758     | 0.0658          | 0.9776         | 5     |
| 0.0892     | 0.0499          | 0.9821         | 6     |
| 0.0701     | 0.1073          | 0.9731         | 7     |
| 0.0656     | 0.0655          | 0.9686         | 8     |
| 0.0527     | 0.0578          | 0.9776         | 9     |
| 0.0731     | 0.1136          | 0.9462         | 10    |
| 0.0508     | 0.0830          | 0.9641         | 11    |
| 0.0453     | 0.0762          | 0.9731         | 12    |
| 0.0541     | 0.0821          | 0.9686         | 13    |
| 0.0340     | 0.0958          | 0.9731         | 14    |

### Framework versions

- Transformers 4.42.4
- TensorFlow 2.15.0
- Datasets 2.20.0
- Tokenizers 0.19.1
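The card omits usage code. Below is a minimal inference sketch, assuming the checkpoint exposes a standard ViT classification head loadable via `TFAutoModelForImageClassification` (the Keras/TensorFlow training stack above suggests TF weights are available); the file name `surface.jpg` is a placeholder, not part of the original card:

```python
import tensorflow as tf
from PIL import Image
from transformers import AutoImageProcessor, TFAutoModelForImageClassification

repo = "amiguel/cmm560_surface_corrosion_classifier"
processor = AutoImageProcessor.from_pretrained(repo)
model = TFAutoModelForImageClassification.from_pretrained(repo)

# Placeholder path to an RGB photo of the surface to inspect.
image = Image.open("surface.jpg")
inputs = processor(images=image, return_tensors="tf")

logits = model(**inputs).logits
pred = int(tf.argmax(logits, axis=-1)[0])
print(model.config.id2label[pred])  # one of the two labels listed below
```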
[ "negative", "positive" ]
Leotrim/food101_vit_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/raspuntinov_ai/huggingface/runs/26tpizu2)

# food101_vit_model

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7311
- Accuracy: 0.8536
- Precision: 0.8531
- Recall: 0.8536
- F1: 0.8529

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:------:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 1.726         | 0.9994 | 1183 | 1.4984          | 0.7974   | 0.8021    | 0.7974 | 0.7906 |
| 0.9996        | 1.9996 | 2367 | 0.8596          | 0.8417   | 0.8430    | 0.8417 | 0.8413 |
| 0.8383        | 2.9981 | 3549 | 0.7311          | 0.8536   | 0.8531    | 0.8536 | 0.8529 |

### Framework versions

- Transformers 4.42.3
- Pytorch 2.1.2
- Datasets 2.20.0
- Tokenizers 0.19.1
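The card ships without an inference snippet; here is a minimal sketch, assuming the checkpoint works with the standard 🤗 `image-classification` pipeline (the file name `dish.jpg` is a placeholder):

```python
from transformers import pipeline

classifier = pipeline("image-classification", model="Leotrim/food101_vit_model")

# "dish.jpg" is a placeholder; a local path, a URL, or a PIL image all work here.
for result in classifier("dish.jpg", top_k=3):
    print(f"{result['label']}: {result['score']:.3f}")
```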
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
pimcore/car-countries-classification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# car-countries-classification

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4039
- Accuracy: 0.2941

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| No log        | 0.9231 | 3    | 1.5830          | 0.3137   |
| No log        | 1.8462 | 6    | 1.5342          | 0.2941   |
| No log        | 2.7692 | 9    | 1.4845          | 0.2941   |
| 1.5308        | 4.0    | 13   | 1.4705          | 0.2745   |
| 1.5308        | 4.9231 | 16   | 1.4534          | 0.3137   |
| 1.5308        | 5.8462 | 19   | 1.4583          | 0.2745   |
| 1.3601        | 6.7692 | 22   | 1.4218          | 0.2941   |
| 1.3601        | 8.0    | 26   | 1.4283          | 0.2745   |
| 1.3601        | 8.9231 | 29   | 1.3973          | 0.3137   |
| 1.2778        | 9.2308 | 30   | 1.4039          | 0.2941   |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
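Note that with five country labels and a reported eval accuracy of 0.2941, the model performs only modestly above the 0.2 chance level, so individual predictions should be treated with caution. A minimal inference sketch, assuming the standard `image-classification` pipeline applies (`car.jpg` is a placeholder):

```python
from transformers import pipeline

classifier = pipeline("image-classification", model="pimcore/car-countries-classification")
print(classifier("car.jpg"))  # placeholder photo of a car; returns per-country scores
```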
[ "france", "germany", "italy", "united kingdom", "united states" ]
YuanUDE/my_awesome_food_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# my_awesome_food_model

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6504
- Accuracy: 0.869

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.6867        | 0.992 | 62   | 2.5581          | 0.811    |
| 1.8553        | 2.0   | 125  | 1.8286          | 0.856    |
| 1.6018        | 2.976 | 186  | 1.6504          | 0.869    |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
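A minimal PyTorch inference sketch, assuming the checkpoint carries a standard ViT classification head (the path `meal.jpg` is a placeholder, not part of the original card):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "YuanUDE/my_awesome_food_model"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("meal.jpg")  # placeholder food photo
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```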
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
RhythmKulsh/DummyViT2307
# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
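The card's "How to Get Started with the Model" section is empty. The following is a minimal sketch under the assumption, suggested only by the repository name and the three dog-breed labels listed below, that this is a standard ViT image classifier; the card itself leaves the architecture unspecified, and `dog.jpg` is a placeholder:

```python
from transformers import pipeline

# Assumption: the checkpoint works with the generic image-classification pipeline.
classifier = pipeline("image-classification", model="RhythmKulsh/DummyViT2307")
print(classifier("dog.jpg"))  # placeholder image path
```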
[ "corgi", "samoyed", "shiba inu" ]
habibi26/document-crop
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# document-crop

This model is a fine-tuned version of [openai/clip-vit-base-patch32](https://huggingface.co/openai/clip-vit-base-patch32) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7682
- Accuracy: 0.9022

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 250

### Training results

| Training Loss | Epoch    | Step | Validation Loss | Accuracy |
|:-------------:|:--------:|:----:|:---------------:|:--------:|
| 0.6337 | 2.6667 | 20 | 0.6879 | 0.5870 |
| 0.4336 | 5.3333 | 40 | 0.7280 | 0.6413 |
| 0.2493 | 8.0 | 60 | 0.5044 | 0.75 |
| 0.1756 | 10.6667 | 80 | 0.3750 | 0.8478 |
| 0.1715 | 13.3333 | 100 | 0.7468 | 0.6957 |
| 0.1525 | 16.0 | 120 | 0.6240 | 0.7935 |
| 0.2019 | 18.6667 | 140 | 0.3115 | 0.8804 |
| 0.1366 | 21.3333 | 160 | 0.8020 | 0.7391 |
| 0.1729 | 24.0 | 180 | 0.7651 | 0.7283 |
| 0.1499 | 26.6667 | 200 | 0.6695 | 0.7826 |
| 0.1226 | 29.3333 | 220 | 0.5607 | 0.8370 |
| 0.1426 | 32.0 | 240 | 0.5363 | 0.8152 |
| 0.0986 | 34.6667 | 260 | 0.2214 | 0.9022 |
| 0.0984 | 37.3333 | 280 | 0.2494 | 0.9022 |
| 0.1764 | 40.0 | 300 | 0.3202 | 0.9022 |
| 0.0712 | 42.6667 | 320 | 0.6895 | 0.8370 |
| 0.104 | 45.3333 | 340 | 0.8008 | 0.75 |
| 0.107 | 48.0 | 360 | 0.6523 | 0.8696 |
| 0.1446 | 50.6667 | 380 | 0.4615 | 0.8370 |
| 0.0525 | 53.3333 | 400 | 0.5936 | 0.9130 |
| 0.1076 | 56.0 | 420 | 0.5063 | 0.9022 |
| 0.0554 | 58.6667 | 440 | 0.4740 | 0.8913 |
| 0.0701 | 61.3333 | 460 | 0.4842 | 0.8587 |
| 0.1011 | 64.0 | 480 | 0.5180 | 0.8587 |
| 0.0471 | 66.6667 | 500 | 1.6979 | 0.7717 |
| 0.0559 | 69.3333 | 520 | 0.4181 | 0.9022 |
| 0.0371 | 72.0 | 540 | 0.4239 | 0.9022 |
| 0.0653 | 74.6667 | 560 | 0.2725 | 0.9239 |
| 0.0564 | 77.3333 | 580 | 0.8607 | 0.8043 |
| 0.0427 | 80.0 | 600 | 0.2848 | 0.9457 |
| 0.1251 | 82.6667 | 620 | 0.3903 | 0.9022 |
| 0.023 | 85.3333 | 640 | 0.4514 | 0.9239 |
| 0.0297 | 88.0 | 660 | 0.7634 | 0.8913 |
| 0.0553 | 90.6667 | 680 | 0.5395 | 0.8913 |
| 0.0147 | 93.3333 | 700 | 0.7752 | 0.8696 |
| 0.0804 | 96.0 | 720 | 0.6780 | 0.8913 |
| 0.0154 | 98.6667 | 740 | 0.7887 | 0.8587 |
| 0.0063 | 101.3333 | 760 | 0.5492 | 0.9239 |
| 0.0131 | 104.0 | 780 | 0.8119 | 0.8804 |
| 0.0113 | 106.6667 | 800 | 1.0839 | 0.8587 |
| 0.0268 | 109.3333 | 820 | 1.0396 | 0.8587 |
| 0.0215 | 112.0 | 840 | 0.8707 | 0.9022 |
| 0.0271 | 114.6667 | 860 | 0.5733 | 0.9457 |
| 0.0208 | 117.3333 | 880 | 0.6780 | 0.9130 |
| 0.0224 | 120.0 | 900 | 0.3565 | 0.9457 |
| 0.0324 | 122.6667 | 920 | 0.3860 | 0.9239 |
| 0.019 | 125.3333 | 940 | 0.5652 | 0.9022 |
| 0.0079 | 128.0 | 960 | 0.5316 | 0.9348 |
| 0.0064 | 130.6667 | 980 | 0.5368 | 0.9239 |
| 0.0055 | 133.3333 | 1000 | 0.8009 | 0.8913 |
| 0.0156 | 136.0 | 1020 | 0.8391 | 0.9348 |
| 0.04 | 138.6667 | 1040 | 0.6336 | 0.9022 |
| 0.0031 | 141.3333 | 1060 | 0.5656 | 0.9348 |
| 0.0009 | 144.0 | 1080 | 0.4957 | 0.9348 |
| 0.0004 | 146.6667 | 1100 | 0.9136 | 0.8913 |
| 0.006 | 149.3333 | 1120 | 0.9782 | 0.8913 |
| 0.0004 | 152.0 | 1140 | 0.9065 | 0.9239 |
| 0.0042 | 154.6667 | 1160 | 0.9944 | 0.9130 |
| 0.0001 | 157.3333 | 1180 | 0.8723 | 0.9239 |
| 0.0002 | 160.0 | 1200 | 1.1987 | 0.8804 |
| 0.0083 | 162.6667 | 1220 | 0.7118 | 0.9239 |
| 0.0 | 165.3333 | 1240 | 0.7793 | 0.9130 |
| 0.0 | 168.0 | 1260 | 0.7330 | 0.9239 |
| 0.0038 | 170.6667 | 1280 | 0.5990 | 0.9348 |
| 0.0001 | 173.3333 | 1300 | 0.6496 | 0.9239 |
| 0.0 | 176.0 | 1320 | 0.8535 | 0.8913 |
| 0.0 | 178.6667 | 1340 | 0.6108 | 0.9348 |
| 0.0 | 181.3333 | 1360 | 0.5813 | 0.9348 |
| 0.0 | 184.0 | 1380 | 0.5817 | 0.9239 |
| 0.0 | 186.6667 | 1400 | 0.5852 | 0.9239 |
| 0.0 | 189.3333 | 1420 | 0.5877 | 0.9239 |
| 0.0 | 192.0 | 1440 | 0.5941 | 0.9239 |
| 0.0 | 194.6667 | 1460 | 0.6219 | 0.9130 |
| 0.0 | 197.3333 | 1480 | 0.6350 | 0.9130 |
| 0.0 | 200.0 | 1500 | 0.6388 | 0.9130 |
| 0.0 | 202.6667 | 1520 | 0.6409 | 0.9130 |
| 0.0 | 205.3333 | 1540 | 0.6423 | 0.9130 |
| 0.0 | 208.0 | 1560 | 0.6430 | 0.9130 |
| 0.0 | 210.6667 | 1580 | 0.6336 | 0.9130 |
| 0.0 | 213.3333 | 1600 | 0.7124 | 0.9022 |
| 0.0 | 216.0 | 1620 | 0.7457 | 0.9022 |
| 0.0 | 218.6667 | 1640 | 0.7498 | 0.9022 |
| 0.0 | 221.3333 | 1660 | 0.7505 | 0.9022 |
| 0.0 | 224.0 | 1680 | 0.7512 | 0.9022 |
| 0.0 | 226.6667 | 1700 | 0.7660 | 0.9022 |
| 0.0 | 229.3333 | 1720 | 0.7682 | 0.9022 |
| 0.0 | 232.0 | 1740 | 0.7682 | 0.9022 |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
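A minimal inference sketch, assuming the fine-tuned CLIP vision backbone is exposed through the standard `image-classification` pipeline (the path `page.jpg` is a placeholder for a document photo):

```python
from transformers import pipeline

classifier = pipeline("image-classification", model="habibi26/document-crop")
# Placeholder document photo; the model decides between "crop" and "not_crop".
print(classifier("page.jpg"))
```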
[ "crop", "not_crop" ]
hanad/self_harm_detection
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# self_harm_detection

This model is a fine-tuned version of [Falconsai/nsfw_image_detection](https://huggingface.co/Falconsai/nsfw_image_detection) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0386
- Accuracy: 0.9860

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.0772        | 0.9984 | 156  | 0.1007          | 0.9580   |
| 0.0351        | 1.9968 | 312  | 0.0557          | 0.9760   |
| 0.0206        | 2.9952 | 468  | 0.0386          | 0.9860   |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
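A minimal inference sketch, assuming the checkpoint (a ViT classifier, given its base model) works with the standard `image-classification` pipeline; `frame.jpg` is a placeholder, and the scores map to the two labels listed below:

```python
from transformers import pipeline

detector = pipeline("image-classification", model="hanad/self_harm_detection")
# Placeholder image path; returns scores for "normal" vs. "self_harm".
print(detector("frame.jpg"))
```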
[ "normal", "self_harm" ]
nvidia/MambaVision-B-1K
This repository contains the data for the paper [PAVE: Patching and Adapting Video Large Language Models](https://arxiv.org/abs/2503.19794).

Code: https://github.com/dragonlzm/PAVE

## Citation [optional]

arxiv.org/abs/2503.19794

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**
```
@misc{liu2025pavepatchingadaptingvideo,
      title={PAVE: Patching and Adapting Video Large Language Models},
      author={Zhuoming Liu and Yiquan Li and Khoi Duc Nguyen and Yiwu Zhong and Yin Li},
      year={2025},
      eprint={2503.19794},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2503.19794},
}
```
[ "tench, tinca tinca", "goldfish, carassius auratus", "great white shark, white shark, man-eater, man-eating shark, carcharodon carcharias", "tiger shark, galeocerdo cuvieri", "hammerhead, hammerhead shark", "electric ray, crampfish, numbfish, torpedo", "stingray", "cock", "hen", "ostrich, struthio camelus", "brambling, fringilla montifringilla", "goldfinch, carduelis carduelis", "house finch, linnet, carpodacus mexicanus", "junco, snowbird", "indigo bunting, indigo finch, indigo bird, passerina cyanea", "robin, american robin, turdus migratorius", "bulbul", "jay", "magpie", "chickadee", "water ouzel, dipper", "kite", "bald eagle, american eagle, haliaeetus leucocephalus", "vulture", "great grey owl, great gray owl, strix nebulosa", "european fire salamander, salamandra salamandra", "common newt, triturus vulgaris", "eft", "spotted salamander, ambystoma maculatum", "axolotl, mud puppy, ambystoma mexicanum", "bullfrog, rana catesbeiana", "tree frog, tree-frog", "tailed frog, bell toad, ribbed toad, tailed toad, ascaphus trui", "loggerhead, loggerhead turtle, caretta caretta", "leatherback turtle, leatherback, leathery turtle, dermochelys coriacea", "mud turtle", "terrapin", "box turtle, box tortoise", "banded gecko", "common iguana, iguana, iguana iguana", "american chameleon, anole, anolis carolinensis", "whiptail, whiptail lizard", "agama", "frilled lizard, chlamydosaurus kingi", "alligator lizard", "gila monster, heloderma suspectum", "green lizard, lacerta viridis", "african chameleon, chamaeleo chamaeleon", "komodo dragon, komodo lizard, dragon lizard, giant lizard, varanus komodoensis", "african crocodile, nile crocodile, crocodylus niloticus", "american alligator, alligator mississipiensis", "triceratops", "thunder snake, worm snake, carphophis amoenus", "ringneck snake, ring-necked snake, ring snake", "hognose snake, puff adder, sand viper", "green snake, grass snake", "king snake, kingsnake", "garter snake, grass snake", "water snake", "vine snake", "night snake, hypsiglena torquata", "boa constrictor, constrictor constrictor", "rock python, rock snake, python sebae", "indian cobra, naja naja", "green mamba", "sea snake", "horned viper, cerastes, sand viper, horned asp, cerastes cornutus", "diamondback, diamondback rattlesnake, crotalus adamanteus", "sidewinder, horned rattlesnake, crotalus cerastes", "trilobite", "harvestman, daddy longlegs, phalangium opilio", "scorpion", "black and gold garden spider, argiope aurantia", "barn spider, araneus cavaticus", "garden spider, aranea diademata", "black widow, latrodectus mactans", "tarantula", "wolf spider, hunting spider", "tick", "centipede", "black grouse", "ptarmigan", "ruffed grouse, partridge, bonasa umbellus", "prairie chicken, prairie grouse, prairie fowl", "peacock", "quail", "partridge", "african grey, african gray, psittacus erithacus", "macaw", "sulphur-crested cockatoo, kakatoe galerita, cacatua galerita", "lorikeet", "coucal", "bee eater", "hornbill", "hummingbird", "jacamar", "toucan", "drake", "red-breasted merganser, mergus serrator", "goose", "black swan, cygnus atratus", "tusker", "echidna, spiny anteater, anteater", "platypus, duckbill, duckbilled platypus, duck-billed platypus, ornithorhynchus anatinus", "wallaby, brush kangaroo", "koala, koala bear, kangaroo bear, native bear, phascolarctos cinereus", "wombat", "jellyfish", "sea anemone, anemone", "brain coral", "flatworm, platyhelminth", "nematode, nematode worm, roundworm", "conch", "snail", "slug", "sea slug, nudibranch", "chiton, coat-of-mail shell, sea 
cradle, polyplacophore", "chambered nautilus, pearly nautilus, nautilus", "dungeness crab, cancer magister", "rock crab, cancer irroratus", "fiddler crab", "king crab, alaska crab, alaskan king crab, alaska king crab, paralithodes camtschatica", "american lobster, northern lobster, maine lobster, homarus americanus", "spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish", "crayfish, crawfish, crawdad, crawdaddy", "hermit crab", "isopod", "white stork, ciconia ciconia", "black stork, ciconia nigra", "spoonbill", "flamingo", "little blue heron, egretta caerulea", "american egret, great white heron, egretta albus", "bittern", "crane", "limpkin, aramus pictus", "european gallinule, porphyrio porphyrio", "american coot, marsh hen, mud hen, water hen, fulica americana", "bustard", "ruddy turnstone, arenaria interpres", "red-backed sandpiper, dunlin, erolia alpina", "redshank, tringa totanus", "dowitcher", "oystercatcher, oyster catcher", "pelican", "king penguin, aptenodytes patagonica", "albatross, mollymawk", "grey whale, gray whale, devilfish, eschrichtius gibbosus, eschrichtius robustus", "killer whale, killer, orca, grampus, sea wolf, orcinus orca", "dugong, dugong dugon", "sea lion", "chihuahua", "japanese spaniel", "maltese dog, maltese terrier, maltese", "pekinese, pekingese, peke", "shih-tzu", "blenheim spaniel", "papillon", "toy terrier", "rhodesian ridgeback", "afghan hound, afghan", "basset, basset hound", "beagle", "bloodhound, sleuthhound", "bluetick", "black-and-tan coonhound", "walker hound, walker foxhound", "english foxhound", "redbone", "borzoi, russian wolfhound", "irish wolfhound", "italian greyhound", "whippet", "ibizan hound, ibizan podenco", "norwegian elkhound, elkhound", "otterhound, otter hound", "saluki, gazelle hound", "scottish deerhound, deerhound", "weimaraner", "staffordshire bullterrier, staffordshire bull terrier", "american staffordshire terrier, staffordshire terrier, american pit bull terrier, pit bull terrier", "bedlington terrier", "border terrier", "kerry blue terrier", "irish terrier", "norfolk terrier", "norwich terrier", "yorkshire terrier", "wire-haired fox terrier", "lakeland terrier", "sealyham terrier, sealyham", "airedale, airedale terrier", "cairn, cairn terrier", "australian terrier", "dandie dinmont, dandie dinmont terrier", "boston bull, boston terrier", "miniature schnauzer", "giant schnauzer", "standard schnauzer", "scotch terrier, scottish terrier, scottie", "tibetan terrier, chrysanthemum dog", "silky terrier, sydney silky", "soft-coated wheaten terrier", "west highland white terrier", "lhasa, lhasa apso", "flat-coated retriever", "curly-coated retriever", "golden retriever", "labrador retriever", "chesapeake bay retriever", "german short-haired pointer", "vizsla, hungarian pointer", "english setter", "irish setter, red setter", "gordon setter", "brittany spaniel", "clumber, clumber spaniel", "english springer, english springer spaniel", "welsh springer spaniel", "cocker spaniel, english cocker spaniel, cocker", "sussex spaniel", "irish water spaniel", "kuvasz", "schipperke", "groenendael", "malinois", "briard", "kelpie", "komondor", "old english sheepdog, bobtail", "shetland sheepdog, shetland sheep dog, shetland", "collie", "border collie", "bouvier des flandres, bouviers des flandres", "rottweiler", "german shepherd, german shepherd dog, german police dog, alsatian", "doberman, doberman pinscher", "miniature pinscher", "greater swiss mountain dog", "bernese mountain dog", "appenzeller", "entlebucher", "boxer", "bull 
mastiff", "tibetan mastiff", "french bulldog", "great dane", "saint bernard, st bernard", "eskimo dog, husky", "malamute, malemute, alaskan malamute", "siberian husky", "dalmatian, coach dog, carriage dog", "affenpinscher, monkey pinscher, monkey dog", "basenji", "pug, pug-dog", "leonberg", "newfoundland, newfoundland dog", "great pyrenees", "samoyed, samoyede", "pomeranian", "chow, chow chow", "keeshond", "brabancon griffon", "pembroke, pembroke welsh corgi", "cardigan, cardigan welsh corgi", "toy poodle", "miniature poodle", "standard poodle", "mexican hairless", "timber wolf, grey wolf, gray wolf, canis lupus", "white wolf, arctic wolf, canis lupus tundrarum", "red wolf, maned wolf, canis rufus, canis niger", "coyote, prairie wolf, brush wolf, canis latrans", "dingo, warrigal, warragal, canis dingo", "dhole, cuon alpinus", "african hunting dog, hyena dog, cape hunting dog, lycaon pictus", "hyena, hyaena", "red fox, vulpes vulpes", "kit fox, vulpes macrotis", "arctic fox, white fox, alopex lagopus", "grey fox, gray fox, urocyon cinereoargenteus", "tabby, tabby cat", "tiger cat", "persian cat", "siamese cat, siamese", "egyptian cat", "cougar, puma, catamount, mountain lion, painter, panther, felis concolor", "lynx, catamount", "leopard, panthera pardus", "snow leopard, ounce, panthera uncia", "jaguar, panther, panthera onca, felis onca", "lion, king of beasts, panthera leo", "tiger, panthera tigris", "cheetah, chetah, acinonyx jubatus", "brown bear, bruin, ursus arctos", "american black bear, black bear, ursus americanus, euarctos americanus", "ice bear, polar bear, ursus maritimus, thalarctos maritimus", "sloth bear, melursus ursinus, ursus ursinus", "mongoose", "meerkat, mierkat", "tiger beetle", "ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle", "ground beetle, carabid beetle", "long-horned beetle, longicorn, longicorn beetle", "leaf beetle, chrysomelid", "dung beetle", "rhinoceros beetle", "weevil", "fly", "bee", "ant, emmet, pismire", "grasshopper, hopper", "cricket", "walking stick, walkingstick, stick insect", "cockroach, roach", "mantis, mantid", "cicada, cicala", "leafhopper", "lacewing, lacewing fly", "dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk", "damselfly", "admiral", "ringlet, ringlet butterfly", "monarch, monarch butterfly, milkweed butterfly, danaus plexippus", "cabbage butterfly", "sulphur butterfly, sulfur butterfly", "lycaenid, lycaenid butterfly", "starfish, sea star", "sea urchin", "sea cucumber, holothurian", "wood rabbit, cottontail, cottontail rabbit", "hare", "angora, angora rabbit", "hamster", "porcupine, hedgehog", "fox squirrel, eastern fox squirrel, sciurus niger", "marmot", "beaver", "guinea pig, cavia cobaya", "sorrel", "zebra", "hog, pig, grunter, squealer, sus scrofa", "wild boar, boar, sus scrofa", "warthog", "hippopotamus, hippo, river horse, hippopotamus amphibius", "ox", "water buffalo, water ox, asiatic buffalo, bubalus bubalis", "bison", "ram, tup", "bighorn, bighorn sheep, cimarron, rocky mountain bighorn, rocky mountain sheep, ovis canadensis", "ibex, capra ibex", "hartebeest", "impala, aepyceros melampus", "gazelle", "arabian camel, dromedary, camelus dromedarius", "llama", "weasel", "mink", "polecat, fitch, foulmart, foumart, mustela putorius", "black-footed ferret, ferret, mustela nigripes", "otter", "skunk, polecat, wood pussy", "badger", "armadillo", "three-toed sloth, ai, bradypus tridactylus", "orangutan, orang, orangutang, pongo pygmaeus", "gorilla, 
gorilla gorilla", "chimpanzee, chimp, pan troglodytes", "gibbon, hylobates lar", "siamang, hylobates syndactylus, symphalangus syndactylus", "guenon, guenon monkey", "patas, hussar monkey, erythrocebus patas", "baboon", "macaque", "langur", "colobus, colobus monkey", "proboscis monkey, nasalis larvatus", "marmoset", "capuchin, ringtail, cebus capucinus", "howler monkey, howler", "titi, titi monkey", "spider monkey, ateles geoffroyi", "squirrel monkey, saimiri sciureus", "madagascar cat, ring-tailed lemur, lemur catta", "indri, indris, indri indri, indri brevicaudatus", "indian elephant, elephas maximus", "african elephant, loxodonta africana", "lesser panda, red panda, panda, bear cat, cat bear, ailurus fulgens", "giant panda, panda, panda bear, coon bear, ailuropoda melanoleuca", "barracouta, snoek", "eel", "coho, cohoe, coho salmon, blue jack, silver salmon, oncorhynchus kisutch", "rock beauty, holocanthus tricolor", "anemone fish", "sturgeon", "gar, garfish, garpike, billfish, lepisosteus osseus", "lionfish", "puffer, pufferfish, blowfish, globefish", "abacus", "abaya", "academic gown, academic robe, judge's robe", "accordion, piano accordion, squeeze box", "acoustic guitar", "aircraft carrier, carrier, flattop, attack aircraft carrier", "airliner", "airship, dirigible", "altar", "ambulance", "amphibian, amphibious vehicle", "analog clock", "apiary, bee house", "apron", "ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin", "assault rifle, assault gun", "backpack, back pack, knapsack, packsack, rucksack, haversack", "bakery, bakeshop, bakehouse", "balance beam, beam", "balloon", "ballpoint, ballpoint pen, ballpen, biro", "band aid", "banjo", "bannister, banister, balustrade, balusters, handrail", "barbell", "barber chair", "barbershop", "barn", "barometer", "barrel, cask", "barrow, garden cart, lawn cart, wheelbarrow", "baseball", "basketball", "bassinet", "bassoon", "bathing cap, swimming cap", "bath towel", "bathtub, bathing tub, bath, tub", "beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon", "beacon, lighthouse, beacon light, pharos", "beaker", "bearskin, busby, shako", "beer bottle", "beer glass", "bell cote, bell cot", "bib", "bicycle-built-for-two, tandem bicycle, tandem", "bikini, two-piece", "binder, ring-binder", "binoculars, field glasses, opera glasses", "birdhouse", "boathouse", "bobsled, bobsleigh, bob", "bolo tie, bolo, bola tie, bola", "bonnet, poke bonnet", "bookcase", "bookshop, bookstore, bookstall", "bottlecap", "bow", "bow tie, bow-tie, bowtie", "brass, memorial tablet, plaque", "brassiere, bra, bandeau", "breakwater, groin, groyne, mole, bulwark, seawall, jetty", "breastplate, aegis, egis", "broom", "bucket, pail", "buckle", "bulletproof vest", "bullet train, bullet", "butcher shop, meat market", "cab, hack, taxi, taxicab", "caldron, cauldron", "candle, taper, wax light", "cannon", "canoe", "can opener, tin opener", "cardigan", "car mirror", "carousel, carrousel, merry-go-round, roundabout, whirligig", "carpenter's kit, tool kit", "carton", "car wheel", "cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, atm", "cassette", "cassette player", "castle", "catamaran", "cd player", "cello, violoncello", "cellular telephone, cellular phone, cellphone, cell, mobile phone", "chain", "chainlink fence", "chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour", "chain saw, chainsaw", "chest", "chiffonier, 
commode", "chime, bell, gong", "china cabinet, china closet", "christmas stocking", "church, church building", "cinema, movie theater, movie theatre, movie house, picture palace", "cleaver, meat cleaver, chopper", "cliff dwelling", "cloak", "clog, geta, patten, sabot", "cocktail shaker", "coffee mug", "coffeepot", "coil, spiral, volute, whorl, helix", "combination lock", "computer keyboard, keypad", "confectionery, confectionary, candy store", "container ship, containership, container vessel", "convertible", "corkscrew, bottle screw", "cornet, horn, trumpet, trump", "cowboy boot", "cowboy hat, ten-gallon hat", "cradle", "crane", "crash helmet", "crate", "crib, cot", "crock pot", "croquet ball", "crutch", "cuirass", "dam, dike, dyke", "desk", "desktop computer", "dial telephone, dial phone", "diaper, nappy, napkin", "digital clock", "digital watch", "dining table, board", "dishrag, dishcloth", "dishwasher, dish washer, dishwashing machine", "disk brake, disc brake", "dock, dockage, docking facility", "dogsled, dog sled, dog sleigh", "dome", "doormat, welcome mat", "drilling platform, offshore rig", "drum, membranophone, tympan", "drumstick", "dumbbell", "dutch oven", "electric fan, blower", "electric guitar", "electric locomotive", "entertainment center", "envelope", "espresso maker", "face powder", "feather boa, boa", "file, file cabinet, filing cabinet", "fireboat", "fire engine, fire truck", "fire screen, fireguard", "flagpole, flagstaff", "flute, transverse flute", "folding chair", "football helmet", "forklift", "fountain", "fountain pen", "four-poster", "freight car", "french horn, horn", "frying pan, frypan, skillet", "fur coat", "garbage truck, dustcart", "gasmask, respirator, gas helmet", "gas pump, gasoline pump, petrol pump, island dispenser", "goblet", "go-kart", "golf ball", "golfcart, golf cart", "gondola", "gong, tam-tam", "gown", "grand piano, grand", "greenhouse, nursery, glasshouse", "grille, radiator grille", "grocery store, grocery, food market, market", "guillotine", "hair slide", "hair spray", "half track", "hammer", "hamper", "hand blower, blow dryer, blow drier, hair dryer, hair drier", "hand-held computer, hand-held microcomputer", "handkerchief, hankie, hanky, hankey", "hard disc, hard disk, fixed disk", "harmonica, mouth organ, harp, mouth harp", "harp", "harvester, reaper", "hatchet", "holster", "home theater, home theatre", "honeycomb", "hook, claw", "hoopskirt, crinoline", "horizontal bar, high bar", "horse cart, horse-cart", "hourglass", "ipod", "iron, smoothing iron", "jack-o'-lantern", "jean, blue jean, denim", "jeep, landrover", "jersey, t-shirt, tee shirt", "jigsaw puzzle", "jinrikisha, ricksha, rickshaw", "joystick", "kimono", "knee pad", "knot", "lab coat, laboratory coat", "ladle", "lampshade, lamp shade", "laptop, laptop computer", "lawn mower, mower", "lens cap, lens cover", "letter opener, paper knife, paperknife", "library", "lifeboat", "lighter, light, igniter, ignitor", "limousine, limo", "liner, ocean liner", "lipstick, lip rouge", "loafer", "lotion", "loudspeaker, speaker, speaker unit, loudspeaker system, speaker system", "loupe, jeweler's loupe", "lumbermill, sawmill", "magnetic compass", "mailbag, postbag", "mailbox, letter box", "maillot", "maillot, tank suit", "manhole cover", "maraca", "marimba, xylophone", "mask", "matchstick", "maypole", "maze, labyrinth", "measuring cup", "medicine chest, medicine cabinet", "megalith, megalithic structure", "microphone, mike", "microwave, microwave oven", "military uniform", "milk can", "minibus", 
"miniskirt, mini", "minivan", "missile", "mitten", "mixing bowl", "mobile home, manufactured home", "model t", "modem", "monastery", "monitor", "moped", "mortar", "mortarboard", "mosque", "mosquito net", "motor scooter, scooter", "mountain bike, all-terrain bike, off-roader", "mountain tent", "mouse, computer mouse", "mousetrap", "moving van", "muzzle", "nail", "neck brace", "necklace", "nipple", "notebook, notebook computer", "obelisk", "oboe, hautboy, hautbois", "ocarina, sweet potato", "odometer, hodometer, mileometer, milometer", "oil filter", "organ, pipe organ", "oscilloscope, scope, cathode-ray oscilloscope, cro", "overskirt", "oxcart", "oxygen mask", "packet", "paddle, boat paddle", "paddlewheel, paddle wheel", "padlock", "paintbrush", "pajama, pyjama, pj's, jammies", "palace", "panpipe, pandean pipe, syrinx", "paper towel", "parachute, chute", "parallel bars, bars", "park bench", "parking meter", "passenger car, coach, carriage", "patio, terrace", "pay-phone, pay-station", "pedestal, plinth, footstall", "pencil box, pencil case", "pencil sharpener", "perfume, essence", "petri dish", "photocopier", "pick, plectrum, plectron", "pickelhaube", "picket fence, paling", "pickup, pickup truck", "pier", "piggy bank, penny bank", "pill bottle", "pillow", "ping-pong ball", "pinwheel", "pirate, pirate ship", "pitcher, ewer", "plane, carpenter's plane, woodworking plane", "planetarium", "plastic bag", "plate rack", "plow, plough", "plunger, plumber's helper", "polaroid camera, polaroid land camera", "pole", "police van, police wagon, paddy wagon, patrol wagon, wagon, black maria", "poncho", "pool table, billiard table, snooker table", "pop bottle, soda bottle", "pot, flowerpot", "potter's wheel", "power drill", "prayer rug, prayer mat", "printer", "prison, prison house", "projectile, missile", "projector", "puck, hockey puck", "punching bag, punch bag, punching ball, punchball", "purse", "quill, quill pen", "quilt, comforter, comfort, puff", "racer, race car, racing car", "racket, racquet", "radiator", "radio, wireless", "radio telescope, radio reflector", "rain barrel", "recreational vehicle, rv, r.v.", "reel", "reflex camera", "refrigerator, icebox", "remote control, remote", "restaurant, eating house, eating place, eatery", "revolver, six-gun, six-shooter", "rifle", "rocking chair, rocker", "rotisserie", "rubber eraser, rubber, pencil eraser", "rugby ball", "rule, ruler", "running shoe", "safe", "safety pin", "saltshaker, salt shaker", "sandal", "sarong", "sax, saxophone", "scabbard", "scale, weighing machine", "school bus", "schooner", "scoreboard", "screen, crt screen", "screw", "screwdriver", "seat belt, seatbelt", "sewing machine", "shield, buckler", "shoe shop, shoe-shop, shoe store", "shoji", "shopping basket", "shopping cart", "shovel", "shower cap", "shower curtain", "ski", "ski mask", "sleeping bag", "slide rule, slipstick", "sliding door", "slot, one-armed bandit", "snorkel", "snowmobile", "snowplow, snowplough", "soap dispenser", "soccer ball", "sock", "solar dish, solar collector, solar furnace", "sombrero", "soup bowl", "space bar", "space heater", "space shuttle", "spatula", "speedboat", "spider web, spider's web", "spindle", "sports car, sport car", "spotlight, spot", "stage", "steam locomotive", "steel arch bridge", "steel drum", "stethoscope", "stole", "stone wall", "stopwatch, stop watch", "stove", "strainer", "streetcar, tram, tramcar, trolley, trolley car", "stretcher", "studio couch, day bed", "stupa, tope", "submarine, pigboat, sub, u-boat", "suit, suit of clothes", 
"sundial", "sunglass", "sunglasses, dark glasses, shades", "sunscreen, sunblock, sun blocker", "suspension bridge", "swab, swob, mop", "sweatshirt", "swimming trunks, bathing trunks", "swing", "switch, electric switch, electrical switch", "syringe", "table lamp", "tank, army tank, armored combat vehicle, armoured combat vehicle", "tape player", "teapot", "teddy, teddy bear", "television, television system", "tennis ball", "thatch, thatched roof", "theater curtain, theatre curtain", "thimble", "thresher, thrasher, threshing machine", "throne", "tile roof", "toaster", "tobacco shop, tobacconist shop, tobacconist", "toilet seat", "torch", "totem pole", "tow truck, tow car, wrecker", "toyshop", "tractor", "trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi", "tray", "trench coat", "tricycle, trike, velocipede", "trimaran", "tripod", "triumphal arch", "trolleybus, trolley coach, trackless trolley", "trombone", "tub, vat", "turnstile", "typewriter keyboard", "umbrella", "unicycle, monocycle", "upright, upright piano", "vacuum, vacuum cleaner", "vase", "vault", "velvet", "vending machine", "vestment", "viaduct", "violin, fiddle", "volleyball", "waffle iron", "wall clock", "wallet, billfold, notecase, pocketbook", "wardrobe, closet, press", "warplane, military plane", "washbasin, handbasin, washbowl, lavabo, wash-hand basin", "washer, automatic washer, washing machine", "water bottle", "water jug", "water tower", "whiskey jug", "whistle", "wig", "window screen", "window shade", "windsor tie", "wine bottle", "wing", "wok", "wooden spoon", "wool, woolen, woollen", "worm fence, snake fence, snake-rail fence, virginia fence", "wreck", "yawl", "yurt", "web site, website, internet site, site", "comic book", "crossword puzzle, crossword", "street sign", "traffic light, traffic signal, stoplight", "book jacket, dust cover, dust jacket, dust wrapper", "menu", "plate", "guacamole", "consomme", "hot pot, hotpot", "trifle", "ice cream, icecream", "ice lolly, lolly, lollipop, popsicle", "french loaf", "bagel, beigel", "pretzel", "cheeseburger", "hotdog, hot dog, red hot", "mashed potato", "head cabbage", "broccoli", "cauliflower", "zucchini, courgette", "spaghetti squash", "acorn squash", "butternut squash", "cucumber, cuke", "artichoke, globe artichoke", "bell pepper", "cardoon", "mushroom", "granny smith", "strawberry", "orange", "lemon", "fig", "pineapple, ananas", "banana", "jackfruit, jak, jack", "custard apple", "pomegranate", "hay", "carbonara", "chocolate sauce, chocolate syrup", "dough", "meat loaf, meatloaf", "pizza, pizza pie", "potpie", "burrito", "red wine", "espresso", "cup", "eggnog", "alp", "bubble", "cliff, drop, drop-off", "coral reef", "geyser", "lakeside, lakeshore", "promontory, headland, head, foreland", "sandbar, sand bar", "seashore, coast, seacoast, sea-coast", "valley, vale", "volcano", "ballplayer, baseball player", "groom, bridegroom", "scuba diver", "rapeseed", "daisy", "yellow lady's slipper, yellow lady-slipper, cypripedium calceolus, cypripedium parviflorum", "corn", "acorn", "hip, rose hip, rosehip", "buckeye, horse chestnut, conker", "coral fungus", "agaric", "gyromitra", "stinkhorn, carrion fungus", "earthstar", "hen-of-the-woods, hen of the woods, polyporus frondosus, grifola frondosa", "bolete", "ear, spike, capitulum", "toilet tissue, toilet paper, bathroom tissue" ]
Bekhzod/vit-base-patch16-224-in21k-lora
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
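Since the quick-start section above is empty, here is a minimal loading sketch. It assumes, as the repository name suggests, that this repo hosts a PEFT/LoRA adapter for `google/vit-base-patch16-224-in21k`; the adapter configuration, classifier head size, and label set are not documented in this card, and the image path is a placeholder.

```python
# A minimal sketch under the assumptions stated above, not documented usage.
from PIL import Image
from peft import PeftModel
from transformers import AutoImageProcessor, AutoModelForImageClassification

base_id = "google/vit-base-patch16-224-in21k"
adapter_id = "Bekhzod/vit-base-patch16-224-in21k-lora"

processor = AutoImageProcessor.from_pretrained(base_id)
# The head size is undocumented; pass num_labels=... here if you know it.
model = AutoModelForImageClassification.from_pretrained(base_id)
model = PeftModel.from_pretrained(model, adapter_id)  # attach the LoRA weights

inputs = processor(images=Image.open("example.jpg"), return_tensors="pt")
predicted_id = model(**inputs).logits.argmax(-1).item()
print(predicted_id)
```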
[ "apple_pie", "baby_back_ribs", "baklava", "beef_carpaccio", "beef_tartare", "beet_salad", "beignets", "bibimbap", "bread_pudding", "breakfast_burrito", "bruschetta", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare", "waffles" ]
Denspls/lunwenjizhunhoumen
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
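As a stand-in for the empty quick-start section, a minimal inference sketch: it assumes the checkpoint is a standard image-classification model whose classes are the CIFAR-10 labels listed below, and `"cat.png"` is a placeholder path.

```python
# A minimal sketch, assuming a standard image-classification checkpoint.
from transformers import pipeline

classifier = pipeline("image-classification", model="Denspls/lunwenjizhunhoumen")
predictions = classifier("cat.png")  # list of {"label": ..., "score": ...} dicts
print(predictions[0])
```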
[ "airplane", "automobile", "bird", "cat", "deer", "dog", "frog", "horse", "ship", "truck" ]
sunny77/vit-base-fashion
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "topwear", "bottomwear", "watches", "socks", "shoes", "belts", "flip flops", "bags", "innerwear", "sandal", "fragrance", "jewellery", "lips", "saree", "eyewear", "nails", "scarves", "dress", "loungewear and nightwear", "wallets", "apparel set", "headwear", "makeup", "free gifts", "ties", "accessories", "cufflinks" ]
Faariya-syed/swin-tiny-patch4-window7-224-finetuned-eurosat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-eurosat This model is a fine-tuned version of [nielsr/swin-tiny-patch4-window7-224-finetuned-eurosat](https://huggingface.co/nielsr/swin-tiny-patch4-window7-224-finetuned-eurosat) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-15 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 15 ### Framework versions - Transformers 4.42.4 - Pytorch 2.3.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
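For readers who want to reproduce this setup, the hyperparameters listed above map onto 🤗 `TrainingArguments` roughly as follows. This is a reconstruction, not the author's original script; model and dataset wiring are omitted.

```python
# A sketch reconstructing the listed hyperparameters; not the original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-eurosat",
    learning_rate=1e-15,              # as listed above
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,    # total train batch size: 128
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=15,
    seed=42,
)
```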
[ "annualcrop", "forest", "herbaceousvegetation", "highway", "industrial", "pasture", "permanentcrop", "residential", "river", "sealake" ]
Hanhpt23/vit_classification_food
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit_classification_food This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 2.1393 - Accuracy: 0.86 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 2.7317 | 0.992 | 62 | 2.5819 | 0.837 | | 2.1225 | 1.984 | 124 | 2.1393 | 0.86 | ### Framework versions - Transformers 4.42.4 - Pytorch 2.3.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
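A minimal inference sketch for this checkpoint, assuming it keeps the standard ViT image-classification layout of its base model; `"dish.jpg"` is a placeholder path.

```python
# A minimal sketch, assuming the standard image-classification layout.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "Hanhpt23/vit_classification_food"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

inputs = processor(images=Image.open("dish.jpg"), return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(-1)[0]

top = probs.argmax().item()
print(model.config.id2label[top], float(probs[top]))
```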
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
LongLe3102000/herbal_identification
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "10_tuc_doan", "11_thien_mon", "12_sai_ho", "13_vien_chi", "14_su_quan_tu", "15_bach_mao_can", "16_cau_ky_tu", "17_do_trong", "18_dang_sam", "19_cau_tich", "1_boi_mau", "20_tho_ty_tu", "21_hoang_ky", "22_coi_xay", "23_huyen_sam", "24_tang_chi", "25_diep_ha_chau", "26_kim_anh", "27_cat_can", "28_co_ngot", "29_cuc_hoa", "2_hoe_hoa", "30_to_moc", "31_kim_tien_thao", "32_dan_sam", "33_chi_tu", "34_ngai_cuu", "35_sinh_dia", "36_nguu_tat", "37_bach_truat", "38_nhan_tran", "39_duong_quy", "3_linh_chi", "40_nho_noi", "41_dao_nhan", "42_cat_canh", "43_ha_kho_thao", "44_xa_tien_tu", "45_che_day", "46_xa_can", "47_tang_diep", "48_ngu_boi_tu", "49_ngu_gia_bi", "4_thong_thao", "50_rau_ngo", "51_nguu_bang_tu", "52_cam_thao_dat", "53_dai_hoang", "54_hoai_son", "55_dam_duong_hoac", "56_moc_qua", "57_bo_cong_anh", "58_tho_phuc_linh", "59_mach_mon", "5_trach_ta", "60_ke_dau_ngua", "61_tang_bach_bi", "62_cam_thao_bac", "63_o_tac_cot", "64_thao_quyet_minh", "65_dai_tao", "66_kim_ngan_hoa", "67_tao_nhan", "68_ban_ha", "69_ca_gai_leo", "6_y_di", "70_kho_qua", "71_xuyen_tam_lien", "72_nhan_sam", "73_bach_gioi_tu", "74_tam_that", "75_bach_chi", "76_sa_sam", "77_bach_thuoc", "7_can_khuong", "8_ty_giai", "9_cot_toai_bo" ]
hanad/Drugs_detection
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Drugs_detection This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0447 - Accuracy: 0.9854 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:------:|:----:|:---------------:|:--------:| | 0.0746 | 0.9670 | 22 | 0.0708 | 0.9757 | | 0.0263 | 1.9780 | 45 | 0.0522 | 0.9854 | | 0.0179 | 2.9890 | 68 | 0.0829 | 0.9612 | | 0.0181 | 4.0 | 91 | 0.0438 | 0.9903 | | 0.0255 | 4.8352 | 110 | 0.0447 | 0.9854 | ### Framework versions - Transformers 4.40.1 - Pytorch 2.2.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
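A minimal usage sketch for this two-class model; `top_k=2` returns scores for both labels, and `"sample.jpg"` is a placeholder path.

```python
# A minimal sketch; the label names ("drug", "normal") come from this card.
from transformers import pipeline

detector = pipeline("image-classification", model="hanad/Drugs_detection")
for pred in detector("sample.jpg", top_k=2):
    print(pred["label"], round(pred["score"], 4))
```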
[ "drug", "normal" ]
ArrayDice/car_orientation_classification_zoomed
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # car_orientation_classification_zoomed This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unspecified dataset. It achieves the following results on the evaluation set: - Loss: 0.6108 - Accuracy: 0.7597 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 40 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.9887 | 1.0 | 68 | 1.9011 | 0.3463 | | 1.4388 | 2.0 | 136 | 1.3001 | 0.4594 | | 1.1799 | 3.0 | 204 | 1.1267 | 0.4841 | | 1.0245 | 4.0 | 272 | 0.9695 | 0.5936 | | 0.8203 | 5.0 | 340 | 0.8157 | 0.6890 | | 0.7146 | 6.0 | 408 | 0.7898 | 0.6678 | | 0.6137 | 7.0 | 476 | 0.6343 | 0.7420 | | 0.5746 | 8.0 | 544 | 0.6351 | 0.7527 | | 0.5316 | 9.0 | 612 | 0.5899 | 0.7986 | | 0.5073 | 10.0 | 680 | 0.6193 | 0.7491 | | 0.4854 | 11.0 | 748 | 0.5721 | 0.7845 | | 0.4347 | 12.0 | 816 | 0.6495 | 0.7562 | | 0.3937 | 13.0 | 884 | 0.6108 | 0.7597 | ### Framework versions - Transformers 4.42.4 - Pytorch 2.3.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "n", "ne", "e", "se", "s", "sw", "w", "nw" ]
Frances300/results
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # results This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.4683 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | No log | 1.0 | 32 | 1.8852 | | No log | 2.0 | 64 | 1.5778 | | No log | 3.0 | 96 | 1.4683 | ### Framework versions - Transformers 4.42.4 - Pytorch 2.3.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5", "label_6", "label_7", "label_8", "label_9" ]
nemik/mobilevitv2-1.0-imagenet1k-256-finetuned_v2024-7-25-frost
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # mobilevitv2-1.0-imagenet1k-256-finetuned_v2024-7-25-frost This model is a fine-tuned version of [apple/mobilevitv2-1.0-imagenet1k-256](https://huggingface.co/apple/mobilevitv2-1.0-imagenet1k-256) on the webdataset dataset. It achieves the following results on the evaluation set: - Loss: 0.1896 - Accuracy: 0.9310 - F1: 0.8227 - Precision: 0.8458 - Recall: 0.8009 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 30 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall | |:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:| | 0.6687 | 1.5625 | 100 | 0.6623 | 0.7230 | 0.5335 | 0.4022 | 0.7920 | | 0.4454 | 3.125 | 200 | 0.4152 | 0.8832 | 0.7490 | 0.6567 | 0.8717 | | 0.2835 | 4.6875 | 300 | 0.2661 | 0.9097 | 0.7661 | 0.7952 | 0.7389 | | 0.2197 | 6.25 | 400 | 0.2151 | 0.9195 | 0.7869 | 0.8358 | 0.7434 | | 0.1613 | 7.8125 | 500 | 0.2007 | 0.9292 | 0.8140 | 0.8578 | 0.7743 | | 0.1655 | 9.375 | 600 | 0.1935 | 0.9310 | 0.8227 | 0.8458 | 0.8009 | | 0.1815 | 10.9375 | 700 | 0.1883 | 0.9265 | 0.8074 | 0.8488 | 0.7699 | | 0.1316 | 12.5 | 800 | 0.1825 | 0.9327 | 0.8273 | 0.8505 | 0.8053 | | 0.1612 | 14.0625 | 900 | 0.1837 | 0.9257 | 0.8100 | 0.8287 | 0.7920 | | 0.118 | 15.625 | 1000 | 0.1896 | 0.9310 | 0.8227 | 0.8458 | 0.8009 | | 0.1178 | 17.1875 | 1100 | 0.1937 | 0.9239 | 0.8028 | 0.8333 | 0.7743 | | 0.1248 | 18.75 | 1200 | 0.1913 | 0.9301 | 0.8192 | 0.8483 | 0.7920 | | 0.1169 | 20.3125 | 1300 | 0.1916 | 0.9301 | 0.8167 | 0.8585 | 0.7788 | | 0.1094 | 21.875 | 1400 | 0.1925 | 0.9292 | 0.8182 | 0.8411 | 0.7965 | | 0.1108 | 23.4375 | 1500 | 0.1961 | 0.9345 | 0.8333 | 0.8486 | 0.8186 | | 0.1089 | 25.0 | 1600 | 0.1993 | 0.9283 | 0.8172 | 0.8341 | 0.8009 | | 0.0919 | 26.5625 | 1700 | 0.1936 | 0.9319 | 0.8262 | 0.8433 | 0.8097 | | 0.0969 | 28.125 | 1800 | 0.1978 | 0.9310 | 0.8227 | 0.8458 | 0.8009 | | 0.1093 | 29.6875 | 1900 | 0.1955 | 0.9283 | 0.8172 | 0.8341 | 0.8009 | ### Framework versions - Transformers 4.42.4 - Pytorch 2.3.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
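The four reported columns (accuracy, F1, precision, recall) can be produced by a `compute_metrics` function along these lines. This is a sketch, not the author's code, and the averaging mode is an assumption, since the card does not say whether the scores are binary or averaged across classes.

```python
# A sketch of a Trainer-style compute_metrics function; averaging is assumed.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```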
[ "snowing", "raining", "sunny", "cloudy", "night", "snow_on_road", "partial_snow_on_road", "clear_pavement", "wet_pavement", "iced_lens" ]
ayubkfupm/swin-tiny-patch4-window7-224-finetuned-wsdmhar
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-wsdmhar This model is a fine-tuned version of [nielsr/swin-tiny-patch4-window7-224-finetuned-eurosat](https://huggingface.co/nielsr/swin-tiny-patch4-window7-224-finetuned-eurosat) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.1444 - Accuracy: 0.9683 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.0624 | 1.0 | 53 | 0.8879 | 0.6092 | | 0.6893 | 2.0 | 106 | 0.6601 | 0.7090 | | 0.6152 | 3.0 | 159 | 0.5114 | 0.7855 | | 0.5456 | 4.0 | 212 | 0.3819 | 0.8423 | | 0.4673 | 5.0 | 265 | 0.3267 | 0.8719 | | 0.4166 | 6.0 | 318 | 0.2804 | 0.9039 | | 0.3757 | 7.0 | 371 | 0.2881 | 0.8994 | | 0.3798 | 8.0 | 424 | 0.2635 | 0.9032 | | 0.3303 | 9.0 | 477 | 0.2703 | 0.9074 | | 0.3346 | 10.0 | 530 | 0.2565 | 0.9005 | | 0.2971 | 11.0 | 583 | 0.2182 | 0.9311 | | 0.2992 | 12.0 | 636 | 0.2240 | 0.9256 | | 0.2637 | 13.0 | 689 | 0.2131 | 0.9239 | | 0.2653 | 14.0 | 742 | 0.1801 | 0.9397 | | 0.2472 | 15.0 | 795 | 0.1807 | 0.9377 | | 0.2263 | 16.0 | 848 | 0.1612 | 0.9466 | | 0.1786 | 17.0 | 901 | 0.1735 | 0.9418 | | 0.2103 | 18.0 | 954 | 0.1786 | 0.9463 | | 0.1725 | 19.0 | 1007 | 0.1631 | 0.9473 | | 0.1787 | 20.0 | 1060 | 0.1439 | 0.9532 | | 0.1924 | 21.0 | 1113 | 0.1388 | 0.9504 | | 0.1662 | 22.0 | 1166 | 0.1470 | 0.9508 | | 0.1724 | 23.0 | 1219 | 0.1538 | 0.9497 | | 0.1633 | 24.0 | 1272 | 0.1731 | 0.9384 | | 0.174 | 25.0 | 1325 | 0.1555 | 0.9539 | | 0.1657 | 26.0 | 1378 | 0.1542 | 0.9494 | | 0.1513 | 27.0 | 1431 | 0.1526 | 0.9508 | | 0.126 | 28.0 | 1484 | 0.1560 | 0.9511 | | 0.1508 | 29.0 | 1537 | 0.1607 | 0.9480 | | 0.1368 | 30.0 | 1590 | 0.1729 | 0.9435 | | 0.1166 | 31.0 | 1643 | 0.1555 | 0.9532 | | 0.1076 | 32.0 | 1696 | 0.1400 | 0.9580 | | 0.1189 | 33.0 | 1749 | 0.1419 | 0.9590 | | 0.1512 | 34.0 | 1802 | 0.1364 | 0.9580 | | 0.1323 | 35.0 | 1855 | 0.1497 | 0.9539 | | 0.1031 | 36.0 | 1908 | 0.1437 | 0.9580 | | 0.1215 | 37.0 | 1961 | 0.1460 | 0.9559 | | 0.1069 | 38.0 | 2014 | 0.1362 | 0.9601 | | 0.129 | 39.0 | 2067 | 0.1490 | 0.9590 | | 0.1202 | 40.0 | 2120 | 0.1616 | 0.9545 | | 0.1011 | 41.0 | 2173 | 0.1518 | 0.9570 | | 0.1092 | 42.0 | 2226 | 0.1308 | 0.9618 | | 0.1163 | 43.0 | 2279 | 0.1458 | 0.9590 | | 0.1074 | 44.0 | 2332 | 0.1414 | 0.9549 | | 0.0814 | 45.0 | 2385 | 0.1509 | 0.9580 | | 0.0985 | 46.0 | 2438 | 0.1287 | 0.9628 | | 0.0863 | 47.0 | 2491 | 0.1277 | 0.9625 | | 0.0932 | 48.0 | 2544 | 0.1453 | 0.9559 | | 0.0863 | 49.0 | 2597 | 0.1520 | 0.9566 | | 0.0887 | 50.0 | 2650 | 0.1279 | 0.9656 | | 0.0744 | 51.0 | 2703 | 0.1552 | 0.9566 | | 0.0928 | 52.0 | 2756 | 0.1465 | 0.9621 | | 0.0776 | 53.0 | 2809 | 0.1575 | 0.9583 | | 0.088 | 54.0 | 2862 | 0.1614 | 0.9563 | | 0.0909 | 55.0 | 2915 | 0.1312 | 0.9638 | | 
0.089 | 56.0 | 2968 | 0.1357 | 0.9652 | | 0.0587 | 57.0 | 3021 | 0.1510 | 0.9614 | | 0.0931 | 58.0 | 3074 | 0.1466 | 0.9580 | | 0.0878 | 59.0 | 3127 | 0.1499 | 0.9590 | | 0.0725 | 60.0 | 3180 | 0.1524 | 0.9597 | | 0.0543 | 61.0 | 3233 | 0.1543 | 0.9583 | | 0.0773 | 62.0 | 3286 | 0.1513 | 0.9635 | | 0.0626 | 63.0 | 3339 | 0.1511 | 0.9601 | | 0.0649 | 64.0 | 3392 | 0.1467 | 0.9594 | | 0.0705 | 65.0 | 3445 | 0.1443 | 0.9590 | | 0.0737 | 66.0 | 3498 | 0.1361 | 0.9607 | | 0.0518 | 67.0 | 3551 | 0.1441 | 0.9594 | | 0.0502 | 68.0 | 3604 | 0.1535 | 0.9590 | | 0.0701 | 69.0 | 3657 | 0.1362 | 0.9663 | | 0.0826 | 70.0 | 3710 | 0.1492 | 0.9611 | | 0.0715 | 71.0 | 3763 | 0.1615 | 0.9625 | | 0.0635 | 72.0 | 3816 | 0.1488 | 0.9642 | | 0.0522 | 73.0 | 3869 | 0.1456 | 0.9621 | | 0.0485 | 74.0 | 3922 | 0.1386 | 0.9645 | | 0.0629 | 75.0 | 3975 | 0.1463 | 0.9632 | | 0.0568 | 76.0 | 4028 | 0.1472 | 0.9621 | | 0.0556 | 77.0 | 4081 | 0.1440 | 0.9659 | | 0.0547 | 78.0 | 4134 | 0.1421 | 0.9635 | | 0.0527 | 79.0 | 4187 | 0.1444 | 0.9683 | | 0.054 | 80.0 | 4240 | 0.1464 | 0.9628 | | 0.0641 | 81.0 | 4293 | 0.1491 | 0.9635 | | 0.0546 | 82.0 | 4346 | 0.1529 | 0.9611 | | 0.059 | 83.0 | 4399 | 0.1462 | 0.9652 | | 0.0485 | 84.0 | 4452 | 0.1567 | 0.9632 | | 0.0388 | 85.0 | 4505 | 0.1548 | 0.9621 | | 0.0421 | 86.0 | 4558 | 0.1484 | 0.9621 | | 0.0375 | 87.0 | 4611 | 0.1681 | 0.9597 | | 0.0376 | 88.0 | 4664 | 0.1513 | 0.9632 | | 0.0514 | 89.0 | 4717 | 0.1485 | 0.9642 | | 0.0598 | 90.0 | 4770 | 0.1541 | 0.9638 | | 0.0431 | 91.0 | 4823 | 0.1474 | 0.9628 | | 0.0432 | 92.0 | 4876 | 0.1498 | 0.9645 | | 0.0391 | 93.0 | 4929 | 0.1506 | 0.9645 | | 0.0408 | 94.0 | 4982 | 0.1462 | 0.9642 | | 0.0335 | 95.0 | 5035 | 0.1509 | 0.9652 | | 0.0447 | 96.0 | 5088 | 0.1508 | 0.9635 | | 0.0477 | 97.0 | 5141 | 0.1510 | 0.9635 | | 0.0504 | 98.0 | 5194 | 0.1510 | 0.9642 | | 0.0406 | 99.0 | 5247 | 0.1479 | 0.9649 | | 0.0343 | 100.0 | 5300 | 0.1480 | 0.9645 | ### Framework versions - Transformers 4.43.2 - Pytorch 2.3.1+cu118 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "annualcrop", "forest", "herbaceousvegetation", "highway", "industrial", "pasture", "permanentcrop", "residential", "river", "sealake" ]
nemik/vit-base-patch16-384-finetuned_v2024-7-25-frost
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-384-finetuned_v2024-7-25-frost This model is a fine-tuned version of [google/vit-base-patch16-384](https://huggingface.co/google/vit-base-patch16-384) on the webdataset dataset. It achieves the following results on the evaluation set: - Loss: 0.0795 - Accuracy: 0.9747 - F1: 0.9373 - Precision: 0.9342 - Recall: 0.9404 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 30 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall | |:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:| | 0.0624 | 1.7544 | 100 | 0.0458 | 0.9867 | 0.9665 | 0.9774 | 0.9558 | | 0.0729 | 3.5088 | 200 | 0.0942 | 0.9689 | 0.9220 | 0.9303 | 0.9139 | | 0.0566 | 5.2632 | 300 | 0.0802 | 0.972 | 0.9311 | 0.9221 | 0.9404 | | 0.051 | 7.0175 | 400 | 0.0965 | 0.9631 | 0.9066 | 0.9243 | 0.8896 | | 0.0686 | 8.7719 | 500 | 0.0795 | 0.9747 | 0.9373 | 0.9342 | 0.9404 | | 0.0271 | 10.5263 | 600 | 0.0935 | 0.9693 | 0.9239 | 0.9229 | 0.9249 | | 0.0273 | 12.2807 | 700 | 0.0975 | 0.9716 | 0.9300 | 0.9219 | 0.9382 | | 0.0445 | 14.0351 | 800 | 0.0910 | 0.9698 | 0.9248 | 0.9268 | 0.9227 | | 0.0217 | 15.7895 | 900 | 0.0942 | 0.9698 | 0.9243 | 0.9326 | 0.9161 | | 0.0257 | 17.5439 | 1000 | 0.0906 | 0.9684 | 0.9210 | 0.9283 | 0.9139 | | 0.0188 | 19.2982 | 1100 | 0.1028 | 0.9676 | 0.9181 | 0.9338 | 0.9029 | | 0.0196 | 21.0526 | 1200 | 0.1020 | 0.9698 | 0.9244 | 0.9306 | 0.9183 | | 0.025 | 22.8070 | 1300 | 0.1005 | 0.9702 | 0.9258 | 0.9289 | 0.9227 | | 0.009 | 24.5614 | 1400 | 0.0976 | 0.9729 | 0.9324 | 0.9356 | 0.9294 | | 0.0184 | 26.3158 | 1500 | 0.0987 | 0.9716 | 0.9290 | 0.9332 | 0.9249 | | 0.0048 | 28.0702 | 1600 | 0.0958 | 0.972 | 0.9301 | 0.9353 | 0.9249 | | 0.0072 | 29.8246 | 1700 | 0.0948 | 0.972 | 0.9301 | 0.9353 | 0.9249 | ### Framework versions - Transformers 4.42.4 - Pytorch 2.3.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "snowing", "raining", "sunny", "cloudy", "night", "snow_on_road", "partial_snow_on_road", "clear_pavement", "wet_pavement", "iced_lens" ]
Tuu-invitrace/skin_decease
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # skin_decease This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0680 - Accuracy: 0.9872 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 8 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:------:|:----:|:---------------:|:--------:| | 0.2359 | 0.8621 | 100 | 0.2427 | 0.9744 | | 0.086 | 1.7241 | 200 | 0.1178 | 0.9872 | | 0.0435 | 2.5862 | 300 | 0.0801 | 0.9872 | | 0.0312 | 3.4483 | 400 | 0.0748 | 0.9872 | | 0.023 | 4.3103 | 500 | 0.0715 | 0.9872 | | 0.0197 | 5.1724 | 600 | 0.0696 | 0.9872 | | 0.0174 | 6.0345 | 700 | 0.0687 | 0.9872 | | 0.0161 | 6.8966 | 800 | 0.0684 | 0.9872 | | 0.0151 | 7.7586 | 900 | 0.0680 | 0.9872 | ### Framework versions - Transformers 4.43.2 - Pytorch 2.2.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
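For anyone re-training a model with this label set, the class names listed below (copied verbatim, including the stray space in `"ba- cellulitis"`) can be wired into a fresh classification head as follows. This is a reconstruction following the usual fine-tuning pattern, not the author's published code.

```python
# A sketch of wiring this card's class names into a fresh classification head.
from transformers import AutoModelForImageClassification

labels = [
    "ba- cellulitis", "ba-impetigo", "fu-athlete-foot", "fu-nail-fungus",
    "fu-ringworm", "pa-cutaneous-larva-migrans", "vi-chickenpox", "vi-shingles",
]
id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in enumerate(labels)}

model = AutoModelForImageClassification.from_pretrained(
    "google/vit-base-patch16-224-in21k",
    num_labels=len(labels),
    id2label=id2label,
    label2id=label2id,
)
```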
[ "ba- cellulitis", "ba-impetigo", "fu-athlete-foot", "fu-nail-fungus", "fu-ringworm", "pa-cutaneous-larva-migrans", "vi-chickenpox", "vi-shingles" ]
giswqs/my_awesome_food_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # my_awesome_food_model This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.6607 - Accuracy: 0.886 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 2.7361 | 0.992 | 62 | 2.5386 | 0.821 | | 1.8628 | 2.0 | 125 | 1.8000 | 0.893 | | 1.6416 | 2.976 | 186 | 1.6607 | 0.886 | ### Framework versions - Transformers 4.42.4 - Pytorch 2.3.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
Ikmalhakim/Your-Model-Name
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2" ]
djbp/swin-tiny-patch4-window7-224-MM_Classification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-MM_Classification This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.3468 - Accuracy: 0.8694 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 128 - eval_batch_size: 128 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 512 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 20 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.0476 | 1.0 | 19 | 0.7707 | 0.6530 | | 0.6226 | 2.0 | 38 | 0.4743 | 0.8105 | | 0.4477 | 3.0 | 57 | 0.4133 | 0.8323 | | 0.3963 | 4.0 | 76 | 0.3813 | 0.8476 | | 0.3694 | 5.0 | 95 | 0.3753 | 0.8540 | | 0.3451 | 6.0 | 114 | 0.3587 | 0.8489 | | 0.3382 | 7.0 | 133 | 0.3531 | 0.8451 | | 0.3253 | 8.0 | 152 | 0.3498 | 0.8579 | | 0.3121 | 9.0 | 171 | 0.3437 | 0.8579 | | 0.2855 | 10.0 | 190 | 0.3447 | 0.8656 | | 0.2961 | 11.0 | 209 | 0.3350 | 0.8617 | | 0.273 | 12.0 | 228 | 0.3484 | 0.8566 | | 0.2745 | 13.0 | 247 | 0.3433 | 0.8604 | | 0.2613 | 14.0 | 266 | 0.3498 | 0.8643 | | 0.2527 | 15.0 | 285 | 0.3365 | 0.8579 | | 0.2619 | 16.0 | 304 | 0.3450 | 0.8617 | | 0.2436 | 17.0 | 323 | 0.3454 | 0.8681 | | 0.2518 | 18.0 | 342 | 0.3437 | 0.8681 | | 0.243 | 19.0 | 361 | 0.3468 | 0.8694 | | 0.2415 | 20.0 | 380 | 0.3455 | 0.8694 | ### Framework versions - Transformers 4.43.3 - Pytorch 1.13.1+cu117 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "invalid", "mid market", "non mid market" ]
Hanhpt23/SwinLarge
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # SwinLarge This model is a fine-tuned version of [microsoft/swin-large-patch4-window12-384-in22k](https://huggingface.co/microsoft/swin-large-patch4-window12-384-in22k) on the NIH-Xray dataset. It achieves the following results on the evaluation set: - Loss: 3.7711 - Accuracy: 0.4938 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:------:|:----:|:---------------:|:--------:| | 1.8318 | 0.9984 | 315 | 1.7651 | 0.5437 | | 1.6067 | 2.0 | 631 | 1.6393 | 0.5455 | | 1.406 | 2.9984 | 946 | 1.6472 | 0.5490 | | 1.3983 | 4.0 | 1262 | 1.7344 | 0.5455 | | 0.7272 | 4.9984 | 1577 | 2.1283 | 0.5258 | | 0.3975 | 6.0 | 1893 | 2.5229 | 0.5134 | | 0.2648 | 6.9984 | 2208 | 3.0333 | 0.5080 | | 0.1232 | 8.0 | 2524 | 3.4626 | 0.5241 | | 0.0873 | 8.9984 | 2839 | 3.6219 | 0.5027 | | 0.0554 | 9.9842 | 3150 | 3.7711 | 0.4938 | ### Framework versions - Transformers 4.41.1 - Pytorch 2.3.0 - Datasets 2.19.1 - Tokenizers 0.19.1
[ "pleural_thickening", "pneumothorax", "effusion", "no finding", "infiltration", "mass", "nodule", "emphysema", "edema", "fibrosis", "cardiomegaly", "atelectasis", "pneumonia", "hernia", "consolidation" ]
Raidenv/swin-tiny-patch4-window7-224-finetuned-bootcamp
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-bootcamp This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Framework versions - Transformers 4.42.4 - Pytorch 2.3.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "corrugation", "corrugation flaking_multiple squat", "corrugation spalling", "crack", "crack flaking spalling", "crack flaking spalling_multiple", "crack flaking spalling_multiple squat", "crack flaking_multiple", "crack flaking_multiple spalling squat", "crack flaking_multiple spalling_multiple", "crack flaking_multiple spalling_multiple squat", "crack flaking_multiple spalling_multiple squat_multiple", "crack flaking_multiple squat", "crack spalling", "crack spalling squat", "crack spalling_multiple", "crack spalling_multiple squat", "crack spalling_multiple squat_multiple", "crack squat", "crack_multiple", "crack_multiple flaking spalling squat", "crack_multiple flaking spalling squat_multiple", "crack_multiple flaking spalling_multiple", "crack_multiple flaking spalling_multiple squat", "crack_multiple flaking spalling_multiple squat_multiple", "crack_multiple flaking_multiple", "crack_multiple flaking_multiple spalling", "crack_multiple flaking_multiple spalling squat", "crack_multiple flaking_multiple spalling squat_multiple", "crack_multiple flaking_multiple spalling_multiple", "crack_multiple flaking_multiple spalling_multiple squat", "crack_multiple flaking_multiple spalling_multiple squat_multiple", "crack_multiple flaking_multiple squat", "crack_multiple spalling", "crack_multiple spalling squat", "crack_multiple spalling squat_multiple", "crack_multiple spalling_multiple", "crack_multiple spalling_multiple squat", "crack_multiple spalling_multiple squat_multiple", "crack_multiple squat", "crack_multiple squat_multiple", "empty", "flaking", "flaking putus spalling_multiple", "flaking spalling", "flaking spalling squat", "flaking spalling_multiple", "flaking spalling_multiple squat", "flaking squat", "flaking squat_multiple", "flaking_multiple", "flaking_multiple spalling", "flaking_multiple spalling squat", "flaking_multiple spalling squat_multiple", "flaking_multiple spalling_multiple", "flaking_multiple spalling_multiple squat", "flaking_multiple spalling_multiple squat_multiple", "flaking_multiple squat", "flaking_multiple squat_multiple", "putus", "putus spalling_multiple", "spalling", "spalling squat", "spalling squat_multiple", "spalling_multiple", "spalling_multiple squat", "spalling_multiple squat_multiple", "squat", "squat_multiple" ]
NithaRenjith/swin-tiny-patch4-window7-224-finetuned-bootcamp
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-bootcamp This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.8963 - Accuracy: 0.7324 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-------:|:----:|:---------------:|:--------:| | No log | 0.8889 | 6 | 4.2849 | 0.0047 | | 4.3139 | 1.9259 | 13 | 4.1846 | 0.0329 | | 4.1651 | 2.9630 | 20 | 4.0585 | 0.0563 | | 4.1651 | 4.0 | 27 | 3.9527 | 0.0610 | | 3.9272 | 4.8889 | 33 | 3.8813 | 0.0610 | | 3.7461 | 5.9259 | 40 | 3.7536 | 0.0845 | | 3.7461 | 6.9630 | 47 | 3.6486 | 0.1080 | | 3.5254 | 8.0 | 54 | 3.5603 | 0.1362 | | 3.3478 | 8.8889 | 60 | 3.4566 | 0.1362 | | 3.3478 | 9.9259 | 67 | 3.2986 | 0.1502 | | 3.0423 | 10.9630 | 74 | 3.2166 | 0.1549 | | 2.7931 | 12.0 | 81 | 3.0203 | 0.2160 | | 2.7931 | 12.8889 | 87 | 2.8991 | 0.2911 | | 2.541 | 13.9259 | 94 | 2.7941 | 0.2911 | | 2.3487 | 14.9630 | 101 | 2.7337 | 0.2911 | | 2.3487 | 16.0 | 108 | 2.5401 | 0.3662 | | 2.1043 | 16.8889 | 114 | 2.5088 | 0.3803 | | 1.8892 | 17.9259 | 121 | 2.3596 | 0.4131 | | 1.8892 | 18.9630 | 128 | 2.3180 | 0.4178 | | 1.7167 | 20.0 | 135 | 2.1820 | 0.4272 | | 1.5748 | 20.8889 | 141 | 2.0547 | 0.4413 | | 1.5748 | 21.9259 | 148 | 1.9472 | 0.4930 | | 1.4052 | 22.9630 | 155 | 1.9053 | 0.4883 | | 1.2535 | 24.0 | 162 | 1.8179 | 0.5117 | | 1.2535 | 24.8889 | 168 | 1.7600 | 0.5305 | | 1.1687 | 25.9259 | 175 | 1.6922 | 0.5493 | | 1.0719 | 26.9630 | 182 | 1.6076 | 0.5587 | | 1.0719 | 28.0 | 189 | 1.5316 | 0.5587 | | 1.0577 | 28.8889 | 195 | 1.5365 | 0.5775 | | 0.9558 | 29.9259 | 202 | 1.4488 | 0.6291 | | 0.9558 | 30.9630 | 209 | 1.4185 | 0.6150 | | 0.8771 | 32.0 | 216 | 1.3906 | 0.6056 | | 0.8146 | 32.8889 | 222 | 1.3828 | 0.6150 | | 0.8146 | 33.9259 | 229 | 1.3927 | 0.5822 | | 0.8228 | 34.9630 | 236 | 1.3036 | 0.6385 | | 0.6878 | 36.0 | 243 | 1.2240 | 0.6808 | | 0.6878 | 36.8889 | 249 | 1.2388 | 0.6714 | | 0.6471 | 37.9259 | 256 | 1.1345 | 0.6808 | | 0.6102 | 38.9630 | 263 | 1.1815 | 0.6573 | | 0.6599 | 40.0 | 270 | 1.1720 | 0.6526 | | 0.6599 | 40.8889 | 276 | 1.1336 | 0.6526 | | 0.5742 | 41.9259 | 283 | 1.0863 | 0.6714 | | 0.5478 | 42.9630 | 290 | 1.0910 | 0.6714 | | 0.5478 | 44.0 | 297 | 1.0746 | 0.6620 | | 0.557 | 44.8889 | 303 | 1.0724 | 0.6808 | | 0.5753 | 45.9259 | 310 | 1.0108 | 0.7136 | | 0.5753 | 46.9630 | 317 | 1.1296 | 0.6432 | | 0.5325 | 48.0 | 324 | 1.0361 | 0.6901 | | 0.4349 | 48.8889 | 330 | 1.0237 | 0.6995 | | 0.4349 | 49.9259 | 337 | 0.9790 | 0.7183 | | 0.447 | 50.9630 | 344 | 1.0409 | 0.6808 | | 0.4502 | 52.0 | 351 | 1.0467 | 0.6714 | | 0.4502 | 52.8889 | 357 | 0.9773 | 0.7183 | | 0.4345 | 53.9259 | 364 | 0.9931 | 0.6808 | | 
0.4557 | 54.9630 | 371 | 0.9685 | 0.7136 | | 0.4557 | 56.0 | 378 | 0.9547 | 0.7371 | | 0.4109 | 56.8889 | 384 | 1.0015 | 0.6948 | | 0.4406 | 57.9259 | 391 | 0.9410 | 0.7230 | | 0.4406 | 58.9630 | 398 | 0.9765 | 0.6808 | | 0.4039 | 60.0 | 405 | 0.9505 | 0.7089 | | 0.396 | 60.8889 | 411 | 0.9539 | 0.7183 | | 0.396 | 61.9259 | 418 | 1.0391 | 0.6761 | | 0.3958 | 62.9630 | 425 | 0.9576 | 0.7136 | | 0.3763 | 64.0 | 432 | 0.9380 | 0.7230 | | 0.3763 | 64.8889 | 438 | 0.9363 | 0.7277 | | 0.3985 | 65.9259 | 445 | 0.9400 | 0.7089 | | 0.3701 | 66.9630 | 452 | 0.9769 | 0.7183 | | 0.3701 | 68.0 | 459 | 0.9604 | 0.7277 | | 0.3729 | 68.8889 | 465 | 0.9883 | 0.7089 | | 0.3958 | 69.9259 | 472 | 0.9516 | 0.7277 | | 0.3958 | 70.9630 | 479 | 0.9252 | 0.7183 | | 0.359 | 72.0 | 486 | 0.9196 | 0.7136 | | 0.362 | 72.8889 | 492 | 0.9104 | 0.7230 | | 0.362 | 73.9259 | 499 | 0.9255 | 0.7136 | | 0.353 | 74.9630 | 506 | 0.9359 | 0.7089 | | 0.345 | 76.0 | 513 | 0.9274 | 0.7230 | | 0.345 | 76.8889 | 519 | 0.9206 | 0.7371 | | 0.3414 | 77.9259 | 526 | 0.9229 | 0.7277 | | 0.3298 | 78.9630 | 533 | 0.9102 | 0.7418 | | 0.3394 | 80.0 | 540 | 0.8955 | 0.7512 | | 0.3394 | 80.8889 | 546 | 0.8956 | 0.7371 | | 0.3384 | 81.9259 | 553 | 0.8927 | 0.7277 | | 0.3164 | 82.9630 | 560 | 0.8885 | 0.7418 | | 0.3164 | 84.0 | 567 | 0.8941 | 0.7371 | | 0.3055 | 84.8889 | 573 | 0.8963 | 0.7418 | | 0.3355 | 85.9259 | 580 | 0.8992 | 0.7324 | | 0.3355 | 86.9630 | 587 | 0.8988 | 0.7324 | | 0.3101 | 88.0 | 594 | 0.8969 | 0.7324 | | 0.3218 | 88.8889 | 600 | 0.8963 | 0.7324 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.3.1 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "corrugation", "corrugation flaking_multiple squat", "corrugation spalling", "crack", "crack flaking spalling", "crack flaking spalling_multiple", "crack flaking spalling_multiple squat", "crack flaking_multiple", "crack flaking_multiple spalling squat", "crack flaking_multiple spalling_multiple", "crack flaking_multiple spalling_multiple squat", "crack flaking_multiple spalling_multiple squat_multiple", "crack flaking_multiple squat", "crack spalling", "crack spalling squat", "crack spalling_multiple", "crack spalling_multiple squat", "crack spalling_multiple squat_multiple", "crack squat", "crack_multiple", "crack_multiple flaking spalling squat", "crack_multiple flaking spalling squat_multiple", "crack_multiple flaking spalling_multiple", "crack_multiple flaking spalling_multiple squat", "crack_multiple flaking spalling_multiple squat_multiple", "crack_multiple flaking_multiple", "crack_multiple flaking_multiple spalling", "crack_multiple flaking_multiple spalling squat", "crack_multiple flaking_multiple spalling squat_multiple", "crack_multiple flaking_multiple spalling_multiple", "crack_multiple flaking_multiple spalling_multiple squat", "crack_multiple flaking_multiple spalling_multiple squat_multiple", "crack_multiple flaking_multiple squat", "crack_multiple spalling", "crack_multiple spalling squat", "crack_multiple spalling squat_multiple", "crack_multiple spalling_multiple", "crack_multiple spalling_multiple squat", "crack_multiple spalling_multiple squat_multiple", "crack_multiple squat", "crack_multiple squat_multiple", "empty", "flaking", "flaking putus spalling_multiple", "flaking spalling", "flaking spalling squat", "flaking spalling_multiple", "flaking spalling_multiple squat", "flaking squat", "flaking squat_multiple", "flaking_multiple", "flaking_multiple spalling", "flaking_multiple spalling squat", "flaking_multiple spalling squat_multiple", "flaking_multiple spalling_multiple", "flaking_multiple spalling_multiple squat", "flaking_multiple spalling_multiple squat_multiple", "flaking_multiple squat", "flaking_multiple squat_multiple", "putus", "putus spalling_multiple", "spalling", "spalling squat", "spalling squat_multiple", "spalling_multiple", "spalling_multiple squat", "spalling_multiple squat_multiple", "squat", "squat_multiple" ]
Ikmalhakim/ikmalmodel
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2" ]
ayubkfupm/swin-tiny-patch4-window7-224-finetuned-st-wsdmhar
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-st-wsdmhar This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.1144 - Accuracy: 0.9711 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.4641 | 1.0 | 53 | 1.2510 | 0.5768 | | 0.7713 | 2.0 | 106 | 0.6282 | 0.7435 | | 0.5822 | 3.0 | 159 | 0.4503 | 0.8251 | | 0.5086 | 4.0 | 212 | 0.4028 | 0.8306 | | 0.4499 | 5.0 | 265 | 0.3399 | 0.8709 | | 0.4171 | 6.0 | 318 | 0.3024 | 0.8898 | | 0.3597 | 7.0 | 371 | 0.2584 | 0.9029 | | 0.3201 | 8.0 | 424 | 0.2467 | 0.9143 | | 0.2974 | 9.0 | 477 | 0.2784 | 0.8981 | | 0.3323 | 10.0 | 530 | 0.2414 | 0.9101 | | 0.2717 | 11.0 | 583 | 0.2051 | 0.9349 | | 0.2647 | 12.0 | 636 | 0.1944 | 0.9294 | | 0.296 | 13.0 | 689 | 0.1871 | 0.9329 | | 0.2434 | 14.0 | 742 | 0.1701 | 0.9411 | | 0.2293 | 15.0 | 795 | 0.1685 | 0.9435 | | 0.2196 | 16.0 | 848 | 0.1486 | 0.9446 | | 0.2249 | 17.0 | 901 | 0.1438 | 0.9456 | | 0.2142 | 18.0 | 954 | 0.1529 | 0.9449 | | 0.2024 | 19.0 | 1007 | 0.1361 | 0.9532 | | 0.2177 | 20.0 | 1060 | 0.1906 | 0.9315 | | 0.1812 | 21.0 | 1113 | 0.1333 | 0.9532 | | 0.1771 | 22.0 | 1166 | 0.1424 | 0.9528 | | 0.1718 | 23.0 | 1219 | 0.1370 | 0.9545 | | 0.1333 | 24.0 | 1272 | 0.1550 | 0.9466 | | 0.1592 | 25.0 | 1325 | 0.1344 | 0.9535 | | 0.1279 | 26.0 | 1378 | 0.1201 | 0.9590 | | 0.1547 | 27.0 | 1431 | 0.1225 | 0.9597 | | 0.1443 | 28.0 | 1484 | 0.1526 | 0.9480 | | 0.1306 | 29.0 | 1537 | 0.1188 | 0.9597 | | 0.129 | 30.0 | 1590 | 0.1204 | 0.9614 | | 0.1354 | 31.0 | 1643 | 0.1351 | 0.9570 | | 0.135 | 32.0 | 1696 | 0.1010 | 0.9676 | | 0.137 | 33.0 | 1749 | 0.1381 | 0.9566 | | 0.101 | 34.0 | 1802 | 0.1119 | 0.9642 | | 0.1118 | 35.0 | 1855 | 0.1056 | 0.9656 | | 0.1206 | 36.0 | 1908 | 0.0975 | 0.9663 | | 0.1028 | 37.0 | 1961 | 0.1265 | 0.9635 | | 0.1058 | 38.0 | 2014 | 0.0958 | 0.9680 | | 0.1013 | 39.0 | 2067 | 0.1060 | 0.9642 | | 0.0765 | 40.0 | 2120 | 0.1024 | 0.9659 | | 0.0997 | 41.0 | 2173 | 0.1116 | 0.9642 | | 0.0933 | 42.0 | 2226 | 0.1082 | 0.9666 | | 0.0937 | 43.0 | 2279 | 0.1095 | 0.9680 | | 0.0831 | 44.0 | 2332 | 0.1034 | 0.9683 | | 0.0849 | 45.0 | 2385 | 0.0995 | 0.9690 | | 0.0739 | 46.0 | 2438 | 0.1042 | 0.9666 | | 0.0944 | 47.0 | 2491 | 0.1068 | 0.9697 | | 0.0821 | 48.0 | 2544 | 0.1087 | 0.9690 | | 0.0835 | 49.0 | 2597 | 0.0975 | 0.9721 | | 0.0666 | 50.0 | 2650 | 0.1189 | 0.9638 | | 0.0614 | 51.0 | 2703 | 0.1421 | 0.9611 | | 0.0738 | 52.0 | 2756 | 0.1253 | 0.9638 | | 0.0949 | 53.0 | 2809 | 0.1274 | 0.9663 | | 0.068 | 54.0 | 2862 | 0.1051 | 0.9669 | | 0.0626 | 55.0 | 2915 | 0.1102 | 0.9673 | | 0.0647 | 56.0 | 2968 | 0.1096 | 
0.9673 | | 0.0803 | 57.0 | 3021 | 0.1049 | 0.9683 | | 0.0744 | 58.0 | 3074 | 0.1039 | 0.9697 | | 0.0769 | 59.0 | 3127 | 0.1060 | 0.9690 | | 0.0763 | 60.0 | 3180 | 0.1077 | 0.9680 | | 0.0591 | 61.0 | 3233 | 0.1165 | 0.9680 | | 0.0649 | 62.0 | 3286 | 0.1109 | 0.9694 | | 0.0557 | 63.0 | 3339 | 0.1162 | 0.9680 | | 0.0644 | 64.0 | 3392 | 0.1039 | 0.9718 | | 0.0558 | 65.0 | 3445 | 0.1182 | 0.9687 | | 0.0633 | 66.0 | 3498 | 0.1151 | 0.9680 | | 0.0586 | 67.0 | 3551 | 0.1147 | 0.9694 | | 0.0651 | 68.0 | 3604 | 0.1124 | 0.9711 | | 0.0693 | 69.0 | 3657 | 0.1104 | 0.9687 | | 0.0584 | 70.0 | 3710 | 0.1177 | 0.9697 | | 0.0471 | 71.0 | 3763 | 0.1160 | 0.9690 | | 0.0614 | 72.0 | 3816 | 0.1220 | 0.9680 | | 0.0583 | 73.0 | 3869 | 0.1236 | 0.9656 | | 0.0495 | 74.0 | 3922 | 0.1076 | 0.9718 | | 0.0574 | 75.0 | 3975 | 0.1163 | 0.9673 | | 0.0399 | 76.0 | 4028 | 0.1126 | 0.9683 | | 0.0357 | 77.0 | 4081 | 0.1064 | 0.9728 | | 0.0441 | 78.0 | 4134 | 0.1139 | 0.9694 | | 0.0504 | 79.0 | 4187 | 0.1083 | 0.9707 | | 0.0546 | 80.0 | 4240 | 0.1167 | 0.9676 | | 0.0528 | 81.0 | 4293 | 0.1143 | 0.9697 | | 0.0385 | 82.0 | 4346 | 0.1226 | 0.9676 | | 0.0511 | 83.0 | 4399 | 0.1199 | 0.9694 | | 0.0533 | 84.0 | 4452 | 0.1279 | 0.9673 | | 0.043 | 85.0 | 4505 | 0.1161 | 0.9714 | | 0.0231 | 86.0 | 4558 | 0.1166 | 0.9728 | | 0.0426 | 87.0 | 4611 | 0.1239 | 0.9690 | | 0.0565 | 88.0 | 4664 | 0.1189 | 0.9687 | | 0.0378 | 89.0 | 4717 | 0.1186 | 0.9697 | | 0.0406 | 90.0 | 4770 | 0.1209 | 0.9718 | | 0.0306 | 91.0 | 4823 | 0.1189 | 0.9721 | | 0.0354 | 92.0 | 4876 | 0.1244 | 0.9687 | | 0.0293 | 93.0 | 4929 | 0.1235 | 0.9697 | | 0.0381 | 94.0 | 4982 | 0.1186 | 0.9711 | | 0.0372 | 95.0 | 5035 | 0.1172 | 0.9714 | | 0.0469 | 96.0 | 5088 | 0.1180 | 0.9711 | | 0.0535 | 97.0 | 5141 | 0.1152 | 0.9718 | | 0.0496 | 98.0 | 5194 | 0.1157 | 0.9714 | | 0.034 | 99.0 | 5247 | 0.1145 | 0.9714 | | 0.0348 | 100.0 | 5300 | 0.1144 | 0.9711 | ### Framework versions - Transformers 4.43.2 - Pytorch 2.3.1+cu118 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "laying", "sitting", "standing", "walking", "walking_downstairs", "walking_upstairs" ]
orha/cnn
# My Custom Model This is a custom CNN for image classification over three classes (class0, class1, class2).
[ "class0", "class1", "class2" ]
ayubkfupm/swin-tiny-patch4-window7-224-finetuned-st-wsdmhar-160
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-st-wsdmhar-160 This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.1319 - Accuracy: 0.9735 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 160 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.6426 | 1.0 | 53 | 1.5279 | 0.4642 | | 0.9084 | 2.0 | 106 | 0.7447 | 0.7028 | | 0.6514 | 3.0 | 159 | 0.5340 | 0.7934 | | 0.56 | 4.0 | 212 | 0.5470 | 0.7621 | | 0.4578 | 5.0 | 265 | 0.3704 | 0.8547 | | 0.4811 | 6.0 | 318 | 0.3317 | 0.8705 | | 0.4211 | 7.0 | 371 | 0.3045 | 0.8764 | | 0.3828 | 8.0 | 424 | 0.3240 | 0.8908 | | 0.3505 | 9.0 | 477 | 0.2840 | 0.8836 | | 0.3652 | 10.0 | 530 | 0.2430 | 0.9084 | | 0.3385 | 11.0 | 583 | 0.2359 | 0.9153 | | 0.3071 | 12.0 | 636 | 0.2899 | 0.8946 | | 0.3319 | 13.0 | 689 | 0.2588 | 0.9108 | | 0.2657 | 14.0 | 742 | 0.1920 | 0.9270 | | 0.2423 | 15.0 | 795 | 0.1830 | 0.9370 | | 0.2625 | 16.0 | 848 | 0.2116 | 0.9273 | | 0.2394 | 17.0 | 901 | 0.1917 | 0.9353 | | 0.2245 | 18.0 | 954 | 0.1894 | 0.9280 | | 0.2123 | 19.0 | 1007 | 0.1768 | 0.9346 | | 0.2158 | 20.0 | 1060 | 0.1902 | 0.9311 | | 0.1943 | 21.0 | 1113 | 0.1704 | 0.9390 | | 0.1712 | 22.0 | 1166 | 0.1442 | 0.9466 | | 0.2073 | 23.0 | 1219 | 0.1279 | 0.9552 | | 0.1676 | 24.0 | 1272 | 0.1548 | 0.9459 | | 0.1775 | 25.0 | 1325 | 0.1371 | 0.9497 | | 0.1644 | 26.0 | 1378 | 0.1247 | 0.9563 | | 0.1652 | 27.0 | 1431 | 0.1547 | 0.9408 | | 0.1383 | 28.0 | 1484 | 0.1301 | 0.9545 | | 0.1268 | 29.0 | 1537 | 0.1484 | 0.9504 | | 0.1252 | 30.0 | 1590 | 0.1385 | 0.9549 | | 0.1288 | 31.0 | 1643 | 0.1368 | 0.9521 | | 0.1353 | 32.0 | 1696 | 0.1184 | 0.9597 | | 0.1585 | 33.0 | 1749 | 0.1218 | 0.9570 | | 0.1445 | 34.0 | 1802 | 0.1173 | 0.9597 | | 0.1381 | 35.0 | 1855 | 0.1160 | 0.9614 | | 0.1398 | 36.0 | 1908 | 0.1292 | 0.9542 | | 0.1138 | 37.0 | 1961 | 0.1135 | 0.9597 | | 0.147 | 38.0 | 2014 | 0.0958 | 0.9690 | | 0.0927 | 39.0 | 2067 | 0.1008 | 0.9663 | | 0.1031 | 40.0 | 2120 | 0.1105 | 0.9659 | | 0.1197 | 41.0 | 2173 | 0.1010 | 0.9656 | | 0.1325 | 42.0 | 2226 | 0.1178 | 0.9621 | | 0.0862 | 43.0 | 2279 | 0.1042 | 0.9659 | | 0.1037 | 44.0 | 2332 | 0.1016 | 0.9680 | | 0.0885 | 45.0 | 2385 | 0.1063 | 0.9649 | | 0.1217 | 46.0 | 2438 | 0.1117 | 0.9673 | | 0.0947 | 47.0 | 2491 | 0.1048 | 0.9673 | | 0.0831 | 48.0 | 2544 | 0.1061 | 0.9666 | | 0.1082 | 49.0 | 2597 | 0.0946 | 0.9680 | | 0.0856 | 50.0 | 2650 | 0.1139 | 0.9694 | | 0.0832 | 51.0 | 2703 | 0.1152 | 0.9618 | | 0.0823 | 52.0 | 2756 | 0.0970 | 0.9721 | | 0.0773 | 53.0 | 2809 | 0.1049 | 0.9683 | | 0.0794 | 54.0 | 2862 | 0.1048 | 0.9731 | | 0.0813 | 55.0 | 2915 | 0.1089 | 0.9669 | | 0.079 | 56.0 | 2968 | 
0.0982 | 0.9704 | | 0.095 | 57.0 | 3021 | 0.1242 | 0.9680 | | 0.0775 | 58.0 | 3074 | 0.1262 | 0.9676 | | 0.0795 | 59.0 | 3127 | 0.1276 | 0.9649 | | 0.0619 | 60.0 | 3180 | 0.0937 | 0.9704 | | 0.0688 | 61.0 | 3233 | 0.1149 | 0.9707 | | 0.0932 | 62.0 | 3286 | 0.1019 | 0.9700 | | 0.0675 | 63.0 | 3339 | 0.1239 | 0.9687 | | 0.0715 | 64.0 | 3392 | 0.1143 | 0.9669 | | 0.0858 | 65.0 | 3445 | 0.1053 | 0.9680 | | 0.0646 | 66.0 | 3498 | 0.1150 | 0.9694 | | 0.0736 | 67.0 | 3551 | 0.1119 | 0.9700 | | 0.0665 | 68.0 | 3604 | 0.1031 | 0.9721 | | 0.0509 | 69.0 | 3657 | 0.1069 | 0.9731 | | 0.0642 | 70.0 | 3710 | 0.1171 | 0.9704 | | 0.0588 | 71.0 | 3763 | 0.1235 | 0.9718 | | 0.0837 | 72.0 | 3816 | 0.1121 | 0.9700 | | 0.0534 | 73.0 | 3869 | 0.1162 | 0.9704 | | 0.0612 | 74.0 | 3922 | 0.1116 | 0.9697 | | 0.0621 | 75.0 | 3975 | 0.1220 | 0.9700 | | 0.063 | 76.0 | 4028 | 0.1084 | 0.9714 | | 0.0604 | 77.0 | 4081 | 0.1180 | 0.9694 | | 0.0511 | 78.0 | 4134 | 0.1325 | 0.9687 | | 0.05 | 79.0 | 4187 | 0.1179 | 0.9680 | | 0.072 | 80.0 | 4240 | 0.1516 | 0.9597 | | 0.0746 | 81.0 | 4293 | 0.1159 | 0.9714 | | 0.0544 | 82.0 | 4346 | 0.1201 | 0.9707 | | 0.0527 | 83.0 | 4399 | 0.1232 | 0.9725 | | 0.044 | 84.0 | 4452 | 0.1450 | 0.9700 | | 0.0462 | 85.0 | 4505 | 0.1229 | 0.9690 | | 0.0445 | 86.0 | 4558 | 0.1404 | 0.9669 | | 0.0524 | 87.0 | 4611 | 0.1153 | 0.9711 | | 0.0638 | 88.0 | 4664 | 0.1207 | 0.9707 | | 0.0435 | 89.0 | 4717 | 0.1289 | 0.9718 | | 0.0567 | 90.0 | 4770 | 0.1167 | 0.9700 | | 0.0553 | 91.0 | 4823 | 0.1100 | 0.9742 | | 0.0566 | 92.0 | 4876 | 0.1319 | 0.9721 | | 0.0462 | 93.0 | 4929 | 0.1275 | 0.9707 | | 0.0539 | 94.0 | 4982 | 0.1263 | 0.9711 | | 0.0561 | 95.0 | 5035 | 0.1333 | 0.9725 | | 0.0362 | 96.0 | 5088 | 0.1241 | 0.9704 | | 0.0435 | 97.0 | 5141 | 0.1199 | 0.9714 | | 0.0637 | 98.0 | 5194 | 0.1290 | 0.9707 | | 0.0466 | 99.0 | 5247 | 0.1200 | 0.9666 | | 0.0471 | 100.0 | 5300 | 0.1556 | 0.9656 | | 0.0407 | 101.0 | 5353 | 0.1334 | 0.9707 | | 0.0375 | 102.0 | 5406 | 0.1307 | 0.9707 | | 0.0375 | 103.0 | 5459 | 0.1392 | 0.9687 | | 0.0354 | 104.0 | 5512 | 0.1237 | 0.9714 | | 0.0523 | 105.0 | 5565 | 0.1298 | 0.9711 | | 0.0307 | 106.0 | 5618 | 0.1283 | 0.9687 | | 0.0427 | 107.0 | 5671 | 0.1300 | 0.9683 | | 0.0327 | 108.0 | 5724 | 0.1292 | 0.9711 | | 0.0411 | 109.0 | 5777 | 0.1377 | 0.9683 | | 0.0422 | 110.0 | 5830 | 0.1260 | 0.9697 | | 0.044 | 111.0 | 5883 | 0.1183 | 0.9731 | | 0.0332 | 112.0 | 5936 | 0.1347 | 0.9735 | | 0.0302 | 113.0 | 5989 | 0.1251 | 0.9731 | | 0.0273 | 114.0 | 6042 | 0.1100 | 0.9728 | | 0.0442 | 115.0 | 6095 | 0.1368 | 0.9728 | | 0.0337 | 116.0 | 6148 | 0.1308 | 0.9697 | | 0.0395 | 117.0 | 6201 | 0.1198 | 0.9738 | | 0.0398 | 118.0 | 6254 | 0.1344 | 0.9697 | | 0.0362 | 119.0 | 6307 | 0.1243 | 0.9752 | | 0.035 | 120.0 | 6360 | 0.1363 | 0.9735 | | 0.0389 | 121.0 | 6413 | 0.1271 | 0.9756 | | 0.0305 | 122.0 | 6466 | 0.1277 | 0.9759 | | 0.0366 | 123.0 | 6519 | 0.1276 | 0.9704 | | 0.0329 | 124.0 | 6572 | 0.1192 | 0.9780 | | 0.0304 | 125.0 | 6625 | 0.1325 | 0.9728 | | 0.0289 | 126.0 | 6678 | 0.1334 | 0.9728 | | 0.0362 | 127.0 | 6731 | 0.1272 | 0.9707 | | 0.0326 | 128.0 | 6784 | 0.1250 | 0.9735 | | 0.0357 | 129.0 | 6837 | 0.1255 | 0.9749 | | 0.0264 | 130.0 | 6890 | 0.1326 | 0.9769 | | 0.0324 | 131.0 | 6943 | 0.1359 | 0.9752 | | 0.0258 | 132.0 | 6996 | 0.1229 | 0.9766 | | 0.033 | 133.0 | 7049 | 0.1184 | 0.9759 | | 0.0259 | 134.0 | 7102 | 0.1416 | 0.9718 | | 0.0362 | 135.0 | 7155 | 0.1310 | 0.9745 | | 0.0263 | 136.0 | 7208 | 0.1434 | 0.9728 | | 0.0406 | 137.0 | 7261 | 0.1271 | 0.9745 | | 0.027 | 138.0 | 7314 | 
0.1395 | 0.9728 | | 0.0417 | 139.0 | 7367 | 0.1307 | 0.9735 | | 0.0321 | 140.0 | 7420 | 0.1276 | 0.9742 | | 0.0451 | 141.0 | 7473 | 0.1338 | 0.9759 | | 0.029 | 142.0 | 7526 | 0.1337 | 0.9749 | | 0.0337 | 143.0 | 7579 | 0.1315 | 0.9745 | | 0.0212 | 144.0 | 7632 | 0.1331 | 0.9759 | | 0.0301 | 145.0 | 7685 | 0.1291 | 0.9759 | | 0.0306 | 146.0 | 7738 | 0.1276 | 0.9749 | | 0.0283 | 147.0 | 7791 | 0.1275 | 0.9731 | | 0.0291 | 148.0 | 7844 | 0.1293 | 0.9752 | | 0.0265 | 149.0 | 7897 | 0.1381 | 0.9749 | | 0.0326 | 150.0 | 7950 | 0.1308 | 0.9742 | | 0.0301 | 151.0 | 8003 | 0.1279 | 0.9731 | | 0.021 | 152.0 | 8056 | 0.1312 | 0.9735 | | 0.0186 | 153.0 | 8109 | 0.1364 | 0.9735 | | 0.0322 | 154.0 | 8162 | 0.1367 | 0.9725 | | 0.0229 | 155.0 | 8215 | 0.1347 | 0.9745 | | 0.0249 | 156.0 | 8268 | 0.1360 | 0.9728 | | 0.0312 | 157.0 | 8321 | 0.1325 | 0.9731 | | 0.0295 | 158.0 | 8374 | 0.1315 | 0.9735 | | 0.0234 | 159.0 | 8427 | 0.1308 | 0.9738 | | 0.0239 | 160.0 | 8480 | 0.1319 | 0.9735 | ### Framework versions - Transformers 4.43.2 - Pytorch 2.3.1+cu118 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "downstairs", "jogging", "sitting", "standing", "upstairs", "walking" ]
ombharamadev/autotrain-ijzeq-gcc9o
# Model Trained Using AutoTrain - Problem type: Image Classification ## Validation Metrics loss: 0.6824218034744263 f1: 0.5333333333333333 precision: 0.6666666666666666 recall: 0.4444444444444444 auc: 0.654320987654321 accuracy: 0.6111111111111112
[ "female", "male" ]
ombharamadev/beauty-ornot
# Model Trained Using AutoTrain - Problem type: Image Classification ## Validation Metrics loss: 0.2851656675338745 f1: 0.918918918918919 precision: 0.8947368421052632 recall: 0.9444444444444444 auc: 0.9401709401709402 accuracy: 0.9032258064516129
[ "beauty", "notattractive" ]
ayubkfupm/swin-tiny-patch4-window7-224-finetuned-st-wsdmhar-xyz
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-st-wsdmhar-xyz This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.1062 - Accuracy: 0.9749 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.545 | 1.0 | 53 | 1.3735 | 0.4845 | | 0.8697 | 2.0 | 106 | 0.6895 | 0.7156 | | 0.5783 | 3.0 | 159 | 0.4415 | 0.8165 | | 0.4849 | 4.0 | 212 | 0.3737 | 0.8461 | | 0.4004 | 5.0 | 265 | 0.3442 | 0.8488 | | 0.3553 | 6.0 | 318 | 0.3271 | 0.8757 | | 0.3318 | 7.0 | 371 | 0.2491 | 0.9050 | | 0.3894 | 8.0 | 424 | 0.2636 | 0.9081 | | 0.3201 | 9.0 | 477 | 0.2368 | 0.9070 | | 0.2915 | 10.0 | 530 | 0.2390 | 0.9108 | | 0.2582 | 11.0 | 583 | 0.2044 | 0.9294 | | 0.2696 | 12.0 | 636 | 0.1948 | 0.9360 | | 0.2429 | 13.0 | 689 | 0.2282 | 0.9143 | | 0.257 | 14.0 | 742 | 0.1751 | 0.9339 | | 0.2042 | 15.0 | 795 | 0.1765 | 0.9349 | | 0.1952 | 16.0 | 848 | 0.1878 | 0.9284 | | 0.1949 | 17.0 | 901 | 0.1303 | 0.9494 | | 0.1786 | 18.0 | 954 | 0.1305 | 0.9552 | | 0.1593 | 19.0 | 1007 | 0.1249 | 0.9570 | | 0.1741 | 20.0 | 1060 | 0.1076 | 0.9601 | | 0.1638 | 21.0 | 1113 | 0.1220 | 0.9580 | | 0.1261 | 22.0 | 1166 | 0.1344 | 0.9532 | | 0.1599 | 23.0 | 1219 | 0.1293 | 0.9535 | | 0.1137 | 24.0 | 1272 | 0.1106 | 0.9621 | | 0.1257 | 25.0 | 1325 | 0.1205 | 0.9573 | | 0.1067 | 26.0 | 1378 | 0.1541 | 0.9535 | | 0.1297 | 27.0 | 1431 | 0.1128 | 0.9604 | | 0.1076 | 28.0 | 1484 | 0.1092 | 0.9594 | | 0.0917 | 29.0 | 1537 | 0.1011 | 0.9614 | | 0.0905 | 30.0 | 1590 | 0.1109 | 0.9604 | | 0.0948 | 31.0 | 1643 | 0.1046 | 0.9638 | | 0.0984 | 32.0 | 1696 | 0.1026 | 0.9669 | | 0.0921 | 33.0 | 1749 | 0.1034 | 0.9642 | | 0.0762 | 34.0 | 1802 | 0.0925 | 0.9687 | | 0.0818 | 35.0 | 1855 | 0.0966 | 0.9656 | | 0.0908 | 36.0 | 1908 | 0.0940 | 0.9687 | | 0.0699 | 37.0 | 1961 | 0.0779 | 0.9742 | | 0.0972 | 38.0 | 2014 | 0.1104 | 0.9687 | | 0.0756 | 39.0 | 2067 | 0.0838 | 0.9742 | | 0.0878 | 40.0 | 2120 | 0.1119 | 0.9673 | | 0.0819 | 41.0 | 2173 | 0.1164 | 0.9618 | | 0.0815 | 42.0 | 2226 | 0.1099 | 0.9666 | | 0.0618 | 43.0 | 2279 | 0.1003 | 0.9680 | | 0.0709 | 44.0 | 2332 | 0.0934 | 0.9721 | | 0.0697 | 45.0 | 2385 | 0.0869 | 0.9731 | | 0.0551 | 46.0 | 2438 | 0.1086 | 0.9694 | | 0.049 | 47.0 | 2491 | 0.1036 | 0.9687 | | 0.0646 | 48.0 | 2544 | 0.0854 | 0.9735 | | 0.0704 | 49.0 | 2597 | 0.0959 | 0.9714 | | 0.0578 | 50.0 | 2650 | 0.1034 | 0.9707 | | 0.0579 | 51.0 | 2703 | 0.0965 | 0.9700 | | 0.051 | 52.0 | 2756 | 0.0962 | 0.9721 | | 0.0477 | 53.0 | 2809 | 0.1218 | 0.9690 | | 0.0769 | 54.0 | 2862 | 0.1027 | 0.9714 | | 0.0493 | 55.0 | 2915 | 0.1175 | 0.9725 | | 0.0535 | 56.0 | 2968 | 
0.1140 | 0.9690 | | 0.0359 | 57.0 | 3021 | 0.0990 | 0.9725 | | 0.0388 | 58.0 | 3074 | 0.0965 | 0.9700 | | 0.0455 | 59.0 | 3127 | 0.1119 | 0.9700 | | 0.0584 | 60.0 | 3180 | 0.0989 | 0.9735 | | 0.0555 | 61.0 | 3233 | 0.1130 | 0.9680 | | 0.0567 | 62.0 | 3286 | 0.1045 | 0.9721 | | 0.0543 | 63.0 | 3339 | 0.1168 | 0.9707 | | 0.0562 | 64.0 | 3392 | 0.1196 | 0.9649 | | 0.0472 | 65.0 | 3445 | 0.1034 | 0.9725 | | 0.0387 | 66.0 | 3498 | 0.1125 | 0.9728 | | 0.0485 | 67.0 | 3551 | 0.1057 | 0.9738 | | 0.0395 | 68.0 | 3604 | 0.1252 | 0.9725 | | 0.0266 | 69.0 | 3657 | 0.1023 | 0.9742 | | 0.0409 | 70.0 | 3710 | 0.1095 | 0.9738 | | 0.0349 | 71.0 | 3763 | 0.1101 | 0.9752 | | 0.0205 | 72.0 | 3816 | 0.1127 | 0.9725 | | 0.0336 | 73.0 | 3869 | 0.1131 | 0.9735 | | 0.0305 | 74.0 | 3922 | 0.0987 | 0.9749 | | 0.0298 | 75.0 | 3975 | 0.1051 | 0.9742 | | 0.0304 | 76.0 | 4028 | 0.1049 | 0.9728 | | 0.051 | 77.0 | 4081 | 0.1134 | 0.9711 | | 0.045 | 78.0 | 4134 | 0.1334 | 0.9707 | | 0.0345 | 79.0 | 4187 | 0.1233 | 0.9707 | | 0.0328 | 80.0 | 4240 | 0.1106 | 0.9728 | | 0.0391 | 81.0 | 4293 | 0.1073 | 0.9735 | | 0.0383 | 82.0 | 4346 | 0.1189 | 0.9707 | | 0.0299 | 83.0 | 4399 | 0.1131 | 0.9756 | | 0.0195 | 84.0 | 4452 | 0.1267 | 0.9714 | | 0.0181 | 85.0 | 4505 | 0.1200 | 0.9700 | | 0.0266 | 86.0 | 4558 | 0.1086 | 0.9752 | | 0.0322 | 87.0 | 4611 | 0.1149 | 0.9735 | | 0.0325 | 88.0 | 4664 | 0.1130 | 0.9738 | | 0.0303 | 89.0 | 4717 | 0.1105 | 0.9749 | | 0.0275 | 90.0 | 4770 | 0.1078 | 0.9752 | | 0.0281 | 91.0 | 4823 | 0.1077 | 0.9742 | | 0.0231 | 92.0 | 4876 | 0.1060 | 0.9752 | | 0.022 | 93.0 | 4929 | 0.1077 | 0.9749 | | 0.0219 | 94.0 | 4982 | 0.1080 | 0.9749 | | 0.0184 | 95.0 | 5035 | 0.1061 | 0.9756 | | 0.0198 | 96.0 | 5088 | 0.1047 | 0.9749 | | 0.0355 | 97.0 | 5141 | 0.1084 | 0.9735 | | 0.0309 | 98.0 | 5194 | 0.1088 | 0.9735 | | 0.0324 | 99.0 | 5247 | 0.1066 | 0.9742 | | 0.0216 | 100.0 | 5300 | 0.1062 | 0.9749 | ### Framework versions - Transformers 4.43.2 - Pytorch 2.3.1+cu118 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "downstairs", "jogging", "sitting", "standing", "upstairs", "walking" ]
pjdevelop/dinov2-svm-embeddings-currency_indian
# Dinov2-SVM-Embeddings-Currency-Indian This model classifies images of Indian currency notes by denomination. As the repository name indicates, it pairs DINOv2 image embeddings with an SVM classifier.
[ "1", "10", "100", "2", "20", "200", "2000", "5", "50", "500", "coin", "rupee" ]
n1hal/Food_Model_Example
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Food_Model_Example This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.6000 - Accuracy: 0.881 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 2.6623 | 0.992 | 62 | 2.4890 | 0.826 | | 1.8808 | 2.0 | 125 | 1.7638 | 0.868 | | 1.5842 | 2.976 | 186 | 1.6000 | 0.881 | ### Framework versions - Transformers 4.43.3 - Pytorch 2.4.0+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
ayubkfupm/swin-tiny-patch4-window7-224-finetuned-st-wsdmhar-stacked
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-st-wsdmhar-stacked This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.1344 - Accuracy: 0.9680 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.4973 | 1.0 | 53 | 1.3214 | 0.3743 | | 0.7898 | 2.0 | 106 | 0.7212 | 0.6753 | | 0.5919 | 3.0 | 159 | 0.5983 | 0.7317 | | 0.5388 | 4.0 | 212 | 0.4451 | 0.8209 | | 0.475 | 5.0 | 265 | 0.3542 | 0.8674 | | 0.4174 | 6.0 | 318 | 0.3148 | 0.8771 | | 0.3487 | 7.0 | 371 | 0.3107 | 0.8802 | | 0.3385 | 8.0 | 424 | 0.3179 | 0.8798 | | 0.3324 | 9.0 | 477 | 0.2846 | 0.8998 | | 0.3347 | 10.0 | 530 | 0.2837 | 0.8871 | | 0.2952 | 11.0 | 583 | 0.2412 | 0.9139 | | 0.282 | 12.0 | 636 | 0.3142 | 0.8767 | | 0.2679 | 13.0 | 689 | 0.2496 | 0.9005 | | 0.2816 | 14.0 | 742 | 0.2014 | 0.9239 | | 0.2989 | 15.0 | 795 | 0.2049 | 0.9218 | | 0.2634 | 16.0 | 848 | 0.2066 | 0.9232 | | 0.2692 | 17.0 | 901 | 0.1994 | 0.9284 | | 0.2069 | 18.0 | 954 | 0.1958 | 0.9304 | | 0.2373 | 19.0 | 1007 | 0.2273 | 0.9249 | | 0.1992 | 20.0 | 1060 | 0.2094 | 0.9267 | | 0.1997 | 21.0 | 1113 | 0.1808 | 0.9387 | | 0.1794 | 22.0 | 1166 | 0.1833 | 0.9408 | | 0.1736 | 23.0 | 1219 | 0.2456 | 0.9091 | | 0.2004 | 24.0 | 1272 | 0.1918 | 0.9294 | | 0.2039 | 25.0 | 1325 | 0.1768 | 0.9370 | | 0.1829 | 26.0 | 1378 | 0.2090 | 0.9225 | | 0.1566 | 27.0 | 1431 | 0.1467 | 0.9456 | | 0.1531 | 28.0 | 1484 | 0.1604 | 0.9404 | | 0.1553 | 29.0 | 1537 | 0.1612 | 0.9449 | | 0.1406 | 30.0 | 1590 | 0.1644 | 0.9494 | | 0.1396 | 31.0 | 1643 | 0.1411 | 0.9501 | | 0.1049 | 32.0 | 1696 | 0.1616 | 0.9539 | | 0.1411 | 33.0 | 1749 | 0.1708 | 0.9446 | | 0.1211 | 34.0 | 1802 | 0.1392 | 0.9501 | | 0.1113 | 35.0 | 1855 | 0.1369 | 0.9525 | | 0.1249 | 36.0 | 1908 | 0.1320 | 0.9535 | | 0.1274 | 37.0 | 1961 | 0.1524 | 0.9518 | | 0.1191 | 38.0 | 2014 | 0.1438 | 0.9525 | | 0.0949 | 39.0 | 2067 | 0.1379 | 0.9573 | | 0.0936 | 40.0 | 2120 | 0.1463 | 0.9518 | | 0.1008 | 41.0 | 2173 | 0.1681 | 0.9494 | | 0.0887 | 42.0 | 2226 | 0.1463 | 0.9566 | | 0.1113 | 43.0 | 2279 | 0.1719 | 0.9456 | | 0.1087 | 44.0 | 2332 | 0.1343 | 0.9604 | | 0.097 | 45.0 | 2385 | 0.1431 | 0.9576 | | 0.1061 | 46.0 | 2438 | 0.1495 | 0.9580 | | 0.11 | 47.0 | 2491 | 0.1555 | 0.9549 | | 0.0806 | 48.0 | 2544 | 0.1493 | 0.9549 | | 0.0979 | 49.0 | 2597 | 0.2320 | 0.9373 | | 0.0751 | 50.0 | 2650 | 0.1516 | 0.9573 | | 0.0845 | 51.0 | 2703 | 0.1277 | 0.9614 | | 0.079 | 52.0 | 2756 | 0.1373 | 0.9601 | | 0.0818 | 53.0 | 2809 | 0.1569 | 0.9539 | | 0.0845 | 54.0 | 2862 | 0.1422 | 0.9604 | | 0.0796 | 55.0 | 2915 | 0.1400 | 0.9621 | | 0.0975 | 56.0 | 2968 | 
0.1375 | 0.9573 | | 0.0607 | 57.0 | 3021 | 0.1504 | 0.9580 | | 0.0632 | 58.0 | 3074 | 0.1364 | 0.9607 | | 0.0542 | 59.0 | 3127 | 0.1278 | 0.9669 | | 0.0807 | 60.0 | 3180 | 0.1507 | 0.9518 | | 0.0673 | 61.0 | 3233 | 0.1302 | 0.9645 | | 0.0773 | 62.0 | 3286 | 0.1388 | 0.9638 | | 0.0739 | 63.0 | 3339 | 0.1533 | 0.9573 | | 0.0718 | 64.0 | 3392 | 0.1325 | 0.9594 | | 0.0719 | 65.0 | 3445 | 0.1304 | 0.9625 | | 0.0487 | 66.0 | 3498 | 0.1250 | 0.9645 | | 0.0718 | 67.0 | 3551 | 0.1512 | 0.9573 | | 0.0851 | 68.0 | 3604 | 0.1299 | 0.9607 | | 0.0658 | 69.0 | 3657 | 0.1424 | 0.9625 | | 0.0605 | 70.0 | 3710 | 0.1391 | 0.9625 | | 0.0732 | 71.0 | 3763 | 0.1320 | 0.9642 | | 0.0613 | 72.0 | 3816 | 0.1461 | 0.9607 | | 0.056 | 73.0 | 3869 | 0.1328 | 0.9635 | | 0.0661 | 74.0 | 3922 | 0.1319 | 0.9628 | | 0.0581 | 75.0 | 3975 | 0.1337 | 0.9666 | | 0.0698 | 76.0 | 4028 | 0.1383 | 0.9645 | | 0.0544 | 77.0 | 4081 | 0.1324 | 0.9656 | | 0.059 | 78.0 | 4134 | 0.1380 | 0.9645 | | 0.0554 | 79.0 | 4187 | 0.1435 | 0.9638 | | 0.0497 | 80.0 | 4240 | 0.1310 | 0.9649 | | 0.0463 | 81.0 | 4293 | 0.1384 | 0.9604 | | 0.0622 | 82.0 | 4346 | 0.1363 | 0.9628 | | 0.0534 | 83.0 | 4399 | 0.1428 | 0.9635 | | 0.0434 | 84.0 | 4452 | 0.1374 | 0.9656 | | 0.0591 | 85.0 | 4505 | 0.1332 | 0.9663 | | 0.0488 | 86.0 | 4558 | 0.1271 | 0.9697 | | 0.0418 | 87.0 | 4611 | 0.1286 | 0.9669 | | 0.0505 | 88.0 | 4664 | 0.1372 | 0.9676 | | 0.0486 | 89.0 | 4717 | 0.1372 | 0.9676 | | 0.0561 | 90.0 | 4770 | 0.1348 | 0.9680 | | 0.0498 | 91.0 | 4823 | 0.1340 | 0.9669 | | 0.0432 | 92.0 | 4876 | 0.1351 | 0.9621 | | 0.0322 | 93.0 | 4929 | 0.1380 | 0.9659 | | 0.0389 | 94.0 | 4982 | 0.1370 | 0.9656 | | 0.0408 | 95.0 | 5035 | 0.1343 | 0.9683 | | 0.0367 | 96.0 | 5088 | 0.1347 | 0.9680 | | 0.0337 | 97.0 | 5141 | 0.1366 | 0.9669 | | 0.0338 | 98.0 | 5194 | 0.1355 | 0.9669 | | 0.0284 | 99.0 | 5247 | 0.1340 | 0.9676 | | 0.0501 | 100.0 | 5300 | 0.1344 | 0.9680 | ### Framework versions - Transformers 4.43.2 - Pytorch 2.3.1+cu118 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "downstairs", "jogging", "sitting", "standing", "upstairs", "walking" ]
djbp/swin-base-patch4-window7-224-MM_Classification_base
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-base-patch4-window7-224-MM_Classification_base This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.2998 - Accuracy: 0.8771 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 128 - eval_batch_size: 128 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 512 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.887 | 1.0 | 19 | 0.4012 | 0.8566 | | 0.4302 | 2.0 | 38 | 0.3361 | 0.8656 | | 0.3477 | 3.0 | 57 | 0.3272 | 0.8656 | | 0.3281 | 4.0 | 76 | 0.3129 | 0.8694 | | 0.308 | 5.0 | 95 | 0.2984 | 0.8732 | | 0.2821 | 6.0 | 114 | 0.3010 | 0.8694 | | 0.2763 | 7.0 | 133 | 0.2998 | 0.8771 | | 0.2607 | 8.0 | 152 | 0.2938 | 0.8720 | | 0.2502 | 9.0 | 171 | 0.2990 | 0.8732 | | 0.2337 | 10.0 | 190 | 0.2978 | 0.8758 | ### Framework versions - Transformers 4.43.3 - Pytorch 1.13.1+cu117 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "invalid", "mid market", "non mid market" ]
Maria831Chowdhury/cat_classifier
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. --> # Maria831Chowdhury/cat_classifier This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Train Loss: 1.2533 - Validation Loss: 1.1095 - Train Accuracy: 0.5508 - Epoch: 4 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 3740, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01} - training_precision: float32 ### Training results | Train Loss | Validation Loss | Train Accuracy | Epoch | |:----------:|:---------------:|:--------------:|:-----:| | 1.5813 | 1.5262 | 0.4011 | 0 | | 1.4980 | 1.4068 | 0.5027 | 1 | | 1.4093 | 1.2781 | 0.4973 | 2 | | 1.3448 | 1.2010 | 0.5241 | 3 | | 1.2533 | 1.1095 | 0.5508 | 4 | ### Framework versions - Transformers 4.43.3 - TensorFlow 2.15.0 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "bengal", "domestic_shorthair", "maine_coon", "ragdoll", "siamese" ]
jnmrr/doc-img-classification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # doc-img-classification This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 1.0820 - Accuracy: 0.3484 - Weighted f1: 0.2183 - Micro f1: 0.3484 - Macro f1: 0.2173 - Weighted recall: 0.3484 - Micro recall: 0.3484 - Macro recall: 0.3545 - Weighted precision: 0.4016 - Micro precision: 0.3484 - Macro precision: 0.3764 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.001 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 1 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Weighted f1 | Micro f1 | Macro f1 | Weighted recall | Micro recall | Macro recall | Weighted precision | Micro precision | Macro precision | |:-------------:|:------:|:----:|:---------------:|:--------:|:-----------:|:--------:|:--------:|:---------------:|:------------:|:------------:|:------------------:|:---------------:|:---------------:| | 1.7064 | 0.9855 | 17 | 1.0820 | 0.3484 | 0.2183 | 0.3484 | 0.2173 | 0.3484 | 0.3484 | 0.3545 | 0.4016 | 0.3484 | 0.3764 | ### Framework versions - Transformers 4.43.3 - Pytorch 2.4.0+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "other", "invoice", "receipt" ]
hanad/Firearms_detection
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Firearms_detection This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0580 - Accuracy: 0.9788 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:------:|:----:|:---------------:|:--------:| | 0.1916 | 0.9903 | 51 | 0.1668 | 0.9566 | | 0.0711 | 2.0 | 103 | 0.0857 | 0.9757 | | 0.053 | 2.9903 | 154 | 0.0803 | 0.9757 | | 0.0368 | 4.0 | 206 | 0.0622 | 0.9820 | | 0.0524 | 4.9515 | 255 | 0.0597 | 0.9799 | ### Framework versions - Transformers 4.42.4 - Pytorch 2.3.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "firearm", "normal" ]
dennishauser/mnist_basic
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# my_awesome_food_model

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the mnist dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5370
- Accuracy: 0.8809

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.5443 | 0.9979 | 234 | 0.5314 | 0.8862 |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
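Note that the effective batch size follows from train_batch_size × gradient_accumulation_steps = 64 × 4 = 256, matching the total_train_batch_size above. A hedged sketch of the corresponding `TrainingArguments`; the output directory is a placeholder, not taken from the original run.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above;
# "mnist_basic" as output_dir is a placeholder assumption.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="mnist_basic",
    learning_rate=5e-05,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,  # effective train batch: 64 * 4 = 256
    num_train_epochs=1,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
)
```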
[ "0", "1", "2", "3", "4", "5", "6", "7", "8", "9" ]
mashequr/fine-tuned-mobileclip_s0_timm
# Model Card for Model ID

This model is a fine-tuned version of [apple/mobileclip_s0_timm](https://huggingface.co/apple/mobileclip_s0_timm/tree/main) on an unknown dataset.

## Model Details

The `apple/mobileclip_s0_timm` model was fine-tuned on a domain-specific dataset of dog breed images to improve its classification accuracy. Fine-tuning adjusted the model's pre-trained weights so that it specializes in recognizing dog breeds while retaining its general image-classification capabilities. The fine-tuned model offers improved performance on dog breed classification tasks.

### Training Data

The model was trained and tested using this dataset: [mashequr/images_of_dog_breeds](https://huggingface.co/datasets/mashequr/images_of_dog_breeds)

### Training Hyperparameters

Here is a summary of the training hyperparameters used:

- **Train Batch Size (per device):** 8
- **Evaluation Batch Size (per device):** 8
- **Number of Training Epochs:** 3

### Training Results

| Epoch | Validation Loss |
|:-----:|:---------------:|
| 1     | 4.785654        |
| 2     | 4.009919        |
| 3     | 3.602830        |
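The card does not include the fine-tuning code itself; the loop below is a generic PyTorch sketch under the stated settings (3 epochs, dataloaders built with batch size 8) and does not use MobileCLIP's own loading API. The model, the dataloaders, and the learning rate are all placeholders.

```python
# Generic fine-tuning sketch under the hyperparameters above (3 epochs,
# dataloaders assumed to use batch_size=8). `model`, the loaders, and the
# learning rate are placeholders; this is not the original MobileCLIP code.
import torch
import torch.nn as nn

def finetune(model: nn.Module, train_loader, val_loader, epochs: int = 3):
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model.to(device)
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)  # assumed LR
    loss_fn = nn.CrossEntropyLoss()
    for epoch in range(epochs):
        model.train()
        for images, labels in train_loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss_fn(model(images), labels).backward()
            optimizer.step()
        model.eval()
        total, n = 0.0, 0
        with torch.no_grad():
            for images, labels in val_loader:
                images, labels = images.to(device), labels.to(device)
                total += loss_fn(model(images), labels).item() * labels.size(0)
                n += labels.size(0)
        print(f"epoch {epoch + 1}: validation loss {total / n:.6f}")
```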
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5", "label_6", "label_7", "label_8", "label_9", "label_10", "label_11", "label_12", "label_13", "label_14", "label_15", "label_16", "label_17", "label_18", "label_19", "label_20" ]
gusevvan/test
# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
[ "tench, tinca tinca", "goldfish, carassius auratus", "great white shark, white shark, man-eater, man-eating shark, carcharodon carcharias", "tiger shark, galeocerdo cuvieri", "hammerhead, hammerhead shark", "electric ray, crampfish, numbfish, torpedo", "stingray", "cock", "hen", "ostrich, struthio camelus", "brambling, fringilla montifringilla", "goldfinch, carduelis carduelis", "house finch, linnet, carpodacus mexicanus", "junco, snowbird", "indigo bunting, indigo finch, indigo bird, passerina cyanea", "robin, american robin, turdus migratorius", "bulbul", "jay", "magpie", "chickadee", "water ouzel, dipper", "kite", "bald eagle, american eagle, haliaeetus leucocephalus", "vulture", "great grey owl, great gray owl, strix nebulosa", "european fire salamander, salamandra salamandra", "common newt, triturus vulgaris", "eft", "spotted salamander, ambystoma maculatum", "axolotl, mud puppy, ambystoma mexicanum", "bullfrog, rana catesbeiana", "tree frog, tree-frog", "tailed frog, bell toad, ribbed toad, tailed toad, ascaphus trui", "loggerhead, loggerhead turtle, caretta caretta", "leatherback turtle, leatherback, leathery turtle, dermochelys coriacea", "mud turtle", "terrapin", "box turtle, box tortoise", "banded gecko", "common iguana, iguana, iguana iguana", "american chameleon, anole, anolis carolinensis", "whiptail, whiptail lizard", "agama", "frilled lizard, chlamydosaurus kingi", "alligator lizard", "gila monster, heloderma suspectum", "green lizard, lacerta viridis", "african chameleon, chamaeleo chamaeleon", "komodo dragon, komodo lizard, dragon lizard, giant lizard, varanus komodoensis", "african crocodile, nile crocodile, crocodylus niloticus", "american alligator, alligator mississipiensis", "triceratops", "thunder snake, worm snake, carphophis amoenus", "ringneck snake, ring-necked snake, ring snake", "hognose snake, puff adder, sand viper", "green snake, grass snake", "king snake, kingsnake", "garter snake, grass snake", "water snake", "vine snake", "night snake, hypsiglena torquata", "boa constrictor, constrictor constrictor", "rock python, rock snake, python sebae", "indian cobra, naja naja", "green mamba", "sea snake", "horned viper, cerastes, sand viper, horned asp, cerastes cornutus", "diamondback, diamondback rattlesnake, crotalus adamanteus", "sidewinder, horned rattlesnake, crotalus cerastes", "trilobite", "harvestman, daddy longlegs, phalangium opilio", "scorpion", "black and gold garden spider, argiope aurantia", "barn spider, araneus cavaticus", "garden spider, aranea diademata", "black widow, latrodectus mactans", "tarantula", "wolf spider, hunting spider", "tick", "centipede", "black grouse", "ptarmigan", "ruffed grouse, partridge, bonasa umbellus", "prairie chicken, prairie grouse, prairie fowl", "peacock", "quail", "partridge", "african grey, african gray, psittacus erithacus", "macaw", "sulphur-crested cockatoo, kakatoe galerita, cacatua galerita", "lorikeet", "coucal", "bee eater", "hornbill", "hummingbird", "jacamar", "toucan", "drake", "red-breasted merganser, mergus serrator", "goose", "black swan, cygnus atratus", "tusker", "echidna, spiny anteater, anteater", "platypus, duckbill, duckbilled platypus, duck-billed platypus, ornithorhynchus anatinus", "wallaby, brush kangaroo", "koala, koala bear, kangaroo bear, native bear, phascolarctos cinereus", "wombat", "jellyfish", "sea anemone, anemone", "brain coral", "flatworm, platyhelminth", "nematode, nematode worm, roundworm", "conch", "snail", "slug", "sea slug, nudibranch", "chiton, coat-of-mail shell, sea 
cradle, polyplacophore", "chambered nautilus, pearly nautilus, nautilus", "dungeness crab, cancer magister", "rock crab, cancer irroratus", "fiddler crab", "king crab, alaska crab, alaskan king crab, alaska king crab, paralithodes camtschatica", "american lobster, northern lobster, maine lobster, homarus americanus", "spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish", "crayfish, crawfish, crawdad, crawdaddy", "hermit crab", "isopod", "white stork, ciconia ciconia", "black stork, ciconia nigra", "spoonbill", "flamingo", "little blue heron, egretta caerulea", "american egret, great white heron, egretta albus", "bittern", "crane", "limpkin, aramus pictus", "european gallinule, porphyrio porphyrio", "american coot, marsh hen, mud hen, water hen, fulica americana", "bustard", "ruddy turnstone, arenaria interpres", "red-backed sandpiper, dunlin, erolia alpina", "redshank, tringa totanus", "dowitcher", "oystercatcher, oyster catcher", "pelican", "king penguin, aptenodytes patagonica", "albatross, mollymawk", "grey whale, gray whale, devilfish, eschrichtius gibbosus, eschrichtius robustus", "killer whale, killer, orca, grampus, sea wolf, orcinus orca", "dugong, dugong dugon", "sea lion", "chihuahua", "japanese spaniel", "maltese dog, maltese terrier, maltese", "pekinese, pekingese, peke", "shih-tzu", "blenheim spaniel", "papillon", "toy terrier", "rhodesian ridgeback", "afghan hound, afghan", "basset, basset hound", "beagle", "bloodhound, sleuthhound", "bluetick", "black-and-tan coonhound", "walker hound, walker foxhound", "english foxhound", "redbone", "borzoi, russian wolfhound", "irish wolfhound", "italian greyhound", "whippet", "ibizan hound, ibizan podenco", "norwegian elkhound, elkhound", "otterhound, otter hound", "saluki, gazelle hound", "scottish deerhound, deerhound", "weimaraner", "staffordshire bullterrier, staffordshire bull terrier", "american staffordshire terrier, staffordshire terrier, american pit bull terrier, pit bull terrier", "bedlington terrier", "border terrier", "kerry blue terrier", "irish terrier", "norfolk terrier", "norwich terrier", "yorkshire terrier", "wire-haired fox terrier", "lakeland terrier", "sealyham terrier, sealyham", "airedale, airedale terrier", "cairn, cairn terrier", "australian terrier", "dandie dinmont, dandie dinmont terrier", "boston bull, boston terrier", "miniature schnauzer", "giant schnauzer", "standard schnauzer", "scotch terrier, scottish terrier, scottie", "tibetan terrier, chrysanthemum dog", "silky terrier, sydney silky", "soft-coated wheaten terrier", "west highland white terrier", "lhasa, lhasa apso", "flat-coated retriever", "curly-coated retriever", "golden retriever", "labrador retriever", "chesapeake bay retriever", "german short-haired pointer", "vizsla, hungarian pointer", "english setter", "irish setter, red setter", "gordon setter", "brittany spaniel", "clumber, clumber spaniel", "english springer, english springer spaniel", "welsh springer spaniel", "cocker spaniel, english cocker spaniel, cocker", "sussex spaniel", "irish water spaniel", "kuvasz", "schipperke", "groenendael", "malinois", "briard", "kelpie", "komondor", "old english sheepdog, bobtail", "shetland sheepdog, shetland sheep dog, shetland", "collie", "border collie", "bouvier des flandres, bouviers des flandres", "rottweiler", "german shepherd, german shepherd dog, german police dog, alsatian", "doberman, doberman pinscher", "miniature pinscher", "greater swiss mountain dog", "bernese mountain dog", "appenzeller", "entlebucher", "boxer", "bull 
mastiff", "tibetan mastiff", "french bulldog", "great dane", "saint bernard, st bernard", "eskimo dog, husky", "malamute, malemute, alaskan malamute", "siberian husky", "dalmatian, coach dog, carriage dog", "affenpinscher, monkey pinscher, monkey dog", "basenji", "pug, pug-dog", "leonberg", "newfoundland, newfoundland dog", "great pyrenees", "samoyed, samoyede", "pomeranian", "chow, chow chow", "keeshond", "brabancon griffon", "pembroke, pembroke welsh corgi", "cardigan, cardigan welsh corgi", "toy poodle", "miniature poodle", "standard poodle", "mexican hairless", "timber wolf, grey wolf, gray wolf, canis lupus", "white wolf, arctic wolf, canis lupus tundrarum", "red wolf, maned wolf, canis rufus, canis niger", "coyote, prairie wolf, brush wolf, canis latrans", "dingo, warrigal, warragal, canis dingo", "dhole, cuon alpinus", "african hunting dog, hyena dog, cape hunting dog, lycaon pictus", "hyena, hyaena", "red fox, vulpes vulpes", "kit fox, vulpes macrotis", "arctic fox, white fox, alopex lagopus", "grey fox, gray fox, urocyon cinereoargenteus", "tabby, tabby cat", "tiger cat", "persian cat", "siamese cat, siamese", "egyptian cat", "cougar, puma, catamount, mountain lion, painter, panther, felis concolor", "lynx, catamount", "leopard, panthera pardus", "snow leopard, ounce, panthera uncia", "jaguar, panther, panthera onca, felis onca", "lion, king of beasts, panthera leo", "tiger, panthera tigris", "cheetah, chetah, acinonyx jubatus", "brown bear, bruin, ursus arctos", "american black bear, black bear, ursus americanus, euarctos americanus", "ice bear, polar bear, ursus maritimus, thalarctos maritimus", "sloth bear, melursus ursinus, ursus ursinus", "mongoose", "meerkat, mierkat", "tiger beetle", "ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle", "ground beetle, carabid beetle", "long-horned beetle, longicorn, longicorn beetle", "leaf beetle, chrysomelid", "dung beetle", "rhinoceros beetle", "weevil", "fly", "bee", "ant, emmet, pismire", "grasshopper, hopper", "cricket", "walking stick, walkingstick, stick insect", "cockroach, roach", "mantis, mantid", "cicada, cicala", "leafhopper", "lacewing, lacewing fly", "dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk", "damselfly", "admiral", "ringlet, ringlet butterfly", "monarch, monarch butterfly, milkweed butterfly, danaus plexippus", "cabbage butterfly", "sulphur butterfly, sulfur butterfly", "lycaenid, lycaenid butterfly", "starfish, sea star", "sea urchin", "sea cucumber, holothurian", "wood rabbit, cottontail, cottontail rabbit", "hare", "angora, angora rabbit", "hamster", "porcupine, hedgehog", "fox squirrel, eastern fox squirrel, sciurus niger", "marmot", "beaver", "guinea pig, cavia cobaya", "sorrel", "zebra", "hog, pig, grunter, squealer, sus scrofa", "wild boar, boar, sus scrofa", "warthog", "hippopotamus, hippo, river horse, hippopotamus amphibius", "ox", "water buffalo, water ox, asiatic buffalo, bubalus bubalis", "bison", "ram, tup", "bighorn, bighorn sheep, cimarron, rocky mountain bighorn, rocky mountain sheep, ovis canadensis", "ibex, capra ibex", "hartebeest", "impala, aepyceros melampus", "gazelle", "arabian camel, dromedary, camelus dromedarius", "llama", "weasel", "mink", "polecat, fitch, foulmart, foumart, mustela putorius", "black-footed ferret, ferret, mustela nigripes", "otter", "skunk, polecat, wood pussy", "badger", "armadillo", "three-toed sloth, ai, bradypus tridactylus", "orangutan, orang, orangutang, pongo pygmaeus", "gorilla, 
gorilla gorilla", "chimpanzee, chimp, pan troglodytes", "gibbon, hylobates lar", "siamang, hylobates syndactylus, symphalangus syndactylus", "guenon, guenon monkey", "patas, hussar monkey, erythrocebus patas", "baboon", "macaque", "langur", "colobus, colobus monkey", "proboscis monkey, nasalis larvatus", "marmoset", "capuchin, ringtail, cebus capucinus", "howler monkey, howler", "titi, titi monkey", "spider monkey, ateles geoffroyi", "squirrel monkey, saimiri sciureus", "madagascar cat, ring-tailed lemur, lemur catta", "indri, indris, indri indri, indri brevicaudatus", "indian elephant, elephas maximus", "african elephant, loxodonta africana", "lesser panda, red panda, panda, bear cat, cat bear, ailurus fulgens", "giant panda, panda, panda bear, coon bear, ailuropoda melanoleuca", "barracouta, snoek", "eel", "coho, cohoe, coho salmon, blue jack, silver salmon, oncorhynchus kisutch", "rock beauty, holocanthus tricolor", "anemone fish", "sturgeon", "gar, garfish, garpike, billfish, lepisosteus osseus", "lionfish", "puffer, pufferfish, blowfish, globefish", "abacus", "abaya", "academic gown, academic robe, judge's robe", "accordion, piano accordion, squeeze box", "acoustic guitar", "aircraft carrier, carrier, flattop, attack aircraft carrier", "airliner", "airship, dirigible", "altar", "ambulance", "amphibian, amphibious vehicle", "analog clock", "apiary, bee house", "apron", "ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin", "assault rifle, assault gun", "backpack, back pack, knapsack, packsack, rucksack, haversack", "bakery, bakeshop, bakehouse", "balance beam, beam", "balloon", "ballpoint, ballpoint pen, ballpen, biro", "band aid", "banjo", "bannister, banister, balustrade, balusters, handrail", "barbell", "barber chair", "barbershop", "barn", "barometer", "barrel, cask", "barrow, garden cart, lawn cart, wheelbarrow", "baseball", "basketball", "bassinet", "bassoon", "bathing cap, swimming cap", "bath towel", "bathtub, bathing tub, bath, tub", "beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon", "beacon, lighthouse, beacon light, pharos", "beaker", "bearskin, busby, shako", "beer bottle", "beer glass", "bell cote, bell cot", "bib", "bicycle-built-for-two, tandem bicycle, tandem", "bikini, two-piece", "binder, ring-binder", "binoculars, field glasses, opera glasses", "birdhouse", "boathouse", "bobsled, bobsleigh, bob", "bolo tie, bolo, bola tie, bola", "bonnet, poke bonnet", "bookcase", "bookshop, bookstore, bookstall", "bottlecap", "bow", "bow tie, bow-tie, bowtie", "brass, memorial tablet, plaque", "brassiere, bra, bandeau", "breakwater, groin, groyne, mole, bulwark, seawall, jetty", "breastplate, aegis, egis", "broom", "bucket, pail", "buckle", "bulletproof vest", "bullet train, bullet", "butcher shop, meat market", "cab, hack, taxi, taxicab", "caldron, cauldron", "candle, taper, wax light", "cannon", "canoe", "can opener, tin opener", "cardigan", "car mirror", "carousel, carrousel, merry-go-round, roundabout, whirligig", "carpenter's kit, tool kit", "carton", "car wheel", "cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, atm", "cassette", "cassette player", "castle", "catamaran", "cd player", "cello, violoncello", "cellular telephone, cellular phone, cellphone, cell, mobile phone", "chain", "chainlink fence", "chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour", "chain saw, chainsaw", "chest", "chiffonier, 
commode", "chime, bell, gong", "china cabinet, china closet", "christmas stocking", "church, church building", "cinema, movie theater, movie theatre, movie house, picture palace", "cleaver, meat cleaver, chopper", "cliff dwelling", "cloak", "clog, geta, patten, sabot", "cocktail shaker", "coffee mug", "coffeepot", "coil, spiral, volute, whorl, helix", "combination lock", "computer keyboard, keypad", "confectionery, confectionary, candy store", "container ship, containership, container vessel", "convertible", "corkscrew, bottle screw", "cornet, horn, trumpet, trump", "cowboy boot", "cowboy hat, ten-gallon hat", "cradle", "crane", "crash helmet", "crate", "crib, cot", "crock pot", "croquet ball", "crutch", "cuirass", "dam, dike, dyke", "desk", "desktop computer", "dial telephone, dial phone", "diaper, nappy, napkin", "digital clock", "digital watch", "dining table, board", "dishrag, dishcloth", "dishwasher, dish washer, dishwashing machine", "disk brake, disc brake", "dock, dockage, docking facility", "dogsled, dog sled, dog sleigh", "dome", "doormat, welcome mat", "drilling platform, offshore rig", "drum, membranophone, tympan", "drumstick", "dumbbell", "dutch oven", "electric fan, blower", "electric guitar", "electric locomotive", "entertainment center", "envelope", "espresso maker", "face powder", "feather boa, boa", "file, file cabinet, filing cabinet", "fireboat", "fire engine, fire truck", "fire screen, fireguard", "flagpole, flagstaff", "flute, transverse flute", "folding chair", "football helmet", "forklift", "fountain", "fountain pen", "four-poster", "freight car", "french horn, horn", "frying pan, frypan, skillet", "fur coat", "garbage truck, dustcart", "gasmask, respirator, gas helmet", "gas pump, gasoline pump, petrol pump, island dispenser", "goblet", "go-kart", "golf ball", "golfcart, golf cart", "gondola", "gong, tam-tam", "gown", "grand piano, grand", "greenhouse, nursery, glasshouse", "grille, radiator grille", "grocery store, grocery, food market, market", "guillotine", "hair slide", "hair spray", "half track", "hammer", "hamper", "hand blower, blow dryer, blow drier, hair dryer, hair drier", "hand-held computer, hand-held microcomputer", "handkerchief, hankie, hanky, hankey", "hard disc, hard disk, fixed disk", "harmonica, mouth organ, harp, mouth harp", "harp", "harvester, reaper", "hatchet", "holster", "home theater, home theatre", "honeycomb", "hook, claw", "hoopskirt, crinoline", "horizontal bar, high bar", "horse cart, horse-cart", "hourglass", "ipod", "iron, smoothing iron", "jack-o'-lantern", "jean, blue jean, denim", "jeep, landrover", "jersey, t-shirt, tee shirt", "jigsaw puzzle", "jinrikisha, ricksha, rickshaw", "joystick", "kimono", "knee pad", "knot", "lab coat, laboratory coat", "ladle", "lampshade, lamp shade", "laptop, laptop computer", "lawn mower, mower", "lens cap, lens cover", "letter opener, paper knife, paperknife", "library", "lifeboat", "lighter, light, igniter, ignitor", "limousine, limo", "liner, ocean liner", "lipstick, lip rouge", "loafer", "lotion", "loudspeaker, speaker, speaker unit, loudspeaker system, speaker system", "loupe, jeweler's loupe", "lumbermill, sawmill", "magnetic compass", "mailbag, postbag", "mailbox, letter box", "maillot", "maillot, tank suit", "manhole cover", "maraca", "marimba, xylophone", "mask", "matchstick", "maypole", "maze, labyrinth", "measuring cup", "medicine chest, medicine cabinet", "megalith, megalithic structure", "microphone, mike", "microwave, microwave oven", "military uniform", "milk can", "minibus", 
"miniskirt, mini", "minivan", "missile", "mitten", "mixing bowl", "mobile home, manufactured home", "model t", "modem", "monastery", "monitor", "moped", "mortar", "mortarboard", "mosque", "mosquito net", "motor scooter, scooter", "mountain bike, all-terrain bike, off-roader", "mountain tent", "mouse, computer mouse", "mousetrap", "moving van", "muzzle", "nail", "neck brace", "necklace", "nipple", "notebook, notebook computer", "obelisk", "oboe, hautboy, hautbois", "ocarina, sweet potato", "odometer, hodometer, mileometer, milometer", "oil filter", "organ, pipe organ", "oscilloscope, scope, cathode-ray oscilloscope, cro", "overskirt", "oxcart", "oxygen mask", "packet", "paddle, boat paddle", "paddlewheel, paddle wheel", "padlock", "paintbrush", "pajama, pyjama, pj's, jammies", "palace", "panpipe, pandean pipe, syrinx", "paper towel", "parachute, chute", "parallel bars, bars", "park bench", "parking meter", "passenger car, coach, carriage", "patio, terrace", "pay-phone, pay-station", "pedestal, plinth, footstall", "pencil box, pencil case", "pencil sharpener", "perfume, essence", "petri dish", "photocopier", "pick, plectrum, plectron", "pickelhaube", "picket fence, paling", "pickup, pickup truck", "pier", "piggy bank, penny bank", "pill bottle", "pillow", "ping-pong ball", "pinwheel", "pirate, pirate ship", "pitcher, ewer", "plane, carpenter's plane, woodworking plane", "planetarium", "plastic bag", "plate rack", "plow, plough", "plunger, plumber's helper", "polaroid camera, polaroid land camera", "pole", "police van, police wagon, paddy wagon, patrol wagon, wagon, black maria", "poncho", "pool table, billiard table, snooker table", "pop bottle, soda bottle", "pot, flowerpot", "potter's wheel", "power drill", "prayer rug, prayer mat", "printer", "prison, prison house", "projectile, missile", "projector", "puck, hockey puck", "punching bag, punch bag, punching ball, punchball", "purse", "quill, quill pen", "quilt, comforter, comfort, puff", "racer, race car, racing car", "racket, racquet", "radiator", "radio, wireless", "radio telescope, radio reflector", "rain barrel", "recreational vehicle, rv, r.v.", "reel", "reflex camera", "refrigerator, icebox", "remote control, remote", "restaurant, eating house, eating place, eatery", "revolver, six-gun, six-shooter", "rifle", "rocking chair, rocker", "rotisserie", "rubber eraser, rubber, pencil eraser", "rugby ball", "rule, ruler", "running shoe", "safe", "safety pin", "saltshaker, salt shaker", "sandal", "sarong", "sax, saxophone", "scabbard", "scale, weighing machine", "school bus", "schooner", "scoreboard", "screen, crt screen", "screw", "screwdriver", "seat belt, seatbelt", "sewing machine", "shield, buckler", "shoe shop, shoe-shop, shoe store", "shoji", "shopping basket", "shopping cart", "shovel", "shower cap", "shower curtain", "ski", "ski mask", "sleeping bag", "slide rule, slipstick", "sliding door", "slot, one-armed bandit", "snorkel", "snowmobile", "snowplow, snowplough", "soap dispenser", "soccer ball", "sock", "solar dish, solar collector, solar furnace", "sombrero", "soup bowl", "space bar", "space heater", "space shuttle", "spatula", "speedboat", "spider web, spider's web", "spindle", "sports car, sport car", "spotlight, spot", "stage", "steam locomotive", "steel arch bridge", "steel drum", "stethoscope", "stole", "stone wall", "stopwatch, stop watch", "stove", "strainer", "streetcar, tram, tramcar, trolley, trolley car", "stretcher", "studio couch, day bed", "stupa, tope", "submarine, pigboat, sub, u-boat", "suit, suit of clothes", 
"sundial", "sunglass", "sunglasses, dark glasses, shades", "sunscreen, sunblock, sun blocker", "suspension bridge", "swab, swob, mop", "sweatshirt", "swimming trunks, bathing trunks", "swing", "switch, electric switch, electrical switch", "syringe", "table lamp", "tank, army tank, armored combat vehicle, armoured combat vehicle", "tape player", "teapot", "teddy, teddy bear", "television, television system", "tennis ball", "thatch, thatched roof", "theater curtain, theatre curtain", "thimble", "thresher, thrasher, threshing machine", "throne", "tile roof", "toaster", "tobacco shop, tobacconist shop, tobacconist", "toilet seat", "torch", "totem pole", "tow truck, tow car, wrecker", "toyshop", "tractor", "trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi", "tray", "trench coat", "tricycle, trike, velocipede", "trimaran", "tripod", "triumphal arch", "trolleybus, trolley coach, trackless trolley", "trombone", "tub, vat", "turnstile", "typewriter keyboard", "umbrella", "unicycle, monocycle", "upright, upright piano", "vacuum, vacuum cleaner", "vase", "vault", "velvet", "vending machine", "vestment", "viaduct", "violin, fiddle", "volleyball", "waffle iron", "wall clock", "wallet, billfold, notecase, pocketbook", "wardrobe, closet, press", "warplane, military plane", "washbasin, handbasin, washbowl, lavabo, wash-hand basin", "washer, automatic washer, washing machine", "water bottle", "water jug", "water tower", "whiskey jug", "whistle", "wig", "window screen", "window shade", "windsor tie", "wine bottle", "wing", "wok", "wooden spoon", "wool, woolen, woollen", "worm fence, snake fence, snake-rail fence, virginia fence", "wreck", "yawl", "yurt", "web site, website, internet site, site", "comic book", "crossword puzzle, crossword", "street sign", "traffic light, traffic signal, stoplight", "book jacket, dust cover, dust jacket, dust wrapper", "menu", "plate", "guacamole", "consomme", "hot pot, hotpot", "trifle", "ice cream, icecream", "ice lolly, lolly, lollipop, popsicle", "french loaf", "bagel, beigel", "pretzel", "cheeseburger", "hotdog, hot dog, red hot", "mashed potato", "head cabbage", "broccoli", "cauliflower", "zucchini, courgette", "spaghetti squash", "acorn squash", "butternut squash", "cucumber, cuke", "artichoke, globe artichoke", "bell pepper", "cardoon", "mushroom", "granny smith", "strawberry", "orange", "lemon", "fig", "pineapple, ananas", "banana", "jackfruit, jak, jack", "custard apple", "pomegranate", "hay", "carbonara", "chocolate sauce, chocolate syrup", "dough", "meat loaf, meatloaf", "pizza, pizza pie", "potpie", "burrito", "red wine", "espresso", "cup", "eggnog", "alp", "bubble", "cliff, drop, drop-off", "coral reef", "geyser", "lakeside, lakeshore", "promontory, headland, head, foreland", "sandbar, sand bar", "seashore, coast, seacoast, sea-coast", "valley, vale", "volcano", "ballplayer, baseball player", "groom, bridegroom", "scuba diver", "rapeseed", "daisy", "yellow lady's slipper, yellow lady-slipper, cypripedium calceolus, cypripedium parviflorum", "corn", "acorn", "hip, rose hip, rosehip", "buckeye, horse chestnut, conker", "coral fungus", "agaric", "gyromitra", "stinkhorn, carrion fungus", "earthstar", "hen-of-the-woods, hen of the woods, polyporus frondosus, grifola frondosa", "bolete", "ear, spike, capitulum", "toilet tissue, toilet paper, bathroom tissue" ]
JonPGallegos/my_awesome_food_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# my_awesome_food_model

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2459
- Accuracy: 0.9186

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3913 | 1.0 | 298 | 0.3844 | 0.8782 |
| 0.2563 | 2.0 | 597 | 0.3126 | 0.9079 |
| 0.2216 | 2.99 | 894 | 0.2459 | 0.9186 |

### Framework versions

- Transformers 4.39.0
- Pytorch 2.4.0+cpu
- Datasets 2.20.0
- Tokenizers 0.15.2
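With lr_scheduler_warmup_ratio 0.1 and 894 total optimizer steps (the last row of the table above), warmup covers roughly the first 89 steps. A minimal sketch of the equivalent scheduler wiring; the tiny linear model is only a stand-in.

```python
# Sketch: linear LR schedule with 10% warmup, matching the settings above.
# 894 total steps (from the table) -> int(0.1 * 894) = 89 warmup steps.
import torch
from transformers import get_linear_schedule_with_warmup

model = torch.nn.Linear(4, 4)  # stand-in model
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-05)

total_steps = 894
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=int(0.1 * total_steps),
    num_training_steps=total_steps,
)
# In a training loop, call scheduler.step() after each optimizer.step().
```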
[ "covid", "lung_opacity", "normal", "viral pneumonia" ]
jayanthspratap/vit-base-patch16-224
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224

This model is a fine-tuned version of [microsoft/swinv2-base-patch4-window8-256](https://huggingface.co/microsoft/swinv2-base-patch4-window8-256) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5932
- Accuracy: 0.7209

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 0.96 | 6 | 0.7424 | 0.3488 |
| 0.7374 | 1.92 | 12 | 0.5932 | 0.7209 |
| 0.7374 | 2.88 | 18 | 0.5843 | 0.7209 |
| 0.5783 | 4.0 | 25 | 0.5996 | 0.7209 |
| 0.5358 | 4.96 | 31 | 0.6147 | 0.7209 |
| 0.5358 | 5.92 | 37 | 0.6159 | 0.7209 |
| 0.5745 | 6.88 | 43 | 0.6091 | 0.7209 |
| 0.5325 | 8.0 | 50 | 0.6067 | 0.7209 |
| 0.5325 | 8.96 | 56 | 0.6047 | 0.7209 |
| 0.524 | 9.6 | 60 | 0.6046 | 0.7209 |

### Framework versions

- Transformers 4.42.3
- Pytorch 2.3.1+cu118
- Datasets 2.20.0
- Tokenizers 0.19.1
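Because the backbone is SwinV2 at 256×256 rather than the ViT the repository name suggests, inference should go through the checkpoint's own image processor; a minimal sketch, assuming the checkpoint is publicly loadable and with a placeholder image path:

```python
# Minimal inference sketch; "example.png" is a placeholder path.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "jayanthspratap/vit-base-patch16-224"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

inputs = processor(images=Image.open("example.png"), return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])  # "n" or "y"
```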
[ "n", "y" ]