| Column | Type | Length range |
|:--|:--|:--|
| model_id | string | 7 to 105 |
| model_card | string | 1 to 130k |
| model_labels | list | 2 to 80k |
chaphoto/autotrain-test3-3741499453
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 3741499453 - CO2 Emissions (in grams): 0.0169 ## Validation Metrics - Loss: 0.000 - Accuracy: 1.000 - Macro F1: 1.000 - Micro F1: 1.000 - Weighted F1: 1.000 - Macro Precision: 1.000 - Micro Precision: 1.000 - Weighted Precision: 1.000 - Macro Recall: 1.000 - Micro Recall: 1.000 - Weighted Recall: 1.000
[ "cross", "hook", "jabs", "uppercut" ]
chaphoto/autotrain-test3-3741499455
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 3741499455 - CO2 Emissions (in grams): 6.5539 ## Validation Metrics - Loss: 0.355 - Accuracy: 0.982 - Macro F1: 0.983 - Micro F1: 0.982 - Weighted F1: 0.982 - Macro Precision: 0.983 - Micro Precision: 0.982 - Weighted Precision: 0.983 - Macro Recall: 0.983 - Micro Recall: 0.982 - Weighted Recall: 0.982
[ "cross", "hook", "jabs", "uppercut" ]
chaphoto/autotrain-test3-3741499457
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 3741499457 - CO2 Emissions (in grams): 4.4754 ## Validation Metrics - Loss: 0.008 - Accuracy: 0.998 - Macro F1: 0.998 - Micro F1: 0.998 - Weighted F1: 0.998 - Macro Precision: 0.998 - Micro Precision: 0.998 - Weighted Precision: 0.998 - Macro Recall: 0.998 - Micro Recall: 0.998 - Weighted Recall: 0.998
[ "cross", "hook", "jabs", "uppercut" ]
chaphoto/autotrain-test3-3741499456
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 3741499456 - CO2 Emissions (in grams): 6.0558 ## Validation Metrics - Loss: 0.000 - Accuracy: 1.000 - Macro F1: 1.000 - Micro F1: 1.000 - Weighted F1: 1.000 - Macro Precision: 1.000 - Micro Precision: 1.000 - Weighted Precision: 1.000 - Macro Recall: 1.000 - Micro Recall: 1.000 - Weighted Recall: 1.000
[ "cross", "hook", "jabs", "uppercut" ]
hazardous/swin-tiny-patch4-window7-224_har-finetuned
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224_har-finetuned This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the image_folder dataset. It achieves the following results on the evaluation set: - Loss: 0.5817 - Accuracy: 0.8307 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 6 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.9655 | 0.99 | 83 | 0.8374 | 0.7323 | | 0.648 | 1.99 | 166 | 0.6449 | 0.7931 | | 0.5058 | 2.99 | 249 | 0.6082 | 0.8074 | | 0.3272 | 3.99 | 332 | 0.5766 | 0.8254 | | 0.2569 | 4.99 | 415 | 0.5827 | 0.8217 | | 0.2226 | 5.99 | 498 | 0.5817 | 0.8307 | ### Framework versions - Transformers 4.20.1 - Pytorch 1.12.0 - Datasets 2.1.0 - Tokenizers 0.12.1
[ "calling", "clapping", "cycling", "dancing", "drinking", "eating", "fighting", "hugging", "laughing", "listening_to_music", "running", "sitting", "sleeping", "texting", "using_laptop" ]
opiljain/autotrain-cardamage-3762299975
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 3762299975 - CO2 Emissions (in grams): 0.6713 ## Validation Metrics - Loss: 0.711 - Accuracy: 0.733 - Macro F1: 0.531 - Micro F1: 0.733 - Weighted F1: 0.649 - Macro Precision: 0.492 - Micro Precision: 0.733 - Weighted Precision: 0.588 - Macro Recall: 0.583 - Micro Recall: 0.733 - Weighted Recall: 0.733
[ "door", "fenderbender", "hood" ]
AthiraVr/dit-base-finetuned-rvlcdip-finetuned-data200
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dit-base-finetuned-rvlcdip-finetuned-data200 This model is a fine-tuned version of [microsoft/dit-base-finetuned-rvlcdip](https://huggingface.co/microsoft/dit-base-finetuned-rvlcdip) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 3.0080 - Accuracy: 0.5699 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 8 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 200 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 2.1142 | 1.0 | 46 | 2.0131 | 0.3441 | | 1.9953 | 2.0 | 92 | 1.9577 | 0.4086 | | 1.9558 | 3.0 | 138 | 1.9231 | 0.4301 | | 1.9251 | 4.0 | 184 | 1.8015 | 0.4946 | | 1.6485 | 5.0 | 230 | 1.7045 | 0.5269 | | 1.5973 | 6.0 | 276 | 1.5806 | 0.5054 | | 1.4755 | 7.0 | 322 | 1.4849 | 0.5054 | | 1.4537 | 8.0 | 368 | 1.4356 | 0.5161 | | 1.416 | 9.0 | 414 | 1.4512 | 0.5269 | | 1.3645 | 10.0 | 460 | 1.3857 | 0.5591 | | 1.3017 | 11.0 | 506 | 1.3108 | 0.5484 | | 1.2794 | 12.0 | 552 | 1.3027 | 0.5376 | | 1.1553 | 13.0 | 598 | 1.2883 | 0.5484 | | 1.1526 | 14.0 | 644 | 1.3554 | 0.5054 | | 1.1116 | 15.0 | 690 | 1.3235 | 0.5914 | | 1.1925 | 16.0 | 736 | 1.2401 | 0.5806 | | 1.1297 | 17.0 | 782 | 1.3425 | 0.5914 | | 0.9717 | 18.0 | 828 | 1.3538 | 0.5484 | | 0.8404 | 19.0 | 874 | 1.2648 | 0.5699 | | 0.7008 | 20.0 | 920 | 1.4971 | 0.5376 | | 1.1454 | 21.0 | 966 | 1.4137 | 0.4839 | | 0.6849 | 22.0 | 1012 | 1.2801 | 0.5591 | | 0.8566 | 23.0 | 1058 | 1.2380 | 0.5699 | | 0.8956 | 24.0 | 1104 | 1.2903 | 0.6129 | | 0.8004 | 25.0 | 1150 | 1.4372 | 0.5591 | | 0.818 | 26.0 | 1196 | 1.1640 | 0.6344 | | 0.6387 | 27.0 | 1242 | 1.3120 | 0.6452 | | 0.7282 | 28.0 | 1288 | 1.4678 | 0.5161 | | 0.7426 | 29.0 | 1334 | 1.4815 | 0.5269 | | 0.735 | 30.0 | 1380 | 1.2714 | 0.6129 | | 0.6769 | 31.0 | 1426 | 1.2262 | 0.5699 | | 0.5562 | 32.0 | 1472 | 1.3348 | 0.6344 | | 0.6671 | 33.0 | 1518 | 1.4159 | 0.6129 | | 0.3708 | 34.0 | 1564 | 1.6416 | 0.5484 | | 0.3967 | 35.0 | 1610 | 1.3298 | 0.5699 | | 0.4692 | 36.0 | 1656 | 1.3559 | 0.5699 | | 0.632 | 37.0 | 1702 | 1.3349 | 0.5699 | | 0.3719 | 38.0 | 1748 | 1.4697 | 0.5914 | | 0.4238 | 39.0 | 1794 | 1.5207 | 0.6022 | | 0.3608 | 40.0 | 1840 | 1.5557 | 0.5591 | | 0.6252 | 41.0 | 1886 | 1.6247 | 0.5269 | | 0.4183 | 42.0 | 1932 | 1.5885 | 0.5914 | | 0.3922 | 43.0 | 1978 | 1.6593 | 0.5699 | | 0.5715 | 44.0 | 2024 | 1.5270 | 0.5699 | | 0.3656 | 45.0 | 2070 | 1.8899 | 0.5054 | | 0.3656 | 46.0 | 2116 | 2.0936 | 0.4624 | | 0.4003 | 47.0 | 2162 | 1.5610 | 0.5054 | | 0.446 | 48.0 | 2208 | 1.7388 | 0.5376 | | 0.5219 | 49.0 | 2254 | 1.4976 | 0.6129 | | 0.3488 | 50.0 | 2300 | 1.5744 | 0.5914 | | 0.323 | 51.0 | 2346 | 1.6312 | 0.6022 | | 0.3713 | 52.0 | 2392 | 1.6975 | 0.5591 | | 0.2981 | 53.0 | 2438 | 1.6229 | 0.5699 | | 0.3422 | 54.0 | 2484 | 2.0909 | 0.4624 | | 0.2538 | 55.0 | 2530 | 2.0966 | 0.5161 | | 0.3868 | 56.0 | 2576 | 1.5614 | 0.6344 | | 0.4662 
| 57.0 | 2622 | 1.8929 | 0.5269 | | 0.4277 | 58.0 | 2668 | 1.9573 | 0.5376 | | 0.5301 | 59.0 | 2714 | 1.7999 | 0.5699 | | 0.3867 | 60.0 | 2760 | 2.3481 | 0.4624 | | 0.2334 | 61.0 | 2806 | 1.9924 | 0.5376 | | 0.2921 | 62.0 | 2852 | 2.0454 | 0.5591 | | 0.4386 | 63.0 | 2898 | 1.7798 | 0.5376 | | 0.3299 | 64.0 | 2944 | 1.9370 | 0.5914 | | 0.5982 | 65.0 | 2990 | 2.0527 | 0.5591 | | 0.4433 | 66.0 | 3036 | 1.6222 | 0.6237 | | 0.3717 | 67.0 | 3082 | 1.7977 | 0.5914 | | 0.3642 | 68.0 | 3128 | 1.6988 | 0.5914 | | 0.4541 | 69.0 | 3174 | 1.7567 | 0.6022 | | 0.3464 | 70.0 | 3220 | 1.9029 | 0.5699 | | 0.2764 | 71.0 | 3266 | 1.9611 | 0.6022 | | 0.2138 | 72.0 | 3312 | 1.9333 | 0.5591 | | 0.3928 | 73.0 | 3358 | 1.7701 | 0.5806 | | 0.1811 | 74.0 | 3404 | 1.8330 | 0.5806 | | 0.2076 | 75.0 | 3450 | 1.6676 | 0.6559 | | 0.3326 | 76.0 | 3496 | 2.0036 | 0.6022 | | 0.1343 | 77.0 | 3542 | 1.6937 | 0.6344 | | 0.3031 | 78.0 | 3588 | 1.9223 | 0.6237 | | 0.2743 | 79.0 | 3634 | 2.1681 | 0.5699 | | 0.3392 | 80.0 | 3680 | 2.0505 | 0.6129 | | 0.1346 | 81.0 | 3726 | 2.0190 | 0.5699 | | 0.0652 | 82.0 | 3772 | 2.2910 | 0.5699 | | 0.4219 | 83.0 | 3818 | 1.8858 | 0.5914 | | 0.1386 | 84.0 | 3864 | 1.7976 | 0.6237 | | 0.2155 | 85.0 | 3910 | 2.4278 | 0.5161 | | 0.4901 | 86.0 | 3956 | 1.9239 | 0.6237 | | 0.3141 | 87.0 | 4002 | 2.0954 | 0.6559 | | 0.2328 | 88.0 | 4048 | 2.2602 | 0.5806 | | 0.2768 | 89.0 | 4094 | 2.1083 | 0.5914 | | 0.3476 | 90.0 | 4140 | 2.4922 | 0.5269 | | 0.2029 | 91.0 | 4186 | 2.2094 | 0.5591 | | 0.2421 | 92.0 | 4232 | 2.2407 | 0.5376 | | 0.2034 | 93.0 | 4278 | 2.1488 | 0.5591 | | 0.2461 | 94.0 | 4324 | 2.1332 | 0.5806 | | 0.1462 | 95.0 | 4370 | 2.2702 | 0.5591 | | 0.5213 | 96.0 | 4416 | 2.2134 | 0.5699 | | 0.3634 | 97.0 | 4462 | 2.1066 | 0.5699 | | 0.1698 | 98.0 | 4508 | 2.2736 | 0.6237 | | 0.1685 | 99.0 | 4554 | 2.3919 | 0.5806 | | 0.1971 | 100.0 | 4600 | 2.0664 | 0.6237 | | 0.1496 | 101.0 | 4646 | 2.5661 | 0.5806 | | 0.283 | 102.0 | 4692 | 2.0714 | 0.5699 | | 0.185 | 103.0 | 4738 | 2.1369 | 0.6022 | | 0.1489 | 104.0 | 4784 | 2.1653 | 0.6129 | | 0.1231 | 105.0 | 4830 | 2.0890 | 0.6452 | | 0.3224 | 106.0 | 4876 | 2.3771 | 0.5376 | | 0.3452 | 107.0 | 4922 | 2.2537 | 0.6344 | | 0.4404 | 108.0 | 4968 | 2.0253 | 0.6129 | | 0.3408 | 109.0 | 5014 | 2.1653 | 0.5699 | | 0.2406 | 110.0 | 5060 | 2.0196 | 0.6237 | | 0.3051 | 111.0 | 5106 | 2.1980 | 0.6129 | | 0.1515 | 112.0 | 5152 | 2.4104 | 0.5699 | | 0.3836 | 113.0 | 5198 | 2.2342 | 0.6344 | | 0.3572 | 114.0 | 5244 | 2.2321 | 0.6022 | | 0.3006 | 115.0 | 5290 | 2.3555 | 0.5806 | | 0.0965 | 116.0 | 5336 | 2.7237 | 0.4516 | | 0.2023 | 117.0 | 5382 | 2.3798 | 0.6237 | | 0.1272 | 118.0 | 5428 | 2.5357 | 0.5591 | | 0.4318 | 119.0 | 5474 | 2.4913 | 0.5699 | | 0.0414 | 120.0 | 5520 | 2.3760 | 0.6022 | | 0.1785 | 121.0 | 5566 | 2.3920 | 0.6129 | | 0.0142 | 122.0 | 5612 | 2.4256 | 0.6022 | | 0.1262 | 123.0 | 5658 | 2.7212 | 0.5806 | | 0.2219 | 124.0 | 5704 | 2.3683 | 0.5699 | | 0.1629 | 125.0 | 5750 | 2.4280 | 0.5484 | | 0.149 | 126.0 | 5796 | 3.0708 | 0.4839 | | 0.2394 | 127.0 | 5842 | 2.2192 | 0.6022 | | 0.2165 | 128.0 | 5888 | 2.4015 | 0.5806 | | 0.0729 | 129.0 | 5934 | 2.2241 | 0.6022 | | 0.2585 | 130.0 | 5980 | 2.9483 | 0.5054 | | 0.1401 | 131.0 | 6026 | 2.3180 | 0.6129 | | 0.4162 | 132.0 | 6072 | 3.0147 | 0.4946 | | 0.1188 | 133.0 | 6118 | 2.3128 | 0.6237 | | 0.0939 | 134.0 | 6164 | 2.5300 | 0.6022 | | 0.1039 | 135.0 | 6210 | 2.5740 | 0.5699 | | 0.3678 | 136.0 | 6256 | 2.5887 | 0.5914 | | 0.3998 | 137.0 | 6302 | 2.5664 | 0.5376 | | 0.1952 | 138.0 | 6348 | 2.1861 | 0.6774 | | 
0.2616 | 139.0 | 6394 | 2.7036 | 0.5806 | | 0.2523 | 140.0 | 6440 | 2.5953 | 0.5806 | | 0.2772 | 141.0 | 6486 | 2.4114 | 0.6129 | | 0.2399 | 142.0 | 6532 | 2.3203 | 0.6237 | | 0.3769 | 143.0 | 6578 | 2.7200 | 0.5591 | | 0.0094 | 144.0 | 6624 | 2.7315 | 0.5591 | | 0.1818 | 145.0 | 6670 | 2.5223 | 0.6129 | | 0.3063 | 146.0 | 6716 | 2.3310 | 0.6237 | | 0.222 | 147.0 | 6762 | 2.6180 | 0.5806 | | 0.2505 | 148.0 | 6808 | 2.2976 | 0.6344 | | 0.2705 | 149.0 | 6854 | 2.4091 | 0.5914 | | 0.1624 | 150.0 | 6900 | 2.8030 | 0.5269 | | 0.1322 | 151.0 | 6946 | 2.6379 | 0.5591 | | 0.0876 | 152.0 | 6992 | 2.5781 | 0.5484 | | 0.1332 | 153.0 | 7038 | 2.8476 | 0.5591 | | 0.2727 | 154.0 | 7084 | 2.6779 | 0.5699 | | 0.195 | 155.0 | 7130 | 3.0504 | 0.4839 | | 0.152 | 156.0 | 7176 | 2.6103 | 0.5806 | | 0.2811 | 157.0 | 7222 | 2.5947 | 0.6129 | | 0.0742 | 158.0 | 7268 | 2.4666 | 0.6559 | | 0.2052 | 159.0 | 7314 | 2.5116 | 0.5484 | | 0.2598 | 160.0 | 7360 | 3.0400 | 0.5269 | | 0.2846 | 161.0 | 7406 | 2.2042 | 0.6667 | | 0.2653 | 162.0 | 7452 | 3.0598 | 0.5484 | | 0.358 | 163.0 | 7498 | 2.7669 | 0.5806 | | 0.0355 | 164.0 | 7544 | 2.4568 | 0.6237 | | 0.1817 | 165.0 | 7590 | 2.9532 | 0.5806 | | 0.0955 | 166.0 | 7636 | 2.4798 | 0.6237 | | 0.1941 | 167.0 | 7682 | 2.7027 | 0.5699 | | 0.1787 | 168.0 | 7728 | 2.4225 | 0.6237 | | 0.0998 | 169.0 | 7774 | 2.5104 | 0.5914 | | 0.0392 | 170.0 | 7820 | 2.6235 | 0.5806 | | 0.2689 | 171.0 | 7866 | 2.9215 | 0.5806 | | 0.0595 | 172.0 | 7912 | 2.8108 | 0.5699 | | 0.148 | 173.0 | 7958 | 2.9213 | 0.5806 | | 0.2159 | 174.0 | 8004 | 2.6172 | 0.6129 | | 0.1221 | 175.0 | 8050 | 2.4386 | 0.6237 | | 0.0691 | 176.0 | 8096 | 2.8642 | 0.5269 | | 0.2014 | 177.0 | 8142 | 2.7364 | 0.6022 | | 0.0379 | 178.0 | 8188 | 2.4859 | 0.6022 | | 0.2202 | 179.0 | 8234 | 3.0665 | 0.5484 | | 0.2078 | 180.0 | 8280 | 2.3521 | 0.6237 | | 0.1051 | 181.0 | 8326 | 2.4827 | 0.6237 | | 0.2257 | 182.0 | 8372 | 2.8155 | 0.5914 | | 0.1339 | 183.0 | 8418 | 2.6274 | 0.6237 | | 0.1414 | 184.0 | 8464 | 2.7645 | 0.5806 | | 0.0993 | 185.0 | 8510 | 2.8886 | 0.5591 | | 0.1769 | 186.0 | 8556 | 2.5164 | 0.6129 | | 0.1575 | 187.0 | 8602 | 2.9346 | 0.5376 | | 0.0251 | 188.0 | 8648 | 2.6099 | 0.5376 | | 0.0536 | 189.0 | 8694 | 2.9630 | 0.5376 | | 0.1748 | 190.0 | 8740 | 2.8360 | 0.5699 | | 0.0151 | 191.0 | 8786 | 2.7525 | 0.6022 | | 0.2198 | 192.0 | 8832 | 2.6656 | 0.5376 | | 0.267 | 193.0 | 8878 | 3.0118 | 0.5591 | | 0.1043 | 194.0 | 8924 | 3.0214 | 0.5699 | | 0.0035 | 195.0 | 8970 | 2.7925 | 0.5806 | | 0.0707 | 196.0 | 9016 | 2.7839 | 0.5806 | | 0.0656 | 197.0 | 9062 | 3.0370 | 0.5376 | | 0.1155 | 198.0 | 9108 | 2.6510 | 0.5914 | | 0.1118 | 199.0 | 9154 | 2.7058 | 0.5699 | | 0.3086 | 200.0 | 9200 | 3.0080 | 0.5699 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.0 - Tokenizers 0.13.2
[ "claims_car_abs_customers", "claims_car_abs_invoices", "claims_car_unity_x_t_h_clients_invoices", "claims_car_unity_x_t_invoices", "claims_retail", "claims_travel_assistance", "claims_travel_best_price_guarantee", "claims_travel_cancellation", "claims_travel_travel_cancellation" ]
hg2001/autotrain-animals-vs-humans2-37846100283
# Model Trained Using AutoTrain - Problem type: Binary Classification - Model ID: 37846100283 - CO2 Emissions (in grams): 1.5959 ## Validation Metrics - Loss: 0.001 - Accuracy: 1.000 - Precision: 1.000 - Recall: 1.000 - AUC: 1.000 - F1: 1.000
[ "animals", "human faces" ]
hg2001/autotrain-animals-vs-humans2-37846100280
# Model Trained Using AutoTrain - Problem type: Binary Classification - Model ID: 37846100280 - CO2 Emissions (in grams): 0.0096 ## Validation Metrics - Loss: 0.005 - Accuracy: 1.000 - Precision: 1.000 - Recall: 1.000 - AUC: 1.000 - F1: 1.000
[ "animals", "human faces" ]
hg2001/autotrain-male-vs-femalee-37851100302
# Model Trained Using AutoTrain - Problem type: Binary Classification - Model ID: 37851100302 - CO2 Emissions (in grams): 0.0034 ## Validation Metrics - Loss: 0.060 - Accuracy: 0.979 - Precision: 0.960 - Recall: 1.000 - AUC: 1.000 - F1: 0.980
[ "female", "male" ]
keerthan2/swin-tiny-patch4-window7-224-finetuned-eurosat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-eurosat This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0575 - Accuracy: 0.9785 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.2258 | 1.0 | 190 | 0.1188 | 0.9596 | | 0.1613 | 2.0 | 380 | 0.0786 | 0.9711 | | 0.1636 | 3.0 | 570 | 0.0575 | 0.9785 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.0 - Tokenizers 0.13.2
[ "annualcrop", "forest", "herbaceousvegetation", "highway", "industrial", "pasture", "permanentcrop", "residential", "river", "sealake" ]
mm-ai/vit-cc-512-birads
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-cc-512-birads This model is a fine-tuned version of [](https://huggingface.co/) on the preprocessed1024_config dataset. It achieves the following results on the evaluation set: - Loss: 1.1133 - Accuracy: {'accuracy': 0.4943467336683417} - F1: {'f1': 0.3929699341372617} ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:---------------------------------:|:---------------------------:| | 1.1037 | 1.0 | 796 | 1.0357 | {'accuracy': 0.4748743718592965} | {'f1': 0.21465076660988078} | | 1.0588 | 2.0 | 1592 | 1.0446 | {'accuracy': 0.4623115577889447} | {'f1': 0.33094476503399495} | | 1.0486 | 3.0 | 2388 | 1.0408 | {'accuracy': 0.47361809045226133} | {'f1': 0.3313643442345453} | | 1.0288 | 4.0 | 3184 | 1.0186 | {'accuracy': 0.5050251256281407} | {'f1': 0.3404676010455165} | | 1.0284 | 5.0 | 3980 | 1.0288 | {'accuracy': 0.5037688442211056} | {'f1': 0.3406391773730375} | | 0.997 | 6.0 | 4776 | 1.0183 | {'accuracy': 0.5087939698492462} | {'f1': 0.3539488153998284} | | 0.9682 | 7.0 | 5572 | 1.0965 | {'accuracy': 0.4566582914572864} | {'f1': 0.3695106771946128} | | 0.9313 | 8.0 | 6368 | 1.0554 | {'accuracy': 0.4962311557788945} | {'f1': 0.38158088397057704} | | 0.8938 | 9.0 | 7164 | 1.0930 | {'accuracy': 0.4943467336683417} | {'f1': 0.38196414933207573} | | 0.8697 | 10.0 | 7960 | 1.1133 | {'accuracy': 0.4943467336683417} | {'f1': 0.3929699341372617} | ### Framework versions - Transformers 4.20.1 - Pytorch 1.12.0 - Datasets 2.1.0 - Tokenizers 0.12.1
[ "label_0", "label_1", "label_2" ]
mm-ai/vit-mlo-512-birads
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-mlo-512-birads This model is a fine-tuned version of [](https://huggingface.co/) on the preprocessed1024_config dataset. It achieves the following results on the evaluation set: - Loss: 1.0864 - Accuracy: {'accuracy': 0.4667085427135678} - F1: {'f1': 0.3786054240333243} ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:---------------------------------:|:---------------------------:| | 1.103 | 1.0 | 796 | 1.0452 | {'accuracy': 0.4748743718592965} | {'f1': 0.21465076660988078} | | 1.0596 | 2.0 | 1592 | 1.0433 | {'accuracy': 0.4748743718592965} | {'f1': 0.21465076660988078} | | 1.0547 | 3.0 | 2388 | 1.0361 | {'accuracy': 0.4748743718592965} | {'f1': 0.21465076660988078} | | 1.047 | 4.0 | 3184 | 1.0395 | {'accuracy': 0.46796482412060303} | {'f1': 0.25128840471066954} | | 1.0524 | 5.0 | 3980 | 1.0331 | {'accuracy': 0.4648241206030151} | {'f1': 0.298317360340153} | | 1.0268 | 6.0 | 4776 | 1.0224 | {'accuracy': 0.47675879396984927} | {'f1': 0.23426509831984135} | | 1.0043 | 7.0 | 5572 | 1.0609 | {'accuracy': 0.417713567839196} | {'f1': 0.3663405670841817} | | 0.982 | 8.0 | 6368 | 1.0521 | {'accuracy': 0.44221105527638194} | {'f1': 0.3650005046420297} | | 0.9315 | 9.0 | 7164 | 1.0473 | {'accuracy': 0.47738693467336685} | {'f1': 0.3727220695970696} | | 0.9319 | 10.0 | 7960 | 1.0864 | {'accuracy': 0.4667085427135678} | {'f1': 0.3786054240333243} | ### Framework versions - Transformers 4.20.1 - Pytorch 1.12.0 - Datasets 2.1.0 - Tokenizers 0.12.1
[ "label_0", "label_1", "label_2" ]
mm-ai/vit-mlo-512-breat_composition
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-mlo-512-breat_composition This model is a fine-tuned version of [](https://huggingface.co/) on the preprocessed1024_config dataset. It achieves the following results on the evaluation set: - Loss: 1.3123 - Accuracy: {'accuracy': 0.5791457286432161} - F1: {'f1': 0.5749067914290308} ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------------------------------:|:---------------------------:| | 1.2679 | 1.0 | 796 | 1.0281 | {'accuracy': 0.5062814070351759} | {'f1': 0.38950358034816535} | | 0.9805 | 2.0 | 1592 | 0.9240 | {'accuracy': 0.5672110552763819} | {'f1': 0.5273112700912543} | | 0.9167 | 3.0 | 2388 | 0.9608 | {'accuracy': 0.5477386934673367} | {'f1': 0.45736748568671376} | | 0.8292 | 4.0 | 3184 | 0.8973 | {'accuracy': 0.5891959798994975} | {'f1': 0.5783349603036094} | | 0.7695 | 5.0 | 3980 | 1.0477 | {'accuracy': 0.5571608040201005} | {'f1': 0.5379432393338944} | | 0.6912 | 6.0 | 4776 | 0.9479 | {'accuracy': 0.585427135678392} | {'f1': 0.5766494177636581} | | 0.61 | 7.0 | 5572 | 1.1280 | {'accuracy': 0.5703517587939698} | {'f1': 0.5560158679652624} | | 0.5591 | 8.0 | 6368 | 1.1866 | {'accuracy': 0.5741206030150754} | {'f1': 0.5541999644498281} | | 0.5021 | 9.0 | 7164 | 1.1537 | {'accuracy': 0.582286432160804} | {'f1': 0.566315815243799} | | 0.4262 | 10.0 | 7960 | 1.3123 | {'accuracy': 0.5791457286432161} | {'f1': 0.5749067914290308} | ### Framework versions - Transformers 4.20.1 - Pytorch 1.12.0 - Datasets 2.1.0 - Tokenizers 0.12.1
[ "label_0", "label_1", "label_2", "label_3" ]
mm-ai/convnext-mlo-512-breat_composition
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # convnext-mlo-512-breat_composition This model is a fine-tuned version of [](https://huggingface.co/) on the preprocessed1024_config dataset. It achieves the following results on the evaluation set: - Loss: 1.1521 - Accuracy: {'accuracy': 0.5785175879396985} - F1: {'f1': 0.565251065728165} ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:---------------------------------:|:---------------------------:| | 1.3433 | 1.0 | 796 | 1.1893 | {'accuracy': 0.4566582914572864} | {'f1': 0.32080438921262083} | | 1.1242 | 2.0 | 1592 | 1.0867 | {'accuracy': 0.48555276381909546} | {'f1': 0.4061780745199038} | | 1.0569 | 3.0 | 2388 | 1.1587 | {'accuracy': 0.49120603015075376} | {'f1': 0.40970823779940124} | | 0.9327 | 4.0 | 3184 | 0.9901 | {'accuracy': 0.5452261306532663} | {'f1': 0.4885626990630958} | | 0.8723 | 5.0 | 3980 | 0.9824 | {'accuracy': 0.5728643216080402} | {'f1': 0.5365052338942904} | | 0.7803 | 6.0 | 4776 | 1.0071 | {'accuracy': 0.571608040201005} | {'f1': 0.5246756181464156} | | 0.7198 | 7.0 | 5572 | 1.0233 | {'accuracy': 0.5741206030150754} | {'f1': 0.5405969058526473} | | 0.6589 | 8.0 | 6368 | 1.0902 | {'accuracy': 0.5816582914572864} | {'f1': 0.5421523761661359} | | 0.6055 | 9.0 | 7164 | 1.0980 | {'accuracy': 0.5835427135678392} | {'f1': 0.5601877104043351} | | 0.5722 | 10.0 | 7960 | 1.1521 | {'accuracy': 0.5785175879396985} | {'f1': 0.565251065728165} | ### Framework versions - Transformers 4.20.1 - Pytorch 1.12.0 - Datasets 2.1.0 - Tokenizers 0.12.1
[ "a", "c", "d", "b" ]
omarques/autotrain-flower-classifier-6-38061100888
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 38061100888 - CO2 Emissions (in grams): 0.0038 ## Validation Metrics - Loss: 0.017 - Accuracy: 0.990 - Macro F1: 0.990 - Micro F1: 0.990 - Weighted F1: 0.990 - Macro Precision: 0.990 - Micro Precision: 0.990 - Weighted Precision: 0.990 - Macro Recall: 0.990 - Micro Recall: 0.990 - Weighted Recall: 0.990
[ "buttercup", "daisy", "dandelion", "iris", "sunflower", "tulip" ]
omarques/autotrain-flower-classification-6-38312101280
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 38312101280 - CO2 Emissions (in grams): 0.2370 ## Validation Metrics - Loss: 0.007 - Accuracy: 1.000 - Macro F1: 1.000 - Micro F1: 1.000 - Weighted F1: 1.000 - Macro Precision: 1.000 - Micro Precision: 1.000 - Weighted Precision: 1.000 - Macro Recall: 1.000 - Micro Recall: 1.000 - Weighted Recall: 1.000
[ "buttercup", "daisy", "dandelion", "iris", "sunflower", "tulip" ]
gorrox14/vit-base-txoriaktxori
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-txoriaktxori This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the txoriak_txori dataset. It achieves the following results on the evaluation set: - Loss: 0.0559 - Accuracy: 0.9864 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 5.8505 | 0.02 | 100 | 5.8381 | 0.2584 | | 5.259 | 0.04 | 200 | 5.2556 | 0.4992 | | 4.6643 | 0.06 | 300 | 4.5950 | 0.6532 | | 4.0801 | 0.08 | 400 | 3.9534 | 0.6976 | | 3.3312 | 0.1 | 500 | 3.2908 | 0.7608 | | 2.773 | 0.12 | 600 | 2.6892 | 0.7704 | | 2.3108 | 0.14 | 700 | 2.0982 | 0.7976 | | 1.662 | 0.16 | 800 | 1.6214 | 0.8216 | | 1.3897 | 0.18 | 900 | 1.2662 | 0.8604 | | 1.1634 | 0.2 | 1000 | 0.9868 | 0.8892 | | 1.0498 | 0.22 | 1100 | 0.7855 | 0.8992 | | 0.5978 | 0.24 | 1200 | 0.6305 | 0.912 | | 0.6399 | 0.26 | 1300 | 0.5560 | 0.9164 | | 0.607 | 0.28 | 1400 | 0.5119 | 0.9192 | | 0.6595 | 0.3 | 1500 | 0.4307 | 0.9272 | | 0.5239 | 0.32 | 1600 | 0.4124 | 0.9176 | | 0.5166 | 0.34 | 1700 | 0.3280 | 0.9312 | | 0.5352 | 0.36 | 1800 | 0.3155 | 0.9308 | | 0.4036 | 0.38 | 1900 | 0.2893 | 0.9424 | | 0.3836 | 0.4 | 2000 | 0.3161 | 0.9272 | | 0.3418 | 0.42 | 2100 | 0.3005 | 0.9384 | | 0.4172 | 0.44 | 2200 | 0.2518 | 0.9456 | | 0.4293 | 0.46 | 2300 | 0.2367 | 0.9424 | | 0.3551 | 0.48 | 2400 | 0.2422 | 0.9432 | | 0.2718 | 0.5 | 2500 | 0.2207 | 0.9492 | | 0.3802 | 0.52 | 2600 | 0.2163 | 0.9428 | | 0.2916 | 0.54 | 2700 | 0.2156 | 0.946 | | 0.3384 | 0.56 | 2800 | 0.2037 | 0.9508 | | 0.352 | 0.58 | 2900 | 0.2241 | 0.9432 | | 0.3868 | 0.6 | 3000 | 0.2525 | 0.9428 | | 0.3195 | 0.62 | 3100 | 0.2032 | 0.9496 | | 0.2618 | 0.64 | 3200 | 0.2088 | 0.944 | | 0.326 | 0.66 | 3300 | 0.1744 | 0.9536 | | 0.2691 | 0.68 | 3400 | 0.1853 | 0.9516 | | 0.2629 | 0.7 | 3500 | 0.1788 | 0.9464 | | 0.2965 | 0.72 | 3600 | 0.1719 | 0.9572 | | 0.3565 | 0.74 | 3700 | 0.2041 | 0.9452 | | 0.2344 | 0.76 | 3800 | 0.1863 | 0.9504 | | 0.4416 | 0.78 | 3900 | 0.1938 | 0.9472 | | 0.2901 | 0.8 | 4000 | 0.1674 | 0.9572 | | 0.3158 | 0.82 | 4100 | 0.2006 | 0.9496 | | 0.3708 | 0.84 | 4200 | 0.1850 | 0.952 | | 0.2636 | 0.86 | 4300 | 0.1488 | 0.9624 | | 0.1764 | 0.88 | 4400 | 0.1818 | 0.9524 | | 0.4299 | 0.9 | 4500 | 0.1642 | 0.9576 | | 0.4862 | 0.92 | 4600 | 0.1867 | 0.9516 | | 0.288 | 0.94 | 4700 | 0.1362 | 0.9604 | | 0.2715 | 0.96 | 4800 | 0.1384 | 0.9668 | | 0.3139 | 0.98 | 4900 | 0.1607 | 0.956 | | 0.2301 | 1.0 | 5000 | 0.1428 | 0.9628 | | 0.1527 | 1.02 | 5100 | 0.1313 | 0.9672 | | 0.1856 | 1.04 | 5200 | 0.1356 | 0.9628 | | 0.1143 | 1.06 | 5300 | 0.1469 | 0.962 | | 0.1465 | 1.08 | 5400 | 0.1320 | 0.9648 | | 0.1342 | 1.1 | 5500 | 0.1291 | 0.9644 | | 0.1686 | 1.12 | 5600 | 0.1589 | 0.952 | | 0.0683 | 1.14 | 5700 | 0.1598 | 0.9592 | | 0.095 | 1.16 | 5800 | 0.1330 | 0.9628 | 
| 0.1458 | 1.18 | 5900 | 0.1307 | 0.9652 | | 0.2321 | 1.2 | 6000 | 0.1498 | 0.9608 | | 0.0593 | 1.22 | 6100 | 0.1393 | 0.9636 | | 0.1721 | 1.24 | 6200 | 0.1564 | 0.9604 | | 0.2735 | 1.26 | 6300 | 0.1509 | 0.9572 | | 0.1384 | 1.28 | 6400 | 0.1526 | 0.958 | | 0.1232 | 1.3 | 6500 | 0.1560 | 0.9596 | | 0.1615 | 1.32 | 6600 | 0.1348 | 0.9652 | | 0.2521 | 1.34 | 6700 | 0.1223 | 0.9684 | | 0.0616 | 1.36 | 6800 | 0.1556 | 0.9616 | | 0.23 | 1.38 | 6900 | 0.1338 | 0.9652 | | 0.237 | 1.4 | 7000 | 0.1140 | 0.9664 | | 0.2572 | 1.42 | 7100 | 0.1191 | 0.9672 | | 0.1841 | 1.44 | 7200 | 0.1121 | 0.9708 | | 0.1212 | 1.46 | 7300 | 0.1089 | 0.9708 | | 0.1436 | 1.48 | 7400 | 0.1246 | 0.9672 | | 0.1403 | 1.5 | 7500 | 0.1234 | 0.9676 | | 0.1794 | 1.52 | 7600 | 0.1273 | 0.966 | | 0.2153 | 1.54 | 7700 | 0.1423 | 0.964 | | 0.1347 | 1.56 | 7800 | 0.0985 | 0.9708 | | 0.1989 | 1.58 | 7900 | 0.1117 | 0.9712 | | 0.2686 | 1.6 | 8000 | 0.1166 | 0.9704 | | 0.134 | 1.62 | 8100 | 0.1391 | 0.962 | | 0.2474 | 1.64 | 8200 | 0.1280 | 0.9676 | | 0.0635 | 1.66 | 8300 | 0.1079 | 0.9696 | | 0.1073 | 1.68 | 8400 | 0.1335 | 0.9628 | | 0.1483 | 1.7 | 8500 | 0.1108 | 0.9692 | | 0.0933 | 1.72 | 8600 | 0.1059 | 0.9708 | | 0.1204 | 1.74 | 8700 | 0.1007 | 0.9752 | | 0.1051 | 1.76 | 8800 | 0.1055 | 0.9712 | | 0.1509 | 1.78 | 8900 | 0.0995 | 0.9704 | | 0.1404 | 1.8 | 9000 | 0.1012 | 0.9744 | | 0.0502 | 1.82 | 9100 | 0.0913 | 0.9768 | | 0.3038 | 1.84 | 9200 | 0.0988 | 0.9732 | | 0.1651 | 1.86 | 9300 | 0.1146 | 0.9656 | | 0.1047 | 1.88 | 9400 | 0.1140 | 0.9664 | | 0.1639 | 1.9 | 9500 | 0.1059 | 0.97 | | 0.1044 | 1.92 | 9600 | 0.1012 | 0.9744 | | 0.1955 | 1.94 | 9700 | 0.1119 | 0.9676 | | 0.1903 | 1.96 | 9800 | 0.1127 | 0.9716 | | 0.1328 | 1.98 | 9900 | 0.1199 | 0.9628 | | 0.1219 | 2.0 | 10000 | 0.1011 | 0.972 | | 0.0514 | 2.02 | 10100 | 0.1040 | 0.9728 | | 0.0194 | 2.04 | 10200 | 0.0994 | 0.9752 | | 0.0469 | 2.06 | 10300 | 0.1027 | 0.9716 | | 0.0417 | 2.08 | 10400 | 0.1045 | 0.9748 | | 0.0566 | 2.1 | 10500 | 0.0861 | 0.9792 | | 0.0427 | 2.12 | 10600 | 0.1094 | 0.974 | | 0.1358 | 2.14 | 10700 | 0.0795 | 0.9776 | | 0.0119 | 2.16 | 10800 | 0.0972 | 0.9748 | | 0.0379 | 2.18 | 10900 | 0.1087 | 0.97 | | 0.0951 | 2.2 | 11000 | 0.1079 | 0.9728 | | 0.0256 | 2.22 | 11100 | 0.0951 | 0.9748 | | 0.076 | 2.24 | 11200 | 0.0945 | 0.9764 | | 0.1004 | 2.26 | 11300 | 0.0870 | 0.9788 | | 0.0657 | 2.28 | 11400 | 0.1073 | 0.974 | | 0.0332 | 2.3 | 11500 | 0.0960 | 0.9752 | | 0.0087 | 2.32 | 11600 | 0.0865 | 0.978 | | 0.0351 | 2.34 | 11700 | 0.0963 | 0.9736 | | 0.0127 | 2.36 | 11800 | 0.0989 | 0.976 | | 0.0447 | 2.38 | 11900 | 0.1038 | 0.9752 | | 0.023 | 2.4 | 12000 | 0.0919 | 0.9744 | | 0.0329 | 2.42 | 12100 | 0.0857 | 0.9796 | | 0.042 | 2.44 | 12200 | 0.0812 | 0.9804 | | 0.0549 | 2.46 | 12300 | 0.1114 | 0.9732 | | 0.0806 | 2.48 | 12400 | 0.0971 | 0.9772 | | 0.1768 | 2.5 | 12500 | 0.0933 | 0.974 | | 0.059 | 2.52 | 12600 | 0.0943 | 0.9788 | | 0.0184 | 2.54 | 12700 | 0.0874 | 0.978 | | 0.021 | 2.56 | 12800 | 0.0903 | 0.9764 | | 0.0457 | 2.58 | 12900 | 0.0999 | 0.976 | | 0.0788 | 2.6 | 13000 | 0.0954 | 0.9732 | | 0.0599 | 2.62 | 13100 | 0.0876 | 0.9752 | | 0.1041 | 2.64 | 13200 | 0.1017 | 0.9744 | | 0.0309 | 2.66 | 13300 | 0.0918 | 0.9772 | | 0.1347 | 2.68 | 13400 | 0.0758 | 0.9792 | | 0.0432 | 2.7 | 13500 | 0.0790 | 0.9808 | | 0.0802 | 2.72 | 13600 | 0.0860 | 0.9776 | | 0.0841 | 2.74 | 13700 | 0.0857 | 0.98 | | 0.0513 | 2.76 | 13800 | 0.0895 | 0.9764 | | 0.0129 | 2.78 | 13900 | 0.0861 | 0.9772 | | 0.1279 | 2.8 | 14000 | 0.0895 | 0.9764 | | 0.0074 | 2.82 | 14100 | 
0.0842 | 0.978 | | 0.0132 | 2.84 | 14200 | 0.0742 | 0.9796 | | 0.0974 | 2.86 | 14300 | 0.0854 | 0.9776 | | 0.0803 | 2.88 | 14400 | 0.0769 | 0.9804 | | 0.037 | 2.9 | 14500 | 0.0806 | 0.9788 | | 0.0936 | 2.92 | 14600 | 0.0824 | 0.9812 | | 0.0064 | 2.94 | 14700 | 0.0748 | 0.9832 | | 0.0631 | 2.96 | 14800 | 0.0761 | 0.9828 | | 0.0158 | 2.98 | 14900 | 0.0709 | 0.9848 | | 0.0433 | 3.0 | 15000 | 0.0704 | 0.9828 | | 0.0028 | 3.02 | 15100 | 0.0712 | 0.9824 | | 0.0031 | 3.04 | 15200 | 0.0717 | 0.9808 | | 0.0191 | 3.06 | 15300 | 0.0716 | 0.9828 | | 0.0051 | 3.08 | 15400 | 0.0708 | 0.9832 | | 0.0205 | 3.1 | 15500 | 0.0686 | 0.9828 | | 0.1147 | 3.12 | 15600 | 0.0670 | 0.984 | | 0.0014 | 3.14 | 15700 | 0.0628 | 0.9848 | | 0.0082 | 3.16 | 15800 | 0.0659 | 0.984 | | 0.0149 | 3.18 | 15900 | 0.0672 | 0.9836 | | 0.0056 | 3.2 | 16000 | 0.0676 | 0.9852 | | 0.0059 | 3.22 | 16100 | 0.0706 | 0.9836 | | 0.0198 | 3.24 | 16200 | 0.0725 | 0.9812 | | 0.0019 | 3.26 | 16300 | 0.0681 | 0.9828 | | 0.0013 | 3.28 | 16400 | 0.0681 | 0.9856 | | 0.0663 | 3.3 | 16500 | 0.0704 | 0.9852 | | 0.0024 | 3.32 | 16600 | 0.0697 | 0.984 | | 0.0081 | 3.34 | 16700 | 0.0679 | 0.9852 | | 0.0264 | 3.36 | 16800 | 0.0631 | 0.9872 | | 0.0061 | 3.38 | 16900 | 0.0651 | 0.9848 | | 0.0169 | 3.4 | 17000 | 0.0655 | 0.9828 | | 0.0013 | 3.42 | 17100 | 0.0661 | 0.9836 | | 0.0072 | 3.44 | 17200 | 0.0633 | 0.9848 | | 0.009 | 3.46 | 17300 | 0.0634 | 0.9848 | | 0.0028 | 3.48 | 17400 | 0.0634 | 0.9844 | | 0.0024 | 3.5 | 17500 | 0.0637 | 0.9836 | | 0.0031 | 3.52 | 17600 | 0.0641 | 0.9848 | | 0.004 | 3.54 | 17700 | 0.0619 | 0.9856 | | 0.0562 | 3.56 | 17800 | 0.0673 | 0.9856 | | 0.0005 | 3.58 | 17900 | 0.0644 | 0.9864 | | 0.0079 | 3.6 | 18000 | 0.0647 | 0.9872 | | 0.0016 | 3.62 | 18100 | 0.0617 | 0.9872 | | 0.0019 | 3.64 | 18200 | 0.0636 | 0.9872 | | 0.0047 | 3.66 | 18300 | 0.0608 | 0.9848 | | 0.0327 | 3.68 | 18400 | 0.0586 | 0.9868 | | 0.0108 | 3.7 | 18500 | 0.0594 | 0.9872 | | 0.0061 | 3.72 | 18600 | 0.0597 | 0.9868 | | 0.0106 | 3.74 | 18700 | 0.0579 | 0.9872 | | 0.001 | 3.76 | 18800 | 0.0564 | 0.9872 | | 0.012 | 3.78 | 18900 | 0.0561 | 0.9876 | | 0.0038 | 3.8 | 19000 | 0.0566 | 0.9868 | | 0.0099 | 3.82 | 19100 | 0.0573 | 0.9864 | | 0.0026 | 3.84 | 19200 | 0.0575 | 0.9864 | | 0.0062 | 3.86 | 19300 | 0.0573 | 0.9872 | | 0.0239 | 3.88 | 19400 | 0.0573 | 0.9864 | | 0.0026 | 3.9 | 19500 | 0.0568 | 0.9868 | | 0.0014 | 3.92 | 19600 | 0.0557 | 0.9868 | | 0.0019 | 3.94 | 19700 | 0.0562 | 0.9864 | | 0.0484 | 3.96 | 19800 | 0.0560 | 0.9864 | | 0.0022 | 3.98 | 19900 | 0.0559 | 0.9864 | | 0.0145 | 4.0 | 20000 | 0.0559 | 0.9864 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "abbotts babbler", "abbotts booby", "alberts towhee", "bornean bristlehead", "bornean leafbird", "bornean pheasant", "brandt cormarant", "brewers blackbird", "brown crepper", "brown headed cowbird", "brown noody", "brown thrasher", "bufflehead", "alexandrine parakeet", "bulwers pheasant", "burchells courser", "bush turkey", "caatinga cacholote", "cactus wren", "california condor", "california gull", "california quail", "campo flicker", "canary", "alpine chough", "canvasback", "cape glossy starling", "cape longclaw", "cape may warbler", "cape rock thrush", "capped heron", "capuchinbird", "carmine bee-eater", "caspian tern", "cassowary", "altamira yellowthroat", "cedar waxwing", "cerulean warbler", "chara de collar", "chattering lory", "chestnet bellied euphonia", "chinese bamboo partridge", "chinese pond heron", "chipping sparrow", "chucao tapaculo", "chukar partridge", "american avocet", "cinnamon attila", "cinnamon flycatcher", "cinnamon teal", "clarks grebe", "clarks nutcracker", "cock of the rock", "cockatoo", "collared aracari", "collared crescentchest", "common firecrest", "american bittern", "common grackle", "common house martin", "common iora", "common loon", "common poorwill", "common starling", "coppery tailed coucal", "crab plover", "crane hawk", "cream colored woodpecker", "american coot", "crested auklet", "crested caracara", "crested coua", "crested fireback", "crested kingfisher", "crested nuthatch", "crested oropendola", "crested serpent eagle", "crested shriketit", "crested wood partridge", "american flamingo", "crimson chat", "crimson sunbird", "crow", "crowned pigeon", "cuban tody", "cuban trogon", "curl crested aracuri", "d-arnauds barbet", "dalmatian pelican", "darjeeling woodpecker", "american goldfinch", "dark eyed junco", "daurian redstart", "demoiselle crane", "double barred finch", "double brested cormarant", "double eyed fig parrot", "downy woodpecker", "dusky lory", "dusky robin", "eared pita", "american kestrel", "eastern bluebird", "eastern bluebonnet", "eastern golden weaver", "eastern meadowlark", "eastern rosella", "eastern towee", "eastern wip poor will", "eastern yellow robin", "ecuadorian hillstar", "egyptian goose", "abyssinian ground hornbill", "american pipit", "elegant trogon", "elliots pheasant", "emerald tanager", "emperor penguin", "emu", "enggano myna", "eurasian bullfinch", "eurasian golden oriole", "eurasian magpie", "european goldfinch", "american redstart", "european turtle dove", "evening grosbeak", "fairy bluebird", "fairy penguin", "fairy tern", "fan tailed widow", "fasciated wren", "fiery minivet", "fiordland penguin", "fire tailled myzornis", "american robin", "flame bowerbird", "flame tanager", "frigate", "frill back pigeon", "gambels quail", "gang gang cockatoo", "gila woodpecker", "gilded flicker", "glossy ibis", "go away bird", "american wigeon", "gold wing warbler", "golden bower bird", "golden cheeked warbler", "golden chlorophonia", "golden eagle", "golden parakeet", "golden pheasant", "golden pipit", "gouldian finch", "grandala", "amethyst woodstar", "gray catbird", "gray kingbird", "gray partridge", "great argus", "great gray owl", "great jacamar", "great kiskadee", "great potoo", "great tinamou", "great xenops", "andean goose", "greater pewee", "greater prairie chicken", "greator sage grouse", "green broadbill", "green jay", "green magpie", "green winged dove", "grey cuckooshrike", "grey headed fish eagle", "grey plover", "andean lapwing", "groved billed ani", "guinea turaco", "guineafowl", "gurneys pitta", "gyrfalcon", 
"hamerkop", "harlequin duck", "harlequin quail", "harpy eagle", "hawaiian goose", "andean siskin", "hawfinch", "helmet vanga", "hepatic tanager", "himalayan bluetail", "himalayan monal", "hoatzin", "hooded merganser", "hoopoes", "horned guan", "horned lark", "anhinga", "horned sungem", "house finch", "house sparrow", "hyacinth macaw", "iberian magpie", "ibisbill", "imperial shaq", "inca tern", "indian bustard", "indian pitta", "anianiau", "indian roller", "indian vulture", "indigo bunting", "indigo flycatcher", "inland dotterel", "ivory billed aracari", "ivory gull", "iwi", "jabiru", "jack snipe", "african crowned crane", "annas hummingbird", "jacobin pigeon", "jandaya parakeet", "japanese robin", "java sparrow", "jocotoco antpitta", "kagu", "kakapo", "killdear", "king eider", "king vulture", "antbird", "kiwi", "kookaburra", "lark bunting", "laughing gull", "lazuli bunting", "lesser adjutant", "lilac roller", "limpkin", "little auk", "loggerhead shrike", "antillean euphonia", "long-eared owl", "looney birds", "lucifer hummingbird", "magpie goose", "malabar hornbill", "malachite kingfisher", "malagasy white eye", "maleo", "mallard duck", "mandrin duck", "apapane", "mangrove cuckoo", "marabou stork", "masked bobwhite", "masked booby", "masked lapwing", "mckays bunting", "merlin", "mikado pheasant", "military macaw", "mourning dove", "apostlebird", "myna", "nicobar pigeon", "noisy friarbird", "northern beardless tyrannulet", "northern cardinal", "northern flicker", "northern fulmar", "northern gannet", "northern goshawk", "northern jacana", "araripe manakin", "northern mockingbird", "northern parula", "northern red bishop", "northern shoveler", "ocellated turkey", "okinawa rail", "orange brested bunting", "oriental bay owl", "ornate hawk eagle", "osprey", "ashy storm petrel", "ostrich", "ovenbird", "oyster catcher", "painted bunting", "palila", "palm nut vulture", "paradise tanager", "parakett akulet", "parus major", "patagonian sierra finch", "ashy thrushbird", "peacock", "peregrine falcon", "phainopepla", "philippine eagle", "pink robin", "plush crested jay", "pomarine jaeger", "puffin", "puna teal", "purple finch", "asian crested ibis", "purple gallinule", "purple martin", "purple swamphen", "pygmy kingfisher", "pyrrhuloxia", "quetzal", "rainbow lorikeet", "razorbill", "red bearded bee eater", "red bellied pitta", "asian dollard bird", "red billed tropicbird", "red browed finch", "red faced cormorant", "red faced warbler", "red fody", "red headed duck", "red headed woodpecker", "red knot", "red legged honeycreeper", "red naped trogon", "african emerald cuckoo", "auckland shaq", "red shouldered hawk", "red tailed hawk", "red tailed thrush", "red winged blackbird", "red wiskered bulbul", "regent bowerbird", "ring-necked pheasant", "roadrunner", "rock dove", "rose breasted cockatoo", "austral canastero", "rose breasted grosbeak", "roseate spoonbill", "rosy faced lovebird", "rough leg buzzard", "royal flycatcher", "ruby crowned kinglet", "ruby throated hummingbird", "rudy kingfisher", "rufous kingfisher", "rufuos motmot", "australasian figbird", "samatran thrush", "sand martin", "sandhill crane", "satyr tragopan", "says phoebe", "scarlet crowned fruit dove", "scarlet faced liocichla", "scarlet ibis", "scarlet macaw", "scarlet tanager", "avadavat", "shoebill", "short billed dowitcher", "smiths longspur", "snow goose", "snowy egret", "snowy owl", "snowy plover", "sora", "spangled cotinga", "splendid wren", "azaras spinetail", "spoon biled sandpiper", "spotted catbird", "spotted whistling duck", 
"sri lanka blue magpie", "steamer duck", "stork billed kingfisher", "striated caracara", "striped owl", "stripped manakin", "stripped swallow", "azure breasted pitta", "sunbittern", "superb starling", "surf scoter", "swinhoes pheasant", "tailorbird", "taiwan magpie", "takahe", "tasmanian hen", "tawny frogmouth", "teal duck", "azure jay", "tit mouse", "touchan", "townsends warbler", "tree swallow", "tricolored blackbird", "tropical kingbird", "trumpter swan", "turkey vulture", "turquoise motmot", "umbrella bird", "azure tanager", "varied thrush", "veery", "venezuelian troupial", "verdin", "vermilion flycather", "victoria crowned pigeon", "violet backed starling", "violet green swallow", "violet turaco", "vulturine guineafowl", "azure tit", "wall creaper", "wattled curassow", "wattled lapwing", "whimbrel", "white browed crake", "white cheeked turaco", "white crested hornbill", "white eared hummingbird", "white necked raven", "white tailed tropic", "baikal teal", "white throated bee eater", "wild turkey", "willow ptarmigan", "wilsons bird of paradise", "wood duck", "wood thrush", "wrentit", "yellow bellied flowerpecker", "yellow cacique", "yellow headed blackbird", "african firefinch", "bald eagle", "bald ibis", "bali starling", "baltimore oriole", "bananaquit", "band tailed guan", "banded broadbill", "banded pita", "banded stilt", "bar-tailed godwit", "african oyster catcher", "barn owl", "barn swallow", "barred puffbird", "barrows goldeneye", "bay-breasted warbler", "bearded barbet", "bearded bellbird", "bearded reedling", "belted kingfisher", "bird of paradise", "african pied hornbill", "black and yellow broadbill", "black baza", "black cockato", "black faced spoonbill", "black francolin", "black headed caique", "black necked stilt", "black skimmer", "black swan", "black tail crake", "african pygmy goose", "black throated bushtit", "black throated huet", "black throated warbler", "black vented shearwater", "black vulture", "black-capped chickadee", "black-necked grebe", "black-throated sparrow", "blackburniam warbler", "blonde crested woodpecker", "albatross", "blood pheasant", "blue coau", "blue dacnis", "blue gray gnatcatcher", "blue grosbeak", "blue grouse", "blue heron", "blue malkoha", "blue throated toucanet", "bobolink" ]
MManu26/swin-tiny-patch4-window7-224-finetuned-eurosat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-eurosat This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.5827 - Accuracy: 0.7059 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 0.8 | 2 | 0.6240 | 0.6471 | | No log | 1.8 | 4 | 0.5937 | 0.7059 | | No log | 2.8 | 6 | 0.5827 | 0.7059 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "epoc", "no_epoc" ]
omarques/autotrain-img-classifier-march-2023-38397101427
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 38397101427 - CO2 Emissions (in grams): 0.2886 ## Validation Metrics - Loss: 0.000 - Accuracy: 1.000 - Macro F1: 1.000 - Micro F1: 1.000 - Weighted F1: 1.000 - Macro Precision: 1.000 - Micro Precision: 1.000 - Weighted Precision: 1.000 - Macro Recall: 1.000 - Micro Recall: 1.000 - Weighted Recall: 1.000
[ "buttercup", "daisy", "dandelion", "iris", "sunflower", "tulip" ]
jinkasreedhar/my_awesome_food_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # my_awesome_food_model This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the food101 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
platzi/platzi-vit-model-julio-test
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # platzi-vit-model-julio-test This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset. It achieves the following results on the evaluation set: - Loss: 0.0163 - Accuracy: 0.9925 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.1271 | 3.85 | 500 | 0.0163 | 0.9925 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "angular_leaf_spot", "bean_rust", "healthy" ]
osarez-group/vit-model-julio-test
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-model-julio-test This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset. It achieves the following results on the evaluation set: - Loss: 0.1218 - Accuracy: 0.9774 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0881 | 3.85 | 500 | 0.1218 | 0.9774 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "angular_leaf_spot", "bean_rust", "healthy" ]
iammartian0/vegetation_classification_model
## Model description This is a transformers-based image classification model built with transfer learning. The pretrained backbone is the [Vision Transformer](https://huggingface.co/google/vit-base-patch16-224) trained on ImageNet-21k. ## Datasets The dataset was downloaded from the [Agri-Hub/Space2Ground](https://github.com/Agri-Hub/Space2Ground/tree/main) Git repository; this model uses the street-level image patches folder, which contains cropped vegetation regions from Mapillary street-level images. Further details are in the linked repository. ### How to use You can use this model directly with the `pipeline` class from the Hugging Face `transformers` library: ```python >>> from transformers import pipeline >>> classifier = pipeline("image-classification", model="iammartian0/vegetation_classification_model") >>> classifier(image) ``` or by uploading a target image to the hosted inference API. ## Training procedure ### Preprocessing Labels were assigned based on parent folder names. ### Image Transformations RandomResizedCrop from torchvision.transforms was applied to all training images. ### Finetuning The model was fine-tuned on the dataset for four epochs. ## Evaluation results The model achieved a top-1 accuracy of 0.929. ## Further exploration - Training a multilabel model that, on top of classifying the vegetation, identifies whether the image comes from the left or right side - Fine-grained classification of crop labels using the raw/initial set of street-level images ### BibTeX entry and citation info ```bibtex @misc{wu2020visual, title={Visual Transformers: Token-based Image Representation and Processing for Computer Vision}, author={Bichen Wu and Chenfeng Xu and Xiaoliang Dai and Alvin Wan and Peizhao Zhang and Zhicheng Yan and Masayoshi Tomizuka and Joseph Gonzalez and Kurt Keutzer and Peter Vajda}, year={2020}, eprint={2006.03677}, archivePrefix={arXiv}, primaryClass={cs.CV} } ``` ```bibtex @INPROCEEDINGS{9816335, author={Choumos, George and Koukos, Alkiviadis and Sitokonstantinou, Vasileios and Kontoes, Charalampos}, booktitle={2022 IEEE 14th Image, Video, and Multidimensional Signal Processing Workshop (IVMSP)}, title={Towards Space-to-Ground Data Availability for Agriculture Monitoring}, year={2022}, volume={}, number={}, pages={1-5}, doi={10.1109/IVMSP54334.2022.9816335} } ```
[ "grassland", "non_grassland" ]
sabhashanki/swin-tiny-patch4-window7-224-finetuned-eurosat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-eurosat This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.6177 - Accuracy: 0.8182 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 1 | 0.7459 | 0.3636 | | No log | 2.0 | 2 | 0.7104 | 0.5455 | | No log | 3.0 | 3 | 0.6824 | 0.4545 | | No log | 4.0 | 4 | 0.6597 | 0.4545 | | No log | 5.0 | 5 | 0.6421 | 0.5455 | | No log | 6.0 | 6 | 0.6285 | 0.7273 | | No log | 7.0 | 7 | 0.6177 | 0.8182 | | No log | 8.0 | 8 | 0.6101 | 0.8182 | | No log | 9.0 | 9 | 0.6053 | 0.8182 | | 0.5108 | 10.0 | 10 | 0.6029 | 0.8182 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "aadhaar", "others" ]
sabhashanki/vit-base-patch16-224-finetuned-eurosat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224-finetuned-eurosat

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5186
- Accuracy: 0.7391

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 0.57  | 1    | 0.6360          | 0.6522   |
| No log        | 1.57  | 2    | 0.6056          | 0.6957   |
| No log        | 2.57  | 3    | 0.5824          | 0.6957   |
| No log        | 3.57  | 4    | 0.5638          | 0.6957   |
| No log        | 4.57  | 5    | 0.5497          | 0.6957   |
| No log        | 5.57  | 6    | 0.5388          | 0.6522   |
| No log        | 6.57  | 7    | 0.5296          | 0.6522   |
| No log        | 7.57  | 8    | 0.5232          | 0.6522   |
| No log        | 8.57  | 9    | 0.5186          | 0.7391   |
| 1.0722        | 9.57  | 10   | 0.5163          | 0.7391   |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
[ "mirudhu", "mythilie" ]
RicardC/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224-finetuned-flower

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

### Framework versions

- Transformers 4.24.0
- Pytorch 1.13.1+cu116
- Datasets 2.7.1
- Tokenizers 0.13.2
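The card only names the generic `imagefolder` loader; a minimal sketch of how such a flower dataset is typically loaded with the Hugging Face `datasets` library follows (the `data_dir` path is a placeholder assumption):

```python
from datasets import load_dataset

# "imagefolder" derives one class per subfolder (daisy, dandelion, ...).
dataset = load_dataset("imagefolder", data_dir="path/to/flower_photos")  # placeholder path
print(dataset["train"].features["label"].names)
```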
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
sabhashanki/beit-base-patch16-224-pt22k-ft22k-finetuned-eurosat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# beit-base-patch16-224-pt22k-ft22k-finetuned-eurosat

This model is a fine-tuned version of [microsoft/beit-base-patch16-224-pt22k-ft22k](https://huggingface.co/microsoft/beit-base-patch16-224-pt22k-ft22k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5698
- Accuracy: 0.7826

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 0.57  | 1    | 0.8366          | 0.4348   |
| No log        | 1.57  | 2    | 0.7708          | 0.5217   |
| No log        | 2.57  | 3    | 0.7185          | 0.6522   |
| No log        | 3.57  | 4    | 0.6747          | 0.6522   |
| No log        | 4.57  | 5    | 0.6380          | 0.6522   |
| No log        | 5.57  | 6    | 0.6098          | 0.6957   |
| No log        | 6.57  | 7    | 0.5859          | 0.7391   |
| No log        | 7.57  | 8    | 0.5698          | 0.7826   |
| No log        | 8.57  | 9    | 0.5589          | 0.7826   |
| 1.0859        | 9.57  | 10   | 0.5534          | 0.7826   |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
[ "mirudhu", "mythilie" ]
Barghi/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224-finetuned-flower

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

### Framework versions

- Transformers 4.24.0
- Pytorch 1.13.1+cu116
- Datasets 2.7.1
- Tokenizers 0.13.2
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
ChasingMercer/weather-base
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# weather-base

This model is a fine-tuned version of [microsoft/beit-base-patch16-224-pt22k-ft22k](https://huggingface.co/microsoft/beit-base-patch16-224-pt22k-ft22k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2184
- Accuracy: 0.9359

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 6

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3368        | 1.0   | 171  | 0.2780          | 0.9009   |
| 0.2129        | 2.0   | 342  | 0.2333          | 0.9300   |
| 0.1827        | 3.0   | 513  | 0.2440          | 0.9213   |
| 0.1475        | 4.0   | 684  | 0.2306          | 0.9315   |
| 0.1284        | 5.0   | 855  | 0.2192          | 0.9359   |
| 0.0526        | 6.0   | 1026 | 0.2184          | 0.9359   |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
[ "dew", "fogsmog", "frost", "glaze", "hail", "lightning", "rain", "rainbow", "rime", "sandstorm", "snow" ]
SirBadr/my_awesome_classification_model
This model classifies images into two classes: original or generated by AI.
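A minimal sketch of running this classifier through the `transformers` pipeline is shown below; note that, per the label list that follows, the published checkpoint may expose generic `label_N` ids rather than human-readable class names.

```python
from transformers import pipeline

# The image path is a placeholder; the pipeline also accepts PIL images and URLs.
classifier = pipeline("image-classification", model="SirBadr/my_awesome_classification_model")
print(classifier("path/to/image.jpg"))  # top predictions: original vs. AI-generated
```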
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5", "label_6", "label_7", "label_8", "label_9", "label_10", "label_11", "label_12", "label_13", "label_14", "label_15", "label_16", "label_17", "label_18", "label_19", "label_20", "label_21", "label_22", "label_23", "label_24", "label_25", "label_26", "label_27", "label_28", "label_29", "label_30", "label_31", "label_32", "label_33", "label_34", "label_35", "label_36", "label_37", "label_38", "label_39", "label_40", "label_41", "label_42", "label_43", "label_44", "label_45", "label_46", "label_47", "label_48", "label_49", "label_50", "label_51", "label_52", "label_53", "label_54", "label_55", "label_56", "label_57", "label_58", "label_59", "label_60", "label_61", "label_62", "label_63", "label_64", "label_65", "label_66", "label_67", "label_68", "label_69", "label_70", "label_71", "label_72", "label_73", "label_74", "label_75", "label_76", "label_77", "label_78", "label_79", "label_80", "label_81", "label_82", "label_83", "label_84", "label_85", "label_86", "label_87", "label_88", "label_89", "label_90", "label_91", "label_92", "label_93", "label_94", "label_95", "label_96", "label_97", "label_98", "label_99", "label_100", "label_101", "label_102", "label_103", "label_104", "label_105", "label_106", "label_107", "label_108", "label_109", "label_110", "label_111", "label_112", "label_113", "label_114", "label_115", "label_116", "label_117", "label_118", "label_119", "label_120", "label_121", "label_122", "label_123", "label_124", "label_125", "label_126", "label_127", "label_128", "label_129", "label_130", "label_131", "label_132", "label_133", "label_134", "label_135", "label_136", "label_137", "label_138", "label_139", "label_140", "label_141", "label_142", "label_143", "label_144", "label_145", "label_146", "label_147", "label_148", "label_149", "label_150", "label_151", "label_152", "label_153", "label_154", "label_155", "label_156", "label_157", "label_158", "label_159", "label_160", "label_161", "label_162", "label_163", "label_164", "label_165", "label_166", "label_167", "label_168", "label_169", "label_170", "label_171", "label_172", "label_173", "label_174", "label_175", "label_176", "label_177", "label_178", "label_179", "label_180", "label_181", "label_182", "label_183", "label_184", "label_185", "label_186", "label_187", "label_188", "label_189", "label_190", "label_191", "label_192", "label_193", "label_194", "label_195", "label_196", "label_197", "label_198", "label_199", "label_200", "label_201", "label_202", "label_203", "label_204", "label_205", "label_206", "label_207", "label_208", "label_209", "label_210", "label_211", "label_212", "label_213", "label_214", "label_215", "label_216", "label_217", "label_218", "label_219", "label_220", "label_221", "label_222", "label_223", "label_224", "label_225", "label_226", "label_227", "label_228", "label_229", "label_230", "label_231", "label_232", "label_233", "label_234", "label_235", "label_236", "label_237", "label_238", "label_239", "label_240", "label_241", "label_242", "label_243", "label_244", "label_245", "label_246", "label_247", "label_248", "label_249", "label_250", "label_251", "label_252", "label_253", "label_254", "label_255", "label_256", "label_257", "label_258", "label_259", "label_260", "label_261", "label_262", "label_263", "label_264", "label_265", "label_266", "label_267", "label_268", "label_269", "label_270", "label_271", "label_272", "label_273", "label_274", "label_275", "label_276", "label_277", "label_278", "label_279", "label_280", 
"label_281", "label_282", "label_283", "label_284", "label_285", "label_286", "label_287", "label_288", "label_289", "label_290", "label_291", "label_292", "label_293", "label_294", "label_295", "label_296", "label_297", "label_298", "label_299", "label_300", "label_301", "label_302", "label_303", "label_304", "label_305", "label_306", "label_307", "label_308", "label_309", "label_310", "label_311", "label_312", "label_313", "label_314", "label_315", "label_316", "label_317", "label_318", "label_319", "label_320", "label_321", "label_322", "label_323", "label_324", "label_325", "label_326", "label_327", "label_328", "label_329", "label_330", "label_331", "label_332", "label_333", "label_334", "label_335", "label_336", "label_337", "label_338", "label_339", "label_340", "label_341", "label_342", "label_343", "label_344", "label_345", "label_346", "label_347", "label_348", "label_349", "label_350", "label_351", "label_352", "label_353", "label_354", "label_355", "label_356", "label_357", "label_358", "label_359", "label_360", "label_361", "label_362", "label_363", "label_364", "label_365", "label_366", "label_367", "label_368", "label_369", "label_370", "label_371", "label_372", "label_373", "label_374", "label_375", "label_376", "label_377", "label_378", "label_379", "label_380", "label_381", "label_382", "label_383", "label_384", "label_385", "label_386", "label_387", "label_388", "label_389", "label_390", "label_391", "label_392", "label_393", "label_394", "label_395", "label_396", "label_397", "label_398", "label_399", "label_400", "label_401", "label_402", "label_403", "label_404", "label_405", "label_406", "label_407", "label_408", "label_409", "label_410", "label_411", "label_412", "label_413", "label_414", "label_415", "label_416", "label_417", "label_418", "label_419", "label_420", "label_421", "label_422", "label_423", "label_424", "label_425", "label_426", "label_427", "label_428", "label_429", "label_430", "label_431", "label_432", "label_433", "label_434", "label_435", "label_436", "label_437", "label_438", "label_439", "label_440", "label_441", "label_442", "label_443", "label_444", "label_445", "label_446", "label_447", "label_448", "label_449", "label_450", "label_451", "label_452", "label_453", "label_454", "label_455", "label_456", "label_457", "label_458", "label_459", "label_460", "label_461", "label_462", "label_463", "label_464", "label_465", "label_466", "label_467", "label_468", "label_469", "label_470", "label_471", "label_472", "label_473", "label_474", "label_475", "label_476", "label_477", "label_478", "label_479", "label_480", "label_481", "label_482", "label_483", "label_484", "label_485", "label_486", "label_487", "label_488", "label_489", "label_490", "label_491", "label_492", "label_493", "label_494", "label_495", "label_496", "label_497", "label_498", "label_499", "label_500", "label_501", "label_502", "label_503", "label_504", "label_505", "label_506", "label_507", "label_508", "label_509", "label_510", "label_511", "label_512", "label_513", "label_514", "label_515", "label_516", "label_517", "label_518", "label_519", "label_520", "label_521", "label_522", "label_523", "label_524", "label_525", "label_526", "label_527", "label_528", "label_529", "label_530", "label_531", "label_532", "label_533", "label_534", "label_535", "label_536", "label_537", "label_538", "label_539", "label_540", "label_541", "label_542", "label_543", "label_544", "label_545", "label_546", "label_547", "label_548", "label_549", "label_550", "label_551", "label_552", "label_553", 
"label_554", "label_555", "label_556", "label_557", "label_558", "label_559", "label_560", "label_561", "label_562", "label_563", "label_564", "label_565", "label_566", "label_567", "label_568", "label_569", "label_570", "label_571", "label_572", "label_573", "label_574", "label_575", "label_576", "label_577", "label_578", "label_579", "label_580", "label_581", "label_582", "label_583", "label_584", "label_585", "label_586", "label_587", "label_588", "label_589", "label_590", "label_591", "label_592", "label_593", "label_594", "label_595", "label_596", "label_597", "label_598", "label_599", "label_600", "label_601", "label_602", "label_603", "label_604", "label_605", "label_606", "label_607", "label_608", "label_609", "label_610", "label_611", "label_612", "label_613", "label_614", "label_615", "label_616", "label_617", "label_618", "label_619", "label_620", "label_621", "label_622", "label_623", "label_624", "label_625", "label_626", "label_627", "label_628", "label_629", "label_630", "label_631", "label_632", "label_633", "label_634", "label_635", "label_636", "label_637", "label_638", "label_639", "label_640", "label_641", "label_642", "label_643", "label_644", "label_645", "label_646", "label_647", "label_648", "label_649", "label_650", "label_651", "label_652", "label_653", "label_654", "label_655", "label_656", "label_657", "label_658", "label_659", "label_660", "label_661", "label_662", "label_663", "label_664", "label_665", "label_666", "label_667", "label_668", "label_669", "label_670", "label_671", "label_672", "label_673", "label_674", "label_675", "label_676", "label_677", "label_678", "label_679", "label_680", "label_681", "label_682", "label_683", "label_684", "label_685", "label_686", "label_687", "label_688", "label_689", "label_690", "label_691", "label_692", "label_693", "label_694", "label_695", "label_696", "label_697", "label_698", "label_699", "label_700", "label_701", "label_702", "label_703", "label_704", "label_705", "label_706", "label_707", "label_708", "label_709", "label_710", "label_711", "label_712", "label_713", "label_714", "label_715", "label_716", "label_717", "label_718", "label_719", "label_720", "label_721", "label_722", "label_723", "label_724", "label_725", "label_726", "label_727", "label_728", "label_729", "label_730", "label_731", "label_732", "label_733", "label_734", "label_735", "label_736", "label_737", "label_738", "label_739", "label_740", "label_741", "label_742", "label_743", "label_744", "label_745", "label_746", "label_747", "label_748", "label_749", "label_750", "label_751", "label_752", "label_753", "label_754", "label_755", "label_756", "label_757", "label_758", "label_759", "label_760", "label_761", "label_762", "label_763", "label_764", "label_765", "label_766", "label_767", "label_768", "label_769", "label_770", "label_771", "label_772", "label_773", "label_774", "label_775", "label_776", "label_777", "label_778", "label_779", "label_780", "label_781", "label_782", "label_783", "label_784", "label_785", "label_786", "label_787", "label_788", "label_789", "label_790", "label_791", "label_792", "label_793", "label_794", "label_795", "label_796", "label_797", "label_798", "label_799", "label_800", "label_801", "label_802", "label_803", "label_804", "label_805", "label_806", "label_807", "label_808", "label_809", "label_810", "label_811", "label_812", "label_813", "label_814", "label_815", "label_816", "label_817", "label_818", "label_819", "label_820", "label_821", "label_822", "label_823", "label_824", "label_825", "label_826", 
"label_827", "label_828", "label_829", "label_830", "label_831", "label_832", "label_833", "label_834", "label_835", "label_836", "label_837", "label_838", "label_839", "label_840", "label_841", "label_842", "label_843", "label_844", "label_845", "label_846", "label_847", "label_848", "label_849", "label_850", "label_851", "label_852", "label_853", "label_854", "label_855", "label_856", "label_857", "label_858", "label_859", "label_860", "label_861", "label_862", "label_863", "label_864", "label_865", "label_866", "label_867", "label_868", "label_869", "label_870", "label_871", "label_872", "label_873", "label_874", "label_875", "label_876", "label_877", "label_878", "label_879", "label_880", "label_881", "label_882", "label_883", "label_884", "label_885", "label_886", "label_887", "label_888", "label_889", "label_890", "label_891", "label_892", "label_893", "label_894", "label_895", "label_896", "label_897", "label_898", "label_899", "label_900", "label_901", "label_902", "label_903", "label_904", "label_905", "label_906", "label_907", "label_908", "label_909", "label_910", "label_911", "label_912", "label_913", "label_914", "label_915", "label_916", "label_917", "label_918", "label_919", "label_920", "label_921", "label_922", "label_923", "label_924", "label_925", "label_926", "label_927", "label_928", "label_929", "label_930", "label_931", "label_932", "label_933", "label_934", "label_935", "label_936", "label_937", "label_938", "label_939", "label_940", "label_941", "label_942", "label_943", "label_944", "label_945", "label_946", "label_947", "label_948", "label_949", "label_950", "label_951", "label_952", "label_953", "label_954", "label_955", "label_956", "label_957", "label_958", "label_959", "label_960", "label_961", "label_962", "label_963", "label_964", "label_965", "label_966", "label_967", "label_968", "label_969", "label_970", "label_971", "label_972", "label_973", "label_974", "label_975", "label_976", "label_977", "label_978", "label_979", "label_980", "label_981", "label_982", "label_983", "label_984", "label_985", "label_986", "label_987", "label_988", "label_989", "label_990", "label_991", "label_992", "label_993", "label_994", "label_995", "label_996", "label_997", "label_998", "label_999", "label_1000", "label_1001", "label_1002", "label_1003", "label_1004", "label_1005", "label_1006", "label_1007", "label_1008", "label_1009", "label_1010", "label_1011", "label_1012", "label_1013", "label_1014", "label_1015", "label_1016", "label_1017", "label_1018", "label_1019", "label_1020", "label_1021", "label_1022", "label_1023", "label_1024", "label_1025", "label_1026", "label_1027", "label_1028", "label_1029", "label_1030", "label_1031", "label_1032", "label_1033", "label_1034", "label_1035", "label_1036", "label_1037", "label_1038", "label_1039", "label_1040", "label_1041", "label_1042", "label_1043", "label_1044", "label_1045", "label_1046", "label_1047", "label_1048", "label_1049", "label_1050", "label_1051", "label_1052", "label_1053", "label_1054", "label_1055", "label_1056", "label_1057", "label_1058", "label_1059", "label_1060", "label_1061", "label_1062", "label_1063", "label_1064", "label_1065", "label_1066", "label_1067", "label_1068", "label_1069", "label_1070", "label_1071", "label_1072", "label_1073", "label_1074", "label_1075", "label_1076", "label_1077", "label_1078", "label_1079", "label_1080", "label_1081", "label_1082", "label_1083", "label_1084", "label_1085", "label_1086", "label_1087", "label_1088", "label_1089", "label_1090", "label_1091", "label_1092", 
"label_1093", "label_1094", "label_1095", "label_1096", "label_1097", "label_1098", "label_1099", "label_1100", "label_1101", "label_1102", "label_1103", "label_1104", "label_1105", "label_1106", "label_1107", "label_1108", "label_1109", "label_1110", "label_1111", "label_1112", "label_1113", "label_1114", "label_1115", "label_1116", "label_1117", "label_1118", "label_1119", "label_1120", "label_1121", "label_1122", "label_1123", "label_1124", "label_1125", "label_1126", "label_1127", "label_1128", "label_1129", "label_1130", "label_1131", "label_1132", "label_1133", "label_1134", "label_1135", "label_1136", "label_1137", "label_1138", "label_1139", "label_1140", "label_1141", "label_1142", "label_1143", "label_1144", "label_1145", "label_1146", "label_1147", "label_1148", "label_1149", "label_1150", "label_1151", "label_1152", "label_1153", "label_1154", "label_1155", "label_1156", "label_1157", "label_1158", "label_1159", "label_1160", "label_1161", "label_1162", "label_1163", "label_1164", "label_1165", "label_1166", "label_1167", "label_1168", "label_1169", "label_1170", "label_1171", "label_1172", "label_1173", "label_1174", "label_1175", "label_1176", "label_1177", "label_1178", "label_1179", "label_1180", "label_1181", "label_1182", "label_1183", "label_1184", "label_1185", "label_1186", "label_1187", "label_1188", "label_1189", "label_1190", "label_1191", "label_1192", "label_1193", "label_1194", "label_1195", "label_1196", "label_1197", "label_1198", "label_1199", "label_1200", "label_1201", "label_1202", "label_1203", "label_1204", "label_1205", "label_1206", "label_1207", "label_1208", "label_1209", "label_1210", "label_1211", "label_1212", "label_1213", "label_1214", "label_1215", "label_1216", "label_1217", "label_1218", "label_1219", "label_1220", "label_1221", "label_1222", "label_1223", "label_1224", "label_1225", "label_1226", "label_1227", "label_1228", "label_1229", "label_1230", "label_1231", "label_1232", "label_1233", "label_1234", "label_1235", "label_1236", "label_1237", "label_1238", "label_1239", "label_1240", "label_1241", "label_1242", "label_1243", "label_1244", "label_1245", "label_1246", "label_1247", "label_1248", "label_1249", "label_1250", "label_1251", "label_1252", "label_1253", "label_1254", "label_1255", "label_1256", "label_1257", "label_1258", "label_1259", "label_1260", "label_1261", "label_1262", "label_1263", "label_1264", "label_1265", "label_1266", "label_1267", "label_1268", "label_1269", "label_1270", "label_1271", "label_1272", "label_1273", "label_1274", "label_1275", "label_1276", "label_1277", "label_1278", "label_1279", "label_1280", "label_1281", "label_1282", "label_1283", "label_1284", "label_1285", "label_1286", "label_1287", "label_1288", "label_1289", "label_1290", "label_1291", "label_1292", "label_1293", "label_1294", "label_1295", "label_1296", "label_1297", "label_1298", "label_1299", "label_1300", "label_1301", "label_1302", "label_1303", "label_1304", "label_1305", "label_1306", "label_1307", "label_1308", "label_1309", "label_1310", "label_1311", "label_1312", "label_1313", "label_1314", "label_1315", "label_1316", "label_1317", "label_1318", "label_1319", "label_1320", "label_1321", "label_1322", "label_1323", "label_1324", "label_1325", "label_1326", "label_1327", "label_1328", "label_1329", "label_1330", "label_1331", "label_1332", "label_1333", "label_1334", "label_1335", "label_1336", "label_1337", "label_1338", "label_1339", "label_1340", "label_1341", "label_1342", "label_1343", "label_1344", "label_1345", 
"label_1346", "label_1347", "label_1348", "label_1349", "label_1350", "label_1351", "label_1352", "label_1353", "label_1354", "label_1355", "label_1356", "label_1357", "label_1358", "label_1359", "label_1360", "label_1361", "label_1362", "label_1363", "label_1364", "label_1365", "label_1366", "label_1367", "label_1368", "label_1369", "label_1370", "label_1371", "label_1372", "label_1373", "label_1374", "label_1375", "label_1376", "label_1377", "label_1378", "label_1379", "label_1380", "label_1381", "label_1382", "label_1383", "label_1384", "label_1385", "label_1386", "label_1387", "label_1388", "label_1389", "label_1390", "label_1391", "label_1392", "label_1393", "label_1394", "label_1395", "label_1396", "label_1397", "label_1398", "label_1399", "label_1400", "label_1401", "label_1402", "label_1403", "label_1404", "label_1405", "label_1406", "label_1407", "label_1408", "label_1409", "label_1410", "label_1411", "label_1412", "label_1413", "label_1414", "label_1415", "label_1416", "label_1417", "label_1418", "label_1419", "label_1420", "label_1421", "label_1422", "label_1423", "label_1424", "label_1425", "label_1426", "label_1427", "label_1428", "label_1429", "label_1430", "label_1431", "label_1432", "label_1433", "label_1434", "label_1435", "label_1436", "label_1437", "label_1438", "label_1439", "label_1440", "label_1441", "label_1442", "label_1443", "label_1444", "label_1445", "label_1446", "label_1447", "label_1448", "label_1449", "label_1450", "label_1451", "label_1452", "label_1453", "label_1454", "label_1455", "label_1456", "label_1457", "label_1458", "label_1459", "label_1460", "label_1461", "label_1462", "label_1463", "label_1464", "label_1465", "label_1466", "label_1467", "label_1468", "label_1469", "label_1470", "label_1471", "label_1472", "label_1473", "label_1474", "label_1475", "label_1476", "label_1477", "label_1478", "label_1479", "label_1480", "label_1481", "label_1482", "label_1483", "label_1484", "label_1485", "label_1486", "label_1487", "label_1488", "label_1489", "label_1490", "label_1491", "label_1492", "label_1493", "label_1494", "label_1495", "label_1496", "label_1497", "label_1498", "label_1499", "label_1500", "label_1501", "label_1502", "label_1503", "label_1504", "label_1505", "label_1506", "label_1507", "label_1508", "label_1509", "label_1510", "label_1511", "label_1512", "label_1513", "label_1514", "label_1515", "label_1516", "label_1517", "label_1518", "label_1519", "label_1520", "label_1521", "label_1522", "label_1523", "label_1524", "label_1525", "label_1526", "label_1527", "label_1528", "label_1529", "label_1530", "label_1531", "label_1532", "label_1533", "label_1534", "label_1535", "label_1536", "label_1537", "label_1538", "label_1539", "label_1540", "label_1541", "label_1542", "label_1543", "label_1544", "label_1545", "label_1546", "label_1547", "label_1548", "label_1549", "label_1550", "label_1551", "label_1552", "label_1553", "label_1554", "label_1555", "label_1556", "label_1557", "label_1558", "label_1559", "label_1560", "label_1561", "label_1562", "label_1563", "label_1564", "label_1565", "label_1566", "label_1567", "label_1568", "label_1569", "label_1570", "label_1571", "label_1572", "label_1573", "label_1574", "label_1575", "label_1576", "label_1577", "label_1578", "label_1579", "label_1580", "label_1581", "label_1582", "label_1583", "label_1584", "label_1585", "label_1586", "label_1587", "label_1588", "label_1589", "label_1590", "label_1591", "label_1592", "label_1593", "label_1594", "label_1595", "label_1596", "label_1597", "label_1598", 
"label_1599", "label_1600", "label_1601", "label_1602", "label_1603", "label_1604", "label_1605", "label_1606", "label_1607", "label_1608", "label_1609", "label_1610", "label_1611", "label_1612", "label_1613", "label_1614", "label_1615", "label_1616", "label_1617", "label_1618", "label_1619", "label_1620", "label_1621", "label_1622", "label_1623", "label_1624", "label_1625", "label_1626", "label_1627", "label_1628", "label_1629", "label_1630", "label_1631", "label_1632", "label_1633", "label_1634", "label_1635", "label_1636", "label_1637", "label_1638", "label_1639", "label_1640", "label_1641", "label_1642", "label_1643", "label_1644", "label_1645", "label_1646", "label_1647", "label_1648", "label_1649", "label_1650", "label_1651", "label_1652", "label_1653", "label_1654", "label_1655", "label_1656", "label_1657", "label_1658", "label_1659", "label_1660", "label_1661", "label_1662", "label_1663", "label_1664", "label_1665", "label_1666", "label_1667", "label_1668", "label_1669", "label_1670", "label_1671", "label_1672", "label_1673", "label_1674", "label_1675", "label_1676", "label_1677", "label_1678", "label_1679", "label_1680", "label_1681", "label_1682", "label_1683", "label_1684", "label_1685", "label_1686", "label_1687", "label_1688", "label_1689", "label_1690", "label_1691", "label_1692", "label_1693", "label_1694", "label_1695", "label_1696", "label_1697", "label_1698", "label_1699", "label_1700", "label_1701", "label_1702", "label_1703", "label_1704", "label_1705", "label_1706", "label_1707", "label_1708", "label_1709", "label_1710", "label_1711", "label_1712", "label_1713", "label_1714", "label_1715", "label_1716", "label_1717", "label_1718", "label_1719", "label_1720", "label_1721", "label_1722", "label_1723", "label_1724", "label_1725", "label_1726", "label_1727", "label_1728", "label_1729", "label_1730", "label_1731", "label_1732", "label_1733", "label_1734", "label_1735", "label_1736", "label_1737", "label_1738", "label_1739", "label_1740", "label_1741", "label_1742", "label_1743", "label_1744", "label_1745", "label_1746", "label_1747", "label_1748", "label_1749", "label_1750", "label_1751", "label_1752", "label_1753", "label_1754", "label_1755", "label_1756", "label_1757", "label_1758", "label_1759", "label_1760", "label_1761", "label_1762", "label_1763", "label_1764", "label_1765", "label_1766", "label_1767", "label_1768", "label_1769", "label_1770", "label_1771", "label_1772", "label_1773", "label_1774", "label_1775", "label_1776", "label_1777", "label_1778", "label_1779", "label_1780", "label_1781", "label_1782", "label_1783", "label_1784", "label_1785", "label_1786", "label_1787", "label_1788", "label_1789", "label_1790", "label_1791", "label_1792", "label_1793", "label_1794", "label_1795", "label_1796", "label_1797", "label_1798", "label_1799", "label_1800", "label_1801", "label_1802", "label_1803", "label_1804", "label_1805", "label_1806", "label_1807", "label_1808", "label_1809", "label_1810", "label_1811", "label_1812", "label_1813", "label_1814", "label_1815", "label_1816", "label_1817", "label_1818", "label_1819", "label_1820", "label_1821", "label_1822", "label_1823", "label_1824", "label_1825", "label_1826", "label_1827", "label_1828", "label_1829", "label_1830", "label_1831", "label_1832", "label_1833", "label_1834", "label_1835", "label_1836", "label_1837", "label_1838", "label_1839", "label_1840", "label_1841", "label_1842", "label_1843", "label_1844", "label_1845", "label_1846", "label_1847", "label_1848", "label_1849", "label_1850", "label_1851", 
"label_1852", "label_1853", "label_1854", "label_1855", "label_1856", "label_1857", "label_1858", "label_1859", "label_1860", "label_1861", "label_1862", "label_1863", "label_1864", "label_1865", "label_1866", "label_1867", "label_1868", "label_1869", "label_1870", "label_1871", "label_1872", "label_1873", "label_1874", "label_1875", "label_1876", "label_1877", "label_1878", "label_1879", "label_1880", "label_1881", "label_1882", "label_1883", "label_1884", "label_1885", "label_1886", "label_1887", "label_1888", "label_1889", "label_1890", "label_1891", "label_1892", "label_1893", "label_1894", "label_1895", "label_1896", "label_1897", "label_1898", "label_1899", "label_1900", "label_1901", "label_1902", "label_1903", "label_1904", "label_1905", "label_1906", "label_1907", "label_1908", "label_1909", "label_1910", "label_1911", "label_1912", "label_1913", "label_1914", "label_1915", "label_1916", "label_1917", "label_1918", "label_1919", "label_1920", "label_1921", "label_1922", "label_1923", "label_1924", "label_1925", "label_1926", "label_1927", "label_1928", "label_1929", "label_1930", "label_1931", "label_1932", "label_1933", "label_1934", "label_1935", "label_1936", "label_1937", "label_1938", "label_1939", "label_1940", "label_1941", "label_1942", "label_1943", "label_1944", "label_1945", "label_1946", "label_1947", "label_1948", "label_1949", "label_1950", "label_1951", "label_1952", "label_1953", "label_1954", "label_1955", "label_1956", "label_1957", "label_1958", "label_1959", "label_1960", "label_1961", "label_1962", "label_1963", "label_1964", "label_1965", "label_1966", "label_1967", "label_1968", "label_1969", "label_1970", "label_1971", "label_1972", "label_1973", "label_1974", "label_1975", "label_1976", "label_1977", "label_1978", "label_1979", "label_1980", "label_1981", "label_1982", "label_1983", "label_1984", "label_1985", "label_1986", "label_1987", "label_1988", "label_1989", "label_1990", "label_1991", "label_1992", "label_1993", "label_1994", "label_1995", "label_1996", "label_1997", "label_1998", "label_1999", "label_2000", "label_2001", "label_2002", "label_2003", "label_2004", "label_2005", "label_2006", "label_2007", "label_2008", "label_2009", "label_2010", "label_2011", "label_2012", "label_2013", "label_2014", "label_2015", "label_2016", "label_2017", "label_2018", "label_2019", "label_2020", "label_2021", "label_2022", "label_2023", "label_2024", "label_2025", "label_2026", "label_2027", "label_2028", "label_2029", "label_2030", "label_2031", "label_2032", "label_2033", "label_2034", "label_2035", "label_2036", "label_2037", "label_2038", "label_2039", "label_2040", "label_2041", "label_2042", "label_2043", "label_2044", "label_2045", "label_2046", "label_2047", "label_2048", "label_2049", "label_2050", "label_2051", "label_2052", "label_2053", "label_2054", "label_2055", "label_2056", "label_2057", "label_2058", "label_2059", "label_2060", "label_2061", "label_2062", "label_2063", "label_2064", "label_2065", "label_2066", "label_2067", "label_2068", "label_2069", "label_2070", "label_2071", "label_2072", "label_2073", "label_2074", "label_2075", "label_2076", "label_2077", "label_2078", "label_2079", "label_2080", "label_2081", "label_2082", "label_2083", "label_2084", "label_2085", "label_2086", "label_2087", "label_2088", "label_2089", "label_2090", "label_2091", "label_2092", "label_2093", "label_2094", "label_2095", "label_2096", "label_2097", "label_2098", "label_2099", "label_2100", "label_2101", "label_2102", "label_2103", "label_2104", 
"label_2105", "label_2106", "label_2107", "label_2108", "label_2109", "label_2110", "label_2111", "label_2112", "label_2113", "label_2114", "label_2115", "label_2116", "label_2117", "label_2118", "label_2119", "label_2120", "label_2121", "label_2122", "label_2123", "label_2124", "label_2125", "label_2126", "label_2127", "label_2128", "label_2129", "label_2130", "label_2131", "label_2132", "label_2133", "label_2134", "label_2135", "label_2136", "label_2137", "label_2138", "label_2139", "label_2140", "label_2141", "label_2142", "label_2143", "label_2144", "label_2145", "label_2146", "label_2147", "label_2148", "label_2149", "label_2150", "label_2151", "label_2152", "label_2153", "label_2154", "label_2155", "label_2156", "label_2157", "label_2158", "label_2159", "label_2160", "label_2161", "label_2162", "label_2163", "label_2164", "label_2165", "label_2166", "label_2167", "label_2168", "label_2169", "label_2170", "label_2171", "label_2172", "label_2173", "label_2174", "label_2175", "label_2176", "label_2177", "label_2178", "label_2179", "label_2180", "label_2181", "label_2182", "label_2183", "label_2184", "label_2185", "label_2186", "label_2187", "label_2188", "label_2189", "label_2190", "label_2191", "label_2192", "label_2193", "label_2194", "label_2195", "label_2196", "label_2197", "label_2198", "label_2199", "label_2200", "label_2201", "label_2202", "label_2203", "label_2204", "label_2205", "label_2206", "label_2207", "label_2208", "label_2209", "label_2210", "label_2211", "label_2212", "label_2213", "label_2214", "label_2215", "label_2216", "label_2217", "label_2218", "label_2219", "label_2220", "label_2221", "label_2222", "label_2223", "label_2224", "label_2225", "label_2226", "label_2227", "label_2228", "label_2229", "label_2230", "label_2231", "label_2232", "label_2233", "label_2234", "label_2235", "label_2236", "label_2237", "label_2238", "label_2239", "label_2240", "label_2241", "label_2242", "label_2243", "label_2244", "label_2245", "label_2246", "label_2247", "label_2248", "label_2249", "label_2250", "label_2251", "label_2252", "label_2253", "label_2254", "label_2255", "label_2256", "label_2257", "label_2258", "label_2259", "label_2260", "label_2261", "label_2262", "label_2263", "label_2264", "label_2265", "label_2266", "label_2267", "label_2268", "label_2269", "label_2270", "label_2271", "label_2272", "label_2273", "label_2274", "label_2275", "label_2276", "label_2277", "label_2278", "label_2279", "label_2280", "label_2281", "label_2282", "label_2283", "label_2284", "label_2285", "label_2286", "label_2287", "label_2288", "label_2289", "label_2290", "label_2291", "label_2292", "label_2293", "label_2294", "label_2295", "label_2296", "label_2297", "label_2298", "label_2299", "label_2300", "label_2301", "label_2302", "label_2303", "label_2304", "label_2305", "label_2306", "label_2307", "label_2308", "label_2309", "label_2310", "label_2311", "label_2312", "label_2313", "label_2314", "label_2315", "label_2316", "label_2317", "label_2318", "label_2319", "label_2320", "label_2321", "label_2322", "label_2323", "label_2324", "label_2325", "label_2326", "label_2327", "label_2328", "label_2329", "label_2330", "label_2331", "label_2332", "label_2333", "label_2334", "label_2335", "label_2336", "label_2337", "label_2338", "label_2339", "label_2340", "label_2341", "label_2342", "label_2343", "label_2344", "label_2345", "label_2346", "label_2347", "label_2348", "label_2349", "label_2350", "label_2351", "label_2352", "label_2353", "label_2354", "label_2355", "label_2356", "label_2357", 
"label_2358", "label_2359", "label_2360", "label_2361", "label_2362", "label_2363", "label_2364", "label_2365", "label_2366", "label_2367", "label_2368", "label_2369", "label_2370", "label_2371", "label_2372", "label_2373", "label_2374", "label_2375", "label_2376", "label_2377", "label_2378", "label_2379", "label_2380", "label_2381", "label_2382", "label_2383", "label_2384", "label_2385", "label_2386", "label_2387", "label_2388", "label_2389", "label_2390", "label_2391", "label_2392", "label_2393", "label_2394", "label_2395", "label_2396", "label_2397", "label_2398", "label_2399", "label_2400", "label_2401", "label_2402", "label_2403", "label_2404", "label_2405", "label_2406", "label_2407", "label_2408", "label_2409", "label_2410", "label_2411", "label_2412", "label_2413", "label_2414", "label_2415", "label_2416", "label_2417", "label_2418", "label_2419", "label_2420", "label_2421", "label_2422", "label_2423", "label_2424", "label_2425", "label_2426", "label_2427", "label_2428", "label_2429", "label_2430", "label_2431", "label_2432", "label_2433", "label_2434", "label_2435", "label_2436", "label_2437", "label_2438", "label_2439", "label_2440", "label_2441", "label_2442", "label_2443", "label_2444", "label_2445", "label_2446", "label_2447", "label_2448", "label_2449", "label_2450", "label_2451", "label_2452", "label_2453", "label_2454", "label_2455", "label_2456", "label_2457", "label_2458", "label_2459", "label_2460", "label_2461", "label_2462", "label_2463", "label_2464", "label_2465", "label_2466", "label_2467", "label_2468", "label_2469", "label_2470", "label_2471", "label_2472", "label_2473", "label_2474", "label_2475", "label_2476", "label_2477", "label_2478", "label_2479", "label_2480", "label_2481", "label_2482", "label_2483", "label_2484", "label_2485", "label_2486", "label_2487", "label_2488", "label_2489", "label_2490", "label_2491", "label_2492", "label_2493", "label_2494", "label_2495", "label_2496", "label_2497", "label_2498", "label_2499", "label_2500", "label_2501", "label_2502", "label_2503", "label_2504", "label_2505", "label_2506", "label_2507", "label_2508", "label_2509", "label_2510", "label_2511", "label_2512", "label_2513", "label_2514", "label_2515", "label_2516", "label_2517", "label_2518", "label_2519", "label_2520", "label_2521", "label_2522", "label_2523", "label_2524", "label_2525", "label_2526", "label_2527", "label_2528", "label_2529", "label_2530", "label_2531", "label_2532", "label_2533", "label_2534", "label_2535", "label_2536", "label_2537", "label_2538", "label_2539", "label_2540", "label_2541", "label_2542", "label_2543", "label_2544", "label_2545", "label_2546", "label_2547", "label_2548", "label_2549", "label_2550", "label_2551", "label_2552", "label_2553", "label_2554", "label_2555", "label_2556", "label_2557", "label_2558", "label_2559", "label_2560", "label_2561", "label_2562", "label_2563", "label_2564", "label_2565", "label_2566", "label_2567", "label_2568", "label_2569", "label_2570", "label_2571", "label_2572", "label_2573", "label_2574", "label_2575", "label_2576", "label_2577", "label_2578", "label_2579", "label_2580", "label_2581", "label_2582", "label_2583", "label_2584", "label_2585", "label_2586", "label_2587", "label_2588", "label_2589", "label_2590", "label_2591", "label_2592", "label_2593", "label_2594", "label_2595", "label_2596", "label_2597", "label_2598", "label_2599", "label_2600", "label_2601", "label_2602", "label_2603", "label_2604", "label_2605", "label_2606", "label_2607", "label_2608", "label_2609", "label_2610", 
"label_2611", "label_2612", "label_2613", "label_2614", "label_2615", "label_2616", "label_2617", "label_2618", "label_2619", "label_2620", "label_2621", "label_2622", "label_2623", "label_2624", "label_2625", "label_2626", "label_2627", "label_2628", "label_2629", "label_2630", "label_2631", "label_2632", "label_2633", "label_2634", "label_2635", "label_2636", "label_2637", "label_2638", "label_2639", "label_2640", "label_2641", "label_2642", "label_2643", "label_2644", "label_2645", "label_2646", "label_2647", "label_2648", "label_2649", "label_2650", "label_2651", "label_2652", "label_2653", "label_2654", "label_2655", "label_2656", "label_2657", "label_2658", "label_2659", "label_2660", "label_2661", "label_2662", "label_2663", "label_2664", "label_2665", "label_2666", "label_2667", "label_2668", "label_2669", "label_2670", "label_2671", "label_2672", "label_2673", "label_2674", "label_2675", "label_2676", "label_2677", "label_2678", "label_2679", "label_2680", "label_2681", "label_2682", "label_2683", "label_2684", "label_2685", "label_2686", "label_2687", "label_2688", "label_2689", "label_2690", "label_2691", "label_2692", "label_2693", "label_2694", "label_2695", "label_2696", "label_2697", "label_2698", "label_2699", "label_2700", "label_2701", "label_2702", "label_2703", "label_2704", "label_2705", "label_2706", "label_2707", "label_2708", "label_2709", "label_2710", "label_2711", "label_2712", "label_2713", "label_2714", "label_2715", "label_2716", "label_2717", "label_2718", "label_2719", "label_2720", "label_2721", "label_2722", "label_2723", "label_2724", "label_2725", "label_2726", "label_2727", "label_2728", "label_2729", "label_2730", "label_2731", "label_2732", "label_2733", "label_2734", "label_2735", "label_2736", "label_2737", "label_2738", "label_2739", "label_2740", "label_2741", "label_2742", "label_2743", "label_2744", "label_2745", "label_2746", "label_2747", "label_2748", "label_2749", "label_2750", "label_2751", "label_2752", "label_2753", "label_2754", "label_2755", "label_2756", "label_2757", "label_2758", "label_2759", "label_2760", "label_2761", "label_2762", "label_2763", "label_2764", "label_2765", "label_2766", "label_2767", "label_2768", "label_2769", "label_2770", "label_2771", "label_2772", "label_2773", "label_2774", "label_2775", "label_2776", "label_2777", "label_2778", "label_2779", "label_2780", "label_2781", "label_2782", "label_2783", "label_2784", "label_2785", "label_2786", "label_2787", "label_2788", "label_2789", "label_2790", "label_2791", "label_2792", "label_2793", "label_2794", "label_2795", "label_2796", "label_2797", "label_2798", "label_2799", "label_2800", "label_2801", "label_2802", "label_2803", "label_2804", "label_2805", "label_2806", "label_2807", "label_2808", "label_2809", "label_2810", "label_2811", "label_2812", "label_2813", "label_2814", "label_2815", "label_2816", "label_2817", "label_2818", "label_2819", "label_2820", "label_2821", "label_2822", "label_2823", "label_2824", "label_2825", "label_2826", "label_2827", "label_2828", "label_2829", "label_2830", "label_2831", "label_2832", "label_2833", "label_2834", "label_2835", "label_2836", "label_2837", "label_2838", "label_2839", "label_2840", "label_2841", "label_2842", "label_2843", "label_2844", "label_2845", "label_2846", "label_2847", "label_2848", "label_2849", "label_2850", "label_2851", "label_2852", "label_2853", "label_2854", "label_2855", "label_2856", "label_2857", "label_2858", "label_2859", "label_2860", "label_2861", "label_2862", "label_2863", 
"label_2864", "label_2865", "label_2866", "label_2867", "label_2868", "label_2869", "label_2870", "label_2871", "label_2872", "label_2873", "label_2874", "label_2875", "label_2876", "label_2877", "label_2878", "label_2879", "label_2880", "label_2881", "label_2882", "label_2883", "label_2884", "label_2885", "label_2886", "label_2887", "label_2888", "label_2889", "label_2890", "label_2891", "label_2892", "label_2893", "label_2894", "label_2895", "label_2896", "label_2897", "label_2898", "label_2899", "label_2900", "label_2901", "label_2902", "label_2903", "label_2904", "label_2905", "label_2906", "label_2907", "label_2908", "label_2909", "label_2910", "label_2911", "label_2912", "label_2913", "label_2914", "label_2915", "label_2916", "label_2917", "label_2918", "label_2919", "label_2920", "label_2921", "label_2922", "label_2923", "label_2924", "label_2925", "label_2926", "label_2927", "label_2928", "label_2929", "label_2930", "label_2931", "label_2932", "label_2933", "label_2934", "label_2935", "label_2936", "label_2937", "label_2938", "label_2939", "label_2940", "label_2941", "label_2942", "label_2943", "label_2944", "label_2945", "label_2946", "label_2947", "label_2948", "label_2949", "label_2950", "label_2951", "label_2952", "label_2953", "label_2954", "label_2955", "label_2956", "label_2957", "label_2958", "label_2959", "label_2960", "label_2961", "label_2962", "label_2963", "label_2964", "label_2965", "label_2966", "label_2967", "label_2968", "label_2969", "label_2970", "label_2971", "label_2972", "label_2973", "label_2974", "label_2975", "label_2976", "label_2977", "label_2978", "label_2979", "label_2980", "label_2981", "label_2982", "label_2983", "label_2984", "label_2985", "label_2986", "label_2987", "label_2988", "label_2989", "label_2990", "label_2991", "label_2992", "label_2993", "label_2994", "label_2995", "label_2996", "label_2997", "label_2998", "label_2999", "label_3000", "label_3001", "label_3002", "label_3003", "label_3004", "label_3005", "label_3006", "label_3007", "label_3008", "label_3009", "label_3010", "label_3011", "label_3012", "label_3013", "label_3014", "label_3015", "label_3016", "label_3017", "label_3018", "label_3019", "label_3020", "label_3021", "label_3022", "label_3023", "label_3024", "label_3025", "label_3026", "label_3027", "label_3028", "label_3029", "label_3030", "label_3031", "label_3032", "label_3033", "label_3034", "label_3035", "label_3036", "label_3037", "label_3038", "label_3039", "label_3040", "label_3041", "label_3042", "label_3043", "label_3044", "label_3045", "label_3046", "label_3047", "label_3048", "label_3049", "label_3050", "label_3051", "label_3052", "label_3053", "label_3054", "label_3055", "label_3056", "label_3057", "label_3058", "label_3059", "label_3060", "label_3061", "label_3062", "label_3063", "label_3064", "label_3065", "label_3066", "label_3067", "label_3068", "label_3069", "label_3070", "label_3071", "label_3072", "label_3073", "label_3074", "label_3075", "label_3076", "label_3077", "label_3078", "label_3079", "label_3080", "label_3081", "label_3082", "label_3083", "label_3084", "label_3085", "label_3086", "label_3087", "label_3088", "label_3089", "label_3090", "label_3091", "label_3092", "label_3093", "label_3094", "label_3095", "label_3096", "label_3097", "label_3098", "label_3099", "label_3100", "label_3101", "label_3102", "label_3103", "label_3104", "label_3105", "label_3106", "label_3107", "label_3108", "label_3109", "label_3110", "label_3111", "label_3112", "label_3113", "label_3114", "label_3115", "label_3116", 
"label_3117", "label_3118", "label_3119", "label_3120", "label_3121", "label_3122", "label_3123", "label_3124", "label_3125", "label_3126", "label_3127", "label_3128", "label_3129", "label_3130", "label_3131", "label_3132", "label_3133", "label_3134", "label_3135", "label_3136", "label_3137", "label_3138", "label_3139", "label_3140", "label_3141", "label_3142", "label_3143", "label_3144", "label_3145", "label_3146", "label_3147", "label_3148", "label_3149", "label_3150", "label_3151", "label_3152", "label_3153", "label_3154", "label_3155", "label_3156", "label_3157", "label_3158", "label_3159", "label_3160", "label_3161", "label_3162", "label_3163", "label_3164", "label_3165", "label_3166", "label_3167", "label_3168", "label_3169", "label_3170", "label_3171", "label_3172", "label_3173", "label_3174", "label_3175", "label_3176", "label_3177", "label_3178", "label_3179", "label_3180", "label_3181", "label_3182", "label_3183", "label_3184", "label_3185", "label_3186", "label_3187", "label_3188", "label_3189", "label_3190", "label_3191", "label_3192", "label_3193", "label_3194", "label_3195", "label_3196", "label_3197", "label_3198", "label_3199", "label_3200", "label_3201", "label_3202", "label_3203", "label_3204", "label_3205", "label_3206", "label_3207", "label_3208", "label_3209", "label_3210", "label_3211", "label_3212", "label_3213", "label_3214", "label_3215", "label_3216", "label_3217", "label_3218", "label_3219", "label_3220", "label_3221", "label_3222", "label_3223", "label_3224", "label_3225", "label_3226", "label_3227", "label_3228", "label_3229", "label_3230", "label_3231", "label_3232", "label_3233", "label_3234", "label_3235", "label_3236", "label_3237", "label_3238", "label_3239", "label_3240", "label_3241", "label_3242", "label_3243", "label_3244", "label_3245", "label_3246", "label_3247", "label_3248", "label_3249", "label_3250", "label_3251", "label_3252", "label_3253", "label_3254", "label_3255", "label_3256", "label_3257", "label_3258", "label_3259", "label_3260", "label_3261", "label_3262", "label_3263", "label_3264", "label_3265", "label_3266", "label_3267", "label_3268", "label_3269", "label_3270", "label_3271", "label_3272", "label_3273", "label_3274", "label_3275", "label_3276", "label_3277", "label_3278", "label_3279", "label_3280", "label_3281", "label_3282", "label_3283", "label_3284", "label_3285", "label_3286", "label_3287", "label_3288", "label_3289", "label_3290", "label_3291", "label_3292", "label_3293", "label_3294", "label_3295", "label_3296", "label_3297", "label_3298", "label_3299", "label_3300", "label_3301", "label_3302", "label_3303", "label_3304", "label_3305", "label_3306", "label_3307", "label_3308", "label_3309", "label_3310", "label_3311", "label_3312", "label_3313", "label_3314", "label_3315", "label_3316", "label_3317", "label_3318", "label_3319", "label_3320", "label_3321", "label_3322", "label_3323", "label_3324", "label_3325", "label_3326", "label_3327", "label_3328", "label_3329", "label_3330", "label_3331", "label_3332", "label_3333", "label_3334", "label_3335", "label_3336", "label_3337", "label_3338", "label_3339", "label_3340", "label_3341", "label_3342", "label_3343", "label_3344", "label_3345", "label_3346", "label_3347", "label_3348", "label_3349", "label_3350", "label_3351", "label_3352", "label_3353", "label_3354", "label_3355", "label_3356", "label_3357", "label_3358", "label_3359", "label_3360", "label_3361", "label_3362", "label_3363", "label_3364", "label_3365", "label_3366", "label_3367", "label_3368", "label_3369", 
"label_3370", "label_3371", "label_3372", "label_3373", "label_3374", "label_3375", "label_3376", "label_3377", "label_3378", "label_3379", "label_3380", "label_3381", "label_3382", "label_3383", "label_3384", "label_3385", "label_3386", "label_3387", "label_3388", "label_3389", "label_3390", "label_3391", "label_3392", "label_3393", "label_3394", "label_3395", "label_3396", "label_3397", "label_3398", "label_3399", "label_3400", "label_3401", "label_3402", "label_3403", "label_3404", "label_3405", "label_3406", "label_3407", "label_3408", "label_3409", "label_3410", "label_3411", "label_3412", "label_3413", "label_3414", "label_3415", "label_3416", "label_3417", "label_3418", "label_3419", "label_3420", "label_3421", "label_3422", "label_3423", "label_3424", "label_3425", "label_3426", "label_3427", "label_3428", "label_3429", "label_3430", "label_3431", "label_3432", "label_3433", "label_3434", "label_3435", "label_3436", "label_3437", "label_3438", "label_3439", "label_3440", "label_3441", "label_3442", "label_3443", "label_3444", "label_3445", "label_3446", "label_3447", "label_3448", "label_3449", "label_3450", "label_3451", "label_3452", "label_3453", "label_3454", "label_3455", "label_3456", "label_3457", "label_3458", "label_3459", "label_3460", "label_3461", "label_3462", "label_3463", "label_3464", "label_3465", "label_3466", "label_3467", "label_3468", "label_3469", "label_3470", "label_3471", "label_3472", "label_3473", "label_3474", "label_3475", "label_3476", "label_3477", "label_3478", "label_3479", "label_3480", "label_3481", "label_3482", "label_3483", "label_3484", "label_3485", "label_3486", "label_3487", "label_3488", "label_3489", "label_3490", "label_3491", "label_3492", "label_3493", "label_3494", "label_3495", "label_3496", "label_3497", "label_3498", "label_3499", "label_3500", "label_3501", "label_3502", "label_3503", "label_3504", "label_3505", "label_3506", "label_3507", "label_3508", "label_3509", "label_3510", "label_3511", "label_3512", "label_3513", "label_3514", "label_3515", "label_3516", "label_3517", "label_3518", "label_3519", "label_3520", "label_3521", "label_3522", "label_3523", "label_3524", "label_3525", "label_3526", "label_3527", "label_3528", "label_3529", "label_3530", "label_3531", "label_3532", "label_3533", "label_3534", "label_3535", "label_3536", "label_3537", "label_3538", "label_3539", "label_3540", "label_3541", "label_3542", "label_3543", "label_3544", "label_3545", "label_3546", "label_3547", "label_3548", "label_3549", "label_3550", "label_3551", "label_3552", "label_3553", "label_3554", "label_3555", "label_3556", "label_3557", "label_3558", "label_3559", "label_3560", "label_3561", "label_3562", "label_3563", "label_3564", "label_3565", "label_3566", "label_3567", "label_3568", "label_3569", "label_3570", "label_3571", "label_3572", "label_3573", "label_3574", "label_3575", "label_3576", "label_3577", "label_3578", "label_3579", "label_3580", "label_3581", "label_3582", "label_3583", "label_3584", "label_3585", "label_3586", "label_3587", "label_3588", "label_3589", "label_3590", "label_3591", "label_3592", "label_3593", "label_3594", "label_3595", "label_3596", "label_3597", "label_3598", "label_3599", "label_3600", "label_3601", "label_3602", "label_3603", "label_3604", "label_3605", "label_3606", "label_3607", "label_3608", "label_3609", "label_3610", "label_3611", "label_3612", "label_3613", "label_3614", "label_3615", "label_3616", "label_3617", "label_3618", "label_3619", "label_3620", "label_3621", "label_3622", 
"label_3623", "label_3624", "label_3625", "label_3626", "label_3627", "label_3628", "label_3629", "label_3630", "label_3631", "label_3632", "label_3633", "label_3634", "label_3635", "label_3636", "label_3637", "label_3638", "label_3639", "label_3640", "label_3641", "label_3642", "label_3643", "label_3644", "label_3645", "label_3646", "label_3647", "label_3648", "label_3649", "label_3650", "label_3651", "label_3652", "label_3653", "label_3654", "label_3655", "label_3656", "label_3657", "label_3658", "label_3659", "label_3660", "label_3661", "label_3662", "label_3663", "label_3664", "label_3665", "label_3666", "label_3667", "label_3668", "label_3669", "label_3670", "label_3671", "label_3672", "label_3673", "label_3674", "label_3675", "label_3676", "label_3677", "label_3678", "label_3679", "label_3680", "label_3681", "label_3682", "label_3683", "label_3684", "label_3685", "label_3686", "label_3687", "label_3688", "label_3689", "label_3690", "label_3691", "label_3692", "label_3693", "label_3694", "label_3695", "label_3696", "label_3697", "label_3698", "label_3699", "label_3700", "label_3701", "label_3702", "label_3703", "label_3704", "label_3705", "label_3706", "label_3707", "label_3708", "label_3709", "label_3710", "label_3711", "label_3712", "label_3713", "label_3714", "label_3715", "label_3716", "label_3717", "label_3718", "label_3719", "label_3720", "label_3721", "label_3722", "label_3723", "label_3724", "label_3725", "label_3726", "label_3727", "label_3728", "label_3729", "label_3730", "label_3731", "label_3732", "label_3733", "label_3734", "label_3735", "label_3736", "label_3737", "label_3738", "label_3739", "label_3740", "label_3741", "label_3742", "label_3743", "label_3744", "label_3745", "label_3746", "label_3747", "label_3748", "label_3749", "label_3750", "label_3751", "label_3752", "label_3753", "label_3754", "label_3755", "label_3756", "label_3757", "label_3758", "label_3759", "label_3760", "label_3761", "label_3762", "label_3763", "label_3764", "label_3765", "label_3766", "label_3767", "label_3768", "label_3769", "label_3770", "label_3771", "label_3772", "label_3773", "label_3774", "label_3775", "label_3776", "label_3777", "label_3778", "label_3779", "label_3780", "label_3781", "label_3782", "label_3783", "label_3784", "label_3785", "label_3786", "label_3787", "label_3788", "label_3789", "label_3790", "label_3791", "label_3792", "label_3793", "label_3794", "label_3795", "label_3796", "label_3797", "label_3798", "label_3799", "label_3800", "label_3801", "label_3802", "label_3803", "label_3804", "label_3805", "label_3806", "label_3807", "label_3808", "label_3809", "label_3810", "label_3811", "label_3812", "label_3813", "label_3814", "label_3815", "label_3816", "label_3817", "label_3818", "label_3819", "label_3820", "label_3821", "label_3822", "label_3823", "label_3824", "label_3825", "label_3826", "label_3827", "label_3828", "label_3829", "label_3830", "label_3831", "label_3832", "label_3833", "label_3834", "label_3835", "label_3836", "label_3837", "label_3838", "label_3839", "label_3840", "label_3841", "label_3842", "label_3843", "label_3844", "label_3845", "label_3846", "label_3847", "label_3848", "label_3849", "label_3850", "label_3851", "label_3852", "label_3853", "label_3854", "label_3855", "label_3856", "label_3857", "label_3858", "label_3859", "label_3860", "label_3861", "label_3862", "label_3863", "label_3864", "label_3865", "label_3866", "label_3867", "label_3868", "label_3869", "label_3870", "label_3871", "label_3872", "label_3873", "label_3874", "label_3875", 
"label_3876", "label_3877", "label_3878", "label_3879", "label_3880", "label_3881", "label_3882", "label_3883", "label_3884", "label_3885", "label_3886", "label_3887", "label_3888", "label_3889", "label_3890", "label_3891", "label_3892", "label_3893", "label_3894", "label_3895", "label_3896", "label_3897", "label_3898", "label_3899", "label_3900", "label_3901", "label_3902", "label_3903", "label_3904", "label_3905", "label_3906", "label_3907", "label_3908", "label_3909", "label_3910", "label_3911", "label_3912", "label_3913", "label_3914", "label_3915", "label_3916", "label_3917", "label_3918", "label_3919", "label_3920", "label_3921", "label_3922", "label_3923", "label_3924", "label_3925", "label_3926", "label_3927", "label_3928", "label_3929", "label_3930", "label_3931", "label_3932", "label_3933", "label_3934", "label_3935", "label_3936", "label_3937", "label_3938", "label_3939", "label_3940", "label_3941", "label_3942", "label_3943", "label_3944", "label_3945", "label_3946", "label_3947", "label_3948", "label_3949", "label_3950", "label_3951", "label_3952", "label_3953", "label_3954", "label_3955", "label_3956", "label_3957", "label_3958", "label_3959", "label_3960", "label_3961", "label_3962", "label_3963", "label_3964", "label_3965", "label_3966", "label_3967", "label_3968", "label_3969", "label_3970", "label_3971", "label_3972", "label_3973", "label_3974", "label_3975", "label_3976", "label_3977", "label_3978", "label_3979", "label_3980", "label_3981", "label_3982", "label_3983", "label_3984", "label_3985", "label_3986", "label_3987", "label_3988", "label_3989", "label_3990", "label_3991", "label_3992", "label_3993", "label_3994", "label_3995", "label_3996", "label_3997", "label_3998", "label_3999", "label_4000", "label_4001", "label_4002", "label_4003", "label_4004", "label_4005", "label_4006", "label_4007", "label_4008", "label_4009", "label_4010", "label_4011", "label_4012", "label_4013", "label_4014", "label_4015", "label_4016", "label_4017", "label_4018", "label_4019", "label_4020", "label_4021", "label_4022", "label_4023", "label_4024", "label_4025", "label_4026", "label_4027", "label_4028", "label_4029", "label_4030", "label_4031", "label_4032", "label_4033", "label_4034", "label_4035", "label_4036", "label_4037", "label_4038", "label_4039", "label_4040", "label_4041", "label_4042", "label_4043", "label_4044", "label_4045", "label_4046", "label_4047", "label_4048", "label_4049", "label_4050", "label_4051", "label_4052", "label_4053", "label_4054", "label_4055", "label_4056", "label_4057", "label_4058", "label_4059", "label_4060", "label_4061", "label_4062", "label_4063", "label_4064", "label_4065", "label_4066", "label_4067", "label_4068", "label_4069", "label_4070", "label_4071", "label_4072", "label_4073", "label_4074", "label_4075", "label_4076", "label_4077", "label_4078", "label_4079", "label_4080", "label_4081", "label_4082", "label_4083", "label_4084", "label_4085", "label_4086", "label_4087", "label_4088", "label_4089", "label_4090", "label_4091", "label_4092", "label_4093", "label_4094", "label_4095", "label_4096", "label_4097", "label_4098", "label_4099", "label_4100", "label_4101", "label_4102", "label_4103", "label_4104", "label_4105", "label_4106", "label_4107", "label_4108", "label_4109", "label_4110", "label_4111", "label_4112", "label_4113", "label_4114", "label_4115", "label_4116", "label_4117", "label_4118", "label_4119", "label_4120", "label_4121", "label_4122", "label_4123", "label_4124", "label_4125", "label_4126", "label_4127", "label_4128", 
"label_4129", "label_4130", "label_4131", "label_4132", "label_4133", "label_4134", "label_4135", "label_4136", "label_4137", "label_4138", "label_4139", "label_4140", "label_4141", "label_4142", "label_4143", "label_4144", "label_4145", "label_4146", "label_4147", "label_4148", "label_4149", "label_4150", "label_4151", "label_4152", "label_4153", "label_4154", "label_4155", "label_4156", "label_4157", "label_4158", "label_4159", "label_4160", "label_4161", "label_4162", "label_4163", "label_4164", "label_4165", "label_4166", "label_4167", "label_4168", "label_4169", "label_4170", "label_4171", "label_4172", "label_4173", "label_4174", "label_4175", "label_4176", "label_4177", "label_4178", "label_4179", "label_4180", "label_4181", "label_4182", "label_4183", "label_4184", "label_4185", "label_4186", "label_4187", "label_4188", "label_4189", "label_4190", "label_4191", "label_4192", "label_4193", "label_4194", "label_4195", "label_4196", "label_4197", "label_4198", "label_4199", "label_4200", "label_4201", "label_4202", "label_4203", "label_4204", "label_4205", "label_4206", "label_4207", "label_4208", "label_4209", "label_4210", "label_4211", "label_4212", "label_4213", "label_4214", "label_4215", "label_4216", "label_4217", "label_4218", "label_4219", "label_4220", "label_4221", "label_4222", "label_4223", "label_4224", "label_4225", "label_4226", "label_4227", "label_4228", "label_4229", "label_4230", "label_4231", "label_4232", "label_4233", "label_4234", "label_4235", "label_4236", "label_4237", "label_4238", "label_4239", "label_4240", "label_4241", "label_4242", "label_4243", "label_4244", "label_4245", "label_4246", "label_4247", "label_4248", "label_4249", "label_4250", "label_4251", "label_4252", "label_4253", "label_4254", "label_4255", "label_4256", "label_4257", "label_4258", "label_4259", "label_4260", "label_4261", "label_4262", "label_4263", "label_4264", "label_4265", "label_4266", "label_4267", "label_4268", "label_4269", "label_4270", "label_4271", "label_4272", "label_4273", "label_4274", "label_4275", "label_4276", "label_4277", "label_4278", "label_4279", "label_4280", "label_4281", "label_4282", "label_4283", "label_4284", "label_4285", "label_4286", "label_4287", "label_4288", "label_4289", "label_4290", "label_4291", "label_4292", "label_4293", "label_4294", "label_4295", "label_4296", "label_4297", "label_4298", "label_4299", "label_4300", "label_4301", "label_4302", "label_4303", "label_4304", "label_4305", "label_4306", "label_4307", "label_4308", "label_4309", "label_4310", "label_4311", "label_4312", "label_4313", "label_4314", "label_4315", "label_4316", "label_4317", "label_4318", "label_4319", "label_4320", "label_4321", "label_4322", "label_4323", "label_4324", "label_4325", "label_4326", "label_4327", "label_4328", "label_4329", "label_4330", "label_4331", "label_4332", "label_4333", "label_4334", "label_4335", "label_4336", "label_4337", "label_4338", "label_4339", "label_4340", "label_4341", "label_4342", "label_4343", "label_4344", "label_4345", "label_4346", "label_4347", "label_4348", "label_4349", "label_4350", "label_4351", "label_4352", "label_4353", "label_4354", "label_4355", "label_4356", "label_4357", "label_4358", "label_4359", "label_4360", "label_4361", "label_4362", "label_4363", "label_4364", "label_4365", "label_4366", "label_4367", "label_4368", "label_4369", "label_4370", "label_4371", "label_4372", "label_4373", "label_4374", "label_4375", "label_4376", "label_4377", "label_4378", "label_4379", "label_4380", "label_4381", 
"label_4382", "label_4383", "label_4384", "label_4385", "label_4386", "label_4387", "label_4388", "label_4389", "label_4390", "label_4391", "label_4392", "label_4393", "label_4394", "label_4395", "label_4396", "label_4397", "label_4398", "label_4399", "label_4400", "label_4401", "label_4402", "label_4403", "label_4404", "label_4405", "label_4406", "label_4407", "label_4408", "label_4409", "label_4410", "label_4411", "label_4412", "label_4413", "label_4414", "label_4415", "label_4416", "label_4417", "label_4418", "label_4419", "label_4420", "label_4421", "label_4422", "label_4423", "label_4424", "label_4425", "label_4426", "label_4427", "label_4428", "label_4429", "label_4430", "label_4431", "label_4432", "label_4433", "label_4434", "label_4435", "label_4436", "label_4437", "label_4438", "label_4439", "label_4440", "label_4441", "label_4442", "label_4443", "label_4444", "label_4445", "label_4446", "label_4447", "label_4448", "label_4449", "label_4450", "label_4451", "label_4452", "label_4453", "label_4454", "label_4455", "label_4456", "label_4457", "label_4458", "label_4459", "label_4460", "label_4461", "label_4462", "label_4463", "label_4464", "label_4465", "label_4466", "label_4467", "label_4468", "label_4469", "label_4470", "label_4471", "label_4472", "label_4473", "label_4474", "label_4475", "label_4476", "label_4477", "label_4478", "label_4479", "label_4480", "label_4481", "label_4482", "label_4483", "label_4484", "label_4485", "label_4486", "label_4487", "label_4488", "label_4489", "label_4490", "label_4491", "label_4492", "label_4493", "label_4494", "label_4495", "label_4496", "label_4497", "label_4498", "label_4499", "label_4500", "label_4501", "label_4502", "label_4503", "label_4504", "label_4505", "label_4506", "label_4507", "label_4508", "label_4509", "label_4510", "label_4511", "label_4512", "label_4513", "label_4514", "label_4515", "label_4516", "label_4517", "label_4518", "label_4519", "label_4520", "label_4521", "label_4522", "label_4523", "label_4524", "label_4525", "label_4526", "label_4527", "label_4528", "label_4529", "label_4530", "label_4531", "label_4532", "label_4533", "label_4534", "label_4535", "label_4536", "label_4537", "label_4538", "label_4539", "label_4540", "label_4541", "label_4542", "label_4543", "label_4544", "label_4545", "label_4546", "label_4547", "label_4548", "label_4549", "label_4550", "label_4551", "label_4552", "label_4553", "label_4554", "label_4555", "label_4556", "label_4557", "label_4558", "label_4559", "label_4560", "label_4561", "label_4562", "label_4563", "label_4564", "label_4565", "label_4566", "label_4567", "label_4568", "label_4569", "label_4570", "label_4571", "label_4572", "label_4573", "label_4574", "label_4575", "label_4576", "label_4577", "label_4578", "label_4579", "label_4580", "label_4581", "label_4582", "label_4583", "label_4584", "label_4585", "label_4586", "label_4587", "label_4588", "label_4589", "label_4590", "label_4591", "label_4592", "label_4593", "label_4594", "label_4595", "label_4596", "label_4597", "label_4598", "label_4599", "label_4600", "label_4601", "label_4602", "label_4603", "label_4604", "label_4605", "label_4606", "label_4607", "label_4608", "label_4609", "label_4610", "label_4611", "label_4612", "label_4613", "label_4614", "label_4615", "label_4616", "label_4617", "label_4618", "label_4619", "label_4620", "label_4621", "label_4622", "label_4623", "label_4624", "label_4625", "label_4626", "label_4627", "label_4628", "label_4629", "label_4630", "label_4631", "label_4632", "label_4633", "label_4634", 
"label_4635", "label_4636", "label_4637", "label_4638", "label_4639", "label_4640", "label_4641", "label_4642", "label_4643", "label_4644", "label_4645", "label_4646", "label_4647", "label_4648", "label_4649", "label_4650", "label_4651", "label_4652", "label_4653", "label_4654", "label_4655", "label_4656", "label_4657", "label_4658", "label_4659", "label_4660", "label_4661", "label_4662", "label_4663", "label_4664", "label_4665", "label_4666", "label_4667", "label_4668", "label_4669", "label_4670", "label_4671", "label_4672", "label_4673", "label_4674", "label_4675", "label_4676", "label_4677", "label_4678", "label_4679", "label_4680", "label_4681", "label_4682", "label_4683", "label_4684", "label_4685", "label_4686", "label_4687", "label_4688", "label_4689", "label_4690", "label_4691", "label_4692", "label_4693", "label_4694", "label_4695", "label_4696", "label_4697", "label_4698", "label_4699", "label_4700", "label_4701", "label_4702", "label_4703", "label_4704", "label_4705", "label_4706", "label_4707", "label_4708", "label_4709", "label_4710", "label_4711", "label_4712", "label_4713", "label_4714", "label_4715", "label_4716", "label_4717", "label_4718", "label_4719", "label_4720", "label_4721", "label_4722", "label_4723", "label_4724", "label_4725", "label_4726", "label_4727", "label_4728", "label_4729", "label_4730", "label_4731", "label_4732", "label_4733", "label_4734", "label_4735", "label_4736", "label_4737", "label_4738", "label_4739", "label_4740", "label_4741", "label_4742", "label_4743", "label_4744", "label_4745", "label_4746", "label_4747", "label_4748", "label_4749", "label_4750", "label_4751", "label_4752", "label_4753", "label_4754", "label_4755", "label_4756", "label_4757", "label_4758", "label_4759", "label_4760", "label_4761", "label_4762", "label_4763", "label_4764", "label_4765", "label_4766", "label_4767", "label_4768", "label_4769", "label_4770", "label_4771", "label_4772", "label_4773", "label_4774", "label_4775", "label_4776", "label_4777", "label_4778", "label_4779", "label_4780", "label_4781", "label_4782", "label_4783", "label_4784", "label_4785", "label_4786", "label_4787", "label_4788", "label_4789", "label_4790", "label_4791", "label_4792", "label_4793", "label_4794", "label_4795", "label_4796", "label_4797", "label_4798", "label_4799", "label_4800", "label_4801", "label_4802", "label_4803", "label_4804", "label_4805", "label_4806", "label_4807", "label_4808", "label_4809", "label_4810", "label_4811", "label_4812", "label_4813", "label_4814", "label_4815", "label_4816", "label_4817", "label_4818", "label_4819", "label_4820", "label_4821", "label_4822", "label_4823", "label_4824", "label_4825", "label_4826", "label_4827", "label_4828", "label_4829", "label_4830", "label_4831", "label_4832", "label_4833", "label_4834", "label_4835", "label_4836", "label_4837", "label_4838", "label_4839", "label_4840", "label_4841", "label_4842", "label_4843", "label_4844", "label_4845", "label_4846", "label_4847", "label_4848", "label_4849", "label_4850", "label_4851", "label_4852", "label_4853", "label_4854", "label_4855", "label_4856", "label_4857", "label_4858", "label_4859", "label_4860", "label_4861", "label_4862", "label_4863", "label_4864", "label_4865", "label_4866", "label_4867", "label_4868", "label_4869", "label_4870", "label_4871", "label_4872", "label_4873", "label_4874", "label_4875", "label_4876", "label_4877", "label_4878", "label_4879", "label_4880", "label_4881", "label_4882", "label_4883", "label_4884", "label_4885", "label_4886", "label_4887", 
"label_4888", "label_4889", "label_4890", "label_4891", "label_4892", "label_4893", "label_4894", "label_4895", "label_4896", "label_4897", "label_4898", "label_4899", "label_4900", "label_4901", "label_4902", "label_4903", "label_4904", "label_4905", "label_4906", "label_4907", "label_4908", "label_4909", "label_4910", "label_4911", "label_4912", "label_4913", "label_4914", "label_4915", "label_4916", "label_4917", "label_4918", "label_4919", "label_4920", "label_4921", "label_4922", "label_4923", "label_4924", "label_4925", "label_4926", "label_4927", "label_4928", "label_4929", "label_4930", "label_4931", "label_4932", "label_4933", "label_4934", "label_4935", "label_4936", "label_4937", "label_4938", "label_4939", "label_4940", "label_4941", "label_4942", "label_4943", "label_4944", "label_4945", "label_4946", "label_4947", "label_4948", "label_4949", "label_4950", "label_4951", "label_4952", "label_4953", "label_4954", "label_4955", "label_4956", "label_4957", "label_4958", "label_4959", "label_4960", "label_4961", "label_4962", "label_4963", "label_4964", "label_4965", "label_4966", "label_4967", "label_4968", "label_4969", "label_4970", "label_4971", "label_4972", "label_4973", "label_4974", "label_4975", "label_4976", "label_4977", "label_4978", "label_4979", "label_4980", "label_4981", "label_4982", "label_4983", "label_4984", "label_4985", "label_4986", "label_4987", "label_4988", "label_4989", "label_4990", "label_4991", "label_4992", "label_4993", "label_4994", "label_4995", "label_4996", "label_4997", "label_4998", "label_4999", "label_5000", "label_5001", "label_5002", "label_5003", "label_5004", "label_5005", "label_5006", "label_5007", "label_5008", "label_5009", "label_5010", "label_5011", "label_5012", "label_5013", "label_5014", "label_5015", "label_5016", "label_5017", "label_5018", "label_5019", "label_5020", "label_5021", "label_5022", "label_5023", "label_5024", "label_5025", "label_5026", "label_5027", "label_5028", "label_5029", "label_5030", "label_5031", "label_5032", "label_5033", "label_5034", "label_5035", "label_5036", "label_5037", "label_5038", "label_5039", "label_5040", "label_5041", "label_5042", "label_5043", "label_5044", "label_5045", "label_5046", "label_5047", "label_5048", "label_5049", "label_5050", "label_5051", "label_5052", "label_5053", "label_5054", "label_5055", "label_5056", "label_5057", "label_5058", "label_5059", "label_5060", "label_5061", "label_5062", "label_5063", "label_5064", "label_5065", "label_5066", "label_5067", "label_5068", "label_5069", "label_5070", "label_5071", "label_5072", "label_5073", "label_5074", "label_5075", "label_5076", "label_5077", "label_5078", "label_5079", "label_5080", "label_5081", "label_5082", "label_5083", "label_5084", "label_5085", "label_5086", "label_5087", "label_5088", "label_5089", "label_5090", "label_5091", "label_5092", "label_5093", "label_5094", "label_5095", "label_5096", "label_5097", "label_5098", "label_5099", "label_5100", "label_5101", "label_5102", "label_5103", "label_5104", "label_5105", "label_5106", "label_5107", "label_5108", "label_5109", "label_5110", "label_5111", "label_5112", "label_5113", "label_5114", "label_5115", "label_5116", "label_5117", "label_5118", "label_5119", "label_5120", "label_5121", "label_5122", "label_5123", "label_5124", "label_5125", "label_5126", "label_5127", "label_5128", "label_5129", "label_5130", "label_5131", "label_5132", "label_5133", "label_5134", "label_5135", "label_5136", "label_5137", "label_5138", "label_5139", "label_5140", 
"label_5141", "label_5142", "label_5143", "label_5144", "label_5145", "label_5146", "label_5147", "label_5148", "label_5149", "label_5150", "label_5151", "label_5152", "label_5153", "label_5154", "label_5155", "label_5156", "label_5157", "label_5158", "label_5159", "label_5160", "label_5161", "label_5162", "label_5163", "label_5164", "label_5165", "label_5166", "label_5167", "label_5168", "label_5169", "label_5170", "label_5171", "label_5172", "label_5173", "label_5174", "label_5175", "label_5176", "label_5177", "label_5178", "label_5179", "label_5180", "label_5181", "label_5182", "label_5183", "label_5184", "label_5185", "label_5186", "label_5187", "label_5188", "label_5189", "label_5190", "label_5191", "label_5192", "label_5193", "label_5194", "label_5195", "label_5196", "label_5197", "label_5198", "label_5199", "label_5200", "label_5201", "label_5202", "label_5203", "label_5204", "label_5205", "label_5206", "label_5207", "label_5208", "label_5209", "label_5210", "label_5211", "label_5212", "label_5213", "label_5214", "label_5215", "label_5216", "label_5217", "label_5218", "label_5219", "label_5220", "label_5221", "label_5222", "label_5223", "label_5224", "label_5225", "label_5226", "label_5227", "label_5228", "label_5229", "label_5230", "label_5231", "label_5232", "label_5233", "label_5234", "label_5235", "label_5236", "label_5237", "label_5238", "label_5239", "label_5240", "label_5241", "label_5242", "label_5243", "label_5244", "label_5245", "label_5246", "label_5247", "label_5248", "label_5249", "label_5250", "label_5251", "label_5252", "label_5253", "label_5254", "label_5255", "label_5256", "label_5257", "label_5258", "label_5259", "label_5260", "label_5261", "label_5262", "label_5263", "label_5264", "label_5265", "label_5266", "label_5267", "label_5268", "label_5269", "label_5270", "label_5271", "label_5272", "label_5273", "label_5274", "label_5275", "label_5276", "label_5277", "label_5278", "label_5279", "label_5280", "label_5281", "label_5282", "label_5283", "label_5284", "label_5285", "label_5286", "label_5287", "label_5288", "label_5289", "label_5290", "label_5291", "label_5292", "label_5293", "label_5294", "label_5295", "label_5296", "label_5297", "label_5298", "label_5299", "label_5300", "label_5301", "label_5302", "label_5303", "label_5304", "label_5305", "label_5306", "label_5307", "label_5308", "label_5309", "label_5310", "label_5311", "label_5312", "label_5313", "label_5314", "label_5315", "label_5316", "label_5317", "label_5318", "label_5319", "label_5320", "label_5321", "label_5322", "label_5323", "label_5324", "label_5325", "label_5326", "label_5327", "label_5328", "label_5329", "label_5330", "label_5331", "label_5332", "label_5333", "label_5334", "label_5335", "label_5336", "label_5337", "label_5338", "label_5339", "label_5340", "label_5341", "label_5342", "label_5343", "label_5344", "label_5345", "label_5346", "label_5347", "label_5348", "label_5349", "label_5350", "label_5351", "label_5352", "label_5353", "label_5354", "label_5355", "label_5356", "label_5357", "label_5358", "label_5359", "label_5360", "label_5361", "label_5362", "label_5363", "label_5364", "label_5365", "label_5366", "label_5367", "label_5368", "label_5369", "label_5370", "label_5371", "label_5372", "label_5373", "label_5374", "label_5375", "label_5376", "label_5377", "label_5378", "label_5379", "label_5380", "label_5381", "label_5382", "label_5383", "label_5384", "label_5385", "label_5386", "label_5387", "label_5388", "label_5389", "label_5390", "label_5391", "label_5392", "label_5393", 
"label_5394", "label_5395", "label_5396", "label_5397", "label_5398", "label_5399", "label_5400", "label_5401", "label_5402", "label_5403", "label_5404", "label_5405", "label_5406", "label_5407", "label_5408", "label_5409", "label_5410", "label_5411", "label_5412", "label_5413", "label_5414", "label_5415", "label_5416", "label_5417", "label_5418", "label_5419", "label_5420", "label_5421", "label_5422", "label_5423", "label_5424", "label_5425", "label_5426", "label_5427", "label_5428", "label_5429", "label_5430", "label_5431", "label_5432", "label_5433", "label_5434", "label_5435", "label_5436", "label_5437", "label_5438", "label_5439", "label_5440", "label_5441", "label_5442", "label_5443", "label_5444", "label_5445", "label_5446", "label_5447", "label_5448", "label_5449", "label_5450", "label_5451", "label_5452", "label_5453", "label_5454", "label_5455", "label_5456", "label_5457", "label_5458", "label_5459", "label_5460", "label_5461", "label_5462", "label_5463", "label_5464", "label_5465", "label_5466", "label_5467", "label_5468", "label_5469", "label_5470", "label_5471", "label_5472", "label_5473", "label_5474", "label_5475", "label_5476", "label_5477", "label_5478", "label_5479", "label_5480", "label_5481", "label_5482", "label_5483", "label_5484", "label_5485", "label_5486", "label_5487", "label_5488", "label_5489", "label_5490", "label_5491", "label_5492", "label_5493", "label_5494", "label_5495", "label_5496", "label_5497", "label_5498", "label_5499", "label_5500", "label_5501", "label_5502", "label_5503", "label_5504", "label_5505", "label_5506", "label_5507", "label_5508", "label_5509", "label_5510", "label_5511", "label_5512", "label_5513", "label_5514", "label_5515", "label_5516", "label_5517", "label_5518", "label_5519", "label_5520", "label_5521", "label_5522", "label_5523", "label_5524", "label_5525", "label_5526", "label_5527", "label_5528", "label_5529", "label_5530", "label_5531", "label_5532", "label_5533", "label_5534", "label_5535", "label_5536", "label_5537", "label_5538", "label_5539", "label_5540", "label_5541", "label_5542", "label_5543", "label_5544", "label_5545", "label_5546", "label_5547", "label_5548", "label_5549", "label_5550", "label_5551", "label_5552", "label_5553", "label_5554", "label_5555", "label_5556", "label_5557", "label_5558", "label_5559", "label_5560", "label_5561", "label_5562", "label_5563", "label_5564", "label_5565", "label_5566", "label_5567", "label_5568", "label_5569", "label_5570", "label_5571", "label_5572", "label_5573", "label_5574", "label_5575", "label_5576", "label_5577", "label_5578", "label_5579", "label_5580", "label_5581", "label_5582", "label_5583", "label_5584", "label_5585", "label_5586", "label_5587", "label_5588", "label_5589", "label_5590", "label_5591", "label_5592", "label_5593", "label_5594", "label_5595", "label_5596", "label_5597", "label_5598", "label_5599", "label_5600", "label_5601", "label_5602", "label_5603", "label_5604", "label_5605", "label_5606", "label_5607", "label_5608", "label_5609", "label_5610", "label_5611", "label_5612", "label_5613", "label_5614", "label_5615", "label_5616", "label_5617", "label_5618", "label_5619", "label_5620", "label_5621", "label_5622", "label_5623", "label_5624", "label_5625", "label_5626", "label_5627", "label_5628", "label_5629", "label_5630", "label_5631", "label_5632", "label_5633", "label_5634", "label_5635", "label_5636", "label_5637", "label_5638", "label_5639", "label_5640", "label_5641", "label_5642", "label_5643", "label_5644", "label_5645", "label_5646", 
"label_5647", "label_5648", "label_5649", "label_5650", "label_5651", "label_5652", "label_5653", "label_5654", "label_5655", "label_5656", "label_5657", "label_5658", "label_5659", "label_5660", "label_5661", "label_5662", "label_5663", "label_5664", "label_5665", "label_5666", "label_5667", "label_5668", "label_5669", "label_5670", "label_5671", "label_5672", "label_5673", "label_5674", "label_5675", "label_5676", "label_5677", "label_5678", "label_5679", "label_5680", "label_5681", "label_5682", "label_5683", "label_5684", "label_5685", "label_5686", "label_5687", "label_5688", "label_5689", "label_5690", "label_5691", "label_5692", "label_5693", "label_5694", "label_5695", "label_5696", "label_5697", "label_5698", "label_5699", "label_5700", "label_5701", "label_5702", "label_5703", "label_5704", "label_5705", "label_5706", "label_5707", "label_5708", "label_5709", "label_5710", "label_5711", "label_5712", "label_5713", "label_5714", "label_5715", "label_5716", "label_5717", "label_5718", "label_5719", "label_5720", "label_5721", "label_5722", "label_5723", "label_5724", "label_5725", "label_5726", "label_5727", "label_5728", "label_5729", "label_5730", "label_5731", "label_5732", "label_5733", "label_5734", "label_5735", "label_5736", "label_5737", "label_5738", "label_5739", "label_5740", "label_5741", "label_5742", "label_5743", "label_5744", "label_5745", "label_5746", "label_5747", "label_5748", "label_5749", "label_5750", "label_5751", "label_5752", "label_5753", "label_5754", "label_5755", "label_5756", "label_5757", "label_5758", "label_5759", "label_5760", "label_5761", "label_5762", "label_5763", "label_5764", "label_5765", "label_5766", "label_5767", "label_5768", "label_5769", "label_5770", "label_5771", "label_5772", "label_5773", "label_5774", "label_5775", "label_5776", "label_5777", "label_5778", "label_5779", "label_5780", "label_5781", "label_5782", "label_5783", "label_5784", "label_5785", "label_5786", "label_5787", "label_5788", "label_5789", "label_5790", "label_5791", "label_5792", "label_5793", "label_5794", "label_5795", "label_5796", "label_5797", "label_5798", "label_5799", "label_5800", "label_5801", "label_5802", "label_5803", "label_5804", "label_5805", "label_5806", "label_5807", "label_5808", "label_5809", "label_5810", "label_5811", "label_5812", "label_5813", "label_5814", "label_5815", "label_5816", "label_5817", "label_5818", "label_5819", "label_5820", "label_5821", "label_5822", "label_5823", "label_5824", "label_5825", "label_5826", "label_5827", "label_5828", "label_5829", "label_5830", "label_5831", "label_5832", "label_5833", "label_5834", "label_5835", "label_5836", "label_5837", "label_5838", "label_5839", "label_5840", "label_5841", "label_5842", "label_5843", "label_5844", "label_5845", "label_5846", "label_5847", "label_5848", "label_5849", "label_5850", "label_5851", "label_5852", "label_5853", "label_5854", "label_5855", "label_5856", "label_5857", "label_5858", "label_5859", "label_5860", "label_5861", "label_5862", "label_5863", "label_5864", "label_5865", "label_5866", "label_5867", "label_5868", "label_5869", "label_5870", "label_5871", "label_5872", "label_5873", "label_5874", "label_5875", "label_5876", "label_5877", "label_5878", "label_5879", "label_5880", "label_5881", "label_5882", "label_5883", "label_5884", "label_5885", "label_5886", "label_5887", "label_5888", "label_5889", "label_5890", "label_5891", "label_5892", "label_5893", "label_5894", "label_5895", "label_5896", "label_5897", "label_5898", "label_5899", 
"label_5900", "label_5901", "label_5902", "label_5903", "label_5904", "label_5905", "label_5906", "label_5907", "label_5908", "label_5909", "label_5910", "label_5911", "label_5912", "label_5913", "label_5914", "label_5915", "label_5916", "label_5917", "label_5918", "label_5919", "label_5920", "label_5921", "label_5922", "label_5923", "label_5924", "label_5925", "label_5926", "label_5927", "label_5928", "label_5929", "label_5930", "label_5931", "label_5932", "label_5933", "label_5934", "label_5935", "label_5936", "label_5937", "label_5938", "label_5939", "label_5940", "label_5941", "label_5942", "label_5943", "label_5944", "label_5945", "label_5946", "label_5947", "label_5948", "label_5949", "label_5950", "label_5951", "label_5952", "label_5953", "label_5954", "label_5955", "label_5956", "label_5957", "label_5958", "label_5959", "label_5960", "label_5961", "label_5962", "label_5963", "label_5964", "label_5965", "label_5966", "label_5967", "label_5968", "label_5969", "label_5970", "label_5971", "label_5972", "label_5973", "label_5974", "label_5975", "label_5976", "label_5977", "label_5978", "label_5979", "label_5980", "label_5981", "label_5982", "label_5983", "label_5984", "label_5985", "label_5986", "label_5987", "label_5988", "label_5989", "label_5990", "label_5991", "label_5992", "label_5993", "label_5994", "label_5995", "label_5996", "label_5997", "label_5998", "label_5999", "label_6000", "label_6001", "label_6002", "label_6003", "label_6004", "label_6005", "label_6006", "label_6007", "label_6008", "label_6009", "label_6010", "label_6011", "label_6012", "label_6013", "label_6014", "label_6015", "label_6016", "label_6017", "label_6018", "label_6019", "label_6020", "label_6021", "label_6022", "label_6023", "label_6024", "label_6025", "label_6026", "label_6027", "label_6028", "label_6029", "label_6030", "label_6031", "label_6032", "label_6033", "label_6034", "label_6035", "label_6036", "label_6037", "label_6038", "label_6039", "label_6040", "label_6041", "label_6042", "label_6043", "label_6044", "label_6045", "label_6046", "label_6047", "label_6048", "label_6049", "label_6050", "label_6051", "label_6052", "label_6053", "label_6054", "label_6055", "label_6056", "label_6057", "label_6058", "label_6059", "label_6060", "label_6061", "label_6062", "label_6063", "label_6064", "label_6065", "label_6066", "label_6067", "label_6068", "label_6069", "label_6070", "label_6071", "label_6072", "label_6073", "label_6074", "label_6075", "label_6076", "label_6077", "label_6078", "label_6079", "label_6080", "label_6081", "label_6082", "label_6083", "label_6084", "label_6085", "label_6086", "label_6087", "label_6088", "label_6089", "label_6090", "label_6091", "label_6092", "label_6093", "label_6094", "label_6095", "label_6096", "label_6097", "label_6098", "label_6099", "label_6100", "label_6101", "label_6102", "label_6103", "label_6104", "label_6105", "label_6106", "label_6107", "label_6108", "label_6109", "label_6110", "label_6111", "label_6112", "label_6113", "label_6114", "label_6115", "label_6116", "label_6117", "label_6118", "label_6119", "label_6120", "label_6121", "label_6122", "label_6123", "label_6124", "label_6125", "label_6126", "label_6127", "label_6128", "label_6129", "label_6130", "label_6131", "label_6132", "label_6133", "label_6134", "label_6135", "label_6136", "label_6137", "label_6138", "label_6139", "label_6140", "label_6141", "label_6142", "label_6143", "label_6144", "label_6145", "label_6146", "label_6147", "label_6148", "label_6149", "label_6150", "label_6151", "label_6152", 
"label_6153", "label_6154", "label_6155", "label_6156", "label_6157", "label_6158", "label_6159", "label_6160", "label_6161", "label_6162", "label_6163", "label_6164", "label_6165", "label_6166", "label_6167", "label_6168", "label_6169", "label_6170", "label_6171", "label_6172", "label_6173", "label_6174", "label_6175", "label_6176", "label_6177", "label_6178", "label_6179", "label_6180", "label_6181", "label_6182", "label_6183", "label_6184", "label_6185", "label_6186", "label_6187", "label_6188", "label_6189", "label_6190", "label_6191", "label_6192", "label_6193", "label_6194", "label_6195", "label_6196", "label_6197", "label_6198", "label_6199", "label_6200", "label_6201", "label_6202", "label_6203", "label_6204", "label_6205", "label_6206", "label_6207", "label_6208", "label_6209", "label_6210", "label_6211", "label_6212", "label_6213", "label_6214", "label_6215", "label_6216", "label_6217", "label_6218", "label_6219", "label_6220", "label_6221", "label_6222", "label_6223", "label_6224", "label_6225", "label_6226", "label_6227", "label_6228", "label_6229", "label_6230", "label_6231", "label_6232", "label_6233", "label_6234", "label_6235", "label_6236", "label_6237", "label_6238", "label_6239", "label_6240", "label_6241", "label_6242", "label_6243", "label_6244", "label_6245", "label_6246", "label_6247", "label_6248", "label_6249", "label_6250", "label_6251", "label_6252", "label_6253", "label_6254", "label_6255", "label_6256", "label_6257", "label_6258", "label_6259", "label_6260", "label_6261", "label_6262", "label_6263", "label_6264", "label_6265", "label_6266", "label_6267", "label_6268", "label_6269", "label_6270", "label_6271", "label_6272", "label_6273", "label_6274", "label_6275", "label_6276", "label_6277", "label_6278", "label_6279", "label_6280", "label_6281", "label_6282", "label_6283", "label_6284", "label_6285", "label_6286", "label_6287", "label_6288", "label_6289", "label_6290", "label_6291", "label_6292", "label_6293", "label_6294", "label_6295", "label_6296", "label_6297", "label_6298", "label_6299", "label_6300", "label_6301", "label_6302", "label_6303", "label_6304", "label_6305", "label_6306", "label_6307", "label_6308", "label_6309", "label_6310", "label_6311", "label_6312", "label_6313", "label_6314", "label_6315", "label_6316", "label_6317", "label_6318", "label_6319", "label_6320", "label_6321", "label_6322", "label_6323", "label_6324", "label_6325", "label_6326", "label_6327", "label_6328", "label_6329", "label_6330", "label_6331", "label_6332", "label_6333", "label_6334", "label_6335", "label_6336", "label_6337", "label_6338", "label_6339", "label_6340", "label_6341", "label_6342", "label_6343", "label_6344", "label_6345", "label_6346", "label_6347", "label_6348", "label_6349", "label_6350", "label_6351", "label_6352", "label_6353", "label_6354", "label_6355", "label_6356", "label_6357", "label_6358", "label_6359", "label_6360", "label_6361", "label_6362", "label_6363", "label_6364", "label_6365", "label_6366", "label_6367", "label_6368", "label_6369", "label_6370", "label_6371", "label_6372", "label_6373", "label_6374", "label_6375", "label_6376", "label_6377", "label_6378", "label_6379", "label_6380", "label_6381", "label_6382", "label_6383", "label_6384", "label_6385", "label_6386", "label_6387", "label_6388", "label_6389", "label_6390", "label_6391", "label_6392", "label_6393", "label_6394", "label_6395", "label_6396", "label_6397", "label_6398", "label_6399", "label_6400", "label_6401", "label_6402", "label_6403", "label_6404", "label_6405", 
"label_6406", "label_6407", "label_6408", "label_6409", "label_6410", "label_6411", "label_6412", "label_6413", "label_6414", "label_6415", "label_6416", "label_6417", "label_6418", "label_6419", "label_6420", "label_6421", "label_6422", "label_6423", "label_6424", "label_6425", "label_6426", "label_6427", "label_6428", "label_6429", "label_6430", "label_6431", "label_6432", "label_6433", "label_6434", "label_6435", "label_6436", "label_6437", "label_6438", "label_6439", "label_6440", "label_6441", "label_6442", "label_6443", "label_6444", "label_6445", "label_6446", "label_6447", "label_6448", "label_6449", "label_6450", "label_6451", "label_6452", "label_6453", "label_6454", "label_6455", "label_6456", "label_6457", "label_6458", "label_6459", "label_6460", "label_6461", "label_6462", "label_6463", "label_6464", "label_6465", "label_6466", "label_6467", "label_6468", "label_6469", "label_6470", "label_6471", "label_6472", "label_6473", "label_6474", "label_6475", "label_6476", "label_6477", "label_6478", "label_6479", "label_6480", "label_6481", "label_6482", "label_6483", "label_6484", "label_6485", "label_6486", "label_6487", "label_6488", "label_6489", "label_6490", "label_6491", "label_6492", "label_6493", "label_6494", "label_6495", "label_6496", "label_6497", "label_6498", "label_6499", "label_6500", "label_6501", "label_6502", "label_6503", "label_6504", "label_6505", "label_6506", "label_6507", "label_6508", "label_6509", "label_6510", "label_6511", "label_6512", "label_6513", "label_6514", "label_6515", "label_6516", "label_6517", "label_6518", "label_6519", "label_6520", "label_6521", "label_6522", "label_6523", "label_6524", "label_6525", "label_6526", "label_6527", "label_6528", "label_6529", "label_6530", "label_6531", "label_6532", "label_6533", "label_6534", "label_6535", "label_6536", "label_6537", "label_6538", "label_6539", "label_6540", "label_6541", "label_6542", "label_6543", "label_6544", "label_6545", "label_6546", "label_6547", "label_6548", "label_6549", "label_6550", "label_6551", "label_6552", "label_6553", "label_6554", "label_6555", "label_6556", "label_6557", "label_6558", "label_6559", "label_6560", "label_6561", "label_6562", "label_6563", "label_6564", "label_6565", "label_6566", "label_6567", "label_6568", "label_6569", "label_6570", "label_6571", "label_6572", "label_6573", "label_6574", "label_6575", "label_6576", "label_6577", "label_6578", "label_6579", "label_6580", "label_6581", "label_6582", "label_6583", "label_6584", "label_6585", "label_6586", "label_6587", "label_6588", "label_6589", "label_6590", "label_6591", "label_6592", "label_6593", "label_6594", "label_6595", "label_6596", "label_6597", "label_6598", "label_6599", "label_6600", "label_6601", "label_6602", "label_6603", "label_6604", "label_6605", "label_6606", "label_6607", "label_6608", "label_6609", "label_6610", "label_6611", "label_6612", "label_6613", "label_6614", "label_6615", "label_6616", "label_6617", "label_6618", "label_6619", "label_6620", "label_6621", "label_6622", "label_6623", "label_6624", "label_6625", "label_6626", "label_6627", "label_6628", "label_6629", "label_6630", "label_6631", "label_6632", "label_6633", "label_6634", "label_6635", "label_6636", "label_6637", "label_6638", "label_6639", "label_6640", "label_6641", "label_6642", "label_6643", "label_6644", "label_6645", "label_6646", "label_6647", "label_6648", "label_6649", "label_6650", "label_6651", "label_6652", "label_6653", "label_6654", "label_6655", "label_6656", "label_6657", "label_6658", 
"label_6659", "label_6660", "label_6661", "label_6662", "label_6663", "label_6664", "label_6665", "label_6666", "label_6667", "label_6668", "label_6669", "label_6670", "label_6671", "label_6672", "label_6673", "label_6674", "label_6675", "label_6676", "label_6677", "label_6678", "label_6679", "label_6680", "label_6681", "label_6682", "label_6683", "label_6684", "label_6685", "label_6686", "label_6687", "label_6688", "label_6689", "label_6690", "label_6691", "label_6692", "label_6693", "label_6694", "label_6695", "label_6696", "label_6697", "label_6698", "label_6699", "label_6700", "label_6701", "label_6702", "label_6703", "label_6704", "label_6705", "label_6706", "label_6707", "label_6708", "label_6709", "label_6710", "label_6711", "label_6712", "label_6713", "label_6714", "label_6715", "label_6716", "label_6717", "label_6718", "label_6719", "label_6720", "label_6721", "label_6722", "label_6723", "label_6724", "label_6725", "label_6726", "label_6727", "label_6728", "label_6729", "label_6730", "label_6731", "label_6732", "label_6733", "label_6734", "label_6735", "label_6736", "label_6737", "label_6738", "label_6739", "label_6740", "label_6741", "label_6742", "label_6743", "label_6744", "label_6745", "label_6746", "label_6747", "label_6748", "label_6749", "label_6750", "label_6751", "label_6752", "label_6753", "label_6754", "label_6755", "label_6756", "label_6757", "label_6758", "label_6759", "label_6760", "label_6761", "label_6762", "label_6763", "label_6764", "label_6765", "label_6766", "label_6767", "label_6768", "label_6769", "label_6770", "label_6771", "label_6772", "label_6773", "label_6774", "label_6775", "label_6776", "label_6777", "label_6778", "label_6779", "label_6780", "label_6781", "label_6782", "label_6783", "label_6784", "label_6785", "label_6786", "label_6787", "label_6788", "label_6789", "label_6790", "label_6791", "label_6792", "label_6793", "label_6794", "label_6795", "label_6796", "label_6797", "label_6798", "label_6799", "label_6800", "label_6801", "label_6802", "label_6803", "label_6804", "label_6805", "label_6806", "label_6807", "label_6808", "label_6809", "label_6810", "label_6811", "label_6812", "label_6813", "label_6814", "label_6815", "label_6816", "label_6817", "label_6818", "label_6819", "label_6820", "label_6821", "label_6822", "label_6823", "label_6824", "label_6825", "label_6826", "label_6827", "label_6828", "label_6829", "label_6830", "label_6831", "label_6832", "label_6833", "label_6834", "label_6835", "label_6836", "label_6837", "label_6838", "label_6839", "label_6840", "label_6841", "label_6842", "label_6843", "label_6844", "label_6845", "label_6846", "label_6847", "label_6848", "label_6849", "label_6850", "label_6851", "label_6852", "label_6853", "label_6854", "label_6855", "label_6856", "label_6857", "label_6858", "label_6859", "label_6860", "label_6861", "label_6862", "label_6863", "label_6864", "label_6865", "label_6866", "label_6867", "label_6868", "label_6869", "label_6870", "label_6871", "label_6872", "label_6873", "label_6874", "label_6875", "label_6876", "label_6877", "label_6878", "label_6879", "label_6880", "label_6881", "label_6882", "label_6883", "label_6884", "label_6885", "label_6886", "label_6887", "label_6888", "label_6889", "label_6890", "label_6891", "label_6892", "label_6893", "label_6894", "label_6895", "label_6896", "label_6897", "label_6898", "label_6899", "label_6900", "label_6901", "label_6902", "label_6903", "label_6904", "label_6905", "label_6906", "label_6907", "label_6908", "label_6909", "label_6910", "label_6911", 
"label_6912", "label_6913", "label_6914", "label_6915", "label_6916", "label_6917", "label_6918", "label_6919", "label_6920", "label_6921", "label_6922", "label_6923", "label_6924", "label_6925", "label_6926", "label_6927", "label_6928", "label_6929", "label_6930", "label_6931", "label_6932", "label_6933", "label_6934", "label_6935", "label_6936", "label_6937", "label_6938", "label_6939", "label_6940", "label_6941", "label_6942", "label_6943", "label_6944", "label_6945", "label_6946", "label_6947", "label_6948", "label_6949", "label_6950", "label_6951", "label_6952", "label_6953", "label_6954", "label_6955", "label_6956", "label_6957", "label_6958", "label_6959", "label_6960", "label_6961", "label_6962", "label_6963", "label_6964", "label_6965", "label_6966", "label_6967", "label_6968", "label_6969", "label_6970", "label_6971", "label_6972", "label_6973", "label_6974", "label_6975", "label_6976", "label_6977", "label_6978", "label_6979", "label_6980", "label_6981", "label_6982", "label_6983", "label_6984", "label_6985", "label_6986", "label_6987", "label_6988", "label_6989", "label_6990", "label_6991", "label_6992", "label_6993", "label_6994", "label_6995", "label_6996", "label_6997", "label_6998", "label_6999", "label_7000", "label_7001", "label_7002", "label_7003", "label_7004", "label_7005", "label_7006", "label_7007", "label_7008", "label_7009", "label_7010", "label_7011", "label_7012", "label_7013", "label_7014", "label_7015", "label_7016", "label_7017", "label_7018", "label_7019", "label_7020", "label_7021", "label_7022", "label_7023", "label_7024", "label_7025", "label_7026", "label_7027", "label_7028", "label_7029", "label_7030", "label_7031", "label_7032", "label_7033", "label_7034", "label_7035", "label_7036", "label_7037", "label_7038", "label_7039", "label_7040", "label_7041", "label_7042", "label_7043", "label_7044", "label_7045", "label_7046", "label_7047", "label_7048", "label_7049", "label_7050", "label_7051", "label_7052", "label_7053", "label_7054", "label_7055", "label_7056", "label_7057", "label_7058", "label_7059", "label_7060", "label_7061", "label_7062", "label_7063", "label_7064", "label_7065", "label_7066", "label_7067", "label_7068", "label_7069", "label_7070", "label_7071", "label_7072", "label_7073", "label_7074", "label_7075", "label_7076", "label_7077", "label_7078", "label_7079", "label_7080", "label_7081", "label_7082", "label_7083", "label_7084", "label_7085", "label_7086", "label_7087", "label_7088", "label_7089", "label_7090", "label_7091", "label_7092", "label_7093", "label_7094", "label_7095", "label_7096", "label_7097", "label_7098", "label_7099", "label_7100", "label_7101", "label_7102", "label_7103", "label_7104", "label_7105", "label_7106", "label_7107", "label_7108", "label_7109", "label_7110", "label_7111", "label_7112", "label_7113", "label_7114", "label_7115", "label_7116", "label_7117", "label_7118", "label_7119", "label_7120", "label_7121", "label_7122", "label_7123", "label_7124", "label_7125", "label_7126", "label_7127", "label_7128", "label_7129", "label_7130", "label_7131", "label_7132", "label_7133", "label_7134", "label_7135", "label_7136", "label_7137", "label_7138", "label_7139", "label_7140", "label_7141", "label_7142", "label_7143", "label_7144", "label_7145", "label_7146", "label_7147", "label_7148", "label_7149", "label_7150", "label_7151", "label_7152", "label_7153", "label_7154", "label_7155", "label_7156", "label_7157", "label_7158", "label_7159", "label_7160", "label_7161", "label_7162", "label_7163", "label_7164", 
"label_7165", "label_7166", "label_7167", "label_7168", "label_7169", "label_7170", "label_7171", "label_7172", "label_7173", "label_7174", "label_7175", "label_7176", "label_7177", "label_7178", "label_7179", "label_7180", "label_7181", "label_7182", "label_7183", "label_7184", "label_7185", "label_7186", "label_7187", "label_7188", "label_7189", "label_7190", "label_7191", "label_7192", "label_7193", "label_7194", "label_7195", "label_7196", "label_7197", "label_7198", "label_7199", "label_7200", "label_7201", "label_7202", "label_7203", "label_7204", "label_7205", "label_7206", "label_7207", "label_7208", "label_7209", "label_7210", "label_7211", "label_7212", "label_7213", "label_7214", "label_7215", "label_7216", "label_7217", "label_7218", "label_7219", "label_7220", "label_7221", "label_7222", "label_7223", "label_7224", "label_7225", "label_7226", "label_7227", "label_7228", "label_7229", "label_7230", "label_7231", "label_7232", "label_7233", "label_7234", "label_7235", "label_7236", "label_7237", "label_7238", "label_7239", "label_7240", "label_7241", "label_7242", "label_7243", "label_7244", "label_7245", "label_7246", "label_7247", "label_7248", "label_7249", "label_7250", "label_7251", "label_7252", "label_7253", "label_7254", "label_7255", "label_7256", "label_7257", "label_7258", "label_7259", "label_7260", "label_7261", "label_7262", "label_7263", "label_7264", "label_7265", "label_7266", "label_7267", "label_7268", "label_7269", "label_7270", "label_7271", "label_7272", "label_7273", "label_7274", "label_7275", "label_7276", "label_7277", "label_7278", "label_7279", "label_7280", "label_7281", "label_7282", "label_7283", "label_7284", "label_7285", "label_7286", "label_7287", "label_7288", "label_7289", "label_7290", "label_7291", "label_7292", "label_7293", "label_7294", "label_7295", "label_7296", "label_7297", "label_7298", "label_7299", "label_7300", "label_7301", "label_7302", "label_7303", "label_7304", "label_7305", "label_7306", "label_7307", "label_7308", "label_7309", "label_7310", "label_7311", "label_7312", "label_7313", "label_7314", "label_7315", "label_7316", "label_7317", "label_7318", "label_7319", "label_7320", "label_7321", "label_7322", "label_7323", "label_7324", "label_7325", "label_7326", "label_7327", "label_7328", "label_7329", "label_7330", "label_7331", "label_7332", "label_7333", "label_7334", "label_7335", "label_7336", "label_7337", "label_7338", "label_7339", "label_7340", "label_7341", "label_7342", "label_7343", "label_7344", "label_7345", "label_7346", "label_7347", "label_7348", "label_7349", "label_7350", "label_7351", "label_7352", "label_7353", "label_7354", "label_7355", "label_7356", "label_7357", "label_7358", "label_7359", "label_7360", "label_7361", "label_7362", "label_7363", "label_7364", "label_7365", "label_7366", "label_7367", "label_7368", "label_7369", "label_7370", "label_7371", "label_7372", "label_7373", "label_7374", "label_7375", "label_7376", "label_7377", "label_7378", "label_7379", "label_7380", "label_7381", "label_7382", "label_7383", "label_7384", "label_7385", "label_7386", "label_7387", "label_7388", "label_7389", "label_7390", "label_7391", "label_7392", "label_7393", "label_7394", "label_7395", "label_7396", "label_7397", "label_7398", "label_7399", "label_7400", "label_7401", "label_7402", "label_7403", "label_7404", "label_7405", "label_7406", "label_7407", "label_7408", "label_7409", "label_7410", "label_7411", "label_7412", "label_7413", "label_7414", "label_7415", "label_7416", "label_7417", 
"label_7418", "label_7419", "label_7420", "label_7421", "label_7422", "label_7423", "label_7424", "label_7425", "label_7426", "label_7427", "label_7428", "label_7429", "label_7430", "label_7431", "label_7432", "label_7433", "label_7434", "label_7435", "label_7436", "label_7437", "label_7438", "label_7439", "label_7440", "label_7441", "label_7442", "label_7443", "label_7444", "label_7445", "label_7446", "label_7447", "label_7448", "label_7449", "label_7450", "label_7451", "label_7452", "label_7453", "label_7454", "label_7455", "label_7456", "label_7457", "label_7458", "label_7459", "label_7460", "label_7461", "label_7462", "label_7463", "label_7464", "label_7465", "label_7466", "label_7467", "label_7468", "label_7469", "label_7470", "label_7471", "label_7472", "label_7473", "label_7474", "label_7475", "label_7476", "label_7477", "label_7478", "label_7479", "label_7480", "label_7481", "label_7482", "label_7483", "label_7484", "label_7485", "label_7486", "label_7487", "label_7488", "label_7489", "label_7490", "label_7491", "label_7492", "label_7493", "label_7494", "label_7495", "label_7496", "label_7497", "label_7498", "label_7499", "label_7500", "label_7501", "label_7502", "label_7503", "label_7504", "label_7505", "label_7506", "label_7507", "label_7508", "label_7509", "label_7510", "label_7511", "label_7512", "label_7513", "label_7514", "label_7515", "label_7516", "label_7517", "label_7518", "label_7519", "label_7520", "label_7521", "label_7522", "label_7523", "label_7524", "label_7525", "label_7526", "label_7527", "label_7528", "label_7529", "label_7530", "label_7531", "label_7532", "label_7533", "label_7534", "label_7535", "label_7536", "label_7537", "label_7538", "label_7539", "label_7540", "label_7541", "label_7542", "label_7543", "label_7544", "label_7545", "label_7546", "label_7547", "label_7548", "label_7549", "label_7550", "label_7551", "label_7552", "label_7553", "label_7554", "label_7555", "label_7556", "label_7557", "label_7558", "label_7559", "label_7560", "label_7561", "label_7562", "label_7563", "label_7564", "label_7565", "label_7566", "label_7567", "label_7568", "label_7569", "label_7570", "label_7571", "label_7572", "label_7573", "label_7574", "label_7575", "label_7576", "label_7577", "label_7578", "label_7579", "label_7580", "label_7581", "label_7582", "label_7583", "label_7584", "label_7585", "label_7586", "label_7587", "label_7588", "label_7589", "label_7590", "label_7591", "label_7592", "label_7593", "label_7594", "label_7595", "label_7596", "label_7597", "label_7598", "label_7599", "label_7600", "label_7601", "label_7602", "label_7603", "label_7604", "label_7605", "label_7606", "label_7607", "label_7608", "label_7609", "label_7610", "label_7611", "label_7612", "label_7613", "label_7614", "label_7615", "label_7616", "label_7617", "label_7618", "label_7619", "label_7620", "label_7621", "label_7622", "label_7623", "label_7624", "label_7625", "label_7626", "label_7627", "label_7628", "label_7629", "label_7630", "label_7631", "label_7632", "label_7633", "label_7634", "label_7635", "label_7636", "label_7637", "label_7638", "label_7639", "label_7640", "label_7641", "label_7642", "label_7643", "label_7644", "label_7645", "label_7646", "label_7647", "label_7648", "label_7649", "label_7650", "label_7651", "label_7652", "label_7653", "label_7654", "label_7655", "label_7656", "label_7657", "label_7658", "label_7659", "label_7660", "label_7661", "label_7662", "label_7663", "label_7664", "label_7665", "label_7666", "label_7667", "label_7668", "label_7669", "label_7670", 
"label_7671", "label_7672", "label_7673", "label_7674", "label_7675", "label_7676", "label_7677", "label_7678", "label_7679", "label_7680", "label_7681", "label_7682", "label_7683", "label_7684", "label_7685", "label_7686", "label_7687", "label_7688", "label_7689", "label_7690", "label_7691", "label_7692", "label_7693", "label_7694", "label_7695", "label_7696", "label_7697", "label_7698", "label_7699", "label_7700", "label_7701", "label_7702", "label_7703", "label_7704", "label_7705", "label_7706", "label_7707", "label_7708", "label_7709", "label_7710", "label_7711", "label_7712", "label_7713", "label_7714", "label_7715", "label_7716", "label_7717", "label_7718", "label_7719", "label_7720", "label_7721", "label_7722", "label_7723", "label_7724", "label_7725", "label_7726", "label_7727", "label_7728", "label_7729", "label_7730", "label_7731", "label_7732", "label_7733", "label_7734", "label_7735", "label_7736", "label_7737", "label_7738", "label_7739", "label_7740", "label_7741", "label_7742", "label_7743", "label_7744", "label_7745", "label_7746", "label_7747", "label_7748", "label_7749", "label_7750", "label_7751", "label_7752", "label_7753", "label_7754", "label_7755", "label_7756", "label_7757", "label_7758", "label_7759", "label_7760", "label_7761", "label_7762", "label_7763", "label_7764", "label_7765", "label_7766", "label_7767", "label_7768", "label_7769", "label_7770", "label_7771", "label_7772", "label_7773", "label_7774", "label_7775", "label_7776", "label_7777", "label_7778", "label_7779", "label_7780", "label_7781", "label_7782", "label_7783", "label_7784", "label_7785", "label_7786", "label_7787", "label_7788", "label_7789", "label_7790", "label_7791", "label_7792", "label_7793", "label_7794", "label_7795", "label_7796", "label_7797", "label_7798", "label_7799", "label_7800", "label_7801", "label_7802", "label_7803", "label_7804", "label_7805", "label_7806", "label_7807", "label_7808", "label_7809", "label_7810", "label_7811", "label_7812", "label_7813", "label_7814", "label_7815", "label_7816", "label_7817", "label_7818", "label_7819", "label_7820", "label_7821", "label_7822", "label_7823", "label_7824", "label_7825", "label_7826", "label_7827", "label_7828", "label_7829", "label_7830", "label_7831", "label_7832", "label_7833", "label_7834", "label_7835", "label_7836", "label_7837", "label_7838", "label_7839", "label_7840", "label_7841", "label_7842", "label_7843", "label_7844", "label_7845", "label_7846", "label_7847", "label_7848", "label_7849", "label_7850", "label_7851", "label_7852", "label_7853", "label_7854", "label_7855", "label_7856", "label_7857", "label_7858", "label_7859", "label_7860", "label_7861", "label_7862", "label_7863", "label_7864", "label_7865", "label_7866", "label_7867", "label_7868", "label_7869", "label_7870", "label_7871", "label_7872", "label_7873", "label_7874", "label_7875", "label_7876", "label_7877", "label_7878", "label_7879", "label_7880", "label_7881", "label_7882", "label_7883", "label_7884", "label_7885", "label_7886", "label_7887", "label_7888", "label_7889", "label_7890", "label_7891", "label_7892", "label_7893", "label_7894", "label_7895", "label_7896", "label_7897", "label_7898", "label_7899", "label_7900", "label_7901", "label_7902", "label_7903", "label_7904", "label_7905", "label_7906", "label_7907", "label_7908", "label_7909", "label_7910", "label_7911", "label_7912", "label_7913", "label_7914", "label_7915", "label_7916", "label_7917", "label_7918", "label_7919", "label_7920", "label_7921", "label_7922", "label_7923", 
"label_7924", "label_7925", "label_7926", "label_7927", "label_7928", "label_7929", "label_7930", "label_7931", "label_7932", "label_7933", "label_7934", "label_7935", "label_7936", "label_7937", "label_7938", "label_7939", "label_7940", "label_7941", "label_7942", "label_7943", "label_7944", "label_7945", "label_7946", "label_7947", "label_7948", "label_7949", "label_7950", "label_7951", "label_7952", "label_7953", "label_7954", "label_7955", "label_7956", "label_7957", "label_7958", "label_7959", "label_7960", "label_7961", "label_7962", "label_7963", "label_7964", "label_7965", "label_7966", "label_7967", "label_7968", "label_7969", "label_7970", "label_7971", "label_7972", "label_7973", "label_7974", "label_7975", "label_7976", "label_7977", "label_7978", "label_7979", "label_7980", "label_7981", "label_7982", "label_7983", "label_7984", "label_7985", "label_7986", "label_7987", "label_7988", "label_7989", "label_7990", "label_7991", "label_7992", "label_7993", "label_7994", "label_7995", "label_7996", "label_7997", "label_7998", "label_7999", "label_8000", "label_8001", "label_8002", "label_8003", "label_8004", "label_8005", "label_8006", "label_8007", "label_8008", "label_8009", "label_8010", "label_8011", "label_8012", "label_8013", "label_8014", "label_8015", "label_8016", "label_8017", "label_8018", "label_8019", "label_8020", "label_8021", "label_8022", "label_8023", "label_8024", "label_8025", "label_8026", "label_8027", "label_8028", "label_8029", "label_8030", "label_8031", "label_8032", "label_8033", "label_8034", "label_8035", "label_8036", "label_8037", "label_8038", "label_8039", "label_8040", "label_8041", "label_8042", "label_8043", "label_8044", "label_8045", "label_8046", "label_8047", "label_8048", "label_8049", "label_8050", "label_8051", "label_8052", "label_8053", "label_8054", "label_8055", "label_8056", "label_8057", "label_8058", "label_8059", "label_8060", "label_8061", "label_8062", "label_8063", "label_8064", "label_8065", "label_8066", "label_8067", "label_8068", "label_8069", "label_8070", "label_8071", "label_8072", "label_8073", "label_8074", "label_8075", "label_8076", "label_8077", "label_8078", "label_8079", "label_8080", "label_8081", "label_8082", "label_8083", "label_8084", "label_8085", "label_8086", "label_8087", "label_8088", "label_8089", "label_8090", "label_8091", "label_8092", "label_8093", "label_8094", "label_8095", "label_8096", "label_8097", "label_8098", "label_8099", "label_8100", "label_8101", "label_8102", "label_8103", "label_8104", "label_8105", "label_8106", "label_8107", "label_8108", "label_8109", "label_8110", "label_8111", "label_8112", "label_8113", "label_8114", "label_8115", "label_8116", "label_8117", "label_8118", "label_8119", "label_8120", "label_8121", "label_8122", "label_8123", "label_8124", "label_8125", "label_8126", "label_8127", "label_8128", "label_8129", "label_8130", "label_8131", "label_8132", "label_8133", "label_8134", "label_8135", "label_8136", "label_8137", "label_8138", "label_8139", "label_8140", "label_8141", "label_8142", "label_8143", "label_8144", "label_8145", "label_8146", "label_8147", "label_8148", "label_8149", "label_8150", "label_8151", "label_8152", "label_8153", "label_8154", "label_8155", "label_8156", "label_8157", "label_8158", "label_8159", "label_8160", "label_8161", "label_8162", "label_8163", "label_8164", "label_8165", "label_8166", "label_8167", "label_8168", "label_8169", "label_8170", "label_8171", "label_8172", "label_8173", "label_8174", "label_8175", "label_8176", 
"label_8177", "label_8178", "label_8179", "label_8180", "label_8181", "label_8182", "label_8183", "label_8184", "label_8185", "label_8186", "label_8187", "label_8188", "label_8189", "label_8190", "label_8191", "label_8192", "label_8193", "label_8194", "label_8195", "label_8196", "label_8197", "label_8198", "label_8199", "label_8200", "label_8201", "label_8202", "label_8203", "label_8204", "label_8205", "label_8206", "label_8207", "label_8208", "label_8209", "label_8210", "label_8211", "label_8212", "label_8213", "label_8214", "label_8215", "label_8216", "label_8217", "label_8218", "label_8219", "label_8220", "label_8221", "label_8222", "label_8223", "label_8224", "label_8225", "label_8226", "label_8227", "label_8228", "label_8229", "label_8230", "label_8231", "label_8232", "label_8233", "label_8234", "label_8235", "label_8236", "label_8237", "label_8238", "label_8239", "label_8240", "label_8241", "label_8242", "label_8243", "label_8244", "label_8245", "label_8246", "label_8247", "label_8248", "label_8249", "label_8250", "label_8251", "label_8252", "label_8253", "label_8254", "label_8255", "label_8256", "label_8257", "label_8258", "label_8259", "label_8260", "label_8261", "label_8262", "label_8263", "label_8264", "label_8265", "label_8266", "label_8267", "label_8268", "label_8269", "label_8270", "label_8271", "label_8272", "label_8273", "label_8274", "label_8275", "label_8276", "label_8277", "label_8278", "label_8279", "label_8280", "label_8281", "label_8282", "label_8283", "label_8284", "label_8285", "label_8286", "label_8287", "label_8288", "label_8289", "label_8290", "label_8291", "label_8292", "label_8293", "label_8294", "label_8295", "label_8296", "label_8297", "label_8298", "label_8299", "label_8300", "label_8301", "label_8302", "label_8303", "label_8304", "label_8305", "label_8306", "label_8307", "label_8308", "label_8309", "label_8310", "label_8311", "label_8312", "label_8313", "label_8314", "label_8315", "label_8316", "label_8317", "label_8318", "label_8319", "label_8320", "label_8321", "label_8322", "label_8323", "label_8324", "label_8325", "label_8326", "label_8327", "label_8328", "label_8329", "label_8330", "label_8331", "label_8332", "label_8333", "label_8334", "label_8335", "label_8336", "label_8337", "label_8338", "label_8339", "label_8340", "label_8341", "label_8342", "label_8343", "label_8344", "label_8345", "label_8346", "label_8347", "label_8348", "label_8349", "label_8350", "label_8351", "label_8352", "label_8353", "label_8354", "label_8355", "label_8356", "label_8357", "label_8358", "label_8359", "label_8360", "label_8361", "label_8362", "label_8363", "label_8364", "label_8365", "label_8366", "label_8367", "label_8368", "label_8369", "label_8370", "label_8371", "label_8372", "label_8373", "label_8374", "label_8375", "label_8376", "label_8377", "label_8378", "label_8379", "label_8380", "label_8381", "label_8382", "label_8383", "label_8384", "label_8385", "label_8386", "label_8387", "label_8388", "label_8389", "label_8390", "label_8391", "label_8392", "label_8393", "label_8394", "label_8395", "label_8396", "label_8397", "label_8398", "label_8399", "label_8400", "label_8401", "label_8402", "label_8403", "label_8404", "label_8405", "label_8406", "label_8407", "label_8408", "label_8409", "label_8410", "label_8411", "label_8412", "label_8413", "label_8414", "label_8415", "label_8416", "label_8417", "label_8418", "label_8419", "label_8420", "label_8421", "label_8422", "label_8423", "label_8424", "label_8425", "label_8426", "label_8427", "label_8428", "label_8429", 
"label_8430", "label_8431", "label_8432", "label_8433", "label_8434", "label_8435", "label_8436", "label_8437", "label_8438", "label_8439", "label_8440", "label_8441", "label_8442", "label_8443", "label_8444", "label_8445", "label_8446", "label_8447", "label_8448", "label_8449", "label_8450", "label_8451", "label_8452", "label_8453", "label_8454", "label_8455", "label_8456", "label_8457", "label_8458", "label_8459", "label_8460", "label_8461", "label_8462", "label_8463", "label_8464", "label_8465", "label_8466", "label_8467", "label_8468", "label_8469", "label_8470", "label_8471", "label_8472", "label_8473", "label_8474", "label_8475", "label_8476", "label_8477", "label_8478", "label_8479", "label_8480", "label_8481", "label_8482", "label_8483", "label_8484", "label_8485", "label_8486", "label_8487", "label_8488", "label_8489", "label_8490", "label_8491", "label_8492", "label_8493", "label_8494", "label_8495", "label_8496", "label_8497", "label_8498", "label_8499", "label_8500", "label_8501", "label_8502", "label_8503", "label_8504", "label_8505", "label_8506", "label_8507", "label_8508", "label_8509", "label_8510", "label_8511", "label_8512", "label_8513", "label_8514", "label_8515", "label_8516", "label_8517", "label_8518", "label_8519", "label_8520", "label_8521", "label_8522", "label_8523", "label_8524", "label_8525", "label_8526", "label_8527", "label_8528", "label_8529", "label_8530", "label_8531", "label_8532", "label_8533", "label_8534", "label_8535", "label_8536", "label_8537", "label_8538", "label_8539", "label_8540", "label_8541", "label_8542", "label_8543", "label_8544", "label_8545", "label_8546", "label_8547", "label_8548", "label_8549", "label_8550", "label_8551", "label_8552", "label_8553", "label_8554", "label_8555", "label_8556", "label_8557", "label_8558", "label_8559", "label_8560", "label_8561", "label_8562", "label_8563", "label_8564", "label_8565", "label_8566", "label_8567", "label_8568", "label_8569", "label_8570", "label_8571", "label_8572", "label_8573", "label_8574", "label_8575", "label_8576", "label_8577", "label_8578", "label_8579", "label_8580", "label_8581", "label_8582", "label_8583", "label_8584", "label_8585", "label_8586", "label_8587", "label_8588", "label_8589", "label_8590", "label_8591", "label_8592", "label_8593", "label_8594", "label_8595", "label_8596", "label_8597", "label_8598", "label_8599", "label_8600", "label_8601", "label_8602", "label_8603", "label_8604", "label_8605", "label_8606", "label_8607", "label_8608", "label_8609", "label_8610", "label_8611", "label_8612", "label_8613", "label_8614", "label_8615", "label_8616", "label_8617", "label_8618", "label_8619", "label_8620", "label_8621", "label_8622", "label_8623", "label_8624", "label_8625", "label_8626", "label_8627", "label_8628", "label_8629", "label_8630", "label_8631", "label_8632", "label_8633", "label_8634", "label_8635", "label_8636", "label_8637", "label_8638", "label_8639", "label_8640", "label_8641", "label_8642", "label_8643", "label_8644", "label_8645", "label_8646", "label_8647", "label_8648", "label_8649", "label_8650", "label_8651", "label_8652", "label_8653", "label_8654", "label_8655", "label_8656", "label_8657", "label_8658", "label_8659", "label_8660", "label_8661", "label_8662", "label_8663", "label_8664", "label_8665", "label_8666", "label_8667", "label_8668", "label_8669", "label_8670", "label_8671", "label_8672", "label_8673", "label_8674", "label_8675", "label_8676", "label_8677", "label_8678", "label_8679", "label_8680", "label_8681", "label_8682", 
"label_8683", "label_8684", "label_8685", "label_8686", "label_8687", "label_8688", "label_8689", "label_8690", "label_8691", "label_8692", "label_8693", "label_8694", "label_8695", "label_8696", "label_8697", "label_8698", "label_8699", "label_8700", "label_8701", "label_8702", "label_8703", "label_8704", "label_8705", "label_8706", "label_8707", "label_8708", "label_8709", "label_8710", "label_8711", "label_8712", "label_8713", "label_8714", "label_8715", "label_8716", "label_8717", "label_8718", "label_8719", "label_8720", "label_8721", "label_8722", "label_8723", "label_8724", "label_8725", "label_8726", "label_8727", "label_8728", "label_8729", "label_8730", "label_8731", "label_8732", "label_8733", "label_8734", "label_8735", "label_8736", "label_8737", "label_8738", "label_8739", "label_8740", "label_8741", "label_8742", "label_8743", "label_8744", "label_8745", "label_8746", "label_8747", "label_8748", "label_8749", "label_8750", "label_8751", "label_8752", "label_8753", "label_8754", "label_8755", "label_8756", "label_8757", "label_8758", "label_8759", "label_8760", "label_8761", "label_8762", "label_8763", "label_8764", "label_8765", "label_8766", "label_8767", "label_8768", "label_8769", "label_8770", "label_8771", "label_8772", "label_8773", "label_8774", "label_8775", "label_8776", "label_8777", "label_8778", "label_8779", "label_8780", "label_8781", "label_8782", "label_8783", "label_8784", "label_8785", "label_8786", "label_8787", "label_8788", "label_8789", "label_8790", "label_8791", "label_8792", "label_8793", "label_8794", "label_8795", "label_8796", "label_8797", "label_8798", "label_8799", "label_8800", "label_8801", "label_8802", "label_8803", "label_8804", "label_8805", "label_8806", "label_8807", "label_8808", "label_8809", "label_8810", "label_8811", "label_8812", "label_8813", "label_8814", "label_8815", "label_8816", "label_8817", "label_8818", "label_8819", "label_8820", "label_8821", "label_8822", "label_8823", "label_8824", "label_8825", "label_8826", "label_8827", "label_8828", "label_8829", "label_8830", "label_8831", "label_8832", "label_8833", "label_8834", "label_8835", "label_8836", "label_8837", "label_8838", "label_8839", "label_8840", "label_8841", "label_8842", "label_8843", "label_8844", "label_8845", "label_8846", "label_8847", "label_8848", "label_8849", "label_8850", "label_8851", "label_8852", "label_8853", "label_8854", "label_8855", "label_8856", "label_8857", "label_8858", "label_8859", "label_8860", "label_8861", "label_8862", "label_8863", "label_8864", "label_8865", "label_8866", "label_8867", "label_8868", "label_8869", "label_8870", "label_8871", "label_8872", "label_8873", "label_8874", "label_8875", "label_8876", "label_8877", "label_8878", "label_8879", "label_8880", "label_8881", "label_8882", "label_8883", "label_8884", "label_8885", "label_8886", "label_8887", "label_8888", "label_8889", "label_8890", "label_8891", "label_8892", "label_8893", "label_8894", "label_8895", "label_8896", "label_8897", "label_8898", "label_8899", "label_8900", "label_8901", "label_8902", "label_8903", "label_8904", "label_8905", "label_8906", "label_8907", "label_8908", "label_8909", "label_8910", "label_8911", "label_8912", "label_8913", "label_8914", "label_8915", "label_8916", "label_8917", "label_8918", "label_8919", "label_8920", "label_8921", "label_8922", "label_8923", "label_8924", "label_8925", "label_8926", "label_8927", "label_8928", "label_8929", "label_8930", "label_8931", "label_8932", "label_8933", "label_8934", "label_8935", 
"label_8936", "label_8937", "label_8938", "label_8939", "label_8940", "label_8941", "label_8942", "label_8943", "label_8944", "label_8945", "label_8946", "label_8947", "label_8948", "label_8949", "label_8950", "label_8951", "label_8952", "label_8953", "label_8954", "label_8955", "label_8956", "label_8957", "label_8958", "label_8959", "label_8960", "label_8961", "label_8962", "label_8963", "label_8964", "label_8965", "label_8966", "label_8967", "label_8968", "label_8969", "label_8970", "label_8971", "label_8972", "label_8973", "label_8974", "label_8975", "label_8976", "label_8977", "label_8978", "label_8979", "label_8980", "label_8981", "label_8982", "label_8983", "label_8984", "label_8985", "label_8986", "label_8987", "label_8988", "label_8989", "label_8990", "label_8991", "label_8992", "label_8993", "label_8994", "label_8995", "label_8996", "label_8997", "label_8998", "label_8999", "label_9000", "label_9001", "label_9002", "label_9003", "label_9004", "label_9005", "label_9006", "label_9007", "label_9008", "label_9009", "label_9010", "label_9011", "label_9012", "label_9013", "label_9014", "label_9015", "label_9016", "label_9017", "label_9018", "label_9019", "label_9020", "label_9021", "label_9022", "label_9023", "label_9024", "label_9025", "label_9026", "label_9027", "label_9028", "label_9029", "label_9030", "label_9031", "label_9032", "label_9033", "label_9034", "label_9035", "label_9036", "label_9037", "label_9038", "label_9039", "label_9040", "label_9041", "label_9042", "label_9043", "label_9044", "label_9045", "label_9046", "label_9047", "label_9048", "label_9049", "label_9050", "label_9051", "label_9052", "label_9053", "label_9054", "label_9055", "label_9056", "label_9057", "label_9058", "label_9059", "label_9060", "label_9061", "label_9062", "label_9063", "label_9064", "label_9065", "label_9066", "label_9067", "label_9068", "label_9069", "label_9070", "label_9071", "label_9072", "label_9073", "label_9074", "label_9075", "label_9076", "label_9077", "label_9078", "label_9079", "label_9080", "label_9081", "label_9082", "label_9083", "label_9084", "label_9085", "label_9086", "label_9087", "label_9088", "label_9089", "label_9090", "label_9091", "label_9092", "label_9093", "label_9094", "label_9095", "label_9096", "label_9097", "label_9098", "label_9099", "label_9100", "label_9101", "label_9102", "label_9103", "label_9104", "label_9105", "label_9106", "label_9107", "label_9108", "label_9109", "label_9110", "label_9111", "label_9112", "label_9113", "label_9114", "label_9115", "label_9116", "label_9117", "label_9118", "label_9119", "label_9120", "label_9121", "label_9122", "label_9123", "label_9124", "label_9125", "label_9126", "label_9127", "label_9128", "label_9129", "label_9130", "label_9131", "label_9132", "label_9133", "label_9134", "label_9135", "label_9136", "label_9137", "label_9138", "label_9139", "label_9140", "label_9141", "label_9142", "label_9143", "label_9144", "label_9145", "label_9146", "label_9147", "label_9148", "label_9149", "label_9150", "label_9151", "label_9152", "label_9153", "label_9154", "label_9155", "label_9156", "label_9157", "label_9158", "label_9159", "label_9160", "label_9161", "label_9162", "label_9163", "label_9164", "label_9165", "label_9166", "label_9167", "label_9168", "label_9169", "label_9170", "label_9171", "label_9172", "label_9173", "label_9174", "label_9175", "label_9176", "label_9177", "label_9178", "label_9179", "label_9180", "label_9181", "label_9182", "label_9183", "label_9184", "label_9185", "label_9186", "label_9187", "label_9188", 
"label_9189", "label_9190", "label_9191", "label_9192", "label_9193", "label_9194", "label_9195", "label_9196", "label_9197", "label_9198", "label_9199", "label_9200", "label_9201", "label_9202", "label_9203", "label_9204", "label_9205", "label_9206", "label_9207", "label_9208", "label_9209", "label_9210", "label_9211", "label_9212", "label_9213", "label_9214", "label_9215", "label_9216", "label_9217", "label_9218", "label_9219", "label_9220", "label_9221", "label_9222", "label_9223", "label_9224", "label_9225", "label_9226", "label_9227", "label_9228", "label_9229", "label_9230", "label_9231", "label_9232", "label_9233", "label_9234", "label_9235", "label_9236", "label_9237", "label_9238", "label_9239", "label_9240", "label_9241", "label_9242", "label_9243", "label_9244", "label_9245", "label_9246", "label_9247", "label_9248", "label_9249", "label_9250", "label_9251", "label_9252", "label_9253", "label_9254", "label_9255", "label_9256", "label_9257", "label_9258", "label_9259", "label_9260", "label_9261", "label_9262", "label_9263", "label_9264", "label_9265", "label_9266", "label_9267", "label_9268", "label_9269", "label_9270", "label_9271", "label_9272", "label_9273", "label_9274", "label_9275", "label_9276", "label_9277", "label_9278", "label_9279", "label_9280", "label_9281", "label_9282", "label_9283", "label_9284", "label_9285", "label_9286", "label_9287", "label_9288", "label_9289", "label_9290", "label_9291", "label_9292", "label_9293", "label_9294", "label_9295", "label_9296", "label_9297", "label_9298", "label_9299", "label_9300", "label_9301", "label_9302", "label_9303", "label_9304", "label_9305", "label_9306", "label_9307", "label_9308", "label_9309", "label_9310", "label_9311", "label_9312", "label_9313", "label_9314", "label_9315", "label_9316", "label_9317", "label_9318", "label_9319", "label_9320", "label_9321", "label_9322", "label_9323", "label_9324", "label_9325", "label_9326", "label_9327", "label_9328", "label_9329", "label_9330", "label_9331", "label_9332", "label_9333", "label_9334", "label_9335", "label_9336", "label_9337", "label_9338", "label_9339", "label_9340", "label_9341", "label_9342", "label_9343", "label_9344", "label_9345", "label_9346", "label_9347", "label_9348", "label_9349", "label_9350", "label_9351", "label_9352", "label_9353", "label_9354", "label_9355", "label_9356", "label_9357", "label_9358", "label_9359", "label_9360", "label_9361", "label_9362", "label_9363", "label_9364", "label_9365", "label_9366", "label_9367", "label_9368", "label_9369", "label_9370", "label_9371", "label_9372", "label_9373", "label_9374", "label_9375", "label_9376", "label_9377", "label_9378", "label_9379", "label_9380", "label_9381", "label_9382", "label_9383", "label_9384", "label_9385", "label_9386", "label_9387", "label_9388", "label_9389", "label_9390", "label_9391", "label_9392", "label_9393", "label_9394", "label_9395", "label_9396", "label_9397", "label_9398", "label_9399", "label_9400", "label_9401", "label_9402", "label_9403", "label_9404", "label_9405", "label_9406", "label_9407", "label_9408", "label_9409", "label_9410", "label_9411", "label_9412", "label_9413", "label_9414", "label_9415", "label_9416", "label_9417", "label_9418", "label_9419", "label_9420", "label_9421", "label_9422", "label_9423", "label_9424", "label_9425", "label_9426", "label_9427", "label_9428", "label_9429", "label_9430", "label_9431", "label_9432", "label_9433", "label_9434", "label_9435", "label_9436", "label_9437", "label_9438", "label_9439", "label_9440", "label_9441", 
"label_9442", "label_9443", "label_9444", "label_9445", "label_9446", "label_9447", "label_9448", "label_9449", "label_9450", "label_9451", "label_9452", "label_9453", "label_9454", "label_9455", "label_9456", "label_9457", "label_9458", "label_9459", "label_9460", "label_9461", "label_9462", "label_9463", "label_9464", "label_9465", "label_9466", "label_9467", "label_9468", "label_9469", "label_9470", "label_9471", "label_9472", "label_9473", "label_9474", "label_9475", "label_9476", "label_9477", "label_9478", "label_9479", "label_9480", "label_9481", "label_9482", "label_9483", "label_9484", "label_9485", "label_9486", "label_9487", "label_9488", "label_9489", "label_9490", "label_9491", "label_9492", "label_9493", "label_9494", "label_9495", "label_9496", "label_9497", "label_9498", "label_9499", "label_9500", "label_9501", "label_9502", "label_9503", "label_9504", "label_9505", "label_9506", "label_9507", "label_9508", "label_9509", "label_9510", "label_9511", "label_9512", "label_9513", "label_9514", "label_9515", "label_9516", "label_9517", "label_9518", "label_9519", "label_9520", "label_9521", "label_9522", "label_9523", "label_9524", "label_9525", "label_9526", "label_9527", "label_9528", "label_9529", "label_9530", "label_9531", "label_9532", "label_9533", "label_9534", "label_9535", "label_9536", "label_9537", "label_9538", "label_9539", "label_9540", "label_9541", "label_9542", "label_9543", "label_9544", "label_9545", "label_9546", "label_9547", "label_9548", "label_9549", "label_9550", "label_9551", "label_9552", "label_9553", "label_9554", "label_9555", "label_9556", "label_9557", "label_9558", "label_9559", "label_9560", "label_9561", "label_9562", "label_9563", "label_9564", "label_9565", "label_9566", "label_9567", "label_9568", "label_9569", "label_9570", "label_9571", "label_9572", "label_9573", "label_9574", "label_9575", "label_9576", "label_9577", "label_9578", "label_9579", "label_9580", "label_9581", "label_9582", "label_9583", "label_9584", "label_9585", "label_9586", "label_9587", "label_9588", "label_9589", "label_9590", "label_9591", "label_9592", "label_9593", "label_9594", "label_9595", "label_9596", "label_9597", "label_9598", "label_9599", "label_9600", "label_9601", "label_9602", "label_9603", "label_9604", "label_9605", "label_9606", "label_9607", "label_9608", "label_9609", "label_9610", "label_9611", "label_9612", "label_9613", "label_9614", "label_9615", "label_9616", "label_9617", "label_9618", "label_9619", "label_9620", "label_9621", "label_9622", "label_9623", "label_9624", "label_9625", "label_9626", "label_9627", "label_9628", "label_9629", "label_9630", "label_9631", "label_9632", "label_9633", "label_9634", "label_9635", "label_9636", "label_9637", "label_9638", "label_9639", "label_9640", "label_9641", "label_9642", "label_9643", "label_9644", "label_9645", "label_9646", "label_9647", "label_9648", "label_9649", "label_9650", "label_9651", "label_9652", "label_9653", "label_9654", "label_9655", "label_9656", "label_9657", "label_9658", "label_9659", "label_9660", "label_9661", "label_9662", "label_9663", "label_9664", "label_9665", "label_9666", "label_9667", "label_9668", "label_9669", "label_9670", "label_9671", "label_9672", "label_9673", "label_9674", "label_9675", "label_9676", "label_9677", "label_9678", "label_9679", "label_9680", "label_9681", "label_9682", "label_9683", "label_9684", "label_9685", "label_9686", "label_9687", "label_9688", "label_9689", "label_9690", "label_9691", "label_9692", "label_9693", "label_9694", 
"label_9695", "label_9696", "label_9697", "label_9698", "label_9699", "label_9700", "label_9701", "label_9702", "label_9703", "label_9704", "label_9705", "label_9706", "label_9707", "label_9708", "label_9709", "label_9710", "label_9711", "label_9712", "label_9713", "label_9714", "label_9715", "label_9716", "label_9717", "label_9718", "label_9719", "label_9720", "label_9721", "label_9722", "label_9723", "label_9724", "label_9725", "label_9726", "label_9727", "label_9728", "label_9729", "label_9730", "label_9731", "label_9732", "label_9733", "label_9734", "label_9735", "label_9736", "label_9737", "label_9738", "label_9739", "label_9740", "label_9741", "label_9742", "label_9743", "label_9744", "label_9745", "label_9746", "label_9747", "label_9748", "label_9749", "label_9750", "label_9751", "label_9752", "label_9753", "label_9754", "label_9755", "label_9756", "label_9757", "label_9758", "label_9759", "label_9760", "label_9761", "label_9762", "label_9763", "label_9764", "label_9765", "label_9766", "label_9767", "label_9768", "label_9769", "label_9770", "label_9771", "label_9772", "label_9773", "label_9774", "label_9775", "label_9776", "label_9777", "label_9778", "label_9779", "label_9780", "label_9781", "label_9782", "label_9783", "label_9784", "label_9785", "label_9786", "label_9787", "label_9788", "label_9789", "label_9790", "label_9791", "label_9792", "label_9793", "label_9794", "label_9795", "label_9796", "label_9797", "label_9798", "label_9799", "label_9800", "label_9801", "label_9802", "label_9803", "label_9804", "label_9805", "label_9806", "label_9807", "label_9808", "label_9809", "label_9810", "label_9811", "label_9812", "label_9813", "label_9814", "label_9815", "label_9816", "label_9817", "label_9818", "label_9819", "label_9820", "label_9821", "label_9822", "label_9823", "label_9824", "label_9825", "label_9826", "label_9827", "label_9828", "label_9829", "label_9830", "label_9831", "label_9832", "label_9833", "label_9834", "label_9835", "label_9836", "label_9837", "label_9838", "label_9839", "label_9840", "label_9841", "label_9842", "label_9843", "label_9844", "label_9845", "label_9846", "label_9847", "label_9848", "label_9849", "label_9850", "label_9851", "label_9852", "label_9853", "label_9854", "label_9855", "label_9856", "label_9857", "label_9858", "label_9859", "label_9860", "label_9861", "label_9862", "label_9863", "label_9864", "label_9865", "label_9866", "label_9867", "label_9868", "label_9869", "label_9870", "label_9871", "label_9872", "label_9873", "label_9874", "label_9875", "label_9876", "label_9877", "label_9878", "label_9879", "label_9880", "label_9881", "label_9882", "label_9883", "label_9884", "label_9885", "label_9886", "label_9887", "label_9888", "label_9889", "label_9890", "label_9891", "label_9892", "label_9893", "label_9894", "label_9895", "label_9896", "label_9897", "label_9898", "label_9899", "label_9900", "label_9901", "label_9902", "label_9903", "label_9904", "label_9905", "label_9906", "label_9907", "label_9908", "label_9909", "label_9910", "label_9911", "label_9912", "label_9913", "label_9914", "label_9915", "label_9916", "label_9917", "label_9918", "label_9919", "label_9920", "label_9921", "label_9922", "label_9923", "label_9924", "label_9925", "label_9926", "label_9927", "label_9928", "label_9929", "label_9930", "label_9931", "label_9932", "label_9933", "label_9934", "label_9935", "label_9936", "label_9937", "label_9938", "label_9939", "label_9940", "label_9941", "label_9942", "label_9943", "label_9944", "label_9945", "label_9946", "label_9947", 
"label_9948", "label_9949", "label_9950", "label_9951", "label_9952", "label_9953", "label_9954", "label_9955", "label_9956", "label_9957", "label_9958", "label_9959", "label_9960", "label_9961", "label_9962", "label_9963", "label_9964", "label_9965", "label_9966", "label_9967", "label_9968", "label_9969", "label_9970", "label_9971", "label_9972", "label_9973", "label_9974", "label_9975", "label_9976", "label_9977", "label_9978", "label_9979", "label_9980", "label_9981", "label_9982", "label_9983", "label_9984", "label_9985", "label_9986", "label_9987", "label_9988", "label_9989", "label_9990", "label_9991", "label_9992", "label_9993", "label_9994", "label_9995", "label_9996", "label_9997", "label_9998", "label_9999", "label_10000", "label_10001", "label_10002", "label_10003", "label_10004", "label_10005", "label_10006", "label_10007", "label_10008", "label_10009", "label_10010", "label_10011", "label_10012", "label_10013", "label_10014", "label_10015", "label_10016", "label_10017", "label_10018", "label_10019", "label_10020", "label_10021", "label_10022", "label_10023", "label_10024", "label_10025", "label_10026", "label_10027", "label_10028", "label_10029", "label_10030", "label_10031", "label_10032", "label_10033", "label_10034", "label_10035", "label_10036", "label_10037", "label_10038", "label_10039", "label_10040", "label_10041", "label_10042", "label_10043", "label_10044", "label_10045", "label_10046", "label_10047", "label_10048", "label_10049", "label_10050", "label_10051", "label_10052", "label_10053", "label_10054", "label_10055", "label_10056", "label_10057", "label_10058", "label_10059", "label_10060", "label_10061", "label_10062", "label_10063", "label_10064", "label_10065", "label_10066", "label_10067", "label_10068", "label_10069", "label_10070", "label_10071", "label_10072", "label_10073", "label_10074", "label_10075", "label_10076", "label_10077", "label_10078", "label_10079", "label_10080", "label_10081", "label_10082", "label_10083", "label_10084", "label_10085", "label_10086", "label_10087", "label_10088", "label_10089", "label_10090", "label_10091", "label_10092", "label_10093", "label_10094", "label_10095", "label_10096", "label_10097", "label_10098", "label_10099", "label_10100", "label_10101", "label_10102", "label_10103", "label_10104", "label_10105", "label_10106", "label_10107", "label_10108", "label_10109", "label_10110", "label_10111", "label_10112", "label_10113", "label_10114", "label_10115", "label_10116", "label_10117", "label_10118", "label_10119", "label_10120", "label_10121", "label_10122", "label_10123", "label_10124", "label_10125", "label_10126", "label_10127", "label_10128", "label_10129", "label_10130", "label_10131", "label_10132", "label_10133", "label_10134", "label_10135", "label_10136", "label_10137", "label_10138", "label_10139", "label_10140", "label_10141", "label_10142", "label_10143", "label_10144", "label_10145", "label_10146", "label_10147", "label_10148", "label_10149", "label_10150", "label_10151", "label_10152", "label_10153", "label_10154", "label_10155", "label_10156", "label_10157", "label_10158", "label_10159", "label_10160", "label_10161", "label_10162", "label_10163", "label_10164", "label_10165", "label_10166", "label_10167", "label_10168", "label_10169", "label_10170", "label_10171", "label_10172", "label_10173", "label_10174", "label_10175", "label_10176", "label_10177", "label_10178", "label_10179", "label_10180", "label_10181", "label_10182", "label_10183", "label_10184", "label_10185", "label_10186", "label_10187", 
"label_10188", "label_10189", "label_10190", "label_10191", "label_10192", "label_10193", "label_10194", "label_10195", "label_10196", "label_10197", "label_10198", "label_10199", "label_10200", "label_10201", "label_10202", "label_10203", "label_10204", "label_10205", "label_10206", "label_10207", "label_10208", "label_10209", "label_10210", "label_10211", "label_10212", "label_10213", "label_10214", "label_10215", "label_10216", "label_10217", "label_10218", "label_10219", "label_10220", "label_10221", "label_10222", "label_10223", "label_10224", "label_10225", "label_10226", "label_10227", "label_10228", "label_10229", "label_10230", "label_10231", "label_10232", "label_10233", "label_10234", "label_10235", "label_10236", "label_10237", "label_10238", "label_10239", "label_10240", "label_10241", "label_10242", "label_10243", "label_10244", "label_10245", "label_10246", "label_10247", "label_10248", "label_10249", "label_10250", "label_10251", "label_10252", "label_10253", "label_10254", "label_10255", "label_10256", "label_10257", "label_10258", "label_10259", "label_10260", "label_10261", "label_10262", "label_10263", "label_10264", "label_10265", "label_10266", "label_10267", "label_10268", "label_10269", "label_10270", "label_10271", "label_10272", "label_10273", "label_10274", "label_10275", "label_10276", "label_10277", "label_10278", "label_10279", "label_10280", "label_10281", "label_10282", "label_10283", "label_10284", "label_10285", "label_10286", "label_10287", "label_10288", "label_10289", "label_10290", "label_10291", "label_10292", "label_10293", "label_10294", "label_10295", "label_10296", "label_10297", "label_10298", "label_10299", "label_10300", "label_10301", "label_10302", "label_10303", "label_10304", "label_10305", "label_10306", "label_10307", "label_10308", "label_10309", "label_10310", "label_10311", "label_10312", "label_10313", "label_10314", "label_10315", "label_10316", "label_10317", "label_10318", "label_10319", "label_10320", "label_10321", "label_10322", "label_10323", "label_10324", "label_10325", "label_10326", "label_10327", "label_10328", "label_10329", "label_10330", "label_10331", "label_10332", "label_10333", "label_10334", "label_10335", "label_10336", "label_10337", "label_10338", "label_10339", "label_10340", "label_10341", "label_10342", "label_10343", "label_10344", "label_10345", "label_10346", "label_10347", "label_10348", "label_10349", "label_10350", "label_10351", "label_10352", "label_10353", "label_10354", "label_10355", "label_10356", "label_10357", "label_10358", "label_10359", "label_10360", "label_10361", "label_10362", "label_10363", "label_10364", "label_10365", "label_10366", "label_10367", "label_10368", "label_10369", "label_10370", "label_10371", "label_10372", "label_10373", "label_10374", "label_10375", "label_10376", "label_10377", "label_10378", "label_10379", "label_10380", "label_10381", "label_10382", "label_10383", "label_10384", "label_10385", "label_10386", "label_10387", "label_10388", "label_10389", "label_10390", "label_10391", "label_10392", "label_10393", "label_10394", "label_10395", "label_10396", "label_10397", "label_10398", "label_10399", "label_10400", "label_10401", "label_10402", "label_10403", "label_10404", "label_10405", "label_10406", "label_10407", "label_10408", "label_10409", "label_10410", "label_10411", "label_10412", "label_10413", "label_10414", "label_10415", "label_10416", "label_10417", "label_10418", "label_10419", "label_10420", "label_10421", "label_10422", "label_10423", "label_10424", 
"label_10425", "label_10426", "label_10427", "label_10428", "label_10429", "label_10430", "label_10431", "label_10432", "label_10433", "label_10434", "label_10435", "label_10436", "label_10437", "label_10438", "label_10439", "label_10440", "label_10441", "label_10442", "label_10443", "label_10444", "label_10445", "label_10446", "label_10447", "label_10448", "label_10449", "label_10450", "label_10451", "label_10452", "label_10453", "label_10454", "label_10455", "label_10456", "label_10457", "label_10458", "label_10459", "label_10460", "label_10461", "label_10462", "label_10463", "label_10464", "label_10465", "label_10466", "label_10467", "label_10468", "label_10469", "label_10470", "label_10471", "label_10472", "label_10473", "label_10474", "label_10475", "label_10476", "label_10477", "label_10478", "label_10479", "label_10480", "label_10481", "label_10482", "label_10483", "label_10484", "label_10485", "label_10486", "label_10487", "label_10488", "label_10489", "label_10490", "label_10491", "label_10492", "label_10493", "label_10494", "label_10495", "label_10496", "label_10497", "label_10498", "label_10499", "label_10500", "label_10501", "label_10502", "label_10503", "label_10504", "label_10505", "label_10506", "label_10507", "label_10508", "label_10509", "label_10510", "label_10511", "label_10512", "label_10513", "label_10514", "label_10515", "label_10516", "label_10517", "label_10518", "label_10519", "label_10520", "label_10521", "label_10522", "label_10523", "label_10524", "label_10525", "label_10526", "label_10527", "label_10528", "label_10529", "label_10530", "label_10531", "label_10532", "label_10533", "label_10534", "label_10535", "label_10536", "label_10537", "label_10538", "label_10539", "label_10540", "label_10541", "label_10542", "label_10543", "label_10544", "label_10545", "label_10546", "label_10547", "label_10548", "label_10549", "label_10550", "label_10551", "label_10552", "label_10553", "label_10554", "label_10555", "label_10556", "label_10557", "label_10558", "label_10559", "label_10560", "label_10561", "label_10562", "label_10563", "label_10564", "label_10565", "label_10566", "label_10567", "label_10568", "label_10569", "label_10570", "label_10571", "label_10572", "label_10573", "label_10574", "label_10575", "label_10576", "label_10577", "label_10578", "label_10579", "label_10580", "label_10581", "label_10582", "label_10583", "label_10584", "label_10585", "label_10586", "label_10587", "label_10588", "label_10589", "label_10590", "label_10591", "label_10592", "label_10593", "label_10594", "label_10595", "label_10596", "label_10597", "label_10598", "label_10599", "label_10600", "label_10601", "label_10602", "label_10603", "label_10604", "label_10605", "label_10606", "label_10607", "label_10608", "label_10609", "label_10610", "label_10611", "label_10612", "label_10613", "label_10614", "label_10615", "label_10616", "label_10617", "label_10618", "label_10619", "label_10620", "label_10621", "label_10622", "label_10623", "label_10624", "label_10625", "label_10626", "label_10627", "label_10628", "label_10629", "label_10630", "label_10631", "label_10632", "label_10633", "label_10634", "label_10635", "label_10636", "label_10637", "label_10638", "label_10639", "label_10640", "label_10641", "label_10642", "label_10643", "label_10644", "label_10645", "label_10646", "label_10647", "label_10648", "label_10649", "label_10650", "label_10651", "label_10652", "label_10653", "label_10654", "label_10655", "label_10656", "label_10657", "label_10658", "label_10659", "label_10660", "label_10661", 
"label_10662", "label_10663", "label_10664", "label_10665", "label_10666", "label_10667", "label_10668", "label_10669", "label_10670", "label_10671", "label_10672", "label_10673", "label_10674", "label_10675", "label_10676", "label_10677", "label_10678", "label_10679", "label_10680", "label_10681", "label_10682", "label_10683", "label_10684", "label_10685", "label_10686", "label_10687", "label_10688", "label_10689", "label_10690", "label_10691", "label_10692", "label_10693", "label_10694", "label_10695", "label_10696", "label_10697", "label_10698", "label_10699", "label_10700", "label_10701", "label_10702", "label_10703", "label_10704", "label_10705", "label_10706", "label_10707", "label_10708", "label_10709", "label_10710", "label_10711", "label_10712", "label_10713", "label_10714", "label_10715", "label_10716", "label_10717", "label_10718", "label_10719", "label_10720", "label_10721", "label_10722", "label_10723", "label_10724", "label_10725", "label_10726", "label_10727", "label_10728", "label_10729", "label_10730", "label_10731", "label_10732", "label_10733", "label_10734", "label_10735", "label_10736", "label_10737", "label_10738", "label_10739", "label_10740", "label_10741", "label_10742", "label_10743", "label_10744", "label_10745", "label_10746", "label_10747", "label_10748", "label_10749", "label_10750", "label_10751", "label_10752", "label_10753", "label_10754", "label_10755", "label_10756", "label_10757", "label_10758", "label_10759", "label_10760", "label_10761", "label_10762", "label_10763", "label_10764", "label_10765", "label_10766", "label_10767", "label_10768", "label_10769", "label_10770", "label_10771", "label_10772", "label_10773", "label_10774", "label_10775", "label_10776", "label_10777", "label_10778", "label_10779", "label_10780", "label_10781", "label_10782", "label_10783", "label_10784", "label_10785", "label_10786", "label_10787", "label_10788", "label_10789", "label_10790", "label_10791", "label_10792", "label_10793", "label_10794", "label_10795", "label_10796", "label_10797", "label_10798", "label_10799", "label_10800", "label_10801", "label_10802", "label_10803", "label_10804", "label_10805", "label_10806", "label_10807", "label_10808", "label_10809", "label_10810", "label_10811", "label_10812", "label_10813", "label_10814", "label_10815", "label_10816", "label_10817", "label_10818", "label_10819", "label_10820", "label_10821", "label_10822", "label_10823", "label_10824", "label_10825", "label_10826", "label_10827", "label_10828", "label_10829", "label_10830", "label_10831", "label_10832", "label_10833", "label_10834", "label_10835", "label_10836", "label_10837", "label_10838", "label_10839", "label_10840", "label_10841", "label_10842", "label_10843", "label_10844", "label_10845", "label_10846", "label_10847", "label_10848", "label_10849", "label_10850", "label_10851", "label_10852", "label_10853", "label_10854", "label_10855", "label_10856", "label_10857", "label_10858", "label_10859", "label_10860", "label_10861", "label_10862", "label_10863", "label_10864", "label_10865", "label_10866", "label_10867", "label_10868", "label_10869", "label_10870", "label_10871", "label_10872", "label_10873", "label_10874", "label_10875", "label_10876", "label_10877", "label_10878", "label_10879", "label_10880", "label_10881", "label_10882", "label_10883", "label_10884", "label_10885", "label_10886", "label_10887", "label_10888", "label_10889", "label_10890", "label_10891", "label_10892", "label_10893", "label_10894", "label_10895", "label_10896", "label_10897", "label_10898", 
"label_10899", "label_10900", "label_10901", "label_10902", "label_10903", "label_10904", "label_10905", "label_10906", "label_10907", "label_10908", "label_10909", "label_10910", "label_10911", "label_10912", "label_10913", "label_10914", "label_10915", "label_10916", "label_10917", "label_10918", "label_10919", "label_10920", "label_10921", "label_10922", "label_10923", "label_10924", "label_10925", "label_10926", "label_10927", "label_10928", "label_10929", "label_10930", "label_10931", "label_10932", "label_10933", "label_10934", "label_10935", "label_10936", "label_10937", "label_10938", "label_10939", "label_10940", "label_10941", "label_10942", "label_10943", "label_10944", "label_10945", "label_10946", "label_10947", "label_10948", "label_10949", "label_10950", "label_10951", "label_10952", "label_10953", "label_10954", "label_10955", "label_10956", "label_10957", "label_10958", "label_10959", "label_10960", "label_10961", "label_10962", "label_10963", "label_10964", "label_10965", "label_10966", "label_10967", "label_10968", "label_10969", "label_10970", "label_10971", "label_10972", "label_10973", "label_10974", "label_10975", "label_10976", "label_10977", "label_10978", "label_10979", "label_10980", "label_10981", "label_10982", "label_10983", "label_10984", "label_10985", "label_10986", "label_10987", "label_10988", "label_10989", "label_10990", "label_10991", "label_10992", "label_10993", "label_10994", "label_10995", "label_10996", "label_10997", "label_10998", "label_10999", "label_11000", "label_11001", "label_11002", "label_11003", "label_11004", "label_11005", "label_11006", "label_11007", "label_11008", "label_11009", "label_11010", "label_11011", "label_11012", "label_11013", "label_11014", "label_11015", "label_11016", "label_11017", "label_11018", "label_11019", "label_11020", "label_11021", "label_11022", "label_11023", "label_11024", "label_11025", "label_11026", "label_11027", "label_11028", "label_11029", "label_11030", "label_11031", "label_11032", "label_11033", "label_11034", "label_11035", "label_11036", "label_11037", "label_11038", "label_11039", "label_11040", "label_11041", "label_11042", "label_11043", "label_11044", "label_11045", "label_11046", "label_11047", "label_11048", "label_11049", "label_11050", "label_11051", "label_11052", "label_11053", "label_11054", "label_11055", "label_11056", "label_11057", "label_11058", "label_11059", "label_11060", "label_11061", "label_11062", "label_11063", "label_11064", "label_11065", "label_11066", "label_11067", "label_11068", "label_11069", "label_11070", "label_11071", "label_11072", "label_11073", "label_11074", "label_11075", "label_11076", "label_11077", "label_11078", "label_11079", "label_11080", "label_11081", "label_11082", "label_11083", "label_11084", "label_11085", "label_11086", "label_11087", "label_11088", "label_11089", "label_11090", "label_11091", "label_11092", "label_11093", "label_11094", "label_11095", "label_11096", "label_11097", "label_11098", "label_11099", "label_11100", "label_11101", "label_11102", "label_11103", "label_11104", "label_11105", "label_11106", "label_11107", "label_11108", "label_11109", "label_11110", "label_11111", "label_11112", "label_11113", "label_11114", "label_11115", "label_11116", "label_11117", "label_11118", "label_11119", "label_11120", "label_11121", "label_11122", "label_11123", "label_11124", "label_11125", "label_11126", "label_11127", "label_11128", "label_11129", "label_11130", "label_11131", "label_11132", "label_11133", "label_11134", "label_11135", 
"label_11136", "label_11137", "label_11138", "label_11139", "label_11140", "label_11141", "label_11142", "label_11143", "label_11144", "label_11145", "label_11146", "label_11147", "label_11148", "label_11149", "label_11150", "label_11151", "label_11152", "label_11153", "label_11154", "label_11155", "label_11156", "label_11157", "label_11158", "label_11159", "label_11160", "label_11161", "label_11162", "label_11163", "label_11164", "label_11165", "label_11166", "label_11167", "label_11168", "label_11169", "label_11170", "label_11171", "label_11172", "label_11173", "label_11174", "label_11175", "label_11176", "label_11177", "label_11178", "label_11179", "label_11180", "label_11181", "label_11182", "label_11183", "label_11184", "label_11185", "label_11186", "label_11187", "label_11188", "label_11189", "label_11190", "label_11191", "label_11192", "label_11193", "label_11194", "label_11195", "label_11196", "label_11197", "label_11198", "label_11199", "label_11200", "label_11201", "label_11202", "label_11203", "label_11204", "label_11205", "label_11206", "label_11207", "label_11208", "label_11209", "label_11210", "label_11211", "label_11212", "label_11213", "label_11214", "label_11215", "label_11216", "label_11217", "label_11218", "label_11219", "label_11220", "label_11221", "label_11222", "label_11223", "label_11224", "label_11225", "label_11226", "label_11227", "label_11228", "label_11229", "label_11230", "label_11231", "label_11232", "label_11233", "label_11234", "label_11235", "label_11236", "label_11237", "label_11238", "label_11239", "label_11240", "label_11241", "label_11242", "label_11243", "label_11244", "label_11245", "label_11246", "label_11247", "label_11248", "label_11249", "label_11250", "label_11251", "label_11252", "label_11253", "label_11254", "label_11255", "label_11256", "label_11257", "label_11258", "label_11259", "label_11260", "label_11261", "label_11262", "label_11263", "label_11264", "label_11265", "label_11266", "label_11267", "label_11268", "label_11269", "label_11270", "label_11271", "label_11272", "label_11273", "label_11274", "label_11275", "label_11276", "label_11277", "label_11278", "label_11279", "label_11280", "label_11281", "label_11282", "label_11283", "label_11284", "label_11285", "label_11286", "label_11287", "label_11288", "label_11289", "label_11290", "label_11291", "label_11292", "label_11293", "label_11294", "label_11295", "label_11296", "label_11297", "label_11298", "label_11299", "label_11300", "label_11301", "label_11302", "label_11303", "label_11304", "label_11305", "label_11306", "label_11307", "label_11308", "label_11309", "label_11310", "label_11311", "label_11312", "label_11313", "label_11314", "label_11315", "label_11316", "label_11317", "label_11318", "label_11319", "label_11320", "label_11321", "label_11322", "label_11323", "label_11324", "label_11325", "label_11326", "label_11327", "label_11328", "label_11329", "label_11330", "label_11331", "label_11332", "label_11333", "label_11334", "label_11335", "label_11336", "label_11337", "label_11338", "label_11339", "label_11340", "label_11341", "label_11342", "label_11343", "label_11344", "label_11345", "label_11346", "label_11347", "label_11348", "label_11349", "label_11350", "label_11351", "label_11352", "label_11353", "label_11354", "label_11355", "label_11356", "label_11357", "label_11358", "label_11359", "label_11360", "label_11361", "label_11362", "label_11363", "label_11364", "label_11365", "label_11366", "label_11367", "label_11368", "label_11369", "label_11370", "label_11371", "label_11372", 
"label_11373", "label_11374", "label_11375", "label_11376", "label_11377", "label_11378", "label_11379", "label_11380", "label_11381", "label_11382", "label_11383", "label_11384", "label_11385", "label_11386", "label_11387", "label_11388", "label_11389", "label_11390", "label_11391", "label_11392", "label_11393", "label_11394", "label_11395", "label_11396", "label_11397", "label_11398", "label_11399", "label_11400", "label_11401", "label_11402", "label_11403", "label_11404", "label_11405", "label_11406", "label_11407", "label_11408", "label_11409", "label_11410", "label_11411", "label_11412", "label_11413", "label_11414", "label_11415", "label_11416", "label_11417", "label_11418", "label_11419", "label_11420", "label_11421", "label_11422", "label_11423", "label_11424", "label_11425", "label_11426", "label_11427", "label_11428", "label_11429", "label_11430", "label_11431", "label_11432", "label_11433", "label_11434", "label_11435", "label_11436", "label_11437", "label_11438", "label_11439", "label_11440", "label_11441", "label_11442", "label_11443", "label_11444", "label_11445", "label_11446", "label_11447", "label_11448", "label_11449", "label_11450", "label_11451", "label_11452", "label_11453", "label_11454", "label_11455", "label_11456", "label_11457", "label_11458", "label_11459", "label_11460", "label_11461", "label_11462", "label_11463", "label_11464", "label_11465", "label_11466", "label_11467", "label_11468", "label_11469", "label_11470", "label_11471", "label_11472", "label_11473", "label_11474", "label_11475", "label_11476", "label_11477", "label_11478", "label_11479", "label_11480", "label_11481", "label_11482", "label_11483", "label_11484", "label_11485", "label_11486", "label_11487", "label_11488", "label_11489", "label_11490", "label_11491", "label_11492", "label_11493", "label_11494", "label_11495", "label_11496", "label_11497", "label_11498", "label_11499", "label_11500", "label_11501", "label_11502", "label_11503", "label_11504", "label_11505", "label_11506", "label_11507", "label_11508", "label_11509", "label_11510", "label_11511", "label_11512", "label_11513", "label_11514", "label_11515", "label_11516", "label_11517", "label_11518", "label_11519", "label_11520", "label_11521", "label_11522", "label_11523", "label_11524", "label_11525", "label_11526", "label_11527", "label_11528", "label_11529", "label_11530", "label_11531", "label_11532", "label_11533", "label_11534", "label_11535", "label_11536", "label_11537", "label_11538", "label_11539", "label_11540", "label_11541", "label_11542", "label_11543", "label_11544", "label_11545", "label_11546", "label_11547", "label_11548", "label_11549", "label_11550", "label_11551", "label_11552", "label_11553", "label_11554", "label_11555", "label_11556", "label_11557", "label_11558", "label_11559", "label_11560", "label_11561", "label_11562", "label_11563", "label_11564", "label_11565", "label_11566", "label_11567", "label_11568", "label_11569", "label_11570", "label_11571", "label_11572", "label_11573", "label_11574", "label_11575", "label_11576", "label_11577", "label_11578", "label_11579", "label_11580", "label_11581", "label_11582", "label_11583", "label_11584", "label_11585", "label_11586", "label_11587", "label_11588", "label_11589", "label_11590", "label_11591", "label_11592", "label_11593", "label_11594", "label_11595", "label_11596", "label_11597", "label_11598", "label_11599", "label_11600", "label_11601", "label_11602", "label_11603", "label_11604", "label_11605", "label_11606", "label_11607", "label_11608", "label_11609", 
"label_11610", "label_11611", "label_11612", "label_11613", "label_11614", "label_11615", "label_11616", "label_11617", "label_11618", "label_11619", "label_11620", "label_11621", "label_11622", "label_11623", "label_11624", "label_11625", "label_11626", "label_11627", "label_11628", "label_11629", "label_11630", "label_11631", "label_11632", "label_11633", "label_11634", "label_11635", "label_11636", "label_11637", "label_11638", "label_11639", "label_11640", "label_11641", "label_11642", "label_11643", "label_11644", "label_11645", "label_11646", "label_11647", "label_11648", "label_11649", "label_11650", "label_11651", "label_11652", "label_11653", "label_11654", "label_11655", "label_11656", "label_11657", "label_11658", "label_11659", "label_11660", "label_11661", "label_11662", "label_11663", "label_11664", "label_11665", "label_11666", "label_11667", "label_11668", "label_11669", "label_11670", "label_11671", "label_11672", "label_11673", "label_11674", "label_11675", "label_11676", "label_11677", "label_11678", "label_11679", "label_11680", "label_11681", "label_11682", "label_11683", "label_11684", "label_11685", "label_11686", "label_11687", "label_11688", "label_11689", "label_11690", "label_11691", "label_11692", "label_11693", "label_11694", "label_11695", "label_11696", "label_11697", "label_11698", "label_11699", "label_11700", "label_11701", "label_11702", "label_11703", "label_11704", "label_11705", "label_11706", "label_11707", "label_11708", "label_11709", "label_11710", "label_11711", "label_11712", "label_11713", "label_11714", "label_11715", "label_11716", "label_11717", "label_11718", "label_11719", "label_11720", "label_11721", "label_11722", "label_11723", "label_11724", "label_11725", "label_11726", "label_11727", "label_11728", "label_11729", "label_11730", "label_11731", "label_11732", "label_11733", "label_11734", "label_11735", "label_11736", "label_11737", "label_11738", "label_11739", "label_11740", "label_11741", "label_11742", "label_11743", "label_11744", "label_11745", "label_11746", "label_11747", "label_11748", "label_11749", "label_11750", "label_11751", "label_11752", "label_11753", "label_11754", "label_11755", "label_11756", "label_11757", "label_11758", "label_11759", "label_11760", "label_11761", "label_11762", "label_11763", "label_11764", "label_11765", "label_11766", "label_11767", "label_11768", "label_11769", "label_11770", "label_11771", "label_11772", "label_11773", "label_11774", "label_11775", "label_11776", "label_11777", "label_11778", "label_11779", "label_11780", "label_11781", "label_11782", "label_11783", "label_11784", "label_11785", "label_11786", "label_11787", "label_11788", "label_11789", "label_11790", "label_11791", "label_11792", "label_11793", "label_11794", "label_11795", "label_11796", "label_11797", "label_11798", "label_11799", "label_11800", "label_11801", "label_11802", "label_11803", "label_11804", "label_11805", "label_11806", "label_11807", "label_11808", "label_11809", "label_11810", "label_11811", "label_11812", "label_11813", "label_11814", "label_11815", "label_11816", "label_11817", "label_11818", "label_11819", "label_11820", "label_11821", "label_11822", "label_11823", "label_11824", "label_11825", "label_11826", "label_11827", "label_11828", "label_11829", "label_11830", "label_11831", "label_11832", "label_11833", "label_11834", "label_11835", "label_11836", "label_11837", "label_11838", "label_11839", "label_11840", "label_11841", "label_11842", "label_11843", "label_11844", "label_11845", "label_11846", 
"label_11847", "label_11848", "label_11849", "label_11850", "label_11851", "label_11852", "label_11853", "label_11854", "label_11855", "label_11856", "label_11857", "label_11858", "label_11859", "label_11860", "label_11861", "label_11862", "label_11863", "label_11864", "label_11865", "label_11866", "label_11867", "label_11868", "label_11869", "label_11870", "label_11871", "label_11872", "label_11873", "label_11874", "label_11875", "label_11876", "label_11877", "label_11878", "label_11879", "label_11880", "label_11881", "label_11882", "label_11883", "label_11884", "label_11885", "label_11886", "label_11887", "label_11888", "label_11889", "label_11890", "label_11891", "label_11892", "label_11893", "label_11894", "label_11895", "label_11896", "label_11897", "label_11898", "label_11899", "label_11900", "label_11901", "label_11902", "label_11903", "label_11904", "label_11905", "label_11906", "label_11907", "label_11908", "label_11909", "label_11910", "label_11911", "label_11912", "label_11913", "label_11914", "label_11915", "label_11916", "label_11917", "label_11918", "label_11919", "label_11920", "label_11921", "label_11922", "label_11923", "label_11924", "label_11925", "label_11926", "label_11927", "label_11928", "label_11929", "label_11930", "label_11931", "label_11932", "label_11933", "label_11934", "label_11935", "label_11936", "label_11937", "label_11938", "label_11939", "label_11940", "label_11941", "label_11942", "label_11943", "label_11944", "label_11945", "label_11946", "label_11947", "label_11948", "label_11949", "label_11950", "label_11951", "label_11952", "label_11953", "label_11954", "label_11955", "label_11956", "label_11957", "label_11958", "label_11959", "label_11960", "label_11961", "label_11962", "label_11963", "label_11964", "label_11965", "label_11966", "label_11967", "label_11968", "label_11969", "label_11970", "label_11971", "label_11972", "label_11973", "label_11974", "label_11975", "label_11976", "label_11977", "label_11978", "label_11979", "label_11980", "label_11981", "label_11982", "label_11983", "label_11984", "label_11985", "label_11986", "label_11987", "label_11988", "label_11989", "label_11990", "label_11991", "label_11992", "label_11993", "label_11994", "label_11995", "label_11996", "label_11997", "label_11998", "label_11999", "label_12000", "label_12001", "label_12002", "label_12003", "label_12004", "label_12005", "label_12006", "label_12007", "label_12008", "label_12009", "label_12010", "label_12011", "label_12012", "label_12013", "label_12014", "label_12015", "label_12016", "label_12017", "label_12018", "label_12019", "label_12020", "label_12021", "label_12022", "label_12023", "label_12024", "label_12025", "label_12026", "label_12027", "label_12028", "label_12029", "label_12030", "label_12031", "label_12032", "label_12033", "label_12034", "label_12035", "label_12036", "label_12037", "label_12038", "label_12039", "label_12040", "label_12041", "label_12042", "label_12043", "label_12044", "label_12045", "label_12046", "label_12047", "label_12048", "label_12049", "label_12050", "label_12051", "label_12052", "label_12053", "label_12054", "label_12055", "label_12056", "label_12057", "label_12058", "label_12059", "label_12060", "label_12061", "label_12062", "label_12063", "label_12064", "label_12065", "label_12066", "label_12067", "label_12068", "label_12069", "label_12070", "label_12071", "label_12072", "label_12073", "label_12074", "label_12075", "label_12076", "label_12077", "label_12078", "label_12079", "label_12080", "label_12081", "label_12082", "label_12083", 
"label_12084", "label_12085", "label_12086", "label_12087", "label_12088", "label_12089", "label_12090", "label_12091", "label_12092", "label_12093", "label_12094", "label_12095", "label_12096", "label_12097", "label_12098", "label_12099", "label_12100", "label_12101", "label_12102", "label_12103", "label_12104", "label_12105", "label_12106", "label_12107", "label_12108", "label_12109", "label_12110", "label_12111", "label_12112", "label_12113", "label_12114", "label_12115", "label_12116", "label_12117", "label_12118", "label_12119", "label_12120", "label_12121", "label_12122", "label_12123", "label_12124", "label_12125", "label_12126", "label_12127", "label_12128", "label_12129", "label_12130", "label_12131", "label_12132", "label_12133", "label_12134", "label_12135", "label_12136", "label_12137", "label_12138", "label_12139", "label_12140", "label_12141", "label_12142", "label_12143", "label_12144", "label_12145", "label_12146", "label_12147", "label_12148", "label_12149", "label_12150", "label_12151", "label_12152", "label_12153", "label_12154", "label_12155", "label_12156", "label_12157", "label_12158", "label_12159", "label_12160", "label_12161", "label_12162", "label_12163", "label_12164", "label_12165", "label_12166", "label_12167", "label_12168", "label_12169", "label_12170", "label_12171", "label_12172", "label_12173", "label_12174", "label_12175", "label_12176", "label_12177", "label_12178", "label_12179", "label_12180", "label_12181", "label_12182", "label_12183", "label_12184", "label_12185", "label_12186", "label_12187", "label_12188", "label_12189", "label_12190", "label_12191", "label_12192", "label_12193", "label_12194", "label_12195", "label_12196", "label_12197", "label_12198", "label_12199", "label_12200", "label_12201", "label_12202", "label_12203", "label_12204", "label_12205", "label_12206", "label_12207", "label_12208", "label_12209", "label_12210", "label_12211", "label_12212", "label_12213", "label_12214", "label_12215", "label_12216", "label_12217", "label_12218", "label_12219", "label_12220", "label_12221", "label_12222", "label_12223", "label_12224", "label_12225", "label_12226", "label_12227", "label_12228", "label_12229", "label_12230", "label_12231", "label_12232", "label_12233", "label_12234", "label_12235", "label_12236", "label_12237", "label_12238", "label_12239", "label_12240", "label_12241", "label_12242", "label_12243", "label_12244", "label_12245", "label_12246", "label_12247", "label_12248", "label_12249", "label_12250", "label_12251", "label_12252", "label_12253", "label_12254", "label_12255", "label_12256", "label_12257", "label_12258", "label_12259", "label_12260", "label_12261", "label_12262", "label_12263", "label_12264", "label_12265", "label_12266", "label_12267", "label_12268", "label_12269", "label_12270", "label_12271", "label_12272", "label_12273", "label_12274", "label_12275", "label_12276", "label_12277", "label_12278", "label_12279", "label_12280", "label_12281", "label_12282", "label_12283", "label_12284", "label_12285", "label_12286", "label_12287", "label_12288", "label_12289", "label_12290", "label_12291", "label_12292", "label_12293", "label_12294", "label_12295", "label_12296", "label_12297", "label_12298", "label_12299", "label_12300", "label_12301", "label_12302", "label_12303", "label_12304", "label_12305", "label_12306", "label_12307", "label_12308", "label_12309", "label_12310", "label_12311", "label_12312", "label_12313", "label_12314", "label_12315", "label_12316", "label_12317", "label_12318", "label_12319", "label_12320", 
"label_12321", "label_12322", "label_12323", "label_12324", "label_12325", "label_12326", "label_12327", "label_12328", "label_12329", "label_12330", "label_12331", "label_12332", "label_12333", "label_12334", "label_12335", "label_12336", "label_12337", "label_12338", "label_12339", "label_12340", "label_12341", "label_12342", "label_12343", "label_12344", "label_12345", "label_12346", "label_12347", "label_12348", "label_12349", "label_12350", "label_12351", "label_12352", "label_12353", "label_12354", "label_12355", "label_12356", "label_12357", "label_12358", "label_12359", "label_12360", "label_12361", "label_12362", "label_12363", "label_12364", "label_12365", "label_12366", "label_12367", "label_12368", "label_12369", "label_12370", "label_12371", "label_12372", "label_12373", "label_12374", "label_12375", "label_12376", "label_12377", "label_12378", "label_12379", "label_12380", "label_12381", "label_12382", "label_12383", "label_12384", "label_12385", "label_12386", "label_12387", "label_12388", "label_12389", "label_12390", "label_12391", "label_12392", "label_12393", "label_12394", "label_12395", "label_12396", "label_12397", "label_12398", "label_12399", "label_12400", "label_12401", "label_12402", "label_12403", "label_12404", "label_12405", "label_12406", "label_12407", "label_12408", "label_12409", "label_12410", "label_12411", "label_12412", "label_12413", "label_12414", "label_12415", "label_12416", "label_12417", "label_12418", "label_12419", "label_12420", "label_12421", "label_12422", "label_12423", "label_12424", "label_12425", "label_12426", "label_12427", "label_12428", "label_12429", "label_12430", "label_12431", "label_12432", "label_12433", "label_12434", "label_12435", "label_12436", "label_12437", "label_12438", "label_12439", "label_12440", "label_12441", "label_12442", "label_12443", "label_12444", "label_12445", "label_12446", "label_12447", "label_12448", "label_12449", "label_12450", "label_12451", "label_12452", "label_12453", "label_12454", "label_12455", "label_12456", "label_12457", "label_12458", "label_12459", "label_12460", "label_12461", "label_12462", "label_12463", "label_12464", "label_12465", "label_12466", "label_12467", "label_12468", "label_12469", "label_12470", "label_12471", "label_12472", "label_12473", "label_12474", "label_12475", "label_12476", "label_12477", "label_12478", "label_12479", "label_12480", "label_12481", "label_12482", "label_12483", "label_12484", "label_12485", "label_12486", "label_12487", "label_12488", "label_12489", "label_12490", "label_12491", "label_12492", "label_12493", "label_12494", "label_12495", "label_12496", "label_12497", "label_12498", "label_12499", "label_12500", "label_12501", "label_12502", "label_12503", "label_12504", "label_12505", "label_12506", "label_12507", "label_12508", "label_12509", "label_12510", "label_12511", "label_12512", "label_12513", "label_12514", "label_12515", "label_12516", "label_12517", "label_12518", "label_12519", "label_12520", "label_12521", "label_12522", "label_12523", "label_12524", "label_12525", "label_12526", "label_12527", "label_12528", "label_12529", "label_12530", "label_12531", "label_12532", "label_12533", "label_12534", "label_12535", "label_12536", "label_12537", "label_12538", "label_12539", "label_12540", "label_12541", "label_12542", "label_12543", "label_12544", "label_12545", "label_12546", "label_12547", "label_12548", "label_12549", "label_12550", "label_12551", "label_12552", "label_12553", "label_12554", "label_12555", "label_12556", "label_12557", 
"label_12558", "label_12559", "label_12560", "label_12561", "label_12562", "label_12563", "label_12564", "label_12565", "label_12566", "label_12567", "label_12568", "label_12569", "label_12570", "label_12571", "label_12572", "label_12573", "label_12574", "label_12575", "label_12576", "label_12577", "label_12578", "label_12579", "label_12580", "label_12581", "label_12582", "label_12583", "label_12584", "label_12585", "label_12586", "label_12587", "label_12588", "label_12589", "label_12590", "label_12591", "label_12592", "label_12593", "label_12594", "label_12595", "label_12596", "label_12597", "label_12598", "label_12599", "label_12600", "label_12601", "label_12602", "label_12603", "label_12604", "label_12605", "label_12606", "label_12607", "label_12608", "label_12609", "label_12610", "label_12611", "label_12612", "label_12613", "label_12614", "label_12615", "label_12616", "label_12617", "label_12618", "label_12619", "label_12620", "label_12621", "label_12622", "label_12623", "label_12624", "label_12625", "label_12626", "label_12627", "label_12628", "label_12629", "label_12630", "label_12631", "label_12632", "label_12633", "label_12634", "label_12635", "label_12636", "label_12637", "label_12638", "label_12639", "label_12640", "label_12641", "label_12642", "label_12643", "label_12644", "label_12645", "label_12646", "label_12647", "label_12648", "label_12649", "label_12650", "label_12651", "label_12652", "label_12653", "label_12654", "label_12655", "label_12656", "label_12657", "label_12658", "label_12659", "label_12660", "label_12661", "label_12662", "label_12663", "label_12664", "label_12665", "label_12666", "label_12667", "label_12668", "label_12669", "label_12670", "label_12671", "label_12672", "label_12673", "label_12674", "label_12675", "label_12676", "label_12677", "label_12678", "label_12679", "label_12680", "label_12681", "label_12682", "label_12683", "label_12684", "label_12685", "label_12686", "label_12687", "label_12688", "label_12689", "label_12690", "label_12691", "label_12692", "label_12693", "label_12694", "label_12695", "label_12696", "label_12697", "label_12698", "label_12699", "label_12700", "label_12701", "label_12702", "label_12703", "label_12704", "label_12705", "label_12706", "label_12707", "label_12708", "label_12709", "label_12710", "label_12711", "label_12712", "label_12713", "label_12714", "label_12715", "label_12716", "label_12717", "label_12718", "label_12719", "label_12720", "label_12721", "label_12722", "label_12723", "label_12724", "label_12725", "label_12726", "label_12727", "label_12728", "label_12729", "label_12730", "label_12731", "label_12732", "label_12733", "label_12734", "label_12735", "label_12736", "label_12737", "label_12738", "label_12739", "label_12740", "label_12741", "label_12742", "label_12743", "label_12744", "label_12745", "label_12746", "label_12747", "label_12748", "label_12749", "label_12750", "label_12751", "label_12752", "label_12753", "label_12754", "label_12755", "label_12756", "label_12757", "label_12758", "label_12759", "label_12760", "label_12761", "label_12762", "label_12763", "label_12764", "label_12765", "label_12766", "label_12767", "label_12768", "label_12769", "label_12770", "label_12771", "label_12772", "label_12773", "label_12774", "label_12775", "label_12776", "label_12777", "label_12778", "label_12779", "label_12780", "label_12781", "label_12782", "label_12783", "label_12784", "label_12785", "label_12786", "label_12787", "label_12788", "label_12789", "label_12790", "label_12791", "label_12792", "label_12793", "label_12794", 
"label_12795", "label_12796", "label_12797", "label_12798", "label_12799", "label_12800", "label_12801", "label_12802", "label_12803", "label_12804", "label_12805", "label_12806", "label_12807", "label_12808", "label_12809", "label_12810", "label_12811", "label_12812", "label_12813", "label_12814", "label_12815", "label_12816", "label_12817", "label_12818", "label_12819", "label_12820", "label_12821", "label_12822", "label_12823", "label_12824", "label_12825", "label_12826", "label_12827", "label_12828", "label_12829", "label_12830", "label_12831", "label_12832", "label_12833", "label_12834", "label_12835", "label_12836", "label_12837", "label_12838", "label_12839", "label_12840", "label_12841", "label_12842", "label_12843", "label_12844", "label_12845", "label_12846", "label_12847", "label_12848", "label_12849", "label_12850", "label_12851", "label_12852", "label_12853", "label_12854", "label_12855", "label_12856", "label_12857", "label_12858", "label_12859", "label_12860", "label_12861", "label_12862", "label_12863", "label_12864", "label_12865", "label_12866", "label_12867", "label_12868", "label_12869", "label_12870", "label_12871", "label_12872", "label_12873", "label_12874", "label_12875", "label_12876", "label_12877", "label_12878", "label_12879", "label_12880", "label_12881", "label_12882", "label_12883", "label_12884", "label_12885", "label_12886", "label_12887", "label_12888", "label_12889", "label_12890", "label_12891", "label_12892", "label_12893", "label_12894", "label_12895", "label_12896", "label_12897", "label_12898", "label_12899", "label_12900", "label_12901", "label_12902", "label_12903", "label_12904", "label_12905", "label_12906", "label_12907", "label_12908", "label_12909", "label_12910", "label_12911", "label_12912", "label_12913", "label_12914", "label_12915", "label_12916", "label_12917", "label_12918", "label_12919", "label_12920", "label_12921", "label_12922", "label_12923", "label_12924", "label_12925", "label_12926", "label_12927", "label_12928", "label_12929", "label_12930", "label_12931", "label_12932", "label_12933", "label_12934", "label_12935", "label_12936", "label_12937", "label_12938", "label_12939", "label_12940", "label_12941", "label_12942", "label_12943", "label_12944", "label_12945", "label_12946", "label_12947", "label_12948", "label_12949", "label_12950", "label_12951", "label_12952", "label_12953", "label_12954", "label_12955", "label_12956", "label_12957", "label_12958", "label_12959", "label_12960", "label_12961", "label_12962", "label_12963", "label_12964", "label_12965", "label_12966", "label_12967", "label_12968", "label_12969", "label_12970", "label_12971", "label_12972", "label_12973", "label_12974", "label_12975", "label_12976", "label_12977", "label_12978", "label_12979", "label_12980", "label_12981", "label_12982", "label_12983", "label_12984", "label_12985", "label_12986", "label_12987", "label_12988", "label_12989", "label_12990", "label_12991", "label_12992", "label_12993", "label_12994", "label_12995", "label_12996", "label_12997", "label_12998", "label_12999", "label_13000", "label_13001", "label_13002", "label_13003", "label_13004", "label_13005", "label_13006", "label_13007", "label_13008", "label_13009", "label_13010", "label_13011", "label_13012", "label_13013", "label_13014", "label_13015", "label_13016", "label_13017", "label_13018", "label_13019", "label_13020", "label_13021", "label_13022", "label_13023", "label_13024", "label_13025", "label_13026", "label_13027", "label_13028", "label_13029", "label_13030", "label_13031", 
"label_13032", "label_13033", "label_13034", "label_13035", "label_13036", "label_13037", "label_13038", "label_13039", "label_13040", "label_13041", "label_13042", "label_13043", "label_13044", "label_13045", "label_13046", "label_13047", "label_13048", "label_13049", "label_13050", "label_13051", "label_13052", "label_13053", "label_13054", "label_13055", "label_13056", "label_13057", "label_13058", "label_13059", "label_13060", "label_13061", "label_13062", "label_13063", "label_13064", "label_13065", "label_13066", "label_13067", "label_13068", "label_13069", "label_13070", "label_13071", "label_13072", "label_13073", "label_13074", "label_13075", "label_13076", "label_13077", "label_13078", "label_13079", "label_13080", "label_13081", "label_13082", "label_13083", "label_13084", "label_13085", "label_13086", "label_13087", "label_13088", "label_13089", "label_13090", "label_13091", "label_13092", "label_13093", "label_13094", "label_13095", "label_13096", "label_13097", "label_13098", "label_13099", "label_13100", "label_13101", "label_13102", "label_13103", "label_13104", "label_13105", "label_13106", "label_13107", "label_13108", "label_13109", "label_13110", "label_13111", "label_13112", "label_13113", "label_13114", "label_13115", "label_13116", "label_13117", "label_13118", "label_13119", "label_13120", "label_13121", "label_13122", "label_13123", "label_13124", "label_13125", "label_13126", "label_13127", "label_13128", "label_13129", "label_13130", "label_13131", "label_13132", "label_13133", "label_13134", "label_13135", "label_13136", "label_13137", "label_13138", "label_13139", "label_13140", "label_13141", "label_13142", "label_13143", "label_13144", "label_13145", "label_13146", "label_13147", "label_13148", "label_13149", "label_13150", "label_13151", "label_13152", "label_13153", "label_13154", "label_13155", "label_13156", "label_13157", "label_13158", "label_13159", "label_13160", "label_13161", "label_13162", "label_13163", "label_13164", "label_13165", "label_13166", "label_13167", "label_13168", "label_13169", "label_13170", "label_13171", "label_13172", "label_13173", "label_13174", "label_13175", "label_13176", "label_13177", "label_13178", "label_13179", "label_13180", "label_13181", "label_13182", "label_13183", "label_13184", "label_13185", "label_13186", "label_13187", "label_13188", "label_13189", "label_13190", "label_13191", "label_13192", "label_13193", "label_13194", "label_13195", "label_13196", "label_13197", "label_13198", "label_13199", "label_13200", "label_13201", "label_13202", "label_13203", "label_13204", "label_13205", "label_13206", "label_13207", "label_13208", "label_13209", "label_13210", "label_13211", "label_13212", "label_13213", "label_13214", "label_13215", "label_13216", "label_13217", "label_13218", "label_13219", "label_13220", "label_13221", "label_13222", "label_13223", "label_13224", "label_13225", "label_13226", "label_13227", "label_13228", "label_13229", "label_13230", "label_13231", "label_13232", "label_13233", "label_13234", "label_13235", "label_13236", "label_13237", "label_13238", "label_13239", "label_13240", "label_13241", "label_13242", "label_13243", "label_13244", "label_13245", "label_13246", "label_13247", "label_13248", "label_13249", "label_13250", "label_13251", "label_13252", "label_13253", "label_13254", "label_13255", "label_13256", "label_13257", "label_13258", "label_13259", "label_13260", "label_13261", "label_13262", "label_13263", "label_13264", "label_13265", "label_13266", "label_13267", "label_13268", 
"label_13269", "label_13270", "label_13271", "label_13272", "label_13273", "label_13274", "label_13275", "label_13276", "label_13277", "label_13278", "label_13279", "label_13280", "label_13281", "label_13282", "label_13283", "label_13284", "label_13285", "label_13286", "label_13287", "label_13288", "label_13289", "label_13290", "label_13291", "label_13292", "label_13293", "label_13294", "label_13295", "label_13296", "label_13297", "label_13298", "label_13299", "label_13300", "label_13301", "label_13302", "label_13303", "label_13304", "label_13305", "label_13306", "label_13307", "label_13308", "label_13309", "label_13310", "label_13311", "label_13312", "label_13313", "label_13314", "label_13315", "label_13316", "label_13317", "label_13318", "label_13319", "label_13320", "label_13321", "label_13322", "label_13323", "label_13324", "label_13325", "label_13326", "label_13327", "label_13328", "label_13329", "label_13330", "label_13331", "label_13332", "label_13333", "label_13334", "label_13335", "label_13336", "label_13337", "label_13338", "label_13339", "label_13340", "label_13341", "label_13342", "label_13343", "label_13344", "label_13345", "label_13346", "label_13347", "label_13348", "label_13349", "label_13350", "label_13351", "label_13352", "label_13353", "label_13354", "label_13355", "label_13356", "label_13357", "label_13358", "label_13359", "label_13360", "label_13361", "label_13362", "label_13363", "label_13364", "label_13365", "label_13366", "label_13367", "label_13368", "label_13369", "label_13370", "label_13371", "label_13372", "label_13373", "label_13374", "label_13375", "label_13376", "label_13377", "label_13378", "label_13379", "label_13380", "label_13381", "label_13382", "label_13383", "label_13384", "label_13385", "label_13386", "label_13387", "label_13388", "label_13389", "label_13390", "label_13391", "label_13392", "label_13393", "label_13394", "label_13395", "label_13396", "label_13397", "label_13398", "label_13399", "label_13400", "label_13401", "label_13402", "label_13403", "label_13404", "label_13405", "label_13406", "label_13407", "label_13408", "label_13409", "label_13410", "label_13411", "label_13412", "label_13413", "label_13414", "label_13415", "label_13416", "label_13417", "label_13418", "label_13419", "label_13420", "label_13421", "label_13422", "label_13423", "label_13424", "label_13425", "label_13426", "label_13427", "label_13428", "label_13429", "label_13430", "label_13431", "label_13432", "label_13433", "label_13434", "label_13435", "label_13436", "label_13437", "label_13438", "label_13439", "label_13440", "label_13441", "label_13442", "label_13443", "label_13444", "label_13445", "label_13446", "label_13447", "label_13448", "label_13449", "label_13450", "label_13451", "label_13452", "label_13453", "label_13454", "label_13455", "label_13456", "label_13457", "label_13458", "label_13459", "label_13460", "label_13461", "label_13462", "label_13463", "label_13464", "label_13465", "label_13466", "label_13467", "label_13468", "label_13469", "label_13470", "label_13471", "label_13472", "label_13473", "label_13474", "label_13475", "label_13476", "label_13477", "label_13478", "label_13479", "label_13480", "label_13481", "label_13482", "label_13483", "label_13484", "label_13485", "label_13486", "label_13487", "label_13488", "label_13489", "label_13490", "label_13491", "label_13492", "label_13493", "label_13494", "label_13495", "label_13496", "label_13497", "label_13498", "label_13499", "label_13500", "label_13501", "label_13502", "label_13503", "label_13504", "label_13505", 
"label_13506", "label_13507", "label_13508", "label_13509", "label_13510", "label_13511", "label_13512", "label_13513", "label_13514", "label_13515", "label_13516", "label_13517", "label_13518", "label_13519", "label_13520", "label_13521", "label_13522", "label_13523", "label_13524", "label_13525", "label_13526", "label_13527", "label_13528", "label_13529", "label_13530", "label_13531", "label_13532", "label_13533", "label_13534", "label_13535", "label_13536", "label_13537", "label_13538", "label_13539", "label_13540", "label_13541", "label_13542", "label_13543", "label_13544", "label_13545", "label_13546", "label_13547", "label_13548", "label_13549", "label_13550", "label_13551", "label_13552", "label_13553", "label_13554", "label_13555", "label_13556", "label_13557", "label_13558", "label_13559", "label_13560", "label_13561", "label_13562", "label_13563", "label_13564", "label_13565", "label_13566", "label_13567", "label_13568", "label_13569", "label_13570", "label_13571", "label_13572", "label_13573", "label_13574", "label_13575", "label_13576", "label_13577", "label_13578", "label_13579", "label_13580", "label_13581", "label_13582", "label_13583", "label_13584", "label_13585", "label_13586", "label_13587", "label_13588", "label_13589", "label_13590", "label_13591", "label_13592", "label_13593", "label_13594", "label_13595", "label_13596", "label_13597", "label_13598", "label_13599", "label_13600", "label_13601", "label_13602", "label_13603", "label_13604", "label_13605", "label_13606", "label_13607", "label_13608", "label_13609", "label_13610", "label_13611", "label_13612", "label_13613", "label_13614", "label_13615", "label_13616", "label_13617", "label_13618", "label_13619", "label_13620", "label_13621", "label_13622", "label_13623", "label_13624", "label_13625", "label_13626", "label_13627", "label_13628", "label_13629", "label_13630", "label_13631", "label_13632", "label_13633", "label_13634", "label_13635", "label_13636", "label_13637", "label_13638", "label_13639", "label_13640", "label_13641", "label_13642", "label_13643", "label_13644", "label_13645", "label_13646", "label_13647", "label_13648", "label_13649", "label_13650", "label_13651", "label_13652", "label_13653", "label_13654", "label_13655", "label_13656", "label_13657", "label_13658", "label_13659", "label_13660", "label_13661", "label_13662", "label_13663", "label_13664", "label_13665", "label_13666", "label_13667", "label_13668", "label_13669", "label_13670", "label_13671", "label_13672", "label_13673", "label_13674", "label_13675", "label_13676", "label_13677", "label_13678", "label_13679", "label_13680", "label_13681", "label_13682", "label_13683", "label_13684", "label_13685", "label_13686", "label_13687", "label_13688", "label_13689", "label_13690", "label_13691", "label_13692", "label_13693", "label_13694", "label_13695", "label_13696", "label_13697", "label_13698", "label_13699", "label_13700", "label_13701", "label_13702", "label_13703", "label_13704", "label_13705", "label_13706", "label_13707", "label_13708", "label_13709", "label_13710", "label_13711", "label_13712", "label_13713", "label_13714", "label_13715", "label_13716", "label_13717", "label_13718", "label_13719", "label_13720", "label_13721", "label_13722", "label_13723", "label_13724", "label_13725", "label_13726", "label_13727", "label_13728", "label_13729", "label_13730", "label_13731", "label_13732", "label_13733", "label_13734", "label_13735", "label_13736", "label_13737", "label_13738", "label_13739", "label_13740", "label_13741", "label_13742", 
"label_13743", "label_13744", "label_13745", "label_13746", "label_13747", "label_13748", "label_13749", "label_13750", "label_13751", "label_13752", "label_13753", "label_13754", "label_13755", "label_13756", "label_13757", "label_13758", "label_13759", "label_13760", "label_13761", "label_13762", "label_13763", "label_13764", "label_13765", "label_13766", "label_13767", "label_13768", "label_13769", "label_13770", "label_13771", "label_13772", "label_13773", "label_13774", "label_13775", "label_13776", "label_13777", "label_13778", "label_13779", "label_13780", "label_13781", "label_13782", "label_13783", "label_13784", "label_13785", "label_13786", "label_13787", "label_13788", "label_13789", "label_13790", "label_13791", "label_13792", "label_13793", "label_13794", "label_13795", "label_13796", "label_13797", "label_13798", "label_13799", "label_13800", "label_13801", "label_13802", "label_13803", "label_13804", "label_13805", "label_13806", "label_13807", "label_13808", "label_13809", "label_13810", "label_13811", "label_13812", "label_13813", "label_13814", "label_13815", "label_13816", "label_13817", "label_13818", "label_13819", "label_13820", "label_13821", "label_13822", "label_13823", "label_13824", "label_13825", "label_13826", "label_13827", "label_13828", "label_13829", "label_13830", "label_13831", "label_13832", "label_13833", "label_13834", "label_13835", "label_13836", "label_13837", "label_13838", "label_13839", "label_13840", "label_13841", "label_13842", "label_13843", "label_13844", "label_13845", "label_13846", "label_13847", "label_13848", "label_13849", "label_13850", "label_13851", "label_13852", "label_13853", "label_13854", "label_13855", "label_13856", "label_13857", "label_13858", "label_13859", "label_13860", "label_13861", "label_13862", "label_13863", "label_13864", "label_13865", "label_13866", "label_13867", "label_13868", "label_13869", "label_13870", "label_13871", "label_13872", "label_13873", "label_13874", "label_13875", "label_13876", "label_13877", "label_13878", "label_13879", "label_13880", "label_13881", "label_13882", "label_13883", "label_13884", "label_13885", "label_13886", "label_13887", "label_13888", "label_13889", "label_13890", "label_13891", "label_13892", "label_13893", "label_13894", "label_13895", "label_13896", "label_13897", "label_13898", "label_13899", "label_13900", "label_13901", "label_13902", "label_13903", "label_13904", "label_13905", "label_13906", "label_13907", "label_13908", "label_13909", "label_13910", "label_13911", "label_13912", "label_13913", "label_13914", "label_13915", "label_13916", "label_13917", "label_13918", "label_13919", "label_13920", "label_13921", "label_13922", "label_13923", "label_13924", "label_13925", "label_13926", "label_13927", "label_13928", "label_13929", "label_13930", "label_13931", "label_13932", "label_13933", "label_13934", "label_13935", "label_13936", "label_13937", "label_13938", "label_13939", "label_13940", "label_13941", "label_13942", "label_13943", "label_13944", "label_13945", "label_13946", "label_13947", "label_13948", "label_13949", "label_13950", "label_13951", "label_13952", "label_13953", "label_13954", "label_13955", "label_13956", "label_13957", "label_13958", "label_13959", "label_13960", "label_13961", "label_13962", "label_13963", "label_13964", "label_13965", "label_13966", "label_13967", "label_13968", "label_13969", "label_13970", "label_13971", "label_13972", "label_13973", "label_13974", "label_13975", "label_13976", "label_13977", "label_13978", "label_13979", 
"label_13980", "label_13981", "label_13982", "label_13983", "label_13984", "label_13985", "label_13986", "label_13987", "label_13988", "label_13989", "label_13990", "label_13991", "label_13992", "label_13993", "label_13994", "label_13995", "label_13996", "label_13997", "label_13998", "label_13999", "label_14000", "label_14001", "label_14002", "label_14003", "label_14004", "label_14005", "label_14006", "label_14007", "label_14008", "label_14009", "label_14010", "label_14011", "label_14012", "label_14013", "label_14014", "label_14015", "label_14016", "label_14017", "label_14018", "label_14019", "label_14020", "label_14021", "label_14022", "label_14023", "label_14024", "label_14025", "label_14026", "label_14027", "label_14028", "label_14029", "label_14030", "label_14031", "label_14032", "label_14033", "label_14034", "label_14035", "label_14036", "label_14037", "label_14038", "label_14039", "label_14040", "label_14041", "label_14042", "label_14043", "label_14044", "label_14045", "label_14046", "label_14047", "label_14048", "label_14049", "label_14050", "label_14051", "label_14052", "label_14053", "label_14054", "label_14055", "label_14056", "label_14057", "label_14058", "label_14059", "label_14060", "label_14061", "label_14062", "label_14063", "label_14064", "label_14065", "label_14066", "label_14067", "label_14068", "label_14069", "label_14070", "label_14071", "label_14072", "label_14073", "label_14074", "label_14075", "label_14076", "label_14077", "label_14078", "label_14079", "label_14080", "label_14081", "label_14082", "label_14083", "label_14084", "label_14085", "label_14086", "label_14087", "label_14088", "label_14089", "label_14090", "label_14091", "label_14092", "label_14093", "label_14094", "label_14095", "label_14096", "label_14097", "label_14098", "label_14099", "label_14100", "label_14101", "label_14102", "label_14103", "label_14104", "label_14105", "label_14106", "label_14107", "label_14108", "label_14109", "label_14110", "label_14111", "label_14112", "label_14113", "label_14114", "label_14115", "label_14116", "label_14117", "label_14118", "label_14119", "label_14120", "label_14121", "label_14122", "label_14123", "label_14124", "label_14125", "label_14126", "label_14127", "label_14128", "label_14129", "label_14130", "label_14131", "label_14132", "label_14133", "label_14134", "label_14135", "label_14136", "label_14137", "label_14138", "label_14139", "label_14140", "label_14141", "label_14142", "label_14143", "label_14144", "label_14145", "label_14146", "label_14147", "label_14148", "label_14149", "label_14150", "label_14151", "label_14152", "label_14153", "label_14154", "label_14155", "label_14156", "label_14157", "label_14158", "label_14159", "label_14160", "label_14161", "label_14162", "label_14163", "label_14164", "label_14165", "label_14166", "label_14167", "label_14168", "label_14169", "label_14170", "label_14171", "label_14172", "label_14173", "label_14174", "label_14175", "label_14176", "label_14177", "label_14178", "label_14179", "label_14180", "label_14181", "label_14182", "label_14183", "label_14184", "label_14185", "label_14186", "label_14187", "label_14188", "label_14189", "label_14190", "label_14191", "label_14192", "label_14193", "label_14194", "label_14195", "label_14196", "label_14197", "label_14198", "label_14199", "label_14200", "label_14201", "label_14202", "label_14203", "label_14204", "label_14205", "label_14206", "label_14207", "label_14208", "label_14209", "label_14210", "label_14211", "label_14212", "label_14213", "label_14214", "label_14215", "label_14216", 
"label_14217", "label_14218", "label_14219", "label_14220", "label_14221", "label_14222", "label_14223", "label_14224", "label_14225", "label_14226", "label_14227", "label_14228", "label_14229", "label_14230", "label_14231", "label_14232", "label_14233", "label_14234", "label_14235", "label_14236", "label_14237", "label_14238", "label_14239", "label_14240", "label_14241", "label_14242", "label_14243", "label_14244", "label_14245", "label_14246", "label_14247", "label_14248", "label_14249", "label_14250", "label_14251", "label_14252", "label_14253", "label_14254", "label_14255", "label_14256", "label_14257", "label_14258", "label_14259", "label_14260", "label_14261", "label_14262", "label_14263", "label_14264", "label_14265", "label_14266", "label_14267", "label_14268", "label_14269", "label_14270", "label_14271", "label_14272", "label_14273", "label_14274", "label_14275", "label_14276", "label_14277", "label_14278", "label_14279", "label_14280", "label_14281", "label_14282", "label_14283", "label_14284", "label_14285", "label_14286", "label_14287", "label_14288", "label_14289", "label_14290", "label_14291", "label_14292", "label_14293", "label_14294", "label_14295", "label_14296", "label_14297", "label_14298", "label_14299", "label_14300", "label_14301", "label_14302", "label_14303", "label_14304", "label_14305", "label_14306", "label_14307", "label_14308", "label_14309", "label_14310", "label_14311", "label_14312", "label_14313", "label_14314", "label_14315", "label_14316", "label_14317", "label_14318", "label_14319", "label_14320", "label_14321", "label_14322", "label_14323", "label_14324", "label_14325", "label_14326", "label_14327", "label_14328", "label_14329", "label_14330", "label_14331", "label_14332", "label_14333", "label_14334", "label_14335", "label_14336", "label_14337", "label_14338", "label_14339", "label_14340", "label_14341", "label_14342", "label_14343", "label_14344", "label_14345", "label_14346", "label_14347", "label_14348", "label_14349", "label_14350", "label_14351", "label_14352", "label_14353", "label_14354", "label_14355", "label_14356", "label_14357", "label_14358", "label_14359", "label_14360", "label_14361", "label_14362", "label_14363", "label_14364", "label_14365", "label_14366", "label_14367", "label_14368", "label_14369", "label_14370", "label_14371", "label_14372", "label_14373", "label_14374", "label_14375", "label_14376", "label_14377", "label_14378", "label_14379", "label_14380", "label_14381", "label_14382", "label_14383", "label_14384", "label_14385", "label_14386", "label_14387", "label_14388", "label_14389", "label_14390", "label_14391", "label_14392", "label_14393", "label_14394", "label_14395", "label_14396", "label_14397", "label_14398", "label_14399", "label_14400", "label_14401", "label_14402", "label_14403", "label_14404", "label_14405", "label_14406", "label_14407", "label_14408", "label_14409", "label_14410", "label_14411", "label_14412", "label_14413", "label_14414", "label_14415", "label_14416", "label_14417", "label_14418", "label_14419", "label_14420", "label_14421", "label_14422", "label_14423", "label_14424", "label_14425", "label_14426", "label_14427", "label_14428", "label_14429", "label_14430", "label_14431", "label_14432", "label_14433", "label_14434", "label_14435", "label_14436", "label_14437", "label_14438", "label_14439", "label_14440", "label_14441", "label_14442", "label_14443", "label_14444", "label_14445", "label_14446", "label_14447", "label_14448", "label_14449", "label_14450", "label_14451", "label_14452", "label_14453", 
"label_14454", "label_14455", "label_14456", "label_14457", "label_14458", "label_14459", "label_14460", "label_14461", "label_14462", "label_14463", "label_14464", "label_14465", "label_14466", "label_14467", "label_14468", "label_14469", "label_14470", "label_14471", "label_14472", "label_14473", "label_14474", "label_14475", "label_14476", "label_14477", "label_14478", "label_14479", "label_14480", "label_14481", "label_14482", "label_14483", "label_14484", "label_14485", "label_14486", "label_14487", "label_14488", "label_14489", "label_14490", "label_14491", "label_14492", "label_14493", "label_14494", "label_14495", "label_14496", "label_14497", "label_14498", "label_14499", "label_14500", "label_14501", "label_14502", "label_14503", "label_14504", "label_14505", "label_14506", "label_14507", "label_14508", "label_14509", "label_14510", "label_14511", "label_14512", "label_14513", "label_14514", "label_14515", "label_14516", "label_14517", "label_14518", "label_14519", "label_14520", "label_14521", "label_14522", "label_14523", "label_14524", "label_14525", "label_14526", "label_14527", "label_14528", "label_14529", "label_14530", "label_14531", "label_14532", "label_14533", "label_14534", "label_14535", "label_14536", "label_14537", "label_14538", "label_14539", "label_14540", "label_14541", "label_14542", "label_14543", "label_14544", "label_14545", "label_14546", "label_14547", "label_14548", "label_14549", "label_14550", "label_14551", "label_14552", "label_14553", "label_14554", "label_14555", "label_14556", "label_14557", "label_14558", "label_14559", "label_14560", "label_14561", "label_14562", "label_14563", "label_14564", "label_14565", "label_14566", "label_14567", "label_14568", "label_14569", "label_14570", "label_14571", "label_14572", "label_14573", "label_14574", "label_14575", "label_14576", "label_14577", "label_14578", "label_14579", "label_14580", "label_14581", "label_14582", "label_14583", "label_14584", "label_14585", "label_14586", "label_14587", "label_14588", "label_14589", "label_14590", "label_14591", "label_14592", "label_14593", "label_14594", "label_14595", "label_14596", "label_14597", "label_14598", "label_14599", "label_14600", "label_14601", "label_14602", "label_14603", "label_14604", "label_14605", "label_14606", "label_14607", "label_14608", "label_14609", "label_14610", "label_14611", "label_14612", "label_14613", "label_14614", "label_14615", "label_14616", "label_14617", "label_14618", "label_14619", "label_14620", "label_14621", "label_14622", "label_14623", "label_14624", "label_14625", "label_14626", "label_14627", "label_14628", "label_14629", "label_14630", "label_14631", "label_14632", "label_14633", "label_14634", "label_14635", "label_14636", "label_14637", "label_14638", "label_14639", "label_14640", "label_14641", "label_14642", "label_14643", "label_14644", "label_14645", "label_14646", "label_14647", "label_14648", "label_14649", "label_14650", "label_14651", "label_14652", "label_14653", "label_14654", "label_14655", "label_14656", "label_14657", "label_14658", "label_14659", "label_14660", "label_14661", "label_14662", "label_14663", "label_14664", "label_14665", "label_14666", "label_14667", "label_14668", "label_14669", "label_14670", "label_14671", "label_14672", "label_14673", "label_14674", "label_14675", "label_14676", "label_14677", "label_14678", "label_14679", "label_14680", "label_14681", "label_14682", "label_14683", "label_14684", "label_14685", "label_14686", "label_14687", "label_14688", "label_14689", "label_14690", 
"label_14691", "label_14692", "label_14693", "label_14694", "label_14695", "label_14696", "label_14697", "label_14698", "label_14699", "label_14700", "label_14701", "label_14702", "label_14703", "label_14704", "label_14705", "label_14706", "label_14707", "label_14708", "label_14709", "label_14710", "label_14711", "label_14712", "label_14713", "label_14714", "label_14715", "label_14716", "label_14717", "label_14718", "label_14719", "label_14720", "label_14721", "label_14722", "label_14723", "label_14724", "label_14725", "label_14726", "label_14727", "label_14728", "label_14729", "label_14730", "label_14731", "label_14732", "label_14733", "label_14734", "label_14735", "label_14736", "label_14737", "label_14738", "label_14739", "label_14740", "label_14741", "label_14742", "label_14743", "label_14744", "label_14745", "label_14746", "label_14747", "label_14748", "label_14749", "label_14750", "label_14751", "label_14752", "label_14753", "label_14754", "label_14755", "label_14756", "label_14757", "label_14758", "label_14759", "label_14760", "label_14761", "label_14762", "label_14763", "label_14764", "label_14765", "label_14766", "label_14767", "label_14768", "label_14769", "label_14770", "label_14771", "label_14772", "label_14773", "label_14774", "label_14775", "label_14776", "label_14777", "label_14778", "label_14779", "label_14780", "label_14781", "label_14782", "label_14783", "label_14784", "label_14785", "label_14786", "label_14787", "label_14788", "label_14789", "label_14790", "label_14791", "label_14792", "label_14793", "label_14794", "label_14795", "label_14796", "label_14797", "label_14798", "label_14799", "label_14800", "label_14801", "label_14802", "label_14803", "label_14804", "label_14805", "label_14806", "label_14807", "label_14808", "label_14809", "label_14810", "label_14811", "label_14812", "label_14813", "label_14814", "label_14815", "label_14816", "label_14817", "label_14818", "label_14819", "label_14820", "label_14821", "label_14822", "label_14823", "label_14824", "label_14825", "label_14826", "label_14827", "label_14828", "label_14829", "label_14830", "label_14831", "label_14832", "label_14833", "label_14834", "label_14835", "label_14836", "label_14837", "label_14838", "label_14839", "label_14840", "label_14841", "label_14842", "label_14843", "label_14844", "label_14845", "label_14846", "label_14847", "label_14848", "label_14849", "label_14850", "label_14851", "label_14852", "label_14853", "label_14854", "label_14855", "label_14856", "label_14857", "label_14858", "label_14859", "label_14860", "label_14861", "label_14862", "label_14863", "label_14864", "label_14865", "label_14866", "label_14867", "label_14868", "label_14869", "label_14870", "label_14871", "label_14872", "label_14873", "label_14874", "label_14875", "label_14876", "label_14877", "label_14878", "label_14879", "label_14880", "label_14881", "label_14882", "label_14883", "label_14884", "label_14885", "label_14886", "label_14887", "label_14888", "label_14889", "label_14890", "label_14891", "label_14892", "label_14893" ]
bortle/astrophotography-object-classifier-alpha
# Model Trained Using AutoTrain

- Problem type: Multi-class Classification
- Model ID: 39543103134
- CO2 Emissions (in grams): 1.9112

## Validation Metrics

- Loss: 0.011
- Accuracy: 1.000
- Macro F1: 1.000
- Micro F1: 1.000
- Weighted F1: 1.000
- Macro Precision: 1.000
- Micro Precision: 1.000
- Weighted Precision: 1.000
- Macro Recall: 1.000
- Micro Recall: 1.000
- Weighted Recall: 1.000
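As a quick usage sketch (not part of the original card), the checkpoint can be queried through the `transformers` image-classification pipeline; the image filename below is a placeholder:

```python
from transformers import pipeline

# Load the classifier from the Hub and score a local astrophotography image.
# "m31_stack.jpg" is a placeholder path, not a file shipped with the model.
classifier = pipeline(
    "image-classification",
    model="bortle/astrophotography-object-classifier-alpha",
)
for prediction in classifier("m31_stack.jpg"):
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```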
[ "diffuse_nebula", "galaxy", "globular_cluster", "open_cluster", "planetary_nebula", "supernova_remnant" ]
smakubi/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224-finetuned-flower

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

### Framework versions

- Transformers 4.24.0
- Pytorch 1.13.1+cu116
- Datasets 2.7.1
- Tokenizers 0.13.2
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
eric1993/swin-tiny-patch4-window7-224-finetuned-eurosat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# swin-tiny-patch4-window7-224-finetuned-eurosat

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the cifar10 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0866
- Accuracy: 0.9724

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.4795 | 1.0 | 351 | 0.1273 | 0.9598 |
| 0.3451 | 2.0 | 702 | 0.0996 | 0.9668 |
| 0.3242 | 3.0 | 1053 | 0.0866 | 0.9724 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
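For anyone reproducing this run, the hyperparameters above translate into `TrainingArguments` roughly as follows; this is a sketch only (the `output_dir` is a placeholder, and the `Trainer`, model, and dataset wiring are omitted):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed in the card; gradient accumulation of 4
# with a per-device batch of 32 yields the stated total train batch size of 128.
training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-eurosat",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=3,
    seed=42,
)
```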
[ "airplane", "automobile", "bird", "cat", "deer", "dog", "frog", "horse", "ship", "truck" ]
davanstrien/autotrain-map_no_map_twitter_demo-39701103400
# Model Trained Using AutoTrain

- Problem type: Binary Classification
- Model ID: 39701103400
- CO2 Emissions (in grams): 0.0010

## Validation Metrics

- Loss: 0.137
- Accuracy: 0.947
- Precision: 1.000
- Recall: 0.923
- AUC: 1.000
- F1: 0.960
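To make the metric definitions concrete, the listed quantities can be computed with `scikit-learn`; the labels and scores below are illustrative stand-ins, not this model's actual predictions:

```python
from sklearn.metrics import (
    accuracy_score, f1_score, precision_score, recall_score, roc_auc_score,
)

# Toy ground truth and predicted probabilities for a binary task (map vs. no_map).
y_true = [0, 1, 1, 0, 1]
y_score = [0.10, 0.90, 0.40, 0.20, 0.80]
y_pred = [int(s >= 0.5) for s in y_score]  # threshold at 0.5

print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))
print("recall   :", recall_score(y_true, y_pred))
print("f1       :", f1_score(y_true, y_pred))
print("auc      :", roc_auc_score(y_true, y_score))  # AUC uses the raw scores
```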
[ "map", "no_map" ]
julienmercier/vit-base-patch16-224-in21k-mobile-eye-tracking-dataset-v0
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. -->

# julienmercier/eyetracking_classifier

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.4188
- Validation Loss: 0.2002
- Train Accuracy: 0.9349
- Epoch: 0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 5492, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32

### Training results

| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 0.4188 | 0.2002 | 0.9349 | 0 |

### Framework versions

- Transformers 4.26.1
- TensorFlow 2.11.0
- Datasets 2.10.1
- Tokenizers 0.13.2
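The optimizer dictionary above matches what the TensorFlow helper `transformers.create_optimizer` produces: an AdamWeightDecay optimizer whose learning rate follows a power-1.0 polynomial (i.e. linear) decay. A minimal sketch, assuming no warmup steps:

```python
from transformers import create_optimizer

# Builds AdamWeightDecay with the learning rate decaying linearly from 3e-5
# to 0 over 5492 steps and a weight decay rate of 0.01, as in the dict above.
optimizer, lr_schedule = create_optimizer(
    init_lr=3e-5,
    num_train_steps=5492,
    num_warmup_steps=0,  # assumption: the logged config shows no warmup
    weight_decay_rate=0.01,
)
```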
[ "in", "none", "out" ]
uisikdag/vit-base-patch16-224-in21k-plant-seedling-classification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# weeds_hfclass11

The model was trained on a balanced dataset (250 images per class) with a 0.8/0.1/0.1 train/validation/test split; images were resized to 224x224.

Dataset: https://www.kaggle.com/datasets/vbookshelf/v2-plant-seedlings-dataset

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3603
- Accuracy: 0.9567

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.3089 | 0.99 | 37 | 2.0422 | 0.7133 |
| 1.4465 | 1.99 | 74 | 1.2227 | 0.8767 |
| 0.8455 | 2.99 | 111 | 0.8121 | 0.9067 |
| 0.6579 | 3.99 | 148 | 0.6161 | 0.9267 |
| 0.5163 | 4.99 | 185 | 0.5031 | 0.94 |
| 0.4374 | 5.99 | 222 | 0.4078 | 0.9633 |
| 0.3912 | 6.99 | 259 | 0.4134 | 0.9467 |
| 0.358 | 7.99 | 296 | 0.4207 | 0.9233 |
| 0.3509 | 8.99 | 333 | 0.3768 | 0.95 |
| 0.3288 | 9.99 | 370 | 0.3603 | 0.9567 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1+cu117
- Datasets 2.10.1
- Tokenizers 0.13.2
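The 0.8/0.1/0.1 split described above can be reproduced with the `datasets` library; a minimal sketch, assuming the balanced images live in an imagefolder layout at a placeholder path:

```python
from datasets import load_dataset

# Load images arranged as imagefolder (one subdirectory per class); data_dir
# is a placeholder for wherever the balanced Kaggle subset was saved locally.
ds = load_dataset("imagefolder", data_dir="v2-plant-seedlings-balanced")["train"]

# First carve off 20%, then split that half-and-half into validation and test.
train_rest = ds.train_test_split(test_size=0.2, seed=42)
val_test = train_rest["test"].train_test_split(test_size=0.5, seed=42)
splits = {
    "train": train_rest["train"],     # 80%
    "validation": val_test["train"],  # 10%
    "test": val_test["test"],         # 10%
}
```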
[ "black-grass", "charlock", "small-flowered cranesbill", "sugar beet", "cleavers", "common chickweed", "common wheat", "fat hen", "loose silky-bent", "maize", "scentless mayweed", "shepherds purse" ]
uisikdag/beit-base-patch16-224-pt22k-ft22k-plant-seedling-classification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# weeds_hfclass12

The model was trained on a balanced dataset (250 images per class) with a 0.8/0.1/0.1 train/validation/test split; images were resized to 224x224.

Dataset: https://www.kaggle.com/datasets/vbookshelf/v2-plant-seedlings-dataset

This model is a fine-tuned version of [microsoft/beit-base-patch16-224-pt22k-ft22k](https://huggingface.co/microsoft/beit-base-patch16-224-pt22k-ft22k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1257
- Accuracy: 0.96

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.6013 | 0.99 | 37 | 0.7579 | 0.8067 |
| 0.3887 | 1.99 | 74 | 0.2834 | 0.9033 |
| 0.2846 | 2.99 | 111 | 0.2767 | 0.9 |
| 0.2086 | 3.99 | 148 | 0.2642 | 0.9067 |
| 0.1664 | 4.99 | 185 | 0.2016 | 0.9333 |
| 0.168 | 5.99 | 222 | 0.1498 | 0.9533 |
| 0.1159 | 6.99 | 259 | 0.1607 | 0.9533 |
| 0.1195 | 7.99 | 296 | 0.1719 | 0.9467 |
| 0.1013 | 8.99 | 333 | 0.1442 | 0.9533 |
| 0.0939 | 9.99 | 370 | 0.1257 | 0.96 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1+cu117
- Datasets 2.10.1
- Tokenizers 0.13.2
[ "black-grass", "charlock", "small-flowered cranesbill", "sugar beet", "cleavers", "common chickweed", "common wheat", "fat hen", "loose silky-bent", "maize", "scentless mayweed", "shepherds purse" ]
surprisedPikachu007/tomato-disease-detection_V2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# tomato-disease-detection_V2

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0483
- Accuracy: 0.9887

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1585 | 1.0 | 949 | 0.1547 | 0.9663 |
| 0.1129 | 2.0 | 1898 | 0.0773 | 0.9826 |
| 0.0223 | 3.0 | 2847 | 0.0483 | 0.9887 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1+cu117
- Datasets 2.10.1
- Tokenizers 0.13.2
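A hedged inference sketch for this checkpoint, using manual preprocessing and a forward pass rather than the pipeline API; the leaf image path is a placeholder:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "surprisedPikachu007/tomato-disease-detection_V2"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("leaf.jpg")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```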
[ "tomato___bacterial_spot", "tomato___early_blight", "healthy", "tomato___late_blight", "tomato___leaf_mold", "tomato___septoria_leaf_spot", "tomato___spider_mites two-spotted_spider_mite", "tomato___target_spot", "tomato___tomato_yellow_leaf_curl_virus", "tomato___tomato_mosaic_virus", "affected" ]
surprisedPikachu007/tomato-disease-detection
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# tomato-disease-detection

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0394
- Accuracy: 0.9918

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1363 | 1.0 | 941 | 0.1109 | 0.9774 |
| 0.0657 | 2.0 | 1882 | 0.0666 | 0.9841 |
| 0.0605 | 3.0 | 2823 | 0.0394 | 0.9918 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1+cu117
- Datasets 2.10.1
- Tokenizers 0.13.2
[ "tomato___bacterial_spot", "tomato___early_blight", "tomato___late_blight", "tomato___leaf_mold", "tomato___septoria_leaf_spot", "tomato___spider_mites two-spotted_spider_mite", "tomato___target_spot", "tomato___tomato_yellow_leaf_curl_virus", "tomato___tomato_mosaic_virus", "healthy" ]
helenai/swin-base-food101-jpqd-ov
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# swin-base-food101-jpqd-ov

This model was compressed using [NNCF](https://github.com/openvinotoolkit/nncf) with [Optimum Intel](https://github.com/huggingface/optimum-intel#openvino) following the JPQD image classification example.

It is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the food101 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3396
- Accuracy: 0.9061

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 2.2162 | 0.42 | 500 | 2.1111 | 0.7967 |
| 0.729 | 0.84 | 1000 | 0.5474 | 0.8773 |
| 0.7536 | 1.27 | 1500 | 0.3844 | 0.8984 |
| 0.4822 | 1.69 | 2000 | 0.3340 | 0.9043 |
| 12.2559 | 2.11 | 2500 | 12.0128 | 0.9033 |
| 48.7302 | 2.54 | 3000 | 48.3874 | 0.8681 |
| 75.1831 | 2.96 | 3500 | 75.3200 | 0.7183 |
| 93.5572 | 3.38 | 4000 | 93.4142 | 0.5939 |
| 103.798 | 3.8 | 4500 | 103.4427 | 0.5634 |
| 108.0993 | 4.23 | 5000 | 108.6461 | 0.5490 |
| 110.1265 | 4.65 | 5500 | 109.3663 | 0.5636 |
| 1.5584 | 5.07 | 6000 | 0.9255 | 0.8374 |
| 1.0883 | 5.49 | 6500 | 0.5841 | 0.8758 |
| 0.7024 | 5.92 | 7000 | 0.5055 | 0.8854 |
| 0.9033 | 6.34 | 7500 | 0.4639 | 0.8901 |
| 0.6901 | 6.76 | 8000 | 0.4360 | 0.8947 |
| 0.6114 | 7.19 | 8500 | 0.4080 | 0.8978 |
| 0.5102 | 7.61 | 9000 | 0.3911 | 0.9009 |
| 0.7154 | 8.03 | 9500 | 0.3747 | 0.9027 |
| 0.5621 | 8.45 | 10000 | 0.3622 | 0.9021 |
| 0.5262 | 8.88 | 10500 | 0.3554 | 0.9041 |
| 0.5442 | 9.3 | 11000 | 0.3462 | 0.9053 |
| 0.5615 | 9.72 | 11500 | 0.3416 | 0.9061 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1+cu117
- Datasets 2.8.0
- Tokenizers 0.13.2
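Since the card points at NNCF and Optimum Intel, the exported OpenVINO model can presumably be loaded through `optimum.intel` rather than plain `transformers`; a sketch under that assumption, with the image path as a placeholder:

```python
from PIL import Image
from optimum.intel import OVModelForImageClassification
from transformers import AutoImageProcessor

repo = "helenai/swin-base-food101-jpqd-ov"
model = OVModelForImageClassification.from_pretrained(repo)
processor = AutoImageProcessor.from_pretrained(repo)

image = Image.open("plate.jpg")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
logits = model(**inputs).logits
print(model.config.id2label[int(logits.argmax(-1))])
```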
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
platzi/platzi_Santiago_Res_Esc
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # platzi_Santiago_Res_Esc This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset. It achieves the following results on the evaluation set: - Loss: 0.0250 - Accuracy: 0.9925 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0059 | 3.85 | 500 | 0.0250 | 0.9925 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "angular_leaf_spot", "bean_rust", "healthy" ]
uisikdag/deit-base-patch16-224-plant-seedling-classification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # weeds_hfclass16 The model was trained on a balanced dataset (250 images per class) with a 0.8/0.1/0.1 train/validation/test split; images were resized to 224x224. See the split sketch after the label list below. Dataset: https://www.kaggle.com/datasets/vbookshelf/v2-plant-seedlings-dataset This model is a fine-tuned version of [facebook/deit-base-patch16-224](https://huggingface.co/facebook/deit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.1597 - Accuracy: 0.9467 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 2.1694        | 0.99  | 37   | 1.2880          | 0.69     | | 0.5192        | 1.99  | 74   | 0.3995          | 0.8667   | | 0.3136        | 2.99  | 111  | 0.3333          | 0.89     | | 0.2521        | 3.99  | 148  | 0.2672          | 0.91     | | 0.1693        | 4.99  | 185  | 0.2316          | 0.9167   | | 0.1747        | 5.99  | 222  | 0.1575          | 0.9567   | | 0.1324        | 6.99  | 259  | 0.1896          | 0.9467   | | 0.1102        | 7.99  | 296  | 0.1931          | 0.94     | | 0.1105        | 8.99  | 333  | 0.1537          | 0.9533   | | 0.1036        | 9.99  | 370  | 0.1597          | 0.9467   | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu117 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "black-grass", "charlock", "small-flowered cranesbill", "sugar beet", "cleavers", "common chickweed", "common wheat", "fat hen", "loose silky-bent", "maize", "scentless mayweed", "shepherds purse" ]
uisikdag/microsoft-resnet-152-plant-seedling-classification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # weeds_hfclass18 The model was trained on a balanced dataset (250 images per class) with a 0.8/0.1/0.1 train/validation/test split; images were resized to 224x224. Dataset: https://www.kaggle.com/datasets/vbookshelf/v2-plant-seedlings-dataset This model is a fine-tuned version of [microsoft/resnet-152](https://huggingface.co/microsoft/resnet-152) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 1.2397 - Accuracy: 0.7767 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 2.4803        | 0.99  | 37   | 2.4724          | 0.1133   | | 2.4464        | 1.99  | 74   | 2.4305          | 0.2967   | | 2.3843        | 2.99  | 111  | 2.3658          | 0.4233   | | 2.3018        | 3.99  | 148  | 2.2287          | 0.5067   | | 2.1075        | 4.99  | 185  | 2.0144          | 0.5967   | | 1.8743        | 5.99  | 222  | 1.7228          | 0.65     | | 1.7114        | 6.99  | 259  | 1.5487          | 0.6833   | | 1.5345        | 7.99  | 296  | 1.3920          | 0.7267   | | 1.4471        | 8.99  | 333  | 1.2914          | 0.7333   | | 1.3994        | 9.99  | 370  | 1.2397          | 0.7767   | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu117 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "black-grass", "charlock", "small-flowered cranesbill", "sugar beet", "cleavers", "common chickweed", "common wheat", "fat hen", "loose silky-bent", "maize", "scentless mayweed", "shepherds purse" ]
uisikdag/swin-base-patch4-window7-224-in22k-plant-seedling-classification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # weeds_swin_balanced This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224-in22k](https://huggingface.co/microsoft/swin-base-patch4-window7-224-in22k) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.1356 - Accuracy: 0.9667 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.5298 | 1.0 | 150 | 0.3729 | 0.8633 | | 0.4204 | 2.0 | 300 | 0.2560 | 0.8933 | | 0.3229 | 3.0 | 450 | 0.3213 | 0.88 | | 0.2347 | 4.0 | 600 | 0.2031 | 0.9233 | | 0.3342 | 5.0 | 750 | 0.1912 | 0.9367 | | 0.2106 | 6.0 | 900 | 0.1365 | 0.9467 | | 0.1891 | 7.0 | 1050 | 0.1927 | 0.97 | | 0.0629 | 8.0 | 1200 | 0.1726 | 0.9467 | | 0.1169 | 9.0 | 1350 | 0.1363 | 0.9633 | | 0.0426 | 10.0 | 1500 | 0.1356 | 0.9667 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu117 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "black-grass", "charlock", "small-flowered cranesbill", "sugar beet", "cleavers", "common chickweed", "common wheat", "fat hen", "loose silky-bent", "maize", "scentless mayweed", "shepherds purse" ]
fzaghloul/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-flower This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results ### Framework versions - Transformers 4.24.0 - Pytorch 1.13.1+cu116 - Datasets 2.7.1 - Tokenizers 0.13.2
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
jonathanfernandes/vit-base-patch16-224-finetuned-flower2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-flower2 This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results ### Framework versions - Transformers 4.24.0 - Pytorch 1.13.1+cu116 - Datasets 2.7.1 - Tokenizers 0.13.2
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
bortle/astrophotography-object-classifier-alpha2
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 40327104863 - CO2 Emissions (in grams): 0.0118 ## Validation Metrics - Loss: 0.013 - Accuracy: 1.000 - Macro F1: 1.000 - Micro F1: 1.000 - Weighted F1: 1.000 - Macro Precision: 1.000 - Micro Precision: 1.000 - Weighted Precision: 1.000 - Macro Recall: 1.000 - Micro Recall: 1.000 - Weighted Recall: 1.000
[ "diffuse_nebula", "galaxy", "globular_cluster", "milky_way", "moon", "open_cluster", "planetary_nebula", "supernova_remnant" ]
bortle/astrophotography-object-classifier-alpha4
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 40333104876 - CO2 Emissions (in grams): 0.0066 ## Validation Metrics - Loss: 0.015 - Accuracy: 1.000 - Macro F1: 1.000 - Micro F1: 1.000 - Weighted F1: 1.000 - Macro Precision: 1.000 - Micro Precision: 1.000 - Weighted Precision: 1.000 - Macro Recall: 1.000 - Micro Recall: 1.000 - Weighted Recall: 1.000
[ "diffuse_nebula", "galaxy", "globular_cluster", "milky_way", "moon", "open_cluster", "planetary_nebula", "supernova_remnant" ]
helenai/microsoft-swin-tiny-patch4-window7-224-ov
# microsoft-swin-tiny-patch4-window7-224-ov This is the [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) model converted to [OpenVINO](https://openvino.ai), for accelerated inference. An example of how to do inference on this model: ```python from optimum.intel.openvino import OVModelForImageClassification from transformers import AutoFeatureExtractor, pipeline  model_id = "helenai/microsoft-swin-tiny-patch4-window7-224-ov" model = OVModelForImageClassification.from_pretrained(model_id, compile=False) feature_extractor = AutoFeatureExtractor.from_pretrained(model_id) # Fix the input to a static shape, then compile for the target device. model.reshape(1, 3, feature_extractor.size["height"], feature_extractor.size["width"]) model.compile()  ov_pipe = pipeline("image-classification", model=model, feature_extractor=feature_extractor) image_url_or_path = "http://images.cocodataset.org/val2017/000000039769.jpg" ov_pipe(image_url_or_path) ```
[ "tench, tinca tinca", "goldfish, carassius auratus", "great white shark, white shark, man-eater, man-eating shark, carcharodon carcharias", "tiger shark, galeocerdo cuvieri", "hammerhead, hammerhead shark", "electric ray, crampfish, numbfish, torpedo", "stingray", "cock", "hen", "ostrich, struthio camelus", "brambling, fringilla montifringilla", "goldfinch, carduelis carduelis", "house finch, linnet, carpodacus mexicanus", "junco, snowbird", "indigo bunting, indigo finch, indigo bird, passerina cyanea", "robin, american robin, turdus migratorius", "bulbul", "jay", "magpie", "chickadee", "water ouzel, dipper", "kite", "bald eagle, american eagle, haliaeetus leucocephalus", "vulture", "great grey owl, great gray owl, strix nebulosa", "european fire salamander, salamandra salamandra", "common newt, triturus vulgaris", "eft", "spotted salamander, ambystoma maculatum", "axolotl, mud puppy, ambystoma mexicanum", "bullfrog, rana catesbeiana", "tree frog, tree-frog", "tailed frog, bell toad, ribbed toad, tailed toad, ascaphus trui", "loggerhead, loggerhead turtle, caretta caretta", "leatherback turtle, leatherback, leathery turtle, dermochelys coriacea", "mud turtle", "terrapin", "box turtle, box tortoise", "banded gecko", "common iguana, iguana, iguana iguana", "american chameleon, anole, anolis carolinensis", "whiptail, whiptail lizard", "agama", "frilled lizard, chlamydosaurus kingi", "alligator lizard", "gila monster, heloderma suspectum", "green lizard, lacerta viridis", "african chameleon, chamaeleo chamaeleon", "komodo dragon, komodo lizard, dragon lizard, giant lizard, varanus komodoensis", "african crocodile, nile crocodile, crocodylus niloticus", "american alligator, alligator mississipiensis", "triceratops", "thunder snake, worm snake, carphophis amoenus", "ringneck snake, ring-necked snake, ring snake", "hognose snake, puff adder, sand viper", "green snake, grass snake", "king snake, kingsnake", "garter snake, grass snake", "water snake", "vine snake", "night snake, hypsiglena torquata", "boa constrictor, constrictor constrictor", "rock python, rock snake, python sebae", "indian cobra, naja naja", "green mamba", "sea snake", "horned viper, cerastes, sand viper, horned asp, cerastes cornutus", "diamondback, diamondback rattlesnake, crotalus adamanteus", "sidewinder, horned rattlesnake, crotalus cerastes", "trilobite", "harvestman, daddy longlegs, phalangium opilio", "scorpion", "black and gold garden spider, argiope aurantia", "barn spider, araneus cavaticus", "garden spider, aranea diademata", "black widow, latrodectus mactans", "tarantula", "wolf spider, hunting spider", "tick", "centipede", "black grouse", "ptarmigan", "ruffed grouse, partridge, bonasa umbellus", "prairie chicken, prairie grouse, prairie fowl", "peacock", "quail", "partridge", "african grey, african gray, psittacus erithacus", "macaw", "sulphur-crested cockatoo, kakatoe galerita, cacatua galerita", "lorikeet", "coucal", "bee eater", "hornbill", "hummingbird", "jacamar", "toucan", "drake", "red-breasted merganser, mergus serrator", "goose", "black swan, cygnus atratus", "tusker", "echidna, spiny anteater, anteater", "platypus, duckbill, duckbilled platypus, duck-billed platypus, ornithorhynchus anatinus", "wallaby, brush kangaroo", "koala, koala bear, kangaroo bear, native bear, phascolarctos cinereus", "wombat", "jellyfish", "sea anemone, anemone", "brain coral", "flatworm, platyhelminth", "nematode, nematode worm, roundworm", "conch", "snail", "slug", "sea slug, nudibranch", "chiton, coat-of-mail shell, sea 
cradle, polyplacophore", "chambered nautilus, pearly nautilus, nautilus", "dungeness crab, cancer magister", "rock crab, cancer irroratus", "fiddler crab", "king crab, alaska crab, alaskan king crab, alaska king crab, paralithodes camtschatica", "american lobster, northern lobster, maine lobster, homarus americanus", "spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish", "crayfish, crawfish, crawdad, crawdaddy", "hermit crab", "isopod", "white stork, ciconia ciconia", "black stork, ciconia nigra", "spoonbill", "flamingo", "little blue heron, egretta caerulea", "american egret, great white heron, egretta albus", "bittern", "crane", "limpkin, aramus pictus", "european gallinule, porphyrio porphyrio", "american coot, marsh hen, mud hen, water hen, fulica americana", "bustard", "ruddy turnstone, arenaria interpres", "red-backed sandpiper, dunlin, erolia alpina", "redshank, tringa totanus", "dowitcher", "oystercatcher, oyster catcher", "pelican", "king penguin, aptenodytes patagonica", "albatross, mollymawk", "grey whale, gray whale, devilfish, eschrichtius gibbosus, eschrichtius robustus", "killer whale, killer, orca, grampus, sea wolf, orcinus orca", "dugong, dugong dugon", "sea lion", "chihuahua", "japanese spaniel", "maltese dog, maltese terrier, maltese", "pekinese, pekingese, peke", "shih-tzu", "blenheim spaniel", "papillon", "toy terrier", "rhodesian ridgeback", "afghan hound, afghan", "basset, basset hound", "beagle", "bloodhound, sleuthhound", "bluetick", "black-and-tan coonhound", "walker hound, walker foxhound", "english foxhound", "redbone", "borzoi, russian wolfhound", "irish wolfhound", "italian greyhound", "whippet", "ibizan hound, ibizan podenco", "norwegian elkhound, elkhound", "otterhound, otter hound", "saluki, gazelle hound", "scottish deerhound, deerhound", "weimaraner", "staffordshire bullterrier, staffordshire bull terrier", "american staffordshire terrier, staffordshire terrier, american pit bull terrier, pit bull terrier", "bedlington terrier", "border terrier", "kerry blue terrier", "irish terrier", "norfolk terrier", "norwich terrier", "yorkshire terrier", "wire-haired fox terrier", "lakeland terrier", "sealyham terrier, sealyham", "airedale, airedale terrier", "cairn, cairn terrier", "australian terrier", "dandie dinmont, dandie dinmont terrier", "boston bull, boston terrier", "miniature schnauzer", "giant schnauzer", "standard schnauzer", "scotch terrier, scottish terrier, scottie", "tibetan terrier, chrysanthemum dog", "silky terrier, sydney silky", "soft-coated wheaten terrier", "west highland white terrier", "lhasa, lhasa apso", "flat-coated retriever", "curly-coated retriever", "golden retriever", "labrador retriever", "chesapeake bay retriever", "german short-haired pointer", "vizsla, hungarian pointer", "english setter", "irish setter, red setter", "gordon setter", "brittany spaniel", "clumber, clumber spaniel", "english springer, english springer spaniel", "welsh springer spaniel", "cocker spaniel, english cocker spaniel, cocker", "sussex spaniel", "irish water spaniel", "kuvasz", "schipperke", "groenendael", "malinois", "briard", "kelpie", "komondor", "old english sheepdog, bobtail", "shetland sheepdog, shetland sheep dog, shetland", "collie", "border collie", "bouvier des flandres, bouviers des flandres", "rottweiler", "german shepherd, german shepherd dog, german police dog, alsatian", "doberman, doberman pinscher", "miniature pinscher", "greater swiss mountain dog", "bernese mountain dog", "appenzeller", "entlebucher", "boxer", "bull 
mastiff", "tibetan mastiff", "french bulldog", "great dane", "saint bernard, st bernard", "eskimo dog, husky", "malamute, malemute, alaskan malamute", "siberian husky", "dalmatian, coach dog, carriage dog", "affenpinscher, monkey pinscher, monkey dog", "basenji", "pug, pug-dog", "leonberg", "newfoundland, newfoundland dog", "great pyrenees", "samoyed, samoyede", "pomeranian", "chow, chow chow", "keeshond", "brabancon griffon", "pembroke, pembroke welsh corgi", "cardigan, cardigan welsh corgi", "toy poodle", "miniature poodle", "standard poodle", "mexican hairless", "timber wolf, grey wolf, gray wolf, canis lupus", "white wolf, arctic wolf, canis lupus tundrarum", "red wolf, maned wolf, canis rufus, canis niger", "coyote, prairie wolf, brush wolf, canis latrans", "dingo, warrigal, warragal, canis dingo", "dhole, cuon alpinus", "african hunting dog, hyena dog, cape hunting dog, lycaon pictus", "hyena, hyaena", "red fox, vulpes vulpes", "kit fox, vulpes macrotis", "arctic fox, white fox, alopex lagopus", "grey fox, gray fox, urocyon cinereoargenteus", "tabby, tabby cat", "tiger cat", "persian cat", "siamese cat, siamese", "egyptian cat", "cougar, puma, catamount, mountain lion, painter, panther, felis concolor", "lynx, catamount", "leopard, panthera pardus", "snow leopard, ounce, panthera uncia", "jaguar, panther, panthera onca, felis onca", "lion, king of beasts, panthera leo", "tiger, panthera tigris", "cheetah, chetah, acinonyx jubatus", "brown bear, bruin, ursus arctos", "american black bear, black bear, ursus americanus, euarctos americanus", "ice bear, polar bear, ursus maritimus, thalarctos maritimus", "sloth bear, melursus ursinus, ursus ursinus", "mongoose", "meerkat, mierkat", "tiger beetle", "ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle", "ground beetle, carabid beetle", "long-horned beetle, longicorn, longicorn beetle", "leaf beetle, chrysomelid", "dung beetle", "rhinoceros beetle", "weevil", "fly", "bee", "ant, emmet, pismire", "grasshopper, hopper", "cricket", "walking stick, walkingstick, stick insect", "cockroach, roach", "mantis, mantid", "cicada, cicala", "leafhopper", "lacewing, lacewing fly", "dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk", "damselfly", "admiral", "ringlet, ringlet butterfly", "monarch, monarch butterfly, milkweed butterfly, danaus plexippus", "cabbage butterfly", "sulphur butterfly, sulfur butterfly", "lycaenid, lycaenid butterfly", "starfish, sea star", "sea urchin", "sea cucumber, holothurian", "wood rabbit, cottontail, cottontail rabbit", "hare", "angora, angora rabbit", "hamster", "porcupine, hedgehog", "fox squirrel, eastern fox squirrel, sciurus niger", "marmot", "beaver", "guinea pig, cavia cobaya", "sorrel", "zebra", "hog, pig, grunter, squealer, sus scrofa", "wild boar, boar, sus scrofa", "warthog", "hippopotamus, hippo, river horse, hippopotamus amphibius", "ox", "water buffalo, water ox, asiatic buffalo, bubalus bubalis", "bison", "ram, tup", "bighorn, bighorn sheep, cimarron, rocky mountain bighorn, rocky mountain sheep, ovis canadensis", "ibex, capra ibex", "hartebeest", "impala, aepyceros melampus", "gazelle", "arabian camel, dromedary, camelus dromedarius", "llama", "weasel", "mink", "polecat, fitch, foulmart, foumart, mustela putorius", "black-footed ferret, ferret, mustela nigripes", "otter", "skunk, polecat, wood pussy", "badger", "armadillo", "three-toed sloth, ai, bradypus tridactylus", "orangutan, orang, orangutang, pongo pygmaeus", "gorilla, 
gorilla gorilla", "chimpanzee, chimp, pan troglodytes", "gibbon, hylobates lar", "siamang, hylobates syndactylus, symphalangus syndactylus", "guenon, guenon monkey", "patas, hussar monkey, erythrocebus patas", "baboon", "macaque", "langur", "colobus, colobus monkey", "proboscis monkey, nasalis larvatus", "marmoset", "capuchin, ringtail, cebus capucinus", "howler monkey, howler", "titi, titi monkey", "spider monkey, ateles geoffroyi", "squirrel monkey, saimiri sciureus", "madagascar cat, ring-tailed lemur, lemur catta", "indri, indris, indri indri, indri brevicaudatus", "indian elephant, elephas maximus", "african elephant, loxodonta africana", "lesser panda, red panda, panda, bear cat, cat bear, ailurus fulgens", "giant panda, panda, panda bear, coon bear, ailuropoda melanoleuca", "barracouta, snoek", "eel", "coho, cohoe, coho salmon, blue jack, silver salmon, oncorhynchus kisutch", "rock beauty, holocanthus tricolor", "anemone fish", "sturgeon", "gar, garfish, garpike, billfish, lepisosteus osseus", "lionfish", "puffer, pufferfish, blowfish, globefish", "abacus", "abaya", "academic gown, academic robe, judge's robe", "accordion, piano accordion, squeeze box", "acoustic guitar", "aircraft carrier, carrier, flattop, attack aircraft carrier", "airliner", "airship, dirigible", "altar", "ambulance", "amphibian, amphibious vehicle", "analog clock", "apiary, bee house", "apron", "ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin", "assault rifle, assault gun", "backpack, back pack, knapsack, packsack, rucksack, haversack", "bakery, bakeshop, bakehouse", "balance beam, beam", "balloon", "ballpoint, ballpoint pen, ballpen, biro", "band aid", "banjo", "bannister, banister, balustrade, balusters, handrail", "barbell", "barber chair", "barbershop", "barn", "barometer", "barrel, cask", "barrow, garden cart, lawn cart, wheelbarrow", "baseball", "basketball", "bassinet", "bassoon", "bathing cap, swimming cap", "bath towel", "bathtub, bathing tub, bath, tub", "beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon", "beacon, lighthouse, beacon light, pharos", "beaker", "bearskin, busby, shako", "beer bottle", "beer glass", "bell cote, bell cot", "bib", "bicycle-built-for-two, tandem bicycle, tandem", "bikini, two-piece", "binder, ring-binder", "binoculars, field glasses, opera glasses", "birdhouse", "boathouse", "bobsled, bobsleigh, bob", "bolo tie, bolo, bola tie, bola", "bonnet, poke bonnet", "bookcase", "bookshop, bookstore, bookstall", "bottlecap", "bow", "bow tie, bow-tie, bowtie", "brass, memorial tablet, plaque", "brassiere, bra, bandeau", "breakwater, groin, groyne, mole, bulwark, seawall, jetty", "breastplate, aegis, egis", "broom", "bucket, pail", "buckle", "bulletproof vest", "bullet train, bullet", "butcher shop, meat market", "cab, hack, taxi, taxicab", "caldron, cauldron", "candle, taper, wax light", "cannon", "canoe", "can opener, tin opener", "cardigan", "car mirror", "carousel, carrousel, merry-go-round, roundabout, whirligig", "carpenter's kit, tool kit", "carton", "car wheel", "cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, atm", "cassette", "cassette player", "castle", "catamaran", "cd player", "cello, violoncello", "cellular telephone, cellular phone, cellphone, cell, mobile phone", "chain", "chainlink fence", "chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour", "chain saw, chainsaw", "chest", "chiffonier, 
commode", "chime, bell, gong", "china cabinet, china closet", "christmas stocking", "church, church building", "cinema, movie theater, movie theatre, movie house, picture palace", "cleaver, meat cleaver, chopper", "cliff dwelling", "cloak", "clog, geta, patten, sabot", "cocktail shaker", "coffee mug", "coffeepot", "coil, spiral, volute, whorl, helix", "combination lock", "computer keyboard, keypad", "confectionery, confectionary, candy store", "container ship, containership, container vessel", "convertible", "corkscrew, bottle screw", "cornet, horn, trumpet, trump", "cowboy boot", "cowboy hat, ten-gallon hat", "cradle", "crane", "crash helmet", "crate", "crib, cot", "crock pot", "croquet ball", "crutch", "cuirass", "dam, dike, dyke", "desk", "desktop computer", "dial telephone, dial phone", "diaper, nappy, napkin", "digital clock", "digital watch", "dining table, board", "dishrag, dishcloth", "dishwasher, dish washer, dishwashing machine", "disk brake, disc brake", "dock, dockage, docking facility", "dogsled, dog sled, dog sleigh", "dome", "doormat, welcome mat", "drilling platform, offshore rig", "drum, membranophone, tympan", "drumstick", "dumbbell", "dutch oven", "electric fan, blower", "electric guitar", "electric locomotive", "entertainment center", "envelope", "espresso maker", "face powder", "feather boa, boa", "file, file cabinet, filing cabinet", "fireboat", "fire engine, fire truck", "fire screen, fireguard", "flagpole, flagstaff", "flute, transverse flute", "folding chair", "football helmet", "forklift", "fountain", "fountain pen", "four-poster", "freight car", "french horn, horn", "frying pan, frypan, skillet", "fur coat", "garbage truck, dustcart", "gasmask, respirator, gas helmet", "gas pump, gasoline pump, petrol pump, island dispenser", "goblet", "go-kart", "golf ball", "golfcart, golf cart", "gondola", "gong, tam-tam", "gown", "grand piano, grand", "greenhouse, nursery, glasshouse", "grille, radiator grille", "grocery store, grocery, food market, market", "guillotine", "hair slide", "hair spray", "half track", "hammer", "hamper", "hand blower, blow dryer, blow drier, hair dryer, hair drier", "hand-held computer, hand-held microcomputer", "handkerchief, hankie, hanky, hankey", "hard disc, hard disk, fixed disk", "harmonica, mouth organ, harp, mouth harp", "harp", "harvester, reaper", "hatchet", "holster", "home theater, home theatre", "honeycomb", "hook, claw", "hoopskirt, crinoline", "horizontal bar, high bar", "horse cart, horse-cart", "hourglass", "ipod", "iron, smoothing iron", "jack-o'-lantern", "jean, blue jean, denim", "jeep, landrover", "jersey, t-shirt, tee shirt", "jigsaw puzzle", "jinrikisha, ricksha, rickshaw", "joystick", "kimono", "knee pad", "knot", "lab coat, laboratory coat", "ladle", "lampshade, lamp shade", "laptop, laptop computer", "lawn mower, mower", "lens cap, lens cover", "letter opener, paper knife, paperknife", "library", "lifeboat", "lighter, light, igniter, ignitor", "limousine, limo", "liner, ocean liner", "lipstick, lip rouge", "loafer", "lotion", "loudspeaker, speaker, speaker unit, loudspeaker system, speaker system", "loupe, jeweler's loupe", "lumbermill, sawmill", "magnetic compass", "mailbag, postbag", "mailbox, letter box", "maillot", "maillot, tank suit", "manhole cover", "maraca", "marimba, xylophone", "mask", "matchstick", "maypole", "maze, labyrinth", "measuring cup", "medicine chest, medicine cabinet", "megalith, megalithic structure", "microphone, mike", "microwave, microwave oven", "military uniform", "milk can", "minibus", 
"miniskirt, mini", "minivan", "missile", "mitten", "mixing bowl", "mobile home, manufactured home", "model t", "modem", "monastery", "monitor", "moped", "mortar", "mortarboard", "mosque", "mosquito net", "motor scooter, scooter", "mountain bike, all-terrain bike, off-roader", "mountain tent", "mouse, computer mouse", "mousetrap", "moving van", "muzzle", "nail", "neck brace", "necklace", "nipple", "notebook, notebook computer", "obelisk", "oboe, hautboy, hautbois", "ocarina, sweet potato", "odometer, hodometer, mileometer, milometer", "oil filter", "organ, pipe organ", "oscilloscope, scope, cathode-ray oscilloscope, cro", "overskirt", "oxcart", "oxygen mask", "packet", "paddle, boat paddle", "paddlewheel, paddle wheel", "padlock", "paintbrush", "pajama, pyjama, pj's, jammies", "palace", "panpipe, pandean pipe, syrinx", "paper towel", "parachute, chute", "parallel bars, bars", "park bench", "parking meter", "passenger car, coach, carriage", "patio, terrace", "pay-phone, pay-station", "pedestal, plinth, footstall", "pencil box, pencil case", "pencil sharpener", "perfume, essence", "petri dish", "photocopier", "pick, plectrum, plectron", "pickelhaube", "picket fence, paling", "pickup, pickup truck", "pier", "piggy bank, penny bank", "pill bottle", "pillow", "ping-pong ball", "pinwheel", "pirate, pirate ship", "pitcher, ewer", "plane, carpenter's plane, woodworking plane", "planetarium", "plastic bag", "plate rack", "plow, plough", "plunger, plumber's helper", "polaroid camera, polaroid land camera", "pole", "police van, police wagon, paddy wagon, patrol wagon, wagon, black maria", "poncho", "pool table, billiard table, snooker table", "pop bottle, soda bottle", "pot, flowerpot", "potter's wheel", "power drill", "prayer rug, prayer mat", "printer", "prison, prison house", "projectile, missile", "projector", "puck, hockey puck", "punching bag, punch bag, punching ball, punchball", "purse", "quill, quill pen", "quilt, comforter, comfort, puff", "racer, race car, racing car", "racket, racquet", "radiator", "radio, wireless", "radio telescope, radio reflector", "rain barrel", "recreational vehicle, rv, r.v.", "reel", "reflex camera", "refrigerator, icebox", "remote control, remote", "restaurant, eating house, eating place, eatery", "revolver, six-gun, six-shooter", "rifle", "rocking chair, rocker", "rotisserie", "rubber eraser, rubber, pencil eraser", "rugby ball", "rule, ruler", "running shoe", "safe", "safety pin", "saltshaker, salt shaker", "sandal", "sarong", "sax, saxophone", "scabbard", "scale, weighing machine", "school bus", "schooner", "scoreboard", "screen, crt screen", "screw", "screwdriver", "seat belt, seatbelt", "sewing machine", "shield, buckler", "shoe shop, shoe-shop, shoe store", "shoji", "shopping basket", "shopping cart", "shovel", "shower cap", "shower curtain", "ski", "ski mask", "sleeping bag", "slide rule, slipstick", "sliding door", "slot, one-armed bandit", "snorkel", "snowmobile", "snowplow, snowplough", "soap dispenser", "soccer ball", "sock", "solar dish, solar collector, solar furnace", "sombrero", "soup bowl", "space bar", "space heater", "space shuttle", "spatula", "speedboat", "spider web, spider's web", "spindle", "sports car, sport car", "spotlight, spot", "stage", "steam locomotive", "steel arch bridge", "steel drum", "stethoscope", "stole", "stone wall", "stopwatch, stop watch", "stove", "strainer", "streetcar, tram, tramcar, trolley, trolley car", "stretcher", "studio couch, day bed", "stupa, tope", "submarine, pigboat, sub, u-boat", "suit, suit of clothes", 
"sundial", "sunglass", "sunglasses, dark glasses, shades", "sunscreen, sunblock, sun blocker", "suspension bridge", "swab, swob, mop", "sweatshirt", "swimming trunks, bathing trunks", "swing", "switch, electric switch, electrical switch", "syringe", "table lamp", "tank, army tank, armored combat vehicle, armoured combat vehicle", "tape player", "teapot", "teddy, teddy bear", "television, television system", "tennis ball", "thatch, thatched roof", "theater curtain, theatre curtain", "thimble", "thresher, thrasher, threshing machine", "throne", "tile roof", "toaster", "tobacco shop, tobacconist shop, tobacconist", "toilet seat", "torch", "totem pole", "tow truck, tow car, wrecker", "toyshop", "tractor", "trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi", "tray", "trench coat", "tricycle, trike, velocipede", "trimaran", "tripod", "triumphal arch", "trolleybus, trolley coach, trackless trolley", "trombone", "tub, vat", "turnstile", "typewriter keyboard", "umbrella", "unicycle, monocycle", "upright, upright piano", "vacuum, vacuum cleaner", "vase", "vault", "velvet", "vending machine", "vestment", "viaduct", "violin, fiddle", "volleyball", "waffle iron", "wall clock", "wallet, billfold, notecase, pocketbook", "wardrobe, closet, press", "warplane, military plane", "washbasin, handbasin, washbowl, lavabo, wash-hand basin", "washer, automatic washer, washing machine", "water bottle", "water jug", "water tower", "whiskey jug", "whistle", "wig", "window screen", "window shade", "windsor tie", "wine bottle", "wing", "wok", "wooden spoon", "wool, woolen, woollen", "worm fence, snake fence, snake-rail fence, virginia fence", "wreck", "yawl", "yurt", "web site, website, internet site, site", "comic book", "crossword puzzle, crossword", "street sign", "traffic light, traffic signal, stoplight", "book jacket, dust cover, dust jacket, dust wrapper", "menu", "plate", "guacamole", "consomme", "hot pot, hotpot", "trifle", "ice cream, icecream", "ice lolly, lolly, lollipop, popsicle", "french loaf", "bagel, beigel", "pretzel", "cheeseburger", "hotdog, hot dog, red hot", "mashed potato", "head cabbage", "broccoli", "cauliflower", "zucchini, courgette", "spaghetti squash", "acorn squash", "butternut squash", "cucumber, cuke", "artichoke, globe artichoke", "bell pepper", "cardoon", "mushroom", "granny smith", "strawberry", "orange", "lemon", "fig", "pineapple, ananas", "banana", "jackfruit, jak, jack", "custard apple", "pomegranate", "hay", "carbonara", "chocolate sauce, chocolate syrup", "dough", "meat loaf, meatloaf", "pizza, pizza pie", "potpie", "burrito", "red wine", "espresso", "cup", "eggnog", "alp", "bubble", "cliff, drop, drop-off", "coral reef", "geyser", "lakeside, lakeshore", "promontory, headland, head, foreland", "sandbar, sand bar", "seashore, coast, seacoast, sea-coast", "valley, vale", "volcano", "ballplayer, baseball player", "groom, bridegroom", "scuba diver", "rapeseed", "daisy", "yellow lady's slipper, yellow lady-slipper, cypripedium calceolus, cypripedium parviflorum", "corn", "acorn", "hip, rose hip, rosehip", "buckeye, horse chestnut, conker", "coral fungus", "agaric", "gyromitra", "stinkhorn, carrion fungus", "earthstar", "hen-of-the-woods, hen of the woods, polyporus frondosus, grifola frondosa", "bolete", "ear, spike, capitulum", "toilet tissue, toilet paper, bathroom tissue" ]
gjuggler/swin-tiny-patch4-window7-224-finetuned-birds
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-birds This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the bird-data dataset. It achieves the following results on the evaluation set: - Loss: 0.6642 - Accuracy: 0.8215 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 72 - eval_batch_size: 72 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 288 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 3.8854 | 0.99 | 74 | 3.0164 | 0.3039 | | 2.066 | 1.99 | 148 | 1.4849 | 0.6095 | | 1.5066 | 2.99 | 222 | 1.0624 | 0.7145 | | 1.1904 | 3.99 | 296 | 0.9347 | 0.7450 | | 0.9986 | 4.99 | 370 | 0.8415 | 0.7709 | | 0.9437 | 5.99 | 444 | 0.7713 | 0.7901 | | 0.8297 | 6.99 | 518 | 0.7216 | 0.8081 | | 0.7805 | 7.99 | 592 | 0.6856 | 0.8152 | | 0.6978 | 8.99 | 666 | 0.6642 | 0.8215 | | 0.6147 | 9.99 | 740 | 0.6525 | 0.8207 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "abert's towhee", "acorn woodpecker", "allen's hummingbird", "american avocet", "american black duck", "american coot", "american crow", "american dipper", "american goldfinch", "american kestrel", "american oystercatcher", "american pipit", "american redstart", "american robin", "american tree sparrow", "american white pelican", "american wigeon", "american woodcock", "anhinga", "anna's hummingbird", "ash-throated flycatcher", "bald eagle", "baltimore oriole", "band-tailed pigeon", "bank swallow", "barn owl", "barn swallow", "barred owl", "barrow's goldeneye", "bay-breasted warbler", "bell's vireo", "belted kingfisher", "bewick's wren", "black guillemot", "black oystercatcher", "black phoebe", "black rosy-finch", "black scoter", "black skimmer", "black tern", "black turnstone", "black vulture", "black-and-white warbler", "black-bellied plover", "black-bellied whistling-duck", "black-billed cuckoo", "black-billed magpie", "black-capped chickadee", "black-chinned hummingbird", "black-crested titmouse", "black-crowned night-heron", "black-headed grosbeak", "black-legged kittiwake", "black-necked stilt", "black-tailed gnatcatcher", "black-throated blue warbler", "black-throated gray warbler", "black-throated green warbler", "blackburnian warbler", "blackpoll warbler", "blue grosbeak", "blue jay", "blue-gray gnatcatcher", "blue-headed vireo", "blue-winged teal", "blue-winged warbler", "boat-tailed grackle", "bobolink", "bohemian waxwing", "bonaparte's gull", "boreal chickadee", "brandt's cormorant", "brant", "brewer's blackbird", "brewer's sparrow", "bridled titmouse", "broad-billed hummingbird", "broad-tailed hummingbird", "broad-winged hawk", "bronzed cowbird", "brown creeper", "brown pelican", "brown thrasher", "brown-capped rosy-finch", "brown-headed cowbird", "brown-headed nuthatch", "bufflehead", "bullock's oriole", "burrowing owl", "bushtit", "cackling goose", "cactus wren", "california gull", "california quail", "california thrasher", "california towhee", "calliope hummingbird", "canada goose", "canada warbler", "canvasback", "canyon towhee", "canyon wren", "cape may warbler", "carolina chickadee", "carolina wren", "caspian tern", "cassin's finch", "cassin's kingbird", "cassin's vireo", "cattle egret", "cave swallow", "cedar waxwing", "chestnut-backed chickadee", "chestnut-sided warbler", "chihuahuan raven", "chimney swift", "chipping sparrow", "cinnamon teal", "clark's grebe", "clark's nutcracker", "clay-colored sparrow", "cliff swallow", "common eider", "common gallinule", "common goldeneye", "common grackle", "common ground-dove", "common loon", "common merganser", "common nighthawk", "common raven", "common redpoll", "common tern", "common yellowthroat", "cooper's hawk", "cordilleran flycatcher", "costa's hummingbird", "crested caracara", "curve-billed thrasher", "dark-eyed junco", "dickcissel", "double-crested cormorant", "downy woodpecker", "dunlin", "eared grebe", "eastern bluebird", "eastern kingbird", "eastern meadowlark", "eastern phoebe", "eastern screech-owl", "eastern towhee", "eastern wood-pewee", "eurasian collared-dove", "eurasian eagle-owl", "european starling", "evening grosbeak", "field sparrow", "fish crow", "florida scrub-jay", "forster's tern", "fox sparrow", "gadwall", "gambel's quail", "gila woodpecker", "glaucous-winged gull", "glossy ibis", "golden eagle", "golden-crowned kinglet", "golden-crowned sparrow", "golden-fronted woodpecker", "gray catbird", "gray jay", "gray-crowned rosy-finch", "great black-backed gull", "great blue heron", "great cormorant", 
"great crested flycatcher", "great egret", "great horned owl", "great-tailed grackle", "greater roadrunner", "greater scaup", "greater white-fronted goose", "greater yellowlegs", "green heron", "green-tailed towhee", "green-winged teal", "hairy woodpecker", "harlequin duck", "harris's hawk", "harris's sparrow", "heermann's gull", "hermit thrush", "hermit warbler", "herring gull", "hoary redpoll", "hooded merganser", "hooded oriole", "hooded warbler", "horned grebe", "horned lark", "house finch", "house sparrow", "house wren", "hutton's vireo", "inca dove", "indigo bunting", "juniper titmouse", "killdeer", "ladder-backed woodpecker", "lark bunting", "lark sparrow", "laughing gull", "lazuli bunting", "least flycatcher", "least sandpiper", "lesser goldfinch", "lesser scaup", "lesser yellowlegs", "lincoln's sparrow", "little blue heron", "loggerhead shrike", "long-billed curlew", "long-tailed duck", "louisiana waterthrush", "macgillivray's warbler", "magnolia warbler", "mallard", "marbled godwit", "marsh wren", "merlin", "mew gull", "mexican jay", "mississippi kite", "monk parakeet", "mottled duck", "mountain bluebird", "mountain chickadee", "mourning dove", "mourning warbler", "mute swan", "nashville warbler", "neotropic cormorant", "northern bobwhite", "northern cardinal", "northern flicker", "northern gannet", "northern harrier", "northern mockingbird", "northern parula", "northern pintail", "northern pygmy-owl", "northern rough-winged swallow", "northern saw-whet owl", "northern shoveler", "northern shrike", "northern waterthrush", "northwestern crow", "nuttall's woodpecker", "oak titmouse", "orange-crowned warbler", "orchard oriole", "osprey", "ovenbird", "pacific loon", "pacific wren", "pacific-slope flycatcher", "painted bunting", "palm warbler", "pelagic cormorant", "peregrine falcon", "phainopepla", "pied-billed grebe", "pigeon guillemot", "pileated woodpecker", "pine grosbeak", "pine siskin", "pine warbler", "plumbeous vireo", "prairie falcon", "prairie warbler", "prothonotary warbler", "purple finch", "purple gallinule", "purple martin", "pygmy nuthatch", "pyrrhuloxia", "red crossbill", "red-bellied woodpecker", "red-breasted merganser", "red-breasted nuthatch", "red-breasted sapsucker", "red-eyed vireo", "red-headed woodpecker", "red-naped sapsucker", "red-necked grebe", "red-shouldered hawk", "red-tailed hawk", "red-throated loon", "red-winged blackbird", "reddish egret", "redhead", "ring-billed gull", "ring-necked duck", "ring-necked pheasant", "rock pigeon", "rose-breasted grosbeak", "roseate spoonbill", "ross's goose", "rough-legged hawk", "royal tern", "ruby-crowned kinglet", "ruby-throated hummingbird", "ruddy duck", "ruddy turnstone", "ruffed grouse", "rufous hummingbird", "rufous-crowned sparrow", "rusty blackbird", "sanderling", "sandhill crane", "savannah sparrow", "say's phoebe", "scaled quail", "scarlet tanager", "scissor-tailed flycatcher", "semipalmated plover", "semipalmated sandpiper", "sharp-shinned hawk", "short-billed dowitcher", "snow bunting", "snow goose", "snowy egret", "snowy owl", "solitary sandpiper", "song sparrow", "spotted sandpiper", "spotted towhee", "steller's jay", "summer tanager", "surf scoter", "surfbird", "swainson's hawk", "swainson's thrush", "swallow-tailed kite", "swamp sparrow", "tennessee warbler", "townsend's solitaire", "townsend's warbler", "tree swallow", "tricolored heron", "trumpeter swan", "tufted titmouse", "tundra swan", "turkey vulture", "varied thrush", "vaux's swift", "veery", "verdin", "vermilion flycatcher", "vesper sparrow", 
"violet-green swallow", "warbling vireo", "western bluebird", "western grebe", "western gull", "western kingbird", "western meadowlark", "western sandpiper", "western screech-owl", "western scrub-jay", "western tanager", "western wood-pewee", "whimbrel", "white ibis", "white-breasted nuthatch", "white-crowned sparrow", "white-eyed vireo", "white-faced ibis", "white-tailed kite", "white-throated sparrow", "white-throated swift", "white-winged crossbill", "white-winged dove", "white-winged scoter", "wild turkey", "willet", "wilson's phalarope", "wilson's snipe", "wilson's warbler", "winter wren", "wood duck", "wood stork", "wood thrush", "wrentit", "yellow warbler", "yellow-bellied sapsucker", "yellow-billed cuckoo", "yellow-billed magpie", "yellow-breasted chat", "yellow-crowned night-heron", "yellow-headed blackbird", "yellow-rumped warbler", "yellow-throated vireo", "yellow-throated warbler" ]
gjuggler/swin-tiny-patch4-window7-224-finetuned-birds-finetuned-birds
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-birds-finetuned-birds This model is a fine-tuned version of [gjuggler/swin-tiny-patch4-window7-224-finetuned-birds](https://huggingface.co/gjuggler/swin-tiny-patch4-window7-224-finetuned-birds) on the bird-data dataset. It achieves the following results on the evaluation set: - Loss: 1.2646 - Accuracy: 0.6919 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 256 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 2.0629 | 1.0 | 84 | 1.5111 | 0.6455 | | 1.8561 | 2.0 | 168 | 1.3206 | 0.6747 | | 1.686 | 3.0 | 252 | 1.2646 | 0.6919 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "abert's towhee", "acorn woodpecker", "allen's hummingbird", "american avocet", "american black duck", "american coot", "american crow", "american dipper", "american goldfinch", "american kestrel", "american oystercatcher", "american pipit", "american redstart", "american robin", "american tree sparrow", "american white pelican", "american wigeon", "american woodcock", "anhinga", "anna's hummingbird", "ash-throated flycatcher", "bald eagle", "baltimore oriole", "band-tailed pigeon", "bank swallow", "barn owl", "barn swallow", "barred owl", "barrow's goldeneye", "bay-breasted warbler", "bell's vireo", "belted kingfisher", "bewick's wren", "black guillemot", "black oystercatcher", "black phoebe", "black rosy-finch", "black scoter", "black skimmer", "black tern", "black turnstone", "black vulture", "black-and-white warbler", "black-bellied plover", "black-bellied whistling-duck", "black-billed cuckoo", "black-billed magpie", "black-capped chickadee", "black-chinned hummingbird", "black-crested titmouse", "black-crowned night-heron", "black-headed grosbeak", "black-legged kittiwake", "black-necked stilt", "black-tailed gnatcatcher", "black-throated blue warbler", "black-throated gray warbler", "black-throated green warbler", "blackburnian warbler", "blackpoll warbler", "blue grosbeak", "blue jay", "blue-gray gnatcatcher", "blue-headed vireo", "blue-winged teal", "blue-winged warbler", "boat-tailed grackle", "bobolink", "bohemian waxwing", "bonaparte's gull", "boreal chickadee", "brandt's cormorant", "brant", "brewer's blackbird", "brewer's sparrow", "bridled titmouse", "broad-billed hummingbird", "broad-tailed hummingbird", "broad-winged hawk", "bronzed cowbird", "brown creeper", "brown pelican", "brown thrasher", "brown-capped rosy-finch", "brown-headed cowbird", "brown-headed nuthatch", "bufflehead", "bullock's oriole", "burrowing owl", "bushtit", "cackling goose", "cactus wren", "california gull", "california quail", "california thrasher", "california towhee", "calliope hummingbird", "canada goose", "canada warbler", "canvasback", "canyon towhee", "canyon wren", "cape may warbler", "carolina chickadee", "carolina wren", "caspian tern", "cassin's finch", "cassin's kingbird", "cassin's vireo", "cattle egret", "cave swallow", "cedar waxwing", "chestnut-backed chickadee", "chestnut-sided warbler", "chihuahuan raven", "chimney swift", "chipping sparrow", "cinnamon teal", "clark's grebe", "clark's nutcracker", "clay-colored sparrow", "cliff swallow", "common eider", "common gallinule", "common goldeneye", "common grackle", "common ground-dove", "common loon", "common merganser", "common nighthawk", "common raven", "common redpoll", "common tern", "common yellowthroat", "cooper's hawk", "cordilleran flycatcher", "costa's hummingbird", "crested caracara", "curve-billed thrasher", "dark-eyed junco", "dickcissel", "double-crested cormorant", "downy woodpecker", "dunlin", "eared grebe", "eastern bluebird", "eastern kingbird", "eastern meadowlark", "eastern phoebe", "eastern screech-owl", "eastern towhee", "eastern wood-pewee", "eurasian collared-dove", "eurasian eagle-owl", "european starling", "evening grosbeak", "field sparrow", "fish crow", "florida scrub-jay", "forster's tern", "fox sparrow", "gadwall", "gambel's quail", "gila woodpecker", "glaucous-winged gull", "glossy ibis", "golden eagle", "golden-crowned kinglet", "golden-crowned sparrow", "golden-fronted woodpecker", "gray catbird", "gray jay", "gray-crowned rosy-finch", "great black-backed gull", "great blue heron", "great cormorant", 
"great crested flycatcher", "great egret", "great horned owl", "great-tailed grackle", "greater roadrunner", "greater scaup", "greater white-fronted goose", "greater yellowlegs", "green heron", "green-tailed towhee", "green-winged teal", "hairy woodpecker", "harlequin duck", "harris's hawk", "harris's sparrow", "heermann's gull", "hermit thrush", "hermit warbler", "herring gull", "hoary redpoll", "hooded merganser", "hooded oriole", "hooded warbler", "horned grebe", "horned lark", "house finch", "house sparrow", "house wren", "hutton's vireo", "inca dove", "indigo bunting", "juniper titmouse", "killdeer", "ladder-backed woodpecker", "lark bunting", "lark sparrow", "laughing gull", "lazuli bunting", "least flycatcher", "least sandpiper", "lesser goldfinch", "lesser scaup", "lesser yellowlegs", "lincoln's sparrow", "little blue heron", "loggerhead shrike", "long-billed curlew", "long-tailed duck", "louisiana waterthrush", "macgillivray's warbler", "magnolia warbler", "mallard", "marbled godwit", "marsh wren", "merlin", "mew gull", "mexican jay", "mississippi kite", "monk parakeet", "mottled duck", "mountain bluebird", "mountain chickadee", "mourning dove", "mourning warbler", "mute swan", "nashville warbler", "neotropic cormorant", "northern bobwhite", "northern cardinal", "northern flicker", "northern gannet", "northern harrier", "northern mockingbird", "northern parula", "northern pintail", "northern pygmy-owl", "northern rough-winged swallow", "northern saw-whet owl", "northern shoveler", "northern shrike", "northern waterthrush", "northwestern crow", "nuttall's woodpecker", "oak titmouse", "orange-crowned warbler", "orchard oriole", "osprey", "ovenbird", "pacific loon", "pacific wren", "pacific-slope flycatcher", "painted bunting", "palm warbler", "pelagic cormorant", "peregrine falcon", "phainopepla", "pied-billed grebe", "pigeon guillemot", "pileated woodpecker", "pine grosbeak", "pine siskin", "pine warbler", "plumbeous vireo", "prairie falcon", "prairie warbler", "prothonotary warbler", "purple finch", "purple gallinule", "purple martin", "pygmy nuthatch", "pyrrhuloxia", "red crossbill", "red-bellied woodpecker", "red-breasted merganser", "red-breasted nuthatch", "red-breasted sapsucker", "red-eyed vireo", "red-headed woodpecker", "red-naped sapsucker", "red-necked grebe", "red-shouldered hawk", "red-tailed hawk", "red-throated loon", "red-winged blackbird", "reddish egret", "redhead", "ring-billed gull", "ring-necked duck", "ring-necked pheasant", "rock pigeon", "rose-breasted grosbeak", "roseate spoonbill", "ross's goose", "rough-legged hawk", "royal tern", "ruby-crowned kinglet", "ruby-throated hummingbird", "ruddy duck", "ruddy turnstone", "ruffed grouse", "rufous hummingbird", "rufous-crowned sparrow", "rusty blackbird", "sanderling", "sandhill crane", "savannah sparrow", "say's phoebe", "scaled quail", "scarlet tanager", "scissor-tailed flycatcher", "semipalmated plover", "semipalmated sandpiper", "sharp-shinned hawk", "short-billed dowitcher", "snow bunting", "snow goose", "snowy egret", "snowy owl", "solitary sandpiper", "song sparrow", "spotted sandpiper", "spotted towhee", "steller's jay", "summer tanager", "surf scoter", "surfbird", "swainson's hawk", "swainson's thrush", "swallow-tailed kite", "swamp sparrow", "tennessee warbler", "townsend's solitaire", "townsend's warbler", "tree swallow", "tricolored heron", "trumpeter swan", "tufted titmouse", "tundra swan", "turkey vulture", "varied thrush", "vaux's swift", "veery", "verdin", "vermilion flycatcher", "vesper sparrow", 
"violet-green swallow", "warbling vireo", "western bluebird", "western grebe", "western gull", "western kingbird", "western meadowlark", "western sandpiper", "western screech-owl", "western scrub-jay", "western tanager", "western wood-pewee", "whimbrel", "white ibis", "white-breasted nuthatch", "white-crowned sparrow", "white-eyed vireo", "white-faced ibis", "white-tailed kite", "white-throated sparrow", "white-throated swift", "white-winged crossbill", "white-winged dove", "white-winged scoter", "wild turkey", "willet", "wilson's phalarope", "wilson's snipe", "wilson's warbler", "winter wren", "wood duck", "wood stork", "wood thrush", "wrentit", "yellow warbler", "yellow-bellied sapsucker", "yellow-billed cuckoo", "yellow-billed magpie", "yellow-breasted chat", "yellow-crowned night-heron", "yellow-headed blackbird", "yellow-rumped warbler", "yellow-throated vireo", "yellow-throated warbler" ]
katielink/autotrain-cxr-cfdl-repro-40197105212
# Model Trained Using AutoTrain - Problem type: Binary Classification - Model ID: 40197105212 - CO2 Emissions (in grams): 0.0082 ## Validation Metrics - Loss: 0.063 - Accuracy: 0.980 - Precision: 0.984 - Recall: 0.990 - AUC: 0.997 - F1: 0.987
[ "normal", "pneumonia" ]
micazevedo/autotrain-vision-tcg-40463105224
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 40463105224 - CO2 Emissions (in grams): 1.6135 ## Validation Metrics - Loss: 0.751 - Accuracy: 1.000 - Macro F1: 1.000 - Micro F1: 1.000 - Weighted F1: 1.000 - Macro Precision: 1.000 - Micro Precision: 1.000 - Weighted Precision: 1.000 - Macro Recall: 1.000 - Micro Recall: 1.000 - Weighted Recall: 1.000
[ "base1-1 alakazam pokémon psychic rare holo", "base1-10 mewtwo pokémon psychic rare holo", "base1-16 zapdos pokémon lightning rare holo", "base1-98 fire energy energy", "base1-99 grass energy energy", "base1-17 beedrill pokémon grass rare", "base1-18 dragonair pokémon colorless rare", "base1-19 dugtrio pokémon fighting rare", "base1-2 blastoise pokémon water rare holo", "base1-20 electabuzz pokémon lightning rare", "base1-21 electrode pokémon lightning rare", "base1-22 pidgeotto pokémon colorless rare", "base1-23 arcanine pokémon fire uncommon", "base1-24 charmeleon pokémon fire uncommon", "base1-100 lightning energy energy", "base1-25 dewgong pokémon water uncommon", "base1-26 dratini pokémon colorless uncommon", "base1-27 farfetch'd pokémon colorless uncommon", "base1-28 growlithe pokémon fire uncommon", "base1-29 haunter pokémon psychic uncommon", "base1-3 chansey pokémon colorless rare holo", "base1-30 ivysaur pokémon grass uncommon", "base1-31 jynx pokémon psychic uncommon", "base1-32 kadabra pokémon psychic uncommon", "base1-33 kakuna pokémon grass uncommon", "base1-101 psychic energy energy", "base1-34 machoke pokémon fighting uncommon", "base1-35 magikarp pokémon water uncommon", "base1-36 magmar pokémon fire uncommon", "base1-37 nidorino pokémon grass uncommon", "base1-38 poliwhirl pokémon water uncommon", "base1-39 porygon pokémon colorless uncommon", "base1-4 charizard pokémon fire rare holo", "base1-40 raticate pokémon colorless uncommon", "base1-41 seel pokémon water uncommon", "base1-42 wartortle pokémon water uncommon", "base1-102 water energy energy", "base1-43 abra pokémon psychic common", "base1-44 bulbasaur pokémon grass common", "base1-45 caterpie pokémon grass common", "base1-46 charmander pokémon fire common", "base1-47 diglett pokémon fighting common", "base1-48 doduo pokémon colorless common", "base1-49 drowzee pokémon psychic common", "base1-5 clefairy pokémon colorless rare holo", "base1-50 gastly pokémon psychic common", "base1-51 koffing pokémon grass common", "base1-11 nidoking pokémon grass rare holo", "base1-52 machop pokémon fighting common", "base1-53 magnemite pokémon lightning common", "base1-54 metapod pokémon grass common", "base1-55 nidoran ♂ pokémon grass common", "base1-56 onix pokémon fighting common", "base1-57 pidgey pokémon colorless common", "base1-58 pikachu pokémon lightning common", "base1-59 poliwag pokémon water common", "base1-6 gyarados pokémon water rare holo", "base1-60 ponyta pokémon fire common", "base1-12 ninetales pokémon fire rare holo", "base1-61 rattata pokémon colorless common", "base1-62 sandshrew pokémon fighting common", "base1-63 squirtle pokémon water common", "base1-64 starmie pokémon water common", "base1-65 staryu pokémon water common", "base1-66 tangela pokémon grass common", "base1-67 voltorb pokémon lightning common", "base1-68 vulpix pokémon fire common", "base1-69 weedle pokémon grass common", "base1-7 hitmonchan pokémon fighting rare holo", "base1-13 poliwrath pokémon water rare holo", "base1-70 clefairy doll trainer rare", "base1-71 computer search trainer rare", "base1-72 devolution spray trainer rare", "base1-73 impostor professor oak trainer rare", "base1-74 item finder trainer rare", "base1-75 lass trainer rare", "base1-76 pokémon breeder trainer rare", "base1-77 pokémon trader trainer rare", "base1-78 scoop up trainer rare", "base1-79 super energy removal trainer rare", "base1-14 raichu pokémon lightning rare holo", "base1-8 machamp pokémon fighting rare holo", "base1-80 defender trainer uncommon", 
"base1-81 energy retrieval trainer uncommon", "base1-82 full heal trainer uncommon", "base1-83 maintenance trainer uncommon", "base1-84 pluspower trainer uncommon", "base1-85 pokémon center trainer uncommon", "base1-86 pokémon flute trainer uncommon", "base1-87 pokédex trainer uncommon", "base1-88 professor oak trainer uncommon", "base1-15 venusaur pokémon grass rare holo", "base1-89 revive trainer uncommon", "base1-9 magneton pokémon lightning rare holo", "base1-90 super potion trainer uncommon", "base1-91 bill trainer common", "base1-92 energy removal trainer common", "base1-93 gust of wind trainer common", "base1-94 potion trainer common", "base1-95 switch trainer common", "base1-96 double colorless energy energy uncommon", "base1-97 fighting energy energy" ]
ChasingMercer/weather-mod
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # weather-mod This model is a fine-tuned version of [microsoft/beit-base-patch16-224-pt22k-ft22k](https://huggingface.co/microsoft/beit-base-patch16-224-pt22k-ft22k) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0907 - Accuracy: 0.9745 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 6 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.4096 | 1.0 | 118 | 0.3402 | 0.8790 | | 0.2749 | 2.0 | 236 | 0.1482 | 0.9490 | | 0.1989 | 3.0 | 354 | 0.1297 | 0.9660 | | 0.1129 | 4.0 | 472 | 0.1074 | 0.9788 | | 0.0827 | 5.0 | 590 | 0.1023 | 0.9745 | | 0.0644 | 6.0 | 708 | 0.0907 | 0.9745 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "dew", "fogsmog", "frost", "glaze", "hail", "lightning", "rain", "rainbow", "rime", "sandstorm", "snow" ]
trpakov/vit-pneumonia
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-pneumonia This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the chest-xray-classification dataset. It achieves the following results on the evaluation set: - Loss: 0.1086 - Accuracy: 0.9768 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.25 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0357 | 1.0 | 192 | 0.0955 | 0.9691 | | 0.0404 | 2.0 | 384 | 0.0720 | 0.9751 | | 0.0546 | 3.0 | 576 | 0.2275 | 0.9468 | | 0.0113 | 4.0 | 768 | 0.1386 | 0.9648 | | 0.0101 | 5.0 | 960 | 0.1212 | 0.9708 | | 0.0003 | 6.0 | 1152 | 0.0929 | 0.9777 | | 0.0002 | 7.0 | 1344 | 0.1051 | 0.9777 | | 0.0002 | 8.0 | 1536 | 0.1075 | 0.9777 | | 0.0002 | 9.0 | 1728 | 0.1084 | 0.9768 | | 0.0002 | 10.0 | 1920 | 0.1086 | 0.9768 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "pneumonia", "normal" ]
priyankloco/swin-base-patch4-window7-224-in22k-finetuned_swinv1-autotags-224
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-base-patch4-window7-224-in22k-finetuned_swinv1-autotags-224 This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224-in22k](https://huggingface.co/microsoft/swin-base-patch4-window7-224-in22k) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.1186 - Accuracy: 0.9675 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 20 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.1468 | 0.99 | 61 | 0.8540 | 0.7503 | | 0.6167 | 1.99 | 122 | 0.3772 | 0.8904 | | 0.45 | 2.99 | 183 | 0.2963 | 0.9086 | | 0.2163 | 3.99 | 244 | 0.2172 | 0.9391 | | 0.209 | 4.99 | 305 | 0.1733 | 0.9431 | | 0.1558 | 5.99 | 366 | 0.2101 | 0.9310 | | 0.109 | 6.99 | 427 | 0.1268 | 0.9655 | | 0.1214 | 7.99 | 488 | 0.1251 | 0.9706 | | 0.1471 | 8.99 | 549 | 0.1194 | 0.9665 | | 0.0888 | 9.99 | 610 | 0.1376 | 0.9574 | | 0.1077 | 10.99 | 671 | 0.1211 | 0.9614 | | 0.0969 | 11.99 | 732 | 0.1231 | 0.9695 | | 0.0585 | 12.99 | 793 | 0.1472 | 0.9553 | | 0.0659 | 13.99 | 854 | 0.1203 | 0.9655 | | 0.0645 | 14.99 | 915 | 0.1405 | 0.9614 | | 0.0472 | 15.99 | 976 | 0.1340 | 0.9604 | | 0.0616 | 16.99 | 1037 | 0.1272 | 0.9655 | | 0.0609 | 17.99 | 1098 | 0.1121 | 0.9685 | | 0.0525 | 18.99 | 1159 | 0.1162 | 0.9685 | | 0.0406 | 19.99 | 1220 | 0.1186 | 0.9675 | ### Framework versions - Transformers 4.25.1 - Pytorch 1.10.2+cu113 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "accordion", "button", "checkbox", "date-time", "date-time picker", "dropdown_closed", "dropdown_open", "form", "input", "multi-node_horizontal", "multi-node_vertical", "progress_circular", "progress_linear", "progress_stepper", "radio", "sections", "slider", "switch" ]
chanelcolgate/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-flower This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.7.1 - Tokenizers 0.13.2
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
priyankloco/swinv2-tiny-patch4-window8-256-finetuned_swinv2tiny-autotags-256
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swinv2-tiny-patch4-window8-256-finetuned_swinv2tiny-autotags-256 This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window8-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window8-256) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.1115 - Accuracy: 0.9655 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 15 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.6169 | 0.99 | 61 | 1.1018 | 0.6701 | | 0.7747 | 1.99 | 122 | 0.4571 | 0.8670 | | 0.6088 | 2.99 | 183 | 0.3002 | 0.9198 | | 0.3908 | 3.99 | 244 | 0.2334 | 0.9299 | | 0.399 | 4.99 | 305 | 0.2138 | 0.9320 | | 0.2969 | 5.99 | 366 | 0.1650 | 0.9492 | | 0.2743 | 6.99 | 427 | 0.1514 | 0.9533 | | 0.2947 | 7.99 | 488 | 0.1428 | 0.9513 | | 0.2304 | 8.99 | 549 | 0.1541 | 0.9523 | | 0.1957 | 9.99 | 610 | 0.1256 | 0.9604 | | 0.1645 | 10.99 | 671 | 0.1138 | 0.9645 | | 0.2317 | 11.99 | 732 | 0.1140 | 0.9655 | | 0.1001 | 12.99 | 793 | 0.1068 | 0.9706 | | 0.1564 | 13.99 | 854 | 0.1119 | 0.9675 | | 0.1386 | 14.99 | 915 | 0.1115 | 0.9655 | ### Framework versions - Transformers 4.25.1 - Pytorch 1.10.2+cu113 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "accordion", "button", "checkbox", "date-time", "date-time picker", "dropdown_closed", "dropdown_open", "form", "input", "multi-node_horizontal", "multi-node_vertical", "progress_circular", "progress_linear", "progress_stepper", "radio", "sections", "slider", "switch" ]
Kaspar/vit-base-railspace
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-railspace This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.0292 - Accuracy: 0.9926 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data | class | precision | recall | f1-score | support | |:-----:|:---------:|:------:|:--------:|:-------:| | 0 | 1.00 | 1.00 | 1.00 | 11315 | | 1 | 0.92 | 0.94 | 0.93 | 204 | | 2 | 0.95 | 0.97 | 0.96 | 714 | | 3 | 0.87 | 0.98 | 0.92 | 171 | | macro avg | 0.93 | 0.97 | 0.95 | 12404 | | weighted avg | 0.99 | 0.99 | 0.99 | 12404 | | accuracy | | | 0.99 | 12404 | ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 64 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0206 | 1.72 | 1000 | 0.0422 | 0.9854 | | 0.0008 | 3.44 | 2000 | 0.0316 | 0.9918 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "no building or railspace", "railspace", "building", "railspace and non railspace building" ]
davanstrien/autotrain-mapreader-5000-40830105612
# Model Trained Using AutoTrain Image classification model trained to predict whether a patch of a historic map contains 'railspace' or not. See the [dataset](https://huggingface.co/datasets/Livingwithmachines/MapReader_Data_SIGSPATIAL_2022) used for training for more information on the labels. - Problem type: Multi-class Classification - Model ID: 40830105612 - CO2 Emissions (in grams): 0.0081 ## Validation Metrics - Loss: 0.038 - Accuracy: 0.995 - Macro F1: 0.983 - Micro F1: 0.995 - Weighted F1: 0.995 - Macro Precision: 0.991 - Micro Precision: 0.995 - Weighted Precision: 0.995 - Macro Recall: 0.975 - Micro Recall: 0.995 - Weighted Recall: 0.995
[ "building", "no building or railspace", "railspace", "railspace and non railspace building" ]
victor/autotrain-satellite-image-classification-40975105875
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 40975105875 - CO2 Emissions (in grams): 2.3260 ## Validation Metrics - Loss: 0.002 - Accuracy: 1.000 - Macro F1: 1.000 - Micro F1: 1.000 - Weighted F1: 1.000 - Macro Precision: 1.000 - Micro Precision: 1.000 - Weighted Precision: 1.000 - Macro Recall: 1.000 - Micro Recall: 1.000 - Weighted Recall: 1.000
[ "cloudy", "desert", "green_area", "water" ]
abletobetable/image_feature_extractor
Fine-tuned BEiT for classifying a product's category from its image
[ "2599", "2600", "2730", "2897", "2898", "10309", "10868", "10912", "11018", "11508", "11533", "11536", "11539", "2731", "11542", "11548", "11549", "11550", "11560", "11565", "11568", "11573", "11575", "11581", "2732", "11584", "11587", "11594", "11602", "11610", "11619", "11621", "11622", "11624", "11636", "2733", "11637", "11642", "11644", "11645", "11648", "11649", "11652", "11657", "11658", "11659", "2735", "11670", "11672", "11674", "11678", "11686", "11694", "11708", "11714", "11717", "11731", "2736", "11733", "11735", "11743", "11749", "11752", "11756", "11757", "11762", "11764", "11774", "2737", "11778", "11782", "11790", "11805", "11818", "11820", "11823", "11824", "11825", "11826", "2738", "11827", "11828", "11836", "11838", "11846", "11847", "11848", "11850", "11861", "11864", "2739", "11875", "11879", "11881", "11889", "11896", "11897", "11899", "11901", "11902", "11904", "2740", "11913", "11924", "11928", "11930", "11931", "11934", "11935", "11937", "11939", "11942", "2601", "2741", "11956", "11958", "11959", "11961", "11962", "11967", "11968", "11975", "11977", "11978", "2742", "11980", "11983", "11987", "11988", "11998", "11999", "12001", "12004", "12011", "12024", "2743", "12039", "12044", "12045", "12049", "12052", "12053", "12062", "12069", "12076", "12077", "2744", "12082", "12084", "12085", "12088", "12091", "12092", "12094", "12100", "12102", "12111", "2745", "12116", "12120", "12128", "12139", "12142", "12147", "12150", "12157", "12158", "12161", "2746", "12164", "12171", "12172", "12175", "12176", "12186", "12192", "12197", "12199", "12210", "2747", "12222", "12223", "12226", "12237", "12238", "12240", "12243", "12246", "12249", "12250", "2748", "12252", "12255", "12262", "12265", "12266", "12267", "12268", "12270", "12271", "12272", "2750", "12273", "12275", "12279", "12282", "12285", "12287", "12301", "12304", "12309", "12314", "2751", "12317", "12318", "12321", "12324", "12327", "12328", "12329", "12337", "12338", "12345", "2602", "2752", "12352", "12357", "12364", "12374", "12376", "12379", "12380", "12381", "12382", "12384", "2753", "12385", "12400", "12407", "12415", "12422", "12428", "12431", "12433", "12435", "12436", "2754", "12444", "12456", "12457", "12460", "12461", "12469", "12478", "12479", "12481", "12482", "2755", "12483", "12486", "12490", "12493", "12494", "12514", "12515", "12517", "12519", "12520", "2756", "12523", "12524", "12529", "12530", "12539", "12548", "12553", "12554", "12567", "12569", "2757", "12588", "12596", "12601", "12602", "12604", "12605", "12609", "12611", "12612", "12615", "2758", "12622", "12624", "12630", "12635", "12637", "12638", "12640", "12641", "12646", "12647", "2759", "12649", "12652", "12657", "12658", "12661", "12664", "12665", "12667", "12669", "12674", "2760", "12683", "12686", "12690", "12696", "12697", "12701", "12709", "12713", "12718", "12721", "2761", "12725", "12726", "12730", "12731", "12741", "12743", "12747", "12755", "12763", "12768", "2605", "2762", "12769", "12771", "12773", "12774", "12775", "12776", "12781", "12788", "12797", "12802", "2763", "12808", "12811", "12828", "12834", "12836", "12841", "12851", "12860", "12862", "12863", "2764", "12865", "12869", "12883", "12884", "12887", "12896", "12901", "12907", "12908", "12909", "2765", "12910", "12913", "12914", "12919", "12928", "12930", "12936", "12943", "12945", "12952", "2766", "12956", "12961", "12964", "12967", "12971", "12975", "12977", "12980", "12987", "12992", "2767", "12993", "12998", "13000", "13006", "13019", "13023", "13024", "13027", 
"13028", "13033", "2768", "13040", "13043", "13052", "13066", "13068", "13069", "13072", "13076", "13083", "13088", "2769", "13094", "13095", "13098", "13100", "13104", "13105", "13121", "13135", "13137", "13138", "2771", "13143", "13148", "13150", "13152", "13153", "13157", "13159", "13169", "13170", "13171", "2774", "13172", "13178", "13182", "13187", "13188", "13190", "13199", "13205", "13208", "13210", "2725", "2775", "13212", "13215", "13217", "13220", "13227", "13228", "13230", "13234", "13237", "13241", "2776", "13253", "13258", "13259", "13260", "13263", "13271", "13273", "13277", "13289", "13297", "2777", "13299", "13302", "13304", "13306", "13307", "13310", "13322", "13324", "13325", "13327", "2778", "13328", "13337", "13342", "13346", "13350", "13352", "13358", "13362", "13366", "13367", "2781", "13369", "13370", "13371", "13373", "13376", "13377", "13381", "13384", "13389", "13390", "2782", "13406", "13407", "13409", "13414", "13415", "13420", "13423", "13424", "13425", "13427", "2783", "13429", "13433", "13434", "13439", "13445", "13451", "13453", "13462", "13464", "13469", "2784", "13474", "13485", "13493", "13495", "13497", "13504", "13509", "13511", "13515", "13520", "2785", "13529", "13532", "13533", "13537", "13540", "13545", "13547", "13549", "13551", "13554", "2787", "13555", "13558", "13562", "13564", "13568", "13570", "13571", "13581", "13588", "13589", "2726", "2788", "13594", "13607", "13608", "13611", "13613", "13615", "13616", "13632", "13647", "13648", "2789", "13651", "13658", "13661", "13665", "13667", "13672", "13674", "13678", "13679", "13682", "2790", "13684", "13685", "13693", "13700", "13701", "13704", "13707", "13714", "13715", "13726", "2791", "13727", "13739", "13742", "13744", "13745", "13749", "13755", "13765", "13770", "13775", "2793", "13776", "13779", "13781", "13786", "13788", "13795", "13798", "13799", "13802", "13803", "2794", "13814", "13821", "13823", "13831", "13833", "13835", "13839", "13844", "13847", "13851", "2795", "13852", "13856", "13860", "13869", "13873", "13879", "13885", "13887", "13890", "13893", "2796", "13894", "13895", "13901", "13904", "13905", "13912", "13918", "13921", "13932", "13936", "2800", "13939", "13946", "13948", "13950", "13954", "13956", "13968", "13973", "13975", "13976", "2802", "13982", "13983", "13986", "13987", "13994", "14000", "14003", "14005", "14008", "14011", "2727", "2803", "14014", "14018", "14019", "14023", "14030", "14033", "14036", "14037", "14038", "14046", "2804", "14049", "14050", "14060", "14064", "14065", "14067", "14088", "14094", "14095", "14097", "2805", "14099", "14103", "14110", "14111", "14117", "14119", "14132", "14140", "14143", "14149", "2824", "14153", "14158", "14160", "14161", "14165", "14171", "14172", "14186", "14190", "14191", "2825", "14196", "14199", "14205", "14207", "14216", "14221", "14225", "14226", "14233", "14235", "2827", "14237", "14238", "14240", "14241", "14242", "14245", "14255", "14258", "14262", "14265", "2829", "14269", "14270", "14274", "14277", "14278", "14279", "14365", "14368", "14379", "14388", "2831", "14389", "14390", "14391", "14398", "14500", "14502", "14503", "14538", "14539", "14540", "2833", "14541", "14546", "14547", "14548", "14549", "14550", "14551", "14553", "14559", "14585", "2837", "14597", "14600", "14611", "14612", "14613", "14614", "14615", "14674", "14676", "14677", "2728", "2838", "14678", "14679", "14680", "14681", "14682", "14683", "14685", "14686", "14687", "14688", "2840", "14689", "14706", "14727", "14732", "14740", "14741", "14742", 
"14746", "14861", "14862", "2841", "14867", "14868", "14869", "14870", "14871", "14872", "14873", "14874", "14875", "14876", "2842", "14911", "14922", "14923", "14924", "14927", "14929", "14930", "14931", "14933", "14934", "2844", "14938", "14939", "14940", "14941", "14942", "14945", "14963", "14968", "14969", "14970", "2846", "14981", "14989", "14990", "15023", "15024", "15025", "15026", "15027", "15033", "15034", "2850", "15035", "15036", "15037", "15040", "15042", "15047", "15049", "15050", "15054", "15055", "2851", "15058", "15068", "15075", "15076", "2857", "2858", "2729", "2860", "2861", "2864", "2865", "2880", "2881", "2882", "2890", "2895", "2896" ]
yujiepan/internal.swin-base-food101-int8-structured40
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-food101-jpqd This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the food101 dataset. It achieves the following results on the evaluation set: - Loss: 0.3497 - Accuracy: 0.9055 This model is quantized. Structured sparsity in transformer linear layers: 40%. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 128 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 2.2676 | 0.42 | 500 | 2.1087 | 0.7947 | | 0.6823 | 0.84 | 1000 | 0.5127 | 0.8818 | | 0.816 | 1.27 | 1500 | 0.3944 | 0.8954 | | 0.5272 | 1.69 | 2000 | 0.3310 | 0.9050 | | 12.263 | 2.11 | 2500 | 12.0040 | 0.9057 | | 48.9519 | 2.54 | 3000 | 48.4500 | 0.8597 | | 75.576 | 2.96 | 3500 | 75.5765 | 0.6951 | | 93.7523 | 3.38 | 4000 | 93.3753 | 0.5992 | | 103.7155 | 3.8 | 4500 | 103.5301 | 0.5622 | | 107.7993 | 4.23 | 5000 | 108.0881 | 0.5636 | | 109.6831 | 4.65 | 5500 | 109.2205 | 0.5844 | | 1.8848 | 5.07 | 6000 | 0.9807 | 0.8315 | | 1.0668 | 5.49 | 6500 | 0.6050 | 0.8740 | | 0.7951 | 5.92 | 7000 | 0.5151 | 0.8838 | | 0.7402 | 6.34 | 7500 | 0.4843 | 0.8906 | | 0.7319 | 6.76 | 8000 | 0.4494 | 0.8933 | | 0.5683 | 7.19 | 8500 | 0.4378 | 0.8953 | | 0.496 | 7.61 | 9000 | 0.4115 | 0.8981 | | 0.6174 | 8.03 | 9500 | 0.3952 | 0.9005 | | 0.4921 | 8.45 | 10000 | 0.3765 | 0.9026 | | 0.5843 | 8.88 | 10500 | 0.3678 | 0.9035 | | 0.5485 | 9.3 | 11000 | 0.3576 | 0.9039 | | 0.4337 | 9.72 | 11500 | 0.3512 | 0.9057 | ### Framework versions - Transformers 4.26.0 - Pytorch 1.13.1+cu116 - Datasets 2.8.0 - Tokenizers 0.13.2
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
Madronus/MultiLabel_V3
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # MultiLabel_V3 This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.9683 - Accuracy: 0.7370 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.8572 | 0.1 | 100 | 1.1607 | 0.6466 | | 0.8578 | 0.2 | 200 | 1.1956 | 0.6499 | | 0.7362 | 0.3 | 300 | 1.1235 | 0.6885 | | 0.8569 | 0.39 | 400 | 1.0460 | 0.6891 | | 0.4851 | 0.49 | 500 | 1.1213 | 0.6891 | | 0.7252 | 0.59 | 600 | 1.1512 | 0.6720 | | 0.6333 | 0.69 | 700 | 1.1039 | 0.6913 | | 0.6239 | 0.79 | 800 | 1.0636 | 0.7001 | | 0.2768 | 0.89 | 900 | 1.0386 | 0.7073 | | 0.4872 | 0.99 | 1000 | 1.0311 | 0.7062 | | 0.3049 | 1.09 | 1100 | 1.0437 | 0.7155 | | 0.1435 | 1.18 | 1200 | 1.0343 | 0.7222 | | 0.2088 | 1.28 | 1300 | 1.0784 | 0.7194 | | 0.4972 | 1.38 | 1400 | 1.1072 | 0.7166 | | 0.3604 | 1.48 | 1500 | 1.0438 | 0.7150 | | 0.2726 | 1.58 | 1600 | 1.0077 | 0.7293 | | 0.3106 | 1.68 | 1700 | 1.0029 | 0.7326 | | 0.3259 | 1.78 | 1800 | 0.9906 | 0.7310 | | 0.3323 | 1.88 | 1900 | 0.9729 | 0.7359 | | 0.2998 | 1.97 | 2000 | 0.9683 | 0.7370 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "atticvent", "addresssignage", "fence", "fuelstorage", "garagedoor", "gutter", "noobservationtype", "propanetank", "recreationalvehicle", "roof", "siding", "sidingclearance", "chimney", "shrub", "skylight", "surfacelayer", "trees", "window", "woodpile", "ventother", "crawlspacevent", "combustibleitemstorage", "complexrooflocation", "deck", "door", "dryervent", "eave" ]
Younesao/autotrain-test-41086106044
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 41086106044 - CO2 Emissions (in grams): 0.1871 ## Validation Metrics - Loss: 1.110 - Accuracy: 0.100 - Macro F1: 0.074 - Micro F1: 0.100 - Weighted F1: 0.111 - Macro Precision: 0.083 - Micro Precision: 0.100 - Weighted Precision: 0.125 - Macro Recall: 0.067 - Micro Recall: 0.100 - Weighted Recall: 0.100
[ "abstract-1", "abstract-2", "nouveau dossier" ]
RiniPL/autotrain-dementia_classification-41162106183
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 41162106183 - CO2 Emissions (in grams): 0.2237 ## Validation Metrics - Loss: 1.017 - Accuracy: 0.643 - Macro F1: 0.427 - Micro F1: 0.643 - Weighted F1: 0.566 - Macro Precision: 0.646 - Micro Precision: 0.643 - Weighted Precision: 0.756 - Macro Recall: 0.431 - Micro Recall: 0.643 - Weighted Recall: 0.643
[ "mild_demented", "moderate_demented", "non_demented", "very_mild_demented" ]
mouss/autotrain-bikes_1-41171106189
# Model Trained Using AutoTrain - Problem type: Binary Classification - Model ID: 41171106189 - CO2 Emissions (in grams): 0.4167 ## Validation Metrics - Loss: 0.368 - Accuracy: 0.818 - Precision: 0.882 - Recall: 0.789 - AUC: 0.921 - F1: 0.833
[ "bike", "damaged_bike" ]
mouss/autotrain-bikes-ag-41243106351
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 41243106351 - CO2 Emissions (in grams): 1.3811 ## Validation Metrics - Loss: 0.161 - Accuracy: 0.936 - Macro F1: 0.936 - Micro F1: 0.936 - Weighted F1: 0.936 - Macro Precision: 0.936 - Micro Precision: 0.936 - Weighted Precision: 0.936 - Macro Recall: 0.936 - Micro Recall: 0.936 - Weighted Recall: 0.936
[ "broken_1", "flat_tyre_1", "normal_1" ]
mouss/autotrain-bbikes-41250106361
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 41250106361 - CO2 Emissions (in grams): 1.3340 ## Validation Metrics - Loss: 0.599 - Accuracy: 0.830 - Macro F1: 0.829 - Micro F1: 0.830 - Weighted F1: 0.830 - Macro Precision: 0.831 - Micro Precision: 0.830 - Weighted Precision: 0.833 - Macro Recall: 0.831 - Micro Recall: 0.830 - Weighted Recall: 0.830
[ "broken_1", "dent_1", "flat_tyre_1", "loose_chain_1", "normal_1", "scratch_1" ]
Yuuki0/kecerdasan-buatan
Group 4 Project - Artificial Intelligence (Kecerdasan Buatan)
[ "ai generative art", "human generated art" ]
fyztkr/swin-tiny-patch4-window7-224-finetuned-eurosat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-eurosat This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Framework versions - Transformers 4.27.0 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "annualcrop", "forest", "herbaceousvegetation", "highway", "industrial", "pasture", "permanentcrop", "residential", "river", "sealake" ]
petrznel/face_discriminator
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # face_discriminator This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0170 - Accuracy: 0.9984 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.5318 | 1.0 | 246 | 0.4920 | 0.7377 | | 0.3427 | 2.0 | 492 | 0.1974 | 0.9813 | | 0.2394 | 3.0 | 738 | 0.0735 | 0.9945 | | 0.0961 | 4.0 | 984 | 0.0972 | 0.9859 | | 0.1978 | 5.0 | 1230 | 0.0317 | 0.9969 | | 0.0787 | 6.0 | 1476 | 0.0324 | 0.9938 | | 0.045 | 7.0 | 1722 | 0.0222 | 0.9969 | | 0.1506 | 8.0 | 1968 | 0.0235 | 0.9961 | | 0.0478 | 9.0 | 2214 | 0.0272 | 0.9961 | | 0.1224 | 10.0 | 2460 | 0.0170 | 0.9984 | ### Framework versions - Transformers 4.28.0.dev0 - Pytorch 1.12.1 - Datasets 2.10.1 - Tokenizers 0.11.0
[ "ffhq_res256", "sd_aligned" ]
vuiseng9/swin-base-food101-int8-structured43-15eph
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # jpqd-swin-b-15eph-r1.00-s2e5-mock-main-merge-pr2 This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the food101 dataset. It achieves the following results on the evaluation set: - Loss: 0.2970 - Accuracy: 0.9144 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 128 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 15.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 3.8787 | 0.42 | 500 | 3.9971 | 0.7163 | | 0.8429 | 0.84 | 1000 | 0.6450 | 0.8678 | | 0.8561 | 1.27 | 1500 | 0.4160 | 0.8945 | | 0.5777 | 1.69 | 2000 | 0.3664 | 0.9006 | | 12.3601 | 2.11 | 2500 | 12.0328 | 0.9023 | | 49.0606 | 2.54 | 3000 | 48.5000 | 0.8526 | | 75.3173 | 2.96 | 3500 | 75.5341 | 0.6942 | | 93.6153 | 3.38 | 4000 | 93.3091 | 0.5929 | | 103.5744 | 3.8 | 4500 | 103.1211 | 0.5846 | | 107.7701 | 4.23 | 5000 | 108.0755 | 0.5398 | | 109.5736 | 4.65 | 5500 | 108.7624 | 0.5855 | | 1.8028 | 5.07 | 6000 | 1.0960 | 0.8179 | | 1.2549 | 5.49 | 6500 | 0.6560 | 0.8695 | | 0.7199 | 5.92 | 7000 | 0.5619 | 0.8769 | | 0.8874 | 6.34 | 7500 | 0.5151 | 0.8859 | | 0.7429 | 6.76 | 8000 | 0.4830 | 0.8898 | | 0.6759 | 7.19 | 8500 | 0.4681 | 0.8926 | | 0.5352 | 7.61 | 9000 | 0.4360 | 0.8956 | | 0.6021 | 8.03 | 9500 | 0.4202 | 0.8979 | | 0.5617 | 8.45 | 10000 | 0.3940 | 0.9003 | | 0.7235 | 8.88 | 10500 | 0.3915 | 0.9000 | | 0.5323 | 9.3 | 11000 | 0.3793 | 0.9017 | | 0.589 | 9.72 | 11500 | 0.3670 | 0.9051 | | 0.425 | 10.14 | 12000 | 0.3615 | 0.9059 | | 0.7103 | 10.57 | 12500 | 0.3479 | 0.9070 | | 0.6251 | 10.99 | 13000 | 0.3472 | 0.9073 | | 0.623 | 11.41 | 13500 | 0.3353 | 0.9088 | | 0.6012 | 11.83 | 14000 | 0.3292 | 0.9098 | | 0.4984 | 12.26 | 14500 | 0.3230 | 0.9112 | | 0.4763 | 12.68 | 15000 | 0.3158 | 0.9109 | | 0.3209 | 13.1 | 15500 | 0.3120 | 0.9123 | | 0.4854 | 13.52 | 16000 | 0.3057 | 0.9126 | | 0.5472 | 13.95 | 16500 | 0.3032 | 0.9134 | | 0.3264 | 14.37 | 17000 | 0.3013 | 0.9134 | | 0.4136 | 14.79 | 17500 | 0.2977 | 0.9141 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu117 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
vuiseng9/swin-base-food101-int8-structured44.5-20eph
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # jpqd-swin-b-20eph-r1.00-s2e5-mock-main-merge-pr2 This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the food101 dataset. It achieves the following results on the evaluation set: - Loss: 0.2715 - Accuracy: 0.9179 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 128 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 20.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 5.1452 | 0.42 | 500 | 5.4928 | 0.6440 | | 0.9839 | 0.84 | 1000 | 0.7956 | 0.8580 | | 0.8533 | 1.27 | 1500 | 0.4392 | 0.8911 | | 0.6123 | 1.69 | 2000 | 0.3768 | 0.8983 | | 12.3076 | 2.11 | 2500 | 12.0798 | 0.8953 | | 49.301 | 2.54 | 3000 | 48.6292 | 0.8343 | | 75.6345 | 2.96 | 3500 | 75.7027 | 0.6777 | | 94.2556 | 3.38 | 4000 | 93.5852 | 0.5604 | | 103.3226 | 3.8 | 4500 | 103.1255 | 0.5702 | | 107.3423 | 4.23 | 5000 | 107.9250 | 0.5359 | | 108.9013 | 4.65 | 5500 | 108.5225 | 0.5882 | | 2.045 | 5.07 | 6000 | 1.1149 | 0.8154 | | 1.3377 | 5.49 | 6500 | 0.6747 | 0.8665 | | 0.7565 | 5.92 | 7000 | 0.5814 | 0.8765 | | 0.7493 | 6.34 | 7500 | 0.5460 | 0.8840 | | 0.7693 | 6.76 | 8000 | 0.5109 | 0.8851 | | 0.6082 | 7.19 | 8500 | 0.4893 | 0.8895 | | 0.7575 | 7.61 | 9000 | 0.4521 | 0.8943 | | 0.7943 | 8.03 | 9500 | 0.4465 | 0.8941 | | 0.5521 | 8.45 | 10000 | 0.4119 | 0.8967 | | 0.6536 | 8.88 | 10500 | 0.4071 | 0.9010 | | 0.5164 | 9.3 | 11000 | 0.3945 | 0.9010 | | 0.6687 | 9.72 | 11500 | 0.3884 | 0.9030 | | 0.4374 | 10.14 | 12000 | 0.3764 | 0.9040 | | 0.7326 | 10.57 | 12500 | 0.3678 | 0.9060 | | 0.6148 | 10.99 | 13000 | 0.3602 | 0.9057 | | 0.6068 | 11.41 | 13500 | 0.3566 | 0.9075 | | 0.6105 | 11.83 | 14000 | 0.3456 | 0.9074 | | 0.5277 | 12.26 | 14500 | 0.3383 | 0.9107 | | 0.5255 | 12.68 | 15000 | 0.3328 | 0.9097 | | 0.4536 | 13.1 | 15500 | 0.3268 | 0.9108 | | 0.5337 | 13.52 | 16000 | 0.3256 | 0.9107 | | 0.5299 | 13.95 | 16500 | 0.3161 | 0.9124 | | 0.3037 | 14.37 | 17000 | 0.3162 | 0.9123 | | 0.4171 | 14.79 | 17500 | 0.3078 | 0.9124 | | 0.5375 | 15.22 | 18000 | 0.3002 | 0.9116 | | 0.2722 | 15.64 | 18500 | 0.2953 | 0.9134 | | 0.3684 | 16.06 | 19000 | 0.2960 | 0.9137 | | 0.4369 | 16.48 | 19500 | 0.2918 | 0.9150 | | 0.3346 | 16.91 | 20000 | 0.2856 | 0.9171 | | 0.3645 | 17.33 | 20500 | 0.2856 | 0.9162 | | 0.4475 | 17.75 | 21000 | 0.2833 | 0.9157 | | 0.2553 | 18.17 | 21500 | 0.2788 | 0.9167 | | 0.5098 | 18.6 | 22000 | 0.2766 | 0.9164 | | 0.4149 | 19.02 | 22500 | 0.2732 | 0.9177 | | 0.3737 | 19.44 | 23000 | 0.2734 | 0.9181 | | 0.325 | 19.86 | 23500 | 0.2715 | 0.9176 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu117 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
Gokulapriyan/swin-tiny-patch4-window7-224-finetuned-main-gpu-20e-final
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-main-gpu-20e-final This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0251 - Accuracy: 0.9917 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 20 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 0.5767 | 1.0 | 551 | 0.5565 | 0.7463 | | 0.3985 | 2.0 | 1102 | 0.3165 | 0.8711 | | 0.2988 | 3.0 | 1653 | 0.1835 | 0.9293 | | 0.2449 | 4.0 | 2204 | 0.1150 | 0.9572 | | 0.2037 | 5.0 | 2755 | 0.0993 | 0.9632 | | 0.1646 | 6.0 | 3306 | 0.0750 | 0.9717 | | 0.1995 | 7.0 | 3857 | 0.0610 | 0.9776 | | 0.1659 | 8.0 | 4408 | 0.0485 | 0.9815 | | 0.1449 | 9.0 | 4959 | 0.0505 | 0.9821 | | 0.1315 | 10.0 | 5510 | 0.0444 | 0.9843 | | 0.102 | 11.0 | 6061 | 0.0440 | 0.9838 | | 0.1039 | 12.0 | 6612 | 0.0359 | 0.9870 | | 0.0798 | 13.0 | 7163 | 0.0393 | 0.9869 | | 0.1033 | 14.0 | 7714 | 0.0343 | 0.9890 | | 0.078 | 15.0 | 8265 | 0.0298 | 0.9902 | | 0.0765 | 16.0 | 8816 | 0.0299 | 0.9901 | | 0.0769 | 17.0 | 9367 | 0.0275 | 0.9908 | | 0.0751 | 18.0 | 9918 | 0.0271 | 0.9910 | | 0.0822 | 19.0 | 10469 | 0.0251 | 0.9917 | | 0.0756 | 20.0 | 11020 | 0.0254 | 0.9913 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "fnh", "hcc", "hem", "healthy" ]
changsu/testmodel
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> This modelcard aims to be a base template for new models. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/modelcard_template.md?plain=1). ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ### How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Data Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "hot dog", "not hot dog" ]
chromefan/vit-base-game-icons
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # game-ad-0306_outputs This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the ./data/games-ad-0306 dataset. It achieves the following results on the evaluation set: - Loss: 2.6235 - Accuracy: 0.3024 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 13373 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1000.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:------:|:------:|:---------------:|:--------:| | 3.2891 | 1.0 | 103 | 3.0266 | 0.2165 | | 2.9971 | 2.0 | 206 | 2.9194 | 0.2302 | | 2.9151 | 3.0 | 309 | 2.8731 | 0.2474 | | 2.8579 | 4.0 | 412 | 2.8072 | 0.2715 | | 2.7768 | 5.0 | 515 | 2.7918 | 0.2577 | | 2.7184 | 6.0 | 618 | 2.7296 | 0.2818 | | 2.648 | 7.0 | 721 | 2.7044 | 0.2921 | | 2.5884 | 8.0 | 824 | 2.7190 | 0.2680 | | 2.5146 | 9.0 | 927 | 2.6942 | 0.2784 | | 2.4384 | 10.0 | 1030 | 2.6877 | 0.2921 | | 2.442 | 11.0 | 1133 | 2.6412 | 0.2818 | | 2.3099 | 12.0 | 1236 | 2.6331 | 0.2852 | | 2.2685 | 13.0 | 1339 | 2.6451 | 0.2990 | | 2.182 | 14.0 | 1442 | 2.6927 | 0.2715 | | 2.1421 | 15.0 | 1545 | 2.6615 | 0.3162 | | 2.0483 | 16.0 | 1648 | 2.6500 | 0.3230 | | 1.9884 | 17.0 | 1751 | 2.6527 | 0.2990 | | 1.9316 | 18.0 | 1854 | 2.6736 | 0.2990 | | 1.8785 | 19.0 | 1957 | 2.6391 | 0.2921 | | 1.788 | 20.0 | 2060 | 2.7002 | 0.3127 | | 1.7115 | 21.0 | 2163 | 2.8321 | 0.2715 | | 1.6929 | 22.0 | 2266 | 2.6235 | 0.3024 | | 1.6239 | 23.0 | 2369 | 2.6378 | 0.3058 | | 1.5387 | 24.0 | 2472 | 2.6888 | 0.3127 | | 1.5095 | 25.0 | 2575 | 2.6888 | 0.3127 | | 1.4153 | 26.0 | 2678 | 2.6771 | 0.2715 | | 1.4254 | 27.0 | 2781 | 2.7354 | 0.2887 | | 1.3351 | 28.0 | 2884 | 2.7175 | 0.2990 | | 1.2955 | 29.0 | 2987 | 2.7679 | 0.2818 | | 1.2232 | 30.0 | 3090 | 2.7784 | 0.2921 | | 1.2115 | 31.0 | 3193 | 2.8496 | 0.2749 | | 1.1656 | 32.0 | 3296 | 2.7899 | 0.2818 | | 1.1419 | 33.0 | 3399 | 2.7646 | 0.2715 | | 1.0481 | 34.0 | 3502 | 2.8416 | 0.2715 | | 0.9763 | 35.0 | 3605 | 2.8370 | 0.3024 | | 0.9452 | 36.0 | 3708 | 2.7904 | 0.2955 | | 0.9178 | 37.0 | 3811 | 2.8309 | 0.2715 | | 0.9115 | 38.0 | 3914 | 2.8584 | 0.3093 | | 0.8472 | 39.0 | 4017 | 2.9066 | 0.2612 | | 0.8323 | 40.0 | 4120 | 2.8630 | 0.2921 | | 0.7622 | 41.0 | 4223 | 3.0020 | 0.2680 | | 0.7531 | 42.0 | 4326 | 2.8885 | 0.2921 | | 0.7054 | 43.0 | 4429 | 2.8820 | 0.2818 | | 0.685 | 44.0 | 4532 | 2.8764 | 0.3162 | | 0.7206 | 45.0 | 4635 | 2.8659 | 0.3162 | | 0.6304 | 46.0 | 4738 | 2.9537 | 0.2887 | | 0.6369 | 47.0 | 4841 | 2.9660 | 0.2509 | | 0.6161 | 48.0 | 4944 | 3.1112 | 0.2543 | | 0.618 | 49.0 | 5047 | 2.9729 | 0.2990 | | 0.556 | 50.0 | 5150 | 2.9870 | 0.2921 | | 0.5314 | 51.0 | 5253 | 2.9934 | 0.3093 | | 0.5502 | 52.0 | 5356 | 2.9379 | 0.2818 | | 0.4958 | 53.0 | 5459 | 3.0344 | 0.3024 | | 0.4896 | 54.0 | 5562 | 2.9924 | 0.2749 | | 0.4803 | 55.0 | 5665 | 3.0161 | 0.3127 | | 0.4554 | 56.0 | 5768 | 3.0221 | 0.2818 | | 0.4591 | 57.0 | 5871 | 3.0461 | 0.3024 | | 0.4349 | 58.0 | 5974 | 3.1377 | 0.3265 | | 0.4127 | 
59.0 | 6077 | 3.0169 | 0.2955 | | 0.3973 | 60.0 | 6180 | 3.0338 | 0.2818 | | 0.4109 | 61.0 | 6283 | 3.0638 | 0.2818 | | 0.3872 | 62.0 | 6386 | 3.0810 | 0.2818 | | 0.3693 | 63.0 | 6489 | 3.2003 | 0.2715 | | 0.3457 | 64.0 | 6592 | 3.0843 | 0.2990 | | 0.3521 | 65.0 | 6695 | 3.1623 | 0.3058 | | 0.3625 | 66.0 | 6798 | 3.0036 | 0.3299 | | 0.3339 | 67.0 | 6901 | 3.2389 | 0.2921 | | 0.3378 | 68.0 | 7004 | 3.2493 | 0.2990 | | 0.2981 | 69.0 | 7107 | 3.1308 | 0.2955 | | 0.3023 | 70.0 | 7210 | 3.2455 | 0.3093 | | 0.3076 | 71.0 | 7313 | 3.2725 | 0.2887 | | 0.3201 | 72.0 | 7416 | 3.2563 | 0.2887 | | 0.3083 | 73.0 | 7519 | 3.2520 | 0.2921 | | 0.2906 | 74.0 | 7622 | 3.3344 | 0.3093 | | 0.2721 | 75.0 | 7725 | 3.1952 | 0.2852 | | 0.2873 | 76.0 | 7828 | 3.2529 | 0.3058 | | 0.278 | 77.0 | 7931 | 3.3428 | 0.2818 | | 0.2573 | 78.0 | 8034 | 3.3216 | 0.2784 | | 0.2578 | 79.0 | 8137 | 3.4178 | 0.2955 | | 0.2774 | 80.0 | 8240 | 3.3449 | 0.2818 | | 0.2762 | 81.0 | 8343 | 3.3452 | 0.2749 | | 0.2504 | 82.0 | 8446 | 3.5792 | 0.2955 | | 0.2552 | 83.0 | 8549 | 3.3478 | 0.2818 | | 0.2541 | 84.0 | 8652 | 3.4902 | 0.2784 | | 0.2616 | 85.0 | 8755 | 3.2829 | 0.3127 | | 0.2079 | 86.0 | 8858 | 3.5287 | 0.3162 | | 0.2538 | 87.0 | 8961 | 3.4731 | 0.3196 | | 0.2485 | 88.0 | 9064 | 3.5998 | 0.2646 | | 0.2714 | 89.0 | 9167 | 3.4567 | 0.2921 | | 0.232 | 90.0 | 9270 | 3.5061 | 0.2818 | | 0.2577 | 91.0 | 9373 | 3.5370 | 0.2921 | | 0.2232 | 92.0 | 9476 | 3.5062 | 0.2509 | | 0.2351 | 93.0 | 9579 | 3.5592 | 0.2784 | | 0.2299 | 94.0 | 9682 | 3.5167 | 0.3333 | | 0.2415 | 95.0 | 9785 | 3.6283 | 0.2887 | | 0.2265 | 96.0 | 9888 | 3.4819 | 0.2852 | | 0.2448 | 97.0 | 9991 | 3.5793 | 0.2990 | | 0.2141 | 98.0 | 10094 | 3.5728 | 0.2887 | | 0.1979 | 99.0 | 10197 | 3.4685 | 0.2921 | | 0.2077 | 100.0 | 10300 | 3.5586 | 0.3230 | | 0.1854 | 101.0 | 10403 | 3.5650 | 0.3162 | | 0.2017 | 102.0 | 10506 | 3.4760 | 0.2921 | | 0.2119 | 103.0 | 10609 | 3.5531 | 0.2784 | | 0.2314 | 104.0 | 10712 | 3.5118 | 0.3024 | | 0.212 | 105.0 | 10815 | 3.5496 | 0.3196 | | 0.197 | 106.0 | 10918 | 3.6080 | 0.2543 | | 0.2067 | 107.0 | 11021 | 3.6217 | 0.2887 | | 0.1896 | 108.0 | 11124 | 3.6446 | 0.3230 | | 0.198 | 109.0 | 11227 | 3.7699 | 0.2784 | | 0.2152 | 110.0 | 11330 | 3.6709 | 0.3162 | | 0.2121 | 111.0 | 11433 | 3.6266 | 0.3368 | | 0.1869 | 112.0 | 11536 | 3.6681 | 0.2955 | | 0.1927 | 113.0 | 11639 | 3.7305 | 0.3162 | | 0.2259 | 114.0 | 11742 | 3.6302 | 0.3127 | | 0.1809 | 115.0 | 11845 | 3.6301 | 0.3093 | | 0.2071 | 116.0 | 11948 | 3.7288 | 0.3127 | | 0.1977 | 117.0 | 12051 | 3.6467 | 0.3058 | | 0.1902 | 118.0 | 12154 | 3.7039 | 0.3093 | | 0.1996 | 119.0 | 12257 | 3.9013 | 0.3093 | | 0.2122 | 120.0 | 12360 | 3.8228 | 0.2990 | | 0.1702 | 121.0 | 12463 | 3.7118 | 0.3162 | | 0.1889 | 122.0 | 12566 | 3.7211 | 0.3162 | | 0.1857 | 123.0 | 12669 | 3.8894 | 0.2509 | | 0.2003 | 124.0 | 12772 | 3.6575 | 0.3093 | | 0.202 | 125.0 | 12875 | 3.7925 | 0.3333 | | 0.1722 | 126.0 | 12978 | 3.8188 | 0.2818 | | 0.1716 | 127.0 | 13081 | 3.9584 | 0.3162 | | 0.1598 | 128.0 | 13184 | 3.7732 | 0.3265 | | 0.1825 | 129.0 | 13287 | 3.8038 | 0.3196 | | 0.1716 | 130.0 | 13390 | 3.7606 | 0.3196 | | 0.179 | 131.0 | 13493 | 3.7458 | 0.2955 | | 0.1817 | 132.0 | 13596 | 3.8413 | 0.2955 | | 0.1606 | 133.0 | 13699 | 3.8766 | 0.3196 | | 0.1625 | 134.0 | 13802 | 3.8188 | 0.3230 | | 0.1622 | 135.0 | 13905 | 3.7223 | 0.2955 | | 0.1852 | 136.0 | 14008 | 3.7774 | 0.3024 | | 0.1671 | 137.0 | 14111 | 3.8407 | 0.2612 | | 0.1862 | 138.0 | 14214 | 3.7442 | 0.3196 | | 0.1808 | 139.0 | 14317 | 3.8458 | 0.3093 | | 
0.1375 | 140.0 | 14420 | 3.7372 | 0.3024 | | 0.1876 | 141.0 | 14523 | 3.9925 | 0.2990 | | 0.1693 | 142.0 | 14626 | 3.9364 | 0.3058 | | 0.1719 | 143.0 | 14729 | 3.9149 | 0.2818 | | 0.1406 | 144.0 | 14832 | 3.8603 | 0.2955 | | 0.1709 | 145.0 | 14935 | 3.9216 | 0.3196 | | 0.1794 | 146.0 | 15038 | 3.8934 | 0.3058 | | 0.1455 | 147.0 | 15141 | 4.0086 | 0.2784 | | 0.1959 | 148.0 | 15244 | 3.9358 | 0.3024 | | 0.1664 | 149.0 | 15347 | 3.9775 | 0.2921 | | 0.1455 | 150.0 | 15450 | 3.9304 | 0.2990 | | 0.1819 | 151.0 | 15553 | 4.0299 | 0.2715 | | 0.1532 | 152.0 | 15656 | 4.1219 | 0.2680 | | 0.1638 | 153.0 | 15759 | 4.1465 | 0.3093 | | 0.1579 | 154.0 | 15862 | 4.0596 | 0.2955 | | 0.1668 | 155.0 | 15965 | 4.0857 | 0.3127 | | 0.1401 | 156.0 | 16068 | 4.1669 | 0.2921 | | 0.1452 | 157.0 | 16171 | 4.0430 | 0.2887 | | 0.1568 | 158.0 | 16274 | 4.0157 | 0.2990 | | 0.1771 | 159.0 | 16377 | 4.0770 | 0.3093 | | 0.1383 | 160.0 | 16480 | 4.0888 | 0.2680 | | 0.1572 | 161.0 | 16583 | 4.2271 | 0.2646 | | 0.1472 | 162.0 | 16686 | 4.0215 | 0.2852 | | 0.1534 | 163.0 | 16789 | 4.2248 | 0.3058 | | 0.136 | 164.0 | 16892 | 4.2159 | 0.2852 | | 0.1525 | 165.0 | 16995 | 4.0565 | 0.2990 | | 0.1418 | 166.0 | 17098 | 4.1175 | 0.2852 | | 0.1374 | 167.0 | 17201 | 4.1708 | 0.2921 | | 0.1538 | 168.0 | 17304 | 4.2566 | 0.2784 | | 0.1365 | 169.0 | 17407 | 4.3063 | 0.2577 | | 0.1661 | 170.0 | 17510 | 4.2231 | 0.2887 | | 0.1278 | 171.0 | 17613 | 4.3125 | 0.2646 | | 0.1418 | 172.0 | 17716 | 4.3337 | 0.2646 | | 0.1538 | 173.0 | 17819 | 4.3129 | 0.2852 | | 0.1315 | 174.0 | 17922 | 4.3102 | 0.2680 | | 0.128 | 175.0 | 18025 | 4.2853 | 0.2749 | | 0.1398 | 176.0 | 18128 | 4.1560 | 0.2715 | | 0.1525 | 177.0 | 18231 | 4.1812 | 0.2955 | | 0.1603 | 178.0 | 18334 | 4.1262 | 0.3093 | | 0.1412 | 179.0 | 18437 | 4.2778 | 0.2887 | | 0.1521 | 180.0 | 18540 | 4.2881 | 0.2680 | | 0.1404 | 181.0 | 18643 | 4.3147 | 0.2852 | | 0.1468 | 182.0 | 18746 | 4.2042 | 0.2749 | | 0.1448 | 183.0 | 18849 | 4.2110 | 0.2784 | | 0.1299 | 184.0 | 18952 | 4.2314 | 0.2921 | | 0.1361 | 185.0 | 19055 | 4.2993 | 0.2749 | | 0.1455 | 186.0 | 19158 | 4.3509 | 0.3058 | | 0.1345 | 187.0 | 19261 | 4.2828 | 0.2921 | | 0.1394 | 188.0 | 19364 | 4.1001 | 0.3093 | | 0.1415 | 189.0 | 19467 | 4.2179 | 0.2955 | | 0.1235 | 190.0 | 19570 | 4.2963 | 0.3093 | | 0.1373 | 191.0 | 19673 | 4.1833 | 0.2715 | | 0.1323 | 192.0 | 19776 | 4.3057 | 0.2852 | | 0.1188 | 193.0 | 19879 | 4.3819 | 0.2749 | | 0.1528 | 194.0 | 19982 | 4.3091 | 0.2749 | | 0.1365 | 195.0 | 20085 | 4.3870 | 0.2887 | | 0.1187 | 196.0 | 20188 | 4.2303 | 0.2715 | | 0.1409 | 197.0 | 20291 | 4.2344 | 0.2784 | | 0.1346 | 198.0 | 20394 | 4.0637 | 0.3162 | | 0.1449 | 199.0 | 20497 | 4.3022 | 0.2852 | | 0.1415 | 200.0 | 20600 | 4.2672 | 0.2990 | | 0.1283 | 201.0 | 20703 | 4.2363 | 0.2749 | | 0.1469 | 202.0 | 20806 | 4.2714 | 0.2990 | | 0.1288 | 203.0 | 20909 | 4.3246 | 0.2818 | | 0.1334 | 204.0 | 21012 | 4.1711 | 0.2887 | | 0.1419 | 205.0 | 21115 | 4.3263 | 0.2784 | | 0.1395 | 206.0 | 21218 | 4.2855 | 0.2990 | | 0.1255 | 207.0 | 21321 | 4.4301 | 0.2474 | | 0.1288 | 208.0 | 21424 | 4.3735 | 0.2955 | | 0.1395 | 209.0 | 21527 | 4.3549 | 0.2852 | | 0.1144 | 210.0 | 21630 | 4.4569 | 0.2715 | | 0.1185 | 211.0 | 21733 | 4.5008 | 0.2921 | | 0.1578 | 212.0 | 21836 | 4.2313 | 0.2818 | | 0.1434 | 213.0 | 21939 | 4.4445 | 0.2715 | | 0.1147 | 214.0 | 22042 | 4.4329 | 0.2818 | | 0.1239 | 215.0 | 22145 | 4.4102 | 0.2715 | | 0.1315 | 216.0 | 22248 | 4.2503 | 0.2955 | | 0.1413 | 217.0 | 22351 | 4.5559 | 0.2955 | | 0.1137 | 218.0 | 22454 | 4.4504 | 0.2990 | | 
0.1412 | 219.0 | 22557 | 4.3377 | 0.3058 | | 0.1051 | 220.0 | 22660 | 4.5250 | 0.2852 | | 0.1314 | 221.0 | 22763 | 4.4539 | 0.2646 | | 0.1284 | 222.0 | 22866 | 4.3481 | 0.2921 | | 0.1159 | 223.0 | 22969 | 4.4284 | 0.3127 | | 0.1219 | 224.0 | 23072 | 4.5069 | 0.2749 | | 0.1183 | 225.0 | 23175 | 4.5461 | 0.2990 | | 0.1172 | 226.0 | 23278 | 4.3986 | 0.2921 | | 0.1216 | 227.0 | 23381 | 4.5154 | 0.3127 | | 0.1207 | 228.0 | 23484 | 4.4848 | 0.2887 | | 0.1303 | 229.0 | 23587 | 4.3925 | 0.2921 | | 0.1238 | 230.0 | 23690 | 4.3748 | 0.2990 | | 0.1126 | 231.0 | 23793 | 4.4806 | 0.3127 | | 0.1227 | 232.0 | 23896 | 4.4439 | 0.2921 | | 0.1146 | 233.0 | 23999 | 4.5228 | 0.2921 | | 0.1168 | 234.0 | 24102 | 4.5614 | 0.2887 | | 0.1219 | 235.0 | 24205 | 4.4129 | 0.2921 | | 0.1181 | 236.0 | 24308 | 4.5444 | 0.2990 | | 0.1167 | 237.0 | 24411 | 4.4038 | 0.2749 | | 0.1173 | 238.0 | 24514 | 4.3967 | 0.3230 | | 0.1052 | 239.0 | 24617 | 4.5055 | 0.2887 | | 0.1216 | 240.0 | 24720 | 4.5693 | 0.3024 | | 0.1242 | 241.0 | 24823 | 4.4906 | 0.2852 | | 0.1553 | 242.0 | 24926 | 4.4971 | 0.2990 | | 0.1377 | 243.0 | 25029 | 4.4536 | 0.2818 | | 0.1126 | 244.0 | 25132 | 4.5324 | 0.2852 | | 0.1321 | 245.0 | 25235 | 4.8037 | 0.2646 | | 0.115 | 246.0 | 25338 | 4.6682 | 0.2715 | | 0.1311 | 247.0 | 25441 | 4.6374 | 0.3196 | | 0.1224 | 248.0 | 25544 | 4.7803 | 0.2680 | | 0.1291 | 249.0 | 25647 | 4.6564 | 0.3093 | | 0.1138 | 250.0 | 25750 | 4.5188 | 0.3024 | | 0.1159 | 251.0 | 25853 | 4.5116 | 0.2990 | | 0.1172 | 252.0 | 25956 | 4.7039 | 0.2921 | | 0.1256 | 253.0 | 26059 | 4.6462 | 0.2852 | | 0.1227 | 254.0 | 26162 | 4.7470 | 0.2852 | | 0.1186 | 255.0 | 26265 | 4.6541 | 0.2921 | | 0.1114 | 256.0 | 26368 | 4.6005 | 0.2887 | | 0.1154 | 257.0 | 26471 | 4.5707 | 0.2818 | | 0.1229 | 258.0 | 26574 | 4.5180 | 0.2749 | | 0.1138 | 259.0 | 26677 | 4.6220 | 0.2818 | | 0.0987 | 260.0 | 26780 | 4.6446 | 0.2921 | | 0.1056 | 261.0 | 26883 | 4.7600 | 0.2715 | | 0.1362 | 262.0 | 26986 | 4.6703 | 0.2680 | | 0.1131 | 263.0 | 27089 | 4.6065 | 0.2715 | | 0.1127 | 264.0 | 27192 | 4.5125 | 0.2784 | | 0.1248 | 265.0 | 27295 | 4.5967 | 0.2921 | | 0.111 | 266.0 | 27398 | 4.6182 | 0.2474 | | 0.1203 | 267.0 | 27501 | 4.5969 | 0.2887 | | 0.1242 | 268.0 | 27604 | 4.5437 | 0.2749 | | 0.1041 | 269.0 | 27707 | 4.7105 | 0.2887 | | 0.1233 | 270.0 | 27810 | 4.6305 | 0.2784 | | 0.1003 | 271.0 | 27913 | 4.5865 | 0.2990 | | 0.1144 | 272.0 | 28016 | 4.6216 | 0.2852 | | 0.1061 | 273.0 | 28119 | 4.5387 | 0.2955 | | 0.1102 | 274.0 | 28222 | 4.5850 | 0.2921 | | 0.109 | 275.0 | 28325 | 4.6442 | 0.2921 | | 0.1277 | 276.0 | 28428 | 4.5837 | 0.2612 | | 0.1101 | 277.0 | 28531 | 4.7880 | 0.2784 | | 0.1136 | 278.0 | 28634 | 4.5664 | 0.2646 | | 0.1125 | 279.0 | 28737 | 4.7245 | 0.2990 | | 0.1207 | 280.0 | 28840 | 4.7841 | 0.2852 | | 0.1223 | 281.0 | 28943 | 4.7736 | 0.2852 | | 0.1132 | 282.0 | 29046 | 4.6193 | 0.2852 | | 0.1118 | 283.0 | 29149 | 4.7512 | 0.2921 | | 0.1196 | 284.0 | 29252 | 4.7773 | 0.2680 | | 0.1035 | 285.0 | 29355 | 4.6611 | 0.2921 | | 0.1079 | 286.0 | 29458 | 4.6916 | 0.2921 | | 0.1124 | 287.0 | 29561 | 4.6505 | 0.2680 | | 0.1024 | 288.0 | 29664 | 4.6303 | 0.2680 | | 0.101 | 289.0 | 29767 | 4.6079 | 0.2852 | | 0.124 | 290.0 | 29870 | 4.4566 | 0.2887 | | 0.1121 | 291.0 | 29973 | 4.5021 | 0.2887 | | 0.1005 | 292.0 | 30076 | 4.5479 | 0.2852 | | 0.1152 | 293.0 | 30179 | 4.6658 | 0.2749 | | 0.113 | 294.0 | 30282 | 4.5608 | 0.2749 | | 0.112 | 295.0 | 30385 | 4.6577 | 0.2852 | | 0.1095 | 296.0 | 30488 | 4.5323 | 0.2784 | | 0.1053 | 297.0 | 30591 | 4.6355 | 0.2921 | | 0.1138 
| 298.0 | 30694 | 4.7187 | 0.2852 | | 0.1105 | 299.0 | 30797 | 4.6037 | 0.2784 | | 0.0944 | 300.0 | 30900 | 4.7195 | 0.2646 | | 0.1027 | 301.0 | 31003 | 4.6786 | 0.2749 | | 0.0994 | 302.0 | 31106 | 4.7625 | 0.2990 | | 0.1229 | 303.0 | 31209 | 4.8497 | 0.2715 | | 0.1094 | 304.0 | 31312 | 4.7454 | 0.2612 | | 0.1225 | 305.0 | 31415 | 4.7722 | 0.2818 | | 0.102 | 306.0 | 31518 | 4.8431 | 0.2749 | | 0.1283 | 307.0 | 31621 | 4.7977 | 0.2784 | | 0.109 | 308.0 | 31724 | 4.6382 | 0.3127 | | 0.1193 | 309.0 | 31827 | 4.7094 | 0.2543 | | 0.1106 | 310.0 | 31930 | 4.7562 | 0.2921 | | 0.1032 | 311.0 | 32033 | 4.7265 | 0.2577 | | 0.114 | 312.0 | 32136 | 4.7516 | 0.2852 | | 0.1265 | 313.0 | 32239 | 4.7882 | 0.2474 | | 0.1252 | 314.0 | 32342 | 4.7084 | 0.2543 | | 0.1102 | 315.0 | 32445 | 4.6895 | 0.2887 | | 0.0984 | 316.0 | 32548 | 4.6341 | 0.3024 | | 0.0978 | 317.0 | 32651 | 4.6211 | 0.3196 | | 0.1068 | 318.0 | 32754 | 4.7675 | 0.2921 | | 0.1017 | 319.0 | 32857 | 4.7061 | 0.2784 | | 0.1138 | 320.0 | 32960 | 4.7139 | 0.2784 | | 0.0997 | 321.0 | 33063 | 4.7117 | 0.2852 | | 0.1036 | 322.0 | 33166 | 4.7136 | 0.3058 | | 0.0988 | 323.0 | 33269 | 4.7139 | 0.2852 | | 0.1052 | 324.0 | 33372 | 4.7646 | 0.3058 | | 0.0957 | 325.0 | 33475 | 4.7901 | 0.2955 | | 0.1009 | 326.0 | 33578 | 4.7048 | 0.2749 | | 0.0957 | 327.0 | 33681 | 4.6212 | 0.2955 | | 0.1244 | 328.0 | 33784 | 4.7481 | 0.2852 | | 0.1021 | 329.0 | 33887 | 4.7497 | 0.2852 | | 0.1017 | 330.0 | 33990 | 4.8310 | 0.2749 | | 0.0957 | 331.0 | 34093 | 4.6941 | 0.3093 | | 0.1042 | 332.0 | 34196 | 4.7253 | 0.3127 | | 0.1046 | 333.0 | 34299 | 4.8593 | 0.2784 | | 0.1103 | 334.0 | 34402 | 4.8480 | 0.2715 | | 0.09 | 335.0 | 34505 | 4.9101 | 0.3162 | | 0.1108 | 336.0 | 34608 | 4.7839 | 0.2887 | | 0.1043 | 337.0 | 34711 | 4.9543 | 0.2680 | | 0.104 | 338.0 | 34814 | 4.8026 | 0.2990 | | 0.1015 | 339.0 | 34917 | 4.8008 | 0.2887 | | 0.1029 | 340.0 | 35020 | 4.9069 | 0.2990 | | 0.1002 | 341.0 | 35123 | 4.9242 | 0.3024 | | 0.1076 | 342.0 | 35226 | 4.7199 | 0.2921 | | 0.1055 | 343.0 | 35329 | 4.8440 | 0.3162 | | 0.0925 | 344.0 | 35432 | 4.8572 | 0.3230 | | 0.0827 | 345.0 | 35535 | 4.9133 | 0.3024 | | 0.1105 | 346.0 | 35638 | 4.9865 | 0.2852 | | 0.0875 | 347.0 | 35741 | 4.7973 | 0.2955 | | 0.106 | 348.0 | 35844 | 4.8696 | 0.2955 | | 0.1083 | 349.0 | 35947 | 4.9786 | 0.2646 | | 0.105 | 350.0 | 36050 | 4.9114 | 0.2680 | | 0.1075 | 351.0 | 36153 | 4.8693 | 0.2612 | | 0.1026 | 352.0 | 36256 | 4.8735 | 0.2887 | | 0.101 | 353.0 | 36359 | 5.0447 | 0.2646 | | 0.0944 | 354.0 | 36462 | 4.9492 | 0.2784 | | 0.1055 | 355.0 | 36565 | 4.9895 | 0.2715 | | 0.0858 | 356.0 | 36668 | 5.0955 | 0.2440 | | 0.0955 | 357.0 | 36771 | 5.0106 | 0.2990 | | 0.1108 | 358.0 | 36874 | 4.9109 | 0.3058 | | 0.1179 | 359.0 | 36977 | 4.9082 | 0.2852 | | 0.0984 | 360.0 | 37080 | 4.8480 | 0.3058 | | 0.0997 | 361.0 | 37183 | 4.8957 | 0.2715 | | 0.1128 | 362.0 | 37286 | 4.9127 | 0.3058 | | 0.0961 | 363.0 | 37389 | 5.0965 | 0.2784 | | 0.1096 | 364.0 | 37492 | 5.0317 | 0.2887 | | 0.0916 | 365.0 | 37595 | 4.9745 | 0.2887 | | 0.1057 | 366.0 | 37698 | 4.8775 | 0.2680 | | 0.0932 | 367.0 | 37801 | 5.0282 | 0.2680 | | 0.1072 | 368.0 | 37904 | 4.8097 | 0.2646 | | 0.0973 | 369.0 | 38007 | 4.9321 | 0.2749 | | 0.1034 | 370.0 | 38110 | 4.8176 | 0.2715 | | 0.1084 | 371.0 | 38213 | 4.8562 | 0.2852 | | 0.0957 | 372.0 | 38316 | 4.9466 | 0.2852 | | 0.1049 | 373.0 | 38419 | 4.8515 | 0.2612 | | 0.097 | 374.0 | 38522 | 4.8833 | 0.2887 | | 0.1008 | 375.0 | 38625 | 4.9442 | 0.2887 | | 0.1019 | 376.0 | 38728 | 4.8345 | 0.2818 | | 0.1083 | 377.0 | 
38831 | 4.9350 | 0.2749 | | 0.1181 | 378.0 | 38934 | 4.8605 | 0.2612 | | 0.1043 | 379.0 | 39037 | 4.8783 | 0.2921 | | 0.1212 | 380.0 | 39140 | 4.8641 | 0.2852 | | 0.0941 | 381.0 | 39243 | 4.9772 | 0.2955 | | 0.0986 | 382.0 | 39346 | 4.9191 | 0.2715 | | 0.1054 | 383.0 | 39449 | 5.0695 | 0.2818 | | 0.1066 | 384.0 | 39552 | 5.1141 | 0.2852 | | 0.0929 | 385.0 | 39655 | 5.0176 | 0.2680 | | 0.102 | 386.0 | 39758 | 4.7790 | 0.2749 | | 0.103 | 387.0 | 39861 | 4.7348 | 0.2818 | | 0.107 | 388.0 | 39964 | 4.6667 | 0.2921 | | 0.0922 | 389.0 | 40067 | 4.6687 | 0.2680 | | 0.102 | 390.0 | 40170 | 4.8450 | 0.2680 | | 0.0958 | 391.0 | 40273 | 5.1279 | 0.2680 | | 0.0908 | 392.0 | 40376 | 4.9624 | 0.2921 | | 0.0988 | 393.0 | 40479 | 5.1676 | 0.2955 | | 0.0995 | 394.0 | 40582 | 4.8726 | 0.3058 | | 0.1087 | 395.0 | 40685 | 4.9525 | 0.2818 | | 0.11 | 396.0 | 40788 | 5.0258 | 0.2543 | | 0.0916 | 397.0 | 40891 | 5.0114 | 0.3265 | | 0.089 | 398.0 | 40994 | 4.9689 | 0.3058 | | 0.1089 | 399.0 | 41097 | 4.8648 | 0.3058 | | 0.085 | 400.0 | 41200 | 4.7376 | 0.2990 | | 0.1135 | 401.0 | 41303 | 4.9685 | 0.2955 | | 0.1032 | 402.0 | 41406 | 4.6955 | 0.3162 | | 0.0987 | 403.0 | 41509 | 4.8972 | 0.2990 | | 0.1112 | 404.0 | 41612 | 4.8028 | 0.2887 | | 0.0926 | 405.0 | 41715 | 4.6858 | 0.3265 | | 0.1032 | 406.0 | 41818 | 4.7680 | 0.3127 | | 0.1066 | 407.0 | 41921 | 4.8087 | 0.2887 | | 0.1053 | 408.0 | 42024 | 4.8871 | 0.2852 | | 0.0999 | 409.0 | 42127 | 4.7056 | 0.2818 | | 0.0929 | 410.0 | 42230 | 4.8846 | 0.2852 | | 0.1138 | 411.0 | 42333 | 4.7741 | 0.2990 | | 0.1126 | 412.0 | 42436 | 4.9157 | 0.2887 | | 0.0835 | 413.0 | 42539 | 4.9607 | 0.2784 | | 0.1004 | 414.0 | 42642 | 4.7718 | 0.3024 | | 0.0972 | 415.0 | 42745 | 4.8288 | 0.3058 | | 0.1023 | 416.0 | 42848 | 4.9083 | 0.2646 | | 0.0948 | 417.0 | 42951 | 4.8509 | 0.2887 | | 0.0918 | 418.0 | 43054 | 4.8323 | 0.2715 | | 0.0961 | 419.0 | 43157 | 4.9570 | 0.2818 | | 0.0911 | 420.0 | 43260 | 4.9581 | 0.2680 | | 0.0927 | 421.0 | 43363 | 4.9856 | 0.2852 | | 0.0907 | 422.0 | 43466 | 4.9146 | 0.2818 | | 0.1039 | 423.0 | 43569 | 4.7813 | 0.2818 | | 0.1093 | 424.0 | 43672 | 4.9574 | 0.3024 | | 0.0859 | 425.0 | 43775 | 4.8934 | 0.2818 | | 0.111 | 426.0 | 43878 | 4.8562 | 0.2887 | | 0.0944 | 427.0 | 43981 | 4.8261 | 0.3058 | | 0.1 | 428.0 | 44084 | 4.8226 | 0.2990 | | 0.0965 | 429.0 | 44187 | 4.8104 | 0.3127 | | 0.0905 | 430.0 | 44290 | 4.7416 | 0.3058 | | 0.1095 | 431.0 | 44393 | 5.0877 | 0.2715 | | 0.0855 | 432.0 | 44496 | 4.9392 | 0.2784 | | 0.1079 | 433.0 | 44599 | 4.8227 | 0.3024 | | 0.102 | 434.0 | 44702 | 4.9779 | 0.2784 | | 0.0888 | 435.0 | 44805 | 4.9958 | 0.2955 | | 0.0842 | 436.0 | 44908 | 4.7461 | 0.3093 | | 0.0918 | 437.0 | 45011 | 5.0597 | 0.2646 | | 0.0911 | 438.0 | 45114 | 4.9771 | 0.2784 | | 0.0859 | 439.0 | 45217 | 4.8373 | 0.2990 | | 0.0916 | 440.0 | 45320 | 4.7408 | 0.3093 | | 0.0988 | 441.0 | 45423 | 4.7879 | 0.2612 | | 0.0994 | 442.0 | 45526 | 4.7355 | 0.2990 | | 0.102 | 443.0 | 45629 | 4.8696 | 0.3196 | | 0.0951 | 444.0 | 45732 | 4.9578 | 0.2955 | | 0.0843 | 445.0 | 45835 | 5.0340 | 0.3093 | | 0.0927 | 446.0 | 45938 | 5.0122 | 0.3058 | | 0.1028 | 447.0 | 46041 | 4.8365 | 0.2887 | | 0.0988 | 448.0 | 46144 | 4.9790 | 0.2543 | | 0.0993 | 449.0 | 46247 | 4.8574 | 0.2818 | | 0.0935 | 450.0 | 46350 | 5.0489 | 0.2784 | | 0.0942 | 451.0 | 46453 | 4.9593 | 0.2715 | | 0.0875 | 452.0 | 46556 | 4.9571 | 0.2887 | | 0.0968 | 453.0 | 46659 | 4.8004 | 0.3058 | | 0.0969 | 454.0 | 46762 | 5.1910 | 0.2852 | | 0.0954 | 455.0 | 46865 | 5.0355 | 0.2784 | | 0.1008 | 456.0 | 46968 | 
4.8536 | 0.2990 | | 0.09 | 457.0 | 47071 | 4.7043 | 0.2715 | | 0.1064 | 458.0 | 47174 | 4.8734 | 0.2749 | | 0.0902 | 459.0 | 47277 | 4.9062 | 0.3299 | | 0.0831 | 460.0 | 47380 | 5.0669 | 0.3058 | | 0.1008 | 461.0 | 47483 | 5.1403 | 0.2784 | | 0.0883 | 462.0 | 47586 | 5.1774 | 0.2818 | | 0.0915 | 463.0 | 47689 | 5.1486 | 0.2852 | | 0.1124 | 464.0 | 47792 | 5.1076 | 0.3093 | | 0.0892 | 465.0 | 47895 | 5.0262 | 0.2784 | | 0.088 | 466.0 | 47998 | 5.1672 | 0.2749 | | 0.0969 | 467.0 | 48101 | 5.1796 | 0.2784 | | 0.0851 | 468.0 | 48204 | 5.1422 | 0.2646 | | 0.094 | 469.0 | 48307 | 5.1663 | 0.2509 | | 0.085 | 470.0 | 48410 | 5.2027 | 0.2715 | | 0.0953 | 471.0 | 48513 | 5.0788 | 0.2955 | | 0.097 | 472.0 | 48616 | 5.1568 | 0.2680 | | 0.092 | 473.0 | 48719 | 5.0175 | 0.2749 | | 0.0876 | 474.0 | 48822 | 5.0064 | 0.2852 | | 0.0984 | 475.0 | 48925 | 4.9885 | 0.2784 | | 0.0781 | 476.0 | 49028 | 5.1671 | 0.2715 | | 0.1001 | 477.0 | 49131 | 5.2429 | 0.2749 | | 0.085 | 478.0 | 49234 | 5.2670 | 0.2749 | | 0.0924 | 479.0 | 49337 | 5.0759 | 0.2784 | | 0.0855 | 480.0 | 49440 | 5.2673 | 0.2955 | | 0.1018 | 481.0 | 49543 | 5.1715 | 0.3127 | | 0.0883 | 482.0 | 49646 | 5.0860 | 0.2887 | | 0.101 | 483.0 | 49749 | 5.1873 | 0.2818 | | 0.1061 | 484.0 | 49852 | 5.1156 | 0.2852 | | 0.1091 | 485.0 | 49955 | 5.1338 | 0.2887 | | 0.0935 | 486.0 | 50058 | 5.0872 | 0.2680 | | 0.0983 | 487.0 | 50161 | 5.0349 | 0.2818 | | 0.0955 | 488.0 | 50264 | 5.1492 | 0.2955 | | 0.1065 | 489.0 | 50367 | 5.0529 | 0.2749 | | 0.0771 | 490.0 | 50470 | 5.0177 | 0.2818 | | 0.0962 | 491.0 | 50573 | 5.0682 | 0.2887 | | 0.0701 | 492.0 | 50676 | 5.1446 | 0.2852 | | 0.0908 | 493.0 | 50779 | 5.1319 | 0.2955 | | 0.0957 | 494.0 | 50882 | 5.1732 | 0.2543 | | 0.1039 | 495.0 | 50985 | 5.1408 | 0.2715 | | 0.0947 | 496.0 | 51088 | 5.1906 | 0.2680 | | 0.097 | 497.0 | 51191 | 5.3184 | 0.2405 | | 0.0848 | 498.0 | 51294 | 5.1346 | 0.2921 | | 0.0855 | 499.0 | 51397 | 5.0153 | 0.2784 | | 0.1041 | 500.0 | 51500 | 5.1230 | 0.2612 | | 0.0936 | 501.0 | 51603 | 5.1331 | 0.2715 | | 0.0934 | 502.0 | 51706 | 5.1767 | 0.2612 | | 0.0966 | 503.0 | 51809 | 5.0495 | 0.2921 | | 0.0953 | 504.0 | 51912 | 5.0618 | 0.2543 | | 0.0852 | 505.0 | 52015 | 5.1167 | 0.2818 | | 0.0889 | 506.0 | 52118 | 5.0981 | 0.3058 | | 0.0854 | 507.0 | 52221 | 5.1853 | 0.2955 | | 0.0877 | 508.0 | 52324 | 5.2161 | 0.2887 | | 0.1074 | 509.0 | 52427 | 5.1670 | 0.2646 | | 0.1055 | 510.0 | 52530 | 5.0545 | 0.2749 | | 0.0789 | 511.0 | 52633 | 5.0691 | 0.2509 | | 0.0816 | 512.0 | 52736 | 5.0847 | 0.2887 | | 0.0818 | 513.0 | 52839 | 5.1307 | 0.3024 | | 0.0999 | 514.0 | 52942 | 5.1029 | 0.2852 | | 0.0787 | 515.0 | 53045 | 5.2270 | 0.2955 | | 0.0892 | 516.0 | 53148 | 5.1925 | 0.3024 | | 0.0995 | 517.0 | 53251 | 5.2463 | 0.2955 | | 0.0812 | 518.0 | 53354 | 5.3743 | 0.2955 | | 0.101 | 519.0 | 53457 | 5.1906 | 0.2852 | | 0.082 | 520.0 | 53560 | 5.1656 | 0.2887 | | 0.0904 | 521.0 | 53663 | 5.1051 | 0.2921 | | 0.0909 | 522.0 | 53766 | 5.2543 | 0.2990 | | 0.1033 | 523.0 | 53869 | 5.2171 | 0.2784 | | 0.0793 | 524.0 | 53972 | 5.2428 | 0.2955 | | 0.0879 | 525.0 | 54075 | 5.3480 | 0.2955 | | 0.0836 | 526.0 | 54178 | 5.2810 | 0.2784 | | 0.0886 | 527.0 | 54281 | 5.2532 | 0.2955 | | 0.0881 | 528.0 | 54384 | 5.4993 | 0.2646 | | 0.1158 | 529.0 | 54487 | 5.2754 | 0.2749 | | 0.0984 | 530.0 | 54590 | 5.2237 | 0.2509 | | 0.0974 | 531.0 | 54693 | 5.4133 | 0.2715 | | 0.0892 | 532.0 | 54796 | 5.2500 | 0.2852 | | 0.0892 | 533.0 | 54899 | 5.3204 | 0.2612 | | 0.0873 | 534.0 | 55002 | 5.2275 | 0.2749 | | 0.0882 | 535.0 | 55105 | 5.2049 | 
0.2921 | | 0.0915 | 536.0 | 55208 | 5.2155 | 0.2990 | | 0.0759 | 537.0 | 55311 | 5.2795 | 0.2818 | | 0.0893 | 538.0 | 55414 | 5.2271 | 0.2852 | | 0.0845 | 539.0 | 55517 | 5.2346 | 0.2680 | | 0.0912 | 540.0 | 55620 | 5.2443 | 0.3093 | | 0.0804 | 541.0 | 55723 | 5.2777 | 0.2921 | | 0.0753 | 542.0 | 55826 | 5.3583 | 0.2680 | | 0.0829 | 543.0 | 55929 | 5.1900 | 0.2852 | | 0.0984 | 544.0 | 56032 | 5.1930 | 0.2990 | | 0.0993 | 545.0 | 56135 | 5.1223 | 0.3093 | | 0.0793 | 546.0 | 56238 | 5.2101 | 0.3024 | | 0.0912 | 547.0 | 56341 | 5.2742 | 0.2749 | | 0.0892 | 548.0 | 56444 | 5.1734 | 0.2921 | | 0.1029 | 549.0 | 56547 | 5.2658 | 0.2921 | | 0.0863 | 550.0 | 56650 | 5.2372 | 0.2990 | | 0.1017 | 551.0 | 56753 | 5.2105 | 0.2680 | | 0.0883 | 552.0 | 56856 | 5.1055 | 0.2955 | | 0.1042 | 553.0 | 56959 | 5.2432 | 0.2612 | | 0.0817 | 554.0 | 57062 | 5.2423 | 0.2921 | | 0.0869 | 555.0 | 57165 | 5.2250 | 0.2784 | | 0.0843 | 556.0 | 57268 | 5.1962 | 0.2887 | | 0.0887 | 557.0 | 57371 | 5.1148 | 0.2990 | | 0.0838 | 558.0 | 57474 | 5.0202 | 0.2852 | | 0.0759 | 559.0 | 57577 | 5.0678 | 0.3265 | | 0.0934 | 560.0 | 57680 | 4.9558 | 0.3265 | | 0.0858 | 561.0 | 57783 | 5.0168 | 0.3093 | | 0.0873 | 562.0 | 57886 | 5.0457 | 0.3058 | | 0.0902 | 563.0 | 57989 | 5.0469 | 0.3230 | | 0.0793 | 564.0 | 58092 | 4.9871 | 0.3265 | | 0.0882 | 565.0 | 58195 | 5.1584 | 0.3162 | | 0.0984 | 566.0 | 58298 | 5.0747 | 0.3230 | | 0.0824 | 567.0 | 58401 | 5.1735 | 0.3196 | | 0.0794 | 568.0 | 58504 | 5.1323 | 0.3265 | | 0.0847 | 569.0 | 58607 | 5.1292 | 0.3230 | | 0.0833 | 570.0 | 58710 | 5.0710 | 0.3265 | | 0.0831 | 571.0 | 58813 | 5.1205 | 0.2955 | | 0.0922 | 572.0 | 58916 | 5.1007 | 0.2990 | | 0.0906 | 573.0 | 59019 | 5.1924 | 0.2955 | | 0.1079 | 574.0 | 59122 | 5.1933 | 0.2955 | | 0.0943 | 575.0 | 59225 | 5.1558 | 0.3024 | | 0.0877 | 576.0 | 59328 | 5.1573 | 0.2990 | | 0.0977 | 577.0 | 59431 | 5.0311 | 0.2990 | | 0.0751 | 578.0 | 59534 | 5.1581 | 0.2887 | | 0.096 | 579.0 | 59637 | 5.2115 | 0.2818 | | 0.0902 | 580.0 | 59740 | 5.2544 | 0.2921 | | 0.1052 | 581.0 | 59843 | 5.1612 | 0.3196 | | 0.0763 | 582.0 | 59946 | 5.1434 | 0.2921 | | 0.0904 | 583.0 | 60049 | 5.1911 | 0.2955 | | 0.0868 | 584.0 | 60152 | 5.1716 | 0.3024 | | 0.091 | 585.0 | 60255 | 5.1767 | 0.2818 | | 0.0936 | 586.0 | 60358 | 5.1801 | 0.2852 | | 0.082 | 587.0 | 60461 | 5.0496 | 0.2852 | | 0.0999 | 588.0 | 60564 | 5.2585 | 0.2852 | | 0.0826 | 589.0 | 60667 | 5.2566 | 0.2887 | | 0.0949 | 590.0 | 60770 | 5.3015 | 0.2990 | | 0.0828 | 591.0 | 60873 | 5.1411 | 0.3093 | | 0.0827 | 592.0 | 60976 | 5.1199 | 0.3024 | | 0.0943 | 593.0 | 61079 | 5.1063 | 0.3024 | | 0.076 | 594.0 | 61182 | 5.1141 | 0.3093 | | 0.0917 | 595.0 | 61285 | 5.1414 | 0.2990 | | 0.0976 | 596.0 | 61388 | 5.1441 | 0.2955 | | 0.0804 | 597.0 | 61491 | 5.1681 | 0.3024 | | 0.0923 | 598.0 | 61594 | 5.1333 | 0.3024 | | 0.093 | 599.0 | 61697 | 5.1260 | 0.2921 | | 0.0926 | 600.0 | 61800 | 5.1560 | 0.3196 | | 0.0844 | 601.0 | 61903 | 5.1931 | 0.2990 | | 0.0847 | 602.0 | 62006 | 5.0865 | 0.3024 | | 0.0822 | 603.0 | 62109 | 5.0862 | 0.3127 | | 0.0771 | 604.0 | 62212 | 5.0475 | 0.3058 | | 0.0885 | 605.0 | 62315 | 5.0884 | 0.3093 | | 0.0809 | 606.0 | 62418 | 5.2159 | 0.2921 | | 0.0892 | 607.0 | 62521 | 5.0867 | 0.3093 | | 0.085 | 608.0 | 62624 | 5.0848 | 0.3058 | | 0.0828 | 609.0 | 62727 | 5.2343 | 0.3093 | | 0.0978 | 610.0 | 62830 | 5.1203 | 0.2921 | | 0.0922 | 611.0 | 62933 | 5.2543 | 0.2921 | | 0.091 | 612.0 | 63036 | 5.1228 | 0.2784 | | 0.0926 | 613.0 | 63139 | 5.3064 | 0.2887 | | 0.078 | 614.0 | 63242 | 5.3367 | 0.2921 
| | 0.0791 | 615.0 | 63345 | 5.2738 | 0.3058 | | 0.0803 | 616.0 | 63448 | 5.2698 | 0.2990 | | 0.0936 | 617.0 | 63551 | 5.3062 | 0.3162 | | 0.0894 | 618.0 | 63654 | 5.3834 | 0.2990 | | 0.0794 | 619.0 | 63757 | 5.2768 | 0.3196 | | 0.0885 | 620.0 | 63860 | 5.2569 | 0.2990 | | 0.0866 | 621.0 | 63963 | 5.3325 | 0.2955 | | 0.079 | 622.0 | 64066 | 5.2798 | 0.2887 | | 0.084 | 623.0 | 64169 | 5.4603 | 0.2715 | | 0.0886 | 624.0 | 64272 | 5.2922 | 0.2784 | | 0.0726 | 625.0 | 64375 | 5.1952 | 0.2921 | | 0.0893 | 626.0 | 64478 | 5.4114 | 0.2543 | | 0.0881 | 627.0 | 64581 | 5.4867 | 0.2509 | | 0.079 | 628.0 | 64684 | 5.4838 | 0.2887 | | 0.0933 | 629.0 | 64787 | 5.5214 | 0.2921 | | 0.0795 | 630.0 | 64890 | 5.4256 | 0.2818 | | 0.0882 | 631.0 | 64993 | 5.3628 | 0.2818 | | 0.0826 | 632.0 | 65096 | 5.2816 | 0.2921 | | 0.0853 | 633.0 | 65199 | 5.2615 | 0.2749 | | 0.0862 | 634.0 | 65302 | 5.2622 | 0.2955 | | 0.0823 | 635.0 | 65405 | 5.3123 | 0.2955 | | 0.0915 | 636.0 | 65508 | 5.2486 | 0.2852 | | 0.0776 | 637.0 | 65611 | 5.2641 | 0.2955 | | 0.0799 | 638.0 | 65714 | 5.4327 | 0.2887 | | 0.0925 | 639.0 | 65817 | 5.3664 | 0.2852 | | 0.0865 | 640.0 | 65920 | 5.3066 | 0.2990 | | 0.09 | 641.0 | 66023 | 5.0985 | 0.3127 | | 0.0867 | 642.0 | 66126 | 5.1732 | 0.2955 | | 0.084 | 643.0 | 66229 | 5.2330 | 0.3127 | | 0.0806 | 644.0 | 66332 | 5.2097 | 0.3162 | | 0.0821 | 645.0 | 66435 | 5.3272 | 0.2990 | | 0.0869 | 646.0 | 66538 | 5.3930 | 0.3024 | | 0.0777 | 647.0 | 66641 | 5.3346 | 0.2990 | | 0.0822 | 648.0 | 66744 | 5.2165 | 0.2990 | | 0.0967 | 649.0 | 66847 | 5.2284 | 0.3024 | | 0.0792 | 650.0 | 66950 | 5.3921 | 0.3024 | | 0.0849 | 651.0 | 67053 | 5.5296 | 0.2749 | | 0.0854 | 652.0 | 67156 | 5.4795 | 0.2852 | | 0.0796 | 653.0 | 67259 | 5.3334 | 0.2784 | | 0.093 | 654.0 | 67362 | 5.3140 | 0.3058 | | 0.076 | 655.0 | 67465 | 5.3064 | 0.2887 | | 0.086 | 656.0 | 67568 | 5.3858 | 0.2990 | | 0.0856 | 657.0 | 67671 | 5.3206 | 0.2887 | | 0.0826 | 658.0 | 67774 | 5.2731 | 0.2852 | | 0.0972 | 659.0 | 67877 | 5.3104 | 0.2921 | | 0.0828 | 660.0 | 67980 | 5.3299 | 0.2955 | | 0.0792 | 661.0 | 68083 | 5.4611 | 0.2818 | | 0.0839 | 662.0 | 68186 | 5.4076 | 0.2749 | | 0.0816 | 663.0 | 68289 | 5.3335 | 0.2852 | | 0.0786 | 664.0 | 68392 | 5.3885 | 0.2577 | | 0.0958 | 665.0 | 68495 | 5.4822 | 0.2543 | | 0.0872 | 666.0 | 68598 | 5.4748 | 0.2784 | | 0.0823 | 667.0 | 68701 | 5.3412 | 0.2887 | | 0.0845 | 668.0 | 68804 | 5.2716 | 0.2955 | | 0.0882 | 669.0 | 68907 | 5.4058 | 0.2818 | | 0.0794 | 670.0 | 69010 | 5.5217 | 0.2543 | | 0.0876 | 671.0 | 69113 | 5.3548 | 0.2784 | | 0.0754 | 672.0 | 69216 | 5.3593 | 0.2921 | | 0.0842 | 673.0 | 69319 | 5.4261 | 0.2680 | | 0.0832 | 674.0 | 69422 | 5.3608 | 0.2887 | | 0.0874 | 675.0 | 69525 | 5.4222 | 0.2784 | | 0.0822 | 676.0 | 69628 | 5.2592 | 0.3058 | | 0.0852 | 677.0 | 69731 | 5.2905 | 0.2921 | | 0.0819 | 678.0 | 69834 | 5.2874 | 0.2955 | | 0.0842 | 679.0 | 69937 | 5.5141 | 0.2887 | | 0.0871 | 680.0 | 70040 | 5.3684 | 0.2990 | | 0.0756 | 681.0 | 70143 | 5.4528 | 0.2887 | | 0.0844 | 682.0 | 70246 | 5.3712 | 0.2818 | | 0.0774 | 683.0 | 70349 | 5.3621 | 0.2818 | | 0.0914 | 684.0 | 70452 | 5.3721 | 0.3024 | | 0.0883 | 685.0 | 70555 | 5.2809 | 0.2955 | | 0.0812 | 686.0 | 70658 | 5.3432 | 0.2955 | | 0.0838 | 687.0 | 70761 | 5.3131 | 0.3162 | | 0.081 | 688.0 | 70864 | 5.3051 | 0.3058 | | 0.0785 | 689.0 | 70967 | 5.2396 | 0.3024 | | 0.0842 | 690.0 | 71070 | 5.2475 | 0.2818 | | 0.0956 | 691.0 | 71173 | 5.3493 | 0.3058 | | 0.0823 | 692.0 | 71276 | 5.2118 | 0.3127 | | 0.0841 | 693.0 | 71379 | 5.1624 | 0.3127 | | 0.078 
| 694.0 | 71482 | 5.2229 | 0.3162 | | 0.0831 | 695.0 | 71585 | 5.2669 | 0.3127 | | 0.0863 | 696.0 | 71688 | 5.2763 | 0.3024 | | 0.0957 | 697.0 | 71791 | 5.3014 | 0.3333 | | 0.0775 | 698.0 | 71894 | 5.3820 | 0.2990 | | 0.0907 | 699.0 | 71997 | 5.4359 | 0.3127 | | 0.0802 | 700.0 | 72100 | 5.4012 | 0.3058 | | 0.0799 | 701.0 | 72203 | 5.3790 | 0.2784 | | 0.0822 | 702.0 | 72306 | 5.3593 | 0.2955 | | 0.0841 | 703.0 | 72409 | 5.3180 | 0.2990 | | 0.0883 | 704.0 | 72512 | 5.2755 | 0.3024 | | 0.0863 | 705.0 | 72615 | 5.2439 | 0.3024 | | 0.0776 | 706.0 | 72718 | 5.2928 | 0.2887 | | 0.0854 | 707.0 | 72821 | 5.3421 | 0.2749 | | 0.0853 | 708.0 | 72924 | 5.3366 | 0.2852 | | 0.0864 | 709.0 | 73027 | 5.3050 | 0.2990 | | 0.0802 | 710.0 | 73130 | 5.3095 | 0.3024 | | 0.0868 | 711.0 | 73233 | 5.3088 | 0.2921 | | 0.0817 | 712.0 | 73336 | 5.2846 | 0.2955 | | 0.0848 | 713.0 | 73439 | 5.3219 | 0.2612 | | 0.0891 | 714.0 | 73542 | 5.3707 | 0.2646 | | 0.0829 | 715.0 | 73645 | 5.3405 | 0.2852 | | 0.0882 | 716.0 | 73748 | 5.1875 | 0.3024 | | 0.0944 | 717.0 | 73851 | 5.2667 | 0.2921 | | 0.0713 | 718.0 | 73954 | 5.2920 | 0.2818 | | 0.0855 | 719.0 | 74057 | 5.1722 | 0.2955 | | 0.0812 | 720.0 | 74160 | 5.1372 | 0.2921 | | 0.0731 | 721.0 | 74263 | 5.1013 | 0.2921 | | 0.0845 | 722.0 | 74366 | 5.1055 | 0.2990 | | 0.0857 | 723.0 | 74469 | 5.2164 | 0.2921 | | 0.0843 | 724.0 | 74572 | 5.3023 | 0.2852 | | 0.084 | 725.0 | 74675 | 5.1233 | 0.3127 | | 0.0846 | 726.0 | 74778 | 5.3163 | 0.2680 | | 0.0838 | 727.0 | 74881 | 5.2244 | 0.2749 | | 0.0815 | 728.0 | 74984 | 5.1616 | 0.2784 | | 0.0849 | 729.0 | 75087 | 5.1514 | 0.2955 | | 0.0818 | 730.0 | 75190 | 5.1428 | 0.2990 | | 0.0751 | 731.0 | 75293 | 5.1820 | 0.2749 | | 0.0766 | 732.0 | 75396 | 5.2326 | 0.2749 | | 0.0772 | 733.0 | 75499 | 5.2083 | 0.2955 | | 0.0846 | 734.0 | 75602 | 5.3257 | 0.2887 | | 0.0811 | 735.0 | 75705 | 5.3460 | 0.2784 | | 0.089 | 736.0 | 75808 | 5.3004 | 0.2852 | | 0.0711 | 737.0 | 75911 | 5.2424 | 0.2955 | | 0.0852 | 738.0 | 76014 | 5.3143 | 0.2612 | | 0.0798 | 739.0 | 76117 | 5.3268 | 0.2646 | | 0.0783 | 740.0 | 76220 | 5.2696 | 0.2921 | | 0.086 | 741.0 | 76323 | 5.2744 | 0.2749 | | 0.0778 | 742.0 | 76426 | 5.3274 | 0.2818 | | 0.0832 | 743.0 | 76529 | 5.3297 | 0.2852 | | 0.0826 | 744.0 | 76632 | 5.2858 | 0.2990 | | 0.0792 | 745.0 | 76735 | 5.3368 | 0.2852 | | 0.0787 | 746.0 | 76838 | 5.3574 | 0.2749 | | 0.0732 | 747.0 | 76941 | 5.3469 | 0.2852 | | 0.0857 | 748.0 | 77044 | 5.2975 | 0.2955 | | 0.07 | 749.0 | 77147 | 5.3372 | 0.2784 | | 0.0829 | 750.0 | 77250 | 5.2525 | 0.2921 | | 0.0794 | 751.0 | 77353 | 5.3314 | 0.2852 | | 0.0781 | 752.0 | 77456 | 5.3318 | 0.2715 | | 0.0914 | 753.0 | 77559 | 5.2651 | 0.2715 | | 0.0822 | 754.0 | 77662 | 5.3557 | 0.2852 | | 0.0782 | 755.0 | 77765 | 5.4120 | 0.2818 | | 0.0828 | 756.0 | 77868 | 5.4191 | 0.2921 | | 0.0747 | 757.0 | 77971 | 5.4100 | 0.3058 | | 0.0765 | 758.0 | 78074 | 5.3832 | 0.3024 | | 0.077 | 759.0 | 78177 | 5.3801 | 0.2955 | | 0.0751 | 760.0 | 78280 | 5.3274 | 0.3058 | | 0.0821 | 761.0 | 78383 | 5.3911 | 0.2955 | | 0.0854 | 762.0 | 78486 | 5.4113 | 0.3093 | | 0.0765 | 763.0 | 78589 | 5.3642 | 0.3024 | | 0.0787 | 764.0 | 78692 | 5.3545 | 0.2887 | | 0.0842 | 765.0 | 78795 | 5.3986 | 0.2990 | | 0.0856 | 766.0 | 78898 | 5.4038 | 0.2887 | | 0.082 | 767.0 | 79001 | 5.3815 | 0.3058 | | 0.0787 | 768.0 | 79104 | 5.4093 | 0.2852 | | 0.0731 | 769.0 | 79207 | 5.3961 | 0.2955 | | 0.0762 | 770.0 | 79310 | 5.3746 | 0.3093 | | 0.0874 | 771.0 | 79413 | 5.3983 | 0.3058 | | 0.0835 | 772.0 | 79516 | 5.4264 | 0.2887 | | 0.0841 | 
773.0 | 79619 | 5.4252 | 0.2990 | | 0.0792 | 774.0 | 79722 | 5.3730 | 0.3058 | | 0.0816 | 775.0 | 79825 | 5.3834 | 0.3127 | | 0.0928 | 776.0 | 79928 | 5.4694 | 0.2887 | | 0.0739 | 777.0 | 80031 | 5.3801 | 0.2887 | | 0.0778 | 778.0 | 80134 | 5.3827 | 0.2818 | | 0.0826 | 779.0 | 80237 | 5.4980 | 0.2887 | | 0.0873 | 780.0 | 80340 | 5.3884 | 0.2749 | | 0.0762 | 781.0 | 80443 | 5.3831 | 0.2887 | | 0.0802 | 782.0 | 80546 | 5.4449 | 0.2852 | | 0.0832 | 783.0 | 80649 | 5.4030 | 0.2921 | | 0.0716 | 784.0 | 80752 | 5.4508 | 0.2955 | | 0.0885 | 785.0 | 80855 | 5.3869 | 0.2887 | | 0.0685 | 786.0 | 80958 | 5.3692 | 0.2990 | | 0.0797 | 787.0 | 81061 | 5.3884 | 0.3024 | | 0.0748 | 788.0 | 81164 | 5.3263 | 0.3162 | | 0.0741 | 789.0 | 81267 | 5.3524 | 0.3024 | | 0.0767 | 790.0 | 81370 | 5.2625 | 0.3230 | | 0.0814 | 791.0 | 81473 | 5.2668 | 0.3299 | | 0.0845 | 792.0 | 81576 | 5.2356 | 0.3093 | | 0.076 | 793.0 | 81679 | 5.2616 | 0.3230 | | 0.0769 | 794.0 | 81782 | 5.3046 | 0.3333 | | 0.0866 | 795.0 | 81885 | 5.2902 | 0.3299 | | 0.0772 | 796.0 | 81988 | 5.3078 | 0.3127 | | 0.079 | 797.0 | 82091 | 5.2889 | 0.2955 | | 0.0797 | 798.0 | 82194 | 5.2158 | 0.2990 | | 0.0802 | 799.0 | 82297 | 5.3130 | 0.3024 | | 0.0859 | 800.0 | 82400 | 5.2843 | 0.3162 | | 0.0789 | 801.0 | 82503 | 5.2430 | 0.3127 | | 0.0809 | 802.0 | 82606 | 5.2167 | 0.3436 | | 0.0787 | 803.0 | 82709 | 5.2202 | 0.3127 | | 0.0878 | 804.0 | 82812 | 5.3567 | 0.3024 | | 0.0772 | 805.0 | 82915 | 5.3986 | 0.2887 | | 0.0809 | 806.0 | 83018 | 5.3578 | 0.2887 | | 0.0815 | 807.0 | 83121 | 5.3142 | 0.3093 | | 0.0762 | 808.0 | 83224 | 5.2857 | 0.2955 | | 0.0732 | 809.0 | 83327 | 5.2571 | 0.2955 | | 0.0779 | 810.0 | 83430 | 5.2882 | 0.2887 | | 0.0872 | 811.0 | 83533 | 5.3455 | 0.3024 | | 0.076 | 812.0 | 83636 | 5.2805 | 0.2955 | | 0.0894 | 813.0 | 83739 | 5.2921 | 0.2990 | | 0.0724 | 814.0 | 83842 | 5.3510 | 0.2887 | | 0.0828 | 815.0 | 83945 | 5.3011 | 0.3024 | | 0.0818 | 816.0 | 84048 | 5.2944 | 0.3196 | | 0.0728 | 817.0 | 84151 | 5.2526 | 0.3058 | | 0.0776 | 818.0 | 84254 | 5.2646 | 0.2921 | | 0.0768 | 819.0 | 84357 | 5.3151 | 0.3024 | | 0.0725 | 820.0 | 84460 | 5.3043 | 0.3058 | | 0.077 | 821.0 | 84563 | 5.3536 | 0.3024 | | 0.0815 | 822.0 | 84666 | 5.3243 | 0.3162 | | 0.0753 | 823.0 | 84769 | 5.3728 | 0.2990 | | 0.0837 | 824.0 | 84872 | 5.3566 | 0.2852 | | 0.0786 | 825.0 | 84975 | 5.3487 | 0.3058 | | 0.0897 | 826.0 | 85078 | 5.3847 | 0.2955 | | 0.079 | 827.0 | 85181 | 5.3576 | 0.2955 | | 0.0791 | 828.0 | 85284 | 5.3439 | 0.2818 | | 0.0778 | 829.0 | 85387 | 5.3457 | 0.2921 | | 0.0732 | 830.0 | 85490 | 5.3470 | 0.2887 | | 0.0752 | 831.0 | 85593 | 5.3294 | 0.2921 | | 0.0823 | 832.0 | 85696 | 5.4163 | 0.2887 | | 0.0803 | 833.0 | 85799 | 5.3962 | 0.3058 | | 0.0792 | 834.0 | 85902 | 5.3944 | 0.3127 | | 0.0701 | 835.0 | 86005 | 5.4105 | 0.3024 | | 0.0853 | 836.0 | 86108 | 5.3402 | 0.3162 | | 0.0753 | 837.0 | 86211 | 5.3846 | 0.3196 | | 0.0867 | 838.0 | 86314 | 5.4029 | 0.3024 | | 0.0722 | 839.0 | 86417 | 5.3613 | 0.3093 | | 0.0686 | 840.0 | 86520 | 5.3966 | 0.3093 | | 0.0891 | 841.0 | 86623 | 5.3980 | 0.2955 | | 0.0826 | 842.0 | 86726 | 5.3373 | 0.3024 | | 0.0767 | 843.0 | 86829 | 5.4020 | 0.2955 | | 0.0816 | 844.0 | 86932 | 5.3813 | 0.2784 | | 0.0775 | 845.0 | 87035 | 5.3968 | 0.2887 | | 0.0694 | 846.0 | 87138 | 5.4287 | 0.2955 | | 0.0816 | 847.0 | 87241 | 5.4425 | 0.2990 | | 0.0697 | 848.0 | 87344 | 5.4049 | 0.3024 | | 0.0771 | 849.0 | 87447 | 5.4044 | 0.2990 | | 0.0712 | 850.0 | 87550 | 5.4029 | 0.2990 | | 0.0806 | 851.0 | 87653 | 5.3960 | 0.2818 | | 0.0766 | 
852.0 | 87756 | 5.3878 | 0.2852 | | 0.074 | 853.0 | 87859 | 5.4213 | 0.2749 | | 0.0779 | 854.0 | 87962 | 5.4028 | 0.2784 | | 0.084 | 855.0 | 88065 | 5.4720 | 0.2852 | | 0.0757 | 856.0 | 88168 | 5.4470 | 0.2784 | | 0.0763 | 857.0 | 88271 | 5.4431 | 0.2749 | | 0.0816 | 858.0 | 88374 | 5.4127 | 0.2749 | | 0.0761 | 859.0 | 88477 | 5.4201 | 0.2646 | | 0.093 | 860.0 | 88580 | 5.3464 | 0.3024 | | 0.0729 | 861.0 | 88683 | 5.3696 | 0.2852 | | 0.0792 | 862.0 | 88786 | 5.3409 | 0.2990 | | 0.0742 | 863.0 | 88889 | 5.3730 | 0.2818 | | 0.0795 | 864.0 | 88992 | 5.4294 | 0.2818 | | 0.0701 | 865.0 | 89095 | 5.4176 | 0.2715 | | 0.087 | 866.0 | 89198 | 5.4339 | 0.2784 | | 0.0775 | 867.0 | 89301 | 5.4669 | 0.2818 | | 0.0764 | 868.0 | 89404 | 5.4774 | 0.2955 | | 0.0827 | 869.0 | 89507 | 5.4227 | 0.2921 | | 0.0757 | 870.0 | 89610 | 5.4220 | 0.3024 | | 0.0761 | 871.0 | 89713 | 5.3954 | 0.2887 | | 0.0777 | 872.0 | 89816 | 5.3860 | 0.3024 | | 0.0737 | 873.0 | 89919 | 5.3625 | 0.2818 | | 0.0777 | 874.0 | 90022 | 5.4137 | 0.2955 | | 0.0758 | 875.0 | 90125 | 5.4152 | 0.2818 | | 0.0764 | 876.0 | 90228 | 5.3812 | 0.2921 | | 0.087 | 877.0 | 90331 | 5.3757 | 0.3024 | | 0.0705 | 878.0 | 90434 | 5.3995 | 0.2852 | | 0.0831 | 879.0 | 90537 | 5.3755 | 0.2852 | | 0.0692 | 880.0 | 90640 | 5.3843 | 0.2852 | | 0.0752 | 881.0 | 90743 | 5.3978 | 0.2852 | | 0.0732 | 882.0 | 90846 | 5.3873 | 0.2887 | | 0.0836 | 883.0 | 90949 | 5.3961 | 0.2818 | | 0.0761 | 884.0 | 91052 | 5.4159 | 0.2887 | | 0.082 | 885.0 | 91155 | 5.4183 | 0.2990 | | 0.0729 | 886.0 | 91258 | 5.4438 | 0.2921 | | 0.0908 | 887.0 | 91361 | 5.4588 | 0.2784 | | 0.0677 | 888.0 | 91464 | 5.4840 | 0.2818 | | 0.0821 | 889.0 | 91567 | 5.4664 | 0.2887 | | 0.0812 | 890.0 | 91670 | 5.5019 | 0.2990 | | 0.0849 | 891.0 | 91773 | 5.4783 | 0.3024 | | 0.079 | 892.0 | 91876 | 5.4933 | 0.2818 | | 0.0703 | 893.0 | 91979 | 5.5191 | 0.2921 | | 0.0777 | 894.0 | 92082 | 5.5171 | 0.2921 | | 0.0767 | 895.0 | 92185 | 5.5280 | 0.2818 | | 0.0697 | 896.0 | 92288 | 5.4920 | 0.2887 | | 0.0831 | 897.0 | 92391 | 5.4587 | 0.2887 | | 0.0715 | 898.0 | 92494 | 5.4843 | 0.2887 | | 0.0764 | 899.0 | 92597 | 5.5036 | 0.2921 | | 0.0785 | 900.0 | 92700 | 5.4781 | 0.2921 | | 0.0783 | 901.0 | 92803 | 5.4685 | 0.3058 | | 0.0791 | 902.0 | 92906 | 5.4434 | 0.3093 | | 0.0714 | 903.0 | 93009 | 5.4704 | 0.3093 | | 0.0834 | 904.0 | 93112 | 5.4543 | 0.3058 | | 0.0796 | 905.0 | 93215 | 5.4430 | 0.3093 | | 0.0741 | 906.0 | 93318 | 5.4621 | 0.2990 | | 0.0752 | 907.0 | 93421 | 5.4498 | 0.3024 | | 0.0776 | 908.0 | 93524 | 5.4553 | 0.2955 | | 0.0795 | 909.0 | 93627 | 5.4151 | 0.3024 | | 0.0771 | 910.0 | 93730 | 5.3965 | 0.2990 | | 0.0756 | 911.0 | 93833 | 5.4121 | 0.3058 | | 0.0769 | 912.0 | 93936 | 5.4056 | 0.2990 | | 0.0799 | 913.0 | 94039 | 5.3876 | 0.3024 | | 0.0853 | 914.0 | 94142 | 5.4022 | 0.2990 | | 0.0726 | 915.0 | 94245 | 5.4384 | 0.2852 | | 0.0745 | 916.0 | 94348 | 5.4223 | 0.2955 | | 0.0688 | 917.0 | 94451 | 5.4298 | 0.2887 | | 0.0743 | 918.0 | 94554 | 5.4227 | 0.3024 | | 0.0842 | 919.0 | 94657 | 5.3807 | 0.3093 | | 0.0732 | 920.0 | 94760 | 5.3881 | 0.2990 | | 0.0717 | 921.0 | 94863 | 5.3828 | 0.2990 | | 0.084 | 922.0 | 94966 | 5.3770 | 0.3024 | | 0.079 | 923.0 | 95069 | 5.3873 | 0.2887 | | 0.0761 | 924.0 | 95172 | 5.3788 | 0.2921 | | 0.0777 | 925.0 | 95275 | 5.3932 | 0.2921 | | 0.0729 | 926.0 | 95378 | 5.4352 | 0.2921 | | 0.0756 | 927.0 | 95481 | 5.4271 | 0.2921 | | 0.0699 | 928.0 | 95584 | 5.4086 | 0.2955 | | 0.0814 | 929.0 | 95687 | 5.4210 | 0.2784 | | 0.07 | 930.0 | 95790 | 5.4176 | 0.2852 | | 0.0736 | 931.0 | 
95893 | 5.4347 | 0.2852 | | 0.0694 | 932.0 | 95996 | 5.4364 | 0.2887 | | 0.0771 | 933.0 | 96099 | 5.4468 | 0.2852 | | 0.0718 | 934.0 | 96202 | 5.4523 | 0.2887 | | 0.0784 | 935.0 | 96305 | 5.4216 | 0.2852 | | 0.087 | 936.0 | 96408 | 5.4159 | 0.2818 | | 0.0717 | 937.0 | 96511 | 5.4228 | 0.2852 | | 0.0714 | 938.0 | 96614 | 5.4017 | 0.2852 | | 0.0754 | 939.0 | 96717 | 5.4021 | 0.2852 | | 0.0733 | 940.0 | 96820 | 5.3958 | 0.2852 | | 0.0697 | 941.0 | 96923 | 5.3859 | 0.2887 | | 0.082 | 942.0 | 97026 | 5.3714 | 0.2921 | | 0.0696 | 943.0 | 97129 | 5.3697 | 0.2921 | | 0.0719 | 944.0 | 97232 | 5.3969 | 0.2784 | | 0.0772 | 945.0 | 97335 | 5.3958 | 0.2852 | | 0.0759 | 946.0 | 97438 | 5.4128 | 0.2818 | | 0.074 | 947.0 | 97541 | 5.4283 | 0.2818 | | 0.0704 | 948.0 | 97644 | 5.4305 | 0.2818 | | 0.069 | 949.0 | 97747 | 5.4300 | 0.2852 | | 0.0701 | 950.0 | 97850 | 5.4446 | 0.2818 | | 0.087 | 951.0 | 97953 | 5.4365 | 0.2818 | | 0.0837 | 952.0 | 98056 | 5.4268 | 0.2921 | | 0.0754 | 953.0 | 98159 | 5.4260 | 0.2955 | | 0.0778 | 954.0 | 98262 | 5.4057 | 0.2955 | | 0.0643 | 955.0 | 98365 | 5.3992 | 0.2921 | | 0.0768 | 956.0 | 98468 | 5.3886 | 0.2990 | | 0.0727 | 957.0 | 98571 | 5.3845 | 0.2990 | | 0.0859 | 958.0 | 98674 | 5.3822 | 0.2955 | | 0.0831 | 959.0 | 98777 | 5.3852 | 0.2955 | | 0.0756 | 960.0 | 98880 | 5.3884 | 0.2955 | | 0.0857 | 961.0 | 98983 | 5.3892 | 0.2955 | | 0.0707 | 962.0 | 99086 | 5.3776 | 0.2921 | | 0.0746 | 963.0 | 99189 | 5.3785 | 0.3058 | | 0.0745 | 964.0 | 99292 | 5.3776 | 0.2921 | | 0.0827 | 965.0 | 99395 | 5.3704 | 0.2887 | | 0.0774 | 966.0 | 99498 | 5.3653 | 0.2921 | | 0.0795 | 967.0 | 99601 | 5.3569 | 0.2887 | | 0.0759 | 968.0 | 99704 | 5.3515 | 0.2887 | | 0.0713 | 969.0 | 99807 | 5.3752 | 0.2955 | | 0.0735 | 970.0 | 99910 | 5.3728 | 0.2955 | | 0.0777 | 971.0 | 100013 | 5.3690 | 0.2955 | | 0.0844 | 972.0 | 100116 | 5.3782 | 0.2921 | | 0.0758 | 973.0 | 100219 | 5.3822 | 0.2921 | | 0.0735 | 974.0 | 100322 | 5.3893 | 0.2852 | | 0.0698 | 975.0 | 100425 | 5.3887 | 0.2818 | | 0.0773 | 976.0 | 100528 | 5.3908 | 0.2852 | | 0.0695 | 977.0 | 100631 | 5.3909 | 0.2887 | | 0.0786 | 978.0 | 100734 | 5.3939 | 0.2921 | | 0.0784 | 979.0 | 100837 | 5.3838 | 0.2921 | | 0.078 | 980.0 | 100940 | 5.3891 | 0.2921 | | 0.0721 | 981.0 | 101043 | 5.3875 | 0.2887 | | 0.0779 | 982.0 | 101146 | 5.3925 | 0.2887 | | 0.0706 | 983.0 | 101249 | 5.4006 | 0.2921 | | 0.0808 | 984.0 | 101352 | 5.4022 | 0.2921 | | 0.071 | 985.0 | 101455 | 5.4076 | 0.2921 | | 0.0743 | 986.0 | 101558 | 5.4104 | 0.2921 | | 0.0784 | 987.0 | 101661 | 5.4093 | 0.2921 | | 0.0793 | 988.0 | 101764 | 5.4071 | 0.2921 | | 0.0838 | 989.0 | 101867 | 5.4029 | 0.2921 | | 0.0708 | 990.0 | 101970 | 5.4035 | 0.2921 | | 0.0742 | 991.0 | 102073 | 5.4021 | 0.2921 | | 0.0746 | 992.0 | 102176 | 5.4050 | 0.2921 | | 0.0756 | 993.0 | 102279 | 5.4059 | 0.2921 | | 0.0744 | 994.0 | 102382 | 5.4053 | 0.2921 | | 0.0741 | 995.0 | 102485 | 5.4075 | 0.2921 | | 0.0757 | 996.0 | 102588 | 5.4072 | 0.2921 | | 0.0735 | 997.0 | 102691 | 5.4086 | 0.2921 | | 0.0708 | 998.0 | 102794 | 5.4088 | 0.2921 | | 0.0812 | 999.0 | 102897 | 5.4088 | 0.2921 | | 0.0722 | 1000.0 | 103000 | 5.4090 | 0.2921 | ### Framework versions - Transformers 4.28.0.dev0 - Pytorch 1.13.1+cu117 - Datasets 2.9.0 - Tokenizers 0.12.1
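For reference, the hyperparameters listed above map onto `TrainingArguments` roughly as follows — a minimal sketch assuming a standard `Trainer` setup; the output directory and evaluation strategy are assumptions, not recorded in this card:
```
from transformers import TrainingArguments

# Sketch of the listed hyperparameters; output_dir and evaluation_strategy
# are assumptions (the card only records the values below). Adam betas and
# epsilon match the Trainer defaults, so they need no explicit arguments.
training_args = TrainingArguments(
    output_dir="game-ad-0306_outputs",  # assumed from the model name
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=13373,
    lr_scheduler_type="linear",
    num_train_epochs=1000.0,
    evaluation_strategy="epoch",  # the table reports one eval per epoch
)
```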
[ "yx-二次元广告", "banner", "平面-日式", "广告图", "广告投放买量图", "快手信息流投放图", "排版图片", "海外", "海报视觉", "渠道图", "港澳台广告图", "游戏 banner", "二次元广告", "游戏kv", "游戏买量", "游戏五图", "游戏发行", "游戏广告", "游戏广告-排版", "游戏广告图", "游戏投放", "游戏排版", "游戏推广", "二次元投放图", "游戏海报-字体样式", "科幻端界面", "竖版广点通", "竖版本banner", "纯日式banner~", "面板参考", "仙侠投放图", "卡牌背景", "商店图", "商店图横版", "国风、传奇类", "小姐姐成图" ]
leejw51/vit-base-beans
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-beans This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset. It achieves the following results on the evaluation set: - Loss: 0.0875 - Accuracy: 0.9850 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.2574 | 1.0 | 130 | 0.2307 | 0.9624 | | 0.2785 | 2.0 | 260 | 0.1109 | 0.9925 | | 0.1496 | 3.0 | 390 | 0.1109 | 0.9699 | | 0.0916 | 4.0 | 520 | 0.0875 | 0.9850 | | 0.1489 | 5.0 | 650 | 0.0886 | 0.9774 | ### Framework versions - Transformers 4.28.0.dev0 - Pytorch 2.0.0+cu117 - Datasets 2.10.1 - Tokenizers 0.13.2
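As a hedged sketch of the setup this card implies (the exact preprocessing pipeline of the original run is not recorded), the beans dataset and the base checkpoint can be loaded as:
```
from datasets import load_dataset
from transformers import AutoImageProcessor, AutoModelForImageClassification

dataset = load_dataset("beans")  # train/validation/test splits, 3 classes
processor = AutoImageProcessor.from_pretrained("google/vit-base-patch16-224-in21k")
model = AutoModelForImageClassification.from_pretrained(
    "google/vit-base-patch16-224-in21k",
    num_labels=3,  # angular_leaf_spot, bean_rust, healthy
)
```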
[ "angular_leaf_spot", "bean_rust", "healthy" ]
yujiepan/internal.swin-base-food101-int8-structured30.56
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-food101-jpqd-1to2r1-epo7-finetuned-student This model is a fine-tuned version of [skylord/swin-finetuned-food101](https://huggingface.co/skylord/swin-finetuned-food101) on the food101 dataset. It achieves the following results on the evaluation set: - Loss: 0.1947 - Accuracy: 0.9213 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 128 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 7.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.2342 | 0.42 | 500 | 0.1993 | 0.9099 | | 0.2891 | 0.84 | 1000 | 0.1912 | 0.9137 | | 67.4995 | 1.27 | 1500 | 66.4760 | 0.8035 | | 109.8398 | 1.69 | 2000 | 109.5154 | 0.4499 | | 0.6337 | 2.11 | 2500 | 0.4865 | 0.8826 | | 0.6605 | 2.54 | 3000 | 0.3551 | 0.9013 | | 0.4013 | 2.96 | 3500 | 0.3176 | 0.9044 | | 0.3949 | 3.38 | 4000 | 0.2839 | 0.9079 | | 0.4632 | 3.8 | 4500 | 0.2652 | 0.9118 | | 0.3717 | 4.23 | 5000 | 0.2459 | 0.9147 | | 0.3308 | 4.65 | 5500 | 0.2439 | 0.9159 | | 0.4232 | 5.07 | 6000 | 0.2259 | 0.9169 | | 0.3426 | 5.49 | 6500 | 0.2147 | 0.9199 | | 0.331 | 5.92 | 7000 | 0.2086 | 0.9189 | | 0.3032 | 6.34 | 7500 | 0.2036 | 0.9201 | | 0.3393 | 6.76 | 8000 | 0.1978 | 0.9204 | ### Framework versions - Transformers 4.26.0 - Pytorch 1.13.1+cu116 - Datasets 2.8.0 - Tokenizers 0.13.2
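The Accuracy column above is top-1 accuracy over the food101 validation split; a typical `compute_metrics` hook for `Trainer` — a sketch of the standard pattern, not the exact script behind this card — looks like:
```
import numpy as np
import evaluate

# Top-1 accuracy: fraction of images whose highest-scoring class is correct.
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```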
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
CHAOYUYD/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-flower This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results ### Framework versions - Transformers 4.24.0 - Pytorch 1.13.1+cu116 - Datasets 2.7.1 - Tokenizers 0.13.2
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
platzi/platzi-vit-model-Joaquin-Romero
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # platzi-vit-model-Joaquin-Romero This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the datasetX dataset. It achieves the following results on the evaluation set: - Loss: 0.0613 - Accuracy: 0.9850 ## Model description It's an image classification model fine-tuned on the beans dataset. ## Intended uses & limitations None ## Training and evaluation data Beans dataset ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.1475 | 3.85 | 500 | 0.0613 | 0.9850 | ### Framework versions - Transformers 4.27.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "angular_leaf_spot", "bean_rust", "healthy" ]
Nahrawy/AIorNot
Classification model used to classify real images and AI-generated images.\ The model is swin-tiny-patch4-window7-224 fine-tuned on the aiornot dataset.\ To use the model:
```
import torch
from PIL import Image
from transformers import AutoFeatureExtractor, AutoModelForImageClassification

labels = ["Real", "AI"]
feature_extractor = AutoFeatureExtractor.from_pretrained("Nahrawy/AIorNot")
model = AutoModelForImageClassification.from_pretrained("Nahrawy/AIorNot")

image = Image.open("example.jpg")  # placeholder path to the image to classify
inputs = feature_extractor(image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
logits = outputs.logits
prediction = logits.argmax(-1).item()
label = labels[prediction]
```
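Alternatively, the `pipeline` helper wraps the same feature extractor and model in one call — a minimal sketch; the image path is a placeholder, and the returned label names come from the model config:
```
from transformers import pipeline

# One-liner equivalent of the manual forward pass above.
classifier = pipeline("image-classification", model="Nahrawy/AIorNot")
print(classifier("example.jpg"))  # e.g. [{"label": "real", "score": ...}, ...]
```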
[ "real", "ai" ]
ThankGod/vit-base-aiornot
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-aiornot This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the None dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results ### Framework versions - Transformers 4.27.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "not_ai_gen", "ai_gen" ]
sgonzalezsilot/vit-Diatome
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-Diatome This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the Diatome dataset. It achieves the following results on the evaluation set: - Loss: 0.2285 - Accuracy: 0.9429 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 2.2009 | 0.23 | 100 | 1.9938 | 0.6045 | | 1.2823 | 0.47 | 200 | 1.2293 | 0.7327 | | 0.7569 | 0.7 | 300 | 0.9534 | 0.7868 | | 0.7428 | 0.93 | 400 | 0.7906 | 0.8078 | | 0.4309 | 1.16 | 500 | 0.5759 | 0.8538 | | 0.349 | 1.4 | 600 | 0.5070 | 0.8742 | | 0.517 | 1.63 | 700 | 0.5048 | 0.8794 | | 0.3667 | 1.86 | 800 | 0.5212 | 0.8596 | | 0.169 | 2.09 | 900 | 0.4112 | 0.8888 | | 0.1443 | 2.33 | 1000 | 0.3294 | 0.9109 | | 0.1389 | 2.56 | 1100 | 0.3146 | 0.9190 | | 0.142 | 2.79 | 1200 | 0.2994 | 0.9208 | | 0.0921 | 3.02 | 1300 | 0.2620 | 0.9324 | | 0.0768 | 3.26 | 1400 | 0.2516 | 0.9336 | | 0.061 | 3.49 | 1500 | 0.2425 | 0.9388 | | 0.0729 | 3.72 | 1600 | 0.2335 | 0.9418 | | 0.0757 | 3.95 | 1700 | 0.2285 | 0.9429 | ### Framework versions - Transformers 4.27.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "achnanthidium", "aulacoseira", "stephanodiscus", "pinnularia", "nitzschia", "chamaepinnularia", "surirella", "tabellaria", "stauroneis", "cavinula", "neidiopsis", "hantzschia", "cocconeis", "navicula", "sellaphora", "craticula", "adlafia", "stenopterobia", "mayamaea", "gomphonema", "caloneis", "neidium", "staurosirella", "brachysira", "rossithidium", "staurosira", "luticola", "diadesmis", "amphora", "geissleria", "encyonema", "stauroforma", "frustulia", "eunotia", "eolimna", "cymbopleura", "psammothidium", "pseudostaurosira", "planothidium", "cyclotella", "diatoma", "meridion", "fragilaria", "encyonopsis" ]
yujiepan/internal.swin-base-food101-int8-structured38.63
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-food101-jpqd-1to2r1.5-epo7-finetuned-student This model is a fine-tuned version of [skylord/swin-finetuned-food101](https://huggingface.co/skylord/swin-finetuned-food101) on the food101 dataset. It achieves the following results on the evaluation set: - Loss: 0.2658 - Accuracy: 0.9124 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 128 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 7.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.2977 | 0.42 | 500 | 0.1949 | 0.9112 | | 0.3183 | 0.84 | 1000 | 0.1867 | 0.9144 | | 99.9552 | 1.27 | 1500 | 99.4882 | 0.7577 | | 162.4195 | 1.69 | 2000 | 162.7763 | 0.3373 | | 1.2272 | 2.11 | 2500 | 0.7333 | 0.8564 | | 1.0236 | 2.54 | 3000 | 0.5016 | 0.8823 | | 0.6472 | 2.96 | 3500 | 0.4337 | 0.8908 | | 0.52 | 3.38 | 4000 | 0.3927 | 0.8974 | | 0.6075 | 3.8 | 4500 | 0.3506 | 0.9011 | | 0.5348 | 4.23 | 5000 | 0.3425 | 0.9006 | | 0.444 | 4.65 | 5500 | 0.3268 | 0.9044 | | 0.5787 | 5.07 | 6000 | 0.3020 | 0.9078 | | 0.3995 | 5.49 | 6500 | 0.2932 | 0.9095 | | 0.414 | 5.92 | 7000 | 0.2806 | 0.9104 | | 0.4386 | 6.34 | 7500 | 0.2738 | 0.9112 | | 0.452 | 6.76 | 8000 | 0.2673 | 0.9127 | ### Framework versions - Transformers 4.26.0 - Pytorch 1.13.1+cu116 - Datasets 2.8.0 - Tokenizers 0.13.2
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
yujiepan/internal.swin-base-food101-int8-structured38.01
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-food101-jpqd-1to2r1.5-epo10-finetuned-student This model is a fine-tuned version of [skylord/swin-finetuned-food101](https://huggingface.co/skylord/swin-finetuned-food101) on the food101 dataset. It achieves the following results on the evaluation set: - Loss: 0.2391 - Accuracy: 0.9184 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 128 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 0.3011 | 0.42 | 500 | 0.1951 | 0.9124 | | 0.2613 | 0.84 | 1000 | 0.1897 | 0.9139 | | 100.1552 | 1.27 | 1500 | 99.5975 | 0.7445 | | 162.0751 | 1.69 | 2000 | 162.5020 | 0.3512 | | 1.061 | 2.11 | 2500 | 0.7523 | 0.8550 | | 0.9728 | 2.54 | 3000 | 0.5263 | 0.8767 | | 0.5851 | 2.96 | 3500 | 0.4599 | 0.8892 | | 0.4668 | 3.38 | 4000 | 0.4064 | 0.8938 | | 0.6967 | 3.8 | 4500 | 0.3814 | 0.8986 | | 0.4928 | 4.23 | 5000 | 0.3522 | 0.9036 | | 0.4893 | 4.65 | 5500 | 0.3562 | 0.9026 | | 0.5421 | 5.07 | 6000 | 0.3182 | 0.9049 | | 0.4405 | 5.49 | 6500 | 0.3112 | 0.9071 | | 0.4423 | 5.92 | 7000 | 0.3012 | 0.9092 | | 0.4143 | 6.34 | 7500 | 0.2958 | 0.9095 | | 0.4997 | 6.76 | 8000 | 0.2796 | 0.9126 | | 0.2448 | 7.19 | 8500 | 0.2747 | 0.9124 | | 0.4468 | 7.61 | 9000 | 0.2699 | 0.9144 | | 0.4163 | 8.03 | 9500 | 0.2583 | 0.9166 | | 0.3651 | 8.45 | 10000 | 0.2567 | 0.9165 | | 0.3946 | 8.88 | 10500 | 0.2489 | 0.9176 | | 0.3196 | 9.3 | 11000 | 0.2444 | 0.9180 | | 0.312 | 9.72 | 11500 | 0.2402 | 0.9172 | ### Framework versions - Transformers 4.26.0 - Pytorch 1.13.1+cu116 - Datasets 2.8.0 - Tokenizers 0.13.2
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
Gokulapriyan/vit-base-patch16-224-finetuned-main-gpu-30e-final
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-main-gpu-30e-final This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0231 - Accuracy: 0.9940 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 30 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 0.5113 | 1.0 | 551 | 0.4745 | 0.7971 | | 0.3409 | 2.0 | 1102 | 0.2697 | 0.8961 | | 0.2675 | 3.0 | 1653 | 0.1611 | 0.9381 | | 0.2092 | 4.0 | 2204 | 0.1176 | 0.9548 | | 0.2008 | 5.0 | 2755 | 0.0889 | 0.9656 | | 0.1555 | 6.0 | 3306 | 0.0666 | 0.9759 | | 0.1614 | 7.0 | 3857 | 0.0576 | 0.9778 | | 0.1518 | 8.0 | 4408 | 0.0517 | 0.9814 | | 0.1231 | 9.0 | 4959 | 0.0528 | 0.9812 | | 0.1076 | 10.0 | 5510 | 0.0426 | 0.9850 | | 0.0953 | 11.0 | 6061 | 0.0634 | 0.9795 | | 0.1097 | 12.0 | 6612 | 0.0398 | 0.9860 | | 0.0763 | 13.0 | 7163 | 0.0348 | 0.9866 | | 0.0895 | 14.0 | 7714 | 0.0341 | 0.9884 | | 0.06 | 15.0 | 8265 | 0.0381 | 0.9883 | | 0.0767 | 16.0 | 8816 | 0.0382 | 0.9875 | | 0.0868 | 17.0 | 9367 | 0.0309 | 0.9898 | | 0.091 | 18.0 | 9918 | 0.0339 | 0.9885 | | 0.0817 | 19.0 | 10469 | 0.0243 | 0.9913 | | 0.0641 | 20.0 | 11020 | 0.0286 | 0.9906 | | 0.0703 | 21.0 | 11571 | 0.0314 | 0.9906 | | 0.0642 | 22.0 | 12122 | 0.0261 | 0.9913 | | 0.0695 | 23.0 | 12673 | 0.0260 | 0.9920 | | 0.0664 | 24.0 | 13224 | 0.0241 | 0.9928 | | 0.0552 | 25.0 | 13775 | 0.0258 | 0.9928 | | 0.056 | 26.0 | 14326 | 0.0230 | 0.9939 | | 0.0488 | 27.0 | 14877 | 0.0221 | 0.9936 | | 0.0389 | 28.0 | 15428 | 0.0225 | 0.9930 | | 0.0402 | 29.0 | 15979 | 0.0231 | 0.9940 | | 0.0424 | 30.0 | 16530 | 0.0211 | 0.9939 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
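The hyperparameters above map directly onto `transformers.TrainingArguments`. A minimal sketch, with dataset loading and `Trainer` wiring omitted and `output_dir` used only as a placeholder:

```python
from transformers import TrainingArguments

# Mirrors the "Training hyperparameters" section above; with 4 gradient
# accumulation steps the effective (total) train batch size is 32 * 4 = 128.
training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-finetuned-main-gpu-30e-final",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
)
```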
[ "fnh", "hcc", "hem", "healthy" ]
pittawat/vit-base-letter
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-letter This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the pittawat/letter_recognition dataset. It achieves the following results on the evaluation set: - Loss: 0.0515 - Accuracy: 0.9881 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 32 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.5539 | 0.12 | 100 | 0.5576 | 0.9308 | | 0.2688 | 0.25 | 200 | 0.2371 | 0.9665 | | 0.1568 | 0.37 | 300 | 0.1829 | 0.9688 | | 0.1684 | 0.49 | 400 | 0.1611 | 0.9662 | | 0.1584 | 0.62 | 500 | 0.1340 | 0.9673 | | 0.1569 | 0.74 | 600 | 0.1933 | 0.9531 | | 0.0992 | 0.86 | 700 | 0.1031 | 0.9781 | | 0.0573 | 0.98 | 800 | 0.1024 | 0.9781 | | 0.0359 | 1.11 | 900 | 0.0950 | 0.9804 | | 0.0961 | 1.23 | 1000 | 0.1200 | 0.9723 | | 0.0334 | 1.35 | 1100 | 0.0995 | 0.975 | | 0.0855 | 1.48 | 1200 | 0.0791 | 0.9815 | | 0.0902 | 1.6 | 1300 | 0.0981 | 0.9765 | | 0.0583 | 1.72 | 1400 | 0.1192 | 0.9712 | | 0.0683 | 1.85 | 1500 | 0.0692 | 0.9846 | | 0.1188 | 1.97 | 1600 | 0.0931 | 0.9785 | | 0.0366 | 2.09 | 1700 | 0.0919 | 0.9804 | | 0.0276 | 2.21 | 1800 | 0.0667 | 0.9846 | | 0.0309 | 2.34 | 1900 | 0.0599 | 0.9858 | | 0.0183 | 2.46 | 2000 | 0.0892 | 0.9769 | | 0.0431 | 2.58 | 2100 | 0.0663 | 0.985 | | 0.0424 | 2.71 | 2200 | 0.0643 | 0.9862 | | 0.0453 | 2.83 | 2300 | 0.0646 | 0.9862 | | 0.0528 | 2.95 | 2400 | 0.0550 | 0.985 | | 0.0045 | 3.08 | 2500 | 0.0579 | 0.9846 | | 0.007 | 3.2 | 2600 | 0.0517 | 0.9885 | | 0.0048 | 3.32 | 2700 | 0.0584 | 0.9865 | | 0.019 | 3.44 | 2800 | 0.0560 | 0.9873 | | 0.0038 | 3.57 | 2900 | 0.0515 | 0.9881 | | 0.0219 | 3.69 | 3000 | 0.0527 | 0.9881 | | 0.0117 | 3.81 | 3100 | 0.0523 | 0.9888 | | 0.0035 | 3.94 | 3200 | 0.0559 | 0.9865 | ### Framework versions - Transformers 4.26.1 - Pytorch 1.13.0 - Datasets 2.1.0 - Tokenizers 0.13.2
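A minimal inference sketch with the `pipeline` API; `letter.png` is a placeholder path for an image of a single printed or handwritten letter:

```python
from PIL import Image
from transformers import pipeline

# "letter.png" is a placeholder; any image of a single character works.
classifier = pipeline("image-classification", model="pittawat/vit-base-letter")
prediction = classifier(Image.open("letter.png"))[0]
print(prediction)  # e.g. {'label': 'a', 'score': 0.99}
```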
[ "a", "b", "k", "l", "m", "n", "o", "p", "q", "r", "s", "t", "c", "u", "v", "w", "x", "y", "z", "d", "e", "f", "g", "h", "i", "j" ]
kanak8278/convnext-tiny-224-finetuned-aiornot
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # convnext-tiny-224-finetuned-aiornot This model is a fine-tuned version of [facebook/convnext-tiny-224](https://huggingface.co/facebook/convnext-tiny-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.1051 - Accuracy: 0.9608 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.2188 | 1.0 | 523 | 0.1419 | 0.9431 | | 0.0957 | 2.0 | 1047 | 0.1018 | 0.9592 | | 0.0847 | 3.0 | 1569 | 0.1051 | 0.9608 | ### Framework versions - Transformers 4.27.1 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
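A hedged inference sketch (the image path is a placeholder); `device=0` pins the pipeline to the first GPU when one is available, otherwise it falls back to CPU:

```python
import torch
from transformers import pipeline

# device=0 selects the first GPU if present; -1 means CPU.
device = 0 if torch.cuda.is_available() else -1
detector = pipeline(
    "image-classification",
    model="kanak8278/convnext-tiny-224-finetuned-aiornot",
    device=device,
)
print(detector("image.jpg"))  # predicted labels: "natural" vs "ai"
```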
[ "natural", "ai" ]
wendys-llc/amber-mines
# Model Trained Using AutoTrain - Problem type: Binary Classification - Model ID: 42327108535 - CO2 Emissions (in grams): 0.4211 ## Validation Metrics - Loss: 0.195 - Accuracy: 0.950 - Precision: 0.941 - Recall: 0.960 - AUC: 0.984 - F1: 0.950
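Assuming the checkpoint loads with the standard Auto classes, a hedged inference sketch that prints the probability of each class (the image path is a placeholder):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "wendys-llc/amber-mines"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

# "tile.png" is a placeholder for an input image.
inputs = processor(images=Image.open("tile.png"), return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Softmax over the two classes, printed with the model's own label names.
for idx, prob in enumerate(logits.softmax(dim=-1)[0]):
    print(model.config.id2label[idx], f"{prob:.3f}")
```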
[ "negative", "positive" ]
wendys-llc/autotrain-amber-mines-42327108538
# Model Trained Using AutoTrain - Problem type: Binary Classification - Model ID: 42327108538 - CO2 Emissions (in grams): 0.5195 ## Validation Metrics - Loss: 0.073 - Accuracy: 0.950 - Precision: 0.979 - Recall: 0.920 - AUC: 0.999 - F1: 0.948
[ "negative", "positive" ]
coralexbadea/swin-tiny-patch4-window7-224-finetuned-eurosat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-eurosat This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.0591 - Accuracy: 0.9785 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.2318 | 1.0 | 190 | 0.1268 | 0.9593 | | 0.1825 | 2.0 | 380 | 0.0874 | 0.97 | | 0.1386 | 3.0 | 570 | 0.0591 | 0.9785 | ### Framework versions - Transformers 4.27.2 - Pytorch 1.13.1+cu116 - Datasets 2.10.1 - Tokenizers 0.13.2
[ "annual crop", "forest", "herbaceous vegetation", "highway", "industrial", "pasture", "permanent crop", "residential", "river", "sea or lake" ]
davanstrien/convnext-tiny-224-wikiart
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # convnext-tiny-224-wikiart This model is a fine-tuned version of [facebook/convnext-tiny-224](https://huggingface.co/facebook/convnext-tiny-224) on the huggan/wikiart dataset. It achieves the following results on the evaluation set: - Loss: 0.8022 - Accuracy: 0.7140 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 0.9779 | 1.0 | 8654 | 0.9191 | 0.6743 | | 0.9959 | 2.0 | 17308 | 0.8523 | 0.6941 | | 1.0344 | 3.0 | 25962 | 0.8277 | 0.7023 | | 0.8853 | 4.0 | 34616 | 0.8126 | 0.7100 | | 0.9557 | 5.0 | 43270 | 0.8022 | 0.7140 | ### Framework versions - Transformers 4.28.0.dev0 - Pytorch 1.13.0+cu117 - Datasets 2.10.1 - Tokenizers 0.13.2
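A quick top-3 genre prediction sketch with the `pipeline` API (the image path is a placeholder):

```python
from transformers import pipeline

# "painting.jpg" is a placeholder path for an artwork image.
classifier = pipeline(
    "image-classification", model="davanstrien/convnext-tiny-224-wikiart"
)
for pred in classifier("painting.jpg", top_k=3):
    print(pred["label"], round(pred["score"], 3))
```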
[ "abstract_painting", "cityscape", "unknown genre", "genre_painting", "illustration", "landscape", "nude_painting", "portrait", "religious_painting", "sketch_and_study", "still_life" ]
davanstrien/autotrain-wikiart-sample2-42615108993
# Model Trained Using AutoTrain - Problem type: Multi-class Classification - Model ID: 42615108993 - CO2 Emissions (in grams): 11.2052 ## Validation Metrics - Loss: 0.761 - Accuracy: 0.729 - Macro F1: 0.682 - Micro F1: 0.729 - Weighted F1: 0.723 - Macro Precision: 0.742 - Micro Precision: 0.729 - Weighted Precision: 0.726 - Macro Recall: 0.658 - Micro Recall: 0.729 - Weighted Recall: 0.729
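The gap between the macro scores and the micro/weighted scores above reflects class imbalance: macro averaging weights every class equally, while micro and weighted averaging follow class frequency. A toy scikit-learn example (not this model's actual validation data) showing how the three averages can diverge:

```python
from sklearn.metrics import f1_score

# Toy labels only, for illustrating the averaging modes reported above;
# class 0 dominates, so macro and weighted F1 differ.
y_true = [0, 0, 0, 0, 1, 1, 2]
y_pred = [0, 0, 0, 1, 1, 0, 2]
for average in ("macro", "micro", "weighted"):
    print(average, round(f1_score(y_true, y_pred, average=average), 3))
```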
[ "unknown genre", "abstract_painting", "still_life", "cityscape", "genre_painting", "illustration", "landscape", "nude_painting", "portrait", "religious_painting", "sketch_and_study" ]
vevlins/autotrain-classify-42751109216
# Model Trained Using AutoTrain - Problem type: Binary Classification - Model ID: 42751109216 - CO2 Emissions (in grams): 0.8521 ## Validation Metrics - Loss: 0.010 - Accuracy: 1.000 - Precision: 1.000 - Recall: 1.000 - AUC: 1.000 - F1: 1.000
[ "苹果", "香蕉" ]
jamesportis/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-flower This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results ### Framework versions - Transformers 4.24.0 - Pytorch 1.13.1+cu116 - Datasets 2.7.1 - Tokenizers 0.13.2
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
Hrishikesh332/autotrain-meme-classification-42897109437
**Dataset**

The dataset consists of images under two labels:

* Meme
* Not Meme

The Meme folder contains 222 meme images and the Not Meme folder contains 108 non-meme images. Most meme images carry text overlaid on the picture, while the non-meme images cover a broad range of content, from sports photos to text in other forms such as documents and image captions, so the model learns to distinguish memes reliably rather than simply detecting text.

**Use Case**

* **Content Moderation** - The meme classification model can filter meme content out of the large volume of domain-specific data generated on social media, enabling better downstream analysis (see the sketch after the metrics below).

**Future Scope**

* Further work on classifying the sentiment of meme images, e.g. positive, violent, offensive, sarcastic, or neutral. This would support tasks such as:
  * **Education** - Removing offensive content from memes curated for educational use
  * **Brand Monitoring** - Understanding user sentiment expressed through meme culture to support decision-making

# Model Trained Using AutoTrain

- Problem type: Binary Classification
- Model ID: 42897109437
- CO2 Emissions (in grams): 1.1329

## Validation Metrics

- Loss: 0.025
- Accuracy: 1.000
- Precision: 1.000
- Recall: 1.000
- AUC: 1.000
- F1: 1.000
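A sketch of the content-moderation use case described above; the file paths are placeholders for images pulled from a social feed:

```python
from transformers import pipeline

# File paths are placeholders; label names follow this model's config.
classifier = pipeline(
    "image-classification",
    model="Hrishikesh332/autotrain-meme-classification-42897109437",
)
for path in ["post_001.jpg", "post_002.jpg"]:
    top = classifier(path)[0]
    if top["label"] == "meme":
        print(f"{path}: filtered as meme (score {top['score']:.2f})")
```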
[ "meme", "not_meme" ]
OttoYu/TreeClassification
# Model Trained Using AutoTrain - Problem type: Multi-class Classification ## Validation Metrics - Loss: 0.772 - Accuracy: 0.792 - Macro F1: 0.754 - Micro F1: 0.792 - Weighted F1: 0.747 - Macro Precision: 0.744 - Micro Precision: 0.792 - Weighted Precision: 0.743 - Macro Recall: 0.808 - Micro Recall: 0.792 - Weighted Recall: 0.792
[ "araucaria columnaris", "archontophenix alexandrae", "melaleuca cajuputi subsp. cumingiana", "psychotria asiatica", "terminalia mantaly", "bischofia javanica", "callistemon viminalis", "casuarina equisetifolia", "cinnamomum burmannii", "dicranopteris pedata", "hibiscus tiliaceus", "livistona chinensis", "machilus chekiangensis" ]
EddyWebb/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-flower This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results ### Framework versions - Transformers 4.24.0 - Pytorch 1.13.1+cu116 - Datasets 2.7.1 - Tokenizers 0.13.2
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]