| model_id | model_card | model_labels |
|---|---|---|
Dhruvt7707/resnet-50-finetuned-eurosat
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# resnet-50-finetuned-eurosat
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9696
- Accuracy: 0.757
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
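For reproducibility, here is a minimal `TrainingArguments` sketch matching the hyperparameters above (the output directory is a placeholder; this mirrors the listed configuration, not the original training script):

```python
from transformers import TrainingArguments

# A sketch of the listed hyperparameters; "out_dir" is a placeholder.
training_args = TrainingArguments(
    output_dir="out_dir",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 32 * 4 = 128
    optim="adamw_torch",            # AdamW with default betas/epsilon
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10,
)
```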
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 5.0195 | 1.0 | 704 | 4.8949 | 0.1813 |
| 3.2267 | 2.0 | 1408 | 2.2076 | 0.5449 |
| 2.5469 | 3.0 | 2112 | 1.5307 | 0.6525 |
| 2.3148 | 4.0 | 2816 | 1.2637 | 0.7025 |
| 2.0595 | 5.0 | 3520 | 1.1327 | 0.7218 |
| 2.0319 | 6.0 | 4224 | 1.0622 | 0.7373 |
| 2.0039 | 7.0 | 4928 | 1.0171 | 0.7454 |
| 2.0365 | 8.0 | 5632 | 0.9957 | 0.7527 |
| 1.9755 | 9.0 | 6336 | 0.9780 | 0.7539 |
| 1.8816 | 9.9868 | 7030 | 0.9696 | 0.757 |
### Framework versions
- Transformers 4.51.1
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
|
[
"n01443537",
"n01629819",
"n01641577",
"n01644900",
"n01698640",
"n01742172",
"n01768244",
"n01770393",
"n01774384",
"n01774750",
"n01784675",
"n01882714",
"n01910747",
"n01917289",
"n01944390",
"n01950731",
"n01983481",
"n01984695",
"n02002724",
"n02056570",
"n02058221",
"n02074367",
"n02094433",
"n02099601",
"n02099712",
"n02106662",
"n02113799",
"n02123045",
"n02123394",
"n02124075",
"n02125311",
"n02129165",
"n02132136",
"n02165456",
"n02226429",
"n02231487",
"n02233338",
"n02236044",
"n02268443",
"n02279972",
"n02281406",
"n02321529",
"n02364673",
"n02395406",
"n02403003",
"n02410509",
"n02415577",
"n02423022",
"n02437312",
"n02480495",
"n02481823",
"n02486410",
"n02504458",
"n02509815",
"n02666347",
"n02669723",
"n02699494",
"n02769748",
"n02788148",
"n02791270",
"n02793495",
"n02795169",
"n02802426",
"n02808440",
"n02814533",
"n02814860",
"n02815834",
"n02823428",
"n02837789",
"n02841315",
"n02843684",
"n02883205",
"n02892201",
"n02909870",
"n02917067",
"n02927161",
"n02948072",
"n02950826",
"n02963159",
"n02977058",
"n02988304",
"n03014705",
"n03026506",
"n03042490",
"n03085013",
"n03089624",
"n03100240",
"n03126707",
"n03160309",
"n03179701",
"n03201208",
"n03255030",
"n03355925",
"n03373237",
"n03388043",
"n03393912",
"n03400231",
"n03404251",
"n03424325",
"n03444034",
"n03447447",
"n03544143",
"n03584254",
"n03599486",
"n03617480",
"n03637318",
"n03649909",
"n03662601",
"n03670208",
"n03706229",
"n03733131",
"n03763968",
"n03770439",
"n03796401",
"n03814639",
"n03837869",
"n03838899",
"n03854065",
"n03891332",
"n03902125",
"n03930313",
"n03937543",
"n03970156",
"n03977966",
"n03980874",
"n03983396",
"n03992509",
"n04008634",
"n04023962",
"n04070727",
"n04074963",
"n04099969",
"n04118538",
"n04133789",
"n04146614",
"n04149813",
"n04179913",
"n04251144",
"n04254777",
"n04259630",
"n04265275",
"n04275548",
"n04285008",
"n04311004",
"n04328186",
"n04356056",
"n04366367",
"n04371430",
"n04376876",
"n04398044",
"n04399382",
"n04417672",
"n04456115",
"n04465666",
"n04486054",
"n04487081",
"n04501370",
"n04507155",
"n04532106",
"n04532670",
"n04540053",
"n04560804",
"n04562935",
"n04596742",
"n04598010",
"n06596364",
"n07056680",
"n07583066",
"n07614500",
"n07615774",
"n07646821",
"n07647870",
"n07657664",
"n07695742",
"n07711569",
"n07715103",
"n07720875",
"n07749582",
"n07753592",
"n07768694",
"n07871810",
"n07873807",
"n07875152",
"n07920052",
"n07975909",
"n08496334",
"n08620881",
"n08742578",
"n09193705",
"n09246464",
"n09256479",
"n09332890",
"n09428293",
"n12267677",
"n12520864",
"n13001041",
"n13652335",
"n13652994",
"n13719102",
"n14991210"
] |
LukeXOTWOD/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2110
- Accuracy: 0.9378
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3688 | 1.0 | 370 | 0.2936 | 0.9175 |
| 0.2174 | 2.0 | 740 | 0.2098 | 0.9323 |
| 0.1684 | 3.0 | 1110 | 0.1840 | 0.9459 |
| 0.1427 | 4.0 | 1480 | 0.1772 | 0.9405 |
| 0.1289 | 5.0 | 1850 | 0.1743 | 0.9378 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
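For quick inference with this checkpoint, a minimal sketch using the `image-classification` pipeline (the image path is a placeholder):

```python
from transformers import pipeline
from PIL import Image

# Load the fine-tuned checkpoint as an image-classification pipeline
classifier = pipeline("image-classification", model="LukeXOTWOD/vit-base-oxford-iiit-pets")

image = Image.open("path_to_image.jpg")  # placeholder path
print(classifier(image))  # list of {"label", "score"} predictions
```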
---
## 🧠 Zero-Shot Classification (CLIP)
This evaluation compares the fine-tuned ViT model to a zero-shot approach using [openai/clip-vit-base-patch32](https://huggingface.co/openai/clip-vit-base-patch32).
The model was evaluated on 100 samples from the Oxford-IIIT Pet dataset.
### 🔢 Zero-Shot Metrics
| Metric | Score |
|------------|---------|
| Accuracy | 88.00% |
| Precision | 87.68% |
| Recall | 88.00% |
Evaluation notebook: [`oxford_pets_zero_shot.ipynb`](https://github.com/bkuehnis/ai-applications-fs25/blob/main/week7/oxford_pets_zero_shot.ipynb)
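A minimal sketch of this zero-shot setup; the prompt template is an assumption, and the label list is truncated here (see the notebook above for the exact procedure):

```python
from transformers import CLIPModel, CLIPProcessor
from PIL import Image
import torch

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

labels = ["siamese", "birman", "shiba inu"]  # ... the full Oxford-IIIT Pets label list
prompts = [f"a photo of a {label}" for label in labels]  # prompt template is an assumption

image = Image.open("path_to_image.jpg")  # placeholder path
inputs = processor(text=prompts, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    logits_per_image = model(**inputs).logits_per_image  # image-text similarity scores
print(labels[logits_per_image.argmax(-1).item()])
```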
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
agasta/virtus
|
# Model Card for Virtus
Virtus is a fine-tuned Vision Transformer (ViT) model for binary image classification, specifically trained to distinguish between real and deepfake images. It achieves **~99.2% accuracy** on a balanced dataset of over 190,000 images.
## Model Details
### Model Description
Virtus is based on `facebook/deit-base-distilled-patch16-224` and was fine-tuned on a binary classification task using a large dataset of real and fake facial images. The training process involved class balancing, data augmentation, and evaluation using accuracy and F1 score.
- **Developed by:** [Agasta](https://github.com/Itz-Agasta)
- **Funded by:** None
- **Shared by:** Agasta
- **Model type:** Vision Transformer (ViT) for image classification
- **Language(s):** N/A (vision model)
- **License:** MIT
- **Finetuned from model:** [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224)
### Model Sources
- **Repository:** [https://huggingface.co/agasta/virtus](https://huggingface.co/agasta/virtus)
## Uses
### Direct Use
This model can be used to predict whether an input image is a real or a deepfake. It can be deployed in image analysis pipelines or integrated into applications that require media authenticity detection.
### Downstream Use
Virtus may be used in broader deepfake detection systems, educational tools for detecting synthetic media, or pre-screening systems for online platforms.
### Out-of-Scope Use
- Detection of deepfakes in videos or audio
- General object classification tasks outside of the real/fake binary domain
## Bias, Risks, and Limitations
The dataset, while balanced, may still carry biases in facial features, lighting conditions, or demographics. The model is also not robust to non-standard input sizes or heavily occluded faces.
### Recommendations
- Use only on face images similar in nature to the training set.
- Do not use for critical or high-stakes decisions without human verification.
- Regularly re-evaluate performance with updated data.
## How to Get Started with the Model
```python
from transformers import AutoFeatureExtractor, AutoModelForImageClassification
from PIL import Image
import torch

# Load the fine-tuned model and its paired preprocessor
model = AutoModelForImageClassification.from_pretrained("agasta/virtus")
extractor = AutoFeatureExtractor.from_pretrained("agasta/virtus")

# Preprocess a single image and run inference
image = Image.open("path_to_image.jpg")
inputs = extractor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Map the highest-scoring logit to its label ("real" or "fake")
predicted_class = outputs.logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```
## Training Details
### Training Data
The dataset consisted of 190,335 self-collected real and deepfake face images, with RandomOverSampler used to balance the two classes. The data was split into 60% training and 40% testing, maintaining class stratification.
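A minimal sketch of the described balancing and split, assuming file paths and integer labels as inputs (variable names, toy data, and the random seed are illustrative; `imbalanced-learn` provides `RandomOverSampler`):

```python
import numpy as np
from imblearn.over_sampling import RandomOverSampler
from sklearn.model_selection import train_test_split

paths = [f"img_{i}.jpg" for i in range(100)]  # placeholder file paths
labels = [0] * 70 + [1] * 30                  # 0 = real, 1 = fake (mapping assumed)

# Oversample the minority class until both classes are balanced
ros = RandomOverSampler(random_state=42)      # seed is an assumption
X_balanced, y_balanced = ros.fit_resample(np.array(paths).reshape(-1, 1), labels)

# 60% train / 40% test, stratified by class
X_train, X_test, y_train, y_test = train_test_split(
    X_balanced, y_balanced, test_size=0.4, stratify=y_balanced, random_state=42
)
```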
### Training Procedure
#### Preprocessing
- Images resized to 224x224
- Augmentations: Random rotation, sharpness adjustments, normalization
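A minimal `torchvision` sketch of this preprocessing; the rotation angle, sharpness factor, and normalization statistics are assumptions:

```python
from torchvision import transforms

train_transforms = transforms.Compose([
    transforms.Resize((224, 224)),                           # resize to model input size
    transforms.RandomRotation(degrees=15),                   # random rotation (angle assumed)
    transforms.RandomAdjustSharpness(sharpness_factor=2.0),  # sharpness adjustment
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.5, 0.5, 0.5],               # stats assumed
                         std=[0.5, 0.5, 0.5]),
])
```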
#### Training Hyperparameters
- **Epochs:** 2
- **Learning rate:** 1e-6
- **Train batch size:** 32
- **Eval batch size:** 8
- **Weight decay:** 0.02
- **Optimizer:** AdamW (via Trainer API)
- **Mixed precision:** Not used
## Evaluation
### Testing Data
Evaluation used the held-out 40% test portion of the same dataset (stratified 60:40 split).
### Metrics
- **Accuracy**
- **F1 Score (macro)**
- **Confusion matrix**
- **Classification report**
### Results
- **Accuracy:** 99.20%
- **F1 Score (macro):** 0.9920
## Environmental Impact
- **Hardware Type:** NVIDIA Tesla V100 (Kaggle Notebook GPU)
- **Hours used:** ~2.3 hours
- **Cloud Provider:** Kaggle
- **Compute Region:** Unknown
- **Carbon Emitted:** Can be estimated via [MLCO2 Calculator](https://mlco2.github.io/impact#compute)
## Technical Specifications
### Model Architecture and Objective
The model is a distilled Vision Transformer (DeiT) designed for image classification with a binary objective: classify images as Real or Fake.
### Compute Infrastructure
- **Hardware:** 1x NVIDIA Tesla V100 GPU
- **Software:** PyTorch, Hugging Face Transformers, Datasets, Accelerate
## Citation
**BibTeX:**
```bibtex
@misc{virtus2025,
title={Virtus: Deepfake Detection using Vision Transformers},
author={Agasta},
year={2025},
howpublished={\url{https://huggingface.co/agasta/virtus}},
}
```
**APA:**
Agasta. (2025). *Virtus: Deepfake Detection using Vision Transformers*. Hugging Face. https://huggingface.co/agasta/virtus
## Model Card Contact
For questions or feedback, reach out via [GitHub](https://github.com/Itz-Agasta), open an issue on the [model repository](https://github.com/Itz-Agasta/Lopt/tree/main/models/image), or email [email protected].
|
[
"real",
"fake"
] |
Pamreth/vit-ena24
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-ena24
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the ena24 dataset.
It achieves the following results on the evaluation set:
- eval_loss: 3.0899
- eval_model_preparation_time: 0.0031
- eval_accuracy: 0.0435
- eval_runtime: 925.7714
- eval_samples_per_second: 1.415
- eval_steps_per_second: 0.177
- step: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 2
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.51.1
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
|
[
"american black bear",
"american crow",
"eastern fox squirrel",
"eastern gray squirrel",
"grey fox",
"horse",
"northern raccoon",
"red fox",
"striped skunk",
"vehicle",
"virginia opossum",
"white_tailed_deer",
"bird",
"wild turkey",
"woodchuck",
"bobcat",
"chicken",
"coyote",
"dog",
"domestic cat",
"eastern chipmunk",
"eastern cottontail"
] |
222dunja/vit-base-oxford-iiit-pets
|
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
Ayananshu/swin-tiny-patch4-window7-224-finetuned-eurosat
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-eurosat
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5575
- Accuracy: 0.8531
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 1.928 | 1.0 | 704 | 1.1334 | 0.7328 |
| 1.5799 | 2.0 | 1408 | 0.7367 | 0.8062 |
| 1.1837 | 3.0 | 2112 | 0.6606 | 0.8275 |
| 1.211 | 4.0 | 2816 | 0.6414 | 0.8312 |
| 0.9815 | 5.0 | 3520 | 0.5919 | 0.8435 |
| 1.0347 | 6.0 | 4224 | 0.5949 | 0.8445 |
| 0.9556 | 7.0 | 4928 | 0.5808 | 0.8508 |
| 0.9018 | 8.0 | 5632 | 0.5627 | 0.8514 |
| 0.8382 | 9.0 | 6336 | 0.5583 | 0.8517 |
| 0.7842 | 9.9868 | 7030 | 0.5575 | 0.8531 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
|
[
"n01443537",
"n01629819",
"n01641577",
"n01644900",
"n01698640",
"n01742172",
"n01768244",
"n01770393",
"n01774384",
"n01774750",
"n01784675",
"n01882714",
"n01910747",
"n01917289",
"n01944390",
"n01950731",
"n01983481",
"n01984695",
"n02002724",
"n02056570",
"n02058221",
"n02074367",
"n02094433",
"n02099601",
"n02099712",
"n02106662",
"n02113799",
"n02123045",
"n02123394",
"n02124075",
"n02125311",
"n02129165",
"n02132136",
"n02165456",
"n02226429",
"n02231487",
"n02233338",
"n02236044",
"n02268443",
"n02279972",
"n02281406",
"n02321529",
"n02364673",
"n02395406",
"n02403003",
"n02410509",
"n02415577",
"n02423022",
"n02437312",
"n02480495",
"n02481823",
"n02486410",
"n02504458",
"n02509815",
"n02666347",
"n02669723",
"n02699494",
"n02769748",
"n02788148",
"n02791270",
"n02793495",
"n02795169",
"n02802426",
"n02808440",
"n02814533",
"n02814860",
"n02815834",
"n02823428",
"n02837789",
"n02841315",
"n02843684",
"n02883205",
"n02892201",
"n02909870",
"n02917067",
"n02927161",
"n02948072",
"n02950826",
"n02963159",
"n02977058",
"n02988304",
"n03014705",
"n03026506",
"n03042490",
"n03085013",
"n03089624",
"n03100240",
"n03126707",
"n03160309",
"n03179701",
"n03201208",
"n03255030",
"n03355925",
"n03373237",
"n03388043",
"n03393912",
"n03400231",
"n03404251",
"n03424325",
"n03444034",
"n03447447",
"n03544143",
"n03584254",
"n03599486",
"n03617480",
"n03637318",
"n03649909",
"n03662601",
"n03670208",
"n03706229",
"n03733131",
"n03763968",
"n03770439",
"n03796401",
"n03814639",
"n03837869",
"n03838899",
"n03854065",
"n03891332",
"n03902125",
"n03930313",
"n03937543",
"n03970156",
"n03977966",
"n03980874",
"n03983396",
"n03992509",
"n04008634",
"n04023962",
"n04070727",
"n04074963",
"n04099969",
"n04118538",
"n04133789",
"n04146614",
"n04149813",
"n04179913",
"n04251144",
"n04254777",
"n04259630",
"n04265275",
"n04275548",
"n04285008",
"n04311004",
"n04328186",
"n04356056",
"n04366367",
"n04371430",
"n04376876",
"n04398044",
"n04399382",
"n04417672",
"n04456115",
"n04465666",
"n04486054",
"n04487081",
"n04501370",
"n04507155",
"n04532106",
"n04532670",
"n04540053",
"n04560804",
"n04562935",
"n04596742",
"n04598010",
"n06596364",
"n07056680",
"n07583066",
"n07614500",
"n07615774",
"n07646821",
"n07647870",
"n07657664",
"n07695742",
"n07711569",
"n07715103",
"n07720875",
"n07749582",
"n07753592",
"n07768694",
"n07871810",
"n07873807",
"n07875152",
"n07920052",
"n07975909",
"n08496334",
"n08620881",
"n08742578",
"n09193705",
"n09246464",
"n09256479",
"n09332890",
"n09428293",
"n12267677",
"n12520864",
"n13001041",
"n13652335",
"n13652994",
"n13719102",
"n14991210"
] |
muellje3/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# clip-oxford-pets
This model is a fine-tuned version of openai/clip-vit-base-patch32 on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1769
- Accuracy: 0.9405
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3773 | 1.0 | 370 | 0.2977 | 0.9418 |
| 0.2106 | 2.0 | 740 | 0.2214 | 0.9459 |
| 0.152 | 3.0 | 1110 | 0.2042 | 0.9459 |
| 0.1423 | 4.0 | 1480 | 0.2001 | 0.9432 |
| 0.1174 | 5.0 | 1850 | 0.1956 | 0.9445 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
blaxe191/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1985
- Accuracy: 0.9445
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3957 | 1.0 | 370 | 0.3339 | 0.9147 |
| 0.2076 | 2.0 | 740 | 0.2593 | 0.9215 |
| 0.1858 | 3.0 | 1110 | 0.2356 | 0.9350 |
| 0.1483 | 4.0 | 1480 | 0.2266 | 0.9337 |
| 0.1376 | 5.0 | 1850 | 0.2237 | 0.9337 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
---
## 🧠 Zero-Shot Classification (CLIP)
This evaluation compares the fine-tuned ViT model to a zero-shot approach using [openai/clip-vit-base-patch32](https://huggingface.co/openai/clip-vit-base-patch32).
The model was evaluated on 100 samples from the Oxford-IIIT Pet dataset.
### 🔢 Zero-Shot Metrics
| Metric | Score |
|------------|---------|
| Accuracy | 88.00% |
| Precision | 87.68% |
| Recall | 88.00% |
Evaluation notebook: [oxford_pets_zero_shot.ipynb](https://github.com/bkuehnis/ai-applications-fs25/blob/main/week7/oxford_pets_zero_shot.ipynb)
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
remonemo/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1992
- Accuracy: 0.9391
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3991 | 1.0 | 370 | 0.2804 | 0.9337 |
| 0.2286 | 2.0 | 740 | 0.2133 | 0.9445 |
| 0.1633 | 3.0 | 1110 | 0.2036 | 0.9418 |
| 0.1518 | 4.0 | 1480 | 0.1882 | 0.9418 |
| 0.1434 | 5.0 | 1850 | 0.1854 | 0.9432 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
### Zero-Shot Results
- Model used for Zero Shot: openai/clip-vit-large-patch14
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
Pamreth/deit-ena24
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# deit-ena24
This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the ena24 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0870
- Accuracy: 0.9794
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 2
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 1.2994 | 0.1302 | 100 | 1.0314 | 0.7092 |
| 0.8789 | 0.2604 | 200 | 0.6169 | 0.8328 |
| 0.4592 | 0.3906 | 300 | 0.5234 | 0.8298 |
| 0.6806 | 0.5208 | 400 | 0.5431 | 0.8489 |
| 0.4878 | 0.6510 | 500 | 0.3905 | 0.8855 |
| 0.4643 | 0.7812 | 600 | 0.3281 | 0.9092 |
| 0.3765 | 0.9115 | 700 | 0.2398 | 0.9290 |
| 0.1379 | 1.0417 | 800 | 0.1861 | 0.9412 |
| 0.1422 | 1.1719 | 900 | 0.1657 | 0.9527 |
| 0.2655 | 1.3021 | 1000 | 0.1526 | 0.9557 |
| 0.0304 | 1.4323 | 1100 | 0.1578 | 0.9634 |
| 0.072 | 1.5625 | 1200 | 0.1418 | 0.9679 |
| 0.2936 | 1.6927 | 1300 | 0.1003 | 0.9771 |
| 0.0333 | 1.8229 | 1400 | 0.0935 | 0.9794 |
| 0.0844 | 1.9531 | 1500 | 0.0870 | 0.9794 |
### Framework versions
- Transformers 4.51.1
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
|
[
"american black bear",
"american crow",
"eastern fox squirrel",
"eastern gray squirrel",
"grey fox",
"horse",
"northern raccoon",
"red fox",
"striped skunk",
"vehicle",
"virginia opossum",
"white_tailed_deer",
"bird",
"wild turkey",
"woodchuck",
"bobcat",
"chicken",
"coyote",
"dog",
"domestic cat",
"eastern chipmunk",
"eastern cottontail"
] |
Venojah/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# clip-oxford-pets
This model is a fine-tuned version of openai/clip-vit-base-patch32 on the pcuenq/oxford-pets dataset. It achieves the following results on the evaluation set:
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1738
- Accuracy: 0.9445
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1253 | 1.0 | 370 | 0.2038 | 0.9296 |
| 0.1086 | 2.0 | 740 | 0.1962 | 0.9283 |
| 0.0828 | 3.0 | 1110 | 0.1879 | 0.9364 |
| 0.0772 | 4.0 | 1480 | 0.1922 | 0.9296 |
| 0.0665 | 5.0 | 1850 | 0.1908 | 0.9337 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
Pamreth/swin-ena24
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-ena24
This model is a fine-tuned version of [microsoft/swin-base-simmim-window6-192](https://huggingface.co/microsoft/swin-base-simmim-window6-192) on the ena24 dataset.
It achieves the following results on the evaluation set:
- Loss: 2.4677
- Accuracy: 0.5146
- F1 Macro: 0.4328
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 7
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro |
|:-------------:|:------:|:----:|:---------------:|:--------:|:--------:|
| 1.9888 | 0.2519 | 100 | 3.4122 | 0.1631 | 0.0893 |
| 1.6111 | 0.5038 | 200 | 2.9077 | 0.2578 | 0.1535 |
| 1.1276 | 0.7557 | 300 | 2.6504 | 0.3574 | 0.2827 |
| 1.0234 | 1.0076 | 400 | 2.5728 | 0.3906 | 0.3156 |
| 0.8909 | 1.2594 | 500 | 2.5007 | 0.4219 | 0.3388 |
| 0.8008 | 1.5113 | 600 | 2.7039 | 0.4043 | 0.3619 |
| 0.6885 | 1.7632 | 700 | 3.1090 | 0.3701 | 0.2926 |
| 0.839 | 2.0151 | 800 | 2.5845 | 0.4844 | 0.4149 |
| 0.325 | 2.2670 | 900 | 2.5143 | 0.5068 | 0.4128 |
| 0.4501 | 2.5189 | 1000 | 2.7684 | 0.4482 | 0.4056 |
| 0.3191 | 2.7708 | 1100 | 2.4677 | 0.5146 | 0.4328 |
| 0.1664 | 3.0227 | 1200 | 2.4777 | 0.5361 | 0.4597 |
| 0.1469 | 3.2746 | 1300 | 2.6403 | 0.5205 | 0.4495 |
| 0.3063 | 3.5264 | 1400 | 2.8000 | 0.5010 | 0.4415 |
| 0.1786 | 3.7783 | 1500 | 2.8165 | 0.5332 | 0.4525 |
| 0.0687 | 4.0302 | 1600 | 2.9027 | 0.5684 | 0.4942 |
| 0.0427 | 4.2821 | 1700 | 3.3216 | 0.4912 | 0.4362 |
| 0.1825 | 4.5340 | 1800 | 3.1456 | 0.5312 | 0.4664 |
| 0.0758 | 4.7859 | 1900 | 3.2782 | 0.5547 | 0.4578 |
| 0.0471 | 5.0378 | 2000 | 3.3348 | 0.5518 | 0.4725 |
| 0.0512 | 5.2897 | 2100 | 3.7182 | 0.5283 | 0.4514 |
| 0.0095 | 5.5416 | 2200 | 3.9028 | 0.5342 | 0.4785 |
| 0.0247 | 5.7935 | 2300 | 3.9606 | 0.5713 | 0.4879 |
| 0.0008 | 6.0453 | 2400 | 4.1290 | 0.5654 | 0.4918 |
| 0.0024 | 6.2972 | 2500 | 4.4147 | 0.5654 | 0.4863 |
| 0.0002 | 6.5491 | 2600 | 4.5209 | 0.5654 | 0.4913 |
| 0.0055 | 6.8010 | 2700 | 4.5154 | 0.5820 | 0.5067 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
|
[
"american black bear",
"american crow",
"eastern fox squirrel",
"eastern gray squirrel",
"grey fox",
"horse",
"northern raccoon",
"red fox",
"striped skunk",
"virginia opossum",
"white_tailed_deer",
"wild turkey",
"bird",
"woodchuck",
"bobcat",
"chicken",
"coyote",
"dog",
"domestic cat",
"eastern chipmunk",
"eastern cottontail"
] |
thenewsupercell/MaskedJaw_image_parts_df_VIT
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# MaskedJaw_image_parts_df_VIT
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0727
- Accuracy: 0.9847
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.1212 | 1.0 | 5252 | 0.0840 | 0.9791 |
| 0.037 | 2.0 | 10504 | 0.0652 | 0.9837 |
| 0.0441 | 3.0 | 15756 | 0.0585 | 0.9860 |
| 0.0639 | 4.0 | 21008 | 0.0727 | 0.9847 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
|
[
"fake",
"real"
] |
thenewsupercell/MaskedMouth_image_parts_df_VIT
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# MaskedMouth_image_parts_df_VIT
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0234
- Accuracy: 0.9960
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.0111 | 1.0 | 5252 | 0.0387 | 0.9910 |
| 0.001 | 2.0 | 10504 | 0.0576 | 0.9896 |
| 0.0002 | 3.0 | 15756 | 0.0347 | 0.9935 |
| 0.0089 | 4.0 | 21008 | 0.0234 | 0.9960 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
|
[
"fake",
"real"
] |
mylonjones/vit-base-beans
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-beans
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0647
- Accuracy: 0.9925
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2828 | 1.0 | 130 | 0.2179 | 0.9624 |
| 0.129 | 2.0 | 260 | 0.1295 | 0.9699 |
| 0.1418 | 3.0 | 390 | 0.0949 | 0.9699 |
| 0.0859 | 4.0 | 520 | 0.0647 | 0.9925 |
| 0.1129 | 5.0 | 650 | 0.0815 | 0.9774 |
### Framework versions
- Transformers 4.52.0.dev0
- Pytorch 2.6.0+cpu
- Datasets 3.5.0
- Tokenizers 0.21.1
|
[
"angular_leaf_spot",
"bean_rust",
"healthy"
] |
thenewsupercell/MaskedForehead_image_parts_df_VIT
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# MaskedForehead_image_parts_df_VIT
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0184
- Accuracy: 0.9965
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.0347 | 1.0 | 5252 | 0.0299 | 0.9908 |
| 0.0095 | 2.0 | 10504 | 0.0208 | 0.9944 |
| 0.0004 | 3.0 | 15756 | 0.0196 | 0.9955 |
| 0.0102 | 4.0 | 21008 | 0.0184 | 0.9965 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
|
[
"fake",
"real"
] |
thenewsupercell/MaskedNose_image_parts_df_VIT
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# MaskedNose_image_parts_df_VIT
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0229
- Accuracy: 0.9958
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.031 | 1.0 | 5252 | 0.0241 | 0.9927 |
| 0.0009 | 2.0 | 10504 | 0.0274 | 0.9937 |
| 0.0438 | 3.0 | 15756 | 0.0212 | 0.9951 |
| 0.0098 | 4.0 | 21008 | 0.0229 | 0.9958 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
|
[
"fake",
"real"
] |
avanishd/vit-base-patch16-224-in21k-finetuned-cifar10
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-in21k-finetuned-cifar10
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the cifar-10 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1126
- Accuracy: 0.9877
## Model description
More information needed
## Intended uses & limitations
More information needed
## How to Get Started with the Model
```python
from transformers import pipeline
from PIL import Image

# Load the fine-tuned checkpoint as an image-classification pipeline
pipe = pipeline("image-classification", "avanishd/vit-base-patch16-224-in21k-finetuned-cifar10")

image = Image.open("path_to_image.jpg")  # placeholder path
pipe(image)  # returns a list of {"label", "score"} predictions
```
## Training and evaluation data
More information needed
## Training procedure
More information needed
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.4166 | 1.0 | 313 | 0.2324 | 0.9791 |
| 0.3247 | 2.0 | 626 | 0.1320 | 0.9875 |
| 0.2661 | 2.992 | 936 | 0.1126 | 0.9877 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
|
[
"airplane",
"automobile",
"bird",
"cat",
"deer",
"dog",
"frog",
"horse",
"ship",
"truck"
] |
prithivMLmods/Multilabel-Portrait-SigLIP2
|

# **Multilabel-Portrait-SigLIP2**
> **Multilabel-Portrait-SigLIP2** is a vision-language model fine-tuned from [**google/siglip2-base-patch16-224**](https://huggingface.co/google/siglip2-base-patch16-224) using the `SiglipForImageClassification` architecture. It classifies portrait-style images into one of the following **visual portrait categories**:
```py
Classification Report:
                  precision    recall  f1-score   support

  Anime Portrait     0.9989    0.9991    0.9990      4444
Cartoon Portrait     0.9964    0.9926    0.9945      4444
   Real Portrait     0.9964    0.9971    0.9967      4444
 Sketch Portrait     0.9971    1.0000    0.9985      4444

        accuracy                         0.9972     17776
       macro avg     0.9972    0.9972    0.9972     17776
    weighted avg     0.9972    0.9972    0.9972     17776
```

---
# **Model Objective**
The model is designed to **analyze portrait images** and categorize them into **one of four distinct portrait types**:
- **0:** Anime Portrait
- **1:** Cartoon Portrait
- **2:** Real Portrait
- **3:** Sketch Portrait
---
# **Try it with Transformers 🤗**
Install dependencies:
```bash
pip install -q transformers torch pillow gradio
```
Run the model with the following script:
```python
import gradio as gr
from transformers import AutoImageProcessor, SiglipForImageClassification
from PIL import Image
import torch
# Load model and processor
model_name = "prithivMLmods/Multilabel-Portrait-SigLIP2"
model = SiglipForImageClassification.from_pretrained(model_name)
processor = AutoImageProcessor.from_pretrained(model_name)
# Label mapping
id2label = {
0: "Anime Portrait",
1: "Cartoon Portrait",
2: "Real Portrait",
3: "Sketch Portrait"
}
def classify_portrait(image):
"""Predict the type of portrait style from an image."""
image = Image.fromarray(image).convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs)
logits = outputs.logits
probs = torch.nn.functional.softmax(logits, dim=1).squeeze().tolist()
predictions = {id2label[i]: round(probs[i], 3) for i in range(len(probs))}
predictions = dict(sorted(predictions.items(), key=lambda item: item[1], reverse=True))
return predictions
# Gradio interface
iface = gr.Interface(
fn=classify_portrait,
inputs=gr.Image(type="numpy"),
outputs=gr.Label(label="Portrait Type Prediction Scores"),
title="Multilabel-Portrait-SigLIP2",
description="Upload a portrait-style image (anime, cartoon, real, or sketch) to predict its most likely visual category."
)
if __name__ == "__main__":
iface.launch()
```
---
# **Intended Use Cases**
- **AI Art Curation** — Automatically organize large-scale datasets of artistic portraits.
- **Style-based Portrait Analysis** — Determine artistic style in user-uploaded or curated portrait datasets.
- **Content Filtering for Platforms** — Group and recommend based on visual aesthetics.
- **Dataset Pre-labeling** — Helps reduce manual effort in annotation tasks.
- **User Avatar Classification** — Profile categorization in social or gaming platforms.
|
[
"anime portrait",
"cartoon portrait",
"real portrait",
"sketch portrait"
] |
yeryeong-cha/my_awesome_food_model
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_food_model
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6215
- Accuracy: 0.889
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.6762 | 1.0 | 63 | 2.5166 | 0.829 |
| 1.8191 | 2.0 | 126 | 1.7831 | 0.881 |
| 1.5868 | 2.96 | 186 | 1.6215 | 0.889 |
### Framework versions
- Transformers 4.51.1
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
|
[
"apple_pie",
"baby_back_ribs",
"bruschetta",
"waffles",
"caesar_salad",
"cannoli",
"caprese_salad",
"carrot_cake",
"ceviche",
"cheesecake",
"cheese_plate",
"chicken_curry",
"chicken_quesadilla",
"baklava",
"chicken_wings",
"chocolate_cake",
"chocolate_mousse",
"churros",
"clam_chowder",
"club_sandwich",
"crab_cakes",
"creme_brulee",
"croque_madame",
"cup_cakes",
"beef_carpaccio",
"deviled_eggs",
"donuts",
"dumplings",
"edamame",
"eggs_benedict",
"escargots",
"falafel",
"filet_mignon",
"fish_and_chips",
"foie_gras",
"beef_tartare",
"french_fries",
"french_onion_soup",
"french_toast",
"fried_calamari",
"fried_rice",
"frozen_yogurt",
"garlic_bread",
"gnocchi",
"greek_salad",
"grilled_cheese_sandwich",
"beet_salad",
"grilled_salmon",
"guacamole",
"gyoza",
"hamburger",
"hot_and_sour_soup",
"hot_dog",
"huevos_rancheros",
"hummus",
"ice_cream",
"lasagna",
"beignets",
"lobster_bisque",
"lobster_roll_sandwich",
"macaroni_and_cheese",
"macarons",
"miso_soup",
"mussels",
"nachos",
"omelette",
"onion_rings",
"oysters",
"bibimbap",
"pad_thai",
"paella",
"pancakes",
"panna_cotta",
"peking_duck",
"pho",
"pizza",
"pork_chop",
"poutine",
"prime_rib",
"bread_pudding",
"pulled_pork_sandwich",
"ramen",
"ravioli",
"red_velvet_cake",
"risotto",
"samosa",
"sashimi",
"scallops",
"seaweed_salad",
"shrimp_and_grits",
"breakfast_burrito",
"spaghetti_bolognese",
"spaghetti_carbonara",
"spring_rolls",
"steak",
"strawberry_shortcake",
"sushi",
"tacos",
"takoyaki",
"tiramisu",
"tuna_tartare"
] |
Mavangu/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1699
- Accuracy: 0.9432
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3885 | 1.0 | 370 | 0.2874 | 0.9323 |
| 0.2141 | 2.0 | 740 | 0.2148 | 0.9405 |
| 0.1856 | 3.0 | 1110 | 0.1960 | 0.9445 |
| 0.1446 | 4.0 | 1480 | 0.1855 | 0.9486 |
| 0.1488 | 5.0 | 1850 | 0.1861 | 0.9432 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
## Evaluation Results (Fine-Tuned ViT)
| Metric | Score |
|------------|---------|
| Accuracy | 88.00% |
| Precision | 67.68% |
| Recall | 88.00% |
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
patronmoses/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2031
- Accuracy: 0.9459
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3727 | 1.0 | 370 | 0.2756 | 0.9337 |
| 0.2145 | 2.0 | 740 | 0.2168 | 0.9378 |
| 0.1835 | 3.0 | 1110 | 0.1918 | 0.9459 |
| 0.147 | 4.0 | 1480 | 0.1857 | 0.9472 |
| 0.1315 | 5.0 | 1850 | 0.1818 | 0.9472 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
### Zero-Shot Evaluation
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
fischm04/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1691
- Accuracy: 0.9499
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3768 | 1.0 | 370 | 0.3309 | 0.9202 |
| 0.2111 | 2.0 | 740 | 0.2495 | 0.9296 |
| 0.1794 | 3.0 | 1110 | 0.2335 | 0.9269 |
| 0.1423 | 4.0 | 1480 | 0.2229 | 0.9337 |
| 0.1257 | 5.0 | 1850 | 0.2205 | 0.9364 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
## Zero-Shot Benchmark
We compared our fine-tuned ViT model against the **CLIP-ViT-L/14** zero-shot classifier on the Oxford-IIIT Pets dataset.
| Metric | CLIP (Zero-Shot) |
|------------|------------------|
| Accuracy | 0.8800 |
| Precision | 0.8768 |
| Recall | 0.8800 |
## Zero-Shot CLIP Evaluation
- **Model:** openai/clip-vit-base-patch32
- **Accuracy:** 88.00 %
- **Precision (weighted):** 87.68 %
- **Recall (weighted):** 88.00 %
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
michiel/checkthat_resnet
|
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
[
"tench, tinca tinca",
"goldfish, carassius auratus",
"great white shark, white shark, man-eater, man-eating shark, carcharodon carcharias",
"tiger shark, galeocerdo cuvieri",
"hammerhead, hammerhead shark",
"electric ray, crampfish, numbfish, torpedo",
"stingray",
"cock",
"hen",
"ostrich, struthio camelus",
"brambling, fringilla montifringilla",
"goldfinch, carduelis carduelis",
"house finch, linnet, carpodacus mexicanus",
"junco, snowbird",
"indigo bunting, indigo finch, indigo bird, passerina cyanea",
"robin, american robin, turdus migratorius",
"bulbul",
"jay",
"magpie",
"chickadee",
"water ouzel, dipper",
"kite",
"bald eagle, american eagle, haliaeetus leucocephalus",
"vulture",
"great grey owl, great gray owl, strix nebulosa",
"european fire salamander, salamandra salamandra",
"common newt, triturus vulgaris",
"eft",
"spotted salamander, ambystoma maculatum",
"axolotl, mud puppy, ambystoma mexicanum",
"bullfrog, rana catesbeiana",
"tree frog, tree-frog",
"tailed frog, bell toad, ribbed toad, tailed toad, ascaphus trui",
"loggerhead, loggerhead turtle, caretta caretta",
"leatherback turtle, leatherback, leathery turtle, dermochelys coriacea",
"mud turtle",
"terrapin",
"box turtle, box tortoise",
"banded gecko",
"common iguana, iguana, iguana iguana",
"american chameleon, anole, anolis carolinensis",
"whiptail, whiptail lizard",
"agama",
"frilled lizard, chlamydosaurus kingi",
"alligator lizard",
"gila monster, heloderma suspectum",
"green lizard, lacerta viridis",
"african chameleon, chamaeleo chamaeleon",
"komodo dragon, komodo lizard, dragon lizard, giant lizard, varanus komodoensis",
"african crocodile, nile crocodile, crocodylus niloticus",
"american alligator, alligator mississipiensis",
"triceratops",
"thunder snake, worm snake, carphophis amoenus",
"ringneck snake, ring-necked snake, ring snake",
"hognose snake, puff adder, sand viper",
"green snake, grass snake",
"king snake, kingsnake",
"garter snake, grass snake",
"water snake",
"vine snake",
"night snake, hypsiglena torquata",
"boa constrictor, constrictor constrictor",
"rock python, rock snake, python sebae",
"indian cobra, naja naja",
"green mamba",
"sea snake",
"horned viper, cerastes, sand viper, horned asp, cerastes cornutus",
"diamondback, diamondback rattlesnake, crotalus adamanteus",
"sidewinder, horned rattlesnake, crotalus cerastes",
"trilobite",
"harvestman, daddy longlegs, phalangium opilio",
"scorpion",
"black and gold garden spider, argiope aurantia",
"barn spider, araneus cavaticus",
"garden spider, aranea diademata",
"black widow, latrodectus mactans",
"tarantula",
"wolf spider, hunting spider",
"tick",
"centipede",
"black grouse",
"ptarmigan",
"ruffed grouse, partridge, bonasa umbellus",
"prairie chicken, prairie grouse, prairie fowl",
"peacock",
"quail",
"partridge",
"african grey, african gray, psittacus erithacus",
"macaw",
"sulphur-crested cockatoo, kakatoe galerita, cacatua galerita",
"lorikeet",
"coucal",
"bee eater",
"hornbill",
"hummingbird",
"jacamar",
"toucan",
"drake",
"red-breasted merganser, mergus serrator",
"goose",
"black swan, cygnus atratus",
"tusker",
"echidna, spiny anteater, anteater",
"platypus, duckbill, duckbilled platypus, duck-billed platypus, ornithorhynchus anatinus",
"wallaby, brush kangaroo",
"koala, koala bear, kangaroo bear, native bear, phascolarctos cinereus",
"wombat",
"jellyfish",
"sea anemone, anemone",
"brain coral",
"flatworm, platyhelminth",
"nematode, nematode worm, roundworm",
"conch",
"snail",
"slug",
"sea slug, nudibranch",
"chiton, coat-of-mail shell, sea cradle, polyplacophore",
"chambered nautilus, pearly nautilus, nautilus",
"dungeness crab, cancer magister",
"rock crab, cancer irroratus",
"fiddler crab",
"king crab, alaska crab, alaskan king crab, alaska king crab, paralithodes camtschatica",
"american lobster, northern lobster, maine lobster, homarus americanus",
"spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish",
"crayfish, crawfish, crawdad, crawdaddy",
"hermit crab",
"isopod",
"white stork, ciconia ciconia",
"black stork, ciconia nigra",
"spoonbill",
"flamingo",
"little blue heron, egretta caerulea",
"american egret, great white heron, egretta albus",
"bittern",
"crane",
"limpkin, aramus pictus",
"european gallinule, porphyrio porphyrio",
"american coot, marsh hen, mud hen, water hen, fulica americana",
"bustard",
"ruddy turnstone, arenaria interpres",
"red-backed sandpiper, dunlin, erolia alpina",
"redshank, tringa totanus",
"dowitcher",
"oystercatcher, oyster catcher",
"pelican",
"king penguin, aptenodytes patagonica",
"albatross, mollymawk",
"grey whale, gray whale, devilfish, eschrichtius gibbosus, eschrichtius robustus",
"killer whale, killer, orca, grampus, sea wolf, orcinus orca",
"dugong, dugong dugon",
"sea lion",
"chihuahua",
"japanese spaniel",
"maltese dog, maltese terrier, maltese",
"pekinese, pekingese, peke",
"shih-tzu",
"blenheim spaniel",
"papillon",
"toy terrier",
"rhodesian ridgeback",
"afghan hound, afghan",
"basset, basset hound",
"beagle",
"bloodhound, sleuthhound",
"bluetick",
"black-and-tan coonhound",
"walker hound, walker foxhound",
"english foxhound",
"redbone",
"borzoi, russian wolfhound",
"irish wolfhound",
"italian greyhound",
"whippet",
"ibizan hound, ibizan podenco",
"norwegian elkhound, elkhound",
"otterhound, otter hound",
"saluki, gazelle hound",
"scottish deerhound, deerhound",
"weimaraner",
"staffordshire bullterrier, staffordshire bull terrier",
"american staffordshire terrier, staffordshire terrier, american pit bull terrier, pit bull terrier",
"bedlington terrier",
"border terrier",
"kerry blue terrier",
"irish terrier",
"norfolk terrier",
"norwich terrier",
"yorkshire terrier",
"wire-haired fox terrier",
"lakeland terrier",
"sealyham terrier, sealyham",
"airedale, airedale terrier",
"cairn, cairn terrier",
"australian terrier",
"dandie dinmont, dandie dinmont terrier",
"boston bull, boston terrier",
"miniature schnauzer",
"giant schnauzer",
"standard schnauzer",
"scotch terrier, scottish terrier, scottie",
"tibetan terrier, chrysanthemum dog",
"silky terrier, sydney silky",
"soft-coated wheaten terrier",
"west highland white terrier",
"lhasa, lhasa apso",
"flat-coated retriever",
"curly-coated retriever",
"golden retriever",
"labrador retriever",
"chesapeake bay retriever",
"german short-haired pointer",
"vizsla, hungarian pointer",
"english setter",
"irish setter, red setter",
"gordon setter",
"brittany spaniel",
"clumber, clumber spaniel",
"english springer, english springer spaniel",
"welsh springer spaniel",
"cocker spaniel, english cocker spaniel, cocker",
"sussex spaniel",
"irish water spaniel",
"kuvasz",
"schipperke",
"groenendael",
"malinois",
"briard",
"kelpie",
"komondor",
"old english sheepdog, bobtail",
"shetland sheepdog, shetland sheep dog, shetland",
"collie",
"border collie",
"bouvier des flandres, bouviers des flandres",
"rottweiler",
"german shepherd, german shepherd dog, german police dog, alsatian",
"doberman, doberman pinscher",
"miniature pinscher",
"greater swiss mountain dog",
"bernese mountain dog",
"appenzeller",
"entlebucher",
"boxer",
"bull mastiff",
"tibetan mastiff",
"french bulldog",
"great dane",
"saint bernard, st bernard",
"eskimo dog, husky",
"malamute, malemute, alaskan malamute",
"siberian husky",
"dalmatian, coach dog, carriage dog",
"affenpinscher, monkey pinscher, monkey dog",
"basenji",
"pug, pug-dog",
"leonberg",
"newfoundland, newfoundland dog",
"great pyrenees",
"samoyed, samoyede",
"pomeranian",
"chow, chow chow",
"keeshond",
"brabancon griffon",
"pembroke, pembroke welsh corgi",
"cardigan, cardigan welsh corgi",
"toy poodle",
"miniature poodle",
"standard poodle",
"mexican hairless",
"timber wolf, grey wolf, gray wolf, canis lupus",
"white wolf, arctic wolf, canis lupus tundrarum",
"red wolf, maned wolf, canis rufus, canis niger",
"coyote, prairie wolf, brush wolf, canis latrans",
"dingo, warrigal, warragal, canis dingo",
"dhole, cuon alpinus",
"african hunting dog, hyena dog, cape hunting dog, lycaon pictus",
"hyena, hyaena",
"red fox, vulpes vulpes",
"kit fox, vulpes macrotis",
"arctic fox, white fox, alopex lagopus",
"grey fox, gray fox, urocyon cinereoargenteus",
"tabby, tabby cat",
"tiger cat",
"persian cat",
"siamese cat, siamese",
"egyptian cat",
"cougar, puma, catamount, mountain lion, painter, panther, felis concolor",
"lynx, catamount",
"leopard, panthera pardus",
"snow leopard, ounce, panthera uncia",
"jaguar, panther, panthera onca, felis onca",
"lion, king of beasts, panthera leo",
"tiger, panthera tigris",
"cheetah, chetah, acinonyx jubatus",
"brown bear, bruin, ursus arctos",
"american black bear, black bear, ursus americanus, euarctos americanus",
"ice bear, polar bear, ursus maritimus, thalarctos maritimus",
"sloth bear, melursus ursinus, ursus ursinus",
"mongoose",
"meerkat, mierkat",
"tiger beetle",
"ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle",
"ground beetle, carabid beetle",
"long-horned beetle, longicorn, longicorn beetle",
"leaf beetle, chrysomelid",
"dung beetle",
"rhinoceros beetle",
"weevil",
"fly",
"bee",
"ant, emmet, pismire",
"grasshopper, hopper",
"cricket",
"walking stick, walkingstick, stick insect",
"cockroach, roach",
"mantis, mantid",
"cicada, cicala",
"leafhopper",
"lacewing, lacewing fly",
"dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk",
"damselfly",
"admiral",
"ringlet, ringlet butterfly",
"monarch, monarch butterfly, milkweed butterfly, danaus plexippus",
"cabbage butterfly",
"sulphur butterfly, sulfur butterfly",
"lycaenid, lycaenid butterfly",
"starfish, sea star",
"sea urchin",
"sea cucumber, holothurian",
"wood rabbit, cottontail, cottontail rabbit",
"hare",
"angora, angora rabbit",
"hamster",
"porcupine, hedgehog",
"fox squirrel, eastern fox squirrel, sciurus niger",
"marmot",
"beaver",
"guinea pig, cavia cobaya",
"sorrel",
"zebra",
"hog, pig, grunter, squealer, sus scrofa",
"wild boar, boar, sus scrofa",
"warthog",
"hippopotamus, hippo, river horse, hippopotamus amphibius",
"ox",
"water buffalo, water ox, asiatic buffalo, bubalus bubalis",
"bison",
"ram, tup",
"bighorn, bighorn sheep, cimarron, rocky mountain bighorn, rocky mountain sheep, ovis canadensis",
"ibex, capra ibex",
"hartebeest",
"impala, aepyceros melampus",
"gazelle",
"arabian camel, dromedary, camelus dromedarius",
"llama",
"weasel",
"mink",
"polecat, fitch, foulmart, foumart, mustela putorius",
"black-footed ferret, ferret, mustela nigripes",
"otter",
"skunk, polecat, wood pussy",
"badger",
"armadillo",
"three-toed sloth, ai, bradypus tridactylus",
"orangutan, orang, orangutang, pongo pygmaeus",
"gorilla, gorilla gorilla",
"chimpanzee, chimp, pan troglodytes",
"gibbon, hylobates lar",
"siamang, hylobates syndactylus, symphalangus syndactylus",
"guenon, guenon monkey",
"patas, hussar monkey, erythrocebus patas",
"baboon",
"macaque",
"langur",
"colobus, colobus monkey",
"proboscis monkey, nasalis larvatus",
"marmoset",
"capuchin, ringtail, cebus capucinus",
"howler monkey, howler",
"titi, titi monkey",
"spider monkey, ateles geoffroyi",
"squirrel monkey, saimiri sciureus",
"madagascar cat, ring-tailed lemur, lemur catta",
"indri, indris, indri indri, indri brevicaudatus",
"indian elephant, elephas maximus",
"african elephant, loxodonta africana",
"lesser panda, red panda, panda, bear cat, cat bear, ailurus fulgens",
"giant panda, panda, panda bear, coon bear, ailuropoda melanoleuca",
"barracouta, snoek",
"eel",
"coho, cohoe, coho salmon, blue jack, silver salmon, oncorhynchus kisutch",
"rock beauty, holocanthus tricolor",
"anemone fish",
"sturgeon",
"gar, garfish, garpike, billfish, lepisosteus osseus",
"lionfish",
"puffer, pufferfish, blowfish, globefish",
"abacus",
"abaya",
"academic gown, academic robe, judge's robe",
"accordion, piano accordion, squeeze box",
"acoustic guitar",
"aircraft carrier, carrier, flattop, attack aircraft carrier",
"airliner",
"airship, dirigible",
"altar",
"ambulance",
"amphibian, amphibious vehicle",
"analog clock",
"apiary, bee house",
"apron",
"ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin",
"assault rifle, assault gun",
"backpack, back pack, knapsack, packsack, rucksack, haversack",
"bakery, bakeshop, bakehouse",
"balance beam, beam",
"balloon",
"ballpoint, ballpoint pen, ballpen, biro",
"band aid",
"banjo",
"bannister, banister, balustrade, balusters, handrail",
"barbell",
"barber chair",
"barbershop",
"barn",
"barometer",
"barrel, cask",
"barrow, garden cart, lawn cart, wheelbarrow",
"baseball",
"basketball",
"bassinet",
"bassoon",
"bathing cap, swimming cap",
"bath towel",
"bathtub, bathing tub, bath, tub",
"beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon",
"beacon, lighthouse, beacon light, pharos",
"beaker",
"bearskin, busby, shako",
"beer bottle",
"beer glass",
"bell cote, bell cot",
"bib",
"bicycle-built-for-two, tandem bicycle, tandem",
"bikini, two-piece",
"binder, ring-binder",
"binoculars, field glasses, opera glasses",
"birdhouse",
"boathouse",
"bobsled, bobsleigh, bob",
"bolo tie, bolo, bola tie, bola",
"bonnet, poke bonnet",
"bookcase",
"bookshop, bookstore, bookstall",
"bottlecap",
"bow",
"bow tie, bow-tie, bowtie",
"brass, memorial tablet, plaque",
"brassiere, bra, bandeau",
"breakwater, groin, groyne, mole, bulwark, seawall, jetty",
"breastplate, aegis, egis",
"broom",
"bucket, pail",
"buckle",
"bulletproof vest",
"bullet train, bullet",
"butcher shop, meat market",
"cab, hack, taxi, taxicab",
"caldron, cauldron",
"candle, taper, wax light",
"cannon",
"canoe",
"can opener, tin opener",
"cardigan",
"car mirror",
"carousel, carrousel, merry-go-round, roundabout, whirligig",
"carpenter's kit, tool kit",
"carton",
"car wheel",
"cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, atm",
"cassette",
"cassette player",
"castle",
"catamaran",
"cd player",
"cello, violoncello",
"cellular telephone, cellular phone, cellphone, cell, mobile phone",
"chain",
"chainlink fence",
"chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour",
"chain saw, chainsaw",
"chest",
"chiffonier, commode",
"chime, bell, gong",
"china cabinet, china closet",
"christmas stocking",
"church, church building",
"cinema, movie theater, movie theatre, movie house, picture palace",
"cleaver, meat cleaver, chopper",
"cliff dwelling",
"cloak",
"clog, geta, patten, sabot",
"cocktail shaker",
"coffee mug",
"coffeepot",
"coil, spiral, volute, whorl, helix",
"combination lock",
"computer keyboard, keypad",
"confectionery, confectionary, candy store",
"container ship, containership, container vessel",
"convertible",
"corkscrew, bottle screw",
"cornet, horn, trumpet, trump",
"cowboy boot",
"cowboy hat, ten-gallon hat",
"cradle",
"crane",
"crash helmet",
"crate",
"crib, cot",
"crock pot",
"croquet ball",
"crutch",
"cuirass",
"dam, dike, dyke",
"desk",
"desktop computer",
"dial telephone, dial phone",
"diaper, nappy, napkin",
"digital clock",
"digital watch",
"dining table, board",
"dishrag, dishcloth",
"dishwasher, dish washer, dishwashing machine",
"disk brake, disc brake",
"dock, dockage, docking facility",
"dogsled, dog sled, dog sleigh",
"dome",
"doormat, welcome mat",
"drilling platform, offshore rig",
"drum, membranophone, tympan",
"drumstick",
"dumbbell",
"dutch oven",
"electric fan, blower",
"electric guitar",
"electric locomotive",
"entertainment center",
"envelope",
"espresso maker",
"face powder",
"feather boa, boa",
"file, file cabinet, filing cabinet",
"fireboat",
"fire engine, fire truck",
"fire screen, fireguard",
"flagpole, flagstaff",
"flute, transverse flute",
"folding chair",
"football helmet",
"forklift",
"fountain",
"fountain pen",
"four-poster",
"freight car",
"french horn, horn",
"frying pan, frypan, skillet",
"fur coat",
"garbage truck, dustcart",
"gasmask, respirator, gas helmet",
"gas pump, gasoline pump, petrol pump, island dispenser",
"goblet",
"go-kart",
"golf ball",
"golfcart, golf cart",
"gondola",
"gong, tam-tam",
"gown",
"grand piano, grand",
"greenhouse, nursery, glasshouse",
"grille, radiator grille",
"grocery store, grocery, food market, market",
"guillotine",
"hair slide",
"hair spray",
"half track",
"hammer",
"hamper",
"hand blower, blow dryer, blow drier, hair dryer, hair drier",
"hand-held computer, hand-held microcomputer",
"handkerchief, hankie, hanky, hankey",
"hard disc, hard disk, fixed disk",
"harmonica, mouth organ, harp, mouth harp",
"harp",
"harvester, reaper",
"hatchet",
"holster",
"home theater, home theatre",
"honeycomb",
"hook, claw",
"hoopskirt, crinoline",
"horizontal bar, high bar",
"horse cart, horse-cart",
"hourglass",
"ipod",
"iron, smoothing iron",
"jack-o'-lantern",
"jean, blue jean, denim",
"jeep, landrover",
"jersey, t-shirt, tee shirt",
"jigsaw puzzle",
"jinrikisha, ricksha, rickshaw",
"joystick",
"kimono",
"knee pad",
"knot",
"lab coat, laboratory coat",
"ladle",
"lampshade, lamp shade",
"laptop, laptop computer",
"lawn mower, mower",
"lens cap, lens cover",
"letter opener, paper knife, paperknife",
"library",
"lifeboat",
"lighter, light, igniter, ignitor",
"limousine, limo",
"liner, ocean liner",
"lipstick, lip rouge",
"loafer",
"lotion",
"loudspeaker, speaker, speaker unit, loudspeaker system, speaker system",
"loupe, jeweler's loupe",
"lumbermill, sawmill",
"magnetic compass",
"mailbag, postbag",
"mailbox, letter box",
"maillot",
"maillot, tank suit",
"manhole cover",
"maraca",
"marimba, xylophone",
"mask",
"matchstick",
"maypole",
"maze, labyrinth",
"measuring cup",
"medicine chest, medicine cabinet",
"megalith, megalithic structure",
"microphone, mike",
"microwave, microwave oven",
"military uniform",
"milk can",
"minibus",
"miniskirt, mini",
"minivan",
"missile",
"mitten",
"mixing bowl",
"mobile home, manufactured home",
"model t",
"modem",
"monastery",
"monitor",
"moped",
"mortar",
"mortarboard",
"mosque",
"mosquito net",
"motor scooter, scooter",
"mountain bike, all-terrain bike, off-roader",
"mountain tent",
"mouse, computer mouse",
"mousetrap",
"moving van",
"muzzle",
"nail",
"neck brace",
"necklace",
"nipple",
"notebook, notebook computer",
"obelisk",
"oboe, hautboy, hautbois",
"ocarina, sweet potato",
"odometer, hodometer, mileometer, milometer",
"oil filter",
"organ, pipe organ",
"oscilloscope, scope, cathode-ray oscilloscope, cro",
"overskirt",
"oxcart",
"oxygen mask",
"packet",
"paddle, boat paddle",
"paddlewheel, paddle wheel",
"padlock",
"paintbrush",
"pajama, pyjama, pj's, jammies",
"palace",
"panpipe, pandean pipe, syrinx",
"paper towel",
"parachute, chute",
"parallel bars, bars",
"park bench",
"parking meter",
"passenger car, coach, carriage",
"patio, terrace",
"pay-phone, pay-station",
"pedestal, plinth, footstall",
"pencil box, pencil case",
"pencil sharpener",
"perfume, essence",
"petri dish",
"photocopier",
"pick, plectrum, plectron",
"pickelhaube",
"picket fence, paling",
"pickup, pickup truck",
"pier",
"piggy bank, penny bank",
"pill bottle",
"pillow",
"ping-pong ball",
"pinwheel",
"pirate, pirate ship",
"pitcher, ewer",
"plane, carpenter's plane, woodworking plane",
"planetarium",
"plastic bag",
"plate rack",
"plow, plough",
"plunger, plumber's helper",
"polaroid camera, polaroid land camera",
"pole",
"police van, police wagon, paddy wagon, patrol wagon, wagon, black maria",
"poncho",
"pool table, billiard table, snooker table",
"pop bottle, soda bottle",
"pot, flowerpot",
"potter's wheel",
"power drill",
"prayer rug, prayer mat",
"printer",
"prison, prison house",
"projectile, missile",
"projector",
"puck, hockey puck",
"punching bag, punch bag, punching ball, punchball",
"purse",
"quill, quill pen",
"quilt, comforter, comfort, puff",
"racer, race car, racing car",
"racket, racquet",
"radiator",
"radio, wireless",
"radio telescope, radio reflector",
"rain barrel",
"recreational vehicle, rv, r.v.",
"reel",
"reflex camera",
"refrigerator, icebox",
"remote control, remote",
"restaurant, eating house, eating place, eatery",
"revolver, six-gun, six-shooter",
"rifle",
"rocking chair, rocker",
"rotisserie",
"rubber eraser, rubber, pencil eraser",
"rugby ball",
"rule, ruler",
"running shoe",
"safe",
"safety pin",
"saltshaker, salt shaker",
"sandal",
"sarong",
"sax, saxophone",
"scabbard",
"scale, weighing machine",
"school bus",
"schooner",
"scoreboard",
"screen, crt screen",
"screw",
"screwdriver",
"seat belt, seatbelt",
"sewing machine",
"shield, buckler",
"shoe shop, shoe-shop, shoe store",
"shoji",
"shopping basket",
"shopping cart",
"shovel",
"shower cap",
"shower curtain",
"ski",
"ski mask",
"sleeping bag",
"slide rule, slipstick",
"sliding door",
"slot, one-armed bandit",
"snorkel",
"snowmobile",
"snowplow, snowplough",
"soap dispenser",
"soccer ball",
"sock",
"solar dish, solar collector, solar furnace",
"sombrero",
"soup bowl",
"space bar",
"space heater",
"space shuttle",
"spatula",
"speedboat",
"spider web, spider's web",
"spindle",
"sports car, sport car",
"spotlight, spot",
"stage",
"steam locomotive",
"steel arch bridge",
"steel drum",
"stethoscope",
"stole",
"stone wall",
"stopwatch, stop watch",
"stove",
"strainer",
"streetcar, tram, tramcar, trolley, trolley car",
"stretcher",
"studio couch, day bed",
"stupa, tope",
"submarine, pigboat, sub, u-boat",
"suit, suit of clothes",
"sundial",
"sunglass",
"sunglasses, dark glasses, shades",
"sunscreen, sunblock, sun blocker",
"suspension bridge",
"swab, swob, mop",
"sweatshirt",
"swimming trunks, bathing trunks",
"swing",
"switch, electric switch, electrical switch",
"syringe",
"table lamp",
"tank, army tank, armored combat vehicle, armoured combat vehicle",
"tape player",
"teapot",
"teddy, teddy bear",
"television, television system",
"tennis ball",
"thatch, thatched roof",
"theater curtain, theatre curtain",
"thimble",
"thresher, thrasher, threshing machine",
"throne",
"tile roof",
"toaster",
"tobacco shop, tobacconist shop, tobacconist",
"toilet seat",
"torch",
"totem pole",
"tow truck, tow car, wrecker",
"toyshop",
"tractor",
"trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi",
"tray",
"trench coat",
"tricycle, trike, velocipede",
"trimaran",
"tripod",
"triumphal arch",
"trolleybus, trolley coach, trackless trolley",
"trombone",
"tub, vat",
"turnstile",
"typewriter keyboard",
"umbrella",
"unicycle, monocycle",
"upright, upright piano",
"vacuum, vacuum cleaner",
"vase",
"vault",
"velvet",
"vending machine",
"vestment",
"viaduct",
"violin, fiddle",
"volleyball",
"waffle iron",
"wall clock",
"wallet, billfold, notecase, pocketbook",
"wardrobe, closet, press",
"warplane, military plane",
"washbasin, handbasin, washbowl, lavabo, wash-hand basin",
"washer, automatic washer, washing machine",
"water bottle",
"water jug",
"water tower",
"whiskey jug",
"whistle",
"wig",
"window screen",
"window shade",
"windsor tie",
"wine bottle",
"wing",
"wok",
"wooden spoon",
"wool, woolen, woollen",
"worm fence, snake fence, snake-rail fence, virginia fence",
"wreck",
"yawl",
"yurt",
"web site, website, internet site, site",
"comic book",
"crossword puzzle, crossword",
"street sign",
"traffic light, traffic signal, stoplight",
"book jacket, dust cover, dust jacket, dust wrapper",
"menu",
"plate",
"guacamole",
"consomme",
"hot pot, hotpot",
"trifle",
"ice cream, icecream",
"ice lolly, lolly, lollipop, popsicle",
"french loaf",
"bagel, beigel",
"pretzel",
"cheeseburger",
"hotdog, hot dog, red hot",
"mashed potato",
"head cabbage",
"broccoli",
"cauliflower",
"zucchini, courgette",
"spaghetti squash",
"acorn squash",
"butternut squash",
"cucumber, cuke",
"artichoke, globe artichoke",
"bell pepper",
"cardoon",
"mushroom",
"granny smith",
"strawberry",
"orange",
"lemon",
"fig",
"pineapple, ananas",
"banana",
"jackfruit, jak, jack",
"custard apple",
"pomegranate",
"hay",
"carbonara",
"chocolate sauce, chocolate syrup",
"dough",
"meat loaf, meatloaf",
"pizza, pizza pie",
"potpie",
"burrito",
"red wine",
"espresso",
"cup",
"eggnog",
"alp",
"bubble",
"cliff, drop, drop-off",
"coral reef",
"geyser",
"lakeside, lakeshore",
"promontory, headland, head, foreland",
"sandbar, sand bar",
"seashore, coast, seacoast, sea-coast",
"valley, vale",
"volcano",
"ballplayer, baseball player",
"groom, bridegroom",
"scuba diver",
"rapeseed",
"daisy",
"yellow lady's slipper, yellow lady-slipper, cypripedium calceolus, cypripedium parviflorum",
"corn",
"acorn",
"hip, rose hip, rosehip",
"buckeye, horse chestnut, conker",
"coral fungus",
"agaric",
"gyromitra",
"stinkhorn, carrion fungus",
"earthstar",
"hen-of-the-woods, hen of the woods, polyporus frondosus, grifola frondosa",
"bolete",
"ear, spike, capitulum",
"toilet tissue, toilet paper, bathroom tissue"
] |
Abeesan/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1929
- Accuracy: 0.9499
Zero-shot CLIP baseline, for comparison:
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3626 | 1.0 | 370 | 0.3135 | 0.9202 |
| 0.206 | 2.0 | 740 | 0.2440 | 0.9350 |
| 0.1765 | 3.0 | 1110 | 0.2278 | 0.9323 |
| 0.1454 | 4.0 | 1480 | 0.2167 | 0.9350 |
| 0.1234 | 5.0 | 1850 | 0.2139 | 0.9323 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
schlenat/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1977
- Accuracy: 0.9445
## Model description
This model is based on the Vision Transformer (ViT) architecture and was fine-tuned for image classification on the Oxford-IIIT Pet Dataset.
To complement the fine-tuned model's performance, we also evaluated the zero-shot capabilities of CLIP on the same dataset.
The zero-shot classification was conducted with the `transformers` pipeline for `zero-shot-image-classification`, where pet breed names were used as candidate labels without any additional fine-tuning; a minimal sketch of this setup follows the results below.
Results:
- Accuracy: 88.00%
- Precision (weighted): 87.68%
- Recall (weighted): 88.00%
These results demonstrate that CLIP is capable of recognizing fine-grained pet categories with high accuracy in a zero-shot setting,
highlighting the model’s robustness and generalization capability across unseen tasks.
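As a minimal sketch of that pipeline call (the CLIP checkpoint, the image path, and the shortened label list are illustrative assumptions):
```python
# Sketch of the zero-shot-image-classification pipeline described above.
from PIL import Image
from transformers import pipeline

clf = pipeline("zero-shot-image-classification",
               model="openai/clip-vit-large-patch14")  # assumed checkpoint

image = Image.open("example_pet.jpg")                  # hypothetical local image
breeds = ["siamese", "birman", "shiba inu", "pug"]     # in practice, all 37 breed names

for pred in clf(image, candidate_labels=breeds)[:3]:   # top-3 candidates
    print(f"{pred['label']}: {pred['score']:.3f}")
```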
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3582 | 1.0 | 370 | 0.2997 | 0.9256 |
| 0.2125 | 2.0 | 740 | 0.2200 | 0.9418 |
| 0.1573 | 3.0 | 1110 | 0.1966 | 0.9405 |
| 0.1472 | 4.0 | 1480 | 0.1884 | 0.9445 |
| 0.1338 | 5.0 | 1850 | 0.1865 | 0.9472 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
Tharsana/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# clip-oxford-pets (zero-shot baseline)
These figures are a zero-shot evaluation of openai/clip-vit-large-patch14 on the pcuenq/oxford-pets dataset (CLIP itself was not fine-tuned). It achieves the following results on the evaluation set:
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1894
- Accuracy: 0.9364
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3552 | 1.0 | 370 | 0.3072 | 0.9120 |
| 0.2159 | 2.0 | 740 | 0.2327 | 0.9242 |
| 0.1625 | 3.0 | 1110 | 0.2089 | 0.9256 |
| 0.155 | 4.0 | 1480 | 0.2029 | 0.9296 |
| 0.1219 | 5.0 | 1850 | 0.1995 | 0.9323 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
Fadri/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1991
- Accuracy: 0.9378
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.371 | 1.0 | 370 | 0.2837 | 0.9405 |
| 0.2082 | 2.0 | 740 | 0.2215 | 0.9378 |
| 0.1798 | 3.0 | 1110 | 0.2038 | 0.9350 |
| 0.1419 | 4.0 | 1480 | 0.1966 | 0.9364 |
| 0.1273 | 5.0 | 1850 | 0.1936 | 0.9405 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
### Zero-Shot Evaluation
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
hindero1/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2132
- Accuracy: 0.9459
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.377 | 1.0 | 370 | 0.2844 | 0.9337 |
| 0.211 | 2.0 | 740 | 0.2143 | 0.9391 |
| 0.1792 | 3.0 | 1110 | 0.1906 | 0.9391 |
| 0.1445 | 4.0 | 1480 | 0.1811 | 0.9432 |
| 0.135 | 5.0 | 1850 | 0.1801 | 0.9445 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
### Zero-Shot Evaluation
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
bloecand/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1995
- Accuracy: 0.9432
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.374 | 1.0 | 370 | 0.3119 | 0.9378 |
| 0.2103 | 2.0 | 740 | 0.2457 | 0.9405 |
| 0.1651 | 3.0 | 1110 | 0.2329 | 0.9337 |
| 0.1409 | 4.0 | 1480 | 0.2236 | 0.9432 |
| 0.1274 | 5.0 | 1850 | 0.2220 | 0.9459 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
### Zero-Shot Model: openai/clip-vit-large-patch14
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
Demo: https://huggingface.co/spaces/bloecand/week7
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
kornmayer/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2031
- Accuracy: 0.9459
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3727 | 1.0 | 370 | 0.2756 | 0.9337 |
| 0.2145 | 2.0 | 740 | 0.2168 | 0.9378 |
| 0.1835 | 3.0 | 1110 | 0.1918 | 0.9459 |
| 0.147 | 4.0 | 1480 | 0.1857 | 0.9472 |
| 0.1315 | 5.0 | 1850 | 0.1818 | 0.9472 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
### Zero-Shot Classification with CLIP
For comparison, the model `openai/clip-vit-large-patch14` was additionally applied to the Oxford-IIIT Pet Dataset as a zero-shot image classifier. The following results are based on predictions for 100 random images with 37 class labels:
- **Accuracy**: 88.00 %
- **Precision (weighted average)**: 87.68 %
- **Recall (weighted average)**: 88.00 %
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
caccaluc/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1962
- Accuracy: 0.9391
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3549 | 1.0 | 370 | 0.3108 | 0.9269 |
| 0.2108 | 2.0 | 740 | 0.2384 | 0.9364 |
| 0.1707 | 3.0 | 1110 | 0.2171 | 0.9310 |
| 0.1574 | 4.0 | 1480 | 0.2142 | 0.9283 |
| 0.1322 | 5.0 | 1850 | 0.2100 | 0.9296 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
- Accuracy: 0.8785
- Precision: 0.8761
- Recall: 0.8785
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
Thivjan11/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1977
- Accuracy: 0.9445
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3582 | 1.0 | 370 | 0.2997 | 0.9256 |
| 0.2125 | 2.0 | 740 | 0.2200 | 0.9418 |
| 0.1573 | 3.0 | 1110 | 0.1966 | 0.9405 |
| 0.1472 | 4.0 | 1480 | 0.1884 | 0.9445 |
| 0.1338 | 5.0 | 1850 | 0.1865 | 0.9472 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
Zero-shot CLIP results:
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
Additionally computed out of curiosity:
- F1-Score (weighted): 0.8605
- F1-Score (micro): 0.8800
- F1-Score (macro): 0.8605
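The three variants differ only in how per-class scores are aggregated; a small scikit-learn sketch with placeholder arrays:
```python
# Sketch: the three F1 aggregation strategies reported above.
from sklearn.metrics import f1_score

y_true = ["siamese", "pug", "beagle", "pug"]  # placeholder ground truth
y_pred = ["siamese", "pug", "pug", "pug"]     # placeholder predictions

print(f1_score(y_true, y_pred, average="weighted"))  # per-class F1, weighted by support
print(f1_score(y_true, y_pred, average="micro"))     # from global TP/FP/FN counts
print(f1_score(y_true, y_pred, average="macro"))     # unweighted mean of per-class F1
```
For single-label multi-class classification, micro-averaged F1 equals plain accuracy, which is why the micro F1 above matches the 0.8800 accuracy.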
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
jarinschnierl/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1940
- Accuracy: 0.9391
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.382 | 1.0 | 370 | 0.2590 | 0.9391 |
| 0.1976 | 2.0 | 740 | 0.1871 | 0.9445 |
| 0.1605 | 3.0 | 1110 | 0.1637 | 0.9567 |
| 0.1513 | 4.0 | 1480 | 0.1601 | 0.9513 |
| 0.1424 | 5.0 | 1850 | 0.1583 | 0.9513 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cpu
- Datasets 3.5.0
- Tokenizers 0.21.1
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
faridkarimli/SWIN_Gaudi_60
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# SWIN_Gaudi_v1
This model is a fine-tuned version of [microsoft/swinv2-large-patch4-window12-192-22k](https://huggingface.co/microsoft/swinv2-large-patch4-window12-192-22k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 4.8090
- Accuracy: 0.2293
- Memory Allocated (gb): 1.84
- Max Memory Allocated (gb): 58.8
- Total Memory Available (gb): 94.62
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- distributed_type: multi-GPU
- num_devices: 8
- total_train_batch_size: 1024
- total_eval_batch_size: 1024
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-06
- lr_scheduler_type: linear
- num_epochs: 30.0
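Runs like this on Gaudi HPUs are typically driven through `optimum-habana`; the following is a minimal sketch assuming its `GaudiTrainer` API and a `Habana/swin` Gaudi config, neither of which is confirmed by this card.
```python
# Sketch of an HPU fine-tuning setup; GaudiTrainer usage and the config name
# are assumptions based on the optimum-habana library, not this card's script.
from optimum.habana import GaudiTrainer, GaudiTrainingArguments
from transformers import AutoModelForImageClassification

train_ds = eval_ds = None  # placeholders: the card does not name its dataset
num_labels = 1000          # placeholder: set to the real number of classes

model = AutoModelForImageClassification.from_pretrained(
    "microsoft/swinv2-large-patch4-window12-192-22k",
    num_labels=num_labels, ignore_mismatched_sizes=True,
)

args = GaudiTrainingArguments(
    output_dir="SWIN_Gaudi_v1",
    use_habana=True,                  # run on HPU instead of CUDA
    use_lazy_mode=True,               # Gaudi lazy-execution graph mode
    gaudi_config_name="Habana/swin",  # assumed mixed-precision config repo
    learning_rate=1e-3,               # from the card
    per_device_train_batch_size=128,  # from the card; 8 devices -> total 1024
    per_device_eval_batch_size=128,   # from the card
    num_train_epochs=30,              # from the card
    lr_scheduler_type="linear",       # from the card
    seed=42,                          # from the card
)

# Substitute real datasets before calling train(); the 8-device launch is done
# externally (e.g. via optimum-habana's gaudi_spawn.py).
trainer = GaudiTrainer(model=model, args=args,
                       train_dataset=train_ds, eval_dataset=eval_ds)
```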
### Training results
| Training Loss | Epoch | Step | Accuracy | Validation Loss | Max Memory Allocated (gb) | Memory Allocated (gb) | Total Memory Available (gb) |
|:-------------:|:-----:|:-----:|:--------:|:---------------:|:-------------------------:|:---------------------:|:---------------------------:|
| 5.3071 | 1.0 | 657 | 0.0801 | 6.2406 | 60.21 | 3.2 | 94.62 |
| 3.1366 | 2.0 | 1314 | 0.1047 | 5.8481 | 60.21 | 3.2 | 94.62 |
| 2.6048 | 3.0 | 1971 | 0.1238 | 5.5522 | 60.21 | 3.2 | 94.62 |
| 1.9918 | 4.0 | 2628 | 0.1301 | 5.5551 | 60.21 | 3.2 | 94.62 |
| 1.8353 | 5.0 | 3285 | 0.1415 | 5.4142 | 60.21 | 3.2 | 94.62 |
| 1.7262 | 6.0 | 3942 | 0.1495 | 5.4061 | 60.21 | 3.2 | 94.62 |
| 1.5135 | 7.0 | 4599 | 0.1468 | 5.4261 | 60.21 | 3.2 | 94.62 |
| 1.4225 | 8.0 | 5256 | 0.1573 | 5.3333 | 60.21 | 3.2 | 94.62 |
| 1.354 | 9.0 | 5913 | 0.1638 | 5.2205 | 60.21 | 3.2 | 94.62 |
| 1.2511 | 10.0 | 6570 | 0.1708 | 5.2129 | 60.21 | 3.2 | 94.62 |
| 1.1742 | 11.0 | 7227 | 0.1724 | 5.2002 | 60.21 | 3.2 | 94.62 |
| 1.1342 | 12.0 | 7884 | 0.1782 | 5.1635 | 60.21 | 3.2 | 94.62 |
| 1.0711 | 13.0 | 8541 | 0.1779 | 5.1436 | 60.21 | 3.2 | 94.62 |
| 0.9971 | 14.0 | 9198 | 0.1817 | 5.1076 | 60.21 | 3.2 | 94.62 |
| 0.9774 | 15.0 | 9855 | 0.1935 | 4.9076 | 60.21 | 3.2 | 94.62 |
| 0.9174 | 16.0 | 10512 | 0.1890 | 5.0318 | 60.21 | 3.2 | 94.62 |
| 0.8675 | 17.0 | 11169 | 0.1951 | 5.0392 | 60.21 | 3.2 | 94.62 |
| 0.8499 | 18.0 | 11826 | 0.1978 | 5.0243 | 60.21 | 3.2 | 94.62 |
| 0.8262 | 19.0 | 12483 | 0.1972 | 5.0843 | 60.21 | 3.2 | 94.62 |
| 0.7623 | 20.0 | 13140 | 0.2048 | 5.0004 | 60.21 | 3.2 | 94.62 |
| 0.7481 | 21.0 | 13797 | 0.2132 | 4.8428 | 60.24 | 3.2 | 94.62 |
| 0.7284 | 22.0 | 14454 | 0.2149 | 4.8461 | 60.24 | 3.2 | 94.62 |
| 0.6834 | 23.0 | 15111 | 0.2159 | 4.8741 | 60.24 | 3.2 | 94.62 |
| 0.6591 | 24.0 | 15768 | 0.2187 | 4.8993 | 60.24 | 3.2 | 94.62 |
| 0.6447 | 25.0 | 16425 | 0.2196 | 4.8415 | 60.24 | 3.2 | 94.62 |
| 0.6107 | 26.0 | 17082 | 0.2216 | 4.8600 | 60.24 | 3.2 | 94.62 |
| 0.5958 | 27.0 | 17739 | 0.2245 | 4.8391 | 60.24 | 3.2 | 94.62 |
| 0.5836 | 28.0 | 18396 | 0.2265 | 4.8561 | 60.24 | 3.2 | 94.62 |
| 0.5547 | 29.0 | 19053 | 0.2295 | 4.7933 | 60.24 | 3.2 | 94.62 |
| 0.547 | 30.0 | 19710 | 0.2293 | 4.8090 | 60.24 | 3.2 | 94.62 |
### Framework versions
- Transformers 4.45.2
- Pytorch 2.6.0+hpu_1.20.0-543.git4952fce
- Datasets 3.5.0
- Tokenizers 0.20.3
|
[
"3150",
"7279",
"12267",
"15267",
"8181",
"12263",
"6469",
"15322",
"755",
"184",
"3501",
"8749",
"5411",
"11090",
"1176",
"15154",
"7198",
"13439",
"1432",
"2219",
"7338",
"9986",
"5439",
"12921",
"4688",
"8686",
"9356",
"12530",
"6406",
"10983",
"3705",
"13494",
"2008",
"3537",
"2485",
"9773",
"15450",
"13679",
"13236",
"13793",
"11717",
"8123",
"11098",
"6493",
"10737",
"3270",
"10288",
"6881",
"1657",
"1460",
"13482",
"6820",
"11248",
"2080",
"3247",
"15029",
"7639",
"10799",
"711",
"13596",
"9392",
"13049",
"12499",
"10341",
"11730",
"3049",
"2089",
"4350",
"1203",
"13403",
"911",
"14681",
"1612",
"1275",
"10697",
"9871",
"9072",
"7353",
"10624",
"113",
"2554",
"920",
"3201",
"11067",
"7312",
"9178",
"8348",
"7905",
"9549",
"1679",
"13655",
"2647",
"11344",
"11748",
"11809",
"2757",
"3925",
"12044",
"11473",
"1750",
"3267",
"3037",
"304",
"991",
"217",
"4724",
"3454",
"15335",
"5825",
"8084",
"4578",
"8303",
"12034",
"5806",
"10313",
"12244",
"9297",
"2465",
"9266",
"12956",
"320",
"14986",
"4926",
"7960",
"14747",
"15400",
"3055",
"9956",
"11559",
"9918",
"14619",
"5290",
"14312",
"14238",
"8357",
"2521",
"7929",
"3134",
"5947",
"2351",
"14258",
"3728",
"7445",
"8726",
"799",
"10015",
"1055",
"11872",
"9054",
"10441",
"13763",
"13553",
"14942",
"12084",
"13734",
"10029",
"5057",
"2460",
"7950",
"5950",
"272",
"13784",
"1403",
"15494",
"7972",
"8917",
"14000",
"11507",
"3006",
"1351",
"6180",
"2951",
"10109",
"1480",
"442",
"5591",
"12525",
"977",
"13014",
"5630",
"9079",
"3333",
"11846",
"8349",
"11491",
"6695",
"11335",
"14542",
"14277",
"9015",
"6201",
"12181",
"5422",
"6518",
"12641",
"12427",
"3428",
"9622",
"11366",
"9983",
"1879",
"15156",
"2515",
"14678",
"5162",
"3650",
"3948",
"10411",
"1997",
"5344",
"12848",
"14653",
"14759",
"12646",
"2961",
"2920",
"8380",
"12454",
"10385",
"8757",
"6034",
"5017",
"1196",
"8074",
"1433",
"3121",
"7035",
"7638",
"2793",
"10598",
"5066",
"11089",
"9046",
"6334",
"729",
"2664",
"2444",
"6738",
"2567",
"14058",
"132",
"10924",
"6464",
"13740",
"5999",
"11154",
"2221",
"233",
"7736",
"5983",
"2919",
"13839",
"11427",
"7728",
"127",
"9926",
"10951",
"12221",
"12858",
"1784",
"9377",
"8428",
"12376",
"13416",
"13148",
"5128",
"4333",
"9736",
"5423",
"5526",
"14570",
"10560",
"9847",
"7050",
"11770",
"11204",
"10956",
"11287",
"12460",
"8370",
"3597",
"1867",
"6485",
"10485",
"3383",
"3298",
"6840",
"9105",
"9791",
"13467",
"4094",
"2751",
"7902",
"10891",
"14473",
"13379",
"15375",
"10114",
"4828",
"12828",
"12413",
"142",
"7434",
"782",
"12959",
"8848",
"94",
"12350",
"11180",
"5102",
"13705",
"14586",
"4984",
"6420",
"6854",
"963",
"9452",
"2643",
"5774",
"13949",
"11305",
"4904",
"3305",
"10841",
"11382",
"14140",
"7677",
"11964",
"811",
"9588",
"4961",
"15300",
"1815",
"14150",
"2752",
"5887",
"6867",
"11426",
"13222",
"13909",
"3197",
"5256",
"13862",
"3720",
"580",
"10421",
"15002",
"996",
"12540",
"9261",
"6241",
"13307",
"9515",
"5904",
"9006",
"12099",
"11073",
"12914",
"8678",
"4455",
"12369",
"4793",
"4894",
"15346",
"7830",
"6711",
"10704",
"6117",
"5694",
"2363",
"11547",
"6305",
"5069",
"12718",
"11966",
"13719",
"1261",
"9335",
"7220",
"400",
"7322",
"11639",
"99",
"12269",
"9002",
"3012",
"9754",
"1762",
"5462",
"1553",
"7689",
"10532",
"5165",
"13962",
"4519",
"10628",
"6977",
"8932",
"11821",
"7760",
"8508",
"168",
"10043",
"11718",
"9776",
"12747",
"5388",
"252",
"12381",
"5674",
"5729",
"11358",
"2782",
"6239",
"5174",
"415",
"10245",
"14122",
"11122",
"12065",
"12424",
"10575",
"10738",
"7037",
"6690",
"10103",
"13005",
"7705",
"6633",
"15100",
"5115",
"10211",
"13022",
"5310",
"11725",
"525",
"4851",
"2512",
"14512",
"8831",
"176",
"3593",
"9300",
"5222",
"4877",
"8462",
"11258",
"2346",
"12944",
"13249",
"8902",
"939",
"8947",
"4026",
"15480",
"10261",
"2558",
"679",
"1822",
"11166",
"529",
"7041",
"12882",
"15050",
"9087",
"11959",
"443",
"4276",
"4489",
"171",
"3097",
"9706",
"6073",
"11145",
"1473",
"9289",
"8336",
"13308",
"13733",
"5146",
"7974",
"15057",
"11464",
"9651",
"13070",
"4035",
"13779",
"6198",
"12096",
"11209",
"2955",
"14608",
"6479",
"1183",
"8037",
"5813",
"10674",
"626",
"10811",
"7882",
"12563",
"11841",
"5082",
"3005",
"9735",
"5485",
"13598",
"3992",
"6052",
"10746",
"7280",
"12930",
"4537",
"4179",
"12183",
"12522",
"10234",
"15030",
"4539",
"7650",
"4974",
"11208",
"3017",
"3686",
"10006",
"738",
"11998",
"2786",
"4381",
"12496",
"9406",
"13756",
"10948",
"14699",
"1529",
"1585",
"10992",
"1322",
"5311",
"10452",
"2502",
"2594",
"15199",
"5993",
"9569",
"13020",
"11343",
"1735",
"8632",
"288",
"3714",
"5822",
"3425",
"9282",
"8452",
"3335",
"261",
"4127",
"2406",
"530",
"14904",
"13394",
"8522",
"7021",
"7942",
"7318",
"9383",
"2116",
"10338",
"8528",
"1848",
"14524",
"1753",
"4814",
"15055",
"12491",
"13846",
"12115",
"6579",
"826",
"8323",
"1240",
"15187",
"5808",
"14587",
"1563",
"15482",
"7957",
"10829",
"9089",
"14208",
"10477",
"2240",
"5623",
"1713",
"9078",
"15064",
"861",
"2656",
"3575",
"8629",
"7399",
"564",
"7450",
"2105",
"14328",
"11660",
"14350",
"7208",
"1371",
"91",
"8627",
"8669",
"15485",
"12560",
"3209",
"11311",
"6829",
"6330",
"3251",
"12933",
"5874",
"14669",
"9553",
"4605",
"1487",
"12436",
"2344",
"469",
"605",
"8377",
"13171",
"623",
"10080",
"4595",
"13654",
"1721",
"11002",
"11526",
"2463",
"10096",
"13951",
"7803",
"4439",
"3131",
"13545",
"9045",
"9915",
"12363",
"12298",
"14719",
"11402",
"4031",
"6592",
"14059",
"14069",
"1986",
"8104",
"7301",
"2115",
"13936",
"8587",
"11549",
"12520",
"4597",
"2702",
"13077",
"13233",
"11047",
"12021",
"5937",
"15299",
"240",
"2962",
"6273",
"13552",
"11612",
"7764",
"12218",
"11756",
"6077",
"3060",
"11411",
"12762",
"554",
"421",
"12865",
"1124",
"12430",
"7106",
"4341",
"6162",
"1730",
"10208",
"13551",
"14726",
"14977",
"11948",
"3208",
"11716",
"8207",
"4951",
"7324",
"7186",
"7071",
"728",
"1353",
"2451",
"3910",
"15257",
"10856",
"4854",
"13567",
"1708",
"8611",
"4580",
"13239",
"11056",
"13818",
"332",
"2922",
"13752",
"27",
"381",
"2073",
"7305",
"10290",
"5882",
"1992",
"13850",
"301",
"14709",
"7498",
"9941",
"15306",
"3750",
"4354",
"7507",
"4625",
"14869",
"7337",
"785",
"5015",
"13690",
"87",
"2508",
"3183",
"10483",
"8569",
"13028",
"6993",
"10842",
"10786",
"11534",
"3667",
"8532",
"11671",
"14453",
"12132",
"7397",
"8884",
"448",
"9658",
"7930",
"7418",
"5759",
"1287",
"1803",
"12784",
"12",
"1984",
"1017",
"6900",
"13069",
"10358",
"6582",
"1126",
"12999",
"1380",
"13530",
"4157",
"12709",
"5701",
"11854",
"13689",
"8670",
"5989",
"5727",
"12561",
"8341",
"2595",
"13448",
"7105",
"14812",
"150",
"12050",
"6013",
"8383",
"11400",
"11409",
"8222",
"12905",
"13333",
"5678",
"5684",
"15458",
"428",
"8471",
"13538",
"7923",
"7828",
"2792",
"1377",
"6963",
"13519",
"8188",
"11691",
"10932",
"7031",
"2194",
"14244",
"5055",
"8126",
"9886",
"10128",
"8354",
"3002",
"2681",
"11617",
"6866",
"7421",
"13381",
"3855",
"11535",
"4566",
"3863",
"12654",
"11757",
"2345",
"7247",
"6495",
"12038",
"7017",
"10067",
"2533",
"7162",
"7509",
"15428",
"3417",
"12095",
"4582",
"10988",
"10604",
"11825",
"7459",
"3103",
"10383",
"1733",
"5342",
"1145",
"3565",
"2551",
"2249",
"5584",
"6431",
"13922",
"2079",
"11683",
"8612",
"1575",
"13787",
"3475",
"1701",
"11388",
"3550",
"8197",
"2112",
"7785",
"15436",
"38",
"1286",
"2452",
"8142",
"7147",
"7733",
"6487",
"14357",
"9891",
"9878",
"11012",
"8595",
"10892",
"5781",
"4404",
"1166",
"3780",
"11135",
"12911",
"14118",
"14014",
"3759",
"5252",
"8294",
"14199",
"496",
"8653",
"2848",
"14979",
"14036",
"2611",
"9456",
"6538",
"1039",
"2291",
"8696",
"9445",
"7268",
"4955",
"13035",
"9630",
"8147",
"8955",
"3429",
"13360",
"15444",
"11557",
"5305",
"10391",
"4859",
"59",
"9957",
"1041",
"8999",
"5001",
"5894",
"4193",
"12439",
"12100",
"2422",
"5695",
"6946",
"9150",
"12521",
"7194",
"2239",
"8264",
"11431",
"14141",
"10050",
"7715",
"7372",
"4836",
"13046",
"14384",
"14811",
"10343",
"1431",
"13043",
"6328",
"11486",
"7007",
"12645",
"3881",
"13582",
"13804",
"4731",
"5394",
"4242",
"6172",
"13229",
"5657",
"8904",
"14367",
"1650",
"10942",
"12001",
"3504",
"13182",
"734",
"7749",
"5537",
"5409",
"3657",
"67",
"7472",
"4588",
"4235",
"12860",
"230",
"7816",
"9875",
"10695",
"11950",
"696",
"4728",
"13943",
"6228",
"1619",
"4430",
"1333",
"14519",
"14701",
"5663",
"7027",
"9673",
"6534",
"4111",
"5163",
"9968",
"7210",
"12377",
"11217",
"4748",
"6477",
"3093",
"3499",
"14764",
"14730",
"6316",
"235",
"566",
"8688",
"1880",
"3285",
"2513",
"14566",
"239",
"5912",
"11121",
"1933",
"209",
"11392",
"6552",
"4199",
"2975",
"7020",
"10909",
"3505",
"10978",
"10677",
"5108",
"6069",
"10166",
"15376",
"13728",
"10553",
"13137",
"538",
"8069",
"5598",
"15070",
"11817",
"12227",
"5303",
"6108",
"14703",
"2047",
"8241",
"561",
"7093",
"6716",
"12952",
"1976",
"8288",
"1142",
"831",
"8603",
"791",
"6713",
"3632",
"14163",
"12079",
"3472",
"13399",
"7948",
"1361",
"3248",
"190",
"1157",
"12967",
"13745",
"5920",
"2811",
"6271",
"4459",
"7124",
"8987",
"9713",
"15373",
"5099",
"14066",
"5641",
"5858",
"9614",
"9523",
"3168",
"1170",
"3184",
"8785",
"6161",
"1696",
"6620",
"5453",
"1381",
"7068",
"12579",
"11386",
"3967",
"3367",
"12727",
"8841",
"9628",
"3987",
"9746",
"6759",
"4116",
"11659",
"11799",
"5768",
"7651",
"5446",
"3542",
"1329",
"14683",
"5786",
"9779",
"3554",
"6951",
"11478",
"7067",
"12593",
"2534",
"14065",
"86",
"1883",
"14649",
"5672",
"12720",
"15254",
"13910",
"1526",
"3850",
"409",
"7119",
"6630",
"8639",
"2317",
"52",
"897",
"7658",
"4010",
"2727",
"2748",
"10013",
"4576",
"3951",
"925",
"1593",
"1872",
"10158",
"15075",
"14816",
"11292",
"353",
"4289",
"8772",
"4222",
"14944",
"14792",
"10730",
"1570",
"10076",
"7896",
"7062",
"8255",
"10675",
"6036",
"1468",
"10662",
"3943",
"4648",
"6038",
"3498",
"591",
"8685",
"2769",
"12919",
"6761",
"10788",
"562",
"2048",
"14685",
"9244",
"11533",
"5261",
"1500",
"13368",
"579",
"8592",
"3630",
"698",
"8512",
"11810",
"788",
"9313",
"11940",
"13911",
"9759",
"420",
"15403",
"2581",
"2228",
"14472",
"5333",
"10556",
"14528",
"2334",
"923",
"2869",
"12514",
"12367",
"5143",
"3623",
"5512",
"13487",
"4250",
"3027",
"6674",
"12097",
"8557",
"14313",
"5608",
"10756",
"7431",
"4266",
"2429",
"11632",
"65",
"8486",
"8537",
"9608",
"6675",
"13659",
"8020",
"478",
"14419",
"8100",
"682",
"10716",
"12500",
"7706",
"12058",
"1198",
"4208",
"2490",
"1394",
"537",
"13185",
"1858",
"14967",
"3380",
"857",
"3158",
"1205",
"5416",
"10616",
"6619",
"9305",
"646",
"3171",
"15155",
"11800",
"9564",
"2268",
"12760",
"4191",
"4065",
"3015",
"4799",
"10278",
"10092",
"14888",
"12139",
"4998",
"5289",
"6742",
"14628",
"7819",
"7065",
"4299",
"15025",
"5300",
"5211",
"13188",
"13749",
"11778",
"3543",
"1521",
"2576",
"8301",
"8353",
"3068",
"9710",
"10308",
"1188",
"6096",
"10980",
"8781",
"12976",
"4747",
"14063",
"14045",
"8619",
"14329",
"11845",
"8490",
"1340",
"1225",
"3110",
"2171",
"11921",
"3360",
"3676",
"7096",
"8225",
"11981",
"1646",
"3320",
"8748",
"11176",
"3958",
"9743",
"9330",
"2136",
"5975",
"6202",
"10779",
"14481",
"2473",
"1863",
"124",
"4572",
"2881",
"5523",
"9841",
"950",
"2093",
"5711",
"11866",
"902",
"2418",
"2348",
"15192",
"6383",
"14025",
"8062",
"11781",
"14980",
"10962",
"3381",
"9765",
"8116",
"358",
"1354",
"574",
"433",
"8165",
"14599",
"3046",
"704",
"13600",
"13516",
"5149",
"5103",
"5906",
"7617",
"13026",
"12600",
"9165",
"9388",
"3673",
"12576",
"9529",
"56",
"8372",
"5870",
"460",
"10922",
"10113",
"5503",
"10996",
"13486",
"4607",
"220",
"15406",
"10586",
"10413",
"11861",
"13344",
"9106",
"12745",
"8626",
"6890",
"10136",
"9798",
"12327",
"2617",
"2497",
"13539",
"6253",
"6501",
"6333",
"5397",
"1192",
"11975",
"3202",
"14842",
"14945",
"5451",
"11860",
"6564",
"3419",
"15117",
"5552",
"14731",
"5977",
"10407",
"13120",
"14918",
"11790",
"9543",
"763",
"3090",
"9960",
"7799",
"10660",
"13564",
"5667",
"15353",
"49",
"2671",
"9674",
"15172",
"12336",
"11492",
"11924",
"4115",
"7185",
"6685",
"3474",
"1718",
"599",
"10814",
"1922",
"10436",
"7538",
"1884",
"10807",
"11646",
"14235",
"15481",
"3316",
"2250",
"8656",
"14005",
"13980",
"12231",
"5376",
"1789",
"12307",
"11827",
"13715",
"2907",
"12036",
"10311",
"8521",
"1774",
"12888",
"9889",
"2003",
"12426",
"10213",
"4233",
"2790",
"2585",
"2488",
"14672",
"12778",
"3574",
"12193",
"14982",
"13454",
"11113",
"7170",
"3261",
"14173",
"2147",
"1804",
"7139",
"1388",
"751",
"11847",
"2243",
"8784",
"15221",
"14226",
"12293",
"8624",
"2992",
"9887",
"10876",
"7937",
"10622",
"1874",
"2133",
"8075",
"4055",
"11609",
"3113",
"12504",
"6673",
"5396",
"8004",
"12769",
"6164",
"10541",
"7861",
"8727",
"1541",
"5037",
"15204",
"7719",
"10453",
"6907",
"15109",
"5769",
"3178",
"5093",
"5984",
"5387",
"7488",
"8693",
"8018",
"9980",
"7375",
"10691",
"5779",
"11769",
"7287",
"7572",
"9116",
"5048",
"1005",
"216",
"5104",
"8615",
"8",
"7712",
"13817",
"634",
"10578",
"9894",
"368",
"2010",
"1078",
"5318",
"4543",
"11504",
"1909",
"3693",
"5918",
"10820",
"13549",
"3653",
"1276",
"3806",
"9938",
"11115",
"13218",
"12412",
"13270",
"577",
"681",
"680",
"10759",
"10438",
"3106",
"10233",
"1391",
"7801",
"14955",
"2359",
"11714",
"6294",
"11543",
"12373",
"800",
"15012",
"1095",
"1532",
"9531",
"2049",
"1940",
"11453",
"735",
"10408",
"7688",
"15293",
"3185",
"13183",
"10053",
"215",
"8687",
"3559",
"14435",
"3579",
"10178",
"12623",
"9896",
"3230",
"12741",
"4490",
"15175",
"15486",
"10063",
"6879",
"15032",
"11200",
"13680",
"1562",
"4526",
"9415",
"10454",
"1796",
"5316",
"15418",
"13501",
"10634",
"2214",
"7594",
"6446",
"9828",
"4914",
"5117",
"12571",
"2603",
"11265",
"2321",
"4393",
"5323",
"9925",
"12242",
"4357",
"8505",
"2860",
"1037",
"7518",
"9719",
"6071",
"6331",
"8388",
"2272",
"2388",
"4114",
"9835",
"11162",
"4419",
"1048",
"6267",
"9807",
"2672",
"13431",
"14092",
"7938",
"14650",
"4751",
"6224",
"12824",
"2377",
"5520",
"7308",
"11840",
"2233",
"12199",
"2634",
"9824",
"9995",
"313",
"6933",
"2650",
"7165",
"15467",
"7516",
"10292",
"768",
"3039",
"3794",
"5391",
"507",
"1251",
"6588",
"9667",
"7043",
"13857",
"15251",
"8503",
"10085",
"7052",
"1438",
"2856",
"628",
"9461",
"14974",
"12098",
"8024",
"8249",
"10078",
"11193",
"5022",
"13498",
"14475",
"9916",
"4834",
"13541",
"8747",
"2717",
"13918",
"2590",
"2379",
"1053",
"8208",
"14291",
"14919",
"3955",
"14310",
"12032",
"11838",
"10955",
"13000",
"4473",
"15297",
"8033",
"13651",
"10155",
"11238",
"8944",
"12925",
"1452",
"8314",
"7373",
"6551",
"7631",
"5100",
"11600",
"10615",
"14758",
"5296",
"2364",
"4542",
"6404",
"11110",
"2810",
"15466",
"7767",
"7898",
"12321",
"3216",
"2538",
"7209",
"2343",
"9168",
"486",
"2845",
"11338",
"6635",
"6750",
"2435",
"9510",
"4819",
"2327",
"13156",
"9636",
"5722",
"48",
"9833",
"1964",
"12005",
"12117",
"14354",
"11128",
"1647",
"4550",
"6123",
"13037",
"12744",
"6929",
"8259",
"943",
"6671",
"5484",
"7852",
"8059",
"10550",
"7752",
"5011",
"7369",
"7849",
"4411",
"2271",
"6155",
"6430",
"11240",
"15124",
"8283",
"14287",
"7966",
"10205",
"13475",
"8175",
"4743",
"11455",
"12863",
"12495",
"3636",
"2885",
"13507",
"8302",
"8799",
"15118",
"583",
"7115",
"14640",
"10940",
"6549",
"13718",
"2496",
"3974",
"6704",
"12270",
"15498",
"10777",
"5710",
"5871",
"13446",
"8394",
"5649",
"6544",
"13528",
"11650",
"5216",
"6680",
"15453",
"1435",
"9932",
"2938",
"15079",
"7606",
"7278",
"10664",
"13653",
"11842",
"4144",
"870",
"13683",
"4994",
"11945",
"7860",
"7864",
"7804",
"5903",
"4468",
"10933",
"12105",
"13359",
"4056",
"3816",
"13133",
"13738",
"7621",
"5945",
"2566",
"12386",
"9679",
"302",
"11018",
"10822",
"222",
"8445",
"15000",
"2764",
"1030",
"5107",
"12488",
"2801",
"2357",
"12834",
"7351",
"389",
"10350",
"6566",
"8911",
"714",
"5136",
"8309",
"15479",
"9194",
"7940",
"5365",
"13799",
"13898",
"12947",
"6199",
"9250",
"12493",
"6499",
"12066",
"2186",
"6133",
"8409",
"11082",
"11561",
"1038",
"706",
"7454",
"780",
"1507",
"12302",
"4584",
"9368",
"5050",
"4483",
"5465",
"12706",
"9412",
"4764",
"1119",
"8000",
"14789",
"15240",
"3357",
"10221",
"15001",
"14529",
"14110",
"2957",
"11264",
"8129",
"13664",
"347",
"4121",
"14715",
"5454",
"5915",
"10328",
"10524",
"10378",
"13411",
"425",
"7967",
"6821",
"7371",
"10872",
"11107",
"1248",
"5490",
"13213",
"6318",
"13389",
"9561",
"13926",
"7655",
"6214",
"3095",
"6296",
"7754",
"5974",
"13681",
"1045",
"3898",
"14193",
"5688",
"6264",
"13426",
"383",
"3571",
"13287",
"2831",
"5190",
"11406",
"922",
"8795",
"5130",
"6659",
"12642",
"501",
"10124",
"6169",
"12480",
"14060",
"2378",
"14185",
"8975",
"3265",
"11349",
"4429",
"2063",
"2880",
"13622",
"8280",
"9067",
"3451",
"1232",
"10900",
"10192",
"7597",
"6844",
"1490",
"6370",
"2754",
"12483",
"8455",
"13725",
"5708",
"3711",
"4013",
"5739",
"7853",
"13144",
"5274",
"6751",
"767",
"121",
"14784",
"10760",
"63",
"2424",
"13138",
"5169",
"6585",
"6067",
"12182",
"8003",
"793",
"4456",
"3726",
"10305",
"1693",
"6147",
"10743",
"3032",
"8897",
"6346",
"14782",
"8378",
"14470",
"15394",
"11990",
"26",
"4424",
"262",
"563",
"7773",
"6233",
"5671",
"13645",
"4390",
"7660",
"5145",
"906",
"2472",
"4603",
"15238",
"13202",
"14802",
"802",
"12445",
"13294",
"24",
"5940",
"2227",
"11396",
"9757",
"959",
"12063",
"1567",
"312",
"1837",
"14488",
"9427",
"13785",
"2870",
"8281",
"6615",
"11876",
"8966",
"13842",
"1223",
"3979",
"2531",
"9058",
"4005",
"6244",
"9730",
"14034",
"14214",
"8943",
"2212",
"4916",
"5354",
"14894",
"9222",
"2352",
"9237",
"8212",
"6232",
"3791",
"1271",
"7363",
"309",
"2960",
"12776",
"12082",
"12184",
"4432",
"4684",
"12075",
"13025",
"13612",
"15218",
"3558",
"10859",
"2198",
"8039",
"5788",
"14883",
"12211",
"1975",
"13601",
"4206",
"1345",
"6085",
"10733",
"11497",
"5725",
"6046",
"9761",
"1749",
"4598",
"3703",
"3334",
"286",
"9624",
"7221",
"5193",
"6597",
"14552",
"2013",
"8305",
"13165",
"5952",
"10417",
"5326",
"4907",
"10275",
"1635",
"2738",
"6601",
"382",
"11828",
"1023",
"6842",
"9198",
"5291",
"12208",
"9684",
"11754",
"4379",
"6304",
"1074",
"14983",
"44",
"14294",
"9317",
"14527",
"7831",
"10014",
"612",
"10243",
"5088",
"2287",
"9917",
"731",
"2181",
"7897",
"11596",
"1252",
"13888",
"15365",
"321",
"3115",
"11425",
"8491",
"1383",
"2847",
"2631",
"10064",
"7276",
"6242",
"7559",
"1116",
"8547",
"1648",
"10088",
"1807",
"13647",
"7059",
"14705",
"7679",
"3539",
"13998",
"14728",
"14578",
"8365",
"8493",
"10629",
"10281",
"10312",
"4848",
"1294",
"14526",
"13238",
"589",
"2246",
"6222",
"6941",
"9349",
"3287",
"12537",
"6856",
"4442",
"10722",
"14831",
"7477",
"9647",
"8542",
"6427",
"4802",
"13768",
"8125",
"10415",
"1297",
"4156",
"14851",
"508",
"11882",
"11694",
"10727",
"539",
"9169",
"4621",
"7359",
"13466",
"10617",
"1404",
"12415",
"10339",
"8130",
"5390",
"11323",
"6884",
"4130",
"6072",
"9043",
"12144",
"4278",
"1482",
"401",
"7730",
"3434",
"4394",
"12940",
"6508",
"11663",
"11723",
"1132",
"8597",
"10257",
"3271",
"1632",
"945",
"5403",
"8561",
"14756",
"1099",
"7401",
"1746",
"8081",
"10701",
"3776",
"12972",
"10699",
"2704",
"14455",
"13714",
"13837",
"4129",
"311",
"7133",
"7916",
"9789",
"8734",
"14239",
"8379",
"9129",
"13543",
"9583",
"12868",
"269",
"11689",
"12719",
"6559",
"4997",
"14917",
"3929",
"8946",
"13815",
"1704",
"13056",
"8407",
"1699",
"12608",
"1651",
"5933",
"71",
"15454",
"578",
"720",
"7347",
"13324",
"14755",
"2546",
"8203",
"3236",
"14887",
"1466",
"8573",
"6276",
"12407",
"1409",
"3978",
"4888",
"11328",
"4723",
"14857",
"1869",
"9653",
"3885",
"7256",
"185",
"5258",
"14666",
"4586",
"14711",
"1724",
"2983",
"14298",
"10960",
"12333",
"5922",
"291",
"14161",
"12017",
"14352",
"8658",
"6186",
"12177",
"1263",
"13006",
"4041",
"13172",
"11034",
"8872",
"9147",
"5083",
"4267",
"2470",
"254",
"6754",
"3151",
"10143",
"7593",
"1026",
"12985",
"14625",
"7793",
"9698",
"11156",
"4119",
"3846",
"3479",
"14569",
"1485",
"7883",
"2989",
"8810",
"9665",
"2313",
"109",
"839",
"14271",
"13084",
"1798",
"9545",
"11518",
"8782",
"6082",
"532",
"13184",
"1548",
"15201",
"3556",
"7416",
"8006",
"6917",
"3327",
"2651",
"11638",
"6757",
"7058",
"11298",
"5755",
"10458",
"4635",
"10169",
"2005",
"13106",
"4180",
"10721",
"10860",
"521",
"4205",
"4837",
"8031",
"1596",
"4678",
"5081",
"14064",
"2453",
"4003",
"12161",
"10386",
"13565",
"5355",
"3232",
"6893",
"14109",
"9353",
"11023",
"14188",
"14881",
"6859",
"10009",
"6697",
"9030",
"1625",
"6131",
"1063",
"11529",
"2981",
"9642",
"3361",
"350",
"3874",
"8675",
"1633",
"10557",
"8832",
"9068",
"8218",
"12261",
"12693",
"3952",
"1740",
"2691",
"1875",
"15217",
"13455",
"15041",
"2988",
"12164",
"10518",
"13092",
"14197",
"9318",
"6",
"4359",
"1642",
"6987",
"1087",
"9869",
"4882",
"4175",
"3904",
"11463",
"8476",
"3010",
"15231",
"6035",
"9467",
"8928",
"6994",
"15472",
"7030",
"6168",
"13867",
"2208",
"13261",
"1164",
"4060",
"10506",
"14502",
"3892",
"1947",
"2028",
"11401",
"7619",
"1917",
"2715",
"3192",
"5818",
"5720",
"2495",
"2557",
"13615",
"12892",
"4332",
"11206",
"9463",
"6121",
"14201",
"3867",
"9853",
"12917",
"6058",
"10889",
"6026",
"9033",
"440",
"739",
"4898",
"4202",
"11077",
"13109",
"3860",
"13437",
"14866",
"9974",
"15125",
"2131",
"13514",
"9586",
"6919",
"2925",
"3461",
"6696",
"7961",
"11257",
"12174",
"6221",
"10935",
"12354",
"10574",
"12748",
"10937",
"3897",
"192",
"1741",
"5321",
"9538",
"7985",
"5986",
"849",
"6519",
"858",
"12508",
"6764",
"13042",
"2119",
"582",
"9742",
"14413",
"7956",
"3165",
"2382",
"2503",
"13972",
"7928",
"3351",
"3812",
"12980",
"7004",
"6976",
"6899",
"13628",
"12277",
"14440",
"11602",
"14853",
"14933",
"13973",
"7813",
"5854",
"12887",
"5262",
"11620",
"6760",
"6980",
"2732",
"3675",
"7622",
"2260",
"6109",
"11569",
"1268",
"14155",
"8816",
"12823",
"3696",
"10440",
"6027",
"4307",
"14738",
"5250",
"2304",
"13436",
"12595",
"14691",
"1805",
"5049",
"654",
"8109",
"892",
"3446",
"7611",
"2336",
"10789",
"12669",
"12353",
"2000",
"4847",
"2280",
"584",
"6016",
"10640",
"3336",
"7344",
"11466",
"904",
"10684",
"11509",
"5351",
"6513",
"13055",
"7008",
"10511",
"11814",
"10545",
"12818",
"635",
"7637",
"3717",
"2946",
"10203",
"9496",
"6311",
"7721",
"6146",
"4775",
"2117",
"7330",
"14990",
"479",
"14469",
"12684",
"9769",
"11368",
"14424",
"14809",
"6583",
"2825",
"10381",
"14929",
"6165",
"10653",
"2441",
"2038",
"14240",
"6873",
"4988",
"8120",
"14433",
"1865",
"14644",
"15159",
"12605",
"13395",
"2608",
"12119",
"712",
"13591",
"9542",
"2998",
"390",
"5062",
"10964",
"408",
"13387",
"4365",
"6986",
"4963",
"2624",
"12705",
"12871",
"14160",
"12983",
"12666",
"6702",
"11045",
"12971",
"1788",
"9533",
"88",
"9322",
"15022",
"10741",
"4797",
"9380",
"8994",
"2529",
"2867",
"9689",
"5032",
"15476",
"8531",
"7273",
"11905",
"11750",
"1949",
"3912",
"7400",
"4164",
"3534",
"1823",
"14144",
"7979",
"10011",
"13105",
"4438",
"1262",
"761",
"13710",
"5179",
"13531",
"3970",
"8328",
"10994",
"13370",
"10170",
"7843",
"3365",
"5911",
"4948",
"10974",
"8415",
"7991",
"9382",
"6374",
"998",
"14051",
"14106",
"13561",
"11233",
"3493",
"31",
"14800",
"2659",
"2927",
"2309",
"11871",
"2813",
"7153",
"3207",
"14269",
"15301",
"13102",
"5762",
"8720",
"3020",
"417",
"2719",
"14132",
"3679",
"7361",
"1763",
"5662",
"14116",
"11236",
"832",
"4857",
"14176",
"6465",
"4146",
"5255",
"10539",
"6995",
"7237",
"11086",
"13721",
"12226",
"997",
"11397",
"10373",
"11514",
"4630",
"6730",
"14891",
"12018",
"12303",
"11360",
"15137",
"3930",
"6610",
"5514",
"12507",
"8376",
"5868",
"13990",
"15266",
"4826",
"9819",
"3546",
"8982",
"2213",
"6599",
"3707",
"15024",
"2024",
"1561",
"6861",
"5412",
"14186",
"8066",
"14390",
"9160",
"12910",
"60",
"9882",
"11146",
"2783",
"2875",
"12147",
"3159",
"2110",
"73",
"4725",
"5031",
"11900",
"5595",
"5533",
"6661",
"14327",
"15449",
"14505",
"13782",
"4714",
"12864",
"1977",
"229",
"8589",
"14490",
"12257",
"12670",
"6498",
"912",
"10140",
"4128",
"8634",
"14145",
"1864",
"13255",
"2238",
"990",
"8187",
"7099",
"13557",
"806",
"1667",
"13444",
"4104",
"13814",
"8566",
"7111",
"4744",
"11149",
"4183",
"4389",
"4691",
"8990",
"4923",
"11273",
"13974",
"2325",
"12451",
"905",
"13826",
"10083",
"6676",
"7772",
"13220",
"13089",
"9284",
"5293",
"3231",
"9274",
"13524",
"14618",
"7193",
"14906",
"8560",
"14202",
"15129",
"11498",
"3155",
"1942",
"13513",
"3594",
"9241",
"8034",
"1397",
"4342",
"2559",
"12884",
"8215",
"14211",
"13286",
"4538",
"5313",
"8667",
"2011",
"5309",
"12689",
"5276",
"790",
"4565",
"7227",
"4380",
"11037",
"8905",
"12893",
"7643",
"9816",
"7299",
"10086",
"13457",
"5080",
"11955",
"7640",
"9855",
"1905",
"2312",
"7311",
"7781",
"7654",
"8368",
"2204",
"699",
"4686",
"5195",
"11185",
"14323",
"1213",
"7676",
"5716",
"9992",
"10569",
"5798",
"3306",
"5472",
"6889",
"15220",
"3082",
"7743",
"2324",
"2121",
"5187",
"6399",
"10223",
"5834",
"1464",
"13510",
"11063",
"2550",
"6648",
"5938",
"4929",
"11505",
"2489",
"6470",
"12114",
"4516",
"108",
"8984",
"6870",
"1932",
"7980",
"10244",
"8308",
"12389",
"1871",
"5877",
"3339",
"12958",
"12253",
"7243",
"149",
"5670",
"957",
"4497",
"8402",
"14021",
"5424",
"8319",
"12548",
"13794",
"10007",
"1479",
"14538",
"12275",
"7766",
"3911",
"15197",
"6076",
"14793",
"156",
"14723",
"1497",
"4400",
"6996",
"1001",
"5352",
"6798",
"11467",
"8200",
"2004",
"11976",
"588",
"4700",
"14643",
"12153",
"11589",
"7357",
"6998",
"6001",
"7616",
"6541",
"246",
"13863",
"7696",
"7023",
"3678",
"6115",
"15432",
"14302",
"5389",
"15457",
"2766",
"5251",
"13746",
"10878",
"2202",
"14517",
"7214",
"9440",
"5070",
"8454",
"10915",
"9138",
"1109",
"4318",
"5173",
"2776",
"10108",
"1127",
"10706",
"10564",
"1764",
"11326",
"13822",
"5009",
"118",
"8099",
"9838",
"13350",
"1877",
"4794",
"2745",
"6789",
"13181",
"7319",
"5540",
"3359",
"465",
"13544",
"10531",
"12134",
"13900",
"9350",
"11142",
"7714",
"547",
"13231",
"7811",
"1207",
"8581",
"7340",
"431",
"10831",
"4730",
"283",
"179",
"10945",
"3072",
"5201",
"3512",
"12816",
"13078",
"2187",
"11578",
"4036",
"10309",
"11304",
"2222",
"3927",
"15442",
"3388",
"7012",
"5929",
"6653",
"14557",
"13950",
"4737",
"514",
"4417",
"2740",
"11896",
"5828",
"2421",
"13008",
"7525",
"15371",
"1631",
"13892",
"4500",
"7710",
"6451",
"5461",
"15188",
"753",
"12431",
"3902",
"11019",
"4053",
"11510",
"2434",
"11341",
"14181",
"11031",
"1802",
"13191",
"14224",
"14658",
"8802",
"5502",
"13419",
"14004",
"2711",
"12387",
"8960",
"11615",
"10883",
"9690",
"1066",
"8332",
"11749",
"10588",
"10685",
"6054",
"8046",
"11731",
"11550",
"4063",
"14396",
"6384",
"4975",
"7229",
"792",
"5205",
"13490",
"4838",
"1130",
"13631",
"8101",
"5426",
"14174",
"8467",
"14183",
"3181",
"4626",
"889",
"12658",
"5161",
"14200",
"3709",
"14158",
"14042",
"11150",
"13580",
"14580",
"8732",
"11260",
"14735",
"1283",
"11272",
"11908",
"2061",
"11404",
"4617",
"13550",
"13809",
"8359",
"6158",
"7575",
"9478",
"5432",
"8827",
"11061",
"4881",
"4176",
"11262",
"13769",
"15042",
"4544",
"1000",
"12886",
"2937",
"3786",
"5976",
"1706",
"9815",
"4032",
"13969",
"8705",
"1628",
"8007",
"14584",
"9554",
"8638",
"2043",
"8488",
"11773",
"15011",
"14958",
"5812",
"3721",
"13991",
"4941",
"9065",
"1441",
"363",
"6189",
"11995",
"4759",
"7190",
"174",
"12908",
"11424",
"90",
"3770",
"535",
"4100",
"1510",
"4280",
"6381",
"11014",
"5545",
"3959",
"4238",
"3599",
"647",
"10984",
"855",
"11025",
"12059",
"5564",
"12158",
"9984",
"11052",
"2283",
"2808",
"326",
"14985",
"8321",
"1467",
"4201",
"4309",
"1195",
"13095",
"14595",
"4511",
"1169",
"4153",
"3482",
"7874",
"14624",
"4858",
"5803",
"362",
"14804",
"5733",
"13119",
"285",
"4476",
"11281",
"276",
"13450",
"9340",
"11168",
"5028",
"5350",
"7018",
"13896",
"12040",
"1850",
"6457",
"14593",
"14243",
"14206",
"255",
"9559",
"8236",
"2689",
"7263",
"14815",
"3509",
"14125",
"7261",
"4443",
"7807",
"4301",
"14954",
"10747",
"13577",
"6094",
"3830",
"4879",
"2986",
"6701",
"14266",
"13619",
"13989",
"5086",
"13047",
"10035",
"2505",
"13913",
"4212",
"13365",
"10152",
"851",
"10913",
"5302",
"14189",
"627",
"7076",
"12814",
"1317",
"11999",
"12397",
"14164",
"2232",
"7895",
"4646",
"12594",
"9338",
"11261",
"8993",
"3480",
"9146",
"9128",
"1103",
"942",
"14016",
"10467",
"14522",
"1310",
"14763",
"12655",
"2556",
"5003",
"9231",
"14315",
"13694",
"4169",
"13916",
"15087",
"3526",
"3098",
"10732",
"8430",
"4703",
"13241",
"10873",
"10370",
"8954",
"7155",
"3235",
"11295",
"13706",
"6928",
"7780",
"13886",
"6312",
"5315",
"15214",
"12662",
"9454",
"13982",
"752",
"10561",
"3582",
"10576",
"14198",
"201",
"6562",
"8945",
"13860",
"2395",
"567",
"3127",
"5677",
"11325",
"12402",
"3661",
"542",
"6219",
"12479",
"1590",
"10402",
"8983",
"4068",
"10353",
"10708",
"7697",
"9589",
"8922",
"12607",
"4512",
"5338",
"5435",
"3790",
"10696",
"14361",
"7253",
"4674",
"3872",
"10694",
"4742",
"8233",
"5574",
"820",
"5508",
"15085",
"4922",
"868",
"11068",
"2834",
"688",
"15239",
"15331",
"3921",
"14697",
"13620",
"13278",
"9122",
"4675",
"14740",
"14075",
"9711",
"6897",
"3971",
"4644",
"5570",
"9825",
"14718",
"8513",
"3989",
"8714",
"13383",
"13017",
"7245",
"9826",
"1059",
"5547",
"4672",
"12020",
"4095",
"8552",
"12835",
"2623",
"765",
"3835",
"13284",
"1453",
"2935",
"2493",
"9788",
"6075",
"10602",
"3070",
"12687",
"7080",
"4911",
"10348",
"15430",
"14912",
"1292",
"8145",
"1120",
"7042",
"3048",
"1332",
"12986",
"9577",
"5978",
"5772",
"6920",
"14478",
"10122",
"11693",
"5600",
"11631",
"894",
"8649",
"11891",
"3163",
"1720",
"7810",
"2207",
"814",
"11879",
"3442",
"13638",
"6087",
"5260",
"10563",
"1184",
"9611",
"1398",
"7536",
"8754",
"12019",
"15274",
"953",
"8729",
"5441",
"958",
"45",
"709",
"636",
"12243",
"6030",
"11008",
"8124",
"686",
"9472",
"9110",
"8770",
"11608",
"12047",
"4204",
"15308",
"9032",
"3066",
"15206",
"14388",
"3555",
"11727",
"11457",
"11626",
"7486",
"13919",
"284",
"3200",
"5681",
"1348",
"7567",
"8610",
"993",
"10971",
"2151",
"4482",
"6731",
"1449",
"6127",
"5925",
"14712",
"10409",
"4810",
"2354",
"4391",
"11321",
"3862",
"12216",
"7405",
"7724",
"15357",
"13732",
"6197",
"8588",
"1712",
"3708",
"12408",
"4279",
"7342",
"4749",
"15462",
"5715",
"1331",
"236",
"10230",
"5257",
"14516",
"2683",
"14129",
"3996",
"8298",
"11496",
"6261",
"11066",
"11997",
"10512",
"1044",
"4720",
"4958",
"7876",
"6684",
"11729",
"7732",
"7510",
"8894",
"7465",
"5437",
"11118",
"12111",
"9263",
"1311",
"8196",
"6339",
"3722",
"7317",
"373",
"11895",
"12966",
"2864",
"3368",
"657",
"1945",
"856",
"3123",
"10630",
"6801",
"5881",
"10320",
"13422",
"3281",
"6111",
"2939",
"8789",
"14604",
"1946",
"5091",
"15149",
"10525",
"4167",
"6088",
"6365",
"2492",
"13899",
"6587",
"2637",
"2682",
"7403",
"2022",
"2987",
"5669",
"11636",
"5971",
"3215",
"10072",
"15226",
"2555",
"1890",
"8475",
"10868",
"6797",
"14168",
"4778",
"4323",
"13404",
"528",
"10844",
"2863",
"10392",
"3637",
"13644",
"4616",
"5923",
"12235",
"13816",
"416",
"3330",
"12313",
"9784",
"4428",
"1014",
"13453",
"14629",
"193",
"9635",
"5054",
"10190",
"9726",
"13155",
"2627",
"1885",
"11007",
"13848",
"8220",
"7996",
"10022",
"10481",
"2122",
"2916",
"8936",
"11522",
"12370",
"11701",
"7943",
"9629",
"14657",
"7341",
"12471",
"14146",
"9844",
"5751",
"12821",
"10711",
"13216",
"13460",
"10443",
"8148",
"12704",
"5002",
"14689",
"5235",
"13927",
"10049",
"13775",
"2224",
"12448",
"2258",
"10546",
"4642",
"14546",
"14307",
"1165",
"9764",
"11102",
"1036",
"6525",
"5930",
"3307",
"405",
"6868",
"674",
"12781",
"4813",
"5691",
"13849",
"8274",
"1745",
"6063",
"2619",
"14316",
"8633",
"15307",
"8910",
"12535",
"10354",
"9802",
"12279",
"9497",
"6245",
"9339",
"1551",
"4938",
"58",
"4486",
"10276",
"3520",
"4304",
"5851",
"13529",
"13883",
"12636",
"9417",
"13744",
"7003",
"781",
"4768",
"1725",
"12960",
"9175",
"8286",
"534",
"11877",
"3538",
"9141",
"979",
"13476",
"5639",
"5367",
"10835",
"12946",
"5491",
"12443",
"3560",
"11715",
"6523",
"8654",
"1307",
"7506",
"13783",
"6581",
"13570",
"9682",
"3406",
"10433",
"10153",
"10364",
"725",
"2996",
"4979",
"2600",
"2108",
"6205",
"8679",
"12624",
"81",
"8179",
"6022",
"3913",
"14950",
"2145",
"12505",
"6048",
"3517",
"7564",
"6565",
"13629",
"13709",
"9191",
"4583",
"10771",
"1477",
"14446",
"11591",
"1948",
"7053",
"7746",
"2281",
"14938",
"13626",
"13859",
"779",
"11746",
"10161",
"3013",
"14232",
"8373",
"14477",
"2461",
"11859",
"11385",
"12202",
"14487",
"3052",
"10384",
"14212",
"11637",
"4441",
"4784",
"9464",
"8221",
"5224",
"9395",
"3407",
"274",
"12169",
"14808",
"6136",
"13327",
"14182",
"6148",
"11956",
"1517",
"14724",
"640",
"7073",
"13217",
"7235",
"1396",
"6940",
"4422",
"11420",
"7725",
"9343",
"10005",
"6256",
"12015",
"8040",
"3004",
"8912",
"6515",
"14761",
"5700",
"9139",
"15223",
"12403",
"10377",
"12582",
"14838",
"380",
"14598",
"9795",
"10647",
"695",
"12212",
"2387",
"9860",
"8874",
"15052",
"344",
"6229",
"3757",
"7206",
"14597",
"4268",
"10805",
"8052",
"5460",
"4772",
"4426",
"8651",
"13481",
"1064",
"6560",
"642",
"10359",
"14771",
"1352",
"15091",
"6462",
"4125",
"2886",
"2735",
"10301",
"7731",
"1566",
"9271",
"1995",
"777",
"12547",
"11270",
"9927",
"1105",
"8400",
"8585",
"11324",
"2082",
"10135",
"4645",
"4131",
"3768",
"10510",
"12418",
"14418",
"8941",
"15364",
"1549",
"10393",
"388",
"7872",
"14839",
"12633",
"14695",
"4649",
"3255",
"10079",
"4493",
"4259",
"766",
"11575",
"3870",
"13194",
"15391",
"7709",
"9680",
"9442",
"4294",
"7222",
"555",
"7946",
"7140",
"7475",
"6954",
"12249",
"349",
"15499",
"5078",
"3864",
"7973",
"719",
"8858",
"2373",
"10118",
"2883",
"2245",
"10388",
"11868",
"12977",
"13318",
"4862",
"7132",
"13272",
"15313",
"3916",
"1609",
"4353",
"11786",
"2274",
"3838",
"9278",
"6631",
"1663",
"8216",
"1463",
"3433",
"9897",
"4002",
"3455",
"7561",
"3260",
"10357",
"5919",
"12840",
"6364",
"1832",
"11797",
"4447",
"14465",
"12913",
"9142",
"12853",
"6056",
"5089",
"11970",
"7381",
"3784",
"7701",
"1634",
"5418",
"10665",
"14196",
"15465",
"2568",
"10335",
"9453",
"3662",
"11855",
"6904",
"6855",
"13410",
"754",
"13852",
"9656",
"10317",
"649",
"9792",
"10544",
"5063",
"3213",
"15255",
"1067",
"10074",
"4102",
"11640",
"1069",
"1112",
"1695",
"290",
"8457",
"7847",
"2216",
"5371",
"5816",
"4162",
"13702",
"5518",
"10487",
"9211",
"9782",
"4427",
"295",
"11642",
"7844",
"8646",
"7687",
"1008",
"5014",
"9579",
"6887",
"11835",
"13758",
"2527",
"3659",
"12492",
"2408",
"6254",
"10232",
"8252",
"11105",
"4708",
"11869",
"12875",
"13739",
"13905",
"5679",
"2090",
"2858",
"12556",
"4765",
"1426",
"13076",
"14283",
"1400",
"7049",
"2755",
"1280",
"14788",
"6120",
"8210",
"1057",
"13127",
"9088",
"12970",
"14951",
"8487",
"568",
"9885",
"13322",
"18",
"1420",
"8825",
"7988",
"10748",
"5844",
"6958",
"10181",
"12080",
"9474",
"6717",
"3363",
"5656",
"1710",
"14882",
"6605",
"3057",
"12589",
"11013",
"3634",
"12619",
"5633",
"3573",
"5902",
"10216",
"5610",
"4682",
"2449",
"9188",
"4039",
"14750",
"9154",
"5625",
"13038",
"3275",
"397",
"122",
"9631",
"14389",
"6156",
"3817",
"4657",
"4270",
"1020",
"13844",
"10051",
"8539",
"13830",
"10864",
"10047",
"8971",
"13143",
"7252",
"10854",
"12337",
"15095",
"2678",
"2330",
"15015",
"3153",
"14485",
"2165",
"476",
"3277",
"3889",
"6721",
"9117",
"4288",
"10380",
"10643",
"7275",
"7016",
"114",
"4092",
"4824",
"4227",
"4592",
"5994",
"7175",
"5860",
"8957",
"13618",
"10637",
"10981",
"1050",
"7505",
"10059",
"12751",
"8642",
"14989",
"7207",
"1328",
"701",
"4983",
"7700",
"3845",
"6787",
"6767",
"9803",
"10403",
"10369",
"11893",
"13941",
"14113",
"2023",
"13855",
"3908",
"986",
"453",
"11096",
"3878",
"10258",
"5341",
"17",
"5278",
"13402",
"6938",
"2252",
"4258",
"10520",
"4325",
"14355",
"11221",
"15417",
"13642",
"7045",
"7761",
"1792",
"438",
"11005",
"9214",
"2486",
"2169",
"3730",
"11776",
"15464",
"587",
"10590",
"3129",
"7478",
"7791",
"13711",
"1413",
"14452",
"9953",
"15080",
"8155",
"6539",
"6667",
"14120",
"3249",
"13633",
"9424",
"4079",
"5569",
"12926",
"6132",
"11954",
"2807",
"4806",
"11741",
"12414",
"2734",
"1684",
"4161",
"12640",
"8814",
"14767",
"7456",
"3087",
"11178",
"9508",
"2367",
"4604",
"6799",
"666",
"9208",
"5869",
"2017",
"2721",
"4507",
"165",
"9558",
"10073",
"2417",
"9233",
"8044",
"13348",
"5797",
"186",
"15500",
"2411",
"11531",
"11279",
"3358",
"11685",
"7196",
"8219",
"913",
"8453",
"4629",
"9449",
"9650",
"13503",
"12462",
"509",
"7691",
"13158",
"9738",
"6524",
"2573",
"14139",
"10218",
"8728",
"10827",
"1158",
"7055",
"12599",
"8287",
"9718",
"2392",
"12513",
"10490",
"14425",
"8144",
"10",
"4368",
"9166",
"15108",
"2420",
"41",
"15027",
"54",
"14399",
"13554",
"3895",
"7601",
"2630",
"12961",
"9325",
"13968",
"15404",
"6851",
"13731",
"661",
"8284",
"4075",
"2065",
"8719",
"13753",
"15455",
"3102",
"8408",
"12362",
"5112",
"4792",
"15420",
"7302",
"5402",
"1697",
"9796",
"7633",
"8446",
"13079",
"5121",
"7315",
"9298",
"11038",
"10592",
"12228",
"4910",
"5488",
"9458",
"2710",
"9625",
"10887",
"11227",
"11923",
"5617",
"46",
"13767",
"7796",
"12309",
"12232",
"6913",
"10866",
"7502",
"12475",
"7283",
"213",
"3995",
"2779",
"12210",
"1323",
"6710",
"6744",
"9512",
"3193",
"9770",
"8828",
"8695",
"969",
"12295",
"6832",
"10020",
"9346",
"4546",
"9259",
"5345",
"5879",
"5951",
"5245",
"3008",
"6182",
"573",
"12166",
"6059",
"1743",
"6628",
"12382",
"11",
"9562",
"4021",
"12324",
"4782",
"944",
"6644",
"1851",
"8250",
"7778",
"2984",
"8198",
"5325",
"14114",
"14131",
"13321",
"9066",
"13939",
"13275",
"7108",
"10834",
"5631",
"11820",
"102",
"7580",
"1337",
"8668",
"15285",
"12009",
"14105",
"8976",
"4736",
"5775",
"6024",
"7539",
"2041",
"5964",
"3240",
"1180",
"3487",
"7542",
"12667",
"10778",
"12318",
"1509",
"12123",
"12014",
"9441",
"5824",
"11991",
"11499",
"1006",
"7927",
"6258",
"5596",
"11033",
"13812",
"9229",
"2355",
"9500",
"8243",
"8431",
"611",
"30",
"10766",
"1146",
"1901",
"9149",
"1414",
"13429",
"14395",
"8533",
"9685",
"110",
"14751",
"7659",
"2511",
"14843",
"337",
"9176",
"13623",
"12587",
"3999",
"11785",
"5515",
"9431",
"7618",
"11020",
"5899",
"1222",
"8746",
"14227",
"14371",
"5784",
"8660",
"5900",
"14909",
"13723",
"9348",
"9293",
"12700",
"8097",
"893",
"2977",
"2229",
"10555",
"11967",
"13257",
"10382",
"6936",
"2081",
"1664",
"2096",
"15072",
"6810",
"2703",
"11593",
"9115",
"5735",
"8189",
"12375",
"15463",
"6478",
"14544",
"7922",
"9634",
"6746",
"11926",
"3563",
"3485",
"2902",
"6554",
"8149",
"11051",
"2026",
"13630",
"13362",
"95",
"7952",
"11124",
"14055",
"2944",
"4338",
"2575",
"9506",
"13259",
"3541",
"14072",
"12259",
"12192",
"11286",
"862",
"2707",
"5061",
"1028",
"11182",
"4045",
"2936",
"13361",
"14162",
"1265",
"12661",
"10990",
"11058",
"5445",
"4707",
"2244",
"6020",
"8570",
"11241",
"4650",
"4822",
"14011",
"4733",
"268",
"7082",
"11160",
"6830",
"3146",
"9905",
"14646",
"4337",
"8002",
"1035",
"7094",
"12125",
"4773",
"9527",
"10997",
"12138",
"1027",
"8345",
"3154",
"6915",
"196",
"8568",
"4853",
"5299",
"12296",
"12533",
"7543",
"8895",
"6177",
"3803",
"1270",
"1365",
"13209",
"15038",
"1833",
"1338",
"11667",
"14824",
"9948",
"14264",
"14963",
"1301",
"1304",
"6371",
"11449",
"13955",
"7553",
"1429",
"4803",
"12188",
"9522",
"1021",
"4628",
"11108",
"2726",
"1522",
"1150",
"9468",
"7690",
"13797",
"12160",
"5859",
"11698",
"9143",
"1707",
"13021",
"10332",
"10387",
"3390",
"4283",
"2027",
"9466",
"14693",
"14096",
"3014",
"378",
"1070",
"9236",
"7858",
"429",
"1266",
"82",
"7605",
"4939",
"4917",
"15250",
"10037",
"6536",
"7057",
"2390",
"6454",
"275",
"3164",
"8562",
"5579",
"1408",
"9494",
"8514",
"3937",
"359",
"3731",
"5263",
"2731",
"8863",
"896",
"374",
"11189",
"1873",
"6748",
"2865",
"11753",
"8956",
"4562",
"2464",
"214",
"14679",
"15114",
"12603",
"1134",
"7129",
"9507",
"13282",
"4329",
"11886",
"13649",
"5587",
"10837",
"5966",
"6914",
"15234",
"3677",
"9571",
"5124",
"8364",
"7776",
"5170",
"678",
"12644",
"4334",
"1920",
"1296",
"11095",
"12869",
"9990",
"9593",
"5789",
"8011",
"8710",
"10605",
"10372",
"2196",
"4596",
"11980",
"1952",
"5875",
"3981",
"2688",
"7150",
"13065",
"726",
"10197",
"9419",
"3243",
"7066",
"11479",
"7223",
"3939",
"1881",
"13409",
"10010",
"4763",
"1914",
"11483",
"8176",
"4254",
"926",
"11363",
"1031",
"9892",
"13819",
"1002",
"12601",
"5555",
"9525",
"3544",
"7034",
"11858",
"2890",
"10131",
"692",
"5955",
"9286",
"3025",
"8450",
"12538",
"10315",
"12314",
"12399",
"7171",
"13223",
"2648",
"4485",
"8485",
"5771",
"2338",
"7388",
"8786",
"12598",
"9837",
"10422",
"10645",
"10499",
"4750",
"6060",
"5519",
"14822",
"10466",
"4284",
"14157",
"2561",
"10144",
"10264",
"10620",
"6356",
"1436",
"5386",
"5075",
"8273",
"2199",
"11712",
"14591",
"253",
"6720",
"4791",
"14828",
"12085",
"14154",
"10012",
"7560",
"967",
"11837",
"756",
"11532",
"4286",
"12810",
"8395",
"10880",
"5908",
"1765",
"4964",
"164",
"4492",
"3092",
"2950",
"9321",
"393",
"12609",
"2615",
"11519",
"11988",
"1278",
"13970",
"4263",
"3829",
"13417",
"1376",
"11371",
"11506",
"414",
"8964",
"4369",
"117",
"11500",
"12771",
"7173",
"317",
"3778",
"4423",
"7100",
"12774",
"12989",
"14292",
"1636",
"3161",
"7413",
"4272",
"8578",
"5486",
"14147",
"12007",
"404",
"11538",
"1661",
"12106",
"14879",
"1062",
"8867",
"12803",
"10781",
"7634",
"4047",
"3141",
"4458",
"10468",
"9455",
"11143",
"10171",
"13608",
"7350",
"14151",
"10848",
"4264",
"2583",
"5281",
"10318",
"7570",
"10670",
"13737",
"2948",
"6953",
"9073",
"8644",
"14137",
"12787",
"12384",
"13999",
"11657",
"13933",
"11229",
"13117",
"2416",
"14760",
"11901",
"6972",
"7546",
"9702",
"7325",
"4344",
"7770",
"4704",
"9299",
"2394",
"11653",
"7537",
"987",
"707",
"1091",
"6617",
"4277",
"14237",
"4947",
"4891",
"9768",
"14119",
"3143",
"1965",
"4220",
"5791",
"4690",
"745",
"11916",
"7061",
"9172",
"13685",
"15090",
"7596",
"4767",
"3034",
"11414",
"15277",
"4633",
"8721",
"8813",
"2893",
"4406",
"11517",
"8292",
"227",
"36",
"384",
"1374",
"3740",
"12553",
"2123",
"7225",
"12380",
"5963",
"12906",
"4563",
"5516",
"5153",
"3091",
"3182",
"3567",
"3827",
"15227",
"1306",
"5777",
"2746",
"1015",
"12688",
"4038",
"13428",
"407",
"3572",
"5114",
"12698",
"2674",
"6607",
"11444",
"3432",
"2491",
"11850",
"6483",
"6795",
"4833",
"11795",
"1937",
"5322",
"14220",
"7707",
"13346",
"667",
"8889",
"14445",
"8036",
"1569",
"12681",
"4293",
"6823",
"11475",
"13542",
"2750",
"10528",
"2319",
"559",
"5138",
"6226",
"8934",
"2789",
"4367",
"14284",
"12006",
"14401",
"4067",
"7240",
"10488",
"7533",
"9152",
"2601",
"6864",
"10260",
"5509",
"2665",
"11131",
"2308",
"10498",
"9136",
"280",
"11556",
"10967",
"14247",
"11708",
"3410",
"11678",
"4726",
"7074",
"11129",
"10589",
"1110",
"526",
"5521",
"1910",
"10492",
"2923",
"5275",
"1136",
"9064",
"1424",
"14817",
"3223",
"3321",
"4071",
"9144",
"9023",
"2092",
"7300",
"9541",
"11103",
"11011",
"2818",
"3998",
"9901",
"519",
"2749",
"7484",
"2020",
"5497",
"10966",
"12637",
"8978",
"10065",
"10803",
"7352",
"4656",
"4697",
"9766",
"11560",
"11515",
"9556",
"9314",
"8950",
"14442",
"2591",
"9666",
"8859",
"2415",
"6656",
"11782",
"11254",
"544",
"7083",
"12927",
"15190",
"12838",
"3928",
"4488",
"4296",
"2292",
"10149",
"8073",
"2675",
"1061",
"11383",
"15265",
"9524",
"10105",
"5151",
"14925",
"9535",
"14848",
"11641",
"8545",
"2509",
"2539",
"11658",
"6727",
"15478",
"7718",
"119",
"3793",
"9285",
"3600",
"9643",
"6531",
"1722",
"8211",
"12780",
"4509",
"10509",
"676",
"14525",
"4472",
"1406",
"9557",
"11951",
"305",
"3918",
"5550",
"2862",
"10225",
"12572",
"4830",
"1422",
"14741",
"2952",
"3269",
"5621",
"5487",
"9540",
"4798",
"11684",
"10920",
"15350",
"5653",
"1171",
"14127",
"4689",
"3735",
"15171",
"9985",
"8253",
"3886",
"9481",
"12929",
"8507",
"5752",
"219",
"536",
"2947",
"2374",
"4027",
"14972",
"10650",
"70",
"1682",
"4150",
"5020",
"9662",
"1029",
"13602",
"13682",
"7138",
"1161",
"13774",
"11824",
"15164",
"10429",
"1603",
"11275",
"6382",
"10363",
"3036",
"4756",
"297",
"7391",
"684",
"14821",
"11965",
"9606",
"10184",
"1125",
"9961",
"12736",
"4012",
"13856",
"8741",
"9843",
"11079",
"12701",
"12214",
"8555",
"3900",
"14928",
"11571",
"2625",
"4457",
"6125",
"1958",
"462",
"7685",
"13313",
"1163",
"15427",
"2462",
"12657",
"13027",
"1982",
"12485",
"3777",
"5841",
"9376",
"11314",
"5761",
"7909",
"1542",
"908",
"14348",
"9170",
"3237",
"7191",
"10507",
"6729",
"14558",
"9181",
"7158",
"5541",
"11767",
"8885",
"4058",
"11521",
"9183",
"717",
"11320",
"266",
"14111",
"4174",
"7321",
"3986",
"10389",
"12463",
"8551",
"4809",
"12315",
"1990",
"9877",
"2180",
"13235",
"4",
"4096",
"3473",
"325",
"2690",
"15016",
"1316",
"3001",
"3901",
"3310",
"1970",
"1327",
"7551",
"2166",
"15132",
"3966",
"7589",
"8761",
"13571",
"3564",
"10639",
"1605",
"318",
"512",
"14417",
"10212",
"10019",
"1944",
"14862",
"1516",
"10032",
"10475",
"5294",
"4807",
"15460",
"1795",
"10690",
"4680",
"4534",
"14531",
"3100",
"11852",
"982",
"12359",
"5494",
"9670",
"5757",
"11961",
"12090",
"12482",
"11686",
"5405",
"2982",
"14596",
"10093",
"1235",
"1122",
"2910",
"9536",
"11423",
"360",
"2985",
"6248",
"10296",
"1072",
"13098",
"8008",
"1405",
"2887",
"4020",
"2795",
"13288",
"1767",
"9599",
"8201",
"895",
"11973",
"3346",
"7647",
"8913",
"11784",
"10390",
"9280",
"14288",
"12262",
"9428",
"3799",
"4155",
"4312",
"750",
"6031",
"9834",
"11310",
"1291",
"12340",
"1508",
"12731",
"7091",
"4622",
"2838",
"8861",
"3019",
"11920",
"6342",
"2742",
"2574",
"14770",
"3813",
"9041",
"4571",
"3553",
"4702",
"11986",
"15126",
"1330",
"10214",
"13011",
"8925",
"12544",
"2237",
"4717",
"12916",
"7886",
"4820",
"13200",
"1315",
"1312",
"7366",
"8494",
"5972",
"9303",
"12449",
"15473",
"622",
"6705",
"11805",
"12434",
"11766",
"8330",
"3924",
"1427",
"3412",
"1102",
"5492",
"5905",
"12154",
"15235",
"1367",
"3172",
"1256",
"5105",
"2353",
"10262",
"1216",
"6135",
"548",
"2993",
"12091",
"4608",
"10137",
"9620",
"4790",
"13141",
"7260",
"4712",
"1854",
"2442",
"12140",
"598",
"12962",
"4328",
"9410",
"11702",
"3094",
"8529",
"1140",
"4339",
"4721",
"11867",
"2934",
"1719",
"633",
"6973",
"15076",
"10663",
"15141",
"15056",
"4606",
"2071",
"3023",
"1727",
"9617",
"4433",
"12588",
"2536",
"10632",
"15461",
"12245",
"7441",
"1674",
"11134",
"3408",
"7063",
"4631",
"10351",
"10001",
"11085",
"3450",
"134",
"12429",
"9329",
"978",
"9187",
"10367",
"5364",
"1114",
"256",
"5180",
"983",
"3056",
"669",
"5369",
"15474",
"14101",
"4991",
"14865",
"2120",
"4554",
"5077",
"1246",
"3691",
"3592",
"1151",
"5071",
"3073",
"13107",
"15167",
"581",
"13071",
"10496",
"5699",
"10912",
"15337",
"9476",
"9179",
"14426",
"2263",
"11670",
"6124",
"15283",
"10395",
"4343",
"4841",
"5131",
"3030",
"1971",
"2265",
"1269",
"10649",
"13652",
"11299",
"7250",
"8403",
"3514",
"4676",
"11445",
"11458",
"2852",
"15451",
"7203",
"888",
"7880",
"8579",
"590",
"5292",
"600",
"7310",
"13452",
"5026",
"324",
"14807",
"7112",
"11126",
"10904",
"10627",
"11590",
"9235",
"4893",
"12456",
"6433",
"2448",
"4118",
"12045",
"2068",
"4076",
"703",
"2127",
"1960",
"1694",
"8381",
"13944",
"7406",
"3180",
"13087",
"489",
"15039",
"11516",
"8765",
"5765",
"4515",
"8382",
"11100",
"14872",
"4514",
"10337",
"6957",
"1903",
"13145",
"11127",
"6533",
"2723",
"10621",
"12152",
"1589",
"386",
"1626",
"3890",
"4698",
"14376",
"14152",
"3511",
"1060",
"12730",
"3548",
"4207",
"14776",
"12425",
"399",
"4780",
"11645",
"2184",
"12794",
"5506",
"14245",
"1895",
"1457",
"11284",
"3695",
"3985",
"4679",
"617",
"2019",
"1476",
"7267",
"2724",
"15160",
"3244",
"14603",
"3771",
"9316",
"3652",
"11761",
"2160",
"10583",
"12322",
"12845",
"12288",
"14783",
"7264",
"2205",
"10970",
"8525",
"11004",
"12664",
"15178",
"6089",
"4273",
"15492",
"13869",
"687",
"9495",
"2882",
"8926",
"12626",
"7288",
"8489",
"10437",
"14937",
"5125",
"2069",
"14387",
"5528",
"9849",
"9063",
"7141",
"7964",
"10757",
"12973",
"12466",
"9763",
"9173",
"5738",
"1113",
"8502",
"975",
"12316",
"10500",
"3656",
"3877",
"9601",
"2034",
"8248",
"11006",
"3488",
"14818",
"13086",
"2652",
"9694",
"5495",
"6138",
"12648",
"2775",
"8385",
"5242",
"10838",
"14702",
"853",
"1597",
"3088",
"3195",
"4727",
"4159",
"4170",
"11390",
"4972",
"13010",
"12742",
"14503",
"3638",
"7104",
"5750",
"12712",
"5524",
"3586",
"3449",
"3540",
"11247",
"665",
"2289",
"7663",
"14448",
"1016",
"14102",
"4421",
"2518",
"348",
"10400",
"4685",
"4080",
"5865",
"4553",
"14749",
"14605",
"9502",
"3973",
"5568",
"3189",
"370",
"1298",
"6010",
"9104",
"10863",
"2018",
"14736",
"3612",
"1375",
"11604",
"1373",
"15270",
"1167",
"8183",
"5374",
"907",
"7224",
"12070",
"13678",
"5998",
"2270",
"778",
"5501",
"12765",
"14142",
"6122",
"5855",
"1100",
"3403",
"5872",
"11059",
"8641",
"14880",
"4711",
"3767",
"2569",
"9279",
"10368",
"5076",
"4796",
"5368",
"4479",
"13337",
"202",
"4310",
"1175",
"13009",
"14325",
"1101",
"8963",
"10503",
"12374",
"1253",
"6476",
"12898",
"335",
"14623",
"9594",
"10325",
"2658",
"7056",
"2487",
"1308",
"8599",
"1033",
"885",
"1838",
"4817",
"188",
"8788",
"8013",
"9398",
"15326",
"13838",
"14061",
"6037",
"15177",
"9055",
"4710",
"11280",
"850",
"13023",
"3404",
"406",
"8819",
"9582",
"12671",
"9049",
"2185",
"15475",
"1295",
"75",
"14100",
"7469",
"10084",
"12284",
"2285",
"12585",
"14769",
"7142",
"10635",
"8881",
"10762",
"4306",
"12230",
"5499",
"12453",
"9436",
"12674",
"6011",
"10394",
"14073",
"14023",
"10444",
"1242",
"4502",
"992",
"11690",
"6642",
"3796",
"4549",
"11055",
"5850",
"5556",
"6937",
"9707",
"387",
"917",
"2423",
"15031",
"4996",
"11758",
"15189",
"13707",
"1159",
"195",
"3210",
"11043",
"12739",
"9492",
"11732",
"1172",
"7215",
"981",
"9632",
"327",
"13221",
"10071",
"645",
"9217",
"12767",
"4331",
"14923",
"1279",
"11354",
"13418",
"2206",
"13894",
"10754",
"6657",
"9668",
"9145",
"7149",
"4109",
"15484",
"12043",
"14543",
"4412",
"15471",
"10999",
"4190",
"166",
"1019",
"10707",
"13688",
"11340",
"4732",
"9644",
"244",
"5544",
"710",
"13760",
"3857",
"7717",
"14757",
"12464",
"11709",
"2544",
"9163",
"10026",
"9471",
"4475",
"4564",
"1098",
"10998",
"4269",
"2660",
"8699",
"10159",
"4705",
"9488",
"6550",
"6896",
"3839",
"9192",
"2592",
"9604",
"450",
"976",
"4808",
"4069",
"11042",
"4461",
"9193",
"1363",
"8968",
"900",
"13396",
"4221",
"47",
"12053",
"1757",
"4787",
"4487",
"4662",
"14300",
"8375",
"5144",
"14171",
"1638",
"7429",
"12621",
"5288",
"11831",
"1956",
"11071",
"1104",
"12928",
"14861",
"1968",
"9027",
"4781",
"14936",
"693",
"10126",
"2842",
"2111",
"1344",
"14790",
"3156",
"13825",
"586",
"7648",
"9778",
"11439",
"10571",
"51",
"3764",
"3227",
"12975",
"5141",
"12622",
"5835",
"1654",
"4440",
"11035",
"8576",
"4203",
"4494",
"8012",
"4590",
"10631",
"29",
"8351",
"8875",
"423",
"6831",
"9123",
"13520",
"10763",
"9895",
"6570",
"3903",
"7293",
"8436",
"10606",
"10742",
"7777",
"14642",
"1592",
"1826",
"11618",
"1211",
"3458",
"4322",
"5785",
"4083",
"2076",
"13903",
"10608",
"8161",
"718",
"11898",
"15098",
"1492",
"12126",
"157",
"8779",
"2045",
"5954",
"11379",
"9722",
"3647",
"112",
"15",
"278",
"807",
"12332",
"9114",
"3683",
"4527",
"3760",
"7748",
"14564",
"8456",
"6686",
"11114",
"4524",
"7384",
"128",
"96",
"9574",
"11525",
"13966",
"492",
"3364",
"4745",
"2124",
"11480",
"12523",
"8621",
"2220",
"13425",
"8110",
"9993",
"4940",
"4867",
"7481",
"9367",
"14301",
"14562",
"6733",
"2722",
"1608",
"3300",
"7527",
"1773",
"12151",
"6979",
"2640",
"631",
"11536",
"7646",
"882",
"9799",
"3688",
"1717",
"3766",
"11985",
"6321",
"828",
"2498",
"10521",
"8523",
"7829",
"10855",
"2901",
"6425",
"1209",
"552",
"8257",
"11614",
"7246",
"7103",
"5535",
"7765",
"11927",
"2254",
"2370",
"8709",
"7216",
"5500",
"5558",
"8170",
"3840",
"616",
"2737",
"7873",
"3810",
"4256",
"10865",
"7786",
"2481",
"14819",
"6142",
"3932",
"4275",
"10720",
"12057",
"4474",
"11621",
"2040",
"6883",
"13488",
"7867",
"5660",
"1913",
"7986",
"14834",
"15157",
"10087",
"4209",
"11880",
"10987",
"11212",
"13094",
"14272",
"14952",
"2730",
"11885",
"3772",
"9137",
"3729",
"8618",
"689",
"11395",
"6411",
"8716",
"3464",
"11210",
"4609",
"13384",
"14690",
"8766",
"13201",
"6492",
"14686",
"9971",
"3355",
"2696",
"9930",
"9265",
"14460",
"14510",
"4106",
"3622",
"13878",
"3668",
"7148",
"11628",
"6514",
"936",
"4498",
"4244",
"106",
"9252",
"8526",
"1624",
"14708",
"10284",
"9699",
"12847",
"12130",
"12987",
"3273",
"15120",
"8458",
"8444",
"10775",
"2262",
"4189",
"4931",
"13285",
"6546",
"9387",
"10024",
"8793",
"4843",
"32",
"3395",
"1878",
"1582",
"180",
"4138",
"10427",
"12793",
"11293",
"12790",
"10025",
"6166",
"457",
"14408",
"7870",
"10416",
"10802",
"15315",
"3946",
"10638",
"345",
"12016",
"13605",
"1545",
"15445",
"11541",
"10127",
"9904",
"1544",
"4194",
"11728",
"829",
"357",
"2007",
"7840",
"15071",
"8712",
"14268",
"1586",
"9649",
"11648",
"12452",
"8900",
"6660",
"14634",
"5469",
"6459",
"4219",
"2577",
"9919",
"2638",
"11267",
"13713",
"339",
"11252",
"3628",
"93",
"7166",
"3256",
"7630",
"11912",
"10666",
"10449",
"3834",
"1793",
"6557",
"6175",
"3629",
"372",
"4168",
"1980",
"13134",
"7664",
"8449",
"10611",
"4821",
"14406",
"4185",
"9097",
"11097",
"8405",
"8238",
"11450",
"4978",
"11302",
"10293",
"12625",
"694",
"5029",
"10534",
"1860",
"7901",
"4832",
"15370",
"3478",
"11764",
"13085",
"10132",
"6458",
"808",
"10474",
"5221",
"5476",
"10954",
"6160",
"11319",
"2855",
"3906",
"10989",
"11476",
"1742",
"3398",
"1458",
"4265",
"3173",
"13413",
"15258",
"8042",
"11050",
"21",
"10346",
"4977",
"8544",
"2523",
"11044",
"3484",
"5414",
"9548",
"5588",
"12879",
"9551",
"13803",
"3045",
"10907",
"15209",
"7038",
"13276",
"3148",
"6349",
"5046",
"2058",
"12713",
"6930",
"13634",
"6668",
"3779",
"7107",
"376",
"12328",
"9687",
"14717",
"2918",
"11566",
"13173",
"12254",
"14836",
"3347",
"5178",
"12498",
"14285",
"15269",
"4528",
"221",
"14801",
"11288",
"7900",
"845",
"7176",
"15363",
"1931",
"6636",
"3268",
"10522",
"3749",
"1639",
"5559",
"7881",
"672",
"3618",
"10270",
"9856",
"11144",
"11021",
"13104",
"7412",
"9846",
"12209",
"6028",
"6102",
"9036",
"4557",
"6527",
"3280",
"367",
"3323",
"9565",
"2109",
"6400",
"7877",
"2032",
"8068",
"11936",
"1981",
"5237",
"2889",
"3936",
"7756",
"1846",
"13908",
"6367",
"77",
"6435",
"12922",
"9232",
"2771",
"7036",
"12694",
"6083",
"7501",
"7888",
"10439",
"2296",
"3196",
"7355",
"1711",
"11222",
"2107",
"546",
"13398",
"10194",
"8358",
"6576",
"13594",
"11177",
"13853",
"3610",
"15409",
"2976",
"1554",
"12478",
"3503",
"330",
"9085",
"6975",
"13613",
"105",
"10925",
"10033",
"1437",
"5240",
"9014",
"14194",
"10603",
"5134",
"13880",
"8311",
"4465",
"11928",
"1686",
"14043",
"4599",
"8335",
"5142",
"7368",
"7742",
"11141",
"5482",
"3758",
"14722",
"7644",
"9616",
"342",
"13832",
"8969",
"15232",
"8339",
"6416",
"9309",
"6563",
"3727",
"10714",
"4236",
"11681",
"10016",
"8038",
"6101",
"12072",
"13946",
"12077",
"8290",
"1478",
"10862",
"57",
"1780",
"12527",
"14948",
"673",
"6882",
"10081",
"15380",
"3385",
"3378",
"12287",
"6448",
"1859",
"1314",
"10769",
"13080",
"13051",
"3917",
"11919",
"13957",
"14930",
"5747",
"10626",
"6724",
"1081",
"2785",
"1645",
"14554",
"12470",
"6931",
"13045",
"9503",
"8028",
"11296",
"10599",
"3229",
"13273",
"15035",
"6119",
"14960",
"6099",
"9640",
"14474",
"9101",
"14020",
"8134",
"8246",
"1523",
"4769",
"7125",
"13643",
"10946",
"8325",
"14368",
"13754",
"12786",
"3467",
"1662",
"2714",
"7970",
"13227",
"5449",
"4255",
"5990",
"11705",
"7599",
"11356",
"5817",
"6223",
"11153",
"4953",
"11308",
"6288",
"2758",
"822",
"3670",
"9203",
"12755",
"2803",
"13123",
"2514",
"50",
"4413",
"14267",
"4575",
"11125",
"12918",
"4962",
"5101",
"10133",
"10489",
"14959",
"12990",
"4945",
"4591",
"13067",
"4124",
"12422",
"4347",
"15088",
"5192",
"8418",
"2552",
"875",
"1835",
"5212",
"10139",
"2039",
"2759",
"9267",
"9228",
"2687",
"5696",
"303",
"15106",
"5589",
"5241",
"4594",
"10895",
"4870",
"28",
"1177",
"13166",
"12468",
"515",
"8054",
"10405",
"2668",
"1025",
"9697",
"11857",
"1983",
"12145",
"8887",
"3783",
"2269",
"6348",
"10591",
"13976",
"6608",
"946",
"5736",
"13802",
"9034",
"1601",
"2876",
"10513",
"8461",
"10430",
"8337",
"3288",
"11849",
"7490",
"7713",
"8838",
"11137",
"11874",
"732",
"2500",
"4892",
"7180",
"5573",
"9095",
"1558",
"9225",
"12051",
"2174",
"2256",
"3991",
"9654",
"4587",
"15259",
"8571",
"522",
"2358",
"2001",
"14468",
"933",
"4829",
"12299",
"5181",
"10229",
"11544",
"12306",
"322",
"879",
"13129",
"11888",
"9546",
"13805",
"4869",
"473",
"8598",
"15319",
"6903",
"7417",
"11696",
"11111",
"9609",
"4355",
"9459",
"6614",
"2626",
"6387",
"13470",
"13442",
"8419",
"15044",
"1518",
"8553",
"13833",
"10658",
"10846",
"14104",
"11798",
"6324",
"8849",
"7156",
"13748",
"4469",
"8989",
"2520",
"5644",
"7962",
"11875",
"11443",
"8139",
"130",
"15224",
"9220",
"1978",
"5534",
"8482",
"6956",
"10919",
"11374",
"12653",
"14437",
"13610",
"11803",
"11747",
"8032",
"7884",
"282",
"3602",
"12768",
"12339",
"9739",
"11347",
"7992",
"7404",
"12178",
"7443",
"14567",
"3611",
"7271",
"2456",
"2526",
"14393",
"4317",
"8938",
"2913",
"6664",
"12773",
"8347",
"7528",
"5987",
"14727",
"2085",
"9823",
"9384",
"5425",
"6841",
"4403",
"7784",
"2894",
"12957",
"333",
"11234",
"1302",
"1652",
"8666",
"5629",
"964",
"12416",
"5468",
"10195",
"6057",
"11962",
"13393",
"5072",
"14978",
"9908",
"7769",
"1621",
"2177",
"6734",
"11164",
"8387",
"5286",
"15286",
"3761",
"746",
"7463",
"7674",
"8064",
"3457",
"9276",
"6737",
"4062",
"2153",
"3319",
"14534",
"11572",
"9537",
"5440",
"10036",
"1687",
"2995",
"8070",
"11946",
"6475",
"6545",
"1093",
"8763",
"8363",
"9737",
"11870",
"6091",
"9484",
"12822",
"7625",
"8796",
"5489",
"7613",
"2542",
"3328",
"12162",
"7040",
"4667",
"668",
"4827",
"1505",
"11737",
"7468",
"12083",
"13206",
"9202",
"8691",
"5913",
"6231",
"8977",
"6112",
"9273",
"8501",
"8479",
"5096",
"6140",
"1047",
"4313",
"6827",
"9709",
"14241",
"115",
"3205",
"9874",
"648",
"15338",
"3453",
"10045",
"2972",
"3028",
"11616",
"4513",
"13907",
"62",
"2183",
"12049",
"4303",
"7588",
"7255",
"13636",
"12590",
"6966",
"12804",
"12725",
"10609",
"13103",
"10141",
"270",
"10175",
"5191",
"13851",
"804",
"9351",
"10939",
"13447",
"13315",
"2524",
"14746",
"1660",
"4950",
"8317",
"3627",
"11720",
"6262",
"15134",
"3438",
"8929",
"14886",
"10236",
"8809",
"6015",
"9025",
"10040",
"14047",
"11269",
"8794",
"10793",
"2211",
"470",
"1677",
"8173",
"12656",
"10584",
"8672",
"13873",
"13515",
"2765",
"2784",
"10683",
"5946",
"2438",
"4677",
"3666",
"2474",
"15333",
"1318",
"4627",
"7358",
"5800",
"13121",
"10921",
"8053",
"11372",
"1928",
"14192",
"13511",
"6895",
"129",
"15359",
"12102",
"11375",
"2328",
"10717",
"8951",
"7878",
"379",
"8711",
"10875",
"12696",
"251",
"11677",
"3980",
"10542",
"9433",
"4085",
"10731",
"5019",
"14895",
"14304",
"162",
"2756",
"1728",
"14968",
"6405",
"8091",
"7727",
"7623",
"7327",
"1894",
"11413",
"3640",
"9342",
"14404",
"14331",
"8751",
"2072",
"8067",
"11802",
"39",
"11081",
"11545",
"1504",
"11394",
"12849",
"805",
"1776",
"5732",
"2763",
"9020",
"3343",
"6561",
"3486",
"13884",
"14103",
"14766",
"2259",
"14098",
"5611",
"511",
"3949",
"9929",
"12663",
"2661",
"7394",
"2820",
"11934",
"14489",
"9526",
"12850",
"6497",
"2528",
"5383",
"15081",
"7422",
"4285",
"2980",
"6815",
"3795",
"9612",
"2036",
"2633",
"786",
"11459",
"4865",
"4777",
"2548",
"7699",
"3147",
"7192",
"4928",
"9451",
"7398",
"2701",
"8432",
"4921",
"9201",
"6259",
"13836",
"3820",
"7672",
"1233",
"854",
"15143",
"4405",
"13292",
"10062",
"12143",
"4229",
"15112",
"12534",
"11738",
"14835",
"8138",
"2480",
"10678",
"15212",
"11910",
"1054",
"13986",
"13762",
"12564",
"2439",
"8942",
"3962",
"10772",
"8662",
"3581",
"6700",
"14083",
"1257",
"6252",
"411",
"7833",
"15316",
"9180",
"194",
"7695",
"5651",
"6390",
"12191",
"9086",
"13616",
"5087",
"153",
"3217",
"13522",
"15020",
"12438",
"12171",
"6144",
"9295",
"971",
"10038",
"14572",
"8094",
"14458",
"13382",
"6662",
"6335",
"9978",
"8424",
"9369",
"1231",
"4995",
"675",
"15144",
"7851",
"6833",
"4510",
"1230",
"9613",
"8095",
"12938",
"2516",
"7554",
"13256",
"6104",
"10121",
"4425",
"11170",
"15096",
"9475",
"10472",
"8722",
"6230",
"1726",
"2323",
"15369",
"10780",
"8731",
"12934",
"14364",
"1189",
"3125",
"4223",
"3400",
"14551",
"13599",
"10911",
"7265",
"1051",
"11416",
"5597",
"1783",
"175",
"13228",
"801",
"12440",
"553",
"14126",
"14094",
"11188",
"15089",
"14975",
"2905",
"11577",
"7179",
"11418",
"10914",
"1527",
"8088",
"1238",
"2912",
"3111",
"4992",
"12923",
"12777",
"5339",
"4927",
"12196",
"6092",
"5689",
"2",
"11669",
"13977",
"7226",
"14385",
"4989",
"12118",
"7137",
"6858",
"12087",
"6185",
"7626",
"8356",
"12410",
"1998",
"12078",
"1769",
"4445",
"5166",
"11957",
"7127",
"11819",
"4300",
"4349",
"13325",
"12349",
"11216",
"241",
"5857",
"6814",
"12068",
"3067",
"1855",
"4883",
"9491",
"13932",
"14431",
"11271",
"1267",
"14813",
"5953",
"2264",
"461",
"9004",
"14594",
"7476",
"3035",
"11339",
"10255",
"7323",
"5483",
"4402",
"15332",
"14010",
"3801",
"9578",
"13677",
"5340",
"9219",
"6983",
"2425",
"7420",
"1106",
"4993",
"11902",
"10700",
"10463",
"1370",
"3167",
"6212",
"13593",
"3617",
"10689",
"13397",
"13097",
"2201",
"7022",
"10527",
"12248",
"5384",
"2739",
"15294",
"10577",
"11470",
"11048",
"13044",
"10120",
"10572",
"12620",
"5717",
"11389",
"3062",
"13407",
"3894",
"3681",
"9103",
"9421",
"2475",
"6452",
"5995",
"700",
"3084",
"7513",
"14981",
"8465",
"12737",
"434",
"16",
"14339",
"12912",
"6850",
"14088",
"13536",
"12391",
"5466",
"8556",
"3245",
"4434",
"234",
"12432",
"1615",
"14924",
"2685",
"15013",
"12093",
"13997",
"8275",
"5247",
"7163",
"5566",
"4136",
"5109",
"6682",
"5285",
"12405",
"1828",
"14428",
"9933",
"7452",
"2168",
"6852",
"2966",
"5632",
"1817",
"7976",
"14635",
"12826",
"13329",
"12054",
"14077",
"14279",
"3332",
"4471",
"6467",
"9857",
"5845",
"13646",
"8724",
"2767",
"13994",
"5481",
"3566",
"11656",
"8985",
"6351",
"3969",
"9839",
"13140",
"3427",
"14108",
"4099",
"9277",
"7201",
"12294",
"796",
"10823",
"1085",
"11762",
"776",
"12614",
"9357",
"1349",
"10265",
"13349",
"8803",
"2405",
"15327",
"5665",
"3136",
"12574",
"1688",
"11530",
"1866",
"4632",
"8061",
"4107",
"8852",
"6638",
"12659",
"11830",
"11198",
"187",
"13441",
"6872",
"2315",
"15165",
"4295",
"784",
"1012",
"136",
"4437",
"12889",
"5047",
"1282",
"5650",
"10852",
"12317",
"5737",
"9162",
"13132",
"4122",
"2605",
"5981",
"8883",
"11511",
"9256",
"14378",
"5931",
"1576",
"11679",
"3781",
"14038",
"3241",
"11447",
"1455",
"7609",
"6888",
"11929",
"1770",
"6843",
"12055",
"8300",
"9077",
"624",
"9026",
"8744",
"570",
"3899",
"12770",
"5319",
"937",
"8331",
"6779",
"663",
"15176",
"5513",
"6725",
"7865",
"6926",
"12240",
"7557",
"12222",
"9783",
"2788",
"7757",
"155",
"466",
"506",
"2430",
"14579",
"3690",
"14402",
"8278",
"4082",
"12575",
"6007",
"13204",
"9288",
"5027",
"13485",
"8194",
"12880",
"11434",
"5815",
"13096",
"10899",
"7230",
"8242",
"13781",
"2970",
"7069",
"13716",
"4695",
"14611",
"3993",
"11711",
"422",
"2781",
"12717",
"2050",
"10812",
"3716",
"1974",
"8025",
"1579",
"12788",
"7426",
"3377",
"12855",
"11812",
"5728",
"11937",
"15390",
"9648",
"13443",
"4811",
"815",
"3833",
"5956",
"11984",
"2300",
"11856",
"9443",
"12963",
"9090",
"4135",
"12435",
"1419",
"6942",
"3142",
"3580",
"10676",
"2437",
"1305",
"1640",
"1714",
"4377",
"6679",
"14253",
"9262",
"13477",
"8876",
"10434",
"7146",
"15008",
"139",
"7693",
"2736",
"14915",
"13954",
"2560",
"14630",
"2712",
"3422",
"334",
"12707",
"13526",
"12647",
"1573",
"8524",
"13480",
"11783",
"3242",
"4420",
"10461",
"5450",
"12238",
"1255",
"7978",
"2278",
"6772",
"3118",
"4876",
"2718",
"15058",
"10412",
"2563",
"3312",
"6535",
"12334",
"12643",
"11186",
"708",
"4816",
"9226",
"5593",
"1082",
"7571",
"6547",
"5021",
"13742",
"3392",
"14412",
"9361",
"9401",
"995",
"167",
"11551",
"15105",
"8251",
"13576",
"6715",
"10688",
"10646",
"8092",
"2248",
"3751",
"5118",
"13788",
"9870",
"7758",
"15426",
"9391",
"14166",
"14520",
"15229",
"5184",
"9485",
"7918",
"9935",
"9199",
"1215",
"1678",
"12013",
"15383",
"9204",
"771",
"9600",
"13975",
"9071",
"12056",
"7205",
"2042",
"10504",
"8230",
"5407",
"8918",
"3891",
"1734",
"4930",
"1622",
"4658",
"12877",
"14295",
"5982",
"12825",
"14263",
"9955",
"12672",
"13210",
"14372",
"3508",
"12861",
"9745",
"2942",
"13840",
"11040",
"1129",
"1987",
"2879",
"10145",
"1334",
"15272",
"12401",
"10322",
"8953",
"4287",
"11539",
"6051",
"3742",
"4120",
"14582",
"8771",
"8535",
"14897",
"13828",
"1212",
"4638",
"15312",
"9780",
"13573",
"7739",
"952",
"14363",
"1135",
"12122",
"3510",
"11435",
"8783",
"7740",
"8901",
"9890",
"11682",
"9246",
"1814",
"1073",
"5622",
"14079",
"15010",
"208",
"4856",
"12081",
"12710",
"1227",
"12634",
"13160",
"12831",
"7862",
"7495",
"2139",
"3152",
"12802",
"10397",
"7131",
"5836",
"7982",
"9677",
"2851",
"1493",
"2446",
"5036",
"8846",
"9084",
"12732",
"7573",
"7958",
"7101",
"5890",
"13806",
"9414",
"9505",
"2046",
"881",
"11249",
"15345",
"2060",
"10426",
"15276",
"12433",
"13663",
"15219",
"12904",
"1775",
"8700",
"10017",
"11430",
"7995",
"7971",
"4360",
"9336",
"6003",
"14084",
"8442",
"6988",
"13611",
"1889",
"2728",
"4319",
"2055",
"3723",
"1577",
"8713",
"6964",
"12758",
"13568",
"3416",
"3843",
"2234",
"9099",
"11917",
"13283",
"14081",
"6414",
"1926",
"6440",
"6419",
"8234",
"5415",
"4330",
"8268",
"15202",
"10094",
"1870",
"7526",
"3506",
"3893",
"13299",
"11174",
"445",
"10182",
"11482",
"6494",
"5301",
"3085",
"2571",
"8346",
"733",
"8213",
"248",
"1236",
"9021",
"8293",
"8855",
"940",
"2340",
"8920",
"8056",
"1191",
"7297",
"9659",
"12839",
"6654",
"15102",
"7438",
"6151",
"7474",
"11647",
"12348",
"7172",
"9127",
"15037",
"2667",
"5312",
"3149",
"6176",
"9721",
"1399",
"13034",
"6876",
"10336",
"1787",
"11226",
"12360",
"13915",
"2445",
"14667",
"14261",
"2051",
"787",
"13491",
"4232",
"7144",
"13668",
"6167",
"3841",
"14660",
"10263",
"11474",
"797",
"3598",
"9037",
"13163",
"1754",
"15162",
"2144",
"6773",
"7871",
"1244",
"15425",
"4668",
"15261",
"4959",
"3344",
"11049",
"7763",
"2686",
"3577",
"1277",
"11167",
"9074",
"7556",
"9818",
"12721",
"7540",
"4019",
"14324",
"2530",
"6835",
"9092",
"5620",
"1557",
"10425",
"4864",
"2676",
"8601",
"2094",
"8078",
"12789",
"12364",
"1892",
"4503",
"4655",
"5932",
"7835",
"5546",
"13641",
"11309",
"5375",
"15288",
"13937",
"135",
"3933",
"6216",
"14250",
"2029",
"43",
"13708",
"15248",
"7933",
"3175",
"2499",
"9920",
"11332",
"9362",
"13751",
"14539",
"11771",
"10536",
"11191",
"6410",
"7095",
"12404",
"10304",
"6812",
"816",
"4228",
"8481",
"2189",
"10130",
"8366",
"11582",
"1115",
"6651",
"1514",
"12726",
"13018",
"12137",
"13574",
"424",
"2699",
"10046",
"8027",
"19",
"10751",
"2832",
"840",
"3847",
"4762",
"11777",
"10782",
"5225",
"4875",
"3734",
"2564",
"9260",
"6408",
"7218",
"1954",
"656",
"6438",
"4057",
"13451",
"12205",
"5636",
"12754",
"12678",
"4478",
"8717",
"13810",
"8750",
"13265",
"1193",
"6049",
"10280",
"13110",
"6429",
"1144",
"468",
"1818",
"13004",
"1668",
"13597",
"955",
"10516",
"3409",
"6300",
"12326",
"14217",
"3352",
"366",
"10277",
"4382",
"3524",
"1847",
"7259",
"5196",
"5988",
"6358",
"10530",
"5271",
"8683",
"11245",
"13364",
"15382",
"11307",
"10959",
"13877",
"8026",
"13914",
"7671",
"7666",
"9664",
"9804",
"11205",
"4758",
"5269",
"4878",
"556",
"5164",
"13897",
"12685",
"11619",
"2114",
"2176",
"13765",
"10623",
"10300",
"8577",
"14342",
"12948",
"716",
"12807",
"4536",
"7641",
"3752",
"9831",
"14993",
"484",
"14028",
"8774",
"14935",
"10826",
"138",
"12447",
"7410",
"3694",
"13316",
"520",
"5647",
"8689",
"12237",
"11378",
"7136",
"5985",
"6726",
"10753",
"11813",
"2606",
"10356",
"8614",
"10600",
"137",
"1571",
"1118",
"543",
"13674",
"2293",
"4915",
"15069",
"12997",
"7435",
"9001",
"13928",
"2588",
"11978",
"13971",
"1533",
"12187",
"10077",
"1186",
"1418",
"14780",
"12515",
"6265",
"6516",
"8702",
"3295",
"4541",
"691",
"15262",
"9486",
"11700",
"11823",
"5718",
"10740",
"2895",
"7846",
"14153",
"12489",
"7331",
"10816",
"8680",
"6952",
"14230",
"5575",
"7720",
"7335",
"11297",
"4008",
"3953",
"14648",
"2218",
"11939",
"11586",
"5008",
"9479",
"14961",
"3651",
"5059",
"6622",
"10613",
"11000",
"14337",
"9394",
"12247",
"11472",
"8558",
"11468",
"1738",
"8823",
"11117",
"9595",
"1953",
"3401",
"13632",
"5116",
"10160",
"200",
"12591",
"6511",
"2799",
"6935",
"14565",
"2295",
"11462",
"8924",
"10982",
"15014",
"1900",
"454",
"7211",
"15349",
"12974",
"14032",
"259",
"15495",
"527",
"8800",
"4004",
"11969",
"7827",
"721",
"13506",
"1876",
"1247",
"13088",
"10938",
"11246",
"8949",
"352",
"12331",
"11010",
"8623",
"934",
"13963",
"5604",
"12379",
"11072",
"7121",
"8604",
"7684",
"8787",
"3393",
"3225",
"2440",
"8536",
"14497",
"394",
"1790",
"10659",
"6395",
"12357",
"2066",
"14613",
"495",
"9490",
"12856",
"10881",
"14803",
"6070",
"7072",
"6512",
"8404",
"9676",
"5154",
"10451",
"12896",
"9411",
"14353",
"2535",
"8438",
"2141",
"1812",
"11811",
"13434",
"8797",
"12836",
"15314",
"10060",
"6991",
"6283",
"15222",
"2095",
"4464",
"7054",
"10107",
"1369",
"10824",
"12761",
"1618",
"6532",
"5168",
"14003",
"9845",
"3331",
"1320",
"1425",
"7854",
"10549",
"15491",
"877",
"1658",
"6453",
"3431",
"9371",
"1934",
"15168",
"12501",
"7576",
"4855",
"9148",
"3997",
"9057",
"6220",
"1702",
"14548",
"7077",
"8023",
"6183",
"4028",
"6188",
"8206",
"12996",
"4450",
"5253",
"14858",
"12113",
"11370",
"9866",
"8229",
"3699",
"7462",
"14636",
"8607",
"12487",
"5627",
"13518",
"6865",
"3340",
"15330",
"7751",
"11818",
"1916",
"14159",
"8217",
"14664",
"6625",
"13887",
"2006",
"6621",
"12610",
"13197",
"9407",
"11839",
"15169",
"8801",
"5232",
"4919",
"9385",
"13831",
"7649",
"15152",
"7894",
"9061",
"9893",
"9691",
"4699",
"10031",
"1336",
"7716",
"9581",
"5939",
"8282",
"8655",
"14555",
"1386",
"6793",
"6437",
"2670",
"6297",
"6314",
"8166",
"4073",
"15244",
"3805",
"5208",
"3007",
"2805",
"5236",
"4081",
"7670",
"9598",
"8962",
"7981",
"10055",
"7098",
"3741",
"8369",
"5837",
"9964",
"6847",
"3345",
"5741",
"6739",
"12469",
"722",
"1617",
"11751",
"11087",
"2100",
"12679",
"8620",
"4396",
"10905",
"5707",
"12891",
"5343",
"1483",
"6282",
"2884",
"13297",
"6004",
"12680",
"2175",
"6806",
"7258",
"13965",
"14655",
"11106",
"4340",
"9331",
"2809",
"3748",
"14748",
"3114",
"15393",
"341",
"8135",
"3065",
"11724",
"15362",
"7681",
"6658",
"10344",
"6822",
"798",
"14062",
"23",
"4899",
"7009",
"11676",
"1888",
"8256",
"12757",
"2641",
"2906",
"4504",
"14560",
"7233",
"4387",
"15438",
"10414",
"11941",
"13938",
"4659",
"14008",
"10713",
"13906",
"7134",
"8997",
"3665",
"13252",
"8948",
"10446",
"9310",
"14414",
"9460",
"14965",
"14190",
"7821",
"3975",
"13676",
"5712",
"336",
"9848",
"2700",
"15396",
"773",
"2033",
"846",
"8830",
"14436",
"146",
"11230",
"11232",
"1779",
"456",
"1440",
"2386",
"5615",
"13801",
"5867",
"6098",
"7809",
"14491",
"2476",
"7504",
"12342",
"4025",
"12312",
"2547",
"14486",
"6776",
"6375",
"12029",
"14934",
"14498",
"8029",
"15147",
"8080",
"13205",
"11196",
"6783",
"2596",
"4064",
"3322",
"12110",
"7392",
"15341",
"10289",
"13700",
"5132",
"15170",
"4970",
"9403",
"841",
"6191",
"7087",
"14231",
"671",
"5198",
"5238",
"6756",
"6128",
"14607",
"10241",
"2158",
"7891",
"994",
"4774",
"9135",
"8090",
"7010",
"12352",
"13459",
"13170",
"9646",
"12329",
"2817",
"15018",
"7998",
"5814",
"6718",
"6643",
"569",
"7822",
"14622",
"8909",
"551",
"6526",
"8473",
"9426",
"1918",
"1852",
"12297",
"1143",
"1536",
"464",
"2087",
"1655",
"8546",
"1675",
"3942",
"2436",
"557",
"10231",
"12592",
"6012",
"3329",
"3011",
"8425",
"12103",
"11091",
"2037",
"15053",
"11290",
"15006",
"14019",
"11601",
"6507",
"6885",
"9972",
"3107",
"3354",
"6703",
"5709",
"7379",
"9946",
"7500",
"1540",
"3704",
"2217",
"8285",
"3254",
"7512",
"8682",
"4517",
"6480",
"873",
"11763",
"11454",
"8575",
"13007",
"1428",
"14687",
"15115",
"1825",
"2413",
"14540",
"7856",
"13489",
"6456",
"9928",
"2772",
"5753",
"1808",
"4374",
"14626",
"3875",
"5766",
"6251",
"294",
"10340",
"7497",
"6118",
"10374",
"10910",
"4074",
"2064",
"4452",
"14739",
"11796",
"14615",
"6193",
"4663",
"10840",
"5706",
"3926",
"3818",
"3869",
"764",
"6394",
"1899",
"10587",
"662",
"12255",
"12612",
"1430",
"760",
"1578",
"10543",
"8258",
"14332",
"3096",
"3389",
"13811",
"664",
"1250",
"13058",
"1190",
"524",
"1309",
"14825",
"15003",
"1416",
"13177",
"3311",
"11664",
"1022",
"11438",
"7683",
"14745",
"7151",
"7440",
"3968",
"1791",
"3238",
"6126",
"9530",
"7624",
"4623",
"14674",
"0",
"1259",
"12201",
"2118",
"3465",
"11244",
"12629",
"7547",
"8261",
"1032",
"837",
"609",
"5358",
"2401",
"7145",
"14507",
"12170",
"5833",
"13847",
"10082",
"4694",
"9728",
"9544",
"14430",
"2861",
"1470",
"14768",
"13151",
"8030",
"4392",
"4049",
"2075",
"10874",
"210",
"11548",
"8834",
"876",
"12937",
"1326",
"12979",
"9720",
"14278",
"12954",
"12156",
"11595",
"7213",
"1439",
"5135",
"4126",
"13800",
"12190",
"1930",
"7612",
"11202",
"7274",
"6990",
"10607",
"5171",
"12372",
"14394",
"1564",
"15161",
"2666",
"13340",
"9663",
"9247",
"6698",
"9372",
"6669",
"11789",
"8223",
"6803",
"11163",
"5901",
"2360",
"3074",
"5634",
"5698",
"2953",
"7393",
"9483",
"7545",
"12550",
"4800",
"4863",
"12356",
"2990",
"5000",
"6190",
"5846",
"2284",
"9292",
"11203",
"3715",
"8980",
"8398",
"4023",
"10200",
"8270",
"3570",
"14499",
"9888",
"10494",
"2173",
"485",
"7333",
"7304",
"10295",
"11192",
"10165",
"4072",
"11398",
"10004",
"1111",
"10410",
"7520",
"7473",
"4037",
"451",
"441",
"3608",
"4327",
"12536",
"10125",
"2257",
"2341",
"4066",
"12697",
"3596",
"11172",
"11036",
"6397",
"14870",
"5218",
"7657",
"4925",
"10375",
"11046",
"7425",
"8864",
"2350",
"1816",
"14022",
"12092",
"14138",
"9688",
"8899",
"602",
"15242",
"12524",
"6153",
"1137",
"13770",
"15107",
"1003",
"6589",
"12107",
"14222",
"8737",
"7602",
"11652",
"9771",
"14228",
"8534",
"7270",
"7244",
"4139",
"9251",
"13854",
"1454",
"245",
"14274",
"1751",
"4414",
"6319",
"2203",
"7471",
"12260",
"12444",
"4968",
"13430",
"8509",
"14336",
"14900",
"5967",
"4386",
"10476",
"7523",
"11829",
"13309",
"14898",
"9024",
"11316",
"12749",
"2458",
"8447",
"6422",
"560",
"1739",
"12690",
"8209",
"7837",
"13319",
"3371",
"3832",
"5248",
"10191",
"13351",
"10681",
"3854",
"5935",
"6968",
"4771",
"133",
"355",
"6392",
"4913",
"6353",
"9365",
"10902",
"3887",
"3391",
"2994",
"8843",
"14400",
"11429",
"12815",
"12673",
"7549",
"14365",
"620",
"13627",
"11523",
"6627",
"9563",
"5366",
"4448",
"12543",
"9898",
"6337",
"11001",
"924",
"4651",
"5213",
"6849",
"14416",
"12539",
"5690",
"3500",
"6578",
"8730",
"14039",
"3413",
"651",
"775",
"3038",
"14826",
"13882",
"4552",
"85",
"8146",
"8049",
"11393",
"12251",
"510",
"4178",
"163",
"1581",
"8227",
"2578",
"11351",
"7219",
"9308",
"6291",
"13400",
"13033",
"8045",
"5329",
"14677",
"8152",
"14270",
"11119",
"5148",
"11704",
"8854",
"9660",
"10923",
"7592",
"10540",
"2478",
"12568",
"1355",
"5373",
"14492",
"5",
"9113",
"9911",
"5680",
"12490",
"6641",
"4531",
"6080",
"12542",
"1486",
"14953",
"10478",
"7692",
"1368",
"12785",
"1659",
"9257",
"3536",
"11362",
"8931",
"6813",
"10398",
"824",
"8047",
"8563",
"11942",
"11661",
"4316",
"10963",
"10075",
"6569",
"1819",
"3732",
"3423",
"14716",
"1969",
"11687",
"3931",
"14785",
"6558",
"11583",
"6606",
"12067",
"2662",
"8847",
"1627",
"7",
"10554",
"15414",
"4987",
"2616",
"629",
"2593",
"12149",
"13868",
"2306",
"447",
"7182",
"7487",
"10830",
"14914",
"2125",
"6278",
"11029",
"292",
"9324",
"9708",
"4016",
"2015",
"14773",
"4825",
"7168",
"4957",
"11471",
"8600",
"8798",
"2999",
"5949",
"498",
"1152",
"9432",
"9732",
"1501",
"4158",
"14943",
"14890",
"6732",
"13942",
"2663",
"8559",
"12088",
"2914",
"6692",
"13566",
"7919",
"3882",
"12141",
"13100",
"8871",
"15077",
"1849",
"10961",
"10928",
"5682",
"12541",
"7183",
"10687",
"9131",
"5177",
"314",
"5199",
"14009",
"14259",
"6170",
"9083",
"10056",
"12965",
"5336",
"6502",
"11915",
"618",
"8582",
"7917",
"13142",
"449",
"14420",
"1555",
"14476",
"12101",
"15305",
"3738",
"13432",
"11833",
"13497",
"1598",
"6426",
"11772",
"11194",
"8882",
"257",
"743",
"6008",
"6090",
"10849",
"5277",
"4262",
"2242",
"549",
"5880",
"9830",
"8586",
"4601",
"4388",
"14617",
"7236",
"576",
"10869",
"11662",
"11190",
"12799",
"13195",
"11104",
"5053",
"1786",
"7848",
"6967",
"4484",
"9805",
"5380",
"3972",
"6084",
"6824",
"12419",
"9048",
"7482",
"8163",
"9872",
"10804",
"9167",
"4218",
"6270",
"8520",
"9573",
"9444",
"6639",
"1925",
"12833",
"15207",
"6154",
"4152",
"12753",
"3954",
"13657",
"14281",
"4274",
"795",
"1539",
"14846",
"4134",
"5778",
"5723",
"14321",
"6874",
"1915",
"10718",
"5025",
"11218",
"458",
"15287",
"1896",
"13192",
"8111",
"12994",
"5110",
"2078",
"9345",
"11721",
"4943",
"8821",
"11996",
"6908",
"6774",
"14480",
"8121",
"9981",
"5664",
"8517",
"1520",
"7503",
"11353",
"8459",
"9967",
"11906",
"10734",
"12176",
"2584",
"2705",
"3387",
"5420",
"11935",
"5958",
"13169",
"10839",
"8580",
"2908",
"1393",
"9240",
"2543",
"391",
"13215",
"6818",
"1672",
"4687",
"13326",
"249",
"8758",
"6210",
"12076",
"3765",
"11605",
"608",
"7374",
"13871",
"12358",
"13083",
"4198",
"10361",
"13267",
"9320",
"329",
"8780",
"15372",
"14087",
"3069",
"10517",
"11078",
"9332",
"13199",
"10117",
"7349",
"5012",
"178",
"13540",
"13924",
"6274",
"331",
"7424",
"13277",
"1844",
"13504",
"4090",
"9775",
"7232",
"13720",
"6100",
"3139",
"659",
"6489",
"6338",
"10810",
"14037",
"14638",
"1811",
"13798",
"13736",
"2035",
"2733",
"4835",
"8334",
"5392",
"10204",
"3495",
"1049",
"4336",
"3445",
"7788",
"8483",
"4030",
"7797",
"4224",
"8439",
"14044",
"6791",
"10462",
"9360",
"4706",
"6066",
"13016",
"12207",
"2322",
"883",
"1531",
"12734",
"10548",
"12185",
"8058",
"7635",
"5452",
"237",
"11655",
"4740",
"4290",
"9762",
"3547",
"9062",
"1993",
"4937",
"12378",
"12189",
"2191",
"3976",
"6978",
"2331",
"14676",
"14602",
"14969",
"5157",
"2447",
"11161",
"5782",
"9253",
"1572",
"37",
"8735",
"6645",
"5873",
"1831",
"8817",
"7248",
"12716",
"4886",
"4620",
"13729",
"5013",
"11844",
"2744",
"13823",
"13343",
"11699",
"15007",
"6275",
"8406",
"9402",
"8392",
"9931",
"10000",
"14112",
"13264",
"5962",
"4372",
"2868",
"2679",
"3289",
"3802",
"13250",
"10365",
"7794",
"9817",
"10784",
"5819",
"7521",
"5758",
"7480",
"6839",
"3021",
"14383",
"3907",
"11184",
"4009",
"3851",
"4952",
"10515",
"5979",
"12866",
"3595",
"12310",
"13376",
"3674",
"1260",
"13773",
"12932",
"9637",
"2399",
"7493",
"1935",
"3940",
"2991",
"12472",
"9096",
"9951",
"3848",
"14864",
"10039",
"12876",
"13139",
"14",
"3112",
"13013",
"9399",
"354",
"7834",
"2684",
"8306",
"8016",
"2613",
"10761",
"7365",
"10735",
"11274",
"9910",
"1891",
"14179",
"5479",
"13300",
"12752",
"14940",
"12323",
"9701",
"42",
"8232",
"12635",
"14317",
"1018",
"5147",
"8106",
"836",
"2057",
"1243",
"5244",
"5507",
"965",
"9227",
"6816",
"5675",
"7380",
"7574",
"9003",
"1128",
"984",
"8617",
"2030",
"6782",
"2859",
"13001",
"5377",
"1824",
"10095",
"3130",
"4302",
"4969",
"15047",
"3314",
"2275",
"11624",
"13841",
"14215",
"11503",
"5200",
"860",
"9487",
"9215",
"1284",
"10104",
"8226",
"12304",
"11739",
"1690",
"5106",
"11878",
"5239",
"4398",
"1951",
"14242",
"6490",
"7395",
"14026",
"15215",
"4361",
"1801",
"2279",
"8304",
"14459",
"10099",
"15397",
"4639",
"3370",
"7449",
"632",
"14733",
"1506",
"247",
"89",
"5560",
"9268",
"6272",
"15292",
"1321",
"4918",
"3162",
"2455",
"10482",
"10894",
"145",
"14175",
"9151",
"11489",
"13303",
"12197",
"11495",
"5852",
"9312",
"11364",
"7987",
"13534",
"14675",
"4532",
"7290",
"14099",
"12474",
"12409",
"14992",
"1052",
"11412",
"8518",
"7354",
"12129",
"15083",
"10808",
"5590",
"15388",
"7747",
"4451",
"12695",
"9618",
"4454",
"10798",
"14297",
"11863",
"14903",
"13090",
"12180",
"4014",
"9429",
"8933",
"11734",
"2796",
"11508",
"12311",
"14326",
"6266",
"161",
"1465",
"11215",
"8102",
"658",
"79",
"8315",
"916",
"8541",
"2113",
"8327",
"1943",
"7047",
"13093",
"3003",
"13414",
"9301",
"7109",
"6005",
"12041",
"13157",
"7157",
"11213",
"2891",
"10559",
"14091",
"6537",
"9504",
"3798",
"11574",
"12031",
"97",
"3177",
"12484",
"10222",
"9230",
"12738",
"14799",
"2230",
"4093",
"517",
"7820",
"14696",
"10185",
"2361",
"643",
"1364",
"6892",
"5493",
"8371",
"9242",
"2427",
"6181",
"14814",
"15367",
"1637",
"7642",
"724",
"5970",
"6521",
"9248",
"9555",
"10307",
"3945",
"890",
"13556",
"10787",
"11171",
"6345",
"2692",
"5279",
"12859",
"6368",
"3137",
"13669",
"10729",
"502",
"13535",
"5643",
"6655",
"4982",
"10975",
"2054",
"748",
"2697",
"12441",
"8996",
"6444",
"7583",
"11971",
"11881",
"14257",
"6428",
"9378",
"3459",
"5896",
"2255",
"2680",
"3957",
"2163",
"7645",
"6407",
"12516",
"4770",
"6826",
"1502",
"1362",
"3905",
"4197",
"2974",
"5811",
"1451",
"10949",
"14661",
"6709",
"11336",
"1272",
"5084",
"9610",
"10680",
"14863",
"5223",
"15377",
"1962",
"13929",
"6468",
"7187",
"12766",
"9323",
"3658",
"4795",
"10404",
"1685",
"11224",
"8739",
"10898",
"13559",
"10897",
"7345",
"15392",
"2969",
"1071",
"4140",
"4243",
"6315",
"432",
"3525",
"4818",
"11779",
"8401",
"5609",
"516",
"3601",
"7983",
"12649",
"10401",
"12628",
"12723",
"2903",
"11629",
"1228",
"8637",
"8318",
"10219",
"6074",
"14135",
"1766",
"2729",
"14752",
"14134",
"13153",
"13039",
"143",
"6694",
"5094",
"2302",
"5554",
"8362",
"3276",
"13808",
"14340",
"13879",
"9111",
"1179",
"14568",
"6809",
"4669",
"9109",
"6650",
"13558",
"6325",
"12715",
"2572",
"4589",
"10447",
"15348",
"685",
"3648",
"8939",
"653",
"10723",
"11407",
"12900",
"887",
"1747",
"15194",
"15284",
"9017",
"13874",
"6179",
"4117",
"7197",
"4683",
"7499",
"10362",
"1288",
"5726",
"6786",
"4463",
"2431",
"5475",
"14521",
"13621",
"2853",
"3170",
"6910",
"100",
"13271",
"9513",
"4496",
"3221",
"7195",
"5433",
"8564",
"3309",
"9633",
"14286",
"14382",
"3643",
"4444",
"5767",
"7944",
"10090",
"11419",
"14859",
"11598",
"5434",
"11032",
"2301",
"6107",
"5705",
"11713",
"11283",
"8427",
"8835",
"15448",
"455",
"3169",
"14902",
"821",
"1594",
"9000",
"2197",
"12991",
"14349",
"8777",
"6945",
"2106",
"6103",
"14187",
"8048",
"5320",
"12383",
"14219",
"12618",
"8093",
"487",
"7289",
"9873",
"11744",
"9423",
"1290",
"6875",
"13304",
"2099",
"14956",
"11680",
"3466",
"493",
"8762",
"15477",
"2830",
"11546",
"10806",
"10187",
"12341",
"14706",
"9973",
"5648",
"5936",
"14007",
"1821",
"4163",
"11387",
"1239",
"6766",
"2621",
"5202",
"14806",
"4901",
"14995",
"9405",
"491",
"13125",
"9867",
"4671",
"11573",
"1089",
"5831",
"7167",
"12420",
"3399",
"15092",
"11960",
"6017",
"9315",
"8320",
"2368",
"5052",
"2768",
"10116",
"4077",
"9900",
"6796",
"3436",
"7266",
"1893",
"10111",
"2157",
"1524",
"11623",
"11250",
"14443",
"1755",
"6891",
"10168",
"7610",
"8608",
"14988",
"3712",
"1096",
"6640",
"15340",
"4356",
"14786",
"9418",
"14076",
"10867",
"1245",
"14671",
"4610",
"8279",
"10818",
"4973",
"12829",
"10610",
"2747",
"2273",
"11493",
"11851",
"3053",
"2978",
"11911",
"3884",
"5790",
"2540",
"13876",
"8648",
"8840",
"7307",
"5840",
"9184",
"864",
"8390",
"4089",
"7269",
"7669",
"5895",
"12801",
"11139",
"531",
"4912",
"2791",
"7313",
"9912",
"7652",
"3528",
"1299",
"9134",
"25",
"843",
"3947",
"15407",
"13435",
"10021",
"4535",
"1410",
"11065",
"12330",
"11175",
"2375",
"4184",
"14632",
"9196",
"2921",
"11253",
"14523",
"3463",
"1387",
"5914",
"2305",
"5687",
"7005",
"3198",
"12805",
"968",
"7362",
"2282",
"6649",
"3291",
"10167",
"15249",
"10918",
"9120",
"10896",
"8420",
"14439",
"1574",
"11485",
"8921",
"540",
"3922",
"972",
"10209",
"2653",
"10702",
"2161",
"13583",
"356",
"8429",
"13953",
"13569",
"13149",
"6837",
"8386",
"5594",
"6665",
"7239",
"7120",
"10226",
"6332",
"4624",
"13901",
"5360",
"11342",
"5427",
"1761",
"13356",
"14616",
"6985",
"10558",
"9438",
"6134",
"14415",
"11726",
"8167",
"6415",
"10298",
"13993",
"8374",
"7467",
"11369",
"12606",
"5527",
"11972",
"11381",
"3754",
"13355",
"14411",
"13743",
"14984",
"13178",
"15344",
"9264",
"13726",
"1511",
"1206",
"12577",
"1325",
"7453",
"3842",
"467",
"7907",
"4577",
"12450",
"10089",
"7955",
"7128",
"12175",
"11436",
"14246",
"13232",
"14341",
"4556",
"5431",
"3660",
"8096",
"12565",
"10352",
"6116",
"13533",
"5892",
"8915",
"14018",
"3811",
"12942",
"9750",
"3421",
"1324",
"11553",
"9989",
"14013",
"5359",
"11873",
"7825",
"14050",
"12651",
"14962",
"1967",
"7563",
"15179",
"8397",
"7189",
"5268",
"6771",
"14713",
"228",
"4315",
"10725",
"13717",
"12935",
"9626",
"3145",
"835",
"6413",
"2138",
"12459",
"4530",
"8435",
"7675",
"13338",
"9213",
"7558",
"10943",
"12396",
"1472",
"14333",
"6000",
"13648",
"13113",
"11613",
"11807",
"13473",
"5234",
"8628",
"9683",
"15290",
"13588",
"14583",
"9963",
"8583",
"6040",
"6441",
"12167",
"3174",
"818",
"172",
"999",
"9962",
"12998",
"14370",
"4261",
"9255",
"13866",
"11944",
"2924",
"12446",
"9212",
"10253",
"5561",
"10238",
"13390",
"3490",
"8118",
"3219",
"3047",
"1691",
"1799",
"742",
"15123",
"8468",
"9714",
"3938",
"1752",
"8661",
"9797",
"6869",
"10619",
"2410",
"8986",
"13684",
"8715",
"8815",
"4246",
"6055",
"3605",
"13445",
"419",
"11742",
"2644",
"6794",
"11138",
"9102",
"10101",
"15422",
"11649",
"9747",
"5034",
"2335",
"13889",
"4812",
"11064",
"3804",
"5843",
"7254",
"2941",
"12682",
"2794",
"15264",
"3144",
"1123",
"12817",
"10595",
"7850",
"762",
"10098",
"915",
"12011",
"1939",
"4641",
"865",
"10247",
"9275",
"7306",
"7892",
"11780",
"5357",
"299",
"12338",
"6503",
"8708",
"6310",
"10323",
"541",
"6184",
"6450",
"7231",
"5742",
"7494",
"9725",
"8674",
"6093",
"10235",
"12506",
"4043",
"9396",
"3375",
"503",
"9155",
"2973",
"1537",
"14729",
"15268",
"11563",
"12872",
"3533",
"14871",
"3452",
"6110",
"6047",
"15193",
"1616",
"14911",
"12319",
"11465",
"6504",
"898",
"15113",
"7566",
"10456",
"7711",
"3769",
"11373",
"8543",
"385",
"7378",
"2900",
"4840",
"12052",
"2253",
"14609",
"1813",
"10973",
"4401",
"4866",
"14932",
"8591",
"4196",
"1446",
"12756",
"8645",
"3866",
"12631",
"6287",
"12300",
"6388",
"2877",
"6962",
"4741",
"8326",
"8159",
"3308",
"2074",
"13665",
"12993",
"14495",
"872",
"15328",
"6819",
"8289",
"7006",
"10976",
"11922",
"9863",
"15180",
"6623",
"2297",
"12519",
"3787",
"8549",
"1936",
"9302",
"13890",
"8692",
"6362",
"10406",
"12165",
"2657",
"315",
"10526",
"5668",
"14688",
"4376",
"15019",
"11710",
"9952",
"9859",
"3266",
"7470",
"13363",
"4713",
"14089",
"12616",
"3160",
"7199",
"13301",
"13759",
"5098",
"1357",
"13789",
"13057",
"11133",
"947",
"11009",
"4666",
"8998",
"6466",
"13263",
"3203",
"9568",
"9437",
"10189",
"4889",
"8269",
"5848",
"10776",
"8492",
"2129",
"15412",
"14873",
"1988",
"7800",
"9947",
"3545",
"2741",
"1550",
"5743",
"9584",
"12035",
"7479",
"10941",
"12371",
"9177",
"9809",
"9355",
"2366",
"4722",
"8086",
"9821",
"5264",
"10770",
"7584",
"1495",
"8959",
"9723",
"7383",
"13735",
"8154",
"4407",
"12220",
"6770",
"3435",
"6934",
"9381",
"13813",
"14921",
"12273",
"15324",
"7390",
"6023",
"13059",
"3515",
"13484",
"5734",
"3462",
"7945",
"9012",
"15068",
"2170",
"6442",
"2909",
"11953",
"14086",
"8844",
"8878",
"5229",
"12569",
"2598",
"238",
"15459",
"5613",
"9661",
"11918",
"1985",
"9028",
"14673",
"2091",
"488",
"11024",
"14515",
"14262",
"4453",
"13240",
"12033",
"11675",
"4234",
"11333",
"15352",
"9239",
"8550",
"5719",
"4415",
"14422",
"9518",
"9473",
"10199",
"11592",
"14306",
"2172",
"9615",
"3233",
"2146",
"12857",
"2298",
"12573",
"9457",
"9602",
"14447",
"9959",
"11815",
"13578",
"3350",
"14694",
"6396",
"14375",
"12173",
"11843",
"1412",
"10193",
"9390",
"5638",
"2389",
"4070",
"9186",
"13617",
"8470",
"13741",
"9409",
"6594",
"1907",
"207",
"287",
"8818",
"2709",
"2762",
"14901",
"5408",
"13934",
"13935",
"2316",
"6290",
"7953",
"8421",
"12729",
"4177",
"12290",
"3557",
"15063",
"4324",
"12112",
"15439",
"4311",
"15230",
"8640",
"8860",
"9080",
"12272",
"13262",
"12291",
"10719",
"6041",
"1731",
"1456",
"7582",
"11268",
"7939",
"12746",
"9876",
"11410",
"8140",
"13474",
"8192",
"12366",
"6784",
"11039",
"5045",
"6802",
"1906",
"4936",
"7423",
"6763",
"14006",
"1237",
"14276",
"4779",
"6105",
"13757",
"4033",
"15236",
"690",
"3964",
"10379",
"12699",
"15099",
"14479",
"1156",
"15343",
"7887",
"10202",
"8510",
"9576",
"2294",
"9811",
"5085",
"871",
"2798",
"2967",
"14805",
"8313",
"8299",
"7446",
"6997",
"1845",
"1401",
"10459",
"6217",
"11242",
"4149",
"3923",
"774",
"10150",
"3176",
"6805",
"8657",
"212",
"7002",
"10861",
"6340",
"12281",
"15241",
"7430",
"7814",
"9942",
"11484",
"3747",
"12104",
"7694",
"263",
"11692",
"5158",
"7586",
"3239",
"5744",
"14614",
"13305",
"10227",
"9966",
"2506",
"9977",
"10319",
"14973",
"7517",
"4946",
"13581",
"3009",
"9717",
"6590",
"9944",
"7682",
"13342",
"7529",
"7968",
"6747",
"12795",
"4670",
"13985",
"12581",
"10573",
"11554",
"14275",
"1671",
"3439",
"14409",
"10906",
"12351",
"8908",
"6473",
"8991",
"13824",
"10800",
"9801",
"7667",
"10752",
"2266",
"7925",
"14833",
"5471",
"14496",
"4766",
"3430",
"14743",
"14225",
"69",
"13672",
"6611",
"12639",
"6683",
"11933",
"3753",
"14698",
"8764",
"9528",
"3654",
"6401",
"4729",
"1178",
"5571",
"4018",
"13372",
"833",
"10551",
"2484",
"6363",
"7798",
"6593",
"8893",
"1698",
"4048",
"15181",
"10342",
"6137",
"10097",
"13670",
"2964",
"4908",
"8790",
"4637",
"4636",
"5645",
"7620",
"10726",
"4078",
"3272",
"625",
"2932",
"11357",
"928",
"5525",
"6145",
"8498",
"550",
"4050",
"621",
"12224",
"12271",
"11542",
"9124",
"14085",
"11415",
"10256",
"9420",
"9075",
"5346",
"9982",
"13930",
"8350",
"3396",
"3755",
"9842",
"8484",
"8184",
"6878",
"15246",
"2247",
"14041",
"10508",
"2332",
"13012",
"11334",
"14946",
"12461",
"3529",
"7665",
"8316",
"8190",
"11775",
"6863",
"8043",
"5120",
"7188",
"4383",
"8015",
"13658",
"14407",
"14068",
"11315",
"12086",
"4924",
"8050",
"12941",
"15336",
"6543",
"919",
"13331",
"13162",
"14319",
"11567",
"2021",
"9016",
"9363",
"533",
"2636",
"10917",
"7029",
"5866",
"7999",
"5330",
"604",
"6019",
"9827",
"4173",
"11207",
"5122",
"7569",
"10112",
"9171",
"9519",
"8584",
"9161",
"12779",
"10929",
"9425",
"4738",
"4098",
"2673",
"11930",
"2580",
"7408",
"351",
"3079",
"4895",
"7085",
"9829",
"12274",
"279",
"1217",
"3990",
"2482",
"7842",
"3521",
"5395",
"7376",
"6530",
"6281",
"11666",
"8775",
"5172",
"14547",
"10758",
"1443",
"14576",
"929",
"80",
"6777",
"2706",
"2629",
"14429",
"8297",
"12873",
"3104",
"14299",
"2843",
"182",
"3349",
"10347",
"242",
"4481",
"4884",
"15423",
"12074",
"14775",
"11017",
"2841",
"7656",
"8974",
"13002",
"15302",
"15136",
"11157",
"343",
"2210",
"9056",
"7738",
"11451",
"307",
"1200",
"844",
"2267",
"10491",
"4363",
"12978",
"9019",
"9254",
"2878",
"6484",
"5536",
"6263",
"11015",
"2622",
"4906",
"3477",
"6308",
"7088",
"11610",
"459",
"12955",
"1011",
"3441",
"7817",
"11469",
"2850",
"7524",
"9592",
"11804",
"151",
"13776",
"2231",
"14195",
"723",
"12843",
"11285",
"7181",
"5581",
"7309",
"6391",
"3719",
"223",
"14421",
"11576",
"10501",
"9734",
"5804",
"10106",
"5576",
"4569",
"10041",
"9715",
"9703",
"11448",
"11433",
"10418",
"3228",
"3645",
"2833",
"14563",
"1961",
"11199",
"10366",
"1220",
"11564",
"13995",
"7508",
"5079",
"3471",
"14905",
"7802",
"13861",
"14509",
"9294",
"12950",
"13967",
"7377",
"5448",
"13579",
"8413",
"13345",
"3250",
"7386",
"14994",
"4217",
"1174",
"8919",
"3988",
"12602",
"10048",
"11759",
"13040",
"12512",
"11989",
"4985",
"14506",
"7975",
"10785",
"10671",
"3606",
"9550",
"10173",
"4499",
"4568",
"9566",
"7044",
"9130",
"8338",
"4647",
"5347",
"15151",
"4574",
"9132",
"14107",
"3671",
"10445",
"8857",
"10636",
"6159",
"5182",
"10471",
"12555",
"8565",
"11736",
"4783",
"92",
"4470",
"14377",
"5378",
"619",
"8474",
"9752",
"12529",
"10715",
"3089",
"9998",
"6927",
"11151",
"4358",
"12256",
"6234",
"11355",
"1068",
"11405",
"5628",
"9022",
"7123",
"2911",
"12320",
"13254",
"14798",
"5097",
"8554",
"4868",
"8652",
"10123",
"3083",
"4007",
"747",
"10791",
"8914",
"2314",
"13081",
"2277",
"1772",
"2562",
"6902",
"14753",
"15139",
"6572",
"9996",
"6095",
"6906",
"8393",
"8851",
"7154",
"15046",
"9741",
"8113",
"9333",
"9712",
"7460",
"15399",
"13843",
"6556",
"338",
"11889",
"7826",
"2471",
"14627",
"9868",
"15173",
"3669",
"4529",
"10958",
"670",
"10450",
"10668",
"1963",
"12724",
"11806",
"13724",
"8519",
"13",
"13186",
"6922",
"5129",
"10302",
"9861",
"571",
"9206",
"6285",
"12344",
"5259",
"12881",
"10423",
"13463",
"1474",
"3994",
"10044",
"5421",
"9205",
"3919",
"7320",
"2695",
"3411",
"2829",
"10250",
"15158",
"4195",
"147",
"13320",
"5980",
"346",
"7698",
"13176",
"6357",
"825",
"1861",
"4932",
"14500",
"12215",
"1004",
"13357",
"7118",
"15433",
"5530",
"6765",
"10656",
"13747",
"5957",
"7702",
"5111",
"12842",
"11109",
"14589",
"962",
"3109",
"6960",
"15334",
"12668",
"11755",
"10174",
"10952",
"396",
"7492",
"7339",
"8822",
"6313",
"10287",
"1347",
"4559",
"2797",
"11537",
"5838",
"3736",
"4786",
"12121",
"11300",
"5406",
"980",
"2067",
"3614",
"3831",
"5580",
"6043",
"10310",
"8235",
"7060",
"5572",
"15045",
"13378",
"14791",
"14647",
"948",
"9945",
"3078",
"104",
"1434",
"11350",
"494",
"2134",
"125",
"6344",
"6240",
"13440",
"15138",
"9943",
"6768",
"6033",
"14794",
"3292",
"13505",
"9785",
"10151",
"5203",
"10703",
"3325",
"9358",
"10420",
"6616",
"13152",
"6152",
"7161",
"6044",
"5776",
"6714",
"4137",
"3552",
"4015",
"10570",
"7915",
"11993",
"13639",
"5884",
"14330",
"9306",
"8361",
"8609",
"8005",
"5233",
"4154",
"2016",
"6925",
"884",
"6461",
"4760",
"12061",
"6389",
"12846",
"13650",
"783",
"2303",
"8322",
"10851",
"14966",
"14641",
"1519",
"9076",
"4017",
"5810",
"14847",
"9906",
"7579",
"3044",
"8041",
"15103",
"6139",
"11668",
"14090",
"10535",
"5119",
"10801",
"4634",
"13483",
"5219",
"14777",
"12037",
"3384",
"10267",
"12617",
"14454",
"12283",
"5799",
"3246",
"9207",
"1360",
"5505",
"9118",
"11501",
"6150",
"13458",
"15086",
"11223",
"8085",
"931",
"4409",
"9281",
"11120",
"13424",
"13827",
"12131",
"1820",
"8083",
"14121",
"7385",
"14123",
"2517",
"12632",
"6002",
"3644",
"8548",
"8254",
"505",
"3588",
"7768",
"8742",
"3616",
"4935",
"8001",
"4815",
"9520",
"7444",
"1611",
"3702",
"1341",
"1389",
"7026",
"2933",
"14680",
"2195",
"2083",
"8466",
"4011",
"7249",
"9786",
"13314",
"3824",
"7169",
"12953",
"5064",
"15278",
"9621",
"6306",
"4873",
"14600",
"9393",
"5356",
"10790",
"6326",
"9836",
"5673",
"6586",
"10177",
"2400",
"140",
"8079",
"3950",
"938",
"6327",
"9638",
"12252",
"3447",
"8035",
"6187",
"7084",
"8707",
"12062",
"8169",
"4326",
"2874",
"5210",
"3587",
"6845",
"13334",
"11384",
"1496",
"7000",
"9511",
"5510",
"14322",
"12246",
"9940",
"1147",
"9619",
"2866",
"7448",
"14057",
"13983",
"14991",
"11913",
"5463",
"6086",
"15410",
"13190",
"7396",
"10672",
"9686",
"13492",
"14203",
"9509",
"3941",
"4653",
"12939",
"5215",
"3642",
"4240",
"4172",
"6780",
"5618",
"7845",
"5065",
"15360",
"7603",
"2888",
"1042",
"13687",
"10502",
"14832",
"11520",
"13225",
"15321",
"3105",
"3194",
"6141",
"15320",
"9234",
"3279",
"9311",
"5928",
"5349",
"7152",
"10269",
"9627",
"1644",
"12476",
"1602",
"9337",
"13563",
"8778",
"9159",
"12982",
"5543",
"12652",
"7632",
"2620",
"13336",
"5661",
"565",
"15253",
"12554",
"5730",
"5007",
"11289",
"7019",
"12046",
"8089",
"7294",
"1758",
"13917",
"4052",
"15447",
"9934",
"9098",
"4660",
"12198",
"6029",
"2084",
"2193",
"9327",
"7629",
"5965",
"3299",
"11606",
"2371",
"12566",
"3584",
"2477",
"3317",
"3071",
"444",
"2365",
"10018",
"11565",
"3825",
"13921",
"13208",
"12258",
"2821",
"9216",
"4850",
"3476",
"1390",
"72",
"7212",
"4399",
"14620",
"12335",
"812",
"12135",
"11155",
"10147",
"13352",
"7491",
"3687",
"6293",
"7014",
"11688",
"13245",
"10345",
"3029",
"12142",
"5030",
"11622",
"9539",
"8636",
"8776",
"8132",
"2618",
"8177",
"4046",
"365",
"10744",
"3762",
"3282",
"2143",
"4143",
"10479",
"3326",
"3132",
"12390",
"970",
"3460",
"8697",
"2012",
"2398",
"3448",
"13061",
"5362",
"921",
"4905",
"12808",
"13196",
"6257",
"4006",
"3944",
"111",
"9858",
"9772",
"5018",
"6860",
"12915",
"909",
"14850",
"9140",
"8478",
"11801",
"5616",
"15005",
"14737",
"954",
"9924",
"5655",
"141",
"10282",
"11225",
"8808",
"9108",
"6163",
"14877",
"6247",
"4614",
"8769",
"15260",
"5498",
"7745",
"9133",
"2148",
"13958",
"6857",
"5040",
"2226",
"1670",
"10217",
"13449",
"12060",
"2857",
"203",
"13289",
"14128",
"6755",
"427",
"2694",
"13471",
"6736",
"12423",
"11123",
"11740",
"3655",
"10667",
"12532",
"7934",
"6652",
"985",
"956",
"14095",
"4061",
"8540",
"1462",
"7048",
"2849",
"2077",
"6950",
"10052",
"3077",
"7485",
"10154",
"6880",
"1185",
"8880",
"13691",
"8806",
"2397",
"869",
"9224",
"1683",
"7661",
"2846",
"1552",
"1",
"13462",
"9794",
"3304",
"13146",
"3374",
"10249",
"1778",
"3117",
"14451",
"7678",
"10329",
"9447",
"3836",
"652",
"14133",
"3689",
"11695",
"9258",
"1927",
"11243",
"5652",
"13940",
"9374",
"8907",
"8312",
"9902",
"11235",
"2419",
"5601",
"1107",
"1024",
"7879",
"5194",
"6949",
"11179",
"1607",
"5829",
"5856",
"15203",
"10396",
"932",
"264",
"2525",
"13237",
"12627",
"10661",
"8260",
"8186",
"6836",
"2971",
"281",
"2053",
"12010",
"13858",
"15205",
"3443",
"5074",
"12368",
"15329",
"1313",
"4446",
"5780",
"10797",
"886",
"12902",
"14854",
"10186",
"1840",
"10884",
"644",
"1680",
"14868",
"7143",
"13385",
"15040",
"1219",
"4351",
"13243",
"8631",
"11907",
"8896",
"12567",
"4709",
"10947",
"1610",
"6629",
"10953",
"3128",
"3222",
"4435",
"6992",
"10795",
"7914",
"5467",
"11903",
"15200",
"2132",
"14074",
"13468",
"1996",
"3",
"5823",
"1666",
"7686",
"12743",
"3220",
"9448",
"7926",
"6203",
"10138",
"6663",
"794",
"4320",
"9570",
"10448",
"12000",
"2459",
"10268",
"9082",
"6984",
"8127",
"5702",
"15060",
"4171",
"15443",
"4665",
"3849",
"6687",
"12792",
"2915",
"10764",
"4718",
"1303",
"5140",
"6369",
"13931",
"13790",
"3865",
"1595",
"5265",
"7762",
"8698",
"15021",
"3909",
"6417",
"8065",
"12347",
"7959",
"6269",
"4896",
"13339",
"4253",
"10813",
"13671",
"3935",
"11848",
"11707",
"7838",
"5332",
"13607",
"6723",
"8768",
"12883",
"6378",
"13401",
"15163",
"12650",
"10882",
"3206",
"15415",
"3963",
"12458",
"1841",
"2959",
"7598",
"10836",
"6062",
"5204",
"4216",
"11165",
"10399",
"2545",
"3620",
"4226",
"12874",
"13586",
"8755",
"2839",
"12213",
"2761",
"3426",
"3119",
"2819",
"8853",
"7316",
"10547",
"8499",
"9596",
"6039",
"9044",
"2290",
"10783",
"7789",
"15133",
"8128",
"2522",
"4308",
"4249",
"8965",
"8866",
"3744",
"2372",
"15296",
"11130",
"4042",
"12949",
"12885",
"14027",
"14742",
"15101",
"2604",
"9081",
"13923",
"4215",
"12878",
"9965",
"159",
"5480",
"2507",
"9059",
"10210",
"13268",
"1911",
"10061",
"7997",
"7332",
"14663",
"639",
"3879",
"6211",
"930",
"11428",
"2137",
"2103",
"197",
"10266",
"11391",
"6192",
"6350",
"8837",
"13224",
"2483",
"10110",
"3187",
"9291",
"15228",
"1513",
"8076",
"14639",
"9498",
"10220",
"13699",
"14391",
"11277",
"11555",
"2457",
"4560",
"10644",
"15033",
"10705",
"10431",
"15094",
"15317",
"13960",
"12120",
"10034",
"13054",
"4281",
"7013",
"13032",
"1538",
"14456",
"6081",
"6045",
"3603",
"5821",
"226",
"2713",
"6255",
"1868",
"5183",
"15368",
"10815",
"9482",
"3516",
"9605",
"15493",
"12206",
"2655",
"9883",
"9820",
"6207",
"7033",
"12428",
"2409",
"8506",
"8500",
"8981",
"1079",
"7595",
"2031",
"10985",
"15145",
"3405",
"3518",
"12200",
"6510",
"3294",
"5209",
"13614",
"3492",
"15271",
"2380",
"6790",
"10246",
"13060",
"9958",
"7604",
"3440",
"4245",
"4719",
"13293",
"12285",
"5658",
"9994",
"5842",
"9091",
"7954",
"7984",
"15310",
"2383",
"3578",
"5417",
"11794",
"14957",
"7722",
"918",
"11377",
"7382",
"7704",
"1358",
"14289",
"8265",
"1809",
"2479",
"7064",
"7089",
"13258",
"5228",
"11894",
"601",
"10567",
"4611",
"8892",
"15043",
"5231",
"6268",
"11327",
"7204",
"5637",
"14031",
"4321",
"1756",
"2126",
"8725",
"5578",
"7451",
"2412",
"12909",
"749",
"10042",
"10228",
"3264",
"11958",
"10327",
"1065",
"1076",
"4145",
"9814",
"7977",
"11558",
"5401",
"10593",
"11016",
"2381",
"8829",
"12683",
"12897",
"12562",
"6173",
"8940",
"8441",
"810",
"13697",
"13230",
"10419",
"398",
"10297",
"4654",
"3356",
"9806",
"3646",
"4123",
"12128",
"9681",
"8740",
"5033",
"3188",
"5801",
"2835",
"5363",
"5039",
"11987",
"4160",
"15078",
"4305",
"15496",
"7296",
"6785",
"9499",
"13624",
"4054",
"5713",
"14360",
"5944",
"10529",
"13154",
"10903",
"8979",
"7969",
"12481",
"6540",
"14223",
"11367",
"10003",
"12012",
"12584",
"11231",
"9756",
"7436",
"2154",
"15148",
"403",
"7097",
"12278",
"3576",
"3883",
"3513",
"15358",
"13024",
"7432",
"9572",
"5692",
"14344",
"9822",
"11030",
"9470",
"10819",
"9753",
"9375",
"7653",
"11494",
"11644",
"4681",
"10768",
"14889",
"4148",
"6811",
"4132",
"5331",
"12570",
"14305",
"11437",
"12820",
"14293",
"1902",
"13030",
"4954",
"12703",
"6817",
"10294",
"9884",
"14308",
"10334",
"14795",
"7818",
"9189",
"2318",
"8164",
"1836",
"10224",
"4247",
"3394",
"15342",
"11528",
"3081",
"1781",
"9209",
"9700",
"6965",
"13029",
"3562",
"15216",
"4581",
"14916",
"14359",
"5283",
"5361",
"6834",
"8752",
"8804",
"8193",
"8930",
"12806",
"4788",
"13371",
"6336",
"9469",
"14362",
"8958",
"2362",
"204",
"9521",
"3965",
"8480",
"10986",
"2940",
"11359",
"4186",
"8574",
"12722",
"3382",
"5004",
"12782",
"8625",
"6434",
"1759",
"13525",
"3502",
"15325",
"10745",
"14483",
"14645",
"14002",
"2599",
"10291",
"6113",
"1587",
"2589",
"1957",
"12894",
"10469",
"10179",
"8935",
"10686",
"3080",
"13111",
"13587",
"6853",
"1133",
"8247",
"5907",
"13603",
"1897",
"9185",
"5511",
"12136",
"3782",
"1591",
"11540",
"6670",
"7708",
"15366",
"14345",
"7885",
"12437",
"4097",
"6624",
"7855",
"13332",
"12069",
"13771",
"11440",
"14762",
"1448",
"6528",
"12558",
"6752",
"5562",
"4844",
"3214",
"6944",
"4966",
"1264",
"14216",
"11088",
"9514",
"9692",
"5400",
"14845",
"10272",
"13147",
"14941",
"1141",
"10242",
"14029",
"11094",
"2190",
"12392",
"11568",
"5760",
"1785",
"12028",
"6753",
"323",
"15408",
"5126",
"8497",
"12586",
"8115",
"8437",
"98",
"3809",
"9740",
"14530",
"10271",
"11159",
"316",
"9029",
"5973",
"5961",
"8202",
"8836",
"15355",
"2062",
"7457",
"4861",
"7578",
"2044",
"13420",
"13984",
"10944",
"14070",
"15386",
"5206",
"14463",
"4752",
"14829",
"9790",
"9125",
"5921",
"12895",
"13870",
"1546",
"6740",
"12613",
"2140",
"15150",
"8448",
"6678",
"6449",
"15213",
"7511",
"22",
"10825",
"4739",
"1034",
"3876",
"3418",
"2466",
"15389",
"2347",
"14535",
"8151",
"6634",
"4999",
"6800",
"15073",
"14351",
"10698",
"4735",
"1547",
"4480",
"1335",
"9999",
"13893",
"10514",
"198",
"9705",
"8496",
"2223",
"14311",
"6971",
"9290",
"14410",
"2149",
"10853",
"4612",
"14056",
"2443",
"1703",
"5599",
"1499",
"3522",
"14143",
"3122",
"6722",
"15497",
"2200",
"9587",
"8516",
"11635",
"12763",
"14338",
"5150",
"13041",
"11974",
"3135",
"9223",
"7590",
"6792",
"630",
"9547",
"12797",
"13207",
"7282",
"5582",
"13956",
"1392",
"8117",
"14884",
"14380",
"7262",
"11080",
"13872",
"14434",
"11148",
"13464",
"12964",
"12890",
"6542",
"15263",
"6025",
"7217",
"3126",
"4466",
"9751",
"6804",
"5934",
"15111",
"8329",
"1097",
"4225",
"13131",
"169",
"11147",
"12615",
"13392",
"5548",
"7024",
"14585",
"243",
"7836",
"4292",
"10252",
"12109",
"3022",
"9182",
"10885",
"11237",
"6372",
"12030",
"15298",
"3133",
"2931",
"10215",
"15421",
"1202",
"8071",
"13167",
"6423",
"2892",
"8276",
"15059",
"15303",
"11136",
"6065",
"13366",
"8205",
"14545",
"7389",
"1481",
"1366",
"12398",
"12042",
"13502",
"11765",
"1154",
"1378",
"14335",
"260",
"15110",
"8743",
"5474",
"15295",
"9307",
"12195",
"6989",
"5539",
"2086",
"2965",
"5399",
"10930",
"3982",
"11152",
"1777",
"4384",
"8826",
"14423",
"11899",
"13764",
"14115",
"1447",
"14467",
"4239",
"4734",
"11099",
"1007",
"12551",
"5642",
"4346",
"558",
"7990",
"3362",
"12851",
"9787",
"5635",
"7442",
"4600",
"206",
"5666",
"5282",
"11069",
"15225",
"12064",
"11625",
"13964",
"13755",
"7904",
"4231",
"5991",
"10995",
"8773",
"6970",
"10809",
"14637",
"5878",
"13701",
"10239",
"6909",
"13180",
"9010",
"8464",
"5123",
"8927",
"5916",
"8820",
"7548",
"6769",
"265",
"1359",
"11490",
"7530",
"10618",
"4561",
"4960",
"10198",
"5849",
"14441",
"4241",
"9597",
"11195",
"8756",
"10058",
"14379",
"3815",
"267",
"14358",
"5792",
"14926",
"6032",
"5794",
"3234",
"8659",
"10285",
"15280",
"7839",
"14856",
"13704",
"6208",
"8807",
"7532",
"4112",
"499",
"7535",
"3871",
"8767",
"11197",
"737",
"4375",
"3101",
"3414",
"14920",
"1491",
"2872",
"1339",
"10890",
"328",
"11301",
"3061",
"14573",
"1543",
"4852",
"9249",
"3257",
"2549",
"1966",
"9591",
"8131",
"6905",
"13696",
"9018",
"5948",
"3915",
"12467",
"14204",
"1559",
"1249",
"7550",
"8291",
"13766",
"5960",
"11075",
"9645",
"9575",
"13122",
"11183",
"9334",
"436",
"5051",
"13295",
"11949",
"513",
"8733",
"7407",
"9093",
"11140",
"13891",
"4101",
"3527",
"12236",
"10828",
"40",
"2494",
"14670",
"8906",
"8923",
"7242",
"14820",
"9195",
"6360",
"14318",
"15104",
"7755",
"9052",
"5612",
"14536",
"9899",
"6886",
"7367",
"11643",
"8630",
"15282",
"4976",
"9607",
"5060",
"3262",
"6981",
"6728",
"5095",
"13242",
"4986",
"1843",
"9532",
"13555",
"6447",
"10330",
"6781",
"4754",
"12023",
"6299",
"10871",
"4491",
"377",
"3437",
"1121",
"1512",
"11330",
"1853",
"10360",
"4547",
"11768",
"12406",
"3724",
"3402",
"8051",
"11361",
"1498",
"11255",
"7823",
"9813",
"7346",
"12388",
"4934",
"13405",
"14700",
"1229",
"7427",
"6848",
"6249",
"13778",
"13193",
"9923",
"6959",
"4348",
"10950",
"13353",
"5703",
"8245",
"4696",
"10424",
"3211",
"14537",
"1999",
"2720",
"15384",
"6681",
"15048",
"11994",
"13073",
"1197",
"11862",
"12395",
"231",
"14369",
"11947",
"11674",
"14282",
"11345",
"9008",
"12394",
"9210",
"5969",
"2454",
"2632",
"10614",
"1385",
"13175",
"11884",
"9810",
"8295",
"2433",
"2310",
"13281",
"12365",
"13792",
"6500",
"842",
"677",
"13750",
"12992",
"3914",
"5529",
"12830",
"8706",
"3099",
"7562",
"1525",
"13118",
"2806",
"7890",
"7680",
"116",
"3984",
"9997",
"15490",
"3301",
"11887",
"10028",
"11084",
"14501",
"11513",
"14466",
"4967",
"10566",
"3685",
"7113",
"12280",
"13386",
"15431",
"11931",
"6218",
"6195",
"10991",
"6517",
"7081",
"11365",
"960",
"9939",
"1254",
"11074",
"68",
"770",
"1273",
"3284",
"5586",
"9243",
"14209",
"4230",
"3532",
"15062",
"3649",
"437",
"12002",
"189",
"6298",
"10505",
"439",
"11282",
"2369",
"7866",
"2235",
"14508",
"1732",
"7360",
"10957",
"3788",
"3789",
"10349",
"3258",
"7514",
"14462",
"14207",
"2396",
"5832",
"9909",
"5197",
"4248",
"9245",
"10581",
"14450",
"5888",
"15441",
"9462",
"3733",
"6225",
"8021",
"13660",
"10523",
"10597",
"10432",
"7585",
"5626",
"2645",
"6573",
"61",
"8972",
"1886",
"14575",
"1689",
"9731",
"2391",
"13427",
"10470",
"13408",
"4949",
"12772",
"1168",
"8239",
"13015",
"4410",
"463",
"15004",
"4373",
"1274",
"713",
"6432",
"941",
"5133",
"12924",
"12266",
"10129",
"8136",
"8112",
"11062",
"13064",
"13673",
"3041",
"5473",
"7607",
"1160",
"3315",
"13495",
"1898",
"1973",
"13323",
"660",
"901",
"14514",
"8204",
"375",
"13253",
"12557",
"5090",
"9446",
"6068",
"12417",
"15195",
"8343",
"2602",
"1356",
"13274",
"10371",
"6021",
"5385",
"13537",
"10552",
"1994",
"12951",
"10850",
"13019",
"7122",
"9050",
"13066",
"14049",
"12400",
"14030",
"15034",
"6567",
"10157",
"13048",
"10901",
"2088",
"4860",
"15437",
"15347",
"5583",
"8862",
"3607",
"6646",
"12217",
"5327",
"15311",
"3664",
"12728",
"7812",
"8988",
"7841",
"5456",
"1614",
"13532",
"9950",
"7402",
"35",
"12150",
"4845",
"11488",
"4585",
"11774",
"8824",
"9781",
"7522",
"3494",
"5348",
"6014",
"5805",
"4980",
"13369",
"3706",
"1077",
"11952",
"9416",
"9164",
"8856",
"3353",
"14922",
"9344",
"8684",
"8063",
"7251",
"6871",
"15186",
"9370",
"10068",
"7741",
"5044",
"14427",
"14464",
"8019",
"3303",
"1649",
"5335",
"4252",
"12233",
"9221",
"2814",
"5886",
"7326",
"3956",
"148",
"6618",
"13493",
"4956",
"13920",
"1199",
"1382",
"7079",
"5324",
"13509",
"13456",
"10162",
"12133",
"14346",
"8970",
"9585",
"8014",
"2597",
"14236",
"6788",
"1058",
"14178",
"702",
"13590",
"15385",
"1620",
"4593",
"13291",
"935",
"12265",
"6361",
"6626",
"2646",
"2402",
"8451",
"9379",
"6443",
"2407",
"12094",
"1588",
"4804",
"2376",
"12048",
"9430",
"6307",
"12308",
"8877",
"15065",
"4902",
"15247",
"14662",
"13311",
"13703",
"3626",
"12026",
"7759",
"3763",
"14040",
"2816",
"8812",
"8992",
"6808",
"7913",
"9793",
"2009",
"866",
"11053",
"9450",
"13807",
"8886",
"11925",
"14381",
"471",
"5185",
"2968",
"12421",
"1842",
"10201",
"3199",
"3379",
"9359",
"10879",
"8718",
"11460",
"13415",
"14078",
"8022",
"66",
"13112",
"8224",
"4366",
"5585",
"12735",
"1395",
"13189",
"14997",
"4909",
"15026",
"5605",
"6309",
"9341",
"9516",
"2097",
"6320",
"10669",
"8590",
"4165",
"5517",
"5748",
"1086",
"14191",
"9879",
"15483",
"1630",
"1088",
"15281",
"2469",
"13496",
"3076",
"8122",
"1224",
"11220",
"1580",
"14665",
"306",
"5334",
"308",
"9400",
"10206",
"13388",
"8352",
"4133",
"12596",
"14876",
"120",
"9970",
"8833",
"10652",
"15381",
"5924",
"11318",
"9112",
"5602",
"13834",
"7920",
"5458",
"2896",
"12832",
"11832",
"11026",
"4166",
"5910",
"5167",
"2822",
"15351",
"15470",
"5862",
"10119",
"14403",
"610",
"9413",
"7001",
"10774",
"5393",
"2958",
"11791",
"14830",
"951",
"13637",
"10183",
"4282",
"8961",
"4942",
"13695",
"1459",
"5683",
"7965",
"11337",
"7771",
"3823",
"14024",
"8417",
"14668",
"5429",
"14265",
"5217",
"1606",
"1950",
"13063",
"8792",
"11932",
"1797",
"4990",
"11587",
"1692",
"2823",
"14167",
"15142",
"15140",
"9767",
"14601",
"8572",
"1676",
"15354",
"8150",
"15130",
"4872",
"9",
"1810",
"3859",
"412",
"9031",
"3018",
"5565",
"11581",
"14574",
"683",
"3837",
"7086",
"6386",
"475",
"10648",
"5006",
"15211",
"7989",
"7114",
"10102",
"7951",
"7075",
"12711",
"11580",
"2871",
"1484",
"5927",
"6602",
"14444",
"8172",
"11787",
"13136",
"5809",
"4449",
"2802",
"11487",
"9921",
"5410",
"14996",
"7544",
"7908",
"10773",
"2414",
"7433",
"13864",
"13465",
"4291",
"2949",
"3713",
"7164",
"914",
"181",
"13546",
"3868",
"14156",
"7447",
"13298",
"11979",
"10100",
"637",
"12393",
"13875",
"14012",
"371",
"5749",
"6932",
"5693",
"211",
"3624",
"5459",
"650",
"3290",
"482",
"4885",
"14987",
"2963",
"6807",
"15153",
"12988",
"3633",
"2251",
"592",
"5230",
"705",
"7782",
"1204",
"2261",
"5160",
"8898",
"7092",
"14927",
"727",
"8606",
"289",
"9808",
"3050",
"593",
"8530",
"14588",
"13174",
"8157",
"3318",
"4523",
"1979",
"13091",
"5607",
"10115",
"13373",
"12465",
"2349",
"13666",
"10355",
"14787",
"9040",
"10188",
"9465",
"3348",
"6009",
"7238",
"12385",
"988",
"8077",
"12708",
"8704",
"15245",
"3591",
"6130",
"7941",
"9404",
"9639",
"11512",
"4692",
"392",
"12071",
"5861",
"5307",
"3613",
"472",
"4755",
"1187",
"14797",
"15429",
"12611",
"5243",
"144",
"6295",
"13606",
"7931",
"9580",
"14779",
"813",
"6923",
"1471",
"3204",
"4661",
"9567",
"5614",
"15084",
"13050",
"395",
"1991",
"13712",
"8214",
"8087",
"13796",
"2102",
"9352",
"8266",
"14772",
"5297",
"4548",
"14255",
"12597",
"14682",
"2311",
"410",
"12442",
"2904",
"15208",
"8676",
"10651",
"9270",
"789",
"11633",
"9501",
"6708",
"14432",
"5370",
"14511",
"2854",
"9840",
"3826",
"2276",
"12981",
"12345",
"1214",
"8333",
"14612",
"2468",
"5455",
"11421",
"7673",
"5266",
"14251",
"4579",
"33",
"5997",
"14885",
"3960",
"15135",
"6577",
"13421",
"2612",
"3059",
"1423",
"10303",
"13523",
"9949",
"9693",
"7281",
"3341",
"10654",
"15036",
"6376",
"271",
"10533",
"11422",
"12004",
"13625",
"5724",
"8622",
"3373",
"13527",
"12116",
"9969",
"13791",
"5214",
"4555",
"8903",
"13686",
"15182",
"4640",
"11112",
"6555",
"8185",
"4673",
"12264",
"14177",
"14550",
"8527",
"8231",
"11057",
"12163",
"8495",
"4408",
"3821",
"8673",
"3252",
"9975",
"9386",
"10710",
"14221",
"730",
"7387",
"9347",
"10794",
"13987",
"1857",
"3774",
"3828",
"6280",
"14457",
"11502",
"14765",
"14561",
"8650",
"5624",
"3680",
"13150",
"11211",
"6209",
"430",
"12146",
"9408",
"6778",
"5295",
"2167",
"3302",
"9777",
"12509",
"12155",
"11890",
"6491",
"12969",
"6612",
"4521",
"11456",
"10054",
"545",
"2307",
"3342",
"9744",
"6481",
"9991",
"8191",
"9678",
"2777",
"15304",
"10196",
"11239",
"1556",
"1912",
"12325",
"2326",
"9319",
"11743",
"5419",
"13198",
"7464",
"1210",
"14651",
"13052",
"1613",
"4110",
"5442",
"3157",
"13068",
"7790",
"13130",
"6947",
"10565",
"3814",
"10240",
"14035",
"7924",
"2537",
"5477",
"12239",
"585",
"9988",
"12811",
"6114",
"3366",
"10765",
"11864",
"11585",
"880",
"9157",
"12899",
"8868",
"2716",
"4213",
"8995",
"9903",
"5826",
"11719",
"6106",
"3800",
"11228",
"15487",
"5553",
"3226",
"76",
"12511",
"1955",
"14067",
"15017",
"9287",
"5968",
"6603",
"13992",
"5139",
"2155",
"9389",
"9009",
"10594",
"2698",
"8389",
"11313",
"1138",
"10176",
"4024",
"12008",
"11588",
"7963",
"6838",
"874",
"15398",
"3397",
"9042",
"2824",
"7051",
"5041",
"1444",
"8760",
"11611",
"1108",
"2192",
"10693",
"6196",
"7893",
"10679",
"13478",
"14260",
"6505",
"2101",
"10612",
"1379",
"10817",
"14754",
"7277",
"2450",
"12798",
"6509",
"2432",
"13115",
"7729",
"6439",
"9913",
"11733",
"3253",
"11481",
"402",
"14810",
"13575",
"1528",
"6277",
"8133",
"10601",
"14347",
"9832",
"1218",
"14320",
"2654",
"5898",
"8103",
"5137",
"14080",
"14373",
"13187",
"903",
"14093",
"11634",
"15185",
"2130",
"12292",
"14493",
"11403",
"12282",
"2519",
"9851",
"10927",
"7135",
"5246",
"7292",
"10274",
"4618",
"2804",
"152",
"14471",
"2669",
"6522",
"3697",
"7889",
"8158",
"3043",
"6939",
"10057",
"2635",
"8440",
"14606",
"10254",
"6735",
"6707",
"15416",
"7228",
"2827",
"10562",
"14149",
"518",
"7531",
"7415",
"5567",
"6598",
"4897",
"7200",
"5577",
"12494",
"4084",
"9669",
"10286",
"7857",
"4540",
"3977",
"7241",
"10164",
"3551",
"12862",
"10473",
"15387",
"14356",
"10641",
"2342",
"5721",
"8842",
"4981",
"2426",
"10916",
"6666",
"13996",
"12241",
"13829",
"12604",
"1488",
"13981",
"3120",
"15051",
"4397",
"14184",
"131",
"10333",
"5796",
"13075",
"11329",
"3725",
"3568",
"13099",
"4823",
"7628",
"6157",
"4831",
"15456",
"2384",
"817",
"5740",
"6575",
"12867",
"715",
"9675",
"3604",
"12800",
"5942",
"1149",
"10172",
"4965",
"4716",
"2510",
"15256",
"4385",
"14621",
"1084",
"7668",
"1989",
"1959",
"7411",
"64",
"250",
"2337",
"10283",
"12517",
"12809",
"8472",
"3609",
"426",
"3983",
"7284",
"5893",
"5787",
"5267",
"14893",
"5756",
"13251",
"11003",
"12268",
"1929",
"966",
"14148",
"10739",
"4506",
"12759",
"12355",
"2607",
"11883",
"13469",
"10480",
"1234",
"13845",
"10857",
"13072",
"10821",
"7437",
"8602",
"4091",
"9987",
"5379",
"12783",
"3888",
"4192",
"413",
"9716",
"5996",
"7130",
"13865",
"14976",
"14913",
"6379",
"12234",
"1131",
"3278",
"14218",
"1665",
"10993",
"11173",
"1461",
"7912",
"3496",
"8010",
"4034",
"14518",
"6053",
"7568",
"12870",
"15378",
"8271",
"8805",
"13062",
"2164",
"446",
"3051",
"1421",
"12194",
"6580",
"1600",
"9069",
"2052",
"3372",
"6178",
"11054",
"6982",
"78",
"5917",
"1013",
"5073",
"13438",
"6079",
"9623",
"1281",
"10273",
"8237",
"10580",
"4789",
"3444",
"12968",
"5438",
"14860",
"1641",
"15067",
"10331",
"8414",
"5457",
"12225",
"2828",
"13667",
"1806",
"14553",
"974",
"6548",
"300",
"13786",
"8412",
"3369",
"11201",
"7177",
"8391",
"6366",
"9854",
"6911",
"8967",
"14504",
"14892",
"11303",
"13640",
"9936",
"10376",
"13317",
"14366",
"14397",
"4345",
"5159",
"13260",
"10931",
"10465",
"523",
"12775",
"3040",
"4113",
"15198",
"7994",
"10858",
"2501",
"11158",
"5830",
"5287",
"7541",
"4147",
"3033",
"14970",
"13168",
"14398",
"12943",
"3641",
"9397",
"3858",
"15318",
"5478",
"3852",
"11562",
"11380",
"3042",
"7046",
"12027",
"14852",
"13835",
"5646",
"3589",
"7090",
"2837",
"10484",
"2800",
"6174",
"1862",
"11527",
"15210",
"7775",
"6206",
"13375",
"5532",
"13290",
"13820",
"11266",
"9724",
"6706",
"6246",
"14710",
"12630",
"759",
"11259",
"13074",
"11735",
"13698",
"9434",
"5372",
"15243",
"13952",
"11943",
"2997",
"13391",
"3797",
"10460",
"14303",
"8055",
"14778",
"13354",
"5764",
"8643",
"183",
"13226",
"3386",
"12203",
"3138",
"5398",
"11417",
"11093",
"12583",
"6064",
"3880",
"14052",
"1744",
"3190",
"10002",
"9197",
"9907",
"1342",
"11904",
"5864",
"13604",
"11477",
"8240",
"10582",
"13374",
"8324",
"12995",
"12276",
"5023",
"5016",
"6677",
"5092",
"15119",
"8272",
"6948",
"11697",
"6647",
"5839",
"9758",
"14827",
"11963",
"4087",
"9603",
"6688",
"13212",
"14532",
"13234",
"4477",
"9517",
"2956",
"481",
"6061",
"9100",
"12204",
"10493",
"12819",
"3590",
"3682",
"11552",
"6018",
"6609",
"10724",
"14571",
"4418",
"101",
"4757",
"1092",
"8916",
"2815",
"4933",
"6596",
"7703",
"4880",
"2898",
"8228",
"2179",
"4431",
"10321",
"2917",
"14296",
"6171",
"3140",
"927",
"9480",
"15279",
"11276",
"8267",
"3856",
"12903",
"7608",
"4505",
"13947",
"8663",
"10279",
"1921",
"7032",
"10936",
"14213",
"9477",
"11627",
"7419",
"13584",
"10625",
"15435",
"12526",
"13248",
"4237",
"7565",
"10030",
"11834",
"4182",
"6317",
"14165",
"1285",
"11938",
"1450",
"5189",
"12578",
"14170",
"1565",
"1155",
"12531",
"6322",
"9979",
"11461",
"9038",
"5428",
"7178",
"2778",
"13500",
"2241",
"8973",
"10877",
"11346",
"12675",
"13164",
"891",
"12289",
"736",
"15049",
"10749",
"1194",
"13656",
"4257",
"497",
"8423",
"6916",
"8119",
"8952",
"6279",
"5304",
"8463",
"7921",
"7286",
"11251",
"3286",
"12691",
"14169",
"15419",
"14633",
"6352",
"84",
"1904",
"758",
"10314",
"14405",
"2639",
"5314",
"7750",
"1768",
"1090",
"3663",
"6236",
"14254",
"11651",
"615",
"4558",
"5538",
"7515",
"4314",
"11256",
"7824",
"6149",
"2428",
"2162",
"4211",
"3861",
"3058",
"3296",
"12945",
"12039",
"13562",
"14309",
"11865",
"1760",
"1827",
"14704",
"6355",
"13247",
"12796",
"8891",
"5704",
"9652",
"4944",
"12740",
"6424",
"8690",
"15446",
"8396",
"5056",
"5470",
"7587",
"6969",
"11909",
"2403",
"7409",
"6303",
"369",
"1300",
"3583",
"11826",
"4362",
"11570",
"12159",
"14015",
"10180",
"7291",
"1221",
"14971",
"6604",
"15023",
"3179",
"14494",
"11579",
"14878",
"10977",
"6637",
"1736",
"9158",
"4187",
"15054",
"14559",
"15424",
"6712",
"15093",
"8360",
"14720",
"10259",
"15196",
"14774",
"8367",
"8098",
"13296",
"899",
"827",
"4416",
"1568",
"7458",
"8082",
"5686",
"910",
"12502",
"6194",
"4900",
"5876",
"3108",
"14205",
"4887",
"8811",
"2979",
"13312",
"8850",
"5298",
"3274",
"1938",
"13727",
"14210",
"218",
"7370",
"12686",
"2188",
"12022",
"838",
"6359",
"4142",
"5226",
"12024",
"15289",
"15411",
"6632",
"10237",
"1972",
"11263",
"13662",
"293",
"14899",
"13031",
"7102",
"13433",
"13108",
"3619",
"1700",
"1442",
"2236",
"11181",
"5284",
"10497",
"8017",
"8162",
"14556",
"14577",
"8681",
"1494",
"9053",
"13948",
"4753",
"3313",
"1583",
"1258",
"14136",
"13219",
"14734",
"5035",
"10537",
"418",
"6689",
"11441",
"830",
"9865",
"3785",
"823",
"5186",
"8307",
"4613",
"1887",
"7744",
"6974",
"5188",
"13412",
"15082",
"5156",
"8340",
"83",
"232",
"8873",
"3684",
"1839",
"12305",
"10655",
"6693",
"9850",
"12791",
"6343",
"614",
"4151",
"15309",
"13517",
"2677",
"10316",
"1056",
"1148",
"10888",
"8616",
"10091",
"10495",
"12692",
"13902",
"4842",
"3489",
"7552",
"273",
"11597",
"6921",
"7489",
"3468",
"13003",
"7783",
"11076",
"6421",
"12301",
"834",
"8664",
"2582",
"6200",
"11672",
"2182",
"14273",
"14334",
"5847",
"8888",
"15452",
"103",
"3873",
"2329",
"2780",
"310",
"5909",
"8759",
"7906",
"7787",
"158",
"11312",
"1584",
"7070",
"8701",
"7779",
"13885",
"2899",
"5926",
"6472",
"5676",
"7414",
"10682",
"9200",
"8277",
"3523",
"11977",
"10207",
"12455",
"12827",
"809",
"205",
"13472",
"11092",
"2014",
"1882",
"13589",
"15174",
"2159",
"1350",
"2836",
"277",
"5685",
"14590",
"3530",
"8605",
"5731",
"2209",
"6347",
"1737",
"5308",
"6877",
"597",
"8399",
"7726",
"595",
"4059",
"15066",
"340",
"14654",
"6250",
"8199",
"3064",
"2320",
"819",
"8538",
"5941",
"2565",
"10585",
"607",
"14841",
"11132",
"10596",
"224",
"2059",
"11792",
"199",
"14252",
"1830",
"9800",
"6591",
"12477",
"2299",
"3297",
"9039",
"10579",
"11408",
"15395",
"8426",
"3961",
"10642",
"12486",
"14046",
"5447",
"7461",
"9881",
"7028",
"4871",
"11306",
"10796",
"8182",
"5943",
"4210",
"15252",
"606",
"12497",
"4761",
"8263",
"2708",
"9954",
"4460",
"7356",
"10070",
"14781",
"13244",
"1681",
"12343",
"6529",
"8137",
"13979",
"15402",
"6377",
"14097",
"2844",
"8839",
"11599",
"9749",
"4298",
"11070",
"4701",
"9560",
"6301",
"5795",
"9218",
"863",
"3259",
"5155",
"5714",
"10728",
"1117",
"14461",
"6097",
"15356",
"12545",
"173",
"14117",
"4352",
"14314",
"9760",
"14840",
"6574",
"1729",
"7753",
"6213",
"10066",
"9107",
"4567",
"2225",
"12003",
"3718",
"11083",
"1919",
"1834",
"3792",
"13959",
"8870",
"6758",
"12361",
"4141",
"9439",
"6143",
"1623",
"7899",
"3376",
"13302",
"1475",
"12676",
"13269",
"3621",
"6520",
"10299",
"8477",
"5005",
"1782",
"12157",
"9812",
"11399",
"1829",
"452",
"12286",
"5522",
"2156",
"7364",
"9283",
"4000",
"12854",
"225",
"6898",
"847",
"11278",
"7832",
"3263",
"9864",
"15122",
"1530",
"10893",
"14124",
"4785",
"6227",
"5443",
"4849",
"13895",
"10134",
"14386",
"12750",
"126",
"14656",
"10972",
"5306",
"5697",
"5891",
"11041",
"1515",
"12411",
"14659",
"2098",
"5010",
"4086",
"4467",
"6204",
"160",
"4364",
"7466",
"2579",
"3497",
"6862",
"3491",
"14939",
"7285",
"6050",
"11376",
"8344",
"1673",
"5959",
"11442",
"361",
"8703",
"3026",
"15413",
"13330",
"3535",
"258",
"14229",
"3024",
"4040",
"5328",
"3756",
"8143",
"9880",
"14875",
"6741",
"11788",
"4693",
"6474",
"12124",
"15323",
"3569",
"15166",
"12127",
"4051",
"10755",
"9657",
"13961",
"3224",
"6691",
"4436",
"5746",
"4805",
"9304",
"9174",
"5270",
"5606",
"5992",
"11630",
"6393",
"11722",
"11836",
"4522",
"14392",
"10027",
"5640",
"13126",
"14844",
"1723",
"11853",
"6584",
"13761",
"11808",
"3585",
"1800",
"7126",
"603",
"1653",
"15379",
"5885",
"2774",
"5557",
"1941",
"12852",
"7734",
"11654",
"15116",
"13777",
"5773",
"14837",
"9937",
"10538",
"14908",
"3775",
"14896",
"13266",
"11822",
"3293",
"6238",
"744",
"4839",
"2928",
"14234",
"7859",
"177",
"8195",
"10969",
"5542",
"13547",
"3934",
"1153",
"6237",
"10326",
"2002",
"13423",
"14374",
"5563",
"14438",
"14910",
"6302",
"14053",
"4573",
"9013",
"9727",
"3086",
"3191",
"13082",
"12148",
"4525",
"4260",
"7334",
"11673",
"14823",
"1319",
"13179",
"6373",
"6341",
"15339",
"9328",
"8105",
"2393",
"12733",
"14707",
"13692",
"2504",
"3549",
"10148",
"6445",
"10156",
"296",
"2787",
"7328",
"15361",
"1080",
"13358",
"500",
"13310",
"14180",
"7910",
"1924",
"803",
"8160",
"12837",
"7903",
"7078",
"6042",
"5430",
"8114",
"11897",
"10146",
"6999",
"6613",
"3338",
"14017",
"6719",
"5068",
"14513",
"4462",
"11101",
"6329",
"7483",
"848",
"15468",
"8384",
"1908",
"5220",
"769",
"6286",
"13461",
"13795",
"14280",
"2288",
"7303",
"14482",
"2770",
"12936",
"11665",
"12528",
"14867",
"9729",
"1293",
"13499",
"2128",
"1923",
"6825",
"11752",
"2135",
"4297",
"13521",
"12346",
"8937",
"8515",
"2826",
"7627",
"4971",
"741",
"2943",
"6235",
"7636",
"14931",
"13367",
"8443",
"3420",
"8736",
"5592",
"15405",
"9695",
"1748",
"13508",
"9156",
"961",
"10248",
"12901",
"10568",
"12250",
"12457",
"3807",
"5273",
"9641",
"7116",
"7011",
"2628",
"13772",
"9733",
"12764",
"2142",
"10486",
"10709",
"1181",
"4890",
"6380",
"5152",
"7295",
"8694",
"1040",
"9862",
"12920",
"14054",
"14581",
"5227",
"613",
"14631",
"14964",
"3698",
"1771",
"4200",
"11982",
"10142",
"9121",
"6403",
"8635",
"11022",
"2150",
"2642",
"6894",
"12559",
"10455",
"12841",
"7581",
"4664",
"12549",
"480",
"9914",
"11594",
"1489",
"14610",
"8567",
"7257",
"9655",
"7343",
"3808",
"3692",
"4776",
"483",
"6463",
"7039",
"5043",
"6418",
"6571",
"8171",
"7117",
"2339",
"12813",
"13347",
"989",
"7993",
"10464",
"1445",
"4801",
"11983",
"8107",
"9094",
"6243",
"4746",
"6943",
"6260",
"2104",
"13341",
"10435",
"10519",
"10845",
"13912",
"5024",
"14048",
"11968",
"504",
"14744",
"14721",
"4508",
"14874",
"2215",
"7496",
"9774",
"15131",
"11219",
"15440",
"1226",
"11524",
"3710",
"13114",
"5464",
"8671",
"2333",
"13512",
"10023",
"6471",
"12510",
"7869",
"1469",
"8355",
"1402",
"13675",
"5436",
"8593",
"6486",
"8753",
"7314",
"2610",
"1599",
"7723",
"5793",
"1139",
"14725",
"123",
"14343",
"4903",
"14947",
"14796",
"2743",
"11291",
"11607",
"6743",
"1046",
"15127",
"3739",
"4501",
"11446",
"14855",
"9671",
"7614",
"13560",
"2753",
"6749",
"7202",
"12552",
"7806",
"2926",
"7947",
"6918",
"949",
"8060",
"3701",
"13978",
"8153",
"11452",
"4652",
"6846",
"1560",
"14130",
"9051",
"14592",
"9035",
"2930",
"5038",
"8647",
"6506",
"12073",
"8596",
"5175",
"10908",
"757",
"1604",
"8469",
"6323",
"13406",
"8665",
"5883",
"8180",
"7662",
"10968",
"10712",
"12931",
"11432",
"7868",
"6284",
"594",
"15121",
"3531",
"5413",
"7159",
"5353",
"4029",
"5783",
"6398",
"6215",
"13128",
"11892",
"12229",
"10306",
"12702",
"12665",
"14256",
"7935",
"10833",
"2945",
"9489",
"490",
"8178",
"3483",
"11317",
"4088",
"5802",
"13053",
"14533",
"3166",
"5807",
"14732",
"6600",
"14692",
"14849",
"6482",
"8613",
"4370",
"2614",
"6775",
"2897",
"10457",
"10324",
"11116",
"4551",
"3737",
"6455",
"13380",
"5337",
"14999",
"9005",
"7792",
"5381",
"5404",
"6568",
"13159",
"11028",
"3218",
"9704",
"8416",
"5127",
"13693",
"14907",
"15191",
"10965",
"9435",
"5551",
"3896",
"13821",
"15233",
"7932",
"655",
"5280",
"11703",
"9696",
"1535",
"10792",
"15488",
"13101",
"1411",
"3519",
"5820",
"10008",
"10692",
"13116",
"7455",
"8009",
"14541",
"9070",
"9153",
"9534",
"1384",
"5853",
"13595",
"15183",
"7174",
"1716",
"10673",
"7577",
"1656",
"14290",
"7272",
"12984",
"3456",
"11760",
"74",
"867",
"10750",
"5249",
"3054",
"6699",
"15237",
"14071",
"11706",
"4518",
"12179",
"11745",
"13328",
"4022",
"13306",
"1241",
"9269",
"6402",
"6912",
"5042",
"2587",
"697",
"8504",
"15273",
"13722",
"4545",
"15291",
"11027",
"10163",
"15275",
"8677",
"13279",
"9364",
"7519",
"10767",
"12714",
"6129",
"8296",
"575",
"3844",
"8845",
"9493",
"2570",
"10926",
"7805",
"6412",
"4715",
"1669",
"14233",
"1043",
"13635",
"15128",
"364",
"15374",
"9190",
"12186",
"7534",
"1208",
"859",
"6553",
"6488",
"13548",
"6901",
"9326",
"4395",
"13780",
"8342",
"11914",
"7774",
"3700",
"14449",
"7600",
"9007",
"4251",
"638",
"3822",
"1201",
"973",
"3283",
"10832",
"14652",
"5176",
"3469",
"55",
"2812",
"4643",
"8511",
"13203",
"14172",
"1705",
"9047",
"4615",
"154",
"6745",
"1372",
"9119",
"12907",
"12219",
"12580",
"596",
"4271",
"6385",
"191",
"7815",
"8072",
"15184",
"12473",
"3639",
"1709",
"6924",
"5531",
"7025",
"11187",
"3773",
"5504",
"2404",
"2840",
"13280",
"1173",
"14248",
"9976",
"8244",
"7160",
"4378",
"3561",
"10442",
"12812",
"15009",
"4602",
"2070",
"9748",
"15401",
"11584",
"3481",
"14249",
"1856",
"3615",
"13572",
"8460",
"5770",
"7808",
"1415",
"8791",
"12518",
"4495",
"6828",
"13904",
"10657",
"4371",
"8745",
"5272",
"14484",
"15146",
"6672",
"3324",
"8879",
"3415",
"435",
"2178",
"2929",
"6354",
"2609",
"9272",
"3745",
"2773",
"15061",
"8174",
"5549",
"1346",
"6078",
"2152",
"9672",
"1289",
"1075",
"5827",
"7428",
"4619",
"2385",
"3424",
"14549",
"4001",
"878",
"12660",
"12223",
"2725",
"5113",
"1162",
"9296",
"2649",
"319",
"13036",
"13988",
"11060",
"3212",
"13214",
"5254",
"53",
"3000",
"474",
"3063",
"1715",
"1503",
"8262",
"10069",
"12638",
"8108",
"13881",
"13609",
"3337",
"10843",
"9354",
"2693",
"1407",
"11793",
"2553",
"572",
"4874",
"5067",
"8410",
"7329",
"4214",
"3819",
"740",
"4108",
"12503",
"4044",
"5382",
"13945",
"34",
"8865",
"6595",
"1794",
"2586",
"8869",
"11331",
"4920",
"11348",
"10886",
"7949",
"6762",
"7348",
"7336",
"12108",
"2954",
"7863",
"6006",
"3625",
"641",
"1417",
"6292",
"6460",
"14033",
"5745",
"8411",
"10934",
"10428",
"13730",
"9373",
"15097",
"10633",
"7184",
"5659",
"3186",
"14998",
"9852",
"4570",
"7735",
"7234",
"9060",
"2467",
"13161",
"5603",
"3075",
"7110",
"11322",
"8141",
"2286",
"11169",
"20",
"7936",
"7795",
"14949",
"9755",
"11294",
"3743",
"298",
"12168",
"14082",
"14684",
"4103",
"1010",
"3920",
"3031",
"9422",
"5654",
"5317",
"6436",
"3746",
"15028",
"3853",
"2532",
"852",
"11992",
"13661",
"12089",
"13211",
"2356",
"772",
"170",
"7555",
"10870",
"12546",
"4846",
"8594",
"9366",
"4181",
"9126",
"7015",
"5496",
"9922",
"6409",
"14714",
"7737",
"1643",
"1343",
"11603",
"3470",
"7615",
"13592",
"8168",
"5897",
"8723",
"7875",
"9238",
"15489",
"1094",
"4335",
"4533",
"2025",
"5207",
"5619",
"7298",
"15469",
"8890",
"3124",
"9011",
"10979",
"13377",
"5863",
"477",
"8433",
"13135",
"7439",
"6496",
"5754",
"3635",
"13585",
"14001",
"3672",
"13925",
"6289",
"3507",
"6955",
"5763",
"5444",
"13124",
"1534",
"9552",
"12844",
"11352",
"2541",
"11816",
"10736",
"8738",
"8434",
"2056",
"13246",
"1629",
"7911",
"6961",
"1083",
"8156",
"8310",
"4188",
"7591",
"11214",
"10251",
"4520",
"2873",
"8057",
"3116",
"12677",
"12172",
"13335",
"10847",
"3631",
"2760",
"4105",
"1182",
"13479",
"5889",
"8422",
"15074",
"12025",
"107",
"1009",
"3016",
"5058",
"15434",
"9590"
] |
FurqanNiazi/swin-tiny-patch4-window7-224-finetuned-eurosat
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-eurosat
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the arrow dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2059
- Accuracy: 0.3835
- F1: 0.0455
- Precision: 0.0442
- Recall: 0.0468
- Auc Roc: 0.5665
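The F1, precision, recall, and AUC-ROC above point to a multi-label setup over the chest X-ray findings listed after this card. A minimal `compute_metrics` sketch along those lines, assuming sigmoid outputs, a 0.5 decision threshold, and macro averaging (none of which the card confirms):
```python
import numpy as np
from sklearn.metrics import f1_score, precision_score, recall_score, roc_auc_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred              # logits: (n_samples, n_labels); labels: 0/1 matrix
    probs = 1.0 / (1.0 + np.exp(-logits))   # sigmoid: each finding is scored independently
    preds = (probs >= 0.5).astype(int)      # assumed 0.5 decision threshold
    return {
        "accuracy": (preds == labels).all(axis=1).mean(),  # exact-match (subset) accuracy
        "f1": f1_score(labels, preds, average="macro", zero_division=0),
        "precision": precision_score(labels, preds, average="macro", zero_division=0),
        "recall": recall_score(labels, preds, average="macro", zero_division=0),
        "auc_roc": roc_auc_score(labels, probs, average="macro"),
    }
```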
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall | Auc Roc |
|:-------------:|:------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|:-------:|
| 0.2335 | 0.9860 | 53 | 0.2059 | 0.3835 | 0.0455 | 0.0442 | 0.0468 | 0.5665 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cpu
- Datasets 3.5.0
- Tokenizers 0.21.1
|
[
"atelectasis",
"cardiomegaly",
"consolidation",
"edema",
"effusion",
"emphysema",
"fibrosis",
"hernia",
"infiltration",
"mass",
"no finding",
"nodule",
"pleural_thickening",
"pneumonia",
"pneumothorax"
] |
miccer/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1893
- Accuracy: 0.9350
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
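For reference, the hyperparameters above map onto `TrainingArguments` roughly as follows. This is a reconstruction, not the original training script; `output_dir` and the evaluation strategy are assumptions:
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base-oxford-iiit-pets",  # hypothetical output path
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    eval_strategy="epoch",  # assumed: the results table reports one validation pass per epoch
)
```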
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3668 | 1.0 | 370 | 0.3206 | 0.9134 |
| 0.1983 | 2.0 | 740 | 0.2488 | 0.9337 |
| 0.1716 | 3.0 | 1110 | 0.2282 | 0.9378 |
| 0.1389 | 4.0 | 1480 | 0.2175 | 0.9391 |
| 0.1296 | 5.0 | 1850 | 0.2168 | 0.9364 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
## laion/CLIP-ViT-B-32-laion2B-s34B-b79K
- Accuracy: 0.8564
- Precision: 0.8526
- Recall: 0.8564
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
norburay/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2031
- Accuracy: 0.9459
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3727 | 1.0 | 370 | 0.2756 | 0.9337 |
| 0.2145 | 2.0 | 740 | 0.2168 | 0.9378 |
| 0.1835 | 3.0 | 1110 | 0.1918 | 0.9459 |
| 0.147 | 4.0 | 1480 | 0.1857 | 0.9472 |
| 0.1315 | 5.0 | 1850 | 0.1818 | 0.9472 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
## 📊 Evaluation on Oxford Pets Dataset (Zero-Shot Image Classification)
This model was evaluated using the [Oxford Pets dataset](https://huggingface.co/datasets/pcuenq/oxford-pets) in a **zero-shot image classification** setting, where no additional training was performed.
### 🔍 Model Information
- **Model used:** [`openai/clip-vit-large-patch14`](https://huggingface.co/openai/clip-vit-large-patch14)
- **Task:** Zero-Shot Image Classification
- **Approach:** The model was prompted with a list of 37 pet breed labels and asked to classify each image from the dataset without any fine-tuning.
### 📈 Evaluation Results
| Metric | Value |
|------------|-----------|
| Accuracy | 88.00% |
| Precision | 87.68% |
| Recall | 88.00% |
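A minimal sketch of the zero-shot protocol described above, using the `zero-shot-image-classification` pipeline; the exact prompt wording is an assumption, as the card does not state it:
```python
from transformers import pipeline

classifier = pipeline(
    "zero-shot-image-classification",
    model="openai/clip-vit-large-patch14",
)

breed_labels = ["siamese", "birman", "shiba inu"]  # plus the remaining 34 breed labels listed below
preds = classifier(
    "pet.jpg",  # hypothetical local image path
    candidate_labels=breed_labels,
    hypothesis_template="a photo of a {}, a type of pet.",  # assumed prompt template
)
print(preds[0]["label"], preds[0]["score"])  # top-1 prediction
```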
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
marinrad/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1977
- Accuracy: 0.9445
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3582 | 1.0 | 370 | 0.2997 | 0.9256 |
| 0.2125 | 2.0 | 740 | 0.2200 | 0.9418 |
| 0.1573 | 3.0 | 1110 | 0.1966 | 0.9405 |
| 0.1472 | 4.0 | 1480 | 0.1884 | 0.9445 |
| 0.1338 | 5.0 | 1850 | 0.1865 | 0.9472 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
thenewsupercell/NewMaskedJaw_image_parts_df_VIT
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# NewMaskedJaw_image_parts_df_VIT
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0291
- Accuracy: 0.9944
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.0537 | 1.0 | 5252 | 0.0464 | 0.9890 |
| 0.0014 | 2.0 | 10504 | 0.0463 | 0.9904 |
| 0.0237 | 3.0 | 15756 | 0.0227 | 0.9940 |
| 0.0408 | 4.0 | 21008 | 0.0291 | 0.9944 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
|
[
"fake",
"real"
] |
graftim2/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2211
- Accuracy: 0.9337
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3686 | 1.0 | 370 | 0.3147 | 0.9134 |
| 0.2265 | 2.0 | 740 | 0.2431 | 0.9269 |
| 0.1486 | 3.0 | 1110 | 0.2231 | 0.9296 |
| 0.1399 | 4.0 | 1480 | 0.2131 | 0.9310 |
| 0.123 | 5.0 | 1850 | 0.2101 | 0.9337 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
RafidPratama2003/vit-rupiah-classifier
|
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
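As a starting point while this section remains blank, a minimal usage sketch assuming the model exposes the standard image-classification interface:
```python
from transformers import pipeline

classifier = pipeline("image-classification", model="RafidPratama2003/vit-rupiah-classifier")
print(classifier("banknote.jpg"))  # hypothetical image of a rupiah banknote
```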
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
[
"seribu",
"dua ribu",
"lima ribu",
"sepuluh ribu",
"dua puluh ribu",
"lima puluh ribu",
"seratus ribu"
] |
prithivMLmods/WikiArt-Genre
|

# **WikiArt-Genre**
> **WikiArt-Genre** is a vision model fine-tuned from **google/siglip2-base-patch16-224** using the **SiglipForImageClassification** architecture. It classifies artwork images into one of 43 genre categories from the WikiArt dataset.
```
Classification Report:
precision recall f1-score support
None 0.7222 0.0518 0.0967 1254
abstract 0.8268 0.9280 0.8745 9498
advertisement 0.0000 0.0000 0.0000 82
allegorical painting 0.4576 0.0522 0.0938 1034
animal painting 0.6474 0.6569 0.6521 1571
battle painting 0.4755 0.1899 0.2715 358
bijinga 0.4737 0.0947 0.1579 95
bird-and-flower painting 0.5410 0.5546 0.5477 119
calligraphy 0.7989 0.8938 0.8437 160
capriccio 1.0000 0.0169 0.0333 236
caricature 0.9412 0.2078 0.3404 231
cityscape 0.7335 0.7556 0.7444 5348
cloudscape 0.7949 0.1490 0.2510 208
design 0.5811 0.5593 0.5700 2024
figurative 0.4517 0.3124 0.3693 2244
flower painting 0.7846 0.8574 0.8194 1606
genre painting 0.6442 0.7460 0.6914 14260
history painting 0.4623 0.1115 0.1797 879
illustration 0.5557 0.6156 0.5841 3202
interior 0.6495 0.4896 0.5583 670
landscape 0.8068 0.8876 0.8453 15006
literary painting 0.7000 0.0125 0.0246 558
marina 0.6790 0.7429 0.7095 1805
miniature 0.0000 0.0000 0.0000 118
mythological painting 0.4949 0.3565 0.4145 1910
nude painting (nu) 0.6639 0.7926 0.7225 2290
panorama 0.0000 0.0000 0.0000 19
pastorale 0.0000 0.0000 0.0000 125
portrait 0.7518 0.8867 0.8137 16847
poster 0.4338 0.2063 0.2796 286
quadratura 0.0000 0.0000 0.0000 22
religious painting 0.7543 0.7530 0.7537 7429
self-portrait 0.6058 0.1365 0.2228 1531
shan shui 0.0000 0.0000 0.0000 33
sketch and study 0.6475 0.4361 0.5212 3644
still life 0.7923 0.7953 0.7938 3132
symbolic painting 0.5231 0.4629 0.4911 2545
tessellation 0.8583 0.5860 0.6965 186
urushi-e 0.0000 0.0000 0.0000 1
vanitas 0.0000 0.0000 0.0000 32
veduta 0.8043 0.1588 0.2652 233
wildlife painting 0.5833 0.5138 0.5463 327
yakusha-e 0.6250 0.0543 0.1000 92
accuracy 0.7183 103250
macro avg 0.5411 0.3727 0.3925 103250
weighted avg 0.7044 0.7183 0.6945 103250
```
```py
from datasets import load_dataset
# Load the dataset
dataset = load_dataset("Artificio/WikiArt")
# Extract unique genre values (assuming it's a string field)
labels = sorted(set(example["genre"] for example in dataset["train"]))
# Create id2label mapping
id2label = {str(i): label for i, label in enumerate(labels)}
# Print the mapping
print(id2label)
```
{'0': 'None', '1': 'abstract', '2': 'advertisement', '3': 'allegorical painting', '4': 'animal painting', '5': 'battle painting', '6': 'bijinga', '7': 'bird-and-flower painting', '8': 'calligraphy', '9': 'capriccio', '10': 'caricature', '11': 'cityscape', '12': 'cloudscape', '13': 'design', '14': 'figurative', '15': 'flower painting', '16': 'genre painting', '17': 'history painting', '18': 'illustration', '19': 'interior', '20': 'landscape', '21': 'literary painting', '22': 'marina', '23': 'miniature', '24': 'mythological painting', '25': 'nude painting (nu)', '26': 'panorama', '27': 'pastorale', '28': 'portrait', '29': 'poster', '30': 'quadratura', '31': 'religious painting', '32': 'self-portrait', '33': 'shan shui', '34': 'sketch and study', '35': 'still life', '36': 'symbolic painting', '37': 'tessellation', '38': 'urushi-e', '39': 'vanitas', '40': 'veduta', '41': 'wildlife painting', '42': 'yakusha-e'}
---
## **Model Details**
The model predicts one of the following genre categories for artworks:
| ID | Genre | ID | Genre |
|----|---------------------------|----|---------------------------|
| 0 | None | 22 | Marina |
| 1 | Abstract | 23 | Miniature |
| 2 | Advertisement | 24 | Mythological Painting |
| 3 | Allegorical Painting | 25 | Nude Painting (Nu) |
| 4 | Animal Painting | 26 | Panorama |
| 5 | Battle Painting | 27 | Pastorale |
| 6 | Bijinga | 28 | Portrait |
| 7 | Bird-and-Flower Painting | 29 | Poster |
| 8 | Calligraphy | 30 | Quadratura |
| 9 | Capriccio | 31 | Religious Painting |
| 10 | Caricature | 32 | Self-Portrait |
| 11 | Cityscape | 33 | Shan Shui |
| 12 | Cloudscape | 34 | Sketch and Study |
| 13 | Design | 35 | Still Life |
| 14 | Figurative | 36 | Symbolic Painting |
| 15 | Flower Painting | 37 | Tessellation |
| 16 | Genre Painting | 38 | Urushi-e |
| 17 | History Painting | 39 | Vanitas |
| 18 | Illustration | 40 | Veduta |
| 19 | Interior | 41 | Wildlife Painting |
| 20 | Landscape | 42 | Yakusha-e |
| 21 | Literary Painting | | |
## **Run with Transformers 🤗**
```bash
pip install -q transformers torch pillow gradio
```
```python
import gradio as gr
from transformers import AutoImageProcessor, SiglipForImageClassification
from PIL import Image
import torch
# Load model and processor
model_name = "prithivMLmods/WikiArt-Genre" # Replace with your actual model path
model = SiglipForImageClassification.from_pretrained(model_name)
processor = AutoImageProcessor.from_pretrained(model_name)
# Label mapping
id2label = {
0: "None", 1: "abstract", 2: "advertisement", 3: "allegorical painting", 4: "animal painting",
5: "battle painting", 6: "bijinga", 7: "bird-and-flower painting", 8: "calligraphy",
9: "capriccio", 10: "caricature", 11: "cityscape", 12: "cloudscape", 13: "design",
14: "figurative", 15: "flower painting", 16: "genre painting", 17: "history painting",
18: "illustration", 19: "interior", 20: "landscape", 21: "literary painting", 22: "marina",
23: "miniature", 24: "mythological painting", 25: "nude painting (nu)", 26: "panorama",
27: "pastorale", 28: "portrait", 29: "poster", 30: "quadratura", 31: "religious painting",
32: "self-portrait", 33: "shan shui", 34: "sketch and study", 35: "still life",
36: "symbolic painting", 37: "tessellation", 38: "urushi-e", 39: "vanitas",
40: "veduta", 41: "wildlife painting", 42: "yakusha-e"
}
def classify_genre(image):
"""Predicts the genre category for an artwork."""
image = Image.fromarray(image).convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs)
logits = outputs.logits
probs = torch.nn.functional.softmax(logits, dim=1).squeeze().tolist()
predictions = {id2label[i]: round(probs[i], 3) for i in range(len(probs))}
return predictions
# Gradio interface
iface = gr.Interface(
fn=classify_genre,
inputs=gr.Image(type="numpy"),
outputs=gr.Label(label="Genre Prediction Scores"),
title="WikiArt-Genre",
description="Upload an artwork image to predict its genre (e.g., landscape, portrait, calligraphy, still life, etc.)."
)
# Launch the app
if __name__ == "__main__":
iface.launch()
```
---
## **Intended Use**
This model is best suited for:
- **Art classification and analysis**
- **Museum and gallery cataloging systems**
- **Art history research tools**
- **Training data enrichment for AI art models**
- **Art-themed e-commerce filtering and tagging**
|
[
"none",
"abstract",
"advertisement",
"allegorical painting",
"animal painting",
"battle painting",
"bijinga",
"bird-and-flower painting",
"calligraphy",
"capriccio",
"caricature",
"cityscape",
"cloudscape",
"design",
"figurative",
"flower painting",
"genre painting",
"history painting",
"illustration",
"interior",
"landscape",
"literary painting",
"marina",
"miniature",
"mythological painting",
"nude painting (nu)",
"panorama",
"pastorale",
"portrait",
"poster",
"quadratura",
"religious painting",
"self-portrait",
"shan shui",
"sketch and study",
"still life",
"symbolic painting",
"tessellation",
"urushi-e",
"vanitas",
"veduta",
"wildlife painting",
"yakusha-e"
] |
prithivMLmods/WikiArt-Style
|

# **WikiArt-Style**
> **WikiArt-Style** is a vision model fine-tuned from **google/siglip2-base-patch16-224** using the **SiglipForImageClassification** architecture. It classifies art images into one of 137 painting style categories.
```
Classification Report:
precision recall f1-score support
Abstract Art 0.2784 0.3228 0.2990 979
Abstract Expressionism 0.3615 0.5757 0.4441 2074
Academicism 0.4410 0.0730 0.1253 972
Action painting 0.0000 0.0000 0.0000 98
American Realism 0.9592 0.1697 0.2883 277
Analytical Cubism 0.2000 0.0115 0.0217 87
Analytical Realism 0.6667 0.0225 0.0435 89
Art Brut 0.0000 0.0000 0.0000 198
Art Deco 0.6178 0.6801 0.6475 644
Art Informel 0.3295 0.2486 0.2834 1267
Art Nouveau (Modern) 0.5183 0.6289 0.5682 4899
Automatic Painting 0.0000 0.0000 0.0000 37
Baroque 0.5312 0.6495 0.5845 4400
Biedermeier 0.0000 0.0000 0.0000 132
Byzantine 0.0000 0.0000 0.0000 77
Cartographic Art 0.0000 0.0000 0.0000 10
Classicism 0.0000 0.0000 0.0000 223
Cloisonnism 0.0000 0.0000 0.0000 172
Color Field Painting 0.6947 0.5352 0.6046 910
Conceptual Art 0.6667 0.0161 0.0315 124
Concretism 0.3973 0.4596 0.4262 581
Constructivism 0.3462 0.3022 0.3227 268
Contemporary Realism 1.0000 0.0518 0.0985 309
Costumbrismo 0.0000 0.0000 0.0000 19
Cubism 0.3426 0.6926 0.4584 1747
Cubo-Expressionism 0.0000 0.0000 0.0000 22
Cubo-Futurism 0.0000 0.0000 0.0000 137
Dada 0.5000 0.0098 0.0191 205
Divisionism 0.6810 0.2337 0.3480 338
Early Renaissance 0.5411 0.6329 0.5834 1351
Environmental (Land) Art 0.0000 0.0000 0.0000 1
Existential Art 0.0000 0.0000 0.0000 17
Expressionism 0.4540 0.5701 0.5055 7013
Fantastic Realism 0.0000 0.0000 0.0000 31
Fauvism 0.4480 0.1710 0.2475 731
Feminist Art 0.0000 0.0000 0.0000 18
Figurative Expressionism 0.0000 0.0000 0.0000 19
Futurism 0.4717 0.0859 0.1453 291
Gongbi 0.0000 0.0000 0.0000 38
Gothic 0.0000 0.0000 0.0000 12
Hard Edge Painting 0.4458 0.3978 0.4205 372
High Renaissance 0.5016 0.3653 0.4227 1314
Hyper-Realism 0.0000 0.0000 0.0000 49
Ilkhanid 0.0000 0.0000 0.0000 2
Impressionism 0.6057 0.7266 0.6607 10643
Indian Space painting 0.0000 0.0000 0.0000 25
Ink and wash painting 0.7739 0.4459 0.5658 545
International Gothic 0.8099 0.5349 0.6443 215
Intimism 0.0000 0.0000 0.0000 109
Japonism 0.0000 0.0000 0.0000 118
Joseon Dynasty 0.0000 0.0000 0.0000 10
Kinetic Art 0.0000 0.0000 0.0000 3
Kitsch 0.0000 0.0000 0.0000 47
Lettrism 0.0000 0.0000 0.0000 33
Light and Space 0.0000 0.0000 0.0000 11
Luminism 0.8333 0.0519 0.0978 385
Lyrical Abstraction 0.4631 0.2060 0.2851 670
Magic Realism 0.5883 0.5020 0.5417 1002
Mail Art 0.0000 0.0000 0.0000 10
Mannerism (Late Renaissance) 0.4875 0.3934 0.4355 1342
Mechanistic Cubism 0.0000 0.0000 0.0000 69
Metaphysical art 0.5385 0.0729 0.1284 192
Minimalism 0.5352 0.5957 0.5638 460
Miserablism 0.0000 0.0000 0.0000 8
Modernismo 0.0000 0.0000 0.0000 43
Mosan art 0.0000 0.0000 0.0000 39
Muralism 0.3636 0.0268 0.0500 149
Nanga (Bunjinga) 0.0000 0.0000 0.0000 58
Nas-Taliq 0.0000 0.0000 0.0000 9
Native Art 0.0000 0.0000 0.0000 21
Naturalism 0.8459 0.6437 0.7311 435
Naïve Art (Primitivism) 0.4897 0.5412 0.5142 2295
Neo-Byzantine 0.0000 0.0000 0.0000 18
Neo-Concretism 0.0000 0.0000 0.0000 44
Neo-Dada 0.0000 0.0000 0.0000 131
Neo-Expressionism 0.6763 0.2238 0.3363 420
Neo-Figurative Art 0.0000 0.0000 0.0000 27
Neo-Rococo 0.0000 0.0000 0.0000 97
Neo-Romanticism 0.6762 0.1254 0.2116 566
Neo-baroque 0.0000 0.0000 0.0000 105
Neoclassicism 0.5868 0.3317 0.4238 2038
Neoplasticism 0.8889 0.2775 0.4229 173
New Casualism 0.0000 0.0000 0.0000 22
New European Painting 0.0000 0.0000 0.0000 25
New Realism 0.0000 0.0000 0.0000 329
Nihonga 0.0000 0.0000 0.0000 29
None 0.3056 0.0112 0.0215 986
Northern Renaissance 0.6198 0.6448 0.6321 2379
Nouveau Réalisme 0.0000 0.0000 0.0000 142
Op Art 0.5120 0.7292 0.6016 528
Orientalism 0.0000 0.0000 0.0000 392
Orphism 0.6471 0.0444 0.0830 248
Ottoman Period 0.8571 0.0833 0.1519 72
Outsider art 0.0000 0.0000 0.0000 68
Perceptism 0.0000 0.0000 0.0000 6
Photorealism 0.0000 0.0000 0.0000 61
Pointillism 0.6450 0.6347 0.6398 501
Pop Art 0.5580 0.4501 0.4983 791
Post-Impressionism 0.4611 0.4200 0.4396 5778
Post-Minimalism 0.0000 0.0000 0.0000 31
Post-Painterly Abstraction 0.6250 0.0296 0.0565 169
Poster Art Realism 0.0000 0.0000 0.0000 43
Precisionism 0.7258 0.3169 0.4412 284
Primitivism 0.0000 0.0000 0.0000 36
Proto Renaissance 0.4342 0.3626 0.3952 273
Purism 0.5000 0.1635 0.2464 159
Rayonism 0.0000 0.0000 0.0000 6
Realism 0.4909 0.6939 0.5750 10523
Regionalism 0.7500 0.1776 0.2872 321
Renaissance 0.0000 0.0000 0.0000 1
Rococo 0.4975 0.5913 0.5404 2733
Romanesque 0.0000 0.0000 0.0000 55
Romanticism 0.5182 0.5380 0.5279 9285
Safavid Period 0.0000 0.0000 0.0000 39
Shin-hanga 0.6894 0.7184 0.7036 380
Social Realism 0.0000 0.0000 0.0000 305
Socialist Realism 0.0000 0.0000 0.0000 95
Spatialism 0.0000 0.0000 0.0000 83
Spectralism 0.0000 0.0000 0.0000 5
Street art 0.0000 0.0000 0.0000 23
Suprematism 0.0000 0.0000 0.0000 80
Surrealism 0.5011 0.6151 0.5523 4167
Symbolism 0.4398 0.3087 0.3627 3476
Synchromism 0.0000 0.0000 0.0000 10
Synthetic Cubism 0.6914 0.2995 0.4179 187
Synthetism 0.0000 0.0000 0.0000 49
Sōsaku hanga 0.5753 0.2900 0.3856 369
Tachisme 0.4366 0.1422 0.2145 436
Tenebrism 0.0000 0.0000 0.0000 221
Timurid Period 0.0000 0.0000 0.0000 17
Tonalism 0.0000 0.0000 0.0000 202
Transautomatism 0.0000 0.0000 0.0000 74
Tubism 0.0000 0.0000 0.0000 21
Ukiyo-e 0.7822 0.8612 0.8198 1426
Verism 0.0000 0.0000 0.0000 84
Yamato-e 0.0000 0.0000 0.0000 12
Zen 0.6702 0.6702 0.6702 94
accuracy 0.5108 103250
macro avg 0.2664 0.1708 0.1811 103250
weighted avg 0.4956 0.5108 0.4822 103250
```
```py
from datasets import load_dataset
# Load the dataset
dataset = load_dataset("Artificio/WikiArt")
# Extract unique style values (assuming it's a string field)
labels = sorted(set(example["style"] for example in dataset["train"]))
# Create id2label mapping
id2label = {str(i): label for i, label in enumerate(labels)}
# Print the mapping
print(id2label)
```
{'0': 'Abstract Art', '1': 'Abstract Expressionism', '2': 'Academicism', '3': 'Action painting', '4': 'American Realism', '5': 'Analytical Cubism', '6': 'Analytical\xa0Realism', '7': 'Art Brut', '8': 'Art Deco', '9': 'Art Informel', '10': 'Art Nouveau (Modern)', '11': 'Automatic Painting', '12': 'Baroque', '13': 'Biedermeier', '14': 'Byzantine', '15': 'Cartographic Art', '16': 'Classicism', '17': 'Cloisonnism', '18': 'Color Field Painting', '19': 'Conceptual Art', '20': 'Concretism', '21': 'Constructivism', '22': 'Contemporary Realism', '23': 'Costumbrismo', '24': 'Cubism', '25': 'Cubo-Expressionism', '26': 'Cubo-Futurism', '27': 'Dada', '28': 'Divisionism', '29': 'Early Renaissance', '30': 'Environmental (Land) Art', '31': 'Existential Art', '32': 'Expressionism', '33': 'Fantastic Realism', '34': 'Fauvism', '35': 'Feminist Art', '36': 'Figurative Expressionism', '37': 'Futurism', '38': 'Gongbi', '39': 'Gothic', '40': 'Hard Edge Painting', '41': 'High Renaissance', '42': 'Hyper-Realism', '43': 'Ilkhanid', '44': 'Impressionism', '45': 'Indian Space painting', '46': 'Ink and wash painting', '47': 'International Gothic', '48': 'Intimism', '49': 'Japonism', '50': 'Joseon Dynasty', '51': 'Kinetic Art', '52': 'Kitsch', '53': 'Lettrism', '54': 'Light and Space', '55': 'Luminism', '56': 'Lyrical Abstraction', '57': 'Magic Realism', '58': 'Mail Art', '59': 'Mannerism (Late Renaissance)', '60': 'Mechanistic Cubism', '61': 'Metaphysical art', '62': 'Minimalism', '63': 'Miserablism', '64': 'Modernismo', '65': 'Mosan art', '66': 'Muralism', '67': 'Nanga (Bunjinga)', '68': 'Nas-Taliq', '69': 'Native Art', '70': 'Naturalism', '71': 'Naïve Art (Primitivism)', '72': 'Neo-Byzantine', '73': 'Neo-Concretism', '74': 'Neo-Dada', '75': 'Neo-Expressionism', '76': 'Neo-Figurative Art', '77': 'Neo-Rococo', '78': 'Neo-Romanticism', '79': 'Neo-baroque', '80': 'Neoclassicism', '81': 'Neoplasticism', '82': 'New Casualism', '83': 'New European Painting', '84': 'New Realism', '85': 'Nihonga', '86': 'None', '87': 'Northern Renaissance', '88': 'Nouveau Réalisme', '89': 'Op Art', '90': 'Orientalism', '91': 'Orphism', '92': 'Ottoman Period', '93': 'Outsider art', '94': 'Perceptism ', '95': 'Photorealism', '96': 'Pointillism', '97': 'Pop Art', '98': 'Post-Impressionism', '99': 'Post-Minimalism', '100': 'Post-Painterly Abstraction', '101': 'Poster Art Realism', '102': 'Precisionism', '103': 'Primitivism', '104': 'Proto Renaissance', '105': 'Purism', '106': 'Rayonism', '107': 'Realism', '108': 'Regionalism', '109': 'Renaissance', '110': 'Rococo', '111': 'Romanesque', '112': 'Romanticism', '113': 'Safavid Period', '114': 'Shin-hanga', '115': 'Social Realism', '116': 'Socialist Realism', '117': 'Spatialism', '118': 'Spectralism', '119': 'Street art', '120': 'Suprematism', '121': 'Surrealism', '122': 'Symbolism', '123': 'Synchromism', '124': 'Synthetic Cubism', '125': 'Synthetism', '126': 'Sōsaku hanga', '127': 'Tachisme', '128': 'Tenebrism', '129': 'Timurid Period', '130': 'Tonalism', '131': 'Transautomatism', '132': 'Tubism', '133': 'Ukiyo-e', '134': 'Verism', '135': 'Yamato-e', '136': 'Zen'}
The model predicts one of the following painting **style** categories:
```
0: Abstract Art
1: Abstract Expressionism
2: Academicism
3: Action painting
4: American Realism
5: Analytical Cubism
6: Analytical Realism
7: Art Brut
8: Art Deco
9: Art Informel
10: Art Nouveau (Modern)
11: Automatic Painting
12: Baroque
13: Biedermeier
14: Byzantine
15: Cartographic Art
16: Classicism
17: Cloisonnism
18: Color Field Painting
19: Conceptual Art
20: Concretism
21: Constructivism
22: Contemporary Realism
23: Costumbrismo
24: Cubism
25: Cubo-Expressionism
26: Cubo-Futurism
27: Dada
28: Divisionism
29: Early Renaissance
30: Environmental (Land) Art
31: Existential Art
32: Expressionism
33: Fantastic Realism
34: Fauvism
35: Feminist Art
36: Figurative Expressionism
37: Futurism
38: Gongbi
39: Gothic
40: Hard Edge Painting
41: High Renaissance
42: Hyper-Realism
43: Ilkhanid
44: Impressionism
45: Indian Space painting
46: Ink and wash painting
47: International Gothic
48: Intimism
49: Japonism
50: Joseon Dynasty
51: Kinetic Art
52: Kitsch
53: Lettrism
54: Light and Space
55: Luminism
56: Lyrical Abstraction
57: Magic Realism
58: Mail Art
59: Mannerism (Late Renaissance)
60: Mechanistic Cubism
61: Metaphysical art
62: Minimalism
63: Miserablism
64: Modernismo
65: Mosan art
66: Muralism
67: Nanga (Bunjinga)
68: Nas-Taliq
69: Native Art
70: Naturalism
71: Naïve Art (Primitivism)
72: Neo-Byzantine
73: Neo-Concretism
74: Neo-Dada
75: Neo-Expressionism
76: Neo-Figurative Art
77: Neo-Rococo
78: Neo-Romanticism
79: Neo-baroque
80: Neoclassicism
81: Neoplasticism
82: New Casualism
83: New European Painting
84: New Realism
85: Nihonga
86: None
87: Northern Renaissance
88: Nouveau Réalisme
89: Op Art
90: Orientalism
91: Orphism
92: Ottoman Period
93: Outsider art
94: Perceptism
95: Photorealism
96: Pointillism
97: Pop Art
98: Post-Impressionism
99: Post-Minimalism
100: Post-Painterly Abstraction
101: Poster Art Realism
102: Precisionism
103: Primitivism
104: Proto Renaissance
105: Purism
106: Rayonism
107: Realism
108: Regionalism
109: Renaissance
110: Rococo
111: Romanesque
112: Romanticism
113: Safavid Period
114: Shin-hanga
115: Social Realism
116: Socialist Realism
117: Spatialism
118: Spectralism
119: Street art
120: Suprematism
121: Surrealism
122: Symbolism
123: Synchromism
124: Synthetic Cubism
125: Synthetism
126: Sōsaku hanga
127: Tachisme
128: Tenebrism
129: Timurid Period
130: Tonalism
131: Transautomatism
132: Tubism
133: Ukiyo-e
134: Verism
135: Yamato-e
136: Zen
```
---
## **Run with Transformers 🤗**
```bash
pip install -q transformers torch pillow gradio
```
```python
import gradio as gr
from transformers import AutoImageProcessor, SiglipForImageClassification
from PIL import Image
import torch
# Load model and processor
model_name = "prithivMLmods/WikiArt-Style" # Replace with your model path
model = SiglipForImageClassification.from_pretrained(model_name)
processor = AutoImageProcessor.from_pretrained(model_name)
# Label mapping
id2label = {
0: "Abstract Art", 1: "Abstract Expressionism", 2: "Academicism", 3: "Action painting",
4: "American Realism", 5: "Analytical Cubism", 6: "Analytical Realism", 7: "Art Brut",
8: "Art Deco", 9: "Art Informel", 10: "Art Nouveau (Modern)", 11: "Automatic Painting",
12: "Baroque", 13: "Biedermeier", 14: "Byzantine", 15: "Cartographic Art", 16: "Classicism",
17: "Cloisonnism", 18: "Color Field Painting", 19: "Conceptual Art", 20: "Concretism",
21: "Constructivism", 22: "Contemporary Realism", 23: "Costumbrismo", 24: "Cubism",
25: "Cubo-Expressionism", 26: "Cubo-Futurism", 27: "Dada", 28: "Divisionism",
29: "Early Renaissance", 30: "Environmental (Land) Art", 31: "Existential Art",
32: "Expressionism", 33: "Fantastic Realism", 34: "Fauvism", 35: "Feminist Art",
36: "Figurative Expressionism", 37: "Futurism", 38: "Gongbi", 39: "Gothic",
40: "Hard Edge Painting", 41: "High Renaissance", 42: "Hyper-Realism", 43: "Ilkhanid",
44: "Impressionism", 45: "Indian Space painting", 46: "Ink and wash painting",
47: "International Gothic", 48: "Intimism", 49: "Japonism", 50: "Joseon Dynasty",
51: "Kinetic Art", 52: "Kitsch", 53: "Lettrism", 54: "Light and Space", 55: "Luminism",
56: "Lyrical Abstraction", 57: "Magic Realism", 58: "Mail Art", 59: "Mannerism (Late Renaissance)",
60: "Mechanistic Cubism", 61: "Metaphysical art", 62: "Minimalism", 63: "Miserablism",
64: "Modernismo", 65: "Mosan art", 66: "Muralism", 67: "Nanga (Bunjinga)", 68: "Nas-Taliq",
69: "Native Art", 70: "Naturalism", 71: "Naïve Art (Primitivism)", 72: "Neo-Byzantine",
73: "Neo-Concretism", 74: "Neo-Dada", 75: "Neo-Expressionism", 76: "Neo-Figurative Art",
77: "Neo-Rococo", 78: "Neo-Romanticism", 79: "Neo-baroque", 80: "Neoclassicism",
81: "Neoplasticism", 82: "New Casualism", 83: "New European Painting", 84: "New Realism",
85: "Nihonga", 86: "None", 87: "Northern Renaissance", 88: "Nouveau Réalisme", 89: "Op Art",
90: "Orientalism", 91: "Orphism", 92: "Ottoman Period", 93: "Outsider art", 94: "Perceptism ",
95: "Photorealism", 96: "Pointillism", 97: "Pop Art", 98: "Post-Impressionism",
99: "Post-Minimalism", 100: "Post-Painterly Abstraction", 101: "Poster Art Realism",
102: "Precisionism", 103: "Primitivism", 104: "Proto Renaissance", 105: "Purism",
106: "Rayonism", 107: "Realism", 108: "Regionalism", 109: "Renaissance", 110: "Rococo",
111: "Romanesque", 112: "Romanticism", 113: "Safavid Period", 114: "Shin-hanga",
115: "Social Realism", 116: "Socialist Realism", 117: "Spatialism", 118: "Spectralism",
119: "Street art", 120: "Suprematism", 121: "Surrealism", 122: "Symbolism",
123: "Synchromism", 124: "Synthetic Cubism", 125: "Synthetism", 126: "Sōsaku hanga",
127: "Tachisme", 128: "Tenebrism", 129: "Timurid Period", 130: "Tonalism",
131: "Transautomatism", 132: "Tubism", 133: "Ukiyo-e", 134: "Verism", 135: "Yamato-e",
136: "Zen"
}
def classify_style(image):
"""Predicts the artistic style of the input artwork."""
image = Image.fromarray(image).convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs)
probs = torch.nn.functional.softmax(outputs.logits, dim=1).squeeze().tolist()
return {id2label[i]: round(probs[i], 3) for i in range(len(probs))}
# Gradio interface
iface = gr.Interface(
fn=classify_style,
inputs=gr.Image(type="numpy"),
outputs=gr.Label(label="Style Prediction Scores"),
title="WikiArt-Style",
description="Upload an art image to predict its painting style category (e.g., Impressionism, Cubism, Baroque, etc.)."
)
if __name__ == "__main__":
iface.launch()
```
---
# **Intended Use of WikiArt-Style**
**1. Style Classification in Machine Learning Models**
- Used as **labels** for training and evaluating models that classify artworks based on their artistic styles.
- Ideal for deep learning applications involving **convolutional neural networks (CNNs)** or **transformer-based vision models**.
**2. Style Transfer Applications**
- Acts as a **style reference** for neural style transfer algorithms (e.g., applying "Baroque" or "Cubism" to photos).
- Can guide users to select a target style from a curated list.
**3. Dataset Annotation**
- Used to **annotate** images in large datasets of paintings with consistent style names.
- Ensures compatibility with datasets like WikiArt, Kaggle’s Painter by Numbers, or custom curation.
**4. Educational and Exploratory Interfaces**
- Powers interfaces or apps for **exploring art history**, with filterable and searchable styles.
- Great for building **art recommender systems** or **virtual museums**.
**5. Generative Art Prompting**
- Assists in **text-to-image prompting** for generative models (e.g., Stable Diffusion, DALL·E) to specify desired styles.
- Example: "Generate a portrait in the style of Neo-Expressionism."
**6. Metadata Categorization in Art Databases**
- Useful for tagging and organizing artworks by style in digital archives or NFT marketplaces.
|
[
"abstract art",
"abstract expressionism",
"academicism",
"action painting",
"american realism",
"analytical cubism",
"analytical realism",
"art brut",
"art deco",
"art informel",
"art nouveau (modern)",
"automatic painting",
"baroque",
"biedermeier",
"byzantine",
"cartographic art",
"classicism",
"cloisonnism",
"color field painting",
"conceptual art",
"concretism",
"constructivism",
"contemporary realism",
"costumbrismo",
"cubism",
"cubo-expressionism",
"cubo-futurism",
"dada",
"divisionism",
"early renaissance",
"environmental (land) art",
"existential art",
"expressionism",
"fantastic realism",
"fauvism",
"feminist art",
"figurative expressionism",
"futurism",
"gongbi",
"gothic",
"hard edge painting",
"high renaissance",
"hyper-realism",
"ilkhanid",
"impressionism",
"indian space painting",
"ink and wash painting",
"international gothic",
"intimism",
"japonism",
"joseon dynasty",
"kinetic art",
"kitsch",
"lettrism",
"light and space",
"luminism",
"lyrical abstraction",
"magic realism",
"mail art",
"mannerism (late renaissance)",
"mechanistic cubism",
"metaphysical art",
"minimalism",
"miserablism",
"modernismo",
"mosan art",
"muralism",
"nanga (bunjinga)",
"nas-taliq",
"native art",
"naturalism",
"naïve art (primitivism)",
"neo-byzantine",
"neo-concretism",
"neo-dada",
"neo-expressionism",
"neo-figurative art",
"neo-rococo",
"neo-romanticism",
"neo-baroque",
"neoclassicism",
"neoplasticism",
"new casualism",
"new european painting",
"new realism",
"nihonga",
"none",
"northern renaissance",
"nouveau réalisme",
"op art",
"orientalism",
"orphism",
"ottoman period",
"outsider art",
"perceptism ",
"photorealism",
"pointillism",
"pop art",
"post-impressionism",
"post-minimalism",
"post-painterly abstraction",
"poster art realism",
"precisionism",
"primitivism",
"proto renaissance",
"purism",
"rayonism",
"realism",
"regionalism",
"renaissance",
"rococo",
"romanesque",
"romanticism",
"safavid period",
"shin-hanga",
"social realism",
"socialist realism",
"spatialism",
"spectralism",
"street art",
"suprematism",
"surrealism",
"symbolism",
"synchromism",
"synthetic cubism",
"synthetism",
"sōsaku hanga",
"tachisme",
"tenebrism",
"timurid period",
"tonalism",
"transautomatism",
"tubism",
"ukiyo-e",
"verism",
"yamato-e",
"zen"
] |
Straueri/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1977
- Accuracy: 0.9445
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
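A minimal sketch of how these settings map onto `transformers.TrainingArguments`; the output directory is a hypothetical placeholder, and this is not the author's actual training script.
```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is hypothetical.
training_args = TrainingArguments(
    output_dir="vit-base-oxford-iiit-pets",
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```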
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3582 | 1.0 | 370 | 0.2997 | 0.9256 |
| 0.2125 | 2.0 | 740 | 0.2200 | 0.9418 |
| 0.1573 | 3.0 | 1110 | 0.1966 | 0.9405 |
| 0.1472 | 4.0 | 1480 | 0.1884 | 0.9445 |
| 0.1338 | 5.0 | 1850 | 0.1865 | 0.9472 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
Results:
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
Additionally computed out of interest:
- F1-Score (weighted): 0.8605
- F1-Score (micro): 0.8800
- F1-Score (macro): 0.8605
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
Mathunan/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1900
- Accuracy: 0.9378
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.392 | 1.0 | 370 | 0.3019 | 0.9269 |
| 0.2013 | 2.0 | 740 | 0.2306 | 0.9405 |
| 0.1777 | 3.0 | 1110 | 0.2113 | 0.9378 |
| 0.1426 | 4.0 | 1480 | 0.1980 | 0.9432 |
| 0.1458 | 5.0 | 1850 | 0.1972 | 0.9445 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
## Zero-Shot Evaluation
A comparison model was evaluated using `openai/clip-vit-base-patch32` on the Oxford-IIIT Pet dataset.
### Results (Zero-Shot):
- **Accuracy:** 88.00%
- **Precision (macro):** 87.68%
- **Recall (macro):** 88.00%
Although the model was not fine-tuned on this dataset, its zero-shot performance is remarkable.
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
shahad-alh/arabichar-balanced-ver2
|
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
- **Model type:** OCR CNN model
- **Language(s) (NLP):** Arabic
- **License:** MIT
- **Finetuned from model [optional]:** asyafalni/arabichar-v3
### Results
- Training accuracy: 53%
- Validation accuracy: 51%
- Test accuracy: 51%
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
[
".ipynb_checkpoints",
"alif",
"ayen",
"baa",
"dal",
"dhad",
"faa",
"ghayen",
"h_aa",
"haa",
"jeem",
"kaf",
"khaa",
"lam",
"meem",
"noon",
"qaf",
"raa",
"sad",
"seen",
"sheen",
"t_aa",
"taa",
"th_aa",
"thaa",
"thal",
"waw",
"yaa",
"zay"
] |
bodmedam/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1819
- Accuracy: 0.9445
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
- pcuenq/oxford-pets
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3569 | 1.0 | 370 | 0.3190 | 0.9161 |
| 0.2217 | 2.0 | 740 | 0.2527 | 0.9296 |
| 0.1703 | 3.0 | 1110 | 0.2419 | 0.9323 |
| 0.1404 | 4.0 | 1480 | 0.2359 | 0.9296 |
| 0.1286 | 5.0 | 1850 | 0.2338 | 0.9310 |
### Zero-Shot Evaluation
- Model: openai/clip-vit-large-patch14
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
kabboabb/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2023
- Accuracy: 0.9459
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3878 | 1.0 | 370 | 0.2921 | 0.9215 |
| 0.2188 | 2.0 | 740 | 0.2260 | 0.9269 |
| 0.1832 | 3.0 | 1110 | 0.2136 | 0.9283 |
| 0.14 | 4.0 | 1480 | 0.2050 | 0.9323 |
| 0.1322 | 5.0 | 1850 | 0.2030 | 0.9323 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
### Key Figures of the Transfer-Learning Model
- eval_loss: 0.20226821303367615
- eval_accuracy: 0.945872801082544
- eval_runtime: 10.8017
- eval_samples_per_second: 68.415
- eval_steps_per_second: 8.61
- epoch: 5.0
### Key Figures of the Zero Shot model:
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
albertstudy/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
## Zero-Shot Classification Results (Oxford-IIIT Pets Test Set)
* **Model Used:** `openai/clip-vit-large-patch14`
* **Accuracy:** `0.9039`
* **Precision (Weighted):** `0.9189`
* **Recall (Weighted):** `0.9039`
* **Precision (Macro):** `0.9131`
* **Recall (Macro):** `0.9091`
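A minimal sketch of how the weighted vs. macro averages above can be computed with scikit-learn; the label arrays are hypothetical stand-ins for the real predictions.
```python
from sklearn.metrics import precision_score, recall_score

y_true = ["beagle", "pug", "beagle", "boxer"]   # hypothetical ground truth
y_pred = ["beagle", "pug", "boxer", "boxer"]    # hypothetical predictions

for avg in ("weighted", "macro"):
    p = precision_score(y_true, y_pred, average=avg, zero_division=0)
    r = recall_score(y_true, y_pred, average=avg, zero_division=0)
    print(f"{avg}: precision={p:.4f} recall={r:.4f}")
```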
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2136
- Accuracy: 0.9350
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3667 | 1.0 | 370 | 0.3159 | 0.9188 |
| 0.2091 | 2.0 | 740 | 0.2353 | 0.9418 |
| 0.1749 | 3.0 | 1110 | 0.2184 | 0.9391 |
| 0.1361 | 4.0 | 1480 | 0.2089 | 0.9432 |
| 0.1401 | 5.0 | 1850 | 0.2064 | 0.9405 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
pereilea/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2066
- Accuracy: 0.9405
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3465 | 1.0 | 370 | 0.2699 | 0.9405 |
| 0.2149 | 2.0 | 740 | 0.2013 | 0.9499 |
| 0.1667 | 3.0 | 1110 | 0.1827 | 0.9621 |
| 0.1452 | 4.0 | 1480 | 0.1661 | 0.9621 |
| 0.1392 | 5.0 | 1850 | 0.1623 | 0.9648 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
## Zero-Shot Classification Evaluation
**Model used:** `openai/clip-vit-large-patch14`
**Dataset:** Oxford-IIIT Pet (subset)
**Evaluation method:** Hugging Face `pipeline("zero-shot-image-classification")`
- Accuracy: **88.00%**
- Precision: **87.68%**
- Recall: **88.00%**
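A minimal sketch of the evaluation call described above; the checkpoint matches the card, while the image path and label subset are illustrative assumptions.
```python
from transformers import pipeline

clf = pipeline("zero-shot-image-classification",
               model="openai/clip-vit-large-patch14")

candidate_labels = ["siamese", "birman", "shiba inu"]  # subset of the classes below
preds = clf("pet.jpg", candidate_labels=candidate_labels)  # hypothetical image path
print(preds[0]["label"], preds[0]["score"])
```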
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
babicami/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1907
- Accuracy: 0.9405
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3546 | 1.0 | 370 | 0.2913 | 0.9296 |
| 0.2045 | 2.0 | 740 | 0.2223 | 0.9378 |
| 0.1642 | 3.0 | 1110 | 0.2108 | 0.9418 |
| 0.1374 | 4.0 | 1480 | 0.2041 | 0.9445 |
| 0.1362 | 5.0 | 1850 | 0.2010 | 0.9432 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
### Zero-Shot Evaluation (Week 7 Report Data)
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
azamat1ch/vit-neu-defect
|
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
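In its absence, a minimal sketch, assuming this is a standard ViT image-classification checkpoint; the image path is hypothetical.
```python
from transformers import pipeline

# Assumes a standard image-classification checkpoint; "surface.jpg" is hypothetical.
clf = pipeline("image-classification", model="azamat1ch/vit-neu-defect")
print(clf("surface.jpg"))  # e.g. [{'label': 'scratches', 'score': ...}, ...]
```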
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
[
"crazing",
"inclusion",
"patches",
"pitted_surface",
"rolled-in_scale",
"scratches"
] |
faridkarimli/SWIN_Gaudi_100
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# SWIN_Gaudi_100
This model is a fine-tuned version of [microsoft/swinv2-large-patch4-window12-192-22k](https://huggingface.co/microsoft/swinv2-large-patch4-window12-192-22k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 6.5510
- Accuracy: 0.1798
- Memory Allocated (gb): 2.43
- Max Memory Allocated (gb): 21.35
- Total Memory Available (gb): 94.62
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- distributed_type: multi-GPU
- num_devices: 8
- total_train_batch_size: 512
- total_eval_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-06
- lr_scheduler_type: linear
- num_epochs: 100.0
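For reference, the listed totals are the per-device batch sizes scaled by the eight devices: $64 \times 8 = 512$ for both training and evaluation.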
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Memory Allocated (gb) | Max Memory Allocated (gb) | Total Memory Available (gb) |
|:-------------:|:-----:|:------:|:---------------:|:--------:|:--------------:|:---------------------:|:---------------------:|
| 4.0897 | 1.0 | 1313 | 6.5606 | 0.0689 | 2.46 | 21.24 | 94.62 |
| 2.8297 | 2.0 | 2626 | 6.3807 | 0.0806 | 2.46 | 21.25 | 94.62 |
| 2.3951 | 3.0 | 3939 | 6.4208 | 0.0884 | 2.46 | 21.25 | 94.62 |
| 2.1003 | 4.0 | 5252 | 6.1807 | 0.1014 | 2.46 | 21.25 | 94.62 |
| 1.93 | 5.0 | 6565 | 6.3296 | 0.1027 | 2.46 | 21.25 | 94.62 |
| 1.7486 | 6.0 | 7878 | 6.2377 | 0.1091 | 2.46 | 21.25 | 94.62 |
| 1.6498 | 7.0 | 9191 | 6.4809 | 0.1056 | 2.46 | 21.25 | 94.62 |
| 1.5722 | 8.0 | 10504 | 6.2681 | 0.1150 | 2.46 | 21.25 | 94.62 |
| 1.458 | 9.0 | 11817 | 6.2907 | 0.1137 | 2.46 | 21.25 | 94.62 |
| 1.4131 | 10.0 | 13130 | 6.4250 | 0.1149 | 2.46 | 21.25 | 94.62 |
| 1.3132 | 11.0 | 14443 | 6.4233 | 0.1148 | 2.46 | 21.28 | 94.62 |
| 1.2835 | 12.0 | 15756 | 6.3784 | 0.1192 | 2.46 | 21.28 | 94.62 |
| 1.2414 | 13.0 | 17069 | 6.4416 | 0.1161 | 2.46 | 21.28 | 94.62 |
| 1.1652 | 14.0 | 18382 | 6.5069 | 0.1194 | 2.46 | 21.28 | 94.62 |
| 1.1415 | 15.0 | 19695 | 6.3847 | 0.1265 | 2.46 | 21.28 | 94.62 |
| 1.118 | 16.0 | 21008 | 6.3110 | 0.1265 | 2.46 | 21.28 | 94.62 |
| 1.065 | 17.0 | 22321 | 6.4024 | 0.1283 | 2.46 | 21.28 | 94.62 |
| 1.0469 | 18.0 | 23634 | 6.1888 | 0.1318 | 2.46 | 21.28 | 94.62 |
| 0.978 | 19.0 | 24947 | 6.4888 | 0.1328 | 2.46 | 21.28 | 94.62 |
| 0.9734 | 20.0 | 26260 | 6.3570 | 0.1328 | 2.46 | 21.28 | 94.62 |
| 0.9602 | 21.0 | 27573 | 6.2379 | 0.1335 | 2.46 | 21.28 | 94.62 |
| 0.9069 | 22.0 | 28886 | 6.3066 | 0.1334 | 2.46 | 21.28 | 94.62 |
| 0.8996 | 23.0 | 30199 | 6.2510 | 0.1328 | 2.46 | 21.28 | 94.62 |
| 0.893 | 24.0 | 31512 | 6.4094 | 0.1370 | 2.46 | 21.28 | 94.62 |
| 0.8494 | 25.0 | 32825 | 6.3232 | 0.1386 | 2.46 | 21.28 | 94.62 |
| 0.8507 | 26.0 | 34138 | 6.4262 | 0.1361 | 2.46 | 21.28 | 94.62 |
| 0.8065 | 27.0 | 35451 | 6.4156 | 0.1339 | 2.46 | 21.28 | 94.62 |
| 0.7956 | 28.0 | 36764 | 6.2878 | 0.1395 | 2.46 | 21.28 | 94.62 |
| 0.7889 | 29.0 | 38077 | 6.4994 | 0.1349 | 2.46 | 21.28 | 94.62 |
| 0.7645 | 30.0 | 39390 | 6.4939 | 0.1407 | 2.46 | 21.28 | 94.62 |
| 0.7548 | 31.0 | 40703 | 6.4849 | 0.1375 | 2.46 | 21.28 | 94.62 |
| 0.7494 | 32.0 | 42016 | 6.5542 | 0.1415 | 2.46 | 21.28 | 94.62 |
| 0.7162 | 33.0 | 43329 | 6.4573 | 0.1418 | 2.46 | 21.28 | 94.62 |
| 0.7109 | 34.0 | 44642 | 6.4910 | 0.1414 | 2.46 | 21.28 | 94.62 |
| 0.683 | 35.0 | 45955 | 6.4313 | 0.1411 | 2.46 | 21.28 | 94.62 |
| 0.6828 | 36.0 | 47268 | 6.3059 | 0.1456 | 2.46 | 21.28 | 94.62 |
| 0.6772 | 37.0 | 48581 | 6.3764 | 0.1464 | 2.46 | 21.28 | 94.62 |
| 0.652 | 38.0 | 49894 | 6.3437 | 0.1502 | 2.46 | 21.28 | 94.62 |
| 0.6533 | 39.0 | 51207 | 6.3493 | 0.1470 | 2.46 | 21.28 | 94.62 |
| 0.6527 | 40.0 | 52520 | 6.3078 | 0.1481 | 2.46 | 21.28 | 94.62 |
| 0.633 | 41.0 | 53833 | 6.5351 | 0.1413 | 2.46 | 21.28 | 94.62 |
| 0.6219 | 42.0 | 55146 | 6.3772 | 0.1492 | 2.46 | 21.28 | 94.62 |
| 0.6053 | 43.0 | 56459 | 6.4808 | 0.1481 | 2.46 | 21.28 | 94.62 |
| 0.5996 | 44.0 | 57772 | 6.5651 | 0.1480 | 2.46 | 21.28 | 94.62 |
| 0.5974 | 45.0 | 59085 | 6.5338 | 0.1488 | 2.46 | 21.28 | 94.62 |
| 0.5818 | 46.0 | 60398 | 6.3044 | 0.1524 | 2.46 | 21.28 | 94.62 |
| 0.5803 | 47.0 | 61711 | 6.5366 | 0.1514 | 2.46 | 21.28 | 94.62 |
| 0.573 | 48.0 | 63024 | 6.4783 | 0.1528 | 2.46 | 21.28 | 94.62 |
| 0.551 | 49.0 | 64337 | 6.4941 | 0.1540 | 2.46 | 21.28 | 94.62 |
| 0.5447 | 50.0 | 65650 | 6.4514 | 0.1528 | 2.46 | 21.28 | 94.62 |
| 0.5326 | 51.0 | 66963 | 6.3732 | 0.1547 | 2.46 | 21.28 | 94.62 |
| 0.5307 | 52.0 | 68276 | 6.5803 | 0.1546 | 2.46 | 21.28 | 94.62 |
| 0.5265 | 53.0 | 69589 | 6.2254 | 0.1594 | 2.46 | 21.28 | 94.62 |
| 0.5216 | 54.0 | 70902 | 6.2881 | 0.1574 | 2.46 | 21.28 | 94.62 |
| 0.5214 | 55.0 | 72215 | 6.4118 | 0.1564 | 2.46 | 21.28 | 94.62 |
| 0.5163 | 56.0 | 73528 | 6.4703 | 0.1574 | 2.46 | 21.28 | 94.62 |
| 0.4954 | 57.0 | 74841 | 6.3910 | 0.1602 | 2.46 | 21.28 | 94.62 |
| 0.4946 | 58.0 | 76154 | 6.4567 | 0.1607 | 2.46 | 21.28 | 94.62 |
| 0.4764 | 59.0 | 77467 | 6.4750 | 0.1592 | 2.46 | 21.28 | 94.62 |
| 0.4797 | 60.0 | 78780 | 6.5071 | 0.1580 | 2.46 | 21.28 | 94.62 |
| 0.4773 | 61.0 | 80093 | 6.2996 | 0.1649 | 2.46 | 21.28 | 94.62 |
| 0.4638 | 62.0 | 81406 | 6.3757 | 0.1582 | 2.46 | 21.35 | 94.62 |
| 0.4634 | 63.0 | 82719 | 6.4944 | 0.1579 | 2.46 | 21.35 | 94.62 |
| 0.4605 | 64.0 | 84032 | 6.6361 | 0.1573 | 2.46 | 21.35 | 94.62 |
| 0.4541 | 65.0 | 85345 | 6.5321 | 0.1566 | 2.46 | 21.35 | 94.62 |
| 0.447 | 66.0 | 86658 | 6.2949 | 0.1647 | 2.46 | 21.35 | 94.62 |
| 0.4392 | 67.0 | 87971 | 6.4294 | 0.1616 | 2.46 | 21.35 | 94.62 |
| 0.4319 | 68.0 | 89284 | 6.4686 | 0.1657 | 2.46 | 21.35 | 94.62 |
| 0.4321 | 69.0 | 90597 | 6.5044 | 0.1654 | 2.46 | 21.35 | 94.62 |
| 0.4239 | 70.0 | 91910 | 6.2884 | 0.1670 | 2.46 | 21.35 | 94.62 |
| 0.424 | 71.0 | 93223 | 6.4557 | 0.1650 | 2.46 | 21.35 | 94.62 |
| 0.4189 | 72.0 | 94536 | 6.5151 | 0.1643 | 2.46 | 21.35 | 94.62 |
| 0.4056 | 73.0 | 95849 | 6.4498 | 0.1685 | 2.46 | 21.35 | 94.62 |
| 0.4113 | 74.0 | 97162 | 6.4636 | 0.1672 | 2.46 | 21.35 | 94.62 |
| 0.4031 | 75.0 | 98475 | 6.6464 | 0.1627 | 2.46 | 21.35 | 94.62 |
| 0.3965 | 76.0 | 99788 | 6.5633 | 0.1686 | 2.46 | 21.35 | 94.62 |
| 0.393 | 77.0 | 101101 | 6.6878 | 0.1633 | 2.46 | 21.35 | 94.62 |
| 0.3958 | 78.0 | 102414 | 6.4100 | 0.1742 | 2.46 | 21.35 | 94.62 |
| 0.3848 | 79.0 | 103727 | 6.5372 | 0.1708 | 2.46 | 21.35 | 94.62 |
| 0.3785 | 80.0 | 105040 | 6.4460 | 0.1702 | 2.46 | 21.35 | 94.62 |
| 0.3709 | 81.0 | 106353 | 6.4497 | 0.1763 | 2.46 | 21.35 | 94.62 |
| 0.3692 | 82.0 | 107666 | 6.4494 | 0.1746 | 2.46 | 21.35 | 94.62 |
| 0.3667 | 83.0 | 108979 | 6.4787 | 0.1733 | 2.46 | 21.35 | 94.62 |
| 0.3642 | 84.0 | 110292 | 6.3792 | 0.1762 | 2.46 | 21.35 | 94.62 |
| 0.3648 | 85.0 | 111605 | 6.4105 | 0.1784 | 2.46 | 21.35 | 94.62 |
| 0.3595 | 86.0 | 112918 | 6.6821 | 0.1718 | 2.46 | 21.35 | 94.62 |
| 0.3575 | 87.0 | 114231 | 6.5187 | 0.1763 | 2.46 | 21.35 | 94.62 |
| 0.3512 | 88.0 | 115544 | 6.5861 | 0.1752 | 2.46 | 21.35 | 94.62 |
| 0.3416 | 89.0 | 116857 | 6.5337 | 0.1772 | 2.46 | 21.35 | 94.62 |
| 0.3454 | 90.0 | 118170 | 6.6075 | 0.1758 | 2.46 | 21.35 | 94.62 |
| 0.3401 | 91.0 | 119483 | 6.5369 | 0.1759 | 2.46 | 21.35 | 94.62 |
| 0.3361 | 92.0 | 120796 | 6.6148 | 0.1777 | 2.46 | 21.35 | 94.62 |
| 0.3377 | 93.0 | 122109 | 6.4843 | 0.1799 | 2.46 | 21.35 | 94.62 |
| 0.3344 | 94.0 | 123422 | 6.4471 | 0.1790 | 2.46 | 21.35 | 94.62 |
| 0.3262 | 95.0 | 124735 | 6.4506 | 0.1810 | 2.46 | 21.35 | 94.62 |
| 0.3228 | 96.0 | 126048 | 6.5665 | 0.1794 | 2.46 | 21.35 | 94.62 |
| 0.327 | 97.0 | 127361 | 6.5349 | 0.1793 | 2.46 | 21.35 | 94.62 |
| 0.3275 | 98.0 | 128674 | 6.5128 | 0.1799 | 2.46 | 21.35 | 94.62 |
| 0.321 | 99.0 | 129987 | 6.5574 | 0.1801 | 2.46 | 21.35 | 94.62 |
| 0.3217 | 100.0 | 131300 | 6.5510 | 0.1798 | 2.46 | 21.35 | 94.62 |
### Framework versions
- Transformers 4.45.2
- Pytorch 2.6.0+hpu_1.20.0-543.git4952fce
- Datasets 3.5.0
- Tokenizers 0.20.3
|
[
"3150",
"7279",
"12267",
"15267",
"8181",
"12263",
"6469",
"15322",
"755",
"184",
"3501",
"8749",
"5411",
"11090",
"1176",
"15154",
"7198",
"13439",
"1432",
"2219",
"7338",
"9986",
"5439",
"12921",
"4688",
"8686",
"9356",
"12530",
"6406",
"10983",
"3705",
"13494",
"2008",
"3537",
"2485",
"9773",
"15450",
"13679",
"13236",
"13793",
"11717",
"8123",
"11098",
"6493",
"10737",
"3270",
"10288",
"6881",
"1657",
"1460",
"13482",
"6820",
"11248",
"2080",
"3247",
"15029",
"7639",
"10799",
"711",
"13596",
"9392",
"13049",
"12499",
"10341",
"11730",
"3049",
"2089",
"4350",
"1203",
"13403",
"911",
"14681",
"1612",
"1275",
"10697",
"9871",
"9072",
"7353",
"10624",
"113",
"2554",
"920",
"3201",
"11067",
"7312",
"9178",
"8348",
"7905",
"9549",
"1679",
"13655",
"2647",
"11344",
"11748",
"11809",
"2757",
"3925",
"12044",
"11473",
"1750",
"3267",
"3037",
"304",
"991",
"217",
"4724",
"3454",
"15335",
"5825",
"8084",
"4578",
"8303",
"12034",
"5806",
"10313",
"12244",
"9297",
"2465",
"9266",
"12956",
"320",
"14986",
"4926",
"7960",
"14747",
"15400",
"3055",
"9956",
"11559",
"9918",
"14619",
"5290",
"14312",
"14238",
"8357",
"2521",
"7929",
"3134",
"5947",
"2351",
"14258",
"3728",
"7445",
"8726",
"799",
"10015",
"1055",
"11872",
"9054",
"10441",
"13763",
"13553",
"14942",
"12084",
"13734",
"10029",
"5057",
"2460",
"7950",
"5950",
"272",
"13784",
"1403",
"15494",
"7972",
"8917",
"14000",
"11507",
"3006",
"1351",
"6180",
"2951",
"10109",
"1480",
"442",
"5591",
"12525",
"977",
"13014",
"5630",
"9079",
"3333",
"11846",
"8349",
"11491",
"6695",
"11335",
"14542",
"14277",
"9015",
"6201",
"12181",
"5422",
"6518",
"12641",
"12427",
"3428",
"9622",
"11366",
"9983",
"1879",
"15156",
"2515",
"14678",
"5162",
"3650",
"3948",
"10411",
"1997",
"5344",
"12848",
"14653",
"14759",
"12646",
"2961",
"2920",
"8380",
"12454",
"10385",
"8757",
"6034",
"5017",
"1196",
"8074",
"1433",
"3121",
"7035",
"7638",
"2793",
"10598",
"5066",
"11089",
"9046",
"6334",
"729",
"2664",
"2444",
"6738",
"2567",
"14058",
"132",
"10924",
"6464",
"13740",
"5999",
"11154",
"2221",
"233",
"7736",
"5983",
"2919",
"13839",
"11427",
"7728",
"127",
"9926",
"10951",
"12221",
"12858",
"1784",
"9377",
"8428",
"12376",
"13416",
"13148",
"5128",
"4333",
"9736",
"5423",
"5526",
"14570",
"10560",
"9847",
"7050",
"11770",
"11204",
"10956",
"11287",
"12460",
"8370",
"3597",
"1867",
"6485",
"10485",
"3383",
"3298",
"6840",
"9105",
"9791",
"13467",
"4094",
"2751",
"7902",
"10891",
"14473",
"13379",
"15375",
"10114",
"4828",
"12828",
"12413",
"142",
"7434",
"782",
"12959",
"8848",
"94",
"12350",
"11180",
"5102",
"13705",
"14586",
"4984",
"6420",
"6854",
"963",
"9452",
"2643",
"5774",
"13949",
"11305",
"4904",
"3305",
"10841",
"11382",
"14140",
"7677",
"11964",
"811",
"9588",
"4961",
"15300",
"1815",
"14150",
"2752",
"5887",
"6867",
"11426",
"13222",
"13909",
"3197",
"5256",
"13862",
"3720",
"580",
"10421",
"15002",
"996",
"12540",
"9261",
"6241",
"13307",
"9515",
"5904",
"9006",
"12099",
"11073",
"12914",
"8678",
"4455",
"12369",
"4793",
"4894",
"15346",
"7830",
"6711",
"10704",
"6117",
"5694",
"2363",
"11547",
"6305",
"5069",
"12718",
"11966",
"13719",
"1261",
"9335",
"7220",
"400",
"7322",
"11639",
"99",
"12269",
"9002",
"3012",
"9754",
"1762",
"5462",
"1553",
"7689",
"10532",
"5165",
"13962",
"4519",
"10628",
"6977",
"8932",
"11821",
"7760",
"8508",
"168",
"10043",
"11718",
"9776",
"12747",
"5388",
"252",
"12381",
"5674",
"5729",
"11358",
"2782",
"6239",
"5174",
"415",
"10245",
"14122",
"11122",
"12065",
"12424",
"10575",
"10738",
"7037",
"6690",
"10103",
"13005",
"7705",
"6633",
"15100",
"5115",
"10211",
"13022",
"5310",
"11725",
"525",
"4851",
"2512",
"14512",
"8831",
"176",
"3593",
"9300",
"5222",
"4877",
"8462",
"11258",
"2346",
"12944",
"13249",
"8902",
"939",
"8947",
"4026",
"15480",
"10261",
"2558",
"679",
"1822",
"11166",
"529",
"7041",
"12882",
"15050",
"9087",
"11959",
"443",
"4276",
"4489",
"171",
"3097",
"9706",
"6073",
"11145",
"1473",
"9289",
"8336",
"13308",
"13733",
"5146",
"7974",
"15057",
"11464",
"9651",
"13070",
"4035",
"13779",
"6198",
"12096",
"11209",
"2955",
"14608",
"6479",
"1183",
"8037",
"5813",
"10674",
"626",
"10811",
"7882",
"12563",
"11841",
"5082",
"3005",
"9735",
"5485",
"13598",
"3992",
"6052",
"10746",
"7280",
"12930",
"4537",
"4179",
"12183",
"12522",
"10234",
"15030",
"4539",
"7650",
"4974",
"11208",
"3017",
"3686",
"10006",
"738",
"11998",
"2786",
"4381",
"12496",
"9406",
"13756",
"10948",
"14699",
"1529",
"1585",
"10992",
"1322",
"5311",
"10452",
"2502",
"2594",
"15199",
"5993",
"9569",
"13020",
"11343",
"1735",
"8632",
"288",
"3714",
"5822",
"3425",
"9282",
"8452",
"3335",
"261",
"4127",
"2406",
"530",
"14904",
"13394",
"8522",
"7021",
"7942",
"7318",
"9383",
"2116",
"10338",
"8528",
"1848",
"14524",
"1753",
"4814",
"15055",
"12491",
"13846",
"12115",
"6579",
"826",
"8323",
"1240",
"15187",
"5808",
"14587",
"1563",
"15482",
"7957",
"10829",
"9089",
"14208",
"10477",
"2240",
"5623",
"1713",
"9078",
"15064",
"861",
"2656",
"3575",
"8629",
"7399",
"564",
"7450",
"2105",
"14328",
"11660",
"14350",
"7208",
"1371",
"91",
"8627",
"8669",
"15485",
"12560",
"3209",
"11311",
"6829",
"6330",
"3251",
"12933",
"5874",
"14669",
"9553",
"4605",
"1487",
"12436",
"2344",
"469",
"605",
"8377",
"13171",
"623",
"10080",
"4595",
"13654",
"1721",
"11002",
"11526",
"2463",
"10096",
"13951",
"7803",
"4439",
"3131",
"13545",
"9045",
"9915",
"12363",
"12298",
"14719",
"11402",
"4031",
"6592",
"14059",
"14069",
"1986",
"8104",
"7301",
"2115",
"13936",
"8587",
"11549",
"12520",
"4597",
"2702",
"13077",
"13233",
"11047",
"12021",
"5937",
"15299",
"240",
"2962",
"6273",
"13552",
"11612",
"7764",
"12218",
"11756",
"6077",
"3060",
"11411",
"12762",
"554",
"421",
"12865",
"1124",
"12430",
"7106",
"4341",
"6162",
"1730",
"10208",
"13551",
"14726",
"14977",
"11948",
"3208",
"11716",
"8207",
"4951",
"7324",
"7186",
"7071",
"728",
"1353",
"2451",
"3910",
"15257",
"10856",
"4854",
"13567",
"1708",
"8611",
"4580",
"13239",
"11056",
"13818",
"332",
"2922",
"13752",
"27",
"381",
"2073",
"7305",
"10290",
"5882",
"1992",
"13850",
"301",
"14709",
"7498",
"9941",
"15306",
"3750",
"4354",
"7507",
"4625",
"14869",
"7337",
"785",
"5015",
"13690",
"87",
"2508",
"3183",
"10483",
"8569",
"13028",
"6993",
"10842",
"10786",
"11534",
"3667",
"8532",
"11671",
"14453",
"12132",
"7397",
"8884",
"448",
"9658",
"7930",
"7418",
"5759",
"1287",
"1803",
"12784",
"12",
"1984",
"1017",
"6900",
"13069",
"10358",
"6582",
"1126",
"12999",
"1380",
"13530",
"4157",
"12709",
"5701",
"11854",
"13689",
"8670",
"5989",
"5727",
"12561",
"8341",
"2595",
"13448",
"7105",
"14812",
"150",
"12050",
"6013",
"8383",
"11400",
"11409",
"8222",
"12905",
"13333",
"5678",
"5684",
"15458",
"428",
"8471",
"13538",
"7923",
"7828",
"2792",
"1377",
"6963",
"13519",
"8188",
"11691",
"10932",
"7031",
"2194",
"14244",
"5055",
"8126",
"9886",
"10128",
"8354",
"3002",
"2681",
"11617",
"6866",
"7421",
"13381",
"3855",
"11535",
"4566",
"3863",
"12654",
"11757",
"2345",
"7247",
"6495",
"12038",
"7017",
"10067",
"2533",
"7162",
"7509",
"15428",
"3417",
"12095",
"4582",
"10988",
"10604",
"11825",
"7459",
"3103",
"10383",
"1733",
"5342",
"1145",
"3565",
"2551",
"2249",
"5584",
"6431",
"13922",
"2079",
"11683",
"8612",
"1575",
"13787",
"3475",
"1701",
"11388",
"3550",
"8197",
"2112",
"7785",
"15436",
"38",
"1286",
"2452",
"8142",
"7147",
"7733",
"6487",
"14357",
"9891",
"9878",
"11012",
"8595",
"10892",
"5781",
"4404",
"1166",
"3780",
"11135",
"12911",
"14118",
"14014",
"3759",
"5252",
"8294",
"14199",
"496",
"8653",
"2848",
"14979",
"14036",
"2611",
"9456",
"6538",
"1039",
"2291",
"8696",
"9445",
"7268",
"4955",
"13035",
"9630",
"8147",
"8955",
"3429",
"13360",
"15444",
"11557",
"5305",
"10391",
"4859",
"59",
"9957",
"1041",
"8999",
"5001",
"5894",
"4193",
"12439",
"12100",
"2422",
"5695",
"6946",
"9150",
"12521",
"7194",
"2239",
"8264",
"11431",
"14141",
"10050",
"7715",
"7372",
"4836",
"13046",
"14384",
"14811",
"10343",
"1431",
"13043",
"6328",
"11486",
"7007",
"12645",
"3881",
"13582",
"13804",
"4731",
"5394",
"4242",
"6172",
"13229",
"5657",
"8904",
"14367",
"1650",
"10942",
"12001",
"3504",
"13182",
"734",
"7749",
"5537",
"5409",
"3657",
"67",
"7472",
"4588",
"4235",
"12860",
"230",
"7816",
"9875",
"10695",
"11950",
"696",
"4728",
"13943",
"6228",
"1619",
"4430",
"1333",
"14519",
"14701",
"5663",
"7027",
"9673",
"6534",
"4111",
"5163",
"9968",
"7210",
"12377",
"11217",
"4748",
"6477",
"3093",
"3499",
"14764",
"14730",
"6316",
"235",
"566",
"8688",
"1880",
"3285",
"2513",
"14566",
"239",
"5912",
"11121",
"1933",
"209",
"11392",
"6552",
"4199",
"2975",
"7020",
"10909",
"3505",
"10978",
"10677",
"5108",
"6069",
"10166",
"15376",
"13728",
"10553",
"13137",
"538",
"8069",
"5598",
"15070",
"11817",
"12227",
"5303",
"6108",
"14703",
"2047",
"8241",
"561",
"7093",
"6716",
"12952",
"1976",
"8288",
"1142",
"831",
"8603",
"791",
"6713",
"3632",
"14163",
"12079",
"3472",
"13399",
"7948",
"1361",
"3248",
"190",
"1157",
"12967",
"13745",
"5920",
"2811",
"6271",
"4459",
"7124",
"8987",
"9713",
"15373",
"5099",
"14066",
"5641",
"5858",
"9614",
"9523",
"3168",
"1170",
"3184",
"8785",
"6161",
"1696",
"6620",
"5453",
"1381",
"7068",
"12579",
"11386",
"3967",
"3367",
"12727",
"8841",
"9628",
"3987",
"9746",
"6759",
"4116",
"11659",
"11799",
"5768",
"7651",
"5446",
"3542",
"1329",
"14683",
"5786",
"9779",
"3554",
"6951",
"11478",
"7067",
"12593",
"2534",
"14065",
"86",
"1883",
"14649",
"5672",
"12720",
"15254",
"13910",
"1526",
"3850",
"409",
"7119",
"6630",
"8639",
"2317",
"52",
"897",
"7658",
"4010",
"2727",
"2748",
"10013",
"4576",
"3951",
"925",
"1593",
"1872",
"10158",
"15075",
"14816",
"11292",
"353",
"4289",
"8772",
"4222",
"14944",
"14792",
"10730",
"1570",
"10076",
"7896",
"7062",
"8255",
"10675",
"6036",
"1468",
"10662",
"3943",
"4648",
"6038",
"3498",
"591",
"8685",
"2769",
"12919",
"6761",
"10788",
"562",
"2048",
"14685",
"9244",
"11533",
"5261",
"1500",
"13368",
"579",
"8592",
"3630",
"698",
"8512",
"11810",
"788",
"9313",
"11940",
"13911",
"9759",
"420",
"15403",
"2581",
"2228",
"14472",
"5333",
"10556",
"14528",
"2334",
"923",
"2869",
"12514",
"12367",
"5143",
"3623",
"5512",
"13487",
"4250",
"3027",
"6674",
"12097",
"8557",
"14313",
"5608",
"10756",
"7431",
"4266",
"2429",
"11632",
"65",
"8486",
"8537",
"9608",
"6675",
"13659",
"8020",
"478",
"14419",
"8100",
"682",
"10716",
"12500",
"7706",
"12058",
"1198",
"4208",
"2490",
"1394",
"537",
"13185",
"1858",
"14967",
"3380",
"857",
"3158",
"1205",
"5416",
"10616",
"6619",
"9305",
"646",
"3171",
"15155",
"11800",
"9564",
"2268",
"12760",
"4191",
"4065",
"3015",
"4799",
"10278",
"10092",
"14888",
"12139",
"4998",
"5289",
"6742",
"14628",
"7819",
"7065",
"4299",
"15025",
"5300",
"5211",
"13188",
"13749",
"11778",
"3543",
"1521",
"2576",
"8301",
"8353",
"3068",
"9710",
"10308",
"1188",
"6096",
"10980",
"8781",
"12976",
"4747",
"14063",
"14045",
"8619",
"14329",
"11845",
"8490",
"1340",
"1225",
"3110",
"2171",
"11921",
"3360",
"3676",
"7096",
"8225",
"11981",
"1646",
"3320",
"8748",
"11176",
"3958",
"9743",
"9330",
"2136",
"5975",
"6202",
"10779",
"14481",
"2473",
"1863",
"124",
"4572",
"2881",
"5523",
"9841",
"950",
"2093",
"5711",
"11866",
"902",
"2418",
"2348",
"15192",
"6383",
"14025",
"8062",
"11781",
"14980",
"10962",
"3381",
"9765",
"8116",
"358",
"1354",
"574",
"433",
"8165",
"14599",
"3046",
"704",
"13600",
"13516",
"5149",
"5103",
"5906",
"7617",
"13026",
"12600",
"9165",
"9388",
"3673",
"12576",
"9529",
"56",
"8372",
"5870",
"460",
"10922",
"10113",
"5503",
"10996",
"13486",
"4607",
"220",
"15406",
"10586",
"10413",
"11861",
"13344",
"9106",
"12745",
"8626",
"6890",
"10136",
"9798",
"12327",
"2617",
"2497",
"13539",
"6253",
"6501",
"6333",
"5397",
"1192",
"11975",
"3202",
"14842",
"14945",
"5451",
"11860",
"6564",
"3419",
"15117",
"5552",
"14731",
"5977",
"10407",
"13120",
"14918",
"11790",
"9543",
"763",
"3090",
"9960",
"7799",
"10660",
"13564",
"5667",
"15353",
"49",
"2671",
"9674",
"15172",
"12336",
"11492",
"11924",
"4115",
"7185",
"6685",
"3474",
"1718",
"599",
"10814",
"1922",
"10436",
"7538",
"1884",
"10807",
"11646",
"14235",
"15481",
"3316",
"2250",
"8656",
"14005",
"13980",
"12231",
"5376",
"1789",
"12307",
"11827",
"13715",
"2907",
"12036",
"10311",
"8521",
"1774",
"12888",
"9889",
"2003",
"12426",
"10213",
"4233",
"2790",
"2585",
"2488",
"14672",
"12778",
"3574",
"12193",
"14982",
"13454",
"11113",
"7170",
"3261",
"14173",
"2147",
"1804",
"7139",
"1388",
"751",
"11847",
"2243",
"8784",
"15221",
"14226",
"12293",
"8624",
"2992",
"9887",
"10876",
"7937",
"10622",
"1874",
"2133",
"8075",
"4055",
"11609",
"3113",
"12504",
"6673",
"5396",
"8004",
"12769",
"6164",
"10541",
"7861",
"8727",
"1541",
"5037",
"15204",
"7719",
"10453",
"6907",
"15109",
"5769",
"3178",
"5093",
"5984",
"5387",
"7488",
"8693",
"8018",
"9980",
"7375",
"10691",
"5779",
"11769",
"7287",
"7572",
"9116",
"5048",
"1005",
"216",
"5104",
"8615",
"8",
"7712",
"13817",
"634",
"10578",
"9894",
"368",
"2010",
"1078",
"5318",
"4543",
"11504",
"1909",
"3693",
"5918",
"10820",
"13549",
"3653",
"1276",
"3806",
"9938",
"11115",
"13218",
"12412",
"13270",
"577",
"681",
"680",
"10759",
"10438",
"3106",
"10233",
"1391",
"7801",
"14955",
"2359",
"11714",
"6294",
"11543",
"12373",
"800",
"15012",
"1095",
"1532",
"9531",
"2049",
"1940",
"11453",
"735",
"10408",
"7688",
"15293",
"3185",
"13183",
"10053",
"215",
"8687",
"3559",
"14435",
"3579",
"10178",
"12623",
"9896",
"3230",
"12741",
"4490",
"15175",
"15486",
"10063",
"6879",
"15032",
"11200",
"13680",
"1562",
"4526",
"9415",
"10454",
"1796",
"5316",
"15418",
"13501",
"10634",
"2214",
"7594",
"6446",
"9828",
"4914",
"5117",
"12571",
"2603",
"11265",
"2321",
"4393",
"5323",
"9925",
"12242",
"4357",
"8505",
"2860",
"1037",
"7518",
"9719",
"6071",
"6331",
"8388",
"2272",
"2388",
"4114",
"9835",
"11162",
"4419",
"1048",
"6267",
"9807",
"2672",
"13431",
"14092",
"7938",
"14650",
"4751",
"6224",
"12824",
"2377",
"5520",
"7308",
"11840",
"2233",
"12199",
"2634",
"9824",
"9995",
"313",
"6933",
"2650",
"7165",
"15467",
"7516",
"10292",
"768",
"3039",
"3794",
"5391",
"507",
"1251",
"6588",
"9667",
"7043",
"13857",
"15251",
"8503",
"10085",
"7052",
"1438",
"2856",
"628",
"9461",
"14974",
"12098",
"8024",
"8249",
"10078",
"11193",
"5022",
"13498",
"14475",
"9916",
"4834",
"13541",
"8747",
"2717",
"13918",
"2590",
"2379",
"1053",
"8208",
"14291",
"14919",
"3955",
"14310",
"12032",
"11838",
"10955",
"13000",
"4473",
"15297",
"8033",
"13651",
"10155",
"11238",
"8944",
"12925",
"1452",
"8314",
"7373",
"6551",
"7631",
"5100",
"11600",
"10615",
"14758",
"5296",
"2364",
"4542",
"6404",
"11110",
"2810",
"15466",
"7767",
"7898",
"12321",
"3216",
"2538",
"7209",
"2343",
"9168",
"486",
"2845",
"11338",
"6635",
"6750",
"2435",
"9510",
"4819",
"2327",
"13156",
"9636",
"5722",
"48",
"9833",
"1964",
"12005",
"12117",
"14354",
"11128",
"1647",
"4550",
"6123",
"13037",
"12744",
"6929",
"8259",
"943",
"6671",
"5484",
"7852",
"8059",
"10550",
"7752",
"5011",
"7369",
"7849",
"4411",
"2271",
"6155",
"6430",
"11240",
"15124",
"8283",
"14287",
"7966",
"10205",
"13475",
"8175",
"4743",
"11455",
"12863",
"12495",
"3636",
"2885",
"13507",
"8302",
"8799",
"15118",
"583",
"7115",
"14640",
"10940",
"6549",
"13718",
"2496",
"3974",
"6704",
"12270",
"15498",
"10777",
"5710",
"5871",
"13446",
"8394",
"5649",
"6544",
"13528",
"11650",
"5216",
"6680",
"15453",
"1435",
"9932",
"2938",
"15079",
"7606",
"7278",
"10664",
"13653",
"11842",
"4144",
"870",
"13683",
"4994",
"11945",
"7860",
"7864",
"7804",
"5903",
"4468",
"10933",
"12105",
"13359",
"4056",
"3816",
"13133",
"13738",
"7621",
"5945",
"2566",
"12386",
"9679",
"302",
"11018",
"10822",
"222",
"8445",
"15000",
"2764",
"1030",
"5107",
"12488",
"2801",
"2357",
"12834",
"7351",
"389",
"10350",
"6566",
"8911",
"714",
"5136",
"8309",
"15479",
"9194",
"7940",
"5365",
"13799",
"13898",
"12947",
"6199",
"9250",
"12493",
"6499",
"12066",
"2186",
"6133",
"8409",
"11082",
"11561",
"1038",
"706",
"7454",
"780",
"1507",
"12302",
"4584",
"9368",
"5050",
"4483",
"5465",
"12706",
"9412",
"4764",
"1119",
"8000",
"14789",
"15240",
"3357",
"10221",
"15001",
"14529",
"14110",
"2957",
"11264",
"8129",
"13664",
"347",
"4121",
"14715",
"5454",
"5915",
"10328",
"10524",
"10378",
"13411",
"425",
"7967",
"6821",
"7371",
"10872",
"11107",
"1248",
"5490",
"13213",
"6318",
"13389",
"9561",
"13926",
"7655",
"6214",
"3095",
"6296",
"7754",
"5974",
"13681",
"1045",
"3898",
"14193",
"5688",
"6264",
"13426",
"383",
"3571",
"13287",
"2831",
"5190",
"11406",
"922",
"8795",
"5130",
"6659",
"12642",
"501",
"10124",
"6169",
"12480",
"14060",
"2378",
"14185",
"8975",
"3265",
"11349",
"4429",
"2063",
"2880",
"13622",
"8280",
"9067",
"3451",
"1232",
"10900",
"10192",
"7597",
"6844",
"1490",
"6370",
"2754",
"12483",
"8455",
"13725",
"5708",
"3711",
"4013",
"5739",
"7853",
"13144",
"5274",
"6751",
"767",
"121",
"14784",
"10760",
"63",
"2424",
"13138",
"5169",
"6585",
"6067",
"12182",
"8003",
"793",
"4456",
"3726",
"10305",
"1693",
"6147",
"10743",
"3032",
"8897",
"6346",
"14782",
"8378",
"14470",
"15394",
"11990",
"26",
"4424",
"262",
"563",
"7773",
"6233",
"5671",
"13645",
"4390",
"7660",
"5145",
"906",
"2472",
"4603",
"15238",
"13202",
"14802",
"802",
"12445",
"13294",
"24",
"5940",
"2227",
"11396",
"9757",
"959",
"12063",
"1567",
"312",
"1837",
"14488",
"9427",
"13785",
"2870",
"8281",
"6615",
"11876",
"8966",
"13842",
"1223",
"3979",
"2531",
"9058",
"4005",
"6244",
"9730",
"14034",
"14214",
"8943",
"2212",
"4916",
"5354",
"14894",
"9222",
"2352",
"9237",
"8212",
"6232",
"3791",
"1271",
"7363",
"309",
"2960",
"12776",
"12082",
"12184",
"4432",
"4684",
"12075",
"13025",
"13612",
"15218",
"3558",
"10859",
"2198",
"8039",
"5788",
"14883",
"12211",
"1975",
"13601",
"4206",
"1345",
"6085",
"10733",
"11497",
"5725",
"6046",
"9761",
"1749",
"4598",
"3703",
"3334",
"286",
"9624",
"7221",
"5193",
"6597",
"14552",
"2013",
"8305",
"13165",
"5952",
"10417",
"5326",
"4907",
"10275",
"1635",
"2738",
"6601",
"382",
"11828",
"1023",
"6842",
"9198",
"5291",
"12208",
"9684",
"11754",
"4379",
"6304",
"1074",
"14983",
"44",
"14294",
"9317",
"14527",
"7831",
"10014",
"612",
"10243",
"5088",
"2287",
"9917",
"731",
"2181",
"7897",
"11596",
"1252",
"13888",
"15365",
"321",
"3115",
"11425",
"8491",
"1383",
"2847",
"2631",
"10064",
"7276",
"6242",
"7559",
"1116",
"8547",
"1648",
"10088",
"1807",
"13647",
"7059",
"14705",
"7679",
"3539",
"13998",
"14728",
"14578",
"8365",
"8493",
"10629",
"10281",
"10312",
"4848",
"1294",
"14526",
"13238",
"589",
"2246",
"6222",
"6941",
"9349",
"3287",
"12537",
"6856",
"4442",
"10722",
"14831",
"7477",
"9647",
"8542",
"6427",
"4802",
"13768",
"8125",
"10415",
"1297",
"4156",
"14851",
"508",
"11882",
"11694",
"10727",
"539",
"9169",
"4621",
"7359",
"13466",
"10617",
"1404",
"12415",
"10339",
"8130",
"5390",
"11323",
"6884",
"4130",
"6072",
"9043",
"12144",
"4278",
"1482",
"401",
"7730",
"3434",
"4394",
"12940",
"6508",
"11663",
"11723",
"1132",
"8597",
"10257",
"3271",
"1632",
"945",
"5403",
"8561",
"14756",
"1099",
"7401",
"1746",
"8081",
"10701",
"3776",
"12972",
"10699",
"2704",
"14455",
"13714",
"13837",
"4129",
"311",
"7133",
"7916",
"9789",
"8734",
"14239",
"8379",
"9129",
"13543",
"9583",
"12868",
"269",
"11689",
"12719",
"6559",
"4997",
"14917",
"3929",
"8946",
"13815",
"1704",
"13056",
"8407",
"1699",
"12608",
"1651",
"5933",
"71",
"15454",
"578",
"720",
"7347",
"13324",
"14755",
"2546",
"8203",
"3236",
"14887",
"1466",
"8573",
"6276",
"12407",
"1409",
"3978",
"4888",
"11328",
"4723",
"14857",
"1869",
"9653",
"3885",
"7256",
"185",
"5258",
"14666",
"4586",
"14711",
"1724",
"2983",
"14298",
"10960",
"12333",
"5922",
"291",
"14161",
"12017",
"14352",
"8658",
"6186",
"12177",
"1263",
"13006",
"4041",
"13172",
"11034",
"8872",
"9147",
"5083",
"4267",
"2470",
"254",
"6754",
"3151",
"10143",
"7593",
"1026",
"12985",
"14625",
"7793",
"9698",
"11156",
"4119",
"3846",
"3479",
"14569",
"1485",
"7883",
"2989",
"8810",
"9665",
"2313",
"109",
"839",
"14271",
"13084",
"1798",
"9545",
"11518",
"8782",
"6082",
"532",
"13184",
"1548",
"15201",
"3556",
"7416",
"8006",
"6917",
"3327",
"2651",
"11638",
"6757",
"7058",
"11298",
"5755",
"10458",
"4635",
"10169",
"2005",
"13106",
"4180",
"10721",
"10860",
"521",
"4205",
"4837",
"8031",
"1596",
"4678",
"5081",
"14064",
"2453",
"4003",
"12161",
"10386",
"13565",
"5355",
"3232",
"6893",
"14109",
"9353",
"11023",
"14188",
"14881",
"6859",
"10009",
"6697",
"9030",
"1625",
"6131",
"1063",
"11529",
"2981",
"9642",
"3361",
"350",
"3874",
"8675",
"1633",
"10557",
"8832",
"9068",
"8218",
"12261",
"12693",
"3952",
"1740",
"2691",
"1875",
"15217",
"13455",
"15041",
"2988",
"12164",
"10518",
"13092",
"14197",
"9318",
"6",
"4359",
"1642",
"6987",
"1087",
"9869",
"4882",
"4175",
"3904",
"11463",
"8476",
"3010",
"15231",
"6035",
"9467",
"8928",
"6994",
"15472",
"7030",
"6168",
"13867",
"2208",
"13261",
"1164",
"4060",
"10506",
"14502",
"3892",
"1947",
"2028",
"11401",
"7619",
"1917",
"2715",
"3192",
"5818",
"5720",
"2495",
"2557",
"13615",
"12892",
"4332",
"11206",
"9463",
"6121",
"14201",
"3867",
"9853",
"12917",
"6058",
"10889",
"6026",
"9033",
"440",
"739",
"4898",
"4202",
"11077",
"13109",
"3860",
"13437",
"14866",
"9974",
"15125",
"2131",
"13514",
"9586",
"6919",
"2925",
"3461",
"6696",
"7961",
"11257",
"12174",
"6221",
"10935",
"12354",
"10574",
"12748",
"10937",
"3897",
"192",
"1741",
"5321",
"9538",
"7985",
"5986",
"849",
"6519",
"858",
"12508",
"6764",
"13042",
"2119",
"582",
"9742",
"14413",
"7956",
"3165",
"2382",
"2503",
"13972",
"7928",
"3351",
"3812",
"12980",
"7004",
"6976",
"6899",
"13628",
"12277",
"14440",
"11602",
"14853",
"14933",
"13973",
"7813",
"5854",
"12887",
"5262",
"11620",
"6760",
"6980",
"2732",
"3675",
"7622",
"2260",
"6109",
"11569",
"1268",
"14155",
"8816",
"12823",
"3696",
"10440",
"6027",
"4307",
"14738",
"5250",
"2304",
"13436",
"12595",
"14691",
"1805",
"5049",
"654",
"8109",
"892",
"3446",
"7611",
"2336",
"10789",
"12669",
"12353",
"2000",
"4847",
"2280",
"584",
"6016",
"10640",
"3336",
"7344",
"11466",
"904",
"10684",
"11509",
"5351",
"6513",
"13055",
"7008",
"10511",
"11814",
"10545",
"12818",
"635",
"7637",
"3717",
"2946",
"10203",
"9496",
"6311",
"7721",
"6146",
"4775",
"2117",
"7330",
"14990",
"479",
"14469",
"12684",
"9769",
"11368",
"14424",
"14809",
"6583",
"2825",
"10381",
"14929",
"6165",
"10653",
"2441",
"2038",
"14240",
"6873",
"4988",
"8120",
"14433",
"1865",
"14644",
"15159",
"12605",
"13395",
"2608",
"12119",
"712",
"13591",
"9542",
"2998",
"390",
"5062",
"10964",
"408",
"13387",
"4365",
"6986",
"4963",
"2624",
"12705",
"12871",
"14160",
"12983",
"12666",
"6702",
"11045",
"12971",
"1788",
"9533",
"88",
"9322",
"15022",
"10741",
"4797",
"9380",
"8994",
"2529",
"2867",
"9689",
"5032",
"15476",
"8531",
"7273",
"11905",
"11750",
"1949",
"3912",
"7400",
"4164",
"3534",
"1823",
"14144",
"7979",
"10011",
"13105",
"4438",
"1262",
"761",
"13710",
"5179",
"13531",
"3970",
"8328",
"10994",
"13370",
"10170",
"7843",
"3365",
"5911",
"4948",
"10974",
"8415",
"7991",
"9382",
"6374",
"998",
"14051",
"14106",
"13561",
"11233",
"3493",
"31",
"14800",
"2659",
"2927",
"2309",
"11871",
"2813",
"7153",
"3207",
"14269",
"15301",
"13102",
"5762",
"8720",
"3020",
"417",
"2719",
"14132",
"3679",
"7361",
"1763",
"5662",
"14116",
"11236",
"832",
"4857",
"14176",
"6465",
"4146",
"5255",
"10539",
"6995",
"7237",
"11086",
"13721",
"12226",
"997",
"11397",
"10373",
"11514",
"4630",
"6730",
"14891",
"12018",
"12303",
"11360",
"15137",
"3930",
"6610",
"5514",
"12507",
"8376",
"5868",
"13990",
"15266",
"4826",
"9819",
"3546",
"8982",
"2213",
"6599",
"3707",
"15024",
"2024",
"1561",
"6861",
"5412",
"14186",
"8066",
"14390",
"9160",
"12910",
"60",
"9882",
"11146",
"2783",
"2875",
"12147",
"3159",
"2110",
"73",
"4725",
"5031",
"11900",
"5595",
"5533",
"6661",
"14327",
"15449",
"14505",
"13782",
"4714",
"12864",
"1977",
"229",
"8589",
"14490",
"12257",
"12670",
"6498",
"912",
"10140",
"4128",
"8634",
"14145",
"1864",
"13255",
"2238",
"990",
"8187",
"7099",
"13557",
"806",
"1667",
"13444",
"4104",
"13814",
"8566",
"7111",
"4744",
"11149",
"4183",
"4389",
"4691",
"8990",
"4923",
"11273",
"13974",
"2325",
"12451",
"905",
"13826",
"10083",
"6676",
"7772",
"13220",
"13089",
"9284",
"5293",
"3231",
"9274",
"13524",
"14618",
"7193",
"14906",
"8560",
"14202",
"15129",
"11498",
"3155",
"1942",
"13513",
"3594",
"9241",
"8034",
"1397",
"4342",
"2559",
"12884",
"8215",
"14211",
"13286",
"4538",
"5313",
"8667",
"2011",
"5309",
"12689",
"5276",
"790",
"4565",
"7227",
"4380",
"11037",
"8905",
"12893",
"7643",
"9816",
"7299",
"10086",
"13457",
"5080",
"11955",
"7640",
"9855",
"1905",
"2312",
"7311",
"7781",
"7654",
"8368",
"2204",
"699",
"4686",
"5195",
"11185",
"14323",
"1213",
"7676",
"5716",
"9992",
"10569",
"5798",
"3306",
"5472",
"6889",
"15220",
"3082",
"7743",
"2324",
"2121",
"5187",
"6399",
"10223",
"5834",
"1464",
"13510",
"11063",
"2550",
"6648",
"5938",
"4929",
"11505",
"2489",
"6470",
"12114",
"4516",
"108",
"8984",
"6870",
"1932",
"7980",
"10244",
"8308",
"12389",
"1871",
"5877",
"3339",
"12958",
"12253",
"7243",
"149",
"5670",
"957",
"4497",
"8402",
"14021",
"5424",
"8319",
"12548",
"13794",
"10007",
"1479",
"14538",
"12275",
"7766",
"3911",
"15197",
"6076",
"14793",
"156",
"14723",
"1497",
"4400",
"6996",
"1001",
"5352",
"6798",
"11467",
"8200",
"2004",
"11976",
"588",
"4700",
"14643",
"12153",
"11589",
"7357",
"6998",
"6001",
"7616",
"6541",
"246",
"13863",
"7696",
"7023",
"3678",
"6115",
"15432",
"14302",
"5389",
"15457",
"2766",
"5251",
"13746",
"10878",
"2202",
"14517",
"7214",
"9440",
"5070",
"8454",
"10915",
"9138",
"1109",
"4318",
"5173",
"2776",
"10108",
"1127",
"10706",
"10564",
"1764",
"11326",
"13822",
"5009",
"118",
"8099",
"9838",
"13350",
"1877",
"4794",
"2745",
"6789",
"13181",
"7319",
"5540",
"3359",
"465",
"13544",
"10531",
"12134",
"13900",
"9350",
"11142",
"7714",
"547",
"13231",
"7811",
"1207",
"8581",
"7340",
"431",
"10831",
"4730",
"283",
"179",
"10945",
"3072",
"5201",
"3512",
"12816",
"13078",
"2187",
"11578",
"4036",
"10309",
"11304",
"2222",
"3927",
"15442",
"3388",
"7012",
"5929",
"6653",
"14557",
"13950",
"4737",
"514",
"4417",
"2740",
"11896",
"5828",
"2421",
"13008",
"7525",
"15371",
"1631",
"13892",
"4500",
"7710",
"6451",
"5461",
"15188",
"753",
"12431",
"3902",
"11019",
"4053",
"11510",
"2434",
"11341",
"14181",
"11031",
"1802",
"13191",
"14224",
"14658",
"8802",
"5502",
"13419",
"14004",
"2711",
"12387",
"8960",
"11615",
"10883",
"9690",
"1066",
"8332",
"11749",
"10588",
"10685",
"6054",
"8046",
"11731",
"11550",
"4063",
"14396",
"6384",
"4975",
"7229",
"792",
"5205",
"13490",
"4838",
"1130",
"13631",
"8101",
"5426",
"14174",
"8467",
"14183",
"3181",
"4626",
"889",
"12658",
"5161",
"14200",
"3709",
"14158",
"14042",
"11150",
"13580",
"14580",
"8732",
"11260",
"14735",
"1283",
"11272",
"11908",
"2061",
"11404",
"4617",
"13550",
"13809",
"8359",
"6158",
"7575",
"9478",
"5432",
"8827",
"11061",
"4881",
"4176",
"11262",
"13769",
"15042",
"4544",
"1000",
"12886",
"2937",
"3786",
"5976",
"1706",
"9815",
"4032",
"13969",
"8705",
"1628",
"8007",
"14584",
"9554",
"8638",
"2043",
"8488",
"11773",
"15011",
"14958",
"5812",
"3721",
"13991",
"4941",
"9065",
"1441",
"363",
"6189",
"11995",
"4759",
"7190",
"174",
"12908",
"11424",
"90",
"3770",
"535",
"4100",
"1510",
"4280",
"6381",
"11014",
"5545",
"3959",
"4238",
"3599",
"647",
"10984",
"855",
"11025",
"12059",
"5564",
"12158",
"9984",
"11052",
"2283",
"2808",
"326",
"14985",
"8321",
"1467",
"4201",
"4309",
"1195",
"13095",
"14595",
"4511",
"1169",
"4153",
"3482",
"7874",
"14624",
"4858",
"5803",
"362",
"14804",
"5733",
"13119",
"285",
"4476",
"11281",
"276",
"13450",
"9340",
"11168",
"5028",
"5350",
"7018",
"13896",
"12040",
"1850",
"6457",
"14593",
"14243",
"14206",
"255",
"9559",
"8236",
"2689",
"7263",
"14815",
"3509",
"14125",
"7261",
"4443",
"7807",
"4301",
"14954",
"10747",
"13577",
"6094",
"3830",
"4879",
"2986",
"6701",
"14266",
"13619",
"13989",
"5086",
"13047",
"10035",
"2505",
"13913",
"4212",
"13365",
"10152",
"851",
"10913",
"5302",
"14189",
"627",
"7076",
"12814",
"1317",
"11999",
"12397",
"14164",
"2232",
"7895",
"4646",
"12594",
"9338",
"11261",
"8993",
"3480",
"9146",
"9128",
"1103",
"942",
"14016",
"10467",
"14522",
"1310",
"14763",
"12655",
"2556",
"5003",
"9231",
"14315",
"13694",
"4169",
"13916",
"15087",
"3526",
"3098",
"10732",
"8430",
"4703",
"13241",
"10873",
"10370",
"8954",
"7155",
"3235",
"11295",
"13706",
"6928",
"7780",
"13886",
"6312",
"5315",
"15214",
"12662",
"9454",
"13982",
"752",
"10561",
"3582",
"10576",
"14198",
"201",
"6562",
"8945",
"13860",
"2395",
"567",
"3127",
"5677",
"11325",
"12402",
"3661",
"542",
"6219",
"12479",
"1590",
"10402",
"8983",
"4068",
"10353",
"10708",
"7697",
"9589",
"8922",
"12607",
"4512",
"5338",
"5435",
"3790",
"10696",
"14361",
"7253",
"4674",
"3872",
"10694",
"4742",
"8233",
"5574",
"820",
"5508",
"15085",
"4922",
"868",
"11068",
"2834",
"688",
"15239",
"15331",
"3921",
"14697",
"13620",
"13278",
"9122",
"4675",
"14740",
"14075",
"9711",
"6897",
"3971",
"4644",
"5570",
"9825",
"14718",
"8513",
"3989",
"8714",
"13383",
"13017",
"7245",
"9826",
"1059",
"5547",
"4672",
"12020",
"4095",
"8552",
"12835",
"2623",
"765",
"3835",
"13284",
"1453",
"2935",
"2493",
"9788",
"6075",
"10602",
"3070",
"12687",
"7080",
"4911",
"10348",
"15430",
"14912",
"1292",
"8145",
"1120",
"7042",
"3048",
"1332",
"12986",
"9577",
"5978",
"5772",
"6920",
"14478",
"10122",
"11693",
"5600",
"11631",
"894",
"8649",
"11891",
"3163",
"1720",
"7810",
"2207",
"814",
"11879",
"3442",
"13638",
"6087",
"5260",
"10563",
"1184",
"9611",
"1398",
"7536",
"8754",
"12019",
"15274",
"953",
"8729",
"5441",
"958",
"45",
"709",
"636",
"12243",
"6030",
"11008",
"8124",
"686",
"9472",
"9110",
"8770",
"11608",
"12047",
"4204",
"15308",
"9032",
"3066",
"15206",
"14388",
"3555",
"11727",
"11457",
"11626",
"7486",
"13919",
"284",
"3200",
"5681",
"1348",
"7567",
"8610",
"993",
"10971",
"2151",
"4482",
"6731",
"1449",
"6127",
"5925",
"14712",
"10409",
"4810",
"2354",
"4391",
"11321",
"3862",
"12216",
"7405",
"7724",
"15357",
"13732",
"6197",
"8588",
"1712",
"3708",
"12408",
"4279",
"7342",
"4749",
"15462",
"5715",
"1331",
"236",
"10230",
"5257",
"14516",
"2683",
"14129",
"3996",
"8298",
"11496",
"6261",
"11066",
"11997",
"10512",
"1044",
"4720",
"4958",
"7876",
"6684",
"11729",
"7732",
"7510",
"8894",
"7465",
"5437",
"11118",
"12111",
"9263",
"1311",
"8196",
"6339",
"3722",
"7317",
"373",
"11895",
"12966",
"2864",
"3368",
"657",
"1945",
"856",
"3123",
"10630",
"6801",
"5881",
"10320",
"13422",
"3281",
"6111",
"2939",
"8789",
"14604",
"1946",
"5091",
"15149",
"10525",
"4167",
"6088",
"6365",
"2492",
"13899",
"6587",
"2637",
"2682",
"7403",
"2022",
"2987",
"5669",
"11636",
"5971",
"3215",
"10072",
"15226",
"2555",
"1890",
"8475",
"10868",
"6797",
"14168",
"4778",
"4323",
"13404",
"528",
"10844",
"2863",
"10392",
"3637",
"13644",
"4616",
"5923",
"12235",
"13816",
"416",
"3330",
"12313",
"9784",
"4428",
"1014",
"13453",
"14629",
"193",
"9635",
"5054",
"10190",
"9726",
"13155",
"2627",
"1885",
"11007",
"13848",
"8220",
"7996",
"10022",
"10481",
"2122",
"2916",
"8936",
"11522",
"12370",
"11701",
"7943",
"9629",
"14657",
"7341",
"12471",
"14146",
"9844",
"5751",
"12821",
"10711",
"13216",
"13460",
"10443",
"8148",
"12704",
"5002",
"14689",
"5235",
"13927",
"10049",
"13775",
"2224",
"12448",
"2258",
"10546",
"4642",
"14546",
"14307",
"1165",
"9764",
"11102",
"1036",
"6525",
"5930",
"3307",
"405",
"6868",
"674",
"12781",
"4813",
"5691",
"13849",
"8274",
"1745",
"6063",
"2619",
"14316",
"8633",
"15307",
"8910",
"12535",
"10354",
"9802",
"12279",
"9497",
"6245",
"9339",
"1551",
"4938",
"58",
"4486",
"10276",
"3520",
"4304",
"5851",
"13529",
"13883",
"12636",
"9417",
"13744",
"7003",
"781",
"4768",
"1725",
"12960",
"9175",
"8286",
"534",
"11877",
"3538",
"9141",
"979",
"13476",
"5639",
"5367",
"10835",
"12946",
"5491",
"12443",
"3560",
"11715",
"6523",
"8654",
"1307",
"7506",
"13783",
"6581",
"13570",
"9682",
"3406",
"10433",
"10153",
"10364",
"725",
"2996",
"4979",
"2600",
"2108",
"6205",
"8679",
"12624",
"81",
"8179",
"6022",
"3913",
"14950",
"2145",
"12505",
"6048",
"3517",
"7564",
"6565",
"13629",
"13709",
"9191",
"4583",
"10771",
"1477",
"14446",
"11591",
"1948",
"7053",
"7746",
"2281",
"14938",
"13626",
"13859",
"779",
"11746",
"10161",
"3013",
"14232",
"8373",
"14477",
"2461",
"11859",
"11385",
"12202",
"14487",
"3052",
"10384",
"14212",
"11637",
"4441",
"4784",
"9464",
"8221",
"5224",
"9395",
"3407",
"274",
"12169",
"14808",
"6136",
"13327",
"14182",
"6148",
"11956",
"1517",
"14724",
"640",
"7073",
"13217",
"7235",
"1396",
"6940",
"4422",
"11420",
"7725",
"9343",
"10005",
"6256",
"12015",
"8040",
"3004",
"8912",
"6515",
"14761",
"5700",
"9139",
"15223",
"12403",
"10377",
"12582",
"14838",
"380",
"14598",
"9795",
"10647",
"695",
"12212",
"2387",
"9860",
"8874",
"15052",
"344",
"6229",
"3757",
"7206",
"14597",
"4268",
"10805",
"8052",
"5460",
"4772",
"4426",
"8651",
"13481",
"1064",
"6560",
"642",
"10359",
"14771",
"1352",
"15091",
"6462",
"4125",
"2886",
"2735",
"10301",
"7731",
"1566",
"9271",
"1995",
"777",
"12547",
"11270",
"9927",
"1105",
"8400",
"8585",
"11324",
"2082",
"10135",
"4645",
"4131",
"3768",
"10510",
"12418",
"14418",
"8941",
"15364",
"1549",
"10393",
"388",
"7872",
"14839",
"12633",
"14695",
"4649",
"3255",
"10079",
"4493",
"4259",
"766",
"11575",
"3870",
"13194",
"15391",
"7709",
"9680",
"9442",
"4294",
"7222",
"555",
"7946",
"7140",
"7475",
"6954",
"12249",
"349",
"15499",
"5078",
"3864",
"7973",
"719",
"8858",
"2373",
"10118",
"2883",
"2245",
"10388",
"11868",
"12977",
"13318",
"4862",
"7132",
"13272",
"15313",
"3916",
"1609",
"4353",
"11786",
"2274",
"3838",
"9278",
"6631",
"1663",
"8216",
"1463",
"3433",
"9897",
"4002",
"3455",
"7561",
"3260",
"10357",
"5919",
"12840",
"6364",
"1832",
"11797",
"4447",
"14465",
"12913",
"9142",
"12853",
"6056",
"5089",
"11970",
"7381",
"3784",
"7701",
"1634",
"5418",
"10665",
"14196",
"15465",
"2568",
"10335",
"9453",
"3662",
"11855",
"6904",
"6855",
"13410",
"754",
"13852",
"9656",
"10317",
"649",
"9792",
"10544",
"5063",
"3213",
"15255",
"1067",
"10074",
"4102",
"11640",
"1069",
"1112",
"1695",
"290",
"8457",
"7847",
"2216",
"5371",
"5816",
"4162",
"13702",
"5518",
"10487",
"9211",
"9782",
"4427",
"295",
"11642",
"7844",
"8646",
"7687",
"1008",
"5014",
"9579",
"6887",
"11835",
"13758",
"2527",
"3659",
"12492",
"2408",
"6254",
"10232",
"8252",
"11105",
"4708",
"11869",
"12875",
"13739",
"13905",
"5679",
"2090",
"2858",
"12556",
"4765",
"1426",
"13076",
"14283",
"1400",
"7049",
"2755",
"1280",
"14788",
"6120",
"8210",
"1057",
"13127",
"9088",
"12970",
"14951",
"8487",
"568",
"9885",
"13322",
"18",
"1420",
"8825",
"7988",
"10748",
"5844",
"6958",
"10181",
"12080",
"9474",
"6717",
"3363",
"5656",
"1710",
"14882",
"6605",
"3057",
"12589",
"11013",
"3634",
"12619",
"5633",
"3573",
"5902",
"10216",
"5610",
"4682",
"2449",
"9188",
"4039",
"14750",
"9154",
"5625",
"13038",
"3275",
"397",
"122",
"9631",
"14389",
"6156",
"3817",
"4657",
"4270",
"1020",
"13844",
"10051",
"8539",
"13830",
"10864",
"10047",
"8971",
"13143",
"7252",
"10854",
"12337",
"15095",
"2678",
"2330",
"15015",
"3153",
"14485",
"2165",
"476",
"3277",
"3889",
"6721",
"9117",
"4288",
"10380",
"10643",
"7275",
"7016",
"114",
"4092",
"4824",
"4227",
"4592",
"5994",
"7175",
"5860",
"8957",
"13618",
"10637",
"10981",
"1050",
"7505",
"10059",
"12751",
"8642",
"14989",
"7207",
"1328",
"701",
"4983",
"7700",
"3845",
"6787",
"6767",
"9803",
"10403",
"10369",
"11893",
"13941",
"14113",
"2023",
"13855",
"3908",
"986",
"453",
"11096",
"3878",
"10258",
"5341",
"17",
"5278",
"13402",
"6938",
"2252",
"4258",
"10520",
"4325",
"14355",
"11221",
"15417",
"13642",
"7045",
"7761",
"1792",
"438",
"11005",
"9214",
"2486",
"2169",
"3730",
"11776",
"15464",
"587",
"10590",
"3129",
"7478",
"7791",
"13711",
"1413",
"14452",
"9953",
"15080",
"8155",
"6539",
"6667",
"14120",
"3249",
"13633",
"9424",
"4079",
"5569",
"12926",
"6132",
"11954",
"2807",
"4806",
"11741",
"12414",
"2734",
"1684",
"4161",
"12640",
"8814",
"14767",
"7456",
"3087",
"11178",
"9508",
"2367",
"4604",
"6799",
"666",
"9208",
"5869",
"2017",
"2721",
"4507",
"165",
"9558",
"10073",
"2417",
"9233",
"8044",
"13348",
"5797",
"186",
"15500",
"2411",
"11531",
"11279",
"3358",
"11685",
"7196",
"8219",
"913",
"8453",
"4629",
"9449",
"9650",
"13503",
"12462",
"509",
"7691",
"13158",
"9738",
"6524",
"2573",
"14139",
"10218",
"8728",
"10827",
"1158",
"7055",
"12599",
"8287",
"9718",
"2392",
"12513",
"10490",
"14425",
"8144",
"10",
"4368",
"9166",
"15108",
"2420",
"41",
"15027",
"54",
"14399",
"13554",
"3895",
"7601",
"2630",
"12961",
"9325",
"13968",
"15404",
"6851",
"13731",
"661",
"8284",
"4075",
"2065",
"8719",
"13753",
"15455",
"3102",
"8408",
"12362",
"5112",
"4792",
"15420",
"7302",
"5402",
"1697",
"9796",
"7633",
"8446",
"13079",
"5121",
"7315",
"9298",
"11038",
"10592",
"12228",
"4910",
"5488",
"9458",
"2710",
"9625",
"10887",
"11227",
"11923",
"5617",
"46",
"13767",
"7796",
"12309",
"12232",
"6913",
"10866",
"7502",
"12475",
"7283",
"213",
"3995",
"2779",
"12210",
"1323",
"6710",
"6744",
"9512",
"3193",
"9770",
"8828",
"8695",
"969",
"12295",
"6832",
"10020",
"9346",
"4546",
"9259",
"5345",
"5879",
"5951",
"5245",
"3008",
"6182",
"573",
"12166",
"6059",
"1743",
"6628",
"12382",
"11",
"9562",
"4021",
"12324",
"4782",
"944",
"6644",
"1851",
"8250",
"7778",
"2984",
"8198",
"5325",
"14114",
"14131",
"13321",
"9066",
"13939",
"13275",
"7108",
"10834",
"5631",
"11820",
"102",
"7580",
"1337",
"8668",
"15285",
"12009",
"14105",
"8976",
"4736",
"5775",
"6024",
"7539",
"2041",
"5964",
"3240",
"1180",
"3487",
"7542",
"12667",
"10778",
"12318",
"1509",
"12123",
"12014",
"9441",
"5824",
"11991",
"11499",
"1006",
"7927",
"6258",
"5596",
"11033",
"13812",
"9229",
"2355",
"9500",
"8243",
"8431",
"611",
"30",
"10766",
"1146",
"1901",
"9149",
"1414",
"13429",
"14395",
"8533",
"9685",
"110",
"14751",
"7659",
"2511",
"14843",
"337",
"9176",
"13623",
"12587",
"3999",
"11785",
"5515",
"9431",
"7618",
"11020",
"5899",
"1222",
"8746",
"14227",
"14371",
"5784",
"8660",
"5900",
"14909",
"13723",
"9348",
"9293",
"12700",
"8097",
"893",
"2977",
"2229",
"10555",
"11967",
"13257",
"10382",
"6936",
"2081",
"1664",
"2096",
"15072",
"6810",
"2703",
"11593",
"9115",
"5735",
"8189",
"12375",
"15463",
"6478",
"14544",
"7922",
"9634",
"6746",
"11926",
"3563",
"3485",
"2902",
"6554",
"8149",
"11051",
"2026",
"13630",
"13362",
"95",
"7952",
"11124",
"14055",
"2944",
"4338",
"2575",
"9506",
"13259",
"3541",
"14072",
"12259",
"12192",
"11286",
"862",
"2707",
"5061",
"1028",
"11182",
"4045",
"2936",
"13361",
"14162",
"1265",
"12661",
"10990",
"11058",
"5445",
"4707",
"2244",
"6020",
"8570",
"11241",
"4650",
"4822",
"14011",
"4733",
"268",
"7082",
"11160",
"6830",
"3146",
"9905",
"14646",
"4337",
"8002",
"1035",
"7094",
"12125",
"4773",
"9527",
"10997",
"12138",
"1027",
"8345",
"3154",
"6915",
"196",
"8568",
"4853",
"5299",
"12296",
"12533",
"7543",
"8895",
"6177",
"3803",
"1270",
"1365",
"13209",
"15038",
"1833",
"1338",
"11667",
"14824",
"9948",
"14264",
"14963",
"1301",
"1304",
"6371",
"11449",
"13955",
"7553",
"1429",
"4803",
"12188",
"9522",
"1021",
"4628",
"11108",
"2726",
"1522",
"1150",
"9468",
"7690",
"13797",
"12160",
"5859",
"11698",
"9143",
"1707",
"13021",
"10332",
"10387",
"3390",
"4283",
"2027",
"9466",
"14693",
"14096",
"3014",
"378",
"1070",
"9236",
"7858",
"429",
"1266",
"82",
"7605",
"4939",
"4917",
"15250",
"10037",
"6536",
"7057",
"2390",
"6454",
"275",
"3164",
"8562",
"5579",
"1408",
"9494",
"8514",
"3937",
"359",
"3731",
"5263",
"2731",
"8863",
"896",
"374",
"11189",
"1873",
"6748",
"2865",
"11753",
"8956",
"4562",
"2464",
"214",
"14679",
"15114",
"12603",
"1134",
"7129",
"9507",
"13282",
"4329",
"11886",
"13649",
"5587",
"10837",
"5966",
"6914",
"15234",
"3677",
"9571",
"5124",
"8364",
"7776",
"5170",
"678",
"12644",
"4334",
"1920",
"1296",
"11095",
"12869",
"9990",
"9593",
"5789",
"8011",
"8710",
"10605",
"10372",
"2196",
"4596",
"11980",
"1952",
"5875",
"3981",
"2688",
"7150",
"13065",
"726",
"10197",
"9419",
"3243",
"7066",
"11479",
"7223",
"3939",
"1881",
"13409",
"10010",
"4763",
"1914",
"11483",
"8176",
"4254",
"926",
"11363",
"1031",
"9892",
"13819",
"1002",
"12601",
"5555",
"9525",
"3544",
"7034",
"11858",
"2890",
"10131",
"692",
"5955",
"9286",
"3025",
"8450",
"12538",
"10315",
"12314",
"12399",
"7171",
"13223",
"2648",
"4485",
"8485",
"5771",
"2338",
"7388",
"8786",
"12598",
"9837",
"10422",
"10645",
"10499",
"4750",
"6060",
"5519",
"14822",
"10466",
"4284",
"14157",
"2561",
"10144",
"10264",
"10620",
"6356",
"1436",
"5386",
"5075",
"8273",
"2199",
"11712",
"14591",
"253",
"6720",
"4791",
"14828",
"12085",
"14154",
"10012",
"7560",
"967",
"11837",
"756",
"11532",
"4286",
"12810",
"8395",
"10880",
"5908",
"1765",
"4964",
"164",
"4492",
"3092",
"2950",
"9321",
"393",
"12609",
"2615",
"11519",
"11988",
"1278",
"13970",
"4263",
"3829",
"13417",
"1376",
"11371",
"11506",
"414",
"8964",
"4369",
"117",
"11500",
"12771",
"7173",
"317",
"3778",
"4423",
"7100",
"12774",
"12989",
"14292",
"1636",
"3161",
"7413",
"4272",
"8578",
"5486",
"14147",
"12007",
"404",
"11538",
"1661",
"12106",
"14879",
"1062",
"8867",
"12803",
"10781",
"7634",
"4047",
"3141",
"4458",
"10468",
"9455",
"11143",
"10171",
"13608",
"7350",
"14151",
"10848",
"4264",
"2583",
"5281",
"10318",
"7570",
"10670",
"13737",
"2948",
"6953",
"9073",
"8644",
"14137",
"12787",
"12384",
"13999",
"11657",
"13933",
"11229",
"13117",
"2416",
"14760",
"11901",
"6972",
"7546",
"9702",
"7325",
"4344",
"7770",
"4704",
"9299",
"2394",
"11653",
"7537",
"987",
"707",
"1091",
"6617",
"4277",
"14237",
"4947",
"4891",
"9768",
"14119",
"3143",
"1965",
"4220",
"5791",
"4690",
"745",
"11916",
"7061",
"9172",
"13685",
"15090",
"7596",
"4767",
"3034",
"11414",
"15277",
"4633",
"8721",
"8813",
"2893",
"4406",
"11517",
"8292",
"227",
"36",
"384",
"1374",
"3740",
"12553",
"2123",
"7225",
"12380",
"5963",
"12906",
"4563",
"5516",
"5153",
"3091",
"3182",
"3567",
"3827",
"15227",
"1306",
"5777",
"2746",
"1015",
"12688",
"4038",
"13428",
"407",
"3572",
"5114",
"12698",
"2674",
"6607",
"11444",
"3432",
"2491",
"11850",
"6483",
"6795",
"4833",
"11795",
"1937",
"5322",
"14220",
"7707",
"13346",
"667",
"8889",
"14445",
"8036",
"1569",
"12681",
"4293",
"6823",
"11475",
"13542",
"2750",
"10528",
"2319",
"559",
"5138",
"6226",
"8934",
"2789",
"4367",
"14284",
"12006",
"14401",
"4067",
"7240",
"10488",
"7533",
"9152",
"2601",
"6864",
"10260",
"5509",
"2665",
"11131",
"2308",
"10498",
"9136",
"280",
"11556",
"10967",
"14247",
"11708",
"3410",
"11678",
"4726",
"7074",
"11129",
"10589",
"1110",
"526",
"5521",
"1910",
"10492",
"2923",
"5275",
"1136",
"9064",
"1424",
"14817",
"3223",
"3321",
"4071",
"9144",
"9023",
"2092",
"7300",
"9541",
"11103",
"11011",
"2818",
"3998",
"9901",
"519",
"2749",
"7484",
"2020",
"5497",
"10966",
"12637",
"8978",
"10065",
"10803",
"7352",
"4656",
"4697",
"9766",
"11560",
"11515",
"9556",
"9314",
"8950",
"14442",
"2591",
"9666",
"8859",
"2415",
"6656",
"11782",
"11254",
"544",
"7083",
"12927",
"15190",
"12838",
"3928",
"4488",
"4296",
"2292",
"10149",
"8073",
"2675",
"1061",
"11383",
"15265",
"9524",
"10105",
"5151",
"14925",
"9535",
"14848",
"11641",
"8545",
"2509",
"2539",
"11658",
"6727",
"15478",
"7718",
"119",
"3793",
"9285",
"3600",
"9643",
"6531",
"1722",
"8211",
"12780",
"4509",
"10509",
"676",
"14525",
"4472",
"1406",
"9557",
"11951",
"305",
"3918",
"5550",
"2862",
"10225",
"12572",
"4830",
"1422",
"14741",
"2952",
"3269",
"5621",
"5487",
"9540",
"4798",
"11684",
"10920",
"15350",
"5653",
"1171",
"14127",
"4689",
"3735",
"15171",
"9985",
"8253",
"3886",
"9481",
"12929",
"8507",
"5752",
"219",
"536",
"2947",
"2374",
"4027",
"14972",
"10650",
"70",
"1682",
"4150",
"5020",
"9662",
"1029",
"13602",
"13682",
"7138",
"1161",
"13774",
"11824",
"15164",
"10429",
"1603",
"11275",
"6382",
"10363",
"3036",
"4756",
"297",
"7391",
"684",
"14821",
"11965",
"9606",
"10184",
"1125",
"9961",
"12736",
"4012",
"13856",
"8741",
"9843",
"11079",
"12701",
"12214",
"8555",
"3900",
"14928",
"11571",
"2625",
"4457",
"6125",
"1958",
"462",
"7685",
"13313",
"1163",
"15427",
"2462",
"12657",
"13027",
"1982",
"12485",
"3777",
"5841",
"9376",
"11314",
"5761",
"7909",
"1542",
"908",
"14348",
"9170",
"3237",
"7191",
"10507",
"6729",
"14558",
"9181",
"7158",
"5541",
"11767",
"8885",
"4058",
"11521",
"9183",
"717",
"11320",
"266",
"14111",
"4174",
"7321",
"3986",
"10389",
"12463",
"8551",
"4809",
"12315",
"1990",
"9877",
"2180",
"13235",
"4",
"4096",
"3473",
"325",
"2690",
"15016",
"1316",
"3001",
"3901",
"3310",
"1970",
"1327",
"7551",
"2166",
"15132",
"3966",
"7589",
"8761",
"13571",
"3564",
"10639",
"1605",
"318",
"512",
"14417",
"10212",
"10019",
"1944",
"14862",
"1516",
"10032",
"10475",
"5294",
"4807",
"15460",
"1795",
"10690",
"4680",
"4534",
"14531",
"3100",
"11852",
"982",
"12359",
"5494",
"9670",
"5757",
"11961",
"12090",
"12482",
"11686",
"5405",
"2982",
"14596",
"10093",
"1235",
"1122",
"2910",
"9536",
"11423",
"360",
"2985",
"6248",
"10296",
"1072",
"13098",
"8008",
"1405",
"2887",
"4020",
"2795",
"13288",
"1767",
"9599",
"8201",
"895",
"11973",
"3346",
"7647",
"8913",
"11784",
"10390",
"9280",
"14288",
"12262",
"9428",
"3799",
"4155",
"4312",
"750",
"6031",
"9834",
"11310",
"1291",
"12340",
"1508",
"12731",
"7091",
"4622",
"2838",
"8861",
"3019",
"11920",
"6342",
"2742",
"2574",
"14770",
"3813",
"9041",
"4571",
"3553",
"4702",
"11986",
"15126",
"1330",
"10214",
"13011",
"8925",
"12544",
"2237",
"4717",
"12916",
"7886",
"4820",
"13200",
"1315",
"1312",
"7366",
"8494",
"5972",
"9303",
"12449",
"15473",
"622",
"6705",
"11805",
"12434",
"11766",
"8330",
"3924",
"1427",
"3412",
"1102",
"5492",
"5905",
"12154",
"15235",
"1367",
"3172",
"1256",
"5105",
"2353",
"10262",
"1216",
"6135",
"548",
"2993",
"12091",
"4608",
"10137",
"9620",
"4790",
"13141",
"7260",
"4712",
"1854",
"2442",
"12140",
"598",
"12962",
"4328",
"9410",
"11702",
"3094",
"8529",
"1140",
"4339",
"4721",
"11867",
"2934",
"1719",
"633",
"6973",
"15076",
"10663",
"15141",
"15056",
"4606",
"2071",
"3023",
"1727",
"9617",
"4433",
"12588",
"2536",
"10632",
"15461",
"12245",
"7441",
"1674",
"11134",
"3408",
"7063",
"4631",
"10351",
"10001",
"11085",
"3450",
"134",
"12429",
"9329",
"978",
"9187",
"10367",
"5364",
"1114",
"256",
"5180",
"983",
"3056",
"669",
"5369",
"15474",
"14101",
"4991",
"14865",
"2120",
"4554",
"5077",
"1246",
"3691",
"3592",
"1151",
"5071",
"3073",
"13107",
"15167",
"581",
"13071",
"10496",
"5699",
"10912",
"15337",
"9476",
"9179",
"14426",
"2263",
"11670",
"6124",
"15283",
"10395",
"4343",
"4841",
"5131",
"3030",
"1971",
"2265",
"1269",
"10649",
"13652",
"11299",
"7250",
"8403",
"3514",
"4676",
"11445",
"11458",
"2852",
"15451",
"7203",
"888",
"7880",
"8579",
"590",
"5292",
"600",
"7310",
"13452",
"5026",
"324",
"14807",
"7112",
"11126",
"10904",
"10627",
"11590",
"9235",
"4893",
"12456",
"6433",
"2448",
"4118",
"12045",
"2068",
"4076",
"703",
"2127",
"1960",
"1694",
"8381",
"13944",
"7406",
"3180",
"13087",
"489",
"15039",
"11516",
"8765",
"5765",
"4515",
"8382",
"11100",
"14872",
"4514",
"10337",
"6957",
"1903",
"13145",
"11127",
"6533",
"2723",
"10621",
"12152",
"1589",
"386",
"1626",
"3890",
"4698",
"14376",
"14152",
"3511",
"1060",
"12730",
"3548",
"4207",
"14776",
"12425",
"399",
"4780",
"11645",
"2184",
"12794",
"5506",
"14245",
"1895",
"1457",
"11284",
"3695",
"3985",
"4679",
"617",
"2019",
"1476",
"7267",
"2724",
"15160",
"3244",
"14603",
"3771",
"9316",
"3652",
"11761",
"2160",
"10583",
"12322",
"12845",
"12288",
"14783",
"7264",
"2205",
"10970",
"8525",
"11004",
"12664",
"15178",
"6089",
"4273",
"15492",
"13869",
"687",
"9495",
"2882",
"8926",
"12626",
"7288",
"8489",
"10437",
"14937",
"5125",
"2069",
"14387",
"5528",
"9849",
"9063",
"7141",
"7964",
"10757",
"12973",
"12466",
"9763",
"9173",
"5738",
"1113",
"8502",
"975",
"12316",
"10500",
"3656",
"3877",
"9601",
"2034",
"8248",
"11006",
"3488",
"14818",
"13086",
"2652",
"9694",
"5495",
"6138",
"12648",
"2775",
"8385",
"5242",
"10838",
"14702",
"853",
"1597",
"3088",
"3195",
"4727",
"4159",
"4170",
"11390",
"4972",
"13010",
"12742",
"14503",
"3638",
"7104",
"5750",
"12712",
"5524",
"3586",
"3449",
"3540",
"11247",
"665",
"2289",
"7663",
"14448",
"1016",
"14102",
"4421",
"2518",
"348",
"10400",
"4685",
"4080",
"5865",
"4553",
"14749",
"14605",
"9502",
"3973",
"5568",
"3189",
"370",
"1298",
"6010",
"9104",
"10863",
"2018",
"14736",
"3612",
"1375",
"11604",
"1373",
"15270",
"1167",
"8183",
"5374",
"907",
"7224",
"12070",
"13678",
"5998",
"2270",
"778",
"5501",
"12765",
"14142",
"6122",
"5855",
"1100",
"3403",
"5872",
"11059",
"8641",
"14880",
"4711",
"3767",
"2569",
"9279",
"10368",
"5076",
"4796",
"5368",
"4479",
"13337",
"202",
"4310",
"1175",
"13009",
"14325",
"1101",
"8963",
"10503",
"12374",
"1253",
"6476",
"12898",
"335",
"14623",
"9594",
"10325",
"2658",
"7056",
"2487",
"1308",
"8599",
"1033",
"885",
"1838",
"4817",
"188",
"8788",
"8013",
"9398",
"15326",
"13838",
"14061",
"6037",
"15177",
"9055",
"4710",
"11280",
"850",
"13023",
"3404",
"406",
"8819",
"9582",
"12671",
"9049",
"2185",
"15475",
"1295",
"75",
"14100",
"7469",
"10084",
"12284",
"2285",
"12585",
"14769",
"7142",
"10635",
"8881",
"10762",
"4306",
"12230",
"5499",
"12453",
"9436",
"12674",
"6011",
"10394",
"14073",
"14023",
"10444",
"1242",
"4502",
"992",
"11690",
"6642",
"3796",
"4549",
"11055",
"5850",
"5556",
"6937",
"9707",
"387",
"917",
"2423",
"15031",
"4996",
"11758",
"15189",
"13707",
"1159",
"195",
"3210",
"11043",
"12739",
"9492",
"11732",
"1172",
"7215",
"981",
"9632",
"327",
"13221",
"10071",
"645",
"9217",
"12767",
"4331",
"14923",
"1279",
"11354",
"13418",
"2206",
"13894",
"10754",
"6657",
"9668",
"9145",
"7149",
"4109",
"15484",
"12043",
"14543",
"4412",
"15471",
"10999",
"4190",
"166",
"1019",
"10707",
"13688",
"11340",
"4732",
"9644",
"244",
"5544",
"710",
"13760",
"3857",
"7717",
"14757",
"12464",
"11709",
"2544",
"9163",
"10026",
"9471",
"4475",
"4564",
"1098",
"10998",
"4269",
"2660",
"8699",
"10159",
"4705",
"9488",
"6550",
"6896",
"3839",
"9192",
"2592",
"9604",
"450",
"976",
"4808",
"4069",
"11042",
"4461",
"9193",
"1363",
"8968",
"900",
"13396",
"4221",
"47",
"12053",
"1757",
"4787",
"4487",
"4662",
"14300",
"8375",
"5144",
"14171",
"1638",
"7429",
"12621",
"5288",
"11831",
"1956",
"11071",
"1104",
"12928",
"14861",
"1968",
"9027",
"4781",
"14936",
"693",
"10126",
"2842",
"2111",
"1344",
"14790",
"3156",
"13825",
"586",
"7648",
"9778",
"11439",
"10571",
"51",
"3764",
"3227",
"12975",
"5141",
"12622",
"5835",
"1654",
"4440",
"11035",
"8576",
"4203",
"4494",
"8012",
"4590",
"10631",
"29",
"8351",
"8875",
"423",
"6831",
"9123",
"13520",
"10763",
"9895",
"6570",
"3903",
"7293",
"8436",
"10606",
"10742",
"7777",
"14642",
"1592",
"1826",
"11618",
"1211",
"3458",
"4322",
"5785",
"4083",
"2076",
"13903",
"10608",
"8161",
"718",
"11898",
"15098",
"1492",
"12126",
"157",
"8779",
"2045",
"5954",
"11379",
"9722",
"3647",
"112",
"15",
"278",
"807",
"12332",
"9114",
"3683",
"4527",
"3760",
"7748",
"14564",
"8456",
"6686",
"11114",
"4524",
"7384",
"128",
"96",
"9574",
"11525",
"13966",
"492",
"3364",
"4745",
"2124",
"11480",
"12523",
"8621",
"2220",
"13425",
"8110",
"9993",
"4940",
"4867",
"7481",
"9367",
"14301",
"14562",
"6733",
"2722",
"1608",
"3300",
"7527",
"1773",
"12151",
"6979",
"2640",
"631",
"11536",
"7646",
"882",
"9799",
"3688",
"1717",
"3766",
"11985",
"6321",
"828",
"2498",
"10521",
"8523",
"7829",
"10855",
"2901",
"6425",
"1209",
"552",
"8257",
"11614",
"7246",
"7103",
"5535",
"7765",
"11927",
"2254",
"2370",
"8709",
"7216",
"5500",
"5558",
"8170",
"3840",
"616",
"2737",
"7873",
"3810",
"4256",
"10865",
"7786",
"2481",
"14819",
"6142",
"3932",
"4275",
"10720",
"12057",
"4474",
"11621",
"2040",
"6883",
"13488",
"7867",
"5660",
"1913",
"7986",
"14834",
"15157",
"10087",
"4209",
"11880",
"10987",
"11212",
"13094",
"14272",
"14952",
"2730",
"11885",
"3772",
"9137",
"3729",
"8618",
"689",
"11395",
"6411",
"8716",
"3464",
"11210",
"4609",
"13384",
"14690",
"8766",
"13201",
"6492",
"14686",
"9971",
"3355",
"2696",
"9930",
"9265",
"14460",
"14510",
"4106",
"3622",
"13878",
"3668",
"7148",
"11628",
"6514",
"936",
"4498",
"4244",
"106",
"9252",
"8526",
"1624",
"14708",
"10284",
"9699",
"12847",
"12130",
"12987",
"3273",
"15120",
"8458",
"8444",
"10775",
"2262",
"4189",
"4931",
"13285",
"6546",
"9387",
"10024",
"8793",
"4843",
"32",
"3395",
"1878",
"1582",
"180",
"4138",
"10427",
"12793",
"11293",
"12790",
"10025",
"6166",
"457",
"14408",
"7870",
"10416",
"10802",
"15315",
"3946",
"10638",
"345",
"12016",
"13605",
"1545",
"15445",
"11541",
"10127",
"9904",
"1544",
"4194",
"11728",
"829",
"357",
"2007",
"7840",
"15071",
"8712",
"14268",
"1586",
"9649",
"11648",
"12452",
"8900",
"6660",
"14634",
"5469",
"6459",
"4219",
"2577",
"9919",
"2638",
"11267",
"13713",
"339",
"11252",
"3628",
"93",
"7166",
"3256",
"7630",
"11912",
"10666",
"10449",
"3834",
"1793",
"6557",
"6175",
"3629",
"372",
"4168",
"1980",
"13134",
"7664",
"8449",
"10611",
"4821",
"14406",
"4185",
"9097",
"11097",
"8405",
"8238",
"11450",
"4978",
"11302",
"10293",
"12625",
"694",
"5029",
"10534",
"1860",
"7901",
"4832",
"15370",
"3478",
"11764",
"13085",
"10132",
"6458",
"808",
"10474",
"5221",
"5476",
"10954",
"6160",
"11319",
"2855",
"3906",
"10989",
"11476",
"1742",
"3398",
"1458",
"4265",
"3173",
"13413",
"15258",
"8042",
"11050",
"21",
"10346",
"4977",
"8544",
"2523",
"11044",
"3484",
"5414",
"9548",
"5588",
"12879",
"9551",
"13803",
"3045",
"10907",
"15209",
"7038",
"13276",
"3148",
"6349",
"5046",
"2058",
"12713",
"6930",
"13634",
"6668",
"3779",
"7107",
"376",
"12328",
"9687",
"14717",
"2918",
"11566",
"13173",
"12254",
"14836",
"3347",
"5178",
"12498",
"14285",
"15269",
"4528",
"221",
"14801",
"11288",
"7900",
"845",
"7176",
"15363",
"1931",
"6636",
"3268",
"10522",
"3749",
"1639",
"5559",
"7881",
"672",
"3618",
"10270",
"9856",
"11144",
"11021",
"13104",
"7412",
"9846",
"12209",
"6028",
"6102",
"9036",
"4557",
"6527",
"3280",
"367",
"3323",
"9565",
"2109",
"6400",
"7877",
"2032",
"8068",
"11936",
"1981",
"5237",
"2889",
"3936",
"7756",
"1846",
"13908",
"6367",
"77",
"6435",
"12922",
"9232",
"2771",
"7036",
"12694",
"6083",
"7501",
"7888",
"10439",
"2296",
"3196",
"7355",
"1711",
"11222",
"2107",
"546",
"13398",
"10194",
"8358",
"6576",
"13594",
"11177",
"13853",
"3610",
"15409",
"2976",
"1554",
"12478",
"3503",
"330",
"9085",
"6975",
"13613",
"105",
"10925",
"10033",
"1437",
"5240",
"9014",
"14194",
"10603",
"5134",
"13880",
"8311",
"4465",
"11928",
"1686",
"14043",
"4599",
"8335",
"5142",
"7368",
"7742",
"11141",
"5482",
"3758",
"14722",
"7644",
"9616",
"342",
"13832",
"8969",
"15232",
"8339",
"6416",
"9309",
"6563",
"3727",
"10714",
"4236",
"11681",
"10016",
"8038",
"6101",
"12072",
"13946",
"12077",
"8290",
"1478",
"10862",
"57",
"1780",
"12527",
"14948",
"673",
"6882",
"10081",
"15380",
"3385",
"3378",
"12287",
"6448",
"1859",
"1314",
"10769",
"13080",
"13051",
"3917",
"11919",
"13957",
"14930",
"5747",
"10626",
"6724",
"1081",
"2785",
"1645",
"14554",
"12470",
"6931",
"13045",
"9503",
"8028",
"11296",
"10599",
"3229",
"13273",
"15035",
"6119",
"14960",
"6099",
"9640",
"14474",
"9101",
"14020",
"8134",
"8246",
"1523",
"4769",
"7125",
"13643",
"10946",
"8325",
"14368",
"13754",
"12786",
"3467",
"1662",
"2714",
"7970",
"13227",
"5449",
"4255",
"5990",
"11705",
"7599",
"11356",
"5817",
"6223",
"11153",
"4953",
"11308",
"6288",
"2758",
"822",
"3670",
"9203",
"12755",
"2803",
"13123",
"2514",
"50",
"4413",
"14267",
"4575",
"11125",
"12918",
"4962",
"5101",
"10133",
"10489",
"14959",
"12990",
"4945",
"4591",
"13067",
"4124",
"12422",
"4347",
"15088",
"5192",
"8418",
"2552",
"875",
"1835",
"5212",
"10139",
"2039",
"2759",
"9267",
"9228",
"2687",
"5696",
"303",
"15106",
"5589",
"5241",
"4594",
"10895",
"4870",
"28",
"1177",
"13166",
"12468",
"515",
"8054",
"10405",
"2668",
"1025",
"9697",
"11857",
"1983",
"12145",
"8887",
"3783",
"2269",
"6348",
"10591",
"13976",
"6608",
"946",
"5736",
"13802",
"9034",
"1601",
"2876",
"10513",
"8461",
"10430",
"8337",
"3288",
"11849",
"7490",
"7713",
"8838",
"11137",
"11874",
"732",
"2500",
"4892",
"7180",
"5573",
"9095",
"1558",
"9225",
"12051",
"2174",
"2256",
"3991",
"9654",
"4587",
"15259",
"8571",
"522",
"2358",
"2001",
"14468",
"933",
"4829",
"12299",
"5181",
"10229",
"11544",
"12306",
"322",
"879",
"13129",
"11888",
"9546",
"13805",
"4869",
"473",
"8598",
"15319",
"6903",
"7417",
"11696",
"11111",
"9609",
"4355",
"9459",
"6614",
"2626",
"6387",
"13470",
"13442",
"8419",
"15044",
"1518",
"8553",
"13833",
"10658",
"10846",
"14104",
"11798",
"6324",
"8849",
"7156",
"13748",
"4469",
"8989",
"2520",
"5644",
"7962",
"11875",
"11443",
"8139",
"130",
"15224",
"9220",
"1978",
"5534",
"8482",
"6956",
"10919",
"11374",
"12653",
"14437",
"13610",
"11803",
"11747",
"8032",
"7884",
"282",
"3602",
"12768",
"12339",
"9739",
"11347",
"7992",
"7404",
"12178",
"7443",
"14567",
"3611",
"7271",
"2456",
"2526",
"14393",
"4317",
"8938",
"2913",
"6664",
"12773",
"8347",
"7528",
"5987",
"14727",
"2085",
"9823",
"9384",
"5425",
"6841",
"4403",
"7784",
"2894",
"12957",
"333",
"11234",
"1302",
"1652",
"8666",
"5629",
"964",
"12416",
"5468",
"10195",
"6057",
"11962",
"13393",
"5072",
"14978",
"9908",
"7769",
"1621",
"2177",
"6734",
"11164",
"8387",
"5286",
"15286",
"3761",
"746",
"7463",
"7674",
"8064",
"3457",
"9276",
"6737",
"4062",
"2153",
"3319",
"14534",
"11572",
"9537",
"5440",
"10036",
"1687",
"2995",
"8070",
"11946",
"6475",
"6545",
"1093",
"8763",
"8363",
"9737",
"11870",
"6091",
"9484",
"12822",
"7625",
"8796",
"5489",
"7613",
"2542",
"3328",
"12162",
"7040",
"4667",
"668",
"4827",
"1505",
"11737",
"7468",
"12083",
"13206",
"9202",
"8691",
"5913",
"6231",
"8977",
"6112",
"9273",
"8501",
"8479",
"5096",
"6140",
"1047",
"4313",
"6827",
"9709",
"14241",
"115",
"3205",
"9874",
"648",
"15338",
"3453",
"10045",
"2972",
"3028",
"11616",
"4513",
"13907",
"62",
"2183",
"12049",
"4303",
"7588",
"7255",
"13636",
"12590",
"6966",
"12804",
"12725",
"10609",
"13103",
"10141",
"270",
"10175",
"5191",
"13851",
"804",
"9351",
"10939",
"13447",
"13315",
"2524",
"14746",
"1660",
"4950",
"8317",
"3627",
"11720",
"6262",
"15134",
"3438",
"8929",
"14886",
"10236",
"8809",
"6015",
"9025",
"10040",
"14047",
"11269",
"8794",
"10793",
"2211",
"470",
"1677",
"8173",
"12656",
"10584",
"8672",
"13873",
"13515",
"2765",
"2784",
"10683",
"5946",
"2438",
"4677",
"3666",
"2474",
"15333",
"1318",
"4627",
"7358",
"5800",
"13121",
"10921",
"8053",
"11372",
"1928",
"14192",
"13511",
"6895",
"129",
"15359",
"12102",
"11375",
"2328",
"10717",
"8951",
"7878",
"379",
"8711",
"10875",
"12696",
"251",
"11677",
"3980",
"10542",
"9433",
"4085",
"10731",
"5019",
"14895",
"14304",
"162",
"2756",
"1728",
"14968",
"6405",
"8091",
"7727",
"7623",
"7327",
"1894",
"11413",
"3640",
"9342",
"14404",
"14331",
"8751",
"2072",
"8067",
"11802",
"39",
"11081",
"11545",
"1504",
"11394",
"12849",
"805",
"1776",
"5732",
"2763",
"9020",
"3343",
"6561",
"3486",
"13884",
"14103",
"14766",
"2259",
"14098",
"5611",
"511",
"3949",
"9929",
"12663",
"2661",
"7394",
"2820",
"11934",
"14489",
"9526",
"12850",
"6497",
"2528",
"5383",
"15081",
"7422",
"4285",
"2980",
"6815",
"3795",
"9612",
"2036",
"2633",
"786",
"11459",
"4865",
"4777",
"2548",
"7699",
"3147",
"7192",
"4928",
"9451",
"7398",
"2701",
"8432",
"4921",
"9201",
"6259",
"13836",
"3820",
"7672",
"1233",
"854",
"15143",
"4405",
"13292",
"10062",
"12143",
"4229",
"15112",
"12534",
"11738",
"14835",
"8138",
"2480",
"10678",
"15212",
"11910",
"1054",
"13986",
"13762",
"12564",
"2439",
"8942",
"3962",
"10772",
"8662",
"3581",
"6700",
"14083",
"1257",
"6252",
"411",
"7833",
"15316",
"9180",
"194",
"7695",
"5651",
"6390",
"12191",
"9086",
"13616",
"5087",
"153",
"3217",
"13522",
"15020",
"12438",
"12171",
"6144",
"9295",
"971",
"10038",
"14572",
"8094",
"14458",
"13382",
"6662",
"6335",
"9978",
"8424",
"9369",
"1231",
"4995",
"675",
"15144",
"7851",
"6833",
"4510",
"1230",
"9613",
"8095",
"12938",
"2516",
"7554",
"13256",
"6104",
"10121",
"4425",
"11170",
"15096",
"9475",
"10472",
"8722",
"6230",
"1726",
"2323",
"15369",
"10780",
"8731",
"12934",
"14364",
"1189",
"3125",
"4223",
"3400",
"14551",
"13599",
"10911",
"7265",
"1051",
"11416",
"5597",
"1783",
"175",
"13228",
"801",
"12440",
"553",
"14126",
"14094",
"11188",
"15089",
"14975",
"2905",
"11577",
"7179",
"11418",
"10914",
"1527",
"8088",
"1238",
"2912",
"3111",
"4992",
"12923",
"12777",
"5339",
"4927",
"12196",
"6092",
"5689",
"2",
"11669",
"13977",
"7226",
"14385",
"4989",
"12118",
"7137",
"6858",
"12087",
"6185",
"7626",
"8356",
"12410",
"1998",
"12078",
"1769",
"4445",
"5166",
"11957",
"7127",
"11819",
"4300",
"4349",
"13325",
"12349",
"11216",
"241",
"5857",
"6814",
"12068",
"3067",
"1855",
"4883",
"9491",
"13932",
"14431",
"11271",
"1267",
"14813",
"5953",
"2264",
"461",
"9004",
"14594",
"7476",
"3035",
"11339",
"10255",
"7323",
"5483",
"4402",
"15332",
"14010",
"3801",
"9578",
"13677",
"5340",
"9219",
"6983",
"2425",
"7420",
"1106",
"4993",
"11902",
"10700",
"10463",
"1370",
"3167",
"6212",
"13593",
"3617",
"10689",
"13397",
"13097",
"2201",
"7022",
"10527",
"12248",
"5384",
"2739",
"15294",
"10577",
"11470",
"11048",
"13044",
"10120",
"10572",
"12620",
"5717",
"11389",
"3062",
"13407",
"3894",
"3681",
"9103",
"9421",
"2475",
"6452",
"5995",
"700",
"3084",
"7513",
"14981",
"8465",
"12737",
"434",
"16",
"14339",
"12912",
"6850",
"14088",
"13536",
"12391",
"5466",
"8556",
"3245",
"4434",
"234",
"12432",
"1615",
"14924",
"2685",
"15013",
"12093",
"13997",
"8275",
"5247",
"7163",
"5566",
"4136",
"5109",
"6682",
"5285",
"12405",
"1828",
"14428",
"9933",
"7452",
"2168",
"6852",
"2966",
"5632",
"1817",
"7976",
"14635",
"12826",
"13329",
"12054",
"14077",
"14279",
"3332",
"4471",
"6467",
"9857",
"5845",
"13646",
"8724",
"2767",
"13994",
"5481",
"3566",
"11656",
"8985",
"6351",
"3969",
"9839",
"13140",
"3427",
"14108",
"4099",
"9277",
"7201",
"12294",
"796",
"10823",
"1085",
"11762",
"776",
"12614",
"9357",
"1349",
"10265",
"13349",
"8803",
"2405",
"15327",
"5665",
"3136",
"12574",
"1688",
"11530",
"1866",
"4632",
"8061",
"4107",
"8852",
"6638",
"12659",
"11830",
"11198",
"187",
"13441",
"6872",
"2315",
"15165",
"4295",
"784",
"1012",
"136",
"4437",
"12889",
"5047",
"1282",
"5650",
"10852",
"12317",
"5737",
"9162",
"13132",
"4122",
"2605",
"5981",
"8883",
"11511",
"9256",
"14378",
"5931",
"1576",
"11679",
"3781",
"14038",
"3241",
"11447",
"1455",
"7609",
"6888",
"11929",
"1770",
"6843",
"12055",
"8300",
"9077",
"624",
"9026",
"8744",
"570",
"3899",
"12770",
"5319",
"937",
"8331",
"6779",
"663",
"15176",
"5513",
"6725",
"7865",
"6926",
"12240",
"7557",
"12222",
"9783",
"2788",
"7757",
"155",
"466",
"506",
"2430",
"14579",
"3690",
"14402",
"8278",
"4082",
"12575",
"6007",
"13204",
"9288",
"5027",
"13485",
"8194",
"12880",
"11434",
"5815",
"13096",
"10899",
"7230",
"8242",
"13781",
"2970",
"7069",
"13716",
"4695",
"14611",
"3993",
"11711",
"422",
"2781",
"12717",
"2050",
"10812",
"3716",
"1974",
"8025",
"1579",
"12788",
"7426",
"3377",
"12855",
"11812",
"5728",
"11937",
"15390",
"9648",
"13443",
"4811",
"815",
"3833",
"5956",
"11984",
"2300",
"11856",
"9443",
"12963",
"9090",
"4135",
"12435",
"1419",
"6942",
"3142",
"3580",
"10676",
"2437",
"1305",
"1640",
"1714",
"4377",
"6679",
"14253",
"9262",
"13477",
"8876",
"10434",
"7146",
"15008",
"139",
"7693",
"2736",
"14915",
"13954",
"2560",
"14630",
"2712",
"3422",
"334",
"12707",
"13526",
"12647",
"1573",
"8524",
"13480",
"11783",
"3242",
"4420",
"10461",
"5450",
"12238",
"1255",
"7978",
"2278",
"6772",
"3118",
"4876",
"2718",
"15058",
"10412",
"2563",
"3312",
"6535",
"12334",
"12643",
"11186",
"708",
"4816",
"9226",
"5593",
"1082",
"7571",
"6547",
"5021",
"13742",
"3392",
"14412",
"9361",
"9401",
"995",
"167",
"11551",
"15105",
"8251",
"13576",
"6715",
"10688",
"10646",
"8092",
"2248",
"3751",
"5118",
"13788",
"9870",
"7758",
"15426",
"9391",
"14166",
"14520",
"15229",
"5184",
"9485",
"7918",
"9935",
"9199",
"1215",
"1678",
"12013",
"15383",
"9204",
"771",
"9600",
"13975",
"9071",
"12056",
"7205",
"2042",
"10504",
"8230",
"5407",
"8918",
"3891",
"1734",
"4930",
"1622",
"4658",
"12877",
"14295",
"5982",
"12825",
"14263",
"9955",
"12672",
"13210",
"14372",
"3508",
"12861",
"9745",
"2942",
"13840",
"11040",
"1129",
"1987",
"2879",
"10145",
"1334",
"15272",
"12401",
"10322",
"8953",
"4287",
"11539",
"6051",
"3742",
"4120",
"14582",
"8771",
"8535",
"14897",
"13828",
"1212",
"4638",
"15312",
"9780",
"13573",
"7739",
"952",
"14363",
"1135",
"12122",
"3510",
"11435",
"8783",
"7740",
"8901",
"9890",
"11682",
"9246",
"1814",
"1073",
"5622",
"14079",
"15010",
"208",
"4856",
"12081",
"12710",
"1227",
"12634",
"13160",
"12831",
"7862",
"7495",
"2139",
"3152",
"12802",
"10397",
"7131",
"5836",
"7982",
"9677",
"2851",
"1493",
"2446",
"5036",
"8846",
"9084",
"12732",
"7573",
"7958",
"7101",
"5890",
"13806",
"9414",
"9505",
"2046",
"881",
"11249",
"15345",
"2060",
"10426",
"15276",
"12433",
"13663",
"15219",
"12904",
"1775",
"8700",
"10017",
"11430",
"7995",
"7971",
"4360",
"9336",
"6003",
"14084",
"8442",
"6988",
"13611",
"1889",
"2728",
"4319",
"2055",
"3723",
"1577",
"8713",
"6964",
"12758",
"13568",
"3416",
"3843",
"2234",
"9099",
"11917",
"13283",
"14081",
"6414",
"1926",
"6440",
"6419",
"8234",
"5415",
"4330",
"8268",
"15202",
"10094",
"1870",
"7526",
"3506",
"3893",
"13299",
"11174",
"445",
"10182",
"11482",
"6494",
"5301",
"3085",
"2571",
"8346",
"733",
"8213",
"248",
"1236",
"9021",
"8293",
"8855",
"940",
"2340",
"8920",
"8056",
"1191",
"7297",
"9659",
"12839",
"6654",
"15102",
"7438",
"6151",
"7474",
"11647",
"12348",
"7172",
"9127",
"15037",
"2667",
"5312",
"3149",
"6176",
"9721",
"1399",
"13034",
"6876",
"10336",
"1787",
"11226",
"12360",
"13915",
"2445",
"14667",
"14261",
"2051",
"787",
"13491",
"4232",
"7144",
"13668",
"6167",
"3841",
"14660",
"10263",
"11474",
"797",
"3598",
"9037",
"13163",
"1754",
"15162",
"2144",
"6773",
"7871",
"1244",
"15425",
"4668",
"15261",
"4959",
"3344",
"11049",
"7763",
"2686",
"3577",
"1277",
"11167",
"9074",
"7556",
"9818",
"12721",
"7540",
"4019",
"14324",
"2530",
"6835",
"9092",
"5620",
"1557",
"10425",
"4864",
"2676",
"8601",
"2094",
"8078",
"12789",
"12364",
"1892",
"4503",
"4655",
"5932",
"7835",
"5546",
"13641",
"11309",
"5375",
"15288",
"13937",
"135",
"3933",
"6216",
"14250",
"2029",
"43",
"13708",
"15248",
"7933",
"3175",
"2499",
"9920",
"11332",
"9362",
"13751",
"14539",
"11771",
"10536",
"11191",
"6410",
"7095",
"12404",
"10304",
"6812",
"816",
"4228",
"8481",
"2189",
"10130",
"8366",
"11582",
"1115",
"6651",
"1514",
"12726",
"13018",
"12137",
"13574",
"424",
"2699",
"10046",
"8027",
"19",
"10751",
"2832",
"840",
"3847",
"4762",
"11777",
"10782",
"5225",
"4875",
"3734",
"2564",
"9260",
"6408",
"7218",
"1954",
"656",
"6438",
"4057",
"13451",
"12205",
"5636",
"12754",
"12678",
"4478",
"8717",
"13810",
"8750",
"13265",
"1193",
"6049",
"10280",
"13110",
"6429",
"1144",
"468",
"1818",
"13004",
"1668",
"13597",
"955",
"10516",
"3409",
"6300",
"12326",
"14217",
"3352",
"366",
"10277",
"4382",
"3524",
"1847",
"7259",
"5196",
"5988",
"6358",
"10530",
"5271",
"8683",
"11245",
"13364",
"15382",
"11307",
"10959",
"13877",
"8026",
"13914",
"7671",
"7666",
"9664",
"9804",
"11205",
"4758",
"5269",
"4878",
"556",
"5164",
"13897",
"12685",
"11619",
"2114",
"2176",
"13765",
"10623",
"10300",
"8577",
"14342",
"12948",
"716",
"12807",
"4536",
"7641",
"3752",
"9831",
"14993",
"484",
"14028",
"8774",
"14935",
"10826",
"138",
"12447",
"7410",
"3694",
"13316",
"520",
"5647",
"8689",
"12237",
"11378",
"7136",
"5985",
"6726",
"10753",
"11813",
"2606",
"10356",
"8614",
"10600",
"137",
"1571",
"1118",
"543",
"13674",
"2293",
"4915",
"15069",
"12997",
"7435",
"9001",
"13928",
"2588",
"11978",
"13971",
"1533",
"12187",
"10077",
"1186",
"1418",
"14780",
"12515",
"6265",
"6516",
"8702",
"3295",
"4541",
"691",
"15262",
"9486",
"11700",
"11823",
"5718",
"10740",
"2895",
"7846",
"14153",
"12489",
"7331",
"10816",
"8680",
"6952",
"14230",
"5575",
"7720",
"7335",
"11297",
"4008",
"3953",
"14648",
"2218",
"11939",
"11586",
"5008",
"9479",
"14961",
"3651",
"5059",
"6622",
"10613",
"11000",
"14337",
"9394",
"12247",
"11472",
"8558",
"11468",
"1738",
"8823",
"11117",
"9595",
"1953",
"3401",
"13632",
"5116",
"10160",
"200",
"12591",
"6511",
"2799",
"6935",
"14565",
"2295",
"11462",
"8924",
"10982",
"15014",
"1900",
"454",
"7211",
"15349",
"12974",
"14032",
"259",
"15495",
"527",
"8800",
"4004",
"11969",
"7827",
"721",
"13506",
"1876",
"1247",
"13088",
"10938",
"11246",
"8949",
"352",
"12331",
"11010",
"8623",
"934",
"13963",
"5604",
"12379",
"11072",
"7121",
"8604",
"7684",
"8787",
"3393",
"3225",
"2440",
"8536",
"14497",
"394",
"1790",
"10659",
"6395",
"12357",
"2066",
"14613",
"495",
"9490",
"12856",
"10881",
"14803",
"6070",
"7072",
"6512",
"8404",
"9676",
"5154",
"10451",
"12896",
"9411",
"14353",
"2535",
"8438",
"2141",
"1812",
"11811",
"13434",
"8797",
"12836",
"15314",
"10060",
"6991",
"6283",
"15222",
"2095",
"4464",
"7054",
"10107",
"1369",
"10824",
"12761",
"1618",
"6532",
"5168",
"14003",
"9845",
"3331",
"1320",
"1425",
"7854",
"10549",
"15491",
"877",
"1658",
"6453",
"3431",
"9371",
"1934",
"15168",
"12501",
"7576",
"4855",
"9148",
"3997",
"9057",
"6220",
"1702",
"14548",
"7077",
"8023",
"6183",
"4028",
"6188",
"8206",
"12996",
"4450",
"5253",
"14858",
"12113",
"11370",
"9866",
"8229",
"3699",
"7462",
"14636",
"8607",
"12487",
"5627",
"13518",
"6865",
"3340",
"15330",
"7751",
"11818",
"1916",
"14159",
"8217",
"14664",
"6625",
"13887",
"2006",
"6621",
"12610",
"13197",
"9407",
"11839",
"15169",
"8801",
"5232",
"4919",
"9385",
"13831",
"7649",
"15152",
"7894",
"9061",
"9893",
"9691",
"4699",
"10031",
"1336",
"7716",
"9581",
"5939",
"8282",
"8655",
"14555",
"1386",
"6793",
"6437",
"2670",
"6297",
"6314",
"8166",
"4073",
"15244",
"3805",
"5208",
"3007",
"2805",
"5236",
"4081",
"7670",
"9598",
"8962",
"7981",
"10055",
"7098",
"3741",
"8369",
"5837",
"9964",
"6847",
"3345",
"5741",
"6739",
"12469",
"722",
"1617",
"11751",
"11087",
"2100",
"12679",
"8620",
"4396",
"10905",
"5707",
"12891",
"5343",
"1483",
"6282",
"2884",
"13297",
"6004",
"12680",
"2175",
"6806",
"7258",
"13965",
"14655",
"11106",
"4340",
"9331",
"2809",
"3748",
"14748",
"3114",
"15393",
"341",
"8135",
"3065",
"11724",
"15362",
"7681",
"6658",
"10344",
"6822",
"798",
"14062",
"23",
"4899",
"7009",
"11676",
"1888",
"8256",
"12757",
"2641",
"2906",
"4504",
"14560",
"7233",
"4387",
"15438",
"10414",
"11941",
"13938",
"4659",
"14008",
"10713",
"13906",
"7134",
"8997",
"3665",
"13252",
"8948",
"10446",
"9310",
"14414",
"9460",
"14965",
"14190",
"7821",
"3975",
"13676",
"5712",
"336",
"9848",
"2700",
"15396",
"773",
"2033",
"846",
"8830",
"14436",
"146",
"11230",
"11232",
"1779",
"456",
"1440",
"2386",
"5615",
"13801",
"5867",
"6098",
"7809",
"14491",
"2476",
"7504",
"12342",
"4025",
"12312",
"2547",
"14486",
"6776",
"6375",
"12029",
"14934",
"14498",
"8029",
"15147",
"8080",
"13205",
"11196",
"6783",
"2596",
"4064",
"3322",
"12110",
"7392",
"15341",
"10289",
"13700",
"5132",
"15170",
"4970",
"9403",
"841",
"6191",
"7087",
"14231",
"671",
"5198",
"5238",
"6756",
"6128",
"14607",
"10241",
"2158",
"7891",
"994",
"4774",
"9135",
"8090",
"7010",
"12352",
"13459",
"13170",
"9646",
"12329",
"2817",
"15018",
"7998",
"5814",
"6718",
"6643",
"569",
"7822",
"14622",
"8909",
"551",
"6526",
"8473",
"9426",
"1918",
"1852",
"12297",
"1143",
"1536",
"464",
"2087",
"1655",
"8546",
"1675",
"3942",
"2436",
"557",
"10231",
"12592",
"6012",
"3329",
"3011",
"8425",
"12103",
"11091",
"2037",
"15053",
"11290",
"15006",
"14019",
"11601",
"6507",
"6885",
"9972",
"3107",
"3354",
"6703",
"5709",
"7379",
"9946",
"7500",
"1540",
"3704",
"2217",
"8285",
"3254",
"7512",
"8682",
"4517",
"6480",
"873",
"11763",
"11454",
"8575",
"13007",
"1428",
"14687",
"15115",
"1825",
"2413",
"14540",
"7856",
"13489",
"6456",
"9928",
"2772",
"5753",
"1808",
"4374",
"14626",
"3875",
"5766",
"6251",
"294",
"10340",
"7497",
"6118",
"10374",
"10910",
"4074",
"2064",
"4452",
"14739",
"11796",
"14615",
"6193",
"4663",
"10840",
"5706",
"3926",
"3818",
"3869",
"764",
"6394",
"1899",
"10587",
"662",
"12255",
"12612",
"1430",
"760",
"1578",
"10543",
"8258",
"14332",
"3096",
"3389",
"13811",
"664",
"1250",
"13058",
"1190",
"524",
"1309",
"14825",
"15003",
"1416",
"13177",
"3311",
"11664",
"1022",
"11438",
"7683",
"14745",
"7151",
"7440",
"3968",
"1791",
"3238",
"6126",
"9530",
"7624",
"4623",
"14674",
"0",
"1259",
"12201",
"2118",
"3465",
"11244",
"12629",
"7547",
"8261",
"1032",
"837",
"609",
"5358",
"2401",
"7145",
"14507",
"12170",
"5833",
"13847",
"10082",
"4694",
"9728",
"9544",
"14430",
"2861",
"1470",
"14768",
"13151",
"8030",
"4392",
"4049",
"2075",
"10874",
"210",
"11548",
"8834",
"876",
"12937",
"1326",
"12979",
"9720",
"14278",
"12954",
"12156",
"11595",
"7213",
"1439",
"5135",
"4126",
"13800",
"12190",
"1930",
"7612",
"11202",
"7274",
"6990",
"10607",
"5171",
"12372",
"14394",
"1564",
"15161",
"2666",
"13340",
"9663",
"9247",
"6698",
"9372",
"6669",
"11789",
"8223",
"6803",
"11163",
"5901",
"2360",
"3074",
"5634",
"5698",
"2953",
"7393",
"9483",
"7545",
"12550",
"4800",
"4863",
"12356",
"2990",
"5000",
"6190",
"5846",
"2284",
"9292",
"11203",
"3715",
"8980",
"8398",
"4023",
"10200",
"8270",
"3570",
"14499",
"9888",
"10494",
"2173",
"485",
"7333",
"7304",
"10295",
"11192",
"10165",
"4072",
"11398",
"10004",
"1111",
"10410",
"7520",
"7473",
"4037",
"451",
"441",
"3608",
"4327",
"12536",
"10125",
"2257",
"2341",
"4066",
"12697",
"3596",
"11172",
"11036",
"6397",
"14870",
"5218",
"7657",
"4925",
"10375",
"11046",
"7425",
"8864",
"2350",
"1816",
"14022",
"12092",
"14138",
"9688",
"8899",
"602",
"15242",
"12524",
"6153",
"1137",
"13770",
"15107",
"1003",
"6589",
"12107",
"14222",
"8737",
"7602",
"11652",
"9771",
"14228",
"8534",
"7270",
"7244",
"4139",
"9251",
"13854",
"1454",
"245",
"14274",
"1751",
"4414",
"6319",
"2203",
"7471",
"12260",
"12444",
"4968",
"13430",
"8509",
"14336",
"14900",
"5967",
"4386",
"10476",
"7523",
"11829",
"13309",
"14898",
"9024",
"11316",
"12749",
"2458",
"8447",
"6422",
"560",
"1739",
"12690",
"8209",
"7837",
"13319",
"3371",
"3832",
"5248",
"10191",
"13351",
"10681",
"3854",
"5935",
"6968",
"4771",
"133",
"355",
"6392",
"4913",
"6353",
"9365",
"10902",
"3887",
"3391",
"2994",
"8843",
"14400",
"11429",
"12815",
"12673",
"7549",
"14365",
"620",
"13627",
"11523",
"6627",
"9563",
"5366",
"4448",
"12543",
"9898",
"6337",
"11001",
"924",
"4651",
"5213",
"6849",
"14416",
"12539",
"5690",
"3500",
"6578",
"8730",
"14039",
"3413",
"651",
"775",
"3038",
"14826",
"13882",
"4552",
"85",
"8146",
"8049",
"11393",
"12251",
"510",
"4178",
"163",
"1581",
"8227",
"2578",
"11351",
"7219",
"9308",
"6291",
"13400",
"13033",
"8045",
"5329",
"14677",
"8152",
"14270",
"11119",
"5148",
"11704",
"8854",
"9660",
"10923",
"7592",
"10540",
"2478",
"12568",
"1355",
"5373",
"14492",
"5",
"9113",
"9911",
"5680",
"12490",
"6641",
"4531",
"6080",
"12542",
"1486",
"14953",
"10478",
"7692",
"1368",
"12785",
"1659",
"9257",
"3536",
"11362",
"8931",
"6813",
"10398",
"824",
"8047",
"8563",
"11942",
"11661",
"4316",
"10963",
"10075",
"6569",
"1819",
"3732",
"3423",
"14716",
"1969",
"11687",
"3931",
"14785",
"6558",
"11583",
"6606",
"12067",
"2662",
"8847",
"1627",
"7",
"10554",
"15414",
"4987",
"2616",
"629",
"2593",
"12149",
"13868",
"2306",
"447",
"7182",
"7487",
"10830",
"14914",
"2125",
"6278",
"11029",
"292",
"9324",
"9708",
"4016",
"2015",
"14773",
"4825",
"7168",
"4957",
"11471",
"8600",
"8798",
"2999",
"5949",
"498",
"1152",
"9432",
"9732",
"1501",
"4158",
"14943",
"14890",
"6732",
"13942",
"2663",
"8559",
"12088",
"2914",
"6692",
"13566",
"7919",
"3882",
"12141",
"13100",
"8871",
"15077",
"1849",
"10961",
"10928",
"5682",
"12541",
"7183",
"10687",
"9131",
"5177",
"314",
"5199",
"14009",
"14259",
"6170",
"9083",
"10056",
"12965",
"5336",
"6502",
"11915",
"618",
"8582",
"7917",
"13142",
"449",
"14420",
"1555",
"14476",
"12101",
"15305",
"3738",
"13432",
"11833",
"13497",
"1598",
"6426",
"11772",
"11194",
"8882",
"257",
"743",
"6008",
"6090",
"10849",
"5277",
"4262",
"2242",
"549",
"5880",
"9830",
"8586",
"4601",
"4388",
"14617",
"7236",
"576",
"10869",
"11662",
"11190",
"12799",
"13195",
"11104",
"5053",
"1786",
"7848",
"6967",
"4484",
"9805",
"5380",
"3972",
"6084",
"6824",
"12419",
"9048",
"7482",
"8163",
"9872",
"10804",
"9167",
"4218",
"6270",
"8520",
"9573",
"9444",
"6639",
"1925",
"12833",
"15207",
"6154",
"4152",
"12753",
"3954",
"13657",
"14281",
"4274",
"795",
"1539",
"14846",
"4134",
"5778",
"5723",
"14321",
"6874",
"1915",
"10718",
"5025",
"11218",
"458",
"15287",
"1896",
"13192",
"8111",
"12994",
"5110",
"2078",
"9345",
"11721",
"4943",
"8821",
"11996",
"6908",
"6774",
"14480",
"8121",
"9981",
"5664",
"8517",
"1520",
"7503",
"11353",
"8459",
"9967",
"11906",
"10734",
"12176",
"2584",
"2705",
"3387",
"5420",
"11935",
"5958",
"13169",
"10839",
"8580",
"2908",
"1393",
"9240",
"2543",
"391",
"13215",
"6818",
"1672",
"4687",
"13326",
"249",
"8758",
"6210",
"12076",
"3765",
"11605",
"608",
"7374",
"13871",
"12358",
"13083",
"4198",
"10361",
"13267",
"9320",
"329",
"8780",
"15372",
"14087",
"3069",
"10517",
"11078",
"9332",
"13199",
"10117",
"7349",
"5012",
"178",
"13540",
"13924",
"6274",
"331",
"7424",
"13277",
"1844",
"13504",
"4090",
"9775",
"7232",
"13720",
"6100",
"3139",
"659",
"6489",
"6338",
"10810",
"14037",
"14638",
"1811",
"13798",
"13736",
"2035",
"2733",
"4835",
"8334",
"5392",
"10204",
"3495",
"1049",
"4336",
"3445",
"7788",
"8483",
"4030",
"7797",
"4224",
"8439",
"14044",
"6791",
"10462",
"9360",
"4706",
"6066",
"13016",
"12207",
"2322",
"883",
"1531",
"12734",
"10548",
"12185",
"8058",
"7635",
"5452",
"237",
"11655",
"4740",
"4290",
"9762",
"3547",
"9062",
"1993",
"4937",
"12378",
"12189",
"2191",
"3976",
"6978",
"2331",
"14676",
"14602",
"14969",
"5157",
"2447",
"11161",
"5782",
"9253",
"1572",
"37",
"8735",
"6645",
"5873",
"1831",
"8817",
"7248",
"12716",
"4886",
"4620",
"13729",
"5013",
"11844",
"2744",
"13823",
"13343",
"11699",
"15007",
"6275",
"8406",
"9402",
"8392",
"9931",
"10000",
"14112",
"13264",
"5962",
"4372",
"2868",
"2679",
"3289",
"3802",
"13250",
"10365",
"7794",
"9817",
"10784",
"5819",
"7521",
"5758",
"7480",
"6839",
"3021",
"14383",
"3907",
"11184",
"4009",
"3851",
"4952",
"10515",
"5979",
"12866",
"3595",
"12310",
"13376",
"3674",
"1260",
"13773",
"12932",
"9637",
"2399",
"7493",
"1935",
"3940",
"2991",
"12472",
"9096",
"9951",
"3848",
"14864",
"10039",
"12876",
"13139",
"14",
"3112",
"13013",
"9399",
"354",
"7834",
"2684",
"8306",
"8016",
"2613",
"10761",
"7365",
"10735",
"11274",
"9910",
"1891",
"14179",
"5479",
"13300",
"12752",
"14940",
"12323",
"9701",
"42",
"8232",
"12635",
"14317",
"1018",
"5147",
"8106",
"836",
"2057",
"1243",
"5244",
"5507",
"965",
"9227",
"6816",
"5675",
"7380",
"7574",
"9003",
"1128",
"984",
"8617",
"2030",
"6782",
"2859",
"13001",
"5377",
"1824",
"10095",
"3130",
"4302",
"4969",
"15047",
"3314",
"2275",
"11624",
"13841",
"14215",
"11503",
"5200",
"860",
"9487",
"9215",
"1284",
"10104",
"8226",
"12304",
"11739",
"1690",
"5106",
"11878",
"5239",
"4398",
"1951",
"14242",
"6490",
"7395",
"14026",
"15215",
"4361",
"1801",
"2279",
"8304",
"14459",
"10099",
"15397",
"4639",
"3370",
"7449",
"632",
"14733",
"1506",
"247",
"89",
"5560",
"9268",
"6272",
"15292",
"1321",
"4918",
"3162",
"2455",
"10482",
"10894",
"145",
"14175",
"9151",
"11489",
"13303",
"12197",
"11495",
"5852",
"9312",
"11364",
"7987",
"13534",
"14675",
"4532",
"7290",
"14099",
"12474",
"12409",
"14992",
"1052",
"11412",
"8518",
"7354",
"12129",
"15083",
"10808",
"5590",
"15388",
"7747",
"4451",
"12695",
"9618",
"4454",
"10798",
"14297",
"11863",
"14903",
"13090",
"12180",
"4014",
"9429",
"8933",
"11734",
"2796",
"11508",
"12311",
"14326",
"6266",
"161",
"1465",
"11215",
"8102",
"658",
"79",
"8315",
"916",
"8541",
"2113",
"8327",
"1943",
"7047",
"13093",
"3003",
"13414",
"9301",
"7109",
"6005",
"12041",
"13157",
"7157",
"11213",
"2891",
"10559",
"14091",
"6537",
"9504",
"3798",
"11574",
"12031",
"97",
"3177",
"12484",
"10222",
"9230",
"12738",
"14799",
"2230",
"4093",
"517",
"7820",
"14696",
"10185",
"2361",
"643",
"1364",
"6892",
"5493",
"8371",
"9242",
"2427",
"6181",
"14814",
"15367",
"1637",
"7642",
"724",
"5970",
"6521",
"9248",
"9555",
"10307",
"3945",
"890",
"13556",
"10787",
"11171",
"6345",
"2692",
"5279",
"12859",
"6368",
"3137",
"13669",
"10729",
"502",
"13535",
"5643",
"6655",
"4982",
"10975",
"2054",
"748",
"2697",
"12441",
"8996",
"6444",
"7583",
"11971",
"11881",
"14257",
"6428",
"9378",
"3459",
"5896",
"2255",
"2680",
"3957",
"2163",
"7645",
"6407",
"12516",
"4770",
"6826",
"1502",
"1362",
"3905",
"4197",
"2974",
"5811",
"1451",
"10949",
"14661",
"6709",
"11336",
"1272",
"5084",
"9610",
"10680",
"14863",
"5223",
"15377",
"1962",
"13929",
"6468",
"7187",
"12766",
"9323",
"3658",
"4795",
"10404",
"1685",
"11224",
"8739",
"10898",
"13559",
"10897",
"7345",
"15392",
"2969",
"1071",
"4140",
"4243",
"6315",
"432",
"3525",
"4818",
"11779",
"8401",
"5609",
"516",
"3601",
"7983",
"12649",
"10401",
"12628",
"12723",
"2903",
"11629",
"1228",
"8637",
"8318",
"10219",
"6074",
"14135",
"1766",
"2729",
"14752",
"14134",
"13153",
"13039",
"143",
"6694",
"5094",
"2302",
"5554",
"8362",
"3276",
"13808",
"14340",
"13879",
"9111",
"1179",
"14568",
"6809",
"4669",
"9109",
"6650",
"13558",
"6325",
"12715",
"2572",
"4589",
"10447",
"15348",
"685",
"3648",
"8939",
"653",
"10723",
"11407",
"12900",
"887",
"1747",
"15194",
"15284",
"9017",
"13874",
"6179",
"4117",
"7197",
"4683",
"7499",
"10362",
"1288",
"5726",
"6786",
"4463",
"2431",
"5475",
"14521",
"13621",
"2853",
"3170",
"6910",
"100",
"13271",
"9513",
"4496",
"3221",
"7195",
"5433",
"8564",
"3309",
"9633",
"14286",
"14382",
"3643",
"4444",
"5767",
"7944",
"10090",
"11419",
"14859",
"11598",
"5434",
"11032",
"2301",
"6107",
"5705",
"11713",
"11283",
"8427",
"8835",
"15448",
"455",
"3169",
"14902",
"821",
"1594",
"9000",
"2197",
"12991",
"14349",
"8777",
"6945",
"2106",
"6103",
"14187",
"8048",
"5320",
"12383",
"14219",
"12618",
"8093",
"487",
"7289",
"9873",
"11744",
"9423",
"1290",
"6875",
"13304",
"2099",
"14956",
"11680",
"3466",
"493",
"8762",
"15477",
"2830",
"11546",
"10806",
"10187",
"12341",
"14706",
"9973",
"5648",
"5936",
"14007",
"1821",
"4163",
"11387",
"1239",
"6766",
"2621",
"5202",
"14806",
"4901",
"14995",
"9405",
"491",
"13125",
"9867",
"4671",
"11573",
"1089",
"5831",
"7167",
"12420",
"3399",
"15092",
"11960",
"6017",
"9315",
"8320",
"2368",
"5052",
"2768",
"10116",
"4077",
"9900",
"6796",
"3436",
"7266",
"1893",
"10111",
"2157",
"1524",
"11623",
"11250",
"14443",
"1755",
"6891",
"10168",
"7610",
"8608",
"14988",
"3712",
"1096",
"6640",
"15340",
"4356",
"14786",
"9418",
"14076",
"10867",
"1245",
"14671",
"4610",
"8279",
"10818",
"4973",
"12829",
"10610",
"2747",
"2273",
"11493",
"11851",
"3053",
"2978",
"11911",
"3884",
"5790",
"2540",
"13876",
"8648",
"8840",
"7307",
"5840",
"9184",
"864",
"8390",
"4089",
"7269",
"7669",
"5895",
"12801",
"11139",
"531",
"4912",
"2791",
"7313",
"9912",
"7652",
"3528",
"1299",
"9134",
"25",
"843",
"3947",
"15407",
"13435",
"10021",
"4535",
"1410",
"11065",
"12330",
"11175",
"2375",
"4184",
"14632",
"9196",
"2921",
"11253",
"14523",
"3463",
"1387",
"5914",
"2305",
"5687",
"7005",
"3198",
"12805",
"968",
"7362",
"2282",
"6649",
"3291",
"10167",
"15249",
"10918",
"9120",
"10896",
"8420",
"14439",
"1574",
"11485",
"8921",
"540",
"3922",
"972",
"10209",
"2653",
"10702",
"2161",
"13583",
"356",
"8429",
"13953",
"13569",
"13149",
"6837",
"8386",
"5594",
"6665",
"7239",
"7120",
"10226",
"6332",
"4624",
"13901",
"5360",
"11342",
"5427",
"1761",
"13356",
"14616",
"6985",
"10558",
"9438",
"6134",
"14415",
"11726",
"8167",
"6415",
"10298",
"13993",
"8374",
"7467",
"11369",
"12606",
"5527",
"11972",
"11381",
"3754",
"13355",
"14411",
"13743",
"14984",
"13178",
"15344",
"9264",
"13726",
"1511",
"1206",
"12577",
"1325",
"7453",
"3842",
"467",
"7907",
"4577",
"12450",
"10089",
"7955",
"7128",
"12175",
"11436",
"14246",
"13232",
"14341",
"4556",
"5431",
"3660",
"8096",
"12565",
"10352",
"6116",
"13533",
"5892",
"8915",
"14018",
"3811",
"12942",
"9750",
"3421",
"1324",
"11553",
"9989",
"14013",
"5359",
"11873",
"7825",
"14050",
"12651",
"14962",
"1967",
"7563",
"15179",
"8397",
"7189",
"5268",
"6771",
"14713",
"228",
"4315",
"10725",
"13717",
"12935",
"9626",
"3145",
"835",
"6413",
"2138",
"12459",
"4530",
"8435",
"7675",
"13338",
"9213",
"7558",
"10943",
"12396",
"1472",
"14333",
"6000",
"13648",
"13113",
"11613",
"11807",
"13473",
"5234",
"8628",
"9683",
"15290",
"13588",
"14583",
"9963",
"8583",
"6040",
"6441",
"12167",
"3174",
"818",
"172",
"999",
"9962",
"12998",
"14370",
"4261",
"9255",
"13866",
"11944",
"2924",
"12446",
"9212",
"10253",
"5561",
"10238",
"13390",
"3490",
"8118",
"3219",
"3047",
"1691",
"1799",
"742",
"15123",
"8468",
"9714",
"3938",
"1752",
"8661",
"9797",
"6869",
"10619",
"2410",
"8986",
"13684",
"8715",
"8815",
"4246",
"6055",
"3605",
"13445",
"419",
"11742",
"2644",
"6794",
"11138",
"9102",
"10101",
"15422",
"11649",
"9747",
"5034",
"2335",
"13889",
"4812",
"11064",
"3804",
"5843",
"7254",
"2941",
"12682",
"2794",
"15264",
"3144",
"1123",
"12817",
"10595",
"7850",
"762",
"10098",
"915",
"12011",
"1939",
"4641",
"865",
"10247",
"9275",
"7306",
"7892",
"11780",
"5357",
"299",
"12338",
"6503",
"8708",
"6310",
"10323",
"541",
"6184",
"6450",
"7231",
"5742",
"7494",
"9725",
"8674",
"6093",
"10235",
"12506",
"4043",
"9396",
"3375",
"503",
"9155",
"2973",
"1537",
"14729",
"15268",
"11563",
"12872",
"3533",
"14871",
"3452",
"6110",
"6047",
"15193",
"1616",
"14911",
"12319",
"11465",
"6504",
"898",
"15113",
"7566",
"10456",
"7711",
"3769",
"11373",
"8543",
"385",
"7378",
"2900",
"4840",
"12052",
"2253",
"14609",
"1813",
"10973",
"4401",
"4866",
"14932",
"8591",
"4196",
"1446",
"12756",
"8645",
"3866",
"12631",
"6287",
"12300",
"6388",
"2877",
"6962",
"4741",
"8326",
"8159",
"3308",
"2074",
"13665",
"12993",
"14495",
"872",
"15328",
"6819",
"8289",
"7006",
"10976",
"11922",
"9863",
"15180",
"6623",
"2297",
"12519",
"3787",
"8549",
"1936",
"9302",
"13890",
"8692",
"6362",
"10406",
"12165",
"2657",
"315",
"10526",
"5668",
"14688",
"4376",
"15019",
"11710",
"9952",
"9859",
"3266",
"7470",
"13363",
"4713",
"14089",
"12616",
"3160",
"7199",
"13301",
"13759",
"5098",
"1357",
"13789",
"13057",
"11133",
"947",
"11009",
"4666",
"8998",
"6466",
"13263",
"3203",
"9568",
"9437",
"10189",
"4889",
"8269",
"5848",
"10776",
"8492",
"2129",
"15412",
"14873",
"1988",
"7800",
"9947",
"3545",
"2741",
"1550",
"5743",
"9584",
"12035",
"7479",
"10941",
"12371",
"9177",
"9809",
"9355",
"2366",
"4722",
"8086",
"9821",
"5264",
"10770",
"7584",
"1495",
"8959",
"9723",
"7383",
"13735",
"8154",
"4407",
"12220",
"6770",
"3435",
"6934",
"9381",
"13813",
"14921",
"12273",
"15324",
"7390",
"6023",
"13059",
"3515",
"13484",
"5734",
"3462",
"7945",
"9012",
"15068",
"2170",
"6442",
"2909",
"11953",
"14086",
"8844",
"8878",
"5229",
"12569",
"2598",
"238",
"15459",
"5613",
"9661",
"11918",
"1985",
"9028",
"14673",
"2091",
"488",
"11024",
"14515",
"14262",
"4453",
"13240",
"12033",
"11675",
"4234",
"11333",
"15352",
"9239",
"8550",
"5719",
"4415",
"14422",
"9518",
"9473",
"10199",
"11592",
"14306",
"2172",
"9615",
"3233",
"2146",
"12857",
"2298",
"12573",
"9457",
"9602",
"14447",
"9959",
"11815",
"13578",
"3350",
"14694",
"6396",
"14375",
"12173",
"11843",
"1412",
"10193",
"9390",
"5638",
"2389",
"4070",
"9186",
"13617",
"8470",
"13741",
"9409",
"6594",
"1907",
"207",
"287",
"8818",
"2709",
"2762",
"14901",
"5408",
"13934",
"13935",
"2316",
"6290",
"7953",
"8421",
"12729",
"4177",
"12290",
"3557",
"15063",
"4324",
"12112",
"15439",
"4311",
"15230",
"8640",
"8860",
"9080",
"12272",
"13262",
"12291",
"10719",
"6041",
"1731",
"1456",
"7582",
"11268",
"7939",
"12746",
"9876",
"11410",
"8140",
"13474",
"8192",
"12366",
"6784",
"11039",
"5045",
"6802",
"1906",
"4936",
"7423",
"6763",
"14006",
"1237",
"14276",
"4779",
"6105",
"13757",
"4033",
"15236",
"690",
"3964",
"10379",
"12699",
"15099",
"14479",
"1156",
"15343",
"7887",
"10202",
"8510",
"9576",
"2294",
"9811",
"5085",
"871",
"2798",
"2967",
"14805",
"8313",
"8299",
"7446",
"6997",
"1845",
"1401",
"10459",
"6217",
"11242",
"4149",
"3923",
"774",
"10150",
"3176",
"6805",
"8657",
"212",
"7002",
"10861",
"6340",
"12281",
"15241",
"7430",
"7814",
"9942",
"11484",
"3747",
"12104",
"7694",
"263",
"11692",
"5158",
"7586",
"3239",
"5744",
"14614",
"13305",
"10227",
"9966",
"2506",
"9977",
"10319",
"14973",
"7517",
"4946",
"13581",
"3009",
"9717",
"6590",
"9944",
"7682",
"13342",
"7529",
"7968",
"6747",
"12795",
"4670",
"13985",
"12581",
"10573",
"11554",
"14275",
"1671",
"3439",
"14409",
"10906",
"12351",
"8908",
"6473",
"8991",
"13824",
"10800",
"9801",
"7667",
"10752",
"2266",
"7925",
"14833",
"5471",
"14496",
"4766",
"3430",
"14743",
"14225",
"69",
"13672",
"6611",
"12639",
"6683",
"11933",
"3753",
"14698",
"8764",
"9528",
"3654",
"6401",
"4729",
"1178",
"5571",
"4018",
"13372",
"833",
"10551",
"2484",
"6363",
"7798",
"6593",
"8893",
"1698",
"4048",
"15181",
"10342",
"6137",
"10097",
"13670",
"2964",
"4908",
"8790",
"4637",
"4636",
"5645",
"7620",
"10726",
"4078",
"3272",
"625",
"2932",
"11357",
"928",
"5525",
"6145",
"8498",
"550",
"4050",
"621",
"12224",
"12271",
"11542",
"9124",
"14085",
"11415",
"10256",
"9420",
"9075",
"5346",
"9982",
"13930",
"8350",
"3396",
"3755",
"9842",
"8484",
"8184",
"6878",
"15246",
"2247",
"14041",
"10508",
"2332",
"13012",
"11334",
"14946",
"12461",
"3529",
"7665",
"8316",
"8190",
"11775",
"6863",
"8043",
"5120",
"7188",
"4383",
"8015",
"13658",
"14407",
"14068",
"11315",
"12086",
"4924",
"8050",
"12941",
"15336",
"6543",
"919",
"13331",
"13162",
"14319",
"11567",
"2021",
"9016",
"9363",
"533",
"2636",
"10917",
"7029",
"5866",
"7999",
"5330",
"604",
"6019",
"9827",
"4173",
"11207",
"5122",
"7569",
"10112",
"9171",
"9519",
"8584",
"9161",
"12779",
"10929",
"9425",
"4738",
"4098",
"2673",
"11930",
"2580",
"7408",
"351",
"3079",
"4895",
"7085",
"9829",
"12274",
"279",
"1217",
"3990",
"2482",
"7842",
"3521",
"5395",
"7376",
"6530",
"6281",
"11666",
"8775",
"5172",
"14547",
"10758",
"1443",
"14576",
"929",
"80",
"6777",
"2706",
"2629",
"14429",
"8297",
"12873",
"3104",
"14299",
"2843",
"182",
"3349",
"10347",
"242",
"4481",
"4884",
"15423",
"12074",
"14775",
"11017",
"2841",
"7656",
"8974",
"13002",
"15302",
"15136",
"11157",
"343",
"2210",
"9056",
"7738",
"11451",
"307",
"1200",
"844",
"2267",
"10491",
"4363",
"12978",
"9019",
"9254",
"2878",
"6484",
"5536",
"6263",
"11015",
"2622",
"4906",
"3477",
"6308",
"7088",
"11610",
"459",
"12955",
"1011",
"3441",
"7817",
"11469",
"2850",
"7524",
"9592",
"11804",
"151",
"13776",
"2231",
"14195",
"723",
"12843",
"11285",
"7181",
"5581",
"7309",
"6391",
"3719",
"223",
"14421",
"11576",
"10501",
"9734",
"5804",
"10106",
"5576",
"4569",
"10041",
"9715",
"9703",
"11448",
"11433",
"10418",
"3228",
"3645",
"2833",
"14563",
"1961",
"11199",
"10366",
"1220",
"11564",
"13995",
"7508",
"5079",
"3471",
"14905",
"7802",
"13861",
"14509",
"9294",
"12950",
"13967",
"7377",
"5448",
"13579",
"8413",
"13345",
"3250",
"7386",
"14994",
"4217",
"1174",
"8919",
"3988",
"12602",
"10048",
"11759",
"13040",
"12512",
"11989",
"4985",
"14506",
"7975",
"10785",
"10671",
"3606",
"9550",
"10173",
"4499",
"4568",
"9566",
"7044",
"9130",
"8338",
"4647",
"5347",
"15151",
"4574",
"9132",
"14107",
"3671",
"10445",
"8857",
"10636",
"6159",
"5182",
"10471",
"12555",
"8565",
"11736",
"4783",
"92",
"4470",
"14377",
"5378",
"619",
"8474",
"9752",
"12529",
"10715",
"3089",
"9998",
"6927",
"11151",
"4358",
"12256",
"6234",
"11355",
"1068",
"11405",
"5628",
"9022",
"7123",
"2911",
"12320",
"13254",
"14798",
"5097",
"8554",
"4868",
"8652",
"10123",
"3083",
"4007",
"747",
"10791",
"8914",
"2314",
"13081",
"2277",
"1772",
"2562",
"6902",
"14753",
"15139",
"6572",
"9996",
"6095",
"6906",
"8393",
"8851",
"7154",
"15046",
"9741",
"8113",
"9333",
"9712",
"7460",
"15399",
"13843",
"6556",
"338",
"11889",
"7826",
"2471",
"14627",
"9868",
"15173",
"3669",
"4529",
"10958",
"670",
"10450",
"10668",
"1963",
"12724",
"11806",
"13724",
"8519",
"13",
"13186",
"6922",
"5129",
"10302",
"9861",
"571",
"9206",
"6285",
"12344",
"5259",
"12881",
"10423",
"13463",
"1474",
"3994",
"10044",
"5421",
"9205",
"3919",
"7320",
"2695",
"3411",
"2829",
"10250",
"15158",
"4195",
"147",
"13320",
"5980",
"346",
"7698",
"13176",
"6357",
"825",
"1861",
"4932",
"14500",
"12215",
"1004",
"13357",
"7118",
"15433",
"5530",
"6765",
"10656",
"13747",
"5957",
"7702",
"5111",
"12842",
"11109",
"14589",
"962",
"3109",
"6960",
"15334",
"12668",
"11755",
"10174",
"10952",
"396",
"7492",
"7339",
"8822",
"6313",
"10287",
"1347",
"4559",
"2797",
"11537",
"5838",
"3736",
"4786",
"12121",
"11300",
"5406",
"980",
"2067",
"3614",
"3831",
"5580",
"6043",
"10310",
"8235",
"7060",
"5572",
"15045",
"13378",
"14791",
"14647",
"948",
"9945",
"3078",
"104",
"1434",
"11350",
"494",
"2134",
"125",
"6344",
"6240",
"13440",
"15138",
"9943",
"6768",
"6033",
"14794",
"3292",
"13505",
"9785",
"10151",
"5203",
"10703",
"3325",
"9358",
"10420",
"6616",
"13152",
"6152",
"7161",
"6044",
"5776",
"6714",
"4137",
"3552",
"4015",
"10570",
"7915",
"11993",
"13639",
"5884",
"14330",
"9306",
"8361",
"8609",
"8005",
"5233",
"4154",
"2016",
"6925",
"884",
"6461",
"4760",
"12061",
"6389",
"12846",
"13650",
"783",
"2303",
"8322",
"10851",
"14966",
"14641",
"1519",
"9076",
"4017",
"5810",
"14847",
"9906",
"7579",
"3044",
"8041",
"15103",
"6139",
"11668",
"14090",
"10535",
"5119",
"10801",
"4634",
"13483",
"5219",
"14777",
"12037",
"3384",
"10267",
"12617",
"14454",
"12283",
"5799",
"3246",
"9207",
"1360",
"5505",
"9118",
"11501",
"6150",
"13458",
"15086",
"11223",
"8085",
"931",
"4409",
"9281",
"11120",
"13424",
"13827",
"12131",
"1820",
"8083",
"14121",
"7385",
"14123",
"2517",
"12632",
"6002",
"3644",
"8548",
"8254",
"505",
"3588",
"7768",
"8742",
"3616",
"4935",
"8001",
"4815",
"9520",
"7444",
"1611",
"3702",
"1341",
"1389",
"7026",
"2933",
"14680",
"2195",
"2083",
"8466",
"4011",
"7249",
"9786",
"13314",
"3824",
"7169",
"12953",
"5064",
"15278",
"9621",
"6306",
"4873",
"14600",
"9393",
"5356",
"10790",
"6326",
"9836",
"5673",
"6586",
"10177",
"2400",
"140",
"8079",
"3950",
"938",
"6327",
"9638",
"12252",
"3447",
"8035",
"6187",
"7084",
"8707",
"12062",
"8169",
"4326",
"2874",
"5210",
"3587",
"6845",
"13334",
"11384",
"1496",
"7000",
"9511",
"5510",
"14322",
"12246",
"9940",
"1147",
"9619",
"2866",
"7448",
"14057",
"13983",
"14991",
"11913",
"5463",
"6086",
"15410",
"13190",
"7396",
"10672",
"9686",
"13492",
"14203",
"9509",
"3941",
"4653",
"12939",
"5215",
"3642",
"4240",
"4172",
"6780",
"5618",
"7845",
"5065",
"15360",
"7603",
"2888",
"1042",
"13687",
"10502",
"14832",
"11520",
"13225",
"15321",
"3105",
"3194",
"6141",
"15320",
"9234",
"3279",
"9311",
"5928",
"5349",
"7152",
"10269",
"9627",
"1644",
"12476",
"1602",
"9337",
"13563",
"8778",
"9159",
"12982",
"5543",
"12652",
"7632",
"2620",
"13336",
"5661",
"565",
"15253",
"12554",
"5730",
"5007",
"11289",
"7019",
"12046",
"8089",
"7294",
"1758",
"13917",
"4052",
"15447",
"9934",
"9098",
"4660",
"12198",
"6029",
"2084",
"2193",
"9327",
"7629",
"5965",
"3299",
"11606",
"2371",
"12566",
"3584",
"2477",
"3317",
"3071",
"444",
"2365",
"10018",
"11565",
"3825",
"13921",
"13208",
"12258",
"2821",
"9216",
"4850",
"3476",
"1390",
"72",
"7212",
"4399",
"14620",
"12335",
"812",
"12135",
"11155",
"10147",
"13352",
"7491",
"3687",
"6293",
"7014",
"11688",
"13245",
"10345",
"3029",
"12142",
"5030",
"11622",
"9539",
"8636",
"8776",
"8132",
"2618",
"8177",
"4046",
"365",
"10744",
"3762",
"3282",
"2143",
"4143",
"10479",
"3326",
"3132",
"12390",
"970",
"3460",
"8697",
"2012",
"2398",
"3448",
"13061",
"5362",
"921",
"4905",
"12808",
"13196",
"6257",
"4006",
"3944",
"111",
"9858",
"9772",
"5018",
"6860",
"12915",
"909",
"14850",
"9140",
"8478",
"11801",
"5616",
"15005",
"14737",
"954",
"9924",
"5655",
"141",
"10282",
"11225",
"8808",
"9108",
"6163",
"14877",
"6247",
"4614",
"8769",
"15260",
"5498",
"7745",
"9133",
"2148",
"13958",
"6857",
"5040",
"2226",
"1670",
"10217",
"13449",
"12060",
"2857",
"203",
"13289",
"14128",
"6755",
"427",
"2694",
"13471",
"6736",
"12423",
"11123",
"11740",
"3655",
"10667",
"12532",
"7934",
"6652",
"985",
"956",
"14095",
"4061",
"8540",
"1462",
"7048",
"2849",
"2077",
"6950",
"10052",
"3077",
"7485",
"10154",
"6880",
"1185",
"8880",
"13691",
"8806",
"2397",
"869",
"9224",
"1683",
"7661",
"2846",
"1552",
"1",
"13462",
"9794",
"3304",
"13146",
"3374",
"10249",
"1778",
"3117",
"14451",
"7678",
"10329",
"9447",
"3836",
"652",
"14133",
"3689",
"11695",
"9258",
"1927",
"11243",
"5652",
"13940",
"9374",
"8907",
"8312",
"9902",
"11235",
"2419",
"5601",
"1107",
"1024",
"7879",
"5194",
"6949",
"11179",
"1607",
"5829",
"5856",
"15203",
"10396",
"932",
"264",
"2525",
"13237",
"12627",
"10661",
"8260",
"8186",
"6836",
"2971",
"281",
"2053",
"12010",
"13858",
"15205",
"3443",
"5074",
"12368",
"15329",
"1313",
"4446",
"5780",
"10797",
"886",
"12902",
"14854",
"10186",
"1840",
"10884",
"644",
"1680",
"14868",
"7143",
"13385",
"15040",
"1219",
"4351",
"13243",
"8631",
"11907",
"8896",
"12567",
"4709",
"10947",
"1610",
"6629",
"10953",
"3128",
"3222",
"4435",
"6992",
"10795",
"7914",
"5467",
"11903",
"15200",
"2132",
"14074",
"13468",
"1996",
"3",
"5823",
"1666",
"7686",
"12743",
"3220",
"9448",
"7926",
"6203",
"10138",
"6663",
"794",
"4320",
"9570",
"10448",
"12000",
"2459",
"10268",
"9082",
"6984",
"8127",
"5702",
"15060",
"4171",
"15443",
"4665",
"3849",
"6687",
"12792",
"2915",
"10764",
"4718",
"1303",
"5140",
"6369",
"13931",
"13790",
"3865",
"1595",
"5265",
"7762",
"8698",
"15021",
"3909",
"6417",
"8065",
"12347",
"7959",
"6269",
"4896",
"13339",
"4253",
"10813",
"13671",
"3935",
"11848",
"11707",
"7838",
"5332",
"13607",
"6723",
"8768",
"12883",
"6378",
"13401",
"15163",
"12650",
"10882",
"3206",
"15415",
"3963",
"12458",
"1841",
"2959",
"7598",
"10836",
"6062",
"5204",
"4216",
"11165",
"10399",
"2545",
"3620",
"4226",
"12874",
"13586",
"8755",
"2839",
"12213",
"2761",
"3426",
"3119",
"2819",
"8853",
"7316",
"10547",
"8499",
"9596",
"6039",
"9044",
"2290",
"10783",
"7789",
"15133",
"8128",
"2522",
"4308",
"4249",
"8965",
"8866",
"3744",
"2372",
"15296",
"11130",
"4042",
"12949",
"12885",
"14027",
"14742",
"15101",
"2604",
"9081",
"13923",
"4215",
"12878",
"9965",
"159",
"5480",
"2507",
"9059",
"10210",
"13268",
"1911",
"10061",
"7997",
"7332",
"14663",
"639",
"3879",
"6211",
"930",
"11428",
"2137",
"2103",
"197",
"10266",
"11391",
"6192",
"6350",
"8837",
"13224",
"2483",
"10110",
"3187",
"9291",
"15228",
"1513",
"8076",
"14639",
"9498",
"10220",
"13699",
"14391",
"11277",
"11555",
"2457",
"4560",
"10644",
"15033",
"10705",
"10431",
"15094",
"15317",
"13960",
"12120",
"10034",
"13054",
"4281",
"7013",
"13032",
"1538",
"14456",
"6081",
"6045",
"3603",
"5821",
"226",
"2713",
"6255",
"1868",
"5183",
"15368",
"10815",
"9482",
"3516",
"9605",
"15493",
"12206",
"2655",
"9883",
"9820",
"6207",
"7033",
"12428",
"2409",
"8506",
"8500",
"8981",
"1079",
"7595",
"2031",
"10985",
"15145",
"3405",
"3518",
"12200",
"6510",
"3294",
"5209",
"13614",
"3492",
"15271",
"2380",
"6790",
"10246",
"13060",
"9958",
"7604",
"3440",
"4245",
"4719",
"13293",
"12285",
"5658",
"9994",
"5842",
"9091",
"7954",
"7984",
"15310",
"2383",
"3578",
"5417",
"11794",
"14957",
"7722",
"918",
"11377",
"7382",
"7704",
"1358",
"14289",
"8265",
"1809",
"2479",
"7064",
"7089",
"13258",
"5228",
"11894",
"601",
"10567",
"4611",
"8892",
"15043",
"5231",
"6268",
"11327",
"7204",
"5637",
"14031",
"4321",
"1756",
"2126",
"8725",
"5578",
"7451",
"2412",
"12909",
"749",
"10042",
"10228",
"3264",
"11958",
"10327",
"1065",
"1076",
"4145",
"9814",
"7977",
"11558",
"5401",
"10593",
"11016",
"2381",
"8829",
"12683",
"12897",
"12562",
"6173",
"8940",
"8441",
"810",
"13697",
"13230",
"10419",
"398",
"10297",
"4654",
"3356",
"9806",
"3646",
"4123",
"12128",
"9681",
"8740",
"5033",
"3188",
"5801",
"2835",
"5363",
"5039",
"11987",
"4160",
"15078",
"4305",
"15496",
"7296",
"6785",
"9499",
"13624",
"4054",
"5713",
"14360",
"5944",
"10529",
"13154",
"10903",
"8979",
"7969",
"12481",
"6540",
"14223",
"11367",
"10003",
"12012",
"12584",
"11231",
"9756",
"7436",
"2154",
"15148",
"403",
"7097",
"12278",
"3576",
"3883",
"3513",
"15358",
"13024",
"7432",
"9572",
"5692",
"14344",
"9822",
"11030",
"9470",
"10819",
"9753",
"9375",
"7653",
"11494",
"11644",
"4681",
"10768",
"14889",
"4148",
"6811",
"4132",
"5331",
"12570",
"14305",
"11437",
"12820",
"14293",
"1902",
"13030",
"4954",
"12703",
"6817",
"10294",
"9884",
"14308",
"10334",
"14795",
"7818",
"9189",
"2318",
"8164",
"1836",
"10224",
"4247",
"3394",
"15342",
"11528",
"3081",
"1781",
"9209",
"9700",
"6965",
"13029",
"3562",
"15216",
"4581",
"14916",
"14359",
"5283",
"5361",
"6834",
"8752",
"8804",
"8193",
"8930",
"12806",
"4788",
"13371",
"6336",
"9469",
"14362",
"8958",
"2362",
"204",
"9521",
"3965",
"8480",
"10986",
"2940",
"11359",
"4186",
"8574",
"12722",
"3382",
"5004",
"12782",
"8625",
"6434",
"1759",
"13525",
"3502",
"15325",
"10745",
"14483",
"14645",
"14002",
"2599",
"10291",
"6113",
"1587",
"2589",
"1957",
"12894",
"10469",
"10179",
"8935",
"10686",
"3080",
"13111",
"13587",
"6853",
"1133",
"8247",
"5907",
"13603",
"1897",
"9185",
"5511",
"12136",
"3782",
"1591",
"11540",
"6670",
"7708",
"15366",
"14345",
"7885",
"12437",
"4097",
"6624",
"7855",
"13332",
"12069",
"13771",
"11440",
"14762",
"1448",
"6528",
"12558",
"6752",
"5562",
"4844",
"3214",
"6944",
"4966",
"1264",
"14216",
"11088",
"9514",
"9692",
"5400",
"14845",
"10272",
"13147",
"14941",
"1141",
"10242",
"14029",
"11094",
"2190",
"12392",
"11568",
"5760",
"1785",
"12028",
"6753",
"323",
"15408",
"5126",
"8497",
"12586",
"8115",
"8437",
"98",
"3809",
"9740",
"14530",
"10271",
"11159",
"316",
"9029",
"5973",
"5961",
"8202",
"8836",
"15355",
"2062",
"7457",
"4861",
"7578",
"2044",
"13420",
"13984",
"10944",
"14070",
"15386",
"5206",
"14463",
"4752",
"14829",
"9790",
"9125",
"5921",
"12895",
"13870",
"1546",
"6740",
"12613",
"2140",
"15150",
"8448",
"6678",
"6449",
"15213",
"7511",
"22",
"10825",
"4739",
"1034",
"3876",
"3418",
"2466",
"15389",
"2347",
"14535",
"8151",
"6634",
"4999",
"6800",
"15073",
"14351",
"10698",
"4735",
"1547",
"4480",
"1335",
"9999",
"13893",
"10514",
"198",
"9705",
"8496",
"2223",
"14311",
"6971",
"9290",
"14410",
"2149",
"10853",
"4612",
"14056",
"2443",
"1703",
"5599",
"1499",
"3522",
"14143",
"3122",
"6722",
"15497",
"2200",
"9587",
"8516",
"11635",
"12763",
"14338",
"5150",
"13041",
"11974",
"3135",
"9223",
"7590",
"6792",
"630",
"9547",
"12797",
"13207",
"7282",
"5582",
"13956",
"1392",
"8117",
"14884",
"14380",
"7262",
"11080",
"13872",
"14434",
"11148",
"13464",
"12964",
"12890",
"6542",
"15263",
"6025",
"7217",
"3126",
"4466",
"9751",
"6804",
"5934",
"15111",
"8329",
"1097",
"4225",
"13131",
"169",
"11147",
"12615",
"13392",
"5548",
"7024",
"14585",
"243",
"7836",
"4292",
"10252",
"12109",
"3022",
"9182",
"10885",
"11237",
"6372",
"12030",
"15298",
"3133",
"2931",
"10215",
"15421",
"1202",
"8071",
"13167",
"6423",
"2892",
"8276",
"15059",
"15303",
"11136",
"6065",
"13366",
"8205",
"14545",
"7389",
"1481",
"1366",
"12398",
"12042",
"13502",
"11765",
"1154",
"1378",
"14335",
"260",
"15110",
"8743",
"5474",
"15295",
"9307",
"12195",
"6989",
"5539",
"2086",
"2965",
"5399",
"10930",
"3982",
"11152",
"1777",
"4384",
"8826",
"14423",
"11899",
"13764",
"14115",
"1447",
"14467",
"4239",
"4734",
"11099",
"1007",
"12551",
"5642",
"4346",
"558",
"7990",
"3362",
"12851",
"9787",
"5635",
"7442",
"4600",
"206",
"5666",
"5282",
"11069",
"15225",
"12064",
"11625",
"13964",
"13755",
"7904",
"4231",
"5991",
"10995",
"8773",
"6970",
"10809",
"14637",
"5878",
"13701",
"10239",
"6909",
"13180",
"9010",
"8464",
"5123",
"8927",
"5916",
"8820",
"7548",
"6769",
"265",
"1359",
"11490",
"7530",
"10618",
"4561",
"4960",
"10198",
"5849",
"14441",
"4241",
"9597",
"11195",
"8756",
"10058",
"14379",
"3815",
"267",
"14358",
"5792",
"14926",
"6032",
"5794",
"3234",
"8659",
"10285",
"15280",
"7839",
"14856",
"13704",
"6208",
"8807",
"7532",
"4112",
"499",
"7535",
"3871",
"8767",
"11197",
"737",
"4375",
"3101",
"3414",
"14920",
"1491",
"2872",
"1339",
"10890",
"328",
"11301",
"3061",
"14573",
"1543",
"4852",
"9249",
"3257",
"2549",
"1966",
"9591",
"8131",
"6905",
"13696",
"9018",
"5948",
"3915",
"12467",
"14204",
"1559",
"1249",
"7550",
"8291",
"13766",
"5960",
"11075",
"9645",
"9575",
"13122",
"11183",
"9334",
"436",
"5051",
"13295",
"11949",
"513",
"8733",
"7407",
"9093",
"11140",
"13891",
"4101",
"3527",
"12236",
"10828",
"40",
"2494",
"14670",
"8906",
"8923",
"7242",
"14820",
"9195",
"6360",
"14318",
"15104",
"7755",
"9052",
"5612",
"14536",
"9899",
"6886",
"7367",
"11643",
"8630",
"15282",
"4976",
"9607",
"5060",
"3262",
"6981",
"6728",
"5095",
"13242",
"4986",
"1843",
"9532",
"13555",
"6447",
"10330",
"6781",
"4754",
"12023",
"6299",
"10871",
"4491",
"377",
"3437",
"1121",
"1512",
"11330",
"1853",
"10360",
"4547",
"11768",
"12406",
"3724",
"3402",
"8051",
"11361",
"1498",
"11255",
"7823",
"9813",
"7346",
"12388",
"4934",
"13405",
"14700",
"1229",
"7427",
"6848",
"6249",
"13778",
"13193",
"9923",
"6959",
"4348",
"10950",
"13353",
"5703",
"8245",
"4696",
"10424",
"3211",
"14537",
"1999",
"2720",
"15384",
"6681",
"15048",
"11994",
"13073",
"1197",
"11862",
"12395",
"231",
"14369",
"11947",
"11674",
"14282",
"11345",
"9008",
"12394",
"9210",
"5969",
"2454",
"2632",
"10614",
"1385",
"13175",
"11884",
"9810",
"8295",
"2433",
"2310",
"13281",
"12365",
"13792",
"6500",
"842",
"677",
"13750",
"12992",
"3914",
"5529",
"12830",
"8706",
"3099",
"7562",
"1525",
"13118",
"2806",
"7890",
"7680",
"116",
"3984",
"9997",
"15490",
"3301",
"11887",
"10028",
"11084",
"14501",
"11513",
"14466",
"4967",
"10566",
"3685",
"7113",
"12280",
"13386",
"15431",
"11931",
"6218",
"6195",
"10991",
"6517",
"7081",
"11365",
"960",
"9939",
"1254",
"11074",
"68",
"770",
"1273",
"3284",
"5586",
"9243",
"14209",
"4230",
"3532",
"15062",
"3649",
"437",
"12002",
"189",
"6298",
"10505",
"439",
"11282",
"2369",
"7866",
"2235",
"14508",
"1732",
"7360",
"10957",
"3788",
"3789",
"10349",
"3258",
"7514",
"14462",
"14207",
"2396",
"5832",
"9909",
"5197",
"4248",
"9245",
"10581",
"14450",
"5888",
"15441",
"9462",
"3733",
"6225",
"8021",
"13660",
"10523",
"10597",
"10432",
"7585",
"5626",
"2645",
"6573",
"61",
"8972",
"1886",
"14575",
"1689",
"9731",
"2391",
"13427",
"10470",
"13408",
"4949",
"12772",
"1168",
"8239",
"13015",
"4410",
"463",
"15004",
"4373",
"1274",
"713",
"6432",
"941",
"5133",
"12924",
"12266",
"10129",
"8136",
"8112",
"11062",
"13064",
"13673",
"3041",
"5473",
"7607",
"1160",
"3315",
"13495",
"1898",
"1973",
"13323",
"660",
"901",
"14514",
"8204",
"375",
"13253",
"12557",
"5090",
"9446",
"6068",
"12417",
"15195",
"8343",
"2602",
"1356",
"13274",
"10371",
"6021",
"5385",
"13537",
"10552",
"1994",
"12951",
"10850",
"13019",
"7122",
"9050",
"13066",
"14049",
"12400",
"14030",
"15034",
"6567",
"10157",
"13048",
"10901",
"2088",
"4860",
"15437",
"15347",
"5583",
"8862",
"3607",
"6646",
"12217",
"5327",
"15311",
"3664",
"12728",
"7812",
"8988",
"7841",
"5456",
"1614",
"13532",
"9950",
"7402",
"35",
"12150",
"4845",
"11488",
"4585",
"11774",
"8824",
"9781",
"7522",
"3494",
"5348",
"6014",
"5805",
"4980",
"13369",
"3706",
"1077",
"11952",
"9416",
"9164",
"8856",
"3353",
"14922",
"9344",
"8684",
"8063",
"7251",
"6871",
"15186",
"9370",
"10068",
"7741",
"5044",
"14427",
"14464",
"8019",
"3303",
"1649",
"5335",
"4252",
"12233",
"9221",
"2814",
"5886",
"7326",
"3956",
"148",
"6618",
"13493",
"4956",
"13920",
"1199",
"1382",
"7079",
"5324",
"13509",
"13456",
"10162",
"12133",
"14346",
"8970",
"9585",
"8014",
"2597",
"14236",
"6788",
"1058",
"14178",
"702",
"13590",
"15385",
"1620",
"4593",
"13291",
"935",
"12265",
"6361",
"6626",
"2646",
"2402",
"8451",
"9379",
"6443",
"2407",
"12094",
"1588",
"4804",
"2376",
"12048",
"9430",
"6307",
"12308",
"8877",
"15065",
"4902",
"15247",
"14662",
"13311",
"13703",
"3626",
"12026",
"7759",
"3763",
"14040",
"2816",
"8812",
"8992",
"6808",
"7913",
"9793",
"2009",
"866",
"11053",
"9450",
"13807",
"8886",
"11925",
"14381",
"471",
"5185",
"2968",
"12421",
"1842",
"10201",
"3199",
"3379",
"9359",
"10879",
"8718",
"11460",
"13415",
"14078",
"8022",
"66",
"13112",
"8224",
"4366",
"5585",
"12735",
"1395",
"13189",
"14997",
"4909",
"15026",
"5605",
"6309",
"9341",
"9516",
"2097",
"6320",
"10669",
"8590",
"4165",
"5517",
"5748",
"1086",
"14191",
"9879",
"15483",
"1630",
"1088",
"15281",
"2469",
"13496",
"3076",
"8122",
"1224",
"11220",
"1580",
"14665",
"306",
"5334",
"308",
"9400",
"10206",
"13388",
"8352",
"4133",
"12596",
"14876",
"120",
"9970",
"8833",
"10652",
"15381",
"5924",
"11318",
"9112",
"5602",
"13834",
"7920",
"5458",
"2896",
"12832",
"11832",
"11026",
"4166",
"5910",
"5167",
"2822",
"15351",
"15470",
"5862",
"10119",
"14403",
"610",
"9413",
"7001",
"10774",
"5393",
"2958",
"11791",
"14830",
"951",
"13637",
"10183",
"4282",
"8961",
"4942",
"13695",
"1459",
"5683",
"7965",
"11337",
"7771",
"3823",
"14024",
"8417",
"14668",
"5429",
"14265",
"5217",
"1606",
"1950",
"13063",
"8792",
"11932",
"1797",
"4990",
"11587",
"1692",
"2823",
"14167",
"15142",
"15140",
"9767",
"14601",
"8572",
"1676",
"15354",
"8150",
"15130",
"4872",
"9",
"1810",
"3859",
"412",
"9031",
"3018",
"5565",
"11581",
"14574",
"683",
"3837",
"7086",
"6386",
"475",
"10648",
"5006",
"15211",
"7989",
"7114",
"10102",
"7951",
"7075",
"12711",
"11580",
"2871",
"1484",
"5927",
"6602",
"14444",
"8172",
"11787",
"13136",
"5809",
"4449",
"2802",
"11487",
"9921",
"5410",
"14996",
"7544",
"7908",
"10773",
"2414",
"7433",
"13864",
"13465",
"4291",
"2949",
"3713",
"7164",
"914",
"181",
"13546",
"3868",
"14156",
"7447",
"13298",
"11979",
"10100",
"637",
"12393",
"13875",
"14012",
"371",
"5749",
"6932",
"5693",
"211",
"3624",
"5459",
"650",
"3290",
"482",
"4885",
"14987",
"2963",
"6807",
"15153",
"12988",
"3633",
"2251",
"592",
"5230",
"705",
"7782",
"1204",
"2261",
"5160",
"8898",
"7092",
"14927",
"727",
"8606",
"289",
"9808",
"3050",
"593",
"8530",
"14588",
"13174",
"8157",
"3318",
"4523",
"1979",
"13091",
"5607",
"10115",
"13373",
"12465",
"2349",
"13666",
"10355",
"14787",
"9040",
"10188",
"9465",
"3348",
"6009",
"7238",
"12385",
"988",
"8077",
"12708",
"8704",
"15245",
"3591",
"6130",
"7941",
"9404",
"9639",
"11512",
"4692",
"392",
"12071",
"5861",
"5307",
"3613",
"472",
"4755",
"1187",
"14797",
"15429",
"12611",
"5243",
"144",
"6295",
"13606",
"7931",
"9580",
"14779",
"813",
"6923",
"1471",
"3204",
"4661",
"9567",
"5614",
"15084",
"13050",
"395",
"1991",
"13712",
"8214",
"8087",
"13796",
"2102",
"9352",
"8266",
"14772",
"5297",
"4548",
"14255",
"12597",
"14682",
"2311",
"410",
"12442",
"2904",
"15208",
"8676",
"10651",
"9270",
"789",
"11633",
"9501",
"6708",
"14432",
"5370",
"14511",
"2854",
"9840",
"3826",
"2276",
"12981",
"12345",
"1214",
"8333",
"14612",
"2468",
"5455",
"11421",
"7673",
"5266",
"14251",
"4579",
"33",
"5997",
"14885",
"3960",
"15135",
"6577",
"13421",
"2612",
"3059",
"1423",
"10303",
"13523",
"9949",
"9693",
"7281",
"3341",
"10654",
"15036",
"6376",
"271",
"10533",
"11422",
"12004",
"13625",
"5724",
"8622",
"3373",
"13527",
"12116",
"9969",
"13791",
"5214",
"4555",
"8903",
"13686",
"15182",
"4640",
"11112",
"6555",
"8185",
"4673",
"12264",
"14177",
"14550",
"8527",
"8231",
"11057",
"12163",
"8495",
"4408",
"3821",
"8673",
"3252",
"9975",
"9386",
"10710",
"14221",
"730",
"7387",
"9347",
"10794",
"13987",
"1857",
"3774",
"3828",
"6280",
"14457",
"11502",
"14765",
"14561",
"8650",
"5624",
"3680",
"13150",
"11211",
"6209",
"430",
"12146",
"9408",
"6778",
"5295",
"2167",
"3302",
"9777",
"12509",
"12155",
"11890",
"6491",
"12969",
"6612",
"4521",
"11456",
"10054",
"545",
"2307",
"3342",
"9744",
"6481",
"9991",
"8191",
"9678",
"2777",
"15304",
"10196",
"11239",
"1556",
"1912",
"12325",
"2326",
"9319",
"11743",
"5419",
"13198",
"7464",
"1210",
"14651",
"13052",
"1613",
"4110",
"5442",
"3157",
"13068",
"7790",
"13130",
"6947",
"10565",
"3814",
"10240",
"14035",
"7924",
"2537",
"5477",
"12239",
"585",
"9988",
"12811",
"6114",
"3366",
"10765",
"11864",
"11585",
"880",
"9157",
"12899",
"8868",
"2716",
"4213",
"8995",
"9903",
"5826",
"11719",
"6106",
"3800",
"11228",
"15487",
"5553",
"3226",
"76",
"12511",
"1955",
"14067",
"15017",
"9287",
"5968",
"6603",
"13992",
"5139",
"2155",
"9389",
"9009",
"10594",
"2698",
"8389",
"11313",
"1138",
"10176",
"4024",
"12008",
"11588",
"7963",
"6838",
"874",
"15398",
"3397",
"9042",
"2824",
"7051",
"5041",
"1444",
"8760",
"11611",
"1108",
"2192",
"10693",
"6196",
"7893",
"10679",
"13478",
"14260",
"6505",
"2101",
"10612",
"1379",
"10817",
"14754",
"7277",
"2450",
"12798",
"6509",
"2432",
"13115",
"7729",
"6439",
"9913",
"11733",
"3253",
"11481",
"402",
"14810",
"13575",
"1528",
"6277",
"8133",
"10601",
"14347",
"9832",
"1218",
"14320",
"2654",
"5898",
"8103",
"5137",
"14080",
"14373",
"13187",
"903",
"14093",
"11634",
"15185",
"2130",
"12292",
"14493",
"11403",
"12282",
"2519",
"9851",
"10927",
"7135",
"5246",
"7292",
"10274",
"4618",
"2804",
"152",
"14471",
"2669",
"6522",
"3697",
"7889",
"8158",
"3043",
"6939",
"10057",
"2635",
"8440",
"14606",
"10254",
"6735",
"6707",
"15416",
"7228",
"2827",
"10562",
"14149",
"518",
"7531",
"7415",
"5567",
"6598",
"4897",
"7200",
"5577",
"12494",
"4084",
"9669",
"10286",
"7857",
"4540",
"3977",
"7241",
"10164",
"3551",
"12862",
"10473",
"15387",
"14356",
"10641",
"2342",
"5721",
"8842",
"4981",
"2426",
"10916",
"6666",
"13996",
"12241",
"13829",
"12604",
"1488",
"13981",
"3120",
"15051",
"4397",
"14184",
"131",
"10333",
"5796",
"13075",
"11329",
"3725",
"3568",
"13099",
"4823",
"7628",
"6157",
"4831",
"15456",
"2384",
"817",
"5740",
"6575",
"12867",
"715",
"9675",
"3604",
"12800",
"5942",
"1149",
"10172",
"4965",
"4716",
"2510",
"15256",
"4385",
"14621",
"1084",
"7668",
"1989",
"1959",
"7411",
"64",
"250",
"2337",
"10283",
"12517",
"12809",
"8472",
"3609",
"426",
"3983",
"7284",
"5893",
"5787",
"5267",
"14893",
"5756",
"13251",
"11003",
"12268",
"1929",
"966",
"14148",
"10739",
"4506",
"12759",
"12355",
"2607",
"11883",
"13469",
"10480",
"1234",
"13845",
"10857",
"13072",
"10821",
"7437",
"8602",
"4091",
"9987",
"5379",
"12783",
"3888",
"4192",
"413",
"9716",
"5996",
"7130",
"13865",
"14976",
"14913",
"6379",
"12234",
"1131",
"3278",
"14218",
"1665",
"10993",
"11173",
"1461",
"7912",
"3496",
"8010",
"4034",
"14518",
"6053",
"7568",
"12870",
"15378",
"8271",
"8805",
"13062",
"2164",
"446",
"3051",
"1421",
"12194",
"6580",
"1600",
"9069",
"2052",
"3372",
"6178",
"11054",
"6982",
"78",
"5917",
"1013",
"5073",
"13438",
"6079",
"9623",
"1281",
"10273",
"8237",
"10580",
"4789",
"3444",
"12968",
"5438",
"14860",
"1641",
"15067",
"10331",
"8414",
"5457",
"12225",
"2828",
"13667",
"1806",
"14553",
"974",
"6548",
"300",
"13786",
"8412",
"3369",
"11201",
"7177",
"8391",
"6366",
"9854",
"6911",
"8967",
"14504",
"14892",
"11303",
"13640",
"9936",
"10376",
"13317",
"14366",
"14397",
"4345",
"5159",
"13260",
"10931",
"10465",
"523",
"12775",
"3040",
"4113",
"15198",
"7994",
"10858",
"2501",
"11158",
"5830",
"5287",
"7541",
"4147",
"3033",
"14970",
"13168",
"14398",
"12943",
"3641",
"9397",
"3858",
"15318",
"5478",
"3852",
"11562",
"11380",
"3042",
"7046",
"12027",
"14852",
"13835",
"5646",
"3589",
"7090",
"2837",
"10484",
"2800",
"6174",
"1862",
"11527",
"15210",
"7775",
"6206",
"13375",
"5532",
"13290",
"13820",
"11266",
"9724",
"6706",
"6246",
"14710",
"12630",
"759",
"11259",
"13074",
"11735",
"13698",
"9434",
"5372",
"15243",
"13952",
"11943",
"2997",
"13391",
"3797",
"10460",
"14303",
"8055",
"14778",
"13354",
"5764",
"8643",
"183",
"13226",
"3386",
"12203",
"3138",
"5398",
"11417",
"11093",
"12583",
"6064",
"3880",
"14052",
"1744",
"3190",
"10002",
"9197",
"9907",
"1342",
"11904",
"5864",
"13604",
"11477",
"8240",
"10582",
"13374",
"8324",
"12995",
"12276",
"5023",
"5016",
"6677",
"5092",
"15119",
"8272",
"6948",
"11697",
"6647",
"5839",
"9758",
"14827",
"11963",
"4087",
"9603",
"6688",
"13212",
"14532",
"13234",
"4477",
"9517",
"2956",
"481",
"6061",
"9100",
"12204",
"10493",
"12819",
"3590",
"3682",
"11552",
"6018",
"6609",
"10724",
"14571",
"4418",
"101",
"4757",
"1092",
"8916",
"2815",
"4933",
"6596",
"7703",
"4880",
"2898",
"8228",
"2179",
"4431",
"10321",
"2917",
"14296",
"6171",
"3140",
"927",
"9480",
"15279",
"11276",
"8267",
"3856",
"12903",
"7608",
"4505",
"13947",
"8663",
"10279",
"1921",
"7032",
"10936",
"14213",
"9477",
"11627",
"7419",
"13584",
"10625",
"15435",
"12526",
"13248",
"4237",
"7565",
"10030",
"11834",
"4182",
"6317",
"14165",
"1285",
"11938",
"1450",
"5189",
"12578",
"14170",
"1565",
"1155",
"12531",
"6322",
"9979",
"11461",
"9038",
"5428",
"7178",
"2778",
"13500",
"2241",
"8973",
"10877",
"11346",
"12675",
"13164",
"891",
"12289",
"736",
"15049",
"10749",
"1194",
"13656",
"4257",
"497",
"8423",
"6916",
"8119",
"8952",
"6279",
"5304",
"8463",
"7921",
"7286",
"11251",
"3286",
"12691",
"14169",
"15419",
"14633",
"6352",
"84",
"1904",
"758",
"10314",
"14405",
"2639",
"5314",
"7750",
"1768",
"1090",
"3663",
"6236",
"14254",
"11651",
"615",
"4558",
"5538",
"7515",
"4314",
"11256",
"7824",
"6149",
"2428",
"2162",
"4211",
"3861",
"3058",
"3296",
"12945",
"12039",
"13562",
"14309",
"11865",
"1760",
"1827",
"14704",
"6355",
"13247",
"12796",
"8891",
"5704",
"9652",
"4944",
"12740",
"6424",
"8690",
"15446",
"8396",
"5056",
"5470",
"7587",
"6969",
"11909",
"2403",
"7409",
"6303",
"369",
"1300",
"3583",
"11826",
"4362",
"11570",
"12159",
"14015",
"10180",
"7291",
"1221",
"14971",
"6604",
"15023",
"3179",
"14494",
"11579",
"14878",
"10977",
"6637",
"1736",
"9158",
"4187",
"15054",
"14559",
"15424",
"6712",
"15093",
"8360",
"14720",
"10259",
"15196",
"14774",
"8367",
"8098",
"13296",
"899",
"827",
"4416",
"1568",
"7458",
"8082",
"5686",
"910",
"12502",
"6194",
"4900",
"5876",
"3108",
"14205",
"4887",
"8811",
"2979",
"13312",
"8850",
"5298",
"3274",
"1938",
"13727",
"14210",
"218",
"7370",
"12686",
"2188",
"12022",
"838",
"6359",
"4142",
"5226",
"12024",
"15289",
"15411",
"6632",
"10237",
"1972",
"11263",
"13662",
"293",
"14899",
"13031",
"7102",
"13433",
"13108",
"3619",
"1700",
"1442",
"2236",
"11181",
"5284",
"10497",
"8017",
"8162",
"14556",
"14577",
"8681",
"1494",
"9053",
"13948",
"4753",
"3313",
"1583",
"1258",
"14136",
"13219",
"14734",
"5035",
"10537",
"418",
"6689",
"11441",
"830",
"9865",
"3785",
"823",
"5186",
"8307",
"4613",
"1887",
"7744",
"6974",
"5188",
"13412",
"15082",
"5156",
"8340",
"83",
"232",
"8873",
"3684",
"1839",
"12305",
"10655",
"6693",
"9850",
"12791",
"6343",
"614",
"4151",
"15309",
"13517",
"2677",
"10316",
"1056",
"1148",
"10888",
"8616",
"10091",
"10495",
"12692",
"13902",
"4842",
"3489",
"7552",
"273",
"11597",
"6921",
"7489",
"3468",
"13003",
"7783",
"11076",
"6421",
"12301",
"834",
"8664",
"2582",
"6200",
"11672",
"2182",
"14273",
"14334",
"5847",
"8888",
"15452",
"103",
"3873",
"2329",
"2780",
"310",
"5909",
"8759",
"7906",
"7787",
"158",
"11312",
"1584",
"7070",
"8701",
"7779",
"13885",
"2899",
"5926",
"6472",
"5676",
"7414",
"10682",
"9200",
"8277",
"3523",
"11977",
"10207",
"12455",
"12827",
"809",
"205",
"13472",
"11092",
"2014",
"1882",
"13589",
"15174",
"2159",
"1350",
"2836",
"277",
"5685",
"14590",
"3530",
"8605",
"5731",
"2209",
"6347",
"1737",
"5308",
"6877",
"597",
"8399",
"7726",
"595",
"4059",
"15066",
"340",
"14654",
"6250",
"8199",
"3064",
"2320",
"819",
"8538",
"5941",
"2565",
"10585",
"607",
"14841",
"11132",
"10596",
"224",
"2059",
"11792",
"199",
"14252",
"1830",
"9800",
"6591",
"12477",
"2299",
"3297",
"9039",
"10579",
"11408",
"15395",
"8426",
"3961",
"10642",
"12486",
"14046",
"5447",
"7461",
"9881",
"7028",
"4871",
"11306",
"10796",
"8182",
"5943",
"4210",
"15252",
"606",
"12497",
"4761",
"8263",
"2708",
"9954",
"4460",
"7356",
"10070",
"14781",
"13244",
"1681",
"12343",
"6529",
"8137",
"13979",
"15402",
"6377",
"14097",
"2844",
"8839",
"11599",
"9749",
"4298",
"11070",
"4701",
"9560",
"6301",
"5795",
"9218",
"863",
"3259",
"5155",
"5714",
"10728",
"1117",
"14461",
"6097",
"15356",
"12545",
"173",
"14117",
"4352",
"14314",
"9760",
"14840",
"6574",
"1729",
"7753",
"6213",
"10066",
"9107",
"4567",
"2225",
"12003",
"3718",
"11083",
"1919",
"1834",
"3792",
"13959",
"8870",
"6758",
"12361",
"4141",
"9439",
"6143",
"1623",
"7899",
"3376",
"13302",
"1475",
"12676",
"13269",
"3621",
"6520",
"10299",
"8477",
"5005",
"1782",
"12157",
"9812",
"11399",
"1829",
"452",
"12286",
"5522",
"2156",
"7364",
"9283",
"4000",
"12854",
"225",
"6898",
"847",
"11278",
"7832",
"3263",
"9864",
"15122",
"1530",
"10893",
"14124",
"4785",
"6227",
"5443",
"4849",
"13895",
"10134",
"14386",
"12750",
"126",
"14656",
"10972",
"5306",
"5697",
"5891",
"11041",
"1515",
"12411",
"14659",
"2098",
"5010",
"4086",
"4467",
"6204",
"160",
"4364",
"7466",
"2579",
"3497",
"6862",
"3491",
"14939",
"7285",
"6050",
"11376",
"8344",
"1673",
"5959",
"11442",
"361",
"8703",
"3026",
"15413",
"13330",
"3535",
"258",
"14229",
"3024",
"4040",
"5328",
"3756",
"8143",
"9880",
"14875",
"6741",
"11788",
"4693",
"6474",
"12124",
"15323",
"3569",
"15166",
"12127",
"4051",
"10755",
"9657",
"13961",
"3224",
"6691",
"4436",
"5746",
"4805",
"9304",
"9174",
"5270",
"5606",
"5992",
"11630",
"6393",
"11722",
"11836",
"4522",
"14392",
"10027",
"5640",
"13126",
"14844",
"1723",
"11853",
"6584",
"13761",
"11808",
"3585",
"1800",
"7126",
"603",
"1653",
"15379",
"5885",
"2774",
"5557",
"1941",
"12852",
"7734",
"11654",
"15116",
"13777",
"5773",
"14837",
"9937",
"10538",
"14908",
"3775",
"14896",
"13266",
"11822",
"3293",
"6238",
"744",
"4839",
"2928",
"14234",
"7859",
"177",
"8195",
"10969",
"5542",
"13547",
"3934",
"1153",
"6237",
"10326",
"2002",
"13423",
"14374",
"5563",
"14438",
"14910",
"6302",
"14053",
"4573",
"9013",
"9727",
"3086",
"3191",
"13082",
"12148",
"4525",
"4260",
"7334",
"11673",
"14823",
"1319",
"13179",
"6373",
"6341",
"15339",
"9328",
"8105",
"2393",
"12733",
"14707",
"13692",
"2504",
"3549",
"10148",
"6445",
"10156",
"296",
"2787",
"7328",
"15361",
"1080",
"13358",
"500",
"13310",
"14180",
"7910",
"1924",
"803",
"8160",
"12837",
"7903",
"7078",
"6042",
"5430",
"8114",
"11897",
"10146",
"6999",
"6613",
"3338",
"14017",
"6719",
"5068",
"14513",
"4462",
"11101",
"6329",
"7483",
"848",
"15468",
"8384",
"1908",
"5220",
"769",
"6286",
"13461",
"13795",
"14280",
"2288",
"7303",
"14482",
"2770",
"12936",
"11665",
"12528",
"14867",
"9729",
"1293",
"13499",
"2128",
"1923",
"6825",
"11752",
"2135",
"4297",
"13521",
"12346",
"8937",
"8515",
"2826",
"7627",
"4971",
"741",
"2943",
"6235",
"7636",
"14931",
"13367",
"8443",
"3420",
"8736",
"5592",
"15405",
"9695",
"1748",
"13508",
"9156",
"961",
"10248",
"12901",
"10568",
"12250",
"12457",
"3807",
"5273",
"9641",
"7116",
"7011",
"2628",
"13772",
"9733",
"12764",
"2142",
"10486",
"10709",
"1181",
"4890",
"6380",
"5152",
"7295",
"8694",
"1040",
"9862",
"12920",
"14054",
"14581",
"5227",
"613",
"14631",
"14964",
"3698",
"1771",
"4200",
"11982",
"10142",
"9121",
"6403",
"8635",
"11022",
"2150",
"2642",
"6894",
"12559",
"10455",
"12841",
"7581",
"4664",
"12549",
"480",
"9914",
"11594",
"1489",
"14610",
"8567",
"7257",
"9655",
"7343",
"3808",
"3692",
"4776",
"483",
"6463",
"7039",
"5043",
"6418",
"6571",
"8171",
"7117",
"2339",
"12813",
"13347",
"989",
"7993",
"10464",
"1445",
"4801",
"11983",
"8107",
"9094",
"6243",
"4746",
"6943",
"6260",
"2104",
"13341",
"10435",
"10519",
"10845",
"13912",
"5024",
"14048",
"11968",
"504",
"14744",
"14721",
"4508",
"14874",
"2215",
"7496",
"9774",
"15131",
"11219",
"15440",
"1226",
"11524",
"3710",
"13114",
"5464",
"8671",
"2333",
"13512",
"10023",
"6471",
"12510",
"7869",
"1469",
"8355",
"1402",
"13675",
"5436",
"8593",
"6486",
"8753",
"7314",
"2610",
"1599",
"7723",
"5793",
"1139",
"14725",
"123",
"14343",
"4903",
"14947",
"14796",
"2743",
"11291",
"11607",
"6743",
"1046",
"15127",
"3739",
"4501",
"11446",
"14855",
"9671",
"7614",
"13560",
"2753",
"6749",
"7202",
"12552",
"7806",
"2926",
"7947",
"6918",
"949",
"8060",
"3701",
"13978",
"8153",
"11452",
"4652",
"6846",
"1560",
"14130",
"9051",
"14592",
"9035",
"2930",
"5038",
"8647",
"6506",
"12073",
"8596",
"5175",
"10908",
"757",
"1604",
"8469",
"6323",
"13406",
"8665",
"5883",
"8180",
"7662",
"10968",
"10712",
"12931",
"11432",
"7868",
"6284",
"594",
"15121",
"3531",
"5413",
"7159",
"5353",
"4029",
"5783",
"6398",
"6215",
"13128",
"11892",
"12229",
"10306",
"12702",
"12665",
"14256",
"7935",
"10833",
"2945",
"9489",
"490",
"8178",
"3483",
"11317",
"4088",
"5802",
"13053",
"14533",
"3166",
"5807",
"14732",
"6600",
"14692",
"14849",
"6482",
"8613",
"4370",
"2614",
"6775",
"2897",
"10457",
"10324",
"11116",
"4551",
"3737",
"6455",
"13380",
"5337",
"14999",
"9005",
"7792",
"5381",
"5404",
"6568",
"13159",
"11028",
"3218",
"9704",
"8416",
"5127",
"13693",
"14907",
"15191",
"10965",
"9435",
"5551",
"3896",
"13821",
"15233",
"7932",
"655",
"5280",
"11703",
"9696",
"1535",
"10792",
"15488",
"13101",
"1411",
"3519",
"5820",
"10008",
"10692",
"13116",
"7455",
"8009",
"14541",
"9070",
"9153",
"9534",
"1384",
"5853",
"13595",
"15183",
"7174",
"1716",
"10673",
"7577",
"1656",
"14290",
"7272",
"12984",
"3456",
"11760",
"74",
"867",
"10750",
"5249",
"3054",
"6699",
"15237",
"14071",
"11706",
"4518",
"12179",
"11745",
"13328",
"4022",
"13306",
"1241",
"9269",
"6402",
"6912",
"5042",
"2587",
"697",
"8504",
"15273",
"13722",
"4545",
"15291",
"11027",
"10163",
"15275",
"8677",
"13279",
"9364",
"7519",
"10767",
"12714",
"6129",
"8296",
"575",
"3844",
"8845",
"9493",
"2570",
"10926",
"7805",
"6412",
"4715",
"1669",
"14233",
"1043",
"13635",
"15128",
"364",
"15374",
"9190",
"12186",
"7534",
"1208",
"859",
"6553",
"6488",
"13548",
"6901",
"9326",
"4395",
"13780",
"8342",
"11914",
"7774",
"3700",
"14449",
"7600",
"9007",
"4251",
"638",
"3822",
"1201",
"973",
"3283",
"10832",
"14652",
"5176",
"3469",
"55",
"2812",
"4643",
"8511",
"13203",
"14172",
"1705",
"9047",
"4615",
"154",
"6745",
"1372",
"9119",
"12907",
"12219",
"12580",
"596",
"4271",
"6385",
"191",
"7815",
"8072",
"15184",
"12473",
"3639",
"1709",
"6924",
"5531",
"7025",
"11187",
"3773",
"5504",
"2404",
"2840",
"13280",
"1173",
"14248",
"9976",
"8244",
"7160",
"4378",
"3561",
"10442",
"12812",
"15009",
"4602",
"2070",
"9748",
"15401",
"11584",
"3481",
"14249",
"1856",
"3615",
"13572",
"8460",
"5770",
"7808",
"1415",
"8791",
"12518",
"4495",
"6828",
"13904",
"10657",
"4371",
"8745",
"5272",
"14484",
"15146",
"6672",
"3324",
"8879",
"3415",
"435",
"2178",
"2929",
"6354",
"2609",
"9272",
"3745",
"2773",
"15061",
"8174",
"5549",
"1346",
"6078",
"2152",
"9672",
"1289",
"1075",
"5827",
"7428",
"4619",
"2385",
"3424",
"14549",
"4001",
"878",
"12660",
"12223",
"2725",
"5113",
"1162",
"9296",
"2649",
"319",
"13036",
"13988",
"11060",
"3212",
"13214",
"5254",
"53",
"3000",
"474",
"3063",
"1715",
"1503",
"8262",
"10069",
"12638",
"8108",
"13881",
"13609",
"3337",
"10843",
"9354",
"2693",
"1407",
"11793",
"2553",
"572",
"4874",
"5067",
"8410",
"7329",
"4214",
"3819",
"740",
"4108",
"12503",
"4044",
"5382",
"13945",
"34",
"8865",
"6595",
"1794",
"2586",
"8869",
"11331",
"4920",
"11348",
"10886",
"7949",
"6762",
"7348",
"7336",
"12108",
"2954",
"7863",
"6006",
"3625",
"641",
"1417",
"6292",
"6460",
"14033",
"5745",
"8411",
"10934",
"10428",
"13730",
"9373",
"15097",
"10633",
"7184",
"5659",
"3186",
"14998",
"9852",
"4570",
"7735",
"7234",
"9060",
"2467",
"13161",
"5603",
"3075",
"7110",
"11322",
"8141",
"2286",
"11169",
"20",
"7936",
"7795",
"14949",
"9755",
"11294",
"3743",
"298",
"12168",
"14082",
"14684",
"4103",
"1010",
"3920",
"3031",
"9422",
"5654",
"5317",
"6436",
"3746",
"15028",
"3853",
"2532",
"852",
"11992",
"13661",
"12089",
"13211",
"2356",
"772",
"170",
"7555",
"10870",
"12546",
"4846",
"8594",
"9366",
"4181",
"9126",
"7015",
"5496",
"9922",
"6409",
"14714",
"7737",
"1643",
"1343",
"11603",
"3470",
"7615",
"13592",
"8168",
"5897",
"8723",
"7875",
"9238",
"15489",
"1094",
"4335",
"4533",
"2025",
"5207",
"5619",
"7298",
"15469",
"8890",
"3124",
"9011",
"10979",
"13377",
"5863",
"477",
"8433",
"13135",
"7439",
"6496",
"5754",
"3635",
"13585",
"14001",
"3672",
"13925",
"6289",
"3507",
"6955",
"5763",
"5444",
"13124",
"1534",
"9552",
"12844",
"11352",
"2541",
"11816",
"10736",
"8738",
"8434",
"2056",
"13246",
"1629",
"7911",
"6961",
"1083",
"8156",
"8310",
"4188",
"7591",
"11214",
"10251",
"4520",
"2873",
"8057",
"3116",
"12677",
"12172",
"13335",
"10847",
"3631",
"2760",
"4105",
"1182",
"13479",
"5889",
"8422",
"15074",
"12025",
"107",
"1009",
"3016",
"5058",
"15434",
"9590"
] |
thoeppner/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3079
- Accuracy: 0.9337
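As a brief usage sketch (assuming the fine-tuned checkpoint is published under the repository id above; the image path is a placeholder), the model can be queried through the `transformers` image-classification pipeline:

```python
from transformers import pipeline

# Load the fine-tuned ViT checkpoint; repo id taken from this card.
classifier = pipeline("image-classification",
                      model="thoeppner/vit-base-oxford-iiit-pets")

# Classify a local pet photo (placeholder path) and show the top 3 classes.
for pred in classifier("my_pet.jpg", top_k=3):
    print(f"{pred['label']}: {pred['score']:.3f}")
```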
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 8
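For reference, a minimal sketch of how these hyperparameters map onto `transformers.TrainingArguments` (the `output_dir` and the per-epoch evaluation strategy are assumptions; the model and datasets are not shown):

```python
from transformers import TrainingArguments

# Mirror of the hyperparameters listed above; output_dir is a placeholder.
args = TrainingArguments(
    output_dir="vit-base-oxford-iiit-pets",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",        # AdamW with betas=(0.9, 0.999), eps=1e-8 (library defaults)
    lr_scheduler_type="linear",
    num_train_epochs=8,
    eval_strategy="epoch",      # assumption: per-epoch evaluation, matching the table below
)
# `args` would then be passed to transformers.Trainer together with the model
# and the train/eval datasets (not shown here).
```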
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.4044 | 1.0 | 185 | 0.3637 | 0.9310 |
| 0.374 | 2.0 | 370 | 0.3439 | 0.9364 |
| 0.3458 | 3.0 | 555 | 0.3295 | 0.9364 |
| 0.3391 | 4.0 | 740 | 0.3189 | 0.9378 |
| 0.3502 | 5.0 | 925 | 0.3111 | 0.9391 |
| 0.3275 | 6.0 | 1110 | 0.3059 | 0.9391 |
| 0.3369 | 7.0 | 1295 | 0.3028 | 0.9391 |
| 0.3128 | 8.0 | 1480 | 0.3019 | 0.9391 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
Dalmatiner/vit-base-oxford-iiit-pets
|
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2070
- Accuracy: 0.9391
## Model description
The model used for the zero-shot classification is `openai/clip-vit-large-patch14`.
The results are:
Accuracy: 0.8800
Precision: 0.8768
Recall: 0.8800
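A sketch of how such figures can be computed from predictions (the toy labels and the weighted averaging mode are assumptions; the card does not state how precision and recall were averaged):

```python
from sklearn.metrics import accuracy_score, precision_score, recall_score

# Toy stand-ins; in practice y_true/y_pred come from the zero-shot classifier.
y_true = ["siamese", "pug", "beagle", "pug"]
y_pred = ["siamese", "pug", "beagle", "boxer"]

print("Accuracy: ", accuracy_score(y_true, y_pred))
print("Precision:", precision_score(y_true, y_pred, average="weighted", zero_division=0))
print("Recall:   ", recall_score(y_true, y_pred, average="weighted", zero_division=0))
```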
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3661 | 1.0 | 370 | 0.3151 | 0.9242 |
| 0.2065 | 2.0 | 740 | 0.2477 | 0.9323 |
| 0.1576 | 3.0 | 1110 | 0.2172 | 0.9310 |
| 0.1451 | 4.0 | 1480 | 0.2048 | 0.9350 |
| 0.1419 | 5.0 | 1850 | 0.2019 | 0.9378 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
ElioBaserga/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the fruits-and-vegetables-classification dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1798
- Accuracy: 0.9331
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6158 | 1.0 | 195 | 0.2572 | 0.9145 |
| 0.2671 | 2.0 | 390 | 0.2054 | 0.9288 |
| 0.2212 | 3.0 | 585 | 0.1905 | 0.9288 |
| 0.1935 | 4.0 | 780 | 0.1803 | 0.9345 |
| 0.1969 | 5.0 | 975 | 0.1774 | 0.9316 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
|
[
"apple",
"banana",
"beetroot",
"bell pepper",
"cabbage",
"capsicum",
"carrot",
"cauliflower",
"chilli pepper",
"corn",
"cucumber",
"eggplant",
"garlic",
"ginger",
"grapes",
"jalepeno",
"kiwi",
"lemon",
"lettuce",
"mango",
"onion",
"orange",
"paprika",
"pear",
"peas",
"pineapple",
"pomegranate",
"potato",
"raddish",
"soy beans",
"spinach",
"sweetcorn",
"sweetpotato",
"tomato",
"turnip",
"watermelon"
] |
BerkeOek/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2047
- Accuracy: 0.9391
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3964 | 1.0 | 370 | 0.2865 | 0.9242 |
| 0.2192 | 2.0 | 740 | 0.2245 | 0.9378 |
| 0.1611 | 3.0 | 1110 | 0.2072 | 0.9391 |
| 0.1376 | 4.0 | 1480 | 0.1973 | 0.9391 |
| 0.1447 | 5.0 | 1850 | 0.1957 | 0.9391 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
### Zero-shot evaluation (Week 7)
Accuracy: 0.8800
Precision: 0.8768
Recall: 0.8800
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
martivic/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1929
- Accuracy: 0.9364
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3787 | 1.0 | 370 | 0.3074 | 0.9269 |
| 0.2072 | 2.0 | 740 | 0.2311 | 0.9337 |
| 0.1399 | 3.0 | 1110 | 0.2137 | 0.9378 |
| 0.1393 | 4.0 | 1480 | 0.2065 | 0.9418 |
| 0.1373 | 5.0 | 1850 | 0.2029 | 0.9418 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
## 🔍 Zero-Shot Evaluation (Week 7)
- **Model:** openai/clip-vit-large-patch14
- **Accuracy:** 0.8800
- **Precision:** 0.8768
- **Recall:** 0.8800
These values come from a zero-shot classification with the model `openai/clip-vit-large-patch14`
on the Oxford-IIIT Pet dataset (100 test images). The goal was to compare the performance of a
transfer-learning model with a zero-shot approach.
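A minimal sketch of such a zero-shot evaluation (the image path, prompt template, and label subset are assumptions; the full class list appears below):

```python
from transformers import pipeline

clip = pipeline("zero-shot-image-classification",
                model="openai/clip-vit-large-patch14")

# Subset of the Oxford-IIIT Pet class names (shortened for brevity).
labels = ["siamese", "birman", "shiba inu", "pug", "beagle"]

# Prompt template is an assumption; CLIP scores each "a photo of a {label}".
result = clip("pet.jpg", candidate_labels=labels,
              hypothesis_template="a photo of a {}")
print(result[0])  # highest-scoring label with its probability
```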
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
joyjkl/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1788
- Accuracy: 0.9391
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.4079 | 1.0 | 370 | 0.2911 | 0.9147 |
| 0.2327 | 2.0 | 740 | 0.2106 | 0.9378 |
| 0.1812 | 3.0 | 1110 | 0.1860 | 0.9432 |
| 0.1453 | 4.0 | 1480 | 0.1790 | 0.9418 |
| 0.1434 | 5.0 | 1850 | 0.1752 | 0.9445 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
### Zero-Shot Evaluation
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
alimoh02/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1872
- Accuracy: 0.9459
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3871 | 1.0 | 370 | 0.3107 | 0.9256 |
| 0.2244 | 2.0 | 740 | 0.2439 | 0.9323 |
| 0.1725 | 3.0 | 1110 | 0.2220 | 0.9378 |
| 0.145 | 4.0 | 1480 | 0.2157 | 0.9350 |
| 0.129 | 5.0 | 1850 | 0.2131 | 0.9337 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
## Zero-Shot classification model
This section compares the performance of a zero-shot model (`openai/clip-vit-large-patch14`) on the Oxford Pets dataset (`pcuenq/oxford-pets`).
- **Model used**: `openai/clip-vit-large-patch14`
- **Dataset**: `pcuenq/oxford-pets` (train split)
- **Evaluation Task**: Zero-Shot Image Classification
- **Candidate Labels**: 37 pet breeds from the dataset
### Results:
Zero-shot evaluation with CLIP:
Accuracy: 0.8800
Precision: 0.8768
Recall: 0.8800
Evaluated using Hugging Face `transformers` pipeline and `sklearn.metrics` on the full training set.
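A sketch of how these metrics could be computed with `sklearn.metrics`, assuming `y_true` and `y_pred` hold the ground-truth and predicted breed names (the values below are placeholders):
```python
from sklearn.metrics import accuracy_score, precision_score, recall_score

# Placeholder label lists standing in for the real evaluation outputs.
y_true = ["pug", "siamese", "beagle"]
y_pred = ["pug", "birman", "beagle"]

accuracy = accuracy_score(y_true, y_pred)
# "weighted" averaging matches the precision/recall convention reported above.
precision = precision_score(y_true, y_pred, average="weighted", zero_division=0)
recall = recall_score(y_true, y_pred, average="weighted", zero_division=0)
print(f"Accuracy: {accuracy:.4f}  Precision: {precision:.4f}  Recall: {recall:.4f}")
```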
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
ferzanagehringer/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2031
- Accuracy: 0.9459
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3727 | 1.0 | 370 | 0.2756 | 0.9337 |
| 0.2145 | 2.0 | 740 | 0.2168 | 0.9378 |
| 0.1835 | 3.0 | 1110 | 0.1918 | 0.9459 |
| 0.147 | 4.0 | 1480 | 0.1857 | 0.9472 |
| 0.1315 | 5.0 | 1850 | 0.1818 | 0.9472 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
### Evaluation Results on Oxford-Pet Dataset
I evaluated the zero-shot classification performance of this model on the Oxford-IIIT Pet dataset using 37 class labels.
##### 🔍 Model used
- Model: openai/clip-vit-large-patch14
- Pipeline: transformers.pipeline(task="zero-shot-image-classification")
- For each image, the model was asked to classify it among all 37 class labels (dog and cat breeds) in a zero-shot setting.
- The label with the highest score was chosen as the prediction.
##### 🧪 Metrics
- Accuracy: 0.8800
- Precision (weighted): 0.8768
- Recall (weighted): 0.8800
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
detorcla/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- **Accuracy**: 76.00%
- **Precision (macro)**: 81.56%
- **Recall (macro)**: 76.16%
### Training results
| Training Loss | Epoch |
|:-------------:|:-----:|
| 147.5163 | 1.0 |
| 66.5542 | 2.0 |
| 42.2211 | 3.0 |
| 26.8211 | 4.0 |
| 19.2624 | 5.0 |
|
[
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11",
"label_12",
"label_13",
"label_14",
"label_15",
"label_16",
"label_17",
"label_18",
"label_19",
"label_20",
"label_21",
"label_22",
"label_23",
"label_24",
"label_25",
"label_26",
"label_27",
"label_28",
"label_29",
"label_30",
"label_31",
"label_32",
"label_33",
"label_34",
"label_35",
"label_36"
] |
Nikolamitrovic/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1790
- Accuracy: 0.9486
## Zero-Shot Classification Performance (CLIP)
A zero-shot classification model (CLIP) was also evaluated on the Oxford-Pets dataset, without any fine-tuning specific to this dataset.
* **Model Used:** `openai/clip-vit-large-patch14`
* **Accuracy:** 0.8800
* **Precision (Weighted):** 0.8768
* **Recall (Weighted):** 0.8800
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.4038 | 1.0 | 370 | 0.3003 | 0.9175 |
| 0.1968 | 2.0 | 740 | 0.2344 | 0.9296 |
| 0.1704 | 3.0 | 1110 | 0.2110 | 0.9296 |
| 0.1439 | 4.0 | 1480 | 0.2060 | 0.9364 |
| 0.135 | 5.0 | 1850 | 0.2047 | 0.9350 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
raveendran-shajiran/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2031
- Accuracy: 0.9459
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3727 | 1.0 | 370 | 0.2756 | 0.9337 |
| 0.2145 | 2.0 | 740 | 0.2168 | 0.9378 |
| 0.1835 | 3.0 | 1110 | 0.1918 | 0.9459 |
| 0.147 | 4.0 | 1480 | 0.1857 | 0.9472 |
| 0.1315 | 5.0 | 1850 | 0.1818 | 0.9472 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
---
## Zero-Shot Evaluation
- **Model used**: `openai/clip-vit-large-patch14`
- **Dataset**: `Oxford-IIIT-Pets`
- **Accuracy**: `0.8800`
- **Precision**: `0.8768`
- **Recall**: `0.8800`
The zero-shot evaluation was performed using the Hugging Face Transformers library and the CLIP model on the Oxford-IIIT-Pets dataset.
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
bastiansteingruber/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2213
- Accuracy: 0.9269
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3701 | 1.0 | 370 | 0.3034 | 0.9256 |
| 0.225 | 2.0 | 740 | 0.2272 | 0.9513 |
| 0.1628 | 3.0 | 1110 | 0.2093 | 0.9513 |
| 0.1439 | 4.0 | 1480 | 0.2020 | 0.9540 |
| 0.1283 | 5.0 | 1850 | 0.2002 | 0.9567 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
### Zero Shot Evaluation
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
walzsil1/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2245
- Accuracy: 0.9364
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3843 | 1.0 | 370 | 0.2882 | 0.9242 |
| 0.1961 | 2.0 | 740 | 0.2120 | 0.9405 |
| 0.1512 | 3.0 | 1110 | 0.1928 | 0.9432 |
| 0.1393 | 4.0 | 1480 | 0.1844 | 0.9432 |
| 0.1138 | 5.0 | 1850 | 0.1823 | 0.9432 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
### Zero-Shot Classification Model `openai/clip-vit-large-patch14`
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
Vinci96/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1634
- Accuracy: 0.9526
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3942 | 1.0 | 370 | 0.2973 | 0.9229 |
| 0.2301 | 2.0 | 740 | 0.2206 | 0.9391 |
| 0.1671 | 3.0 | 1110 | 0.2077 | 0.9364 |
| 0.1555 | 4.0 | 1480 | 0.2004 | 0.9418 |
| 0.1276 | 5.0 | 1850 | 0.1970 | 0.9391 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
### Zero Shot Evaluation
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
Flogoro/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8733
- Accuracy: 0.8782
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 200
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 1.5525 | 0.2703 | 100 | 1.1996 | 0.8769 |
| 0.9021 | 0.5405 | 200 | 0.8349 | 0.9053 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cpu
- Datasets 3.4.1
- Tokenizers 0.21.1
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
lindritdev/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the Isamu136/oxford_pets_with_l14_emb dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2043
- Accuracy: 0.9418
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3734 | 1.0 | 370 | 0.2734 | 0.9337 |
| 0.213 | 2.0 | 740 | 0.2147 | 0.9418 |
| 0.1806 | 3.0 | 1110 | 0.1920 | 0.9445 |
| 0.1449 | 4.0 | 1480 | 0.1859 | 0.9472 |
| 0.131 | 5.0 | 1850 | 0.1815 | 0.9445 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
fabfacal/vit-base-oxford-iiit-pets
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1986
- Accuracy: 0.9364
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3803 | 1.0 | 370 | 0.2851 | 0.9323 |
| 0.2213 | 2.0 | 740 | 0.2248 | 0.9364 |
| 0.1837 | 3.0 | 1110 | 0.2068 | 0.9418 |
| 0.1419 | 4.0 | 1480 | 0.2006 | 0.9418 |
| 0.134 | 5.0 | 1850 | 0.1996 | 0.9418 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
### Zero-Shot Evaluation
- Accuracy: 0.8800
- Precision: 0.8768
- Recall: 0.8800
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
macbaileys/vit-base-oxford-iiit-pets
|
## 📊 Zero-Shot Evaluation using CLIP (openai/clip-vit-large-patch14)
We tested the model `openai/clip-vit-large-patch14` using zero-shot classification on 100 samples from the Oxford-IIIT Pets dataset.
Accuracy: 0.8800
Precision: 0.8768
Recall: 0.8800
The CLIP model was not fine-tuned on this dataset. It demonstrates strong generalization but falls short of the ViT model trained via transfer learning.
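For the transfer-learning side of the comparison, a minimal sketch of classifying an image with the fine-tuned checkpoint; the image path is a placeholder.
```python
from transformers import pipeline

# Image classification with the fine-tuned ViT checkpoint from this card.
classifier = pipeline("image-classification", model="macbaileys/vit-base-oxford-iiit-pets")
print(classifier("example_pet.jpg", top_k=3))
```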
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2031
- Accuracy: 0.9459
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3727 | 1.0 | 370 | 0.2756 | 0.9337 |
| 0.2145 | 2.0 | 740 | 0.2168 | 0.9378 |
| 0.1835 | 3.0 | 1110 | 0.1918 | 0.9459 |
| 0.147 | 4.0 | 1480 | 0.1857 | 0.9472 |
| 0.1315 | 5.0 | 1850 | 0.1818 | 0.9472 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
meyeryve/vit-base-oxford-iiit-pets
|
# vit-base-oxford-iiit-pets
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the pcuenq/oxford-pets dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2031
- Accuracy: 0.9459
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3727 | 1.0 | 370 | 0.2756 | 0.9337 |
| 0.2145 | 2.0 | 740 | 0.2168 | 0.9378 |
| 0.1835 | 3.0 | 1110 | 0.1918 | 0.9459 |
| 0.147 | 4.0 | 1480 | 0.1857 | 0.9472 |
| 0.1315 | 5.0 | 1850 | 0.1818 | 0.9472 |
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
### 🧪 Zero-Shot Model Comparison (Separate Models)
For comparison purposes, we evaluated zero-shot image classification models on the same dataset without any fine-tuning. These models were used to demonstrate the generalization capabilities of large-scale vision-language models.
- **Model Used**: `laion/CLIP-ViT-g-14-laion2B-s12B-b42K`
- **Method**: Zero-shot image classification via Hugging Face `pipeline()`
- **Accuracy**: 0.8794
- **Precision**: 0.8736
- **Recall**: 0.8794
- **Model Used**: `laion/CLIP-ViT-B-32-laion2B-s34B-b79K`
- **Method**: Zero-shot image classification via Hugging Face `pipeline()`
- **Accuracy**: 0.8564
- **Precision**: 0.8526
- **Recall**: 0.8564
> ⚠️ Note: The zero-shot models are **not the same** as this trained model. They were evaluated independently and are included here only for comparison.
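A sketch of how the two checkpoints above might be evaluated with the same pipeline loop; the images and the truncated label list are placeholders.
```python
from transformers import pipeline

# Run several zero-shot CLIP checkpoints on the same images and labels.
checkpoints = [
    "laion/CLIP-ViT-g-14-laion2B-s12B-b42K",
    "laion/CLIP-ViT-B-32-laion2B-s34B-b79K",
]
candidate_labels = ["siamese", "birman", "pug"]  # ... all 37 breeds
images = ["pet1.jpg", "pet2.jpg"]  # placeholder evaluation images

for ckpt in checkpoints:
    clf = pipeline("zero-shot-image-classification", model=ckpt)
    preds = [clf(img, candidate_labels=candidate_labels)[0]["label"] for img in images]
    print(ckpt, preds)
```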
|
[
"siamese",
"birman",
"shiba inu",
"staffordshire bull terrier",
"basset hound",
"bombay",
"japanese chin",
"chihuahua",
"german shorthaired",
"pomeranian",
"beagle",
"english cocker spaniel",
"american pit bull terrier",
"ragdoll",
"persian",
"egyptian mau",
"miniature pinscher",
"sphynx",
"maine coon",
"keeshond",
"yorkshire terrier",
"havanese",
"leonberger",
"wheaten terrier",
"american bulldog",
"english setter",
"boxer",
"newfoundland",
"bengal",
"samoyed",
"british shorthair",
"great pyrenees",
"abyssinian",
"pug",
"saint bernard",
"russian blue",
"scottish terrier"
] |
Louloubib/acouslic_ai_image_classification
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# acouslic_ai_image_classification
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6111
- Accuracy: 0.7261
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
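As a quick sanity check, the effective batch size follows from the values above:
```python
# Effective (total) train batch size under gradient accumulation.
per_device_train_batch_size = 16
gradient_accumulation_steps = 4
total_train_batch_size = per_device_train_batch_size * gradient_accumulation_steps
assert total_train_batch_size == 64
```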
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.7275 | 1.0 | 81 | 0.7071 | 0.6615 |
| 0.6587 | 2.0 | 162 | 0.6466 | 0.6988 |
| 0.6125 | 2.9689 | 240 | 0.6111 | 0.7261 |
### Framework versions
- Transformers 4.51.1
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
|
[
"no_annotation",
"optimal",
"suboptimal"
] |
Louloubib/acouslic_ai_image_classification-10-epochs
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# acouslic_ai_image_classification-10-epochs
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5137
- Accuracy: 0.7873
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.8185 | 0.9895 | 71 | 0.8175 | 0.5911 |
| 0.6682 | 1.9895 | 142 | 0.6966 | 0.6774 |
| 0.5782 | 2.9895 | 213 | 0.5758 | 0.7384 |
| 0.5419 | 3.9895 | 284 | 0.5869 | 0.7332 |
| 0.5374 | 4.9895 | 355 | 0.5485 | 0.7515 |
| 0.4418 | 5.9895 | 426 | 0.5585 | 0.7402 |
| 0.4767 | 6.9895 | 497 | 0.5299 | 0.7672 |
| 0.3957 | 7.9895 | 568 | 0.5121 | 0.7847 |
| 0.4294 | 8.9895 | 639 | 0.5118 | 0.7820 |
| 0.4067 | 9.9895 | 710 | 0.5137 | 0.7873 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
|
[
"no_annotation",
"optimal",
"suboptimal"
] |
encku/tuborg-04-2025
|
# Model Trained Using AutoTrain
- Problem type: Image Classification
## Validation Metrics
loss: 0.0022049029357731342
f1_macro: 0.9996719180065856
f1_micro: 0.9995339340405699
f1_weighted: 0.99953381773749
precision_macro: 0.9997326789953793
precision_micro: 0.9995339340405699
precision_weighted: 0.999540534539045
recall_macro: 0.9996157107960941
recall_micro: 0.9995339340405699
recall_weighted: 0.9995339340405699
accuracy: 0.9995339340405699
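A sketch of how the macro, micro, and weighted averages above could be reproduced with `sklearn.metrics`; the label lists are placeholders.
```python
from sklearn.metrics import f1_score, precision_score, recall_score

# Placeholder labels standing in for the validation-set predictions.
y_true = ["c001", "c002", "c001", "c003"]
y_pred = ["c001", "c002", "c003", "c003"]

for avg in ("macro", "micro", "weighted"):
    print(avg,
          f1_score(y_true, y_pred, average=avg, zero_division=0),
          precision_score(y_true, y_pred, average=avg, zero_division=0),
          recall_score(y_true, y_pred, average=avg, zero_division=0))
```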
|
[
"c001",
"c002",
"c002_4lu",
"c003",
"c004",
"c005",
"c006",
"c006_4lu",
"c007",
"c008",
"c009",
"c010",
"c012",
"c014",
"c015",
"c015_4lu",
"c017",
"c018",
"c020",
"c020_4lu",
"c021",
"c022",
"c023",
"c024",
"c025",
"c026",
"c027",
"c028",
"c029",
"c030",
"c031",
"c032",
"c033",
"c034",
"c035",
"c036",
"c037",
"c038",
"c039",
"c040",
"c041",
"c042",
"c043",
"c044",
"tbrg066",
"tbrg067",
"tbrg068",
"tbrg072",
"tbrg073",
"tbrg074",
"tbrg075",
"tbrg085",
"tbrg086",
"tbrg087",
"tbrg090",
"tbrg092",
"tbrg093",
"tbrg096",
"tbrg097",
"tbrg098",
"tbrg100",
"tbrg156",
"tbrg157",
"tbrg158",
"tbrg159",
"tt00277",
"tt00523",
"tt00677",
"tt00677-1",
"tt00685",
"tt00735",
"tt00737",
"tt00765",
"tt00792",
"tt00792-1",
"tt00793",
"tt00793-1",
"tt00810",
"tt00811",
"tt00812",
"tt00852",
"tt00853",
"tt00854",
"tt00857",
"tt00857-1",
"tt00859",
"tt00875",
"tt00875-1",
"tt00876",
"tt00876-1",
"tt00893",
"tt00904",
"tt00944",
"tt00945",
"tt00947",
"tt00964",
"tt00980",
"tt00989",
"tt01001",
"tt01020",
"tt01037",
"tt01069",
"tt01070",
"tt01071",
"tt01072",
"tt01142",
"tt01148",
"tt01149",
"tt01150",
"tt01152",
"tt01155",
"tt01160",
"tt01162",
"tt01169",
"tt01172",
"tt01174",
"tt01176",
"tt01178",
"tt01179",
"tt01296",
"tt01297",
"tt01300",
"tt01307",
"tt01431",
"tt01460",
"tt01481",
"tt01482"
] |
shahad-alh/arabichar-balanced-ver3
|
# Model Card for arabichar-balanced-ver3
This fine-tuned CNN model is designed to identify Arabic alphabet characters written by children aged 4-8.
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This model focuses on adapting a pre-trained CNN model. It is built of three convolutional layers with 32 filters, followed by max-pooling and batch normalization. Another set of three convolutional layers with 64 filters extracts deeper features, followed by another pooling and normalization step. The extracted features are passed through two fully connected layers with dropout, and the final softmax layer classifies the characters into 28 categories.
The base model was trained on AHCD, and I fine-tuned it on the Dhad-Hijja dataset collection.
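A sketch of the architecture described above in PyTorch; kernel sizes, hidden widths, dropout rate, and the 32×32 grayscale input are assumptions not stated in the card.
```python
import torch.nn as nn

# Illustrative reconstruction of the described CNN; sizes are assumptions.
model = nn.Sequential(
    # Three convolutional layers with 32 filters, then pooling and batch norm.
    nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.BatchNorm2d(32),
    # Three convolutional layers with 64 filters, then pooling and batch norm.
    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.BatchNorm2d(64),
    # Two fully connected layers with dropout; the final 28-way layer is
    # followed by a softmax (typically folded into the loss in PyTorch).
    nn.Flatten(),
    nn.Linear(64 * 8 * 8, 256), nn.ReLU(), nn.Dropout(0.5),
    nn.Linear(256, 128), nn.ReLU(), nn.Dropout(0.5),
    nn.Linear(128, 28),
)
```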
- **Developed by:** [shahad-alh]
- **Model type:** [Image Classification]
- **Language(s) (NLP):** [ar (Arabic)]
- **License:** [MIT]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
Could be used by anyone interested in building an app, model, or anything else that needs to identify children's handwritten characters.
## Evaluation
Training Accuracy: 86%
Test Accuracy: 87%
## 🚀 Demo
Try it out live here:
[](https://huggingface.co/spaces/shahad-alh/Batoot-HW-main-model)
|
[
"alif",
"ayen",
"baa",
"dal",
"dhad",
"faa",
"ghayen",
"h_aa",
"haa",
"jeem",
"kaf",
"khaa",
"lam",
"meem",
"noon",
"qaf",
"raa",
"sad",
"seen",
"sheen",
"t_aa",
"taa",
"th_aa",
"thaa",
"thal",
"waw",
"yaa",
"zay"
] |
SodaXII/vit-base-patch16-224_rice-leaf-disease-augmented-v4_v5_fft
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224_rice-leaf-disease-augmented-v4_v5_fft
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2430
- Accuracy: 0.9631
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 256
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.9551 | 0.5 | 64 | 1.5507 | 0.5537 |
| 1.0881 | 1.0 | 128 | 0.6198 | 0.8054 |
| 0.3872 | 1.5 | 192 | 0.3096 | 0.8960 |
| 0.1724 | 2.0 | 256 | 0.1786 | 0.9530 |
| 0.0382 | 2.5 | 320 | 0.2118 | 0.9430 |
| 0.0261 | 3.0 | 384 | 0.2390 | 0.9430 |
| 0.0065 | 3.5 | 448 | 0.2377 | 0.9362 |
| 0.0036 | 4.0 | 512 | 0.2146 | 0.9463 |
| 0.0013 | 4.5 | 576 | 0.2235 | 0.9463 |
| 0.0009 | 5.0 | 640 | 0.2121 | 0.9564 |
| 0.0007 | 5.5 | 704 | 0.2125 | 0.9564 |
| 0.0007 | 6.0 | 768 | 0.2121 | 0.9564 |
| 0.0007 | 6.5 | 832 | 0.2120 | 0.9564 |
| 0.0006 | 7.0 | 896 | 0.2121 | 0.9530 |
| 0.0005 | 7.5 | 960 | 0.2037 | 0.9564 |
| 0.0004 | 8.0 | 1024 | 0.2124 | 0.9530 |
| 0.0003 | 8.5 | 1088 | 0.2120 | 0.9564 |
| 0.0003 | 9.0 | 1152 | 0.2125 | 0.9564 |
| 0.0002 | 9.5 | 1216 | 0.2138 | 0.9564 |
| 0.0003 | 10.0 | 1280 | 0.2137 | 0.9564 |
| 0.0002 | 10.5 | 1344 | 0.2139 | 0.9564 |
| 0.0002 | 11.0 | 1408 | 0.2140 | 0.9564 |
| 0.0002 | 11.5 | 1472 | 0.2170 | 0.9530 |
| 0.0002 | 12.0 | 1536 | 0.2159 | 0.9564 |
| 0.0002 | 12.5 | 1600 | 0.2172 | 0.9597 |
| 0.0002 | 13.0 | 1664 | 0.2200 | 0.9564 |
| 0.0001 | 13.5 | 1728 | 0.2196 | 0.9631 |
| 0.0001 | 14.0 | 1792 | 0.2211 | 0.9597 |
| 0.0001 | 14.5 | 1856 | 0.2219 | 0.9597 |
| 0.0001 | 15.0 | 1920 | 0.2220 | 0.9597 |
| 0.0001 | 15.5 | 1984 | 0.2222 | 0.9597 |
| 0.0001 | 16.0 | 2048 | 0.2222 | 0.9597 |
| 0.0001 | 16.5 | 2112 | 0.2244 | 0.9597 |
| 0.0001 | 17.0 | 2176 | 0.2255 | 0.9597 |
| 0.0001 | 17.5 | 2240 | 0.2265 | 0.9597 |
| 0.0001 | 18.0 | 2304 | 0.2278 | 0.9564 |
| 0.0001 | 18.5 | 2368 | 0.2284 | 0.9564 |
| 0.0001 | 19.0 | 2432 | 0.2288 | 0.9564 |
| 0.0001 | 19.5 | 2496 | 0.2294 | 0.9564 |
| 0.0001 | 20.0 | 2560 | 0.2295 | 0.9564 |
| 0.0001 | 20.5 | 2624 | 0.2295 | 0.9564 |
| 0.0001 | 21.0 | 2688 | 0.2304 | 0.9597 |
| 0.0001 | 21.5 | 2752 | 0.2309 | 0.9597 |
| 0.0001 | 22.0 | 2816 | 0.2337 | 0.9564 |
| 0.0001 | 22.5 | 2880 | 0.2351 | 0.9564 |
| 0.0001 | 23.0 | 2944 | 0.2354 | 0.9597 |
| 0.0 | 23.5 | 3008 | 0.2356 | 0.9597 |
| 0.0 | 24.0 | 3072 | 0.2361 | 0.9597 |
| 0.0 | 24.5 | 3136 | 0.2363 | 0.9631 |
| 0.0 | 25.0 | 3200 | 0.2363 | 0.9597 |
| 0.0 | 25.5 | 3264 | 0.2366 | 0.9631 |
| 0.0 | 26.0 | 3328 | 0.2382 | 0.9631 |
| 0.0 | 26.5 | 3392 | 0.2398 | 0.9631 |
| 0.0 | 27.0 | 3456 | 0.2406 | 0.9597 |
| 0.0 | 27.5 | 3520 | 0.2416 | 0.9631 |
| 0.0 | 28.0 | 3584 | 0.2421 | 0.9631 |
| 0.0 | 28.5 | 3648 | 0.2429 | 0.9597 |
| 0.0 | 29.0 | 3712 | 0.2429 | 0.9631 |
| 0.0 | 29.5 | 3776 | 0.2431 | 0.9631 |
| 0.0 | 30.0 | 3840 | 0.2430 | 0.9631 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.1
|
[
"bacterial leaf blight",
"brown spot",
"healthy rice leaf",
"leaf blast",
"leaf scald",
"narrow brown leaf spot",
"rice hispa",
"sheath blight"
] |
SodaXII/swin-base-patch4-window7-224_rice-leaf-disease-augmented-v4_v5_fft
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-base-patch4-window7-224_rice-leaf-disease-augmented-v4_v5_fft
This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6225
- Accuracy: 0.9262
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 256
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Accuracy | Validation Loss |
|:-------------:|:-----:|:----:|:--------:|:---------------:|
| 1.9171 | 0.5 | 64 | 0.5403 | 1.4134 |
| 0.8021 | 1.0 | 128 | 0.8725 | 0.3672 |
| 0.2453 | 1.5 | 192 | 0.8624 | 0.3582 |
| 0.1156 | 2.0 | 256 | 0.9463 | 0.2018 |
| 0.0502 | 2.5 | 320 | 0.9262 | 0.2554 |
| 0.0396 | 3.0 | 384 | 0.9262 | 0.3171 |
| 0.0241 | 3.5 | 448 | 0.9295 | 0.2359 |
| 0.0106 | 4.0 | 512 | 0.9228 | 0.3841 |
| 0.003 | 4.5 | 576 | 0.9396 | 0.3358 |
| 0.0028 | 5.0 | 640 | 0.9396 | 0.2688 |
| 0.0008 | 5.5 | 704 | 0.9362 | 0.3238 |
| 0.0003 | 6.0 | 768 | 0.9329 | 0.3211 |
| 0.0004 | 6.5 | 832 | 0.9362 | 0.3208 |
| 0.0012 | 7.0 | 896 | 0.9430 | 0.3044 |
| 0.0191 | 7.5 | 960 | 0.9161 | 0.5238 |
| 0.0093 | 8.0 | 1024 | 0.9161 | 0.5086 |
| 0.0029 | 8.5 | 1088 | 0.9396 | 0.3395 |
| 0.0067 | 9.0 | 1152 | 0.9329 | 0.3446 |
| 0.0013 | 9.5 | 1216 | 0.9329 | 0.3731 |
| 0.0014 | 10.0 | 1280 | 0.9295 | 0.3921 |
| 0.0018 | 10.5 | 1344 | 0.9362 | 0.3714 |
| 0.0002 | 11.0 | 1408 | 0.9295 | 0.3655 |
| 0.001 | 11.5 | 1472 | 0.9362 | 0.4734 |
| 0.0035 | 12.0 | 1536 | 0.9463 | 0.3671 |
| 0.0183 | 12.5 | 1600 | 0.9295 | 0.4800 |
| 0.0148 | 13.0 | 1664 | 0.9262 | 0.5419 |
| 0.0067 | 13.5 | 1728 | 0.9362 | 0.5487 |
| 0.0029 | 14.0 | 1792 | 0.9362 | 0.5559 |
| 0.0008 | 14.5 | 1856 | 0.9396 | 0.5968 |
| 0.0008 | 15.0 | 1920 | 0.9463 | 0.4915 |
| 0.0016 | 15.5 | 1984 | 0.9430 | 0.5009 |
| 0.0001 | 16.0 | 2048 | 0.9430 | 0.5010 |
| 0.0005 | 16.5 | 2112 | 0.9430 | 0.4041 |
| 0.0042 | 17.0 | 2176 | 0.9463 | 0.5723 |
| 0.0019 | 17.5 | 2240 | 0.9262 | 0.6179 |
| 0.0031 | 18.0 | 2304 | 0.9463 | 0.4994 |
| 0.002 | 18.5 | 2368 | 0.9329 | 0.6108 |
| 0.0009 | 19.0 | 2432 | 0.9430 | 0.5313 |
| 0.0001 | 19.5 | 2496 | 0.9329 | 0.5379 |
| 0.0 | 20.0 | 2560 | 0.9362 | 0.5363 |
| 0.0002 | 20.5 | 2624 | 0.9362 | 0.5345 |
| 0.0001 | 21.0 | 2688 | 0.9430 | 0.5184 |
| 0.0035 | 21.5 | 2752 | 0.9195 | 0.6128 |
| 0.013 | 22.0 | 2816 | 0.9161 | 0.6121 |
| 0.0184 | 22.5 | 2880 | 0.9295 | 0.5990 |
| 0.0074 | 23.0 | 2944 | 0.9329 | 0.5262 |
| 0.0024 | 23.5 | 3008 | 0.9329 | 0.5637 |
| 0.0002 | 24.0 | 3072 | 0.9497 | 0.4270 |
| 0.0001 | 24.5 | 3136 | 0.9396 | 0.4547 |
| 0.0001 | 25.0 | 3200 | 0.9396 | 0.4528 |
| 0.0 | 25.5 | 3264 | 0.9396 | 0.4483 |
| 0.0001 | 26.0 | 3328 | 0.9396 | 0.4772 |
| 0.0018 | 26.5 | 3392 | 0.9430 | 0.4525 |
| 0.0119 | 27.0 | 3456 | 0.8893 | 0.7561 |
| 0.0052 | 27.5 | 3520 | 0.9329 | 0.6145 |
| 0.0007 | 28.0 | 3584 | 0.9228 | 0.7502 |
| 0.0002 | 28.5 | 3648 | 0.9228 | 0.6163 |
| 0.0004 | 29.0 | 3712 | 0.9262 | 0.6056 |
| 0.0002 | 29.5 | 3776 | 0.9262 | 0.6198 |
| 0.0002 | 30.0 | 3840 | 0.9262 | 0.6225 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.1
|
[
"bacterial leaf blight",
"brown spot",
"healthy rice leaf",
"leaf blast",
"leaf scald",
"narrow brown leaf spot",
"rice hispa",
"sheath blight"
] |
SodaXII/deit-base-patch16-224_rice-leaf-disease-augmented-v4_v5_fft
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# deit-base-patch16-224_rice-leaf-disease-augmented-v4_v5_fft
This model is a fine-tuned version of [facebook/deit-base-patch16-224](https://huggingface.co/facebook/deit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2620
- Accuracy: 0.9329
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 256
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.9056 | 0.5 | 64 | 1.4718 | 0.5906 |
| 0.9484 | 1.0 | 128 | 0.5152 | 0.8624 |
| 0.3332 | 1.5 | 192 | 0.3270 | 0.8893 |
| 0.1545 | 2.0 | 256 | 0.2024 | 0.9463 |
| 0.0326 | 2.5 | 320 | 0.1973 | 0.9396 |
| 0.0197 | 3.0 | 384 | 0.2748 | 0.9195 |
| 0.0037 | 3.5 | 448 | 0.2517 | 0.9228 |
| 0.002 | 4.0 | 512 | 0.2252 | 0.9329 |
| 0.0005 | 4.5 | 576 | 0.2425 | 0.9362 |
| 0.0004 | 5.0 | 640 | 0.2387 | 0.9329 |
| 0.0003 | 5.5 | 704 | 0.2384 | 0.9329 |
| 0.0003 | 6.0 | 768 | 0.2374 | 0.9329 |
| 0.0003 | 6.5 | 832 | 0.2374 | 0.9329 |
| 0.0003 | 7.0 | 896 | 0.2421 | 0.9329 |
| 0.0002 | 7.5 | 960 | 0.2315 | 0.9362 |
| 0.0002 | 8.0 | 1024 | 0.2387 | 0.9362 |
| 0.0001 | 8.5 | 1088 | 0.2378 | 0.9362 |
| 0.0001 | 9.0 | 1152 | 0.2405 | 0.9362 |
| 0.0001 | 9.5 | 1216 | 0.2395 | 0.9362 |
| 0.0001 | 10.0 | 1280 | 0.2381 | 0.9362 |
| 0.0001 | 10.5 | 1344 | 0.2377 | 0.9362 |
| 0.0001 | 11.0 | 1408 | 0.2379 | 0.9362 |
| 0.0001 | 11.5 | 1472 | 0.2430 | 0.9362 |
| 0.0001 | 12.0 | 1536 | 0.2367 | 0.9396 |
| 0.0001 | 12.5 | 1600 | 0.2403 | 0.9396 |
| 0.0001 | 13.0 | 1664 | 0.2412 | 0.9396 |
| 0.0001 | 13.5 | 1728 | 0.2437 | 0.9362 |
| 0.0001 | 14.0 | 1792 | 0.2425 | 0.9362 |
| 0.0001 | 14.5 | 1856 | 0.2435 | 0.9362 |
| 0.0001 | 15.0 | 1920 | 0.2431 | 0.9362 |
| 0.0001 | 15.5 | 1984 | 0.2437 | 0.9362 |
| 0.0001 | 16.0 | 2048 | 0.2437 | 0.9362 |
| 0.0001 | 16.5 | 2112 | 0.2414 | 0.9362 |
| 0.0001 | 17.0 | 2176 | 0.2497 | 0.9362 |
| 0.0 | 17.5 | 2240 | 0.2464 | 0.9362 |
| 0.0 | 18.0 | 2304 | 0.2490 | 0.9362 |
| 0.0 | 18.5 | 2368 | 0.2491 | 0.9362 |
| 0.0 | 19.0 | 2432 | 0.2490 | 0.9329 |
| 0.0 | 19.5 | 2496 | 0.2501 | 0.9329 |
| 0.0 | 20.0 | 2560 | 0.2500 | 0.9329 |
| 0.0 | 20.5 | 2624 | 0.2499 | 0.9329 |
| 0.0 | 21.0 | 2688 | 0.2509 | 0.9362 |
| 0.0 | 21.5 | 2752 | 0.2514 | 0.9362 |
| 0.0 | 22.0 | 2816 | 0.2552 | 0.9329 |
| 0.0 | 22.5 | 2880 | 0.2562 | 0.9329 |
| 0.0 | 23.0 | 2944 | 0.2536 | 0.9329 |
| 0.0 | 23.5 | 3008 | 0.2555 | 0.9329 |
| 0.0 | 24.0 | 3072 | 0.2554 | 0.9329 |
| 0.0 | 24.5 | 3136 | 0.2555 | 0.9329 |
| 0.0 | 25.0 | 3200 | 0.2557 | 0.9329 |
| 0.0 | 25.5 | 3264 | 0.2564 | 0.9362 |
| 0.0 | 26.0 | 3328 | 0.2567 | 0.9362 |
| 0.0 | 26.5 | 3392 | 0.2553 | 0.9396 |
| 0.0 | 27.0 | 3456 | 0.2610 | 0.9329 |
| 0.0 | 27.5 | 3520 | 0.2607 | 0.9329 |
| 0.0 | 28.0 | 3584 | 0.2607 | 0.9329 |
| 0.0 | 28.5 | 3648 | 0.2624 | 0.9329 |
| 0.0 | 29.0 | 3712 | 0.2620 | 0.9329 |
| 0.0 | 29.5 | 3776 | 0.2620 | 0.9329 |
| 0.0 | 30.0 | 3840 | 0.2620 | 0.9329 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.1
|
[
"bacterial leaf blight",
"brown spot",
"healthy rice leaf",
"leaf blast",
"leaf scald",
"narrow brown leaf spot",
"rice hispa",
"sheath blight"
] |
SodaXII/dinov2-base_rice-leaf-disease-augmented-v4_v5_fft
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dinov2-base_rice-leaf-disease-augmented-v4_v5_fft
This model is a fine-tuned version of [facebook/dinov2-base](https://huggingface.co/facebook/dinov2-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2683
- Accuracy: 0.9430
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 256
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.3383 | 0.5 | 64 | 0.4827 | 0.8356 |
| 0.2775 | 1.0 | 128 | 0.3518 | 0.8658 |
| 0.1456 | 1.5 | 192 | 0.5239 | 0.8490 |
| 0.1629 | 2.0 | 256 | 0.2593 | 0.9295 |
| 0.1255 | 2.5 | 320 | 0.3740 | 0.8993 |
| 0.1142 | 3.0 | 384 | 0.3753 | 0.9128 |
| 0.06 | 3.5 | 448 | 0.3722 | 0.9295 |
| 0.0587 | 4.0 | 512 | 0.4174 | 0.9228 |
| 0.0157 | 4.5 | 576 | 0.3364 | 0.9329 |
| 0.0062 | 5.0 | 640 | 0.2237 | 0.9396 |
| 0.0012 | 5.5 | 704 | 0.2186 | 0.9530 |
| 0.0001 | 6.0 | 768 | 0.2342 | 0.9430 |
| 0.0 | 6.5 | 832 | 0.2343 | 0.9430 |
| 0.0 | 7.0 | 896 | 0.2563 | 0.9430 |
| 0.0 | 7.5 | 960 | 0.2597 | 0.9430 |
| 0.0 | 8.0 | 1024 | 0.2546 | 0.9430 |
| 0.0 | 8.5 | 1088 | 0.2553 | 0.9430 |
| 0.0 | 9.0 | 1152 | 0.2562 | 0.9430 |
| 0.0 | 9.5 | 1216 | 0.2570 | 0.9430 |
| 0.0 | 10.0 | 1280 | 0.2564 | 0.9430 |
| 0.0 | 10.5 | 1344 | 0.2566 | 0.9430 |
| 0.0 | 11.0 | 1408 | 0.2565 | 0.9430 |
| 0.0 | 11.5 | 1472 | 0.2578 | 0.9430 |
| 0.0 | 12.0 | 1536 | 0.2580 | 0.9430 |
| 0.0 | 12.5 | 1600 | 0.2571 | 0.9430 |
| 0.0 | 13.0 | 1664 | 0.2590 | 0.9430 |
| 0.0 | 13.5 | 1728 | 0.2599 | 0.9430 |
| 0.0 | 14.0 | 1792 | 0.2595 | 0.9430 |
| 0.0 | 14.5 | 1856 | 0.2594 | 0.9430 |
| 0.0 | 15.0 | 1920 | 0.2597 | 0.9430 |
| 0.0 | 15.5 | 1984 | 0.2596 | 0.9430 |
| 0.0 | 16.0 | 2048 | 0.2597 | 0.9430 |
| 0.0 | 16.5 | 2112 | 0.2605 | 0.9430 |
| 0.0 | 17.0 | 2176 | 0.2602 | 0.9430 |
| 0.0 | 17.5 | 2240 | 0.2608 | 0.9430 |
| 0.0 | 18.0 | 2304 | 0.2617 | 0.9430 |
| 0.0 | 18.5 | 2368 | 0.2628 | 0.9430 |
| 0.0 | 19.0 | 2432 | 0.2621 | 0.9430 |
| 0.0 | 19.5 | 2496 | 0.2621 | 0.9430 |
| 0.0 | 20.0 | 2560 | 0.2621 | 0.9430 |
| 0.0 | 20.5 | 2624 | 0.2621 | 0.9430 |
| 0.0 | 21.0 | 2688 | 0.2625 | 0.9430 |
| 0.0 | 21.5 | 2752 | 0.2625 | 0.9430 |
| 0.0 | 22.0 | 2816 | 0.2638 | 0.9430 |
| 0.0 | 22.5 | 2880 | 0.2648 | 0.9430 |
| 0.0 | 23.0 | 2944 | 0.2648 | 0.9430 |
| 0.0 | 23.5 | 3008 | 0.2645 | 0.9430 |
| 0.0 | 24.0 | 3072 | 0.2652 | 0.9430 |
| 0.0 | 24.5 | 3136 | 0.2654 | 0.9430 |
| 0.0 | 25.0 | 3200 | 0.2654 | 0.9430 |
| 0.0 | 25.5 | 3264 | 0.2654 | 0.9430 |
| 0.0 | 26.0 | 3328 | 0.2658 | 0.9430 |
| 0.0 | 26.5 | 3392 | 0.2657 | 0.9430 |
| 0.0 | 27.0 | 3456 | 0.2670 | 0.9430 |
| 0.0 | 27.5 | 3520 | 0.2680 | 0.9430 |
| 0.0 | 28.0 | 3584 | 0.2681 | 0.9430 |
| 0.0 | 28.5 | 3648 | 0.2687 | 0.9430 |
| 0.0 | 29.0 | 3712 | 0.2684 | 0.9430 |
| 0.0 | 29.5 | 3776 | 0.2683 | 0.9430 |
| 0.0 | 30.0 | 3840 | 0.2683 | 0.9430 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.1
|
[
"bacterial leaf blight",
"brown spot",
"healthy rice leaf",
"leaf blast",
"leaf scald",
"narrow brown leaf spot",
"rice hispa",
"sheath blight"
] |
SodaXII/convnextv2-base-1k-224_rice-leaf-disease-augmented-v4_v5_fft
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# convnextv2-base-1k-224_rice-leaf-disease-augmented-v4_v5_fft
This model is a fine-tuned version of [facebook/convnextv2-base-1k-224](https://huggingface.co/facebook/convnextv2-base-1k-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3987
- Accuracy: 0.9362
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 256
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.0296 | 0.5 | 64 | 1.8920 | 0.4799 |
| 1.5755 | 1.0 | 128 | 1.2129 | 0.7651 |
| 0.9059 | 1.5 | 192 | 0.6750 | 0.8289 |
| 0.4659 | 2.0 | 256 | 0.3395 | 0.9128 |
| 0.2039 | 2.5 | 320 | 0.2663 | 0.9362 |
| 0.1195 | 3.0 | 384 | 0.2652 | 0.9329 |
| 0.0556 | 3.5 | 448 | 0.2046 | 0.9396 |
| 0.0271 | 4.0 | 512 | 0.2504 | 0.9396 |
| 0.0096 | 4.5 | 576 | 0.2902 | 0.9329 |
| 0.0058 | 5.0 | 640 | 0.2492 | 0.9362 |
| 0.0032 | 5.5 | 704 | 0.2680 | 0.9329 |
| 0.0027 | 6.0 | 768 | 0.2695 | 0.9430 |
| 0.0026 | 6.5 | 832 | 0.2680 | 0.9396 |
| 0.0023 | 7.0 | 896 | 0.2648 | 0.9430 |
| 0.0014 | 7.5 | 960 | 0.2618 | 0.9463 |
| 0.001 | 8.0 | 1024 | 0.2782 | 0.9396 |
| 0.0008 | 8.5 | 1088 | 0.2870 | 0.9430 |
| 0.0007 | 9.0 | 1152 | 0.2964 | 0.9396 |
| 0.0006 | 9.5 | 1216 | 0.2964 | 0.9430 |
| 0.0006 | 10.0 | 1280 | 0.3007 | 0.9430 |
| 0.0006 | 10.5 | 1344 | 0.3016 | 0.9430 |
| 0.0005 | 11.0 | 1408 | 0.3014 | 0.9430 |
| 0.0005 | 11.5 | 1472 | 0.3089 | 0.9396 |
| 0.0005 | 12.0 | 1536 | 0.3132 | 0.9396 |
| 0.0004 | 12.5 | 1600 | 0.3153 | 0.9362 |
| 0.0003 | 13.0 | 1664 | 0.3209 | 0.9396 |
| 0.0003 | 13.5 | 1728 | 0.3287 | 0.9396 |
| 0.0003 | 14.0 | 1792 | 0.3284 | 0.9396 |
| 0.0003 | 14.5 | 1856 | 0.3342 | 0.9396 |
| 0.0003 | 15.0 | 1920 | 0.3333 | 0.9396 |
| 0.0003 | 15.5 | 1984 | 0.3340 | 0.9396 |
| 0.0003 | 16.0 | 2048 | 0.3342 | 0.9396 |
| 0.0003 | 16.5 | 2112 | 0.3338 | 0.9396 |
| 0.0002 | 17.0 | 2176 | 0.3515 | 0.9362 |
| 0.0002 | 17.5 | 2240 | 0.3447 | 0.9396 |
| 0.0002 | 18.0 | 2304 | 0.3551 | 0.9396 |
| 0.0002 | 18.5 | 2368 | 0.3551 | 0.9396 |
| 0.0002 | 19.0 | 2432 | 0.3570 | 0.9396 |
| 0.0002 | 19.5 | 2496 | 0.3606 | 0.9396 |
| 0.0002 | 20.0 | 2560 | 0.3593 | 0.9396 |
| 0.0002 | 20.5 | 2624 | 0.3592 | 0.9396 |
| 0.0002 | 21.0 | 2688 | 0.3612 | 0.9396 |
| 0.0001 | 21.5 | 2752 | 0.3726 | 0.9396 |
| 0.0001 | 22.0 | 2816 | 0.3690 | 0.9396 |
| 0.0001 | 22.5 | 2880 | 0.3777 | 0.9362 |
| 0.0001 | 23.0 | 2944 | 0.3725 | 0.9396 |
| 0.0001 | 23.5 | 3008 | 0.3751 | 0.9396 |
| 0.0001 | 24.0 | 3072 | 0.3782 | 0.9396 |
| 0.0001 | 24.5 | 3136 | 0.3805 | 0.9396 |
| 0.0001 | 25.0 | 3200 | 0.3804 | 0.9396 |
| 0.0001 | 25.5 | 3264 | 0.3824 | 0.9396 |
| 0.0001 | 26.0 | 3328 | 0.3891 | 0.9362 |
| 0.0001 | 26.5 | 3392 | 0.3921 | 0.9329 |
| 0.0001 | 27.0 | 3456 | 0.3896 | 0.9362 |
| 0.0001 | 27.5 | 3520 | 0.3978 | 0.9362 |
| 0.0001 | 28.0 | 3584 | 0.3952 | 0.9362 |
| 0.0001 | 28.5 | 3648 | 0.3994 | 0.9362 |
| 0.0001 | 29.0 | 3712 | 0.3989 | 0.9362 |
| 0.0001 | 29.5 | 3776 | 0.3985 | 0.9362 |
| 0.0001 | 30.0 | 3840 | 0.3987 | 0.9362 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.1
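A minimal inference sketch (assuming the fine-tuned checkpoint is published as `SodaXII/convnextv2-base-1k-224_rice-leaf-disease-augmented-v4_v5_fft` and that `leaf.jpg` is a local rice-leaf photo):
```python
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="SodaXII/convnextv2-base-1k-224_rice-leaf-disease-augmented-v4_v5_fft",
)
print(classifier("leaf.jpg"))  # e.g. [{'label': 'brown spot', 'score': ...}, ...]
```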
|
[
"bacterial leaf blight",
"brown spot",
"healthy rice leaf",
"leaf blast",
"leaf scald",
"narrow brown leaf spot",
"rice hispa",
"sheath blight"
] |
Pratigya0/dit-base-finetuned-rvlcdip-finetuned-document-classifier
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dit-base-finetuned-rvlcdip-finetuned-document-classifier
This model is a fine-tuned version of [microsoft/dit-base-finetuned-rvlcdip](https://huggingface.co/microsoft/dit-base-finetuned-rvlcdip) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4756
- Accuracy: 0.936
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
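The total train batch size listed above is not an independent setting; it follows from gradient accumulation, as this small illustrative check shows:
```python
per_device_train_batch_size = 32
gradient_accumulation_steps = 4
num_devices = 1  # assumption: a single device was used

# Effective batch size per optimizer step, matching "total_train_batch_size" above.
total_train_batch_size = (
    per_device_train_batch_size * gradient_accumulation_steps * num_devices
)
assert total_train_batch_size == 128
```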
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 1.5664 | 0.9577 | 17 | 1.3503 | 0.624 |
| 1.1861 | 1.9577 | 34 | 0.9458 | 0.888 |
| 0.7973 | 2.9577 | 51 | 0.6490 | 0.92 |
| 0.7129 | 3.9577 | 68 | 0.5158 | 0.928 |
| 0.5476 | 4.9577 | 85 | 0.4756 | 0.936 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
|
[
"0",
"2",
"4",
"6",
"9"
] |
SodaXII/vit-hybrid-base-bit-384_rice-leaf-disease-augmented-v4_v5_fft
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-hybrid-base-bit-384_rice-leaf-disease-augmented-v4_v5_fft
This model is a fine-tuned version of [google/vit-hybrid-base-bit-384](https://huggingface.co/google/vit-hybrid-base-bit-384) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2360
- Accuracy: 0.9664
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 512
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.3979 | 0.5 | 128 | 0.5143 | 0.8289 |
| 0.225 | 1.0 | 256 | 0.2283 | 0.9228 |
| 0.0585 | 1.5 | 384 | 0.2963 | 0.9094 |
| 0.062 | 2.0 | 512 | 0.1543 | 0.9530 |
| 0.0166 | 2.5 | 640 | 0.3726 | 0.9295 |
| 0.0189 | 3.0 | 768 | 0.2889 | 0.9430 |
| 0.0126 | 3.5 | 896 | 0.2025 | 0.9497 |
| 0.0031 | 4.0 | 1024 | 0.2456 | 0.9597 |
| 0.0001 | 4.5 | 1152 | 0.2122 | 0.9631 |
| 0.0 | 5.0 | 1280 | 0.2111 | 0.9631 |
| 0.0 | 5.5 | 1408 | 0.2112 | 0.9631 |
| 0.0 | 6.0 | 1536 | 0.2110 | 0.9631 |
| 0.0 | 6.5 | 1664 | 0.2110 | 0.9631 |
| 0.0 | 7.0 | 1792 | 0.2117 | 0.9631 |
| 0.0 | 7.5 | 1920 | 0.2121 | 0.9631 |
| 0.0 | 8.0 | 2048 | 0.2133 | 0.9631 |
| 0.0 | 8.5 | 2176 | 0.2144 | 0.9631 |
| 0.0 | 9.0 | 2304 | 0.2148 | 0.9631 |
| 0.0 | 9.5 | 2432 | 0.2155 | 0.9631 |
| 0.0 | 10.0 | 2560 | 0.2154 | 0.9631 |
| 0.0 | 10.5 | 2688 | 0.2154 | 0.9631 |
| 0.0 | 11.0 | 2816 | 0.2154 | 0.9631 |
| 0.0 | 11.5 | 2944 | 0.2152 | 0.9631 |
| 0.0 | 12.0 | 3072 | 0.2162 | 0.9631 |
| 0.0 | 12.5 | 3200 | 0.2184 | 0.9631 |
| 0.0 | 13.0 | 3328 | 0.2174 | 0.9631 |
| 0.0 | 13.5 | 3456 | 0.2183 | 0.9664 |
| 0.0 | 14.0 | 3584 | 0.2189 | 0.9664 |
| 0.0 | 14.5 | 3712 | 0.2194 | 0.9664 |
| 0.0 | 15.0 | 3840 | 0.2194 | 0.9664 |
| 0.0 | 15.5 | 3968 | 0.2194 | 0.9664 |
| 0.0 | 16.0 | 4096 | 0.2194 | 0.9664 |
| 0.0 | 16.5 | 4224 | 0.2212 | 0.9664 |
| 0.0 | 17.0 | 4352 | 0.2221 | 0.9664 |
| 0.0 | 17.5 | 4480 | 0.2231 | 0.9664 |
| 0.0 | 18.0 | 4608 | 0.2227 | 0.9664 |
| 0.0 | 18.5 | 4736 | 0.2237 | 0.9664 |
| 0.0 | 19.0 | 4864 | 0.2237 | 0.9664 |
| 0.0 | 19.5 | 4992 | 0.2240 | 0.9664 |
| 0.0 | 20.0 | 5120 | 0.2240 | 0.9664 |
| 0.0 | 20.5 | 5248 | 0.2241 | 0.9664 |
| 0.0 | 21.0 | 5376 | 0.2239 | 0.9664 |
| 0.0 | 21.5 | 5504 | 0.2245 | 0.9664 |
| 0.0 | 22.0 | 5632 | 0.2267 | 0.9664 |
| 0.0 | 22.5 | 5760 | 0.2271 | 0.9664 |
| 0.0 | 23.0 | 5888 | 0.2285 | 0.9664 |
| 0.0 | 23.5 | 6016 | 0.2285 | 0.9664 |
| 0.0 | 24.0 | 6144 | 0.2289 | 0.9664 |
| 0.0 | 24.5 | 6272 | 0.2292 | 0.9664 |
| 0.0 | 25.0 | 6400 | 0.2293 | 0.9664 |
| 0.0 | 25.5 | 6528 | 0.2298 | 0.9664 |
| 0.0 | 26.0 | 6656 | 0.2309 | 0.9664 |
| 0.0 | 26.5 | 6784 | 0.2316 | 0.9664 |
| 0.0 | 27.0 | 6912 | 0.2330 | 0.9664 |
| 0.0 | 27.5 | 7040 | 0.2348 | 0.9664 |
| 0.0 | 28.0 | 7168 | 0.2350 | 0.9664 |
| 0.0 | 28.5 | 7296 | 0.2357 | 0.9664 |
| 0.0 | 29.0 | 7424 | 0.2358 | 0.9664 |
| 0.0 | 29.5 | 7552 | 0.2360 | 0.9664 |
| 0.0 | 30.0 | 7680 | 0.2360 | 0.9664 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.1
|
[
"bacterial leaf blight",
"brown spot",
"healthy rice leaf",
"leaf blast",
"leaf scald",
"narrow brown leaf spot",
"rice hispa",
"sheath blight"
] |
anusha2002/model_name-finetuned-catdog
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# model_name-finetuned-catdog
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0076
- Accuracy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 6
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| No log | 0.9032 | 7 | 0.3160 | 0.9756 |
| 0.5269 | 1.9032 | 14 | 0.0562 | 1.0 |
| 0.1105 | 2.9032 | 21 | 0.0136 | 1.0 |
| 0.1105 | 3.9032 | 28 | 0.0076 | 1.0 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cpu
- Datasets 3.5.0
- Tokenizers 0.21.0
|
[
"cat",
"dog"
] |
encku/corbin-04-2025
|
# Model Trained Using AutoTrain
- Problem type: Image Classification
## Validation Metrics
loss: 0.0022578395437449217
f1_macro: 0.9993728898422963
f1_micro: 0.9994462901439646
f1_weighted: 0.9994461479999952
precision_macro: 0.999304490903036
precision_micro: 0.9994462901439646
precision_weighted: 0.9994521433108384
recall_macro: 0.999448205969945
recall_micro: 0.9994462901439646
recall_weighted: 0.9994462901439646
accuracy: 0.9994462901439646
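Since the report lists macro-, micro-, and weighted-averaged scores: macro averages per-class F1 equally, micro pools all predictions globally (for single-label classification it equals accuracy), and weighted scales per-class F1 by class support. A hedged scikit-learn sketch (the label arrays are placeholders, not the actual validation data):
```python
from sklearn.metrics import f1_score

y_true = [0, 1, 2, 2, 1]  # placeholder labels, not the real validation set
y_pred = [0, 1, 2, 1, 1]

print(f1_score(y_true, y_pred, average="macro"))     # unweighted mean of per-class F1
print(f1_score(y_true, y_pred, average="micro"))     # global pooling of TP/FP/FN
print(f1_score(y_true, y_pred, average="weighted"))  # per-class F1 weighted by support
```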
|
[
"018200250002",
"018200250019",
"018200250101",
"018200261244",
"01823743",
"021136180596",
"021136180947",
"021136181364",
"021136181371",
"025000058011",
"03435515",
"049000003710",
"049000007909",
"049000019162",
"049000040869",
"049000071542",
"04904403",
"04904500",
"04976400",
"04997704",
"070847012474",
"070847811169",
"070847898245",
"071990095451",
"071990300654",
"080660956435",
"080660957210",
"083783375534",
"083900005757",
"083900005771",
"085000027141",
"085000028728",
"085000029275",
"085000031377",
"087692832317",
"786162200433",
"786162338006",
"796030250965",
"810628031474",
"816751021214",
"855352008064",
"857531005284"
] |
prithivMLmods/Anime-Classification-v1.0
|

# **Anime-Classification-v1.0**
> **Anime-Classification-v1.0** is an image classification vision-language encoder model fine-tuned from **google/siglip2-base-patch16-224** for a single-label classification task. It is designed to classify anime-related images using the **SiglipForImageClassification** architecture.
```py
Classification Report:
              precision    recall  f1-score   support

          3D     0.7979    0.8443    0.8204      4649
     Bangumi     0.8677    0.8728    0.8702      4914
       Comic     0.9716    0.9233    0.9468      5746
Illustration     0.8204    0.8186    0.8195      6064

    accuracy                         0.8648     21373
   macro avg     0.8644    0.8647    0.8642     21373
weighted avg     0.8670    0.8648    0.8656     21373
```

---
The model categorizes images into 4 anime-related classes:
```
Class 0: "3D"
Class 1: "Bangumi"
Class 2: "Comic"
Class 3: "Illustration"
```
---
## **Install dependencies**
```python
!pip install -q transformers torch pillow gradio
```
---
## **Inference Code**
```python
import gradio as gr
from transformers import AutoImageProcessor, SiglipForImageClassification
from PIL import Image
import torch
# Load model and processor
model_name = "prithivMLmods/Anime-Classification-v1.0" # New model name
model = SiglipForImageClassification.from_pretrained(model_name)
processor = AutoImageProcessor.from_pretrained(model_name)
def classify_anime_image(image):
    """Predicts the anime category for an input image."""
    image = Image.fromarray(image).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")

    with torch.no_grad():
        outputs = model(**inputs)
        logits = outputs.logits
        probs = torch.nn.functional.softmax(logits, dim=1).squeeze().tolist()

    labels = {
        "0": "3D", "1": "Bangumi", "2": "Comic", "3": "Illustration"
    }
    predictions = {labels[str(i)]: round(probs[i], 3) for i in range(len(probs))}

    return predictions

# Create Gradio interface
iface = gr.Interface(
    fn=classify_anime_image,
    inputs=gr.Image(type="numpy"),
    outputs=gr.Label(label="Prediction Scores"),
    title="Anime Classification v1.0",
    description="Upload an image to classify the anime style category."
)

if __name__ == "__main__":
    iface.launch()
```
---
## **Intended Use:**
The **Anime-Classification-v1.0** model is designed to classify anime-related images. Potential use cases include:
- **Content Tagging:** Automatically label anime artwork on platforms or apps.
- **Recommendation Engines:** Enhance personalized anime content suggestions.
- **Digital Art Curation:** Organize galleries by anime style for artists and fans.
- **Dataset Filtering:** Categorize and filter images during dataset creation.
|
[
"3d",
"bangumi",
"comic",
"illustration"
] |
ppicazo/autotrain-ap-pass-fail-v1
|
# Model Trained Using AutoTrain
- Problem type: Image Classification
## Validation Metrics
loss: 0.5907868146896362
f1: 0.5454545454545454
precision: 0.46153846153846156
recall: 0.6666666666666666
auc: 0.7388888888888889
accuracy: 0.6551724137931034
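The reported F1 is consistent with the precision and recall above, since F1 is their harmonic mean; a quick check:
```python
precision = 0.46153846153846156
recall = 0.6666666666666666

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(f1)  # 0.5454545454545454, matching the reported value
```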
|
[
"fail",
"pass"
] |
zaidlodu/swin-tiny-patch4-window7-224-finetuned-eurosat
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-eurosat
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
|
[
"no_tumor_512",
"tumor_512"
] |
zaidlodu/vit-base-patch16-224-finetuned-eurosat
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-finetuned-eurosat
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
|
[
"no_tumor_512",
"tumor_512"
] |
YassineKader/vit-nsfw-classifier
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-nsfw-classifier
This model is a fine-tuned version of [AdamCodd/vit-base-nsfw-detector](https://huggingface.co/AdamCodd/vit-base-nsfw-detector) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0003
- Accuracy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 1
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.51.1
- Pytorch 2.5.1+cu124
- Datasets 3.5.0
- Tokenizers 0.21.0
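A minimal inference sketch (assuming the checkpoint is published as `YassineKader/vit-nsfw-classifier` and `photo.jpg` is a local image):
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

processor = AutoImageProcessor.from_pretrained("YassineKader/vit-nsfw-classifier")
model = AutoModelForImageClassification.from_pretrained("YassineKader/vit-nsfw-classifier")

image = Image.open("photo.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the argmax index back through the model config: "sfw" or "nsfw".
print(model.config.id2label[logits.argmax(-1).item()])
```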
|
[
"sfw",
"nsfw"
] |
BeckerAnas/convnextv2-tiny-1k-224-finetuned-cifar10
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# convnextv2-tiny-1k-224-finetuned-cifar10
This model is a fine-tuned version of [facebook/convnextv2-tiny-1k-224](https://huggingface.co/facebook/convnextv2-tiny-1k-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6668
- Accuracy: 0.8427
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 1024
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
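With `lr_scheduler_warmup_ratio: 0.1`, the warmup length is a fraction of the total optimizer steps rather than a fixed count; using the 120 steps shown in the results table below, it works out as follows (illustrative):
```python
total_optimizer_steps = 120  # final step in the training results below
warmup_ratio = 0.1

warmup_steps = int(total_optimizer_steps * warmup_ratio)
print(warmup_steps)  # 12 steps of linear warmup before linear decay
```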
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 1.516 | 1.0 | 41 | 1.2639 | 0.6939 |
| 1.0647 | 2.0 | 82 | 0.7782 | 0.8099 |
| 0.9449 | 2.9441 | 120 | 0.6668 | 0.8427 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.7.0+cpu
- Datasets 3.3.2
- Tokenizers 0.21.0
|
[
"airplane",
"automobile",
"bird",
"cat",
"deer",
"dog",
"frog",
"horse",
"ship",
"truck"
] |
wmeynard/vit-animals
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-animals
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the mertcobanov/animals dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2444
- Accuracy: 0.9565
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 2.9211 | 0.4926 | 100 | 2.8644 | 0.8963 |
| 1.7472 | 0.9852 | 200 | 1.6272 | 0.9380 |
| 0.6862 | 1.4778 | 300 | 0.7584 | 0.9519 |
| 0.3567 | 1.9704 | 400 | 0.4741 | 0.9519 |
| 0.167 | 2.4631 | 500 | 0.3281 | 0.9546 |
| 0.1162 | 2.9557 | 600 | 0.2864 | 0.9565 |
| 0.0915 | 3.4483 | 700 | 0.2587 | 0.9528 |
| 0.0847 | 3.9409 | 800 | 0.2444 | 0.9565 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0
- Datasets 3.5.0
- Tokenizers 0.21.1
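A minimal inference sketch for this 90-class animal classifier (assuming the checkpoint is published as `wmeynard/vit-animals` and `animal.jpg` is a local photo):
```python
from transformers import pipeline

classifier = pipeline("image-classification", model="wmeynard/vit-animals")
for pred in classifier("animal.jpg", top_k=5):  # five most likely of the 90 classes
    print(f"{pred['label']}: {pred['score']:.3f}")
```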
|
[
"antelope",
"badger",
"caterpillar",
"chimpanzee",
"cockroach",
"cow",
"coyote",
"crab",
"crow",
"deer",
"dog",
"dolphin",
"bat",
"donkey",
"dragonfly",
"duck",
"eagle",
"elephant",
"flamingo",
"fly",
"fox",
"goat",
"goldfish",
"bear",
"goose",
"gorilla",
"grasshopper",
"hamster",
"hare",
"hedgehog",
"hippopotamus",
"hornbill",
"horse",
"hummingbird",
"bee",
"hyena",
"jellyfish",
"kangaroo",
"koala",
"ladybugs",
"leopard",
"lion",
"lizard",
"lobster",
"mosquito",
"beetle",
"moth",
"mouse",
"octopus",
"okapi",
"orangutan",
"otter",
"owl",
"ox",
"oyster",
"panda",
"bison",
"parrot",
"pelecaniformes",
"penguin",
"pig",
"pigeon",
"porcupine",
"possum",
"raccoon",
"rat",
"reindeer",
"boar",
"rhinoceros",
"sandpiper",
"seahorse",
"seal",
"shark",
"sheep",
"snake",
"sparrow",
"squid",
"squirrel",
"butterfly",
"starfish",
"swan",
"tiger",
"turkey",
"turtle",
"whale",
"wolf",
"wombat",
"woodpecker",
"zebra",
"cat"
] |
abubakarabdullah/swin-tiny-patch4-window7-224-finetuned-eurosat
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-eurosat
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1792
- Accuracy: 0.4627
- F1: 0.1004
- Precision: 0.2132
- Recall: 0.0852
- Auc Roc: 0.7403
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 40
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall | Auc Roc |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|:-------:|
| 0.2828 | 1.0 | 82 | 0.2357 | 0.3645 | 0.0412 | 0.0410 | 0.0414 | 0.5445 |
| 0.2048 | 2.0 | 164 | 0.1922 | 0.5713 | 0.0492 | 0.0396 | 0.0648 | 0.5762 |
| 0.2067 | 3.0 | 246 | 0.1899 | 0.4824 | 0.0493 | 0.1107 | 0.0550 | 0.6240 |
| 0.1974 | 4.0 | 328 | 0.1870 | 0.4079 | 0.0474 | 0.0485 | 0.0463 | 0.6619 |
| 0.2021 | 5.0 | 410 | 0.2048 | 0.1612 | 0.0358 | 0.1222 | 0.0227 | 0.6841 |
| 0.1843 | 6.0 | 492 | 0.1838 | 0.3709 | 0.0625 | 0.0967 | 0.0526 | 0.6965 |
| 0.187 | 7.0 | 574 | 0.1800 | 0.4009 | 0.0610 | 0.0963 | 0.0535 | 0.7127 |
| 0.1992 | 8.0 | 656 | 0.1775 | 0.4547 | 0.0563 | 0.1095 | 0.0548 | 0.7172 |
| 0.179 | 9.0 | 738 | 0.1863 | 0.3316 | 0.0667 | 0.1044 | 0.0521 | 0.7192 |
| 0.1683 | 10.0 | 820 | 0.1786 | 0.4165 | 0.0573 | 0.0951 | 0.0521 | 0.7194 |
| 0.1742 | 11.0 | 902 | 0.1764 | 0.4575 | 0.0714 | 0.1289 | 0.0640 | 0.7240 |
| 0.1871 | 12.0 | 984 | 0.1767 | 0.5130 | 0.0626 | 0.1030 | 0.0642 | 0.7272 |
| 0.1802 | 13.0 | 1066 | 0.1750 | 0.4575 | 0.0604 | 0.1088 | 0.0573 | 0.7305 |
| 0.1666 | 14.0 | 1148 | 0.1755 | 0.4414 | 0.0847 | 0.1338 | 0.0732 | 0.7282 |
| 0.1674 | 15.0 | 1230 | 0.1767 | 0.4263 | 0.0733 | 0.0990 | 0.0651 | 0.7296 |
| 0.1846 | 16.0 | 1312 | 0.1771 | 0.4460 | 0.0866 | 0.1363 | 0.0784 | 0.7349 |
| 0.1819 | 17.0 | 1394 | 0.1762 | 0.4356 | 0.0686 | 0.1063 | 0.0604 | 0.7282 |
| 0.1594 | 18.0 | 1476 | 0.1757 | 0.4298 | 0.0804 | 0.1289 | 0.0699 | 0.7345 |
| 0.1805 | 19.0 | 1558 | 0.1783 | 0.4206 | 0.0884 | 0.1358 | 0.0803 | 0.7323 |
| 0.164 | 20.0 | 1640 | 0.1768 | 0.4858 | 0.0896 | 0.1683 | 0.0816 | 0.7377 |
| 0.1673 | 21.0 | 1722 | 0.1779 | 0.4292 | 0.0888 | 0.1500 | 0.0764 | 0.7281 |
| 0.1719 | 22.0 | 1804 | 0.1780 | 0.5142 | 0.0783 | 0.1409 | 0.0746 | 0.7358 |
| 0.1598 | 23.0 | 1886 | 0.1762 | 0.4801 | 0.0876 | 0.2268 | 0.0778 | 0.7387 |
| 0.1682 | 24.0 | 1968 | 0.1788 | 0.4570 | 0.0954 | 0.1420 | 0.0849 | 0.7406 |
| 0.1558 | 25.0 | 2050 | 0.1795 | 0.4391 | 0.1023 | 0.2186 | 0.0912 | 0.7396 |
| 0.1695 | 26.0 | 2132 | 0.1768 | 0.4830 | 0.0881 | 0.2050 | 0.0789 | 0.7368 |
| 0.1514 | 27.0 | 2214 | 0.1771 | 0.4402 | 0.0958 | 0.2160 | 0.0807 | 0.7385 |
| 0.1669 | 28.0 | 2296 | 0.1773 | 0.4876 | 0.0892 | 0.2195 | 0.0791 | 0.7428 |
| 0.156 | 29.0 | 2378 | 0.1802 | 0.4107 | 0.1000 | 0.2148 | 0.0820 | 0.7331 |
| 0.1555 | 30.0 | 2460 | 0.1795 | 0.4258 | 0.0975 | 0.2033 | 0.0832 | 0.7336 |
| 0.1527 | 31.0 | 2542 | 0.1773 | 0.4853 | 0.0955 | 0.2149 | 0.0835 | 0.7407 |
| 0.1592 | 32.0 | 2624 | 0.1786 | 0.4830 | 0.0920 | 0.2138 | 0.0810 | 0.7402 |
| 0.1567 | 33.0 | 2706 | 0.1788 | 0.4916 | 0.0908 | 0.2125 | 0.0795 | 0.7415 |
| 0.1595 | 34.0 | 2788 | 0.1788 | 0.4558 | 0.0993 | 0.2175 | 0.0835 | 0.7385 |
| 0.1431 | 35.0 | 2870 | 0.1791 | 0.4529 | 0.1043 | 0.2087 | 0.0867 | 0.7364 |
| 0.1709 | 36.0 | 2952 | 0.1792 | 0.4541 | 0.1009 | 0.2145 | 0.0861 | 0.7389 |
| 0.1525 | 37.0 | 3034 | 0.1789 | 0.4714 | 0.0957 | 0.2123 | 0.0832 | 0.7405 |
| 0.1467 | 38.0 | 3116 | 0.1791 | 0.4639 | 0.0981 | 0.2110 | 0.0840 | 0.7397 |
| 0.1531 | 39.0 | 3198 | 0.1791 | 0.4633 | 0.1000 | 0.2123 | 0.0850 | 0.7404 |
| 0.1507 | 39.5169 | 3240 | 0.1792 | 0.4627 | 0.1004 | 0.2132 | 0.0852 | 0.7403 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
|
[
"0",
"1",
"2",
"3",
"4",
"5",
"6",
"7",
"8",
"9",
"10",
"11",
"12",
"13",
"14"
] |
StealBlu/fruit_classification
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fruit_classification
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- eval_loss: 2.9073
- eval_model_preparation_time: 0.0021
- eval_accuracy: 0.0655
- eval_runtime: 25.5029
- eval_samples_per_second: 1229.662
- eval_steps_per_second: 153.708
- step: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 8
- seed: 42
- distributed_type: tpu
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cpu
- Datasets 3.5.0
- Tokenizers 0.21.1
|
[
"0",
"1",
"2",
"3",
"4",
"5",
"6",
"7",
"8",
"9",
"10",
"11",
"12",
"13",
"14",
"15",
"16",
"17"
] |
minsunny/swin-tiny-patch4-window7-224-finetuned-cifar10
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-cifar10
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0688
- Accuracy: 0.9744
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.3869 | 1.0 | 352 | 0.0846 | 0.9732 |
| 0.3457 | 2.0 | 704 | 0.0768 | 0.9732 |
| 0.2653 | 2.9922 | 1053 | 0.0688 | 0.9744 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
|
[
"airplane",
"automobile",
"bird",
"cat",
"deer",
"dog",
"frog",
"horse",
"ship",
"truck"
] |
encku/tuborg-04-2025v2
|
# Model Trained Using AutoTrain
- Problem type: Image Classification
## Validation Metrics
loss: 0.0012685584370046854
f1_macro: 0.9997715102411731
f1_micro: 0.9996524519514823
f1_weighted: 0.9996523835288024
precision_macro: 0.99980800467781
precision_micro: 0.9996524519514823
precision_weighted: 0.9996578833419523
recall_macro: 0.9997386659663885
recall_micro: 0.9996524519514823
recall_weighted: 0.9996524519514823
accuracy: 0.9996524519514823
|
[
"6974202725334",
"c001",
"c002",
"c002_4lu",
"c003",
"c004",
"c005",
"c006",
"c006_4lu",
"c007",
"c008",
"c009",
"c010",
"c012",
"c014",
"c015",
"c015_4lu",
"c017",
"c018",
"c020",
"c020_4lu",
"c021",
"c022",
"c023",
"c024",
"c025",
"c026",
"c027",
"c028",
"c029",
"c030",
"c031",
"c032",
"c033",
"c034",
"c035",
"c036",
"c037",
"c038",
"c039",
"c040",
"c041",
"c042",
"c043",
"c044",
"tbrg066",
"tbrg067",
"tbrg068",
"tbrg072",
"tbrg073",
"tbrg074",
"tbrg075",
"tbrg085",
"tbrg086",
"tbrg087",
"tbrg090",
"tbrg092",
"tbrg093",
"tbrg096",
"tbrg097",
"tbrg098",
"tbrg100",
"tbrg156",
"tbrg157",
"tbrg158",
"tbrg159",
"tt00277",
"tt00523",
"tt00677",
"tt00677-1",
"tt00685",
"tt00735",
"tt00737",
"tt00765",
"tt00792",
"tt00792-1",
"tt00793",
"tt00793-1",
"tt00810",
"tt00811",
"tt00812",
"tt00852",
"tt00853",
"tt00854",
"tt00857",
"tt00857-1",
"tt00859",
"tt00875",
"tt00875-1",
"tt00876",
"tt00876-1",
"tt00893",
"tt00904",
"tt00944",
"tt00945",
"tt00947",
"tt00964",
"tt00980",
"tt00989",
"tt01001",
"tt01020",
"tt01037",
"tt01069",
"tt01070",
"tt01071",
"tt01072",
"tt01142",
"tt01148",
"tt01149",
"tt01150",
"tt01152",
"tt01155",
"tt01160",
"tt01162",
"tt01169",
"tt01172",
"tt01174",
"tt01176",
"tt01178",
"tt01179",
"tt01231",
"tt01276",
"tt01277",
"tt01296",
"tt01297",
"tt01300",
"tt01307",
"tt01431",
"tt01460",
"tt01481",
"tt01482"
] |
Dugerij/image_segmentation_classifier
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# image_segmentation_classifier
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the taresco/newspaper_ocr dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0033
- Accuracy: 0.9993
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.0014 | 1.0 | 2031 | 0.0065 | 0.9986 |
| 0.0005 | 2.0 | 4062 | 0.0033 | 0.9993 |
| 0.0003 | 3.0 | 6093 | 0.0058 | 0.9990 |
| 0.0002 | 4.0 | 8124 | 0.0043 | 0.9983 |
| 0.0001 | 5.0 | 10155 | 0.0036 | 0.9990 |
### Framework versions
- Transformers 4.52.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.0
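A minimal usage sketch (assuming the checkpoint is published as `Dugerij/image_segmentation_classifier`; the crop path is a placeholder):
```python
from transformers import pipeline

clf = pipeline("image-classification", model="Dugerij/image_segmentation_classifier")
result = clf("newspaper_crop.png")  # placeholder path to a page-region crop
print(result)  # e.g. [{'label': 'segment', 'score': ...}, {'label': 'no_segment', ...}]
```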
|
[
"no_segment",
"segment"
] |
prithivMLmods/Multilabel-GeoSceneNet
|

# **Multilabel-GeoSceneNet**
> **Multilabel-GeoSceneNet** is a vision-language encoder model fine-tuned from **google/siglip2-base-patch16-224** for **multi-label** image classification. It is designed to recognize and label multiple geographic or environmental elements in a single image using the **SiglipForImageClassification** architecture.
```py
Classification Report:
                          precision    recall  f1-score   support

Buildings and Structures     0.8881    0.9498    0.9179      2190
                  Desert     0.9649    0.9480    0.9564      2000
             Forest Area     0.9807    0.9855    0.9831      2271
        Hill or Mountain     0.8616    0.8993    0.8800      2512
             Ice Glacier     0.9114    0.8382    0.8732      2404
            Sea or Ocean     0.9328    0.9525    0.9426      2274
             Street View     0.9476    0.9106    0.9287      2382

                accuracy                         0.9245     16033
               macro avg     0.9267    0.9263    0.9260     16033
            weighted avg     0.9253    0.9245    0.9244     16033
```

---
The model predicts the presence of one or more of the following **7 geographic scene categories**:
```
Class 0: "Buildings and Structures"
Class 1: "Desert"
Class 2: "Forest Area"
Class 3: "Hill or Mountain"
Class 4: "Ice Glacier"
Class 5: "Sea or Ocean"
Class 6: "Street View"
```
---
## **Install dependencies**
```python
!pip install -q transformers torch pillow gradio
```
---
## **Inference Code**
```python
import gradio as gr
from transformers import AutoImageProcessor, SiglipForImageClassification
from PIL import Image
import torch
# Load model and processor
model_name = "prithivMLmods/Multilabel-GeoSceneNet" # Updated model name
model = SiglipForImageClassification.from_pretrained(model_name)
processor = AutoImageProcessor.from_pretrained(model_name)
def classify_geoscene_image(image):
    """Predicts geographic scene labels for an input image."""
    image = Image.fromarray(image).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")

    with torch.no_grad():
        outputs = model(**inputs)
        logits = outputs.logits
        probs = torch.sigmoid(logits).squeeze().tolist()  # Sigmoid for multilabel

    labels = {
        "0": "Buildings and Structures",
        "1": "Desert",
        "2": "Forest Area",
        "3": "Hill or Mountain",
        "4": "Ice Glacier",
        "5": "Sea or Ocean",
        "6": "Street View"
    }

    threshold = 0.5
    predictions = {
        labels[str(i)]: round(probs[i], 3)
        for i in range(len(probs)) if probs[i] >= threshold
    }

    return predictions or {"None Detected": 0.0}

# Create Gradio interface
iface = gr.Interface(
    fn=classify_geoscene_image,
    inputs=gr.Image(type="numpy"),
    outputs=gr.Label(label="Predicted Scene Categories"),
    title="Multilabel-GeoSceneNet",
    description="Upload an image to detect multiple geographic scene elements (e.g., forest, ocean, buildings)."
)

if __name__ == "__main__":
    iface.launch()
```
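Note the design choice in the snippet above: because this is a multi-label task, the logits are passed through a per-class sigmoid rather than a softmax, so each category is scored independently and every class whose probability clears the 0.5 threshold is reported.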
---
## **Intended Use:**
The **Multilabel-GeoSceneNet** model is suitable for recognizing multiple geographic and structural elements in a single image. Use cases include:
- **Remote Sensing:** Label elements in satellite or drone imagery.
- **Geographic Tagging:** Auto-tagging images for search or sorting.
- **Environmental Monitoring:** Identify features like glaciers or forests.
- **Scene Understanding:** Help autonomous systems interpret complex scenes.
|
[
"buildings and structures",
"desert",
"forest area",
"hill or mountain",
"ice glacier",
"sea or ocean",
"street view"
] |
prithivMLmods/IndoorOutdoorNet
|

# **IndoorOutdoorNet**
> **IndoorOutdoorNet** is an image classification vision-language encoder model fine-tuned from **google/siglip2-base-patch16-224** for a single-label classification task. It is designed to classify images as either **Indoor** or **Outdoor** using the **SiglipForImageClassification** architecture.
```py
Classification Report:
              precision    recall  f1-score   support

      Indoor     0.9661    0.9554    0.9607      9999
     Outdoor     0.9559    0.9665    0.9612      9999

    accuracy                         0.9609     19998
   macro avg     0.9610    0.9609    0.9609     19998
weighted avg     0.9610    0.9609    0.9609     19998
```

---
The model categorizes images into 2 environment-related classes:
```
Class 0: "Indoor"
Class 1: "Outdoor"
```
---
## **Install dependencies**
```python
!pip install -q transformers torch pillow gradio
```
---
## **Inference Code**
```python
import gradio as gr
from transformers import AutoImageProcessor, SiglipForImageClassification
from PIL import Image
import torch
# Load model and processor
model_name = "prithivMLmods/IndoorOutdoorNet" # Updated model name
model = SiglipForImageClassification.from_pretrained(model_name)
processor = AutoImageProcessor.from_pretrained(model_name)
def classify_environment_image(image):
    """Predicts whether an image is Indoor or Outdoor."""
    image = Image.fromarray(image).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")

    with torch.no_grad():
        outputs = model(**inputs)
        logits = outputs.logits
        probs = torch.nn.functional.softmax(logits, dim=1).squeeze().tolist()

    labels = {
        "0": "Indoor", "1": "Outdoor"
    }
    predictions = {labels[str(i)]: round(probs[i], 3) for i in range(len(probs))}

    return predictions

# Create Gradio interface
iface = gr.Interface(
    fn=classify_environment_image,
    inputs=gr.Image(type="numpy"),
    outputs=gr.Label(label="Prediction Scores"),
    title="IndoorOutdoorNet",
    description="Upload an image to classify it as Indoor or Outdoor."
)

if __name__ == "__main__":
    iface.launch()
```
---
## **Intended Use:**
The **IndoorOutdoorNet** model is designed to classify images into indoor or outdoor environments. Potential use cases include:
- **Smart Cameras:** Detect indoor/outdoor context to adjust settings.
- **Dataset Curation:** Automatically filter image datasets by setting.
- **Robotics & Drones:** Environment-aware navigation logic.
- **Content Filtering:** Moderate or tag environment context in image platforms.
|
[
"indoor",
"outdoor"
] |