Each record in this dump has three columns (statistics as reported by the dataset viewer):

| Column | Type | Range |
|:--------------|:-------|:----------------|
| `model_id` | string | 7 – 105 chars |
| `model_card` | string | 1 – 130k chars |
| `model_labels` | list | 2 – 80k items |
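Every row below follows that three-field schema: a Hub `model_id`, the raw `model_card` markdown flattened onto one line, and the card's class labels. Since the dump's source dataset is not named here, a minimal sketch of one record built by hand from the first row (nothing is downloaded):

```python
# One record of the dump, reconstructed by hand; the model_card string
# is truncated here purely for readability.
row = {
    "model_id": "N0elle-08/vit-base-patch16-224-finetuned-flower",
    "model_card": "<!-- ... --> # vit-base-patch16-224-finetuned-flower ...",
    "model_labels": ["daisy", "dandelion", "roses", "sunflowers", "tulips"],
}

assert 7 <= len(row["model_id"]) <= 105   # stringlengths bound from the header
assert len(row["model_labels"]) >= 2      # listlengths lower bound
```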
N0elle-08/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-flower This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results ### Framework versions - Transformers 4.24.0 - Pytorch 2.1.0+cu121 - Datasets 2.7.1 - Tokenizers 0.13.3
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
alirzb/S5_M1_fold1_swint_42507053
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S5_M1_fold1_swint_42507053 This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0078 - Accuracy: 0.9984 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0231 | 1.0 | 79 | 0.0356 | 0.9953 | | 0.0134 | 1.99 | 158 | 0.0047 | 0.9992 | | 0.0189 | 2.99 | 237 | 0.0148 | 0.9968 | | 0.0043 | 4.0 | 317 | 0.0119 | 0.9968 | | 0.0003 | 4.98 | 395 | 0.0078 | 0.9984 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S2_M1_R1_swint_42507050
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S2_M1_R1_swint_42507050 This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0206 - Accuracy: 0.9957 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.119 | 1.0 | 58 | 0.0232 | 0.9946 | | 0.0218 | 2.0 | 116 | 0.0150 | 0.9968 | | 0.0157 | 3.0 | 174 | 0.0177 | 0.9957 | | 0.0108 | 4.0 | 232 | 0.0193 | 0.9957 | | 0.0005 | 5.0 | 290 | 0.0206 | 0.9957 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S2_M1_R2_swint_42507051
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S2_M1_R2_swint_42507051 This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0101 - Accuracy: 0.9981 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.1068 | 0.99 | 66 | 0.0199 | 0.9934 | | 0.0382 | 2.0 | 133 | 0.0301 | 0.9906 | | 0.0034 | 2.99 | 199 | 0.0259 | 0.9943 | | 0.0004 | 4.0 | 266 | 0.0100 | 0.9972 | | 0.0009 | 4.96 | 330 | 0.0101 | 0.9981 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S2_M1_R3_swint_42507052
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S2_M1_R3_swint_42507052 This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0177 - Accuracy: 0.9966 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0359 | 0.99 | 73 | 0.0208 | 0.9949 | | 0.0209 | 1.99 | 147 | 0.0115 | 0.9975 | | 0.0088 | 3.0 | 221 | 0.0281 | 0.9958 | | 0.0029 | 4.0 | 295 | 0.0185 | 0.9958 | | 0.0002 | 4.95 | 365 | 0.0177 | 0.9966 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S2_M1_R1_deit_42507118
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S2_M1_R1_deit_42507118 This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0369 - Accuracy: 0.9935 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0151 | 1.0 | 58 | 0.0547 | 0.9859 | | 0.0018 | 2.0 | 116 | 0.0337 | 0.9903 | | 0.0001 | 3.0 | 174 | 0.0453 | 0.9903 | | 0.0013 | 4.0 | 232 | 0.0338 | 0.9946 | | 0.0 | 5.0 | 290 | 0.0369 | 0.9935 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S2_M1_R2_deit_42507119
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S2_M1_R2_deit_42507119 This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0037 - Accuracy: 0.9981 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0518 | 0.99 | 66 | 0.0180 | 0.9953 | | 0.0537 | 2.0 | 133 | 0.0177 | 0.9934 | | 0.0008 | 2.99 | 199 | 0.0035 | 0.9991 | | 0.0 | 4.0 | 266 | 0.0038 | 0.9981 | | 0.0 | 4.96 | 330 | 0.0037 | 0.9981 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S2_M1_R3_deit_42507120
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S2_M1_R3_deit_42507120 This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0015 - Accuracy: 0.9992 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0089 | 0.99 | 73 | 0.0057 | 0.9992 | | 0.0049 | 1.99 | 147 | 0.0052 | 0.9992 | | 0.0 | 3.0 | 221 | 0.0017 | 0.9983 | | 0.0 | 4.0 | 295 | 0.0015 | 0.9992 | | 0.0 | 4.95 | 365 | 0.0015 | 0.9992 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold1_deit_42507121
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S5_M1_fold1_deit_42507121 This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0481 - Accuracy: 0.9913 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0035 | 1.0 | 79 | 0.0343 | 0.9921 | | 0.0202 | 1.99 | 158 | 0.0397 | 0.9905 | | 0.0002 | 2.99 | 237 | 0.0337 | 0.9921 | | 0.0 | 4.0 | 317 | 0.0573 | 0.9897 | | 0.0 | 4.98 | 395 | 0.0481 | 0.9913 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S1_M1_R1_beit_42507336
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S1_M1_R1_beit_42507336 This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0008 - Accuracy: 1.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.1257 | 0.99 | 57 | 0.0438 | 0.9913 | | 0.0131 | 1.99 | 115 | 0.0709 | 0.9827 | | 0.0105 | 3.0 | 173 | 0.0032 | 0.9989 | | 0.0014 | 4.0 | 231 | 0.0009 | 1.0 | | 0.001 | 4.94 | 285 | 0.0008 | 1.0 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
amyeroberts/vit-base-beans-2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/aeroberts4444/huggingface/runs/120mmtvn) # vit-base-beans-2 This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset. It achieves the following results on the evaluation set: - Loss: 1.1599 - Accuracy: 0.125 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2.0 ### Training results | Training Loss | Epoch | Step | Accuracy | Validation Loss | |:-------------:|:-----:|:----:|:--------:|:---------------:| | No log | 1.0 | 1 | 0.125 | 1.1599 | | No log | 2.0 | 2 | 0.0 | 1.1626 | ### Framework versions - Transformers 4.41.0.dev0 - Pytorch 2.3.0 - Datasets 2.15.1.dev0 - Tokenizers 0.19.1
[ "angular_leaf_spot", "bean_rust", "healthy" ]
alirzb/S1_M1_R3_beit_42507359
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S1_M1_R3_beit_42507359 This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: nan - Accuracy: 0.4835 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0 | 0.99 | 73 | nan | 0.4835 | | 0.0 | 1.99 | 147 | nan | 0.4835 | | 0.0 | 3.0 | 221 | nan | 0.4835 | | 0.0 | 4.0 | 295 | nan | 0.4835 | | 0.0 | 4.95 | 365 | nan | 0.4835 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S2_M1_R1_beit_42507361
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S2_M1_R1_beit_42507361 This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: nan - Accuracy: 0.5092 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0 | 1.0 | 58 | nan | 0.5092 | | 0.0 | 2.0 | 116 | nan | 0.5092 | | 0.0 | 3.0 | 174 | nan | 0.5092 | | 0.0 | 4.0 | 232 | nan | 0.5092 | | 0.0 | 5.0 | 290 | nan | 0.5092 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S1_M1_R2_beit_42507358
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S1_M1_R2_beit_42507358 This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: nan - Accuracy: 0.5118 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0 | 0.99 | 66 | nan | 0.5118 | | 0.0 | 2.0 | 133 | nan | 0.5118 | | 0.0 | 2.99 | 199 | nan | 0.5118 | | 0.0 | 4.0 | 266 | nan | 0.5118 | | 0.0 | 4.96 | 330 | nan | 0.5118 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
hkivancoral/smids_10x_beit_large_sgd_00001_fold1
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # smids_10x_beit_large_sgd_00001_fold1 This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.8308 - Accuracy: 0.6361 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 1.1798 | 1.0 | 751 | 1.2530 | 0.3139 | | 1.1588 | 2.0 | 1502 | 1.2210 | 0.3272 | | 1.0286 | 3.0 | 2253 | 1.1929 | 0.3272 | | 1.0699 | 4.0 | 3004 | 1.1680 | 0.3372 | | 1.0532 | 5.0 | 3755 | 1.1455 | 0.3539 | | 1.0168 | 6.0 | 4506 | 1.1249 | 0.3589 | | 1.0334 | 7.0 | 5257 | 1.1059 | 0.3840 | | 1.006 | 8.0 | 6008 | 1.0879 | 0.3923 | | 0.9781 | 9.0 | 6759 | 1.0713 | 0.4073 | | 0.9206 | 10.0 | 7510 | 1.0557 | 0.4324 | | 0.9599 | 11.0 | 8261 | 1.0410 | 0.4457 | | 0.8538 | 12.0 | 9012 | 1.0272 | 0.4591 | | 0.8992 | 13.0 | 9763 | 1.0143 | 0.4725 | | 0.9105 | 14.0 | 10514 | 1.0019 | 0.4925 | | 0.8886 | 15.0 | 11265 | 0.9904 | 0.5058 | | 0.8635 | 16.0 | 12016 | 0.9792 | 0.5209 | | 0.9091 | 17.0 | 12767 | 0.9687 | 0.5292 | | 0.8236 | 18.0 | 13518 | 0.9588 | 0.5342 | | 0.8559 | 19.0 | 14269 | 0.9493 | 0.5426 | | 0.7879 | 20.0 | 15020 | 0.9403 | 0.5509 | | 0.765 | 21.0 | 15771 | 0.9320 | 0.5543 | | 0.8223 | 22.0 | 16522 | 0.9238 | 0.5593 | | 0.782 | 23.0 | 17273 | 0.9162 | 0.5659 | | 0.875 | 24.0 | 18024 | 0.9090 | 0.5726 | | 0.8022 | 25.0 | 18775 | 0.9023 | 0.5793 | | 0.8471 | 26.0 | 19526 | 0.8959 | 0.5860 | | 0.7822 | 27.0 | 20277 | 0.8898 | 0.5977 | | 0.789 | 28.0 | 21028 | 0.8841 | 0.6010 | | 0.8149 | 29.0 | 21779 | 0.8788 | 0.6027 | | 0.7987 | 30.0 | 22530 | 0.8738 | 0.6077 | | 0.7188 | 31.0 | 23281 | 0.8692 | 0.6160 | | 0.802 | 32.0 | 24032 | 0.8649 | 0.6194 | | 0.8114 | 33.0 | 24783 | 0.8608 | 0.6194 | | 0.7414 | 34.0 | 25534 | 0.8570 | 0.6210 | | 0.766 | 35.0 | 26285 | 0.8536 | 0.6210 | | 0.7537 | 36.0 | 27036 | 0.8504 | 0.6260 | | 0.7794 | 37.0 | 27787 | 0.8475 | 0.6277 | | 0.7455 | 38.0 | 28538 | 0.8448 | 0.6311 | | 0.7702 | 39.0 | 29289 | 0.8424 | 0.6311 | | 0.75 | 40.0 | 30040 | 0.8403 | 0.6311 | | 0.7442 | 41.0 | 30791 | 0.8384 | 0.6344 | | 0.6885 | 42.0 | 31542 | 0.8367 | 0.6344 | | 0.7317 | 43.0 | 32293 | 0.8353 | 0.6344 | | 0.7377 | 44.0 | 33044 | 0.8340 | 0.6344 | | 0.7327 | 45.0 | 33795 | 0.8330 | 0.6344 | | 0.752 | 46.0 | 34546 | 0.8322 | 0.6361 | | 0.7091 | 47.0 | 35297 | 0.8315 | 0.6361 | | 0.7684 | 48.0 | 36048 | 0.8311 | 0.6361 | | 0.7425 | 49.0 | 36799 | 0.8309 | 0.6361 | | 0.7641 | 50.0 | 37550 | 0.8308 | 0.6361 | ### Framework versions - Transformers 4.32.1 - Pytorch 2.1.0+cu121 - Datasets 2.12.0 - Tokenizers 0.13.2
[ "abnormal_sperm", "non-sperm", "normal_sperm" ]
alirzb/S1_M1_R1_vit_42509509
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S1_M1_R1_vit_42509509 This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0052 - Accuracy: 0.9988 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0316 | 1.0 | 256 | 0.0126 | 0.9961 | | 0.0021 | 2.0 | 512 | 0.0055 | 0.9988 | | 0.0006 | 3.0 | 768 | 0.0050 | 0.9985 | | 0.0075 | 4.0 | 1025 | 0.0055 | 0.9990 | | 0.0004 | 5.0 | 1280 | 0.0052 | 0.9988 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
Audi24/RockAI
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. --> # Audi24/RockAI This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Train Loss: 0.5468 - Validation Loss: 0.6111 - Train Accuracy: 0.7674 - Epoch: 4 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 2550, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01} - training_precision: float32 ### Training results | Train Loss | Validation Loss | Train Accuracy | Epoch | |:----------:|:---------------:|:--------------:|:-----:| | 1.0659 | 1.0044 | 0.6279 | 0 | | 0.9502 | 0.9168 | 0.7442 | 1 | | 0.8135 | 0.7778 | 0.7287 | 2 | | 0.6570 | 0.6767 | 0.7442 | 3 | | 0.5468 | 0.6111 | 0.7674 | 4 | ### Framework versions - Transformers 4.38.1 - TensorFlow 2.15.0 - Datasets 2.18.0 - Tokenizers 0.15.2
[ "sedimentory", "igneous", "metamorphic" ]
alirzb/S2_M1_R2_deit_42509577
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S2_M1_R2_deit_42509577 This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0114 - Accuracy: 0.9978 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.1006 | 1.0 | 199 | 0.0203 | 0.9928 | | 0.0034 | 2.0 | 399 | 0.0087 | 0.9972 | | 0.004 | 3.0 | 598 | 0.0166 | 0.9959 | | 0.0001 | 4.0 | 798 | 0.0107 | 0.9978 | | 0.0 | 4.99 | 995 | 0.0114 | 0.9978 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S1_M1_R2_deit_42509574
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S1_M1_R2_deit_42509574 This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0074 - Accuracy: 0.9983 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0094 | 1.0 | 260 | 0.0070 | 0.9978 | | 0.0064 | 2.0 | 521 | 0.0053 | 0.9986 | | 0.0002 | 3.0 | 782 | 0.0068 | 0.9983 | | 0.0 | 4.0 | 1043 | 0.0078 | 0.9983 | | 0.0 | 4.99 | 1300 | 0.0074 | 0.9983 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S1_M1_R1_deit_42509573
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S1_M1_R1_deit_42509573 This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0037 - Accuracy: 0.9988 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0237 | 1.0 | 256 | 0.0053 | 0.9983 | | 0.0014 | 2.0 | 512 | 0.0056 | 0.9985 | | 0.0 | 3.0 | 768 | 0.0023 | 0.9993 | | 0.0 | 4.0 | 1025 | 0.0037 | 0.9988 | | 0.0 | 5.0 | 1280 | 0.0037 | 0.9988 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S1_M1_R3_deit_42509575
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S1_M1_R3_deit_42509575 This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0039 - Accuracy: 0.9990 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0046 | 1.0 | 320 | 0.0065 | 0.9973 | | 0.0027 | 2.0 | 640 | 0.0124 | 0.9975 | | 0.0001 | 3.0 | 960 | 0.0013 | 0.9994 | | 0.0 | 4.0 | 1280 | 0.0028 | 0.9992 | | 0.0 | 5.0 | 1600 | 0.0039 | 0.9990 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S2_M1_R1_deit_42509576
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S2_M1_R1_deit_42509576 This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0057 - Accuracy: 0.9987 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0111 | 1.0 | 195 | 0.0118 | 0.9971 | | 0.0098 | 2.0 | 390 | 0.0096 | 0.9978 | | 0.0066 | 3.0 | 585 | 0.0094 | 0.9974 | | 0.0 | 4.0 | 780 | 0.0053 | 0.9994 | | 0.0 | 5.0 | 975 | 0.0057 | 0.9987 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S2_M1_R3_deit_42509578
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S2_M1_R3_deit_42509578 This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0060 - Accuracy: 0.9990 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0149 | 1.0 | 258 | 0.0100 | 0.9973 | | 0.004 | 2.0 | 517 | 0.0058 | 0.9988 | | 0.0097 | 3.0 | 776 | 0.0074 | 0.9986 | | 0.0002 | 4.0 | 1035 | 0.0041 | 0.9993 | | 0.0 | 4.99 | 1290 | 0.0060 | 0.9990 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S1_M1_R1_swint_42509597
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S1_M1_R1_swint_42509597 This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0080 - Accuracy: 0.9983 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0214 | 1.0 | 256 | 0.0224 | 0.9907 | | 0.0075 | 2.0 | 512 | 0.0098 | 0.9971 | | 0.0171 | 3.0 | 768 | 0.0243 | 0.9949 | | 0.0007 | 4.0 | 1025 | 0.0063 | 0.9985 | | 0.0001 | 5.0 | 1280 | 0.0080 | 0.9983 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S1_M1_R2_swint_42509598
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S1_M1_R2_swint_42509598 This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0089 - Accuracy: 0.9983 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0211 | 1.0 | 260 | 0.0173 | 0.9954 | | 0.0208 | 2.0 | 521 | 0.0103 | 0.9969 | | 0.009 | 3.0 | 782 | 0.0159 | 0.9964 | | 0.0062 | 4.0 | 1043 | 0.0091 | 0.9983 | | 0.0002 | 4.99 | 1300 | 0.0089 | 0.9983 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S1_M1_R3_swint_42509599
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S1_M1_R3_swint_42509599 This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0068 - Accuracy: 0.9988 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0003 | 1.0 | 320 | 0.0107 | 0.9973 | | 0.0051 | 2.0 | 640 | 0.0051 | 0.9982 | | 0.0002 | 3.0 | 960 | 0.0054 | 0.9990 | | 0.0001 | 4.0 | 1280 | 0.0063 | 0.9990 | | 0.0001 | 5.0 | 1600 | 0.0068 | 0.9988 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S2_M1_R2_swint_42509600
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S2_M1_R2_swint_42509600 This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0052 - Accuracy: 0.9987 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0985 | 1.0 | 199 | 0.0097 | 0.9975 | | 0.0021 | 2.0 | 399 | 0.0030 | 0.9987 | | 0.0031 | 3.0 | 598 | 0.0024 | 0.9991 | | 0.0018 | 4.0 | 798 | 0.0057 | 0.9987 | | 0.0001 | 4.99 | 995 | 0.0052 | 0.9987 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
hkivancoral/smids_10x_beit_large_sgd_00001_fold2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # smids_10x_beit_large_sgd_00001_fold2 This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.7989 - Accuracy: 0.6473 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 1.1712 | 1.0 | 750 | 1.2053 | 0.3594 | | 1.1095 | 2.0 | 1500 | 1.1744 | 0.3677 | | 1.079 | 3.0 | 2250 | 1.1476 | 0.3677 | | 1.0868 | 4.0 | 3000 | 1.1238 | 0.3760 | | 1.0188 | 5.0 | 3750 | 1.1026 | 0.4043 | | 1.0313 | 6.0 | 4500 | 1.0830 | 0.4176 | | 0.9867 | 7.0 | 5250 | 1.0650 | 0.4343 | | 0.9922 | 8.0 | 6000 | 1.0482 | 0.4509 | | 1.0089 | 9.0 | 6750 | 1.0324 | 0.4626 | | 0.9248 | 10.0 | 7500 | 1.0176 | 0.4809 | | 0.9924 | 11.0 | 8250 | 1.0037 | 0.4942 | | 0.9341 | 12.0 | 9000 | 0.9905 | 0.5042 | | 0.9032 | 13.0 | 9750 | 0.9777 | 0.5158 | | 0.9223 | 14.0 | 10500 | 0.9658 | 0.5241 | | 0.8875 | 15.0 | 11250 | 0.9546 | 0.5275 | | 0.8812 | 16.0 | 12000 | 0.9440 | 0.5408 | | 0.8383 | 17.0 | 12750 | 0.9339 | 0.5524 | | 0.8368 | 18.0 | 13500 | 0.9242 | 0.5557 | | 0.8681 | 19.0 | 14250 | 0.9150 | 0.5657 | | 0.8552 | 20.0 | 15000 | 0.9065 | 0.5674 | | 0.8564 | 21.0 | 15750 | 0.8983 | 0.5691 | | 0.8254 | 22.0 | 16500 | 0.8905 | 0.5740 | | 0.842 | 23.0 | 17250 | 0.8831 | 0.5807 | | 0.802 | 24.0 | 18000 | 0.8761 | 0.5857 | | 0.8617 | 25.0 | 18750 | 0.8694 | 0.5973 | | 0.8384 | 26.0 | 19500 | 0.8631 | 0.6057 | | 0.8257 | 27.0 | 20250 | 0.8572 | 0.6106 | | 0.8327 | 28.0 | 21000 | 0.8516 | 0.6156 | | 0.8111 | 29.0 | 21750 | 0.8464 | 0.6173 | | 0.7892 | 30.0 | 22500 | 0.8414 | 0.6206 | | 0.7974 | 31.0 | 23250 | 0.8368 | 0.6256 | | 0.8791 | 32.0 | 24000 | 0.8325 | 0.6256 | | 0.7583 | 33.0 | 24750 | 0.8285 | 0.6306 | | 0.7714 | 34.0 | 25500 | 0.8248 | 0.6323 | | 0.7891 | 35.0 | 26250 | 0.8214 | 0.6356 | | 0.7659 | 36.0 | 27000 | 0.8182 | 0.6389 | | 0.8096 | 37.0 | 27750 | 0.8154 | 0.6356 | | 0.7644 | 38.0 | 28500 | 0.8128 | 0.6373 | | 0.8029 | 39.0 | 29250 | 0.8104 | 0.6406 | | 0.7912 | 40.0 | 30000 | 0.8082 | 0.6406 | | 0.7766 | 41.0 | 30750 | 0.8063 | 0.6423 | | 0.7693 | 42.0 | 31500 | 0.8047 | 0.6439 | | 0.735 | 43.0 | 32250 | 0.8032 | 0.6456 | | 0.7637 | 44.0 | 33000 | 0.8020 | 0.6456 | | 0.7733 | 45.0 | 33750 | 0.8010 | 0.6473 | | 0.7268 | 46.0 | 34500 | 0.8002 | 0.6473 | | 0.8097 | 47.0 | 35250 | 0.7996 | 0.6473 | | 0.7648 | 48.0 | 36000 | 0.7991 | 0.6473 | | 0.7593 | 49.0 | 36750 | 0.7989 | 0.6473 | | 0.7579 | 50.0 | 37500 | 0.7989 | 0.6473 | ### Framework versions - Transformers 4.32.1 - Pytorch 2.1.0+cu121 - Datasets 2.12.0 - Tokenizers 0.13.2
[ "abnormal_sperm", "non-sperm", "normal_sperm" ]
alirzb/S2_M1_R3_swint_42509601
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S2_M1_R3_swint_42509601 This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0038 - Accuracy: 0.9995 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0242 | 1.0 | 258 | 0.0034 | 0.9993 | | 0.0004 | 2.0 | 517 | 0.0029 | 0.9995 | | 0.0001 | 3.0 | 776 | 0.0054 | 0.9990 | | 0.0001 | 4.0 | 1035 | 0.0048 | 0.9990 | | 0.0001 | 4.99 | 1290 | 0.0038 | 0.9995 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold1_deit_42510037
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S5_M1_fold1_deit_42510037 This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0030 - Accuracy: 0.9996 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0003 | 1.0 | 310 | 0.0054 | 0.9990 | | 0.013 | 2.0 | 620 | 0.0055 | 0.9988 | | 0.0002 | 3.0 | 930 | 0.0055 | 0.9982 | | 0.0 | 4.0 | 1241 | 0.0052 | 0.9984 | | 0.0 | 5.0 | 1550 | 0.0030 | 0.9996 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold2_deit_42510038
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S5_M1_fold2_deit_42510038 This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0084 - Accuracy: 0.9984 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0038 | 1.0 | 310 | 0.0085 | 0.9980 | | 0.0104 | 2.0 | 620 | 0.0051 | 0.9980 | | 0.0016 | 3.0 | 930 | 0.0107 | 0.9984 | | 0.0001 | 4.0 | 1241 | 0.0067 | 0.9988 | | 0.0 | 5.0 | 1550 | 0.0084 | 0.9984 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold3_deit_42510039
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S5_M1_fold3_deit_42510039 This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0025 - Accuracy: 0.9990 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0297 | 1.0 | 310 | 0.0101 | 0.9978 | | 0.0011 | 2.0 | 620 | 0.0170 | 0.9962 | | 0.0009 | 3.0 | 930 | 0.0045 | 0.9988 | | 0.0001 | 4.0 | 1241 | 0.0026 | 0.9988 | | 0.0 | 5.0 | 1550 | 0.0025 | 0.9990 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold4_deit_42510040
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S5_M1_fold4_deit_42510040 This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0042 - Accuracy: 0.9994 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0274 | 1.0 | 310 | 0.0129 | 0.9966 | | 0.0006 | 2.0 | 620 | 0.0035 | 0.9986 | | 0.0227 | 3.0 | 930 | 0.0042 | 0.9982 | | 0.0 | 4.0 | 1241 | 0.0040 | 0.9994 | | 0.0 | 5.0 | 1550 | 0.0042 | 0.9994 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold5_deit_42510041
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S5_M1_fold5_deit_42510041 This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0117 - Accuracy: 0.9988 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0018 | 1.0 | 310 | 0.0305 | 0.9921 | | 0.0033 | 2.0 | 620 | 0.0113 | 0.9976 | | 0.0002 | 3.0 | 930 | 0.0101 | 0.9986 | | 0.0 | 4.0 | 1241 | 0.0097 | 0.9986 | | 0.0 | 5.0 | 1550 | 0.0117 | 0.9988 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold1_swint_42510042
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # S5_M1_fold1_swint_42510042 This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0038 - Accuracy: 0.9992 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0406 | 1.0 | 310 | 0.0068 | 0.9978 | | 0.0007 | 2.0 | 620 | 0.0046 | 0.9986 | | 0.0003 | 3.0 | 930 | 0.0036 | 0.9990 | | 0.0001 | 4.0 | 1241 | 0.0025 | 0.9994 | | 0.0001 | 5.0 | 1550 | 0.0038 | 0.9992 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu102 - Datasets 2.16.0 - Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold2_swint_42510043
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S5_M1_fold2_swint_42510043

This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0066
- Accuracy: 0.9986

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0158 | 1.0 | 310 | 0.0185 | 0.9950 |
| 0.001 | 2.0 | 620 | 0.0113 | 0.9968 |
| 0.0001 | 3.0 | 930 | 0.0057 | 0.9986 |
| 0.0004 | 4.0 | 1241 | 0.0077 | 0.9988 |
| 0.0065 | 5.0 | 1550 | 0.0066 | 0.9986 |

### Framework versions

- Transformers 4.36.2
- Pytorch 1.11.0+cu102
- Datasets 2.16.0
- Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold3_swint_42510044
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S5_M1_fold3_swint_42510044

This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0024
- Accuracy: 0.9992

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0448 | 1.0 | 310 | 0.0119 | 0.9966 |
| 0.0077 | 2.0 | 620 | 0.0027 | 0.9994 |
| 0.0005 | 3.0 | 930 | 0.0037 | 0.9988 |
| 0.0001 | 4.0 | 1241 | 0.0017 | 0.9992 |
| 0.0001 | 5.0 | 1550 | 0.0024 | 0.9992 |

### Framework versions

- Transformers 4.36.2
- Pytorch 1.11.0+cu102
- Datasets 2.16.0
- Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold4_swint_42510045
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S5_M1_fold4_swint_42510045

This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0046
- Accuracy: 0.9992

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0062 | 1.0 | 310 | 0.0110 | 0.9976 |
| 0.0102 | 2.0 | 620 | 0.0111 | 0.9968 |
| 0.011 | 3.0 | 930 | 0.0105 | 0.9976 |
| 0.0001 | 4.0 | 1241 | 0.0056 | 0.9990 |
| 0.0003 | 5.0 | 1550 | 0.0046 | 0.9992 |

### Framework versions

- Transformers 4.36.2
- Pytorch 1.11.0+cu102
- Datasets 2.16.0
- Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold5_swint_42510046
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S5_M1_fold5_swint_42510046

This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0031
- Accuracy: 0.9992

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0691 | 1.0 | 310 | 0.0071 | 0.9980 |
| 0.0264 | 2.0 | 620 | 0.0095 | 0.9970 |
| 0.0005 | 3.0 | 930 | 0.0047 | 0.9986 |
| 0.0021 | 4.0 | 1241 | 0.0040 | 0.9992 |
| 0.0002 | 5.0 | 1550 | 0.0031 | 0.9992 |

### Framework versions

- Transformers 4.36.2
- Pytorch 1.11.0+cu102
- Datasets 2.16.0
- Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
hkivancoral/smids_10x_beit_large_sgd_00001_fold3
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# smids_10x_beit_large_sgd_00001_fold3

This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8246
- Accuracy: 0.6317

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.2284 | 1.0 | 750 | 1.2241 | 0.3517 |
| 1.1457 | 2.0 | 1500 | 1.1930 | 0.365 |
| 1.1396 | 3.0 | 2250 | 1.1661 | 0.3783 |
| 1.0897 | 4.0 | 3000 | 1.1425 | 0.385 |
| 1.025 | 5.0 | 3750 | 1.1215 | 0.3883 |
| 1.0158 | 6.0 | 4500 | 1.1022 | 0.3883 |
| 0.9975 | 7.0 | 5250 | 1.0842 | 0.4017 |
| 1.0278 | 8.0 | 6000 | 1.0673 | 0.4067 |
| 0.9784 | 9.0 | 6750 | 1.0514 | 0.4133 |
| 0.9157 | 10.0 | 7500 | 1.0366 | 0.4317 |
| 0.9554 | 11.0 | 8250 | 1.0228 | 0.4467 |
| 0.8899 | 12.0 | 9000 | 1.0096 | 0.4667 |
| 0.9379 | 13.0 | 9750 | 0.9973 | 0.4767 |
| 0.944 | 14.0 | 10500 | 0.9856 | 0.4867 |
| 0.9071 | 15.0 | 11250 | 0.9745 | 0.4983 |
| 0.8922 | 16.0 | 12000 | 0.9641 | 0.505 |
| 0.8643 | 17.0 | 12750 | 0.9544 | 0.5133 |
| 0.8278 | 18.0 | 13500 | 0.9449 | 0.52 |
| 0.9039 | 19.0 | 14250 | 0.9361 | 0.5317 |
| 0.8559 | 20.0 | 15000 | 0.9279 | 0.5383 |
| 0.8179 | 21.0 | 15750 | 0.9199 | 0.545 |
| 0.8248 | 22.0 | 16500 | 0.9124 | 0.56 |
| 0.8379 | 23.0 | 17250 | 0.9052 | 0.56 |
| 0.864 | 24.0 | 18000 | 0.8985 | 0.565 |
| 0.8458 | 25.0 | 18750 | 0.8922 | 0.575 |
| 0.8014 | 26.0 | 19500 | 0.8861 | 0.5783 |
| 0.7589 | 27.0 | 20250 | 0.8805 | 0.5883 |
| 0.8089 | 28.0 | 21000 | 0.8752 | 0.595 |
| 0.8337 | 29.0 | 21750 | 0.8701 | 0.5983 |
| 0.7734 | 30.0 | 22500 | 0.8654 | 0.6033 |
| 0.7463 | 31.0 | 23250 | 0.8610 | 0.6033 |
| 0.7746 | 32.0 | 24000 | 0.8569 | 0.6067 |
| 0.8126 | 33.0 | 24750 | 0.8532 | 0.6117 |
| 0.7894 | 34.0 | 25500 | 0.8496 | 0.615 |
| 0.7634 | 35.0 | 26250 | 0.8463 | 0.615 |
| 0.7765 | 36.0 | 27000 | 0.8433 | 0.6167 |
| 0.8136 | 37.0 | 27750 | 0.8405 | 0.6217 |
| 0.8117 | 38.0 | 28500 | 0.8380 | 0.6217 |
| 0.7707 | 39.0 | 29250 | 0.8357 | 0.6217 |
| 0.7678 | 40.0 | 30000 | 0.8337 | 0.6267 |
| 0.7823 | 41.0 | 30750 | 0.8319 | 0.6283 |
| 0.7728 | 42.0 | 31500 | 0.8303 | 0.63 |
| 0.7705 | 43.0 | 32250 | 0.8289 | 0.6283 |
| 0.7342 | 44.0 | 33000 | 0.8277 | 0.6283 |
| 0.7107 | 45.0 | 33750 | 0.8267 | 0.6283 |
| 0.7263 | 46.0 | 34500 | 0.8259 | 0.63 |
| 0.7101 | 47.0 | 35250 | 0.8253 | 0.63 |
| 0.7724 | 48.0 | 36000 | 0.8249 | 0.6317 |
| 0.7714 | 49.0 | 36750 | 0.8247 | 0.6317 |
| 0.7461 | 50.0 | 37500 | 0.8246 | 0.6317 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.0+cu121
- Datasets 2.12.0
- Tokenizers 0.13.2
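As an aside, the hyperparameter lists in these cards map almost one-to-one onto `transformers.TrainingArguments`. A minimal sketch of that mapping for this card's values follows; only `output_dir` is an invented placeholder, and the Adam betas and epsilon listed above are already the `TrainingArguments` defaults:

```python
from transformers import TrainingArguments

# Every value below is copied from the hyperparameter list above;
# output_dir is a placeholder, not taken from the card.
args = TrainingArguments(
    output_dir="smids_10x_beit_large_sgd_00001_fold3",
    learning_rate=1e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,     # lr_scheduler_warmup_ratio
    num_train_epochs=50,
    # "Adam with betas=(0.9,0.999) and epsilon=1e-08" matches the
    # defaults of adam_beta1, adam_beta2, and adam_epsilon.
)
```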
[ "abnormal_sperm", "non-sperm", "normal_sperm" ]
hkivancoral/smids_10x_beit_large_sgd_00001_fold4
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# smids_10x_beit_large_sgd_00001_fold4

This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7900
- Accuracy: 0.6483

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.2211 | 1.0 | 750 | 1.1931 | 0.3533 |
| 1.1323 | 2.0 | 1500 | 1.1612 | 0.365 |
| 1.184 | 3.0 | 2250 | 1.1336 | 0.37 |
| 1.1136 | 4.0 | 3000 | 1.1091 | 0.3817 |
| 0.9758 | 5.0 | 3750 | 1.0870 | 0.385 |
| 1.0842 | 6.0 | 4500 | 1.0669 | 0.3917 |
| 1.0165 | 7.0 | 5250 | 1.0484 | 0.4117 |
| 1.0062 | 8.0 | 6000 | 1.0310 | 0.43 |
| 1.0015 | 9.0 | 6750 | 1.0148 | 0.4367 |
| 0.9415 | 10.0 | 7500 | 0.9997 | 0.45 |
| 0.9588 | 11.0 | 8250 | 0.9856 | 0.4533 |
| 0.9674 | 12.0 | 9000 | 0.9724 | 0.47 |
| 0.9046 | 13.0 | 9750 | 0.9600 | 0.4733 |
| 0.9542 | 14.0 | 10500 | 0.9483 | 0.4867 |
| 0.8663 | 15.0 | 11250 | 0.9372 | 0.5 |
| 0.8717 | 16.0 | 12000 | 0.9268 | 0.51 |
| 0.7922 | 17.0 | 12750 | 0.9171 | 0.525 |
| 0.8562 | 18.0 | 13500 | 0.9078 | 0.535 |
| 0.9212 | 19.0 | 14250 | 0.8991 | 0.5433 |
| 0.8823 | 20.0 | 15000 | 0.8907 | 0.5567 |
| 0.8498 | 21.0 | 15750 | 0.8828 | 0.565 |
| 0.8335 | 22.0 | 16500 | 0.8754 | 0.575 |
| 0.8369 | 23.0 | 17250 | 0.8683 | 0.5867 |
| 0.8886 | 24.0 | 18000 | 0.8617 | 0.5917 |
| 0.8131 | 25.0 | 18750 | 0.8555 | 0.6 |
| 0.8107 | 26.0 | 19500 | 0.8497 | 0.605 |
| 0.7489 | 27.0 | 20250 | 0.8442 | 0.61 |
| 0.8154 | 28.0 | 21000 | 0.8390 | 0.6167 |
| 0.7935 | 29.0 | 21750 | 0.8341 | 0.62 |
| 0.7606 | 30.0 | 22500 | 0.8296 | 0.6267 |
| 0.7688 | 31.0 | 23250 | 0.8253 | 0.6283 |
| 0.755 | 32.0 | 24000 | 0.8214 | 0.63 |
| 0.8046 | 33.0 | 24750 | 0.8176 | 0.63 |
| 0.8193 | 34.0 | 25500 | 0.8142 | 0.6317 |
| 0.7668 | 35.0 | 26250 | 0.8110 | 0.635 |
| 0.7573 | 36.0 | 27000 | 0.8080 | 0.6367 |
| 0.7928 | 37.0 | 27750 | 0.8053 | 0.6417 |
| 0.792 | 38.0 | 28500 | 0.8028 | 0.6417 |
| 0.7917 | 39.0 | 29250 | 0.8007 | 0.645 |
| 0.7521 | 40.0 | 30000 | 0.7987 | 0.645 |
| 0.777 | 41.0 | 30750 | 0.7969 | 0.6483 |
| 0.7956 | 42.0 | 31500 | 0.7954 | 0.6483 |
| 0.8067 | 43.0 | 32250 | 0.7940 | 0.65 |
| 0.7335 | 44.0 | 33000 | 0.7929 | 0.65 |
| 0.7708 | 45.0 | 33750 | 0.7920 | 0.6483 |
| 0.74 | 46.0 | 34500 | 0.7912 | 0.6483 |
| 0.7222 | 47.0 | 35250 | 0.7906 | 0.6483 |
| 0.7572 | 48.0 | 36000 | 0.7902 | 0.6483 |
| 0.7909 | 49.0 | 36750 | 0.7900 | 0.6483 |
| 0.7055 | 50.0 | 37500 | 0.7900 | 0.6483 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.0+cu121
- Datasets 2.12.0
- Tokenizers 0.13.2
[ "abnormal_sperm", "non-sperm", "normal_sperm" ]
hossay/stool-condition-classification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# stool-condition-classification

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the stool-image dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4237
- AUROC: 0.9418
- Accuracy: 0.9417
- Sensitivity: 0.9091
- Specificity: 0.9661
- PPV: 0.9524
- NPV: 0.9344
- F1: 0.9302
- Model Selection: 0.9215

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | AUROC | Accuracy | Sensitivity | Specificity | PPV | NPV | F1 | Model Selection |
|:-------------:|:-----:|:----:|:---------------:|:------:|:--------:|:-----------:|:-----------:|:------:|:------:|:------:|:---------------:|
| 0.5076 | 0.98 | 100 | 0.5361 | 0.8538 | 0.7731 | 0.5393 | 0.9801 | 0.96 | 0.7061 | 0.6906 | 0.5592 |
| 0.4086 | 1.96 | 200 | 0.4857 | 0.8728 | 0.7836 | 0.6011 | 0.9453 | 0.9068 | 0.7280 | 0.7230 | 0.6558 |
| 0.5208 | 2.94 | 300 | 0.5109 | 0.8059 | 0.7599 | 0.6124 | 0.8905 | 0.8321 | 0.7218 | 0.7055 | 0.7218 |
| 0.474 | 3.92 | 400 | 0.5212 | 0.8601 | 0.7995 | 0.6180 | 0.9602 | 0.9322 | 0.7395 | 0.7432 | 0.6578 |
| 0.4285 | 4.9 | 500 | 0.4511 | 0.8728 | 0.7757 | 0.7472 | 0.8010 | 0.7688 | 0.7816 | 0.7578 | 0.9462 |
| 0.3506 | 5.88 | 600 | 0.4716 | 0.8691 | 0.8047 | 0.6798 | 0.9154 | 0.8768 | 0.7635 | 0.7658 | 0.7644 |
| 0.4239 | 6.86 | 700 | 0.5043 | 0.8517 | 0.8100 | 0.6685 | 0.9353 | 0.9015 | 0.7611 | 0.7677 | 0.7332 |
| 0.2447 | 7.84 | 800 | 0.5804 | 0.8592 | 0.8074 | 0.6910 | 0.9104 | 0.8723 | 0.7689 | 0.7712 | 0.7806 |
| 0.1739 | 8.82 | 900 | 0.6225 | 0.8562 | 0.8074 | 0.7135 | 0.8905 | 0.8523 | 0.7783 | 0.7768 | 0.8229 |
| 0.2888 | 9.8 | 1000 | 0.5807 | 0.8570 | 0.8047 | 0.7528 | 0.8507 | 0.8171 | 0.7953 | 0.7836 | 0.9021 |

### Framework versions

- Transformers 4.38.2
- Pytorch 2.0.1
- Datasets 2.14.7
- Tokenizers 0.15.2
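The sensitivity, specificity, PPV, and NPV columns above all derive from the binary confusion matrix. A small sketch of those definitions with scikit-learn, on toy labels (treating 1 = abnormal as the positive class is an assumption, since the card does not state which class is positive):

```python
from sklearn.metrics import confusion_matrix

# Toy binary predictions; 1 = abnormal (positive class), 0 = normal.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 0, 1, 1]

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

sensitivity = tp / (tp + fn)   # recall on the positive class
specificity = tn / (tn + fp)   # recall on the negative class
ppv = tp / (tp + fp)           # positive predictive value (precision)
npv = tn / (tn + fn)           # negative predictive value
f1 = 2 * ppv * sensitivity / (ppv + sensitivity)

print(sensitivity, specificity, ppv, npv, f1)
```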
[ "normal", "abnormal" ]
hkivancoral/smids_10x_beit_large_sgd_00001_fold5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# smids_10x_beit_large_sgd_00001_fold5

This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8273
- Accuracy: 0.6317

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.2351 | 1.0 | 750 | 1.2384 | 0.335 |
| 1.188 | 2.0 | 1500 | 1.2067 | 0.3417 |
| 1.1425 | 3.0 | 2250 | 1.1794 | 0.35 |
| 1.0663 | 4.0 | 3000 | 1.1549 | 0.36 |
| 1.0302 | 5.0 | 3750 | 1.1332 | 0.3733 |
| 1.0803 | 6.0 | 4500 | 1.1133 | 0.3783 |
| 1.0194 | 7.0 | 5250 | 1.0948 | 0.395 |
| 1.041 | 8.0 | 6000 | 1.0776 | 0.4133 |
| 0.958 | 9.0 | 6750 | 1.0617 | 0.4267 |
| 0.9328 | 10.0 | 7500 | 1.0465 | 0.44 |
| 0.9293 | 11.0 | 8250 | 1.0324 | 0.4533 |
| 0.9087 | 12.0 | 9000 | 1.0189 | 0.465 |
| 0.9387 | 13.0 | 9750 | 1.0063 | 0.4783 |
| 0.8996 | 14.0 | 10500 | 0.9944 | 0.4933 |
| 0.8606 | 15.0 | 11250 | 0.9830 | 0.5083 |
| 0.8536 | 16.0 | 12000 | 0.9723 | 0.5117 |
| 0.8222 | 17.0 | 12750 | 0.9621 | 0.5217 |
| 0.8298 | 18.0 | 13500 | 0.9525 | 0.53 |
| 0.9106 | 19.0 | 14250 | 0.9434 | 0.54 |
| 0.8462 | 20.0 | 15000 | 0.9347 | 0.5483 |
| 0.8209 | 21.0 | 15750 | 0.9265 | 0.5533 |
| 0.8393 | 22.0 | 16500 | 0.9186 | 0.5583 |
| 0.8236 | 23.0 | 17250 | 0.9111 | 0.565 |
| 0.8476 | 24.0 | 18000 | 0.9042 | 0.5717 |
| 0.7925 | 25.0 | 18750 | 0.8975 | 0.5733 |
| 0.8294 | 26.0 | 19500 | 0.8913 | 0.5817 |
| 0.7415 | 27.0 | 20250 | 0.8856 | 0.585 |
| 0.8113 | 28.0 | 21000 | 0.8800 | 0.585 |
| 0.8087 | 29.0 | 21750 | 0.8747 | 0.5833 |
| 0.8087 | 30.0 | 22500 | 0.8698 | 0.59 |
| 0.7723 | 31.0 | 23250 | 0.8652 | 0.595 |
| 0.7864 | 32.0 | 24000 | 0.8609 | 0.6033 |
| 0.7882 | 33.0 | 24750 | 0.8569 | 0.6083 |
| 0.7814 | 34.0 | 25500 | 0.8532 | 0.61 |
| 0.8053 | 35.0 | 26250 | 0.8498 | 0.6117 |
| 0.7759 | 36.0 | 27000 | 0.8467 | 0.6167 |
| 0.73 | 37.0 | 27750 | 0.8438 | 0.6167 |
| 0.8437 | 38.0 | 28500 | 0.8412 | 0.6183 |
| 0.7621 | 39.0 | 29250 | 0.8389 | 0.6183 |
| 0.719 | 40.0 | 30000 | 0.8367 | 0.6217 |
| 0.7491 | 41.0 | 30750 | 0.8348 | 0.625 |
| 0.7887 | 42.0 | 31500 | 0.8332 | 0.625 |
| 0.8254 | 43.0 | 32250 | 0.8317 | 0.625 |
| 0.7337 | 44.0 | 33000 | 0.8305 | 0.6267 |
| 0.7762 | 45.0 | 33750 | 0.8295 | 0.6283 |
| 0.7277 | 46.0 | 34500 | 0.8286 | 0.6317 |
| 0.7733 | 47.0 | 35250 | 0.8280 | 0.6317 |
| 0.7249 | 48.0 | 36000 | 0.8276 | 0.6317 |
| 0.7591 | 49.0 | 36750 | 0.8274 | 0.6317 |
| 0.7103 | 50.0 | 37500 | 0.8273 | 0.6317 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.0+cu121
- Datasets 2.12.0
- Tokenizers 0.13.2
[ "abnormal_sperm", "non-sperm", "normal_sperm" ]
kijeong22/swin-finetuned-food101
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# swin-finetuned-food101

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 27286250970434916419272480644923392.0000
- Accuracy: 0.0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:----------------------------------------:|:-----:|:----:|:----------------------------------------:|:--------:|
| 27286256417371087430271282830114816.0000 | 1.0 | 47 | 27286250970434916419272480644923392.0000 | 0.0 |
| 27286256417371087430271282830114816.0000 | 2.0 | 94 | 27286250970434916419272480644923392.0000 | 0.0 |
| 27286256417371087430271282830114816.0000 | 3.0 | 141 | 27286250970434916419272480644923392.0000 | 0.0 |

### Framework versions

- Transformers 4.36.2
- Pytorch 1.13.1+cu116
- Datasets 2.16.1
- Tokenizers 0.15.0
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
alirzb/S1_M1_R1_beit_42534242
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S1_M1_R1_beit_42534242

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0090
- Accuracy: 0.9980

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0101 | 1.0 | 256 | 0.0465 | 0.9873 |
| 0.0107 | 2.0 | 512 | 0.0155 | 0.9939 |
| 0.0011 | 3.0 | 768 | 0.0082 | 0.9976 |
| 0.0095 | 4.0 | 1025 | 0.0077 | 0.9978 |
| 0.0002 | 5.0 | 1280 | 0.0090 | 0.9980 |

### Framework versions

- Transformers 4.36.2
- Pytorch 2.1.2+cu118
- Datasets 2.16.1
- Tokenizers 0.15.0
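A note on "the imagefolder dataset" that appears throughout these cards: it refers to 🤗 Datasets' generic `imagefolder` loader, which builds a labeled image dataset from a directory tree. A minimal sketch, where the `data/` layout is hypothetical:

```python
from datasets import load_dataset

# Assumed (hypothetical) directory layout:
#   data/none_seizures/*.png
#   data/seizures/*.png
dataset = load_dataset("imagefolder", data_dir="data")

# Class names are inferred from the sub-folder names.
print(dataset["train"].features["label"].names)
```

Also worth noting: the `total_train_batch_size: 64` reported above is simply `train_batch_size` (16) multiplied by `gradient_accumulation_steps` (4).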
[ "none_seizures", "seizures" ]
hkivancoral/smids_10x_beit_large_sgd_0001_fold1
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# smids_10x_beit_large_sgd_0001_fold1

This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3289
- Accuracy: 0.8715

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.9122 | 1.0 | 751 | 1.0277 | 0.4591 |
| 0.7509 | 2.0 | 1502 | 0.8635 | 0.6194 |
| 0.6317 | 3.0 | 2253 | 0.7448 | 0.7012 |
| 0.5452 | 4.0 | 3004 | 0.6612 | 0.7446 |
| 0.5954 | 5.0 | 3755 | 0.5972 | 0.7830 |
| 0.5075 | 6.0 | 4506 | 0.5495 | 0.7930 |
| 0.5045 | 7.0 | 5257 | 0.5158 | 0.8130 |
| 0.4666 | 8.0 | 6008 | 0.4891 | 0.8147 |
| 0.4159 | 9.0 | 6759 | 0.4686 | 0.8247 |
| 0.4231 | 10.0 | 7510 | 0.4504 | 0.8280 |
| 0.4497 | 11.0 | 8261 | 0.4359 | 0.8364 |
| 0.3539 | 12.0 | 9012 | 0.4229 | 0.8381 |
| 0.3554 | 13.0 | 9763 | 0.4122 | 0.8414 |
| 0.3441 | 14.0 | 10514 | 0.4038 | 0.8414 |
| 0.3331 | 15.0 | 11265 | 0.3962 | 0.8431 |
| 0.3376 | 16.0 | 12016 | 0.3885 | 0.8431 |
| 0.374 | 17.0 | 12767 | 0.3827 | 0.8431 |
| 0.3157 | 18.0 | 13518 | 0.3768 | 0.8464 |
| 0.3563 | 19.0 | 14269 | 0.3725 | 0.8514 |
| 0.3183 | 20.0 | 15020 | 0.3682 | 0.8548 |
| 0.2569 | 21.0 | 15771 | 0.3646 | 0.8598 |
| 0.312 | 22.0 | 16522 | 0.3608 | 0.8581 |
| 0.3262 | 23.0 | 17273 | 0.3576 | 0.8598 |
| 0.3722 | 24.0 | 18024 | 0.3550 | 0.8598 |
| 0.3339 | 25.0 | 18775 | 0.3524 | 0.8598 |
| 0.3725 | 26.0 | 19526 | 0.3497 | 0.8631 |
| 0.35 | 27.0 | 20277 | 0.3474 | 0.8664 |
| 0.3858 | 28.0 | 21028 | 0.3456 | 0.8648 |
| 0.3212 | 29.0 | 21779 | 0.3439 | 0.8664 |
| 0.3222 | 30.0 | 22530 | 0.3422 | 0.8681 |
| 0.2584 | 31.0 | 23281 | 0.3410 | 0.8664 |
| 0.3877 | 32.0 | 24032 | 0.3393 | 0.8698 |
| 0.3116 | 33.0 | 24783 | 0.3380 | 0.8698 |
| 0.3141 | 34.0 | 25534 | 0.3366 | 0.8715 |
| 0.3279 | 35.0 | 26285 | 0.3358 | 0.8681 |
| 0.2798 | 36.0 | 27036 | 0.3348 | 0.8715 |
| 0.3928 | 37.0 | 27787 | 0.3341 | 0.8715 |
| 0.3 | 38.0 | 28538 | 0.3331 | 0.8715 |
| 0.2471 | 39.0 | 29289 | 0.3324 | 0.8715 |
| 0.3456 | 40.0 | 30040 | 0.3317 | 0.8715 |
| 0.3078 | 41.0 | 30791 | 0.3311 | 0.8715 |
| 0.24 | 42.0 | 31542 | 0.3306 | 0.8715 |
| 0.289 | 43.0 | 32293 | 0.3302 | 0.8715 |
| 0.2977 | 44.0 | 33044 | 0.3297 | 0.8715 |
| 0.2559 | 45.0 | 33795 | 0.3294 | 0.8715 |
| 0.3508 | 46.0 | 34546 | 0.3292 | 0.8715 |
| 0.26 | 47.0 | 35297 | 0.3291 | 0.8715 |
| 0.3325 | 48.0 | 36048 | 0.3290 | 0.8715 |
| 0.2898 | 49.0 | 36799 | 0.3289 | 0.8715 |
| 0.2912 | 50.0 | 37550 | 0.3289 | 0.8715 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.0+cu121
- Datasets 2.12.0
- Tokenizers 0.13.2
[ "abnormal_sperm", "non-sperm", "normal_sperm" ]
intMinsu/swinv2-large-patch4-window12to16-192to256-22kto1k-ft-01040117
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# swinv2-large-patch4-window12to16-192to256-22kto1k-ft-01040117

This model is a fine-tuned version of [microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft](https://huggingface.co/microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 64
- eval_batch_size: 64
- seed: 777
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 10

### Framework versions

- Transformers 4.36.2
- Pytorch 1.12.1
- Datasets 2.16.1
- Tokenizers 0.15.0
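Unlike the other runs in this section, this one specifies an absolute warmup of 2000 steps rather than a warmup ratio. A sketch of the equivalent schedule via `get_linear_schedule_with_warmup`, with an illustrative model and a made-up total step count (only the learning rate and warmup length are copied from the card):

```python
import torch
from transformers import get_linear_schedule_with_warmup

# Illustrative stand-in model/optimizer; lr copied from the card.
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

# Linear warmup for 2000 steps, then linear decay to zero over the
# remaining steps; num_training_steps here is a made-up total.
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=2000,
    num_training_steps=20_000,
)

for _ in range(5):        # one scheduler step per optimizer step
    optimizer.step()
    scheduler.step()
print(scheduler.get_last_lr())
```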
[ "airplane", "airport", "bare soil", "baseball diamond", "basketball court", "beach", "bridge", "buildings", "cars", "chaparral", "cloud", "containers", "crosswalk", "dense residential area", "desert", "dock", "factory", "field", "football field", "forest", "freeway", "golf course", "grass", "greenhouse", "gully", "habor", "intersection", "island", "lake", "mobile home", "mountain", "overpass", "park", "parking lot", "parkway", "pavement", "railway", "railway station", "river", "road", "roundabout", "runway", "sand", "sea", "ships", "snow", "snowberg", "sparse residential area", "stadium", "swimming pool", "tanks", "tennis court", "terrace", "track", "trail", "transmission tower", "trees", "water", "wetland", "wind turbine" ]
Dataallyouhave/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224-finetuned-flower

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

### Framework versions

- Transformers 4.24.0
- Pytorch 2.1.0+cu121
- Datasets 2.7.1
- Tokenizers 0.13.3
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
alirzb/S1_M1_R2_beit_42535991
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S1_M1_R2_beit_42535991

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0043
- Accuracy: 0.9990

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0302 | 1.0 | 260 | 0.0595 | 0.9883 |
| 0.0137 | 2.0 | 521 | 0.0066 | 0.9976 |
| 0.0007 | 3.0 | 782 | 0.0079 | 0.9981 |
| 0.0002 | 4.0 | 1043 | 0.0046 | 0.9988 |
| 0.0002 | 4.99 | 1300 | 0.0043 | 0.9990 |

### Framework versions

- Transformers 4.36.2
- Pytorch 2.1.2+cu118
- Datasets 2.16.1
- Tokenizers 0.15.0
[ "none_seizures", "seizures" ]
omarques/vit-base-patch16-224-finetuned-flower
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224-finetuned-flower

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

### Framework versions

- Transformers 4.24.0
- Pytorch 2.1.0+cu121
- Datasets 2.7.1
- Tokenizers 0.13.3
[ "daisy", "dandelion", "roses", "sunflowers", "tulips" ]
hkivancoral/smids_10x_beit_large_sgd_0001_fold2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# smids_10x_beit_large_sgd_0001_fold2

This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3022
- Accuracy: 0.8769

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.9337 | 1.0 | 750 | 0.9902 | 0.5025 |
| 0.7559 | 2.0 | 1500 | 0.8323 | 0.6206 |
| 0.6418 | 3.0 | 2250 | 0.7119 | 0.7205 |
| 0.6498 | 4.0 | 3000 | 0.6261 | 0.7737 |
| 0.5308 | 5.0 | 3750 | 0.5616 | 0.8020 |
| 0.5189 | 6.0 | 4500 | 0.5157 | 0.8186 |
| 0.4977 | 7.0 | 5250 | 0.4808 | 0.8303 |
| 0.4495 | 8.0 | 6000 | 0.4552 | 0.8369 |
| 0.4544 | 9.0 | 6750 | 0.4332 | 0.8303 |
| 0.4325 | 10.0 | 7500 | 0.4166 | 0.8336 |
| 0.4708 | 11.0 | 8250 | 0.4025 | 0.8419 |
| 0.4375 | 12.0 | 9000 | 0.3904 | 0.8419 |
| 0.3875 | 13.0 | 9750 | 0.3796 | 0.8486 |
| 0.338 | 14.0 | 10500 | 0.3718 | 0.8486 |
| 0.3613 | 15.0 | 11250 | 0.3643 | 0.8502 |
| 0.3159 | 16.0 | 12000 | 0.3576 | 0.8569 |
| 0.313 | 17.0 | 12750 | 0.3520 | 0.8602 |
| 0.3243 | 18.0 | 13500 | 0.3466 | 0.8619 |
| 0.3747 | 19.0 | 14250 | 0.3420 | 0.8619 |
| 0.3494 | 20.0 | 15000 | 0.3382 | 0.8652 |
| 0.3628 | 21.0 | 15750 | 0.3347 | 0.8652 |
| 0.2681 | 22.0 | 16500 | 0.3313 | 0.8686 |
| 0.3103 | 23.0 | 17250 | 0.3283 | 0.8686 |
| 0.3029 | 24.0 | 18000 | 0.3255 | 0.8686 |
| 0.3439 | 25.0 | 18750 | 0.3228 | 0.8686 |
| 0.363 | 26.0 | 19500 | 0.3205 | 0.8735 |
| 0.3457 | 27.0 | 20250 | 0.3186 | 0.8735 |
| 0.3118 | 28.0 | 21000 | 0.3168 | 0.8719 |
| 0.3203 | 29.0 | 21750 | 0.3151 | 0.8719 |
| 0.2897 | 30.0 | 22500 | 0.3135 | 0.8702 |
| 0.3287 | 31.0 | 23250 | 0.3118 | 0.8702 |
| 0.3672 | 32.0 | 24000 | 0.3107 | 0.8719 |
| 0.3139 | 33.0 | 24750 | 0.3101 | 0.8702 |
| 0.3173 | 34.0 | 25500 | 0.3088 | 0.8719 |
| 0.3321 | 35.0 | 26250 | 0.3079 | 0.8735 |
| 0.3146 | 36.0 | 27000 | 0.3071 | 0.8735 |
| 0.3221 | 37.0 | 27750 | 0.3062 | 0.8735 |
| 0.2973 | 38.0 | 28500 | 0.3058 | 0.8752 |
| 0.275 | 39.0 | 29250 | 0.3050 | 0.8752 |
| 0.3603 | 40.0 | 30000 | 0.3045 | 0.8752 |
| 0.3249 | 41.0 | 30750 | 0.3040 | 0.8752 |
| 0.3107 | 42.0 | 31500 | 0.3036 | 0.8752 |
| 0.2783 | 43.0 | 32250 | 0.3032 | 0.8752 |
| 0.2901 | 44.0 | 33000 | 0.3029 | 0.8752 |
| 0.3257 | 45.0 | 33750 | 0.3026 | 0.8752 |
| 0.2732 | 46.0 | 34500 | 0.3025 | 0.8752 |
| 0.3622 | 47.0 | 35250 | 0.3024 | 0.8769 |
| 0.3082 | 48.0 | 36000 | 0.3023 | 0.8769 |
| 0.2937 | 49.0 | 36750 | 0.3022 | 0.8769 |
| 0.3097 | 50.0 | 37500 | 0.3022 | 0.8769 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.0+cu121
- Datasets 2.12.0
- Tokenizers 0.13.2
[ "abnormal_sperm", "non-sperm", "normal_sperm" ]
Hugomartinezg/primer-vit-model-hugo
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# primer-vit-model-hugo

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0242
- Accuracy: 0.9925

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1145 | 3.85 | 500 | 0.0242 | 0.9925 |

### Framework versions

- Transformers 4.30.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "angular_leaf_spot", "bean_rust", "healthy" ]
hkivancoral/smids_10x_beit_large_sgd_0001_fold3
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# smids_10x_beit_large_sgd_0001_fold3

This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3045
- Accuracy: 0.8783

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.9373 | 1.0 | 750 | 1.0091 | 0.4617 |
| 0.7639 | 2.0 | 1500 | 0.8536 | 0.6117 |
| 0.6803 | 3.0 | 2250 | 0.7396 | 0.6933 |
| 0.5905 | 4.0 | 3000 | 0.6588 | 0.75 |
| 0.5735 | 5.0 | 3750 | 0.5968 | 0.7833 |
| 0.5021 | 6.0 | 4500 | 0.5507 | 0.8017 |
| 0.4704 | 7.0 | 5250 | 0.5159 | 0.8133 |
| 0.4872 | 8.0 | 6000 | 0.4878 | 0.8267 |
| 0.4458 | 9.0 | 6750 | 0.4650 | 0.83 |
| 0.4154 | 10.0 | 7500 | 0.4469 | 0.8417 |
| 0.4321 | 11.0 | 8250 | 0.4318 | 0.845 |
| 0.3944 | 12.0 | 9000 | 0.4172 | 0.8433 |
| 0.3976 | 13.0 | 9750 | 0.4054 | 0.8483 |
| 0.4242 | 14.0 | 10500 | 0.3948 | 0.85 |
| 0.3817 | 15.0 | 11250 | 0.3850 | 0.8517 |
| 0.3695 | 16.0 | 12000 | 0.3777 | 0.8517 |
| 0.3394 | 17.0 | 12750 | 0.3711 | 0.8533 |
| 0.3418 | 18.0 | 13500 | 0.3639 | 0.8583 |
| 0.3927 | 19.0 | 14250 | 0.3584 | 0.8633 |
| 0.3355 | 20.0 | 15000 | 0.3536 | 0.8617 |
| 0.3182 | 21.0 | 15750 | 0.3485 | 0.86 |
| 0.3252 | 22.0 | 16500 | 0.3442 | 0.8617 |
| 0.3481 | 23.0 | 17250 | 0.3402 | 0.86 |
| 0.352 | 24.0 | 18000 | 0.3367 | 0.8617 |
| 0.3814 | 25.0 | 18750 | 0.3335 | 0.865 |
| 0.3436 | 26.0 | 19500 | 0.3305 | 0.865 |
| 0.2353 | 27.0 | 20250 | 0.3280 | 0.865 |
| 0.3097 | 28.0 | 21000 | 0.3253 | 0.8683 |
| 0.3673 | 29.0 | 21750 | 0.3232 | 0.8683 |
| 0.316 | 30.0 | 22500 | 0.3211 | 0.87 |
| 0.2736 | 31.0 | 23250 | 0.3193 | 0.8733 |
| 0.3111 | 32.0 | 24000 | 0.3172 | 0.875 |
| 0.3586 | 33.0 | 24750 | 0.3157 | 0.875 |
| 0.3482 | 34.0 | 25500 | 0.3143 | 0.875 |
| 0.2894 | 35.0 | 26250 | 0.3130 | 0.875 |
| 0.3247 | 36.0 | 27000 | 0.3121 | 0.8733 |
| 0.3266 | 37.0 | 27750 | 0.3109 | 0.8733 |
| 0.3501 | 38.0 | 28500 | 0.3098 | 0.8733 |
| 0.3018 | 39.0 | 29250 | 0.3089 | 0.875 |
| 0.3416 | 40.0 | 30000 | 0.3082 | 0.875 |
| 0.318 | 41.0 | 30750 | 0.3074 | 0.875 |
| 0.3558 | 42.0 | 31500 | 0.3067 | 0.8767 |
| 0.2993 | 43.0 | 32250 | 0.3061 | 0.8767 |
| 0.2907 | 44.0 | 33000 | 0.3056 | 0.8767 |
| 0.2783 | 45.0 | 33750 | 0.3053 | 0.8767 |
| 0.2937 | 46.0 | 34500 | 0.3050 | 0.8767 |
| 0.3037 | 47.0 | 35250 | 0.3048 | 0.8783 |
| 0.3233 | 48.0 | 36000 | 0.3046 | 0.8783 |
| 0.3074 | 49.0 | 36750 | 0.3045 | 0.8783 |
| 0.2861 | 50.0 | 37500 | 0.3045 | 0.8783 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.0+cu121
- Datasets 2.12.0
- Tokenizers 0.13.2
[ "abnormal_sperm", "non-sperm", "normal_sperm" ]
hkivancoral/smids_10x_beit_large_sgd_0001_fold4
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# smids_10x_beit_large_sgd_0001_fold4

This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3577
- Accuracy: 0.8583

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.956 | 1.0 | 750 | 0.9742 | 0.4667 |
| 0.7783 | 2.0 | 1500 | 0.8200 | 0.63 |
| 0.7323 | 3.0 | 2250 | 0.7096 | 0.71 |
| 0.6337 | 4.0 | 3000 | 0.6341 | 0.7517 |
| 0.5065 | 5.0 | 3750 | 0.5795 | 0.775 |
| 0.4965 | 6.0 | 4500 | 0.5386 | 0.8 |
| 0.4578 | 7.0 | 5250 | 0.5091 | 0.8117 |
| 0.4692 | 8.0 | 6000 | 0.4857 | 0.825 |
| 0.4711 | 9.0 | 6750 | 0.4676 | 0.8333 |
| 0.3709 | 10.0 | 7500 | 0.4525 | 0.835 |
| 0.4051 | 11.0 | 8250 | 0.4402 | 0.8367 |
| 0.4533 | 12.0 | 9000 | 0.4305 | 0.8417 |
| 0.3537 | 13.0 | 9750 | 0.4215 | 0.8467 |
| 0.4025 | 14.0 | 10500 | 0.4147 | 0.8483 |
| 0.3254 | 15.0 | 11250 | 0.4082 | 0.8467 |
| 0.3312 | 16.0 | 12000 | 0.4031 | 0.8467 |
| 0.2854 | 17.0 | 12750 | 0.3983 | 0.8483 |
| 0.3355 | 18.0 | 13500 | 0.3942 | 0.8517 |
| 0.3881 | 19.0 | 14250 | 0.3905 | 0.8483 |
| 0.3257 | 20.0 | 15000 | 0.3873 | 0.8517 |
| 0.3303 | 21.0 | 15750 | 0.3846 | 0.8483 |
| 0.3308 | 22.0 | 16500 | 0.3815 | 0.8517 |
| 0.3025 | 23.0 | 17250 | 0.3791 | 0.85 |
| 0.3591 | 24.0 | 18000 | 0.3770 | 0.8517 |
| 0.3426 | 25.0 | 18750 | 0.3750 | 0.8567 |
| 0.2909 | 26.0 | 19500 | 0.3737 | 0.8567 |
| 0.3106 | 27.0 | 20250 | 0.3719 | 0.855 |
| 0.3129 | 28.0 | 21000 | 0.3704 | 0.855 |
| 0.2957 | 29.0 | 21750 | 0.3688 | 0.855 |
| 0.2639 | 30.0 | 22500 | 0.3673 | 0.855 |
| 0.2821 | 31.0 | 23250 | 0.3660 | 0.855 |
| 0.2912 | 32.0 | 24000 | 0.3649 | 0.8567 |
| 0.3006 | 33.0 | 24750 | 0.3640 | 0.8583 |
| 0.3129 | 34.0 | 25500 | 0.3632 | 0.8583 |
| 0.2463 | 35.0 | 26250 | 0.3625 | 0.86 |
| 0.3133 | 36.0 | 27000 | 0.3619 | 0.8583 |
| 0.3061 | 37.0 | 27750 | 0.3612 | 0.8583 |
| 0.3206 | 38.0 | 28500 | 0.3606 | 0.8583 |
| 0.3433 | 39.0 | 29250 | 0.3601 | 0.8583 |
| 0.3138 | 40.0 | 30000 | 0.3597 | 0.8583 |
| 0.2988 | 41.0 | 30750 | 0.3593 | 0.8583 |
| 0.3075 | 42.0 | 31500 | 0.3589 | 0.8583 |
| 0.3059 | 43.0 | 32250 | 0.3587 | 0.8583 |
| 0.3142 | 44.0 | 33000 | 0.3585 | 0.8583 |
| 0.3034 | 45.0 | 33750 | 0.3583 | 0.8583 |
| 0.2744 | 46.0 | 34500 | 0.3580 | 0.8583 |
| 0.2599 | 47.0 | 35250 | 0.3579 | 0.8583 |
| 0.2643 | 48.0 | 36000 | 0.3578 | 0.8583 |
| 0.2927 | 49.0 | 36750 | 0.3577 | 0.8583 |
| 0.2381 | 50.0 | 37500 | 0.3577 | 0.8583 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.0+cu121
- Datasets 2.12.0
- Tokenizers 0.13.2
[ "abnormal_sperm", "non-sperm", "normal_sperm" ]
lxl2023/autotrain-3xy9g-unefm1
# Model Trained Using AutoTrain

- Problem type: Image Classification

## Validation Metrics

loss: nan
f1_macro: 0.13333333333333333
f1_micro: 0.25
f1_weighted: 0.1
precision_macro: 0.08333333333333333
precision_micro: 0.25
precision_weighted: 0.0625
recall_macro: 0.3333333333333333
recall_micro: 0.25
recall_weighted: 0.25
accuracy: 0.25
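For reference, the macro/micro/weighted metric variants reported by AutoTrain correspond to scikit-learn's `average=` argument. A toy sketch with three classes, standing in for the three labels listed below:

```python
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

# Toy 3-class labels; 0/1/2 stand in for the three classes below.
y_true = [0, 1, 2, 2, 1, 0]
y_pred = [0, 2, 2, 2, 0, 0]

for avg in ("macro", "micro", "weighted"):
    print(
        avg,
        f1_score(y_true, y_pred, average=avg, zero_division=0),
        precision_score(y_true, y_pred, average=avg, zero_division=0),
        recall_score(y_true, y_pred, average=avg, zero_division=0),
    )
print("accuracy:", accuracy_score(y_true, y_pred))
```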
[ "11covered_with_a_quilt_and_only_the_head_exposed", "12covered_with_a_quilt_and_exposed_other_parts_of_the_body", "13has_nothing_to_do_with_11_and_12_above" ]
lxl2023/autotrain-wf2gy-dl7q8
# Model Trained Using AutoTrain

- Problem type: Image Classification

## Validation Metrics

loss: 0.8730660676956177
f1_macro: 1.0
f1_micro: 1.0
f1_weighted: 1.0
precision_macro: 1.0
precision_micro: 1.0
precision_weighted: 1.0
recall_macro: 1.0
recall_micro: 1.0
recall_weighted: 1.0
accuracy: 1.0
[ "11covered_with_a_quilt_and_only_the_head_exposed", "12covered_with_a_quilt_and_exposed_other_parts_of_the_body", "13has_nothing_to_do_with_11_and_12_above" ]
lxl2023/autotrain-9974e-705pt
# Model Trained Using AutoTrain

- Problem type: Image Classification

## Validation Metrics

loss: 0.9459153413772583
f1_macro: 0.26666666666666666
f1_micro: 0.5
f1_weighted: 0.4
precision_macro: 0.2222222222222222
precision_micro: 0.5
precision_weighted: 0.3333333333333333
recall_macro: 0.3333333333333333
recall_micro: 0.5
recall_weighted: 0.5
accuracy: 0.5
[ "11covered_with_a_quilt_and_only_the_head_exposed", "12covered_with_a_quilt_and_exposed_other_parts_of_the_body", "13has_nothing_to_do_with_11_and_12_above" ]
hkivancoral/smids_10x_beit_large_sgd_0001_fold5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# smids_10x_beit_large_sgd_0001_fold5

This model is a fine-tuned version of [microsoft/beit-large-patch16-224](https://huggingface.co/microsoft/beit-large-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3210
- Accuracy: 0.8733

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.9567 | 1.0 | 750 | 1.0187 | 0.4617 |
| 0.813 | 2.0 | 1500 | 0.8588 | 0.6033 |
| 0.7071 | 3.0 | 2250 | 0.7412 | 0.6717 |
| 0.6056 | 4.0 | 3000 | 0.6548 | 0.7317 |
| 0.553 | 5.0 | 3750 | 0.5916 | 0.7767 |
| 0.5415 | 6.0 | 4500 | 0.5456 | 0.7983 |
| 0.4714 | 7.0 | 5250 | 0.5118 | 0.8083 |
| 0.4919 | 8.0 | 6000 | 0.4844 | 0.8133 |
| 0.4714 | 9.0 | 6750 | 0.4633 | 0.8167 |
| 0.408 | 10.0 | 7500 | 0.4458 | 0.8267 |
| 0.416 | 11.0 | 8250 | 0.4326 | 0.8317 |
| 0.4057 | 12.0 | 9000 | 0.4197 | 0.84 |
| 0.4411 | 13.0 | 9750 | 0.4091 | 0.8383 |
| 0.3787 | 14.0 | 10500 | 0.3999 | 0.84 |
| 0.4112 | 15.0 | 11250 | 0.3917 | 0.8433 |
| 0.3272 | 16.0 | 12000 | 0.3857 | 0.8433 |
| 0.3453 | 17.0 | 12750 | 0.3795 | 0.8467 |
| 0.2978 | 18.0 | 13500 | 0.3732 | 0.8467 |
| 0.3695 | 19.0 | 14250 | 0.3692 | 0.8533 |
| 0.3546 | 20.0 | 15000 | 0.3643 | 0.855 |
| 0.3274 | 21.0 | 15750 | 0.3603 | 0.8583 |
| 0.3708 | 22.0 | 16500 | 0.3566 | 0.8583 |
| 0.3177 | 23.0 | 17250 | 0.3530 | 0.8617 |
| 0.3259 | 24.0 | 18000 | 0.3501 | 0.865 |
| 0.3343 | 25.0 | 18750 | 0.3473 | 0.8683 |
| 0.3365 | 26.0 | 19500 | 0.3445 | 0.865 |
| 0.2524 | 27.0 | 20250 | 0.3419 | 0.865 |
| 0.3298 | 28.0 | 21000 | 0.3396 | 0.8667 |
| 0.3375 | 29.0 | 21750 | 0.3374 | 0.8667 |
| 0.3203 | 30.0 | 22500 | 0.3355 | 0.8683 |
| 0.2843 | 31.0 | 23250 | 0.3334 | 0.8683 |
| 0.3065 | 32.0 | 24000 | 0.3325 | 0.8667 |
| 0.3385 | 33.0 | 24750 | 0.3310 | 0.8717 |
| 0.2656 | 34.0 | 25500 | 0.3296 | 0.8717 |
| 0.3103 | 35.0 | 26250 | 0.3282 | 0.8733 |
| 0.3336 | 36.0 | 27000 | 0.3274 | 0.8717 |
| 0.2743 | 37.0 | 27750 | 0.3265 | 0.8733 |
| 0.3245 | 38.0 | 28500 | 0.3255 | 0.8717 |
| 0.321 | 39.0 | 29250 | 0.3249 | 0.8733 |
| 0.2652 | 40.0 | 30000 | 0.3240 | 0.8733 |
| 0.2925 | 41.0 | 30750 | 0.3236 | 0.875 |
| 0.3072 | 42.0 | 31500 | 0.3229 | 0.875 |
| 0.3317 | 43.0 | 32250 | 0.3226 | 0.875 |
| 0.2932 | 44.0 | 33000 | 0.3221 | 0.875 |
| 0.3178 | 45.0 | 33750 | 0.3218 | 0.8733 |
| 0.2606 | 46.0 | 34500 | 0.3214 | 0.875 |
| 0.3688 | 47.0 | 35250 | 0.3212 | 0.875 |
| 0.2811 | 48.0 | 36000 | 0.3211 | 0.8733 |
| 0.3003 | 49.0 | 36750 | 0.3211 | 0.8733 |
| 0.2418 | 50.0 | 37500 | 0.3210 | 0.8733 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.0+cu121
- Datasets 2.12.0
- Tokenizers 0.13.2
[ "abnormal_sperm", "non-sperm", "normal_sperm" ]
Vigneshwari-Sambandan/vit-base-patch16-224-finetuned-fibre
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224-finetuned-fibre

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5532
- Accuracy: 0.5180

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.6045 | 1.0 | 879 | 1.6613 | 0.4918 |
| 1.5847 | 2.0 | 1758 | 1.5962 | 0.5065 |
| 1.4774 | 3.0 | 2637 | 1.5532 | 0.5180 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
[ "abaca", "acrylic", "alpaca", "angora", "aramid", "camel", "cashmere", "cotton", "cupro", "elastane_spandex", "flax_linen", "fur", "hemp", "horse_hair", "jute", "leather", "llama", "lyocell", "milk_fiber", "modal", "mohair", "nylon", "polyester", "polyolefin", "ramie", "silk", "sisal", "soybean_fiber", "suede", "triacetate_acetate", "viscose_rayon", "wool", "yak" ]
tonyassi/vogue-fashion-collection-15-nobg
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# vogue-fashion-collection-15-nobg

## Model description

This model classifies an image into a fashion collection. It is trained on the [tonyassi/vogue-runway-top15-512px-nobg](https://huggingface.co/datasets/tonyassi/vogue-runway-top15-512px-nobg) dataset and is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k).

Because the model was trained on a dataset with white backgrounds, it is suggested to only give the model images with a white background. Removing the background allows the model to focus on the clothes and disregard the background.

## Dataset description

[tonyassi/vogue-runway-top15-512px-nobg](https://huggingface.co/datasets/tonyassi/vogue-runway-top15-512px-nobg)
- 15 fashion houses
- 1679 collections
- 87,547 images
- No background

### How to use

```python
from transformers import pipeline

# Initialize image classification pipeline
pipe = pipeline("image-classification", model="tonyassi/vogue-fashion-collection-15-nobg")

# Perform classification
result = pipe('image.png')

# Print results
print(result)
```

## Training and evaluation data

It achieves the following results on the evaluation set:
- Loss: 0.5880
- Accuracy: 0.8403

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
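Since the card recommends white or removed backgrounds, a natural preprocessing step is to strip the background before classification. A sketch using the third-party `rembg` package (an assumption: the card does not say which tool produced the no-background dataset, and `look.jpg` is a placeholder path):

```python
from PIL import Image
from rembg import remove  # third-party package: pip install rembg

# Strip the background, then paste onto white so the input matches the
# white-background training distribution described above.
foreground = remove(Image.open("look.jpg"))        # RGBA with alpha mask
canvas = Image.new("RGBA", foreground.size, "WHITE")
canvas.paste(foreground, mask=foreground)
canvas.convert("RGB").save("look_nobg.jpg")
```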
[ "alexander mcqueen,fall 1996 ready to wear", "alexander mcqueen,fall 1997 ready to wear", "alexander mcqueen,fall 2005 ready to wear", "alexander mcqueen,spring 2012 menswear", "louis vuitton,pre fall 2009", "louis vuitton,pre fall 2010", "louis vuitton,pre fall 2011", "louis vuitton,pre fall 2012", "louis vuitton,pre fall 2013", "louis vuitton,pre fall 2014", "louis vuitton,pre fall 2015", "louis vuitton,pre fall 2016", "louis vuitton,pre fall 2017", "louis vuitton,pre fall 2018", "alexander mcqueen,spring 2012 ready to wear", "louis vuitton,pre fall 2019", "louis vuitton,pre fall 2020", "louis vuitton,pre fall 2020 menswear", "louis vuitton,pre fall 2021", "louis vuitton,pre fall 2021 menswear", "louis vuitton,pre fall 2022 menswear", "louis vuitton,pre fall 2023", "louis vuitton,pre fall 2023 menswear", "louis vuitton,pre fall 2024 menswear", "louis vuitton,resort 2008", "alexander mcqueen,spring 2013 menswear", "louis vuitton,resort 2009", "louis vuitton,resort 2010", "louis vuitton,resort 2011", "louis vuitton,resort 2012", "louis vuitton,resort 2013", "louis vuitton,resort 2014", "louis vuitton,resort 2015", "louis vuitton,resort 2016", "louis vuitton,resort 2017", "louis vuitton,resort 2018", "alexander mcqueen,spring 2013 ready to wear", "louis vuitton,resort 2019", "louis vuitton,resort 2020", "louis vuitton,resort 2021", "louis vuitton,resort 2021 menswear", "louis vuitton,resort 2022", "louis vuitton,resort 2022 menswear", "louis vuitton,resort 2023", "louis vuitton,resort 2023 menswear", "louis vuitton,resort 2024", "louis vuitton,resort 2024 menswear", "alexander mcqueen,spring 2014 menswear", "louis vuitton,spring 2000 ready to wear", "louis vuitton,spring 2001 ready to wear", "louis vuitton,spring 2002 ready to wear", "louis vuitton,spring 2003 ready to wear", "louis vuitton,spring 2004 ready to wear", "louis vuitton,spring 2005 menswear", "louis vuitton,spring 2005 ready to wear", "louis vuitton,spring 2006 menswear", "louis vuitton,spring 2006 ready to wear", "louis vuitton,spring 2007 menswear", "alexander mcqueen,spring 2014 ready to wear", "louis vuitton,spring 2007 ready to wear", "louis vuitton,spring 2008 menswear", "louis vuitton,spring 2008 ready to wear", "louis vuitton,spring 2009 menswear", "louis vuitton,spring 2009 ready to wear", "louis vuitton,spring 2010 menswear", "louis vuitton,spring 2010 ready to wear", "louis vuitton,spring 2011 menswear", "louis vuitton,spring 2011 ready to wear", "louis vuitton,spring 2012 menswear", "alexander mcqueen,spring 2015 menswear", "louis vuitton,spring 2012 ready to wear", "louis vuitton,spring 2013 menswear", "louis vuitton,spring 2013 ready to wear", "louis vuitton,spring 2014 menswear", "louis vuitton,spring 2014 ready to wear", "louis vuitton,spring 2015 menswear", "louis vuitton,spring 2015 ready to wear", "louis vuitton,spring 2016 menswear", "louis vuitton,spring 2016 ready to wear", "louis vuitton,spring 2017 menswear", "alexander mcqueen,spring 2015 ready to wear", "louis vuitton,spring 2017 ready to wear", "louis vuitton,spring 2018 menswear", "louis vuitton,spring 2018 ready to wear", "louis vuitton,spring 2019 menswear", "louis vuitton,spring 2019 ready to wear", "louis vuitton,spring 2020 menswear", "louis vuitton,spring 2020 ready to wear", "louis vuitton,spring 2021 menswear", "louis vuitton,spring 2021 ready to wear", "louis vuitton,spring 2022 menswear", "alexander mcqueen,spring 2016 menswear", "louis vuitton,spring 2023 menswear", "louis vuitton,spring 2023 ready to wear", "louis vuitton,spring 2024 
menswear", "prada,fall 1996 ready to wear", "prada,fall 2000 ready to wear", "prada,fall 2001 ready to wear", "prada,fall 2002 ready to wear", "prada,fall 2003 ready to wear", "prada,fall 2004 ready to wear", "prada,fall 2005 menswear", "alexander mcqueen,spring 2016 ready to wear", "prada,fall 2005 ready to wear", "prada,fall 2006 menswear", "prada,fall 2006 ready to wear", "prada,fall 2007 menswear", "prada,fall 2007 ready to wear", "prada,fall 2008 menswear", "prada,fall 2008 ready to wear", "prada,fall 2009 menswear", "prada,fall 2009 ready to wear", "prada,fall 2010 menswear", "alexander mcqueen,fall 2006 menswear", "alexander mcqueen,spring 2017 menswear", "prada,fall 2010 ready to wear", "prada,fall 2011 menswear", "prada,fall 2011 ready to wear", "prada,fall 2012 menswear", "prada,fall 2012 ready to wear", "prada,fall 2013 menswear", "prada,fall 2013 ready to wear", "prada,fall 2014 menswear", "prada,fall 2014 ready to wear", "prada,fall 2015 menswear", "alexander mcqueen,spring 2017 ready to wear", "prada,fall 2015 ready to wear", "prada,fall 2016 menswear", "prada,fall 2016 ready to wear", "prada,fall 2017 menswear", "prada,fall 2017 ready to wear", "prada,fall 2018 menswear", "prada,fall 2018 ready to wear", "prada,fall 2019 menswear", "prada,fall 2019 ready to wear", "prada,fall 2020 menswear", "alexander mcqueen,spring 2018 menswear", "prada,fall 2020 ready to wear", "prada,fall 2021 menswear", "prada,fall 2021 ready to wear", "prada,fall 2022 menswear", "prada,fall 2022 ready to wear", "prada,fall 2023 menswear", "prada,fall 2023 ready to wear", "prada,pre fall 2009", "prada,pre fall 2010", "prada,resort 2008", "alexander mcqueen,spring 2018 ready to wear", "prada,resort 2009", "prada,resort 2010", "prada,resort 2011", "prada,resort 2012", "prada,resort 2013", "prada,resort 2018", "prada,resort 2019", "prada,resort 2020", "prada,spring 1992 ready to wear", "prada,spring 1993 ready to wear", "alexander mcqueen,spring 2019 menswear", "prada,spring 1994 ready to wear", "prada,spring 1995 ready to wear", "prada,spring 1996 ready to wear", "prada,spring 1997 ready to wear", "prada,spring 1998 ready to wear", "prada,spring 1999 ready to wear", "prada,spring 2000 ready to wear", "prada,spring 2001 ready to wear", "prada,spring 2002 ready to wear", "prada,spring 2003 ready to wear", "alexander mcqueen,spring 2019 ready to wear", "prada,spring 2004 ready to wear", "prada,spring 2005 menswear", "prada,spring 2005 ready to wear", "prada,spring 2006 menswear", "prada,spring 2006 ready to wear", "prada,spring 2007 menswear", "prada,spring 2007 ready to wear", "prada,spring 2008 menswear", "prada,spring 2008 ready to wear", "prada,spring 2009 menswear", "alexander mcqueen,spring 2020 menswear", "prada,spring 2009 ready to wear", "prada,spring 2010 ready to wear", "prada,spring 2011 menswear", "prada,spring 2011 ready to wear", "prada,spring 2012 menswear", "prada,spring 2012 ready to wear", "prada,spring 2013 menswear", "prada,spring 2013 ready to wear", "prada,spring 2014 menswear", "prada,spring 2014 ready to wear", "alexander mcqueen,spring 2020 ready to wear", "prada,spring 2015 menswear", "prada,spring 2015 ready to wear", "prada,spring 2016 menswear", "prada,spring 2016 ready to wear", "prada,spring 2017 menswear", "prada,spring 2017 ready to wear", "prada,spring 2018 menswear", "prada,spring 2018 ready to wear", "prada,spring 2019 menswear", "prada,spring 2019 ready to wear", "alexander mcqueen,spring 2021 menswear", "prada,spring 2020 menswear", "prada,spring 2020 ready to wear", 
"prada,spring 2021 menswear", "prada,spring 2021 ready to wear", "prada,spring 2022 menswear", "prada,spring 2022 ready to wear", "prada,spring 2023 menswear", "prada,spring 2023 ready to wear", "prada,spring 2024 menswear", "prada,spring 2024 ready to wear", "alexander mcqueen,spring 2021 ready to wear", "ralph lauren,fall 2000 ready to wear", "ralph lauren,fall 2001 ready to wear", "ralph lauren,fall 2002 ready to wear", "ralph lauren,fall 2003 ready to wear", "ralph lauren,fall 2004 ready to wear", "ralph lauren,fall 2005 menswear", "ralph lauren,fall 2005 ready to wear", "ralph lauren,fall 2006 menswear", "ralph lauren,fall 2006 ready to wear", "ralph lauren,fall 2007 menswear", "alexander mcqueen,fall 2006 ready to wear", "alexander mcqueen,spring 2022 menswear", "ralph lauren,fall 2007 ready to wear", "ralph lauren,fall 2008 menswear", "ralph lauren,fall 2008 ready to wear", "ralph lauren,fall 2009 ready to wear", "ralph lauren,fall 2010 menswear", "ralph lauren,fall 2010 ready to wear", "ralph lauren,fall 2011 ready to wear", "ralph lauren,fall 2012 ready to wear", "ralph lauren,fall 2013 menswear", "ralph lauren,fall 2013 ready to wear", "alexander mcqueen,spring 2022 ready to wear", "ralph lauren,fall 2014 menswear", "ralph lauren,fall 2014 ready to wear", "ralph lauren,fall 2015 menswear", "ralph lauren,fall 2015 ready to wear", "ralph lauren,fall 2016 menswear", "ralph lauren,fall 2016 ready to wear", "ralph lauren,fall 2017 menswear", "ralph lauren,fall 2017 ready to wear", "ralph lauren,fall 2018 menswear", "ralph lauren,fall 2018 ready to wear", "alexander mcqueen,spring 2023 menswear", "ralph lauren,fall 2019 menswear", "ralph lauren,fall 2019 ready to wear", "ralph lauren,fall 2020 menswear", "ralph lauren,fall 2020 ready to wear", "ralph lauren,fall 2021 ready to wear", "ralph lauren,fall 2022 ready to wear", "ralph lauren,fall 2023 ready to wear", "ralph lauren,pre fall 2014", "ralph lauren,pre fall 2015", "ralph lauren,pre fall 2016", "alexander mcqueen,spring 2023 ready to wear", "ralph lauren,pre fall 2017", "ralph lauren,pre fall 2018", "ralph lauren,pre fall 2019", "ralph lauren,pre fall 2020", "ralph lauren,pre fall 2021", "ralph lauren,resort 2008", "ralph lauren,resort 2009", "ralph lauren,resort 2013", "ralph lauren,resort 2014", "ralph lauren,resort 2015", "alexander mcqueen,spring 2024 menswear", "ralph lauren,resort 2016", "ralph lauren,resort 2019", "ralph lauren,resort 2022", "ralph lauren,resort 2024", "ralph lauren,spring 2000 ready to wear", "ralph lauren,spring 2001 ready to wear", "ralph lauren,spring 2002 ready to wear", "ralph lauren,spring 2003 ready to wear", "ralph lauren,spring 2004 ready to wear", "ralph lauren,spring 2005 ready to wear", "alexander mcqueen,spring 2024 ready to wear", "ralph lauren,spring 2006 menswear", "ralph lauren,spring 2006 ready to wear", "ralph lauren,spring 2007 menswear", "ralph lauren,spring 2007 ready to wear", "ralph lauren,spring 2008 menswear", "ralph lauren,spring 2008 ready to wear", "ralph lauren,spring 2009 ready to wear", "ralph lauren,spring 2010 ready to wear", "ralph lauren,spring 2011 ready to wear", "ralph lauren,spring 2012 ready to wear", "armani prive,fall 2005 couture", "ralph lauren,spring 2013 menswear", "ralph lauren,spring 2013 ready to wear", "ralph lauren,spring 2014 menswear", "ralph lauren,spring 2014 ready to wear", "ralph lauren,spring 2015 menswear", "ralph lauren,spring 2015 ready to wear", "ralph lauren,spring 2016 menswear", "ralph lauren,spring 2016 ready to wear", "ralph lauren,spring 
2017 menswear", "ralph lauren,spring 2017 ready to wear", "armani prive,fall 2006 couture", "ralph lauren,spring 2018 menswear", "ralph lauren,spring 2018 ready to wear", "ralph lauren,spring 2019 menswear", "ralph lauren,spring 2019 ready to wear", "ralph lauren,spring 2020 menswear", "ralph lauren,spring 2021 ready to wear", "ralph lauren,spring 2022 ready to wear", "ralph lauren,spring 2023 ready to wear", "ralph lauren,spring 2024 menswear", "ralph lauren,spring 2024 ready to wear", "armani prive,fall 2007 couture", "saint laurent,fall 2000 ready to wear", "saint laurent,fall 2001 couture", "saint laurent,fall 2001 ready to wear", "saint laurent,fall 2002 ready to wear", "saint laurent,fall 2003 ready to wear", "saint laurent,fall 2004 ready to wear", "saint laurent,fall 2005 menswear", "saint laurent,fall 2005 ready to wear", "saint laurent,fall 2006 menswear", "saint laurent,fall 2006 ready to wear", "armani prive,fall 2008 couture", "saint laurent,fall 2007 menswear", "saint laurent,fall 2007 ready to wear", "saint laurent,fall 2008 menswear", "saint laurent,fall 2008 ready to wear", "saint laurent,fall 2009 ready to wear", "saint laurent,fall 2010 menswear", "saint laurent,fall 2010 ready to wear", "saint laurent,fall 2011 menswear", "saint laurent,fall 2011 ready to wear", "saint laurent,fall 2012 menswear", "alexander mcqueen,fall 2007 menswear", "armani prive,fall 2009 couture", "saint laurent,fall 2012 ready to wear", "saint laurent,fall 2013 menswear", "saint laurent,fall 2013 ready to wear", "saint laurent,fall 2014 menswear", "saint laurent,fall 2014 ready to wear", "saint laurent,fall 2015 menswear", "saint laurent,fall 2015 ready to wear", "saint laurent,fall 2016 menswear", "saint laurent,fall 2016 ready to wear", "saint laurent,fall 2017 ready to wear", "armani prive,fall 2010 couture", "saint laurent,fall 2018 ready to wear", "saint laurent,fall 2019 menswear", "saint laurent,fall 2019 ready to wear", "saint laurent,fall 2020 ready to wear", "saint laurent,fall 2021 menswear", "saint laurent,fall 2021 ready to wear", "saint laurent,fall 2022 menswear", "saint laurent,fall 2022 ready to wear", "saint laurent,fall 2023 menswear", "saint laurent,fall 2023 ready to wear", "armani prive,fall 2011 couture", "saint laurent,pre fall 2009", "saint laurent,pre fall 2010", "saint laurent,pre fall 2011", "saint laurent,pre fall 2012", "saint laurent,pre fall 2013", "saint laurent,pre fall 2016", "saint laurent,pre fall 2019", "saint laurent,pre fall 2020", "saint laurent,pre fall 2020 menswear", "saint laurent,pre fall 2021", "armani prive,fall 2012 couture", "saint laurent,pre fall 2022", "saint laurent,pre fall 2023", "saint laurent,resort 2008", "saint laurent,resort 2010", "saint laurent,resort 2011", "saint laurent,resort 2012", "saint laurent,resort 2014", "saint laurent,resort 2020", "saint laurent,resort 2021", "saint laurent,resort 2022", "armani prive,fall 2013 couture", "saint laurent,resort 2023", "saint laurent,spring 2000 ready to wear", "saint laurent,spring 2001 couture", "saint laurent,spring 2001 ready to wear", "saint laurent,spring 2002 couture", "saint laurent,spring 2002 ready to wear", "saint laurent,spring 2003 ready to wear", "saint laurent,spring 2004 ready to wear", "saint laurent,spring 2005 menswear", "saint laurent,spring 2005 ready to wear", "armani prive,fall 2014 couture", "saint laurent,spring 2006 menswear", "saint laurent,spring 2006 ready to wear", "saint laurent,spring 2007 menswear", "saint laurent,spring 2007 ready to wear", "saint 
laurent,spring 2008 menswear", "saint laurent,spring 2008 ready to wear", "saint laurent,spring 2009 menswear", "saint laurent,spring 2009 ready to wear", "saint laurent,spring 2010 ready to wear", "saint laurent,spring 2011 menswear", "armani prive,fall 2015 couture", "saint laurent,spring 2011 ready to wear", "saint laurent,spring 2012 menswear", "saint laurent,spring 2012 ready to wear", "saint laurent,spring 2013 ready to wear", "saint laurent,spring 2014 menswear", "saint laurent,spring 2014 ready to wear", "saint laurent,spring 2015 menswear", "saint laurent,spring 2015 ready to wear", "saint laurent,spring 2016 menswear", "saint laurent,spring 2016 ready to wear", "armani prive,fall 2016 couture", "saint laurent,spring 2017 ready to wear", "saint laurent,spring 2018 ready to wear", "saint laurent,spring 2019 menswear", "saint laurent,spring 2019 ready to wear", "saint laurent,spring 2020 menswear", "saint laurent,spring 2020 ready to wear", "saint laurent,spring 2021 menswear", "saint laurent,spring 2021 ready to wear", "saint laurent,spring 2022 menswear", "saint laurent,spring 2022 ready to wear", "armani prive,fall 2017 couture", "saint laurent,spring 2023 menswear", "saint laurent,spring 2023 ready to wear", "saint laurent,spring 2024 menswear", "saint laurent,spring 2024 ready to wear", "valentino,fall 2000 ready to wear", "valentino,fall 2001 couture", "valentino,fall 2001 ready to wear", "valentino,fall 2002 couture", "valentino,fall 2002 ready to wear", "valentino,fall 2003 couture", "armani prive,fall 2018 couture", "valentino,fall 2003 ready to wear", "valentino,fall 2004 couture", "valentino,fall 2004 ready to wear", "valentino,fall 2005 couture", "valentino,fall 2005 menswear", "valentino,fall 2005 ready to wear", "valentino,fall 2006 couture", "valentino,fall 2006 menswear", "valentino,fall 2006 ready to wear", "valentino,fall 2007 couture", "alexander mcqueen,fall 2007 ready to wear", "armani prive,fall 2019 couture", "valentino,fall 2007 menswear", "valentino,fall 2007 ready to wear", "valentino,fall 2008 couture", "valentino,fall 2008 menswear", "valentino,fall 2008 ready to wear", "valentino,fall 2009 couture", "valentino,fall 2009 ready to wear", "valentino,fall 2010 couture", "valentino,fall 2010 ready to wear", "valentino,fall 2011 couture", "armani prive,fall 2021 couture", "valentino,fall 2011 ready to wear", "valentino,fall 2012 couture", "valentino,fall 2012 menswear", "valentino,fall 2012 ready to wear", "valentino,fall 2013 couture", "valentino,fall 2013 menswear", "valentino,fall 2013 ready to wear", "valentino,fall 2014 couture", "valentino,fall 2014 menswear", "valentino,fall 2014 ready to wear", "armani prive,fall 2022 couture", "valentino,fall 2015 couture", "valentino,fall 2015 menswear", "valentino,fall 2015 ready to wear", "valentino,fall 2016 couture", "valentino,fall 2016 menswear", "valentino,fall 2016 ready to wear", "valentino,fall 2017 couture", "valentino,fall 2017 menswear", "valentino,fall 2017 ready to wear", "valentino,fall 2018 couture", "armani prive,fall 2023 couture", "valentino,fall 2018 menswear", "valentino,fall 2018 ready to wear", "valentino,fall 2019 couture", "valentino,fall 2019 menswear", "valentino,fall 2019 ready to wear", "valentino,fall 2020 couture", "valentino,fall 2020 menswear", "valentino,fall 2020 ready to wear", "valentino,fall 2021 couture", "valentino,fall 2021 ready to wear", "armani prive,spring 2005 couture", "valentino,fall 2022 couture", "valentino,fall 2022 ready to wear", "valentino,fall 2023 couture", 
"valentino,fall 2023 ready to wear", "valentino,pre fall 2008", "valentino,pre fall 2010", "valentino,pre fall 2011", "valentino,pre fall 2012", "valentino,pre fall 2013", "valentino,pre fall 2014", "armani prive,spring 2006 couture", "valentino,pre fall 2015", "valentino,pre fall 2016", "valentino,pre fall 2017", "valentino,pre fall 2018", "valentino,pre fall 2019", "valentino,pre fall 2020", "valentino,pre fall 2021", "valentino,pre fall 2022", "valentino,pre fall 2023", "valentino,pre fall 2024", "armani prive,spring 2007 couture", "valentino,resort 2008", "valentino,resort 2009", "valentino,resort 2011", "valentino,resort 2012", "valentino,resort 2013", "valentino,resort 2014", "valentino,resort 2015", "valentino,resort 2016", "valentino,resort 2017", "valentino,resort 2018", "armani prive,spring 2008 couture", "valentino,resort 2019", "valentino,resort 2020", "valentino,resort 2021", "valentino,resort 2022", "valentino,resort 2023", "valentino,resort 2024", "valentino,spring 2000 ready to wear", "valentino,spring 2001 couture", "valentino,spring 2001 ready to wear", "valentino,spring 2002 couture", "armani prive,spring 2009 couture", "valentino,spring 2002 ready to wear", "valentino,spring 2003 couture", "valentino,spring 2003 ready to wear", "valentino,spring 2004 couture", "valentino,spring 2004 ready to wear", "valentino,spring 2005 couture", "valentino,spring 2005 menswear", "valentino,spring 2005 ready to wear", "valentino,spring 2006 couture", "valentino,spring 2006 menswear", "armani prive,spring 2010 couture", "valentino,spring 2006 ready to wear", "valentino,spring 2007 couture", "valentino,spring 2007 menswear", "valentino,spring 2007 ready to wear", "valentino,spring 2008 couture", "valentino,spring 2008 menswear", "valentino,spring 2008 ready to wear", "valentino,spring 2009 couture", "valentino,spring 2009 menswear", "valentino,spring 2009 ready to wear", "alexander mcqueen,fall 2008 menswear", "armani prive,spring 2011 couture", "valentino,spring 2010 couture", "valentino,spring 2010 ready to wear", "valentino,spring 2011 couture", "valentino,spring 2011 ready to wear", "valentino,spring 2012 couture", "valentino,spring 2012 menswear", "valentino,spring 2012 ready to wear", "valentino,spring 2013 couture", "valentino,spring 2013 menswear", "valentino,spring 2013 ready to wear", "armani prive,spring 2012 couture", "valentino,spring 2014 couture", "valentino,spring 2014 menswear", "valentino,spring 2014 ready to wear", "valentino,spring 2015 couture", "valentino,spring 2015 menswear", "valentino,spring 2015 ready to wear", "valentino,spring 2016 couture", "valentino,spring 2016 menswear", "valentino,spring 2016 ready to wear", "valentino,spring 2017 couture", "armani prive,spring 2013 couture", "valentino,spring 2017 menswear", "valentino,spring 2017 ready to wear", "valentino,spring 2018 couture", "valentino,spring 2018 menswear", "valentino,spring 2018 ready to wear", "valentino,spring 2019 couture", "valentino,spring 2019 menswear", "valentino,spring 2019 ready to wear", "valentino,spring 2020 couture", "valentino,spring 2020 menswear", "armani prive,spring 2014 couture", "valentino,spring 2020 ready to wear", "valentino,spring 2021 couture", "valentino,spring 2021 menswear", "valentino,spring 2021 ready to wear", "valentino,spring 2022 couture", "valentino,spring 2022 ready to wear", "valentino,spring 2023 couture", "valentino,spring 2023 ready to wear", "valentino,spring 2024 menswear", "versace by fendi,pre fall 2022", "armani prive,spring 2015 couture", 
"versace,fall 1991 ready to wear", "versace,fall 1992 ready to wear", "versace,fall 1993 ready to wear", "versace,fall 1994 ready to wear", "versace,fall 1995 ready to wear", "versace,fall 1996 ready to wear", "versace,fall 1997 ready to wear", "versace,fall 2000 ready to wear", "versace,fall 2001 couture", "versace,fall 2001 ready to wear", "armani prive,spring 2016 couture", "versace,fall 2002 couture", "versace,fall 2002 ready to wear", "versace,fall 2003 couture", "versace,fall 2003 ready to wear", "versace,fall 2004 ready to wear", "versace,fall 2005 menswear", "versace,fall 2005 ready to wear", "versace,fall 2006 menswear", "versace,fall 2006 ready to wear", "versace,fall 2007 menswear", "armani prive,spring 2017 couture", "versace,fall 2007 ready to wear", "versace,fall 2008 menswear", "versace,fall 2008 ready to wear", "versace,fall 2009 ready to wear", "versace,fall 2010 menswear", "versace,fall 2010 ready to wear", "versace,fall 2011 menswear", "versace,fall 2011 ready to wear", "versace,fall 2012 menswear", "versace,fall 2012 ready to wear", "armani prive,spring 2018 couture", "versace,fall 2013 menswear", "versace,fall 2013 ready to wear", "versace,fall 2014 menswear", "versace,fall 2014 ready to wear", "versace,fall 2015 menswear", "versace,fall 2015 ready to wear", "versace,fall 2016 menswear", "versace,fall 2016 ready to wear", "versace,fall 2017 menswear", "versace,fall 2017 ready to wear", "armani prive,spring 2019 couture", "versace,fall 2018 menswear", "versace,fall 2018 ready to wear", "versace,fall 2019 menswear", "versace,fall 2019 ready to wear", "versace,fall 2020 menswear", "versace,fall 2020 ready to wear", "versace,fall 2021 ready to wear", "versace,fall 2022 menswear", "versace,fall 2022 ready to wear", "versace,fall 2023 ready to wear", "armani prive,spring 2020 couture", "versace,pre fall 2008", "versace,pre fall 2009", "versace,pre fall 2010", "versace,pre fall 2011", "versace,pre fall 2012", "versace,pre fall 2013", "versace,pre fall 2014", "versace,pre fall 2015", "versace,pre fall 2016", "versace,pre fall 2017", "alexander mcqueen,fall 2008 ready to wear", "armani prive,spring 2021 couture", "versace,pre fall 2018", "versace,pre fall 2019", "versace,pre fall 2020", "versace,pre fall 2021", "versace,pre fall 2022", "versace,pre fall 2022 menswear", "versace,pre fall 2023", "versace,resort 2008", "versace,resort 2009", "versace,resort 2010", "armani prive,spring 2023 couture", "versace,resort 2011", "versace,resort 2012", "versace,resort 2013", "versace,resort 2014", "versace,resort 2015", "versace,resort 2016", "versace,resort 2017", "versace,resort 2018", "versace,resort 2019", "versace,resort 2020", "balenciaga,fall 2000 ready to wear", "versace,resort 2021", "versace,resort 2022", "versace,resort 2023", "versace,spring 1991 ready to wear", "versace,spring 1992 ready to wear", "versace,spring 1993 ready to wear", "versace,spring 1994 ready to wear", "versace,spring 1995 ready to wear", "versace,spring 1996 ready to wear", "versace,spring 1997 ready to wear", "balenciaga,fall 2001 ready to wear", "versace,spring 2000 ready to wear", "versace,spring 2001 couture", "versace,spring 2001 ready to wear", "versace,spring 2002 couture", "versace,spring 2002 ready to wear", "versace,spring 2003 couture", "versace,spring 2003 ready to wear", "versace,spring 2004 couture", "versace,spring 2004 ready to wear", "versace,spring 2005 menswear", "balenciaga,fall 2002 ready to wear", "versace,spring 2005 ready to wear", "versace,spring 2006 menswear", "versace,spring 2006 
ready to wear", "versace,spring 2007 menswear", "versace,spring 2007 ready to wear", "versace,spring 2008 couture", "versace,spring 2008 menswear", "versace,spring 2008 ready to wear", "versace,spring 2009 menswear", "versace,spring 2009 ready to wear", "balenciaga,fall 2003 ready to wear", "versace,spring 2010 ready to wear", "versace,spring 2011 menswear", "versace,spring 2011 ready to wear", "versace,spring 2012 menswear", "versace,spring 2012 ready to wear", "versace,spring 2013 menswear", "versace,spring 2013 ready to wear", "versace,spring 2014 menswear", "versace,spring 2014 ready to wear", "versace,spring 2015 menswear", "balenciaga,fall 2004 ready to wear", "versace,spring 2015 ready to wear", "versace,spring 2016 menswear", "versace,spring 2016 ready to wear", "versace,spring 2017 menswear", "versace,spring 2017 ready to wear", "versace,spring 2018 menswear", "versace,spring 2018 ready to wear", "versace,spring 2019 menswear", "versace,spring 2019 ready to wear", "versace,spring 2020 menswear", "balenciaga,fall 2005 ready to wear", "versace,spring 2020 ready to wear", "versace,spring 2021 menswear", "versace,spring 2021 ready to wear", "versace,spring 2022 ready to wear", "versace,spring 2023 menswear", "versace,spring 2023 ready to wear", "versace,spring 2024 ready to wear", "balenciaga,fall 2006 ready to wear", "balenciaga,fall 2007 menswear", "alexander mcqueen,fall 2009 ready to wear", "balenciaga,fall 2007 ready to wear", "balenciaga,fall 2008 ready to wear", "balenciaga,fall 2009 ready to wear", "balenciaga,fall 2010 ready to wear", "balenciaga,fall 2011 menswear", "balenciaga,fall 2011 ready to wear", "balenciaga,fall 2012 menswear", "balenciaga,fall 2012 ready to wear", "balenciaga,fall 2013 menswear", "balenciaga,fall 2013 ready to wear", "alexander mcqueen,fall 2010 menswear", "balenciaga,fall 2014 menswear", "balenciaga,fall 2014 ready to wear", "balenciaga,fall 2015 menswear", "balenciaga,fall 2015 ready to wear", "balenciaga,fall 2016 ready to wear", "balenciaga,fall 2017 menswear", "balenciaga,fall 2017 ready to wear", "balenciaga,fall 2018 ready to wear", "balenciaga,fall 2019 menswear", "balenciaga,fall 2019 ready to wear", "alexander mcqueen,fall 2010 ready to wear", "balenciaga,fall 2020 menswear", "balenciaga,fall 2020 ready to wear", "balenciaga,fall 2021 couture", "balenciaga,fall 2021 menswear", "balenciaga,fall 2021 ready to wear", "balenciaga,fall 2022 couture", "balenciaga,fall 2022 ready to wear", "balenciaga,fall 2023 couture", "balenciaga,fall 2023 ready to wear", "balenciaga,pre fall 2008", "alexander mcqueen,fall 1998 ready to wear", "alexander mcqueen,fall 2011 menswear", "balenciaga,pre fall 2009", "balenciaga,pre fall 2010", "balenciaga,pre fall 2011", "balenciaga,pre fall 2012", "balenciaga,pre fall 2013", "balenciaga,pre fall 2014", "balenciaga,pre fall 2015", "balenciaga,pre fall 2016", "balenciaga,pre fall 2017", "balenciaga,pre fall 2018", "alexander mcqueen,fall 2011 ready to wear", "balenciaga,pre fall 2019", "balenciaga,pre fall 2020", "balenciaga,pre fall 2021", "balenciaga,pre fall 2022", "balenciaga,pre fall 2023", "balenciaga,pre fall 2024", "balenciaga,resort 2008", "balenciaga,resort 2009", "balenciaga,resort 2010", "balenciaga,resort 2011", "alexander mcqueen,fall 2012 menswear", "balenciaga,resort 2012", "balenciaga,resort 2013", "balenciaga,resort 2014", "balenciaga,resort 2015", "balenciaga,resort 2016", "balenciaga,resort 2017", "balenciaga,resort 2018", "balenciaga,resort 2019", "balenciaga,resort 2020", "balenciaga,resort 
2021", "alexander mcqueen,fall 2012 ready to wear", "balenciaga,resort 2022", "balenciaga,resort 2023", "balenciaga,resort 2024", "balenciaga,spring 1998 ready to wear", "balenciaga,spring 2000 ready to wear", "balenciaga,spring 2001 ready to wear", "balenciaga,spring 2002 ready to wear", "balenciaga,spring 2003 ready to wear", "balenciaga,spring 2004 ready to wear", "balenciaga,spring 2005 ready to wear", "alexander mcqueen,fall 2013 menswear", "balenciaga,spring 2006 ready to wear", "balenciaga,spring 2007 menswear", "balenciaga,spring 2007 ready to wear", "balenciaga,spring 2008 menswear", "balenciaga,spring 2008 ready to wear", "balenciaga,spring 2009 ready to wear", "balenciaga,spring 2010 ready to wear", "balenciaga,spring 2011 menswear", "balenciaga,spring 2011 ready to wear", "balenciaga,spring 2012 menswear", "alexander mcqueen,fall 2013 ready to wear", "balenciaga,spring 2012 ready to wear", "balenciaga,spring 2013 menswear", "balenciaga,spring 2013 ready to wear", "balenciaga,spring 2014 menswear", "balenciaga,spring 2014 ready to wear", "balenciaga,spring 2015 menswear", "balenciaga,spring 2015 ready to wear", "balenciaga,spring 2016 menswear", "balenciaga,spring 2016 ready to wear", "balenciaga,spring 2017 menswear", "alexander mcqueen,fall 2014 menswear", "balenciaga,spring 2017 ready to wear", "balenciaga,spring 2018 menswear", "balenciaga,spring 2018 ready to wear", "balenciaga,spring 2019 ready to wear", "balenciaga,spring 2020 menswear", "balenciaga,spring 2020 ready to wear", "balenciaga,spring 2021 menswear", "balenciaga,spring 2021 ready to wear", "balenciaga,spring 2022 ready to wear", "balenciaga,spring 2023 ready to wear", "alexander mcqueen,fall 2014 ready to wear", "balenciaga,spring 2024 ready to wear", "calvin klein collection,fall 1995 ready to wear", "calvin klein collection,fall 1996 ready to wear", "calvin klein collection,fall 1997 ready to wear", "calvin klein collection,fall 1998 ready to wear", "calvin klein collection,fall 1999 ready to wear", "calvin klein collection,fall 2000 ready to wear", "calvin klein collection,fall 2001 ready to wear", "calvin klein collection,fall 2002 ready to wear", "calvin klein collection,fall 2003 ready to wear", "alexander mcqueen,fall 2015 menswear", "calvin klein collection,fall 2004 ready to wear", "calvin klein collection,fall 2005 menswear", "calvin klein collection,fall 2005 ready to wear", "calvin klein collection,fall 2006 menswear", "calvin klein collection,fall 2006 ready to wear", "calvin klein collection,fall 2007 menswear", "calvin klein collection,fall 2007 ready to wear", "calvin klein collection,fall 2008 menswear", "calvin klein collection,fall 2008 ready to wear", "calvin klein collection,fall 2009 ready to wear", "alexander mcqueen,fall 2015 ready to wear", "calvin klein collection,fall 2010 menswear", "calvin klein collection,fall 2010 ready to wear", "calvin klein collection,fall 2011 menswear", "calvin klein collection,fall 2011 ready to wear", "calvin klein collection,fall 2012 menswear", "calvin klein collection,fall 2012 ready to wear", "calvin klein collection,fall 2013 menswear", "calvin klein collection,fall 2013 ready to wear", "calvin klein collection,fall 2014 menswear", "calvin klein collection,fall 2014 ready to wear", "alexander mcqueen,fall 1999 ready to wear", "alexander mcqueen,fall 2016 menswear", "calvin klein collection,fall 2015 menswear", "calvin klein collection,fall 2015 ready to wear", "calvin klein collection,fall 2016 menswear", "calvin klein collection,fall 2016 ready to 
wear", "calvin klein collection,pre fall 2008", "calvin klein collection,pre fall 2009", "calvin klein collection,pre fall 2010", "calvin klein collection,pre fall 2011", "calvin klein collection,pre fall 2012", "calvin klein collection,pre fall 2013", "alexander mcqueen,fall 2016 ready to wear", "calvin klein collection,pre fall 2014", "calvin klein collection,pre fall 2015", "calvin klein collection,pre fall 2016", "calvin klein collection,resort 2008", "calvin klein collection,resort 2009", "calvin klein collection,resort 2010", "calvin klein collection,resort 2011", "calvin klein collection,resort 2012", "calvin klein collection,resort 2013", "calvin klein collection,resort 2014", "alexander mcqueen,fall 2017 menswear", "calvin klein collection,resort 2015", "calvin klein collection,resort 2016", "calvin klein collection,resort 2017", "calvin klein collection,spring 1994 ready to wear", "calvin klein collection,spring 1995 ready to wear", "calvin klein collection,spring 1996 ready to wear", "calvin klein collection,spring 1997 ready to wear", "calvin klein collection,spring 1998 ready to wear", "calvin klein collection,spring 1999 ready to wear", "calvin klein collection,spring 2000 ready to wear", "alexander mcqueen,fall 2017 ready to wear", "calvin klein collection,spring 2001 ready to wear", "calvin klein collection,spring 2002 ready to wear", "calvin klein collection,spring 2003 ready to wear", "calvin klein collection,spring 2004 ready to wear", "calvin klein collection,spring 2005 menswear", "calvin klein collection,spring 2005 ready to wear", "calvin klein collection,spring 2006 menswear", "calvin klein collection,spring 2006 ready to wear", "calvin klein collection,spring 2007 menswear", "calvin klein collection,spring 2007 ready to wear", "alexander mcqueen,fall 2018 menswear", "calvin klein collection,spring 2008 menswear", "calvin klein collection,spring 2008 ready to wear", "calvin klein collection,spring 2009 menswear", "calvin klein collection,spring 2009 ready to wear", "calvin klein collection,spring 2010 menswear", "calvin klein collection,spring 2010 ready to wear", "calvin klein collection,spring 2011 menswear", "calvin klein collection,spring 2011 ready to wear", "calvin klein collection,spring 2012 menswear", "calvin klein collection,spring 2012 ready to wear", "alexander mcqueen,fall 2018 ready to wear", "calvin klein collection,spring 2013 menswear", "calvin klein collection,spring 2013 ready to wear", "calvin klein collection,spring 2014 menswear", "calvin klein collection,spring 2014 ready to wear", "calvin klein collection,spring 2015 menswear", "calvin klein collection,spring 2015 ready to wear", "calvin klein collection,spring 2016 menswear", "calvin klein collection,spring 2016 ready to wear", "calvin klein collection,spring 2017 menswear", "calvin klein,fall 2017 menswear", "alexander mcqueen,fall 2019 menswear", "calvin klein,fall 2017 ready to wear", "calvin klein,fall 2018 menswear", "calvin klein,fall 2018 ready to wear", "calvin klein,pre fall 2019", "calvin klein,resort 2019", "calvin klein,spring 2018 menswear", "calvin klein,spring 2018 ready to wear", "calvin klein,spring 2019 menswear", "calvin klein,spring 2019 ready to wear", "chanel,fall 1991 ready to wear", "alexander mcqueen,fall 2019 ready to wear", "chanel,fall 1994 ready to wear", "chanel,fall 1995 couture", "chanel,fall 1996 couture", "chanel,fall 1997 couture", "chanel,fall 1999 couture", "chanel,fall 2000 couture", "chanel,fall 2000 ready to wear", "chanel,fall 2002 couture", 
"chanel,fall 2003 ready to wear", "chanel,fall 2004 couture", "alexander mcqueen,fall 2020 menswear", "chanel,fall 2004 ready to wear", "chanel,fall 2005 couture", "chanel,fall 2005 ready to wear", "chanel,fall 2006 couture", "chanel,fall 2006 ready to wear", "chanel,fall 2007 couture", "chanel,fall 2007 ready to wear", "chanel,fall 2008 couture", "chanel,fall 2008 ready to wear", "chanel,fall 2009 couture", "alexander mcqueen,fall 2020 ready to wear", "chanel,fall 2009 ready to wear", "chanel,fall 2010 couture", "chanel,fall 2010 ready to wear", "chanel,fall 2011 couture", "chanel,fall 2011 ready to wear", "chanel,fall 2012 couture", "chanel,fall 2012 ready to wear", "chanel,fall 2013 couture", "chanel,fall 2013 ready to wear", "chanel,fall 2014 couture", "alexander mcqueen,fall 2000 ready to wear", "alexander mcqueen,fall 2021 menswear", "chanel,fall 2014 ready to wear", "chanel,fall 2015 couture", "chanel,fall 2015 ready to wear", "chanel,fall 2016 couture", "chanel,fall 2016 ready to wear", "chanel,fall 2017 couture", "chanel,fall 2017 ready to wear", "chanel,fall 2018 couture", "chanel,fall 2018 ready to wear", "chanel,fall 2019 couture", "alexander mcqueen,fall 2021 ready to wear", "chanel,fall 2019 ready to wear", "chanel,fall 2020 couture", "chanel,fall 2020 ready to wear", "chanel,fall 2021 couture", "chanel,fall 2021 ready to wear", "chanel,fall 2022 couture", "chanel,fall 2022 ready to wear", "chanel,fall 2023 couture", "chanel,fall 2023 ready to wear", "chanel,pre fall 2008", "alexander mcqueen,fall 2022 menswear", "chanel,pre fall 2009", "chanel,pre fall 2010", "chanel,pre fall 2011", "chanel,pre fall 2012", "chanel,pre fall 2013", "chanel,pre fall 2014", "chanel,pre fall 2015", "chanel,pre fall 2016", "chanel,pre fall 2017", "chanel,pre fall 2018", "alexander mcqueen,fall 2022 ready to wear", "chanel,pre fall 2019", "chanel,pre fall 2020", "chanel,pre fall 2021", "chanel,pre fall 2022", "chanel,pre fall 2023", "chanel,pre fall 2024", "chanel,resort 2007", "chanel,resort 2008", "chanel,resort 2009", "chanel,resort 2010", "alexander mcqueen,fall 2023 menswear", "chanel,resort 2011", "chanel,resort 2012", "chanel,resort 2013", "chanel,resort 2014", "chanel,resort 2015", "chanel,resort 2016", "chanel,resort 2017", "chanel,resort 2018", "chanel,resort 2019", "chanel,resort 2020", "alexander mcqueen,fall 2023 ready to wear", "chanel,resort 2021", "chanel,resort 2022", "chanel,resort 2023", "chanel,resort 2024", "chanel,spring 1992 ready to wear", "chanel,spring 1993 couture", "chanel,spring 1993 ready to wear", "chanel,spring 1994 ready to wear", "chanel,spring 1995 ready to wear", "chanel,spring 1996 ready to wear", "alexander mcqueen,pre fall 2009", "chanel,spring 1997 couture", "chanel,spring 1999 couture", "chanel,spring 2001 couture", "chanel,spring 2002 couture", "chanel,spring 2002 ready to wear", "chanel,spring 2003 couture", "chanel,spring 2004 couture", "chanel,spring 2004 ready to wear", "chanel,spring 2005 couture", "chanel,spring 2005 ready to wear", "alexander mcqueen,pre fall 2011", "chanel,spring 2006 couture", "chanel,spring 2006 ready to wear", "chanel,spring 2007 couture", "chanel,spring 2007 ready to wear", "chanel,spring 2008 couture", "chanel,spring 2008 ready to wear", "chanel,spring 2009 couture", "chanel,spring 2009 ready to wear", "chanel,spring 2010 couture", "chanel,spring 2010 ready to wear", "alexander mcqueen,pre fall 2012", "chanel,spring 2011 couture", "chanel,spring 2011 ready to wear", "chanel,spring 2012 couture", "chanel,spring 2012 ready to 
wear", "chanel,spring 2013 couture", "chanel,spring 2013 ready to wear", "chanel,spring 2014 couture", "chanel,spring 2014 ready to wear", "chanel,spring 2015 couture", "chanel,spring 2015 ready to wear", "alexander mcqueen,pre fall 2013", "chanel,spring 2016 couture", "chanel,spring 2016 ready to wear", "chanel,spring 2017 couture", "chanel,spring 2017 ready to wear", "chanel,spring 2018 couture", "chanel,spring 2018 ready to wear", "chanel,spring 2019 couture", "chanel,spring 2019 ready to wear", "chanel,spring 2020 couture", "chanel,spring 2020 ready to wear", "alexander mcqueen,fall 2001 ready to wear", "alexander mcqueen,pre fall 2014", "chanel,spring 2021 couture", "chanel,spring 2021 ready to wear", "chanel,spring 2022 couture", "chanel,spring 2022 ready to wear", "chanel,spring 2023 couture", "chanel,spring 2023 ready to wear", "chanel,spring 2024 ready to wear", "christian dior,fall 1999 couture", "christian dior,fall 2000 couture", "christian dior,fall 2000 ready to wear", "alexander mcqueen,pre fall 2015", "christian dior,fall 2001 couture", "christian dior,fall 2001 ready to wear", "christian dior,fall 2002 couture", "christian dior,fall 2002 ready to wear", "christian dior,fall 2003 couture", "christian dior,fall 2003 ready to wear", "christian dior,fall 2004 couture", "christian dior,fall 2004 ready to wear", "christian dior,fall 2005 couture", "christian dior,fall 2005 ready to wear", "alexander mcqueen,pre fall 2016", "christian dior,fall 2006 couture", "christian dior,fall 2006 ready to wear", "christian dior,fall 2007 couture", "christian dior,fall 2007 ready to wear", "christian dior,fall 2008 couture", "christian dior,fall 2008 ready to wear", "christian dior,fall 2009 couture", "christian dior,fall 2009 ready to wear", "christian dior,fall 2010 couture", "christian dior,fall 2010 menswear", "alexander mcqueen,pre fall 2017", "christian dior,fall 2010 ready to wear", "christian dior,fall 2011 couture", "christian dior,fall 2011 ready to wear", "christian dior,fall 2012 couture", "christian dior,fall 2012 ready to wear", "christian dior,fall 2013 couture", "christian dior,fall 2013 ready to wear", "christian dior,fall 2014 couture", "christian dior,fall 2014 ready to wear", "christian dior,fall 2015 couture", "alexander mcqueen,pre fall 2018", "christian dior,fall 2015 ready to wear", "christian dior,fall 2016 couture", "christian dior,fall 2016 ready to wear", "christian dior,fall 2017 couture", "christian dior,fall 2017 ready to wear", "christian dior,fall 2018 couture", "christian dior,fall 2018 ready to wear", "christian dior,fall 2019 couture", "christian dior,fall 2019 ready to wear", "christian dior,fall 2020 couture", "alexander mcqueen,pre fall 2019", "christian dior,fall 2021 couture", "christian dior,fall 2021 ready to wear", "christian dior,fall 2022 couture", "christian dior,fall 2022 ready to wear", "christian dior,fall 2023 couture", "christian dior,fall 2023 ready to wear", "christian dior,pre fall 2009", "christian dior,pre fall 2010", "christian dior,pre fall 2011", "christian dior,pre fall 2012", "alexander mcqueen,pre fall 2020", "christian dior,pre fall 2013", "christian dior,pre fall 2014", "christian dior,pre fall 2015", "christian dior,pre fall 2016", "christian dior,pre fall 2017", "christian dior,pre fall 2018", "christian dior,pre fall 2019", "christian dior,pre fall 2020", "christian dior,pre fall 2021", "christian dior,pre fall 2022", "alexander mcqueen,pre fall 2021", "christian dior,pre fall 2023", "christian dior,resort 2007", "christian 
dior,resort 2008", "christian dior,resort 2009", "christian dior,resort 2010", "christian dior,resort 2011", "christian dior,resort 2012", "christian dior,resort 2013", "christian dior,resort 2014", "christian dior,resort 2015", "alexander mcqueen,pre fall 2021 menswear", "christian dior,resort 2016", "christian dior,resort 2017", "christian dior,resort 2018", "christian dior,resort 2019", "christian dior,resort 2020", "christian dior,resort 2021", "christian dior,resort 2022", "christian dior,resort 2023", "christian dior,resort 2024", "christian dior,spring 1999 couture", "alexander mcqueen,pre fall 2022", "christian dior,spring 2000 ready to wear", "christian dior,spring 2001 couture", "christian dior,spring 2001 ready to wear", "christian dior,spring 2002 couture", "christian dior,spring 2002 ready to wear", "christian dior,spring 2003 couture", "christian dior,spring 2003 ready to wear", "christian dior,spring 2004 couture", "christian dior,spring 2004 ready to wear", "christian dior,spring 2005 couture", "alexander mcqueen,fall 2002 ready to wear", "alexander mcqueen,pre fall 2023", "christian dior,spring 2005 ready to wear", "christian dior,spring 2006 couture", "christian dior,spring 2006 ready to wear", "christian dior,spring 2007 couture", "christian dior,spring 2007 ready to wear", "christian dior,spring 2008 couture", "christian dior,spring 2008 ready to wear", "christian dior,spring 2009 couture", "christian dior,spring 2009 ready to wear", "christian dior,spring 2010 couture", "alexander mcqueen,resort 2009", "christian dior,spring 2010 menswear", "christian dior,spring 2010 ready to wear", "christian dior,spring 2011 couture", "christian dior,spring 2011 ready to wear", "christian dior,spring 2012 couture", "christian dior,spring 2012 ready to wear", "christian dior,spring 2013 couture", "christian dior,spring 2013 ready to wear", "christian dior,spring 2014 couture", "christian dior,spring 2014 ready to wear", "alexander mcqueen,resort 2010", "christian dior,spring 2015 couture", "christian dior,spring 2015 ready to wear", "christian dior,spring 2016 couture", "christian dior,spring 2016 ready to wear", "christian dior,spring 2017 couture", "christian dior,spring 2017 ready to wear", "christian dior,spring 2018 couture", "christian dior,spring 2018 ready to wear", "christian dior,spring 2019 couture", "christian dior,spring 2019 ready to wear", "alexander mcqueen,resort 2011", "christian dior,spring 2020 couture", "christian dior,spring 2020 ready to wear", "christian dior,spring 2021 couture", "christian dior,spring 2021 ready to wear", "christian dior,spring 2022 couture", "christian dior,spring 2022 ready to wear", "christian dior,spring 2023 couture", "christian dior,spring 2023 ready to wear", "christian dior,spring 2024 ready to wear", "fendi,fall 1999 ready to wear", "alexander mcqueen,resort 2012", "fendi,fall 2000 ready to wear", "fendi,fall 2001 ready to wear", "fendi,fall 2002 ready to wear", "fendi,fall 2003 ready to wear", "fendi,fall 2004 ready to wear", "fendi,fall 2005 ready to wear", "fendi,fall 2006 ready to wear", "fendi,fall 2007 menswear", "fendi,fall 2007 ready to wear", "fendi,fall 2008 menswear", "alexander mcqueen,resort 2013", "fendi,fall 2008 ready to wear", "fendi,fall 2009 ready to wear", "fendi,fall 2010 ready to wear", "fendi,fall 2011 ready to wear", "fendi,fall 2012 menswear", "fendi,fall 2012 ready to wear", "fendi,fall 2013 menswear", "fendi,fall 2013 ready to wear", "fendi,fall 2014 menswear", "fendi,fall 2014 ready to wear", "alexander 
mcqueen,resort 2014", "fendi,fall 2015 couture", "fendi,fall 2015 menswear", "fendi,fall 2015 ready to wear", "fendi,fall 2016 couture", "fendi,fall 2016 menswear", "fendi,fall 2016 ready to wear", "fendi,fall 2017 couture", "fendi,fall 2017 menswear", "fendi,fall 2017 ready to wear", "fendi,fall 2018 couture", "alexander mcqueen,resort 2015", "fendi,fall 2018 menswear", "fendi,fall 2018 ready to wear", "fendi,fall 2019 couture", "fendi,fall 2019 menswear", "fendi,fall 2019 ready to wear", "fendi,fall 2020 menswear", "fendi,fall 2020 ready to wear", "fendi,fall 2021 couture", "fendi,fall 2021 menswear", "fendi,fall 2021 ready to wear", "alexander mcqueen,resort 2016", "fendi,fall 2022 couture", "fendi,fall 2022 menswear", "fendi,fall 2022 ready to wear", "fendi,fall 2023 couture", "fendi,fall 2023 menswear", "fendi,fall 2023 ready to wear", "fendi,pre fall 2011", "fendi,pre fall 2012", "fendi,pre fall 2013", "fendi,pre fall 2014", "alexander mcqueen,resort 2017", "fendi,pre fall 2015", "fendi,pre fall 2016", "fendi,pre fall 2017", "fendi,pre fall 2018", "fendi,pre fall 2019", "fendi,pre fall 2020", "fendi,pre fall 2022", "fendi,resort 2008", "fendi,resort 2009", "fendi,resort 2012", "alexander mcqueen,fall 2003 ready to wear", "alexander mcqueen,resort 2018", "fendi,resort 2013", "fendi,resort 2014", "fendi,resort 2015", "fendi,resort 2016", "fendi,resort 2017", "fendi,resort 2018", "fendi,resort 2019", "fendi,resort 2020", "fendi,resort 2022", "fendi,resort 2023", "alexander mcqueen,resort 2019", "fendi,resort 2024", "fendi,spring 1999 ready to wear", "fendi,spring 2000 ready to wear", "fendi,spring 2001 ready to wear", "fendi,spring 2002 ready to wear", "fendi,spring 2003 ready to wear", "fendi,spring 2004 ready to wear", "fendi,spring 2005 ready to wear", "fendi,spring 2006 ready to wear", "fendi,spring 2007 ready to wear", "alexander mcqueen,resort 2020", "fendi,spring 2008 menswear", "fendi,spring 2008 ready to wear", "fendi,spring 2009 menswear", "fendi,spring 2009 ready to wear", "fendi,spring 2010 ready to wear", "fendi,spring 2011 ready to wear", "fendi,spring 2012 ready to wear", "fendi,spring 2013 menswear", "fendi,spring 2013 ready to wear", "fendi,spring 2014 menswear", "alexander mcqueen,resort 2021", "fendi,spring 2014 ready to wear", "fendi,spring 2015 menswear", "fendi,spring 2015 ready to wear", "fendi,spring 2016 menswear", "fendi,spring 2016 ready to wear", "fendi,spring 2017 menswear", "fendi,spring 2017 ready to wear", "fendi,spring 2018 menswear", "fendi,spring 2018 ready to wear", "fendi,spring 2019 menswear", "alexander mcqueen,resort 2022", "fendi,spring 2019 ready to wear", "fendi,spring 2020 menswear", "fendi,spring 2020 ready to wear", "fendi,spring 2021 couture", "fendi,spring 2021 menswear", "fendi,spring 2021 ready to wear", "fendi,spring 2022 couture", "fendi,spring 2022 menswear", "fendi,spring 2022 ready to wear", "fendi,spring 2023 couture", "alexander mcqueen,resort 2023", "fendi,spring 2023 menswear", "fendi,spring 2023 ready to wear", "fendi,spring 2024 menswear", "fendi,spring 2024 ready to wear", "gucci,fall 1995 ready to wear", "gucci,fall 1996 ready to wear", "gucci,fall 2000 ready to wear", "gucci,fall 2001 ready to wear", "gucci,fall 2002 ready to wear", "gucci,fall 2003 ready to wear", "alexander mcqueen,spring 1995 ready to wear", "gucci,fall 2004 ready to wear", "gucci,fall 2005 menswear", "gucci,fall 2005 ready to wear", "gucci,fall 2006 menswear", "gucci,fall 2006 ready to wear", "gucci,fall 2007 menswear", "gucci,fall 2007 ready to wear", 
"gucci,fall 2008 menswear", "gucci,fall 2008 ready to wear", "gucci,fall 2009 ready to wear", "alexander mcqueen,spring 1996 ready to wear", "gucci,fall 2010 menswear", "gucci,fall 2010 ready to wear", "gucci,fall 2011 menswear", "gucci,fall 2011 ready to wear", "gucci,fall 2012 menswear", "gucci,fall 2012 ready to wear", "gucci,fall 2013 menswear", "gucci,fall 2013 ready to wear", "gucci,fall 2014 menswear", "gucci,fall 2014 ready to wear", "alexander mcqueen,spring 1997 ready to wear", "gucci,fall 2015 menswear", "gucci,fall 2015 ready to wear", "gucci,fall 2016 menswear", "gucci,fall 2016 ready to wear", "gucci,fall 2017 menswear", "gucci,fall 2017 ready to wear", "gucci,fall 2018 menswear", "gucci,fall 2018 ready to wear", "gucci,fall 2019 menswear", "gucci,fall 2019 ready to wear", "alexander mcqueen,spring 1998 ready to wear", "gucci,fall 2020 menswear", "gucci,fall 2020 ready to wear", "gucci,fall 2022 ready to wear", "gucci,fall 2023 menswear", "gucci,fall 2023 ready to wear", "gucci,pre fall 2011", "gucci,pre fall 2012", "gucci,pre fall 2013", "gucci,pre fall 2014", "gucci,pre fall 2015", "alexander mcqueen,fall 2004 ready to wear", "alexander mcqueen,spring 1999 ready to wear", "gucci,pre fall 2016", "gucci,pre fall 2017", "gucci,pre fall 2018", "gucci,pre fall 2019", "gucci,pre fall 2020", "gucci,pre fall 2020 menswear", "gucci,pre fall 2021", "gucci,pre fall 2021 menswear", "gucci,pre fall 2022", "gucci,resort 2007", "alexander mcqueen,spring 2000 ready to wear", "gucci,resort 2008", "gucci,resort 2009", "gucci,resort 2010", "gucci,resort 2011", "gucci,resort 2012", "gucci,resort 2013", "gucci,resort 2014", "gucci,resort 2015", "gucci,resort 2016", "gucci,resort 2017", "alexander mcqueen,spring 2001 ready to wear", "gucci,resort 2018", "gucci,resort 2019", "gucci,resort 2020", "gucci,resort 2021", "gucci,resort 2023", "gucci,resort 2024", "gucci,spring 1999 ready to wear", "gucci,spring 2000 ready to wear", "gucci,spring 2001 ready to wear", "gucci,spring 2002 ready to wear", "alexander mcqueen,spring 2002 ready to wear", "gucci,spring 2003 ready to wear", "gucci,spring 2004 ready to wear", "gucci,spring 2005 menswear", "gucci,spring 2005 ready to wear", "gucci,spring 2006 menswear", "gucci,spring 2006 ready to wear", "gucci,spring 2007 menswear", "gucci,spring 2007 ready to wear", "gucci,spring 2008 menswear", "gucci,spring 2008 ready to wear", "alexander mcqueen,spring 2003 ready to wear", "gucci,spring 2009 menswear", "gucci,spring 2009 ready to wear", "gucci,spring 2010 menswear", "gucci,spring 2010 ready to wear", "gucci,spring 2011 menswear", "gucci,spring 2011 ready to wear", "gucci,spring 2012 menswear", "gucci,spring 2012 ready to wear", "gucci,spring 2013 menswear", "gucci,spring 2013 ready to wear", "alexander mcqueen,spring 2004 ready to wear", "gucci,spring 2014 menswear", "gucci,spring 2014 ready to wear", "gucci,spring 2015 menswear", "gucci,spring 2015 ready to wear", "gucci,spring 2016 menswear", "gucci,spring 2016 ready to wear", "gucci,spring 2017 menswear", "gucci,spring 2017 ready to wear", "gucci,spring 2018 menswear", "gucci,spring 2018 ready to wear", "alexander mcqueen,spring 2005 menswear", "gucci,spring 2019 ready to wear", "gucci,spring 2020 menswear", "gucci,spring 2020 ready to wear", "gucci,spring 2021 menswear", "gucci,spring 2021 ready to wear", "gucci,spring 2022 ready to wear", "gucci,spring 2023 ready to wear", "gucci,spring 2024 menswear", "gucci,spring 2024 ready to wear", "hermes,fall 1999 ready to wear", "alexander mcqueen,spring 2005 
ready to wear", "hermes,fall 2000 ready to wear", "hermes,fall 2001 ready to wear", "hermes,fall 2004 ready to wear", "hermes,fall 2005 menswear", "hermes,fall 2005 ready to wear", "hermes,fall 2006 menswear", "hermes,fall 2006 ready to wear", "hermes,fall 2007 menswear", "hermes,fall 2007 ready to wear", "hermes,fall 2008 menswear", "alexander mcqueen,spring 2006 menswear", "hermes,fall 2008 ready to wear", "hermes,fall 2009 ready to wear", "hermes,fall 2010 menswear", "hermes,fall 2010 ready to wear", "hermes,fall 2011 menswear", "hermes,fall 2011 ready to wear", "hermes,fall 2012 menswear", "hermes,fall 2012 ready to wear", "hermes,fall 2013 menswear", "hermes,fall 2013 ready to wear", "alexander mcqueen,spring 2006 ready to wear", "hermes,fall 2014 menswear", "hermes,fall 2014 ready to wear", "hermes,fall 2015 menswear", "hermes,fall 2015 ready to wear", "hermes,fall 2016 menswear", "hermes,fall 2016 ready to wear", "hermes,fall 2017 menswear", "hermes,fall 2017 ready to wear", "hermes,fall 2018 menswear", "hermes,fall 2018 ready to wear", "alexander mcqueen,fall 2005 menswear", "alexander mcqueen,spring 2007 menswear", "hermes,fall 2019 menswear", "hermes,fall 2019 ready to wear", "hermes,fall 2020 menswear", "hermes,fall 2020 ready to wear", "hermes,fall 2021 menswear", "hermes,fall 2021 ready to wear", "hermes,fall 2022 menswear", "hermes,fall 2022 ready to wear", "hermes,fall 2023 menswear", "hermes,fall 2023 ready to wear", "alexander mcqueen,spring 2007 ready to wear", "hermes,pre fall 2017", "hermes,pre fall 2018", "hermes,pre fall 2019", "hermes,resort 2017", "hermes,resort 2018", "hermes,resort 2019", "hermes,spring 1999 ready to wear", "hermes,spring 2000 ready to wear", "hermes,spring 2001 ready to wear", "hermes,spring 2002 ready to wear", "alexander mcqueen,spring 2008 menswear", "hermes,spring 2006 menswear", "hermes,spring 2006 ready to wear", "hermes,spring 2007 menswear", "hermes,spring 2007 ready to wear", "hermes,spring 2008 menswear", "hermes,spring 2008 ready to wear", "hermes,spring 2009 menswear", "hermes,spring 2010 menswear", "hermes,spring 2010 ready to wear", "hermes,spring 2011 menswear", "alexander mcqueen,spring 2008 ready to wear", "hermes,spring 2011 ready to wear", "hermes,spring 2012 menswear", "hermes,spring 2012 ready to wear", "hermes,spring 2013 menswear", "hermes,spring 2013 ready to wear", "hermes,spring 2014 menswear", "hermes,spring 2014 ready to wear", "hermes,spring 2015 menswear", "hermes,spring 2015 ready to wear", "hermes,spring 2016 menswear", "alexander mcqueen,spring 2009 menswear", "hermes,spring 2016 ready to wear", "hermes,spring 2017 menswear", "hermes,spring 2017 ready to wear", "hermes,spring 2018 menswear", "hermes,spring 2018 ready to wear", "hermes,spring 2019 menswear", "hermes,spring 2019 ready to wear", "hermes,spring 2020 menswear", "hermes,spring 2020 ready to wear", "hermes,spring 2021 menswear", "alexander mcqueen,spring 2009 ready to wear", "hermes,spring 2021 ready to wear", "hermes,spring 2022 menswear", "hermes,spring 2022 ready to wear", "hermes,spring 2023 menswear", "hermes,spring 2023 ready to wear", "hermes,spring 2024 menswear", "hermes,spring 2024 ready to wear", "louis vuitton,fall 1998 ready to wear", "louis vuitton,fall 2000 ready to wear", "louis vuitton,fall 2001 ready to wear", "alexander mcqueen,spring 2010 menswear", "louis vuitton,fall 2002 ready to wear", "louis vuitton,fall 2003 ready to wear", "louis vuitton,fall 2004 ready to wear", "louis vuitton,fall 2005 menswear", "louis vuitton,fall 2005 
ready to wear", "louis vuitton,fall 2006 menswear", "louis vuitton,fall 2006 ready to wear", "louis vuitton,fall 2007 menswear", "louis vuitton,fall 2008 menswear", "louis vuitton,fall 2008 ready to wear", "alexander mcqueen,spring 2010 ready to wear", "louis vuitton,fall 2009 ready to wear", "louis vuitton,fall 2010 menswear", "louis vuitton,fall 2010 ready to wear", "louis vuitton,fall 2011 menswear", "louis vuitton,fall 2011 ready to wear", "louis vuitton,fall 2012 menswear", "louis vuitton,fall 2012 ready to wear", "louis vuitton,fall 2013 menswear", "louis vuitton,fall 2013 ready to wear", "louis vuitton,fall 2014 menswear", "alexander mcqueen,spring 2011 menswear", "louis vuitton,fall 2014 ready to wear", "louis vuitton,fall 2015 menswear", "louis vuitton,fall 2015 ready to wear", "louis vuitton,fall 2016 menswear", "louis vuitton,fall 2016 ready to wear", "louis vuitton,fall 2017 menswear", "louis vuitton,fall 2017 ready to wear", "louis vuitton,fall 2018 menswear", "louis vuitton,fall 2018 ready to wear", "louis vuitton,fall 2019 menswear", "alexander mcqueen,spring 2011 ready to wear", "louis vuitton,fall 2019 ready to wear", "louis vuitton,fall 2020 menswear", "louis vuitton,fall 2020 ready to wear", "louis vuitton,fall 2021 menswear", "louis vuitton,fall 2021 ready to wear", "louis vuitton,fall 2022 menswear", "louis vuitton,fall 2022 ready to wear", "louis vuitton,fall 2023 menswear", "louis vuitton,fall 2023 ready to wear", "louis vuitton,pre fall 2008" ]
Wvolf/ViT_Deepfake_Detection
This model was trained by Rudolf Enyimba in partial fulfillment of the requirements of Solent University for the degree of MSc Artificial Intelligence and Data Science.

This model was trained to detect deepfake images.

The model achieved an accuracy of **98.70%** on the test set.

Upload a face image or pick from the samples below to test model accuracy.
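For readers who want to try the checkpoint programmatically rather than through the hosted demo, a minimal sketch using the `transformers` image-classification pipeline is shown below; the file path `face.jpg` is a hypothetical placeholder, not part of the original card.

```python
from transformers import pipeline

# Load the fine-tuned ViT deepfake detector from the Hub.
detector = pipeline("image-classification", model="Wvolf/ViT_Deepfake_Detection")

# Classify a local face image; "face.jpg" is a placeholder path.
results = detector("face.jpg")
for result in results:
    print(f"{result['label']}: {result['score']:.4f}")  # scores for "real" vs. "fake"
```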
[ "real", "fake" ]
3una/finetuned-CK
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# finetuned-CK

This model is a fine-tuned version of [microsoft/beit-base-patch16-224-pt22k-ft22k](https://huggingface.co/microsoft/beit-base-patch16-224-pt22k-ft22k) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1159
- Accuracy: 0.9873

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 0.87 | 5 | 2.0523 | 0.1139 |
| 1.9841 | 1.91 | 11 | 1.9751 | 0.0633 |
| 1.9841 | 2.96 | 17 | 1.8640 | 0.1392 |
| 1.8698 | 4.0 | 23 | 1.7348 | 0.3291 |
| 1.8698 | 4.87 | 28 | 1.6262 | 0.4430 |
| 1.6947 | 5.91 | 34 | 1.4895 | 0.5823 |
| 1.6038 | 6.96 | 40 | 1.3500 | 0.5949 |
| 1.6038 | 8.0 | 46 | 1.1857 | 0.6835 |
| 1.4019 | 8.87 | 51 | 1.0636 | 0.6962 |
| 1.4019 | 9.91 | 57 | 0.9472 | 0.7468 |
| 1.2101 | 10.96 | 63 | 0.8263 | 0.7722 |
| 1.2101 | 12.0 | 69 | 0.7643 | 0.7595 |
| 1.077 | 12.87 | 74 | 0.7260 | 0.7722 |
| 0.9735 | 13.91 | 80 | 0.6628 | 0.8228 |
| 0.9735 | 14.96 | 86 | 0.6245 | 0.8101 |
| 0.8696 | 16.0 | 92 | 0.5579 | 0.8101 |
| 0.8696 | 16.87 | 97 | 0.5309 | 0.8228 |
| 0.858 | 17.91 | 103 | 0.5099 | 0.8354 |
| 0.858 | 18.96 | 109 | 0.4997 | 0.8354 |
| 0.7716 | 20.0 | 115 | 0.4679 | 0.8608 |
| 0.699 | 20.87 | 120 | 0.4455 | 0.8481 |
| 0.699 | 21.91 | 126 | 0.4409 | 0.8608 |
| 0.6768 | 22.96 | 132 | 0.4186 | 0.8481 |
| 0.6768 | 24.0 | 138 | 0.3826 | 0.8734 |
| 0.6303 | 24.87 | 143 | 0.3708 | 0.8861 |
| 0.6303 | 25.91 | 149 | 0.3545 | 0.8608 |
| 0.5786 | 26.96 | 155 | 0.3480 | 0.8987 |
| 0.5188 | 28.0 | 161 | 0.3241 | 0.9241 |
| 0.5188 | 28.87 | 166 | 0.3257 | 0.9114 |
| 0.5349 | 29.91 | 172 | 0.2963 | 0.9241 |
| 0.5349 | 30.96 | 178 | 0.2836 | 0.9367 |
| 0.5208 | 32.0 | 184 | 0.2822 | 0.9620 |
| 0.5208 | 32.87 | 189 | 0.2933 | 0.9494 |
| 0.4458 | 33.91 | 195 | 0.2742 | 0.9620 |
| 0.4716 | 34.96 | 201 | 0.2580 | 0.9620 |
| 0.4716 | 36.0 | 207 | 0.2432 | 0.9620 |
| 0.427 | 36.87 | 212 | 0.2333 | 0.9747 |
| 0.427 | 37.91 | 218 | 0.2115 | 0.9494 |
| 0.4052 | 38.96 | 224 | 0.2044 | 0.9873 |
| 0.3899 | 40.0 | 230 | 0.2081 | 0.9747 |
| 0.3899 | 40.87 | 235 | 0.2110 | 0.9620 |
| 0.4098 | 41.91 | 241 | 0.2099 | 0.9494 |
| 0.4098 | 42.96 | 247 | 0.1899 | 0.9747 |
| 0.3691 | 44.0 | 253 | 0.1783 | 0.9873 |
| 0.3691 | 44.87 | 258 | 0.1768 | 0.9747 |
| 0.3878 | 45.91 | 264 | 0.1836 | 0.9873 |
| 0.3582 | 46.96 | 270 | 0.1876 | 0.9747 |
| 0.3582 | 48.0 | 276 | 0.1761 | 0.9747 |
| 0.356 | 48.87 | 281 | 0.1745 | 0.9620 |
| 0.356 | 49.91 | 287 | 0.1761 | 0.9620 |
| 0.3715 | 50.96 | 293 | 0.1716 | 0.9873 |
| 0.3715 | 52.0 | 299 | 0.1694 | 0.9620 |
| 0.3195 | 52.87 | 304 | 0.1660 | 0.9873 |
| 0.3452 | 53.91 | 310 | 0.1584 | 0.9873 |
| 0.3452 | 54.96 | 316 | 0.1579 | 0.9620 |
| 0.3407 | 56.0 | 322 | 0.1431 | 0.9873 |
| 0.3407 | 56.87 | 327 | 0.1423 | 0.9873 |
| 0.3092 | 57.91 | 333 | 0.1434 | 0.9747 |
| 0.3092 | 58.96 | 339 | 0.1391 | 0.9873 |
| 0.3346 | 60.0 | 345 | 0.1362 | 0.9873 |
| 0.3107 | 60.87 | 350 | 0.1341 | 0.9873 |
| 0.3107 | 61.91 | 356 | 0.1368 | 0.9873 |
| 0.2884 | 62.96 | 362 | 0.1393 | 0.9747 |
| 0.2884 | 64.0 | 368 | 0.1347 | 0.9873 |
| 0.3048 | 64.87 | 373 | 0.1334 | 0.9873 |
| 0.3048 | 65.91 | 379 | 0.1342 | 0.9747 |
| 0.3452 | 66.96 | 385 | 0.1310 | 0.9747 |
| 0.2835 | 68.0 | 391 | 0.1308 | 0.9873 |
| 0.2835 | 68.87 | 396 | 0.1304 | 0.9747 |
| 0.3865 | 69.91 | 402 | 0.1264 | 0.9873 |
| 0.3865 | 70.96 | 408 | 0.1247 | 0.9873 |
| 0.2729 | 72.0 | 414 | 0.1239 | 0.9873 |
| 0.2729 | 72.87 | 419 | 0.1249 | 0.9747 |
| 0.2643 | 73.91 | 425 | 0.1189 | 0.9873 |
| 0.3176 | 74.96 | 431 | 0.1171 | 0.9873 |
| 0.3176 | 76.0 | 437 | 0.1174 | 0.9873 |
| 0.3184 | 76.87 | 442 | 0.1184 | 0.9873 |
| 0.3184 | 77.91 | 448 | 0.1160 | 0.9873 |
| 0.2817 | 78.96 | 454 | 0.1147 | 0.9873 |
| 0.2666 | 80.0 | 460 | 0.1133 | 0.9873 |
| 0.2666 | 80.87 | 465 | 0.1139 | 0.9873 |
| 0.2589 | 81.91 | 471 | 0.1142 | 0.9873 |
| 0.2589 | 82.96 | 477 | 0.1150 | 0.9873 |
| 0.275 | 84.0 | 483 | 0.1156 | 0.9873 |
| 0.275 | 84.87 | 488 | 0.1160 | 0.9873 |
| 0.2644 | 85.91 | 494 | 0.1160 | 0.9873 |
| 0.3187 | 86.96 | 500 | 0.1159 | 0.9873 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
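The hyperparameter list above maps naturally onto `transformers.TrainingArguments`. The sketch below is a reconstruction under that assumption, not the author's actual training script: the `output_dir` name is a placeholder, and the listed Adam betas/epsilon are the library defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters reported above: per-device batch size 32 with
# gradient accumulation 4 gives the reported total train batch size of 128.
training_args = TrainingArguments(
    output_dir="finetuned-CK",     # placeholder
    learning_rate=5e-6,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```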
[ "anger", "contempt", "disgust", "fear", "happy", "sadness", "surprise" ]
yusx-swapp/ofm-vit-base-patch16-224-cifar100
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# ofm-vit-base-patch16-224-cifar100

This model was trained from scratch on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 512
- eval_batch_size: 512
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4

### Framework versions

- Transformers 4.36.2
- Pytorch 2.0.0+cu118
- Datasets 2.16.1
- Tokenizers 0.15.0
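For completeness, a minimal inference sketch against this checkpoint is shown below. It assumes the repository ships an image-processor config alongside the weights (the card does not confirm this), and `example.png` is a placeholder path.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "yusx-swapp/ofm-vit-base-patch16-224-cifar100"
processor = AutoImageProcessor.from_pretrained(repo)  # assumes a preprocessor config exists
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("example.png")  # placeholder path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])  # one of the 100 CIFAR-100 labels
```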
[ "apple", "aquarium_fish", "baby", "bear", "beaver", "bed", "bee", "beetle", "bicycle", "bottle", "bowl", "boy", "bridge", "bus", "butterfly", "camel", "can", "castle", "caterpillar", "cattle", "chair", "chimpanzee", "clock", "cloud", "cockroach", "couch", "cra", "crocodile", "cup", "dinosaur", "dolphin", "elephant", "flatfish", "forest", "fox", "girl", "hamster", "house", "kangaroo", "keyboard", "lamp", "lawn_mower", "leopard", "lion", "lizard", "lobster", "man", "maple_tree", "motorcycle", "mountain", "mouse", "mushroom", "oak_tree", "orange", "orchid", "otter", "palm_tree", "pear", "pickup_truck", "pine_tree", "plain", "plate", "poppy", "porcupine", "possum", "rabbit", "raccoon", "ray", "road", "rocket", "rose", "sea", "seal", "shark", "shrew", "skunk", "skyscraper", "snail", "snake", "spider", "squirrel", "streetcar", "sunflower", "sweet_pepper", "table", "tank", "telephone", "television", "tiger", "tractor", "train", "trout", "tulip", "turtle", "wardrobe", "whale", "willow_tree", "wolf", "woman", "worm" ]
xxhwjzx/yanzheng
# Model Trained Using AutoTrain

- Problem type: Image Classification

## Validation Metrics

- loss: nan
- f1_macro: 0.19999999999999998
- f1_micro: 0.42857142857142855
- f1_weighted: 0.2571428571428571
- precision_macro: 0.14285714285714285
- precision_micro: 0.42857142857142855
- precision_weighted: 0.18367346938775508
- recall_macro: 0.3333333333333333
- recall_micro: 0.42857142857142855
- recall_weighted: 0.42857142857142855
- accuracy: 0.42857142857142855
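A NaN training loss together with recall_macro = 1/3 usually indicates the classifier diverged and collapsed to predicting a single class. As a sanity check (using a hypothetical 7-sample split, not the actual validation data), the following sklearn sketch reproduces every figure above from exactly that failure mode:

```python
# Hypothetical reconstruction: a constant predictor on a 7-sample, 3-class eval set
# yields precisely the metrics reported in this card.
from sklearn.metrics import accuracy_score, f1_score, precision_score

y_true = ["ok", "ok", "ok", "bj", "bj", "ng", "ng"]  # assumed class balance, not the real data
y_pred = ["ok"] * 7                                  # collapsed model: always the same class

print(accuracy_score(y_true, y_pred))                                        # 0.42857142857142855
print(f1_score(y_true, y_pred, average="macro", zero_division=0))            # 0.2
print(f1_score(y_true, y_pred, average="weighted", zero_division=0))         # 0.2571428571428571
print(precision_score(y_true, y_pred, average="macro", zero_division=0))     # 0.14285714285714285
print(precision_score(y_true, y_pred, average="weighted", zero_division=0))  # 0.18367346938775508
```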
[ "bj", "ng", "ok" ]
xxhwjzx/autotrain-9f2g9-xg0d2
# Model Trained Using AutoTrain

- Problem type: Image Classification

## Validation Metrics

- loss: 1.0950939655303955
- f1_macro: 0.14814814814814817
- f1_micro: 0.2857142857142857
- f1_weighted: 0.126984126984127
- precision_macro: 0.09523809523809523
- precision_micro: 0.2857142857142857
- precision_weighted: 0.08163265306122448
- recall_macro: 0.3333333333333333
- recall_micro: 0.2857142857142857
- recall_weighted: 0.2857142857142857
- accuracy: 0.2857142857142857
[ "bj", "ng", "ok" ]
adhisetiawan/mnist-test
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. -->

# adhisetiawan/mnist-test

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.7312
- Validation Loss: 0.9257
- Train Accuracy: 0.8
- Epoch: 19

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 1600, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32

### Training results

| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 2.2668 | 2.2081 | 0.4 | 0 |
| 2.1502 | 2.1140 | 0.6 | 1 |
| 2.0506 | 2.0350 | 0.65 | 2 |
| 1.9473 | 1.9239 | 0.7 | 3 |
| 1.8164 | 1.8355 | 0.7 | 4 |
| 1.7091 | 1.7534 | 0.75 | 5 |
| 1.6152 | 1.6683 | 0.8 | 6 |
| 1.5122 | 1.5825 | 0.8 | 7 |
| 1.4108 | 1.4897 | 0.8 | 8 |
| 1.3225 | 1.4149 | 0.8 | 9 |
| 1.2426 | 1.3135 | 0.8 | 10 |
| 1.1740 | 1.2704 | 0.8 | 11 |
| 1.0894 | 1.2213 | 0.85 | 12 |
| 1.0230 | 1.1424 | 0.8 | 13 |
| 0.9646 | 1.1171 | 0.85 | 14 |
| 0.9109 | 1.0744 | 0.8 | 15 |
| 0.8547 | 1.0376 | 0.85 | 16 |
| 0.8082 | 0.9892 | 0.8 | 17 |
| 0.7632 | 0.9604 | 0.85 | 18 |
| 0.7312 | 0.9257 | 0.8 | 19 |

### Framework versions

- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.1
- Tokenizers 0.15.0
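The optimizer dictionary above is the serialized form of what 🤗 Transformers' `create_optimizer` helper builds for TensorFlow; a sketch that rebuilds the same AdamWeightDecay/PolynomialDecay pair (zero warmup is an assumption, since the listed schedule shows no warmup component):

```python
# Rebuilding the listed optimizer; num_warmup_steps=0 is an assumption.
from transformers import create_optimizer

optimizer, lr_schedule = create_optimizer(
    init_lr=3e-05,           # initial_learning_rate
    num_train_steps=1600,    # decay_steps of the PolynomialDecay schedule
    num_warmup_steps=0,
    weight_decay_rate=0.01,  # end_learning_rate=0.0 and power=1.0 are the defaults
)
```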
[ "0", "1", "2", "3", "4", "5", "6", "7", "8", "9" ]
PeteA2Z/my_awesome_food_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# my_awesome_food_model

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4415
- Accuracy: 0.7968

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.5437 | 0.99 | 70 | 0.5017 | 0.7587 |
| 0.4947 | 2.0 | 141 | 0.4697 | 0.7657 |
| 0.4533 | 2.98 | 210 | 0.4415 | 0.7968 |

### Framework versions

- Transformers 4.36.2
- Pytorch 2.1.0
- Datasets 2.14.6
- Tokenizers 0.15.0
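Note how the effective batch size arises in cards like this one: 16 samples per device step × 4 gradient-accumulation steps = 64. A `TrainingArguments` sketch mirroring this card's hyperparameters (`output_dir` is illustrative):

```python
# Sketch of the configuration above; effective batch = 16 * 4 = 64.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="my_awesome_food_model",  # illustrative
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,       # each optimizer step sees 16 * 4 = 64 samples
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=3,
    seed=42,
)
```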
[ "ja_kelp", "nee_kelp" ]
alirzb/S1_M1_R1_ViT_42616100
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S1_M1_R1_ViT_42616100

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0078
- Accuracy: 0.9971

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0113 | 1.0 | 304 | 0.0067 | 0.9980 |
| 0.0065 | 2.0 | 608 | 0.0040 | 0.9980 |
| 0.0031 | 3.0 | 912 | 0.0152 | 0.9961 |
| 0.0 | 4.0 | 1217 | 0.0091 | 0.9971 |
| 0.0 | 5.0 | 1520 | 0.0078 | 0.9971 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
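Assuming the checkpoint is published on the Hub under this row's repo id, inference for this binary seizures/none_seizures classifier is a one-liner with the `pipeline` API (the input filename is a placeholder):

```python
# Inference sketch; assumes the checkpoint is available under this repo id.
from transformers import pipeline

clf = pipeline("image-classification", model="alirzb/S1_M1_R1_ViT_42616100")
# With only two labels, both scores come back,
# e.g. [{"label": "none_seizures", "score": ...}, {"label": "seizures", "score": ...}]
print(clf("example.png"))  # "example.png" is a hypothetical input image
```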
[ "none_seizures", "seizures" ]
alirzb/S1_M1_R1_BEiT_42616788
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S1_M1_R1_BEiT_42616788

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0033
- Accuracy: 0.9990

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0501 | 1.0 | 304 | 0.0197 | 0.9941 |
| 0.0008 | 2.0 | 608 | 0.0044 | 0.9990 |
| 0.0116 | 3.0 | 912 | 0.0039 | 0.9980 |
| 0.0002 | 4.0 | 1217 | 0.0000 | 1.0 |
| 0.0 | 5.0 | 1520 | 0.0033 | 0.9990 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
ashutoshsharma58/indian_food_image_detection
### Model Description

This model classifies the uploaded image into various Indian cuisines. The categories are:

adhirasam, aloo_gobi, aloo_matar, aloo_methi, aloo_shimla_mirch, aloo_tikki, anarsa, ariselu, bandar_laddu, basundi, bhatura, bhindi_masala, biryani, boondi, butter_chicken, chak_hao_kheer, cham_cham, chana_masala, chapati, chhena_kheeri, chicken_razala, chicken_tikka, chicken_tikka_masala, chikki, daal_baati_churma, daal_puri, dal_makhani, dal_tadka, dharwad_pedha, doodhpak, double_ka_meetha, dum_aloo, gajar_ka_halwa, gavvalu, ghevar, gulab_jamun, imarti, jalebi, kachori, kadai_paneer, kadhi_pakoda, kajjikaya, kakinada_khaja, kalakand, karela_bharta, kofta, kuzhi_paniyaram, lassi, ledikeni, litti_chokha, lyangcha, maach_jhol, makki_di_roti_sarson_da_saag, malapua, misi_roti, misti_doi, modak, mysore_pak, naan, navrattan_korma, palak_paneer, paneer_butter_masala, phirni, pithe, poha, poornalu, pootharekulu, qubani_ka_meetha, rabri, ras_malai, rasgulla, sandesh, shankarpali, sheer_korma, sheera, shrikhand, sohan_halwa, sohan_papdi, sutar_feni, unni_appam

- **Developed by:** Ashutosh Sharma
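A hedged usage sketch for this 80-class classifier (it assumes the checkpoint is published under this row's repo id; `dish.jpg` is a placeholder input):

```python
# Top-5 predictions over the 80 cuisine classes; repo id availability and filename are assumptions.
from transformers import pipeline

clf = pipeline("image-classification", model="ashutoshsharma58/indian_food_image_detection")
for pred in clf("dish.jpg", top_k=5):
    print(f"{pred['label']}: {pred['score']:.3f}")
```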
[ "chak_hao_kheer", "chana_masala", "kalakand", "daal_baati_churma", "dal_makhani", "dharwad_pedha", "kajjikaya", "misi_roti", "qubani_ka_meetha", "imarti", "sutar_feni", "dal_tadka", "aloo_matar", "ledikeni", "mysore_pak", "kachori", "maach_jhol", "aloo_methi", "phirni", "kakinada_khaja", "adhirasam", "chhena_kheeri", "chicken_tikka_masala", "kadhi_pakoda", "daal_puri", "pithe", "gavvalu", "lassi", "malapua", "bhatura", "chicken_razala", "basundi", "chapati", "kuzhi_paniyaram", "unni_appam", "ras_malai", "misti_doi", "modak", "rasgulla", "bandar_laddu", "poornalu", "aloo_gobi", "aloo_tikki", "gajar_ka_halwa", "chikki", "lyangcha", "bhindi_masala", "boondi", "aloo_shimla_mirch", "chicken_tikka", "paneer_butter_masala", "kadai_paneer", "gulab_jamun", "rabri", "sandesh", "jalebi", "sohan_halwa", "shankarpali", "cham_cham", "karela_bharta", "dum_aloo", "naan", "doodhpak", "pootharekulu", "kofta", "ariselu", "anarsa", "shrikhand", "sohan_papdi", "poha", "sheer_korma", "litti_chokha", "butter_chicken", "navrattan_korma", "makki_di_roti_sarson_da_saag", "palak_paneer", "sheera", "biryani", "ghevar", "double_ka_meetha" ]
alirzb/S1_M1_R3_ViT_42618486
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S1_M1_R3_ViT_42618486

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0015
- Accuracy: 0.9992

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0039 | 1.0 | 379 | 0.0024 | 0.9992 |
| 0.0041 | 2.0 | 759 | 0.0049 | 0.9984 |
| 0.0001 | 3.0 | 1139 | 0.0029 | 0.9992 |
| 0.0 | 4.0 | 1519 | 0.0014 | 0.9992 |
| 0.0 | 4.99 | 1895 | 0.0015 | 0.9992 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
alirzb/S1_M1_R2_ViT_42618476
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S1_M1_R2_ViT_42618476

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0006
- Accuracy: 1.0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0111 | 1.0 | 309 | 0.0033 | 0.9981 |
| 0.0057 | 2.0 | 619 | 0.0007 | 1.0 |
| 0.0001 | 3.0 | 929 | 0.0005 | 1.0 |
| 0.0 | 4.0 | 1239 | 0.0005 | 1.0 |
| 0.0 | 4.99 | 1545 | 0.0006 | 1.0 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
alirzb/S2_M1_R1_ViT_42618522
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S2_M1_R1_ViT_42618522

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0114
- Accuracy: 0.9987

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0551 | 1.0 | 231 | 0.0058 | 0.9987 |
| 0.0032 | 2.0 | 463 | 0.0181 | 0.9962 |
| 0.008 | 3.0 | 694 | 0.0099 | 0.9987 |
| 0.0002 | 4.0 | 926 | 0.0181 | 0.9974 |
| 0.0 | 4.99 | 1155 | 0.0114 | 0.9987 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
alirzb/S2_M1_R2_ViT_42618530
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S2_M1_R2_ViT_42618530

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0018
- Accuracy: 0.9987

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0088 | 1.0 | 237 | 0.0385 | 0.9887 |
| 0.0067 | 2.0 | 474 | 0.0155 | 0.9962 |
| 0.0015 | 3.0 | 711 | 0.0038 | 0.9987 |
| 0.0001 | 4.0 | 948 | 0.0011 | 0.9987 |
| 0.0001 | 5.0 | 1185 | 0.0018 | 0.9987 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold1_ViT_42618571
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S5_M1_fold1_ViT_42618571

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0013
- Accuracy: 0.9992

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0293 | 1.0 | 368 | 0.0035 | 0.9992 |
| 0.0006 | 2.0 | 737 | 0.0031 | 0.9984 |
| 0.0001 | 3.0 | 1105 | 0.0017 | 0.9992 |
| 0.0 | 4.0 | 1474 | 0.0016 | 0.9992 |
| 0.0 | 4.99 | 1840 | 0.0013 | 0.9992 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
alirzb/S2_M1_R3_ViT_42618549
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S2_M1_R3_ViT_42618549

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0001
- Accuracy: 1.0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0171 | 1.0 | 307 | 0.0156 | 0.9952 |
| 0.0097 | 2.0 | 614 | 0.0005 | 1.0 |
| 0.0045 | 3.0 | 921 | 0.0021 | 0.9990 |
| 0.0 | 4.0 | 1229 | 0.0001 | 1.0 |
| 0.0001 | 5.0 | 1535 | 0.0001 | 1.0 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold2_ViT_42618583
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S5_M1_fold2_ViT_42618583

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0165
- Accuracy: 0.9976

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0107 | 1.0 | 368 | 0.0235 | 0.9936 |
| 0.0006 | 2.0 | 737 | 0.0171 | 0.9960 |
| 0.0001 | 3.0 | 1105 | 0.0154 | 0.9984 |
| 0.0001 | 4.0 | 1474 | 0.0151 | 0.9976 |
| 0.0001 | 4.99 | 1840 | 0.0165 | 0.9976 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold3_ViT_42618589
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S5_M1_fold3_ViT_42618589

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0068
- Accuracy: 0.9984

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0026 | 1.0 | 368 | 0.0069 | 0.9976 |
| 0.0052 | 2.0 | 737 | 0.0094 | 0.9984 |
| 0.0006 | 3.0 | 1105 | 0.0086 | 0.9984 |
| 0.0 | 4.0 | 1474 | 0.0068 | 0.9984 |
| 0.0 | 4.99 | 1840 | 0.0068 | 0.9984 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold4_ViT_42618593
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S5_M1_fold4_ViT_42618593

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0091
- Accuracy: 0.9992

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0072 | 1.0 | 368 | 0.0147 | 0.9960 |
| 0.0161 | 2.0 | 737 | 0.0104 | 0.9984 |
| 0.0012 | 3.0 | 1105 | 0.0104 | 0.9976 |
| 0.0001 | 4.0 | 1474 | 0.0091 | 0.9992 |
| 0.0 | 4.99 | 1840 | 0.0091 | 0.9992 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold5_ViT_42621111
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S5_M1_fold5_ViT_42621111

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0042
- Accuracy: 0.9984

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0311 | 1.0 | 368 | 0.0044 | 0.9992 |
| 0.0045 | 2.0 | 737 | 0.0014 | 0.9992 |
| 0.0038 | 3.0 | 1105 | 0.0068 | 0.9984 |
| 0.0001 | 4.0 | 1474 | 0.0041 | 0.9984 |
| 0.0 | 4.99 | 1840 | 0.0042 | 0.9984 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
alirzb/S1_M1_R2_BEiT_42621211
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S1_M1_R2_BEiT_42621211

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0129
- Accuracy: 0.9971

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0137 | 1.0 | 309 | 0.0707 | 0.9885 |
| 0.0103 | 2.0 | 619 | 0.0111 | 0.9981 |
| 0.0043 | 3.0 | 929 | 0.0176 | 0.9971 |
| 0.0011 | 4.0 | 1239 | 0.0263 | 0.9952 |
| 0.0 | 4.99 | 1545 | 0.0129 | 0.9971 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
alirzb/S1_M1_R3_BEiT_42621220
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S1_M1_R3_BEiT_42621220

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0131
- Accuracy: 0.9977

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0088 | 1.0 | 379 | 0.0077 | 0.9977 |
| 0.025 | 2.0 | 759 | 0.0080 | 0.9984 |
| 0.0362 | 3.0 | 1139 | 0.0049 | 0.9992 |
| 0.0007 | 4.0 | 1519 | 0.0146 | 0.9977 |
| 0.0 | 4.99 | 1895 | 0.0131 | 0.9977 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
alirzb/S2_M1_R1_BEiT_42621224
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S2_M1_R1_BEiT_42621224

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0001
- Accuracy: 1.0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0388 | 1.0 | 231 | 0.0141 | 0.9949 |
| 0.0418 | 2.0 | 463 | 0.0076 | 0.9987 |
| 0.0004 | 3.0 | 694 | 0.0002 | 1.0 |
| 0.0044 | 4.0 | 926 | 0.0003 | 1.0 |
| 0.0001 | 4.99 | 1155 | 0.0001 | 1.0 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
alirzb/S2_M1_R2_BEiT_42621227
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S2_M1_R2_BEiT_42621227

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0197
- Accuracy: 0.9950

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0451 | 1.0 | 237 | 0.0198 | 0.9912 |
| 0.0182 | 2.0 | 474 | 0.0110 | 0.9950 |
| 0.0048 | 3.0 | 711 | 0.0192 | 0.9950 |
| 0.0046 | 4.0 | 948 | 0.0259 | 0.9950 |
| 0.0001 | 5.0 | 1185 | 0.0197 | 0.9950 |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
SaladSlayer00/twin_matcher_beta
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. -->

# SaladSlayer00/twin_matcher_beta

This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0286
- Validation Loss: 1.1866
- Validation Accuracy: 0.7159
- Epoch: 34

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 5e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32

### Training results

| Train Loss | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:---------------:|:-------------------:|:-----:|
| 7.0814 | 4.8848 | 0.0133 | 0 |
| 4.6679 | 4.5568 | 0.0666 | 1 |
| 4.3536 | 4.1337 | 0.1221 | 2 |
| 3.8915 | 3.6650 | 0.2053 | 3 |
| 3.4256 | 3.2568 | 0.2597 | 4 |
| 3.0033 | 2.8885 | 0.3185 | 5 |
| 2.6252 | 2.5913 | 0.3973 | 6 |
| 2.2829 | 2.3391 | 0.4406 | 7 |
| 1.9821 | 2.1352 | 0.4928 | 8 |
| 1.7076 | 1.9428 | 0.5250 | 9 |
| 1.4693 | 1.8008 | 0.5627 | 10 |
| 1.2464 | 1.6763 | 0.5949 | 11 |
| 1.0552 | 1.5872 | 0.6093 | 12 |
| 0.9105 | 1.4840 | 0.6238 | 13 |
| 0.7595 | 1.4117 | 0.6426 | 14 |
| 0.6390 | 1.3601 | 0.6582 | 15 |
| 0.5328 | 1.3283 | 0.6548 | 16 |
| 0.4539 | 1.2958 | 0.6681 | 17 |
| 0.3655 | 1.2470 | 0.6715 | 18 |
| 0.3183 | 1.2389 | 0.6770 | 19 |
| 0.2597 | 1.2309 | 0.6792 | 20 |
| 0.2269 | 1.2193 | 0.6881 | 21 |
| 0.1750 | 1.2206 | 0.6781 | 22 |
| 0.1553 | 1.1853 | 0.6970 | 23 |
| 0.1313 | 1.1949 | 0.6781 | 24 |
| 0.1058 | 1.1935 | 0.6870 | 25 |
| 0.0903 | 1.2042 | 0.6859 | 26 |
| 0.0762 | 1.1950 | 0.6948 | 27 |
| 0.0654 | 1.1798 | 0.7037 | 28 |
| 0.0588 | 1.1955 | 0.6959 | 29 |
| 0.0488 | 1.1788 | 0.7048 | 30 |
| 0.0444 | 1.1845 | 0.7037 | 31 |
| 0.0374 | 1.1969 | 0.7026 | 32 |
| 0.0327 | 1.1907 | 0.7048 | 33 |
| 0.0286 | 1.1866 | 0.7159 | 34 |

### Framework versions

- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.1
- Tokenizers 0.15.0
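Since this fine-tune was done in Keras/TensorFlow, inference goes through the TF model classes; below is a minimal sketch (it assumes TF weights are published under this row's repo id, and the input filename is a placeholder):

```python
# TF inference sketch; repo id availability and "face.jpg" are assumptions.
import tensorflow as tf
from PIL import Image
from transformers import AutoImageProcessor, TFAutoModelForImageClassification

repo = "SaladSlayer00/twin_matcher_beta"
processor = AutoImageProcessor.from_pretrained(repo)
model = TFAutoModelForImageClassification.from_pretrained(repo)

image = Image.open("face.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="tf")
logits = model(**inputs).logits
print(model.config.id2label[int(tf.argmax(logits, axis=-1)[0])])
```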
[ "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", 
"danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", 
"katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "anthony_mackie", 
"anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", 
"cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", 
"logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", 
"bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", 
"josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", 
"krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", 
"gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", 
"shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", 
"marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", 
"chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", 
"richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", 
"natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", 
"stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", 
"amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", 
"katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", 
"barack_obama", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", 
"morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", 
"anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", 
"robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", 
"sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", 
"tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", 
"brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", 
"andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", 
"alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", 
"miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", 
"robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", 
"selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", 
"emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", 
"gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", 
"tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", 
"sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", 
"jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", 
"millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", 
"bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", 
"maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", 
"elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", 
"grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", 
"emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", 
"henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", 
"dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", 
"lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni" ]
tonyassi/celebrity-classifier
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# Celebrity Classifier

## Model description

This model classifies a face image as one of the top 1,000 celebrities. It was fine-tuned from [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the [tonyassi/celebrity-1000](https://huggingface.co/datasets/tonyassi/celebrity-1000) dataset.

## Dataset description

[tonyassi/celebrity-1000](https://huggingface.co/datasets/tonyassi/celebrity-1000)

Top 1000 celebrities. 18,184 images. 256x256. Square cropped to face.

### How to use

```python
from transformers import pipeline

# Initialize image classification pipeline
pipe = pipeline("image-classification", model="tonyassi/celebrity-classifier")

# Perform classification
result = pipe('image.png')

# Print results
print(result)
```

## Training and evaluation data

It achieves the following results on the evaluation set:
- Loss: 0.9089
- Accuracy: 0.7982

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
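For reference, the hyperparameters above map one-to-one onto `transformers.TrainingArguments`. The sketch below is not the author's training script, only a minimal reconstruction under stated assumptions: the dataset is assumed to expose `image` (PIL) and `label` (ClassLabel) columns, and the held-out split is created ad hoc.

```python
import torch
from datasets import load_dataset
from transformers import (
    Trainer,
    TrainingArguments,
    ViTForImageClassification,
    ViTImageProcessor,
)

# Assumption: the dataset has "image" and "label" columns and a single train split.
splits = load_dataset("tonyassi/celebrity-1000", split="train").train_test_split(
    test_size=0.1, seed=42
)
labels = splits["train"].features["label"].names

processor = ViTImageProcessor.from_pretrained("google/vit-base-patch16-224-in21k")

def transform(batch):
    # Turn PIL images into pixel_values; carry the integer labels through.
    out = processor([img.convert("RGB") for img in batch["image"]], return_tensors="pt")
    out["labels"] = batch["label"]
    return out

def collate(examples):
    return {
        "pixel_values": torch.stack([ex["pixel_values"] for ex in examples]),
        "labels": torch.tensor([ex["labels"] for ex in examples]),
    }

model = ViTForImageClassification.from_pretrained(
    "google/vit-base-patch16-224-in21k",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={name: i for i, name in enumerate(labels)},
)

# Mirrors the card: effective batch size = 16 (per device) x 4 (accumulation) = 64.
args = TrainingArguments(
    output_dir="celebrity-classifier",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=20,
    remove_unused_columns=False,  # keep the "image" column available to with_transform
)

trainer = Trainer(
    model=model,
    args=args,
    data_collator=collate,
    train_dataset=splits["train"].with_transform(transform),
    eval_dataset=splits["test"].with_transform(transform),
)
trainer.train()
```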
[ "aaron eckhart", "aaron paul", "adam driver", "blake lively", "bob odenkirk", "bonnie wright", "boyd holbrook", "brad pitt", "bradley cooper", "brendan fraser", "brian cox", "brie larson", "brittany snow", "adam lambert", "bryan cranston", "bryce dallas howard", "busy philipps", "caitriona balfe", "cameron diaz", "camila cabello", "camila mendes", "cardi b", "carey mulligan", "carla gugino", "adam levine", "carrie underwood", "casey affleck", "cate blanchett", "catherine keener", "catherine zeta-jones", "celine dion", "chace crawford", "chadwick boseman", "channing tatum", "charlie cox", "adam sandler", "charlie day", "charlie hunnam", "charlie plummer", "charlize theron", "chiara ferragni", "chiwetel ejiofor", "chloe bennet", "chloe grace moretz", "chloe sevigny", "chloë grace moretz", "adam scott", "chloë sevigny", "chris cooper", "chris evans", "chris hemsworth", "chris martin", "chris messina", "chris noth", "chris o'dowd", "chris pine", "chris pratt", "adele", "chris tucker", "chrissy teigen", "christian bale", "christian slater", "christina aguilera", "christina applegate", "christina hendricks", "christina milian", "christina ricci", "christine baranski", "adrian grenier", "christoph waltz", "christopher plummer", "christopher walken", "cillian murphy", "claire foy", "clive owen", "clive standen", "cobie smulders", "colin farrell", "colin firth", "adèle exarchopoulos", "colin hanks", "connie britton", "conor mcgregor", "constance wu", "constance zimmer", "courteney cox", "cristiano ronaldo", "daisy ridley", "dak prescott", "dakota fanning", "aidan gillen", "dakota johnson", "damian lewis", "dan stevens", "danai gurira", "dane dehaan", "daniel craig", "daniel dae kim", "daniel day-lewis", "daniel gillies", "daniel kaluuya", "aidan turner", "daniel mays", "daniel radcliffe", "danny devito", "darren criss", "dave bautista", "dave franco", "dave grohl", "daveed diggs", "david attenborough", "david beckham", "aaron rodgers", "aishwarya rai", "david duchovny", "david harbour", "david oyelowo", "david schwimmer", "david tennant", "david thewlis", "dax shepard", "debra messing", "demi lovato", "dennis quaid", "aja naomi king", "denzel washington", "dermot mulroney", "dev patel", "diane keaton", "diane kruger", "diane lane", "diego boneta", "diego luna", "djimon hounsou", "dolly parton", "alden ehrenreich", "domhnall gleeson", "dominic cooper", "dominic monaghan", "dominic west", "don cheadle", "donald glover", "donald sutherland", "donald trump", "dua lipa", "dwayne \"the rock\" johnson", "aldis hodge", "dwayne johnson", "dylan o'brien", "ed harris", "ed helms", "ed sheeran", "eddie murphy", "eddie redmayne", "edgar ramirez", "edward norton", "eiza gonzalez", "alec baldwin", "eiza gonzález", "elijah wood", "elisabeth moss", "elisha cuthbert", "eliza coupe", "elizabeth banks", "elizabeth debicki", "elizabeth lail", "elizabeth mcgovern", "elizabeth moss", "alex morgan", "elizabeth olsen", "elle fanning", "ellen degeneres", "ellen page", "ellen pompeo", "ellie goulding", "elon musk", "emile hirsch", "emilia clarke", "emilia fox", "alex pettyfer", "emily beecham", "emily blunt", "emily browning", "emily deschanel", "emily hampshire", "emily mortimer", "emily ratajkowski", "emily vancamp", "emily watson", "emma bunton", "alex rodriguez", "emma chamberlain", "emma corrin", "emma mackey", "emma roberts", "emma stone", "emma thompson", "emma watson", "emmanuelle chriqui", "emmy rossum", "eoin macken", "alexander skarsgård", "eric bana", "ethan hawke", "eva green", "eva longoria", "eva mendes", 
"evan peters", "evan rachel wood", "evangeline lilly", "ewan mcgregor", "ezra miller", "alexandra daddario", "felicity huffman", "felicity jones", "finn wolfhard", "florence pugh", "florence welch", "forest whitaker", "freddie highmore", "freddie prinze jr.", "freema agyeman", "freida pinto", "aaron taylor-johnson", "alfre woodard", "freya allan", "gabrielle union", "gael garcia bernal", "gael garcía bernal", "gal gadot", "garrett hedlund", "gary oldman", "gemma arterton", "gemma chan", "gemma whelan", "alia shawkat", "george clooney", "george lucas", "gerard butler", "giancarlo esposito", "giannis antetokounmpo", "gigi hadid", "gillian anderson", "gillian jacobs", "gina carano", "gina gershon", "alice braga", "gina rodriguez", "ginnifer goodwin", "gisele bundchen", "glenn close", "grace kelly", "greg kinnear", "greta gerwig", "greta scacchi", "greta thunberg", "gugu mbatha-raw", "alice eve", "guy ritchie", "gwen stefani", "gwendoline christie", "gwyneth paltrow", "hafthor bjornsson", "hailee steinfeld", "hailey bieber", "haley joel osment", "halle berry", "hannah simone", "alicia keys", "harrison ford", "harry styles", "harvey weinstein", "hayden panettiere", "hayley atwell", "helen hunt", "helen mirren", "helena bonham carter", "henry cavill", "henry golding", "alicia vikander", "hilary swank", "himesh patel", "hozier", "hugh bonneville", "hugh dancy", "hugh grant", "hugh jackman", "hugh laurie", "ian somerhalder", "idris elba", "alison brie", "imelda staunton", "imogen poots", "ioan gruffudd", "isabella rossellini", "isabelle huppert", "isla fisher", "issa rae", "iwan rheon", "j.k. rowling", "j.k. simmons", "allison janney", "jack black", "jack reynor", "jack whitehall", "jackie chan", "jada pinkett smith", "jaden smith", "jaimie alexander", "jake gyllenhaal", "jake johnson", "jake t. austin", "allison williams", "james cameron", "james corden", "james franco", "james marsden", "james mcavoy", "james norton", "jamie bell", "jamie chung", "jamie dornan", "jamie foxx", "alyson hannigan", "jamie lee curtis", "jamie oliver", "jane fonda", "jane krakowski", "jane levy", "jane lynch", "jane seymour", "janelle monáe", "january jones", "jared leto", "abbi jacobson", "amanda peet", "jason bateman", "jason clarke", "jason derulo", "jason isaacs", "jason momoa", "jason mraz", "jason schwartzman", "jason segel", "jason statham", "jason sudeikis", "amanda seyfried", "javier bardem", "jay baruchel", "jay-z", "jeff bezos", "jeff bridges", "jeff daniels", "jeff goldblum", "jeffrey dean morgan", "jeffrey donovan", "jeffrey wright", "amandla stenberg", "jemima kirke", "jenna coleman", "jenna fischer", "jenna ortega", "jennifer aniston", "jennifer connelly", "jennifer coolidge", "jennifer esposito", "jennifer garner", "jennifer hudson", "amber heard", "jennifer lawrence", "jennifer lopez", "jennifer love hewitt", "jenny slate", "jeremy irons", "jeremy renner", "jeremy strong", "jerry seinfeld", "jesse eisenberg", "jesse metcalfe", "america ferrera", "jesse plemons", "jesse tyler ferguson", "jesse williams", "jessica alba", "jessica biel", "jessica chastain", "jessica lange", "jessie buckley", "jim carrey", "jim parsons", "amy adams", "joan collins", "joan cusack", "joanne froggatt", "joaquin phoenix", "jodie comer", "jodie foster", "joe jonas", "joe keery", "joel edgerton", "joel kinnaman", "amy poehler", "joel mchale", "john boyega", "john c. 
reilly", "john cena", "john cho", "john cleese", "john corbett", "john david washington", "john goodman", "john hawkes", "amy schumer", "john krasinski", "john legend", "john leguizamo", "john lithgow", "john malkovich", "john mayer", "john mulaney", "john oliver", "john slattery", "john travolta", "ana de armas", "john turturro", "johnny depp", "johnny knoxville", "jon bernthal", "jon favreau", "jon hamm", "jonah hill", "jonathan groff", "jonathan majors", "jonathan pryce", "andie macdowell", "jonathan rhys meyers", "jordan peele", "jordana brewster", "joseph fiennes", "joseph gordon-levitt", "josh allen", "josh brolin", "josh gad", "josh hartnett", "josh hutcherson", "abhishek bachchan", "andrew garfield", "josh radnor", "jude law", "judy dench", "judy greer", "julia garner", "julia louis-dreyfus", "julia roberts", "julia stiles", "julian casablancas", "julian mcmahon", "andrew lincoln", "julianna margulies", "julianne hough", "julianne moore", "julianne nicholson", "juliette binoche", "juliette lewis", "juno temple", "jurnee smollett", "justin bartha", "justin bieber", "andrew scott", "justin hartley", "justin herbert", "justin long", "justin theroux", "justin timberlake", "kj apa", "kaitlyn dever", "kaley cuoco", "kanye west", "karl urban", "andy garcia", "kat dennings", "kate beckinsale", "kate bosworth", "kate hudson", "kate mara", "kate middleton", "kate upton", "kate walsh", "kate winslet", "katee sackhoff", "andy samberg", "katherine heigl", "katherine langford", "katherine waterston", "kathryn hahn", "katie holmes", "katie mcgrath", "katy perry", "kaya scodelario", "keanu reeves", "keegan-michael key", "andy serkis", "keira knightley", "keke palmer", "kelly clarkson", "kelly macdonald", "kelly marie tran", "kelly reilly", "kelly ripa", "kelvin harrison jr.", "keri russell", "kerry washington", "angela bassett", "kevin bacon", "kevin costner", "kevin hart", "kevin spacey", "ki hong lee", "kiefer sutherland", "kieran culkin", "kiernan shipka", "kim dickens", "kim kardashian", "angelina jolie", "kirsten dunst", "kit harington", "kourtney kardashian", "kristen bell", "kristen stewart", "kristen wiig", "kristin davis", "krysten ritter", "kyle chandler", "kylie jenner", "anna camp", "kylie minogue", "lady gaga", "lake bell", "lakeith stanfield", "lamar jackson", "lana del rey", "laura dern", "laura harrier", "laura linney", "laura prepon", "anna faris", "laurence fishburne", "laverne cox", "lebron james", "lea michele", "lea seydoux", "lee pace", "leighton meester", "lena headey", "leonardo da vinci", "leonardo dicaprio", "abigail breslin", "anna kendrick", "leslie mann", "leslie odom jr.", "lewis hamilton", "liam hemsworth", "liam neeson", "lili reinhart", "lily aldridge", "lily allen", "lily collins", "lily james", "anna paquin", "lily rabe", "lily tomlin", "lin-manuel miranda", "linda cardellini", "lionel messi", "lisa bonet", "lisa kudrow", "liv tyler", "lizzo", "logan lerman", "annasophia robb", "lorde", "lucy boynton", "lucy hale", "lucy lawless", "lucy liu", "luke evans", "luke perry", "luke wilson", "lupita nyong'o", "léa seydoux", "annabelle wallis", "mackenzie davis", "madelaine petsch", "mads mikkelsen", "mae whitman", "maggie gyllenhaal", "maggie q", "maggie siff", "maggie smith", "mahershala ali", "mahira khan", "anne hathaway", "maisie richardson-sellers", "maisie williams", "mandy moore", "mandy patinkin", "marc anthony", "margaret qualley", "margot robbie", "maria sharapova", "marion cotillard", "marisa tomei", "anne marie", "mariska hargitay", "mark hamill", "mark 
ruffalo", "mark strong", "mark wahlberg", "mark zuckerberg", "marlon brando", "martin freeman", "martin scorsese", "mary elizabeth winstead", "anne-marie", "mary j. blige", "mary steenburgen", "mary-louise parker", "matt bomer", "matt damon", "matt leblanc", "matt smith", "matthew fox", "matthew goode", "matthew macfadyen", "ansel elgort", "matthew mcconaughey", "matthew perry", "matthew rhys", "matthew stafford", "max minghella", "maya angelou", "maya hawke", "maya rudolph", "megan fox", "megan rapinoe", "anson mount", "meghan markle", "mel gibson", "melanie lynskey", "melissa benoist", "melissa mccarthy", "melonie diaz", "meryl streep", "mia wasikowska", "michael b. jordan", "michael c. hall", "anthony hopkins", "michael caine", "michael cera", "michael cudlitz", "michael douglas", "michael ealy", "michael fassbender", "michael jordan", "michael keaton", "michael pena", "michael peña", "abigail spencer", "anthony joshua", "michael phelps", "michael shannon", "michael sheen", "michael stuhlbarg", "michelle dockery", "michelle monaghan", "michelle obama", "michelle pfeiffer", "michelle rodriguez", "michelle williams", "anthony mackie", "michelle yeoh", "michiel huisman", "mila kunis", "miles teller", "milla jovovich", "millie bobby brown", "milo ventimiglia", "mindy kaling", "miranda cosgrove", "miranda kerr", "antonio banderas", "mireille enos", "molly ringwald", "morgan freeman", "mélanie laurent", "naomi campbell", "naomi harris", "naomi scott", "naomi watts", "naomie harris", "nas", "anya taylor-joy", "natalie dormer", "natalie imbruglia", "natalie morales", "natalie portman", "nathalie emmanuel", "nathalie portman", "nathan fillion", "naya rivera", "neil patrick harris", "neil degrasse tyson", "ariana grande", "neve campbell", "neymar jr.", "nicholas braun", "nicholas hoult", "nick jonas", "nick kroll", "nick offerman", "nick robinson", "nicole kidman", "nikolaj coster-waldau", "armie hammer", "nina dobrev", "noah centineo", "noomi rapace", "norman reedus", "novak djokovic", "octavia spencer", "odessa young", "odette annable", "olivia colman", "olivia cooke", "ashley judd", "olivia holt", "olivia munn", "olivia wilde", "oprah winfrey", "orlando bloom", "oscar isaac", "owen wilson", "pablo picasso", "patrick dempsey", "patrick mahomes", "ashton kutcher", "patrick stewart", "patrick wilson", "paul bettany", "paul dano", "paul giamatti", "paul mccartney", "paul rudd", "paul wesley", "paula patton", "pedro almodóvar", "aubrey plaza", "pedro pascal", "penelope cruz", "penélope cruz", "pete davidson", "peter dinklage", "phoebe dynevor", "phoebe waller-bridge", "pierce brosnan", "portia de rossi", "priyanka chopra", "auli'i cravalho", "quentin tarantino", "rachel bilson", "rachel brosnahan", "rachel mcadams", "rachel weisz", "rafe spall", "rainn wilson", "ralph fiennes", "rami malek", "rashida jones", "adam brody", "awkwafina", "ray liotta", "ray romano", "rebecca ferguson", "rebecca hall", "reese witherspoon", "regina hall", "regina king", "renee zellweger", "renée zellweger", "rhys ifans", "barack obama", "ricardo montalban", "richard armitage", "richard gere", "richard jenkins", "richard madden", "ricky gervais", "ricky martin", "rihanna", "riley keough", "rita ora", "bella hadid", "river phoenix", "riz ahmed", "rob lowe", "robert carlyle", "robert de niro", "robert downey jr.", "robert pattinson", "robert sheehan", "robin tunney", "robin williams", "bella thorne", "roger federer", "rooney mara", "rosamund pike", "rosario dawson", "rose byrne", "rose leslie", "roselyn sanchez", "ruby 
rose", "rupert grint", "russell brand", "ben barnes", "russell crowe", "russell wilson", "ruth bader ginsburg", "ruth wilson", "ryan eggold", "ryan gosling", "ryan murphy", "ryan phillippe", "ryan reynolds", "ryan seacrest", "ben mendelsohn", "salma hayek", "sam claflin", "sam heughan", "sam rockwell", "sam smith", "samara weaving", "samuel l. jackson", "sandra bullock", "sandra oh", "saoirse ronan", "ben stiller", "sarah gadon", "sarah hyland", "sarah jessica parker", "sarah michelle gellar", "sarah paulson", "sarah silverman", "sarah wayne callies", "sasha alexander", "scarlett johansson", "scott speedman", "ben whishaw", "sean bean", "sebastian stan", "selena gomez", "selma blair", "serena williams", "seth macfarlane", "seth meyers", "seth rogen", "shailene woodley", "shakira", "benedict cumberbatch", "shania twain", "sharlto copley", "shawn mendes", "shia labeouf", "shiri appleby", "shohreh aghdashloo", "shonda rhimes", "sienna miller", "sigourney weaver", "simon baker", "benedict wong", "simon cowell", "simon pegg", "simone biles", "sofia boutella", "sofia vergara", "sophie turner", "sophie wessex", "stanley tucci", "stephen amell", "stephen colbert", "adam devine", "benicio del toro", "stephen curry", "stephen dorff", "sterling k. brown", "sterling knight", "steve carell", "steven yeun", "susan sarandon", "taika waititi", "taraji p. henson", "taron egerton", "bill gates", "taylor hill", "taylor kitsch", "taylor lautner", "taylor schilling", "taylor swift", "teresa palmer", "terrence howard", "tessa thompson", "thandie newton", "the weeknd", "bill hader", "theo james", "thomas brodie-sangster", "thomas jane", "tiger woods", "tilda swinton", "tim burton", "tim cook", "timothee chalamet", "timothy olyphant", "timothy spall", "bill murray", "timothée chalamet", "tina fey", "tobey maguire", "toby jones", "toby kebbell", "toby regbo", "tom brady", "tom brokaw", "tom cavanagh", "tom cruise", "bill pullman", "tom ellis", "tom felton", "tom hanks", "tom hardy", "tom hiddleston", "tom holland", "tom hollander", "tom hopper", "tom selleck", "toni collette", "bill skarsgård", "tony hale", "topher grace", "tracee ellis ross", "tyra banks", "tyrese gibson", "uma thurman", "usain bolt", "uzo aduba", "vanessa hudgens", "vanessa kirby", "billie eilish", "vera farmiga", "victoria pedretti", "viggo mortensen", "vin diesel", "vince vaughn", "vincent cassel", "vincent d'onofrio", "vincent kartheiser", "viola davis", "walton goggins", "billie lourd", "wes anderson", "wes bentley", "whoopi goldberg", "will ferrell", "will poulter", "willem dafoe", "william jackson harper", "william shatner", "winona ryder", "woody harrelson", "billy crudup", "yara shahidi", "yvonne strahovski", "zac efron", "zach braff", "zach galifianakis", "zachary levi", "zachary quinto", "zayn malik", "zazie beetz", "zendaya", "billy porter", "zoe kazan", "zoe kravitz", "zoe saldana", "zoey deutch", "zooey deschanel", "zoë kravitz", "zoë saldana" ]
alirzb/S2_M1_R3_BEiT_42621830
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S2_M1_R3_BEiT_42621830

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0128
- Accuracy: 0.9981

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0088        | 1.0   | 307  | 0.0336          | 0.9942   |
| 0.011         | 2.0   | 614  | 0.0439          | 0.9932   |
| 0.0009        | 3.0   | 921  | 0.0163          | 0.9961   |
| 0.003         | 4.0   | 1229 | 0.0130          | 0.9971   |
| 0.0001        | 5.0   | 1535 | 0.0128          | 0.9981   |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
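Although the card gives no usage snippet, the checkpoint loads through the standard Transformers image-classification pipeline. A minimal sketch, assuming the input is an image file (the path below is a placeholder); the same call pattern applies to the other fold checkpoints that follow, which differ only in their training split.

```python
from transformers import pipeline

# Load the fine-tuned BEiT checkpoint; labels come from the model config's id2label.
classifier = pipeline("image-classification", model="alirzb/S2_M1_R3_BEiT_42621830")

# "input.png" is a placeholder for whatever image representation the model was trained on.
for pred in classifier("input.png"):
    print(f"{pred['label']}: {pred['score']:.4f}")  # expected labels: none_seizures / seizures
```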
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold2_BEiT_42621842
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S5_M1_fold2_BEiT_42621842

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0008
- Accuracy: 0.9992

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0283        | 1.0   | 368  | 0.0190          | 0.9960   |
| 0.014         | 2.0   | 737  | 0.0153          | 0.9960   |
| 0.0044        | 3.0   | 1105 | 0.0032          | 0.9992   |
| 0.0019        | 4.0   | 1474 | 0.0130          | 0.9976   |
| 0.0           | 4.99  | 1840 | 0.0008          | 0.9992   |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold1_BEiT_42621837
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S5_M1_fold1_BEiT_42621837

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0098
- Accuracy: 0.9976

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.012         | 1.0   | 368  | 0.0211          | 0.9944   |
| 0.0349        | 2.0   | 737  | 0.0147          | 0.9960   |
| 0.0014        | 3.0   | 1105 | 0.0075          | 0.9976   |
| 0.0001        | 4.0   | 1474 | 0.0071          | 0.9984   |
| 0.0065        | 4.99  | 1840 | 0.0098          | 0.9976   |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold4_BEiT_42621847
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S5_M1_fold4_BEiT_42621847

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0018
- Accuracy: 0.9992

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.007         | 1.0   | 368  | 0.0027          | 1.0      |
| 0.0132        | 2.0   | 737  | 0.0019          | 1.0      |
| 0.0255        | 3.0   | 1105 | 0.0053          | 0.9976   |
| 0.0001        | 4.0   | 1474 | 0.0060          | 0.9976   |
| 0.0001        | 4.99  | 1840 | 0.0018          | 0.9992   |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
[ "none_seizures", "seizures" ]
alirzb/S5_M1_fold5_BEiT_42621849
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# S5_M1_fold5_BEiT_42621849

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0001
- Accuracy: 1.0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0023        | 1.0   | 368  | 0.0052          | 0.9984   |
| 0.0201        | 2.0   | 737  | 0.0208          | 0.9952   |
| 0.0           | 3.0   | 1105 | 0.0257          | 0.9936   |
| 0.0007        | 4.0   | 1474 | 0.0005          | 1.0      |
| 0.0001        | 4.99  | 1840 | 0.0001          | 1.0      |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3
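Across these fold cards, `lr_scheduler_type: linear` with `lr_scheduler_warmup_ratio: 0.1` over the ~1840 optimizer steps shown in the table implies roughly 184 warmup steps. A minimal sketch of the equivalent schedule using the standard `get_linear_schedule_with_warmup` helper; the single dummy parameter exists only to make the snippet runnable, and is not part of any card's setup.

```python
import torch
from transformers import get_linear_schedule_with_warmup

# Placeholder parameter so the optimizer has something to manage.
params = [torch.nn.Parameter(torch.zeros(1))]

# Matches the card's optimizer line: Adam, betas=(0.9, 0.999), epsilon=1e-08, lr=5e-05.
optimizer = torch.optim.Adam(params, lr=5e-5, betas=(0.9, 0.999), eps=1e-8)

total_steps = 1840                     # ~368 optimizer steps/epoch x 5 epochs (results table)
warmup_steps = int(0.1 * total_steps)  # warmup_ratio 0.1 -> 184 warmup steps

scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=warmup_steps, num_training_steps=total_steps
)

for _ in range(total_steps):
    optimizer.step()   # one optimizer update per accumulated batch of 64
    scheduler.step()   # LR ramps linearly up to 5e-5, then decays linearly toward 0
```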
[ "none_seizures", "seizures" ]
Ojimi/vit-anime-caption
This model is the product of curiosity: a classifier that lets you tag anime images automatically!

**Disclaimer**: The model has been trained on an entirely new dataset, so predictions for *pre-2023 content might be off*. It's advisable to fine-tune the model according to your specific use case.

# Quick setup guide:

```python
from transformers.modeling_outputs import ImageClassifierOutput
from transformers import ViTImageProcessor, ViTForImageClassification
import torch
from PIL import Image

model_name_or_path = "Ojimi/vit-anime-caption"
processor = ViTImageProcessor.from_pretrained(model_name_or_path)
model = ViTForImageClassification.from_pretrained(model_name_or_path)
threshold = 0.3
device = torch.device('cuda')

image = Image.open(YOUR_IMAGE_PATH)
inputs = processor(image, return_tensors='pt')

model.to(device=device)
model.eval()

with torch.no_grad():
    pixel_values = inputs['pixel_values'].to(device=device)
    outputs: ImageClassifierOutput = model(pixel_values=pixel_values)
    logits = outputs.logits  # The raw scores before applying any activation
    sigmoid = torch.nn.Sigmoid()  # Sigmoid function to convert logits to probabilities
    probs: torch.FloatTensor = sigmoid(logits)  # Applying sigmoid activation

predictions = []  # List to store predictions

for idx, p in enumerate(probs[0]):
    if p > threshold:  # Applying a threshold of 0.3 to consider a class prediction
        predictions.append((model.config.id2label[idx], p.item()))  # Storing class label and probability

for tag in predictions:
    print(tag)
```

Why the `Sigmoid`?
- Sigmoid turns raw logits into independent per-tag probabilities, so you can apply a threshold and recover several tags per image (softmax would force the tags to compete for a single label).
- It's like a wizard turning regular stuff into magic potions!

[Training guide](https://huggingface.co/Ojimi/vit-anime-caption/blob/main/training_guide.md)
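As a follow-up to the quick-setup snippet, a hedged sketch of tagging several images in one batch; the file names below are placeholders, and the threshold is the same 0.3 used above.

```python
import torch
from PIL import Image
from transformers import ViTImageProcessor, ViTForImageClassification

model_name_or_path = "Ojimi/vit-anime-caption"
processor = ViTImageProcessor.from_pretrained(model_name_or_path)
model = ViTForImageClassification.from_pretrained(model_name_or_path).eval()

paths = ["a.png", "b.png"]  # placeholder image paths
images = [Image.open(p).convert("RGB") for p in paths]
inputs = processor(images, return_tensors="pt")

with torch.no_grad():
    # One row of independent per-tag probabilities per input image.
    probs = torch.sigmoid(model(**inputs).logits)

for path, row in zip(paths, probs):
    tags = [model.config.id2label[i] for i, p in enumerate(row) if p > 0.3]
    print(path, tags)
```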
[ "general", "sensitive", "questionable", "explicit", "1girl", "akatsuki uni", "armpits", "arms up", "bangs", "bare shoulders", "black dress", "black gloves", "blonde hair", "blush", "breasts", "clenched teeth", "dress", "gloves", "hair ornament", "hairclip", "long hair", "looking at viewer", "parted bangs", "pink eyes", "simple background", "sleeveless", "sleeveless dress", "small breasts", "solo", "sweat", "teeth", "two side up", "very long hair", "virtual youtuber", "white background", "zipper", "zipper pull tab", "1boy", ":o", "bed", "bed sheet", "black leotard", "breasts apart", "breath", "brown leotard", "cameltoe", "censored", "clenched hands", "corruption", "covered navel", "covered nipples", "drooling", "fingerless gloves", "fingernails", "fishnets", "hair between eyes", "hands up", "heart", "heart-shaped pupils", "heavy breathing", "hetero", "highleg", "highleg leotard", "impossible clothes", "impossible leotard", "leotard", "lying", "medium breasts", "nipple piercing", "nipples", "nose blush", "on back", "on bed", "open mouth", "partially fingerless gloves", "paw pose", "penis", "piercing", "pillow", "pink bow", "precum", "pubic tattoo", "purple eyes", "pussy juice", "saliva", "saliva trail", "shiny", "shiny clothes", "shiny hair", "shiny skin", "showgirl skirt", "sidelocks", "skin tight", "skirt", "solo focus", "spread legs", "steam", "steaming body", "symbol-shaped pupils", "taimanin suit", "tattoo", "thighhighs", "tongue", "tongue out", ":3", "character name", "closed mouth", "glasses", "grey background", "hand up", "red eyes", "red-framed eyewear", "semi-rimless eyewear", "signature", "smile", "under-rim eyewear", "upper body", "black footwear", "black thighhighs", "boots", "fang", "fang out", "full body", "high heel boots", "high heels", "invisible chair", "sitting", "star (symbol)", "collarbone", "cowgirl position", "female pubic hair", "nipple tweak", "pov", "pubic hair", "sex", "straddling", "ass visible through thighs", "black panties", "cowboy shot", "groin", "navel", "panties", "panty pull", "topless", "underwear", "underwear only", ":d", "gradient", "gradient background", "pink background", "2girls", "after kiss", "bow", "brown eyes", "grey hair", "hair bow", "kiss", "multiple girls", "sweatdrop", "yellow eyes", "yuri", "beret", "black coat", "black headwear", "black pantyhose", "blue bow", "bowtie", "clothes lift", "coat", "cosplay", "dress lift", "frilled dress", "frills", "hat", "hood", "hood down", "hooded coat", "kneeling", "long sleeves", "open clothes", "open coat", "pantyhose", "parted lips", "purple coat", "purple headwear", "school uniform", "st. 
theresa's girls academy school uniform", "white dress", "vest", "couch", "dutch angle", "feet out of frame", "hand on own knee", "head tilt", "indoors", "red panties", "window", "black shirt", "half-closed eyes", "pleated skirt", "seductive smile", "thighs", "book", "floating", "floating book", "floating object", "index finger raised", "open book", "pleated dress", "twitter username", "bat (animal)", "full moon", "moon", "night", "outdoors", "red sky", "sky", "tree", "^^^", "ahegao", "ass", "bent over", "bodysuit", "crying", "elbow gloves", "implied sex", "tears", "trembling", "camisole", "food", "hot", "melting", "popsicle", "sexually suggestive", "table", "bare arms", "black skirt", "shirt", "loli", "no panties", "pussy", "skirt lift", "arms behind back", "from behind", "fur trim", "looking back", "microskirt", "standing", ";)", "nail polish", "one eye closed", "red nails", "red skirt", "zettai ryouiki", ":q", "licking lips", "naughty face", "side-tie panties", "from below", "reaching towards viewer", "hand on hip", "sparkle", "two-tone background", "covering", "covering ass", "sleeveless jacket", "sleeveless shirt", "uncensored", "all fours", "bare legs", "barefoot", "fangs", "multiple views", "string panties", ";d", "claw pose", "cropped legs", "from above", "nude", "oral invitation", "seiza", "uvula", "red bow", "brown gloves", "bikini", "black bikini", "black choker", "bridal garter", "choker", "micro bikini", "side-tie bikini bottom", "swimsuit", "thigh gap", "bath", "bathing", "bathtub", "completely nude", "convenient censoring", "feet", "flat chest", "hair censor", "partially submerged", "rubber duck", "soles", "toes", "water", "arm support", "black socks", "bottomless", "cleft of venus", "flying sweatdrops", "gym shirt", "gym uniform", "knee up", "puffy short sleeves", "puffy sleeves", "shoes", "short sleeves", "sneakers", "socks", "white footwear", "white shirt", "animal ears", "brown hair", "collar", "earpiece", "fox ears", "hairband", "infection monitor (arknights)", "jacket", "open jacket", "red hairband", "red jacket", "twintails", "two-tone hairband", "white coat", "white jacket", "bracelet", "casual one-piece swimsuit", "fox girl", "fox tail", "hair ribbon", "jewelry", "necklace", "official alternate costume", "one-piece swimsuit", "red one-piece swimsuit", "red ribbon", "ribbon", "ring", "swimsuit cover-up", "tail", "thigh strap", "animal ear fluff", "blurry", "blurry background", "hand on own chest", "large breasts", "mosaic censoring", "no bra", "one breast out", "orange eyes", "pink nails", "see-through", "shirt lift", "stomach", "sweatband", "tank top", "wristband", "alternate costume", "cleavage", "detached collar", "playboy bunny", "red bowtie", "strapless", "strapless leotard", "wrist cuffs", "embarrassed", "kneehighs", "staff", "striped", "3girls", "beach", "blue sky", "cake", "cake slice", "cup", "day", "drink", "drinking glass", "feet up", "flower", "fruit", "leg up", "ocean", "off shoulder", "on stomach", "petals", "plate", "rose", "sandals", "sarong", "standing on one leg", "strawberry", "watermark", "white flower", "white sarong", "ball gag", "bdsm", "blue sailor collar", "blue skirt", "bondage", "bound", "bow panties", "breasts out", "gag", "gagged", "hitachi magic wand", "miniskirt", "rope", "sailor collar", "serafuku", "sex toy", "shibari", "shibari over clothes", "vibrator", "white panties", "white thighhighs", "bag", "black shorts", "clothes around waist", "crop top", "jacket around waist", "midriff", "short shorts", "shorts", "shoulder bag", "sports 
bra", "after sex", "anus", "condom", "condom wrapper", "cum", "cum in pussy", "cum on body", "cum on breasts", "cum on hair", "facial", "hands on own chest", "pointless condom", "black collar", "clothes pull", "unzipped", "m legs", "missionary", "vaginal", "headgear", "upper teeth only", "polearm", "weapon", "black nails", "clothing aside", "panties aside", "reverse suspended congress", "testicles", "belt", "extra ears", "border", "eating", "lemon", "lemon slice", "orange (fruit)", "orange background", "orange slice", "white border", "leaf", "light particles", "plant", "arknights", "between legs", "candy", "chocolate", "grin", "heart hands", "no shoes", "white pantyhose", "toenail polish", "toenails", "yokozuwari", "bar censor", "bound arms", "chain", "lactation", "red rope", "restrained", "sideboob", "red coat", "cloud", "outstretched arms", "areola slip", "underboob", "ejaculation", "girl on top", "grabbing own breast", "motion lines", "overflow", "swimsuit aside", "anal", "looking away", "ass focus", "back", "basketball", "clitoris", "on floor", "presenting", "top-down bottom-up", "wooden floor", "bikini bottom only", "blurry foreground", "covering breasts", "crossed arms", "depth of field", "red bikini", "wading", "wet", "black background", "closed eyes", "facing viewer", "finger to mouth", "shushing", "bandeau", "tube top", "revealing clothes", "single thighhigh", "skindentation", "arm up", "spaghetti strap", "from side", "black ribbon", "black wristband", "hand between legs", "neck ribbon", "off-shoulder dress", "wariza", "covering mouth", "spoken blush", "criss-cross halter", "detached sleeves", "hair rings", "halterneck", "horns", "ice cream", "red hair", "star hair ornament", "portrait", "classroom", "collared shirt", "desk", "envelope", "leaning forward", "letter", "love letter", "after vaginal", "cum string", "cumdrip", "dark skin", "dark-skinned male", "erection", "interracial", "anal object insertion", "backless leotard", "bandaid", "bandaids on nipples", "object insertion", "partially visible vulva", "pasties", "wedgie", "black border", "chibi", "hands on own cheeks", "hands on own face", "knees up", "box", "gift", "gift box", "valentine", "flashing", "cellphone", "naked coat", "naked jacket", "phone", "selfie", "smartphone", "covering face", "heart-shaped box", "belt pouch", "between breasts", "cum in mouth", "underbust", "white belt", "lamp", "beanstalk (arknights)", "foot out of frame", "green dress", "green hairband", "green nails", "off-shoulder sweater", "sleeves past wrists", "sweater", "white sweater", "bell", "braid", "cardboard box", "christmas", "in box", "in container", "neck bell", "single braid", "white socks", "frilled collar", "green ribbon", "multicolored nails", "candy cane", "hair intakes", "hair over shoulder", "ankle boots", "brown shorts", "crab", "creature", "crossbow", "grass", "hanging breasts", "rock", "shirt removed", "shorts pull", "torn clothes", "torn shirt", "arm grab", "green hair", "sex from behind", "standing sex", "clothed female nude male", "open shirt", "striped shirt", "vertical stripes", "vertical-striped shirt", "ears through headwear", "femdom", "multicolored hair", "white collar", "ribbon braid", "streaked hair", "yellow flower", "bra", "cropped sweater", "sweater lift", "animal", "strap pull", "foot focus", "foreshortening", "ribbon in mouth", ":p", "pussy juice trail", "sketch", "speech bubble", "spoken heart", "squatting", "disembodied limb", "headpat", "brown footwear", "breast grab", "grabbing", "paizuri", "veins", "veiny penis", 
"single sock", "green skirt", "christmas tree", "starry sky", "braided ponytail", "clothed sex", "panties around one leg", "sleepy", "waking up", "hand on own stomach", "bestiality", "doggystyle", "monster", "green shirt", "overalls", "suspender shorts", "suspenders", "undressing", "blue eyes", "blue jacket", "clothes removed", "hood up", "hooded jacket", "low twintails", "on side", "pink hair", "spooning", "ahoge", "buttons", "enmaided", "maid", "maid headdress", "blue nails", "bottle", "earrings", "pink choker", "twin braids", "water bottle", "fork", "legs", "shadow", "spoon", "feather hair", "owl ears", "white hair", "bespectacled", "black-framed eyewear", "doughnut", "elite ii (arknights)", "heart censor", "pouch", "split", "standing split", "scales", "o-ring choker", "sitting on table", "thighband pantyhose", "purple jacket", "black jacket", "chibi inset", "rabbit ears", "bright pupils", "dressing", "dress shirt", "hoodie", "syringe", "tearing up", "black pants", "blue shorts", "leggings", "legwear under shorts", "o-ring", "pants", "dolphin shorts", "grey shorts", "white hairband", "jacket removed", "...", "pink hairband", "spoken ellipsis", "tight", "tight pants", "yoga pants", "bisexual female", "fellatio", "ffm threesome", "french kiss", "handjob", "medium hair", "oral", "under table", "expressionless", "air bubble", "bubble", "underwater", "flip-flops", "shoe dangle", "single shoe", "white sports bra", "own hands together", "pouring", "pouring onto self", "recording", "v", "viewfinder", "blue footwear", "grey jacket", "female ejaculation", "rolling eyes", "dakimakura (medium)", "character doll", "stuffed toy", "zoom layer", "pants pull", "towel", "artist name", "rain", "cat", "frog", "arm behind back", "cardigan (arknights)", "cat ears", "cat girl", "dog ears", "dog girl", "goggles", "goggles on head", "necktie", "oripathy lesion (arknights)", "purple hair", "black sleeves", "black vest", "blue necktie", "outside border", "white gloves", "yellow background", "afterimage", "ball", "dog tail", "tail wagging", "tennis ball", "blue bowtie", "cropped jacket", "snowflake print", "snowflakes", "striped thighhighs", "white headwear", "light brown hair", "pantyhose pull", "ponytail", "knee pads", "red footwear", "shield", ";o", "dark-skinned female", "goat ears", "goat girl", "goat horns", "rubbing eyes", "blue shirt", "crop top overhang", "scarf", "weapon on back", "notice lines", "v-shaped eyebrows", "bread", "bug", "butterfly", "orange neckerchief", "paper bag", "shopping bag", "sunflower", "clothing cutout", "shoulder cutout", "black hair", "grabbing from behind", "open bra", "purple bra", "two-tone hair", "unfastened", "used condom", "used tissue", "blue panties", "dildo", "female masturbation", "hand to own mouth", "masturbation", "moaning", "no pants", "vaginal object insertion", "panties under pantyhose", "strap slip", "capelet", "split mouth", "salute", "neckerchief", "fur-trimmed capelet", "short hair", "white sleeves", "mountain", "red scarf", "snow", "bound legs", "crotch rope", "multicolored clothes", "pom pom (clothes)", "suspension", "toeless legwear", "legs up", "on couch", "naked towel", "pov hands", "skin fang", "wavy mouth", "antenna hair", "bikini pull", "bikini top only", "hair flower", "hibiscus", "imminent penetration", "off-shoulder bikini", "pink bikini", "pink flower", "sheep ears", "sheep girl", "sheep horns", "torso grab", "plaid", "plaid bikini", "cape", "ear protection", "mask", "mask around neck", "naked ribbon", "respirator", "white cape", "beach towel", 
"wedding ring", "brown thighhighs", "condom in mouth", "heart in eye", "torn thighhighs", "sunset", "shallow water", "lollipop", "frilled bikini", "gradient hair", "against wall", "interlocked fingers", "drinking straw", "glass", "ice", "ice cube", "bandages", "out of frame", "steepled fingers", "fur-trimmed dress", "cropped torso", "fire", "floating hair", "hand in own hair", "innertube", "palm tree", "sand", "water drop", "cat tail", "denim", "denim shorts", "eyewear on head", "sheep", "sunglasses", "watermelon", "white bikini", "white bow", "arms behind head", "bikini lift", "bikini bottom aside", "crying with eyes open", "anklet", "cherry blossoms", "hanami", "starfish", "curled horns", "pantyshot", "purple dress", "blue ribbon", "night sky", "onsen", "spread pussy", "looking to the side", "blue background", "brown background", "grey dress", "bandaged hand", "black hairband", "brushing hair", "chair", "comb", "floppy ears", "hair brush", "hairdressing", "mirror", "purple bow", "scissors", "side braid", "two-tone jacket", "black one-piece swimsuit", "gaping", "urethra", "bonnet", "goat", "striped bow", "curtains", "outstretched arm", "red choker", "ribbon choker", "pink socks", "saucer", "sleepwear", "teacup", "white shorts", "headphones", "object hug", "pillow hug", "the pose", "apron", "brown dress", "frilled apron", "garter straps", "heart of string", "parfait", "tray", "waist apron", "white apron", "yellow bow", "@_@", "empty eyes", "high-waist skirt", "id card", "imminent vaginal", "just the tip", "lanyard", "multicolored jacket", "pink jacket", "print bow", "pink coat", "black bow", "beachball", "sandals removed", "straw hat", "umbrella", "brown headwear", "hammock", "sun hat", "feathered wings", "green bow", "green sweater", "red gloves", "red shirt", "red sweater", "tail bow", "tail ornament", "wings", "heart-shaped eyewear", "hugging own legs", "tinted eyewear", "upskirt", "pink footwear", "clothes writing", "looking down", "maid apron", "stuffed animal", "teddy bear", "electricity", "puffy long sleeves", "tail raised", "yarn", "yarn ball", ">_<", "stuffed bunny", "white skirt", "looking at animal", "on lap", "petting", "?", "looking up", "sign", "fur-trimmed sleeves", "instrument", "trumpet", "animal hat", "bendy straw", "alternate hairstyle", "black bra", "cocktail glass", "hair down", "potted plant", "under covers", "bound wrists", "leash", "on ground", "rape", "clitoral stimulation", "fingering", "naked shirt", "dildo riding", "white wings", "lightning", "vertical-striped thighhighs", "happy sex", "snowman", "wide sleeves", "bear ears", "blue headwear", "food on face", "food-themed hair ornament", "gummy (arknights)", "ice cream cone", "sailor dress", "brown jacket", "candy hair ornament", "white neckerchief", "white sailor collar", "bear girl", "mini hat", "double bun", "headphones around neck", "blue dress", "frying pan", "orange nails", "checkerboard cookie", "cookie", "mug", "orange pantyhose", ":t", "pout", "candy wrapper", "diagonal stripes", "hair bun", "pumpkin hair ornament", "starry background", "cum on ass", "cum on feet", "red pantyhose", "torn pantyhose", "confetti", "flag", "monocle", "pennant", "see-through dress", "see-through skirt", "string of flags", "sailor hat", "bear tail", "blowing kiss", "twisted torso", "after fellatio", "cum on clothes", "cum on penis", "licking", "licking penis", ";q", "finger to own chin", "rabbit tail", "animal print", "back-print panties", "bear print", "nearly naked apron", "pink panties", "print panties", "bear", "traditional 
media", "baguette", "^_^", "cloudy sky", "sunlight", "beach umbrella", "bikini skirt", "frilled sleeves", "pink dress", "checkered background", "short dress", "deepthroat", "naked apron", "purple background", "dated", "balloon", "pov crotch", "fence", "orange thighhighs", "striped panties", "cream", ":<", "waving", "red thighhighs", "happy", "blue one-piece swimsuit", "hose", "wading pool", "wet clothes", "cream on face", "red background", "anal beads", "anal tail", "cooperative fellatio", "green eyes", "group sex", "mini top hat", "multiple tails", "shoes removed", "teamwork", "threesome", "white rose", "bikini under clothes", "blue bikini", "lens flare", "light rays", "sunbeam", "sundae", "wrist ribbon", "harmonie (arknights)", "lingerie", "tail through clothes", "green tail", "plaid skirt", "blunt bangs", "cardigan", "furry", "furry female", "grey cardigan", "slit pupils", "argyle", "argyle legwear", "jacket on shoulders", "wand", "black necktie", "faceless", "faceless male", "projectile cum", "sitting on lap", "sitting on person", "thigh sex", "open cardigan", "bow legwear", "frilled panties", "lace trim", "black cape", "pelvic curtain", "arm behind head", "lace-trimmed legwear", "aqua hair", "colored pubic hair", "tail ring", "bag charm", "charm (object)", "grey socks", "light green hair", "brown skirt", "footjob", "two-footed footjob", "jingle bell", "grey shirt", "partially unbuttoned", "tail censor", "purple panties", "cloak", "skin fangs", "guided breast grab", "guiding hand", "hands on hips", "kneepits", "crotchless", "crotchless panties", "micro panties", "card", "black belt", "bird tail", "head wings", "ho'olheyak (arknights)", "irrumatio", "monster girl", "arm under breasts", "breast press", "thigh grab", "aqua nails", "hair over one eye", "long fingernails", "one eye covered", "aqua eyes", "bookshelf", "blue hair", "necktie between breasts", "out-of-frame censoring", "colored tongue", "forked tongue", "hand on own cheek", "hand on own face", "snake tail", "hug", "grey skirt", "head rest", "huge breasts", "blue tongue", "blue flower", "blue rose", "fur-trimmed headwear", "red headwear", "ribbon bondage", "santa hat", "hands on feet", "demon girl", "demon horns", "demon tail", "pointy ears", "red necktie", "spear", "trident", "cleavage cutout", "eyes visible through hair", "shrug (clothing)", "ear piercing", "black bowtie", "cable", "microphone", "mole", "mole under eye", "pinecone (arknights)", "two-tone bow", "aran sweater", "brown sweater", "cable knit", "grey eyes", "orange hair", "turtleneck", "turtleneck sweater", "brown bow", "hand on own chin", "orange bow", "orange dress", "red dress", "double-breasted", "feather hair ornament", "feathers", "furry male", "muscular", "muscular male", "tiger ears", "emphasis lines", "furry with non-furry", "interspecies", "prone bone", "sheet grab", "size difference", "wide-eyed", "stool", "suspender skirt", "biting", "lip biting", "orange jacket", "orange skirt", "black corset", "epaulettes", "asymmetrical legwear", "bokeh", "frilled skirt", "knee boots", "layered skirt", "spotlight", "stage lights", "doll hug", "yellow sweater", "bird", "red socks", "single kneehigh", "inverted nipples", "head on pillow", "breasts squeezed together", "cum on tongue", "sound effects", "mismatched legwear", "horse ears", "horse girl", "penis on face", "horse tail", "thigh boots", "halftone", "halftone background", "dragon girl", "dragon horns", "dragon tail", "girl sandwich", "laurel crown", "sandwiched", "thighlet", "high collar", "juliet sleeves", 
"casual", "frilled shirt", "horizon", "off-shoulder shirt", "chromatic aberration", "english text", "retro artstyle", "subtitled", "backlighting", "blue neckerchief", "hair tie in mouth", "midriff peek", "tying hair", "breast sucking", "arrow (projectile)", "quiver", "bow (weapon)", "one knee", "anger vein", "korean text", "ear blush", "!?", "cum in nose", "greyscale", "hypnosis", "mind control", "monochrome", "spoken question mark", "hands in hair", "chess piece", "rook (chess)", "open fly", "handbag", "bird on hand", "soaking feet", "nontraditional playboy bunny", "white bowtie", "white leotard", "squiggle", "spoken squiggle", "shaded face", "sash", "artificial vagina", "palm leaf", "goggles around neck", "pointy hair", "ptilopsis (arknights)", "armband", "grey gloves", "broom", "bucket", "red flower", "purple shirt", "bouncing breasts", "twitching", "grey coat", "white bra", "2boys", "abs", "buttjob", "multiple boys", "orange sweater", "mmf threesome", "coffee", "coffee mug", "spill", "tea", "torn dress", "chinese text", "clothed male nude female", "paper", "penis grab", "stealth sex", "animal on shoulder", "bird on shoulder", "feather trim", "owl", "animal on head", "bird on head", "on head", "curtain grab", "legs apart", "multiple penises", "exhibitionism", "public indecency", "pussy juice puddle", "slingshot swimsuit", "tile floor", "white pupils", "branch", "stirrup legwear", "arm strap", "orange necktie", "fur-trimmed boots", "heterochromia", "rosa (arknights)", "santa costume", "colored eyelashes", "pillow grab", "beltbra", "leg ribbon", "tail between legs", "tail ribbon", "bandaid on pussy", "hooded cloak", "comic", "rabbit girl", "crossover", "take your pick", "coat on shoulders", "birdcage", "cage", "sundress", "pussy peek", "untied", "untied panties", "low ponytail", "front-tie bikini top", "front-tie top", "lowleg", "blue leotard", "sleeping", "blue gloves", "green background", "against glass", "naked cape", "scar", "no pussy", "red collar", "mary janes", "shoe soles", "bench", "food in mouth", "on bench", "black cloak", "convenient leg", "head wreath", "backpack", "upside-down", "broken", "broken chain", "cuffs", "metal collar", "ruins", "shackles", "tatami", "bound ankles", "orgasm", "indian style", "ambiguous gender", "book hug", "female orgasm", "japanese clothes", "kimono", "scene (arknights)", "obi", "oil-paper umbrella", "camera", "one side up", "wet shirt", "red umbrella", "white kimono", "keyboard (computer)", "monitor", "office chair", "tissue box", "brown vest", "brown bag", "side ponytail", "white hoodie", "drone", "robot", "handheld game console", "nintendo switch", "reverse upright straddle", "wince", "fireworks", "floral print", "grey kimono", "print kimono", "leg grab", "male pubic hair", "ground vehicle", "motor vehicle", "tank", "garter belt", "cross-section", "cum on stomach", "uterus", "pussy juice stain", "wet panties", "controller", "crescent", "crescent moon", "game console", "game controller", "television", "vase", "dress bow", "footwear bow", "frown", "knife", "shamare (arknights)", "apple", "backless outfit", "skull hair ornament", "stuffed dog", "boy on top", "panties removed", "deep penetration", "internal cumshot", "butterfly on hand", "diamond-shaped pupils", "red apple", "burnt clothes", "pink skirt", "purple vest", "red neckerchief", "weibo username", "building", "city", "cityscape", "blue hairband", "kitsune", "stretching", "!", "skull print", "needle", "sewing", "costume switch", "purple skirt", "glaring", "armchair", "candle", "pink ribbon", 
"torogao", "book stack", "no tail", "remote control vibrator", "blood", "pointless censoring", "skyfire (arknight)", "brown pantyhose", "crotch seam", "sleeves past fingers", "clipboard", "single earring", "jack-o' challenge", "mole on breast", "nipple bar", "lace", "lace-trimmed bra", "lace-trimmed panties", "toothbrush", "belt buckle", "suspended congress", "gradient legwear", "letterboxed", "striped necktie", "have to pee", "name tag", "v arms", "bra pull", "walkie-talkie", "thong", ">:)", "copyright name", "crossed legs", "alcohol", "drunk", "black cat", "wing collar", "china dress", "chinese clothes", "egasumi", "o-ring bottom", "side slit", "wavy hair", "kemonomimi mode", "sora (arknights)", "short twintails", "wolf ears", "wolf tail", "hair bobbles", "pink necktie", "red cape", "jumping", "wolf girl", "yellow nails", "microphone stand", "blue kimono", "green flower", "leaning back", "reclining", "??", "cheek squash", "can", "cooler", "grill", "soda can", "hat bow", "tilted headwear", "school desk", "grand piano", "phonograph", "piano", "sheet music", "wine", "wine glass", "condom packet strip", "6+boys", "bukkake", "gangbang", "cross", "red cross", "energy drink", "ink", "monster energy", "adjusting clothes", "adjusting gloves", "profile", "x-ray", "bandaged arm", "halloween", "halloween costume", "mummy costume", "coca-cola", "x hair ornament", "pocky", "double penetration", "spitroast", "spoken sweatdrop", "aged down", "child", "clothes grab", "female child", "collared dress", "ears down", "bouquet", "white cloak", "fetal position", "burger", "highleg bikini", "blood on face", "red capelet", "santa bikini", ";p", "inflatable toy", "pink kimono", "tabi", "frilled hairband", "open kimono", "animal hug", "cat on head", "wind lift", "ankle cuffs", "kyuubi", "tape", "ankle grab", "meat", "on table", "steak", "imminent rape", "tentacles", "lily of the valley", "single glove", "white capelet", "thick eyebrows", "butt plug", "egg vibrator", "gohei", "on grass", "single wrist cuff", "covered mouth", "first aid kit", "nurse cap", "leg lift", "legwear garter", "pee", "peeing", "candy apple", "navel cutout", "cunnilingus", "sitting on face", "maple leaf", "covering own eyes", "knees together feet apart", "yellow cardigan", "falling petals", "ribbon trim", "ribbon-trimmed sleeves", "torii", "stomach bulge", "enpera", "plaid scarf", "snowing", "white scarf", "winter", "winter clothes", "basket", "fox shadow puppet", "pointing", "phone screen", "crocodilian tail", "large tail", "star (sky)", "white one-piece swimsuit", "seashell", "shell", "short hair with long locks", "purple flower", "thick thighs", "dress pull", "anus peek", "ass grab", "forest", "nature", "spoken character", "hat flower", "red rose", "scar across eye", "scar on face", "top hat", "vampire", "vial", "warfarin (arknights)", "white skin", "blood bag", "colored skin", "cross hair ornament", "crown braid", "pale skin", "tiptoes", "blood on hands", "sleep molestation", "untied bikini", "black kimono", "6+girls", "on chair", "sharp teeth", "black sweater", "ribbed sweater", "close-up", "eyelashes", "breasts on glass", "mixed bathing", "shower (place)", "showering", "black capelet", "cross earrings", "hand on headwear", "sack", "heart hands duo", "full-face blush", "ascot", "musical note", "pink shirt", "blood from mouth", "closed umbrella", "avatar icon", "black camisole", "pig", "pink vest", "crown", "mini crown", "bird ears", "clock", "diamond (shape)", "butterfly print", "skull earrings", "purple bowtie", "bone hair ornament", 
"white ribbon", "frilled choker", "eye focus", "lolita hairband", "bandaid on face", "heart hair ornament", "butterfly hair ornament", "plaid bow", "plaid bowtie", "heart earrings", "heart o-ring", "hair bell", "angel", "angel wings", "mini wings", "bob cut", "monocle hair ornament", "animal hood", "round eyewear", "shark hood", "bear hair ornament", "wing hair ornament", "blue choker", "frilled sailor collar", "hands on headwear", "kindergarten uniform", "school hat", "yellow bowtie", "yellow headwear", "black horns", "hair on horn", "halo", "head scarf", "thought bubble", "industrial piercing", "black sailor collar", "chestnut mouth", "rabbit hair ornament", "grey sailor collar", "hat ribbon", "mob cap", "brooch", "hair scrunchie", "pink theme", "scrunchie", "center frills", "pink apron", "pink headwear", "2021", "chinese zodiac", "cow ears", "cow girl", "cow horns", "cow print", "cowbell", "drinking", "ear tag", "juice box", "milk", "milk carton", "year of the ox", "carrot hair ornament", "bun cover", "bat hair ornament", "black scarf", "blue collar", "pink collar", "lolita fashion", "skull", "+_+", "tassel", "fake animal ears", "chick", "low twin braids", "albacore (azur lane)", "azur lane", "torpedo", "star in eye", "star-shaped pupils", "symbol in eye", "flag print", "striped bikini", "pink thighhighs", "tiara", "veil", "spread anus", "lotion", "sunscreen", "pearl necklace", "blush stickers", "fake tail", "\\m/", "green belt", "green vest", "headset", "idol", "open vest", "uneven legwear", "cum in ass", "full nelson", "frilled thighhighs", "wine bottle", "butt crack", "strapless dress", "bikini bottom removed", "double v", "v over eye", "afloat", "caustics", "high ponytail", "pool", "snorkel", "white choker", "arm warmers", "livestream", "nurse", "pink bra", "spoken musical note", "animal hands", "facial mark", "facial tattoo", "musical note hair ornament", "paw gloves", "buckle", "fur-trimmed gloves", "fur-trimmed skirt", "highleg panties", "pointing at viewer", "cat paws", "paw shoes", "pet play", "fur collar", "yellow skirt", "hands on own ass", "heart ahoge", "huge ahoge", "spread ass", "mittens", "legs together", "finger in own mouth", "spilling", "teapot", "half gloves", "single hair bun", "skirt removed", "microdress", "sideless outfit", "cum pool", "nipple slip", "cat hair ornament", "retrofit (azur lane)", "animal ear headphones", "cat ear headphones", "american flag", "grey footwear", "american flag print", "frilled pillow", "aiming at viewer", "covering nipples", "water gun", "macaron", "pancake", "rudder footwear", "locker", "partially underwater shot", "transparent", "print necktie", "fat man", "military", "military uniform", "sleeve grab", "heart pillow", "rose petals", "vibrator under clothes", "vibrator under panties", "seagull", "sun", "bow bra", "bra lift", "fishnet gloves", "lantern", "tail bell", "shoulder blades", "polka dot", "polka dot background", "bunching hair", "bridal veil", "church", "wedding dress", "adjusting swimsuit", "fisheye", "lifebuoy", "star print", "tan", "turret", "checkered floor", "bride", "happy tears", "ornate ring", "wedding", "tsurime", "purple bikini", "fat", "untying", "3boys", "reverse cowgirl position", "patreon username", "web address", "covering crotch", "handcuffs", "bare back", "body writing", "rigging", "belt collar", "smug", "mismatched bikini", "legs over head", "groping", "raised eyebrows", "bridal gauntlets", "blue buruma", "buruma", "white camisole", "cupcake", "kabedon", "sweater vest", "wide ponytail", "against tree", 
"dappled sunlight", "frilled shorts", "marker (medium)", "millipen (medium)", "purple ribbon", "sleep mask", "adjusting hair", "blur censor", "motion blur", "hands on own knees", "hands on own thighs", "brick wall", "wall", "blue umbrella", "picnic basket", "covering one eye", "locked arms", "purple thighhighs", "bathroom", "shower head", "tile wall", "tiles", "anchor", "cannon", "object namesake", "torpedo tubes", "clothes in mouth", "pajamas", "shirt in mouth", "purple rose", "alternate hair length", "lowleg bikini", "pov doorway", "eighth note", "pink rose", "carrying", "naval uniform", "princess carry", "uniform", "fucked silly", "splashing", "light purple hair", "blue socks", "big belly", "pregnant", "pink neckerchief", "sideways glance", "loungewear", "contrapposto", "multiple condoms", "ankle ribbon", "pink pajamas", "polka dot shirt", "glowing", "glowing eyes", "single strap", "beamed eighth notes", "staff (music)", "door", "open door", "downblouse", "head out of frame", "socks removed", "triangle mouth", "bedroom", "hand fan", "hanfu", "in tree", "paper fan", "sitting in tree", "uchiwa", "slippers", "thighhighs pull", "facing away", "hand on own ass", "albino", "christmas lights", "christmas ornaments", "leg hair", "asymmetrical docking", "cheek-to-cheek", "symmetrical docking", "backless dress", "sakazuki", "sake", "cola", "sake bottle", "soda bottle", "bunny print", "peaked cap", "yellow shirt", "snowflake hair ornament", "glowstick", "fur-trimmed jacket", "speaker", "forehead", "sims (azur lane)", "thighhighs under boots", "adjusting eyewear", "looking over eyewear", "striped bra", "gloved handjob", "pinky out", "bat wings", "black wings", "brown wings", "naked bandage", "sarashi", "spread pussy under clothes", "penis out", "heart background", "on desk", "sitting on desk", ";3", "happy new year", "kagami mochi", "mandarin orange", "new year", "red kimono", "short kimono", "fake facial hair", "fake mustache", "nude cover", "underboob cutout", "anchor choker", "lace-trimmed hairband", "two-tone dress", "maid bikini", "chandelier", "dessert", "gothic lolita", "tiered tray", "two-tone ribbon", "o-ring top", "white scrunchie", "wrist scrunchie", "breast curtains", "see-through sleeves", "foot up", "lace panties", "demon wings", "fake horns", "jack-o'-lantern", "pumpkin", "grabbing own ass", "parasol", "white umbrella", "yukikaze (azur lane)", "arm ribbon", "skirt pull", "swing", "string", "string of fate", "white theme", "kimono lift", "uchikake", "ribbon-trimmed legwear", "oppai loli", "squatting cowgirl position", "bailu", "crossed bangs", "coral", "fish", "antlers", "4boys", "outstretched hand", "after rape", "gourd", "blue sleeves", "bow hairband", "colored inner hair", "dual persona", "long skirt", "blue serafuku", "chain-link fence", "poolside", "black serafuku", "fingersmile", "whale", "light blue hair", "school chair", "see-through shirt", "upshirt", "clothed masturbation", "crotch rub", "masturbation through clothes", "open skirt", "presenting armpit", "string bikini", "no humans", "rabbit", "randoseru", "speed lines", "pom pom (cheerleading)", "blue archive", "crazy straw", "birthday cake", "waves", "scenery", "disembodied penis", ":>", "belly grab", "pinching", "=3", "happy birthday", "spoken anger vein", "panty peek", "excessive cum", "cum on hands", "trefoil", "cat cutout", "cat lingerie", "frilled bra", "meme attire", "head grab", "wind", "looking at penis", "male masturbation", "penis awe", "brown belt", "swimsuit under clothes", "bunny pose", "gun", "rifle", "red 
bag", "black eyes", "chalkboard", "pokemon (creature)", "running", "leaning on person", "flower field", "school swimsuit", "child on child", "tribadism", "sleeping on person", "wardrobe malfunction", "eye contact", "frilled swimsuit", "heads together", "arrow (symbol)", "hydrangea", "accidental exposure", "sparkling eyes", "milestone celebration", "surprised", "reflection", "reflective floor", "reflective water", "skyscraper", "hug from behind", "round teeth", "skirt tug", "thumbs up", "omelet", "omurice", "peeing self", "puddle", "fuuka (blue archive)", "hair over breasts", "kitchen", "ladle", "orange headwear", "yukata", "cooking", "bento", "bowl", "chopsticks", "incoming food", "rice", "tamagoyaki", "wagashi", "fur-trimmed kimono", "formal", "suit", "buruma pull", "eyeshadow", "makeup", "pom pom hair ornament", "fishnet thighhighs", "pink scarf", "kunai", "loafers", "ninja", "visor cap", "cutoffs", "double fox shadow puppet", "ramune", "micro shorts", "sunflower hair ornament", "erection under clothes", "shorts around one leg", "falling leaves", "breast rest", "playing card", "red sailor collar", "single side bun", "fox mask", "pink eyeshadow", "holster", "shotgun", "submachine gun", "thigh holster", "red eyeshadow", "buruma aside", "bead bracelet", "beads", "+++", "red buruma", "dog", "fox", "architecture", "east asian architecture", "white serafuku", "handstand", "puckered anus", "after anal", "assault rifle", "battle rifle", "playing games", "polka dot panties", "strawberry print", "toilet", "toilet use", "polka dot bra", "training bra", "siblings", "sisters", "twins", "cheerleader", "polka dot bikini", "spoken interrobang", "animal ear headwear", "habit", "nun", "own hands clasped", "strawberry shortcake", "utensil in mouth", "oni", "oni horns", "white cardigan", "shuro (blue archive)", "ear biting", "hair in mouth", "bandaged leg", "sleeveless kimono", "character censor", "novelty censor", "bandaid on cheek", "clothes down", "maebari", "ofuda", "bandaged neck", "katana", "sheath", "sheathed", "sword", "bandaid on leg", "geta", "multi-strapped bikini", "arm garter", "heart pasties", "pink bowtie", "animal ear legwear", "cat ear legwear", "off-shoulder jacket", "red wings", "striped background", "headdress", "bulge", "crossdressing", "double horizontal stripe", "gradient eyes", "male focus", "multicolored eyes", "o_o", "otoko no ko", "pink cardigan", "pink sweater", "pocket", "brown kimono", "paper lantern", "yellow kimono", "brown panties", "deer ears", "fake antlers", "reindeer antlers", "heart cutout", "nightgown", "cow tail", "cover", "cover page", "doujin cover", "bow bikini", "grey bikini", "grey bow", "vertical-striped bikini", "undershirt", "bamboo steamer", "waitress", "bloomers", "frilled socks", "blue bra", "yellow rose", "fleur de lapin uniform", "heart-shaped chocolate", "blueberry", "string bra", "striped dress", "drill hair", "jirai kei", "light censor", "duster", "wa lolita", "food print", "print dress", "brown shirt", "plaid dress", "ringlets", "twin drills", "cat hood", "crescent hair ornament", "beach chair", "lounge chair", "mimikaki", "mask on head", "bicycle", "swept bangs", "hat removed", "headwear removed", "bridge", "frilled kimono", "railing", "unbuttoned", "computer", "mouse (computer)", "rabbit hood", "quill", "cane", "cross-laced footwear", "dango", "hakama", "hakama skirt", "red hakama", "sanshoku dango", "blue cardigan", "ribbed legwear", "striped socks", "upshorts", "bra visible through clothes", "cherry", "ferris wheel", "heart balloon", "lamppost", 
"blue thighhighs", "animal bag", "striped skirt", "vertical-striped skirt", "yellow panties", "food on body", "panda ears", "gift bag", "black cardigan", "school bag", "magical girl", "pink sailor collar", "arthropod girl", "flower wreath", "taur", "yellow ribbon", "white ascot", "bag of chips", "chips (food)", "potato chips", "horned headwear", "layered sleeves", "short over long sleeves", "striped sleeves", "corded phone", "loose socks", "yellow necktie", "red hoodie", "latin cross", "purple nails", "yellow neckerchief", "soap bubbles", "soap censor", "sponge", "witch hat", "pinching sleeves", "cherry hair ornament", "grey-framed eyewear", "white-framed eyewear", "cosmetics", "hand mirror", "lipstick", "lipstick tube", "intravenous drip", "frilled bow", "frilled shirt collar", "hair tubes", "nontraditional miko", "red vest", "skirt set", "stick", "yellow ascot", "striped jacket", "blazer", "skirt around one leg", "vertical-striped panties", "sailor bikini", "shell hair ornament", "pastry bag", "floral background", "shawl", "penguin", "polka dot skirt", "stuffed penguin", "bare tree", "blue capelet", "heart necklace", "plaid panties", "grey pants", "petals on liquid", "easter", "easter egg", "egg", "mop", "taiyaki", "braiding hair", "fur-trimmed hood", "hooded capelet", "aerial fireworks", "lace-up boots", "pink capelet", "cross choker", "yellow dress", "bunny hat", "pink umbrella", "autumn leaves", "hakama short skirt", "miko", "dress removed", "panda", "peach", "spring (season)", "naked kimono", "round window", "cheese", "french fries", "ketchup", "lettuce", "sandwich", "tomato", "blue apron", "eyepatch", "medical eyepatch", "white cat", "see-through silhouette", "striped ribbon", "crepe", "yellow scrunchie", "fringe trim", "one-piece swimsuit pull", "tanlines", "purple kimono", "yagasuri", "pink-framed eyewear", "short eyebrows", "two tails", "wet dress", "purple socks", "blue sweater", "picture frame", "wa maid", "blue wings", "carrot", "snowball", "badge", "print bikini", "disposable cup", "shaved ice", "stethoscope", "bicycle basket", "blue vest", "riding", "against railing", "public nudity", "remote control", "vibrator cord", "elf", "office lady", "pen", "pencil", "pencil skirt", "button eyes", "red mittens", "brown pants", "frilled leotard", "fur-trimmed legwear", "merry christmas", "red leotard", "santa gloves", ":x", "green sleeves", "hands on lap", "minigirl", "onigiri", "crime prevention buzzer", "low wings", "!!", "0_0", "spoken exclamation mark", "horn bow", "horn ornament", "brown apron", "pudding", "cross-shaped pupils", "in water", "vines", "black flower", "colored tips", "tiger tail", "brown scarf", "digital dissolve", "mouth drool", "fur-trimmed cape", "pocket watch", "roman numeral", "wall clock", "yellow footwear", "polka dot scrunchie", "sliding doors", "veranda", "tokkuri", "bra strap", "single bare shoulder", "winged hat", "climbing", "pole", "rolling suitcase", "suitcase", "red cloak", "fur-trimmed shorts", "red shorts", "pacifier", "recorder", "cabbie hat", "clover print", "hat feather", "hat ornament", "pink sleeves", "pink shorts", "pendant watch", "tareme", "watch", "purple sleeves", "messy hair", "print shirt", "t-shirt", "yawning", "high-waist shorts", "licking finger", "puffy shorts", "whisk", "cupless bra", "uwabaki", "bandaid on nose", "eye mask", "purple leotard", "stuffed winged unicorn", "age difference", "hand on own thigh", "panty lift", "pocky day", "pocky kiss", "animal costume", "pink gloves", "orange bikini", "yellow bikini", "drawstring", 
"striped shorts", "pastel colors", "purple footwear", "rainbow", "swirl lollipop", "pet bowl", "mouth pull", "open hoodie", "over-kneehighs", "heart tattoo", "witch", "aqua necktie", "tie clip", "knees to chest", "blue coat", "fur-trimmed coat", "print skirt", "bush", "denim skirt", "soft serve", "strap between breasts", "strawberry hair ornament", "thermometer", "competition school swimsuit", "eraser", "orange shirt", "pink hoodie", "angel and devil", "halloween bucket", "pancake stack", "purple choker", "cervix", "beans", "leg warmers", "setsubun", "tiger print", "leg lock", "cheek pinching", "cheek pull", "stuffed cat", "pointer", "yellow hairband", "ankle socks", "yellow socks", "pink camisole", "face-to-face", "o-ring bikini", "orange scarf", "shared clothes", "shared scarf", "corset", "french braid", "frilled gloves", "bandaid on knee", "page number", "blanket", "polka dot legwear", "borrowed character", "omikuji", "oven mitts", "futon", "index fingers together", "athletic leotard", "flexible", "gymnastics", "pink leotard", "breastless clothes", "blue hoodie", "hand in pocket", "tied shirt", "fur-trimmed cloak", "animal collar", "heart collar", "anilingus", "bubble tea", "autumn", "ginkgo leaf", "skirt basket", "green hoodie", "green jacket", "fish hair ornament", "melon", "upright straddle", "waist bow", "animal slippers", "green footwear", "carpet", "kotatsu", "rug", "under kotatsu", "green sailor collar", "suggestive fluid", "yellow scarf", "oversized clothes", "oversized shirt", "anchor symbol", "pendant", "hair twirling", "playing with own hair", "gloves removed", "green choker", "wide spread legs", "yellow gloves", "lion ears", "lion tail", "open-chest sweater", "hair flaps", "hair stick", "union jack", "frilled capelet", "orange gloves", "snow bunny", "brown ribbon", "2018", "akeome", "low-tied long hair", "nengajou", "year of the dog", "cupping hands", "object on head", "39", "school briefcase", "green necktie", "cat mask", "open dress", "four-leaf clover hair ornament", "bra removed", "licking nipple", "banana", "sample watermark", "shirt pull", "yellow bra", "inflation", "shamoji", "zouri", "yellow jacket", "purple sweater", "grey cape", "grey capelet", "grey thighhighs", "petite", "2019", "boar", "year of the pig", "pink one-piece swimsuit", "clara (honkai star rail)", "stairs", "clothes", "spread toes", "hair over eyes", "reaching", "mecha", "large penis", "red bra", "hand over eye", "diaper", "cloth gag", "improvised gag", "polka dot bow", "footprints", "colorfull", "red theme", "bubble skirt", "single leg pantyhose", "star earrings", "ajirogasa", "flower (symbol)", "platform footwear", "shide", "androgynous", "arm cuffs", "back bow", "lace-trimmed dress", "pearl (gemstone)", "side drill", "demon", "demon boy", "heart print", "angora rabbit", "sleeve cuffs", "milk bottle", "sketchbook", "sticker", "crystal", "personification", "purple theme", "tiger", "box of chocolates", "orange footwear", "vertical-striped dress", "vision (genshin impact)", "white bloomers", "bangs pinned back", "marker", "stylus", "drawing tablet", "print legwear", "2020", "mouse", "mouse ears", "mouse girl", "mouse tail", "year of the rat", "layered dress", "copyright", "food-themed earrings", "colorful", "bubble blowing", "chewing gum", "dot nose", "star choker", "multiple hair bows", "heart choker", "clover", "red hood", "earth (planet)", "oversized object", "planet", "bandaid on arm", "blue pantyhose", "blue theme", "food-themed clothes", "striped pantyhose", "darktheme", "back tattoo", "glowing 
weapon", "human scabbard", "mitsudomoe (shape)", "musou isshin (genshin impact)", "tomoe (symbol)", "cigarette", "smoke", "smoking", "spot color", "red pupils", "x-shaped pupils", "city lights", "white pants", "explosive", "dagger", "grey sky", "scar on nose", "broken glass", "shards", "fallenshadow", "finger to cheek", "yandere", "height difference", "looking through legs", "strangling", "pursed lips", "feeding", "halter dress", "blood on knife", "blood on weapon", "charlotte (genshin impact)", "geshin impact", "framed", "assisted exposure", "asymmetrical bangs", "white feathers", "glint", "yellow-framed eyewear", "after paizuri", "arched back", "braided bangs", "gem", "sideways mouth", "topknot", "paw print", "arm held back", "puffy detached sleeves", "barrel", "coin", "cat teaser", "tickling", "eyepatch bikini", "furina (genshin impact)", "blue gemstone", "blue ascot", "throne", "tailcoat", "grinding", "assertive female", "blue butterfly", "glove biting", "santa dress", "kirara (genshin impact)", "bike shorts", "tail grab", "heart tail", ":>=", "bike shorts under skirt", "lap pillow", "shorts under skirt", "shrine", "four-leaf clover", "tally", "adjusting headwear", "spread arms", "^o^", "fertilization", "impregnation", "ovum", "mountainous horizon", "sand castle", "sand sculpture", "backpack removed", "leaf hair ornament", "censored nipples", "sperm cell", "double handjob", "green bowtie", "defloration", "ribs", "reading", "pointing up", "leaf on head", "raccoon ears", "raccoon tail", "sayu (genshin impact)", "arm guards", "shuriken", "green bikini", "toeless footwear", "sigewinne (genshin impact)", "4girls", "5girls", "thinking", "armpit crease", "no headwear", "mouth veil", "stray pubic hair", "yaoyao (genshin impact)", "coin hair ornament", "jiangshi", "qing guanmao", "xd", "raccoon girl", "tanuki", "torn shorts", "peeking out", "chest sarashi", "hanasaki chiyu", "cheek poking", "poking", "board game", "sad", "hand in panties", "pinstripe pattern", "pinstripe suit", "between fingers", "cushion", "chole (hololive)", "hololive", "navel piercing", "caution tape", "seigaiha", "button gap", "handgun", "shirt tucked in", "mask removed", "tachi-e", "blood on clothes", "blood splatter", "cardigan around waist", "sweater around waist", "tassel hair ornament", "black mask", "mouth mask", "furrowed brow", "fubuki (hololive)", "pentagram", "coffee cup", "whispering", "cellphone picture", "taking picture", "key", "blue scrunchie", "bandage over one eye", "corn", "dog tags", "leash pull", "arms between legs", "blue hakama", "alternate breast size", "user interface", "frilled ribbon", "star brooch", "pinstripe shirt", "cropped shirt", "purple scarf", "purple necktie", "triangle earrings", "ear ribbon", "ghost", "brown cardigan", "puckered lips", "uneven twintails", "asphyxiation", "heart in mouth", "3d background", "lace choker", "mixed media", "mask pull", "scarf over mouth", "head back", "disgust", "fireplace", "green bra", "nosebleed", "tanabata", "tanzaku", "chained", "dress tug", "houhou (honkai star rail)", "hands on own head", "2koma", "white eyes", ">_o", "scared", "green shorts", "green theme", "dragon", "duel monster", "spirit", "menu", "flower-shaped pupils", "ghost pose", "kicking", "aqua background", "zzz", "ilya (princess connect!)", "leotard aside", "crossed bandaids", "electric fan", "rabbit house uniform", "layered bikini", "flower pot", "hand on own leg", "diagonal-striped necktie", "apron lift", "lotion bottle", "two-tone shirt", "bikini top removed", "field", "clitoral hood", 
"grey necktie", "plaid necktie", "breast bondage", "frilled cuffs", "checkered kimono", "king (chess)", "melon bread", "track jacket", "brown cape", "brown coat", "wiping tears", "orange ribbon", "fiery horns", "fake screenshot", "orange bowtie", "imminent fellatio", "imagining", "pizza", "pizza slice", "cum on legs", "ribbon-trimmed dress", "kimono pull", "eyewear removed", "bar (place)", "bar stool", "lifting person", "bald", "armlet", "cone hair bun", "green headwear", "fat mons", "green kimono", "green hakama", "=_=", "aged up", "rice on face", "green one-piece swimsuit", "green apron", "green panties", "ugly man", "cheek press", "breast poke", "two-tone bikini", "curly hair", "invisible penis", "orb", "applying makeup", "dancer", "harem outfit", "fellatio gesture", "fishnet pantyhose", "reverse bunnysuit", "reverse outfit", "makaino ririmu nijisanji", "surgical mask", "stained glass", "blue belt", "yes", "yes-no pillow", "2022", "cross necklace", "single boot", "meme", "sailor moon redraw challenge (meme)", "boxers", "male underwear", "leg tattoo", "mimi (princess connect!)", "kinchaku", "w arms", "long braid", "no shirt", "photo (object)", "mochizuki himari", "short necktie", "praying", "hammer", "mallet", "wet swimsuit", "nippleless clothes", "playstation controller", "noel", "ear covers", "giving", "black rose", "official alternate hairstyle", "single ear cover", "red sleeves", "calligraphy brush", "paintbrush", "leotard pull", "headpiece", "nipple rings", "virgin killer sweater", "gyaru", "69", "hands on ass", "orange panties", "straight hair", "armor", "caressing testicles", "pauldrons", "shoulder armor", "lying on person", "sangvis ferri", "incest", "thong bikini", "purple hairband", "harvin", "diadem", "facial hair", "purple gloves", "condom on penis", "yaoi", "nursing handjob", "short ponytail", "aqua panties", "red eyeliner", "blue-framed eyewear", "lips", "mature female", "winged arms", "hair tucking", "sharp fingernails", "whistle", "whistle around neck", "eyeball", "third eye", "huge penis", "shota", "bikini armor", "orgy", "vibrator in thighhighs", "grapes", "implied fellatio", "undercut", "ringed eyes", "dragon wings", "body fur", "brown fur", "furry with furry", "grey fur", "paw print pattern", "romaji text", "snot", "two-tone fur", "uneven eyes", "white fur", "cum in container", "sling bikini top", "after ejaculation", "one-piece tan", "bunny-shaped pupils", "covered eyes", "puffy cheeks", "lion girl", "framed breasts", "hairjob", "tokisadame school uniform", "office", "frog hair ornament", "single sidelock", "snake hair ornament", "fingering through clothes", "through clothes", "laptop", "gokkun", "adjusting legwear", "dress swimsuit", "stomach cutout", "single horn", "squirrel ears", "squirrel girl", "squirrel tail", "soap", "soap bottle", "star halo", "headband", "asymmetrical hair", "official alternate hair length", "single hair intake", "armored dress", "gauntlets", "constricted pupils", "command spell", "flower earrings", "breast tattoo", "red ascot", "shouji", "gusset", "futa with female", "futanari", "kouhaku nawa", "shimenawa", "skin-covered horns", "split-color hair", "5boys", "multicolored wings", "orange ascot", "fish tail", "shark girl", "shark tail", "erune", "full-length zipper", "shark hair ornament", "circlet", "crescent earrings", "sailor senshi uniform", "food on head", "green pantyhose", "sushi", "tempura", "tentacle hair", "stomach tattoo", "blank censor", "helmet", "roswaal mansion maid uniform", "black tank top", "nanamori school uniform", 
"flaccid", "kissing penis", "train interior", "cum in clothes", "half-closed eye", "genderswap", "genderswap (ftm)", "forehead mark", "wooden chair", "onee-loli", "grey bra", "grey headwear", "grey panties", "plaid bra", "plaid headwear", "plaid jacket", "tress ribbon", "tri tails", "plaid shirt", "clothed female nude female", "multi-tied hair", "scowl", "cheek bulge", "scar on cheek", "shoulder tattoo", "bags under eyes", "patreon logo", "blue pants", "dimples of venus", "twitching penis", "bath stool", "two-handed handjob", "forked eyebrows", "ninja mask", "bodystocking", "black pubic hair", "striped bowtie", "triple penetration", "cooperative paizuri", "competition swimsuit", "mismatched pupils", "stud earrings", "multiple horns", "black undershirt", "butter", "plant girl", "syrup", "drugs", "pill", "horseshoe ornament", "sailor shirt", "summer uniform", "tracen school uniform", "fairy", "fairy wings", "striped tail", "tiger girl", "year of the tiger", "orange flower", "checkered sash", "haori", "angry", "pinafore dress", "tutu", "princess", "large insertion", "anal fingering", "feather boa", "spiked collar", "spikes", "spade hair ornament", "imminent anal", "genderswap (mtf)", "beer mug", "futanari masturbation", "bow earrings", "foreskin", "whisker markings", "tail piercing", "chinese knot", "confused", "leotard under clothes", "sportswear", "tennis uniform", "bolt action", "bear costume", "blue horns", "wet hair", "towel around neck", "pole dancing", "stripper pole", "bulletproof vest", "two-tone gloves", "dark blue hair", "joints", "partially unzipped", "dark", "neck ring", "pince-nez", "pink-tinted eyewear", "chimney", "boots removed", "unconscious", "gradient dress", "red horns", "magatama", "forehead jewel", "lowleg panties", "female pov", "stained panties", "cat print", "hitodama", "oni mask", "striped horns", "tomoeda elementary school uniform", "white necktie", "grey vest", "half updo", "lower body", "lalafell", "fish girl", "head fins", "cunny", "hair beads", "freckles", "strapless bikini", "checkered dress", "green cape", "green capelet", "ruler", "blue cape", "bracer", "white tank top", "scroll", "multiple braids", "side braids", "jitome", "print shorts", "thorns", "cross-laced clothes", "naked hoodie", "toilet paper", "looking at phone", "hand on eyewear", "joy-con", "hairpin", "octopus", "shop", "naked cloak", "blouse", "doll joints", "pov pussy pussy focus", "white horns", "cropped vest", "cube hair ornament", "quad tails", "military hat", "doll", "faceless female", "animal penis", "horse", "horse penis", "tentacle sex", "single wing", "x-ray display", "green gloves", "chest jewel", "homurahara academy school uniform", "wisteria", "santa boots", "champagne flute", "cat hat", "petticoat", "vertical-striped pantyhose", "tentacle pit", "tentacles under clothes", "sleeveless sweater", "pigeon-toed", "anchor hair ornament", "print mug", "clownfish", "unicorn", "elbow rest", "power symbol", "consensual tentacles", "center opening", "anzio military uniform", "stuffed shark", "grey sweater", "sleeveless turtleneck", "blue scarf", "purple neckerchief", "purple sailor collar", "bow-shaped hair", "flower ornament", "unbuttoned shirt", "black hoodie", "hand over own mouth", "black neckerchief", "through wall", "sniper rifle", "pink wings", "garrison cap", "crow", "safe", "blue-tinted eyewear", "cotton candy", "band uniform", "shako cap", "morning glory", "tropical drink", "mochi", "sunburst", "sunburst background", "letterman jacket", "middle finger", "guitar", "music", "playing 
instrument", "pearl bracelet", "clothes hanger", "shopping", "hololive idol uniform", "earmuffs", "shrimp", "mismatched eyebrows", "aqua bow", "aqua bowtie", "grey pantyhose", "transparent umbrella", "earphones", "highleg swimsuit", "earbuds", "barcode", "barcode tattoo", "emoji", "heart on chest", "heart-shaped lock", "pill earrings", "skeleton print", "breastplate", "excalibur (fate/stay night)", "finger to face", "blinds", "adapted costume", "bead necklace", "magic", "magic circle", "red belt", "pink belt", "hoop earrings", "black scrunchie", "festival", "summer festival", "high-waist pants", "unmoving pattern", "no eyewear", "multiple rings", "harness", "sleeves rolled up", "body markings", "interface headset", "plugsuit", "red bodysuit", "ghost tail", "instrument case", "claws", "number tattoo", "hooded cape", "clock eyes", "arm tattoo", "chest tattoo", "flower tattoo", "kanzashi", "pendant choker", "injury", "scabbard", "raglan sleeves", "white tail", "sideways", "ship", "watercraft", "median furrow", "toast", "toast in mouth", ":/", "absurdly long hair", "frog print", "shell necklace", "in food", "oversized food", "purple pantyhose", "summer", "constellation", "contemporary", "excalibur morgan (fate)", "crescent hat ornament", "mole on thigh", "mole under mouth", "fur scarf", "korean clothes", "singing", "miqo'te", "thigh ribbon", "hands in pockets", "mushroom", "shoulder strap", "evening", "android", "dark persona", "mechanical parts", "river", "asa no ha (pattern)", "bamboo", "bit gag", "ripples", "black headband", "debris", "dust", "forehead protector", "konohagakure symbol", "serious", "spiked hair", "wataboushi", "grey scarf", ":i", "thigh pouch", "mixing bowl", "spatula", "labcoat", "sticker on face", "couple", "road", "convenient arm", "bullet", "grenade", "magazine (weapon)", "bangle", "asymmetrical horns", "tube", "cafe", "elbows on table", "looking outside", "baozi", "white mask", "space", "chef hat", "broken heart", "ear ornament", "blue headband", "electrokinesis", "energy", "fantasy", "sitting on stairs", "sakurada shiro", "tail hug", "diffraction spikes", "power lines", "silhouette", "sunrise", "utility pole", "fireflies", "storefront", "landscape", "path", "twilight", "loaded interior", "shelf", "wind chime", "jar", "sink", "stove", "hill", "science fiction", "open window", "lily pad", "polar bear", "house", "street", "vanishing point", "lotus", "food focus", "still life", "railroad tracks", "pond", "orange sky", "spider lily", "waterfall", "contrail", "cliff", "fishing", "fishing rod", "wide shot", "frieren", "sousou no frieren", "hands on own stomach", "tassel earrings", "robe", "phimosis", "hime cut", "torn bodysuit", "ice wings", "touhou", "detached wings", "clone", "multiple persona", "person on head", "watermelon bar", "flying", "crotch cutout", "multicolored bow", "star hat ornament", "alternate color", "pov across table", "whipped cream", "back cutout", "laevatein (touhou)", "red gemstone", "purple gemstone", "straight-on", "talking", "paw print background", "floor", "light", "bondage outfit", "dominatrix", "latex", "latex gloves", "whip", "angelina (arknights)", "doctor (arknights)", "surtr (arknights)", "blue poison (arknights)", "blue poison (shoal beat) (arknights)", "amiya (arknights)", "eyjafjalla (arknights)", "goldenglow (arknights)", "mudrock (arknights)", "platinum (arknights)", "rosmontis (arknights)", "suzuran (arknights)", "ifrit (arknights)", "sussurro (arknights)", "luoxiaohei", "tomimi (arknights)", "kal'tsit (arknights)", "schwarz 
(arknights)", "hoto cocoa", "yazawa nico", "uruha rushia", "watson amelia", "gawr gura", "kisaragi (azur lane)", "tokoyami towa", "yuudachi (azur lane)", "remilia scarlet", "natori sana", "keqing (genshin impact)", "eunectes (arknights)", "shirakami fubuki", "le malin (azur lane)", "le malin (listless lapin) (azur lane)", "manjuu (azur lane)", "eldridge (azur lane)", "hammann (azur lane)", "commander (azur lane)", "javelin (azur lane)", "z23 (azur lane)", "ayanami (azur lane)", "laffey (azur lane)", "dido (azur lane)", "sirius (azur lane)", "formidable (azur lane)", "sirius (scorching-hot seirios) (azur lane)", "sirius (azure horizons) (azur lane)", "roxy migurdia", "arona (blue archive)", "hibiki (blue archive)", "hibiki (cheerleader) (blue archive)", "sensei (blue archive)", "izuna (blue archive)", "momoi (blue archive)", "midori (blue archive)", "shun (blue archive)", "mari (blue archive)", "makaino ririmu", "shiina yuika", "kirima syaro", "hakurei reimu", "hoshino (blue archive)", "kyouka (princess connect!)", "yuni (princess connect!)", "karyl (princess connect!)", "kokkoro (princess connect!)", "pecorine (princess connect!)", "nahida (genshin impact)", "mutsuki (blue archive)", "paimon (genshin impact)", "yuuki (princess connect!)", "suou momoko", "koharu (blue archive)", "usada pekora", "tedeza rize", "klee (genshin impact)", "dodoco (genshin impact)", "unicorn (azur lane)", "hatsune miku", "izumi sagiri", "hiiragi kagami", "hiiragi tsukasa", "belfast (azur lane)", "takao (kancolle)", "shampoo (ranma 1/2)", "yamashiro (azur lane)", "yuki miku", "lysithea von ordelia", "hagoromo lala", "yumemi riamu", "kafuu chino", "tippy (gochiusa)", "flandre scarlet", "minato aqua", "murasaki shion", "bronya zaychik", "abigail williams (fate)", "raiden shogun", "imaizumi kagerou", "yanfei (genshin impact)", "diona (genshin impact)", "albedo (genshin impact)", "jumpy dumpty", "yoimiya (genshin impact)", "razor (genshin impact)", "slime (genshin impact)", "aether (genshin impact)", "qiqi (genshin impact)", "hinatsuru ai", "sakamata chloe", "hu tao (genshin impact)", "boo tao (genshin impact)", "hoshi syoko", "curren chan (umamusume)", "daiwa scarlet (umamusume)", "inugami korone", "inubashiri momiji", "rita rossweisse", "jeanne d'arc alter santa lily (fate)", "matoba risa", "jeanne d'arc alter (fate)", "jeanne d'arc alter (avenger) (fate)", "matikane tannhauser (umamusume)", "illyasviel von einzbern", "akagi miria", "minami kotori", "fu hua", "hamakaze (kancolle)", "natsuiro matsuri", "anya (spy x family)", "rachel alucard", "komeiji koishi", "nero claudius (fate)", "kagamine len", "kama (fate)", "hina (blue archive)", "wu zetian (fate)", "nishikigi chisato", "marnie (pokemon)", "shidare hotaru", "9a-91 (girls' frontline)", "lize helesta", "ganyu (genshin impact)", "minase iori", "ibuki tsubasa", "tamamo (fate)", "koyanskaya (fate)", "kitagawa marin", "shishiro botan", "kochiya sanae", "miyu edelfelt", "prisma illya", "emilia (re:zero)", "amane kanata", "futaba anzu", "anastasia (idolmaster)", "nero claudius (fate/extra)", "azusa (blue archive)", "noelle (genshin impact)", "shinano (azur lane)", "komeiji satori", "sakurai momoka", "alice margatroid", "houshou marine", "chloe von einzbern", "nakiri ayame", "kamisato ayaka", "lisa (genshin impact)", "yui (princess connect!)", "fujimaru ritsuka (male)", "yakumo yukari", "sister cleaire", "nia (xenoblade)", "nia (blade) (xenoblade)", "tsukino usagi", "sailor moon", "marie (splatoon)", "hatoba tsugu", "rem (re:zero)", "m200 (girls' frontline)", "suzuhara 
lulu", "komiya kaho", "yoshikawa chinatsu", "lumine (genshin impact)", "hilda (pokemon)", "zuikaku (kancolle)", "nakano yotsuba", "medusa (fate)", "yamakaze (kancolle)", "tifa lockhart", "eurasian eagle owl (kemono friends)", "tsukumo sana", "nakano nino", "tomoe mami", "momoe nagisa", "hoshimachi suisei", "eula (genshin impact)", "illustrious (azur lane)", "rumia", "megumin", "jack the ripper (fate/apocrypha)", "mika (blue archive)", "toga himiko", "mononobe no futo", "vikala (granblue fantasy)", "miyu (blue archive)", "saren (princess connect!)", "fairy knight lancelot (fate)", "yukoku kiriko", "princess zelda", "hakui koyori", "star sapphire", "kotonoha akane", "kotonoha aoi", "omaru polka", "aqua (konosuba)", "lillie (pokemon)", "kanna kamui", "shiroko (blue archive)", "shiroko (swimsuit) (blue archive)", "hina (swimsuit) (blue archive)", "ui (blue archive)", "akashi (azur lane)", "sesshouin kiara", "ninomae ina'nis", "sweep tosho (umamusume)", "pomu rainpuff", "hibiki (kancolle)", "toudou yurika", "kamado nezuko", "kasumi (kancolle)", "kasumi kai ni (kancolle)", "morgan le fay (fate)", "takagi-san", "elaina (majo no tabitabi)", "artoria pendragon (fate)", "artoria pendragon (swimsuit ruler) (fate)", "rimuru tempest", "ouro kronii", "yae miko", "ahri (league of legends)", "noumi kudryavka", "iori (blue archive)", "gran (granblue fantasy)", "aris (blue archive)", "stheno (fate)", "nagato (azur lane)", "len (tsukihime)", "euryale (fate)", "la+ darknesss", "uzuki (kancolle)", "kinomoto sakura", "hoshimiya kate", "tachibana arisu", "finana ryugu", "matsuwa (kancolle)", "nonomi (blue archive)", "oshino shinobu", "ryuzaki kaoru", "sajo yukimi", "shirasaka koume", "bloop (gawr gura)", "silver fox (kemono friends)", "ezo red fox (kemono friends)", "cagliostro (granblue fantasy)", "yusa kozue", "katsushika hokusai (fate)", "tokitarou (fate)", "yor briar", "yamada elf", "elizabeth bathory (fate)", "vampy", "blanc (neptune series)", "nanashi mumei", "albedo (overlord)", "gotou hitori", "lucy (cyberpunk)", "platinum the trinity", "sucrose (genshin impact)", "hilichurl (genshin impact)", "koshimizu sachiko", "hilda valentine goneril", "barbara (genshin impact)", "amatsukaze (kancolle)", "silence suzuka (umamusume)", "mihono bourbon (umamusume)", "mythra (xenoblade)", "nekomata okayu", "ookami mio", "hifumi (blue archive)", "hoshikawa sara", "yuuka (blue archive)", "nilou (genshin impact)", "anchovy (girls und panzer)", "sangonomiya kokomi", "mash kyrielight", "rabbit yukine", "utage (arknights)", "meltryllis (fate)", "meltryllis (swimsuit lancer) (fate)", "meltryllis (swimsuit lancer) (second ascension) (fate)", "saber", "kuzuha (nijisanji)", "yuudachi (kancolle)", "yuudachi kai ni (kancolle)", "texas (arknights)", "grey wolf (kemono friends)", "kuki shinobu", "souryuu asuka langley", "sonozaki mion", "tokisaki kurumi", "konpaku youmu", "sakura miku", "kamisato ayato", "kirisame marisa", "saber alter", "saber alter (ver. shinjuku 1999) (fate)", "patchouli knowledge", "don-chan (usada pekora)", "avatar (ff14)", "shanghai doll", "nakano miku", "nakano itsuki", "uzumaki naruto", "mori calliope", "kiana kaslana", "yuzuki choco", "yozora mel", "elizabeth bathory (fate/extra ccc)", "elizabeth bathory (first ascension) (fate)", "cirno", "daiyousei", "kirby", "izayoi sakuya", "hong meiling", "koakuma" ]
hyunseo-mil/vit-base-beans
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-beans This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset. It achieves the following results on the evaluation set: - Loss: 0.0192 - Accuracy: 0.9925 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0811 | 1.54 | 100 | 0.0358 | 0.9925 | | 0.0132 | 3.08 | 200 | 0.0192 | 0.9925 | ### Framework versions - Transformers 4.36.2 - Pytorch 2.1.2 - Datasets 2.16.1 - Tokenizers 0.15.0
[ "angular_leaf_spot", "bean_rust", "healthy" ]
hiddenbebb/my_awesome_food_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # my_awesome_food_model This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Framework versions - Transformers 4.36.2 - Pytorch 2.1.2+cpu - Datasets 2.16.1 - Tokenizers 0.15.0
[ "apple_pie", "baby_back_ribs", "bruschetta", "waffles", "caesar_salad", "cannoli", "caprese_salad", "carrot_cake", "ceviche", "cheesecake", "cheese_plate", "chicken_curry", "chicken_quesadilla", "baklava", "chicken_wings", "chocolate_cake", "chocolate_mousse", "churros", "clam_chowder", "club_sandwich", "crab_cakes", "creme_brulee", "croque_madame", "cup_cakes", "beef_carpaccio", "deviled_eggs", "donuts", "dumplings", "edamame", "eggs_benedict", "escargots", "falafel", "filet_mignon", "fish_and_chips", "foie_gras", "beef_tartare", "french_fries", "french_onion_soup", "french_toast", "fried_calamari", "fried_rice", "frozen_yogurt", "garlic_bread", "gnocchi", "greek_salad", "grilled_cheese_sandwich", "beet_salad", "grilled_salmon", "guacamole", "gyoza", "hamburger", "hot_and_sour_soup", "hot_dog", "huevos_rancheros", "hummus", "ice_cream", "lasagna", "beignets", "lobster_bisque", "lobster_roll_sandwich", "macaroni_and_cheese", "macarons", "miso_soup", "mussels", "nachos", "omelette", "onion_rings", "oysters", "bibimbap", "pad_thai", "paella", "pancakes", "panna_cotta", "peking_duck", "pho", "pizza", "pork_chop", "poutine", "prime_rib", "bread_pudding", "pulled_pork_sandwich", "ramen", "ravioli", "red_velvet_cake", "risotto", "samosa", "sashimi", "scallops", "seaweed_salad", "shrimp_and_grits", "breakfast_burrito", "spaghetti_bolognese", "spaghetti_carbonara", "spring_rolls", "steak", "strawberry_shortcake", "sushi", "tacos", "takoyaki", "tiramisu", "tuna_tartare" ]
SaladSlayer00/twin_matcher
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. --> # SaladSlayer00/twin_matcher This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on an unknown dataset. It achieves the following results on the evaluation set: - Train Loss: 0.0494 - Validation Loss: 0.9150 - Validation Accuracy: 0.7791 - Epoch: 8 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 0.0005, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01} - training_precision: float32 ### Training results | Train Loss | Validation Loss | Validation Accuracy | Epoch | |:----------:|:---------------:|:-------------------:|:-----:| | 4.3094 | 2.8494 | 0.2653 | 0 | | 1.9389 | 1.6614 | 0.5538 | 1 | | 0.8892 | 1.1064 | 0.7059 | 2 | | 0.4021 | 0.9831 | 0.7336 | 3 | | 0.2010 | 0.8325 | 0.7814 | 4 | | 0.1096 | 0.8393 | 0.7758 | 5 | | 0.0681 | 0.8437 | 0.7880 | 6 | | 0.0543 | 0.8610 | 0.7658 | 7 | | 0.0494 | 0.9150 | 0.7791 | 8 | ### Framework versions - Transformers 4.35.2 - TensorFlow 2.15.0 - Datasets 2.16.1 - Tokenizers 0.15.0
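Since this card reports a Keras/TensorFlow 2.15 training run, loading presumably goes through the TF auto classes. A minimal sketch, assuming TF weights for `SaladSlayer00/twin_matcher` are published on the Hub; the input path is illustrative:

```python
import tensorflow as tf
from PIL import Image
from transformers import AutoImageProcessor, TFAutoModelForImageClassification

repo_id = "SaladSlayer00/twin_matcher"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = TFAutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("face.jpg").convert("RGB")  # illustrative input path
inputs = processor(images=image, return_tensors="tf")
logits = model(**inputs).logits
pred = int(tf.argmax(logits, axis=-1)[0])
print(model.config.id2label[pred])  # one of the names listed below
```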
[ "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "ben_affleck", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "tom_ellis", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", 
"danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "danielle_panabaker", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "madelaine_petsch", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", 
"katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "katharine_mcphee", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "camila_mendes", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "melissa_fumero", "anthony_mackie", 
"anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "anthony_mackie", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "natalie_portman", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", 
"cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "cristiano_ronaldo", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "tom_hiddleston", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", 
"logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "logan_lerman", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "lili_reinhart", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "elon_musk", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", 
"bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "bobby_morley", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "brie_larson", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", 
"josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "josh_radnor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "eliza_taylor", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "alexandra_daddario", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", 
"krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "krysten_ritter", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "zendaya", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "jeff_bezos", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", 
"gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "gal_gadot", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "zoe_saldana", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", 
"shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "shakira_isabel_mebarak", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "mark_zuckerberg", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", 
"marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "marie_avgeropoulos", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "neil_patrick_harris", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", 
"chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "chris_hemsworth", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "elizabeth_lail", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", 
"richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "richard_harmon", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "chris_evans", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "kiernen_shipka", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", 
"natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "natalie_dormer", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "alvaro_morte", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", 
"stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "stephen_amell", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "alex_lawther", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "irina_shayk", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", 
"amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "amanda_crew", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "wentworth_miller", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", 
"katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "katherine_langford", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "penn_badgley", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", "barack_obama", 
"barack_obama", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "christian_bale", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "nadia_hilker", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", 
"morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "morena_baccarin", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "chris_pratt", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", 
"anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "anne_hathaway", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "emma_stone", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "ellen_page", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", 
"robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "robert_de_niro", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "tom_holland", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", 
"sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "sarah_wayne_callies", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "inbar_lavi", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "scarlett_johansson", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", 
"tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "tom_hardy", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "megan_fox", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "pedro_alonso", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", 
"brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "brenton_thwaites", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "keanu_reeves", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", 
"andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "andy_samberg", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "rebecca_ferguson", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", 
"alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "alycia_dabnem_carey", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "dwayne_johnson", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "rihanna", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", 
"miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "miley_cyrus", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "zac_efron", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "amber_heard", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", 
"robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "robert_downey_jr", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "leonardo_dicaprio", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", 
"selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "selena_gomez", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "barbara_palvin", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", 
"emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "emilia_clarke", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "morgan_freeman", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", 
"gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "gwyneth_paltrow", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "maria_pedraza", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "jeremy_renner", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", 
"tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "tom_cruise", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "jimmy_fallon", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "hugh_jackman", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", 
"sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "sophie_turner", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "tuppence_middleton", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", 
"jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jessica_barden", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "jennifer_lawrence", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", 
"millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "millie_bobby_brown", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "ursula_corbero", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", 
"bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "bill_gates", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "mark_ruffalo", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "avril_lavigne", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", 
"maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "maisie_williams", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "margot_robbie", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", 
"elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "elizabeth_olsen", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "brian_j._smith", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", 
"grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "grant_gustin", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "rami_malek", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "taylor_swift", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", 
"emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "emma_watson", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "jake_mcdorman", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "adriana_lima", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", 
"henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "henry_cavil", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "lindsey_morgan", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", 
"dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "dominic_purcell", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "jason_momoa", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "johnny_depp", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", 
"lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "lionel_messi", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "beatrice_insalata", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni", "laura_puccioni" ]
platzi/platzi-vit-model-sebastian-gaviria
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# platzi-vit-model-sebastian-gaviria

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0307
- Accuracy: 0.9850

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1275        | 3.85  | 500  | 0.0307          | 0.9850   |

### Framework versions

- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
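The card lists no usage snippet; the following is a minimal inference sketch, assuming the standard Transformers image-classification `pipeline` and a hypothetical local image file (`leaf.jpg`):

```python
# Minimal inference sketch (assumed usage; not part of the original card).
from transformers import pipeline

# Load the fine-tuned checkpoint as an image-classification pipeline.
classifier = pipeline(
    "image-classification",
    model="platzi/platzi-vit-model-sebastian-gaviria",
)

# "leaf.jpg" is a hypothetical bean-leaf photo.
predictions = classifier("leaf.jpg")
print(predictions)  # e.g. [{"label": "healthy", "score": 0.98}, ...]
```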
[ "angular_leaf_spot", "bean_rust", "healthy" ]
spolivin/alz-mri-vit
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# alz-mri-vit

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the Falah/Alzheimer_MRI dataset (the fine-tuning procedure is described [here](https://huggingface.co/spolivin/alz-mri-vit/blob/main/vit_finetuning.ipynb)).
It achieves the following results on the evaluation set:
- Loss: 0.1875
- F1: 0.9309

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1     |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 1.1218        | 1.0   | 64   | 0.9419          | 0.5742 |
| 0.94          | 2.0   | 128  | 0.9054          | 0.6029 |
| 0.9123        | 3.0   | 192  | 0.9019          | 0.5262 |
| 0.8625        | 4.0   | 256  | 0.8465          | 0.6029 |
| 0.8104        | 5.0   | 320  | 0.7810          | 0.6319 |
| 0.7244        | 6.0   | 384  | 0.7278          | 0.7037 |
| 0.697         | 7.0   | 448  | 0.6300          | 0.7480 |
| 0.5865        | 8.0   | 512  | 0.5659          | 0.7662 |
| 0.5199        | 9.0   | 576  | 0.5445          | 0.7721 |
| 0.4734        | 10.0  | 640  | 0.6750          | 0.7185 |
| 0.4399        | 11.0  | 704  | 0.4893          | 0.8274 |
| 0.3817        | 12.0  | 768  | 0.5578          | 0.7844 |
| 0.3318        | 13.0  | 832  | 0.4699          | 0.8228 |
| 0.3096        | 14.0  | 896  | 0.4460          | 0.8399 |
| 0.2787        | 15.0  | 960  | 0.4105          | 0.8399 |
| 0.2517        | 16.0  | 1024 | 0.3488          | 0.8578 |
| 0.2346        | 17.0  | 1088 | 0.3877          | 0.8773 |
| 0.2286        | 18.0  | 1152 | 0.3420          | 0.8575 |
| 0.1914        | 19.0  | 1216 | 0.4123          | 0.8682 |
| 0.1844        | 20.0  | 1280 | 0.2894          | 0.8913 |
| 0.173         | 21.0  | 1344 | 0.3197          | 0.8887 |
| 0.1687        | 22.0  | 1408 | 0.2626          | 0.9075 |
| 0.1601        | 23.0  | 1472 | 0.2951          | 0.9068 |
| 0.1466        | 24.0  | 1536 | 0.2666          | 0.9049 |
| 0.1468        | 25.0  | 1600 | 0.2136          | 0.9103 |
| 0.1226        | 26.0  | 1664 | 0.2387          | 0.9127 |
| 0.1186        | 27.0  | 1728 | 0.2131          | 0.9271 |
| 0.0951        | 28.0  | 1792 | 0.2520          | 0.9130 |
| 0.1049        | 29.0  | 1856 | 0.2096          | 0.9259 |
| 0.0936        | 30.0  | 1920 | 0.1875          | 0.9309 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
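For readers reproducing the run, the hyperparameters above map naturally onto `transformers.TrainingArguments`; the sketch below is a hedged reconstruction, where `output_dir` and the `fp16` flag (for Native AMP) are assumptions rather than values taken verbatim from the card:

```python
# Hedged reconstruction of the listed hyperparameters (assumed mapping).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="alz-mri-vit",        # assumed output directory
    learning_rate=2e-4,              # learning_rate: 0.0002
    per_device_train_batch_size=16,  # train_batch_size: 16
    per_device_eval_batch_size=16,   # eval_batch_size: 16
    seed=42,
    gradient_accumulation_steps=4,   # yields total_train_batch_size: 64
    lr_scheduler_type="linear",
    warmup_ratio=0.1,                # lr_scheduler_warmup_ratio: 0.1
    num_train_epochs=30,
    fp16=True,                       # mixed_precision_training: Native AMP
)
```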
[ "mild_demented", "moderate_demented", "non_demented", "very_mild_demented" ]
dhruvilHV/initial_ViT_model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# initial_ViT_model

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the fair_face dataset.
It achieves the following results on the evaluation set:
- Loss: 3.6347
- Accuracy: 0.2125

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 1

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 4.7855        | 0.15  | 50   | 4.6444          | 0.0511   |
| 4.4242        | 0.29  | 100  | 4.2124          | 0.1418   |
| 4.0596        | 0.44  | 150  | 3.9402          | 0.1744   |
| 3.859         | 0.59  | 200  | 3.7823          | 0.1956   |
| 3.7392        | 0.74  | 250  | 3.6877          | 0.2105   |
| 3.6424        | 0.88  | 300  | 3.6347          | 0.2125   |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
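As a usage illustration, here is a hedged sketch of manual inference with this checkpoint via the standard `AutoImageProcessor`/`AutoModelForImageClassification` pair; the input path `face.jpg` is a placeholder, and note that the 0.2125 evaluation accuracy means predictions over the 126 combined race/gender/age classes should be treated as unreliable:

```python
# Manual inference sketch (assumed usage; not part of the original card).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

checkpoint = "dhruvilHV/initial_ViT_model"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)

image = Image.open("face.jpg").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest logit back to its race_gender_age label string.
predicted_idx = logits.argmax(-1).item()
print(model.config.id2label[predicted_idx])
```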
[ "east asian_male_0-2", "east asian_male_3-9", "east asian_male_10-19", "east asian_male_20-29", "east asian_male_30-39", "east asian_male_40-49", "east asian_male_50-59", "east asian_male_60-69", "east asian_male_more than 70", "east asian_female_0-2", "east asian_female_3-9", "east asian_female_10-19", "east asian_female_20-29", "east asian_female_30-39", "east asian_female_40-49", "east asian_female_50-59", "east asian_female_60-69", "east asian_female_more than 70", "indian_male_0-2", "indian_male_3-9", "indian_male_10-19", "indian_male_20-29", "indian_male_30-39", "indian_male_40-49", "indian_male_50-59", "indian_male_60-69", "indian_male_more than 70", "indian_female_0-2", "indian_female_3-9", "indian_female_10-19", "indian_female_20-29", "indian_female_30-39", "indian_female_40-49", "indian_female_50-59", "indian_female_60-69", "indian_female_more than 70", "black_male_0-2", "black_male_3-9", "black_male_10-19", "black_male_20-29", "black_male_30-39", "black_male_40-49", "black_male_50-59", "black_male_60-69", "black_male_more than 70", "black_female_0-2", "black_female_3-9", "black_female_10-19", "black_female_20-29", "black_female_30-39", "black_female_40-49", "black_female_50-59", "black_female_60-69", "black_female_more than 70", "white_male_0-2", "white_male_3-9", "white_male_10-19", "white_male_20-29", "white_male_30-39", "white_male_40-49", "white_male_50-59", "white_male_60-69", "white_male_more than 70", "white_female_0-2", "white_female_3-9", "white_female_10-19", "white_female_20-29", "white_female_30-39", "white_female_40-49", "white_female_50-59", "white_female_60-69", "white_female_more than 70", "middle eastern_male_0-2", "middle eastern_male_3-9", "middle eastern_male_10-19", "middle eastern_male_20-29", "middle eastern_male_30-39", "middle eastern_male_40-49", "middle eastern_male_50-59", "middle eastern_male_60-69", "middle eastern_male_more than 70", "middle eastern_female_0-2", "middle eastern_female_3-9", "middle eastern_female_10-19", "middle eastern_female_20-29", "middle eastern_female_30-39", "middle eastern_female_40-49", "middle eastern_female_50-59", "middle eastern_female_60-69", "middle eastern_female_more than 70", "latino_hispanic_male_0-2", "latino_hispanic_male_3-9", "latino_hispanic_male_10-19", "latino_hispanic_male_20-29", "latino_hispanic_male_30-39", "latino_hispanic_male_40-49", "latino_hispanic_male_50-59", "latino_hispanic_male_60-69", "latino_hispanic_male_more than 70", "latino_hispanic_female_0-2", "latino_hispanic_female_3-9", "latino_hispanic_female_10-19", "latino_hispanic_female_20-29", "latino_hispanic_female_30-39", "latino_hispanic_female_40-49", "latino_hispanic_female_50-59", "latino_hispanic_female_60-69", "latino_hispanic_female_more than 70", "southeast asian_male_0-2", "southeast asian_male_3-9", "southeast asian_male_10-19", "southeast asian_male_20-29", "southeast asian_male_30-39", "southeast asian_male_40-49", "southeast asian_male_50-59", "southeast asian_male_60-69", "southeast asian_male_more than 70", "southeast asian_female_0-2", "southeast asian_female_3-9", "southeast asian_female_10-19", "southeast asian_female_20-29", "southeast asian_female_30-39", "southeast asian_female_40-49", "southeast asian_female_50-59", "southeast asian_female_60-69", "southeast asian_female_more than 70" ]
gianlab/swin-tiny-patch4-window7-224-finetuned-parkinson-classification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# swin-tiny-patch4-window7-224-finetuned-parkinson-classification

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4966
- Accuracy: 0.9091

## Model description

This model was created by importing the dataset of spiral drawings made by both Parkinson's patients and healthy people into Google Colab from Kaggle: https://www.kaggle.com/datasets/kmader/parkinsons-drawings/data. I then followed the image classification tutorial at https://colab.research.google.com/github/huggingface/notebooks/blob/main/examples/image_classification.ipynb, obtaining the following notebook: https://colab.research.google.com/drive/1oRjwgHjmaQYRU1qf-TTV7cg1qMZXgMaO?usp=sharing

The possible classes are:
<ul>
<li>Healthy</li>
<li>Parkinson</li>
</ul>

### Spiral drawing example:

![Screenshot](V13PE02.png)

## Intended uses & limitations

Acknowledgements

The data came from the paper:

Zham P, Kumar DK, Dabnichki P, Poosapadi Arjunan S and Raghav S (2017) Distinguishing Different Stages of Parkinson’s Disease Using Composite Index of Speed and Pen-Pressure of Sketching a Spiral. Front. Neurol. 8:435. doi: 10.3389/fneur.2017.00435

https://www.frontiersin.org/articles/10.3389/fneur.2017.00435/full

Data licence: https://creativecommons.org/licenses/by-nc-nd/4.0/

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 1    | 0.6801          | 0.4545   |
| No log        | 2.0   | 3    | 0.8005          | 0.3636   |
| No log        | 3.0   | 5    | 0.6325          | 0.6364   |
| No log        | 4.0   | 6    | 0.5494          | 0.8182   |
| No log        | 5.0   | 7    | 0.5214          | 0.8182   |
| No log        | 6.0   | 9    | 0.5735          | 0.7273   |
| 0.3063        | 7.0   | 11   | 0.4966          | 0.9091   |
| 0.3063        | 8.0   | 12   | 0.4557          | 0.9091   |
| 0.3063        | 9.0   | 13   | 0.4444          | 0.9091   |
| 0.3063        | 10.0  | 15   | 0.6226          | 0.6364   |
| 0.3063        | 11.0  | 17   | 0.8224          | 0.4545   |
| 0.3063        | 12.0  | 18   | 0.8127          | 0.4545   |
| 0.3063        | 13.0  | 19   | 0.7868          | 0.4545   |
| 0.2277        | 14.0  | 21   | 0.8195          | 0.4545   |
| 0.2277        | 15.0  | 23   | 0.7499          | 0.4545   |
| 0.2277        | 16.0  | 24   | 0.7022          | 0.5455   |
| 0.2277        | 17.0  | 25   | 0.6755          | 0.5455   |
| 0.2277        | 18.0  | 27   | 0.6277          | 0.6364   |
| 0.2277        | 19.0  | 29   | 0.5820          | 0.6364   |
| 0.1867        | 20.0  | 30   | 0.5784          | 0.6364   |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
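Every accuracy in the table above is a multiple of 1/11 (e.g. 0.9091 ≈ 10/11), which suggests an evaluation split of roughly eleven images, so the headline accuracy should be read with that small sample in mind. For inference, a minimal sketch assuming the standard Transformers `pipeline` API; the input reuses the spiral image shown in the card:

```python
# Inference sketch for a spiral drawing (assumed usage; not from the card).
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="gianlab/swin-tiny-patch4-window7-224-finetuned-parkinson-classification",
)

# "V13PE02.png" is the example spiral drawing embedded in this card.
for prediction in classifier("V13PE02.png"):
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```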
[ "healthy", "parkinson" ]
stentorianvoice/vit-base-patch16-224
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2774
- Accuracy: 1.0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 5
- eval_batch_size: 5
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 20
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 0.8   | 2    | 0.5778          | 0.6667   |
| No log        | 2.0   | 5    | 0.2774          | 1.0      |
| No log        | 2.4   | 6    | 0.2546          | 1.0      |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.1+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
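The optimizer line above corresponds to plain Adam plus a linear warmup schedule; below is a hedged reconstruction, where the total step count (roughly seven to eight optimizer steps over three epochs) is an estimate read off the results table rather than a documented value:

```python
# Hedged reconstruction of the optimizer/scheduler settings (assumed mapping).
import torch
from transformers import AutoModelForImageClassification, get_linear_schedule_with_warmup

model = AutoModelForImageClassification.from_pretrained("stentorianvoice/vit-base-patch16-224")

optimizer = torch.optim.Adam(
    model.parameters(),
    lr=5e-5,             # learning_rate: 5e-05
    betas=(0.9, 0.999),  # Adam betas from the card
    eps=1e-8,            # epsilon: 1e-08
)

num_training_steps = 8  # estimated from the results table (step 6 at epoch 2.4)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=int(0.1 * num_training_steps),  # lr_scheduler_warmup_ratio: 0.1
    num_training_steps=num_training_steps,
)
```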
[ "jet", "para" ]