Dataset columns: model_id (string, 12-92 characters), model_card (string, 166-900k characters), model_labels (list, 2-250 items).
sayeed99/segformer-b2-fashion
# segformer-b2-fashion

This model is a fine-tuned version of [nvidia/mit-b2](https://huggingface.co/nvidia/mit-b2) on the sayeed99/fashion_segmentation dataset.

```python
from transformers import SegformerImageProcessor, AutoModelForSemanticSegmentation
from PIL import Image
import requests
import matplotlib.pyplot as plt
import torch.nn as nn

# Load the image processor and the fine-tuned SegFormer checkpoint
processor = SegformerImageProcessor.from_pretrained("sayeed99/segformer-b2-fashion")
model = AutoModelForSemanticSegmentation.from_pretrained("sayeed99/segformer-b2-fashion")

url = "https://plus.unsplash.com/premium_photo-1673210886161-bfcc40f54d1f?ixlib=rb-4.0.3&ixid=MnwxMjA3fDB8MHxzZWFyY2h8MXx8cGVyc29uJTIwc3RhbmRpbmd8ZW58MHx8MHx8&w=1000&q=80"
image = Image.open(requests.get(url, stream=True).raw)

inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)
logits = outputs.logits.cpu()  # shape (batch, num_labels, H/4, W/4)

# Upsample the low-resolution logits to the input size; PIL's image.size is
# (W, H) while interpolate expects (H, W), hence the [::-1]
upsampled_logits = nn.functional.interpolate(
    logits,
    size=image.size[::-1],
    mode="bilinear",
    align_corners=False,
)

# Per-pixel class indices
pred_seg = upsampled_logits.argmax(dim=1)[0]
plt.imshow(pred_seg)
```

Labels:

{"0":"Everything Else", "1": "shirt, blouse", "2": "top, t-shirt, sweatshirt", "3": "sweater", "4": "cardigan", "5": "jacket", "6": "vest", "7": "pants", "8": "shorts", "9": "skirt", "10": "coat", "11": "dress", "12": "jumpsuit", "13": "cape", "14": "glasses", "15": "hat", "16": "headband, head covering, hair accessory", "17": "tie", "18": "glove", "19": "watch", "20": "belt", "21": "leg warmer", "22": "tights, stockings", "23": "sock", "24": "shoe", "25": "bag, wallet", "26": "scarf", "27": "umbrella", "28": "hood", "29": "collar", "30": "lapel", "31": "epaulette", "32": "sleeve", "33": "pocket", "34": "neckline", "35": "buckle", "36": "zipper", "37": "applique", "38": "bead", "39": "bow", "40": "flower", "41": "fringe", "42": "ribbon", "43": "rivet", "44": "ruffle", "45": "sequin", "46": "tassel"}

### Framework versions

- Transformers 4.30.0
- Pytorch 2.2.2+cu121
- Datasets 2.18.0
- Tokenizers 0.13.3

### License

The license for this model can be found [here](https://github.com/NVlabs/SegFormer/blob/master/LICENSE).

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2105-15203,
  author     = {Enze Xie and Wenhai Wang and Zhiding Yu and Anima Anandkumar and Jose M. Alvarez and Ping Luo},
  title      = {SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers},
  journal    = {CoRR},
  volume     = {abs/2105.15203},
  year       = {2021},
  url        = {https://arxiv.org/abs/2105.15203},
  eprinttype = {arXiv},
  eprint     = {2105.15203},
  timestamp  = {Wed, 02 Jun 2021 11:46:42 +0200},
  biburl     = {https://dblp.org/rec/journals/corr/abs-2105-15203.bib},
  bibsource  = {dblp computer science bibliography, https://dblp.org}
}
```
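The usage example above only plots raw class indices. To report which garment classes were actually detected, the indices can be looked up in the checkpoint's `id2label` config, which should mirror the Labels dict; a minimal sketch, meant to run after the snippet above (the "unknown" fallback is a defensive assumption):

```python
import torch

# Class ids that actually occur in the predicted segmentation map
present_ids = torch.unique(pred_seg).tolist()

# id2label is the standard Transformers config field for label names; for
# this checkpoint it is expected to match the Labels dict above
for class_id in present_ids:
    print(class_id, "->", model.config.id2label.get(class_id, "unknown"))
```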
[ "everything else", "shirt, blouse", "top, t-shirt, sweatshirt", "sweater", "cardigan", "jacket", "vest", "pants", "shorts", "skirt", "coat", "dress", "jumpsuit", "cape", "glasses", "hat", "headband, head covering, hair accessory", "tie", "glove", "watch", "belt", "leg warmer", "tights, stockings", "sock", "shoe", "bag, wallet", "scarf", "umbrella", "hood", "collar", "lapel", "epaulette", "sleeve", "pocket", "neckline", "buckle", "zipper", "applique", "bead", "bow", "flower", "fringe", "ribbon", "rivet", "ruffle", "sequin", "tassel" ]
unreal-hug/segformer-b2-seed63-apr-13-v1
# segformer-b2-seed63-apr-13-v1

This model is a fine-tuned version of [nvidia/mit-b3](https://huggingface.co/nvidia/mit-b3) on the unreal-hug/REAL_DATASET_SEG_401_6_lbls dataset. It achieves the following results on the evaluation set:

- Loss: 1.7138
- Mean Iou: 0.1266
- Mean Accuracy: 0.2136
- Overall Accuracy: 0.4273
- Accuracy Unlabeled: nan
- Accuracy Lv: 0.6939
- Accuracy Rv: 0.0982
- Accuracy Ra: 0.1706
- Accuracy La: 0.5041
- Accuracy Vs: 0.0
- Accuracy As: 0.0
- Accuracy Mk: 0.0
- Accuracy Tk: nan
- Accuracy Asd: 0.0557
- Accuracy Vsd: 0.2283
- Accuracy Ak: 0.3849
- Iou Unlabeled: 0.0
- Iou Lv: 0.4965
- Iou Rv: 0.0899
- Iou Ra: 0.1288
- Iou La: 0.2845
- Iou Vs: 0.0
- Iou As: 0.0
- Iou Mk: 0.0
- Iou Tk: 0.0
- Iou Asd: 0.0462
- Iou Vsd: 0.1513
- Iou Ak: 0.3225

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a minimal `Trainer` sketch follows after this card):
- learning_rate: 1e-06
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.05
- training_steps: 1000

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Lv | Accuracy Rv | Accuracy Ra | Accuracy La | Accuracy Vs | Accuracy As | Accuracy Mk | Accuracy Tk | Accuracy Asd | Accuracy Vsd | Accuracy Ak | Iou Unlabeled | Iou Lv | Iou Rv | Iou Ra | Iou La | Iou Vs | Iou As | Iou Mk | Iou Tk | Iou Asd | Iou Vsd | Iou Ak |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 2.5423 | 2.5 | 100 | 2.6367 | 0.0332 | 0.0976 | 0.0951 | nan | 0.0612 | 0.0642 | 0.0301 | 0.1898 | 0.0 | 0.0 | 0.0086 | nan | 0.0495 | 0.4697 | 0.1033 | 0.0 | 0.0573 | 0.0485 | 0.0262 | 0.1021 | 0.0 | 0.0 | 0.0019 | 0.0 | 0.0204 | 0.0612 | 0.0812 |
| 2.3042 | 5.0 | 200 | 2.3925 | 0.0604 | 0.1412 | 0.1975 | nan | 0.2435 | 0.0655 | 0.1292 | 0.2869 | 0.0 | 0.0 | 0.0046 | nan | 0.0669 | 0.4894 | 0.1258 | 0.0 | 0.2144 | 0.0516 | 0.1074 | 0.1515 | 0.0 | 0.0 | 0.0017 | 0.0 | 0.0243 | 0.0670 | 0.1063 |
| 2.0869 | 7.5 | 300 | 2.2183 | 0.0932 | 0.1839 | 0.3354 | nan | 0.5208 | 0.0717 | 0.1836 | 0.4192 | 0.0 | 0.0 | 0.0006 | nan | 0.0768 | 0.3608 | 0.2060 | 0.0 | 0.4077 | 0.0617 | 0.1436 | 0.2158 | 0.0 | 0.0 | 0.0003 | 0.0 | 0.0358 | 0.0787 | 0.1746 |
| 2.0559 | 10.0 | 400 | 2.0298 | 0.1110 | 0.2055 | 0.3886 | nan | 0.6144 | 0.1027 | 0.1815 | 0.4598 | 0.0 | 0.0 | 0.0005 | nan | 0.0909 | 0.3011 | 0.3041 | 0.0 | 0.4559 | 0.0880 | 0.1400 | 0.2409 | 0.0 | 0.0 | 0.0003 | 0.0 | 0.0534 | 0.1001 | 0.2538 |
| 1.9554 | 12.5 | 500 | 1.8871 | 0.1189 | 0.2111 | 0.4100 | nan | 0.6561 | 0.1004 | 0.1647 | 0.4900 | 0.0 | 0.0 | 0.0009 | nan | 0.0763 | 0.2611 | 0.3619 | 0.0 | 0.4739 | 0.0896 | 0.1263 | 0.2616 | 0.0 | 0.0 | 0.0007 | 0.0 | 0.0531 | 0.1207 | 0.3015 |
| 2.0181 | 15.0 | 600 | 1.7720 | 0.1247 | 0.2139 | 0.4199 | nan | 0.6735 | 0.1008 | 0.1723 | 0.4898 | 0.0 | 0.0 | 0.0 | nan | 0.0706 | 0.2349 | 0.3972 | 0.0 | 0.4860 | 0.0912 | 0.1293 | 0.2720 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0532 | 0.1386 | 0.3256 |
| 1.6723 | 17.5 | 700 | 1.7386 | 0.1258 | 0.2129 | 0.4251 | nan | 0.6860 | 0.1011 | 0.1724 | 0.5062 | 0.0 | 0.0 | 0.0 | nan | 0.0615 | 0.2167 | 0.3848 | 0.0 | 0.4927 | 0.0917 | 0.1304 | 0.2814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0488 | 0.1426 | 0.3221 |
| 1.5613 | 20.0 | 800 | 1.7751 | 0.1269 | 0.2151 | 0.4322 | nan | 0.7050 | 0.1020 | 0.1730 | 0.5066 | 0.0 | 0.0 | 0.0 | nan | 0.0570 | 0.2288 | 0.3788 | 0.0 | 0.4990 | 0.0927 | 0.1308 | 0.2841 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0465 | 0.1502 | 0.3199 |
| 1.5653 | 22.5 | 900 | 1.7222 | 0.1272 | 0.2142 | 0.4277 | nan | 0.6924 | 0.1003 | 0.1794 | 0.5018 | 0.0 | 0.0 | 0.0 | nan | 0.0568 | 0.2295 | 0.3814 | 0.0 | 0.4969 | 0.0914 | 0.1341 | 0.2837 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0466 | 0.1523 | 0.3209 |
| 1.5196 | 25.0 | 1000 | 1.7138 | 0.1266 | 0.2136 | 0.4273 | nan | 0.6939 | 0.0982 | 0.1706 | 0.5041 | 0.0 | 0.0 | 0.0 | nan | 0.0557 | 0.2283 | 0.3849 | 0.0 | 0.4965 | 0.0899 | 0.1288 | 0.2845 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0462 | 0.1513 | 0.3225 |

### Framework versions

- Transformers 4.37.2
- Pytorch 2.1.2+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
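The hyperparameter list above maps one-to-one onto `transformers.TrainingArguments`; a minimal sketch of the presumed training setup, assuming the dataset splits have been preprocessed into `pixel_values`/`labels` columns (the preprocessing step, the split layout, and the output directory name are assumptions not shown in the card):

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSemanticSegmentation,
    Trainer,
    TrainingArguments,
)

# Dataset named in the card; resizing and mask encoding (e.g. via
# SegformerImageProcessor) would need to be applied and are omitted here
ds = load_dataset("unreal-hug/REAL_DATASET_SEG_401_6_lbls")

# Base checkpoint named in the card; num_labels matches the 12-entry
# model_labels list for this record
model = AutoModelForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b3",
    num_labels=12,
    ignore_mismatched_sizes=True,  # fresh decode head on top of the MiT backbone
)

# Mirrors the "Training hyperparameters" bullet list above
args = TrainingArguments(
    output_dir="segformer-b2-seed63-apr-13-v1",
    learning_rate=1e-6,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.05,
    max_steps=1000,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=ds["train"],
    eval_dataset=ds.get("validation", ds["train"]),  # eval split name assumed
)
trainer.train()
```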
[ "unlabeled", "lv", "rv", "ra", "la", "vs", "as", "mk", "tk", "asd", "vsd", "ak" ]
diegola123/segformer-b0-finetuned-segments-sidewalk-2
# segformer-b0-finetuned-segments-sidewalk-2

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset. It achieves the following results on the evaluation set:

- Loss: 2.1850
- Mean Iou: 0.0074
- Mean Accuracy: 0.0858
- Overall Accuracy: 0.0532
- Accuracy Unlabeled: 0.0
- Accuracy Flat-road: 0.0521
- Accuracy Flat-sidewalk: 0.2880
- Accuracy Flat-crosswalk: 0.0002
- Accuracy Flat-cyclinglane: 0.0414
- Accuracy Flat-parkingdriveway: nan
- Accuracy Flat-railtrack: 0.0
- Accuracy Flat-curb: 0.0
- Accuracy Human-person: 0.0
- Accuracy Human-rider: 0.0
- Accuracy Vehicle-car: 0.7261
- Accuracy Vehicle-truck: 0.0
- Accuracy Vehicle-bus: 0.0
- Accuracy Vehicle-tramtrain: 0.0
- Accuracy Vehicle-motorcycle: 0.0
- Accuracy Vehicle-bicycle: nan
- Accuracy Vehicle-caravan: 0.0
- Accuracy Vehicle-cartrailer: 0.0
- Accuracy Construction-building: 0.8347
- Accuracy Construction-door: 0.0
- Accuracy Construction-wall: 0.0011
- Accuracy Construction-fenceguardrail: 0.0
- Accuracy Construction-bridge: nan
- Accuracy Construction-tunnel: 0.0
- Accuracy Construction-stairs: 0.0
- Accuracy Object-pole: 0.0
- Accuracy Object-trafficsign: 0.0
- Accuracy Object-trafficlight: 0.0
- Accuracy Nature-vegetation: 0.7158
- Accuracy Nature-terrain: 0.0
- Accuracy Sky: 0.0001
- Accuracy Void-ground: 0.0
- Accuracy Void-dynamic: 0.0
- Accuracy Void-static: 0.0
- Accuracy Void-unclear: nan
- Iou Unlabeled: 0.0
- Iou Flat-road: 0.0373
- Iou Flat-sidewalk: 0.0114
- Iou Flat-crosswalk: 0.0002
- Iou Flat-cyclinglane: 0.0279
- Iou Flat-parkingdriveway: 0.0
- Iou Flat-railtrack: 0.0
- Iou Flat-curb: 0.0
- Iou Human-person: 0.0
- Iou Human-rider: 0.0
- Iou Vehicle-car: 0.0081
- Iou Vehicle-truck: 0.0
- Iou Vehicle-bus: 0.0
- Iou Vehicle-tramtrain: 0.0
- Iou Vehicle-motorcycle: 0.0
- Iou Vehicle-bicycle: nan
- Iou Vehicle-caravan: 0.0
- Iou Vehicle-cartrailer: 0.0
- Iou Construction-building: 0.0229
- Iou Construction-door: 0.0
- Iou Construction-wall: 0.0011
- Iou Construction-fenceguardrail: 0.0
- Iou Construction-bridge: 0.0
- Iou Construction-tunnel: 0.0
- Iou Construction-stairs: 0.0
- Iou Object-pole: 0.0
- Iou Object-trafficsign: 0.0
- Iou Object-trafficlight: 0.0
- Iou Nature-vegetation: 0.1347
- Iou Nature-terrain: 0.0
- Iou Sky: 0.0000
- Iou Void-ground: 0.0
- Iou Void-dynamic: 0.0
- Iou Void-static: 0.0
- Iou Void-unclear: nan

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 6
- eval_batch_size: 6
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 40

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Flat-road | Accuracy Flat-sidewalk | Accuracy Flat-crosswalk | Accuracy Flat-cyclinglane | Accuracy Flat-parkingdriveway | Accuracy Flat-railtrack | Accuracy Flat-curb | Accuracy Human-person | Accuracy Human-rider | Accuracy Vehicle-car | Accuracy Vehicle-truck | Accuracy Vehicle-bus | Accuracy Vehicle-tramtrain | Accuracy Vehicle-motorcycle | Accuracy 
Vehicle-bicycle | Accuracy Vehicle-caravan | Accuracy Vehicle-cartrailer | Accuracy Construction-building | Accuracy Construction-door | Accuracy Construction-wall | Accuracy Construction-fenceguardrail | Accuracy Construction-bridge | Accuracy Construction-tunnel | Accuracy Construction-stairs | Accuracy Object-pole | Accuracy Object-trafficsign | Accuracy Object-trafficlight | Accuracy Nature-vegetation | Accuracy Nature-terrain | Accuracy Sky | Accuracy Void-ground | Accuracy Void-dynamic | Accuracy Void-static | Accuracy Void-unclear | Iou Unlabeled | Iou Flat-road | Iou Flat-sidewalk | Iou Flat-crosswalk | Iou Flat-cyclinglane | Iou Flat-parkingdriveway | Iou Flat-railtrack | Iou Flat-curb | Iou Human-person | Iou Human-rider | Iou Vehicle-car | Iou Vehicle-truck | Iou Vehicle-bus | Iou Vehicle-tramtrain | Iou Vehicle-motorcycle | Iou Vehicle-bicycle | Iou Vehicle-caravan | Iou Vehicle-cartrailer | Iou Construction-building | Iou Construction-door | Iou Construction-wall | Iou Construction-fenceguardrail | Iou Construction-bridge | Iou Construction-tunnel | Iou Construction-stairs | Iou Object-pole | Iou Object-trafficsign | Iou Object-trafficlight | Iou Nature-vegetation | Iou Nature-terrain | Iou Sky | Iou Void-ground | Iou Void-dynamic | Iou Void-static | Iou Void-unclear | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------------:|:----------------------:|:-----------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:------------------:|:---------------------:|:--------------------:|:--------------------:|:----------------------:|:--------------------:|:--------------------------:|:---------------------------:|:------------------------:|:------------------------:|:---------------------------:|:------------------------------:|:--------------------------:|:--------------------------:|:------------------------------------:|:----------------------------:|:----------------------------:|:----------------------------:|:--------------------:|:---------------------------:|:----------------------------:|:--------------------------:|:-----------------------:|:------------:|:--------------------:|:---------------------:|:--------------------:|:---------------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:| | 3.583 | 0.43 | 50 | 3.5690 | 0.0035 | 0.0283 | 0.0143 | 0.0179 | 0.0003 | 0.0008 | 0.0087 | 0.2546 | nan | 0.0010 | 0.0022 | 0.0015 | 0.0234 | 0.2546 | 0.0 | 0.0330 | 0.0 | 0.0004 | nan | 0.0 | 0.0105 | 0.0 | 0.0027 | 0.0120 | 0.0 | nan | 0.0 | 0.0000 | 0.0061 | 0.0439 | 0.0009 | 0.0312 | 0.0055 | 0.0143 | 0.0679 | 0.0008 | 0.0841 | nan | 0.0156 | 0.0003 | 0.0008 | 0.0069 | 0.0354 | 0.0 | 0.0007 | 0.0004 | 0.0001 | 0.0096 | 0.0010 | 0.0 | 0.0001 | 0.0 | 
0.0001 | 0.0 | 0.0 | 0.0097 | 0.0 | 0.0023 | 0.0081 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0033 | 0.0005 | 0.0009 | 0.0144 | 0.0041 | 0.0056 | 0.0014 | 0.0007 | 0.0001 | 0.0 | | 3.541 | 0.85 | 100 | 3.5349 | 0.0040 | 0.0309 | 0.0152 | 0.0195 | 0.0003 | 0.0033 | 0.0139 | 0.2434 | nan | 0.0007 | 0.0027 | 0.0032 | 0.0195 | 0.3347 | 0.0 | 0.0174 | 0.0 | 0.0005 | nan | 0.0 | 0.0105 | 0.0003 | 0.0033 | 0.0132 | 0.0 | nan | 0.0 | 0.0 | 0.0086 | 0.0445 | 0.0007 | 0.0547 | 0.0053 | 0.0147 | 0.0689 | 0.0011 | 0.0731 | nan | 0.0166 | 0.0003 | 0.0029 | 0.0102 | 0.0339 | 0.0 | 0.0005 | 0.0004 | 0.0002 | 0.0079 | 0.0013 | 0.0 | 0.0000 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0098 | 0.0002 | 0.0028 | 0.0089 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0046 | 0.0005 | 0.0007 | 0.0263 | 0.0039 | 0.0056 | 0.0014 | 0.0009 | 0.0001 | 0.0 | | 3.4975 | 1.28 | 150 | 3.5016 | 0.0044 | 0.0336 | 0.0165 | 0.0157 | 0.0004 | 0.0078 | 0.0120 | 0.2803 | nan | 0.0004 | 0.0011 | 0.0018 | 0.0176 | 0.4099 | 0.0 | 0.0113 | 0.0 | 0.0004 | nan | 0.0 | 0.0090 | 0.0008 | 0.0025 | 0.0113 | 0.0 | nan | 0.0 | 0.0 | 0.0141 | 0.0421 | 0.0004 | 0.0806 | 0.0057 | 0.0142 | 0.0624 | 0.0011 | 0.0390 | nan | 0.0136 | 0.0004 | 0.0044 | 0.0095 | 0.0338 | 0.0 | 0.0003 | 0.0002 | 0.0001 | 0.0071 | 0.0016 | 0.0 | 0.0000 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0085 | 0.0005 | 0.0022 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0073 | 0.0004 | 0.0004 | 0.0415 | 0.0042 | 0.0057 | 0.0015 | 0.0009 | 0.0000 | 0.0 | | 3.5169 | 1.71 | 200 | 3.4636 | 0.0052 | 0.0399 | 0.0195 | 0.0159 | 0.0003 | 0.0224 | 0.0081 | 0.3090 | nan | 0.0002 | 0.0010 | 0.0022 | 0.0136 | 0.4941 | 0.0 | 0.0071 | 0.0 | 0.0003 | nan | 0.0 | 0.0112 | 0.0032 | 0.0018 | 0.0168 | 0.0 | nan | 0.0 | 0.0 | 0.0127 | 0.0408 | 0.0002 | 0.1364 | 0.0040 | 0.0128 | 0.0652 | 0.0014 | 0.0562 | nan | 0.0138 | 0.0003 | 0.0047 | 0.0068 | 0.0366 | 0.0 | 0.0001 | 0.0002 | 0.0002 | 0.0062 | 0.0021 | 0.0 | 0.0000 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0106 | 0.0016 | 0.0017 | 0.0111 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0071 | 0.0006 | 0.0002 | 0.0657 | 0.0031 | 0.0052 | 0.0017 | 0.0012 | 0.0000 | 0.0 | | 3.3938 | 2.14 | 250 | 3.4255 | 0.0057 | 0.0433 | 0.0208 | 0.0139 | 0.0005 | 0.0461 | 0.0090 | 0.2873 | nan | 0.0002 | 0.0015 | 0.0016 | 0.0119 | 0.5499 | 0.0 | 0.0067 | 0.0 | 0.0003 | nan | 0.0 | 0.0101 | 0.0044 | 0.0021 | 0.0197 | 0.0 | nan | 0.0 | 0.0 | 0.0134 | 0.0390 | 0.0001 | 0.1878 | 0.0037 | 0.0157 | 0.0644 | 0.0011 | 0.0534 | nan | 0.0121 | 0.0005 | 0.0054 | 0.0075 | 0.0382 | 0.0 | 0.0001 | 0.0003 | 0.0002 | 0.0052 | 0.0025 | 0.0 | 0.0000 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0096 | 0.0021 | 0.0019 | 0.0132 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0071 | 0.0006 | 0.0001 | 0.0801 | 0.0029 | 0.0062 | 0.0018 | 0.0010 | 0.0000 | 0.0 | | 3.4231 | 2.56 | 300 | 3.3861 | 0.0062 | 0.0466 | 0.0232 | 0.0130 | 0.0005 | 0.0759 | 0.0072 | 0.2834 | nan | 0.0001 | 0.0006 | 0.0022 | 0.0097 | 0.5913 | 0.0 | 0.0017 | 0.0 | 0.0003 | nan | 0.0 | 0.0109 | 0.0069 | 0.0018 | 0.0223 | 0.0 | nan | 0.0 | 0.0 | 0.0136 | 0.0389 | 0.0000 | 0.2467 | 0.0033 | 0.0148 | 0.0719 | 0.0010 | 0.0262 | nan | 0.0113 | 0.0005 | 0.0060 | 0.0062 | 0.0426 | 0.0 | 0.0001 | 0.0002 | 0.0003 | 0.0045 | 0.0031 | 0.0 | 0.0000 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0105 | 0.0031 | 0.0016 | 0.0150 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0072 | 0.0008 | 0.0000 | 0.0909 | 0.0027 | 0.0059 | 0.0019 | 0.0010 | 0.0000 | 0.0 | | 3.3664 | 2.99 | 350 | 3.3542 | 0.0065 | 0.0503 | 0.0246 | 0.0112 | 0.0006 | 0.1034 | 0.0054 | 0.2629 | nan | 0.0002 | 0.0002 | 0.0015 | 0.0066 | 0.6758 | 0.0 | 0.0009 | 0.0 | 0.0002 | nan | 0.0 | 0.0073 | 0.0123 | 0.0020 | 0.0218 | 0.0 
| nan | 0.0 | 0.0 | 0.0118 | 0.0363 | 0.0 | 0.3115 | 0.0032 | 0.0066 | 0.0634 | 0.0008 | 0.0131 | nan | 0.0099 | 0.0006 | 0.0065 | 0.0048 | 0.0455 | 0.0 | 0.0001 | 0.0001 | 0.0002 | 0.0035 | 0.0031 | 0.0 | 0.0000 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0071 | 0.0042 | 0.0019 | 0.0151 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0072 | 0.0008 | 0.0 | 0.1091 | 0.0027 | 0.0030 | 0.0021 | 0.0007 | 0.0000 | 0.0 | | 3.3089 | 3.42 | 400 | 3.3039 | 0.0069 | 0.0551 | 0.0281 | 0.0107 | 0.0006 | 0.1407 | 0.0043 | 0.2687 | nan | 0.0002 | 0.0002 | 0.0011 | 0.0051 | 0.6984 | 0.0 | 0.0002 | 0.0 | 0.0002 | nan | 0.0 | 0.0081 | 0.0164 | 0.0018 | 0.0267 | 0.0 | nan | 0.0 | 0.0 | 0.0104 | 0.0366 | 0.0 | 0.3861 | 0.0041 | 0.0079 | 0.0729 | 0.0007 | 0.0069 | nan | 0.0094 | 0.0006 | 0.0075 | 0.0039 | 0.0495 | 0.0 | 0.0002 | 0.0001 | 0.0002 | 0.0025 | 0.0037 | 0.0 | 0.0000 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0079 | 0.0054 | 0.0018 | 0.0179 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0066 | 0.0010 | 0.0 | 0.1123 | 0.0033 | 0.0032 | 0.0025 | 0.0007 | 0.0000 | 0.0 | | 3.2464 | 3.85 | 450 | 3.2705 | 0.0070 | 0.0575 | 0.0288 | 0.0092 | 0.0006 | 0.1587 | 0.0043 | 0.2538 | nan | 0.0001 | 0.0 | 0.0001 | 0.0027 | 0.7500 | 0.0 | 0.0001 | 0.0 | 0.0001 | nan | 0.0 | 0.0068 | 0.0248 | 0.0020 | 0.0284 | 0.0 | nan | 0.0 | 0.0 | 0.0089 | 0.0347 | 0.0 | 0.4160 | 0.0040 | 0.0057 | 0.0694 | 0.0005 | 0.0007 | nan | 0.0083 | 0.0006 | 0.0080 | 0.0040 | 0.0504 | 0.0 | 0.0001 | 0.0 | 0.0000 | 0.0016 | 0.0036 | 0.0 | 0.0000 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0067 | 0.0065 | 0.0019 | 0.0191 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0060 | 0.0012 | 0.0 | 0.1181 | 0.0032 | 0.0025 | 0.0026 | 0.0005 | 0.0000 | 0.0 | | 3.2788 | 4.27 | 500 | 3.2230 | 0.0074 | 0.0616 | 0.0320 | 0.0100 | 0.0009 | 0.1784 | 0.0041 | 0.2465 | nan | 0.0001 | 0.0 | 0.0 | 0.0020 | 0.7658 | 0.0 | 0.0 | 0.0 | 0.0001 | nan | 0.0 | 0.0060 | 0.0325 | 0.0018 | 0.0373 | 0.0 | nan | 0.0 | 0.0 | 0.0075 | 0.0318 | 0.0 | 0.4977 | 0.0046 | 0.0049 | 0.0783 | 0.0004 | 0.0 | nan | 0.0089 | 0.0009 | 0.0084 | 0.0038 | 0.0523 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0012 | 0.0040 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0058 | 0.0071 | 0.0018 | 0.0231 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0048 | 0.0013 | 0.0 | 0.1257 | 0.0037 | 0.0023 | 0.0029 | 0.0004 | 0.0 | 0.0 | | 3.2179 | 4.7 | 550 | 3.1821 | 0.0074 | 0.0641 | 0.0337 | 0.0090 | 0.0011 | 0.2000 | 0.0030 | 0.2313 | nan | 0.0001 | 0.0 | 0.0 | 0.0016 | 0.7777 | 0.0 | 0.0 | 0.0 | 0.0000 | nan | 0.0 | 0.0054 | 0.0473 | 0.0018 | 0.0371 | 0.0 | nan | 0.0 | 0.0 | 0.0042 | 0.0317 | 0.0 | 0.5484 | 0.0049 | 0.0047 | 0.0782 | 0.0004 | 0.0 | nan | 0.0081 | 0.0011 | 0.0090 | 0.0028 | 0.0523 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0010 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0053 | 0.0088 | 0.0018 | 0.0229 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0027 | 0.0016 | 0.0 | 0.1282 | 0.0039 | 0.0022 | 0.0034 | 0.0003 | 0.0 | 0.0 | | 3.1906 | 5.13 | 600 | 3.1424 | 0.0074 | 0.0651 | 0.0338 | 0.0104 | 0.0015 | 0.2149 | 0.0034 | 0.2067 | nan | 0.0001 | 0.0 | 0.0 | 0.0011 | 0.8076 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0044 | 0.0642 | 0.0017 | 0.0322 | 0.0 | nan | 0.0 | 0.0 | 0.0024 | 0.0266 | 0.0 | 0.5584 | 0.0044 | 0.0047 | 0.0734 | 0.0002 | 0.0 | nan | 0.0093 | 0.0015 | 0.0094 | 0.0032 | 0.0515 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0008 | 0.0041 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0044 | 0.0099 | 0.0017 | 0.0211 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0016 | 0.0 | 0.1304 | 0.0035 | 0.0021 | 0.0034 | 0.0002 | 0.0 | 0.0 | | 3.2075 | 5.56 | 650 | 3.1091 | 0.0075 | 0.0678 | 0.0367 | 0.0088 | 0.0019 | 0.2249 | 0.0032 | 0.2201 | nan | 0.0000 | 0.0 | 0.0 | 
0.0009 | 0.8058 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0035 | 0.0846 | 0.0014 | 0.0332 | 0.0 | nan | 0.0 | 0.0 | 0.0009 | 0.0145 | 0.0 | 0.6252 | 0.0050 | 0.0049 | 0.0639 | 0.0002 | 0.0 | nan | 0.0080 | 0.0018 | 0.0096 | 0.0030 | 0.0525 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0006 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0034 | 0.0114 | 0.0014 | 0.0220 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0013 | 0.0 | 0.1327 | 0.0041 | 0.0022 | 0.0035 | 0.0002 | 0.0 | 0.0 | | 3.0888 | 5.98 | 700 | 3.0596 | 0.0075 | 0.0690 | 0.0364 | 0.0080 | 0.0025 | 0.2672 | 0.0028 | 0.1936 | nan | 0.0001 | 0.0 | 0.0 | 0.0008 | 0.8043 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0033 | 0.1121 | 0.0018 | 0.0380 | 0.0 | nan | 0.0 | 0.0000 | 0.0007 | 0.0109 | 0.0 | 0.6151 | 0.0053 | 0.0040 | 0.0672 | 0.0001 | 0.0 | nan | 0.0073 | 0.0024 | 0.0108 | 0.0027 | 0.0502 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0006 | 0.0044 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0033 | 0.0125 | 0.0018 | 0.0244 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0005 | 0.0012 | 0.0 | 0.1313 | 0.0043 | 0.0017 | 0.0041 | 0.0001 | 0.0 | 0.0 | | 3.1177 | 6.41 | 750 | 3.0297 | 0.0075 | 0.0700 | 0.0387 | 0.0082 | 0.0029 | 0.2403 | 0.0022 | 0.2030 | nan | 0.0000 | 0.0 | 0.0 | 0.0004 | 0.8050 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0028 | 0.1223 | 0.0012 | 0.0318 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0030 | 0.0 | 0.6823 | 0.0056 | 0.0042 | 0.0556 | 0.0001 | 0.0 | nan | 0.0077 | 0.0028 | 0.0099 | 0.0022 | 0.0513 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0003 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0028 | 0.0131 | 0.0012 | 0.0210 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0005 | 0.0 | 0.1343 | 0.0045 | 0.0018 | 0.0039 | 0.0001 | 0.0 | 0.0 | | 3.0981 | 6.84 | 800 | 2.9921 | 0.0075 | 0.0716 | 0.0382 | 0.0071 | 0.0029 | 0.3004 | 0.0017 | 0.1831 | nan | 0.0000 | 0.0 | 0.0 | 0.0004 | 0.8008 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0026 | 0.1617 | 0.0013 | 0.0377 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0003 | 0.0 | 0.6584 | 0.0052 | 0.0035 | 0.0532 | 0.0001 | 0.0 | nan | 0.0066 | 0.0028 | 0.0117 | 0.0016 | 0.0489 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0003 | 0.0045 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0026 | 0.0139 | 0.0013 | 0.0245 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0001 | 0.0 | 0.1324 | 0.0042 | 0.0013 | 0.0044 | 0.0001 | 0.0 | 0.0 | | 2.9986 | 7.26 | 850 | 2.9500 | 0.0073 | 0.0718 | 0.0379 | 0.0051 | 0.0029 | 0.3269 | 0.0015 | 0.1622 | nan | 0.0000 | 0.0 | 0.0 | 0.0002 | 0.7995 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0019 | 0.1661 | 0.0011 | 0.0342 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.6631 | 0.0057 | 0.0034 | 0.0505 | 0.0000 | 0.0 | nan | 0.0049 | 0.0028 | 0.0124 | 0.0015 | 0.0477 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0002 | 0.0043 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0019 | 0.0139 | 0.0011 | 0.0226 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.1328 | 0.0044 | 0.0012 | 0.0047 | 0.0000 | 0.0 | 0.0 | | 3.01 | 7.69 | 900 | 2.9155 | 0.0075 | 0.0726 | 0.0397 | 0.0057 | 0.0048 | 0.2827 | 0.0016 | 0.1780 | nan | 0.0 | 0.0 | 0.0 | 0.0001 | 0.7988 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0018 | 0.1896 | 0.0014 | 0.0313 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7004 | 0.0053 | 0.0027 | 0.0462 | 0.0 | 0.0 | nan | 0.0054 | 0.0044 | 0.0112 | 0.0016 | 0.0495 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0043 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0018 | 0.0146 | 0.0014 | 0.0215 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1358 | 0.0041 | 0.0010 | 0.0043 | 0.0 | 0.0 | 0.0 | | 2.7829 | 8.12 | 950 | 2.9108 | 0.0075 | 0.0737 | 0.0396 | 0.0046 | 0.0056 | 0.3126 | 0.0016 | 0.1523 | nan | 0.0 | 0.0 | 0.0 | 0.0001 | 0.7949 | 
0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0012 | 0.2263 | 0.0013 | 0.0359 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.6970 | 0.0047 | 0.0030 | 0.0449 | 0.0 | 0.0 | nan | 0.0045 | 0.0051 | 0.0120 | 0.0016 | 0.0474 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0045 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0012 | 0.0158 | 0.0013 | 0.0238 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.1359 | 0.0036 | 0.0010 | 0.0057 | 0.0 | 0.0 | 0.0 | | 2.8855 | 8.55 | 1000 | 2.8632 | 0.0075 | 0.0762 | 0.0407 | 0.0035 | 0.0060 | 0.3065 | 0.0012 | 0.1668 | nan | 0.0 | 0.0 | 0.0 | 0.0000 | 0.7886 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0011 | 0.2899 | 0.0015 | 0.0307 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.7177 | 0.0046 | 0.0028 | 0.0425 | 0.0 | 0.0 | nan | 0.0034 | 0.0053 | 0.0118 | 0.0012 | 0.0482 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0011 | 0.0177 | 0.0015 | 0.0216 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.1363 | 0.0035 | 0.0008 | 0.0067 | 0.0 | 0.0 | 0.0 | | 2.9367 | 8.97 | 1050 | 2.8275 | 0.0073 | 0.0747 | 0.0400 | 0.0028 | 0.0064 | 0.3273 | 0.0011 | 0.1432 | nan | 0.0 | 0.0 | 0.0 | 0.0000 | 0.7941 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0007 | 0.2668 | 0.0013 | 0.0213 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7122 | 0.0036 | 0.0029 | 0.0306 | 0.0 | 0.0 | nan | 0.0027 | 0.0057 | 0.0123 | 0.0011 | 0.0455 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0044 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0169 | 0.0013 | 0.0157 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1387 | 0.0028 | 0.0007 | 0.0061 | 0.0 | 0.0 | 0.0 | | 2.7874 | 9.4 | 1100 | 2.7847 | 0.0072 | 0.0772 | 0.0396 | 0.0020 | 0.0044 | 0.3967 | 0.0012 | 0.1218 | nan | 0.0 | 0.0 | 0.0 | 0.0000 | 0.7892 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0007 | 0.3182 | 0.0009 | 0.0262 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7052 | 0.0033 | 0.0022 | 0.0224 | 0.0 | 0.0 | nan | 0.0020 | 0.0040 | 0.0142 | 0.0012 | 0.0427 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0045 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0183 | 0.0009 | 0.0183 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1382 | 0.0025 | 0.0005 | 0.0057 | 0.0 | 0.0 | 0.0 | | 2.7519 | 9.83 | 1150 | 2.7757 | 0.0074 | 0.0773 | 0.0408 | 0.0016 | 0.0078 | 0.3511 | 0.0007 | 0.1389 | nan | 0.0 | 0.0 | 0.0 | 0.0000 | 0.7855 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0005 | 0.3444 | 0.0012 | 0.0257 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7131 | 0.0030 | 0.0018 | 0.0220 | 0.0 | 0.0 | nan | 0.0016 | 0.0068 | 0.0130 | 0.0007 | 0.0451 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0048 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0190 | 0.0012 | 0.0186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1375 | 0.0023 | 0.0004 | 0.0067 | 0.0 | 0.0 | 0.0 | | 2.8513 | 10.26 | 1200 | 2.7939 | 0.0076 | 0.0773 | 0.0420 | 0.0021 | 0.0113 | 0.3051 | 0.0007 | 0.1357 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7816 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0003 | 0.3811 | 0.0013 | 0.0213 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7334 | 0.0018 | 0.0024 | 0.0175 | 0.0 | 0.0 | nan | 0.0021 | 0.0096 | 0.0117 | 0.0007 | 0.0452 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0003 | 0.0194 | 0.0013 | 0.0157 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1406 | 0.0014 | 0.0005 | 0.0058 | 0.0 | 0.0 | 0.0 | | 2.9147 | 10.68 | 1250 | 2.7415 | 0.0073 | 0.0800 | 0.0437 | 0.0016 | 0.0124 | 0.3254 | 0.0004 | 0.1348 | nan | 0.0 | 0.0 | 0.0 | 0.0000 | 0.7681 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0003 | 0.4424 | 0.0013 | 0.0158 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7583 
| 0.0027 | 0.0016 | 0.0161 | 0.0 | 0.0 | nan | 0.0016 | 0.0105 | 0.0123 | 0.0004 | 0.0444 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0055 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0210 | 0.0013 | 0.0118 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1361 | 0.0020 | 0.0004 | 0.0062 | 0.0 | 0.0 | 0.0 | | 2.9311 | 11.11 | 1300 | 2.7368 | 0.0071 | 0.0793 | 0.0437 | 0.0013 | 0.0122 | 0.3176 | 0.0005 | 0.1358 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7735 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.4230 | 0.0011 | 0.0100 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7709 | 0.0014 | 0.0014 | 0.0102 | 0.0 | 0.0 | nan | 0.0012 | 0.0103 | 0.0121 | 0.0005 | 0.0446 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0205 | 0.0011 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1390 | 0.0011 | 0.0003 | 0.0044 | 0.0 | 0.0 | 0.0 | | 2.7803 | 11.54 | 1350 | 2.6650 | 0.0074 | 0.0813 | 0.0444 | 0.0014 | 0.0140 | 0.3329 | 0.0005 | 0.1260 | nan | 0.0 | 0.0 | 0.0 | 0.0000 | 0.7640 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0002 | 0.4862 | 0.0011 | 0.0122 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7668 | 0.0018 | 0.0015 | 0.0107 | 0.0 | 0.0 | nan | 0.0014 | 0.0116 | 0.0125 | 0.0005 | 0.0432 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0057 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0002 | 0.0218 | 0.0011 | 0.0093 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1369 | 0.0014 | 0.0003 | 0.0050 | 0.0 | 0.0 | 0.0 | | 2.8496 | 11.97 | 1400 | 2.6764 | 0.0073 | 0.0794 | 0.0434 | 0.0009 | 0.0170 | 0.3417 | 0.0005 | 0.0959 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7733 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.4785 | 0.0010 | 0.0090 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7354 | 0.0010 | 0.0012 | 0.0066 | 0.0 | 0.0 | nan | 0.0009 | 0.0139 | 0.0128 | 0.0005 | 0.0392 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.0210 | 0.0010 | 0.0073 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1408 | 0.0007 | 0.0002 | 0.0035 | 0.0 | 0.0 | 0.0 | | 2.9356 | 12.39 | 1450 | 2.6741 | 0.0075 | 0.0819 | 0.0451 | 0.0009 | 0.0190 | 0.3118 | 0.0005 | 0.1236 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7635 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.5493 | 0.0008 | 0.0110 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7487 | 0.0006 | 0.0008 | 0.0074 | 0.0 | 0.0 | nan | 0.0009 | 0.0154 | 0.0120 | 0.0005 | 0.0447 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0055 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.0220 | 0.0008 | 0.0086 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1411 | 0.0004 | 0.0002 | 0.0040 | 0.0 | 0.0 | 0.0 | | 2.7973 | 12.82 | 1500 | 2.6146 | 0.0073 | 0.0828 | 0.0454 | 0.0007 | 0.0191 | 0.3251 | 0.0005 | 0.1108 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7611 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.5792 | 0.0008 | 0.0056 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7606 | 0.0005 | 0.0009 | 0.0028 | 0.0 | 0.0 | nan | 0.0007 | 0.0155 | 0.0123 | 0.0004 | 0.0422 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0056 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.0227 | 0.0008 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1418 | 0.0003 | 0.0002 | 0.0018 | 0.0 | 0.0 | 0.0 | | 2.5603 | 13.25 | 1550 | 2.6177 | 0.0074 | 0.0830 | 0.0463 | 0.0007 | 0.0228 | 0.3112 | 0.0006 | 0.1018 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7591 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.6096 | 0.0007 | 0.0062 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7584 | 0.0002 | 0.0007 | 0.0022 | 0.0 | 0.0 | nan | 0.0007 | 0.0181 | 0.0120 | 0.0006 | 0.0412 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0056 | 0.0 | 0.0 | 0.0 | 
0.0 | nan | 0.0 | 0.0000 | 0.0228 | 0.0007 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1425 | 0.0001 | 0.0001 | 0.0015 | 0.0 | 0.0 | 0.0 | | 2.7181 | 13.68 | 1600 | 2.5973 | 0.0074 | 0.0822 | 0.0472 | 0.0005 | 0.0246 | 0.2926 | 0.0005 | 0.1116 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7616 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.5796 | 0.0007 | 0.0033 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7706 | 0.0002 | 0.0003 | 0.0013 | 0.0 | 0.0 | nan | 0.0005 | 0.0195 | 0.0114 | 0.0005 | 0.0438 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0056 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.0230 | 0.0007 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1411 | 0.0002 | 0.0001 | 0.0009 | 0.0 | 0.0 | 0.0 | | 2.6034 | 14.1 | 1650 | 2.5914 | 0.0073 | 0.0843 | 0.0469 | 0.0005 | 0.0228 | 0.3196 | 0.0004 | 0.1056 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7544 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.6371 | 0.0006 | 0.0031 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7686 | 0.0002 | 0.0004 | 0.0013 | 0.0 | 0.0 | nan | 0.0005 | 0.0181 | 0.0122 | 0.0004 | 0.0417 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0060 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.0238 | 0.0006 | 0.0027 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1402 | 0.0001 | 0.0001 | 0.0009 | 0.0 | 0.0 | 0.0 | | 2.6792 | 14.53 | 1700 | 2.5773 | 0.0073 | 0.0857 | 0.0463 | 0.0003 | 0.0232 | 0.3367 | 0.0005 | 0.0953 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7459 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7004 | 0.0004 | 0.0066 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7443 | 0.0000 | 0.0002 | 0.0015 | 0.0 | 0.0 | nan | 0.0003 | 0.0184 | 0.0127 | 0.0005 | 0.0395 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0062 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0237 | 0.0004 | 0.0055 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1410 | 0.0000 | 0.0000 | 0.0011 | 0.0 | 0.0 | 0.0 | | 2.5936 | 14.96 | 1750 | 2.5147 | 0.0072 | 0.0859 | 0.0459 | 0.0003 | 0.0228 | 0.3530 | 0.0005 | 0.0813 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7462 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7110 | 0.0004 | 0.0043 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7411 | 0.0000 | 0.0002 | 0.0006 | 0.0 | 0.0 | nan | 0.0003 | 0.0181 | 0.0131 | 0.0005 | 0.0362 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0062 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0236 | 0.0004 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1414 | 0.0000 | 0.0000 | 0.0005 | 0.0 | 0.0 | 0.0 | | 2.7846 | 15.38 | 1800 | 2.5288 | 0.0074 | 0.0855 | 0.0491 | 0.0003 | 0.0307 | 0.3131 | 0.0004 | 0.0913 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7464 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.6968 | 0.0005 | 0.0032 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7653 | 0.0000 | 0.0001 | 0.0008 | 0.0 | 0.0 | nan | 0.0003 | 0.0237 | 0.0121 | 0.0004 | 0.0395 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0063 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0242 | 0.0005 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1411 | 0.0000 | 0.0000 | 0.0006 | 0.0 | 0.0 | 0.0 | | 2.7533 | 15.81 | 1850 | 2.5347 | 0.0074 | 0.0849 | 0.0480 | 0.0002 | 0.0296 | 0.3023 | 0.0005 | 0.0928 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7478 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7060 | 0.0004 | 0.0031 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7481 | 0.0000 | 0.0001 | 0.0002 | 0.0 | 0.0 | nan | 0.0002 | 0.0229 | 0.0117 | 0.0005 | 0.0408 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0060 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0239 | 0.0004 | 0.0027 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1413 | 0.0000 | 0.0000 | 0.0001 | 0.0 | 0.0 | 0.0 | | 2.6202 
| 16.24 | 1900 | 2.5233 | 0.0074 | 0.0854 | 0.0484 | 0.0002 | 0.0319 | 0.3158 | 0.0005 | 0.0869 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7452 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7275 | 0.0004 | 0.0038 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7336 | 0.0 | 0.0000 | 0.0003 | 0.0 | 0.0 | nan | 0.0002 | 0.0246 | 0.0121 | 0.0005 | 0.0393 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0062 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0235 | 0.0004 | 0.0034 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1412 | 0.0 | 0.0000 | 0.0002 | 0.0 | 0.0 | 0.0 | | 2.7831 | 16.67 | 1950 | 2.4872 | 0.0072 | 0.0865 | 0.0486 | 0.0001 | 0.0311 | 0.3382 | 0.0005 | 0.0763 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7401 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7483 | 0.0002 | 0.0027 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7429 | 0.0000 | 0.0000 | 0.0006 | 0.0 | 0.0 | nan | 0.0001 | 0.0240 | 0.0128 | 0.0005 | 0.0355 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0067 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0240 | 0.0002 | 0.0024 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1395 | 0.0000 | 0.0000 | 0.0005 | 0.0 | 0.0 | 0.0 | | 2.6771 | 17.09 | 2000 | 2.4556 | 0.0073 | 0.0866 | 0.0501 | 0.0002 | 0.0324 | 0.3120 | 0.0006 | 0.0705 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7385 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7390 | 0.0002 | 0.0023 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7900 | 0.0 | 0.0000 | 0.0001 | 0.0 | 0.0 | nan | 0.0002 | 0.0248 | 0.0120 | 0.0005 | 0.0346 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0067 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0243 | 0.0002 | 0.0020 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1415 | 0.0 | 0.0000 | 0.0000 | 0.0 | 0.0 | 0.0 | | 2.4096 | 17.52 | 2050 | 2.4846 | 0.0075 | 0.0859 | 0.0507 | 0.0001 | 0.0400 | 0.2959 | 0.0005 | 0.0812 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7376 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7732 | 0.0002 | 0.0027 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7321 | 0.0 | 0.0000 | 0.0005 | 0.0 | 0.0 | nan | 0.0001 | 0.0301 | 0.0116 | 0.0005 | 0.0383 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0069 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0234 | 0.0002 | 0.0024 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1399 | 0.0 | 0.0000 | 0.0004 | 0.0 | 0.0 | 0.0 | | 2.4462 | 17.95 | 2100 | 2.4614 | 0.0075 | 0.0857 | 0.0510 | 0.0001 | 0.0405 | 0.2981 | 0.0004 | 0.0742 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7409 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7569 | 0.0002 | 0.0023 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7419 | 0.0 | 0.0 | 0.0004 | 0.0 | 0.0 | nan | 0.0001 | 0.0303 | 0.0116 | 0.0004 | 0.0373 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0066 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0237 | 0.0002 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1409 | 0.0 | 0.0 | 0.0003 | 0.0 | 0.0 | 0.0 | | 2.4129 | 18.38 | 2150 | 2.4534 | 0.0075 | 0.0858 | 0.0504 | 0.0001 | 0.0399 | 0.3024 | 0.0004 | 0.0796 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7393 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7732 | 0.0001 | 0.0027 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7230 | 0.0 | 0.0 | 0.0003 | 0.0 | 0.0 | nan | 0.0001 | 0.0300 | 0.0118 | 0.0004 | 0.0388 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0068 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0235 | 0.0001 | 0.0025 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1393 | 0.0 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | | 2.4048 | 18.8 | 2200 | 2.4267 | 0.0073 | 0.0858 | 0.0474 | 0.0001 | 0.0349 | 0.3520 | 0.0004 | 0.0593 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7348 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.7945 | 0.0001 | 
0.0046 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.6776 | 0.0 | 0.0000 | 0.0004 | 0.0 | 0.0 | nan | 0.0001 | 0.0265 | 0.0132 | 0.0004 | 0.0314 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0071 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0226 | 0.0001 | 0.0043 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.1353 | 0.0 | 0.0000 | 0.0003 | 0.0 | 0.0 | nan | | 2.4369 | 19.23 | 2250 | 2.4101 | 0.0075 | 0.0866 | 0.0501 | 0.0000 | 0.0377 | 0.3220 | 0.0004 | 0.0694 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7360 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7833 | 0.0001 | 0.0025 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7331 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | nan | 0.0000 | 0.0284 | 0.0123 | 0.0004 | 0.0352 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0070 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0233 | 0.0001 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1399 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | nan | | 2.4132 | 19.66 | 2300 | 2.4152 | 0.0078 | 0.0863 | 0.0530 | 0.0001 | 0.0452 | 0.2873 | 0.0003 | 0.0791 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7368 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7715 | 0.0001 | 0.0022 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7518 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0001 | 0.0334 | 0.0113 | 0.0003 | 0.0395 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0070 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0238 | 0.0001 | 0.0020 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1405 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | | 2.3273 | 20.09 | 2350 | 2.3988 | 0.0075 | 0.0866 | 0.0500 | 0.0000 | 0.0381 | 0.3221 | 0.0004 | 0.0691 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7351 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7904 | 0.0001 | 0.0023 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7258 | 0.0 | 0.0000 | 0.0000 | 0.0 | 0.0 | nan | 0.0000 | 0.0287 | 0.0124 | 0.0004 | 0.0353 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0071 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0231 | 0.0001 | 0.0022 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1392 | 0.0 | 0.0000 | 0.0000 | 0.0 | 0.0 | nan | | 2.7112 | 20.51 | 2400 | 2.3673 | 0.0074 | 0.0866 | 0.0520 | 0.0000 | 0.0426 | 0.3041 | 0.0004 | 0.0661 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7337 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7843 | 0.0001 | 0.0018 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7502 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0000 | 0.0317 | 0.0118 | 0.0004 | 0.0347 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0073 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0236 | 0.0001 | 0.0017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1397 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | | 2.4112 | 20.94 | 2450 | 2.3806 | 0.0078 | 0.0864 | 0.0528 | 0.0000 | 0.0441 | 0.2836 | 0.0004 | 0.0771 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7364 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7745 | 0.0001 | 0.0018 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7590 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | 0.0000 | 0.0327 | 0.0112 | 0.0004 | 0.0391 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0069 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0235 | 0.0001 | 0.0017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1416 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.4846 | 21.37 | 2500 | 2.3986 | 0.0075 | 0.0856 | 0.0530 | 0.0000 | 0.0499 | 0.2838 | 0.0004 | 0.0694 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7335 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.7987 | 0.0001 | 0.0024 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7143 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | nan | 0.0000 | 0.0364 | 0.0113 | 0.0004 | 0.0369 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0072 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 
0.0 | 0.0227 | 0.0001 | 0.0022 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1385 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | | 2.5263 | 21.79 | 2550 | 2.3352 | 0.0074 | 0.0864 | 0.0496 | 0.0000 | 0.0387 | 0.3347 | 0.0004 | 0.0515 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7344 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.7991 | 0.0001 | 0.0022 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7161 | 0.0 | 0.0000 | 0.0000 | 0.0 | 0.0 | nan | 0.0000 | 0.0290 | 0.0127 | 0.0004 | 0.0297 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0071 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0227 | 0.0001 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1389 | 0.0 | 0.0000 | 0.0000 | 0.0 | 0.0 | nan | | 2.453 | 22.22 | 2600 | 2.3465 | 0.0076 | 0.0859 | 0.0539 | 0.0000 | 0.0505 | 0.2887 | 0.0003 | 0.0537 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7331 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7891 | 0.0001 | 0.0023 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7443 | 0.0 | 0.0000 | 0.0000 | 0.0 | 0.0 | nan | 0.0000 | 0.0368 | 0.0114 | 0.0003 | 0.0315 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0235 | 0.0001 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1386 | 0.0 | 0.0000 | 0.0000 | 0.0 | 0.0 | nan | | 2.7019 | 22.65 | 2650 | 2.3168 | 0.0075 | 0.0868 | 0.0511 | 0.0000 | 0.0408 | 0.3173 | 0.0004 | 0.0563 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7320 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7998 | 0.0001 | 0.0023 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7425 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0000 | 0.0303 | 0.0122 | 0.0004 | 0.0317 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0232 | 0.0001 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1395 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.7737 | 23.08 | 2700 | 2.3540 | 0.0074 | 0.0859 | 0.0512 | 0.0000 | 0.0454 | 0.2955 | 0.0004 | 0.0657 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7311 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8175 | 0.0001 | 0.0020 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7042 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0000 | 0.0334 | 0.0116 | 0.0004 | 0.0359 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0073 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0222 | 0.0001 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1383 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.369 | 23.5 | 2750 | 2.3262 | 0.0076 | 0.0873 | 0.0528 | 0.0000 | 0.0433 | 0.3015 | 0.0004 | 0.0616 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7300 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.7972 | 0.0001 | 0.0020 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7695 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | nan | 0.0000 | 0.0320 | 0.0118 | 0.0004 | 0.0341 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0076 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0234 | 0.0001 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1410 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.4282 | 23.93 | 2800 | 2.3334 | 0.0077 | 0.0866 | 0.0519 | 0.0000 | 0.0442 | 0.2954 | 0.0004 | 0.0671 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7318 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8095 | 0.0001 | 0.0018 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7334 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0000 | 0.0325 | 0.0116 | 0.0004 | 0.0366 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0073 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0226 | 0.0001 | 0.0017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1401 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.2932 | 24.36 | 2850 | 2.2981 | 0.0076 | 0.0864 | 0.0520 | 0.0000 | 0.0451 | 0.3003 | 0.0003 | 0.0589 | nan | 0.0 
| 0.0 | 0.0 | 0.0 | 0.7315 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8105 | 0.0001 | 0.0018 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7309 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0000 | 0.0332 | 0.0117 | 0.0003 | 0.0335 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0227 | 0.0001 | 0.0017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1399 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.334 | 24.79 | 2900 | 2.3066 | 0.0071 | 0.0863 | 0.0526 | 0.0000 | 0.0481 | 0.3031 | 0.0003 | 0.0563 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7274 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8170 | 0.0000 | 0.0023 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7204 | 0.0 | 0.0000 | 0.0000 | 0.0 | 0.0 | nan | 0.0000 | 0.0351 | 0.0118 | 0.0003 | 0.0327 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0230 | 0.0000 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1365 | 0.0 | 0.0000 | 0.0000 | 0.0 | 0.0 | 0.0 | | 2.547 | 25.21 | 2950 | 2.3064 | 0.0077 | 0.0861 | 0.0537 | 0.0000 | 0.0512 | 0.2805 | 0.0003 | 0.0633 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7264 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8193 | 0.0000 | 0.0019 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7269 | 0.0 | 0.0001 | 0.0000 | 0.0 | 0.0 | nan | 0.0000 | 0.0370 | 0.0111 | 0.0003 | 0.0358 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0228 | 0.0000 | 0.0018 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1373 | 0.0 | 0.0000 | 0.0000 | 0.0 | 0.0 | nan | | 2.3814 | 25.64 | 3000 | 2.3104 | 0.0076 | 0.0862 | 0.0519 | 0.0000 | 0.0472 | 0.2887 | 0.0003 | 0.0643 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7270 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8368 | 0.0001 | 0.0017 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7072 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0000 | 0.0345 | 0.0114 | 0.0003 | 0.0362 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0076 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0221 | 0.0001 | 0.0017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1380 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.3606 | 26.07 | 3050 | 2.2659 | 0.0073 | 0.0853 | 0.0503 | 0.0000 | 0.0451 | 0.3161 | 0.0003 | 0.0466 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7313 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8180 | 0.0000 | 0.0018 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6860 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0000 | 0.0330 | 0.0122 | 0.0003 | 0.0287 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0076 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0227 | 0.0000 | 0.0017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1333 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.5409 | 26.5 | 3100 | 2.2998 | 0.0074 | 0.0865 | 0.0538 | 0.0000 | 0.0502 | 0.2868 | 0.0002 | 0.0582 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7246 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8215 | 0.0000 | 0.0015 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7395 | 0.0 | 0.0002 | 0.0000 | 0.0 | 0.0 | nan | 0.0000 | 0.0364 | 0.0113 | 0.0002 | 0.0340 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0081 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0230 | 0.0000 | 0.0015 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1379 | 0.0 | 0.0000 | 0.0000 | 0.0 | 0.0 | nan | | 2.3258 | 26.92 | 3150 | 2.2954 | 0.0077 | 0.0860 | 0.0531 | 0.0 | 0.0498 | 0.2769 | 0.0003 | 0.0603 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7307 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8194 | 0.0001 | 0.0017 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7267 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0361 | 0.0110 | 0.0003 | 0.0353 | 0.0 | 0.0 | 0.0 | 0.0 | 
0.0 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0225 | 0.0001 | 0.0017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1391 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.3487 | 27.35 | 3200 | 2.2857 | 0.0076 | 0.0860 | 0.0532 | 0.0 | 0.0509 | 0.2855 | 0.0003 | 0.0558 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7280 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8264 | 0.0000 | 0.0016 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7172 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0368 | 0.0113 | 0.0003 | 0.0333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0226 | 0.0000 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1372 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.2953 | 27.78 | 3250 | 2.2727 | 0.0075 | 0.0861 | 0.0521 | 0.0 | 0.0496 | 0.2988 | 0.0002 | 0.0515 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7236 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8456 | 0.0000 | 0.0013 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6973 | 0.0 | 0.0001 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0359 | 0.0117 | 0.0002 | 0.0316 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0080 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0223 | 0.0000 | 0.0012 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1353 | 0.0 | 0.0000 | 0.0000 | 0.0 | 0.0 | nan | | 2.3487 | 28.21 | 3300 | 2.3029 | 0.0075 | 0.0855 | 0.0522 | 0.0 | 0.0500 | 0.2892 | 0.0002 | 0.0581 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7283 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8276 | 0.0000 | 0.0019 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6952 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0362 | 0.0114 | 0.0002 | 0.0344 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0078 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0228 | 0.0000 | 0.0018 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1340 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.3138 | 28.63 | 3350 | 2.2554 | 0.0074 | 0.0860 | 0.0504 | 0.0 | 0.0444 | 0.3070 | 0.0003 | 0.0495 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7290 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8402 | 0.0000 | 0.0012 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6956 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0325 | 0.0119 | 0.0003 | 0.0306 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0075 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0222 | 0.0000 | 0.0012 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1364 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.3157 | 29.06 | 3400 | 2.2679 | 0.0074 | 0.0860 | 0.0539 | 0.0 | 0.0529 | 0.2817 | 0.0001 | 0.0563 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7253 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8286 | 0.0000 | 0.0012 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7203 | 0.0 | 0.0000 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0380 | 0.0112 | 0.0001 | 0.0340 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0081 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0230 | 0.0000 | 0.0012 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1355 | 0.0 | 0.0000 | 0.0000 | 0.0 | 0.0 | nan | | 2.2678 | 29.49 | 3450 | 2.2538 | 0.0073 | 0.0867 | 0.0497 | 0.0 | 0.0414 | 0.3204 | 0.0003 | 0.0485 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7255 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8487 | 0.0001 | 0.0009 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7005 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0305 | 0.0123 | 0.0003 | 0.0296 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0078 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0221 | 0.0001 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1367 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.3993 | 29.91 | 3500 | 2.2745 | 0.0075 | 0.0865 | 0.0557 | 0.0 | 0.0555 | 0.2680 | 0.0002 | 0.0589 | nan | 0.0 | 
0.0 | 0.0 | 0.0 | 0.7239 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8238 | 0.0000 | 0.0014 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7502 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0397 | 0.0108 | 0.0002 | 0.0352 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0082 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0231 | 0.0000 | 0.0013 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1381 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.2576 | 30.34 | 3550 | 2.2471 | 0.0074 | 0.0851 | 0.0524 | 0.0 | 0.0520 | 0.2844 | 0.0002 | 0.0489 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7285 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8330 | 0.0000 | 0.0014 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6892 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0375 | 0.0113 | 0.0002 | 0.0311 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0224 | 0.0000 | 0.0013 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1337 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.2889 | 30.77 | 3600 | 2.2595 | 0.0075 | 0.0860 | 0.0525 | 0.0 | 0.0496 | 0.2902 | 0.0002 | 0.0503 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7281 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8338 | 0.0000 | 0.0011 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7116 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0359 | 0.0114 | 0.0002 | 0.0315 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0225 | 0.0000 | 0.0011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1366 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.2409 | 31.2 | 3650 | 2.2224 | 0.0072 | 0.0860 | 0.0505 | 0.0 | 0.0449 | 0.3076 | 0.0002 | 0.0437 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7269 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8424 | 0.0000 | 0.0010 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6995 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0328 | 0.0119 | 0.0002 | 0.0281 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0078 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0224 | 0.0000 | 0.0010 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1350 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.4857 | 31.62 | 3700 | 2.2306 | 0.0075 | 0.0863 | 0.0525 | 0.0 | 0.0488 | 0.2940 | 0.0002 | 0.0471 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7261 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8368 | 0.0000 | 0.0012 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7218 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0354 | 0.0115 | 0.0002 | 0.0300 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0227 | 0.0000 | 0.0011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1371 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.5056 | 32.05 | 3750 | 2.2192 | 0.0075 | 0.0852 | 0.0540 | 0.0 | 0.0560 | 0.2740 | 0.0002 | 0.0424 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7282 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8345 | 0.0000 | 0.0012 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7061 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0400 | 0.0109 | 0.0002 | 0.0281 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0078 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0225 | 0.0000 | 0.0011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1356 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.4911 | 32.48 | 3800 | 2.2501 | 0.0074 | 0.0850 | 0.0518 | 0.0 | 0.0529 | 0.2857 | 0.0002 | 0.0483 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7249 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8575 | 0.0000 | 0.0010 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6644 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0380 | 0.0113 | 0.0002 | 0.0311 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 
0.0 | 0.0220 | 0.0000 | 0.0010 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1316 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.2452 | 32.91 | 3850 | 2.2030 | 0.0071 | 0.0870 | 0.0509 | 0.0 | 0.0424 | 0.3135 | 0.0002 | 0.0466 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7250 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8420 | 0.0000 | 0.0009 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7275 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0311 | 0.0121 | 0.0002 | 0.0294 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0080 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0226 | 0.0000 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1377 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.3314 | 33.33 | 3900 | 2.2650 | 0.0075 | 0.0851 | 0.0527 | 0.0 | 0.0539 | 0.2782 | 0.0001 | 0.0562 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7247 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8481 | 0.0000 | 0.0008 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6774 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0386 | 0.0111 | 0.0001 | 0.0345 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0080 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0224 | 0.0000 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1317 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.456 | 33.76 | 3950 | 2.2260 | 0.0075 | 0.0859 | 0.0526 | 0.0 | 0.0501 | 0.2881 | 0.0002 | 0.0473 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7281 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8343 | 0.0000 | 0.0013 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7135 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0362 | 0.0114 | 0.0002 | 0.0305 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0078 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0226 | 0.0000 | 0.0013 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1364 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.1795 | 34.19 | 4000 | 2.2383 | 0.0073 | 0.0854 | 0.0515 | 0.0 | 0.0501 | 0.2940 | 0.0002 | 0.0445 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7254 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8502 | 0.0000 | 0.0008 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6823 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0362 | 0.0115 | 0.0002 | 0.0293 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0222 | 0.0000 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1333 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.2701 | 34.62 | 4050 | 2.1953 | 0.0073 | 0.0858 | 0.0516 | 0.0 | 0.0493 | 0.3067 | 0.0001 | 0.0430 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7213 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8499 | 0.0 | 0.0010 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6893 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0355 | 0.0119 | 0.0001 | 0.0282 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0229 | 0.0 | 0.0010 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1312 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | | 2.5063 | 35.04 | 4100 | 2.2152 | 0.0074 | 0.0856 | 0.0531 | 0.0 | 0.0525 | 0.2859 | 0.0002 | 0.0434 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7272 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8357 | 0.0000 | 0.0011 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7079 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0376 | 0.0113 | 0.0002 | 0.0286 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0227 | 0.0000 | 0.0011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1349 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.3872 | 35.47 | 4150 | 2.2011 | 0.0072 | 0.0846 | 0.0519 | 0.0 | 0.0524 | 0.2894 | 0.0001 | 0.0414 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7295 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 
0.8320 | 0.0 | 0.0011 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6776 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0375 | 0.0114 | 0.0001 | 0.0278 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0078 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0228 | 0.0 | 0.0011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1304 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.4513 | 35.9 | 4200 | 2.2279 | 0.0074 | 0.0851 | 0.0519 | 0.0 | 0.0516 | 0.2789 | 0.0002 | 0.0485 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7288 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8479 | 0.0000 | 0.0010 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6822 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0371 | 0.0111 | 0.0002 | 0.0315 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0076 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0219 | 0.0000 | 0.0010 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1344 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.2309 | 36.32 | 4250 | 2.2145 | 0.0073 | 0.0852 | 0.0519 | 0.0 | 0.0514 | 0.2921 | 0.0002 | 0.0441 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7270 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8437 | 0.0000 | 0.0010 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6814 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0369 | 0.0115 | 0.0002 | 0.0292 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0224 | 0.0000 | 0.0010 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1322 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.225 | 36.75 | 4300 | 2.2065 | 0.0073 | 0.0853 | 0.0533 | 0.0 | 0.0542 | 0.2830 | 0.0002 | 0.0383 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7264 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8384 | 0.0000 | 0.0012 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7026 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0387 | 0.0112 | 0.0002 | 0.0262 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0080 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0227 | 0.0000 | 0.0011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1337 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.1982 | 37.18 | 4350 | 2.2199 | 0.0074 | 0.0854 | 0.0531 | 0.0 | 0.0537 | 0.2812 | 0.0002 | 0.0431 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7279 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8403 | 0.0000 | 0.0010 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6995 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0384 | 0.0112 | 0.0002 | 0.0288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0078 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0224 | 0.0000 | 0.0010 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1351 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.1423 | 37.61 | 4400 | 2.2118 | 0.0073 | 0.0850 | 0.0520 | 0.0 | 0.0521 | 0.2897 | 0.0002 | 0.0408 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7267 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8429 | 0.0000 | 0.0010 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6826 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0374 | 0.0114 | 0.0002 | 0.0276 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0080 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0226 | 0.0000 | 0.0010 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1315 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.3004 | 38.03 | 4450 | 2.2279 | 0.0075 | 0.0866 | 0.0528 | 0.0 | 0.0497 | 0.2876 | 0.0002 | 0.0478 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7201 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8572 | 0.0 | 0.0008 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7209 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0359 | 0.0114 | 0.0002 | 0.0307 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0083 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0223 | 0.0 | 0.0007 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1366 
| 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.3405 | 38.46 | 4500 | 2.2198 | 0.0074 | 0.0850 | 0.0519 | 0.0 | 0.0535 | 0.2824 | 0.0002 | 0.0513 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7249 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8620 | 0.0000 | 0.0011 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6592 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0383 | 0.0112 | 0.0002 | 0.0327 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0080 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0219 | 0.0000 | 0.0011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1312 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.4924 | 38.89 | 4550 | 2.2621 | 0.0076 | 0.0866 | 0.0536 | 0.0 | 0.0498 | 0.2829 | 0.0002 | 0.0517 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7265 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8296 | 0.0 | 0.0011 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7427 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0359 | 0.0112 | 0.0002 | 0.0325 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0230 | 0.0 | 0.0010 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1380 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.1744 | 39.32 | 4600 | 2.1896 | 0.0072 | 0.0852 | 0.0514 | 0.0 | 0.0507 | 0.2949 | 0.0002 | 0.0365 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7249 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8512 | 0.0000 | 0.0009 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6806 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0364 | 0.0116 | 0.0002 | 0.0253 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0081 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0224 | 0.0000 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1315 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | | 2.4663 | 39.74 | 4650 | 2.1850 | 0.0074 | 0.0858 | 0.0532 | 0.0 | 0.0521 | 0.2880 | 0.0002 | 0.0414 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7261 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.8347 | 0.0 | 0.0011 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7158 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0373 | 0.0114 | 0.0002 | 0.0279 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0081 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0229 | 0.0 | 0.0011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1347 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.0+cu121 - Datasets 2.18.0 - Tokenizers 0.15.0
[ "unlabeled", "flat-road", "flat-sidewalk", "flat-crosswalk", "flat-cyclinglane", "flat-parkingdriveway", "flat-railtrack", "flat-curb", "human-person", "human-rider", "vehicle-car", "vehicle-truck", "vehicle-bus", "vehicle-tramtrain", "vehicle-motorcycle", "vehicle-bicycle", "vehicle-caravan", "vehicle-cartrailer", "construction-building", "construction-door", "construction-wall", "construction-fenceguardrail", "construction-bridge", "construction-tunnel", "construction-stairs", "object-pole", "object-trafficsign", "object-trafficlight", "nature-vegetation", "nature-terrain", "sky", "void-ground", "void-dynamic", "void-static", "void-unclear" ]
unreal-hug/segformer-b2-seed-67-v1
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# segformer-b2-seed-67-v1

This model is a fine-tuned version of [nvidia/mit-b3](https://huggingface.co/nvidia/mit-b3) on the unreal-hug/REAL_DATASET_SEG_331 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4746
- Mean Iou: 0.2841
- Mean Accuracy: 0.3507
- Overall Accuracy: 0.6084
- Accuracy Unlabeled: nan
- Accuracy Lv: 0.7915
- Accuracy Rv: 0.4646
- Accuracy Ra: 0.4834
- Accuracy La: 0.6858
- Accuracy Vs: 0.0
- Accuracy As: 0.0
- Accuracy Mk: 0.0
- Accuracy Tk: nan
- Accuracy Asd: 0.3160
- Accuracy Vsd: 0.2747
- Accuracy Ak: 0.4910
- Iou Unlabeled: 0.0
- Iou Lv: 0.7252
- Iou Rv: 0.4232
- Iou Ra: 0.4411
- Iou La: 0.5427
- Iou Vs: 0.0
- Iou As: 0.0
- Iou Mk: 0.0
- Iou Tk: nan
- Iou Asd: 0.2832
- Iou Vsd: 0.2342
- Iou Ak: 0.4759

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.05
- training_steps: 1000

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Lv | Accuracy Rv | Accuracy Ra | Accuracy La | Accuracy Vs | Accuracy As | Accuracy Mk | Accuracy Tk | Accuracy Asd | Accuracy Vsd | Accuracy Ak | Iou Unlabeled | Iou Lv | Iou Rv | Iou Ra | Iou La | Iou Vs | Iou As | Iou Mk | Iou Tk | Iou Asd | Iou Vsd | Iou Ak |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:------------:|:------------:|:-----------:|:-------------:|:------:|:------:|:------:|:------:|:------:|:------:|:------:|:------:|:-------:|:-------:|:------:|
| 1.2449 | 5.88 | 100 | 1.1508 | 0.1187 | 0.1954 | 0.4575 | nan | 0.8193 | 0.0533 | 0.1371 | 0.5424 | 0.0 | 0.0 | 0.0 | nan | 0.0171 | 0.0155 | 0.3697 | 0.0 | 0.5501 | 0.0518 | 0.1253 | 0.3509 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0170 | 0.0148 | 0.3145 |
| 0.7118 | 11.76 | 200 | 0.7012 | 0.1534 | 0.2007 | 0.4466 | nan | 0.7352 | 0.1138 | 0.2300 | 0.5548 | 0.0 | 0.0 | 0.0 | nan | 0.0168 | 0.0284 | 0.3280 | 0.0 | 0.6079 | 0.1081 | 0.2084 | 0.4120 | 0.0 | 0.0 | 0.0 | nan | 0.0167 | 0.0276 | 0.3064 |
| 0.5567 | 17.65 | 300 | 0.5686 | 0.1896 | 0.2372 | 0.4810 | nan | 0.6994 | 0.2332 | 0.3522 | 0.5913 | 0.0 | 0.0 | 0.0 | nan | 0.0389 | 0.0765 | 0.3806 | 0.0 | 0.6382 | 0.2142 | 0.3023 | 0.4563 | 0.0 | 0.0 | 0.0 | nan | 0.0386 | 0.0714 | 0.3649 |
| 0.5054 | 23.53 | 400 | 0.5441 | 0.2473 | 0.3075 | 0.5803 | nan | 0.7991 | 0.4241 | 0.4885 | 0.5970 | 0.0 | 0.0 | 0.0 | nan | 0.1535 | 0.1388 | 0.4745 | 0.0 | 0.7215 | 0.3725 | 0.4107 | 0.4908 | 0.0 | 0.0 | 0.0 | nan | 0.1486 | 0.1228 | 0.4537 |
| 0.4344 | 29.41 | 500 | 0.5188 | 0.2706 | 0.3382 | 0.5967 | nan | 0.7810 | 0.4337 | 0.4668 | 0.7031 | 0.0 | 0.0 | 0.0 | nan | 0.2612 | 0.2644 | 0.4721 | 0.0 | 0.7121 | 0.3916 | 0.4164 | 0.5372 | 0.0 | 0.0 | 0.0 | nan | 0.2398 | 0.2236 | 0.4558 |
| 0.3796 | 35.29 | 600 | 0.5032 | 0.2669 | 0.3315 | 0.5911 | nan | 0.7953 | 0.4343 | 0.4050 | 0.6920 | 0.0 | 0.0 | 0.0 | nan | 0.2841 | 0.2321 | 0.4717 | 0.0 | 0.7196 | 0.3965 | 0.3778 | 0.5273 | 0.0 | 0.0 | 0.0 | nan | 0.2589 | 0.1996 | 0.4568 |
| 0.3888 | 41.18 | 700 | 0.4801 | 0.2798 | 0.3461 | 0.6037 | nan | 0.7862 | 0.4532 | 0.4667 | 0.6983 | 0.0 | 0.0 | 0.0 | nan | 0.3065 | 0.2590 | 0.4908 | 0.0 | 0.7192 | 0.4127 | 0.4292 | 0.5444 | 0.0 | 0.0 | 0.0 | nan | 0.2756 | 0.2216 | 0.4746 |
| 0.3467 | 47.06 | 800 | 0.4753 | 0.2822 | 0.3478 | 0.6061 | nan | 0.7919 | 0.4585 | 0.4857 | 0.6814 | 0.0 | 0.0 | 0.0 | nan | 0.3131 | 0.2640 | 0.4831 | 0.0 | 0.7259 | 0.4196 | 0.4424 | 0.5402 | 0.0 | 0.0 | 0.0 | nan | 0.2813 | 0.2262 | 0.4685 |
| 0.3757 | 52.94 | 900 | 0.4746 | 0.2841 | 0.3507 | 0.6084 | nan | 0.7915 | 0.4646 | 0.4834 | 0.6858 | 0.0 | 0.0 | 0.0 | nan | 0.3160 | 0.2747 | 0.4910 | 0.0 | 0.7252 | 0.4232 | 0.4411 | 0.5427 | 0.0 | 0.0 | 0.0 | nan | 0.2832 | 0.2342 | 0.4759 |
| 0.3616 | 58.82 | 1000 | 0.4788 | 0.2860 | 0.3537 | 0.6116 | nan | 0.7931 | 0.4687 | 0.4837 | 0.6922 | 0.0 | 0.0 | 0.0 | nan | 0.3193 | 0.2830 | 0.4970 | 0.0 | 0.7262 | 0.4259 | 0.4411 | 0.5449 | 0.0 | 0.0 | 0.0 | nan | 0.2856 | 0.2407 | 0.4817 |

### Framework versions

- Transformers 4.37.2
- Pytorch 2.1.2+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
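The hyperparameters listed above map almost one-to-one onto `transformers.TrainingArguments`. A minimal sketch under that assumption (the output directory and the steps-based eval cadence are my own placeholders inferred from the results table, and all dataset/metric wiring is omitted):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters reported in this card; the Adam betas and
# epsilon match the Trainer defaults, so they need no explicit arguments.
training_args = TrainingArguments(
    output_dir="segformer-b2-seed-67-v1",  # hypothetical output path
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="cosine",
    warmup_ratio=0.05,            # lr_scheduler_warmup_ratio: 0.05
    max_steps=1000,               # training_steps: 1000
    evaluation_strategy="steps",
    eval_steps=100,               # the results table logs every 100 steps
)
```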
[ "unlabeled", "lv", "rv", "ra", "la", "vs", "as", "mk", "tk", "asd", "vsd", "ak" ]
imessam/segformer-b0-finetuned-agriculture-3
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# segformer-b0-finetuned-agriculture-3

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unspecified dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20

### Framework versions

- Transformers 4.39.1
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
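Since the card carries no usage snippet, here is a minimal inference sketch using the `image-segmentation` pipeline, assuming the checkpoint ships its image processor config; the image path is a hypothetical placeholder:

```python
from PIL import Image
from transformers import pipeline

segmenter = pipeline(
    "image-segmentation",
    model="imessam/segformer-b0-finetuned-agriculture-3",
)

# For semantic-segmentation checkpoints the pipeline returns one entry per
# class present in the prediction, each with a binary PIL mask.
results = segmenter(Image.open("field.jpg"))  # hypothetical aerial image
for r in results:
    print(r["label"], r["mask"].size)
```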
[ "unlabeled", "nutrient_deficiency", "planter_skip", "water", "waterway", "weed_cluster" ]
yuyijiong/segformer-b5-remote-sensing-quality
[github](https://github.com/yuyijiong/remote_sense_image_quality_inspection) [paper](https://arxiv.org/abs/2307.11965)

This model uses SegFormer to perform quality inspection on remote sensing images. Through semantic segmentation it marks out the following region types, where "背景" (background) means the region has no quality problem and each of the other six indicates a specific quality defect:

"0": "背景" (background), "1": "云" (cloud), "2": "阴影" (shadow), "3": "拉花" (smearing), "4": "模糊" (blur), "5": "光谱溢出" (spectral overflow), "6": "扭曲" (distortion)
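A usage sketch on top of the description above: run the segmenter and report what fraction of a tile each quality class covers, assuming the repo's config carries the id2label mapping listed above (the input filename is a placeholder):

```python
import torch
from PIL import Image
from transformers import AutoModelForSemanticSegmentation, SegformerImageProcessor

repo = "yuyijiong/segformer-b5-remote-sensing-quality"
processor = SegformerImageProcessor.from_pretrained(repo)
model = AutoModelForSemanticSegmentation.from_pretrained(repo)

image = Image.open("tile.png").convert("RGB")  # hypothetical remote sensing tile
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

pred = logits.argmax(dim=1)[0]
# Fraction of the (downsampled) prediction covered by each quality class.
counts = torch.bincount(pred.flatten(), minlength=model.config.num_labels)
for idx, n in enumerate(counts.tolist()):
    print(model.config.id2label[idx], f"{n / pred.numel():.1%}")
```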
[ "背景", "云", "阴影", "拉花", "模糊", "光谱溢出", "扭曲" ]
vigneshgs7/segformer-b2-p142-cvat-2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b2-p142-cvat-2 This model is a fine-tuned version of [nvidia/mit-b2](https://huggingface.co/nvidia/mit-b2) on the vigneshgs7/segformer_open_cv_RGB_L_0_1 dataset. It achieves the following results on the evaluation set: - Loss: 0.0222 - Mean Iou: 0.4959 - Mean Accuracy: 0.9919 - Overall Accuracy: 0.9919 - Accuracy Background: nan - Accuracy Object: 0.9919 - Iou Background: 0.0 - Iou Object: 0.9919 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Object | Iou Background | Iou Object | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:---------------:|:--------------:|:----------:| | 0.4097 | 0.06 | 20 | 0.4634 | 0.4794 | 0.9589 | 0.9589 | nan | 0.9589 | 0.0 | 0.9589 | | 0.4192 | 0.11 | 40 | 0.2595 | 0.4800 | 0.9601 | 0.9601 | nan | 0.9601 | 0.0 | 0.9601 | | 0.4005 | 0.17 | 60 | 0.1546 | 0.4720 | 0.9441 | 0.9441 | nan | 0.9441 | 0.0 | 0.9441 | | 0.1912 | 0.23 | 80 | 0.1395 | 0.4780 | 0.9560 | 0.9560 | nan | 0.9560 | 0.0 | 0.9560 | | 0.1286 | 0.29 | 100 | 0.1182 | 0.4775 | 0.9551 | 0.9551 | nan | 0.9551 | 0.0 | 0.9551 | | 0.1012 | 0.34 | 120 | 0.0902 | 0.4738 | 0.9477 | 0.9477 | nan | 0.9477 | 0.0 | 0.9477 | | 0.0798 | 0.4 | 140 | 0.0777 | 0.4812 | 0.9624 | 0.9624 | nan | 0.9624 | 0.0 | 0.9624 | | 0.0593 | 0.46 | 160 | 0.0716 | 0.4849 | 0.9697 | 0.9697 | nan | 0.9697 | 0.0 | 0.9697 | | 0.107 | 0.52 | 180 | 0.0675 | 0.4900 | 0.9800 | 0.9800 | nan | 0.9800 | 0.0 | 0.9800 | | 0.0521 | 0.57 | 200 | 0.0553 | 0.4811 | 0.9621 | 0.9621 | nan | 0.9621 | 0.0 | 0.9621 | | 0.045 | 0.63 | 220 | 0.0527 | 0.4915 | 0.9829 | 0.9829 | nan | 0.9829 | 0.0 | 0.9829 | | 0.0447 | 0.69 | 240 | 0.0481 | 0.4785 | 0.9571 | 0.9571 | nan | 0.9571 | 0.0 | 0.9571 | | 0.0381 | 0.74 | 260 | 0.0405 | 0.4878 | 0.9755 | 0.9755 | nan | 0.9755 | 0.0 | 0.9755 | | 0.0392 | 0.8 | 280 | 0.0409 | 0.4861 | 0.9723 | 0.9723 | nan | 0.9723 | 0.0 | 0.9723 | | 0.0364 | 0.86 | 300 | 0.0377 | 0.4878 | 0.9755 | 0.9755 | nan | 0.9755 | 0.0 | 0.9755 | | 0.0481 | 0.92 | 320 | 0.0383 | 0.4920 | 0.9840 | 0.9840 | nan | 0.9840 | 0.0 | 0.9840 | | 0.0424 | 0.97 | 340 | 0.0355 | 0.4909 | 0.9818 | 0.9818 | nan | 0.9818 | 0.0 | 0.9818 | | 0.0371 | 1.03 | 360 | 0.0358 | 0.4866 | 0.9732 | 0.9732 | nan | 0.9732 | 0.0 | 0.9732 | | 0.0224 | 1.09 | 380 | 0.0355 | 0.4897 | 0.9794 | 0.9794 | nan | 0.9794 | 0.0 | 0.9794 | | 0.0358 | 1.15 | 400 | 0.0359 | 0.4885 | 0.9769 | 0.9769 | nan | 0.9769 | 0.0 | 0.9769 | | 0.0235 | 1.2 | 420 | 0.0340 | 0.4877 | 0.9753 | 0.9753 | nan | 0.9753 | 0.0 | 0.9753 | | 0.1746 | 1.26 | 440 | 0.0335 | 0.4927 | 0.9854 | 0.9854 | nan | 0.9854 | 0.0 | 0.9854 | | 0.0253 | 1.32 | 460 | 0.0321 | 0.4889 | 0.9778 | 0.9778 | nan | 0.9778 | 0.0 | 0.9778 | | 0.0247 | 1.38 | 480 | 0.0299 | 0.4907 | 0.9814 | 0.9814 | nan | 0.9814 | 0.0 | 0.9814 | | 
0.0351 | 1.43 | 500 | 0.0303 | 0.4907 | 0.9813 | 0.9813 | nan | 0.9813 | 0.0 | 0.9813 | | 0.0203 | 1.49 | 520 | 0.0300 | 0.4906 | 0.9812 | 0.9812 | nan | 0.9812 | 0.0 | 0.9812 | | 0.0254 | 1.55 | 540 | 0.0327 | 0.4859 | 0.9718 | 0.9718 | nan | 0.9718 | 0.0 | 0.9718 | | 0.0272 | 1.6 | 560 | 0.0293 | 0.4908 | 0.9816 | 0.9816 | nan | 0.9816 | 0.0 | 0.9816 | | 0.0295 | 1.66 | 580 | 0.0284 | 0.4908 | 0.9816 | 0.9816 | nan | 0.9816 | 0.0 | 0.9816 | | 0.025 | 1.72 | 600 | 0.0286 | 0.4890 | 0.9779 | 0.9779 | nan | 0.9779 | 0.0 | 0.9779 | | 0.0225 | 1.78 | 620 | 0.0283 | 0.4899 | 0.9799 | 0.9799 | nan | 0.9799 | 0.0 | 0.9799 | | 0.1922 | 1.83 | 640 | 0.0264 | 0.4917 | 0.9834 | 0.9834 | nan | 0.9834 | 0.0 | 0.9834 | | 0.0349 | 1.89 | 660 | 0.0265 | 0.4935 | 0.9871 | 0.9871 | nan | 0.9871 | 0.0 | 0.9871 | | 0.023 | 1.95 | 680 | 0.0281 | 0.4887 | 0.9774 | 0.9774 | nan | 0.9774 | 0.0 | 0.9774 | | 0.024 | 2.01 | 700 | 0.0262 | 0.4936 | 0.9872 | 0.9872 | nan | 0.9872 | 0.0 | 0.9872 | | 0.0278 | 2.06 | 720 | 0.0261 | 0.4923 | 0.9846 | 0.9846 | nan | 0.9846 | 0.0 | 0.9846 | | 0.0276 | 2.12 | 740 | 0.0263 | 0.4923 | 0.9845 | 0.9845 | nan | 0.9845 | 0.0 | 0.9845 | | 0.0208 | 2.18 | 760 | 0.0262 | 0.4903 | 0.9806 | 0.9806 | nan | 0.9806 | 0.0 | 0.9806 | | 0.0206 | 2.23 | 780 | 0.0258 | 0.4896 | 0.9792 | 0.9792 | nan | 0.9792 | 0.0 | 0.9792 | | 0.017 | 2.29 | 800 | 0.0265 | 0.4887 | 0.9775 | 0.9775 | nan | 0.9775 | 0.0 | 0.9775 | | 0.1898 | 2.35 | 820 | 0.0260 | 0.4902 | 0.9803 | 0.9803 | nan | 0.9803 | 0.0 | 0.9803 | | 0.0167 | 2.41 | 840 | 0.0256 | 0.4942 | 0.9883 | 0.9883 | nan | 0.9883 | 0.0 | 0.9883 | | 0.0212 | 2.46 | 860 | 0.0263 | 0.4892 | 0.9784 | 0.9784 | nan | 0.9784 | 0.0 | 0.9784 | | 0.0182 | 2.52 | 880 | 0.0252 | 0.4900 | 0.9800 | 0.9800 | nan | 0.9800 | 0.0 | 0.9800 | | 0.0218 | 2.58 | 900 | 0.0241 | 0.4918 | 0.9836 | 0.9836 | nan | 0.9836 | 0.0 | 0.9836 | | 0.0197 | 2.64 | 920 | 0.0249 | 0.4895 | 0.9791 | 0.9791 | nan | 0.9791 | 0.0 | 0.9791 | | 0.0254 | 2.69 | 940 | 0.0241 | 0.4910 | 0.9819 | 0.9819 | nan | 0.9819 | 0.0 | 0.9819 | | 0.0276 | 2.75 | 960 | 0.0249 | 0.4908 | 0.9816 | 0.9816 | nan | 0.9816 | 0.0 | 0.9816 | | 0.0167 | 2.81 | 980 | 0.0241 | 0.4929 | 0.9858 | 0.9858 | nan | 0.9858 | 0.0 | 0.9858 | | 0.0173 | 2.87 | 1000 | 0.0241 | 0.4903 | 0.9806 | 0.9806 | nan | 0.9806 | 0.0 | 0.9806 | | 0.081 | 2.92 | 1020 | 0.0251 | 0.4892 | 0.9783 | 0.9783 | nan | 0.9783 | 0.0 | 0.9783 | | 0.0273 | 2.98 | 1040 | 0.0230 | 0.4921 | 0.9842 | 0.9842 | nan | 0.9842 | 0.0 | 0.9842 | | 0.0384 | 3.04 | 1060 | 0.0232 | 0.4941 | 0.9881 | 0.9881 | nan | 0.9881 | 0.0 | 0.9881 | | 0.0229 | 3.09 | 1080 | 0.0235 | 0.4932 | 0.9863 | 0.9863 | nan | 0.9863 | 0.0 | 0.9863 | | 0.0329 | 3.15 | 1100 | 0.0231 | 0.4941 | 0.9882 | 0.9882 | nan | 0.9882 | 0.0 | 0.9882 | | 0.0149 | 3.21 | 1120 | 0.0232 | 0.4942 | 0.9883 | 0.9883 | nan | 0.9883 | 0.0 | 0.9883 | | 0.0163 | 3.27 | 1140 | 0.0237 | 0.4906 | 0.9813 | 0.9813 | nan | 0.9813 | 0.0 | 0.9813 | | 0.0144 | 3.32 | 1160 | 0.0237 | 0.4903 | 0.9807 | 0.9807 | nan | 0.9807 | 0.0 | 0.9807 | | 0.0196 | 3.38 | 1180 | 0.0225 | 0.4926 | 0.9851 | 0.9851 | nan | 0.9851 | 0.0 | 0.9851 | | 0.0194 | 3.44 | 1200 | 0.0224 | 0.4921 | 0.9841 | 0.9841 | nan | 0.9841 | 0.0 | 0.9841 | | 0.0182 | 3.5 | 1220 | 0.0224 | 0.4916 | 0.9832 | 0.9832 | nan | 0.9832 | 0.0 | 0.9832 | | 0.0178 | 3.55 | 1240 | 0.0230 | 0.4954 | 0.9909 | 0.9909 | nan | 0.9909 | 0.0 | 0.9909 | | 0.0291 | 3.61 | 1260 | 0.0221 | 0.4920 | 0.9840 | 0.9840 | nan | 0.9840 | 0.0 | 0.9840 | | 0.0167 | 3.67 | 1280 | 0.0219 | 
0.4934 | 0.9868 | 0.9868 | nan | 0.9868 | 0.0 | 0.9868 | | 0.0142 | 3.72 | 1300 | 0.0216 | 0.4943 | 0.9886 | 0.9886 | nan | 0.9886 | 0.0 | 0.9886 | | 0.0183 | 3.78 | 1320 | 0.0217 | 0.4927 | 0.9855 | 0.9855 | nan | 0.9855 | 0.0 | 0.9855 | | 0.0156 | 3.84 | 1340 | 0.0216 | 0.4946 | 0.9892 | 0.9892 | nan | 0.9892 | 0.0 | 0.9892 | | 0.0438 | 3.9 | 1360 | 0.0215 | 0.4932 | 0.9863 | 0.9863 | nan | 0.9863 | 0.0 | 0.9863 | | 0.0265 | 3.95 | 1380 | 0.0217 | 0.4952 | 0.9904 | 0.9904 | nan | 0.9904 | 0.0 | 0.9904 | | 0.0481 | 4.01 | 1400 | 0.0231 | 0.4943 | 0.9885 | 0.9885 | nan | 0.9885 | 0.0 | 0.9885 | | 0.0163 | 4.07 | 1420 | 0.0227 | 0.4948 | 0.9896 | 0.9896 | nan | 0.9896 | 0.0 | 0.9896 | | 0.0399 | 4.13 | 1440 | 0.0210 | 0.4941 | 0.9881 | 0.9881 | nan | 0.9881 | 0.0 | 0.9881 | | 0.0178 | 4.18 | 1460 | 0.0221 | 0.4947 | 0.9894 | 0.9894 | nan | 0.9894 | 0.0 | 0.9894 | | 0.0159 | 4.24 | 1480 | 0.0220 | 0.4940 | 0.9880 | 0.9880 | nan | 0.9880 | 0.0 | 0.9880 | | 0.0159 | 4.3 | 1500 | 0.0212 | 0.4952 | 0.9903 | 0.9903 | nan | 0.9903 | 0.0 | 0.9903 | | 0.0241 | 4.36 | 1520 | 0.0214 | 0.4945 | 0.9890 | 0.9890 | nan | 0.9890 | 0.0 | 0.9890 | | 0.0159 | 4.41 | 1540 | 0.0215 | 0.4941 | 0.9882 | 0.9882 | nan | 0.9882 | 0.0 | 0.9882 | | 0.0202 | 4.47 | 1560 | 0.0233 | 0.4953 | 0.9907 | 0.9907 | nan | 0.9907 | 0.0 | 0.9907 | | 0.037 | 4.53 | 1580 | 0.0225 | 0.4950 | 0.9900 | 0.9900 | nan | 0.9900 | 0.0 | 0.9900 | | 0.0203 | 4.58 | 1600 | 0.0229 | 0.4944 | 0.9889 | 0.9889 | nan | 0.9889 | 0.0 | 0.9889 | | 0.0244 | 4.64 | 1620 | 0.0210 | 0.4948 | 0.9896 | 0.9896 | nan | 0.9896 | 0.0 | 0.9896 | | 0.0202 | 4.7 | 1640 | 0.0209 | 0.4954 | 0.9909 | 0.9909 | nan | 0.9909 | 0.0 | 0.9909 | | 0.0137 | 4.76 | 1660 | 0.0211 | 0.4940 | 0.9879 | 0.9879 | nan | 0.9879 | 0.0 | 0.9879 | | 0.0152 | 4.81 | 1680 | 0.0210 | 0.4934 | 0.9868 | 0.9868 | nan | 0.9868 | 0.0 | 0.9868 | | 0.0159 | 4.87 | 1700 | 0.0206 | 0.4955 | 0.9910 | 0.9910 | nan | 0.9910 | 0.0 | 0.9910 | | 0.0202 | 4.93 | 1720 | 0.0207 | 0.4930 | 0.9861 | 0.9861 | nan | 0.9861 | 0.0 | 0.9861 | | 0.0453 | 4.99 | 1740 | 0.0211 | 0.4929 | 0.9859 | 0.9859 | nan | 0.9859 | 0.0 | 0.9859 | | 0.0203 | 5.04 | 1760 | 0.0207 | 0.4952 | 0.9904 | 0.9904 | nan | 0.9904 | 0.0 | 0.9904 | | 0.014 | 5.1 | 1780 | 0.0207 | 0.4957 | 0.9913 | 0.9913 | nan | 0.9913 | 0.0 | 0.9913 | | 0.0458 | 5.16 | 1800 | 0.0217 | 0.4959 | 0.9918 | 0.9918 | nan | 0.9918 | 0.0 | 0.9918 | | 0.012 | 5.21 | 1820 | 0.0218 | 0.4945 | 0.9889 | 0.9889 | nan | 0.9889 | 0.0 | 0.9889 | | 0.0444 | 5.27 | 1840 | 0.0227 | 0.4949 | 0.9897 | 0.9897 | nan | 0.9897 | 0.0 | 0.9897 | | 0.0791 | 5.33 | 1860 | 0.0226 | 0.4942 | 0.9884 | 0.9884 | nan | 0.9884 | 0.0 | 0.9884 | | 0.0349 | 5.39 | 1880 | 0.0222 | 0.4932 | 0.9865 | 0.9865 | nan | 0.9865 | 0.0 | 0.9865 | | 0.0175 | 5.44 | 1900 | 0.0225 | 0.4943 | 0.9885 | 0.9885 | nan | 0.9885 | 0.0 | 0.9885 | | 0.0191 | 5.5 | 1920 | 0.0222 | 0.4939 | 0.9878 | 0.9878 | nan | 0.9878 | 0.0 | 0.9878 | | 0.0219 | 5.56 | 1940 | 0.0217 | 0.4950 | 0.9900 | 0.9900 | nan | 0.9900 | 0.0 | 0.9900 | | 0.0251 | 5.62 | 1960 | 0.0225 | 0.4947 | 0.9895 | 0.9895 | nan | 0.9895 | 0.0 | 0.9895 | | 0.0317 | 5.67 | 1980 | 0.0232 | 0.4943 | 0.9887 | 0.9887 | nan | 0.9887 | 0.0 | 0.9887 | | 0.0177 | 5.73 | 2000 | 0.0232 | 0.4946 | 0.9892 | 0.9892 | nan | 0.9892 | 0.0 | 0.9892 | | 0.0172 | 5.79 | 2020 | 0.0205 | 0.4939 | 0.9879 | 0.9879 | nan | 0.9879 | 0.0 | 0.9879 | | 0.028 | 5.85 | 2040 | 0.0224 | 0.4968 | 0.9936 | 0.9936 | nan | 0.9936 | 0.0 | 0.9936 | | 0.0144 | 5.9 | 2060 | 0.0202 | 0.4939 | 0.9877 
| 0.9877 | nan | 0.9877 | 0.0 | 0.9877 | | 0.0143 | 5.96 | 2080 | 0.0203 | 0.4953 | 0.9906 | 0.9906 | nan | 0.9906 | 0.0 | 0.9906 | | 0.0161 | 6.02 | 2100 | 0.0199 | 0.4945 | 0.9890 | 0.9890 | nan | 0.9890 | 0.0 | 0.9890 | | 0.014 | 6.07 | 2120 | 0.0202 | 0.4953 | 0.9905 | 0.9905 | nan | 0.9905 | 0.0 | 0.9905 | | 0.0299 | 6.13 | 2140 | 0.0203 | 0.4932 | 0.9863 | 0.9863 | nan | 0.9863 | 0.0 | 0.9863 | | 0.0152 | 6.19 | 2160 | 0.0201 | 0.4954 | 0.9908 | 0.9908 | nan | 0.9908 | 0.0 | 0.9908 | | 0.0159 | 6.25 | 2180 | 0.0200 | 0.4956 | 0.9913 | 0.9913 | nan | 0.9913 | 0.0 | 0.9913 | | 0.0135 | 6.3 | 2200 | 0.0214 | 0.4960 | 0.9920 | 0.9920 | nan | 0.9920 | 0.0 | 0.9920 | | 0.0122 | 6.36 | 2220 | 0.0211 | 0.4939 | 0.9879 | 0.9879 | nan | 0.9879 | 0.0 | 0.9879 | | 0.0198 | 6.42 | 2240 | 0.0203 | 0.4955 | 0.9911 | 0.9911 | nan | 0.9911 | 0.0 | 0.9911 | | 0.0205 | 6.48 | 2260 | 0.0207 | 0.4948 | 0.9897 | 0.9897 | nan | 0.9897 | 0.0 | 0.9897 | | 0.0144 | 6.53 | 2280 | 0.0205 | 0.4947 | 0.9893 | 0.9893 | nan | 0.9893 | 0.0 | 0.9893 | | 0.0138 | 6.59 | 2300 | 0.0207 | 0.4956 | 0.9912 | 0.9912 | nan | 0.9912 | 0.0 | 0.9912 | | 0.0228 | 6.65 | 2320 | 0.0224 | 0.4953 | 0.9906 | 0.9906 | nan | 0.9906 | 0.0 | 0.9906 | | 0.0126 | 6.7 | 2340 | 0.0206 | 0.4949 | 0.9899 | 0.9899 | nan | 0.9899 | 0.0 | 0.9899 | | 0.0134 | 6.76 | 2360 | 0.0208 | 0.4950 | 0.9900 | 0.9900 | nan | 0.9900 | 0.0 | 0.9900 | | 0.0105 | 6.82 | 2380 | 0.0229 | 0.4954 | 0.9909 | 0.9909 | nan | 0.9909 | 0.0 | 0.9909 | | 0.0407 | 6.88 | 2400 | 0.0219 | 0.4952 | 0.9905 | 0.9905 | nan | 0.9905 | 0.0 | 0.9905 | | 0.0148 | 6.93 | 2420 | 0.0212 | 0.4948 | 0.9897 | 0.9897 | nan | 0.9897 | 0.0 | 0.9897 | | 0.011 | 6.99 | 2440 | 0.0216 | 0.4955 | 0.9909 | 0.9909 | nan | 0.9909 | 0.0 | 0.9909 | | 0.0149 | 7.05 | 2460 | 0.0221 | 0.4948 | 0.9895 | 0.9895 | nan | 0.9895 | 0.0 | 0.9895 | | 0.0312 | 7.11 | 2480 | 0.0243 | 0.4956 | 0.9912 | 0.9912 | nan | 0.9912 | 0.0 | 0.9912 | | 0.0146 | 7.16 | 2500 | 0.0236 | 0.4963 | 0.9927 | 0.9927 | nan | 0.9927 | 0.0 | 0.9927 | | 0.0132 | 7.22 | 2520 | 0.0221 | 0.4954 | 0.9908 | 0.9908 | nan | 0.9908 | 0.0 | 0.9908 | | 0.0314 | 7.28 | 2540 | 0.0214 | 0.4939 | 0.9878 | 0.9878 | nan | 0.9878 | 0.0 | 0.9878 | | 0.0177 | 7.34 | 2560 | 0.0221 | 0.4951 | 0.9903 | 0.9903 | nan | 0.9903 | 0.0 | 0.9903 | | 0.0213 | 7.39 | 2580 | 0.0223 | 0.4956 | 0.9912 | 0.9912 | nan | 0.9912 | 0.0 | 0.9912 | | 0.0135 | 7.45 | 2600 | 0.0212 | 0.4953 | 0.9906 | 0.9906 | nan | 0.9906 | 0.0 | 0.9906 | | 0.0361 | 7.51 | 2620 | 0.0223 | 0.4962 | 0.9924 | 0.9924 | nan | 0.9924 | 0.0 | 0.9924 | | 0.0457 | 7.56 | 2640 | 0.0221 | 0.4957 | 0.9914 | 0.9914 | nan | 0.9914 | 0.0 | 0.9914 | | 0.0191 | 7.62 | 2660 | 0.0238 | 0.4960 | 0.9919 | 0.9919 | nan | 0.9919 | 0.0 | 0.9919 | | 0.0141 | 7.68 | 2680 | 0.0222 | 0.4951 | 0.9902 | 0.9902 | nan | 0.9902 | 0.0 | 0.9902 | | 0.012 | 7.74 | 2700 | 0.0232 | 0.4959 | 0.9918 | 0.9918 | nan | 0.9918 | 0.0 | 0.9918 | | 0.0134 | 7.79 | 2720 | 0.0226 | 0.4952 | 0.9904 | 0.9904 | nan | 0.9904 | 0.0 | 0.9904 | | 0.0174 | 7.85 | 2740 | 0.0226 | 0.4957 | 0.9913 | 0.9913 | nan | 0.9913 | 0.0 | 0.9913 | | 0.0163 | 7.91 | 2760 | 0.0215 | 0.4948 | 0.9895 | 0.9895 | nan | 0.9895 | 0.0 | 0.9895 | | 0.0159 | 7.97 | 2780 | 0.0213 | 0.4960 | 0.9920 | 0.9920 | nan | 0.9920 | 0.0 | 0.9920 | | 0.0122 | 8.02 | 2800 | 0.0206 | 0.4950 | 0.9900 | 0.9900 | nan | 0.9900 | 0.0 | 0.9900 | | 0.0272 | 8.08 | 2820 | 0.0207 | 0.4947 | 0.9893 | 0.9893 | nan | 0.9893 | 0.0 | 0.9893 | | 0.0178 | 8.14 | 2840 | 0.0214 | 0.4953 | 0.9907 | 0.9907 | 
nan | 0.9907 | 0.0 | 0.9907 | | 0.1188 | 8.19 | 2860 | 0.0211 | 0.4946 | 0.9892 | 0.9892 | nan | 0.9892 | 0.0 | 0.9892 | | 0.0128 | 8.25 | 2880 | 0.0222 | 0.4962 | 0.9924 | 0.9924 | nan | 0.9924 | 0.0 | 0.9924 | | 0.0171 | 8.31 | 2900 | 0.0222 | 0.4955 | 0.9909 | 0.9909 | nan | 0.9909 | 0.0 | 0.9909 | | 0.0522 | 8.37 | 2920 | 0.0227 | 0.4959 | 0.9918 | 0.9918 | nan | 0.9918 | 0.0 | 0.9918 | | 0.0142 | 8.42 | 2940 | 0.0237 | 0.4960 | 0.9920 | 0.9920 | nan | 0.9920 | 0.0 | 0.9920 | | 0.0422 | 8.48 | 2960 | 0.0234 | 0.4950 | 0.9901 | 0.9901 | nan | 0.9901 | 0.0 | 0.9901 | | 0.0362 | 8.54 | 2980 | 0.0226 | 0.4954 | 0.9908 | 0.9908 | nan | 0.9908 | 0.0 | 0.9908 | | 0.0187 | 8.6 | 3000 | 0.0220 | 0.4952 | 0.9903 | 0.9903 | nan | 0.9903 | 0.0 | 0.9903 | | 0.0154 | 8.65 | 3020 | 0.0216 | 0.4948 | 0.9896 | 0.9896 | nan | 0.9896 | 0.0 | 0.9896 | | 0.0387 | 8.71 | 3040 | 0.0219 | 0.4956 | 0.9912 | 0.9912 | nan | 0.9912 | 0.0 | 0.9912 | | 0.038 | 8.77 | 3060 | 0.0214 | 0.4948 | 0.9896 | 0.9896 | nan | 0.9896 | 0.0 | 0.9896 | | 0.0145 | 8.83 | 3080 | 0.0213 | 0.4955 | 0.9910 | 0.9910 | nan | 0.9910 | 0.0 | 0.9910 | | 0.0129 | 8.88 | 3100 | 0.0210 | 0.4953 | 0.9906 | 0.9906 | nan | 0.9906 | 0.0 | 0.9906 | | 0.0129 | 8.94 | 3120 | 0.0213 | 0.4953 | 0.9907 | 0.9907 | nan | 0.9907 | 0.0 | 0.9907 | | 0.0148 | 9.0 | 3140 | 0.0220 | 0.4958 | 0.9916 | 0.9916 | nan | 0.9916 | 0.0 | 0.9916 | | 0.0133 | 9.05 | 3160 | 0.0210 | 0.4946 | 0.9891 | 0.9891 | nan | 0.9891 | 0.0 | 0.9891 | | 0.0158 | 9.11 | 3180 | 0.0213 | 0.4954 | 0.9908 | 0.9908 | nan | 0.9908 | 0.0 | 0.9908 | | 0.0155 | 9.17 | 3200 | 0.0217 | 0.4957 | 0.9914 | 0.9914 | nan | 0.9914 | 0.0 | 0.9914 | | 0.0202 | 9.23 | 3220 | 0.0218 | 0.4955 | 0.9911 | 0.9911 | nan | 0.9911 | 0.0 | 0.9911 | | 0.0128 | 9.28 | 3240 | 0.0211 | 0.4953 | 0.9905 | 0.9905 | nan | 0.9905 | 0.0 | 0.9905 | | 0.0304 | 9.34 | 3260 | 0.0218 | 0.4959 | 0.9918 | 0.9918 | nan | 0.9918 | 0.0 | 0.9918 | | 0.0354 | 9.4 | 3280 | 0.0214 | 0.4954 | 0.9908 | 0.9908 | nan | 0.9908 | 0.0 | 0.9908 | | 0.0188 | 9.46 | 3300 | 0.0214 | 0.4952 | 0.9903 | 0.9903 | nan | 0.9903 | 0.0 | 0.9903 | | 0.0117 | 9.51 | 3320 | 0.0223 | 0.4961 | 0.9921 | 0.9921 | nan | 0.9921 | 0.0 | 0.9921 | | 0.0175 | 9.57 | 3340 | 0.0215 | 0.4954 | 0.9907 | 0.9907 | nan | 0.9907 | 0.0 | 0.9907 | | 0.0304 | 9.63 | 3360 | 0.0217 | 0.4954 | 0.9909 | 0.9909 | nan | 0.9909 | 0.0 | 0.9909 | | 0.0166 | 9.68 | 3380 | 0.0216 | 0.4955 | 0.9909 | 0.9909 | nan | 0.9909 | 0.0 | 0.9909 | | 0.0899 | 9.74 | 3400 | 0.0221 | 0.4962 | 0.9923 | 0.9923 | nan | 0.9923 | 0.0 | 0.9923 | | 0.0128 | 9.8 | 3420 | 0.0216 | 0.4955 | 0.9910 | 0.9910 | nan | 0.9910 | 0.0 | 0.9910 | | 0.0149 | 9.86 | 3440 | 0.0217 | 0.4955 | 0.9910 | 0.9910 | nan | 0.9910 | 0.0 | 0.9910 | | 0.0192 | 9.91 | 3460 | 0.0216 | 0.4953 | 0.9906 | 0.9906 | nan | 0.9906 | 0.0 | 0.9906 | | 0.0454 | 9.97 | 3480 | 0.0222 | 0.4959 | 0.9919 | 0.9919 | nan | 0.9919 | 0.0 | 0.9919 | ### Framework versions - Transformers 4.35.0 - Pytorch 2.2.2 - Datasets 2.14.6 - Tokenizers 0.14.1
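A Mean IoU hovering near 0.5 while Mean Accuracy is ~0.99 is a metric artifact worth spelling out: the evaluation masks appear to contain only the object class (Accuracy Background is nan throughout), so the background class contributes an IoU of 0 whenever any pixel is predicted as background, and the two-class mean is dragged to roughly half the object IoU (0.4959 ≈ (0.0 + 0.9919) / 2). A toy sketch of the per-class IoU computation that produces this pattern (my own illustration, not the card's evaluation code):

```python
import numpy as np

def per_class_iou(pred, gt, num_labels):
    """IoU per class; a class absent from both pred and gt yields NaN."""
    ious = []
    for c in range(num_labels):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        ious.append(inter / union if union else float("nan"))
    return ious

# Ground truth contains only class 1 ("object"), as in this card's eval set.
gt = np.ones((4, 4), dtype=int)
pred = np.ones((4, 4), dtype=int)
pred[0, 0] = 0  # one stray "background" prediction

ious = per_class_iou(pred, gt, num_labels=2)
print(ious)              # [0.0, 0.9375]
print(np.nanmean(ious))  # ~0.47 -- the background IoU of 0 halves the mean
```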
[ "background", "object" ]
karthik540/segformer-b0-finetuned-segments-sidewalk-2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-finetuned-segments-sidewalk-2 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset. It achieves the following results on the evaluation set: - Loss: 2.4429 - Mean Iou: 0.0127 - Mean Accuracy: 0.0289 - Overall Accuracy: 0.2813 - Accuracy Unlabeled: nan - Accuracy Flat-road: 0.0012 - Accuracy Flat-sidewalk: 0.7342 - Accuracy Flat-crosswalk: 0.0 - Accuracy Flat-cyclinglane: 0.0 - Accuracy Flat-parkingdriveway: 0.0 - Accuracy Flat-railtrack: 0.0 - Accuracy Flat-curb: 0.0 - Accuracy Human-person: 0.0 - Accuracy Human-rider: 0.0 - Accuracy Vehicle-car: 0.0 - Accuracy Vehicle-truck: 0.0 - Accuracy Vehicle-bus: 0.0 - Accuracy Vehicle-tramtrain: 0.0 - Accuracy Vehicle-motorcycle: 0.0 - Accuracy Vehicle-bicycle: 0.0 - Accuracy Vehicle-caravan: 0.0 - Accuracy Vehicle-cartrailer: 0.0 - Accuracy Construction-building: 0.0538 - Accuracy Construction-door: 0.0 - Accuracy Construction-wall: 0.0 - Accuracy Construction-fenceguardrail: 0.0 - Accuracy Construction-bridge: 0.0 - Accuracy Construction-tunnel: 0.0 - Accuracy Construction-stairs: 0.0 - Accuracy Object-pole: 0.0 - Accuracy Object-trafficsign: 0.0 - Accuracy Object-trafficlight: 0.0 - Accuracy Nature-vegetation: 0.1770 - Accuracy Nature-terrain: 0.0 - Accuracy Sky: 0.0149 - Accuracy Void-ground: 0.0 - Accuracy Void-dynamic: 0.0 - Accuracy Void-static: 0.0 - Accuracy Void-unclear: 0.0 - Iou Unlabeled: nan - Iou Flat-road: 0.0012 - Iou Flat-sidewalk: 0.3016 - Iou Flat-crosswalk: 0.0 - Iou Flat-cyclinglane: 0.0 - Iou Flat-parkingdriveway: 0.0 - Iou Flat-railtrack: 0.0 - Iou Flat-curb: 0.0 - Iou Human-person: 0.0 - Iou Human-rider: 0.0 - Iou Vehicle-car: 0.0 - Iou Vehicle-truck: 0.0 - Iou Vehicle-bus: 0.0 - Iou Vehicle-tramtrain: 0.0 - Iou Vehicle-motorcycle: 0.0 - Iou Vehicle-bicycle: 0.0 - Iou Vehicle-caravan: 0.0 - Iou Vehicle-cartrailer: 0.0 - Iou Construction-building: 0.0318 - Iou Construction-door: 0.0 - Iou Construction-wall: 0.0 - Iou Construction-fenceguardrail: 0.0 - Iou Construction-bridge: 0.0 - Iou Construction-tunnel: 0.0 - Iou Construction-stairs: 0.0 - Iou Object-pole: 0.0 - Iou Object-trafficsign: 0.0 - Iou Object-trafficlight: 0.0 - Iou Nature-vegetation: 0.0859 - Iou Nature-terrain: 0.0 - Iou Sky: 0.0108 - Iou Void-ground: 0.0 - Iou Void-dynamic: 0.0 - Iou Void-static: 0.0 - Iou Void-unclear: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Flat-road | Accuracy Flat-sidewalk | Accuracy Flat-crosswalk | Accuracy Flat-cyclinglane | Accuracy Flat-parkingdriveway | Accuracy Flat-railtrack | Accuracy Flat-curb | Accuracy Human-person | Accuracy Human-rider | Accuracy Vehicle-car | Accuracy Vehicle-truck | Accuracy Vehicle-bus | Accuracy Vehicle-tramtrain | Accuracy Vehicle-motorcycle | Accuracy Vehicle-bicycle | Accuracy 
Vehicle-caravan | Accuracy Vehicle-cartrailer | Accuracy Construction-building | Accuracy Construction-door | Accuracy Construction-wall | Accuracy Construction-fenceguardrail | Accuracy Construction-bridge | Accuracy Construction-tunnel | Accuracy Construction-stairs | Accuracy Object-pole | Accuracy Object-trafficsign | Accuracy Object-trafficlight | Accuracy Nature-vegetation | Accuracy Nature-terrain | Accuracy Sky | Accuracy Void-ground | Accuracy Void-dynamic | Accuracy Void-static | Accuracy Void-unclear | Iou Unlabeled | Iou Flat-road | Iou Flat-sidewalk | Iou Flat-crosswalk | Iou Flat-cyclinglane | Iou Flat-parkingdriveway | Iou Flat-railtrack | Iou Flat-curb | Iou Human-person | Iou Human-rider | Iou Vehicle-car | Iou Vehicle-truck | Iou Vehicle-bus | Iou Vehicle-tramtrain | Iou Vehicle-motorcycle | Iou Vehicle-bicycle | Iou Vehicle-caravan | Iou Vehicle-cartrailer | Iou Construction-building | Iou Construction-door | Iou Construction-wall | Iou Construction-fenceguardrail | Iou Construction-bridge | Iou Construction-tunnel | Iou Construction-stairs | Iou Object-pole | Iou Object-trafficsign | Iou Object-trafficlight | Iou Nature-vegetation | Iou Nature-terrain | Iou Sky | Iou Void-ground | Iou Void-dynamic | Iou Void-static | Iou Void-unclear | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------------:|:----------------------:|:-----------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:------------------:|:---------------------:|:--------------------:|:--------------------:|:----------------------:|:--------------------:|:--------------------------:|:---------------------------:|:------------------------:|:------------------------:|:---------------------------:|:------------------------------:|:--------------------------:|:--------------------------:|:------------------------------------:|:----------------------------:|:----------------------------:|:----------------------------:|:--------------------:|:---------------------------:|:----------------------------:|:--------------------------:|:-----------------------:|:------------:|:--------------------:|:---------------------:|:--------------------:|:---------------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:| | 3.5256 | 0.2 | 10 | 3.5147 | 0.0071 | 0.0401 | 0.1017 | nan | 0.0000 | 0.2861 | 0.0000 | 0.0000 | 0.0402 | 0.0 | 0.0011 | 0.0017 | 0.0 | 0.0035 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0215 | 0.0236 | 0.0002 | 0.0010 | 0.0 | 0.0053 | 0.0162 | 0.0020 | 0.5432 | 0.0000 | 0.0815 | 0.0166 | 0.0172 | 0.0010 | 0.0000 | 0.0028 | 0.2620 | 0.0002 | 0.0060 | 0.0294 | 0.0 | 0.0000 | 0.1889 | 0.0000 | 0.0000 | 0.0173 | 0.0 | 0.0011 | 0.0005 | 0.0 | 0.0029 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0026 | 0.0001 | 0.0000 | 
0.0010 | 0.0 | 0.0038 | 0.0064 | 0.0001 | 0.0000 | 0.0000 | 0.0077 | 0.0010 | 0.0000 | 0.0010 | 0.0000 | 0.0027 | 0.0051 | 0.0002 | 0.0049 | 0.0002 | | 3.3115 | 0.4 | 20 | 3.4349 | 0.0090 | 0.0293 | 0.1597 | nan | 0.0001 | 0.4536 | 0.0 | 0.0 | 0.0642 | 0.0 | 0.0002 | 0.0009 | 0.0 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0075 | 0.0213 | 0.0 | 0.0009 | 0.0 | 0.0059 | 0.0046 | 0.0 | 0.0782 | 0.0 | 0.1300 | 0.0102 | 0.0231 | 0.0046 | 0.0000 | 0.0016 | 0.1731 | 0.0 | 0.0044 | 0.0114 | nan | 0.0001 | 0.2507 | 0.0 | 0.0 | 0.0202 | 0.0 | 0.0002 | 0.0004 | 0.0 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0021 | 0.0001 | 0.0 | 0.0009 | 0.0 | 0.0038 | 0.0030 | 0.0 | 0.0000 | 0.0 | 0.0080 | 0.0012 | 0.0000 | 0.0044 | 0.0000 | 0.0016 | 0.0050 | 0.0 | 0.0038 | 0.0002 | | 2.8003 | 0.6 | 30 | 3.3730 | 0.0087 | 0.0281 | 0.1245 | nan | 0.0054 | 0.3314 | 0.0000 | 0.0000 | 0.1290 | 0.0 | 0.0003 | 0.0031 | 0.0 | 0.0004 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0023 | 0.0092 | 0.0 | 0.0011 | 0.0 | 0.0233 | 0.0022 | 0.0 | 0.0 | 0.0 | 0.1795 | 0.0126 | 0.0010 | 0.0239 | 0.0002 | 0.0046 | 0.2127 | 0.0 | 0.0075 | 0.0060 | nan | 0.0052 | 0.2092 | 0.0000 | 0.0000 | 0.0244 | 0.0 | 0.0003 | 0.0009 | 0.0 | 0.0004 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0011 | 0.0001 | 0.0 | 0.0011 | 0.0 | 0.0078 | 0.0017 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0013 | 0.0000 | 0.0203 | 0.0002 | 0.0041 | 0.0050 | 0.0 | 0.0054 | 0.0004 | | 3.2521 | 0.8 | 40 | 3.2736 | 0.0110 | 0.0292 | 0.1863 | nan | 0.0294 | 0.5083 | 0.0001 | 0.0001 | 0.1290 | 0.0 | 0.0004 | 0.0012 | 0.0 | 0.0011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0015 | 0.0016 | 0.0 | 0.0016 | 0.0 | 0.0127 | 0.0038 | 0.0 | 0.0 | 0.0000 | 0.1112 | 0.0033 | 0.0 | 0.0218 | 0.0002 | 0.0042 | 0.1371 | 0.0 | 0.0163 | 0.0076 | nan | 0.0243 | 0.2651 | 0.0001 | 0.0001 | 0.0253 | 0.0 | 0.0004 | 0.0005 | 0.0 | 0.0011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0001 | 0.0 | 0.0016 | 0.0 | 0.0058 | 0.0028 | 0.0 | 0.0 | 0.0000 | 0.0080 | 0.0011 | 0.0 | 0.0185 | 0.0002 | 0.0037 | 0.0049 | 0.0 | 0.0083 | 0.0011 | | 2.9043 | 1.0 | 50 | 3.2220 | 0.0132 | 0.0291 | 0.1739 | nan | 0.1252 | 0.3934 | 0.0003 | 0.0003 | 0.1066 | 0.0 | 0.0063 | 0.0008 | 0.0 | 0.0068 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0 | 0.0 | 0.0075 | 0.0 | 0.0098 | 0.0044 | 0.0 | 0.0 | 0.0 | 0.0582 | 0.0006 | 0.0 | 0.1309 | 0.0006 | 0.0081 | 0.1208 | 0.0 | 0.0094 | 0.0001 | nan | 0.0664 | 0.2317 | 0.0003 | 0.0003 | 0.0241 | 0.0 | 0.0053 | 0.0004 | 0.0 | 0.0057 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0068 | 0.0 | 0.0052 | 0.0033 | 0.0 | 0.0 | 0.0 | 0.0085 | 0.0004 | 0.0 | 0.0710 | 0.0006 | 0.0064 | 0.0046 | 0.0 | 0.0061 | 0.0000 | | 2.8893 | 1.2 | 60 | 3.1323 | 0.0128 | 0.0301 | 0.1824 | nan | 0.1147 | 0.2779 | 0.0000 | 0.0002 | 0.0638 | 0.0 | 0.0002 | 0.0001 | 0.0 | 0.0066 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0111 | 0.0 | 0.0017 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0052 | 0.0000 | 0.0 | 0.4865 | 0.0005 | 0.0062 | 0.0445 | 0.0 | 0.0025 | 0.0 | nan | 0.0637 | 0.1900 | 0.0000 | 0.0002 | 0.0202 | 0.0 | 0.0002 | 0.0001 | 0.0 | 0.0052 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0091 | 0.0 | 0.0014 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0033 | 0.0000 | 0.0 | 0.1301 | 0.0005 | 0.0050 | 0.0041 | 0.0 | 0.0022 | 0.0 | | 2.8221 | 1.4 | 70 | 3.0049 | 0.0138 | 0.0298 | 0.2481 | nan | 0.0664 | 0.5578 | 0.0 | 0.0001 | 0.0184 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0015 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0101 | 0.0 | 0.0002 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0 | 0.0 | 0.3322 | 0.0001 | 0.0147 | 0.0097 | 0.0 | 0.0000 | 0.0 | nan | 0.0443 | 0.2727 | 0.0 | 0.0001 | 0.0109 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0014 | 
0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0084 | 0.0 | 0.0002 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0 | 0.0 | 0.1168 | 0.0001 | 0.0102 | 0.0032 | 0.0 | 0.0000 | 0.0 | | 2.7321 | 1.6 | 80 | 2.9281 | 0.0129 | 0.0300 | 0.2121 | nan | 0.1000 | 0.3599 | 0.0 | 0.0 | 0.0076 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0022 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0172 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.5224 | 0.0 | 0.0077 | 0.0040 | 0.0 | 0.0 | 0.0 | nan | 0.0577 | 0.2179 | 0.0 | 0.0 | 0.0056 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0126 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.1335 | 0.0 | 0.0065 | 0.0028 | 0.0 | 0.0 | 0.0 | | 2.7583 | 1.8 | 90 | 2.9182 | 0.0107 | 0.0303 | 0.1746 | nan | 0.1465 | 0.1641 | 0.0 | 0.0 | 0.0036 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0123 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.6905 | 0.0 | 0.0102 | 0.0008 | 0.0 | 0.0 | 0.0 | nan | 0.0714 | 0.1297 | 0.0 | 0.0 | 0.0030 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0095 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.1398 | 0.0 | 0.0084 | 0.0007 | 0.0 | 0.0 | 0.0 | | 3.1177 | 2.0 | 100 | 2.9230 | 0.0138 | 0.0297 | 0.2272 | nan | 0.1294 | 0.4556 | 0.0 | 0.0000 | 0.0030 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0299 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3671 | 0.0 | 0.0219 | 0.0004 | 0.0 | 0.0 | 0.0 | nan | 0.0662 | 0.2463 | 0.0 | 0.0000 | 0.0026 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0196 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1193 | 0.0 | 0.0149 | 0.0004 | 0.0 | 0.0 | 0.0 | | 3.041 | 2.2 | 110 | 2.8124 | 0.0138 | 0.0291 | 0.2549 | nan | 0.1402 | 0.6049 | 0.0 | 0.0000 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0363 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1992 | 0.0 | 0.0075 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0683 | 0.2797 | 0.0 | 0.0000 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0234 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0921 | 0.0 | 0.0062 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.1549 | 2.4 | 120 | 2.7993 | 0.0132 | 0.0292 | 0.2105 | nan | 0.1463 | 0.3812 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0572 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4022 | 0.0 | 0.0061 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0692 | 0.2227 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0301 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1223 | 0.0 | 0.0053 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.7506 | 2.6 | 130 | 2.7869 | 0.0136 | 0.0290 | 0.2153 | nan | 0.1198 | 0.4194 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0626 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3578 | 0.0 | 0.0272 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0628 | 0.2315 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0319 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1161 | 0.0 | 0.0191 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.8666 | 2.8 | 140 | 2.7030 | 0.0133 | 0.0288 | 0.2546 | nan | 0.0626 | 0.5989 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0378 | 0.0 | 0.0 | 0.0 | 0.0 
| 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2736 | 0.0 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0417 | 0.2753 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0239 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1059 | 0.0 | 0.0041 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.3693 | 3.0 | 150 | 2.6758 | 0.0133 | 0.0289 | 0.2790 | nan | 0.0661 | 0.7211 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0304 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1548 | 0.0 | 0.0089 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0432 | 0.3002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0211 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0808 | 0.0 | 0.0071 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.4211 | 3.2 | 160 | 2.6509 | 0.0122 | 0.0292 | 0.3118 | nan | 0.0340 | 0.8762 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0157 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0493 | 0.0 | 0.0169 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0270 | 0.3255 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0129 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0375 | 0.0 | 0.0117 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.2934 | 3.4 | 170 | 2.5811 | 0.0109 | 0.0290 | 0.3268 | nan | 0.0162 | 0.9439 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0104 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0148 | 0.0 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0145 | 0.3322 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0093 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0131 | 0.0 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.2474 | 3.6 | 180 | 2.6740 | 0.0122 | 0.0287 | 0.3000 | nan | 0.0201 | 0.8363 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0013 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0619 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0461 | 0.0 | 0.0089 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0170 | 0.3185 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0013 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0342 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0354 | 0.0 | 0.0072 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.4543 | 3.8 | 190 | 2.5741 | 0.0115 | 0.0287 | 0.3111 | nan | 0.0113 | 0.8837 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0529 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0263 | 0.0 | 0.0014 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0103 | 0.3246 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0317 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0219 | 0.0 | 0.0014 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.6415 | 4.0 | 200 | 2.4955 | 0.0114 | 0.0287 | 0.3121 | nan | 0.0075 | 0.8862 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0495 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0328 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0070 | 0.3248 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0302 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0266 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.3359 | 4.2 | 210 | 2.6535 | 0.0130 | 0.0280 | 0.2474 | nan | 0.0389 | 0.6235 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0 
| 0.0 | 0.0 | 0.0 | 0.1633 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1048 | 0.0 | 0.0211 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0285 | 0.2807 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0537 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0632 | 0.0 | 0.0142 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.8133 | 4.4 | 220 | 2.6000 | 0.0133 | 0.0285 | 0.2609 | nan | 0.0401 | 0.6643 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1069 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1350 | 0.0 | 0.0210 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0291 | 0.2901 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0447 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0749 | 0.0 | 0.0147 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.3126 | 4.6 | 230 | 2.6429 | 0.0126 | 0.0288 | 0.1857 | nan | 0.0332 | 0.3374 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2779 | 0.0 | 0.0480 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0255 | 0.2045 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0639 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1067 | 0.0 | 0.0288 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.2695 | 4.8 | 240 | 2.5140 | 0.0128 | 0.0282 | 0.2399 | nan | 0.0217 | 0.5869 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2003 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1312 | 0.0 | 0.0183 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0177 | 0.2729 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0574 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0738 | 0.0 | 0.0145 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.0622 | 5.0 | 250 | 2.4634 | 0.0126 | 0.0283 | 0.2656 | nan | 0.0107 | 0.6862 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1332 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1278 | 0.0 | 0.0058 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0097 | 0.2918 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0491 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0736 | 0.0 | 0.0052 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.1988 | 5.2 | 260 | 2.5162 | 0.0125 | 0.0282 | 0.2209 | nan | 0.0083 | 0.5152 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2606 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1382 | 0.0 | 0.0379 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0077 | 0.2553 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0621 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0760 | 0.0 | 0.0236 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.4214 | 5.4 | 270 | 2.5880 | 0.0122 | 0.0284 | 0.1888 | nan | 0.0134 | 0.3772 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3344 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1887 | 0.0 | 0.0516 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0117 | 0.2176 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0671 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0892 | 0.0 | 0.0279 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.2255 | 5.6 | 280 | 2.4963 | 0.0127 | 0.0287 | 0.2732 | nan | 0.0126 | 0.7299 | 0.0 | 0.0 | 0.0 | 0.0 
| 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1341 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0689 | 0.0 | 0.0301 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0113 | 0.3024 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0513 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0476 | 0.0 | 0.0182 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.3459 | 5.8 | 290 | 2.5055 | 0.0131 | 0.0288 | 0.2638 | nan | 0.0133 | 0.6801 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1239 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1258 | 0.0 | 0.0347 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0118 | 0.2933 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0489 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0718 | 0.0 | 0.0198 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.1034 | 6.0 | 300 | 2.4549 | 0.0125 | 0.0288 | 0.2873 | nan | 0.0048 | 0.7776 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0929 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0897 | 0.0 | 0.0143 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0046 | 0.3101 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0430 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0581 | 0.0 | 0.0107 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.2193 | 6.2 | 310 | 2.4227 | 0.0126 | 0.0290 | 0.2879 | nan | 0.0013 | 0.7619 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0746 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1482 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0013 | 0.3070 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0379 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0810 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.3808 | 6.4 | 320 | 2.4239 | 0.0124 | 0.0290 | 0.2926 | nan | 0.0006 | 0.7900 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0797 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1109 | 0.0 | 0.0031 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0006 | 0.3122 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0397 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0675 | 0.0 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.1201 | 6.6 | 330 | 2.4546 | 0.0130 | 0.0292 | 0.2795 | nan | 0.0010 | 0.7295 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0903 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1522 | 0.0 | 0.0186 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0010 | 0.3036 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0422 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0817 | 0.0 | 0.0131 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.1429 | 6.8 | 340 | 2.4390 | 0.0121 | 0.0292 | 0.3077 | nan | 0.0004 | 0.8612 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0502 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0618 | 0.0 | 0.0185 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0004 | 0.3245 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0314 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0446 | 0.0 | 0.0122 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.3745 | 7.0 | 350 | 2.4814 | 0.0132 | 0.0292 | 
0.2555 | nan | 0.0020 | 0.6392 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0911 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1816 | 0.0 | 0.0800 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0020 | 0.2865 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0422 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0881 | 0.0 | 0.0287 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.1907 | 7.2 | 360 | 2.4901 | 0.0130 | 0.0290 | 0.2387 | nan | 0.0014 | 0.5526 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1063 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2669 | 0.0 | 0.0588 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0014 | 0.2661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0432 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1055 | 0.0 | 0.0274 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.1116 | 7.4 | 370 | 2.4841 | 0.0130 | 0.0290 | 0.2350 | nan | 0.0015 | 0.5323 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0908 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2968 | 0.0 | 0.0659 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0014 | 0.2612 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0397 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1097 | 0.0 | 0.0284 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.4808 | 7.6 | 380 | 2.4890 | 0.0129 | 0.0293 | 0.2376 | nan | 0.0025 | 0.5715 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0758 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2136 | 0.0 | 0.1314 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0024 | 0.2729 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0372 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0958 | 0.0 | 0.0319 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.8601 | 7.8 | 390 | 2.5003 | 0.0128 | 0.0290 | 0.2250 | nan | 0.0022 | 0.4998 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0898 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2944 | 0.0 | 0.1015 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0022 | 0.2538 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0393 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1094 | 0.0 | 0.0313 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.032 | 8.0 | 400 | 2.5240 | 0.0125 | 0.0289 | 0.2093 | nan | 0.0027 | 0.4406 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1033 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3108 | 0.0 | 0.1262 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0026 | 0.2379 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0422 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1106 | 0.0 | 0.0326 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.9364 | 8.2 | 410 | 2.4666 | 0.0127 | 0.0292 | 0.2720 | nan | 0.0024 | 0.7293 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0924 | 0.0 | 0.1028 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0023 | 0.3046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0371 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0590 | 0.0 | 0.0282 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.0335 | 
8.4 | 420 | 2.4894 | 0.0129 | 0.0292 | 0.2402 | nan | 0.0046 | 0.5965 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0783 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1650 | 0.0 | 0.1478 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0044 | 0.2787 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0400 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0831 | 0.0 | 0.0315 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.0622 | 8.6 | 430 | 2.5457 | 0.0121 | 0.0287 | 0.1888 | nan | 0.0038 | 0.3645 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1536 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3132 | 0.0 | 0.1396 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0037 | 0.2129 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0500 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1108 | 0.0 | 0.0338 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.9635 | 8.8 | 440 | 2.5416 | 0.0120 | 0.0287 | 0.1908 | nan | 0.0028 | 0.4200 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1216 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1900 | 0.0 | 0.2427 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0027 | 0.2335 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0465 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0899 | 0.0 | 0.0340 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.9328 | 9.0 | 450 | 2.4707 | 0.0128 | 0.0293 | 0.2609 | nan | 0.0024 | 0.6792 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0528 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1358 | 0.0 | 0.1274 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0024 | 0.2958 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0327 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0750 | 0.0 | 0.0299 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.0373 | 9.2 | 460 | 2.5003 | 0.0128 | 0.0292 | 0.2294 | nan | 0.0028 | 0.5341 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0638 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2482 | 0.0 | 0.1447 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0028 | 0.2641 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0349 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1011 | 0.0 | 0.0315 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.2552 | 9.4 | 470 | 2.4884 | 0.0130 | 0.0292 | 0.2400 | nan | 0.0020 | 0.5674 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0689 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2513 | 0.0 | 0.1038 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0020 | 0.2712 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0365 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1016 | 0.0 | 0.0296 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.956 | 9.6 | 480 | 2.5214 | 0.0126 | 0.0289 | 0.2153 | nan | 0.0034 | 0.4825 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1038 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2477 | 0.0 | 0.1458 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0033 | 0.2501 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0445 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1003 | 
0.0 | 0.0320 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.1743 | 9.8 | 490 | 2.4624 | 0.0127 | 0.0289 | 0.2689 | nan | 0.0018 | 0.7146 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0769 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1041 | 0.0 | 0.0848 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0018 | 0.3001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0402 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0633 | 0.0 | 0.0261 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.0282 | 10.0 | 500 | 2.4429 | 0.0127 | 0.0289 | 0.2813 | nan | 0.0012 | 0.7342 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0538 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1770 | 0.0 | 0.0149 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0012 | 0.3016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0318 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0859 | 0.0 | 0.0108 | 0.0 | 0.0 | 0.0 | 0.0 | ### Framework versions - Transformers 4.40.1 - Pytorch 2.3.0 - Datasets 2.19.0 - Tokenizers 0.19.1
[ "unlabeled", "flat-road", "flat-sidewalk", "flat-crosswalk", "flat-cyclinglane", "flat-parkingdriveway", "flat-railtrack", "flat-curb", "human-person", "human-rider", "vehicle-car", "vehicle-truck", "vehicle-bus", "vehicle-tramtrain", "vehicle-motorcycle", "vehicle-bicycle", "vehicle-caravan", "vehicle-cartrailer", "construction-building", "construction-door", "construction-wall", "construction-fenceguardrail", "construction-bridge", "construction-tunnel", "construction-stairs", "object-pole", "object-trafficsign", "object-trafficlight", "nature-vegetation", "nature-terrain", "sky", "void-ground", "void-dynamic", "void-static", "void-unclear" ]
karthik540/mario-semantic-1
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# mario-semantic-1

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the Custom mario Dataset. It achieves the following results on the evaluation set:
- Loss: 0.0721
- Mean Iou: 0.0
- Mean Accuracy: 0.0
- Overall Accuracy: 0.0
- Accuracy Unlabeled: nan
- Accuracy Mario: 0.0
- Accuracy Ground: 0.0
- Accuracy Enemy: 0.0
- Accuracy Bricks: 0.0
- Accuracy Question: 0.0
- Iou Unlabeled: 0.0
- Iou Mario: 0.0
- Iou Ground: 0.0
- Iou Enemy: 0.0
- Iou Bricks: 0.0
- Iou Question: 0.0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Mario | Accuracy Ground | Accuracy Enemy | Accuracy Bricks | Accuracy Question | Iou Unlabeled | Iou Mario | Iou Ground | Iou Enemy | Iou Bricks | Iou Question |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:--------------:|:---------------:|:--------------:|:---------------:|:-----------------:|:-------------:|:---------:|:----------:|:---------:|:----------:|:------------:|
| 1.1471 | 0.2222 | 10 | 1.3150 | 0.0054 | 0.0409 | 0.0429 | nan | 0.0587 | 0.0 | 0.0305 | 0.0481 | 0.0674 | 0.0 | 0.0141 | 0.0 | 0.0110 | 0.0010 | 0.0063 |
| 1.0399 | 0.4444 | 20 | 1.1597 | 0.0042 | 0.0247 | 0.0335 | nan | 0.0687 | 0.0 | 0.0054 | 0.0098 | 0.0397 | 0.0 | 0.0136 | 0.0 | 0.0029 | 0.0005 | 0.0081 |
| 0.8368 | 0.6667 | 30 | 0.9484 | 0.0018 | 0.0052 | 0.0054 | nan | 0.0024 | 0.0 | 0.0098 | 0.0018 | 0.0121 | 0.0 | 0.0012 | 0.0 | 0.0049 | 0.0002 | 0.0046 |
| 0.9264 | 0.8889 | 40 | 0.7115 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.7753 | 1.1111 | 50 | 0.7572 | 0.0010 | 0.0023 | 0.0038 | nan | 0.0 | 0.0 | 0.0113 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0062 | 0.0 | 0.0 |
| 0.6295 | 1.3333 | 60 | 0.5617 | 0.0001 | 0.0002 | 0.0003 | nan | 0.0 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0 | 0.0 |
| 0.5956 | 1.5556 | 70 | 0.4135 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.5756 | 1.7778 | 80 | 0.2028 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.5318 | 2.0 | 90 | 0.1185 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.5351 | 2.2222 | 100 | 0.3064 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.5706 | 2.4444 | 110 | 0.1378 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.4863 | 2.6667 | 120 | 0.1121 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.3226 | 2.8889 | 130 | 0.2038 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.4139 | 3.1111 | 140 | 0.1520 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.3983 | 3.3333 | 150 | 0.1070 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.3672 | 3.5556 | 160 | 0.1282 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.3324 | 3.7778 | 170 | 0.1075 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.2806 | 4.0 | 180 | 0.2677 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.2854 | 4.2222 | 190 | 0.1020 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.3463 | 4.4444 | 200 | 0.0551 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1957 | 4.6667 | 210 | 0.1982 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.3063 | 4.8889 | 220 | 0.0962 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1933 | 5.1111 | 230 | 0.1172 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1833 | 5.3333 | 240 | 0.0600 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.231 | 5.5556 | 250 | 0.0519 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1516 | 5.7778 | 260 | 0.0575 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.172 | 6.0 | 270 | 0.1182 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1307 | 6.2222 | 280 | 0.0989 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1454 | 6.4444 | 290 | 0.1045 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1319 | 6.6667 | 300 | 0.0793 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1154 | 6.8889 | 310 | 0.0567 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1241 | 7.1111 | 320 | 0.0562 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1379 | 7.3333 | 330 | 0.0700 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1183 | 7.5556 | 340 | 0.0616 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.108 | 7.7778 | 350 | 0.0823 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1204 | 8.0 | 360 | 0.0661 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1391 | 8.2222 | 370 | 0.0578 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1554 | 8.4444 | 380 | 0.0643 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1338 | 8.6667 | 390 | 0.0822 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1358 | 8.8889 | 400 | 0.0997 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1704 | 9.1111 | 410 | 0.0503 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1242 | 9.3333 | 420 | 0.0692 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1153 | 9.5556 | 430 | 0.1003 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0999 | 9.7778 | 440 | 0.0909 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0968 | 10.0 | 450 | 0.0721 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |

### Framework versions

- Transformers 4.40.1
- Pytorch 2.3.0
- Datasets 2.19.0
- Tokenizers 0.19.1
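The Mean Iou, Mean Accuracy and Overall Accuracy columns in the table above are the standard Hugging Face semantic-segmentation metrics. A minimal sketch of how such numbers can be recomputed with the `evaluate` library's `mean_iou` metric, assuming that package is installed; the toy arrays stand in for real predictions and ground-truth masks over the six classes listed just below:

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy stand-ins: a model that predicts class 0 ("unlabeled") everywhere,
# scored against a random six-class ground-truth mask.
predictions = [np.zeros((64, 64), dtype=np.int64)]
references = [np.random.randint(0, 6, size=(64, 64)).astype(np.int64)]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=6,  # unlabeled, mario, ground, enemy, bricks, question
    ignore_index=255,
    reduce_labels=False,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"])  # one IoU per class, as in the table columns
```

A long run of all-zero rows like the one above is consistent with the network collapsing to a single class; the per-category entries make that easy to spot.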
[ "unlabeled", "mario", "ground", "enemy", "bricks", "question" ]
vigneshgs7/segformer-b5-p142-cvat-vgs
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b5-p142-cvat-vgs This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the vigneshgs7/segformer_open_cv_RGB_L_0_1 dataset. It achieves the following results on the evaluation set: - Loss: 0.0131 - Mean Iou: 0.4961 - Mean Accuracy: 0.9922 - Overall Accuracy: 0.9922 - Accuracy Background: nan - Accuracy Object: 0.9922 - Iou Background: 0.0 - Iou Object: 0.9922 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Object | Iou Background | Iou Object | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:---------------:|:--------------:|:----------:| | 0.2847 | 0.06 | 20 | 0.3843 | 0.4662 | 0.9324 | 0.9324 | nan | 0.9324 | 0.0 | 0.9324 | | 0.1681 | 0.11 | 40 | 0.1983 | 0.4704 | 0.9408 | 0.9408 | nan | 0.9408 | 0.0 | 0.9408 | | 0.1592 | 0.17 | 60 | 0.1303 | 0.4745 | 0.9489 | 0.9489 | nan | 0.9489 | 0.0 | 0.9489 | | 0.1177 | 0.23 | 80 | 0.0922 | 0.4944 | 0.9888 | 0.9888 | nan | 0.9888 | 0.0 | 0.9888 | | 0.062 | 0.29 | 100 | 0.0745 | 0.4946 | 0.9892 | 0.9892 | nan | 0.9892 | 0.0 | 0.9892 | | 0.0767 | 0.34 | 120 | 0.0545 | 0.4852 | 0.9703 | 0.9703 | nan | 0.9703 | 0.0 | 0.9703 | | 0.0984 | 0.4 | 140 | 0.0621 | 0.4938 | 0.9875 | 0.9875 | nan | 0.9875 | 0.0 | 0.9875 | | 0.1779 | 0.46 | 160 | 0.0504 | 0.4961 | 0.9921 | 0.9921 | nan | 0.9921 | 0.0 | 0.9921 | | 0.0468 | 0.52 | 180 | 0.0407 | 0.4904 | 0.9807 | 0.9807 | nan | 0.9807 | 0.0 | 0.9807 | | 0.0618 | 0.57 | 200 | 0.0390 | 0.4936 | 0.9873 | 0.9873 | nan | 0.9873 | 0.0 | 0.9873 | | 0.062 | 0.63 | 220 | 0.0348 | 0.4947 | 0.9894 | 0.9894 | nan | 0.9894 | 0.0 | 0.9894 | | 0.0357 | 0.69 | 240 | 0.0341 | 0.4914 | 0.9828 | 0.9828 | nan | 0.9828 | 0.0 | 0.9828 | | 0.0304 | 0.74 | 260 | 0.0351 | 0.4960 | 0.9920 | 0.9920 | nan | 0.9920 | 0.0 | 0.9920 | | 0.0267 | 0.8 | 280 | 0.0311 | 0.4938 | 0.9877 | 0.9877 | nan | 0.9877 | 0.0 | 0.9877 | | 0.0536 | 0.86 | 300 | 0.0282 | 0.4904 | 0.9807 | 0.9807 | nan | 0.9807 | 0.0 | 0.9807 | | 0.049 | 0.92 | 320 | 0.0274 | 0.4928 | 0.9855 | 0.9855 | nan | 0.9855 | 0.0 | 0.9855 | | 0.0304 | 0.97 | 340 | 0.0262 | 0.4936 | 0.9872 | 0.9872 | nan | 0.9872 | 0.0 | 0.9872 | | 0.0232 | 1.03 | 360 | 0.0251 | 0.4923 | 0.9847 | 0.9847 | nan | 0.9847 | 0.0 | 0.9847 | | 0.0304 | 1.09 | 380 | 0.0240 | 0.4917 | 0.9835 | 0.9835 | nan | 0.9835 | 0.0 | 0.9835 | | 0.0451 | 1.15 | 400 | 0.0261 | 0.4964 | 0.9927 | 0.9927 | nan | 0.9927 | 0.0 | 0.9927 | | 0.0254 | 1.2 | 420 | 0.0234 | 0.4929 | 0.9859 | 0.9859 | nan | 0.9859 | 0.0 | 0.9859 | | 0.0354 | 1.26 | 440 | 0.0229 | 0.4931 | 0.9861 | 0.9861 | nan | 0.9861 | 0.0 | 0.9861 | | 0.2103 | 1.32 | 460 | 0.0224 | 0.4951 | 0.9902 | 0.9902 | nan | 0.9902 | 0.0 | 0.9902 | | 0.041 | 1.38 | 480 | 0.0222 | 0.4920 | 0.9839 | 0.9839 | nan | 0.9839 | 0.0 | 0.9839 | | 
0.0297 | 1.43 | 500 | 0.0223 | 0.4950 | 0.9900 | 0.9900 | nan | 0.9900 | 0.0 | 0.9900 | | 0.0299 | 1.49 | 520 | 0.0227 | 0.4961 | 0.9923 | 0.9923 | nan | 0.9923 | 0.0 | 0.9923 | | 0.0213 | 1.55 | 540 | 0.0209 | 0.4947 | 0.9895 | 0.9895 | nan | 0.9895 | 0.0 | 0.9895 | | 0.0269 | 1.6 | 560 | 0.0214 | 0.4909 | 0.9817 | 0.9817 | nan | 0.9817 | 0.0 | 0.9817 | | 0.2199 | 1.66 | 580 | 0.0216 | 0.4956 | 0.9912 | 0.9912 | nan | 0.9912 | 0.0 | 0.9912 | | 0.0191 | 1.72 | 600 | 0.0208 | 0.4935 | 0.9869 | 0.9869 | nan | 0.9869 | 0.0 | 0.9869 | | 0.0265 | 1.78 | 620 | 0.0201 | 0.4941 | 0.9882 | 0.9882 | nan | 0.9882 | 0.0 | 0.9882 | | 0.0244 | 1.83 | 640 | 0.0213 | 0.4910 | 0.9820 | 0.9820 | nan | 0.9820 | 0.0 | 0.9820 | | 0.0172 | 1.89 | 660 | 0.0199 | 0.4929 | 0.9858 | 0.9858 | nan | 0.9858 | 0.0 | 0.9858 | | 0.0339 | 1.95 | 680 | 0.0190 | 0.4930 | 0.9859 | 0.9859 | nan | 0.9859 | 0.0 | 0.9859 | | 0.027 | 2.01 | 700 | 0.0192 | 0.4953 | 0.9906 | 0.9906 | nan | 0.9906 | 0.0 | 0.9906 | | 0.0221 | 2.06 | 720 | 0.0195 | 0.4915 | 0.9830 | 0.9830 | nan | 0.9830 | 0.0 | 0.9830 | | 0.0461 | 2.12 | 740 | 0.0188 | 0.4953 | 0.9905 | 0.9905 | nan | 0.9905 | 0.0 | 0.9905 | | 0.0444 | 2.18 | 760 | 0.0189 | 0.4957 | 0.9914 | 0.9914 | nan | 0.9914 | 0.0 | 0.9914 | | 0.0211 | 2.23 | 780 | 0.0184 | 0.4949 | 0.9898 | 0.9898 | nan | 0.9898 | 0.0 | 0.9898 | | 0.0221 | 2.29 | 800 | 0.0186 | 0.4963 | 0.9925 | 0.9925 | nan | 0.9925 | 0.0 | 0.9925 | | 0.0165 | 2.35 | 820 | 0.0181 | 0.4942 | 0.9883 | 0.9883 | nan | 0.9883 | 0.0 | 0.9883 | | 0.0171 | 2.41 | 840 | 0.0181 | 0.4923 | 0.9846 | 0.9846 | nan | 0.9846 | 0.0 | 0.9846 | | 0.0202 | 2.46 | 860 | 0.0178 | 0.4958 | 0.9915 | 0.9915 | nan | 0.9915 | 0.0 | 0.9915 | | 0.0222 | 2.52 | 880 | 0.0178 | 0.4922 | 0.9844 | 0.9844 | nan | 0.9844 | 0.0 | 0.9844 | | 0.018 | 2.58 | 900 | 0.0162 | 0.4949 | 0.9898 | 0.9898 | nan | 0.9898 | 0.0 | 0.9898 | | 0.0288 | 2.64 | 920 | 0.0168 | 0.4943 | 0.9887 | 0.9887 | nan | 0.9887 | 0.0 | 0.9887 | | 0.016 | 2.69 | 940 | 0.0178 | 0.4968 | 0.9936 | 0.9936 | nan | 0.9936 | 0.0 | 0.9936 | | 0.0184 | 2.75 | 960 | 0.0172 | 0.4935 | 0.9870 | 0.9870 | nan | 0.9870 | 0.0 | 0.9870 | | 0.0172 | 2.81 | 980 | 0.0175 | 0.4950 | 0.9900 | 0.9900 | nan | 0.9900 | 0.0 | 0.9900 | | 0.0168 | 2.87 | 1000 | 0.0172 | 0.4951 | 0.9902 | 0.9902 | nan | 0.9902 | 0.0 | 0.9902 | | 0.0197 | 2.92 | 1020 | 0.0169 | 0.4961 | 0.9923 | 0.9923 | nan | 0.9923 | 0.0 | 0.9923 | | 0.0177 | 2.98 | 1040 | 0.0170 | 0.4961 | 0.9922 | 0.9922 | nan | 0.9922 | 0.0 | 0.9922 | | 0.0377 | 3.04 | 1060 | 0.0163 | 0.4944 | 0.9888 | 0.9888 | nan | 0.9888 | 0.0 | 0.9888 | | 0.0168 | 3.09 | 1080 | 0.0162 | 0.4953 | 0.9906 | 0.9906 | nan | 0.9906 | 0.0 | 0.9906 | | 0.0167 | 3.15 | 1100 | 0.0166 | 0.4961 | 0.9922 | 0.9922 | nan | 0.9922 | 0.0 | 0.9922 | | 0.0213 | 3.21 | 1120 | 0.0164 | 0.4948 | 0.9895 | 0.9895 | nan | 0.9895 | 0.0 | 0.9895 | | 0.0195 | 3.27 | 1140 | 0.0162 | 0.4947 | 0.9894 | 0.9894 | nan | 0.9894 | 0.0 | 0.9894 | | 0.014 | 3.32 | 1160 | 0.0160 | 0.4950 | 0.9900 | 0.9900 | nan | 0.9900 | 0.0 | 0.9900 | | 0.0221 | 3.38 | 1180 | 0.0164 | 0.4961 | 0.9922 | 0.9922 | nan | 0.9922 | 0.0 | 0.9922 | | 0.0162 | 3.44 | 1200 | 0.0159 | 0.4945 | 0.9890 | 0.9890 | nan | 0.9890 | 0.0 | 0.9890 | | 0.0153 | 3.5 | 1220 | 0.0152 | 0.4957 | 0.9914 | 0.9914 | nan | 0.9914 | 0.0 | 0.9914 | | 0.0145 | 3.55 | 1240 | 0.0161 | 0.4935 | 0.9871 | 0.9871 | nan | 0.9871 | 0.0 | 0.9871 | | 0.0139 | 3.61 | 1260 | 0.0155 | 0.4951 | 0.9902 | 0.9902 | nan | 0.9902 | 0.0 | 0.9902 | | 0.0153 | 3.67 | 1280 | 0.0157 | 
0.4942 | 0.9884 | 0.9884 | nan | 0.9884 | 0.0 | 0.9884 | | 0.0156 | 3.72 | 1300 | 0.0157 | 0.4949 | 0.9898 | 0.9898 | nan | 0.9898 | 0.0 | 0.9898 | | 0.033 | 3.78 | 1320 | 0.0157 | 0.4952 | 0.9903 | 0.9903 | nan | 0.9903 | 0.0 | 0.9903 | | 0.0219 | 3.84 | 1340 | 0.0153 | 0.4957 | 0.9915 | 0.9915 | nan | 0.9915 | 0.0 | 0.9915 | | 0.0166 | 3.9 | 1360 | 0.0162 | 0.4935 | 0.9871 | 0.9871 | nan | 0.9871 | 0.0 | 0.9871 | | 0.0168 | 3.95 | 1380 | 0.0157 | 0.4949 | 0.9897 | 0.9897 | nan | 0.9897 | 0.0 | 0.9897 | | 0.0177 | 4.01 | 1400 | 0.0153 | 0.4966 | 0.9932 | 0.9932 | nan | 0.9932 | 0.0 | 0.9932 | | 0.0136 | 4.07 | 1420 | 0.0150 | 0.4952 | 0.9905 | 0.9905 | nan | 0.9905 | 0.0 | 0.9905 | | 0.0334 | 4.13 | 1440 | 0.0156 | 0.4956 | 0.9912 | 0.9912 | nan | 0.9912 | 0.0 | 0.9912 | | 0.019 | 4.18 | 1460 | 0.0154 | 0.4950 | 0.9899 | 0.9899 | nan | 0.9899 | 0.0 | 0.9899 | | 0.0147 | 4.24 | 1480 | 0.0148 | 0.4960 | 0.9920 | 0.9920 | nan | 0.9920 | 0.0 | 0.9920 | | 0.0135 | 4.3 | 1500 | 0.0146 | 0.4951 | 0.9902 | 0.9902 | nan | 0.9902 | 0.0 | 0.9902 | | 0.0186 | 4.36 | 1520 | 0.0143 | 0.4966 | 0.9933 | 0.9933 | nan | 0.9933 | 0.0 | 0.9933 | | 0.0153 | 4.41 | 1540 | 0.0141 | 0.4954 | 0.9909 | 0.9909 | nan | 0.9909 | 0.0 | 0.9909 | | 0.0181 | 4.47 | 1560 | 0.0145 | 0.4954 | 0.9908 | 0.9908 | nan | 0.9908 | 0.0 | 0.9908 | | 0.0266 | 4.53 | 1580 | 0.0146 | 0.4953 | 0.9907 | 0.9907 | nan | 0.9907 | 0.0 | 0.9907 | | 0.0141 | 4.58 | 1600 | 0.0147 | 0.4952 | 0.9904 | 0.9904 | nan | 0.9904 | 0.0 | 0.9904 | | 0.0145 | 4.64 | 1620 | 0.0150 | 0.4947 | 0.9894 | 0.9894 | nan | 0.9894 | 0.0 | 0.9894 | | 0.0128 | 4.7 | 1640 | 0.0151 | 0.4964 | 0.9928 | 0.9928 | nan | 0.9928 | 0.0 | 0.9928 | | 0.0119 | 4.76 | 1660 | 0.0143 | 0.4948 | 0.9897 | 0.9897 | nan | 0.9897 | 0.0 | 0.9897 | | 0.0133 | 4.81 | 1680 | 0.0144 | 0.4950 | 0.9900 | 0.9900 | nan | 0.9900 | 0.0 | 0.9900 | | 0.0151 | 4.87 | 1700 | 0.0143 | 0.4956 | 0.9911 | 0.9911 | nan | 0.9911 | 0.0 | 0.9911 | | 0.0211 | 4.93 | 1720 | 0.0149 | 0.4965 | 0.9930 | 0.9930 | nan | 0.9930 | 0.0 | 0.9930 | | 0.0136 | 4.99 | 1740 | 0.0144 | 0.4964 | 0.9928 | 0.9928 | nan | 0.9928 | 0.0 | 0.9928 | | 0.0129 | 5.04 | 1760 | 0.0142 | 0.4967 | 0.9934 | 0.9934 | nan | 0.9934 | 0.0 | 0.9934 | | 0.0176 | 5.1 | 1780 | 0.0142 | 0.4965 | 0.9930 | 0.9930 | nan | 0.9930 | 0.0 | 0.9930 | | 0.0119 | 5.16 | 1800 | 0.0141 | 0.4958 | 0.9916 | 0.9916 | nan | 0.9916 | 0.0 | 0.9916 | | 0.021 | 5.21 | 1820 | 0.0143 | 0.4960 | 0.9920 | 0.9920 | nan | 0.9920 | 0.0 | 0.9920 | | 0.0146 | 5.27 | 1840 | 0.0137 | 0.4961 | 0.9922 | 0.9922 | nan | 0.9922 | 0.0 | 0.9922 | | 0.0158 | 5.33 | 1860 | 0.0138 | 0.4953 | 0.9905 | 0.9905 | nan | 0.9905 | 0.0 | 0.9905 | | 0.014 | 5.39 | 1880 | 0.0142 | 0.4956 | 0.9913 | 0.9913 | nan | 0.9913 | 0.0 | 0.9913 | | 0.0145 | 5.44 | 1900 | 0.0145 | 0.4952 | 0.9905 | 0.9905 | nan | 0.9905 | 0.0 | 0.9905 | | 0.019 | 5.5 | 1920 | 0.0145 | 0.4960 | 0.9920 | 0.9920 | nan | 0.9920 | 0.0 | 0.9920 | | 0.0134 | 5.56 | 1940 | 0.0143 | 0.4958 | 0.9915 | 0.9915 | nan | 0.9915 | 0.0 | 0.9915 | | 0.011 | 5.62 | 1960 | 0.0141 | 0.4955 | 0.9910 | 0.9910 | nan | 0.9910 | 0.0 | 0.9910 | | 0.0159 | 5.67 | 1980 | 0.0143 | 0.4971 | 0.9942 | 0.9942 | nan | 0.9942 | 0.0 | 0.9942 | | 0.0132 | 5.73 | 2000 | 0.0140 | 0.4966 | 0.9933 | 0.9933 | nan | 0.9933 | 0.0 | 0.9933 | | 0.017 | 5.79 | 2020 | 0.0136 | 0.4964 | 0.9928 | 0.9928 | nan | 0.9928 | 0.0 | 0.9928 | | 0.0156 | 5.85 | 2040 | 0.0139 | 0.4951 | 0.9902 | 0.9902 | nan | 0.9902 | 0.0 | 0.9902 | | 0.0169 | 5.9 | 2060 | 0.0142 | 0.4943 | 0.9887 | 
0.9887 | nan | 0.9887 | 0.0 | 0.9887 | | 0.0337 | 5.96 | 2080 | 0.0145 | 0.4967 | 0.9933 | 0.9933 | nan | 0.9933 | 0.0 | 0.9933 | | 0.0158 | 6.02 | 2100 | 0.0141 | 0.4949 | 0.9898 | 0.9898 | nan | 0.9898 | 0.0 | 0.9898 | | 0.0401 | 6.07 | 2120 | 0.0139 | 0.4956 | 0.9912 | 0.9912 | nan | 0.9912 | 0.0 | 0.9912 | | 0.0629 | 6.13 | 2140 | 0.0138 | 0.4952 | 0.9904 | 0.9904 | nan | 0.9904 | 0.0 | 0.9904 | | 0.0143 | 6.19 | 2160 | 0.0142 | 0.4967 | 0.9935 | 0.9935 | nan | 0.9935 | 0.0 | 0.9935 | | 0.0133 | 6.25 | 2180 | 0.0135 | 0.4957 | 0.9915 | 0.9915 | nan | 0.9915 | 0.0 | 0.9915 | | 0.0326 | 6.3 | 2200 | 0.0139 | 0.4963 | 0.9925 | 0.9925 | nan | 0.9925 | 0.0 | 0.9925 | | 0.0141 | 6.36 | 2220 | 0.0133 | 0.4955 | 0.9910 | 0.9910 | nan | 0.9910 | 0.0 | 0.9910 | | 0.0119 | 6.42 | 2240 | 0.0134 | 0.4958 | 0.9915 | 0.9915 | nan | 0.9915 | 0.0 | 0.9915 | | 0.0133 | 6.48 | 2260 | 0.0139 | 0.4962 | 0.9924 | 0.9924 | nan | 0.9924 | 0.0 | 0.9924 | | 0.0123 | 6.53 | 2280 | 0.0138 | 0.4967 | 0.9934 | 0.9934 | nan | 0.9934 | 0.0 | 0.9934 | | 0.014 | 6.59 | 2300 | 0.0138 | 0.4962 | 0.9925 | 0.9925 | nan | 0.9925 | 0.0 | 0.9925 | | 0.0137 | 6.65 | 2320 | 0.0136 | 0.4958 | 0.9916 | 0.9916 | nan | 0.9916 | 0.0 | 0.9916 | | 0.0173 | 6.7 | 2340 | 0.0138 | 0.4964 | 0.9928 | 0.9928 | nan | 0.9928 | 0.0 | 0.9928 | | 0.0137 | 6.76 | 2360 | 0.0136 | 0.4953 | 0.9905 | 0.9905 | nan | 0.9905 | 0.0 | 0.9905 | | 0.0153 | 6.82 | 2380 | 0.0134 | 0.4958 | 0.9916 | 0.9916 | nan | 0.9916 | 0.0 | 0.9916 | | 0.0135 | 6.88 | 2400 | 0.0137 | 0.4963 | 0.9926 | 0.9926 | nan | 0.9926 | 0.0 | 0.9926 | | 0.0151 | 6.93 | 2420 | 0.0137 | 0.4952 | 0.9904 | 0.9904 | nan | 0.9904 | 0.0 | 0.9904 | | 0.0122 | 6.99 | 2440 | 0.0134 | 0.4959 | 0.9918 | 0.9918 | nan | 0.9918 | 0.0 | 0.9918 | | 0.013 | 7.05 | 2460 | 0.0135 | 0.4970 | 0.9941 | 0.9941 | nan | 0.9941 | 0.0 | 0.9941 | | 0.0134 | 7.11 | 2480 | 0.0133 | 0.4964 | 0.9928 | 0.9928 | nan | 0.9928 | 0.0 | 0.9928 | | 0.0145 | 7.16 | 2500 | 0.0134 | 0.4962 | 0.9924 | 0.9924 | nan | 0.9924 | 0.0 | 0.9924 | | 0.028 | 7.22 | 2520 | 0.0135 | 0.4962 | 0.9924 | 0.9924 | nan | 0.9924 | 0.0 | 0.9924 | | 0.0288 | 7.28 | 2540 | 0.0137 | 0.4967 | 0.9933 | 0.9933 | nan | 0.9933 | 0.0 | 0.9933 | | 0.0117 | 7.34 | 2560 | 0.0135 | 0.4964 | 0.9927 | 0.9927 | nan | 0.9927 | 0.0 | 0.9927 | | 0.013 | 7.39 | 2580 | 0.0136 | 0.4966 | 0.9932 | 0.9932 | nan | 0.9932 | 0.0 | 0.9932 | | 0.0158 | 7.45 | 2600 | 0.0134 | 0.4950 | 0.9899 | 0.9899 | nan | 0.9899 | 0.0 | 0.9899 | | 0.0135 | 7.51 | 2620 | 0.0134 | 0.4964 | 0.9928 | 0.9928 | nan | 0.9928 | 0.0 | 0.9928 | | 0.0136 | 7.56 | 2640 | 0.0140 | 0.4967 | 0.9935 | 0.9935 | nan | 0.9935 | 0.0 | 0.9935 | | 0.0396 | 7.62 | 2660 | 0.0133 | 0.4961 | 0.9922 | 0.9922 | nan | 0.9922 | 0.0 | 0.9922 | | 0.0109 | 7.68 | 2680 | 0.0134 | 0.4963 | 0.9925 | 0.9925 | nan | 0.9925 | 0.0 | 0.9925 | | 0.0148 | 7.74 | 2700 | 0.0133 | 0.4963 | 0.9925 | 0.9925 | nan | 0.9925 | 0.0 | 0.9925 | | 0.0121 | 7.79 | 2720 | 0.0140 | 0.4945 | 0.9890 | 0.9890 | nan | 0.9890 | 0.0 | 0.9890 | | 0.0109 | 7.85 | 2740 | 0.0139 | 0.4957 | 0.9913 | 0.9913 | nan | 0.9913 | 0.0 | 0.9913 | | 0.014 | 7.91 | 2760 | 0.0135 | 0.4957 | 0.9915 | 0.9915 | nan | 0.9915 | 0.0 | 0.9915 | | 0.0199 | 7.97 | 2780 | 0.0134 | 0.4959 | 0.9917 | 0.9917 | nan | 0.9917 | 0.0 | 0.9917 | | 0.0119 | 8.02 | 2800 | 0.0136 | 0.4958 | 0.9916 | 0.9916 | nan | 0.9916 | 0.0 | 0.9916 | | 0.0129 | 8.08 | 2820 | 0.0136 | 0.4962 | 0.9924 | 0.9924 | nan | 0.9924 | 0.0 | 0.9924 | | 0.0108 | 8.14 | 2840 | 0.0134 | 0.4959 | 0.9917 | 0.9917 | nan 
| 0.9917 | 0.0 | 0.9917 | | 0.0209 | 8.19 | 2860 | 0.0136 | 0.4960 | 0.9920 | 0.9920 | nan | 0.9920 | 0.0 | 0.9920 | | 0.0154 | 8.25 | 2880 | 0.0137 | 0.4964 | 0.9928 | 0.9928 | nan | 0.9928 | 0.0 | 0.9928 | | 0.0141 | 8.31 | 2900 | 0.0132 | 0.4965 | 0.9929 | 0.9929 | nan | 0.9929 | 0.0 | 0.9929 | | 0.0187 | 8.37 | 2920 | 0.0131 | 0.4956 | 0.9912 | 0.9912 | nan | 0.9912 | 0.0 | 0.9912 | | 0.0124 | 8.42 | 2940 | 0.0133 | 0.4959 | 0.9918 | 0.9918 | nan | 0.9918 | 0.0 | 0.9918 | | 0.0135 | 8.48 | 2960 | 0.0132 | 0.4963 | 0.9926 | 0.9926 | nan | 0.9926 | 0.0 | 0.9926 | | 0.0283 | 8.54 | 2980 | 0.0131 | 0.4958 | 0.9917 | 0.9917 | nan | 0.9917 | 0.0 | 0.9917 | | 0.0691 | 8.6 | 3000 | 0.0131 | 0.4965 | 0.9930 | 0.9930 | nan | 0.9930 | 0.0 | 0.9930 | | 0.0142 | 8.65 | 3020 | 0.0131 | 0.4965 | 0.9929 | 0.9929 | nan | 0.9929 | 0.0 | 0.9929 | | 0.0155 | 8.71 | 3040 | 0.0130 | 0.4966 | 0.9931 | 0.9931 | nan | 0.9931 | 0.0 | 0.9931 | | 0.0115 | 8.77 | 3060 | 0.0129 | 0.4966 | 0.9932 | 0.9932 | nan | 0.9932 | 0.0 | 0.9932 | | 0.0095 | 8.83 | 3080 | 0.0130 | 0.4963 | 0.9927 | 0.9927 | nan | 0.9927 | 0.0 | 0.9927 | | 0.012 | 8.88 | 3100 | 0.0132 | 0.4954 | 0.9907 | 0.9907 | nan | 0.9907 | 0.0 | 0.9907 | | 0.0153 | 8.94 | 3120 | 0.0132 | 0.4965 | 0.9930 | 0.9930 | nan | 0.9930 | 0.0 | 0.9930 | | 0.0141 | 9.0 | 3140 | 0.0134 | 0.4958 | 0.9917 | 0.9917 | nan | 0.9917 | 0.0 | 0.9917 | | 0.0141 | 9.05 | 3160 | 0.0133 | 0.4958 | 0.9915 | 0.9915 | nan | 0.9915 | 0.0 | 0.9915 | | 0.016 | 9.11 | 3180 | 0.0133 | 0.4964 | 0.9929 | 0.9929 | nan | 0.9929 | 0.0 | 0.9929 | | 0.017 | 9.17 | 3200 | 0.0132 | 0.4965 | 0.9929 | 0.9929 | nan | 0.9929 | 0.0 | 0.9929 | | 0.0245 | 9.23 | 3220 | 0.0132 | 0.4961 | 0.9921 | 0.9921 | nan | 0.9921 | 0.0 | 0.9921 | | 0.0101 | 9.28 | 3240 | 0.0132 | 0.4962 | 0.9924 | 0.9924 | nan | 0.9924 | 0.0 | 0.9924 | | 0.012 | 9.34 | 3260 | 0.0133 | 0.4959 | 0.9917 | 0.9917 | nan | 0.9917 | 0.0 | 0.9917 | | 0.0111 | 9.4 | 3280 | 0.0133 | 0.4964 | 0.9928 | 0.9928 | nan | 0.9928 | 0.0 | 0.9928 | | 0.0148 | 9.46 | 3300 | 0.0132 | 0.4962 | 0.9925 | 0.9925 | nan | 0.9925 | 0.0 | 0.9925 | | 0.0124 | 9.51 | 3320 | 0.0135 | 0.4967 | 0.9934 | 0.9934 | nan | 0.9934 | 0.0 | 0.9934 | | 0.0209 | 9.57 | 3340 | 0.0133 | 0.4963 | 0.9926 | 0.9926 | nan | 0.9926 | 0.0 | 0.9926 | | 0.0134 | 9.63 | 3360 | 0.0132 | 0.4960 | 0.9920 | 0.9920 | nan | 0.9920 | 0.0 | 0.9920 | | 0.0146 | 9.68 | 3380 | 0.0132 | 0.4958 | 0.9916 | 0.9916 | nan | 0.9916 | 0.0 | 0.9916 | | 0.0217 | 9.74 | 3400 | 0.0132 | 0.4961 | 0.9923 | 0.9923 | nan | 0.9923 | 0.0 | 0.9923 | | 0.0142 | 9.8 | 3420 | 0.0131 | 0.4961 | 0.9923 | 0.9923 | nan | 0.9923 | 0.0 | 0.9923 | | 0.0134 | 9.86 | 3440 | 0.0131 | 0.4959 | 0.9918 | 0.9918 | nan | 0.9918 | 0.0 | 0.9918 | | 0.0131 | 9.91 | 3460 | 0.0131 | 0.4960 | 0.9920 | 0.9920 | nan | 0.9920 | 0.0 | 0.9920 | | 0.0136 | 9.97 | 3480 | 0.0131 | 0.4961 | 0.9922 | 0.9922 | nan | 0.9922 | 0.0 | 0.9922 | ### Framework versions - Transformers 4.35.0 - Pytorch 2.2.2 - Datasets 2.14.6 - Tokenizers 0.14.1
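Since this card reports a two-class head (the two labels are listed just below), a prediction reduces to a single boolean object mask. A minimal inference sketch, assuming a local `example.png` as input and using the processor's built-in post-processing to resize the logits:

```python
from transformers import SegformerImageProcessor, AutoModelForSemanticSegmentation
from PIL import Image
import torch

ckpt = "vigneshgs7/segformer-b5-p142-cvat-vgs"
processor = SegformerImageProcessor.from_pretrained(ckpt)
model = AutoModelForSemanticSegmentation.from_pretrained(ckpt)

image = Image.open("example.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Upsample the logits to the input resolution and take the per-pixel argmax.
pred_seg = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]

object_mask = pred_seg == 1  # True wherever the "object" class is predicted
print(f"object pixels: {object_mask.float().mean().item():.2%}")
```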
[ "background", "object" ]
chribark/segformer-b3-finetuned-UAVid
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b3-finetuned-UAVid This model is a fine-tuned version of [nvidia/segformer-b3-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b3-finetuned-ade-512-512) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.2115 - Mean Iou: 0.6365 - Mean Accuracy: 0.7005 - Overall Accuracy: 0.9263 - Accuracy Wall: nan - Accuracy Building: 0.9535 - Accuracy Sky: nan - Accuracy Floor: nan - Accuracy Tree: 0.9415 - Accuracy Ceiling: nan - Accuracy Road: 0.8948 - Accuracy Bed : nan - Accuracy Windowpane: nan - Accuracy Grass: nan - Accuracy Cabinet: nan - Accuracy Sidewalk: nan - Accuracy Person: 0.0038 - Accuracy Earth: nan - Accuracy Door: nan - Accuracy Table: nan - Accuracy Mountain: nan - Accuracy Plant: nan - Accuracy Curtain: nan - Accuracy Chair: nan - Accuracy Car: 0.7086 - Accuracy Water: nan - Accuracy Painting: nan - Accuracy Sofa: nan - Accuracy Shelf: nan - Accuracy House: nan - Accuracy Sea: nan - Accuracy Mirror: nan - Accuracy Rug: nan - Accuracy Field: nan - Accuracy Armchair: nan - Accuracy Seat: nan - Accuracy Fence: nan - Accuracy Desk: nan - Accuracy Rock: nan - Accuracy Wardrobe: nan - Accuracy Lamp: nan - Accuracy Bathtub: nan - Accuracy Railing: nan - Accuracy Cushion: nan - Accuracy Base: nan - Accuracy Box: nan - Accuracy Column: nan - Accuracy Signboard: nan - Accuracy Chest of drawers: nan - Accuracy Counter: nan - Accuracy Sand: nan - Accuracy Sink: nan - Accuracy Skyscraper: nan - Accuracy Fireplace: nan - Accuracy Refrigerator: nan - Accuracy Grandstand: nan - Accuracy Path: nan - Accuracy Stairs: nan - Accuracy Runway: nan - Accuracy Case: nan - Accuracy Pool table: nan - Accuracy Pillow: nan - Accuracy Screen door: nan - Accuracy Stairway: nan - Accuracy River: nan - Accuracy Bridge: nan - Accuracy Bookcase: nan - Accuracy Blind: nan - Accuracy Coffee table: nan - Accuracy Toilet: nan - Accuracy Flower: nan - Accuracy Book: nan - Accuracy Hill: nan - Accuracy Bench: nan - Accuracy Countertop: nan - Accuracy Stove: nan - Accuracy Palm: nan - Accuracy Kitchen island: nan - Accuracy Computer: nan - Accuracy Swivel chair: nan - Accuracy Boat: nan - Accuracy Bar: nan - Accuracy Arcade machine: nan - Accuracy Hovel: nan - Accuracy Bus: nan - Accuracy Towel: nan - Accuracy Light: nan - Accuracy Truck: nan - Accuracy Tower: nan - Accuracy Chandelier: nan - Accuracy Awning: nan - Accuracy Streetlight: nan - Accuracy Booth: nan - Accuracy Television receiver: nan - Accuracy Airplane: nan - Accuracy Dirt track: nan - Accuracy Apparel: nan - Accuracy Pole: nan - Accuracy Land: nan - Accuracy Bannister: nan - Accuracy Escalator: nan - Accuracy Ottoman: nan - Accuracy Bottle: nan - Accuracy Buffet: nan - Accuracy Poster: nan - Accuracy Stage: nan - Accuracy Van: nan - Accuracy Ship: nan - Accuracy Fountain: nan - Accuracy Conveyer belt: nan - Accuracy Canopy: nan - Accuracy Washer: nan - Accuracy Plaything: nan - Accuracy Swimming pool: nan - Accuracy Stool: nan - Accuracy Barrel: nan - Accuracy Basket: nan - Accuracy Waterfall: nan - Accuracy Tent: nan - Accuracy Bag: nan - Accuracy Minibike: nan - Accuracy Cradle: nan - Accuracy Oven: nan - Accuracy Ball: nan - Accuracy Food: nan - Accuracy Step: nan - Accuracy Tank: nan - Accuracy Trade name: nan - Accuracy Microwave: nan - Accuracy Pot: nan - Accuracy Animal: nan - Accuracy Bicycle: nan - 
Accuracy Lake: nan - Accuracy Dishwasher: nan - Accuracy Screen: nan - Accuracy Blanket: nan - Accuracy Sculpture: nan - Accuracy Hood: nan - Accuracy Sconce: nan - Accuracy Vase: nan - Accuracy Traffic light: nan - Accuracy Tray: nan - Accuracy Ashcan: nan - Accuracy Fan: nan - Accuracy Pier: nan - Accuracy Crt screen: nan - Accuracy Plate: nan - Accuracy Monitor: nan - Accuracy Bulletin board: nan - Accuracy Shower: nan - Accuracy Radiator: nan - Accuracy Glass: nan - Accuracy Clock: nan - Accuracy Flag: nan - Iou Wall: nan - Iou Building: 0.9105 - Iou Sky: nan - Iou Floor: nan - Iou Tree: 0.8818 - Iou Ceiling: nan - Iou Road: 0.8152 - Iou Bed : nan - Iou Windowpane: nan - Iou Grass: nan - Iou Cabinet: nan - Iou Sidewalk: nan - Iou Person: 0.0038 - Iou Earth: nan - Iou Door: nan - Iou Table: nan - Iou Mountain: nan - Iou Plant: nan - Iou Curtain: nan - Iou Chair: nan - Iou Car: 0.5711 - Iou Water: nan - Iou Painting: nan - Iou Sofa: nan - Iou Shelf: nan - Iou House: nan - Iou Sea: nan - Iou Mirror: nan - Iou Rug: nan - Iou Field: nan - Iou Armchair: nan - Iou Seat: nan - Iou Fence: nan - Iou Desk: nan - Iou Rock: nan - Iou Wardrobe: nan - Iou Lamp: nan - Iou Bathtub: nan - Iou Railing: nan - Iou Cushion: nan - Iou Base: nan - Iou Box: nan - Iou Column: nan - Iou Signboard: nan - Iou Chest of drawers: nan - Iou Counter: nan - Iou Sand: nan - Iou Sink: nan - Iou Skyscraper: nan - Iou Fireplace: nan - Iou Refrigerator: nan - Iou Grandstand: nan - Iou Path: nan - Iou Stairs: nan - Iou Runway: nan - Iou Case: nan - Iou Pool table: nan - Iou Pillow: nan - Iou Screen door: nan - Iou Stairway: nan - Iou River: nan - Iou Bridge: nan - Iou Bookcase: nan - Iou Blind: nan - Iou Coffee table: nan - Iou Toilet: nan - Iou Flower: nan - Iou Book: nan - Iou Hill: nan - Iou Bench: nan - Iou Countertop: nan - Iou Stove: nan - Iou Palm: nan - Iou Kitchen island: nan - Iou Computer: nan - Iou Swivel chair: nan - Iou Boat: nan - Iou Bar: nan - Iou Arcade machine: nan - Iou Hovel: nan - Iou Bus: nan - Iou Towel: nan - Iou Light: nan - Iou Truck: nan - Iou Tower: nan - Iou Chandelier: nan - Iou Awning: nan - Iou Streetlight: nan - Iou Booth: nan - Iou Television receiver: nan - Iou Airplane: nan - Iou Dirt track: nan - Iou Apparel: nan - Iou Pole: nan - Iou Land: nan - Iou Bannister: nan - Iou Escalator: nan - Iou Ottoman: nan - Iou Bottle: nan - Iou Buffet: nan - Iou Poster: nan - Iou Stage: nan - Iou Van: nan - Iou Ship: nan - Iou Fountain: nan - Iou Conveyer belt: nan - Iou Canopy: nan - Iou Washer: nan - Iou Plaything: nan - Iou Swimming pool: nan - Iou Stool: nan - Iou Barrel: nan - Iou Basket: nan - Iou Waterfall: nan - Iou Tent: nan - Iou Bag: nan - Iou Minibike: nan - Iou Cradle: nan - Iou Oven: nan - Iou Ball: nan - Iou Food: nan - Iou Step: nan - Iou Tank: nan - Iou Trade name: nan - Iou Microwave: nan - Iou Pot: nan - Iou Animal: nan - Iou Bicycle: nan - Iou Lake: nan - Iou Dishwasher: nan - Iou Screen: nan - Iou Blanket: nan - Iou Sculpture: nan - Iou Hood: nan - Iou Sconce: nan - Iou Vase: nan - Iou Traffic light: nan - Iou Tray: nan - Iou Ashcan: nan - Iou Fan: nan - Iou Pier: nan - Iou Crt screen: nan - Iou Plate: nan - Iou Monitor: nan - Iou Bulletin board: nan - Iou Shower: nan - Iou Radiator: nan - Iou Glass: nan - Iou Clock: nan - Iou Flag: nan ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used 
during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 15 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Wall | Accuracy Building | Accuracy Sky | Accuracy Floor | Accuracy Tree | Accuracy Ceiling | Accuracy Road | Accuracy Bed | Accuracy Windowpane | Accuracy Grass | Accuracy Cabinet | Accuracy Sidewalk | Accuracy Person | Accuracy Earth | Accuracy Door | Accuracy Table | Accuracy Mountain | Accuracy Plant | Accuracy Curtain | Accuracy Chair | Accuracy Car | Accuracy Water | Accuracy Painting | Accuracy Sofa | Accuracy Shelf | Accuracy House | Accuracy Sea | Accuracy Mirror | Accuracy Rug | Accuracy Field | Accuracy Armchair | Accuracy Seat | Accuracy Fence | Accuracy Desk | Accuracy Rock | Accuracy Wardrobe | Accuracy Lamp | Accuracy Bathtub | Accuracy Railing | Accuracy Cushion | Accuracy Base | Accuracy Box | Accuracy Column | Accuracy Signboard | Accuracy Chest of drawers | Accuracy Counter | Accuracy Sand | Accuracy Sink | Accuracy Skyscraper | Accuracy Fireplace | Accuracy Refrigerator | Accuracy Grandstand | Accuracy Path | Accuracy Stairs | Accuracy Runway | Accuracy Case | Accuracy Pool table | Accuracy Pillow | Accuracy Screen door | Accuracy Stairway | Accuracy River | Accuracy Bridge | Accuracy Bookcase | Accuracy Blind | Accuracy Coffee table | Accuracy Toilet | Accuracy Flower | Accuracy Book | Accuracy Hill | Accuracy Bench | Accuracy Countertop | Accuracy Stove | Accuracy Palm | Accuracy Kitchen island | Accuracy Computer | Accuracy Swivel chair | Accuracy Boat | Accuracy Bar | Accuracy Arcade machine | Accuracy Hovel | Accuracy Bus | Accuracy Towel | Accuracy Light | Accuracy Truck | Accuracy Tower | Accuracy Chandelier | Accuracy Awning | Accuracy Streetlight | Accuracy Booth | Accuracy Television receiver | Accuracy Airplane | Accuracy Dirt track | Accuracy Apparel | Accuracy Pole | Accuracy Land | Accuracy Bannister | Accuracy Escalator | Accuracy Ottoman | Accuracy Bottle | Accuracy Buffet | Accuracy Poster | Accuracy Stage | Accuracy Van | Accuracy Ship | Accuracy Fountain | Accuracy Conveyer belt | Accuracy Canopy | Accuracy Washer | Accuracy Plaything | Accuracy Swimming pool | Accuracy Stool | Accuracy Barrel | Accuracy Basket | Accuracy Waterfall | Accuracy Tent | Accuracy Bag | Accuracy Minibike | Accuracy Cradle | Accuracy Oven | Accuracy Ball | Accuracy Food | Accuracy Step | Accuracy Tank | Accuracy Trade name | Accuracy Microwave | Accuracy Pot | Accuracy Animal | Accuracy Bicycle | Accuracy Lake | Accuracy Dishwasher | Accuracy Screen | Accuracy Blanket | Accuracy Sculpture | Accuracy Hood | Accuracy Sconce | Accuracy Vase | Accuracy Traffic light | Accuracy Tray | Accuracy Ashcan | Accuracy Fan | Accuracy Pier | Accuracy Crt screen | Accuracy Plate | Accuracy Monitor | Accuracy Bulletin board | Accuracy Shower | Accuracy Radiator | Accuracy Glass | Accuracy Clock | Accuracy Flag | Iou Wall | Iou Building | Iou Sky | Iou Floor | Iou Tree | Iou Ceiling | Iou Road | Iou Bed | Iou Windowpane | Iou Grass | Iou Cabinet | Iou Sidewalk | Iou Person | Iou Earth | Iou Door | Iou Table | Iou Mountain | Iou Plant | Iou Curtain | Iou Chair | Iou Car | Iou Water | Iou Painting | Iou Sofa | Iou Shelf | Iou House | Iou Sea | Iou Mirror | Iou Rug | Iou Field | Iou Armchair | Iou Seat | Iou Fence | Iou Desk | Iou Rock | Iou Wardrobe | Iou Lamp | Iou Bathtub 
| Iou Railing | Iou Cushion | Iou Base | Iou Box | Iou Column | Iou Signboard | Iou Chest of drawers | Iou Counter | Iou Sand | Iou Sink | Iou Skyscraper | Iou Fireplace | Iou Refrigerator | Iou Grandstand | Iou Path | Iou Stairs | Iou Runway | Iou Case | Iou Pool table | Iou Pillow | Iou Screen door | Iou Stairway | Iou River | Iou Bridge | Iou Bookcase | Iou Blind | Iou Coffee table | Iou Toilet | Iou Flower | Iou Book | Iou Hill | Iou Bench | Iou Countertop | Iou Stove | Iou Palm | Iou Kitchen island | Iou Computer | Iou Swivel chair | Iou Boat | Iou Bar | Iou Arcade machine | Iou Hovel | Iou Bus | Iou Towel | Iou Light | Iou Truck | Iou Tower | Iou Chandelier | Iou Awning | Iou Streetlight | Iou Booth | Iou Television receiver | Iou Airplane | Iou Dirt track | Iou Apparel | Iou Pole | Iou Land | Iou Bannister | Iou Escalator | Iou Ottoman | Iou Bottle | Iou Buffet | Iou Poster | Iou Stage | Iou Van | Iou Ship | Iou Fountain | Iou Conveyer belt | Iou Canopy | Iou Washer | Iou Plaything | Iou Swimming pool | Iou Stool | Iou Barrel | Iou Basket | Iou Waterfall | Iou Tent | Iou Bag | Iou Minibike | Iou Cradle | Iou Oven | Iou Ball | Iou Food | Iou Step | Iou Tank | Iou Trade name | Iou Microwave | Iou Pot | Iou Animal | Iou Bicycle | Iou Lake | Iou Dishwasher | Iou Screen | Iou Blanket | Iou Sculpture | Iou Hood | Iou Sconce | Iou Vase | Iou Traffic light | Iou Tray | Iou Ashcan | Iou Fan | Iou Pier | Iou Crt screen | Iou Plate | Iou Monitor | Iou Bulletin board | Iou Shower | Iou Radiator | Iou Glass | Iou Clock | Iou Flag | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------:|:-----------------:|:------------:|:--------------:|:-------------:|:----------------:|:-------------:|:-------------:|:-------------------:|:--------------:|:----------------:|:-----------------:|:---------------:|:--------------:|:-------------:|:--------------:|:-----------------:|:--------------:|:----------------:|:--------------:|:------------:|:--------------:|:-----------------:|:-------------:|:--------------:|:--------------:|:------------:|:---------------:|:------------:|:--------------:|:-----------------:|:-------------:|:--------------:|:-------------:|:-------------:|:-----------------:|:-------------:|:----------------:|:----------------:|:----------------:|:-------------:|:------------:|:---------------:|:------------------:|:-------------------------:|:----------------:|:-------------:|:-------------:|:-------------------:|:------------------:|:---------------------:|:-------------------:|:-------------:|:---------------:|:---------------:|:-------------:|:-------------------:|:---------------:|:--------------------:|:-----------------:|:--------------:|:---------------:|:-----------------:|:--------------:|:---------------------:|:---------------:|:---------------:|:-------------:|:-------------:|:--------------:|:-------------------:|:--------------:|:-------------:|:-----------------------:|:-----------------:|:---------------------:|:-------------:|:------------:|:-----------------------:|:--------------:|:------------:|:--------------:|:--------------:|:--------------:|:--------------:|:-------------------:|:---------------:|:--------------------:|:--------------:|:----------------------------:|:-----------------:|:-------------------:|:----------------:|:-------------:|:-------------:|:------------------:|:------------------:|:----------------:|:---------------:|:---------------:|:---------------:|:--------------:|:------------:|:----
---------:|:-----------------:|:----------------------:|:---------------:|:---------------:|:------------------:|:----------------------:|:--------------:|:---------------:|:---------------:|:------------------:|:-------------:|:------------:|:-----------------:|:---------------:|:-------------:|:-------------:|:-------------:|:-------------:|:-------------:|:-------------------:|:------------------:|:------------:|:---------------:|:----------------:|:-------------:|:-------------------:|:---------------:|:----------------:|:------------------:|:-------------:|:---------------:|:-------------:|:----------------------:|:-------------:|:---------------:|:------------:|:-------------:|:-------------------:|:--------------:|:----------------:|:-----------------------:|:---------------:|:-----------------:|:--------------:|:--------------:|:-------------:|:--------:|:------------:|:-------:|:---------:|:--------:|:-----------:|:--------:|:--------:|:--------------:|:---------:|:-----------:|:------------:|:----------:|:---------:|:--------:|:---------:|:------------:|:---------:|:-----------:|:---------:|:-------:|:---------:|:------------:|:--------:|:---------:|:---------:|:-------:|:----------:|:-------:|:---------:|:------------:|:--------:|:---------:|:--------:|:--------:|:------------:|:--------:|:-----------:|:-----------:|:-----------:|:--------:|:-------:|:----------:|:-------------:|:--------------------:|:-----------:|:--------:|:--------:|:--------------:|:-------------:|:----------------:|:--------------:|:--------:|:----------:|:----------:|:--------:|:--------------:|:----------:|:---------------:|:------------:|:---------:|:----------:|:------------:|:---------:|:----------------:|:----------:|:----------:|:--------:|:--------:|:---------:|:--------------:|:---------:|:--------:|:------------------:|:------------:|:----------------:|:--------:|:-------:|:------------------:|:---------:|:-------:|:---------:|:---------:|:---------:|:---------:|:--------------:|:----------:|:---------------:|:---------:|:-----------------------:|:------------:|:--------------:|:-----------:|:--------:|:--------:|:-------------:|:-------------:|:-----------:|:----------:|:----------:|:----------:|:---------:|:-------:|:--------:|:------------:|:-----------------:|:----------:|:----------:|:-------------:|:-----------------:|:---------:|:----------:|:----------:|:-------------:|:--------:|:-------:|:------------:|:----------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------------:|:-------------:|:-------:|:----------:|:-----------:|:--------:|:--------------:|:----------:|:-----------:|:-------------:|:--------:|:----------:|:--------:|:-----------------:|:--------:|:----------:|:-------:|:--------:|:--------------:|:---------:|:-----------:|:------------------:|:----------:|:------------:|:---------:|:---------:|:--------:| | 1.6727 | 0.25 | 20 | 0.7288 | 0.0610 | 0.5942 | 0.8568 | nan | 0.8982 | nan | nan | 0.9055 | nan | 0.7829 | nan | nan | nan | nan | nan | 0.0 | nan | nan | nan | nan | nan | nan | nan | 0.3843 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan 
Per-class accuracy and IoU columns, which were `nan` for most labels at every evaluation step, are omitted; only the aggregate metrics are reproduced below.

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|
| 0.8408        | 0.5   | 40   | 0.5517          | 0.1077   | 0.5647        | 0.8553           |
| 0.6627        | 0.75  | 60   | 0.5441          | 0.1379   | 0.5611        | 0.8600           |
| 0.5375        | 1.0   | 80   | 0.3639          | 0.2082   | 0.6031        | 0.8892           |
| 0.6152        | 1.25  | 100  | 0.3272          | 0.3115   | 0.6320        | 0.8924           |
| 0.3287        | 1.5   | 120  | 0.3285          | 0.4038   | 0.6368        | 0.8931           |
| 0.3504        | 1.75  | 140  | 0.2870          | 0.3619   | 0.6522        | 0.9008           |
| 0.3755        | 2.0   | 160  | 0.2640          | 0.4847   | 0.6449        | 0.9053           |
| 0.4509        | 2.25  | 180  | 0.2537          | 0.4893   | 0.6500        | 0.9066           |
| 0.2465        | 2.5   | 200  | 0.2546          | 0.4916   | 0.6554        | 0.9078           |
| 0.2378        | 2.75  | 220  | 0.2560          | 0.4976   | 0.6709        | 0.9092           |
| 0.3245        | 3.0   | 240  | 0.2574          | 0.4764   | 0.6270        | 0.9066           |
| 0.2053        | 3.25  | 260  | 0.2437          | 0.5042   | 0.6795        | 0.9120           |
| 0.2223        | 3.5   | 280  | 0.2430          | 0.5079   | 0.6889        | 0.9135           |
| 0.2291        | 3.75  | 300  | 0.2328          | 0.5073   | 0.6747        | 0.9152           |
| 0.3919        | 4.0   | 320  | 0.2346          | 0.5115   | 0.6894        | 0.9159           |
| 0.1595        | 4.25  | 340  | 0.2241          | 0.5131   | 0.6903        | 0.9172           |
| 0.2397        | 4.5   | 360  | 0.2301          | 0.6103   | 0.6711        | 0.9173           |
| 0.248         | 4.75  | 380  | 0.2289          | 0.5150   | 0.6901        | 0.9169           |
| 0.1986        | 5.0   | 400  | 0.2282          | 0.6163   | 0.6799        | 0.9182           |
| 0.1553        | 5.25  | 420  | 0.2216          | 0.6194   | 0.6891        | 0.9188           |
| 0.1767        | 5.5   | 440  | 0.2197          | 0.6188   | 0.6839        | 0.9192           |
| 0.2381        | 5.75  | 460  | 0.2221          | 0.6219   | 0.6951        | 0.9192           |
| 0.1749        | 6.0   | 480  | 0.2223          | 0.6233   | 0.6944        | 0.9204           |
| 0.1881        | 6.25  | 500  | 0.2224          | 0.6229   | 0.6910        | 0.9205           |
| 0.1535        | 6.5   | 520  | 0.2170          | 0.6250   | 0.6987        | 0.9202           |
| 0.2497        | 6.75  | 540  | 0.2191          | 0.6229   | 0.6859        | 0.9207           |
| 0.2103        | 7.0   | 560  | 0.2211          | 0.6263   | 0.6958        | 0.9213           |
| 0.1699        | 7.25  | 580  | 0.2170          | 0.6272   | 0.6969        | 0.9217           |
| 0.1875        | 7.5   | 600  | 0.2174          | 0.6255   | 0.6917        | 0.9226           |
| 0.174         | 7.75  | 620  | 0.2159          | 0.6282   | 0.6946        | 0.9229           |
| 0.1752        | 8.0   | 640  | 0.2141          | 0.6294   | 0.6991        | 0.9229           |
| 0.1544        | 8.25  | 660  | 0.2146          | 0.6281   | 0.6928        | 0.9233           |
| 0.1715        | 8.5   | 680  | 0.2161          | 0.6299   | 0.7027        | 0.9229           |
| 0.1833        | 8.75  | 700  | 0.2145          | 0.6305   | 0.6997        | 0.9238           |
| 0.184         | 9.0   | 720  | 0.2122          | 0.6303   | 0.6929        | 0.9243           |
| 0.1257        | 9.25  | 740  | 0.2145          | 0.6313   | 0.6958        | 0.9242           |
| 0.1228        | 9.5   | 760  | 0.2119          | 0.6321   | 0.6990        | 0.9244           |
| 0.171         | 9.75  | 780  | 0.2158          | 0.6322   | 0.7007        | 0.9241           |
| 0.1556        | 10.0  | 800  | 0.2132          | 0.6324   | 0.6987        | 0.9246           |
nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.155 | 10.25 | 820 | 0.2106 | 0.6347 | 0.7076 | 0.9249 | nan | 0.9557 | nan | nan | 0.9423 | nan | 0.8846 | nan | nan | nan | nan | nan | 0.0059 | nan | nan | nan | nan | nan | nan | nan | 0.7494 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9096 | nan | nan | 0.8800 | nan | 0.8118 | nan | nan | nan | nan | nan | 0.0057 | nan | nan | nan | nan | nan | nan | nan | 0.5664 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.2074 | 10.5 | 840 | 0.2118 | 0.6329 | 0.6972 | 0.9250 | nan | 0.9544 | nan | nan | 0.9453 | nan | 0.8854 | nan | nan | nan | nan | nan | 0.0053 | nan | nan | nan | nan | nan | nan | nan | 0.6958 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9092 | nan | nan | 0.8803 | nan | 0.8114 | nan | nan | nan | nan | nan | 0.0052 | nan | nan | nan | nan | nan | nan | nan | 0.5584 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 
nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.1932 | 10.75 | 860 | 0.2132 | 0.6328 | 0.6958 | 0.9249 | nan | 0.9587 | nan | nan | 0.9330 | nan | 0.8977 | nan | nan | nan | nan | nan | 0.0062 | nan | nan | nan | nan | nan | nan | nan | 0.6836 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9095 | nan | nan | 0.8791 | nan | 0.8128 | nan | nan | nan | nan | nan | 0.0060 | nan | nan | nan | nan | nan | nan | nan | 0.5567 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.143 | 11.0 | 880 | 0.2139 | 0.6355 | 0.7062 | 0.9252 | nan | 0.9561 | nan | nan | 0.9436 | nan | 0.8839 | nan | nan | nan | nan | nan | 0.0077 | nan | nan | nan | nan | nan | nan | nan | 0.7400 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9098 | nan | nan | 0.8801 | nan | 0.8123 | nan | nan | nan | nan | nan | 0.0074 | nan | nan | nan | nan | nan | nan | nan | 0.5680 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 
nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.1663 | 11.25 | 900 | 0.2097 | 0.6331 | 0.6953 | 0.9254 | nan | 0.9549 | nan | nan | 0.9351 | nan | 0.9005 | nan | nan | nan | nan | nan | 0.0038 | nan | nan | nan | nan | nan | nan | nan | 0.6822 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9098 | nan | nan | 0.8801 | nan | 0.8141 | nan | nan | nan | nan | nan | 0.0038 | nan | nan | nan | nan | nan | nan | nan | 0.5577 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.0955 | 11.5 | 920 | 0.2099 | 0.6350 | 0.7011 | 0.9254 | nan | 0.9533 | nan | nan | 0.9384 | nan | 0.8956 | nan | nan | nan | nan | nan | 0.0044 | nan | nan | nan | nan | nan | nan | nan | 0.7139 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 
0.9103 | nan | nan | 0.8802 | nan | 0.8130 | nan | nan | nan | nan | nan | 0.0043 | nan | nan | nan | nan | nan | nan | nan | 0.5670 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.2205 | 11.75 | 940 | 0.2131 | 0.6351 | 0.7024 | 0.9254 | nan | 0.9552 | nan | nan | 0.9419 | nan | 0.8888 | nan | nan | nan | nan | nan | 0.0049 | nan | nan | nan | nan | nan | nan | nan | 0.7214 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9095 | nan | nan | 0.8806 | nan | 0.8127 | nan | nan | nan | nan | nan | 0.0048 | nan | nan | nan | nan | nan | nan | nan | 0.5681 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.1432 | 12.0 | 960 | 0.2128 | 0.6353 | 0.7008 | 0.9256 | nan | 0.9560 | nan | nan | 0.9379 | nan | 0.8946 | nan | nan | nan | nan | nan | 0.0041 | nan | nan | nan | nan | nan | nan | nan | 0.7114 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 
nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9097 | nan | nan | 0.8809 | nan | 0.8136 | nan | nan | nan | nan | nan | 0.0040 | nan | nan | nan | nan | nan | nan | nan | 0.5683 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.171 | 12.25 | 980 | 0.2118 | 0.6357 | 0.7018 | 0.9254 | nan | 0.9547 | nan | nan | 0.9351 | nan | 0.8988 | nan | nan | nan | nan | nan | 0.0062 | nan | nan | nan | nan | nan | nan | nan | 0.7142 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9094 | nan | nan | 0.8807 | nan | 0.8135 | nan | nan | nan | nan | nan | 0.0060 | nan | nan | nan | nan | nan | nan | nan | 0.5687 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.139 | 12.5 | 1000 | 0.2145 | 0.6354 | 0.6986 | 0.9258 | nan | 0.9530 | nan | nan | 0.9436 | nan | 0.8917 | nan | nan | nan | nan | nan | 0.0062 | nan | nan | nan | nan | nan | nan | nan | 0.6983 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 
nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9096 | nan | nan | 0.8815 | nan | 0.8140 | nan | nan | nan | nan | nan | 0.0060 | nan | nan | nan | nan | nan | nan | nan | 0.5657 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.1362 | 12.75 | 1020 | 0.2122 | 0.6356 | 0.6997 | 0.9256 | nan | 0.9582 | nan | nan | 0.9356 | nan | 0.8960 | nan | nan | nan | nan | nan | 0.0057 | nan | nan | nan | nan | nan | nan | nan | 0.7030 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9091 | nan | nan | 0.8809 | nan | 0.8140 | nan | nan | nan | nan | nan | 0.0056 | nan | nan | nan | nan | nan | nan | nan | 0.5682 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.147 | 13.0 | 1040 | 0.2150 | 0.6358 | 0.7003 | 0.9258 | nan | 0.9543 | nan | nan | 0.9433 | nan | 0.8902 | nan | nan | nan | nan | nan | 0.0053 | nan | nan | nan | nan | nan | nan | nan | 0.7085 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 
nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9097 | nan | nan | 0.8816 | nan | 0.8137 | nan | nan | nan | nan | nan | 0.0052 | nan | nan | nan | nan | nan | nan | nan | 0.5690 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.1416 | 13.25 | 1060 | 0.2128 | 0.6357 | 0.6995 | 0.9260 | nan | 0.9545 | nan | nan | 0.9397 | nan | 0.8956 | nan | nan | nan | nan | nan | 0.0046 | nan | nan | nan | nan | nan | nan | nan | 0.7032 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9099 | nan | nan | 0.8817 | nan | 0.8146 | nan | nan | nan | nan | nan | 0.0045 | nan | nan | nan | nan | nan | nan | nan | 0.5680 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.1933 | 13.5 | 1080 | 0.2112 | 0.6363 | 0.7020 | 0.9261 | nan | 
0.9557 | nan | nan | 0.9413 | nan | 0.8917 | nan | nan | nan | nan | nan | 0.0047 | nan | nan | nan | nan | nan | nan | nan | 0.7163 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9098 | nan | nan | 0.8820 | nan | 0.8145 | nan | nan | nan | nan | nan | 0.0046 | nan | nan | nan | nan | nan | nan | nan | 0.5707 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.1715 | 13.75 | 1100 | 0.2130 | 0.6360 | 0.6991 | 0.9261 | nan | 0.9561 | nan | nan | 0.9403 | nan | 0.8939 | nan | nan | nan | nan | nan | 0.0057 | nan | nan | nan | nan | nan | nan | nan | 0.6997 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9103 | nan | nan | 0.8815 | nan | 0.8147 | nan | nan | nan | nan | nan | 0.0056 | nan | nan | nan | nan | nan | nan | nan | 0.5680 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 
nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.1995 | 14.0 | 1120 | 0.2129 | 0.6364 | 0.6999 | 0.9263 | nan | 0.9572 | nan | nan | 0.9393 | nan | 0.8943 | nan | nan | nan | nan | nan | 0.0057 | nan | nan | nan | nan | nan | nan | nan | 0.7029 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9102 | nan | nan | 0.8817 | nan | 0.8153 | nan | nan | nan | nan | nan | 0.0056 | nan | nan | nan | nan | nan | nan | nan | 0.5690 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.1944 | 14.25 | 1140 | 0.2154 | 0.6362 | 0.6985 | 0.9262 | nan | 0.9560 | nan | nan | 0.9408 | nan | 0.8936 | nan | nan | nan | nan | nan | 0.0053 | nan | nan | nan | nan | nan | nan | nan | 0.6970 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9104 | nan | nan | 0.8814 | nan | 0.8149 | nan | nan | nan | nan | nan | 0.0052 | nan | nan | nan | nan | nan | nan | nan | 0.5689 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 
nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.1709 | 14.5 | 1160 | 0.2109 | 0.6352 | 0.6956 | 0.9263 | nan | 0.9527 | nan | nan | 0.9423 | nan | 0.8961 | nan | nan | nan | nan | nan | 0.0028 | nan | nan | nan | nan | nan | nan | nan | 0.6841 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9106 | nan | nan | 0.8817 | nan | 0.8153 | nan | nan | nan | nan | nan | 0.0028 | nan | nan | nan | nan | nan | nan | nan | 0.5655 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.1848 | 14.75 | 1180 | 0.2122 | 0.6366 | 0.7006 | 0.9263 | nan | 0.9534 | nan | nan | 0.9430 | nan | 0.8928 | nan | nan | nan | nan | nan | 0.0043 | nan | nan | nan | nan | nan | nan | nan | 0.7093 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9107 | nan | nan | 0.8817 | nan | 0.8150 | nan | nan | nan | nan | nan | 0.0042 | nan | nan | nan | nan | nan | nan | nan | 0.5714 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 
nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | | 0.1487 | 15.0 | 1200 | 0.2115 | 0.6365 | 0.7005 | 0.9263 | nan | 0.9535 | nan | nan | 0.9415 | nan | 0.8948 | nan | nan | nan | nan | nan | 0.0038 | nan | nan | nan | nan | nan | nan | nan | 0.7086 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | 0.9105 | nan | nan | 0.8818 | nan | 0.8152 | nan | nan | nan | nan | nan | 0.0038 | nan | nan | nan | nan | nan | nan | nan | 0.5711 | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | nan | ### Framework versions - Transformers 4.40.1 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
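To actually run a checkpoint like this one against its 150-class label set (listed below), the inference pattern is the same as in the other SegFormer cards in this collection. A minimal sketch follows; the repo id is a placeholder for this model's actual Hub id, and the input image path is hypothetical.

```python
# Minimal inference sketch for a SegFormer semantic-segmentation checkpoint.
# "your-username/segformer-ade20k-checkpoint" and "scene.jpg" are placeholders.
from transformers import SegformerImageProcessor, AutoModelForSemanticSegmentation
from PIL import Image
import torch
import torch.nn as nn

checkpoint = "your-username/segformer-ade20k-checkpoint"  # hypothetical id
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = AutoModelForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("scene.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# SegFormer logits come out at 1/4 resolution; upsample before the argmax.
upsampled = nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_seg = upsampled.argmax(dim=1)[0]

# List which classes appear in the prediction, via the model's id2label map.
for class_id in pred_seg.unique().tolist():
    print(class_id, model.config.id2label[class_id])
```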
[ "wall", "building", "sky", "floor", "tree", "ceiling", "road", "bed ", "windowpane", "grass", "cabinet", "sidewalk", "person", "earth", "door", "table", "mountain", "plant", "curtain", "chair", "car", "water", "painting", "sofa", "shelf", "house", "sea", "mirror", "rug", "field", "armchair", "seat", "fence", "desk", "rock", "wardrobe", "lamp", "bathtub", "railing", "cushion", "base", "box", "column", "signboard", "chest of drawers", "counter", "sand", "sink", "skyscraper", "fireplace", "refrigerator", "grandstand", "path", "stairs", "runway", "case", "pool table", "pillow", "screen door", "stairway", "river", "bridge", "bookcase", "blind", "coffee table", "toilet", "flower", "book", "hill", "bench", "countertop", "stove", "palm", "kitchen island", "computer", "swivel chair", "boat", "bar", "arcade machine", "hovel", "bus", "towel", "light", "truck", "tower", "chandelier", "awning", "streetlight", "booth", "television receiver", "airplane", "dirt track", "apparel", "pole", "land", "bannister", "escalator", "ottoman", "bottle", "buffet", "poster", "stage", "van", "ship", "fountain", "conveyer belt", "canopy", "washer", "plaything", "swimming pool", "stool", "barrel", "basket", "waterfall", "tent", "bag", "minibike", "cradle", "oven", "ball", "food", "step", "tank", "trade name", "microwave", "pot", "animal", "bicycle", "lake", "dishwasher", "screen", "blanket", "sculpture", "hood", "sconce", "vase", "traffic light", "tray", "ashcan", "fan", "pier", "crt screen", "plate", "monitor", "bulletin board", "shower", "radiator", "glass", "clock", "flag" ]
ruisusanofi/segformer-b0-finetuned-raw_img_ready2train_patches
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# segformer-b0-finetuned-raw_img_ready2train_patches

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the raw_img_ready2train_patches dataset. It achieves the following results on the evaluation set:
- Loss: 0.6829
- Mean Iou: 0.4110
- Mean Accuracy: 0.7629
- Overall Accuracy: 0.7631
- Accuracy Unlabeled: nan
- Accuracy Eczema: 0.7673
- Accuracy Background: 0.7585
- Iou Unlabeled: 0.0
- Iou Eczema: 0.6284
- Iou Background: 0.6047

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Eczema | Accuracy Background | Iou Unlabeled | Iou Eczema | Iou Background |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:---------------:|:-------------------:|:-------------:|:----------:|:--------------:|
| 1.0753 | 0.0312 | 5 | 1.0925 | 0.2358 | 0.4682 | 0.4698 | nan | 0.5042 | 0.4322 | 0.0 | 0.3705 | 0.3367 |
| 0.9863 | 0.0625 | 10 | 1.0697 | 0.2994 | 0.6182 | 0.6306 | nan | 0.8979 | 0.3385 | 0.0 | 0.5784 | 0.3198 |
| 1.0056 | 0.0938 | 15 | 1.0377 | 0.3303 | 0.6678 | 0.6792 | nan | 0.9236 | 0.4121 | 0.0 | 0.6064 | 0.3844 |
| 1.0133 | 0.125 | 20 | 1.0006 | 0.3478 | 0.6869 | 0.6950 | nan | 0.8710 | 0.5027 | 0.0 | 0.6008 | 0.4425 |
| 0.9748 | 0.1562 | 25 | 0.9689 | 0.3543 | 0.6947 | 0.7022 | nan | 0.8647 | 0.5246 | 0.0 | 0.6043 | 0.4586 |
| 0.9367 | 0.1875 | 30 | 0.9417 | 0.3566 | 0.6950 | 0.6965 | nan | 0.7290 | 0.6610 | 0.0 | 0.5583 | 0.5114 |
| 0.8363 | 0.2188 | 35 | 0.9118 | 0.3557 | 0.6940 | 0.6959 | nan | 0.7366 | 0.6514 | 0.0 | 0.5600 | 0.5069 |
| 1.1431 | 0.25 | 40 | 0.8830 | 0.3575 | 0.6963 | 0.6989 | nan | 0.7556 | 0.6370 | 0.0 | 0.5686 | 0.5039 |
| 0.7312 | 0.2812 | 45 | 0.8592 | 0.3680 | 0.7098 | 0.7133 | nan | 0.7888 | 0.6307 | 0.0 | 0.5907 | 0.5133 |
| 0.8135 | 0.3125 | 50 | 0.8268 | 0.3559 | 0.6994 | 0.7083 | nan | 0.8992 | 0.4997 | 0.0 | 0.6173 | 0.4505 |
| 0.7528 | 0.3438 | 55 | 0.8110 | 0.3525 | 0.6960 | 0.7053 | nan | 0.9055 | 0.4866 | 0.0 | 0.6162 | 0.4412 |
| 0.8405 | 0.375 | 60 | 0.7967 | 0.3518 | 0.6950 | 0.7041 | nan | 0.9008 | 0.4893 | 0.0 | 0.6140 | 0.4415 |
| 0.7865 | 0.4062 | 65 | 0.7791 | 0.3561 | 0.6992 | 0.7075 | nan | 0.8869 | 0.5116 | 0.0 | 0.6130 | 0.4553 |
| 0.8309 | 0.4375 | 70 | 0.7650 | 0.3652 | 0.7083 | 0.7147 | nan | 0.8512 | 0.5655 | 0.0 | 0.6090 | 0.4864 |
| 0.6775 | 0.4688 | 75 | 0.7615 | 0.3613 | 0.7044 | 0.7115 | nan | 0.8651 | 0.5437 | 0.0 | 0.6102 | 0.4738 |
| 0.7033 | 0.5 | 80 | 0.7498 | 0.3737 | 0.7179 | 0.7227 | nan | 0.8260 | 0.6099 | 0.0 | 0.6087 | 0.5125 |
| 0.8377 | 0.5312 | 85 | 0.7443 | 0.3790 | 0.7243 | 0.7290 | nan | 0.8303 | 0.6184 | 0.0 | 0.6154 | 0.5217 |
| 0.825 | 0.5625 | 90 | 0.7547 | 0.3676 | 0.7125 | 0.7201 | nan | 0.8840 | 0.5411 | 0.0 | 0.6225 | 0.4802 |
| 0.7408 | 0.5938 | 95 | 0.7415 | 0.3767 | 0.7228 | 0.7295 | nan | 0.8747 | 0.5708 | 0.0 | 0.6281 | 0.5021 |
| 0.8087 | 0.625 | 100 | 0.7201 | 0.3926 | 0.7404 | 0.7445 | nan | 0.8318 | 0.6491 | 0.0 | 0.6296 | 0.5483 |
| 0.7146 | 0.6562 | 105 | 0.7096 | 0.4002 | 0.7493 | 0.7520 | nan | 0.8109 | 0.6877 | 0.0 | 0.6307 | 0.5699 |
| 0.6875 | 0.6875 | 110 | 0.7047 | 0.4010 | 0.7502 | 0.7541 | nan | 0.8398 | 0.6606 | 0.0 | 0.6407 | 0.5621 |
| 0.6382 | 0.7188 | 115 | 0.7031 | 0.3982 | 0.7471 | 0.7519 | nan | 0.8543 | 0.6400 | 0.0 | 0.6426 | 0.5521 |
| 0.6551 | 0.75 | 120 | 0.6953 | 0.4018 | 0.7512 | 0.7553 | nan | 0.8450 | 0.6573 | 0.0 | 0.6433 | 0.5621 |
| 0.7074 | 0.7812 | 125 | 0.6912 | 0.4054 | 0.7553 | 0.7583 | nan | 0.8236 | 0.6871 | 0.0 | 0.6402 | 0.5760 |
| 0.768 | 0.8125 | 130 | 0.6866 | 0.4048 | 0.7546 | 0.7579 | nan | 0.8278 | 0.6814 | 0.0 | 0.6410 | 0.5736 |
| 0.7543 | 0.8438 | 135 | 0.6851 | 0.4031 | 0.7526 | 0.7564 | nan | 0.8374 | 0.6679 | 0.0 | 0.6422 | 0.5671 |
| 0.7107 | 0.875 | 140 | 0.6803 | 0.6122 | 0.7586 | 0.7608 | nan | 0.8071 | 0.7101 | nan | 0.6379 | 0.5865 |
| 0.7054 | 0.9062 | 145 | 0.6799 | 0.4098 | 0.7608 | 0.7622 | nan | 0.7924 | 0.7292 | 0.0 | 0.6350 | 0.5943 |
| 1.1302 | 0.9375 | 150 | 0.6801 | 0.4103 | 0.7616 | 0.7626 | nan | 0.7840 | 0.7393 | 0.0 | 0.6330 | 0.5981 |
| 0.6037 | 0.9688 | 155 | 0.6827 | 0.4111 | 0.7628 | 0.7632 | nan | 0.7721 | 0.7534 | 0.0 | 0.6300 | 0.6032 |
| 0.8577 | 1.0 | 160 | 0.6829 | 0.4110 | 0.7629 | 0.7631 | nan | 0.7673 | 0.7585 | 0.0 | 0.6284 | 0.6047 |

### Framework versions

- Transformers 4.40.1
- Pytorch 2.3.0
- Datasets 2.19.0
- Tokenizers 0.19.1
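For consistency with the other cards in this collection, here is a minimal inference sketch for this checkpoint. It assumes the repo is publicly downloadable from the Hub; the input file name is hypothetical. In this model's label map, class 1 is `eczema`.

```python
# Minimal sketch, assuming the checkpoint is available on the Hub;
# "skin_patch.jpg" is a hypothetical input image.
from transformers import SegformerImageProcessor, AutoModelForSemanticSegmentation
from PIL import Image
import torch
import torch.nn as nn

ckpt = "ruisusanofi/segformer-b0-finetuned-raw_img_ready2train_patches"
processor = SegformerImageProcessor.from_pretrained(ckpt)
model = AutoModelForSemanticSegmentation.from_pretrained(ckpt)

image = Image.open("skin_patch.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Upsample to the original resolution, then take the per-pixel argmax.
upsampled = nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_seg = upsampled.argmax(dim=1)[0]

eczema_mask = pred_seg == 1  # class 1 = "eczema" in this label map
print(f"Predicted eczema coverage: {eczema_mask.float().mean().item():.2%}")
```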
[ "unlabeled", "eczema", "background" ]
AliShah07/segformer-b0-finetuned-segments-stamp-verification
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# segformer-b0-finetuned-segments-stamp-verification

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the AliShah07/stamp-verification dataset. It achieves the following results on the evaluation set:
- Loss: 0.0535
- Mean Iou: 0.1317
- Mean Accuracy: 0.2635
- Overall Accuracy: 0.2635
- Accuracy Unlabeled: nan
- Accuracy Stamp: 0.2635
- Iou Unlabeled: 0.0
- Iou Stamp: 0.2635

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Stamp | Iou Unlabeled | Iou Stamp |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:--------------:|:-------------:|:---------:|
| 0.6502 | 0.8333 | 20 | 0.6958 | 0.4685 | 0.9370 | 0.9370 | nan | 0.9370 | 0.0 | 0.9370 |
| 0.4529 | 1.6667 | 40 | 0.5458 | 0.0754 | 0.1508 | 0.1508 | nan | 0.1508 | 0.0 | 0.1508 |
| 0.3716 | 2.5 | 60 | 0.3818 | 0.0021 | 0.0041 | 0.0041 | nan | 0.0041 | 0.0 | 0.0041 |
| 0.3238 | 3.3333 | 80 | 0.2932 | 0.0126 | 0.0252 | 0.0252 | nan | 0.0252 | 0.0 | 0.0252 |
| 0.2167 | 4.1667 | 100 | 0.2326 | 0.0008 | 0.0015 | 0.0015 | nan | 0.0015 | 0.0 | 0.0015 |
| 0.1948 | 5.0 | 120 | 0.2029 | 0.0033 | 0.0065 | 0.0065 | nan | 0.0065 | 0.0 | 0.0065 |
| 0.1643 | 5.8333 | 140 | 0.1609 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.1642 | 6.6667 | 160 | 0.1428 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.1326 | 7.5 | 180 | 0.1222 | 0.0001 | 0.0002 | 0.0002 | nan | 0.0002 | 0.0 | 0.0002 |
| 0.1012 | 8.3333 | 200 | 0.0981 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.0981 | 9.1667 | 220 | 0.0972 | 0.0058 | 0.0117 | 0.0117 | nan | 0.0117 | 0.0 | 0.0117 |
| 0.0838 | 10.0 | 240 | 0.0781 | 0.0015 | 0.0031 | 0.0031 | nan | 0.0031 | 0.0 | 0.0031 |
| 0.0771 | 10.8333 | 260 | 0.0708 | 0.0060 | 0.0120 | 0.0120 | nan | 0.0120 | 0.0 | 0.0120 |
| 0.0743 | 11.6667 | 280 | 0.0696 | 0.0298 | 0.0596 | 0.0596 | nan | 0.0596 | 0.0 | 0.0596 |
| 0.0655 | 12.5 | 300 | 0.0630 | 0.0398 | 0.0795 | 0.0795 | nan | 0.0795 | 0.0 | 0.0795 |
| 0.0673 | 13.3333 | 320 | 0.0613 | 0.0856 | 0.1712 | 0.1712 | nan | 0.1712 | 0.0 | 0.1712 |
| 0.0573 | 14.1667 | 340 | 0.0538 | 0.0725 | 0.1450 | 0.1450 | nan | 0.1450 | 0.0 | 0.1450 |
| 0.0623 | 15.0 | 360 | 0.0543 | 0.1008 | 0.2016 | 0.2016 | nan | 0.2016 | 0.0 | 0.2016 |
| 0.0557 | 15.8333 | 380 | 0.0559 | 0.1474 | 0.2947 | 0.2947 | nan | 0.2947 | 0.0 | 0.2947 |
| 0.0594 | 16.6667 | 400 | 0.0492 | 0.1019 | 0.2039 | 0.2039 | nan | 0.2039 | 0.0 | 0.2039 |
| 0.056 | 17.5 | 420 | 0.0479 | 0.1235 | 0.2470 | 0.2470 | nan | 0.2470 | 0.0 | 0.2470 |
| 0.0499 | 18.3333 | 440 | 0.0481 | 0.1124 | 0.2248 | 0.2248 | nan | 0.2248 | 0.0 | 0.2248 |
| 0.0516 | 19.1667 | 460 | 0.0477 | 0.1465 | 0.2930 | 0.2930 | nan | 0.2930 | 0.0 | 0.2930 |
| 0.0517 | 20.0 | 480 | 0.0535 | 0.1317 | 0.2635 | 0.2635 | nan | 0.2635 | 0.0 | 0.2635 |

### Framework versions

- Transformers 4.40.2
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
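The Mean IoU and per-class IoU figures in the table above follow the standard definition: per-class intersection over union of the predicted and ground-truth pixel sets, averaged across classes. A small NumPy sketch of that computation follows; the mask arrays are hypothetical stand-ins for real predictions and annotations.

```python
# Sketch of how the per-class IoU values in the table are defined
# (standard intersection-over-union; the arrays here are hypothetical).
import numpy as np

def iou_per_class(pred: np.ndarray, target: np.ndarray, num_classes: int):
    """Return one IoU value per class id; NaN when a class is absent from both masks."""
    ious = []
    for c in range(num_classes):
        pred_c, target_c = pred == c, target == c
        union = np.logical_or(pred_c, target_c).sum()
        if union == 0:
            ious.append(float("nan"))  # class never appears, so IoU is undefined
        else:
            ious.append(np.logical_and(pred_c, target_c).sum() / union)
    return ious

pred = np.random.randint(0, 2, (512, 512))    # hypothetical prediction (0=unlabeled, 1=stamp)
target = np.random.randint(0, 2, (512, 512))  # hypothetical ground-truth mask
print(iou_per_class(pred, target, num_classes=2))
```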
[ "unlabeled", "stamp" ]
AliShah07/segformer-b0-finetuned-segments-stamp-verification2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-finetuned-segments-stamp-verification2 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the AliShah07/stamp-verification dataset. It achieves the following results on the evaluation set: - Loss: 0.0365 - Mean Iou: 0.1372 - Mean Accuracy: 0.2744 - Overall Accuracy: 0.2744 - Accuracy Unlabeled: nan - Accuracy Stamp: 0.2744 - Iou Unlabeled: 0.0 - Iou Stamp: 0.2744 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 20 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Stamp | Iou Unlabeled | Iou Stamp | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:--------------:|:-------------:|:---------:| | 0.4566 | 0.8333 | 20 | 0.4738 | 0.1430 | 0.2860 | 0.2860 | nan | 0.2860 | 0.0 | 0.2860 | | 0.3076 | 1.6667 | 40 | 0.3046 | 0.1307 | 0.2614 | 0.2614 | nan | 0.2614 | 0.0 | 0.2614 | | 0.2373 | 2.5 | 60 | 0.2226 | 0.0604 | 0.1209 | 0.1209 | nan | 0.1209 | 0.0 | 0.1209 | | 0.2184 | 3.3333 | 80 | 0.2220 | 0.1942 | 0.3884 | 0.3884 | nan | 0.3884 | 0.0 | 0.3884 | | 0.1578 | 4.1667 | 100 | 0.1704 | 0.2468 | 0.4936 | 0.4936 | nan | 0.4936 | 0.0 | 0.4936 | | 0.1412 | 5.0 | 120 | 0.1269 | 0.0376 | 0.0751 | 0.0751 | nan | 0.0751 | 0.0 | 0.0751 | | 0.1109 | 5.8333 | 140 | 0.1076 | 0.2741 | 0.5483 | 0.5483 | nan | 0.5483 | 0.0 | 0.5483 | | 0.106 | 6.6667 | 160 | 0.0892 | 0.0583 | 0.1166 | 0.1166 | nan | 0.1166 | 0.0 | 0.1166 | | 0.0899 | 7.5 | 180 | 0.0747 | 0.0173 | 0.0346 | 0.0346 | nan | 0.0346 | 0.0 | 0.0346 | | 0.0794 | 8.3333 | 200 | 0.0683 | 0.0189 | 0.0378 | 0.0378 | nan | 0.0378 | 0.0 | 0.0378 | | 0.0741 | 9.1667 | 220 | 0.0639 | 0.0981 | 0.1963 | 0.1963 | nan | 0.1963 | 0.0 | 0.1963 | | 0.0832 | 10.0 | 240 | 0.0559 | 0.0599 | 0.1198 | 0.1198 | nan | 0.1198 | 0.0 | 0.1198 | | 0.0575 | 10.8333 | 260 | 0.0527 | 0.0769 | 0.1538 | 0.1538 | nan | 0.1538 | 0.0 | 0.1538 | | 0.05 | 11.6667 | 280 | 0.0502 | 0.0852 | 0.1704 | 0.1704 | nan | 0.1704 | 0.0 | 0.1704 | | 0.0523 | 12.5 | 300 | 0.0446 | 0.1038 | 0.2076 | 0.2076 | nan | 0.2076 | 0.0 | 0.2076 | | 0.0481 | 13.3333 | 320 | 0.0431 | 0.0956 | 0.1913 | 0.1913 | nan | 0.1913 | 0.0 | 0.1913 | | 0.0471 | 14.1667 | 340 | 0.0420 | 0.1330 | 0.2660 | 0.2660 | nan | 0.2660 | 0.0 | 0.2660 | | 0.042 | 15.0 | 360 | 0.0412 | 0.1124 | 0.2248 | 0.2248 | nan | 0.2248 | 0.0 | 0.2248 | | 0.041 | 15.8333 | 380 | 0.0400 | 0.1144 | 0.2288 | 0.2288 | nan | 0.2288 | 0.0 | 0.2288 | | 0.0444 | 16.6667 | 400 | 0.0383 | 0.1415 | 0.2830 | 0.2830 | nan | 0.2830 | 0.0 | 0.2830 | | 0.0514 | 17.5 | 420 | 0.0377 | 0.0779 | 0.1559 | 0.1559 | nan | 0.1559 | 0.0 | 0.1559 | | 0.0434 | 18.3333 | 440 | 0.0374 | 0.1482 | 0.2964 | 0.2964 | nan | 0.2964 | 0.0 | 0.2964 | | 0.0383 | 19.1667 | 460 | 0.0363 | 0.1843 | 0.3686 | 0.3686 | nan | 0.3686 | 0.0 | 0.3686 | | 0.0411 | 20.0 | 480 | 0.0365 | 0.1372 | 0.2744 | 
0.2744 | nan | 0.2744 | 0.0 | 0.2744 | ### Framework versions - Transformers 4.40.2 - Pytorch 2.2.1+cu121 - Datasets 2.19.1 - Tokenizers 0.19.1
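For reference, a minimal inference sketch in the style of the other SegFormer cards in this collection (the checkpoint path and input image below are placeholders, not taken from this card):

```python
import torch.nn as nn
from PIL import Image
from transformers import SegformerImageProcessor, AutoModelForSemanticSegmentation

# Hypothetical checkpoint path; substitute the actual repository id.
ckpt = "your-username/segformer-b0-finetuned-segments-stamp-verification2"
processor = SegformerImageProcessor.from_pretrained(ckpt)
model = AutoModelForSemanticSegmentation.from_pretrained(ckpt)

image = Image.open("document_scan.png").convert("RGB")  # placeholder input
inputs = processor(images=image, return_tensors="pt")
logits = model(**inputs).logits  # shape: (1, num_labels, h/4, w/4)

# Upsample to the input resolution and take the per-pixel argmax:
# 0 = unlabeled, 1 = stamp.
upsampled = nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
stamp_mask = upsampled.argmax(dim=1)[0] == 1
```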
[ "unlabeled", "stamp" ]
sayeed99/segformer-b3-fashion
# segformer-b3-fashion This model is a fine-tuned version of [nvidia/mit-b3](https://huggingface.co/nvidia/mit-b3) on the sayeed99/fashion_segmentation dataset using original image sizes without resizing.

```python
from transformers import SegformerImageProcessor, AutoModelForSemanticSegmentation
from PIL import Image
import requests
import matplotlib.pyplot as plt
import torch.nn as nn

processor = SegformerImageProcessor.from_pretrained("sayeed99/segformer-b3-fashion")
model = AutoModelForSemanticSegmentation.from_pretrained("sayeed99/segformer-b3-fashion")

url = "https://plus.unsplash.com/premium_photo-1673210886161-bfcc40f54d1f?ixlib=rb-4.0.3&ixid=MnwxMjA3fDB8MHxzZWFyY2h8MXx8cGVyc29uJTIwc3RhbmRpbmd8ZW58MHx8MHx8&w=1000&q=80"
image = Image.open(requests.get(url, stream=True).raw)

inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)
logits = outputs.logits.cpu()

upsampled_logits = nn.functional.interpolate(
    logits,
    size=image.size[::-1],
    mode="bilinear",
    align_corners=False,
)

pred_seg = upsampled_logits.argmax(dim=1)[0]
plt.imshow(pred_seg)
```

Labels : {"0":"Unlabelled", "1": "shirt, blouse", "2": "top, t-shirt, sweatshirt", "3": "sweater", "4": "cardigan", "5": "jacket", "6": "vest", "7": "pants", "8": "shorts", "9": "skirt", "10": "coat", "11": "dress", "12": "jumpsuit", "13": "cape", "14": "glasses", "15": "hat", "16": "headband, head covering, hair accessory", "17": "tie", "18": "glove", "19": "watch", "20": "belt", "21": "leg warmer", "22": "tights, stockings", "23": "sock", "24": "shoe", "25": "bag, wallet", "26": "scarf", "27": "umbrella", "28": "hood", "29": "collar", "30": "lapel", "31": "epaulette", "32": "sleeve", "33": "pocket", "34": "neckline", "35": "buckle", "36": "zipper", "37": "applique", "38": "bead", "39": "bow", "40": "flower", "41": "fringe", "42": "ribbon", "43": "rivet", "44": "ruffle", "45": "sequin", "46": "tassel"}

### Framework versions - Transformers 4.30.0 - Pytorch 2.2.2+cu121 - Datasets 2.18.0 - Tokenizers 0.13.3 ### License The license for this model can be found [here](https://github.com/NVlabs/SegFormer/blob/master/LICENSE). ### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2105-15203,
  author    = {Enze Xie and Wenhai Wang and Zhiding Yu and Anima Anandkumar and Jose M. Alvarez and Ping Luo},
  title     = {SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers},
  journal   = {CoRR},
  volume    = {abs/2105.15203},
  year      = {2021},
  url       = {https://arxiv.org/abs/2105.15203},
  eprinttype = {arXiv},
  eprint    = {2105.15203},
  timestamp = {Wed, 02 Jun 2021 11:46:42 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2105-15203.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
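Returning to the usage example above: because this model was trained on original image sizes, it may help to disable the processor's default resizing at inference as well. A hedged variant (`do_resize` is a standard `SegformerImageProcessor` flag; whether you want it off depends on your input resolution):

```python
from transformers import SegformerImageProcessor

# Keep the source resolution instead of the processor's default resize.
processor = SegformerImageProcessor.from_pretrained(
    "sayeed99/segformer-b3-fashion", do_resize=False
)
```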
[ "unlabelled", "shirt, blouse", "top, t-shirt, sweatshirt", "sweater", "cardigan", "jacket", "vest", "pants", "shorts", "skirt", "coat", "dress", "jumpsuit", "cape", "glasses", "hat", "headband, head covering, hair accessory", "tie", "glove", "watch", "belt", "leg warmer", "tights, stockings", "sock", "shoe", "bag, wallet", "scarf", "umbrella", "hood", "collar", "lapel", "epaulette", "sleeve", "pocket", "neckline", "buckle", "zipper", "applique", "bead", "bow", "flower", "fringe", "ribbon", "rivet", "ruffle", "sequin", "tassel" ]
cephelos/dungeon-maps-seg-v0.0.1
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dungeon-maps-seg-v0.0.1 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the cephelos/dungeon-maps-seg dataset. It achieves the following results on the evaluation set: - Loss: 0.0361 - Mean Iou: 0.9518 - Mean Accuracy: 0.9783 - Overall Accuracy: 0.9893 - Accuracy Unlabeled: nan - Accuracy Room: 0.9923 - Accuracy Wall: 0.9490 - Accuracy Outside: 0.9935 - Iou Unlabeled: nan - Iou Room: 0.9857 - Iou Wall: 0.8788 - Iou Outside: 0.9911 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Room | Accuracy Wall | Accuracy Outside | Iou Unlabeled | Iou Room | Iou Wall | Iou Outside | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-------------:|:-------------:|:----------------:|:-------------:|:--------:|:--------:|:-----------:| | 0.2922 | 0.7692 | 20 | 0.2745 | 0.8581 | 0.9561 | 0.9598 | nan | 0.9526 | 0.9466 | 0.9690 | nan | 0.9466 | 0.6646 | 0.9632 | | 0.2099 | 1.5385 | 40 | 0.2072 | 0.8639 | 0.9584 | 0.9625 | nan | 0.9680 | 0.9472 | 0.9599 | nan | 0.9600 | 0.6732 | 0.9584 | | 0.2009 | 2.3077 | 60 | 0.1688 | 0.8968 | 0.9623 | 0.9741 | nan | 0.9718 | 0.9316 | 0.9835 | nan | 0.9649 | 0.7477 | 0.9778 | | 0.1258 | 3.0769 | 80 | 0.1482 | 0.8991 | 0.9676 | 0.9745 | nan | 0.9773 | 0.9492 | 0.9762 | nan | 0.9708 | 0.7529 | 0.9736 | | 0.1624 | 3.8462 | 100 | 0.1333 | 0.9115 | 0.9682 | 0.9785 | nan | 0.9807 | 0.9410 | 0.9829 | nan | 0.9734 | 0.7817 | 0.9795 | | 0.1098 | 4.6154 | 120 | 0.1079 | 0.9173 | 0.9624 | 0.9805 | nan | 0.9859 | 0.9145 | 0.9868 | nan | 0.9753 | 0.7950 | 0.9817 | | 0.1629 | 5.3846 | 140 | 0.1041 | 0.9195 | 0.9711 | 0.9806 | nan | 0.9790 | 0.9462 | 0.9881 | nan | 0.9738 | 0.8013 | 0.9833 | | 0.1243 | 6.1538 | 160 | 0.0872 | 0.9243 | 0.9675 | 0.9821 | nan | 0.9852 | 0.9288 | 0.9884 | nan | 0.9766 | 0.8125 | 0.9836 | | 0.0974 | 6.9231 | 180 | 0.0996 | 0.9217 | 0.9731 | 0.9811 | nan | 0.9754 | 0.9525 | 0.9915 | nan | 0.9717 | 0.8073 | 0.9861 | | 0.0861 | 7.6923 | 200 | 0.0798 | 0.9248 | 0.9706 | 0.9821 | nan | 0.9829 | 0.9403 | 0.9886 | nan | 0.9764 | 0.8142 | 0.9836 | | 0.0928 | 8.4615 | 220 | 0.0718 | 0.9276 | 0.9740 | 0.9828 | nan | 0.9830 | 0.9507 | 0.9882 | nan | 0.9773 | 0.8209 | 0.9847 | | 0.0583 | 9.2308 | 240 | 0.0726 | 0.9240 | 0.9686 | 0.9822 | nan | 0.9870 | 0.9326 | 0.9862 | nan | 0.9789 | 0.8111 | 0.9821 | | 0.0886 | 10.0 | 260 | 0.0700 | 0.9296 | 0.9740 | 0.9835 | nan | 0.9845 | 0.9491 | 0.9885 | nan | 0.9786 | 0.8250 | 0.9852 | | 0.1133 | 10.7692 | 280 | 0.0651 | 0.9322 | 0.9633 | 0.9848 | nan | 0.9912 | 0.9064 | 0.9922 | nan | 0.9794 | 0.8301 | 0.9872 | | 0.0821 | 11.5385 | 300 | 0.0616 | 0.9302 | 0.9721 | 0.9836 | nan | 0.9833 | 0.9417 | 0.9912 | nan | 0.9779 | 0.8270 | 0.9857 | | 0.07 | 12.3077 | 320 | 0.0586 | 0.9394 | 0.9690 | 
0.9864 | nan | 0.9896 | 0.9232 | 0.9942 | nan | 0.9810 | 0.8485 | 0.9887 | | 0.076 | 13.0769 | 340 | 0.0566 | 0.9349 | 0.9651 | 0.9854 | nan | 0.9919 | 0.9113 | 0.9920 | nan | 0.9803 | 0.8365 | 0.9878 | | 0.0577 | 13.8462 | 360 | 0.0570 | 0.9378 | 0.9755 | 0.9857 | nan | 0.9850 | 0.9488 | 0.9926 | nan | 0.9797 | 0.8452 | 0.9886 | | 0.1261 | 14.6154 | 380 | 0.0548 | 0.9403 | 0.9739 | 0.9864 | nan | 0.9867 | 0.9410 | 0.9939 | nan | 0.9808 | 0.8511 | 0.9891 | | 0.0583 | 15.3846 | 400 | 0.0523 | 0.9428 | 0.9736 | 0.9871 | nan | 0.9895 | 0.9379 | 0.9934 | nan | 0.9820 | 0.8566 | 0.9896 | | 0.0602 | 16.1538 | 420 | 0.0488 | 0.9409 | 0.9737 | 0.9866 | nan | 0.9899 | 0.9394 | 0.9917 | nan | 0.9820 | 0.8519 | 0.9887 | | 0.0728 | 16.9231 | 440 | 0.0504 | 0.9380 | 0.9716 | 0.9860 | nan | 0.9907 | 0.9335 | 0.9905 | nan | 0.9819 | 0.8448 | 0.9873 | | 0.0507 | 17.6923 | 460 | 0.0503 | 0.9378 | 0.9739 | 0.9858 | nan | 0.9892 | 0.9424 | 0.9901 | nan | 0.9820 | 0.8445 | 0.9869 | | 0.077 | 18.4615 | 480 | 0.0474 | 0.9429 | 0.9740 | 0.9871 | nan | 0.9876 | 0.9396 | 0.9949 | nan | 0.9819 | 0.8570 | 0.9897 | | 0.2137 | 19.2308 | 500 | 0.0500 | 0.9413 | 0.9763 | 0.9866 | nan | 0.9892 | 0.9489 | 0.9907 | nan | 0.9823 | 0.8532 | 0.9882 | | 0.0991 | 20.0 | 520 | 0.0459 | 0.9440 | 0.9719 | 0.9875 | nan | 0.9899 | 0.9309 | 0.9950 | nan | 0.9827 | 0.8595 | 0.9898 | | 0.0691 | 20.7692 | 540 | 0.0447 | 0.9451 | 0.9743 | 0.9877 | nan | 0.9906 | 0.9390 | 0.9933 | nan | 0.9831 | 0.8623 | 0.9897 | | 0.0602 | 21.5385 | 560 | 0.0447 | 0.9462 | 0.9754 | 0.9879 | nan | 0.9885 | 0.9424 | 0.9952 | nan | 0.9828 | 0.8654 | 0.9904 | | 0.0469 | 22.3077 | 580 | 0.0429 | 0.9466 | 0.9767 | 0.9879 | nan | 0.9889 | 0.9471 | 0.9940 | nan | 0.9830 | 0.8664 | 0.9903 | | 0.0553 | 23.0769 | 600 | 0.0445 | 0.9468 | 0.9722 | 0.9882 | nan | 0.9913 | 0.9301 | 0.9952 | nan | 0.9832 | 0.8666 | 0.9906 | | 0.0671 | 23.8462 | 620 | 0.0424 | 0.9455 | 0.9748 | 0.9878 | nan | 0.9900 | 0.9407 | 0.9938 | nan | 0.9833 | 0.8635 | 0.9898 | | 0.0431 | 24.6154 | 640 | 0.0417 | 0.9475 | 0.9732 | 0.9883 | nan | 0.9921 | 0.9331 | 0.9943 | nan | 0.9836 | 0.8681 | 0.9907 | | 0.0381 | 25.3846 | 660 | 0.0429 | 0.9449 | 0.9763 | 0.9876 | nan | 0.9881 | 0.9467 | 0.9942 | nan | 0.9827 | 0.8620 | 0.9901 | | 0.0503 | 26.1538 | 680 | 0.0403 | 0.9471 | 0.9746 | 0.9882 | nan | 0.9924 | 0.9384 | 0.9929 | nan | 0.9841 | 0.8669 | 0.9902 | | 0.0685 | 26.9231 | 700 | 0.0410 | 0.9496 | 0.9743 | 0.9888 | nan | 0.9913 | 0.9361 | 0.9957 | nan | 0.9842 | 0.8732 | 0.9912 | | 0.0381 | 27.6923 | 720 | 0.0398 | 0.9494 | 0.9771 | 0.9887 | nan | 0.9906 | 0.9466 | 0.9942 | nan | 0.9843 | 0.8729 | 0.9909 | | 0.0587 | 28.4615 | 740 | 0.0397 | 0.9500 | 0.9760 | 0.9889 | nan | 0.9913 | 0.9421 | 0.9947 | nan | 0.9843 | 0.8743 | 0.9913 | | 0.0573 | 29.2308 | 760 | 0.0402 | 0.9489 | 0.9756 | 0.9887 | nan | 0.9913 | 0.9411 | 0.9945 | nan | 0.9845 | 0.8715 | 0.9908 | | 0.0686 | 30.0 | 780 | 0.0386 | 0.9499 | 0.9763 | 0.9889 | nan | 0.9914 | 0.9433 | 0.9944 | nan | 0.9844 | 0.8740 | 0.9912 | | 0.037 | 30.7692 | 800 | 0.0386 | 0.9503 | 0.9752 | 0.9890 | nan | 0.9925 | 0.9387 | 0.9944 | nan | 0.9849 | 0.8748 | 0.9911 | | 0.0565 | 31.5385 | 820 | 0.0389 | 0.9497 | 0.9773 | 0.9888 | nan | 0.9898 | 0.9471 | 0.9950 | nan | 0.9840 | 0.8738 | 0.9913 | | 0.0405 | 32.3077 | 840 | 0.0383 | 0.9483 | 0.9743 | 0.9886 | nan | 0.9933 | 0.9366 | 0.9930 | nan | 0.9848 | 0.8698 | 0.9903 | | 0.0618 | 33.0769 | 860 | 0.0383 | 0.9497 | 0.9757 | 0.9889 | nan | 0.9920 | 0.9408 | 0.9942 | nan | 0.9847 | 0.8734 | 0.9910 | | 
0.0398 | 33.8462 | 880 | 0.0379 | 0.9494 | 0.9766 | 0.9888 | nan | 0.9917 | 0.9446 | 0.9936 | nan | 0.9846 | 0.8729 | 0.9908 | | 0.0488 | 34.6154 | 900 | 0.0376 | 0.9501 | 0.9769 | 0.9889 | nan | 0.9915 | 0.9450 | 0.9941 | nan | 0.9851 | 0.8745 | 0.9907 | | 0.0574 | 35.3846 | 920 | 0.0379 | 0.9512 | 0.9762 | 0.9892 | nan | 0.9914 | 0.9419 | 0.9953 | nan | 0.9849 | 0.8773 | 0.9914 | | 0.0331 | 36.1538 | 940 | 0.0368 | 0.9514 | 0.9764 | 0.9893 | nan | 0.9921 | 0.9424 | 0.9947 | nan | 0.9852 | 0.8777 | 0.9913 | | 0.0578 | 36.9231 | 960 | 0.0368 | 0.9520 | 0.9770 | 0.9894 | nan | 0.9916 | 0.9443 | 0.9951 | nan | 0.9852 | 0.8790 | 0.9917 | | 0.0471 | 37.6923 | 980 | 0.0369 | 0.9517 | 0.9779 | 0.9893 | nan | 0.9912 | 0.9480 | 0.9947 | nan | 0.9852 | 0.8786 | 0.9915 | | 0.0388 | 38.4615 | 1000 | 0.0369 | 0.9511 | 0.9776 | 0.9892 | nan | 0.9904 | 0.9473 | 0.9952 | nan | 0.9846 | 0.8770 | 0.9916 | | 0.0455 | 39.2308 | 1020 | 0.0367 | 0.9517 | 0.9753 | 0.9894 | nan | 0.9928 | 0.9379 | 0.9950 | nan | 0.9853 | 0.8784 | 0.9915 | | 0.0359 | 40.0 | 1040 | 0.0360 | 0.9516 | 0.9773 | 0.9893 | nan | 0.9917 | 0.9457 | 0.9945 | nan | 0.9853 | 0.8783 | 0.9913 | | 0.0281 | 40.7692 | 1060 | 0.0363 | 0.9519 | 0.9775 | 0.9894 | nan | 0.9917 | 0.9462 | 0.9946 | nan | 0.9854 | 0.8790 | 0.9913 | | 0.0394 | 41.5385 | 1080 | 0.0367 | 0.9508 | 0.9769 | 0.9891 | nan | 0.9922 | 0.9446 | 0.9939 | nan | 0.9854 | 0.8761 | 0.9909 | | 0.0286 | 42.3077 | 1100 | 0.0360 | 0.9525 | 0.9761 | 0.9896 | nan | 0.9924 | 0.9405 | 0.9953 | nan | 0.9855 | 0.8804 | 0.9917 | | 0.028 | 43.0769 | 1120 | 0.0363 | 0.9509 | 0.9791 | 0.9891 | nan | 0.9909 | 0.9530 | 0.9936 | nan | 0.9850 | 0.8767 | 0.9911 | | 0.0523 | 43.8462 | 1140 | 0.0366 | 0.9526 | 0.9777 | 0.9895 | nan | 0.9919 | 0.9466 | 0.9947 | nan | 0.9856 | 0.8806 | 0.9915 | | 0.0492 | 44.6154 | 1160 | 0.0364 | 0.9523 | 0.9764 | 0.9895 | nan | 0.9926 | 0.9419 | 0.9948 | nan | 0.9856 | 0.8799 | 0.9915 | | 0.0331 | 45.3846 | 1180 | 0.0356 | 0.9523 | 0.9781 | 0.9894 | nan | 0.9906 | 0.9484 | 0.9954 | nan | 0.9852 | 0.8799 | 0.9917 | | 0.0443 | 46.1538 | 1200 | 0.0358 | 0.9533 | 0.9772 | 0.9897 | nan | 0.9921 | 0.9443 | 0.9953 | nan | 0.9857 | 0.8824 | 0.9918 | | 0.0331 | 46.9231 | 1220 | 0.0356 | 0.9527 | 0.9771 | 0.9896 | nan | 0.9929 | 0.9441 | 0.9943 | nan | 0.9858 | 0.8808 | 0.9915 | | 0.0546 | 47.6923 | 1240 | 0.0357 | 0.9532 | 0.9774 | 0.9897 | nan | 0.9916 | 0.9450 | 0.9956 | nan | 0.9856 | 0.8821 | 0.9919 | | 0.0297 | 48.4615 | 1260 | 0.0351 | 0.9526 | 0.9776 | 0.9896 | nan | 0.9925 | 0.9461 | 0.9942 | nan | 0.9857 | 0.8807 | 0.9915 | | 0.053 | 49.2308 | 1280 | 0.0349 | 0.9527 | 0.9779 | 0.9896 | nan | 0.9921 | 0.9471 | 0.9945 | nan | 0.9856 | 0.8809 | 0.9916 | | 0.0474 | 50.0 | 1300 | 0.0361 | 0.9518 | 0.9783 | 0.9893 | nan | 0.9923 | 0.9490 | 0.9935 | nan | 0.9857 | 0.8788 | 0.9911 | ### Framework versions - Transformers 4.40.2 - Pytorch 2.2.0+cpu - Datasets 2.19.1 - Tokenizers 0.19.1
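For convenience, a minimal inference sketch for this checkpoint (the repository id comes from the model name above; the input image path is a placeholder, so treat this as a starting point rather than the authors' code):

```python
from PIL import Image
from transformers import SegformerImageProcessor, AutoModelForSemanticSegmentation

ckpt = "cephelos/dungeon-maps-seg-v0.0.1"
processor = SegformerImageProcessor.from_pretrained(ckpt)
model = AutoModelForSemanticSegmentation.from_pretrained(ckpt)

image = Image.open("dungeon_map.png").convert("RGB")  # placeholder input
outputs = model(**processor(images=image, return_tensors="pt"))

# post_process_semantic_segmentation upsamples the logits and takes the
# per-pixel argmax in one step; target_sizes expects (height, width).
pred = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]

# Report the fraction of the map covered by each class.
for class_id, name in enumerate(["unlabeled", "room", "wall", "outside"]):
    print(f"{name}: {(pred == class_id).float().mean().item():.1%}")
```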
[ "unlabeled", "room", "wall", "outside" ]
Cookito/segformer-b0-finetuned-segments-sidewalk-oct-22
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-finetuned-segments-sidewalk-oct-22 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 ### Framework versions - Transformers 4.40.2 - Pytorch 2.2.1+cu121 - Datasets 2.19.1 - Tokenizers 0.19.1
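For readers who want to reproduce a comparable run, the hyperparameters above map onto a standard Hugging Face `Trainer` configuration roughly as follows (a hedged reconstruction, not the authors' actual training script; Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default optimizer):

```python
from transformers import TrainingArguments

# Approximate reconstruction of the configuration listed above.
args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-sidewalk-oct-22",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```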
[ "unlabeled", "flat-road", "flat-sidewalk", "flat-crosswalk", "flat-cyclinglane", "flat-parkingdriveway", "flat-railtrack", "flat-curb", "human-person", "human-rider", "vehicle-car", "vehicle-truck", "vehicle-bus", "vehicle-tramtrain", "vehicle-motorcycle", "vehicle-bicycle", "vehicle-caravan", "vehicle-cartrailer", "construction-building", "construction-door", "construction-wall", "construction-fenceguardrail", "construction-bridge", "construction-tunnel", "construction-stairs", "object-pole", "object-trafficsign", "object-trafficlight", "nature-vegetation", "nature-terrain", "sky", "void-ground", "void-dynamic", "void-static", "void-unclear" ]
Grizzlygg/segformer-b0-scene-parse-150
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-scene-parse-150 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the Grizzlygg/CrownTest dataset. It achieves the following results on the evaluation set: - Loss: 0.7268 - Mean Iou: 0.3888 - Mean Accuracy: 0.7776 - Overall Accuracy: 0.7776 - Accuracy Unlabeled: nan - Accuracy Crown: 0.7776 - Iou Unlabeled: 0.0 - Iou Crown: 0.7776 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Crown | Iou Unlabeled | Iou Crown | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:--------------:|:-------------:|:---------:| | 0.3303 | 10.0 | 20 | 0.6046 | 0.3722 | 0.7444 | 0.7444 | nan | 0.7444 | 0.0 | 0.7444 | | 0.2269 | 20.0 | 40 | 0.6817 | 0.3613 | 0.7226 | 0.7226 | nan | 0.7226 | 0.0 | 0.7226 | | 0.1893 | 30.0 | 60 | 0.7231 | 0.3717 | 0.7435 | 0.7435 | nan | 0.7435 | 0.0 | 0.7435 | | 0.185 | 40.0 | 80 | 0.7688 | 0.3880 | 0.7760 | 0.7760 | nan | 0.7760 | 0.0 | 0.7760 | | 0.1704 | 50.0 | 100 | 0.7268 | 0.3888 | 0.7776 | 0.7776 | nan | 0.7776 | 0.0 | 0.7776 | ### Framework versions - Transformers 4.30.1 - Pytorch 2.2.1+cu121 - Datasets 2.19.1 - Tokenizers 0.13.3
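Note that the Mean IoU above averages the per-class IoU values, and the `unlabeled` class contributes an IoU of 0 here, which is why Mean IoU (0.3888) is exactly half of the crown IoU (0.7776). A minimal sketch of the per-class IoU computation on hypothetical prediction and ground-truth masks:

```python
import torch

def per_class_iou(pred: torch.Tensor, target: torch.Tensor, num_classes: int):
    """Intersection-over-union for each class id in [0, num_classes)."""
    ious = []
    for c in range(num_classes):
        inter = ((pred == c) & (target == c)).sum().item()
        union = ((pred == c) | (target == c)).sum().item()
        ious.append(inter / union if union > 0 else float("nan"))
    return ious

# Hypothetical masks: 0 = unlabeled, 1 = crown.
pred = torch.randint(0, 2, (512, 512))
target = torch.randint(0, 2, (512, 512))
iou_unlabeled, iou_crown = per_class_iou(pred, target, num_classes=2)
```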
[ "unlabeled", "crown" ]
bhaskarSingha/maskformer-paddy
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
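Since the "How to Get Started" section above is empty, here is a hedged starting point for semantic inference with a MaskFormer checkpoint (this assumes the repository ships both a processor config and model weights; the input image is a placeholder):

```python
from PIL import Image
from transformers import MaskFormerImageProcessor, MaskFormerForInstanceSegmentation

ckpt = "bhaskarSingha/maskformer-paddy"
processor = MaskFormerImageProcessor.from_pretrained(ckpt)
model = MaskFormerForInstanceSegmentation.from_pretrained(ckpt)

image = Image.open("paddy_leaf.jpg").convert("RGB")  # placeholder input
inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)

# Merge the predicted mask queries into one per-pixel class map:
# 0 = unlabeled, 1 = healthy, 2 = brownspot, 3 = leafblast.
seg = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]
```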
[ "unlabeled", "healthy", "brownspot", "leafblast" ]
jrevilla/segformer-b0-finetuned-segments-sidewalk-2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-finetuned-segments-sidewalk-2 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 ### Framework versions - Transformers 4.40.1 - Pytorch 2.3.0+cu121 - Datasets 2.19.1 - Tokenizers 0.19.1
[ "unlabeled", "flat-road", "flat-sidewalk", "flat-crosswalk", "flat-cyclinglane", "flat-parkingdriveway", "flat-railtrack", "flat-curb", "human-person", "human-rider", "vehicle-car", "vehicle-truck", "vehicle-bus", "vehicle-tramtrain", "vehicle-motorcycle", "vehicle-bicycle", "vehicle-caravan", "vehicle-cartrailer", "construction-building", "construction-door", "construction-wall", "construction-fenceguardrail", "construction-bridge", "construction-tunnel", "construction-stairs", "object-pole", "object-trafficsign", "object-trafficlight", "nature-vegetation", "nature-terrain", "sky", "void-ground", "void-dynamic", "void-static", "void-unclear" ]
jrevilla/segformer-b0-finetuned-segments-zagile-image-search
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-finetuned-segments-zagile-image-search This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the jrevilla/zagile-image-search dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 ### Framework versions - Transformers 4.40.1 - Pytorch 2.3.0+cu121 - Datasets 2.19.1 - Tokenizers 0.19.1
[ "configuration-page", "slack-modal", "slack-modal", "configuration-page", "lightning-component" ]
restor/tcd-segformer-mit-b0
# Model Card for Restor's SegFormer-based TCD models This is a semantic segmentation model that can delineate tree cover in high-resolution (10 cm/px) aerial images. This model card is mostly the same for all similar models uploaded to Hugging Face. The model name refers to the specific architecture variant (e.g. nvidia-mit-b0 to nvidia-mit-b5) but the broad details for training and evaluation are identical. This repository is for `tcd-segformer-mit-b0` ## Model Details ### Model Description This semantic segmentation model was trained on global aerial imagery and is able to accurately delineate tree cover in similar images. The model does not detect individual trees, but provides a per-pixel classification of tree/no-tree. - **Developed by:** [Restor](https://restor.eco) / [ETH Zurich](https://ethz.ch) - **Funded by:** This project was made possible via a [Google.org impact grant](https://blog.google/outreach-initiatives/sustainability/restor-helps-anyone-be-part-ecological-restoration/) - **Model type:** Semantic segmentation (binary class) - **License:** Model training code is provided under an Apache-2 license. NVIDIA has released SegFormer under their own research license. Users should check the terms of this license before deploying. This model was trained on CC BY-NC imagery. - **Finetuned from model:** SegFormer family SegFormer is a variant of the Pyramid Vision Transformer v2 model, with many identical structural features and a semantic segmentation decode head. Functionally, the architecture is quite similar to a Feature Pyramid Network (FPN) as the output predictions are based on combining features from different stages of the network at different spatial resolutions. ### Model Sources - **Repository:** https://github.com/restor-foundation/tcd - **Paper:** We will release a preprint shortly. ## Uses The primary use-case for this model is assessing canopy cover from aerial images (i.e. the percentage of a study area that is covered by tree canopy). ### Direct Use This model is suitable for inference on a single image tile. For performing predictions on large orthomosaics, a higher-level framework is required to manage tiling source imagery and stitching predictions. Our repository provides a comprehensive reference implementation of such a pipeline and has been tested on extremely large images (country-scale). The model will give you predictions for an entire image. In most cases users will want to predict cover for a specific region of the image, for example a study plot or some other geographic boundary. If you predict tree cover in an image, you should perform some kind of region-of-interest analysis on the results. Our linked pipeline repository supports shapefile-based region analysis. ### Out-of-Scope Use While we trained the model on globally diverse imagery, some ecological biomes are under-represented in the training dataset and performance may vary. We therefore encourage users to experiment with their own imagery before using the model for any sort of mission-critical use. The model was trained on imagery at a resolution of 10 cm/px. You may be able to get good predictions at other geospatial resolutions, but the results may not be reliable. In particular, the model is essentially looking for "things that look like trees" and this is highly resolution-dependent. If you want to routinely predict images at a higher or lower resolution, you should fine-tune this model on your own data or on a resampled version of the training dataset.
The model does not predict biomass, canopy height or other derived information. It only predicts the likelihood that some pixel is covered by tree canopy. As-is, the model is not suitable for carbon credit estimation. ## Bias, Risks, and Limitations The main limitation of this model is false positives over objects that look like, or could be confused with, trees: for example, large bushes, shrubs or ground cover that looks like tree canopy. The dataset used to train this model was annotated by non-experts. We believe that this is a reasonable trade-off given the size of the dataset and the results on independent test data, as well as empirical evaluation during operational use at Restor on partner data. However, there are almost certainly incorrect labels in the dataset and this may translate into incorrect predictions or other biases in model output. We have observed that the models tend to "disagree" with training data in a way that is probably correct (i.e. the aggregate statistics of the labels are good) and we are working to re-evaluate all training data to remove spurious labels. We provide cross-validation results to give a robust estimate of prediction performance, as well as results on independent imagery (i.e. images the model has never seen) so users can make their own assessments. We do not provide any guarantees on accuracy and users should perform their own independent testing for any kind of "mission critical" or production use. There is no substitute for trying the model on your own data and performing your own evaluation; we strongly encourage experimentation! ## How to Get Started with the Model You can see a brief example of inference in [this Colab notebook](https://colab.research.google.com/drive/1N_rWko6jzGji3j_ayDR7ngT5lf4P8at_). For end-to-end usage, we direct users to our prediction and training [pipeline](https://github.com/restor-foundation/tcd) which also supports tiled prediction over arbitrarily large images, reporting outputs, etc. ## Training Details ### Training Data The training dataset may be found [here](https://huggingface.co/datasets/restor/tcd), where you can find more details about the collection and annotation procedure. Our image labels are largely released under a CC-BY 4.0 license, with smaller subsets of CC BY-NC and CC BY-SA imagery. ### Training Procedure We used a 5-fold cross-validation process to adjust hyperparameters during training, before training on the "full" training set and evaluating on a holdout set of images. The model in the main branch of this repository should be considered the release version. We used [Pytorch Lightning](https://lightning.ai/) as our training framework with hyperparameters listed below. The training procedure is straightforward and should be familiar to anyone with experience training deep neural networks. A typical training command using our pipeline for this model:

```bash
tcd-train semantic segformer-mit-b0 data.output= ... data.root=/mnt/data/tcd/dataset/holdout data.tile_size=1024
```

#### Preprocessing This repository contains a pre-processor configuration that can be used with the model, assuming you use the `transformers` library. You can load this preprocessor easily, e.g.:
```python
from transformers import AutoImageProcessor

processor = AutoImageProcessor.from_pretrained('restor/tcd-segformer-mit-b0')
```

Note that we do not resize input images (so that the geospatial scale of the source image is respected) and we assume that normalisation is performed in this processing step and not as a dataset transform. #### Training Hyperparameters - Image size: 1024 px square - Learning rate: initially 1e-4 to 1e-5 - Learning rate schedule: reduce on plateau - Optimizer: AdamW - Augmentation: random crop to 1024x1024, arbitrary rotation, flips, colour adjustments - Number of epochs: 75 during cross-validation to ensure convergence; 50 for final models - Normalisation: ImageNet statistics #### Speeds, Sizes, Times You should be able to evaluate the model on a CPU (even up to mit-b5); however, you will need a lot of available RAM if you try to infer large tile sizes. In general we find that 1024 px inputs are as large as you want to go, given the fixed size of the output segmentation masks (i.e. it is probably better to perform inference in batched mode at 1024x1024 px than to try to predict a single 2048x2048 px image). All models were trained on a single GPU with 24 GB VRAM (NVIDIA RTX3090) attached to a 32-core machine with 64 GB RAM. All but the largest models can be trained in under a day on a machine of this specification. The smallest models take under half a day, while the largest models take just over a day to train. Feedback we've received from users (in the field) is that landowners are often interested in seeing the results of aerial surveys, but data bandwidth is often a prohibitive factor in remote areas. One of our goals was to support this kind of in-field usage, so that users who fly a survey can process results offline and in a reasonable amount of time (i.e. on the order of an hour). ## Evaluation We report evaluation results on the OAM-TCD holdout split. ### Testing Data The training dataset may be found [here](https://huggingface.co/datasets/restor/tcd). This model (`main` branch) was trained on all `train` images and tested on the `test` (holdout) images. ![Training loss](train_loss.png) ### Metrics We report F1, Accuracy and IoU on the holdout dataset, as well as results on a 5-fold cross-validation split. Cross-validation is visualised as min/max error bars on the plots below. ### Results ![Validation loss](val_loss.png) ![IoU](val_jaccard_index.png) ![Accuracy (foreground)](val_multiclassaccuracy_tree.png) ![F1 Score](val_multiclassf1score_tree.png) ## Environmental Impact This estimate is the maximum (in terms of training time) for the SegFormer family of models presented here. Smaller models, such as `mit-b0`, train in less than half a day. - **Hardware Type:** NVIDIA RTX3090 - **Hours used:** < 36 - **Carbon Emitted:** 5.44 kg CO2 equivalent per model Carbon emissions were estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). This estimate does not take into account the time required for experimentation, failed training runs, etc. For example, since we used cross-validation, each model actually required approximately 6x this estimate: one run for each fold, plus the final run. Efficient inference on CPU is possible for field work, at the expense of inference latency. A typical single-battery drone flight can be processed in minutes. ## Citation We will provide a preprint version of our paper shortly.
In the meantime, please cite as: **BibTeX:**

```latex
@unpublished{restortcd,
  author = "Veitch-Michaelis, Josh and Cottam, Andrew and Schweizer, Daniella and Broadbent, Eben N. and Dao, David and Zhang, Ce and Almeyda Zambrano, Angelica and Max, Simeon",
  title = "OAM-TCD: A globally diverse dataset of high-resolution tree cover maps",
  note = "In prep.",
  month = "06",
  year = "2024"
}
```

## Model Card Authors Josh Veitch-Michaelis, 2024; on behalf of the dataset authors. ## Model Card Contact Please contact josh [at] restor.eco for questions or further information.
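Putting the preprocessor snippet above together with the model, a minimal single-tile sketch of the card's primary use case, estimating canopy cover as the fraction of tree pixels (a simplified illustration with a placeholder input; the full pipeline in the linked repository handles tiling, stitching and region-of-interest analysis):

```python
import torch
import torch.nn as nn
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForSemanticSegmentation

ckpt = "restor/tcd-segformer-mit-b0"
processor = AutoImageProcessor.from_pretrained(ckpt)
model = AutoModelForSemanticSegmentation.from_pretrained(ckpt)

tile = Image.open("aerial_tile_10cm.png").convert("RGB")  # placeholder 10 cm/px tile
with torch.no_grad():
    logits = model(**processor(images=tile, return_tensors="pt")).logits

# Upsample to the tile resolution; class 1 is "tree".
upsampled = nn.functional.interpolate(
    logits, size=tile.size[::-1], mode="bilinear", align_corners=False
)
tree_mask = upsampled.argmax(dim=1)[0] == 1
print(f"Canopy cover: {tree_mask.float().mean().item():.1%}")
```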
[ "__background__", "tree" ]
restor/tcd-segformer-mit-b1
# Model Card for Restor's SegFormer-based TCD models This is a semantic segmentation model that can delineate tree cover in high-resolution (10 cm/px) aerial images. This model card is mostly the same for all similar models uploaded to Hugging Face. The model name refers to the specific architecture variant (e.g. nvidia-mit-b0 to nvidia-mit-b5) but the broad details for training and evaluation are identical. This repository is for `tcd-segformer-mit-b1` ## Model Details ### Model Description This semantic segmentation model was trained on global aerial imagery and is able to accurately delineate tree cover in similar images. The model does not detect individual trees, but provides a per-pixel classification of tree/no-tree. - **Developed by:** [Restor](https://restor.eco) / [ETH Zurich](https://ethz.ch) - **Funded by:** This project was made possible via a [Google.org impact grant](https://blog.google/outreach-initiatives/sustainability/restor-helps-anyone-be-part-ecological-restoration/) - **Model type:** Semantic segmentation (binary class) - **License:** Model training code is provided under an Apache-2 license. NVIDIA has released SegFormer under their own research license. Users should check the terms of this license before deploying. This model was trained on CC BY-NC imagery. - **Finetuned from model:** SegFormer family SegFormer is a variant of the Pyramid Vision Transformer v2 model, with many identical structural features and a semantic segmentation decode head. Functionally, the architecture is quite similar to a Feature Pyramid Network (FPN) as the output predictions are based on combining features from different stages of the network at different spatial resolutions. ### Model Sources - **Repository:** https://github.com/restor-foundation/tcd - **Paper:** We will release a preprint shortly. ## Uses The primary use-case for this model is assessing canopy cover from aerial images (i.e. the percentage of a study area that is covered by tree canopy). ### Direct Use This model is suitable for inference on a single image tile. For performing predictions on large orthomosaics, a higher-level framework is required to manage tiling source imagery and stitching predictions. Our repository provides a comprehensive reference implementation of such a pipeline and has been tested on extremely large images (country-scale). The model will give you predictions for an entire image. In most cases users will want to predict cover for a specific region of the image, for example a study plot or some other geographic boundary. If you predict tree cover in an image, you should perform some kind of region-of-interest analysis on the results. Our linked pipeline repository supports shapefile-based region analysis. ### Out-of-Scope Use While we trained the model on globally diverse imagery, some ecological biomes are under-represented in the training dataset and performance may vary. We therefore encourage users to experiment with their own imagery before using the model for any sort of mission-critical use. The model was trained on imagery at a resolution of 10 cm/px. You may be able to get good predictions at other geospatial resolutions, but the results may not be reliable. In particular, the model is essentially looking for "things that look like trees" and this is highly resolution-dependent. If you want to routinely predict images at a higher or lower resolution, you should fine-tune this model on your own data or on a resampled version of the training dataset.
The model does not predict biomass, canopy height or other derived information. It only predicts the likelihood that some pixel is covered by tree canopy. As-is, the model is not suitable for carbon credit estimation. ## Bias, Risks, and Limitations The main limitation of this model is false positives over objects that look like, or could be confused with, trees: for example, large bushes, shrubs or ground cover that looks like tree canopy. The dataset used to train this model was annotated by non-experts. We believe that this is a reasonable trade-off given the size of the dataset and the results on independent test data, as well as empirical evaluation during operational use at Restor on partner data. However, there are almost certainly incorrect labels in the dataset and this may translate into incorrect predictions or other biases in model output. We have observed that the models tend to "disagree" with training data in a way that is probably correct (i.e. the aggregate statistics of the labels are good) and we are working to re-evaluate all training data to remove spurious labels. We provide cross-validation results to give a robust estimate of prediction performance, as well as results on independent imagery (i.e. images the model has never seen) so users can make their own assessments. We do not provide any guarantees on accuracy and users should perform their own independent testing for any kind of "mission critical" or production use. There is no substitute for trying the model on your own data and performing your own evaluation; we strongly encourage experimentation! ## How to Get Started with the Model You can see a brief example of inference in [this Colab notebook](https://colab.research.google.com/drive/1N_rWko6jzGji3j_ayDR7ngT5lf4P8at_). For end-to-end usage, we direct users to our prediction and training [pipeline](https://github.com/restor-foundation/tcd) which also supports tiled prediction over arbitrarily large images, reporting outputs, etc. ## Training Details ### Training Data The training dataset may be found [here](https://huggingface.co/datasets/restor/tcd), where you can find more details about the collection and annotation procedure. Our image labels are largely released under a CC-BY 4.0 license, with smaller subsets of CC BY-NC and CC BY-SA imagery. ### Training Procedure We used a 5-fold cross-validation process to adjust hyperparameters during training, before training on the "full" training set and evaluating on a holdout set of images. The model in the main branch of this repository should be considered the release version. We used [Pytorch Lightning](https://lightning.ai/) as our training framework with hyperparameters listed below. The training procedure is straightforward and should be familiar to anyone with experience training deep neural networks. A typical training command using our pipeline for this model:

```bash
tcd-train semantic segformer-mit-b1 data.output= ... data.root=/mnt/data/tcd/dataset/holdout data.tile_size=1024
```

#### Preprocessing This repository contains a pre-processor configuration that can be used with the model, assuming you use the `transformers` library. You can load this preprocessor easily, e.g.:
```python
from transformers import AutoImageProcessor

processor = AutoImageProcessor.from_pretrained('restor/tcd-segformer-mit-b1')
```

Note that we do not resize input images (so that the geospatial scale of the source image is respected) and we assume that normalisation is performed in this processing step and not as a dataset transform. #### Training Hyperparameters - Image size: 1024 px square - Learning rate: initially 1e-4 to 1e-5 - Learning rate schedule: reduce on plateau - Optimizer: AdamW - Augmentation: random crop to 1024x1024, arbitrary rotation, flips, colour adjustments - Number of epochs: 75 during cross-validation to ensure convergence; 50 for final models - Normalisation: ImageNet statistics #### Speeds, Sizes, Times You should be able to evaluate the model on a CPU (even up to mit-b5); however, you will need a lot of available RAM if you try to infer large tile sizes. In general we find that 1024 px inputs are as large as you want to go, given the fixed size of the output segmentation masks (i.e. it is probably better to perform inference in batched mode at 1024x1024 px than to try to predict a single 2048x2048 px image). All models were trained on a single GPU with 24 GB VRAM (NVIDIA RTX3090) attached to a 32-core machine with 64 GB RAM. All but the largest models can be trained in under a day on a machine of this specification. The smallest models take under half a day, while the largest models take just over a day to train. Feedback we've received from users (in the field) is that landowners are often interested in seeing the results of aerial surveys, but data bandwidth is often a prohibitive factor in remote areas. One of our goals was to support this kind of in-field usage, so that users who fly a survey can process results offline and in a reasonable amount of time (i.e. on the order of an hour). ## Evaluation We report evaluation results on the OAM-TCD holdout split. ### Testing Data The training dataset may be found [here](https://huggingface.co/datasets/restor/tcd). This model (`main` branch) was trained on all `train` images and tested on the `test` (holdout) images. ![Training loss](train_loss.png) ### Metrics We report F1, Accuracy and IoU on the holdout dataset, as well as results on a 5-fold cross-validation split. Cross-validation is visualised as min/max error bars on the plots below. ### Results ![Validation loss](val_loss.png) ![IoU](val_jaccard_index.png) ![Accuracy (foreground)](val_multiclassaccuracy_tree.png) ![F1 Score](val_multiclassf1score_tree.png) ## Environmental Impact This estimate is the maximum (in terms of training time) for the SegFormer family of models presented here. Smaller models, such as `mit-b0`, train in less than half a day. - **Hardware Type:** NVIDIA RTX3090 - **Hours used:** < 36 - **Carbon Emitted:** 5.44 kg CO2 equivalent per model Carbon emissions were estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). This estimate does not take into account the time required for experimentation, failed training runs, etc. For example, since we used cross-validation, each model actually required approximately 6x this estimate: one run for each fold, plus the final run. Efficient inference on CPU is possible for field work, at the expense of inference latency. A typical single-battery drone flight can be processed in minutes. ## Citation We will provide a preprint version of our paper shortly.
In the meantime, please cite as: **BibTeX:**

```latex
@unpublished{restortcd,
  author = "Veitch-Michaelis, Josh and Cottam, Andrew and Schweizer, Daniella and Broadbent, Eben N. and Dao, David and Zhang, Ce and Almeyda Zambrano, Angelica and Max, Simeon",
  title = "OAM-TCD: A globally diverse dataset of high-resolution tree cover maps",
  note = "In prep.",
  month = "06",
  year = "2024"
}
```

## Model Card Authors Josh Veitch-Michaelis, 2024; on behalf of the dataset authors. ## Model Card Contact Please contact josh [at] restor.eco for questions or further information.
[ "__background__", "tree" ]
restor/tcd-segformer-mit-b2
# Model Card for Restor's SegFormer-based TCD models This is a semantic segmentation model that can delineate tree cover in high-resolution (10 cm/px) aerial images. This model card is mostly the same for all similar models uploaded to Hugging Face. The model name refers to the specific architecture variant (e.g. nvidia-mit-b0 to nvidia-mit-b5) but the broad details for training and evaluation are identical. This repository is for `tcd-segformer-mit-b2` ## Model Details ### Model Description This semantic segmentation model was trained on global aerial imagery and is able to accurately delineate tree cover in similar images. The model does not detect individual trees, but provides a per-pixel classification of tree/no-tree. - **Developed by:** [Restor](https://restor.eco) / [ETH Zurich](https://ethz.ch) - **Funded by:** This project was made possible via a [Google.org impact grant](https://blog.google/outreach-initiatives/sustainability/restor-helps-anyone-be-part-ecological-restoration/) - **Model type:** Semantic segmentation (binary class) - **License:** Model training code is provided under an Apache-2 license. NVIDIA has released SegFormer under their own research license. Users should check the terms of this license before deploying. This model was trained on CC BY-NC imagery. - **Finetuned from model:** SegFormer family SegFormer is a variant of the Pyramid Vision Transformer v2 model, with many identical structural features and a semantic segmentation decode head. Functionally, the architecture is quite similar to a Feature Pyramid Network (FPN) as the output predictions are based on combining features from different stages of the network at different spatial resolutions. ### Model Sources - **Repository:** https://github.com/restor-foundation/tcd - **Paper:** We will release a preprint shortly. ## Uses The primary use-case for this model is assessing canopy cover from aerial images (i.e. the percentage of a study area that is covered by tree canopy). ### Direct Use This model is suitable for inference on a single image tile. For performing predictions on large orthomosaics, a higher-level framework is required to manage tiling source imagery and stitching predictions. Our repository provides a comprehensive reference implementation of such a pipeline and has been tested on extremely large images (country-scale). The model will give you predictions for an entire image. In most cases users will want to predict cover for a specific region of the image, for example a study plot or some other geographic boundary. If you predict tree cover in an image, you should perform some kind of region-of-interest analysis on the results. Our linked pipeline repository supports shapefile-based region analysis. ### Out-of-Scope Use While we trained the model on globally diverse imagery, some ecological biomes are under-represented in the training dataset and performance may vary. We therefore encourage users to experiment with their own imagery before using the model for any sort of mission-critical use. The model was trained on imagery at a resolution of 10 cm/px. You may be able to get good predictions at other geospatial resolutions, but the results may not be reliable. In particular, the model is essentially looking for "things that look like trees" and this is highly resolution-dependent. If you want to routinely predict images at a higher or lower resolution, you should fine-tune this model on your own data or on a resampled version of the training dataset.
The model does not predict biomass, canopy height or other derived information. It only predicts the likelihood that some pixel is covered by tree canopy. As-is, the model is not suitable for carbon credit estimation. ## Bias, Risks, and Limitations The main limitation of this model is false positives over objects that look like, or could be confused with, trees: for example, large bushes, shrubs or ground cover that looks like tree canopy. The dataset used to train this model was annotated by non-experts. We believe that this is a reasonable trade-off given the size of the dataset and the results on independent test data, as well as empirical evaluation during operational use at Restor on partner data. However, there are almost certainly incorrect labels in the dataset and this may translate into incorrect predictions or other biases in model output. We have observed that the models tend to "disagree" with training data in a way that is probably correct (i.e. the aggregate statistics of the labels are good) and we are working to re-evaluate all training data to remove spurious labels. We provide cross-validation results to give a robust estimate of prediction performance, as well as results on independent imagery (i.e. images the model has never seen) so users can make their own assessments. We do not provide any guarantees on accuracy and users should perform their own independent testing for any kind of "mission critical" or production use. There is no substitute for trying the model on your own data and performing your own evaluation; we strongly encourage experimentation! ## How to Get Started with the Model You can see a brief example of inference in [this Colab notebook](https://colab.research.google.com/drive/1N_rWko6jzGji3j_ayDR7ngT5lf4P8at_). For end-to-end usage, we direct users to our prediction and training [pipeline](https://github.com/restor-foundation/tcd) which also supports tiled prediction over arbitrarily large images, reporting outputs, etc. ## Training Details ### Training Data The training dataset may be found [here](https://huggingface.co/datasets/restor/tcd), where you can find more details about the collection and annotation procedure. Our image labels are largely released under a CC-BY 4.0 license, with smaller subsets of CC BY-NC and CC BY-SA imagery. ### Training Procedure We used a 5-fold cross-validation process to adjust hyperparameters during training, before training on the "full" training set and evaluating on a holdout set of images. The model in the main branch of this repository should be considered the release version. We used [Pytorch Lightning](https://lightning.ai/) as our training framework with hyperparameters listed below. The training procedure is straightforward and should be familiar to anyone with experience training deep neural networks. A typical training command using our pipeline for this model:

```bash
tcd-train semantic segformer-mit-b2 data.output= ... data.root=/mnt/data/tcd/dataset/holdout data.tile_size=1024
```

#### Preprocessing This repository contains a pre-processor configuration that can be used with the model, assuming you use the `transformers` library. You can load this preprocessor easily, e.g.:
```python
from transformers import AutoImageProcessor

processor = AutoImageProcessor.from_pretrained('restor/tcd-segformer-mit-b2')
```

Note that we do not resize input images (so that the geospatial scale of the source image is respected) and we assume that normalisation is performed in this processing step and not as a dataset transform. #### Training Hyperparameters - Image size: 1024 px square - Learning rate: initially 1e-4 to 1e-5 - Learning rate schedule: reduce on plateau - Optimizer: AdamW - Augmentation: random crop to 1024x1024, arbitrary rotation, flips, colour adjustments - Number of epochs: 75 during cross-validation to ensure convergence; 50 for final models - Normalisation: ImageNet statistics #### Speeds, Sizes, Times You should be able to evaluate the model on a CPU (even up to mit-b5); however, you will need a lot of available RAM if you try to infer large tile sizes. In general we find that 1024 px inputs are as large as you want to go, given the fixed size of the output segmentation masks (i.e. it is probably better to perform inference in batched mode at 1024x1024 px than to try to predict a single 2048x2048 px image). All models were trained on a single GPU with 24 GB VRAM (NVIDIA RTX3090) attached to a 32-core machine with 64 GB RAM. All but the largest models can be trained in under a day on a machine of this specification. The smallest models take under half a day, while the largest models take just over a day to train. Feedback we've received from users (in the field) is that landowners are often interested in seeing the results of aerial surveys, but data bandwidth is often a prohibitive factor in remote areas. One of our goals was to support this kind of in-field usage, so that users who fly a survey can process results offline and in a reasonable amount of time (i.e. on the order of an hour). ## Evaluation We report evaluation results on the OAM-TCD holdout split. ### Testing Data The training dataset may be found [here](https://huggingface.co/datasets/restor/tcd). This model (`main` branch) was trained on all `train` images and tested on the `test` (holdout) images. ![Training loss](train_loss.png) ### Metrics We report F1, Accuracy and IoU on the holdout dataset, as well as results on a 5-fold cross-validation split. Cross-validation is visualised as min/max error bars on the plots below. ### Results ![Validation loss](val_loss.png) ![IoU](val_jaccard_index.png) ![Accuracy (foreground)](val_multiclassaccuracy_tree.png) ![F1 Score](val_multiclassf1score_tree.png) ## Environmental Impact This estimate is the maximum (in terms of training time) for the SegFormer family of models presented here. Smaller models, such as `mit-b0`, train in less than half a day. - **Hardware Type:** NVIDIA RTX3090 - **Hours used:** < 36 - **Carbon Emitted:** 5.44 kg CO2 equivalent per model Carbon emissions were estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). This estimate does not take into account the time required for experimentation, failed training runs, etc. For example, since we used cross-validation, each model actually required approximately 6x this estimate: one run for each fold, plus the final run. Efficient inference on CPU is possible for field work, at the expense of inference latency. A typical single-battery drone flight can be processed in minutes. ## Citation We will provide a preprint version of our paper shortly.
## Citation

We will provide a preprint version of our paper shortly. In the meantime, please cite as:

**BibTeX:**

```latex
@unpublished{restortcd,
  author = "Veitch-Michaelis, Josh and Cottam, Andrew and Schweizer, Daniella and Broadbent, Eben N. and Dao, David and Zhang, Ce and Almeyda Zambrano, Angelica and Max, Simeon",
  title  = "OAM-TCD: A globally diverse dataset of high-resolution tree cover maps",
  note   = "In prep.",
  month  = "06",
  year   = "2024"
}
```

## Model Card Authors

Josh Veitch-Michaelis, 2024; on behalf of the dataset authors.

## Model Card Contact

Please contact josh [at] restor.eco for questions or further information.
[ "__background__", "tree" ]
restor/tcd-segformer-mit-b3
# Model Card for Restor's SegFormer-based TCD models

This is a semantic segmentation model that can delineate tree cover in high resolution (10 cm/px) aerial images.

This model card is mostly the same for all similar models uploaded to Hugging Face. The model name refers to the specific architecture variant (e.g. nvidia-mit-b0 to nvidia-mit-b5), but the broad details for training and evaluation are identical.

This repository is for `tcd-segformer-mit-b3`.

## Model Details

### Model Description

This semantic segmentation model was trained on global aerial imagery and is able to accurately delineate tree cover in similar images. The model does not detect individual trees, but provides a per-pixel classification of tree/no-tree.

- **Developed by:** [Restor](https://restor.eco) / [ETH Zurich](https://ethz.ch)
- **Funded by:** This project was made possible via a [Google.org impact grant](https://blog.google/outreach-initiatives/sustainability/restor-helps-anyone-be-part-ecological-restoration/)
- **Model type:** Semantic segmentation (binary class)
- **License:** Model training code is provided under an Apache-2 license. NVIDIA has released SegFormer under their own research license; users should check the terms of this license before deploying. This model was trained on CC BY-NC imagery.
- **Finetuned from model:** SegFormer family

SegFormer is a variant of the Pyramid Vision Transformer v2 model, with many identical structural features and a semantic segmentation decode head. Functionally, the architecture is quite similar to a Feature Pyramid Network (FPN), as the output predictions are based on combining features from different stages of the network at different spatial resolutions.

### Model Sources

- **Repository:** https://github.com/restor-foundation/tcd
- **Paper:** We will release a preprint shortly.

## Uses

The primary use-case for this model is assessing canopy cover from aerial images (i.e. the percentage of a study area that is covered by tree canopy).

### Direct Use

This model is suitable for inference on a single image tile. For performing predictions on large orthomosaics, a higher-level framework is required to manage tiling source imagery and stitching predictions. Our repository provides a comprehensive reference implementation of such a pipeline and has been tested on extremely large images (country-scale).

The model will give you predictions for an entire image. In most cases users will want to predict cover for a specific region of the image, for example a study plot or some other geographic boundary. If you predict tree cover in an image, you should perform some kind of region-of-interest analysis on the results. Our linked pipeline repository supports shapefile-based region analysis.

### Out-of-Scope Use

While we trained the model on globally diverse imagery, some ecological biomes are under-represented in the training dataset and performance may vary. We therefore encourage users to experiment with their own imagery before using the model for any sort of mission-critical use.

The model was trained on imagery at a resolution of 10 cm/px. You may be able to get good predictions at other geospatial resolutions, but the results may not be reliable. In particular, the model is essentially looking for "things that look like trees" and this is highly resolution-dependent. If you want to routinely predict images at a higher or lower resolution, you should fine-tune this model on your own or a resampled version of the training dataset.
The model does not predict biomass, canopy height or other derived information. It only predicts the likelihood that a given pixel is covered by tree canopy. As-is, the model is not suitable for carbon credit estimation.

## Bias, Risks, and Limitations

The main limitation of this model is false positives over objects that look like, or could be confused with, trees, for example large bushes, shrubs or ground cover that resembles tree canopy.

The dataset used to train this model was annotated by non-experts. We believe that this is a reasonable trade-off given the size of the dataset and the results on independent test data, as well as empirical evaluation during operational use at Restor on partner data. However, there are almost certainly incorrect labels in the dataset and this may translate into incorrect predictions or other biases in model output. We have observed that the models tend to "disagree" with training data in a way that is probably correct (i.e. the aggregate statistics of the labels are good) and we are working to re-evaluate all training data to remove spurious labels.

We provide cross-validation results to give a robust estimate of prediction performance, as well as results on independent imagery (i.e. images the model has never seen) so users can make their own assessments. We do not provide any guarantees on accuracy and users should perform their own independent testing for any kind of "mission critical" or production use.

There is no substitute for trying the model on your own data and performing your own evaluation; we strongly encourage experimentation!

## How to Get Started with the Model

You can see a brief example of inference in [this Colab notebook](https://colab.research.google.com/drive/1N_rWko6jzGji3j_ayDR7ngT5lf4P8at_).

For end-to-end usage, we direct users to our prediction and training [pipeline](https://github.com/restor-foundation/tcd), which also supports tiled prediction over arbitrarily large images, reporting outputs, etc.

## Training Details

### Training Data

The training dataset may be found [here](https://huggingface.co/datasets/restor/tcd), where you can find more details about the collection and annotation procedure. Our image labels are largely released under a CC BY 4.0 license, with smaller subsets of CC BY-NC and CC BY-SA imagery.

### Training Procedure

We used a 5-fold cross-validation process to adjust hyperparameters during training, before training on the "full" training set and evaluating on a holdout set of images. The model in the `main` branch of this repository should be considered the release version.

We used [Pytorch Lightning](https://lightning.ai/) as our training framework with the hyperparameters listed below. The training procedure is straightforward and should be familiar to anyone with experience training deep neural networks.

A typical training command using our pipeline for this model:

```bash
tcd-train semantic segformer-mit-b3 data.output= ... data.root=/mnt/data/tcd/dataset/holdout data.tile_size=1024
```

#### Preprocessing

This repository contains a pre-processor configuration that can be used with the model, assuming you use the `transformers` library. You can load this preprocessor easily, e.g.:
```python
from transformers import AutoImageProcessor

processor = AutoImageProcessor.from_pretrained('restor/tcd-segformer-mit-b3')
```

Note that we do not resize input images (so that the geospatial scale of the source image is respected) and we assume that normalisation is performed in this processing step and not as a dataset transform.

#### Training Hyperparameters

- Image size: 1024 px square
- Learning rate: initially 1e-4 to 1e-5
- Learning rate schedule: reduce on plateau
- Optimizer: AdamW
- Augmentation: random crop to 1024x1024, arbitrary rotation, flips, colour adjustments
- Number of epochs: 75 during cross-validation to ensure convergence; 50 for final models
- Normalisation: ImageNet statistics

#### Speeds, Sizes, Times

You should be able to evaluate the model on a CPU (even up to mit-b5); however, you will need a lot of available RAM if you try to infer large tile sizes. In general, we find that 1024 px inputs are as large as you want to go, given the fixed size of the output segmentation masks (i.e. it is probably better to perform inference in batched mode at 1024x1024 px than to try to predict a single 2048x2048 px image).

All models were trained on a single GPU with 24 GB VRAM (NVIDIA RTX3090) attached to a 32-core machine with 64 GB RAM. All but the largest models can be trained in under a day on a machine of this specification. The smallest models take under half a day, while the largest models take just over a day to train.

Feedback we've received from users (in the field) is that landowners are often interested in seeing the results of aerial surveys, but data bandwidth is often a prohibitive factor in remote areas. One of our goals was to support this kind of in-field usage, so that users who fly a survey can process results offline and in a reasonable amount of time (i.e. on the order of an hour).

## Evaluation

We report evaluation results on the OAM-TCD holdout split.

### Testing Data

The training dataset may be found [here](https://huggingface.co/datasets/restor/tcd). This model (`main` branch) was trained on all `train` images and tested on the `test` (holdout) images.

![Training loss](train_loss.png)

### Metrics

We report F1, Accuracy and IoU on the holdout dataset, as well as results on a 5-fold cross-validation split. Cross-validation is visualised as min/max error bars on the plots below.

### Results

![Validation loss](val_loss.png)
![IoU](val_jaccard_index.png)
![Accuracy (foreground)](val_multiclassaccuracy_tree.png)
![F1 Score](val_multiclassf1score_tree.png)

## Environmental Impact

This estimate is the maximum (in terms of training time) for the SegFormer family of models presented here. Smaller models, such as `mit-b0`, train in less than half a day.

- **Hardware Type:** NVIDIA RTX3090
- **Hours used:** < 36
- **Carbon Emitted:** 5.44 kg CO2 equivalent per model

Carbon emissions were estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

This estimate does not take into account the time required for experimentation, failed training runs, etc. For example, since we used cross-validation, each model actually required approximately 6x this estimate: one run for each fold, plus the final run.

Efficient inference on CPU is possible for field work, at the expense of inference latency. A typical single-battery drone flight can be processed in minutes.
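To make the tiling guidance under "Speeds, Sizes, Times" concrete, here is a minimal sketch of naive inference over 1024 px tiles. It assumes the mosaic's sides are exact multiples of the tile size and uses no overlap or blending (which the real pipeline does handle); file names are placeholders.

```python
import torch
import torch.nn as nn
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForSemanticSegmentation

TILE = 1024

processor = AutoImageProcessor.from_pretrained("restor/tcd-segformer-mit-b3")
model = AutoModelForSemanticSegmentation.from_pretrained("restor/tcd-segformer-mit-b3")
model.eval()

# Placeholder mosaic whose width and height are exact multiples of TILE.
mosaic = Image.open("orthomosaic.png").convert("RGB")
cols, rows = mosaic.width // TILE, mosaic.height // TILE
tiles = [
    mosaic.crop((c * TILE, r * TILE, (c + 1) * TILE, (r + 1) * TILE))
    for r in range(rows)
    for c in range(cols)
]

full_mask = torch.zeros((mosaic.height, mosaic.width), dtype=torch.long)
with torch.no_grad():
    for i, tile in enumerate(tiles):  # batch several tiles at once if RAM allows
        logits = model(**processor(images=tile, return_tensors="pt")).logits
        mask = nn.functional.interpolate(
            logits, size=(TILE, TILE), mode="bilinear", align_corners=False
        ).argmax(dim=1)[0]
        # Stitch the per-tile mask back into the full-size canopy mask.
        r, c = divmod(i, cols)
        full_mask[r * TILE:(r + 1) * TILE, c * TILE:(c + 1) * TILE] = mask
```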
## Citation

We will provide a preprint version of our paper shortly. In the meantime, please cite as:

**BibTeX:**

```latex
@unpublished{restortcd,
  author = "Veitch-Michaelis, Josh and Cottam, Andrew and Schweizer, Daniella and Broadbent, Eben N. and Dao, David and Zhang, Ce and Almeyda Zambrano, Angelica and Max, Simeon",
  title  = "OAM-TCD: A globally diverse dataset of high-resolution tree cover maps",
  note   = "In prep.",
  month  = "06",
  year   = "2024"
}
```

## Model Card Authors

Josh Veitch-Michaelis, 2024; on behalf of the dataset authors.

## Model Card Contact

Please contact josh [at] restor.eco for questions or further information.
[ "__background__", "tree" ]
restor/tcd-segformer-mit-b4
# Model Card for Restor's SegFormer-based TCD models

This is a semantic segmentation model that can delineate tree cover in high resolution (10 cm/px) aerial images.

This model card is mostly the same for all similar models uploaded to Hugging Face. The model name refers to the specific architecture variant (e.g. nvidia-mit-b0 to nvidia-mit-b5), but the broad details for training and evaluation are identical.

This repository is for `tcd-segformer-mit-b4`.

## Model Details

### Model Description

This semantic segmentation model was trained on global aerial imagery and is able to accurately delineate tree cover in similar images. The model does not detect individual trees, but provides a per-pixel classification of tree/no-tree.

- **Developed by:** [Restor](https://restor.eco) / [ETH Zurich](https://ethz.ch)
- **Funded by:** This project was made possible via a [Google.org impact grant](https://blog.google/outreach-initiatives/sustainability/restor-helps-anyone-be-part-ecological-restoration/)
- **Model type:** Semantic segmentation (binary class)
- **License:** Model training code is provided under an Apache-2 license. NVIDIA has released SegFormer under their own research license; users should check the terms of this license before deploying. This model was trained on CC BY-NC imagery.
- **Finetuned from model:** SegFormer family

SegFormer is a variant of the Pyramid Vision Transformer v2 model, with many identical structural features and a semantic segmentation decode head. Functionally, the architecture is quite similar to a Feature Pyramid Network (FPN), as the output predictions are based on combining features from different stages of the network at different spatial resolutions.

### Model Sources

- **Repository:** https://github.com/restor-foundation/tcd
- **Paper:** We will release a preprint shortly.

## Uses

The primary use-case for this model is assessing canopy cover from aerial images (i.e. the percentage of a study area that is covered by tree canopy).

### Direct Use

This model is suitable for inference on a single image tile. For performing predictions on large orthomosaics, a higher-level framework is required to manage tiling source imagery and stitching predictions. Our repository provides a comprehensive reference implementation of such a pipeline and has been tested on extremely large images (country-scale).

The model will give you predictions for an entire image. In most cases users will want to predict cover for a specific region of the image, for example a study plot or some other geographic boundary. If you predict tree cover in an image, you should perform some kind of region-of-interest analysis on the results. Our linked pipeline repository supports shapefile-based region analysis.

### Out-of-Scope Use

While we trained the model on globally diverse imagery, some ecological biomes are under-represented in the training dataset and performance may vary. We therefore encourage users to experiment with their own imagery before using the model for any sort of mission-critical use.

The model was trained on imagery at a resolution of 10 cm/px. You may be able to get good predictions at other geospatial resolutions, but the results may not be reliable. In particular, the model is essentially looking for "things that look like trees" and this is highly resolution-dependent. If you want to routinely predict images at a higher or lower resolution, you should fine-tune this model on your own or a resampled version of the training dataset.
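For instance, if your imagery has a finer ground sample distance, a simple first experiment (sketched below with made-up values and file names) is to resample it to roughly 10 cm/px before running inference:

```python
from PIL import Image

# Placeholder values: a 5 cm/px drone tile resampled to the 10 cm/px
# resolution this model was trained at.
src_gsd_cm, target_gsd_cm = 5.0, 10.0

image = Image.open("drone_tile.png")
scale = src_gsd_cm / target_gsd_cm
resampled = image.resize(
    (round(image.width * scale), round(image.height * scale)),
    resample=Image.BILINEAR,
)
resampled.save("drone_tile_10cm.png")
```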
The model does not predict biomass, canopy height or other derived information. It only predicts the likelihood that a given pixel is covered by tree canopy. As-is, the model is not suitable for carbon credit estimation.

## Bias, Risks, and Limitations

The main limitation of this model is false positives over objects that look like, or could be confused with, trees, for example large bushes, shrubs or ground cover that resembles tree canopy.

The dataset used to train this model was annotated by non-experts. We believe that this is a reasonable trade-off given the size of the dataset and the results on independent test data, as well as empirical evaluation during operational use at Restor on partner data. However, there are almost certainly incorrect labels in the dataset and this may translate into incorrect predictions or other biases in model output. We have observed that the models tend to "disagree" with training data in a way that is probably correct (i.e. the aggregate statistics of the labels are good) and we are working to re-evaluate all training data to remove spurious labels.

We provide cross-validation results to give a robust estimate of prediction performance, as well as results on independent imagery (i.e. images the model has never seen) so users can make their own assessments. We do not provide any guarantees on accuracy and users should perform their own independent testing for any kind of "mission critical" or production use.

There is no substitute for trying the model on your own data and performing your own evaluation; we strongly encourage experimentation!

## How to Get Started with the Model

You can see a brief example of inference in [this Colab notebook](https://colab.research.google.com/drive/1N_rWko6jzGji3j_ayDR7ngT5lf4P8at_).

For end-to-end usage, we direct users to our prediction and training [pipeline](https://github.com/restor-foundation/tcd), which also supports tiled prediction over arbitrarily large images, reporting outputs, etc.

## Training Details

### Training Data

The training dataset may be found [here](https://huggingface.co/datasets/restor/tcd), where you can find more details about the collection and annotation procedure. Our image labels are largely released under a CC BY 4.0 license, with smaller subsets of CC BY-NC and CC BY-SA imagery.

### Training Procedure

We used a 5-fold cross-validation process to adjust hyperparameters during training, before training on the "full" training set and evaluating on a holdout set of images. The model in the `main` branch of this repository should be considered the release version.

We used [Pytorch Lightning](https://lightning.ai/) as our training framework with the hyperparameters listed below. The training procedure is straightforward and should be familiar to anyone with experience training deep neural networks.

A typical training command using our pipeline for this model:

```bash
tcd-train semantic segformer-mit-b4 data.output= ... data.root=/mnt/data/tcd/dataset/holdout data.tile_size=1024
```

#### Preprocessing

This repository contains a pre-processor configuration that can be used with the model, assuming you use the `transformers` library. You can load this preprocessor easily, e.g.:
```python
from transformers import AutoImageProcessor

processor = AutoImageProcessor.from_pretrained('restor/tcd-segformer-mit-b4')
```

Note that we do not resize input images (so that the geospatial scale of the source image is respected) and we assume that normalisation is performed in this processing step and not as a dataset transform.

#### Training Hyperparameters

- Image size: 1024 px square
- Learning rate: initially 1e-4 to 1e-5
- Learning rate schedule: reduce on plateau
- Optimizer: AdamW
- Augmentation: random crop to 1024x1024, arbitrary rotation, flips, colour adjustments
- Number of epochs: 75 during cross-validation to ensure convergence; 50 for final models
- Normalisation: ImageNet statistics

#### Speeds, Sizes, Times

You should be able to evaluate the model on a CPU (even up to mit-b5); however, you will need a lot of available RAM if you try to infer large tile sizes. In general, we find that 1024 px inputs are as large as you want to go, given the fixed size of the output segmentation masks (i.e. it is probably better to perform inference in batched mode at 1024x1024 px than to try to predict a single 2048x2048 px image).

All models were trained on a single GPU with 24 GB VRAM (NVIDIA RTX3090) attached to a 32-core machine with 64 GB RAM. All but the largest models can be trained in under a day on a machine of this specification. The smallest models take under half a day, while the largest models take just over a day to train.

Feedback we've received from users (in the field) is that landowners are often interested in seeing the results of aerial surveys, but data bandwidth is often a prohibitive factor in remote areas. One of our goals was to support this kind of in-field usage, so that users who fly a survey can process results offline and in a reasonable amount of time (i.e. on the order of an hour).

## Evaluation

We report evaluation results on the OAM-TCD holdout split.

### Testing Data

The training dataset may be found [here](https://huggingface.co/datasets/restor/tcd). This model (`main` branch) was trained on all `train` images and tested on the `test` (holdout) images.

![Training loss](train_loss.png)

### Metrics

We report F1, Accuracy and IoU on the holdout dataset, as well as results on a 5-fold cross-validation split. Cross-validation is visualised as min/max error bars on the plots below.

### Results

![Validation loss](val_loss.png)
![IoU](val_jaccard_index.png)
![Accuracy (foreground)](val_multiclassaccuracy_tree.png)
![F1 Score](val_multiclassf1score_tree.png)

## Environmental Impact

This estimate is the maximum (in terms of training time) for the SegFormer family of models presented here. Smaller models, such as `mit-b0`, train in less than half a day.

- **Hardware Type:** NVIDIA RTX3090
- **Hours used:** < 36
- **Carbon Emitted:** 5.44 kg CO2 equivalent per model

Carbon emissions were estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

This estimate does not take into account the time required for experimentation, failed training runs, etc. For example, since we used cross-validation, each model actually required approximately 6x this estimate: one run for each fold, plus the final run.

Efficient inference on CPU is possible for field work, at the expense of inference latency. A typical single-battery drone flight can be processed in minutes.
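If you want to sanity-check the preprocessing behaviour described above (no resizing, normalisation handled by the processor), you can inspect the loaded configuration. The attribute names below are those used by the `transformers` SegFormer image processor; what values the repository's config actually sets is an assumption to verify, not a guarantee.

```python
from transformers import AutoImageProcessor

processor = AutoImageProcessor.from_pretrained("restor/tcd-segformer-mit-b4")

# Expect resizing to be disabled and normalisation enabled,
# if the repository's pre-processor config matches this card.
print(processor.do_resize, processor.do_normalize)
print(processor.image_mean, processor.image_std)  # ImageNet statistics
```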
## Citation

We will provide a preprint version of our paper shortly. In the meantime, please cite as:

**BibTeX:**

```latex
@unpublished{restortcd,
  author = "Veitch-Michaelis, Josh and Cottam, Andrew and Schweizer, Daniella and Broadbent, Eben N. and Dao, David and Zhang, Ce and Almeyda Zambrano, Angelica and Max, Simeon",
  title  = "OAM-TCD: A globally diverse dataset of high-resolution tree cover maps",
  note   = "In prep.",
  month  = "06",
  year   = "2024"
}
```

## Model Card Authors

Josh Veitch-Michaelis, 2024; on behalf of the dataset authors.

## Model Card Contact

Please contact josh [at] restor.eco for questions or further information.
[ "__background__", "tree" ]
restor/tcd-segformer-mit-b5
# Model Card for Restor's SegFormer-based TCD models

This is a semantic segmentation model that can delineate tree cover in high resolution (10 cm/px) aerial images.

This model card is mostly the same for all similar models uploaded to Hugging Face. The model name refers to the specific architecture variant (e.g. nvidia-mit-b0 to nvidia-mit-b5), but the broad details for training and evaluation are identical.

This repository is for `tcd-segformer-mit-b5`.

## Citation and contact

**BibTeX:**

This paper was accepted into NeurIPS 2024 under the Datasets and Benchmarks track. The citation will be updated once the final version is confirmed and the proceedings are online.

```latex
@inproceedings{restortcd,
  author    = {Veitch-Michaelis, Josh and Cottam, Andrew and Schweizer, Daniella and Broadbent, Eben N. and Dao, David and Zhang, Ce and Almeyda Zambrano, Angelica and Max, Simeon},
  title     = {OAM-TCD: A globally diverse dataset of high-resolution tree cover maps},
  booktitle = {Advances in Neural Information Processing Systems},
  pages     = {1--12},
  publisher = {Curran Associates, Inc.},
  volume    = {37},
  year      = {2024}
}
```

Please contact josh [at] restor.eco for questions or further information.

## Model Details

### Model Description

This semantic segmentation model was trained on global aerial imagery and is able to accurately delineate tree cover in similar images. The model does not detect individual trees, but provides a per-pixel classification of tree/no-tree.

- **Developed by:** [Restor](https://restor.eco) / [ETH Zurich](https://ethz.ch)
- **Funded by:** This project was made possible via a [Google.org impact grant](https://blog.google/outreach-initiatives/sustainability/restor-helps-anyone-be-part-ecological-restoration/)
- **Model type:** Semantic segmentation (binary class)
- **License:** Model training code is provided under an Apache-2 license. NVIDIA has released SegFormer under their own research license; users should check the terms of this license before deploying. This model was trained on CC BY-NC imagery.
- **Finetuned from model:** SegFormer family

SegFormer is a variant of the Pyramid Vision Transformer v2 model, with many identical structural features and a semantic segmentation decode head. Functionally, the architecture is quite similar to a Feature Pyramid Network (FPN), as the output predictions are based on combining features from different stages of the network at different spatial resolutions.

### Model Sources

- **Repository:** https://github.com/restor-foundation/tcd
- **Paper:** We will release a preprint shortly.

## Uses

The primary use-case for this model is assessing canopy cover from aerial images (i.e. the percentage of a study area that is covered by tree canopy).

### Direct Use

This model is suitable for inference on a single image tile. For performing predictions on large orthomosaics, a higher-level framework is required to manage tiling source imagery and stitching predictions. Our repository provides a comprehensive reference implementation of such a pipeline and has been tested on extremely large images (country-scale).

The model will give you predictions for an entire image. In most cases users will want to predict cover for a specific region of the image, for example a study plot or some other geographic boundary. If you predict tree cover in an image, you should perform some kind of region-of-interest analysis on the results. Our linked pipeline repository supports shapefile-based region analysis.
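As a toy illustration of such region-of-interest analysis (the pipeline does this properly from shapefiles), the cover statistic can be restricted to a boolean plot mask; all arrays below are placeholders.

```python
import numpy as np

# Placeholders: `pred` stands in for the model's (H, W) tree/no-tree mask
# and `plot_mask` marks the study plot, e.g. rasterised from a shapefile.
pred = np.zeros((1024, 1024), dtype=np.uint8)
plot_mask = np.zeros((1024, 1024), dtype=bool)
plot_mask[256:768, 256:768] = True  # a made-up rectangular plot

cover = 100 * pred[plot_mask].mean()
print(f"Canopy cover inside the plot: {cover:.1f}%")
```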
### Out-of-Scope Use

While we trained the model on globally diverse imagery, some ecological biomes are under-represented in the training dataset and performance may vary. We therefore encourage users to experiment with their own imagery before using the model for any sort of mission-critical use.

The model was trained on imagery at a resolution of 10 cm/px. You may be able to get good predictions at other geospatial resolutions, but the results may not be reliable. In particular, the model is essentially looking for "things that look like trees" and this is highly resolution-dependent. If you want to routinely predict images at a higher or lower resolution, you should fine-tune this model on your own or a resampled version of the training dataset.

The model does not predict biomass, canopy height or other derived information. It only predicts the likelihood that a given pixel is covered by tree canopy. As-is, the model is not suitable for carbon credit estimation.

## Bias, Risks, and Limitations

The main limitation of this model is false positives over objects that look like, or could be confused with, trees, for example large bushes, shrubs or ground cover that resembles tree canopy.

The dataset used to train this model was annotated by non-experts. We believe that this is a reasonable trade-off given the size of the dataset and the results on independent test data, as well as empirical evaluation during operational use at Restor on partner data. However, there are almost certainly incorrect labels in the dataset and this may translate into incorrect predictions or other biases in model output. We have observed that the models tend to "disagree" with training data in a way that is probably correct (i.e. the aggregate statistics of the labels are good) and we are working to re-evaluate all training data to remove spurious labels.

We provide cross-validation results to give a robust estimate of prediction performance, as well as results on independent imagery (i.e. images the model has never seen) so users can make their own assessments. We do not provide any guarantees on accuracy and users should perform their own independent testing for any kind of "mission critical" or production use.

There is no substitute for trying the model on your own data and performing your own evaluation; we strongly encourage experimentation!

## How to Get Started with the Model

You can see a brief example of inference in [this Colab notebook](https://colab.research.google.com/drive/1N_rWko6jzGji3j_ayDR7ngT5lf4P8at_).

For end-to-end usage, we direct users to our prediction and training [pipeline](https://github.com/restor-foundation/tcd), which also supports tiled prediction over arbitrarily large images, reporting outputs, etc.

## Training Details

### Training Data

The training dataset may be found [here](https://huggingface.co/datasets/restor/tcd), where you can find more details about the collection and annotation procedure. Our image labels are largely released under a CC BY 4.0 license, with smaller subsets of CC BY-NC and CC BY-SA imagery.

### Training Procedure

We used a 5-fold cross-validation process to adjust hyperparameters during training, before training on the "full" training set and evaluating on a holdout set of images. The model in the `main` branch of this repository should be considered the release version.

We used [Pytorch Lightning](https://lightning.ai/) as our training framework with the hyperparameters listed below.
The training procedure is straightforward and should be familiar to anyone with experience training deep neural networks.

A typical training command using our pipeline for this model:

```bash
tcd-train semantic segformer-mit-b5 data.output= ... data.root=/mnt/data/tcd/dataset/holdout data.tile_size=1024
```

#### Preprocessing

This repository contains a pre-processor configuration that can be used with the model, assuming you use the `transformers` library. You can load this preprocessor easily, e.g.:

```python
from transformers import AutoImageProcessor

processor = AutoImageProcessor.from_pretrained('restor/tcd-segformer-mit-b5')
```

Note that we do not resize input images (so that the geospatial scale of the source image is respected) and we assume that normalisation is performed in this processing step and not as a dataset transform.

#### Training Hyperparameters

- Image size: 1024 px square
- Learning rate: initially 1e-4 to 1e-5
- Learning rate schedule: reduce on plateau
- Optimizer: AdamW
- Augmentation: random crop to 1024x1024, arbitrary rotation, flips, colour adjustments
- Number of epochs: 75 during cross-validation to ensure convergence; 50 for final models
- Normalisation: ImageNet statistics

#### Speeds, Sizes, Times

You should be able to evaluate the model on a CPU (even up to mit-b5); however, you will need a lot of available RAM if you try to infer large tile sizes. In general, we find that 1024 px inputs are as large as you want to go, given the fixed size of the output segmentation masks (i.e. it is probably better to perform inference in batched mode at 1024x1024 px than to try to predict a single 2048x2048 px image).

All models were trained on a single GPU with 24 GB VRAM (NVIDIA RTX3090) attached to a 32-core machine with 64 GB RAM. All but the largest models can be trained in under a day on a machine of this specification. The smallest models take under half a day, while the largest models take just over a day to train.

Feedback we've received from users (in the field) is that landowners are often interested in seeing the results of aerial surveys, but data bandwidth is often a prohibitive factor in remote areas. One of our goals was to support this kind of in-field usage, so that users who fly a survey can process results offline and in a reasonable amount of time (i.e. on the order of an hour).

## Evaluation

We report evaluation results on the OAM-TCD holdout split.

### Testing Data

The training dataset may be found [here](https://huggingface.co/datasets/restor/tcd). This model (`main` branch) was trained on all `train` images and tested on the `test` (holdout) images.

![Training loss](train_loss.png)

### Metrics

We report F1, Accuracy and IoU on the holdout dataset, as well as results on a 5-fold cross-validation split. Cross-validation is visualised as min/max error bars on the plots below.

### Results

![Validation loss](val_loss.png)
![IoU](val_jaccard_index.png)
![Accuracy (foreground)](val_multiclassaccuracy_tree.png)
![F1 Score](val_multiclassf1score_tree.png)

## Environmental Impact

This estimate is the maximum (in terms of training time) for the SegFormer family of models presented here. Smaller models, such as `mit-b0`, train in less than half a day.

- **Hardware Type:** NVIDIA RTX3090
- **Hours used:** < 36
- **Carbon Emitted:** 5.44 kg CO2 equivalent per model

Carbon emissions were estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
This estimate does not take into account the time required for experimentation, failed training runs, etc. For example, since we used cross-validation, each model actually required approximately 6x this estimate: one run for each fold, plus the final run.

Efficient inference on CPU is possible for field work, at the expense of inference latency. A typical single-battery drone flight can be processed in minutes.

## Model Card Authors

Josh Veitch-Michaelis, 2024; on behalf of the dataset authors.
[ "__background__", "tree" ]
sam1120/safety-utcustom-terrain-b0
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# safety-utcustom-terrain-b0

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/safety-utcustom-terrain-jackal-full-391 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1068
- Mean Iou: 0.6684
- Mean Accuracy: 0.7082
- Overall Accuracy: 0.9747
- Accuracy Unlabeled: nan
- Accuracy Nat: 0.9932
- Accuracy Concrete: 0.9513
- Accuracy Grass: 0.9029
- Accuracy Speedway bricks: 0.9891
- Accuracy Steel: 0.9082
- Accuracy Rough concrete: 0.0
- Accuracy Dark bricks: 0.8767
- Accuracy Road: 0.9315
- Accuracy Rough red sidewalk: 0.5011
- Accuracy Tiles: 0.2352
- Accuracy Red bricks: 0.9299
- Accuracy Concrete tiles: 0.9872
- Accuracy Rest: 0.0
- Iou Unlabeled: nan
- Iou Nat: 0.9807
- Iou Concrete: 0.8923
- Iou Grass: 0.8376
- Iou Speedway bricks: 0.9775
- Iou Steel: 0.8554
- Iou Rough concrete: 0.0
- Iou Dark bricks: 0.8309
- Iou Road: 0.9241
- Iou Rough red sidewalk: 0.3386
- Iou Tiles: 0.2342
- Iou Red bricks: 0.8665
- Iou Concrete tiles: 0.9518
- Iou Rest: 0.0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 500

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Nat | Accuracy Concrete | Accuracy Grass | Accuracy Speedway bricks | Accuracy Steel | Accuracy Rough concrete | Accuracy Dark bricks | Accuracy Road | Accuracy Rough red sidewalk | Accuracy Tiles | Accuracy Red bricks | Accuracy Concrete tiles | Accuracy Rest | Iou Unlabeled | Iou Nat | Iou Concrete | Iou Grass | Iou Speedway bricks | Iou Steel | Iou Rough concrete | Iou Dark bricks | Iou Road | Iou Rough red sidewalk | Iou Tiles | Iou Red bricks | Iou Concrete tiles | Iou Rest |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 2.6108 | 1.82 | 20 | 2.6051 | 0.0159 | 0.1044 | 0.0773 | nan | 0.0951 | 0.0130 | 0.1687 | 0.0635 | 0.0140 | 0.0 | 0.6411 | 0.0058 | 0.1063 | 0.0 | 0.0 | 0.0 | 0.25 | 0.0 | 0.0936 | 0.0118 | 0.0202 | 0.0622 | 0.0065 | 0.0 | 0.0059 | 0.0058 | 0.0170 | 0.0 | 0.0 | 0.0 | 0.0000 |
| 2.3421 | 3.64 | 40 | 2.2776 | 0.1190 | 0.1997 | 0.5675 | nan | 0.6849 | 0.0547 | 0.1880 | 0.7466 | 0.0293 | 0.0 | 0.5917 | 0.1477 | 0.0279 | 0.0 | 0.0 | 0.0 | 0.125 | 0.0 | 0.6762 | 0.0473 | 0.0756 | 0.6971 | 0.0129 | 0.0 | 0.0096 | 0.1411 | 0.0059 | 0.0 | 0.0 | 0.0 | 0.0000 |
| 1.9173 | 5.45 | 60 | 1.6415 | 0.2069 | 0.2866 | 0.8386 | nan | 0.9258 | 0.5798 | 0.4195 | 0.9327 | 0.0002 | 0.0 | 0.1839 | 0.6829 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9159 | 0.3659 | 0.1946 | 0.7993 | 0.0002 | 0.0 | 0.0247 | 0.5963 | 0.0003 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.5459 | 7.27 | 80 | 1.2372 | 0.2280 | 0.2969 | 0.8869 | nan | 0.9541 | 0.7506 | 0.3301 | 0.9632 | 0.0 | 0.0 | 0.0003 | 0.8612 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9414 | 0.4972 | 0.2048 | 0.8328 | 0.0 | 0.0 | 0.0002 | 0.7155 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.3972 | 9.09 | 100 | 1.1420 | 0.2549 | 0.3078 | 0.8928 | nan | 0.9442 | 0.7732 | 0.3751 | 0.9784 | 0.0 | 0.0 | 0.0 | 0.9303 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9364 | 0.5272 | 0.2267 | 0.8404 | 0.0 | 0.0 | 0.0 | 0.7832 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.1514 | 10.91 | 120 | 0.9919 | 0.2752 | 0.3282 | 0.9085 | nan | 0.9494 | 0.8587 | 0.5347 | 0.9744 | 0.0 | 0.0 | 0.0 | 0.9496 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9413 | 0.6159 | 0.3252 | 0.8578 | 0.0 | 0.0 | 0.0 | 0.8378 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.0387 | 12.73 | 140 | 0.8870 | 0.2886 | 0.3438 | 0.9166 | nan | 0.9525 | 0.8759 | 0.7264 | 0.9727 | 0.0 | 0.0 | 0.0 | 0.9417 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9437 | 0.6566 | 0.4501 | 0.8664 | 0.0 | 0.0 | 0.0 | 0.8346 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.9516 | 14.55 | 160 | 0.7789 | 0.2891 | 0.3436 | 0.9149 | nan | 0.9519 | 0.8994 | 0.7480 | 0.9754 | 0.0 | 0.0 | 0.0 | 0.8920 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9435 | 0.6126 | 0.4907 | 0.8661 | 0.0 | 0.0 | 0.0 | 0.8459 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.7909 | 16.36 | 180 | 0.6643 | 0.3019 | 0.3510 | 0.9260 | nan | 0.9619 | 0.8908 | 0.7824 | 0.9699 | 0.0 | 0.0 | 0.0 | 0.9575 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9487 | 0.6817 | 0.5505 | 0.8693 | 0.0 | 0.0 | 0.0 | 0.8747 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.7189 | 18.18 | 200 | 0.5797 | 0.3030 | 0.3495 | 0.9264 | nan | 0.9611 | 0.8973 | 0.7524 | 0.9796 | 0.0 | 0.0 | 0.0 | 0.9528 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9488 | 0.6701 | 0.5678 | 0.8697 | 0.0 | 0.0 | 0.0 | 0.8822 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.5829 | 20.0 | 220 | 0.4973 | 0.3041 | 0.3521 | 0.9272 | nan | 0.9617 | 0.9070 | 0.7870 | 0.9751 | 0.0 | 0.0 | 0.0 | 0.9461 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9499 | 0.6820 | 0.5802 | 0.8727 | 0.0 | 0.0 | 0.0 | 0.8684 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.5048 | 21.82 | 240 | 0.4336 | 0.3096 | 0.3555 | 0.9298 | nan | 0.9649 | 0.9156 | 0.7919 | 0.9689 | 0.0267 | 0.0 | 0.0 | 0.9534 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9519 | 0.6809 | 0.6172 | 0.8810 | 0.0267 | 0.0 | 0.0 | 0.8675 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.4344 | 23.64 | 260 | 0.3822 | 0.3176 | 0.3626 | 0.9310 | nan | 0.9642 | 0.9097 | 0.8164 | 0.9851 | 0.1145 | 0.0 | 0.0 | 0.9234 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9528 | 0.6748 | 0.6166 | 0.8783 | 0.1144 | 0.0 | 0.0 | 0.8913 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.3584 | 25.45 | 280 | 0.3193 | 0.3436 | 0.3813 | 0.9417 | nan | 0.9740 | 0.9091 | 0.7792 | 0.9795 | 0.3456 | 0.0 | 0.0 | 0.9691 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9581 | 0.7269 | 0.6364 | 0.9026 | 0.3397 | 0.0 | 0.0 | 0.9028 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.3006 | 27.27 | 300 | 0.2956 | 0.3565 | 0.4000 | 0.9420 | nan | 0.9688 | 0.9278 | 0.8511 | 0.9803 | 0.4487 | 0.0 | 0.0 | 0.9461 | 0.0 | 0.0 | 0.0 | 0.0770 | 0.0 | nan | 0.9583 | 0.7207 | 0.6336 | 0.9046 | 0.4241 | 0.0 | 0.0 | 0.9164 | 0.0 | 0.0 | 0.0 | 0.0770 | 0.0 |
| 0.2623 | 29.09 | 320 | 0.2804 | 0.4001 | 0.4338 | 0.9516 | nan | 0.9864 | 0.8830 | 0.8140 | 0.9768 | 0.5599 | 0.0 | 0.0 | 0.9644 | 0.0 | 0.0 | 0.0 | 0.4547 | 0.0 | nan | 0.9602 | 0.7984 | 0.6255 | 0.9274 | 0.5066 | 0.0 | 0.0 | 0.9280 | 0.0 | 0.0 | 0.0 | 0.4547 | 0.0 |
| 0.2954 | 30.91 | 340 | 0.2242 | 0.4176 | 0.4540 | 0.9533 | nan | 0.9780 | 0.9337 | 0.8581 | 0.9774 | 0.5902 | 0.0 | 0.0 | 0.9684 | 0.0003 | 0.0 | 0.0 | 0.5960 | 0.0 | nan | 0.9631 | 0.7737 | 0.6824 | 0.9314 | 0.5540 | 0.0 | 0.0 | 0.9282 | 0.0003 | 0.0 | 0.0 | 0.5960 | 0.0 |
| 0.23 | 32.73 | 360 | 0.2201 | 0.4474 | 0.4750 | 0.9579 | nan | 0.9899 | 0.9128 | 0.7867 | 0.9772 | 0.7017 | 0.0 | 0.0 | 0.9574 | 0.0514 | 0.0 | 0.0 | 0.7984 | 0.0 | nan | 0.9622 | 0.8177 | 0.6784 | 0.9358 | 0.6335 | 0.0 | 0.0 | 0.9405 | 0.0508 | 0.0 | 0.0 | 0.7977 | 0.0 |
| 0.2269 | 34.55 | 380 | 0.2007 | 0.4393 | 0.4770 | 0.9567 | nan | 0.9863 | 0.9324 | 0.8395 | 0.9595 | 0.8059 | 0.0 | 0.0080 | 0.9571 | 0.0643 | 0.0 | 0.0 | 0.6487 | 0.0 | nan | 0.9663 | 0.7832 | 0.7368 | 0.9318 | 0.6573 | 0.0 | 0.0080 | 0.9332 | 0.0486 | 0.0 | 0.0 | 0.6463 | 0.0 |
| 0.2013 | 36.36 | 400 | 0.1881 | 0.4695 | 0.5016 | 0.9589 | nan | 0.9866 | 0.9298 | 0.8124 | 0.9791 | 0.7739 | 0.0 | 0.0650 | 0.9425 | 0.1114 | 0.0 | 0.0 | 0.9207 | 0.0 | nan | 0.9663 | 0.7974 | 0.7394 | 0.9391 | 0.6864 | 0.0 | 0.0649 | 0.9249 | 0.0789 | 0.0 | 0.0 | 0.9057 | 0.0 |
| 0.1744 | 38.18 | 420 | 0.1692 | 0.5043 | 0.5397 | 0.9624 | nan | 0.9873 | 0.9266 | 0.8166 | 0.9732 | 0.7795 | 0.0 | 0.4470 | 0.9749 | 0.1529 | 0.0 | 0.0 | 0.9576 | 0.0 | nan | 0.9680 | 0.8151 | 0.7397 | 0.9481 | 0.6920 | 0.0 | 0.4439 | 0.9384 | 0.1039 | 0.0 | 0.0 | 0.9067 | 0.0 |
| 0.1487 | 40.0 | 440 | 0.1572 | 0.5243 | 0.5751 | 0.9645 | nan | 0.9855 | 0.9130 | 0.8884 | 0.9811 | 0.7724 | 0.0 | 0.4822 | 0.9664 | 0.5105 | 0.0 | 0.0 | 0.9765 | 0.0 | nan | 0.9683 | 0.8370 | 0.7608 | 0.9518 | 0.6969 | 0.0 | 0.4801 | 0.9353 | 0.3195 | 0.0 | 0.0 | 0.8659 | 0.0 |
| 0.1535 | 41.82 | 460 | 0.1495 | 0.5245 | 0.5653 | 0.9642 | nan | 0.9876 | 0.9293 | 0.7922 | 0.9860 | 0.7584 | 0.0 | 0.4997 | 0.9617 | 0.4520 | 0.0 | 0.0 | 0.9816 | 0.0 | nan | 0.9692 | 0.8188 | 0.7413 | 0.9504 | 0.7010 | 0.0 | 0.4966 | 0.9431 | 0.3008 | 0.0 | 0.0 | 0.8977 | 0.0 |
| 0.1375 | 43.64 | 480 | 0.1455 | 0.5208 | 0.5592 | 0.9647 | nan | 0.9903 | 0.9111 | 0.8097 | 0.9868 | 0.7366 | 0.0 | 0.5516 | 0.9644 | 0.3513 | 0.0 | 0.0 | 0.9679 | 0.0 | nan | 0.9695 | 0.8279 | 0.7512 | 0.9494 | 0.6913 | 0.0 | 0.5358 | 0.9440 | 0.2400 | 0.0 | 0.0 | 0.8613 | 0.0 |
| 0.1309 | 45.45 | 500 | 0.1380 | 0.5367 | 0.5770 | 0.9668 | nan | 0.9898 | 0.9317 | 0.8456 | 0.9773 | 0.8291 | 0.0 | 0.6698 | 0.9641 | 0.3113 | 0.0 | 0.0 | 0.9817 | 0.0 | nan | 0.9705 | 0.8298 | 0.7728 | 0.9568 | 0.7419 | 0.0 | 0.6454 | 0.9515 | 0.2103 | 0.0 | 0.0 | 0.8986 | 0.0 |
| 0.1186 | 47.27 | 520 | 0.1348 | 0.5351 | 0.5779 | 0.9669 | nan | 0.9897 | 0.9159 | 0.8389 | 0.9888 | 0.7943 | 0.0 | 0.7516 | 0.9627 | 0.2831 | 0.0 | 0.0 | 0.9877 | 0.0 | nan | 0.9707 | 0.8389 | 0.7715 | 0.9533 | 0.7310 | 0.0 | 0.7112 | 0.9499 | 0.1922 | 0.0 | 0.0 | 0.8376 | 0.0 |
| 0.1157 | 49.09 | 540 | 0.1341 | 0.5405 | 0.5876 | 0.9666 | nan | 0.9881 | 0.9353 | 0.8558 | 0.9765 | 0.8611 | 0.0 | 0.7315 | 0.9595 | 0.3413 | 0.0 | 0.0 | 0.9902 | 0.0 | nan | 0.9709 | 0.8279 | 0.7672 | 0.9589 | 0.7522 | 0.0 | 0.6999 | 0.9455 | 0.2193 | 0.0 | 0.0 | 0.8843 | 0.0 |
| 0.127 | 50.91 | 560 | 0.1232 | 0.5484 | 0.5833 | 0.9689 | nan | 0.9905 | 0.9207 | 0.8449 | 0.9881 | 0.7859 | 0.0 | 0.7773 | 0.9745 | 0.3135 | 0.0 | 0.0124 | 0.9757 | 0.0 | nan | 0.9722 | 0.8494 | 0.7821 | 0.9561 | 0.7330 | 0.0 | 0.7354 | 0.9569 | 0.2113 | 0.0 | 0.0124 | 0.9198 | 0.0 |
| 0.114 | 52.73 | 580 | 0.1195 | 0.5655 | 0.6096 | 0.9701 | nan | 0.9900 | 0.9228 | 0.8583 | 0.9841 | 0.8427 | 0.0 | 0.7965 | 0.9742 | 0.5022 | 0.0 | 0.0707 | 0.9834 | 0.0 | nan | 0.9720 | 0.8596 | 0.7848 | 0.9593 | 0.7577 | 0.0 | 0.7516 | 0.9558 | 0.3258 | 0.0 | 0.0707 | 0.9144 | 0.0 |
| 0.0963 | 54.55 | 600 | 0.1274 | 0.5707 | 0.6194 | 0.9666 | nan | 0.9902 | 0.9337 | 0.8520 | 0.9823 | 0.8614 | 0.0 | 0.7853 | 0.9296 | 0.3977 | 0.0 | 0.3295 | 0.9906 | 0.0 | nan | 0.9731 | 0.8200 | 0.7887 | 0.9618 | 0.7623 | 0.0 | 0.7394 | 0.9177 | 0.2601 | 0.0 | 0.3277 | 0.8686 | 0.0 |
| 0.0854 | 56.36 | 620 | 0.1151 | 0.5870 | 0.6264 | 0.9706 | nan | 0.9911 | 0.9281 | 0.8529 | 0.9829 | 0.8564 | 0.0 | 0.7772 | 0.9739 | 0.3820 | 0.0 | 0.4178 | 0.9814 | 0.0 | nan | 0.9731 | 0.8538 | 0.7872 | 0.9622 | 0.7664 | 0.0 | 0.7444 | 0.9591 | 0.2556 | 0.0 | 0.4145 | 0.9142 | 0.0 |
| 0.0803 | 58.18 | 640 | 0.1088 | 0.6041 | 0.6548 | 0.9708 | nan | 0.9885 | 0.9300 | 0.8814 | 0.9849 | 0.8553 | 0.0 | 0.7885 | 0.9720 | 0.4836 | 0.0 | 0.6344 | 0.9932 | 0.0 | nan | 0.9731 | 0.8548 | 0.7902 | 0.9633 | 0.7716 | 0.0 | 0.7500 | 0.9598 | 0.3174 | 0.0 | 0.6294 | 0.8437 | 0.0 |
| 0.0862 | 60.0 | 660 | 0.1186 | 0.5920 | 0.6297 | 0.9685 | nan | 0.9898 | 0.9405 | 0.7933 | 0.9828 | 0.8620 | 0.0 | 0.7965 | 0.9697 | 0.3429 | 0.0 | 0.5671 | 0.9420 | 0.0 | nan | 0.9733 | 0.8298 | 0.7456 | 0.9634 | 0.7728 | 0.0 | 0.7532 | 0.9526 | 0.2269 | 0.0 | 0.5623 | 0.9168 | 0.0 |
| 0.0869 | 61.82 | 680 | 0.1129 | 0.6074 | 0.6520 | 0.9696 | nan | 0.9893 | 0.9362 | 0.8739 | 0.9839 | 0.8544 | 0.0 | 0.8275 | 0.9579 | 0.3160 | 0.0 | 0.7447 | 0.9920 | 0.0 | nan | 0.9741 | 0.8338 | 0.7941 | 0.9649 | 0.7770 | 0.0 | 0.7754 | 0.9473 | 0.2110 | 0.0 | 0.7349 | 0.8835 | 0.0 |
| 0.1056 | 63.64 | 700 | 0.1100 | 0.6182 | 0.6585 | 0.9708 | nan | 0.9891 | 0.9337 | 0.8652 | 0.9873 | 0.8505 | 0.0 | 0.8278 | 0.9701 | 0.3640 | 0.0 | 0.8052 | 0.9671 | 0.0 | nan | 0.9741 | 0.8467 | 0.7933 | 0.9625 | 0.7814 | 0.0 | 0.7727 | 0.9599 | 0.2387 | 0.0 | 0.7949 | 0.9119 | 0.0 |
| 0.1043 | 65.45 | 720 | 0.1035 | 0.6218 | 0.6638 | 0.9712 | nan | 0.9892 | 0.9350 | 0.8516 | 0.9865 | 0.8404 | 0.0 | 0.8462 | 0.9771 | 0.4300 | 0.0 | 0.8018 | 0.9717 | 0.0 | nan | 0.9739 | 0.8530 | 0.7846 | 0.9637 | 0.7690 | 0.0 | 0.7931 | 0.9615 | 0.2844 | 0.0 | 0.7855 | 0.9140 | 0.0 |
| 0.0689 | 67.27 | 740 | 0.1138 | 0.6181 | 0.6609 | 0.9695 | nan | 0.9897 | 0.9396 | 0.8453 | 0.9850 | 0.8639 | 0.0 | 0.8107 | 0.9525 | 0.4244 | 0.0 | 0.8111 | 0.9696 | 0.0 | nan | 0.9741 | 0.8334 | 0.7742 | 0.9656 | 0.7824 | 0.0 | 0.7606 | 0.9434 | 0.2848 | 0.0 | 0.8006 | 0.9161 | 0.0 |
| 0.0783 | 69.09 | 760 | 0.1063 | 0.6229 | 0.6732 | 0.9710 | nan | 0.9893 | 0.9251 | 0.8575 | 0.9870 | 0.8592 | 0.0 | 0.8318 | 0.9740 | 0.4851 | 0.0 | 0.8546 | 0.9876 | 0.0 | nan | 0.9736 | 0.8536 | 0.7815 | 0.9640 | 0.7762 | 0.0 | 0.7815 | 0.9612 | 0.2937 | 0.0 | 0.8101 | 0.9021 | 0.0 |
| 0.0747 | 70.91 | 780 | 0.0987 | 0.6396 | 0.6860 | 0.9728 | nan | 0.9907 | 0.9231 | 0.8665 | 0.9885 | 0.8526 | 0.0 | 0.8435 | 0.9773 | 0.5318 | 0.0899 | 0.8635 | 0.9902 | 0.0 | nan | 0.9741 | 0.8681 | 0.7966 | 0.9643 | 0.7786 | 0.0 | 0.7924 | 0.9651 | 0.3517 | 0.0899 | 0.8366 | 0.8972 | 0.0 |
| 0.0832 | 72.73 | 800 | 0.0957 | 0.6392 | 0.6846 | 0.9733 | nan | 0.9908 | 0.9323 | 0.8780 | 0.9862 | 0.8581 | 0.0 | 0.8482 | 0.9774 | 0.4579 | 0.1035 | 0.8787 | 0.9888 | 0.0 | nan | 0.9750 | 0.8677 | 0.8050 | 0.9670 | 0.7853 | 0.0 | 0.7940 | 0.9662 | 0.2988 | 0.1035 | 0.8400 | 0.9074 | 0.0 |
| 0.0768 | 74.55 | 820 | 0.0994 | 0.6346 | 0.6812 | 0.9725 | nan | 0.9909 | 0.9388 | 0.8632 | 0.9871 | 0.8383 | 0.0 | 0.8645 | 0.9662 | 0.4957 | 0.0385 | 0.8894 | 0.9824 | 0.0 | nan | 0.9750 | 0.8590 | 0.8004 | 0.9666 | 0.7760 | 0.0 | 0.8055 | 0.9570 | 0.3298 | 0.0385 | 0.8262 | 0.9163 | 0.0 |
| 0.0738 | 76.36 | 840 | 0.1013 | 0.6243 | 0.6647 | 0.9724 | nan | 0.9917 | 0.9415 | 0.8527 | 0.9846 | 0.8556 | 0.0 | 0.8335 | 0.9683 | 0.4429 | 0.0 | 0.7978 | 0.9727 | 0.0 | nan | 0.9752 | 0.8557 | 0.7921 | 0.9686 | 0.7892 | 0.0 | 0.7866 | 0.9582 | 0.2944 | 0.0 | 0.7735 | 0.9231 | 0.0 |
| 0.073 | 78.18 | 860 | 0.0955 | 0.6448 | 0.6984 | 0.9725 | nan | 0.9887 | 0.9234 | 0.8929 | 0.9908 | 0.8370 | 0.0 | 0.8653 | 0.9728 | 0.5640 | 0.1644 | 0.8924 | 0.9878 | 0.0 | nan | 0.9744 | 0.8640 | 0.7960 | 0.9649 | 0.7718 | 0.0 | 0.8031 | 0.9629 | 0.3590 | 0.1644 | 0.8266 | 0.8955 | 0.0 |
| 0.0829 | 80.0 | 880 | 0.0980 | 0.6320 | 0.6724 | 0.9731 | nan | 0.9917 | 0.9318 | 0.8553 | 0.9885 | 0.8538 | 0.0 | 0.8547 | 0.9764 | 0.4251 | 0.0 | 0.8906 | 0.9725 | 0.0 | nan | 0.9751 | 0.8630 | 0.7927 | 0.9675 | 0.7918 | 0.0 | 0.8018 | 0.9667 | 0.2869 | 0.0 | 0.8399 | 0.9300 | 0.0 |
| 0.0874 | 81.82 | 900 | 0.1001 | 0.6299 | 0.6843 | 0.9712 | nan | 0.9916 | 0.9439 | 0.8556 | 0.9838 | 0.8674 | 0.0 | 0.8520 | 0.9486 | 0.4492 | 0.0979 | 0.9274 | 0.9781 | 0.0 | nan | 0.9764 | 0.8471 | 0.8000 | 0.9681 | 0.7920 | 0.0 | 0.7995 | 0.9386 | 0.2937 | 0.0979 | 0.7541 | 0.9211 | 0.0 |
| 0.0605 | 83.64 | 920 | 0.0964 | 0.6260 | 0.6758 | 0.9728 | nan | 0.9913 | 0.9397 | 0.8582 | 0.9840 | 0.8832 | 0.0 | 0.8554 | 0.9723 | 0.4102 | 0.0 | 0.8995 | 0.9916 | 0.0 | nan | 0.9758 | 0.8560 | 0.7972 | 0.9691 | 0.7995 | 0.0 | 0.8009 | 0.9616 | 0.2729 | 0.0 | 0.8104 | 0.8948 | 0.0 |
| 0.0756 | 85.45 | 940 | 0.1011 | 0.6253 | 0.6666 | 0.9717 | nan | 0.9911 | 0.9382 | 0.8403 | 0.9844 | 0.8815 | 0.0 | 0.8326 | 0.9703 | 0.4261 | 0.0 | 0.8475 | 0.9542 | 0.0 | nan | 0.9756 | 0.8483 | 0.7718 | 0.9703 | 0.8011 | 0.0 | 0.7831 | 0.9551 | 0.2846 | 0.0 | 0.8109 | 0.9277 | 0.0 |
| 0.0713 | 87.27 | 960 | 0.1032 | 0.6235 | 0.6630 | 0.9718 | nan | 0.9926 | 0.9410 | 0.8021 | 0.9865 | 0.8696 | 0.0 | 0.8435 | 0.9684 | 0.4094 | 0.0 | 0.8290 | 0.9769 | 0.0 | nan | 0.9754 | 0.8490 | 0.7615 | 0.9700 | 0.8027 | 0.0 | 0.7976 | 0.9560 | 0.2749 | 0.0 | 0.7923 | 0.9267 | 0.0 |
| 0.0644 | 89.09 | 980 | 0.0966 | 0.6311 | 0.6747 | 0.9735 | nan | 0.9921 | 0.9329 | 0.8607 | 0.9845 | 0.8910 | 0.0 | 0.8584 | 0.9775 | 0.4111 | 0.0 | 0.8837 | 0.9787 | 0.0 | nan | 0.9757 | 0.8627 | 0.7931 | 0.9700 | 0.7982 | 0.0 | 0.8043 | 0.9671 | 0.2751 | 0.0 | 0.8277 | 0.9305 | 0.0 |
| 0.0585 | 90.91 | 1000 | 0.0986 | 0.6292 | 0.6737 | 0.9731 | nan | 0.9916 | 0.9357 | 0.8753 | 0.9859 | 0.8991 | 0.0 | 0.8332 | 0.9651 | 0.4522 | 0.0 | 0.8565 | 0.9637 | 0.0 | nan | 0.9764 | 0.8609 | 0.8055 | 0.9685 | 0.7975 | 0.0 | 0.7890 | 0.9567 | 0.2884 | 0.0 | 0.8142 | 0.9222 | 0.0 |
| 0.0686 | 92.73 | 1020 | 0.0954 | 0.6256 | 0.6720 | 0.9733 | nan | 0.9922 | 0.9391 | 0.8400 | 0.9828 | 0.8998 | 0.0 | 0.8325 | 0.9792 | 0.4168 | 0.0 | 0.8636 | 0.9906 | 0.0 | nan | 0.9762 | 0.8622 | 0.7879 | 0.9696 | 0.7963 | 0.0 | 0.7923 | 0.9668 | 0.2704 | 0.0 | 0.8132 | 0.8986 | 0.0 |
| 0.0553 | 94.55 | 1040 | 0.0971 | 0.6333 | 0.6778 | 0.9739 | nan | 0.9928 | 0.9317 | 0.8925 | 0.9839 | 0.8925 | 0.0 | 0.8706 | 0.9689 | 0.3992 | 0.0 | 0.9004 | 0.9790 | 0.0 | nan | 0.9764 | 0.8621 | 0.8160 | 0.9705 | 0.8045 | 0.0 | 0.8125 | 0.9615 | 0.2666 | 0.0 | 0.8325 | 0.9305 | 0.0 |
| 0.072 | 96.36 | 1060 | 0.0970 | 0.6286 | 0.6766 | 0.9725 | nan | 0.9920 | 0.9422 | 0.8644 | 0.9821 | 0.8992 | 0.0 | 0.8566 | 0.9614 | 0.4215 | 0.0 | 0.8988 | 0.9773 | 0.0 | nan | 0.9769 | 0.8639 | 0.8004 | 0.9700 | 0.7999 | 0.0 | 0.8077 | 0.9519 | 0.2747 | 0.0 | 0.8032 | 0.9231 | 0.0 |
| 0.0561 | 98.18 | 1080 | 0.0950 | 0.6295 | 0.6776 | 0.9737 | nan | 0.9915 | 0.9444 | 0.8585 | 0.9860 | 0.8814 | 0.0 | 0.8606 | 0.9737 | 0.4086 | 0.0 | 0.9299 | 0.9741 | 0.0 | nan | 0.9767 | 0.8620 | 0.7957 | 0.9707 | 0.8058 | 0.0 | 0.8081 | 0.9643 | 0.2692 | 0.0 | 0.7989 | 0.9325 | 0.0 |
| 0.0611 | 100.0 | 1100 | 0.0960 | 0.6294 | 0.6721 | 0.9736 | nan | 0.9911 | 0.9463 | 0.8699 | 0.9856 | 0.8766 | 0.0 | 0.8567 | 0.9717 | 0.3878 | 0.0 | 0.875 | 0.9766 | 0.0 | nan | 0.9770 | 0.8580 | 0.7921 | 0.9711 | 0.8052 | 0.0 | 0.8082 | 0.9652 | 0.2593 | 0.0 | 0.8141 | 0.9324 | 0.0 |
| 0.067 | 101.82 | 1120 | 0.0957 | 0.6464 | 0.6943 | 0.9722 | nan | 0.9918 | 0.9381 | 0.8776 | 0.9871 | 0.8781 | 0.0 | 0.8499 | 0.9425 | 0.5555 | 0.1116 | 0.9094 | 0.9846 | 0.0 | nan | 0.9771 | 0.8688 | 0.8164 | 0.9707 | 0.8054 | 0.0 | 0.8045 | 0.9332 | 0.3562 | 0.1116 | 0.8289 | 0.9298 | 0.0 |
| 0.0582 | 103.64 | 1140 | 0.0924 | 0.6352 | 0.6809 | 0.9741 | nan | 0.9906 | 0.9457 | 0.8864 | 0.9845 | 0.8884 | 0.0 | 0.8490 | 0.9731 | 0.4750 | 0.0 | 0.8885 | 0.9712 | 0.0 | nan | 0.9772 | 0.8707 | 0.8061 | 0.9712 | 0.8071 | 0.0 | 0.8044 | 0.9607 | 0.3140 | 0.0 | 0.8093 | 0.9371 | 0.0 |
| 0.0593 | 105.45 | 1160 | 0.0886 | 0.6406 | 0.6893 | 0.9753 | nan | 0.9922 | 0.9405 | 0.8798 | 0.9832 | 0.8871 | 0.0 | 0.8589 | 0.9819 | 0.5258 | 0.0 | 0.9325 | 0.9787 | 0.0 | nan | 0.9772 | 0.8717 | 0.8177 | 0.9715 | 0.8127 | 0.0 | 0.8109 | 0.9683 | 0.3418 | 0.0 | 0.8159 | 0.9407 | 0.0 |
| 0.0548 | 107.27 | 1180 | 0.1100 | 0.6298 | 0.6722 | 0.9701 | nan | 0.9918 | 0.9455 | 0.8889 | 0.9854 | 0.8736 | 0.0 | 0.8647 | 0.9207 | 0.4293 | 0.0 | 0.8596 | 0.9790 | 0.0 | nan | 0.9778 | 0.8703 | 0.8140 | 0.9716 | 0.8126 | 0.0 | 0.8105 | 0.9119 | 0.2856 | 0.0 | 0.7992 | 0.9343 | 0.0 |
| 0.0593 | 109.09 | 1200 | 0.0992 | 0.6380 | 0.6825 | 0.9726 | nan | 0.9918 | 0.9376 | 0.8880 | 0.9891 | 0.8636 | 0.0 | 0.8465 | 0.9463 | 0.5118 | 0.0 | 0.9139 | 0.9839 | 0.0 | nan | 0.9775 | 0.8707 | 0.8239 | 0.9714 | 0.8117 | 0.0 | 0.8062 | 0.9372 | 0.3340 | 0.0 | 0.8298 | 0.9319 | 0.0 |
| 0.0622 | 110.91 | 1220 | 0.1069 | 0.6264 | 0.6789 | 0.9701 | nan | 0.9912 | 0.9374 | 0.8492 | 0.9884 | 0.8766 | 0.0 | 0.8838 | 0.9360 | 0.4517 | 0.0 | 0.9219 | 0.9893 | 0.0 | nan | 0.9770 | 0.8579 | 0.7776 | 0.9715 | 0.8147 | 0.0 | 0.8237 | 0.9284 | 0.2904 | 0.0 | 0.7852 | 0.9170 | 0.0 |
| 0.0491 | 112.73 | 1240 | 0.1163 | 0.6280 | 0.6759 | 0.9675 | nan | 0.9914 | 0.9446 | 0.8614 | 0.9853 | 0.8876 | 0.0 | 0.8585 | 0.9001 | 0.4679 | 0.0 | 0.9050 | 0.9849 | 0.0 | nan | 0.9774 | 0.8608 | 0.7882 | 0.9723 | 0.8141 | 0.0 | 0.8161 | 0.8928 | 0.3068 | 0.0 | 0.8105 | 0.9248 | 0.0 |
| 0.0468 | 114.55 | 1260 | 0.1143 | 0.6328 | 0.6804 | 0.9687 | nan | 0.9916 | 0.9414 | 0.8875 | 0.9840 | 0.9030 | 0.0 | 0.8520 | 0.9044 | 0.5029 | 0.0 | 0.8908 | 0.9876 | 0.0 | nan | 0.9773 | 0.8657 | 0.8087 | 0.9719 | 0.8102 | 0.0 | 0.8114 | 0.8969 | 0.3365 | 0.0 | 0.8308 | 0.9176 | 0.0 |
| 0.0488 | 116.36 | 1280 | 0.1216 | 0.6249 | 0.6712 | 0.9674 | nan | 0.9918 | 0.9369 | 0.8593 | 0.9869 | 0.8932 | 0.0 | 0.8738 | 0.9018 | 0.4061 | 0.0 | 0.8888 | 0.9873 | 0.0 | nan | 0.9777 | 0.8509 | 0.7931 | 0.9714 | 0.8078 | 0.0 | 0.8184 | 0.8920 | 0.2700 | 0.0 | 0.8181 | 0.9246 | 0.0 |
| 0.0502 | 118.18 | 1300 | 0.1638 | 0.6172 | 0.6661 | 0.9596 | nan | 0.9920 | 0.9447 | 0.8527 | 0.9860 | 0.8831 | 0.0 | 0.8622 | 0.8101 | 0.4110 | 0.0 | 0.9242 | 0.9933 | 0.0 | nan | 0.9775 | 0.8604 | 0.7949 | 0.9726 | 0.8083 | 0.0 | 0.8153 | 0.8038 | 0.2725 | 0.0 | 0.8271 | 0.8911 | 0.0 |
| 0.0471 | 120.0 | 1320 | 0.1424 | 0.6293 | 0.6739 | 0.9653 | nan | 0.9917 | 0.9445 | 0.8728 | 0.9886 | 0.8708 | 0.0 | 0.8460 | 0.8645 | 0.4731 | 0.0 | 0.9242 | 0.9839 | 0.0 | nan | 0.9774 | 0.8691 | 0.8054 | 0.9720 | 0.8065 | 0.0 | 0.8094 | 0.8590 | 0.3194 | 0.0 | 0.8309 | 0.9319 | 0.0 |
| 0.0518 | 121.82 | 1340 | 0.1363 | 0.6243 | 0.6735 | 0.9653 | nan | 0.9917 | 0.9401 | 0.8560 | 0.9891 | 0.8778 | 0.0 | 0.8785 | 0.8730 | 0.4372 | 0.0 | 0.9201 | 0.9917 | 0.0 | nan | 0.9779 | 0.8643 | 0.7942 | 0.9722 | 0.8123 | 0.0 | 0.8157 | 0.8661 | 0.2896 | 0.0 | 0.8186 | 0.9046 | 0.0 |
| 0.054 | 123.64 | 1360 | 0.1198 | 0.6297 | 0.6849 | 0.9673 | nan | 0.9911 | 0.9396 | 0.8900 | 0.9876 | 0.8889 | 0.0 | 0.8761 | 0.8863 | 0.5217 | 0.0 | 0.9284 | 0.9936 | 0.0 | nan | 0.9781 | 0.8691 | 0.8123 | 0.9728 | 0.8192 | 0.0 | 0.8135 | 0.8788 | 0.3432 | 0.0 | 0.8097 | 0.8890 | 0.0 |
| 0.0519 | 125.45 | 1380 | 0.1309 | 0.6294 | 0.6836 | 0.9656 | nan | 0.9897 | 0.9451 | 0.8967 | 0.9857 | 0.8936 | 0.0 | 0.8679 | 0.8728 | 0.5175 | 0.0029 | 0.9231 | 0.9916 | 0.0 | nan | 0.9779 | 0.8707 | 0.8042 | 0.9726 | 0.8207 | 0.0 | 0.8079 | 0.8637 | 0.3420 | 0.0029 | 0.8063 | 0.9128 | 0.0 |
| 0.0485 | 127.27 | 1400 | 0.1189 | 0.6372 | 0.6865 | 0.9681 | nan | 0.9914 | 0.9485 | 0.8947 | 0.9838 | 0.8993 | 0.0 | 0.8539 | 0.8885 | 0.5688 | 0.0 | 0.9094 | 0.9870 | 0.0 | nan | 0.9782 | 0.8774 | 0.8081 | 0.9730 | 0.8220 | 0.0 | 0.8123 | 0.8812 | 0.3755 | 0.0 | 0.8236 | 0.9327 | 0.0 |
| 0.0573 | 129.09 | 1420 | 0.1260 | 0.6308 | 0.6758 | 0.9664 | nan | 0.9920 | 0.9443 | 0.8626 | 0.9893 | 0.8762 | 0.0 | 0.8538 | 0.8755 | 0.5081 | 0.0 | 0.9038 | 0.9802 | 0.0 | nan | 0.9784 | 0.8688 | 0.7995 | 0.9725 | 0.8194 | 0.0 | 0.8114 | 0.8695 | 0.3320 | 0.0 | 0.8036 | 0.9450 | 0.0 |
| 0.055 | 130.91 | 1440 | 0.0953 | 0.6321 | 0.6751 | 0.9741 | nan | 0.9930 | 0.9417 | 0.8574 | 0.9885 | 0.8704 | 0.0 | 0.8546 | 0.9676 | 0.3929 | 0.0 | 0.9254 | 0.9846 | 0.0 | nan | 0.9778 | 0.8698 | 0.7990 | 0.9725 | 0.8170 | 0.0 | 0.8141 | 0.9595 | 0.2659 | 0.0 | 0.8194 | 0.9226 | 0.0 |
| 0.0516 | 132.73 | 1460 | 0.1438 | 0.6295 | 0.6778 | 0.9655 | nan | 0.9910 | 0.9403 | 0.8820 | 0.9891 | 0.8782 | 0.0 | 0.8693 | 0.8681 | 0.4979 | 0.0 | 0.9075 | 0.9884 | 0.0 | nan | 0.9785 | 0.8640 | 0.8050 | 0.9728 | 0.8218 | 0.0 | 0.8159 | 0.8580 | 0.3304 | 0.0 | 0.8166 | 0.9212 | 0.0 |
| 0.054 | 134.55 | 1480 | 0.1301 | 0.6283 | 0.6715 | 0.9657 | nan | 0.9927 | 0.9424 | 0.8534 | 0.9878 | 0.8903 | 0.0 | 0.8533 | 0.8705 | 0.4538 | 0.0 | 0.9084 | 0.9772 | 0.0 | nan | 0.9782 | 0.8684 | 0.7977 | 0.9726 | 0.8226 | 0.0 | 0.8102 | 0.8611 | 0.3035 | 0.0 | 0.8130 | 0.9410 | 0.0 |
| 0.0456 | 136.36 | 1500 | 0.1230 | 0.6256 | 0.6719 | 0.9661 | nan | 0.9931 | 0.9476 | 0.8517 | 0.9852 | 0.8958 | 0.0 | 0.8568 | 0.8740 | 0.4267 | 0.0 | 0.9197 | 0.9847 | 0.0 | nan | 0.9778 | 0.8704 | 0.8045 | 0.9734 | 0.8265 | 0.0 | 0.8162 | 0.8658 | 0.2888 | 0.0 | 0.7812 | 0.9286 | 0.0 |
| 0.0466 | 138.18 | 1520 | 0.1097 | 0.6245 | 0.6715 | 0.9690 | nan | 0.9923 | 0.9381 | 0.8776 | 0.9887 | 0.8817 | 0.0 | 0.8708 | 0.9091 | 0.3661 | 0.0 | 0.9224 | 0.9833 | 0.0 | nan | 0.9784 | 0.8675 | 0.8071 | 0.9723 | 0.8200 | 0.0 | 0.8111 | 0.9005 | 0.2467 | 0.0 | 0.7825 | 0.9318 | 0.0 |
| 0.0485 | 140.0 | 1540 | 0.1299 | 0.6245 | 0.6674 | 0.9650 | nan | 0.9926 | 0.9384 | 0.8472 | 0.9894 | 0.8740 | 0.0 | 0.8533 | 0.8692 | 0.4291 | 0.0 | 0.8936 | 0.9894 | 0.0 | nan | 0.9781 | 0.8598 | 0.7874 | 0.9730 | 0.8191 | 0.0 | 0.8104 | 0.8611 | 0.2891 | 0.0 | 0.8259 | 0.9146 | 0.0 |
| 0.0461 | 141.82 | 1560 | 0.1423 | 0.6234 | 0.6720 | 0.9650 | nan | 0.9915 | 0.9411 | 0.8722 | 0.9901 | 0.8678 | 0.0 | 0.8721 | 0.8645 | 0.4308 | 0.0 | 0.9137 | 0.9919 | 0.0 | nan | 0.9786 | 0.8598 | 0.7986 | 0.9723 | 0.8163 | 0.0 | 0.8170 | 0.8566 | 0.2894 | 0.0 | 0.8113 | 0.9040 | 0.0 |
| 0.0531 | 143.64 | 1580 | 0.1312 | 0.6359 | 0.6837 | 0.9670 | nan | 0.9913 | 0.9444 | 0.9011 | 0.9885 | 0.8899 | 0.0 | 0.8505 | 0.8707 | 0.5601 | 0.0 | 0.9137 | 0.9780 | 0.0 | nan | 0.9788 | 0.8787 | 0.8177 | 0.9736 | 0.8223 | 0.0 | 0.8184 | 0.8642 | 0.3641 | 0.0 | 0.8046 | 0.9437 | 0.0 |
| 0.0518 | 145.45 | 1600 | 0.1391 | 0.6422 | 0.6897 | 0.9668 | nan | 0.9927 | 0.9450 | 0.8721 | 0.9885 | 0.8684 | 0.0 | 0.8748 | 0.8686 | 0.5539 | 0.1030 | 0.9142 | 0.9849 | 0.0 | nan | 0.9777 | 0.8809 | 0.8168 | 0.9731 | 0.8154 | 0.0 | 0.8197 | 0.8621 | 0.3757 | 0.1030 | 0.7990 | 0.9252 | 0.0 |
| 0.0528 | 147.27 | 1620 | 0.1476 | 0.6304 | 0.6752 | 0.9659 | nan | 0.9917 | 0.9486 | 0.8675 | 0.9869 | 0.8872 | 0.0 | 0.8604 | 0.8695 | 0.4855 | 0.0 | 0.8935 | 0.9867 | 0.0 | nan | 0.9789 | 0.8640 | 0.8085 | 0.9736 | 0.8266 | 0.0 | 0.8233 | 0.8612 | 0.3207 | 0.0 | 0.8082 | 0.9304 | 0.0 |
| 0.0474 | 149.09 | 1640 | 0.1277 | 0.6245 | 0.6769 | 0.9663 | nan | 0.9927 | 0.9449 | 0.8744 | 0.9850 | 0.9086 | 0.0 | 0.8567 | 0.8710 | 0.4388 | 0.0 | 0.9414 | 0.9866 | 0.0 | nan | 0.9788 | 0.8704 | 0.8091 | 0.9740 | 0.8237 | 0.0 | 0.8186 | 0.8633 | 0.2956 | 0.0 | 0.7524 | 0.9320 | 0.0 |
| 0.0451 | 150.91 | 1660 | 0.0888 | 0.6545 | 0.7022 | 0.9746 | nan | 0.9927 | 0.9469 | 0.8896 | 0.9863 | 0.8971 | 0.0 | 0.8766 | 0.9487 | 0.6198 | 0.0727 | 0.9187 | 0.9792 | 0.0 | nan | 0.9783 | 0.8878 | 0.8243 | 0.9745 | 0.8276 | 0.0 | 0.8248 | 0.9420 | 0.4160 | 0.0727 | 0.8227 | 0.9373 | 0.0 |
| 0.0453 | 152.73 | 1680 | 0.1081 | 0.6326 | 0.6743 | 0.9713 | nan | 0.9930 | 0.9434 | 0.8839 | 0.9867 | 0.8978 | 0.0 | 0.8580 | 0.9243 | 0.3913 | 0.0 | 0.9006 | 0.9865 | 0.0 | nan | 0.9784 | 0.8748 | 0.8193 | 0.9739 | 0.8247 | 0.0 | 0.8226 | 0.9177 | 0.2647 | 0.0 | 0.8158 | 0.9314 | 0.0 |
| 0.0443 | 154.55 | 1700 | 0.1257 | 0.6273 | 0.6698 | 0.9668 | nan | 0.9924 | 0.9455 | 0.8873 | 0.9871 | 0.8989 | 0.0 | 0.8664 | 0.8743 | 0.3528 | 0.0053 | 0.9135 | 0.9844 | 0.0 | nan | 0.9789 | 0.8723 | 0.8205 | 0.9738 | 0.8237 | 0.0 | 0.8221 | 0.8679 | 0.2405 | 0.0053 | 0.8197 | 0.9310 | 0.0 |
| 0.0417 | 156.36 | 1720 | 0.1372 | 0.6280 | 0.6709 | 0.9662 | nan | 0.9929 | 0.9490 | 0.8584 | 0.9864 | 0.8963 | 0.0 | 0.8795 | 0.8707 | 0.4016 | 0.0 | 0.9004 | 0.9862 | 0.0 | nan | 0.9790 | 0.8683 | 0.8056 | 0.9744 | 0.8308 | 0.0 | 0.8122 | 0.8637 | 0.2695 | 0.0 | 0.8282 | 0.9321 | 0.0 |
| 0.0475 | 158.18 | 1740 | 0.1191 | 0.6321 | 0.6768 | 0.9682 | nan | 0.9930 | 0.9459 | 0.8680 | 0.9889 | 0.8898 | 0.0 | 0.8884 | 0.8861 | 0.4469 | 0.0 | 0.9162 | 0.9753 | 0.0 | nan | 0.9792 | 0.8702 | 0.8095 | 0.9740 | 0.8291 | 0.0 | 0.8180 | 0.8800 | 0.2983 | 0.0 | 0.8168 | 0.9426 | 0.0 |
| 0.0468 | 160.0 | 1760 | 0.1353 | 0.6509 | 0.7001 | 0.9673 | nan | 0.9920 | 0.9455 | 0.8941 | 0.9859 | 0.9074 | 0.0 | 0.8583 | 0.8715 | 0.4859 | 0.2384 | 0.9371 | 0.9857 | 0.0 | nan | 0.9788 | 0.8815 | 0.8221 | 0.9744 | 0.8224 | 0.0 | 0.8043 | 0.8638 | 0.3264 | 0.2384 | 0.8353 | 0.9147 | 0.0 |
| 0.0569 | 161.82 | 1780 | 0.1207 | 0.6354 | 0.6746 | 0.9679 | nan | 0.9930 | 0.9493 | 0.8787 | 0.9856 | 0.9042 | 0.0 | 0.8538 | 0.8813 | 0.4475 | 0.0 | 0.8967 | 0.9793 | 0.0 | nan | 0.9787 | 0.8746 | 0.8278 | 0.9744 | 0.8304 | 0.0 | 0.8138 | 0.8728 | 0.3010 | 0.0 | 0.8456 | 0.9404 | 0.0 |
| 0.0414 | 163.64 | 1800 | 0.1460 | 0.6455 | 0.6913 | 0.9669 | nan
| 0.9933 | 0.9372 | 0.8813 | 0.9873 | 0.9058 | 0.0 | 0.8720 | 0.8683 | 0.4885 | 0.1494 | 0.9128 | 0.9911 | 0.0 | nan | 0.9779 | 0.8812 | 0.8257 | 0.9744 | 0.8261 | 0.0 | 0.8178 | 0.8607 | 0.3223 | 0.1493 | 0.8482 | 0.9084 | 0.0 | | 0.0447 | 165.45 | 1820 | 0.1044 | 0.6388 | 0.6894 | 0.9708 | nan | 0.9923 | 0.9439 | 0.8828 | 0.9876 | 0.8871 | 0.0 | 0.8512 | 0.9150 | 0.5713 | 0.0 | 0.9451 | 0.9856 | 0.0 | nan | 0.9784 | 0.8774 | 0.8215 | 0.9743 | 0.8290 | 0.0 | 0.8095 | 0.9050 | 0.3752 | 0.0 | 0.8043 | 0.9301 | 0.0 | | 0.0485 | 167.27 | 1840 | 0.1204 | 0.6388 | 0.6841 | 0.9686 | nan | 0.9924 | 0.9449 | 0.8884 | 0.9871 | 0.8977 | 0.0 | 0.8716 | 0.8881 | 0.5143 | 0.0 | 0.9263 | 0.9824 | 0.0 | nan | 0.9788 | 0.8792 | 0.8206 | 0.9747 | 0.8304 | 0.0 | 0.8055 | 0.8793 | 0.3445 | 0.0 | 0.8460 | 0.9454 | 0.0 | | 0.0461 | 169.09 | 1860 | 0.1313 | 0.6422 | 0.6901 | 0.9675 | nan | 0.9925 | 0.9475 | 0.8937 | 0.9850 | 0.9035 | 0.0 | 0.8448 | 0.8708 | 0.6013 | 0.0115 | 0.9366 | 0.9846 | 0.0 | nan | 0.9783 | 0.8858 | 0.8222 | 0.9745 | 0.8296 | 0.0 | 0.8058 | 0.8639 | 0.3987 | 0.0115 | 0.8451 | 0.9331 | 0.0 | | 0.0443 | 170.91 | 1880 | 0.1051 | 0.6355 | 0.6830 | 0.9719 | nan | 0.9920 | 0.9431 | 0.8904 | 0.9887 | 0.8924 | 0.0 | 0.8725 | 0.9294 | 0.4618 | 0.0 | 0.9210 | 0.9880 | 0.0 | nan | 0.9785 | 0.8747 | 0.8183 | 0.9741 | 0.8253 | 0.0 | 0.8115 | 0.9231 | 0.3032 | 0.0 | 0.8130 | 0.9394 | 0.0 | | 0.0435 | 172.73 | 1900 | 0.1098 | 0.6357 | 0.6870 | 0.9703 | nan | 0.9922 | 0.9460 | 0.9039 | 0.9866 | 0.8965 | 0.0 | 0.8758 | 0.9051 | 0.4786 | 0.0008 | 0.9556 | 0.9903 | 0.0 | nan | 0.9787 | 0.8757 | 0.8331 | 0.9746 | 0.8308 | 0.0 | 0.7989 | 0.8990 | 0.3124 | 0.0008 | 0.8368 | 0.9230 | 0.0 | | 0.0406 | 174.55 | 1920 | 0.1402 | 0.6283 | 0.6813 | 0.9663 | nan | 0.9926 | 0.9485 | 0.8834 | 0.9835 | 0.9023 | 0.0 | 0.8798 | 0.8675 | 0.4361 | 0.0158 | 0.9554 | 0.9920 | 0.0 | nan | 0.9787 | 0.8694 | 0.8222 | 0.9739 | 0.8313 | 0.0 | 0.8015 | 0.8603 | 0.2922 | 0.0158 | 0.8206 | 0.9024 | 0.0 | | 0.0405 | 176.36 | 1940 | 0.1101 | 0.6409 | 0.6866 | 0.9705 | nan | 0.9920 | 0.9462 | 0.9041 | 0.9873 | 0.8948 | 0.0 | 0.8779 | 0.9058 | 0.5158 | 0.0017 | 0.9210 | 0.9792 | 0.0 | nan | 0.9790 | 0.8803 | 0.8317 | 0.9749 | 0.8309 | 0.0 | 0.8015 | 0.8978 | 0.3388 | 0.0017 | 0.8453 | 0.9496 | 0.0 | | 0.0406 | 178.18 | 1960 | 0.0978 | 0.6439 | 0.6944 | 0.9735 | nan | 0.9918 | 0.9414 | 0.9071 | 0.9902 | 0.8802 | 0.0 | 0.8832 | 0.9390 | 0.5456 | 0.0194 | 0.9396 | 0.9897 | 0.0 | nan | 0.9788 | 0.8794 | 0.8314 | 0.9742 | 0.8269 | 0.0 | 0.8102 | 0.9326 | 0.3544 | 0.0194 | 0.8350 | 0.9280 | 0.0 | | 0.0456 | 180.0 | 1980 | 0.1044 | 0.6467 | 0.6936 | 0.9716 | nan | 0.9922 | 0.9444 | 0.8799 | 0.9875 | 0.8953 | 0.0 | 0.8744 | 0.9262 | 0.4691 | 0.1335 | 0.9274 | 0.9868 | 0.0 | nan | 0.9791 | 0.8722 | 0.8122 | 0.9747 | 0.8297 | 0.0 | 0.8042 | 0.9185 | 0.3132 | 0.1335 | 0.8411 | 0.9293 | 0.0 | | 0.042 | 181.82 | 2000 | 0.0967 | 0.6447 | 0.6927 | 0.9733 | nan | 0.9928 | 0.9445 | 0.8680 | 0.9886 | 0.9002 | 0.0 | 0.8684 | 0.9425 | 0.4960 | 0.0803 | 0.9386 | 0.9852 | 0.0 | nan | 0.9787 | 0.8760 | 0.8115 | 0.9750 | 0.8353 | 0.0 | 0.8030 | 0.9342 | 0.3328 | 0.0803 | 0.8076 | 0.9470 | 0.0 | | 0.0374 | 183.64 | 2020 | 0.1129 | 0.6375 | 0.6835 | 0.9706 | nan | 0.9924 | 0.9455 | 0.8588 | 0.9892 | 0.8894 | 0.0 | 0.8842 | 0.9191 | 0.4598 | 0.0367 | 0.9190 | 0.9918 | 0.0 | nan | 0.9790 | 0.8655 | 0.7978 | 0.9745 | 0.8342 | 0.0 | 0.8206 | 0.9114 | 0.3061 | 0.0367 | 0.8331 | 0.9283 | 0.0 | | 0.0442 | 185.45 | 2040 | 0.1124 | 0.6460 | 0.6941 | 0.9706 | nan | 0.9926 | 
0.9448 | 0.8853 | 0.9888 | 0.8950 | 0.0 | 0.8626 | 0.9060 | 0.5086 | 0.0894 | 0.9625 | 0.9875 | 0.0 | nan | 0.9792 | 0.8786 | 0.8236 | 0.9750 | 0.8357 | 0.0 | 0.8144 | 0.8990 | 0.3366 | 0.0894 | 0.8274 | 0.9386 | 0.0 | | 0.039 | 187.27 | 2060 | 0.1035 | 0.6498 | 0.6960 | 0.9731 | nan | 0.9939 | 0.9453 | 0.8743 | 0.9877 | 0.8961 | 0.0 | 0.8663 | 0.9283 | 0.5815 | 0.0531 | 0.9382 | 0.9837 | 0.0 | nan | 0.9789 | 0.8854 | 0.8230 | 0.9749 | 0.8369 | 0.0 | 0.8153 | 0.9222 | 0.3904 | 0.0531 | 0.8315 | 0.9353 | 0.0 | | 0.0399 | 189.09 | 2080 | 0.1065 | 0.6524 | 0.7053 | 0.9719 | nan | 0.9914 | 0.9487 | 0.9079 | 0.9860 | 0.9134 | 0.0 | 0.8916 | 0.9178 | 0.5399 | 0.1436 | 0.9398 | 0.9882 | 0.0 | nan | 0.9795 | 0.8828 | 0.8276 | 0.9751 | 0.8305 | 0.0 | 0.8146 | 0.9102 | 0.3570 | 0.1436 | 0.8248 | 0.9359 | 0.0 | | 0.0453 | 190.91 | 2100 | 0.1085 | 0.6539 | 0.7008 | 0.9715 | nan | 0.9908 | 0.9478 | 0.8996 | 0.9894 | 0.8936 | 0.0 | 0.8861 | 0.9166 | 0.5587 | 0.1298 | 0.9151 | 0.9828 | 0.0 | nan | 0.9793 | 0.8822 | 0.8145 | 0.9747 | 0.8314 | 0.0 | 0.8284 | 0.9090 | 0.3664 | 0.1298 | 0.8376 | 0.9473 | 0.0 | | 0.0424 | 192.73 | 2120 | 0.1242 | 0.6399 | 0.6789 | 0.9685 | nan | 0.9935 | 0.9428 | 0.8621 | 0.9885 | 0.8940 | 0.0 | 0.8553 | 0.8895 | 0.4576 | 0.0785 | 0.8791 | 0.9849 | 0.0 | nan | 0.9792 | 0.8688 | 0.8021 | 0.9752 | 0.8347 | 0.0 | 0.8214 | 0.8821 | 0.3031 | 0.0785 | 0.8396 | 0.9333 | 0.0 | | 0.0447 | 194.55 | 2140 | 0.0969 | 0.6571 | 0.7036 | 0.9741 | nan | 0.9928 | 0.9496 | 0.8869 | 0.9861 | 0.9057 | 0.0 | 0.8605 | 0.9397 | 0.5943 | 0.1185 | 0.9292 | 0.9839 | 0.0 | nan | 0.9796 | 0.8869 | 0.8219 | 0.9755 | 0.8337 | 0.0 | 0.8229 | 0.9314 | 0.3983 | 0.1185 | 0.8297 | 0.9446 | 0.0 | | 0.041 | 196.36 | 2160 | 0.1037 | 0.6440 | 0.6925 | 0.9728 | nan | 0.9911 | 0.9520 | 0.8945 | 0.9877 | 0.9020 | 0.0 | 0.8730 | 0.9330 | 0.5270 | 0.0333 | 0.9222 | 0.9871 | 0.0 | nan | 0.9801 | 0.8794 | 0.8242 | 0.9751 | 0.8313 | 0.0 | 0.8158 | 0.9229 | 0.3409 | 0.0333 | 0.8325 | 0.9361 | 0.0 | | 0.0407 | 198.18 | 2180 | 0.0970 | 0.6534 | 0.6982 | 0.9741 | nan | 0.9922 | 0.9549 | 0.8958 | 0.9864 | 0.9019 | 0.0 | 0.8594 | 0.9377 | 0.5668 | 0.0813 | 0.9150 | 0.9854 | 0.0 | nan | 0.9795 | 0.8872 | 0.8284 | 0.9755 | 0.8361 | 0.0 | 0.8159 | 0.9311 | 0.3776 | 0.0813 | 0.8390 | 0.9427 | 0.0 | | 0.0423 | 200.0 | 2200 | 0.1142 | 0.6458 | 0.6857 | 0.9713 | nan | 0.9931 | 0.9536 | 0.8819 | 0.9867 | 0.8953 | 0.0 | 0.8528 | 0.9101 | 0.4995 | 0.0458 | 0.9151 | 0.9800 | 0.0 | nan | 0.9792 | 0.8851 | 0.8256 | 0.9757 | 0.8350 | 0.0 | 0.8187 | 0.9030 | 0.3364 | 0.0458 | 0.8417 | 0.9489 | 0.0 | | 0.0473 | 201.82 | 2220 | 0.1097 | 0.6449 | 0.6868 | 0.9721 | nan | 0.9925 | 0.9470 | 0.9050 | 0.9876 | 0.9043 | 0.0 | 0.8599 | 0.9182 | 0.4843 | 0.0331 | 0.9084 | 0.9885 | 0.0 | nan | 0.9799 | 0.8820 | 0.8344 | 0.9753 | 0.8367 | 0.0 | 0.8230 | 0.9102 | 0.3255 | 0.0331 | 0.8494 | 0.9341 | 0.0 | | 0.0381 | 203.64 | 2240 | 0.0959 | 0.6528 | 0.6953 | 0.9744 | nan | 0.9927 | 0.9464 | 0.9132 | 0.9868 | 0.9039 | 0.0 | 0.8731 | 0.9425 | 0.4724 | 0.0949 | 0.9332 | 0.9801 | 0.0 | nan | 0.9797 | 0.8859 | 0.8375 | 0.9758 | 0.8376 | 0.0 | 0.8194 | 0.9351 | 0.3156 | 0.0949 | 0.8611 | 0.9445 | 0.0 | | 0.0362 | 205.45 | 2260 | 0.0900 | 0.6588 | 0.7062 | 0.9750 | nan | 0.9920 | 0.9460 | 0.9109 | 0.9875 | 0.9014 | 0.0 | 0.8806 | 0.9502 | 0.5285 | 0.1579 | 0.9389 | 0.9869 | 0.0 | nan | 0.9798 | 0.8868 | 0.8348 | 0.9752 | 0.8325 | 0.0 | 0.8100 | 0.9409 | 0.3510 | 0.1579 | 0.8579 | 0.9369 | 0.0 | | 0.039 | 207.27 | 2280 | 0.0881 | 0.6658 | 0.7105 | 0.9754 | nan | 
0.9929 | 0.9455 | 0.9029 | 0.9875 | 0.8980 | 0.0 | 0.8771 | 0.9511 | 0.5019 | 0.2533 | 0.9458 | 0.9809 | 0.0 | nan | 0.9796 | 0.8885 | 0.8338 | 0.9755 | 0.8355 | 0.0 | 0.8113 | 0.9430 | 0.3352 | 0.2532 | 0.8491 | 0.9508 | 0.0 | | 0.0398 | 209.09 | 2300 | 0.0998 | 0.6528 | 0.6944 | 0.9728 | nan | 0.9919 | 0.9449 | 0.9178 | 0.9875 | 0.9061 | 0.0 | 0.8733 | 0.9274 | 0.4699 | 0.1329 | 0.8958 | 0.9801 | 0.0 | nan | 0.9794 | 0.8844 | 0.8302 | 0.9755 | 0.8326 | 0.0 | 0.8198 | 0.9202 | 0.3128 | 0.1329 | 0.8480 | 0.9504 | 0.0 | | 0.0365 | 210.91 | 2320 | 0.0984 | 0.6675 | 0.7098 | 0.9733 | nan | 0.9933 | 0.9445 | 0.8949 | 0.9888 | 0.8964 | 0.0 | 0.8677 | 0.9261 | 0.4936 | 0.3113 | 0.9244 | 0.9861 | 0.0 | nan | 0.9795 | 0.8890 | 0.8324 | 0.9753 | 0.8317 | 0.0 | 0.8176 | 0.9192 | 0.3306 | 0.3111 | 0.8537 | 0.9368 | 0.0 | | 0.0367 | 212.73 | 2340 | 0.1067 | 0.6602 | 0.7011 | 0.9719 | nan | 0.9934 | 0.9422 | 0.8750 | 0.9884 | 0.8982 | 0.0 | 0.8791 | 0.9204 | 0.4682 | 0.2506 | 0.9141 | 0.9843 | 0.0 | nan | 0.9796 | 0.8816 | 0.8147 | 0.9756 | 0.8385 | 0.0 | 0.8254 | 0.9096 | 0.3141 | 0.2504 | 0.8512 | 0.9418 | 0.0 | | 0.0391 | 214.55 | 2360 | 0.0870 | 0.6662 | 0.7085 | 0.9755 | nan | 0.9930 | 0.9450 | 0.8901 | 0.9888 | 0.9011 | 0.0 | 0.8814 | 0.9554 | 0.4655 | 0.2839 | 0.9226 | 0.9832 | 0.0 | nan | 0.9797 | 0.8893 | 0.8246 | 0.9753 | 0.8332 | 0.0 | 0.8310 | 0.9479 | 0.3120 | 0.2835 | 0.8428 | 0.9417 | 0.0 | | 0.0346 | 216.36 | 2380 | 0.0950 | 0.6591 | 0.7038 | 0.9736 | nan | 0.9927 | 0.9460 | 0.8845 | 0.9879 | 0.9088 | 0.0 | 0.8671 | 0.9383 | 0.4853 | 0.2373 | 0.9169 | 0.9842 | 0.0 | nan | 0.9796 | 0.8822 | 0.8151 | 0.9757 | 0.8361 | 0.0 | 0.8191 | 0.9313 | 0.3232 | 0.2372 | 0.8276 | 0.9414 | 0.0 | | 0.0383 | 218.18 | 2400 | 0.1076 | 0.6613 | 0.7094 | 0.9711 | nan | 0.9921 | 0.9488 | 0.8887 | 0.9844 | 0.9121 | 0.0 | 0.8751 | 0.9158 | 0.4338 | 0.3525 | 0.9274 | 0.9919 | 0.0 | nan | 0.9797 | 0.8779 | 0.8134 | 0.9750 | 0.8368 | 0.0 | 0.8200 | 0.9074 | 0.2902 | 0.3522 | 0.8398 | 0.9046 | 0.0 | | 0.0377 | 220.0 | 2420 | 0.1122 | 0.6518 | 0.6957 | 0.9709 | nan | 0.9917 | 0.9429 | 0.9016 | 0.9898 | 0.8987 | 0.0 | 0.8883 | 0.9111 | 0.3966 | 0.2131 | 0.9212 | 0.9887 | 0.0 | nan | 0.9798 | 0.8757 | 0.8202 | 0.9753 | 0.8400 | 0.0 | 0.8250 | 0.9030 | 0.2656 | 0.2130 | 0.8447 | 0.9306 | 0.0 | | 0.0379 | 221.82 | 2440 | 0.0987 | 0.6640 | 0.7004 | 0.9738 | nan | 0.9941 | 0.9488 | 0.8726 | 0.9868 | 0.9009 | 0.0 | 0.8637 | 0.9365 | 0.4319 | 0.2848 | 0.9094 | 0.9752 | 0.0 | nan | 0.9791 | 0.8878 | 0.8249 | 0.9759 | 0.8389 | 0.0 | 0.8238 | 0.9275 | 0.2918 | 0.2845 | 0.8525 | 0.9457 | 0.0 | | 0.0343 | 223.64 | 2460 | 0.1061 | 0.6487 | 0.6894 | 0.9725 | nan | 0.9926 | 0.9501 | 0.8866 | 0.9875 | 0.8991 | 0.0 | 0.8693 | 0.9286 | 0.4146 | 0.1344 | 0.9150 | 0.9846 | 0.0 | nan | 0.9796 | 0.8692 | 0.8190 | 0.9760 | 0.8445 | 0.0 | 0.8169 | 0.9197 | 0.2795 | 0.1344 | 0.8453 | 0.9488 | 0.0 | | 0.0383 | 225.45 | 2480 | 0.1005 | 0.6618 | 0.7027 | 0.9733 | nan | 0.9928 | 0.9465 | 0.8944 | 0.9887 | 0.9054 | 0.0 | 0.8781 | 0.9289 | 0.4708 | 0.2467 | 0.8929 | 0.9900 | 0.0 | nan | 0.9795 | 0.8830 | 0.8284 | 0.9761 | 0.8437 | 0.0 | 0.8206 | 0.9213 | 0.3143 | 0.2464 | 0.8506 | 0.9397 | 0.0 | | 0.0363 | 227.27 | 2500 | 0.1074 | 0.6551 | 0.6951 | 0.9718 | nan | 0.9929 | 0.9474 | 0.8825 | 0.9881 | 0.9054 | 0.0 | 0.8826 | 0.9187 | 0.3993 | 0.2288 | 0.9078 | 0.9831 | 0.0 | nan | 0.9792 | 0.8771 | 0.8129 | 0.9762 | 0.8419 | 0.0 | 0.8189 | 0.9124 | 0.2724 | 0.2287 | 0.8479 | 0.9482 | 0.0 | | 0.0385 | 229.09 | 2520 | 0.1043 | 0.6731 | 0.7193 | 0.9721 
| nan | 0.9933 | 0.9460 | 0.8837 | 0.9870 | 0.9116 | 0.0 | 0.8674 | 0.9127 | 0.5260 | 0.4081 | 0.9276 | 0.9879 | 0.0 | nan | 0.9793 | 0.8821 | 0.8220 | 0.9759 | 0.8386 | 0.0 | 0.8192 | 0.9056 | 0.3289 | 0.4075 | 0.8638 | 0.9278 | 0.0 | | 0.0373 | 230.91 | 2540 | 0.0916 | 0.6496 | 0.6911 | 0.9752 | nan | 0.9927 | 0.9497 | 0.8680 | 0.9870 | 0.9075 | 0.0 | 0.8735 | 0.9633 | 0.4731 | 0.0747 | 0.9151 | 0.9797 | 0.0 | nan | 0.9794 | 0.8766 | 0.8046 | 0.9757 | 0.8373 | 0.0 | 0.8183 | 0.9545 | 0.3179 | 0.0747 | 0.8549 | 0.9508 | 0.0 | | 0.0366 | 232.73 | 2560 | 0.0847 | 0.6691 | 0.7167 | 0.9772 | nan | 0.9923 | 0.9459 | 0.9042 | 0.9887 | 0.9046 | 0.0 | 0.8739 | 0.9677 | 0.6272 | 0.1859 | 0.9395 | 0.9880 | 0.0 | nan | 0.9798 | 0.8933 | 0.8331 | 0.9759 | 0.8407 | 0.0 | 0.8193 | 0.9571 | 0.4178 | 0.1859 | 0.8634 | 0.9317 | 0.0 | | 0.0414 | 234.55 | 2580 | 0.0939 | 0.6490 | 0.6894 | 0.9752 | nan | 0.9925 | 0.9569 | 0.8856 | 0.9870 | 0.8921 | 0.0 | 0.8494 | 0.9563 | 0.4653 | 0.0709 | 0.9268 | 0.9795 | 0.0 | nan | 0.9798 | 0.8794 | 0.8275 | 0.9756 | 0.8384 | 0.0 | 0.8147 | 0.9476 | 0.3099 | 0.0709 | 0.8441 | 0.9494 | 0.0 | | 0.0346 | 236.36 | 2600 | 0.0996 | 0.6545 | 0.6965 | 0.9742 | nan | 0.9932 | 0.9517 | 0.8859 | 0.9872 | 0.9077 | 0.0 | 0.8768 | 0.9402 | 0.4570 | 0.1435 | 0.9322 | 0.9790 | 0.0 | nan | 0.9796 | 0.8829 | 0.8273 | 0.9762 | 0.8420 | 0.0 | 0.8076 | 0.9333 | 0.3059 | 0.1435 | 0.8563 | 0.9538 | 0.0 | | 0.029 | 238.18 | 2620 | 0.0890 | 0.6611 | 0.6995 | 0.9757 | nan | 0.9934 | 0.9481 | 0.8936 | 0.9881 | 0.9021 | 0.0 | 0.8593 | 0.9552 | 0.4595 | 0.2004 | 0.9164 | 0.9776 | 0.0 | nan | 0.9792 | 0.8856 | 0.8319 | 0.9764 | 0.8420 | 0.0 | 0.8096 | 0.9484 | 0.3101 | 0.2003 | 0.8606 | 0.9506 | 0.0 | | 0.0364 | 240.0 | 2640 | 0.0909 | 0.6572 | 0.7009 | 0.9757 | nan | 0.9918 | 0.9496 | 0.9183 | 0.9874 | 0.9126 | 0.0 | 0.8707 | 0.9547 | 0.4623 | 0.1635 | 0.9155 | 0.9857 | 0.0 | nan | 0.9796 | 0.8840 | 0.8399 | 0.9761 | 0.8407 | 0.0 | 0.8115 | 0.9472 | 0.3082 | 0.1635 | 0.8461 | 0.9466 | 0.0 | | 0.0378 | 241.82 | 2660 | 0.1189 | 0.6465 | 0.6878 | 0.9711 | nan | 0.9927 | 0.9481 | 0.9003 | 0.9879 | 0.8954 | 0.0 | 0.8815 | 0.9081 | 0.4541 | 0.0652 | 0.9213 | 0.9868 | 0.0 | nan | 0.9799 | 0.8833 | 0.8392 | 0.9757 | 0.8418 | 0.0 | 0.8246 | 0.8977 | 0.3027 | 0.0652 | 0.8540 | 0.9400 | 0.0 | | 0.0364 | 243.64 | 2680 | 0.1163 | 0.6478 | 0.6961 | 0.9713 | nan | 0.9922 | 0.9447 | 0.9114 | 0.9883 | 0.8977 | 0.0 | 0.8686 | 0.9083 | 0.4920 | 0.0920 | 0.9648 | 0.9891 | 0.0 | nan | 0.9795 | 0.8835 | 0.8338 | 0.9762 | 0.8434 | 0.0 | 0.8138 | 0.8991 | 0.3271 | 0.0918 | 0.8566 | 0.9163 | 0.0 | | 0.0339 | 245.45 | 2700 | 0.1076 | 0.6511 | 0.6946 | 0.9725 | nan | 0.9929 | 0.9489 | 0.8986 | 0.9886 | 0.8957 | 0.0 | 0.8640 | 0.9174 | 0.5220 | 0.0932 | 0.9231 | 0.9853 | 0.0 | nan | 0.9800 | 0.8850 | 0.8335 | 0.9761 | 0.8393 | 0.0 | 0.8062 | 0.9103 | 0.3452 | 0.0931 | 0.8478 | 0.9477 | 0.0 | | 0.0353 | 247.27 | 2720 | 0.0962 | 0.6604 | 0.7042 | 0.9750 | nan | 0.9925 | 0.9480 | 0.9092 | 0.9880 | 0.9056 | 0.0 | 0.8581 | 0.9461 | 0.4787 | 0.2037 | 0.9393 | 0.9854 | 0.0 | nan | 0.9801 | 0.8871 | 0.8381 | 0.9763 | 0.8380 | 0.0 | 0.8099 | 0.9373 | 0.3189 | 0.2032 | 0.8565 | 0.9403 | 0.0 | | 0.0328 | 249.09 | 2740 | 0.0971 | 0.6526 | 0.6908 | 0.9751 | nan | 0.9936 | 0.9463 | 0.8975 | 0.9893 | 0.8997 | 0.0 | 0.8706 | 0.9470 | 0.4562 | 0.0899 | 0.9249 | 0.9653 | 0.0 | nan | 0.9796 | 0.8851 | 0.8337 | 0.9764 | 0.8431 | 0.0 | 0.8177 | 0.9399 | 0.3098 | 0.0899 | 0.8589 | 0.9500 | 0.0 | | 0.0356 | 250.91 | 2760 | 0.1015 | 0.6489 | 0.6913 | 
0.9746 | nan | 0.9926 | 0.9498 | 0.9064 | 0.9880 | 0.9107 | 0.0 | 0.8614 | 0.9440 | 0.4341 | 0.0809 | 0.9395 | 0.9796 | 0.0 | nan | 0.9802 | 0.8818 | 0.8357 | 0.9762 | 0.8419 | 0.0 | 0.8091 | 0.9370 | 0.2908 | 0.0808 | 0.8500 | 0.9522 | 0.0 | | 0.0383 | 252.73 | 2780 | 0.0961 | 0.6501 | 0.6945 | 0.9752 | nan | 0.9928 | 0.9431 | 0.9144 | 0.9889 | 0.9036 | 0.0 | 0.8666 | 0.9496 | 0.4640 | 0.0811 | 0.9359 | 0.9886 | 0.0 | nan | 0.9800 | 0.8831 | 0.8373 | 0.9762 | 0.8421 | 0.0 | 0.8104 | 0.9420 | 0.3138 | 0.0810 | 0.8524 | 0.9328 | 0.0 | | 0.0368 | 254.55 | 2800 | 0.1057 | 0.6475 | 0.6871 | 0.9737 | nan | 0.9930 | 0.9435 | 0.9092 | 0.9894 | 0.9046 | 0.0 | 0.8617 | 0.9319 | 0.4621 | 0.0683 | 0.8796 | 0.9887 | 0.0 | nan | 0.9803 | 0.8826 | 0.8369 | 0.9759 | 0.8427 | 0.0 | 0.8200 | 0.9239 | 0.3130 | 0.0683 | 0.8387 | 0.9359 | 0.0 | | 0.0384 | 256.36 | 2820 | 0.0982 | 0.6487 | 0.6865 | 0.9754 | nan | 0.9929 | 0.9478 | 0.9111 | 0.9888 | 0.9009 | 0.0 | 0.8570 | 0.9511 | 0.4426 | 0.0480 | 0.9020 | 0.9822 | 0.0 | nan | 0.9804 | 0.8846 | 0.8404 | 0.9762 | 0.8416 | 0.0 | 0.8202 | 0.9428 | 0.2983 | 0.0480 | 0.8516 | 0.9493 | 0.0 | | 0.0397 | 258.18 | 2840 | 0.1466 | 0.6451 | 0.6865 | 0.9682 | nan | 0.9931 | 0.9459 | 0.8996 | 0.9887 | 0.9032 | 0.0 | 0.8676 | 0.8718 | 0.4513 | 0.0848 | 0.9297 | 0.9887 | 0.0 | nan | 0.9798 | 0.8838 | 0.8376 | 0.9763 | 0.8404 | 0.0 | 0.8213 | 0.8646 | 0.3021 | 0.0847 | 0.8609 | 0.9346 | 0.0 | | 0.038 | 260.0 | 2860 | 0.1468 | 0.6483 | 0.6927 | 0.9682 | nan | 0.9929 | 0.9485 | 0.8948 | 0.9867 | 0.9072 | 0.0 | 0.8651 | 0.8744 | 0.4672 | 0.1379 | 0.9425 | 0.9878 | 0.0 | nan | 0.9800 | 0.8840 | 0.8300 | 0.9763 | 0.8424 | 0.0 | 0.8136 | 0.8654 | 0.3113 | 0.1378 | 0.8543 | 0.9327 | 0.0 | | 0.0387 | 261.82 | 2880 | 0.1367 | 0.6458 | 0.6887 | 0.9682 | nan | 0.9921 | 0.9506 | 0.9011 | 0.9873 | 0.9034 | 0.0 | 0.8918 | 0.8766 | 0.4037 | 0.1411 | 0.9176 | 0.9881 | 0.0 | nan | 0.9803 | 0.8771 | 0.8240 | 0.9763 | 0.8453 | 0.0 | 0.8182 | 0.8692 | 0.2718 | 0.1411 | 0.8544 | 0.9371 | 0.0 | | 0.0406 | 263.64 | 2900 | 0.1388 | 0.6462 | 0.6837 | 0.9686 | nan | 0.9931 | 0.9496 | 0.9036 | 0.9882 | 0.9004 | 0.0 | 0.8422 | 0.8734 | 0.4268 | 0.1141 | 0.9176 | 0.9789 | 0.0 | nan | 0.9801 | 0.8847 | 0.8350 | 0.9761 | 0.8426 | 0.0 | 0.8114 | 0.8671 | 0.2895 | 0.1140 | 0.8518 | 0.9479 | 0.0 | | 0.0333 | 265.45 | 2920 | 0.1423 | 0.6417 | 0.6813 | 0.9682 | nan | 0.9936 | 0.9488 | 0.9013 | 0.9876 | 0.9089 | 0.0 | 0.8564 | 0.8689 | 0.4076 | 0.0753 | 0.9224 | 0.9863 | 0.0 | nan | 0.9800 | 0.8830 | 0.8374 | 0.9762 | 0.8432 | 0.0 | 0.8099 | 0.8638 | 0.2764 | 0.0752 | 0.8575 | 0.9400 | 0.0 | | 0.0371 | 267.27 | 2940 | 0.1304 | 0.6447 | 0.6840 | 0.9696 | nan | 0.9929 | 0.9500 | 0.9074 | 0.9888 | 0.8951 | 0.0 | 0.8676 | 0.8841 | 0.4537 | 0.0632 | 0.9064 | 0.9827 | 0.0 | nan | 0.9804 | 0.8833 | 0.8422 | 0.9764 | 0.8446 | 0.0 | 0.8170 | 0.8775 | 0.3021 | 0.0632 | 0.8548 | 0.9394 | 0.0 | | 0.0352 | 269.09 | 2960 | 0.1279 | 0.6472 | 0.6870 | 0.9699 | nan | 0.9930 | 0.9512 | 0.9078 | 0.9872 | 0.9099 | 0.0 | 0.8505 | 0.8856 | 0.4674 | 0.0723 | 0.9237 | 0.9828 | 0.0 | nan | 0.9804 | 0.8852 | 0.8422 | 0.9766 | 0.8423 | 0.0 | 0.8170 | 0.8788 | 0.3129 | 0.0722 | 0.8605 | 0.9449 | 0.0 | | 0.0334 | 270.91 | 2980 | 0.1152 | 0.6485 | 0.6882 | 0.9723 | nan | 0.9928 | 0.9534 | 0.9089 | 0.9867 | 0.9112 | 0.0 | 0.8471 | 0.9139 | 0.4807 | 0.0732 | 0.8960 | 0.9827 | 0.0 | nan | 0.9802 | 0.8832 | 0.8407 | 0.9765 | 0.8439 | 0.0 | 0.8158 | 0.9075 | 0.3193 | 0.0731 | 0.8427 | 0.9481 | 0.0 | | 0.0331 | 272.73 | 3000 | 0.1094 | 0.6447 | 
0.6842 | 0.9734 | nan | 0.9938 | 0.9506 | 0.8812 | 0.9863 | 0.9115 | 0.0 | 0.8775 | 0.9348 | 0.3710 | 0.0707 | 0.9306 | 0.9864 | 0.0 | nan | 0.9797 | 0.8824 | 0.8242 | 0.9764 | 0.8466 | 0.0 | 0.8228 | 0.9269 | 0.2543 | 0.0706 | 0.8557 | 0.9418 | 0.0 | | 0.0397 | 274.55 | 3020 | 0.1459 | 0.6450 | 0.6880 | 0.9685 | nan | 0.9913 | 0.9531 | 0.9213 | 0.9887 | 0.9014 | 0.0 | 0.8561 | 0.8730 | 0.4884 | 0.0579 | 0.9380 | 0.9742 | 0.0 | nan | 0.9801 | 0.8830 | 0.8371 | 0.9766 | 0.8445 | 0.0 | 0.8194 | 0.8664 | 0.3220 | 0.0579 | 0.8477 | 0.9505 | 0.0 | | 0.0376 | 276.36 | 3040 | 0.1256 | 0.6477 | 0.6861 | 0.9709 | nan | 0.9932 | 0.9507 | 0.8959 | 0.9869 | 0.9154 | 0.0 | 0.8651 | 0.9021 | 0.4304 | 0.0857 | 0.9226 | 0.9712 | 0.0 | nan | 0.9796 | 0.8852 | 0.8328 | 0.9764 | 0.8431 | 0.0 | 0.8257 | 0.8955 | 0.2909 | 0.0855 | 0.8581 | 0.9473 | 0.0 | | 0.0355 | 278.18 | 3060 | 0.1042 | 0.6466 | 0.6889 | 0.9747 | nan | 0.9925 | 0.9473 | 0.9166 | 0.9888 | 0.9084 | 0.0 | 0.8891 | 0.9434 | 0.4081 | 0.0692 | 0.9006 | 0.9913 | 0.0 | nan | 0.9802 | 0.8834 | 0.8377 | 0.9763 | 0.8458 | 0.0 | 0.8202 | 0.9369 | 0.2743 | 0.0691 | 0.8489 | 0.9336 | 0.0 | | 0.0385 | 280.0 | 3080 | 0.1003 | 0.6610 | 0.7055 | 0.9747 | nan | 0.9920 | 0.9463 | 0.9175 | 0.9893 | 0.9007 | 0.0 | 0.8564 | 0.9408 | 0.5343 | 0.1822 | 0.9226 | 0.9901 | 0.0 | nan | 0.9802 | 0.8886 | 0.8337 | 0.9766 | 0.8450 | 0.0 | 0.8132 | 0.9323 | 0.3547 | 0.1819 | 0.8547 | 0.9319 | 0.0 | | 0.0375 | 281.82 | 3100 | 0.1082 | 0.6517 | 0.6955 | 0.9730 | nan | 0.9926 | 0.9455 | 0.9030 | 0.9887 | 0.9064 | 0.0 | 0.8594 | 0.9259 | 0.5051 | 0.0940 | 0.9343 | 0.9864 | 0.0 | nan | 0.9799 | 0.8844 | 0.8304 | 0.9766 | 0.8460 | 0.0 | 0.8194 | 0.9168 | 0.3366 | 0.0939 | 0.8555 | 0.9324 | 0.0 | | 0.0357 | 283.64 | 3120 | 0.1075 | 0.6502 | 0.6933 | 0.9735 | nan | 0.9928 | 0.9493 | 0.9009 | 0.9880 | 0.9062 | 0.0 | 0.8816 | 0.9296 | 0.4668 | 0.0868 | 0.9267 | 0.9842 | 0.0 | nan | 0.9799 | 0.8843 | 0.8307 | 0.9769 | 0.8448 | 0.0 | 0.8140 | 0.9225 | 0.3143 | 0.0867 | 0.8554 | 0.9434 | 0.0 | | 0.0341 | 285.45 | 3140 | 0.1091 | 0.6502 | 0.6944 | 0.9731 | nan | 0.9921 | 0.9492 | 0.9150 | 0.9883 | 0.9075 | 0.0 | 0.8668 | 0.9249 | 0.4764 | 0.0871 | 0.9300 | 0.9903 | 0.0 | nan | 0.9801 | 0.8869 | 0.8347 | 0.9767 | 0.8439 | 0.0 | 0.8218 | 0.9173 | 0.3210 | 0.0871 | 0.8571 | 0.9256 | 0.0 | | 0.0433 | 287.27 | 3160 | 0.0948 | 0.6581 | 0.6971 | 0.9753 | nan | 0.9925 | 0.9521 | 0.9171 | 0.9876 | 0.9109 | 0.0 | 0.8641 | 0.9448 | 0.4618 | 0.1374 | 0.9102 | 0.9842 | 0.0 | nan | 0.9802 | 0.8906 | 0.8398 | 0.9766 | 0.8442 | 0.0 | 0.8263 | 0.9381 | 0.3118 | 0.1373 | 0.8630 | 0.9473 | 0.0 | | 0.032 | 289.09 | 3180 | 0.1012 | 0.6563 | 0.6981 | 0.9747 | nan | 0.9933 | 0.9507 | 0.9007 | 0.9875 | 0.9115 | 0.0 | 0.8715 | 0.9381 | 0.4890 | 0.1092 | 0.9363 | 0.9872 | 0.0 | nan | 0.9803 | 0.8896 | 0.8362 | 0.9765 | 0.8439 | 0.0 | 0.8247 | 0.9315 | 0.3282 | 0.1091 | 0.8707 | 0.9415 | 0.0 | | 0.0324 | 290.91 | 3200 | 0.1065 | 0.6532 | 0.6958 | 0.9740 | nan | 0.9929 | 0.9477 | 0.9010 | 0.9887 | 0.9010 | 0.0 | 0.8773 | 0.9350 | 0.4745 | 0.1097 | 0.9297 | 0.9884 | 0.0 | nan | 0.9802 | 0.8865 | 0.8310 | 0.9767 | 0.8447 | 0.0 | 0.8237 | 0.9274 | 0.3161 | 0.1096 | 0.8635 | 0.9324 | 0.0 | | 0.0399 | 292.73 | 3220 | 0.1024 | 0.6618 | 0.7039 | 0.9745 | nan | 0.9927 | 0.9484 | 0.9118 | 0.9881 | 0.9022 | 0.0 | 0.8858 | 0.9392 | 0.4226 | 0.2348 | 0.9387 | 0.9868 | 0.0 | nan | 0.9803 | 0.8879 | 0.8379 | 0.9763 | 0.8409 | 0.0 | 0.8257 | 0.9311 | 0.2880 | 0.2344 | 0.8686 | 0.9326 | 0.0 | | 0.0358 | 294.55 | 3240 | 0.0968 | 
0.6583 | 0.6988 | 0.9754 | nan | 0.9931 | 0.9516 | 0.9007 | 0.9888 | 0.9003 | 0.0 | 0.8733 | 0.9472 | 0.4539 | 0.1575 | 0.9322 | 0.9856 | 0.0 | nan | 0.9805 | 0.8855 | 0.8392 | 0.9766 | 0.8445 | 0.0 | 0.8260 | 0.9396 | 0.3039 | 0.1573 | 0.8612 | 0.9430 | 0.0 | | 0.0318 | 296.36 | 3260 | 0.0984 | 0.6572 | 0.7017 | 0.9746 | nan | 0.9930 | 0.9477 | 0.8847 | 0.9889 | 0.9082 | 0.0 | 0.8899 | 0.9443 | 0.4672 | 0.1735 | 0.9339 | 0.9906 | 0.0 | nan | 0.9803 | 0.8838 | 0.8241 | 0.9765 | 0.8420 | 0.0 | 0.8275 | 0.9362 | 0.3131 | 0.1730 | 0.8511 | 0.9358 | 0.0 | | 0.0367 | 298.18 | 3280 | 0.0947 | 0.6654 | 0.7079 | 0.9753 | nan | 0.9927 | 0.9520 | 0.9038 | 0.9876 | 0.9102 | 0.0 | 0.8762 | 0.9459 | 0.4688 | 0.2340 | 0.9501 | 0.9809 | 0.0 | nan | 0.9801 | 0.8911 | 0.8361 | 0.9769 | 0.8468 | 0.0 | 0.8173 | 0.9376 | 0.3177 | 0.2333 | 0.8562 | 0.9574 | 0.0 | | 0.034 | 300.0 | 3300 | 0.1027 | 0.6550 | 0.6972 | 0.9745 | nan | 0.9936 | 0.9506 | 0.8847 | 0.9871 | 0.9078 | 0.0 | 0.8731 | 0.9405 | 0.4984 | 0.1196 | 0.9224 | 0.9853 | 0.0 | nan | 0.9799 | 0.8866 | 0.8280 | 0.9766 | 0.8477 | 0.0 | 0.8185 | 0.9333 | 0.3341 | 0.1190 | 0.8477 | 0.9432 | 0.0 | | 0.0394 | 301.82 | 3320 | 0.1032 | 0.6578 | 0.7034 | 0.9742 | nan | 0.9931 | 0.9493 | 0.8916 | 0.9874 | 0.9112 | 0.0 | 0.8673 | 0.9365 | 0.4777 | 0.2113 | 0.9263 | 0.9921 | 0.0 | nan | 0.9805 | 0.8835 | 0.8296 | 0.9766 | 0.8464 | 0.0 | 0.8245 | 0.9295 | 0.3217 | 0.2103 | 0.8460 | 0.9032 | 0.0 | | 0.0353 | 303.64 | 3340 | 0.0979 | 0.6641 | 0.7104 | 0.9749 | nan | 0.9924 | 0.9497 | 0.9076 | 0.9882 | 0.9109 | 0.0 | 0.8726 | 0.9414 | 0.4908 | 0.2587 | 0.9361 | 0.9866 | 0.0 | nan | 0.9801 | 0.8886 | 0.8345 | 0.9768 | 0.8497 | 0.0 | 0.8241 | 0.9335 | 0.3314 | 0.2581 | 0.8295 | 0.9277 | 0.0 | | 0.0325 | 305.45 | 3360 | 0.0969 | 0.6640 | 0.7063 | 0.9751 | nan | 0.9934 | 0.9500 | 0.8941 | 0.9871 | 0.9121 | 0.0 | 0.8845 | 0.9438 | 0.4721 | 0.2378 | 0.9276 | 0.9793 | 0.0 | nan | 0.9801 | 0.8905 | 0.8309 | 0.9768 | 0.8476 | 0.0 | 0.8264 | 0.9352 | 0.3194 | 0.2376 | 0.8489 | 0.9392 | 0.0 | | 0.0356 | 307.27 | 3380 | 0.0955 | 0.6611 | 0.7019 | 0.9754 | nan | 0.9919 | 0.9491 | 0.9202 | 0.9899 | 0.8986 | 0.0 | 0.8815 | 0.9480 | 0.4479 | 0.1931 | 0.9190 | 0.9857 | 0.0 | nan | 0.9804 | 0.8876 | 0.8392 | 0.9763 | 0.8482 | 0.0 | 0.8300 | 0.9393 | 0.3040 | 0.1929 | 0.8553 | 0.9404 | 0.0 | | 0.033 | 309.09 | 3400 | 0.0864 | 0.6635 | 0.7068 | 0.9773 | nan | 0.9933 | 0.9498 | 0.8937 | 0.9878 | 0.9080 | 0.0 | 0.8824 | 0.9700 | 0.4898 | 0.1834 | 0.9464 | 0.9834 | 0.0 | nan | 0.9800 | 0.8896 | 0.8338 | 0.9766 | 0.8462 | 0.0 | 0.8299 | 0.9615 | 0.3289 | 0.1829 | 0.8562 | 0.9403 | 0.0 | | 0.0304 | 310.91 | 3420 | 0.0916 | 0.6616 | 0.7037 | 0.9762 | nan | 0.9926 | 0.9543 | 0.9045 | 0.9870 | 0.9089 | 0.0 | 0.8793 | 0.9565 | 0.4878 | 0.1674 | 0.9311 | 0.9782 | 0.0 | nan | 0.9800 | 0.8887 | 0.8346 | 0.9766 | 0.8462 | 0.0 | 0.8261 | 0.9489 | 0.3265 | 0.1672 | 0.8534 | 0.9523 | 0.0 | | 0.0318 | 312.73 | 3440 | 0.0935 | 0.6599 | 0.7017 | 0.9761 | nan | 0.9934 | 0.9474 | 0.9019 | 0.9877 | 0.9066 | 0.0 | 0.8820 | 0.9562 | 0.4831 | 0.1431 | 0.9379 | 0.9826 | 0.0 | nan | 0.9800 | 0.8898 | 0.8358 | 0.9768 | 0.8444 | 0.0 | 0.8229 | 0.9475 | 0.3262 | 0.1427 | 0.8622 | 0.9511 | 0.0 | | 0.0352 | 314.55 | 3460 | 0.0938 | 0.6609 | 0.7056 | 0.9759 | nan | 0.9925 | 0.9503 | 0.9221 | 0.9886 | 0.9002 | 0.0 | 0.8792 | 0.9482 | 0.5006 | 0.1663 | 0.9412 | 0.9836 | 0.0 | nan | 0.9804 | 0.8900 | 0.8406 | 0.9769 | 0.8443 | 0.0 | 0.8220 | 0.9417 | 0.3358 | 0.1660 | 0.8459 | 0.9477 | 0.0 | | 0.0336 | 316.36 | 3480 | 
0.0982 | 0.6582 | 0.7007 | 0.9745 | nan | 0.9937 | 0.9459 | 0.8763 | 0.9881 | 0.9053 | 0.0 | 0.8775 | 0.9448 | 0.4568 | 0.1987 | 0.9377 | 0.9838 | 0.0 | nan | 0.9799 | 0.8837 | 0.8150 | 0.9767 | 0.8457 | 0.0 | 0.8227 | 0.9370 | 0.3087 | 0.1977 | 0.8554 | 0.9342 | 0.0 | | 0.0346 | 318.18 | 3500 | 0.0997 | 0.6642 | 0.7060 | 0.9747 | nan | 0.9926 | 0.9499 | 0.9103 | 0.9878 | 0.9062 | 0.0 | 0.8826 | 0.9394 | 0.4827 | 0.2278 | 0.9121 | 0.9860 | 0.0 | nan | 0.9802 | 0.8884 | 0.8343 | 0.9766 | 0.8449 | 0.0 | 0.8271 | 0.9321 | 0.3253 | 0.2274 | 0.8565 | 0.9415 | 0.0 | | 0.0339 | 320.0 | 3520 | 0.1043 | 0.6610 | 0.7007 | 0.9742 | nan | 0.9934 | 0.9495 | 0.9038 | 0.9876 | 0.9070 | 0.0 | 0.8826 | 0.9326 | 0.4432 | 0.2062 | 0.9174 | 0.9855 | 0.0 | nan | 0.9800 | 0.8909 | 0.8363 | 0.9767 | 0.8469 | 0.0 | 0.8238 | 0.9257 | 0.3016 | 0.2059 | 0.8643 | 0.9411 | 0.0 | | 0.0372 | 321.82 | 3540 | 0.0999 | 0.6605 | 0.7022 | 0.9749 | nan | 0.9935 | 0.9485 | 0.9043 | 0.9865 | 0.9148 | 0.0 | 0.8799 | 0.9408 | 0.4754 | 0.1708 | 0.9279 | 0.9859 | 0.0 | nan | 0.9803 | 0.8891 | 0.8380 | 0.9766 | 0.8477 | 0.0 | 0.8260 | 0.9336 | 0.3210 | 0.1705 | 0.8638 | 0.9395 | 0.0 | | 0.0344 | 323.64 | 3560 | 0.1052 | 0.6526 | 0.6951 | 0.9739 | nan | 0.9930 | 0.9506 | 0.8960 | 0.9873 | 0.9120 | 0.0 | 0.8821 | 0.9343 | 0.4613 | 0.1019 | 0.9341 | 0.9844 | 0.0 | nan | 0.9805 | 0.8835 | 0.8306 | 0.9768 | 0.8507 | 0.0 | 0.8310 | 0.9263 | 0.3098 | 0.1016 | 0.8523 | 0.9402 | 0.0 | | 0.0337 | 325.45 | 3580 | 0.1140 | 0.6610 | 0.7004 | 0.9721 | nan | 0.9931 | 0.9503 | 0.9067 | 0.9884 | 0.9116 | 0.0 | 0.8615 | 0.9055 | 0.5177 | 0.1706 | 0.9183 | 0.9815 | 0.0 | nan | 0.9808 | 0.8915 | 0.8395 | 0.9768 | 0.8513 | 0.0 | 0.8255 | 0.8978 | 0.3436 | 0.1703 | 0.8613 | 0.9549 | 0.0 | | 0.0299 | 327.27 | 3600 | 0.1028 | 0.6597 | 0.6985 | 0.9744 | nan | 0.9936 | 0.9506 | 0.8959 | 0.9876 | 0.9117 | 0.0 | 0.8808 | 0.9349 | 0.4626 | 0.1589 | 0.9242 | 0.9801 | 0.0 | nan | 0.9804 | 0.8908 | 0.8347 | 0.9767 | 0.8499 | 0.0 | 0.8308 | 0.9274 | 0.3142 | 0.1587 | 0.8613 | 0.9512 | 0.0 | | 0.0327 | 329.09 | 3620 | 0.1274 | 0.6525 | 0.6970 | 0.9703 | nan | 0.9917 | 0.9535 | 0.9200 | 0.9881 | 0.9098 | 0.0 | 0.8856 | 0.8883 | 0.5077 | 0.1019 | 0.9302 | 0.9845 | 0.0 | nan | 0.9808 | 0.8871 | 0.8375 | 0.9768 | 0.8499 | 0.0 | 0.8328 | 0.8817 | 0.3361 | 0.1017 | 0.8455 | 0.9519 | 0.0 | | 0.036 | 330.91 | 3640 | 0.1276 | 0.6583 | 0.6987 | 0.9702 | nan | 0.9932 | 0.9487 | 0.9053 | 0.9882 | 0.9103 | 0.0 | 0.8781 | 0.8853 | 0.4711 | 0.2019 | 0.9148 | 0.9867 | 0.0 | nan | 0.9806 | 0.8895 | 0.8365 | 0.9765 | 0.8482 | 0.0 | 0.8299 | 0.8789 | 0.3178 | 0.2016 | 0.8486 | 0.9503 | 0.0 | | 0.0338 | 332.73 | 3660 | 0.1096 | 0.6593 | 0.7017 | 0.9733 | nan | 0.9931 | 0.9480 | 0.9026 | 0.9887 | 0.9094 | 0.0 | 0.8854 | 0.9220 | 0.4773 | 0.1739 | 0.9343 | 0.9872 | 0.0 | nan | 0.9805 | 0.8890 | 0.8355 | 0.9770 | 0.8510 | 0.0 | 0.8334 | 0.9145 | 0.3204 | 0.1737 | 0.8491 | 0.9469 | 0.0 | | 0.034 | 334.55 | 3680 | 0.1070 | 0.6611 | 0.7025 | 0.9739 | nan | 0.9930 | 0.9467 | 0.9052 | 0.9898 | 0.8996 | 0.0 | 0.8764 | 0.9284 | 0.5121 | 0.1626 | 0.9343 | 0.9847 | 0.0 | nan | 0.9807 | 0.8886 | 0.8382 | 0.9766 | 0.8495 | 0.0 | 0.8349 | 0.9205 | 0.3396 | 0.1620 | 0.8533 | 0.9498 | 0.0 | | 0.0314 | 336.36 | 3700 | 0.1142 | 0.6595 | 0.7008 | 0.9725 | nan | 0.9928 | 0.9508 | 0.9107 | 0.9870 | 0.9103 | 0.0 | 0.8786 | 0.9141 | 0.4806 | 0.1704 | 0.9308 | 0.9849 | 0.0 | nan | 0.9804 | 0.8884 | 0.8384 | 0.9768 | 0.8520 | 0.0 | 0.8312 | 0.9060 | 0.3239 | 0.1700 | 0.8596 | 0.9470 | 0.0 | | 0.0296 | 338.18 | 
3720 | 0.1065 | 0.6608 | 0.7052 | 0.9740 | nan | 0.9934 | 0.9512 | 0.8980 | 0.9887 | 0.9036 | 0.0 | 0.8775 | 0.9268 | 0.5043 | 0.1763 | 0.9622 | 0.9854 | 0.0 | nan | 0.9808 | 0.8896 | 0.8395 | 0.9769 | 0.8502 | 0.0 | 0.8337 | 0.9195 | 0.3363 | 0.1758 | 0.8382 | 0.9501 | 0.0 | | 0.0309 | 340.0 | 3740 | 0.1064 | 0.6631 | 0.7045 | 0.9740 | nan | 0.9932 | 0.9509 | 0.9000 | 0.9878 | 0.9064 | 0.0 | 0.8737 | 0.9295 | 0.5074 | 0.1806 | 0.9489 | 0.9799 | 0.0 | nan | 0.9804 | 0.8879 | 0.8381 | 0.9770 | 0.8498 | 0.0 | 0.8322 | 0.9209 | 0.3386 | 0.1803 | 0.8589 | 0.9563 | 0.0 | | 0.029 | 341.82 | 3760 | 0.0915 | 0.6896 | 0.7332 | 0.9763 | nan | 0.9933 | 0.9478 | 0.9059 | 0.9880 | 0.9049 | 0.0 | 0.8808 | 0.9474 | 0.5829 | 0.4499 | 0.9519 | 0.9790 | 0.0 | nan | 0.9802 | 0.8993 | 0.8408 | 0.9770 | 0.8475 | 0.0 | 0.8313 | 0.9387 | 0.3932 | 0.4479 | 0.8648 | 0.9444 | 0.0 | | 0.038 | 343.64 | 3780 | 0.1187 | 0.6534 | 0.6971 | 0.9719 | nan | 0.9932 | 0.9471 | 0.9049 | 0.9887 | 0.9057 | 0.0 | 0.8836 | 0.9071 | 0.4950 | 0.1019 | 0.9446 | 0.9900 | 0.0 | nan | 0.9800 | 0.8812 | 0.8385 | 0.9768 | 0.8491 | 0.0 | 0.8326 | 0.9003 | 0.3332 | 0.1016 | 0.8610 | 0.9402 | 0.0 | | 0.0306 | 345.45 | 3800 | 0.1199 | 0.6540 | 0.6957 | 0.9717 | nan | 0.9934 | 0.9485 | 0.9036 | 0.9886 | 0.9085 | 0.0 | 0.8733 | 0.9017 | 0.5013 | 0.1100 | 0.9276 | 0.9879 | 0.0 | nan | 0.9808 | 0.8865 | 0.8393 | 0.9770 | 0.8514 | 0.0 | 0.8312 | 0.8941 | 0.3342 | 0.1097 | 0.8574 | 0.9407 | 0.0 | | 0.0303 | 347.27 | 3820 | 0.1052 | 0.6698 | 0.7124 | 0.9746 | nan | 0.9930 | 0.9500 | 0.9129 | 0.9879 | 0.9045 | 0.0 | 0.8743 | 0.9304 | 0.5893 | 0.1978 | 0.9364 | 0.9843 | 0.0 | nan | 0.9806 | 0.8942 | 0.8406 | 0.9772 | 0.8501 | 0.0 | 0.8327 | 0.9228 | 0.3938 | 0.1975 | 0.8706 | 0.9480 | 0.0 | | 0.0342 | 349.09 | 3840 | 0.1158 | 0.6600 | 0.7001 | 0.9728 | nan | 0.9933 | 0.9509 | 0.9046 | 0.9877 | 0.9108 | 0.0 | 0.8755 | 0.9139 | 0.4924 | 0.1622 | 0.9254 | 0.9846 | 0.0 | nan | 0.9805 | 0.8891 | 0.8405 | 0.9770 | 0.8506 | 0.0 | 0.8323 | 0.9068 | 0.3283 | 0.1618 | 0.8625 | 0.9509 | 0.0 | | 0.0324 | 350.91 | 3860 | 0.1156 | 0.6605 | 0.7014 | 0.9727 | nan | 0.9929 | 0.9506 | 0.9149 | 0.9881 | 0.9070 | 0.0 | 0.8814 | 0.9118 | 0.5101 | 0.1482 | 0.9283 | 0.9847 | 0.0 | nan | 0.9807 | 0.8909 | 0.8440 | 0.9769 | 0.8486 | 0.0 | 0.8356 | 0.9046 | 0.3397 | 0.1479 | 0.8653 | 0.9529 | 0.0 | | 0.0304 | 352.73 | 3880 | 0.1129 | 0.6615 | 0.7025 | 0.9730 | nan | 0.9925 | 0.9508 | 0.9135 | 0.9887 | 0.9107 | 0.0 | 0.8874 | 0.9171 | 0.4755 | 0.1809 | 0.9288 | 0.9863 | 0.0 | nan | 0.9804 | 0.8896 | 0.8425 | 0.9769 | 0.8512 | 0.0 | 0.8365 | 0.9097 | 0.3200 | 0.1802 | 0.8606 | 0.9524 | 0.0 | | 0.0344 | 354.55 | 3900 | 0.1090 | 0.6594 | 0.6983 | 0.9737 | nan | 0.9933 | 0.9541 | 0.9029 | 0.9873 | 0.9119 | 0.0 | 0.8656 | 0.9238 | 0.4764 | 0.1538 | 0.9247 | 0.9841 | 0.0 | nan | 0.9806 | 0.8885 | 0.8411 | 0.9770 | 0.8520 | 0.0 | 0.8314 | 0.9170 | 0.3190 | 0.1535 | 0.8563 | 0.9552 | 0.0 | | 0.0376 | 356.36 | 3920 | 0.1084 | 0.6639 | 0.7075 | 0.9739 | nan | 0.9923 | 0.9513 | 0.9136 | 0.9892 | 0.9051 | 0.0 | 0.8871 | 0.9262 | 0.4806 | 0.2174 | 0.9455 | 0.9896 | 0.0 | nan | 0.9805 | 0.8898 | 0.8434 | 0.9769 | 0.8515 | 0.0 | 0.8322 | 0.9185 | 0.3222 | 0.2167 | 0.8623 | 0.9366 | 0.0 | | 0.0327 | 358.18 | 3940 | 0.1028 | 0.6641 | 0.7051 | 0.9747 | nan | 0.9930 | 0.9499 | 0.9123 | 0.9894 | 0.9015 | 0.0 | 0.8806 | 0.9326 | 0.4839 | 0.2005 | 0.9338 | 0.9888 | 0.0 | nan | 0.9806 | 0.8909 | 0.8447 | 0.9768 | 0.8476 | 0.0 | 0.8332 | 0.9264 | 0.3271 | 0.2002 | 0.8635 | 0.9423 | 0.0 | | 0.0363 | 
360.0 | 3960 | 0.1098 | 0.6603 | 0.7007 | 0.9739 | nan | 0.9933 | 0.9498 | 0.9041 | 0.9885 | 0.9086 | 0.0 | 0.8880 | 0.9265 | 0.4705 | 0.1675 | 0.9247 | 0.9870 | 0.0 | nan | 0.9803 | 0.8913 | 0.8396 | 0.9772 | 0.8498 | 0.0 | 0.8333 | 0.9194 | 0.3183 | 0.1671 | 0.8628 | 0.9444 | 0.0 | | 0.0317 | 361.82 | 3980 | 0.0958 | 0.6542 | 0.6926 | 0.9762 | nan | 0.9933 | 0.9529 | 0.9043 | 0.9886 | 0.9063 | 0.0 | 0.8821 | 0.9529 | 0.4346 | 0.0914 | 0.9110 | 0.9868 | 0.0 | nan | 0.9808 | 0.8865 | 0.8425 | 0.9771 | 0.8508 | 0.0 | 0.8330 | 0.9463 | 0.2929 | 0.0911 | 0.8535 | 0.9502 | 0.0 | | 0.0414 | 363.64 | 4000 | 0.1092 | 0.6523 | 0.6917 | 0.9740 | nan | 0.9929 | 0.9533 | 0.8956 | 0.9883 | 0.9052 | 0.0 | 0.8828 | 0.9345 | 0.4354 | 0.0910 | 0.9288 | 0.9843 | 0.0 | nan | 0.9803 | 0.8854 | 0.8284 | 0.9770 | 0.8530 | 0.0 | 0.8342 | 0.9276 | 0.2940 | 0.0908 | 0.8606 | 0.9486 | 0.0 | | 0.0338 | 365.45 | 4020 | 0.1130 | 0.6592 | 0.7004 | 0.9735 | nan | 0.9933 | 0.9503 | 0.8980 | 0.9884 | 0.9051 | 0.0 | 0.8856 | 0.9241 | 0.4642 | 0.1656 | 0.9428 | 0.9874 | 0.0 | nan | 0.9805 | 0.8884 | 0.8323 | 0.9771 | 0.8508 | 0.0 | 0.8363 | 0.9175 | 0.3122 | 0.1649 | 0.8609 | 0.9484 | 0.0 | | 0.0321 | 367.27 | 4040 | 0.1147 | 0.6520 | 0.6921 | 0.9732 | nan | 0.9930 | 0.9509 | 0.9042 | 0.9888 | 0.8978 | 0.0 | 0.8816 | 0.9234 | 0.4467 | 0.0944 | 0.9313 | 0.9857 | 0.0 | nan | 0.9804 | 0.8863 | 0.8336 | 0.9769 | 0.8483 | 0.0 | 0.8320 | 0.9168 | 0.3015 | 0.0942 | 0.8590 | 0.9471 | 0.0 | | 0.0291 | 369.09 | 4060 | 0.1129 | 0.6592 | 0.6990 | 0.9736 | nan | 0.9930 | 0.9521 | 0.9075 | 0.9885 | 0.9033 | 0.0 | 0.8799 | 0.9231 | 0.4734 | 0.1556 | 0.9261 | 0.9839 | 0.0 | nan | 0.9805 | 0.8896 | 0.8383 | 0.9772 | 0.8512 | 0.0 | 0.8304 | 0.9163 | 0.3185 | 0.1553 | 0.8603 | 0.9519 | 0.0 | | 0.0344 | 370.91 | 4080 | 0.1066 | 0.6673 | 0.7113 | 0.9752 | nan | 0.9933 | 0.9506 | 0.9042 | 0.9886 | 0.9078 | 0.0 | 0.8807 | 0.9327 | 0.6771 | 0.0985 | 0.9316 | 0.9822 | 0.0 | nan | 0.9807 | 0.8969 | 0.8410 | 0.9771 | 0.8515 | 0.0 | 0.8338 | 0.9256 | 0.4496 | 0.0984 | 0.8678 | 0.9521 | 0.0 | | 0.0309 | 372.73 | 4100 | 0.1078 | 0.6646 | 0.7102 | 0.9750 | nan | 0.9929 | 0.9501 | 0.9114 | 0.9889 | 0.9056 | 0.0 | 0.8885 | 0.9325 | 0.6374 | 0.0962 | 0.9425 | 0.9872 | 0.0 | nan | 0.9808 | 0.8964 | 0.8429 | 0.9769 | 0.8498 | 0.0 | 0.8329 | 0.9248 | 0.4280 | 0.0959 | 0.8592 | 0.9528 | 0.0 | | 0.032 | 374.55 | 4120 | 0.1079 | 0.6634 | 0.7059 | 0.9749 | nan | 0.9933 | 0.9509 | 0.9039 | 0.9880 | 0.9096 | 0.0 | 0.8748 | 0.9330 | 0.6270 | 0.0868 | 0.9237 | 0.9862 | 0.0 | nan | 0.9803 | 0.8952 | 0.8383 | 0.9772 | 0.8513 | 0.0 | 0.8327 | 0.9262 | 0.4210 | 0.0866 | 0.8605 | 0.9547 | 0.0 | | 0.034 | 376.36 | 4140 | 0.1070 | 0.6638 | 0.7059 | 0.9751 | nan | 0.9928 | 0.9526 | 0.9109 | 0.9888 | 0.9033 | 0.0 | 0.8748 | 0.9341 | 0.6112 | 0.0970 | 0.9249 | 0.9857 | 0.0 | nan | 0.9807 | 0.8940 | 0.8421 | 0.9772 | 0.8521 | 0.0 | 0.8302 | 0.9272 | 0.4091 | 0.0968 | 0.8640 | 0.9556 | 0.0 | | 0.0339 | 378.18 | 4160 | 0.1066 | 0.6623 | 0.7055 | 0.9750 | nan | 0.9922 | 0.9494 | 0.9231 | 0.9895 | 0.9009 | 0.0 | 0.8846 | 0.9353 | 0.5772 | 0.1073 | 0.9229 | 0.9891 | 0.0 | nan | 0.9807 | 0.8928 | 0.8430 | 0.9771 | 0.8506 | 0.0 | 0.8315 | 0.9275 | 0.3883 | 0.1072 | 0.8586 | 0.9526 | 0.0 | | 0.0322 | 380.0 | 4180 | 0.1092 | 0.6652 | 0.7077 | 0.9746 | nan | 0.9932 | 0.9505 | 0.9031 | 0.9890 | 0.9082 | 0.0 | 0.8758 | 0.9288 | 0.6226 | 0.1212 | 0.9229 | 0.9845 | 0.0 | nan | 0.9804 | 0.8947 | 0.8393 | 0.9771 | 0.8497 | 0.0 | 0.8322 | 0.9225 | 0.4172 | 0.1209 | 0.8610 | 0.9529 | 0.0 | | 
0.033 | 381.82 | 4200 | 0.1056 | 0.6681 | 0.7110 | 0.9754 | nan | 0.9930 | 0.9523 | 0.9051 | 0.9885 | 0.9079 | 0.0 | 0.8830 | 0.9378 | 0.5947 | 0.1563 | 0.9377 | 0.9873 | 0.0 | nan | 0.9807 | 0.8957 | 0.8419 | 0.9774 | 0.8528 | 0.0 | 0.8356 | 0.9296 | 0.4018 | 0.1557 | 0.8694 | 0.9451 | 0.0 | | 0.0313 | 383.64 | 4220 | 0.1042 | 0.6592 | 0.6998 | 0.9751 | nan | 0.9930 | 0.9511 | 0.9023 | 0.9890 | 0.9061 | 0.0 | 0.8871 | 0.9402 | 0.5171 | 0.0998 | 0.9258 | 0.9853 | 0.0 | nan | 0.9806 | 0.8893 | 0.8350 | 0.9775 | 0.8540 | 0.0 | 0.8371 | 0.9328 | 0.3485 | 0.0996 | 0.8653 | 0.9506 | 0.0 | | 0.0322 | 385.45 | 4240 | 0.1045 | 0.6679 | 0.7128 | 0.9755 | nan | 0.9929 | 0.9532 | 0.9057 | 0.9877 | 0.9116 | 0.0 | 0.8831 | 0.9385 | 0.6182 | 0.1546 | 0.9355 | 0.9859 | 0.0 | nan | 0.9808 | 0.8927 | 0.8370 | 0.9775 | 0.8536 | 0.0 | 0.8352 | 0.9314 | 0.4072 | 0.1538 | 0.8618 | 0.9520 | 0.0 | | 0.0296 | 387.27 | 4260 | 0.1042 | 0.6687 | 0.7111 | 0.9752 | nan | 0.9931 | 0.9536 | 0.9027 | 0.9875 | 0.9090 | 0.0 | 0.8889 | 0.9369 | 0.5808 | 0.1781 | 0.9279 | 0.9853 | 0.0 | nan | 0.9807 | 0.8940 | 0.8369 | 0.9773 | 0.8536 | 0.0 | 0.8327 | 0.9292 | 0.3924 | 0.1774 | 0.8645 | 0.9544 | 0.0 | | 0.0273 | 389.09 | 4280 | 0.1097 | 0.6720 | 0.7135 | 0.9742 | nan | 0.9926 | 0.9519 | 0.9032 | 0.9888 | 0.9077 | 0.0 | 0.8643 | 0.9264 | 0.6261 | 0.2096 | 0.9201 | 0.9849 | 0.0 | nan | 0.9806 | 0.8926 | 0.8316 | 0.9772 | 0.8535 | 0.0 | 0.8291 | 0.9194 | 0.4166 | 0.2093 | 0.8717 | 0.9542 | 0.0 | | 0.034 | 390.91 | 4300 | 0.1072 | 0.6682 | 0.7088 | 0.9744 | nan | 0.9929 | 0.9510 | 0.9133 | 0.9887 | 0.9114 | 0.0 | 0.8780 | 0.9271 | 0.5054 | 0.2158 | 0.9437 | 0.9866 | 0.0 | nan | 0.9807 | 0.8938 | 0.8431 | 0.9770 | 0.8529 | 0.0 | 0.8313 | 0.9201 | 0.3425 | 0.2157 | 0.8787 | 0.9505 | 0.0 | | 0.0312 | 392.73 | 4320 | 0.1088 | 0.6679 | 0.7075 | 0.9741 | nan | 0.9932 | 0.9524 | 0.9077 | 0.9881 | 0.9081 | 0.0 | 0.8804 | 0.9248 | 0.5026 | 0.2282 | 0.9302 | 0.9820 | 0.0 | nan | 0.9806 | 0.8943 | 0.8420 | 0.9774 | 0.8530 | 0.0 | 0.8290 | 0.9178 | 0.3387 | 0.2279 | 0.8662 | 0.9564 | 0.0 | | 0.03 | 394.55 | 4340 | 0.1121 | 0.6646 | 0.7049 | 0.9736 | nan | 0.9929 | 0.9513 | 0.9114 | 0.9890 | 0.9055 | 0.0 | 0.8864 | 0.9199 | 0.4844 | 0.2110 | 0.9247 | 0.9866 | 0.0 | nan | 0.9806 | 0.8916 | 0.8418 | 0.9772 | 0.8538 | 0.0 | 0.8311 | 0.9134 | 0.3262 | 0.2107 | 0.8611 | 0.9526 | 0.0 | | 0.0329 | 396.36 | 4360 | 0.1102 | 0.6615 | 0.7029 | 0.9742 | nan | 0.9931 | 0.9494 | 0.9078 | 0.9885 | 0.9066 | 0.0 | 0.8832 | 0.9294 | 0.5040 | 0.1510 | 0.9398 | 0.9844 | 0.0 | nan | 0.9805 | 0.8906 | 0.8399 | 0.9770 | 0.8550 | 0.0 | 0.8314 | 0.9215 | 0.3376 | 0.1508 | 0.8647 | 0.9508 | 0.0 | | 0.0279 | 398.18 | 4380 | 0.1050 | 0.6897 | 0.7341 | 0.9749 | nan | 0.9927 | 0.9530 | 0.9107 | 0.9883 | 0.9079 | 0.0 | 0.8718 | 0.9275 | 0.6044 | 0.4519 | 0.9537 | 0.9816 | 0.0 | nan | 0.9805 | 0.8992 | 0.8405 | 0.9773 | 0.8529 | 0.0 | 0.8284 | 0.9208 | 0.4046 | 0.4495 | 0.8583 | 0.9543 | 0.0 | | 0.0459 | 400.0 | 4400 | 0.1012 | 0.6861 | 0.7289 | 0.9758 | nan | 0.9933 | 0.9516 | 0.9025 | 0.9885 | 0.9038 | 0.0 | 0.8784 | 0.9368 | 0.6359 | 0.3664 | 0.9354 | 0.9837 | 0.0 | nan | 0.9809 | 0.9010 | 0.8415 | 0.9773 | 0.8514 | 0.0 | 0.8330 | 0.9278 | 0.4256 | 0.3657 | 0.8584 | 0.9567 | 0.0 | | 0.0265 | 401.82 | 4420 | 0.1049 | 0.6814 | 0.7251 | 0.9750 | nan | 0.9930 | 0.9491 | 0.9083 | 0.9897 | 0.9053 | 0.0 | 0.8945 | 0.9292 | 0.6095 | 0.3308 | 0.9311 | 0.9859 | 0.0 | nan | 0.9806 | 0.8996 | 0.8408 | 0.9771 | 0.8506 | 0.0 | 0.8344 | 0.9223 | 0.4087 | 0.3297 | 0.8618 | 0.9522 | 0.0 
| | 0.0257 | 403.64 | 4440 | 0.1071 | 0.6680 | 0.7070 | 0.9744 | nan | 0.9927 | 0.9550 | 0.9028 | 0.9882 | 0.9066 | 0.0 | 0.8828 | 0.9303 | 0.5080 | 0.2263 | 0.9164 | 0.9817 | 0.0 | nan | 0.9807 | 0.8919 | 0.8369 | 0.9773 | 0.8546 | 0.0 | 0.8372 | 0.9223 | 0.3427 | 0.2259 | 0.8560 | 0.9589 | 0.0 | | 0.0316 | 405.45 | 4460 | 0.1070 | 0.6690 | 0.7100 | 0.9743 | nan | 0.9928 | 0.9525 | 0.9038 | 0.9886 | 0.9098 | 0.0 | 0.8779 | 0.9284 | 0.5173 | 0.2422 | 0.9300 | 0.9867 | 0.0 | nan | 0.9807 | 0.8913 | 0.8372 | 0.9772 | 0.8532 | 0.0 | 0.8342 | 0.9209 | 0.3481 | 0.2415 | 0.8597 | 0.9532 | 0.0 | | 0.0288 | 407.27 | 4480 | 0.1034 | 0.6715 | 0.7129 | 0.9750 | nan | 0.9928 | 0.9533 | 0.9072 | 0.9884 | 0.9092 | 0.0 | 0.8758 | 0.9352 | 0.5335 | 0.2468 | 0.9403 | 0.9851 | 0.0 | nan | 0.9807 | 0.8943 | 0.8389 | 0.9773 | 0.8550 | 0.0 | 0.8327 | 0.9272 | 0.3604 | 0.2461 | 0.8625 | 0.9546 | 0.0 | | 0.028 | 409.09 | 4500 | 0.1090 | 0.6737 | 0.7143 | 0.9742 | nan | 0.9931 | 0.9512 | 0.9094 | 0.9890 | 0.9067 | 0.0 | 0.8799 | 0.9237 | 0.5016 | 0.3071 | 0.9384 | 0.9863 | 0.0 | nan | 0.9807 | 0.8942 | 0.8413 | 0.9773 | 0.8530 | 0.0 | 0.8315 | 0.9166 | 0.3396 | 0.3057 | 0.8665 | 0.9522 | 0.0 | | 0.0331 | 410.91 | 4520 | 0.1070 | 0.6759 | 0.7167 | 0.9746 | nan | 0.9931 | 0.9518 | 0.9057 | 0.9886 | 0.9070 | 0.0 | 0.8680 | 0.9285 | 0.5480 | 0.3009 | 0.9402 | 0.9853 | 0.0 | nan | 0.9807 | 0.8956 | 0.8388 | 0.9774 | 0.8545 | 0.0 | 0.8323 | 0.9209 | 0.3685 | 0.2995 | 0.8660 | 0.9524 | 0.0 | | 0.0338 | 412.73 | 4540 | 0.1101 | 0.6645 | 0.7036 | 0.9738 | nan | 0.9934 | 0.9499 | 0.8950 | 0.9895 | 0.9048 | 0.0 | 0.8838 | 0.9255 | 0.4974 | 0.1936 | 0.9292 | 0.9842 | 0.0 | nan | 0.9803 | 0.8908 | 0.8323 | 0.9774 | 0.8556 | 0.0 | 0.8333 | 0.9184 | 0.3370 | 0.1933 | 0.8638 | 0.9560 | 0.0 | | 0.0287 | 414.55 | 4560 | 0.1085 | 0.6693 | 0.7107 | 0.9743 | nan | 0.9931 | 0.9506 | 0.9014 | 0.9885 | 0.9123 | 0.0 | 0.8803 | 0.9289 | 0.5478 | 0.2182 | 0.9334 | 0.9844 | 0.0 | nan | 0.9807 | 0.8943 | 0.8361 | 0.9773 | 0.8542 | 0.0 | 0.8310 | 0.9200 | 0.3689 | 0.2177 | 0.8634 | 0.9576 | 0.0 | | 0.0326 | 416.36 | 4580 | 0.1041 | 0.6632 | 0.7006 | 0.9750 | nan | 0.9932 | 0.9538 | 0.8958 | 0.9886 | 0.9065 | 0.0 | 0.8720 | 0.9393 | 0.4789 | 0.1780 | 0.9192 | 0.9820 | 0.0 | nan | 0.9807 | 0.8888 | 0.8343 | 0.9774 | 0.8552 | 0.0 | 0.8289 | 0.9321 | 0.3229 | 0.1776 | 0.8620 | 0.9614 | 0.0 | | 0.0295 | 418.18 | 4600 | 0.1171 | 0.6685 | 0.7099 | 0.9729 | nan | 0.9927 | 0.9532 | 0.9062 | 0.9890 | 0.9036 | 0.0 | 0.8812 | 0.9111 | 0.5387 | 0.2322 | 0.9354 | 0.9852 | 0.0 | nan | 0.9805 | 0.8927 | 0.8366 | 0.9774 | 0.8529 | 0.0 | 0.8349 | 0.9044 | 0.3600 | 0.2312 | 0.8643 | 0.9554 | 0.0 | | 0.0323 | 420.0 | 4620 | 0.1553 | 0.6577 | 0.6970 | 0.9686 | nan | 0.9931 | 0.9520 | 0.8853 | 0.9888 | 0.9058 | 0.0 | 0.8867 | 0.8720 | 0.4612 | 0.2018 | 0.9276 | 0.9869 | 0.0 | nan | 0.9806 | 0.8834 | 0.8226 | 0.9772 | 0.8533 | 0.0 | 0.8359 | 0.8649 | 0.3116 | 0.2008 | 0.8663 | 0.9530 | 0.0 | | 0.0268 | 421.82 | 4640 | 0.1445 | 0.6624 | 0.7015 | 0.9692 | nan | 0.9932 | 0.9490 | 0.9004 | 0.9889 | 0.9068 | 0.0 | 0.8870 | 0.8732 | 0.4930 | 0.2215 | 0.9254 | 0.9814 | 0.0 | nan | 0.9804 | 0.8901 | 0.8337 | 0.9772 | 0.8530 | 0.0 | 0.8348 | 0.8657 | 0.3319 | 0.2205 | 0.8682 | 0.9559 | 0.0 | | 0.0328 | 423.64 | 4660 | 0.1390 | 0.6602 | 0.7000 | 0.9696 | nan | 0.9934 | 0.9513 | 0.8971 | 0.9884 | 0.9092 | 0.0 | 0.8821 | 0.8762 | 0.5016 | 0.1896 | 0.9247 | 0.9861 | 0.0 | nan | 0.9805 | 0.8896 | 0.8358 | 0.9774 | 0.8538 | 0.0 | 0.8361 | 0.8693 | 0.3366 | 0.1890 | 0.8587 | 
0.9552 | 0.0 | | 0.0288 | 425.45 | 4680 | 0.1314 | 0.6644 | 0.7043 | 0.9705 | nan | 0.9931 | 0.9512 | 0.9069 | 0.9886 | 0.9062 | 0.0 | 0.8749 | 0.8842 | 0.5280 | 0.2110 | 0.9290 | 0.9824 | 0.0 | nan | 0.9805 | 0.8917 | 0.8392 | 0.9776 | 0.8547 | 0.0 | 0.8356 | 0.8774 | 0.3548 | 0.2104 | 0.8617 | 0.9537 | 0.0 | | 0.0438 | 427.27 | 4700 | 0.1298 | 0.6677 | 0.7098 | 0.9709 | nan | 0.9932 | 0.9510 | 0.9074 | 0.9881 | 0.9107 | 0.0 | 0.8794 | 0.8867 | 0.5764 | 0.2139 | 0.9347 | 0.9859 | 0.0 | nan | 0.9807 | 0.8942 | 0.8405 | 0.9774 | 0.8546 | 0.0 | 0.8348 | 0.8792 | 0.3874 | 0.2132 | 0.8620 | 0.9568 | 0.0 | | 0.0309 | 429.09 | 4720 | 0.1460 | 0.6790 | 0.7216 | 0.9699 | nan | 0.9932 | 0.9471 | 0.9055 | 0.9899 | 0.9045 | 0.0 | 0.8758 | 0.8721 | 0.5933 | 0.3667 | 0.9432 | 0.9891 | 0.0 | nan | 0.9807 | 0.8964 | 0.8385 | 0.9771 | 0.8531 | 0.0 | 0.8327 | 0.8652 | 0.3990 | 0.3641 | 0.8680 | 0.9517 | 0.0 | | 0.0254 | 430.91 | 4740 | 0.1192 | 0.6887 | 0.7300 | 0.9726 | nan | 0.9935 | 0.9494 | 0.9066 | 0.9885 | 0.9067 | 0.0 | 0.8602 | 0.8992 | 0.6281 | 0.4333 | 0.9393 | 0.9853 | 0.0 | nan | 0.9805 | 0.9008 | 0.8409 | 0.9774 | 0.8540 | 0.0 | 0.8289 | 0.8919 | 0.4234 | 0.4308 | 0.8684 | 0.9567 | 0.0 | | 0.0291 | 432.73 | 4760 | 0.1170 | 0.6746 | 0.7153 | 0.9726 | nan | 0.9932 | 0.9498 | 0.9028 | 0.9881 | 0.9113 | 0.0 | 0.8715 | 0.9075 | 0.5480 | 0.3106 | 0.9334 | 0.9828 | 0.0 | nan | 0.9805 | 0.8940 | 0.8363 | 0.9772 | 0.8531 | 0.0 | 0.8298 | 0.8994 | 0.3697 | 0.3088 | 0.8629 | 0.9585 | 0.0 | | 0.0323 | 434.55 | 4780 | 0.1113 | 0.6776 | 0.7203 | 0.9736 | nan | 0.9932 | 0.9502 | 0.8997 | 0.9893 | 0.9055 | 0.0 | 0.8741 | 0.9167 | 0.5755 | 0.3296 | 0.9414 | 0.9885 | 0.0 | nan | 0.9808 | 0.8949 | 0.8364 | 0.9772 | 0.8525 | 0.0 | 0.8318 | 0.9091 | 0.3855 | 0.3267 | 0.8628 | 0.9515 | 0.0 | | 0.0339 | 436.36 | 4800 | 0.1122 | 0.6649 | 0.7026 | 0.9732 | nan | 0.9935 | 0.9510 | 0.8929 | 0.9880 | 0.9104 | 0.0 | 0.8759 | 0.9216 | 0.4338 | 0.2570 | 0.9238 | 0.9864 | 0.0 | nan | 0.9806 | 0.8869 | 0.8296 | 0.9772 | 0.8534 | 0.0 | 0.8307 | 0.9143 | 0.2958 | 0.2563 | 0.8651 | 0.9544 | 0.0 | | 0.03 | 438.18 | 4820 | 0.1112 | 0.6697 | 0.7092 | 0.9736 | nan | 0.9934 | 0.9493 | 0.8991 | 0.9886 | 0.9095 | 0.0 | 0.8783 | 0.9223 | 0.4661 | 0.2963 | 0.9284 | 0.9877 | 0.0 | nan | 0.9805 | 0.8900 | 0.8352 | 0.9773 | 0.8528 | 0.0 | 0.8317 | 0.9148 | 0.3153 | 0.2951 | 0.8639 | 0.9499 | 0.0 | | 0.0296 | 440.0 | 4840 | 0.1123 | 0.6666 | 0.7061 | 0.9733 | nan | 0.9930 | 0.9513 | 0.8969 | 0.9887 | 0.9089 | 0.0 | 0.8775 | 0.9201 | 0.4845 | 0.2525 | 0.9185 | 0.9877 | 0.0 | nan | 0.9805 | 0.8887 | 0.8308 | 0.9774 | 0.8544 | 0.0 | 0.8304 | 0.9129 | 0.3265 | 0.2515 | 0.8602 | 0.9526 | 0.0 | | 0.0304 | 441.82 | 4860 | 0.1114 | 0.6675 | 0.7095 | 0.9737 | nan | 0.9928 | 0.9524 | 0.9015 | 0.9876 | 0.9101 | 0.0 | 0.8760 | 0.9244 | 0.5140 | 0.2458 | 0.9308 | 0.9885 | 0.0 | nan | 0.9808 | 0.8888 | 0.8325 | 0.9775 | 0.8535 | 0.0 | 0.8287 | 0.9161 | 0.3432 | 0.2451 | 0.8606 | 0.9510 | 0.0 | | 0.0346 | 443.64 | 4880 | 0.1098 | 0.6680 | 0.7095 | 0.9739 | nan | 0.9929 | 0.9518 | 0.9035 | 0.9888 | 0.9084 | 0.0 | 0.8779 | 0.9251 | 0.4964 | 0.2572 | 0.9327 | 0.9893 | 0.0 | nan | 0.9809 | 0.8892 | 0.8344 | 0.9774 | 0.8546 | 0.0 | 0.8321 | 0.9177 | 0.3338 | 0.2558 | 0.8595 | 0.9487 | 0.0 | | 0.0283 | 445.45 | 4900 | 0.1115 | 0.6668 | 0.7077 | 0.9736 | nan | 0.9931 | 0.9492 | 0.9032 | 0.9887 | 0.9079 | 0.0 | 0.8813 | 0.9237 | 0.4617 | 0.2674 | 0.9348 | 0.9894 | 0.0 | nan | 0.9806 | 0.8885 | 0.8322 | 0.9775 | 0.8544 | 0.0 | 0.8320 | 0.9162 | 0.3135 | 0.2659 | 
0.8614 | 0.9469 | 0.0 | | 0.027 | 447.27 | 4920 | 0.1135 | 0.6658 | 0.7070 | 0.9733 | nan | 0.9928 | 0.9509 | 0.9054 | 0.9890 | 0.9060 | 0.0 | 0.8776 | 0.9189 | 0.4845 | 0.2421 | 0.9343 | 0.9898 | 0.0 | nan | 0.9807 | 0.8895 | 0.8343 | 0.9774 | 0.8545 | 0.0 | 0.8314 | 0.9119 | 0.3274 | 0.2414 | 0.8597 | 0.9479 | 0.0 | | 0.0307 | 449.09 | 4940 | 0.1111 | 0.6686 | 0.7097 | 0.9736 | nan | 0.9928 | 0.9506 | 0.9083 | 0.9892 | 0.9068 | 0.0 | 0.8797 | 0.9209 | 0.4992 | 0.2577 | 0.9331 | 0.9882 | 0.0 | nan | 0.9807 | 0.8908 | 0.8360 | 0.9774 | 0.8558 | 0.0 | 0.8319 | 0.9140 | 0.3362 | 0.2563 | 0.8623 | 0.9498 | 0.0 | | 0.0292 | 450.91 | 4960 | 0.1114 | 0.6700 | 0.7096 | 0.9737 | nan | 0.9933 | 0.9518 | 0.8997 | 0.9879 | 0.9133 | 0.0 | 0.8726 | 0.9220 | 0.4959 | 0.2733 | 0.9295 | 0.9861 | 0.0 | nan | 0.9806 | 0.8910 | 0.8350 | 0.9775 | 0.8556 | 0.0 | 0.8331 | 0.9147 | 0.3344 | 0.2713 | 0.8651 | 0.9521 | 0.0 | | 0.0329 | 452.73 | 4980 | 0.1052 | 0.6722 | 0.7124 | 0.9748 | nan | 0.9933 | 0.9499 | 0.9052 | 0.9885 | 0.9106 | 0.0 | 0.8740 | 0.9336 | 0.5081 | 0.2771 | 0.9352 | 0.9851 | 0.0 | nan | 0.9807 | 0.8936 | 0.8364 | 0.9776 | 0.8558 | 0.0 | 0.8323 | 0.9257 | 0.3428 | 0.2754 | 0.8649 | 0.9537 | 0.0 | | 0.0314 | 454.55 | 5000 | 0.1017 | 0.6675 | 0.7081 | 0.9753 | nan | 0.9930 | 0.9516 | 0.9057 | 0.9890 | 0.9098 | 0.0 | 0.8730 | 0.9399 | 0.4960 | 0.2209 | 0.9395 | 0.9869 | 0.0 | nan | 0.9808 | 0.8910 | 0.8362 | 0.9775 | 0.8566 | 0.0 | 0.8333 | 0.9332 | 0.3341 | 0.2198 | 0.8598 | 0.9547 | 0.0 | | 0.03 | 456.36 | 5020 | 0.1029 | 0.6670 | 0.7079 | 0.9752 | nan | 0.9930 | 0.9523 | 0.9043 | 0.9887 | 0.9100 | 0.0 | 0.8754 | 0.9389 | 0.4895 | 0.2228 | 0.9403 | 0.9881 | 0.0 | nan | 0.9808 | 0.8908 | 0.8353 | 0.9776 | 0.8571 | 0.0 | 0.8323 | 0.9314 | 0.3304 | 0.2216 | 0.8602 | 0.9534 | 0.0 | | 0.0311 | 458.18 | 5040 | 0.1032 | 0.6681 | 0.7082 | 0.9751 | nan | 0.9931 | 0.9528 | 0.9032 | 0.9884 | 0.9078 | 0.0 | 0.8704 | 0.9387 | 0.4860 | 0.2394 | 0.9412 | 0.9857 | 0.0 | nan | 0.9807 | 0.8907 | 0.8345 | 0.9777 | 0.8571 | 0.0 | 0.8315 | 0.9312 | 0.3280 | 0.2381 | 0.8614 | 0.9545 | 0.0 | | 0.0269 | 460.0 | 5060 | 0.1031 | 0.6697 | 0.7100 | 0.9750 | nan | 0.9932 | 0.9534 | 0.9008 | 0.9878 | 0.9114 | 0.0 | 0.8793 | 0.9369 | 0.4996 | 0.2463 | 0.9386 | 0.9824 | 0.0 | nan | 0.9808 | 0.8912 | 0.8345 | 0.9775 | 0.8567 | 0.0 | 0.8335 | 0.9296 | 0.3354 | 0.2453 | 0.8663 | 0.9551 | 0.0 | | 0.0261 | 461.82 | 5080 | 0.1035 | 0.6714 | 0.7120 | 0.9751 | nan | 0.9933 | 0.9516 | 0.9002 | 0.9884 | 0.9087 | 0.0 | 0.8799 | 0.9370 | 0.5140 | 0.2609 | 0.9389 | 0.9828 | 0.0 | nan | 0.9808 | 0.8920 | 0.8339 | 0.9776 | 0.8566 | 0.0 | 0.8338 | 0.9293 | 0.3441 | 0.2596 | 0.8673 | 0.9538 | 0.0 | | 0.0336 | 463.64 | 5100 | 0.1046 | 0.6704 | 0.7097 | 0.9750 | nan | 0.9931 | 0.9523 | 0.9039 | 0.9885 | 0.9098 | 0.0 | 0.8741 | 0.9356 | 0.5061 | 0.2520 | 0.9277 | 0.9832 | 0.0 | nan | 0.9809 | 0.8913 | 0.8355 | 0.9776 | 0.8563 | 0.0 | 0.8327 | 0.9280 | 0.3399 | 0.2508 | 0.8668 | 0.9549 | 0.0 | | 0.029 | 465.45 | 5120 | 0.1046 | 0.6679 | 0.7071 | 0.9749 | nan | 0.9931 | 0.9518 | 0.9026 | 0.9887 | 0.9101 | 0.0 | 0.8793 | 0.9360 | 0.4742 | 0.2432 | 0.9263 | 0.9864 | 0.0 | nan | 0.9807 | 0.8907 | 0.8349 | 0.9775 | 0.8562 | 0.0 | 0.8325 | 0.9285 | 0.3212 | 0.2422 | 0.8673 | 0.9512 | 0.0 | | 0.0287 | 467.27 | 5140 | 0.1040 | 0.6672 | 0.7072 | 0.9749 | nan | 0.9932 | 0.9527 | 0.8982 | 0.9879 | 0.9113 | 0.0 | 0.8828 | 0.9373 | 0.4782 | 0.2333 | 0.9324 | 0.9869 | 0.0 | nan | 0.9807 | 0.8907 | 0.8342 | 0.9776 | 0.8557 | 0.0 | 0.8327 | 0.9290 | 0.3237 | 
0.2325 | 0.8672 | 0.9501 | 0.0 | | 0.0264 | 469.09 | 5160 | 0.1061 | 0.6677 | 0.7077 | 0.9746 | nan | 0.9930 | 0.9516 | 0.8997 | 0.9889 | 0.9099 | 0.0 | 0.8816 | 0.9334 | 0.4851 | 0.2397 | 0.9300 | 0.9874 | 0.0 | nan | 0.9807 | 0.8897 | 0.8333 | 0.9775 | 0.8557 | 0.0 | 0.8319 | 0.9257 | 0.3274 | 0.2384 | 0.8671 | 0.9522 | 0.0 | | 0.0283 | 470.91 | 5180 | 0.1037 | 0.6696 | 0.7101 | 0.9748 | nan | 0.9931 | 0.9510 | 0.8997 | 0.9889 | 0.9093 | 0.0 | 0.8847 | 0.9353 | 0.5076 | 0.2461 | 0.9293 | 0.9864 | 0.0 | nan | 0.9807 | 0.8914 | 0.8334 | 0.9775 | 0.8552 | 0.0 | 0.8320 | 0.9276 | 0.3416 | 0.2450 | 0.8681 | 0.9523 | 0.0 | | 0.0283 | 472.73 | 5200 | 0.1039 | 0.6712 | 0.7123 | 0.9750 | nan | 0.9931 | 0.9521 | 0.9004 | 0.9885 | 0.9105 | 0.0 | 0.8833 | 0.9360 | 0.5190 | 0.2596 | 0.9311 | 0.9867 | 0.0 | nan | 0.9808 | 0.8919 | 0.8343 | 0.9775 | 0.8545 | 0.0 | 0.8328 | 0.9283 | 0.3487 | 0.2585 | 0.8666 | 0.9519 | 0.0 | | 0.0275 | 474.55 | 5220 | 0.1047 | 0.6687 | 0.7082 | 0.9748 | nan | 0.9932 | 0.9520 | 0.9013 | 0.9885 | 0.9093 | 0.0 | 0.8806 | 0.9345 | 0.4863 | 0.2485 | 0.9256 | 0.9871 | 0.0 | nan | 0.9807 | 0.8904 | 0.8340 | 0.9776 | 0.8553 | 0.0 | 0.8325 | 0.9276 | 0.3288 | 0.2475 | 0.8671 | 0.9510 | 0.0 | | 0.0264 | 476.36 | 5240 | 0.1108 | 0.6676 | 0.7079 | 0.9739 | nan | 0.9929 | 0.9516 | 0.9041 | 0.9890 | 0.9080 | 0.0 | 0.8811 | 0.9241 | 0.4843 | 0.2515 | 0.9279 | 0.9881 | 0.0 | nan | 0.9807 | 0.8898 | 0.8337 | 0.9775 | 0.8550 | 0.0 | 0.8310 | 0.9175 | 0.3270 | 0.2504 | 0.8675 | 0.9487 | 0.0 | | 0.0313 | 478.18 | 5260 | 0.1172 | 0.6674 | 0.7089 | 0.9729 | nan | 0.9929 | 0.9506 | 0.9020 | 0.9891 | 0.9097 | 0.0 | 0.8754 | 0.9137 | 0.5080 | 0.2487 | 0.9377 | 0.9881 | 0.0 | nan | 0.9807 | 0.8901 | 0.8330 | 0.9774 | 0.8545 | 0.0 | 0.8310 | 0.9068 | 0.3415 | 0.2476 | 0.8652 | 0.9480 | 0.0 | | 0.0254 | 480.0 | 5280 | 0.1142 | 0.6699 | 0.7121 | 0.9734 | nan | 0.9928 | 0.9508 | 0.9090 | 0.9891 | 0.9092 | 0.0 | 0.8830 | 0.9175 | 0.5182 | 0.2615 | 0.9396 | 0.9866 | 0.0 | nan | 0.9808 | 0.8924 | 0.8371 | 0.9775 | 0.8549 | 0.0 | 0.8311 | 0.9101 | 0.3480 | 0.2601 | 0.8651 | 0.9518 | 0.0 | | 0.0373 | 481.82 | 5300 | 0.1122 | 0.6691 | 0.7095 | 0.9737 | nan | 0.9932 | 0.9513 | 0.9036 | 0.9884 | 0.9107 | 0.0 | 0.8778 | 0.9206 | 0.5073 | 0.2500 | 0.9354 | 0.9847 | 0.0 | nan | 0.9808 | 0.8928 | 0.8370 | 0.9776 | 0.8554 | 0.0 | 0.8302 | 0.9127 | 0.3426 | 0.2488 | 0.8664 | 0.9546 | 0.0 | | 0.0255 | 483.64 | 5320 | 0.1104 | 0.6685 | 0.7087 | 0.9740 | nan | 0.9933 | 0.9515 | 0.9023 | 0.9885 | 0.9098 | 0.0 | 0.8805 | 0.9240 | 0.5053 | 0.2405 | 0.9322 | 0.9854 | 0.0 | nan | 0.9807 | 0.8922 | 0.8368 | 0.9776 | 0.8559 | 0.0 | 0.8310 | 0.9165 | 0.3407 | 0.2394 | 0.8662 | 0.9537 | 0.0 | | 0.0285 | 485.45 | 5340 | 0.1084 | 0.6693 | 0.7108 | 0.9745 | nan | 0.9930 | 0.9509 | 0.9085 | 0.9886 | 0.9106 | 0.0 | 0.8817 | 0.9289 | 0.5108 | 0.2425 | 0.9379 | 0.9866 | 0.0 | nan | 0.9808 | 0.8928 | 0.8378 | 0.9776 | 0.8557 | 0.0 | 0.8312 | 0.9214 | 0.3439 | 0.2415 | 0.8652 | 0.9527 | 0.0 | | 0.0297 | 487.27 | 5360 | 0.1085 | 0.6688 | 0.7097 | 0.9745 | nan | 0.9932 | 0.9517 | 0.9030 | 0.9889 | 0.9086 | 0.0 | 0.8795 | 0.9290 | 0.5124 | 0.2383 | 0.9336 | 0.9883 | 0.0 | nan | 0.9808 | 0.8918 | 0.8368 | 0.9776 | 0.8554 | 0.0 | 0.8305 | 0.9220 | 0.3451 | 0.2371 | 0.8657 | 0.9512 | 0.0 | | 0.0265 | 489.09 | 5380 | 0.1088 | 0.6682 | 0.7087 | 0.9745 | nan | 0.9932 | 0.9520 | 0.9020 | 0.9888 | 0.9081 | 0.0 | 0.8791 | 0.9297 | 0.5135 | 0.2311 | 0.9279 | 0.9881 | 0.0 | nan | 0.9808 | 0.8916 | 0.8362 | 0.9776 | 0.8561 | 0.0 | 0.8304 | 0.9222 | 
0.3459 | 0.2300 | 0.8639 | 0.9519 | 0.0 |
| 0.0301 | 490.91 | 5400 | 0.1079 | 0.6687 | 0.7095 | 0.9746 | nan | 0.9932 | 0.9517 | 0.9030 | 0.9884 | 0.9091 | 0.0 | 0.8773 | 0.9313 | 0.5110 | 0.2369 | 0.9327 | 0.9884 | 0.0 | nan | 0.9807 | 0.8922 | 0.8367 | 0.9777 | 0.8563 | 0.0 | 0.8301 | 0.9232 | 0.3445 | 0.2357 | 0.8655 | 0.9507 | 0.0 |
| 0.028 | 492.73 | 5420 | 0.1072 | 0.6697 | 0.7101 | 0.9747 | nan | 0.9932 | 0.9509 | 0.9040 | 0.9886 | 0.9109 | 0.0 | 0.8777 | 0.9321 | 0.5124 | 0.2444 | 0.9302 | 0.9874 | 0.0 | nan | 0.9808 | 0.8928 | 0.8373 | 0.9776 | 0.8557 | 0.0 | 0.8306 | 0.9239 | 0.3455 | 0.2432 | 0.8671 | 0.9518 | 0.0 |
| 0.0305 | 494.55 | 5440 | 0.1065 | 0.6700 | 0.7107 | 0.9748 | nan | 0.9933 | 0.9498 | 0.9037 | 0.9890 | 0.9093 | 0.0 | 0.8802 | 0.9329 | 0.5086 | 0.2493 | 0.9361 | 0.9869 | 0.0 | nan | 0.9807 | 0.8932 | 0.8377 | 0.9775 | 0.8555 | 0.0 | 0.8314 | 0.9248 | 0.3432 | 0.2481 | 0.8658 | 0.9515 | 0.0 |
| 0.0292 | 496.36 | 5460 | 0.1088 | 0.6695 | 0.7101 | 0.9746 | nan | 0.9931 | 0.9518 | 0.9053 | 0.9888 | 0.9104 | 0.0 | 0.8763 | 0.9303 | 0.5131 | 0.2429 | 0.9332 | 0.9865 | 0.0 | nan | 0.9807 | 0.8925 | 0.8380 | 0.9776 | 0.8556 | 0.0 | 0.8308 | 0.9231 | 0.3451 | 0.2418 | 0.8653 | 0.9532 | 0.0 |
| 0.0281 | 498.18 | 5480 | 0.1070 | 0.6693 | 0.7091 | 0.9747 | nan | 0.9932 | 0.9521 | 0.9031 | 0.9883 | 0.9118 | 0.0 | 0.8750 | 0.9323 | 0.4945 | 0.2482 | 0.9331 | 0.9861 | 0.0 | nan | 0.9807 | 0.8925 | 0.8375 | 0.9776 | 0.8557 | 0.0 | 0.8312 | 0.9245 | 0.3349 | 0.2470 | 0.8662 | 0.9531 | 0.0 |
| 0.0335 | 500.0 | 5500 | 0.1068 | 0.6684 | 0.7082 | 0.9747 | nan | 0.9932 | 0.9513 | 0.9029 | 0.9891 | 0.9082 | 0.0 | 0.8767 | 0.9315 | 0.5011 | 0.2352 | 0.9299 | 0.9872 | 0.0 | nan | 0.9807 | 0.8923 | 0.8376 | 0.9775 | 0.8554 | 0.0 | 0.8309 | 0.9241 | 0.3386 | 0.2342 | 0.8665 | 0.9518 | 0.0 |

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
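For reference, the aggregate columns in the table above are simple reductions of the per-class columns: for example, the final row's Mean Accuracy of 0.7082 and Mean Iou of 0.6684 are the averages of its thirteen non-nan per-class accuracies and IoUs. The snippet below is a minimal sketch, assuming a NumPy pixel confusion matrix, of how such segmentation metrics are typically derived; it is illustrative only, not the exact evaluation code behind this card.

```python
import numpy as np

def segmentation_metrics(conf):
    """Derive the table's metric columns from a pixel confusion matrix.

    conf[i, j] = number of pixels whose true class is i and predicted class is j.
    """
    conf = conf.astype(float)
    tp = np.diag(conf)
    gt_total = conf.sum(axis=1)    # ground-truth pixels per class
    pred_total = conf.sum(axis=0)  # predicted pixels per class
    union = gt_total + pred_total - tp

    # Classes with no ground-truth pixels (or empty unions) come out as nan,
    # mirroring the nan entries in the table above.
    with np.errstate(divide="ignore", invalid="ignore"):
        acc = np.where(gt_total > 0, tp / gt_total, np.nan)  # per-class accuracy
        iou = np.where(union > 0, tp / union, np.nan)        # per-class IoU

    return {
        "overall_accuracy": tp.sum() / conf.sum(),
        "mean_accuracy": np.nanmean(acc),  # nan classes excluded from the mean
        "mean_iou": np.nanmean(iou),
        "per_class_accuracy": acc,
        "per_class_iou": iou,
    }
```

Under this reading, the persistent nan in the unlabeled column would simply mean that class has no ground-truth pixels in the evaluation split, though that is an inference from the numbers rather than something the card states.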
[ "unlabeled", "nat", "concrete", "grass", "speedway bricks", "steel", "rough concrete", "dark bricks", "road", "rough red sidewalk", "tiles", "red bricks", "concrete tiles", "rest" ]
sam1120/safety-utcustom-terrain-b0-512sq
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# safety-utcustom-terrain-b0-512sq

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/safety-utcustom-terrain-jackal-full-391 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0905
- Mean Iou: 0.6547
- Mean Accuracy: 0.6922
- Overall Accuracy: 0.9785
- Accuracy Unlabeled: nan
- Accuracy Nat: 0.9937
- Accuracy Concrete: 0.9505
- Accuracy Grass: 0.8967
- Accuracy Speedway bricks: 0.9886
- Accuracy Steel: 0.9017
- Accuracy Rough concrete: 0.0
- Accuracy Dark bricks: 0.8836
- Accuracy Road: 0.9814
- Accuracy Rough red sidewalk: 0.4644
- Accuracy Tiles: 0.0731
- Accuracy Red bricks: 0.8837
- Accuracy Concrete tiles: 0.9814
- Accuracy Rest: 0.0
- Iou Unlabeled: nan
- Iou Nat: 0.9808
- Iou Concrete: 0.8881
- Iou Grass: 0.8358
- Iou Speedway bricks: 0.9774
- Iou Steel: 0.8498
- Iou Rough concrete: 0.0
- Iou Dark bricks: 0.8385
- Iou Road: 0.9740
- Iou Rough red sidewalk: 0.3068
- Iou Tiles: 0.0731
- Iou Red bricks: 0.8306
- Iou Concrete tiles: 0.9563
- Iou Rest: 0.0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 500

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Nat | Accuracy Concrete | Accuracy Grass | Accuracy Speedway bricks | Accuracy Steel | Accuracy Rough concrete | Accuracy Dark bricks | Accuracy Road | Accuracy Rough red sidewalk | Accuracy Tiles | Accuracy Red bricks | Accuracy Concrete tiles | Accuracy Rest | Iou Unlabeled | Iou Nat | Iou Concrete | Iou Grass | Iou Speedway bricks | Iou Steel | Iou Rough concrete | Iou Dark bricks | Iou Road | Iou Rough red sidewalk | Iou Tiles | Iou Red bricks | Iou Concrete tiles | Iou Rest |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------:|:-----------------:|:--------------:|:------------------------:|:--------------:|:-----------------------:|:--------------------:|:-------------:|:---------------------------:|:--------------:|:-------------------:|:-----------------------:|:-------------:|:-------------:|:-------:|:------------:|:---------:|:-------------------:|:---------:|:------------------:|:---------------:|:--------:|:----------------------:|:---------:|:--------------:|:------------------:|:--------:|
| 2.5605 | 1.82 | 20 | 2.5923 | 0.0418 | 0.1042 | 0.2052 | nan | 0.0696 | 0.0122 | 0.2127 | 0.8829 | 0.0111 | 0.0142 | 0.0130 | 0.0030 | 0.0657 | 0.0145 | 0.0 | 0.0553 | 0.0 | 0.0 | 0.0694 | 0.0116 | 0.0456 | 0.4465 | 0.0028 | 0.0007 | 0.0008 | 0.0030 | 0.0012 | 0.0003 | 0.0 | 0.0026 | 0.0 |
| 2.2612 | 3.64 | 40 | 2.2846 | 0.1228 | 0.1971 | 0.5972 | nan | 0.6488 | 0.2267 | 0.3308 | 0.9638 | 0.0140 | 0.0422 | 0.0129 | 0.1058 | 0.1058 | 0.0 | 0.0 | 0.1114 | 0.0 | 0.0 | 0.6444 | 0.1866 | 0.1285 | 0.6288 | 0.0081 | 0.0032 | 0.0010 | 0.1044 | 0.0036 | 0.0 | 0.0 | 0.0103 | 0.0 | |
1.9031 | 5.45 | 60 | 1.8314 | 0.1813 | 0.2456 | 0.8180 | nan | 0.9262 | 0.5041 | 0.1795 | 0.9734 | 0.0017 | 0.0 | 0.0156 | 0.5168 | 0.0455 | 0.0 | 0.0 | 0.0301 | 0.0 | 0.0 | 0.9160 | 0.2939 | 0.1365 | 0.6863 | 0.0016 | 0.0 | 0.0052 | 0.4787 | 0.0115 | 0.0 | 0.0 | 0.0092 | 0.0 | | 1.5736 | 7.27 | 80 | 1.4966 | 0.2120 | 0.2770 | 0.8702 | nan | 0.9479 | 0.6932 | 0.1739 | 0.9742 | 0.0 | 0.0 | 0.0017 | 0.7967 | 0.0122 | 0.0 | 0.0 | 0.0017 | 0.0 | 0.0 | 0.9379 | 0.3900 | 0.1382 | 0.8110 | 0.0 | 0.0 | 0.0015 | 0.6779 | 0.0096 | 0.0 | 0.0 | 0.0013 | 0.0 | | 1.4408 | 9.09 | 100 | 1.2881 | 0.2191 | 0.2891 | 0.8813 | nan | 0.9447 | 0.6590 | 0.2255 | 0.9794 | 0.0 | 0.0 | 0.0013 | 0.9482 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9369 | 0.4418 | 0.1736 | 0.8288 | 0.0 | 0.0 | 0.0013 | 0.6856 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.1772 | 10.91 | 120 | 1.1057 | 0.2549 | 0.3271 | 0.9074 | nan | 0.9502 | 0.8398 | 0.5467 | 0.9768 | 0.0 | 0.0 | 0.0000 | 0.9391 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9420 | 0.5934 | 0.3957 | 0.8504 | 0.0 | 0.0 | 0.0000 | 0.7875 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.0571 | 12.73 | 140 | 0.9586 | 0.2889 | 0.3418 | 0.9156 | nan | 0.9510 | 0.8589 | 0.7033 | 0.9788 | 0.0 | 0.0 | 0.0 | 0.9520 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9427 | 0.6459 | 0.5132 | 0.8582 | 0.0 | 0.0 | 0.0 | 0.7961 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.9963 | 14.55 | 160 | 0.8513 | 0.2884 | 0.3424 | 0.9136 | nan | 0.9495 | 0.8917 | 0.7291 | 0.9787 | 0.0 | 0.0 | 0.0 | 0.9017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9420 | 0.6236 | 0.5275 | 0.8591 | 0.0 | 0.0 | 0.0 | 0.7965 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.8241 | 16.36 | 180 | 0.7475 | 0.2994 | 0.3535 | 0.9216 | nan | 0.9526 | 0.9040 | 0.8159 | 0.9651 | 0.0 | 0.0 | 0.0 | 0.9574 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9441 | 0.6678 | 0.5813 | 0.8691 | 0.0 | 0.0 | 0.0 | 0.8298 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.7479 | 18.18 | 200 | 0.6309 | 0.2999 | 0.3488 | 0.9228 | nan | 0.9550 | 0.8897 | 0.7557 | 0.9846 | 0.0 | 0.0 | 0.0 | 0.9496 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9456 | 0.6793 | 0.5699 | 0.8567 | 0.0 | 0.0 | 0.0 | 0.8474 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.605 | 20.0 | 220 | 0.5351 | 0.3058 | 0.3554 | 0.9277 | nan | 0.9591 | 0.8978 | 0.8226 | 0.9748 | 0.0 | 0.0 | 0.0 | 0.9665 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9481 | 0.7179 | 0.5936 | 0.8682 | 0.0 | 0.0 | 0.0 | 0.8474 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.5297 | 21.82 | 240 | 0.4703 | 0.3033 | 0.3525 | 0.9250 | nan | 0.9592 | 0.9271 | 0.7842 | 0.9591 | 0.0 | 0.0 | 0.0 | 0.9536 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9493 | 0.6572 | 0.6144 | 0.8750 | 0.0 | 0.0 | 0.0 | 0.8468 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.4684 | 23.64 | 260 | 0.4069 | 0.3091 | 0.3541 | 0.9290 | nan | 0.9631 | 0.9042 | 0.8167 | 0.9881 | 0.0 | 0.0 | 0.0 | 0.9228 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0 | nan | 0.9522 | 0.6840 | 0.6363 | 0.8642 | 0.0 | 0.0 | 0.0 | 0.8735 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0 | | 0.3694 | 25.45 | 280 | 0.3385 | 0.3465 | 0.3830 | 0.9384 | nan | 0.9734 | 0.8940 | 0.7669 | 0.9846 | 0.0217 | 0.0 | 0.0 | 0.9760 | 0.0 | 0.0 | 0.0 | 0.3627 | 0.0 | nan | 0.9569 | 0.7483 | 0.6604 | 0.8802 | 0.0217 | 0.0 | 0.0 | 0.8744 | 0.0 | 0.0 | 0.0 | 0.3627 | 0.0 | | 0.3173 | 27.27 | 300 | 0.3078 | 0.3538 | 0.3922 | 0.9415 | nan | 0.9728 | 0.9165 | 0.8294 | 0.9875 | 0.2440 | 0.0 | 0.0 | 0.9417 | 0.0 | 0.0 | 0.0 | 0.2065 | 0.0 | nan | 0.9596 | 0.7289 | 0.6603 | 0.8906 | 0.2414 | 0.0 | 0.0 | 0.9115 | 0.0 | 0.0 | 0.0 | 0.2065 | 0.0 | | 0.2653 | 29.09 | 320 | 0.2574 | 0.4205 | 0.4560 | 0.9516 | nan | 0.9783 | 0.9122 | 0.8457 | 0.9819 | 0.5124 | 
0.0 | 0.0 | 0.9681 | 0.0 | 0.0 | 0.0 | 0.7299 | 0.0 | nan | 0.9619 | 0.7819 | 0.6859 | 0.9210 | 0.4798 | 0.0 | 0.0 | 0.9111 | 0.0 | 0.0 | 0.0 | 0.7248 | 0.0 | | 0.2918 | 30.91 | 340 | 0.2306 | 0.4342 | 0.4679 | 0.9534 | nan | 0.9799 | 0.9160 | 0.8320 | 0.9820 | 0.5856 | 0.0 | 0.0 | 0.9642 | 0.0028 | 0.0 | 0.0 | 0.8200 | 0.0 | nan | 0.9625 | 0.7914 | 0.6872 | 0.9293 | 0.5456 | 0.0 | 0.0 | 0.9103 | 0.0025 | 0.0 | 0.0 | 0.8159 | 0.0 | | 0.2481 | 32.73 | 360 | 0.2078 | 0.4326 | 0.4626 | 0.9567 | nan | 0.9854 | 0.9113 | 0.8624 | 0.9831 | 0.5510 | 0.0 | 0.0 | 0.9639 | 0.0534 | 0.0 | 0.0 | 0.7037 | 0.0 | nan | 0.9656 | 0.8043 | 0.7168 | 0.9337 | 0.5321 | 0.0 | 0.0 | 0.9251 | 0.0449 | 0.0 | 0.0 | 0.7011 | 0.0 | | 0.2312 | 34.55 | 380 | 0.2061 | 0.4492 | 0.4916 | 0.9526 | nan | 0.9799 | 0.9541 | 0.8706 | 0.9471 | 0.7317 | 0.0 | 0.0960 | 0.9508 | 0.1540 | 0.0 | 0.0 | 0.7071 | 0.0 | nan | 0.9653 | 0.7358 | 0.7194 | 0.9262 | 0.6443 | 0.0 | 0.0913 | 0.9343 | 0.1193 | 0.0 | 0.0 | 0.7038 | 0.0 | | 0.1927 | 36.36 | 400 | 0.1879 | 0.4713 | 0.5096 | 0.9570 | nan | 0.9803 | 0.9387 | 0.8489 | 0.9857 | 0.6957 | 0.0 | 0.0159 | 0.9357 | 0.2860 | 0.0 | 0.0 | 0.9384 | 0.0 | nan | 0.9663 | 0.7742 | 0.7474 | 0.9402 | 0.6482 | 0.0 | 0.0146 | 0.9142 | 0.2025 | 0.0 | 0.0 | 0.9195 | 0.0 | | 0.1704 | 38.18 | 420 | 0.1674 | 0.4773 | 0.5119 | 0.9605 | nan | 0.9881 | 0.9335 | 0.7950 | 0.9703 | 0.7309 | 0.0 | 0.0557 | 0.9701 | 0.2721 | 0.0 | 0.0 | 0.9390 | 0.0 | nan | 0.9675 | 0.8049 | 0.7364 | 0.9476 | 0.6658 | 0.0 | 0.0537 | 0.9229 | 0.1923 | 0.0 | 0.0 | 0.9134 | 0.0 | | 0.1536 | 40.0 | 440 | 0.1518 | 0.4893 | 0.5307 | 0.9639 | nan | 0.9858 | 0.9143 | 0.8835 | 0.9842 | 0.7672 | 0.0 | 0.0150 | 0.9717 | 0.4264 | 0.0 | 0.0 | 0.9514 | 0.0 | nan | 0.9698 | 0.8369 | 0.7571 | 0.9460 | 0.6959 | 0.0 | 0.0145 | 0.9396 | 0.2775 | 0.0 | 0.0 | 0.9230 | 0.0 | | 0.1496 | 41.82 | 460 | 0.1503 | 0.4859 | 0.5224 | 0.9623 | nan | 0.9853 | 0.9400 | 0.8286 | 0.9831 | 0.7661 | 0.0 | 0.0583 | 0.9593 | 0.3768 | 0.0 | 0.0 | 0.8938 | 0.0 | nan | 0.9690 | 0.8098 | 0.7616 | 0.9471 | 0.7028 | 0.0 | 0.0523 | 0.9361 | 0.2600 | 0.0 | 0.0 | 0.8776 | 0.0 | | 0.1374 | 43.64 | 480 | 0.1393 | 0.5053 | 0.5413 | 0.9645 | nan | 0.9883 | 0.9244 | 0.8425 | 0.9857 | 0.7523 | 0.0 | 0.2899 | 0.9611 | 0.3584 | 0.0 | 0.0 | 0.9345 | 0.0 | nan | 0.9708 | 0.8305 | 0.7664 | 0.9480 | 0.6994 | 0.0 | 0.2654 | 0.9358 | 0.2508 | 0.0 | 0.0 | 0.9020 | 0.0 | | 0.124 | 45.45 | 500 | 0.1328 | 0.5158 | 0.5550 | 0.9670 | nan | 0.9881 | 0.9320 | 0.8503 | 0.9809 | 0.7936 | 0.0 | 0.3073 | 0.9809 | 0.4357 | 0.0 | 0.0 | 0.9463 | 0.0 | nan | 0.9711 | 0.8432 | 0.7721 | 0.9541 | 0.7303 | 0.0 | 0.2799 | 0.9504 | 0.2928 | 0.0 | 0.0 | 0.9120 | 0.0 | | 0.1208 | 47.27 | 520 | 0.1289 | 0.5342 | 0.5789 | 0.9659 | nan | 0.9868 | 0.9336 | 0.8527 | 0.9883 | 0.7217 | 0.0 | 0.7360 | 0.9587 | 0.3765 | 0.0 | 0.0 | 0.9715 | 0.0 | nan | 0.9716 | 0.8279 | 0.7602 | 0.9545 | 0.6899 | 0.0 | 0.6173 | 0.9466 | 0.2585 | 0.0 | 0.0 | 0.9182 | 0.0 | | 0.1149 | 49.09 | 540 | 0.1277 | 0.5389 | 0.5930 | 0.9671 | nan | 0.9866 | 0.9491 | 0.8611 | 0.9734 | 0.8384 | 0.0 | 0.7090 | 0.9679 | 0.4303 | 0.0 | 0.0 | 0.9935 | 0.0 | nan | 0.9719 | 0.8246 | 0.7677 | 0.9590 | 0.7486 | 0.0 | 0.5948 | 0.9533 | 0.2950 | 0.0 | 0.0 | 0.8912 | 0.0 | | 0.1311 | 50.91 | 560 | 0.1219 | 0.5443 | 0.5911 | 0.9683 | nan | 0.9902 | 0.9172 | 0.8452 | 0.9916 | 0.7374 | 0.0 | 0.7281 | 0.9649 | 0.5163 | 0.0 | 0.0020 | 0.9913 | 0.0 | nan | 0.9726 | 0.8471 | 0.7756 | 0.9530 | 0.7023 | 0.0 | 0.6140 | 0.9524 | 0.3576 | 0.0 | 0.0020 | 0.8997 | 0.0 | | 0.106 
| 52.73 | 580 | 0.1146 | 0.5863 | 0.6368 | 0.9707 | nan | 0.9891 | 0.9221 | 0.8715 | 0.9858 | 0.8293 | 0.0 | 0.7703 | 0.9760 | 0.6216 | 0.0 | 0.3242 | 0.9879 | 0.0 | nan | 0.9733 | 0.8595 | 0.7837 | 0.9614 | 0.7570 | 0.0 | 0.7053 | 0.9553 | 0.3929 | 0.0 | 0.3223 | 0.9111 | 0.0 | | 0.096 | 54.55 | 600 | 0.1188 | 0.5968 | 0.6547 | 0.9693 | nan | 0.9909 | 0.9320 | 0.8542 | 0.9813 | 0.8542 | 0.0 | 0.7737 | 0.9499 | 0.5814 | 0.0 | 0.6014 | 0.9928 | 0.0 | nan | 0.9737 | 0.8434 | 0.7888 | 0.9626 | 0.7530 | 0.0 | 0.7130 | 0.9389 | 0.3792 | 0.0 | 0.5128 | 0.8933 | 0.0 | | 0.0825 | 56.36 | 620 | 0.1115 | 0.6032 | 0.6578 | 0.9707 | nan | 0.9894 | 0.9354 | 0.8517 | 0.9825 | 0.8532 | 0.0 | 0.6762 | 0.9744 | 0.5866 | 0.0 | 0.7186 | 0.9830 | 0.0 | nan | 0.9734 | 0.8537 | 0.7873 | 0.9634 | 0.7669 | 0.0 | 0.6445 | 0.9534 | 0.3820 | 0.0 | 0.5954 | 0.9220 | 0.0 | | 0.0824 | 58.18 | 640 | 0.1087 | 0.6086 | 0.6798 | 0.9714 | nan | 0.9877 | 0.9269 | 0.8897 | 0.9882 | 0.8186 | 0.0 | 0.8032 | 0.9757 | 0.6110 | 0.0 | 0.8406 | 0.9952 | 0.0 | nan | 0.9741 | 0.8564 | 0.7941 | 0.9628 | 0.7639 | 0.0 | 0.7397 | 0.9598 | 0.3921 | 0.0 | 0.5848 | 0.8840 | 0.0 | | 0.0816 | 60.0 | 660 | 0.1060 | 0.6101 | 0.6772 | 0.9722 | nan | 0.9894 | 0.9402 | 0.8646 | 0.9842 | 0.8415 | 0.0 | 0.7738 | 0.9749 | 0.6312 | 0.0 | 0.8253 | 0.9785 | 0.0 | nan | 0.9745 | 0.8587 | 0.7972 | 0.9650 | 0.7713 | 0.0 | 0.7324 | 0.9650 | 0.4008 | 0.0 | 0.5478 | 0.9188 | 0.0 | | 0.0829 | 61.82 | 680 | 0.1056 | 0.6183 | 0.6715 | 0.9720 | nan | 0.9886 | 0.9449 | 0.8732 | 0.9828 | 0.8584 | 0.0 | 0.7537 | 0.9741 | 0.5886 | 0.0 | 0.7807 | 0.9850 | 0.0 | nan | 0.9743 | 0.8583 | 0.7965 | 0.9657 | 0.7759 | 0.0 | 0.7155 | 0.9597 | 0.3843 | 0.0 | 0.6928 | 0.9148 | 0.0 | | 0.1027 | 63.64 | 700 | 0.1031 | 0.6173 | 0.6786 | 0.9721 | nan | 0.9889 | 0.9300 | 0.8825 | 0.9875 | 0.8370 | 0.0 | 0.7903 | 0.9751 | 0.6066 | 0.0 | 0.8350 | 0.9885 | 0.0 | nan | 0.9745 | 0.8620 | 0.7973 | 0.9650 | 0.7728 | 0.0 | 0.7508 | 0.9571 | 0.3938 | 0.0 | 0.6309 | 0.9203 | 0.0 | | 0.0998 | 65.45 | 720 | 0.1042 | 0.6240 | 0.6727 | 0.9718 | nan | 0.9883 | 0.9422 | 0.8827 | 0.9867 | 0.8207 | 0.0 | 0.7999 | 0.9719 | 0.5322 | 0.0 | 0.8391 | 0.9816 | 0.0 | nan | 0.9744 | 0.8559 | 0.7951 | 0.9655 | 0.7684 | 0.0 | 0.7510 | 0.9592 | 0.3525 | 0.0 | 0.7652 | 0.9242 | 0.0 | | 0.072 | 67.27 | 740 | 0.1045 | 0.6144 | 0.6787 | 0.9722 | nan | 0.9896 | 0.9454 | 0.8734 | 0.9875 | 0.8123 | 0.0 | 0.7965 | 0.9637 | 0.5820 | 0.0 | 0.8853 | 0.9871 | 0.0 | nan | 0.9750 | 0.8559 | 0.8023 | 0.9658 | 0.7643 | 0.0 | 0.7467 | 0.9579 | 0.3895 | 0.0 | 0.6028 | 0.9265 | 0.0 | | 0.0743 | 69.09 | 760 | 0.1043 | 0.6208 | 0.6830 | 0.9713 | nan | 0.9884 | 0.9355 | 0.8582 | 0.9893 | 0.8035 | 0.0 | 0.8218 | 0.9696 | 0.6572 | 0.0 | 0.8619 | 0.9942 | 0.0 | nan | 0.9742 | 0.8610 | 0.7941 | 0.9644 | 0.7632 | 0.0 | 0.7712 | 0.9500 | 0.4047 | 0.0 | 0.6851 | 0.9027 | 0.0 | | 0.0707 | 70.91 | 780 | 0.1029 | 0.6113 | 0.6824 | 0.9718 | nan | 0.9917 | 0.9214 | 0.8504 | 0.9898 | 0.8348 | 0.0 | 0.8253 | 0.9667 | 0.5826 | 0.0 | 0.9146 | 0.9939 | 0.0 | nan | 0.9748 | 0.8554 | 0.7927 | 0.9637 | 0.7769 | 0.0 | 0.7698 | 0.9574 | 0.3879 | 0.0 | 0.5817 | 0.8870 | 0.0 | | 0.0805 | 72.73 | 800 | 0.0973 | 0.6116 | 0.6743 | 0.9730 | nan | 0.9912 | 0.9373 | 0.8660 | 0.9858 | 0.8550 | 0.0 | 0.8333 | 0.9751 | 0.4338 | 0.0 | 0.8999 | 0.9889 | 0.0 | nan | 0.9763 | 0.8603 | 0.7998 | 0.9672 | 0.7886 | 0.0 | 0.7670 | 0.9630 | 0.2926 | 0.0 | 0.6338 | 0.9022 | 0.0 | | 0.0752 | 74.55 | 820 | 0.0966 | 0.6230 | 0.6851 | 0.9728 | nan | 0.9911 | 0.9334 | 0.8708 | 
0.9882 | 0.8492 | 0.0 | 0.8676 | 0.9678 | 0.4659 | 0.0624 | 0.9181 | 0.9914 | 0.0 | nan | 0.9755 | 0.8588 | 0.8023 | 0.9681 | 0.7916 | 0.0 | 0.7904 | 0.9603 | 0.3112 | 0.0606 | 0.6871 | 0.8931 | 0.0 | | 0.0697 | 76.36 | 840 | 0.0983 | 0.6214 | 0.6727 | 0.9728 | nan | 0.9905 | 0.9452 | 0.8647 | 0.9858 | 0.8395 | 0.0 | 0.8154 | 0.9732 | 0.4675 | 0.0 | 0.8782 | 0.9852 | 0.0 | nan | 0.9755 | 0.8600 | 0.7985 | 0.9684 | 0.7829 | 0.0 | 0.7687 | 0.9604 | 0.3162 | 0.0 | 0.7128 | 0.9349 | 0.0 | | 0.0697 | 78.18 | 860 | 0.0995 | 0.6156 | 0.6778 | 0.9722 | nan | 0.9884 | 0.9354 | 0.8923 | 0.9906 | 0.8281 | 0.0 | 0.8573 | 0.9698 | 0.4580 | 0.0 | 0.9006 | 0.9915 | 0.0 | nan | 0.9754 | 0.8556 | 0.7990 | 0.9657 | 0.7767 | 0.0 | 0.7927 | 0.9619 | 0.3007 | 0.0 | 0.6730 | 0.9017 | 0.0 | | 0.0803 | 80.0 | 880 | 0.0992 | 0.6175 | 0.6703 | 0.9730 | nan | 0.9904 | 0.9426 | 0.8748 | 0.9883 | 0.8511 | 0.0 | 0.8218 | 0.9700 | 0.4223 | 0.0 | 0.8890 | 0.9637 | 0.0 | nan | 0.9762 | 0.8564 | 0.8017 | 0.9686 | 0.7954 | 0.0 | 0.7788 | 0.9618 | 0.2867 | 0.0 | 0.6713 | 0.9313 | 0.0 | | 0.0823 | 81.82 | 900 | 0.0973 | 0.6168 | 0.6847 | 0.9731 | nan | 0.9905 | 0.9499 | 0.8543 | 0.9826 | 0.8727 | 0.0 | 0.8550 | 0.9735 | 0.5014 | 0.0 | 0.9352 | 0.9860 | 0.0 | nan | 0.9763 | 0.8580 | 0.7892 | 0.9698 | 0.8021 | 0.0 | 0.7834 | 0.9621 | 0.3350 | 0.0 | 0.6072 | 0.9350 | 0.0 | | 0.0598 | 83.64 | 920 | 0.0954 | 0.6275 | 0.6801 | 0.9733 | nan | 0.9901 | 0.9489 | 0.8719 | 0.9825 | 0.8830 | 0.0 | 0.8481 | 0.9741 | 0.4549 | 0.0 | 0.8990 | 0.9890 | 0.0 | nan | 0.9760 | 0.8617 | 0.7922 | 0.9699 | 0.8058 | 0.0 | 0.7944 | 0.9631 | 0.3129 | 0.0 | 0.7540 | 0.9276 | 0.0 | | 0.0735 | 85.45 | 940 | 0.0999 | 0.6226 | 0.6745 | 0.9723 | nan | 0.9897 | 0.9441 | 0.8749 | 0.9868 | 0.8551 | 0.0 | 0.8233 | 0.9650 | 0.4587 | 0.0 | 0.8929 | 0.9777 | 0.0 | nan | 0.9761 | 0.8553 | 0.7936 | 0.9692 | 0.7972 | 0.0 | 0.7699 | 0.9534 | 0.3065 | 0.0 | 0.7480 | 0.9251 | 0.0 | | 0.0696 | 87.27 | 960 | 0.0999 | 0.6220 | 0.6722 | 0.9726 | nan | 0.9919 | 0.9462 | 0.8167 | 0.9874 | 0.8538 | 0.0 | 0.8171 | 0.9703 | 0.4605 | 0.0 | 0.9078 | 0.9873 | 0.0 | nan | 0.9755 | 0.8559 | 0.7776 | 0.9698 | 0.7974 | 0.0 | 0.7685 | 0.9602 | 0.3078 | 0.0 | 0.7414 | 0.9314 | 0.0 | | 0.0644 | 89.09 | 980 | 0.0931 | 0.6398 | 0.6900 | 0.9750 | nan | 0.9925 | 0.9316 | 0.8678 | 0.9851 | 0.8972 | 0.0 | 0.8281 | 0.9818 | 0.5975 | 0.0 | 0.9047 | 0.9841 | 0.0 | nan | 0.9765 | 0.8748 | 0.8044 | 0.9701 | 0.8101 | 0.0 | 0.7795 | 0.9705 | 0.3899 | 0.0 | 0.7927 | 0.9489 | 0.0 | | 0.0575 | 90.91 | 1000 | 0.0959 | 0.6301 | 0.6854 | 0.9740 | nan | 0.9896 | 0.9436 | 0.8986 | 0.9857 | 0.8906 | 0.0 | 0.8316 | 0.9720 | 0.5076 | 0.0 | 0.9078 | 0.9833 | 0.0 | nan | 0.9765 | 0.8658 | 0.8081 | 0.9701 | 0.8106 | 0.0 | 0.7807 | 0.9613 | 0.3361 | 0.0 | 0.7475 | 0.9349 | 0.0 | | 0.066 | 92.73 | 1020 | 0.0913 | 0.6283 | 0.6837 | 0.9746 | nan | 0.9922 | 0.9347 | 0.8792 | 0.9844 | 0.8871 | 0.0 | 0.8579 | 0.9800 | 0.4638 | 0.0 | 0.9167 | 0.9916 | 0.0 | nan | 0.9768 | 0.8739 | 0.8051 | 0.9706 | 0.8100 | 0.0 | 0.8016 | 0.9651 | 0.3081 | 0.0 | 0.7405 | 0.9164 | 0.0 | | 0.0546 | 94.55 | 1040 | 0.0911 | 0.6307 | 0.6824 | 0.9746 | nan | 0.9916 | 0.9393 | 0.8885 | 0.9854 | 0.8828 | 0.0 | 0.8570 | 0.9742 | 0.4796 | 0.0 | 0.8862 | 0.9864 | 0.0 | nan | 0.9770 | 0.8719 | 0.8072 | 0.9706 | 0.8101 | 0.0 | 0.7964 | 0.9639 | 0.3157 | 0.0 | 0.7487 | 0.9371 | 0.0 | | 0.0716 | 96.36 | 1060 | 0.0937 | 0.6248 | 0.6821 | 0.9742 | nan | 0.9909 | 0.9465 | 0.8896 | 0.9820 | 0.8896 | 0.0 | 0.8510 | 0.9756 | 0.4418 | 0.0 | 0.9084 | 0.9916 | 0.0 
| nan | 0.9770 | 0.8634 | 0.8156 | 0.9701 | 0.8097 | 0.0 | 0.7995 | 0.9667 | 0.2885 | 0.0 | 0.7244 | 0.9069 | 0.0 | | 0.0576 | 98.18 | 1080 | 0.0904 | 0.6289 | 0.6941 | 0.9750 | nan | 0.9910 | 0.9405 | 0.8869 | 0.9852 | 0.8759 | 0.0 | 0.8518 | 0.9794 | 0.6394 | 0.0 | 0.9277 | 0.9452 | 0.0 | nan | 0.9772 | 0.8750 | 0.8138 | 0.9705 | 0.8051 | 0.0 | 0.7947 | 0.9637 | 0.4131 | 0.0 | 0.6436 | 0.9186 | 0.0 | | 0.0606 | 100.0 | 1100 | 0.0928 | 0.6288 | 0.6792 | 0.9742 | nan | 0.9911 | 0.9493 | 0.8852 | 0.9831 | 0.8831 | 0.0 | 0.8521 | 0.9718 | 0.4600 | 0.0 | 0.8720 | 0.9826 | 0.0 | nan | 0.9771 | 0.8625 | 0.8163 | 0.9709 | 0.8052 | 0.0 | 0.8036 | 0.9638 | 0.2981 | 0.0 | 0.7430 | 0.9335 | 0.0 | | 0.0597 | 101.82 | 1120 | 0.0897 | 0.6424 | 0.6936 | 0.9750 | nan | 0.9923 | 0.9405 | 0.8641 | 0.9870 | 0.8760 | 0.0 | 0.8517 | 0.9719 | 0.6602 | 0.0002 | 0.8922 | 0.9801 | 0.0 | nan | 0.9776 | 0.8727 | 0.8084 | 0.9705 | 0.8083 | 0.0 | 0.7879 | 0.9616 | 0.4128 | 0.0002 | 0.8048 | 0.9467 | 0.0 | | 0.0621 | 103.64 | 1140 | 0.0879 | 0.6261 | 0.6895 | 0.9752 | nan | 0.9898 | 0.9435 | 0.9067 | 0.9845 | 0.8822 | 0.0 | 0.8721 | 0.9842 | 0.4954 | 0.0016 | 0.9125 | 0.9913 | 0.0 | nan | 0.9774 | 0.8761 | 0.8140 | 0.9713 | 0.8095 | 0.0 | 0.7999 | 0.9696 | 0.3199 | 0.0016 | 0.6830 | 0.9174 | 0.0 | | 0.0602 | 105.45 | 1160 | 0.0881 | 0.6387 | 0.6951 | 0.9756 | nan | 0.9917 | 0.9458 | 0.8855 | 0.9809 | 0.8912 | 0.0 | 0.8578 | 0.9837 | 0.5855 | 0.0326 | 0.9078 | 0.9742 | 0.0 | nan | 0.9778 | 0.8781 | 0.8201 | 0.9705 | 0.8113 | 0.0 | 0.8106 | 0.9652 | 0.3802 | 0.0326 | 0.7079 | 0.9489 | 0.0 | | 0.0541 | 107.27 | 1180 | 0.0904 | 0.6415 | 0.6833 | 0.9754 | nan | 0.9923 | 0.9486 | 0.8892 | 0.9828 | 0.8759 | 0.0 | 0.8422 | 0.9741 | 0.5535 | 0.0010 | 0.8414 | 0.9824 | 0.0 | nan | 0.9776 | 0.8737 | 0.8163 | 0.9714 | 0.8132 | 0.0 | 0.7996 | 0.9650 | 0.3684 | 0.0010 | 0.8082 | 0.9452 | 0.0 | | 0.0547 | 109.09 | 1200 | 0.0888 | 0.6418 | 0.6943 | 0.9760 | nan | 0.9916 | 0.9432 | 0.8859 | 0.9874 | 0.8600 | 0.0 | 0.8476 | 0.9811 | 0.6323 | 0.0 | 0.9075 | 0.9895 | 0.0 | nan | 0.9778 | 0.8809 | 0.8207 | 0.9715 | 0.8118 | 0.0 | 0.8049 | 0.9664 | 0.4088 | 0.0 | 0.7697 | 0.9307 | 0.0 | | 0.0596 | 110.91 | 1220 | 0.0900 | 0.6405 | 0.6968 | 0.9754 | nan | 0.9913 | 0.9408 | 0.8925 | 0.9856 | 0.8853 | 0.0 | 0.8634 | 0.9756 | 0.6322 | 0.0107 | 0.9013 | 0.9791 | 0.0 | nan | 0.9774 | 0.8754 | 0.8180 | 0.9720 | 0.8216 | 0.0 | 0.8127 | 0.9616 | 0.4064 | 0.0107 | 0.7352 | 0.9358 | 0.0 | | 0.0507 | 112.73 | 1240 | 0.0908 | 0.6305 | 0.6847 | 0.9747 | nan | 0.9915 | 0.9484 | 0.8845 | 0.9830 | 0.8906 | 0.0 | 0.8583 | 0.9739 | 0.4710 | 0.0 | 0.9125 | 0.9869 | 0.0 | nan | 0.9772 | 0.8682 | 0.8151 | 0.9719 | 0.8173 | 0.0 | 0.8147 | 0.9648 | 0.3019 | 0.0 | 0.7352 | 0.9295 | 0.0 | | 0.0455 | 114.55 | 1260 | 0.0938 | 0.6308 | 0.6784 | 0.9744 | nan | 0.9923 | 0.9503 | 0.8855 | 0.9816 | 0.9113 | 0.0 | 0.8149 | 0.9653 | 0.4230 | 0.0 | 0.9059 | 0.9889 | 0.0 | nan | 0.9778 | 0.8637 | 0.8202 | 0.9712 | 0.8135 | 0.0 | 0.7877 | 0.9583 | 0.2796 | 0.0 | 0.7950 | 0.9330 | 0.0 | | 0.048 | 116.36 | 1280 | 0.0936 | 0.6220 | 0.6703 | 0.9747 | nan | 0.9918 | 0.9469 | 0.8865 | 0.9854 | 0.8863 | 0.0 | 0.8440 | 0.9757 | 0.3294 | 0.0 | 0.8793 | 0.9893 | 0.0 | nan | 0.9780 | 0.8647 | 0.8210 | 0.9720 | 0.8156 | 0.0 | 0.8005 | 0.9656 | 0.2192 | 0.0 | 0.7209 | 0.9289 | 0.0 | | 0.0463 | 118.18 | 1300 | 0.0900 | 0.6242 | 0.6772 | 0.9753 | nan | 0.9914 | 0.9517 | 0.8897 | 0.9860 | 0.8762 | 0.0 | 0.8464 | 0.9755 | 0.3979 | 0.0 | 0.9025 | 0.9863 | 0.0 | nan | 0.9784 | 0.8670 | 0.8260 | 
0.9718 | 0.8174 | 0.0 | 0.8059 | 0.9681 | 0.2628 | 0.0 | 0.6776 | 0.9393 | 0.0 | | 0.0465 | 120.0 | 1320 | 0.0930 | 0.6341 | 0.6732 | 0.9750 | nan | 0.9917 | 0.9468 | 0.8817 | 0.9887 | 0.8689 | 0.0 | 0.8452 | 0.9734 | 0.3972 | 0.0 | 0.8773 | 0.9801 | 0.0 | nan | 0.9778 | 0.8669 | 0.8182 | 0.9714 | 0.8194 | 0.0 | 0.8089 | 0.9659 | 0.2646 | 0.0 | 0.8129 | 0.9374 | 0.0 | | 0.0502 | 121.82 | 1340 | 0.0886 | 0.6253 | 0.6780 | 0.9756 | nan | 0.9912 | 0.9450 | 0.8926 | 0.9875 | 0.8638 | 0.0 | 0.8620 | 0.9832 | 0.4147 | 0.0 | 0.8908 | 0.9837 | 0.0 | nan | 0.9780 | 0.8724 | 0.8232 | 0.9719 | 0.8159 | 0.0 | 0.8176 | 0.9708 | 0.2733 | 0.0 | 0.6680 | 0.9373 | 0.0 | | 0.0533 | 123.64 | 1360 | 0.0909 | 0.6364 | 0.6836 | 0.9758 | nan | 0.9911 | 0.9499 | 0.8999 | 0.9869 | 0.8726 | 0.0 | 0.8457 | 0.9777 | 0.4724 | 0.0 | 0.9038 | 0.9871 | 0.0 | nan | 0.9784 | 0.8724 | 0.8248 | 0.9727 | 0.8220 | 0.0 | 0.8066 | 0.9679 | 0.3108 | 0.0 | 0.7791 | 0.9390 | 0.0 | | 0.0508 | 125.45 | 1380 | 0.0871 | 0.6395 | 0.6886 | 0.9757 | nan | 0.9905 | 0.9465 | 0.9123 | 0.9840 | 0.8927 | 0.0 | 0.8514 | 0.9799 | 0.5121 | 0.0053 | 0.8828 | 0.9942 | 0.0 | nan | 0.9782 | 0.8755 | 0.8192 | 0.9728 | 0.8222 | 0.0 | 0.8012 | 0.9685 | 0.3255 | 0.0053 | 0.8334 | 0.9121 | 0.0 | | 0.0468 | 127.27 | 1400 | 0.0887 | 0.6415 | 0.6835 | 0.9760 | nan | 0.9919 | 0.9495 | 0.9074 | 0.9819 | 0.8983 | 0.0 | 0.8487 | 0.9790 | 0.4751 | 0.0 | 0.8791 | 0.9751 | 0.0 | nan | 0.9786 | 0.8770 | 0.8236 | 0.9720 | 0.8216 | 0.0 | 0.8101 | 0.9694 | 0.3103 | 0.0 | 0.8329 | 0.9439 | 0.0 | | 0.056 | 129.09 | 1420 | 0.0924 | 0.6374 | 0.6770 | 0.9754 | nan | 0.9914 | 0.9495 | 0.8930 | 0.9881 | 0.8619 | 0.0 | 0.8171 | 0.9743 | 0.4818 | 0.0 | 0.8736 | 0.9704 | 0.0 | nan | 0.9783 | 0.8673 | 0.8232 | 0.9723 | 0.8198 | 0.0 | 0.7873 | 0.9673 | 0.3109 | 0.0 | 0.8204 | 0.9389 | 0.0 | | 0.0497 | 130.91 | 1440 | 0.0899 | 0.6379 | 0.6754 | 0.9760 | nan | 0.9931 | 0.9436 | 0.8792 | 0.9878 | 0.8857 | 0.0 | 0.8438 | 0.9777 | 0.4134 | 0.0001 | 0.8691 | 0.9862 | 0.0 | nan | 0.9783 | 0.8737 | 0.8241 | 0.9724 | 0.8254 | 0.0 | 0.8048 | 0.9715 | 0.2750 | 0.0001 | 0.8249 | 0.9427 | 0.0 | | 0.0519 | 132.73 | 1460 | 0.0912 | 0.6385 | 0.6810 | 0.9757 | nan | 0.9910 | 0.9447 | 0.9049 | 0.9896 | 0.8752 | 0.0 | 0.8507 | 0.9747 | 0.4465 | 0.0 | 0.8869 | 0.9895 | 0.0 | nan | 0.9787 | 0.8690 | 0.8292 | 0.9720 | 0.8223 | 0.0 | 0.7985 | 0.9669 | 0.2905 | 0.0 | 0.8346 | 0.9393 | 0.0 | | 0.0529 | 134.55 | 1480 | 0.0868 | 0.6421 | 0.6851 | 0.9762 | nan | 0.9926 | 0.9446 | 0.8778 | 0.9869 | 0.8962 | 0.0 | 0.8482 | 0.9789 | 0.5179 | 0.0 | 0.8899 | 0.9740 | 0.0 | nan | 0.9788 | 0.8758 | 0.8223 | 0.9724 | 0.8253 | 0.0 | 0.8075 | 0.9702 | 0.3241 | 0.0 | 0.8268 | 0.9439 | 0.0 | | 0.0443 | 136.36 | 1500 | 0.0892 | 0.6327 | 0.6773 | 0.9758 | nan | 0.9927 | 0.9482 | 0.8801 | 0.9851 | 0.8933 | 0.0 | 0.8489 | 0.9784 | 0.4032 | 0.0012 | 0.8899 | 0.9844 | 0.0 | nan | 0.9783 | 0.8728 | 0.8217 | 0.9729 | 0.8253 | 0.0 | 0.8145 | 0.9688 | 0.2697 | 0.0012 | 0.7518 | 0.9477 | 0.0 | | 0.0467 | 138.18 | 1520 | 0.0875 | 0.6376 | 0.6821 | 0.9758 | nan | 0.9913 | 0.9418 | 0.9080 | 0.9879 | 0.8849 | 0.0 | 0.8577 | 0.9782 | 0.4535 | 0.0 | 0.8816 | 0.9831 | 0.0 | nan | 0.9788 | 0.8715 | 0.8266 | 0.9719 | 0.8240 | 0.0 | 0.8031 | 0.9683 | 0.2954 | 0.0 | 0.8051 | 0.9437 | 0.0 | | 0.0456 | 140.0 | 1540 | 0.0920 | 0.6348 | 0.6757 | 0.9754 | nan | 0.9926 | 0.9431 | 0.8845 | 0.9883 | 0.8819 | 0.0 | 0.8550 | 0.9750 | 0.3713 | 0.0 | 0.9087 | 0.9833 | 0.0 | nan | 0.9786 | 0.8662 | 0.8251 | 0.9725 | 0.8248 | 0.0 | 0.8059 | 0.9663 | 
0.2475 | 0.0 | 0.8224 | 0.9431 | 0.0 | | 0.0464 | 141.82 | 1560 | 0.0928 | 0.6213 | 0.6727 | 0.9749 | nan | 0.9922 | 0.9426 | 0.8930 | 0.9891 | 0.8705 | 0.0 | 0.8534 | 0.9699 | 0.3501 | 0.0003 | 0.9004 | 0.9829 | 0.0 | nan | 0.9788 | 0.8664 | 0.8236 | 0.9718 | 0.8173 | 0.0 | 0.8144 | 0.9632 | 0.2344 | 0.0003 | 0.6645 | 0.9425 | 0.0 | | 0.0515 | 143.64 | 1580 | 0.0870 | 0.6458 | 0.6873 | 0.9765 | nan | 0.9922 | 0.9450 | 0.8966 | 0.9871 | 0.8969 | 0.0 | 0.8351 | 0.9776 | 0.5370 | 0.0 | 0.8887 | 0.9790 | 0.0 | nan | 0.9789 | 0.8775 | 0.8220 | 0.9731 | 0.8302 | 0.0 | 0.8041 | 0.9698 | 0.3500 | 0.0 | 0.8378 | 0.9519 | 0.0 | | 0.0523 | 145.45 | 1600 | 0.0863 | 0.6476 | 0.6915 | 0.9767 | nan | 0.9931 | 0.9472 | 0.8754 | 0.9882 | 0.8823 | 0.0 | 0.8597 | 0.9766 | 0.5686 | 0.0 | 0.9093 | 0.9887 | 0.0 | nan | 0.9789 | 0.8781 | 0.8255 | 0.9729 | 0.8275 | 0.0 | 0.8152 | 0.9697 | 0.3796 | 0.0 | 0.8277 | 0.9432 | 0.0 | | 0.052 | 147.27 | 1620 | 0.0873 | 0.6420 | 0.6825 | 0.9764 | nan | 0.9922 | 0.9500 | 0.8870 | 0.9873 | 0.8899 | 0.0 | 0.8587 | 0.9777 | 0.4852 | 0.0 | 0.8690 | 0.9760 | 0.0 | nan | 0.9792 | 0.8712 | 0.8313 | 0.9731 | 0.8311 | 0.0 | 0.8184 | 0.9693 | 0.3199 | 0.0 | 0.8105 | 0.9414 | 0.0 | | 0.0473 | 149.09 | 1640 | 0.0875 | 0.6421 | 0.6892 | 0.9764 | nan | 0.9922 | 0.9482 | 0.9111 | 0.9830 | 0.9171 | 0.0 | 0.8507 | 0.9768 | 0.4431 | 0.0 | 0.9482 | 0.9885 | 0.0 | nan | 0.9792 | 0.8750 | 0.8314 | 0.9729 | 0.8297 | 0.0 | 0.8091 | 0.9695 | 0.2972 | 0.0 | 0.8383 | 0.9450 | 0.0 | | 0.0445 | 150.91 | 1660 | 0.0842 | 0.6527 | 0.6974 | 0.9774 | nan | 0.9928 | 0.9471 | 0.8885 | 0.9834 | 0.9026 | 0.0 | 0.8568 | 0.9864 | 0.6235 | 0.0 | 0.8997 | 0.9859 | 0.0 | nan | 0.9788 | 0.8870 | 0.8257 | 0.9733 | 0.8326 | 0.0 | 0.8127 | 0.9753 | 0.4015 | 0.0 | 0.8476 | 0.9501 | 0.0 | | 0.0478 | 152.73 | 1680 | 0.0865 | 0.6443 | 0.6865 | 0.9765 | nan | 0.9930 | 0.9384 | 0.8960 | 0.9844 | 0.9068 | 0.0 | 0.8708 | 0.9842 | 0.4478 | 0.0 | 0.9144 | 0.9881 | 0.0 | nan | 0.9789 | 0.8757 | 0.8300 | 0.9734 | 0.8344 | 0.0 | 0.8121 | 0.9743 | 0.2946 | 0.0 | 0.8543 | 0.9486 | 0.0 | | 0.0446 | 154.55 | 1700 | 0.0870 | 0.6366 | 0.6838 | 0.9762 | nan | 0.9926 | 0.9455 | 0.9027 | 0.9864 | 0.8989 | 0.0 | 0.8544 | 0.9744 | 0.4161 | 0.0038 | 0.9286 | 0.9860 | 0.0 | nan | 0.9794 | 0.8722 | 0.8321 | 0.9733 | 0.8241 | 0.0 | 0.8089 | 0.9675 | 0.2782 | 0.0038 | 0.7873 | 0.9485 | 0.0 | | 0.0408 | 156.36 | 1720 | 0.0872 | 0.6391 | 0.6887 | 0.9769 | nan | 0.9929 | 0.9444 | 0.8817 | 0.9868 | 0.8889 | 0.0 | 0.8612 | 0.9841 | 0.5046 | 0.0020 | 0.9160 | 0.9900 | 0.0 | nan | 0.9789 | 0.8807 | 0.8256 | 0.9738 | 0.8312 | 0.0 | 0.8095 | 0.9733 | 0.3279 | 0.0020 | 0.7604 | 0.9456 | 0.0 | | 0.0456 | 158.18 | 1740 | 0.0855 | 0.6520 | 0.6956 | 0.9774 | nan | 0.9929 | 0.9444 | 0.8943 | 0.9878 | 0.8884 | 0.0 | 0.8676 | 0.9796 | 0.6019 | 0.0005 | 0.9043 | 0.9808 | 0.0 | nan | 0.9795 | 0.8837 | 0.8329 | 0.9733 | 0.8323 | 0.0 | 0.8160 | 0.9717 | 0.3833 | 0.0005 | 0.8495 | 0.9534 | 0.0 | | 0.0476 | 160.0 | 1760 | 0.0868 | 0.6477 | 0.6920 | 0.9766 | nan | 0.9917 | 0.9492 | 0.9086 | 0.9843 | 0.9041 | 0.0 | 0.8346 | 0.9772 | 0.5563 | 0.0008 | 0.9078 | 0.9816 | 0.0 | nan | 0.9795 | 0.8796 | 0.8315 | 0.9728 | 0.8183 | 0.0 | 0.7906 | 0.9661 | 0.3688 | 0.0008 | 0.8619 | 0.9496 | 0.0 | | 0.0566 | 161.82 | 1780 | 0.0887 | 0.6417 | 0.6789 | 0.9761 | nan | 0.9934 | 0.9505 | 0.8682 | 0.9849 | 0.8958 | 0.0 | 0.8442 | 0.9778 | 0.4560 | 0.0006 | 0.8697 | 0.9849 | 0.0 | nan | 0.9785 | 0.8737 | 0.8161 | 0.9738 | 0.8317 | 0.0 | 0.8078 | 0.9707 | 0.3040 | 0.0006 | 0.8365 | 
0.9482 | 0.0 | | 0.0401 | 163.64 | 1800 | 0.0858 | 0.6481 | 0.6910 | 0.9767 | nan | 0.9939 | 0.9386 | 0.8660 | 0.9871 | 0.9003 | 0.0 | 0.8600 | 0.9816 | 0.5701 | 0.0029 | 0.8975 | 0.9849 | 0.0 | nan | 0.9786 | 0.8804 | 0.8180 | 0.9736 | 0.8312 | 0.0 | 0.8110 | 0.9707 | 0.3689 | 0.0029 | 0.8452 | 0.9450 | 0.0 | | 0.0444 | 165.45 | 1820 | 0.0860 | 0.6523 | 0.6953 | 0.9773 | nan | 0.9927 | 0.9457 | 0.8903 | 0.9881 | 0.8894 | 0.0 | 0.8563 | 0.9806 | 0.5883 | 0.0 | 0.9254 | 0.9817 | 0.0 | nan | 0.9791 | 0.8831 | 0.8284 | 0.9736 | 0.8308 | 0.0 | 0.8148 | 0.9727 | 0.3862 | 0.0 | 0.8561 | 0.9544 | 0.0 | | 0.046 | 167.27 | 1840 | 0.0843 | 0.6520 | 0.6935 | 0.9776 | nan | 0.9927 | 0.9457 | 0.8892 | 0.9872 | 0.8920 | 0.0 | 0.8584 | 0.9864 | 0.5872 | 0.0004 | 0.9052 | 0.9714 | 0.0 | nan | 0.9792 | 0.8877 | 0.8292 | 0.9740 | 0.8305 | 0.0 | 0.8158 | 0.9738 | 0.3900 | 0.0004 | 0.8492 | 0.9455 | 0.0 | | 0.0447 | 169.09 | 1860 | 0.0845 | 0.6496 | 0.6915 | 0.9773 | nan | 0.9934 | 0.9468 | 0.8948 | 0.9842 | 0.9045 | 0.0 | 0.8463 | 0.9818 | 0.5438 | 0.0 | 0.9126 | 0.9818 | 0.0 | nan | 0.9794 | 0.8843 | 0.8311 | 0.9737 | 0.8292 | 0.0 | 0.8147 | 0.9719 | 0.3579 | 0.0 | 0.8540 | 0.9489 | 0.0 | | 0.0448 | 170.91 | 1880 | 0.0850 | 0.6516 | 0.6959 | 0.9772 | nan | 0.9925 | 0.9460 | 0.9000 | 0.9882 | 0.8887 | 0.0 | 0.8586 | 0.9766 | 0.6172 | 0.0005 | 0.9082 | 0.9699 | 0.0 | nan | 0.9793 | 0.8827 | 0.8312 | 0.9737 | 0.8319 | 0.0 | 0.8139 | 0.9692 | 0.4030 | 0.0005 | 0.8391 | 0.9463 | 0.0 | | 0.0432 | 172.73 | 1900 | 0.0846 | 0.6461 | 0.6916 | 0.9770 | nan | 0.9919 | 0.9509 | 0.9007 | 0.9859 | 0.9084 | 0.0 | 0.8633 | 0.9783 | 0.5265 | 0.0 | 0.8888 | 0.9957 | 0.0 | nan | 0.9794 | 0.8784 | 0.8339 | 0.9742 | 0.8369 | 0.0 | 0.8174 | 0.9719 | 0.3383 | 0.0 | 0.8343 | 0.9343 | 0.0 | | 0.0403 | 174.55 | 1920 | 0.0891 | 0.6386 | 0.6791 | 0.9764 | nan | 0.9928 | 0.9528 | 0.8935 | 0.9832 | 0.9037 | 0.0 | 0.8613 | 0.9795 | 0.3732 | 0.0004 | 0.9052 | 0.9832 | 0.0 | nan | 0.9794 | 0.8730 | 0.8329 | 0.9736 | 0.8364 | 0.0 | 0.8171 | 0.9690 | 0.2479 | 0.0004 | 0.8240 | 0.9487 | 0.0 | | 0.0405 | 176.36 | 1940 | 0.0860 | 0.6393 | 0.6853 | 0.9769 | nan | 0.9935 | 0.9449 | 0.8865 | 0.9864 | 0.8847 | 0.0 | 0.8523 | 0.9799 | 0.5182 | 0.0010 | 0.8945 | 0.9672 | 0.0 | nan | 0.9793 | 0.8807 | 0.8300 | 0.9738 | 0.8268 | 0.0 | 0.8157 | 0.9690 | 0.3349 | 0.0010 | 0.7527 | 0.9474 | 0.0 | | 0.0402 | 178.18 | 1960 | 0.0847 | 0.6502 | 0.6921 | 0.9769 | nan | 0.9926 | 0.9465 | 0.8924 | 0.9892 | 0.8834 | 0.0 | 0.8686 | 0.9770 | 0.4591 | 0.0827 | 0.9181 | 0.9871 | 0.0 | nan | 0.9797 | 0.8758 | 0.8332 | 0.9737 | 0.8297 | 0.0 | 0.8206 | 0.9700 | 0.3011 | 0.0827 | 0.8403 | 0.9461 | 0.0 | | 0.0436 | 180.0 | 1980 | 0.0815 | 0.6690 | 0.7161 | 0.9774 | nan | 0.9918 | 0.9493 | 0.8977 | 0.9876 | 0.8862 | 0.0 | 0.8811 | 0.9812 | 0.4751 | 0.3711 | 0.9036 | 0.9850 | 0.0 | nan | 0.9798 | 0.8820 | 0.8311 | 0.9742 | 0.8307 | 0.0 | 0.8124 | 0.9700 | 0.3094 | 0.3706 | 0.7918 | 0.9445 | 0.0 | | 0.0421 | 181.82 | 2000 | 0.0854 | 0.6503 | 0.6950 | 0.9770 | nan | 0.9935 | 0.9442 | 0.8793 | 0.9873 | 0.9035 | 0.0 | 0.8674 | 0.9774 | 0.5263 | 0.0594 | 0.9222 | 0.9750 | 0.0 | nan | 0.9792 | 0.8809 | 0.8249 | 0.9746 | 0.8335 | 0.0 | 0.8129 | 0.9669 | 0.3519 | 0.0594 | 0.8241 | 0.9457 | 0.0 | | 0.0371 | 183.64 | 2020 | 0.0849 | 0.6483 | 0.6947 | 0.9773 | nan | 0.9928 | 0.9426 | 0.8938 | 0.9890 | 0.8770 | 0.0 | 0.8678 | 0.9821 | 0.5535 | 0.0308 | 0.9119 | 0.9892 | 0.0 | nan | 0.9798 | 0.8829 | 0.8330 | 0.9739 | 0.8290 | 0.0 | 0.8214 | 0.9704 | 0.3514 | 0.0308 | 0.8132 | 0.9425 | 
0.0 | | 0.0443 | 185.45 | 2040 | 0.0843 | 0.6457 | 0.6853 | 0.9771 | nan | 0.9935 | 0.9458 | 0.8817 | 0.9882 | 0.8838 | 0.0 | 0.8539 | 0.9804 | 0.5030 | 0.0012 | 0.8981 | 0.9789 | 0.0 | nan | 0.9796 | 0.8809 | 0.8281 | 0.9742 | 0.8324 | 0.0 | 0.8179 | 0.9702 | 0.3281 | 0.0012 | 0.8325 | 0.9485 | 0.0 | | 0.0385 | 187.27 | 2060 | 0.0845 | 0.6541 | 0.6969 | 0.9774 | nan | 0.9941 | 0.9462 | 0.8743 | 0.9878 | 0.8929 | 0.0 | 0.8566 | 0.9762 | 0.6349 | 0.0032 | 0.9201 | 0.9729 | 0.0 | nan | 0.9790 | 0.8862 | 0.8223 | 0.9741 | 0.8351 | 0.0 | 0.8207 | 0.9712 | 0.4177 | 0.0032 | 0.8426 | 0.9514 | 0.0 | | 0.0399 | 189.09 | 2080 | 0.0837 | 0.6527 | 0.6989 | 0.9776 | nan | 0.9919 | 0.9501 | 0.9122 | 0.9855 | 0.9098 | 0.0 | 0.8795 | 0.9811 | 0.5548 | 0.0128 | 0.9242 | 0.9840 | 0.0 | nan | 0.9798 | 0.8845 | 0.8356 | 0.9743 | 0.8367 | 0.0 | 0.8283 | 0.9719 | 0.3656 | 0.0128 | 0.8463 | 0.9493 | 0.0 | | 0.0431 | 190.91 | 2100 | 0.0839 | 0.6531 | 0.6996 | 0.9774 | nan | 0.9917 | 0.9453 | 0.9157 | 0.9890 | 0.8875 | 0.0 | 0.8777 | 0.9777 | 0.6127 | 0.0171 | 0.9132 | 0.9678 | 0.0 | nan | 0.9798 | 0.8899 | 0.8325 | 0.9738 | 0.8315 | 0.0 | 0.8317 | 0.9691 | 0.3959 | 0.0171 | 0.8222 | 0.9465 | 0.0 | | 0.0391 | 192.73 | 2120 | 0.0858 | 0.6470 | 0.6854 | 0.9775 | nan | 0.9940 | 0.9423 | 0.8936 | 0.9881 | 0.8904 | 0.0 | 0.8589 | 0.9811 | 0.4732 | 0.0024 | 0.9091 | 0.9774 | 0.0 | nan | 0.9798 | 0.8832 | 0.8332 | 0.9747 | 0.8376 | 0.0 | 0.8227 | 0.9724 | 0.3105 | 0.0024 | 0.8475 | 0.9467 | 0.0 | | 0.0456 | 194.55 | 2140 | 0.0837 | 0.6486 | 0.6902 | 0.9777 | nan | 0.9931 | 0.9509 | 0.8929 | 0.9847 | 0.9039 | 0.0 | 0.8655 | 0.9842 | 0.5249 | 0.0019 | 0.8949 | 0.9757 | 0.0 | nan | 0.9797 | 0.8867 | 0.8311 | 0.9748 | 0.8352 | 0.0 | 0.8232 | 0.9738 | 0.3440 | 0.0019 | 0.8342 | 0.9471 | 0.0 | | 0.0402 | 196.36 | 2160 | 0.0845 | 0.6471 | 0.6883 | 0.9777 | nan | 0.9929 | 0.9508 | 0.8902 | 0.9869 | 0.8998 | 0.0 | 0.8581 | 0.9838 | 0.5185 | 0.0009 | 0.8828 | 0.9831 | 0.0 | nan | 0.9803 | 0.8831 | 0.8354 | 0.9742 | 0.8368 | 0.0 | 0.8180 | 0.9740 | 0.3370 | 0.0009 | 0.8272 | 0.9450 | 0.0 | | 0.0401 | 198.18 | 2180 | 0.0845 | 0.6442 | 0.6850 | 0.9772 | nan | 0.9915 | 0.9541 | 0.9191 | 0.9855 | 0.8951 | 0.0 | 0.8530 | 0.9812 | 0.4627 | 0.0003 | 0.8910 | 0.9714 | 0.0 | nan | 0.9798 | 0.8795 | 0.8322 | 0.9747 | 0.8343 | 0.0 | 0.8161 | 0.9739 | 0.3073 | 0.0003 | 0.8313 | 0.9449 | 0.0 | | 0.0412 | 200.0 | 2200 | 0.0851 | 0.6480 | 0.6892 | 0.9776 | nan | 0.9938 | 0.9482 | 0.8900 | 0.9864 | 0.8914 | 0.0 | 0.8598 | 0.9812 | 0.5108 | 0.0054 | 0.9217 | 0.9715 | 0.0 | nan | 0.9794 | 0.8861 | 0.8301 | 0.9745 | 0.8346 | 0.0 | 0.8271 | 0.9737 | 0.3424 | 0.0054 | 0.8244 | 0.9469 | 0.0 | | 0.043 | 201.82 | 2220 | 0.0851 | 0.6426 | 0.6807 | 0.9773 | nan | 0.9925 | 0.9475 | 0.9046 | 0.9870 | 0.8995 | 0.0 | 0.8505 | 0.9851 | 0.3807 | 0.0039 | 0.9116 | 0.9861 | 0.0 | nan | 0.9800 | 0.8786 | 0.8335 | 0.9748 | 0.8384 | 0.0 | 0.8218 | 0.9753 | 0.2520 | 0.0039 | 0.8551 | 0.9409 | 0.0 | | 0.0371 | 203.64 | 2240 | 0.0859 | 0.6406 | 0.6836 | 0.9774 | nan | 0.9929 | 0.9508 | 0.9022 | 0.9846 | 0.9086 | 0.0 | 0.8696 | 0.9831 | 0.3943 | 0.0071 | 0.9134 | 0.9806 | 0.0 | nan | 0.9801 | 0.8788 | 0.8377 | 0.9746 | 0.8404 | 0.0 | 0.8231 | 0.9748 | 0.2606 | 0.0071 | 0.8088 | 0.9425 | 0.0 | | 0.0355 | 205.45 | 2260 | 0.0833 | 0.6460 | 0.6877 | 0.9777 | nan | 0.9931 | 0.9481 | 0.8961 | 0.9859 | 0.9084 | 0.0 | 0.8678 | 0.9856 | 0.4645 | 0.0090 | 0.8975 | 0.9840 | 0.0 | nan | 0.9801 | 0.8850 | 0.8338 | 0.9748 | 0.8389 | 0.0 | 0.8238 | 0.9745 | 0.3060 | 0.0090 | 0.8263 | 
0.9454 | 0.0 | | 0.0382 | 207.27 | 2280 | 0.0839 | 0.6471 | 0.6885 | 0.9777 | nan | 0.9936 | 0.9437 | 0.8971 | 0.9866 | 0.8981 | 0.0 | 0.8724 | 0.9851 | 0.4615 | 0.0353 | 0.9068 | 0.9699 | 0.0 | nan | 0.9798 | 0.8859 | 0.8297 | 0.9750 | 0.8406 | 0.0 | 0.8149 | 0.9750 | 0.3041 | 0.0353 | 0.8238 | 0.9481 | 0.0 | | 0.0383 | 209.09 | 2300 | 0.0860 | 0.6433 | 0.6849 | 0.9771 | nan | 0.9921 | 0.9491 | 0.9125 | 0.9862 | 0.9107 | 0.0 | 0.8769 | 0.9794 | 0.4066 | 0.0111 | 0.9015 | 0.9776 | 0.0 | nan | 0.9796 | 0.8809 | 0.8274 | 0.9750 | 0.8402 | 0.0 | 0.8304 | 0.9723 | 0.2693 | 0.0111 | 0.8281 | 0.9489 | 0.0 | | 0.0369 | 210.91 | 2320 | 0.0858 | 0.6451 | 0.6854 | 0.9774 | nan | 0.9938 | 0.9469 | 0.8873 | 0.9875 | 0.8980 | 0.0 | 0.8681 | 0.9792 | 0.4341 | 0.0301 | 0.8997 | 0.9849 | 0.0 | nan | 0.9799 | 0.8820 | 0.8302 | 0.9751 | 0.8392 | 0.0 | 0.8270 | 0.9709 | 0.2871 | 0.0300 | 0.8170 | 0.9482 | 0.0 | | 0.0365 | 212.73 | 2340 | 0.0870 | 0.6424 | 0.6818 | 0.9774 | nan | 0.9932 | 0.9465 | 0.8934 | 0.9882 | 0.8889 | 0.0 | 0.8825 | 0.9840 | 0.4031 | 0.0084 | 0.8975 | 0.9776 | 0.0 | nan | 0.9802 | 0.8806 | 0.8351 | 0.9750 | 0.8359 | 0.0 | 0.8287 | 0.9722 | 0.2679 | 0.0084 | 0.8182 | 0.9497 | 0.0 | | 0.0388 | 214.55 | 2360 | 0.0846 | 0.6463 | 0.6844 | 0.9777 | nan | 0.9933 | 0.9474 | 0.8950 | 0.9884 | 0.8972 | 0.0 | 0.8586 | 0.9814 | 0.4519 | 0.0087 | 0.8919 | 0.9837 | 0.0 | nan | 0.9804 | 0.8816 | 0.8358 | 0.9750 | 0.8385 | 0.0 | 0.8168 | 0.9725 | 0.2994 | 0.0087 | 0.8421 | 0.9517 | 0.0 | | 0.0352 | 216.36 | 2380 | 0.0857 | 0.6476 | 0.6861 | 0.9776 | nan | 0.9937 | 0.9468 | 0.8907 | 0.9869 | 0.9014 | 0.0 | 0.8626 | 0.9822 | 0.4450 | 0.0199 | 0.9125 | 0.9782 | 0.0 | nan | 0.9798 | 0.8848 | 0.8314 | 0.9750 | 0.8406 | 0.0 | 0.8210 | 0.9735 | 0.2926 | 0.0199 | 0.8490 | 0.9517 | 0.0 | | 0.0382 | 218.18 | 2400 | 0.0857 | 0.6466 | 0.6898 | 0.9774 | nan | 0.9924 | 0.9508 | 0.8959 | 0.9843 | 0.9031 | 0.0 | 0.8752 | 0.9884 | 0.4047 | 0.0570 | 0.9235 | 0.9926 | 0.0 | nan | 0.9800 | 0.8822 | 0.8340 | 0.9745 | 0.8345 | 0.0 | 0.8227 | 0.9741 | 0.2655 | 0.0570 | 0.8529 | 0.9289 | 0.0 | | 0.0378 | 220.0 | 2420 | 0.0898 | 0.6441 | 0.6845 | 0.9767 | nan | 0.9916 | 0.9460 | 0.9194 | 0.9881 | 0.9021 | 0.0 | 0.8785 | 0.9782 | 0.3604 | 0.0384 | 0.9238 | 0.9718 | 0.0 | nan | 0.9798 | 0.8752 | 0.8329 | 0.9748 | 0.8383 | 0.0 | 0.8230 | 0.9694 | 0.2401 | 0.0384 | 0.8566 | 0.9444 | 0.0 | | 0.0383 | 221.82 | 2440 | 0.0864 | 0.6444 | 0.6799 | 0.9773 | nan | 0.9942 | 0.9487 | 0.8780 | 0.9863 | 0.9040 | 0.0 | 0.8575 | 0.9817 | 0.4188 | 0.0098 | 0.8942 | 0.9651 | 0.0 | nan | 0.9796 | 0.8815 | 0.8291 | 0.9753 | 0.8390 | 0.0 | 0.8213 | 0.9732 | 0.2791 | 0.0098 | 0.8409 | 0.9483 | 0.0 | | 0.0339 | 223.64 | 2460 | 0.0903 | 0.6451 | 0.6845 | 0.9771 | nan | 0.9932 | 0.9525 | 0.8919 | 0.9859 | 0.8993 | 0.0 | 0.8594 | 0.9775 | 0.4242 | 0.0173 | 0.9150 | 0.9822 | 0.0 | nan | 0.9799 | 0.8781 | 0.8304 | 0.9752 | 0.8406 | 0.0 | 0.8191 | 0.9687 | 0.2815 | 0.0173 | 0.8416 | 0.9533 | 0.0 | | 0.0369 | 225.45 | 2480 | 0.0885 | 0.6394 | 0.6782 | 0.9771 | nan | 0.9930 | 0.9472 | 0.9029 | 0.9881 | 0.8882 | 0.0 | 0.8679 | 0.9787 | 0.3940 | 0.0029 | 0.8674 | 0.9865 | 0.0 | nan | 0.9800 | 0.8797 | 0.8314 | 0.9754 | 0.8349 | 0.0 | 0.8237 | 0.9698 | 0.2634 | 0.0029 | 0.8046 | 0.9465 | 0.0 | | 0.0362 | 227.27 | 2500 | 0.0907 | 0.6431 | 0.6818 | 0.9771 | nan | 0.9938 | 0.9467 | 0.8888 | 0.9883 | 0.8969 | 0.0 | 0.8804 | 0.9772 | 0.3631 | 0.0375 | 0.9137 | 0.9774 | 0.0 | nan | 0.9798 | 0.8776 | 0.8317 | 0.9753 | 0.8415 | 0.0 | 0.8259 | 0.9703 | 0.2421 | 0.0375 | 
0.8272 | 0.9513 | 0.0 | | 0.0381 | 229.09 | 2520 | 0.0875 | 0.6538 | 0.6983 | 0.9775 | nan | 0.9940 | 0.9460 | 0.8795 | 0.9867 | 0.9127 | 0.0 | 0.8662 | 0.9761 | 0.5741 | 0.0397 | 0.9178 | 0.9854 | 0.0 | nan | 0.9796 | 0.8836 | 0.8292 | 0.9752 | 0.8412 | 0.0 | 0.8211 | 0.9681 | 0.3713 | 0.0397 | 0.8419 | 0.9482 | 0.0 | | 0.0378 | 230.91 | 2540 | 0.0891 | 0.6451 | 0.6820 | 0.9772 | nan | 0.9941 | 0.9497 | 0.8782 | 0.9853 | 0.9044 | 0.0 | 0.8550 | 0.9823 | 0.4145 | 0.0431 | 0.8920 | 0.9675 | 0.0 | nan | 0.9801 | 0.8799 | 0.8293 | 0.9748 | 0.8341 | 0.0 | 0.8235 | 0.9715 | 0.2744 | 0.0431 | 0.8278 | 0.9482 | 0.0 | | 0.0366 | 232.73 | 2560 | 0.0805 | 0.6596 | 0.7029 | 0.9781 | nan | 0.9927 | 0.9457 | 0.9038 | 0.9883 | 0.9021 | 0.0 | 0.8627 | 0.9820 | 0.5617 | 0.1157 | 0.9036 | 0.9792 | 0.0 | nan | 0.9804 | 0.8879 | 0.8347 | 0.9751 | 0.8363 | 0.0 | 0.8243 | 0.9713 | 0.3664 | 0.1157 | 0.8314 | 0.9517 | 0.0 | | 0.0392 | 234.55 | 2580 | 0.0876 | 0.6457 | 0.6848 | 0.9775 | nan | 0.9932 | 0.9542 | 0.8871 | 0.9864 | 0.8981 | 0.0 | 0.8556 | 0.9812 | 0.4539 | 0.0049 | 0.9222 | 0.9662 | 0.0 | nan | 0.9801 | 0.8793 | 0.8331 | 0.9755 | 0.8402 | 0.0 | 0.8213 | 0.9730 | 0.2986 | 0.0049 | 0.8421 | 0.9459 | 0.0 | | 0.0343 | 236.36 | 2600 | 0.0929 | 0.6426 | 0.6817 | 0.9767 | nan | 0.9939 | 0.9510 | 0.8821 | 0.9868 | 0.9002 | 0.0 | 0.8755 | 0.9732 | 0.3850 | 0.0175 | 0.9199 | 0.9769 | 0.0 | nan | 0.9797 | 0.8746 | 0.8311 | 0.9756 | 0.8401 | 0.0 | 0.8335 | 0.9670 | 0.2518 | 0.0175 | 0.8300 | 0.9533 | 0.0 | | 0.0293 | 238.18 | 2620 | 0.0866 | 0.6505 | 0.6889 | 0.9774 | nan | 0.9936 | 0.9459 | 0.8962 | 0.9881 | 0.9018 | 0.0 | 0.8603 | 0.9749 | 0.5017 | 0.0444 | 0.8683 | 0.9809 | 0.0 | nan | 0.9796 | 0.8821 | 0.8319 | 0.9759 | 0.8400 | 0.0 | 0.8296 | 0.9691 | 0.3295 | 0.0444 | 0.8198 | 0.9541 | 0.0 | | 0.0353 | 240.0 | 2640 | 0.0883 | 0.6480 | 0.6858 | 0.9776 | nan | 0.9932 | 0.9495 | 0.9035 | 0.9866 | 0.9131 | 0.0 | 0.8624 | 0.9788 | 0.4136 | 0.0268 | 0.9084 | 0.9792 | 0.0 | nan | 0.9802 | 0.8823 | 0.8352 | 0.9756 | 0.8412 | 0.0 | 0.8318 | 0.9712 | 0.2761 | 0.0268 | 0.8497 | 0.9534 | 0.0 | | 0.0372 | 241.82 | 2660 | 0.0902 | 0.6433 | 0.6790 | 0.9775 | nan | 0.9937 | 0.9508 | 0.8846 | 0.9880 | 0.8968 | 0.0 | 0.8704 | 0.9824 | 0.3534 | 0.0062 | 0.9178 | 0.9832 | 0.0 | nan | 0.9803 | 0.8801 | 0.8332 | 0.9758 | 0.8412 | 0.0 | 0.8291 | 0.9733 | 0.2367 | 0.0062 | 0.8540 | 0.9533 | 0.0 | | 0.0343 | 243.64 | 2680 | 0.0866 | 0.6496 | 0.6912 | 0.9766 | nan | 0.9932 | 0.9498 | 0.8967 | 0.9863 | 0.8808 | 0.0 | 0.8618 | 0.9714 | 0.4802 | 0.0480 | 0.9261 | 0.9906 | 0.0 | nan | 0.9801 | 0.8795 | 0.8317 | 0.9748 | 0.8345 | 0.0 | 0.8253 | 0.9611 | 0.3118 | 0.0454 | 0.8515 | 0.9493 | 0.0 | | 0.0342 | 245.45 | 2700 | 0.0872 | 0.6468 | 0.6875 | 0.9774 | nan | 0.9930 | 0.9491 | 0.8958 | 0.9878 | 0.8840 | 0.0 | 0.8627 | 0.9795 | 0.4834 | 0.0200 | 0.8965 | 0.9851 | 0.0 | nan | 0.9803 | 0.8788 | 0.8315 | 0.9751 | 0.8329 | 0.0 | 0.8275 | 0.9709 | 0.3149 | 0.0200 | 0.8233 | 0.9530 | 0.0 | | 0.0348 | 247.27 | 2720 | 0.0851 | 0.6531 | 0.6958 | 0.9781 | nan | 0.9932 | 0.9475 | 0.9046 | 0.9874 | 0.8862 | 0.0 | 0.8656 | 0.9824 | 0.5480 | 0.0275 | 0.9258 | 0.9776 | 0.0 | nan | 0.9805 | 0.8861 | 0.8377 | 0.9756 | 0.8310 | 0.0 | 0.8251 | 0.9717 | 0.3604 | 0.0275 | 0.8419 | 0.9532 | 0.0 | | 0.0323 | 249.09 | 2740 | 0.0843 | 0.6518 | 0.6893 | 0.9780 | nan | 0.9934 | 0.9488 | 0.9029 | 0.9883 | 0.8835 | 0.0 | 0.8707 | 0.9807 | 0.4740 | 0.0491 | 0.8972 | 0.9720 | 0.0 | nan | 0.9802 | 0.8849 | 0.8369 | 0.9755 | 0.8337 | 0.0 | 0.8349 | 0.9736 | 0.3165 | 
0.0491 | 0.8370 | 0.9511 | 0.0 | | 0.0355 | 250.91 | 2760 | 0.0860 | 0.6511 | 0.6904 | 0.9780 | nan | 0.9931 | 0.9499 | 0.9036 | 0.9877 | 0.8926 | 0.0 | 0.8690 | 0.9808 | 0.5094 | 0.0171 | 0.9016 | 0.9699 | 0.0 | nan | 0.9805 | 0.8852 | 0.8370 | 0.9756 | 0.8348 | 0.0 | 0.8307 | 0.9720 | 0.3331 | 0.0171 | 0.8485 | 0.9504 | 0.0 | | 0.0366 | 252.73 | 2780 | 0.0867 | 0.6514 | 0.6900 | 0.9778 | nan | 0.9937 | 0.9446 | 0.9035 | 0.9886 | 0.8815 | 0.0 | 0.8649 | 0.9794 | 0.5150 | 0.0229 | 0.9016 | 0.9744 | 0.0 | nan | 0.9803 | 0.8844 | 0.8358 | 0.9755 | 0.8337 | 0.0 | 0.8274 | 0.9709 | 0.3393 | 0.0229 | 0.8452 | 0.9524 | 0.0 | | 0.0368 | 254.55 | 2800 | 0.0879 | 0.6451 | 0.6828 | 0.9777 | nan | 0.9937 | 0.9434 | 0.9032 | 0.9882 | 0.8870 | 0.0 | 0.8569 | 0.9826 | 0.4230 | 0.0183 | 0.8910 | 0.9890 | 0.0 | nan | 0.9804 | 0.8819 | 0.8383 | 0.9753 | 0.8337 | 0.0 | 0.8215 | 0.9726 | 0.2783 | 0.0183 | 0.8376 | 0.9488 | 0.0 | | 0.0388 | 256.36 | 2820 | 0.0883 | 0.6422 | 0.6776 | 0.9776 | nan | 0.9937 | 0.9464 | 0.9016 | 0.9883 | 0.8907 | 0.0 | 0.8708 | 0.9827 | 0.3579 | 0.0076 | 0.8931 | 0.9764 | 0.0 | nan | 0.9804 | 0.8827 | 0.8374 | 0.9754 | 0.8333 | 0.0 | 0.8287 | 0.9729 | 0.2403 | 0.0076 | 0.8395 | 0.9509 | 0.0 | | 0.0386 | 258.18 | 2840 | 0.0893 | 0.6434 | 0.6809 | 0.9772 | nan | 0.9938 | 0.9477 | 0.8840 | 0.9885 | 0.8922 | 0.0 | 0.8622 | 0.9794 | 0.3777 | 0.0420 | 0.9091 | 0.9745 | 0.0 | nan | 0.9801 | 0.8804 | 0.8298 | 0.9754 | 0.8342 | 0.0 | 0.8269 | 0.9703 | 0.2490 | 0.0420 | 0.8242 | 0.9520 | 0.0 | | 0.0386 | 260.0 | 2860 | 0.0903 | 0.6473 | 0.6870 | 0.9776 | nan | 0.9940 | 0.9485 | 0.8889 | 0.9860 | 0.9029 | 0.0 | 0.8773 | 0.9816 | 0.3984 | 0.0531 | 0.9192 | 0.9808 | 0.0 | nan | 0.9803 | 0.8815 | 0.8334 | 0.9756 | 0.8383 | 0.0 | 0.8326 | 0.9719 | 0.2637 | 0.0531 | 0.8304 | 0.9544 | 0.0 | | 0.0383 | 261.82 | 2880 | 0.0882 | 0.6451 | 0.6817 | 0.9771 | nan | 0.9923 | 0.9516 | 0.9188 | 0.9872 | 0.8969 | 0.0 | 0.8459 | 0.9748 | 0.4169 | 0.0588 | 0.8576 | 0.9614 | 0.0 | nan | 0.9805 | 0.8734 | 0.8412 | 0.9756 | 0.8391 | 0.0 | 0.8114 | 0.9671 | 0.2781 | 0.0588 | 0.8156 | 0.9450 | 0.0 | | 0.0453 | 263.64 | 2900 | 0.0936 | 0.6425 | 0.6776 | 0.9770 | nan | 0.9938 | 0.9515 | 0.8913 | 0.9875 | 0.8942 | 0.0 | 0.8662 | 0.9750 | 0.3374 | 0.0279 | 0.9025 | 0.9808 | 0.0 | nan | 0.9799 | 0.8763 | 0.8322 | 0.9759 | 0.8413 | 0.0 | 0.8315 | 0.9685 | 0.2306 | 0.0279 | 0.8327 | 0.9552 | 0.0 | | 0.0337 | 265.45 | 2920 | 0.0940 | 0.6379 | 0.6778 | 0.9761 | nan | 0.9937 | 0.9492 | 0.9001 | 0.9869 | 0.8973 | 0.0 | 0.8594 | 0.9643 | 0.3736 | 0.0228 | 0.8864 | 0.9772 | 0.0 | nan | 0.9799 | 0.8747 | 0.8347 | 0.9754 | 0.8359 | 0.0 | 0.8280 | 0.9589 | 0.2507 | 0.0228 | 0.7784 | 0.9534 | 0.0 | | 0.037 | 267.27 | 2940 | 0.0926 | 0.6435 | 0.6818 | 0.9771 | nan | 0.9938 | 0.9521 | 0.8916 | 0.9878 | 0.8858 | 0.0 | 0.8528 | 0.9722 | 0.4428 | 0.0264 | 0.8817 | 0.9758 | 0.0 | nan | 0.9803 | 0.8757 | 0.8370 | 0.9758 | 0.8372 | 0.0 | 0.8186 | 0.9658 | 0.2930 | 0.0264 | 0.8019 | 0.9537 | 0.0 | | 0.0357 | 269.09 | 2960 | 0.0887 | 0.6467 | 0.6829 | 0.9776 | nan | 0.9937 | 0.9502 | 0.9018 | 0.9873 | 0.9034 | 0.0 | 0.8470 | 0.9759 | 0.4431 | 0.0296 | 0.8739 | 0.9719 | 0.0 | nan | 0.9804 | 0.8799 | 0.8387 | 0.9760 | 0.8437 | 0.0 | 0.8183 | 0.9696 | 0.2943 | 0.0296 | 0.8247 | 0.9519 | 0.0 | | 0.033 | 270.91 | 2980 | 0.0910 | 0.6453 | 0.6824 | 0.9773 | nan | 0.9934 | 0.9515 | 0.8987 | 0.9864 | 0.9136 | 0.0 | 0.8525 | 0.9744 | 0.4236 | 0.0323 | 0.8571 | 0.9880 | 0.0 | nan | 0.9804 | 0.8760 | 0.8387 | 0.9759 | 0.8437 | 0.0 | 0.8255 | 0.9671 | 
0.2827 | 0.0323 | 0.8169 | 0.9492 | 0.0 | | 0.0333 | 272.73 | 3000 | 0.0910 | 0.6481 | 0.6856 | 0.9774 | nan | 0.9943 | 0.9491 | 0.8809 | 0.9854 | 0.9146 | 0.0 | 0.8704 | 0.9788 | 0.4187 | 0.0534 | 0.8887 | 0.9782 | 0.0 | nan | 0.9799 | 0.8802 | 0.8306 | 0.9758 | 0.8404 | 0.0 | 0.8322 | 0.9703 | 0.2804 | 0.0534 | 0.8296 | 0.9522 | 0.0 | | 0.037 | 274.55 | 3020 | 0.0907 | 0.6453 | 0.6841 | 0.9771 | nan | 0.9934 | 0.9531 | 0.8984 | 0.9865 | 0.9030 | 0.0 | 0.8419 | 0.9721 | 0.4512 | 0.0357 | 0.8848 | 0.9729 | 0.0 | nan | 0.9806 | 0.8740 | 0.8384 | 0.9757 | 0.8420 | 0.0 | 0.8096 | 0.9646 | 0.2979 | 0.0357 | 0.8184 | 0.9520 | 0.0 | | 0.0365 | 276.36 | 3040 | 0.0909 | 0.6466 | 0.6848 | 0.9773 | nan | 0.9939 | 0.9522 | 0.8941 | 0.9849 | 0.9098 | 0.0 | 0.8605 | 0.9761 | 0.4040 | 0.0574 | 0.8935 | 0.9757 | 0.0 | nan | 0.9804 | 0.8766 | 0.8367 | 0.9756 | 0.8423 | 0.0 | 0.8269 | 0.9696 | 0.2695 | 0.0574 | 0.8201 | 0.9513 | 0.0 | | 0.0354 | 278.18 | 3060 | 0.0905 | 0.6487 | 0.6945 | 0.9772 | nan | 0.9928 | 0.9459 | 0.9125 | 0.9880 | 0.9037 | 0.0 | 0.8835 | 0.9734 | 0.4337 | 0.0627 | 0.9426 | 0.9893 | 0.0 | nan | 0.9802 | 0.8777 | 0.8358 | 0.9757 | 0.8375 | 0.0 | 0.8297 | 0.9668 | 0.2863 | 0.0626 | 0.8329 | 0.9478 | 0.0 | | 0.0378 | 280.0 | 3080 | 0.0879 | 0.6535 | 0.6946 | 0.9778 | nan | 0.9931 | 0.9468 | 0.9035 | 0.9891 | 0.8882 | 0.0 | 0.8652 | 0.9781 | 0.5137 | 0.0597 | 0.9041 | 0.9878 | 0.0 | nan | 0.9805 | 0.8828 | 0.8358 | 0.9758 | 0.8385 | 0.0 | 0.8281 | 0.9697 | 0.3350 | 0.0597 | 0.8412 | 0.9488 | 0.0 | | 0.0371 | 281.82 | 3100 | 0.0858 | 0.6545 | 0.6945 | 0.9783 | nan | 0.9933 | 0.9467 | 0.9019 | 0.9886 | 0.8911 | 0.0 | 0.8670 | 0.9826 | 0.5208 | 0.0628 | 0.8881 | 0.9858 | 0.0 | nan | 0.9805 | 0.8874 | 0.8387 | 0.9759 | 0.8411 | 0.0 | 0.8301 | 0.9727 | 0.3412 | 0.0628 | 0.8308 | 0.9466 | 0.0 | | 0.035 | 283.64 | 3120 | 0.0876 | 0.6526 | 0.6928 | 0.9780 | nan | 0.9934 | 0.9498 | 0.9050 | 0.9873 | 0.9034 | 0.0 | 0.8836 | 0.9783 | 0.4618 | 0.0547 | 0.9038 | 0.9846 | 0.0 | nan | 0.9807 | 0.8836 | 0.8402 | 0.9763 | 0.8446 | 0.0 | 0.8362 | 0.9713 | 0.3050 | 0.0546 | 0.8429 | 0.9480 | 0.0 | | 0.0349 | 285.45 | 3140 | 0.0877 | 0.6500 | 0.6906 | 0.9780 | nan | 0.9930 | 0.9504 | 0.9105 | 0.9869 | 0.9051 | 0.0 | 0.8656 | 0.9809 | 0.4285 | 0.0499 | 0.9171 | 0.9901 | 0.0 | nan | 0.9804 | 0.8853 | 0.8374 | 0.9763 | 0.8424 | 0.0 | 0.8308 | 0.9728 | 0.2869 | 0.0499 | 0.8404 | 0.9471 | 0.0 | | 0.0423 | 287.27 | 3160 | 0.0875 | 0.6496 | 0.6888 | 0.9781 | nan | 0.9929 | 0.9553 | 0.9057 | 0.9863 | 0.9057 | 0.0 | 0.8749 | 0.9822 | 0.4253 | 0.0542 | 0.8963 | 0.9755 | 0.0 | nan | 0.9807 | 0.8844 | 0.8394 | 0.9760 | 0.8420 | 0.0 | 0.8338 | 0.9748 | 0.2834 | 0.0542 | 0.8248 | 0.9516 | 0.0 | | 0.0326 | 289.09 | 3180 | 0.0897 | 0.6492 | 0.6881 | 0.9780 | nan | 0.9936 | 0.9527 | 0.8975 | 0.9870 | 0.9142 | 0.0 | 0.8698 | 0.9788 | 0.4126 | 0.0584 | 0.9041 | 0.9763 | 0.0 | nan | 0.9808 | 0.8823 | 0.8411 | 0.9761 | 0.8421 | 0.0 | 0.8305 | 0.9724 | 0.2746 | 0.0583 | 0.8292 | 0.9520 | 0.0 | | 0.0323 | 290.91 | 3200 | 0.0899 | 0.6487 | 0.6875 | 0.9782 | nan | 0.9934 | 0.9504 | 0.8956 | 0.9878 | 0.8963 | 0.0 | 0.8841 | 0.9845 | 0.4247 | 0.0323 | 0.8995 | 0.9888 | 0.0 | nan | 0.9807 | 0.8849 | 0.8356 | 0.9762 | 0.8384 | 0.0 | 0.8347 | 0.9761 | 0.2822 | 0.0323 | 0.8397 | 0.9517 | 0.0 | | 0.0407 | 292.73 | 3220 | 0.0844 | 0.6536 | 0.6982 | 0.9782 | nan | 0.9930 | 0.9509 | 0.9032 | 0.9872 | 0.9049 | 0.0 | 0.8784 | 0.9808 | 0.4992 | 0.0769 | 0.9125 | 0.9902 | 0.0 | nan | 0.9809 | 0.8850 | 0.8388 | 0.9761 | 0.8392 | 0.0 | 0.8307 | 
0.9729 | 0.3249 | 0.0768 | 0.8274 | 0.9438 | 0.0 | | 0.0365 | 294.55 | 3240 | 0.0892 | 0.6506 | 0.6887 | 0.9779 | nan | 0.9943 | 0.9506 | 0.8839 | 0.9878 | 0.9002 | 0.0 | 0.8785 | 0.9772 | 0.4731 | 0.0510 | 0.8823 | 0.9736 | 0.0 | nan | 0.9806 | 0.8820 | 0.8361 | 0.9761 | 0.8421 | 0.0 | 0.8275 | 0.9705 | 0.3144 | 0.0509 | 0.8260 | 0.9512 | 0.0 | | 0.0315 | 296.36 | 3260 | 0.0882 | 0.6528 | 0.6949 | 0.9781 | nan | 0.9935 | 0.9462 | 0.9003 | 0.9884 | 0.9042 | 0.0 | 0.8880 | 0.9805 | 0.4850 | 0.0606 | 0.9047 | 0.9822 | 0.0 | nan | 0.9808 | 0.8822 | 0.8440 | 0.9760 | 0.8444 | 0.0 | 0.8331 | 0.9712 | 0.3161 | 0.0606 | 0.8248 | 0.9529 | 0.0 | | 0.0379 | 298.18 | 3280 | 0.0894 | 0.6517 | 0.6921 | 0.9781 | nan | 0.9935 | 0.9504 | 0.8988 | 0.9868 | 0.9107 | 0.0 | 0.8828 | 0.9815 | 0.4648 | 0.0525 | 0.9134 | 0.9626 | 0.0 | nan | 0.9806 | 0.8837 | 0.8398 | 0.9761 | 0.8460 | 0.0 | 0.8284 | 0.9726 | 0.3106 | 0.0524 | 0.8347 | 0.9477 | 0.0 | | 0.034 | 300.0 | 3300 | 0.0889 | 0.6531 | 0.6950 | 0.9779 | nan | 0.9939 | 0.9525 | 0.8816 | 0.9866 | 0.9054 | 0.0 | 0.8788 | 0.9787 | 0.5118 | 0.0680 | 0.9114 | 0.9664 | 0.0 | nan | 0.9802 | 0.8843 | 0.8316 | 0.9761 | 0.8440 | 0.0 | 0.8317 | 0.9709 | 0.3396 | 0.0679 | 0.8153 | 0.9490 | 0.0 | | 0.0387 | 301.82 | 3320 | 0.0877 | 0.6542 | 0.6954 | 0.9781 | nan | 0.9937 | 0.9516 | 0.8935 | 0.9870 | 0.9100 | 0.0 | 0.8756 | 0.9778 | 0.5035 | 0.0720 | 0.8871 | 0.9884 | 0.0 | nan | 0.9810 | 0.8816 | 0.8420 | 0.9761 | 0.8463 | 0.0 | 0.8325 | 0.9710 | 0.3264 | 0.0719 | 0.8233 | 0.9521 | 0.0 | | 0.0349 | 303.64 | 3340 | 0.0875 | 0.6509 | 0.6923 | 0.9781 | nan | 0.9933 | 0.9499 | 0.8976 | 0.9874 | 0.9067 | 0.0 | 0.8839 | 0.9811 | 0.4834 | 0.0564 | 0.8771 | 0.9830 | 0.0 | nan | 0.9805 | 0.8849 | 0.8363 | 0.9764 | 0.8444 | 0.0 | 0.8349 | 0.9724 | 0.3192 | 0.0558 | 0.8106 | 0.9461 | 0.0 | | 0.0313 | 305.45 | 3360 | 0.0894 | 0.6503 | 0.6889 | 0.9779 | nan | 0.9938 | 0.9537 | 0.8934 | 0.9858 | 0.9085 | 0.0 | 0.8752 | 0.9786 | 0.4658 | 0.0540 | 0.8803 | 0.9669 | 0.0 | nan | 0.9806 | 0.8827 | 0.8379 | 0.9762 | 0.8451 | 0.0 | 0.8352 | 0.9710 | 0.3082 | 0.0533 | 0.8173 | 0.9470 | 0.0 | | 0.0358 | 307.27 | 3380 | 0.0894 | 0.6465 | 0.6883 | 0.9778 | nan | 0.9923 | 0.9520 | 0.9138 | 0.9893 | 0.8962 | 0.0 | 0.8796 | 0.9780 | 0.4009 | 0.0639 | 0.9009 | 0.9806 | 0.0 | nan | 0.9808 | 0.8814 | 0.8397 | 0.9761 | 0.8424 | 0.0 | 0.8330 | 0.9708 | 0.2664 | 0.0637 | 0.8022 | 0.9475 | 0.0 | | 0.0323 | 309.09 | 3400 | 0.0890 | 0.6486 | 0.6912 | 0.9781 | nan | 0.9936 | 0.9510 | 0.8914 | 0.9870 | 0.9048 | 0.0 | 0.8768 | 0.9829 | 0.4675 | 0.0636 | 0.8936 | 0.9735 | 0.0 | nan | 0.9804 | 0.8869 | 0.8342 | 0.9763 | 0.8419 | 0.0 | 0.8339 | 0.9741 | 0.3078 | 0.0623 | 0.7855 | 0.9485 | 0.0 | | 0.0299 | 310.91 | 3420 | 0.0898 | 0.6479 | 0.6913 | 0.9779 | nan | 0.9930 | 0.9546 | 0.9055 | 0.9855 | 0.9118 | 0.0 | 0.8758 | 0.9793 | 0.4479 | 0.0659 | 0.9043 | 0.9628 | 0.0 | nan | 0.9806 | 0.8839 | 0.8373 | 0.9760 | 0.8412 | 0.0 | 0.8331 | 0.9720 | 0.2964 | 0.0655 | 0.7905 | 0.9466 | 0.0 | | 0.0316 | 312.73 | 3440 | 0.0902 | 0.6532 | 0.6945 | 0.9777 | nan | 0.9939 | 0.9459 | 0.8925 | 0.9879 | 0.9053 | 0.0 | 0.8680 | 0.9766 | 0.5033 | 0.0824 | 0.8993 | 0.9729 | 0.0 | nan | 0.9802 | 0.8857 | 0.8357 | 0.9762 | 0.8419 | 0.0 | 0.8163 | 0.9677 | 0.3293 | 0.0823 | 0.8255 | 0.9513 | 0.0 | | 0.0346 | 314.55 | 3460 | 0.0891 | 0.6538 | 0.6971 | 0.9781 | nan | 0.9926 | 0.9511 | 0.9086 | 0.9887 | 0.8953 | 0.0 | 0.8813 | 0.9794 | 0.4861 | 0.0727 | 0.9263 | 0.9805 | 0.0 | nan | 0.9805 | 0.8868 | 0.8357 | 0.9764 | 0.8416 | 0.0 | 
0.8351 | 0.9717 | 0.3176 | 0.0726 | 0.8282 | 0.9531 | 0.0 |
| 0.0341 | 316.36 | 3480 | 0.0861 | 0.6524 | 0.6911 | 0.9781 | nan | 0.9941 | 0.9474 | 0.8925 | 0.9879 | 0.8937 | 0.0 | 0.8771 | 0.9825 | 0.4475 | 0.0827 | 0.9002 | 0.9791 | 0.0 | nan | 0.9804 | 0.8871 | 0.8373 | 0.9764 | 0.8403 | 0.0 | 0.8350 | 0.9732 | 0.2970 | 0.0781 | 0.8236 | 0.9529 | 0.0 |
| 0.0349 | 318.18 | 3500 | 0.0907 | 0.6487 | 0.6856 | 0.9778 | nan | 0.9939 | 0.9498 | 0.8943 | 0.9877 | 0.8943 | 0.0 | 0.8742 | 0.9781 | 0.4577 | 0.0311 | 0.8775 | 0.9736 | 0.0 | nan | 0.9803 | 0.8829 | 0.8362 | 0.9764 | 0.8421 | 0.0 | 0.8306 | 0.9712 | 0.3032 | 0.0302 | 0.8271 | 0.9529 | 0.0 |
| 0.0343 | 320.0 | 3520 | 0.0910 | 0.6485 | 0.6858 | 0.9778 | nan | 0.9938 | 0.9497 | 0.8968 | 0.9874 | 0.9044 | 0.0 | 0.8807 | 0.9778 | 0.4278 | 0.0422 | 0.8817 | 0.9733 | 0.0 | nan | 0.9802 | 0.8844 | 0.8364 | 0.9764 | 0.8443 | 0.0 | 0.8339 | 0.9714 | 0.2865 | 0.0421 | 0.8223 | 0.9530 | 0.0 |
| 0.0372 | 321.82 | 3540 | 0.0894 | 0.6507 | 0.6899 | 0.9781 | nan | 0.9935 | 0.9504 | 0.9042 | 0.9855 | 0.9141 | 0.0 | 0.8811 | 0.9809 | 0.4533 | 0.0524 | 0.8755 | 0.9779 | 0.0 | nan | 0.9805 | 0.8851 | 0.8382 | 0.9762 | 0.8449 | 0.0 | 0.8309 | 0.9731 | 0.2998 | 0.0519 | 0.8239 | 0.9540 | 0.0 |
| 0.0338 | 323.64 | 3560 | 0.0882 | 0.6524 | 0.6930 | 0.9780 | nan | 0.9934 | 0.9521 | 0.9021 | 0.9858 | 0.9186 | 0.0 | 0.8738 | 0.9792 | 0.4563 | 0.0612 | 0.9087 | 0.9770 | 0.0 | nan | 0.9808 | 0.8832 | 0.8395 | 0.9763 | 0.8462 | 0.0 | 0.8298 | 0.9717 | 0.3012 | 0.0603 | 0.8389 | 0.9531 | 0.0 |
| 0.0332 | 325.45 | 3580 | 0.0876 | 0.6522 | 0.6888 | 0.9784 | nan | 0.9939 | 0.9498 | 0.8982 | 0.9873 | 0.9035 | 0.0 | 0.8680 | 0.9826 | 0.4739 | 0.0619 | 0.8603 | 0.9754 | 0.0 | nan | 0.9805 | 0.8875 | 0.8382 | 0.9764 | 0.8477 | 0.0 | 0.8294 | 0.9747 | 0.3179 | 0.0614 | 0.8103 | 0.9541 | 0.0 |
| 0.031 | 327.27 | 3600 | 0.0880 | 0.6533 | 0.6911 | 0.9783 | nan | 0.9941 | 0.9461 | 0.8996 | 0.9874 | 0.9056 | 0.0 | 0.8708 | 0.9827 | 0.4609 | 0.0705 | 0.8919 | 0.9753 | 0.0 | nan | 0.9804 | 0.8876 | 0.8355 | 0.9765 | 0.8409 | 0.0 | 0.8321 | 0.9755 | 0.3079 | 0.0705 | 0.8331 | 0.9523 | 0.0 |
| 0.0314 | 329.09 | 3620 | 0.0911 | 0.6518 | 0.6915 | 0.9779 | nan | 0.9924 | 0.9532 | 0.9187 | 0.9871 | 0.9050 | 0.0 | 0.8749 | 0.9783 | 0.4372 | 0.0759 | 0.8888 | 0.9781 | 0.0 | nan | 0.9807 | 0.8819 | 0.8395 | 0.9765 | 0.8451 | 0.0 | 0.8347 | 0.9719 | 0.2873 | 0.0758 | 0.8273 | 0.9531 | 0.0 |
| 0.0315 | 330.91 | 3640 | 0.0890 | 0.6539 | 0.6924 | 0.9782 | nan | 0.9935 | 0.9475 | 0.9014 | 0.9877 | 0.8994 | 0.0 | 0.8782 | 0.9816 | 0.4982 | 0.0765 | 0.8597 | 0.9769 | 0.0 | nan | 0.9803 | 0.8874 | 0.8335 | 0.9764 | 0.8441 | 0.0 | 0.8334 | 0.9738 | 0.3267 | 0.0764 | 0.8142 | 0.9541 | 0.0 |
| 0.0333 | 332.73 | 3660 | 0.0897 | 0.6530 | 0.6913 | 0.9782 | nan | 0.9939 | 0.9478 | 0.8982 | 0.9875 | 0.9056 | 0.0 | 0.8833 | 0.9814 | 0.4382 | 0.0825 | 0.8878 | 0.9813 | 0.0 | nan | 0.9806 | 0.8848 | 0.8371 | 0.9766 | 0.8444 | 0.0 | 0.8387 | 0.9737 | 0.2914 | 0.0824 | 0.8248 | 0.9542 | 0.0 |
| 0.0341 | 334.55 | 3680 | 0.0886 | 0.6545 | 0.6923 | 0.9783 | nan | 0.9934 | 0.9476 | 0.9030 | 0.9889 | 0.8997 | 0.0 | 0.8764 | 0.9822 | 0.4599 | 0.0920 | 0.8837 | 0.9728 | 0.0 | nan | 0.9808 | 0.8856 | 0.8393 | 0.9765 | 0.8458 | 0.0 | 0.8357 | 0.9744 | 0.3017 | 0.0915 | 0.8247 | 0.9523 | 0.0 |
| 0.0301 | 336.36 | 3700 | 0.0899 | 0.6516 | 0.6892 | 0.9781 | nan | 0.9934 | 0.9521 | 0.9029 | 0.9862 | 0.9045 | 0.0 | 0.8795 | 0.9812 | 0.4375 | 0.0754 | 0.8707 | 0.9762 | 0.0 | nan | 0.9808 | 0.8836 | 0.8377 | 0.9765 | 0.8453 | 0.0 | 0.8357 | 0.9728 | 0.2916 | 0.0754 | 0.8184 | 0.9524 | 0.0 |
| 0.0295 | 338.18 | 3720 | 0.0874 | 0.6559 | 0.6972 | 0.9785 | nan | 0.9936 | 0.9509 | 0.8985 | 0.9880 | 0.8961 | 0.0 | 0.8824 | 0.9819 | 0.4822 | 0.0693 | 0.9396 | 0.9813 | 0.0 | nan | 0.9810 | 0.8869 | 0.8385 | 0.9764 | 0.8440 | 0.0 | 0.8374 | 0.9737 | 0.3193 | 0.0693 | 0.8475 | 0.9531 | 0.0 |
| 0.0306 | 340.0 | 3740 | 0.0870 | 0.6592 | 0.7002 | 0.9786 | nan | 0.9935 | 0.9516 | 0.8970 | 0.9866 | 0.9053 | 0.0 | 0.8728 | 0.9828 | 0.5435 | 0.0707 | 0.9238 | 0.9748 | 0.0 | nan | 0.9806 | 0.8901 | 0.8339 | 0.9766 | 0.8449 | 0.0 | 0.8341 | 0.9741 | 0.3597 | 0.0707 | 0.8507 | 0.9537 | 0.0 |
| 0.0283 | 341.82 | 3760 | 0.0878 | 0.6583 | 0.6993 | 0.9783 | nan | 0.9938 | 0.9480 | 0.8967 | 0.9869 | 0.9106 | 0.0 | 0.8838 | 0.9801 | 0.5249 | 0.0868 | 0.9036 | 0.9756 | 0.0 | nan | 0.9805 | 0.8869 | 0.8363 | 0.9765 | 0.8463 | 0.0 | 0.8350 | 0.9717 | 0.3444 | 0.0867 | 0.8407 | 0.9535 | 0.0 |
| 0.0376 | 343.64 | 3780 | 0.0906 | 0.6561 | 0.6963 | 0.9781 | nan | 0.9935 | 0.9483 | 0.9037 | 0.9879 | 0.9036 | 0.0 | 0.8749 | 0.9780 | 0.4881 | 0.0760 | 0.9123 | 0.9858 | 0.0 | nan | 0.9803 | 0.8866 | 0.8349 | 0.9766 | 0.8474 | 0.0 | 0.8351 | 0.9709 | 0.3256 | 0.0760 | 0.8412 | 0.9552 | 0.0 |
| 0.0316 | 345.45 | 3800 | 0.0876 | 0.6581 | 0.7000 | 0.9785 | nan | 0.9935 | 0.9492 | 0.9003 | 0.9876 | 0.9055 | 0.0 | 0.8766 | 0.9809 | 0.5509 | 0.0718 | 0.9013 | 0.9826 | 0.0 | nan | 0.9808 | 0.8886 | 0.8361 | 0.9765 | 0.8447 | 0.0 | 0.8351 | 0.9734 | 0.3618 | 0.0718 | 0.8336 | 0.9532 | 0.0 |
| 0.03 | 347.27 | 3820 | 0.0881 | 0.6560 | 0.6944 | 0.9784 | nan | 0.9941 | 0.9491 | 0.8922 | 0.9879 | 0.8951 | 0.0 | 0.8800 | 0.9802 | 0.5194 | 0.0688 | 0.8873 | 0.9729 | 0.0 | nan | 0.9807 | 0.8888 | 0.8349 | 0.9766 | 0.8437 | 0.0 | 0.8362 | 0.9721 | 0.3422 | 0.0687 | 0.8330 | 0.9512 | 0.0 |
| 0.0336 | 349.09 | 3840 | 0.0898 | 0.6534 | 0.6917 | 0.9782 | nan | 0.9937 | 0.9532 | 0.8942 | 0.9873 | 0.8994 | 0.0 | 0.8761 | 0.9799 | 0.4736 | 0.0707 | 0.8835 | 0.9808 | 0.0 | nan | 0.9809 | 0.8845 | 0.8358 | 0.9766 | 0.8441 | 0.0 | 0.8348 | 0.9726 | 0.3129 | 0.0706 | 0.8264 | 0.9552 | 0.0 |
| 0.0314 | 350.91 | 3860 | 0.0869 | 0.6565 | 0.6970 | 0.9786 | nan | 0.9935 | 0.9510 | 0.8995 | 0.9878 | 0.8977 | 0.0 | 0.8845 | 0.9827 | 0.5176 | 0.0739 | 0.8986 | 0.9735 | 0.0 | nan | 0.9808 | 0.8904 | 0.8348 | 0.9768 | 0.8437 | 0.0 | 0.8344 | 0.9746 | 0.3410 | 0.0739 | 0.8310 | 0.9527 | 0.0 |
| 0.0309 | 352.73 | 3880 | 0.0892 | 0.6533 | 0.6939 | 0.9781 | nan | 0.9921 | 0.9510 | 0.9161 | 0.9882 | 0.9016 | 0.0 | 0.8884 | 0.9823 | 0.4434 | 0.0799 | 0.8945 | 0.9834 | 0.0 | nan | 0.9804 | 0.8875 | 0.8316 | 0.9767 | 0.8442 | 0.0 | 0.8387 | 0.9742 | 0.2952 | 0.0798 | 0.8278 | 0.9571 | 0.0 |
| 0.0342 | 354.55 | 3900 | 0.0888 | 0.6530 | 0.6897 | 0.9782 | nan | 0.9938 | 0.9530 | 0.8968 | 0.9864 | 0.9063 | 0.0 | 0.8723 | 0.9802 | 0.4558 | 0.0685 | 0.8786 | 0.9742 | 0.0 | nan | 0.9805 | 0.8871 | 0.8362 | 0.9767 | 0.8457 | 0.0 | 0.8345 | 0.9728 | 0.3029 | 0.0685 | 0.8289 | 0.9546 | 0.0 |
| 0.0365 | 356.36 | 3920 | 0.0869 | 0.6567 | 0.6982 | 0.9785 | nan | 0.9932 | 0.9526 | 0.9029 | 0.9878 | 0.9015 | 0.0 | 0.8879 | 0.9803 | 0.5220 | 0.0699 | 0.8917 | 0.9872 | 0.0 | nan | 0.9808 | 0.8883 | 0.8396 | 0.9767 | 0.8428 | 0.0 | 0.8347 | 0.9724 | 0.3456 | 0.0698 | 0.8348 | 0.9522 | 0.0 |
| 0.0333 | 358.18 | 3940 | 0.0868 | 0.6535 | 0.6926 | 0.9783 | nan | 0.9934 | 0.9503 | 0.9020 | 0.9883 | 0.8989 | 0.0 | 0.8703 | 0.9806 | 0.4826 | 0.0743 | 0.8771 | 0.9863 | 0.0 | nan | 0.9807 | 0.8866 | 0.8383 | 0.9765 | 0.8419 | 0.0 | 0.8326 | 0.9736 | 0.3193 | 0.0742 | 0.8191 | 0.9521 | 0.0 |
| 0.036 | 360.0 | 3960 | 0.0882 | 0.6540 | 0.6929 | 0.9786 | nan | 0.9936 | 0.9513 | 0.8979 | 0.9877 | 0.9024 | 0.0 | 0.8887 | 0.9827 | 0.4783 | 0.0736 | 0.8688 | 0.9822 | 0.0 | nan | 0.9806 | 0.8897 | 0.8360 | 0.9768 | 0.8460 | 0.0 | 0.8367 | 0.9753 | 0.3176 | 0.0736 | 0.8162 | 0.9530 | 0.0 |
| 0.0312 | 361.82 | 3980 | 0.0862 | 0.6613 | 0.7035 | 0.9788 | nan | 0.9938 | 0.9513 | 0.8954 | 0.9879 | 0.9096 | 0.0 | 0.8779 | 0.9791 | 0.5993 | 0.0829 | 0.8876 | 0.9805 | 0.0 | nan | 0.9806 | 0.8918 | 0.8369 | 0.9768 | 0.8470 | 0.0 | 0.8330 | 0.9736 | 0.3931 | 0.0828 | 0.8256 | 0.9557 | 0.0 |
| 0.039 | 363.64 | 4000 | 0.0917 | 0.6504 | 0.6875 | 0.9780 | nan | 0.9937 | 0.9536 | 0.8970 | 0.9869 | 0.9043 | 0.0 | 0.8786 | 0.9781 | 0.4372 | 0.0616 | 0.8686 | 0.9777 | 0.0 | nan | 0.9806 | 0.8880 | 0.8365 | 0.9767 | 0.8463 | 0.0 | 0.8375 | 0.9711 | 0.2909 | 0.0616 | 0.8126 | 0.9527 | 0.0 |
| 0.0333 | 365.45 | 4020 | 0.0867 | 0.6568 | 0.6974 | 0.9787 | nan | 0.9936 | 0.9497 | 0.9033 | 0.9880 | 0.9013 | 0.0 | 0.8829 | 0.9815 | 0.5210 | 0.0750 | 0.8924 | 0.9778 | 0.0 | nan | 0.9808 | 0.8915 | 0.8355 | 0.9768 | 0.8459 | 0.0 | 0.8346 | 0.9743 | 0.3429 | 0.0750 | 0.8258 | 0.9560 | 0.0 |
| 0.0317 | 367.27 | 4040 | 0.0899 | 0.6532 | 0.6921 | 0.9781 | nan | 0.9937 | 0.9519 | 0.8979 | 0.9880 | 0.8961 | 0.0 | 0.8784 | 0.9775 | 0.4633 | 0.0753 | 0.8974 | 0.9780 | 0.0 | nan | 0.9805 | 0.8868 | 0.8346 | 0.9766 | 0.8437 | 0.0 | 0.8348 | 0.9710 | 0.3082 | 0.0752 | 0.8253 | 0.9553 | 0.0 |
| 0.0293 | 369.09 | 4060 | 0.0875 | 0.6556 | 0.6944 | 0.9785 | nan | 0.9935 | 0.9523 | 0.8980 | 0.9872 | 0.8980 | 0.0 | 0.8787 | 0.9830 | 0.5020 | 0.0679 | 0.8871 | 0.9799 | 0.0 | nan | 0.9807 | 0.8889 | 0.8363 | 0.9767 | 0.8463 | 0.0 | 0.8363 | 0.9750 | 0.3301 | 0.0678 | 0.8285 | 0.9559 | 0.0 |
| 0.0332 | 370.91 | 4080 | 0.0881 | 0.6568 | 0.6958 | 0.9787 | nan | 0.9939 | 0.9515 | 0.8929 | 0.9873 | 0.9054 | 0.0 | 0.8822 | 0.9830 | 0.5162 | 0.0663 | 0.8869 | 0.9793 | 0.0 | nan | 0.9808 | 0.8897 | 0.8364 | 0.9767 | 0.8467 | 0.0 | 0.8369 | 0.9751 | 0.3395 | 0.0663 | 0.8335 | 0.9562 | 0.0 |
| 0.0306 | 372.73 | 4100 | 0.0878 | 0.6567 | 0.6967 | 0.9786 | nan | 0.9937 | 0.9509 | 0.8963 | 0.9873 | 0.9101 | 0.0 | 0.8838 | 0.9827 | 0.4861 | 0.0826 | 0.8999 | 0.9835 | 0.0 | nan | 0.9809 | 0.8888 | 0.8376 | 0.9767 | 0.8476 | 0.0 | 0.8383 | 0.9743 | 0.3225 | 0.0825 | 0.8314 | 0.9567 | 0.0 |
| 0.0317 | 374.55 | 4120 | 0.0913 | 0.6552 | 0.6942 | 0.9781 | nan | 0.9937 | 0.9506 | 0.8978 | 0.9876 | 0.9046 | 0.0 | 0.8750 | 0.9771 | 0.5074 | 0.0674 | 0.8819 | 0.9820 | 0.0 | nan | 0.9806 | 0.8853 | 0.8359 | 0.9769 | 0.8474 | 0.0 | 0.8339 | 0.9697 | 0.3348 | 0.0674 | 0.8287 | 0.9567 | 0.0 |
| 0.0335 | 376.36 | 4140 | 0.0892 | 0.6531 | 0.6911 | 0.9784 | nan | 0.9931 | 0.9536 | 0.9012 | 0.9886 | 0.8983 | 0.0 | 0.8784 | 0.9816 | 0.4736 | 0.0670 | 0.875 | 0.9738 | 0.0 | nan | 0.9812 | 0.8850 | 0.8372 | 0.9771 | 0.8461 | 0.0 | 0.8359 | 0.9741 | 0.3110 | 0.0670 | 0.8223 | 0.9534 | 0.0 |
| 0.0338 | 378.18 | 4160 | 0.0899 | 0.6529 | 0.6907 | 0.9784 | nan | 0.9934 | 0.9485 | 0.9079 | 0.9888 | 0.8969 | 0.0 | 0.8861 | 0.9820 | 0.4447 | 0.0649 | 0.8862 | 0.9798 | 0.0 | nan | 0.9810 | 0.8858 | 0.8371 | 0.9770 | 0.8447 | 0.0 | 0.8382 | 0.9741 | 0.2970 | 0.0648 | 0.8329 | 0.9549 | 0.0 |
| 0.0323 | 380.0 | 4180 | 0.0900 | 0.6547 | 0.6924 | 0.9783 | nan | 0.9942 | 0.9491 | 0.8930 | 0.9880 | 0.9082 | 0.0 | 0.8716 | 0.9787 | 0.4915 | 0.0729 | 0.8794 | 0.9747 | 0.0 | nan | 0.9807 | 0.8866 | 0.8367 | 0.9769 | 0.8465 | 0.0 | 0.8359 | 0.9719 | 0.3229 | 0.0728 | 0.8265 | 0.9543 | 0.0 |
| 0.0327 | 381.82 | 4200 | 0.0899 | 0.6566 | 0.6959 | 0.9786 | nan | 0.9934 | 0.9514 | 0.8969 | 0.9882 | 0.8983 | 0.0 | 0.8866 | 0.9831 | 0.5002 | 0.0736 | 0.8919 | 0.9833 | 0.0 | nan | 0.9807 | 0.8893 | 0.8349 | 0.9770 | 0.8461 | 0.0 | 0.8380 | 0.9744 | 0.3308 | 0.0736 | 0.8340 | 0.9566 | 0.0 |
| 0.0285 | 383.64 | 4220 | 0.0892 | 0.6563 | 0.6957 | 0.9785 | nan | 0.9935 | 0.9501 | 0.9040 | 0.9881 | 0.8995 | 0.0 | 0.8862 | 0.9811 | 0.4852 | 0.0773 | 0.8977 | 0.9808 | 0.0 | nan | 0.9809 | 0.8873 | 0.8391 | 0.9772 | 0.8460 | 0.0 | 0.8392 | 0.9732 | 0.3195 | 0.0773 | 0.8375 | 0.9549 | 0.0 |
| 0.0314 | 385.45 | 4240 | 0.0886 | 0.6580 | 0.6989 | 0.9786 | nan | 0.9933 | 0.9515 | 0.9045 | 0.9869 | 0.9091 | 0.0 | 0.8815 | 0.9816 | 0.5215 | 0.0820 | 0.8899 | 0.9844 | 0.0 | nan | 0.9809 | 0.8891 | 0.8376 | 0.9770 | 0.8460 | 0.0 | 0.8372 | 0.9734 | 0.3410 | 0.0820 | 0.8349 | 0.9553 | 0.0 |
| 0.0292 | 387.27 | 4260 | 0.0879 | 0.6573 | 0.6973 | 0.9785 | nan | 0.9933 | 0.9524 | 0.9039 | 0.9871 | 0.9027 | 0.0 | 0.8883 | 0.9805 | 0.5176 | 0.0783 | 0.8817 | 0.9790 | 0.0 | nan | 0.9808 | 0.8876 | 0.8356 | 0.9771 | 0.8463 | 0.0 | 0.8337 | 0.9730 | 0.3426 | 0.0783 | 0.8335 | 0.9565 | 0.0 |
| 0.0263 | 389.09 | 4280 | 0.0900 | 0.6558 | 0.6943 | 0.9784 | nan | 0.9930 | 0.9508 | 0.9095 | 0.9886 | 0.9007 | 0.0 | 0.8757 | 0.9790 | 0.5073 | 0.0704 | 0.8782 | 0.9732 | 0.0 | nan | 0.9807 | 0.8873 | 0.8358 | 0.9771 | 0.8462 | 0.0 | 0.8364 | 0.9721 | 0.3355 | 0.0704 | 0.8301 | 0.9540 | 0.0 |
| 0.0335 | 390.91 | 4300 | 0.0896 | 0.6550 | 0.6922 | 0.9782 | nan | 0.9934 | 0.9506 | 0.9049 | 0.9882 | 0.9042 | 0.0 | 0.8791 | 0.9779 | 0.4725 | 0.0670 | 0.8873 | 0.9742 | 0.0 | nan | 0.9807 | 0.8853 | 0.8355 | 0.9771 | 0.8457 | 0.0 | 0.8370 | 0.9712 | 0.3148 | 0.0670 | 0.8454 | 0.9547 | 0.0 |
| 0.0314 | 392.73 | 4320 | 0.0894 | 0.6561 | 0.6949 | 0.9784 | nan | 0.9934 | 0.9523 | 0.9054 | 0.9874 | 0.9041 | 0.0 | 0.8804 | 0.9789 | 0.4871 | 0.0738 | 0.8963 | 0.9752 | 0.0 | nan | 0.9809 | 0.8877 | 0.8381 | 0.9772 | 0.8456 | 0.0 | 0.8378 | 0.9716 | 0.3196 | 0.0737 | 0.8418 | 0.9550 | 0.0 |
| 0.0293 | 394.55 | 4340 | 0.0911 | 0.6543 | 0.6930 | 0.9779 | nan | 0.9928 | 0.9519 | 0.9088 | 0.9885 | 0.8981 | 0.0 | 0.8891 | 0.9770 | 0.4580 | 0.0701 | 0.8956 | 0.9785 | 0.0 | nan | 0.9808 | 0.8876 | 0.8346 | 0.9772 | 0.8457 | 0.0 | 0.8398 | 0.9696 | 0.3017 | 0.0700 | 0.8432 | 0.9556 | 0.0 |
| 0.0314 | 396.36 | 4360 | 0.0906 | 0.6577 | 0.6981 | 0.9784 | nan | 0.9933 | 0.9486 | 0.9033 | 0.9886 | 0.8972 | 0.0 | 0.8909 | 0.9810 | 0.5145 | 0.0712 | 0.9100 | 0.9762 | 0.0 | nan | 0.9806 | 0.8900 | 0.8343 | 0.9770 | 0.8468 | 0.0 | 0.8404 | 0.9727 | 0.3350 | 0.0711 | 0.8471 | 0.9555 | 0.0 |
| 0.0276 | 398.18 | 4380 | 0.0885 | 0.6560 | 0.6956 | 0.9784 | nan | 0.9934 | 0.9532 | 0.9007 | 0.9879 | 0.9029 | 0.0 | 0.8802 | 0.9795 | 0.4981 | 0.0738 | 0.9038 | 0.9688 | 0.0 | nan | 0.9807 | 0.8886 | 0.8346 | 0.9771 | 0.8465 | 0.0 | 0.8377 | 0.9732 | 0.3308 | 0.0738 | 0.8337 | 0.9507 | 0.0 |
| 0.0433 | 400.0 | 4400 | 0.0883 | 0.6569 | 0.6967 | 0.9787 | nan | 0.9937 | 0.9519 | 0.8931 | 0.9879 | 0.8994 | 0.0 | 0.8881 | 0.9838 | 0.5089 | 0.0679 | 0.9059 | 0.9762 | 0.0 | nan | 0.9810 | 0.8898 | 0.8358 | 0.9771 | 0.8463 | 0.0 | 0.8402 | 0.9746 | 0.3333 | 0.0679 | 0.8400 | 0.9544 | 0.0 |
| 0.026 | 401.82 | 4420 | 0.0883 | 0.6586 | 0.6991 | 0.9786 | nan | 0.9934 | 0.9498 | 0.8983 | 0.9893 | 0.8958 | 0.0 | 0.8897 | 0.9807 | 0.5383 | 0.0728 | 0.9022 | 0.9779 | 0.0 | nan | 0.9809 | 0.8894 | 0.8358 | 0.9769 | 0.8438 | 0.0 | 0.8400 | 0.9732 | 0.3489 | 0.0728 | 0.8446 | 0.9553 | 0.0 |
| 0.0253 | 403.64 | 4440 | 0.0891 | 0.6568 | 0.6952 | 0.9787 | nan | 0.9934 | 0.9536 | 0.8991 | 0.9880 | 0.9018 | 0.0 | 0.8729 | 0.9824 | 0.5137 | 0.0669 | 0.8936 | 0.9717 | 0.0 | nan | 0.9811 | 0.8880 | 0.8380 | 0.9772 | 0.8499 | 0.0 | 0.8359 | 0.9742 | 0.3392 | 0.0669 | 0.8348 | 0.9534 | 0.0 |
| 0.0316 | 405.45 | 4460 | 0.0884 | 0.6583 | 0.6978 | 0.9786 | nan | 0.9935 | 0.9504 | 0.8962 | 0.9885 | 0.9038 | 0.0 | 0.8849 | 0.9820 | 0.5136 | 0.0744 | 0.9034 | 0.9813 | 0.0 | nan | 0.9809 | 0.8885 | 0.8367 | 0.9771 | 0.8492 | 0.0 | 0.8377 | 0.9741 | 0.3381 | 0.0744 | 0.8445 | 0.9567 | 0.0 |
| 0.0289 | 407.27 | 4480 | 0.0873 | 0.6626 | 0.7040 | 0.9789 | nan | 0.9936 | 0.9521 | 0.8951 | 0.9882 | 0.9000 | 0.0 | 0.8821 | 0.9815 | 0.5965 | 0.0688 | 0.9146 | 0.9791 | 0.0 | nan | 0.9807 | 0.8934 | 0.8348 | 0.9773 | 0.8493 | 0.0 | 0.8326 | 0.9736 | 0.3945 | 0.0687 | 0.8525 | 0.9568 | 0.0 |
| 0.0279 | 409.09 | 4500 | 0.0897 | 0.6571 | 0.6958 | 0.9784 | nan | 0.9937 | 0.9514 | 0.9000 | 0.9882 | 0.9030 | 0.0 | 0.8854 | 0.9780 | 0.4876 | 0.0783 | 0.8988 | 0.9809 | 0.0 | nan | 0.9808 | 0.8870 | 0.8384 | 0.9772 | 0.8485 | 0.0 | 0.8357 | 0.9713 | 0.3224 | 0.0783 | 0.8466 | 0.9565 | 0.0 |
| 0.0334 | 410.91 | 4520 | 0.0883 | 0.6567 | 0.6956 | 0.9785 | nan | 0.9938 | 0.9516 | 0.8941 | 0.9877 | 0.9049 | 0.0 | 0.8795 | 0.9807 | 0.4953 | 0.0769 | 0.8940 | 0.9850 | 0.0 | nan | 0.9808 | 0.8889 | 0.8353 | 0.9773 | 0.8494 | 0.0 | 0.8382 | 0.9730 | 0.3265 | 0.0769 | 0.8401 | 0.9509 | 0.0 |
| 0.0332 | 412.73 | 4540 | 0.0897 | 0.6563 | 0.6943 | 0.9783 | nan | 0.9940 | 0.9489 | 0.8932 | 0.9887 | 0.9030 | 0.0 | 0.8786 | 0.9776 | 0.5089 | 0.0657 | 0.8888 | 0.9784 | 0.0 | nan | 0.9806 | 0.8895 | 0.8348 | 0.9772 | 0.8490 | 0.0 | 0.8359 | 0.9704 | 0.3372 | 0.0657 | 0.8377 | 0.9541 | 0.0 |
| 0.0295 | 414.55 | 4560 | 0.0883 | 0.6579 | 0.6977 | 0.9787 | nan | 0.9938 | 0.9488 | 0.8977 | 0.9877 | 0.9076 | 0.0 | 0.8925 | 0.9830 | 0.5121 | 0.0737 | 0.9036 | 0.9695 | 0.0 | nan | 0.9808 | 0.8899 | 0.8360 | 0.9773 | 0.8507 | 0.0 | 0.8381 | 0.9740 | 0.3350 | 0.0737 | 0.8462 | 0.9510 | 0.0 |
| 0.0323 | 416.36 | 4580 | 0.0896 | 0.6547 | 0.6906 | 0.9785 | nan | 0.9938 | 0.9527 | 0.8956 | 0.9880 | 0.9009 | 0.0 | 0.8724 | 0.9819 | 0.4765 | 0.0660 | 0.8832 | 0.9675 | 0.0 | nan | 0.9809 | 0.8872 | 0.8367 | 0.9773 | 0.8502 | 0.0 | 0.8352 | 0.9743 | 0.3156 | 0.0660 | 0.8368 | 0.9506 | 0.0 |
| 0.0294 | 418.18 | 4600 | 0.0907 | 0.6562 | 0.6945 | 0.9784 | nan | 0.9931 | 0.9537 | 0.9024 | 0.9884 | 0.8972 | 0.0 | 0.8885 | 0.9795 | 0.4780 | 0.0734 | 0.8981 | 0.9765 | 0.0 | nan | 0.9808 | 0.8870 | 0.8362 | 0.9772 | 0.8491 | 0.0 | 0.8390 | 0.9726 | 0.3134 | 0.0734 | 0.8472 | 0.9546 | 0.0 |
| 0.0308 | 420.0 | 4620 | 0.0938 | 0.6517 | 0.6881 | 0.9776 | nan | 0.9936 | 0.9538 | 0.8984 | 0.9879 | 0.9010 | 0.0 | 0.8842 | 0.9723 | 0.4149 | 0.0759 | 0.8869 | 0.9761 | 0.0 | nan | 0.9809 | 0.8799 | 0.8374 | 0.9772 | 0.8485 | 0.0 | 0.8390 | 0.9661 | 0.2758 | 0.0759 | 0.8388 | 0.9531 | 0.0 |
| 0.0263 | 421.82 | 4640 | 0.0901 | 0.6557 | 0.6935 | 0.9783 | nan | 0.9938 | 0.9481 | 0.8999 | 0.9887 | 0.8988 | 0.0 | 0.8851 | 0.9793 | 0.4946 | 0.0745 | 0.8793 | 0.9738 | 0.0 | nan | 0.9807 | 0.8876 | 0.8363 | 0.9771 | 0.8469 | 0.0 | 0.8378 | 0.9713 | 0.3242 | 0.0745 | 0.8347 | 0.9528 | 0.0 |
| 0.0317 | 423.64 | 4660 | 0.0907 | 0.6552 | 0.6932 | 0.9784 | nan | 0.9937 | 0.9518 | 0.8998 | 0.9879 | 0.9042 | 0.0 | 0.8838 | 0.9788 | 0.4782 | 0.0701 | 0.8853 | 0.9776 | 0.0 | nan | 0.9807 | 0.8878 | 0.8377 | 0.9772 | 0.8500 | 0.0 | 0.8388 | 0.9716 | 0.3154 | 0.0701 | 0.8339 | 0.9541 | 0.0 |
| 0.0289 | 425.45 | 4680 | 0.0905 | 0.6554 | 0.6933 | 0.9785 | nan | 0.9935 | 0.9507 | 0.9022 | 0.9882 | 0.9056 | 0.0 | 0.8812 | 0.9816 | 0.4694 | 0.0709 | 0.8920 | 0.9774 | 0.0 | nan | 0.9808 | 0.8880 | 0.8374 | 0.9773 | 0.8506 | 0.0 | 0.8378 | 0.9740 | 0.3089 | 0.0709 | 0.8399 | 0.9542 | 0.0 |
| 0.0301 | 427.27 | 4700 | 0.0906 | 0.6565 | 0.6944 | 0.9787 | nan | 0.9939 | 0.9513 | 0.8972 | 0.9879 | 0.9032 | 0.0 | 0.8832 | 0.9821 | 0.4820 | 0.0691 | 0.8972 | 0.9804 | 0.0 | nan | 0.9809 | 0.8883 | 0.8383 | 0.9773 | 0.8508 | 0.0 | 0.8386 | 0.9741 | 0.3170 | 0.0691 | 0.8447 | 0.9557 | 0.0 |
| 0.03 | 429.09 | 4720 | 0.0891 | 0.6572 | 0.6956 | 0.9786 | nan | 0.9936 | 0.9468 | 0.9020 | 0.9897 | 0.9002 | 0.0 | 0.8815 | 0.9820 | 0.4854 | 0.0778 | 0.8999 | 0.9842 | 0.0 | nan | 0.9810 | 0.8883 | 0.8382 | 0.9770 | 0.8490 | 0.0 | 0.8388 | 0.9739 | 0.3198 | 0.0778 | 0.8449 | 0.9552 | 0.0 |
| 0.0247 | 430.91 | 4740 | 0.0883 | 0.6576 | 0.6967 | 0.9786 | nan | 0.9938 | 0.9486 | 0.9006 | 0.9883 | 0.9055 | 0.0 | 0.8734 | 0.9803 | 0.5065 | 0.0790 | 0.9007 | 0.9801 | 0.0 | nan | 0.9808 | 0.8887 | 0.8367 | 0.9773 | 0.8494 | 0.0 | 0.8329 | 0.9734 | 0.3325 | 0.0790 | 0.8434 | 0.9549 | 0.0 |
| 0.0281 | 432.73 | 4760 | 0.0887 | 0.6556 | 0.6937 | 0.9785 | nan | 0.9937 | 0.9500 | 0.9008 | 0.9878 | 0.9069 | 0.0 | 0.8808 | 0.9817 | 0.4871 | 0.0760 | 0.8839 | 0.9689 | 0.0 | nan | 0.9808 | 0.8887 | 0.8368 | 0.9772 | 0.8488 | 0.0 | 0.8339 | 0.9741 | 0.3214 | 0.0760 | 0.8345 | 0.9507 | 0.0 |
| 0.0321 | 434.55 | 4780 | 0.0883 | 0.6568 | 0.6955 | 0.9786 | nan | 0.9936 | 0.9500 | 0.8971 | 0.9889 | 0.9018 | 0.0 | 0.8830 | 0.9820 | 0.4973 | 0.0833 | 0.8857 | 0.9788 | 0.0 | nan | 0.9810 | 0.8882 | 0.8374 | 0.9773 | 0.8494 | 0.0 | 0.8353 | 0.9742 | 0.3253 | 0.0833 | 0.8320 | 0.9548 | 0.0 |
| 0.034 | 436.36 | 4800 | 0.0914 | 0.6505 | 0.6863 | 0.9783 | nan | 0.9940 | 0.9510 | 0.8954 | 0.9877 | 0.9105 | 0.0 | 0.8828 | 0.9813 | 0.3914 | 0.0720 | 0.8777 | 0.9788 | 0.0 | nan | 0.9809 | 0.8842 | 0.8375 | 0.9772 | 0.8524 | 0.0 | 0.8356 | 0.9741 | 0.2622 | 0.0720 | 0.8259 | 0.9546 | 0.0 |
| 0.0327 | 438.18 | 4820 | 0.0898 | 0.6535 | 0.6914 | 0.9784 | nan | 0.9935 | 0.9500 | 0.8986 | 0.9884 | 0.9061 | 0.0 | 0.8879 | 0.9820 | 0.4382 | 0.0775 | 0.8825 | 0.9839 | 0.0 | nan | 0.9808 | 0.8863 | 0.8372 | 0.9773 | 0.8508 | 0.0 | 0.8362 | 0.9740 | 0.2890 | 0.0774 | 0.8304 | 0.9559 | 0.0 |
| 0.0291 | 440.0 | 4840 | 0.0909 | 0.6526 | 0.6899 | 0.9783 | nan | 0.9935 | 0.9529 | 0.8975 | 0.9880 | 0.9070 | 0.0 | 0.8857 | 0.9804 | 0.4281 | 0.0777 | 0.875 | 0.9829 | 0.0 | nan | 0.9809 | 0.8852 | 0.8374 | 0.9773 | 0.8519 | 0.0 | 0.8362 | 0.9735 | 0.2839 | 0.0777 | 0.8233 | 0.9563 | 0.0 |
| 0.03 | 441.82 | 4860 | 0.0887 | 0.6592 | 0.7009 | 0.9787 | nan | 0.9931 | 0.9523 | 0.9036 | 0.9872 | 0.9102 | 0.0 | 0.8857 | 0.9810 | 0.5448 | 0.0776 | 0.8926 | 0.9833 | 0.0 | nan | 0.9810 | 0.8889 | 0.8376 | 0.9772 | 0.8518 | 0.0 | 0.8352 | 0.9728 | 0.3540 | 0.0776 | 0.8370 | 0.9564 | 0.0 |
| 0.035 | 443.64 | 4880 | 0.0897 | 0.6553 | 0.6940 | 0.9784 | nan | 0.9934 | 0.9509 | 0.9012 | 0.9883 | 0.9057 | 0.0 | 0.8870 | 0.9801 | 0.4675 | 0.0794 | 0.8832 | 0.9853 | 0.0 | nan | 0.9811 | 0.8854 | 0.8386 | 0.9772 | 0.8514 | 0.0 | 0.8379 | 0.9724 | 0.3075 | 0.0794 | 0.8330 | 0.9551 | 0.0 |
| 0.0277 | 445.45 | 4900 | 0.0897 | 0.6548 | 0.6933 | 0.9784 | nan | 0.9935 | 0.9494 | 0.9042 | 0.9880 | 0.9065 | 0.0 | 0.8859 | 0.9804 | 0.4598 | 0.0798 | 0.8777 | 0.9871 | 0.0 | nan | 0.9809 | 0.8866 | 0.8381 | 0.9772 | 0.8503 | 0.0 | 0.8393 | 0.9724 | 0.3038 | 0.0798 | 0.8299 | 0.9545 | 0.0 |
| 0.0266 | 447.27 | 4920 | 0.0904 | 0.6548 | 0.6928 | 0.9786 | nan | 0.9937 | 0.9500 | 0.8990 | 0.9888 | 0.8999 | 0.0 | 0.8858 | 0.9809 | 0.4745 | 0.0686 | 0.8777 | 0.9874 | 0.0 | nan | 0.9811 | 0.8875 | 0.8384 | 0.9773 | 0.8491 | 0.0 | 0.8383 | 0.9735 | 0.3128 | 0.0686 | 0.8303 | 0.9559 | 0.0 |
| 0.0302 | 449.09 | 4940 | 0.0897 | 0.6555 | 0.6948 | 0.9785 | nan | 0.9933 | 0.9517 | 0.9024 | 0.9884 | 0.9039 | 0.0 | 0.8883 | 0.9801 | 0.4777 | 0.0767 | 0.8849 | 0.9850 | 0.0 | nan | 0.9810 | 0.8870 | 0.8369 | 0.9773 | 0.8492 | 0.0 | 0.8379 | 0.9732 | 0.3130 | 0.0767 | 0.8339 | 0.9560 | 0.0 |
| 0.0291 | 450.91 | 4960 | 0.0899 | 0.6559 | 0.6942 | 0.9786 | nan | 0.9937 | 0.9520 | 0.8954 | 0.9876 | 0.9050 | 0.0 | 0.8856 | 0.9824 | 0.4838 | 0.0790 | 0.8759 | 0.9839 | 0.0 | nan | 0.9810 | 0.8881 | 0.8366 | 0.9773 | 0.8499 | 0.0 | 0.8390 | 0.9744 | 0.3168 | 0.0790 | 0.8278 | 0.9562 | 0.0 |
| 0.0311 | 452.73 | 4980 | 0.0897 | 0.6567 | 0.6952 | 0.9786 | nan | 0.9936 | 0.9498 | 0.9016 | 0.9881 | 0.9043 | 0.0 | 0.8843 | 0.9821 | 0.4884 | 0.0792 | 0.8832 | 0.9824 | 0.0 | nan | 0.9809 | 0.8890 | 0.8366 | 0.9774 | 0.8498 | 0.0 | 0.8408 | 0.9745 | 0.3201 | 0.0792 | 0.8319 | 0.9570 | 0.0 |
| 0.032 | 454.55 | 5000 | 0.0898 | 0.6560 | 0.6950 | 0.9785 | nan | 0.9933 | 0.9510 | 0.9048 | 0.9883 | 0.9044 | 0.0 | 0.8827 | 0.9811 | 0.4748 | 0.0791 | 0.8908 | 0.9847 | 0.0 | nan | 0.9810 | 0.8868 | 0.8371 | 0.9774 | 0.8490 | 0.0 | 0.8395 | 0.9741 | 0.3113 | 0.0791 | 0.8357 | 0.9564 | 0.0 |
| 0.0301 | 456.36 | 5020 | 0.0897 | 0.6568 | 0.6964 | 0.9786 | nan | 0.9933 | 0.9510 | 0.9016 | 0.9885 | 0.9036 | 0.0 | 0.8851 | 0.9815 | 0.5002 | 0.0745 | 0.8906 | 0.9835 | 0.0 | nan | 0.9809 | 0.8880 | 0.8364 | 0.9774 | 0.8498 | 0.0 | 0.8397 | 0.9739 | 0.3268 | 0.0745 | 0.8345 | 0.9563 | 0.0 |
| 0.0302 | 458.18 | 5040 | 0.0909 | 0.6558 | 0.6944 | 0.9785 | nan | 0.9936 | 0.9531 | 0.8982 | 0.9878 | 0.9012 | 0.0 | 0.8810 | 0.9801 | 0.4919 | 0.0759 | 0.8871 | 0.9771 | 0.0 | nan | 0.9809 | 0.8872 | 0.8362 | 0.9774 | 0.8497 | 0.0 | 0.8370 | 0.9729 | 0.3229 | 0.0759 | 0.8295 | 0.9555 | 0.0 |
| 0.0268 | 460.0 | 5060 | 0.0902 | 0.6552 | 0.6937 | 0.9785 | nan | 0.9937 | 0.9522 | 0.8971 | 0.9874 | 0.9076 | 0.0 | 0.8854 | 0.9810 | 0.4867 | 0.0720 | 0.8828 | 0.9728 | 0.0 | nan | 0.9809 | 0.8879 | 0.8362 | 0.9774 | 0.8506 | 0.0 | 0.8383 | 0.9734 | 0.3195 | 0.0720 | 0.8281 | 0.9538 | 0.0 |
| 0.0261 | 461.82 | 5080 | 0.0907 | 0.6558 | 0.6941 | 0.9785 | nan | 0.9938 | 0.9507 | 0.8961 | 0.9880 | 0.9024 | 0.0 | 0.8868 | 0.9812 | 0.4819 | 0.0775 | 0.8873 | 0.9777 | 0.0 | nan | 0.9808 | 0.8879 | 0.8360 | 0.9773 | 0.8503 | 0.0 | 0.8393 | 0.9737 | 0.3158 | 0.0775 | 0.8309 | 0.9558 | 0.0 |
| 0.0347 | 463.64 | 5100 | 0.0905 | 0.6559 | 0.6941 | 0.9785 | nan | 0.9936 | 0.9510 | 0.9008 | 0.9879 | 0.9042 | 0.0 | 0.8840 | 0.9812 | 0.4818 | 0.0767 | 0.8848 | 0.9770 | 0.0 | nan | 0.9809 | 0.8878 | 0.8367 | 0.9774 | 0.8497 | 0.0 | 0.8397 | 0.9735 | 0.3165 | 0.0767 | 0.8320 | 0.9557 | 0.0 |
| 0.0281 | 465.45 | 5120 | 0.0903 | 0.6553 | 0.6931 | 0.9785 | nan | 0.9936 | 0.9515 | 0.8990 | 0.9882 | 0.9007 | 0.0 | 0.8862 | 0.9810 | 0.4698 | 0.0761 | 0.8832 | 0.9812 | 0.0 | nan | 0.9809 | 0.8879 | 0.8367 | 0.9774 | 0.8489 | 0.0 | 0.8395 | 0.9734 | 0.3102 | 0.0761 | 0.8309 | 0.9569 | 0.0 |
| 0.0281 | 467.27 | 5140 | 0.0912 | 0.6530 | 0.6905 | 0.9784 | nan | 0.9938 | 0.9524 | 0.8953 | 0.9874 | 0.9043 | 0.0 | 0.8888 | 0.9821 | 0.4326 | 0.0740 | 0.8855 | 0.9808 | 0.0 | nan | 0.9810 | 0.8860 | 0.8371 | 0.9773 | 0.8491 | 0.0 | 0.8402 | 0.9737 | 0.2860 | 0.0739 | 0.8284 | 0.9562 | 0.0 |
| 0.0266 | 469.09 | 5160 | 0.0904 | 0.6536 | 0.6918 | 0.9784 | nan | 0.9934 | 0.9504 | 0.9047 | 0.9886 | 0.9023 | 0.0 | 0.8893 | 0.9813 | 0.4366 | 0.0790 | 0.8853 | 0.9821 | 0.0 | nan | 0.9810 | 0.8862 | 0.8385 | 0.9774 | 0.8486 | 0.0 | 0.8401 | 0.9736 | 0.2880 | 0.0790 | 0.8285 | 0.9556 | 0.0 |
| 0.0291 | 470.91 | 5180 | 0.0901 | 0.6544 | 0.6928 | 0.9785 | nan | 0.9937 | 0.9506 | 0.8956 | 0.9885 | 0.9016 | 0.0 | 0.8885 | 0.9822 | 0.4679 | 0.0729 | 0.8839 | 0.9817 | 0.0 | nan | 0.9808 | 0.8884 | 0.8352 | 0.9774 | 0.8488 | 0.0 | 0.8374 | 0.9743 | 0.3087 | 0.0729 | 0.8272 | 0.9557 | 0.0 |
| 0.0286 | 472.73 | 5200 | 0.0908 | 0.6546 | 0.6933 | 0.9785 | nan | 0.9937 | 0.9516 | 0.8975 | 0.9880 | 0.9028 | 0.0 | 0.8882 | 0.9813 | 0.4694 | 0.0747 | 0.8841 | 0.9821 | 0.0 | nan | 0.9808 | 0.8879 | 0.8358 | 0.9774 | 0.8495 | 0.0 | 0.8368 | 0.9739 | 0.3096 | 0.0747 | 0.8276 | 0.9559 | 0.0 |
| 0.027 | 474.55 | 5220 | 0.0911 | 0.6541 | 0.6919 | 0.9784 | nan | 0.9937 | 0.9518 | 0.8979 | 0.9881 | 0.9016 | 0.0 | 0.8845 | 0.9794 | 0.4600 | 0.0758 | 0.8791 | 0.9832 | 0.0 | nan | 0.9808 | 0.8872 | 0.8363 | 0.9774 | 0.8485 | 0.0 | 0.8377 | 0.9723 | 0.3046 | 0.0758 | 0.8267 | 0.9561 | 0.0 |
| 0.0266 | 476.36 | 5240 | 0.0914 | 0.6539 | 0.6919 | 0.9782 | nan | 0.9935 | 0.9516 | 0.8994 | 0.9886 | 0.9007 | 0.0 | 0.8842 | 0.9782 | 0.4591 | 0.0769 | 0.8798 | 0.9830 | 0.0 | nan | 0.9809 | 0.8872 | 0.8362 | 0.9773 | 0.8478 | 0.0 | 0.8380 | 0.9711 | 0.3038 | 0.0769 | 0.8260 | 0.9558 | 0.0 |
| 0.0304 | 478.18 | 5260 | 0.0911 | 0.6547 | 0.6934 | 0.9783 | nan | 0.9935 | 0.9504 | 0.9022 | 0.9887 | 0.9016 | 0.0 | 0.8819 | 0.9791 | 0.4703 | 0.0748 | 0.8878 | 0.9836 | 0.0 | nan | 0.9809 | 0.8878 | 0.8367 | 0.9773 | 0.8483 | 0.0 | 0.8381 | 0.9719 | 0.3096 | 0.0748 | 0.8297 | 0.9559 | 0.0 |
| 0.025 | 480.0 | 5280 | 0.0909 | 0.6549 | 0.6934 | 0.9784 | nan | 0.9936 | 0.9502 | 0.8995 | 0.9887 | 0.9008 | 0.0 | 0.8883 | 0.9801 | 0.4674 | 0.0775 | 0.8901 | 0.9786 | 0.0 | nan | 0.9809 | 0.8878 | 0.8359 | 0.9773 | 0.8486 | 0.0 | 0.8363 | 0.9725 | 0.3080 | 0.0775 | 0.8334 | 0.9556 | 0.0 |
| 0.0353 | 481.82 | 5300 | 0.0905 | 0.6550 | 0.6933 | 0.9785 | nan | 0.9936 | 0.9504 | 0.9002 | 0.9882 | 0.9028 | 0.0 | 0.8831 | 0.9811 | 0.4723 | 0.0766 | 0.8857 | 0.9787 | 0.0 | nan | 0.9808 | 0.8884 | 0.8358 | 0.9773 | 0.8495 | 0.0 | 0.8363 | 0.9732 | 0.3109 | 0.0765 | 0.8305 | 0.9560 | 0.0 |
| 0.0249 | 483.64 | 5320 | 0.0911 | 0.6549 | 0.6932 | 0.9785 | nan | 0.9938 | 0.9509 | 0.8972 | 0.9879 | 0.9033 | 0.0 | 0.8855 | 0.9805 | 0.4706 | 0.0773 | 0.8855 | 0.9790 | 0.0 | nan | 0.9808 | 0.8881 | 0.8359 | 0.9774 | 0.8493 | 0.0 | 0.8362 | 0.9730 | 0.3098 | 0.0773 | 0.8308 | 0.9556 | 0.0 |
| 0.0281 | 485.45 | 5340 | 0.0908 | 0.6557 | 0.6943 | 0.9785 | nan | 0.9935 | 0.9506 | 0.9025 | 0.9881 | 0.9042 | 0.0 | 0.8838 | 0.9810 | 0.4763 | 0.0768 | 0.8897 | 0.9796 | 0.0 | nan | 0.9809 | 0.8886 | 0.8361 | 0.9774 | 0.8497 | 0.0 | 0.8384 | 0.9735 | 0.3130 | 0.0767 | 0.8341 | 0.9559 | 0.0 |
| 0.0299 | 487.27 | 5360 | 0.0904 | 0.6553 | 0.6936 | 0.9785 | nan | 0.9937 | 0.9511 | 0.8977 | 0.9882 | 0.9044 | 0.0 | 0.8839 | 0.9808 | 0.4698 | 0.0783 | 0.8865 | 0.9817 | 0.0 | nan | 0.9809 | 0.8881 | 0.8364 | 0.9774 | 0.8495 | 0.0 | 0.8387 | 0.9735 | 0.3096 | 0.0783 | 0.8304 | 0.9560 | 0.0 |
| 0.0259 | 489.09 | 5380 | 0.0909 | 0.6556 | 0.6937 | 0.9786 | nan | 0.9936 | 0.9517 | 0.8974 | 0.9880 | 0.9028 | 0.0 | 0.8853 | 0.9819 | 0.4740 | 0.0788 | 0.8830 | 0.9821 | 0.0 | nan | 0.9809 | 0.8883 | 0.8365 | 0.9774 | 0.8502 | 0.0 | 0.8386 | 0.9740 | 0.3115 | 0.0788 | 0.8302 | 0.9562 | 0.0 |
| 0.0295 | 490.91 | 5400 | 0.0907 | 0.6559 | 0.6947 | 0.9786 | nan | 0.9936 | 0.9511 | 0.8984 | 0.9876 | 0.9044 | 0.0 | 0.8862 | 0.9827 | 0.4753 | 0.0797 | 0.8885 | 0.9832 | 0.0 | nan | 0.9809 | 0.8886 | 0.8364 | 0.9774 | 0.8509 | 0.0 | 0.8381 | 0.9741 | 0.3122 | 0.0797 | 0.8321 | 0.9563 | 0.0 |
| 0.0279 | 492.73 | 5420 | 0.0906 | 0.6556 | 0.6938 | 0.9786 | nan | 0.9937 | 0.9499 | 0.8987 | 0.9881 | 0.9055 | 0.0 | 0.8843 | 0.9825 | 0.4716 | 0.0797 | 0.8842 | 0.9816 | 0.0 | nan | 0.9809 | 0.8886 | 0.8362 | 0.9773 | 0.8508 | 0.0 | 0.8381 | 0.9741 | 0.3102 | 0.0797 | 0.8314 | 0.9560 | 0.0 |
| 0.0302 | 494.55 | 5440 | 0.0907 | 0.6553 | 0.6936 | 0.9786 | nan | 0.9937 | 0.9494 | 0.8983 | 0.9886 | 0.9011 | 0.0 | 0.8876 | 0.9827 | 0.4655 | 0.0779 | 0.8912 | 0.9807 | 0.0 | nan | 0.9809 | 0.8885 | 0.8360 | 0.9773 | 0.8502 | 0.0 | 0.8378 | 0.9742 | 0.3067 | 0.0778 | 0.8337 | 0.9560 | 0.0 |
| 0.0295 | 496.36 | 5460 | 0.0911 | 0.6555 | 0.6941 | 0.9785 | nan | 0.9936 | 0.9507 | 0.8991 | 0.9883 | 0.9043 | 0.0 | 0.8856 | 0.9810 | 0.4712 | 0.0776 | 0.8913 | 0.9802 | 0.0 | nan | 0.9809 | 0.8882 | 0.8363 | 0.9773 | 0.8500 | 0.0 | 0.8383 | 0.9737 | 0.3099 | 0.0776 | 0.8335 | 0.9565 | 0.0 |
| 0.0281 | 498.18 | 5480 | 0.0905 | 0.6552 | 0.6930 | 0.9785 | nan | 0.9937 | 0.9511 | 0.8973 | 0.9879 | 0.9058 | 0.0 | 0.8833 | 0.9819 | 0.4603 | 0.0788 | 0.8887 | 0.9799 | 0.0 | nan | 0.9808 | 0.8883 | 0.8360 | 0.9773 | 0.8506 | 0.0 | 0.8378 | 0.9741 | 0.3048 | 0.0788 | 0.8325 | 0.9564 | 0.0 |
| 0.0327 | 500.0 | 5500 | 0.0905 | 0.6547 | 0.6922 | 0.9785 | nan | 0.9937 | 0.9505 | 0.8967 | 0.9886 | 0.9017 | 0.0 | 0.8836 | 0.9814 | 0.4644 | 0.0731 | 0.8837 | 0.9814 | 0.0 | nan | 0.9808 | 0.8881 | 0.8358 | 0.9774 | 0.8498 | 0.0 | 0.8385 | 0.9740 | 0.3068 | 0.0731 | 0.8306 | 0.9563 | 0.0 |

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
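For inference, the checkpoint can be used with the standard `transformers` image-segmentation pipeline; a minimal sketch (the repository id below is a placeholder for this checkpoint, and `example.jpg` stands in for any RGB terrain image):

```python
from transformers import pipeline

# Placeholder repository id; substitute this checkpoint's actual Hub id.
segmenter = pipeline("image-segmentation", model="<this-checkpoint>")

# For semantic segmentation the pipeline returns one entry per predicted
# class, each with a PIL mask and a label from the terrain classes below.
for prediction in segmenter("example.jpg"):
    print(prediction["label"], prediction["mask"].size)
```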
[ "unlabeled", "nat", "concrete", "grass", "speedway bricks", "steel", "rough concrete", "dark bricks", "road", "rough red sidewalk", "tiles", "red bricks", "concrete tiles", "rest" ]
sam1120/safety-utcustom-terrain-b0-optim
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# safety-utcustom-terrain-b0-optim

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/safety-utcustom-terrain-jackal-full-391 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0837
- Mean Iou: 0.6719
- Mean Accuracy: 0.7168
- Overall Accuracy: 0.9802
- Accuracy Unlabeled: nan
- Accuracy Nat: 0.9934
- Accuracy Concrete: 0.9531
- Accuracy Grass: 0.9075
- Accuracy Speedway bricks: 0.9886
- Accuracy Steel: 0.9103
- Accuracy Rough concrete: 0.0
- Accuracy Dark bricks: 0.9022
- Accuracy Road: 0.9844
- Accuracy Rough red sidewalk: 0.6933
- Accuracy Tiles: 0.1029
- Accuracy Red bricks: 0.8981
- Accuracy Concrete tiles: 0.9850
- Accuracy Rest: 0.0
- Iou Unlabeled: nan
- Iou Nat: 0.9816
- Iou Concrete: 0.8989
- Iou Grass: 0.8466
- Iou Speedway bricks: 0.9782
- Iou Steel: 0.8578
- Iou Rough concrete: 0.0
- Iou Dark bricks: 0.8371
- Iou Road: 0.9778
- Iou Rough red sidewalk: 0.4583
- Iou Tiles: 0.1025
- Iou Red bricks: 0.8433
- Iou Concrete tiles: 0.9527
- Iou Rest: 0.0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 500

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Nat | Accuracy Concrete | Accuracy Grass | Accuracy Speedway bricks | Accuracy Steel | Accuracy Rough concrete | Accuracy Dark bricks | Accuracy Road | Accuracy Rough red sidewalk | Accuracy Tiles | Accuracy Red bricks | Accuracy Concrete tiles | Accuracy Rest | Iou Unlabeled | Iou Nat | Iou Concrete | Iou Grass | Iou Speedway bricks | Iou Steel | Iou Rough concrete | Iou Dark bricks | Iou Road | Iou Rough red sidewalk | Iou Tiles | Iou Red bricks | Iou Concrete tiles | Iou Rest |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------:|:-----------------:|:--------------:|:------------------------:|:--------------:|:-----------------------:|:--------------------:|:-------------:|:---------------------------:|:--------------:|:-------------------:|:-----------------------:|:-------------:|:-------------:|:-------:|:------------:|:---------:|:-------------------:|:---------:|:------------------:|:---------------:|:--------:|:----------------------:|:---------:|:--------------:|:------------------:|:--------:|
| 2.6746 | 1.82 | 20 | 2.6474 | 0.0082 | 0.0318 | 0.0233 | nan | 0.0191 | 0.0010 | 0.0080 | 0.0313 | 0.0084 | 0.0340 | 0.0933 | 0.0573 | 0.0600 | 0.0 | 0.0087 | 0.0918 | 0.0 | 0.0 | 0.0189 | 0.0009 | 0.0061 | 0.0310 | 0.0017 | 0.0043 | 0.0019 | 0.0466 | 0.0014 | 0.0 | 0.0000 | 0.0027 | 0.0 |
| 2.4385 | 3.64 | 40 | 2.4373 | 0.1118 | 0.1701 | 0.4471 | nan | 0.4479 | 0.0436 | 0.0175 | 0.7969 | 0.0050 | 0.2950 | 0.0727 | 0.3749 | 0.0548 | 0.0 | 0.0 | 0.1028 | 0.0 | 0.0 | 0.4473 | 0.0400 | 0.0106 | 0.7166 | 0.0016 | 0.0490 | 0.0073 | 0.2852 | 0.0037 | 0.0 | 0.0 | 0.0041 | 0.0 |
| 1.9773 | 5.45 | 60 | 1.9161 | 0.1721 | 0.2306 | 0.8010 | nan | 0.8990 | 0.2451 | 0.0716 | 0.9788 | 0.0 | 0.0037 | 0.0093 | 0.7720 | 0.0100 | 0.0010 | 0.0 | 0.0072 | 0.0 | 0.0 | 0.8940 | 0.1920 | 0.0704 | 0.6999 | 0.0 | 0.0013 | 0.0053 | 0.5406 | 0.0038 | 0.0003 | 0.0 | 0.0014 | 0.0 |
| 1.6038 | 7.27 | 80 | 1.4905 | 0.2074 | 0.2714 | 0.8715 | nan | 0.9470 | 0.7309 | 0.0295 | 0.9800 | 0.0 | 0.0 | 0.0 | 0.8218 | 0.0020 | 0.0 | 0.0154 | 0.0014 | 0.0 | 0.0 | 0.9376 | 0.4290 | 0.0291 | 0.7987 | 0.0 | 0.0 | 0.0 | 0.7050 | 0.0010 | 0.0 | 0.0026 | 0.0011 | 0.0 |
| 1.4392 | 9.09 | 100 | 1.2886 | 0.2165 | 0.2812 | 0.8830 | nan | 0.9505 | 0.7619 | 0.0759 | 0.9805 | 0.0003 | 0.0 | 0.0 | 0.8848 | 0.0009 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.9410 | 0.4459 | 0.0671 | 0.8220 | 0.0003 | 0.0 | 0.0 | 0.7530 | 0.0007 | 0.0 | 0.0004 | 0.0 | 0.0 |
| 1.1951 | 10.91 | 120 | 1.1216 | 0.2496 | 0.2954 | 0.8964 | nan | 0.9539 | 0.8772 | 0.1407 | 0.9814 | 0.0002 | 0.0 | 0.0 | 0.8862 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9437 | 0.5138 | 0.1163 | 0.8450 | 0.0002 | 0.0 | 0.0 | 0.8261 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.0666 | 12.73 | 140 | 0.9632 | 0.2684 | 0.3165 | 0.9090 | nan | 0.9561 | 0.8870 | 0.3516 | 0.9750 | 0.0 | 0.0 | 0.0 | 0.9451 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9454 | 0.6040 | 0.2726 | 0.8607 | 0.0 | 0.0 | 0.0 | 0.8060 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.9858 | 14.55 | 160 | 0.8469 | 0.2728 | 0.3224 | 0.9065 | nan | 0.9542 | 0.9108 | 0.4874 | 0.9734 | 0.0 | 0.0 | 0.0 | 0.8654 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9443 | 0.5658 | 0.3575 | 0.8612 | 0.0 | 0.0 | 0.0 | 0.8180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.8171 | 16.36 | 180 | 0.7246 | 0.2957 | 0.3457 | 0.9223 | nan | 0.9581 | 0.8947 | 0.7176 | 0.9772 | 0.0 | 0.0 | 0.0 | 0.9461 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9460 | 0.6849 | 0.4837 | 0.8639 | 0.0 | 0.0 | 0.0 | 0.8659 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.7362 | 18.18 | 200 | 0.6130 | 0.2991 | 0.3465 | 0.9243 | nan | 0.9589 | 0.8930 | 0.7201 | 0.9858 | 0.0 | 0.0 | 0.0 | 0.9468 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9477 | 0.6862 | 0.5314 | 0.8618 | 0.0 | 0.0 | 0.0 | 0.8610 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.608 | 20.0 | 220 | 0.5256 | 0.3038 | 0.3528 | 0.9270 | nan | 0.9587 | 0.8941 | 0.7906 | 0.9821 | 0.0 | 0.0 | 0.0 | 0.9609 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9479 | 0.7103 | 0.5579 | 0.8661 | 0.0 | 0.0 | 0.0 | 0.8666 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.5261 | 21.82 | 240 | 0.4744 | 0.3002 | 0.3495 | 0.9231 | nan | 0.9583 | 0.9322 | 0.7455 | 0.9563 | 0.0 | 0.0 | 0.0 | 0.9518 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.9485 | 0.6478 | 0.5857 | 0.8714 | 0.0 | 0.0 | 0.0 | 0.8491 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.4777 | 23.64 | 260 | 0.4129 | 0.3190 | 0.3610 | 0.9328 | nan | 0.9675 | 0.8889 | 0.7898 | 0.9889 | 0.0010 | 0.0 | 0.0 | 0.9544 | 0.0 | 0.0 | 0.0 | 0.1025 | 0.0 | nan | 0.9546 | 0.7252 | 0.6226 | 0.8638 | 0.0010 | 0.0 | 0.0 | 0.8778 | 0.0 | 0.0 | 0.0 | 0.1025 | 0.0 |
| 0.3775 | 25.45 | 280 | 0.3503 | 0.3733 | 0.4146 | 0.9392 | nan | 0.9730 | 0.8723 | 0.8075 | 0.9868 | 0.0143 | 0.0 | 0.0 | 0.9740 | 0.0 | 0.0 | 0.0 | 0.7616 | 0.0 | nan | 0.9570 | 0.7644 | 0.6186 | 0.8805 | 0.0143 | 0.0 | 0.0 | 0.8827 | 0.0 | 0.0 | 0.0 | 0.7351 | 0.0 |
| 0.3272 | 27.27 | 300 | 0.3120 | 0.3744 | 0.4223 | 0.9411 | nan | 0.9755 | 0.8942 | 0.7730 | 0.9889 | 0.1235 | 0.0 | 0.0 | 0.9491 | 0.0 | 0.0 | 0.0 | 0.7856 | 0.0 | nan | 0.9595 | 0.7433 | 0.6412 | 0.8828 | 0.1230 | 0.0 | 0.0 | 0.9023 | 0.0 | 0.0 | 0.0 | 0.6157 | 0.0 |
| 0.2718 | 29.09 | 320 | 0.2822 | 0.3720 | 0.4464 | 0.9434 | nan | 0.9826 | 0.8046 | 0.7965 | 0.9829 | 0.3906 | 0.0 | 0.0 | 0.9698 | 0.0 | 0.0 | 0.0 | 0.8759 | 0.0 | nan | 0.9586 | 0.7107 | 0.6444 | 0.9193 | 0.3835 | 0.0 | 0.0 | 0.9171 | 0.0 | 0.0 | 0.0 | 0.3028 | 0.0 |
| 0.286 | 30.91 | 340 | 0.2324 | 0.4243 | 0.4613 | 0.9509 | nan | 0.9777 | 0.9108 | 0.8607 | 0.9874 | 0.4738 | 0.0 | 0.0 | 0.9504 | 0.0006 | 0.0 | 0.0 | 0.8355 | 0.0 | nan | 0.9621 | 0.7654 | 0.6784 | 0.9188 | 0.4559 | 0.0 | 0.0 | 0.9240 | 0.0006 | 0.0 | 0.0 | 0.8105 | 0.0 |
| 0.2553 | 32.73 | 360 | 0.2106 | 0.4457 | 0.4820 | 0.9569 | nan | 0.9864 | 0.9002 | 0.8603 | 0.9764 | 0.6370 | 0.0 | 0.0 | 0.9631 | 0.0265 | 0.0 | 0.0 | 0.9166 | 0.0 | nan | 0.9641 | 0.8023 | 0.6887 | 0.9343 | 0.5962 | 0.0 | 0.0 | 0.9319 | 0.0263 | 0.0 | 0.0 | 0.8506 | 0.0 |
| 0.2305 | 34.55 | 380 | 0.2024 | 0.4598 | 0.5072 | 0.9591 | nan | 0.9855 | 0.8977 | 0.8742 | 0.9746 | 0.7391 | 0.0 | 0.0449 | 0.9721 | 0.1409 | 0.0 | 0.0 | 0.9640 | 0.0 | nan | 0.9654 | 0.8089 | 0.7313 | 0.9375 | 0.6287 | 0.0 | 0.0449 | 0.9403 | 0.1125 | 0.0 | 0.0 | 0.8073 | 0.0 |
| 0.2019 | 36.36 | 400 | 0.1884 | 0.4735 | 0.5088 | 0.9576 | nan | 0.9837 | 0.9404 | 0.8578 | 0.9750 | 0.7744 | 0.0 | 0.1972 | 0.9301 | 0.1264 | 0.0 | 0.0 | 0.8294 | 0.0 | nan | 0.9671 | 0.7637 | 0.7586 | 0.9438 | 0.6910 | 0.0 | 0.1947 | 0.9199 | 0.0911 | 0.0 | 0.0 | 0.8252 | 0.0 |
| 0.1776 | 38.18 | 420 | 0.1694 | 0.5130 | 0.5541 | 0.9625 | nan | 0.9882 | 0.9233 | 0.8351 | 0.9685 | 0.7703 | 0.0 | 0.4815 | 0.9704 | 0.2915 | 0.0 | 0.0 | 0.9741 | 0.0 | nan | 0.9677 | 0.8077 | 0.7490 | 0.9483 | 0.6777 | 0.0 | 0.4750 | 0.9414 | 0.2156 | 0.0 | 0.0 | 0.8867 | 0.0 |
| 0.1435 | 40.0 | 440 | 0.1586 | 0.4931 | 0.5359 | 0.9632 | nan | 0.9841 | 0.9306 | 0.8747 | 0.9802 | 0.8193 | 0.0 | 0.1049 | 0.9651 | 0.3851 | 0.0 | 0.0 | 0.9227 | 0.0 | nan | 0.9690 | 0.8176 | 0.7707 | 0.9438 | 0.7045 | 0.0 | 0.1047 | 0.9477 | 0.2692 | 0.0 | 0.0 | 0.8828 | 0.0 |
| 0.1526 | 41.82 | 460 | 0.1500 | 0.4926 | 0.5411 | 0.9632 | nan | 0.9863 | 0.9281 | 0.8375 | 0.9843 | 0.7994 | 0.0 | 0.1830 | 0.9556 | 0.3669 | 0.0 | 0.0 | 0.9928 | 0.0 | nan | 0.9699 | 0.8065 | 0.7651 | 0.9486 | 0.7195 | 0.0 | 0.1819 | 0.9413 | 0.2536 | 0.0 | 0.0 | 0.8173 | 0.0 |
| 0.1364 | 43.64 | 480 | 0.1412 | 0.4986 | 0.5333 | 0.9638 | nan | 0.9885 | 0.9233 | 0.8367 | 0.9873 | 0.7098 | 0.0 | 0.1723 | 0.9590 | 0.4377 | 0.0 | 0.0 | 0.9190 | 0.0 | nan | 0.9701 | 0.8276 | 0.7724 | 0.9432 | 0.6638 | 0.0 | 0.1693 | 0.9352 | 0.3003 | 0.0 | 0.0 | 0.8995 | 0.0 |
| 0.1252 | 45.45 | 500 | 0.1320 | 0.5279 | 0.5713 | 0.9677 | nan | 0.9867 | 0.9338 | 0.8676 | 0.9817 | 0.8060 | 0.0 | 0.4072 | 0.9800 | 0.5415 | 0.0 | 0.0 | 0.9221 | 0.0 | nan | 0.9714 | 0.8490 | 0.7790 | 0.9536 | 0.7239 | 0.0 | 0.3961 | 0.9524 | 0.3482 | 0.0 | 0.0 | 0.8891 | 0.0 |
| 0.1178 | 47.27 | 520 | 0.1296 | 0.5515 | 0.5916 | 0.9682 | nan | 0.9903 | 0.9303 | 0.8182 | 0.9874 | 0.7923 | 0.0 | 0.7099 | 0.9629 | 0.5105 | 0.0 | 0.0234 | 0.9654 | 0.0 | nan | 0.9715 | 0.8439 | 0.7604 | 0.9570 | 0.7327 | 0.0 | 0.6813 | 0.9529 | 0.3407 | 0.0 | 0.0234 | 0.9061 | 0.0 |
| 0.1182 | 49.09 | 540 | 0.1293 | 0.5547 | 0.6058 | 0.9679 | nan | 0.9889 | 0.9476 | 0.8475 | 0.9691 | 0.8630 | 0.0 | 0.7153 | 0.9674 | 0.5159 | 0.0 | 0.0671 | 0.9935 | 0.0 | nan | 0.9727 | 0.8320 | 0.7759 | 0.9560 | 0.7409 | 0.0 | 0.6893 | 0.9536 | 0.3405 | 0.0 | 0.0671 | 0.8837 | 0.0 |
| 0.1342 | 50.91 | 560 | 0.1255 | 0.5470 | 0.5826 | 0.9684 | nan | 0.9915 | 0.9201 | 0.8231 | 0.9884 | 0.8093 | 0.0 | 0.7222 | 0.9656 | 0.3527 | 0.0 | 0.0179 | 0.9833 | 0.0 | nan | 0.9718 | 0.8449 | 0.7689 | 0.9559 | 0.7430 | 0.0 | 0.6927 | 0.9544 | 0.2462 | 0.0 | 0.0179 | 0.9157 | 0.0 |
| 0.1106 | 52.73 | 580 | 0.1194 | 0.5864 | 0.6360 | 0.9704 | nan | 0.9910 | 0.9139 | 0.8617 | 0.9853 | 0.8297 | 0.0 | 0.7634 | 0.9736 | 0.5821 | 0.0 | 0.375 | 0.9930 | 0.0 | nan | 0.9729 | 0.8589 | 0.7875 | 0.9601 | 0.7481 | 0.0 | 0.7323 | 0.9545 | 0.3818 | 0.0 | 0.3720 | 0.8547 | 0.0 |
| 0.0981 | 54.55 | 600 | 0.1186 | 0.6044 | 0.6617 | 0.9702 | nan | 0.9893 | 0.9215 | 0.8812 | 0.9823 | 0.8643 | 0.0 | 0.7975 | 0.9668 | 0.5979 | 0.0 | 0.6080 | 0.9934 | 0.0 | nan | 0.9732 | 0.8551 | 0.7812 | 0.9613 | 0.7556 | 0.0 | 0.7315 | 0.9532 | 0.3826 | 0.0 | 0.6023 | 0.8617 | 0.0 |
| 0.0855 | 56.36 | 620 | 0.1130 | 0.6073 | 0.6544 | 0.9715 | nan | 0.9907 | 0.9243 | 0.8642 | 0.9847 | 0.8373 | 0.0 | 0.7492 | 0.9746 | 0.6229 | 0.0 | 0.5705 | 0.9886 | 0.0 | nan | 0.9734 | 0.8603 | 0.7875 | 0.9632 | 0.7602 | 0.0 | 0.7227 | 0.9618 | 0.4006 | 0.0 | 0.5660 | 0.8991 | 0.0 |
| 0.0847 | 58.18 | 640 | 0.1137 | 0.6133 | 0.6685 | 0.9707 | nan | 0.9867 | 0.9295 | 0.8990 | 0.9872 | 0.8400 | 0.0 | 0.7726 | 0.9702 | 0.5952 | 0.0 | 0.7141 | 0.9965 | 0.0 | nan | 0.9738 | 0.8515 | 0.7935 | 0.9620 | 0.7627 | 0.0 | 0.7351 | 0.9554 | 0.3896 | 0.0 | 0.7005 | 0.8494 | 0.0 |
| 0.0838 | 60.0 | 660 | 0.1104 | 0.6180 | 0.6612 | 0.9723 | nan | 0.9906 | 0.9348 | 0.8680 | 0.9822 | 0.8720 | 0.0 | 0.7611 | 0.9750 | 0.5930 | 0.0 | 0.6460 | 0.9724 | 0.0 | nan | 0.9745 | 0.8627 | 0.7982 | 0.9634 | 0.7661 | 0.0 | 0.7324 | 0.9635 | 0.3859 | 0.0 | 0.6398 | 0.9480 | 0.0 |
| 0.0823 | 61.82 | 680 | 0.1106 | 0.6166 | 0.6703 | 0.9715 | nan | 0.9879 | 0.9406 | 0.8956 | 0.9835 | 0.8632 | 0.0 | 0.8141 | 0.9647 | 0.5996 | 0.0 | 0.6696 | 0.9946 | 0.0 | nan | 0.9747 | 0.8535 | 0.8000 | 0.9643 | 0.7690 | 0.0 | 0.7666 | 0.9536 | 0.3917 | 0.0 | 0.6572 | 0.8858 | 0.0 |
| 0.101 | 63.64 | 700 | 0.1086 | 0.6237 | 0.6748 | 0.9717 | nan | 0.9882 | 0.9375 | 0.8963 | 0.9860 | 0.8543 | 0.0 | 0.8191 | 0.9653 | 0.5543 | 0.0 | 0.7782 | 0.9927 | 0.0 | nan | 0.9749 | 0.8564 | 0.8005 | 0.9627 | 0.7745 | 0.0 | 0.7667 | 0.9566 | 0.3652 | 0.0 | 0.7570 | 0.8934 | 0.0 |
| 0.1006 | 65.45 | 720 | 0.1065 | 0.6192 | 0.6599 | 0.9720 | nan | 0.9898 | 0.9419 | 0.8681 | 0.9855 | 0.8343 | 0.0 | 0.7655 | 0.9716 | 0.5197 | 0.0 | 0.7223 | 0.9801 | 0.0 | nan | 0.9743 | 0.8583 | 0.7958 | 0.9653 | 0.7742 | 0.0 | 0.7325 | 0.9586 | 0.3538 | 0.0 | 0.7146 | 0.9227 | 0.0 |
| 0.0717 | 67.27 | 740 | 0.1028 | 0.6311 | 0.6750 | 0.9737 | nan | 0.9907 | 0.9394 | 0.8865 | 0.9853 | 0.8621 | 0.0 | 0.7997 | 0.9727 | 0.5673 | 0.0 | 0.7852 | 0.9860 | 0.0 | nan | 0.9755 | 0.8706 | 0.8046 | 0.9663 | 0.7817 | 0.0 | 0.7594 | 0.9657 | 0.3813 | 0.0 | 0.7669 | 0.9317 | 0.0 |
| 0.0757 | 69.09 | 760 | 0.1036 | 0.6249 | 0.6739 | 0.9722 | nan | 0.9892 | 0.9321 | 0.8776 | 0.9885 | 0.8404 | 0.0 | 0.7978 | 0.9709 | 0.6121 | 0.0 | 0.7638 | 0.9884 | 0.0 | nan | 0.9753 | 0.8615 | 0.7849 | 0.9649 | 0.7798 | 0.0 | 0.7580 | 0.9577 | 0.3883 | 0.0 | 0.7425 | 0.9113 | 0.0 |
| 0.0728 | 70.91 | 780 | 0.1025 | 0.6249 | 0.6721 | 0.9727 | nan | 0.9914 | 0.9279 | 0.8663 | 0.9879 | 0.8526 | 0.0 | 0.7875 | 0.9703 | 0.5827 | 0.0 | 0.7788 | 0.9925 | 0.0 | nan | 0.9755 | 0.8610 | 0.7961 | 0.9646 | 0.7835 | 0.0 | 0.7545 | 0.9620 | 0.3864 | 0.0 | 0.7597 | 0.8804 | 0.0 |
| 0.0814 | 72.73 | 800 | 0.0986 | 0.6370 | 0.6918 | 0.9735 | nan | 0.9897 | 0.9413 | 0.8838 | 0.9856 | 0.8542 | 0.0 | 0.8407 | 0.9715 | 0.6379 | 0.0181 | 0.8766 | 0.9936 | 0.0 | nan | 0.9762 | 0.8644 | 0.8048 | 0.9678 | 0.7900 | 0.0 | 0.7829 | 0.9588 | 0.4128 | 0.0181 | 0.8285 | 0.8763 | 0.0 |
| 0.076 | 74.55 | 820 | 0.0990 | 0.6390 | 0.6884 | 0.9737 | nan | 0.9921 | 0.9310 | 0.8731 | 0.9881 | 0.8546 | 0.0 | 0.8415 | 0.9670 | 0.6033 | 0.0206 | 0.8910 | 0.9874 | 0.0 | nan | 0.9760 | 0.8693 | 0.8008 | 0.9663 | 0.7895 | 0.0 | 0.7833 | 0.9598 | 0.4038 | 0.0206 | 0.8301 | 0.9078 | 0.0 |
| 0.0734 | 76.36 | 840 | 0.0961 | 0.6347 | 0.6769 | 0.9746 | nan | 0.9922 | 0.9356 | 0.8709 | 0.9853 | 0.8601 | 0.0 | 0.7988 | 0.9810 | 0.5985 | 0.0068 | 0.7869 | 0.9832 | 0.0 | nan | 0.9758 | 0.8791 | 0.7994 | 0.9683 | 0.7924 | 0.0 | 0.7649 | 0.9695 | 0.4034 | 0.0068 | 0.7649 | 0.9263 | 0.0 |
| 0.0727 | 78.18 | 860 | 0.0971 | 0.6379 | 0.6849 | 0.9734 | nan | 0.9905 | 0.9206 | 0.8899 | 0.9909 | 0.8484 | 0.0 | 0.8286 | 0.9746 | 0.5959 | 0.0112 | 0.8695 | 0.9840 | 0.0 | nan | 0.9756 | 0.8678 | 0.7957 | 0.9640 | 0.7858 | 0.0 | 0.7832 | 0.9661 | 0.3971 | 0.0112 | 0.8243 | 0.9213 | 0.0 |
| 0.0824 | 80.0 | 880 | 0.1008 | 0.6308 | 0.6748 | 0.9732 | nan | 0.9920 | 0.9387 | 0.8628 | 0.9873 | 0.8577 | 0.0 | 0.7994 | 0.9655 | 0.5130 | 0.0 | 0.8722 | 0.9840 | 0.0 | nan | 0.9765 | 0.8586 | 0.7941 | 0.9685 | 0.7989 | 0.0 | 0.7642 | 0.9588 | 0.3452 | 0.0 | 0.8196 | 0.9162 | 0.0 |
| 0.0862 | 81.82 | 900 | 0.0967 | 0.6329 | 0.6913 | 0.9739 | nan | 0.9913 | 0.9371 | 0.8783 | 0.9830 | 0.8831 | 0.0 | 0.8435 | 0.9753 | 0.5548 | 0.0351 | 0.9119 | 0.9934 | 0.0 | nan | 0.9768 | 0.8633 | 0.8073 | 0.9686 | 0.7953 | 0.0 | 0.7885 | 0.9637 | 0.3711 | 0.0351 | 0.7954 | 0.8623 | 0.0 |
| 0.0613 | 83.64 | 920 | 0.0946 | 0.6397 | 0.6941 | 0.9743 | nan | 0.9916 | 0.9380 | 0.8773 | 0.9828 | 0.8829 | 0.0 | 0.8314 | 0.9766 | 0.5793 | 0.0716 | 0.8993 | 0.9920 | 0.0 | nan | 0.9762 | 0.8712 | 0.7988 | 0.9693 | 0.7988 | 0.0 | 0.7910 | 0.9662 | 0.3915 | 0.0716 | 0.7956 | 0.8864 | 0.0 |
| 0.0795 | 85.45 | 940 | 0.0972 | 0.6321 | 0.6818 | 0.9736 | nan | 0.9892 | 0.9452 | 0.8657 | 0.9865 | 0.8651 | 0.0 | 0.8005 | 0.9801 | 0.5751 | 0.0 | 0.8663 | 0.9901 | 0.0 | nan | 0.9763 | 0.8654 | 0.7902 | 0.9689 | 0.7982 | 0.0 | 0.7636 | 0.9638 | 0.3789 | 0.0 | 0.8132 | 0.8984 | 0.0 |
| 0.074 | 87.27 | 960 | 0.0975 | 0.6355 | 0.6833 | 0.9735 | nan | 0.9924 | 0.9455 | 0.8134 | 0.9863 | 0.8652 | 0.0 | 0.8170 | 0.9725 | 0.6270 | 0.0018 | 0.8787 | 0.9829 | 0.0 | nan | 0.9762 | 0.8634 | 0.7688 | 0.9696 | 0.7994 | 0.0 | 0.7808 | 0.9641 | 0.4099 | 0.0018 | 0.8052 | 0.9219 | 0.0 |
| 0.0621 | 89.09 | 980 | 0.0941 | 0.6412 | 0.6932 | 0.9748 | nan | 0.9917 | 0.9363 | 0.8733 | 0.9835 | 0.8917 | 0.0 | 0.8269 | 0.9807 | 0.6488 | 0.0 | 0.8986 | 0.9804 | 0.0 | nan | 0.9762 | 0.8782 | 0.8030 | 0.9694 | 0.7955 | 0.0 | 0.7839 | 0.9702 | 0.4047 | 0.0 | 0.8211 | 0.9332 | 0.0 |
| 0.061 | 90.91 | 1000 | 0.0962 | 0.6385 | 0.6837 | 0.9745 | nan | 0.9903 | 0.9429 | 0.8857 | 0.9876 | 0.8820 | 0.0 | 0.8102 | 0.9735 | 0.6191 | 0.0 | 0.8501 | 0.9466 | 0.0 | nan | 0.9770 | 0.8709 | 0.8107 | 0.9681 | 0.8017 | 0.0 | 0.7768 | 0.9635 | 0.4054 | 0.0 | 0.8043 | 0.9227 | 0.0 |
| 0.0694 | 92.73 | 1020 | 0.0934 | 0.6353 | 0.6904 | 0.9744 | nan | 0.9928 | 0.9322 | 0.8513 | 0.9847 | 0.8925 | 0.0 | 0.8446 | 0.9790 | 0.5982 | 0.0 | 0.9103 | 0.9891 | 0.0 | nan | 0.9765 | 0.8731 | 0.7915 | 0.9696 | 0.8038 | 0.0 | 0.8022 | 0.9672 | 0.3942 | 0.0 | 0.8058 | 0.8749 | 0.0 |
| 0.0581 | 94.55 | 1040 | 0.0952 | 0.6329 | 0.6797 | 0.9743 | nan | 0.9924 | 0.9379 | 0.8861 | 0.9828 | 0.8854 | 0.0 | 0.8086 | 0.9757 | 0.4666 | 0.0103 | 0.9077 | 0.9820 | 0.0 | nan | 0.9769 | 0.8698 | 0.7971 | 0.9701 | 0.8059 | 0.0 | 0.7741 | 0.9671 | 0.3180 | 0.0103 | 0.8143 | 0.9240 | 0.0 |
| 0.0767 | 96.36 | 1060 | 0.0963 | 0.6125 | 0.6948 | 0.9742 | nan | 0.9909 | 0.9468 | 0.8540 | 0.9825 | 0.8924 | 0.0 | 0.8423 | 0.9782 | 0.6411 | 0.0 | 0.9336 | 0.9704 | 0.0 | nan | 0.9772 | 0.8733 | 0.7785 | 0.9700 | 0.8043 | 0.0 | 0.7935 | 0.9670 | 0.4153 | 0.0 | 0.4664 | 0.9176 | 0.0 |
| 0.0578 | 98.18 | 1080 | 0.0923 | 0.6133 | 0.6931 | 0.9751 | nan | 0.9908 | 0.9453 | 0.8715 | 0.9856 | 0.8800 | 0.0 | 0.8405 | 0.9812 | 0.6312 | 0.0029 | 0.9162 | 0.9644 | 0.0 | nan | 0.9774 | 0.8813 | 0.7942 | 0.9709 | 0.8065 | 0.0 | 0.8001 | 0.9682 | 0.4117 | 0.0029 | 0.4287 | 0.9315 | 0.0 |
| 0.0625 | 100.0 | 1100 | 0.0946 | 0.6401 | 0.6875 | 0.9749 | nan | 0.9923 | 0.9525 | 0.8594 | 0.9823 | 0.8810 | 0.0 | 0.8104 | 0.9743 | 0.6042 | 0.0253 | 0.8752 | 0.9808 | 0.0 | nan | 0.9767 | 0.8767 | 0.7946 | 0.9709 | 0.8104 | 0.0 | 0.7709 | 0.9666 | 0.4063 | 0.0253 | 0.8011 | 0.9214 | 0.0 |
| 0.0625 | 101.82 | 1120 | 0.0919 | 0.6509 | 0.6976 | 0.9756 | nan | 0.9919 | 0.9373 | 0.8786 | 0.9878 | 0.8742 | 0.0 | 0.8209 | 0.9782 | 0.6487 | 0.0827 | 0.8896 | 0.9792 | 0.0 | nan | 0.9770 | 0.8803 | 0.8101 | 0.9700 | 0.8078 | 0.0 | 0.7759 | 0.9677 | 0.4229 | 0.0827 | 0.8296 | 0.9379 | 0.0 |
| 0.0601 | 103.64 | 1140 | 0.0895 | 0.6437 | 0.6961 | 0.9758 | nan | 0.9897 | 0.9468 | 0.9090 | 0.9853 | 0.8808 | 0.0 | 0.8005 | 0.9836 | 0.6285 | 0.0171 | 0.9341 | 0.9739 | 0.0 | nan | 0.9776 | 0.8834 | 0.8132 | 0.9704 | 0.8076 | 0.0 | 0.7719 | 0.9690 | 0.4163 | 0.0171 | 0.8064 | 0.9350 | 0.0 |
| 0.0605 | 105.45 | 1160 | 0.0899 | 0.6485 | 0.7021 | 0.9759 | nan | 0.9920 | 0.9461 | 0.8857 | 0.9806 | 0.8977 | 0.0 | 0.8183 | 0.9818 | 0.6503 | 0.0771 | 0.9196 | 0.9788 | 0.0 | nan | 0.9780 | 0.8785 | 0.8143 | 0.9704 | 0.8113 | 0.0 | 0.7845 | 0.9699 | 0.4237 | 0.0771 | 0.7904 | 0.9328 | 0.0 |
| 0.0549 | 107.27 | 1180 | 0.0907 | 0.6500 | 0.6982 | 0.9758 | nan | 0.9917 | 0.9485 | 0.8989 | 0.9842 | 0.8858 | 0.0 | 0.8331 | 0.9703 | 0.6364 | 0.0821 | 0.8638 | 0.9823 | 0.0 | nan | 0.9784 | 0.8788 | 0.8145 | 0.9716 | 0.8131 | 0.0 | 0.7879 | 0.9622 | 0.4238 | 0.0821 | 0.8186 | 0.9192 | 0.0 |
| 0.0582 | 109.09 | 1200 | 0.0895 | 0.6519 | 0.6951 | 0.9763 | nan | 0.9920 | 0.9446 | 0.8876 | 0.9875 | 0.8584 | 0.0 | 0.8004 | 0.9797 | 0.6433 | 0.0775 | 0.8928 | 0.9728 | 0.0 | nan | 0.9777 | 0.8830 | 0.8184 | 0.9713 | 0.8092 | 0.0 | 0.7727 | 0.9688 | 0.4242 | 0.0775 | 0.8218 | 0.9506 | 0.0 |
| 0.0601 | 110.91 | 1220 | 0.0938 | 0.6465 | 0.7012 | 0.9751 | nan | 0.9917 | 0.9413 | 0.8822 | 0.9858 | 0.8868 | 0.0 | 0.8577 | 0.9711 | 0.6408 | 0.0629 | 0.9139 | 0.9818 | 0.0 | nan | 0.9770 | 0.8747 | 0.8055 | 0.9719 | 0.8182 | 0.0 | 0.7983 | 0.9625 | 0.4185 | 0.0629 | 0.7878 | 0.9267 | 0.0 |
| 0.0507 | 112.73 | 1240 | 0.0900 | 0.6421 | 0.6966 | 0.9757 | nan | 0.9925 | 0.9472 | 0.8611 | 0.9834 | 0.8931 | 0.0 | 0.8411 | 0.9793 | 0.6416 | 0.0 | 0.9334 | 0.9835 | 0.0 | nan | 0.9773 | 0.8788 | 0.8030 | 0.9717 | 0.8161 | 0.0 | 0.8052 | 0.9697 | 0.4215 | 0.0 | 0.7786 | 0.9250 | 0.0 |
| 0.0481 | 114.55 | 1260 | 0.0902 | 0.6484 | 0.6982 | 0.9756 | nan | 0.9921 | 0.9440 | 0.8811 | 0.9850 | 0.9023 | 0.0 | 0.8030 | 0.9731 | 0.6327 | 0.0715 | 0.9061 | 0.9855 | 0.0 | nan | 0.9773 | 0.8786 | 0.8096 | 0.9721 | 0.8166 | 0.0 | 0.7842 | 0.9654 | 0.4137 | 0.0715 | 0.8196 | 0.9206 | 0.0 |
| 0.0489 | 116.36 | 1280 | 0.0926 | 0.6357 | 0.6956 | 0.9754 | nan | 0.9922 | 0.9397 | 0.8746 | 0.9872 | 0.8816 | 0.0 | 0.8691 | 0.9742 | 0.6310 | 0.0 | 0.9116 | 0.9820 | 0.0 | nan | 0.9779 | 0.8761 | 0.8036 | 0.9717 | 0.8186 | 0.0 | 0.8086 | 0.9636 | 0.4168 | 0.0 | 0.7027 | 0.9247 | 0.0 |
| 0.0496 | 118.18 | 1300 | 0.0920 | 0.6385 | 0.6972 | 0.9752 | nan | 0.9919 | 0.9472 | 0.8730 | 0.9867 | 0.8829 | 0.0 | 0.8212 | 0.9680 | 0.6221 | 0.0683 | 0.9157 | 0.9866 | 0.0 | nan | 0.9779 | 0.8720 | 0.8000 | 0.9727 | 0.8193 | 0.0 | 0.7958 | 0.9614 | 0.4063 | 0.0683 | 0.7026 | 0.9237 | 0.0 |
| 0.0466 | 120.0 | 1320 | 0.0892 | 0.6451 | 0.6923 | 0.9762 | nan | 0.9926 | 0.9490 | 0.8603 | 0.9882 | 0.8697 | 0.0 | 0.8289 | 0.9776 | 0.6185 | 0.0300 | 0.9073 | 0.9779 | 0.0 | nan | 0.9779 | 0.8821 | 0.8057 | 0.9722 | 0.8146 | 0.0 | 0.7891 | 0.9708 | 0.4077 | 0.0300 | 0.7946 | 0.9411 | 0.0 |
| 0.0515 | 121.82 | 1340 | 0.0889 | 0.6225 | 0.6889 | 0.9760 | nan | 0.9916 | 0.9512 | 0.8571 | 0.9877 | 0.8805 | 0.0 | 0.8691 | 0.9829 | 0.5486 | 0.0 | 0.9002 | 0.9872 | 0.0 | nan | 0.9787 | 0.8784 | 0.7992 | 0.9724 | 0.8198 | 0.0 | 0.8019 | 0.9725 | 0.3641 | 0.0 | 0.5740 | 0.9319 | 0.0 |
| 0.0646 | 123.64 | 1360 | 0.0879 | 0.6483 | 0.6948 | 0.9767 | nan | 0.9909 | 0.9483 | 0.9134 | 0.9862 | 0.8877 | 0.0 | 0.8176 | 0.9791 | 0.6465 | 0.0 | 0.8759 | 0.9869 | 0.0 | nan | 0.9784 | 0.8847 | 0.8194 | 0.9733 | 0.8219 | 0.0 | 0.7863 | 0.9718 | 0.4218 | 0.0 | 0.8345 | 0.9355 | 0.0 |
| 0.052 | 125.45 | 1380 | 0.0860 | 0.6528 | 0.7026 | 0.9767 | nan | 0.9910 | 0.9478 | 0.9098 | 0.9831 | 0.8939 | 0.0 | 0.8247 | 0.9830 | 0.6543 | 0.0764 | 0.8794 | 0.9907 | 0.0 | nan | 0.9780 | 0.8868 | 0.8143 | 0.9725 | 0.8233 | 0.0 | 0.7908 | 0.9734 | 0.4281 | 0.0764 | 0.8251 | 0.9183 | 0.0 |
| 0.0498 | 127.27 | 1400 | 0.0891 | 0.6469 | 0.6922 | 0.9765 | nan | 0.9917 | 0.9511 | 0.8928 | 0.9828 | 0.8927 | 0.0 | 0.8251 | 0.9820 | 0.6401 | 0.0 | 0.8562 | 0.9837 | 0.0 | nan | 0.9786 | 0.8815 | 0.8057 | 0.9725 | 0.8246 | 0.0 | 0.7955 | 0.9735 | 0.4228 | 0.0 | 0.8107 | 0.9446 | 0.0 |
| 0.0577 | 129.09 | 1420 | 0.0908 | 0.6457 | 0.6903 | 0.9759 | nan | 0.9914 | 0.9513 | 0.8768 | 0.9888 | 0.8646 | 0.0 | 0.8205 | 0.9749 | 0.6379 | 0.0 | 0.8864 | 0.9820 | 0.0 | nan | 0.9787 | 0.8721 | 0.8048 | 0.9728 | 0.8223 | 0.0 | 0.7902 | 0.9683 | 0.4247 | 0.0 | 0.8145 | 0.9455 | 0.0 |
| 0.0526 | 130.91 | 1440 | 0.0905 | 0.6463 | 0.6887 | 0.9763 | nan | 0.9925 | 0.9479 | 0.8874 | 0.9881 | 0.8765 | 0.0 | 0.8107 | 0.9721 | 0.6013 | 0.0 | 0.8958 | 0.9802 | 0.0 | nan | 0.9781 | 0.8797 | 0.8133 | 0.9734 | 0.8237 | 0.0 | 0.7899 | 0.9676 | 0.4075 | 0.0 | 0.8283 | 0.9403 | 0.0 |
| 0.0521 | 132.73 | 1460 | 0.0866 | 0.6495 | 0.7001 | 0.9766 | nan | 0.9909 | 0.9440 | 0.8909 | 0.9898 | 0.8783 | 0.0 | 0.8513 | 0.9804 | 0.6767 | 0.0 | 0.9139 | 0.9849 | 0.0 | nan | 0.9789 | 0.8819 | 0.8062 | 0.9723 | 0.8237 | 0.0 | 0.8072 | 0.9726 | 0.4347 | 0.0 | 0.8244 | 0.9418 | 0.0 |
| 0.0534 | 134.55 | 1480 | 0.0853 | 0.6529 | 0.6992 | 0.9770 | nan | 0.9930 | 0.9443 | 0.8636 | 0.9871 | 0.8985 | 0.0 | 0.8256 | 0.9829 | 0.6691 | 0.0397 | 0.9071 | 0.9792 | 0.0 | nan | 0.9784 | 0.8882 | 0.8086 | 0.9727 | 0.8209 | 0.0 | 0.7980 | 0.9743 | 0.4316 | 0.0397 | 0.8272 | 0.9485 | 0.0 |
| 0.0454 | 136.36 | 1500 | 0.0877 | 0.6435 | 0.6959 | 0.9764 | nan | 0.9928 | 0.9514 | 0.8571 | 0.9841 | 0.9012 | 0.0 | 0.8173 | 0.9806 | 0.6427 | 0.0138 | 0.9174 | 0.9882 | 0.0 | nan | 0.9782 | 0.8802 | 0.8042 | 0.9735 | 0.8269 | 0.0 | 0.7976 | 0.9712 | 0.4288 | 0.0138 | 0.7780 | 0.9136 | 0.0 |
| 0.0454 | 138.18 | 1520 | 0.0871 | 0.6529 | 0.7036 | 0.9764 | nan | 0.9916 | 0.9438 | 0.8860 | 0.9886 | 0.8818 | 0.0 | 0.8508 | 0.9770 | 0.6519 | 0.0801 | 0.9118 | 0.9839 | 0.0 | nan | 0.9787 | 0.8801 | 0.8046 | 0.9734 | 0.8261 | 0.0 | 0.7961 | 0.9680 | 0.4282 | 0.0801 | 0.8148 | 0.9382 | 0.0 |
| 0.0472 | 140.0 | 1540 | 0.0870 | 0.6510 | 0.6964 | 0.9765 | nan | 0.9927 | 0.9462 | 0.8786 | 0.9888 | 0.8634 | 0.0 | 0.8223 | 0.9755 | 0.6278 | 0.0853 | 0.8873 | 0.9853 | 0.0 | nan | 0.9789 | 0.8797 | 0.8114 | 0.9734 | 0.8192 | 0.0 | 0.7982 | 0.9672 | 0.4151 | 0.0853 | 0.8036 | 0.9311 | 0.0 |
| 0.046 | 141.82 | 1560 | 0.0897 | 0.6427 | 0.6985 | 0.9759 | nan | 0.9913 | 0.9441 | 0.8956 | 0.9896 | 0.8827 | 0.0 | 0.8468 | 0.9695 | 0.6313 | 0.0028 | 0.9379 | 0.9890 | 0.0 | nan | 0.9787 | 0.8752 | 0.8130 | 0.9727 | 0.8245 | 0.0 | 0.8000 | 0.9629 | 0.4166 | 0.0028 | 0.7933 | 0.9153 | 0.0 |
| 0.054 | 143.64 | 1580 | 0.0851 | 0.6511 | 0.6953 | 0.9771 | nan | 0.9919 | 0.9474 | 0.9026 | 0.9873 | 0.8902 | 0.0 | 0.8168 | 0.9787 | 0.6303 | 0.0229 | 0.8949 | 0.9764 | 0.0 | nan | 0.9792 | 0.8839 | 0.8160 | 0.9741 | 0.8271 | 0.0 | 0.7928 | 0.9710 | 0.4205 | 0.0229 | 0.8237 | 0.9530 | 0.0 |
| 0.0538 | 145.45 | 1600 | 0.0863 | 0.6539 | 0.6978 | 0.9769 | nan | 0.9932 | 0.9493 | 0.8808 | 0.9874 | 0.8811 | 0.0 | 0.8404 | 0.9743 | 0.6038 | 0.0749 | 0.9027 | 0.9832 | 0.0 | nan | 0.9789 | 0.8816 | 0.8226 | 0.9736 | 0.8252 | 0.0 | 0.8106 | 0.9671 | 0.4111 | 0.0749 | 0.8138 | 0.9410 | 0.0 |
| 0.0553 | 147.27 | 1620 | 0.0900 | 0.6487 | 0.6923 | 0.9764 | nan | 0.9921 | 0.9533 | 0.8760 | 0.9868 | 0.8884 | 0.0 | 0.8422 | 0.9726 | 0.6472 | 0.0 | 0.8610 | 0.9802 | 0.0 | nan | 0.9790 | 0.8731 | 0.8220 | 0.9737 | 0.8307 | 0.0 | 0.8132 | 0.9638 | 0.4260 | 0.0 | 0.8079 | 0.9441 | 0.0 |
| 0.0475 | 149.09 | 1640 | 0.0850 | 0.6531 | 0.7045 | 0.9772 | nan | 0.9921 | 0.9504 | 0.9025 | 0.9847 | 0.9053 | 0.0 | 0.8488 | 0.9764 | 0.6332 | 0.0624 | 0.9157 | 0.9867 | 0.0 | nan | 0.9792 | 0.8844 | 0.8242 | 0.9740 | 0.8297 | 0.0 | 0.8054 | 0.9697 | 0.4214 | 0.0624 | 0.8092 | 0.9313 | 0.0 |
| 0.046 | 150.91 | 1660 | 0.0835 | 0.6581 | 0.7046 | 0.9778 | nan | 0.9925 | 0.9531 | 0.8980 | 0.9844 | 0.8964 | 0.0 | 0.8539 | 0.9814 | 0.6511 | 0.0612 | 0.9086 | 0.9788 | 0.0 | nan | 0.9792 | 0.8915 | 0.8260 | 0.9744 | 0.8322 | 0.0 | 0.8049 | 0.9728 | 0.4307 | 0.0612 | 0.8328 | 0.9491 | 0.0 |
| 0.0472 | 152.73 | 1680 | 0.0842 | 0.6565 | 0.7050 | 0.9774 | nan | 0.9927 | 0.9409 | 0.8948 | 0.9857 | 0.8900 | 0.0 | 0.8360 | 0.9859 | 0.6512 | 0.0712 | 0.9316 | 0.9845 | 0.0 | nan | 0.9787 | 0.8884 | 0.8210 | 0.9741 | 0.8296 | 0.0 | 0.8010 | 0.9748 | 0.4301 | 0.0712 | 0.8283 | 0.9377 | 0.0 |
| 0.047 | 154.55 | 1700 | 0.0895 | 0.6477 | 0.6973 | 0.9764 | nan | 0.9929 | 0.9457 | 0.8857 | 0.9876 | 0.8951 | 0.0 | 0.8400 | 0.9691 | 0.6246 | 0.0 | 0.9395 | 0.9848 | 0.0 | nan | 0.9786 | 0.8799 | 0.8180 | 0.9741 | 0.8284 | 0.0 | 0.8029 | 0.9623 | 0.4179 | 0.0 | 0.8250 | 0.9332 | 0.0 |
| 0.0449 | 156.36 | 1720 | 0.0923 | 0.6456 | 0.6960 | 0.9762 | nan | 0.9927 | 0.9528 | 0.8601 | 0.9872 | 0.8886 | 0.0 | 0.8539 | 0.9723 | 0.6287 | 0.0 | 0.9263 | 0.9854 | 0.0 | nan | 0.9791 | 0.8736 | 0.8005 | 0.9748 | 0.8344 | 0.0 | 0.8076 | 0.9662 | 0.4123 | 0.0 | 0.8087 | 0.9358 | 0.0 |
| 0.0485 | 158.18 | 1740 | 0.0888 | 0.6521 | 0.6963 | 0.9775 | nan | 0.9926 | 0.9479 | 0.9006 | 0.9892 | 0.8833 | 0.0 | 0.8442 | 0.9753 | 0.6389 | 0.0 | 0.9018 | 0.9776 | 0.0 | nan | 0.9798 | 0.8849 | 0.8284 | 0.9743 | 0.8297 | 0.0 | 0.8031 | 0.9680 | 0.4230 | 0.0 | 0.8396 | 0.9470 | 0.0 |
| 0.047 | 160.0 | 1760 | 0.0899 | 0.6512 | 0.7015 | 0.9771 | nan | 0.9918 | 0.9530 | 0.8939 | 0.9845 | 0.9084 | 0.0 | 0.8653 | 0.9783 | 0.6442 | 0.0 | 0.9134 | 0.9864 | 0.0 | nan | 0.9794 | 0.8832 | 0.8221 | 0.9743 | 0.8297 | 0.0 | 0.8039 | 0.9680 | 0.4282 | 0.0 | 0.8477 | 0.9295 | 0.0 |
| 0.0573 | 161.82 | 1780 | 0.0900 | 0.6508 | 0.6952 | 0.9766 | nan | 0.9931 | 0.9509 | 0.8661 | 0.9850 | 0.9008 | 0.0 | 0.8355 | 0.9767 | 0.6550 | 0.0 | 0.8974 | 0.9769 | 0.0 | nan | 0.9787 | 0.8782 | 0.8099 | 0.9747 | 0.8355 | 0.0 | 0.8020 | 0.9679 | 0.4319 | 0.0 | 0.8303 | 0.9518 | 0.0 |
| 0.0415 | 163.64 | 1800 | 0.0888 | 0.6489 | 0.6987 | 0.9768 | nan | 0.9937 | 0.9401 | 0.8667 | 0.9875 | 0.8987 | 0.0 | 0.8514 | 0.9774 | 0.6620 | 0.0 | 0.9185 | 0.9875 | 0.0 | nan | 0.9783 | 0.8812 | 0.8114 | 0.9746 | 0.8338 | 0.0 | 0.8022 | 0.9703 | 0.4376 | 0.0 | 0.8244 | 0.9224 | 0.0 |
| 0.0455 | 165.45 | 1820 | 0.0908 | 0.6474 | 0.6978 | 0.9765 | nan | 0.9930 | 0.9469 | 0.8557 | 0.9881 | 0.8875 | 0.0 | 0.8540 | 0.9778 | 0.6581 | 0.0 | 0.9226 | 0.9874 | 0.0 | nan | 0.9788 | 0.8756 | 0.7998 | 0.9749 | 0.8327 | 0.0 | 0.8104 | 0.9710 | 0.4316 | 0.0 | 0.8070 | 0.9345 | 0.0 |
| 0.0476 | 167.27 | 1840 | 0.0870 | 0.6521 | 0.6972 | 0.9773 | nan | 0.9928 | 0.9471 | 0.8736 | 0.9876 | 0.8920 | 0.0 | 0.8445 | 0.9823 | 0.6509 | 0.0 | 0.9084 | 0.9843 | 0.0 | nan | 0.9788 | 0.8840 | 0.8087 | 0.9754 | 0.8370 | 0.0 | 0.8069 | 0.9743 | 0.4316 | 0.0 | 0.8312 | 0.9496 | 0.0 |
| 0.0466 | 169.09 | 1860 | 0.0873 | 0.6527 | 0.7007 | 0.9771 | nan | 0.9934 | 0.9468 | 0.8779 | 0.9850 | 0.8997 | 0.0 | 0.8360 | 0.9793 | 0.6479 | 0.0466 | 0.9153 | 0.9807 | 0.0 | nan | 0.9784 | 0.8866 | 0.8097 | 0.9750 | 0.8353 | 0.0 | 0.7956 | 0.9725 | 0.4301 | 0.0466 | 0.8158 | 0.9392 | 0.0 |
| 0.0458 | 170.91 | 1880 | 0.0892 | 0.6493 | 0.6967 | 0.9766 | nan | 0.9925 | 0.9451 | 0.8769 | 0.9886 | 0.8944 | 0.0 | 0.8559 | 0.9753 | 0.6460 | 0.0 | 0.8972 | 0.9850 | 0.0 | nan | 0.9791 | 0.8766 | 0.8070 | 0.9745 | 0.8340 | 0.0 | 0.8086 | 0.9697 | 0.4269 | 0.0 | 0.8242 | 0.9402 | 0.0 |
| 0.0438 | 172.73 | 1900 | 0.0884 | 0.6493 | 0.7011 | 0.9770 | nan | 0.9921 | 0.9517 | 0.8792 | 0.9871 | 0.8968 | 0.0 | 0.8628 | 0.9778 | 0.6517 | 0.0 | 0.9245 | 0.9902 | 0.0 | nan | 0.9793 | 0.8782 | 0.8131 | 0.9752 | 0.8373 | 0.0 | 0.8123 | 0.9724 | 0.4300 | 0.0 | 0.8186 | 0.9245 | 0.0 |
| 0.0412 | 174.55 | 1920 | 0.0863 | 0.6519 | 0.6984 | 0.9775 | nan | 0.9930 | 0.9543 | 0.8828 | 0.9832 | 0.9010 | 0.0 | 0.8560 | 0.9819 | 0.6329 | 0.0 | 0.9123 | 0.9814 | 0.0 | nan | 0.9791 | 0.8850 | 0.8212 | 0.9745 | 0.8370 | 0.0 | 0.8093 | 0.9744 | 0.4213 | 0.0 | 0.8262 | 0.9469 | 0.0 |
| 0.0413 | 176.36 | 1940 | 0.0866 | 0.6535 | 0.6994 | 0.9777 | nan | 0.9923 | 0.9491 | 0.8948 | 0.9870 | 0.8848 | 0.0 | 0.8582 | 0.9835 | 0.6291 | 0.0 | 0.9329 | 0.9807 | 0.0 | nan | 0.9792 | 0.8866 | 0.8244 | 0.9751 | 0.8331 | 0.0 | 0.8170 | 0.9722 | 0.4239 | 0.0 | 0.8304 | 0.9531 | 0.0 |
| 0.0411 | 178.18 | 1960 | 0.0856 | 0.6541 | 0.7070 | 0.9777 | nan | 0.9917 | 0.9462 | 0.8965 | 0.9902 | 0.8767 | 0.0 | 0.8706 | 0.9804 | 0.6980 | 0.0255 | 0.9240 | 0.9908 | 0.0 | nan | 0.9793 | 0.8844 | 0.8266 | 0.9748 | 0.8333 | 0.0 | 0.8101 | 0.9735 | 0.4539 | 0.0255 | 0.8141 | 0.9283 | 0.0 |
| 0.044 | 180.0 | 1980 | 0.0848 | 0.6573 | 0.7119 | 0.9777 | nan | 0.9917 | 0.9502 | 0.8949 | 0.9878 | 0.8918 | 0.0 | 0.8804 | 0.9801 | 0.6919 | 0.0840 | 0.9151 | 0.9870 | 0.0 | nan | 0.9795 | 0.8862 | 0.8244 | 0.9749 | 0.8341 | 0.0 | 0.8058 | 0.9730 | 0.4499 | 0.0840 | 0.7933 | 0.9394 | 0.0 |
| 0.0437 | 181.82 | 2000 | 0.0900 | 0.6446 | 0.6968 | 0.9772 | nan | 0.9934 | 0.9472 | 0.8668 | 0.9872 | 0.9019 | 0.0 | 0.8430 | 0.9801 | 0.6367 | 0.0001 | 0.9181 | 0.9843 | 0.0 | nan | 0.9791 | 0.8825 | 0.8120 | 0.9752 | 0.8367 | 0.0 | 0.7932 | 0.9726 | 0.4278 | 0.0001 | 0.7565 | 0.9438 | 0.0 |
| 0.0384 | 183.64 | 2020 | 0.0892 | 0.6475 | 0.7018 | 0.9775 | nan | 0.9923 | 0.9459 | 0.8866 | 0.9890 | 0.8866 | 0.0 | 0.8772 | 0.9815 | 0.6549 | 0.0001 | 0.9199 | 0.9899 | 0.0 | nan | 0.9795 | 0.8839 | 0.8161 | 0.9752 | 0.8361 | 0.0 | 0.8166 | 0.9733 | 0.4355 | 0.0001 | 0.7822 | 0.9194 | 0.0 |
| 0.044 | 185.45 | 2040 | 0.0860 | 0.6523 | 0.6998 | 0.9779 | nan | 0.9936 | 0.9498 | 0.8754 | 0.9870 | 0.8974 | 0.0 | 0.8672 | 0.9808 | 0.6493 | 0.0001 | 0.9173 | 0.9792 | 0.0 | nan | 0.9793 | 0.8875 | 0.8213 | 0.9758 | 0.8396 | 0.0 | 0.8175 | 0.9736 | 0.4336 | 0.0001 | 0.7977 | 0.9538 | 0.0 |
| 0.0397 | 187.27 | 2060 | 0.0875 | 0.6470 | 0.6988 | 0.9778 | nan | 0.9942 | 0.9488 | 0.8691 | 0.9879 | 0.8919 | 0.0 | 0.8497 | 0.9779 | 0.6551 | 0.0004 | 0.9288 | 0.9801 | 0.0 | nan | 0.9793 | 0.8882 | 0.8197 | 0.9753 | 0.8392 | 0.0 | 0.8131 | 0.9727 | 0.4375 | 0.0004 | 0.7312 | 0.9548 | 0.0 |
| 0.0397 | 189.09 | 2080 | 0.0875 | 0.6525 | 0.7044 | 0.9776 | nan | 0.9912 | 0.9541 | 0.9009 | 0.9856 | 0.9121 | 0.0 | 0.8845 | 0.9806 | 0.6498 | 0.0020 | 0.9075 | 0.9886 | 0.0 | nan | 0.9797 | 0.8841 | 0.8202 | 0.9753 | 0.8369 | 0.0 | 0.8132 | 0.9733 | 0.4313 | 0.0020 | 0.8261 | 0.9399 | 0.0 |
| 0.0417 | 190.91 | 2100 | 0.0873 | 0.6534 | 0.7025 | 0.9775 | nan | 0.9912 | 0.9503 | 0.8970 | 0.9888 | 0.8964 | 0.0 | 0.8676 | 0.9803 | 0.6612 | 0.0068 | 0.9107 | 0.9823 | 0.0 | nan | 0.9795 | 0.8842 | 0.8107 | 0.9754 | 0.8354 | 0.0 | 0.8147 | 0.9743 | 0.4355 | 0.0068 | 0.8251 | 0.9528 | 0.0 |
| 0.0399 | 192.73 | 2120 | 0.0882 | 0.6500 | 0.6980 | 0.9773 | nan | 0.9934 | 0.9437 | 0.8780 | 0.9877 | 0.8983 | 0.0 | 0.8584 | 0.9787 | 0.6419 | 0.0012 | 0.9041 | 0.9883 | 0.0 | nan | 0.9792 | 0.8825 | 0.8114 | 0.9756 | 0.8374 | 0.0 | 0.8127 | 0.9729 | 0.4276 | 0.0012 | 0.8228 | 0.9266 | 0.0 |
| 0.0464 | 194.55 | 2140 | 0.0844 | 0.6580 | 0.7076 | 0.9780 | nan | 0.9926 | 0.9523 | 0.8899 | 0.9840 | 0.9078 | 0.0 | 0.8758 | 0.9843 | 0.6454 | 0.0559 | 0.9272 | 0.9834 | 0.0 | nan | 0.9795 | 0.8884 | 0.8239 | 0.9751 | 0.8369 | 0.0 | 0.8236 | 0.9746 | 0.4309 | 0.0559 | 0.8140 | 0.9517 | 0.0 |
| 0.0411 | 196.36 | 2160 | 0.0854 | 0.6545 | 0.7030 | 0.9779 | nan | 0.9915 | 0.9567 | 0.8874 | 0.9869 | 0.9026 | 0.0 | 0.8709 | 0.9824 | 0.6750 | 0.0 | 0.8975 | 0.9882 | 0.0 | nan | 0.9800 | 0.8839 | 0.8211 | 0.9755 | 0.8396 | 0.0 | 0.8254 | 0.9740 | 0.4417 | 0.0 | 0.8226 | 0.9451 | 0.0 |
| 0.0403 | 198.18 | 2180 | 0.0836 | 0.6629 | 0.7087 | 0.9782 | nan | 0.9916 | 0.9578 | 0.9090 | 0.9854 | 0.9005 | 0.0 | 0.8692 | 0.9803 | 0.6450 | 0.0800 | 0.9103 | 0.9836 | 0.0 | nan | 0.9798 | 0.8886 | 0.8302 | 0.9754 | 0.8401 | 0.0 | 0.8181 | 0.9746 | 0.4304 | 0.0800 | 0.8500 | 0.9507 | 0.0 |
| 0.0406 | 200.0 | 2200 | 0.0839 | 0.6630 | 0.7080 | 0.9785 | nan | 0.9926 | 0.9547 | 0.8981 | 0.9864 | 0.8965 | 0.0 | 0.8676 | 0.9818 | 0.6801 | 0.0503 | 0.9164 | 0.9801 | 0.0 | nan | 0.9800 | 0.8925 | 0.8295 | 0.9756 | 0.8386 | 0.0 | 0.8273 | 0.9747 | 0.4473 | 0.0503 | 0.8491 | 0.9536 | 0.0 |
| 0.0461 | 201.82 | 2220 | 0.0848 | 0.6569 | 0.7053 | 0.9783 | nan | 0.9922 | 0.9500 | 0.9119 | 0.9873 | 0.8954 | 0.0 | 0.8532 | 0.9811 | 0.6702 | 0.0088 | 0.9332 | 0.9859 | 0.0 | nan | 0.9803 | 0.8879 | 0.8326 | 0.9755 | 0.8393 | 0.0 | 0.8167 | 0.9734 | 0.4419 | 0.0088 | 0.8422 | 0.9411 | 0.0 |
| 0.0378 | 203.64 | 2240 | 0.0843 | 0.6612 | 0.7054 | 0.9786 | nan | 0.9931 | 0.9491 | 0.9147 | 0.9851 | 0.9041 | 0.0 | 0.8671 | 0.9814 | 0.6515 | 0.0503 | 0.8972 | 0.9763 | 0.0 | nan | 0.9802 | 0.8913 | 0.8343 | 0.9756 | 0.8420 | 0.0 | 0.8097 | 0.9746 | 0.4363 | 0.0503 | 0.8518 | 0.9497 | 0.0 |
| 0.036 | 205.45 | 2260 | 0.0828 | 0.6646 | 0.7107 | 0.9785 | nan | 0.9922 | 0.9493 | 0.9124 | 0.9867 | 0.9027 | 0.0 | 0.8772 | 0.9827 | 0.6611 | 0.0869 | 0.9032 | 0.9851 | 0.0 | nan | 0.9801
| 0.8922 | 0.8340 | 0.9758 | 0.8400 | 0.0 | 0.8202 | 0.9726 | 0.4404 | 0.0869 | 0.8512 | 0.9462 | 0.0 | | 0.0397 | 207.27 | 2280 | 0.0810 | 0.6686 | 0.7164 | 0.9786 | nan | 0.9930 | 0.9487 | 0.9044 | 0.9861 | 0.8991 | 0.0 | 0.8824 | 0.9811 | 0.6631 | 0.1556 | 0.9174 | 0.9822 | 0.0 | nan | 0.9800 | 0.8932 | 0.8298 | 0.9756 | 0.8410 | 0.0 | 0.8116 | 0.9739 | 0.4364 | 0.1555 | 0.8424 | 0.9529 | 0.0 | | 0.0402 | 209.09 | 2300 | 0.0834 | 0.6609 | 0.7088 | 0.9785 | nan | 0.9917 | 0.9523 | 0.9143 | 0.9870 | 0.9083 | 0.0 | 0.8851 | 0.9811 | 0.6983 | 0.0227 | 0.9036 | 0.9703 | 0.0 | nan | 0.9802 | 0.8917 | 0.8308 | 0.9759 | 0.8415 | 0.0 | 0.8208 | 0.9741 | 0.4528 | 0.0227 | 0.8459 | 0.9558 | 0.0 | | 0.038 | 210.91 | 2320 | 0.0861 | 0.6599 | 0.7053 | 0.9783 | nan | 0.9931 | 0.9488 | 0.8902 | 0.9883 | 0.8966 | 0.0 | 0.8748 | 0.9797 | 0.6734 | 0.0308 | 0.9062 | 0.9864 | 0.0 | nan | 0.9796 | 0.8913 | 0.8260 | 0.9761 | 0.8421 | 0.0 | 0.8226 | 0.9735 | 0.4438 | 0.0308 | 0.8498 | 0.9433 | 0.0 | | 0.0377 | 212.73 | 2340 | 0.0810 | 0.6689 | 0.7186 | 0.9787 | nan | 0.9925 | 0.9463 | 0.8974 | 0.9883 | 0.8965 | 0.0 | 0.8929 | 0.9845 | 0.7125 | 0.1251 | 0.9203 | 0.9851 | 0.0 | nan | 0.9800 | 0.8947 | 0.8285 | 0.9759 | 0.8397 | 0.0 | 0.8274 | 0.9744 | 0.4608 | 0.1250 | 0.8473 | 0.9417 | 0.0 | | 0.0402 | 214.55 | 2360 | 0.0831 | 0.6611 | 0.7060 | 0.9785 | nan | 0.9927 | 0.9489 | 0.8975 | 0.9891 | 0.8952 | 0.0 | 0.8758 | 0.9818 | 0.6763 | 0.0397 | 0.8986 | 0.9822 | 0.0 | nan | 0.9801 | 0.8913 | 0.8291 | 0.9758 | 0.8401 | 0.0 | 0.8262 | 0.9752 | 0.4472 | 0.0397 | 0.8479 | 0.9422 | 0.0 | | 0.0353 | 216.36 | 2380 | 0.0834 | 0.6600 | 0.7067 | 0.9784 | nan | 0.9934 | 0.9493 | 0.8897 | 0.9871 | 0.9062 | 0.0 | 0.8682 | 0.9806 | 0.6591 | 0.0568 | 0.9112 | 0.9858 | 0.0 | nan | 0.9796 | 0.8927 | 0.8267 | 0.9761 | 0.8410 | 0.0 | 0.8177 | 0.9748 | 0.4388 | 0.0568 | 0.8377 | 0.9375 | 0.0 | | 0.0385 | 218.18 | 2400 | 0.0844 | 0.6565 | 0.7144 | 0.9781 | nan | 0.9907 | 0.9541 | 0.9213 | 0.9842 | 0.9103 | 0.0 | 0.8853 | 0.9828 | 0.6841 | 0.0634 | 0.9176 | 0.9932 | 0.0 | nan | 0.9801 | 0.8877 | 0.8297 | 0.9754 | 0.8416 | 0.0 | 0.8176 | 0.9736 | 0.4482 | 0.0634 | 0.8269 | 0.8902 | 0.0 | | 0.0391 | 220.0 | 2420 | 0.0848 | 0.6604 | 0.7103 | 0.9785 | nan | 0.9921 | 0.9436 | 0.9143 | 0.9893 | 0.8993 | 0.0 | 0.8955 | 0.9837 | 0.6555 | 0.0499 | 0.9245 | 0.9856 | 0.0 | nan | 0.9801 | 0.8912 | 0.8329 | 0.9755 | 0.8427 | 0.0 | 0.8187 | 0.9755 | 0.4360 | 0.0499 | 0.8413 | 0.9411 | 0.0 | | 0.0377 | 221.82 | 2440 | 0.0842 | 0.6619 | 0.7039 | 0.9786 | nan | 0.9936 | 0.9507 | 0.8851 | 0.9875 | 0.8976 | 0.0 | 0.8679 | 0.9831 | 0.6504 | 0.0490 | 0.9038 | 0.9818 | 0.0 | nan | 0.9797 | 0.8929 | 0.8294 | 0.9763 | 0.8440 | 0.0 | 0.8274 | 0.9757 | 0.4344 | 0.0490 | 0.8522 | 0.9432 | 0.0 | | 0.0341 | 223.64 | 2460 | 0.0862 | 0.6567 | 0.7062 | 0.9786 | nan | 0.9931 | 0.9523 | 0.8956 | 0.9865 | 0.9035 | 0.0 | 0.8779 | 0.9813 | 0.6720 | 0.0068 | 0.9279 | 0.9839 | 0.0 | nan | 0.9801 | 0.8910 | 0.8312 | 0.9764 | 0.8449 | 0.0 | 0.8228 | 0.9747 | 0.4421 | 0.0068 | 0.8180 | 0.9491 | 0.0 | | 0.0376 | 225.45 | 2480 | 0.0836 | 0.6592 | 0.7065 | 0.9786 | nan | 0.9926 | 0.9509 | 0.8982 | 0.9880 | 0.9004 | 0.0 | 0.8853 | 0.9823 | 0.6884 | 0.0116 | 0.8965 | 0.9902 | 0.0 | nan | 0.9801 | 0.8917 | 0.8318 | 0.9765 | 0.8455 | 0.0 | 0.8259 | 0.9744 | 0.4520 | 0.0116 | 0.8456 | 0.9340 | 0.0 | | 0.0368 | 227.27 | 2500 | 0.0875 | 0.6535 | 0.6999 | 0.9780 | nan | 0.9931 | 0.9507 | 0.8859 | 0.9880 | 0.9006 | 0.0 | 0.8900 | 0.9780 | 0.6299 | 0.0032 | 0.9002 | 0.9790 | 0.0 | nan | 
0.9799 | 0.8876 | 0.8168 | 0.9764 | 0.8447 | 0.0 | 0.8185 | 0.9733 | 0.4178 | 0.0032 | 0.8294 | 0.9480 | 0.0 | | 0.0421 | 229.09 | 2520 | 0.0858 | 0.6576 | 0.7081 | 0.9782 | nan | 0.9926 | 0.9516 | 0.8937 | 0.9879 | 0.9120 | 0.0 | 0.8882 | 0.9768 | 0.6814 | 0.0202 | 0.9162 | 0.9847 | 0.0 | nan | 0.9801 | 0.8887 | 0.8294 | 0.9763 | 0.8432 | 0.0 | 0.8175 | 0.9699 | 0.4485 | 0.0202 | 0.8268 | 0.9480 | 0.0 | | 0.0375 | 230.91 | 2540 | 0.0855 | 0.6602 | 0.7126 | 0.9786 | nan | 0.9923 | 0.9548 | 0.8877 | 0.9857 | 0.9147 | 0.0 | 0.8769 | 0.9833 | 0.7583 | 0.0100 | 0.9130 | 0.9866 | 0.0 | nan | 0.9802 | 0.8929 | 0.8264 | 0.9762 | 0.8426 | 0.0 | 0.8176 | 0.9735 | 0.4772 | 0.0100 | 0.8357 | 0.9508 | 0.0 | | 0.0375 | 232.73 | 2560 | 0.0810 | 0.6640 | 0.7157 | 0.9794 | nan | 0.9925 | 0.9475 | 0.9059 | 0.9874 | 0.9053 | 0.0 | 0.8918 | 0.9896 | 0.7319 | 0.0381 | 0.9279 | 0.9861 | 0.0 | nan | 0.9806 | 0.8994 | 0.8331 | 0.9765 | 0.8454 | 0.0 | 0.8167 | 0.9780 | 0.4760 | 0.0381 | 0.8480 | 0.9402 | 0.0 | | 0.0401 | 234.55 | 2580 | 0.0880 | 0.6553 | 0.7017 | 0.9776 | nan | 0.9925 | 0.9600 | 0.8894 | 0.9848 | 0.9043 | 0.0 | 0.8631 | 0.9742 | 0.6659 | 0.0036 | 0.8993 | 0.9850 | 0.0 | nan | 0.9798 | 0.8839 | 0.8221 | 0.9758 | 0.8444 | 0.0 | 0.8148 | 0.9678 | 0.4476 | 0.0036 | 0.8272 | 0.9517 | 0.0 | | 0.0349 | 236.36 | 2600 | 0.0863 | 0.6590 | 0.7059 | 0.9786 | nan | 0.9926 | 0.9568 | 0.8923 | 0.9872 | 0.9080 | 0.0 | 0.8841 | 0.9800 | 0.6914 | 0.0045 | 0.9032 | 0.9767 | 0.0 | nan | 0.9804 | 0.8917 | 0.8284 | 0.9767 | 0.8447 | 0.0 | 0.8102 | 0.9735 | 0.4559 | 0.0045 | 0.8449 | 0.9559 | 0.0 | | 0.0309 | 238.18 | 2620 | 0.0835 | 0.6605 | 0.7090 | 0.9785 | nan | 0.9928 | 0.9509 | 0.8950 | 0.9874 | 0.9060 | 0.0 | 0.8768 | 0.9810 | 0.6743 | 0.0605 | 0.9080 | 0.9841 | 0.0 | nan | 0.9798 | 0.8942 | 0.8249 | 0.9762 | 0.8425 | 0.0 | 0.8083 | 0.9743 | 0.4502 | 0.0605 | 0.8314 | 0.9441 | 0.0 | | 0.0362 | 240.0 | 2640 | 0.0838 | 0.6608 | 0.7087 | 0.9788 | nan | 0.9925 | 0.9510 | 0.9072 | 0.9870 | 0.9123 | 0.0 | 0.8707 | 0.9817 | 0.6935 | 0.0217 | 0.9151 | 0.9810 | 0.0 | nan | 0.9804 | 0.8932 | 0.8332 | 0.9760 | 0.8434 | 0.0 | 0.8194 | 0.9741 | 0.4562 | 0.0217 | 0.8463 | 0.9463 | 0.0 | | 0.0376 | 241.82 | 2660 | 0.0856 | 0.6580 | 0.7056 | 0.9786 | nan | 0.9932 | 0.9517 | 0.8892 | 0.9882 | 0.8934 | 0.0 | 0.8795 | 0.9807 | 0.6815 | 0.0101 | 0.9189 | 0.9861 | 0.0 | nan | 0.9804 | 0.8899 | 0.8323 | 0.9764 | 0.8454 | 0.0 | 0.8247 | 0.9718 | 0.4502 | 0.0101 | 0.8312 | 0.9411 | 0.0 | | 0.035 | 243.64 | 2680 | 0.0856 | 0.6591 | 0.7068 | 0.9786 | nan | 0.9929 | 0.9505 | 0.9058 | 0.9876 | 0.8970 | 0.0 | 0.8672 | 0.9784 | 0.7012 | 0.0077 | 0.9109 | 0.9886 | 0.0 | nan | 0.9809 | 0.8890 | 0.8335 | 0.9765 | 0.8480 | 0.0 | 0.8282 | 0.9684 | 0.4650 | 0.0077 | 0.8289 | 0.9429 | 0.0 | | 0.0342 | 245.45 | 2700 | 0.0832 | 0.6588 | 0.7080 | 0.9786 | nan | 0.9925 | 0.9542 | 0.8906 | 0.9888 | 0.8977 | 0.0 | 0.8798 | 0.9796 | 0.7188 | 0.0252 | 0.8903 | 0.9865 | 0.0 | nan | 0.9811 | 0.8883 | 0.8213 | 0.9765 | 0.8466 | 0.0 | 0.8208 | 0.9735 | 0.4699 | 0.0252 | 0.8197 | 0.9417 | 0.0 | | 0.0351 | 247.27 | 2720 | 0.0825 | 0.6612 | 0.7155 | 0.9786 | nan | 0.9924 | 0.9499 | 0.8994 | 0.9890 | 0.8955 | 0.0 | 0.8941 | 0.9795 | 0.7119 | 0.0725 | 0.9277 | 0.9897 | 0.0 | nan | 0.9811 | 0.8865 | 0.8290 | 0.9763 | 0.8456 | 0.0 | 0.8203 | 0.9717 | 0.4658 | 0.0724 | 0.8137 | 0.9336 | 0.0 | | 0.0325 | 249.09 | 2740 | 0.0829 | 0.6602 | 0.7026 | 0.9789 | nan | 0.9931 | 0.9534 | 0.8935 | 0.9891 | 0.8947 | 0.0 | 0.8761 | 0.9816 | 0.6643 | 0.0298 | 0.8853 | 0.9723 | 0.0 
| nan | 0.9804 | 0.8924 | 0.8345 | 0.9764 | 0.8460 | 0.0 | 0.8191 | 0.9759 | 0.4438 | 0.0298 | 0.8280 | 0.9561 | 0.0 | | 0.0359 | 250.91 | 2760 | 0.0836 | 0.6630 | 0.7101 | 0.9793 | nan | 0.9923 | 0.9567 | 0.9106 | 0.9872 | 0.9100 | 0.0 | 0.8775 | 0.9826 | 0.6926 | 0.0217 | 0.9187 | 0.9807 | 0.0 | nan | 0.9810 | 0.8949 | 0.8391 | 0.9766 | 0.8472 | 0.0 | 0.8281 | 0.9766 | 0.4555 | 0.0217 | 0.8421 | 0.9558 | 0.0 | | 0.0391 | 252.73 | 2780 | 0.0826 | 0.6633 | 0.7110 | 0.9794 | nan | 0.9930 | 0.9508 | 0.9092 | 0.9878 | 0.8997 | 0.0 | 0.8720 | 0.9834 | 0.7021 | 0.0477 | 0.9119 | 0.9860 | 0.0 | nan | 0.9810 | 0.8958 | 0.8363 | 0.9767 | 0.8476 | 0.0 | 0.8224 | 0.9764 | 0.4628 | 0.0477 | 0.8320 | 0.9438 | 0.0 | | 0.0368 | 254.55 | 2800 | 0.0839 | 0.6583 | 0.7057 | 0.9788 | nan | 0.9928 | 0.9476 | 0.9055 | 0.9884 | 0.9082 | 0.0 | 0.8918 | 0.9825 | 0.6432 | 0.0286 | 0.9000 | 0.9851 | 0.0 | nan | 0.9809 | 0.8896 | 0.8311 | 0.9766 | 0.8467 | 0.0 | 0.8230 | 0.9749 | 0.4279 | 0.0286 | 0.8317 | 0.9472 | 0.0 | | 0.0384 | 256.36 | 2820 | 0.0816 | 0.6632 | 0.7097 | 0.9794 | nan | 0.9931 | 0.9488 | 0.9034 | 0.9887 | 0.9022 | 0.0 | 0.8797 | 0.9851 | 0.7006 | 0.0271 | 0.9109 | 0.9860 | 0.0 | nan | 0.9807 | 0.8971 | 0.8366 | 0.9768 | 0.8454 | 0.0 | 0.8258 | 0.9775 | 0.4644 | 0.0271 | 0.8410 | 0.9497 | 0.0 | | 0.0387 | 258.18 | 2840 | 0.0830 | 0.6601 | 0.7076 | 0.9791 | nan | 0.9935 | 0.9480 | 0.8908 | 0.9890 | 0.8978 | 0.0 | 0.8845 | 0.9845 | 0.6822 | 0.0145 | 0.9281 | 0.9858 | 0.0 | nan | 0.9804 | 0.8938 | 0.8345 | 0.9768 | 0.8468 | 0.0 | 0.8306 | 0.9768 | 0.4522 | 0.0144 | 0.8341 | 0.9412 | 0.0 | | 0.0391 | 260.0 | 2860 | 0.0828 | 0.6615 | 0.7077 | 0.9793 | nan | 0.9935 | 0.9517 | 0.9006 | 0.9866 | 0.9141 | 0.0 | 0.8847 | 0.9840 | 0.6715 | 0.0202 | 0.9089 | 0.9846 | 0.0 | nan | 0.9809 | 0.8957 | 0.8385 | 0.9767 | 0.8483 | 0.0 | 0.8269 | 0.9768 | 0.4456 | 0.0202 | 0.8406 | 0.9496 | 0.0 | | 0.0383 | 261.82 | 2880 | 0.0830 | 0.6591 | 0.7070 | 0.9791 | nan | 0.9921 | 0.9548 | 0.9170 | 0.9865 | 0.9097 | 0.0 | 0.8922 | 0.9840 | 0.6490 | 0.0169 | 0.9007 | 0.9886 | 0.0 | nan | 0.9811 | 0.8915 | 0.8404 | 0.9767 | 0.8483 | 0.0 | 0.8274 | 0.9761 | 0.4296 | 0.0169 | 0.8433 | 0.9374 | 0.0 | | 0.0424 | 263.64 | 2900 | 0.0831 | 0.6609 | 0.7068 | 0.9792 | nan | 0.9931 | 0.9543 | 0.9003 | 0.9873 | 0.9039 | 0.0 | 0.8838 | 0.9823 | 0.6691 | 0.0178 | 0.9119 | 0.9846 | 0.0 | nan | 0.9805 | 0.8963 | 0.8344 | 0.9769 | 0.8472 | 0.0 | 0.8241 | 0.9769 | 0.4430 | 0.0178 | 0.8435 | 0.9508 | 0.0 | | 0.0343 | 265.45 | 2920 | 0.0844 | 0.6592 | 0.7068 | 0.9790 | nan | 0.9929 | 0.9542 | 0.9045 | 0.9878 | 0.9082 | 0.0 | 0.8915 | 0.9789 | 0.6700 | 0.0159 | 0.8997 | 0.9845 | 0.0 | nan | 0.9809 | 0.8918 | 0.8368 | 0.9768 | 0.8478 | 0.0 | 0.8265 | 0.9738 | 0.4467 | 0.0159 | 0.8214 | 0.9507 | 0.0 | | 0.0373 | 267.27 | 2940 | 0.0872 | 0.6591 | 0.7069 | 0.9789 | nan | 0.9930 | 0.9561 | 0.8906 | 0.9885 | 0.8908 | 0.0 | 0.8880 | 0.9806 | 0.7061 | 0.0153 | 0.8981 | 0.9824 | 0.0 | nan | 0.9812 | 0.8880 | 0.8317 | 0.9768 | 0.8479 | 0.0 | 0.8273 | 0.9748 | 0.4644 | 0.0153 | 0.8106 | 0.9507 | 0.0 | | 0.0357 | 269.09 | 2960 | 0.0843 | 0.6610 | 0.7073 | 0.9791 | nan | 0.9928 | 0.9552 | 0.9012 | 0.9872 | 0.9078 | 0.0 | 0.8697 | 0.9823 | 0.6832 | 0.0162 | 0.9137 | 0.9852 | 0.0 | nan | 0.9808 | 0.8931 | 0.8370 | 0.9769 | 0.8483 | 0.0 | 0.8239 | 0.9750 | 0.4540 | 0.0162 | 0.8370 | 0.9509 | 0.0 | | 0.0337 | 270.91 | 2980 | 0.0845 | 0.6601 | 0.7086 | 0.9791 | nan | 0.9923 | 0.9598 | 0.9059 | 0.9862 | 0.9100 | 0.0 | 0.8763 | 0.9803 | 0.7154 | 0.0175 | 0.8821 | 
0.9858 | 0.0 | nan | 0.9813 | 0.8903 | 0.8385 | 0.9768 | 0.8489 | 0.0 | 0.8203 | 0.9740 | 0.4653 | 0.0175 | 0.8205 | 0.9484 | 0.0 | | 0.0339 | 272.73 | 3000 | 0.0830 | 0.6632 | 0.7100 | 0.9795 | nan | 0.9938 | 0.9538 | 0.8848 | 0.9858 | 0.9122 | 0.0 | 0.8904 | 0.9870 | 0.7002 | 0.0340 | 0.9071 | 0.9808 | 0.0 | nan | 0.9806 | 0.8994 | 0.8315 | 0.9766 | 0.8465 | 0.0 | 0.8341 | 0.9796 | 0.4607 | 0.0339 | 0.8282 | 0.9510 | 0.0 | | 0.0394 | 274.55 | 3020 | 0.0846 | 0.6620 | 0.7093 | 0.9791 | nan | 0.9923 | 0.9555 | 0.9083 | 0.9880 | 0.9003 | 0.0 | 0.8880 | 0.9816 | 0.7051 | 0.0241 | 0.9006 | 0.9766 | 0.0 | nan | 0.9811 | 0.8900 | 0.8383 | 0.9771 | 0.8488 | 0.0 | 0.8293 | 0.9749 | 0.4630 | 0.0241 | 0.8239 | 0.9556 | 0.0 | | 0.0363 | 276.36 | 3040 | 0.0841 | 0.6617 | 0.7077 | 0.9791 | nan | 0.9931 | 0.9551 | 0.9035 | 0.9859 | 0.9143 | 0.0 | 0.8886 | 0.9810 | 0.6672 | 0.0332 | 0.8984 | 0.9799 | 0.0 | nan | 0.9808 | 0.8916 | 0.8391 | 0.9769 | 0.8470 | 0.0 | 0.8331 | 0.9750 | 0.4427 | 0.0332 | 0.8277 | 0.9555 | 0.0 | | 0.0339 | 278.18 | 3060 | 0.0821 | 0.6654 | 0.7160 | 0.9794 | nan | 0.9926 | 0.9496 | 0.9084 | 0.9888 | 0.9053 | 0.0 | 0.9075 | 0.9836 | 0.7101 | 0.0473 | 0.9265 | 0.9884 | 0.0 | nan | 0.9808 | 0.8957 | 0.8385 | 0.9769 | 0.8480 | 0.0 | 0.8256 | 0.9767 | 0.4627 | 0.0472 | 0.8515 | 0.9470 | 0.0 | | 0.0383 | 280.0 | 3080 | 0.0824 | 0.6654 | 0.7145 | 0.9796 | nan | 0.9915 | 0.9521 | 0.9197 | 0.9892 | 0.9016 | 0.0 | 0.8903 | 0.9870 | 0.7172 | 0.0360 | 0.9146 | 0.9891 | 0.0 | nan | 0.9810 | 0.8973 | 0.8384 | 0.9770 | 0.8484 | 0.0 | 0.8252 | 0.9781 | 0.4706 | 0.0360 | 0.8534 | 0.9451 | 0.0 | | 0.0381 | 281.82 | 3100 | 0.0820 | 0.6664 | 0.7131 | 0.9797 | nan | 0.9930 | 0.9512 | 0.9047 | 0.9877 | 0.9023 | 0.0 | 0.8841 | 0.9872 | 0.7214 | 0.0462 | 0.9084 | 0.9840 | 0.0 | nan | 0.9807 | 0.8990 | 0.8367 | 0.9771 | 0.8504 | 0.0 | 0.8331 | 0.9797 | 0.4660 | 0.0462 | 0.8483 | 0.9457 | 0.0 | | 0.0363 | 283.64 | 3120 | 0.0841 | 0.6636 | 0.7140 | 0.9793 | nan | 0.9930 | 0.9546 | 0.9055 | 0.9863 | 0.9069 | 0.0 | 0.9100 | 0.9816 | 0.6941 | 0.0345 | 0.9313 | 0.9844 | 0.0 | nan | 0.9808 | 0.8941 | 0.8399 | 0.9769 | 0.8496 | 0.0 | 0.8251 | 0.9754 | 0.4583 | 0.0345 | 0.8438 | 0.9485 | 0.0 | | 0.0355 | 285.45 | 3140 | 0.0826 | 0.6662 | 0.7163 | 0.9792 | nan | 0.9920 | 0.9529 | 0.9220 | 0.9863 | 0.9100 | 0.0 | 0.8870 | 0.9822 | 0.7147 | 0.0579 | 0.9169 | 0.9902 | 0.0 | nan | 0.9807 | 0.8939 | 0.8401 | 0.9770 | 0.8487 | 0.0 | 0.8326 | 0.9753 | 0.4667 | 0.0578 | 0.8519 | 0.9364 | 0.0 | | 0.043 | 287.27 | 3160 | 0.0817 | 0.6659 | 0.7115 | 0.9796 | nan | 0.9927 | 0.9564 | 0.9080 | 0.9861 | 0.9125 | 0.0 | 0.8793 | 0.9855 | 0.7131 | 0.0350 | 0.8965 | 0.9842 | 0.0 | nan | 0.9811 | 0.8955 | 0.8419 | 0.9768 | 0.8502 | 0.0 | 0.8264 | 0.9787 | 0.4704 | 0.0349 | 0.8480 | 0.9522 | 0.0 | | 0.0323 | 289.09 | 3180 | 0.0826 | 0.6673 | 0.7159 | 0.9796 | nan | 0.9927 | 0.9570 | 0.9020 | 0.9864 | 0.9132 | 0.0 | 0.8965 | 0.9843 | 0.7311 | 0.0395 | 0.9210 | 0.9835 | 0.0 | nan | 0.9811 | 0.8975 | 0.8390 | 0.9768 | 0.8482 | 0.0 | 0.8288 | 0.9776 | 0.4764 | 0.0395 | 0.8588 | 0.9510 | 0.0 | | 0.0325 | 290.91 | 3200 | 0.0826 | 0.6673 | 0.7163 | 0.9796 | nan | 0.9926 | 0.9496 | 0.9116 | 0.9884 | 0.9064 | 0.0 | 0.9004 | 0.9858 | 0.7125 | 0.0527 | 0.9233 | 0.9883 | 0.0 | nan | 0.9808 | 0.8967 | 0.8401 | 0.9771 | 0.8492 | 0.0 | 0.8369 | 0.9791 | 0.4654 | 0.0527 | 0.8502 | 0.9467 | 0.0 | | 0.0399 | 292.73 | 3220 | 0.0816 | 0.6685 | 0.7184 | 0.9796 | nan | 0.9922 | 0.9519 | 0.9184 | 0.9880 | 0.9027 | 0.0 | 0.8971 | 0.9853 | 0.7090 | 0.0883 | 
0.9192 | 0.9870 | 0.0 | nan | 0.9810 | 0.8969 | 0.8406 | 0.9771 | 0.8475 | 0.0 | 0.8299 | 0.9786 | 0.4655 | 0.0882 | 0.8509 | 0.9347 | 0.0 | | 0.0371 | 294.55 | 3240 | 0.0835 | 0.6646 | 0.7112 | 0.9792 | nan | 0.9935 | 0.9513 | 0.8942 | 0.9882 | 0.9037 | 0.0 | 0.8866 | 0.9819 | 0.6945 | 0.0492 | 0.9187 | 0.9841 | 0.0 | nan | 0.9805 | 0.8940 | 0.8363 | 0.9773 | 0.8483 | 0.0 | 0.8277 | 0.9763 | 0.4585 | 0.0492 | 0.8460 | 0.9455 | 0.0 | | 0.0321 | 296.36 | 3260 | 0.0824 | 0.6643 | 0.7131 | 0.9795 | nan | 0.9930 | 0.9487 | 0.9011 | 0.9894 | 0.9034 | 0.0 | 0.9110 | 0.9859 | 0.6915 | 0.0443 | 0.9183 | 0.9835 | 0.0 | nan | 0.9807 | 0.8967 | 0.8373 | 0.9771 | 0.8492 | 0.0 | 0.8269 | 0.9789 | 0.4582 | 0.0442 | 0.8347 | 0.9524 | 0.0 | | 0.0374 | 298.18 | 3280 | 0.0815 | 0.6681 | 0.7136 | 0.9795 | nan | 0.9929 | 0.9528 | 0.9074 | 0.9871 | 0.9110 | 0.0 | 0.9045 | 0.9852 | 0.6557 | 0.0930 | 0.9222 | 0.9650 | 0.0 | nan | 0.9805 | 0.8984 | 0.8353 | 0.9773 | 0.8497 | 0.0 | 0.8361 | 0.9783 | 0.4413 | 0.0930 | 0.8447 | 0.9503 | 0.0 | | 0.0336 | 300.0 | 3300 | 0.0822 | 0.6696 | 0.7173 | 0.9796 | nan | 0.9932 | 0.9557 | 0.8945 | 0.9867 | 0.9098 | 0.0 | 0.8913 | 0.9848 | 0.7229 | 0.0699 | 0.9302 | 0.9857 | 0.0 | nan | 0.9808 | 0.8979 | 0.8347 | 0.9772 | 0.8520 | 0.0 | 0.8377 | 0.9783 | 0.4738 | 0.0697 | 0.8470 | 0.9551 | 0.0 | | 0.0388 | 301.82 | 3320 | 0.0830 | 0.6660 | 0.7151 | 0.9797 | nan | 0.9930 | 0.9567 | 0.8974 | 0.9865 | 0.9165 | 0.0 | 0.8798 | 0.9837 | 0.7495 | 0.0542 | 0.8881 | 0.9906 | 0.0 | nan | 0.9811 | 0.8966 | 0.8403 | 0.9773 | 0.8512 | 0.0 | 0.8283 | 0.9782 | 0.4803 | 0.0542 | 0.8370 | 0.9339 | 0.0 | | 0.0356 | 303.64 | 3340 | 0.0813 | 0.6726 | 0.7240 | 0.9799 | nan | 0.9921 | 0.9534 | 0.9135 | 0.9879 | 0.9112 | 0.0 | 0.9006 | 0.9862 | 0.7526 | 0.1197 | 0.9077 | 0.9874 | 0.0 | nan | 0.9815 | 0.8984 | 0.8429 | 0.9774 | 0.8511 | 0.0 | 0.8345 | 0.9780 | 0.4815 | 0.1196 | 0.8547 | 0.9249 | 0.0 | | 0.0323 | 305.45 | 3360 | 0.0825 | 0.6701 | 0.7158 | 0.9798 | nan | 0.9938 | 0.9530 | 0.8949 | 0.9862 | 0.9125 | 0.0 | 0.9012 | 0.9855 | 0.7135 | 0.0907 | 0.9004 | 0.9733 | 0.0 | nan | 0.9809 | 0.8995 | 0.8383 | 0.9773 | 0.8512 | 0.0 | 0.8329 | 0.9780 | 0.4644 | 0.0906 | 0.8452 | 0.9529 | 0.0 | | 0.0367 | 307.27 | 3380 | 0.0831 | 0.6658 | 0.7142 | 0.9797 | nan | 0.9917 | 0.9525 | 0.9196 | 0.9893 | 0.9012 | 0.0 | 0.9058 | 0.9867 | 0.7039 | 0.0355 | 0.9119 | 0.9866 | 0.0 | nan | 0.9812 | 0.8965 | 0.8430 | 0.9771 | 0.8507 | 0.0 | 0.8337 | 0.9776 | 0.4625 | 0.0355 | 0.8482 | 0.9493 | 0.0 | | 0.033 | 309.09 | 3400 | 0.0839 | 0.6687 | 0.7169 | 0.9797 | nan | 0.9931 | 0.9521 | 0.9057 | 0.9874 | 0.9076 | 0.0 | 0.8990 | 0.9857 | 0.7082 | 0.0814 | 0.9157 | 0.9843 | 0.0 | nan | 0.9808 | 0.8982 | 0.8403 | 0.9774 | 0.8518 | 0.0 | 0.8302 | 0.9794 | 0.4616 | 0.0812 | 0.8460 | 0.9456 | 0.0 | | 0.03 | 310.91 | 3420 | 0.0843 | 0.6704 | 0.7188 | 0.9796 | nan | 0.9919 | 0.9574 | 0.9201 | 0.9862 | 0.9135 | 0.0 | 0.8970 | 0.9833 | 0.7203 | 0.0881 | 0.9062 | 0.9808 | 0.0 | nan | 0.9811 | 0.8967 | 0.8413 | 0.9772 | 0.8509 | 0.0 | 0.8305 | 0.9770 | 0.4693 | 0.0879 | 0.8459 | 0.9570 | 0.0 | | 0.0325 | 312.73 | 3440 | 0.0837 | 0.6685 | 0.7154 | 0.9799 | nan | 0.9934 | 0.9514 | 0.9019 | 0.9878 | 0.9072 | 0.0 | 0.9010 | 0.9857 | 0.7073 | 0.0734 | 0.9050 | 0.9855 | 0.0 | nan | 0.9810 | 0.9008 | 0.8367 | 0.9774 | 0.8511 | 0.0 | 0.8293 | 0.9793 | 0.4618 | 0.0733 | 0.8458 | 0.9533 | 0.0 | | 0.0349 | 314.55 | 3460 | 0.0830 | 0.6693 | 0.7189 | 0.9798 | nan | 0.9925 | 0.9550 | 0.9153 | 0.9889 | 0.8999 | 0.0 | 0.8921 | 0.9822 | 0.7168 | 
0.0903 | 0.9286 | 0.9843 | 0.0 | nan | 0.9812 | 0.9011 | 0.8372 | 0.9776 | 0.8501 | 0.0 | 0.8364 | 0.9771 | 0.4651 | 0.0902 | 0.8333 | 0.9518 | 0.0 | | 0.0335 | 316.36 | 3480 | 0.0832 | 0.6670 | 0.7149 | 0.9798 | nan | 0.9934 | 0.9505 | 0.9023 | 0.9873 | 0.9074 | 0.0 | 0.9016 | 0.9869 | 0.7127 | 0.0649 | 0.9006 | 0.9862 | 0.0 | nan | 0.9810 | 0.9002 | 0.8368 | 0.9773 | 0.8521 | 0.0 | 0.8417 | 0.9790 | 0.4665 | 0.0647 | 0.8368 | 0.9353 | 0.0 | | 0.0352 | 318.18 | 3500 | 0.0838 | 0.6657 | 0.7119 | 0.9795 | nan | 0.9927 | 0.9543 | 0.9091 | 0.9870 | 0.9095 | 0.0 | 0.8942 | 0.9838 | 0.7181 | 0.0459 | 0.8762 | 0.9835 | 0.0 | nan | 0.9809 | 0.8966 | 0.8390 | 0.9774 | 0.8525 | 0.0 | 0.8298 | 0.9767 | 0.4687 | 0.0458 | 0.8331 | 0.9537 | 0.0 | | 0.0345 | 320.0 | 3520 | 0.0844 | 0.6682 | 0.7150 | 0.9797 | nan | 0.9929 | 0.9530 | 0.9081 | 0.9874 | 0.9133 | 0.0 | 0.9066 | 0.9836 | 0.7079 | 0.0647 | 0.8968 | 0.9804 | 0.0 | nan | 0.9808 | 0.8988 | 0.8393 | 0.9773 | 0.8510 | 0.0 | 0.8332 | 0.9776 | 0.4651 | 0.0646 | 0.8432 | 0.9556 | 0.0 | | 0.0391 | 321.82 | 3540 | 0.0839 | 0.6676 | 0.7146 | 0.9799 | nan | 0.9936 | 0.9534 | 0.9030 | 0.9857 | 0.9160 | 0.0 | 0.9098 | 0.9862 | 0.7059 | 0.0520 | 0.8988 | 0.9860 | 0.0 | nan | 0.9811 | 0.8992 | 0.8427 | 0.9771 | 0.8524 | 0.0 | 0.8337 | 0.9793 | 0.4620 | 0.0520 | 0.8478 | 0.9514 | 0.0 | | 0.0336 | 323.64 | 3560 | 0.0838 | 0.6681 | 0.7145 | 0.9798 | nan | 0.9934 | 0.9548 | 0.9082 | 0.9862 | 0.9166 | 0.0 | 0.8975 | 0.9818 | 0.7104 | 0.0560 | 0.9002 | 0.9831 | 0.0 | nan | 0.9816 | 0.8956 | 0.8447 | 0.9772 | 0.8522 | 0.0 | 0.8382 | 0.9752 | 0.4677 | 0.0559 | 0.8481 | 0.9493 | 0.0 | | 0.0346 | 325.45 | 3580 | 0.0818 | 0.6717 | 0.7160 | 0.9799 | nan | 0.9931 | 0.9544 | 0.9039 | 0.9876 | 0.9085 | 0.0 | 0.8858 | 0.9845 | 0.7112 | 0.1194 | 0.8786 | 0.9812 | 0.0 | nan | 0.9811 | 0.8995 | 0.8397 | 0.9773 | 0.8543 | 0.0 | 0.8331 | 0.9768 | 0.4675 | 0.1192 | 0.8298 | 0.9540 | 0.0 | | 0.0352 | 327.27 | 3600 | 0.0833 | 0.6668 | 0.7124 | 0.9797 | nan | 0.9932 | 0.9528 | 0.9048 | 0.9871 | 0.9154 | 0.0 | 0.8966 | 0.9850 | 0.6883 | 0.0641 | 0.8915 | 0.9818 | 0.0 | nan | 0.9811 | 0.8984 | 0.8376 | 0.9772 | 0.8517 | 0.0 | 0.8309 | 0.9784 | 0.4574 | 0.0641 | 0.8394 | 0.9524 | 0.0 | | 0.0319 | 329.09 | 3620 | 0.0845 | 0.6671 | 0.7174 | 0.9797 | nan | 0.9919 | 0.9551 | 0.9202 | 0.9877 | 0.9112 | 0.0 | 0.8970 | 0.9831 | 0.7500 | 0.0384 | 0.9066 | 0.9851 | 0.0 | nan | 0.9815 | 0.8954 | 0.8420 | 0.9776 | 0.8538 | 0.0 | 0.8304 | 0.9769 | 0.4793 | 0.0384 | 0.8426 | 0.9547 | 0.0 | | 0.0336 | 330.91 | 3640 | 0.0818 | 0.6728 | 0.7215 | 0.9800 | nan | 0.9931 | 0.9519 | 0.9099 | 0.9876 | 0.9123 | 0.0 | 0.9019 | 0.9838 | 0.7339 | 0.1136 | 0.9043 | 0.9873 | 0.0 | nan | 0.9814 | 0.8987 | 0.8428 | 0.9775 | 0.8533 | 0.0 | 0.8334 | 0.9770 | 0.4779 | 0.1133 | 0.8388 | 0.9519 | 0.0 | | 0.0337 | 332.73 | 3660 | 0.0838 | 0.6715 | 0.7202 | 0.9797 | nan | 0.9934 | 0.9504 | 0.9035 | 0.9877 | 0.9139 | 0.0 | 0.9029 | 0.9822 | 0.7108 | 0.1165 | 0.9144 | 0.9863 | 0.0 | nan | 0.9810 | 0.8976 | 0.8403 | 0.9775 | 0.8521 | 0.0 | 0.8339 | 0.9757 | 0.4644 | 0.1160 | 0.8384 | 0.9529 | 0.0 | | 0.0345 | 334.55 | 3680 | 0.0830 | 0.6716 | 0.7203 | 0.9799 | nan | 0.9928 | 0.9494 | 0.9115 | 0.9894 | 0.9022 | 0.0 | 0.8938 | 0.9844 | 0.7401 | 0.1050 | 0.9091 | 0.9856 | 0.0 | nan | 0.9812 | 0.8972 | 0.8435 | 0.9776 | 0.8533 | 0.0 | 0.8323 | 0.9771 | 0.4743 | 0.1044 | 0.8354 | 0.9549 | 0.0 | | 0.0308 | 336.36 | 3700 | 0.0844 | 0.6669 | 0.7137 | 0.9797 | nan | 0.9928 | 0.9555 | 0.9154 | 0.9857 | 0.9120 | 0.0 | 0.8952 | 0.9849 | 
0.6996 | 0.0588 | 0.8942 | 0.9844 | 0.0 | nan | 0.9813 | 0.8952 | 0.8455 | 0.9771 | 0.8533 | 0.0 | 0.8336 | 0.9774 | 0.4615 | 0.0587 | 0.8367 | 0.9489 | 0.0 | | 0.0302 | 338.18 | 3720 | 0.0838 | 0.6667 | 0.7130 | 0.9798 | nan | 0.9932 | 0.9546 | 0.9021 | 0.9882 | 0.9072 | 0.0 | 0.8943 | 0.9840 | 0.7031 | 0.0535 | 0.9043 | 0.9852 | 0.0 | nan | 0.9814 | 0.8968 | 0.8428 | 0.9776 | 0.8535 | 0.0 | 0.8313 | 0.9774 | 0.4621 | 0.0533 | 0.8439 | 0.9475 | 0.0 | | 0.0301 | 340.0 | 3740 | 0.0841 | 0.6695 | 0.7134 | 0.9799 | nan | 0.9933 | 0.9544 | 0.9042 | 0.9873 | 0.9094 | 0.0 | 0.8942 | 0.9848 | 0.6984 | 0.0769 | 0.8944 | 0.9774 | 0.0 | nan | 0.9812 | 0.8982 | 0.8419 | 0.9777 | 0.8543 | 0.0 | 0.8346 | 0.9778 | 0.4603 | 0.0767 | 0.8427 | 0.9577 | 0.0 | | 0.0285 | 341.82 | 3760 | 0.0843 | 0.6706 | 0.7172 | 0.9798 | nan | 0.9931 | 0.9518 | 0.9109 | 0.9878 | 0.9104 | 0.0 | 0.9037 | 0.9828 | 0.6922 | 0.0964 | 0.9102 | 0.9836 | 0.0 | nan | 0.9811 | 0.8977 | 0.8430 | 0.9776 | 0.8536 | 0.0 | 0.8300 | 0.9764 | 0.4557 | 0.0961 | 0.8528 | 0.9531 | 0.0 | | 0.0382 | 343.64 | 3780 | 0.0854 | 0.6670 | 0.7118 | 0.9797 | nan | 0.9938 | 0.9509 | 0.9013 | 0.9879 | 0.9098 | 0.0 | 0.8912 | 0.9826 | 0.6746 | 0.0703 | 0.9039 | 0.9871 | 0.0 | nan | 0.9808 | 0.8979 | 0.8413 | 0.9776 | 0.8539 | 0.0 | 0.8297 | 0.9768 | 0.4482 | 0.0702 | 0.8457 | 0.9491 | 0.0 | | 0.0316 | 345.45 | 3800 | 0.0838 | 0.6690 | 0.7170 | 0.9799 | nan | 0.9928 | 0.9526 | 0.9139 | 0.9878 | 0.9132 | 0.0 | 0.8975 | 0.9844 | 0.7144 | 0.0762 | 0.9002 | 0.9884 | 0.0 | nan | 0.9814 | 0.8982 | 0.8437 | 0.9776 | 0.8537 | 0.0 | 0.8346 | 0.9776 | 0.4685 | 0.0761 | 0.8439 | 0.9417 | 0.0 | | 0.03 | 347.27 | 3820 | 0.0828 | 0.6709 | 0.7159 | 0.9799 | nan | 0.9938 | 0.9521 | 0.9017 | 0.9876 | 0.9069 | 0.0 | 0.9003 | 0.9840 | 0.7115 | 0.0878 | 0.8999 | 0.9810 | 0.0 | nan | 0.9813 | 0.8981 | 0.8430 | 0.9775 | 0.8546 | 0.0 | 0.8387 | 0.9774 | 0.4669 | 0.0876 | 0.8428 | 0.9537 | 0.0 | | 0.0333 | 349.09 | 3840 | 0.0844 | 0.6671 | 0.7147 | 0.9797 | nan | 0.9930 | 0.9546 | 0.9114 | 0.9875 | 0.9084 | 0.0 | 0.8952 | 0.9817 | 0.7053 | 0.0653 | 0.9007 | 0.9877 | 0.0 | nan | 0.9814 | 0.8937 | 0.8461 | 0.9777 | 0.8551 | 0.0 | 0.8344 | 0.9762 | 0.4611 | 0.0651 | 0.8312 | 0.9503 | 0.0 | | 0.0319 | 350.91 | 3860 | 0.0840 | 0.6684 | 0.7152 | 0.9799 | nan | 0.9926 | 0.9528 | 0.9167 | 0.9881 | 0.9058 | 0.0 | 0.8987 | 0.9853 | 0.6963 | 0.0739 | 0.9041 | 0.9834 | 0.0 | nan | 0.9811 | 0.8984 | 0.8414 | 0.9775 | 0.8540 | 0.0 | 0.8329 | 0.9791 | 0.4581 | 0.0737 | 0.8379 | 0.9547 | 0.0 | | 0.0308 | 352.73 | 3880 | 0.0837 | 0.6681 | 0.7146 | 0.9797 | nan | 0.9923 | 0.9530 | 0.9171 | 0.9880 | 0.9132 | 0.0 | 0.9054 | 0.9855 | 0.6735 | 0.0863 | 0.8883 | 0.9869 | 0.0 | nan | 0.9810 | 0.8961 | 0.8433 | 0.9775 | 0.8534 | 0.0 | 0.8349 | 0.9793 | 0.4456 | 0.0860 | 0.8334 | 0.9542 | 0.0 | | 0.0359 | 354.55 | 3900 | 0.0843 | 0.6680 | 0.7124 | 0.9798 | nan | 0.9931 | 0.9563 | 0.9074 | 0.9864 | 0.9137 | 0.0 | 0.8989 | 0.9841 | 0.6872 | 0.0682 | 0.8805 | 0.9850 | 0.0 | nan | 0.9811 | 0.8974 | 0.8424 | 0.9774 | 0.8543 | 0.0 | 0.8335 | 0.9783 | 0.4543 | 0.0681 | 0.8375 | 0.9594 | 0.0 | | 0.0389 | 356.36 | 3920 | 0.0849 | 0.6678 | 0.7163 | 0.9798 | nan | 0.9923 | 0.9553 | 0.9071 | 0.9883 | 0.9091 | 0.0 | 0.9108 | 0.9856 | 0.7119 | 0.0583 | 0.9054 | 0.9880 | 0.0 | nan | 0.9813 | 0.8947 | 0.8452 | 0.9773 | 0.8543 | 0.0 | 0.8278 | 0.9786 | 0.4647 | 0.0582 | 0.8439 | 0.9558 | 0.0 | | 0.0331 | 358.18 | 3940 | 0.0842 | 0.6678 | 0.7152 | 0.9798 | nan | 0.9929 | 0.9513 | 0.9104 | 0.9887 | 0.9052 | 0.0 | 0.9001 | 
0.9844 | 0.6991 | 0.0631 | 0.9132 | 0.9894 | 0.0 | nan | 0.9811 | 0.8958 | 0.8454 | 0.9775 | 0.8543 | 0.0 | 0.8325 | 0.9787 | 0.4566 | 0.0629 | 0.8511 | 0.9459 | 0.0 | | 0.0361 | 360.0 | 3960 | 0.0851 | 0.6690 | 0.7144 | 0.9800 | nan | 0.9937 | 0.9526 | 0.9018 | 0.9876 | 0.9101 | 0.0 | 0.9083 | 0.9846 | 0.7001 | 0.0729 | 0.8929 | 0.9828 | 0.0 | nan | 0.9813 | 0.8987 | 0.8445 | 0.9777 | 0.8545 | 0.0 | 0.8331 | 0.9785 | 0.4589 | 0.0727 | 0.8425 | 0.9544 | 0.0 | | 0.0311 | 361.82 | 3980 | 0.0867 | 0.6673 | 0.7155 | 0.9796 | nan | 0.9929 | 0.9553 | 0.9038 | 0.9877 | 0.9096 | 0.0 | 0.9086 | 0.9829 | 0.6951 | 0.0747 | 0.9086 | 0.9826 | 0.0 | nan | 0.9813 | 0.8946 | 0.8413 | 0.9774 | 0.8550 | 0.0 | 0.8332 | 0.9775 | 0.4521 | 0.0744 | 0.8295 | 0.9589 | 0.0 | | 0.045 | 363.64 | 4000 | 0.0849 | 0.6682 | 0.7138 | 0.9800 | nan | 0.9930 | 0.9545 | 0.9088 | 0.9876 | 0.9067 | 0.0 | 0.8957 | 0.9862 | 0.6906 | 0.0699 | 0.9011 | 0.9855 | 0.0 | nan | 0.9812 | 0.8993 | 0.8433 | 0.9776 | 0.8550 | 0.0 | 0.8319 | 0.9797 | 0.4525 | 0.0697 | 0.8463 | 0.9508 | 0.0 | | 0.0333 | 365.45 | 4020 | 0.0847 | 0.6683 | 0.7146 | 0.9799 | nan | 0.9934 | 0.9485 | 0.9126 | 0.9876 | 0.9119 | 0.0 | 0.8958 | 0.9858 | 0.6939 | 0.0733 | 0.8981 | 0.9884 | 0.0 | nan | 0.9810 | 0.8980 | 0.8446 | 0.9775 | 0.8540 | 0.0 | 0.8340 | 0.9796 | 0.4563 | 0.0731 | 0.8474 | 0.9424 | 0.0 | | 0.0314 | 367.27 | 4040 | 0.0848 | 0.6665 | 0.7114 | 0.9800 | nan | 0.9931 | 0.9534 | 0.9106 | 0.9884 | 0.8989 | 0.0 | 0.9062 | 0.9856 | 0.6772 | 0.0595 | 0.8903 | 0.9845 | 0.0 | nan | 0.9812 | 0.8983 | 0.8448 | 0.9778 | 0.8543 | 0.0 | 0.8348 | 0.9793 | 0.4487 | 0.0593 | 0.8358 | 0.9498 | 0.0 | | 0.0294 | 369.09 | 4060 | 0.0853 | 0.6693 | 0.7160 | 0.9798 | nan | 0.9929 | 0.9535 | 0.9124 | 0.9873 | 0.9078 | 0.0 | 0.8968 | 0.9830 | 0.7184 | 0.0738 | 0.8974 | 0.9844 | 0.0 | nan | 0.9812 | 0.8958 | 0.8449 | 0.9775 | 0.8548 | 0.0 | 0.8339 | 0.9765 | 0.4670 | 0.0736 | 0.8426 | 0.9530 | 0.0 | | 0.0342 | 370.91 | 4080 | 0.0853 | 0.6711 | 0.7174 | 0.9800 | nan | 0.9935 | 0.9538 | 0.9044 | 0.9876 | 0.9094 | 0.0 | 0.9037 | 0.9826 | 0.7193 | 0.0748 | 0.9144 | 0.9825 | 0.0 | nan | 0.9814 | 0.8968 | 0.8462 | 0.9776 | 0.8549 | 0.0 | 0.8362 | 0.9765 | 0.4707 | 0.0745 | 0.8514 | 0.9581 | 0.0 | | 0.0312 | 372.73 | 4100 | 0.0825 | 0.6724 | 0.7207 | 0.9801 | nan | 0.9929 | 0.9526 | 0.9128 | 0.9884 | 0.9067 | 0.0 | 0.9055 | 0.9852 | 0.7029 | 0.1121 | 0.9238 | 0.9856 | 0.0 | nan | 0.9815 | 0.8987 | 0.8449 | 0.9778 | 0.8549 | 0.0 | 0.8333 | 0.9782 | 0.4623 | 0.1117 | 0.8500 | 0.9483 | 0.0 | | 0.0317 | 374.55 | 4120 | 0.0843 | 0.6704 | 0.7172 | 0.9800 | nan | 0.9931 | 0.9540 | 0.9096 | 0.9878 | 0.9092 | 0.0 | 0.8975 | 0.9840 | 0.7138 | 0.0753 | 0.9125 | 0.9872 | 0.0 | nan | 0.9813 | 0.8984 | 0.8446 | 0.9778 | 0.8548 | 0.0 | 0.8350 | 0.9780 | 0.4646 | 0.0752 | 0.8562 | 0.9496 | 0.0 | | 0.0353 | 376.36 | 4140 | 0.0848 | 0.6708 | 0.7179 | 0.9799 | nan | 0.9924 | 0.9552 | 0.9156 | 0.9885 | 0.9054 | 0.0 | 0.9019 | 0.9838 | 0.7214 | 0.0799 | 0.9027 | 0.9857 | 0.0 | nan | 0.9815 | 0.8969 | 0.8448 | 0.9778 | 0.8553 | 0.0 | 0.8302 | 0.9770 | 0.4715 | 0.0796 | 0.8527 | 0.9529 | 0.0 | | 0.0338 | 378.18 | 4160 | 0.0834 | 0.6714 | 0.7183 | 0.9800 | nan | 0.9928 | 0.9504 | 0.9170 | 0.9884 | 0.9051 | 0.0 | 0.9128 | 0.9856 | 0.7067 | 0.0883 | 0.9048 | 0.9861 | 0.0 | nan | 0.9812 | 0.8974 | 0.8450 | 0.9778 | 0.8551 | 0.0 | 0.8334 | 0.9785 | 0.4630 | 0.0881 | 0.8555 | 0.9528 | 0.0 | | 0.0324 | 380.0 | 4180 | 0.0855 | 0.6693 | 0.7121 | 0.9798 | nan | 0.9936 | 0.9523 | 0.9013 | 0.9879 | 0.9114 | 0.0 | 
0.8953 | 0.9840 | 0.6890 | 0.0692 | 0.8910 | 0.9828 | 0.0 | nan | 0.9811 | 0.8969 | 0.8436 | 0.9777 | 0.8546 | 0.0 | 0.8400 | 0.9774 | 0.4555 | 0.0689 | 0.8484 | 0.9569 | 0.0 | | 0.0333 | 381.82 | 4200 | 0.0868 | 0.6687 | 0.7137 | 0.9798 | nan | 0.9929 | 0.9528 | 0.9079 | 0.9881 | 0.9087 | 0.0 | 0.9049 | 0.9861 | 0.6805 | 0.0759 | 0.8936 | 0.9871 | 0.0 | nan | 0.9811 | 0.8964 | 0.8437 | 0.9777 | 0.8553 | 0.0 | 0.8392 | 0.9782 | 0.4517 | 0.0756 | 0.8426 | 0.9512 | 0.0 | | 0.0293 | 383.64 | 4220 | 0.0857 | 0.6706 | 0.7195 | 0.9799 | nan | 0.9924 | 0.9533 | 0.9133 | 0.9877 | 0.9124 | 0.0 | 0.9155 | 0.9851 | 0.7200 | 0.0847 | 0.9032 | 0.9860 | 0.0 | nan | 0.9812 | 0.8970 | 0.8434 | 0.9778 | 0.8559 | 0.0 | 0.8358 | 0.9781 | 0.4680 | 0.0844 | 0.8447 | 0.9521 | 0.0 | | 0.0319 | 385.45 | 4240 | 0.0858 | 0.6713 | 0.7181 | 0.9798 | nan | 0.9928 | 0.9549 | 0.9097 | 0.9866 | 0.9176 | 0.0 | 0.9020 | 0.9839 | 0.7280 | 0.0880 | 0.8899 | 0.9815 | 0.0 | nan | 0.9812 | 0.8969 | 0.8435 | 0.9775 | 0.8543 | 0.0 | 0.8349 | 0.9775 | 0.4735 | 0.0875 | 0.8414 | 0.9582 | 0.0 | | 0.03 | 387.27 | 4260 | 0.0848 | 0.6712 | 0.7174 | 0.9799 | nan | 0.9930 | 0.9560 | 0.9037 | 0.9863 | 0.9119 | 0.0 | 0.9116 | 0.9860 | 0.7047 | 0.0905 | 0.8988 | 0.9834 | 0.0 | nan | 0.9811 | 0.8976 | 0.8407 | 0.9777 | 0.8556 | 0.0 | 0.8374 | 0.9786 | 0.4648 | 0.0899 | 0.8454 | 0.9563 | 0.0 | | 0.0274 | 389.09 | 4280 | 0.0846 | 0.6727 | 0.7185 | 0.9800 | nan | 0.9927 | 0.9527 | 0.9146 | 0.9884 | 0.9100 | 0.0 | 0.8988 | 0.9845 | 0.7151 | 0.1003 | 0.8990 | 0.9842 | 0.0 | nan | 0.9812 | 0.8983 | 0.8438 | 0.9780 | 0.8552 | 0.0 | 0.8385 | 0.9779 | 0.4681 | 0.1000 | 0.8465 | 0.9580 | 0.0 | | 0.0342 | 390.91 | 4300 | 0.0840 | 0.6727 | 0.7178 | 0.9801 | nan | 0.9929 | 0.9530 | 0.9151 | 0.9880 | 0.9129 | 0.0 | 0.9062 | 0.9848 | 0.6918 | 0.0974 | 0.9061 | 0.9832 | 0.0 | nan | 0.9813 | 0.8987 | 0.8466 | 0.9779 | 0.8558 | 0.0 | 0.8399 | 0.9783 | 0.4569 | 0.0972 | 0.8551 | 0.9572 | 0.0 | | 0.0317 | 392.73 | 4320 | 0.0851 | 0.6710 | 0.7160 | 0.9801 | nan | 0.9929 | 0.9538 | 0.9143 | 0.9876 | 0.9107 | 0.0 | 0.9017 | 0.9857 | 0.6887 | 0.0822 | 0.9073 | 0.9830 | 0.0 | nan | 0.9813 | 0.8979 | 0.8461 | 0.9779 | 0.8565 | 0.0 | 0.8402 | 0.9789 | 0.4559 | 0.0820 | 0.8504 | 0.9562 | 0.0 | | 0.0296 | 394.55 | 4340 | 0.0853 | 0.6710 | 0.7186 | 0.9800 | nan | 0.9922 | 0.9535 | 0.9170 | 0.9889 | 0.9058 | 0.0 | 0.9117 | 0.9858 | 0.7070 | 0.0870 | 0.9059 | 0.9873 | 0.0 | nan | 0.9813 | 0.8970 | 0.8452 | 0.9779 | 0.8558 | 0.0 | 0.8402 | 0.9793 | 0.4600 | 0.0867 | 0.8495 | 0.9502 | 0.0 | | 0.0324 | 396.36 | 4360 | 0.0868 | 0.6724 | 0.7180 | 0.9802 | nan | 0.9932 | 0.9510 | 0.9106 | 0.9887 | 0.9065 | 0.0 | 0.9045 | 0.9858 | 0.7138 | 0.0846 | 0.9114 | 0.9838 | 0.0 | nan | 0.9815 | 0.8982 | 0.8468 | 0.9777 | 0.8564 | 0.0 | 0.8414 | 0.9785 | 0.4669 | 0.0844 | 0.8549 | 0.9542 | 0.0 | | 0.0284 | 398.18 | 4380 | 0.0839 | 0.6759 | 0.7217 | 0.9800 | nan | 0.9928 | 0.9556 | 0.9128 | 0.9874 | 0.9132 | 0.0 | 0.9069 | 0.9836 | 0.6843 | 0.1583 | 0.9057 | 0.9815 | 0.0 | nan | 0.9812 | 0.8980 | 0.8460 | 0.9779 | 0.8571 | 0.0 | 0.8371 | 0.9782 | 0.4527 | 0.1575 | 0.8445 | 0.9562 | 0.0 | | 0.0449 | 400.0 | 4400 | 0.0831 | 0.6768 | 0.7234 | 0.9803 | nan | 0.9932 | 0.9534 | 0.9052 | 0.9875 | 0.9107 | 0.0 | 0.9044 | 0.9874 | 0.7294 | 0.1453 | 0.9039 | 0.9836 | 0.0 | nan | 0.9816 | 0.8993 | 0.8468 | 0.9778 | 0.8567 | 0.0 | 0.8409 | 0.9795 | 0.4722 | 0.1448 | 0.8451 | 0.9534 | 0.0 | | 0.0266 | 401.82 | 4420 | 0.0828 | 0.6786 | 0.7248 | 0.9802 | nan | 0.9931 | 0.9506 | 0.9100 | 0.9894 | 0.9074 | 
0.0 | 0.9045 | 0.9845 | 0.7097 | 0.1842 | 0.9020 | 0.9873 | 0.0 | nan | 0.9814 | 0.8996 | 0.8479 | 0.9777 | 0.8542 | 0.0 | 0.8383 | 0.9783 | 0.4633 | 0.1835 | 0.8455 | 0.9523 | 0.0 | | 0.0255 | 403.64 | 4440 | 0.0851 | 0.6717 | 0.7173 | 0.9799 | nan | 0.9926 | 0.9566 | 0.9097 | 0.9876 | 0.9100 | 0.0 | 0.9012 | 0.9836 | 0.7026 | 0.1092 | 0.8881 | 0.9836 | 0.0 | nan | 0.9814 | 0.8951 | 0.8473 | 0.9778 | 0.8558 | 0.0 | 0.8401 | 0.9763 | 0.4608 | 0.1088 | 0.8327 | 0.9562 | 0.0 | | 0.0336 | 405.45 | 4460 | 0.0844 | 0.6708 | 0.7172 | 0.9798 | nan | 0.9929 | 0.9529 | 0.9079 | 0.9886 | 0.9110 | 0.0 | 0.9042 | 0.9829 | 0.6966 | 0.1008 | 0.8981 | 0.9874 | 0.0 | nan | 0.9814 | 0.8946 | 0.8477 | 0.9778 | 0.8552 | 0.0 | 0.8361 | 0.9760 | 0.4576 | 0.1005 | 0.8419 | 0.9518 | 0.0 | | 0.0289 | 407.27 | 4480 | 0.0841 | 0.6737 | 0.7197 | 0.9800 | nan | 0.9928 | 0.9540 | 0.9124 | 0.9886 | 0.9063 | 0.0 | 0.9067 | 0.9832 | 0.7028 | 0.1169 | 0.9087 | 0.9836 | 0.0 | nan | 0.9814 | 0.8974 | 0.8460 | 0.9780 | 0.8563 | 0.0 | 0.8348 | 0.9761 | 0.4634 | 0.1164 | 0.8529 | 0.9559 | 0.0 | | 0.0276 | 409.09 | 4500 | 0.0852 | 0.6724 | 0.7180 | 0.9799 | nan | 0.9932 | 0.9520 | 0.9133 | 0.9880 | 0.9117 | 0.0 | 0.9032 | 0.9817 | 0.6858 | 0.1129 | 0.9086 | 0.9835 | 0.0 | nan | 0.9813 | 0.8962 | 0.8466 | 0.9779 | 0.8554 | 0.0 | 0.8338 | 0.9753 | 0.4554 | 0.1124 | 0.8507 | 0.9561 | 0.0 | | 0.0335 | 410.91 | 4520 | 0.0850 | 0.6724 | 0.7168 | 0.9800 | nan | 0.9934 | 0.9544 | 0.9079 | 0.9879 | 0.9098 | 0.0 | 0.8938 | 0.9826 | 0.7132 | 0.0872 | 0.9087 | 0.9796 | 0.0 | nan | 0.9815 | 0.8974 | 0.8465 | 0.9779 | 0.8559 | 0.0 | 0.8373 | 0.9759 | 0.4687 | 0.0868 | 0.8540 | 0.9599 | 0.0 | | 0.0345 | 412.73 | 4540 | 0.0845 | 0.6726 | 0.7161 | 0.9800 | nan | 0.9939 | 0.9522 | 0.8977 | 0.9885 | 0.9057 | 0.0 | 0.8982 | 0.9832 | 0.6894 | 0.1065 | 0.9093 | 0.9847 | 0.0 | nan | 0.9812 | 0.8985 | 0.8416 | 0.9779 | 0.8562 | 0.0 | 0.8361 | 0.9770 | 0.4581 | 0.1062 | 0.8555 | 0.9549 | 0.0 | | 0.0281 | 414.55 | 4560 | 0.0839 | 0.6737 | 0.7200 | 0.9803 | nan | 0.9933 | 0.9523 | 0.9091 | 0.9883 | 0.9131 | 0.0 | 0.9025 | 0.9856 | 0.7077 | 0.1080 | 0.9155 | 0.9843 | 0.0 | nan | 0.9816 | 0.9001 | 0.8479 | 0.9780 | 0.8557 | 0.0 | 0.8355 | 0.9784 | 0.4649 | 0.1076 | 0.8553 | 0.9528 | 0.0 | | 0.0329 | 416.36 | 4580 | 0.0848 | 0.6707 | 0.7152 | 0.9802 | nan | 0.9935 | 0.9557 | 0.9010 | 0.9881 | 0.9104 | 0.0 | 0.8989 | 0.9846 | 0.7053 | 0.0752 | 0.9016 | 0.9827 | 0.0 | nan | 0.9817 | 0.8979 | 0.8473 | 0.9781 | 0.8567 | 0.0 | 0.8391 | 0.9782 | 0.4635 | 0.0749 | 0.8444 | 0.9580 | 0.0 | | 0.0295 | 418.18 | 4600 | 0.0849 | 0.6744 | 0.7225 | 0.9801 | nan | 0.9927 | 0.9547 | 0.9132 | 0.9888 | 0.9063 | 0.0 | 0.9052 | 0.9828 | 0.7186 | 0.1243 | 0.9197 | 0.9867 | 0.0 | nan | 0.9815 | 0.8980 | 0.8482 | 0.9780 | 0.8555 | 0.0 | 0.8374 | 0.9766 | 0.4678 | 0.1235 | 0.8520 | 0.9489 | 0.0 | | 0.0316 | 420.0 | 4620 | 0.0844 | 0.6726 | 0.7184 | 0.9801 | nan | 0.9932 | 0.9556 | 0.9080 | 0.9878 | 0.9083 | 0.0 | 0.9016 | 0.9837 | 0.6995 | 0.1140 | 0.9027 | 0.9845 | 0.0 | nan | 0.9816 | 0.8968 | 0.8478 | 0.9780 | 0.8563 | 0.0 | 0.8372 | 0.9772 | 0.4628 | 0.1134 | 0.8417 | 0.9505 | 0.0 | | 0.0271 | 421.82 | 4640 | 0.0842 | 0.6741 | 0.7192 | 0.9801 | nan | 0.9933 | 0.9508 | 0.9108 | 0.9887 | 0.9081 | 0.0 | 0.9018 | 0.9838 | 0.7080 | 0.1181 | 0.9055 | 0.9806 | 0.0 | nan | 0.9815 | 0.8977 | 0.8463 | 0.9780 | 0.8560 | 0.0 | 0.8383 | 0.9768 | 0.4668 | 0.1176 | 0.8513 | 0.9534 | 0.0 | | 0.0332 | 423.64 | 4660 | 0.0842 | 0.6736 | 0.7193 | 0.9801 | nan | 0.9932 | 0.9529 | 0.9103 | 0.9882 | 
0.9091 | 0.0 | 0.8993 | 0.9840 | 0.7060 | 0.1218 | 0.9031 | 0.9835 | 0.0 | nan | 0.9814 | 0.8980 | 0.8468 | 0.9780 | 0.8559 | 0.0 | 0.8379 | 0.9776 | 0.4641 | 0.1212 | 0.8446 | 0.9515 | 0.0 | | 0.0288 | 425.45 | 4680 | 0.0843 | 0.6729 | 0.7191 | 0.9802 | nan | 0.9930 | 0.9532 | 0.9136 | 0.9884 | 0.9094 | 0.0 | 0.9011 | 0.9845 | 0.7048 | 0.1080 | 0.9096 | 0.9826 | 0.0 | nan | 0.9815 | 0.8987 | 0.8467 | 0.9781 | 0.8571 | 0.0 | 0.8389 | 0.9779 | 0.4628 | 0.1076 | 0.8471 | 0.9515 | 0.0 | | 0.0309 | 427.27 | 4700 | 0.0837 | 0.6738 | 0.7197 | 0.9803 | nan | 0.9934 | 0.9536 | 0.9090 | 0.9879 | 0.9102 | 0.0 | 0.9024 | 0.9852 | 0.6985 | 0.1169 | 0.9126 | 0.9861 | 0.0 | nan | 0.9815 | 0.8998 | 0.8480 | 0.9781 | 0.8577 | 0.0 | 0.8368 | 0.9784 | 0.4614 | 0.1163 | 0.8507 | 0.9510 | 0.0 | | 0.0306 | 429.09 | 4720 | 0.0826 | 0.6755 | 0.7218 | 0.9803 | nan | 0.9933 | 0.9501 | 0.9104 | 0.9893 | 0.9085 | 0.0 | 0.9010 | 0.9856 | 0.7017 | 0.1427 | 0.9134 | 0.9882 | 0.0 | nan | 0.9816 | 0.8998 | 0.8474 | 0.9779 | 0.8569 | 0.0 | 0.8374 | 0.9791 | 0.4633 | 0.1421 | 0.8507 | 0.9449 | 0.0 | | 0.025 | 430.91 | 4740 | 0.0838 | 0.6753 | 0.7221 | 0.9802 | nan | 0.9934 | 0.9522 | 0.9105 | 0.9878 | 0.9118 | 0.0 | 0.8943 | 0.9837 | 0.7082 | 0.1379 | 0.9199 | 0.9879 | 0.0 | nan | 0.9815 | 0.8992 | 0.8461 | 0.9780 | 0.8571 | 0.0 | 0.8374 | 0.9775 | 0.4655 | 0.1372 | 0.8530 | 0.9462 | 0.0 | | 0.0282 | 432.73 | 4760 | 0.0835 | 0.6766 | 0.7227 | 0.9804 | nan | 0.9934 | 0.9519 | 0.9086 | 0.9875 | 0.9133 | 0.0 | 0.9017 | 0.9873 | 0.7101 | 0.1529 | 0.9068 | 0.9822 | 0.0 | nan | 0.9815 | 0.9009 | 0.8467 | 0.9779 | 0.8559 | 0.0 | 0.8375 | 0.9795 | 0.4658 | 0.1520 | 0.8465 | 0.9520 | 0.0 | | 0.033 | 434.55 | 4780 | 0.0826 | 0.6761 | 0.7234 | 0.9805 | nan | 0.9934 | 0.9520 | 0.9065 | 0.9887 | 0.9089 | 0.0 | 0.9038 | 0.9864 | 0.7249 | 0.1466 | 0.9050 | 0.9880 | 0.0 | nan | 0.9817 | 0.9009 | 0.8477 | 0.9781 | 0.8559 | 0.0 | 0.8381 | 0.9795 | 0.4727 | 0.1457 | 0.8444 | 0.9441 | 0.0 | | 0.0332 | 436.36 | 4800 | 0.0837 | 0.6697 | 0.7143 | 0.9801 | nan | 0.9936 | 0.9529 | 0.9072 | 0.9875 | 0.9126 | 0.0 | 0.9019 | 0.9854 | 0.6416 | 0.1116 | 0.9062 | 0.9856 | 0.0 | nan | 0.9816 | 0.8972 | 0.8478 | 0.9779 | 0.8559 | 0.0 | 0.8400 | 0.9786 | 0.4285 | 0.1111 | 0.8441 | 0.9437 | 0.0 | | 0.0321 | 438.18 | 4820 | 0.0837 | 0.6715 | 0.7188 | 0.9802 | nan | 0.9932 | 0.9518 | 0.9107 | 0.9886 | 0.9096 | 0.0 | 0.9076 | 0.9857 | 0.6956 | 0.1085 | 0.9059 | 0.9878 | 0.0 | nan | 0.9815 | 0.8987 | 0.8471 | 0.9781 | 0.8567 | 0.0 | 0.8362 | 0.9789 | 0.4574 | 0.1079 | 0.8433 | 0.9434 | 0.0 | | 0.0297 | 440.0 | 4840 | 0.0849 | 0.6691 | 0.7149 | 0.9801 | nan | 0.9929 | 0.9553 | 0.9104 | 0.9882 | 0.9094 | 0.0 | 0.9049 | 0.9849 | 0.6837 | 0.0896 | 0.8874 | 0.9871 | 0.0 | nan | 0.9814 | 0.8977 | 0.8454 | 0.9781 | 0.8574 | 0.0 | 0.8337 | 0.9787 | 0.4526 | 0.0891 | 0.8311 | 0.9533 | 0.0 | | 0.0305 | 441.82 | 4860 | 0.0843 | 0.6731 | 0.7231 | 0.9803 | nan | 0.9926 | 0.9549 | 0.9166 | 0.9870 | 0.9140 | 0.0 | 0.9084 | 0.9867 | 0.7377 | 0.1013 | 0.9125 | 0.9885 | 0.0 | nan | 0.9818 | 0.8995 | 0.8462 | 0.9777 | 0.8563 | 0.0 | 0.8342 | 0.9793 | 0.4780 | 0.1008 | 0.8472 | 0.9496 | 0.0 | | 0.0348 | 443.64 | 4880 | 0.0850 | 0.6719 | 0.7192 | 0.9802 | nan | 0.9929 | 0.9542 | 0.9119 | 0.9884 | 0.9112 | 0.0 | 0.9056 | 0.9849 | 0.7136 | 0.0900 | 0.9089 | 0.9884 | 0.0 | nan | 0.9817 | 0.8981 | 0.8467 | 0.9779 | 0.8562 | 0.0 | 0.8404 | 0.9782 | 0.4679 | 0.0896 | 0.8479 | 0.9505 | 0.0 | | 0.0284 | 445.45 | 4900 | 0.0844 | 0.6710 | 0.7179 | 0.9802 | nan | 0.9931 | 0.9522 | 0.9134 | 
0.9879 | 0.9099 | 0.0 | 0.9109 | 0.9861 | 0.6941 | 0.0981 | 0.8984 | 0.9884 | 0.0 | nan | 0.9815 | 0.8988 | 0.8460 | 0.9779 | 0.8565 | 0.0 | 0.8407 | 0.9789 | 0.4585 | 0.0976 | 0.8414 | 0.9456 | 0.0 | | 0.0274 | 447.27 | 4920 | 0.0843 | 0.6709 | 0.7184 | 0.9802 | nan | 0.9930 | 0.9529 | 0.9126 | 0.9887 | 0.9097 | 0.0 | 0.9072 | 0.9850 | 0.7122 | 0.0846 | 0.9038 | 0.9893 | 0.0 | nan | 0.9817 | 0.8990 | 0.8468 | 0.9780 | 0.8568 | 0.0 | 0.8394 | 0.9786 | 0.4665 | 0.0843 | 0.8440 | 0.9471 | 0.0 | | 0.0307 | 449.09 | 4940 | 0.0848 | 0.6712 | 0.7185 | 0.9801 | nan | 0.9927 | 0.9543 | 0.9157 | 0.9885 | 0.9097 | 0.0 | 0.9081 | 0.9838 | 0.7066 | 0.0870 | 0.9062 | 0.9874 | 0.0 | nan | 0.9816 | 0.8979 | 0.8465 | 0.9781 | 0.8573 | 0.0 | 0.8392 | 0.9775 | 0.4639 | 0.0866 | 0.8453 | 0.9511 | 0.0 | | 0.03 | 450.91 | 4960 | 0.0854 | 0.6710 | 0.7176 | 0.9801 | nan | 0.9935 | 0.9542 | 0.9070 | 0.9872 | 0.9156 | 0.0 | 0.9021 | 0.9837 | 0.7044 | 0.0897 | 0.9054 | 0.9854 | 0.0 | nan | 0.9816 | 0.8979 | 0.8471 | 0.9780 | 0.8573 | 0.0 | 0.8384 | 0.9774 | 0.4622 | 0.0893 | 0.8441 | 0.9497 | 0.0 | | 0.0342 | 452.73 | 4980 | 0.0854 | 0.6719 | 0.7193 | 0.9802 | nan | 0.9934 | 0.9523 | 0.9083 | 0.9880 | 0.9121 | 0.0 | 0.9058 | 0.9850 | 0.7122 | 0.0895 | 0.9190 | 0.9852 | 0.0 | nan | 0.9817 | 0.8986 | 0.8459 | 0.9781 | 0.8579 | 0.0 | 0.8392 | 0.9779 | 0.4678 | 0.0891 | 0.8471 | 0.9519 | 0.0 | | 0.0327 | 454.55 | 5000 | 0.0842 | 0.6735 | 0.7207 | 0.9803 | nan | 0.9933 | 0.9531 | 0.9113 | 0.9883 | 0.9114 | 0.0 | 0.8995 | 0.9842 | 0.7079 | 0.1088 | 0.9252 | 0.9859 | 0.0 | nan | 0.9816 | 0.8991 | 0.8467 | 0.9782 | 0.8579 | 0.0 | 0.8385 | 0.9782 | 0.4662 | 0.1082 | 0.8476 | 0.9536 | 0.0 | | 0.0306 | 456.36 | 5020 | 0.0840 | 0.6743 | 0.7215 | 0.9803 | nan | 0.9929 | 0.9549 | 0.9123 | 0.9880 | 0.9114 | 0.0 | 0.8972 | 0.9855 | 0.7087 | 0.1210 | 0.9203 | 0.9873 | 0.0 | nan | 0.9816 | 0.8995 | 0.8467 | 0.9782 | 0.8585 | 0.0 | 0.8370 | 0.9788 | 0.4649 | 0.1205 | 0.8463 | 0.9542 | 0.0 | | 0.0305 | 458.18 | 5040 | 0.0852 | 0.6726 | 0.7180 | 0.9803 | nan | 0.9931 | 0.9561 | 0.9105 | 0.9878 | 0.9081 | 0.0 | 0.8991 | 0.9855 | 0.6953 | 0.1033 | 0.9118 | 0.9839 | 0.0 | nan | 0.9816 | 0.8992 | 0.8467 | 0.9782 | 0.8584 | 0.0 | 0.8363 | 0.9786 | 0.4601 | 0.1028 | 0.8455 | 0.9570 | 0.0 | | 0.0272 | 460.0 | 5060 | 0.0845 | 0.6724 | 0.7179 | 0.9803 | nan | 0.9934 | 0.9554 | 0.9088 | 0.9869 | 0.9155 | 0.0 | 0.9019 | 0.9853 | 0.7077 | 0.0982 | 0.8995 | 0.9797 | 0.0 | nan | 0.9816 | 0.8997 | 0.8466 | 0.9780 | 0.8572 | 0.0 | 0.8357 | 0.9789 | 0.4653 | 0.0978 | 0.8436 | 0.9566 | 0.0 | | 0.0261 | 461.82 | 5080 | 0.0849 | 0.6737 | 0.7195 | 0.9804 | nan | 0.9937 | 0.9532 | 0.9042 | 0.9878 | 0.9107 | 0.0 | 0.9048 | 0.9859 | 0.7237 | 0.1053 | 0.9031 | 0.9813 | 0.0 | nan | 0.9817 | 0.9004 | 0.8458 | 0.9782 | 0.8583 | 0.0 | 0.8372 | 0.9792 | 0.4716 | 0.1048 | 0.8465 | 0.9550 | 0.0 | | 0.034 | 463.64 | 5100 | 0.0848 | 0.6724 | 0.7172 | 0.9802 | nan | 0.9934 | 0.9540 | 0.9091 | 0.9878 | 0.9112 | 0.0 | 0.9020 | 0.9849 | 0.7040 | 0.1041 | 0.8908 | 0.9821 | 0.0 | nan | 0.9816 | 0.8987 | 0.8466 | 0.9781 | 0.8578 | 0.0 | 0.8362 | 0.9784 | 0.4648 | 0.1036 | 0.8407 | 0.9550 | 0.0 | | 0.0289 | 465.45 | 5120 | 0.0845 | 0.6725 | 0.7187 | 0.9802 | nan | 0.9932 | 0.9535 | 0.9090 | 0.9880 | 0.9114 | 0.0 | 0.9043 | 0.9852 | 0.7155 | 0.0987 | 0.8983 | 0.9862 | 0.0 | nan | 0.9816 | 0.8989 | 0.8457 | 0.9782 | 0.8580 | 0.0 | 0.8373 | 0.9784 | 0.4696 | 0.0982 | 0.8453 | 0.9520 | 0.0 | | 0.0287 | 467.27 | 5140 | 0.0847 | 0.6708 | 0.7176 | 0.9801 | nan | 0.9934 | 0.9536 | 
0.9052 | 0.9875 | 0.9121 | 0.0 | 0.9053 | 0.9857 | 0.7024 | 0.0865 | 0.9094 | 0.9878 | 0.0 | nan | 0.9815 | 0.8980 | 0.8451 | 0.9780 | 0.8573 | 0.0 | 0.8372 | 0.9785 | 0.4630 | 0.0861 | 0.8487 | 0.9475 | 0.0 | | 0.0269 | 469.09 | 5160 | 0.0843 | 0.6714 | 0.7182 | 0.9802 | nan | 0.9931 | 0.9521 | 0.9108 | 0.9886 | 0.9102 | 0.0 | 0.9081 | 0.9862 | 0.7010 | 0.0929 | 0.9068 | 0.9873 | 0.0 | nan | 0.9816 | 0.8987 | 0.8456 | 0.9781 | 0.8580 | 0.0 | 0.8376 | 0.9790 | 0.4625 | 0.0924 | 0.8450 | 0.9497 | 0.0 | | 0.0287 | 470.91 | 5180 | 0.0840 | 0.6711 | 0.7180 | 0.9802 | nan | 0.9932 | 0.9528 | 0.9091 | 0.9885 | 0.9104 | 0.0 | 0.9089 | 0.9853 | 0.7016 | 0.0909 | 0.9071 | 0.9859 | 0.0 | nan | 0.9816 | 0.8989 | 0.8455 | 0.9782 | 0.8578 | 0.0 | 0.8350 | 0.9782 | 0.4629 | 0.0904 | 0.8432 | 0.9526 | 0.0 | | 0.0284 | 472.73 | 5200 | 0.0850 | 0.6713 | 0.7181 | 0.9802 | nan | 0.9933 | 0.9544 | 0.9075 | 0.9877 | 0.9124 | 0.0 | 0.9072 | 0.9844 | 0.7026 | 0.0959 | 0.9039 | 0.9854 | 0.0 | nan | 0.9816 | 0.8982 | 0.8461 | 0.9781 | 0.8577 | 0.0 | 0.8356 | 0.9778 | 0.4629 | 0.0955 | 0.8416 | 0.9521 | 0.0 | | 0.0277 | 474.55 | 5220 | 0.0849 | 0.6702 | 0.7159 | 0.9801 | nan | 0.9933 | 0.9541 | 0.9095 | 0.9880 | 0.9101 | 0.0 | 0.9037 | 0.9836 | 0.6912 | 0.0877 | 0.8995 | 0.9858 | 0.0 | nan | 0.9816 | 0.8975 | 0.8464 | 0.9781 | 0.8582 | 0.0 | 0.8385 | 0.9774 | 0.4576 | 0.0874 | 0.8392 | 0.9506 | 0.0 | | 0.0265 | 476.36 | 5240 | 0.0844 | 0.6718 | 0.7180 | 0.9801 | nan | 0.9930 | 0.9542 | 0.9127 | 0.9884 | 0.9101 | 0.0 | 0.9044 | 0.9835 | 0.6983 | 0.1031 | 0.9004 | 0.9852 | 0.0 | nan | 0.9816 | 0.8979 | 0.8464 | 0.9781 | 0.8578 | 0.0 | 0.8383 | 0.9773 | 0.4612 | 0.1027 | 0.8404 | 0.9515 | 0.0 | | 0.0308 | 478.18 | 5260 | 0.0839 | 0.6739 | 0.7210 | 0.9803 | nan | 0.9931 | 0.9536 | 0.9114 | 0.9884 | 0.9115 | 0.0 | 0.9007 | 0.9841 | 0.7181 | 0.1159 | 0.9098 | 0.9862 | 0.0 | nan | 0.9817 | 0.8989 | 0.8468 | 0.9782 | 0.8580 | 0.0 | 0.8377 | 0.9775 | 0.4693 | 0.1153 | 0.8447 | 0.9521 | 0.0 | | 0.0256 | 480.0 | 5280 | 0.0834 | 0.6748 | 0.7219 | 0.9803 | nan | 0.9931 | 0.9529 | 0.9129 | 0.9887 | 0.9102 | 0.0 | 0.9069 | 0.9841 | 0.7114 | 0.1293 | 0.9110 | 0.9837 | 0.0 | nan | 0.9817 | 0.8994 | 0.8470 | 0.9781 | 0.8577 | 0.0 | 0.8359 | 0.9775 | 0.4670 | 0.1286 | 0.8459 | 0.9542 | 0.0 | | 0.0344 | 481.82 | 5300 | 0.0836 | 0.6740 | 0.7198 | 0.9803 | nan | 0.9932 | 0.9540 | 0.9115 | 0.9879 | 0.9121 | 0.0 | 0.9020 | 0.9848 | 0.7000 | 0.1233 | 0.9050 | 0.9837 | 0.0 | nan | 0.9816 | 0.8992 | 0.8467 | 0.9781 | 0.8582 | 0.0 | 0.8358 | 0.9776 | 0.4624 | 0.1227 | 0.8443 | 0.9556 | 0.0 | | 0.0259 | 483.64 | 5320 | 0.0841 | 0.6724 | 0.7174 | 0.9802 | nan | 0.9935 | 0.9538 | 0.9082 | 0.9879 | 0.9111 | 0.0 | 0.9017 | 0.9846 | 0.6945 | 0.1070 | 0.9013 | 0.9830 | 0.0 | nan | 0.9816 | 0.8990 | 0.8463 | 0.9782 | 0.8581 | 0.0 | 0.8367 | 0.9778 | 0.4592 | 0.1064 | 0.8418 | 0.9560 | 0.0 | | 0.0282 | 485.45 | 5340 | 0.0842 | 0.6726 | 0.7185 | 0.9802 | nan | 0.9933 | 0.9530 | 0.9125 | 0.9880 | 0.9125 | 0.0 | 0.9013 | 0.9847 | 0.6981 | 0.1058 | 0.9066 | 0.9843 | 0.0 | nan | 0.9816 | 0.8994 | 0.8464 | 0.9781 | 0.8580 | 0.0 | 0.8368 | 0.9781 | 0.4606 | 0.1053 | 0.8449 | 0.9540 | 0.0 | | 0.0302 | 487.27 | 5360 | 0.0837 | 0.6727 | 0.7185 | 0.9802 | nan | 0.9933 | 0.9537 | 0.9089 | 0.9881 | 0.9126 | 0.0 | 0.9015 | 0.9843 | 0.7002 | 0.1108 | 0.9004 | 0.9861 | 0.0 | nan | 0.9816 | 0.8990 | 0.8466 | 0.9781 | 0.8577 | 0.0 | 0.8364 | 0.9780 | 0.4617 | 0.1102 | 0.8422 | 0.9537 | 0.0 | | 0.0266 | 489.09 | 5380 | 0.0842 | 0.6723 | 0.7178 | 0.9802 | nan | 0.9934 | 
0.9541 | 0.9070 | 0.9880 | 0.9110 | 0.0 | 0.9034 | 0.9846 | 0.7013 | 0.1072 | 0.8956 | 0.9863 | 0.0 | nan | 0.9816 | 0.8989 | 0.8466 | 0.9781 | 0.8581 | 0.0 | 0.8360 | 0.9780 | 0.4625 | 0.1067 | 0.8404 | 0.9529 | 0.0 | | 0.0301 | 490.91 | 5400 | 0.0842 | 0.6728 | 0.7190 | 0.9803 | nan | 0.9933 | 0.9535 | 0.9101 | 0.9877 | 0.9113 | 0.0 | 0.9036 | 0.9855 | 0.7005 | 0.1105 | 0.9036 | 0.9871 | 0.0 | nan | 0.9816 | 0.8993 | 0.8467 | 0.9781 | 0.8578 | 0.0 | 0.8360 | 0.9781 | 0.4620 | 0.1099 | 0.8451 | 0.9512 | 0.0 | | 0.0279 | 492.73 | 5420 | 0.0840 | 0.6730 | 0.7187 | 0.9803 | nan | 0.9933 | 0.9527 | 0.9110 | 0.9881 | 0.9122 | 0.0 | 0.9022 | 0.9855 | 0.7012 | 0.1122 | 0.8993 | 0.9853 | 0.0 | nan | 0.9816 | 0.8994 | 0.8467 | 0.9782 | 0.8577 | 0.0 | 0.8360 | 0.9781 | 0.4623 | 0.1116 | 0.8453 | 0.9524 | 0.0 | | 0.0298 | 494.55 | 5440 | 0.0843 | 0.6730 | 0.7189 | 0.9803 | nan | 0.9934 | 0.9520 | 0.9095 | 0.9884 | 0.9098 | 0.0 | 0.9050 | 0.9855 | 0.6996 | 0.1122 | 0.9048 | 0.9849 | 0.0 | nan | 0.9816 | 0.8997 | 0.8464 | 0.9782 | 0.8579 | 0.0 | 0.8362 | 0.9781 | 0.4611 | 0.1117 | 0.8458 | 0.9528 | 0.0 | | 0.0298 | 496.36 | 5460 | 0.0847 | 0.6732 | 0.7191 | 0.9802 | nan | 0.9933 | 0.9532 | 0.9109 | 0.9882 | 0.9117 | 0.0 | 0.9034 | 0.9843 | 0.7045 | 0.1134 | 0.9006 | 0.9845 | 0.0 | nan | 0.9816 | 0.8991 | 0.8469 | 0.9781 | 0.8578 | 0.0 | 0.8361 | 0.9778 | 0.4629 | 0.1128 | 0.8448 | 0.9534 | 0.0 | | 0.0287 | 498.18 | 5480 | 0.0838 | 0.6724 | 0.7174 | 0.9802 | nan | 0.9934 | 0.9538 | 0.9089 | 0.9879 | 0.9126 | 0.0 | 0.9023 | 0.9850 | 0.6862 | 0.1122 | 0.8999 | 0.9844 | 0.0 | nan | 0.9816 | 0.8991 | 0.8468 | 0.9781 | 0.8576 | 0.0 | 0.8366 | 0.9780 | 0.4549 | 0.1117 | 0.8433 | 0.9533 | 0.0 | | 0.0331 | 500.0 | 5500 | 0.0837 | 0.6719 | 0.7168 | 0.9802 | nan | 0.9934 | 0.9531 | 0.9075 | 0.9886 | 0.9103 | 0.0 | 0.9022 | 0.9844 | 0.6933 | 0.1029 | 0.8981 | 0.9850 | 0.0 | nan | 0.9816 | 0.8989 | 0.8466 | 0.9782 | 0.8578 | 0.0 | 0.8371 | 0.9778 | 0.4583 | 0.1025 | 0.8433 | 0.9527 | 0.0 |

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
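The per-category accuracy and IoU columns in the training log above are the standard semantic-segmentation metrics. Below is a minimal sketch of how such numbers can be computed with the 🤗 `evaluate` library; the stand-in label maps, the 14-class count (matching the label list below), and the `ignore_index` value are illustrative assumptions, not details taken from this training run.

```python
# Minimal sketch (not this card's actual evaluation code): computing mean IoU,
# mean accuracy, overall accuracy, and per-category scores with `evaluate`.
import evaluate
import numpy as np

metric = evaluate.load("mean_iou")

# Stand-in segmentation maps; in practice these are the model's predicted
# label maps and the ground-truth masks, as lists of (H, W) integer arrays.
predictions = [np.random.randint(0, 14, size=(512, 512))]
references = [np.random.randint(0, 14, size=(512, 512))]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=14,      # one entry per class in the label list below
    ignore_index=255,   # assumed index for pixels excluded from scoring
    reduce_labels=False,
)

print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"])       # one value per class, as in the table
print(results["per_category_accuracy"])
```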
[ "unlabeled", "nat", "concrete", "grass", "speedway bricks", "steel", "rough concrete", "dark bricks", "road", "rough red sidewalk", "tiles", "red bricks", "concrete tiles", "rest" ]
hbminsi/segformer-b0-finetuned-segments-sidewalk-2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-finetuned-segments-sidewalk-2 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset. It achieves the following results on the evaluation set: - Loss: 2.7084 - Mean Iou: 0.0871 - Mean Accuracy: 0.1451 - Overall Accuracy: 0.6167 - Accuracy Unlabeled: nan - Accuracy Flat-road: 0.5180 - Accuracy Flat-sidewalk: 0.9088 - Accuracy Flat-crosswalk: 0.0001 - Accuracy Flat-cyclinglane: 0.0259 - Accuracy Flat-parkingdriveway: 0.0 - Accuracy Flat-railtrack: nan - Accuracy Flat-curb: 0.0012 - Accuracy Human-person: 0.0017 - Accuracy Human-rider: 0.0 - Accuracy Vehicle-car: 0.9553 - Accuracy Vehicle-truck: 0.0 - Accuracy Vehicle-bus: 0.0 - Accuracy Vehicle-tramtrain: nan - Accuracy Vehicle-motorcycle: 0.0 - Accuracy Vehicle-bicycle: 0.0000 - Accuracy Vehicle-caravan: 0.0 - Accuracy Vehicle-cartrailer: 0.0 - Accuracy Construction-building: 0.4663 - Accuracy Construction-door: 0.0 - Accuracy Construction-wall: 0.0670 - Accuracy Construction-fenceguardrail: 0.0 - Accuracy Construction-bridge: 0.0 - Accuracy Construction-tunnel: nan - Accuracy Construction-stairs: 0.0 - Accuracy Object-pole: 0.0064 - Accuracy Object-trafficsign: 0.0 - Accuracy Object-trafficlight: 0.0 - Accuracy Nature-vegetation: 0.9708 - Accuracy Nature-terrain: 0.0024 - Accuracy Sky: 0.5740 - Accuracy Void-ground: 0.0 - Accuracy Void-dynamic: 0.0 - Accuracy Void-static: 0.0 - Accuracy Void-unclear: 0.0 - Iou Unlabeled: nan - Iou Flat-road: 0.3925 - Iou Flat-sidewalk: 0.6649 - Iou Flat-crosswalk: 0.0001 - Iou Flat-cyclinglane: 0.0249 - Iou Flat-parkingdriveway: 0.0 - Iou Flat-railtrack: 0.0 - Iou Flat-curb: 0.0012 - Iou Human-person: 0.0017 - Iou Human-rider: 0.0 - Iou Vehicle-car: 0.2851 - Iou Vehicle-truck: 0.0 - Iou Vehicle-bus: 0.0 - Iou Vehicle-tramtrain: 0.0 - Iou Vehicle-motorcycle: 0.0 - Iou Vehicle-bicycle: 0.0000 - Iou Vehicle-caravan: 0.0 - Iou Vehicle-cartrailer: 0.0 - Iou Construction-building: 0.3825 - Iou Construction-door: 0.0 - Iou Construction-wall: 0.0540 - Iou Construction-fenceguardrail: 0.0 - Iou Construction-bridge: 0.0 - Iou Construction-tunnel: 0.0 - Iou Construction-stairs: 0.0 - Iou Object-pole: 0.0048 - Iou Object-trafficsign: 0.0 - Iou Object-trafficlight: 0.0 - Iou Nature-vegetation: 0.6011 - Iou Nature-terrain: 0.0024 - Iou Sky: 0.5451 - Iou Void-ground: 0.0 - Iou Void-dynamic: 0.0 - Iou Void-static: 0.0 - Iou Void-unclear: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Flat-road | Accuracy Flat-sidewalk | Accuracy Flat-crosswalk | Accuracy Flat-cyclinglane | Accuracy Flat-parkingdriveway | Accuracy Flat-railtrack | Accuracy Flat-curb | Accuracy Human-person | Accuracy Human-rider | Accuracy Vehicle-car | Accuracy Vehicle-truck | Accuracy Vehicle-bus | Accuracy Vehicle-tramtrain | Accuracy 
Vehicle-motorcycle | Accuracy Vehicle-bicycle | Accuracy Vehicle-caravan | Accuracy Vehicle-cartrailer | Accuracy Construction-building | Accuracy Construction-door | Accuracy Construction-wall | Accuracy Construction-fenceguardrail | Accuracy Construction-bridge | Accuracy Construction-tunnel | Accuracy Construction-stairs | Accuracy Object-pole | Accuracy Object-trafficsign | Accuracy Object-trafficlight | Accuracy Nature-vegetation | Accuracy Nature-terrain | Accuracy Sky | Accuracy Void-ground | Accuracy Void-dynamic | Accuracy Void-static | Accuracy Void-unclear | Iou Unlabeled | Iou Flat-road | Iou Flat-sidewalk | Iou Flat-crosswalk | Iou Flat-cyclinglane | Iou Flat-parkingdriveway | Iou Flat-railtrack | Iou Flat-curb | Iou Human-person | Iou Human-rider | Iou Vehicle-car | Iou Vehicle-truck | Iou Vehicle-bus | Iou Vehicle-tramtrain | Iou Vehicle-motorcycle | Iou Vehicle-bicycle | Iou Vehicle-caravan | Iou Vehicle-cartrailer | Iou Construction-building | Iou Construction-door | Iou Construction-wall | Iou Construction-fenceguardrail | Iou Construction-bridge | Iou Construction-tunnel | Iou Construction-stairs | Iou Object-pole | Iou Object-trafficsign | Iou Object-trafficlight | Iou Nature-vegetation | Iou Nature-terrain | Iou Sky | Iou Void-ground | Iou Void-dynamic | Iou Void-static | Iou Void-unclear | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------------:|:----------------------:|:-----------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:------------------:|:---------------------:|:--------------------:|:--------------------:|:----------------------:|:--------------------:|:--------------------------:|:---------------------------:|:------------------------:|:------------------------:|:---------------------------:|:------------------------------:|:--------------------------:|:--------------------------:|:------------------------------------:|:----------------------------:|:----------------------------:|:----------------------------:|:--------------------:|:---------------------------:|:----------------------------:|:--------------------------:|:-----------------------:|:------------:|:--------------------:|:---------------------:|:--------------------:|:---------------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:| | 2.9796 | 0.4 | 20 | 3.2289 | 0.0664 | 0.1236 | 0.5591 | nan | 0.2423 | 0.9160 | 0.0000 | 0.0110 | 0.0000 | nan | 0.0006 | 0.0021 | 0.0 | 0.9292 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.2970 | 0.0 | 0.1131 | 0.0 | 0.0 | nan | 0.0 | 0.0502 | 0.0 | 0.0 | 0.9793 | 0.0015 | 0.2752 | 0.0 | 0.0148 | 0.0000 | 0.0 | 0.0 | 0.2077 | 0.6059 | 0.0000 | 0.0107 | 0.0000 | 0.0 | 0.0006 | 0.0020 | 0.0 | 0.2985 | 0.0 | 0.0 | 0.0 | 0.0 | 
0.0000 | 0.0 | 0.0 | 0.2620 | 0.0 | 0.0654 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0249 | 0.0 | 0.0 | 0.5724 | 0.0015 | 0.2711 | 0.0 | 0.0027 | 0.0000 | 0.0 | | 2.6315 | 0.8 | 40 | 2.7084 | 0.0871 | 0.1451 | 0.6167 | nan | 0.5180 | 0.9088 | 0.0001 | 0.0259 | 0.0 | nan | 0.0012 | 0.0017 | 0.0 | 0.9553 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.4663 | 0.0 | 0.0670 | 0.0 | 0.0 | nan | 0.0 | 0.0064 | 0.0 | 0.0 | 0.9708 | 0.0024 | 0.5740 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.3925 | 0.6649 | 0.0001 | 0.0249 | 0.0 | 0.0 | 0.0012 | 0.0017 | 0.0 | 0.2851 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.3825 | 0.0 | 0.0540 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0048 | 0.0 | 0.0 | 0.6011 | 0.0024 | 0.5451 | 0.0 | 0.0 | 0.0 | 0.0 | ### Framework versions - Transformers 4.41.1 - Pytorch 1.13.1 - Datasets 2.19.1 - Tokenizers 0.19.1
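Since the usage sections above are placeholders, here is a minimal inference sketch for this checkpoint, assuming it loads like any SegFormer semantic-segmentation model on the Hub. The repo id and input image below are placeholders to replace with the actual Hub path and a street-level photo.

```python
import torch
import torch.nn as nn
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Placeholders: replace with the real Hub repo id and an RGB street-level photo.
checkpoint = "<hub-username>/segformer-b0-finetuned-segments-sidewalk-2"
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("street_scene.jpg")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution, so upsample before taking the argmax.
upsampled = nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_seg = upsampled.argmax(dim=1)[0]  # (H, W) map of class ids
print(model.config.id2label[int(pred_seg[0, 0])])  # class of the top-left pixel
```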
[ "unlabeled", "flat-road", "flat-sidewalk", "flat-crosswalk", "flat-cyclinglane", "flat-parkingdriveway", "flat-railtrack", "flat-curb", "human-person", "human-rider", "vehicle-car", "vehicle-truck", "vehicle-bus", "vehicle-tramtrain", "vehicle-motorcycle", "vehicle-bicycle", "vehicle-caravan", "vehicle-cartrailer", "construction-building", "construction-door", "construction-wall", "construction-fenceguardrail", "construction-bridge", "construction-tunnel", "construction-stairs", "object-pole", "object-trafficsign", "object-trafficlight", "nature-vegetation", "nature-terrain", "sky", "void-ground", "void-dynamic", "void-static", "void-unclear" ]
MichaelSoloveitchik/MedSam-Breast-Cancer
# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
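Since the quick-start section above is empty, the following is a minimal, speculative sketch. It assumes the checkpoint is a SAM-style model (suggested by the "MedSam" name but not confirmed anywhere in this card) that loads with `SamModel`/`SamProcessor`; the input image and box prompt are placeholders.

```python
import torch
from PIL import Image
from transformers import SamModel, SamProcessor

# ASSUMPTION: SAM-compatible checkpoint (suggested by the "MedSam" name,
# not confirmed by this card). Image path and box prompt are placeholders.
checkpoint = "MichaelSoloveitchik/MedSam-Breast-Cancer"
processor = SamProcessor.from_pretrained(checkpoint)
model = SamModel.from_pretrained(checkpoint)

image = Image.open("breast_scan.png").convert("RGB")
input_boxes = [[[100.0, 100.0, 400.0, 400.0]]]  # one xyxy box prompt per image

inputs = processor(image, input_boxes=input_boxes, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Upscale the low-resolution mask logits back to the original image size.
masks = processor.image_processor.post_process_masks(
    outputs.pred_masks.cpu(),
    inputs["original_sizes"].cpu(),
    inputs["reshaped_input_sizes"].cpu(),
)
print(masks[0].shape)  # candidate masks for the single input image
```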
[ "0", "1", "2" ]
troybvo/segformer-b0-finetuned-segments-sidewalk-test
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-finetuned-segments-sidewalk-test This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset. It achieves the following results on the evaluation set: - Loss: 0.1893 - Mean Iou: 0.4552 - Mean Accuracy: 0.9104 - Overall Accuracy: 0.9104 - Accuracy Other: nan - Accuracy Flat-sidewalk: 0.9104 - Iou Other: 0.0 - Iou Flat-sidewalk: 0.9104 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Other | Accuracy Flat-sidewalk | Iou Other | Iou Flat-sidewalk | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:--------------:|:----------------------:|:---------:|:-----------------:| | 0.4805 | 0.05 | 20 | 0.5080 | 0.4534 | 0.9069 | 0.9069 | nan | 0.9069 | 0.0 | 0.9069 | | 0.2146 | 0.1 | 40 | 0.3937 | 0.4660 | 0.9319 | 0.9319 | nan | 0.9319 | 0.0 | 0.9319 | | 0.215 | 0.15 | 60 | 0.3593 | 0.4476 | 0.8952 | 0.8952 | nan | 0.8952 | 0.0 | 0.8952 | | 0.151 | 0.2 | 80 | 0.2834 | 0.4423 | 0.8845 | 0.8845 | nan | 0.8845 | 0.0 | 0.8845 | | 0.174 | 0.25 | 100 | 0.3268 | 0.4612 | 0.9225 | 0.9225 | nan | 0.9225 | 0.0 | 0.9225 | | 0.1597 | 0.3 | 120 | 0.2900 | 0.4229 | 0.8457 | 0.8457 | nan | 0.8457 | 0.0 | 0.8457 | | 0.2165 | 0.35 | 140 | 0.2723 | 0.4411 | 0.8822 | 0.8822 | nan | 0.8822 | 0.0 | 0.8822 | | 0.2 | 0.4 | 160 | 0.2598 | 0.4167 | 0.8334 | 0.8334 | nan | 0.8334 | 0.0 | 0.8334 | | 0.577 | 0.45 | 180 | 0.3185 | 0.4708 | 0.9416 | 0.9416 | nan | 0.9416 | 0.0 | 0.9416 | | 0.2466 | 0.5 | 200 | 0.2305 | 0.4295 | 0.8589 | 0.8589 | nan | 0.8589 | 0.0 | 0.8589 | | 0.1742 | 0.55 | 220 | 0.2439 | 0.4544 | 0.9089 | 0.9089 | nan | 0.9089 | 0.0 | 0.9089 | | 0.1764 | 0.6 | 240 | 0.2318 | 0.4359 | 0.8719 | 0.8719 | nan | 0.8719 | 0.0 | 0.8719 | | 0.1432 | 0.65 | 260 | 0.2253 | 0.4318 | 0.8636 | 0.8636 | nan | 0.8636 | 0.0 | 0.8636 | | 0.1472 | 0.7 | 280 | 0.2193 | 0.4353 | 0.8707 | 0.8707 | nan | 0.8707 | 0.0 | 0.8707 | | 0.4737 | 0.75 | 300 | 0.2347 | 0.4407 | 0.8813 | 0.8813 | nan | 0.8813 | 0.0 | 0.8813 | | 0.1567 | 0.8 | 320 | 0.2212 | 0.4248 | 0.8496 | 0.8496 | nan | 0.8496 | 0.0 | 0.8496 | | 0.0832 | 0.85 | 340 | 0.2170 | 0.4426 | 0.8852 | 0.8852 | nan | 0.8852 | 0.0 | 0.8852 | | 0.1718 | 0.9 | 360 | 0.2079 | 0.4390 | 0.8780 | 0.8780 | nan | 0.8780 | 0.0 | 0.8780 | | 0.3256 | 0.95 | 380 | 0.2127 | 0.4576 | 0.9151 | 0.9151 | nan | 0.9151 | 0.0 | 0.9151 | | 0.089 | 1.0 | 400 | 0.2249 | 0.4603 | 0.9207 | 0.9207 | nan | 0.9207 | 0.0 | 0.9207 | | 0.103 | 1.05 | 420 | 0.2051 | 0.4360 | 0.8720 | 0.8720 | nan | 0.8720 | 0.0 | 0.8720 | | 0.3474 | 1.1 | 440 | 0.2216 | 0.4333 | 0.8666 | 0.8666 | nan | 0.8666 | 0.0 | 0.8666 | | 0.0851 | 1.15 | 460 | 0.2306 | 0.4681 | 0.9361 | 0.9361 | nan | 0.9361 | 0.0 | 0.9361 | | 0.1989 | 1.2 | 480 | 0.2029 | 0.4516 | 0.9032 | 0.9032 | nan | 0.9032 | 0.0 | 0.9032 | 
| 0.2072 | 1.25 | 500 | 0.2076 | 0.4666 | 0.9331 | 0.9331 | nan | 0.9331 | 0.0 | 0.9331 | | 0.2898 | 1.3 | 520 | 0.2164 | 0.4645 | 0.9291 | 0.9291 | nan | 0.9291 | 0.0 | 0.9291 | | 0.1578 | 1.35 | 540 | 0.2057 | 0.4457 | 0.8914 | 0.8914 | nan | 0.8914 | 0.0 | 0.8914 | | 0.2697 | 1.4 | 560 | 0.1973 | 0.4646 | 0.9292 | 0.9292 | nan | 0.9292 | 0.0 | 0.9292 | | 0.1269 | 1.45 | 580 | 0.1830 | 0.4467 | 0.8934 | 0.8934 | nan | 0.8934 | 0.0 | 0.8934 | | 0.0908 | 1.5 | 600 | 0.1866 | 0.4471 | 0.8941 | 0.8941 | nan | 0.8941 | 0.0 | 0.8941 | | 0.0614 | 1.55 | 620 | 0.1983 | 0.4632 | 0.9264 | 0.9264 | nan | 0.9264 | 0.0 | 0.9264 | | 0.1043 | 1.6 | 640 | 0.1941 | 0.4598 | 0.9196 | 0.9196 | nan | 0.9196 | 0.0 | 0.9196 | | 0.0532 | 1.65 | 660 | 0.1920 | 0.4553 | 0.9106 | 0.9106 | nan | 0.9106 | 0.0 | 0.9106 | | 0.5912 | 1.7 | 680 | 0.1880 | 0.4530 | 0.9059 | 0.9059 | nan | 0.9059 | 0.0 | 0.9059 | | 0.0604 | 1.75 | 700 | 0.1964 | 0.4611 | 0.9221 | 0.9221 | nan | 0.9221 | 0.0 | 0.9221 | | 0.0899 | 1.8 | 720 | 0.1975 | 0.4623 | 0.9245 | 0.9245 | nan | 0.9245 | 0.0 | 0.9245 | | 0.1153 | 1.85 | 740 | 0.1866 | 0.4580 | 0.9160 | 0.9160 | nan | 0.9160 | 0.0 | 0.9160 | | 0.1038 | 1.9 | 760 | 0.1998 | 0.4652 | 0.9304 | 0.9304 | nan | 0.9304 | 0.0 | 0.9304 | | 0.1448 | 1.95 | 780 | 0.1977 | 0.4624 | 0.9248 | 0.9248 | nan | 0.9248 | 0.0 | 0.9248 | | 0.1298 | 2.0 | 800 | 0.1893 | 0.4552 | 0.9104 | 0.9104 | nan | 0.9104 | 0.0 | 0.9104 | ### Framework versions - Transformers 4.41.1 - Pytorch 2.3.0+cu121 - Datasets 2.19.1 - Tokenizers 0.19.1
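For reference, a minimal sketch of running this checkpoint and pulling out the binary sidewalk mask; loading via `SegformerForSemanticSegmentation` is assumed from the mit-b0 base named above, the input image is a placeholder, and the sidewalk label index is looked up with a fallback of 1 (per the two-class setup reported in this card).

```python
import torch
import torch.nn as nn
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

checkpoint = "troybvo/segformer-b0-finetuned-segments-sidewalk-test"
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("street_scene.jpg")  # placeholder input
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Upsample the 1/4-resolution logits to the input size, then take the argmax.
upsampled = nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]

# Index 1 is assumed to be "flat-sidewalk" in this two-class head.
sidewalk_id = model.config.label2id.get("flat-sidewalk", 1)
sidewalk_mask = pred == sidewalk_id
print(f"sidewalk covers {sidewalk_mask.float().mean().item():.1%} of the image")
```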
[ "other", "flat-sidewalk" ]
qubvel-hf/finetune-instance-segmentation-ade20k-mini-mask2former
<!--- Copyright 2024 The HuggingFace Team. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. -->

# Instance Segmentation Example

Content:

- [PyTorch Version with Trainer](#pytorch-version-with-trainer)
- [Reload and Perform Inference](#reload-and-perform-inference)
- [Note on Custom Data](#note-on-custom-data)

## PyTorch Version with Trainer

This model is based on the script [`run_instance_segmentation.py`](https://github.com/huggingface/transformers/blob/main/examples/pytorch/instance-segmentation/run_instance_segmentation.py). The script uses the [🤗 Trainer API](https://huggingface.co/docs/transformers/main_classes/trainer) to manage training automatically, including distributed environments.

Here, we fine-tune a [Mask2Former](https://huggingface.co/docs/transformers/model_doc/mask2former) model on a subsample of the [ADE20K](https://huggingface.co/datasets/zhoubolei/scene_parse_150) dataset. We created a [small dataset](https://huggingface.co/datasets/qubvel-hf/ade20k-mini) with approximately 2,000 images containing only "person" and "car" annotations; all other pixels are marked as "background."

Here is the `label2id` mapping for this model:

```python
label2id = {
    "person": 0,
    "car": 1,
}
```

The training was done with the following command:

```bash
python run_instance_segmentation.py \
    --model_name_or_path facebook/mask2former-swin-tiny-coco-instance \
    --output_dir finetune-instance-segmentation-ade20k-mini-mask2former \
    --dataset_name qubvel-hf/ade20k-mini \
    --do_reduce_labels \
    --image_height 256 \
    --image_width 256 \
    --do_train \
    --fp16 \
    --num_train_epochs 40 \
    --learning_rate 1e-5 \
    --lr_scheduler_type constant \
    --per_device_train_batch_size 8 \
    --gradient_accumulation_steps 2 \
    --dataloader_num_workers 8 \
    --dataloader_persistent_workers \
    --dataloader_prefetch_factor 4 \
    --do_eval \
    --evaluation_strategy epoch \
    --logging_strategy epoch \
    --save_strategy epoch \
    --save_total_limit 2 \
    --push_to_hub
```

## Reload and Perform Inference

You can easily load this trained model and perform inference as follows:

```python
import torch
import requests
import matplotlib.pyplot as plt

from PIL import Image
from transformers import Mask2FormerForUniversalSegmentation, Mask2FormerImageProcessor

# Load image
image = Image.open(requests.get("http://farm4.staticflickr.com/3017/3071497290_31f0393363_z.jpg", stream=True).raw)

# Load model and image processor
device = "cuda"
checkpoint = "qubvel-hf/finetune-instance-segmentation-ade20k-mini-mask2former"

model = Mask2FormerForUniversalSegmentation.from_pretrained(checkpoint, device_map=device)
image_processor = Mask2FormerImageProcessor.from_pretrained(checkpoint)

# Run inference on image
inputs = image_processor(images=[image], return_tensors="pt").to(device)
with torch.no_grad():
    outputs = model(**inputs)

# Post-process outputs
outputs = image_processor.post_process_instance_segmentation(outputs, target_sizes=[image.size[::-1]])

print("Mask shape: ", outputs[0]["segmentation"].shape)
print("Mask values: ", outputs[0]["segmentation"].unique())
for segment in outputs[0]["segments_info"]:
    print("Segment: ", segment)
```

```
Mask shape:  torch.Size([427, 640])
Mask values:  tensor([-1., 0., 1., 2., 3., 4., 5., 6.])
Segment:  {'id': 0, 'label_id': 0, 'was_fused': False, 'score': 0.946127}
Segment:  {'id': 1, 'label_id': 1, 'was_fused': False, 'score': 0.961582}
Segment:  {'id': 2, 'label_id': 1, 'was_fused': False, 'score': 0.968367}
Segment:  {'id': 3, 'label_id': 1, 'was_fused': False, 'score': 0.819527}
Segment:  {'id': 4, 'label_id': 1, 'was_fused': False, 'score': 0.655761}
Segment:  {'id': 5, 'label_id': 1, 'was_fused': False, 'score': 0.531299}
Segment:  {'id': 6, 'label_id': 1, 'was_fused': False, 'score': 0.929477}
```

Use the following code to visualize the results:

```python
import numpy as np
import matplotlib.pyplot as plt

segmentation = outputs[0]["segmentation"].numpy()

plt.figure(figsize=(10, 10))
plt.subplot(1, 2, 1)
plt.imshow(np.array(image))
plt.axis("off")
plt.subplot(1, 2, 2)
plt.imshow(segmentation)
plt.axis("off")
plt.show()
```

![Result](https://i.imgur.com/rZmaRjD.png)
[ "person", "car" ]
qubvel-hf/finetune-instance-segmentation-ade20k-mini-mask2former-no-trainer
<!--- Copyright 2024 The HuggingFace Team. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. -->

# Instance Segmentation Example

Content:

- [PyTorch Version with Accelerate](#pytorch-version-with-accelerate)
- [Reload and Perform Inference](#reload-and-perform-inference)

## PyTorch Version with Accelerate

This model is based on the script [`run_instance_segmentation_no_trainer.py`](https://github.com/huggingface/transformers/blob/main/examples/pytorch/instance-segmentation/run_instance_segmentation_no_trainer.py). The script uses [🤗 Accelerate](https://github.com/huggingface/accelerate) to write your own training loop in PyTorch and run it on various environments, including CPU, multi-CPU, GPU, multi-GPU, and TPU, with support for mixed precision.

First, configure the environment:

```bash
accelerate config
```

Answer the questions regarding your training environment. Then, run:

```bash
accelerate test
```

This command ensures everything is ready for training. Finally, launch training with:

```bash
accelerate launch run_instance_segmentation_no_trainer.py \
    --model_name_or_path facebook/mask2former-swin-tiny-coco-instance \
    --output_dir finetune-instance-segmentation-ade20k-mini-mask2former-no-trainer \
    --dataset_name qubvel-hf/ade20k-mini \
    --do_reduce_labels \
    --image_height 256 \
    --image_width 256 \
    --num_train_epochs 40 \
    --learning_rate 1e-5 \
    --lr_scheduler_type constant \
    --per_device_train_batch_size 8 \
    --gradient_accumulation_steps 2 \
    --dataloader_num_workers 8 \
    --push_to_hub
```

## Reload and Perform Inference

You can easily load this trained model and perform inference as follows:

```python
import torch
import requests
import matplotlib.pyplot as plt

from PIL import Image
from transformers import Mask2FormerForUniversalSegmentation, Mask2FormerImageProcessor

# Load image
image = Image.open(requests.get("http://farm4.staticflickr.com/3017/3071497290_31f0393363_z.jpg", stream=True).raw)

# Load model and image processor
device = "cuda"
checkpoint = "qubvel-hf/finetune-instance-segmentation-ade20k-mini-mask2former-no-trainer"

model = Mask2FormerForUniversalSegmentation.from_pretrained(checkpoint, device_map=device)
image_processor = Mask2FormerImageProcessor.from_pretrained(checkpoint)

# Run inference on image
inputs = image_processor(images=[image], return_tensors="pt").to(device)
with torch.no_grad():
    outputs = model(**inputs)

# Post-process outputs
outputs = image_processor.post_process_instance_segmentation(outputs, target_sizes=[image.size[::-1]])

print("Mask shape: ", outputs[0]["segmentation"].shape)
print("Mask values: ", outputs[0]["segmentation"].unique())
for segment in outputs[0]["segments_info"]:
    print("Segment: ", segment)
```

```
Mask shape:  torch.Size([427, 640])
Mask values:  tensor([-1., 0., 1., 2., 3., 4., 5., 6.])
Segment:  {'id': 0, 'label_id': 0, 'was_fused': False, 'score': 0.946127}
Segment:  {'id': 1, 'label_id': 1, 'was_fused': False, 'score': 0.961582}
Segment:  {'id': 2, 'label_id': 1, 'was_fused': False, 'score': 0.968367}
Segment:  {'id': 3, 'label_id': 1, 'was_fused': False, 'score': 0.819527}
Segment:  {'id': 4, 'label_id': 1, 'was_fused': False, 'score': 0.655761}
Segment:  {'id': 5, 'label_id': 1, 'was_fused': False, 'score': 0.531299}
Segment:  {'id': 6, 'label_id': 1, 'was_fused': False, 'score': 0.929477}
```

Use the following code to visualize the results:

```python
import numpy as np
import matplotlib.pyplot as plt

segmentation = outputs[0]["segmentation"].numpy()

plt.figure(figsize=(10, 10))
plt.subplot(1, 2, 1)
plt.imshow(np.array(image))
plt.axis("off")
plt.subplot(1, 2, 2)
plt.imshow(segmentation)
plt.axis("off")
plt.show()
```

![Result](https://i.imgur.com/rZmaRjD.png)
[ "person", "car" ]
basakozsoy/segformer-b0-finetuned-segments-sidewalk-2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-finetuned-segments-sidewalk-2 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset. It achieves the following results on the evaluation set: - Loss: 0.7810 - Mean Iou: 0.1982 - Mean Accuracy: 0.2367 - Overall Accuracy: 0.8103 - Accuracy Unlabeled: nan - Accuracy Flat-road: 0.8571 - Accuracy Flat-sidewalk: 0.9478 - Accuracy Flat-crosswalk: 0.0 - Accuracy Flat-cyclinglane: 0.7893 - Accuracy Flat-parkingdriveway: 0.2689 - Accuracy Flat-railtrack: 0.0 - Accuracy Flat-curb: 0.1857 - Accuracy Human-person: 0.0 - Accuracy Human-rider: 0.0 - Accuracy Vehicle-car: 0.9203 - Accuracy Vehicle-truck: 0.0 - Accuracy Vehicle-bus: 0.0 - Accuracy Vehicle-tramtrain: 0.0 - Accuracy Vehicle-motorcycle: 0.0 - Accuracy Vehicle-bicycle: 0.0 - Accuracy Vehicle-caravan: 0.0 - Accuracy Vehicle-cartrailer: 0.0 - Accuracy Construction-building: 0.9048 - Accuracy Construction-door: 0.0 - Accuracy Construction-wall: 0.2404 - Accuracy Construction-fenceguardrail: 0.0000 - Accuracy Construction-bridge: 0.0 - Accuracy Construction-tunnel: nan - Accuracy Construction-stairs: 0.0 - Accuracy Object-pole: 0.0102 - Accuracy Object-trafficsign: 0.0 - Accuracy Object-trafficlight: 0.0 - Accuracy Nature-vegetation: 0.9351 - Accuracy Nature-terrain: 0.8000 - Accuracy Sky: 0.9484 - Accuracy Void-ground: 0.0 - Accuracy Void-dynamic: 0.0 - Accuracy Void-static: 0.0017 - Accuracy Void-unclear: 0.0 - Iou Unlabeled: nan - Iou Flat-road: 0.6411 - Iou Flat-sidewalk: 0.8351 - Iou Flat-crosswalk: 0.0 - Iou Flat-cyclinglane: 0.7179 - Iou Flat-parkingdriveway: 0.2097 - Iou Flat-railtrack: 0.0 - Iou Flat-curb: 0.1654 - Iou Human-person: 0.0 - Iou Human-rider: 0.0 - Iou Vehicle-car: 0.7408 - Iou Vehicle-truck: 0.0 - Iou Vehicle-bus: 0.0 - Iou Vehicle-tramtrain: 0.0 - Iou Vehicle-motorcycle: 0.0 - Iou Vehicle-bicycle: 0.0 - Iou Vehicle-caravan: 0.0 - Iou Vehicle-cartrailer: 0.0 - Iou Construction-building: 0.6400 - Iou Construction-door: 0.0 - Iou Construction-wall: 0.1981 - Iou Construction-fenceguardrail: 0.0000 - Iou Construction-bridge: 0.0 - Iou Construction-tunnel: nan - Iou Construction-stairs: 0.0 - Iou Object-pole: 0.0101 - Iou Object-trafficsign: 0.0 - Iou Object-trafficlight: 0.0 - Iou Nature-vegetation: 0.7837 - Iou Nature-terrain: 0.7083 - Iou Sky: 0.8890 - Iou Void-ground: 0.0 - Iou Void-dynamic: 0.0 - Iou Void-static: 0.0017 - Iou Void-unclear: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Flat-road | Accuracy Flat-sidewalk | Accuracy Flat-crosswalk | Accuracy Flat-cyclinglane | Accuracy Flat-parkingdriveway | Accuracy Flat-railtrack | Accuracy Flat-curb | Accuracy Human-person | Accuracy Human-rider | Accuracy Vehicle-car | Accuracy Vehicle-truck | Accuracy Vehicle-bus | 
Accuracy Vehicle-tramtrain | Accuracy Vehicle-motorcycle | Accuracy Vehicle-bicycle | Accuracy Vehicle-caravan | Accuracy Vehicle-cartrailer | Accuracy Construction-building | Accuracy Construction-door | Accuracy Construction-wall | Accuracy Construction-fenceguardrail | Accuracy Construction-bridge | Accuracy Construction-tunnel | Accuracy Construction-stairs | Accuracy Object-pole | Accuracy Object-trafficsign | Accuracy Object-trafficlight | Accuracy Nature-vegetation | Accuracy Nature-terrain | Accuracy Sky | Accuracy Void-ground | Accuracy Void-dynamic | Accuracy Void-static | Accuracy Void-unclear | Iou Unlabeled | Iou Flat-road | Iou Flat-sidewalk | Iou Flat-crosswalk | Iou Flat-cyclinglane | Iou Flat-parkingdriveway | Iou Flat-railtrack | Iou Flat-curb | Iou Human-person | Iou Human-rider | Iou Vehicle-car | Iou Vehicle-truck | Iou Vehicle-bus | Iou Vehicle-tramtrain | Iou Vehicle-motorcycle | Iou Vehicle-bicycle | Iou Vehicle-caravan | Iou Vehicle-cartrailer | Iou Construction-building | Iou Construction-door | Iou Construction-wall | Iou Construction-fenceguardrail | Iou Construction-bridge | Iou Construction-tunnel | Iou Construction-stairs | Iou Object-pole | Iou Object-trafficsign | Iou Object-trafficlight | Iou Nature-vegetation | Iou Nature-terrain | Iou Sky | Iou Void-ground | Iou Void-dynamic | Iou Void-static | Iou Void-unclear | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------------:|:----------------------:|:-----------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:------------------:|:---------------------:|:--------------------:|:--------------------:|:----------------------:|:--------------------:|:--------------------------:|:---------------------------:|:------------------------:|:------------------------:|:---------------------------:|:------------------------------:|:--------------------------:|:--------------------------:|:------------------------------------:|:----------------------------:|:----------------------------:|:----------------------------:|:--------------------:|:---------------------------:|:----------------------------:|:--------------------------:|:-----------------------:|:------------:|:--------------------:|:---------------------:|:--------------------:|:---------------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:| | 0.8994 | 0.2 | 20 | 0.9185 | 0.1824 | 0.2205 | 0.7956 | nan | 0.8398 | 0.9526 | 0.0 | 0.7086 | 0.0972 | 0.0 | 0.0416 | 0.0 | 0.0 | 0.9154 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8989 | 0.0 | 0.1218 | 0.0 | 0.0 | nan | 0.0 | 0.0018 | 0.0 | 0.0 | 0.9318 | 0.8259 | 0.9418 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6125 | 0.8115 | 0.0 | 0.6588 | 0.0885 | 0.0 | 0.0407 | 0.0 | 0.0 | 0.7250 | 0.0 | 0.0 
| 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6258 | 0.0 | 0.1174 | 0.0 | 0.0 | nan | 0.0 | 0.0018 | 0.0 | 0.0 | 0.7664 | 0.6934 | 0.8782 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.0441 | 0.4 | 40 | 0.9159 | 0.1819 | 0.2192 | 0.7949 | nan | 0.8533 | 0.9506 | 0.0 | 0.6956 | 0.1025 | 0.0 | 0.0413 | 0.0 | 0.0 | 0.9078 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9106 | 0.0 | 0.1046 | 0.0 | 0.0 | nan | 0.0 | 0.0010 | 0.0 | 0.0 | 0.9319 | 0.7917 | 0.9440 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6078 | 0.8152 | 0.0 | 0.6591 | 0.0929 | 0.0 | 0.0403 | 0.0 | 0.0 | 0.7302 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6161 | 0.0 | 0.1019 | 0.0 | 0.0 | nan | 0.0 | 0.0010 | 0.0 | 0.0 | 0.7682 | 0.6913 | 0.8785 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.9559 | 0.6 | 60 | 0.9124 | 0.1839 | 0.2229 | 0.7974 | nan | 0.8477 | 0.9475 | 0.0 | 0.7436 | 0.1287 | 0.0 | 0.0388 | 0.0 | 0.0 | 0.9180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9094 | 0.0 | 0.1183 | 0.0 | 0.0 | nan | 0.0 | 0.0015 | 0.0 | 0.0 | 0.9204 | 0.8372 | 0.9434 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6055 | 0.8205 | 0.0 | 0.6711 | 0.1126 | 0.0 | 0.0380 | 0.0 | 0.0 | 0.7227 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6218 | 0.0 | 0.1144 | 0.0 | 0.0 | nan | 0.0 | 0.0015 | 0.0 | 0.0 | 0.7749 | 0.7091 | 0.8782 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.954 | 0.8 | 80 | 0.9023 | 0.1837 | 0.2215 | 0.7967 | nan | 0.8424 | 0.9498 | 0.0 | 0.7398 | 0.1447 | 0.0 | 0.0470 | 0.0 | 0.0 | 0.9163 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9046 | 0.0 | 0.1169 | 0.0 | 0.0 | nan | 0.0 | 0.0018 | 0.0 | 0.0 | 0.9388 | 0.7670 | 0.9412 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6102 | 0.8199 | 0.0 | 0.6731 | 0.1254 | 0.0 | 0.0458 | 0.0 | 0.0 | 0.7243 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6220 | 0.0 | 0.1131 | 0.0 | 0.0 | nan | 0.0 | 0.0018 | 0.0 | 0.0 | 0.7646 | 0.6817 | 0.8798 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.852 | 1.0 | 100 | 0.8860 | 0.1839 | 0.2218 | 0.7973 | nan | 0.8695 | 0.9450 | 0.0 | 0.7107 | 0.1460 | 0.0 | 0.0565 | 0.0 | 0.0 | 0.9142 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9131 | 0.0 | 0.1054 | 0.0 | 0.0 | nan | 0.0 | 0.0013 | 0.0 | 0.0 | 0.9311 | 0.7820 | 0.9449 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6076 | 0.8255 | 0.0 | 0.6667 | 0.1273 | 0.0 | 0.0547 | 0.0 | 0.0 | 0.7272 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6160 | 0.0 | 0.1026 | 0.0 | 0.0 | nan | 0.0 | 0.0013 | 0.0 | 0.0 | 0.7711 | 0.6904 | 0.8794 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.1704 | 1.2 | 120 | 0.8873 | 0.1856 | 0.2241 | 0.7992 | nan | 0.8439 | 0.9494 | 0.0 | 0.7551 | 0.1495 | 0.0 | 0.0572 | 0.0 | 0.0 | 0.9188 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8992 | 0.0 | 0.1343 | 0.0 | 0.0 | nan | 0.0 | 0.0036 | 0.0 | 0.0 | 0.9351 | 0.8044 | 0.9459 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6168 | 0.8208 | 0.0 | 0.6752 | 0.1294 | 0.0 | 0.0555 | 0.0 | 0.0 | 0.7247 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6264 | 0.0 | 0.1284 | 0.0 | 0.0 | nan | 0.0 | 0.0036 | 0.0 | 0.0 | 0.7701 | 0.6956 | 0.8796 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.2101 | 1.4 | 140 | 0.8773 | 0.1863 | 0.2245 | 0.7992 | nan | 0.8307 | 0.9524 | 0.0 | 0.7453 | 0.1570 | 0.0 | 0.0734 | 0.0 | 0.0 | 0.9222 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9014 | 0.0 | 0.1408 | 0.0 | 0.0 | nan | 0.0 | 0.0032 | 0.0 | 0.0 | 0.9372 | 0.8047 | 0.9412 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6218 | 0.8175 | 0.0 | 0.6692 | 0.1355 | 0.0 | 0.0703 | 0.0 | 0.0 | 0.7188 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6266 | 0.0 | 0.1336 | 0.0 | 0.0 | nan | 0.0 | 0.0032 | 0.0 | 0.0 | 0.7702 | 0.6987 | 0.8815 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.1823 | 1.6 | 160 | 0.8691 | 0.1877 | 0.2255 | 0.8010 | nan | 0.8620 | 0.9476 | 0.0 | 0.7408 | 
0.1586 | 0.0 | 0.0745 | 0.0 | 0.0 | 0.9175 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9066 | 0.0 | 0.1510 | 0.0 | 0.0 | nan | 0.0 | 0.0026 | 0.0 | 0.0 | 0.9313 | 0.8042 | 0.9462 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6176 | 0.8242 | 0.0 | 0.6895 | 0.1374 | 0.0 | 0.0713 | 0.0 | 0.0 | 0.7313 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6242 | 0.0 | 0.1415 | 0.0 | 0.0 | nan | 0.0 | 0.0026 | 0.0 | 0.0 | 0.7747 | 0.7002 | 0.8802 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.8903 | 1.8 | 180 | 0.8694 | 0.1877 | 0.2257 | 0.8015 | nan | 0.8801 | 0.9430 | 0.0 | 0.7577 | 0.1548 | 0.0 | 0.0671 | 0.0 | 0.0 | 0.9159 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8977 | 0.0 | 0.1630 | 0.0 | 0.0 | nan | 0.0 | 0.0037 | 0.0 | 0.0 | 0.9389 | 0.7767 | 0.9482 | 0.0 | 0.0 | 0.0000 | 0.0 | nan | 0.6151 | 0.8305 | 0.0 | 0.6952 | 0.1341 | 0.0 | 0.0646 | 0.0 | 0.0 | 0.7325 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6302 | 0.0 | 0.1510 | 0.0 | 0.0 | nan | 0.0 | 0.0037 | 0.0 | 0.0 | 0.7682 | 0.6884 | 0.8802 | 0.0 | 0.0 | 0.0000 | 0.0 | | 1.0182 | 2.0 | 200 | 0.8606 | 0.1889 | 0.2273 | 0.8028 | nan | 0.8546 | 0.9471 | 0.0 | 0.7733 | 0.1752 | 0.0 | 0.0810 | 0.0 | 0.0 | 0.9168 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9049 | 0.0 | 0.1563 | 0.0 | 0.0 | nan | 0.0 | 0.0032 | 0.0 | 0.0 | 0.9343 | 0.8129 | 0.9430 | 0.0 | 0.0 | 0.0000 | 0.0 | nan | 0.6254 | 0.8278 | 0.0 | 0.6939 | 0.1483 | 0.0 | 0.0775 | 0.0 | 0.0 | 0.7310 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6259 | 0.0 | 0.1455 | 0.0 | 0.0 | nan | 0.0 | 0.0031 | 0.0 | 0.0 | 0.7723 | 0.7014 | 0.8822 | 0.0 | 0.0 | 0.0000 | 0.0 | | 0.7128 | 2.2 | 220 | 0.8548 | 0.1892 | 0.2276 | 0.8030 | nan | 0.8565 | 0.9498 | 0.0 | 0.7546 | 0.1793 | 0.0 | 0.0829 | 0.0 | 0.0 | 0.9215 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9108 | 0.0 | 0.1514 | 0.0 | 0.0 | nan | 0.0 | 0.0032 | 0.0 | 0.0 | 0.9203 | 0.8333 | 0.9477 | 0.0 | 0.0 | 0.0000 | 0.0 | nan | 0.6258 | 0.8249 | 0.0 | 0.6952 | 0.1508 | 0.0 | 0.0793 | 0.0 | 0.0 | 0.7262 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6244 | 0.0 | 0.1418 | 0.0 | 0.0 | nan | 0.0 | 0.0032 | 0.0 | 0.0 | 0.7801 | 0.7113 | 0.8817 | 0.0 | 0.0 | 0.0000 | 0.0 | | 0.7744 | 2.4 | 240 | 0.8551 | 0.1899 | 0.2280 | 0.8031 | nan | 0.8646 | 0.9441 | 0.0 | 0.7575 | 0.1919 | 0.0 | 0.0991 | 0.0 | 0.0 | 0.9153 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9056 | 0.0 | 0.1533 | 0.0 | 0.0 | nan | 0.0 | 0.0039 | 0.0 | 0.0 | 0.9358 | 0.8085 | 0.9454 | 0.0 | 0.0 | 0.0001 | 0.0 | nan | 0.6210 | 0.8313 | 0.0 | 0.6983 | 0.1577 | 0.0 | 0.0936 | 0.0 | 0.0 | 0.7317 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6264 | 0.0 | 0.1433 | 0.0 | 0.0 | nan | 0.0 | 0.0039 | 0.0 | 0.0 | 0.7730 | 0.7030 | 0.8826 | 0.0 | 0.0 | 0.0001 | 0.0 | | 0.9597 | 2.6 | 260 | 0.8491 | 0.1915 | 0.2305 | 0.8045 | nan | 0.8490 | 0.9456 | 0.0 | 0.7801 | 0.2106 | 0.0 | 0.1075 | 0.0 | 0.0 | 0.9149 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9040 | 0.0 | 0.1790 | 0.0000 | 0.0 | nan | 0.0 | 0.0050 | 0.0 | 0.0 | 0.9318 | 0.8332 | 0.9458 | 0.0 | 0.0 | 0.0002 | 0.0 | nan | 0.6263 | 0.8307 | 0.0 | 0.6976 | 0.1691 | 0.0 | 0.1013 | 0.0 | 0.0 | 0.7308 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6302 | 0.0 | 0.1622 | 0.0000 | 0.0 | nan | 0.0 | 0.0050 | 0.0 | 0.0 | 0.7758 | 0.7086 | 0.8823 | 0.0 | 0.0 | 0.0002 | 0.0 | | 1.0563 | 2.8 | 280 | 0.8415 | 0.1906 | 0.2281 | 0.8039 | nan | 0.8594 | 0.9509 | 0.0 | 0.7548 | 0.1894 | 0.0 | 0.1003 | 0.0 | 0.0 | 0.9161 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9158 | 0.0 | 0.1669 | 0.0 | 0.0 | nan | 0.0 | 0.0047 | 0.0 | 0.0 | 0.9266 | 0.7989 | 0.9423 | 0.0 | 0.0 | 0.0000 | 0.0 | 
nan | 0.6274 | 0.8266 | 0.0 | 0.7030 | 0.1585 | 0.0 | 0.0947 | 0.0 | 0.0 | 0.7288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6259 | 0.0 | 0.1536 | 0.0 | 0.0 | nan | 0.0 | 0.0047 | 0.0 | 0.0 | 0.7797 | 0.7032 | 0.8831 | 0.0 | 0.0 | 0.0000 | 0.0 | | 0.873 | 3.0 | 300 | 0.8417 | 0.1910 | 0.2293 | 0.8040 | nan | 0.8526 | 0.9497 | 0.0 | 0.7642 | 0.2033 | 0.0 | 0.1126 | 0.0 | 0.0 | 0.9263 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9039 | 0.0 | 0.1838 | 0.0 | 0.0 | nan | 0.0 | 0.0051 | 0.0 | 0.0 | 0.9333 | 0.7878 | 0.9452 | 0.0 | 0.0 | 0.0001 | 0.0 | nan | 0.6271 | 0.8286 | 0.0 | 0.6954 | 0.1664 | 0.0 | 0.1053 | 0.0 | 0.0 | 0.7202 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6335 | 0.0 | 0.1653 | 0.0 | 0.0 | nan | 0.0 | 0.0050 | 0.0 | 0.0 | 0.7757 | 0.6984 | 0.8834 | 0.0 | 0.0 | 0.0001 | 0.0 | | 0.8197 | 3.2 | 320 | 0.8420 | 0.1910 | 0.2292 | 0.8036 | nan | 0.8367 | 0.9502 | 0.0 | 0.7854 | 0.2157 | 0.0 | 0.1159 | 0.0 | 0.0 | 0.9138 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9067 | 0.0 | 0.1796 | 0.0 | 0.0 | nan | 0.0 | 0.0053 | 0.0 | 0.0 | 0.9409 | 0.7721 | 0.9411 | 0.0 | 0.0 | 0.0001 | 0.0 | nan | 0.6317 | 0.8278 | 0.0 | 0.6904 | 0.1754 | 0.0 | 0.1085 | 0.0 | 0.0 | 0.7353 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6286 | 0.0 | 0.1619 | 0.0 | 0.0 | nan | 0.0 | 0.0053 | 0.0 | 0.0 | 0.7700 | 0.6848 | 0.8840 | 0.0 | 0.0 | 0.0001 | 0.0 | | 1.0667 | 3.4 | 340 | 0.8304 | 0.1921 | 0.2301 | 0.8059 | nan | 0.8711 | 0.9466 | 0.0 | 0.7673 | 0.1991 | 0.0 | 0.1051 | 0.0 | 0.0 | 0.9174 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9085 | 0.0 | 0.1799 | 0.0 | 0.0 | nan | 0.0 | 0.0055 | 0.0 | 0.0 | 0.9293 | 0.8174 | 0.9453 | 0.0 | 0.0 | 0.0002 | 0.0 | nan | 0.6281 | 0.8319 | 0.0 | 0.7087 | 0.1664 | 0.0 | 0.0989 | 0.0 | 0.0 | 0.7352 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6287 | 0.0 | 0.1629 | 0.0 | 0.0 | nan | 0.0 | 0.0055 | 0.0 | 0.0 | 0.7794 | 0.7114 | 0.8835 | 0.0 | 0.0 | 0.0002 | 0.0 | | 0.7125 | 3.6 | 360 | 0.8299 | 0.1924 | 0.2323 | 0.8062 | nan | 0.8713 | 0.9390 | 0.0 | 0.7967 | 0.2138 | 0.0 | 0.1172 | 0.0 | 0.0 | 0.9254 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8901 | 0.0 | 0.1930 | 0.0 | 0.0 | nan | 0.0 | 0.0071 | 0.0 | 0.0 | 0.9379 | 0.8236 | 0.9498 | 0.0 | 0.0 | 0.0005 | 0.0 | nan | 0.6314 | 0.8371 | 0.0 | 0.6978 | 0.1723 | 0.0 | 0.1094 | 0.0 | 0.0 | 0.7251 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6366 | 0.0 | 0.1710 | 0.0 | 0.0 | nan | 0.0 | 0.0071 | 0.0 | 0.0 | 0.7725 | 0.7082 | 0.8816 | 0.0 | 0.0 | 0.0005 | 0.0 | | 0.9464 | 3.8 | 380 | 0.8261 | 0.1931 | 0.2317 | 0.8059 | nan | 0.8387 | 0.9523 | 0.0 | 0.7739 | 0.2171 | 0.0 | 0.1399 | 0.0 | 0.0 | 0.9234 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9063 | 0.0 | 0.1884 | 0.0 | 0.0 | nan | 0.0 | 0.0065 | 0.0 | 0.0 | 0.9274 | 0.8247 | 0.9485 | 0.0 | 0.0 | 0.0002 | 0.0 | nan | 0.6354 | 0.8265 | 0.0 | 0.6937 | 0.1762 | 0.0 | 0.1288 | 0.0 | 0.0 | 0.7273 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6324 | 0.0 | 0.1682 | 0.0 | 0.0 | nan | 0.0 | 0.0065 | 0.0 | 0.0 | 0.7809 | 0.7117 | 0.8835 | 0.0 | 0.0 | 0.0002 | 0.0 | | 0.8507 | 4.0 | 400 | 0.8193 | 0.1932 | 0.2310 | 0.8060 | nan | 0.8562 | 0.9489 | 0.0 | 0.7721 | 0.2034 | 0.0 | 0.1338 | 0.0 | 0.0 | 0.9171 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9061 | 0.0 | 0.1953 | 0.0 | 0.0 | nan | 0.0 | 0.0061 | 0.0 | 0.0 | 0.9353 | 0.8039 | 0.9445 | 0.0 | 0.0 | 0.0003 | 0.0 | nan | 0.6299 | 0.8296 | 0.0 | 0.7067 | 0.1702 | 0.0 | 0.1234 | 0.0 | 0.0 | 0.7349 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6316 | 0.0 | 0.1729 | 0.0 | 0.0 | nan | 0.0 | 0.0061 | 0.0 | 0.0 | 0.7778 | 0.7063 | 
0.8851 | 0.0 | 0.0 | 0.0003 | 0.0 | | 1.1977 | 4.2 | 420 | 0.8203 | 0.1933 | 0.2312 | 0.8061 | nan | 0.8586 | 0.9506 | 0.0 | 0.7629 | 0.1993 | 0.0 | 0.1317 | 0.0 | 0.0 | 0.9203 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9110 | 0.0 | 0.2036 | 0.0 | 0.0 | nan | 0.0 | 0.0054 | 0.0 | 0.0 | 0.9264 | 0.8135 | 0.9447 | 0.0 | 0.0 | 0.0002 | 0.0 | nan | 0.6319 | 0.8277 | 0.0 | 0.7057 | 0.1685 | 0.0 | 0.1217 | 0.0 | 0.0 | 0.7346 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6292 | 0.0 | 0.1768 | 0.0 | 0.0 | nan | 0.0 | 0.0054 | 0.0 | 0.0 | 0.7831 | 0.7094 | 0.8861 | 0.0 | 0.0 | 0.0002 | 0.0 | | 0.8554 | 4.4 | 440 | 0.8262 | 0.1933 | 0.2328 | 0.8055 | nan | 0.8301 | 0.9494 | 0.0 | 0.7944 | 0.2211 | 0.0 | 0.1424 | 0.0 | 0.0 | 0.9224 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9017 | 0.0 | 0.2125 | 0.0 | 0.0 | nan | 0.0 | 0.0069 | 0.0 | 0.0 | 0.9333 | 0.8196 | 0.9469 | 0.0 | 0.0 | 0.0005 | 0.0 | nan | 0.6319 | 0.8278 | 0.0 | 0.6795 | 0.1803 | 0.0 | 0.1315 | 0.0 | 0.0 | 0.7317 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6351 | 0.0 | 0.1827 | 0.0 | 0.0 | nan | 0.0 | 0.0069 | 0.0 | 0.0 | 0.7794 | 0.7066 | 0.8859 | 0.0 | 0.0 | 0.0005 | 0.0 | | 1.1725 | 4.6 | 460 | 0.8199 | 0.1943 | 0.2333 | 0.8070 | nan | 0.8779 | 0.9403 | 0.0 | 0.7814 | 0.2232 | 0.0 | 0.1392 | 0.0 | 0.0 | 0.9257 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9003 | 0.0 | 0.2169 | 0.0 | 0.0 | nan | 0.0 | 0.0069 | 0.0 | 0.0 | 0.9329 | 0.8049 | 0.9487 | 0.0 | 0.0 | 0.0005 | 0.0 | nan | 0.6267 | 0.8374 | 0.0 | 0.7119 | 0.1800 | 0.0 | 0.1276 | 0.0 | 0.0 | 0.7295 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6362 | 0.0 | 0.1855 | 0.0 | 0.0 | nan | 0.0 | 0.0069 | 0.0 | 0.0 | 0.7801 | 0.7049 | 0.8854 | 0.0 | 0.0 | 0.0005 | 0.0 | | 0.8091 | 4.8 | 480 | 0.8154 | 0.1946 | 0.2336 | 0.8062 | nan | 0.8288 | 0.9495 | 0.0 | 0.7873 | 0.2486 | 0.0 | 0.1507 | 0.0 | 0.0 | 0.9201 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8995 | 0.0 | 0.2135 | 0.0 | 0.0 | nan | 0.0 | 0.0079 | 0.0 | 0.0 | 0.9366 | 0.8165 | 0.9479 | 0.0 | 0.0 | 0.0006 | 0.0 | nan | 0.6337 | 0.8279 | 0.0 | 0.6956 | 0.1942 | 0.0 | 0.1374 | 0.0 | 0.0 | 0.7309 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6360 | 0.0 | 0.1829 | 0.0 | 0.0 | nan | 0.0 | 0.0079 | 0.0 | 0.0 | 0.7785 | 0.7089 | 0.8859 | 0.0 | 0.0 | 0.0006 | 0.0 | | 0.7131 | 5.0 | 500 | 0.8094 | 0.1948 | 0.2335 | 0.8077 | nan | 0.8958 | 0.9348 | 0.0 | 0.7825 | 0.2308 | 0.0 | 0.1450 | 0.0 | 0.0 | 0.9134 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9175 | 0.0 | 0.1972 | 0.0 | 0.0 | nan | 0.0 | 0.0064 | 0.0 | 0.0 | 0.9286 | 0.8034 | 0.9498 | 0.0 | 0.0 | 0.0003 | 0.0 | nan | 0.6286 | 0.8417 | 0.0 | 0.7195 | 0.1827 | 0.0 | 0.1318 | 0.0 | 0.0 | 0.7359 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6271 | 0.0 | 0.1749 | 0.0 | 0.0 | nan | 0.0 | 0.0063 | 0.0 | 0.0 | 0.7840 | 0.7104 | 0.8852 | 0.0 | 0.0 | 0.0003 | 0.0 | | 0.8448 | 5.2 | 520 | 0.8107 | 0.1939 | 0.2312 | 0.8059 | nan | 0.8489 | 0.9516 | 0.0 | 0.7632 | 0.2214 | 0.0 | 0.1502 | 0.0 | 0.0 | 0.9119 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9160 | 0.0 | 0.1999 | 0.0 | 0.0 | nan | 0.0 | 0.0069 | 0.0 | 0.0 | 0.9334 | 0.7829 | 0.9448 | 0.0 | 0.0 | 0.0001 | 0.0 | nan | 0.6331 | 0.8274 | 0.0 | 0.7018 | 0.1818 | 0.0 | 0.1376 | 0.0 | 0.0 | 0.7363 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6284 | 0.0 | 0.1759 | 0.0 | 0.0 | nan | 0.0 | 0.0069 | 0.0 | 0.0 | 0.7808 | 0.7011 | 0.8868 | 0.0 | 0.0 | 0.0001 | 0.0 | | 0.718 | 5.4 | 540 | 0.8037 | 0.1953 | 0.2340 | 0.8078 | nan | 0.8531 | 0.9476 | 0.0 | 0.7795 | 0.2424 | 0.0 | 0.1572 | 0.0 | 0.0 | 0.9203 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 
| 0.0 | 0.8951 | 0.0 | 0.2226 | 0.0 | 0.0 | nan | 0.0 | 0.0081 | 0.0 | 0.0 | 0.9393 | 0.8070 | 0.9504 | 0.0 | 0.0 | 0.0008 | 0.0 | nan | 0.6365 | 0.8329 | 0.0 | 0.7070 | 0.1935 | 0.0 | 0.1431 | 0.0 | 0.0 | 0.7320 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6392 | 0.0 | 0.1891 | 0.0 | 0.0 | nan | 0.0 | 0.0081 | 0.0 | 0.0 | 0.7765 | 0.7018 | 0.8855 | 0.0 | 0.0 | 0.0008 | 0.0 | | 0.743 | 5.6 | 560 | 0.8045 | 0.1958 | 0.2341 | 0.8080 | nan | 0.8597 | 0.9498 | 0.0 | 0.7553 | 0.2344 | 0.0 | 0.1531 | 0.0 | 0.0 | 0.9226 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9025 | 0.0 | 0.2335 | 0.0000 | 0.0 | nan | 0.0 | 0.0073 | 0.0 | 0.0 | 0.9269 | 0.8342 | 0.9444 | 0.0 | 0.0 | 0.0004 | 0.0 | nan | 0.6364 | 0.8294 | 0.0 | 0.7069 | 0.1897 | 0.0 | 0.1396 | 0.0 | 0.0 | 0.7324 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6375 | 0.0 | 0.1932 | 0.0000 | 0.0 | nan | 0.0 | 0.0073 | 0.0 | 0.0 | 0.7856 | 0.7150 | 0.8876 | 0.0 | 0.0 | 0.0004 | 0.0 | | 0.7429 | 5.8 | 580 | 0.8002 | 0.1960 | 0.2347 | 0.8082 | nan | 0.8410 | 0.9502 | 0.0 | 0.7887 | 0.2528 | 0.0 | 0.1588 | 0.0 | 0.0 | 0.9209 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8994 | 0.0 | 0.2226 | 0.0 | 0.0 | nan | 0.0 | 0.0087 | 0.0 | 0.0 | 0.9340 | 0.8246 | 0.9442 | 0.0 | 0.0 | 0.0008 | 0.0 | nan | 0.6388 | 0.8303 | 0.0 | 0.7016 | 0.1994 | 0.0 | 0.1444 | 0.0 | 0.0 | 0.7341 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6384 | 0.0 | 0.1881 | 0.0 | 0.0 | nan | 0.0 | 0.0086 | 0.0 | 0.0 | 0.7807 | 0.7149 | 0.8876 | 0.0 | 0.0 | 0.0008 | 0.0 | | 1.3848 | 6.0 | 600 | 0.7933 | 0.1965 | 0.2351 | 0.8090 | nan | 0.8591 | 0.9448 | 0.0 | 0.7879 | 0.2488 | 0.0 | 0.1709 | 0.0 | 0.0 | 0.9177 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9000 | 0.0 | 0.2240 | 0.0 | 0.0 | nan | 0.0 | 0.0083 | 0.0 | 0.0 | 0.9408 | 0.8121 | 0.9434 | 0.0 | 0.0 | 0.0009 | 0.0 | nan | 0.6362 | 0.8363 | 0.0 | 0.7118 | 0.1980 | 0.0 | 0.1537 | 0.0 | 0.0 | 0.7369 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6392 | 0.0 | 0.1898 | 0.0 | 0.0 | nan | 0.0 | 0.0083 | 0.0 | 0.0 | 0.7774 | 0.7079 | 0.8880 | 0.0 | 0.0 | 0.0009 | 0.0 | | 0.824 | 6.2 | 620 | 0.7967 | 0.1956 | 0.2348 | 0.8076 | nan | 0.8341 | 0.9495 | 0.0 | 0.7963 | 0.2538 | 0.0 | 0.1701 | 0.0 | 0.0 | 0.9256 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8985 | 0.0 | 0.2224 | 0.0 | 0.0 | nan | 0.0 | 0.0095 | 0.0 | 0.0 | 0.9394 | 0.7987 | 0.9478 | 0.0 | 0.0 | 0.0013 | 0.0 | nan | 0.6376 | 0.8312 | 0.0 | 0.6930 | 0.2013 | 0.0 | 0.1537 | 0.0 | 0.0 | 0.7289 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6403 | 0.0 | 0.1888 | 0.0 | 0.0 | nan | 0.0 | 0.0094 | 0.0 | 0.0 | 0.7777 | 0.7025 | 0.8876 | 0.0 | 0.0 | 0.0013 | 0.0 | | 0.991 | 6.4 | 640 | 0.7887 | 0.1963 | 0.2347 | 0.8092 | nan | 0.8629 | 0.9472 | 0.0 | 0.7841 | 0.2341 | 0.0 | 0.1695 | 0.0 | 0.0 | 0.9209 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9010 | 0.0 | 0.2270 | 0.0 | 0.0 | nan | 0.0 | 0.0096 | 0.0 | 0.0 | 0.9377 | 0.8027 | 0.9486 | 0.0 | 0.0 | 0.0010 | 0.0 | nan | 0.6378 | 0.8347 | 0.0 | 0.7145 | 0.1926 | 0.0 | 0.1523 | 0.0 | 0.0 | 0.7356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6396 | 0.0 | 0.1911 | 0.0 | 0.0 | nan | 0.0 | 0.0096 | 0.0 | 0.0 | 0.7796 | 0.7029 | 0.8878 | 0.0 | 0.0 | 0.0010 | 0.0 | | 0.9375 | 6.6 | 660 | 0.7923 | 0.1966 | 0.2348 | 0.8092 | nan | 0.8617 | 0.9454 | 0.0 | 0.7933 | 0.2520 | 0.0 | 0.1653 | 0.0 | 0.0 | 0.9167 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9085 | 0.0 | 0.2246 | 0.0 | 0.0 | nan | 0.0 | 0.0092 | 0.0 | 0.0 | 0.9392 | 0.7917 | 0.9417 | 0.0 | 0.0 | 0.0007 | 0.0 | nan | 0.6385 | 0.8362 | 0.0 | 0.7182 | 0.2012 | 0.0 | 0.1496 | 0.0 | 0.0 | 
0.7403 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6361 | 0.0 | 0.1900 | 0.0 | 0.0 | nan | 0.0 | 0.0091 | 0.0 | 0.0 | 0.7792 | 0.6992 | 0.8887 | 0.0 | 0.0 | 0.0007 | 0.0 | | 0.8419 | 6.8 | 680 | 0.7900 | 0.1965 | 0.2358 | 0.8094 | nan | 0.8614 | 0.9429 | 0.0 | 0.8050 | 0.2494 | 0.0 | 0.1607 | 0.0 | 0.0 | 0.9201 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8991 | 0.0 | 0.2276 | 0.0000 | 0.0 | nan | 0.0 | 0.0096 | 0.0 | 0.0 | 0.9378 | 0.8169 | 0.9484 | 0.0 | 0.0 | 0.0015 | 0.0 | nan | 0.6348 | 0.8376 | 0.0 | 0.7105 | 0.1980 | 0.0 | 0.1456 | 0.0 | 0.0 | 0.7370 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6407 | 0.0 | 0.1923 | 0.0000 | 0.0 | nan | 0.0 | 0.0095 | 0.0 | 0.0 | 0.7804 | 0.7103 | 0.8875 | 0.0 | 0.0 | 0.0015 | 0.0 | | 0.8405 | 7.0 | 700 | 0.7905 | 0.1971 | 0.2360 | 0.8099 | nan | 0.8738 | 0.9420 | 0.0 | 0.7945 | 0.2426 | 0.0 | 0.1692 | 0.0 | 0.0 | 0.9219 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8987 | 0.0 | 0.2370 | 0.0 | 0.0 | nan | 0.0 | 0.0097 | 0.0 | 0.0 | 0.9377 | 0.8111 | 0.9483 | 0.0 | 0.0 | 0.0014 | 0.0 | nan | 0.6356 | 0.8390 | 0.0 | 0.7189 | 0.1950 | 0.0 | 0.1515 | 0.0 | 0.0 | 0.7359 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6410 | 0.0 | 0.1969 | 0.0 | 0.0 | nan | 0.0 | 0.0097 | 0.0 | 0.0 | 0.7809 | 0.7104 | 0.8876 | 0.0 | 0.0 | 0.0014 | 0.0 | | 0.9715 | 7.2 | 720 | 0.7874 | 0.1968 | 0.2362 | 0.8095 | nan | 0.8768 | 0.9383 | 0.0 | 0.8023 | 0.2512 | 0.0 | 0.1712 | 0.0 | 0.0 | 0.9238 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8951 | 0.0 | 0.2382 | 0.0 | 0.0 | nan | 0.0 | 0.0098 | 0.0 | 0.0 | 0.9426 | 0.7956 | 0.9480 | 0.0 | 0.0 | 0.0019 | 0.0 | nan | 0.6367 | 0.8421 | 0.0 | 0.7199 | 0.1960 | 0.0 | 0.1533 | 0.0 | 0.0 | 0.7332 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6417 | 0.0 | 0.1975 | 0.0 | 0.0 | nan | 0.0 | 0.0098 | 0.0 | 0.0 | 0.7763 | 0.6992 | 0.8876 | 0.0 | 0.0 | 0.0019 | 0.0 | | 0.7724 | 7.4 | 740 | 0.7915 | 0.1968 | 0.2362 | 0.8091 | nan | 0.8398 | 0.9487 | 0.0 | 0.8018 | 0.2608 | 0.0 | 0.1719 | 0.0 | 0.0 | 0.9207 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9087 | 0.0 | 0.2315 | 0.0 | 0.0 | nan | 0.0 | 0.0097 | 0.0 | 0.0 | 0.9300 | 0.8208 | 0.9481 | 0.0 | 0.0 | 0.0007 | 0.0 | nan | 0.6414 | 0.8324 | 0.0 | 0.6986 | 0.2052 | 0.0 | 0.1555 | 0.0 | 0.0 | 0.7371 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6364 | 0.0 | 0.1927 | 0.0 | 0.0 | nan | 0.0 | 0.0096 | 0.0 | 0.0 | 0.7860 | 0.7113 | 0.8875 | 0.0 | 0.0 | 0.0007 | 0.0 | | 1.383 | 7.6 | 760 | 0.7824 | 0.1970 | 0.2367 | 0.8091 | nan | 0.8456 | 0.9449 | 0.0 | 0.8053 | 0.2649 | 0.0 | 0.1814 | 0.0 | 0.0 | 0.9223 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9019 | 0.0 | 0.2364 | 0.0 | 0.0 | nan | 0.0 | 0.0100 | 0.0 | 0.0 | 0.9366 | 0.8108 | 0.9494 | 0.0 | 0.0 | 0.0013 | 0.0 | nan | 0.6375 | 0.8363 | 0.0 | 0.7020 | 0.2052 | 0.0 | 0.1627 | 0.0 | 0.0 | 0.7352 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6396 | 0.0 | 0.1961 | 0.0 | 0.0 | nan | 0.0 | 0.0099 | 0.0 | 0.0 | 0.7818 | 0.7065 | 0.8877 | 0.0 | 0.0 | 0.0013 | 0.0 | | 0.8534 | 7.8 | 780 | 0.7886 | 0.1964 | 0.2353 | 0.8078 | nan | 0.8278 | 0.9533 | 0.0 | 0.7803 | 0.2586 | 0.0 | 0.1821 | 0.0 | 0.0 | 0.9279 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9032 | 0.0 | 0.2372 | 0.0 | 0.0 | nan | 0.0 | 0.0103 | 0.0 | 0.0 | 0.9331 | 0.8001 | 0.9500 | 0.0 | 0.0 | 0.0014 | 0.0 | nan | 0.6431 | 0.8262 | 0.0 | 0.6920 | 0.2048 | 0.0 | 0.1631 | 0.0 | 0.0 | 0.7279 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6389 | 0.0 | 0.1955 | 0.0 | 0.0 | nan | 0.0 | 0.0102 | 0.0 | 0.0 | 0.7831 | 0.7072 | 0.8875 | 0.0 | 0.0 | 0.0014 | 0.0 | | 0.7281 | 8.0 | 800 | 0.7854 | 
0.1977 | 0.2368 | 0.8099 | nan | 0.8600 | 0.9436 | 0.0 | 0.7960 | 0.2662 | 0.0 | 0.1818 | 0.0 | 0.0 | 0.9229 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9079 | 0.0 | 0.2388 | 0.0000 | 0.0 | nan | 0.0 | 0.0100 | 0.0 | 0.0 | 0.9346 | 0.8066 | 0.9459 | 0.0 | 0.0 | 0.0011 | 0.0 | nan | 0.6396 | 0.8385 | 0.0 | 0.7166 | 0.2046 | 0.0 | 0.1626 | 0.0 | 0.0 | 0.7354 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6371 | 0.0 | 0.1970 | 0.0000 | 0.0 | nan | 0.0 | 0.0099 | 0.0 | 0.0 | 0.7838 | 0.7097 | 0.8888 | 0.0 | 0.0 | 0.0011 | 0.0 | | 0.9046 | 8.2 | 820 | 0.7823 | 0.1976 | 0.2364 | 0.8097 | nan | 0.8513 | 0.9486 | 0.0 | 0.7782 | 0.2599 | 0.0 | 0.1809 | 0.0 | 0.0 | 0.9254 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9013 | 0.0 | 0.2411 | 0.0001 | 0.0 | nan | 0.0 | 0.0100 | 0.0 | 0.0 | 0.9338 | 0.8221 | 0.9455 | 0.0 | 0.0 | 0.0019 | 0.0 | nan | 0.6406 | 0.8324 | 0.0 | 0.7101 | 0.2041 | 0.0 | 0.1623 | 0.0 | 0.0 | 0.7338 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6411 | 0.0 | 0.1978 | 0.0001 | 0.0 | nan | 0.0 | 0.0099 | 0.0 | 0.0 | 0.7845 | 0.7146 | 0.8891 | 0.0 | 0.0 | 0.0019 | 0.0 | | 0.7019 | 8.4 | 840 | 0.7834 | 0.1975 | 0.2364 | 0.8099 | nan | 0.8499 | 0.9476 | 0.0 | 0.7918 | 0.2640 | 0.0 | 0.1803 | 0.0 | 0.0 | 0.9235 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9015 | 0.0 | 0.2326 | 0.0000 | 0.0 | nan | 0.0 | 0.0106 | 0.0 | 0.0 | 0.9368 | 0.8132 | 0.9476 | 0.0 | 0.0 | 0.0021 | 0.0 | nan | 0.6411 | 0.8346 | 0.0 | 0.7084 | 0.2062 | 0.0 | 0.1619 | 0.0 | 0.0 | 0.7348 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6408 | 0.0 | 0.1939 | 0.0000 | 0.0 | nan | 0.0 | 0.0105 | 0.0 | 0.0 | 0.7820 | 0.7128 | 0.8884 | 0.0 | 0.0 | 0.0021 | 0.0 | | 0.885 | 8.6 | 860 | 0.7877 | 0.1977 | 0.2373 | 0.8100 | nan | 0.8622 | 0.9417 | 0.0 | 0.8035 | 0.2708 | 0.0 | 0.1772 | 0.0 | 0.0 | 0.9241 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9109 | 0.0 | 0.2367 | 0.0000 | 0.0 | nan | 0.0 | 0.0104 | 0.0 | 0.0 | 0.9287 | 0.8143 | 0.9508 | 0.0 | 0.0 | 0.0011 | 0.0 | nan | 0.6363 | 0.8404 | 0.0 | 0.7137 | 0.2052 | 0.0 | 0.1592 | 0.0 | 0.0 | 0.7344 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6370 | 0.0 | 0.1964 | 0.0000 | 0.0 | nan | 0.0 | 0.0103 | 0.0 | 0.0 | 0.7872 | 0.7147 | 0.8874 | 0.0 | 0.0 | 0.0011 | 0.0 | | 0.8822 | 8.8 | 880 | 0.7831 | 0.1978 | 0.2376 | 0.8103 | nan | 0.8765 | 0.9360 | 0.0 | 0.8054 | 0.2702 | 0.0 | 0.1818 | 0.0 | 0.0 | 0.9236 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9075 | 0.0 | 0.2339 | 0.0000 | 0.0 | nan | 0.0 | 0.0103 | 0.0 | 0.0 | 0.9358 | 0.8084 | 0.9494 | 0.0 | 0.0 | 0.0014 | 0.0 | nan | 0.6381 | 0.8443 | 0.0 | 0.7209 | 0.2022 | 0.0 | 0.1618 | 0.0 | 0.0 | 0.7352 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6382 | 0.0 | 0.1958 | 0.0000 | 0.0 | nan | 0.0 | 0.0103 | 0.0 | 0.0 | 0.7828 | 0.7080 | 0.8883 | 0.0 | 0.0 | 0.0014 | 0.0 | | 0.5794 | 9.0 | 900 | 0.7825 | 0.1978 | 0.2363 | 0.8100 | nan | 0.8604 | 0.9462 | 0.0 | 0.7940 | 0.2673 | 0.0 | 0.1772 | 0.0 | 0.0 | 0.9188 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8940 | 0.0 | 0.2453 | 0.0000 | 0.0 | nan | 0.0 | 0.0105 | 0.0 | 0.0 | 0.9435 | 0.7907 | 0.9475 | 0.0 | 0.0 | 0.0025 | 0.0 | nan | 0.6417 | 0.8370 | 0.0 | 0.7175 | 0.2078 | 0.0 | 0.1590 | 0.0 | 0.0 | 0.7421 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6426 | 0.0 | 0.2008 | 0.0000 | 0.0 | nan | 0.0 | 0.0104 | 0.0 | 0.0 | 0.7766 | 0.7007 | 0.8888 | 0.0 | 0.0 | 0.0025 | 0.0 | | 0.6408 | 9.2 | 920 | 0.7818 | 0.1976 | 0.2360 | 0.8094 | nan | 0.8462 | 0.9497 | 0.0 | 0.7917 | 0.2648 | 0.0 | 0.1855 | 0.0 | 0.0 | 0.9179 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9025 | 0.0 | 0.2383 | 
0.0000 | 0.0 | nan | 0.0 | 0.0107 | 0.0 | 0.0 | 0.9391 | 0.7889 | 0.9494 | 0.0 | 0.0 | 0.0019 | 0.0 | nan | 0.6409 | 0.8327 | 0.0 | 0.7081 | 0.2091 | 0.0 | 0.1654 | 0.0 | 0.0 | 0.7419 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6404 | 0.0 | 0.1971 | 0.0000 | 0.0 | nan | 0.0 | 0.0107 | 0.0 | 0.0 | 0.7799 | 0.7025 | 0.8886 | 0.0 | 0.0 | 0.0019 | 0.0 | | 0.7626 | 9.4 | 940 | 0.7804 | 0.1977 | 0.2363 | 0.8099 | nan | 0.8546 | 0.9474 | 0.0 | 0.7927 | 0.2683 | 0.0 | 0.1778 | 0.0 | 0.0 | 0.9225 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9002 | 0.0 | 0.2359 | 0.0001 | 0.0 | nan | 0.0 | 0.0105 | 0.0 | 0.0 | 0.9386 | 0.7972 | 0.9495 | 0.0 | 0.0 | 0.0023 | 0.0 | nan | 0.6407 | 0.8352 | 0.0 | 0.7153 | 0.2090 | 0.0 | 0.1597 | 0.0 | 0.0 | 0.7379 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6415 | 0.0 | 0.1961 | 0.0001 | 0.0 | nan | 0.0 | 0.0104 | 0.0 | 0.0 | 0.7806 | 0.7051 | 0.8887 | 0.0 | 0.0 | 0.0023 | 0.0 | | 1.0868 | 9.6 | 960 | 0.7791 | 0.1979 | 0.2366 | 0.8103 | nan | 0.8627 | 0.9439 | 0.0 | 0.7967 | 0.2643 | 0.0 | 0.1783 | 0.0 | 0.0 | 0.9197 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9026 | 0.0 | 0.2389 | 0.0000 | 0.0 | nan | 0.0 | 0.0105 | 0.0 | 0.0 | 0.9402 | 0.8018 | 0.9475 | 0.0 | 0.0 | 0.0018 | 0.0 | nan | 0.6402 | 0.8386 | 0.0 | 0.7204 | 0.2061 | 0.0 | 0.1601 | 0.0 | 0.0 | 0.7399 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6401 | 0.0 | 0.1976 | 0.0000 | 0.0 | nan | 0.0 | 0.0105 | 0.0 | 0.0 | 0.7799 | 0.7048 | 0.8891 | 0.0 | 0.0 | 0.0018 | 0.0 | | 0.7829 | 9.8 | 980 | 0.7843 | 0.1979 | 0.2365 | 0.8102 | nan | 0.8641 | 0.9430 | 0.0 | 0.8002 | 0.2667 | 0.0 | 0.1808 | 0.0 | 0.0 | 0.9165 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9020 | 0.0 | 0.2365 | 0.0000 | 0.0 | nan | 0.0 | 0.0105 | 0.0 | 0.0 | 0.9427 | 0.7922 | 0.9484 | 0.0 | 0.0 | 0.0019 | 0.0 | nan | 0.6390 | 0.8396 | 0.0 | 0.7200 | 0.2071 | 0.0 | 0.1619 | 0.0 | 0.0 | 0.7431 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6402 | 0.0 | 0.1968 | 0.0000 | 0.0 | nan | 0.0 | 0.0104 | 0.0 | 0.0 | 0.7777 | 0.7026 | 0.8888 | 0.0 | 0.0 | 0.0019 | 0.0 | | 0.8179 | 10.0 | 1000 | 0.7810 | 0.1982 | 0.2367 | 0.8103 | nan | 0.8571 | 0.9478 | 0.0 | 0.7893 | 0.2689 | 0.0 | 0.1857 | 0.0 | 0.0 | 0.9203 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9048 | 0.0 | 0.2404 | 0.0000 | 0.0 | nan | 0.0 | 0.0102 | 0.0 | 0.0 | 0.9351 | 0.8000 | 0.9484 | 0.0 | 0.0 | 0.0017 | 0.0 | nan | 0.6411 | 0.8351 | 0.0 | 0.7179 | 0.2097 | 0.0 | 0.1654 | 0.0 | 0.0 | 0.7408 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6400 | 0.0 | 0.1981 | 0.0000 | 0.0 | nan | 0.0 | 0.0101 | 0.0 | 0.0 | 0.7837 | 0.7083 | 0.8890 | 0.0 | 0.0 | 0.0017 | 0.0 | ### Framework versions - Transformers 4.41.1 - Pytorch 2.3.0+cu118 - Datasets 2.18.0 - Tokenizers 0.19.1
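The hyperparameters listed above map directly onto 🤗 `TrainingArguments`; the following is a sketch of that mapping, not the author's exact training script. The `output_dir` is a placeholder, and the dataset, model, and `Trainer` wiring are omitted.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-sidewalk-2",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,  # "mixed_precision_training: Native AMP"
)
```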
[ "unlabeled", "flat-road", "flat-sidewalk", "flat-crosswalk", "flat-cyclinglane", "flat-parkingdriveway", "flat-railtrack", "flat-curb", "human-person", "human-rider", "vehicle-car", "vehicle-truck", "vehicle-bus", "vehicle-tramtrain", "vehicle-motorcycle", "vehicle-bicycle", "vehicle-caravan", "vehicle-cartrailer", "construction-building", "construction-door", "construction-wall", "construction-fenceguardrail", "construction-bridge", "construction-tunnel", "construction-stairs", "object-pole", "object-trafficsign", "object-trafficlight", "nature-vegetation", "nature-terrain", "sky", "void-ground", "void-dynamic", "void-static", "void-unclear" ]
qubvel-hf/finetune-instance-segmentation-ade20k-mini-mask2former-v1
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/qubvel-hf-co/huggingface/runs/tpf8jz7h) # finetune-instance-segmentation-ade20k-mini-mask2former-v1 This model is a fine-tuned version of [facebook/mask2former-swin-tiny-coco-instance](https://huggingface.co/facebook/mask2former-swin-tiny-coco-instance) on the qubvel-hf/ade20k-mini dataset. It achieves the following results on the evaluation set: - Loss: 27.5494 - Map: 0.2315 - Map 50: 0.4495 - Map 75: 0.2185 - Map Small: 0.1535 - Map Medium: 0.6606 - Map Large: 0.8161 - Mar 1: 0.0981 - Mar 10: 0.2576 - Mar 100: 0.3 - Mar Small: 0.2272 - Mar Medium: 0.7189 - Mar Large: 0.8618 - Map Person: 0.1626 - Mar 100 Person: 0.2224 - Map Car: 0.3003 - Mar 100 Car: 0.3776 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: constant - num_epochs: 40.0 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Person | Mar 100 Person | Map Car | Mar 100 Car | |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:-------:|:-----------:| | 36.7831 | 1.0 | 100 | 33.2768 | 0.1838 | 0.3677 | 0.174 | 0.1175 | 0.6012 | 0.7974 | 0.0884 | 0.2431 | 0.284 | 0.2104 | 0.7053 | 0.8712 | 0.1175 | 0.2014 | 0.25 | 0.3665 | | 30.2324 | 2.0 | 200 | 30.8268 | 0.198 | 0.4007 | 0.1831 | 0.1321 | 0.6183 | 0.8028 | 0.0916 | 0.25 | 0.2885 | 0.2151 | 0.7125 | 0.8354 | 0.1331 | 0.2079 | 0.263 | 0.3691 | | 28.4136 | 3.0 | 300 | 29.8261 | 0.2036 | 0.416 | 0.1849 | 0.1337 | 0.6332 | 0.7969 | 0.0934 | 0.2472 | 0.2905 | 0.2169 | 0.7162 | 0.8323 | 0.1381 | 0.2112 | 0.269 | 0.3697 | | 27.5659 | 4.0 | 400 | 29.2926 | 0.2101 | 0.4176 | 0.1918 | 0.1371 | 0.6352 | 0.8051 | 0.094 | 0.25 | 0.2884 | 0.2143 | 0.7174 | 0.8354 | 0.1456 | 0.2107 | 0.2745 | 0.3661 | | 26.9971 | 5.0 | 500 | 28.8044 | 0.213 | 0.4209 | 0.2016 | 0.1379 | 0.6419 | 0.8094 | 0.093 | 0.2499 | 0.2894 | 0.2148 | 0.7207 | 0.8441 | 0.1475 | 0.2096 | 0.2785 | 0.3692 | | 26.42 | 6.0 | 600 | 28.4848 | 0.2196 | 0.4224 | 0.2062 | 0.1426 | 0.647 | 0.8046 | 0.0944 | 0.2523 | 0.2925 | 0.2188 | 0.7196 | 0.8354 | 0.15 | 0.2106 | 0.2892 | 0.3745 | | 25.9065 | 7.0 | 700 | 28.2601 | 0.2212 | 0.4261 | 0.207 | 0.1444 | 0.6442 | 0.8049 | 0.0943 | 0.2527 | 0.2902 | 0.2176 | 0.7103 | 0.8323 | 0.153 | 0.2102 | 0.2893 | 0.3703 | | 25.6766 | 8.0 | 800 | 28.2581 | 0.2209 | 0.4276 | 0.2076 | 0.1434 | 0.6485 | 0.8201 | 0.0943 | 0.2532 | 0.294 | 0.2197 | 0.7212 | 0.8681 | 0.1532 | 0.2122 | 0.2885 | 0.3758 | | 25.3111 | 9.0 | 900 | 27.8623 | 0.2234 | 0.4318 | 0.2163 | 0.1451 | 0.649 | 0.8252 | 
0.0951 | 0.2519 | 0.2953 | 0.2212 | 0.721 | 0.8649 | 0.1561 | 0.2148 | 0.2907 | 0.3757 | | 24.9424 | 10.0 | 1000 | 27.8925 | 0.2256 | 0.4367 | 0.2129 | 0.1479 | 0.6476 | 0.8314 | 0.0953 | 0.2556 | 0.2973 | 0.2244 | 0.7159 | 0.8712 | 0.1588 | 0.2153 | 0.2923 | 0.3793 | | 24.6502 | 11.0 | 1100 | 27.7524 | 0.2254 | 0.441 | 0.2163 | 0.1486 | 0.6468 | 0.8186 | 0.0952 | 0.2556 | 0.2963 | 0.2231 | 0.7167 | 0.8681 | 0.1578 | 0.2153 | 0.2929 | 0.3772 | | 24.5278 | 12.0 | 1200 | 27.7122 | 0.2252 | 0.4349 | 0.2167 | 0.1473 | 0.6462 | 0.8237 | 0.0927 | 0.2549 | 0.2979 | 0.2251 | 0.7162 | 0.8649 | 0.1583 | 0.2165 | 0.2921 | 0.3793 | | 24.3514 | 13.0 | 1300 | 27.5382 | 0.224 | 0.4345 | 0.2156 | 0.1459 | 0.6554 | 0.8324 | 0.0958 | 0.2554 | 0.2988 | 0.2251 | 0.722 | 0.8806 | 0.1583 | 0.2191 | 0.2897 | 0.3785 | | 24.3422 | 14.0 | 1400 | 27.5665 | 0.226 | 0.4374 | 0.2172 | 0.1488 | 0.6505 | 0.8059 | 0.0974 | 0.2551 | 0.2964 | 0.2241 | 0.7141 | 0.8434 | 0.1592 | 0.2158 | 0.2928 | 0.377 | | 23.9768 | 15.0 | 1500 | 27.7770 | 0.2281 | 0.4379 | 0.2215 | 0.1499 | 0.6553 | 0.8188 | 0.096 | 0.2553 | 0.2978 | 0.2244 | 0.72 | 0.8632 | 0.1599 | 0.2163 | 0.2963 | 0.3793 | | 23.7005 | 16.0 | 1600 | 27.5535 | 0.227 | 0.4392 | 0.2167 | 0.1485 | 0.6509 | 0.8165 | 0.0965 | 0.255 | 0.2972 | 0.2241 | 0.7175 | 0.8656 | 0.1608 | 0.2164 | 0.2932 | 0.3779 | | 23.579 | 17.0 | 1700 | 27.4894 | 0.2286 | 0.44 | 0.2209 | 0.1511 | 0.6488 | 0.8152 | 0.097 | 0.2583 | 0.2965 | 0.2243 | 0.7113 | 0.8601 | 0.162 | 0.2144 | 0.2952 | 0.3785 | | 23.5004 | 18.0 | 1800 | 27.2188 | 0.2274 | 0.4374 | 0.216 | 0.1498 | 0.6512 | 0.7954 | 0.0962 | 0.2562 | 0.2969 | 0.2251 | 0.712 | 0.8323 | 0.1614 | 0.215 | 0.2933 | 0.3788 | | 23.1744 | 19.0 | 1900 | 27.3523 | 0.2286 | 0.4391 | 0.2166 | 0.1494 | 0.6559 | 0.8203 | 0.0962 | 0.2565 | 0.2998 | 0.2274 | 0.7156 | 0.8656 | 0.1602 | 0.2174 | 0.297 | 0.3821 | | 23.1884 | 20.0 | 2000 | 27.1185 | 0.2304 | 0.4395 | 0.2204 | 0.1521 | 0.6533 | 0.8004 | 0.0968 | 0.2558 | 0.299 | 0.2273 | 0.7131 | 0.8347 | 0.1611 | 0.217 | 0.2998 | 0.3809 | | 22.9136 | 21.0 | 2100 | 27.4296 | 0.2301 | 0.4386 | 0.2197 | 0.1518 | 0.6545 | 0.8185 | 0.0968 | 0.2552 | 0.2979 | 0.2256 | 0.7123 | 0.8712 | 0.1609 | 0.2179 | 0.2992 | 0.3778 | | 22.6863 | 22.0 | 2200 | 26.9978 | 0.2309 | 0.444 | 0.2196 | 0.1519 | 0.657 | 0.7955 | 0.0976 | 0.2543 | 0.2982 | 0.2264 | 0.714 | 0.8316 | 0.1624 | 0.2181 | 0.2994 | 0.3784 | | 22.7741 | 23.0 | 2300 | 27.0703 | 0.23 | 0.4436 | 0.2183 | 0.1519 | 0.6508 | 0.8029 | 0.0966 | 0.2562 | 0.3001 | 0.229 | 0.7106 | 0.8434 | 0.162 | 0.218 | 0.2979 | 0.3823 | | 22.4779 | 24.0 | 2400 | 27.0394 | 0.2335 | 0.4521 | 0.2252 | 0.1552 | 0.656 | 0.8318 | 0.0962 | 0.2598 | 0.3026 | 0.231 | 0.7143 | 0.8601 | 0.1624 | 0.2187 | 0.3045 | 0.3865 | | 22.357 | 25.0 | 2500 | 27.1483 | 0.2304 | 0.4456 | 0.2189 | 0.1517 | 0.6586 | 0.8065 | 0.0967 | 0.2554 | 0.2996 | 0.2278 | 0.7143 | 0.8378 | 0.162 | 0.2187 | 0.2989 | 0.3805 | | 22.3167 | 26.0 | 2600 | 27.3299 | 0.232 | 0.4438 | 0.2193 | 0.1534 | 0.6572 | 0.8221 | 0.0977 | 0.2564 | 0.2989 | 0.2267 | 0.7134 | 0.8681 | 0.1624 | 0.2176 | 0.3016 | 0.3802 | | 22.0958 | 27.0 | 2700 | 27.2571 | 0.232 | 0.4438 | 0.2171 | 0.1535 | 0.6539 | 0.8268 | 0.0974 | 0.2591 | 0.2986 | 0.226 | 0.7153 | 0.8774 | 0.1622 | 0.2185 | 0.3018 | 0.3788 | | 22.0902 | 28.0 | 2800 | 27.5156 | 0.2315 | 0.4482 | 0.2177 | 0.1539 | 0.6566 | 0.8265 | 0.0978 | 0.2583 | 0.3021 | 0.23 | 0.716 | 0.8719 | 0.1626 | 0.22 | 0.3004 | 0.3842 | | 21.9943 | 29.0 | 2900 | 27.0142 | 0.2288 | 0.4449 | 0.2155 | 0.1511 | 0.6536 | 
0.8176 | 0.097 | 0.2557 | 0.2984 | 0.2257 | 0.7169 | 0.8569 | 0.1616 | 0.2202 | 0.2961 | 0.3766 | | 21.8843 | 30.0 | 3000 | 27.1738 | 0.2314 | 0.4456 | 0.2192 | 0.1534 | 0.6557 | 0.8263 | 0.0973 | 0.2587 | 0.3026 | 0.23 | 0.7204 | 0.8625 | 0.1629 | 0.2203 | 0.2999 | 0.3848 | | 21.8635 | 31.0 | 3100 | 27.0658 | 0.2316 | 0.4461 | 0.22 | 0.1534 | 0.6582 | 0.8166 | 0.0987 | 0.2581 | 0.3013 | 0.2292 | 0.7156 | 0.8625 | 0.163 | 0.2188 | 0.3003 | 0.3838 | | 21.473 | 32.0 | 3200 | 27.1354 | 0.2323 | 0.4493 | 0.219 | 0.1545 | 0.6569 | 0.8077 | 0.0966 | 0.259 | 0.3024 | 0.2305 | 0.7172 | 0.8507 | 0.1619 | 0.2182 | 0.3026 | 0.3866 | | 21.6879 | 33.0 | 3300 | 26.9810 | 0.2306 | 0.4461 | 0.2178 | 0.1533 | 0.6572 | 0.8095 | 0.0983 | 0.2581 | 0.3004 | 0.2285 | 0.7146 | 0.8476 | 0.1624 | 0.2194 | 0.2989 | 0.3814 | | 21.3771 | 34.0 | 3400 | 27.5323 | 0.23 | 0.4476 | 0.2149 | 0.1536 | 0.6593 | 0.8185 | 0.0968 | 0.2577 | 0.2996 | 0.2265 | 0.7204 | 0.8618 | 0.162 | 0.2212 | 0.298 | 0.3781 | | 21.2772 | 35.0 | 3500 | 27.1451 | 0.2327 | 0.4465 | 0.2172 | 0.1544 | 0.6641 | 0.8195 | 0.0988 | 0.2597 | 0.3028 | 0.2294 | 0.7262 | 0.8594 | 0.1616 | 0.221 | 0.3038 | 0.3847 | | 21.3682 | 36.0 | 3600 | 27.4698 | 0.2334 | 0.4503 | 0.2184 | 0.155 | 0.6608 | 0.8088 | 0.0985 | 0.2574 | 0.3013 | 0.2292 | 0.7164 | 0.8594 | 0.1657 | 0.223 | 0.3011 | 0.3797 | | 21.0417 | 37.0 | 3700 | 27.2499 | 0.2354 | 0.4523 | 0.2211 | 0.1569 | 0.6643 | 0.8224 | 0.0998 | 0.2604 | 0.3037 | 0.2307 | 0.7243 | 0.8562 | 0.1654 | 0.2209 | 0.3054 | 0.3865 | | 21.0664 | 38.0 | 3800 | 27.3426 | 0.2304 | 0.4437 | 0.2159 | 0.1516 | 0.6568 | 0.8071 | 0.0986 | 0.2566 | 0.2993 | 0.227 | 0.7164 | 0.8451 | 0.1641 | 0.2198 | 0.2967 | 0.3788 | | 21.0042 | 39.0 | 3900 | 27.7720 | 0.2315 | 0.4449 | 0.2182 | 0.1528 | 0.6611 | 0.8214 | 0.0994 | 0.2594 | 0.2994 | 0.2265 | 0.7191 | 0.8594 | 0.1604 | 0.2161 | 0.3026 | 0.3827 | | 20.8548 | 40.0 | 4000 | 27.5494 | 0.2315 | 0.4495 | 0.2185 | 0.1535 | 0.6606 | 0.8161 | 0.0981 | 0.2576 | 0.3 | 0.2272 | 0.7189 | 0.8618 | 0.1626 | 0.2224 | 0.3003 | 0.3776 | ### Framework versions - Transformers 4.42.0.dev0 - Pytorch 1.13.0+cu117 - Datasets 2.18.0 - Tokenizers 0.19.1
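### Inference example

A minimal inference sketch, assuming the checkpoint is loadable from the Hub with the standard `transformers` Mask2Former API; the sample image URL is only a placeholder, so substitute any RGB image containing people or cars.

```python
import requests
import torch
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

checkpoint = "qubvel-hf/finetune-instance-segmentation-ade20k-mini-mask2former-v1"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = Mask2FormerForUniversalSegmentation.from_pretrained(checkpoint)
model.eval()

# Placeholder image; use a street scene for meaningful person/car detections.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Merge the predicted masks back to the input resolution; image.size is
# (width, height), so reverse it to the (height, width) the processor expects.
result = processor.post_process_instance_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]

segmentation = result["segmentation"]  # (H, W) map of instance ids
for segment in result["segments_info"]:
    label = model.config.id2label[segment["label_id"]]
    print(f"instance {segment['id']}: {label} (score={segment['score']:.3f})")
```

`post_process_instance_segmentation` also accepts `threshold` and `mask_threshold` arguments for trading precision against recall when filtering predicted instances.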
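### Training arguments sketch

The hyperparameters listed above map onto `transformers.TrainingArguments` roughly as follows; this is a sketch assuming the standard `Trainer` was used, with a hypothetical `output_dir`. The Adam betas and epsilon reported above are the `TrainingArguments` defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mask2former-ade20k-mini",  # hypothetical name
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    lr_scheduler_type="constant",
    num_train_epochs=40.0,
    seed=42,
    fp16=True,  # "Native AMP" mixed precision
)
```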
[ "person", "car" ]
smcenlly/segformer-finetuned-sidewalk-10k-steps
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-finetuned-sidewalk-10k-steps This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset. It achieves the following results on the evaluation set: - Loss: 0.6095 - Mean Iou: 0.2881 - Mean Accuracy: 0.3546 - Overall Accuracy: 0.8313 - Accuracy Unlabeled: nan - Accuracy Flat-road: 0.7705 - Accuracy Flat-sidewalk: 0.9498 - Accuracy Flat-crosswalk: 0.1340 - Accuracy Flat-cyclinglane: 0.8681 - Accuracy Flat-parkingdriveway: 0.2555 - Accuracy Flat-railtrack: nan - Accuracy Flat-curb: 0.5391 - Accuracy Human-person: 0.7858 - Accuracy Human-rider: 0.0 - Accuracy Vehicle-car: 0.9234 - Accuracy Vehicle-truck: 0.0 - Accuracy Vehicle-bus: 0.0 - Accuracy Vehicle-tramtrain: 0.0 - Accuracy Vehicle-motorcycle: 0.0 - Accuracy Vehicle-bicycle: 0.8710 - Accuracy Vehicle-caravan: 0.0 - Accuracy Vehicle-cartrailer: 0.0 - Accuracy Construction-building: 0.8511 - Accuracy Construction-door: 0.0 - Accuracy Construction-wall: 0.5804 - Accuracy Construction-fenceguardrail: 0.4355 - Accuracy Construction-bridge: 0.0 - Accuracy Construction-tunnel: nan - Accuracy Construction-stairs: 0.0 - Accuracy Object-pole: 0.3931 - Accuracy Object-trafficsign: 0.0 - Accuracy Object-trafficlight: 0.0 - Accuracy Nature-vegetation: 0.9126 - Accuracy Nature-terrain: 0.8313 - Accuracy Sky: 0.9633 - Accuracy Void-ground: 0.0 - Accuracy Void-dynamic: 0.0209 - Accuracy Void-static: 0.2608 - Accuracy Void-unclear: 0.0 - Iou Unlabeled: nan - Iou Flat-road: 0.6844 - Iou Flat-sidewalk: 0.8516 - Iou Flat-crosswalk: 0.0905 - Iou Flat-cyclinglane: 0.6851 - Iou Flat-parkingdriveway: 0.2043 - Iou Flat-railtrack: nan - Iou Flat-curb: 0.3766 - Iou Human-person: 0.5327 - Iou Human-rider: 0.0 - Iou Vehicle-car: 0.7630 - Iou Vehicle-truck: 0.0 - Iou Vehicle-bus: 0.0 - Iou Vehicle-tramtrain: 0.0 - Iou Vehicle-motorcycle: 0.0 - Iou Vehicle-bicycle: 0.6814 - Iou Vehicle-caravan: 0.0 - Iou Vehicle-cartrailer: 0.0 - Iou Construction-building: 0.6784 - Iou Construction-door: 0.0 - Iou Construction-wall: 0.3576 - Iou Construction-fenceguardrail: 0.3553 - Iou Construction-bridge: 0.0 - Iou Construction-tunnel: nan - Iou Construction-stairs: 0.0 - Iou Object-pole: 0.2718 - Iou Object-trafficsign: 0.0 - Iou Object-trafficlight: 0.0 - Iou Nature-vegetation: 0.8160 - Iou Nature-terrain: 0.7180 - Iou Sky: 0.9227 - Iou Void-ground: 0.0 - Iou Void-dynamic: 0.0191 - Iou Void-static: 0.2111 - Iou Void-unclear: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: polynomial - training_steps: 10000 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Flat-road | Accuracy Flat-sidewalk | Accuracy Flat-crosswalk | Accuracy Flat-cyclinglane | Accuracy Flat-parkingdriveway | Accuracy Flat-railtrack | Accuracy Flat-curb | Accuracy Human-person | Accuracy Human-rider | Accuracy Vehicle-car | Accuracy Vehicle-truck | Accuracy Vehicle-bus | Accuracy 
Vehicle-tramtrain | Accuracy Vehicle-motorcycle | Accuracy Vehicle-bicycle | Accuracy Vehicle-caravan | Accuracy Vehicle-cartrailer | Accuracy Construction-building | Accuracy Construction-door | Accuracy Construction-wall | Accuracy Construction-fenceguardrail | Accuracy Construction-bridge | Accuracy Construction-tunnel | Accuracy Construction-stairs | Accuracy Object-pole | Accuracy Object-trafficsign | Accuracy Object-trafficlight | Accuracy Nature-vegetation | Accuracy Nature-terrain | Accuracy Sky | Accuracy Void-ground | Accuracy Void-dynamic | Accuracy Void-static | Accuracy Void-unclear | Iou Unlabeled | Iou Flat-road | Iou Flat-sidewalk | Iou Flat-crosswalk | Iou Flat-cyclinglane | Iou Flat-parkingdriveway | Iou Flat-railtrack | Iou Flat-curb | Iou Human-person | Iou Human-rider | Iou Vehicle-car | Iou Vehicle-truck | Iou Vehicle-bus | Iou Vehicle-tramtrain | Iou Vehicle-motorcycle | Iou Vehicle-bicycle | Iou Vehicle-caravan | Iou Vehicle-cartrailer | Iou Construction-building | Iou Construction-door | Iou Construction-wall | Iou Construction-fenceguardrail | Iou Construction-bridge | Iou Construction-tunnel | Iou Construction-stairs | Iou Object-pole | Iou Object-trafficsign | Iou Object-trafficlight | Iou Nature-vegetation | Iou Nature-terrain | Iou Sky | Iou Void-ground | Iou Void-dynamic | Iou Void-static | Iou Void-unclear | |:-------------:|:-------:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------------:|:----------------------:|:-----------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:------------------:|:---------------------:|:--------------------:|:--------------------:|:----------------------:|:--------------------:|:--------------------------:|:---------------------------:|:------------------------:|:------------------------:|:---------------------------:|:------------------------------:|:--------------------------:|:--------------------------:|:------------------------------------:|:----------------------------:|:----------------------------:|:----------------------------:|:--------------------:|:---------------------------:|:----------------------------:|:--------------------------:|:-----------------------:|:------------:|:--------------------:|:---------------------:|:--------------------:|:---------------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:| | 2.6286 | 1.0 | 107 | 1.8181 | 0.1171 | 0.1634 | 0.6717 | nan | 0.6306 | 0.9255 | 0.0 | 0.0094 | 0.0001 | nan | 0.0012 | 0.0 | 0.0 | 0.8679 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7730 | 0.0 | 0.0057 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9381 | 0.3122 | 0.7647 | 0.0 | 0.0 | 0.0001 | 0.0 | nan | 0.4481 | 0.6987 | 0.0 | 0.0094 | 0.0001 | nan | 0.0012 | 0.0 | 0.0 | 0.4529 | 0.0 | 0.0 | 
0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4526 | 0.0 | 0.0056 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6348 | 0.2923 | 0.7502 | 0.0 | 0.0 | 0.0001 | 0.0 | | 1.8336 | 2.0 | 214 | 1.4625 | 0.1293 | 0.1823 | 0.6870 | nan | 0.7452 | 0.9141 | 0.0008 | 0.1033 | 0.0 | nan | 0.0020 | 0.0 | 0.0 | 0.8612 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8218 | 0.0 | 0.0017 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7270 | 0.8431 | 0.8122 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4802 | 0.7308 | 0.0008 | 0.1016 | 0.0 | nan | 0.0020 | 0.0 | 0.0 | 0.5072 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4932 | 0.0 | 0.0017 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6013 | 0.4276 | 0.7903 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.526 | 3.0 | 321 | 1.2558 | 0.1429 | 0.1939 | 0.7077 | nan | 0.6255 | 0.9446 | 0.0136 | 0.4912 | 0.0013 | nan | 0.0009 | 0.0 | 0.0 | 0.9019 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8599 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7262 | 0.7610 | 0.8796 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5088 | 0.7241 | 0.0136 | 0.4121 | 0.0013 | nan | 0.0009 | 0.0 | 0.0 | 0.5108 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5052 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6114 | 0.4509 | 0.8337 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.3177 | 4.0 | 428 | 1.1051 | 0.1528 | 0.1993 | 0.7276 | nan | 0.6192 | 0.9398 | 0.0 | 0.6093 | 0.0072 | nan | 0.0149 | 0.0 | 0.0 | 0.8931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8390 | 0.0 | 0.0001 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8639 | 0.6996 | 0.8917 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5043 | 0.7197 | 0.0 | 0.5088 | 0.0070 | nan | 0.0143 | 0.0 | 0.0 | 0.5535 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5263 | 0.0 | 0.0001 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6828 | 0.5304 | 0.8431 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.192 | 5.0 | 535 | 1.0533 | 0.1591 | 0.2060 | 0.7356 | nan | 0.6333 | 0.9471 | 0.0071 | 0.6331 | 0.0518 | nan | 0.0512 | 0.0 | 0.0 | 0.8437 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8791 | 0.0 | 0.0021 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8150 | 0.7926 | 0.9342 | 0.0 | 0.0 | 0.0000 | 0.0 | nan | 0.5301 | 0.7436 | 0.0070 | 0.5448 | 0.0450 | nan | 0.0453 | 0.0 | 0.0 | 0.6187 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5282 | 0.0 | 0.0021 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6618 | 0.5167 | 0.8494 | 0.0 | 0.0 | 0.0000 | 0.0 | | 1.1066 | 6.0 | 642 | 0.9737 | 0.1645 | 0.2153 | 0.7459 | nan | 0.7737 | 0.8898 | 0.0194 | 0.7187 | 0.0968 | nan | 0.0597 | 0.0 | 0.0 | 0.9025 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8828 | 0.0 | 0.0082 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8544 | 0.7723 | 0.9125 | 0.0 | 0.0 | 0.0002 | 0.0 | nan | 0.5693 | 0.7770 | 0.0192 | 0.5704 | 0.0733 | nan | 0.0518 | 0.0 | 0.0 | 0.5623 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5249 | 0.0 | 0.0082 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6853 | 0.5789 | 0.8442 | 0.0 | 0.0 | 0.0002 | 0.0 | | 1.0345 | 7.0 | 749 | 0.9358 | 0.1641 | 0.2080 | 0.7484 | nan | 0.6676 | 0.9600 | 0.0015 | 0.5846 | 0.0333 | nan | 0.1226 | 0.0 | 0.0 | 0.8343 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8947 | 0.0 | 0.0195 | 0.0 | 0.0 | nan | 0.0 | 0.0074 | 0.0 | 0.0 | 0.8678 | 0.7738 | 0.8896 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5882 | 0.7553 | 0.0014 | 0.5142 | 0.0322 | nan | 0.1033 | 0.0 | 0.0 | 0.6033 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5328 | 0.0 | 0.0189 | 0.0 | 0.0 | nan | 0.0 | 0.0074 | 0.0 | 0.0 | 0.6840 | 0.5633 | 0.8454 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.9398 | 8.0 | 856 | 0.9636 | 0.1635 | 0.2096 | 0.7294 | nan | 0.5132 | 0.9711 | 0.0792 | 
0.5573 | 0.0947 | nan | 0.1237 | 0.0 | 0.0 | 0.8577 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9130 | 0.0 | 0.0729 | 0.0 | 0.0 | nan | 0.0 | 0.0297 | 0.0 | 0.0 | 0.7992 | 0.7867 | 0.9099 | 0.0 | 0.0 | 0.0000 | 0.0 | nan | 0.4766 | 0.7221 | 0.0628 | 0.4650 | 0.0834 | nan | 0.0990 | 0.0 | 0.0 | 0.6044 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5367 | 0.0 | 0.0649 | 0.0 | 0.0 | nan | 0.0 | 0.0291 | 0.0 | 0.0 | 0.6904 | 0.5492 | 0.8475 | 0.0 | 0.0 | 0.0000 | 0.0 | | 0.877 | 9.0 | 963 | 0.9514 | 0.1651 | 0.2131 | 0.7267 | nan | 0.4504 | 0.9665 | 0.0883 | 0.6428 | 0.0535 | nan | 0.1474 | 0.0 | 0.0 | 0.9087 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8611 | 0.0 | 0.1287 | 0.0 | 0.0 | nan | 0.0 | 0.0728 | 0.0 | 0.0 | 0.8297 | 0.7927 | 0.8753 | 0.0 | 0.0 | 0.0025 | 0.0 | nan | 0.4186 | 0.7179 | 0.0694 | 0.5261 | 0.0506 | nan | 0.1086 | 0.0 | 0.0 | 0.5726 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5563 | 0.0 | 0.1121 | 0.0 | 0.0 | nan | 0.0 | 0.0677 | 0.0 | 0.0 | 0.6842 | 0.5551 | 0.8429 | 0.0 | 0.0 | 0.0024 | 0.0 | | 0.8582 | 10.0 | 1070 | 0.8750 | 0.1839 | 0.2417 | 0.7451 | nan | 0.6248 | 0.9409 | 0.2622 | 0.6829 | 0.1184 | nan | 0.3192 | 0.0660 | 0.0 | 0.9059 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8449 | 0.0 | 0.3226 | 0.0 | 0.0 | nan | 0.0 | 0.1033 | 0.0 | 0.0 | 0.7591 | 0.8623 | 0.9142 | 0.0 | 0.0 | 0.0066 | 0.0 | nan | 0.5565 | 0.7807 | 0.1615 | 0.5223 | 0.0978 | nan | 0.1989 | 0.0649 | 0.0 | 0.6020 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5934 | 0.0 | 0.2091 | 0.0 | 0.0 | nan | 0.0 | 0.0914 | 0.0 | 0.0 | 0.6305 | 0.5065 | 0.8613 | 0.0 | 0.0 | 0.0065 | 0.0 | | 0.8174 | 11.0 | 1177 | 0.8511 | 0.1925 | 0.2406 | 0.7630 | nan | 0.6198 | 0.9537 | 0.1152 | 0.7450 | 0.0962 | nan | 0.2211 | 0.1179 | 0.0 | 0.8453 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8332 | 0.0 | 0.3622 | 0.0 | 0.0 | nan | 0.0 | 0.1466 | 0.0 | 0.0 | 0.8842 | 0.8011 | 0.9317 | 0.0 | 0.0 | 0.0253 | 0.0 | nan | 0.5534 | 0.7652 | 0.0810 | 0.5479 | 0.0897 | nan | 0.1529 | 0.1052 | 0.0 | 0.6908 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5975 | 0.0 | 0.2391 | 0.0 | 0.0 | nan | 0.0 | 0.1060 | 0.0 | 0.0 | 0.7168 | 0.6235 | 0.8676 | 0.0 | 0.0 | 0.0241 | 0.0 | | 0.8053 | 12.0 | 1284 | 0.8258 | 0.1958 | 0.2465 | 0.7656 | nan | 0.5861 | 0.9566 | 0.1011 | 0.7279 | 0.1687 | nan | 0.2568 | 0.1789 | 0.0 | 0.8924 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8964 | 0.0 | 0.3750 | 0.0001 | 0.0 | nan | 0.0 | 0.2163 | 0.0 | 0.0 | 0.8893 | 0.7173 | 0.9173 | 0.0 | 0.0 | 0.0083 | 0.0 | nan | 0.5385 | 0.7828 | 0.0763 | 0.5563 | 0.1347 | nan | 0.1770 | 0.1510 | 0.0 | 0.6499 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5855 | 0.0 | 0.2200 | 0.0001 | 0.0 | nan | 0.0 | 0.1464 | 0.0 | 0.0 | 0.7486 | 0.6286 | 0.8615 | 0.0 | 0.0 | 0.0081 | 0.0 | | 0.7937 | 13.0 | 1391 | 0.8232 | 0.2005 | 0.2562 | 0.7638 | nan | 0.5553 | 0.9526 | 0.1341 | 0.7287 | 0.1679 | nan | 0.3497 | 0.4409 | 0.0 | 0.8663 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8407 | 0.0 | 0.2968 | 0.0011 | 0.0 | nan | 0.0 | 0.1930 | 0.0 | 0.0 | 0.9081 | 0.7888 | 0.9409 | 0.0 | 0.0 | 0.0349 | 0.0 | nan | 0.5081 | 0.7915 | 0.0925 | 0.5494 | 0.1468 | nan | 0.1990 | 0.2966 | 0.0 | 0.6470 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5892 | 0.0 | 0.2040 | 0.0010 | 0.0 | nan | 0.0 | 0.1191 | 0.0 | 0.0 | 0.7203 | 0.6447 | 0.8744 | 0.0 | 0.0 | 0.0336 | 0.0 | | 0.7716 | 14.0 | 1498 | 0.7484 | 0.2081 | 0.2618 | 0.7843 | nan | 0.7722 | 0.9215 | 0.1264 | 0.7666 | 0.1844 | nan | 0.3523 | 0.3594 | 0.0 | 0.9196 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8831 | 0.0 | 0.2244 | 0.0188 | 
0.0 | nan | 0.0 | 0.2140 | 0.0 | 0.0 | 0.9013 | 0.7828 | 0.9271 | 0.0 | 0.0 | 0.0247 | 0.0 | nan | 0.6323 | 0.8154 | 0.0948 | 0.6211 | 0.1478 | nan | 0.2139 | 0.2617 | 0.0 | 0.6203 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6000 | 0.0 | 0.1854 | 0.0184 | 0.0 | nan | 0.0 | 0.1356 | 0.0 | 0.0 | 0.7443 | 0.6675 | 0.8759 | 0.0 | 0.0 | 0.0237 | 0.0 | | 0.697 | 15.0 | 1605 | 0.7640 | 0.2154 | 0.2750 | 0.7820 | nan | 0.6800 | 0.9378 | 0.1242 | 0.7702 | 0.2647 | nan | 0.3212 | 0.5078 | 0.0 | 0.9206 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8370 | 0.0 | 0.4683 | 0.0281 | 0.0 | nan | 0.0 | 0.2391 | 0.0 | 0.0 | 0.8975 | 0.7546 | 0.9374 | 0.0 | 0.0 | 0.1111 | 0.0 | nan | 0.5954 | 0.8154 | 0.0834 | 0.5941 | 0.1778 | nan | 0.2096 | 0.3168 | 0.0 | 0.6310 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6256 | 0.0 | 0.2543 | 0.0276 | 0.0 | nan | 0.0 | 0.1788 | 0.0 | 0.0 | 0.7602 | 0.6507 | 0.8786 | 0.0 | 0.0 | 0.0922 | 0.0 | | 0.7087 | 16.0 | 1712 | 0.7756 | 0.2143 | 0.2689 | 0.7783 | nan | 0.6356 | 0.9559 | 0.1393 | 0.7969 | 0.1327 | nan | 0.2887 | 0.4540 | 0.0 | 0.9110 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8060 | 0.0 | 0.3380 | 0.1029 | 0.0 | nan | 0.0 | 0.2048 | 0.0 | 0.0 | 0.8951 | 0.7845 | 0.9515 | 0.0 | 0.0 | 0.2063 | 0.0 | nan | 0.5469 | 0.7997 | 0.0929 | 0.6084 | 0.1135 | nan | 0.1712 | 0.3306 | 0.0 | 0.6166 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6291 | 0.0 | 0.2484 | 0.0982 | 0.0 | nan | 0.0 | 0.1616 | 0.0 | 0.0 | 0.7539 | 0.6542 | 0.8732 | 0.0 | 0.0 | 0.1599 | 0.0 | | 0.7095 | 17.0 | 1819 | 0.7780 | 0.2134 | 0.2775 | 0.7724 | nan | 0.6105 | 0.9634 | 0.2158 | 0.6514 | 0.1226 | nan | 0.3240 | 0.6724 | 0.0 | 0.9104 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8485 | 0.0 | 0.5700 | 0.0206 | 0.0 | nan | 0.0 | 0.2101 | 0.0 | 0.0 | 0.8578 | 0.7978 | 0.9248 | 0.0 | 0.0 | 0.1781 | 0.0 | nan | 0.5672 | 0.7839 | 0.1105 | 0.5158 | 0.1103 | nan | 0.2067 | 0.3594 | 0.0 | 0.6401 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6241 | 0.0 | 0.2685 | 0.0198 | 0.0 | nan | 0.0 | 0.1755 | 0.0 | 0.0 | 0.7708 | 0.6613 | 0.8734 | 0.0 | 0.0 | 0.1423 | 0.0 | | 0.6581 | 18.0 | 1926 | 0.7508 | 0.2171 | 0.2795 | 0.7801 | nan | 0.7706 | 0.9064 | 0.2124 | 0.6835 | 0.1403 | nan | 0.4551 | 0.6189 | 0.0 | 0.8718 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.8914 | 0.0 | 0.4337 | 0.0635 | 0.0 | nan | 0.0 | 0.2098 | 0.0 | 0.0 | 0.8830 | 0.8179 | 0.9307 | 0.0 | 0.0 | 0.0544 | 0.0 | nan | 0.6302 | 0.8074 | 0.1203 | 0.5742 | 0.1166 | nan | 0.2470 | 0.3759 | 0.0 | 0.6507 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.6008 | 0.0 | 0.2527 | 0.0586 | 0.0 | nan | 0.0 | 0.1727 | 0.0 | 0.0 | 0.7493 | 0.6594 | 0.8812 | 0.0 | 0.0 | 0.0496 | 0.0 | | 0.6426 | 19.0 | 2033 | 0.7245 | 0.2188 | 0.2826 | 0.7828 | nan | 0.6460 | 0.9538 | 0.2403 | 0.7412 | 0.1362 | nan | 0.4466 | 0.6738 | 0.0 | 0.8529 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8634 | 0.0 | 0.5070 | 0.0553 | 0.0 | nan | 0.0 | 0.2194 | 0.0 | 0.0 | 0.9065 | 0.7283 | 0.9258 | 0.0 | 0.0 | 0.1474 | 0.0 | nan | 0.5872 | 0.8063 | 0.1530 | 0.5897 | 0.1153 | nan | 0.2640 | 0.2673 | 0.0 | 0.7023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6152 | 0.0 | 0.2695 | 0.0521 | 0.0 | nan | 0.0 | 0.1736 | 0.0 | 0.0 | 0.7631 | 0.6394 | 0.8815 | 0.0 | 0.0 | 0.1230 | 0.0 | | 0.6702 | 20.0 | 2140 | 0.7611 | 0.2146 | 0.2804 | 0.7730 | nan | 0.6534 | 0.9088 | 0.1364 | 0.8361 | 0.0680 | nan | 0.3070 | 0.6003 | 0.0 | 0.8907 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0054 | 0.0 | 0.0 | 0.8747 | 0.0 | 0.4397 | 0.1578 | 0.0 | nan | 0.0 | 0.3142 | 0.0 | 0.0 | 0.8835 | 0.8413 | 0.9161 | 0.0 | 0.0 | 0.1389 
| 0.0 | nan | 0.5790 | 0.7775 | 0.0943 | 0.5104 | 0.0628 | nan | 0.2177 | 0.2765 | 0.0 | 0.6749 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0054 | 0.0 | 0.0 | 0.6273 | 0.0 | 0.2596 | 0.1387 | 0.0 | nan | 0.0 | 0.1920 | 0.0 | 0.0 | 0.7738 | 0.6739 | 0.8831 | 0.0 | 0.0 | 0.1208 | 0.0 | | 0.6459 | 21.0 | 2247 | 0.6979 | 0.2327 | 0.2995 | 0.7935 | nan | 0.7089 | 0.9446 | 0.1698 | 0.7339 | 0.2010 | nan | 0.4253 | 0.7356 | 0.0 | 0.8986 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0489 | 0.0 | 0.0 | 0.8166 | 0.0 | 0.5715 | 0.1877 | 0.0 | nan | 0.0 | 0.2962 | 0.0 | 0.0 | 0.8889 | 0.8529 | 0.9257 | 0.0 | 0.0 | 0.1778 | 0.0 | nan | 0.6276 | 0.8195 | 0.1146 | 0.6042 | 0.1517 | nan | 0.2476 | 0.3503 | 0.0 | 0.6861 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0482 | 0.0 | 0.0 | 0.6283 | 0.0 | 0.2914 | 0.1585 | 0.0 | nan | 0.0 | 0.2180 | 0.0 | 0.0 | 0.7814 | 0.6855 | 0.8852 | 0.0 | 0.0 | 0.1496 | 0.0 | | 0.6129 | 22.0 | 2354 | 0.6890 | 0.2361 | 0.3018 | 0.8011 | nan | 0.7559 | 0.9447 | 0.2162 | 0.7605 | 0.2859 | nan | 0.3558 | 0.7490 | 0.0 | 0.9139 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0361 | 0.0 | 0.0 | 0.8660 | 0.0 | 0.4152 | 0.2028 | 0.0 | nan | 0.0 | 0.2883 | 0.0 | 0.0 | 0.8650 | 0.8445 | 0.9420 | 0.0 | 0.0 | 0.2157 | 0.0 | nan | 0.6661 | 0.8340 | 0.1244 | 0.6458 | 0.2020 | nan | 0.2536 | 0.3440 | 0.0 | 0.6636 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0360 | 0.0 | 0.0 | 0.6166 | 0.0 | 0.2880 | 0.1800 | 0.0 | nan | 0.0 | 0.2161 | 0.0 | 0.0 | 0.7648 | 0.6666 | 0.8811 | 0.0 | 0.0 | 0.1736 | 0.0 | | 0.587 | 23.0 | 2461 | 0.6791 | 0.2490 | 0.3175 | 0.8026 | nan | 0.7249 | 0.9426 | 0.1412 | 0.8048 | 0.2683 | nan | 0.4024 | 0.7341 | 0.0 | 0.8995 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3563 | 0.0 | 0.0 | 0.7969 | 0.0 | 0.5959 | 0.2566 | 0.0 | nan | 0.0 | 0.3140 | 0.0 | 0.0 | 0.8977 | 0.8109 | 0.9393 | 0.0 | 0.0 | 0.2740 | 0.0 | nan | 0.6384 | 0.8359 | 0.0943 | 0.6255 | 0.1975 | nan | 0.2657 | 0.3433 | 0.0 | 0.7017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3341 | 0.0 | 0.0 | 0.6332 | 0.0 | 0.2868 | 0.2052 | 0.0 | nan | 0.0 | 0.2232 | 0.0 | 0.0 | 0.7927 | 0.6943 | 0.8923 | 0.0 | 0.0 | 0.2045 | 0.0 | | 0.591 | 24.0 | 2568 | 0.6873 | 0.2347 | 0.3030 | 0.7947 | nan | 0.7352 | 0.9359 | 0.1105 | 0.8369 | 0.1648 | nan | 0.3745 | 0.7308 | 0.0 | 0.9372 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1727 | 0.0 | 0.0 | 0.7772 | 0.0 | 0.4216 | 0.3084 | 0.0 | nan | 0.0 | 0.3076 | 0.0 | 0.0 | 0.8907 | 0.8103 | 0.9508 | 0.0 | 0.0 | 0.2295 | 0.0 | nan | 0.6287 | 0.8325 | 0.0757 | 0.5974 | 0.1304 | nan | 0.2501 | 0.3415 | 0.0 | 0.6188 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1668 | 0.0 | 0.0 | 0.6259 | 0.0 | 0.2810 | 0.2333 | 0.0 | nan | 0.0 | 0.2182 | 0.0 | 0.0 | 0.7795 | 0.6710 | 0.8913 | 0.0 | 0.0 | 0.1671 | 0.0 | | 0.5565 | 25.0 | 2675 | 0.6970 | 0.2425 | 0.3119 | 0.7951 | nan | 0.7627 | 0.9196 | 0.3318 | 0.8099 | 0.1691 | nan | 0.4192 | 0.7074 | 0.0 | 0.9130 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1820 | 0.0 | 0.0 | 0.8595 | 0.0 | 0.4412 | 0.2585 | 0.0 | nan | 0.0 | 0.3007 | 0.0 | 0.0 | 0.8415 | 0.8706 | 0.9511 | 0.0 | 0.0 | 0.2416 | 0.0 | nan | 0.6508 | 0.8209 | 0.2078 | 0.6405 | 0.1411 | nan | 0.2586 | 0.3796 | 0.0 | 0.6485 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1730 | 0.0 | 0.0 | 0.6181 | 0.0 | 0.2824 | 0.2301 | 0.0 | nan | 0.0 | 0.2300 | 0.0 | 0.0 | 0.7563 | 0.6375 | 0.8938 | 0.0 | 0.0 | 0.1909 | 0.0 | | 0.5673 | 26.0 | 2782 | 0.6936 | 0.2464 | 0.3101 | 0.7956 | nan | 0.6847 | 0.9560 | 0.0266 | 0.6493 | 0.1651 | nan | 0.4754 | 0.7206 | 0.0 | 0.9053 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4206 | 0.0 | 0.0 | 0.8428 | 0.0 | 0.5624 | 0.3126 | 0.0 | nan | 0.0 | 0.3289 | 0.0 | 0.0 | 0.8940 | 0.8233 | 0.9411 | 0.0 | 0.0 | 0.2145 | 0.0 | nan | 0.6067 | 0.8103 | 0.0212 | 0.5254 | 0.1418 | 
nan | 0.2812 | 0.4028 | 0.0 | 0.7105 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3881 | 0.0 | 0.0 | 0.6410 | 0.0 | 0.3044 | 0.2496 | 0.0 | nan | 0.0 | 0.2443 | 0.0 | 0.0 | 0.7941 | 0.6923 | 0.8954 | 0.0 | 0.0 | 0.1761 | 0.0 | | 0.5759 | 27.0 | 2889 | 0.6665 | 0.2521 | 0.3173 | 0.8104 | nan | 0.7732 | 0.9421 | 0.0875 | 0.7788 | 0.1832 | nan | 0.4080 | 0.7830 | 0.0 | 0.8985 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3717 | 0.0 | 0.0 | 0.8734 | 0.0 | 0.4788 | 0.3323 | 0.0 | nan | 0.0 | 0.3734 | 0.0 | 0.0 | 0.8978 | 0.8224 | 0.9587 | 0.0 | 0.0 | 0.1901 | 0.0 | nan | 0.6721 | 0.8285 | 0.0644 | 0.6635 | 0.1458 | nan | 0.2751 | 0.3303 | 0.0 | 0.7125 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3469 | 0.0 | 0.0 | 0.6399 | 0.0 | 0.3088 | 0.2740 | 0.0 | nan | 0.0 | 0.2524 | 0.0 | 0.0 | 0.7920 | 0.7042 | 0.8952 | 0.0 | 0.0 | 0.1605 | 0.0 | | 0.5519 | 28.0 | 2996 | 0.6882 | 0.2494 | 0.3181 | 0.8030 | nan | 0.6854 | 0.9535 | 0.1770 | 0.8538 | 0.1942 | nan | 0.3966 | 0.7358 | 0.0 | 0.9037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3296 | 0.0 | 0.0 | 0.8041 | 0.0 | 0.6758 | 0.2825 | 0.0 | nan | 0.0 | 0.2941 | 0.0 | 0.0 | 0.8816 | 0.8495 | 0.9256 | 0.0 | 0.0 | 0.2358 | 0.0 | nan | 0.6237 | 0.8353 | 0.1117 | 0.6402 | 0.1595 | nan | 0.2749 | 0.3770 | 0.0 | 0.6903 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3072 | 0.0 | 0.0 | 0.6350 | 0.0 | 0.2882 | 0.2358 | 0.0 | nan | 0.0 | 0.2300 | 0.0 | 0.0 | 0.7959 | 0.7063 | 0.8876 | 0.0 | 0.0 | 0.1837 | 0.0 | | 0.5543 | 29.0 | 3103 | 0.6939 | 0.2562 | 0.3221 | 0.8003 | nan | 0.5953 | 0.9575 | 0.1294 | 0.8713 | 0.2264 | nan | 0.4049 | 0.7236 | 0.0 | 0.8913 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5928 | 0.0 | 0.0 | 0.8923 | 0.0 | 0.4924 | 0.3422 | 0.0 | nan | 0.0 | 0.3405 | 0.0 | 0.0 | 0.8887 | 0.8011 | 0.9496 | 0.0 | 0.0 | 0.2076 | 0.0 | nan | 0.5561 | 0.8312 | 0.0831 | 0.6258 | 0.1713 | nan | 0.2404 | 0.3998 | 0.0 | 0.7266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5231 | 0.0 | 0.0 | 0.6360 | 0.0 | 0.3023 | 0.2928 | 0.0 | nan | 0.0 | 0.2467 | 0.0 | 0.0 | 0.8007 | 0.6943 | 0.9016 | 0.0 | 0.0 | 0.1682 | 0.0 | | 0.539 | 30.0 | 3210 | 0.6745 | 0.2556 | 0.3208 | 0.8078 | nan | 0.6808 | 0.9574 | 0.1187 | 0.8040 | 0.2318 | nan | 0.4663 | 0.7716 | 0.0 | 0.9109 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5974 | 0.0 | 0.0 | 0.8887 | 0.0 | 0.3730 | 0.2630 | 0.0 | nan | 0.0 | 0.3166 | 0.0 | 0.0 | 0.8920 | 0.8438 | 0.9532 | 0.0 | 0.0 | 0.1962 | 0.0 | nan | 0.6214 | 0.8387 | 0.0838 | 0.6456 | 0.1843 | nan | 0.2902 | 0.3659 | 0.0 | 0.7135 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5120 | 0.0 | 0.0 | 0.6310 | 0.0 | 0.2731 | 0.2455 | 0.0 | nan | 0.0 | 0.2429 | 0.0 | 0.0 | 0.7802 | 0.6832 | 0.9058 | 0.0 | 0.0 | 0.1619 | 0.0 | | 0.5517 | 31.0 | 3317 | 0.6675 | 0.2597 | 0.3320 | 0.8034 | nan | 0.7208 | 0.9193 | 0.2627 | 0.8866 | 0.1661 | nan | 0.3632 | 0.7967 | 0.0 | 0.9130 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6360 | 0.0 | 0.0 | 0.8301 | 0.0 | 0.5332 | 0.2807 | 0.0 | nan | 0.0 | 0.3774 | 0.0 | 0.0 | 0.9068 | 0.8339 | 0.9590 | 0.0 | 0.0 | 0.2374 | 0.0 | nan | 0.6386 | 0.8266 | 0.1726 | 0.5421 | 0.1405 | nan | 0.2481 | 0.4136 | 0.0 | 0.6972 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5330 | 0.0 | 0.0 | 0.6504 | 0.0 | 0.3249 | 0.2532 | 0.0 | nan | 0.0 | 0.2697 | 0.0 | 0.0 | 0.7994 | 0.7023 | 0.9021 | 0.0 | 0.0 | 0.1950 | 0.0 | | 0.5111 | 32.0 | 3424 | 0.6563 | 0.2651 | 0.3346 | 0.8095 | nan | 0.7287 | 0.9359 | 0.1284 | 0.8389 | 0.2655 | nan | 0.4372 | 0.7783 | 0.0 | 0.9067 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6431 | 0.0 | 0.0 | 0.8414 | 0.0 | 0.3977 | 0.3680 | 0.0 | nan | 0.0 | 0.4788 | 0.0 | 0.0 | 0.8987 | 0.8325 | 0.9500 | 0.0 | 0.0 | 0.2780 | 0.0 | nan | 0.6414 | 0.8299 | 0.0884 | 0.6024 | 0.2065 | nan | 0.2703 | 0.4066 | 0.0 | 0.7223 | 0.0 | 0.0 | 0.0 | 
0.0 | 0.5560 | 0.0 | 0.0 | 0.6554 | 0.0 | 0.3017 | 0.3239 | 0.0 | nan | 0.0 | 0.2607 | 0.0 | 0.0 | 0.7939 | 0.6986 | 0.9078 | 0.0 | 0.0 | 0.2189 | 0.0 | | 0.4945 | 33.0 | 3531 | 0.6428 | 0.2652 | 0.3290 | 0.8103 | nan | 0.7400 | 0.9456 | 0.1315 | 0.7854 | 0.2355 | nan | 0.4819 | 0.6784 | 0.0 | 0.9072 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6235 | 0.0 | 0.0 | 0.8110 | 0.0 | 0.5156 | 0.4098 | 0.0 | nan | 0.0 | 0.3817 | 0.0 | 0.0 | 0.9075 | 0.8345 | 0.9478 | 0.0 | 0.0 | 0.1922 | 0.0 | nan | 0.6629 | 0.8240 | 0.0888 | 0.6027 | 0.1846 | nan | 0.3229 | 0.4525 | 0.0 | 0.6842 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5536 | 0.0 | 0.0 | 0.6578 | 0.0 | 0.3255 | 0.3138 | 0.0 | nan | 0.0 | 0.2600 | 0.0 | 0.0 | 0.7950 | 0.7071 | 0.8974 | 0.0 | 0.0 | 0.1529 | 0.0 | | 0.5566 | 34.0 | 3638 | 0.6430 | 0.2652 | 0.3343 | 0.8119 | nan | 0.7506 | 0.9485 | 0.1373 | 0.7617 | 0.2401 | nan | 0.5094 | 0.7984 | 0.0 | 0.8919 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7165 | 0.0 | 0.0 | 0.8105 | 0.0 | 0.5638 | 0.3511 | 0.0 | nan | 0.0 | 0.3629 | 0.0 | 0.0 | 0.9171 | 0.7909 | 0.9522 | 0.0 | 0.0 | 0.1937 | 0.0 | nan | 0.6646 | 0.8310 | 0.0938 | 0.6654 | 0.1870 | nan | 0.3027 | 0.3954 | 0.0 | 0.7227 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5810 | 0.0 | 0.0 | 0.6415 | 0.0 | 0.3167 | 0.2941 | 0.0 | nan | 0.0 | 0.2454 | 0.0 | 0.0 | 0.7938 | 0.6864 | 0.9048 | 0.0 | 0.0 | 0.1617 | 0.0 | | 0.4937 | 35.0 | 3745 | 0.6567 | 0.2675 | 0.3356 | 0.8082 | nan | 0.7545 | 0.9241 | 0.1340 | 0.8054 | 0.2432 | nan | 0.5336 | 0.7489 | 0.0 | 0.9184 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7152 | 0.0 | 0.0 | 0.8418 | 0.0 | 0.6014 | 0.3744 | 0.0 | nan | 0.0 | 0.3334 | 0.0 | 0.0 | 0.9021 | 0.7771 | 0.9489 | 0.0 | 0.0 | 0.1837 | 0.0 | nan | 0.6532 | 0.8199 | 0.0922 | 0.6511 | 0.1876 | nan | 0.2792 | 0.4810 | 0.0 | 0.7016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5924 | 0.0 | 0.0 | 0.6616 | 0.0 | 0.3433 | 0.3078 | 0.0 | nan | 0.0 | 0.2422 | 0.0 | 0.0 | 0.7996 | 0.6874 | 0.9074 | 0.0 | 0.0 | 0.1509 | 0.0 | | 0.5153 | 36.0 | 3852 | 0.6487 | 0.2669 | 0.3347 | 0.8111 | nan | 0.6899 | 0.9487 | 0.1302 | 0.8613 | 0.1863 | nan | 0.4750 | 0.7652 | 0.0 | 0.9031 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6507 | 0.0 | 0.0 | 0.8072 | 0.0 | 0.5373 | 0.4560 | 0.0 | nan | 0.0 | 0.3429 | 0.0 | 0.0 | 0.9169 | 0.8264 | 0.9579 | 0.0 | 0.0 | 0.2557 | 0.0 | nan | 0.6214 | 0.8292 | 0.0922 | 0.6469 | 0.1563 | nan | 0.2912 | 0.4294 | 0.0 | 0.7475 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5577 | 0.0 | 0.0 | 0.6468 | 0.0 | 0.3131 | 0.3209 | 0.0 | nan | 0.0 | 0.2511 | 0.0 | 0.0 | 0.8035 | 0.7209 | 0.9071 | 0.0 | 0.0 | 0.2040 | 0.0 | | 0.4799 | 37.0 | 3959 | 0.6417 | 0.2709 | 0.3385 | 0.8128 | nan | 0.7164 | 0.9375 | 0.1316 | 0.8677 | 0.2338 | nan | 0.4291 | 0.7674 | 0.0 | 0.8821 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7224 | 0.0 | 0.0 | 0.8288 | 0.0 | 0.5695 | 0.3840 | 0.0 | nan | 0.0 | 0.3546 | 0.0 | 0.0 | 0.9239 | 0.7850 | 0.9592 | 0.0 | 0.0 | 0.3387 | 0.0 | nan | 0.6482 | 0.8243 | 0.0909 | 0.6552 | 0.1885 | nan | 0.2861 | 0.4622 | 0.0 | 0.7307 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5758 | 0.0 | 0.0 | 0.6606 | 0.0 | 0.3390 | 0.3260 | 0.0 | nan | 0.0 | 0.2559 | 0.0 | 0.0 | 0.7939 | 0.6962 | 0.9086 | 0.0 | 0.0 | 0.2275 | 0.0 | | 0.4863 | 38.0 | 4066 | 0.6544 | 0.2653 | 0.3296 | 0.8080 | nan | 0.7432 | 0.9174 | 0.1061 | 0.8839 | 0.1812 | nan | 0.3608 | 0.6850 | 0.0 | 0.8926 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7417 | 0.0 | 0.0 | 0.8696 | 0.0 | 0.4454 | 0.3772 | 0.0 | nan | 0.0 | 0.3579 | 0.0 | 0.0 | 0.9243 | 0.7896 | 0.9503 | 0.0 | 0.0 | 0.3213 | 0.0 | nan | 0.6546 | 0.8262 | 0.0853 | 0.5395 | 0.1506 | nan | 0.2774 | 0.4358 | 0.0 | 0.7446 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5958 | 0.0 | 0.0 | 0.6499 | 0.0 | 0.3286 | 
0.3315 | 0.0 | nan | 0.0 | 0.2465 | 0.0 | 0.0 | 0.7906 | 0.6857 | 0.9119 | 0.0 | 0.0 | 0.2354 | 0.0 | | 0.4924 | 39.0 | 4173 | 0.6366 | 0.2676 | 0.3353 | 0.8139 | nan | 0.8348 | 0.8993 | 0.1203 | 0.8621 | 0.2236 | nan | 0.4680 | 0.7511 | 0.0 | 0.9173 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7338 | 0.0 | 0.0 | 0.8951 | 0.0 | 0.4269 | 0.3728 | 0.0 | nan | 0.0 | 0.3840 | 0.0 | 0.0 | 0.8954 | 0.8431 | 0.9483 | 0.0 | 0.0 | 0.1521 | 0.0 | nan | 0.7017 | 0.8287 | 0.0947 | 0.6259 | 0.1764 | nan | 0.2719 | 0.4769 | 0.0 | 0.7039 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5897 | 0.0 | 0.0 | 0.6428 | 0.0 | 0.3148 | 0.3347 | 0.0 | nan | 0.0 | 0.2526 | 0.0 | 0.0 | 0.8029 | 0.7084 | 0.9057 | 0.0 | 0.0 | 0.1322 | 0.0 | | 0.4888 | 40.0 | 4280 | 0.6359 | 0.2700 | 0.3366 | 0.8159 | nan | 0.7028 | 0.9550 | 0.1329 | 0.8665 | 0.2352 | nan | 0.4666 | 0.7618 | 0.0 | 0.9236 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7581 | 0.0 | 0.0 | 0.8869 | 0.0 | 0.4960 | 0.4000 | 0.0 | nan | 0.0 | 0.3841 | 0.0 | 0.0 | 0.8872 | 0.7911 | 0.9617 | 0.0 | 0.0 | 0.1630 | 0.0 | nan | 0.6340 | 0.8422 | 0.0921 | 0.6435 | 0.1872 | nan | 0.3106 | 0.4853 | 0.0 | 0.7116 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6024 | 0.0 | 0.0 | 0.6476 | 0.0 | 0.3429 | 0.3344 | 0.0 | nan | 0.0 | 0.2610 | 0.0 | 0.0 | 0.8079 | 0.6908 | 0.9049 | 0.0 | 0.0 | 0.1404 | 0.0 | | 0.483 | 41.0 | 4387 | 0.6366 | 0.2693 | 0.3406 | 0.8126 | nan | 0.7171 | 0.9324 | 0.1344 | 0.8655 | 0.2242 | nan | 0.4717 | 0.7440 | 0.0 | 0.9005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7429 | 0.0 | 0.0 | 0.8466 | 0.0 | 0.6028 | 0.3963 | 0.0 | nan | 0.0 | 0.3660 | 0.0 | 0.0 | 0.8904 | 0.8841 | 0.9521 | 0.0 | 0.0 | 0.2270 | 0.0 | nan | 0.6429 | 0.8377 | 0.0920 | 0.6265 | 0.1774 | nan | 0.3087 | 0.4640 | 0.0 | 0.7265 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5897 | 0.0 | 0.0 | 0.6584 | 0.0 | 0.3360 | 0.3145 | 0.0 | nan | 0.0 | 0.2553 | 0.0 | 0.0 | 0.7999 | 0.6916 | 0.9112 | 0.0 | 0.0 | 0.1841 | 0.0 | | 0.5009 | 42.0 | 4494 | 0.6258 | 0.2710 | 0.3339 | 0.8183 | nan | 0.7555 | 0.9483 | 0.1454 | 0.8212 | 0.1967 | nan | 0.4535 | 0.6912 | 0.0 | 0.8826 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7402 | 0.0 | 0.0 | 0.8792 | 0.0 | 0.5108 | 0.4389 | 0.0 | nan | 0.0 | 0.3686 | 0.0 | 0.0 | 0.9180 | 0.7769 | 0.9484 | 0.0 | 0.0 | 0.2080 | 0.0 | nan | 0.6616 | 0.8477 | 0.1026 | 0.6467 | 0.1621 | nan | 0.3080 | 0.4639 | 0.0 | 0.7380 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6009 | 0.0 | 0.0 | 0.6392 | 0.0 | 0.3440 | 0.3440 | 0.0 | nan | 0.0 | 0.2535 | 0.0 | 0.0 | 0.7983 | 0.6804 | 0.9080 | 0.0 | 0.0 | 0.1744 | 0.0 | | 0.4219 | 43.0 | 4601 | 0.6288 | 0.2719 | 0.3438 | 0.8167 | nan | 0.7323 | 0.9418 | 0.1150 | 0.7904 | 0.3300 | nan | 0.5284 | 0.8029 | 0.0 | 0.9075 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7436 | 0.0 | 0.0 | 0.8522 | 0.0 | 0.5399 | 0.4230 | 0.0 | nan | 0.0 | 0.3909 | 0.0 | 0.0 | 0.8917 | 0.8602 | 0.9687 | 0.0 | 0.0000 | 0.1831 | 0.0 | nan | 0.6487 | 0.8395 | 0.0855 | 0.6434 | 0.2450 | nan | 0.3106 | 0.4472 | 0.0 | 0.7469 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5633 | 0.0 | 0.0 | 0.6472 | 0.0 | 0.3279 | 0.3462 | 0.0 | nan | 0.0 | 0.2604 | 0.0 | 0.0 | 0.8074 | 0.7212 | 0.9040 | 0.0 | 0.0000 | 0.1569 | 0.0 | | 0.4786 | 44.0 | 4708 | 0.6198 | 0.2760 | 0.3402 | 0.8214 | nan | 0.7436 | 0.9499 | 0.1392 | 0.7792 | 0.3171 | nan | 0.5137 | 0.7376 | 0.0 | 0.9098 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7106 | 0.0 | 0.0 | 0.8936 | 0.0 | 0.4527 | 0.4079 | 0.0 | nan | 0.0 | 0.3822 | 0.0 | 0.0 | 0.8944 | 0.8624 | 0.9497 | 0.0 | 0.0 | 0.2424 | 0.0 | nan | 0.6681 | 0.8415 | 0.0922 | 0.6642 | 0.2266 | nan | 0.3429 | 0.4575 | 0.0 | 0.7339 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5831 | 0.0 | 0.0 | 0.6422 | 0.0 | 0.3334 | 0.3469 | 0.0 | nan | 0.0 | 0.2604 | 0.0 | 0.0 | 
0.8106 | 0.7247 | 0.9118 | 0.0 | 0.0 | 0.1917 | 0.0 | | 0.4707 | 45.0 | 4815 | 0.6180 | 0.2747 | 0.3451 | 0.8199 | nan | 0.7173 | 0.9483 | 0.1182 | 0.8281 | 0.3361 | nan | 0.5437 | 0.7921 | 0.0 | 0.8841 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7553 | 0.0 | 0.0 | 0.8626 | 0.0 | 0.4548 | 0.4455 | 0.0 | nan | 0.0 | 0.3826 | 0.0 | 0.0 | 0.9011 | 0.8459 | 0.9673 | 0.0 | 0.0001 | 0.2609 | 0.0 | nan | 0.6556 | 0.8428 | 0.0841 | 0.6664 | 0.2449 | nan | 0.3186 | 0.4486 | 0.0 | 0.7511 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5700 | 0.0 | 0.0 | 0.6577 | 0.0 | 0.3207 | 0.3424 | 0.0 | nan | 0.0 | 0.2569 | 0.0 | 0.0 | 0.8038 | 0.7080 | 0.9087 | 0.0 | 0.0001 | 0.2085 | 0.0 | | 0.4779 | 46.0 | 4922 | 0.6370 | 0.2721 | 0.3407 | 0.8173 | nan | 0.7184 | 0.9500 | 0.1183 | 0.8408 | 0.2093 | nan | 0.4735 | 0.7918 | 0.0 | 0.9066 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7902 | 0.0 | 0.0 | 0.8807 | 0.0 | 0.5252 | 0.3672 | 0.0 | nan | 0.0 | 0.3848 | 0.0 | 0.0 | 0.8909 | 0.8341 | 0.9619 | 0.0 | 0.0 | 0.2590 | 0.0 | nan | 0.6536 | 0.8352 | 0.0857 | 0.6622 | 0.1713 | nan | 0.2987 | 0.4503 | 0.0 | 0.7568 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5968 | 0.0 | 0.0 | 0.6392 | 0.0 | 0.3366 | 0.3266 | 0.0 | nan | 0.0 | 0.2634 | 0.0 | 0.0 | 0.8086 | 0.7051 | 0.9098 | 0.0 | 0.0 | 0.2081 | 0.0 | | 0.4473 | 47.0 | 5029 | 0.6425 | 0.2713 | 0.3387 | 0.8133 | nan | 0.6994 | 0.9423 | 0.0690 | 0.8474 | 0.2233 | nan | 0.4301 | 0.7275 | 0.0 | 0.9012 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8387 | 0.0 | 0.0 | 0.8716 | 0.0 | 0.4686 | 0.4532 | 0.0 | nan | 0.0 | 0.4030 | 0.0 | 0.0 | 0.9117 | 0.8222 | 0.9505 | 0.0 | 0.0064 | 0.2731 | 0.0 | nan | 0.6344 | 0.8294 | 0.0586 | 0.5998 | 0.1831 | nan | 0.3108 | 0.4896 | 0.0 | 0.7499 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6183 | 0.0 | 0.0 | 0.6457 | 0.0 | 0.3075 | 0.3404 | 0.0 | nan | 0.0 | 0.2601 | 0.0 | 0.0 | 0.8108 | 0.7115 | 0.9146 | 0.0 | 0.0064 | 0.2109 | 0.0 | | 0.473 | 48.0 | 5136 | 0.6422 | 0.2697 | 0.3360 | 0.8112 | nan | 0.6854 | 0.9420 | 0.1309 | 0.8273 | 0.2546 | nan | 0.5640 | 0.7386 | 0.0 | 0.9125 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7923 | 0.0 | 0.0 | 0.8816 | 0.0 | 0.4047 | 0.3684 | 0.0 | nan | 0.0 | 0.3528 | 0.0 | 0.0 | 0.9236 | 0.7482 | 0.9633 | 0.0 | 0.0029 | 0.2587 | 0.0 | nan | 0.6186 | 0.8410 | 0.0933 | 0.6505 | 0.2030 | nan | 0.3017 | 0.4866 | 0.0 | 0.7350 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6182 | 0.0 | 0.0 | 0.6326 | 0.0 | 0.2819 | 0.3208 | 0.0 | nan | 0.0 | 0.2567 | 0.0 | 0.0 | 0.7975 | 0.6717 | 0.9122 | 0.0 | 0.0029 | 0.2051 | 0.0 | | 0.4241 | 49.0 | 5243 | 0.6255 | 0.2737 | 0.3387 | 0.8166 | nan | 0.7375 | 0.9367 | 0.1313 | 0.8636 | 0.2763 | nan | 0.5443 | 0.7194 | 0.0 | 0.8946 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7436 | 0.0 | 0.0 | 0.8931 | 0.0 | 0.4677 | 0.3851 | 0.0 | nan | 0.0 | 0.3614 | 0.0 | 0.0 | 0.8977 | 0.7758 | 0.9657 | 0.0 | 0.0063 | 0.2369 | 0.0 | nan | 0.6546 | 0.8372 | 0.0888 | 0.6319 | 0.2170 | nan | 0.3243 | 0.4674 | 0.0 | 0.7569 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6013 | 0.0 | 0.0 | 0.6424 | 0.0 | 0.3288 | 0.3391 | 0.0 | nan | 0.0 | 0.2665 | 0.0 | 0.0 | 0.8066 | 0.6834 | 0.9051 | 0.0 | 0.0062 | 0.2002 | 0.0 | | 0.4681 | 50.0 | 5350 | 0.6196 | 0.2712 | 0.3393 | 0.8204 | nan | 0.8128 | 0.9339 | 0.1273 | 0.8327 | 0.2242 | nan | 0.4888 | 0.7883 | 0.0 | 0.9409 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6982 | 0.0 | 0.0 | 0.8247 | 0.0 | 0.5636 | 0.3920 | 0.0 | nan | 0.0 | 0.3897 | 0.0 | 0.0 | 0.9191 | 0.7609 | 0.9612 | 0.0 | 0.0101 | 0.1876 | 0.0 | nan | 0.6913 | 0.8442 | 0.0932 | 0.6740 | 0.1778 | nan | 0.3092 | 0.4430 | 0.0 | 0.6827 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5736 | 0.0 | 0.0 | 0.6605 | 0.0 | 0.3690 | 0.3381 | 0.0 | nan | 0.0 | 0.2675 | 0.0 | 0.0 | 0.7983 | 0.6753 | 0.9113 | 0.0 
| 0.0100 | 0.1585 | 0.0 | | 0.4074 | 51.0 | 5457 | 0.6271 | 0.2775 | 0.3439 | 0.8195 | nan | 0.7294 | 0.9401 | 0.1326 | 0.8529 | 0.2439 | nan | 0.5318 | 0.7294 | 0.0 | 0.9195 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8128 | 0.0 | 0.0 | 0.8700 | 0.0 | 0.4981 | 0.3919 | 0.0 | nan | 0.0 | 0.3983 | 0.0 | 0.0 | 0.9130 | 0.8188 | 0.9577 | 0.0 | 0.0223 | 0.2417 | 0.0 | nan | 0.6655 | 0.8368 | 0.0844 | 0.6580 | 0.1969 | nan | 0.3341 | 0.4974 | 0.0 | 0.7287 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6328 | 0.0 | 0.0 | 0.6460 | 0.0 | 0.3449 | 0.3422 | 0.0 | nan | 0.0 | 0.2711 | 0.0 | 0.0 | 0.8063 | 0.7061 | 0.9139 | 0.0 | 0.0216 | 0.1940 | 0.0 | | 0.4509 | 52.0 | 5564 | 0.6222 | 0.2796 | 0.3433 | 0.8223 | nan | 0.7406 | 0.9468 | 0.1266 | 0.8702 | 0.2352 | nan | 0.4488 | 0.6973 | 0.0 | 0.9181 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8190 | 0.0 | 0.0 | 0.8629 | 0.0 | 0.5565 | 0.4160 | 0.0 | nan | 0.0 | 0.3726 | 0.0 | 0.0 | 0.9052 | 0.8316 | 0.9539 | 0.0 | 0.0203 | 0.2652 | 0.0 | nan | 0.6646 | 0.8367 | 0.0931 | 0.6549 | 0.1878 | nan | 0.3129 | 0.4997 | 0.0 | 0.7531 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6511 | 0.0 | 0.0 | 0.6638 | 0.0 | 0.3531 | 0.3372 | 0.0 | nan | 0.0 | 0.2675 | 0.0 | 0.0 | 0.8122 | 0.7163 | 0.9154 | 0.0 | 0.0194 | 0.2096 | 0.0 | | 0.4498 | 53.0 | 5671 | 0.6236 | 0.2758 | 0.3394 | 0.8230 | nan | 0.7541 | 0.9499 | 0.1196 | 0.7979 | 0.2387 | nan | 0.5362 | 0.7165 | 0.0 | 0.9071 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7450 | 0.0 | 0.0 | 0.8996 | 0.0 | 0.5641 | 0.3635 | 0.0 | nan | 0.0 | 0.3507 | 0.0 | 0.0 | 0.8941 | 0.8490 | 0.9589 | 0.0 | 0.0099 | 0.2058 | 0.0 | nan | 0.6657 | 0.8440 | 0.0951 | 0.6654 | 0.1853 | nan | 0.3408 | 0.4485 | 0.0 | 0.7598 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6050 | 0.0 | 0.0 | 0.6495 | 0.0 | 0.3570 | 0.3232 | 0.0 | nan | 0.0 | 0.2632 | 0.0 | 0.0 | 0.8128 | 0.7191 | 0.9078 | 0.0 | 0.0096 | 0.1741 | 0.0 | | 0.4507 | 54.0 | 5778 | 0.6176 | 0.2770 | 0.3433 | 0.8211 | nan | 0.7416 | 0.9454 | 0.1223 | 0.8588 | 0.2360 | nan | 0.4892 | 0.7601 | 0.0 | 0.9185 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7776 | 0.0 | 0.0 | 0.8650 | 0.0 | 0.5613 | 0.4273 | 0.0 | nan | 0.0 | 0.4079 | 0.0 | 0.0 | 0.9153 | 0.7758 | 0.9553 | 0.0 | 0.0130 | 0.2160 | 0.0 | nan | 0.6602 | 0.8370 | 0.0957 | 0.6295 | 0.1915 | nan | 0.3314 | 0.4815 | 0.0 | 0.7392 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6269 | 0.0 | 0.0 | 0.6720 | 0.0 | 0.3648 | 0.3500 | 0.0 | nan | 0.0 | 0.2645 | 0.0 | 0.0 | 0.8121 | 0.7003 | 0.9127 | 0.0 | 0.0125 | 0.1808 | 0.0 | | 0.4261 | 55.0 | 5885 | 0.6341 | 0.2781 | 0.3445 | 0.8214 | nan | 0.7109 | 0.9575 | 0.1100 | 0.8762 | 0.2139 | nan | 0.4148 | 0.7341 | 0.0 | 0.9027 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8385 | 0.0 | 0.0 | 0.8455 | 0.0 | 0.6064 | 0.3939 | 0.0 | nan | 0.0 | 0.4239 | 0.0 | 0.0 | 0.9049 | 0.8480 | 0.9505 | 0.0 | 0.0098 | 0.2812 | 0.0 | nan | 0.6436 | 0.8358 | 0.0899 | 0.6665 | 0.1782 | nan | 0.3080 | 0.4988 | 0.0 | 0.7586 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6378 | 0.0 | 0.0 | 0.6712 | 0.0 | 0.3576 | 0.3103 | 0.0 | nan | 0.0 | 0.2761 | 0.0 | 0.0 | 0.8068 | 0.7148 | 0.9155 | 0.0 | 0.0097 | 0.2208 | 0.0 | | 0.4269 | 56.0 | 5992 | 0.6354 | 0.2800 | 0.3425 | 0.8229 | nan | 0.7174 | 0.9635 | 0.1316 | 0.8449 | 0.1980 | nan | 0.4801 | 0.7019 | 0.0 | 0.9220 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7983 | 0.0 | 0.0 | 0.8332 | 0.0 | 0.5805 | 0.3871 | 0.0 | nan | 0.0 | 0.3880 | 0.0 | 0.0 | 0.9070 | 0.8425 | 0.9638 | 0.0 | 0.0057 | 0.2954 | 0.0 | nan | 0.6489 | 0.8346 | 0.0938 | 0.6909 | 0.1622 | nan | 0.3414 | 0.5041 | 0.0 | 0.7278 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6631 | 0.0 | 0.0 | 0.6693 | 0.0 | 0.3671 | 0.3103 | 0.0 | nan | 0.0 | 0.2686 | 0.0 | 0.0 | 0.8133 | 0.7224 | 0.9163 | 0.0 | 0.0055 | 0.2195 | 
0.0 | | 0.4169 | 57.0 | 6099 | 0.6053 | 0.2789 | 0.3414 | 0.8280 | nan | 0.7766 | 0.9558 | 0.1311 | 0.8292 | 0.2873 | nan | 0.4710 | 0.7584 | 0.0 | 0.9194 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7046 | 0.0 | 0.0 | 0.8549 | 0.0 | 0.5641 | 0.3717 | 0.0 | nan | 0.0 | 0.3584 | 0.0 | 0.0 | 0.9088 | 0.8278 | 0.9606 | 0.0 | 0.0052 | 0.2411 | 0.0 | nan | 0.6842 | 0.8509 | 0.0887 | 0.6997 | 0.2152 | nan | 0.3381 | 0.4711 | 0.0 | 0.7428 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5962 | 0.0 | 0.0 | 0.6637 | 0.0 | 0.3486 | 0.3136 | 0.0 | nan | 0.0 | 0.2619 | 0.0 | 0.0 | 0.8097 | 0.7202 | 0.9169 | 0.0 | 0.0051 | 0.1982 | 0.0 | | 0.4203 | 58.0 | 6206 | 0.6204 | 0.2768 | 0.3469 | 0.8219 | nan | 0.7345 | 0.9382 | 0.1267 | 0.8723 | 0.2670 | nan | 0.5010 | 0.8043 | 0.0 | 0.9108 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7795 | 0.0 | 0.0 | 0.8756 | 0.0 | 0.5060 | 0.3917 | 0.0 | nan | 0.0 | 0.3940 | 0.0 | 0.0 | 0.9022 | 0.8630 | 0.9626 | 0.0 | 0.0147 | 0.2578 | 0.0 | nan | 0.6559 | 0.8447 | 0.0970 | 0.6231 | 0.2100 | nan | 0.3341 | 0.4423 | 0.0 | 0.7649 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6047 | 0.0 | 0.0 | 0.6631 | 0.0 | 0.3387 | 0.3427 | 0.0 | nan | 0.0 | 0.2643 | 0.0 | 0.0 | 0.8105 | 0.7240 | 0.9139 | 0.0 | 0.0143 | 0.2106 | 0.0 | | 0.4258 | 59.0 | 6313 | 0.6419 | 0.2779 | 0.3458 | 0.8155 | nan | 0.6942 | 0.9363 | 0.1292 | 0.8829 | 0.2064 | nan | 0.5479 | 0.7177 | 0.0 | 0.9146 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8297 | 0.0 | 0.0 | 0.8539 | 0.0 | 0.5605 | 0.4169 | 0.0 | nan | 0.0 | 0.3946 | 0.0 | 0.0 | 0.9033 | 0.8518 | 0.9652 | 0.0 | 0.0151 | 0.2459 | 0.0 | nan | 0.6383 | 0.8254 | 0.0972 | 0.6315 | 0.1650 | nan | 0.3471 | 0.5105 | 0.0 | 0.7528 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6633 | 0.0 | 0.0 | 0.6610 | 0.0 | 0.3331 | 0.3427 | 0.0 | nan | 0.0 | 0.2635 | 0.0 | 0.0 | 0.8102 | 0.7257 | 0.9120 | 0.0 | 0.0143 | 0.1989 | 0.0 | | 0.3939 | 60.0 | 6420 | 0.6190 | 0.2810 | 0.3480 | 0.8236 | nan | 0.7109 | 0.9550 | 0.1268 | 0.8720 | 0.2427 | nan | 0.5037 | 0.7789 | 0.0 | 0.9088 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7861 | 0.0 | 0.0 | 0.8841 | 0.0 | 0.5206 | 0.4294 | 0.0 | nan | 0.0 | 0.3945 | 0.0 | 0.0 | 0.8896 | 0.8466 | 0.9647 | 0.0 | 0.0142 | 0.3064 | 0.0 | nan | 0.6377 | 0.8446 | 0.0928 | 0.6636 | 0.1958 | nan | 0.3534 | 0.4888 | 0.0 | 0.7695 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6303 | 0.0 | 0.0 | 0.6599 | 0.0 | 0.3350 | 0.3416 | 0.0 | nan | 0.0 | 0.2695 | 0.0 | 0.0 | 0.8187 | 0.7265 | 0.9120 | 0.0 | 0.0139 | 0.2380 | 0.0 | | 0.3992 | 61.0 | 6527 | 0.6259 | 0.2772 | 0.3441 | 0.8178 | nan | 0.7583 | 0.9240 | 0.1284 | 0.8714 | 0.2340 | nan | 0.5038 | 0.7764 | 0.0 | 0.9091 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7920 | 0.0 | 0.0 | 0.8740 | 0.0 | 0.5316 | 0.3637 | 0.0 | nan | 0.0 | 0.3819 | 0.0 | 0.0 | 0.9060 | 0.8316 | 0.9569 | 0.0 | 0.0077 | 0.2618 | 0.0 | nan | 0.6735 | 0.8256 | 0.0924 | 0.5960 | 0.1875 | nan | 0.3319 | 0.4965 | 0.0 | 0.7582 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6429 | 0.0 | 0.0 | 0.6716 | 0.0 | 0.3335 | 0.3229 | 0.0 | nan | 0.0 | 0.2728 | 0.0 | 0.0 | 0.8130 | 0.7181 | 0.9167 | 0.0 | 0.0076 | 0.2104 | 0.0 | | 0.4129 | 62.0 | 6634 | 0.6260 | 0.2820 | 0.3511 | 0.8223 | nan | 0.7066 | 0.9535 | 0.1271 | 0.8465 | 0.2200 | nan | 0.5430 | 0.7777 | 0.0 | 0.9011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8684 | 0.0 | 0.0 | 0.8707 | 0.0 | 0.5766 | 0.3864 | 0.0 | nan | 0.0 | 0.3736 | 0.0 | 0.0 | 0.8824 | 0.8947 | 0.9619 | 0.0 | 0.0059 | 0.3374 | 0.0 | nan | 0.6317 | 0.8436 | 0.0913 | 0.6718 | 0.1800 | nan | 0.3561 | 0.5209 | 0.0 | 0.7695 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6569 | 0.0 | 0.0 | 0.6685 | 0.0 | 0.3421 | 0.3253 | 0.0 | nan | 0.0 | 0.2687 | 0.0 | 0.0 | 0.8108 | 0.7197 | 0.9138 | 0.0 | 0.0058 | 0.2489 | 0.0 | | 0.4115 | 
63.0 | 6741 | 0.6140 | 0.2792 | 0.3499 | 0.8224 | nan | 0.7478 | 0.9367 | 0.1313 | 0.8520 | 0.2720 | nan | 0.5338 | 0.8185 | 0.0 | 0.9055 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7987 | 0.0 | 0.0 | 0.8555 | 0.0 | 0.5679 | 0.3952 | 0.0 | nan | 0.0 | 0.4042 | 0.0 | 0.0 | 0.9170 | 0.8134 | 0.9675 | 0.0 | 0.0131 | 0.2668 | 0.0 | nan | 0.6611 | 0.8430 | 0.0900 | 0.6619 | 0.2118 | nan | 0.3411 | 0.4575 | 0.0 | 0.7664 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6257 | 0.0 | 0.0 | 0.6674 | 0.0 | 0.3466 | 0.3267 | 0.0 | nan | 0.0 | 0.2734 | 0.0 | 0.0 | 0.8067 | 0.7131 | 0.9151 | 0.0 | 0.0127 | 0.2142 | 0.0 | | 0.4007 | 64.0 | 6848 | 0.6226 | 0.2799 | 0.3478 | 0.8241 | nan | 0.7270 | 0.9499 | 0.1301 | 0.8634 | 0.2476 | nan | 0.5034 | 0.8017 | 0.0 | 0.9072 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8034 | 0.0 | 0.0 | 0.8629 | 0.0 | 0.5290 | 0.3805 | 0.0 | nan | 0.0 | 0.4671 | 0.0 | 0.0 | 0.9222 | 0.8110 | 0.9629 | 0.0 | 0.0131 | 0.2483 | 0.0 | nan | 0.6596 | 0.8455 | 0.0886 | 0.6685 | 0.1992 | nan | 0.3472 | 0.4826 | 0.0 | 0.7668 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6407 | 0.0 | 0.0 | 0.6656 | 0.0 | 0.3446 | 0.3243 | 0.0 | nan | 0.0 | 0.2835 | 0.0 | 0.0 | 0.8060 | 0.7012 | 0.9171 | 0.0 | 0.0128 | 0.2020 | 0.0 | | 0.3729 | 65.0 | 6955 | 0.6277 | 0.2805 | 0.3501 | 0.8175 | nan | 0.7535 | 0.9178 | 0.1297 | 0.8744 | 0.2713 | nan | 0.5176 | 0.7654 | 0.0 | 0.9066 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8650 | 0.0 | 0.0 | 0.8636 | 0.0 | 0.5904 | 0.3873 | 0.0 | nan | 0.0 | 0.4093 | 0.0 | 0.0 | 0.9034 | 0.8424 | 0.9667 | 0.0 | 0.0238 | 0.2161 | 0.0 | nan | 0.6692 | 0.8260 | 0.0898 | 0.5885 | 0.2116 | nan | 0.3469 | 0.5326 | 0.0 | 0.7594 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6756 | 0.0 | 0.0 | 0.6697 | 0.0 | 0.3490 | 0.3297 | 0.0 | nan | 0.0 | 0.2702 | 0.0 | 0.0 | 0.8161 | 0.7236 | 0.9122 | 0.0 | 0.0231 | 0.1842 | 0.0 | | 0.4057 | 66.0 | 7062 | 0.6330 | 0.2800 | 0.3502 | 0.8214 | nan | 0.7551 | 0.9304 | 0.1350 | 0.8568 | 0.2164 | nan | 0.5517 | 0.7512 | 0.0 | 0.9266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8694 | 0.0 | 0.0 | 0.8600 | 0.0 | 0.5469 | 0.4128 | 0.0 | nan | 0.0 | 0.4283 | 0.0 | 0.0 | 0.9052 | 0.8578 | 0.9652 | 0.0 | 0.0115 | 0.2263 | 0.0 | nan | 0.6654 | 0.8419 | 0.0868 | 0.6542 | 0.1744 | nan | 0.3281 | 0.5290 | 0.0 | 0.7317 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6670 | 0.0 | 0.0 | 0.6642 | 0.0 | 0.3528 | 0.3423 | 0.0 | nan | 0.0 | 0.2696 | 0.0 | 0.0 | 0.8157 | 0.7269 | 0.9145 | 0.0 | 0.0111 | 0.1858 | 0.0 | | 0.4369 | 67.0 | 7169 | 0.6357 | 0.2824 | 0.3525 | 0.8224 | nan | 0.7476 | 0.9330 | 0.1332 | 0.8828 | 0.2109 | nan | 0.4922 | 0.7664 | 0.0 | 0.8911 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8781 | 0.0 | 0.0 | 0.8555 | 0.0 | 0.5624 | 0.4589 | 0.0 | nan | 0.0 | 0.4195 | 0.0 | 0.0 | 0.9089 | 0.8686 | 0.9586 | 0.0 | 0.0182 | 0.2932 | 0.0 | nan | 0.6625 | 0.8404 | 0.0855 | 0.6255 | 0.1733 | nan | 0.3351 | 0.5265 | 0.0 | 0.7617 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6604 | 0.0 | 0.0 | 0.6802 | 0.0 | 0.3689 | 0.3586 | 0.0 | nan | 0.0 | 0.2648 | 0.0 | 0.0 | 0.8099 | 0.7189 | 0.9187 | 0.0 | 0.0171 | 0.2292 | 0.0 | | 0.3892 | 68.0 | 7276 | 0.6167 | 0.2833 | 0.3528 | 0.8258 | nan | 0.7860 | 0.9329 | 0.1332 | 0.8555 | 0.2064 | nan | 0.5435 | 0.7913 | 0.0 | 0.9146 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8599 | 0.0 | 0.0 | 0.8568 | 0.0 | 0.5659 | 0.3951 | 0.0 | nan | 0.0 | 0.4415 | 0.0 | 0.0 | 0.9093 | 0.8528 | 0.9608 | 0.0 | 0.0204 | 0.2654 | 0.0 | nan | 0.6838 | 0.8452 | 0.0888 | 0.6523 | 0.1692 | nan | 0.3467 | 0.5246 | 0.0 | 0.7558 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6640 | 0.0 | 0.0 | 0.6786 | 0.0 | 0.3606 | 0.3320 | 0.0 | nan | 0.0 | 0.2839 | 0.0 | 0.0 | 0.8121 | 0.7187 | 0.9186 | 0.0 | 0.0194 | 0.2098 | 0.0 | | 0.3962 | 69.0 | 7383 | 0.6167 
| 0.2808 | 0.3513 | 0.8262 | nan | 0.7777 | 0.9410 | 0.1316 | 0.8411 | 0.2312 | nan | 0.5449 | 0.8178 | 0.0 | 0.9141 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8272 | 0.0 | 0.0 | 0.8759 | 0.0 | 0.5513 | 0.3886 | 0.0 | nan | 0.0 | 0.4216 | 0.0 | 0.0 | 0.8981 | 0.8375 | 0.9663 | 0.0000 | 0.0144 | 0.2604 | 0.0 | nan | 0.6825 | 0.8491 | 0.0916 | 0.6636 | 0.1848 | nan | 0.3580 | 0.4910 | 0.0 | 0.7574 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6304 | 0.0 | 0.0 | 0.6691 | 0.0 | 0.3443 | 0.3262 | 0.0 | nan | 0.0 | 0.2676 | 0.0 | 0.0 | 0.8152 | 0.7171 | 0.9170 | 0.0000 | 0.0138 | 0.2060 | 0.0 | | 0.3941 | 70.0 | 7490 | 0.6295 | 0.2814 | 0.3482 | 0.8248 | nan | 0.7540 | 0.9446 | 0.1261 | 0.8654 | 0.1788 | nan | 0.5286 | 0.7921 | 0.0 | 0.9087 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8541 | 0.0 | 0.0 | 0.8799 | 0.0 | 0.5621 | 0.3841 | 0.0 | nan | 0.0 | 0.3876 | 0.0 | 0.0 | 0.9026 | 0.8420 | 0.9602 | 0.0000 | 0.0127 | 0.2591 | 0.0 | nan | 0.6649 | 0.8426 | 0.0909 | 0.6614 | 0.1472 | nan | 0.3625 | 0.5344 | 0.0 | 0.7587 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6551 | 0.0 | 0.0 | 0.6738 | 0.0 | 0.3419 | 0.3251 | 0.0 | nan | 0.0 | 0.2699 | 0.0 | 0.0 | 0.8157 | 0.7237 | 0.9189 | 0.0000 | 0.0123 | 0.2063 | 0.0 | | 0.3723 | 71.0 | 7597 | 0.6136 | 0.2851 | 0.3555 | 0.8245 | nan | 0.7657 | 0.9351 | 0.1292 | 0.8615 | 0.2372 | nan | 0.5278 | 0.7931 | 0.0 | 0.9092 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8857 | 0.0 | 0.0 | 0.8408 | 0.0 | 0.5757 | 0.4549 | 0.0 | nan | 0.0 | 0.4165 | 0.0 | 0.0 | 0.9103 | 0.8393 | 0.9636 | 0.0004 | 0.0229 | 0.3081 | 0.0 | nan | 0.6760 | 0.8398 | 0.0924 | 0.6518 | 0.1897 | nan | 0.3505 | 0.5348 | 0.0 | 0.7571 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6611 | 0.0 | 0.0 | 0.6755 | 0.0 | 0.3646 | 0.3619 | 0.0 | nan | 0.0 | 0.2733 | 0.0 | 0.0 | 0.8090 | 0.7150 | 0.9190 | 0.0004 | 0.0211 | 0.2305 | 0.0 | | 0.3479 | 72.0 | 7704 | 0.6061 | 0.2884 | 0.3557 | 0.8325 | nan | 0.7936 | 0.9508 | 0.1315 | 0.8549 | 0.2708 | nan | 0.5505 | 0.8127 | 0.0 | 0.9252 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8511 | 0.0 | 0.0 | 0.8386 | 0.0 | 0.4955 | 0.4688 | 0.0 | nan | 0.0 | 0.4257 | 0.0 | 0.0 | 0.9102 | 0.8361 | 0.9647 | 0.0 | 0.0295 | 0.2733 | 0.0 | nan | 0.6928 | 0.8570 | 0.0867 | 0.6919 | 0.2109 | nan | 0.3833 | 0.5276 | 0.0 | 0.7488 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6738 | 0.0 | 0.0 | 0.6799 | 0.0 | 0.3550 | 0.3680 | 0.0 | nan | 0.0 | 0.2631 | 0.0 | 0.0 | 0.8084 | 0.7220 | 0.9201 | 0.0 | 0.0269 | 0.2136 | 0.0 | | 0.3855 | 73.0 | 7811 | 0.6140 | 0.2844 | 0.3517 | 0.8280 | nan | 0.7850 | 0.9335 | 0.1308 | 0.8746 | 0.2526 | nan | 0.5565 | 0.8057 | 0.0 | 0.9215 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8211 | 0.0 | 0.0 | 0.8850 | 0.0 | 0.4633 | 0.4599 | 0.0 | nan | 0.0 | 0.3996 | 0.0 | 0.0 | 0.9129 | 0.8173 | 0.9627 | 0.0 | 0.0262 | 0.2472 | 0.0 | nan | 0.6876 | 0.8485 | 0.0883 | 0.6613 | 0.2008 | nan | 0.3733 | 0.5156 | 0.0 | 0.7579 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6635 | 0.0 | 0.0 | 0.6722 | 0.0 | 0.3407 | 0.3598 | 0.0 | nan | 0.0 | 0.2652 | 0.0 | 0.0 | 0.8112 | 0.7110 | 0.9183 | 0.0 | 0.0236 | 0.2031 | 0.0 | | 0.3729 | 74.0 | 7918 | 0.6198 | 0.2842 | 0.3507 | 0.8236 | nan | 0.7829 | 0.9307 | 0.1299 | 0.8662 | 0.2433 | nan | 0.5390 | 0.7845 | 0.0 | 0.9173 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8498 | 0.0 | 0.0 | 0.8569 | 0.0 | 0.5180 | 0.4716 | 0.0 | nan | 0.0 | 0.3838 | 0.0 | 0.0 | 0.9181 | 0.7794 | 0.9605 | 0.0 | 0.0255 | 0.2639 | 0.0 | nan | 0.6945 | 0.8309 | 0.0913 | 0.6524 | 0.1930 | nan | 0.3617 | 0.5286 | 0.0 | 0.7612 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6807 | 0.0 | 0.0 | 0.6696 | 0.0 | 0.3450 | 0.3662 | 0.0 | nan | 0.0 | 0.2663 | 0.0 | 0.0 | 0.8067 | 0.6917 | 0.9190 | 0.0 | 0.0232 | 0.2136 | 0.0 | | 0.3847 | 75.0 | 8025 | 0.6102 | 
0.2872 | 0.3542 | 0.8295 | nan | 0.7547 | 0.9536 | 0.1305 | 0.8593 | 0.2643 | nan | 0.5444 | 0.7866 | 0.0 | 0.9128 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8692 | 0.0 | 0.0 | 0.8641 | 0.0 | 0.5149 | 0.4767 | 0.0 | nan | 0.0 | 0.3996 | 0.0 | 0.0 | 0.9072 | 0.8167 | 0.9652 | 0.0 | 0.0273 | 0.2872 | 0.0 | nan | 0.6746 | 0.8521 | 0.0899 | 0.6848 | 0.2061 | nan | 0.3671 | 0.5234 | 0.0 | 0.7691 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6744 | 0.0 | 0.0 | 0.6762 | 0.0 | 0.3446 | 0.3628 | 0.0 | nan | 0.0 | 0.2719 | 0.0 | 0.0 | 0.8109 | 0.7111 | 0.9194 | 0.0 | 0.0241 | 0.2272 | 0.0 | | 0.3747 | 76.0 | 8132 | 0.6221 | 0.2855 | 0.3530 | 0.8264 | nan | 0.7744 | 0.9357 | 0.1332 | 0.8646 | 0.2509 | nan | 0.5133 | 0.7699 | 0.0 | 0.9257 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8628 | 0.0 | 0.0 | 0.8489 | 0.0 | 0.5518 | 0.4277 | 0.0 | nan | 0.0 | 0.4239 | 0.0 | 0.0 | 0.9115 | 0.8425 | 0.9679 | 0.0 | 0.0170 | 0.2731 | 0.0 | nan | 0.6871 | 0.8423 | 0.0887 | 0.6448 | 0.1990 | nan | 0.3666 | 0.5301 | 0.0 | 0.7518 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6831 | 0.0 | 0.0 | 0.6731 | 0.0 | 0.3554 | 0.3631 | 0.0 | nan | 0.0 | 0.2731 | 0.0 | 0.0 | 0.8106 | 0.7180 | 0.9189 | 0.0 | 0.0155 | 0.2154 | 0.0 | | 0.393 | 77.0 | 8239 | 0.6188 | 0.2845 | 0.3524 | 0.8246 | nan | 0.7827 | 0.9299 | 0.1315 | 0.8704 | 0.2537 | nan | 0.5172 | 0.7217 | 0.0 | 0.9238 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8737 | 0.0 | 0.0 | 0.8465 | 0.0 | 0.5902 | 0.4841 | 0.0 | nan | 0.0 | 0.3912 | 0.0 | 0.0 | 0.9111 | 0.8047 | 0.9652 | 0.0 | 0.0183 | 0.2607 | 0.0 | nan | 0.6816 | 0.8433 | 0.0911 | 0.6407 | 0.1997 | nan | 0.3598 | 0.5258 | 0.0 | 0.7555 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6891 | 0.0 | 0.0 | 0.6751 | 0.0 | 0.3515 | 0.3614 | 0.0 | nan | 0.0 | 0.2679 | 0.0 | 0.0 | 0.8116 | 0.7075 | 0.9187 | 0.0 | 0.0161 | 0.2081 | 0.0 | | 0.3503 | 78.0 | 8346 | 0.6216 | 0.2862 | 0.3506 | 0.8283 | nan | 0.7554 | 0.9490 | 0.1308 | 0.8747 | 0.2635 | nan | 0.5136 | 0.7380 | 0.0 | 0.9021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8697 | 0.0 | 0.0 | 0.8639 | 0.0 | 0.5498 | 0.4507 | 0.0 | nan | 0.0 | 0.3723 | 0.0 | 0.0 | 0.9130 | 0.8144 | 0.9669 | 0.0000 | 0.0202 | 0.2718 | 0.0 | nan | 0.6704 | 0.8499 | 0.0876 | 0.6808 | 0.2080 | nan | 0.3576 | 0.5340 | 0.0 | 0.7722 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6799 | 0.0 | 0.0 | 0.6734 | 0.0 | 0.3529 | 0.3562 | 0.0 | nan | 0.0 | 0.2639 | 0.0 | 0.0 | 0.8091 | 0.7083 | 0.9194 | 0.0000 | 0.0185 | 0.2176 | 0.0 | | 0.3389 | 79.0 | 8453 | 0.6086 | 0.2873 | 0.3536 | 0.8317 | nan | 0.7849 | 0.9467 | 0.1317 | 0.8604 | 0.2798 | nan | 0.5327 | 0.7801 | 0.0 | 0.9223 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8640 | 0.0 | 0.0 | 0.8674 | 0.0 | 0.5225 | 0.4534 | 0.0 | nan | 0.0 | 0.4022 | 0.0 | 0.0 | 0.9105 | 0.8236 | 0.9628 | 0.0 | 0.0203 | 0.2504 | 0.0 | nan | 0.6862 | 0.8560 | 0.0930 | 0.6832 | 0.2170 | nan | 0.3653 | 0.5196 | 0.0 | 0.7605 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6713 | 0.0 | 0.0 | 0.6743 | 0.0 | 0.3523 | 0.3645 | 0.0 | nan | 0.0 | 0.2715 | 0.0 | 0.0 | 0.8142 | 0.7199 | 0.9201 | 0.0 | 0.0189 | 0.2045 | 0.0 | | 0.3585 | 80.0 | 8560 | 0.6063 | 0.2864 | 0.3520 | 0.8318 | nan | 0.7764 | 0.9462 | 0.1291 | 0.8645 | 0.2672 | nan | 0.5498 | 0.7815 | 0.0 | 0.9163 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8444 | 0.0 | 0.0 | 0.8734 | 0.0 | 0.5129 | 0.4308 | 0.0 | nan | 0.0 | 0.3917 | 0.0 | 0.0 | 0.9168 | 0.8376 | 0.9623 | 0.0 | 0.0157 | 0.2467 | 0.0 | nan | 0.6799 | 0.8577 | 0.0902 | 0.6946 | 0.2078 | nan | 0.3653 | 0.5092 | 0.0 | 0.7677 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6630 | 0.0 | 0.0 | 0.6704 | 0.0 | 0.3574 | 0.3582 | 0.0 | nan | 0.0 | 0.2689 | 0.0 | 0.0 | 0.8115 | 0.7248 | 0.9214 | 0.0 | 0.0149 | 0.2031 | 0.0 | | 0.3316 | 81.0 | 8667 | 0.6237 | 0.2846 | 0.3499 
| 0.8292 | nan | 0.7771 | 0.9554 | 0.1310 | 0.8561 | 0.2328 | nan | 0.5116 | 0.8077 | 0.0 | 0.9220 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8567 | 0.0 | 0.0 | 0.8447 | 0.0 | 0.5184 | 0.4265 | 0.0 | nan | 0.0 | 0.3936 | 0.0 | 0.0 | 0.9163 | 0.7947 | 0.9616 | 0.0 | 0.0194 | 0.2725 | 0.0 | nan | 0.6830 | 0.8478 | 0.0919 | 0.6921 | 0.1856 | nan | 0.3587 | 0.5077 | 0.0 | 0.7572 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6634 | 0.0 | 0.0 | 0.6682 | 0.0 | 0.3565 | 0.3552 | 0.0 | nan | 0.0 | 0.2675 | 0.0 | 0.0 | 0.8096 | 0.7085 | 0.9218 | 0.0 | 0.0182 | 0.2152 | 0.0 | | 0.37 | 82.0 | 8774 | 0.6196 | 0.2852 | 0.3505 | 0.8287 | nan | 0.7762 | 0.9494 | 0.1284 | 0.8690 | 0.2293 | nan | 0.4902 | 0.7878 | 0.0 | 0.9015 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8731 | 0.0 | 0.0 | 0.8775 | 0.0 | 0.5446 | 0.4245 | 0.0 | nan | 0.0 | 0.3999 | 0.0 | 0.0 | 0.9022 | 0.8158 | 0.9637 | 0.0000 | 0.0177 | 0.2668 | 0.0 | nan | 0.6814 | 0.8447 | 0.0936 | 0.6709 | 0.1840 | nan | 0.3540 | 0.5202 | 0.0 | 0.7798 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6653 | 0.0 | 0.0 | 0.6719 | 0.0 | 0.3483 | 0.3534 | 0.0 | nan | 0.0 | 0.2708 | 0.0 | 0.0 | 0.8169 | 0.7155 | 0.9201 | 0.0000 | 0.0167 | 0.2179 | 0.0 | | 0.3627 | 83.0 | 8881 | 0.6121 | 0.2864 | 0.3502 | 0.8309 | nan | 0.7826 | 0.9457 | 0.1303 | 0.8636 | 0.2628 | nan | 0.5182 | 0.7617 | 0.0 | 0.9169 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8484 | 0.0 | 0.0 | 0.8981 | 0.0 | 0.5051 | 0.4155 | 0.0 | nan | 0.0 | 0.3909 | 0.0 | 0.0 | 0.9009 | 0.8326 | 0.9629 | 0.0 | 0.0204 | 0.2512 | 0.0 | nan | 0.6893 | 0.8503 | 0.0938 | 0.6786 | 0.2084 | nan | 0.3700 | 0.5230 | 0.0 | 0.7683 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6699 | 0.0 | 0.0 | 0.6627 | 0.0 | 0.3504 | 0.3488 | 0.0 | nan | 0.0 | 0.2671 | 0.0 | 0.0 | 0.8173 | 0.7240 | 0.9195 | 0.0 | 0.0191 | 0.2055 | 0.0 | | 0.3548 | 84.0 | 8988 | 0.6255 | 0.2871 | 0.3508 | 0.8293 | nan | 0.7457 | 0.9559 | 0.1291 | 0.8691 | 0.2532 | nan | 0.5187 | 0.7411 | 0.0 | 0.9188 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8670 | 0.0 | 0.0 | 0.8565 | 0.0 | 0.5686 | 0.4341 | 0.0 | nan | 0.0 | 0.3870 | 0.0 | 0.0 | 0.9185 | 0.8038 | 0.9562 | 0.0 | 0.0193 | 0.2832 | 0.0 | nan | 0.6752 | 0.8472 | 0.0924 | 0.6852 | 0.2021 | nan | 0.3671 | 0.5311 | 0.0 | 0.7628 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6838 | 0.0 | 0.0 | 0.6761 | 0.0 | 0.3590 | 0.3523 | 0.0 | nan | 0.0 | 0.2738 | 0.0 | 0.0 | 0.8100 | 0.7060 | 0.9238 | 0.0 | 0.0179 | 0.2213 | 0.0 | | 0.3724 | 85.0 | 9095 | 0.6204 | 0.2872 | 0.3517 | 0.8294 | nan | 0.7477 | 0.9552 | 0.1310 | 0.8738 | 0.2764 | nan | 0.5233 | 0.7399 | 0.0 | 0.9121 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8842 | 0.0 | 0.0 | 0.8752 | 0.0 | 0.5314 | 0.4354 | 0.0 | nan | 0.0 | 0.3917 | 0.0 | 0.0 | 0.9046 | 0.8095 | 0.9608 | 0.0 | 0.0245 | 0.2773 | 0.0 | nan | 0.6719 | 0.8498 | 0.0929 | 0.6746 | 0.2173 | nan | 0.3705 | 0.5351 | 0.0 | 0.7717 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6752 | 0.0 | 0.0 | 0.6721 | 0.0 | 0.3498 | 0.3548 | 0.0 | nan | 0.0 | 0.2703 | 0.0 | 0.0 | 0.8144 | 0.7084 | 0.9218 | 0.0 | 0.0218 | 0.2181 | 0.0 | | 0.3412 | 86.0 | 9202 | 0.6206 | 0.2865 | 0.3530 | 0.8286 | nan | 0.7376 | 0.9578 | 0.1302 | 0.8737 | 0.2390 | nan | 0.5286 | 0.7728 | 0.0 | 0.9213 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8872 | 0.0 | 0.0 | 0.8562 | 0.0 | 0.5362 | 0.4489 | 0.0 | nan | 0.0 | 0.4215 | 0.0 | 0.0 | 0.9120 | 0.8058 | 0.9629 | 0.0003 | 0.0238 | 0.2789 | 0.0 | nan | 0.6623 | 0.8475 | 0.0928 | 0.6955 | 0.1920 | nan | 0.3685 | 0.5297 | 0.0 | 0.7654 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6686 | 0.0 | 0.0 | 0.6750 | 0.0 | 0.3522 | 0.3603 | 0.0 | nan | 0.0 | 0.2744 | 0.0 | 0.0 | 0.8128 | 0.7070 | 0.9232 | 0.0003 | 0.0215 | 0.2182 | 0.0 | | 0.3816 | 87.0 | 9309 | 0.6182 | 0.2885 | 0.3576 | 0.8305 | 
nan | 0.7658 | 0.9495 | 0.1319 | 0.8699 | 0.2584 | nan | 0.5379 | 0.7819 | 0.0 | 0.9261 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8793 | 0.0 | 0.0 | 0.8375 | 0.0 | 0.5918 | 0.4508 | 0.0 | nan | 0.0 | 0.4501 | 0.0 | 0.0 | 0.9060 | 0.8429 | 0.9638 | 0.0 | 0.0206 | 0.2781 | 0.0 | nan | 0.6797 | 0.8510 | 0.0925 | 0.6860 | 0.2044 | nan | 0.3727 | 0.5308 | 0.0 | 0.7534 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6776 | 0.0 | 0.0 | 0.6817 | 0.0 | 0.3627 | 0.3565 | 0.0 | nan | 0.0 | 0.2872 | 0.0 | 0.0 | 0.8167 | 0.7217 | 0.9234 | 0.0 | 0.0186 | 0.2152 | 0.0 | | 0.3451 | 88.0 | 9416 | 0.6192 | 0.2869 | 0.3519 | 0.8302 | nan | 0.7577 | 0.9534 | 0.1308 | 0.8708 | 0.2657 | nan | 0.5258 | 0.7680 | 0.0 | 0.9224 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8636 | 0.0 | 0.0 | 0.8602 | 0.0 | 0.5584 | 0.4366 | 0.0 | nan | 0.0 | 0.3964 | 0.0 | 0.0 | 0.9124 | 0.8146 | 0.9591 | 0.0 | 0.0151 | 0.2492 | 0.0 | nan | 0.6754 | 0.8507 | 0.0921 | 0.6914 | 0.2094 | nan | 0.3695 | 0.5297 | 0.0 | 0.7594 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6818 | 0.0 | 0.0 | 0.6744 | 0.0 | 0.3527 | 0.3531 | 0.0 | nan | 0.0 | 0.2722 | 0.0 | 0.0 | 0.8145 | 0.7132 | 0.9232 | 0.0 | 0.0141 | 0.2027 | 0.0 | | 0.3549 | 89.0 | 9523 | 0.6111 | 0.2875 | 0.3527 | 0.8318 | nan | 0.7690 | 0.9509 | 0.1307 | 0.8692 | 0.2542 | nan | 0.5263 | 0.7781 | 0.0 | 0.9164 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8690 | 0.0 | 0.0 | 0.8728 | 0.0 | 0.5455 | 0.4461 | 0.0 | nan | 0.0 | 0.3868 | 0.0 | 0.0 | 0.9071 | 0.8496 | 0.9605 | 0.0 | 0.0186 | 0.2354 | 0.0 | nan | 0.6819 | 0.8525 | 0.0922 | 0.6967 | 0.2025 | nan | 0.3693 | 0.5314 | 0.0 | 0.7691 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6787 | 0.0 | 0.0 | 0.6703 | 0.0 | 0.3520 | 0.3585 | 0.0 | nan | 0.0 | 0.2679 | 0.0 | 0.0 | 0.8177 | 0.7221 | 0.9224 | 0.0 | 0.0174 | 0.1972 | 0.0 | | 0.3675 | 90.0 | 9630 | 0.6085 | 0.2871 | 0.3525 | 0.8312 | nan | 0.7783 | 0.9479 | 0.1331 | 0.8670 | 0.2621 | nan | 0.5314 | 0.7927 | 0.0 | 0.9177 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8640 | 0.0 | 0.0 | 0.8641 | 0.0 | 0.5337 | 0.4436 | 0.0 | nan | 0.0 | 0.3959 | 0.0 | 0.0 | 0.9199 | 0.8105 | 0.9577 | 0.0 | 0.0193 | 0.2418 | 0.0 | nan | 0.6875 | 0.8524 | 0.0913 | 0.6871 | 0.2078 | nan | 0.3731 | 0.5255 | 0.0 | 0.7704 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6773 | 0.0 | 0.0 | 0.6731 | 0.0 | 0.3493 | 0.3616 | 0.0 | nan | 0.0 | 0.2691 | 0.0 | 0.0 | 0.8117 | 0.7092 | 0.9232 | 0.0 | 0.0179 | 0.2013 | 0.0 | | 0.3346 | 91.0 | 9737 | 0.6175 | 0.2861 | 0.3523 | 0.8294 | nan | 0.7571 | 0.9528 | 0.1329 | 0.8727 | 0.2238 | nan | 0.5224 | 0.7853 | 0.0 | 0.9101 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8653 | 0.0 | 0.0 | 0.8636 | 0.0 | 0.5742 | 0.4281 | 0.0 | nan | 0.0 | 0.4133 | 0.0 | 0.0 | 0.9096 | 0.8216 | 0.9655 | 0.0001 | 0.0190 | 0.2571 | 0.0 | nan | 0.6775 | 0.8469 | 0.0898 | 0.6763 | 0.1828 | nan | 0.3720 | 0.5250 | 0.0 | 0.7756 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6748 | 0.0 | 0.0 | 0.6781 | 0.0 | 0.3545 | 0.3506 | 0.0 | nan | 0.0 | 0.2722 | 0.0 | 0.0 | 0.8162 | 0.7140 | 0.9213 | 0.0001 | 0.0176 | 0.2114 | 0.0 | | 0.3679 | 92.0 | 9844 | 0.6140 | 0.2870 | 0.3545 | 0.8308 | nan | 0.7757 | 0.9425 | 0.1333 | 0.8753 | 0.2634 | nan | 0.5339 | 0.7934 | 0.0 | 0.9171 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8685 | 0.0 | 0.0 | 0.8648 | 0.0 | 0.5513 | 0.4414 | 0.0 | nan | 0.0 | 0.4012 | 0.0 | 0.0 | 0.9139 | 0.8384 | 0.9640 | 0.0001 | 0.0205 | 0.2449 | 0.0 | nan | 0.6849 | 0.8538 | 0.0900 | 0.6714 | 0.2093 | nan | 0.3755 | 0.5232 | 0.0 | 0.7691 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6736 | 0.0 | 0.0 | 0.6739 | 0.0 | 0.3498 | 0.3594 | 0.0 | nan | 0.0 | 0.2714 | 0.0 | 0.0 | 0.8155 | 0.7201 | 0.9216 | 0.0001 | 0.0190 | 0.2031 | 0.0 | | 0.3564 | 93.0 | 9951 | 0.6306 | 0.2873 | 0.3540 | 0.8295 | nan | 
0.7487 | 0.9498 | 0.1348 | 0.8731 | 0.2800 | nan | 0.5271 | 0.7866 | 0.0 | 0.9001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8696 | 0.0 | 0.0 | 0.8795 | 0.0 | 0.5637 | 0.4355 | 0.0 | nan | 0.0 | 0.3995 | 0.0 | 0.0 | 0.9070 | 0.8256 | 0.9621 | 0.0004 | 0.0191 | 0.2643 | 0.0 | nan | 0.6768 | 0.8482 | 0.0894 | 0.6763 | 0.2202 | nan | 0.3755 | 0.5260 | 0.0 | 0.7800 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6703 | 0.0 | 0.0 | 0.6708 | 0.0 | 0.3502 | 0.3531 | 0.0 | nan | 0.0 | 0.2703 | 0.0 | 0.0 | 0.8171 | 0.7166 | 0.9212 | 0.0004 | 0.0177 | 0.2146 | 0.0 | | 0.3657 | 93.4579 | 10000 | 0.6095 | 0.2881 | 0.3546 | 0.8313 | nan | 0.7705 | 0.9498 | 0.1340 | 0.8681 | 0.2555 | nan | 0.5391 | 0.7858 | 0.0 | 0.9234 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8710 | 0.0 | 0.0 | 0.8511 | 0.0 | 0.5804 | 0.4355 | 0.0 | nan | 0.0 | 0.3931 | 0.0 | 0.0 | 0.9126 | 0.8313 | 0.9633 | 0.0 | 0.0209 | 0.2608 | 0.0 | nan | 0.6844 | 0.8516 | 0.0905 | 0.6851 | 0.2043 | nan | 0.3766 | 0.5327 | 0.0 | 0.7630 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6814 | 0.0 | 0.0 | 0.6784 | 0.0 | 0.3576 | 0.3553 | 0.0 | nan | 0.0 | 0.2718 | 0.0 | 0.0 | 0.8160 | 0.7180 | 0.9227 | 0.0 | 0.0191 | 0.2111 | 0.0 | ### Framework versions - Transformers 4.42.0.dev0 - Pytorch 2.3.1+cu121 - Datasets 2.19.2 - Tokenizers 0.19.1
[ "unlabeled", "flat-road", "flat-sidewalk", "flat-crosswalk", "flat-cyclinglane", "flat-parkingdriveway", "flat-railtrack", "flat-curb", "human-person", "human-rider", "vehicle-car", "vehicle-truck", "vehicle-bus", "vehicle-tramtrain", "vehicle-motorcycle", "vehicle-bicycle", "vehicle-caravan", "vehicle-cartrailer", "construction-building", "construction-door", "construction-wall", "construction-fenceguardrail", "construction-bridge", "construction-tunnel", "construction-stairs", "object-pole", "object-trafficsign", "object-trafficlight", "nature-vegetation", "nature-terrain", "sky", "void-ground", "void-dynamic", "void-static", "void-unclear" ]
yijisuk/segformer-b0-miic-tl
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-miic-tl This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the yijisuk/ic-chip-sample dataset. It achieves the following results on the evaluation set: - Loss: 0.3415 - Mean Iou: 0.4569 - Mean Accuracy: 0.9138 - Overall Accuracy: 0.9138 - Accuracy Unlabeled: nan - Accuracy Circuit: 0.9138 - Iou Unlabeled: 0.0 - Iou Circuit: 0.9138 - Dice Coefficient: 0.8323 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Circuit | Iou Unlabeled | Iou Circuit | Dice Coefficient | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:-------------:|:-----------:|:----------------:| | 0.3496 | 3.12 | 250 | 0.3203 | 0.4832 | 0.9665 | 0.9665 | nan | 0.9665 | 0.0 | 0.9665 | 0.8163 | | 0.2808 | 6.25 | 500 | 0.3289 | 0.4814 | 0.9629 | 0.9629 | nan | 0.9629 | 0.0 | 0.9629 | 0.8271 | | 0.2582 | 9.38 | 750 | 0.3404 | 0.4670 | 0.9339 | 0.9339 | nan | 0.9339 | 0.0 | 0.9339 | 0.8327 | | 0.2791 | 12.5 | 1000 | 0.3033 | 0.4591 | 0.9181 | 0.9181 | nan | 0.9181 | 0.0 | 0.9181 | 0.8300 | | 0.2668 | 15.62 | 1250 | 0.3117 | 0.4559 | 0.9118 | 0.9118 | nan | 0.9118 | 0.0 | 0.9118 | 0.8285 | | 0.2531 | 18.75 | 1500 | 0.2652 | 0.4686 | 0.9373 | 0.9373 | nan | 0.9373 | 0.0 | 0.9373 | 0.8432 | | 0.2326 | 21.88 | 1750 | 0.3256 | 0.4604 | 0.9208 | 0.9208 | nan | 0.9208 | 0.0 | 0.9208 | 0.8315 | | 0.2361 | 25.0 | 2000 | 0.3129 | 0.4656 | 0.9313 | 0.9313 | nan | 0.9313 | 0.0 | 0.9313 | 0.8400 | | 0.2167 | 28.12 | 2250 | 0.3135 | 0.4558 | 0.9116 | 0.9116 | nan | 0.9116 | 0.0 | 0.9116 | 0.8290 | | 0.2133 | 31.25 | 2500 | 0.3132 | 0.4560 | 0.9120 | 0.9120 | nan | 0.9120 | 0.0 | 0.9120 | 0.8219 | | 0.1769 | 34.38 | 2750 | 0.3200 | 0.4441 | 0.8882 | 0.8882 | nan | 0.8882 | 0.0 | 0.8882 | 0.8176 | | 0.1899 | 37.5 | 3000 | 0.3342 | 0.4612 | 0.9224 | 0.9224 | nan | 0.9224 | 0.0 | 0.9224 | 0.8363 | | 0.1765 | 40.62 | 3250 | 0.3445 | 0.4625 | 0.9249 | 0.9249 | nan | 0.9249 | 0.0 | 0.9249 | 0.8369 | | 0.1739 | 43.75 | 3500 | 0.3235 | 0.4608 | 0.9216 | 0.9216 | nan | 0.9216 | 0.0 | 0.9216 | 0.8373 | | 0.1639 | 46.88 | 3750 | 0.3527 | 0.4591 | 0.9181 | 0.9181 | nan | 0.9181 | 0.0 | 0.9181 | 0.8342 | | 0.1734 | 50.0 | 4000 | 0.3415 | 0.4569 | 0.9138 | 0.9138 | nan | 0.9138 | 0.0 | 0.9138 | 0.8323 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu115 - Datasets 2.15.0 - Tokenizers 0.15.0
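The card omits a usage snippet, so here is a minimal inference sketch. It assumes the checkpoint loads like any `SegformerForSemanticSegmentation` model and that the repo ships a preprocessor config (otherwise fall back to a default `SegformerImageProcessor()`); the image path is a placeholder, and class id 1 is `circuit` per the label list below.

```python
from PIL import Image
import torch
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

processor = SegformerImageProcessor.from_pretrained("yijisuk/segformer-b0-miic-tl")
model = SegformerForSemanticSegmentation.from_pretrained("yijisuk/segformer-b0-miic-tl")

image = Image.open("ic_chip.png").convert("RGB")  # placeholder path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Resize the logits back to the input resolution and take the per-pixel argmax.
seg_map = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]  # (height, width)
)[0]
circuit_mask = seg_map == 1  # boolean (H, W) mask of circuit pixels
```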
[ "unlabeled", "circuit" ]
yijisuk/segformer-b1-miic-tl
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b1-miic-tl This model is a fine-tuned version of [nvidia/mit-b1](https://huggingface.co/nvidia/mit-b1) on the yijisuk/ic-chip-sample dataset. It achieves the following results on the evaluation set: - Loss: 0.2212 - Mean Iou: 0.4723 - Mean Accuracy: 0.9446 - Overall Accuracy: 0.9446 - Accuracy Unlabeled: nan - Accuracy Circuit: 0.9446 - Iou Unlabeled: 0.0 - Iou Circuit: 0.9446 - Dice Coefficient: 0.8541 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Circuit | Iou Unlabeled | Iou Circuit | Dice Coefficient | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:-------------:|:-----------:|:----------------:| | 0.3419 | 3.12 | 250 | 0.2745 | 0.4850 | 0.9701 | 0.9701 | nan | 0.9701 | 0.0 | 0.9701 | 0.8149 | | 0.2785 | 6.25 | 500 | 0.2789 | 0.4828 | 0.9657 | 0.9657 | nan | 0.9657 | 0.0 | 0.9657 | 0.8285 | | 0.2549 | 9.38 | 750 | 0.2888 | 0.4721 | 0.9443 | 0.9443 | nan | 0.9443 | 0.0 | 0.9443 | 0.8372 | | 0.2728 | 12.5 | 1000 | 0.2426 | 0.4699 | 0.9397 | 0.9397 | nan | 0.9397 | 0.0 | 0.9397 | 0.8424 | | 0.2625 | 15.62 | 1250 | 0.1990 | 0.4632 | 0.9264 | 0.9264 | nan | 0.9264 | 0.0 | 0.9264 | 0.8520 | | 0.2449 | 18.75 | 1500 | 0.2121 | 0.4706 | 0.9412 | 0.9412 | nan | 0.9412 | 0.0 | 0.9412 | 0.8508 | | 0.2173 | 21.88 | 1750 | 0.2768 | 0.4780 | 0.9559 | 0.9559 | nan | 0.9559 | 0.0 | 0.9559 | 0.8485 | | 0.2158 | 25.0 | 2000 | 0.2772 | 0.4643 | 0.9287 | 0.9287 | nan | 0.9287 | 0.0 | 0.9287 | 0.8383 | | 0.1843 | 28.12 | 2250 | 0.1818 | 0.4671 | 0.9343 | 0.9343 | nan | 0.9343 | 0.0 | 0.9343 | 0.8685 | | 0.1608 | 31.25 | 2500 | 0.1794 | 0.4591 | 0.9182 | 0.9182 | nan | 0.9182 | 0.0 | 0.9182 | 0.8618 | | 0.1504 | 34.38 | 2750 | 0.1805 | 0.4586 | 0.9172 | 0.9172 | nan | 0.9172 | 0.0 | 0.9172 | 0.8647 | | 0.1495 | 37.5 | 3000 | 0.2090 | 0.4773 | 0.9545 | 0.9545 | nan | 0.9545 | 0.0 | 0.9545 | 0.8595 | | 0.142 | 40.62 | 3250 | 0.2048 | 0.4750 | 0.9500 | 0.9500 | nan | 0.9500 | 0.0 | 0.9500 | 0.8666 | | 0.1401 | 43.75 | 3500 | 0.2131 | 0.4756 | 0.9512 | 0.9512 | nan | 0.9512 | 0.0 | 0.9512 | 0.8580 | | 0.1339 | 46.88 | 3750 | 0.2469 | 0.4773 | 0.9546 | 0.9546 | nan | 0.9546 | 0.0 | 0.9546 | 0.8481 | | 0.1303 | 50.0 | 4000 | 0.2212 | 0.4723 | 0.9446 | 0.9446 | nan | 0.9446 | 0.0 | 0.9446 | 0.8541 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu115 - Datasets 2.15.0 - Tokenizers 0.15.0
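The Dice coefficient above is reported alongside mean IoU, but the exact computation used by the training script is not published. Note also that `Iou Unlabeled` is pinned at 0.0 while `Accuracy Unlabeled` is nan, which suggests the unlabeled class never occurs in the evaluation masks, so the two-class mean IoU (~0.47) is roughly half the circuit IoU. A plausible binary Dice, assuming integer class-id masks with circuit = 1:

```python
import torch

def dice_coefficient(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-7) -> float:
    """Binary Dice between (H, W) class-id masks, treating class 1 (circuit) as foreground."""
    pred_fg = pred == 1
    target_fg = target == 1
    intersection = (pred_fg & target_fg).sum().item()
    return (2.0 * intersection + eps) / (pred_fg.sum().item() + target_fg.sum().item() + eps)
```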
[ "unlabeled", "circuit" ]
eleninaneversmiles/segformer-b0-finetuned-segments-sidewalk-2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-finetuned-segments-sidewalk-2 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the eleninaneversmiles/wheels dataset. It achieves the following results on the evaluation set: - Loss: 0.1287 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 150 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:--------:|:----:|:---------------:| | 2.9957 | 2.8571 | 20 | 3.4269 | | 2.6593 | 5.7143 | 40 | 2.3621 | | 1.9746 | 8.5714 | 60 | 1.2378 | | 1.5998 | 11.4286 | 80 | 1.2329 | | 1.3299 | 14.2857 | 100 | 0.8019 | | 1.3781 | 17.1429 | 120 | 0.8478 | | 2.1912 | 20.0 | 140 | 0.6386 | | 1.0362 | 22.8571 | 160 | 0.6467 | | 1.3817 | 25.7143 | 180 | 0.4496 | | 0.8108 | 28.5714 | 200 | 0.4032 | | 0.8187 | 31.4286 | 220 | 0.4650 | | 0.6671 | 34.2857 | 240 | 0.3251 | | 0.6062 | 37.1429 | 260 | 0.4035 | | 1.4152 | 40.0 | 280 | 0.3076 | | 1.3078 | 42.8571 | 300 | 0.2517 | | 0.4267 | 45.7143 | 320 | 0.2405 | | 0.5829 | 48.5714 | 340 | 0.2142 | | 0.8742 | 51.4286 | 360 | 0.2055 | | 0.3055 | 54.2857 | 380 | 0.2257 | | 0.5966 | 57.1429 | 400 | 0.1559 | | 0.5006 | 60.0 | 420 | 0.1927 | | 0.4433 | 62.8571 | 440 | 0.1525 | | 0.2377 | 65.7143 | 460 | 0.1597 | | 0.2612 | 68.5714 | 480 | 0.1703 | | 0.477 | 71.4286 | 500 | 0.1663 | | 0.2006 | 74.2857 | 520 | 0.1427 | | 0.2641 | 77.1429 | 540 | 0.1370 | | 0.5154 | 80.0 | 560 | 0.1386 | | 0.447 | 82.8571 | 580 | 0.1274 | | 0.195 | 85.7143 | 600 | 0.1236 | | 0.1643 | 88.5714 | 620 | 0.1420 | | 0.4199 | 91.4286 | 640 | 0.1226 | | 0.1644 | 94.2857 | 660 | 0.1419 | | 0.312 | 97.1429 | 680 | 0.1365 | | 0.3905 | 100.0 | 700 | 0.1181 | | 0.4035 | 102.8571 | 720 | 0.1305 | | 0.1411 | 105.7143 | 740 | 0.1262 | | 0.3018 | 108.5714 | 760 | 0.1322 | | 0.1332 | 111.4286 | 780 | 0.1317 | | 0.303 | 114.2857 | 800 | 0.1205 | | 0.2399 | 117.1429 | 820 | 0.1358 | | 0.2488 | 120.0 | 840 | 0.1226 | | 0.304 | 122.8571 | 860 | 0.1275 | | 0.2278 | 125.7143 | 880 | 0.1280 | | 0.2718 | 128.5714 | 900 | 0.1294 | | 0.5304 | 131.4286 | 920 | 0.1320 | | 0.1143 | 134.2857 | 940 | 0.1279 | | 0.1075 | 137.1429 | 960 | 0.1258 | | 0.2103 | 140.0 | 980 | 0.1349 | | 0.1483 | 142.8571 | 1000 | 0.1230 | | 0.287 | 145.7143 | 1020 | 0.1253 | | 0.3606 | 148.5714 | 1040 | 0.1287 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.3.1+cpu - Datasets 2.19.2 - Tokenizers 0.19.1
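The hyperparameters above map directly onto `TrainingArguments`, and the 20-step evaluation cadence in the table fixes `eval_steps`. A hypothetical sketch of the configuration only; the training script itself is not published, and the dataset preprocessing is omitted:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-sidewalk-2",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",   # optimizer betas/epsilon as listed above
    num_train_epochs=150,
    eval_strategy="steps",
    eval_steps=20,                # matches the evaluation cadence in the table
)
```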
[ "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1", "1" ]
Salmamoori/segformer-b0-finetuned-segments-sidewalk-oct-22
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-finetuned-segments-sidewalk-oct-22 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset. It achieves the following results on the evaluation set: - Loss: 0.7818 - Mean Iou: 0.1872 - Mean Accuracy: 0.2343 - Overall Accuracy: 0.7917 - Accuracy Unlabeled: nan - Accuracy Flat-road: 0.8716 - Accuracy Flat-sidewalk: 0.9457 - Accuracy Flat-crosswalk: 0.0 - Accuracy Flat-cyclinglane: 0.6578 - Accuracy Flat-parkingdriveway: 0.2561 - Accuracy Flat-railtrack: nan - Accuracy Flat-curb: 0.2287 - Accuracy Human-person: 0.0 - Accuracy Human-rider: 0.0 - Accuracy Vehicle-car: 0.9059 - Accuracy Vehicle-truck: 0.0 - Accuracy Vehicle-bus: 0.0 - Accuracy Vehicle-tramtrain: 0.0 - Accuracy Vehicle-motorcycle: 0.0 - Accuracy Vehicle-bicycle: 0.0 - Accuracy Vehicle-caravan: 0.0 - Accuracy Vehicle-cartrailer: 0.0 - Accuracy Construction-building: 0.8667 - Accuracy Construction-door: 0.0 - Accuracy Construction-wall: 0.0198 - Accuracy Construction-fenceguardrail: 0.0 - Accuracy Construction-bridge: 0.0 - Accuracy Construction-tunnel: nan - Accuracy Construction-stairs: 0.0 - Accuracy Object-pole: 0.0002 - Accuracy Object-trafficsign: 0.0 - Accuracy Object-trafficlight: 0.0 - Accuracy Nature-vegetation: 0.9476 - Accuracy Nature-terrain: 0.8466 - Accuracy Sky: 0.9510 - Accuracy Void-ground: 0.0 - Accuracy Void-dynamic: 0.0 - Accuracy Void-static: 0.0001 - Accuracy Void-unclear: 0.0 - Iou Unlabeled: nan - Iou Flat-road: 0.6267 - Iou Flat-sidewalk: 0.8545 - Iou Flat-crosswalk: 0.0 - Iou Flat-cyclinglane: 0.5807 - Iou Flat-parkingdriveway: 0.2032 - Iou Flat-railtrack: nan - Iou Flat-curb: 0.1902 - Iou Human-person: 0.0 - Iou Human-rider: 0.0 - Iou Vehicle-car: 0.6625 - Iou Vehicle-truck: 0.0 - Iou Vehicle-bus: 0.0 - Iou Vehicle-tramtrain: 0.0 - Iou Vehicle-motorcycle: 0.0 - Iou Vehicle-bicycle: 0.0 - Iou Vehicle-caravan: 0.0 - Iou Vehicle-cartrailer: 0.0 - Iou Construction-building: 0.5368 - Iou Construction-door: 0.0 - Iou Construction-wall: 0.0195 - Iou Construction-fenceguardrail: 0.0 - Iou Construction-bridge: 0.0 - Iou Construction-tunnel: nan - Iou Construction-stairs: 0.0 - Iou Object-pole: 0.0002 - Iou Object-trafficsign: 0.0 - Iou Object-trafficlight: 0.0 - Iou Nature-vegetation: 0.7879 - Iou Nature-terrain: 0.6502 - Iou Sky: 0.8781 - Iou Void-ground: 0.0 - Iou Void-dynamic: 0.0 - Iou Void-static: 0.0001 - Iou Void-unclear: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Flat-road | Accuracy Flat-sidewalk | Accuracy Flat-crosswalk | Accuracy Flat-cyclinglane | Accuracy Flat-parkingdriveway | Accuracy Flat-railtrack | Accuracy Flat-curb | Accuracy Human-person | Accuracy Human-rider | Accuracy Vehicle-car | Accuracy Vehicle-truck | Accuracy Vehicle-bus | Accuracy Vehicle-tramtrain | Accuracy 
Vehicle-motorcycle | Accuracy Vehicle-bicycle | Accuracy Vehicle-caravan | Accuracy Vehicle-cartrailer | Accuracy Construction-building | Accuracy Construction-door | Accuracy Construction-wall | Accuracy Construction-fenceguardrail | Accuracy Construction-bridge | Accuracy Construction-tunnel | Accuracy Construction-stairs | Accuracy Object-pole | Accuracy Object-trafficsign | Accuracy Object-trafficlight | Accuracy Nature-vegetation | Accuracy Nature-terrain | Accuracy Sky | Accuracy Void-ground | Accuracy Void-dynamic | Accuracy Void-static | Accuracy Void-unclear | Iou Unlabeled | Iou Flat-road | Iou Flat-sidewalk | Iou Flat-crosswalk | Iou Flat-cyclinglane | Iou Flat-parkingdriveway | Iou Flat-railtrack | Iou Flat-curb | Iou Human-person | Iou Human-rider | Iou Vehicle-car | Iou Vehicle-truck | Iou Vehicle-bus | Iou Vehicle-tramtrain | Iou Vehicle-motorcycle | Iou Vehicle-bicycle | Iou Vehicle-caravan | Iou Vehicle-cartrailer | Iou Construction-building | Iou Construction-door | Iou Construction-wall | Iou Construction-fenceguardrail | Iou Construction-bridge | Iou Construction-tunnel | Iou Construction-stairs | Iou Object-pole | Iou Object-trafficsign | Iou Object-trafficlight | Iou Nature-vegetation | Iou Nature-terrain | Iou Sky | Iou Void-ground | Iou Void-dynamic | Iou Void-static | Iou Void-unclear | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------------:|:----------------------:|:-----------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:------------------:|:---------------------:|:--------------------:|:--------------------:|:----------------------:|:--------------------:|:--------------------------:|:---------------------------:|:------------------------:|:------------------------:|:---------------------------:|:------------------------------:|:--------------------------:|:--------------------------:|:------------------------------------:|:----------------------------:|:----------------------------:|:----------------------------:|:--------------------:|:---------------------------:|:----------------------------:|:--------------------------:|:-----------------------:|:------------:|:--------------------:|:---------------------:|:--------------------:|:---------------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:| | No log | 1.0 | 400 | 1.0489 | 0.1563 | 0.2092 | 0.7462 | nan | 0.7727 | 0.9457 | 0.0 | 0.5203 | 0.0416 | nan | 0.0011 | 0.0 | 0.0 | 0.8995 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7747 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8911 | 0.9117 | 0.9374 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5549 | 0.7865 | 0.0 | 0.4480 | 0.0400 | nan | 0.0011 | 0.0 | 0.0 | 0.5695 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5072 | 
0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7359 | 0.5246 | 0.8336 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.4244 | 2.0 | 800 | 0.8409 | 0.1794 | 0.2249 | 0.7813 | nan | 0.8638 | 0.9504 | 0.0 | 0.5945 | 0.1981 | nan | 0.1027 | 0.0 | 0.0 | 0.8937 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8568 | 0.0 | 0.0085 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.9426 | 0.8503 | 0.9368 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6014 | 0.8353 | 0.0 | 0.5403 | 0.1665 | nan | 0.0927 | 0.0 | 0.0 | 0.6676 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5350 | 0.0 | 0.0085 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.7710 | 0.6442 | 0.8776 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.4244 | 3.0 | 1200 | 0.7818 | 0.1872 | 0.2343 | 0.7917 | nan | 0.8716 | 0.9457 | 0.0 | 0.6578 | 0.2561 | nan | 0.2287 | 0.0 | 0.0 | 0.9059 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8667 | 0.0 | 0.0198 | 0.0 | 0.0 | nan | 0.0 | 0.0002 | 0.0 | 0.0 | 0.9476 | 0.8466 | 0.9510 | 0.0 | 0.0 | 0.0001 | 0.0 | nan | 0.6267 | 0.8545 | 0.0 | 0.5807 | 0.2032 | nan | 0.1902 | 0.0 | 0.0 | 0.6625 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5368 | 0.0 | 0.0195 | 0.0 | 0.0 | nan | 0.0 | 0.0002 | 0.0 | 0.0 | 0.7879 | 0.6502 | 0.8781 | 0.0 | 0.0 | 0.0001 | 0.0 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.3.0+cu121 - Datasets 2.19.2 - Tokenizers 0.19.1
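The per-class rows above, including the nan entries for classes such as flat-railtrack that never appear in the evaluation masks, are the kind of breakdown produced by the `mean_iou` metric from the `evaluate` library (an assumption; the evaluation code is not published). A small sketch with placeholder arrays:

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# predictions / references: (H, W) integer arrays of class ids in [0, 34].
preds = [np.zeros((64, 64), dtype=np.int64)]   # placeholder prediction
labels = [np.zeros((64, 64), dtype=np.int64)]  # placeholder ground truth

results = metric.compute(
    predictions=preds,
    references=labels,
    num_labels=35,        # the 35 sidewalk classes listed below
    ignore_index=255,
    reduce_labels=False,
)
print(results["mean_iou"], results["per_category_iou"])
```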
[ "unlabeled", "flat-road", "flat-sidewalk", "flat-crosswalk", "flat-cyclinglane", "flat-parkingdriveway", "flat-railtrack", "flat-curb", "human-person", "human-rider", "vehicle-car", "vehicle-truck", "vehicle-bus", "vehicle-tramtrain", "vehicle-motorcycle", "vehicle-bicycle", "vehicle-caravan", "vehicle-cartrailer", "construction-building", "construction-door", "construction-wall", "construction-fenceguardrail", "construction-bridge", "construction-tunnel", "construction-stairs", "object-pole", "object-trafficsign", "object-trafficlight", "nature-vegetation", "nature-terrain", "sky", "void-ground", "void-dynamic", "void-static", "void-unclear" ]
Hasano20/segformer-b4-finetuned
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b4-finetuned This model is a fine-tuned version of [nvidia/mit-b4](https://huggingface.co/nvidia/mit-b4) on the Hasano20/Set1 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results ### Framework versions - Transformers 4.41.2 - Pytorch 2.0.1+cu117 - Datasets 2.19.2 - Tokenizers 0.19.1
[ "background", "melt", "substrate" ]
Hasano20/segformer_Clean_Set1_95images
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer_Clean_Set1_95images This model is a fine-tuned version of [nvidia/mit-b4](https://huggingface.co/nvidia/mit-b4) on the Hasano20/Clean_Set1_95images dataset. It achieves the following results on the evaluation set: - Loss: 0.0223 - Mean Iou: 0.6447 - Mean Accuracy: 0.9824 - Overall Accuracy: 0.9886 - Accuracy Background: nan - Accuracy Melt: 0.9724 - Accuracy Substrate: 0.9923 - Iou Background: 0.0 - Iou Melt: 0.9458 - Iou Substrate: 0.9882 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 20 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:| | 0.2051 | 1.1765 | 20 | 0.3764 | 0.3339 | 0.5766 | 0.8354 | nan | 0.1639 | 0.9894 | 0.0 | 0.1612 | 0.8404 | | 0.3486 | 2.3529 | 40 | 0.1932 | 0.4595 | 0.7687 | 0.8745 | nan | 0.6000 | 0.9375 | 0.0 | 0.4928 | 0.8858 | | 0.0831 | 3.5294 | 60 | 0.2016 | 0.4101 | 0.6782 | 0.8792 | nan | 0.3576 | 0.9988 | 0.0 | 0.3570 | 0.8732 | | 0.0809 | 4.7059 | 80 | 0.0763 | 0.5787 | 0.9243 | 0.9507 | nan | 0.8822 | 0.9664 | 0.0 | 0.7830 | 0.9531 | | 0.0325 | 5.8824 | 100 | 0.0694 | 0.6028 | 0.9436 | 0.9618 | nan | 0.9146 | 0.9727 | 0.0 | 0.8479 | 0.9606 | | 0.0279 | 7.0588 | 120 | 0.0460 | 0.6142 | 0.9520 | 0.9712 | nan | 0.9213 | 0.9826 | 0.0 | 0.8739 | 0.9686 | | 0.0493 | 8.2353 | 140 | 0.0353 | 0.6297 | 0.9648 | 0.9802 | nan | 0.9404 | 0.9893 | 0.0 | 0.9092 | 0.9797 | | 0.0286 | 9.4118 | 160 | 0.0366 | 0.6261 | 0.9643 | 0.9765 | nan | 0.9449 | 0.9837 | 0.0 | 0.8997 | 0.9787 | | 0.0463 | 10.5882 | 180 | 0.0258 | 0.6425 | 0.9798 | 0.9879 | nan | 0.9669 | 0.9927 | 0.0 | 0.9414 | 0.9862 | | 0.0145 | 11.7647 | 200 | 0.0302 | 0.6324 | 0.9652 | 0.9821 | nan | 0.9382 | 0.9922 | 0.0 | 0.9162 | 0.9810 | | 0.0221 | 12.9412 | 220 | 0.0262 | 0.6379 | 0.9733 | 0.9850 | nan | 0.9547 | 0.9919 | 0.0 | 0.9289 | 0.9848 | | 0.0109 | 14.1176 | 240 | 0.0236 | 0.6417 | 0.9764 | 0.9869 | nan | 0.9595 | 0.9932 | 0.0 | 0.9379 | 0.9871 | | 0.0122 | 15.2941 | 260 | 0.0252 | 0.6407 | 0.9812 | 0.9866 | nan | 0.9725 | 0.9898 | 0.0 | 0.9358 | 0.9864 | | 0.0101 | 16.4706 | 280 | 0.0239 | 0.6417 | 0.9799 | 0.9869 | nan | 0.9686 | 0.9911 | 0.0 | 0.9382 | 0.9870 | | 0.0113 | 17.6471 | 300 | 0.0231 | 0.6425 | 0.9798 | 0.9874 | nan | 0.9675 | 0.9920 | 0.0 | 0.9399 | 0.9875 | | 0.0086 | 18.8235 | 320 | 0.0225 | 0.6444 | 0.9826 | 0.9885 | nan | 0.9733 | 0.9919 | 0.0 | 0.9451 | 0.9882 | | 0.0086 | 20.0 | 340 | 0.0223 | 0.6447 | 0.9824 | 0.9886 | nan | 0.9724 | 0.9923 | 0.0 | 0.9458 | 0.9882 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.0.1+cu117 - Datasets 2.19.2 - Tokenizers 0.19.1
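Two things worth reading off the numbers above: `Accuracy Background` is nan and `Iou Background` is 0.0, which suggests the background class is absent from the evaluation masks, so the mean IoU of 0.6447 is dragged down by a structurally zero class while melt (0.9458) and substrate (0.9882) are both strong. For reproducing the setup, the three classes below translate into the usual `id2label` wiring (a sketch; the actual training script is not published):

```python
from transformers import SegformerForSemanticSegmentation

id2label = {0: "background", 1: "melt", 2: "substrate"}
label2id = {name: idx for idx, name in id2label.items()}

# mit-b4 backbone with a freshly initialised 3-class decode head.
model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b4",
    num_labels=3,
    id2label=id2label,
    label2id=label2id,
)
```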
[ "background", "melt", "substrate" ]
Hasano20/segformer_Clean_Set1_95images_mit-b5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # SegFormer_Clean_Set1_95images_mit-b5 This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the Hasano20/Clean_Set1_95images dataset. It achieves the following results on the evaluation set: - Loss: 0.0390 - Mean Iou: 0.9468 - Mean Accuracy: 0.9733 - Overall Accuracy: 0.9860 - Accuracy Background: 0.9960 - Accuracy Melt: 0.9390 - Accuracy Substrate: 0.9850 - Iou Background: 0.9899 - Iou Melt: 0.8763 - Iou Substrate: 0.9743 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 20 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:| | 0.3883 | 0.5882 | 10 | 0.7088 | 0.5294 | 0.6161 | 0.8428 | 0.8644 | 0.0 | 0.9840 | 0.8428 | 0.0 | 0.7456 | | 0.6271 | 1.1765 | 20 | 0.4185 | 0.5763 | 0.6455 | 0.8828 | 0.9472 | 0.0011 | 0.9882 | 0.9297 | 0.0011 | 0.7980 | | 0.1779 | 1.7647 | 30 | 0.2746 | 0.6105 | 0.6712 | 0.9000 | 0.9943 | 0.0499 | 0.9694 | 0.9534 | 0.0474 | 0.8307 | | 0.228 | 2.3529 | 40 | 0.2865 | 0.6102 | 0.6723 | 0.8897 | 0.9635 | 0.0820 | 0.9716 | 0.9359 | 0.0692 | 0.8254 | | 0.1099 | 2.9412 | 50 | 0.2432 | 0.6646 | 0.7305 | 0.9018 | 0.9879 | 0.2657 | 0.9380 | 0.9495 | 0.2073 | 0.8369 | | 0.1448 | 3.5294 | 60 | 0.3321 | 0.5993 | 0.6606 | 0.8987 | 0.9744 | 0.0140 | 0.9934 | 0.9613 | 0.0139 | 0.8226 | | 0.2412 | 4.1176 | 70 | 0.2053 | 0.6581 | 0.7115 | 0.9150 | 0.9906 | 0.1590 | 0.9850 | 0.9734 | 0.1485 | 0.8525 | | 0.1585 | 4.7059 | 80 | 0.2824 | 0.7094 | 0.8614 | 0.8838 | 0.9775 | 0.8013 | 0.8055 | 0.9504 | 0.3927 | 0.7851 | | 0.2025 | 5.2941 | 90 | 0.2405 | 0.7011 | 0.8139 | 0.8924 | 0.9982 | 0.6013 | 0.8423 | 0.9387 | 0.3501 | 0.8144 | | 0.2516 | 5.8824 | 100 | 0.2134 | 0.7488 | 0.8852 | 0.9083 | 0.9937 | 0.8227 | 0.8391 | 0.9721 | 0.4533 | 0.8212 | | 0.275 | 6.4706 | 110 | 0.2856 | 0.7243 | 0.8793 | 0.8910 | 0.9965 | 0.8484 | 0.7932 | 0.9543 | 0.4339 | 0.7848 | | 0.0721 | 7.0588 | 120 | 0.1417 | 0.7758 | 0.8225 | 0.9428 | 0.9913 | 0.4956 | 0.9804 | 0.9789 | 0.4530 | 0.8955 | | 0.1478 | 7.6471 | 130 | 0.1383 | 0.7811 | 0.8383 | 0.9412 | 0.9828 | 0.5588 | 0.9733 | 0.9715 | 0.4727 | 0.8992 | | 0.0541 | 8.2353 | 140 | 0.1654 | 0.7353 | 0.7778 | 0.9368 | 0.9958 | 0.3461 | 0.9915 | 0.9805 | 0.3400 | 0.8854 | | 0.1068 | 8.8235 | 150 | 0.1001 | 0.8481 | 0.8900 | 0.9607 | 0.9977 | 0.6982 | 0.9742 | 0.9813 | 0.6358 | 0.9272 | | 0.0879 | 9.4118 | 160 | 0.1177 | 0.8272 | 0.8658 | 0.9568 | 0.9914 | 0.6186 | 0.9875 | 0.9798 | 0.5785 | 0.9232 | | 0.0855 | 10.0 | 170 | 0.0929 | 0.8763 | 0.9444 | 0.9650 | 0.9910 | 0.8886 | 0.9537 | 0.9848 | 0.7113 | 0.9327 | | 0.102 | 10.5882 | 180 | 0.0770 | 0.8935 | 0.9405 | 0.9715 | 0.9962 | 
0.8565 | 0.9689 | 0.9851 | 0.7486 | 0.9468 | | 0.1044 | 11.1765 | 190 | 0.1401 | 0.7868 | 0.8367 | 0.9441 | 0.9696 | 0.5446 | 0.9957 | 0.9672 | 0.4853 | 0.9080 | | 0.0705 | 11.7647 | 200 | 0.0822 | 0.8836 | 0.9507 | 0.9674 | 0.9924 | 0.9057 | 0.9542 | 0.9853 | 0.7276 | 0.9380 | | 0.0583 | 12.3529 | 210 | 0.0670 | 0.9102 | 0.9489 | 0.9757 | 0.9957 | 0.8760 | 0.9750 | 0.9841 | 0.7914 | 0.9550 | | 0.0337 | 12.9412 | 220 | 0.0718 | 0.9048 | 0.9384 | 0.9751 | 0.9960 | 0.8389 | 0.9803 | 0.9858 | 0.7756 | 0.9530 | | 0.0237 | 13.5294 | 230 | 0.0634 | 0.9106 | 0.9419 | 0.9769 | 0.9957 | 0.8467 | 0.9832 | 0.9878 | 0.7879 | 0.9562 | | 0.2478 | 14.1176 | 240 | 0.0724 | 0.8949 | 0.9289 | 0.9726 | 0.9958 | 0.8103 | 0.9806 | 0.9855 | 0.7514 | 0.9478 | | 0.0237 | 14.7059 | 250 | 0.0570 | 0.9230 | 0.9610 | 0.9790 | 0.9950 | 0.9124 | 0.9757 | 0.9861 | 0.8226 | 0.9604 | | 0.0237 | 15.2941 | 260 | 0.0564 | 0.9251 | 0.9650 | 0.9798 | 0.9957 | 0.9248 | 0.9745 | 0.9887 | 0.8253 | 0.9612 | | 0.0414 | 15.8824 | 270 | 0.0786 | 0.8738 | 0.8997 | 0.9693 | 0.9926 | 0.7107 | 0.9959 | 0.9893 | 0.6917 | 0.9405 | | 0.0444 | 16.4706 | 280 | 0.0431 | 0.9383 | 0.9686 | 0.9840 | 0.9962 | 0.9269 | 0.9828 | 0.9908 | 0.8539 | 0.9702 | | 0.0307 | 17.0588 | 290 | 0.0416 | 0.9438 | 0.9719 | 0.9855 | 0.9942 | 0.9350 | 0.9864 | 0.9900 | 0.8675 | 0.9741 | | 0.0335 | 17.6471 | 300 | 0.0420 | 0.9402 | 0.9635 | 0.9846 | 0.9943 | 0.9062 | 0.9900 | 0.9900 | 0.8589 | 0.9716 | | 0.0717 | 18.2353 | 310 | 0.0448 | 0.9375 | 0.9651 | 0.9837 | 0.9971 | 0.9144 | 0.9837 | 0.9891 | 0.8533 | 0.9702 | | 0.0225 | 18.8235 | 320 | 0.0403 | 0.9405 | 0.9635 | 0.9847 | 0.9947 | 0.9058 | 0.9899 | 0.9904 | 0.8595 | 0.9716 | | 0.0315 | 19.4118 | 330 | 0.0394 | 0.9444 | 0.9686 | 0.9855 | 0.9956 | 0.9230 | 0.9873 | 0.9901 | 0.8698 | 0.9732 | | 0.0178 | 20.0 | 340 | 0.0390 | 0.9468 | 0.9733 | 0.9860 | 0.9960 | 0.9390 | 0.9850 | 0.9899 | 0.8763 | 0.9743 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.0.1+cu117 - Datasets 2.19.2 - Tokenizers 0.19.1
[ "background", "melt", "substrate" ]
droneambulanceproject/drone_ambulance_project_panoptic_segmentation
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "background", "not_crashed_vehicle", "crashed_vehicle" ]
Hasano20/SegFormer_Clean_Set1_240430_V2-Augmented_mit-b5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # SegFormer_Clean_Set1_240430_V2-Augmented_mit-b5 This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the Hasano20/Clean_Set1_240430_V2-Augmented dataset. It achieves the following results on the evaluation set: - Loss: 0.0899 - Mean Iou: 0.8524 - Mean Accuracy: 0.8932 - Overall Accuracy: 0.9653 - Accuracy Background: 0.9900 - Accuracy Melt: 0.7107 - Accuracy Substrate: 0.9788 - Iou Background: 0.9693 - Iou Melt: 0.6451 - Iou Substrate: 0.9429 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 20 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:| | 0.1394 | 1.6129 | 50 | 0.2486 | 0.6252 | 0.6776 | 0.9171 | 0.9800 | 0.0713 | 0.9816 | 0.9285 | 0.0680 | 0.8792 | | 0.2482 | 3.2258 | 100 | 0.2178 | 0.6883 | 0.7470 | 0.9224 | 0.9831 | 0.3037 | 0.9543 | 0.9307 | 0.2490 | 0.8854 | | 0.1697 | 4.8387 | 150 | 0.2044 | 0.6993 | 0.7613 | 0.9236 | 0.9847 | 0.3511 | 0.9480 | 0.9313 | 0.2796 | 0.8871 | | 0.139 | 6.4516 | 200 | 0.1897 | 0.7250 | 0.7835 | 0.9317 | 0.9771 | 0.4086 | 0.9648 | 0.9415 | 0.3395 | 0.8940 | | 0.0951 | 8.0645 | 250 | 0.1879 | 0.6863 | 0.7344 | 0.9291 | 0.9851 | 0.2414 | 0.9766 | 0.9372 | 0.2290 | 0.8928 | | 0.0812 | 9.6774 | 300 | 0.1875 | 0.7513 | 0.8449 | 0.9285 | 0.9636 | 0.6338 | 0.9372 | 0.9370 | 0.4265 | 0.8903 | | 0.1349 | 11.2903 | 350 | 0.2020 | 0.6825 | 0.7357 | 0.9247 | 0.9810 | 0.2577 | 0.9685 | 0.9328 | 0.2265 | 0.8882 | | 0.1312 | 12.9032 | 400 | 0.1401 | 0.7627 | 0.8053 | 0.9465 | 0.9864 | 0.4477 | 0.9816 | 0.9624 | 0.4169 | 0.9090 | | 0.1061 | 14.5161 | 450 | 0.1051 | 0.8297 | 0.8811 | 0.9586 | 0.9890 | 0.6853 | 0.9691 | 0.9657 | 0.5932 | 0.9302 | | 0.0287 | 16.1290 | 500 | 0.1045 | 0.8349 | 0.8835 | 0.9598 | 0.9850 | 0.6905 | 0.9749 | 0.9640 | 0.6073 | 0.9335 | | 0.2051 | 17.7419 | 550 | 0.0928 | 0.8466 | 0.8868 | 0.9644 | 0.9875 | 0.6906 | 0.9824 | 0.9687 | 0.6290 | 0.9420 | | 0.0898 | 19.3548 | 600 | 0.0899 | 0.8524 | 0.8932 | 0.9653 | 0.9900 | 0.7107 | 0.9788 | 0.9693 | 0.6451 | 0.9429 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.0.1+cu117 - Datasets 2.19.2 - Tokenizers 0.19.1
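The dataset name suggests offline augmentation, but the card does not say which transforms were used. A typical paired image/mask augmentation sketch, purely hypothetical; `hflip`/`rotate` are stand-ins for whatever the real pipeline did:

```python
import random
from torchvision.transforms import functional as F

def paired_augment(image, mask):
    """Apply identical geometric transforms to an image and its segmentation mask."""
    if random.random() < 0.5:
        image, mask = F.hflip(image), F.hflip(mask)
    angle = random.uniform(-10.0, 10.0)
    # F.rotate defaults to nearest-neighbour interpolation, which keeps mask ids intact.
    image = F.rotate(image, angle)
    mask = F.rotate(mask, angle)
    return image, mask
```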
[ "background", "melt", "substrate" ]
Hasano20/SegFormer_Clean_Set1_95images_mit-b5_RGB
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # SegFormer_Clean_Set1_95images_mit-b5_RGB This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the Hasano20/Clean_Set1_95images dataset. It achieves the following results on the evaluation set: - Loss: 0.0210 - Mean Iou: 0.9721 - Mean Accuracy: 0.9816 - Overall Accuracy: 0.9941 - Accuracy Background: 0.9974 - Accuracy Melt: 0.9506 - Accuracy Substrate: 0.9969 - Iou Background: 0.9954 - Iou Melt: 0.9316 - Iou Substrate: 0.9891 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:| | 0.2459 | 1.1765 | 20 | 0.4048 | 0.5613 | 0.6310 | 0.8812 | 0.9733 | 0.0102 | 0.9096 | 0.8391 | 0.0100 | 0.8349 | | 0.2421 | 2.3529 | 40 | 0.1840 | 0.6645 | 0.7118 | 0.9292 | 0.9969 | 0.1720 | 0.9666 | 0.9574 | 0.1475 | 0.8886 | | 0.1511 | 3.5294 | 60 | 0.1347 | 0.6751 | 0.7154 | 0.9392 | 0.9909 | 0.1590 | 0.9963 | 0.9639 | 0.1570 | 0.9045 | | 0.1449 | 4.7059 | 80 | 0.1350 | 0.7359 | 0.7793 | 0.9471 | 0.9937 | 0.3623 | 0.9819 | 0.9642 | 0.3221 | 0.9213 | | 0.1276 | 5.8824 | 100 | 0.1006 | 0.8194 | 0.9138 | 0.9551 | 0.9823 | 0.8117 | 0.9474 | 0.9707 | 0.5605 | 0.9271 | | 0.0638 | 7.0588 | 120 | 0.0916 | 0.8139 | 0.8438 | 0.9646 | 0.9964 | 0.5438 | 0.9913 | 0.9779 | 0.5208 | 0.9431 | | 0.0535 | 8.2353 | 140 | 0.0695 | 0.8572 | 0.8769 | 0.9735 | 0.9969 | 0.6367 | 0.9971 | 0.9804 | 0.6316 | 0.9597 | | 0.0346 | 9.4118 | 160 | 0.0435 | 0.9224 | 0.9384 | 0.9848 | 0.9962 | 0.8230 | 0.9959 | 0.9888 | 0.8039 | 0.9745 | | 0.0393 | 10.5882 | 180 | 0.0376 | 0.9352 | 0.9642 | 0.9867 | 0.9970 | 0.9082 | 0.9873 | 0.9882 | 0.8376 | 0.9798 | | 0.0294 | 11.7647 | 200 | 0.0448 | 0.9298 | 0.9746 | 0.9851 | 0.9932 | 0.9487 | 0.9818 | 0.9916 | 0.8253 | 0.9725 | | 0.0387 | 12.9412 | 220 | 0.0409 | 0.9270 | 0.9488 | 0.9855 | 0.9970 | 0.8575 | 0.9918 | 0.9830 | 0.8157 | 0.9823 | | 0.0435 | 14.1176 | 240 | 0.0353 | 0.9482 | 0.9685 | 0.9886 | 0.9891 | 0.9185 | 0.9980 | 0.9881 | 0.8749 | 0.9816 | | 0.022 | 15.2941 | 260 | 0.0246 | 0.9587 | 0.9696 | 0.9915 | 0.9970 | 0.9152 | 0.9967 | 0.9931 | 0.8979 | 0.9853 | | 0.0203 | 16.4706 | 280 | 0.0191 | 0.9698 | 0.9826 | 0.9934 | 0.9953 | 0.9557 | 0.9967 | 0.9935 | 0.9272 | 0.9887 | | 0.0212 | 17.6471 | 300 | 0.0256 | 0.9604 | 0.9724 | 0.9917 | 0.9953 | 0.9243 | 0.9975 | 0.9933 | 0.9028 | 0.9851 | | 0.0123 | 18.8235 | 320 | 0.0223 | 0.9638 | 0.9763 | 0.9924 | 0.9954 | 0.9363 | 0.9972 | 0.9938 | 0.9112 | 0.9864 | | 0.0137 | 20.0 | 340 | 0.0292 | 0.9543 | 0.9720 | 0.9906 | 0.9933 | 0.9256 | 0.9969 | 0.9919 | 0.8867 | 0.9844 | | 0.0092 | 21.1765 | 360 | 0.0171 | 0.9719 | 0.9797 
| 0.9941 | 0.9977 | 0.9439 | 0.9974 | 0.9942 | 0.9312 | 0.9902 | | 0.0094 | 22.3529 | 380 | 0.0178 | 0.9730 | 0.9829 | 0.9941 | 0.9984 | 0.9550 | 0.9952 | 0.9938 | 0.9352 | 0.9901 | | 0.016 | 23.5294 | 400 | 0.0163 | 0.9760 | 0.9881 | 0.9946 | 0.9954 | 0.9721 | 0.9969 | 0.9944 | 0.9430 | 0.9907 | | 0.0083 | 24.7059 | 420 | 0.0151 | 0.9784 | 0.9882 | 0.9952 | 0.9973 | 0.9707 | 0.9965 | 0.9952 | 0.9483 | 0.9916 | | 0.0094 | 25.8824 | 440 | 0.0259 | 0.9626 | 0.9731 | 0.9925 | 0.9971 | 0.9248 | 0.9972 | 0.9952 | 0.9067 | 0.9858 | | 0.0144 | 27.0588 | 460 | 0.0171 | 0.9743 | 0.9860 | 0.9945 | 0.9980 | 0.9648 | 0.9951 | 0.9948 | 0.9376 | 0.9905 | | 0.0075 | 28.2353 | 480 | 0.0168 | 0.9733 | 0.9824 | 0.9943 | 0.9972 | 0.9528 | 0.9972 | 0.9949 | 0.9351 | 0.9900 | | 0.0076 | 29.4118 | 500 | 0.0171 | 0.9756 | 0.9842 | 0.9947 | 0.9979 | 0.9580 | 0.9966 | 0.9951 | 0.9409 | 0.9907 | | 0.0075 | 30.5882 | 520 | 0.0170 | 0.9748 | 0.9835 | 0.9946 | 0.9974 | 0.9560 | 0.9971 | 0.9954 | 0.9388 | 0.9901 | | 0.0084 | 31.7647 | 540 | 0.0154 | 0.9783 | 0.9899 | 0.9952 | 0.9976 | 0.9770 | 0.9953 | 0.9954 | 0.9480 | 0.9914 | | 0.0055 | 32.9412 | 560 | 0.0156 | 0.9777 | 0.9888 | 0.9951 | 0.9971 | 0.9730 | 0.9962 | 0.9953 | 0.9465 | 0.9913 | | 0.009 | 34.1176 | 580 | 0.0166 | 0.9752 | 0.9856 | 0.9947 | 0.9972 | 0.9630 | 0.9965 | 0.9953 | 0.9400 | 0.9904 | | 0.0055 | 35.2941 | 600 | 0.0176 | 0.9745 | 0.9835 | 0.9946 | 0.9972 | 0.9560 | 0.9974 | 0.9954 | 0.9378 | 0.9902 | | 0.0069 | 36.4706 | 620 | 0.0180 | 0.9748 | 0.9832 | 0.9946 | 0.9974 | 0.9547 | 0.9974 | 0.9955 | 0.9388 | 0.9902 | | 0.0051 | 37.6471 | 640 | 0.0181 | 0.9752 | 0.9843 | 0.9947 | 0.9975 | 0.9585 | 0.9968 | 0.9955 | 0.9397 | 0.9903 | | 0.0071 | 38.8235 | 660 | 0.0201 | 0.9729 | 0.9847 | 0.9943 | 0.9968 | 0.9610 | 0.9963 | 0.9953 | 0.9337 | 0.9896 | | 0.0058 | 40.0 | 680 | 0.0208 | 0.9720 | 0.9826 | 0.9941 | 0.9971 | 0.9540 | 0.9968 | 0.9954 | 0.9315 | 0.9892 | | 0.0061 | 41.1765 | 700 | 0.0222 | 0.9699 | 0.9802 | 0.9937 | 0.9973 | 0.9467 | 0.9967 | 0.9954 | 0.9260 | 0.9883 | | 0.0062 | 42.3529 | 720 | 0.0205 | 0.9720 | 0.9819 | 0.9941 | 0.9975 | 0.9516 | 0.9966 | 0.9953 | 0.9315 | 0.9891 | | 0.004 | 43.5294 | 740 | 0.0193 | 0.9741 | 0.9835 | 0.9945 | 0.9973 | 0.9561 | 0.9969 | 0.9954 | 0.9371 | 0.9898 | | 0.0065 | 44.7059 | 760 | 0.0195 | 0.9738 | 0.9842 | 0.9944 | 0.9971 | 0.9588 | 0.9967 | 0.9953 | 0.9363 | 0.9898 | | 0.0044 | 45.8824 | 780 | 0.0201 | 0.9731 | 0.9830 | 0.9943 | 0.9971 | 0.9550 | 0.9969 | 0.9954 | 0.9344 | 0.9895 | | 0.0073 | 47.0588 | 800 | 0.0210 | 0.9723 | 0.9818 | 0.9941 | 0.9972 | 0.9512 | 0.9971 | 0.9954 | 0.9323 | 0.9891 | | 0.0049 | 48.2353 | 820 | 0.0209 | 0.9723 | 0.9822 | 0.9941 | 0.9974 | 0.9527 | 0.9966 | 0.9954 | 0.9322 | 0.9892 | | 0.0069 | 49.4118 | 840 | 0.0210 | 0.9721 | 0.9816 | 0.9941 | 0.9974 | 0.9506 | 0.9969 | 0.9954 | 0.9316 | 0.9891 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.0.1+cu117 - Datasets 2.19.2 - Tokenizers 0.19.1
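One detail worth noting in the table: validation loss bottoms out at 0.0151 around epoch 24.7 and drifts back up to 0.0210 by epoch 50, so the final checkpoint is not the best one. If reproducing the run, `load_best_model_at_end` would keep the epoch-24.7 weights (a sketch; the actual script is not published):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="SegFormer_Clean_Set1_95images_mit-b5_RGB",
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    num_train_epochs=50,
    eval_strategy="steps",
    eval_steps=20,                  # the table evaluates every 20 steps
    save_strategy="steps",
    save_steps=20,
    load_best_model_at_end=True,
    metric_for_best_model="eval_loss",
    greater_is_better=False,
)
```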
[ "background", "melt", "substrate" ]
Hasano20/SegFormer_Clean_Set1_Grayscale_mit-b5_Grayscale
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # SegFormer_Clean_Set1_Grayscale_mit-b5_Grayscale This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the Hasano20/Clean_Set1_Grayscale dataset. It achieves the following results on the evaluation set: - Loss: 0.0178 - Mean Iou: 0.9760 - Mean Accuracy: 0.9847 - Overall Accuracy: 0.9949 - Accuracy Background: 0.9976 - Accuracy Melt: 0.9586 - Accuracy Substrate: 0.9978 - Iou Background: 0.9959 - Iou Melt: 0.9408 - Iou Substrate: 0.9912 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 100 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:| | 0.0882 | 5.5556 | 50 | 0.1733 | 0.7543 | 0.7985 | 0.9397 | 0.9594 | 0.4375 | 0.9986 | 0.9562 | 0.4189 | 0.8880 | | 0.0295 | 11.1111 | 100 | 0.0270 | 0.9580 | 0.9736 | 0.9907 | 0.9965 | 0.9302 | 0.9940 | 0.9918 | 0.8978 | 0.9843 | | 0.0143 | 16.6667 | 150 | 0.0260 | 0.9561 | 0.9798 | 0.9901 | 0.9970 | 0.9541 | 0.9884 | 0.9910 | 0.8938 | 0.9836 | | 0.0095 | 22.2222 | 200 | 0.0224 | 0.9645 | 0.9747 | 0.9926 | 0.9985 | 0.9293 | 0.9962 | 0.9944 | 0.9119 | 0.9872 | | 0.0083 | 27.7778 | 250 | 0.0180 | 0.9742 | 0.9819 | 0.9945 | 0.9982 | 0.9498 | 0.9977 | 0.9955 | 0.9366 | 0.9905 | | 0.0072 | 33.3333 | 300 | 0.0175 | 0.9751 | 0.9838 | 0.9947 | 0.9984 | 0.9563 | 0.9968 | 0.9957 | 0.9388 | 0.9909 | | 0.0073 | 38.8889 | 350 | 0.0177 | 0.9758 | 0.9854 | 0.9948 | 0.9970 | 0.9613 | 0.9978 | 0.9957 | 0.9406 | 0.9912 | | 0.0054 | 44.4444 | 400 | 0.0179 | 0.9758 | 0.9844 | 0.9949 | 0.9978 | 0.9579 | 0.9976 | 0.9959 | 0.9404 | 0.9911 | | 0.0052 | 50.0 | 450 | 0.0178 | 0.9760 | 0.9847 | 0.9949 | 0.9976 | 0.9586 | 0.9978 | 0.9959 | 0.9408 | 0.9912 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.0.1+cu117 - Datasets 2.19.2 - Tokenizers 0.19.1
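Two assumptions worth spelling out for this grayscale variant: SegFormer takes 3-channel input, so single-channel frames presumably get replicated across RGB before normalisation, and the cosine schedule above warms up for 100 of the 450 total steps shown in the table. A sketch under those assumptions; the card's preprocessing and training code are not published:

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, get_cosine_schedule_with_warmup

# Replicate a single grayscale channel across RGB before normalisation.
image = Image.open("frame.png").convert("L").convert("RGB")  # placeholder path
inputs = SegformerImageProcessor()(images=image, return_tensors="pt")

# Cosine decay with the 100 warmup steps listed above; 450 = total steps in the table.
params = torch.nn.Linear(4, 4).parameters()   # stand-in for the real model's parameters
optimizer = torch.optim.Adam(params, lr=1e-4)
scheduler = get_cosine_schedule_with_warmup(
    optimizer, num_warmup_steps=100, num_training_steps=450
)
```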
[ "background", "melt", "substrate" ]
Hasano20/SegFormer_Clean_Set1_240430_V2-Augmented_mit-b5_RGB
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # SegFormer_Clean_Set1_240430_V2-Augmented_mit-b5_RGB This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the Hasano20/Clean_Set1_240430_V2-Augmented dataset. It achieves the following results on the evaluation set: - Loss: 0.4812 - Mean Iou: 0.6072 - Mean Accuracy: 0.6905 - Overall Accuracy: 0.8761 - Accuracy Background: 0.8967 - Accuracy Melt: 0.2439 - Accuracy Substrate: 0.9311 - Iou Background: 0.8325 - Iou Melt: 0.1423 - Iou Substrate: 0.8467 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate | |:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:| | 0.2826 | 6.6667 | 20 | 0.4812 | 0.6072 | 0.6905 | 0.8761 | 0.8967 | 0.2439 | 0.9311 | 0.8325 | 0.1423 | 0.8467 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.0.1+cu117 - Datasets 2.19.2 - Tokenizers 0.19.1
[ "background", "melt", "substrate" ]
Hasano20/segformer_mixed-set2-788img-rgb_mit-b5_17epochs
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # SegFormer_Mixed_Set2_788images_mit-b5_RGB This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the Hasano20/Mixed_Set2_788images dataset. It achieves the following results on the evaluation set: - Loss: 0.0179 - Mean Iou: 0.9757 - Mean Accuracy: 0.9872 - Overall Accuracy: 0.9938 - Accuracy Background: 0.9959 - Accuracy Melt: 0.9697 - Accuracy Substrate: 0.9959 - Iou Background: 0.9922 - Iou Melt: 0.9437 - Iou Substrate: 0.9911 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 100 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:| | 0.1292 | 0.7042 | 50 | 0.1861 | 0.7698 | 0.8223 | 0.9387 | 0.9844 | 0.5153 | 0.9673 | 0.9318 | 0.4625 | 0.9152 | | 0.1161 | 1.4085 | 100 | 0.1307 | 0.8463 | 0.9335 | 0.9543 | 0.9851 | 0.8721 | 0.9433 | 0.9596 | 0.6514 | 0.9279 | | 0.072 | 2.1127 | 150 | 0.0675 | 0.9075 | 0.9607 | 0.9762 | 0.9887 | 0.9179 | 0.9755 | 0.9821 | 0.7779 | 0.9625 | | 0.0425 | 2.8169 | 200 | 0.0622 | 0.9078 | 0.9322 | 0.9781 | 0.9868 | 0.8138 | 0.9959 | 0.9838 | 0.7746 | 0.9652 | | 0.0214 | 3.5211 | 250 | 0.0372 | 0.9458 | 0.9688 | 0.9868 | 0.9905 | 0.9223 | 0.9935 | 0.9870 | 0.8702 | 0.9802 | | 0.0397 | 4.2254 | 300 | 0.0373 | 0.9428 | 0.9802 | 0.9858 | 0.9948 | 0.9635 | 0.9824 | 0.9892 | 0.8617 | 0.9774 | | 0.0515 | 4.9296 | 350 | 0.0411 | 0.9399 | 0.9735 | 0.9846 | 0.9902 | 0.9438 | 0.9864 | 0.9865 | 0.8583 | 0.9750 | | 0.0171 | 5.6338 | 400 | 0.0267 | 0.9587 | 0.9782 | 0.9898 | 0.9937 | 0.9477 | 0.9931 | 0.9900 | 0.9017 | 0.9843 | | 0.0274 | 6.3380 | 450 | 0.0262 | 0.9621 | 0.9780 | 0.9906 | 0.9935 | 0.9454 | 0.9951 | 0.9900 | 0.9107 | 0.9857 | | 0.0105 | 7.0423 | 500 | 0.0272 | 0.9597 | 0.9844 | 0.9900 | 0.9924 | 0.9695 | 0.9913 | 0.9898 | 0.9041 | 0.9852 | | 0.0143 | 7.7465 | 550 | 0.0250 | 0.9638 | 0.9824 | 0.9911 | 0.9946 | 0.9593 | 0.9931 | 0.9907 | 0.9142 | 0.9865 | | 0.0153 | 8.4507 | 600 | 0.0226 | 0.9670 | 0.9826 | 0.9918 | 0.9947 | 0.9585 | 0.9946 | 0.9909 | 0.9223 | 0.9878 | | 0.011 | 9.1549 | 650 | 0.0201 | 0.9711 | 0.9841 | 0.9926 | 0.9936 | 0.9622 | 0.9965 | 0.9908 | 0.9330 | 0.9893 | | 0.009 | 9.8592 | 700 | 0.0199 | 0.9707 | 0.9858 | 0.9926 | 0.9962 | 0.9676 | 0.9936 | 0.9913 | 0.9315 | 0.9891 | | 0.017 | 10.5634 | 750 | 0.0206 | 0.9692 | 0.9869 | 0.9923 | 0.9964 | 0.9723 | 0.9921 | 0.9911 | 0.9279 | 0.9886 | | 0.0095 | 11.2676 | 800 | 0.0184 | 0.9733 | 0.9870 | 0.9933 | 0.9954 | 0.9704 | 0.9950 | 0.9917 | 0.9379 | 0.9902 | | 0.0142 | 11.9718 | 850 | 0.0179 | 0.9740 | 0.9862 | 0.9935 | 0.9957 | 0.9671 | 0.9957 | 0.9919 | 0.9395 | 0.9905 | | 0.0134 | 12.6761 | 
900 | 0.0180 | 0.9739 | 0.9882 | 0.9934 | 0.9948 | 0.9747 | 0.9951 | 0.9919 | 0.9394 | 0.9903 | | 0.0096 | 13.3803 | 950 | 0.0179 | 0.9744 | 0.9864 | 0.9936 | 0.9960 | 0.9675 | 0.9956 | 0.9922 | 0.9406 | 0.9905 | | 0.0089 | 14.0845 | 1000 | 0.0174 | 0.9744 | 0.9881 | 0.9936 | 0.9958 | 0.9737 | 0.9949 | 0.9922 | 0.9404 | 0.9908 | | 0.0094 | 14.7887 | 1050 | 0.0174 | 0.9754 | 0.9864 | 0.9938 | 0.9962 | 0.9671 | 0.9960 | 0.9924 | 0.9428 | 0.9911 | | 0.0089 | 15.4930 | 1100 | 0.0192 | 0.9748 | 0.9860 | 0.9935 | 0.9945 | 0.9666 | 0.9968 | 0.9918 | 0.9421 | 0.9905 | | 0.0087 | 16.1972 | 1150 | 0.0179 | 0.9757 | 0.9872 | 0.9938 | 0.9959 | 0.9697 | 0.9959 | 0.9922 | 0.9437 | 0.9911 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.0.1+cu117 - Datasets 2.19.2 - Tokenizers 0.19.1
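The Mean IoU and per-class IoU columns above follow the usual semantic-segmentation definitions; the `mean_iou` metric from the `evaluate` library computes the same quantities. A toy sketch (the arrays below are synthetic stand-ins, not data from this card):

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# toy (H, W) maps standing in for a prediction and its ground truth;
# values are class ids {0: background, 1: melt, 2: substrate}
pred_seg = np.zeros((64, 64), dtype=np.int64)
label_seg = np.zeros((64, 64), dtype=np.int64)
pred_seg[10:30, 10:30] = 1
label_seg[12:32, 12:32] = 1

results = metric.compute(predictions=[pred_seg], references=[label_seg],
                         num_labels=3, ignore_index=255)
print(results["mean_iou"], results["per_category_iou"])
```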
[ "background", "melt", "substrate" ]
yijisuk/segformer-b5-miic-tl
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b5-miic-tl This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the yijisuk/ic-chip-sample dataset. It achieves the following results on the evaluation set: - Loss: 0.2247 - Mean Iou: 0.4565 - Mean Accuracy: 0.9129 - Overall Accuracy: 0.9129 - Accuracy Unlabeled: nan - Accuracy Circuit: 0.9129 - Iou Unlabeled: 0.0 - Iou Circuit: 0.9129 - Dice Coefficient: 0.8406 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Circuit | Iou Unlabeled | Iou Circuit | Dice Coefficient | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:-------------:|:-----------:|:----------------:| | 0.2801 | 3.12 | 250 | 0.2305 | 0.4832 | 0.9663 | 0.9663 | nan | 0.9663 | 0.0 | 0.9663 | 0.8527 | | 0.2785 | 6.25 | 500 | 0.2715 | 0.4800 | 0.9601 | 0.9601 | nan | 0.9601 | 0.0 | 0.9601 | 0.8511 | | 0.208 | 9.38 | 750 | 0.2681 | 0.4811 | 0.9622 | 0.9622 | nan | 0.9622 | 0.0 | 0.9622 | 0.8538 | | 0.2042 | 12.5 | 1000 | 0.2959 | 0.4650 | 0.9299 | 0.9299 | nan | 0.9299 | 0.0 | 0.9299 | 0.7879 | | 0.1649 | 15.62 | 1250 | 0.2407 | 0.4340 | 0.8679 | 0.8679 | nan | 0.8679 | 0.0 | 0.8679 | 0.8150 | | 0.1353 | 18.75 | 1500 | 0.2530 | 0.4543 | 0.9085 | 0.9085 | nan | 0.9085 | 0.0 | 0.9085 | 0.8336 | | 0.126 | 21.88 | 1750 | 0.4934 | 0.4559 | 0.9119 | 0.9119 | nan | 0.9119 | 0.0 | 0.9119 | 0.7678 | | 0.1196 | 25.0 | 2000 | 0.2896 | 0.4604 | 0.9209 | 0.9209 | nan | 0.9209 | 0.0 | 0.9209 | 0.7807 | | 0.1149 | 28.12 | 2250 | 0.2210 | 0.4634 | 0.9268 | 0.9268 | nan | 0.9268 | 0.0 | 0.9268 | 0.8470 | | 0.1095 | 31.25 | 2500 | 0.2215 | 0.4534 | 0.9067 | 0.9067 | nan | 0.9067 | 0.0 | 0.9067 | 0.8380 | | 0.109 | 34.38 | 2750 | 0.2256 | 0.4243 | 0.8487 | 0.8487 | nan | 0.8487 | 0.0 | 0.8487 | 0.8077 | | 0.1062 | 37.5 | 3000 | 0.2172 | 0.4497 | 0.8994 | 0.8994 | nan | 0.8994 | 0.0 | 0.8994 | 0.8363 | | 0.1046 | 40.62 | 3250 | 0.2401 | 0.4551 | 0.9102 | 0.9102 | nan | 0.9102 | 0.0 | 0.9102 | 0.8387 | | 0.1096 | 43.75 | 3500 | 0.2157 | 0.4582 | 0.9164 | 0.9164 | nan | 0.9164 | 0.0 | 0.9164 | 0.8425 | | 0.1014 | 46.88 | 3750 | 0.2344 | 0.4573 | 0.9146 | 0.9146 | nan | 0.9146 | 0.0 | 0.9146 | 0.8411 | | 0.1036 | 50.0 | 4000 | 0.2247 | 0.4565 | 0.9129 | 0.9129 | nan | 0.9129 | 0.0 | 0.9129 | 0.8406 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.11.0+cu115 - Datasets 2.15.0 - Tokenizers 0.15.0
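Unusually for the cards in this collection, this one also reports a Dice Coefficient. The card does not show how that column is computed; a standard binary formulation over prediction and ground-truth masks would be:

```python
import torch

def dice_coefficient(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-6) -> float:
    """Binary Dice: 2 * |A intersect B| / (|A| + |B|)."""
    pred, target = pred.bool(), target.bool()
    inter = (pred & target).sum().item()
    return (2.0 * inter + eps) / (pred.sum().item() + target.sum().item() + eps)

# toy masks, not data from this card
pred = torch.zeros(64, 64, dtype=torch.bool)
pred[8:40, 8:40] = True
target = torch.zeros(64, 64, dtype=torch.bool)
target[10:42, 10:42] = True
print(dice_coefficient(pred, target))
```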
[ "unlabeled", "circuit" ]
grayphd/segformer-b0-finetuned-segments-sidewalk-2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-finetuned-segments-sidewalk-2 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset. It achieves the following results on the evaluation set: - Loss: 1.8299 - Mean Iou: 0.1367 - Mean Accuracy: 0.1860 - Overall Accuracy: 0.6943 - Accuracy Unlabeled: nan - Accuracy Flat-road: 0.7986 - Accuracy Flat-sidewalk: 0.8984 - Accuracy Flat-crosswalk: 0.0 - Accuracy Flat-cyclinglane: 0.0233 - Accuracy Flat-parkingdriveway: 0.0008 - Accuracy Flat-railtrack: nan - Accuracy Flat-curb: 0.0 - Accuracy Human-person: 0.0 - Accuracy Human-rider: 0.0 - Accuracy Vehicle-car: 0.8604 - Accuracy Vehicle-truck: 0.0 - Accuracy Vehicle-bus: nan - Accuracy Vehicle-tramtrain: 0.0 - Accuracy Vehicle-motorcycle: 0.0 - Accuracy Vehicle-bicycle: 0.0 - Accuracy Vehicle-caravan: 0.0 - Accuracy Vehicle-cartrailer: 0.0 - Accuracy Construction-building: 0.8665 - Accuracy Construction-door: 0.0 - Accuracy Construction-wall: 0.0 - Accuracy Construction-fenceguardrail: 0.0 - Accuracy Construction-bridge: 0.0 - Accuracy Construction-tunnel: 0.0 - Accuracy Construction-stairs: 0.0 - Accuracy Object-pole: 0.0 - Accuracy Object-trafficsign: 0.0 - Accuracy Object-trafficlight: 0.0 - Accuracy Nature-vegetation: 0.9388 - Accuracy Nature-terrain: 0.7081 - Accuracy Sky: 0.8565 - Accuracy Void-ground: 0.0001 - Accuracy Void-dynamic: 0.0 - Accuracy Void-static: 0.0 - Accuracy Void-unclear: 0.0 - Iou Unlabeled: nan - Iou Flat-road: 0.4128 - Iou Flat-sidewalk: 0.7214 - Iou Flat-crosswalk: 0.0 - Iou Flat-cyclinglane: 0.0233 - Iou Flat-parkingdriveway: 0.0008 - Iou Flat-railtrack: nan - Iou Flat-curb: 0.0 - Iou Human-person: 0.0 - Iou Human-rider: 0.0 - Iou Vehicle-car: 0.6003 - Iou Vehicle-truck: 0.0 - Iou Vehicle-bus: nan - Iou Vehicle-tramtrain: 0.0 - Iou Vehicle-motorcycle: 0.0 - Iou Vehicle-bicycle: 0.0 - Iou Vehicle-caravan: 0.0 - Iou Vehicle-cartrailer: 0.0 - Iou Construction-building: 0.5461 - Iou Construction-door: 0.0 - Iou Construction-wall: 0.0 - Iou Construction-fenceguardrail: 0.0 - Iou Construction-bridge: 0.0 - Iou Construction-tunnel: 0.0 - Iou Construction-stairs: 0.0 - Iou Object-pole: 0.0 - Iou Object-trafficsign: 0.0 - Iou Object-trafficlight: 0.0 - Iou Nature-vegetation: 0.7232 - Iou Nature-terrain: 0.5549 - Iou Sky: 0.7907 - Iou Void-ground: 0.0001 - Iou Void-dynamic: 0.0 - Iou Void-static: 0.0 - Iou Void-unclear: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 8 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Flat-road | Accuracy Flat-sidewalk | Accuracy Flat-crosswalk | Accuracy Flat-cyclinglane | Accuracy Flat-parkingdriveway | Accuracy Flat-railtrack | Accuracy Flat-curb | Accuracy Human-person | Accuracy Human-rider | Accuracy Vehicle-car | Accuracy Vehicle-truck | Accuracy Vehicle-bus | Accuracy Vehicle-tramtrain | Accuracy Vehicle-motorcycle | 
Accuracy Vehicle-bicycle | Accuracy Vehicle-caravan | Accuracy Vehicle-cartrailer | Accuracy Construction-building | Accuracy Construction-door | Accuracy Construction-wall | Accuracy Construction-fenceguardrail | Accuracy Construction-bridge | Accuracy Construction-tunnel | Accuracy Construction-stairs | Accuracy Object-pole | Accuracy Object-trafficsign | Accuracy Object-trafficlight | Accuracy Nature-vegetation | Accuracy Nature-terrain | Accuracy Sky | Accuracy Void-ground | Accuracy Void-dynamic | Accuracy Void-static | Accuracy Void-unclear | Iou Unlabeled | Iou Flat-road | Iou Flat-sidewalk | Iou Flat-crosswalk | Iou Flat-cyclinglane | Iou Flat-parkingdriveway | Iou Flat-railtrack | Iou Flat-curb | Iou Human-person | Iou Human-rider | Iou Vehicle-car | Iou Vehicle-truck | Iou Vehicle-bus | Iou Vehicle-tramtrain | Iou Vehicle-motorcycle | Iou Vehicle-bicycle | Iou Vehicle-caravan | Iou Vehicle-cartrailer | Iou Construction-building | Iou Construction-door | Iou Construction-wall | Iou Construction-fenceguardrail | Iou Construction-bridge | Iou Construction-tunnel | Iou Construction-stairs | Iou Object-pole | Iou Object-trafficsign | Iou Object-trafficlight | Iou Nature-vegetation | Iou Nature-terrain | Iou Sky | Iou Void-ground | Iou Void-dynamic | Iou Void-static | Iou Void-unclear | |:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------------:|:----------------------:|:-----------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:------------------:|:---------------------:|:--------------------:|:--------------------:|:----------------------:|:--------------------:|:--------------------------:|:---------------------------:|:------------------------:|:------------------------:|:---------------------------:|:------------------------------:|:--------------------------:|:--------------------------:|:------------------------------------:|:----------------------------:|:----------------------------:|:----------------------------:|:--------------------:|:---------------------------:|:----------------------------:|:--------------------------:|:-----------------------:|:------------:|:--------------------:|:---------------------:|:--------------------:|:---------------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:| | 2.6535 | 1.5385 | 20 | 2.8741 | 0.0846 | 0.1365 | 0.6009 | nan | 0.2219 | 0.9273 | 0.0 | 0.0017 | 0.0042 | nan | 0.0000 | 0.0014 | 0.0 | 0.8438 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7341 | 0.0 | 0.0015 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9709 | 0.1359 | 0.5242 | 0.0 | 0.0000 | 0.0000 | 0.0 | 0.0 | 0.1852 | 0.5896 | 0.0 | 0.0017 | 0.0041 | 0.0 | 0.0000 | 0.0013 | 0.0 | 0.4925 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4757 | 
0.0 | 0.0015 | 0.0018 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5807 | 0.1127 | 0.5151 | 0.0 | 0.0000 | 0.0000 | 0.0 | | 2.2209 | 3.0769 | 40 | 2.1895 | 0.1089 | 0.1592 | 0.6529 | nan | 0.6642 | 0.9009 | 0.0 | 0.0020 | 0.0002 | nan | 0.0 | 0.0 | 0.0 | 0.8372 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8318 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9586 | 0.2673 | 0.6333 | 0.0000 | 0.0 | 0.0 | 0.0 | nan | 0.3818 | 0.6832 | 0.0 | 0.0020 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5601 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4918 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6325 | 0.2290 | 0.6137 | 0.0000 | 0.0 | 0.0 | 0.0 | | 1.9209 | 4.6154 | 60 | 1.9240 | 0.1295 | 0.1779 | 0.6803 | nan | 0.7491 | 0.8970 | 0.0 | 0.0037 | 0.0003 | nan | 0.0 | 0.0001 | 0.0 | 0.8615 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8579 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9436 | 0.5907 | 0.7886 | 0.0003 | 0.0 | 0.0 | 0.0 | nan | 0.3990 | 0.7030 | 0.0 | 0.0037 | 0.0003 | nan | 0.0 | 0.0001 | 0.0 | 0.5788 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5252 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7042 | 0.4845 | 0.7438 | 0.0003 | 0.0 | 0.0 | 0.0 | | 1.9014 | 6.1538 | 80 | 1.8370 | 0.1346 | 0.1829 | 0.6899 | nan | 0.7851 | 0.8979 | 0.0 | 0.0146 | 0.0006 | nan | 0.0 | 0.0000 | 0.0 | 0.8365 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8694 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9449 | 0.6680 | 0.8361 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.4097 | 0.7159 | 0.0 | 0.0145 | 0.0006 | nan | 0.0 | 0.0000 | 0.0 | 0.6037 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5376 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7130 | 0.5332 | 0.7782 | 0.0001 | 0.0 | 0.0 | 0.0 | | 1.8127 | 7.6923 | 100 | 1.8299 | 0.1367 | 0.1860 | 0.6943 | nan | 0.7986 | 0.8984 | 0.0 | 0.0233 | 0.0008 | nan | 0.0 | 0.0 | 0.0 | 0.8604 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8665 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9388 | 0.7081 | 0.8565 | 0.0001 | 0.0 | 0.0 | 0.0 | nan | 0.4128 | 0.7214 | 0.0 | 0.0233 | 0.0008 | nan | 0.0 | 0.0 | 0.0 | 0.6003 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5461 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7232 | 0.5549 | 0.7907 | 0.0001 | 0.0 | 0.0 | 0.0 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.3.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
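With 35 classes, a raw argmax map is hard to read; a common way to inspect predictions from a sidewalk model like this is to color each class id with a fixed palette. A sketch (the palette is arbitrary, not the one used by the authors, and `pred_seg` is a synthetic stand-in for a real model output):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
palette = rng.integers(0, 256, size=(35, 3), dtype=np.uint8)  # arbitrary fixed colors

# synthetic (H, W) class-id map standing in for a real prediction
pred_seg = rng.integers(0, 35, size=(256, 256))

color_seg = palette[pred_seg]  # (H, W, 3) RGB visualization
plt.imshow(color_seg)
plt.axis("off")
plt.show()
```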
[ "unlabeled", "flat-road", "flat-sidewalk", "flat-crosswalk", "flat-cyclinglane", "flat-parkingdriveway", "flat-railtrack", "flat-curb", "human-person", "human-rider", "vehicle-car", "vehicle-truck", "vehicle-bus", "vehicle-tramtrain", "vehicle-motorcycle", "vehicle-bicycle", "vehicle-caravan", "vehicle-cartrailer", "construction-building", "construction-door", "construction-wall", "construction-fenceguardrail", "construction-bridge", "construction-tunnel", "construction-stairs", "object-pole", "object-trafficsign", "object-trafficlight", "nature-vegetation", "nature-terrain", "sky", "void-ground", "void-dynamic", "void-static", "void-unclear" ]
leftattention/segformer-b4-wall
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b4-wall This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.1537 - Mean Accuracy: 0.9448 - Mean Iou: 0.8993 - Overall Accuracy: 0.9558 - Per Category Accuracy: [0.9648476610683054, 0.9680509025433003, 0.9015647356112896, nan] - Per Category Iou: [0.9294668192886654, 0.9344825387850888, 0.8340281823830938, nan] ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Accuracy | Mean Iou | Overall Accuracy | Per Category Accuracy | Per Category Iou | |:-------------:|:-------:|:----:|:---------------:|:-------------:|:--------:|:----------------:|:-----------------------------------------------------------------:|:-----------------------------------------------------------------:| | 0.1398 | 5.3476 | 1000 | 0.1477 | 0.9424 | 0.8733 | 0.9420 | [0.9375947027923643, 0.962438818648652, 0.9270677962243152, nan] | [0.9071928258269675, 0.9154732958813474, 0.7971633247503161, nan] | | 0.1114 | 10.6952 | 2000 | 0.1329 | 0.9426 | 0.8878 | 0.9498 | [0.9551513266050631, 0.9606741248023447, 0.9120448217426163, nan] | [0.9197608920879746, 0.9255854097692368, 0.818153830444766, nan] | | 0.0683 | 16.0428 | 3000 | 0.1353 | 0.9473 | 0.8921 | 0.9516 | [0.9527839457434386, 0.9691455504455139, 0.9198476394516605, nan] | [0.922537499674425, 0.926305870761282, 0.8273726843249476, nan] | | 0.0753 | 21.3904 | 4000 | 0.1311 | 0.9437 | 0.8959 | 0.9540 | [0.9633835386385788, 0.9611760655179852, 0.9066569940696604, nan] | [0.9267602358926313, 0.9312805978213234, 0.8297698871401628, nan] | | 0.0505 | 26.7380 | 5000 | 0.1397 | 0.9442 | 0.8971 | 0.9545 | [0.9627544499461427, 0.967327419780526, 0.9024453947068249, nan] | [0.9272910775593762, 0.9304849186604474, 0.8333807013974415, nan] | | 0.0427 | 32.0856 | 6000 | 0.1414 | 0.9455 | 0.8992 | 0.9555 | [0.9640187847053339, 0.9652081246861538, 0.9074073950598316, nan] | [0.9289147168722637, 0.9321577805497577, 0.8366507705917902, nan] | | 0.0556 | 37.4332 | 7000 | 0.1477 | 0.9452 | 0.8984 | 0.9552 | [0.9629165900233977, 0.9697602413261539, 0.9029026554269718, nan] | [0.9285106797857617, 0.9331322728249959, 0.833620894806762, nan] | | 0.0424 | 42.7807 | 8000 | 0.1484 | 0.9439 | 0.8990 | 0.9557 | [0.9653151526182964, 0.96949089540134, 0.8967977175922358, nan] | [0.9292691886525306, 0.9343666443212755, 0.83323737535253, nan] | | 0.053 | 48.1283 | 9000 | 0.1537 | 0.9448 | 0.8993 | 0.9558 | [0.9648476610683054, 0.9680509025433003, 0.9015647356112896, nan] | [0.9294668192886654, 0.9344825387850888, 0.8340281823830938, nan] | ### Framework versions - Transformers 4.40.2 - Pytorch 2.3.0+cu121 - Datasets 2.17.0 - Tokenizers 0.19.1
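To confirm which of the labels listed below each output channel corresponds to, the checkpoint's config can be inspected directly (assuming the `id2label` mapping was saved with the model, as the Trainer normally does):

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("leftattention/segformer-b4-wall")
print(config.num_labels)  # expected: 4
print(config.id2label)    # expected to match the label list below
```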
[ "background", "wall", "ceiling", "floor" ]
Hasano20/segformer_clean-set1_mit-b5_rgb
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # SegFormer_Clean_Set1_95images_mit-b5_Grayscale This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the Hasano20/Clean_Set1_95images dataset. It achieves the following results on the evaluation set: - Train Loss: 0.0103 - Val Loss: 0.0229 - Mean Iou: 0.9729 - Mean Accuracy: 0.9859 - Overall Accuracy: 0.9928 - Accuracy Background: 0.9972 - Accuracy Melt: 0.9669 - Accuracy Substrate: 0.9937 - Iou Background: 0.9944 - Iou Melt: 0.9370 - Iou Substrate: 0.9871 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 100 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:| | 0.1375 | 5.5556 | 50 | 0.1577 | 0.7820 | 0.8338 | 0.9411 | 0.9857 | 0.5352 | 0.9806 | 0.9746 | 0.4754 | 0.8959 | | 0.0403 | 11.1111 | 100 | 0.1948 | 0.7535 | 0.7960 | 0.9378 | 0.9893 | 0.4011 | 0.9977 | 0.9826 | 0.3954 | 0.8825 | | 0.0291 | 16.6667 | 150 | 0.0484 | 0.9337 | 0.9479 | 0.9832 | 0.9969 | 0.8495 | 0.9973 | 0.9884 | 0.8414 | 0.9712 | | 0.0114 | 22.2222 | 200 | 0.0273 | 0.9634 | 0.9808 | 0.9903 | 0.9930 | 0.9544 | 0.9950 | 0.9917 | 0.9149 | 0.9838 | | 0.0138 | 27.7778 | 250 | 0.0289 | 0.9655 | 0.9782 | 0.9910 | 0.9966 | 0.9423 | 0.9956 | 0.9941 | 0.9190 | 0.9836 | | 0.0072 | 33.3333 | 300 | 0.0257 | 0.9689 | 0.9855 | 0.9918 | 0.9975 | 0.9682 | 0.9908 | 0.9945 | 0.9276 | 0.9847 | | 0.007 | 38.8889 | 350 | 0.0234 | 0.9722 | 0.9862 | 0.9926 | 0.9968 | 0.9684 | 0.9934 | 0.9944 | 0.9354 | 0.9867 | | 0.0063 | 44.4444 | 400 | 0.0232 | 0.9727 | 0.9866 | 0.9927 | 0.9971 | 0.9696 | 0.9931 | 0.9945 | 0.9366 | 0.9870 | | 0.0103 | 50.0 | 450 | 0.0229 | 0.9729 | 0.9859 | 0.9928 | 0.9972 | 0.9669 | 0.9937 | 0.9944 | 0.9370 | 0.9871 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.0.1+cu117 - Datasets 2.19.2 - Tokenizers 0.19.1
[ "background", "melt", "substrate" ]
Hasano20/segformer_mixed-set2-788images_mit-b5_RGB_15Epochs
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # SegFormer_Mixed_Set2_788images_mit-b5_RGB This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the Hasano20/Mixed_Set2_788images dataset. It achieves the following results on the evaluation set: - Train-Loss: 0.0099 - Loss: 0.0150 - Mean Iou: 0.9788 - Mean Accuracy: 0.9887 - Overall Accuracy: 0.9948 - Accuracy Background: 0.9958 - Accuracy Melt: 0.9735 - Accuracy Substrate: 0.9969 - Iou Background: 0.9926 - Iou Melt: 0.9509 - Iou Substrate: 0.9927 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 100 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:| | 0.1619 | 0.7042 | 50 | 0.1799 | 0.7782 | 0.8306 | 0.9444 | 0.9902 | 0.5371 | 0.9645 | 0.9436 | 0.4720 | 0.9192 | | 0.062 | 1.4085 | 100 | 0.1065 | 0.8361 | 0.8630 | 0.9638 | 0.9833 | 0.6084 | 0.9972 | 0.9720 | 0.5922 | 0.9441 | | 0.1757 | 2.1127 | 150 | 0.1157 | 0.8551 | 0.8896 | 0.9617 | 0.9803 | 0.7065 | 0.9820 | 0.9484 | 0.6731 | 0.9438 | | 0.0872 | 2.8169 | 200 | 0.0446 | 0.9302 | 0.9539 | 0.9844 | 0.9938 | 0.8760 | 0.9920 | 0.9846 | 0.8282 | 0.9777 | | 0.0336 | 3.5211 | 250 | 0.0338 | 0.9469 | 0.9751 | 0.9877 | 0.9913 | 0.9431 | 0.9910 | 0.9857 | 0.8719 | 0.9831 | | 0.0417 | 4.2254 | 300 | 0.0488 | 0.9281 | 0.9820 | 0.9830 | 0.9941 | 0.9765 | 0.9753 | 0.9877 | 0.8233 | 0.9732 | | 0.0273 | 4.9296 | 350 | 0.0295 | 0.9516 | 0.9628 | 0.9892 | 0.9952 | 0.8960 | 0.9973 | 0.9895 | 0.8819 | 0.9835 | | 0.0249 | 5.6338 | 400 | 0.0228 | 0.9627 | 0.9807 | 0.9913 | 0.9916 | 0.9544 | 0.9960 | 0.9890 | 0.9112 | 0.9879 | | 0.0247 | 6.3380 | 450 | 0.0234 | 0.9642 | 0.9886 | 0.9915 | 0.9919 | 0.9814 | 0.9925 | 0.9894 | 0.9151 | 0.9881 | | 0.0219 | 7.0423 | 500 | 0.0220 | 0.9656 | 0.9768 | 0.9920 | 0.9943 | 0.9386 | 0.9975 | 0.9908 | 0.9178 | 0.9882 | | 0.0172 | 7.7465 | 550 | 0.0206 | 0.9672 | 0.9888 | 0.9923 | 0.9951 | 0.9792 | 0.9919 | 0.9913 | 0.9215 | 0.9888 | | 0.018 | 8.4507 | 600 | 0.0169 | 0.9747 | 0.9859 | 0.9937 | 0.9944 | 0.9665 | 0.9969 | 0.9910 | 0.9420 | 0.9911 | | 0.0152 | 9.1549 | 650 | 0.0180 | 0.9726 | 0.9856 | 0.9932 | 0.9968 | 0.9659 | 0.9942 | 0.9909 | 0.9366 | 0.9902 | | 0.016 | 9.8592 | 700 | 0.0180 | 0.9729 | 0.9877 | 0.9936 | 0.9955 | 0.9726 | 0.9949 | 0.9917 | 0.9360 | 0.9909 | | 0.0132 | 10.5634 | 750 | 0.0169 | 0.9746 | 0.9872 | 0.9938 | 0.9944 | 0.9708 | 0.9965 | 0.9914 | 0.9410 | 0.9913 | | 0.0115 | 11.2676 | 800 | 0.0156 | 0.9761 | 0.9898 | 0.9941 | 0.9952 | 0.9789 | 0.9954 | 0.9920 | 0.9446 | 0.9917 | | 0.0143 | 11.9718 | 850 | 0.0155 | 0.9765 | 0.9895 | 0.9943 | 0.9962 | 0.9772 | 0.9952 | 0.9923 | 0.9452 | 0.9920 
| | 0.0106 | 12.6761 | 900 | 0.0146 | 0.9778 | 0.9898 | 0.9946 | 0.9959 | 0.9777 | 0.9959 | 0.9924 | 0.9485 | 0.9925 | | 0.0106 | 13.3803 | 950 | 0.0146 | 0.9780 | 0.9888 | 0.9947 | 0.9967 | 0.9736 | 0.9959 | 0.9923 | 0.9490 | 0.9928 | | 0.0068 | 14.0845 | 1000 | 0.0147 | 0.9784 | 0.9883 | 0.9947 | 0.9966 | 0.9718 | 0.9964 | 0.9924 | 0.9501 | 0.9928 | | 0.0115 | 14.7887 | 1050 | 0.0163 | 0.9759 | 0.9901 | 0.9942 | 0.9958 | 0.9795 | 0.9950 | 0.9925 | 0.9436 | 0.9917 | | 0.0099 | 15.4930 | 1100 | 0.0150 | 0.9788 | 0.9887 | 0.9948 | 0.9958 | 0.9735 | 0.9969 | 0.9926 | 0.9509 | 0.9927 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.0.1+cu117 - Datasets 2.19.2 - Tokenizers 0.19.1
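The hyperparameter list above maps directly onto `transformers.TrainingArguments`. A sketch of an equivalent configuration (the authors' actual training script is not included in the card; `output_dir` is hypothetical, and the stated Adam betas/epsilon match the library default optimizer):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="segformer-mixed-set2",  # hypothetical
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    warmup_steps=100,
    num_train_epochs=50,
)
```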
[ "background", "melt", "substrate" ]
Bramwel/segformer-b0-finetuned-segments-sidewalk-2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-finetuned-segments-sidewalk-2 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset. It achieves the following results on the evaluation set: - Loss: 1.2030 - Mean Iou: 0.1619 - Mean Accuracy: 0.2092 - Overall Accuracy: 0.7485 - Accuracy Unlabeled: nan - Accuracy Flat-road: 0.8436 - Accuracy Flat-sidewalk: 0.9312 - Accuracy Flat-crosswalk: 0.0 - Accuracy Flat-cyclinglane: 0.4507 - Accuracy Flat-parkingdriveway: 0.0198 - Accuracy Flat-railtrack: nan - Accuracy Flat-curb: 0.0 - Accuracy Human-person: 0.0 - Accuracy Human-rider: 0.0 - Accuracy Vehicle-car: 0.9019 - Accuracy Vehicle-truck: 0.0 - Accuracy Vehicle-bus: 0.0 - Accuracy Vehicle-tramtrain: 0.0 - Accuracy Vehicle-motorcycle: 0.0 - Accuracy Vehicle-bicycle: 0.0 - Accuracy Vehicle-caravan: 0.0 - Accuracy Vehicle-cartrailer: 0.0 - Accuracy Construction-building: 0.8988 - Accuracy Construction-door: 0.0 - Accuracy Construction-wall: 0.0004 - Accuracy Construction-fenceguardrail: 0.0 - Accuracy Construction-bridge: 0.0 - Accuracy Construction-tunnel: nan - Accuracy Construction-stairs: 0.0 - Accuracy Object-pole: 0.0 - Accuracy Object-trafficsign: 0.0 - Accuracy Object-trafficlight: 0.0 - Accuracy Nature-vegetation: 0.9346 - Accuracy Nature-terrain: 0.7865 - Accuracy Sky: 0.9277 - Accuracy Void-ground: 0.0 - Accuracy Void-dynamic: 0.0 - Accuracy Void-static: 0.0 - Accuracy Void-unclear: 0.0 - Iou Unlabeled: nan - Iou Flat-road: 0.5666 - Iou Flat-sidewalk: 0.7709 - Iou Flat-crosswalk: 0.0 - Iou Flat-cyclinglane: 0.4018 - Iou Flat-parkingdriveway: 0.0192 - Iou Flat-railtrack: nan - Iou Flat-curb: 0.0 - Iou Human-person: 0.0 - Iou Human-rider: 0.0 - Iou Vehicle-car: 0.6148 - Iou Vehicle-truck: 0.0 - Iou Vehicle-bus: 0.0 - Iou Vehicle-tramtrain: 0.0 - Iou Vehicle-motorcycle: 0.0 - Iou Vehicle-bicycle: 0.0 - Iou Vehicle-caravan: 0.0 - Iou Vehicle-cartrailer: 0.0 - Iou Construction-building: 0.5632 - Iou Construction-door: 0.0 - Iou Construction-wall: 0.0004 - Iou Construction-fenceguardrail: 0.0 - Iou Construction-bridge: 0.0 - Iou Construction-tunnel: nan - Iou Construction-stairs: 0.0 - Iou Object-pole: 0.0 - Iou Object-trafficsign: 0.0 - Iou Object-trafficlight: 0.0 - Iou Nature-vegetation: 0.7501 - Iou Nature-terrain: 0.6356 - Iou Sky: 0.8596 - Iou Void-ground: 0.0 - Iou Void-dynamic: 0.0 - Iou Void-static: 0.0 - Iou Void-unclear: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Flat-road | Accuracy Flat-sidewalk | Accuracy Flat-crosswalk | Accuracy Flat-cyclinglane | Accuracy Flat-parkingdriveway | Accuracy Flat-railtrack | Accuracy Flat-curb | Accuracy Human-person | Accuracy Human-rider | Accuracy Vehicle-car | Accuracy Vehicle-truck | Accuracy Vehicle-bus | Accuracy Vehicle-tramtrain | Accuracy Vehicle-motorcycle | 
Accuracy Vehicle-bicycle | Accuracy Vehicle-caravan | Accuracy Vehicle-cartrailer | Accuracy Construction-building | Accuracy Construction-door | Accuracy Construction-wall | Accuracy Construction-fenceguardrail | Accuracy Construction-bridge | Accuracy Construction-tunnel | Accuracy Construction-stairs | Accuracy Object-pole | Accuracy Object-trafficsign | Accuracy Object-trafficlight | Accuracy Nature-vegetation | Accuracy Nature-terrain | Accuracy Sky | Accuracy Void-ground | Accuracy Void-dynamic | Accuracy Void-static | Accuracy Void-unclear | Iou Unlabeled | Iou Flat-road | Iou Flat-sidewalk | Iou Flat-crosswalk | Iou Flat-cyclinglane | Iou Flat-parkingdriveway | Iou Flat-railtrack | Iou Flat-curb | Iou Human-person | Iou Human-rider | Iou Vehicle-car | Iou Vehicle-truck | Iou Vehicle-bus | Iou Vehicle-tramtrain | Iou Vehicle-motorcycle | Iou Vehicle-bicycle | Iou Vehicle-caravan | Iou Vehicle-cartrailer | Iou Construction-building | Iou Construction-door | Iou Construction-wall | Iou Construction-fenceguardrail | Iou Construction-bridge | Iou Construction-tunnel | Iou Construction-stairs | Iou Object-pole | Iou Object-trafficsign | Iou Object-trafficlight | Iou Nature-vegetation | Iou Nature-terrain | Iou Sky | Iou Void-ground | Iou Void-dynamic | Iou Void-static | Iou Void-unclear | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------------:|:----------------------:|:-----------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:------------------:|:---------------------:|:--------------------:|:--------------------:|:----------------------:|:--------------------:|:--------------------------:|:---------------------------:|:------------------------:|:------------------------:|:---------------------------:|:------------------------------:|:--------------------------:|:--------------------------:|:------------------------------------:|:----------------------------:|:----------------------------:|:----------------------------:|:--------------------:|:---------------------------:|:----------------------------:|:--------------------------:|:-----------------------:|:------------:|:--------------------:|:---------------------:|:--------------------:|:---------------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:| | 2.3674 | 0.5 | 100 | 1.6720 | 0.1214 | 0.1717 | 0.6859 | nan | 0.8443 | 0.9140 | 0.0 | 0.0013 | 0.0016 | nan | 0.0000 | 0.0 | 0.0 | 0.9316 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8364 | 0.0 | 0.0003 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9363 | 0.1751 | 0.8520 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4679 | 0.7424 | 0.0 | 0.0013 | 0.0016 | nan | 0.0000 | 0.0 | 0.0 | 0.4884 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5352 | 0.0 | 0.0003 | 0.0 
| 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6734 | 0.1662 | 0.8072 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.6248 | 1.0 | 200 | 1.3491 | 0.1433 | 0.1884 | 0.7207 | nan | 0.8164 | 0.9497 | 0.0 | 0.1026 | 0.0029 | nan | 0.0 | 0.0 | 0.0 | 0.8847 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8799 | 0.0 | 0.0003 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9487 | 0.5417 | 0.9021 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5258 | 0.7439 | 0.0 | 0.1022 | 0.0029 | nan | 0.0 | 0.0 | 0.0 | 0.6004 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5538 | 0.0 | 0.0003 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7135 | 0.4965 | 0.8474 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.3902 | 1.5 | 300 | 1.2406 | 0.1583 | 0.2052 | 0.7431 | nan | 0.8351 | 0.9301 | 0.0 | 0.4278 | 0.0119 | nan | 0.0 | 0.0 | 0.0 | 0.8945 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8936 | 0.0 | 0.0002 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9464 | 0.6912 | 0.9360 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5562 | 0.7694 | 0.0 | 0.3883 | 0.0117 | nan | 0.0 | 0.0 | 0.0 | 0.5991 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5557 | 0.0 | 0.0002 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7361 | 0.5997 | 0.8479 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.2893 | 2.0 | 400 | 1.2030 | 0.1619 | 0.2092 | 0.7485 | nan | 0.8436 | 0.9312 | 0.0 | 0.4507 | 0.0198 | nan | 0.0 | 0.0 | 0.0 | 0.9019 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8988 | 0.0 | 0.0004 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9346 | 0.7865 | 0.9277 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5666 | 0.7709 | 0.0 | 0.4018 | 0.0192 | nan | 0.0 | 0.0 | 0.0 | 0.6148 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5632 | 0.0 | 0.0004 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7501 | 0.6356 | 0.8596 | 0.0 | 0.0 | 0.0 | 0.0 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.0+rocm5.6 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "unlabeled", "flat-road", "flat-sidewalk", "flat-crosswalk", "flat-cyclinglane", "flat-parkingdriveway", "flat-railtrack", "flat-curb", "human-person", "human-rider", "vehicle-car", "vehicle-truck", "vehicle-bus", "vehicle-tramtrain", "vehicle-motorcycle", "vehicle-bicycle", "vehicle-caravan", "vehicle-cartrailer", "construction-building", "construction-door", "construction-wall", "construction-fenceguardrail", "construction-bridge", "construction-tunnel", "construction-stairs", "object-pole", "object-trafficsign", "object-trafficlight", "nature-vegetation", "nature-terrain", "sky", "void-ground", "void-dynamic", "void-static", "void-unclear" ]
Domeandreimno/yolov8-segmentation
# YOLOv8 Segmentation Model This repository contains the YOLOv8 segmentation model and custom weights trained on a specific dataset. ## Included files - `yolov8n-seg.pt`: Pre-trained YOLOv8 segmentation model. - `best.pt`: Custom weights trained on a specific dataset to improve segmentation performance. ## Usage These files can be used with CVAT for automatic image segmentation. Follow the [CVAT](https://cvat.ai) instructions to upload and use the model.
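A minimal sketch of running the custom weights locally with the `ultralytics` package (the image path is hypothetical; CVAT has its own deployment flow, so this shows local inference only):

```python
from ultralytics import YOLO

model = YOLO("best.pt")       # the custom weights shipped in this repo
results = model("image.jpg")  # hypothetical input image

masks = results[0].masks  # segmentation masks, if any objects were found
results[0].show()         # quick visual check of the predictions
```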
[ "architrave", "doojam" ]
heroza/segformer-finetuned-biofilm_MRCNNv1
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-finetuned-biofilm_MRCNNv1 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the heroza/biofilm_MRCNNv1_validation dataset. It achieves the following results on the evaluation set: - eval_loss: 0.7031 - eval_mean_iou: 0.0 - eval_mean_accuracy: nan - eval_overall_accuracy: nan - eval_accuracy_background: nan - eval_accuracy_biofilm: nan - eval_iou_background: 0.0 - eval_iou_biofilm: 0.0 - eval_runtime: 144.7654 - eval_samples_per_second: 8.794 - eval_steps_per_second: 1.105 - step: 0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: polynomial - training_steps: 10000 ### Framework versions - Transformers 4.38.0.dev0 - Pytorch 2.0.0+cu117 - Datasets 2.14.4 - Tokenizers 0.15.1
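The evaluation above reports a mean IoU of 0.0 with `nan` accuracies, which suggests the predictions and references did not overlap as expected. A quick sanity check is to look at which class ids the model actually emits (repo id taken from the header above; the sample file name is hypothetical):

```python
import numpy as np
import torch
from transformers import SegformerImageProcessor, AutoModelForSemanticSegmentation
from PIL import Image

repo = "heroza/segformer-finetuned-biofilm_MRCNNv1"
processor = SegformerImageProcessor.from_pretrained(repo)
model = AutoModelForSemanticSegmentation.from_pretrained(repo)

image = Image.open("biofilm_sample.png").convert("RGB")  # hypothetical sample
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    pred = model(**inputs).logits.argmax(dim=1)[0].numpy()
print(np.unique(pred, return_counts=True))  # which class ids get predicted
```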
[ "background", "biofilm" ]
Hasano20/SegFormer_mit-b5_Clean-Set3_RGB
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # SegFormer_mit-b5_Clean-Set3_RGB This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0207 - Mean Iou: 0.9744 - Mean Accuracy: 0.9865 - Overall Accuracy: 0.9940 - Accuracy Background: 0.9965 - Accuracy Melt: 0.9672 - Accuracy Substrate: 0.9957 - Iou Background: 0.9938 - Iou Melt: 0.9389 - Iou Substrate: 0.9905 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 200 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:| | 0.3016 | 0.9434 | 50 | 0.2259 | 0.6885 | 0.7339 | 0.9268 | 0.9683 | 0.2451 | 0.9882 | 0.9455 | 0.2365 | 0.8834 | | 0.1267 | 1.8868 | 100 | 0.1062 | 0.8505 | 0.9168 | 0.9620 | 0.9849 | 0.7996 | 0.9660 | 0.9706 | 0.6411 | 0.9398 | | 0.0982 | 2.8302 | 150 | 0.0765 | 0.8725 | 0.9003 | 0.9718 | 0.9905 | 0.7183 | 0.9920 | 0.9803 | 0.6829 | 0.9544 | | 0.0626 | 3.7736 | 200 | 0.0596 | 0.9124 | 0.9496 | 0.9793 | 0.9921 | 0.8731 | 0.9836 | 0.9824 | 0.7879 | 0.9668 | | 0.0601 | 4.7170 | 250 | 0.0776 | 0.8931 | 0.9394 | 0.9733 | 0.9814 | 0.8536 | 0.9834 | 0.9762 | 0.7466 | 0.9566 | | 0.0662 | 5.6604 | 300 | 0.0548 | 0.9176 | 0.9660 | 0.9803 | 0.9919 | 0.9280 | 0.9781 | 0.9875 | 0.7993 | 0.9662 | | 0.0297 | 6.6038 | 350 | 0.0353 | 0.9452 | 0.9791 | 0.9872 | 0.9918 | 0.9581 | 0.9875 | 0.9895 | 0.8670 | 0.9792 | | 0.0197 | 7.5472 | 400 | 0.0422 | 0.9332 | 0.9520 | 0.9853 | 0.9949 | 0.8670 | 0.9940 | 0.9899 | 0.8343 | 0.9753 | | 0.0274 | 8.4906 | 450 | 0.0281 | 0.9589 | 0.9783 | 0.9904 | 0.9944 | 0.9475 | 0.9932 | 0.9913 | 0.9012 | 0.9843 | | 0.0197 | 9.4340 | 500 | 0.0280 | 0.9569 | 0.9792 | 0.9901 | 0.9965 | 0.9507 | 0.9904 | 0.9920 | 0.8950 | 0.9836 | | 0.0185 | 10.3774 | 550 | 0.0230 | 0.9644 | 0.9819 | 0.9918 | 0.9961 | 0.9564 | 0.9931 | 0.9923 | 0.9142 | 0.9867 | | 0.0131 | 11.3208 | 600 | 0.0248 | 0.9663 | 0.9788 | 0.9922 | 0.9951 | 0.9449 | 0.9964 | 0.9922 | 0.9192 | 0.9874 | | 0.0123 | 12.2642 | 650 | 0.0229 | 0.9682 | 0.9784 | 0.9926 | 0.9957 | 0.9424 | 0.9972 | 0.9931 | 0.9236 | 0.9879 | | 0.0094 | 13.2075 | 700 | 0.0220 | 0.9673 | 0.9811 | 0.9925 | 0.9962 | 0.9519 | 0.9951 | 0.9930 | 0.9209 | 0.9878 | | 0.0092 | 14.1509 | 750 | 0.0198 | 0.9721 | 0.9845 | 0.9935 | 0.9962 | 0.9617 | 0.9956 | 0.9933 | 0.9334 | 0.9895 | | 0.0119 | 15.0943 | 800 | 0.0210 | 0.9688 | 0.9828 | 0.9928 | 0.9971 | 0.9571 | 0.9943 | 0.9932 | 0.9250 | 0.9883 | | 0.0092 | 16.0377 | 850 | 0.0220 | 0.9688 | 0.9819 | 0.9928 | 0.9959 | 0.9543 | 0.9957 | 0.9929 | 0.9249 | 0.9885 | | 0.0092 | 16.9811 | 900 | 0.0186 | 0.9718 | 
0.9859 | 0.9934 | 0.9965 | 0.9666 | 0.9947 | 0.9936 | 0.9324 | 0.9894 | | 0.0069 | 17.9245 | 950 | 0.0201 | 0.9725 | 0.9831 | 0.9936 | 0.9963 | 0.9564 | 0.9967 | 0.9937 | 0.9341 | 0.9898 | | 0.011 | 18.8679 | 1000 | 0.0190 | 0.9742 | 0.9851 | 0.9939 | 0.9962 | 0.9628 | 0.9964 | 0.9937 | 0.9388 | 0.9903 | | 0.009 | 19.8113 | 1050 | 0.0219 | 0.9714 | 0.9855 | 0.9933 | 0.9972 | 0.9652 | 0.9940 | 0.9936 | 0.9314 | 0.9891 | | 0.0086 | 20.7547 | 1100 | 0.0199 | 0.9737 | 0.9872 | 0.9938 | 0.9961 | 0.9702 | 0.9953 | 0.9937 | 0.9373 | 0.9901 | | 0.0086 | 21.6981 | 1150 | 0.0206 | 0.9737 | 0.9850 | 0.9938 | 0.9957 | 0.9625 | 0.9967 | 0.9936 | 0.9372 | 0.9902 | | 0.0052 | 22.6415 | 1200 | 0.0205 | 0.9737 | 0.9866 | 0.9939 | 0.9960 | 0.9682 | 0.9957 | 0.9936 | 0.9372 | 0.9903 | | 0.0079 | 23.5849 | 1250 | 0.0205 | 0.9745 | 0.9861 | 0.9940 | 0.9962 | 0.9658 | 0.9962 | 0.9937 | 0.9393 | 0.9905 | | 0.0057 | 24.5283 | 1300 | 0.0210 | 0.9746 | 0.9849 | 0.9940 | 0.9961 | 0.9618 | 0.9968 | 0.9938 | 0.9397 | 0.9904 | | 0.007 | 25.4717 | 1350 | 0.0212 | 0.9735 | 0.9858 | 0.9938 | 0.9963 | 0.9652 | 0.9957 | 0.9936 | 0.9369 | 0.9901 | | 0.0059 | 26.4151 | 1400 | 0.0207 | 0.9744 | 0.9865 | 0.9940 | 0.9965 | 0.9672 | 0.9957 | 0.9938 | 0.9389 | 0.9905 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.0.1+cu117 - Datasets 2.19.2 - Tokenizers 0.19.1
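The gap between Mean Accuracy and Overall Accuracy in these tables comes from their definitions: mean accuracy averages per-class recall, while overall accuracy is computed over all pixels, so the large background and substrate regions dominate it. A small sketch of both (synthetic arrays, not data from this card):

```python
import numpy as np

def accuracies(pred: np.ndarray, ref: np.ndarray, num_labels: int = 3):
    # mean accuracy: average of per-class recall over classes present in ref
    per_class = [(pred[ref == c] == c).mean()
                 for c in range(num_labels) if (ref == c).any()]
    # overall accuracy: fraction of all pixels classified correctly
    return float(np.mean(per_class)), float((pred == ref).mean())

# synthetic maps: class 1 ("melt") is small, so errors on it move mean
# accuracy far more than overall accuracy
ref = np.zeros((128, 128), dtype=np.int64)
ref[:64] = 2
ref[60:68, 60:68] = 1
pred = ref.copy()
pred[60:68, 60:64] = 0  # misclassify half of the melt pixels
print(accuracies(pred, ref))  # mean accuracy drops to ~0.83, overall stays ~0.998
```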
[ "background", "melt", "substrate" ]
Hasano20/BEiT_beit-base-finetuned-ade-640-640_Clean-Set1_RGB
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # BEiT_beit-base-finetuned-ade-640-640_Clean-Set1_RGB This model is a fine-tuned version of [microsoft/beit-base-finetuned-ade-640-640](https://huggingface.co/microsoft/beit-base-finetuned-ade-640-640) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0603 - Mean Iou: 0.9672 - Mean Accuracy: 0.9774 - Overall Accuracy: 0.9930 - Accuracy Background: 0.9961 - Accuracy Melt: 0.9392 - Accuracy Substrate: 0.9971 - Iou Background: 0.9929 - Iou Melt: 0.9207 - Iou Substrate: 0.9879 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 200 - num_epochs: 20 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:| | 0.3924 | 5.5556 | 50 | 0.3038 | 0.9022 | 0.9499 | 0.9809 | 0.9854 | 0.8738 | 0.9906 | 0.9853 | 0.7493 | 0.9719 | | 0.0857 | 11.1111 | 100 | 0.0788 | 0.9656 | 0.9771 | 0.9931 | 0.9972 | 0.9377 | 0.9964 | 0.9939 | 0.9146 | 0.9883 | | 0.0816 | 16.6667 | 150 | 0.0603 | 0.9672 | 0.9774 | 0.9930 | 0.9961 | 0.9392 | 0.9971 | 0.9929 | 0.9207 | 0.9879 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.0.1+cu117 - Datasets 2.19.2 - Tokenizers 0.19.1
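This is a BEiT checkpoint rather than a SegFormer one, but the `transformers` inference pattern is the same via the Auto classes. A sketch, assuming the processor config was pushed with the checkpoint (the sample file name is hypothetical):

```python
from transformers import AutoImageProcessor, AutoModelForSemanticSegmentation
from PIL import Image
import torch.nn as nn

repo = "Hasano20/BEiT_beit-base-finetuned-ade-640-640_Clean-Set1_RGB"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForSemanticSegmentation.from_pretrained(repo)

image = Image.open("melt_sample.png").convert("RGB")  # hypothetical input
inputs = processor(images=image, return_tensors="pt")
logits = model(**inputs).logits

# BEiT's segmentation logits are also at reduced resolution; upsample first
upsampled = nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_seg = upsampled.argmax(dim=1)[0]  # 0 = background, 1 = melt, 2 = substrate
```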
[ "background", "melt", "substrate" ]
guohao123/nvidia-mit-b3_finetune_sidewalk-semantic
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # nvidia-mit-b3_finetune_sidewalk-semantic This model is a fine-tuned version of [nvidia/mit-b3](https://huggingface.co/nvidia/mit-b3) on the segments/sidewalk-semantic dataset. It achieves the following results on the evaluation set: - Loss: 0.7270 - Mean Iou: 0.2391 - Mean Accuracy: 0.2831 - Overall Accuracy: 0.8089 - Acc Unlabeled: nan - Acc Flat-road: 0.8462 - Acc Flat-sidewalk: 0.9578 - Acc Flat-crosswalk: 0.0 - Acc Flat-cyclinglane: 0.4756 - Acc Flat-parkingdriveway: 0.2145 - Acc Flat-railtrack: nan - Acc Flat-curb: 0.3585 - Acc Human-person: 0.2813 - Acc Human-rider: 0.0 - Acc Vehicle-car: 0.9514 - Acc Vehicle-truck: 0.0 - Acc Vehicle-bus: 0.0 - Acc Vehicle-tramtrain: 0.0 - Acc Vehicle-motorcycle: 0.0 - Acc Vehicle-bicycle: 0.2160 - Acc Vehicle-caravan: 0.0 - Acc Vehicle-cartrailer: 0.0 - Acc Construction-building: 0.9204 - Acc Construction-door: 0.0 - Acc Construction-wall: 0.2528 - Acc Construction-fenceguardrail: 0.3314 - Acc Construction-bridge: 0.0 - Acc Construction-tunnel: nan - Acc Construction-stairs: 0.0 - Acc Object-pole: 0.3098 - Acc Object-trafficsign: 0.0 - Acc Object-trafficlight: 0.0 - Acc Nature-vegetation: 0.9375 - Acc Nature-terrain: 0.8768 - Acc Sky: 0.9406 - Acc Void-ground: 0.0 - Acc Void-dynamic: 0.0 - Acc Void-static: 0.1895 - Acc Void-unclear: 0.0 - Iou Unlabeled: nan - Iou Flat-road: 0.6618 - Iou Flat-sidewalk: 0.8235 - Iou Flat-crosswalk: 0.0 - Iou Flat-cyclinglane: 0.4373 - Iou Flat-parkingdriveway: 0.1762 - Iou Flat-railtrack: nan - Iou Flat-curb: 0.2784 - Iou Human-person: 0.2378 - Iou Human-rider: 0.0 - Iou Vehicle-car: 0.7833 - Iou Vehicle-truck: 0.0 - Iou Vehicle-bus: 0.0 - Iou Vehicle-tramtrain: 0.0 - Iou Vehicle-motorcycle: 0.0 - Iou Vehicle-bicycle: 0.2063 - Iou Vehicle-caravan: 0.0 - Iou Vehicle-cartrailer: 0.0 - Iou Construction-building: 0.6478 - Iou Construction-door: 0.0 - Iou Construction-wall: 0.2196 - Iou Construction-fenceguardrail: 0.3081 - Iou Construction-bridge: 0.0 - Iou Construction-tunnel: nan - Iou Construction-stairs: 0.0 - Iou Object-pole: 0.2490 - Iou Object-trafficsign: 0.0 - Iou Object-trafficlight: 0.0 - Iou Nature-vegetation: 0.8362 - Iou Nature-terrain: 0.7356 - Iou Sky: 0.8856 - Iou Void-ground: 0.0 - Iou Void-dynamic: 0.0 - Iou Void-static: 0.1633 - Iou Void-unclear: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 3 - eval_batch_size: 3 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Acc Unlabeled | Acc Flat-road | Acc Flat-sidewalk | Acc Flat-crosswalk | Acc Flat-cyclinglane | Acc Flat-parkingdriveway | Acc Flat-railtrack | Acc Flat-curb | Acc Human-person | Acc Human-rider | Acc Vehicle-car | Acc Vehicle-truck | Acc Vehicle-bus | Acc Vehicle-tramtrain | Acc Vehicle-motorcycle | Acc Vehicle-bicycle | Acc Vehicle-caravan | Acc Vehicle-cartrailer | Acc Construction-building | Acc Construction-door | Acc Construction-wall | Acc Construction-fenceguardrail | Acc Construction-bridge | Acc 
Construction-tunnel | Acc Construction-stairs | Acc Object-pole | Acc Object-trafficsign | Acc Object-trafficlight | Acc Nature-vegetation | Acc Nature-terrain | Acc Sky | Acc Void-ground | Acc Void-dynamic | Acc Void-static | Acc Void-unclear | Iou Unlabeled | Iou Flat-road | Iou Flat-sidewalk | Iou Flat-crosswalk | Iou Flat-cyclinglane | Iou Flat-parkingdriveway | Iou Flat-railtrack | Iou Flat-curb | Iou Human-person | Iou Human-rider | Iou Vehicle-car | Iou Vehicle-truck | Iou Vehicle-bus | Iou Vehicle-tramtrain | Iou Vehicle-motorcycle | Iou Vehicle-bicycle | Iou Vehicle-caravan | Iou Vehicle-cartrailer | Iou Construction-building | Iou Construction-door | Iou Construction-wall | Iou Construction-fenceguardrail | Iou Construction-bridge | Iou Construction-tunnel | Iou Construction-stairs | Iou Object-pole | Iou Object-trafficsign | Iou Object-trafficlight | Iou Nature-vegetation | Iou Nature-terrain | Iou Sky | Iou Void-ground | Iou Void-dynamic | Iou Void-static | Iou Void-unclear | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:| | 1.2696 | 0.14 | 33 | 1.3198 | 0.1366 | 0.1799 | 0.7212 | nan | 0.7268 | 0.9591 | 0.0 | 0.0 | 0.0054 | nan | 0.0 | 0.0 | 0.0 | 0.9144 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8191 | 0.0 | 0.0032 | 0.0002 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9455 | 0.5741 | 0.8065 | 0.0 | 0.0 | 0.0014 | 0.0 | nan | 0.5092 | 0.7200 | 0.0 | 0.0 | 0.0052 | nan | 0.0 | 0.0 | 0.0 | 0.7334 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5417 | 0.0 | 0.0032 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7237 | 0.4871 | 0.7824 | 0.0 | 0.0 | 0.0014 | 0.0 | | 1.196 | 0.28 | 66 | 0.9744 | 0.1660 | 0.2092 | 0.7621 | nan | 0.8403 | 0.9446 | 0.0 | 0.2749 | 0.0164 | nan | 0.0000 | 0.0 | 0.0 | 0.9510 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8910 | 0.0 | 0.0108 | 0.0620 | 0.0 | nan | 0.0 | 0.0256 | 0.0 | 0.0 | 0.9337 | 0.8260 | 0.9169 | 0.0 | 0.0 | 0.0018 | 0.0 | nan | 0.5760 | 0.7791 | 0.0 | 0.2646 | 0.0159 | nan | 0.0000 | 0.0 | 0.0 | 0.7093 | 
0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5887 | 0.0 | 0.0107 | 0.0618 | 0.0 | nan | 0.0 | 0.0253 | 0.0 | 0.0 | 0.7845 | 0.6386 | 0.8554 | 0.0 | 0.0 | 0.0018 | 0.0 | | 1.3009 | 0.42 | 99 | 0.8567 | 0.1940 | 0.2355 | 0.7789 | nan | 0.8392 | 0.9534 | 0.0 | 0.3587 | 0.0450 | nan | 0.0226 | 0.0861 | 0.0 | 0.9475 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0085 | 0.0 | 0.0 | 0.9180 | 0.0 | 0.2409 | 0.2594 | 0.0 | nan | 0.0 | 0.1095 | 0.0 | 0.0 | 0.9179 | 0.8459 | 0.9262 | 0.0 | 0.0 | 0.0561 | 0.0 | nan | 0.5901 | 0.7918 | 0.0 | 0.3215 | 0.0422 | nan | 0.0221 | 0.0815 | 0.0 | 0.7401 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0085 | 0.0 | 0.0 | 0.6252 | 0.0 | 0.2113 | 0.2513 | 0.0 | nan | 0.0 | 0.1028 | 0.0 | 0.0 | 0.8096 | 0.6881 | 0.8659 | 0.0 | 0.0 | 0.0543 | 0.0 | | 0.8802 | 0.56 | 132 | 0.8121 | 0.2111 | 0.2533 | 0.7870 | nan | 0.7684 | 0.9678 | 0.0 | 0.5098 | 0.0480 | nan | 0.1309 | 0.1147 | 0.0 | 0.9545 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1207 | 0.0 | 0.0 | 0.8997 | 0.0 | 0.3134 | 0.2572 | 0.0 | nan | 0.0 | 0.2145 | 0.0 | 0.0 | 0.9277 | 0.8730 | 0.9489 | 0.0 | 0.0 | 0.0552 | 0.0 | nan | 0.6078 | 0.7799 | 0.0 | 0.4300 | 0.0452 | nan | 0.1190 | 0.1086 | 0.0 | 0.7451 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1171 | 0.0 | 0.0 | 0.6525 | 0.0 | 0.2578 | 0.2467 | 0.0 | nan | 0.0 | 0.1851 | 0.0 | 0.0 | 0.8146 | 0.7176 | 0.8735 | 0.0 | 0.0 | 0.0531 | 0.0 | | 0.6118 | 0.71 | 165 | 0.7667 | 0.2267 | 0.2697 | 0.7976 | nan | 0.8872 | 0.9438 | 0.0 | 0.4101 | 0.1549 | nan | 0.1201 | 0.2090 | 0.0 | 0.9404 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1452 | 0.0 | 0.0 | 0.8688 | 0.0 | 0.3950 | 0.3483 | 0.0 | nan | 0.0 | 0.2586 | 0.0 | 0.0 | 0.9422 | 0.8637 | 0.9486 | 0.0 | 0.0 | 0.1931 | 0.0 | nan | 0.6038 | 0.8210 | 0.0 | 0.3802 | 0.1356 | nan | 0.1094 | 0.1871 | 0.0 | 0.7863 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1413 | 0.0 | 0.0 | 0.6519 | 0.0 | 0.2892 | 0.3200 | 0.0 | nan | 0.0 | 0.2204 | 0.0 | 0.0 | 0.8268 | 0.7340 | 0.8818 | 0.0 | 0.0 | 0.1644 | 0.0 | | 1.3374 | 0.85 | 198 | 0.7325 | 0.2339 | 0.2768 | 0.8064 | nan | 0.8797 | 0.9501 | 0.0 | 0.4559 | 0.2181 | nan | 0.2268 | 0.2519 | 0.0 | 0.9457 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2066 | 0.0 | 0.0 | 0.9112 | 0.0 | 0.2305 | 0.3397 | 0.0 | nan | 0.0 | 0.2854 | 0.0 | 0.0 | 0.9403 | 0.8881 | 0.9445 | 0.0 | 0.0 | 0.1843 | 0.0 | nan | 0.6533 | 0.8263 | 0.0 | 0.4195 | 0.1737 | nan | 0.1954 | 0.2214 | 0.0 | 0.7893 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1979 | 0.0 | 0.0 | 0.6490 | 0.0 | 0.2057 | 0.3186 | 0.0 | nan | 0.0 | 0.2353 | 0.0 | 0.0 | 0.8301 | 0.7274 | 0.8832 | 0.0 | 0.0 | 0.1599 | 0.0 | | 0.9023 | 0.99 | 231 | 0.7270 | 0.2391 | 0.2831 | 0.8089 | nan | 0.8462 | 0.9578 | 0.0 | 0.4756 | 0.2145 | nan | 0.3585 | 0.2813 | 0.0 | 0.9514 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2160 | 0.0 | 0.0 | 0.9204 | 0.0 | 0.2528 | 0.3314 | 0.0 | nan | 0.0 | 0.3098 | 0.0 | 0.0 | 0.9375 | 0.8768 | 0.9406 | 0.0 | 0.0 | 0.1895 | 0.0 | nan | 0.6618 | 0.8235 | 0.0 | 0.4373 | 0.1762 | nan | 0.2784 | 0.2378 | 0.0 | 0.7833 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2063 | 0.0 | 0.0 | 0.6478 | 0.0 | 0.2196 | 0.3081 | 0.0 | nan | 0.0 | 0.2490 | 0.0 | 0.0 | 0.8362 | 0.7356 | 0.8856 | 0.0 | 0.0 | 0.1633 | 0.0 | ### Framework versions - Transformers 4.27.4 - Pytorch 2.0.0+cu117 - Datasets 2.20.0 - Tokenizers 0.13.3
[ "unlabeled", "flat-road", "flat-sidewalk", "flat-crosswalk", "flat-cyclinglane", "flat-parkingdriveway", "flat-railtrack", "flat-curb", "human-person", "human-rider", "vehicle-car", "vehicle-truck", "vehicle-bus", "vehicle-tramtrain", "vehicle-motorcycle", "vehicle-bicycle", "vehicle-caravan", "vehicle-cartrailer", "construction-building", "construction-door", "construction-wall", "construction-fenceguardrail", "construction-bridge", "construction-tunnel", "construction-stairs", "object-pole", "object-trafficsign", "object-trafficlight", "nature-vegetation", "nature-terrain", "sky", "void-ground", "void-dynamic", "void-static", "void-unclear" ]
Hasano20/BEiT_beit-base-finetuned-ade-640-640_Clean-Set3_RGB
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # BEiT_beit-base-finetuned-ade-640-640_Clean-Set3_RGB This model is a fine-tuned version of [microsoft/beit-base-finetuned-ade-640-640](https://huggingface.co/microsoft/beit-base-finetuned-ade-640-640) on an unknown dataset. It achieves the following results on the evaluation set: - Train Loss: 0.0216 - Loss: 0.0336 - Mean Iou: 0.9671 - Mean Accuracy: 0.9806 - Overall Accuracy: 0.9926 - Accuracy Background: 0.9956 - Accuracy Melt: 0.9505 - Accuracy Substrate: 0.9957 - Iou Background: 0.9916 - Iou Melt: 0.9208 - Iou Substrate: 0.9888 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 200 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:| | 0.4042 | 0.9434 | 50 | 0.3272 | 0.8363 | 0.8672 | 0.9671 | 0.9931 | 0.6175 | 0.9911 | 0.9836 | 0.5790 | 0.9463 | | 0.1649 | 1.8868 | 100 | 0.0973 | 0.9371 | 0.9572 | 0.9867 | 0.9959 | 0.8833 | 0.9926 | 0.9881 | 0.8437 | 0.9795 | | 0.1439 | 2.8302 | 150 | 0.0724 | 0.9495 | 0.9800 | 0.9887 | 0.9946 | 0.9575 | 0.9879 | 0.9898 | 0.8770 | 0.9818 | | 0.1275 | 3.7736 | 200 | 0.0656 | 0.9443 | 0.9778 | 0.9877 | 0.9969 | 0.9515 | 0.9850 | 0.9903 | 0.8627 | 0.9799 | | 0.1522 | 4.7170 | 250 | 0.0585 | 0.9567 | 0.9737 | 0.9899 | 0.9971 | 0.9325 | 0.9915 | 0.9887 | 0.8976 | 0.9839 | | 0.1292 | 5.6604 | 300 | 0.0594 | 0.9502 | 0.9748 | 0.9877 | 0.9934 | 0.9418 | 0.9890 | 0.9857 | 0.8850 | 0.9801 | | 0.097 | 6.6038 | 350 | 0.0450 | 0.9634 | 0.9775 | 0.9912 | 0.9949 | 0.9432 | 0.9943 | 0.9883 | 0.9154 | 0.9866 | | 0.1125 | 7.5472 | 400 | 0.0451 | 0.9605 | 0.9757 | 0.9905 | 0.9953 | 0.9384 | 0.9934 | 0.9877 | 0.9080 | 0.9857 | | 0.102 | 8.4906 | 450 | 0.0518 | 0.9531 | 0.9798 | 0.9876 | 0.9921 | 0.9596 | 0.9876 | 0.9824 | 0.8960 | 0.9808 | | 0.0878 | 9.4340 | 500 | 0.0411 | 0.9639 | 0.9820 | 0.9911 | 0.9947 | 0.9592 | 0.9922 | 0.9885 | 0.9172 | 0.9859 | | 0.1198 | 10.3774 | 550 | 0.0679 | 0.9398 | 0.9655 | 0.9821 | 0.9873 | 0.9237 | 0.9855 | 0.9708 | 0.8768 | 0.9719 | | 0.055 | 11.3208 | 600 | 0.0521 | 0.9518 | 0.9791 | 0.9867 | 0.9846 | 0.9610 | 0.9917 | 0.9780 | 0.8966 | 0.9810 | | 0.086 | 12.2642 | 650 | 0.0402 | 0.9631 | 0.9791 | 0.9903 | 0.9920 | 0.9514 | 0.9940 | 0.9861 | 0.9185 | 0.9848 | | 0.058 | 13.2075 | 700 | 0.0455 | 0.9590 | 0.9768 | 0.9892 | 0.9908 | 0.9463 | 0.9934 | 0.9837 | 0.9096 | 0.9836 | | 0.0494 | 14.1509 | 750 | 0.0441 | 0.9588 | 0.9796 | 0.9895 | 0.9926 | 0.9547 | 0.9914 | 0.9842 | 0.9076 | 0.9846 | | 0.0599 | 15.0943 | 800 | 0.0401 | 0.9622 | 0.9787 | 0.9904 | 0.9925 | 0.9496 | 0.9939 | 0.9865 | 0.9149 | 0.9851 | | 0.0422 | 16.0377 | 850 | 0.0393 | 0.9619 | 0.9807 | 0.9906 | 0.9946 | 
0.9556 | 0.9919 | 0.9880 | 0.9123 | 0.9853 | | 0.0454 | 16.9811 | 900 | 0.0429 | 0.9579 | 0.9742 | 0.9897 | 0.9918 | 0.9360 | 0.9948 | 0.9857 | 0.9033 | 0.9846 | | 0.0806 | 17.9245 | 950 | 0.0377 | 0.9640 | 0.9779 | 0.9915 | 0.9928 | 0.9445 | 0.9964 | 0.9892 | 0.9157 | 0.9869 | | 0.0677 | 18.8679 | 1000 | 0.0380 | 0.9602 | 0.9797 | 0.9910 | 0.9941 | 0.9513 | 0.9937 | 0.9882 | 0.9047 | 0.9877 | | 0.036 | 19.8113 | 1050 | 0.0388 | 0.9618 | 0.9799 | 0.9906 | 0.9942 | 0.9529 | 0.9925 | 0.9868 | 0.9127 | 0.9860 | | 0.0424 | 20.7547 | 1100 | 0.0375 | 0.9601 | 0.9753 | 0.9905 | 0.9934 | 0.9376 | 0.9949 | 0.9868 | 0.9071 | 0.9863 | | 0.0274 | 21.6981 | 1150 | 0.0322 | 0.9675 | 0.9795 | 0.9927 | 0.9955 | 0.9464 | 0.9965 | 0.9917 | 0.9218 | 0.9890 | | 0.0622 | 22.6415 | 1200 | 0.0360 | 0.9648 | 0.9798 | 0.9913 | 0.9932 | 0.9512 | 0.9949 | 0.9881 | 0.9197 | 0.9868 | | 0.0296 | 23.5849 | 1250 | 0.0334 | 0.9670 | 0.9823 | 0.9925 | 0.9953 | 0.9567 | 0.9950 | 0.9917 | 0.9207 | 0.9885 | | 0.0222 | 24.5283 | 1300 | 0.0326 | 0.9674 | 0.9823 | 0.9925 | 0.9948 | 0.9569 | 0.9953 | 0.9912 | 0.9222 | 0.9887 | | 0.0719 | 25.4717 | 1350 | 0.0328 | 0.9671 | 0.9832 | 0.9923 | 0.9945 | 0.9603 | 0.9947 | 0.9907 | 0.9223 | 0.9883 | | 0.0197 | 26.4151 | 1400 | 0.0311 | 0.9681 | 0.9817 | 0.9929 | 0.9962 | 0.9537 | 0.9954 | 0.9922 | 0.9230 | 0.9893 | | 0.0223 | 27.3585 | 1450 | 0.0324 | 0.9664 | 0.9811 | 0.9925 | 0.9956 | 0.9527 | 0.9950 | 0.9916 | 0.9191 | 0.9885 | | 0.024 | 28.3019 | 1500 | 0.0340 | 0.9657 | 0.9808 | 0.9920 | 0.9950 | 0.9528 | 0.9947 | 0.9902 | 0.9190 | 0.9880 | | 0.0242 | 29.2453 | 1550 | 0.0325 | 0.9672 | 0.9810 | 0.9926 | 0.9953 | 0.9522 | 0.9957 | 0.9915 | 0.9212 | 0.9888 | | 0.0371 | 30.1887 | 1600 | 0.0315 | 0.9681 | 0.9826 | 0.9928 | 0.9957 | 0.9569 | 0.9952 | 0.9920 | 0.9232 | 0.9891 | | 0.0235 | 31.1321 | 1650 | 0.0370 | 0.9632 | 0.9799 | 0.9911 | 0.9937 | 0.9520 | 0.9941 | 0.9880 | 0.9150 | 0.9868 | | 0.0266 | 32.0755 | 1700 | 0.0335 | 0.9664 | 0.9811 | 0.9925 | 0.9951 | 0.9527 | 0.9954 | 0.9913 | 0.9193 | 0.9887 | | 0.0216 | 33.0189 | 1750 | 0.0344 | 0.9656 | 0.9800 | 0.9921 | 0.9946 | 0.9497 | 0.9956 | 0.9904 | 0.9182 | 0.9883 | | 0.0382 | 33.9623 | 1800 | 0.0319 | 0.9680 | 0.9819 | 0.9929 | 0.9954 | 0.9544 | 0.9959 | 0.9922 | 0.9224 | 0.9893 | | 0.0161 | 34.9057 | 1850 | 0.0336 | 0.9672 | 0.9799 | 0.9927 | 0.9955 | 0.9479 | 0.9963 | 0.9920 | 0.9206 | 0.9890 | | 0.0216 | 35.8491 | 1900 | 0.0336 | 0.9671 | 0.9806 | 0.9926 | 0.9956 | 0.9505 | 0.9957 | 0.9916 | 0.9208 | 0.9888 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.0.1+cu117 - Datasets 2.19.2 - Tokenizers 0.19.1
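The card ships no usage snippet; a minimal inference sketch for this checkpoint might look as follows. The input image path is a placeholder, and the class indices follow the labels listed below.

```python
from transformers import AutoImageProcessor, AutoModelForSemanticSegmentation
from PIL import Image
import torch

ckpt = "Hasano20/BEiT_beit-base-finetuned-ade-640-640_Clean-Set3_RGB"
processor = AutoImageProcessor.from_pretrained(ckpt)
model = AutoModelForSemanticSegmentation.from_pretrained(ckpt)

image = Image.open("sample.png").convert("RGB")  # placeholder input
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Resize the logits back to the input resolution and take the arg-max map.
pred = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]  # (H, W) tensor of class ids: 0=background, 1=melt, 2=substrate
```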
[ "background", "melt", "substrate" ]
Hasano20/SegFormer_mit-b5_Clean-Set3-Grayscale
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # SegFormer_mit-b5_Clean-Set3-Grayscale This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the Clean-Set3-Grayscale dataset. It achieves the following results on the evaluation set: - Train Loss: 0.0053 - Loss: 0.0156 - Mean Iou: 0.9776 - Mean Accuracy: 0.9882 - Overall Accuracy: 0.9952 - Accuracy Background: 0.9974 - Accuracy Melt: 0.9708 - Accuracy Substrate: 0.9963 - Iou Background: 0.9942 - Iou Melt: 0.9458 - Iou Substrate: 0.9927 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 200 - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:| | 0.1206 | 1.8519 | 50 | 0.0898 | 0.8826 | 0.9277 | 0.9727 | 0.9809 | 0.8182 | 0.9840 | 0.9697 | 0.7209 | 0.9571 | | 0.0687 | 3.7037 | 100 | 0.0445 | 0.9291 | 0.9568 | 0.9845 | 0.9920 | 0.8888 | 0.9895 | 0.9833 | 0.8286 | 0.9754 | | 0.0457 | 5.5556 | 150 | 0.0413 | 0.9284 | 0.9428 | 0.9859 | 0.9938 | 0.8381 | 0.9966 | 0.9877 | 0.8204 | 0.9770 | | 0.0281 | 7.4074 | 200 | 0.0240 | 0.9592 | 0.9706 | 0.9914 | 0.9971 | 0.9198 | 0.9949 | 0.9900 | 0.9011 | 0.9865 | | 0.0234 | 9.2593 | 250 | 0.0179 | 0.9672 | 0.9810 | 0.9932 | 0.9960 | 0.9513 | 0.9957 | 0.9926 | 0.9195 | 0.9893 | | 0.0147 | 11.1111 | 300 | 0.0180 | 0.9672 | 0.9785 | 0.9932 | 0.9955 | 0.9429 | 0.9972 | 0.9925 | 0.9197 | 0.9893 | | 0.012 | 12.9630 | 350 | 0.0139 | 0.9748 | 0.9864 | 0.9946 | 0.9967 | 0.9664 | 0.9962 | 0.9936 | 0.9390 | 0.9918 | | 0.0104 | 14.8148 | 400 | 0.0138 | 0.9756 | 0.9890 | 0.9947 | 0.9972 | 0.9748 | 0.9949 | 0.9935 | 0.9413 | 0.9919 | | 0.0094 | 16.6667 | 450 | 0.0136 | 0.9767 | 0.9862 | 0.9950 | 0.9965 | 0.9646 | 0.9974 | 0.9940 | 0.9436 | 0.9924 | | 0.0101 | 18.5185 | 500 | 0.0135 | 0.9767 | 0.9867 | 0.9950 | 0.9974 | 0.9663 | 0.9964 | 0.9940 | 0.9438 | 0.9924 | | 0.0087 | 20.3704 | 550 | 0.0144 | 0.9764 | 0.9887 | 0.9949 | 0.9954 | 0.9736 | 0.9970 | 0.9935 | 0.9435 | 0.9923 | | 0.0078 | 22.2222 | 600 | 0.0145 | 0.9760 | 0.9885 | 0.9949 | 0.9967 | 0.9727 | 0.9960 | 0.9938 | 0.9417 | 0.9924 | | 0.0095 | 24.0741 | 650 | 0.0145 | 0.9753 | 0.9855 | 0.9948 | 0.9971 | 0.9626 | 0.9967 | 0.9939 | 0.9398 | 0.9921 | | 0.0073 | 25.9259 | 700 | 0.0145 | 0.9761 | 0.9892 | 0.9949 | 0.9965 | 0.9752 | 0.9960 | 0.9938 | 0.9419 | 0.9925 | | 0.009 | 27.7778 | 750 | 0.0143 | 0.9772 | 0.9891 | 0.9951 | 0.9958 | 0.9745 | 0.9970 | 0.9938 | 0.9451 | 0.9929 | | 0.0049 | 29.6296 | 800 | 0.0143 | 0.9782 | 0.9883 | 0.9953 | 0.9966 | 0.9713 | 0.9971 | 0.9942 | 0.9474 | 0.9929 | | 0.0075 | 31.4815 | 850 | 0.0153 | 0.9767 | 0.9886 | 0.9951 | 0.9967 | 0.9727 | 0.9963 | 0.9941 | 0.9434 | 0.9925 | | 0.008 | 
33.3333 | 900 | 0.0155 | 0.9772 | 0.9876 | 0.9952 | 0.9970 | 0.9690 | 0.9968 | 0.9943 | 0.9447 | 0.9927 | | 0.0061 | 35.1852 | 950 | 0.0150 | 0.9777 | 0.9877 | 0.9953 | 0.9973 | 0.9691 | 0.9967 | 0.9943 | 0.9461 | 0.9928 | | 0.0053 | 37.0370 | 1000 | 0.0156 | 0.9776 | 0.9882 | 0.9952 | 0.9974 | 0.9708 | 0.9963 | 0.9942 | 0.9458 | 0.9927 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.0.1+cu117 - Datasets 2.19.2 - Tokenizers 0.19.1
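The hyperparameter list above maps directly onto the `transformers` `TrainingArguments` API. A hedged sketch of an equivalent configuration (the `output_dir` is a placeholder, the dataset and `Trainer` wiring are omitted, and `Trainer` applies the listed Adam betas and epsilon by default):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
args = TrainingArguments(
    output_dir="SegFormer_mit-b5_Clean-Set3-Grayscale",
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="cosine",
    warmup_steps=200,          # lr_scheduler_warmup_steps
    num_train_epochs=50,
)
```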
[ "background", "melt", "substrate" ]
heroza/segformer-finetuned-biofilm2_train
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-finetuned-biofilm2_train This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the heroza/biofilm2_train dataset. It achieves the following results on the evaluation set: - Loss: 0.0761 - Mean Iou: 0.8665 - Mean Accuracy: 0.9765 - Overall Accuracy: 0.9745 - Accuracy Background: 0.9741 - Accuracy Biofilm: 0.9789 - Iou Background: 0.9722 - Iou Biofilm: 0.7608 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: polynomial - training_steps: 10000 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Biofilm | Iou Background | Iou Biofilm | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:----------------:|:--------------:|:-----------:| | 0.1611 | 1.0 | 298 | 0.1220 | 0.8393 | 0.9547 | 0.9687 | 0.9714 | 0.9379 | 0.9660 | 0.7126 | | 0.07 | 2.0 | 596 | 0.0682 | 0.8795 | 0.9359 | 0.9795 | 0.9881 | 0.8837 | 0.9779 | 0.7811 | | 0.0542 | 3.0 | 894 | 0.0564 | 0.8862 | 0.9735 | 0.9793 | 0.9805 | 0.9666 | 0.9775 | 0.7948 | | 0.0508 | 4.0 | 1192 | 0.0517 | 0.8888 | 0.9728 | 0.9799 | 0.9814 | 0.9643 | 0.9782 | 0.7993 | | 0.0491 | 5.0 | 1490 | 0.0479 | 0.8999 | 0.9727 | 0.9824 | 0.9843 | 0.9611 | 0.9809 | 0.8190 | | 0.0496 | 6.0 | 1788 | 0.0665 | 0.8733 | 0.9728 | 0.9764 | 0.9770 | 0.9686 | 0.9743 | 0.7724 | | 0.047 | 7.0 | 2086 | 0.0475 | 0.8936 | 0.9744 | 0.9810 | 0.9823 | 0.9664 | 0.9793 | 0.8079 | | 0.0403 | 8.0 | 2384 | 0.0513 | 0.8897 | 0.9699 | 0.9803 | 0.9823 | 0.9575 | 0.9786 | 0.8008 | | 0.0336 | 9.0 | 2682 | 0.0597 | 0.8736 | 0.9790 | 0.9761 | 0.9756 | 0.9824 | 0.9740 | 0.7732 | | 0.036 | 10.0 | 2980 | 0.0602 | 0.8789 | 0.9781 | 0.9774 | 0.9773 | 0.9789 | 0.9755 | 0.7824 | | 0.0335 | 11.0 | 3278 | 0.0519 | 0.8849 | 0.9670 | 0.9793 | 0.9818 | 0.9522 | 0.9775 | 0.7923 | | 0.0364 | 12.0 | 3576 | 0.0684 | 0.8718 | 0.9810 | 0.9756 | 0.9745 | 0.9874 | 0.9734 | 0.7702 | | 0.0423 | 13.0 | 3874 | 0.0637 | 0.8767 | 0.9742 | 0.9771 | 0.9777 | 0.9707 | 0.9751 | 0.7783 | | 0.0354 | 14.0 | 4172 | 0.0618 | 0.8773 | 0.9692 | 0.9775 | 0.9791 | 0.9593 | 0.9755 | 0.7790 | | 0.0335 | 15.0 | 4470 | 0.0547 | 0.8788 | 0.9686 | 0.9778 | 0.9797 | 0.9574 | 0.9759 | 0.7816 | | 0.0318 | 16.0 | 4768 | 0.0567 | 0.8841 | 0.9744 | 0.9788 | 0.9797 | 0.9691 | 0.9770 | 0.7913 | | 0.0296 | 17.0 | 5066 | 0.0653 | 0.8678 | 0.9741 | 0.9749 | 0.9751 | 0.9732 | 0.9727 | 0.7628 | | 0.0291 | 18.0 | 5364 | 0.0591 | 0.8757 | 0.9718 | 0.9770 | 0.9780 | 0.9657 | 0.9750 | 0.7765 | | 0.0311 | 19.0 | 5662 | 0.0716 | 0.8682 | 0.9753 | 0.9750 | 0.9749 | 0.9756 | 0.9728 | 0.7637 | | 0.0322 | 20.0 | 5960 | 0.0837 | 0.8506 | 0.9773 | 0.9703 | 0.9690 | 0.9857 | 0.9677 | 0.7335 | | 0.0317 | 21.0 | 6258 | 0.0728 | 0.8673 | 0.9749 | 0.9748 | 0.9747 | 0.9751 | 0.9726 | 0.7621 | | 0.0318 | 22.0 | 6556 | 0.0571 | 0.8796 | 0.9764 | 0.9777 | 0.9779 | 0.9748 | 0.9757 | 0.7835 | | 0.0288 | 23.0 | 
6854 | 0.0734 | 0.8689 | 0.9798 | 0.9749 | 0.9739 | 0.9858 | 0.9727 | 0.7651 | | 0.0271 | 24.0 | 7152 | 0.0763 | 0.8615 | 0.9757 | 0.9733 | 0.9728 | 0.9785 | 0.9709 | 0.7521 | | 0.0236 | 25.0 | 7450 | 0.0615 | 0.8789 | 0.9761 | 0.9775 | 0.9778 | 0.9744 | 0.9756 | 0.7823 | | 0.025 | 26.0 | 7748 | 0.0694 | 0.8684 | 0.9768 | 0.9750 | 0.9746 | 0.9790 | 0.9727 | 0.7640 | | 0.0269 | 27.0 | 8046 | 0.0672 | 0.8700 | 0.9688 | 0.9757 | 0.9771 | 0.9605 | 0.9736 | 0.7664 | | 0.0286 | 28.0 | 8344 | 0.0717 | 0.8695 | 0.9761 | 0.9753 | 0.9751 | 0.9771 | 0.9731 | 0.7659 | | 0.0255 | 29.0 | 8642 | 0.0680 | 0.8696 | 0.9757 | 0.9753 | 0.9752 | 0.9761 | 0.9731 | 0.7661 | | 0.0255 | 30.0 | 8940 | 0.0701 | 0.8691 | 0.9756 | 0.9752 | 0.9751 | 0.9762 | 0.9730 | 0.7651 | | 0.0223 | 31.0 | 9238 | 0.0715 | 0.8687 | 0.9746 | 0.9751 | 0.9752 | 0.9740 | 0.9730 | 0.7644 | | 0.0226 | 32.0 | 9536 | 0.0757 | 0.8667 | 0.9770 | 0.9745 | 0.9740 | 0.9799 | 0.9723 | 0.7612 | | 0.022 | 33.0 | 9834 | 0.0773 | 0.8661 | 0.9766 | 0.9744 | 0.9739 | 0.9793 | 0.9721 | 0.7601 | | 0.0217 | 33.56 | 10000 | 0.0761 | 0.8665 | 0.9765 | 0.9745 | 0.9741 | 0.9789 | 0.9722 | 0.7608 | ### Framework versions - Transformers 4.38.0.dev0 - Pytorch 2.0.0+cu117 - Datasets 2.14.4 - Tokenizers 0.15.1
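The reported numbers (mean IoU, mean accuracy, overall accuracy, per-class accuracy and IoU) correspond to what the `evaluate` library's `mean_iou` metric returns. A small self-contained sketch on toy masks; the `ignore_index` value is an assumption, since the card does not state one:

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy 4x4 masks with classes 0 (background) and 1 (biofilm).
pred = np.array([[0, 0, 1, 1]] * 4)
ref = np.array([[0, 1, 1, 1]] * 4)

results = metric.compute(
    predictions=[pred],
    references=[ref],
    num_labels=2,
    ignore_index=255,  # assumption: 255 marks unannotated pixels
)
print(results["mean_iou"], results["per_category_iou"])
```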
[ "background", "biofilm" ]
heroza/segformer-finetuned-biofilm_MRCNNv1_train
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-finetuned-biofilm_MRCNNv1_train This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the heroza/biofilm_MRCNNv1_train dataset. It achieves the following results on the evaluation set: - Loss: 0.0003 - Mean Iou: 0.5000 - Mean Accuracy: 1.0000 - Overall Accuracy: 1.0000 - Accuracy Background: 1.0000 - Accuracy Biofilm: nan - Iou Background: 1.0000 - Iou Biofilm: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: polynomial - training_steps: 10000 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Biofilm | Iou Background | Iou Biofilm | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:----------------:|:--------------:|:-----------:| | 0.262 | 1.0 | 136 | 0.0895 | 0.4984 | 0.9968 | 0.9968 | 0.9968 | nan | 0.9968 | 0.0 | | 0.0826 | 2.0 | 272 | 0.0225 | 0.4995 | 0.9990 | 0.9990 | 0.9990 | nan | 0.9990 | 0.0 | | 0.0225 | 3.0 | 408 | 0.0228 | 0.4985 | 0.9971 | 0.9971 | 0.9971 | nan | 0.9971 | 0.0 | | 0.0141 | 4.0 | 544 | 0.0116 | 0.4997 | 0.9995 | 0.9995 | 0.9995 | nan | 0.9995 | 0.0 | | 0.0093 | 5.0 | 680 | 0.0069 | 0.4996 | 0.9993 | 0.9993 | 0.9993 | nan | 0.9993 | 0.0 | | 0.0054 | 6.0 | 816 | 0.0042 | 0.4999 | 0.9999 | 0.9999 | 0.9999 | nan | 0.9999 | 0.0 | | 0.004 | 7.0 | 952 | 0.0030 | 0.5000 | 0.9999 | 0.9999 | 0.9999 | nan | 0.9999 | 0.0 | | 0.0034 | 8.0 | 1088 | 0.0027 | 0.4999 | 0.9998 | 0.9998 | 0.9998 | nan | 0.9998 | 0.0 | | 0.0024 | 9.0 | 1224 | 0.0021 | 0.4999 | 0.9999 | 0.9999 | 0.9999 | nan | 0.9999 | 0.0 | | 0.0021 | 10.0 | 1360 | 0.0015 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0017 | 11.0 | 1496 | 0.0019 | 0.4999 | 0.9999 | 0.9999 | 0.9999 | nan | 0.9999 | 0.0 | | 0.0014 | 12.0 | 1632 | 0.0015 | 0.4999 | 0.9999 | 0.9999 | 0.9999 | nan | 0.9999 | 0.0 | | 0.0012 | 13.0 | 1768 | 0.0010 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0009 | 14.0 | 1904 | 0.0010 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0008 | 15.0 | 2040 | 0.0009 | 0.5000 | 0.9999 | 0.9999 | 0.9999 | nan | 0.9999 | 0.0 | | 0.0008 | 16.0 | 2176 | 0.0008 | 0.5000 | 0.9999 | 0.9999 | 0.9999 | nan | 0.9999 | 0.0 | | 0.0006 | 17.0 | 2312 | 0.0009 | 0.4999 | 0.9999 | 0.9999 | 0.9999 | nan | 0.9999 | 0.0 | | 0.0007 | 18.0 | 2448 | 0.0005 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0006 | 19.0 | 2584 | 0.0010 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0005 | 20.0 | 2720 | 0.0004 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0004 | 21.0 | 2856 | 0.0005 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0004 | 22.0 | 2992 | 0.0004 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0003 | 23.0 | 3128 | 0.0003 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0004 | 24.0 | 3264 | 0.0003 | 0.5000 | 1.0000 | 
1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0003 | 25.0 | 3400 | 0.0003 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0002 | 26.0 | 3536 | 0.0003 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0003 | 27.0 | 3672 | 0.0002 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0002 | 28.0 | 3808 | 0.0003 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0002 | 29.0 | 3944 | 0.0003 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0002 | 30.0 | 4080 | 0.0002 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0002 | 31.0 | 4216 | 0.0002 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 32.0 | 4352 | 0.0002 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0002 | 33.0 | 4488 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 34.0 | 4624 | 0.0002 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 35.0 | 4760 | 0.0001 | 1.0 | 1.0 | 1.0 | 1.0 | nan | 1.0 | nan | | 0.0002 | 36.0 | 4896 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 37.0 | 5032 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 38.0 | 5168 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 39.0 | 5304 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 40.0 | 5440 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 41.0 | 5576 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 42.0 | 5712 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 43.0 | 5848 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 44.0 | 5984 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 45.0 | 6120 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 46.0 | 6256 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 47.0 | 6392 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 | nan | 1.0 | nan | | 0.0001 | 48.0 | 6528 | 0.0000 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 49.0 | 6664 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 50.0 | 6800 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 51.0 | 6936 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 52.0 | 7072 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 53.0 | 7208 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 54.0 | 7344 | 0.0002 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 55.0 | 7480 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 56.0 | 7616 | 0.0001 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 57.0 | 7752 | 0.0003 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 58.0 | 7888 | 0.0002 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 59.0 | 8024 | 0.0004 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 60.0 | 8160 | 0.0002 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 61.0 | 8296 | 0.0002 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 62.0 | 8432 | 0.0002 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 63.0 | 8568 | 0.0003 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 
0.0 | | 0.0001 | 64.0 | 8704 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 | nan | 1.0 | nan | | 0.0001 | 65.0 | 8840 | 0.0003 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 66.0 | 8976 | 0.0003 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 67.0 | 9112 | 0.0003 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 68.0 | 9248 | 0.0003 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 69.0 | 9384 | 0.0003 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 70.0 | 9520 | 0.0003 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 71.0 | 9656 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 | nan | 1.0 | nan | | 0.0001 | 72.0 | 9792 | 0.0003 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 73.0 | 9928 | 0.0003 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | | 0.0001 | 73.53 | 10000 | 0.0003 | 0.5000 | 1.0000 | 1.0000 | 1.0000 | nan | 1.0000 | 0.0 | ### Framework versions - Transformers 4.38.0.dev0 - Pytorch 2.0.0+cu117 - Datasets 2.14.4 - Tokenizers 0.15.1
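The `nan` biofilm accuracy combined with a biofilm IoU of 0.0 suggests the evaluation split contains no annotated biofilm pixels: per-class accuracy becomes 0/0 (`nan`), and IoU is 0 whenever the model predicts any biofilm pixel at all (and `nan`, as in a few rows above, when it predicts none). A quick sanity check along those lines, with hypothetical stand-in masks:

```python
import numpy as np

def class_pixel_counts(mask: np.ndarray, num_labels: int = 2) -> np.ndarray:
    """Count pixels per class; useful for spotting degenerate eval splits."""
    return np.bincount(mask.ravel(), minlength=num_labels)

# Hypothetical ground-truth and predicted masks for one image.
gt = np.zeros((64, 64), dtype=np.int64)    # no biofilm annotated
pred = np.zeros((64, 64), dtype=np.int64)
pred[0, 0] = 1                             # one spurious biofilm pixel

print("ground truth:", class_pixel_counts(gt))   # [4096    0]
print("prediction: ", class_pixel_counts(pred))  # [4095    1]
```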
[ "background", "biofilm" ]
heroza/segformer-finetuned-biofilm_MRCNNv1_validation
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-finetuned-biofilm_MRCNNv1_validation This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the heroza/biofilm_MRCNNv1_validation dataset. It achieves the following results on the evaluation set: - Loss: 0.0667 - Mean Iou: 0.4894 - Mean Accuracy: 0.9788 - Overall Accuracy: 0.9788 - Accuracy Background: 0.9788 - Accuracy Biofilm: nan - Iou Background: 0.9788 - Iou Biofilm: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: polynomial - training_steps: 10000 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Biofilm | Iou Background | Iou Biofilm | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:----------------:|:--------------:|:-----------:| | 0.0896 | 1.0 | 351 | 0.0405 | 0.4947 | 0.9894 | 0.9894 | 0.9894 | nan | 0.9894 | 0.0 | | 0.0556 | 2.0 | 702 | 0.0459 | 0.4925 | 0.9849 | 0.9849 | 0.9849 | nan | 0.9849 | 0.0 | | 0.0532 | 3.0 | 1053 | 0.0352 | 0.4931 | 0.9863 | 0.9863 | 0.9863 | nan | 0.9863 | 0.0 | | 0.0473 | 4.0 | 1404 | 0.0318 | 0.4936 | 0.9872 | 0.9872 | 0.9872 | nan | 0.9872 | 0.0 | | 0.0387 | 5.0 | 1755 | 0.0318 | 0.4928 | 0.9857 | 0.9857 | 0.9857 | nan | 0.9857 | 0.0 | | 0.0388 | 6.0 | 2106 | 0.0394 | 0.4909 | 0.9817 | 0.9817 | 0.9817 | nan | 0.9817 | 0.0 | | 0.0344 | 7.0 | 2457 | 0.0431 | 0.4906 | 0.9811 | 0.9811 | 0.9811 | nan | 0.9811 | 0.0 | | 0.0409 | 8.0 | 2808 | 0.0347 | 0.4922 | 0.9844 | 0.9844 | 0.9844 | nan | 0.9844 | 0.0 | | 0.0322 | 9.0 | 3159 | 0.0415 | 0.4910 | 0.9819 | 0.9819 | 0.9819 | nan | 0.9819 | 0.0 | | 0.0331 | 10.0 | 3510 | 0.0558 | 0.4884 | 0.9767 | 0.9767 | 0.9767 | nan | 0.9767 | 0.0 | | 0.0337 | 11.0 | 3861 | 0.0422 | 0.4923 | 0.9847 | 0.9847 | 0.9847 | nan | 0.9847 | 0.0 | | 0.0357 | 12.0 | 4212 | 0.0421 | 0.4908 | 0.9816 | 0.9816 | 0.9816 | nan | 0.9816 | 0.0 | | 0.0306 | 13.0 | 4563 | 0.0398 | 0.4913 | 0.9827 | 0.9827 | 0.9827 | nan | 0.9827 | 0.0 | | 0.0324 | 14.0 | 4914 | 0.0488 | 0.4905 | 0.9810 | 0.9810 | 0.9810 | nan | 0.9810 | 0.0 | | 0.0293 | 15.0 | 5265 | 0.0401 | 0.4918 | 0.9835 | 0.9835 | 0.9835 | nan | 0.9835 | 0.0 | | 0.0243 | 16.0 | 5616 | 0.0499 | 0.4894 | 0.9788 | 0.9788 | 0.9788 | nan | 0.9788 | 0.0 | | 0.0306 | 17.0 | 5967 | 0.0495 | 0.4902 | 0.9805 | 0.9805 | 0.9805 | nan | 0.9805 | 0.0 | | 0.0267 | 18.0 | 6318 | 0.0498 | 0.4907 | 0.9813 | 0.9813 | 0.9813 | nan | 0.9813 | 0.0 | | 0.0295 | 19.0 | 6669 | 0.0566 | 0.4903 | 0.9806 | 0.9806 | 0.9806 | nan | 0.9806 | 0.0 | | 0.0263 | 20.0 | 7020 | 0.0658 | 0.4893 | 0.9786 | 0.9786 | 0.9786 | nan | 0.9786 | 0.0 | | 0.0319 | 21.0 | 7371 | 0.0646 | 0.4885 | 0.9770 | 0.9770 | 0.9770 | nan | 0.9770 | 0.0 | | 0.0236 | 22.0 | 7722 | 0.0608 | 0.4897 | 0.9793 | 0.9793 | 0.9793 | nan | 0.9793 | 0.0 | | 0.0249 | 23.0 | 8073 | 0.0578 | 0.4897 | 0.9795 | 0.9795 | 0.9795 | nan | 0.9795 | 0.0 | | 0.0242 | 24.0 | 8424 | 0.0558 | 
0.4902 | 0.9804 | 0.9804 | 0.9804 | nan | 0.9804 | 0.0 | | 0.0264 | 25.0 | 8775 | 0.0579 | 0.4899 | 0.9798 | 0.9798 | 0.9798 | nan | 0.9798 | 0.0 | | 0.0235 | 26.0 | 9126 | 0.0582 | 0.4900 | 0.9801 | 0.9801 | 0.9801 | nan | 0.9801 | 0.0 | | 0.0235 | 27.0 | 9477 | 0.0609 | 0.4897 | 0.9794 | 0.9794 | 0.9794 | nan | 0.9794 | 0.0 | | 0.0204 | 28.0 | 9828 | 0.0648 | 0.4896 | 0.9791 | 0.9791 | 0.9791 | nan | 0.9791 | 0.0 | | 0.023 | 28.49 | 10000 | 0.0667 | 0.4894 | 0.9788 | 0.9788 | 0.9788 | nan | 0.9788 | 0.0 | ### Framework versions - Transformers 4.38.0.dev0 - Pytorch 2.0.0+cu117 - Datasets 2.14.4 - Tokenizers 0.15.1
[ "background", "biofilm" ]
heroza/segformer-finetuned-biofilm_MRCNNv1_halfjoin
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-finetuned-biofilm_MRCNNv1_halfjoin This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the heroza/biofilm_MRCNNv1_halfjoin dataset. It achieves the following results on the evaluation set: - Loss: 0.0208 - Mean Iou: 0.4961 - Mean Accuracy: 0.9923 - Overall Accuracy: 0.9923 - Accuracy Background: 0.9923 - Accuracy Biofilm: nan - Iou Background: 0.9923 - Iou Biofilm: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: polynomial - training_steps: 10000 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Biofilm | Iou Background | Iou Biofilm | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:----------------:|:--------------:|:-----------:| | 0.0713 | 1.0 | 478 | 0.0381 | 0.4953 | 0.9906 | 0.9906 | 0.9906 | nan | 0.9906 | 0.0 | | 0.044 | 2.0 | 956 | 0.0202 | 0.4975 | 0.9949 | 0.9949 | 0.9949 | nan | 0.9949 | 0.0 | | 0.041 | 3.0 | 1434 | 0.0181 | 0.4972 | 0.9945 | 0.9945 | 0.9945 | nan | 0.9945 | 0.0 | | 0.0361 | 4.0 | 1912 | 0.0203 | 0.4963 | 0.9926 | 0.9926 | 0.9926 | nan | 0.9926 | 0.0 | | 0.0357 | 5.0 | 2390 | 0.0163 | 0.4971 | 0.9942 | 0.9942 | 0.9942 | nan | 0.9942 | 0.0 | | 0.0336 | 6.0 | 2868 | 0.0340 | 0.4958 | 0.9915 | 0.9915 | 0.9915 | nan | 0.9915 | 0.0 | | 0.0295 | 7.0 | 3346 | 0.0126 | 0.4978 | 0.9955 | 0.9955 | 0.9955 | nan | 0.9955 | 0.0 | | 0.0251 | 8.0 | 3824 | 0.0220 | 0.4957 | 0.9915 | 0.9915 | 0.9915 | nan | 0.9915 | 0.0 | | 0.0265 | 9.0 | 4302 | 0.0182 | 0.4966 | 0.9933 | 0.9933 | 0.9933 | nan | 0.9933 | 0.0 | | 0.0238 | 10.0 | 4780 | 0.0155 | 0.4970 | 0.9940 | 0.9940 | 0.9940 | nan | 0.9940 | 0.0 | | 0.0258 | 11.0 | 5258 | 0.0181 | 0.4966 | 0.9931 | 0.9931 | 0.9931 | nan | 0.9931 | 0.0 | | 0.0264 | 12.0 | 5736 | 0.0179 | 0.4969 | 0.9938 | 0.9938 | 0.9938 | nan | 0.9938 | 0.0 | | 0.0265 | 13.0 | 6214 | 0.0222 | 0.4959 | 0.9917 | 0.9917 | 0.9917 | nan | 0.9917 | 0.0 | | 0.0219 | 14.0 | 6692 | 0.0200 | 0.4962 | 0.9925 | 0.9925 | 0.9925 | nan | 0.9925 | 0.0 | | 0.0213 | 15.0 | 7170 | 0.0234 | 0.4958 | 0.9916 | 0.9916 | 0.9916 | nan | 0.9916 | 0.0 | | 0.0192 | 16.0 | 7648 | 0.0199 | 0.4961 | 0.9922 | 0.9922 | 0.9922 | nan | 0.9922 | 0.0 | | 0.0232 | 17.0 | 8126 | 0.0208 | 0.4961 | 0.9923 | 0.9923 | 0.9923 | nan | 0.9923 | 0.0 | | 0.0219 | 18.0 | 8604 | 0.0245 | 0.4955 | 0.9909 | 0.9909 | 0.9909 | nan | 0.9909 | 0.0 | | 0.0201 | 19.0 | 9082 | 0.0211 | 0.4961 | 0.9922 | 0.9922 | 0.9922 | nan | 0.9922 | 0.0 | | 0.0192 | 20.0 | 9560 | 0.0207 | 0.4962 | 0.9923 | 0.9923 | 0.9923 | nan | 0.9923 | 0.0 | | 0.0175 | 20.92 | 10000 | 0.0208 | 0.4961 | 0.9923 | 0.9923 | 0.9923 | nan | 0.9923 | 0.0 | ### Framework versions - Transformers 4.38.0.dev0 - Pytorch 2.0.0+cu117 - Datasets 2.14.4 - Tokenizers 0.15.1
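Like the other runs in this series, training uses a polynomial learning-rate schedule over a fixed step budget. A minimal sketch with the matching `transformers` helper; the zero warmup is an assumption (the card lists no warmup steps) and the linear layer is a stand-in for the real model:

```python
import torch
from transformers import get_polynomial_decay_schedule_with_warmup

model = torch.nn.Linear(4, 2)  # stand-in for the SegFormer model
optimizer = torch.optim.Adam(
    model.parameters(), lr=6e-5, betas=(0.9, 0.999), eps=1e-8
)
scheduler = get_polynomial_decay_schedule_with_warmup(
    optimizer,
    num_warmup_steps=0,         # assumption: no warmup listed
    num_training_steps=10_000,  # training_steps from the card
)
for _ in range(3):
    optimizer.step()   # real training would compute a loss first
    scheduler.step()
```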
[ "background", "biofilm" ]
heroza/segformer-finetuned-biofilm_MRCNNv1_concat
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-finetuned-biofilm_MRCNNv1_concat This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the heroza/biofilm_MRCNNv1_concat dataset. It achieves the following results on the evaluation set: - Loss: 0.0530 - Mean Iou: 0.8677 - Mean Accuracy: 0.9780 - Overall Accuracy: 0.9815 - Accuracy Background: 0.9820 - Accuracy Biofilm: 0.9740 - Iou Background: 0.9804 - Iou Biofilm: 0.7549 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: polynomial - training_steps: 10000 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Biofilm | Iou Background | Iou Biofilm | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:----------------:|:--------------:|:-----------:| | 0.0759 | 1.0 | 433 | 0.0649 | 0.8684 | 0.9740 | 0.9818 | 0.9829 | 0.9651 | 0.9807 | 0.7561 | | 0.0435 | 2.0 | 866 | 0.0391 | 0.8875 | 0.9616 | 0.9855 | 0.9887 | 0.9345 | 0.9847 | 0.7903 | | 0.0407 | 3.0 | 1299 | 0.0353 | 0.8951 | 0.9696 | 0.9865 | 0.9888 | 0.9504 | 0.9857 | 0.8046 | | 0.0372 | 4.0 | 1732 | 0.0489 | 0.8765 | 0.9810 | 0.9830 | 0.9833 | 0.9788 | 0.9820 | 0.7711 | | 0.0378 | 5.0 | 2165 | 0.0311 | 0.9020 | 0.9574 | 0.9879 | 0.9919 | 0.9229 | 0.9872 | 0.8168 | | 0.0325 | 6.0 | 2598 | 0.0510 | 0.8663 | 0.9745 | 0.9814 | 0.9823 | 0.9666 | 0.9803 | 0.7524 | | 0.0306 | 7.0 | 3031 | 0.0428 | 0.8873 | 0.9760 | 0.9850 | 0.9862 | 0.9657 | 0.9842 | 0.7904 | | 0.0318 | 8.0 | 3464 | 0.0399 | 0.8837 | 0.9739 | 0.9845 | 0.9859 | 0.9618 | 0.9836 | 0.7839 | | 0.0302 | 9.0 | 3897 | 0.0436 | 0.8795 | 0.9689 | 0.9840 | 0.9859 | 0.9520 | 0.9830 | 0.7760 | | 0.0236 | 10.0 | 4330 | 0.0391 | 0.8856 | 0.9713 | 0.9849 | 0.9867 | 0.9560 | 0.9840 | 0.7871 | | 0.0247 | 11.0 | 4763 | 0.0451 | 0.8705 | 0.9731 | 0.9822 | 0.9834 | 0.9628 | 0.9812 | 0.7598 | | 0.0213 | 12.0 | 5196 | 0.0487 | 0.8656 | 0.9735 | 0.9813 | 0.9824 | 0.9647 | 0.9802 | 0.7510 | | 0.0256 | 13.0 | 5629 | 0.0444 | 0.8799 | 0.9668 | 0.9841 | 0.9864 | 0.9473 | 0.9832 | 0.7766 | | 0.0218 | 14.0 | 6062 | 0.0492 | 0.8679 | 0.9773 | 0.9816 | 0.9822 | 0.9725 | 0.9805 | 0.7553 | | 0.0216 | 15.0 | 6495 | 0.0502 | 0.8717 | 0.9748 | 0.9824 | 0.9834 | 0.9663 | 0.9813 | 0.7621 | | 0.0206 | 16.0 | 6928 | 0.0565 | 0.8623 | 0.9766 | 0.9806 | 0.9811 | 0.9721 | 0.9794 | 0.7453 | | 0.0223 | 17.0 | 7361 | 0.0509 | 0.8666 | 0.9730 | 0.9815 | 0.9826 | 0.9635 | 0.9804 | 0.7527 | | 0.0226 | 18.0 | 7794 | 0.0464 | 0.8794 | 0.9792 | 0.9836 | 0.9842 | 0.9743 | 0.9826 | 0.7762 | | 0.0243 | 19.0 | 8227 | 0.0546 | 0.8649 | 0.9824 | 0.9809 | 0.9806 | 0.9843 | 0.9797 | 0.7501 | | 0.02 | 20.0 | 8660 | 0.0567 | 0.8648 | 0.9766 | 0.9810 | 0.9816 | 0.9716 | 0.9799 | 0.7496 | | 0.0196 | 21.0 | 9093 | 0.0559 | 0.8648 | 0.9784 | 0.9810 | 0.9813 | 0.9755 | 0.9798 | 0.7497 | | 0.0206 | 22.0 | 9526 | 0.0552 | 0.8652 | 0.9779 | 0.9811 | 0.9815 | 0.9742 | 0.9799 | 0.7504 | 
| 0.0189 | 23.0 | 9959 | 0.0544 | 0.8661 | 0.9785 | 0.9812 | 0.9816 | 0.9753 | 0.9801 | 0.7521 | | 0.0208 | 23.09 | 10000 | 0.0530 | 0.8677 | 0.9780 | 0.9815 | 0.9820 | 0.9740 | 0.9804 | 0.7549 | ### Framework versions - Transformers 4.38.0.dev0 - Pytorch 2.0.0+cu117 - Datasets 2.14.4 - Tokenizers 0.15.1
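For qualitative inspection of a binary biofilm segmentation like this one, a simple overlay helper can blend the predicted mask over the input image. Everything below is illustrative; the card itself ships no visualization code:

```python
import numpy as np
from PIL import Image

def overlay_mask(image: Image.Image, mask: np.ndarray, alpha: float = 0.5) -> Image.Image:
    """Blend a binary mask (1 = biofilm) in red over an RGB image.

    `mask` must have the same height and width as `image`.
    """
    color = np.zeros((*mask.shape, 3), dtype=np.uint8)
    color[mask == 1] = (255, 0, 0)  # red for biofilm
    return Image.blend(image.convert("RGB"), Image.fromarray(color), alpha)

img = Image.new("RGB", (64, 64), "gray")   # placeholder image
mask = np.zeros((64, 64), dtype=np.uint8)
mask[16:48, 16:48] = 1
overlay_mask(img, mask).save("overlay.png")
```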
[ "background", "biofilm" ]
chugz/segformer-b0-practice-7-11
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-practice-7-11 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the chugz/SEM dataset. It achieves the following results on the evaluation set: - Loss: 0.1941 - Mean Iou: 0.5755 - Mean Accuracy: 0.7536 - Overall Accuracy: 0.9340 - Accuracy Background: nan - Accuracy Silver: 0.9537 - Accuracy Glass: 0.0749 - Accuracy Silicon: 0.9909 - Accuracy Void: 0.8095 - Accuracy Interfacial void: 0.9391 - Iou Background: 0.0 - Iou Silver: 0.8956 - Iou Glass: 0.0717 - Iou Silicon: 0.9803 - Iou Void: 0.6885 - Iou Interfacial void: 0.8168 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Silver | Accuracy Glass | Accuracy Silicon | Accuracy Void | Accuracy Interfacial void | Iou Background | Iou Silver | Iou Glass | Iou Silicon | Iou Void | Iou Interfacial void | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:---------------:|:--------------:|:----------------:|:-------------:|:-------------------------:|:--------------:|:----------:|:---------:|:-----------:|:--------:|:--------------------:| | 1.2415 | 0.6061 | 20 | 1.3040 | 0.2800 | 0.4306 | 0.7396 | nan | 0.9612 | 0.0069 | 0.9153 | 0.0 | 0.2698 | 0.0 | 0.5882 | 0.0069 | 0.8301 | 0.0 | 0.2549 | | 0.9185 | 1.2121 | 40 | 0.7957 | 0.3642 | 0.5203 | 0.8249 | nan | 0.9524 | 0.0 | 0.9539 | 0.0000 | 0.6951 | 0.0 | 0.7181 | 0.0 | 0.8980 | 0.0000 | 0.5691 | | 0.6451 | 1.8182 | 60 | 0.6017 | 0.3679 | 0.5206 | 0.8131 | nan | 0.8835 | 0.0 | 0.9456 | 0.0049 | 0.7688 | 0.0 | 0.6888 | 0.0 | 0.9053 | 0.0049 | 0.6086 | | 0.5145 | 2.4242 | 80 | 0.5779 | 0.4033 | 0.5635 | 0.8393 | nan | 0.9514 | 0.0 | 0.9371 | 0.1596 | 0.7693 | 0.0 | 0.7321 | 0.0 | 0.9199 | 0.1516 | 0.6165 | | 0.9509 | 3.0303 | 100 | 0.5006 | 0.4017 | 0.5540 | 0.8420 | nan | 0.9510 | 0.0 | 0.9548 | 0.1047 | 0.7592 | 0.0 | 0.7312 | 0.0 | 0.9273 | 0.1010 | 0.6506 | | 0.6075 | 3.6364 | 120 | 0.4274 | 0.4688 | 0.6372 | 0.8776 | nan | 0.9414 | 0.0 | 0.9613 | 0.4300 | 0.8535 | 0.0 | 0.7953 | 0.0 | 0.9395 | 0.3709 | 0.7070 | | 0.4155 | 4.2424 | 140 | 0.3949 | 0.4994 | 0.6781 | 0.8912 | nan | 0.9260 | 0.0 | 0.9560 | 0.5908 | 0.9177 | 0.0 | 0.8258 | 0.0 | 0.9412 | 0.4981 | 0.7313 | | 0.4072 | 4.8485 | 160 | 0.3538 | 0.5040 | 0.6794 | 0.8961 | nan | 0.9463 | 0.0 | 0.9668 | 0.6083 | 0.8758 | 0.0 | 0.8231 | 0.0 | 0.9540 | 0.4978 | 0.7493 | | 0.3085 | 5.4545 | 180 | 0.3441 | 0.5033 | 0.6814 | 0.8919 | nan | 0.9506 | 0.0 | 0.9536 | 0.6344 | 0.8684 | 0.0 | 0.8170 | 0.0 | 0.9417 | 0.5214 | 0.7398 | | 0.2885 | 6.0606 | 200 | 0.3357 | 0.5136 | 0.6967 | 0.8989 | nan | 0.9082 | 0.0 | 0.9780 | 0.6987 | 0.8987 | 0.0 | 0.8389 | 0.0 | 0.9476 | 0.5652 | 0.7302 | | 0.2754 | 6.6667 | 220 | 0.3052 | 0.5253 | 0.7061 | 0.9094 | nan | 0.9355 | 0.0 | 0.9714 | 0.6918 | 0.9319 | 0.0 | 
0.8553 | 0.0 | 0.9577 | 0.5742 | 0.7643 | | 0.2942 | 7.2727 | 240 | 0.2893 | 0.5226 | 0.6948 | 0.9087 | nan | 0.9388 | 0.0 | 0.9805 | 0.6373 | 0.9175 | 0.0 | 0.8466 | 0.0 | 0.9618 | 0.5504 | 0.7766 | | 0.2324 | 7.8788 | 260 | 0.3018 | 0.5221 | 0.7053 | 0.9053 | nan | 0.9031 | 0.0 | 0.9754 | 0.6931 | 0.9549 | 0.0 | 0.8499 | 0.0 | 0.9659 | 0.5796 | 0.7370 | | 0.2872 | 8.4848 | 280 | 0.2758 | 0.5339 | 0.7129 | 0.9139 | nan | 0.9160 | 0.0 | 0.9880 | 0.7207 | 0.9396 | 0.0 | 0.8585 | 0.0 | 0.9739 | 0.6018 | 0.7694 | | 0.2076 | 9.0909 | 300 | 0.2615 | 0.5317 | 0.7011 | 0.9168 | nan | 0.9431 | 0.0 | 0.9920 | 0.6451 | 0.9253 | 0.0 | 0.8670 | 0.0 | 0.9715 | 0.5706 | 0.7811 | | 0.2796 | 9.6970 | 320 | 0.2718 | 0.5226 | 0.6956 | 0.9078 | nan | 0.9573 | 0.0 | 0.9720 | 0.6536 | 0.8953 | 0.0 | 0.8501 | 0.0 | 0.9577 | 0.5505 | 0.7771 | | 0.3215 | 10.3030 | 340 | 0.2627 | 0.5414 | 0.7207 | 0.9193 | nan | 0.9303 | 0.0 | 0.9847 | 0.7407 | 0.9476 | 0.0 | 0.8680 | 0.0 | 0.9735 | 0.6193 | 0.7878 | | 0.209 | 10.9091 | 360 | 0.2516 | 0.5337 | 0.7073 | 0.9189 | nan | 0.9487 | 0.0 | 0.9898 | 0.6761 | 0.9221 | 0.0 | 0.8733 | 0.0 | 0.9734 | 0.5741 | 0.7818 | | 0.2928 | 11.5152 | 380 | 0.2606 | 0.5457 | 0.7354 | 0.9225 | nan | 0.9284 | 0.0 | 0.9831 | 0.8203 | 0.9452 | 0.0 | 0.8774 | 0.0 | 0.9733 | 0.6401 | 0.7831 | | 0.2246 | 12.1212 | 400 | 0.2519 | 0.5378 | 0.7107 | 0.9180 | nan | 0.9373 | 0.0 | 0.9906 | 0.6993 | 0.9263 | 0.0 | 0.8686 | 0.0 | 0.9691 | 0.6136 | 0.7755 | | 0.2386 | 12.7273 | 420 | 0.2477 | 0.5443 | 0.7290 | 0.9212 | nan | 0.9407 | 0.0001 | 0.9828 | 0.7968 | 0.9245 | 0.0 | 0.8733 | 0.0001 | 0.9703 | 0.6294 | 0.7929 | | 0.1734 | 13.3333 | 440 | 0.2285 | 0.5466 | 0.7234 | 0.9234 | nan | 0.9479 | 0.0005 | 0.9879 | 0.7531 | 0.9274 | 0.0 | 0.8791 | 0.0005 | 0.9747 | 0.6310 | 0.7940 | | 0.1809 | 13.9394 | 460 | 0.2304 | 0.5501 | 0.7254 | 0.9259 | nan | 0.9451 | 0.0002 | 0.9923 | 0.7544 | 0.9349 | 0.0 | 0.8799 | 0.0002 | 0.9774 | 0.6386 | 0.8047 | | 0.2162 | 14.5455 | 480 | 0.2431 | 0.5514 | 0.7364 | 0.9257 | nan | 0.9451 | 0.0022 | 0.9792 | 0.8061 | 0.9493 | 0.0 | 0.8825 | 0.0022 | 0.9714 | 0.6526 | 0.7996 | | 0.1973 | 15.1515 | 500 | 0.2519 | 0.5480 | 0.7321 | 0.9188 | nan | 0.9306 | 0.0002 | 0.9773 | 0.8158 | 0.9364 | 0.0 | 0.8645 | 0.0002 | 0.9631 | 0.6636 | 0.7967 | | 0.1218 | 15.7576 | 520 | 0.2275 | 0.5513 | 0.7291 | 0.9245 | nan | 0.9463 | 0.0027 | 0.9827 | 0.7704 | 0.9436 | 0.0 | 0.8842 | 0.0027 | 0.9712 | 0.6524 | 0.7972 | | 0.1809 | 16.3636 | 540 | 0.2323 | 0.5540 | 0.7339 | 0.9271 | nan | 0.9406 | 0.0014 | 0.9876 | 0.7905 | 0.9495 | 0.0 | 0.8827 | 0.0014 | 0.9744 | 0.6583 | 0.8071 | | 0.1484 | 16.9697 | 560 | 0.2364 | 0.5527 | 0.7374 | 0.9223 | nan | 0.9343 | 0.0027 | 0.9759 | 0.8231 | 0.9509 | 0.0 | 0.8694 | 0.0027 | 0.9657 | 0.6680 | 0.8105 | | 0.1464 | 17.5758 | 580 | 0.2338 | 0.5519 | 0.7376 | 0.9254 | nan | 0.9397 | 0.0030 | 0.9849 | 0.8248 | 0.9354 | 0.0 | 0.8782 | 0.0030 | 0.9752 | 0.6523 | 0.8029 | | 0.1389 | 18.1818 | 600 | 0.2355 | 0.5541 | 0.7350 | 0.9227 | nan | 0.9281 | 0.0044 | 0.9822 | 0.8067 | 0.9537 | 0.0 | 0.8759 | 0.0044 | 0.9722 | 0.6756 | 0.7967 | | 0.115 | 18.7879 | 620 | 0.2175 | 0.5478 | 0.7175 | 0.9242 | nan | 0.9554 | 0.0080 | 0.9932 | 0.7125 | 0.9180 | 0.0 | 0.8806 | 0.0079 | 0.9735 | 0.6302 | 0.7947 | | 0.1704 | 19.3939 | 640 | 0.2246 | 0.5552 | 0.7320 | 0.9283 | nan | 0.9413 | 0.0079 | 0.9919 | 0.7671 | 0.9517 | 0.0 | 0.8883 | 0.0078 | 0.9815 | 0.6585 | 0.7951 | | 0.1537 | 20.0 | 660 | 0.2222 | 0.5590 | 0.7370 | 0.9299 | nan | 0.9461 | 0.0193 | 0.9919 | 0.7830 | 0.9445 
| 0.0 | 0.8923 | 0.0189 | 0.9803 | 0.6620 | 0.8002 | | 0.1605 | 20.6061 | 680 | 0.2174 | 0.5578 | 0.7417 | 0.9273 | nan | 0.9388 | 0.0084 | 0.9865 | 0.8359 | 0.9390 | 0.0 | 0.8833 | 0.0083 | 0.9749 | 0.6706 | 0.8096 | | 0.1237 | 21.2121 | 700 | 0.2154 | 0.5639 | 0.7384 | 0.9312 | nan | 0.9496 | 0.0219 | 0.9935 | 0.7867 | 0.9404 | 0.0 | 0.8913 | 0.0214 | 0.9812 | 0.6799 | 0.8096 | | 0.1288 | 21.8182 | 720 | 0.2214 | 0.5627 | 0.7430 | 0.9303 | nan | 0.9406 | 0.0237 | 0.9907 | 0.8101 | 0.9497 | 0.0 | 0.8926 | 0.0233 | 0.9799 | 0.6771 | 0.8034 | | 0.122 | 22.4242 | 740 | 0.2160 | 0.5675 | 0.7469 | 0.9309 | nan | 0.9367 | 0.0376 | 0.9922 | 0.8161 | 0.9519 | 0.0 | 0.8921 | 0.0361 | 0.9816 | 0.6884 | 0.8067 | | 0.1608 | 23.0303 | 760 | 0.2112 | 0.5613 | 0.7412 | 0.9304 | nan | 0.9417 | 0.0160 | 0.9934 | 0.8137 | 0.9410 | 0.0 | 0.8907 | 0.0157 | 0.9802 | 0.6695 | 0.8116 | | 0.1258 | 23.6364 | 780 | 0.2230 | 0.5611 | 0.7425 | 0.9293 | nan | 0.9367 | 0.0196 | 0.9898 | 0.8148 | 0.9518 | 0.0 | 0.8894 | 0.0192 | 0.9808 | 0.6733 | 0.8036 | | 0.1333 | 24.2424 | 800 | 0.2171 | 0.5606 | 0.7377 | 0.9280 | nan | 0.9420 | 0.0207 | 0.9871 | 0.7868 | 0.9520 | 0.0 | 0.8865 | 0.0203 | 0.9773 | 0.6734 | 0.8063 | | 0.1562 | 24.8485 | 820 | 0.2183 | 0.5644 | 0.7477 | 0.9302 | nan | 0.9301 | 0.0299 | 0.9929 | 0.8333 | 0.9523 | 0.0 | 0.8882 | 0.0291 | 0.9813 | 0.6798 | 0.8081 | | 0.1039 | 25.4545 | 840 | 0.2106 | 0.5648 | 0.7389 | 0.9302 | nan | 0.9506 | 0.0412 | 0.9920 | 0.7729 | 0.9379 | 0.0 | 0.8890 | 0.0392 | 0.9804 | 0.6668 | 0.8136 | | 0.1079 | 26.0606 | 860 | 0.2099 | 0.5625 | 0.7449 | 0.9299 | nan | 0.9385 | 0.0233 | 0.9896 | 0.8246 | 0.9486 | 0.0 | 0.8917 | 0.0227 | 0.9796 | 0.6752 | 0.8056 | | 0.0995 | 26.6667 | 880 | 0.2122 | 0.5620 | 0.7403 | 0.9310 | nan | 0.9464 | 0.0210 | 0.9913 | 0.7954 | 0.9474 | 0.0 | 0.8916 | 0.0207 | 0.9793 | 0.6690 | 0.8115 | | 0.1326 | 27.2727 | 900 | 0.2150 | 0.5611 | 0.7428 | 0.9291 | nan | 0.9412 | 0.0198 | 0.9879 | 0.8193 | 0.9459 | 0.0 | 0.8868 | 0.0195 | 0.9785 | 0.6684 | 0.8130 | | 0.1602 | 27.8788 | 920 | 0.2108 | 0.5645 | 0.7438 | 0.9312 | nan | 0.9388 | 0.0249 | 0.9925 | 0.8091 | 0.9537 | 0.0 | 0.8935 | 0.0243 | 0.9812 | 0.6835 | 0.8044 | | 0.1758 | 28.4848 | 940 | 0.2116 | 0.5682 | 0.7461 | 0.9326 | nan | 0.9520 | 0.0347 | 0.9891 | 0.8089 | 0.9459 | 0.0 | 0.9004 | 0.0336 | 0.9775 | 0.6888 | 0.8089 | | 0.3986 | 29.0909 | 960 | 0.1990 | 0.5783 | 0.7581 | 0.9314 | nan | 0.9414 | 0.1021 | 0.9878 | 0.8095 | 0.9496 | 0.0 | 0.8923 | 0.0941 | 0.9785 | 0.6951 | 0.8099 | | 0.1477 | 29.6970 | 980 | 0.2202 | 0.5627 | 0.7442 | 0.9290 | nan | 0.9450 | 0.0213 | 0.9847 | 0.8258 | 0.9442 | 0.0 | 0.8857 | 0.0210 | 0.9749 | 0.6807 | 0.8142 | | 0.1221 | 30.3030 | 1000 | 0.2288 | 0.5621 | 0.7452 | 0.9292 | nan | 0.9340 | 0.0226 | 0.9869 | 0.8218 | 0.9606 | 0.0 | 0.8922 | 0.0224 | 0.9775 | 0.6855 | 0.7950 | | 0.1092 | 30.9091 | 1020 | 0.2079 | 0.5696 | 0.7507 | 0.9297 | nan | 0.9404 | 0.0511 | 0.9902 | 0.8393 | 0.9322 | 0.0 | 0.8880 | 0.0478 | 0.9802 | 0.6916 | 0.8100 | | 0.1488 | 31.5152 | 1040 | 0.2147 | 0.5675 | 0.7501 | 0.9306 | nan | 0.9439 | 0.0550 | 0.9867 | 0.8164 | 0.9484 | 0.0 | 0.8939 | 0.0528 | 0.9773 | 0.6731 | 0.8078 | | 0.1177 | 32.1212 | 1060 | 0.2240 | 0.5688 | 0.7470 | 0.9311 | nan | 0.9409 | 0.0466 | 0.9899 | 0.8028 | 0.9550 | 0.0 | 0.8952 | 0.0444 | 0.9801 | 0.6910 | 0.8022 | | 0.1121 | 32.7273 | 1080 | 0.2043 | 0.5700 | 0.7471 | 0.9327 | nan | 0.9542 | 0.0467 | 0.9895 | 0.8043 | 0.9409 | 0.0 | 0.8949 | 0.0449 | 0.9791 | 0.6885 | 0.8129 | | 0.1263 | 33.3333 | 1100 | 
0.2120 | 0.5679 | 0.7516 | 0.9299 | nan | 0.9341 | 0.0447 | 0.9888 | 0.8415 | 0.9489 | 0.0 | 0.8896 | 0.0434 | 0.9785 | 0.6851 | 0.8110 | | 0.0922 | 33.9394 | 1120 | 0.2104 | 0.5721 | 0.7507 | 0.9331 | nan | 0.9512 | 0.0605 | 0.9902 | 0.8086 | 0.9427 | 0.0 | 0.8984 | 0.0574 | 0.9797 | 0.6855 | 0.8118 | | 0.1549 | 34.5455 | 1140 | 0.2276 | 0.5624 | 0.7428 | 0.9296 | nan | 0.9371 | 0.0369 | 0.9895 | 0.7912 | 0.9593 | 0.0 | 0.8951 | 0.0359 | 0.9800 | 0.6682 | 0.7951 | | 0.1493 | 35.1515 | 1160 | 0.1981 | 0.5739 | 0.7532 | 0.9336 | nan | 0.9495 | 0.0723 | 0.9915 | 0.8094 | 0.9431 | 0.0 | 0.8946 | 0.0698 | 0.9817 | 0.6780 | 0.8197 | | 0.1176 | 35.7576 | 1180 | 0.2030 | 0.5757 | 0.7552 | 0.9351 | nan | 0.9506 | 0.0684 | 0.9925 | 0.8196 | 0.9447 | 0.0 | 0.9026 | 0.0648 | 0.9820 | 0.6921 | 0.8129 | | 0.229 | 36.3636 | 1200 | 0.2046 | 0.5730 | 0.7524 | 0.9337 | nan | 0.9468 | 0.0560 | 0.9917 | 0.8208 | 0.9466 | 0.0 | 0.8978 | 0.0539 | 0.9816 | 0.6888 | 0.8158 | | 0.1419 | 36.9697 | 1220 | 0.2069 | 0.5695 | 0.7491 | 0.9322 | nan | 0.9410 | 0.0449 | 0.9909 | 0.8142 | 0.9546 | 0.0 | 0.8945 | 0.0439 | 0.9812 | 0.6880 | 0.8093 | | 0.0725 | 37.5758 | 1240 | 0.2001 | 0.5724 | 0.7554 | 0.9322 | nan | 0.9400 | 0.0641 | 0.9916 | 0.8397 | 0.9418 | 0.0 | 0.8937 | 0.0612 | 0.9806 | 0.6830 | 0.8157 | | 0.0653 | 38.1818 | 1260 | 0.2039 | 0.5729 | 0.7530 | 0.9344 | nan | 0.9474 | 0.0577 | 0.9916 | 0.8173 | 0.9510 | 0.0 | 0.8999 | 0.0561 | 0.9823 | 0.6843 | 0.8151 | | 0.1446 | 38.7879 | 1280 | 0.2051 | 0.5729 | 0.7554 | 0.9325 | nan | 0.9349 | 0.0637 | 0.9926 | 0.8325 | 0.9531 | 0.0 | 0.8944 | 0.0619 | 0.9823 | 0.6853 | 0.8138 | | 0.1172 | 39.3939 | 1300 | 0.2099 | 0.5719 | 0.7498 | 0.9320 | nan | 0.9499 | 0.0707 | 0.9881 | 0.7914 | 0.9486 | 0.0 | 0.8959 | 0.0674 | 0.9797 | 0.6778 | 0.8108 | | 0.0907 | 40.0 | 1320 | 0.2099 | 0.5691 | 0.7500 | 0.9323 | nan | 0.9445 | 0.0536 | 0.9894 | 0.8095 | 0.9528 | 0.0 | 0.8966 | 0.0520 | 0.9803 | 0.6778 | 0.8078 | | 0.1174 | 40.6061 | 1340 | 0.2221 | 0.5677 | 0.7461 | 0.9332 | nan | 0.9491 | 0.0481 | 0.9913 | 0.7887 | 0.9535 | 0.0 | 0.9008 | 0.0464 | 0.9819 | 0.6707 | 0.8064 | | 0.1053 | 41.2121 | 1360 | 0.2092 | 0.5699 | 0.7493 | 0.9320 | nan | 0.9447 | 0.0571 | 0.9904 | 0.8045 | 0.9496 | 0.0 | 0.8970 | 0.0545 | 0.9810 | 0.6814 | 0.8056 | | 0.1026 | 41.8182 | 1380 | 0.2012 | 0.5752 | 0.7541 | 0.9338 | nan | 0.9489 | 0.0759 | 0.9926 | 0.8129 | 0.9404 | 0.0 | 0.8954 | 0.0720 | 0.9819 | 0.6856 | 0.8164 | | 0.1371 | 42.4242 | 1400 | 0.1945 | 0.5783 | 0.7575 | 0.9336 | nan | 0.9466 | 0.0953 | 0.9927 | 0.8133 | 0.9399 | 0.0 | 0.8959 | 0.0889 | 0.9823 | 0.6863 | 0.8162 | | 0.0799 | 43.0303 | 1420 | 0.2107 | 0.5712 | 0.7508 | 0.9332 | nan | 0.9492 | 0.0619 | 0.9902 | 0.8045 | 0.9483 | 0.0 | 0.8972 | 0.0595 | 0.9816 | 0.6766 | 0.8124 | | 0.1458 | 43.6364 | 1440 | 0.1906 | 0.5788 | 0.7577 | 0.9339 | nan | 0.9489 | 0.0892 | 0.9909 | 0.8185 | 0.9410 | 0.0 | 0.8943 | 0.0843 | 0.9805 | 0.6925 | 0.8213 | | 0.112 | 44.2424 | 1460 | 0.2091 | 0.5726 | 0.7541 | 0.9329 | nan | 0.9418 | 0.0579 | 0.9901 | 0.8288 | 0.9518 | 0.0 | 0.8953 | 0.0560 | 0.9802 | 0.6893 | 0.8149 | | 0.1311 | 44.8485 | 1480 | 0.2022 | 0.5730 | 0.7519 | 0.9326 | nan | 0.9512 | 0.0656 | 0.9888 | 0.8131 | 0.9408 | 0.0 | 0.8959 | 0.0622 | 0.9791 | 0.6862 | 0.8146 | | 0.1474 | 45.4545 | 1500 | 0.2001 | 0.5743 | 0.7578 | 0.9328 | nan | 0.9410 | 0.0698 | 0.9908 | 0.8441 | 0.9436 | 0.0 | 0.8939 | 0.0671 | 0.9805 | 0.6887 | 0.8157 | | 0.0953 | 46.0606 | 1520 | 0.2072 | 0.5764 | 0.7568 | 0.9338 | nan | 0.9420 | 0.0672 | 0.9906 | 
0.8310 | 0.9530 | 0.0 | 0.8954 | 0.0651 | 0.9815 | 0.7000 | 0.8167 | | 0.1038 | 46.6667 | 1540 | 0.2003 | 0.5760 | 0.7544 | 0.9339 | nan | 0.9465 | 0.0760 | 0.9921 | 0.8092 | 0.9480 | 0.0 | 0.8963 | 0.0728 | 0.9824 | 0.6886 | 0.8157 | | 0.1362 | 47.2727 | 1560 | 0.1978 | 0.5755 | 0.7533 | 0.9327 | nan | 0.9482 | 0.0793 | 0.9908 | 0.8065 | 0.9416 | 0.0 | 0.8950 | 0.0751 | 0.9804 | 0.6895 | 0.8130 | | 0.1052 | 47.8788 | 1580 | 0.2080 | 0.5750 | 0.7526 | 0.9336 | nan | 0.9484 | 0.0676 | 0.9896 | 0.8045 | 0.9528 | 0.0 | 0.8969 | 0.0653 | 0.9802 | 0.6947 | 0.8129 | | 0.0813 | 48.4848 | 1600 | 0.2054 | 0.5744 | 0.7548 | 0.9338 | nan | 0.9439 | 0.0579 | 0.9919 | 0.8332 | 0.9471 | 0.0 | 0.8961 | 0.0559 | 0.9819 | 0.6965 | 0.8163 | | 0.0992 | 49.0909 | 1620 | 0.2016 | 0.5765 | 0.7563 | 0.9344 | nan | 0.9467 | 0.0730 | 0.9922 | 0.8243 | 0.9455 | 0.0 | 0.8952 | 0.0702 | 0.9826 | 0.6912 | 0.8200 | | 0.1604 | 49.6970 | 1640 | 0.1941 | 0.5755 | 0.7536 | 0.9340 | nan | 0.9537 | 0.0749 | 0.9909 | 0.8095 | 0.9391 | 0.0 | 0.8956 | 0.0717 | 0.9803 | 0.6885 | 0.8168 | ### Framework versions - Transformers 4.42.4 - Pytorch 2.3.0+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
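With six classes, raw index maps are hard to read, so a colorized rendering helps. The class order matches the labels listed below, while the palette itself is made up, since the card specifies no display colors:

```python
import numpy as np

id2label = {
    0: "background", 1: "silver", 2: "glass",
    3: "silicon", 4: "void", 5: "interfacial void",
}
# Hypothetical palette, one RGB color per class id.
palette = np.array([
    [0, 0, 0], [192, 192, 192], [0, 128, 255],
    [128, 64, 0], [255, 0, 0], [255, 255, 0],
], dtype=np.uint8)

def colorize(pred: np.ndarray) -> np.ndarray:
    """Map an (H, W) class-index mask to an (H, W, 3) RGB image."""
    return palette[pred]

pred = np.random.randint(0, len(id2label), size=(8, 8))  # stand-in prediction
rgb = colorize(pred)
print(rgb.shape)  # (8, 8, 3)
```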
[ "background", "silver", "glass", "silicon", "void", "interfacial void" ]
heroza/segformer-finetuned-biofilm_MRCNNv1_train80_val20
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-finetuned-biofilm_MRCNNv1_train80_val20 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the heroza/biofilm_MRCNNv1_train80_val20 dataset. It achieves the following results on the evaluation set: - Loss: 0.0785 - Mean Iou: 0.8728 - Mean Accuracy: 0.9695 - Overall Accuracy: 0.9773 - Accuracy Background: 0.9788 - Accuracy Biofilm: 0.9603 - Iou Background: 0.9755 - Iou Biofilm: 0.7702 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: polynomial - num_epochs: 50.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Biofilm | Iou Background | Iou Biofilm | |:-------------:|:-----:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:----------------:|:--------------:|:-----------:| | 0.1517 | 1.0 | 280 | 0.1319 | 0.8108 | 0.8974 | 0.9664 | 0.9794 | 0.8154 | 0.9641 | 0.6575 | | 0.0728 | 2.0 | 560 | 0.0636 | 0.8857 | 0.9582 | 0.9807 | 0.9849 | 0.9314 | 0.9791 | 0.7923 | | 0.0555 | 3.0 | 840 | 0.0526 | 0.8863 | 0.9417 | 0.9815 | 0.9889 | 0.8944 | 0.9800 | 0.7925 | | 0.045 | 4.0 | 1120 | 0.0522 | 0.8763 | 0.9225 | 0.9802 | 0.9911 | 0.8539 | 0.9788 | 0.7738 | | 0.048 | 5.0 | 1400 | 0.0415 | 0.9085 | 0.9742 | 0.9848 | 0.9868 | 0.9616 | 0.9835 | 0.8335 | | 0.0425 | 6.0 | 1680 | 0.0468 | 0.9041 | 0.9790 | 0.9837 | 0.9846 | 0.9733 | 0.9824 | 0.8258 | | 0.0438 | 7.0 | 1960 | 0.0461 | 0.9001 | 0.9717 | 0.9832 | 0.9853 | 0.9581 | 0.9818 | 0.8185 | | 0.0406 | 8.0 | 2240 | 0.0488 | 0.8936 | 0.9701 | 0.9819 | 0.9841 | 0.9561 | 0.9804 | 0.8068 | | 0.0354 | 9.0 | 2520 | 0.0434 | 0.9004 | 0.9648 | 0.9835 | 0.9870 | 0.9426 | 0.9821 | 0.8187 | | 0.0411 | 10.0 | 2800 | 0.0467 | 0.9021 | 0.9680 | 0.9837 | 0.9867 | 0.9494 | 0.9824 | 0.8219 | | 0.0385 | 11.0 | 3080 | 0.0846 | 0.8493 | 0.9697 | 0.9716 | 0.9720 | 0.9674 | 0.9692 | 0.7294 | | 0.0327 | 12.0 | 3360 | 0.0574 | 0.8709 | 0.9643 | 0.9771 | 0.9795 | 0.9491 | 0.9753 | 0.7666 | | 0.0333 | 13.0 | 3640 | 0.0654 | 0.8649 | 0.9645 | 0.9757 | 0.9778 | 0.9513 | 0.9737 | 0.7561 | | 0.0356 | 14.0 | 3920 | 0.0823 | 0.8472 | 0.9703 | 0.9710 | 0.9712 | 0.9694 | 0.9686 | 0.7259 | | 0.0277 | 15.0 | 4200 | 0.0657 | 0.8634 | 0.9761 | 0.9748 | 0.9745 | 0.9777 | 0.9727 | 0.7542 | | 0.0328 | 16.0 | 4480 | 0.0575 | 0.8785 | 0.9668 | 0.9787 | 0.9810 | 0.9526 | 0.9770 | 0.7800 | | 0.0362 | 17.0 | 4760 | 0.0595 | 0.8750 | 0.9696 | 0.9778 | 0.9794 | 0.9599 | 0.9760 | 0.7741 | | 0.0301 | 18.0 | 5040 | 0.0610 | 0.8701 | 0.9755 | 0.9764 | 0.9766 | 0.9744 | 0.9744 | 0.7659 | | 0.0284 | 19.0 | 5320 | 0.0562 | 0.8874 | 0.9748 | 0.9804 | 0.9814 | 0.9682 | 0.9787 | 0.7961 | | 0.0345 | 20.0 | 5600 | 0.0612 | 0.8704 | 0.9696 | 0.9767 | 0.9781 | 0.9611 | 0.9748 | 0.7659 | | 0.0292 | 21.0 | 5880 | 0.0689 | 0.8639 | 0.9728 | 0.9751 | 0.9755 | 0.9701 | 0.9730 | 0.7549 | | 0.0287 | 22.0 | 6160 | 0.0559 | 0.8821 | 0.9657 | 0.9796 | 0.9822 | 0.9493 | 0.9779 | 
0.7862 | | 0.0307 | 23.0 | 6440 | 0.0683 | 0.8637 | 0.9716 | 0.9751 | 0.9757 | 0.9674 | 0.9730 | 0.7545 | | 0.0303 | 24.0 | 6720 | 0.0692 | 0.8702 | 0.9627 | 0.9770 | 0.9797 | 0.9457 | 0.9752 | 0.7653 | | 0.0257 | 25.0 | 7000 | 0.0594 | 0.8783 | 0.9709 | 0.9785 | 0.9799 | 0.9619 | 0.9767 | 0.7798 | | 0.0341 | 26.0 | 7280 | 0.0762 | 0.8619 | 0.9746 | 0.9745 | 0.9745 | 0.9747 | 0.9723 | 0.7515 | | 0.025 | 27.0 | 7560 | 0.0675 | 0.8696 | 0.9751 | 0.9763 | 0.9765 | 0.9736 | 0.9743 | 0.7649 | | 0.0281 | 28.0 | 7840 | 0.0661 | 0.8641 | 0.9694 | 0.9753 | 0.9764 | 0.9625 | 0.9732 | 0.7550 | | 0.0285 | 29.0 | 8120 | 0.0796 | 0.8592 | 0.9737 | 0.9739 | 0.9739 | 0.9736 | 0.9717 | 0.7467 | | 0.0263 | 30.0 | 8400 | 0.0760 | 0.8627 | 0.9712 | 0.9749 | 0.9755 | 0.9668 | 0.9728 | 0.7527 | | 0.0252 | 31.0 | 8680 | 0.0615 | 0.8800 | 0.9642 | 0.9792 | 0.9820 | 0.9464 | 0.9775 | 0.7825 | | 0.0245 | 32.0 | 8960 | 0.0647 | 0.8665 | 0.9642 | 0.9761 | 0.9783 | 0.9501 | 0.9742 | 0.7589 | | 0.0241 | 33.0 | 9240 | 0.0638 | 0.8749 | 0.9668 | 0.9779 | 0.9800 | 0.9535 | 0.9761 | 0.7736 | | 0.0249 | 34.0 | 9520 | 0.0803 | 0.8610 | 0.9709 | 0.9744 | 0.9751 | 0.9667 | 0.9723 | 0.7497 | | 0.0213 | 35.0 | 9800 | 0.0754 | 0.8687 | 0.9633 | 0.9767 | 0.9792 | 0.9474 | 0.9748 | 0.7627 | | 0.0233 | 36.0 | 10080 | 0.0675 | 0.8743 | 0.9612 | 0.9780 | 0.9812 | 0.9411 | 0.9763 | 0.7723 | | 0.0214 | 37.0 | 10360 | 0.0695 | 0.8758 | 0.9714 | 0.9779 | 0.9791 | 0.9637 | 0.9761 | 0.7755 | | 0.0231 | 38.0 | 10640 | 0.0704 | 0.8695 | 0.9621 | 0.9769 | 0.9797 | 0.9444 | 0.9750 | 0.7640 | | 0.0231 | 39.0 | 10920 | 0.0780 | 0.8636 | 0.9646 | 0.9754 | 0.9774 | 0.9518 | 0.9734 | 0.7539 | | 0.0221 | 40.0 | 11200 | 0.0726 | 0.8709 | 0.9666 | 0.9770 | 0.9790 | 0.9542 | 0.9751 | 0.7666 | | 0.0227 | 41.0 | 11480 | 0.0829 | 0.8618 | 0.9627 | 0.9751 | 0.9774 | 0.9480 | 0.9730 | 0.7505 | | 0.0241 | 42.0 | 11760 | 0.0701 | 0.8763 | 0.9679 | 0.9782 | 0.9801 | 0.9557 | 0.9764 | 0.7762 | | 0.0206 | 43.0 | 12040 | 0.0782 | 0.8666 | 0.9670 | 0.9760 | 0.9777 | 0.9563 | 0.9740 | 0.7593 | | 0.023 | 44.0 | 12320 | 0.0809 | 0.8656 | 0.9654 | 0.9758 | 0.9778 | 0.9530 | 0.9739 | 0.7573 | | 0.0223 | 45.0 | 12600 | 0.0805 | 0.8688 | 0.9660 | 0.9765 | 0.9785 | 0.9535 | 0.9746 | 0.7630 | | 0.0224 | 46.0 | 12880 | 0.0748 | 0.8719 | 0.9682 | 0.9772 | 0.9789 | 0.9576 | 0.9753 | 0.7685 | | 0.0233 | 47.0 | 13160 | 0.0796 | 0.8697 | 0.9669 | 0.9767 | 0.9786 | 0.9552 | 0.9748 | 0.7645 | | 0.019 | 48.0 | 13440 | 0.0772 | 0.8729 | 0.9681 | 0.9774 | 0.9792 | 0.9569 | 0.9756 | 0.7703 | | 0.0215 | 49.0 | 13720 | 0.0783 | 0.8720 | 0.9682 | 0.9772 | 0.9789 | 0.9575 | 0.9753 | 0.7686 | | 0.0186 | 50.0 | 14000 | 0.0785 | 0.8728 | 0.9695 | 0.9773 | 0.9788 | 0.9603 | 0.9755 | 0.7702 | ### Framework versions - Transformers 4.38.0.dev0 - Pytorch 2.0.0+cu117 - Datasets 2.14.4 - Tokenizers 0.15.1
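The 80/20 split in the dataset name can be reproduced with the `datasets` library's `train_test_split`. This is a hedged sketch: whether the published dataset already ships pre-split is not stated, so treat it as illustrative. The seed matches the one listed above:

```python
from datasets import load_dataset

ds = load_dataset("heroza/biofilm_MRCNNv1_train80_val20", split="train")
# Assumption: reproducing the 80/20 split implied by the name.
splits = ds.train_test_split(test_size=0.2, seed=1337)
train_ds, val_ds = splits["train"], splits["test"]
print(len(train_ds), len(val_ds))
```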
[ "background", "biofilm" ]
Spatiallysaying/segformer-finetuned-rwymarkings-2-steps
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-finetuned-rwymarkings-2-steps This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the Spatiallysaying/rwymarkings dataset. It achieves the following results on the evaluation set: - Loss: 2.2162 - Mean Iou: 0.0387 - Mean Accuracy: 0.1129 - Overall Accuracy: 0.1050 - Accuracy Backgound : nan - Accuracy Tdz: 0.0493 - Accuracy Aim: 0.2144 - Accuracy Desig: 0.0922 - Accuracy Rwythr: 0.1765 - Accuracy Thrbar: 0.0140 - Accuracy Disp: 0.2710 - Accuracy Chevron: 0.0023 - Accuracy Arrow: 0.0834 - Iou Backgound : 0.0 - Iou Tdz: 0.0399 - Iou Aim: 0.1158 - Iou Desig: 0.0443 - Iou Rwythr: 0.0980 - Iou Thrbar: 0.0131 - Iou Disp: 0.0266 - Iou Chevron: 0.0020 - Iou Arrow: 0.0085 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: polynomial - training_steps: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Backgound | Accuracy Tdz | Accuracy Aim | Accuracy Desig | Accuracy Rwythr | Accuracy Thrbar | Accuracy Disp | Accuracy Chevron | Accuracy Arrow | Iou Backgound | Iou Tdz | Iou Aim | Iou Desig | Iou Rwythr | Iou Thrbar | Iou Disp | Iou Chevron | Iou Arrow | |:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:--------------------:|:------------:|:------------:|:--------------:|:---------------:|:---------------:|:-------------:|:----------------:|:--------------:|:---------------:|:-------:|:-------:|:---------:|:----------:|:----------:|:--------:|:-----------:|:---------:| | 2.2475 | 0.0455 | 2 | 2.2162 | 0.0387 | 0.1129 | 0.1050 | nan | 0.0493 | 0.2144 | 0.0922 | 0.1765 | 0.0140 | 0.2710 | 0.0023 | 0.0834 | 0.0 | 0.0399 | 0.1158 | 0.0443 | 0.0980 | 0.0131 | 0.0266 | 0.0020 | 0.0085 | ### Framework versions - Transformers 4.43.0.dev0 - Pytorch 2.3.0+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
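With `training_steps: 2`, this run amounts to a smoke test (0.0455 of an epoch), which is consistent with the near-random metrics above. The listed hyperparameters map directly onto `transformers.TrainingArguments`; a sketch, with `output_dir` as a placeholder:

```python
from transformers import TrainingArguments

# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults,
# so they need no explicit arguments here
args = TrainingArguments(
    output_dir="segformer-finetuned-rwymarkings-2-steps",  # placeholder
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    lr_scheduler_type="polynomial",
    max_steps=2,
)
# `args` is then passed to transformers.Trainer together with the
# nvidia/mit-b0 model and the Spatiallysaying/rwymarkings dataset
```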
[ "_backgound_", "tdz", "aim", "desig", "rwythr", "thrbar", "disp", "chevron", "arrow" ]
mouadenna/segformer-b4-finetuned-segments-pv
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/wfveukoq) # segformer-b4-finetuned-segments-pv This model is a fine-tuned version of [nvidia/mit-b4](https://huggingface.co/nvidia/mit-b4) on the mouadenna/satellite_PV_dataset_train_test dataset. It achieves the following results on the evaluation set: - Loss: 0.0068 - Mean Iou: 0.9132 - Mean Accuracy: 0.9437 - Overall Accuracy: 0.9978 - Accuracy Unlabeled: 0.9991 - Accuracy Pv: 0.8883 - Iou Unlabeled: 0.9978 - Iou Pv: 0.8286 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 1 - eval_batch_size: 1 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Pv | Iou Unlabeled | Iou Pv | |:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-----------:|:-------------:|:------:| | 0.3649 | 0.0055 | 20 | 0.3057 | 0.6051 | 0.8096 | 0.9751 | 0.9791 | 0.6401 | 0.9749 | 0.2353 | | 0.2061 | 0.0109 | 40 | 0.0965 | 0.7760 | 0.8850 | 0.9927 | 0.9953 | 0.7746 | 0.9926 | 0.5595 | | 0.0522 | 0.0164 | 60 | 0.0797 | 0.7319 | 0.9555 | 0.9878 | 0.9886 | 0.9225 | 0.9877 | 0.4761 | | 0.0557 | 0.0218 | 80 | 0.0310 | 0.8324 | 0.8875 | 0.9954 | 0.9981 | 0.7770 | 0.9954 | 0.6694 | | 0.0624 | 0.0273 | 100 | 0.0337 | 0.8012 | 0.8240 | 0.9950 | 0.9992 | 0.6487 | 0.9949 | 0.6074 | | 0.0351 | 0.0327 | 120 | 0.0258 | 0.8310 | 0.8980 | 0.9952 | 0.9976 | 0.7984 | 0.9952 | 0.6667 | | 0.0779 | 0.0382 | 140 | 0.0235 | 0.8142 | 0.8502 | 0.9951 | 0.9987 | 0.7018 | 0.9951 | 0.6333 | | 0.0378 | 0.0436 | 160 | 0.0210 | 0.8107 | 0.8409 | 0.9951 | 0.9989 | 0.6830 | 0.9951 | 0.6264 | | 0.0272 | 0.0491 | 180 | 0.0198 | 0.8243 | 0.8494 | 0.9956 | 0.9991 | 0.6997 | 0.9955 | 0.6532 | | 0.0085 | 0.0546 | 200 | 0.0203 | 0.8300 | 0.8701 | 0.9955 | 0.9986 | 0.7417 | 0.9955 | 0.6644 | | 0.008 | 0.0600 | 220 | 0.0178 | 0.8482 | 0.9149 | 0.9957 | 0.9977 | 0.8322 | 0.9957 | 0.7007 | | 0.008 | 0.0655 | 240 | 0.0205 | 0.8323 | 0.9299 | 0.9949 | 0.9965 | 0.8633 | 0.9948 | 0.6697 | | 0.0099 | 0.0709 | 260 | 0.0173 | 0.8340 | 0.8587 | 0.9958 | 0.9992 | 0.7181 | 0.9958 | 0.6723 | | 0.0243 | 0.0764 | 280 | 0.0160 | 0.8504 | 0.9401 | 0.9956 | 0.9969 | 0.8832 | 0.9955 | 0.7052 | | 0.0222 | 0.0818 | 300 | 0.0292 | 0.6962 | 0.7012 | 0.9928 | 0.9999 | 0.4025 | 0.9927 | 0.3997 | | 0.0062 | 0.0873 | 320 | 0.0155 | 0.8374 | 0.8632 | 0.9959 | 0.9991 | 0.7272 | 0.9958 | 0.6790 | | 0.0049 | 0.0927 | 340 | 0.0143 | 0.8614 | 0.9190 | 0.9962 | 0.9981 | 0.8398 | 0.9962 | 0.7267 | | 0.0059 | 0.0982 | 360 | 0.0135 | 0.8632 | 0.9312 | 0.9962 | 0.9978 | 0.8647 | 0.9961 | 0.7302 | | 0.0267 | 0.1037 | 380 | 0.0129 | 0.8630 | 0.9181 | 0.9963 | 0.9982 | 0.8380 | 0.9962 | 0.7297 | | 0.0042 | 0.1091 | 400 | 0.0128 | 0.8557 | 0.8964 | 0.9962 | 0.9987 | 0.7941 | 0.9962 | 0.7151 | | 
0.0037 | 0.1146 | 420 | 0.0110 | 0.8741 | 0.9227 | 0.9966 | 0.9985 | 0.8469 | 0.9966 | 0.7516 | | 0.0066 | 0.1200 | 440 | 0.0151 | 0.8154 | 0.8268 | 0.9955 | 0.9996 | 0.6540 | 0.9955 | 0.6353 | | 0.0041 | 0.1255 | 460 | 0.0141 | 0.8472 | 0.8798 | 0.9961 | 0.9989 | 0.7607 | 0.9960 | 0.6984 | | 0.0498 | 0.1309 | 480 | 0.0140 | 0.8544 | 0.9296 | 0.9958 | 0.9975 | 0.8616 | 0.9958 | 0.7130 | | 0.0459 | 0.1364 | 500 | 0.0145 | 0.8553 | 0.9430 | 0.9958 | 0.9970 | 0.8890 | 0.9957 | 0.7150 | | 0.0072 | 0.1418 | 520 | 0.0133 | 0.8531 | 0.8805 | 0.9963 | 0.9991 | 0.7619 | 0.9962 | 0.7100 | | 0.0075 | 0.1473 | 540 | 0.0113 | 0.8724 | 0.9209 | 0.9966 | 0.9985 | 0.8433 | 0.9966 | 0.7483 | | 0.008 | 0.1528 | 560 | 0.0125 | 0.8555 | 0.8833 | 0.9963 | 0.9991 | 0.7675 | 0.9963 | 0.7147 | | 0.0038 | 0.1582 | 580 | 0.0119 | 0.8692 | 0.9090 | 0.9966 | 0.9987 | 0.8193 | 0.9966 | 0.7419 | | 0.0079 | 0.1637 | 600 | 0.0122 | 0.8730 | 0.9472 | 0.9964 | 0.9976 | 0.8967 | 0.9964 | 0.7496 | | 0.0044 | 0.1691 | 620 | 0.0120 | 0.8669 | 0.9100 | 0.9965 | 0.9986 | 0.8214 | 0.9965 | 0.7374 | | 0.0044 | 0.1746 | 640 | 0.0121 | 0.8637 | 0.8958 | 0.9965 | 0.9990 | 0.7925 | 0.9965 | 0.7310 | | 0.0032 | 0.1800 | 660 | 0.0113 | 0.8660 | 0.9016 | 0.9965 | 0.9989 | 0.8044 | 0.9965 | 0.7355 | | 0.003 | 0.1855 | 680 | 0.0112 | 0.8670 | 0.9234 | 0.9964 | 0.9982 | 0.8486 | 0.9963 | 0.7376 | | 0.0051 | 0.1909 | 700 | 0.0116 | 0.8489 | 0.8733 | 0.9962 | 0.9992 | 0.7475 | 0.9962 | 0.7016 | | 0.0096 | 0.1964 | 720 | 0.0159 | 0.7907 | 0.8008 | 0.9949 | 0.9997 | 0.6020 | 0.9949 | 0.5866 | | 0.0104 | 0.2019 | 740 | 0.0114 | 0.8656 | 0.9360 | 0.9962 | 0.9977 | 0.8743 | 0.9962 | 0.7351 | | 0.0072 | 0.2073 | 760 | 0.0112 | 0.8729 | 0.9488 | 0.9964 | 0.9976 | 0.9001 | 0.9964 | 0.7495 | | 0.0037 | 0.2128 | 780 | 0.0106 | 0.8632 | 0.8876 | 0.9966 | 0.9992 | 0.7759 | 0.9965 | 0.7299 | | 0.0024 | 0.2182 | 800 | 0.0104 | 0.8679 | 0.8914 | 0.9967 | 0.9993 | 0.7835 | 0.9967 | 0.7392 | | 0.0028 | 0.2237 | 820 | 0.0109 | 0.8735 | 0.9525 | 0.9964 | 0.9975 | 0.9076 | 0.9964 | 0.7507 | | 0.1398 | 0.2291 | 840 | 0.0100 | 0.8761 | 0.9080 | 0.9968 | 0.9990 | 0.8170 | 0.9968 | 0.7555 | | 0.0018 | 0.2346 | 860 | 0.0105 | 0.8699 | 0.9063 | 0.9966 | 0.9989 | 0.8137 | 0.9966 | 0.7433 | | 0.0112 | 0.2400 | 880 | 0.0116 | 0.8705 | 0.9604 | 0.9962 | 0.9971 | 0.9237 | 0.9962 | 0.7448 | | 0.0103 | 0.2455 | 900 | 0.0099 | 0.8765 | 0.9084 | 0.9968 | 0.9990 | 0.8177 | 0.9968 | 0.7563 | | 0.0034 | 0.2510 | 920 | 0.0106 | 0.8707 | 0.9019 | 0.9967 | 0.9990 | 0.8049 | 0.9967 | 0.7448 | | 0.0052 | 0.2564 | 940 | 0.0103 | 0.8788 | 0.9133 | 0.9969 | 0.9989 | 0.8276 | 0.9969 | 0.7608 | | 0.001 | 0.2619 | 960 | 0.0102 | 0.8842 | 0.9298 | 0.9969 | 0.9986 | 0.8609 | 0.9969 | 0.7715 | | 0.0058 | 0.2673 | 980 | 0.0102 | 0.8772 | 0.9033 | 0.9969 | 0.9992 | 0.8074 | 0.9969 | 0.7576 | | 0.0049 | 0.2728 | 1000 | 0.0110 | 0.8744 | 0.9070 | 0.9968 | 0.9990 | 0.8151 | 0.9967 | 0.7520 | | 0.0108 | 0.2782 | 1020 | 0.0109 | 0.8736 | 0.9504 | 0.9964 | 0.9975 | 0.9033 | 0.9964 | 0.7508 | | 0.009 | 0.2837 | 1040 | 0.0106 | 0.8686 | 0.9066 | 0.9966 | 0.9988 | 0.8145 | 0.9966 | 0.7406 | | 0.003 | 0.2891 | 1060 | 0.0120 | 0.8493 | 0.8748 | 0.9962 | 0.9992 | 0.7505 | 0.9962 | 0.7024 | | 0.0042 | 0.2946 | 1080 | 0.0109 | 0.8746 | 0.9340 | 0.9966 | 0.9981 | 0.8699 | 0.9965 | 0.7526 | | 0.0008 | 0.3001 | 1100 | 0.0110 | 0.8717 | 0.9050 | 0.9967 | 0.9990 | 0.8110 | 0.9967 | 0.7467 | | 0.0054 | 0.3055 | 1120 | 0.0114 | 0.8623 | 0.8855 | 0.9965 | 0.9993 | 0.7717 | 0.9965 | 0.7281 | | 0.0054 | 0.3110 | 1140 | 
0.0096 | 0.8812 | 0.9230 | 0.9969 | 0.9987 | 0.8472 | 0.9969 | 0.7655 | | 0.004 | 0.3164 | 1160 | 0.0091 | 0.8859 | 0.9289 | 0.9970 | 0.9987 | 0.8591 | 0.9970 | 0.7749 | | 0.0139 | 0.3219 | 1180 | 0.0095 | 0.8827 | 0.9392 | 0.9968 | 0.9982 | 0.8802 | 0.9968 | 0.7686 | | 0.0229 | 0.3273 | 1200 | 0.0100 | 0.8755 | 0.9485 | 0.9965 | 0.9977 | 0.8994 | 0.9965 | 0.7546 | | 0.0071 | 0.3328 | 1220 | 0.0099 | 0.8804 | 0.9457 | 0.9967 | 0.9979 | 0.8934 | 0.9967 | 0.7641 | | 0.0009 | 0.3382 | 1240 | 0.0103 | 0.8639 | 0.8943 | 0.9965 | 0.9990 | 0.7896 | 0.9965 | 0.7313 | | 0.0038 | 0.3437 | 1260 | 0.0105 | 0.8687 | 0.9439 | 0.9963 | 0.9976 | 0.8902 | 0.9962 | 0.7411 | | 0.0012 | 0.3492 | 1280 | 0.0098 | 0.8718 | 0.8951 | 0.9968 | 0.9993 | 0.7908 | 0.9968 | 0.7468 | | 0.002 | 0.3546 | 1300 | 0.0093 | 0.8828 | 0.9440 | 0.9968 | 0.9981 | 0.8900 | 0.9968 | 0.7689 | | 0.0032 | 0.3601 | 1320 | 0.0087 | 0.8886 | 0.9332 | 0.9971 | 0.9986 | 0.8678 | 0.9970 | 0.7801 | | 0.006 | 0.3655 | 1340 | 0.0093 | 0.8862 | 0.9606 | 0.9968 | 0.9977 | 0.9235 | 0.9968 | 0.7756 | | 0.0056 | 0.3710 | 1360 | 0.0090 | 0.8878 | 0.9117 | 0.9972 | 0.9993 | 0.8240 | 0.9972 | 0.7785 | | 0.0115 | 0.3764 | 1380 | 0.0092 | 0.8939 | 0.9252 | 0.9973 | 0.9991 | 0.8514 | 0.9973 | 0.7905 | | 0.0055 | 0.3819 | 1400 | 0.0090 | 0.8928 | 0.9618 | 0.9970 | 0.9979 | 0.9258 | 0.9970 | 0.7887 | | 0.0088 | 0.3873 | 1420 | 0.0089 | 0.8894 | 0.9365 | 0.9971 | 0.9986 | 0.8743 | 0.9970 | 0.7817 | | 0.0043 | 0.3928 | 1440 | 0.0092 | 0.8902 | 0.9450 | 0.9970 | 0.9983 | 0.8917 | 0.9970 | 0.7834 | | 0.0007 | 0.3983 | 1460 | 0.0102 | 0.8736 | 0.8933 | 0.9969 | 0.9994 | 0.7872 | 0.9968 | 0.7504 | | 0.0007 | 0.4037 | 1480 | 0.0088 | 0.8960 | 0.9295 | 0.9973 | 0.9990 | 0.8599 | 0.9973 | 0.7947 | | 0.0091 | 0.4092 | 1500 | 0.0084 | 0.8995 | 0.9440 | 0.9974 | 0.9987 | 0.8894 | 0.9973 | 0.8016 | | 0.0009 | 0.4146 | 1520 | 0.0079 | 0.9000 | 0.9471 | 0.9974 | 0.9986 | 0.8957 | 0.9973 | 0.8026 | | 0.0028 | 0.4201 | 1540 | 0.0092 | 0.8832 | 0.9025 | 0.9971 | 0.9994 | 0.8057 | 0.9971 | 0.7693 | | 0.0076 | 0.4255 | 1560 | 0.0108 | 0.8692 | 0.9655 | 0.9961 | 0.9969 | 0.9342 | 0.9961 | 0.7423 | | 0.0012 | 0.4310 | 1580 | 0.0087 | 0.8883 | 0.9177 | 0.9972 | 0.9991 | 0.8364 | 0.9971 | 0.7795 | | 0.0032 | 0.4364 | 1600 | 0.0084 | 0.8976 | 0.9441 | 0.9973 | 0.9986 | 0.8896 | 0.9973 | 0.7979 | | 0.0091 | 0.4419 | 1620 | 0.0086 | 0.8940 | 0.9398 | 0.9972 | 0.9986 | 0.8809 | 0.9972 | 0.7908 | | 0.0051 | 0.4474 | 1640 | 0.0095 | 0.8787 | 0.8991 | 0.9970 | 0.9994 | 0.7987 | 0.9970 | 0.7605 | | 0.0775 | 0.4528 | 1660 | 0.0088 | 0.8927 | 0.9279 | 0.9972 | 0.9989 | 0.8568 | 0.9972 | 0.7882 | | 0.0021 | 0.4583 | 1680 | 0.0084 | 0.8976 | 0.9490 | 0.9973 | 0.9985 | 0.8995 | 0.9972 | 0.7979 | | 0.0031 | 0.4637 | 1700 | 0.0086 | 0.8988 | 0.9459 | 0.9973 | 0.9986 | 0.8933 | 0.9973 | 0.8003 | | 0.0033 | 0.4692 | 1720 | 0.0084 | 0.9005 | 0.9382 | 0.9974 | 0.9989 | 0.8776 | 0.9974 | 0.8035 | | 0.0035 | 0.4746 | 1740 | 0.0085 | 0.8938 | 0.9573 | 0.9971 | 0.9981 | 0.9165 | 0.9971 | 0.7905 | | 0.0029 | 0.4801 | 1760 | 0.0086 | 0.8895 | 0.9425 | 0.9970 | 0.9984 | 0.8867 | 0.9970 | 0.7821 | | 0.0037 | 0.4855 | 1780 | 0.0080 | 0.9011 | 0.9480 | 0.9974 | 0.9986 | 0.8974 | 0.9974 | 0.8047 | | 0.0048 | 0.4910 | 1800 | 0.0085 | 0.9038 | 0.9393 | 0.9975 | 0.9990 | 0.8797 | 0.9975 | 0.8100 | | 0.0043 | 0.4965 | 1820 | 0.0086 | 0.9016 | 0.9427 | 0.9974 | 0.9988 | 0.8865 | 0.9974 | 0.8057 | | 0.0024 | 0.5019 | 1840 | 0.0089 | 0.8969 | 0.9309 | 0.9974 | 0.9990 | 0.8627 | 0.9973 | 0.7964 | | 0.0082 | 0.5074 | 
1860 | 0.0099 | 0.8850 | 0.9134 | 0.9971 | 0.9991 | 0.8276 | 0.9971 | 0.7730 | | 0.0042 | 0.5128 | 1880 | 0.0092 | 0.8916 | 0.9309 | 0.9972 | 0.9988 | 0.8630 | 0.9972 | 0.7860 | | 0.0032 | 0.5183 | 1900 | 0.0087 | 0.8946 | 0.9341 | 0.9973 | 0.9988 | 0.8693 | 0.9972 | 0.7919 | | 0.0056 | 0.5237 | 1920 | 0.0102 | 0.8805 | 0.9160 | 0.9969 | 0.9989 | 0.8330 | 0.9969 | 0.7641 | | 0.0054 | 0.5292 | 1940 | 0.0118 | 0.8665 | 0.8985 | 0.9966 | 0.9990 | 0.7980 | 0.9965 | 0.7365 | | 0.0052 | 0.5346 | 1960 | 0.0092 | 0.8936 | 0.9331 | 0.9972 | 0.9988 | 0.8673 | 0.9972 | 0.7900 | | 0.0009 | 0.5401 | 1980 | 0.0092 | 0.8916 | 0.9389 | 0.9971 | 0.9986 | 0.8792 | 0.9971 | 0.7860 | | 0.0035 | 0.5456 | 2000 | 0.0088 | 0.8946 | 0.9285 | 0.9973 | 0.9990 | 0.8581 | 0.9973 | 0.7919 | | 0.0026 | 0.5510 | 2020 | 0.0078 | 0.9015 | 0.9381 | 0.9975 | 0.9989 | 0.8772 | 0.9974 | 0.8056 | | 0.0048 | 0.5565 | 2040 | 0.0082 | 0.8901 | 0.9118 | 0.9973 | 0.9994 | 0.8243 | 0.9972 | 0.7829 | | 0.0027 | 0.5619 | 2060 | 0.0080 | 0.9012 | 0.9390 | 0.9975 | 0.9989 | 0.8790 | 0.9974 | 0.8050 | | 0.0291 | 0.5674 | 2080 | 0.0076 | 0.9062 | 0.9403 | 0.9976 | 0.9990 | 0.8815 | 0.9976 | 0.8149 | | 0.0023 | 0.5728 | 2100 | 0.0075 | 0.9072 | 0.9530 | 0.9976 | 0.9987 | 0.9073 | 0.9975 | 0.8168 | | 0.0023 | 0.5783 | 2120 | 0.0081 | 0.8997 | 0.9312 | 0.9974 | 0.9991 | 0.8632 | 0.9974 | 0.8021 | | 0.0004 | 0.5837 | 2140 | 0.0081 | 0.9009 | 0.9554 | 0.9973 | 0.9984 | 0.9125 | 0.9973 | 0.8044 | | 0.0037 | 0.5892 | 2160 | 0.0078 | 0.8985 | 0.9304 | 0.9974 | 0.9991 | 0.8616 | 0.9974 | 0.7997 | | 0.075 | 0.5947 | 2180 | 0.0075 | 0.9043 | 0.9475 | 0.9975 | 0.9987 | 0.8962 | 0.9975 | 0.8111 | | 0.002 | 0.6001 | 2200 | 0.0088 | 0.8889 | 0.9613 | 0.9969 | 0.9978 | 0.9248 | 0.9969 | 0.7809 | | 0.0043 | 0.6056 | 2220 | 0.0075 | 0.9010 | 0.9402 | 0.9974 | 0.9988 | 0.8815 | 0.9974 | 0.8046 | | 0.0009 | 0.6110 | 2240 | 0.0077 | 0.9010 | 0.9470 | 0.9974 | 0.9986 | 0.8953 | 0.9974 | 0.8046 | | 0.0044 | 0.6165 | 2260 | 0.0077 | 0.8951 | 0.9175 | 0.9974 | 0.9993 | 0.8357 | 0.9974 | 0.7929 | | 0.0009 | 0.6219 | 2280 | 0.0074 | 0.9010 | 0.9277 | 0.9975 | 0.9992 | 0.8561 | 0.9975 | 0.8045 | | 0.0023 | 0.6274 | 2300 | 0.0075 | 0.9004 | 0.9259 | 0.9975 | 0.9993 | 0.8526 | 0.9975 | 0.8033 | | 0.0036 | 0.6328 | 2320 | 0.0076 | 0.9037 | 0.9391 | 0.9975 | 0.9990 | 0.8793 | 0.9975 | 0.8100 | | 0.0043 | 0.6383 | 2340 | 0.0073 | 0.9048 | 0.9449 | 0.9975 | 0.9988 | 0.8910 | 0.9975 | 0.8121 | | 0.0065 | 0.6438 | 2360 | 0.0081 | 0.8935 | 0.9181 | 0.9973 | 0.9993 | 0.8369 | 0.9973 | 0.7898 | | 0.0019 | 0.6492 | 2380 | 0.0079 | 0.8993 | 0.9333 | 0.9974 | 0.9990 | 0.8676 | 0.9974 | 0.8011 | | 0.0004 | 0.6547 | 2400 | 0.0079 | 0.8991 | 0.9294 | 0.9974 | 0.9991 | 0.8597 | 0.9974 | 0.8008 | | 0.0009 | 0.6601 | 2420 | 0.0078 | 0.9005 | 0.9355 | 0.9974 | 0.9990 | 0.8720 | 0.9974 | 0.8037 | | 0.0075 | 0.6656 | 2440 | 0.0081 | 0.8931 | 0.9210 | 0.9973 | 0.9992 | 0.8428 | 0.9973 | 0.7889 | | 0.0018 | 0.6710 | 2460 | 0.0079 | 0.8969 | 0.9336 | 0.9973 | 0.9989 | 0.8683 | 0.9973 | 0.7966 | | 0.0082 | 0.6765 | 2480 | 0.0078 | 0.8990 | 0.9364 | 0.9974 | 0.9989 | 0.8740 | 0.9974 | 0.8006 | | 0.0013 | 0.6819 | 2500 | 0.0075 | 0.9020 | 0.9405 | 0.9975 | 0.9989 | 0.8821 | 0.9974 | 0.8066 | | 0.0007 | 0.6874 | 2520 | 0.0074 | 0.9053 | 0.9436 | 0.9976 | 0.9989 | 0.8884 | 0.9975 | 0.8131 | | 0.0037 | 0.6929 | 2540 | 0.0075 | 0.9054 | 0.9503 | 0.9975 | 0.9987 | 0.9020 | 0.9975 | 0.8134 | | 0.0007 | 0.6983 | 2560 | 0.0077 | 0.9028 | 0.9459 | 0.9975 | 0.9987 | 0.8930 | 0.9974 | 0.8081 | | 0.0031 | 
0.7038 | 2580 | 0.0079 | 0.9021 | 0.9521 | 0.9974 | 0.9985 | 0.9057 | 0.9974 | 0.8068 | | 0.0037 | 0.7092 | 2600 | 0.0075 | 0.9048 | 0.9488 | 0.9975 | 0.9987 | 0.8988 | 0.9975 | 0.8121 | | 0.0012 | 0.7147 | 2620 | 0.0074 | 0.9048 | 0.9418 | 0.9975 | 0.9989 | 0.8846 | 0.9975 | 0.8121 | | 0.0027 | 0.7201 | 2640 | 0.0075 | 0.9061 | 0.9531 | 0.9975 | 0.9986 | 0.9075 | 0.9975 | 0.8147 | | 0.0048 | 0.7256 | 2660 | 0.0082 | 0.8992 | 0.9591 | 0.9973 | 0.9982 | 0.9200 | 0.9972 | 0.8012 | | 0.0012 | 0.7310 | 2680 | 0.0080 | 0.8923 | 0.9378 | 0.9972 | 0.9986 | 0.8770 | 0.9971 | 0.7874 | | 0.002 | 0.7365 | 2700 | 0.0083 | 0.8908 | 0.9451 | 0.9971 | 0.9983 | 0.8918 | 0.9970 | 0.7846 | | 0.0175 | 0.7420 | 2720 | 0.0081 | 0.8961 | 0.9572 | 0.9972 | 0.9982 | 0.9163 | 0.9971 | 0.7951 | | 0.0095 | 0.7474 | 2740 | 0.0074 | 0.9014 | 0.9350 | 0.9975 | 0.9990 | 0.8709 | 0.9975 | 0.8054 | | 0.0026 | 0.7529 | 2760 | 0.0073 | 0.9027 | 0.9376 | 0.9975 | 0.9990 | 0.8761 | 0.9975 | 0.8080 | | 0.0003 | 0.7583 | 2780 | 0.0074 | 0.9034 | 0.9545 | 0.9974 | 0.9985 | 0.9105 | 0.9974 | 0.8093 | | 0.0027 | 0.7638 | 2800 | 0.0074 | 0.9032 | 0.9471 | 0.9975 | 0.9987 | 0.8954 | 0.9974 | 0.8090 | | 0.0071 | 0.7692 | 2820 | 0.0074 | 0.9040 | 0.9520 | 0.9975 | 0.9986 | 0.9053 | 0.9974 | 0.8107 | | 0.004 | 0.7747 | 2840 | 0.0072 | 0.9061 | 0.9514 | 0.9975 | 0.9987 | 0.9041 | 0.9975 | 0.8147 | | 0.0043 | 0.7801 | 2860 | 0.0073 | 0.9026 | 0.9297 | 0.9975 | 0.9992 | 0.8603 | 0.9975 | 0.8077 | | 0.0048 | 0.7856 | 2880 | 0.0075 | 0.9028 | 0.9417 | 0.9975 | 0.9989 | 0.8846 | 0.9975 | 0.8080 | | 0.0022 | 0.7911 | 2900 | 0.0079 | 0.8974 | 0.9263 | 0.9974 | 0.9991 | 0.8535 | 0.9974 | 0.7975 | | 0.0035 | 0.7965 | 2920 | 0.0071 | 0.9073 | 0.9447 | 0.9976 | 0.9989 | 0.8904 | 0.9976 | 0.8170 | | 0.0029 | 0.8020 | 2940 | 0.0072 | 0.9033 | 0.9299 | 0.9976 | 0.9992 | 0.8605 | 0.9975 | 0.8091 | | 0.0042 | 0.8074 | 2960 | 0.0072 | 0.9079 | 0.9491 | 0.9976 | 0.9988 | 0.8995 | 0.9976 | 0.8183 | | 0.0034 | 0.8129 | 2980 | 0.0074 | 0.9071 | 0.9456 | 0.9976 | 0.9989 | 0.8922 | 0.9976 | 0.8166 | | 0.0006 | 0.8183 | 3000 | 0.0073 | 0.9085 | 0.9417 | 0.9977 | 0.9990 | 0.8843 | 0.9976 | 0.8194 | | 0.0004 | 0.8238 | 3020 | 0.0073 | 0.9086 | 0.9390 | 0.9977 | 0.9991 | 0.8788 | 0.9977 | 0.8195 | | 0.0042 | 0.8292 | 3040 | 0.0075 | 0.9081 | 0.9469 | 0.9976 | 0.9989 | 0.8949 | 0.9976 | 0.8187 | | 0.0003 | 0.8347 | 3060 | 0.0073 | 0.9092 | 0.9454 | 0.9977 | 0.9990 | 0.8918 | 0.9976 | 0.8208 | | 0.0008 | 0.8402 | 3080 | 0.0072 | 0.9111 | 0.9457 | 0.9977 | 0.9990 | 0.8923 | 0.9977 | 0.8245 | | 0.0031 | 0.8456 | 3100 | 0.0074 | 0.9078 | 0.9359 | 0.9977 | 0.9992 | 0.8726 | 0.9976 | 0.8179 | | 0.0099 | 0.8511 | 3120 | 0.0077 | 0.9056 | 0.9335 | 0.9976 | 0.9992 | 0.8678 | 0.9976 | 0.8136 | | 0.0003 | 0.8565 | 3140 | 0.0075 | 0.9086 | 0.9439 | 0.9977 | 0.9990 | 0.8888 | 0.9976 | 0.8196 | | 0.0047 | 0.8620 | 3160 | 0.0072 | 0.9094 | 0.9381 | 0.9977 | 0.9992 | 0.8770 | 0.9977 | 0.8211 | | 0.0057 | 0.8674 | 3180 | 0.0071 | 0.9098 | 0.9377 | 0.9977 | 0.9992 | 0.8763 | 0.9977 | 0.8218 | | 0.0033 | 0.8729 | 3200 | 0.0070 | 0.9101 | 0.9389 | 0.9977 | 0.9992 | 0.8787 | 0.9977 | 0.8224 | | 0.0009 | 0.8783 | 3220 | 0.0068 | 0.9129 | 0.9482 | 0.9978 | 0.9990 | 0.8974 | 0.9977 | 0.8281 | | 0.0047 | 0.8838 | 3240 | 0.0069 | 0.9128 | 0.9580 | 0.9977 | 0.9987 | 0.9174 | 0.9977 | 0.8278 | | 0.0021 | 0.8893 | 3260 | 0.0069 | 0.9133 | 0.9555 | 0.9977 | 0.9988 | 0.9121 | 0.9977 | 0.8289 | | 0.0006 | 0.8947 | 3280 | 0.0069 | 0.9118 | 0.9417 | 0.9978 | 0.9991 | 0.8844 | 0.9977 | 0.8258 | | 
0.0003 | 0.9002 | 3300 | 0.0069 | 0.9107 | 0.9390 | 0.9977 | 0.9992 | 0.8788 | 0.9977 | 0.8237 | | 0.0025 | 0.9056 | 3320 | 0.0069 | 0.9122 | 0.9557 | 0.9977 | 0.9987 | 0.9127 | 0.9977 | 0.8267 | | 0.0003 | 0.9111 | 3340 | 0.0069 | 0.9131 | 0.9535 | 0.9977 | 0.9988 | 0.9082 | 0.9977 | 0.8285 | | 0.0034 | 0.9165 | 3360 | 0.0068 | 0.9148 | 0.9522 | 0.9978 | 0.9989 | 0.9054 | 0.9978 | 0.8319 | | 0.0043 | 0.9220 | 3380 | 0.0069 | 0.9140 | 0.9571 | 0.9978 | 0.9988 | 0.9155 | 0.9977 | 0.8303 | | 0.0031 | 0.9274 | 3400 | 0.0068 | 0.9138 | 0.9557 | 0.9978 | 0.9988 | 0.9127 | 0.9977 | 0.8300 | | 0.0063 | 0.9329 | 3420 | 0.0068 | 0.9139 | 0.9543 | 0.9978 | 0.9988 | 0.9098 | 0.9977 | 0.8300 | | 0.0049 | 0.9384 | 3440 | 0.0068 | 0.9139 | 0.9527 | 0.9978 | 0.9989 | 0.9066 | 0.9978 | 0.8300 | | 0.0092 | 0.9438 | 3460 | 0.0068 | 0.9142 | 0.9519 | 0.9978 | 0.9989 | 0.9050 | 0.9978 | 0.8306 | | 0.0033 | 0.9493 | 3480 | 0.0068 | 0.9133 | 0.9460 | 0.9978 | 0.9991 | 0.8929 | 0.9978 | 0.8289 | | 0.0048 | 0.9547 | 3500 | 0.0068 | 0.9141 | 0.9493 | 0.9978 | 0.9990 | 0.8996 | 0.9978 | 0.8304 | | 0.0037 | 0.9602 | 3520 | 0.0068 | 0.9136 | 0.9463 | 0.9978 | 0.9991 | 0.8936 | 0.9978 | 0.8294 | | 0.0037 | 0.9656 | 3540 | 0.0068 | 0.9124 | 0.9422 | 0.9978 | 0.9991 | 0.8852 | 0.9978 | 0.8271 | | 0.0022 | 0.9711 | 3560 | 0.0069 | 0.9102 | 0.9365 | 0.9977 | 0.9992 | 0.8738 | 0.9977 | 0.8227 | | 0.0066 | 0.9765 | 3580 | 0.0068 | 0.9117 | 0.9397 | 0.9978 | 0.9992 | 0.8803 | 0.9977 | 0.8256 | | 0.0044 | 0.9820 | 3600 | 0.0068 | 0.9132 | 0.9438 | 0.9978 | 0.9991 | 0.8885 | 0.9978 | 0.8285 | | 0.0066 | 0.9875 | 3620 | 0.0068 | 0.9143 | 0.9483 | 0.9978 | 0.9990 | 0.8975 | 0.9978 | 0.8309 | | 0.0007 | 0.9929 | 3640 | 0.0068 | 0.9143 | 0.9481 | 0.9978 | 0.9990 | 0.8971 | 0.9978 | 0.8308 | | 0.0046 | 0.9984 | 3660 | 0.0068 | 0.9132 | 0.9437 | 0.9978 | 0.9991 | 0.8883 | 0.9978 | 0.8286 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
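A common downstream use of this checkpoint is estimating installed PV area from the predicted mask. A hedged sketch follows; the ground sampling distance `GSD_M` and the input filename are assumed placeholders, and label 1 is taken to be `pv` per the label list below.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

ckpt = "mouadenna/segformer-b4-finetuned-segments-pv"
processor = SegformerImageProcessor.from_pretrained(ckpt)
model = SegformerForSemanticSegmentation.from_pretrained(ckpt).eval()

tile = Image.open("satellite_tile.png").convert("RGB")  # placeholder input
with torch.inference_mode():
    outputs = model(**processor(images=tile, return_tensors="pt"))

seg = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[tile.size[::-1]]
)[0]

GSD_M = 0.1  # assumed ground sampling distance, metres per pixel
pv_pixels = int((seg == 1).sum())  # label 1 = "pv"
print(f"estimated PV area: {pv_pixels * GSD_M ** 2:.1f} m^2")
```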
[ "unlabeled", "pv" ]
mouadenna/Mask2former-base-finetuned-segments-pv
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/uncategorized/runs/illf24pn) [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/uncategorized/runs/n27i0b9w) [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/uncategorized/runs/dtnssy3k) # Mask2former-base-finetuned-segments-pv This model is a fine-tuned version of [facebook/mask2former-swin-base-IN21k-ade-semantic](https://huggingface.co/facebook/mask2former-swin-base-IN21k-ade-semantic) on the mouadenna/satellite_PV_dataset_train_test dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 1 - eval_batch_size: 1 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
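Unlike the SegFormer cards above, this checkpoint is a Mask2Former, whose head predicts mask/class query pairs rather than a dense logit map; the image processor folds them back into a per-pixel segmentation. A minimal inference sketch (the input filename is a placeholder):

```python
import torch
from PIL import Image
from transformers import (
    Mask2FormerImageProcessor,
    Mask2FormerForUniversalSegmentation,
)

ckpt = "mouadenna/Mask2former-base-finetuned-segments-pv"
processor = Mask2FormerImageProcessor.from_pretrained(ckpt)
model = Mask2FormerForUniversalSegmentation.from_pretrained(ckpt).eval()

tile = Image.open("satellite_tile.png").convert("RGB")  # placeholder input
inputs = processor(images=tile, return_tensors="pt")
with torch.inference_mode():
    outputs = model(**inputs)

# combine the predicted query masks and classes into an (H, W) label map
seg = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[tile.size[::-1]]
)[0]
```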
[ "unlabeled", "pv" ]
Spatiallysaying/segformer-finetuned-obb-1k-steps
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-finetuned-obb-1k-steps This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the Spatiallysaying/obb dataset. It achieves the following results on the evaluation set: - Loss: 0.0511 - Mean Iou: 0.2238 - Mean Accuracy: 0.4477 - Overall Accuracy: 0.4477 - Accuracy Backgound : nan - Accuracy Rwy Obb: 0.4477 - Iou Backgound : 0.0 - Iou Rwy Obb: 0.4477 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: polynomial - training_steps: 1000 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Backgound | Accuracy Rwy Obb | Iou Backgound | Iou Rwy Obb | |:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:--------------------:|:----------------:|:---------------:|:-----------:| | 0.3927 | 1.0 | 173 | 0.1096 | 0.1590 | 0.3180 | 0.3180 | nan | 0.3180 | 0.0 | 0.3180 | | 0.0969 | 2.0 | 346 | 0.0704 | 0.2112 | 0.4224 | 0.4224 | nan | 0.4224 | 0.0 | 0.4224 | | 0.0651 | 3.0 | 519 | 0.0598 | 0.2186 | 0.4371 | 0.4371 | nan | 0.4371 | 0.0 | 0.4371 | | 0.0576 | 4.0 | 692 | 0.0530 | 0.2250 | 0.4500 | 0.4500 | nan | 0.4500 | 0.0 | 0.4500 | | 0.0531 | 5.0 | 865 | 0.0529 | 0.2212 | 0.4424 | 0.4424 | nan | 0.4424 | 0.0 | 0.4424 | | 0.0467 | 5.7803 | 1000 | 0.0511 | 0.2238 | 0.4477 | 0.4477 | nan | 0.4477 | 0.0 | 0.4477 | ### Framework versions - Transformers 4.43.0.dev0 - Pytorch 2.3.0+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
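The nan background accuracy and 0.0 background IoU above are the signature of a reference set that contains no (or only ignored) background pixels: per-class accuracy divides by the class's reference pixel count (0/0 = nan), while IoU divides by the union (0/n = 0). Assuming the run used the standard `evaluate` `mean_iou` metric, as these auto-generated cards typically do, the pattern can be reproduced on a toy example:

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# toy 2x2 maps, labels: 0 = _backgound_, 1 = rwy_obb;
# the reference contains no background pixels
pred = np.array([[0, 1], [1, 1]])
ref = np.array([[1, 1], [1, 1]])

results = metric.compute(
    predictions=[pred],
    references=[ref],
    num_labels=2,
    ignore_index=255,
    reduce_labels=False,
)
print(results["per_category_accuracy"])  # [nan, 0.75]
print(results["per_category_iou"])       # [0.0, 0.75]
```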
[ "_backgound_", "rwy_obb" ]
Spatiallysaying/segformer-finetuned-rwymarkings-3k-steps
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-finetuned-rwymarkings-3k-steps This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the Spatiallysaying/rwymarkings dataset. It achieves the following results on the evaluation set: - Loss: 0.0182 - Mean Iou: 0.0441 - Mean Accuracy: 0.0510 - Overall Accuracy: 0.0800 - Accuracy Backgound : nan - Accuracy Tdz: 0.0908 - Accuracy Aim: 0.2203 - Accuracy Desig: 0.0 - Accuracy Rwythr: 0.0971 - Accuracy Thrbar: 0.0 - Accuracy Disp: 0.0 - Accuracy Chevron: 0.0 - Accuracy Arrow: 0.0 - Iou Backgound : 0.0 - Iou Tdz: 0.0818 - Iou Aim: 0.2189 - Iou Desig: 0.0 - Iou Rwythr: 0.0958 - Iou Thrbar: 0.0 - Iou Disp: 0.0 - Iou Chevron: 0.0 - Iou Arrow: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: polynomial - training_steps: 3000 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Backgound | Accuracy Tdz | Accuracy Aim | Accuracy Desig | Accuracy Rwythr | Accuracy Thrbar | Accuracy Disp | Accuracy Chevron | Accuracy Arrow | Iou Backgound | Iou Tdz | Iou Aim | Iou Desig | Iou Rwythr | Iou Thrbar | Iou Disp | Iou Chevron | Iou Arrow | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:--------------------:|:------------:|:------------:|:--------------:|:---------------:|:---------------:|:-------------:|:----------------:|:--------------:|:---------------:|:-------:|:-------:|:---------:|:----------:|:----------:|:--------:|:-----------:|:---------:| | 1.6294 | 1.0 | 173 | 0.5448 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.3371 | 2.0 | 346 | 0.1107 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.0724 | 3.0 | 519 | 0.0483 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.0508 | 4.0 | 692 | 0.0331 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.0369 | 5.0 | 865 | 0.0289 | 0.0002 | 0.0002 | 0.0004 | nan | 0.0 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.0272 | 6.0 | 1038 | 0.0276 | 0.0106 | 0.0120 | 0.0195 | nan | 0.0107 | 0.0853 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0105 | 0.0845 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.0258 | 7.0 | 1211 | 0.0233 | 0.0066 | 0.0075 | 0.0122 | nan | 0.0118 | 0.0480 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0117 | 0.0480 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.0235 | 8.0 | 1384 | 0.0221 | 0.0150 | 0.0171 | 0.0277 | nan | 0.0233 | 0.1108 | 0.0 | 0.0024 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0224 | 0.1107 | 0.0 | 0.0024 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.0213 | 9.0 | 1557 | 0.0209 | 0.0177 | 0.0200 | 0.0326 | nan | 0.0237 | 0.1351 | 0.0 | 0.0016 | 0.0 | 
0.0 | 0.0 | 0.0 | 0.0 | 0.0231 | 0.1346 | 0.0 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.0201 | 10.0 | 1730 | 0.0206 | 0.0277 | 0.0318 | 0.0512 | nan | 0.0595 | 0.1734 | 0.0 | 0.0211 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0559 | 0.1726 | 0.0 | 0.0211 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.0203 | 11.0 | 1903 | 0.0198 | 0.0246 | 0.0281 | 0.0450 | nan | 0.0463 | 0.1512 | 0.0 | 0.0277 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0432 | 0.1505 | 0.0 | 0.0277 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.0172 | 12.0 | 2076 | 0.0192 | 0.0377 | 0.0435 | 0.0690 | nan | 0.0744 | 0.2145 | 0.0 | 0.0592 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0680 | 0.2119 | 0.0 | 0.0589 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.0168 | 13.0 | 2249 | 0.0189 | 0.0331 | 0.0381 | 0.0607 | nan | 0.0704 | 0.1884 | 0.0 | 0.0462 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0645 | 0.1876 | 0.0 | 0.0461 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.0169 | 14.0 | 2422 | 0.0185 | 0.0383 | 0.0442 | 0.0701 | nan | 0.0786 | 0.2124 | 0.0 | 0.0628 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0716 | 0.2112 | 0.0 | 0.0623 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.0172 | 15.0 | 2595 | 0.0184 | 0.0476 | 0.0551 | 0.0864 | nan | 0.0917 | 0.2463 | 0.0 | 0.1028 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0830 | 0.2443 | 0.0 | 0.1013 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.0159 | 16.0 | 2768 | 0.0182 | 0.0523 | 0.0615 | 0.0964 | nan | 0.1202 | 0.2493 | 0.0 | 0.1225 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1044 | 0.2468 | 0.0 | 0.1199 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.0163 | 17.0 | 2941 | 0.0181 | 0.0492 | 0.0571 | 0.0892 | nan | 0.0987 | 0.2414 | 0.0 | 0.1167 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0885 | 0.2397 | 0.0 | 0.1146 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.0152 | 17.3410 | 3000 | 0.0182 | 0.0441 | 0.0510 | 0.0800 | nan | 0.0908 | 0.2203 | 0.0 | 0.0971 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0818 | 0.2189 | 0.0 | 0.0958 | 0.0 | 0.0 | 0.0 | 0.0 | ### Framework versions - Transformers 4.43.0.dev0 - Pytorch 2.3.0+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
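After 3000 steps only `tdz`, `aim`, and `rwythr` reach non-zero IoU; the remaining classes are never predicted. When inspecting runs like this, a per-class colour overlay makes the failure mode obvious at a glance. A minimal sketch with a stand-in prediction map (in practice `seg` comes from `post_process_semantic_segmentation`):

```python
import numpy as np
import matplotlib.pyplot as plt

labels = ["_backgound_", "tdz", "aim", "desig", "rwythr",
          "thrbar", "disp", "chevron", "arrow"]

# fixed pseudo-random palette, one colour per class
rng = np.random.default_rng(0)
palette = rng.integers(0, 256, size=(len(labels), 3), dtype=np.uint8)

# stand-in (H, W) prediction map
seg = rng.integers(0, len(labels), size=(128, 128))

plt.imshow(palette[seg])  # (H, W, 3) uint8 colour image
plt.axis("off")
plt.show()
```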
[ "_backgound_", "tdz", "aim", "desig", "rwythr", "thrbar", "disp", "chevron", "arrow" ]
Hasano20/SegFormer_mit-b5_Clean-Set3-Grayscale_Augmented_Medium
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # SegFormer_mit-b5_Clean-Set3-Grayscale_Augmented_Medium This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on an unknown dataset. It achieves the following results on the evaluation set: - Train-Loss: 0.0088 - Val-Loss: 0.0134 - Mean Iou: 0.9793 - Mean Accuracy: 0.9903 - Overall Accuracy: 0.9947 - Accuracy Background: 0.9971 - Accuracy Melt: 0.9785 - Accuracy Substrate: 0.9952 - Iou Background: 0.9935 - Iou Melt: 0.9524 - Iou Substrate: 0.9920 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 100 - num_epochs: 25 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate | |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:| | 0.1305 | 0.3968 | 50 | 0.1020 | 0.8694 | 0.9199 | 0.9644 | 0.9855 | 0.8016 | 0.9726 | 0.9651 | 0.6989 | 0.9443 | | 0.0906 | 0.7937 | 100 | 0.0668 | 0.8972 | 0.9187 | 0.9757 | 0.9891 | 0.7703 | 0.9968 | 0.9818 | 0.7488 | 0.9609 | | 0.0409 | 1.1905 | 150 | 0.0606 | 0.9231 | 0.9414 | 0.9814 | 0.9879 | 0.8379 | 0.9984 | 0.9840 | 0.8152 | 0.9702 | | 0.0678 | 1.5873 | 200 | 0.0344 | 0.9524 | 0.9762 | 0.9879 | 0.9883 | 0.9463 | 0.9941 | 0.9848 | 0.8890 | 0.9834 | | 0.0312 | 1.9841 | 250 | 0.0340 | 0.9489 | 0.9756 | 0.9874 | 0.9935 | 0.9442 | 0.9892 | 0.9869 | 0.8779 | 0.9818 | | 0.0334 | 2.3810 | 300 | 0.0277 | 0.9576 | 0.9826 | 0.9895 | 0.9956 | 0.9637 | 0.9885 | 0.9908 | 0.8987 | 0.9833 | | 0.0286 | 2.7778 | 350 | 0.0264 | 0.9581 | 0.9776 | 0.9898 | 0.9964 | 0.9452 | 0.9912 | 0.9896 | 0.9002 | 0.9846 | | 0.0214 | 3.1746 | 400 | 0.0230 | 0.9661 | 0.9824 | 0.9915 | 0.9926 | 0.9587 | 0.9958 | 0.9903 | 0.9206 | 0.9875 | | 0.0208 | 3.5714 | 450 | 0.0203 | 0.9692 | 0.9876 | 0.9922 | 0.9968 | 0.9751 | 0.9910 | 0.9916 | 0.9283 | 0.9878 | | 0.0146 | 3.9683 | 500 | 0.0231 | 0.9667 | 0.9852 | 0.9915 | 0.9961 | 0.9680 | 0.9913 | 0.9904 | 0.9229 | 0.9870 | | 0.0197 | 4.3651 | 550 | 0.0208 | 0.9662 | 0.9883 | 0.9916 | 0.9950 | 0.9790 | 0.9908 | 0.9914 | 0.9200 | 0.9873 | | 0.0198 | 4.7619 | 600 | 0.0184 | 0.9722 | 0.9836 | 0.9930 | 0.9969 | 0.9587 | 0.9951 | 0.9916 | 0.9355 | 0.9896 | | 0.019 | 5.1587 | 650 | 0.0211 | 0.9693 | 0.9889 | 0.9919 | 0.9970 | 0.9801 | 0.9896 | 0.9907 | 0.9298 | 0.9872 | | 0.0115 | 5.5556 | 700 | 0.0193 | 0.9706 | 0.9833 | 0.9928 | 0.9963 | 0.9584 | 0.9953 | 0.9926 | 0.9304 | 0.9888 | | 0.0135 | 5.9524 | 750 | 0.0166 | 0.9740 | 0.9867 | 0.9933 | 0.9965 | 0.9692 | 0.9945 | 0.9919 | 0.9401 | 0.9899 | | 0.0127 | 6.3492 | 800 | 0.0182 | 0.9736 | 0.9866 | 0.9932 | 0.9969 | 0.9689 | 0.9939 | 0.9918 | 0.9395 | 0.9895 | | 0.0129 | 6.7460 | 850 | 0.0194 | 0.9723 | 0.9853 | 0.9930 | 0.9958 | 0.9651 | 0.9951 | 0.9920 | 0.9354 | 0.9894 | | 
0.0124 | 7.1429 | 900 | 0.0145 | 0.9771 | 0.9900 | 0.9941 | 0.9972 | 0.9789 | 0.9940 | 0.9928 | 0.9472 | 0.9911 | | 0.011 | 7.5397 | 950 | 0.0149 | 0.9774 | 0.9876 | 0.9941 | 0.9972 | 0.9704 | 0.9953 | 0.9923 | 0.9485 | 0.9914 | | 0.0176 | 7.9365 | 1000 | 0.0212 | 0.9681 | 0.9890 | 0.9919 | 0.9972 | 0.9802 | 0.9895 | 0.9923 | 0.9251 | 0.9869 | | 0.0205 | 8.3333 | 1050 | 0.0171 | 0.9724 | 0.9895 | 0.9930 | 0.9971 | 0.9797 | 0.9918 | 0.9924 | 0.9356 | 0.9893 | | 0.0103 | 8.7302 | 1100 | 0.0141 | 0.9780 | 0.9891 | 0.9943 | 0.9968 | 0.9754 | 0.9953 | 0.9928 | 0.9497 | 0.9915 | | 0.0093 | 9.1270 | 1150 | 0.0148 | 0.9769 | 0.9881 | 0.9941 | 0.9965 | 0.9723 | 0.9956 | 0.9930 | 0.9466 | 0.9911 | | 0.0113 | 9.5238 | 1200 | 0.0136 | 0.9788 | 0.9881 | 0.9945 | 0.9977 | 0.9711 | 0.9955 | 0.9929 | 0.9517 | 0.9918 | | 0.0132 | 9.9206 | 1250 | 0.0144 | 0.9783 | 0.9882 | 0.9944 | 0.9971 | 0.9720 | 0.9957 | 0.9930 | 0.9503 | 0.9915 | | 0.0104 | 10.3175 | 1300 | 0.0135 | 0.9788 | 0.9882 | 0.9945 | 0.9976 | 0.9714 | 0.9957 | 0.9932 | 0.9515 | 0.9918 | | 0.0153 | 10.7143 | 1350 | 0.0129 | 0.9796 | 0.9889 | 0.9947 | 0.9970 | 0.9734 | 0.9962 | 0.9932 | 0.9534 | 0.9922 | | 0.0091 | 11.1111 | 1400 | 0.0142 | 0.9783 | 0.9900 | 0.9944 | 0.9968 | 0.9784 | 0.9950 | 0.9931 | 0.9500 | 0.9917 | | 0.0098 | 11.5079 | 1450 | 0.0139 | 0.9789 | 0.9889 | 0.9946 | 0.9967 | 0.9740 | 0.9962 | 0.9933 | 0.9516 | 0.9920 | | 0.0094 | 11.9048 | 1500 | 0.0136 | 0.9795 | 0.9887 | 0.9947 | 0.9977 | 0.9730 | 0.9956 | 0.9931 | 0.9533 | 0.9920 | | 0.0088 | 12.3016 | 1550 | 0.0134 | 0.9793 | 0.9903 | 0.9947 | 0.9971 | 0.9785 | 0.9952 | 0.9935 | 0.9524 | 0.9920 | ### Framework versions - Transformers 4.41.2 - Pytorch 2.0.1+cu117 - Datasets 2.19.2 - Tokenizers 0.19.1
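The training set is grayscale, but MiT backbones expect 3-channel input, so single-channel images should be replicated to RGB before preprocessing. A sketch, assuming the checkpoint is published under the heading's name (hypothetical Hub id) and with a placeholder input filename:

```python
from PIL import Image
from transformers import SegformerImageProcessor

processor = SegformerImageProcessor.from_pretrained(
    "Hasano20/SegFormer_mit-b5_Clean-Set3-Grayscale_Augmented_Medium"  # assumed id
)

img = Image.open("melt_pool.png")  # placeholder input
if img.mode != "RGB":              # "L" (grayscale) -> 3 identical channels
    img = img.convert("RGB")

inputs = processor(images=img, return_tensors="pt")
print(inputs["pixel_values"].shape)  # typically (1, 3, 512, 512)
```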
[ "background", "melt", "substrate" ]
cmarkea/dit-base-layout-detection
# DIT-base-layout-detection

We present cmarkea/dit-base-layout-detection, a model that extracts layout entities (Text, Picture, Caption, Footnote, etc.) from an image of a document. It is a fine-tuned version of [dit-base](https://huggingface.co/microsoft/dit-base) on the [DocLayNet](https://huggingface.co/datasets/ds4sd/DocLayNet) dataset, and is well suited to preprocessing documentary corpora for ingestion into an ODQA system. The model detects 11 entities: Caption, Footnote, Formula, List-item, Page-footer, Page-header, Picture, Section-header, Table, Text, and Title.

## Performance

In this section, we assess the model's performance on semantic segmentation and object detection separately. No post-processing was applied for semantic segmentation; for object detection, we only applied OpenCV's `findContours`, with no further post-processing. Semantic segmentation is evaluated with the per-pixel F1-score; object detection is evaluated with the Generalized Intersection over Union (GIoU) and the accuracy of the predicted bounding-box class. The evaluation is conducted on 500 pages from the PDF evaluation set of DocLayNet.

| Class | f1-score (x100) | GIoU (x100) | accuracy (x100) |
|:--------------:|:---------------:|:-----------:|:---------------:|
| Background | 94.98 | NA | NA |
| Caption | 75.54 | 55.61 | 72.62 |
| Footnote | 72.29 | 50.08 | 70.97 |
| Formula | 82.29 | 49.91 | 94.48 |
| List-item | 67.56 | 35.19 | 69 |
| Page-footer | 83.93 | 57.99 | 94.06 |
| Page-header | 62.33 | 65.25 | 79.39 |
| Picture | 78.32 | 58.22 | 92.71 |
| Section-header | 69.55 | 56.64 | 78.29 |
| Table | 83.69 | 63.03 | 90.13 |
| Text | 90.94 | 51.89 | 88.09 |
| Title | 61.19 | 52.64 | 70 |

## Benchmark

Now, let's compare this model's performance with that of other models.

| Model | f1-score (x100) | GIoU (x100) | accuracy (x100) |
|:---------------------------------------------------------------------------------------------:|:---------------:|:-----------:|:---------------:|
| cmarkea/dit-base-layout-detection | 90.77 | 56.29 | 85.26 |
| [cmarkea/detr-layout-detection](https://huggingface.co/cmarkea/detr-layout-detection) | 91.27 | 80.66 | 90.46 |

### Direct Use

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, BeitForSemanticSegmentation

img_proc = AutoImageProcessor.from_pretrained(
    "cmarkea/dit-base-layout-detection"
)
model = BeitForSemanticSegmentation.from_pretrained(
    "cmarkea/dit-base-layout-detection"
)

img = Image.open("document_page.png")  # any PIL image of a document page

with torch.inference_mode():
    input_ids = img_proc(img, return_tensors='pt')
    output = model(**input_ids)
    segmentation = img_proc.post_process_semantic_segmentation(
        output,
        target_sizes=[img.size[::-1]]
    )
```

Here is a simple method for detecting bounding boxes from the semantic segmentation. It is the method used to compute the model's object-detection performance in the "Performance" section, provided without any additional post-processing.
```python
import cv2
import numpy as np


def detect_bboxes(masks: np.ndarray):
    r"""
    A simple bounding box detection function
    """
    detected_blocks = []
    contours, _ = cv2.findContours(
        masks.astype(np.uint8),
        cv2.RETR_EXTERNAL,
        cv2.CHAIN_APPROX_SIMPLE
    )
    for contour in list(contours):
        if len(list(contour)) >= 4:
            # smallest rectangle containing all points
            x, y, width, height = cv2.boundingRect(contour)
            bounding_box = [x, y, x + width, y + height]
            detected_blocks.append(bounding_box)
    return detected_blocks


# `segmentation` and `model` come from the "Direct Use" snippet above
bbox_pred = []
for segment in segmentation:
    boxes, labels = [], []
    for ii in range(1, len(model.config.label2id)):
        mm = segment == ii
        if mm.sum() > 0:
            bbx = detect_bboxes(mm.numpy())
            boxes.extend(bbx)
            labels.extend([ii] * len(bbx))
    bbox_pred.append(dict(boxes=boxes, labels=labels))
```

### Example

![example](https://i.postimg.cc/rFXswV59/dit1.png)

### Citation

```
@online{DeDitLay,
  AUTHOR = {Cyrile Delestre},
  URL = {https://huggingface.co/cmarkea/dit-base-layout-detection},
  YEAR = {2024},
  KEYWORDS = {Image Processing ; Transformers ; Layout},
}
```
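As a usage addendum (not part of the original card): the predicted boxes can be drawn back onto the page for a quick visual check. This sketch reuses `img`, `model`, and `bbox_pred` from the two snippets above; the output filename is a placeholder.

```python
import cv2
import numpy as np

# draw the predicted boxes and class names on a BGR copy of the page
canvas = cv2.cvtColor(np.array(img), cv2.COLOR_RGB2BGR)
for box, label in zip(bbox_pred[0]["boxes"], bbox_pred[0]["labels"]):
    x0, y0, x1, y1 = box
    cv2.rectangle(canvas, (x0, y0), (x1, y1), (0, 0, 255), 2)
    cv2.putText(canvas, model.config.id2label[label], (x0, max(y0 - 4, 0)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 255), 1)
cv2.imwrite("layout_boxes.png", canvas)
```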
[ "background", "caption", "footnote", "formula", "list-item", "page-footer", "page-header", "picture", "section-header", "table", "text", "title" ]
mouadenna/segformer-b1-finetuned-segments-pv
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/lyupah0l) # segformer-b1-finetuned-segments-pv This model is a fine-tuned version of [nvidia/segformer-b1-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b1-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test dataset. It achieves the following results on the evaluation set: - Loss: 0.0192 - Mean Iou: 0.8631 - Precision: 0.9304 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 1 - eval_batch_size: 1 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:-----:|:------:|:---------------:|:--------:|:---------:| | 0.0051 | 1.0 | 3666 | 0.0084 | 0.8064 | 0.8864 | | 0.0038 | 2.0 | 7332 | 0.0115 | 0.7607 | 0.9338 | | 0.0001 | 3.0 | 10998 | 0.0089 | 0.8145 | 0.9115 | | 0.0 | 4.0 | 14664 | 0.0078 | 0.8317 | 0.9063 | | 0.0 | 5.0 | 18330 | 0.0093 | 0.8078 | 0.9244 | | 0.0017 | 6.0 | 21996 | 0.0080 | 0.8370 | 0.9203 | | 0.0019 | 7.0 | 25662 | 0.0085 | 0.8395 | 0.9163 | | 0.0001 | 8.0 | 29328 | 0.0099 | 0.8379 | 0.8931 | | 0.0019 | 9.0 | 32994 | 0.0100 | 0.8388 | 0.9225 | | 0.0048 | 10.0 | 36660 | 0.0103 | 0.8422 | 0.9035 | | 0.0238 | 11.0 | 40326 | 0.0132 | 0.8378 | 0.9169 | | 0.0 | 12.0 | 43992 | 0.0093 | 0.8509 | 0.9254 | | 0.0017 | 13.0 | 47658 | 0.0116 | 0.8417 | 0.9243 | | 0.0014 | 14.0 | 51324 | 0.0127 | 0.8348 | 0.9017 | | 0.0031 | 15.0 | 54990 | 0.0123 | 0.8463 | 0.9299 | | 0.0016 | 16.0 | 58656 | 0.0109 | 0.8439 | 0.9062 | | 0.0091 | 17.0 | 62322 | 0.0199 | 0.8143 | 0.9344 | | 0.0017 | 18.0 | 65988 | 0.0155 | 0.8326 | 0.9184 | | 0.0 | 19.0 | 69654 | 0.0128 | 0.8351 | 0.8971 | | 0.0013 | 20.0 | 73320 | 0.0135 | 0.8360 | 0.8970 | | 0.0015 | 21.0 | 76986 | 0.0151 | 0.8466 | 0.9055 | | 0.0011 | 22.0 | 80652 | 0.0136 | 0.8525 | 0.9117 | | 0.0016 | 23.0 | 84318 | 0.0129 | 0.8478 | 0.9052 | | 0.0007 | 24.0 | 87984 | 0.0189 | 0.8422 | 0.9422 | | 0.0012 | 25.0 | 91650 | 0.0134 | 0.8435 | 0.9070 | | 0.0012 | 26.0 | 95316 | 0.0152 | 0.8532 | 0.9243 | | 0.0028 | 27.0 | 98982 | 0.0145 | 0.8521 | 0.9273 | | 0.0023 | 28.0 | 102648 | 0.0156 | 0.8566 | 0.9288 | | 0.0 | 29.0 | 106314 | 0.0176 | 0.8494 | 0.9222 | | 0.0 | 30.0 | 109980 | 0.0156 | 0.8542 | 0.9282 | | 0.0 | 31.0 | 113646 | 0.0158 | 0.8578 | 0.9273 | | 0.0012 | 32.0 | 117312 | 0.0171 | 0.8560 | 0.9258 | | 0.0005 | 33.0 | 120978 | 0.0146 | 0.8534 | 0.9149 | | 0.0016 | 34.0 | 124644 | 0.0199 | 0.8519 | 0.9250 | | 0.0015 | 35.0 | 128310 | 0.0164 | 0.8559 | 0.9181 | | 0.0005 | 36.0 | 131976 | 0.0164 | 0.8551 | 0.9176 | | 0.0014 | 37.0 | 135642 | 0.0172 | 0.8594 | 0.9263 | | 0.0008 | 38.0 | 139308 | 0.0178 | 0.8601 | 0.9273 | | 0.0 | 39.0 | 142974 | 0.0153 | 0.8601 | 0.9281 | | 0.0 | 40.0 | 146640 | 0.0165 | 0.8632 | 0.9324 | | 0.0 | 41.0 | 150306 | 0.0172 | 0.8624 | 0.9328 | | 0.0002 | 42.0 | 153972 | 0.0201 | 0.8590 | 
0.9303 | | 0.0033 | 43.0 | 157638 | 0.0180 | 0.8611 | 0.9347 | | 0.0 | 44.0 | 161304 | 0.0155 | 0.8620 | 0.9283 | | 0.0011 | 45.0 | 164970 | 0.0174 | 0.8624 | 0.9277 | | 0.0004 | 46.0 | 168636 | 0.0192 | 0.8612 | 0.9316 | | 0.0 | 47.0 | 172302 | 0.0185 | 0.8612 | 0.9232 | | 0.0007 | 48.0 | 175968 | 0.0173 | 0.8623 | 0.9247 | | 0.0007 | 49.0 | 179634 | 0.0196 | 0.8628 | 0.9295 | | 0.0003 | 50.0 | 183300 | 0.0192 | 0.8631 | 0.9304 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
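This card reports only Mean IoU and Precision. For the binary PV task the per-class definitions reduce to simple pixel counts; a self-contained sketch follows (reading "Precision" as pixel precision for the PV class is an assumption):

```python
import numpy as np

def pv_precision_iou(pred: np.ndarray, ref: np.ndarray):
    """Pixel precision and IoU for the 'pv' class (label 1)."""
    tp = np.logical_and(pred == 1, ref == 1).sum()
    fp = np.logical_and(pred == 1, ref == 0).sum()
    fn = np.logical_and(pred == 0, ref == 1).sum()
    precision = tp / (tp + fp) if tp + fp else 0.0
    iou = tp / (tp + fp + fn) if tp + fp + fn else 0.0
    return precision, iou

pred = np.array([[1, 1], [0, 0]])
ref = np.array([[1, 0], [0, 0]])
print(pv_precision_iou(pred, ref))  # (0.5, 0.5)
```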
[ "unlabeled", "pv" ]
mouadenna/segformer-b1-finetuned-segments-pv-augx3
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/zskynm6z) # segformer-b1-finetuned-segments-pv-augx3 This model is a fine-tuned version of [mouadenna/segformer-b1-finetuned-segments-pv](https://huggingface.co/mouadenna/segformer-b1-finetuned-segments-pv) on the mouadenna/satellite_PV_dataset_train_test dataset. It achieves the following results on the evaluation set: - Loss: 0.0157 - Mean Iou: 0.8465 - Precision: 0.9113 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 1 - eval_batch_size: 1 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 20 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:-----:|:------:|:---------------:|:--------:|:---------:| | 0.0038 | 1.0 | 10998 | 0.0083 | 0.8036 | 0.8867 | | 0.0001 | 2.0 | 21996 | 0.0073 | 0.8363 | 0.8992 | | 0.003 | 3.0 | 32994 | 0.0089 | 0.8086 | 0.8877 | | 0.0044 | 4.0 | 43992 | 0.0098 | 0.8356 | 0.9120 | | 0.0085 | 5.0 | 54990 | 0.0103 | 0.8363 | 0.8955 | | 0.0025 | 6.0 | 65988 | 0.0097 | 0.8295 | 0.9055 | | 0.0 | 7.0 | 76986 | 0.0135 | 0.8148 | 0.9072 | | 0.0 | 8.0 | 87984 | 0.0106 | 0.8300 | 0.9011 | | 0.0033 | 9.0 | 98982 | 0.0133 | 0.8357 | 0.9160 | | 0.0022 | 10.0 | 109980 | 0.0128 | 0.8407 | 0.9197 | | 0.0019 | 11.0 | 120978 | 0.0150 | 0.8455 | 0.9136 | | 0.0017 | 12.0 | 131976 | 0.0115 | 0.8430 | 0.9191 | | 0.0024 | 13.0 | 142974 | 0.0126 | 0.8470 | 0.9171 | | 0.0029 | 14.0 | 153972 | 0.0155 | 0.8385 | 0.9127 | | 0.0024 | 15.0 | 164970 | 0.0145 | 0.8430 | 0.9085 | | 0.0029 | 16.0 | 175968 | 0.0148 | 0.8408 | 0.8980 | | 0.0038 | 17.0 | 186966 | 0.0168 | 0.8471 | 0.9170 | | 0.002 | 18.0 | 197964 | 0.0154 | 0.8437 | 0.9115 | | 0.0024 | 19.0 | 208962 | 0.0167 | 0.8458 | 0.9276 | | 0.0017 | 20.0 | 219960 | 0.0157 | 0.8465 | 0.9113 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
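The `-augx3` suffix suggests each training tile was tripled via augmentation; the actual pipeline is not documented. Purely as an illustration (this is not the authors' recipe), a geometry-only pipeline that keeps image and mask aligned could look like this, using `albumentations`:

```python
import numpy as np
import albumentations as A

# hypothetical geometry-only pipeline; flips/rotations keep the
# PV mask aligned with the tile
transform = A.Compose([
    A.HorizontalFlip(p=0.5),
    A.VerticalFlip(p=0.5),
    A.RandomRotate90(p=0.5),
])

image = np.zeros((512, 512, 3), dtype=np.uint8)  # stand-in tile
mask = np.zeros((512, 512), dtype=np.uint8)      # stand-in PV mask

# emit three augmented copies per tile, matching the "x3" in the name
pairs = [
    (out["image"], out["mask"])
    for out in (transform(image=image, mask=mask) for _ in range(3))
]
```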
[ "unlabeled", "pv" ]
mouadenna/segformer-b0-finetuned-segments-pv
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/oanxw61g) # segformer-b0-finetuned-segments-pv This model is a fine-tuned version of [nvidia/segformer-b0-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b0-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test dataset. It achieves the following results on the evaluation set: - Loss: 0.0224 - Mean Iou: 0.8462 - Precision: 0.9229 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 1 - eval_batch_size: 1 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 40 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:-----:|:------:|:---------------:|:--------:|:---------:| | 0.0043 | 1.0 | 3666 | 0.0095 | 0.7784 | 0.8863 | | 0.0036 | 2.0 | 7332 | 0.0082 | 0.8127 | 0.8991 | | 0.0004 | 3.0 | 10998 | 0.0085 | 0.7946 | 0.8844 | | 0.0 | 4.0 | 14664 | 0.0082 | 0.8313 | 0.9130 | | 0.0 | 5.0 | 18330 | 0.0089 | 0.8147 | 0.9092 | | 0.002 | 6.0 | 21996 | 0.0117 | 0.8121 | 0.9275 | | 0.0017 | 7.0 | 25662 | 0.0105 | 0.7984 | 0.8629 | | 0.0 | 8.0 | 29328 | 0.0108 | 0.8169 | 0.8889 | | 0.0029 | 9.0 | 32994 | 0.0133 | 0.8224 | 0.9096 | | 0.006 | 10.0 | 36660 | 0.0106 | 0.8280 | 0.8829 | | 0.026 | 11.0 | 40326 | 0.0102 | 0.8501 | 0.9210 | | 0.0 | 12.0 | 43992 | 0.0118 | 0.8339 | 0.9022 | | 0.0019 | 13.0 | 47658 | 0.0139 | 0.8360 | 0.9103 | | 0.0018 | 14.0 | 51324 | 0.0140 | 0.8332 | 0.9161 | | 0.0039 | 15.0 | 54990 | 0.0129 | 0.8297 | 0.9012 | | 0.0025 | 16.0 | 58656 | 0.0166 | 0.8368 | 0.9030 | | 0.0073 | 17.0 | 62322 | 0.0148 | 0.8334 | 0.8950 | | 0.0017 | 18.0 | 65988 | 0.0157 | 0.8451 | 0.9166 | | 0.0 | 19.0 | 69654 | 0.0184 | 0.8129 | 0.9161 | | 0.0013 | 20.0 | 73320 | 0.0162 | 0.8333 | 0.9042 | | 0.0014 | 21.0 | 76986 | 0.0167 | 0.8470 | 0.9178 | | 0.0015 | 22.0 | 80652 | 0.0147 | 0.8429 | 0.9114 | | 0.0015 | 23.0 | 84318 | 0.0149 | 0.8458 | 0.8978 | | 0.0009 | 24.0 | 87984 | 0.0158 | 0.8416 | 0.9072 | | 0.0014 | 25.0 | 91650 | 0.0144 | 0.8457 | 0.9185 | | 0.0013 | 26.0 | 95316 | 0.0164 | 0.8482 | 0.9212 | | 0.0043 | 27.0 | 98982 | 0.0162 | 0.8400 | 0.9005 | | 0.0024 | 28.0 | 102648 | 0.0203 | 0.8468 | 0.9217 | | 0.0 | 29.0 | 106314 | 0.0192 | 0.8431 | 0.9142 | | 0.0 | 30.0 | 109980 | 0.0181 | 0.8477 | 0.9203 | | 0.0 | 31.0 | 113646 | 0.0179 | 0.8484 | 0.9177 | | 0.001 | 32.0 | 117312 | 0.0170 | 0.8485 | 0.9104 | | 0.0007 | 33.0 | 120978 | 0.0184 | 0.8471 | 0.9113 | | 0.0013 | 34.0 | 124644 | 0.0193 | 0.8487 | 0.9209 | | 0.0016 | 35.0 | 128310 | 0.0169 | 0.8491 | 0.9182 | | 0.0005 | 36.0 | 131976 | 0.0180 | 0.8476 | 0.9167 | | 0.0016 | 37.0 | 135642 | 0.0212 | 0.8478 | 0.9239 | | 0.0014 | 38.0 | 139308 | 0.0211 | 0.8455 | 0.9164 | | 0.0 | 39.0 | 142974 | 0.0203 | 0.8468 | 0.9211 | | 0.0 | 40.0 | 146640 | 0.0224 | 0.8462 | 0.9229 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
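These checkpoints are trained on fixed-size tiles, so scenes larger than one tile are usually segmented patchwise. A minimal non-overlapping tiling sketch follows; the 512-pixel tile size and the lack of overlap blending are simplifying assumptions (overlapping tiles with blending reduce seam artifacts).

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

ckpt = "mouadenna/segformer-b0-finetuned-segments-pv"
processor = SegformerImageProcessor.from_pretrained(ckpt)
model = SegformerForSemanticSegmentation.from_pretrained(ckpt).eval()

scene = Image.open("large_scene.png").convert("RGB")  # placeholder input
TILE = 512  # assumed training tile size
mask = torch.zeros(scene.height, scene.width, dtype=torch.long)

for top in range(0, scene.height, TILE):
    for left in range(0, scene.width, TILE):
        box = (left, top,
               min(left + TILE, scene.width),
               min(top + TILE, scene.height))
        tile = scene.crop(box)
        with torch.inference_mode():
            out = model(**processor(images=tile, return_tensors="pt"))
        seg = processor.post_process_semantic_segmentation(
            out, target_sizes=[tile.size[::-1]]
        )[0]
        mask[box[1]:box[3], box[0]:box[2]] = seg
```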
[ "unlabeled", "pv" ]
mouadenna/segformer-b1-finetuned-segments-pv_v1_normalized
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/yfvyerdp) # segformer-b1-finetuned-segments-pv_v1_normalized This model is a fine-tuned version of [nvidia/segformer-b1-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b1-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 1 - eval_batch_size: 1 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 40 ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
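The `_normalized` suffix presumably refers to channel-wise normalization with dataset statistics; the card does not state the values. For illustration only, overriding the processor's normalization looks like this — the mean/std below are placeholders, not the values used in this run:

```python
from transformers import SegformerImageProcessor

processor = SegformerImageProcessor.from_pretrained(
    "nvidia/segformer-b1-finetuned-ade-512-512",
    do_normalize=True,
    image_mean=[0.35, 0.36, 0.33],  # hypothetical dataset mean
    image_std=[0.19, 0.18, 0.18],   # hypothetical dataset std
)
```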
[ "unlabeled", "pv" ]
mouadenna/segformer-b1-finetuned-segments-pv_v1_normalized_t4_16batch
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/gspnha68) # segformer-b1-finetuned-segments-pv_v1_normalized_t4_16batch This model is a fine-tuned version of [nvidia/segformer-b3-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b3-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset. It achieves the following results on the evaluation set: - Loss: nan - Mean Iou: 0.0 - Precision: 1.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0004 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 40 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:-------:|:----:|:---------------:|:--------:|:---------:| | 0.0056 | 0.9989 | 229 | 0.0099 | 0.7568 | 0.7827 | | 0.0092 | 1.9978 | 458 | 0.0113 | 0.7247 | 0.7360 | | 0.0069 | 2.9967 | 687 | 0.0051 | 0.8440 | 0.9106 | | 0.0159 | 4.0 | 917 | 0.0057 | 0.8317 | 0.8635 | | 0.0124 | 4.9989 | 1146 | 0.0099 | 0.8223 | 0.9292 | | 0.0033 | 5.9978 | 1375 | 0.2415 | 0.0 | 1.0 | | 0.0 | 6.9967 | 1604 | nan | 0.0 | 1.0 | | 0.0 | 8.0 | 1834 | nan | 0.0 | 1.0 | | 0.0 | 8.9989 | 2063 | nan | 0.0 | 1.0 | | 0.0 | 9.9978 | 2292 | nan | 0.0 | 1.0 | | 0.0 | 10.9967 | 2521 | nan | 0.0 | 1.0 | | 0.0 | 12.0 | 2751 | nan | 0.0 | 1.0 | | 0.0 | 12.9989 | 2980 | nan | 0.0 | 1.0 | | 0.0 | 13.9978 | 3209 | nan | 0.0 | 1.0 | | 0.0 | 14.9967 | 3438 | nan | 0.0 | 1.0 | | 0.0 | 16.0 | 3668 | nan | 0.0 | 1.0 | | 0.0 | 16.9989 | 3897 | nan | 0.0 | 1.0 | | 0.0 | 17.9978 | 4126 | nan | 0.0 | 1.0 | | 0.0 | 18.9967 | 4355 | nan | 0.0 | 1.0 | | 0.0 | 20.0 | 4585 | nan | 0.0 | 1.0 | | 0.0 | 20.9989 | 4814 | nan | 0.0 | 1.0 | | 0.0 | 21.9978 | 5043 | nan | 0.0 | 1.0 | | 0.0 | 22.9967 | 5272 | nan | 0.0 | 1.0 | | 0.0 | 24.0 | 5502 | nan | 0.0 | 1.0 | | 0.0 | 24.9989 | 5731 | nan | 0.0 | 1.0 | | 0.0 | 25.9978 | 5960 | nan | 0.0 | 1.0 | | 0.0 | 26.9967 | 6189 | nan | 0.0 | 1.0 | | 0.0 | 28.0 | 6419 | nan | 0.0 | 1.0 | | 0.0 | 28.9989 | 6648 | nan | 0.0 | 1.0 | | 0.0 | 29.9978 | 6877 | nan | 0.0 | 1.0 | | 0.0 | 30.9967 | 7106 | nan | 0.0 | 1.0 | | 0.0 | 32.0 | 7336 | nan | 0.0 | 1.0 | | 0.0 | 32.9989 | 7565 | nan | 0.0 | 1.0 | | 0.0 | 33.9978 | 7794 | nan | 0.0 | 1.0 | | 0.0 | 34.9967 | 8023 | nan | 0.0 | 1.0 | | 0.0 | 36.0 | 8253 | nan | 0.0 | 1.0 | | 0.0 | 36.9989 | 8482 | nan | 0.0 | 1.0 | | 0.0 | 37.9978 | 8711 | nan | 0.0 | 1.0 | | 0.0 | 38.9967 | 8940 | nan | 0.0 | 1.0 | | 0.0 | 39.9564 | 9160 | nan | 0.0 | 1.0 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "unlabeled", "pv" ]
mouadenna/segformer-b1-finetuned-segments-pv_v1_normalized_t4_4batch
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/ovjr83to) # segformer-b1-finetuned-segments-pv_v1_normalized_t4_4batch This model is a fine-tuned version of [nvidia/segformer-b1-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b1-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset. It achieves the following results on the evaluation set: - Loss: 0.0107 - Mean Iou: 0.8767 - Precision: 0.9236 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0004 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:-------:|:----:|:---------------:|:--------:|:---------:| | 0.0061 | 0.9935 | 114 | 0.0077 | 0.7881 | 0.8686 | | 0.0043 | 1.9956 | 229 | 0.0063 | 0.8084 | 0.8541 | | 0.0051 | 2.9978 | 344 | 0.0052 | 0.8338 | 0.9445 | | 0.0141 | 4.0 | 459 | 0.0054 | 0.8421 | 0.8811 | | 0.0043 | 4.9935 | 573 | 0.0052 | 0.8414 | 0.9099 | | 0.0027 | 5.9956 | 688 | 0.0057 | 0.8450 | 0.8967 | | 0.0054 | 6.9978 | 803 | 0.0052 | 0.8505 | 0.9364 | | 0.0024 | 8.0 | 918 | 0.0064 | 0.8408 | 0.9001 | | 0.0024 | 8.9935 | 1032 | 0.0068 | 0.8378 | 0.9158 | | 0.0033 | 9.9956 | 1147 | 0.0055 | 0.8643 | 0.9156 | | 0.0014 | 10.9978 | 1262 | 0.0055 | 0.8611 | 0.9048 | | 0.0019 | 12.0 | 1377 | 0.0070 | 0.8410 | 0.8900 | | 0.0025 | 12.9935 | 1491 | 0.0060 | 0.8629 | 0.9112 | | 0.0018 | 13.9956 | 1606 | 0.0063 | 0.8577 | 0.9294 | | 0.002 | 14.9978 | 1721 | 0.0063 | 0.8539 | 0.8888 | | 0.002 | 16.0 | 1836 | 0.0072 | 0.8598 | 0.9172 | | 0.0021 | 16.9935 | 1950 | 0.0062 | 0.8555 | 0.9074 | | 0.0018 | 17.9956 | 2065 | 0.0069 | 0.8598 | 0.9167 | | 0.0018 | 18.9978 | 2180 | 0.0074 | 0.8556 | 0.9160 | | 0.002 | 20.0 | 2295 | 0.0067 | 0.8662 | 0.9117 | | 0.003 | 20.9935 | 2409 | 0.0062 | 0.8724 | 0.9245 | | 0.0027 | 21.9956 | 2524 | 0.0067 | 0.8727 | 0.9124 | | 0.0013 | 22.9978 | 2639 | 0.0068 | 0.8684 | 0.9147 | | 0.0011 | 24.0 | 2754 | 0.0070 | 0.8723 | 0.9165 | | 0.0014 | 24.9935 | 2868 | 0.0074 | 0.8709 | 0.9257 | | 0.0011 | 25.9956 | 2983 | 0.0075 | 0.8697 | 0.9139 | | 0.001 | 26.9978 | 3098 | 0.0071 | 0.8780 | 0.9273 | | 0.0011 | 28.0 | 3213 | 0.0075 | 0.8743 | 0.9182 | | 0.0008 | 28.9935 | 3327 | 0.0080 | 0.8744 | 0.9234 | | 0.0007 | 29.9956 | 3442 | 0.0086 | 0.8692 | 0.9205 | | 0.001 | 30.9978 | 3557 | 0.0083 | 0.8720 | 0.9145 | | 0.0009 | 32.0 | 3672 | 0.0084 | 0.8745 | 0.9167 | | 0.0009 | 32.9935 | 3786 | 0.0084 | 0.8717 | 0.9155 | | 0.0011 | 33.9956 | 3901 | 0.0084 | 0.8756 | 0.9279 | | 0.0007 | 34.9978 | 4016 | 0.0090 | 0.8777 | 0.9233 | | 0.0008 | 36.0 | 4131 | 0.0090 | 0.8744 | 0.9173 | | 0.0011 | 36.9935 | 4245 | 0.0097 | 0.8753 | 0.9192 | | 0.0008 | 37.9956 | 4360 | 0.0091 | 0.8757 | 0.9260 | | 0.0009 | 38.9978 | 
4475 | 0.0091 | 0.8739 | 0.9173 | | 0.0008 | 40.0 | 4590 | 0.0103 | 0.8760 | 0.9274 | | 0.0008 | 40.9935 | 4704 | 0.0106 | 0.8749 | 0.9263 | | 0.0008 | 41.9956 | 4819 | 0.0097 | 0.8753 | 0.9238 | | 0.0009 | 42.9978 | 4934 | 0.0099 | 0.8730 | 0.9159 | | 0.0006 | 44.0 | 5049 | 0.0101 | 0.8757 | 0.9247 | | 0.0006 | 44.9935 | 5163 | 0.0104 | 0.8756 | 0.9217 | | 0.0007 | 45.9956 | 5278 | 0.0106 | 0.8720 | 0.9175 | | 0.0006 | 46.9978 | 5393 | 0.0107 | 0.8753 | 0.9202 | | 0.0005 | 48.0 | 5508 | 0.0107 | 0.8757 | 0.9224 | | 0.0007 | 48.9935 | 5622 | 0.0107 | 0.8764 | 0.9227 | | 0.0008 | 49.6732 | 5700 | 0.0107 | 0.8767 | 0.9236 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
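Below is a minimal inference sketch for this checkpoint, in the spirit of the usage snippets elsewhere in this collection. It assumes the checkpoint is published under the repository id above; `tile.png` is a placeholder path, and the label ids follow this card's label list (0 = "unlabeled", 1 = "pv").

```python
import torch
import torch.nn as nn
from PIL import Image
from transformers import SegformerImageProcessor, AutoModelForSemanticSegmentation

ckpt = "mouadenna/segformer-b1-finetuned-segments-pv_v1_normalized_t4_4batch"
processor = SegformerImageProcessor.from_pretrained(ckpt)
model = AutoModelForSemanticSegmentation.from_pretrained(ckpt)
model.eval()

image = Image.open("tile.png").convert("RGB")  # placeholder satellite tile
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 2, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before the per-pixel argmax.
upsampled = nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # 0 = "unlabeled", 1 = "pv"
print(f"PV covers {(pred == 1).float().mean().item():.2%} of the tile")
```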
[ "unlabeled", "pv" ]
mouadenna/segformer-b1-finetuned-segments-pv_v1_normalized_p100
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/olas59o3) # segformer-b1-finetuned-segments-pv_v1_normalized_p100 This model is a fine-tuned version of [nvidia/segformer-b1-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b1-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset. It achieves the following results on the evaluation set: - Loss: 0.0092 - Mean Iou: 0.8705 - Precision: 0.9201 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 1 - eval_batch_size: 1 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:-----:|:------:|:---------------:|:--------:|:---------:| | 0.0078 | 1.0 | 3666 | 0.0068 | 0.8054 | 0.9187 | | 0.0029 | 2.0 | 7332 | 0.0058 | 0.8298 | 0.8778 | | 0.002 | 3.0 | 10998 | 0.0058 | 0.8274 | 0.8961 | | 0.0013 | 4.0 | 14664 | 0.0060 | 0.8388 | 0.8774 | | 0.0 | 5.0 | 18330 | 0.0055 | 0.8405 | 0.8901 | | 0.0059 | 6.0 | 21996 | 0.0050 | 0.8547 | 0.9004 | | 0.0048 | 7.0 | 25662 | 0.0060 | 0.8364 | 0.8668 | | 0.0 | 8.0 | 29328 | 0.0063 | 0.8278 | 0.8776 | | 0.0031 | 9.0 | 32994 | 0.0060 | 0.8547 | 0.9188 | | 0.0017 | 10.0 | 36660 | 0.0061 | 0.8489 | 0.9105 | | 0.0029 | 11.0 | 40326 | 0.0058 | 0.8572 | 0.9066 | | 0.0 | 12.0 | 43992 | 0.0059 | 0.8525 | 0.9105 | | 0.0031 | 13.0 | 47658 | 0.0057 | 0.8514 | 0.9035 | | 0.0012 | 14.0 | 51324 | 0.0056 | 0.8567 | 0.9058 | | 0.0042 | 15.0 | 54990 | 0.0058 | 0.8463 | 0.8898 | | 0.0026 | 16.0 | 58656 | 0.0067 | 0.8607 | 0.9196 | | 0.0012 | 17.0 | 62322 | 0.0050 | 0.8632 | 0.9224 | | 0.0031 | 18.0 | 65988 | 0.0066 | 0.8404 | 0.9155 | | 0.0018 | 19.0 | 69654 | 0.0059 | 0.8598 | 0.9115 | | 0.002 | 20.0 | 73320 | 0.0065 | 0.8561 | 0.9210 | | 0.0033 | 21.0 | 76986 | 0.0070 | 0.8580 | 0.9118 | | 0.0014 | 22.0 | 80652 | 0.0066 | 0.8597 | 0.9306 | | 0.0 | 23.0 | 84318 | 0.0066 | 0.8623 | 0.9014 | | 0.0 | 24.0 | 87984 | 0.0062 | 0.8709 | 0.9217 | | 0.0022 | 25.0 | 91650 | 0.0067 | 0.8644 | 0.9204 | | 0.0013 | 26.0 | 95316 | 0.0063 | 0.8680 | 0.9214 | | 0.0015 | 27.0 | 98982 | 0.0073 | 0.8520 | 0.8918 | | 0.0 | 28.0 | 102648 | 0.0071 | 0.8674 | 0.9144 | | 0.0015 | 29.0 | 106314 | 0.0069 | 0.8716 | 0.9261 | | 0.0 | 30.0 | 109980 | 0.0068 | 0.8715 | 0.9246 | | 0.0012 | 31.0 | 113646 | 0.0073 | 0.8682 | 0.9128 | | 0.0009 | 32.0 | 117312 | 0.0071 | 0.8717 | 0.9260 | | 0.0022 | 33.0 | 120978 | 0.0071 | 0.8715 | 0.9172 | | 0.0019 | 34.0 | 124644 | 0.0075 | 0.8674 | 0.9127 | | 0.0 | 35.0 | 128310 | 0.0078 | 0.8660 | 0.9140 | | 0.0009 | 36.0 | 131976 | 0.0079 | 0.8720 | 0.9214 | | 0.0007 | 37.0 | 135642 | 0.0087 | 0.8689 | 0.9206 | | 0.0014 | 38.0 | 139308 | 0.0077 | 0.8697 | 0.9161 | | 0.0 | 39.0 | 142974 | 0.0091 | 0.8682 | 0.9243 | | 0.0025 | 40.0 | 146640 | 0.0091 | 0.8660 | 0.9161 | | 0.0019 | 41.0 | 150306 | 0.0089 | 0.8722 | 0.9190 | | 0.0009 | 42.0 | 
153972 | 0.0087 | 0.8727 | 0.9233 | | 0.0017 | 43.0 | 157638 | 0.0091 | 0.8721 | 0.9196 | | 0.0 | 44.0 | 161304 | 0.0093 | 0.8737 | 0.9181 | | 0.0012 | 45.0 | 164970 | 0.0093 | 0.8727 | 0.9237 | | 0.0 | 46.0 | 168636 | 0.0094 | 0.8724 | 0.9230 | | 0.0005 | 47.0 | 172302 | 0.0102 | 0.8675 | 0.9137 | | 0.0 | 48.0 | 175968 | 0.0094 | 0.8631 | 0.9066 | | 0.0009 | 49.0 | 179634 | 0.0103 | 0.8700 | 0.9165 | | 0.0008 | 50.0 | 183300 | 0.0092 | 0.8705 | 0.9201 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
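The cards in this series report a foreground ("pv") IoU and precision. For reference, here is one way such metrics can be computed from a predicted and a ground-truth binary mask; the function is illustrative and not necessarily the reduction used by this card's training script (per-image vs. dataset-wide averaging is not documented).

```python
import numpy as np

def pv_iou_precision(pred: np.ndarray, target: np.ndarray) -> tuple[float, float]:
    """IoU and precision for the positive ("pv") class of two binary masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    tp = np.logical_and(pred, target).sum()   # true positives
    fp = np.logical_and(pred, ~target).sum()  # false positives
    fn = np.logical_and(~pred, target).sum()  # false negatives
    iou = tp / (tp + fp + fn) if (tp + fp + fn) else 0.0
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    return float(iou), float(precision)
```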
[ "unlabeled", "pv" ]
mouadenna/segformer-b1-finetuned-segments-pv_v1_normalized_t4_4batch_augx3
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/1oi9s8vu) # segformer-b1-finetuned-segments-pv_v1_normalized_t4_4batch_augx3 This model is a fine-tuned version of [nvidia/segformer-b1-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b1-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset. It achieves the following results on the evaluation set: - Loss: 0.0111 - Mean Iou: 0.8679 - Precision: 0.9181 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0004 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 40 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:-------:|:-----:|:---------------:|:--------:|:---------:| | 0.0086 | 0.9978 | 343 | 0.0064 | 0.8154 | 0.9105 | | 0.0031 | 1.9985 | 687 | 0.0058 | 0.8290 | 0.9324 | | 0.0059 | 2.9993 | 1031 | 0.0061 | 0.8077 | 0.9352 | | 0.0031 | 4.0 | 1375 | 0.0057 | 0.8347 | 0.9182 | | 0.0043 | 4.9978 | 1718 | 0.0059 | 0.8275 | 0.8747 | | 0.0034 | 5.9985 | 2062 | 0.0054 | 0.8499 | 0.8922 | | 0.0035 | 6.9993 | 2406 | 0.0055 | 0.8508 | 0.9072 | | 0.0037 | 8.0 | 2750 | 0.0055 | 0.8466 | 0.9201 | | 0.0028 | 8.9978 | 3093 | 0.0054 | 0.8590 | 0.9139 | | 0.0031 | 9.9985 | 3437 | 0.0055 | 0.8562 | 0.9135 | | 0.002 | 10.9993 | 3781 | 0.0054 | 0.8619 | 0.9175 | | 0.0019 | 12.0 | 4125 | 0.0056 | 0.8649 | 0.9131 | | 0.0023 | 12.9978 | 4468 | 0.0061 | 0.8632 | 0.9195 | | 0.0028 | 13.9985 | 4812 | 0.0061 | 0.8553 | 0.9174 | | 0.0041 | 14.9993 | 5156 | 0.0072 | 0.8573 | 0.9172 | | 0.002 | 16.0 | 5500 | 0.0063 | 0.8643 | 0.9136 | | 0.0019 | 16.9978 | 5843 | 0.0068 | 0.8637 | 0.9185 | | 0.0019 | 17.9985 | 6187 | 0.0073 | 0.8598 | 0.9123 | | 0.0015 | 18.9993 | 6531 | 0.0070 | 0.8620 | 0.9108 | | 0.0019 | 20.0 | 6875 | 0.0073 | 0.8602 | 0.9163 | | 0.0017 | 20.9978 | 7218 | 0.0071 | 0.8669 | 0.9229 | | 0.0014 | 21.9985 | 7562 | 0.0081 | 0.8633 | 0.9198 | | 0.0027 | 22.9993 | 7906 | 0.0089 | 0.8573 | 0.9138 | | 0.0017 | 24.0 | 8250 | 0.0086 | 0.8570 | 0.9114 | | 0.0011 | 24.9978 | 8593 | 0.0087 | 0.8635 | 0.9152 | | 0.0013 | 25.9985 | 8937 | 0.0100 | 0.8583 | 0.9203 | | 0.0012 | 26.9993 | 9281 | 0.0085 | 0.8651 | 0.9113 | | 0.0015 | 28.0 | 9625 | 0.0091 | 0.8697 | 0.9179 | | 0.0011 | 28.9978 | 9968 | 0.0091 | 0.8684 | 0.9204 | | 0.0012 | 29.9985 | 10312 | 0.0099 | 0.8658 | 0.9152 | | 0.001 | 30.9993 | 10656 | 0.0098 | 0.8663 | 0.9170 | | 0.0011 | 32.0 | 11000 | 0.0100 | 0.8680 | 0.9174 | | 0.0008 | 32.9978 | 11343 | 0.0102 | 0.8675 | 0.9181 | | 0.0009 | 33.9985 | 11687 | 0.0107 | 0.8669 | 0.9180 | | 0.0011 | 34.9993 | 12031 | 0.0107 | 0.8681 | 0.9214 | | 0.001 | 36.0 | 12375 | 0.0113 | 0.8677 | 0.9166 | | 0.001 | 36.9978 | 12718 | 0.0115 | 0.8669 | 0.9179 | | 0.001 | 37.9985 | 13062 | 0.0109 | 0.8694 | 0.9206 | 
| 0.0008 | 38.9993 | 13406 | 0.0112 | 0.8681 | 0.9175 | | 0.0009 | 39.9127 | 13720 | 0.0111 | 0.8679 | 0.9181 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
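The hyperparameters above amount to an effective batch of 32 (8 per device × 4 gradient-accumulation steps) with mixed precision. A hedged `TrainingArguments` sketch of that configuration follows; the argument names are standard `transformers` options and the output directory is a placeholder.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="segformer-b1-pv",     # placeholder
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,    # effective batch size 8 * 4 = 32
    lr_scheduler_type="linear",
    num_train_epochs=40,
    seed=42,
    fp16=True,                        # "Native AMP"
)
```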
[ "unlabeled", "pv" ]
mouadenna/segformer-b1-finetuned-segments-pv_v1_normalized_p100_4batch
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/ychhcvsx) # segformer-b1-finetuned-segments-pv_v1_normalized_p100_4batch This model is a fine-tuned version of [nvidia/segformer-b1-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b1-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset. It achieves the following results on the evaluation set: - Loss: 0.0012 - Mean Iou: 0.9591 - Precision: 0.9785 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0004 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.001 - num_epochs: 40 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:-------:|:----:|:---------------:|:--------:|:---------:| | 0.0122 | 0.9989 | 229 | 0.0078 | 0.8264 | 0.9201 | | 0.007 | 1.9978 | 458 | 0.0068 | 0.8134 | 0.8659 | | 0.0051 | 2.9967 | 687 | 0.0042 | 0.8781 | 0.9431 | | 0.0057 | 4.0 | 917 | 0.0038 | 0.8851 | 0.9200 | | 0.0044 | 4.9989 | 1146 | 0.0042 | 0.8738 | 0.8984 | | 0.0042 | 5.9978 | 1375 | 0.0035 | 0.8848 | 0.9454 | | 0.0043 | 6.9967 | 1604 | 0.0036 | 0.8847 | 0.9527 | | 0.0044 | 8.0 | 1834 | 0.0032 | 0.8961 | 0.9469 | | 0.0032 | 8.9989 | 2063 | 0.0039 | 0.8778 | 0.9144 | | 0.0033 | 9.9978 | 2292 | 0.0028 | 0.9072 | 0.9458 | | 0.0028 | 10.9967 | 2521 | 0.0025 | 0.9144 | 0.9593 | | 0.0029 | 12.0 | 2751 | 0.0028 | 0.9069 | 0.9329 | | 0.0029 | 12.9989 | 2980 | 0.0025 | 0.9148 | 0.9617 | | 0.0027 | 13.9978 | 3209 | 0.0026 | 0.9130 | 0.9508 | | 0.0024 | 14.9967 | 3438 | 0.0021 | 0.9255 | 0.9552 | | 0.0023 | 16.0 | 3668 | 0.0034 | 0.8896 | 0.9616 | | 0.0028 | 16.9989 | 3897 | 0.0029 | 0.9028 | 0.9420 | | 0.0029 | 17.9978 | 4126 | 0.0022 | 0.9235 | 0.9508 | | 0.0025 | 18.9967 | 4355 | 0.0021 | 0.9260 | 0.9621 | | 0.0023 | 20.0 | 4585 | 0.0020 | 0.9306 | 0.9591 | | 0.0022 | 20.9989 | 4814 | 0.0019 | 0.9334 | 0.9668 | | 0.0023 | 21.9978 | 5043 | 0.0019 | 0.9318 | 0.9649 | | 0.003 | 22.9967 | 5272 | 0.0021 | 0.9274 | 0.9517 | | 0.0019 | 24.0 | 5502 | 0.0018 | 0.9363 | 0.9670 | | 0.002 | 24.9989 | 5731 | 0.0018 | 0.9370 | 0.9571 | | 0.0022 | 25.9978 | 5960 | 0.0019 | 0.9330 | 0.9558 | | 0.0021 | 26.9967 | 6189 | 0.0018 | 0.9359 | 0.9593 | | 0.0018 | 28.0 | 6419 | 0.0016 | 0.9421 | 0.9625 | | 0.0017 | 28.9989 | 6648 | 0.0016 | 0.9447 | 0.9650 | | 0.0016 | 29.9978 | 6877 | 0.0015 | 0.9452 | 0.9651 | | 0.0017 | 30.9967 | 7106 | 0.0015 | 0.9478 | 0.9692 | | 0.0016 | 32.0 | 7336 | 0.0014 | 0.9503 | 0.9697 | | 0.0015 | 32.9989 | 7565 | 0.0014 | 0.9512 | 0.9720 | | 0.0014 | 33.9978 | 7794 | 0.0013 | 0.9531 | 0.9721 | | 0.0015 | 34.9967 | 8023 | 0.0013 | 0.9547 | 0.9716 | | 0.0013 | 36.0 | 8253 | 0.0013 | 0.9542 | 0.9683 | | 0.0014 | 36.9989 | 8482 | 0.0012 | 0.9573 | 0.9750 | | 0.0013 | 37.9978 | 8711 | 
0.0012 | 0.9579 | 0.9768 | | 0.0014 | 38.9967 | 8940 | 0.0012 | 0.9582 | 0.9754 | | 0.0014 | 39.9564 | 9160 | 0.0012 | 0.9591 | 0.9785 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
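For fine-tuning runs like this one, the dataset has to be wired into the image processor so pixel values and segmentation maps are produced on the fly. A sketch under stated assumptions: the column names `"image"` and `"label"` are guesses about the dataset schema, and the processor is taken from the base checkpoint named above.

```python
from datasets import load_dataset
from transformers import SegformerImageProcessor

processor = SegformerImageProcessor.from_pretrained(
    "nvidia/segformer-b1-finetuned-ade-512-512"
)
ds = load_dataset("mouadenna/satellite_PV_dataset_train_test_v1", split="train")

def transform(batch):
    # "image" / "label" column names are assumptions about the schema.
    return processor(batch["image"], batch["label"], return_tensors="pt")

ds.set_transform(transform)  # applied lazily, per batch
```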
[ "unlabeled", "pv" ]
mouadenna/segformer-b1-finetuned-segments-pv_v1_x3_normalized_p100_4batch
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/ktaai3s5) # segformer-b1-finetuned-segments-pv_v1_x3_normalized_p100_4batch This model is a fine-tuned version of [nvidia/segformer-b1-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b1-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset. It achieves the following results on the evaluation set: - Loss: 0.0064 - Mean Iou: 0.8466 - Precision: 0.9220 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0004 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.001 - num_epochs: 40 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:-------:|:-----:|:---------------:|:--------:|:---------:| | 0.0084 | 0.9993 | 687 | 0.0063 | 0.8160 | 0.8736 | | 0.007 | 2.0 | 1375 | 0.0060 | 0.8262 | 0.9006 | | 0.006 | 2.9993 | 2062 | 0.0066 | 0.8072 | 0.9214 | | 0.0049 | 4.0 | 2750 | 0.0054 | 0.8283 | 0.9287 | | 0.004 | 4.9993 | 3437 | 0.0070 | 0.8326 | 0.9068 | | 0.0042 | 6.0 | 4125 | 0.0053 | 0.8318 | 0.8834 | | 0.004 | 6.9993 | 4812 | 0.0053 | 0.8370 | 0.8893 | | 0.0037 | 8.0 | 5500 | 0.0075 | 0.8049 | 0.9404 | | 0.0036 | 8.9993 | 6187 | 0.0074 | 0.8222 | 0.9106 | | 0.0033 | 10.0 | 6875 | 0.0061 | 0.8297 | 0.9161 | | 0.0031 | 10.9993 | 7562 | 0.0055 | 0.8427 | 0.9086 | | 0.0033 | 12.0 | 8250 | 0.0052 | 0.8437 | 0.9152 | | 0.0037 | 12.9993 | 8937 | 0.0055 | 0.8387 | 0.9186 | | 0.0028 | 14.0 | 9625 | 0.0060 | 0.8416 | 0.9137 | | 0.0027 | 14.9993 | 10312 | 0.0052 | 0.8489 | 0.9212 | | 0.003 | 16.0 | 11000 | 0.0065 | 0.8393 | 0.9158 | | 0.0025 | 16.9993 | 11687 | 0.0063 | 0.8347 | 0.9245 | | 0.0027 | 18.0 | 12375 | 0.0065 | 0.8439 | 0.9093 | | 0.0032 | 18.9993 | 13062 | 0.0056 | 0.8495 | 0.9186 | | 0.0024 | 20.0 | 13750 | 0.0064 | 0.8466 | 0.9220 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "unlabeled", "pv" ]
mouadenna/segformer-b0-finetuned-segments-pv_v1_x3_normalized_p100_4batch
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/hwghoj9l) # segformer-b0-finetuned-segments-pv_v1_x3_normalized_p100_4batch This model is a fine-tuned version of [nvidia/segformer-b0-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b0-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset. It achieves the following results on the evaluation set: - Loss: 0.0056 - Mean Iou: 0.8288 - Precision: 0.8928 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0004 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.001 - num_epochs: 40 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:------:|:----:|:---------------:|:--------:|:---------:| | 0.0086 | 0.9993 | 687 | 0.0068 | 0.8080 | 0.8515 | | 0.0061 | 2.0 | 1375 | 0.0056 | 0.8257 | 0.8862 | | 0.0058 | 2.9993 | 2062 | 0.0056 | 0.8284 | 0.9154 | | 0.0063 | 4.0 | 2750 | 0.0055 | 0.8212 | 0.9261 | | 0.0051 | 4.9993 | 3437 | 0.0081 | 0.7851 | 0.9189 | | 0.0042 | 6.0 | 4125 | 0.0062 | 0.8322 | 0.9034 | | 0.004 | 6.9993 | 4812 | 0.0067 | 0.8262 | 0.8807 | | 0.0049 | 8.0 | 5500 | 0.0061 | 0.8271 | 0.9135 | | 0.0043 | 8.9993 | 6187 | 0.0056 | 0.8288 | 0.8928 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "unlabeled", "pv" ]
mouadenna/segformer-b2-finetuned-segments-pv_v1_x3_normalized_p100_4batch
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b2-finetuned-segments-pv_v1_x3_normalized_p100_4batch This model is a fine-tuned version of [nvidia/segformer-b2-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b2-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0004 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.001 - num_epochs: 1 - mixed_precision_training: Native AMP ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "unlabeled", "pv" ]
mouadenna/segformer-b2-finetuned-segments-pv_v1_normalized_p100_4batch
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/mmtwbyor) # segformer-b2-finetuned-segments-pv_v1_normalized_p100_4batch This model is a fine-tuned version of [nvidia/segformer-b2-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b2-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset. It achieves the following results on the evaluation set: - Loss: nan - Mean Iou: 0.0 - Precision: 1.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0004 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.001 - num_epochs: 40 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:-------:|:----:|:---------------:|:--------:|:---------:| | 0.01 | 0.9989 | 229 | 0.0088 | 0.8105 | 0.8817 | | 0.0062 | 1.9978 | 458 | 0.0075 | 0.8201 | 0.8726 | | 0.0049 | 2.9967 | 687 | 0.0063 | 0.8297 | 0.8867 | | 0.0053 | 4.0 | 917 | 0.0055 | 0.8425 | 0.8845 | | 0.0037 | 4.9989 | 1146 | 0.0058 | 0.8380 | 0.8823 | | 0.0039 | 5.9978 | 1375 | 0.0211 | 0.6114 | 0.9766 | | 0.0037 | 6.9967 | 1604 | 0.3403 | 0.0 | 1.0 | | 0.0002 | 8.0 | 1834 | nan | 0.0 | 1.0 | | 0.0003 | 8.9989 | 2063 | nan | 0.0 | 1.0 | | 0.0864 | 9.9978 | 2292 | nan | 0.0 | 1.0 | | 0.0035 | 10.9967 | 2521 | nan | 0.0 | 1.0 | | 0.0045 | 12.0 | 2751 | nan | 0.0 | 1.0 | | 0.0039 | 12.9989 | 2980 | nan | 0.0 | 1.0 | | 0.8023 | 13.9978 | 3209 | nan | 0.0 | 1.0 | | 0.0041 | 14.9967 | 3438 | nan | 0.0 | 1.0 | | 7.0711 | 16.0 | 3668 | nan | 0.0 | 1.0 | | 0.0039 | 16.9989 | 3897 | nan | 0.0 | 1.0 | | 19.4385 | 17.9978 | 4126 | nan | 0.0 | 1.0 | | 0.0001 | 18.9967 | 4355 | nan | 0.0 | 1.0 | | 1.7398 | 20.0 | 4585 | nan | 0.0 | 1.0 | | 0.2879 | 20.9989 | 4814 | nan | 0.0 | 1.0 | | 0.0005 | 21.9978 | 5043 | nan | 0.0 | 1.0 | | 5.8398 | 22.9967 | 5272 | nan | 0.0 | 1.0 | | 0.0004 | 24.0 | 5502 | nan | 0.0 | 1.0 | | 0.0002 | 24.9989 | 5731 | nan | 0.0 | 1.0 | | 0.0016 | 25.9978 | 5960 | nan | 0.0 | 1.0 | | 0.0034 | 26.9967 | 6189 | nan | 0.0 | 1.0 | | 0.0004 | 28.0 | 6419 | nan | 0.0 | 1.0 | | 0.0036 | 28.9989 | 6648 | nan | 0.0 | 1.0 | | 0.0314 | 29.9978 | 6877 | nan | 0.0 | 1.0 | | 0.0921 | 30.9967 | 7106 | nan | 0.0 | 1.0 | | 89.1025 | 32.0 | 7336 | nan | 0.0 | 1.0 | | 0.0073 | 32.9989 | 7565 | nan | 0.0 | 1.0 | | 0.0126 | 33.9978 | 7794 | nan | 0.0 | 1.0 | | 0.0094 | 34.9967 | 8023 | nan | 0.0 | 1.0 | | 0.0001 | 36.0 | 8253 | nan | 0.0 | 1.0 | | 4.3987 | 36.9989 | 8482 | nan | 0.0 | 1.0 | | 0.0005 | 37.9978 | 8711 | nan | 0.0 | 1.0 | | 0.0202 | 38.9967 | 8940 | nan | 0.0 | 1.0 | | 0.1612 | 39.9564 | 9160 | nan | 0.0 | 1.0 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "unlabeled", "pv" ]
mouadenna/segformer-b0-finetuned-segments-pv_v1_normalized_p100_4batch
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/g4m4ysqz) # segformer-b0-finetuned-segments-pv_v1_normalized_p100_4batch This model is a fine-tuned version of [nvidia/segformer-b0-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b0-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset. It achieves the following results on the evaluation set: - Loss: 0.0074 - Mean Iou: 0.8483 - Precision: 0.9169 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0004 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.001 - num_epochs: 40 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:-------:|:----:|:---------------:|:--------:|:---------:| | 0.0127 | 0.9989 | 229 | 0.0092 | 0.7982 | 0.8641 | | 0.0077 | 1.9978 | 458 | 0.0094 | 0.7871 | 0.8456 | | 0.006 | 2.9967 | 687 | 0.0067 | 0.8140 | 0.9089 | | 0.0051 | 4.0 | 917 | 0.0058 | 0.8358 | 0.8713 | | 0.0045 | 4.9989 | 1146 | 0.0059 | 0.8258 | 0.8761 | | 0.0042 | 5.9978 | 1375 | 0.0058 | 0.8415 | 0.9018 | | 0.0036 | 6.9967 | 1604 | 0.0051 | 0.8513 | 0.9049 | | 0.0038 | 8.0 | 1834 | 0.0062 | 0.8226 | 0.9256 | | 0.004 | 8.9989 | 2063 | 0.0057 | 0.8358 | 0.8913 | | 0.0035 | 9.9978 | 2292 | 0.0053 | 0.8485 | 0.9079 | | 0.0037 | 10.9967 | 2521 | 0.0059 | 0.8192 | 0.9056 | | 0.0038 | 12.0 | 2751 | 0.0054 | 0.8487 | 0.8921 | | 0.0033 | 12.9989 | 2980 | 0.0053 | 0.8541 | 0.9086 | | 0.0028 | 13.9978 | 3209 | 0.0055 | 0.8551 | 0.8985 | | 0.0026 | 14.9967 | 3438 | 0.0060 | 0.8483 | 0.9085 | | 0.0026 | 16.0 | 3668 | 0.0057 | 0.8495 | 0.9076 | | 0.0024 | 16.9989 | 3897 | 0.0058 | 0.8442 | 0.9083 | | 0.0038 | 17.9978 | 4126 | 0.0066 | 0.8113 | 0.8910 | | 0.0031 | 18.9967 | 4355 | 0.0062 | 0.8488 | 0.9108 | | 0.0026 | 20.0 | 4585 | 0.0058 | 0.8575 | 0.9126 | | 0.0024 | 20.9989 | 4814 | 0.0057 | 0.8580 | 0.9119 | | 0.0025 | 21.9978 | 5043 | 0.0059 | 0.8505 | 0.8957 | | 0.0031 | 22.9967 | 5272 | 0.0062 | 0.8472 | 0.9135 | | 0.0022 | 24.0 | 5502 | 0.0055 | 0.8598 | 0.9147 | | 0.0023 | 24.9989 | 5731 | 0.0058 | 0.8621 | 0.9090 | | 0.0023 | 25.9978 | 5960 | 0.0064 | 0.8498 | 0.9094 | | 0.0023 | 26.9967 | 6189 | 0.0067 | 0.8428 | 0.9137 | | 0.0021 | 28.0 | 6419 | 0.0063 | 0.8527 | 0.9076 | | 0.002 | 28.9989 | 6648 | 0.0065 | 0.8509 | 0.9187 | | 0.002 | 29.9978 | 6877 | 0.0074 | 0.8424 | 0.9179 | | 0.002 | 30.9967 | 7106 | 0.0065 | 0.8577 | 0.9116 | | 0.0019 | 32.0 | 7336 | 0.0067 | 0.8547 | 0.9141 | | 0.0019 | 32.9989 | 7565 | 0.0072 | 0.8519 | 0.9168 | | 0.0019 | 33.9978 | 7794 | 0.0067 | 0.8569 | 0.9148 | | 0.0019 | 34.9967 | 8023 | 0.0070 | 0.8544 | 0.9139 | | 0.0017 | 36.0 | 8253 | 0.0072 | 0.8510 | 0.9124 | | 0.0018 | 36.9989 | 8482 | 0.0081 | 0.8425 | 0.9164 | | 0.0017 | 37.9978 | 8711 | 0.0073 
| 0.8512 | 0.9155 | | 0.0018 | 38.9967 | 8940 | 0.0073 | 0.8495 | 0.9164 | | 0.0018 | 39.9564 | 9160 | 0.0074 | 0.8483 | 0.9169 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "unlabeled", "pv" ]
mouadenna/segformer-b0-finetuned-segments-pv_v1_3x_normalized_p100_4batch
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/yy31wgdz) # segformer-b0-finetuned-segments-pv_v1_3x_normalized_p100_4batch This model is a fine-tuned version of [nvidia/segformer-b0-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b0-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset. It achieves the following results on the evaluation set: - Loss: 0.0067 - Mean Iou: 0.8641 - Precision: 0.9173 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0004 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.001 - num_epochs: 40 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:-------:|:-----:|:---------------:|:--------:|:---------:| | 0.0077 | 0.9993 | 687 | 0.0077 | 0.7897 | 0.8235 | | 0.0056 | 2.0 | 1375 | 0.0059 | 0.8193 | 0.8760 | | 0.0065 | 2.9993 | 2062 | 0.0064 | 0.8222 | 0.9068 | | 0.0047 | 4.0 | 2750 | 0.0061 | 0.8195 | 0.9299 | | 0.0039 | 4.9993 | 3437 | 0.0055 | 0.8440 | 0.9075 | | 0.0044 | 6.0 | 4125 | 0.0063 | 0.8208 | 0.8479 | | 0.0034 | 6.9993 | 4812 | 0.0080 | 0.7750 | 0.8153 | | 0.0037 | 8.0 | 5500 | 0.0053 | 0.8475 | 0.9084 | | 0.004 | 8.9993 | 6187 | 0.0073 | 0.8013 | 0.8237 | | 0.003 | 10.0 | 6875 | 0.0056 | 0.8476 | 0.8955 | | 0.0038 | 10.9993 | 7562 | 0.0058 | 0.8273 | 0.9144 | | 0.0028 | 12.0 | 8250 | 0.0065 | 0.8143 | 0.8888 | | 0.0031 | 12.9993 | 8937 | 0.0064 | 0.8175 | 0.9188 | | 0.003 | 14.0 | 9625 | 0.0051 | 0.8491 | 0.9027 | | 0.0025 | 14.9993 | 10312 | 0.0059 | 0.8558 | 0.9085 | | 0.0029 | 16.0 | 11000 | 0.0057 | 0.8454 | 0.9029 | | 0.0026 | 16.9993 | 11687 | 0.0057 | 0.8547 | 0.9230 | | 0.0024 | 18.0 | 12375 | 0.0059 | 0.8579 | 0.9045 | | 0.0025 | 18.9993 | 13062 | 0.0059 | 0.8645 | 0.9094 | | 0.0025 | 20.0 | 13750 | 0.0059 | 0.8498 | 0.9174 | | 0.0024 | 20.9993 | 14437 | 0.0056 | 0.8576 | 0.8970 | | 0.0022 | 22.0 | 15125 | 0.0063 | 0.8541 | 0.8952 | | 0.0031 | 22.9993 | 15812 | 0.0054 | 0.8508 | 0.9154 | | 0.0021 | 24.0 | 16500 | 0.0057 | 0.8545 | 0.9119 | | 0.0022 | 24.9993 | 17187 | 0.0058 | 0.8474 | 0.9149 | | 0.0022 | 26.0 | 17875 | 0.0066 | 0.8325 | 0.8879 | | 0.0021 | 26.9993 | 18562 | 0.0062 | 0.8522 | 0.9156 | | 0.0021 | 28.0 | 19250 | 0.0063 | 0.8488 | 0.8932 | | 0.002 | 28.9993 | 19937 | 0.0061 | 0.8579 | 0.9200 | | 0.002 | 30.0 | 20625 | 0.0059 | 0.8624 | 0.9182 | | 0.0021 | 30.9993 | 21312 | 0.0061 | 0.8564 | 0.9013 | | 0.0019 | 32.0 | 22000 | 0.0060 | 0.8601 | 0.9091 | | 0.0018 | 32.9993 | 22687 | 0.0059 | 0.8640 | 0.9163 | | 0.0017 | 34.0 | 23375 | 0.0062 | 0.8622 | 0.9187 | | 0.0017 | 34.9993 | 24062 | 0.0062 | 0.8634 | 0.9245 | | 0.0017 | 36.0 | 24750 | 0.0064 | 0.8655 | 0.9196 | | 0.0017 | 36.9993 | 25437 | 0.0063 | 0.8642 | 0.9197 | | 0.0016 | 38.0 | 26125 | 
0.0065 | 0.8634 | 0.9166 | | 0.0016 | 38.9993 | 26812 | 0.0067 | 0.8639 | 0.9186 | | 0.0016 | 39.9709 | 27480 | 0.0067 | 0.8641 | 0.9173 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "unlabeled", "pv" ]
mouadenna/segformer-b1-finetuned-segments-pv_v1_3x_normalized_p100_4batch
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/ydmwnhgs) # segformer-b1-finetuned-segments-pv_v1_3x_normalized_p100_4batch This model is a fine-tuned version of [nvidia/segformer-b1-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b1-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset. It achieves the following results on the evaluation set: - Loss: 0.0087 - Mean Iou: 0.8602 - Precision: 0.9152 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0004 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.001 - num_epochs: 40 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:-------:|:-----:|:---------------:|:--------:|:---------:| | 0.0073 | 0.9993 | 687 | 0.0072 | 0.7997 | 0.8395 | | 0.0052 | 2.0 | 1375 | 0.0069 | 0.8039 | 0.8906 | | 0.0051 | 2.9993 | 2062 | 0.0060 | 0.8301 | 0.8951 | | 0.0048 | 4.0 | 2750 | 0.0057 | 0.8223 | 0.9070 | | 0.0039 | 4.9993 | 3437 | 0.0054 | 0.8433 | 0.9104 | | 0.0042 | 6.0 | 4125 | 0.0054 | 0.8414 | 0.8779 | | 0.0031 | 6.9993 | 4812 | 0.0052 | 0.8453 | 0.8852 | | 0.0034 | 8.0 | 5500 | 0.0051 | 0.8526 | 0.9146 | | 0.0036 | 8.9993 | 6187 | 0.0059 | 0.8319 | 0.8884 | | 0.0027 | 10.0 | 6875 | 0.0058 | 0.8453 | 0.8990 | | 0.0028 | 10.9993 | 7562 | 0.0052 | 0.8552 | 0.9152 | | 0.0027 | 12.0 | 8250 | 0.0062 | 0.8459 | 0.9038 | | 0.0032 | 12.9993 | 8937 | 0.0056 | 0.8506 | 0.9163 | | 0.0024 | 14.0 | 9625 | 0.0062 | 0.8529 | 0.9189 | | 0.0035 | 14.9993 | 10312 | 0.0058 | 0.8464 | 0.9102 | | 0.0024 | 16.0 | 11000 | 0.0059 | 0.8575 | 0.9126 | | 0.0023 | 16.9993 | 11687 | 0.0057 | 0.8527 | 0.9201 | | 0.0024 | 18.0 | 12375 | 0.0060 | 0.8573 | 0.9177 | | 0.0028 | 18.9993 | 13062 | 0.0063 | 0.8601 | 0.9064 | | 0.0023 | 20.0 | 13750 | 0.0061 | 0.8589 | 0.9164 | | 0.002 | 20.9993 | 14437 | 0.0061 | 0.8611 | 0.9046 | | 0.002 | 22.0 | 15125 | 0.0057 | 0.8633 | 0.9143 | | 0.002 | 22.9993 | 15812 | 0.0067 | 0.8552 | 0.9133 | | 0.0018 | 24.0 | 16500 | 0.0068 | 0.8594 | 0.9174 | | 0.0021 | 24.9993 | 17187 | 0.0063 | 0.8545 | 0.9111 | | 0.0023 | 26.0 | 17875 | 0.0055 | 0.8642 | 0.9149 | | 0.0019 | 26.9993 | 18562 | 0.0060 | 0.8627 | 0.9152 | | 0.0017 | 28.0 | 19250 | 0.0063 | 0.8658 | 0.9148 | | 0.0017 | 28.9993 | 19937 | 0.0067 | 0.8644 | 0.9085 | | 0.0017 | 30.0 | 20625 | 0.0068 | 0.8578 | 0.9110 | | 0.0017 | 30.9993 | 21312 | 0.0067 | 0.8585 | 0.9130 | | 0.0015 | 32.0 | 22000 | 0.0069 | 0.8613 | 0.9103 | | 0.0015 | 32.9993 | 22687 | 0.0073 | 0.8599 | 0.9200 | | 0.0014 | 34.0 | 23375 | 0.0074 | 0.8605 | 0.9181 | | 0.0014 | 34.9993 | 24062 | 0.0079 | 0.8581 | 0.9174 | | 0.0013 | 36.0 | 24750 | 0.0081 | 0.8582 | 0.9123 | | 0.0013 | 36.9993 | 25437 | 0.0084 | 0.8599 | 0.9166 | | 0.0012 | 38.0 | 26125 | 
0.0084 | 0.8603 | 0.9139 | | 0.0013 | 38.9993 | 26812 | 0.0092 | 0.8599 | 0.9193 | | 0.0012 | 39.9709 | 27480 | 0.0087 | 0.8602 | 0.9152 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "unlabeled", "pv" ]
mouadenna/segformer-b1-finetuned-segments-pv_v1_normalized_p100_4batch_fp
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/kxwmffd1) # segformer-b1-finetuned-segments-pv_v1_normalized_p100_4batch_fp This model is a fine-tuned version of [nvidia/segformer-b1-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b1-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset. It achieves the following results on the evaluation set: - Loss: 0.0012 - Mean Iou: 0.9589 - Precision: 0.9794 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0004 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.001 - num_epochs: 40 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:-------:|:----:|:---------------:|:--------:|:---------:| | 0.0641 | 0.9989 | 229 | 0.0082 | 0.8288 | 0.8881 | | 0.0077 | 1.9978 | 458 | 0.0070 | 0.8228 | 0.8650 | | 0.0058 | 2.9967 | 687 | 0.0042 | 0.8827 | 0.9339 | | 0.005 | 4.0 | 917 | 0.0039 | 0.8849 | 0.9172 | | 0.0044 | 4.9989 | 1146 | 0.0071 | 0.7938 | 0.8122 | | 0.0049 | 5.9978 | 1375 | 0.0036 | 0.8914 | 0.9402 | | 0.0045 | 6.9967 | 1604 | 0.0042 | 0.8729 | 0.9280 | | 0.0038 | 8.0 | 1834 | 0.0035 | 0.8889 | 0.9433 | | 0.0034 | 8.9989 | 2063 | 0.0030 | 0.9038 | 0.9357 | | 0.0032 | 9.9978 | 2292 | 0.0026 | 0.9115 | 0.9501 | | 0.003 | 10.9967 | 2521 | 0.0026 | 0.9136 | 0.9482 | | 0.0031 | 12.0 | 2751 | 0.0026 | 0.9132 | 0.9461 | | 0.0029 | 12.9989 | 2980 | 0.0026 | 0.9144 | 0.9493 | | 0.0026 | 13.9978 | 3209 | 0.0023 | 0.9202 | 0.9414 | | 0.0025 | 14.9967 | 3438 | 0.0024 | 0.9175 | 0.9456 | | 0.003 | 16.0 | 3668 | 0.0032 | 0.8926 | 0.9640 | | 0.0035 | 16.9989 | 3897 | 0.0041 | 0.8741 | 0.9007 | | 0.0029 | 17.9978 | 4126 | 0.0022 | 0.9229 | 0.9598 | | 0.0024 | 18.9967 | 4355 | 0.0022 | 0.9239 | 0.9549 | | 0.0022 | 20.0 | 4585 | 0.0020 | 0.9308 | 0.9601 | | 0.0021 | 20.9989 | 4814 | 0.0019 | 0.9325 | 0.9689 | | 0.0021 | 21.9978 | 5043 | 0.0019 | 0.9334 | 0.9630 | | 0.002 | 22.9967 | 5272 | 0.0018 | 0.9368 | 0.9631 | | 0.002 | 24.0 | 5502 | 0.0019 | 0.9333 | 0.9684 | | 0.002 | 24.9989 | 5731 | 0.0018 | 0.9381 | 0.9613 | | 0.0022 | 25.9978 | 5960 | 0.0018 | 0.9369 | 0.9610 | | 0.0019 | 26.9967 | 6189 | 0.0017 | 0.9413 | 0.9677 | | 0.0018 | 28.0 | 6419 | 0.0016 | 0.9429 | 0.9629 | | 0.0017 | 28.9989 | 6648 | 0.0016 | 0.9444 | 0.9642 | | 0.0017 | 29.9978 | 6877 | 0.0015 | 0.9465 | 0.9741 | | 0.0016 | 30.9967 | 7106 | 0.0014 | 0.9492 | 0.9718 | | 0.0016 | 32.0 | 7336 | 0.0014 | 0.9499 | 0.9687 | | 0.0015 | 32.9989 | 7565 | 0.0015 | 0.9469 | 0.9737 | | 0.0016 | 33.9978 | 7794 | 0.0014 | 0.9514 | 0.9721 | | 0.0015 | 34.9967 | 8023 | 0.0013 | 0.9542 | 0.9719 | | 0.0014 | 36.0 | 8253 | 0.0013 | 0.9546 | 0.9694 | | 0.0014 | 36.9989 | 8482 | 0.0012 | 0.9569 | 0.9740 | | 0.0014 | 37.9978 | 8711 | 
0.0012 | 0.9579 | 0.9781 | | 0.0014 | 38.9967 | 8940 | 0.0012 | 0.9584 | 0.9759 | | 0.0013 | 39.9564 | 9160 | 0.0012 | 0.9589 | 0.9794 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "unlabeled", "pv" ]
mouadenna/segformer-b2-finetuned-segments-pv_v1_normalized_p100_4batch_fp
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/z400vwxm) # segformer-b2-finetuned-segments-pv_v1_normalized_p100_4batch_fp This model is a fine-tuned version of [nvidia/segformer-b2-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b2-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset. It achieves the following results on the evaluation set: - Loss: 0.0046 - Mean Iou: 0.8880 - Precision: 0.9115 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.01 - num_epochs: 40 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:-------:|:----:|:---------------:|:--------:|:---------:| | 0.6668 | 0.9989 | 229 | 0.4009 | 0.5075 | 0.5321 | | 0.2583 | 1.9978 | 458 | 0.1436 | 0.6208 | 0.6535 | | 0.1355 | 2.9967 | 687 | 0.0809 | 0.7078 | 0.7644 | | 0.088 | 4.0 | 917 | 0.0585 | 0.7472 | 0.8136 | | 0.0638 | 4.9989 | 1146 | 0.0452 | 0.7737 | 0.8353 | | 0.05 | 5.9978 | 1375 | 0.0365 | 0.7845 | 0.8394 | | 0.0401 | 6.9967 | 1604 | 0.0344 | 0.8087 | 0.8717 | | 0.0332 | 8.0 | 1834 | 0.0277 | 0.8128 | 0.8682 | | 0.0286 | 8.9989 | 2063 | 0.0188 | 0.8210 | 0.8710 | | 0.0247 | 9.9978 | 2292 | 0.0148 | 0.8369 | 0.8881 | | 0.0214 | 10.9967 | 2521 | 0.0133 | 0.8332 | 0.8716 | | 0.0189 | 12.0 | 2751 | 0.0156 | 0.8286 | 0.8597 | | 0.017 | 12.9989 | 2980 | 0.0139 | 0.8397 | 0.8726 | | 0.0151 | 13.9978 | 3209 | 0.0154 | 0.8544 | 0.8943 | | 0.0139 | 14.9967 | 3438 | 0.0114 | 0.8553 | 0.8897 | | 0.0127 | 16.0 | 3668 | 0.0108 | 0.8517 | 0.8799 | | 0.0118 | 16.9989 | 3897 | 0.0075 | 0.8658 | 0.9040 | | 0.0108 | 17.9978 | 4126 | 0.0094 | 0.8700 | 0.9088 | | 0.0101 | 18.9967 | 4355 | 0.0084 | 0.8746 | 0.9151 | | 0.0094 | 20.0 | 4585 | 0.0071 | 0.8693 | 0.8973 | | 0.0088 | 20.9989 | 4814 | 0.0071 | 0.8668 | 0.8931 | | 0.0082 | 21.9978 | 5043 | 0.0060 | 0.8786 | 0.9151 | | 0.008 | 22.9967 | 5272 | 0.0063 | 0.8776 | 0.9109 | | 0.0075 | 24.0 | 5502 | 0.0066 | 0.8776 | 0.9052 | | 0.0071 | 24.9989 | 5731 | 0.0060 | 0.8807 | 0.9115 | | 0.0069 | 25.9978 | 5960 | 0.0062 | 0.8766 | 0.9004 | | 0.0065 | 26.9967 | 6189 | 0.0059 | 0.8754 | 0.8963 | | 0.0063 | 28.0 | 6419 | 0.0062 | 0.8825 | 0.9086 | | 0.006 | 28.9989 | 6648 | 0.0050 | 0.8839 | 0.9101 | | 0.0059 | 29.9978 | 6877 | 0.0051 | 0.8827 | 0.9069 | | 0.0057 | 30.9967 | 7106 | 0.0056 | 0.8822 | 0.9053 | | 0.0055 | 32.0 | 7336 | 0.0047 | 0.8866 | 0.9133 | | 0.0055 | 32.9989 | 7565 | 0.0046 | 0.8876 | 0.9135 | | 0.0053 | 33.9978 | 7794 | 0.0052 | 0.8839 | 0.9053 | | 0.0052 | 34.9967 | 8023 | 0.0048 | 0.8828 | 0.9035 | | 0.0051 | 36.0 | 8253 | 0.0046 | 0.8897 | 0.9156 | | 0.005 | 36.9989 | 8482 | 0.0045 | 0.8891 | 0.9137 | | 0.005 | 37.9978 | 8711 | 0.0047 | 
0.8881 | 0.9120 | | 0.005 | 38.9967 | 8940 | 0.0047 | 0.8879 | 0.9110 | | 0.0049 | 39.9564 | 9160 | 0.0046 | 0.8880 | 0.9115 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
[ "unlabeled", "pv" ]
mouadenna/segformer-b1-finetuned-segments-pv_v1_normalized_p100_4batch_try
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/b36iwo31) # segformer-b1-finetuned-segments-pv_v1_normalized_p100_4batch_try This model is a fine-tuned version of [nvidia/segformer-b1-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b1-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset. It achieves the following results on the evaluation set: - Loss: 0.0012 - Mean Iou: 0.9586 - Precision: 0.9787 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0004 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 40 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision | |:-------------:|:-------:|:----:|:---------------:|:--------:|:---------:| | 0.2001 | 0.9989 | 229 | 0.0557 | 0.7433 | 0.8099 | | 0.0234 | 1.9978 | 458 | 0.0111 | 0.8026 | 0.8735 | | 0.0109 | 2.9967 | 687 | 0.0069 | 0.8264 | 0.9062 | | 0.0077 | 4.0 | 917 | 0.0052 | 0.8659 | 0.9087 | | 0.0053 | 4.9989 | 1146 | 0.0142 | 0.6652 | 0.6845 | | 0.0055 | 5.9978 | 1375 | 0.0041 | 0.8885 | 0.9319 | | 0.0041 | 6.9967 | 1604 | 0.0038 | 0.8855 | 0.9516 | | 0.0046 | 8.0 | 1834 | 0.0041 | 0.8787 | 0.9430 | | 0.0041 | 8.9989 | 2063 | 0.0035 | 0.8902 | 0.9105 | | 0.0035 | 9.9978 | 2292 | 0.0028 | 0.9107 | 0.9475 | | 0.0034 | 10.9967 | 2521 | 0.0027 | 0.9116 | 0.9390 | | 0.0038 | 12.0 | 2751 | 0.0031 | 0.9001 | 0.9306 | | 0.0033 | 12.9989 | 2980 | 0.0025 | 0.9160 | 0.9563 | | 0.0029 | 13.9978 | 3209 | 0.0026 | 0.9153 | 0.9444 | | 0.0026 | 14.9967 | 3438 | 0.0023 | 0.9218 | 0.9499 | | 0.0027 | 16.0 | 3668 | 0.0027 | 0.9086 | 0.9525 | | 0.0032 | 16.9989 | 3897 | 0.0029 | 0.9076 | 0.9439 | | 0.0033 | 17.9978 | 4126 | 0.0024 | 0.9192 | 0.9459 | | 0.0025 | 18.9967 | 4355 | 0.0027 | 0.9108 | 0.9538 | | 0.0024 | 20.0 | 4585 | 0.0021 | 0.9276 | 0.9559 | | 0.0022 | 20.9989 | 4814 | 0.0020 | 0.9316 | 0.9649 | | 0.0021 | 21.9978 | 5043 | 0.0021 | 0.9287 | 0.9571 | | 0.0025 | 22.9967 | 5272 | 0.0023 | 0.9217 | 0.9511 | | 0.0022 | 24.0 | 5502 | 0.0020 | 0.9309 | 0.9676 | | 0.002 | 24.9989 | 5731 | 0.0018 | 0.9360 | 0.9613 | | 0.002 | 25.9978 | 5960 | 0.0017 | 0.9394 | 0.9621 | | 0.0019 | 26.9967 | 6189 | 0.0017 | 0.9403 | 0.9664 | | 0.0018 | 28.0 | 6419 | 0.0017 | 0.9405 | 0.9566 | | 0.0017 | 28.9989 | 6648 | 0.0016 | 0.9438 | 0.9695 | | 0.0017 | 29.9978 | 6877 | 0.0015 | 0.9469 | 0.9755 | | 0.002 | 30.9967 | 7106 | 0.0016 | 0.9448 | 0.9672 | | 0.0018 | 32.0 | 7336 | 0.0015 | 0.9459 | 0.9693 | | 0.0016 | 32.9989 | 7565 | 0.0014 | 0.9486 | 0.9674 | | 0.0015 | 33.9978 | 7794 | 0.0013 | 0.9528 | 0.9761 | | 0.0015 | 34.9967 | 8023 | 0.0013 | 0.9520 | 0.9732 | | 0.0014 | 36.0 | 8253 | 0.0013 | 0.9541 | 0.9705 | | 0.0014 | 36.9989 | 8482 | 0.0012 | 0.9563 | 0.9739 | | 0.0014 | 37.9978 | 8711 | 
0.0012 | 0.9575 | 0.9764 | | 0.0014 | 38.9967 | 8940 | 0.0012 | 0.9581 | 0.9766 | | 0.0014 | 39.9564 | 9160 | 0.0012 | 0.9586 | 0.9787 | ### Framework versions - Transformers 4.42.3 - Pytorch 2.1.2 - Datasets 2.20.0 - Tokenizers 0.19.1
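Unlike its siblings, this run warms up over 10% of training (`lr_scheduler_warmup_ratio: 0.1`). A sketch of the equivalent scheduler wiring; the `nn.Linear` is a stand-in for the SegFormer model, and the step count is read off the final row of the table above.

```python
import torch.nn as nn
from torch.optim import Adam
from transformers import get_linear_schedule_with_warmup

model = nn.Linear(4, 2)  # stand-in for the SegFormer model
total_steps = 9160       # final optimizer step reported above
optimizer = Adam(model.parameters(), lr=4e-4, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=int(0.1 * total_steps),  # warmup_ratio 0.1
    num_training_steps=total_steps,
)
```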
[ "unlabeled", "pv" ]
mohit-mahavar/segformer-b0-finetuned-segments-sidewalk-july-24
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-finetuned-segments-sidewalk-july-24 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset. It achieves the following results on the evaluation set: - Loss: 0.6506 - Mean Iou: 0.2417 - Mean Accuracy: 0.2896 - Overall Accuracy: 0.8279 - Accuracy Unlabeled: nan - Accuracy Flat-road: 0.8456 - Accuracy Flat-sidewalk: 0.9612 - Accuracy Flat-crosswalk: 0.6353 - Accuracy Flat-cyclinglane: 0.7492 - Accuracy Flat-parkingdriveway: 0.3407 - Accuracy Flat-railtrack: nan - Accuracy Flat-curb: 0.4808 - Accuracy Human-person: 0.2392 - Accuracy Human-rider: 0.0 - Accuracy Vehicle-car: 0.9285 - Accuracy Vehicle-truck: 0.0 - Accuracy Vehicle-bus: 0.0 - Accuracy Vehicle-tramtrain: 0.0 - Accuracy Vehicle-motorcycle: 0.0 - Accuracy Vehicle-bicycle: 0.0 - Accuracy Vehicle-caravan: 0.0 - Accuracy Vehicle-cartrailer: 0.0 - Accuracy Construction-building: 0.9229 - Accuracy Construction-door: 0.0 - Accuracy Construction-wall: 0.3317 - Accuracy Construction-fenceguardrail: 0.0065 - Accuracy Construction-bridge: 0.0 - Accuracy Construction-tunnel: nan - Accuracy Construction-stairs: 0.0 - Accuracy Object-pole: 0.0667 - Accuracy Object-trafficsign: 0.0 - Accuracy Object-trafficlight: 0.0 - Accuracy Nature-vegetation: 0.9214 - Accuracy Nature-terrain: 0.8309 - Accuracy Sky: 0.9552 - Accuracy Void-ground: 0.0 - Accuracy Void-dynamic: 0.0 - Accuracy Void-static: 0.0520 - Accuracy Void-unclear: 0.0 - Iou Unlabeled: nan - Iou Flat-road: 0.7073 - Iou Flat-sidewalk: 0.8593 - Iou Flat-crosswalk: 0.4704 - Iou Flat-cyclinglane: 0.6415 - Iou Flat-parkingdriveway: 0.2779 - Iou Flat-railtrack: nan - Iou Flat-curb: 0.3844 - Iou Human-person: 0.2212 - Iou Human-rider: 0.0 - Iou Vehicle-car: 0.7450 - Iou Vehicle-truck: 0.0 - Iou Vehicle-bus: 0.0 - Iou Vehicle-tramtrain: 0.0 - Iou Vehicle-motorcycle: 0.0 - Iou Vehicle-bicycle: 0.0 - Iou Vehicle-caravan: 0.0 - Iou Vehicle-cartrailer: 0.0 - Iou Construction-building: 0.6131 - Iou Construction-door: 0.0 - Iou Construction-wall: 0.2762 - Iou Construction-fenceguardrail: 0.0064 - Iou Construction-bridge: 0.0 - Iou Construction-tunnel: nan - Iou Construction-stairs: 0.0 - Iou Object-pole: 0.0628 - Iou Object-trafficsign: 0.0 - Iou Object-trafficlight: 0.0 - Iou Nature-vegetation: 0.8426 - Iou Nature-terrain: 0.6748 - Iou Sky: 0.9052 - Iou Void-ground: 0.0 - Iou Void-dynamic: 0.0 - Iou Void-static: 0.0448 - Iou Void-unclear: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 5 - eval_batch_size: 5 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Flat-road | Accuracy Flat-sidewalk | Accuracy Flat-crosswalk | Accuracy Flat-cyclinglane | Accuracy Flat-parkingdriveway | Accuracy Flat-railtrack | Accuracy Flat-curb | Accuracy Human-person | Accuracy Human-rider | Accuracy Vehicle-car | Accuracy Vehicle-truck | Accuracy Vehicle-bus | Accuracy 
Vehicle-tramtrain | Accuracy Vehicle-motorcycle | Accuracy Vehicle-bicycle | Accuracy Vehicle-caravan | Accuracy Vehicle-cartrailer | Accuracy Construction-building | Accuracy Construction-door | Accuracy Construction-wall | Accuracy Construction-fenceguardrail | Accuracy Construction-bridge | Accuracy Construction-tunnel | Accuracy Construction-stairs | Accuracy Object-pole | Accuracy Object-trafficsign | Accuracy Object-trafficlight | Accuracy Nature-vegetation | Accuracy Nature-terrain | Accuracy Sky | Accuracy Void-ground | Accuracy Void-dynamic | Accuracy Void-static | Accuracy Void-unclear | Iou Unlabeled | Iou Flat-road | Iou Flat-sidewalk | Iou Flat-crosswalk | Iou Flat-cyclinglane | Iou Flat-parkingdriveway | Iou Flat-railtrack | Iou Flat-curb | Iou Human-person | Iou Human-rider | Iou Vehicle-car | Iou Vehicle-truck | Iou Vehicle-bus | Iou Vehicle-tramtrain | Iou Vehicle-motorcycle | Iou Vehicle-bicycle | Iou Vehicle-caravan | Iou Vehicle-cartrailer | Iou Construction-building | Iou Construction-door | Iou Construction-wall | Iou Construction-fenceguardrail | Iou Construction-bridge | Iou Construction-tunnel | Iou Construction-stairs | Iou Object-pole | Iou Object-trafficsign | Iou Object-trafficlight | Iou Nature-vegetation | Iou Nature-terrain | Iou Sky | Iou Void-ground | Iou Void-dynamic | Iou Void-static | Iou Void-unclear | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------------:|:----------------------:|:-----------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:------------------:|:---------------------:|:--------------------:|:--------------------:|:----------------------:|:--------------------:|:--------------------------:|:---------------------------:|:------------------------:|:------------------------:|:---------------------------:|:------------------------------:|:--------------------------:|:--------------------------:|:------------------------------------:|:----------------------------:|:----------------------------:|:----------------------------:|:--------------------:|:---------------------------:|:----------------------------:|:--------------------------:|:-----------------------:|:------------:|:--------------------:|:---------------------:|:--------------------:|:---------------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:| | 2.7039 | 0.125 | 20 | 3.0089 | 0.0927 | 0.1417 | 0.6174 | nan | 0.2343 | 0.9657 | 0.0 | 0.0002 | 0.0064 | nan | 0.0006 | 0.0046 | 0.0 | 0.9171 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4863 | 0.0 | 0.0058 | 0.0013 | 0.0 | nan | 0.0 | 0.0010 | 0.0 | 0.0 | 0.8955 | 0.4262 | 0.5880 | 0.0 | 0.0000 | 0.0017 | 0.0 | nan | 0.1897 | 0.6099 | 0.0 | 0.0002 | 0.0063 | 0.0 | 0.0006 | 0.0044 | 0.0 | 0.3982 | 
0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3652 | 0.0 | 0.0058 | 0.0013 | 0.0 | 0.0 | 0.0 | 0.0010 | 0.0 | 0.0 | 0.6971 | 0.3014 | 0.5676 | 0.0 | 0.0000 | 0.0016 | 0.0 | | 2.2735 | 0.25 | 40 | 2.2548 | 0.1103 | 0.1575 | 0.6776 | nan | 0.6866 | 0.9320 | 0.0 | 0.0005 | 0.0000 | nan | 0.0013 | 0.0 | 0.0 | 0.8456 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6957 | 0.0 | 0.0005 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.9660 | 0.2157 | 0.6972 | 0.0 | 0.0 | 0.0001 | 0.0 | nan | 0.4119 | 0.7211 | 0.0 | 0.0005 | 0.0000 | nan | 0.0013 | 0.0 | 0.0 | 0.5169 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4631 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.6679 | 0.1872 | 0.6707 | 0.0 | 0.0 | 0.0001 | 0.0 | | 2.0393 | 0.375 | 60 | 1.7801 | 0.1191 | 0.1625 | 0.6919 | nan | 0.6969 | 0.9508 | 0.0 | 0.0031 | 0.0009 | nan | 0.0006 | 0.0 | 0.0 | 0.8673 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7863 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9459 | 0.1857 | 0.7609 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4455 | 0.7216 | 0.0 | 0.0031 | 0.0009 | nan | 0.0006 | 0.0 | 0.0 | 0.5512 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4801 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7076 | 0.1728 | 0.7282 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.8708 | 0.5 | 80 | 1.6276 | 0.1275 | 0.1724 | 0.7060 | nan | 0.7259 | 0.9485 | 0.0 | 0.0013 | 0.0010 | nan | 0.0000 | 0.0 | 0.0 | 0.8795 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7958 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9465 | 0.3484 | 0.8713 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4479 | 0.7485 | 0.0 | 0.0013 | 0.0010 | nan | 0.0000 | 0.0 | 0.0 | 0.5599 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4945 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7277 | 0.3001 | 0.8002 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.5044 | 0.625 | 100 | 1.5017 | 0.1301 | 0.1760 | 0.7125 | nan | 0.7595 | 0.9437 | 0.0 | 0.0033 | 0.0013 | nan | 0.0001 | 0.0 | 0.0 | 0.8920 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8119 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9493 | 0.3820 | 0.8877 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4624 | 0.7647 | 0.0 | 0.0033 | 0.0013 | nan | 0.0001 | 0.0 | 0.0 | 0.5528 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4970 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7305 | 0.3416 | 0.8091 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.4321 | 0.75 | 120 | 1.4233 | 0.1402 | 0.1885 | 0.7249 | nan | 0.8351 | 0.9300 | 0.0 | 0.0105 | 0.0010 | nan | 0.0000 | 0.0 | 0.0 | 0.8837 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8178 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9093 | 0.7329 | 0.9126 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4557 | 0.7881 | 0.0 | 0.0105 | 0.0009 | nan | 0.0000 | 0.0 | 0.0 | 0.5882 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5046 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7708 | 0.5502 | 0.8158 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.6843 | 0.875 | 140 | 1.3702 | 0.1507 | 0.1979 | 0.7404 | nan | 0.7723 | 0.9568 | 0.0 | 0.2863 | 0.0010 | nan | 0.0 | 0.0 | 0.0 | 0.8670 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8437 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8800 | 0.8227 | 0.9022 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5235 | 0.7727 | 0.0 | 0.2830 | 0.0010 | nan | 0.0 | 0.0 | 0.0 | 0.6161 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5023 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7513 | 0.5414 | 0.8301 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.3841 | 1.0 | 160 | 1.2792 | 0.1557 | 0.1988 | 0.7493 | nan | 0.7821 | 0.9636 | 0.0 | 0.3139 | 0.0018 | nan | 0.0 | 
0.0 | 0.0 | 0.8292 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8621 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9167 | 0.7779 | 0.9129 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5274 | 0.7745 | 0.0 | 0.3110 | 0.0018 | nan | 0.0 | 0.0 | 0.0 | 0.6512 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5125 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7783 | 0.5755 | 0.8487 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.1633 | 1.125 | 180 | 1.2305 | 0.1604 | 0.2024 | 0.7565 | nan | 0.8239 | 0.9509 | 0.0 | 0.3913 | 0.0010 | nan | 0.0000 | 0.0 | 0.0 | 0.8394 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8849 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9317 | 0.7419 | 0.9104 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5266 | 0.7939 | 0.0 | 0.3819 | 0.0010 | nan | 0.0000 | 0.0 | 0.0 | 0.6576 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5129 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7859 | 0.6157 | 0.8568 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.3055 | 1.25 | 200 | 1.1705 | 0.1635 | 0.2072 | 0.7629 | nan | 0.8053 | 0.9581 | 0.0 | 0.4739 | 0.0029 | nan | 0.0 | 0.0 | 0.0 | 0.8556 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8819 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9212 | 0.8074 | 0.9230 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5555 | 0.7907 | 0.0 | 0.4442 | 0.0029 | nan | 0.0 | 0.0 | 0.0 | 0.6582 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5225 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7922 | 0.6029 | 0.8642 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.6919 | 1.375 | 220 | 1.0930 | 0.1602 | 0.2069 | 0.7577 | nan | 0.8569 | 0.9445 | 0.0 | 0.4060 | 0.0013 | nan | 0.0 | 0.0 | 0.0 | 0.8888 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8541 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8949 | 0.8292 | 0.9439 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5190 | 0.8032 | 0.0 | 0.3971 | 0.0013 | nan | 0.0 | 0.0 | 0.0 | 0.6341 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5287 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7930 | 0.6103 | 0.8382 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.118 | 1.5 | 240 | 1.0786 | 0.1674 | 0.2125 | 0.7710 | nan | 0.8654 | 0.9339 | 0.0 | 0.6481 | 0.0041 | nan | 0.0000 | 0.0 | 0.0 | 0.8382 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8874 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9438 | 0.7516 | 0.9260 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5651 | 0.8243 | 0.0 | 0.5386 | 0.0041 | nan | 0.0000 | 0.0 | 0.0 | 0.6591 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5203 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7854 | 0.5973 | 0.8640 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.1816 | 1.625 | 260 | 1.0322 | 0.1674 | 0.2108 | 0.7707 | nan | 0.8264 | 0.9640 | 0.0 | 0.5392 | 0.0030 | nan | 0.0 | 0.0 | 0.0 | 0.8668 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8937 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9099 | 0.8178 | 0.9234 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5789 | 0.8009 | 0.0 | 0.5026 | 0.0030 | nan | 0.0 | 0.0 | 0.0 | 0.6558 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5262 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7986 | 0.6229 | 0.8669 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.3272 | 1.75 | 280 | 1.0356 | 0.1675 | 0.2125 | 0.7696 | nan | 0.8953 | 0.9309 | 0.0 | 0.5460 | 0.0094 | nan | 0.0000 | 0.0 | 0.0 | 0.8843 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8848 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9254 | 0.8110 | 0.9144 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5270 | 0.8321 | 0.0 | 0.5074 | 0.0093 | nan | 0.0000 | 0.0 | 0.0 | 0.6563 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5356 | 0.0 | 0.0 | 0.0 | 0.0 
| nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8006 | 0.6232 | 0.8671 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.461 | 1.875 | 300 | 0.9943 | 0.1693 | 0.2156 | 0.7743 | nan | 0.8884 | 0.9378 | 0.0 | 0.6180 | 0.0141 | nan | 0.0000 | 0.0 | 0.0 | 0.9026 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8715 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9167 | 0.8110 | 0.9390 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5543 | 0.8316 | 0.0 | 0.5531 | 0.0141 | nan | 0.0000 | 0.0 | 0.0 | 0.6285 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5353 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7972 | 0.6415 | 0.8620 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.8503 | 2.0 | 320 | 0.9521 | 0.1711 | 0.2151 | 0.7775 | nan | 0.8387 | 0.9557 | 0.0 | 0.6275 | 0.0135 | nan | 0.0003 | 0.0 | 0.0 | 0.8690 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9001 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9270 | 0.8244 | 0.9256 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5928 | 0.8123 | 0.0 | 0.5584 | 0.0133 | nan | 0.0003 | 0.0 | 0.0 | 0.6652 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5333 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8006 | 0.6279 | 0.8704 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.2008 | 2.125 | 340 | 0.9569 | 0.1704 | 0.2123 | 0.7720 | nan | 0.7520 | 0.9779 | 0.0 | 0.6198 | 0.0077 | nan | 0.0006 | 0.0 | 0.0 | 0.8806 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8845 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9111 | 0.8300 | 0.9303 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5951 | 0.7718 | 0.0 | 0.5595 | 0.0077 | nan | 0.0006 | 0.0 | 0.0 | 0.6692 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5422 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8047 | 0.6285 | 0.8740 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.8722 | 2.25 | 360 | 0.9221 | 0.1723 | 0.2170 | 0.7789 | nan | 0.8242 | 0.9625 | 0.0 | 0.6288 | 0.0493 | nan | 0.0007 | 0.0 | 0.0 | 0.9050 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8935 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9096 | 0.8300 | 0.9396 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5859 | 0.8158 | 0.0 | 0.5678 | 0.0479 | nan | 0.0007 | 0.0 | 0.0 | 0.6478 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5349 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8071 | 0.6341 | 0.8712 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.1602 | 2.375 | 380 | 0.9269 | 0.1733 | 0.2208 | 0.7809 | nan | 0.8395 | 0.9412 | 0.0 | 0.7393 | 0.0701 | nan | 0.0081 | 0.0 | 0.0 | 0.8895 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8680 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9344 | 0.8348 | 0.9420 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5932 | 0.8310 | 0.0 | 0.5262 | 0.0669 | nan | 0.0080 | 0.0 | 0.0 | 0.6682 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5491 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7925 | 0.6338 | 0.8761 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.8556 | 2.5 | 400 | 0.8914 | 0.1749 | 0.2156 | 0.7832 | nan | 0.8609 | 0.9672 | 0.0 | 0.6034 | 0.0476 | nan | 0.0219 | 0.0 | 0.0 | 0.8707 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8965 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9313 | 0.7605 | 0.9404 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5965 | 0.8165 | 0.0 | 0.5586 | 0.0462 | nan | 0.0212 | 0.0 | 0.0 | 0.6914 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5493 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8042 | 0.6404 | 0.8739 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.7832 | 2.625 | 420 | 0.8639 | 0.1793 | 0.2246 | 0.7875 | nan | 0.8446 | 0.9522 | 0.0 | 0.7268 | 0.1335 | nan | 0.0357 | 0.0 | 0.0 | 0.8788 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9090 | 0.0 | 0.0002 | 0.0 | 0.0 | 
nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9112 | 0.8534 | 0.9423 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5949 | 0.8358 | 0.0 | 0.5974 | 0.1200 | nan | 0.0348 | 0.0 | 0.0 | 0.6925 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5485 | 0.0 | 0.0002 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8072 | 0.6257 | 0.8808 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.7566 | 2.75 | 440 | 0.8985 | 0.1767 | 0.2216 | 0.7825 | nan | 0.9093 | 0.9330 | 0.0 | 0.5968 | 0.1217 | nan | 0.0351 | 0.0 | 0.0 | 0.8878 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8986 | 0.0 | 0.0005 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9228 | 0.8387 | 0.9473 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5655 | 0.8442 | 0.0 | 0.5478 | 0.1101 | nan | 0.0337 | 0.0 | 0.0 | 0.6815 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5423 | 0.0 | 0.0005 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8100 | 0.6445 | 0.8733 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.0791 | 2.875 | 460 | 0.9042 | 0.1788 | 0.2190 | 0.7773 | nan | 0.7057 | 0.9779 | 0.0 | 0.6349 | 0.1378 | nan | 0.0828 | 0.0 | 0.0 | 0.8837 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9071 | 0.0 | 0.0014 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9216 | 0.8164 | 0.9398 | 0.0 | 0.0 | 0.0001 | 0.0 | nan | 0.5968 | 0.7738 | 0.0 | 0.5680 | 0.1201 | nan | 0.0771 | 0.0 | 0.0 | 0.6887 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5441 | 0.0 | 0.0014 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8191 | 0.6467 | 0.8869 | 0.0 | 0.0 | 0.0001 | 0.0 | | 0.8377 | 3.0 | 480 | 0.8429 | 0.1796 | 0.2260 | 0.7871 | nan | 0.9267 | 0.9288 | 0.0 | 0.6593 | 0.1312 | nan | 0.0794 | 0.0 | 0.0 | 0.9142 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8709 | 0.0 | 0.0016 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9259 | 0.8521 | 0.9413 | 0.0 | 0.0 | 0.0001 | 0.0 | nan | 0.5860 | 0.8471 | 0.0 | 0.5680 | 0.1189 | nan | 0.0753 | 0.0 | 0.0 | 0.6733 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5590 | 0.0 | 0.0016 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8040 | 0.6265 | 0.8870 | 0.0 | 0.0 | 0.0001 | 0.0 | | 1.5636 | 3.125 | 500 | 0.8272 | 0.1818 | 0.2244 | 0.7897 | nan | 0.8400 | 0.9588 | 0.0 | 0.6792 | 0.1355 | nan | 0.0755 | 0.0 | 0.0 | 0.8918 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8879 | 0.0 | 0.0002 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9349 | 0.8355 | 0.9428 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6105 | 0.8266 | 0.0 | 0.6082 | 0.1237 | nan | 0.0703 | 0.0 | 0.0 | 0.6854 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5470 | 0.0 | 0.0002 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8047 | 0.6558 | 0.8845 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.2729 | 3.25 | 520 | 0.8204 | 0.1838 | 0.2263 | 0.7898 | nan | 0.7866 | 0.9637 | 0.0 | 0.7059 | 0.2257 | nan | 0.0709 | 0.0 | 0.0 | 0.9080 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8927 | 0.0 | 0.0008 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9332 | 0.8135 | 0.9400 | 0.0 | 0.0 | 0.0000 | 0.0 | nan | 0.6097 | 0.8149 | 0.0 | 0.6131 | 0.1875 | nan | 0.0665 | 0.0 | 0.0 | 0.6782 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5551 | 0.0 | 0.0008 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8178 | 0.6511 | 0.8867 | 0.0 | 0.0 | 0.0000 | 0.0 | | 0.7536 | 3.375 | 540 | 0.7992 | 0.1854 | 0.2276 | 0.7947 | nan | 0.8473 | 0.9649 | 0.0 | 0.6560 | 0.1865 | nan | 0.1253 | 0.0 | 0.0 | 0.8913 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9034 | 0.0 | 0.0026 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9291 | 0.8171 | 0.9587 | 0.0 | 0.0 | 0.0002 | 0.0 | nan | 0.6343 | 0.8302 | 0.0 | 0.6038 | 0.1634 | nan | 0.1152 | 0.0 | 0.0 | 0.6917 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5472 | 0.0 | 0.0026 | 0.0 | 0.0 | nan | 0.0 | 
0.0 | 0.0 | 0.0 | 0.8180 | 0.6495 | 0.8756 | 0.0 | 0.0 | 0.0002 | 0.0 | | 0.8595 | 3.5 | 560 | 0.7944 | 0.1894 | 0.2319 | 0.7981 | nan | 0.8326 | 0.9625 | 0.0 | 0.7415 | 0.1863 | nan | 0.2000 | 0.0 | 0.0 | 0.8843 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9174 | 0.0 | 0.0036 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.9280 | 0.8172 | 0.9462 | 0.0 | 0.0 | 0.0004 | 0.0 | nan | 0.6394 | 0.8347 | 0.0 | 0.6209 | 0.1648 | nan | 0.1725 | 0.0 | 0.0 | 0.7060 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5465 | 0.0 | 0.0036 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.8234 | 0.6603 | 0.8892 | 0.0 | 0.0 | 0.0004 | 0.0 | | 0.8425 | 3.625 | 580 | 0.7920 | 0.1913 | 0.2334 | 0.7984 | nan | 0.8307 | 0.9633 | 0.0 | 0.6938 | 0.2358 | nan | 0.2450 | 0.0 | 0.0 | 0.9065 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9051 | 0.0 | 0.0021 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.9277 | 0.8127 | 0.9462 | 0.0 | 0.0 | 0.0006 | 0.0 | nan | 0.6338 | 0.8342 | 0.0 | 0.6288 | 0.1961 | nan | 0.2045 | 0.0 | 0.0 | 0.6905 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5515 | 0.0 | 0.0021 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.8214 | 0.6700 | 0.8883 | 0.0 | 0.0 | 0.0006 | 0.0 | | 0.9038 | 3.75 | 600 | 0.7786 | 0.1899 | 0.2322 | 0.7974 | nan | 0.8218 | 0.9663 | 0.0 | 0.7170 | 0.1656 | nan | 0.2636 | 0.0 | 0.0 | 0.9112 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9002 | 0.0 | 0.0109 | 0.0 | 0.0 | nan | 0.0 | 0.0002 | 0.0 | 0.0 | 0.9329 | 0.7955 | 0.9444 | 0.0 | 0.0 | 0.0006 | 0.0 | nan | 0.6327 | 0.8286 | 0.0 | 0.6128 | 0.1494 | nan | 0.2195 | 0.0 | 0.0 | 0.6980 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5555 | 0.0 | 0.0109 | 0.0 | 0.0 | nan | 0.0 | 0.0002 | 0.0 | 0.0 | 0.8243 | 0.6541 | 0.8890 | 0.0 | 0.0 | 0.0006 | 0.0 | | 1.1182 | 3.875 | 620 | 0.7855 | 0.1897 | 0.2326 | 0.7972 | nan | 0.9053 | 0.9481 | 0.0 | 0.5831 | 0.2101 | nan | 0.2801 | 0.0 | 0.0 | 0.8864 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9071 | 0.0 | 0.0198 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.9356 | 0.8163 | 0.9511 | 0.0 | 0.0 | 0.0005 | 0.0 | nan | 0.6224 | 0.8515 | 0.0 | 0.5415 | 0.1823 | nan | 0.2310 | 0.0 | 0.0 | 0.7143 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5509 | 0.0 | 0.0197 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.8129 | 0.6569 | 0.8869 | 0.0 | 0.0 | 0.0005 | 0.0 | | 0.8926 | 4.0 | 640 | 0.7673 | 0.1961 | 0.2411 | 0.8006 | nan | 0.8198 | 0.9569 | 0.0108 | 0.6882 | 0.2669 | nan | 0.4029 | 0.0 | 0.0 | 0.9175 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9146 | 0.0 | 0.0379 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.0 | 0.0 | 0.9178 | 0.8403 | 0.9411 | 0.0 | 0.0 | 0.0009 | 0.0 | nan | 0.6509 | 0.8376 | 0.0108 | 0.6053 | 0.2160 | nan | 0.3055 | 0.0 | 0.0 | 0.6959 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5508 | 0.0 | 0.0374 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.0 | 0.0 | 0.8242 | 0.6491 | 0.8902 | 0.0 | 0.0 | 0.0009 | 0.0 | | 0.517 | 4.125 | 660 | 0.7549 | 0.1916 | 0.2340 | 0.8008 | nan | 0.9087 | 0.9469 | 0.0 | 0.6697 | 0.2060 | nan | 0.2515 | 0.0 | 0.0 | 0.8948 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9054 | 0.0 | 0.0259 | 0.0 | 0.0 | nan | 0.0 | 0.0003 | 0.0 | 0.0 | 0.9409 | 0.7818 | 0.9568 | 0.0 | 0.0 | 0.0008 | 0.0 | nan | 0.6364 | 0.8498 | 0.0 | 0.6005 | 0.1790 | nan | 0.2160 | 0.0 | 0.0 | 0.7067 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5566 | 0.0 | 0.0257 | 0.0 | 0.0 | nan | 0.0 | 0.0003 | 0.0 | 0.0 | 0.8146 | 0.6600 | 0.8864 | 0.0 | 0.0 | 0.0008 | 0.0 | | 0.9158 | 4.25 | 680 | 0.7566 | 0.1948 | 0.2385 | 0.8018 | nan | 0.8922 | 0.9435 | 0.0 | 0.6622 | 0.2594 | nan | 0.2805 | 0.0 | 
0.0 | 0.9142 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9100 | 0.0 | 0.0682 | 0.0000 | 0.0 | nan | 0.0 | 0.0005 | 0.0 | 0.0 | 0.9308 | 0.8239 | 0.9458 | 0.0 | 0.0 | 0.0016 | 0.0 | nan | 0.6423 | 0.8485 | 0.0 | 0.5935 | 0.2123 | nan | 0.2332 | 0.0 | 0.0 | 0.7001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5573 | 0.0 | 0.0672 | 0.0000 | 0.0 | nan | 0.0 | 0.0005 | 0.0 | 0.0 | 0.8210 | 0.6650 | 0.8925 | 0.0 | 0.0 | 0.0016 | 0.0 | | 1.2964 | 4.375 | 700 | 0.7468 | 0.1991 | 0.2419 | 0.8038 | nan | 0.8221 | 0.9665 | 0.0304 | 0.6991 | 0.2118 | nan | 0.3813 | 0.0 | 0.0 | 0.8861 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9020 | 0.0 | 0.1202 | 0.0000 | 0.0 | nan | 0.0 | 0.0006 | 0.0 | 0.0 | 0.9289 | 0.8320 | 0.9571 | 0.0 | 0.0 | 0.0014 | 0.0 | nan | 0.6561 | 0.8322 | 0.0304 | 0.5971 | 0.1853 | nan | 0.3005 | 0.0 | 0.0 | 0.7211 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5654 | 0.0 | 0.1156 | 0.0000 | 0.0 | nan | 0.0 | 0.0006 | 0.0 | 0.0 | 0.8212 | 0.6560 | 0.8890 | 0.0 | 0.0 | 0.0014 | 0.0 | | 1.0563 | 4.5 | 720 | 0.7462 | 0.1993 | 0.2460 | 0.8027 | nan | 0.9184 | 0.9166 | 0.0504 | 0.6917 | 0.3088 | nan | 0.3301 | 0.0 | 0.0 | 0.9292 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9076 | 0.0 | 0.1184 | 0.0000 | 0.0 | nan | 0.0 | 0.0013 | 0.0 | 0.0 | 0.9385 | 0.8095 | 0.9498 | 0.0 | 0.0 | 0.0022 | 0.0 | nan | 0.6491 | 0.8513 | 0.0500 | 0.5678 | 0.2412 | nan | 0.2666 | 0.0 | 0.0 | 0.6916 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5639 | 0.0 | 0.1138 | 0.0000 | 0.0 | nan | 0.0 | 0.0013 | 0.0 | 0.0 | 0.8267 | 0.6601 | 0.8923 | 0.0 | 0.0 | 0.0021 | 0.0 | | 0.8957 | 4.625 | 740 | 0.7406 | 0.2011 | 0.2430 | 0.8042 | nan | 0.8290 | 0.9671 | 0.1172 | 0.6543 | 0.3003 | nan | 0.2570 | 0.0 | 0.0 | 0.9012 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9194 | 0.0 | 0.1181 | 0.0000 | 0.0 | nan | 0.0 | 0.0025 | 0.0 | 0.0 | 0.9155 | 0.8497 | 0.9401 | 0.0 | 0.0 | 0.0036 | 0.0 | nan | 0.6682 | 0.8307 | 0.1167 | 0.5947 | 0.2420 | nan | 0.2248 | 0.0 | 0.0 | 0.7170 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5653 | 0.0 | 0.1142 | 0.0000 | 0.0 | nan | 0.0 | 0.0025 | 0.0 | 0.0 | 0.8212 | 0.6416 | 0.8942 | 0.0 | 0.0 | 0.0036 | 0.0 | | 0.7002 | 4.75 | 760 | 0.7270 | 0.2069 | 0.2494 | 0.8126 | nan | 0.9017 | 0.9575 | 0.2078 | 0.6567 | 0.2954 | nan | 0.2933 | 0.0 | 0.0 | 0.9239 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9087 | 0.0 | 0.1253 | 0.0001 | 0.0 | nan | 0.0 | 0.0040 | 0.0 | 0.0 | 0.9307 | 0.8234 | 0.9502 | 0.0 | 0.0 | 0.0026 | 0.0 | nan | 0.6899 | 0.8534 | 0.2027 | 0.5985 | 0.2390 | nan | 0.2542 | 0.0 | 0.0 | 0.6982 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5740 | 0.0 | 0.1205 | 0.0001 | 0.0 | nan | 0.0 | 0.0040 | 0.0 | 0.0 | 0.8282 | 0.6618 | 0.8935 | 0.0 | 0.0 | 0.0025 | 0.0 | | 0.5669 | 4.875 | 780 | 0.7312 | 0.2104 | 0.2560 | 0.8065 | nan | 0.7918 | 0.9640 | 0.4076 | 0.7256 | 0.2672 | nan | 0.3281 | 0.0 | 0.0 | 0.9089 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8972 | 0.0 | 0.1592 | 0.0002 | 0.0 | nan | 0.0 | 0.0076 | 0.0 | 0.0 | 0.9271 | 0.8527 | 0.9520 | 0.0 | 0.0 | 0.0027 | 0.0 | nan | 0.6582 | 0.8336 | 0.3396 | 0.5972 | 0.2246 | nan | 0.2550 | 0.0 | 0.0 | 0.7194 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5818 | 0.0 | 0.1500 | 0.0002 | 0.0 | nan | 0.0 | 0.0075 | 0.0 | 0.0 | 0.8215 | 0.6477 | 0.8941 | 0.0 | 0.0 | 0.0026 | 0.0 | | 0.6062 | 5.0 | 800 | 0.7316 | 0.2048 | 0.2484 | 0.8087 | nan | 0.9198 | 0.9403 | 0.1631 | 0.6511 | 0.2664 | nan | 0.3733 | 0.0 | 0.0 | 0.9212 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9052 | 0.0 | 0.1084 | 0.0003 | 0.0 | nan | 0.0 | 0.0074 | 0.0 | 0.0 | 0.9431 
| 0.7977 | 0.9508 | 0.0 | 0.0 | 0.0022 | 0.0 | nan | 0.6534 | 0.8587 | 0.1562 | 0.5951 | 0.2217 | nan | 0.2974 | 0.0 | 0.0 | 0.7053 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5743 | 0.0 | 0.1056 | 0.0003 | 0.0 | nan | 0.0 | 0.0073 | 0.0 | 0.0 | 0.8194 | 0.6628 | 0.8950 | 0.0 | 0.0 | 0.0022 | 0.0 | | 1.2116 | 5.125 | 820 | 0.7260 | 0.2143 | 0.2624 | 0.8098 | nan | 0.8476 | 0.9493 | 0.4505 | 0.7482 | 0.3397 | nan | 0.3730 | 0.0 | 0.0 | 0.9079 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9192 | 0.0 | 0.1304 | 0.0001 | 0.0 | nan | 0.0 | 0.0115 | 0.0 | 0.0 | 0.8961 | 0.8672 | 0.9520 | 0.0 | 0.0 | 0.0052 | 0.0 | nan | 0.6816 | 0.8509 | 0.3653 | 0.6222 | 0.2567 | nan | 0.3043 | 0.0 | 0.0 | 0.7271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5698 | 0.0 | 0.1254 | 0.0001 | 0.0 | nan | 0.0 | 0.0114 | 0.0 | 0.0 | 0.8143 | 0.6268 | 0.8961 | 0.0 | 0.0 | 0.0051 | 0.0 | | 0.6622 | 5.25 | 840 | 0.7183 | 0.2123 | 0.2546 | 0.8111 | nan | 0.8410 | 0.9627 | 0.3592 | 0.6780 | 0.2762 | nan | 0.3940 | 0.0 | 0.0 | 0.9109 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8906 | 0.0 | 0.1500 | 0.0001 | 0.0 | nan | 0.0 | 0.0092 | 0.0 | 0.0 | 0.9567 | 0.7505 | 0.9588 | 0.0 | 0.0 | 0.0080 | 0.0 | nan | 0.6685 | 0.8444 | 0.3060 | 0.6187 | 0.2338 | nan | 0.3208 | 0.0 | 0.0 | 0.7154 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5878 | 0.0 | 0.1431 | 0.0001 | 0.0 | nan | 0.0 | 0.0091 | 0.0 | 0.0 | 0.8061 | 0.6375 | 0.8938 | 0.0 | 0.0 | 0.0075 | 0.0 | | 0.6568 | 5.375 | 860 | 0.7108 | 0.2135 | 0.2588 | 0.8149 | nan | 0.8678 | 0.9594 | 0.3057 | 0.7247 | 0.2691 | nan | 0.3707 | 0.0000 | 0.0 | 0.9086 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9145 | 0.0 | 0.1877 | 0.0002 | 0.0 | nan | 0.0 | 0.0148 | 0.0 | 0.0 | 0.9123 | 0.8736 | 0.9612 | 0.0 | 0.0 | 0.0100 | 0.0 | nan | 0.6961 | 0.8508 | 0.2737 | 0.6065 | 0.2302 | nan | 0.3062 | 0.0000 | 0.0 | 0.7328 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5865 | 0.0 | 0.1737 | 0.0002 | 0.0 | nan | 0.0 | 0.0146 | 0.0 | 0.0 | 0.8258 | 0.6340 | 0.8928 | 0.0 | 0.0 | 0.0093 | 0.0 | | 0.7026 | 5.5 | 880 | 0.6979 | 0.2195 | 0.2632 | 0.8184 | nan | 0.8830 | 0.9646 | 0.5119 | 0.6625 | 0.2476 | nan | 0.4373 | 0.0003 | 0.0 | 0.9230 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9084 | 0.0 | 0.1983 | 0.0002 | 0.0 | nan | 0.0 | 0.0127 | 0.0 | 0.0 | 0.9465 | 0.7526 | 0.9570 | 0.0 | 0.0 | 0.0161 | 0.0 | nan | 0.7059 | 0.8546 | 0.4116 | 0.6166 | 0.2170 | nan | 0.3350 | 0.0003 | 0.0 | 0.7179 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5901 | 0.0 | 0.1783 | 0.0002 | 0.0 | nan | 0.0 | 0.0125 | 0.0 | 0.0 | 0.8251 | 0.6492 | 0.8967 | 0.0 | 0.0 | 0.0144 | 0.0 | | 0.5068 | 5.625 | 900 | 0.6910 | 0.2194 | 0.2666 | 0.8172 | nan | 0.8937 | 0.9482 | 0.4439 | 0.7078 | 0.3253 | nan | 0.3809 | 0.0011 | 0.0 | 0.9174 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9192 | 0.0 | 0.2342 | 0.0002 | 0.0 | nan | 0.0 | 0.0174 | 0.0 | 0.0 | 0.9070 | 0.8719 | 0.9530 | 0.0 | 0.0 | 0.0091 | 0.0 | nan | 0.6918 | 0.8607 | 0.3586 | 0.6245 | 0.2597 | nan | 0.3113 | 0.0011 | 0.0 | 0.7224 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5911 | 0.0 | 0.2093 | 0.0002 | 0.0 | nan | 0.0 | 0.0171 | 0.0 | 0.0 | 0.8222 | 0.6455 | 0.8977 | 0.0 | 0.0 | 0.0085 | 0.0 | | 0.6241 | 5.75 | 920 | 0.6961 | 0.2192 | 0.2623 | 0.8150 | nan | 0.8089 | 0.9673 | 0.4393 | 0.7130 | 0.2755 | nan | 0.4172 | 0.0015 | 0.0 | 0.9055 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9157 | 0.0 | 0.2460 | 0.0002 | 0.0 | nan | 0.0 | 0.0139 | 0.0 | 0.0 | 0.9451 | 0.7834 | 0.9500 | 0.0 | 0.0 | 0.0097 | 0.0 | nan | 0.6721 | 0.8422 | 0.3538 | 0.6107 | 0.2366 | nan | 0.3326 
| 0.0015 | 0.0 | 0.7309 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5902 | 0.0 | 0.2189 | 0.0002 | 0.0 | nan | 0.0 | 0.0137 | 0.0 | 0.0 | 0.8258 | 0.6754 | 0.9008 | 0.0 | 0.0 | 0.0091 | 0.0 | | 0.5583 | 5.875 | 940 | 0.6922 | 0.2240 | 0.2723 | 0.8180 | nan | 0.8839 | 0.9504 | 0.6275 | 0.6524 | 0.3259 | nan | 0.4297 | 0.0049 | 0.0 | 0.9446 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8925 | 0.0 | 0.2647 | 0.0004 | 0.0 | nan | 0.0 | 0.0172 | 0.0 | 0.0 | 0.9294 | 0.8227 | 0.9541 | 0.0 | 0.0 | 0.0118 | 0.0 | nan | 0.6749 | 0.8578 | 0.4585 | 0.6086 | 0.2598 | nan | 0.3321 | 0.0049 | 0.0 | 0.7004 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6037 | 0.0 | 0.2309 | 0.0004 | 0.0 | nan | 0.0 | 0.0167 | 0.0 | 0.0 | 0.8317 | 0.6789 | 0.8982 | 0.0 | 0.0 | 0.0107 | 0.0 | | 0.8393 | 6.0 | 960 | 0.6903 | 0.2229 | 0.2711 | 0.8163 | nan | 0.8327 | 0.9467 | 0.5575 | 0.7482 | 0.3287 | nan | 0.4721 | 0.0047 | 0.0 | 0.9107 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9186 | 0.0 | 0.2184 | 0.0004 | 0.0 | nan | 0.0 | 0.0264 | 0.0 | 0.0 | 0.9399 | 0.8006 | 0.9552 | 0.0 | 0.0 | 0.0147 | 0.0 | nan | 0.6747 | 0.8594 | 0.4295 | 0.5751 | 0.2643 | nan | 0.3685 | 0.0047 | 0.0 | 0.7372 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5853 | 0.0 | 0.1958 | 0.0004 | 0.0 | nan | 0.0 | 0.0257 | 0.0 | 0.0 | 0.8320 | 0.6682 | 0.8987 | 0.0 | 0.0 | 0.0135 | 0.0 | | 0.6751 | 6.125 | 980 | 0.6844 | 0.2255 | 0.2728 | 0.8193 | nan | 0.8349 | 0.9647 | 0.6329 | 0.7137 | 0.2983 | nan | 0.4436 | 0.0047 | 0.0 | 0.9244 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9051 | 0.0 | 0.2648 | 0.0005 | 0.0 | nan | 0.0 | 0.0277 | 0.0 | 0.0 | 0.9279 | 0.8087 | 0.9571 | 0.0 | 0.0 | 0.0214 | 0.0 | nan | 0.6860 | 0.8516 | 0.4667 | 0.5964 | 0.2461 | nan | 0.3539 | 0.0047 | 0.0 | 0.7262 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6084 | 0.0 | 0.2316 | 0.0005 | 0.0 | nan | 0.0 | 0.0269 | 0.0 | 0.0 | 0.8319 | 0.6703 | 0.8971 | 0.0 | 0.0 | 0.0190 | 0.0 | | 0.7188 | 6.25 | 1000 | 0.6903 | 0.2266 | 0.2751 | 0.8196 | nan | 0.8925 | 0.9514 | 0.6592 | 0.6630 | 0.2927 | nan | 0.4388 | 0.0143 | 0.0 | 0.9096 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9233 | 0.0 | 0.2859 | 0.0009 | 0.0 | nan | 0.0 | 0.0311 | 0.0 | 0.0 | 0.9187 | 0.8378 | 0.9501 | 0.0 | 0.0 | 0.0333 | 0.0 | nan | 0.6885 | 0.8596 | 0.4619 | 0.5986 | 0.2428 | nan | 0.3487 | 0.0143 | 0.0 | 0.7411 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5991 | 0.0 | 0.2396 | 0.0009 | 0.0 | nan | 0.0 | 0.0301 | 0.0 | 0.0 | 0.8326 | 0.6644 | 0.8992 | 0.0 | 0.0 | 0.0286 | 0.0 | | 0.7847 | 6.375 | 1020 | 0.6665 | 0.2266 | 0.2726 | 0.8234 | nan | 0.8969 | 0.9486 | 0.4908 | 0.7168 | 0.3156 | nan | 0.4230 | 0.0130 | 0.0 | 0.9263 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9036 | 0.0 | 0.3229 | 0.0006 | 0.0 | nan | 0.0 | 0.0250 | 0.0 | 0.0 | 0.9378 | 0.8238 | 0.9516 | 0.0 | 0.0 | 0.0276 | 0.0 | nan | 0.6964 | 0.8650 | 0.3856 | 0.6312 | 0.2553 | nan | 0.3408 | 0.0130 | 0.0 | 0.7299 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6100 | 0.0 | 0.2694 | 0.0006 | 0.0 | nan | 0.0 | 0.0244 | 0.0 | 0.0 | 0.8309 | 0.6726 | 0.9013 | 0.0 | 0.0 | 0.0239 | 0.0 | | 0.6132 | 6.5 | 1040 | 0.6789 | 0.2253 | 0.2743 | 0.8189 | nan | 0.8332 | 0.9432 | 0.4871 | 0.7784 | 0.3338 | nan | 0.4557 | 0.0417 | 0.0 | 0.9361 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8995 | 0.0 | 0.3001 | 0.0008 | 0.0 | nan | 0.0 | 0.0262 | 0.0 | 0.0 | 0.9419 | 0.8247 | 0.9546 | 0.0 | 0.0 | 0.0201 | 0.0 | nan | 0.6820 | 0.8635 | 0.3920 | 0.5818 | 0.2675 | nan | 0.3594 | 0.0415 | 0.0 | 0.7125 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6042 | 0.0 | 0.2561 | 
0.0008 | 0.0 | nan | 0.0 | 0.0256 | 0.0 | 0.0 | 0.8305 | 0.6719 | 0.9013 | 0.0 | 0.0 | 0.0175 | 0.0 | | 0.757 | 6.625 | 1060 | 0.6778 | 0.2256 | 0.2704 | 0.8195 | nan | 0.8437 | 0.9615 | 0.5217 | 0.6847 | 0.3283 | nan | 0.4550 | 0.0330 | 0.0 | 0.9264 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9071 | 0.0 | 0.2352 | 0.0017 | 0.0 | nan | 0.0 | 0.0316 | 0.0 | 0.0 | 0.9380 | 0.8140 | 0.9526 | 0.0 | 0.0 | 0.0188 | 0.0 | nan | 0.6780 | 0.8534 | 0.4110 | 0.6227 | 0.2644 | nan | 0.3591 | 0.0328 | 0.0 | 0.7297 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5954 | 0.0 | 0.2124 | 0.0017 | 0.0 | nan | 0.0 | 0.0305 | 0.0 | 0.0 | 0.8324 | 0.6782 | 0.9016 | 0.0 | 0.0 | 0.0169 | 0.0 | | 0.5827 | 6.75 | 1080 | 0.6690 | 0.2292 | 0.2774 | 0.8209 | nan | 0.8387 | 0.9524 | 0.5467 | 0.7103 | 0.3471 | nan | 0.5196 | 0.0267 | 0.0 | 0.9203 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8834 | 0.0 | 0.3463 | 0.0005 | 0.0 | nan | 0.0 | 0.0383 | 0.0 | 0.0 | 0.9426 | 0.8170 | 0.9585 | 0.0 | 0.0 | 0.0296 | 0.0 | nan | 0.6778 | 0.8568 | 0.4135 | 0.6136 | 0.2751 | nan | 0.3848 | 0.0267 | 0.0 | 0.7310 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6146 | 0.0 | 0.2799 | 0.0005 | 0.0 | nan | 0.0 | 0.0366 | 0.0 | 0.0 | 0.8258 | 0.6708 | 0.8999 | 0.0 | 0.0 | 0.0268 | 0.0 | | 0.5645 | 6.875 | 1100 | 0.6721 | 0.2287 | 0.2786 | 0.8201 | nan | 0.8276 | 0.9558 | 0.6028 | 0.7323 | 0.3222 | nan | 0.5230 | 0.0413 | 0.0 | 0.9318 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9212 | 0.0 | 0.2727 | 0.0011 | 0.0 | nan | 0.0 | 0.0408 | 0.0 | 0.0 | 0.9145 | 0.8597 | 0.9492 | 0.0 | 0.0 | 0.0196 | 0.0 | nan | 0.6883 | 0.8569 | 0.4496 | 0.6190 | 0.2588 | nan | 0.3889 | 0.0411 | 0.0 | 0.7250 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5968 | 0.0 | 0.2404 | 0.0011 | 0.0 | nan | 0.0 | 0.0389 | 0.0 | 0.0 | 0.8368 | 0.6577 | 0.9024 | 0.0 | 0.0 | 0.0177 | 0.0 | | 0.8222 | 7.0 | 1120 | 0.6668 | 0.2267 | 0.2726 | 0.8219 | nan | 0.8925 | 0.9475 | 0.4616 | 0.7190 | 0.3154 | nan | 0.4423 | 0.0497 | 0.0 | 0.9057 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9071 | 0.0 | 0.2813 | 0.0017 | 0.0 | nan | 0.0 | 0.0466 | 0.0 | 0.0 | 0.9378 | 0.8262 | 0.9620 | 0.0 | 0.0 | 0.0269 | 0.0 | nan | 0.6848 | 0.8636 | 0.3532 | 0.6294 | 0.2559 | nan | 0.3507 | 0.0493 | 0.0 | 0.7454 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6030 | 0.0 | 0.2484 | 0.0017 | 0.0 | nan | 0.0 | 0.0439 | 0.0 | 0.0 | 0.8288 | 0.6725 | 0.8981 | 0.0 | 0.0 | 0.0242 | 0.0 | | 0.5825 | 7.125 | 1140 | 0.6610 | 0.2285 | 0.2745 | 0.8231 | nan | 0.8851 | 0.9550 | 0.4756 | 0.6945 | 0.3191 | nan | 0.4265 | 0.0842 | 0.0 | 0.9336 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8943 | 0.0 | 0.3141 | 0.0051 | 0.0 | nan | 0.0 | 0.0430 | 0.0 | 0.0 | 0.9340 | 0.8300 | 0.9585 | 0.0 | 0.0 | 0.0301 | 0.0 | nan | 0.6921 | 0.8597 | 0.3645 | 0.6279 | 0.2610 | nan | 0.3379 | 0.0828 | 0.0 | 0.7260 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6170 | 0.0 | 0.2677 | 0.0051 | 0.0 | nan | 0.0 | 0.0409 | 0.0 | 0.0 | 0.8303 | 0.6723 | 0.9011 | 0.0 | 0.0 | 0.0267 | 0.0 | | 0.6208 | 7.25 | 1160 | 0.6523 | 0.2350 | 0.2832 | 0.8265 | nan | 0.8894 | 0.9477 | 0.5701 | 0.7105 | 0.3767 | nan | 0.4764 | 0.1025 | 0.0 | 0.9351 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8982 | 0.0 | 0.3659 | 0.0056 | 0.0 | nan | 0.0 | 0.0426 | 0.0 | 0.0 | 0.9318 | 0.8177 | 0.9535 | 0.0 | 0.0 | 0.0377 | 0.0 | nan | 0.7121 | 0.8641 | 0.4240 | 0.6383 | 0.2877 | nan | 0.3739 | 0.0997 | 0.0 | 0.7217 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6189 | 0.0 | 0.2844 | 0.0055 | 0.0 | nan | 0.0 | 0.0408 | 0.0 | 0.0 | 0.8368 | 0.6767 | 0.9029 | 0.0 | 0.0 | 
0.0323 | 0.0 | | 0.7019 | 7.375 | 1180 | 0.6583 | 0.2319 | 0.2790 | 0.8243 | nan | 0.8498 | 0.9577 | 0.5262 | 0.7398 | 0.3133 | nan | 0.4820 | 0.0729 | 0.0 | 0.9187 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9122 | 0.0 | 0.3430 | 0.0049 | 0.0 | nan | 0.0 | 0.0523 | 0.0 | 0.0 | 0.9290 | 0.8367 | 0.9540 | 0.0 | 0.0 | 0.0366 | 0.0 | nan | 0.6952 | 0.8597 | 0.4018 | 0.6244 | 0.2605 | nan | 0.3763 | 0.0717 | 0.0 | 0.7417 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6152 | 0.0 | 0.2792 | 0.0048 | 0.0 | nan | 0.0 | 0.0494 | 0.0 | 0.0 | 0.8348 | 0.6692 | 0.9035 | 0.0 | 0.0 | 0.0317 | 0.0 | | 0.6965 | 7.5 | 1200 | 0.6598 | 0.2298 | 0.2779 | 0.8231 | nan | 0.8653 | 0.9488 | 0.5446 | 0.7388 | 0.2980 | nan | 0.4756 | 0.0534 | 0.0 | 0.9266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9094 | 0.0 | 0.3351 | 0.0027 | 0.0 | nan | 0.0 | 0.0456 | 0.0 | 0.0 | 0.9396 | 0.8185 | 0.9542 | 0.0 | 0.0 | 0.0350 | 0.0 | nan | 0.6883 | 0.8631 | 0.4118 | 0.6177 | 0.2535 | nan | 0.3616 | 0.0529 | 0.0 | 0.7321 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6135 | 0.0 | 0.2778 | 0.0027 | 0.0 | nan | 0.0 | 0.0434 | 0.0 | 0.0 | 0.8334 | 0.6669 | 0.9042 | 0.0 | 0.0 | 0.0300 | 0.0 | | 0.8464 | 7.625 | 1220 | 0.6536 | 0.2327 | 0.2804 | 0.8234 | nan | 0.8563 | 0.9492 | 0.5608 | 0.7259 | 0.3495 | nan | 0.4742 | 0.0939 | 0.0 | 0.9284 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9038 | 0.0 | 0.3330 | 0.0044 | 0.0 | nan | 0.0 | 0.0428 | 0.0 | 0.0 | 0.9403 | 0.8124 | 0.9580 | 0.0 | 0.0 | 0.0406 | 0.0 | nan | 0.6836 | 0.8603 | 0.4218 | 0.6183 | 0.2773 | nan | 0.3674 | 0.0909 | 0.0 | 0.7345 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6173 | 0.0 | 0.2801 | 0.0044 | 0.0 | nan | 0.0 | 0.0408 | 0.0 | 0.0 | 0.8352 | 0.6748 | 0.9038 | 0.0 | 0.0 | 0.0351 | 0.0 | | 0.8663 | 7.75 | 1240 | 0.6536 | 0.2356 | 0.2856 | 0.8251 | nan | 0.8318 | 0.9568 | 0.7155 | 0.7355 | 0.3685 | nan | 0.4944 | 0.0829 | 0.0 | 0.9280 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9053 | 0.0 | 0.3079 | 0.0030 | 0.0 | nan | 0.0 | 0.0560 | 0.0 | 0.0 | 0.9366 | 0.8176 | 0.9555 | 0.0 | 0.0 | 0.0423 | 0.0 | nan | 0.7027 | 0.8595 | 0.4845 | 0.6298 | 0.2828 | nan | 0.3846 | 0.0809 | 0.0 | 0.7348 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6135 | 0.0 | 0.2656 | 0.0030 | 0.0 | nan | 0.0 | 0.0529 | 0.0 | 0.0 | 0.8313 | 0.6726 | 0.9042 | 0.0 | 0.0 | 0.0366 | 0.0 | | 0.5512 | 7.875 | 1260 | 0.6584 | 0.2349 | 0.2828 | 0.8232 | nan | 0.8085 | 0.9621 | 0.6023 | 0.7799 | 0.3110 | nan | 0.4580 | 0.1459 | 0.0 | 0.9263 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9059 | 0.0 | 0.3185 | 0.0045 | 0.0 | nan | 0.0 | 0.0662 | 0.0 | 0.0 | 0.9351 | 0.8235 | 0.9535 | 0.0 | 0.0 | 0.0480 | 0.0 | nan | 0.6891 | 0.8552 | 0.4529 | 0.6063 | 0.2572 | nan | 0.3668 | 0.1394 | 0.0 | 0.7408 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6184 | 0.0 | 0.2729 | 0.0045 | 0.0 | nan | 0.0 | 0.0622 | 0.0 | 0.0 | 0.8325 | 0.6726 | 0.9052 | 0.0 | 0.0 | 0.0418 | 0.0 | | 0.5665 | 8.0 | 1280 | 0.6482 | 0.2360 | 0.2830 | 0.8275 | nan | 0.8853 | 0.9485 | 0.5162 | 0.7434 | 0.3535 | nan | 0.4637 | 0.1514 | 0.0 | 0.9285 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9162 | 0.0 | 0.3339 | 0.0028 | 0.0 | nan | 0.0 | 0.0583 | 0.0 | 0.0 | 0.9347 | 0.8114 | 0.9573 | 0.0 | 0.0 | 0.0493 | 0.0 | nan | 0.7097 | 0.8658 | 0.3973 | 0.6420 | 0.2813 | nan | 0.3677 | 0.1442 | 0.0 | 0.7370 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6136 | 0.0 | 0.2780 | 0.0028 | 0.0 | nan | 0.0 | 0.0555 | 0.0 | 0.0 | 0.8368 | 0.6747 | 0.9040 | 0.0 | 0.0 | 0.0425 | 0.0 | | 0.5607 | 8.125 | 1300 | 0.6419 | 0.2416 | 0.2915 | 0.8289 | nan | 0.8491 | 
0.9566 | 0.7230 | 0.7390 | 0.3663 | nan | 0.5171 | 0.1725 | 0.0 | 0.9251 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9203 | 0.0 | 0.3376 | 0.0034 | 0.0 | nan | 0.0 | 0.0559 | 0.0 | 0.0 | 0.9280 | 0.8273 | 0.9566 | 0.0 | 0.0 | 0.0498 | 0.0 | nan | 0.7179 | 0.8623 | 0.4966 | 0.6534 | 0.2854 | nan | 0.3995 | 0.1624 | 0.0 | 0.7419 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6104 | 0.0 | 0.2804 | 0.0034 | 0.0 | nan | 0.0 | 0.0531 | 0.0 | 0.0 | 0.8403 | 0.6767 | 0.9041 | 0.0 | 0.0 | 0.0424 | 0.0 | | 0.547 | 8.25 | 1320 | 0.6452 | 0.2401 | 0.2904 | 0.8271 | nan | 0.8454 | 0.9559 | 0.6903 | 0.7527 | 0.3467 | nan | 0.4951 | 0.1829 | 0.0 | 0.9270 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9105 | 0.0 | 0.3385 | 0.0054 | 0.0 | nan | 0.0 | 0.0581 | 0.0 | 0.0 | 0.9213 | 0.8586 | 0.9579 | 0.0 | 0.0 | 0.0464 | 0.0 | nan | 0.7055 | 0.8615 | 0.4924 | 0.6390 | 0.2790 | nan | 0.3841 | 0.1729 | 0.0 | 0.7426 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6174 | 0.0 | 0.2814 | 0.0053 | 0.0 | nan | 0.0 | 0.0551 | 0.0 | 0.0 | 0.8342 | 0.6674 | 0.9038 | 0.0 | 0.0 | 0.0402 | 0.0 | | 0.575 | 8.375 | 1340 | 0.6417 | 0.2399 | 0.2890 | 0.8287 | nan | 0.8789 | 0.9506 | 0.6499 | 0.7599 | 0.3400 | nan | 0.4815 | 0.2067 | 0.0 | 0.9264 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9142 | 0.0 | 0.3073 | 0.0055 | 0.0 | nan | 0.0 | 0.0620 | 0.0 | 0.0 | 0.9323 | 0.8271 | 0.9594 | 0.0 | 0.0 | 0.0458 | 0.0 | nan | 0.7142 | 0.8665 | 0.4647 | 0.6435 | 0.2786 | nan | 0.3783 | 0.1939 | 0.0 | 0.7404 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6117 | 0.0 | 0.2651 | 0.0054 | 0.0 | nan | 0.0 | 0.0587 | 0.0 | 0.0 | 0.8357 | 0.6768 | 0.9026 | 0.0 | 0.0 | 0.0398 | 0.0 | | 0.5761 | 8.5 | 1360 | 0.6437 | 0.2396 | 0.2871 | 0.8275 | nan | 0.8529 | 0.9573 | 0.6381 | 0.7561 | 0.3359 | nan | 0.4659 | 0.2093 | 0.0 | 0.9267 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9085 | 0.0 | 0.3204 | 0.0053 | 0.0 | nan | 0.0 | 0.0593 | 0.0 | 0.0 | 0.9411 | 0.8007 | 0.9573 | 0.0 | 0.0 | 0.0531 | 0.0 | nan | 0.7070 | 0.8614 | 0.4658 | 0.6388 | 0.2718 | nan | 0.3759 | 0.1970 | 0.0 | 0.7403 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6175 | 0.0 | 0.2723 | 0.0053 | 0.0 | nan | 0.0 | 0.0564 | 0.0 | 0.0 | 0.8342 | 0.6734 | 0.9050 | 0.0 | 0.0 | 0.0455 | 0.0 | | 0.7298 | 8.625 | 1380 | 0.6421 | 0.2433 | 0.2916 | 0.8303 | nan | 0.8762 | 0.9573 | 0.6831 | 0.7320 | 0.3450 | nan | 0.4702 | 0.2331 | 0.0 | 0.9270 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8965 | 0.0 | 0.3681 | 0.0054 | 0.0 | nan | 0.0 | 0.0602 | 0.0 | 0.0 | 0.9374 | 0.8236 | 0.9518 | 0.0 | 0.0 | 0.0658 | 0.0 | nan | 0.7182 | 0.8632 | 0.4795 | 0.6617 | 0.2789 | nan | 0.3749 | 0.2202 | 0.0 | 0.7413 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6254 | 0.0 | 0.2929 | 0.0053 | 0.0 | nan | 0.0 | 0.0571 | 0.0 | 0.0 | 0.8344 | 0.6728 | 0.9059 | 0.0 | 0.0 | 0.0552 | 0.0 | | 1.1434 | 8.75 | 1400 | 0.6460 | 0.2406 | 0.2872 | 0.8294 | nan | 0.8834 | 0.9592 | 0.6247 | 0.7015 | 0.3320 | nan | 0.4713 | 0.2162 | 0.0 | 0.9257 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9093 | 0.0 | 0.3398 | 0.0049 | 0.0 | nan | 0.0 | 0.0614 | 0.0 | 0.0 | 0.9377 | 0.8156 | 0.9555 | 0.0 | 0.0 | 0.0517 | 0.0 | nan | 0.7119 | 0.8632 | 0.4541 | 0.6502 | 0.2749 | nan | 0.3728 | 0.2042 | 0.0 | 0.7436 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6185 | 0.0 | 0.2802 | 0.0048 | 0.0 | nan | 0.0 | 0.0580 | 0.0 | 0.0 | 0.8371 | 0.6764 | 0.9055 | 0.0 | 0.0 | 0.0440 | 0.0 | | 0.6032 | 8.875 | 1420 | 0.6462 | 0.2403 | 0.2877 | 0.8282 | nan | 0.8562 | 0.9592 | 0.6079 | 0.7370 | 0.3376 | nan | 0.4940 | 0.2153 | 0.0 | 0.9234 | 0.0 | 0.0 | 0.0 | 
0.0 | 0.0 | 0.0 | 0.0 | 0.9117 | 0.0 | 0.3287 | 0.0062 | 0.0 | nan | 0.0 | 0.0610 | 0.0 | 0.0 | 0.9324 | 0.8343 | 0.9557 | 0.0 | 0.0 | 0.0462 | 0.0 | nan | 0.7067 | 0.8607 | 0.4509 | 0.6492 | 0.2761 | nan | 0.3860 | 0.2025 | 0.0 | 0.7476 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6156 | 0.0 | 0.2750 | 0.0061 | 0.0 | nan | 0.0 | 0.0577 | 0.0 | 0.0 | 0.8387 | 0.6711 | 0.9056 | 0.0 | 0.0 | 0.0400 | 0.0 | | 0.3876 | 9.0 | 1440 | 0.6456 | 0.2427 | 0.2912 | 0.8288 | nan | 0.8482 | 0.9597 | 0.6409 | 0.7465 | 0.3434 | nan | 0.4992 | 0.2467 | 0.0 | 0.9290 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9104 | 0.0 | 0.3532 | 0.0090 | 0.0 | nan | 0.0 | 0.0657 | 0.0 | 0.0 | 0.9292 | 0.8279 | 0.9565 | 0.0 | 0.0 | 0.0543 | 0.0 | nan | 0.7045 | 0.8607 | 0.4671 | 0.6412 | 0.2797 | nan | 0.3894 | 0.2281 | 0.0 | 0.7454 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6231 | 0.0 | 0.2882 | 0.0088 | 0.0 | nan | 0.0 | 0.0620 | 0.0 | 0.0 | 0.8400 | 0.6772 | 0.9051 | 0.0 | 0.0 | 0.0467 | 0.0 | | 0.5561 | 9.125 | 1460 | 0.6433 | 0.2425 | 0.2904 | 0.8292 | nan | 0.8774 | 0.9526 | 0.6184 | 0.7383 | 0.3408 | nan | 0.4846 | 0.2610 | 0.0 | 0.9212 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9132 | 0.0 | 0.3409 | 0.0080 | 0.0 | nan | 0.0 | 0.0665 | 0.0 | 0.0 | 0.9358 | 0.8273 | 0.9539 | 0.0 | 0.0 | 0.0528 | 0.0 | nan | 0.7035 | 0.8658 | 0.4566 | 0.6457 | 0.2769 | nan | 0.3832 | 0.2390 | 0.0 | 0.7504 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6201 | 0.0 | 0.2807 | 0.0079 | 0.0 | nan | 0.0 | 0.0628 | 0.0 | 0.0 | 0.8384 | 0.6767 | 0.9064 | 0.0 | 0.0 | 0.0456 | 0.0 | | 0.7998 | 9.25 | 1480 | 0.6426 | 0.2435 | 0.2923 | 0.8291 | nan | 0.8504 | 0.9603 | 0.6651 | 0.7439 | 0.3315 | nan | 0.4828 | 0.2672 | 0.0 | 0.9350 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9052 | 0.0 | 0.3651 | 0.0061 | 0.0 | nan | 0.0 | 0.0706 | 0.0 | 0.0 | 0.9316 | 0.8224 | 0.9597 | 0.0 | 0.0 | 0.0557 | 0.0 | nan | 0.7055 | 0.8611 | 0.4829 | 0.6410 | 0.2743 | nan | 0.3834 | 0.2455 | 0.0 | 0.7398 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6255 | 0.0 | 0.2895 | 0.0060 | 0.0 | nan | 0.0 | 0.0662 | 0.0 | 0.0 | 0.8401 | 0.6782 | 0.9050 | 0.0 | 0.0 | 0.0482 | 0.0 | | 0.4971 | 9.375 | 1500 | 0.6446 | 0.2441 | 0.2935 | 0.8278 | nan | 0.8230 | 0.9622 | 0.6830 | 0.7511 | 0.3421 | nan | 0.4969 | 0.2836 | 0.0 | 0.9332 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8986 | 0.0 | 0.3716 | 0.0088 | 0.0 | nan | 0.0 | 0.0600 | 0.0 | 0.0 | 0.9346 | 0.8235 | 0.9572 | 0.0 | 0.0 | 0.0630 | 0.0 | nan | 0.6964 | 0.8566 | 0.4920 | 0.6346 | 0.2794 | nan | 0.3853 | 0.2574 | 0.0 | 0.7425 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6300 | 0.0 | 0.2957 | 0.0086 | 0.0 | nan | 0.0 | 0.0571 | 0.0 | 0.0 | 0.8380 | 0.6779 | 0.9061 | 0.0 | 0.0 | 0.0540 | 0.0 | | 0.7512 | 9.5 | 1520 | 0.6455 | 0.2435 | 0.2924 | 0.8279 | nan | 0.8299 | 0.9600 | 0.6616 | 0.7584 | 0.3290 | nan | 0.4990 | 0.2781 | 0.0 | 0.9243 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9141 | 0.0 | 0.3497 | 0.0085 | 0.0 | nan | 0.0 | 0.0747 | 0.0 | 0.0 | 0.9351 | 0.8213 | 0.9584 | 0.0 | 0.0 | 0.0534 | 0.0 | nan | 0.7008 | 0.8599 | 0.4834 | 0.6314 | 0.2736 | nan | 0.3899 | 0.2542 | 0.0 | 0.7474 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6191 | 0.0 | 0.2856 | 0.0083 | 0.0 | nan | 0.0 | 0.0699 | 0.0 | 0.0 | 0.8384 | 0.6772 | 0.9055 | 0.0 | 0.0 | 0.0461 | 0.0 | | 0.4762 | 9.625 | 1540 | 0.6426 | 0.2439 | 0.2939 | 0.8282 | nan | 0.8413 | 0.9530 | 0.6880 | 0.7689 | 0.3499 | nan | 0.5063 | 0.2761 | 0.0 | 0.9271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9182 | 0.0 | 0.3380 | 0.0101 | 0.0 | nan | 0.0 | 0.0703 | 0.0 | 
0.0 | 0.9356 | 0.8177 | 0.9523 | 0.0 | 0.0 | 0.0518 | 0.0 | nan | 0.7039 | 0.8643 | 0.4905 | 0.6266 | 0.2828 | nan | 0.3939 | 0.2521 | 0.0 | 0.7451 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6160 | 0.0 | 0.2806 | 0.0099 | 0.0 | nan | 0.0 | 0.0663 | 0.0 | 0.0 | 0.8385 | 0.6815 | 0.9065 | 0.0 | 0.0 | 0.0448 | 0.0 | | 0.3538 | 9.75 | 1560 | 0.6393 | 0.2444 | 0.2940 | 0.8290 | nan | 0.8490 | 0.9553 | 0.6596 | 0.7572 | 0.3577 | nan | 0.4869 | 0.2816 | 0.0 | 0.9325 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9100 | 0.0 | 0.3645 | 0.0085 | 0.0 | nan | 0.0 | 0.0726 | 0.0 | 0.0 | 0.9294 | 0.8296 | 0.9576 | 0.0 | 0.0 | 0.0552 | 0.0 | nan | 0.7055 | 0.8631 | 0.4808 | 0.6368 | 0.2850 | nan | 0.3868 | 0.2574 | 0.0 | 0.7429 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6230 | 0.0 | 0.2922 | 0.0083 | 0.0 | nan | 0.0 | 0.0681 | 0.0 | 0.0 | 0.8412 | 0.6777 | 0.9057 | 0.0 | 0.0 | 0.0476 | 0.0 | | 0.5466 | 9.875 | 1580 | 0.6420 | 0.2433 | 0.2922 | 0.8289 | nan | 0.8640 | 0.9511 | 0.6359 | 0.7492 | 0.3604 | nan | 0.4945 | 0.2678 | 0.0 | 0.9280 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9107 | 0.0 | 0.3389 | 0.0100 | 0.0 | nan | 0.0 | 0.0697 | 0.0 | 0.0 | 0.9355 | 0.8275 | 0.9571 | 0.0 | 0.0 | 0.0501 | 0.0 | nan | 0.7045 | 0.8650 | 0.4684 | 0.6420 | 0.2857 | nan | 0.3864 | 0.2450 | 0.0 | 0.7453 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6192 | 0.0 | 0.2813 | 0.0098 | 0.0 | nan | 0.0 | 0.0658 | 0.0 | 0.0 | 0.8385 | 0.6782 | 0.9060 | 0.0 | 0.0 | 0.0438 | 0.0 | | 0.7021 | 10.0 | 1600 | 0.6506 | 0.2417 | 0.2896 | 0.8279 | nan | 0.8456 | 0.9612 | 0.6353 | 0.7492 | 0.3407 | nan | 0.4808 | 0.2392 | 0.0 | 0.9285 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9229 | 0.0 | 0.3317 | 0.0065 | 0.0 | nan | 0.0 | 0.0667 | 0.0 | 0.0 | 0.9214 | 0.8309 | 0.9552 | 0.0 | 0.0 | 0.0520 | 0.0 | nan | 0.7073 | 0.8593 | 0.4704 | 0.6415 | 0.2779 | nan | 0.3844 | 0.2212 | 0.0 | 0.7450 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6131 | 0.0 | 0.2762 | 0.0064 | 0.0 | nan | 0.0 | 0.0628 | 0.0 | 0.0 | 0.8426 | 0.6748 | 0.9052 | 0.0 | 0.0 | 0.0448 | 0.0 | ### Framework versions - Transformers 4.42.4 - Pytorch 2.3.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
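The per-class columns in the table above come in pairs: an "Accuracy" value (recall over that class's ground-truth pixels) and an "Iou" value (intersection-over-union, which also counts false positives, so it can never exceed the accuracy). A `nan` entry means the class never occurs in the validation masks, leaving a 0/0 ratio. The sketch below illustrates the arithmetic only; it is not this card's evaluation code, and `preds`/`refs` are placeholder arrays.

```python
import numpy as np

# Placeholders standing in for the argmax predictions and ground-truth maps.
num_labels = 35
preds = np.random.randint(0, num_labels, size=(512, 512))
refs = np.random.randint(0, num_labels, size=(512, 512))

# Confusion matrix: rows = reference class, columns = predicted class.
cm = np.zeros((num_labels, num_labels), dtype=np.int64)
np.add.at(cm, (refs.ravel(), preds.ravel()), 1)

tp = np.diag(cm).astype(float)
fn = cm.sum(axis=1) - tp  # this class's pixels predicted as something else
fp = cm.sum(axis=0) - tp  # other pixels predicted as this class

with np.errstate(invalid="ignore", divide="ignore"):
    per_class_accuracy = tp / (tp + fn)  # the "Accuracy ..." columns
    per_class_iou = tp / (tp + fp + fn)  # the "Iou ..." columns

print("mean accuracy:", np.nanmean(per_class_accuracy))
print("mean IoU:", np.nanmean(per_class_iou))
print("overall accuracy:", tp.sum() / cm.sum())
```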
[ "unlabeled", "flat-road", "flat-sidewalk", "flat-crosswalk", "flat-cyclinglane", "flat-parkingdriveway", "flat-railtrack", "flat-curb", "human-person", "human-rider", "vehicle-car", "vehicle-truck", "vehicle-bus", "vehicle-tramtrain", "vehicle-motorcycle", "vehicle-bicycle", "vehicle-caravan", "vehicle-cartrailer", "construction-building", "construction-door", "construction-wall", "construction-fenceguardrail", "construction-bridge", "construction-tunnel", "construction-stairs", "object-pole", "object-trafficsign", "object-trafficlight", "nature-vegetation", "nature-terrain", "sky", "void-ground", "void-dynamic", "void-static", "void-unclear" ]
mouadenna/segformer-b1-finetuned-segments-pv_v1_normalized_p100_4batch_try1
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/mouadn773/huggingface/runs/5yd8nl25)

# segformer-b1-finetuned-segments-pv_v1_normalized_p100_4batch_try1

This model is a fine-tuned version of [nvidia/segformer-b1-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b1-finetuned-ade-512-512) on the mouadenna/satellite_PV_dataset_train_test_v1 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0012
- Mean Iou: 0.9586
- Precision: 0.9792

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 40
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Precision |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:---------:|
| 0.2548 | 0.9989 | 229 | 0.0851 | 0.6627 | 0.7444 |
| 0.0259 | 1.9978 | 458 | 0.0141 | 0.8187 | 0.8803 |
| 0.011 | 2.9967 | 687 | 0.0082 | 0.8288 | 0.8937 |
| 0.0073 | 4.0 | 917 | 0.0055 | 0.8596 | 0.8955 |
| 0.0059 | 4.9989 | 1146 | 0.0053 | 0.8527 | 0.8786 |
| 0.0047 | 5.9978 | 1375 | 0.0039 | 0.8920 | 0.9370 |
| 0.0039 | 6.9967 | 1604 | 0.0039 | 0.8811 | 0.9470 |
| 0.0041 | 8.0 | 1834 | 0.0046 | 0.8564 | 0.9432 |
| 0.0042 | 8.9989 | 2063 | 0.0040 | 0.8786 | 0.9099 |
| 0.004 | 9.9978 | 2292 | 0.0029 | 0.9062 | 0.9479 |
| 0.0037 | 10.9967 | 2521 | 0.0030 | 0.9002 | 0.9557 |
| 0.0031 | 12.0 | 2751 | 0.0026 | 0.9150 | 0.9415 |
| 0.0028 | 12.9989 | 2980 | 0.0023 | 0.9216 | 0.9597 |
| 0.0035 | 13.9978 | 3209 | 0.0038 | 0.8824 | 0.9091 |
| 0.0032 | 14.9967 | 3438 | 0.0029 | 0.9041 | 0.9477 |
| 0.0032 | 16.0 | 3668 | 0.0024 | 0.9191 | 0.9548 |
| 0.0026 | 16.9989 | 3897 | 0.0025 | 0.9177 | 0.9487 |
| 0.0024 | 17.9978 | 4126 | 0.0022 | 0.9235 | 0.9523 |
| 0.0025 | 18.9967 | 4355 | 0.0021 | 0.9270 | 0.9563 |
| 0.003 | 20.0 | 4585 | 0.0034 | 0.8911 | 0.9511 |
| 0.0027 | 20.9989 | 4814 | 0.0023 | 0.9216 | 0.9576 |
| 0.0024 | 21.9978 | 5043 | 0.0020 | 0.9296 | 0.9606 |
| 0.0023 | 22.9967 | 5272 | 0.0019 | 0.9331 | 0.9602 |
| 0.002 | 24.0 | 5502 | 0.0020 | 0.9318 | 0.9667 |
| 0.002 | 24.9989 | 5731 | 0.0018 | 0.9373 | 0.9619 |
| 0.0022 | 25.9978 | 5960 | 0.0019 | 0.9352 | 0.9582 |
| 0.0025 | 26.9967 | 6189 | 0.0019 | 0.9328 | 0.9686 |
| 0.0019 | 28.0 | 6419 | 0.0017 | 0.9400 | 0.9632 |
| 0.0018 | 28.9989 | 6648 | 0.0016 | 0.9430 | 0.9689 |
| 0.0017 | 29.9978 | 6877 | 0.0016 | 0.9443 | 0.9712 |
| 0.0017 | 30.9967 | 7106 | 0.0015 | 0.9471 | 0.9720 |
| 0.0016 | 32.0 | 7336 | 0.0015 | 0.9492 | 0.9719 |
| 0.0016 | 32.9989 | 7565 | 0.0014 | 0.9503 | 0.9721 |
| 0.0015 | 33.9978 | 7794 | 0.0014 | 0.9525 | 0.9737 |
| 0.0015 | 34.9967 | 8023 | 0.0013 | 0.9532 | 0.9713 |
| 0.0014 | 36.0 | 8253 | 0.0013 | 0.9536 | 0.9687 |
| 0.0014 | 36.9989 | 8482 | 0.0012 | 0.9562 | 0.9733 |
| 0.0014 | 37.9978 | 8711 | 0.0012 | 0.9576 | 0.9767 |
| 0.0014 | 38.9967 | 8940 | 0.0012 | 0.9579 | 0.9749 |
| 0.0014 | 39.9564 | 9160 | 0.0012 | 0.9586 | 0.9792 |

### Framework versions

- Transformers 4.42.3
- Pytorch 2.1.2
- Datasets 2.20.0
- Tokenizers 0.19.1
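For readers who want to reproduce a comparable run, here is a minimal sketch assembled from the hyperparameters listed above. It is not the author's original training script: the dataset column names (`image`, `label`), the split names, and the bare default `SegformerImageProcessor` are assumptions.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSemanticSegmentation,
    SegformerImageProcessor,
    Trainer,
    TrainingArguments,
)

ds = load_dataset("mouadenna/satellite_PV_dataset_train_test_v1")
processor = SegformerImageProcessor()

def preprocess(batch):
    # Turn PIL images and masks into pixel_values/labels for the model.
    # Column names "image" and "label" are assumed, not taken from the card.
    images = [img.convert("RGB") for img in batch["image"]]
    return processor(images, segmentation_maps=batch["label"])

ds.set_transform(preprocess)

model = AutoModelForSemanticSegmentation.from_pretrained(
    "nvidia/segformer-b1-finetuned-ade-512-512",
    num_labels=2,                  # "unlabeled", "pv"
    ignore_mismatched_sizes=True,  # replace the 150-class ADE20K head
)

args = TrainingArguments(
    output_dir="segformer-b1-pv",
    learning_rate=4e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,  # effective train batch size 16
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=40,
    seed=42,
    fp16=True,                      # "Native AMP" mixed precision
)

Trainer(model=model, args=args,
        train_dataset=ds["train"], eval_dataset=ds["test"]).train()
```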
[ "unlabeled", "pv" ]
nguyenb2240/segformer-b0-finetuned-segments-sidewalk-oct-22
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# segformer-b0-finetuned-segments-sidewalk-oct-22

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1574
- Mean Iou: 0.1657
- Mean Accuracy: 0.2143
- Overall Accuracy: 0.7508
- Accuracy Unlabeled: nan
- Accuracy Flat-road: 0.8145
- Accuracy Flat-sidewalk: 0.9504
- Accuracy Flat-crosswalk: 0.0
- Accuracy Flat-cyclinglane: 0.5828
- Accuracy Flat-parkingdriveway: 0.0118
- Accuracy Flat-railtrack: nan
- Accuracy Flat-curb: 0.0001
- Accuracy Human-person: 0.0
- Accuracy Human-rider: 0.0
- Accuracy Vehicle-car: 0.8895
- Accuracy Vehicle-truck: 0.0
- Accuracy Vehicle-bus: 0.0
- Accuracy Vehicle-tramtrain: 0.0
- Accuracy Vehicle-motorcycle: 0.0
- Accuracy Vehicle-bicycle: 0.0
- Accuracy Vehicle-caravan: 0.0
- Accuracy Vehicle-cartrailer: 0.0
- Accuracy Construction-building: 0.8937
- Accuracy Construction-door: 0.0
- Accuracy Construction-wall: 0.0389
- Accuracy Construction-fenceguardrail: 0.0
- Accuracy Construction-bridge: 0.0
- Accuracy Construction-tunnel: nan
- Accuracy Construction-stairs: 0.0
- Accuracy Object-pole: 0.0
- Accuracy Object-trafficsign: 0.0
- Accuracy Object-trafficlight: 0.0
- Accuracy Nature-vegetation: 0.9244
- Accuracy Nature-terrain: 0.8287
- Accuracy Sky: 0.9224
- Accuracy Void-ground: 0.0
- Accuracy Void-dynamic: 0.0
- Accuracy Void-static: 0.0
- Accuracy Void-unclear: 0.0
- Iou Unlabeled: nan
- Iou Flat-road: 0.5381
- Iou Flat-sidewalk: 0.7939
- Iou Flat-crosswalk: 0.0
- Iou Flat-cyclinglane: 0.5124
- Iou Flat-parkingdriveway: 0.0117
- Iou Flat-railtrack: nan
- Iou Flat-curb: 0.0001
- Iou Human-person: 0.0
- Iou Human-rider: 0.0
- Iou Vehicle-car: 0.6117
- Iou Vehicle-truck: 0.0
- Iou Vehicle-bus: 0.0
- Iou Vehicle-tramtrain: 0.0
- Iou Vehicle-motorcycle: 0.0
- Iou Vehicle-bicycle: 0.0
- Iou Vehicle-caravan: 0.0
- Iou Vehicle-cartrailer: 0.0
- Iou Construction-building: 0.5570
- Iou Construction-door: 0.0
- Iou Construction-wall: 0.0365
- Iou Construction-fenceguardrail: 0.0
- Iou Construction-bridge: 0.0
- Iou Construction-tunnel: nan
- Iou Construction-stairs: 0.0
- Iou Object-pole: 0.0
- Iou Object-trafficsign: 0.0
- Iou Object-trafficlight: 0.0
- Iou Nature-vegetation: 0.7504
- Iou Nature-terrain: 0.6347
- Iou Sky: 0.8566
- Iou Void-ground: 0.0
- Iou Void-dynamic: 0.0
- Iou Void-static: 0.0
- Iou Void-unclear: 0.0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 10
- eval_batch_size: 10
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Flat-road | Accuracy Flat-sidewalk | Accuracy Flat-crosswalk | Accuracy Flat-cyclinglane | Accuracy Flat-parkingdriveway | Accuracy Flat-railtrack | Accuracy Flat-curb | Accuracy Human-person | Accuracy Human-rider | Accuracy Vehicle-car | Accuracy Vehicle-truck | Accuracy Vehicle-bus | Accuracy Vehicle-tramtrain | Accuracy 
Vehicle-motorcycle | Accuracy Vehicle-bicycle | Accuracy Vehicle-caravan | Accuracy Vehicle-cartrailer | Accuracy Construction-building | Accuracy Construction-door | Accuracy Construction-wall | Accuracy Construction-fenceguardrail | Accuracy Construction-bridge | Accuracy Construction-tunnel | Accuracy Construction-stairs | Accuracy Object-pole | Accuracy Object-trafficsign | Accuracy Object-trafficlight | Accuracy Nature-vegetation | Accuracy Nature-terrain | Accuracy Sky | Accuracy Void-ground | Accuracy Void-dynamic | Accuracy Void-static | Accuracy Void-unclear | Iou Unlabeled | Iou Flat-road | Iou Flat-sidewalk | Iou Flat-crosswalk | Iou Flat-cyclinglane | Iou Flat-parkingdriveway | Iou Flat-railtrack | Iou Flat-curb | Iou Human-person | Iou Human-rider | Iou Vehicle-car | Iou Vehicle-truck | Iou Vehicle-bus | Iou Vehicle-tramtrain | Iou Vehicle-motorcycle | Iou Vehicle-bicycle | Iou Vehicle-caravan | Iou Vehicle-cartrailer | Iou Construction-building | Iou Construction-door | Iou Construction-wall | Iou Construction-fenceguardrail | Iou Construction-bridge | Iou Construction-tunnel | Iou Construction-stairs | Iou Object-pole | Iou Object-trafficsign | Iou Object-trafficlight | Iou Nature-vegetation | Iou Nature-terrain | Iou Sky | Iou Void-ground | Iou Void-dynamic | Iou Void-static | Iou Void-unclear | |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------------:|:----------------------:|:-----------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:------------------:|:---------------------:|:--------------------:|:--------------------:|:----------------------:|:--------------------:|:--------------------------:|:---------------------------:|:------------------------:|:------------------------:|:---------------------------:|:------------------------------:|:--------------------------:|:--------------------------:|:------------------------------------:|:----------------------------:|:----------------------------:|:----------------------------:|:--------------------:|:---------------------------:|:----------------------------:|:--------------------------:|:-----------------------:|:------------:|:--------------------:|:---------------------:|:--------------------:|:---------------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:| | 2.6835 | 0.25 | 20 | 2.9300 | 0.0679 | 0.1189 | 0.5663 | nan | 0.0864 | 0.9582 | 0.0005 | 0.0171 | 0.0000 | nan | 0.0049 | 0.0042 | 0.0 | 0.8284 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7913 | 0.0 | 0.0535 | 0.0 | 0.0 | nan | 0.0029 | 0.0 | 0.0 | 0.0 | 0.9260 | 0.0016 | 0.1138 | 0.0168 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0824 | 0.5815 | 0.0005 | 0.0168 | 0.0000 | 0.0 | 0.0045 | 0.0040 | 0.0 | 0.5349 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 
0.0 | 0.0 | 0.4154 | 0.0 | 0.0414 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0 | 0.0 | 0.0 | 0.5765 | 0.0016 | 0.1137 | 0.0025 | 0.0 | 0.0 | 0.0 | | 2.3707 | 0.5 | 40 | 2.1968 | 0.0875 | 0.1387 | 0.6231 | nan | 0.5613 | 0.9361 | 0.0 | 0.0210 | 0.0003 | nan | 0.0019 | 0.0 | 0.0 | 0.7965 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8356 | 0.0 | 0.0281 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9513 | 0.0032 | 0.3006 | 0.0011 | 0.0 | 0.0 | 0.0 | nan | 0.3878 | 0.6646 | 0.0 | 0.0208 | 0.0003 | 0.0 | 0.0019 | 0.0 | 0.0 | 0.5587 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4439 | 0.0 | 0.0249 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5675 | 0.0032 | 0.3003 | 0.0009 | 0.0 | 0.0 | 0.0 | | 2.0797 | 0.75 | 60 | 1.9742 | 0.1106 | 0.1579 | 0.6543 | nan | 0.7248 | 0.9240 | 0.0 | 0.0036 | 0.0008 | nan | 0.0000 | 0.0 | 0.0 | 0.8368 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8428 | 0.0 | 0.0254 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9498 | 0.0444 | 0.7003 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4154 | 0.7036 | 0.0 | 0.0036 | 0.0008 | nan | 0.0000 | 0.0 | 0.0 | 0.5498 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4939 | 0.0 | 0.0240 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6151 | 0.0426 | 0.6908 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.9067 | 1.0 | 80 | 1.7288 | 0.1234 | 0.1699 | 0.6749 | nan | 0.7502 | 0.9269 | 0.0 | 0.0766 | 0.0014 | nan | 0.0000 | 0.0 | 0.0 | 0.8214 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8719 | 0.0 | 0.0111 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9451 | 0.2194 | 0.8139 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4498 | 0.7152 | 0.0 | 0.0761 | 0.0013 | nan | 0.0000 | 0.0 | 0.0 | 0.5696 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5007 | 0.0 | 0.0108 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6444 | 0.1954 | 0.7863 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.6673 | 1.25 | 100 | 1.6680 | 0.1258 | 0.1752 | 0.6789 | nan | 0.8230 | 0.9033 | 0.0 | 0.0493 | 0.0022 | nan | 0.0000 | 0.0 | 0.0 | 0.8676 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8389 | 0.0 | 0.0075 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9609 | 0.2915 | 0.8618 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4254 | 0.7481 | 0.0 | 0.0488 | 0.0022 | nan | 0.0000 | 0.0 | 0.0 | 0.5539 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5266 | 0.0 | 0.0074 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6400 | 0.2581 | 0.8161 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.7134 | 1.5 | 120 | 1.5470 | 0.1376 | 0.1870 | 0.6997 | nan | 0.7864 | 0.9287 | 0.0 | 0.1234 | 0.0027 | nan | 0.0000 | 0.0 | 0.0 | 0.8766 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8540 | 0.0 | 0.0060 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9433 | 0.5618 | 0.9008 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4735 | 0.7320 | 0.0 | 0.1216 | 0.0027 | nan | 0.0000 | 0.0 | 0.0 | 0.5599 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5319 | 0.0 | 0.0059 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6897 | 0.4601 | 0.8255 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.4384 | 1.75 | 140 | 1.4997 | 0.1458 | 0.1931 | 0.7143 | nan | 0.7973 | 0.9446 | 0.0 | 0.2050 | 0.0025 | nan | 0.0 | 0.0 | 0.0 | 0.8691 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8695 | 0.0 | 0.0025 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9304 | 0.6705 | 0.8889 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4842 | 0.7479 | 0.0 | 0.1960 | 0.0025 | nan | 0.0 | 0.0 | 0.0 | 0.5860 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5294 | 0.0 | 0.0025 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7244 | 0.5600 | 0.8314 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.4101 | 2.0 | 160 | 1.4325 | 0.1485 | 0.1990 | 0.7167 | nan | 0.8247 | 0.9212 | 0.0 | 0.2410 | 0.0032 | nan | 0.0 | 0.0 | 0.0 | 0.8787 | 0.0 | 0.0 
| 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8896 | 0.0 | 0.0042 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9131 | 0.7902 | 0.9017 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4675 | 0.7648 | 0.0 | 0.2283 | 0.0032 | nan | 0.0 | 0.0 | 0.0 | 0.5777 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5259 | 0.0 | 0.0041 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7388 | 0.6051 | 0.8352 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.4613 | 2.25 | 180 | 1.3689 | 0.1522 | 0.2012 | 0.7248 | nan | 0.7783 | 0.9430 | 0.0 | 0.3362 | 0.0037 | nan | 0.0 | 0.0 | 0.0 | 0.8668 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8788 | 0.0 | 0.0033 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9201 | 0.7933 | 0.9142 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4951 | 0.7564 | 0.0 | 0.3089 | 0.0037 | nan | 0.0 | 0.0 | 0.0 | 0.5926 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5366 | 0.0 | 0.0033 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7330 | 0.6059 | 0.8348 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.1652 | 2.5 | 200 | 1.3458 | 0.1566 | 0.2036 | 0.7323 | nan | 0.7605 | 0.9551 | 0.0 | 0.4259 | 0.0038 | nan | 0.0 | 0.0 | 0.0 | 0.8580 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8902 | 0.0 | 0.0066 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9193 | 0.7943 | 0.9027 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5290 | 0.7548 | 0.0 | 0.3710 | 0.0038 | nan | 0.0 | 0.0 | 0.0 | 0.6064 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5270 | 0.0 | 0.0065 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7467 | 0.6256 | 0.8409 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.3201 | 2.75 | 220 | 1.2652 | 0.1572 | 0.2057 | 0.7353 | nan | 0.7825 | 0.9559 | 0.0 | 0.4231 | 0.0056 | nan | 0.0000 | 0.0 | 0.0 | 0.8904 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8834 | 0.0 | 0.0122 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9104 | 0.8184 | 0.9014 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5340 | 0.7600 | 0.0 | 0.3872 | 0.0056 | nan | 0.0000 | 0.0 | 0.0 | 0.5909 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5480 | 0.0 | 0.0120 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7383 | 0.6066 | 0.8463 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.2234 | 3.0 | 240 | 1.2746 | 0.1594 | 0.2088 | 0.7376 | nan | 0.8409 | 0.9292 | 0.0 | 0.4723 | 0.0054 | nan | 0.0 | 0.0 | 0.0 | 0.8857 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8767 | 0.0 | 0.0206 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9318 | 0.7948 | 0.9251 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4920 | 0.7916 | 0.0 | 0.4224 | 0.0053 | nan | 0.0 | 0.0 | 0.0 | 0.6083 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5504 | 0.0 | 0.0199 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7396 | 0.6269 | 0.8444 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.4557 | 3.25 | 260 | 1.2698 | 0.1584 | 0.2075 | 0.7313 | nan | 0.8659 | 0.9053 | 0.0 | 0.4557 | 0.0059 | nan | 0.0 | 0.0 | 0.0 | 0.8611 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8783 | 0.0 | 0.0141 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9394 | 0.7982 | 0.9172 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4660 | 0.7939 | 0.0 | 0.4017 | 0.0059 | nan | 0.0 | 0.0 | 0.0 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5439 | 0.0 | 0.0137 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7346 | 0.6333 | 0.8478 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.2238 | 3.5 | 280 | 1.2213 | 0.1610 | 0.2090 | 0.7427 | nan | 0.8092 | 0.9490 | 0.0 | 0.4971 | 0.0087 | nan | 0.0 | 0.0 | 0.0 | 0.8902 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8860 | 0.0 | 0.0207 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9339 | 0.7840 | 0.9103 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5269 | 0.7830 | 0.0 | 0.4531 | 0.0086 | nan | 0.0 | 0.0 | 0.0 | 0.5897 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5528 | 0.0 | 0.0200 | 0.0 | 0.0 | nan 
| 0.0 | 0.0 | 0.0 | 0.0 | 0.7379 | 0.6292 | 0.8511 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.066 | 3.75 | 300 | 1.1935 | 0.1624 | 0.2109 | 0.7442 | nan | 0.8253 | 0.9479 | 0.0 | 0.5012 | 0.0083 | nan | 0.0 | 0.0 | 0.0 | 0.8755 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8923 | 0.0 | 0.0307 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9137 | 0.8313 | 0.9234 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5267 | 0.7849 | 0.0 | 0.4499 | 0.0082 | nan | 0.0 | 0.0 | 0.0 | 0.6217 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5554 | 0.0 | 0.0294 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7463 | 0.6202 | 0.8525 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.1549 | 4.0 | 320 | 1.1899 | 0.1632 | 0.2118 | 0.7464 | nan | 0.8334 | 0.9423 | 0.0 | 0.5290 | 0.0073 | nan | 0.0000 | 0.0 | 0.0 | 0.8857 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8956 | 0.0 | 0.0194 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9280 | 0.8115 | 0.9247 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5215 | 0.7940 | 0.0 | 0.4730 | 0.0073 | nan | 0.0000 | 0.0 | 0.0 | 0.6143 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5530 | 0.0 | 0.0188 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7488 | 0.6382 | 0.8541 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.0461 | 4.25 | 340 | 1.1746 | 0.1643 | 0.2127 | 0.7482 | nan | 0.8171 | 0.9549 | 0.0 | 0.5455 | 0.0078 | nan | 0.0 | 0.0 | 0.0 | 0.8974 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8812 | 0.0 | 0.0387 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9161 | 0.8302 | 0.9174 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5322 | 0.7872 | 0.0 | 0.4917 | 0.0077 | nan | 0.0 | 0.0 | 0.0 | 0.6076 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5651 | 0.0 | 0.0366 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7456 | 0.6282 | 0.8553 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.0398 | 4.5 | 360 | 1.1687 | 0.1652 | 0.2128 | 0.7494 | nan | 0.8254 | 0.9501 | 0.0 | 0.5507 | 0.0094 | nan | 0.0000 | 0.0 | 0.0 | 0.8755 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8972 | 0.0 | 0.0341 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9253 | 0.8186 | 0.9232 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5323 | 0.7930 | 0.0 | 0.4922 | 0.0093 | nan | 0.0000 | 0.0 | 0.0 | 0.6277 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5531 | 0.0 | 0.0323 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7518 | 0.6382 | 0.8553 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.0538 | 4.75 | 380 | 1.1675 | 0.1655 | 0.2135 | 0.7499 | nan | 0.8351 | 0.9458 | 0.0 | 0.5657 | 0.0087 | nan | 0.0000 | 0.0 | 0.0 | 0.8830 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8980 | 0.0 | 0.0322 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9244 | 0.8123 | 0.9262 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5260 | 0.7971 | 0.0 | 0.5002 | 0.0086 | nan | 0.0000 | 0.0 | 0.0 | 0.6214 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5537 | 0.0 | 0.0305 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7554 | 0.6472 | 0.8564 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.0232 | 5.0 | 400 | 1.1574 | 0.1657 | 0.2143 | 0.7508 | nan | 0.8145 | 0.9504 | 0.0 | 0.5828 | 0.0118 | nan | 0.0001 | 0.0 | 0.0 | 0.8895 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8937 | 0.0 | 0.0389 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9244 | 0.8287 | 0.9224 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5381 | 0.7939 | 0.0 | 0.5124 | 0.0117 | nan | 0.0001 | 0.0 | 0.0 | 0.6117 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5570 | 0.0 | 0.0365 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7504 | 0.6347 | 0.8566 | 0.0 | 0.0 | 0.0 | 0.0 | ### Framework versions - Transformers 4.42.4 - Pytorch 2.3.1+cu121 - Datasets 2.20.0 - Tokenizers 0.19.1
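The metric columns used throughout this card (Mean Iou, Mean Accuracy, Overall Accuracy, and the per-class Accuracy/Iou pairs) correspond to the fields returned by the `mean_iou` metric in the `evaluate` library, which segmentation fine-tuning scripts commonly plug into `compute_metrics`. A hedged sketch follows; the placeholder masks and the `ignore_index` choice are assumptions, not taken from this card.

```python
import evaluate
import numpy as np

metric = evaluate.load("mean_iou")

# Placeholders standing in for upsampled argmax predictions and annotations.
pred_masks = [np.random.randint(0, 35, size=(512, 512))]
gt_masks = [np.random.randint(0, 35, size=(512, 512))]

results = metric.compute(
    predictions=pred_masks,
    references=gt_masks,
    num_labels=35,     # size of the sidewalk-semantic label set below
    ignore_index=255,  # assumed; use the label id your masks reserve as void
    reduce_labels=False,
)

print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"])       # the "Iou ..." columns
print(results["per_category_accuracy"])  # the "Accuracy ..." columns
```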
[ "unlabeled", "flat-road", "flat-sidewalk", "flat-crosswalk", "flat-cyclinglane", "flat-parkingdriveway", "flat-railtrack", "flat-curb", "human-person", "human-rider", "vehicle-car", "vehicle-truck", "vehicle-bus", "vehicle-tramtrain", "vehicle-motorcycle", "vehicle-bicycle", "vehicle-caravan", "vehicle-cartrailer", "construction-building", "construction-door", "construction-wall", "construction-fenceguardrail", "construction-bridge", "construction-tunnel", "construction-stairs", "object-pole", "object-trafficsign", "object-trafficlight", "nature-vegetation", "nature-terrain", "sky", "void-ground", "void-dynamic", "void-static", "void-unclear" ]