# CodeGenDetect-BERT_Classifier
This model is a fine-tuned version of [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.0975
- Accuracy: 0.9767
- F1: 0.9767
- Precision: 0.9768
- Recall: 0.9767
## Model description
More information needed
## Intended uses & limitations
More information needed
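
No usage guidance is documented, but the checkpoint can be loaded as a standard `text-classification` model. A minimal inference sketch; the code snippet passed in is an arbitrary example, and the label names depend on the checkpoint's `id2label` config, which is not documented here:

```python
from transformers import pipeline

# Load the fine-tuned classifier from the Hugging Face Hub.
clf = pipeline("text-classification", model="azherali/CodeGenDetect-BERT_Classifier")

# Classify a code snippet. The returned label names (e.g. LABEL_0 / LABEL_1)
# depend on the checkpoint's id2label mapping, which the card does not document.
print(clf("def add(a, b):\n    return a + b"))
```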
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (reconstructed as a `TrainingArguments` sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8
- mixed_precision_training: Native AMP
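
These settings map directly onto Hugging Face `TrainingArguments`. The sketch below is a reconstruction under that assumption: the `output_dir` name is illustrative, and the Adam betas/epsilon listed above are the `transformers` defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

# Reconstruction of the hyperparameters listed above; output_dir is illustrative.
training_args = TrainingArguments(
    output_dir="CodeGenDetect-BERT_Classifier",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the transformers default optimizer.
    lr_scheduler_type="linear",
    num_train_epochs=8,
    fp16=True,  # "Native AMP" mixed-precision training
)
```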
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|
| 0.1206 | 0.096 | 3000 | 0.1452 | 0.9503 | 0.9503 | 0.9515 | 0.9503 |
| 0.1659 | 0.192 | 6000 | 0.1326 | 0.9580 | 0.9581 | 0.9584 | 0.9580 |
| 0.1468 | 0.288 | 9000 | 0.1131 | 0.9631 | 0.9632 | 0.9634 | 0.9631 |
| 0.072 | 0.384 | 12000 | 0.1199 | 0.9645 | 0.9645 | 0.9651 | 0.9645 |
| 0.1184 | 0.48 | 15000 | 0.1093 | 0.9656 | 0.9656 | 0.9661 | 0.9656 |
| 0.2584 | 0.576 | 18000 | 0.0996 | 0.9681 | 0.9681 | 0.9684 | 0.9681 |
| 0.1833 | 0.672 | 21000 | 0.1154 | 0.9624 | 0.9624 | 0.9624 | 0.9624 |
| 0.0551 | 0.768 | 24000 | 0.1059 | 0.9694 | 0.9694 | 0.9700 | 0.9694 |
| 0.1545 | 0.864 | 27000 | 0.0960 | 0.9705 | 0.9705 | 0.9710 | 0.9705 |
| 0.1006 | 0.96 | 30000 | 0.0884 | 0.9733 | 0.9733 | 0.9735 | 0.9733 |
| 0.0941 | 1.056 | 33000 | 0.1021 | 0.9696 | 0.9696 | 0.9704 | 0.9696 |
| 0.1786 | 1.152 | 36000 | 0.0988 | 0.9727 | 0.9727 | 0.9728 | 0.9727 |
| 0.0231 | 1.248 | 39000 | 0.0923 | 0.9740 | 0.9740 | 0.9741 | 0.9740 |
| 0.0131 | 1.344 | 42000 | 0.0924 | 0.9735 | 0.9735 | 0.9739 | 0.9735 |
| 0.1303 | 1.44 | 45000 | 0.0959 | 0.9742 | 0.9742 | 0.9742 | 0.9742 |
| 0.0637 | 1.536 | 48000 | 0.0877 | 0.9753 | 0.9753 | 0.9754 | 0.9753 |
| 0.1373 | 1.632 | 51000 | 0.0977 | 0.9741 | 0.9742 | 0.9745 | 0.9741 |
| 0.1152 | 1.728 | 54000 | 0.1035 | 0.9755 | 0.9755 | 0.9756 | 0.9755 |
| 0.0728 | 1.824 | 57000 | 0.0922 | 0.9751 | 0.9752 | 0.9754 | 0.9751 |
| 0.007 | 1.92 | 60000 | 0.0814 | 0.9763 | 0.9763 | 0.9764 | 0.9763 |
| 0.0043 | 2.016 | 63000 | 0.0991 | 0.9768 | 0.9768 | 0.9769 | 0.9768 |
| 0.0429 | 2.112 | 66000 | 0.0925 | 0.9759 | 0.9759 | 0.9760 | 0.9759 |
| 0.0061 | 2.208 | 69000 | 0.0930 | 0.9765 | 0.9765 | 0.9766 | 0.9765 |
| 0.0774 | 2.304 | 72000 | 0.0868 | 0.9761 | 0.9761 | 0.9763 | 0.9761 |
| 0.0166 | 2.4 | 75000 | 0.0927 | 0.9775 | 0.9775 | 0.9777 | 0.9775 |
| 0.0035 | 2.496 | 78000 | 0.0859 | 0.9777 | 0.9777 | 0.9779 | 0.9777 |
| 0.0891 | 2.592 | 81000 | 0.0898 | 0.9752 | 0.9752 | 0.9752 | 0.9752 |
| 0.093 | 2.688 | 84000 | 0.0848 | 0.9777 | 0.9777 | 0.9779 | 0.9777 |
| 0.0056 | 2.784 | 87000 | 0.0933 | 0.9770 | 0.9770 | 0.9771 | 0.9770 |
| 0.124 | 2.88 | 90000 | 0.1115 | 0.9774 | 0.9774 | 0.9775 | 0.9774 |
| 0.0861 | 2.976 | 93000 | 0.0975 | 0.9767 | 0.9767 | 0.9768 | 0.9767 |
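
Accuracy and recall coincide in every row, which is consistent with weighted averaging (weighted recall equals accuracy). A minimal `compute_metrics` sketch under that assumption, using scikit-learn:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair the Trainer passes in.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Weighted averaging makes recall equal accuracy, matching the table above.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```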
### Framework versions
- Transformers 4.45.0
- Pytorch 2.6.0+cu124
- Datasets 4.4.1
- Tokenizers 0.20.3