---

library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: CodeGenDetect-BERT_Classifier
  results: []
---


# CodeGenDetect-BERT_Classifier

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0975
- Accuracy: 0.9767
- F1: 0.9767
- Precision: 0.9768
- Recall: 0.9767

## Model description

A `bert-base-uncased` encoder fine-tuned for sequence classification. The model name suggests it is intended to detect machine-generated code, but the task definition and label set are not documented.

## Intended uses & limitations

The checkpoint loads as a standard `transformers` sequence-classification model; a usage sketch follows below. Beyond that, intended uses and known limitations are not documented.
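The card does not include a usage example, so here is a minimal sketch. The repo id `CodeGenDetect-BERT_Classifier` is an assumption (the actual Hub path may include a namespace), and the labels fall back to the default `LABEL_0`/`LABEL_1` scheme unless the author configured `id2label`:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hypothetical repo id -- replace with the actual Hub path of this checkpoint.
model_id = "CodeGenDetect-BERT_Classifier"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

snippet = "def add(a, b):\n    return a + b"
inputs = tokenizer(snippet, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

probs = logits.softmax(dim=-1).squeeze()
pred = int(probs.argmax())
# id2label falls back to LABEL_0/LABEL_1 unless the author set it.
print(model.config.id2label[pred], float(probs[pred]))
```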



## Training and evaluation data

The dataset itself is not documented. From the training log below, 3,000 steps correspond to 0.096 epoch, so one epoch is 31,250 steps; at a train batch size of 16 that implies roughly 500,000 training examples.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8
- mixed_precision_training: Native AMP
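As a rough sketch, these settings map onto `transformers.TrainingArguments` as follows. The `output_dir` and the 3,000-step eval/save cadence are assumptions (the cadence matches the logging interval visible in the results table), not values stated in the card:

```python
from transformers import TrainingArguments

# Values from the hyperparameter list above; output_dir and the
# eval/save cadence are assumptions inferred from the results table.
training_args = TrainingArguments(
    output_dir="CodeGenDetect-BERT_Classifier",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=8,
    fp16=True,  # "Native AMP" mixed precision
    eval_strategy="steps",
    eval_steps=3000,
    save_steps=3000,
)
```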



### Training results

Metrics were logged every 3,000 steps. Although `num_epochs` was set to 8, the log ends at epoch ≈ 2.98, and the final evaluation reported above corresponds to the last row.

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 0.1206        | 0.096 | 3000  | 0.1452          | 0.9503   | 0.9503 | 0.9515    | 0.9503 |
| 0.1659        | 0.192 | 6000  | 0.1326          | 0.9580   | 0.9581 | 0.9584    | 0.9580 |
| 0.1468        | 0.288 | 9000  | 0.1131          | 0.9631   | 0.9632 | 0.9634    | 0.9631 |
| 0.072         | 0.384 | 12000 | 0.1199          | 0.9645   | 0.9645 | 0.9651    | 0.9645 |
| 0.1184        | 0.48  | 15000 | 0.1093          | 0.9656   | 0.9656 | 0.9661    | 0.9656 |
| 0.2584        | 0.576 | 18000 | 0.0996          | 0.9681   | 0.9681 | 0.9684    | 0.9681 |
| 0.1833        | 0.672 | 21000 | 0.1154          | 0.9624   | 0.9624 | 0.9624    | 0.9624 |
| 0.0551        | 0.768 | 24000 | 0.1059          | 0.9694   | 0.9694 | 0.9700    | 0.9694 |
| 0.1545        | 0.864 | 27000 | 0.0960          | 0.9705   | 0.9705 | 0.9710    | 0.9705 |
| 0.1006        | 0.96  | 30000 | 0.0884          | 0.9733   | 0.9733 | 0.9735    | 0.9733 |
| 0.0941        | 1.056 | 33000 | 0.1021          | 0.9696   | 0.9696 | 0.9704    | 0.9696 |
| 0.1786        | 1.152 | 36000 | 0.0988          | 0.9727   | 0.9727 | 0.9728    | 0.9727 |
| 0.0231        | 1.248 | 39000 | 0.0923          | 0.9740   | 0.9740 | 0.9741    | 0.9740 |
| 0.0131        | 1.344 | 42000 | 0.0924          | 0.9735   | 0.9735 | 0.9739    | 0.9735 |
| 0.1303        | 1.44  | 45000 | 0.0959          | 0.9742   | 0.9742 | 0.9742    | 0.9742 |
| 0.0637        | 1.536 | 48000 | 0.0877          | 0.9753   | 0.9753 | 0.9754    | 0.9753 |
| 0.1373        | 1.632 | 51000 | 0.0977          | 0.9741   | 0.9742 | 0.9745    | 0.9741 |
| 0.1152        | 1.728 | 54000 | 0.1035          | 0.9755   | 0.9755 | 0.9756    | 0.9755 |
| 0.0728        | 1.824 | 57000 | 0.0922          | 0.9751   | 0.9752 | 0.9754    | 0.9751 |
| 0.007         | 1.92  | 60000 | 0.0814          | 0.9763   | 0.9763 | 0.9764    | 0.9763 |
| 0.0043        | 2.016 | 63000 | 0.0991          | 0.9768   | 0.9768 | 0.9769    | 0.9768 |
| 0.0429        | 2.112 | 66000 | 0.0925          | 0.9759   | 0.9759 | 0.9760    | 0.9759 |
| 0.0061        | 2.208 | 69000 | 0.0930          | 0.9765   | 0.9765 | 0.9766    | 0.9765 |
| 0.0774        | 2.304 | 72000 | 0.0868          | 0.9761   | 0.9761 | 0.9763    | 0.9761 |
| 0.0166        | 2.4   | 75000 | 0.0927          | 0.9775   | 0.9775 | 0.9777    | 0.9775 |
| 0.0035        | 2.496 | 78000 | 0.0859          | 0.9777   | 0.9777 | 0.9779    | 0.9777 |
| 0.0891        | 2.592 | 81000 | 0.0898          | 0.9752   | 0.9752 | 0.9752    | 0.9752 |
| 0.093         | 2.688 | 84000 | 0.0848          | 0.9777   | 0.9777 | 0.9779    | 0.9777 |
| 0.0056        | 2.784 | 87000 | 0.0933          | 0.9770   | 0.9770 | 0.9771    | 0.9770 |
| 0.124         | 2.88  | 90000 | 0.1115          | 0.9774   | 0.9774 | 0.9775    | 0.9774 |
| 0.0861        | 2.976 | 93000 | 0.0975          | 0.9767   | 0.9767 | 0.9768    | 0.9767 |
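Recall equals accuracy in every row, which is exactly what support-weighted averaging produces. The averaging mode is not stated in the card, so the `compute_metrics` function below is a hedged reconstruction of the kind `Trainer` would have used, not the author's code:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Return the four metrics reported in this card from Trainer predictions."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # average="weighted" is an assumption; it is consistent with
    # recall == accuracy in every logged row above.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```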





### Framework versions

- Transformers 4.45.0
- Pytorch 2.6.0+cu124
- Datasets 4.4.1
- Tokenizers 0.20.3