---
license: wtfpl
datasets:
- cakiki/rosetta-code
language:
- en
metrics:
- accuracy
library_name: transformers
pipeline_tag: text-classification
tags:
- code
- programming-language
- code-classification
base_model: huggingface/CodeBERTa-small-v1
---
This model is a fine-tuned version of *huggingface/CodeBERTa-small-v1* on the *cakiki/rosetta-code* dataset, covering the 25 programming languages listed below.
## Training Details:
The model was trained for 25 epochs on Azure, on roughly 26,000 datapoints covering the programming languages listed below,<br> extracted from a dataset containing 1,006 programming languages in total.
### Programming languages this model can detect, with their label IDs
<ol>
<li>'ARM Assembly': 3,</li>
<li>'AppleScript': 17,</li>
<li>'C': 22,</li>
<li>'C#': 10,</li>
<li>'C++': 6,</li>
<li>'COBOL': 2,</li>
<li>'Erlang': 9,</li>
<li>'Fortran': 16,</li>
<li>'Go': 8,</li>
<li>'Java': 19,</li>
<li>'JavaScript': 1,</li>
<li>'Kotlin': 24,</li>
<li>'Lua': 5,</li>
<li>'Mathematica/Wolfram Language': 14,</li>
<li>'PHP': 15,</li>
<li>'Pascal': 18,</li>
<li>'Perl': 23,</li>
<li>'PowerShell': 20,</li>
<li>'Python': 21,</li>
<li>'R': 4,</li>
<li>'Ruby': 12,</li>
<li>'Rust': 11,</li>
<li>'Scala': 0,</li>
<li>'Swift': 13,</li>
<li>'Visual Basic .NET': 7,</li>
<li>'jq': 25</li>
</ol>
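For convenience, the list above can be written as a Python mapping from label ID to language name. This is a sketch mirroring the README: the authoritative `id2label` mapping lives in the model's `config.json`, and the `LABEL_n` output format assumed here is the `transformers` default for models without named labels.

```python
# Label-ID-to-language mapping, transcribed from the list above.
# NOTE: the model's config.json id2label is authoritative; this only mirrors the README.
ID2LABEL = {
    3: "ARM Assembly", 17: "AppleScript", 22: "C", 10: "C#", 6: "C++",
    2: "COBOL", 9: "Erlang", 16: "Fortran", 8: "Go", 19: "Java",
    1: "JavaScript", 24: "Kotlin", 5: "Lua",
    14: "Mathematica/Wolfram Language", 15: "PHP", 18: "Pascal",
    23: "Perl", 20: "PowerShell", 21: "Python", 4: "R", 12: "Ruby",
    11: "Rust", 0: "Scala", 13: "Swift", 7: "Visual Basic .NET", 25: "jq",
}

def decode_label(label: str) -> str:
    """Translate a raw classifier label such as 'LABEL_21' into a language name."""
    return ID2LABEL[int(label.removeprefix("LABEL_"))]
```

A prediction of `LABEL_21` would then decode to `Python`, `LABEL_0` to `Scala`, and so on.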
<br>
Below are the training results for 25 epochs.