---
license: apache-2.0
datasets:
  - smcleod/golang-coder
  - smcleod/golang-programming-style-best-practices
  - ExAi/Code-Golang-QA-2k
  - google/code_x_glue_ct_code_to_text
  - semeru/code-text-go
language:
  - en
tags:
  - golang
  - code
  - go
  - programming
  - llama
  - text-generation-inference
---

# Llama 3.1 8b Golang Coder v2

I trained this model (based on Llama 3.1 8b) on a merged dataset I created, consisting of 50,627 rows with 13.3M input tokens and 2.2M output tokens.

The training run itself covered 1,020,719 input tokens and 445,810 output tokens drawn from 45,565 items in the dataset.
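
The model can be queried like other Llama 3.1 8b checkpoints. Below is a minimal usage sketch with the Hugging Face `transformers` library; the repo ID `smcleod/llama-3.1-8b-golang-coder-v2` and the presence of the standard Llama 3.1 chat template are assumptions, not details confirmed by this card.

```python
# Minimal sketch: load the model and ask for a Go snippet.
# Assumptions: the repo ID below is hypothetical, and the checkpoint
# ships with the standard Llama 3.1 chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "smcleod/llama-3.1-8b-golang-coder-v2"  # hypothetical repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half-precision keeps the 8b model within a single-GPU memory budget
    device_map="auto",
)

# Ask for a small, idiomatic Go function.
messages = [
    {"role": "user", "content": "Write an idiomatic Go function that reverses a UTF-8 string."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=512, do_sample=False)
# Strip the prompt tokens and print only the generated completion.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```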