datasets:
- niltheory/ExistenceTypes
language:
- en
Iteration #1: | |
The base DistilBERT checkpoint was used to train the model on a training set of 96 examples.
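A minimal sketch of what the Iteration #1 fine-tuning run might look like with the Hugging Face `Trainer` API. The checkpoint name `distilbert-base-uncased`, the `"text"`/`"label"` column names, `num_labels=2`, and the hyperparameters are assumptions for illustration, not values taken from this card.

```python
# Sketch only: column names, label count, and hyperparameters are assumptions.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset("niltheory/ExistenceTypes")  # ~96 examples at this stage
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    # Truncate/pad every example to a fixed length for batching.
    return tokenizer(batch["text"], truncation=True, padding="max_length")

tokenized = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2  # num_labels is an assumption
)

args = TrainingArguments(
    output_dir="existence-types-distilbert",
    num_train_epochs=3,
    per_device_train_batch_size=8,
)

trainer = Trainer(model=model, args=args, train_dataset=tokenized["train"])
trainer.train()
```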
Iteration #2: | |
Increased the training data from 96 to 296 examples.
This greatly improved the accuracy scores, although some edge cases remain.
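To make the accuracy comparison between iterations concrete, a hedged evaluation sketch that continues from the snippet above: hold out part of the 296 examples and score accuracy on it. The split fraction and metric wiring are assumptions, not the card's exact protocol.

```python
# Continues the previous sketch (reuses tokenized, model, args, Trainer).
# The held-out fraction and the accuracy metric are assumptions.
import numpy as np
import evaluate

split = tokenized["train"].train_test_split(test_size=0.2, seed=42)
metric = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return metric.compute(predictions=preds, references=labels)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=split["train"],
    eval_dataset=split["test"],
    compute_metrics=compute_metrics,
)
trainer.train()
print(trainer.evaluate())  # reports eval_accuracy among other metrics
```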
Iteration #3: | |
Switched from bert-base-uncased to bert-large-cased-whole-word-masking.
This checkpoint has more contextual sensitivity and is slightly more accurate than Iteration #2.
Overall, it is more nuanced and sensitive.
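Iteration #3 is essentially a checkpoint swap; the rest of the pipeline stays the same. A sketch under the same assumptions as the earlier snippets (`num_labels=2` is still assumed):

```python
# Same pipeline as the earlier sketches; only the checkpoint changes.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

checkpoint = "bert-large-cased-whole-word-masking"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint, num_labels=2  # num_labels remains an assumption
)
# Re-run the tokenize/train/evaluate steps from the earlier sketches with the
# new tokenizer and model. The cased tokenizer preserves capitalization, so
# the dataset must be re-tokenized for this checkpoint.
```

The trade-off is size: bert-large has roughly 340M parameters versus DistilBERT's roughly 66M, so training and inference are noticeably slower in exchange for the added contextual sensitivity described above.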