Instructions to use josh-oo/calibrated-decoder with libraries, inference providers, notebooks, and local apps.
- Libraries
- Transformers
How to use josh-oo/calibrated-decoder with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("josh-oo/calibrated-decoder")
model = AutoModelForSeq2SeqLM.from_pretrained("josh-oo/calibrated-decoder")
```
- Notebooks
- Google Colab
- Kaggle
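Beyond loading, a minimal inference sketch may help. This assumes the checkpoint follows the standard Hugging Face seq2seq API (`generate` plus `decode`); the input sentence and generation parameters below are illustrative, not taken from the model card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load tokenizer and model (same snippet as above)
tokenizer = AutoTokenizer.from_pretrained("josh-oo/calibrated-decoder")
model = AutoModelForSeq2SeqLM.from_pretrained("josh-oo/calibrated-decoder")

# Tokenize an example input sentence (illustrative only)
inputs = tokenizer("This is an example sentence.", return_tensors="pt")

# Generate output token ids; max_new_tokens is an assumed, tunable setting
output_ids = model.generate(**inputs, max_new_tokens=50)

# Decode the first sequence back to text
text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(text)
```

The same pattern applies regardless of the task the checkpoint was trained on; only the input text and generation settings change.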
Commit History
Distilled layers: [r, r, -1] (0.75) kr/vr f2e4680
Distilled layers: [r, r, -1] (0.75) kr/vr 462d88f
Distilled layers: [r, r, -1] (0.75) c77f7fc
Trained with normal text replication (+ bart noise [Multilingual BERT]) 1bf3259
Finetuned on simplification task 4f1357e
Trained with normal text replication (+ bart noise [ + CTRL Tokens]) 3ff0b79
Trained with normal text replication (+ bart noise [Multilingual BERT]) ef48566
Trained with normal text replication [mlsum] (+ bart noise) c7ef61e
Trained with normal text replication (+ bart noise [only permutation]) b7defb2
Trained with normal text replication (+ bart noise [only masking]) 729966d
Trained with normal text replication (+ bart noise) 7bf0045
Trained with normal text replication 28295ad
Update config.json fa840f3
Initial commit fcca31e
initial commit 4a313a6
Joshua Oehms committed on