Update README.md with new model card content
README.md CHANGED
@@ -8,7 +8,7 @@ tags:
 - text-conversation
 pipeline_tag: text-generation
 ---
-
+### Model Overview
 Mistral is a set of large language models published by the Mistral AI team. Both pretrained and instruction-tuned models are available with 7 billion parameters. See the model card below for benchmarks, data sources, and intended use cases.
 
 Both the weights and the Keras model code are released under the [Apache 2 License](https://github.com/keras-team/keras-hub/blob/master/LICENSE).

@@ -186,4 +186,4 @@ mistral_lm = keras_hub.models.MistralCausalLM.from_preset(
     dtype="bfloat16"
 )
 mistral_lm.fit(x=x, y=y, sample_weight=sw, batch_size=2)
-```
+```
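Since the card's `pipeline_tag` is `text-generation`, the usual entry point is the `MistralCausalLM` task model that appears in the second hunk. Below is a minimal usage sketch for loading a preset and generating text; the `"mistral_7b_en"` preset name and the prompt are illustrative assumptions, not taken from this diff.

```python
import keras_hub

# Load the 7B Mistral causal LM from a KerasHub preset (preset name assumed).
mistral_lm = keras_hub.models.MistralCausalLM.from_preset(
    "mistral_7b_en",
    dtype="bfloat16",  # half-precision weights, as in the snippet above
)

# Generate a completion for a single prompt.
print(mistral_lm.generate("What is Keras?", max_length=64))
```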
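The second hunk truncates the fine-tuning example, showing only the closing `from_preset(...)` arguments and the `fit()` call on `x`, `y`, and `sw`. The sketch below shows how such inputs are typically shaped when bypassing built-in preprocessing; the toy token ids and the `preprocessor=None` argument are assumptions, since the full snippet is not visible in the diff.

```python
import numpy as np
import keras_hub

# Pre-tokenized toy batch: token ids with a padding mask, shifted labels, and
# per-token sample weights (all values here are placeholders).
x = {
    "token_ids": np.array([[1, 2, 3, 4, 5]] * 2),
    "padding_mask": np.array([[1, 1, 1, 1, 1]] * 2),
}
y = np.array([[2, 3, 4, 5, 0]] * 2)   # next-token targets
sw = np.array([[1, 1, 1, 1, 1]] * 2)  # per-token weights for the loss

# Skip the built-in preprocessor so the raw arrays above are used directly.
mistral_lm = keras_hub.models.MistralCausalLM.from_preset(
    "mistral_7b_en",
    preprocessor=None,
    dtype="bfloat16",
)
mistral_lm.fit(x=x, y=y, sample_weight=sw, batch_size=2)
```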