Instructions for using keras/llama3.2_instruct_3b with the supported libraries and notebooks. Follow the sections below to get started.
- Libraries
  - KerasHub

    How to use keras/llama3.2_instruct_3b with KerasHub:
    ```python
    import keras_hub

    # Create a LlamaCausalLM model
    task = keras_hub.models.LlamaCausalLM.from_preset("hf://keras/llama3.2_instruct_3b")
    ```

    ```python
    import keras_hub

    # Create a Backbone model unspecialized for any task
    backbone = keras_hub.models.Backbone.from_preset("hf://keras/llama3.2_instruct_3b")
    ```
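    Once the task model is loaded, it can generate text directly. A minimal sketch of that follow-up step, assuming the `task` object from the snippet above; the prompt and `max_length` are illustrative values, not prescribed by the model card:

    ```python
    # Generate a completion with the loaded causal LM task.
    # Prompt and max_length are illustrative values.
    output = task.generate("What is Keras?", max_length=64)
    print(output)
    ```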
  - Keras

    How to use keras/llama3.2_instruct_3b with Keras:
    ```python
    # Available backend options are: "jax", "torch", "tensorflow".
    import os
    os.environ["KERAS_BACKEND"] = "jax"

    import keras

    model = keras.saving.load_model("hf://keras/llama3.2_instruct_3b")
    ```
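    A quick way to confirm the load succeeded is to inspect the returned model. A minimal sketch, assuming the `model` object from the snippet above:

    ```python
    # Print the layer summary and total parameter count of the loaded model.
    model.summary()
    print(f"Parameters: {model.count_params():,}")
    ```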
- Notebooks
  - Google Colab
  - Kaggle
Backbone configuration (config.json):

```json
{
    "module": "keras_hub.src.models.llama3.llama3_backbone",
    "class_name": "Llama3Backbone",
    "config": {
        "name": "llama3_backbone",
        "trainable": true,
        "vocabulary_size": 128256,
        "num_layers": 28,
        "num_query_heads": 24,
        "hidden_dim": 3072,
        "intermediate_dim": 8192,
        "rope_max_wavelength": 500000.0,
        "rope_position_scaling_factor": 1,
        "rope_frequency_adjustment_factor": 32,
        "rope_low_freq_factor": 1,
        "rope_high_freq_factor": 4,
        "rope_pretraining_sequence_length": 8192,
        "num_key_value_heads": 8,
        "layer_norm_epsilon": 1e-05,
        "dropout": 0
    },
    "registered_name": "keras_hub>Llama3Backbone"
}
```
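Two properties follow from these values: the model uses grouped-query attention, since num_query_heads (24) exceeds num_key_value_heads (8), and the per-head dimension is hidden_dim / num_query_heads under the usual Llama 3 convention. A small sketch deriving both; the head-dimension formula is an assumption about Llama3Backbone's conventions, not something stated in the config itself:

```python
# Derive attention-layout facts from the config values listed above.
config = {
    "num_layers": 28,
    "num_query_heads": 24,
    "num_key_value_heads": 8,
    "hidden_dim": 3072,
}

# Assumed convention: head_dim = hidden_dim / num_query_heads -> 3072 / 24 = 128.
head_dim = config["hidden_dim"] // config["num_query_heads"]
# Grouped-query attention: 24 / 8 = 3 query heads share each key/value head.
gqa_group_size = config["num_query_heads"] // config["num_key_value_heads"]

print(f"Per-head dimension (assumed): {head_dim}")
print(f"Query heads per key/value head (GQA): {gqa_group_size}")
```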