New joinee friendly doc

#13
by sashank2k3 - opened

Can anybody please walk me through how to use this model in Google Colab? The model card has code up to the point where the model is loaded, but there is no code showing how to get started with inference.

Hi @sashank2k3 ,

Welcome to Google's Gemma family of open models. Yes, these models work like the other Gemma 3 models: after downloading/loading the model weights, you can use them locally for the same kinds of use cases. The following sample snippet shows model inference for a simple question-and-answer task.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-3-1b-it"  # replace with the Gemma checkpoint you want to use

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    [
        {
            "role": "system",
            "content": [{"type": "text", "text": "You are a helpful assistant."}],
        },
        {
            "role": "user",
            "content": [{"type": "text", "text": "What's a basic recipe for eggs?"}],
        },
    ],
]

# Build the prompt with Gemma's chat template and tokenize it.
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)  # token ids are integers, so no dtype cast is needed here

with torch.inference_mode():
    outputs = model.generate(**inputs, max_new_tokens=200)

print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```
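If you want a multi-turn chat, the usual pattern is to keep one running list of messages, append the model's decoded reply as an `"assistant"` turn plus your next question as a `"user"` turn, and call `apply_chat_template` on the updated list again. A minimal sketch of just that bookkeeping (the `append_turn` helper and the example reply text are illustrative, not part of the library; no model calls here):

```python
# Multi-turn bookkeeping: one running list of messages per conversation.
# Generation itself works exactly as in the snippet above.

def append_turn(conversation, role, text):
    """Append one chat turn in the nested format Gemma's chat template expects."""
    conversation.append({"role": role, "content": [{"type": "text", "text": text}]})
    return conversation

conversation = [
    {"role": "system", "content": [{"type": "text", "text": "You are a helpful assistant."}]}
]
append_turn(conversation, "user", "What's a basic recipe for eggs?")
# ...run model.generate() and decode as above, then feed the reply back in:
append_turn(conversation, "assistant", "Scrambled eggs: whisk two eggs with a pinch of salt...")
append_turn(conversation, "user", "How long should I cook them?")

print(len(conversation))  # 4 turns accumulated so far
```

Passing the whole list each time matters because the model is stateless between `generate` calls; the chat template rebuilds the full context from the history you keep.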

Thanks.

Thank you so much @BalakrishnaCh
