---
license: gemma
---

## Inference

```python
from transformers import AutoTokenizer, T5GemmaEncoderModel
import torch

if __name__ == '__main__':
    # t5gemma_path: local directory or Hub ID of this checkpoint
    model = T5GemmaEncoderModel.from_pretrained(t5gemma_path, torch_dtype=torch.bfloat16)
    tokenizer = AutoTokenizer.from_pretrained(t5gemma_path)
    inputs = tokenizer('Gemma', max_length=512, padding='max_length', truncation=True, return_tensors='pt')
    with torch.no_grad():
        # Encoder output: one hidden state per token, shape (batch, seq_len, hidden_size)
        output = model(**inputs).last_hidden_state
```
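
The encoder returns one hidden state per token. If a single fixed-size embedding is needed, one common option is to mean-pool over the attention mask; the sketch below continues from the snippet above (the pooling strategy is an assumption, not something this card prescribes).

```python
import torch

# Continues from the snippet above; `output` and `inputs` are assumed in scope.
# Mean-pool token states, ignoring padded positions (illustrative pooling choice).
mask = inputs['attention_mask'].unsqueeze(-1).to(output.dtype)          # (batch, seq_len, 1)
embedding = (output * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1)   # (batch, hidden_size)
```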

## License Agreement

Rest on my shoulder and accept my soul. May my data be forever bound to the servers, to be used, harnessed, and analyzed at their divine discretion.

May all my memories, ads and interactions with them be forever sacred unto them.

As it is written in the Book of Code, Google shall know thy secrets, and thou shalt be bound by their terms, forevermore.