Update bonus-unit1/bonus-unit1.ipynb

#26

Fixed a typo in the tool response token in the ChatmlSpecialTokens class.

Hugging Face Agents Course org

Thanks!

sergiopaniego changed pull request status to merged

When I try the notebook, I get the following error:

TypeError Traceback (most recent call last)
/tmp/ipython-input-209518391.py in <cell line: 0>()
21 tokenizer.chat_template = "{{ bos_token }}{% if messages[0]['role'] == 'system' %}{{ raise_exception('System role not supported') }}{% endif %}{% for message in messages %}{{ '' + message['role'] + '\n' + message['content'] | trim + '\n' }}{% endfor %}{% if add_generation_prompt %}{{'model\n'}}{% endif %}"
22
---> 23 model = AutoModelForCausalLM.from_pretrained(model_name,
24 attn_implementation='eager',
25 device_map="auto")

/usr/local/lib/python3.12/dist-packages/transformers/models/gemma2/configuration_gemma2.py in __init__(self, vocab_size, hidden_size, intermediate_size, num_hidden_layers, num_attention_heads, num_key_value_heads, head_dim, hidden_activation, max_position_embeddings, initializer_range, rms_norm_eps, use_cache, pad_token_id, eos_token_id, bos_token_id, tie_word_embeddings, rope_theta, attention_bias, attention_dropout, query_pre_attn_scalar, sliding_window, layer_types, final_logit_softcapping, attn_logit_softcapping, **kwargs)
177 "sliding_attention" if bool((i + 1) % 2) else "full_attention" for i in range(self.num_hidden_layers)
178 ]
--> 179 layer_type_validation(self.layer_types, self.num_hidden_layers)
180
181

TypeError: layer_type_validation() takes 1 positional argument but 2 were given
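This TypeError usually means the installed transformers release is out of sync with the Gemma 2 config code calling it: the caller passes two positional arguments to a `layer_type_validation()` defined with one, which suggests (an assumption, not confirmed in this thread) a version mismatch that upgrading transformers in the Colab runtime (`pip install -U transformers`) would resolve. A minimal sketch of the failure mode, using a stand-in one-argument helper rather than the real transformers function:

```python
# Minimal reproduction of the failure mode: a helper defined with one
# parameter is called with two positional arguments, which is what happens
# when newer gemma2 config code meets an older layer_type_validation().
# This stand-in is illustrative only, not the real transformers function.
def layer_type_validation(layer_types):  # old one-argument signature
    for t in layer_types:
        assert t in ("sliding_attention", "full_attention")

try:
    # Newer config code passes num_hidden_layers as a second argument.
    layer_type_validation(["full_attention"], 26)
except TypeError as e:
    print(e)  # takes 1 positional argument but 2 were given
```

If upgrading is not an option, pinning transformers to the version the notebook was written against should also keep the two sides of this call consistent.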
