johnucm committed on
Commit f657f4e · verified · 1 Parent(s): 9968e1d

Add `do_sample` attr in generation_config


Since `temperature`, `top_k`, etc. are listed in this file, `do_sample = true` should be added as well. In transformers the default value of `do_sample` is `false`, so `GenerationConfig.save_pretrained()` raises an error such as:

```
File "/opt/conda/envs/py310/lib/python3.10/site-packages/transformers/generation/configuration_utils.py", line 837, in save_pretrained
raise ValueError(str(exc) + "\n\nFix these issues to save the configuration.")
ValueError: GenerationConfig is invalid:
- `temperature`: `do_sample` is set to `False`. However, `temperature` is set to `0.6` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `temperature`.
- `top_p`: `do_sample` is set to `False`. However, `top_p` is set to `0.95` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `top_p`.
- `min_p`: `do_sample` is set to `False`. However, `min_p` is set to `0.0` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `min_p`.
- `top_k`: `do_sample` is set to `False`. However, `top_k` is set to `20` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `top_k`.
If you're using a pretrained model, note that some of these attributes may be set through the model's `generation_config.json` file.

Fix these issues to save the configuration.
```
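The check behind this error can be sketched in plain Python. This is a simplified illustration of the rule the traceback describes (sample-only flags require `do_sample=True`), not the library's actual `GenerationConfig.validate` implementation; the helper name `find_invalid_flags` is made up for this example.

```python
# Simplified sketch of the validation transformers applies before saving a
# GenerationConfig; `find_invalid_flags` is an illustrative helper, not a
# real library function.
SAMPLE_ONLY_FLAGS = ("temperature", "top_p", "min_p", "top_k")

def find_invalid_flags(cfg: dict) -> list:
    """Return the sample-only flags that are set while do_sample is False."""
    if cfg.get("do_sample", False):  # do_sample defaults to False
        return []
    return [flag for flag in SAMPLE_ONLY_FLAGS if flag in cfg]

# The config from this commit, before the fix: all four flags are flagged.
bad = find_invalid_flags({"temperature": 0.6, "top_p": 0.95,
                          "min_p": 0.0, "top_k": 20})
# Adding do_sample=True clears the complaints.
ok = find_invalid_flags({"do_sample": True, "temperature": 0.6, "top_p": 0.95})
```

With `do_sample` absent (i.e. defaulting to `false`), every sampling flag present in the file is reported, matching the four bullet points in the traceback above.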

Files changed (1)
  1. generation_config.json +1 -0
generation_config.json CHANGED

```diff
@@ -4,6 +4,7 @@
   "eos_token_id": 151645,
   "transformers_version": "4.52.4",
   "use_cache": false,
+  "do_sample": true,
   "temperature": 0.6,
   "top_k": 20,
   "top_p": 0.95,
```
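The same one-key change can be applied to a config file programmatically. A minimal sketch, assuming the file is plain JSON as shown in the diff; the helper name `add_do_sample` is hypothetical:

```python
import json

def add_do_sample(config_text: str) -> str:
    """Return the generation-config JSON with `"do_sample": true` added
    (hypothetical helper mirroring this commit's one-line change)."""
    cfg = json.loads(config_text)
    cfg["do_sample"] = True
    return json.dumps(cfg, indent=2)

# Usage with a trimmed-down version of this repo's generation_config.json:
patched = add_do_sample('{"use_cache": false, "temperature": 0.6, "top_k": 20}')
```

Writing `patched` back to `generation_config.json` leaves the existing keys untouched, so `GenerationConfig.save_pretrained()` no longer rejects the sampling flags.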