Commit 0068aa6 (verified) by ben-cohen-datadog · Parent: 7bf285f

Update README.md

Remove the note about setting `memory_efficient_attention=False`, since this should be handled automatically when loading a checkpoint.

Files changed (1): README.md (+0 −1)
README.md CHANGED
@@ -144,7 +144,6 @@ For a step-by-step guide on running inferences with Toto, please refer to our [G
 #### Usage Recommendations
 
 - For optimal inference speed install [xformers](https://github.com/facebookresearch/xformers?tab=readme-ov-file#installing-xformers) and [flash-attention](https://github.com/Dao-AILab/flash-attention?tab=readme-ov-file#installation-and-features)
-- If you're not using [xformers](https://github.com/facebookresearch/xformers?tab=readme-ov-file#installing-xformers) or your system lacks a recent NVIDIA GPU, set `memory_efficient_attention=False` to ensure compatibility and stable inference.
 
 ## Training Details - TODO keep or remove?
 
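
For context on the automatic behavior the commit message describes, below is a minimal sketch of how such a fallback could be chosen at load time. The `Toto.from_pretrained` call, the checkpoint ID, and the `memory_efficient_attention` keyword are assumptions for illustration only; they are not confirmed by this diff.

# Minimal sketch of an automatic memory-efficient-attention check (assumptions
# noted above); the real Toto checkpoint-loading code may differ.
import importlib.util

import torch


def can_use_memory_efficient_attention() -> bool:
    """Return True only if xformers is installed and a CUDA GPU is available."""
    has_xformers = importlib.util.find_spec("xformers") is not None
    return has_xformers and torch.cuda.is_available()


# With a check like this performed while loading the checkpoint, callers no
# longer need to pass memory_efficient_attention=False by hand on systems
# without xformers or a recent NVIDIA GPU, which is why the README bullet
# was removed.
#
# model = Toto.from_pretrained(               # hypothetical call
#     "Datadog/Toto-Open-Base-1.0",           # illustrative checkpoint ID
#     memory_efficient_attention=can_use_memory_efficient_attention(),
# )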