SmerkyG committed
Commit e94655a · verified · 1 Parent(s): bd2ed37

Update README.md

Files changed (1): README.md (+1 -1)
README.md CHANGED
@@ -51,7 +51,7 @@ This is RWKV-7 model under flash-linear attention format.
 Install `flash-linear-attention` and the latest version of `transformers` before using this model:
 
 ```bash
-pip install git+https://github.com/fla-org/flash-linear-attention
+pip install flash-linear-attention==0.3.0
 pip install 'transformers>=4.48.0'
 ```
 
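For context, a minimal sketch of what these install commands enable: loading the model through the `transformers` Auto classes. The model repo id below is a placeholder (not taken from this commit), and `trust_remote_code` may or may not be required depending on how the model is packaged.

```python
# Minimal usage sketch, assuming flash-linear-attention==0.3.0 and
# transformers>=4.48.0 are installed per the README's instructions.
# The repo id is a placeholder; substitute the actual Hugging Face model id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "<rwkv7-model-repo-id>"  # placeholder, not from this commit

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```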