| **ShizhenGPT-32B-VL** | 32B | Text, Image Understanding | [HF Link](https://huggingface.co/FreedomIntelligence/ShizhenGPT-32B-VL) |
| **ShizhenGPT-32B-Omni** | 32B | Text, Four Diagnostics (望闻问切) | Available soon |

*Note: The LLM and VL models are parameter-split variants of ShizhenGPT-7B-Omni. Since their architectures align with Qwen2.5 and Qwen2.5-VL, they are easier to adapt to different environments. In contrast, ShizhenGPT-7B-Omni requires `transformers==4.51.0`.*
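Because the VL split aligns with the Qwen2.5-VL architecture, it should load with the stock classes in recent `transformers` releases. A minimal sketch (the function name and the `torch_dtype`/`device_map` settings here are illustrative choices, not from this README):

```python
def load_shizhengpt_vl(model_id: str = "FreedomIntelligence/ShizhenGPT-32B-VL"):
    """Load the VL split with stock Qwen2.5-VL classes (no trust_remote_code needed)."""
    # Deferred import so this sketch can be inspected without transformers installed.
    from transformers import AutoProcessor, Qwen2_5_VLForConditionalGeneration

    processor = AutoProcessor.from_pretrained(model_id)
    model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
        model_id,
        torch_dtype="auto",   # use the dtype recorded in the checkpoint config
        device_map="auto",    # place weights across available devices
    )
    return model, processor
```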
# Usage
To use `ShizhenGPT-7B-Omni`, you need to use `transformers==4.51.0` and set `trust_remote_code` to True. You can run the following script:
```python
from transformers import AutoModelForCausalLM, AutoProcessor
```
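The script above is truncated here; a sketch of how the load might continue, using only the import shown. The model ID follows the org's naming pattern, and the `torch_dtype`/`device_map` choices are assumptions rather than part of the original script:

```python
def load_shizhengpt_omni(model_id: str = "FreedomIntelligence/ShizhenGPT-7B-Omni"):
    """Load ShizhenGPT-7B-Omni and its processor (requires transformers==4.51.0)."""
    # Deferred import so this sketch can be inspected without transformers installed.
    from transformers import AutoModelForCausalLM, AutoProcessor

    # trust_remote_code=True is required: the Omni variant ships custom modeling
    # code instead of matching a stock Qwen2.5 class.
    processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # use the dtype recorded in the checkpoint config
        device_map="auto",    # place weights across available devices
        trust_remote_code=True,
    )
    return model, processor
```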