Update README.md
We applied:
- Filtering to keep only high-quality reasoning traces (correct answers with proper reasoning)
- STORM-inspired techniques to enhance comprehensive report generation

### Phase 4: Reinforcement Learning

We trained the model using reinforcement learning:
- Dataset: MuSiQue (19k samples)
- Incorporated our in-house search database (containing Wiki data, FineWeb data, and arXiv data)

## Performance

| **Benchmark** | **Qwen3-4B** | **Jan-4B** | **WebSailor-3B** | **II-Search-4B** |

## Usage

Serve the model with vLLM:

```bash
vllm serve Intelligent-Internet/II-Search-4B \
    --served-model-name II-Search-4B \
    --tensor-parallel-size 8 \
    --enable-reasoning \
    --reasoning-parser deepseek_r1 \
    --rope-scaling '{"rope_type":"yarn","factor":1.5,"original_max_position_embeddings":98304}' \
    --max-model-len 131072
```
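Once the server is up, you can query it through vLLM's OpenAI-compatible chat endpoint. Below is a minimal sketch using only the Python standard library; it assumes the server above is running locally on vLLM's default port 8000, and the question is just an illustration. It reuses the recommended sampling parameters from the section below.

```python
import json
from urllib import request

# Assumption: the vLLM server from the command above is listening on
# localhost:8000 and exposes the OpenAI-compatible chat completions API.
payload = {
    "model": "II-Search-4B",  # must match --served-model-name
    "messages": [
        {"role": "user", "content": "Which paper introduced the transformer architecture?"}
    ],
    # Recommended generation parameters for II-Search-4B
    "temperature": 0.6,
    "top_p": 0.95,
    "max_tokens": 1024,
}

req = request.Request(
    "http://localhost:8000/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# Uncomment once the server is running:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```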
- Alternatively, host [II-Search-4B-MLX](https://huggingface.co/Intelligent-Internet/II-Search-4B-MLX/) on your Mac and use it through LM Studio or Ollama Desktop.

### Recommended Generation Parameters

```python
generate_cfg = {
    'top_k': 20,
    'top_p': 0.95,
    'temperature': 0.6,
    'repetition_penalty': 1.1,