Update README.md
README.md CHANGED

@@ -73,14 +73,11 @@ Inference code is available on [GitHub](https://github.com/DataDog/toto).
 ### Installation
 
 ```bash
-
-git clone https://github.com/DataDog/toto.git
-cd toto
-
-# Install dependencies
-pip install -r requirements.txt
+pip install toto-ts
 ```
 
+For optimal speed and reduced memory usage, you should also install [xFormers](https://github.com/facebookresearch/xformers) and [flash-attention](https://github.com/Dao-AILab/flash-attention)
+
 ### 🚀 Inference Example
 
 Here's how to quickly generate forecasts using Toto:
@@ -133,9 +130,6 @@ upper_quantile = forecast.quantile(0.9)
 
 For detailed inference instructions, refer to the [inference tutorial notebook](https://github.com/DataDog/toto/blob/main/toto/notebooks/inference_tutorial.ipynb).
 
-### Performance Recommendations
-### **For optimal speed and reduced memory usage, install [xFormers](https://github.com/facebookresearch/xformers) and [flash-attention](https://github.com/Dao-AILab/flash-attention). Then, set `use_memory_efficient` to `True`.**
-
 ---
 
 ### 💾 Available Checkpoints
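The context line of the second hunk (`upper_quantile = forecast.quantile(0.9)`) shows that the README's inference example produces quantile forecasts. As a minimal sketch of what that step computes — not Toto's actual API, and using randomly generated stand-in trajectories rather than model output — empirical forecast quantiles can be taken across sampled futures with NumPy:

```python
import numpy as np

# Stand-in for a probabilistic forecast: 256 hypothetical sampled
# trajectories over a 12-step horizon. A real model would produce these;
# here they are random draws purely for illustration.
rng = np.random.default_rng(0)
samples = rng.normal(loc=10.0, scale=2.0, size=(256, 12))

# Point forecast: the per-step median across sampled trajectories
median_forecast = np.quantile(samples, 0.5, axis=0)

# 80% prediction interval from the 0.1 and 0.9 empirical quantiles,
# analogous to the README's forecast.quantile(0.9) call
lower_quantile = np.quantile(samples, 0.1, axis=0)
upper_quantile = np.quantile(samples, 0.9, axis=0)
```

Each result is one value per horizon step, so the interval can be plotted directly as a band around the median.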