---
license: apache-2.0
language:
- en
---
|
|
|
|
|
## StripedHyena-Hessian-7B (SH-7B)
|
|
|
|
|
|
|
|
### Model Architecture
|
|
|
|
|
StripedHyena is a hybrid architecture composed of multi-head, grouped-query attention and gated convolutions arranged in [Hyena](https://arxiv.org/abs/2302.10866) blocks, differing from traditional decoder-only Transformers.
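
As an illustration of this block layout (not the released implementation), the sketch below interleaves attention blocks with gated-convolution blocks in a single decoder stack. The block ratio, dimensions, and the simplified depthwise convolution standing in for the Hyena operator are assumptions made for brevity.

```python
# Minimal sketch of a hybrid attention / gated-convolution decoder stack.
# All names, sizes, and the 3:1 block ratio are illustrative assumptions.
import torch
import torch.nn as nn


class GatedConvBlock(nn.Module):
    """Stand-in for a Hyena-style gated convolution block."""

    def __init__(self, dim: int, kernel_size: int = 3):
        super().__init__()
        self.in_proj = nn.Linear(dim, 2 * dim)
        # Depthwise causal convolution: pad on the left, trim the right.
        self.conv = nn.Conv1d(dim, dim, kernel_size,
                              padding=kernel_size - 1, groups=dim)
        self.out_proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        gate, value = self.in_proj(x).chunk(2, dim=-1)
        value = self.conv(value.transpose(1, 2))[..., : x.shape[1]].transpose(1, 2)
        return self.out_proj(torch.sigmoid(gate) * value)


class AttentionBlock(nn.Module):
    """Grouped-query attention approximated here by standard multi-head attention."""

    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        mask = nn.Transformer.generate_square_subsequent_mask(x.shape[1])
        out, _ = self.attn(x, x, x, attn_mask=mask)
        return out


class HybridStack(nn.Module):
    """Interleave gated-convolution and attention blocks (ratio is an assumption)."""

    def __init__(self, dim: int = 256, depth: int = 8):
        super().__init__()
        self.blocks = nn.ModuleList(
            AttentionBlock(dim) if i % 4 == 3 else GatedConvBlock(dim)
            for i in range(depth)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for block in self.blocks:
            x = x + block(x)  # residual connection around each block
        return x


x = torch.randn(2, 16, 256)
print(HybridStack()(x).shape)  # torch.Size([2, 16, 256])
```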
|
|
- Constant memory decoding in Hyena blocks via representation of convolutions as state-space models (modal or canonical form), or as truncated filters (see the sketch after this list).
|
|
- Lower latency when preprocessing long prompts.
|
|
- Improvements to training and inference compute-optimal scaling laws, compared to Transformers.
|
|
|
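The constant-memory property in the first bullet follows from the fact that a convolution whose filter admits a state-space realization can be evaluated recurrently, carrying only a small fixed-size state between decoding steps instead of the full token history. The NumPy sketch below is a toy numerical check of that equivalence; the diagonal (modal-form) system, its dimensions, and the random inputs are illustrative assumptions, not the model's actual filters.

```python
# Toy check: a convolution with a state-space realization can be decoded
# recurrently with constant memory per step. All values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
d_state, seq_len = 4, 32

# Diagonal ("modal" form) state-space model: x_t = A x_{t-1} + B u_t, y_t = C x_t.
A = rng.uniform(0.1, 0.9, d_state)   # diagonal, stable (entries < 1)
B = rng.normal(size=d_state)
C = rng.normal(size=d_state)
u = rng.normal(size=seq_len)         # input sequence

# Equivalent causal convolution filter: h_k = C * A^k * B, so y = h * u.
h = np.array([(C * A**k * B).sum() for k in range(seq_len)])
y_conv = np.array([sum(h[k] * u[t - k] for k in range(t + 1))
                   for t in range(seq_len)])

# Recurrent decoding: only the d_state-dimensional state is kept per step,
# independent of how many tokens have been generated so far.
x = np.zeros(d_state)
y_rec = np.empty(seq_len)
for t in range(seq_len):
    x = A * x + B * u[t]
    y_rec[t] = C @ x

print(np.allclose(y_conv, y_rec))  # True: same output, constant memory per step
```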