---
pipeline_tag: sentence-similarity
tags:
- sentence-similarity
- sentence-transformers
license: mit
language:
- multilingual
- af
- am
- ar
- as
- az
- be
- bg
- bn
- br
- bs
- ca
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- he
- hi
- hr
- hu
- hy
- id
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lo
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- my
- ne
- nl
- no
- om
- or
- pa
- pl
- ps
- pt
- ro
- ru
- sa
- sd
- si
- sk
- sl
- so
- sq
- sr
- su
- sv
- sw
- ta
- te
- th
- tl
- tr
- ug
- uk
- ur
- uz
- vi
- xh
- yi
- zh
---
A quantized version of [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) for Intel CPUs.

Paper: [Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf). Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022.

This model has 12 layers and an embedding size of 384.
## Usage

TBD
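Until official instructions land here, the sketch below shows the standard E5 embedding recipe from the base model's card: mean-pool the last hidden states over non-padding tokens, then L2-normalize. The pooling function is runnable as-is; the model-loading part is commented out because this quantized variant may require a different loader (e.g. Optimum Intel) than the plain `transformers` call shown, which is an assumption based on the base model.

```python
import torch
import torch.nn.functional as F


def average_pool(last_hidden_states: torch.Tensor,
                 attention_mask: torch.Tensor) -> torch.Tensor:
    """Mean-pool token embeddings, ignoring padding positions."""
    # Zero out embeddings of padding tokens before summing.
    masked = last_hidden_states.masked_fill(
        ~attention_mask[..., None].bool(), 0.0
    )
    # Divide by the number of real (non-padding) tokens per sequence.
    return masked.sum(dim=1) / attention_mask.sum(dim=1)[..., None]


# Assumed usage, following the base model's card (loader for the
# quantized checkpoint may differ):
#
# from transformers import AutoTokenizer, AutoModel
# tokenizer = AutoTokenizer.from_pretrained("intfloat/multilingual-e5-small")
# model = AutoModel.from_pretrained("intfloat/multilingual-e5-small")
#
# # E5 models expect a "query: " or "passage: " prefix on each input text.
# batch = tokenizer(["query: how much protein should a female eat"],
#                   padding=True, truncation=True, return_tensors="pt")
# outputs = model(**batch)
# embeddings = F.normalize(
#     average_pool(outputs.last_hidden_state, batch["attention_mask"]),
#     p=2, dim=1,
# )
```

Cosine similarity between two normalized embeddings is then just their dot product.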