---
language:
- en
license: apache-2.0
tags:
- biencoder
- sentence-transformers
- text-classification
- sentence-pair-classification
- semantic-similarity
- semantic-search
- retrieval
- reranking
- generated_from_trainer
- dataset_size:2200421
- loss:CoSENTLoss
base_model: Alibaba-NLP/gte-modernbert-base
widget:
- source_sentence: They are sometimes called Marg or also Path in Hindi .
sentences:
- Largs was born in Brisbane House in Noddsdale , near Brisbane in Ayrshire , Scotland
, the son of Sir Thomas Brisbane and Dame Eleanora Brisbane .
- Its smallest radius is 1.4 ( 131 thousand light years ) and largest 0.7 angle
minutes ( 65 thousand light years ) .
- They are also called Marg or sometimes the path in the Hindi .
- source_sentence: The main mode of play in `` Crash Bash `` is the Adventure Mode
, in which one or two players must win all 28 levels to complete .
sentences:
- Parkton is a city in Robeson County , North Carolina , in the Lumberton Metro
area , in the United States .
- The CANTAB tests were developed by Professor Barbara Sahakian and Professor Trevor
Robbins .
- The main mode in `` Crash Bash `` is the adventure mode in which one or two players
must complete all 28 levels to win .
- source_sentence: It was formed in December 2014 from elements of the disbanded 51st
Mechanized Brigade and newly mobilized units .
sentences:
- It had branches in feature films , television , physical and digital publishing
, merchandise , recorded music , digital and online media applications and mobile
and social games .
- Notts County and Arsenal were relegated to the Second Division ; Preston North
End and Burnley were promoted to the First Division .
- It was formed in December 2014 from elements of the dissolved 51st Mechanized
Brigade and newly mobilized units .
- source_sentence: The band pursued `` signals `` in January 2012 in three weeks ,
and drums were recorded in a day and a half .
sentences:
- Kearsarge Lakes , Kearsarge Pass Trail , and Rae Lakes all have a maximum 2 nights
stay , and Bullfrog Lake along the Charlotte Lake is closed to camping .
- The band tracked `` Signals `` in three weeks in January 2012 . Drums were recorded
in a day and a half .
- From 1954 to 1961 , he was married to Stella Caralis and from 1978 until his death
with Nina Bohlen .
- source_sentence: A special case is of the Country B loyalist who controls agents
or provides managerial supporting or other functions against Country A .
sentences:
- A special case is the loyalist of Country B , who controls agents or provides
management support or other functions against Country A .
- Music Story is a music service website and international music data provider that
curates , aggregates and analyses metadata for digital music services .
- These six cars were painted in the same lacquering as the buffet cars , silver
with red lines and text .
datasets:
- redis/langcache-sentencepairs-v2
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_precision@1
- cosine_recall@1
- cosine_ndcg@10
- cosine_mrr@1
- cosine_map@100
model-index:
- name: Redis fine-tuned BiEncoder model for semantic caching on LangCache
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: test
type: test
metrics:
- type: cosine_accuracy@1
value: 0.5861241448475948
name: Cosine Accuracy@1
- type: cosine_precision@1
value: 0.5861241448475948
name: Cosine Precision@1
- type: cosine_recall@1
value: 0.5679885764966713
name: Cosine Recall@1
- type: cosine_ndcg@10
value: 0.7729838064849864
name: Cosine Ndcg@10
- type: cosine_mrr@1
value: 0.5861241448475948
name: Cosine Mrr@1
- type: cosine_map@100
value: 0.7216697804426214
name: Cosine Map@100
---
# Redis fine-tuned BiEncoder model for semantic caching on LangCache
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [Alibaba-NLP/gte-modernbert-base](https://huggingface.co/Alibaba-NLP/gte-modernbert-base) on the [LangCache Sentence Pairs (all)](https://huggingface.co/datasets/redis/langcache-sentencepairs-v2) dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for sentence pair similarity.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [Alibaba-NLP/gte-modernbert-base](https://huggingface.co/Alibaba-NLP/gte-modernbert-base) <!-- at revision e7f32e3c00f91d699e8c43b53106206bcc72bb22 -->
- **Maximum Sequence Length:** 100 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- [LangCache Sentence Pairs (all)](https://huggingface.co/datasets/redis/langcache-sentencepairs-v2)
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 100, 'do_lower_case': False, 'architecture': 'ModernBertModel'})
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
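The `Pooling` module above has `pooling_mode_cls_token: True`, meaning the sentence embedding is simply the final hidden state of the first (CLS) token rather than a mean over all tokens. A minimal numpy sketch of that pooling step, with illustrative toy shapes (not tied to this model's actual hidden states):

```python
import numpy as np

def cls_pooling(token_embeddings: np.ndarray) -> np.ndarray:
    """Select the hidden state of the first (CLS) token per sequence.

    token_embeddings: (batch, seq_len, hidden) -> (batch, hidden)
    """
    return token_embeddings[:, 0, :]

# Toy example: batch of 2 sequences, 4 tokens each, hidden size 3
hidden = np.arange(2 * 4 * 3, dtype=np.float32).reshape(2, 4, 3)
pooled = cls_pooling(hidden)
print(pooled.shape)  # (2, 3)
```

For the real model, `hidden` would be the `(batch, seq_len, 768)` output of the ModernBERT transformer, so `pooled` has the 768 dimensions reported above.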
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("redis/langcache-embed-v3")
# Run inference
sentences = [
'A special case is of the Country B loyalist who controls agents or provides managerial supporting or other functions against Country A .',
'A special case is the loyalist of Country B , who controls agents or provides management support or other functions against Country A .',
'Music Story is a music service website and international music data provider that curates , aggregates and analyses metadata for digital music services .',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.9844, 0.5195],
# [0.9844, 0.9922, 0.5078],
# [0.5195, 0.5078, 0.9922]], dtype=torch.bfloat16)
```
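Since this model is intended for semantic caching, a typical pattern is: embed the incoming query, compare it against the embeddings of previously cached queries, and return the stored response when cosine similarity clears a threshold. The sketch below shows that lookup logic only; the `toy_encode` function and the `0.99` threshold are illustrative stand-ins (in practice you would pass `model.encode` and tune the threshold on your own data):

```python
import numpy as np

def cosine_sim(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

class SemanticCache:
    """Toy semantic cache: returns a stored response when a new query
    is similar enough to a previously cached one."""

    def __init__(self, encode, threshold: float = 0.99):
        self.encode = encode        # e.g. model.encode from the snippet above
        self.threshold = threshold  # illustrative; tune on your own data
        self.entries = []           # list of (embedding, response)

    def put(self, query: str, response: str) -> None:
        emb = np.asarray(self.encode(query), dtype=np.float32)
        self.entries.append((emb, response))

    def get(self, query: str):
        q = np.asarray(self.encode(query), dtype=np.float32)
        best, best_sim = None, -1.0
        for emb, response in self.entries:
            sim = cosine_sim(q, emb)
            if sim > best_sim:
                best, best_sim = response, sim
        return best if best_sim >= self.threshold else None

# Deterministic stand-in encoder for demonstration only;
# a real cache would use this model's encode() instead.
def toy_encode(text: str) -> np.ndarray:
    vec = np.zeros(8, dtype=np.float32)
    for tok in text.lower().split():
        vec[sum(map(ord, tok)) % 8] += 1.0
    return vec

cache = SemanticCache(toy_encode)
cache.put("what is redis", "Redis is an in-memory data store.")
print(cache.get("what is redis"))           # repeated query -> cache hit
print(cache.get("unrelated query words"))   # dissimilar query -> None
```

A production cache would store embeddings in a vector index (for example, Redis vector search) rather than scanning a Python list, but the hit/miss decision is the same thresholded cosine comparison.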
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Information Retrieval
* Dataset: `test`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:-------------------|:----------|
| cosine_accuracy@1 | 0.5861 |
| cosine_precision@1 | 0.5861 |
| cosine_recall@1 | 0.568 |
| **cosine_ndcg@10** | **0.773** |
| cosine_mrr@1 | 0.5861 |
| cosine_map@100 | 0.7217 |
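For reading the table: accuracy@1 counts a query as correct when its top-ranked candidate is relevant, while recall@1 averages the fraction of each query's relevant items retrieved at rank 1, so the two diverge when some queries have several relevant candidates. A small sketch of those two definitions on toy rankings (the data here is illustrative, not drawn from the evaluation set):

```python
def accuracy_at_1(rankings, relevant):
    """Fraction of queries whose top-ranked doc is in their relevant set."""
    hits = sum(1 for q, ranked in rankings.items() if ranked[0] in relevant[q])
    return hits / len(rankings)

def recall_at_1(rankings, relevant):
    """Mean over queries of (#relevant retrieved at rank 1) / (#relevant)."""
    per_query = [
        (1 if ranked[0] in relevant[q] else 0) / len(relevant[q])
        for q, ranked in rankings.items()
    ]
    return sum(per_query) / len(per_query)

# Toy example: q2 has two relevant docs, so its recall@1 caps at 0.5
rankings = {"q1": ["d1", "d2"], "q2": ["d3", "d4"]}
relevant = {"q1": {"d1"}, "q2": {"d3", "d4"}}
print(accuracy_at_1(rankings, relevant))  # 1.0
print(recall_at_1(rankings, relevant))    # 0.75
```

This is why recall@1 (0.5680) can sit below accuracy@1 (0.5861) even though every top-1 hit counts toward both.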
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### LangCache Sentence Pairs (all)
* Dataset: [LangCache Sentence Pairs (all)](https://huggingface.co/datasets/redis/langcache-sentencepairs-v2)
* Size: 72,021 training samples
* Columns: <code>sentence_a</code>, <code>sentence_b</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
| | sentence_a | sentence_b | label |
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:------------------------------------------------|
| type | string | string | int |
| details | <ul><li>min: 8 tokens</li><li>mean: 27.46 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 27.36 tokens</li><li>max: 52 tokens</li></ul> | <ul><li>0: ~50.30%</li><li>1: ~49.70%</li></ul> |
* Samples:
| sentence_a | sentence_b | label |
|:--------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------|:---------------|
| <code>The newer Punts are still very much in existence today and race in the same fleets as the older boats .</code> | <code>The newer punts are still very much in existence today and run in the same fleets as the older boats .</code> | <code>1</code> |
| <code>Turner Valley , was at the Turner Valley Bar N Ranch Airport , southwest of the Turner Valley Bar N Ranch , Alberta , Canada .</code> | <code>Turner Valley Bar N Ranch Airport , , was located at Turner Valley Bar N Ranch , southwest of Turner Valley , Alberta , Canada .</code> | <code>0</code> |
| <code>After losing his second election , he resigned as opposition leader and was replaced by Geoff Pearsall .</code> | <code>Max Bingham resigned as opposition leader after losing his second election , and was replaced by Geoff Pearsall .</code> | <code>1</code> |
* Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "pairwise_cos_sim"
}
```
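CoSENT is a pairwise ranking objective: for every pair of examples where one label is higher than the other, it penalizes the batch when the lower-labeled pair's cosine score approaches or exceeds the higher-labeled pair's. A minimal numpy sketch of that formula with `scale = 20.0`; this illustrates the objective only, not the library's (vectorized) implementation:

```python
import numpy as np

def cosent_loss(cos_scores, labels, scale=20.0):
    """CoSENT loss over a batch of sentence-pair cosine scores.

    For every (i, j) with labels[i] > labels[j], push cos_scores[i]
    above cos_scores[j]:
        loss = log(1 + sum exp(scale * (cos_j - cos_i)))
    """
    cos_scores = np.asarray(cos_scores, dtype=np.float64)
    labels = np.asarray(labels)
    terms = [
        np.exp(scale * (cos_scores[j] - cos_scores[i]))
        for i in range(len(labels))
        for j in range(len(labels))
        if labels[i] > labels[j]
    ]
    return float(np.log1p(np.sum(terms)))

# Well-ordered batch (positives scored above negatives) -> near-zero loss
low = cosent_loss([0.9, 0.8, 0.2, 0.1], [1, 1, 0, 0])
# Inverted batch (negatives scored above positives) -> large loss
high = cosent_loss([0.1, 0.2, 0.8, 0.9], [1, 1, 0, 0])
print(low < high)  # True
```

Note that the loss depends only on the relative ordering of cosine scores across the batch, which is why it suits binary paraphrase labels like the 0/1 labels in this dataset.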
### Evaluation Dataset
#### LangCache Sentence Pairs (all)
* Dataset: [LangCache Sentence Pairs (all)](https://huggingface.co/datasets/redis/langcache-sentencepairs-v2)
* Size: 72,021 evaluation samples
* Columns: <code>sentence_a</code>, <code>sentence_b</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
| | sentence_a | sentence_b | label |
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:------------------------------------------------|
| type | string | string | int |
| details | <ul><li>min: 8 tokens</li><li>mean: 27.46 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 27.36 tokens</li><li>max: 52 tokens</li></ul> | <ul><li>0: ~50.30%</li><li>1: ~49.70%</li></ul> |
* Samples:
| sentence_a | sentence_b | label |
|:--------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------|:---------------|
| <code>The newer Punts are still very much in existence today and race in the same fleets as the older boats .</code> | <code>The newer punts are still very much in existence today and run in the same fleets as the older boats .</code> | <code>1</code> |
| <code>Turner Valley , was at the Turner Valley Bar N Ranch Airport , southwest of the Turner Valley Bar N Ranch , Alberta , Canada .</code> | <code>Turner Valley Bar N Ranch Airport , , was located at Turner Valley Bar N Ranch , southwest of Turner Valley , Alberta , Canada .</code> | <code>0</code> |
| <code>After losing his second election , he resigned as opposition leader and was replaced by Geoff Pearsall .</code> | <code>Max Bingham resigned as opposition leader after losing his second election , and was replaced by Geoff Pearsall .</code> | <code>1</code> |
* Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "pairwise_cos_sim"
}
```
### Training Logs
| Epoch | Step | test_cosine_ndcg@10 |
|:-----:|:----:|:-------------------:|
| -1 | -1 | 0.7730 |
### Framework Versions
- Python: 3.12.3
- Sentence Transformers: 5.1.0
- Transformers: 4.56.0
- PyTorch: 2.8.0+cu128
- Accelerate: 1.10.1
- Datasets: 4.0.0
- Tokenizers: 0.22.0
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### CoSENTLoss
```bibtex
@online{kexuefm-8847,
title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
author={Su Jianlin},
year={2022},
month={Jan},
url={https://kexue.fm/archives/8847},
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->