Add new SentenceTransformer model
Files changed:
- 1_Pooling/config.json +2 -2
- README.md +38 -24
- config_sentence_transformers.json +2 -2
- model.safetensors +1 -1
- tokenizer_config.json +1 -1
1_Pooling/config.json
CHANGED
@@ -1,7 +1,7 @@
 {
   "word_embedding_dimension": 768,
-  "pooling_mode_cls_token": true,
-  "pooling_mode_mean_tokens": false,
+  "pooling_mode_cls_token": false,
+  "pooling_mode_mean_tokens": true,
   "pooling_mode_max_tokens": false,
   "pooling_mode_mean_sqrt_len_tokens": false,
   "pooling_mode_weightedmean_tokens": false,
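For context on the flipped flags: with `pooling_mode_mean_tokens: true` the sentence embedding is the attention-masked mean of the token embeddings rather than the `[CLS]` vector. A minimal sketch of that pooling step with the Hugging Face Transformers API, shown against the base model from the README (the fine-tuned checkpoint id is not part of this diff):

```python
# Sketch: mean pooling as configured in 1_Pooling/config.json
# (pooling_mode_mean_tokens: true). Uses the base model named in the README;
# the fine-tuned checkpoint id is not shown in this commit.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("answerdotai/ModernBERT-base")
model = AutoModel.from_pretrained("answerdotai/ModernBERT-base")

def mean_pool(last_hidden_state: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # Average token embeddings, ignoring padding positions.
    mask = attention_mask.unsqueeze(-1).to(last_hidden_state.dtype)
    return (last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

batch = tokenizer(["A sample sentence."], padding=True, truncation=True,
                  max_length=100, return_tensors="pt")
with torch.no_grad():
    output = model(**batch)
embedding = mean_pool(output.last_hidden_state, batch["attention_mask"])
print(embedding.shape)  # torch.Size([1, 768])
```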
README.md
CHANGED
@@ -12,9 +12,9 @@ tags:
 - retrieval
 - reranking
 - generated_from_trainer
-- dataset_size:
-- loss:
-base_model:
+- dataset_size:9233417
+- loss:ArcFaceInBatchLoss
+base_model: answerdotai/ModernBERT-base
 widget:
 - source_sentence: Hayley Vaughan portrayed Ripa on the ABC daytime soap opera , ``
     All My Children `` , between 1990 and 2002 .
@@ -79,34 +79,34 @@ model-index:
       type: test
     metrics:
     - type: cosine_accuracy@1
-      value: 0.
+      value: 0.4126938643934238
       name: Cosine Accuracy@1
     - type: cosine_precision@1
-      value: 0.
+      value: 0.4126938643934238
       name: Cosine Precision@1
     - type: cosine_recall@1
-      value: 0.
+      value: 0.39900881466078997
       name: Cosine Recall@1
     - type: cosine_ndcg@10
-      value: 0.
+      value: 0.5950456720106155
      name: Cosine Ndcg@10
     - type: cosine_mrr@1
-      value: 0.
+      value: 0.4126938643934238
       name: Cosine Mrr@1
     - type: cosine_map@100
-      value: 0.
+      value: 0.543168962594735
       name: Cosine Map@100
 ---
 
 # Redis fine-tuned BiEncoder model for semantic caching on LangCache
 
-This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [
+This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base) on the [LangCache Sentence Pairs (all)](https://huggingface.co/datasets/redis/langcache-sentencepairs-v2) dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for sentence pair similarity.
 
 ## Model Details
 
 ### Model Description
 - **Model Type:** Sentence Transformer
-- **Base model:** [
+- **Base model:** [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base) <!-- at revision 8949b909ec900327062f0ebf497f51aef5e6f0c8 -->
 - **Maximum Sequence Length:** 100 tokens
 - **Output Dimensionality:** 768 dimensions
 - **Similarity Function:** Cosine Similarity
@@ -126,7 +126,7 @@ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [A
 ```
 SentenceTransformer(
   (0): Transformer({'max_seq_length': 100, 'do_lower_case': False, 'architecture': 'ModernBertModel'})
-  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token':
+  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
 )
 ```
 
@@ -159,9 +159,9 @@ print(embeddings.shape)
 # Get the similarity scores for the embeddings
 similarities = model.similarity(embeddings, embeddings)
 print(similarities)
-# tensor([[
-# [0.
-# [0.9922,
+# tensor([[0.9961, 0.9922, 0.9922],
+#         [0.9922, 1.0000, 1.0000],
+#         [0.9922, 1.0000, 1.0000]], dtype=torch.bfloat16)
 ```
 
 <!--
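A note on the usage documented in the hunk above: the card's example can be run end to end roughly as below, assuming sentence-transformers >= 5.1 is installed. The published repository id is not visible in this commit, so `redis/<model-repo>` is a placeholder.

```python
# Sketch of the usage shown in the README hunk above. "redis/<model-repo>" is a
# placeholder: the actual repository id is not part of this diff.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("redis/<model-repo>")

sentences = [
    "Hayley Vaughan portrayed Ripa on the ABC daytime soap opera All My Children.",
    "Ripa was played by Hayley Vaughan on All My Children.",
    "How can I get financial freedom as soon as possible?",
]
embeddings = model.encode(sentences)
print(embeddings.shape)   # (3, 768), per the card's output dimensionality

# Cosine similarity, matching "similarity_fn_name": "cosine"
similarities = model.similarity(embeddings, embeddings)
print(similarities)       # 3x3 tensor; near-duplicates score close to 1.0
```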
@@ -199,12 +199,12 @@ You can finetune this model on your own dataset.
 
 | Metric             | Value     |
 |:-------------------|:----------|
-| cosine_accuracy@1  | 0.
-| cosine_precision@1 | 0.
-| cosine_recall@1    | 0.
-| **cosine_ndcg@10** | **0.
-| cosine_mrr@1       | 0.
-| cosine_map@100     | 0.
+| cosine_accuracy@1  | 0.4127    |
+| cosine_precision@1 | 0.4127    |
+| cosine_recall@1    | 0.399     |
+| **cosine_ndcg@10** | **0.595** |
+| cosine_mrr@1       | 0.4127    |
+| cosine_map@100     | 0.5432    |
 
 <!--
 ## Bias, Risks and Limitations
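The Accuracy@1 / Recall@1 / NDCG@10 / MRR@1 / MAP@100 values in this table are the metrics reported by sentence_transformers' `InformationRetrievalEvaluator`. A minimal sketch of running that kind of evaluation (the queries and corpus below are toy placeholders, not the card's `test` split):

```python
# Sketch: computing metrics like those in the table above with the standard
# InformationRetrievalEvaluator. The tiny corpus below is a toy placeholder.
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("redis/<model-repo>")  # placeholder repo id

queries = {"q1": "The newer punts still race in the same fleets as the older boats."}
corpus = {
    "d1": "The newer Punts are still very much in existence today and race in the same fleets as the older boats .",
    "d2": "how can I get financial freedom as soon as possible?",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="test")
results = evaluator(model)
print(results)  # includes keys such as "test_cosine_ndcg@10"
```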
@@ -238,7 +238,14 @@ You can finetune this model on your own dataset.
 | <code>The newer Punts are still very much in existence today and race in the same fleets as the older boats .</code> | <code>The newer punts are still very much in existence today and run in the same fleets as the older boats .</code> | <code>how can I get financial freedom as soon as possible?</code> |
 | <code>The newer punts are still very much in existence today and run in the same fleets as the older boats .</code> | <code>The newer Punts are still very much in existence today and race in the same fleets as the older boats .</code> | <code>The older Punts are still very much in existence today and race in the same fleets as the newer boats .</code> |
 | <code>Turner Valley , was at the Turner Valley Bar N Ranch Airport , southwest of the Turner Valley Bar N Ranch , Alberta , Canada .</code> | <code>Turner Valley , , was located at Turner Valley Bar N Ranch Airport , southwest of Turner Valley Bar N Ranch , Alberta , Canada .</code> | <code>Turner Valley Bar N Ranch Airport , , was located at Turner Valley Bar N Ranch , southwest of Turner Valley , Alberta , Canada .</code> |
-* Loss: <code>losses.
+* Loss: <code>losses.ArcFaceInBatchLoss</code> with these parameters:
+  ```json
+  {
+      "scale": 20.0,
+      "similarity_fct": "cos_sim",
+      "gather_across_devices": false
+  }
+  ```
 
 ### Evaluation Dataset
 
@@ -258,12 +265,19 @@ You can finetune this model on your own dataset.
 | <code>The newer Punts are still very much in existence today and race in the same fleets as the older boats .</code> | <code>The newer punts are still very much in existence today and run in the same fleets as the older boats .</code> | <code>how can I get financial freedom as soon as possible?</code> |
 | <code>The newer punts are still very much in existence today and run in the same fleets as the older boats .</code> | <code>The newer Punts are still very much in existence today and race in the same fleets as the older boats .</code> | <code>The older Punts are still very much in existence today and race in the same fleets as the newer boats .</code> |
 | <code>Turner Valley , was at the Turner Valley Bar N Ranch Airport , southwest of the Turner Valley Bar N Ranch , Alberta , Canada .</code> | <code>Turner Valley , , was located at Turner Valley Bar N Ranch Airport , southwest of Turner Valley Bar N Ranch , Alberta , Canada .</code> | <code>Turner Valley Bar N Ranch Airport , , was located at Turner Valley Bar N Ranch , southwest of Turner Valley , Alberta , Canada .</code> |
-* Loss: <code>losses.
+* Loss: <code>losses.ArcFaceInBatchLoss</code> with these parameters:
+  ```json
+  {
+      "scale": 20.0,
+      "similarity_fct": "cos_sim",
+      "gather_across_devices": false
+  }
+  ```
 
 ### Training Logs
 | Epoch | Step | test_cosine_ndcg@10 |
 |:-----:|:----:|:-------------------:|
-| -1 | -1 | 0.
+| -1 | -1 | 0.5950 |
 
 
 ### Framework Versions
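On the loss named above: `ArcFaceInBatchLoss` does not ship with the standard `sentence_transformers.losses` module, so it presumably comes from the project's own training code. The sketch below sets up an analogous in-batch-negatives training run with `MultipleNegativesRankingLoss` as an explicit stand-in, reusing the `scale: 20.0` and cosine-similarity settings from the diff; the dataset contents are illustrative.

```python
# Sketch of an in-batch-negatives training setup analogous to the card's
# losses.ArcFaceInBatchLoss. MultipleNegativesRankingLoss is used as a stand-in
# because ArcFaceInBatchLoss is not part of the standard sentence_transformers
# library; scale=20.0 and cos_sim mirror the parameters shown in the diff.
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer, losses, util

model = SentenceTransformer("answerdotai/ModernBERT-base")
model.max_seq_length = 100  # matches the card's maximum sequence length

train_dataset = Dataset.from_dict({
    "anchor":   ["The newer punts still race in the same fleets as the older boats ."],
    "positive": ["The newer Punts are still very much in existence today and race in the same fleets as the older boats ."],
    "negative": ["how can I get financial freedom as soon as possible?"],
})

loss = losses.MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=util.cos_sim)

trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()
```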
config_sentence_transformers.json
CHANGED
@@ -1,4 +1,5 @@
 {
+  "model_type": "SentenceTransformer",
   "__version__": {
     "sentence_transformers": "5.1.0",
     "transformers": "4.56.0",
@@ -9,6 +10,5 @@
     "document": ""
   },
   "default_prompt_name": null,
-  "similarity_fn_name": "cosine",
-  "model_type": "SentenceTransformer"
+  "similarity_fn_name": "cosine"
 }
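The retained `similarity_fn_name: cosine` is what `SentenceTransformer.similarity()` uses at inference time; a quick check (repo id again a placeholder):

```python
# Quick check that the configured similarity function is cosine.
# "redis/<model-repo>" remains a placeholder; the repo id is not in this diff.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("redis/<model-repo>")
print(model.similarity_fn_name)  # "cosine", per config_sentence_transformers.json
```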
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:49130ca9bb82456b8d7d77dc9819d74b11864b9d854ef1eff2ed157e37bc1afe
 size 298041696
tokenizer_config.json
CHANGED
@@ -938,7 +938,7 @@
     "input_ids",
     "attention_mask"
   ],
-  "model_max_length":
+  "model_max_length": 8192,
   "pad_token": "[PAD]",
   "sep_token": "[SEP]",
   "tokenizer_class": "PreTrainedTokenizerFast",
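Note the interaction with the model card: the tokenizer now advertises `model_max_length: 8192` (ModernBERT's native context window), while the SentenceTransformer module still truncates inputs at 100 tokens via `max_seq_length`. A small sketch of where each limit surfaces (repo id a placeholder):

```python
# Sketch: the tokenizer's model_max_length (8192) vs. the SentenceTransformer
# truncation limit (max_seq_length = 100). "redis/<model-repo>" is a placeholder.
from sentence_transformers import SentenceTransformer
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("redis/<model-repo>")
print(tokenizer.model_max_length)   # 8192, from tokenizer_config.json

model = SentenceTransformer("redis/<model-repo>")
print(model.max_seq_length)         # 100; longer inputs are truncated before pooling
```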