alvarobartt (HF Staff) committed
Commit 052c35f · verified · 1 Parent(s): 204c183

Update README.md

Files changed (1):
  1. README.md +26 -28
README.md CHANGED
@@ -73,33 +73,6 @@ print("Sentence embeddings:")
 print(sentence_embeddings)
 ```
 
-
-
-## Full Model Architecture
-```
-SentenceTransformer(
-  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: MPNetModel
-  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
-)
-```
-
-## Citing & Authors
-
-This model was trained by [sentence-transformers](https://www.sbert.net/).
-
-If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):
-```bibtex
-@inproceedings{reimers-2019-sentence-bert,
-  title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
-  author = "Reimers, Nils and Gurevych, Iryna",
-  booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
-  month = "11",
-  year = "2019",
-  publisher = "Association for Computational Linguistics",
-  url = "http://arxiv.org/abs/1908.10084",
-}
-```
-
 ## Usage (Text Embeddings Inference (TEI))
 
 [Text Embeddings Inference (TEI)](https://github.com/huggingface/text-embeddings-inference) is a blazing fast inference solution for text embeddings models.
@@ -130,4 +103,29 @@ curl -s http://localhost:8080/v1/embeddings \
 }'
 ```
 
-Or check the [Text Embeddings Inference API specification](https://huggingface.github.io/text-embeddings-inference/) instead.
+Or check the [Text Embeddings Inference API specification](https://huggingface.github.io/text-embeddings-inference/) instead.
+
+## Full Model Architecture
+```
+SentenceTransformer(
+  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: MPNetModel
+  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
+)
+```
+
+## Citing & Authors
+
+This model was trained by [sentence-transformers](https://www.sbert.net/).
+
+If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):
+```bibtex
+@inproceedings{reimers-2019-sentence-bert,
+  title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
+  author = "Reimers, Nils and Gurevych, Iryna",
+  booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
+  month = "11",
+  year = "2019",
+  publisher = "Association for Computational Linguistics",
+  url = "http://arxiv.org/abs/1908.10084",
+}
+```