---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- dense
- generated_from_trainer
- dataset_size:111470
- loss:MultipleNegativesRankingLoss
base_model: thenlper/gte-small
widget:
- source_sentence: when was the first elephant brought to america
sentences:
- >-
Old Bet The first elephant brought to the United States was in 1796,
aboard the America which set sail from Calcutta for New York on December
3, 1795.[4] However, it is not certain that this was Old Bet.[2] The
first references to Old Bet start in 1804 in Boston as part of a
menagerie.[1] In 1808, while residing in Somers, New York, Hachaliah
Bailey purchased the menagerie elephant for $1,000 and named it "Old
Bet".[5][6]
- >-
Cronus Rhea secretly gave birth to Zeus in Crete, and handed Cronus a
stone wrapped in swaddling clothes, also known as the Omphalos Stone,
which he promptly swallowed, thinking that it was his son.
- >-
Renal artery One or two accessory renal arteries are frequently found,
especially on the left side since they usually arise from the aorta, and
may come off above (more common) or below the main artery. Instead of
entering the kidney at the hilus, they usually pierce the upper or lower
part of the organ.
- source_sentence: who won the india's next superstar grand finale
sentences:
- >-
India's Next Superstars India's Next Superstars is a talent-search
Indian reality TV show, which premiered on Star Plus and is streamed on
Hotstar.[1] Karan Johar and Rohit Shetty are the judges for the show.
[2] Aman Gandotra and Natasha Bharadwaj were declared winners of 2018
season. Shruti Sharma won a 'Special Mention' award. Runners up in the
male category were Aashish Mehrotra and Harshvardhan Deo and in the
female category were Naina Singh and Shruti Sharma. [3]
- >-
India national cricket team India was invited to The Imperial Cricket
Council in 1926, and made their debut as a Test playing nation in
England in 1932, led by CK Nayudu, who was considered as the best Indian
batsman at the time.[14] The one-off Test match between the two sides
was played at Lord's in London. The team was not strong in their batting
at this point and went on to lose by 158 runs.[15] In 1933, the first
Test series in India was played between India and England with matches
in Bombay, Calcutta (now Kolkata) and Madras (now Chennai). England won
the series 2–0.[16] The Indian team continued to improve throughout the
1930s and '40s but did not achieve an international victory during this
period. In the early 1940s, India didn't play any Test cricket due to
the Second World War. The team's first series as an independent country
was in late 1947 against Sir Donald Bradman's Invincibles (a name given
to the Australia national cricket team of that time). It was also the
first Test series India played which was not against England. Australia
won the five-match series 4–0, with Bradman tormenting the Indian
bowling in his final Australian summer.[17] India subsequently played
their first Test series at home not against England against the West
Indies in 1948. West Indies won the 5-Test series 1–0.[18]
- >-
Hindi Medium (film) Raj Batra (Irrfan Khan) is a rich businessman from
Delhi staying with his wife Mita (Saba Qamar). They studied in a Hindi
Medium school but wants their 5 year old daughter, Pia (Dishita Sehgal),
to be admitted to one of the top schools in Delhi. The top school,
'Delhi Grammar School', has a condition that they will admit students
who reside within 3km radius, so the family moves to Vasant Vihar.
- source_sentence: i am human and nothing of that which is human is alien to me meaning
sentences:
- >-
America's Got Talent Introduced in season nine, the "Golden Buzzer" is
located on the center of the judges' desk and may be used once per
season by each judge. In season 9, a judge could press the golden buzzer
to save an act from elimination, regardless of the number of X's earned
from the other judges. Starting in season 10 and onward, any act that
receives a golden buzzer advances directly to the live show; and in
season 11, the hosts also were given the power to use the golden buzzer.
The golden buzzer is also used in the Judge Cuts format.
- >-
You'll Never Walk Alone "You'll Never Walk Alone" is a show tune from
the 1945 Rodgers and Hammerstein musical Carousel. In the second act of
the musical, Nettie Fowler, the cousin of the female protagonist Julie
Jordan, sings "You'll Never Walk Alone" to comfort and encourage Julie
when her husband, Billy Bigelow, the male lead, commits suicide after a
failed robbery attempt. It is reprised in the final scene to encourage a
graduation class of which Louise (Billy and Julie's daughter) is a
member. The now invisible Billy, who has been granted the chance to
return to Earth for one day in order to redeem himself, watches the
ceremony and is able to silently motivate the unhappy Louise to join in
the song.
- >-
Terence One famous quotation by Terence reads: "Homo sum, humani nihil a
me alienum puto", or "I am human, and I think that nothing of that which
is human is alien to me." This appeared in his play Heauton Timorumenos.
- source_sentence: what do glial cells do in the brain
sentences:
- >-
Neuroglia Neuroglia, also called glial cells or simply glia, are
non-neuronal cells in the central nervous system (brain and spinal cord)
and the peripheral nervous system. They maintain homeostasis, form
myelin, and provide support and protection for neurons.[1] In the
central nervous system, glial cells include oligodendrocytes,
astrocytes, ependymal cells and microglia, and in the peripheral nervous
system glial cells include Schwann cells and satellite cells. They have
four main functions: (1) To surround neurons and hold them in place (2)
To supply nutrients and oxygen to neurons (3) To insulate one neuron
from another (4) To destroy pathogens and remove dead neurons. They also
play a role in neurotransmission and synaptic connections,[2] and in
physiological processes like breathing,[3][4] .
- >-
The Mother (How I Met Your Mother) Tracy McConnell, better known as "The
Mother", is the title character from the CBS television sitcom How I Met
Your Mother. The show, narrated by Future Ted, tells the story of how
Ted Mosby met The Mother. Tracy McConnell appears in 8 episodes from
"Lucky Penny" to "The Time Travelers" as an unseen character; she was
first seen fully in "Something New" and was promoted to a main character
in season 9. The Mother is played by Cristin Milioti.
- >-
Marsupial Marsupials are any members of the mammalian infraclass
Marsupialia. All extant marsupials are endemic to Australasia and the
Americas. A distinctive characteristic common to these species is that
most of the young are carried in a pouch. Well-known marsupials include
kangaroos, wallabies, koalas, possums, opossums, wombats, and Tasmanian
devils. Some lesser-known marsupials are the potoroo and the quokka.
- source_sentence: 'It was Easipower that said :'
sentences:
- >-
United States presidential election However, federal law does specify
that all electors must be selected on the same day, which is "the first
Tuesday after the first Monday in November," i.e., a Tuesday no earlier
than November 2 and no later than November 8.[17] Today, the states and
the District of Columbia each conduct their own popular elections on
Election Day to help determine their respective slate of electors. Thus,
the presidential election is really an amalgamation of separate and
simultaneous state elections instead of a single national election run
by the federal government.
- It is said that Easipower was ,
- 'It was Easipower that said :'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: SentenceTransformer based on thenlper/gte-small
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: NanoMSMARCO
type: NanoMSMARCO
metrics:
- type: cosine_accuracy@1
value: 0.34
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.56
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.64
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.76
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.34
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.18666666666666668
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.128
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.07600000000000001
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.34
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.56
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.64
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.76
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5416219337167224
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.47319047619047616
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.4857841065799604
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: NanoNQ
type: NanoNQ
metrics:
- type: cosine_accuracy@1
value: 0.54
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.7
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.76
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.54
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.24
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.15600000000000003
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.086
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.52
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.66
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.71
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.77
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.6525146735767775
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.6275
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.6140321846592789
name: Cosine Map@100
- task:
type: nano-beir
name: Nano BEIR
dataset:
name: NanoBEIR mean
type: NanoBEIR_mean
metrics:
- type: cosine_accuracy@1
value: 0.44000000000000006
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.63
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.78
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.44000000000000006
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.21333333333333332
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.14200000000000002
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.081
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.43000000000000005
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.6100000000000001
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.675
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.765
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5970683036467499
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.550345238095238
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.5499081456196196
name: Cosine Map@100
---

SentenceTransformer based on thenlper/gte-small
This is a sentence-transformers model finetuned from thenlper/gte-small. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: thenlper/gte-small
- Maximum Sequence Length: 128 tokens
- Output Dimensionality: 384 dimensions
- Similarity Function: Cosine Similarity
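These properties can be read back from the loaded model itself. A quick check, assuming the repository id used in the Usage section below:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("redis/model-b-structured")
print(model.max_seq_length)                      # 128
print(model.get_sentence_embedding_dimension())  # 384
print(model.similarity_fn_name)                  # cosine
```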
Model Sources
- Documentation: [Sentence Transformers Documentation](https://sbert.net)
- Repository: [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- Hugging Face: [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False, 'architecture': 'BertModel'})
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
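For reference, the three modules above correspond to: run the 128-token BERT encoder, mean-pool the token embeddings over the attention mask, then L2-normalize. The sketch below re-implements that pipeline with plain transformers; it assumes the repository exposes the usual BertModel weights and tokenizer, and is meant as an illustration rather than a replacement for the library.

```python
# Minimal sketch of the Transformer -> mean Pooling -> Normalize pipeline above.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

model_id = "redis/model-b-structured"  # assumes BertModel weights are exposed at the repo root
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

sentences = ["what do glial cells do in the brain"]
batch = tokenizer(sentences, padding=True, truncation=True, max_length=128, return_tensors="pt")

with torch.no_grad():
    token_embeddings = model(**batch).last_hidden_state  # (batch, seq_len, 384)

# Mean pooling over real (non-padding) tokens, matching pooling_mode_mean_tokens=True
mask = batch["attention_mask"].unsqueeze(-1).float()      # (batch, seq_len, 1)
pooled = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

# Normalize() module: unit-length vectors, so dot product equals cosine similarity
embeddings = F.normalize(pooled, p=2, dim=1)
print(embeddings.shape)  # torch.Size([1, 384])
```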
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("redis/model-b-structured")
# Run inference
sentences = [
'It was Easipower that said :',
'It was Easipower that said :',
'It is said that Easipower was ,',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 1.0000, 0.8522],
# [1.0000, 1.0000, 0.8522],
# [0.8522, 0.8522, 1.0000]])
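The same two calls are enough for a small semantic-search loop, one of the use cases listed above. A minimal sketch: the query is taken from the widget examples, while the candidate passages are shortened paraphrases used purely for illustration.

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("redis/model-b-structured")

# Illustrative query and passages, not part of any benchmark
query = "what do glial cells do in the brain"
passages = [
    "Neuroglia are non-neuronal cells that maintain homeostasis, form myelin, and support neurons.",
    "Marsupials are members of the mammalian infraclass Marsupialia.",
    "The first Test series in India was played between India and England in 1933.",
]

query_emb = model.encode([query])       # shape: (1, 384)
passage_embs = model.encode(passages)   # shape: (3, 384)

# Cosine similarity, the model's configured similarity function
scores = model.similarity(query_emb, passage_embs)[0]

# Rank passages from most to least similar
for score, passage in sorted(zip(scores.tolist(), passages), reverse=True):
    print(f"{score:.4f}  {passage}")
```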
Evaluation
Metrics
Information Retrieval
- Datasets: NanoMSMARCO and NanoNQ
- Evaluated with InformationRetrievalEvaluator
| Metric | NanoMSMARCO | NanoNQ |
|---|---|---|
| cosine_accuracy@1 | 0.34 | 0.54 |
| cosine_accuracy@3 | 0.56 | 0.7 |
| cosine_accuracy@5 | 0.64 | 0.76 |
| cosine_accuracy@10 | 0.76 | 0.8 |
| cosine_precision@1 | 0.34 | 0.54 |
| cosine_precision@3 | 0.1867 | 0.24 |
| cosine_precision@5 | 0.128 | 0.156 |
| cosine_precision@10 | 0.076 | 0.086 |
| cosine_recall@1 | 0.34 | 0.52 |
| cosine_recall@3 | 0.56 | 0.66 |
| cosine_recall@5 | 0.64 | 0.71 |
| cosine_recall@10 | 0.76 | 0.77 |
| cosine_ndcg@10 | 0.5416 | 0.6525 |
| cosine_mrr@10 | 0.4732 | 0.6275 |
| cosine_map@100 | 0.4858 | 0.614 |
Nano BEIR
- Dataset: NanoBEIR_mean
- Evaluated with NanoBEIREvaluator with these parameters:
  { "dataset_names": [ "msmarco", "nq" ], "dataset_id": "lightonai/NanoBEIR-en" }
| Metric | Value |
|---|---|
| cosine_accuracy@1 | 0.44 |
| cosine_accuracy@3 | 0.63 |
| cosine_accuracy@5 | 0.7 |
| cosine_accuracy@10 | 0.78 |
| cosine_precision@1 | 0.44 |
| cosine_precision@3 | 0.2133 |
| cosine_precision@5 | 0.142 |
| cosine_precision@10 | 0.081 |
| cosine_recall@1 | 0.43 |
| cosine_recall@3 | 0.61 |
| cosine_recall@5 | 0.675 |
| cosine_recall@10 | 0.765 |
| cosine_ndcg@10 | 0.5971 |
| cosine_mrr@10 | 0.5503 |
| cosine_map@100 | 0.5499 |
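These aggregate numbers come from the library's NanoBEIREvaluator, so they should be straightforward to re-run. A minimal sketch using the dataset names listed above; exact scores may shift slightly across library and dataset revisions.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import NanoBEIREvaluator

model = SentenceTransformer("redis/model-b-structured")

# Same subsets as reported above: NanoMSMARCO and NanoNQ
evaluator = NanoBEIREvaluator(dataset_names=["msmarco", "nq"])
results = evaluator(model)

# Keys follow the "<dataset>_cosine_<metric>" pattern used in the tables above
print(results["NanoBEIR_mean_cosine_ndcg@10"])
```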
Training Details
Training Dataset
Unnamed Dataset
- Size: 111,470 training samples
- Columns: anchor, positive, and negative
- Approximate statistics based on the first 1000 samples:

|  | anchor | positive | negative |
|---|---|---|---|
| type | string | string | string |
| details | min: 6 tokens, mean: 13.22 tokens, max: 44 tokens | min: 6 tokens, mean: 90.67 tokens, max: 128 tokens | min: 7 tokens, mean: 89.65 tokens, max: 128 tokens |
- Samples:

| anchor | positive | negative |
|---|---|---|
| which state is home to the arizona ice tea beverage company | Arizona Beverage Company Arizona Beverages USA (stylized as AriZona) is an American producer of many flavors of iced tea, juice cocktails and energy drinks based in Woodbury, New York.[2] Arizona's first product was made available in 1992. | Arya Vaishya Arya Vaishya (Arya Vysya) is an Indian caste. Orthodox Arya Vaishyas follow rituals prescribed in the Vasavi Puranam, a religious text written in the late Middle Ages. Their kuladevata is Vasavi. The community were formerly known as Komati Chettiars in Tamil Nadu but now prefer to be referred to as Arya Vaishya.[1] |
| when were afro-american and africana studies programs founded in colleges and universities | African-American studies Programs and departments of African-American studies were first created in the 1960s and 1970s as a result of inter-ethnic student and faculty activism at many universities, sparked by a five-month strike for black studies at San Francisco State. In February 1968, San Francisco State hired sociologist Nathan Hare to coordinate the first black studies program and write a proposal for the first Department of Black Studies; the department was created in September 1968 and gained official status at the end of the five-months strike in the spring of 1969. The creation of programs and departments in Black studies was a common demand of protests and sit-ins by minority students and their allies, who felt that their cultures and interests were underserved by the traditional academic structures. | Maze Runner: The Death Cure Maze Runner: The Death Cure was originally set to be released on February 17, 2017, in the United States by 20th Century Fox, but the studio rescheduled the film's release for January 26, 2018 in theatres and IMAX, allowing time for O'Brien to recover from injuries he sustained during filming. The film received mixed reviews from critics and grossed over $284 million worldwide. |
| who recorded the song total eclipse of the heart | Bonnie Tyler Bonnie Tyler (born Gaynor Hopkins; 8 June 1951) is a Welsh singer, known for her distinctive husky voice. Tyler came to prominence with the release of her 1977 album The World Starts Tonight and its singles "Lost in France" and "More Than a Lover". Her 1978 single "It's a Heartache" reached number four on the UK Singles Chart, and number three on the US Billboard Hot 100. | Manny Pacquiao vs. Juan Manuel Márquez IV Marquez defeated Pacquiao by knockout with one second remaining in the sixth round. It was named Fight of the Year and Knockout of the Year by Ring Magazine, with round five garnering Round of the Year honors.[2] |

- Loss: MultipleNegativesRankingLoss with these parameters:
  { "scale": 20.0, "similarity_fct": "cos_sim", "gather_across_devices": false }
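With this loss, each anchor is scored against its own positive, the other passages in the batch (as in-batch negatives), and its explicit negative column; training pushes the true positive to the highest cosine similarity. A minimal sketch of constructing the loss with the parameters above, not the original training script:

```python
from sentence_transformers import SentenceTransformer, losses, util

model = SentenceTransformer("thenlper/gte-small")

# scale=20.0 with cosine similarity, as listed above; in-batch passages act as
# additional negatives on top of the explicit "negative" column.
loss = losses.MultipleNegativesRankingLoss(
    model=model,
    scale=20.0,
    similarity_fct=util.cos_sim,
)
```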
Evaluation Dataset
Unnamed Dataset
- Size: 12,386 evaluation samples
- Columns: anchor, positive, and negative
- Approximate statistics based on the first 1000 samples:

|  | anchor | positive | negative |
|---|---|---|---|
| type | string | string | string |
| details | min: 6 tokens, mean: 13.03 tokens, max: 44 tokens | min: 6 tokens, mean: 89.36 tokens, max: 128 tokens | min: 4 tokens, mean: 88.87 tokens, max: 128 tokens |
- Samples:

| anchor | positive | negative |
|---|---|---|
| In early July , Steve Whitley , the criminal father of Harper Whitley and Garrett Whitley , and brother of Benny Cameron . | In early July , Steve Whitley , the criminal father of Harper Whitley and Garrett Whitley , and brother of Benny Cameron . | In early July , Garrett Whitley , who is the criminal father of Harper Whitley and Steve Whitley , and the brother of Benny Cameron , appeared . |
| when will the next season of house of cards be released | House of Cards (season 6) The sixth and final season of the American political drama web television series House of Cards was confirmed by Netflix on December 4, 2017, and is scheduled to be released on November 2, 2018. Unlike previous seasons that consisted of thirteen episodes each, the sixth season will consist of only eight. The season will not include former lead actor Kevin Spacey, who was fired from the show due to sexual misconduct allegations. | Wild 'n Out For the first four seasons, the show filmed from Los Angeles/Hollywood and aired on MTV. The first run episodes were suspended as Mr. Renaissance Entertainment became Ncredible Entertainment in 2012. Upon being revived in 2012, the show was produced in New York City and aired on MTV2 during Seasons 5–7, it also returned to that location for Season 9. In 2016, the show returned to airing new episodes on MTV and also for the first time since Season 4, production is in Los Angeles. |
| who played the father on father knows best | Father Knows Best The series began August 25, 1949, on NBC Radio. Set in the Midwest, it starred Robert Young as the General Insurance agent Jim Anderson. His wife Margaret was first portrayed by June Whitley and later by Jean Vander Pyl. The Anderson children were Betty (Rhoda Williams), Bud (Ted Donaldson), and Kathy (Norma Jean Nilsson). Others in the cast were Eleanor Audley, Herb Vigran and Sam Edwards. Sponsored through most of its run by General Foods, the series was heard Thursday evenings on NBC until March 25, 1954. | List of To Kill a Mockingbird characters Maycomb children believe he is a horrible person, due to the rumors spread about him and a trial he underwent as a teenager. It is implied during the story that Boo is a very lonely man who attempts to reach out to Jem and Scout for love and friendship, such as leaving them small gifts and figures in a tree knothole. Scout finally meets him at the very end of the book, when he saves the children's lives from Bob Ewell. Scout describes him as being sickly white, with a thin mouth, thin and feathery hair, and grey eyes, almost as if he were blind. During the same night, when Boo whispers to Scout to walk him back to the Radley house, Scout takes a moment to picture what it would be like to be Boo Radley. While standing on his porch, she realizes his "exile" inside his house is really not that lonely. |

- Loss: MultipleNegativesRankingLoss with these parameters:
  { "scale": 20.0, "similarity_fct": "cos_sim", "gather_across_devices": false }
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 128
- per_device_eval_batch_size: 128
- learning_rate: 8e-05
- weight_decay: 0.005
- max_steps: 1125
- warmup_ratio: 0.1
- fp16: True
- dataloader_drop_last: True
- dataloader_num_workers: 1
- dataloader_prefetch_factor: 1
- load_best_model_at_end: True
- optim: adamw_torch
- ddp_find_unused_parameters: False
- push_to_hub: True
- hub_model_id: redis/model-b-structured
- eval_on_start: True
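A minimal sketch of how these non-default values map onto SentenceTransformerTrainingArguments; output_dir is an assumed placeholder and push_to_hub is disabled here, otherwise the values mirror the list above.

```python
from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="outputs/model-b-structured",  # assumed placeholder, not from the card
    eval_strategy="steps",
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=8e-5,
    weight_decay=0.005,
    max_steps=1125,
    warmup_ratio=0.1,
    fp16=True,                     # mixed precision; requires a CUDA-capable setup
    dataloader_drop_last=True,
    dataloader_num_workers=1,
    dataloader_prefetch_factor=1,
    load_best_model_at_end=True,
    optim="adamw_torch",
    ddp_find_unused_parameters=False,
    push_to_hub=False,             # the original run pushed to redis/model-b-structured
    eval_on_start=True,
)
print(args.learning_rate, args.max_steps)
```

These arguments would then be passed, together with the anchor/positive/negative datasets and the MultipleNegativesRankingLoss shown earlier, to a SentenceTransformerTrainer.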
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 128
- per_device_eval_batch_size: 128
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 8e-05
- weight_decay: 0.005
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 3.0
- max_steps: 1125
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: True
- dataloader_num_workers: 1
- dataloader_prefetch_factor: 1
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: True
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- parallelism_config: None
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- project: huggingface
- trackio_space_id: trackio
- ddp_find_unused_parameters: False
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: True
- resume_from_checkpoint: None
- hub_model_id: redis/model-b-structured
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- hub_revision: None
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: no
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: True
- use_liger_kernel: False
- liger_kernel_config: None
- eval_use_gather_object: False
- average_tokens_across_devices: True
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
- router_mapping: {}
- learning_rate_mapping: {}
Training Logs
| Epoch | Step | Training Loss | Validation Loss | NanoMSMARCO_cosine_ndcg@10 | NanoNQ_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
|---|---|---|---|---|---|---|
| 0 | 0 | - | 1.9462 | 0.6259 | 0.6583 | 0.6421 |
| 0.2874 | 250 | 0.3773 | 0.0669 | 0.5322 | 0.6570 | 0.5946 |
| 0.5747 | 500 | 0.0787 | 0.0564 | 0.5584 | 0.6307 | 0.5946 |
| 0.8621 | 750 | 0.0678 | 0.0495 | 0.5390 | 0.6447 | 0.5918 |
| 1.1494 | 1000 | 0.0517 | 0.0479 | 0.5416 | 0.6525 | 0.5971 |
Framework Versions
- Python: 3.10.18
- Sentence Transformers: 5.2.0
- Transformers: 4.57.3
- PyTorch: 2.9.1+cu128
- Accelerate: 1.12.0
- Datasets: 2.21.0
- Tokenizers: 0.22.1
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
MultipleNegativesRankingLoss
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}