---
language:
- code
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- dense
- generated_from_trainer
- dataset_size:197351
- loss:MultipleNegativesRankingLoss
base_model: Qwen/Qwen3-Embedding-0.6B
widget:
- source_sentence: ABCB7
  sentences:
  - This gene encodes a tetrameric mitochondrial flavoprotein, which is a member of
    the acyl-CoA dehydrogenase family. This enzyme catalyzes the initial step of the
    mitochondrial fatty acid beta-oxidation pathway. Mutations in this gene have been
    associated with short-chain acyl-CoA dehydrogenase (SCAD) deficiency. Alternative
    splicing results in two variants which encode different isoforms. [provided by
    RefSeq, Oct 2014]
  - The membrane-associated protein encoded by this gene is a member of the superfamily
    of ATP-binding cassette (ABC) transporters. ABC proteins transport various molecules
    across extra- and intra-cellular membranes. ABC genes are divided into seven distinct
    subfamilies (ABC1, MDR/TAP, MRP, ALD, OABP, GCN20, White). This protein is a member
    of the MDR/TAP subfamily. Members of the MDR/TAP subfamily are involved in multidrug
    resistance as well as antigen presentation. This gene encodes a half-transporter
    involved in the transport of heme from the mitochondria to the cytosol. With iron/sulfur
    cluster precursors as its substrates, this protein may play a role in metal homeostasis.
    Mutations in this gene have been associated with mitochondrial iron accumulation
    and isodicentric (X)(q13) and sideroblastic anemia. Alternatively spliced transcript
    variants encoding multiple isoforms have been observed for this gene. [provided
    by RefSeq, Nov 2012]
  - The membrane-associated protein encoded by this gene is a member of the superfamily
    of ATP-binding cassette (ABC) transporters. ABC proteins transport various molecules
    across extra- and intracellular membranes. ABC genes are divided into seven distinct
    subfamilies (ABC1, MDR/TAP, MRP, ALD, OABP, GCN20, and White). This encoded protein
    is a member of the ABC1 subfamily. Members of the ABC1 subfamily comprise the
    only major ABC subfamily found exclusively in multicellular eukaryotes. This gene
    is clustered among 4 other ABC1 family members on 17q24, but neither the substrate
    nor the function of this gene is known. Alternative splicing of this gene results
    in several transcript variants; however, not all variants have been fully described.
    [provided by RefSeq, Jul 2008]
- source_sentence: ABCC8
  sentences:
  - The protein encoded by this gene is a member of the superfamily of ATP-binding
    cassette (ABC) transporters. ABC proteins transport various molecules across extra-
    and intra-cellular membranes. ABC genes are divided into seven distinct subfamilies
    (ABC1, MDR/TAP, MRP, ALD, OABP, GCN20, White). This protein is a member of the
    MRP subfamily which is involved in multi-drug resistance. This protein functions
    as a modulator of ATP-sensitive potassium channels and insulin release. Mutations
    in the ABCC8 gene and deficiencies in the encoded protein have been observed in
    patients with hyperinsulinemic hypoglycemia of infancy, an autosomal recessive
    disorder of unregulated and high insulin secretion. Mutations have also been associated
    with non-insulin-dependent diabetes mellitus type II, an autosomal dominant disease
    of defective insulin secretion. Alternatively spliced transcript variants have
    been found for this gene. [provided by RefSeq, Jul 2020]
  - Predicted to enable GTPase activator activity and zinc ion binding activity. Predicted
    to be involved in protein transport. Located in membrane. [provided by Alliance
    of Genome Resources, Jul 2025]
  - The protein encoded by this gene is a member of the superfamily of ATP-binding
    cassette (ABC) transporters. ABC proteins transport various molecules across extra-
    and intra-cellular membranes. ABC genes are divided into seven distinct subfamilies
    (ABC1, MDR/TAP, MRP, ALD, OABP, GCN20, White). This ABC full transporter is a
    member of the MRP subfamily which is involved in multi-drug resistance. The product
    of this gene participates in physiological processes involving bile acids, conjugated
    steroids, and cyclic nucleotides. In addition, a SNP in this gene is responsible
    for determination of human earwax type. This gene and family member ABCC12 are
    determined to be derived by duplication and are both localized to chromosome 16q12.1.
    Multiple alternatively spliced transcript variants have been described for this
    gene. [provided by RefSeq, Jul 2008]
- source_sentence: MALAT1 TMSB4X ACTB TPT1 EEF1A1 S100A10 LGALS1 VIM SH3BGRL3 S100A4
    FTL PTMA SRGN TMSB10 CYBA GAPDH CD74 TAGLN2 FTH1 S100A6 UBA52 YBX1 MYL6 OAZ1 CST3
    NACA FAU ARPC2 GSTP1 PFN1 HSP90AA1 COTL1 PPIA ARPC3 UQCRB MYL12A CD63 EIF1 NEAT1
    RACK1 MACROH2A1 ATP6V0E1 ATP5F1E SRP14 ENO1 SLC25A3 CTSH PRDX1 VAMP8 COX4I1 CAP1
    BTF3 DBI HNRNPA3 GNAS DDX5 H3-3B TPM3 LAPTM5 ZEB2 GNG5 FLNA CALM1 CD44
  sentences:
  - MALAT1 PTMA TMSB10 LGALS1 ACTB PRDX1 S100A4 H3-3B TMSB4X VIM TPT1 LMO4 HNRNPA2B1
    SH3BGRL3 TAGLN2 HNRNPU DDIT4 PFN1 IGFBP7 HMGB1 FTH1 CFL1 CD74 SOX4 KLF2 BST2 S100A11
    RACK1 PSMA4 DDX5 NCL RSRP1 IRF1 SERF2 EEF1A1 CALM1 UBA52 CYBA HSP90AA1 MYL12A
    AHNAK ITM2B SRP14 EMP3 CALM2 TSC22D3 YWHAZ SELENOW PPIA S100A6 TSPO IRAG2 TPM3
    UBC ARPC2 HNRNPA3 UBB EIF1 JUN IFITM2 PRR13 N4BP2L2 LAPTM4A CDC42
  - This measurement was conducted with 10x 3' v3. This sample is derived from a 3-month-old
    male patient with KMT2A-rearranged (KMT2A-r) infant acute lymphoblastic leukemia
    (ALL) with a CD8_Cytotoxic T cell type, specifically T/NK cells, and a presumed
    MLL-AF4 fusion.
  - This measurement was conducted with 10x 3' v3. Blast cells derived from a 1-month-old
    human with a presumed MLL-AF10 fusion, projected as cDC-like cells.
- source_sentence: MALAT1 CXCL14 EEF1A1 VIM IGFBP7 COL1A2 FTH1 TPT1 S100A6 TMSB4X
    A2M APOE DCN PTGDS TMSB10 LGALS1 ACTB FBLN1 FTL RARRES2 CD81 CALD1 CD63 COL6A2
    MYL6 SPARCL1 NEAT1 IGFBP5 PTMA CST3 FAU SERF2 SPARC IFITM3 EIF1 S100A4 NACA JUND
    COL6A1 GSN C1S CFH HSP90AA1 PDLIM1 H3-3B EDIL3 UBA52 VCAN LTBP4 TIMP3 CTSC ITM2B
    IGFBP4 UBC UBB RACK1 TIMP1 ACTA2 ZFP36L2 PLPP3 TUBA1A FILIP1L FOS S100A10
  sentences:
  - MALAT1 TMSB10 A2M FABP5 PTMA VIM ACTB CAV1 SPARCL1 CD74 EEF1A1 KLF2 IFITM3 CLDN5
    TMSB4X TPT1 ENPP2 TM4SF1 FOS EIF1 S100A6 CALM1 CD81 HES1 SRGN ID1 GNG11 IGFBP4
    STOM GSN TAGLN2 IGFBP7 CD320 FTH1 MCAM HSP90AA1 GNAS MYL6 TIMP3 EPAS1 TNFSF10
    PODXL ITM2B SRP14 UBC TGFBR2 KCTD12 GIMAP7 UBA52 RHOA CD59 FTL PCSK5 MYH9 MYL12A
    FLT1 CXCL12 LIFR TUBA1B DSTN ARPC1B JUND H3-3B TMBIM6
  - This measurement was conducted with 10x 3' v3. Fibroblasts derived from the terminal
    ileum of a female individual in her fourth decade, exhibiting Crohn's disease
    (CD) related changes.
  - This measurement was conducted with 10x 3' v3. Glial cells derived from the ileal
    epithelium of a female in her fourth decade.
- source_sentence: MALAT1 DCN MGP APOD GSN LAMA2 CST3 SPARCL1 IGFBP7 TIMP1 VIM EEF1A1
    ITM2B FBLN1 C3 IFITM3 FBN1 FTH1 TPT1 ABCA8 C1S TXNIP FTL TIMP3 FN1 CD63 RBMS3
    ABCA6 ZBTB20 CEBPD NEAT1 CFH VCAN PTN PTGDS CD81 SERF2 COL6A1 COL6A2 ABI3BP ABCA10
    EBF1 COL1A2 PRKG1 S100A6 MGST1 TMSB10 TIMP2 CELF2 LAPTM4A RORA ACTB LTBP4 MYL6
    LGALS1 DDX5 SPTBN1 EFEMP1 BICC1 LRP1 H3-3B SCN7A IGFBP4 FAU
  sentences:
  - This measurement was conducted with 10x 3' v3. CD4+T naive lymphocyte cells derived
    from the right cardiac atrium of a European male in his sixties.
  - This measurement was conducted with 10x multiome. Fibroblast cell sample taken
    from the right ventricle of a European female donor in her fifth decade, who is
    a DCD donor. The sample is in nucleus form.
  - MALAT1 NEAT1 LINC00486 SLC8A1 VMP1 SAT1 PIK3R5 DIRC3 FMN1 PMP22 RBM47 AGFG1 DIP2B
    RBMS1 GNAQ TBC1D14 RAB1A ARHGAP24 DAPK1 SLC1A3 RHOQ SH3BGRL DOCK10 SLCO2B1 RUNX1
    ENOX2 LDLRAD4 RNF150 PIAS1 DDX5 WSB1 TSHZ3 SBF2 DOCK2 LRP4 DENND4C FCHSD2 EXOC6B
    AFF3 ARHGAP26 DIAPH2 MGAT5 TMEM163 NSMCE2 RBPJ ZEB2 TANC2 BPTF SH3RF3 MFSD14CP
    TCF4 RORA-AS1 NOP58 MEF2A EPN2 PICALM ARHGAP15 MEF2C ANKRD12 FCGRT DOCK8 SETX
    TBC1D9 KLHL2
datasets:
- jo-mengr/cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation
- jo-mengr/descriptions_genes
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy
model-index:
- name: SentenceTransformer based on Qwen/Qwen3-Embedding-0.6B
  results:
  - task:
      type: triplet
      name: Triplet
    dataset:
      name: cellxgene pseudo bulk 100k multiplets natural language annotation cell
        sentence 2
      type: cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation_cell_sentence_2
    metrics:
    - type: cosine_accuracy
      value: 0.8204416632652283
      name: Cosine Accuracy
  - task:
      type: triplet
      name: Triplet
    dataset:
      name: gene description
      type: gene_description
    metrics:
    - type: cosine_accuracy
      value: 0.9559999704360962
      name: Cosine Accuracy
---

# SentenceTransformer based on Qwen/Qwen3-Embedding-0.6B

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [Qwen/Qwen3-Embedding-0.6B](https://huggingface.co/Qwen/Qwen3-Embedding-0.6B) on the [cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation](https://huggingface.co/datasets/jo-mengr/cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation) and [gene_description](https://huggingface.co/datasets/jo-mengr/descriptions_genes) datasets. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [Qwen/Qwen3-Embedding-0.6B](https://huggingface.co/Qwen/Qwen3-Embedding-0.6B) <!-- at revision c54f2e6e80b2d7b7de06f51cec4959f6b3e03418 -->
- **Maximum Sequence Length:** 32768 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Datasets:**
    - [cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation](https://huggingface.co/datasets/jo-mengr/cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation)
    - [gene_description](https://huggingface.co/datasets/jo-mengr/descriptions_genes)
- **Language:** code
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): MMContextEncoder(
    (text_encoder): Qwen3Model(
      (embed_tokens): Embedding(151669, 1024)
      (layers): ModuleList(
        (0-27): 28 x Qwen3DecoderLayer(
          (self_attn): Qwen3Attention(
            (q_proj): Linear(in_features=1024, out_features=2048, bias=False)
            (k_proj): Linear(in_features=1024, out_features=1024, bias=False)
            (v_proj): Linear(in_features=1024, out_features=1024, bias=False)
            (o_proj): Linear(in_features=2048, out_features=1024, bias=False)
            (q_norm): Qwen3RMSNorm((128,), eps=1e-06)
            (k_norm): Qwen3RMSNorm((128,), eps=1e-06)
          )
          (mlp): Qwen3MLP(
            (gate_proj): Linear(in_features=1024, out_features=3072, bias=False)
            (up_proj): Linear(in_features=1024, out_features=3072, bias=False)
            (down_proj): Linear(in_features=3072, out_features=1024, bias=False)
            (act_fn): SiLU()
          )
          (input_layernorm): Qwen3RMSNorm((1024,), eps=1e-06)
          (post_attention_layernorm): Qwen3RMSNorm((1024,), eps=1e-06)
        )
      )
      (norm): Qwen3RMSNorm((1024,), eps=1e-06)
      (rotary_emb): Qwen3RotaryEmbedding()
    )
    (pooling): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  )
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("jo-mengr/mmcontext-qwen-scvi_fm")
# Run inference
sentences = [
    'MALAT1 DCN MGP APOD GSN LAMA2 CST3 SPARCL1 IGFBP7 TIMP1 VIM EEF1A1 ITM2B FBLN1 C3 IFITM3 FBN1 FTH1 TPT1 ABCA8 C1S TXNIP FTL TIMP3 FN1 CD63 RBMS3 ABCA6 ZBTB20 CEBPD NEAT1 CFH VCAN PTN PTGDS CD81 SERF2 COL6A1 COL6A2 ABI3BP ABCA10 EBF1 COL1A2 PRKG1 S100A6 MGST1 TMSB10 TIMP2 CELF2 LAPTM4A RORA ACTB LTBP4 MYL6 LGALS1 DDX5 SPTBN1 EFEMP1 BICC1 LRP1 H3-3B SCN7A IGFBP4 FAU',
    'This measurement was conducted with 10x multiome. Fibroblast cell sample taken from the right ventricle of a European female donor in her fifth decade, who is a DCD donor. The sample is in nucleus form.',
    "This measurement was conducted with 10x 3' v3. CD4+T naive lymphocyte cells derived from the right cardiac atrium of a European male in his sixties.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.6280, 0.0951],
#         [0.6280, 1.0000, 0.2002],
#         [0.0951, 0.2002, 1.0000]])
```
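
The same workflow applies to the gene-symbol-to-description pairing this model was also trained on. A minimal sketch, reusing the `A1BG` example from the dataset samples further down this card (the third text is an unrelated gene name used as a distractor):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("jo-mengr/mmcontext-qwen-scvi_fm")

# Gene symbol, its RefSeq summary, and an unrelated gene name
# (all taken verbatim from the gene_description samples shown below).
texts = [
    "A1BG",
    "The protein encoded by this gene is a plasma glycoprotein of unknown function. "
    "The protein shows sequence similarity to the variable regions of some immunoglobulin "
    "supergene family member proteins. [provided by RefSeq, Jul 2008]",
    "G antigen 12D",
]
embeddings = model.encode(texts)

# The symbol is expected to score higher against its own description than against the unrelated name.
scores = model.similarity(embeddings[:1], embeddings[1:])
print(scores)  # shape [1, 2]
```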

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Triplet

* Datasets: `cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation_cell_sentence_2` and `gene_description`
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)

| Metric              | cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation_cell_sentence_2 | gene_description |
|:--------------------|:----------------------------------------------------------------------------------|:-----------------|
| **cosine_accuracy** | **0.8204**                                                                        | **0.956**        |
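
These figures come from the [`TripletEvaluator`](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator). A minimal sketch of re-running the gene-description evaluation, assuming the evaluation split is named `test` and exposes the `anchor`, `positive`, and `negative_1` columns listed under Evaluation Datasets below:

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator

model = SentenceTransformer("jo-mengr/mmcontext-qwen-scvi_fm")

# Split name is an assumption; adjust to the actual evaluation split of the dataset.
eval_ds = load_dataset("jo-mengr/descriptions_genes", split="test")

evaluator = TripletEvaluator(
    anchors=eval_ds["anchor"],
    positives=eval_ds["positive"],
    negatives=eval_ds["negative_1"],
    name="gene_description",
)
print(evaluator(model))  # {'gene_description_cosine_accuracy': ...}
```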

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Datasets

#### cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation

* Dataset: [cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation](https://huggingface.co/datasets/jo-mengr/cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation) at [d518eb2](https://huggingface.co/datasets/jo-mengr/cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation/tree/d518eb24af305653b43acd9e26f9502632059e7c)
* Size: 81,143 training samples
* Columns: <code>anchor</code>, <code>positive</code>, <code>negative_1</code>, and <code>negative_2</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                                            | positive                                                                                         | negative_1                                                                                         | negative_2                                                                                        |
  |:--------|:--------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------|
  | type    | string                                                                                            | string                                                                                           | string                                                                                             | string                                                                                            |
  | details | <ul><li>min: 356 characters</li><li>mean: 385.24 characters</li><li>max: 450 characters</li></ul> | <ul><li>min: 92 characters</li><li>mean: 216.13 characters</li><li>max: 900 characters</li></ul> | <ul><li>min: 103 characters</li><li>mean: 212.72 characters</li><li>max: 1186 characters</li></ul> | <ul><li>min: 353 characters</li><li>mean: 384.82 characters</li><li>max: 433 characters</li></ul> |
* Samples:
  | anchor                                                                                                                                                                                                                                                                                                                                                                                                                           | positive                                                                                                                                                                                                                                                     | negative_1                                                                                                                                                                                                                                                   | negative_2                                                                                                                                                                                                                                                                                                                                                                                                            |
  |:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>TMSB4X TMSB10 ACTB MALAT1 GNLY NKG7 IFITM2 LGALS1 GZMA EEF1A1 PFN1 HMGB2 FTH1 PTMA HSP90AA1 GZMB ARHGDIB HNRNPA2B1 PLAAT4 FAU CMC1 VIM MYL12A CBX3 ATP5F1E HCST IFI44L KLRF1 H3-3A COX6C ARL6IP1 CFL1 ISG15 HMGB1 S100A4 ATP5MF RORA MYL6 CORO1A OAZ1 KLRB1 ID2 HMGN3 CCNI RBM39 CAP1 SERF2 ELOC FCER1G S100A9 IFI16 YWHAZ EIF1 CALR HMGN2 SKAP2 SLC25A5 ZZZ3 YBX1 NUCB2 CDC42 GSTP1 FTL ATP5F1D</code>                    | <code>This measurement was conducted with 10x 3' v2. A proliferating lymphocyte cell sample, obtained from a 34-year-old female Asian individual, derived from peripheral blood mononuclear cells.</code>                                                    | <code>This measurement was conducted with 10x 3' v2. Sample is a CD8-positive, alpha-beta T cell derived from a 31-year-old Asian female's peripheral blood mononuclear cells.</code>                                                                        | <code>MALAT1 TMSB4X EEF1A1 TMSB10 FAU TPT1 PTMA EIF1 UBA52 ACTB FTH1 RACK1 FTL H3-3B JUNB ATP5F1E BTG1 CD52 NACA MYL12A PFN1 COX7C COX4I1 SERF2 UQCRB TOMM7 IL32 YBX1 PABPC1 MYL6 EIF3E OAZ1 NOP53 ARHGDIB LDHB HCST SARAF ITM2B ATP6V1G1 SRP14 UBC H3-3A COX6C HINT1 UBB COMMD6 S100A4 S100A6 CALM1 VIM CYBA ENO1 HSP90AA1 FXYD5 HSP90AB1 CIRBP SRSF5 NFKBIA CORO1A LEPROTL1 TLE5 CHCHD2 DDX5 CD69</code>            |
  | <code>EEF1A1 MALAT1 FTH1 JUNB TPT1 FOS TMSB10 BTG1 TMSB4X ZFP36L2 NACA PABPC1 ACTB FAU VIM H3-3B EIF1 ZFP36 SARAF PTMA IL7R JUN RACK1 EEF2 UBA52 GAPDH FTL FXYD5 DUSP1 S100A4 CD69 CXCR4 UBC TSC22D3 CFL1 KLF6 ARHGDIB KLF2 BTG2 CITED2 IER2 TUBB4B CD3E EEF1G SLC2A3 NFKBIA PFN1 SRGN SNX9 COX4I1 DNAJB1 SERF2 CD8A PCBP2 IL32 BIRC3 SMAP2 FUS GADD45B MYL12A OAZ1 ATP5F1E TUBA4A PNRC1</code>                                  | <code>This measurement was conducted with 10x 5' v1. Sample is a cell from the omentum tissue, specifically an effector memory CD4-positive, alpha-beta T cell, from a female in her sixth decade.</code>                                                    | <code>This measurement was conducted with 10x 5' v1. Sample is a CD4-positive helper T cell, specifically Trm_Th1/Th17 subset, derived from the duodenum tissue of a male individual in his sixth decade.</code>                                             | <code>MALAT1 TPT1 EEF1A1 VIM JUND TMSB4X PTMA FTH1 CRIP1 ANXA1 EIF1 UBC H3-3B ACTB SRGN FTL FAU KLF6 IL7R CALM1 UBA52 BTG1 SARAF IL32 TMSB10 PABPC1 HSP90AB1 DDX5 GAPDH TAGLN2 NACA CD44 HSPA5 RORA HSP90AA1 KLRB1 TNFAIP3 ATP5F1E PNRC1 ZFP36L2 H3-3A UBB FOS RACK1 FYN FAM107B GNAS EZR MYL6 CREM NFKBIA PFN1 ARHGDIB SRSF7 CD2 CCNI HNRNPA2B1 COX7C ITM2B SERF2 SH3BGRL3 TSC22D3 LMNA YWHAZ</code>                 |
  | <code>MALAT1 GRIK1 SYT1 PCDH9 RORA NRG1 CADPS ZFPM2 LRRC4C LINGO2 RALYL PTPRD SPHKAP CNTNAP5 SLC8A1 CCSER1 HDAC9 CELF2 R3HDM1 CNTN4 RBMS3 PCDH7 GALNT13 UNC5D ROBO1 SYNPR SNAP25 GPM6A ANK3 FRMPD4 CHRM2 RYR2 KHDRBS2 CADM1 CACNA1D RGS6 PDE4D DOCK4 UNC13C CDH18 FAT3 MEG3 NR2F2-AS1 HMCN1 GULP1 CAMK2D ZEB1 SYN2 DYNC1I1 OXR1 DPP10 OSBPL6 FRAS1 PPP3CA ZNF385D ZMAT4 PCBP3 HS6ST3 ERC2 PLEKHA5 CDK14 MAP2 NCOA1 ATP8A2</code> | <code>This measurement was conducted with 10x 3' v3. Neuron cell type from a 29-year-old male, specifically from the thalamic complex, specifically the thalamus (THM) - posterior nuclear complex of thalamus (PoN) - medial geniculate nuclei (MG).</code> | <code>This measurement was conducted with 10x 3' v3. Astrocyte cell type from the thalamic complex, specifically from the thalamus (THM) - posterior nuclear complex of thalamus (PoN) - medial geniculate nuclei (MG) region, of a 42-year-old male.</code> | <code>MALAT1 PCDH9 PLP1 MBP ST18 QKI PDE4B RNF220 PTPRD SEPTIN7 TTLL7 NCKAP5 GPM6B PIP4K2A MOBP SLC44A1 PTGDS PLCL1 MAP7 ELMO1 SIK3 FTH1 ZBTB20 MAN2A1 TMEM165 DOCK10 TCF12 EDIL3 ZEB2 DPYD MAP4K4 PHLPP1 TF GAB1 TRIM2 FRMD4B DNAJC6 MARCHF1 ANK3 DST AGAP1 TMEM144 NEAT1 PLEKHH1 DLG1 CRYAB ERBIN RTN4 SPP1 ATP8A1 DOCK4 SLAIN1 APP DOCK5 APBB2 SAMD12 SHTN1 ZNF536 ZFYVE16 ARAP2 LIMCH1 HIPK2 BCAS1 FAM107B</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### gene_description

* Dataset: [gene_description](https://huggingface.co/datasets/jo-mengr/descriptions_genes) at [dd22363](https://huggingface.co/datasets/jo-mengr/descriptions_genes/tree/dd22363de0a7c501f41ba324fb3b8d6ecdd14dc7)
* Size: 116,208 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative_1</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                                       | positive                                                                                          | negative_1                                                                                        |
  |:--------|:---------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------|
  | type    | string                                                                                       | string                                                                                            | string                                                                                            |
  | details | <ul><li>min: 3 characters</li><li>mean: 5.88 characters</li><li>max: 12 characters</li></ul> | <ul><li>min: 16 characters</li><li>mean: 367.09 characters</li><li>max: 1375 characters</li></ul> | <ul><li>min: 13 characters</li><li>mean: 167.33 characters</li><li>max: 1375 characters</li></ul> |
* Samples:
  | anchor            | positive                                                                                                                                                                                                                                          | negative_1                        |
  |:------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------|
  | <code>A1BG</code> | <code>The protein encoded by this gene is a plasma glycoprotein of unknown function. The protein shows sequence similarity to the variable regions of some immunoglobulin supergene family member proteins. [provided by RefSeq, Jul 2008]</code> | <code>A1BG antisense RNA 1</code> |
  | <code>A1BG</code> | <code>The protein encoded by this gene is a plasma glycoprotein of unknown function. The protein shows sequence similarity to the variable regions of some immunoglobulin supergene family member proteins. [provided by RefSeq, Jul 2008]</code> | <code>G antigen 12D</code>        |
  | <code>A1BG</code> | <code>The protein encoded by this gene is a plasma glycoprotein of unknown function. The protein shows sequence similarity to the variable regions of some immunoglobulin supergene family member proteins. [provided by RefSeq, Jul 2008]</code> | <code>G antigen 12B</code>        |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
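
Both training datasets use the same loss configuration. A minimal sketch of how this loss is typically constructed in Sentence Transformers, with the parameter values from the JSON above (a generic illustration, not the exact training script behind this card):

```python
from sentence_transformers import SentenceTransformer, util
from sentence_transformers.losses import MultipleNegativesRankingLoss

# Base model used as the starting point for fine-tuning.
model = SentenceTransformer("Qwen/Qwen3-Embedding-0.6B")

# scale=20.0 and cosine similarity, matching the parameters listed above.
loss = MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=util.cos_sim)
```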

### Evaluation Datasets

#### cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation

* Dataset: [cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation](https://huggingface.co/datasets/jo-mengr/cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation) at [d518eb2](https://huggingface.co/datasets/jo-mengr/cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation/tree/d518eb24af305653b43acd9e26f9502632059e7c)
* Size: 9,011 evaluation samples
* Columns: <code>anchor</code>, <code>positive</code>, <code>negative_1</code>, and <code>negative_2</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                                           | positive                                                                                         | negative_1                                                                                       | negative_2                                                                                        |
  |:--------|:-------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------|
  | type    | string                                                                                           | string                                                                                           | string                                                                                           | string                                                                                            |
  | details | <ul><li>min: 347 characters</li><li>mean: 386.7 characters</li><li>max: 437 characters</li></ul> | <ul><li>min: 99 characters</li><li>mean: 209.99 characters</li><li>max: 941 characters</li></ul> | <ul><li>min: 101 characters</li><li>mean: 208.8 characters</li><li>max: 728 characters</li></ul> | <ul><li>min: 356 characters</li><li>mean: 386.56 characters</li><li>max: 434 characters</li></ul> |
* Samples:
  | anchor                                                                                                                                                                                                                                                                                                                                                                                                                   | positive                                                                                                                                                                                                                                                                    | negative_1                                                                                                                                                                                                                                                | negative_2                                                                                                                                                                                                                                                                                                                                                                                                                        |
  |:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>MALAT1 EEF1A1 FTH1 TMSB4X ACTB FTL RTN4 ATP6V0B TPT1 FAU S100A6 NDUFA4 ATP5F1E COX7C ITM2B IGFBP7 EIF1 C12orf75 CD9 COX7B SERF2 ATP1B1 COX8A TXNIP NDUFB2 MYL6 PPDPF COX6B1 UQCR11 APOE COX4I1 CALM2 UQCRB S100A11 UQCRQ COX6C ATP5MG BSG ATP6AP2 UQCR10 PTMA NACA UBL5 UBA52 TMSB10 ADGRF5 HSP90AA1 GSTP1 ATP5F1D CHCHD2 GAPDH COX7A2 SKP1 HSPE1 PRDX1 CYSTM1 LGALS3 CD63 ATP5MJ CKB NDUFS5 ATP5ME UBB MAL</code> | <code>This measurement was conducted with 10x 3' v3. Cell sample from the cortex of kidney, taken from a 43-year-old male of European ethnicity with a reported history of kidney cancer. The cell type is identified as a kidney collecting duct intercalated cell.</code> | <code>This measurement was conducted with 10x 3' v3. Cell sample from the cortex of kidney, taken from a 72-year-old male of European ethnicity, identified as a kidney collecting duct intercalated cell, and preserved through cryopreservation.</code> | <code>MALAT1 TMSB4X TMSB10 ACTB TXNIP EEF1A1 TPT1 PFN1 BTG1 FAU PTMA S100A4 ATP5F1E EIF1 FTL CFL1 CYBA MYL12A SRGN SERF2 SH3BGRL3 CALM1 TYROBP MYL6 ZFP36 KLRD1 UBB NACA S100A6 UBA52 HSP90AA1 H3-3B LCP1 FTH1 DDIT4 FOS PPIA CD247 RACK1 TMA7 CORO1A OAZ1 TLE5 ARPC3 GAPDH KLF2 UBC ZFP36L2 TSC22D3 ITGB2 ARPC2 ATP5MG HOPX IFITM2 HMGB1 OST4 EEF1G PRDM1 CDC42 GSTP1 NDUFB2 CIRBP LGALS1 CHCHD2</code>                          |
  | <code>MALAT1 KCND2 NRXN1 CDH18 NRXN3 ZNF385D CADM2 RALYL NKAIN2 CADPS2 RIMS1 FSTL5 GRID2 TRPM3 CHN2 DPP6 JMJD1C RORA PDE1A UNC13C TIAM1 NRG1 SNAP25 ZFPM2 CALN1 LSAMP CNTN1 ABLIM1 SYNE1 ANK3 CA10 NFIA ZBTB20 NTM CADM1 OPCML RELN DNM3 NEBL ERC1 SCN2A PPP3CA CACNA1A GALNT13 LRRC4C GPM6A RABGAP1L RIT2 CAMK4 GRIA4 PTPRD RBFOX3 MCTP1 LHFPL6 PCLO MEG3 PDE10A NOVA1 RTN1 ZNF385B CNTN4 GABRB2 SPOCK1 OXR1</code>     | <code>This measurement was conducted with 10x 3' v3. Neuron cell type from a 29-year-old male cerebellum, specifically from the Cerebellar Vermis - CBV region, with European self-reported ethnicity, analyzed at the nucleus level.</code>                                | <code>This measurement was conducted with 10x 3' v3. Sample is an oligodendrocyte precursor cell taken from the cerebellum tissue of a 42-year-old human male, specifically from the Cerebellum (CB) - Cerebellar Vermis - CBV dissection.</code>         | <code>MALAT1 NRXN3 SNTG1 UNC5C GRIA4 NRG1 RORA INPP4B CLSTN2 NKAIN2 FRMD4A DPP6 GRID2 NRXN1 LSAMP JMJD1C HS6ST3 NXPH1 MIR99AHG LRRC4C NTM CCNH NFIA ZFPM2 AFF3 OPCML PTPRT CADM2 ZBTB20 OLFM3 SLC22A3 CNTNAP5 CACNA2D3 CNTN4 KCND2 ADARB2 XKR4 GPM6A IL1RAPL1 ALK ANKRD36C UBE2E2 SYN3 GARNL3 PTPRG DAB1 TCF4 LINC00461 PRANCR GRIN2B TNRC6B MAPK10 NOVA1 NFIB ANK3 KCNMA1 KCNQ5 SPON1 TRIM9 VWA8 GDAP1 GABRG2 AHI1 ATP1B1</code> |
  | <code>EEF1A1 ACTB GAPDH HMGN2 PTMA SERF2 TMSB4X CD74 PABPC1 FTH1 TMSB10 FAU PFN1 HMGN1 OAZ1 HMGB1 TPT1 PPIA NACA BTF3 MALAT1 MYL6 ATP5MG CFL1 RACK1 ODC1 ATP5F1E TMA7 SLC25A5 ELOB ARPC3 NPM1 COX7C ANP32B C4orf3 EIF1 PCBP2 KLF6 LAPTM5 COX8A RHOA HSPA8 H3-3B PTP4A2 UBA52 OST4 CIRBP LGALS1 EIF3L STMN1 PPDPF COX4I1 RAN EIF3F PPP1CC COMMD6 NDUFA4 YBX1 PEBP1 COTL1 COX7A2 HSPE1 CCNI TRIR</code>                    | <code>This measurement was conducted with 10x 5' v1. Cell sample from the tonsil of a 9-year-old female with recurrent tonsillitis, characterized as a centroblast B cell with IGLC2, IGLV7-43, IGLJ3 immunoglobulin genes expressed.</code>                                | <code>This measurement was conducted with 10x 5' v1. Germinal center B cell derived from the tonsil tissue of a 3-year-old male with recurrent tonsillitis.</code>                                                                                        | <code>CD74 MALAT1 EEF1A1 SSR4 TPT1 UBC EEF2 SAT1 RACK1 SEC11C ATP5MG FAU TSC22D3 PPIB XBP1 FTL GAPDH HLA-DRB5 HERPUD1 RGS2 HSPA8 TMSB4X HSP90B1 EIF1 PTMA SERP1 SERF2 NACA SEC61B GSTP1 UBA52 HSPA5 BTF3 LAPTM5 HSPE1 H3-3B ATP5F1A SEC61G CD38 EDF1 FTH1 IL16 NPM1 OST4 CIRBP EIF3E OAZ1 CYTIP PCBP2 MYDGF COX6B1 ZFP36 CSDE1 PABPC1 REXO2 KDELR1 PFN1 PTP4A1 TMBIM6 H1-10 PSAP UBE2J1 VIM MYL6</code>                           |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

#### gene_description

* Dataset: [gene_description](https://huggingface.co/datasets/jo-mengr/descriptions_genes) at [dd22363](https://huggingface.co/datasets/jo-mengr/descriptions_genes/tree/dd22363de0a7c501f41ba324fb3b8d6ecdd14dc7)
* Size: 1,000 evaluation samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative_1</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                                       | positive                                                                                          | negative_1                                                                                        |
  |:--------|:---------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------|
  | type    | string                                                                                       | string                                                                                            | string                                                                                            |
  | details | <ul><li>min: 3 characters</li><li>mean: 5.88 characters</li><li>max: 12 characters</li></ul> | <ul><li>min: 16 characters</li><li>mean: 367.09 characters</li><li>max: 1375 characters</li></ul> | <ul><li>min: 13 characters</li><li>mean: 167.33 characters</li><li>max: 1375 characters</li></ul> |
* Samples:
  | anchor            | positive                                                                                                                                                                                                                                          | negative_1                        |
  |:------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------|
  | <code>A1BG</code> | <code>The protein encoded by this gene is a plasma glycoprotein of unknown function. The protein shows sequence similarity to the variable regions of some immunoglobulin supergene family member proteins. [provided by RefSeq, Jul 2008]</code> | <code>A1BG antisense RNA 1</code> |
  | <code>A1BG</code> | <code>The protein encoded by this gene is a plasma glycoprotein of unknown function. The protein shows sequence similarity to the variable regions of some immunoglobulin supergene family member proteins. [provided by RefSeq, Jul 2008]</code> | <code>G antigen 12D</code>        |
  | <code>A1BG</code> | <code>The protein encoded by this gene is a plasma glycoprotein of unknown function. The protein shows sequence similarity to the variable regions of some immunoglobulin supergene family member proteins. [provided by RefSeq, Jul 2008]</code> | <code>G antigen 12B</code>        |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 128
- `learning_rate`: 2e-05
- `num_train_epochs`: 4
- `warmup_ratio`: 0.1
- `bf16`: True
- `gradient_checkpointing`: True
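
A minimal sketch of how these non-default values map onto `SentenceTransformerTrainingArguments` (the output directory is a placeholder):

```python
from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output/mmcontext-qwen",  # placeholder
    eval_strategy="steps",
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=2e-5,
    num_train_epochs=4,
    warmup_ratio=0.1,
    bf16=True,
    gradient_checkpointing=True,
)
```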

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 128
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 4
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `hub_revision`: None
- `gradient_checkpointing`: True
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `liger_kernel_config`: None
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
- `router_mapping`: {}
- `learning_rate_mapping`: {}

</details>

### Training Logs
<details><summary>Click to expand</summary>

| Epoch  | Step | Training Loss | cellxgene pseudo bulk 100k multiplets natural language annotation loss | gene description loss | cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation_cell_sentence_2_cosine_accuracy | gene_description_cosine_accuracy |
|:------:|:----:|:-------------:|:----------------------------------------------------------------------:|:---------------------:|:-------------------------------------------------------------------------------------------------:|:--------------------------------:|
| 0.0324 | 50   | 9.3314        | 12.6479                                                                | 6.6616                | 0.5052                                                                                            | 0.2570                           |
| 0.0649 | 100  | 7.9528        | 10.8869                                                                | 6.0596                | 0.5078                                                                                            | 0.2660                           |
| 0.0973 | 150  | 7.0084        | 7.0423                                                                 | 5.4704                | 0.5075                                                                                            | 0.3020                           |
| 0.1297 | 200  | 5.6925        | 6.0263                                                                 | 5.2950                | 0.5024                                                                                            | 0.5200                           |
| 0.1621 | 250  | 5.381         | 5.8141                                                                 | 4.7323                | 0.5367                                                                                            | 0.6520                           |
| 0.1946 | 300  | 4.3736        | 5.4432                                                                 | 4.3565                | 0.5518                                                                                            | 0.7060                           |
| 0.2270 | 350  | 3.8184        | 5.1966                                                                 | 4.1283                | 0.5836                                                                                            | 0.7690                           |
| 0.2594 | 400  | 3.6181        | 5.0588                                                                 | 3.9594                | 0.6064                                                                                            | 0.7650                           |
| 0.2918 | 450  | 3.1076        | 4.9406                                                                 | 3.7824                | 0.6218                                                                                            | 0.8030                           |
| 0.3243 | 500  | 3.127         | 4.8376                                                                 | 3.6785                | 0.6369                                                                                            | 0.8230                           |
| 0.3567 | 550  | 3.1702        | 4.8230                                                                 | 3.6029                | 0.6532                                                                                            | 0.8410                           |
| 0.3891 | 600  | 2.992         | 5.1160                                                                 | 3.6091                | 0.6240                                                                                            | 0.8310                           |
| 0.4215 | 650  | 2.606         | 4.5652                                                                 | 3.5555                | 0.6679                                                                                            | 0.8490                           |
| 0.4540 | 700  | 2.9473        | 4.5831                                                                 | 3.5215                | 0.6846                                                                                            | 0.8600                           |
| 0.4864 | 750  | 2.369         | 4.4464                                                                 | 3.4824                | 0.6930                                                                                            | 0.8800                           |
| 0.5188 | 800  | 2.5923        | 4.4542                                                                 | 3.4372                | 0.6983                                                                                            | 0.8820                           |
| 0.5512 | 850  | 2.9167        | 4.4572                                                                 | 3.4915                | 0.6984                                                                                            | 0.8730                           |
| 0.5837 | 900  | 2.5716        | 4.2259                                                                 | 3.4390                | 0.7126                                                                                            | 0.8630                           |
| 0.6161 | 950  | 2.375         | 4.2200                                                                 | 3.4250                | 0.7143                                                                                            | 0.8740                           |
| 0.6485 | 1000 | 2.4105        | 4.2001                                                                 | 3.3524                | 0.7187                                                                                            | 0.8890                           |
| 0.6809 | 1050 | 2.4014        | 4.0744                                                                 | 3.2688                | 0.7243                                                                                            | 0.8950                           |
| 0.7134 | 1100 | 2.7474        | 4.1131                                                                 | 3.3046                | 0.7270                                                                                            | 0.8850                           |
| 0.7458 | 1150 | 2.1615        | 4.2206                                                                 | 3.2392                | 0.7202                                                                                            | 0.8860                           |
| 0.7782 | 1200 | 2.4409        | 4.4682                                                                 | 3.1664                | 0.7106                                                                                            | 0.8870                           |
| 0.8106 | 1250 | 2.5041        | 4.0881                                                                 | 3.1417                | 0.7277                                                                                            | 0.9030                           |
| 0.8431 | 1300 | 2.4221        | 3.8777                                                                 | 3.2302                | 0.7409                                                                                            | 0.8940                           |
| 0.8755 | 1350 | 2.189         | 3.8482                                                                 | 3.1316                | 0.7441                                                                                            | 0.9050                           |
| 0.9079 | 1400 | 2.3055        | 3.8571                                                                 | 3.1550                | 0.7451                                                                                            | 0.9030                           |
| 0.9403 | 1450 | 2.0945        | 3.8233                                                                 | 3.1269                | 0.7530                                                                                            | 0.9020                           |
| 0.9728 | 1500 | 2.0217        | 3.7722                                                                 | 3.0707                | 0.7527                                                                                            | 0.9070                           |
| 1.0052 | 1550 | 2.2443        | 3.8285                                                                 | 3.0799                | 0.7459                                                                                            | 0.9190                           |
| 1.0376 | 1600 | 1.9441        | 3.8292                                                                 | 3.0957                | 0.7470                                                                                            | 0.9090                           |
| 1.0700 | 1650 | 1.8771        | 3.6837                                                                 | 3.0190                | 0.7555                                                                                            | 0.9290                           |
| 1.1025 | 1700 | 1.9489        | 3.6946                                                                 | 3.0298                | 0.7570                                                                                            | 0.9210                           |
| 1.1349 | 1750 | 2.0622        | 3.7221                                                                 | 3.0001                | 0.7574                                                                                            | 0.9140                           |
| 1.1673 | 1800 | 1.7275        | 3.7806                                                                 | 2.9919                | 0.7530                                                                                            | 0.9090                           |
| 1.1997 | 1850 | 2.0068        | 3.6648                                                                 | 2.9490                | 0.7584                                                                                            | 0.9230                           |
| 1.2322 | 1900 | 1.9126        | 3.7416                                                                 | 2.9131                | 0.7603                                                                                            | 0.9160                           |
| 1.2646 | 1950 | 1.9513        | 3.5770                                                                 | 2.9362                | 0.7625                                                                                            | 0.9230                           |
| 1.2970 | 2000 | 1.8021        | 3.6660                                                                 | 2.8868                | 0.7670                                                                                            | 0.9360                           |
| 1.3294 | 2050 | 1.9685        | 3.7318                                                                 | 2.8669                | 0.7587                                                                                            | 0.9390                           |
| 1.3619 | 2100 | 1.7835        | 3.5471                                                                 | 2.8356                | 0.7712                                                                                            | 0.9350                           |
| 1.3943 | 2150 | 1.826         | 3.5666                                                                 | 2.7893                | 0.7707                                                                                            | 0.9340                           |
| 1.4267 | 2200 | 1.9708        | 3.5630                                                                 | 2.7570                | 0.7741                                                                                            | 0.9290                           |
| 1.4591 | 2250 | 2.0131        | 3.5586                                                                 | 2.8239                | 0.7742                                                                                            | 0.9360                           |
| 1.4916 | 2300 | 1.856         | 3.5155                                                                 | 2.7658                | 0.7779                                                                                            | 0.9410                           |
| 1.5240 | 2350 | 1.9354        | 3.7959                                                                 | 2.7921                | 0.7622                                                                                            | 0.9380                           |
| 1.5564 | 2400 | 1.8961        | 3.5166                                                                 | 2.7456                | 0.7790                                                                                            | 0.9430                           |
| 1.5888 | 2450 | 1.6347        | 3.4784                                                                 | 2.7911                | 0.7800                                                                                            | 0.9470                           |
| 1.6213 | 2500 | 1.9176        | 3.4388                                                                 | 2.7349                | 0.7829                                                                                            | 0.9440                           |
| 1.6537 | 2550 | 2.0475        | 3.6968                                                                 | 2.7456                | 0.7754                                                                                            | 0.9390                           |
| 1.6861 | 2600 | 1.7946        | 3.4758                                                                 | 2.7046                | 0.7848                                                                                            | 0.9470                           |
| 1.7185 | 2650 | 1.9581        | 3.3828                                                                 | 2.7022                | 0.7867                                                                                            | 0.9430                           |
| 1.7510 | 2700 | 1.8475        | 3.3631                                                                 | 2.6706                | 0.7903                                                                                            | 0.9470                           |
| 1.7834 | 2750 | 1.836         | 3.5622                                                                 | 2.6512                | 0.7857                                                                                            | 0.9450                           |
| 1.8158 | 2800 | 2.051         | 3.3523                                                                 | 2.6542                | 0.7926                                                                                            | 0.9390                           |
| 1.8482 | 2850 | 1.829         | 3.3676                                                                 | 2.6730                | 0.7925                                                                                            | 0.9390                           |
| 1.8807 | 2900 | 1.7557        | 3.3632                                                                 | 2.6536                | 0.7954                                                                                            | 0.9470                           |
| 1.9131 | 2950 | 1.7725        | 3.3448                                                                 | 2.6437                | 0.7946                                                                                            | 0.9470                           |
| 1.9455 | 3000 | 1.7373        | 3.2736                                                                 | 2.6562                | 0.7987                                                                                            | 0.9440                           |
| 1.9780 | 3050 | 1.886         | 3.3404                                                                 | 2.6456                | 0.7958                                                                                            | 0.9450                           |
| 2.0104 | 3100 | 1.7217        | 3.2570                                                                 | 2.6893                | 0.7988                                                                                            | 0.9400                           |
| 2.0428 | 3150 | 1.6235        | 3.2331                                                                 | 2.6132                | 0.8004                                                                                            | 0.9430                           |
| 2.0752 | 3200 | 1.6678        | 3.2466                                                                 | 2.5904                | 0.8030                                                                                            | 0.9470                           |
| 2.1077 | 3250 | 1.6784        | 3.2339                                                                 | 2.5956                | 0.8008                                                                                            | 0.9480                           |
| 2.1401 | 3300 | 1.8422        | 3.2286                                                                 | 2.5997                | 0.8039                                                                                            | 0.9480                           |
| 2.1725 | 3350 | 1.4859        | 3.2163                                                                 | 2.5924                | 0.8049                                                                                            | 0.9470                           |
| 2.2049 | 3400 | 1.6165        | 3.3246                                                                 | 2.6167                | 0.7989                                                                                            | 0.9440                           |
| 2.2374 | 3450 | 1.65          | 3.2184                                                                 | 2.5864                | 0.8039                                                                                            | 0.9460                           |
| 2.2698 | 3500 | 1.5071        | 3.2274                                                                 | 2.5788                | 0.8019                                                                                            | 0.9460                           |
| 2.3022 | 3550 | 1.5238        | 3.2032                                                                 | 2.5608                | 0.8075                                                                                            | 0.9480                           |
| 2.3346 | 3600 | 1.568         | 3.2409                                                                 | 2.5649                | 0.8081                                                                                            | 0.9470                           |
| 2.3671 | 3650 | 1.4644        | 3.1937                                                                 | 2.5841                | 0.8079                                                                                            | 0.9430                           |
| 2.3995 | 3700 | 1.5782        | 3.2033                                                                 | 2.5909                | 0.8065                                                                                            | 0.9450                           |
| 2.4319 | 3750 | 1.6976        | 3.1905                                                                 | 2.5690                | 0.8073                                                                                            | 0.9470                           |
| 2.4643 | 3800 | 1.4682        | 3.2078                                                                 | 2.5610                | 0.8052                                                                                            | 0.9490                           |
| 2.4968 | 3850 | 1.7414        | 3.1822                                                                 | 2.5650                | 0.8072                                                                                            | 0.9500                           |
| 2.5292 | 3900 | 1.654         | 3.1890                                                                 | 2.5566                | 0.8110                                                                                            | 0.9490                           |
| 2.5616 | 3950 | 1.5187        | 3.1843                                                                 | 2.5508                | 0.8090                                                                                            | 0.9470                           |
| 2.5940 | 4000 | 1.4893        | 3.1855                                                                 | 2.5527                | 0.8067                                                                                            | 0.9470                           |
| 2.6265 | 4050 | 1.6716        | 3.1520                                                                 | 2.5432                | 0.8093                                                                                            | 0.9480                           |
| 2.6589 | 4100 | 1.4914        | 3.1868                                                                 | 2.5466                | 0.8099                                                                                            | 0.9500                           |
| 2.6913 | 4150 | 1.6231        | 3.1702                                                                 | 2.5235                | 0.8112                                                                                            | 0.9500                           |
| 2.7237 | 4200 | 1.6058        | 3.1561                                                                 | 2.5171                | 0.8096                                                                                            | 0.9520                           |
| 2.7562 | 4250 | 1.5753        | 3.1660                                                                 | 2.5068                | 0.8111                                                                                            | 0.9530                           |
| 2.7886 | 4300 | 1.4654        | 3.1507                                                                 | 2.5156                | 0.8138                                                                                            | 0.9510                           |
| 2.8210 | 4350 | 1.5901        | 3.1960                                                                 | 2.4917                | 0.8115                                                                                            | 0.9540                           |
| 2.8534 | 4400 | 1.5034        | 3.1491                                                                 | 2.4960                | 0.8116                                                                                            | 0.9550                           |
| 2.8859 | 4450 | 1.4088        | 3.1505                                                                 | 2.5086                | 0.8133                                                                                            | 0.9530                           |
| 2.9183 | 4500 | 1.5527        | 3.1671                                                                 | 2.5154                | 0.8112                                                                                            | 0.9540                           |
| 2.9507 | 4550 | 1.5344        | 3.1329                                                                 | 2.5016                | 0.8141                                                                                            | 0.9530                           |
| 2.9831 | 4600 | 1.4156        | 3.1439                                                                 | 2.4858                | 0.8146                                                                                            | 0.9550                           |
| 3.0156 | 4650 | 1.8602        | 3.1056                                                                 | 2.4799                | 0.8163                                                                                            | 0.9550                           |
| 3.0480 | 4700 | 1.4472        | 3.1387                                                                 | 2.4539                | 0.8126                                                                                            | 0.9540                           |
| 3.0804 | 4750 | 1.3582        | 3.1220                                                                 | 2.4676                | 0.8159                                                                                            | 0.9530                           |
| 3.1128 | 4800 | 1.5408        | 3.1309                                                                 | 2.4722                | 0.8142                                                                                            | 0.9540                           |
| 3.1453 | 4850 | 1.3755        | 3.1227                                                                 | 2.4624                | 0.8171                                                                                            | 0.9530                           |
| 3.1777 | 4900 | 1.4571        | 3.1284                                                                 | 2.4410                | 0.8162                                                                                            | 0.9560                           |
| 3.2101 | 4950 | 1.5657        | 3.0882                                                                 | 2.4486                | 0.8167                                                                                            | 0.9550                           |
| 3.2425 | 5000 | 1.5325        | 3.0980                                                                 | 2.4339                | 0.8178                                                                                            | 0.9540                           |
| 3.2750 | 5050 | 1.4671        | 3.0961                                                                 | 2.4625                | 0.8169                                                                                            | 0.9550                           |
| 3.3074 | 5100 | 1.4808        | 3.1176                                                                 | 2.4578                | 0.8180                                                                                            | 0.9550                           |
| 3.3398 | 5150 | 1.4172        | 3.1338                                                                 | 2.4515                | 0.8168                                                                                            | 0.9550                           |
| 3.3722 | 5200 | 1.4953        | 3.1047                                                                 | 2.4425                | 0.8174                                                                                            | 0.9540                           |
| 3.4047 | 5250 | 1.6419        | 3.1081                                                                 | 2.4317                | 0.8180                                                                                            | 0.9540                           |
| 3.4371 | 5300 | 1.5425        | 3.0910                                                                 | 2.4481                | 0.8210                                                                                            | 0.9560                           |
| 3.4695 | 5350 | 1.5598        | 3.1049                                                                 | 2.4365                | 0.8198                                                                                            | 0.9560                           |
| 3.5019 | 5400 | 1.4086        | 3.1036                                                                 | 2.4352                | 0.8198                                                                                            | 0.9550                           |
| 3.5344 | 5450 | 1.6057        | 3.1076                                                                 | 2.4269                | 0.8197                                                                                            | 0.9560                           |
| 3.5668 | 5500 | 1.6735        | 3.0792                                                                 | 2.4291                | 0.8200                                                                                            | 0.9550                           |
| 3.5992 | 5550 | 1.401         | 3.0959                                                                 | 2.4364                | 0.8211                                                                                            | 0.9550                           |
| 3.6316 | 5600 | 1.2475        | 3.0909                                                                 | 2.4324                | 0.8202                                                                                            | 0.9570                           |
| 3.6641 | 5650 | 1.2495        | 3.0686                                                                 | 2.4148                | 0.8210                                                                                            | 0.9550                           |
| 3.6965 | 5700 | 1.4457        | 3.0837                                                                 | 2.4123                | 0.8197                                                                                            | 0.9570                           |
| 3.7289 | 5750 | 1.5794        | 3.0877                                                                 | 2.4171                | 0.8191                                                                                            | 0.9560                           |
| 3.7613 | 5800 | 1.5696        | 3.0936                                                                 | 2.4153                | 0.8186                                                                                            | 0.9560                           |
| 3.7938 | 5850 | 1.5947        | 3.0778                                                                 | 2.4173                | 0.8190                                                                                            | 0.9560                           |
| 3.8262 | 5900 | 1.4517        | 3.0760                                                                 | 2.4242                | 0.8202                                                                                            | 0.9560                           |
| 3.8586 | 5950 | 1.553         | 3.0897                                                                 | 2.4222                | 0.8188                                                                                            | 0.9580                           |
| 3.8911 | 6000 | 1.2109        | 3.0683                                                                 | 2.4233                | 0.8211                                                                                            | 0.9550                           |
| 3.9235 | 6050 | 1.4384        | 3.0756                                                                 | 2.4221                | 0.8208                                                                                            | 0.9560                           |
| 3.9559 | 6100 | 1.4945        | 3.0755                                                                 | 2.4179                | 0.8202                                                                                            | 0.9560                           |
| 3.9883 | 6150 | 1.4597        | 3.0686                                                                 | 2.4183                | 0.8204                                                                                            | 0.9560                           |

</details>
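
The raw log above is long and hard to scan. As a minimal sketch (assuming the table has been copied into a hypothetical local file `training_log.md`; the column positions are taken from the table layout above), the step and first evaluation-loss columns can be parsed and plotted with matplotlib:

```python
# Hypothetical sketch: plot one evaluation-loss column of the training log over steps.
# Assumes the markdown table above was saved to "training_log.md" (illustrative path).
import matplotlib.pyplot as plt

steps, eval_loss = [], []
with open("training_log.md") as f:
    for line in f:
        # Split a markdown row "| epoch | step | train loss | eval loss | ... |" into cells.
        cells = [c.strip() for c in line.strip().strip("|").split("|")]
        if len(cells) < 4 or not cells[1].isdigit():
            continue  # skip the header and separator rows
        steps.append(int(cells[1]))          # second column: training step
        eval_loss.append(float(cells[3]))    # fourth column: an evaluation loss

plt.plot(steps, eval_loss, marker="o")
plt.xlabel("Training step")
plt.ylabel("Evaluation loss")
plt.title("Evaluation loss vs. training step")
plt.savefig("eval_loss.png")
```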

### Framework Versions
- Python: 3.11.6
- Sentence Transformers: 5.0.0
- Transformers: 4.55.0.dev0
- PyTorch: 2.5.1+cu121
- Accelerate: 1.9.0
- Datasets: 2.19.1
- Tokenizers: 0.21.4
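
To check that a local environment roughly matches the versions listed above before loading or fine-tuning the model, a small sketch like the following prints the installed versions using only the standard library's `importlib.metadata` (the package names are the usual PyPI distribution names and may need adjusting for non-standard installs):

```python
# Minimal sketch: print installed versions of the libraries listed above so they can be
# compared against the versions used for training.
from importlib.metadata import PackageNotFoundError, version

for dist in ["sentence-transformers", "transformers", "torch", "accelerate", "datasets", "tokenizers"]:
    try:
        print(f"{dist}: {version(dist)}")
    except PackageNotFoundError:
        print(f"{dist}: not installed")
```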

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
