Update README.md
README.md CHANGED

@@ -19,9 +19,9 @@ model-index:
       value: 0.9497212171554565
 ---
 
-# 
+# CompoundT5
 
-This model is a 
+This model is a re-pretrained version of [google/t5-v1_1-base](https://huggingface.co/google/t5-v1_1-base) on the sagawa/ZINC-canonicalized dataset.
 It achieves the following results on the evaluation set:
 - Loss: 0.1202
 - Accuracy: 0.9497
@@ -29,12 +29,12 @@ It achieves the following results on the evaluation set:
 
 ## Model description
 
-We trained t5 on SMILES from ZINC using 
+We trained T5 on SMILES from ZINC using masked-language modeling (MLM). Its tokenizer was also trained on ZINC.
 
 
 ## Intended uses & limitations
 
-This model can be used 
+This model can be used to predict molecules' properties, reactions, or interactions with proteins, depending on how it is fine-tuned.
 As an example, we fine-tuned this model to predict products. The model is [here](https://huggingface.co/sagawa/ZINC-t5-productpredicition), and you can use the demo [here](https://huggingface.co/spaces/sagawa/predictproduct-t5).
 Using its encoder, we trained a regression model to predict reaction yields. You can try the demo [here](https://huggingface.co/spaces/sagawa/predictyield-t5).
 
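For reference, a minimal sketch of loading the re-pretrained checkpoint and exercising the span-denoising objective described under "Model description". The repo id `sagawa/ZINC-t5` and the `<extra_id_0>` sentinel are assumptions (standard T5 conventions, not confirmed by the card); substitute this model's actual Hub id and check the retrained tokenizer's special tokens.

```python
# Minimal sketch, assuming the checkpoint is published under a Hub id like
# "sagawa/ZINC-t5" (hypothetical here) and that the retrained tokenizer keeps
# T5's <extra_id_*> sentinel tokens for span denoising.
from transformers import AutoTokenizer, T5ForConditionalGeneration

repo_id = "sagawa/ZINC-t5"  # assumed repo id; replace with the real one

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = T5ForConditionalGeneration.from_pretrained(repo_id)

# Mask part of a canonical SMILES with a sentinel token and let the model
# reconstruct the hidden span (the MLM-style objective described in the card).
masked_smiles = "CC(=O)Oc1ccccc1<extra_id_0>O"
inputs = tokenizer(masked_smiles, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```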
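The fine-tuned product-prediction checkpoint linked above can be driven the same way. The input formatting below (reactant SMILES joined with `.`) is an assumption; the linked demo Space is the authoritative reference for how inputs should be written.

```python
# Hedged sketch: generate a product SMILES with the fine-tuned checkpoint
# linked in the card. The reactant formatting is an assumption; consult the
# predictproduct-t5 demo Space for the exact input convention.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "sagawa/ZINC-t5-productpredicition"  # repo id as linked in the card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

reactants = "CC(=O)Cl.OCC"  # assumed input: reactant SMILES separated by '.'
inputs = tokenizer(reactants, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, num_beams=5)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```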
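For the yield-regression use case, one way (not necessarily the authors' exact architecture) to attach a regression head to the T5 encoder is sketched below; the mean pooling, the linear head, and the repo id are illustrative assumptions.

```python
# Illustrative sketch only: mean-pool the T5 encoder's hidden states over a
# reaction SMILES and feed them to a small regression head. This is not the
# authors' exact yield model; the repo id is again an assumption.
import torch
from transformers import AutoTokenizer, T5EncoderModel

repo_id = "sagawa/ZINC-t5"  # assumed repo id; replace with the real one

tokenizer = AutoTokenizer.from_pretrained(repo_id)
encoder = T5EncoderModel.from_pretrained(repo_id)


class YieldRegressor(torch.nn.Module):
    def __init__(self, encoder: T5EncoderModel):
        super().__init__()
        self.encoder = encoder
        self.head = torch.nn.Linear(encoder.config.d_model, 1)  # scalar yield

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        # Mean-pool over non-padding tokens before the linear head.
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
        return self.head(pooled).squeeze(-1)


model = YieldRegressor(encoder)
batch = tokenizer(["CCO.CC(=O)O>>CC(=O)OCC"], return_tensors="pt", padding=True)
with torch.no_grad():
    print(model(batch["input_ids"], batch["attention_mask"]))  # untrained head
```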