Model Card for BERT-base Sentiment Analysis Model
Model Details
This model is a fine-tuned version of BERT-base (bert-base-uncased) for binary sentiment classification.
Training Data
The model was trained on the Rotten Tomatoes dataset.
Training Procedure
- Learning Rate: 2e-5
- Epochs: 3
- Batch Size: 16
These hyperparameters are listed in full so that others can reproduce the results with the same BERT-base model and the Rotten Tomatoes dataset; a reproduction sketch follows below.
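The following is a minimal reproduction sketch, not the exact training script used for this card: only the hyperparameters above come from this card, while the dataset splits, column names, maximum sequence length, and output path are assumptions based on the public rotten_tomatoes dataset on the Hub.

from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Assumed dataset: the public "rotten_tomatoes" dataset on the Hub,
# with "text"/"label" columns and train/validation splits.
dataset = load_dataset("rotten_tomatoes")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)  # max_length is an assumption

tokenized = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="bert-rotten-tomatoes",   # hypothetical output path
    learning_rate=2e-5,                  # from this card
    num_train_epochs=3,                  # from this card
    per_device_train_batch_size=16,      # from this card
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,                 # enables dynamic padding of each batch
)
trainer.train()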
How to Use
The snippet below shows which Hugging Face classes to use to load the model and run inference.
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Load the tokenizer and model (replace "bert-base-uncased" with this
# repository's id to use the fine-tuned sentiment head).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

input_text = "The movie was fantastic with a gripping storyline!"
inputs = tokenizer(input_text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.logits)
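Continuing from the snippet above, a softmax turns the logits into class probabilities. The label order assumed here (index 0 = negative, index 1 = positive, the Rotten Tomatoes convention) should be checked against the checkpoint's id2label config.

# Assumed label order: index 0 = negative, index 1 = positive.
probs = torch.softmax(outputs.logits, dim=-1)[0]
p_neg, p_pos = probs[0].item(), probs[1].item()
prediction = "positive" if p_pos > p_neg else "negative"
print(f"{prediction} (p_neg={p_neg:.3f}, p_pos={p_pos:.3f})")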
Evaluation
- Accuracy: 81.97%
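As a rough sketch of how such an accuracy figure can be computed (not necessarily how it was produced for this card), the test split of the public rotten_tomatoes dataset and the evaluate library can be used; the model id below is a placeholder for the fine-tuned checkpoint.

import evaluate
from datasets import load_dataset
from transformers import pipeline

# "bert-base-uncased" is a placeholder; point this at the fine-tuned checkpoint.
classifier = pipeline("text-classification", model="bert-base-uncased")
test = load_dataset("rotten_tomatoes", split="test")

# Default label names are "LABEL_0"/"LABEL_1" unless the config defines id2label.
preds = [int(p["label"].split("_")[-1]) for p in classifier(test["text"], batch_size=32)]
print(evaluate.load("accuracy").compute(predictions=preds, references=test["label"]))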
Limitations
The model may produce biased or unreliable predictions due to the nature of the training data. It is recommended to use the model with caution and to review its predictions before acting on them in downstream applications.
Ethical Considerations
- Bias: The model may inherit biases present in the training data.
- Misuse: The model's predictions can be misused or over-trusted, for example presented as definitive judgments of the opinions expressed in a text.
Copyright and License
This model is licensed under the MIT License.