---
license: mit
language:
- de
base_model:
- deepset/gbert-large
---
# Author Regulatory Focus Classifier (German)
This model is a fine-tuned transformer-based classifier that detects the **regulatory focus** in German-language text, classifying whether the language expresses a **promotion** (aspirational, growth-oriented) or **prevention** (safety, obligation-oriented) focus.
It is fine-tuned from the German-language base model `deepset/gbert-large` for binary text classification.
## Model Details
- **Base model**: `deepset/gbert-large`
- **Fine-tuned for**: Binary classification (Regulatory Focus)
- **Language**: German
- **Framework**: Hugging Face Transformers
- **Model format**: `safetensors`
## Use Cases
- Social psychology and communication research
- Marketing and consumer behavior analysis
- Literary or political discourse analysis
- Behavioral modeling and goal orientation profiling
## Example Usage
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model = AutoModelForSequenceClassification.from_pretrained("aveluth/author_regulatory_focus_classifier")
tokenizer = AutoTokenizer.from_pretrained("aveluth/author_regulatory_focus_classifier")
model.eval()

# "We must ensure that no mistakes happen. Safety has the highest priority."
text = "Wir müssen sicherstellen, dass keine Fehler passieren. Sicherheit hat höchste Priorität."

inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():  # inference only, no gradients needed
    outputs = model(**inputs)

predicted_class = outputs.logits.argmax(dim=-1).item()
print("Predicted class:", "prevention" if predicted_class == 0 else "promotion")
```
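For quick experiments, the same checkpoint should also work through the `pipeline` API. Note that the printed labels depend on the `id2label` mapping stored in the model config and may appear as `LABEL_0`/`LABEL_1`:

```python
from transformers import pipeline

clf = pipeline(
    "text-classification",
    model="aveluth/author_regulatory_focus_classifier",
)
# "We want to grow and open up new markets." (promotion-focused)
print(clf("Wir wollen wachsen und neue Märkte erschließen."))
```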
## Labels
| Class | Description |
|-------------|----------------------------------------|
| `0` | Prevention-focused language |
| `1` | Promotion-focused language |
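If class scores rather than a hard decision are needed, the logits can be converted into per-class probabilities. A minimal sketch, reusing `model` and `inputs` from the example above:

```python
import torch
import torch.nn.functional as F

# Per-class probabilities instead of a hard argmax decision.
with torch.no_grad():
    logits = model(**inputs).logits
probs = F.softmax(logits, dim=-1).squeeze()
print(f"prevention: {probs[0]:.3f}, promotion: {probs[1]:.3f}")
```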
## Training Details
- **Training data**: Custom corpus labeled for regulatory focus (prevention vs. promotion framing)
- **Loss function**: Cross-entropy
- **Optimizer**: AdamW
- **Epochs**: 4
- **Learning rate**: 3e-5
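The original training script is not published. The following is a minimal sketch of how a comparable fine-tuning run could look with the Hugging Face `Trainer`, using the hyperparameters listed above; the toy dataset, batch size, and sequence length are placeholders, not the original setup. Cross-entropy and AdamW are the `Trainer` defaults for sequence classification, matching the details above.

```python
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Toy stand-in for the (non-public) labeled corpus.
train_ds = Dataset.from_dict({
    "text": [
        "Wir müssen Fehler unbedingt vermeiden.",        # prevention: "We absolutely must avoid mistakes."
        "Wir wollen neue Chancen nutzen und wachsen.",   # promotion: "We want to seize new opportunities and grow."
    ],
    "label": [0, 1],
})

tokenizer = AutoTokenizer.from_pretrained("deepset/gbert-large")
model = AutoModelForSequenceClassification.from_pretrained(
    "deepset/gbert-large", num_labels=2
)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

train_ds = train_ds.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="rfc-gbert",
    learning_rate=3e-5,             # as listed above
    num_train_epochs=4,             # as listed above
    per_device_train_batch_size=8,  # assumption: batch size not documented
)

# Trainer uses AdamW and cross-entropy loss by default
# for sequence classification models.
trainer = Trainer(model=model, args=args, train_dataset=train_ds)
trainer.train()
```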
## Limitations
- Trained on German-language data only
- Performance may vary on out-of-domain text (e.g., technical manuals, poetry)
- May not generalize across all cultural framings of regulatory focus
## License
[MIT](LICENSE)
## Citation
If you use this model in your research, please cite:
```bibtex
@article{velutharambath2023prevention,
  title   = {Prevention or Promotion? Predicting Author's Regulatory Focus},
  author  = {Velutharambath, Aswathy and Sassenberg, Kai and Klinger, Roman},
  journal = {Northern European Journal of Language Technology},
  volume  = {9},
  number  = {1},
  year    = {2023}
}
```