---
language: en
license: apache-2.0
tags:
- text-classification
- suicidal-detection
pipeline_tag: text-classification
datasets:
- jsfactory/mental_health_reddit_posts
metrics:
- accuracy
base_model:
- distilbert/distilbert-base-uncased
library_name: transformers
---

# Suicidal Detection System

This is a DistilBERT transformer model fine-tuned to detect suicidal intent or ideation in text. It is intended to serve as the text-classification component of a suicidal-detection system.

Example output:

| Text Input                       | Label    | Score |
| :------------------------------- | :--------| :---- |
| "I want to jump off this bridge" | Suicidal | 0.89  |

## Example

```python
from transformers import pipeline

# Load the fine-tuned model and its tokenizer from the Hub
classifier = pipeline("text-classification", model="Kebinnuil/suicidal_detection_model")

result = classifier("I want to jump off the bridge")
print(result)
```
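
The pipeline returns a list of dictionaries, one per input, each with a `label` and a `score` as in the table above. For lower-level control over tokenization and batching, the model can also be loaded directly. The following is a minimal sketch; it assumes the repository's `id2label` mapping names the classes (check `model.config.id2label` to confirm):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load the tokenizer and classification head directly instead of via pipeline()
tokenizer = AutoTokenizer.from_pretrained("Kebinnuil/suicidal_detection_model")
model = AutoModelForSequenceClassification.from_pretrained("Kebinnuil/suicidal_detection_model")
model.eval()

texts = ["I want to jump off this bridge", "Had a great day at the park"]
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Softmax turns the two logits into class probabilities
probs = torch.softmax(logits, dim=-1)
for text, p in zip(texts, probs):
    label_id = int(p.argmax())
    print(f"{text!r} -> {model.config.id2label[label_id]} ({p[label_id]:.2f})")
```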

## Training Metrics
The dataset was split 80/10/10 into train/validation/test sets. The table below shows the model's training metrics per epoch; a sketch of a comparable training setup follows the table.

| Epoch | Training Loss | Validation Loss | Accuracy | AUC      |
| :---- | :------------ | :-------------- | :------- | :------- |
| 1     | 0.442800      | 0.348061        | 0.838000 | 0.925000 |
| 2     | 0.304100      | 0.331631        | 0.850000 | 0.935000 |
| 3     | 0.261600      | 0.329701        | 0.851000 | 0.936000 |
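
For reference, a fine-tuning run matching this setup (DistilBERT base, 3 epochs, 80/10/10 split) could look roughly like the sketch below. The dataset column names, seed, and hyperparameters are assumptions for illustration, not the exact recipe behind the numbers above.

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Assumed: the dataset has "text" and "label" columns and a single "train" split
dataset = load_dataset("jsfactory/mental_health_reddit_posts", split="train")
splits = dataset.train_test_split(test_size=0.2, seed=42)            # 80% train
val_test = splits["test"].train_test_split(test_size=0.5, seed=42)   # 10% val / 10% test

tokenizer = AutoTokenizer.from_pretrained("distilbert/distilbert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True)

train_ds = splits["train"].map(tokenize, batched=True)
val_ds = val_test["train"].map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert/distilbert-base-uncased", num_labels=2)

args = TrainingArguments(
    output_dir="suicidal_detection_model",
    num_train_epochs=3,          # matches the 3 epochs reported above
    learning_rate=2e-5,          # assumed hyperparameter
    eval_strategy="epoch",       # evaluate on the validation split each epoch
)

trainer = Trainer(model=model, args=args,
                  train_dataset=train_ds, eval_dataset=val_ds,
                  tokenizer=tokenizer)   # enables dynamic padding via the default collator
trainer.train()
```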


## Classification Report

| Class | Precision | Recall | F1-score | Support |
| :---- | :-------- | :----- | :------- | :------ |
| 0     | 0.87      | 0.84   | 0.85     | 1211    |
| 1     | 0.84      | 0.87   | 0.86     | 1189    |

**Accuracy**: 0.86  
**Macro avg**: Precision 0.86, Recall 0.86, F1-score 0.86  
**Weighted avg**: Precision 0.86, Recall 0.86, F1-score 0.86  
**Total samples**: 2400
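
A report like the one above can be generated with scikit-learn on the held-out test split. The sketch below is illustrative only: it uses two hand-written examples in place of the real 2,400-sample test set, and it assumes the positive class is labelled "Suicidal" as in the example output table.

```python
from sklearn.metrics import classification_report, roc_auc_score
from transformers import pipeline

classifier = pipeline("text-classification", model="Kebinnuil/suicidal_detection_model")

# Hypothetical test examples; in practice, iterate over the held-out 10% split
texts = ["I want to jump off this bridge", "Looking forward to the weekend"]
y_true = [1, 0]

# top_k=None returns the score for every class, not just the top one
outputs = classifier(texts, top_k=None)

# Probability assigned to the positive class (assumed label name "Suicidal")
y_score = [next(s["score"] for s in out if s["label"] == "Suicidal") for out in outputs]
y_pred = [int(s >= 0.5) for s in y_score]

print(classification_report(y_true, y_pred, digits=2))
print("AUC:", roc_auc_score(y_true, y_score))
```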