Description
This model is a PHS-BERT-based tweet classification model that detects whether a tweet contains depressive text.
Predicted Entities
depression, no-depression
How to use
# Sample Python Code
from pyspark.ml import Pipeline
from sparknlp.base import DocumentAssembler
from sparknlp.annotator import Tokenizer
from sparknlp_jsl.annotator import MedicalBertForSequenceClassification

document_assembler = DocumentAssembler() \
    .setInputCol("text") \
    .setOutputCol("document")

tokenizer = Tokenizer() \
    .setInputCols(["document"]) \
    .setOutputCol("token")

sequenceClassifier = MedicalBertForSequenceClassification.pretrained("bert_sequence_classifier_depression_twitter", "en", "clinical/models") \
    .setInputCols(["document", "token"]) \
    .setOutputCol("class")

pipeline = Pipeline(stages=[
    document_assembler,
    tokenizer,
    sequenceClassifier
])

data = spark.createDataFrame([
    ["Do what makes you happy, be with who makes you smile, laugh as much as you breathe, and love as long as you live!"],
    ["Everything is a lie, everyone is fake, I'm so tired of living"]
]).toDF("text")

result = pipeline.fit(data).transform(data)
result.select("text", "class.result").show(truncate=False)
val documenter = new DocumentAssembler()
    .setInputCol("text")
    .setOutputCol("document")

val tokenizer = new Tokenizer()
    .setInputCols(Array("document"))
    .setOutputCol("token")

val sequenceClassifier = MedicalBertForSequenceClassification.pretrained("bert_sequence_classifier_depression_twitter", "en", "clinical/models")
    .setInputCols(Array("document", "token"))
    .setOutputCol("class")

val pipeline = new Pipeline().setStages(Array(documenter, tokenizer, sequenceClassifier))

val data = Seq(
    "Do what makes you happy, be with who makes you smile, laugh as much as you breathe, and love as long as you live!",
    "Everything is a lie, everyone is fake, I'm so tired of living"
).toDF("text")

val result = pipeline.fit(data).transform(data)
import nlu
nlu.load("en.classify.bert_sequence.depression_twitter").predict("""Do what makes you happy, be with who makes you smile, laugh as much as you breathe, and love as long as you live!""")
Results
+-----------------------------------------------------------------------------------------------------------------+---------------+
|text |result |
+-----------------------------------------------------------------------------------------------------------------+---------------+
|Do what makes you happy, be with who makes you smile, laugh as much as you breathe, and love as long as you live!|[no-depression]|
|Everything is a lie, everyone is fake, I'm so tired of living |[depression] |
+-----------------------------------------------------------------------------------------------------------------+---------------+
Model Information
Model Name: bert_sequence_classifier_depression_twitter
Compatibility: Healthcare NLP 4.0.2+
License: Licensed
Edition: Official
Input Labels: [document, token]
Output Labels: [class]
Language: en
Size: 1.3 GB
Case sensitive: true
Max sentence length: 128
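Inputs longer than the 128-token maximum are truncated by the model. As a minimal pre-check (a hypothetical helper, not part of the library), whitespace word count can serve as a rough lower bound on the wordpiece count to flag texts that may be cut off:

```python
def may_exceed_max_length(text: str, max_tokens: int = 128) -> bool:
    """Rough check: a text with more whitespace-separated words than
    max_tokens will certainly exceed the model's wordpiece limit.
    (Wordpiece tokenization usually produces *more* tokens than words,
    so this is a conservative lower bound.)"""
    return len(text.split()) > max_tokens

# Typical tweets are well under the limit
print(may_exceed_max_length("Everything is a lie, everyone is fake, I'm so tired of living"))  # False
```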
References
Curated from several academic and in-house datasets.
Benchmarking
label            precision  recall  f1-score  support
minimum          0.97       0.98    0.97      1411
high-depression  0.95       0.92    0.93      595
accuracy         -          -       0.96      2006
macro-avg        0.96       0.95    0.95      2006
weighted-avg     0.96       0.96    0.96      2006
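The macro and weighted averages in the table follow directly from the per-label scores: the macro average is the unweighted mean of the per-label F1 scores, while the weighted average weights each label's F1 by its support. A short sketch reproducing both from the table above:

```python
# Per-label scores from the benchmarking table above
labels = {
    "minimum":         {"f1": 0.97, "support": 1411},
    "high-depression": {"f1": 0.93, "support": 595},
}

# Macro average: unweighted mean over labels
macro_f1 = sum(v["f1"] for v in labels.values()) / len(labels)

# Weighted average: mean weighted by each label's support
total = sum(v["support"] for v in labels.values())
weighted_f1 = sum(v["f1"] * v["support"] for v in labels.values()) / total

print(round(macro_f1, 2))     # 0.95
print(round(weighted_f1, 2))  # 0.96
```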