Sentence Detection in English Texts


SentenceDetectorDL (SDDL) is based on a general-purpose neural network model for sentence boundary detection. The task of sentence boundary detection is to identify sentences within a text. Many natural language processing tasks take a sentence as an input unit, such as part-of-speech tagging, dependency parsing, named entity recognition or machine translation.

In this model, we treat sentence boundary detection as a classification problem, following the paper "Deep-EOS: General-Purpose Neural Networks for Sentence Boundary Detection" (Stefan Schweter and Sajawel Ahmed, 2020) and using a CNN architecture. We also modified the original implementation slightly to handle broken sentences and to rule out some characters that cannot end a sentence.
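For intuition, the classification framing can be sketched in plain Python: every occurrence of a candidate end-of-sentence character becomes one example, represented by a fixed-size character window around it, which a classifier (a CNN in Deep-EOS) would then label as a true boundary or not. The candidate set and window radius below are illustrative, not the model's actual hyperparameters.

EOS_CANDIDATES = {".", "!", "?", ":", ";"}

def candidate_windows(text, radius=5):
    """Yield (position, context_window) pairs for every candidate boundary.

    Each window is the classifier's input: up to `radius` characters of
    left context, the candidate character itself, and up to `radius`
    characters of right context.
    """
    examples = []
    for i, ch in enumerate(text):
        if ch in EOS_CANDIDATES:
            left = text[max(0, i - radius):i]
            right = text[i + 1:i + 1 + radius]
            examples.append((i, left + ch + right))
    return examples

candidate_windows("John loves Mary.Mary loves Peter.")

Note that the "Mary.Mary" boundary (no space after the period) still produces a candidate window, which is why a learned classifier can recover it while whitespace-based rules cannot.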

We are releasing two pretrained SDDL models, English and multilingual, trained on the SETimes corpus (Tyers and Alperen, 2010) and the Europarl corpus (Wong et al., 2014).


How to use

documenter = DocumentAssembler() \
    .setInputCol("text") \
    .setOutputCol("document")

sentencerDL = SentenceDetectorDLModel \
    .pretrained("sentence_detector_dl", "en") \
    .setInputCols(["document"]) \
    .setOutputCol("sentences")

sd_model = LightPipeline(PipelineModel(stages=[documenter, sentencerDL]))
sd_model.fullAnnotate("""John loves Mary.Mary loves Peter. Peter loves Helen .Helen loves John; Total: four people involved.""")
val documenter = new DocumentAssembler()
  .setInputCol("text")
  .setOutputCol("document")

val model = SentenceDetectorDLModel.pretrained("sentence_detector_dl", "en")
  .setInputCols(Array("document"))
  .setOutputCol("sentences")

val pipeline = new Pipeline().setStages(Array(documenter, model))
val data = Seq("John loves Mary.Mary loves Peter. Peter loves Helen .Helen loves John; Total: four people involved.").toDS.toDF("text")
val result = pipeline.fit(data).transform(data)


Results

|   | result                       |
|---|------------------------------|
| 0 | John loves Mary.             |
| 1 | Mary loves Peter             |
| 2 | Peter loves Helen .          |
| 3 | Helen loves John;            |
| 4 | Total: four people involved. |
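For contrast, a naive rule-based splitter that only breaks on punctuation followed by whitespace misses most of these boundaries. A minimal illustration (using Python's standard `re` module, not part of the model):

import re

text = ("John loves Mary.Mary loves Peter. Peter loves Helen ."
        "Helen loves John; Total: four people involved.")

# Naive rule: split after ".", "!", or "?" only when whitespace follows.
naive = re.split(r"(?<=[.!?])\s+", text)

The rule finds only two chunks, because "Mary.Mary" and "Helen .Helen" have no whitespace after the period; the trained model recovers all five sentences shown above.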

Model Information

Model Name: sentence_detector_dl
Compatibility: Spark NLP 2.7.0+
Edition: Official
Input Labels: [document]
Output Labels: [sentences]
Language: en

Data Source

Please visit the repo for more information.


Benchmarking

Accuracy:  0.98
Recall:    1.00
Precision: 0.96
F1:        0.98