Korean BertForQuestionAnswering model (from ainize)

Description

Pretrained Question Answering model, adapted from Hugging Face and curated to provide scalability and production readiness using Spark NLP. klue-bert-base-mrc is a Korean model originally trained by ainize.

How to use

Python

import sparknlp
from sparknlp.base import MultiDocumentAssembler
from sparknlp.annotator import BertForQuestionAnswering
from pyspark.ml import Pipeline

# Start a Spark session with Spark NLP loaded
spark = sparknlp.start()

# Assemble the raw question and context strings into document annotations
document_assembler = MultiDocumentAssembler() \
    .setInputCols(["question", "context"]) \
    .setOutputCols(["document_question", "document_context"])

# Load the pretrained Korean MRC model and point it at the assembled documents
spanClassifier = BertForQuestionAnswering.pretrained("bert_qa_ainize_klue_bert_base_mrc", "ko") \
    .setInputCols(["document_question", "document_context"]) \
    .setOutputCol("answer") \
    .setCaseSensitive(True)

pipeline = Pipeline().setStages([
    document_assembler,
    spanClassifier
])

example = spark.createDataFrame(
    [["What's my name?", "My name is Clara and I live in Berkeley."]]
).toDF("question", "context")

result = pipeline.fit(example).transform(example)
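
The answer span comes back as a Spark NLP annotation in the answer column, and its result field holds the extracted text. A minimal way to inspect it on the transformed DataFrame:

result.select("answer.result").show(truncate=False)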

Scala

import com.johnsnowlabs.nlp.base._
import com.johnsnowlabs.nlp.annotator._
import org.apache.spark.ml.Pipeline
import spark.implicits._

val document = new MultiDocumentAssembler()
  .setInputCols("question", "context")
  .setOutputCols("document_question", "document_context")

val spanClassifier = BertForQuestionAnswering
  .pretrained("bert_qa_ainize_klue_bert_base_mrc", "ko")
  .setInputCols(Array("document_question", "document_context"))
  .setOutputCol("answer")
  .setCaseSensitive(true)
  .setMaxSentenceLength(512)

val pipeline = new Pipeline().setStages(Array(document, spanClassifier))

val example = Seq(
  ("Where was John Lenon born?", "John Lenon was born in London and lived in Paris. My name is Sarah and I live in London."),
  ("What's my name?", "My name is Clara and I live in Berkeley."))
  .toDF("question", "context")

val result = pipeline.fit(example).transform(example)

NLU

import nlu
nlu.load("ko.answer_question.klue.bert.base").predict("""What's my name?|||My name is Clara and I live in Berkeley.""")

Model Information

Model Name: bert_qa_ainize_klue_bert_base_mrc
Compatibility: Spark NLP 4.0.0+
License: Open Source
Edition: Official
Input Labels: [document_question, document_context]
Output Labels: [answer]
Language: ko
Size: 413.0 MB
Case sensitive: true
Max sentence length: 512
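
The case-sensitivity and maximum-sentence-length values listed above are ordinary annotator parameters, so they can be checked on the downloaded model itself. A minimal Python sketch, assuming the standard Spark NLP getter names (getCaseSensitive, getMaxSentenceLength):

import sparknlp
from sparknlp.annotator import BertForQuestionAnswering

spark = sparknlp.start()
model = BertForQuestionAnswering.pretrained("bert_qa_ainize_klue_bert_base_mrc", "ko")
print(model.getCaseSensitive())      # expected: True, per the model information above
print(model.getMaxSentenceLength())  # expected: 512, per the model information above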

References

  • https://huggingface.co/ainize/klue-bert-base-mrc
  • https://ainize.ai/
  • https://main-klue-mrc-bert-scy6500.endpoint.ainize.ai/
  • https://ainize.ai/scy6500/KLUE-MRC-BERT?branch=main
  • https://ainize.ai/teachable-nlp
  • https://link.ainize.ai/3FjvBVn