
BioBERT: pre-trained biomedical language representation model for biomedical text mining

2019-01-25
Jinhyuk Lee, Wonjin Yoon, Sungdong Kim, Donghyeon Kim, Sunkyu Kim, Chan Ho So, Jaewoo Kang

Abstract

Biomedical text mining has become more important than ever as the number of biomedical documents rapidly grows. With the progress of machine learning, extracting valuable information from biomedical literature has gained popularity among researchers, and deep learning is boosting the development of effective biomedical text mining models. However, as deep learning models require a large amount of training data, biomedical text mining with deep learning often fails due to the small sizes of training datasets in biomedical fields. Recent research on learning contextualized language representation models from text corpora sheds light on the possibility of leveraging the large number of unannotated biomedical text corpora. We introduce BioBERT (Bidirectional Encoder Representations from Transformers for Biomedical Text Mining), a domain-specific language representation model pre-trained on large-scale biomedical corpora. Based on the BERT architecture, BioBERT effectively transfers the knowledge of a large amount of biomedical text into biomedical text mining models. While BERT also shows competitive performance with previous state-of-the-art models, BioBERT significantly outperforms them on three representative biomedical text mining tasks: biomedical named entity recognition (1.86% absolute improvement), biomedical relation extraction (3.33% absolute improvement), and biomedical question answering (9.61% absolute improvement), with minimal task-specific architecture modifications. We make the pre-trained weights of BioBERT freely available at https://github.com/naver/biobert-pretrained, and the source code for fine-tuning BioBERT at https://github.com/dmis-lab/biobert.
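
The abstract describes the overall recipe: take the pre-trained BioBERT encoder and fine-tune it on a downstream biomedical task with only a small task-specific head on top. Below is a minimal sketch (not from the paper) of what such fine-tuning looks like for biomedical NER, assuming the Hugging Face `transformers` library; the checkpoint name `dmis-lab/biobert-base-cased-v1.1` and the toy label set are assumptions for illustration, so substitute the weights and annotations you actually use.

```python
# Minimal fine-tuning sketch for BioBERT-based NER (illustrative, not the paper's code).
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_name = "dmis-lab/biobert-base-cased-v1.1"   # assumed BioBERT checkpoint name
labels = ["O", "B-Disease", "I-Disease"]          # example BIO tag set for a disease NER task

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name, num_labels=len(labels)
)

# One toy training step: the only task-specific modification is the
# token-classification head added on top of the pre-trained encoder.
sentence = "Familial hypercholesterolemia is a genetic disorder."
encoding = tokenizer(sentence, return_tensors="pt")

# Dummy labels aligned to the tokenized length, just to show the API;
# real fine-tuning uses BIO tags from an annotated biomedical corpus.
dummy_labels = torch.zeros(encoding["input_ids"].shape, dtype=torch.long)

outputs = model(**encoding, labels=dummy_labels)
outputs.loss.backward()   # repeated over the training set during fine-tuning
```

The same pattern applies to the other two tasks in the paper: relation extraction is typically cast as sequence classification and question answering as span prediction, each reusing the same pre-trained encoder with a different lightweight head.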

URL

http://arxiv.org/abs/1901.08746

PDF

http://arxiv.org/pdf/1901.08746

