BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, NAACL 2019
[Paper Review] GPT1: Improving Language Understanding by Generative Pre-Training, Technical report, OpenAI, 2018
[Paper Review] Transformer: Attention is all you need, NeurIPS 2017