Bert
-
A Primer in BERTology: What we know about how BERT works (ARXIV/NLP, 2020. 3. 2. 21:46)
https://arxiv.org/abs/2002.12327v1

Abstract: "Transformer-based models are now widely used in NLP, but we still do not understand a lot about their inner workings. This paper describes what is known to date about the famous BERT model (Devlin et al. 2019), synthesizing over 40 analysis studies. We als…"