All Posts
-
Improving Siamese Networks for One Shot Learning using Kernel Based Activation functions | ARXIV/Convolution Neural Network | 2020. 3. 3. 10:18
https://arxiv.org/abs/1910.09798v1
The lack of a large amount of training data has always been the constraining factor in solving a lot of problems in machine learning, making One Shot Learning one of the most intriguing ideas in machine learning. It aims to learn information about object c.. Siamese..
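Below is a minimal sketch of the idea behind the paper, not its actual implementation, assuming PyTorch: a kernel-based activation function (KAF) built as a learnable mixture of Gaussian kernels over a fixed dictionary, dropped into a shared Siamese branch whose two embeddings are compared with an L1 distance. The layer sizes, dictionary size, and bandwidth heuristic are illustrative assumptions.

```python
import torch
import torch.nn as nn

class KAF(nn.Module):
    """Kernel-based activation (sketch): each channel's activation is a
    learned mixture of Gaussian kernels over a fixed dictionary of points,
    so the shape of the nonlinearity itself is trainable."""
    def __init__(self, num_channels, dict_size=20, boundary=3.0):
        super().__init__()
        # Fixed dictionary of sample points, shared across channels.
        d = torch.linspace(-boundary, boundary, dict_size)
        self.register_buffer("dict", d.view(1, 1, -1))
        # Bandwidth from dictionary spacing (a common heuristic, assumed here).
        self.gamma = 1.0 / (2.0 * (d[1] - d[0]).item() ** 2)
        # One mixing-coefficient vector per channel (learned).
        self.alpha = nn.Parameter(0.3 * torch.randn(1, num_channels, dict_size))

    def forward(self, x):  # x: (batch, channels)
        k = torch.exp(-self.gamma * (x.unsqueeze(-1) - self.dict) ** 2)
        return (k * self.alpha).sum(dim=-1)

class SiameseBranch(nn.Module):
    """Shared embedding branch; the KAF replaces the usual ReLU."""
    def __init__(self, in_dim=784, hidden=256, emb=64):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.act1 = KAF(hidden)
        self.fc2 = nn.Linear(hidden, emb)

    def forward(self, x):
        return self.fc2(self.act1(self.fc1(x)))

def siamese_distance(branch, x1, x2):
    # L1 distance between the two embeddings, as in the classic
    # Siamese one-shot verification setup.
    return torch.abs(branch(x1) - branch(x2)).sum(dim=1)
```

The same branch (shared weights) embeds both inputs of a pair; training then pushes distances down for same-class pairs and up for different-class pairs, which is what makes one-shot comparison against a single support example possible.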
-
ArXiv site | ARXIV | 2020. 3. 2. 22:01
https://arxiv.org/ (arXiv.org e-Print archive): arXiv is a free distribution service and an open-access archive for 1,664,470 scholarly articles in the fields of physics, mathematics, computer science, quantitative biology, quantitative finance, statistics, electrical engineering and systems science, an..
http://arxiv-sanity.com/ (Arxiv Sanity Preserver)
-
A Primer in BERTology: What we know about how BERT works | ARXIV/NLP | 2020. 3. 2. 21:46
https://arxiv.org/abs/2002.12327v1
Transformer-based models are now widely used in NLP, but we still do not understand a lot about their inner workings. This paper describes what is known to date about the famous BERT model (Devlin et al. 2019), synthesizing over 40 analysis studies. We als..
abstract: Transformer-based models are N..
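As a minimal sketch of the kind of probing these analysis studies do, assuming the Hugging Face transformers package is installed: load a pretrained BERT with attention outputs enabled and look at where a single head attends. The example sentence, layer, and head are arbitrary choices for illustration.

```python
import torch
from transformers import BertTokenizer, BertModel

# Pretrained BERT; output_attentions=True exposes per-layer attention maps.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

inputs = tokenizer("The cat sat on the mat.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions: one tensor per layer,
# each of shape (batch, num_heads, seq_len, seq_len).
layer, head = 0, 0  # arbitrary pick for this sketch
attn = outputs.attentions[layer][0, head]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for i, tok in enumerate(tokens):
    top = attn[i].argmax().item()
    print(f"{tok:>8} -> attends most to {tokens[top]}")
```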