-
Video-to-Video Synthesis [No category] 2020. 8. 3. 13:03
https://arxiv.org/abs/1808.06601 Video-to-Video Synthesis
Abstract: We study the problem of video-to-video synthesis, whose goal is to learn a mapping function from an input source video (e.g., a sequence of semantic segmentation masks) to an output photorealistic video that precisely depicts the content of the source video…
Note: the goal is to turn an existing source video into a realistic video that precisely depicts its content.
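To make the "mapping function" idea concrete, here is a toy sketch of sequential frame generation: each output frame is produced from the current segmentation mask and the previous output frame, which is the core conditioning structure the abstract describes. This is an illustrative stand-in only (a palette lookup plus blending), not the paper's GAN-based generator; all names here are hypothetical.

```python
import numpy as np

# Hypothetical class-to-color palette standing in for a learned generator.
PALETTE = np.array([[0.1, 0.1, 0.1],   # class 0 -> dark
                    [0.2, 0.7, 0.2],   # class 1 -> green
                    [0.6, 0.6, 0.9]])  # class 2 -> bluish

def generate_frame(mask, prev_frame, blend=0.7):
    """Map an (H, W) integer segmentation mask to an RGB frame,
    blending with the previous output frame for temporal smoothness.
    Toy illustration of G(mask_t, frame_{t-1}) -> frame_t, not vid2vid."""
    frame = PALETTE[mask]              # (H, W, 3) per-pixel "rendering"
    if prev_frame is None:
        return frame
    return blend * frame + (1 - blend) * prev_frame

# A 3-frame source video of masks with a moving/changing region.
masks = [np.zeros((4, 4), dtype=int) for _ in range(3)]
masks[1][1:3, 1:3] = 1
masks[2][1:3, 1:3] = 2

prev, video = None, []
for m in masks:
    prev = generate_frame(m, prev)
    video.append(prev)

print(len(video), video[0].shape)  # 3 (4, 4, 3)
```

The point of the blending step is only to show why conditioning on the previous frame matters: generating each frame independently from its mask would produce temporally inconsistent output.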
-
Machine Learning in Python: Main developments and technology trends in data science, machine learning, and artificial intelligence [ARXIV/IT] 2020. 3. 10. 12:00
https://arxiv.org/abs/2002.04803v1
Abstract: Smarter applications are making better use of the insights gleaned from data, having an impact on every industry and research discipline. At the core of this revolution lies the tools and the methods that are driving it, from processing…
-
Separating the Effects of Batch Normalization on CNN Training Speed and Stability Using Classical Adaptive Filter Theory [ARXIV/Convolution Neural Network] 2020. 3. 3. 14:29
https://arxiv.org/abs/2002.10674v1
Abstract: Batch Normalization (BatchNorm) is commonly used in Convolutional Neural Networks (CNNs) to improve training speed and stability. However, there is still limited consensus on why this technique is effective. This paper uses concepts from the…
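As background for the technique the paper analyzes, here is a minimal NumPy sketch of the BatchNorm forward pass: per-channel standardization over the batch followed by a learned affine transform. This shows the operation itself, not the paper's adaptive-filter analysis; the function name and shapes are assumptions for illustration.

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Batch-normalize activations x of shape (N, C): standardize each
    channel over the batch, then apply a learned scale and shift."""
    mu = x.mean(axis=0)                    # per-channel batch mean
    var = x.var(axis=0)                    # per-channel batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # ~zero mean, ~unit variance
    return gamma * x_hat + beta            # learned affine transform

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(64, 8))  # shifted, scaled input
out = batchnorm_forward(x, gamma=np.ones(8), beta=np.zeros(8))

print(np.allclose(out.mean(axis=0), 0.0, atol=1e-6))  # True
print(np.allclose(out.std(axis=0), 1.0, atol=1e-3))   # True
```

With gamma=1 and beta=0 the output is simply the standardized activations, which is the normalization effect whose contributions to training speed and stability the paper tries to separate.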
-
Improving Siamese Networks for One Shot Learning using Kernel Based Activation functions [ARXIV/Convolution Neural Network] 2020. 3. 3. 10:18
https://arxiv.org/abs/1910.09798v1
Abstract: The lack of a large amount of training data has always been the constraining factor in solving a lot of problems in machine learning, making One Shot Learning one of the most intriguing ideas in machine learning. It aims to learn information about object…
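The baseline the paper improves on is the standard Siamese setup: two inputs pass through a shared encoder and are scored by the distance between their embeddings. Below is a minimal sketch of that scoring step, assuming a placeholder linear+ReLU encoder; the paper's actual contribution (kernel-based activation functions) is not reproduced here.

```python
import numpy as np

def encoder(x, W):
    """Shared embedding network; a single linear layer with ReLU stands
    in for the real convolutional encoder (placeholder assumption)."""
    return np.maximum(W @ x, 0.0)

def siamese_similarity(x1, x2, W):
    """Score a pair by the L1 distance between shared embeddings,
    mapped to (0, 1]; higher means more similar."""
    d = np.abs(encoder(x1, W) - encoder(x2, W)).sum()
    return 1.0 / (1.0 + d)

rng = np.random.default_rng(1)
W = rng.normal(size=(16, 32))        # one set of weights, used for BOTH inputs
a = rng.normal(size=32)
b = a + 0.01 * rng.normal(size=32)   # near-duplicate of a
c = rng.normal(size=32)              # unrelated sample

print(siamese_similarity(a, b, W) > siamese_similarity(a, c, W))  # True
```

Weight sharing is the key design choice: because both inputs go through the same W, the network learns a similarity metric rather than a per-class decision boundary, which is what makes one-shot comparison against a single labeled example possible.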
-
ArXiv site [ARXIV] 2020. 3. 2. 22:01
https://arxiv.org/ arXiv.org e-Print archive
arXiv is a free distribution service and an open-access archive for 1,664,470 scholarly articles in the fields of physics, mathematics, computer science, quantitative biology, quantitative finance, statistics, electrical engineering and systems science, and…
http://arxiv-sanity.com/ Arxiv Sanity Preserver
-
A Primer in BERTology: What we know about how BERT works [ARXIV/NLP] 2020. 3. 2. 21:46
https://arxiv.org/abs/2002.12327v1
Abstract: Transformer-based models are now widely used in NLP, but we still do not understand a lot about their inner workings. This paper describes what is known to date about the famous BERT model (Devlin et al. 2019), synthesizing over 40 analysis studies. We also…
Note: Transformer-based models are…