Paper Review / Deep Learning

Group Normalization
0. Reference: https://arxiv.org/abs/1803.08494
"Batch Normalization (BN) is a milestone technique in the development of deep learning, enabling various networks to train. However, normalizing along the batch dimension introduces problems --- BN's error increases rapidly when the batch size becomes small."
1. Introduction
- Batch Normalization is one of the most commonly used techniques in deep learning.
- However, this…
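The abstract's point -- BN's batch statistics degrade at small batch sizes, while Group Normalization computes statistics per sample over channel groups -- can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation; the `group_norm` name, the NCHW shape, and the omission of the learnable scale/shift are assumptions:

```python
import numpy as np

def group_norm(x, num_groups, eps=1e-5):
    # x: (N, C, H, W). Statistics are computed per sample over each
    # channel group, so the result is independent of the batch size N.
    n, c, h, w = x.shape
    g = x.reshape(n, num_groups, c // num_groups, h, w)
    mean = g.mean(axis=(2, 3, 4), keepdims=True)
    var = g.var(axis=(2, 3, 4), keepdims=True)
    g = (g - mean) / np.sqrt(var + eps)
    return g.reshape(n, c, h, w)
```

With `num_groups=1` this reduces to Layer Normalization over (C, H, W); with `num_groups=C` it reduces to Instance Normalization.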
Layer Normalization
0. Reference: https://arxiv.org/abs/1607.06450
"Training state-of-the-art, deep neural networks is computationally expensive. One way to reduce the training time is to normalize the activities of the neurons. A recently introduced technique called batch normalization uses the distribution of the summed…"
1. Introduction
- DNNs show strong results in many fields, e.g., CV and NLP problems.
- However, DNN training…
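The preview above refers to normalizing the activities of the neurons per layer rather than per batch; a minimal NumPy sketch of that idea (the `layer_norm` helper is an assumption, and it omits the learnable gain and bias used in the paper):

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize each sample over its own feature dimension: no batch
    # statistics are involved, so it works even with batch size 1.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)
```

Because each row is normalized independently, the output for one sample does not change when other samples in the batch change, unlike batch normalization.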
An Introduction to ROC Analysis
0. Reference: https://www.sciencedirect.com/science/article/abs/pii/S016786550500303X
"Receiver operating characteristics (ROC) graphs are useful for organizing classifiers and visualizing their performance. ROC graphs are commonly used…"
1. Introduction
- A Receiver Operating Characteristics (ROC) graph is a technique for visualizing and organizing the performance of classifiers.
- Over the past few years, Mach…
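An ROC curve is built by sweeping a decision threshold over the classifier's scores and recording the (false positive rate, true positive rate) pair at each step; the area under it is a common summary. A minimal sketch of both (the `roc_points` and `auc` helpers are assumptions; score ties are not handled specially):

```python
def roc_points(scores, labels):
    # Sweep the threshold from high to low, accumulating true and
    # false positives to produce (FPR, TPR) pairs.
    pairs = sorted(zip(scores, labels), key=lambda p: -p[0])
    p = sum(labels)
    n = len(labels) - p
    tp = fp = 0
    points = [(0.0, 0.0)]
    for _, label in pairs:
        if label:
            tp += 1
        else:
            fp += 1
        points.append((fp / n, tp / p))
    return points

def auc(points):
    # Area under the ROC curve via the trapezoidal rule.
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(points, points[1:]))
```

A classifier that ranks every positive above every negative traces the curve through (0, 1) and gets AUC 1.0; random ranking hovers around the diagonal with AUC near 0.5.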
Adam: A Method for Stochastic Optimization
0. Reference: https://arxiv.org/abs/1412.6980
"We introduce Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments. The method is straightforward to implement, is computationally efficient, has little memory r…"
1. Introduction
1.1. First-order Optimizer vs. Se…
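The update the abstract alludes to -- first-order gradients combined with bias-corrected running estimates of the first and second moments -- can be sketched like this. It is a simplified single-parameter version; the `adam_step` helper and its signature are assumptions, not the paper's reference code:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    # Exponential moving averages of the gradient (first moment)
    # and squared gradient (second moment).
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad**2
    # Bias correction: the averages start at zero, so early estimates
    # are scaled up (t is the 1-based step count).
    m_hat = m / (1 - b1**t)
    v_hat = v / (1 - b2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

Only gradients are needed, which is what makes Adam a first-order method, in contrast with second-order optimizers that require (approximate) Hessian information.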