Paper Review(논문 리뷰)

0. Reference
https://arxiv.org/abs/1607.06450 — Layer Normalization: "Training state-of-the-art, deep neural networks is computationally expensive. One way to reduce the training time is to normalize the activities of the neurons. A recently introduced technique called batch normalization uses the distribution of the summed…" (arxiv.org)
1. Introduction
- DNNs show good results in many fields, e.g. on CV and NLP problems.
- However, training DNNs..
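The post above is about Layer Normalization. As a rough illustration of the core operation it discusses, here is a minimal NumPy sketch that normalizes each sample over its feature axis and rescales with a learnable gain and bias; the shapes, epsilon, and variable names are illustrative assumptions, not taken from the review itself.

```python
import numpy as np

def layer_norm(x, gain, bias, eps=1e-5):
    """Normalize each sample over its feature (last) axis,
    then rescale with a learnable gain and bias."""
    mean = x.mean(axis=-1, keepdims=True)      # per-sample mean
    var = x.var(axis=-1, keepdims=True)        # per-sample variance
    x_hat = (x - mean) / np.sqrt(var + eps)    # standardized activations
    return gain * x_hat + bias

# Illustrative usage: a batch of 4 samples with 8 features each.
x = np.random.randn(4, 8)
y = layer_norm(x, gain=np.ones(8), bias=np.zeros(8))
print(y.mean(axis=-1), y.var(axis=-1))  # ~0 mean and ~1 variance per sample
```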
0. Reference
https://www.sciencedirect.com/science/article/abs/pii/S016786550500303X — An introduction to ROC analysis: "Receiver operating characteristics (ROC) graphs are useful for organizing classifiers and visualizing their performance. ROC graphs are commonly used …" (www.sciencedirect.com)
1. Introduction
- A Receiver Operating Characteristics (ROC) graph is a technique for visualizing and organizing the performance of classifiers.
- Over the past few years, Mach..
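The ROC post above is about plotting classifier performance. Below is a minimal, self-contained sketch (not from the post) that sweeps a decision threshold over example scores and collects the (FPR, TPR) points that make up an ROC curve; the toy labels and scores are invented for illustration.

```python
import numpy as np

def roc_points(y_true, scores):
    """Return (FPR, TPR) pairs obtained by sweeping the decision
    threshold over every distinct score (plus +inf for the origin)."""
    thresholds = np.concatenate(([np.inf], np.sort(np.unique(scores))[::-1]))
    pos = (y_true == 1).sum()
    neg = (y_true == 0).sum()
    points = []
    for t in thresholds:
        pred = scores >= t
        tpr = (pred & (y_true == 1)).sum() / pos   # true positive rate
        fpr = (pred & (y_true == 0)).sum() / neg   # false positive rate
        points.append((fpr, tpr))
    return points

# Toy example: 4 positives and 4 negatives with hand-picked scores.
y = np.array([1, 1, 1, 1, 0, 0, 0, 0])
s = np.array([0.9, 0.8, 0.6, 0.3, 0.7, 0.4, 0.2, 0.1])
for fpr, tpr in roc_points(y, s):
    print(f"FPR={fpr:.2f}  TPR={tpr:.2f}")
```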
0. Reference
https://arxiv.org/abs/1412.6980 — Adam: A Method for Stochastic Optimization: "We introduce Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments. The method is straightforward to implement, is computationally efficient, has little memory r…" (arxiv.org)
1. Introduction
1.1. First-order Optimizer VS Se..
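The Adam post above describes a first-order optimizer driven by adaptive estimates of the first and second moments of the gradient. The sketch below is one plausible rendering of that update rule applied to a toy quadratic; the hyperparameter values and the toy objective are assumptions chosen for the example.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: moment estimates, bias correction, parameter step."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad**2     # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1**t)                # bias-corrected first moment
    v_hat = v / (1 - beta2**t)                # bias-corrected second moment
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(theta) = theta^2 starting from theta = 3.
theta, m, v = 3.0, 0.0, 0.0
for t in range(1, 201):
    grad = 2 * theta                          # gradient of theta^2
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.1)
print(theta)  # approaches 0
```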
0. Reference
https://arxiv.org/abs/1609.04747 — An overview of gradient descent optimization algorithms: "Gradient descent optimization algorithms, while increasingly popular, are often used as black-box optimizers, as practical explanations of their strengths and weaknesses are hard to come by. This article aims to provide the reader with intuitions with rega…" (arxiv.org)
1. Introduction
- Gradient Descent..
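Since that post surveys gradient descent variants, here is a minimal sketch of the plain (batch) gradient descent update they all start from; the learning rate, step count, and toy objective are assumptions chosen for the example.

```python
import numpy as np

def gradient_descent(grad_fn, theta0, lr=0.1, steps=100):
    """Plain gradient descent: repeatedly step against the gradient."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(steps):
        theta = theta - lr * grad_fn(theta)   # theta <- theta - eta * grad f(theta)
    return theta

# Toy usage: minimize f(x, y) = (x - 1)^2 + (y + 2)^2.
grad = lambda p: np.array([2 * (p[0] - 1), 2 * (p[1] + 2)])
print(gradient_descent(grad, [0.0, 0.0]))  # ~[1, -2]
```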