Paper Review (논문 리뷰)

0. Reference
https://arxiv.org/abs/1602.07360 (SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size)
1. Introduction
- Squeez..
0. Reference
https://arxiv.org/abs/1603.09382 (Deep Networks with Stochastic Depth)
1. Introduction
- This paper focuses on improving the performance of ResNet.
- N..
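The stochastic-depth idea mentioned above can be sketched briefly: during training, each residual block's transform branch is randomly dropped (kept with a survival probability), and at test time the branch is scaled by that probability. Below is a minimal illustrative sketch; the function name and scalar inputs are my own simplification, not code from the paper.

```python
import random

def stochastic_depth_block(x, residual_fn, survival_prob, training):
    """Apply one residual block with stochastic depth (simplified, scalar sketch)."""
    if training:
        # Keep the residual branch with probability `survival_prob`;
        # otherwise the block reduces to the identity mapping.
        if random.random() < survival_prob:
            return x + residual_fn(x)
        return x
    # At test time, scale the residual branch by its survival probability
    # so the expected output matches training.
    return x + survival_prob * residual_fn(x)
```

With `survival_prob=1.0` the block behaves like an ordinary residual block; lower values make deeper networks effectively shorter during training, which is what eases gradient flow.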
0. Reference
https://arxiv.org/abs/1605.07146 (Wide Residual Networks)
1. Introduction
- This paper studies various ResNet structures beyond the ordering of activations, ..
0. Reference
https://arxiv.org/abs/1505.00387 (Highway Networks)
1. Introduction
- To train deeper networks, this paper implements an LSTM-like gating system so that gradient..
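The LSTM-like gating mentioned above combines the layer's transform H(x) with the input x through a learned transform gate T: y = T·H(x) + (1−T)·x, so when T is near 0 the layer passes the input (and its gradient) through unchanged. A minimal scalar sketch, assuming the gate's pre-activation logit is given (in the paper T comes from a learned affine map followed by a sigmoid):

```python
import math

def highway_unit(x, h, t_logit):
    """One highway unit (scalar sketch): y = T*H(x) + (1 - T)*x."""
    # Transform gate T = sigmoid(t_logit); (1 - T) acts as the carry gate.
    t = 1.0 / (1.0 + math.exp(-t_logit))
    return t * h + (1.0 - t) * x
```

A strongly negative gate logit drives T toward 0, so the unit approximates the identity mapping, which is what lets gradients flow through very deep stacks.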
이수현 (Class of '23)