0. Note
- These chapters are all covered by paper reviews instead.
- The lecture content is also folded into each paper review, so please refer to those posts.
1. Negative Sampling
https://ceulkun04.tistory.com/246
[Paper Review] [NLP] Distributed Representations of Words and Phrases and their Compositionality (paper: https://arxiv.org/abs/1310.4546)
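For quick reference alongside the review: the paper replaces the full softmax with a binary objective over one true context word and k sampled noise words drawn from P_n(w) ∝ U(w)^(3/4). A minimal NumPy sketch of the per-pair loss (function and variable names are my own, not from the paper):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def neg_sampling_loss(v_center, u_pos, u_negs):
    """Skip-gram negative-sampling loss for one (center, context) pair.

    v_center: center word vector, shape (d,)
    u_pos:    true context ("output") vector, shape (d,)
    u_negs:   k sampled noise vectors, shape (k, d),
              drawn from the unigram distribution raised to the 3/4 power
    """
    pos_term = np.log(sigmoid(u_pos @ v_center))          # log sigma(u_o . v_c)
    neg_term = np.log(sigmoid(-u_negs @ v_center)).sum()  # sum_k log sigma(-u_k . v_c)
    return -(pos_term + neg_term)                         # paper maximizes; we minimize
```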
2. GloVe
https://ceulkun04.tistory.com/247?category=1296326
[Paper Review] [NLP] GloVe: Global Vectors for Word Representation (paper: https://nlp.stanford.edu/pubs/glove.pdf)
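GloVe fits word vectors to the log of the co-occurrence counts with a weighted least-squares objective, J = Σ_ij f(X_ij)(w_i·w̃_j + b_i + b̃_j − log X_ij)². A minimal NumPy sketch of that loss, assuming a dense co-occurrence matrix X; x_max=100 and alpha=0.75 are the values suggested in the paper:

```python
import numpy as np

def glove_weight(x, x_max=100.0, alpha=0.75):
    """Weighting f(X_ij): caps the influence of very frequent co-occurrences."""
    return np.where(x < x_max, (x / x_max) ** alpha, 1.0)

def glove_loss(W, W_tilde, b, b_tilde, X):
    """Weighted least-squares objective, summed over nonzero co-occurrences.

    W, W_tilde: word / context vectors, shape (V, d)
    b, b_tilde: word / context biases, shape (V,)
    X:          co-occurrence counts, shape (V, V)
    """
    i, j = X.nonzero()                                  # only X_ij > 0 contribute
    inner = (W[i] * W_tilde[j]).sum(axis=1) + b[i] + b_tilde[j]
    residual = inner - np.log(X[i, j])                  # fit to log counts
    return (glove_weight(X[i, j]) * residual ** 2).sum()
```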
3. Word embedding evaluation
https://ceulkun04.tistory.com/248?category=1296326
[Paper Review] [NLP] Evaluation methods for unsupervised word embeddings (Schnabel, Labutov, Mimno, Joachims; EMNLP 2015, paper: https://aclanthology.org/D15-1036/)
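One of the intrinsic evaluation families the paper covers is word similarity: rank word pairs by cosine similarity in embedding space and check rank correlation against human judgments (e.g. WordSim-353-style data). A minimal sketch, with helper names of my own:

```python
import numpy as np
from scipy.stats import spearmanr

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def word_similarity_eval(emb, pairs_with_gold):
    """Spearman correlation between model cosine similarities and human scores.

    emb:             dict mapping word -> vector
    pairs_with_gold: iterable of (word1, word2, human_similarity_score)
    """
    model_scores, gold_scores = [], []
    for w1, w2, gold in pairs_with_gold:
        if w1 in emb and w2 in emb:                 # skip out-of-vocabulary pairs
            model_scores.append(cosine(emb[w1], emb[w2]))
            gold_scores.append(gold)
    rho, _ = spearmanr(model_scores, gold_scores)   # rank correlation only
    return rho
```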
'DS Study > CS 224n(NLP)' 카테고리의 다른 글
[CS 224n] [5] Recurrent Neural Networks (Lecture 5) (0) | 2025.03.30 |
---|---|
[CS 224n] [4] Dependency Parsing (Lecture 4) (0) | 2025.03.30 |
[CS 224n] [3] Backpropagation and Neural Network Basics (생략) (0) | 2025.03.30 |
[CS 224n] [1] Intro & Word Vectors (0) | 2025.03.29 |