NLP Classic
The NLP Canon: From Statistical Foundations to Large Language Models
Master the science of language. This curated "Awesome" list tracks the core breakthroughs in Natural Language Processing (NLP). We start with the Statistical NLP era (HMMs, CRFs, and PCFGs), move through the Neural Shift of the early 2010s (Word2Vec, LSTMs, and Seq2Seq), and culminate in the Transformer Revolution (BERT, GPT, and beyond). This repository serves as a definitive reading list for understanding the algorithms that taught machines to parse, translate, and generate human text.
- Andrew M. Dai, Quoc V. Le (2015). Semi-supervised Sequence Learning · 5,115 checkouts
- Yoav Goldberg, Omer Levy (2014). word2vec Explained: Deriving Mikolov et al.'s Negative-Sampling Word-Embedding Method · 7,235 checkouts
- Manaal Faruqui, Jesse Dodge, Sujay K. Jauhar, Chris Dyer, Eduard Hovy, Noah A. Smith (2015). Retrofitting Word Vectors to Semantic Lexicons · 6,392 checkouts
- Tomas Mikolov, Kai Chen, Greg Corrado, Jeffrey Dean (2013). Efficient Estimation of Word Representations in Vector Space · 5,114 checkouts
- Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, Jeffrey Dean (2013). Distributed Representations of Words and Phrases and their Compositionality · 8,109 checkouts
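Two of the entries above are the original word2vec papers (Mikolov et al., 2013), which between them introduce the skip-gram model and negative sampling. As a quick orientation, here is a minimal sketch of training such embeddings with the gensim library; it assumes gensim >= 4.0, and the toy corpus and hyperparameter values are illustrative choices, not taken from the papers.

```python
# Minimal skip-gram with negative sampling sketch using gensim (assumes gensim >= 4.0).
# The toy corpus and hyperparameters below are illustrative only.
from gensim.models import Word2Vec

# A tiny tokenized corpus; real training uses millions of sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # embedding dimensionality
    window=2,         # context window size
    min_count=1,      # keep every token in this toy corpus
    sg=1,             # 1 = skip-gram (0 would be CBOW)
    negative=5,       # negative-sampling noise words per positive pair
    epochs=50,
    seed=42,
)

# Nearest neighbours in the learned embedding space.
print(model.wv.most_similar("king", topn=3))
```

With a corpus this small the neighbours are noisy; the point is only to show the training interface through which the skip-gram/negative-sampling method is typically used today.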
FAQ
What is NLP Classic?
NLP Classic is an expert-curated awesome list on Attendemia that groups high-signal resources for fast learning. Items are reviewed and refreshed over time, so readers can start with a practical shortlist instead of searching across fragmented sources and low-context recommendation threads.
How are items ranked here?
Items are ranked by a combination of maintainer curation, content-quality notes, engagement momentum, and freshness. This keeps the top of the list useful for current work while preserving evergreen, widely cited references that reward deeper technical reading.
Can I follow this list?
Yes. Use the follow button near the page header to be notified when new resources are added or promoted. Following the list lets you track changes without rechecking it manually and keeps your learning feed aligned with this topic over time.