Deep Learning Foundation
Awesome Deep Learning Foundations (2012–2016)
Trace the origins of the AI revolution. This curated collection features seminal, highly cited deep learning papers published between 2012 and 2016, the era that birthed modern computer vision and natural language processing: from the breakthrough of AlexNet on ImageNet and the introduction of Dropout, to the architectural leaps of VGG, Inception, and ResNet, alongside the Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), and Generative Adversarial Networks (GANs) that serve as the backbone of today's Large Language Models (LLMs), Generative AI, and autonomous systems. This roadmap provides a structured path through the mathematical and architectural foundations of the field; mastery of modern AI begins with the classics.
- Deep Residual Learning for Image Recognition. Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun (2015). 6,580 checkouts
- DeepLab: Semantic Image Segmentation with Deep Convolutional Nets, Atrous Convolution, and Fully Connected CRFs. Liang-Chieh Chen, George Papandreou, Iasonas Kokkinos, Kevin Murphy, Alan L. Yuille (2016). 7,801 checkouts
- Image Super-Resolution Using Deep Convolutional Networks. Chao Dong, Chen Change Loy, Kaiming He, Xiaoou Tang (2015). 7,703 checkouts
- A Neural Algorithm of Artistic Style. Leon A. Gatys, Alexander S. Ecker, Matthias Bethge (2015). 8,063 checkouts
- Show, Attend and Tell: Neural Image Caption Generation with Visual Attention. Kelvin Xu, Jimmy Ba, Ryan Kiros, Kyunghyun Cho, Aaron Courville, Ruslan Salakhutdinov, Richard Zemel, Yoshua Bengio (2016). 9,296 checkouts
- Exploring the Limits of Language Modeling. Rafal Jozefowicz, Oriol Vinyals, Mike Schuster, Noam Shazeer, Yonghui Wu (2016). 5,734 checkouts
- Effective Approaches to Attention-based Neural Machine Translation. Minh-Thang Luong, Hieu Pham, Christopher D. Manning (2015). 6,875 checkouts
- Memory Networks. Jason Weston, Sumit Chopra, Antoine Bordes (2015). 9,192 checkouts
- Neural Turing Machines. Alex Graves, Greg Wayne, Ivo Danihelka (2014). 8,716 checkouts
- Neural Machine Translation by Jointly Learning to Align and Translate. Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio (2016). 6,561 checkouts
- Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation. Kyunghyun Cho, Bart van Merrienboer, Caglar Gulcehre, Dzmitry Bahdanau, Fethi Bougares, Holger Schwenk, Yoshua Bengio (2014). 5,887 checkouts
- A Convolutional Neural Network for Modelling Sentences. Nal Kalchbrenner, Edward Grefenstette, Phil Blunsom (2014). 7,721 checkouts
- (2014). 9,707 checkouts
- Distributed Representations of Sentences and Documents. Quoc V. Le, Tomas Mikolov (2014). 5,958 checkouts
- Efficient Estimation of Word Representations in Vector Space. Tomas Mikolov, Kai Chen, Greg Corrado, Jeffrey Dean (2013). 5,114 checkouts
- (2014). 8,463 checkouts
- End-to-End Attention-based Large Vocabulary Speech Recognition. Dzmitry Bahdanau, Jan Chorowski, Dmitriy Serdyuk, Philemon Brakel, Yoshua Bengio (2016). 9,357 checkouts
- Deep Speech 2: End-to-End Speech Recognition in English and Mandarin. Dario Amodei, Rishita Anubhai, Eric Battenberg, Carl Case, Jared Casper, Bryan Catanzaro, Jingdong Chen, Mike Chrzanowski, Adam Coates, Greg Diamos, Erich Elsen, Jesse Engel, Linxi Fan, Christopher Fougner, Tony Han, Awni Hannun, Billy Jun, Patrick LeGresley, Libby Lin, Sharan Narang, Andrew Ng, Sherjil Ozair, Ryan Prenger, Jonathan Raiman, Sanjeev Satheesh, David Seetapun, Shubho Sengupta, Yi Wang, Zhiqian Wang, Chong Wang, Bo Xiao, Dani Yogatama, Jun Zhan, Zhenyao Zhu (2015). 7,681 checkouts
- Speech Recognition with Deep Recurrent Neural Networks. Alex Graves, Abdel-rahman Mohamed, Geoffrey Hinton (2013). 6,031 checkouts
- Learning Hand-Eye Coordination for Robotic Grasping with Deep Learning and Large-Scale Data Collection. Sergey Levine, Peter Pastor, Alex Krizhevsky, Deirdre Quillen (2016). 8,882 checkouts
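To give a flavor of the architectural ideas in this list, here is a minimal NumPy sketch of the identity shortcut popularized by He et al. (2015): the block computes a residual function F(x) and adds the unmodified input back in. The two fully connected layers and ReLU placement are simplifying assumptions for illustration, not the paper's exact convolutional formulation.

```python
import numpy as np

def residual_block(x, w1, w2):
    """Sketch of a residual block: y = ReLU(F(x) + x), where
    F is a small two-layer transform (assumption; the original
    paper uses stacked convolutions with batch normalization)."""
    h = np.maximum(0.0, x @ w1)    # first transform + ReLU
    f = h @ w2                     # second transform, the residual F(x)
    return np.maximum(0.0, f + x)  # identity shortcut, then ReLU

# Toy usage: input and output widths must match so x can be added back.
rng = np.random.default_rng(0)
x = rng.standard_normal(4)
w1 = rng.standard_normal((4, 4))
w2 = rng.standard_normal((4, 4))
y = residual_block(x, w1, w2)
```

The key property is that the addition passes gradients through unchanged, which is what lets networks of this kind grow to hundreds of layers without the degradation that plain stacks suffer.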
FAQ
What is Deep Learning Foundation?
Deep Learning Foundation is an expert-curated awesome list on Attendemia that groups high-signal resources for fast learning. Items are reviewed and refreshed over time, so readers can start with a practical shortlist instead of searching across fragmented sources and low-context recommendation threads.
How are items ranked here?
Items are ranked using maintainer curation, content quality notes, engagement momentum, and freshness indicators. This ranking method keeps the top of the awesome list actionable for current workflows, while still preserving evergreen references that are widely cited and useful for deeper technical understanding.
Can I follow this list?
Yes. Use the follow button near the page header to be notified when new resources are added or promoted. Following this list lets you track changes without rechecking manually and keeps your learning feed aligned with this specific topic over time.