Best Awesome List: Deep Learning Foundation Resources

The highest-signal resources from the Awesome List: Deep Learning Foundation collection, ranked by community reviews and momentum.
Canonical intent: topic=al-deep-learning-foundation|type=all|year=evergreen


Top Picks

4. Wasserstein GAN
Martin Arjovsky, Soumith Chintala, Léon Bottou
Jan 26, 2017 · 9773 checkouts · arxiv.org
13. TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems
Martín Abadi, Ashish Agarwal, Paul Barham, Eugene Brevdo, Zhifeng Chen, Craig Citro, Greg S. Corrado, Andy Davis, Jeffrey Dean, Matthieu Devin, Sanjay Ghemawat, Ian Goodfellow, Andrew Harp, Geoffrey Irving, Michael Isard, Yangqing Jia, Rafal Jozefowicz, Lukasz Kaiser, Manjunath Kudlur, Josh Levenberg, Dan Mane, Rajat Monga, Sherry Moore, Derek Murray, Chris Olah, Mike Schuster, Jonathon Shlens, Benoit Steiner, Ilya Sutskever, Kunal Talwar, Paul Tucker, Vincent Vanhoucke, Vijay Vasudevan, Fernanda Viegas, Oriol Vinyals, Pete Warden, Martin Wattenberg, Martin Wicke, Yuan Yu, Xiaoqiang Zheng
Mar 16, 2016 · 9579 checkouts · arxiv.org
24. Deep Voice: Real-time Neural Text-to-Speech
Sercan O. Arik, Mike Chrzanowski, Adam Coates, Gregory Diamos, Andrew Gibiansky, Yongguo Kang, Xian Li, John Miller, Andrew Ng, Jonathan Raiman, Shubho Sengupta, Mohammad Shoeybi
Mar 7, 2017 · 9257 checkouts · arxiv.org
27. Memory Networks
Jason Weston, Sumit Chopra, Antoine Bordes
Nov 29, 2015 · 9192 checkouts · arxiv.org

FAQ

How is this “best Awesome List: Deep Learning Foundation Resources” collection ranked?

This page ranks Awesome List: Deep Learning Foundation Resources using topic relevance, checkout momentum, source diversity, and freshness signals. Rankings are recalculated as new items and engagement signals arrive, so readers see resources that are both high quality and currently useful for implementation, research, and practical decision making. Canonical intent key: topic=al-deep-learning-foundation|type=all|year=evergreen.
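The four signals above could be combined in many ways; the actual Attendemia formula is not public. The sketch below is a minimal illustration under assumed signal names and weights: raw counts are log-damped and freshness decays exponentially, so one viral item cannot swamp topical relevance.

```python
# Illustrative ranking sketch only; field names, weights, and the half-life
# are assumptions, not Attendemia's real formula.
from dataclasses import dataclass
import math

@dataclass
class Item:
    relevance: float   # topic relevance score in [0, 1]
    checkouts: int     # engagement count (e.g. 9773 for Wasserstein GAN)
    sources: int       # number of distinct sources covering the item
    age_days: float    # days since publication

def score(item: Item, half_life_days: float = 365.0) -> float:
    momentum = math.log1p(item.checkouts)                 # dampen raw counts
    diversity = math.log1p(item.sources)                  # reward broad coverage
    freshness = 0.5 ** (item.age_days / half_life_days)   # exponential decay
    return 0.5 * item.relevance + 0.3 * momentum \
         + 0.1 * diversity + 0.1 * freshness

items = [Item(0.9, 9773, 3, 2800), Item(0.8, 9192, 2, 3300)]
ranked = sorted(items, key=score, reverse=True)
```

Re-running the sort whenever a signal changes is what "recalculated as new items and engagement signals arrive" amounts to in this sketch.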

How do you prevent duplicate collection pages?

Attendemia maps each slug variant, including best-of and year forms, to one canonical intent key. If two URLs describe the same topic, type, and timeframe, non-canonical versions permanently redirect. This consolidates crawl signals, avoids duplicate content dilution, and helps search engines index the strongest single page.
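The slug-to-canonical mapping described above can be sketched as a normalization function: strip best-of prefixes, pull any trailing year into the `year` field, and emit one intent key. Any two slugs that normalize to the same key would then share one page, with the rest redirecting. The exact prefixes and slug grammar here are assumptions for illustration.

```python
import re

def canonical_intent(slug: str) -> str:
    """Map a slug variant (hypothetical grammar) to one canonical intent key."""
    topic = re.sub(r"^(best|top)-", "", slug)      # drop best-of prefixes
    year_match = re.search(r"-(\d{4})$", topic)    # trailing 4-digit year?
    year = year_match.group(1) if year_match else "evergreen"
    topic = re.sub(r"-(\d{4})$", "", topic)        # strip the year from topic
    return f"topic={topic}|type=all|year={year}"

# Both slug variants collapse to the same evergreen key, so only one page
# is indexed and the other URL permanently redirects to it.
key_a = canonical_intent("best-al-deep-learning-foundation")
key_b = canonical_intent("al-deep-learning-foundation")
```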

When does a year page stay separate from evergreen?

A year-specific page stays separate only when its item set is materially different from evergreen and has enough ranking depth. When overlap is high, the year URL redirects to the evergreen canonical page. This avoids thin duplication while preserving genuinely distinct annual collections for search users.
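The keep-or-redirect decision above reduces to two checks: does the year page have enough items, and is its overlap with the evergreen set below some threshold? A minimal sketch, with Jaccard similarity as the overlap measure and threshold values that are pure assumptions:

```python
def jaccard(a: set, b: set) -> float:
    """Overlap between two item sets, 0.0 (disjoint) to 1.0 (identical)."""
    return len(a & b) / len(a | b) if (a or b) else 1.0

def keep_year_page(year_items: set, evergreen_items: set,
                   max_overlap: float = 0.8, min_depth: int = 10) -> bool:
    # Thresholds are illustrative assumptions, not Attendemia's real values.
    # Keep the year page only if it has ranking depth AND is materially
    # different from the evergreen set; otherwise redirect to evergreen.
    return (len(year_items) >= min_depth
            and jaccard(year_items, evergreen_items) < max_overlap)

distinct = keep_year_page(set(range(20)), set(range(15, 35)))   # low overlap
redundant = keep_year_page(set(range(20)), set(range(2, 22)))   # high overlap
```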

Are these paid recommendations?

No. These recommendations are not paid placements. Attendemia ranks items from public metadata, source quality coverage, and user engagement signals, then orders them by practical usefulness. Sponsorship does not buy rank position, so this page should be interpreted as editorial curation rather than advertising inventory.