Best Awesome List: Deep Learning Foundation Papers

The highest-signal papers from the Awesome List: Deep Learning Foundation collection, ranked by community reviews and momentum.
Canonical intent: topic=al-deep-learning-foundation|type=paper|year=evergreen


Top Picks

TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems
Martín Abadi, Ashish Agarwal, Paul Barham, Eugene Brevdo, Zhifeng Chen, Craig Citro, Greg S. Corrado, Andy Davis, Jeffrey Dean, Matthieu Devin, Sanjay Ghemawat, Ian Goodfellow, Andrew Harp, Geoffrey Irving, Michael Isard, Yangqing Jia, Rafal Jozefowicz, Lukasz Kaiser, Manjunath Kudlur, Josh Levenberg, Dan Mane, Rajat Monga, Sherry Moore, Derek Murray, Chris Olah, Mike Schuster, Jonathon Shlens, Benoit Steiner, Ilya Sutskever, Kunal Talwar, Paul Tucker, Vincent Vanhoucke, Vijay Vasudevan, Fernanda Viegas, Oriol Vinyals, Pete Warden, Martin Wattenberg, Martin Wicke, Yuan Yu, Xiaoqiang Zheng
Mar 16, 2016 · 9579 checkouts · arxiv.org
Source ↗
Deep Voice: Real-time Neural Text-to-Speech
Sercan O. Arik, Mike Chrzanowski, Adam Coates, Gregory Diamos, Andrew Gibiansky, Yongguo Kang, Xian Li, John Miller, Andrew Ng, Jonathan Raiman, Shubho Sengupta, Mohammad Shoeybi
Mar 7, 2017 · 9257 checkouts · arxiv.org
Source ↗

FAQ

How is this “best Awesome List: Deep Learning Foundation Papers” collection ranked?

This page ranks Awesome List: Deep Learning Foundation Papers using topic relevance, checkout momentum, source diversity, and freshness signals. Rankings are recalculated as new items and engagement arrive, so readers see resources that are both high quality and currently useful for implementation, research, and practical decision making. Canonical intent key: topic=al-deep-learning-foundation|type=paper|year=evergreen.
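The blend of relevance, momentum, diversity, and freshness described above can be sketched as a weighted score. This is a minimal illustration, not the actual ranking formula: the `Item` fields, the weights, and the sample values below are all hypothetical assumptions.

```python
from dataclasses import dataclass

@dataclass
class Item:
    relevance: float   # topic-match score in [0, 1]
    momentum: float    # recent checkout growth in [0, 1]
    diversity: float   # source-diversity bonus in [0, 1]
    freshness: float   # recency signal in [0, 1]

# Hypothetical weights; the real blend is not published.
WEIGHTS = {"relevance": 0.4, "momentum": 0.3, "diversity": 0.15, "freshness": 0.15}

def score(item: Item) -> float:
    """Combine the four signals into one sortable rank score."""
    return (WEIGHTS["relevance"] * item.relevance
            + WEIGHTS["momentum"] * item.momentum
            + WEIGHTS["diversity"] * item.diversity
            + WEIGHTS["freshness"] * item.freshness)

# Re-ranking happens whenever new items or engagement arrive:
papers = [Item(0.9, 0.8, 0.5, 0.2), Item(0.7, 0.9, 0.6, 0.9)]
ranked = sorted(papers, key=score, reverse=True)
```

Because the score is recomputed on each update, a fresher, faster-moving paper can outrank one with higher raw relevance, which matches the "currently useful" goal stated above.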

How do you prevent duplicate collection pages?

Attendemia maps each slug variant, including best-of and year forms, to one canonical intent key. If two URLs describe the same topic, type, and timeframe, non-canonical versions permanently redirect. This consolidates crawl signals, avoids duplicate content dilution, and helps search engines index the strongest single page.
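The slug-to-canonical mapping above amounts to a lookup table plus a permanent (HTTP 301) redirect for non-canonical URLs. The sketch below assumes hypothetical slug names and a hypothetical canonical path; only the intent key string comes from this page.

```python
# One canonical intent key per topic/type/timeframe (from this page's metadata).
CANONICAL_KEY = "topic=al-deep-learning-foundation|type=paper|year=evergreen"

# Hypothetical slug variants, including best-of and year forms,
# all mapped to the same canonical key.
VARIANTS = {
    "best-awesome-list-deep-learning-foundation-papers": CANONICAL_KEY,
    "awesome-list-deep-learning-foundation-papers-2024": CANONICAL_KEY,
}

# Hypothetical canonical URL for the key.
CANONICAL_URL = {CANONICAL_KEY: "/collections/awesome-list-deep-learning-foundation-papers"}

def resolve(slug: str, current_path: str):
    """Return (status, location): 301 to the canonical URL when the
    requested path is a non-canonical variant, else 200 (or 404)."""
    key = VARIANTS.get(slug)
    if key is None:
        return 404, None
    target = CANONICAL_URL[key]
    if current_path != target:
        return 301, target  # permanent redirect consolidates crawl signals
    return 200, target
```

A 301 (rather than 302) is what lets search engines transfer link signals to the surviving page instead of indexing both variants.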

When does a year page stay separate from evergreen?

A year-specific page stays separate only when its item set is materially different from evergreen and has enough ranking depth. When overlap is high, the year URL redirects to the evergreen canonical page. This avoids thin duplication while preserving genuinely distinct annual collections for search users.
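The "materially different, with enough depth" test above can be modeled as a set-overlap check. One plausible sketch uses Jaccard similarity; the threshold values and minimum depth here are assumptions, not the site's actual parameters.

```python
def jaccard(a: set, b: set) -> float:
    """Overlap of two item sets, in [0, 1]."""
    union = a | b
    return len(a & b) / len(union) if union else 1.0

# Hypothetical cutoffs: keep a year page only if it diverges enough
# from evergreen and has enough ranked items to stand alone.
OVERLAP_THRESHOLD = 0.8
MIN_DEPTH = 5

def keep_year_page(year_items: set, evergreen_items: set) -> bool:
    """True -> keep the year URL; False -> redirect it to evergreen."""
    return (len(year_items) >= MIN_DEPTH
            and jaccard(year_items, evergreen_items) < OVERLAP_THRESHOLD)
```

Under this model, a year page that mostly mirrors evergreen fails the overlap check and redirects, while a genuinely distinct annual collection survives as its own URL.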

Are these paid recommendations?

No. These recommendations are not paid placements. Attendemia ranks items from public metadata, source quality and coverage, and user engagement signals, then orders them by practical usefulness. Sponsorship does not buy rank position, so this page should be read as editorial curation rather than advertising inventory.