Topic: Awesome List: deep-learning-foundation


Short answer

This page lists the most relevant public items for Awesome List: deep-learning-foundation, ranked by trend activity and review signals. Use the weekly view for fast-moving changes, the monthly view for more stable patterns, and the all-time view for evergreen picks.



  1. What makes for effective detection proposals?

    Paper · Aug 1, 2015 · arxiv.org · Jan Hosang, Rodrigo Benenson, Piotr Dollár, Bernt Schiele

    Current top performing object detectors employ detection proposals to guide the search for objects, thereby avoiding exhaustive sliding window search across images. Despite the popularity and wides...

  2. Reading Text in the Wild with Convolutional Neural Networks

    Paper · Dec 4, 2014 · arxiv.org · Max Jaderberg, Karen Simonyan, Andrea Vedaldi, Andrew Zisserman

    In this work we present an end-to-end system for text spotting -- localising and recognising text in natural scene images -- and text based image retrieval. This system is based on a region proposa...

  3. Perceptual Losses for Real-Time Style Transfer and Super-Resolution

    Paper · Mar 27, 2016 · arxiv.org · Justin Johnson, Alexandre Alahi, Li Fei-Fei

    We consider image transformation problems, where an input image is transformed into an output image. Recent methods for such problems typically train feed-forward convolutional neural networks usin...
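    The core idea is easy to sketch: compare images in the feature space of a fixed network rather than in pixel space. Below is a toy illustration only, not the paper's implementation (which uses feature maps from a pretrained VGG network); the random kernels here are a hypothetical stand-in for pretrained filters.

```python
import numpy as np

def feature_map(img, kernels):
    """Toy stand-in for one layer of a pretrained CNN:
    valid 2-D cross-correlation with each kernel, followed by ReLU."""
    kh, kw = kernels.shape[1], kernels.shape[2]
    h, w = img.shape
    out = np.zeros((kernels.shape[0], h - kh + 1, w - kw + 1))
    for k in range(kernels.shape[0]):
        for i in range(h - kh + 1):
            for j in range(w - kw + 1):
                out[k, i, j] = max(float((img[i:i + kh, j:j + kw] * kernels[k]).sum()), 0.0)
    return out

def perceptual_loss(output_img, target_img, kernels):
    """Mean squared error measured in feature space instead of pixel space."""
    fo = feature_map(output_img, kernels)
    ft = feature_map(target_img, kernels)
    return float(((fo - ft) ** 2).mean())

rng = np.random.default_rng(0)
kernels = rng.standard_normal((4, 3, 3))  # hypothetical fixed "pretrained" filters
target = rng.standard_normal((8, 8))
```

    Identical images give zero loss; any perturbation that survives the feature extractor is penalized, which is what lets a feed-forward transformation network be trained against perceptual rather than per-pixel differences.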

  4. Learning to Compose Neural Networks for Question Answering

    Paper · Jun 7, 2016 · arxiv.org · Jacob Andreas, Marcus Rohrbach, Trevor Darrell, Dan Klein

    We describe a question answering model that applies to both images and structured knowledge bases. The model uses natural language strings to automatically assemble neural networks from a collectio...

  5. Very Deep Convolutional Networks for Text Classification

    Paper · Jan 27, 2017 · arxiv.org · Alexis Conneau, Holger Schwenk, Loïc Barrault, Yann LeCun

    The dominant approaches for many NLP tasks are recurrent neural networks, in particular LSTMs, and convolutional neural networks. However, these architectures are rather shallow in comparison to the ...

  6. A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task

    Paper · Aug 8, 2016 · arxiv.org · Danqi Chen, Jason Bolton, Christopher D. Manning

    Enabling a computer to understand a document so that it can answer comprehension questions is a central, yet unsolved goal of NLP. A key factor impeding its solution by machine learned systems is t...

  7. Densely Connected Convolutional Networks

    Paper · Aug 25, 2016 · arxiv.org · Gao Huang, Zhuang Liu, Kilian Q. Weinberger

    Recent work has shown that convolutional networks can be substantially deeper, more accurate and efficient to train if they contain shorter connections between layers close to the input and those c...
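    The dense connectivity pattern can be sketched in a few lines. This is a minimal sketch on flat feature vectors, not the paper's 2-D convolutional blocks: each layer receives the concatenation of the block input and all preceding layers' outputs, so a block with growth rate g adds g features per layer.

```python
import numpy as np

def dense_block(x, layer_weights):
    """Each layer sees the concatenation of the block input and ALL
    previous layers' outputs, then contributes its own new features."""
    features = [x]
    for W in layer_weights:
        inp = np.concatenate(features)
        features.append(np.maximum(W @ inp, 0.0))  # linear layer + ReLU
    return np.concatenate(features)

rng = np.random.default_rng(0)
d0, growth, n_layers = 8, 4, 3
# layer i maps (d0 + i * growth) inputs to `growth` new features
weights = [rng.standard_normal((growth, d0 + i * growth)) * 0.1
           for i in range(n_layers)]
out = dense_block(rng.standard_normal(d0), weights)
```

    The output dimension grows linearly (d0 + n_layers × growth), which is why DenseNets can stay narrow per layer while every layer still has a short path to the input and to the loss.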

  8. Adaptive Computation Time for Recurrent Neural Networks

    Paper · Feb 21, 2017 · arxiv.org · Alex Graves

    This paper introduces Adaptive Computation Time (ACT), an algorithm that allows recurrent neural networks to learn how many computational steps to take between receiving an input and emitting an ou...
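    The halting mechanism can be sketched as follows. This is a simplified sketch of the ponder-weight computation only, with the per-step halting probabilities given as inputs rather than produced by the network: steps accumulate halting probability until it would cross 1 − ε, and the final step receives the remainder so the weights form a distribution.

```python
def halting_weights(halt_probs, eps=0.01):
    """Turn per-step halting probabilities into ponder weights: stop once
    the cumulative probability would reach 1 - eps; the last step takes
    the remainder so the weights sum to one."""
    total, weights = 0.0, []
    for p in halt_probs:
        if total + p >= 1.0 - eps:
            weights.append(1.0 - total)  # remainder assigned to the final step
            break
        weights.append(p)
        total += p
    return weights

w = halting_weights([0.3, 0.3, 0.3, 0.3, 0.3])
```

    The network's output and state are then the weighted mixture of the per-step values, and the number of steps taken becomes a differentiable quantity that can be regularized.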

  9. Understanding Convolutional Neural Networks

    Paper · May 30, 2016 · arxiv.org · Jayanth Koushik

    Convolutional Neural Networks (CNNs) exhibit extraordinary performance on a variety of machine learning tasks. However, their mathematical properties and behavior are quite poorly understood. Ther...

  10. Adversarially Learned Inference

    Paper · Jun 2, 2016 · arxiv.org · Vincent Dumoulin, Ishmael Belghazi, Ben Poole, Alex Lamb, Martin Arjovsky, Olivier Mastropietro, Aaron Courville

    We introduce the adversarially learned inference (ALI) model, which jointly learns a generation network and an inference network using an adversarial process. The generation network maps samples fr...

  11. Professor Forcing: A New Algorithm for Training Recurrent Networks

    Paper · Oct 27, 2016 · arxiv.org · Alex Lamb, Anirudh Goyal, Ying Zhang, Saizheng Zhang, Aaron Courville, Yoshua Bengio

    The Teacher Forcing algorithm trains recurrent networks by supplying observed sequence values as inputs during training and using the network's own one-step-ahead predictions to do multi-step sampl...
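    The gap between the two sampling regimes is easiest to see in a toy unroll. This is a minimal sketch with a hypothetical stand-in step function; the paper's contribution, omitted here, is an adversarial term that pushes the hidden dynamics of the two regimes to match.

```python
def unroll(step, x0, n_steps, targets=None):
    """Roll out a one-step model. With `targets`, inputs come from the
    observed sequence (teacher forcing); without, the model feeds on its
    own predictions (free-running sampling)."""
    outs, x = [], x0
    for t in range(n_steps):
        y = step(x)
        outs.append(y)
        x = targets[t] if targets is not None else y
    return outs

double = lambda x: 2 * x  # hypothetical one-step model
free_run = unroll(double, 1, 3)                    # [2, 4, 8]
forced = unroll(double, 1, 3, targets=[1, 1, 1])   # [2, 2, 2]
```

    Because the forced run only ever sees ground-truth inputs, its behavior can diverge sharply from the free-running run at test time; that train/test mismatch is exactly what Professor Forcing targets.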

  12. Brain Tumor Segmentation with Deep Neural Networks

    Paper · May 20, 2016 · arxiv.org · Mohammad Havaei, Axel Davy, David Warde-Farley, Antoine Biard, Aaron Courville, Yoshua Bengio, Chris Pal, Pierre-Marc Jodoin, Hugo Larochelle

    In this paper, we present a fully automatic brain tumor segmentation method based on Deep Neural Networks (DNNs). The proposed networks are tailored to glioblastomas (both low and high grade) pictu...

  13. Representation Learning: A Review and New Perspectives

    Paper · Apr 23, 2014 · arxiv.org · Yoshua Bengio, Aaron Courville, Pascal Vincent

    The success of machine learning algorithms generally depends on data representation, and we hypothesize that this is because different representations can entangle and hide more or less the differe...

  14. Deep Learning in Neural Networks: An Overview

    Paper · Oct 8, 2014 · arxiv.org · Juergen Schmidhuber

    In recent years, deep artificial neural networks (including recurrent ones) have won numerous contests in pattern recognition and machine learning. This historical survey compactly summarises relev...

  15. Tutorial on Variational Autoencoders

    Paper · Jan 3, 2021 · arxiv.org · Carl Doersch

    In just three years, Variational Autoencoders (VAEs) have emerged as one of the most popular approaches to unsupervised learning of complicated distributions. VAEs are appealing because they are bu...
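    Two pieces recur in every VAE tutorial and can be written down directly: the reparameterization trick that makes sampling differentiable, and the closed-form KL divergence between a diagonal-Gaussian posterior and the standard-normal prior. This is a sketch of those two formulas only, not a full encoder/decoder.

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    """z = mu + sigma * eps with eps ~ N(0, I), so gradients can flow
    through the sampling step into mu and log_var."""
    eps = rng.standard_normal(np.shape(mu))
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    """Closed-form KL( N(mu, diag(sigma^2)) || N(0, I) ), summed over dims."""
    return 0.5 * float(np.sum(np.exp(log_var) + np.square(mu) - 1.0 - log_var))

mu, log_var = np.zeros(3), np.zeros(3)
z = reparameterize(mu, log_var, np.random.default_rng(0))
```

    The training objective (the ELBO) is then a reconstruction term on the decoded z minus this KL term, and the KL vanishes exactly when the posterior matches the prior.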

  16. LSTM: A Search Space Odyssey

    Paper · Oct 4, 2017 · arxiv.org · Klaus Greff, Rupesh Kumar Srivastava, Jan Koutník, Bas R. Steunebrink, Jürgen Schmidhuber

    Several variants of the Long Short-Term Memory (LSTM) architecture for recurrent neural networks have been proposed since its inception in 1995. In recent years, these networks have become the stat...


FAQ

What does this Awesome List: deep-learning-foundation page rank?

It ranks public content for Awesome List: deep-learning-foundation using recent discussion, review, and engagement signals so you can triage faster. This guidance is specific to the Awesome List: deep-learning-foundation topic page on Attendemia and is written to make sense without reading other sections of the page.

How should I use weekly vs monthly vs all-time?

Use weekly for fast-moving updates, monthly for stable trend confirmation, and all-time for evergreen references.

How can I discover organizations active in Awesome List: deep-learning-foundation?

Use the linked entities section to jump to labs, companies, and experts connected to this topic and explore their timelines.

Can I follow this topic for updates?

Yes. Use the follow button on this page to subscribe and track new high-signal activity.