Topic: Machine Learning



This page shows the most relevant public items for Machine Learning, ranked by trend activity and review signal. Use weekly for fast changes, monthly for more stable patterns, and all-time for evergreen picks.



  1. Building high-level features using large scale unsupervised learning

    Paper · Jul 12, 2012 · arxiv.org · Quoc V. Le, Marc'Aurelio Ranzato, Rajat Monga, Matthieu Devin, Kai Chen, Greg S. Corrado, Jeff Dean, Andrew Y. Ng

    We consider the problem of building high-level, class-specific feature detectors from only unlabeled data. For example, is it possible to learn a face detector using only unlabeled images? To answe...

  2. Auto-Encoding Variational Bayes

    Paper · Dec 10, 2022 · arxiv.org · Diederik P Kingma, Max Welling

    How can we perform efficient inference and learning in directed probabilistic models, in the presence of continuous latent variables with intractable posterior distributions, and large datasets? We...

  3. DRAW: A Recurrent Neural Network For Image Generation

    Paper · May 20, 2015 · arxiv.org · Karol Gregor, Ivo Danihelka, Alex Graves, Danilo Jimenez Rezende, Daan Wierstra

    This paper introduces the Deep Recurrent Attentive Writer (DRAW) neural network architecture for image generation. DRAW networks combine a novel spatial attention mechanism that mimics the foveatio...

  4. Pixel Recurrent Neural Networks

    Paper · Feb 29, 2016 · arxiv.org · Aaron van den Oord, Nal Kalchbrenner, Koray Kavukcuoglu

    Modeling the distribution of natural images is a landmark problem in unsupervised learning. This task requires an image model that is at once expressive, tractable and scalable. We present a deep n...

  5. Improving neural networks by preventing co-adaptation of feature detectors

    Paper · Jul 3, 2012 · arxiv.org · Geoffrey E. Hinton, Nitish Srivastava, Alex Krizhevsky, Ilya Sutskever, Ruslan R. Salakhutdinov

    When a large feedforward neural network is trained on a small training set, it typically performs poorly on held-out test data. This "overfitting" is greatly reduced by randomly omitting ha...
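    The technique this snippet describes became known as dropout. A minimal sketch of the idea (the function name, the inverted-dropout scaling, and the defaults are mine, not the paper's exact formulation — the paper halves weights at test time, which is equivalent in expectation):

    ```python
    import numpy as np

    def dropout(activations, p=0.5, rng=None, train=True):
        """Randomly zero each unit with probability p during training.

        Inverted dropout: survivors are scaled by 1/(1-p) so that the
        expected activation matches the unmodified network at test time.
        """
        if not train:
            return activations
        rng = rng or np.random.default_rng()
        mask = rng.random(activations.shape) >= p  # keep each unit with prob 1-p
        return activations * mask / (1.0 - p)
    ```

    At test time (`train=False`) the layer is the identity, so no weight rescaling is needed with this variant.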

  6. Adam: A Method for Stochastic Optimization

    Paper · Jan 30, 2017 · arxiv.org · Diederik P. Kingma, Jimmy Ba

    We introduce Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments. The method is straightforward to i...
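    The "adaptive estimates of lower-order moments" the snippet mentions are exponential moving averages of the gradient and its square. A hedged sketch of one Adam update, using the paper's standard default hyperparameters (the function signature is mine):

    ```python
    import numpy as np

    def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        """One Adam update for parameter theta at step t (t starts at 1)."""
        m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
        v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment (uncentered variance) estimate
        m_hat = m / (1 - beta1 ** t)              # bias correction for zero-initialized m
        v_hat = v / (1 - beta2 ** t)              # bias correction for zero-initialized v
        theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
        return theta, m, v
    ```

    Dividing by the square root of the second-moment estimate gives each parameter its own effective step size, which is what makes the method robust to gradient scale.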

  7. DeCAF: A Deep Convolutional Activation Feature for Generic Visual Recognition

    Paper · Oct 6, 2013 · arxiv.org · Jeff Donahue, Yangqing Jia, Oriol Vinyals, Judy Hoffman, Ning Zhang, Eric Tzeng, Trevor Darrell

    We evaluate whether features extracted from the activation of a deep convolutional network trained in a fully supervised fashion on a large, fixed set of object recognition tasks can be re-purposed...

  8. Visualizing and Understanding Convolutional Networks

    Paper · Nov 28, 2013 · arxiv.org · Matthew D Zeiler, Rob Fergus

    Large Convolutional Network models have recently demonstrated impressive classification performance on the ImageNet benchmark. However there is no clear understanding of why they perform so well, o...

  9. Distilling the Knowledge in a Neural Network

    Paper · Mar 9, 2015 · arxiv.org · Geoffrey Hinton, Oriol Vinyals, Jeff Dean

    A very simple way to improve the performance of almost any machine learning algorithm is to train many different models on the same data and then to average their predictions. Unfortunately, making...
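    The paper's key device for transferring an ensemble's knowledge to a single model is a temperature-scaled softmax: raising the temperature exposes the relative probabilities the model assigns to wrong classes. A minimal sketch of that softening step (function name is mine):

    ```python
    import numpy as np

    def soften(logits, T=1.0):
        """Softmax at temperature T.

        T=1 is the ordinary softmax; higher T flattens the distribution,
        revealing structure in the probabilities of non-target classes
        that the student model is trained to match.
        """
        z = np.asarray(logits, dtype=float) / T
        e = np.exp(z - z.max())   # subtract max for numerical stability
        return e / e.sum()
    ```

    Distillation then trains the student on these softened targets (typically alongside the true labels), rather than on the teacher's hard predictions.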

  10. MEM1: A Constant-Memory RL Framework for Long-Horizon Language Agents

    Paper · Feb 12, 2026 · arxiv.org · Yurong Chen, Yu He, Michael I. Jordan, Fan Yao

    Modern language agents must operate over long-horizon, multi-turn interactions, but most rely on full-context prompting which leads to unbounded memory growth. We introduce MEM1, an end-to-end rein...


Top Entities In This Topic

Related Topics

FAQ

What does this Machine Learning page rank?

It ranks public content for Machine Learning using recent discussion, review, and engagement signals so you can triage faster. This guidance is specific to the Machine Learning topic page on Attendemia and is written to make sense without reading other sections of the page.

How should I use weekly vs monthly vs all-time?

Use weekly for fast-moving updates, monthly for stable trend confirmation, and all-time for evergreen references.

How can I discover organizations active in Machine Learning?

Use the linked entities section to jump to labs, companies, and experts connected to this topic and explore their timelines.

Can I follow this topic for updates?

Yes. Use the follow button on this page to subscribe and track new high-signal activity.