Topic: Awesome List: nlp-classic


Short answer

This page shows the most relevant public items for Awesome List: nlp-classic, ranked by trend activity and review signal. Use weekly for fast changes, monthly for more stable patterns, and all-time for evergreen picks.



  1. Generalizing Word Embeddings using Bag of Subwords

    Paper · Sep 12, 2018 · arxiv.org · Jinman Zhao, Sidharth Mudgal, Yingyu Liang

    We approach the problem of generalizing pre-trained word embeddings beyond fixed-size vocabularies without using additional contextual information. We propose a subword-level word vector generation...
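    The subword-level generation idea above can be sketched as follows — a minimal illustration of averaging character n-gram vectors to produce embeddings for out-of-vocabulary words. This is not the paper's exact model; the function names and the use of a pre-trained n-gram table are assumptions for illustration.

    ```python
    import numpy as np

    def char_ngrams(word, n_min=3, n_max=6):
        """Character n-grams of a word, with boundary markers,
        as is common in subword embedding models."""
        w = f"<{word}>"
        return [w[i:i + n]
                for n in range(n_min, n_max + 1)
                for i in range(len(w) - n + 1)]

    def subword_embedding(word, ngram_vectors, dim=50):
        """Average the vectors of the word's known character n-grams,
        so words outside a fixed vocabulary still receive an embedding.
        `ngram_vectors` is a hypothetical dict of pre-trained n-gram vectors."""
        grams = [g for g in char_ngrams(word) if g in ngram_vectors]
        if not grams:
            return np.zeros(dim)
        return np.mean([ngram_vectors[g] for g in grams], axis=0)
    ```

    With a trained n-gram table, `subword_embedding("unseenword", table)` yields a vector for a word never observed at training time, which is the generalization-beyond-fixed-vocabulary setting the abstract describes.
    
    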

  2. Domain Adapted Word Embeddings for Improved Sentiment Classification

    Paper · May 11, 2018 · arxiv.org · Prathusha K Sarma, Yingyu Liang, William A Sethares

    Generic word embeddings are trained on large-scale generic corpora; Domain Specific (DS) word embeddings are trained only on data from a domain of interest. This paper proposes a method to combine ...

  3. An efficient framework for learning sentence representations

    Paper · Mar 7, 2018 · arxiv.org · Lajanugen Logeswaran, Honglak Lee

    In this work we propose a simple and efficient framework for learning sentence representations from unlabelled data. Drawing inspiration from the distributional hypothesis and recent work on learni...

  4. Unsupervised Machine Translation Using Monolingual Corpora Only

    Paper · Apr 13, 2018 · arxiv.org · Guillaume Lample, Alexis Conneau, Ludovic Denoyer, Marc'Aurelio Ranzato

    Machine translation has recently achieved impressive performance thanks to recent advances in deep learning and the availability of large-scale parallel corpora. There have been numerous attempts t...



FAQ

What does this Awesome List: nlp-classic page rank?

It ranks public content for Awesome List: nlp-classic using recent discussion, review, and engagement signals so you can triage faster. This guidance is specific to the Awesome List: nlp-classic topic page on Attendemia and is written to make sense without reading other sections of the page.

How should I use weekly vs monthly vs all-time?

Use weekly for fast-moving updates, monthly for stable trend confirmation, and all-time for evergreen references.

How can I discover organizations active in Awesome List: nlp-classic?

Use the linked entities section to jump to labs, companies, and experts connected to this topic and explore their timelines.

Can I follow this topic for updates?

Yes. Use the follow button on this page to subscribe and track new high-signal activity.