Introducing a Conditional Transformer Language Model for Controllable Generation
Richard Socher · 11 SEP 2019
Large-scale language models show promising text generation capabilities, but users cannot control the content or style of the generated text, nor train these models for multiple supervised language generation tasks.
Leveraging Language Models for Commonsense Reasoning in Neural Networks
Nazneen Rajani · 27 JUN 2019
Commonsense reasoning, which draws upon world knowledge derived from spatial and temporal relations, the laws of physics, causes and effects, and social conventions, is a hallmark of human intelligence.
We annually publish and present our findings at top NLP and computer vision conferences.