
The Berkeley Natural Language Processing Group

images/nlp_group_photo_2014.jpg
Members of the Berkeley NLP group, left to right, top row first: Jacob Andreas, Jonathan Kummerfeld, Dan Klein, Greg Durrett, Taylor Berg-Kirkpatrick, and David Hall

Overview

You understand language. Why can't computers? The Berkeley Natural Language Processing Group is trying to do something about it. We use a mix of computer science, linguistics, and statistics to create systems that can cope with the richness and subtlety of human language. We are a part of the UC Berkeley Computer Science Division. You can read more about our people and our research. Broadly, we work on the following areas:

Linguistic analysis: modeling the syntactic and semantic structures of text. Our work in this area includes syntactic parsing, semantic analysis, and coreference resolution. We focus on structured probabilistic models, including unsupervised and latent-variable methods. Some highlights: check out our state-of-the-art parser and our coreference system (currently the best available)!

Machine translation: translating text from one language into another. Our work in MT focuses on richly structured models that operate at a deeper syntactic level rather than a surface phrase level. We also design new algorithms for efficient alignment and decoding. Check out our alignment or language modeling toolkits.

Computational linguistics: using computer science to study language. Our computational linguistics projects include automated reconstruction of ancient languages and decipherment of historical documents. Check out some recent press!

Grounded semantics: modeling meaning. We use compositional models to produce interpretations of text grounded in the real world, such as linking spatial references to geometry, or anchoring dialogs to agent plans and goals.

Unsupervised learning: detecting and inducing hidden structure. Humans learn language without supervision, and we've demonstrated that automatic approaches can do the same. We've developed unsupervised methods that effectively learn a variety of linguistic phenomena, including grammar, coreference, word classes, and translation lexicons.

Beyond language: many other topics that excite us, from computational music to AI agent design. Check out the Berkeley Overmind!

Site designed by Jonathan K. Kummerfeld