Google has awarded over $1.2 million to support research in several areas of natural language understanding that relate to Google's concept of the Knowledge Graph.
Google has been investing heavily in machine learning and deep neural networks to improve web search, and its support for natural language understanding research is likewise motivated by the need to advance search technology.
In announcing the awards, the Google Research Blog explains how natural language processing is integral to its Knowledge Graph technology, which represents a shift "from strings to things", stating:
Understanding natural language is at the core of Google's work to help people get the information they need as quickly and easily as possible. At Google we work hard to advance the state of the art in natural language processing, to improve the understanding of fundamental principles, and to solve the algorithmic and engineering challenges to make these technologies part of everyday life. Language is inherently productive; an infinite number of meaningful new expressions can be formed by combining the meaning of their components systematically. The logical next step is the semantic modeling of structured meaningful expressions -- in other words, “what is said” about entities. We envision that knowledge graphs will support the next leap forward in language understanding towards scalable compositional analyses, by providing a universe of entities, facts and relations upon which semantic composition operations can be designed and implemented.
The research topics that have been awarded grants range from semantic parsing to statistical models of life stories, and from novel compositional inference to representation approaches for modeling relations and events in the Knowledge Graph.
The recipients are:
- Mark Johnson and Lan Du (Macquarie University) and Wray Buntine (NICTA) for “Generative models of Life Stories”
- Percy Liang and Christopher Manning (Stanford University) for “Tensor Factorizing Knowledge Graphs”
- Sebastian Riedel (University College London) and Andrew McCallum (University of Massachusetts, Amherst) for “Populating a Knowledge Base of Compositional Universal Schema”
- Ivan Titov (University of Amsterdam) for “Learning to Reason by Exploiting Grounded Text Collections”
- Hans Uszkoreit (Saarland University and DFKI), Feiyu Xu (DFKI and Saarland University) and Roberto Navigli (Sapienza University of Rome) for “Language Understanding cum Knowledge Yield”
- Luke Zettlemoyer (University of Washington) for “Weakly Supervised Learning for Semantic Parsing with Knowledge Graphs”
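To give a flavor of what "Tensor Factorizing Knowledge Graphs" involves, here is a minimal sketch of the general idea behind bilinear knowledge-graph factorization (in the style of RESCAL-type models): the graph is encoded as a 3-way binary tensor of (relation, head, tail) triples, entities get latent vectors, relations get latent matrices, and a triple's plausibility is scored as a bilinear product. The toy entities, triples, and dimensions below are illustrative assumptions, not taken from any of the funded projects.

```python
import numpy as np

# Toy knowledge graph: (head, relation, tail) triples. All names are made up.
entities = ["Google", "KnowledgeGraph", "Stanford"]
relations = ["developed", "researches"]
triples = [("Google", "developed", "KnowledgeGraph"),
           ("Stanford", "researches", "KnowledgeGraph")]

E, R = len(entities), len(relations)
ent_idx = {e: i for i, e in enumerate(entities)}
rel_idx = {r: i for i, r in enumerate(relations)}

# The graph as a binary tensor X[relation, head, tail]: 1 = observed fact.
X = np.zeros((R, E, E))
for h, r, t in triples:
    X[rel_idx[r], ent_idx[h], ent_idx[t]] = 1.0

# Factorization: entity embeddings A and one matrix W_r per relation;
# a triple (h, r, t) is scored as a_h^T W_r a_t.
rng = np.random.default_rng(0)
k = 3                                   # latent dimension (toy choice)
A = rng.normal(size=(E, k)) * 0.1       # entity embeddings
W = rng.normal(size=(R, k, k)) * 0.1    # relation matrices

def score(h, r, t):
    return float(A[ent_idx[h]] @ W[rel_idx[r]] @ A[ent_idx[t]])

# Plain gradient descent on squared reconstruction error ||A W_r A^T - X_r||^2.
lr = 0.05
for step in range(2000):
    for r in range(R):
        err = A @ W[r] @ A.T - X[r]             # residual for relation r
        gW = A.T @ err @ A                       # gradient w.r.t. W_r
        gA = err @ A @ W[r].T + err.T @ A @ W[r] # gradient w.r.t. A
        W[r] -= lr * gW
        A -= lr * gA

# After fitting, observed triples should score higher than unobserved ones.
print(score("Google", "developed", "KnowledgeGraph"))
print(score("Stanford", "developed", "Google"))
```

The point of the latent representation is that it generalizes: once entities and relations are embedded, the same scoring function can assign plausibility to triples never seen in the graph, which is what makes factorization useful for knowledge-base completion.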