In my spare time I am working on a project to explore the potential for a new generation of natural language systems, inspired by what could be done with a fusion of computational linguistics, cognitive science and symbolic reasoning. This started with a study of classical and statistical approaches to natural language processing, and a dawning realization that traditional approaches to parsing conflate different kinds of knowledge. Prepositional phrase attachment, for instance, is highly ambiguous at a purely grammatical level: in "I saw the man with the telescope", syntax alone cannot decide whether the telescope belongs to the seeing or to the man. Resolving it requires reasoning at a different level, operating in parallel with the grammar.
Conventional symbolic reasoning is founded on mathematical logic and deals with what can be soundly deduced from a given set of assumptions. Cognitive science, by contrast, is an interdisciplinary approach to understanding the human mind. Cognitive architectures such as ACT-R and CHREST aim to provide quantitative predictions of human performance, and have little in common with logic-based accounts of reasoning, e.g. the description logics used in the Semantic Web.
Making significant progress will take plenty of effort and time, so I don't expect quick results. My starting point is the development of a broad-coverage chart parser with relatively flat grammar rules. I then plan to introduce cognitive models for dealing with parsing issues that are hard to address within a purely linguistic framework. The biggest problem I face is remembering where I left off when I pick the work up again; that's inevitable for something I only get time for now and then.
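To give a flavour of the starting point, here is a minimal sketch of a chart parser in the CKY style. The grammar, lexicon and function names are all my own toy illustrations, not the actual project code; a broad-coverage grammar would of course be far larger and flatter than this binary-rule fragment.

```python
# Minimal CKY-style chart parser sketch (toy grammar and lexicon,
# not the project's actual rules). chart[(i, j)] holds the set of
# categories that can span words[i:j].
from collections import defaultdict

# Binary rules: (left child, right child) -> parent categories.
GRAMMAR = {
    ("NP", "VP"): {"S"},
    ("V", "NP"): {"VP"},
    ("VP", "PP"): {"VP"},   # PP attaches to the verb phrase...
    ("NP", "PP"): {"NP"},   # ...or to the noun phrase: the ambiguity above.
    ("Det", "N"): {"NP"},
    ("P", "NP"): {"PP"},
}
LEXICON = {
    "I": {"NP"}, "saw": {"V"}, "the": {"Det"},
    "man": {"N"}, "with": {"P"}, "telescope": {"N"},
}

def parse_categories(words):
    """Return the categories covering the whole sentence."""
    n = len(words)
    chart = defaultdict(set)
    for i, w in enumerate(words):
        chart[(i, i + 1)] |= LEXICON.get(w, set())
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):          # try every split point
                for left in chart[(i, k)]:
                    for right in chart[(k, j)]:
                        chart[(i, j)] |= GRAMMAR.get((left, right), set())
    return chart[(0, n)]

print(parse_categories("I saw the man with the telescope".split()))
```

Because both PP-attachment rules fire, the sentence is recognized as an S via two distinct derivations; a parser that recovered full trees rather than just categories would return both readings, which is exactly the ambiguity a purely grammatical framework cannot resolve on its own.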