We describe a comprehensive framework for text understanding based on the representation of context. It is designed to serve as a semantic representation for the full range of interpretive and inferential needs of general natural language processing. Its most distinctive feature is its uniform representation of the several simple, independent linguistic knowledge sources that contribute to meaning: lexical associations, syntactic restrictions, case-role expectations, and, most importantly, contextual effects. Compositional syntactic structure from a shallow parse is represented in a neural net-based associative memory, where it interacts through a Bayesian network with semantic associations and with the context, or "gist," of the passage carried forward from preceding sentences. Experiments with more than 2000 sentences in several languages are reported.
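The core idea of combining independent evidence sources through a Bayesian network can be illustrated with a minimal naive-Bayes-style sketch. Everything here is hypothetical: the sense labels, the individual probabilities, and the assumption that each source is conditionally independent given the sense are illustrative stand-ins, not the paper's actual model or parameters.

```python
import math

def combine_evidence(prior, likelihoods):
    """Score each candidate sense as prior * product of per-source
    likelihoods, then normalize. Assumes the evidence sources are
    conditionally independent given the sense (naive Bayes)."""
    scores = {}
    for sense, p in prior.items():
        log_score = math.log(p)
        for source in likelihoods:
            log_score += math.log(source[sense])
        scores[sense] = math.exp(log_score)
    total = sum(scores.values())
    return {sense: v / total for sense, v in scores.items()}

# Toy example: disambiguating "bank" in a passage whose gist involves
# a river. All numbers below are invented for illustration.
prior    = {"finance": 0.6, "riverside": 0.4}
lexical  = {"finance": 0.2, "riverside": 0.7}   # lexical association with "river"
syntax   = {"finance": 0.5, "riverside": 0.5}   # both senses fit "the bank"
caserole = {"finance": 0.4, "riverside": 0.6}   # location role slightly favored
context  = {"finance": 0.1, "riverside": 0.8}   # contextual "gist" of the passage

posterior = combine_evidence(prior, [lexical, syntax, caserole, context])
best = max(posterior, key=posterior.get)
```

Here the contextual evidence overrides the prior preference for the finance sense, which is the role the carried-forward "gist" plays in the framework.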