
TULIPS – The Utrecht Logic in Progress Series

Upcoming Talks


Words as tensors; proofs as multilinear maps
In this talk, I report on a current NWO project that aims to combine the ‘proofs-as-programs’ view on derivational semantics as one finds it in typelogical grammars with a vector-based, distributional modelling of lexical semantics.
Typelogical grammars derive from the ‘Syntactic Calculus’ introduced by Lambek in 1958/1961. The Lambek calculus is a precursor of the ‘parsing as deduction’ method in computational linguistics: the conventional linguistic categories are turned into formulas of a grammar logic; the judgment whether a phrase is syntactically well-formed is the outcome of a process of deduction. Present-day categorial grammars address the expressive limitations of the original syntactic calculus by extending the formula language with control operations that allow for restricted forms of reordering and restructuring. These control operations play a role analogous to the exponentials/modalities of linear logic; they situate typelogical grammars in the broader family of substructural logics.
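To make the 'parsing as deduction' idea concrete, here is a minimal sketch of an Ajdukiewicz/Bar-Hillel fragment of the Lambek calculus, with only forward application (A/B, B ⇒ A) and backward application (B, B\A ⇒ A). The lexicon, category encoding, and example sentence are illustrative assumptions, not taken from the talk.

```python
# Categories encoded as nested tuples: atoms are strings ('n', 's');
# ('/', A, B) stands for A/B, and ('\\', B, A) stands for B\A.

def parse(cats):
    """Return the set of categories derivable for the category sequence `cats`."""
    if len(cats) == 1:
        return {cats[0]}
    derivable = set()
    for i in range(1, len(cats)):
        for left in parse(cats[:i]):
            for right in parse(cats[i:]):
                # forward application: A/B, B => A
                if isinstance(left, tuple) and left[0] == '/' and left[2] == right:
                    derivable.add(left[1])
                # backward application: B, B\A => A
                if isinstance(right, tuple) and right[0] == '\\' and right[1] == left:
                    derivable.add(right[2])
    return derivable

# Hypothetical toy lexicon: nouns get 'n', a transitive verb gets (n\s)/n.
NP, S = 'n', 's'
TV = ('/', ('\\', NP, S), NP)
lexicon = {'dogs': NP, 'chase': TV, 'cats': NP}

sentence = [lexicon[w] for w in 'dogs chase cats'.split()]
print(S in parse(sentence))  # the judgment of well-formedness is a deduction
```

A well-formed sentence is exactly one whose category sequence derives the goal category `s`; the control operations and substructural refinements mentioned above go beyond this simplified fragment.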
Compositional interpretation, in a typelogical grammar, takes the form of a structure-preserving map (a homomorphism) that sends types and proofs of the syntax to their counterparts in an appropriate semantic algebra. In the formal semantics approach in the tradition of Montague grammar, the target models are based on a set-theoretic universe of individuals, truth values, and function spaces over them. The semantics of open-class lexical items (nouns, verbs, …) is ignored: these items are treated as unanalysed non-logical constants.
From a Natural Language Processing perspective, this agnostic attitude with respect to lexical semantics is disappointing. In fact, a core ingredient of the highly successful NLP applications we see today is the modelling of word meanings as vectors in a high-dimensional semantic space representing a word’s contexts.
To obtain a vector-based compositional interpretation for typelogical grammars, the key idea is simple. A word with an atomic type, say the noun ‘cat’, is interpreted as a vector in Noun space; a word with an n-place functional type, say a transitive verb ‘chase’ requiring a subject and an object, is interpreted as a tensor of rank n+1. The derivation of a phrase ‘dogs chase cats’ is then interpreted as a multilinear map acting on the meanings of the constituent words to produce a vector in Sentence space, where the syntactic combination of the verb with its subject and object is modelled by tensor contraction.
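The interpretation of ‘dogs chase cats’ can be sketched in a few lines of NumPy. The dimensions and the random “meanings” below are placeholder assumptions (real distributional vectors would come from corpus data); what the sketch shows is the typing discipline: nouns are vectors, a transitive verb is a rank-3 tensor, and the sentence meaning arises by contracting the verb’s subject and object indices.

```python
import numpy as np

rng = np.random.default_rng(0)
n_dim, s_dim = 4, 3  # illustrative Noun- and Sentence-space dimensions

dogs = rng.standard_normal(n_dim)                    # vector in Noun space
cats = rng.standard_normal(n_dim)                    # vector in Noun space
chase = rng.standard_normal((n_dim, s_dim, n_dim))   # rank-3 tensor: subject x sentence x object

# 'dogs chase cats': a multilinear map applied to the word meanings,
# realised as tensor contraction over the subject index i and object index j.
sentence = np.einsum('i,isj,j->s', dogs, chase, cats)
print(sentence.shape)  # (3,) — a vector in Sentence space
```

Because `einsum` contracts one noun index at a time, the same tensor also interprets partial phrases: contracting only the object index yields a matrix (a rank-2 tensor) interpreting the verb phrase ‘chase cats’.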
I discuss some challenges for semantic modelling in terms of finite-dimensional vector spaces and (multi)linear maps arising from tensions between the demands of lexical and derivational aspects of meaning.

Time: 16.00 – 18.00

Location: 203, Drift 25