General Prerequisites:
It is recommended to take either Categories, Proofs and Processes or Quantum Processes and Computation (the relevant chapter for DMM is covered very early on in the latter). One can also obtain the relevant category-theoretic background from [3] and/or chapters 3-4 of [2].
Course Term: Hilary
Course Lecture Information: This is a reading course for MFoCS students only.
Course Level: M
Course Overview:
Modelling the meaning of natural (as opposed to computer) languages is one of the hardest problems in artificial intelligence. Solving this problem has the potential to dramatically improve the quality and impact of a wide range of text and language processing applications, such as text summarisation, search, machine translation, language generation and question answering. A host of different approaches to this problem have been devised over the years. One notable approach is Formal Semantics, which treats natural languages as programming languages that 'compile' to higher-order logics. Another is Distributional Semantics, which models the meanings of words as points in high-dimensional semantic spaces, determined by their contexts of occurrence. Recent research has attacked the task of reconciling the strengths of both of these approaches to produce compositional distributional (i.e. predominantly vector-based) models of meaning. This reading course serves as an introduction to the theoretical end of this new and rapidly growing field, with a focus on how to compose the meanings of words to form the meanings of phrases and sentences. In particular, the new field of quantum natural language processing (QNLP) relies in a fundamental way on this language model, now known as DisCoCat. QNLP is considered one of the most promising avenues for quantum computing.
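To illustrate the compositional distributional idea described above, the following is a minimal, purely illustrative sketch (not taken from the course materials): word meanings are vectors or tensors in finite-dimensional spaces, and the grammatical type of a word fixes the order of its tensor, so that a sentence meaning is obtained by contracting the word tensors together. All dimensions and numbers below are made-up placeholders.

```python
import numpy as np

# Hypothetical 2-dimensional "noun space" N with toy distributional vectors.
alice = np.array([1.0, 0.2])
bob   = np.array([0.1, 0.9])

# A transitive verb lives in N (x) S (x) N; here the "sentence space" S is
# 1-dimensional, so the verb is an order-3 tensor of shape (2, 1, 2).
likes = np.zeros((2, 1, 2))
likes[:, 0, :] = np.array([[0.8, 0.1],
                           [0.3, 0.9]])  # hypothetical verb entries

def sentence_meaning(subj, verb, obj):
    """Meaning of 'subj verb obj': contract the verb tensor with the
    subject vector on its first index and the object vector on its last."""
    return np.einsum('i,isj,j->s', subj, verb, obj)

print(sentence_meaning(alice, likes, bob))  # a vector in the sentence space S
print(sentence_meaning(bob, likes, alice))  # word order changes the result
```

The point of the sketch is only that composition is driven by grammatical type: a noun is a vector, a transitive verb an order-3 tensor, and the sentence meaning a single vector in the sentence space, in the spirit of DisCoCat.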
Learning Outcomes:
Be up to date with the state of the art in this new field of research.
Course Synopsis:
The course starts with reading [4] and then moves on to several follow-up papers.