Course Details (frequently updated!)
Latest change: Wed Aug 24 08:26:00 2022 (UTC)







Effectful Composition in Natural Language Semantics

Area: Language and Logic (LaLo)

Level: Advanced

Week 2 (click here for course time and room)

Lecturer(s): Dylan Bumford (UCLA) and Simon Charlow (Rutgers)

Course website: https://simoncharlow.com/esslli

Abstract:

Computer programs are often factored into pure components -- simple, total functions from inputs to outputs -- and components that may have side effects -- errors, changes to memory, parallel threads, aborting the current command, etc. In this course, we'll make the case that human languages are similarly organized around the give and take of pure and effectful processes, and we'll aim to show how denotational techniques from computer science can be leveraged to support elegant and illuminating semantic analyses of natural language phenomena.
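To make the pure/effectful contrast concrete, here is a toy sketch in Python (illustrative only, not the course's formalism): pure functions compose by plain function composition, while possibly-failing computations need a different combinator. The names `bind`, `safe_div`, and `safe_sqrt` are invented for this example.

```python
# Pure functions compose by plain function composition; a possibly-failing
# computation needs a different combinator. A Maybe/option-style "bind"
# (names illustrative, not the course's formalism):

def pure_compose(f, g):
    return lambda x: f(g(x))   # ordinary composition of pure functions

def bind(x, f):
    # Skip f entirely if the computation has already failed (None).
    return None if x is None else f(x)

def safe_div(x, y):
    return None if y == 0 else x / y     # partial: fails on division by zero

def safe_sqrt(x):
    return None if x < 0 else x ** 0.5   # partial: fails on negatives

ok = bind(safe_div(8, 2), safe_sqrt)     # 8/2 = 4.0, then sqrt gives 2.0
failed = bind(safe_div(8, 0), safe_sqrt) # the failure propagates untouched
```

Failure (`None`) propagates through `bind` without being checked at every step -- exactly the kind of compositional bookkeeping an effectful semantics automates.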

Additional information:








Graphs, Computation, and Language

Area: Language and Computation (LaCo)

Level: Introductory

Week 2 (click here for course time and room)

Lecturer(s): Dmitry Ustalov (FoLLI)

Course website: https://zenodo.org/record/6667766

Abstract: Employing the properties of linguistic networks allows us to discover structure and make predictions. This introductory course seeks answers to three questions: (1) how to express linguistic phenomena as graphs, (2) how to gain knowledge based on them, and (3) how to assess the quality of this knowledge. We will start with traditional graph-based Natural Language Processing (NLP) methods like TextRank and Markov Clustering and finish with contemporary Machine Learning techniques such as DeepWalk and Graph Convolutional Networks. As the growing interest in NLP methods calls for their meaningful evaluation, we pay special attention to quality assessment and human judgements. The course has five lectures, on Language Graphs, Graph Clustering, Graph Embeddings, Knowledge Graphs, and Evaluation. They go through the essential algorithms step by step, discuss case studies, and suggest insightful references and datasets. The target audience includes, but is not limited to, undergraduate and graduate students, data analysts, and interdisciplinary researchers.
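As a taste of the graph-based methods the abstract mentions, here is a minimal sketch of a TextRank-style keyword ranker: PageRank power iteration over a word co-occurrence graph. The graph and all names are invented for illustration.

```python
# TextRank-style keyword ranking: PageRank power iteration over an
# undirected word co-occurrence graph (toy graph, invented for illustration).

def textrank(graph, damping=0.85, iters=50):
    """graph maps each word to the list of words it co-occurs with."""
    words = list(graph)
    score = {w: 1.0 / len(words) for w in words}
    for _ in range(iters):
        new = {}
        for w in words:
            # each neighbour shares its score equally among its own neighbours
            incoming = sum(score[v] / len(graph[v]) for v in graph[w])
            new[w] = (1 - damping) / len(words) + damping * incoming
        score = new
    return score

cooc = {
    "graph":    ["cluster", "embed", "language"],
    "cluster":  ["graph", "embed"],
    "embed":    ["graph", "cluster"],
    "language": ["graph"],
}
scores = textrank(cooc)
best = max(scores, key=scores.get)   # the most central word in the graph
```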








Outline of a Theory of Interpretation

Area: Language and Logic (LaLo)

Level: Advanced

Week 1 (click here for course time and room)

Lecturer(s): Paul Dekker, ILLC/Department of Philosophy, University of Amsterdam

Course website:

Abstract:

The course formally elaborates the idea of a theory of interpretation by translation, mostly in the spirit of Frege, Quine, Davidson and Kamp. It provides some minimal formal tools required for presenting our extensional understanding of actual discourse, including intensional discourse, and a more or less philosophical motivation for presenting it this way.
While most of the formalism developed will be by and large similar to relatively common semantic architectures, the proposed approach distinguishes itself in that it tries to accomplish all this with no, or the least possible, ontological and representational commitments. We make no assumptions about what meanings are, or possibilities, or representations---or objects, for that matter---nor about how one could go about modeling them. The course might aptly be characterized as a training in a logical, Fregean, understanding of DRT and related formalisms.

Additional information:

In the course we will step by step develop the language and its logic. We motivate and introduce the formal language in various stages, each time explaining how it is to be understood (explaining its Sinn, so to speak), characterizing its logic, and indicating how it can be interpreted model-theoretically (sketching its possible Bedeutung). The minimal logical architecture will be seen to allow for a neat but also novel understanding of the logical connectives, indexicals, names, propositional attitudes, and intentional existence. Even though it will not be explicitly argued here, the proposed architecture can be claimed to be compatible with various more specifically charged frameworks, like those of formal dynamic semantics and cognitive conceptual grammar, and distributional approaches to meaning. The lecturer has made some preliminary readings available at his website at http://www.uva.nl/profiel/p.j.e.dekker, and also, for each day, in a DropBox folder, access to which can be requested by mailing p.j.e.dekker@uva.nl.
Keywords: theory of interpretation, dynamic semantics, discourse representation, proper names, indexicals, identity, necessity, propositional attitudes, natural deduction, logical space, intentional being.








Coordination: Syntax, Semantics, Discourse

Area: Language and Logic (LaLo)

Level: Introductory

Week 2 (click here for course time and room)

Lecturer(s): Daniel Altshuler (University of Oxford) and Robert Truswell (The University of Edinburgh)

Course website: https://danielaltshuler.com/esslli-2022

Abstract:

Linguistics is full of boundary disputes. Empirical phenomena do not come neatly labeled as “syntax”, “semantics”, etc. The components of linguistic theory should interact to provide a complete account of the phenomena at hand, but often a syntactician will claim “this phenomenon is really semantics” (or vice versa), without ensuring that a semantic analysis of the phenomenon is viable.

This course, based on our forthcoming OUP survey monograph, is an attempt to develop a complete analysis of coordination, a topic that spans syntax, formal semantics, and discourse semantics/pragmatics. We focus particularly on patterns of unbounded dependency formation in coordinate structures, which syntacticians claim require a partially semantic analysis, often without reference to current theories of the semantics of coordination. We explore the interactions between syntactic, semantic, and discourse theories, to develop an empirically rich holistic picture of the phenomenon at hand.

Additional information:








From Minimal(ist) Formalizations to Parsing: Pros and Cons of a Symbolic Approach in a Deep-Learning Era

Area: Language and Computation (LaCo)

Level: Advanced

Week 1 (click here for course time and room)

Lecturer(s): Cristiano Chesi and Gregory M Kobele

Course website:

Abstract:

A linguistic grammar formalism allows us to produce a deep analysis. The point of such an analysis is that disparate effects (word order, sentence meaning, prosodic contour, online processing cost, eye movements during reading, blood oxygenation levels in the brain, etc.) are partially subsumed under a single cause (a syntactic structure). Among the many linguistic grammar formalisms on the market, in this course we will embrace a Minimalist perspective (Chomsky, 1995, 2001) because of its linguistic influence. We will start with the formalism of Stabler (1997, 2011) and show, on the one hand, how crucial empirical distinctions can be derived in terms of morphosyntactic featural manipulations and, on the other, how various parsing strategies based on Minimalist Grammars can be formulated, which differ both in terms of complexity and cognitive plausibility. We will show how these different strategies make different predictions with respect to empirical data, and leverage these to reconstruct a version of MGs that better accounts for both linguistic (acceptability/grammaticality) and behavioural (fMRI/EEG/Eyetracking) data. In the end, a brief excursion into language modeling will be attempted, comparing the predictions of these models with the ones based on MGs.

Additional information:








Knowledge and Gossip

Area: Logic and Computation (LoCo)

Level: Advanced

Week 1 (click here for course time and room)

Lecturer(s): Hans van Ditmarsch and Malvin Gattinger

Course website: https://malv.in/2022/gossip/

Abstract:

Gossip protocols facilitate peer-to-peer information sharing in possibly partial networks of agents. Each agent starts with some private information, and the goal is to share this information among all agents. In distributed gossip protocols, there is no central processor or controller deciding who may call whom; this is determined by independent pro-active agents and chance. In epistemic gossip protocols, knowledge conditions may restrict possible calls; for example, you may not wish to call an agent whom you already know to know your secret. In dynamic gossip, agents also exchange 'telephone numbers', which leads to network expansion.

This course gives a survey of results and methods in distributed epistemic gossip. Topics include constructing and revising gossip graphs, exhaustively enumerating call sequences, and model checking the conditions of protocols in suitable logics. We will present both the theory and Haskell-based implementations.
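The course's implementations are in Haskell; purely as a language-neutral illustration of the basic dynamics, here is a toy Python simulation in which any two agents may call each other and merge everything they know (all names are invented).

```python
# Toy distributed gossip: agents start with their own secret only; in each
# random call two agents pool everything they know. Illustrative Python,
# not the course's Haskell code.

import random

def gossip(n, rng):
    knows = [{i} for i in range(n)]        # agent i initially knows secret i
    everyone = set(range(n))
    calls = 0
    while any(k != everyone for k in knows):
        a, b = rng.sample(range(n), 2)     # a random call between two agents
        merged = knows[a] | knows[b]       # both parties learn all secrets held
        knows[a] = knows[b] = set(merged)
        calls += 1
    return calls

calls_needed = gossip(4, random.Random(42))
```

For n >= 4 agents, at least 2n - 4 calls are needed in total, so the simulated count can be compared against that bound.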

Additional information:








The Logic of Views

Area: Logic and Computation (LoCo)

Level: Introductory

Week 2 (click here for course time and room)

Lecturer(s): Michael Benedikt and Jerzy Marcinkowski

Course website:

Abstract:

This introductory course for the Logic and Computation track will cover logical aspects of views.

A view is a selection of information from a dataset, described within some logic.

We will deal with a very natural question about the information content of views:

*** When does a view have sufficient information to answer a given information need? ***

When you think of it, the above question sounds almost philosophical. A view is
a projection of reality (the dataset is the »reality« here). And one can easily imagine a bearded man
in a himation chained to the wall of a Platonic cave, watching the views projected on the wall and pondering what
information about reality can be faithfully reconstructed from what he is able to see.

For us, though, it is a logic/database theory question. And a really well motivated one,
with motivations ranging from the optimization of database query evaluation plans
(where we prefer a positive answer) to privacy issues -- does a view keep another piece of information private?
Dozens (or maybe hundreds) of database theory papers have been written in which this question (or closely related ones)
is considered, many of them quite recently. Both lecturers (MB and JM) have been involved in this effort.
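A tiny concrete illustration of the question (our own toy example, not from the course): take the view to be the projection of a binary relation onto its first column, and the query to be the relation itself. Two databases can agree on the view yet disagree on the query, so this view does not determine this query.

```python
# Two databases that are indistinguishable through a view can disagree
# on a query: then the view does not determine the query. Toy example:
# view V = projection of binary relation R on its first column, query Q = R.

def view(R):
    return {x for (x, y) in R}   # V(R) = pi_1(R)

def query(R):
    return set(R)                # Q(R) = R itself

R1 = {(0, 0)}
R2 = {(0, 1)}

same_view = view(R1) == view(R2)          # both view images are {0}
same_answer = query(R1) == query(R2)      # but the query answers differ
determines_here = (not same_view) or same_answer
```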
Additional information:

ON DAY 1 an overview will be presented, together with the preliminaries necessary to make the course self-contained.
This means that no previous knowledge of database theory is needed to attend. But even if you already
know something about database theory, we will try to teach you something new on this day.

MB will be talking for the first 60-65 minutes, and then JM for 25-30 minutes.


ON DAY 2 we will start from the definition of query determinacy
(which is the formalization of the informal idea of »having sufficient information to answer a query«)
and of query rewriting.
Then the case where the views are defined by First Order Logic formulae will be presented
(undecidability will be shown, and we will see that determinacy coincides with rewriting in this case).
This part will take about 60-65 minutes, and will be presented by MB.

During the remaining 25-30 minutes an elegant case of Path Queries (which are very simple FO formulae)
will be presented by JM. In contrast to the First Order case, determinacy is decidable here,
and does not coincide with rewriting.

ON DAYS 3 and 4 JM will be talking. Most of the time the topic will be the red-green chase,
a useful tool to study determinacy. Both positive and negative results will be presented,
some of them simple and some quite complicated. Then (for the last ~45 minutes of DAY 4)
a new scenario will be discussed, with many interesting open questions which, we think,
require completely new techniques.

ON DAY 5: MB will talk about monotone determinacy, an alternative notion of "sufficient information".
Again, there will be an open problem for you to solve.







Conditionals and Information-Sensitivity

Area: Language and Logic (LaLo)

Level: Introductory

Week 1 (click here for course time and room)

Lecturer(s): David Boylan (Texas Tech) and Matt Mandelkern (NYU)

Course website: https://mandelkern.hosting.nyu.edu/condinfo.html

Abstract:

Recent research has converged on the idea that the semantics for conditionals is locally information-sensitive: roughly, the interpretation of a conditional depends, not just on its global context, but also on its local context. We will explore the theory of conditionals through this lens. We will thus introduce students to long-standing questions involving conditionals, as well as give them familiarity with the cutting-edge literature in this area. We will first consider how local information-sensitivity can illuminate the distinction between indicatives and subjunctives. We then will consider the logic of conditionals. Finally, we will consider the probability of conditionals. These topics have generally been explored in isolation from each other; we hope that, by bringing them together under the umbrella of local information-sensitivity, we will draw out inter-dependencies between these three issues and show how they can shed light on each other.

Additional information:








Defeasible Logics with Applications to Normative Systems and Philosophy

Area: Logic and Computation (LoCo)

Level: Foundational

Week 1 (click here for course time and room)

Lecturer(s): Aleks Knoks, University of Luxembourg

Course website: http://aleksknoks.com/esslli-2022-defeasible-logic/

Abstract: In many scientific fields, as well as in day-to-day commonsense reasoning, one often has to reason on the basis of uncertain, incomplete, or even inconsistent information. The titular defeasible logics---also known as nonmonotonic logics---encompass various formal systems designed to capture reasoning of this sort. This course has two main goals. The first is to introduce the participants to the field by providing them with a solid understanding of exemplar approaches, including default logic, input/output logic, and some argumentation-theoretic approaches. The second is to discuss applications of these formal tools to normative systems and some heatedly debated issues in epistemology and ethics. Thus, the course participants will not only learn about some of the core ideas, results, and approaches in the field of defeasible logics, but also develop skills in applying these logics in the context of normative systems and philosophy.
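As a toy illustration of the defeasible idea (our own sketch, not one of the course's formal systems): a default rule fires only when its conclusion is consistent with what is already established, so a strict penguin rule blocks the birds-fly default.

```python
# Toy default reasoning in the spirit of default logic (our own sketch):
# a strict rule always fires; a default fires only if its conclusion is
# consistent with what is already established.

hard_rules = {"penguin": "not_fly"}   # penguins definitely do not fly
defaults   = {"bird": "fly"}          # birds fly, by default

def negate(lit):
    return lit[4:] if lit.startswith("not_") else "not_" + lit

def conclusions(facts):
    known = set(facts)
    for premise, concl in hard_rules.items():
        if premise in known:
            known.add(concl)
    for premise, concl in defaults.items():
        # a default is blocked when its conclusion contradicts known facts
        if premise in known and negate(concl) not in known:
            known.add(concl)
    return known

tweety = conclusions({"bird", "penguin"})   # the default is blocked
robin  = conclusions({"bird"})              # the default fires
```

Note the nonmonotonicity: learning the extra fact "penguin" retracts the conclusion "fly".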

Additional information:








Argument Mining between NLP and Social Sciences

Area: Language and Computation (LaCo)

Level: Introductory

Week 1 (click here for course time and room)

Lecturer(s): Gabriella Lapesa (IMS Stuttgart) and Eva Maria Vecchi (IMS Stuttgart)

Course website: https://sites.google.com/view/esslli2022-am-in-nlp-ss/

Abstract:

Argument Mining is a highly interdisciplinary field in Natural Language Processing. Given a linguistic unit (a speech, an essay, a forum post, or a tweet), its goal is to determine the position adopted by the author/speaker on a certain topic/issue (e.g., whether or not vaccinations should be enforced), and to identify the support, if any, that the speaker provides for this position. In this introductory course we discuss a selection of issues related to Argument Mining, structured along three main coordinates: the core notion of Argument Quality (how do we recognise good arguments?); the modeling challenges related to the automatic extraction of argument structures (multilingualism; evaluation of different modeling architectures); and the application potential (computational social science; education). The course aims to highlight the interdisciplinary aspects of this field, ranging from the collaboration of theory and practice between NLP and the social sciences, to approaching different types of linguistic structures (e.g., social media versus parliamentary texts), the linguistic analysis of such structures, and the ethical issues involved (e.g., how to use Argument Mining for the social good).

Additional information:








Cognitive and Computational Models of Abstractness

Area: Language and Computation (LaCo)

Level: Introductory

Week 1 (click here for course time and room)

Lecturer(s): Diego Frassinelli (University of Konstanz) and Sabine Schulte im Walde (University of Stuttgart)

Course website: https://kater-concepts.github.io/esslli22/

Abstract:

Across disciplines, researchers are eager to gain insight into empirical features of abstract vs. concrete concepts and words. In the first part of this course we present an overview of the cognitive science literature, which reports extensive analyses of how concrete concepts are processed but little consensus about the nature of abstract concepts. In the second part of this course we look into this dichotomy from a computational perspective, where the inclusion of information regarding the concreteness of words has been demonstrated to play a key role across NLP tasks, such as the automatic identification of figurative language. Additionally, we describe and discuss the procedures for collecting human-generated ratings of abstractness and their usage in both communities. Overall, this course thus aims at introducing and discussing cognitive and computational resources and empirical studies in order to understand the role and application of abstractness in large-scale data-driven models.

Additional information:








The Logical Form of Lexical Semantics

Area: Language and Logic (LaLo)

Level: Introductory

Week 1 (click here for course time and room)

Lecturer(s): Itamar Kastner (University of Edinburgh)

Course website: https://blogs.ed.ac.uk/itamar/esslli/

Abstract:


It is common to say that the lexicon is the store of idiosyncratic information. This is true to some extent -- dog(x) is a different primitive than cat(x) -- but such a view sidesteps the many generalizations that work on lexical semantics has unearthed. Many verbs, in particular, differ not only in conceptual meaning but also in grammatical requirements: for example, we can "eat and eat all day" but not "??devour and devour all day". Building on a recent surge in empirical and formal work, this course will introduce students to a number of generalizations, discuss how they should be formalized, and make concrete a number of open questions, including:
- What are the most robust crosslinguistic generalizations regarding the interaction between lexicon and grammar?
- What formal tools can account for these?
- Is it possible to reach a constrained inventory of lexical semantic primitives?
- How can these claims be tested experimentally and modeled computationally?

Additional information:








When Semantics Meets Syntax

Area: Logic and Computation (LoCo)

Level: Introductory

Week 2 (click here for course time and room)

Lecturer(s): Phokion G. Kolaitis (University of California Santa Cruz and IBM Research) and Andreas Pieris (University of Cyprus and University of Edinburgh)

Course website: https://sites.google.com/ucsc.edu/esslli-2022/home

Abstract: The study of model theory typically proceeds from syntax to semantics, that is, the syntax of a logical formalism is introduced first and then the properties of the mathematical structures that satisfy sentences of that formalism are explored. There is, however, a mature and growing body of research in the reverse direction aiming, among other goals, to characterize definability in some logical formalism in terms of model-theoretic or structural properties. The origins of this line of work can be traced to Tarski's program whose main objective was to characterize metamathematical notions in "purely mathematical terms". This course will present a comprehensive overview of model-theoretic and structural characterizations of definability in first-order logic and in fragments of first-order logic that have found numerous applications to databases and related areas of computer science.

Additional information:








Hands-on Distributional Semantics for Linguistics using R

Area: Language and Computation (LaCo)

Level: Foundational

Week 1 (click here for course time and room)

Lecturer(s): Stephanie Evert (FAU Erlangen-Nürnberg) and Gabriella Lapesa (IMS, U Stuttgart)

Course website: http://wordspace.collocations.de/doku.php/course:esslli2021:start

Abstract:

Distributional semantic models (DSMs) – also known as “word space”, “distributional similarity”, or more recently “word embeddings” – are based on the assumption that the meaning of a word can (at least to a certain extent) be inferred from its usage, i.e. its distribution in text. Therefore, these models dynamically build semantic representations – in the form of high-dimensional vector spaces – through a statistical analysis of the contexts in which words occur. DSMs are a promising technique for solving the lexical acquisition bottleneck by unsupervised learning, and their distributed representation provides a cognitively plausible, robust and flexible architecture for the organisation and processing of semantic information.
In this introductory course we will highlight the interdisciplinary potential of DSMs beyond standard semantic similarity tasks; our overview will put a strong focus on cognitive modeling and theoretical linguistics. This course aims to equip participants with the background knowledge and skills needed to build different kinds of DSM representations – from traditional “count” models to neural word embeddings – and apply them to a wide range of tasks. The hands-on sessions will be conducted in R with the user-friendly “wordspace” package.
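The hands-on sessions use R and the “wordspace” package; purely as an illustration of the underlying “count model” idea, here is a toy Python sketch that builds sentence-level co-occurrence vectors and compares words by cosine similarity (corpus and names invented).

```python
# Toy "count" DSM in Python: each word is represented by its co-occurrence
# counts with the other words of its sentences; similarity is cosine.
# Corpus invented for illustration; the course itself works in R.

import math
from collections import Counter

corpus = [
    "the cat chased the mouse",
    "the dog chased the cat",
    "the mouse ate the cheese",
    "the dog ate the bone",
]

def build_vectors(sents):
    vecs = {}
    for sent in sents:
        toks = sent.split()
        for i, w in enumerate(toks):
            # context = every other token of the same sentence
            vecs.setdefault(w, Counter()).update(toks[:i] + toks[i+1:])
    return vecs

def cosine(u, v):
    dot = sum(u[d] * v[d] for d in set(u) | set(v))
    norm = math.sqrt(sum(x * x for x in u.values())) \
         * math.sqrt(sum(x * x for x in v.values()))
    return dot / norm

vecs = build_vectors(corpus)
sim_cat_dog = cosine(vecs["cat"], vecs["dog"])
sim_cat_cheese = cosine(vecs["cat"], vecs["cheese"])
```

Even on this four-sentence corpus, "cat" ends up closer to "dog" (both animate chasers/eaters) than to "cheese".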

Additional information:








Computational Approaches to the Explanation of Universal Properties of Meaning

Area: Language and Computation (LaCo)

Level: Advanced

Week 2 (click here for course time and room)

Lecturer(s): Fausto Carcassi and Jakub Szymanik

Course website: https://thelogicalgrammar.github.io/ESSLLI22_langevo/intro.html#

Abstract:

One of the most successful programs in semantics has been the identification of meaning universals. Recently, there has been a surge in research combining semantic and cognitive science to explain the origins of such universals. We will introduce this current and theoretically rich debate, taking this opportunity to teach computational methods to study language and cognition, e.g., learning models, minimal-description length, or information theory. We will start with an overview of the linguistic debates about universals. Then, we will focus on the universals that evolve from increasing learnability and reducing complexity. We will discuss iterated learning as a mechanism connecting learnability and language-level patterns. Next, we will present the pressure towards languages that are optimized for communication. Finally, we will discuss the universals emerging as a tradeoff between these pressures. Throughout, we will use convexity in nominal semantics and the universals of quantification as case studies.

Additional information:








Complexity of Reasoning in Kleene and Action Algebras

Area: Logic and Computation (LoCo)

Level: Advanced

Week 2 (click here for course time and room)

Lecturer(s): Stepan Kuznetsov (FoLLI)

Course website: https://skuzn.github.io/esslli2022/

Abstract:

Iteration, or Kleene star, is one of the most mysterious algebraic operations that appear in computer science. Even simple theories involving this operation turn out to enjoy properties that are more usual for strong systems with induction (like first-order arithmetic), and thus high algorithmic complexity. In this course, we survey complexity results for theories of algebras with the Kleene star. We start with classical results by Kozen (1994, 2002) on the complexity of equational and Horn theories for Kleene algebras (but with modern proofs, which utilize circular proof systems by Das and Pous), and then head to recent results on action algebras, that is, Kleene algebras with residuals, by Buszkowski and Palka (2007) and the author (2019--2020). A highlight of this course is the use of Kleene algebras for modelling infinite computations, dual to subexponential modalities in linear logic.
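As a reminder of the operation at issue, a toy Python sketch (illustrative only): the Kleene star of a language is its closure under concatenation, truncated here to a length bound so that the in-general-infinite result stays finite.

```python
# Kleene star on formal languages, truncated at a length bound so that the
# in-general-infinite closure stays finite (toy illustration).

def star(lang, max_len):
    result = {""}          # the empty word is always in L*
    frontier = {""}
    while frontier:
        nxt = set()
        for w in frontier:
            for u in lang:
                wu = w + u
                if len(wu) <= max_len and wu not in result:
                    nxt.add(wu)
        result |= nxt
        frontier = nxt
    return result

ab_star = star({"ab"}, 6)   # {"", "ab", "abab", "ababab"}
```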

Additional information:








Causal Models in Linguistics, Philosophy, and Psychology

Area: Language and Logic (LaLo)

Level: Foundational

Week 2 (click here for course time and room)

Lecturer(s): Daniel Lassiter and Thomas Icard

Course website: https://www.dropbox.com/sh/342nf3y7dhjgu0g/AABAReBjm6mNekvVQJ7f0cgHa?dl=0

Abstract:

This course explains and motivates formal models of causation built around Bayes nets and structural equation models, a topic of increasing interest across multiple cognitive science fields, and describes their application to select problems in psychology, philosophy, and linguistics.
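A minimal sketch of the structural-equation idea, in Python (the standard rain/sprinkler toy example, not taken from the course materials): each variable is a function of its causes, and an intervention do(sprinkler = on) overrides the sprinkler's own equation.

```python
# Toy structural equation model with an intervention (standard
# rain/sprinkler example, not from the course materials).

def scm(u_rain, do_sprinkler=None):
    rain = u_rain                       # exogenous input
    # sprinkler normally reacts to rain, unless an intervention overrides it
    sprinkler = (not rain) if do_sprinkler is None else do_sprinkler
    wet = rain or sprinkler             # either cause suffices
    return {"rain": rain, "sprinkler": bool(sprinkler), "wet": wet}

observed = scm(u_rain=True)                        # rain on, sprinkler off
intervened = scm(u_rain=False, do_sprinkler=True)  # do(sprinkler = on)
```

Intervening on the sprinkler makes the grass wet without telling us anything about rain -- the difference between observing and doing.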

Additional information:








Languages and Logics of Time: Priorean Perspectives

Area: Language and Logic (LaLo)

Level: Introductory

Week 2 (click here for course time and room)

Lecturer(s): Patrick Blackburn (Roskilde University) and Antje Rumberg (Aarhus University)

Course website:

Abstract:

In this course we will introduce and motivate a variety of languages and logics of time from a historical and problem-oriented perspective. Our discussions will be grounded in the pioneering work of Arthur Prior, the founding father of tense logic, and we will explore the ways his ideas have been refined and reformed in later developments. The journey will take us from philosophical and linguistic questions concerning time and tense, temporal reference, the open future, change and aspectual categories, as well as the existence of individuals in time to contemporary formulations of basic tense logic, hybrid tense logic, branching time logic, event logic, description logic and first-order temporal logic. The course is interdisciplinary and only basic knowledge of elementary set theory, propositional and first-order logic is required.
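To fix ideas, Prior's past and future operators can be evaluated over discrete linear time in a few lines (a toy sketch with invented names): Pq holds now iff q held at some earlier instant, Fq iff q will hold at some later one.

```python
# Toy Priorean tense logic over discrete linear time (illustrative):
# an atom is true at certain instants; P means "at some earlier instant",
# F means "at some later instant".

rain_at = {1, 3}    # instants at which "it rains" is true

def P(atom_times, now):
    return any(t < now for t in atom_times)   # past operator

def F(atom_times, now):
    return any(t > now for t in atom_times)   # future operator

past_rain = P(rain_at, 2)     # rain at instant 1, so P(rain) holds at 2
future_rain = F(rain_at, 4)   # no rain after instant 4
```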

Additional information:








From Axioms to Rules: The Factory of Modal Proof Systems

Area: Logic and Computation (LoCo)

Level: Introductory

Week 2 (click here for course time and room)

Lecturer(s): Sonia Marin and Lutz Straßburger

Course website: https://www.lix.polytechnique.fr/~lutz/orgs/ESSLLI2022-course.html

Abstract:

Modal logics have many applications in computer science and linguistics. There are many different kinds of modal logics, determined by the postulated axioms. Recent advances in proof-theoretical research have uncovered various ways of designing sound and complete proof systems for modal logics, in such a way that the axioms can be directly translated into proof rules.

The goal of this course is to acquaint students with these methods. The course is intended to be introductory. We will introduce various proof calculi: sequent calculus, labelled sequent calculus, and nested sequent calculus. We will also introduce the concept of focusing, and show how it can be used for the construction of efficient proof systems for various modal logics.
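As a standard example of the axioms-to-rules idea (a textbook case, not specific to this course's materials): the K axiom $\Box(A \to B) \to (\Box A \to \Box B)$, together with necessitation, corresponds in the sequent calculus for the basic modal logic K to the rule

```latex
\frac{A_1, \ldots, A_n \vdash B}{\Box A_1, \ldots, \Box A_n \vdash \Box B}\;(\mathrm{K})
```

which boxes every formula in a sequent at once.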

Additional information:








Cross-Linguistic Semantics: Methodological Advances

Area: Language and Logic (LaLo)

Level: Foundational

Week 2 (click here for course time and room)

Lecturer(s): Bert Le Bruyn (Utrecht University) and Henriette de Swart (Utrecht University)

Course website: https://www.dropbox.com/sh/060kb7y9dvjko8q/AACXpRN4HQegz9EWO0YdRiQDa?dl=0

Abstract: In this course, we present and discuss the methodologies that have been used in cross-linguistic semantics over the past two decades. One of the fundamental issues we are concerned with is the balance between data and theory: how much should we allow a theory based on one (set of) language(s) to guide our analysis of another (set of) language(s)? Until recently, there was no other way to build a theory of language than by moving language by language and verifying/falsifying hypotheses based on previously studied languages. We argue that Translation Mining -- a new take on parallel corpus research -- allows us to proceed in a more data-driven fashion and to question the status quo in how we make theoretical progress in cross-linguistic semantics.

Additional information:








Questions in Logic

Area: Language and Logic (LaLo)

Level: Introductory

Week 1 (click here for course time and room)

Lecturer(s): Ivano Ciardelli (LMU Munich)

Course website: http://www.ivanociardelli.altervista.org/esslli22

Abstract:


Logic is concerned with relations between sentences that hold in virtue of their logical form, such as entailment and consistency. Traditionally, however, logic has focused on a special class of sentences, namely, statements---sentences which can be true or false. The course makes a case for extending logic beyond statements to encompass also questions, and describes how such an extension can be achieved in the framework of inquisitive logic. We will see that once logic is generalized to questions, interesting logical notions such as answerhood and dependency emerge as facets of the fundamental notion of entailment, and can thereby be analyzed by using the logician's toolkit of model-theoretic constructions and proof systems. In addition to motivating the enterprise and laying out the conceptual framework in detail, we will also see how classical propositional and predicate logic can be made inquisitive, i.e., enriched with questions, and what the resulting logics look like in terms of meta-theoretic properties and proof systems.
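A toy rendering of the basic move (our own sketch, not the course's formal system): treat an information state as a set of worlds; a statement p is supported when p holds throughout the state, and the polar question ?p is supported when the state settles p one way or the other.

```python
# Toy inquisitive-semantics check (illustrative sketch): worlds are
# valuations, an information state is a set of worlds, and the polar
# question ?p is supported by a state iff it settles p either way.

w_p    = {"p": True}    # a world where p holds
w_notp = {"p": False}   # a world where p fails

def supports_atom(state, atom, value):
    return all(w[atom] == value for w in state)

def supports_question(state, atom):
    # ?p is supported iff the state establishes p or establishes not-p.
    return supports_atom(state, atom, True) or supports_atom(state, atom, False)

settled = supports_question([w_p], "p")            # {w_p} settles ?p
unsettled = supports_question([w_p, w_notp], "p")  # the ignorant state does not
```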

Additional information:








Defeasible Reasoning for Ontologies

Area: Logic and Computation (LoCo)

Level: Advanced

Week 2 (click here for course time and room)

Lecturer(s): Ivan Varzinczak and Iliana M Petrova

Course website:

Abstract:

This course aims at providing an introduction to reasoning defeasibly over description logic ontologies in the context of knowledge representation and reasoning in AI. Description Logics (DLs) are a family of logic-based knowledge representation formalisms with appealing computational properties and a variety of applications.
The different DLs proposed in the literature provide a wide choice of constructors in the object language. However, these are intended to represent only classical, monotonic knowledge, and are therefore unable to express the different aspects of uncertainty and vagueness that often show up in everyday life such as typicality and exceptions.

The goal of this course is two-fold: (i) to provide an overview of the development of non-monotonic approaches to DLs over the past 25 years, pointing out the difficulties that arise when naively transposing the traditional propositional approaches to the DL case, and (ii) to present the latest results in the area.

Additional information:








Compositional Models of Vector-based Semantics: From Theory to Tractable Implementation

Area: Language and Computation (LaCo)

Level: Advanced

Week 1 (click here for course time and room)

Lecturer(s): Gijs Wijnholds and Michael Moortgat

Course website: https://compositioncalculus.sites.uu.nl/course

Abstract:

Vector-based compositional architectures combine a distributional view of word meanings with a modelling of the syntax-semantics interface as a structure-preserving map relating syntactic categories (types) and derivations to their counterparts in a corresponding meaning algebra. This design is theoretically attractive, but faces challenges when it comes to large-scale practical applications. First, there is the curse of dimensionality resulting from the fact that semantic spaces directly reflect the complexity of the types of the syntactic front end. Secondly, modelling the meaning algebra in terms of finite-dimensional vector spaces and linear maps means that vital information encoded in syntactic derivations is lost in translation. The course compares and evaluates methods that have been proposed to face these challenges. Participants will gain a thorough understanding of the theoretical and practical issues involved, and acquire hands-on experience with a set of user-friendly tools and resources.
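The curse of dimensionality can be made concrete with a back-of-the-envelope Python sketch. The numbers below are illustrative assumptions (an embedding dimension of 300, and a sentence space identified with the noun space), not figures from the course: in tensor-based models a word's semantic space is determined by its syntactic type, so parameter counts grow exponentially with the order of the type.

```python
# Hypothetical illustration: parameter counts for tensor-based word meanings,
# assuming a noun space and sentence space of the same dimension d.
d = 300  # an illustrative embedding dimension

types = {
    "noun (n)":           d,          # vector in R^d
    "adjective (n -> n)": d * d,      # matrix: maps noun vectors to noun vectors
    "intransitive verb":  d * d,      # maps a subject vector into the sentence space
    "transitive verb":    d ** 3,     # order-3 tensor: two nominal arguments
    "ditransitive verb":  d ** 4,     # order-4 tensor: three nominal arguments
}

for t, params in types.items():
    print(f"{t:22s} {params:>18,d} parameters")
```

At d = 300 a transitive verb already needs 27 million parameters and a ditransitive verb 8.1 billion, which is why practical implementations must compress or restructure these spaces.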

Additional information:








Natural Language Reasoning with a Natural Theorem Prover

Area: Language and Computation (LaCo)

Level: Advanced

Week 1 (click here for course time and room)

Lecturer(s): Lasha Abzianidze

Course website: https://naturallogic.pro/Teaching/esslli22/

Abstract:

Natural language reasoning is a complex task that requires understanding the meanings of natural language expressions and recognizing semantic relations between them. The course is built around this highly interdisciplinary task. It introduces a computational theory of reasoning, called Natural Tableau, that combines the idea of a Natural Logic (logic using natural language as its vehicle of reasoning) with the semantic tableau method (a proof calculus that searches for certain situations).
The course not only introduces the theory of Natural Tableau but also shows its practical applications. In particular, we will show how an automated theorem prover based on Natural Tableau, called LangPro, is applied to Recognizing Textual Entailment (RTE) benchmarks. To overcome knowledge sparsity and boost its performance, LangPro uses abductive reasoning to learn lexical relations from RTE training data.
Moreover, we will also demonstrate how the prover can be extended to languages other than English, namely Dutch.
During the course, attendees will also have the opportunity to run LangPro on RTE problems and examine human-readable proofs.

Additional information:








Multimodal Interaction in Dialogue and its Meaning

Area: Language and Computation (LaCo)

Level: Introductory

Week 2 (click here for course time and room)

Lecturer(s): Jonathan Ginzburg (Laboratoire de Linguistique Formelle-CNRS at the Université Paris Cité) and Andy Lücking (Laboratoire de Linguistique Formelle-CNRS at the Université Paris Cité)

Course website: https://aluecking.github.io/ESSLLI2022/

Abstract:

After a brief introduction to the formal frameworks considered, we will discuss various multimodal phenomena (including pointing, laughter, and headshakes), motivating certain formal devices needed for their description, and finally consider what this entails for the interface of semantics with theories of memory, emotion, and rapport.

Additional information:








Logics for Social Choice Theory

Area: Logic and Computation (LoCo)

Level: Foundational

Week 2 (click here for course time and room)

Lecturer(s): Eric Pacuit, University of Maryland

Course website: https://pacuit.org/esslli2022/logics-social-choice/

Abstract:

There is a long tradition of fruitful interaction between logic and social choice theory. Many different logical systems have been developed that can formalize results in social choice theory. In recent years, much of the research on logic and social choice has focused on computer-aided methods such as SAT solving and interactive theorem proving. This course will be a broad overview of recent work on logic and social choice theory. The course will start by introducing key results in social choice theory (e.g., May's Theorem, Arrow's Theorem, the Gibbard-Satterthwaite Theorem) and the different logical systems that have been developed to formalize these results. The second part of the course will show how SAT solvers have been used to discover new results in the study of voting methods. Finally, the last part of the course will be a brief introduction to the Lean interactive proof assistant and how to use Lean to formalize results in social choice theory.
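The computer-aided style of verification the course describes can be previewed with a small, hedged illustration (a naive brute-force check, not the SAT encodings covered in the course): with two alternatives, two of the conditions of May's Theorem can be verified for majority rule by enumerating every 3-voter profile.

```python
from itertools import permutations, product

# Each of n voters casts +1 ("prefers a to b") or -1 ("prefers b to a").
n = 3
profiles = list(product([+1, -1], repeat=n))

def majority(profile):
    """Majority rule: the sign of the ballot sum (0 would encode a tie)."""
    s = sum(profile)
    return (s > 0) - (s < 0)

# Anonymity: the outcome is invariant under permuting the voters.
anonymous = all(majority(q) == majority(p)
                for p in profiles for q in permutations(p))

# Neutrality: swapping the names of the alternatives flips the outcome.
neutral = all(majority(tuple(-b for b in p)) == -majority(p)
              for p in profiles)

print(anonymous, neutral)  # True True
```

SAT-based work in this area follows the same logic at scale: the conditions become propositional constraints, and a solver searches the space of all voting rules rather than checking a single one.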

Additional information:








Canonical Models in First-Order Nonclassical Logics

Area: Logic and Computation (LoCo)

Level: Advanced

Week 2 (click here for course time and room)

Lecturer(s): Shay Logan and Andrew Tedder

Course website:

Abstract:

We survey methods for proving completeness for first-order relevant logics, including those for the classes of models defined by Kit Fine, on the one hand, and by Edwin Mares and Robert Goldblatt, on the other. We contrast these methods, discuss the philosophical upshots of the various model structures, and consider some extensions, such as the addition of identity. We will also discuss a selection of open problems in the area, and seek to equip students with the expertise to engage with these and related issues.

Additional information:








Creating and Maintaining High-quality Language Resources

Area: Language and Computation (LaCo)

Level: Introductory

Week 2 (click here for course time and room)

Lecturer(s): Valerio Basile, University of Turin

Course website: https://drive.google.com/drive/folders/1qItQOcKq5x9k3MccdgbsJ6Gpk2gI4l4Y?usp=sharing

Abstract: Language Resources (LRs) play a key role in Natural Language Processing (NLP). Creating and maintaining corpora and lexica is a complex task that requires following several methodological steps to ensure the quality of the resources employed for training and benchmarking NLP models. The path to a successful LR is filled with critical design decisions, each impacting the final outcome, and with potential pitfalls. This course will provide an overview of the methodologies involved in designing annotation schemes, performing robust annotation, and representing and sharing the annotated data. Throughout the course, an LR will be created collectively by the students by means of hands-on, interactive examples.

Additional information:

The course will cover the following topics and activities:

  • Day 1: introduction to language annotation; annotation schemes; annotators.
  • Day 2: representation and formats for natural language data; software and online platforms for annotation.
  • Day 3: agreement and harmonization.
  • Day 4: metadata; legal and ethical considerations.
  • Day 5: informative disagreement, polarization, and annotator perspectives.

Interactive practical sessions (if time permits) and homework assignments will make use of Google Colab.

All the course material will be shared through this Google Drive directory.








Multimodal Semantics for Affordances and Actions

Area: Language and Computation (LaCo)

Level: Introductory

Week 1 (click here for course time and room)

Lecturer(s): James Pustejovsky and Nikhil Krishnaswamy

Course website: https://voxml.github.io/voxicon/blog/esslli-2022-course/

Abstract:

This course introduces the requirements and challenges involved in developing a multimodal semantics for human-computer and human-robot interactions. Unlike unimodal interactive agents (e.g., text-based chatbots or voice-based personal digital assistants), multimodal HCI and HRI inherently require a notion of embodiment, or an understanding of the agent's placement within the environment and that of its interlocutor.

This requires not only the robust recognition and generation of expressions through multiple modalities (language, gesture, vision, action), but also the encoding of situated meaning: (a) the situated grounding of expressions in context; (b) an interpretation of the expression contextualized to the dynamics of the discourse; and (c) an appreciation of the actions and consequences associated with objects in the environment.
This in turn impacts how we computationally model human-human communicative interactions, with particular relevance to the shared understanding of affordances and actions over objects.

Additional information:








Relating Structure to Power: An Invitation to Game Comonads

Area: Logic and Computation (LoCo)

Level: Advanced

Week 1 (click here for course time and room)

Lecturer(s): Tomas Jakl (University of Cambridge) and Luca Reggio (University College London)

Course website: https://tomas.jakl.one/teaching/2022-su-game-comonads

Abstract:

There is a remarkable divide in the field of Logic in Computer Science between two distinct strands: one focussing on compositionality and semantics ("Structure"), and the other on expressiveness and complexity ("Power"). It is remarkable because these two fundamental aspects are investigated using almost disjoint methods by almost disjoint research communities.

This course will introduce the exciting and emerging theory of game comonads, recently put forward by Abramsky, Dawar, and their collaborators. Game comonads offer a novel approach to relating categorical semantics, which exemplifies Structure, to finite model theory, which exemplifies Power. We will develop their basic theory and illustrate how they provide a structural and intrinsic account of several concrete notions that play a central role in finite model theory. These range from equivalences with respect to logic fragments (e.g., finite-variable, bounded quantifier-rank, and bounded modal-depth fragments), to combinatorial parameters of structures (e.g., tree-width, tree-depth, and synchronization-tree depth), and model comparison games (e.g., pebble, Ehrenfeucht-Fraissé, and bisimulation games).

Additional information:

Click here to open PDF document








Argument and Logical Analysis in Humans and Machines

Area: Language and Computation (LaCo)

Level: Advanced

Week 2 (click here for course time and room)

Lecturer(s): Kyle Richardson and Gregor Betz

Course website:

Abstract:

This is intended to be an advanced course that focuses on applied argumentation analysis both in humans and machines. More specifically, the goal is to cover the basics of normative models of text understanding and argumentation models as ordinarily studied in philosophy and to outline recent attempts to model argumentation dynamics in machines using state-of-the-art neural NLP (with a focus on recent pre-trained transformers).

Additional information:








Logic & Probability

Area: Logic and Computation (LoCo)

Level: Introductory

Week 1 (click here for course time and room)

Lecturer(s): Thomas Icard (Stanford University) and Krzysztof Mierzewski (Carnegie Mellon University)

Course website: https://web.stanford.edu/~icard/esslli2022.html

Abstract:

In this course we propose to present and discuss some of the most important ideas and results at the various points of contact between logic and probability. Topics include qualitative and comparative probability, probability logics and probabilistic logics, non-monotonic reasoning and acceptance rules, logical foundations of probabilistic programming, and zero-one laws in logic. Applications to core ESSLLI areas (computer science, linguistics, philosophical logic) will be emphasized.

Additional information:








Conditionals in Decision and Game Theory

Area: Language and Logic (LaLo)

Level: Advanced

Week 2 (click here for course time and room)

Lecturer(s): Ilaria Canavotto and Eric Pacuit

Course website:

Abstract:

Reasoning about conditionals, such as "If I do action a, the outcome will be c", plays an important role when studying the foundations of decision and game theory. There are two objectives for this course. The first objective is to introduce the most prominent semantics for conditionals from the logic, philosophy and formal semantics literatures. The second objective is to introduce decision and game theory with a special emphasis on the puzzles and paradoxes that involve reasoning about conditionals (such as Newcomb's paradox, certain analyses of the Prisoner's Dilemma, and the Aumann-Stalnaker debate about backward induction). This course will provide a solid foundation for students interested in studying the semantics of conditionals and for students interested in using ideas from decision and game theory in their own field of study.

Additional information:








Explainability in Integrated Cognitive Systems Combining Logic-based Reasoning and Data-driven Learning

Area: Logic and Computation (LoCo)

Level: Advanced

Week 1 (click here for course time and room)

Lecturer(s): Mohan Sridharan, University of Birmingham, UK

Course website: https://www.cs.bham.ac.uk/~sridharm/Teaching/esslli22.html

Abstract:

This advanced course seeks to bring participants to the state of the art in explainable reasoning and learning in integrated cognitive systems that use a combination of knowledge-based reasoning and data-driven learning for automated decision-making. In particular, we will explore the class of such systems that sense and interact with the physical world, use non-monotonic logic to reason with incomplete commonsense domain knowledge, and use machine/deep learning methods to learn from experience. We will discuss how the interplay between representation, reasoning, and learning can be exploited in these systems for reliable decision-making, and to provide relational descriptions as explanations of decisions and beliefs. All related concepts will be illustrated using simple examples drawn from computer vision and robotics, with software agents and physical robots assisting humans in dynamic domains.

Additional information:








Negative Events and Truthmaker Semantics

Area: Language and Logic (LaLo)

Level: Advanced

Week 1 (click here for course time and room)

Lecturer(s): Timothée Bernard and Justin Bledin

Course website:

Abstract:

This course introduces the conceptual and logical foundations of negative events. We review the arguments raised in the linguistics and philosophy literature for and against them. We contrast events and situations, and relate them to exact and inexact verifiers in truthmaker semantics. We present the model-theoretic properties of a family of negation-like operations based on the notion of incompatibility, and then develop a formalization of negative events compatible with standard treatments of time and modality. Applications to negative perception reports and other phenomena are considered. This course together with "Negative individuals and truthmaker semantics" forms a unit on negative ontology in formal semantics, though each course can be taken independently of the other.

Additional information:








Theory-Building in Higher Order Languages

Area: Language and Logic (LaLo)

Level: Advanced

Week 1 (click here for course time and room)

Lecturer(s): Jeremy Goodman and Cian Dorr

Course website:

Abstract:

This course will introduce higher-order logic as a language in which to rigorously theorize about modality, propositional attitudes, and the semantics of natural language. Unlike more familiar applications of higher-order languages in the foundations of mathematics, the aforementioned applications call for a non-extensional interpretation of higher-order quantification. This in turn raises hard questions about the "granularity" of higher-order reality, which have been the subject of a burgeoning literature in contemporary metaphysics and philosophical logic. The course will disseminate some of this recent work in a way that highlights issues that will be of interest to the ESSLLI audience. These include new technical results (both model constructions and inconsistency results) for "fine-grained" theories, as well as applications to natural language semantics and to the semantic paradoxes. The course will stress the importance of combining fine-grained theories of sentence-meanings with corresponding accounts of sub-sentential-expression meanings, and of respecting limitative results like the Russell-Myhill paradox, Russell's paradox, and Tarski's theorem (relations between which will also be explored).

Additional information:








Computational Models of Grounding in Dialogue

Area: Language and Computation (LaCo)

Level: Advanced

Week 1 (click here for course time and room)

Lecturer(s): David Traum (University of Southern California)

Course website: https://people.ict.usc.edu/~traum/ESSLLI2022/

Abstract:

Grounding is the process by which participants in a conversation establish new common ground. It involves not just the transmission of declarative utterances, but also inferential and feedback processes. Grounding is also of critical importance to artificial dialogue systems, which face the additional challenges of imperfect input recognition and limited ontologies and inferential ability. In this course we will review models and uses of common ground in pragmatics and computational agent theories, and then examine a variety of proposals for how common ground can be established. These proposals include both descriptive analyses of behavior and generative models that computational systems engaged in dialogue can use to decide on next moves. We will also look at multimodal grounding and advanced topics, including multiparty grounding, incremental grounding, and degrees of grounding, as well as how grounding models have been used to study other social phenomena.

Advanced Course for Language and Computation Area (this will be an updated version of an ESSLLI course last taught in 2015)

Additional information:








Introduction to Answer Set Programming, Extensions and Applications

Area: Logic and Computation (LoCo)

Level: Introductory

Week 2 (click here for course time and room)

Lecturer(s): Alessandra Mileo, Dublin City University/Insight Centre for Data Analytics (Dublin, IE)

Course website:

Abstract: This course will provide an introduction to Answer Set Programming (ASP), a paradigm for declarative problem solving based on the stable model semantics of Logic Programming. The success of ASP for Knowledge Representation and Reasoning, and the presence of a growing research community around it, are due to the availability of efficient answer set solvers, the main ones being CLASP and DLV. The expressive power of ASP rules goes beyond propositional clauses, and as a result a growing number of applications of ASP have emerged in recent years. Numerous extensions of ASP have been proposed since its first formulation in the early 90s, including but not limited to reasoning in dynamic environments and probabilistic logic reasoning. More recently, the resurgence of neuro-symbolic computing has seen several approaches in which ASP and Machine Learning are combined. This course will cover the basics of ASP's theoretical background as well as some of its key extensions, applications, and tools.
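The stable model semantics underlying ASP can be illustrated with a hedged sketch in Python (a naive enumeration for a two-atom program, nothing like the algorithms inside CLASP or DLV): a candidate interpretation is stable iff it is the least model of its Gelfond-Lifschitz reduct.

```python
from itertools import chain, combinations

# A toy normal logic program, one rule per tuple: (head, positive body, negative body).
# It encodes a choice between p and q:
#   p :- not q.
#   q :- not p.
program = [("p", [], ["q"]),
           ("q", [], ["p"])]
atoms = {"p", "q"}

def minimal_model(positive_rules):
    """Least model of a negation-free program, by fixpoint iteration."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos, _ in positive_rules:
            if all(b in model for b in pos) and head not in model:
                model.add(head)
                changed = True
    return model

def is_stable(candidate):
    """Gelfond-Lifschitz check: delete rules whose negative body is violated,
    drop the remaining negations, and compare the reduct's least model."""
    reduct = [(h, pos, []) for h, pos, neg in program
              if not any(a in candidate for a in neg)]
    return minimal_model(reduct) == candidate

subsets = chain.from_iterable(combinations(sorted(atoms), r)
                              for r in range(len(atoms) + 1))
stable = [set(s) for s in subsets if is_stable(set(s))]
print(stable)  # [{'p'}, {'q'}]
```

The program has exactly two answer sets, {p} and {q}, which is how ASP expresses non-deterministic choice; real solvers find such models by conflict-driven search rather than enumeration.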

Additional information:

Here’s a list of resources for my ESSLLI course:

My course slides will be available via a Google Drive folder








Free Choice: Theoretical and Experimental Perspectives

Area: Language and Logic (LaLo)

Level: Advanced

Week 2 (click here for course time and room)

Lecturer(s): Jacopo Romoli (HHU) and Melissa Fusco (Columbia)

Course website: https://drive.google.com/drive/u/1/folders/1DksMwA3L7Tk7wAi5GsAkpwlVSHguOI2Y

Abstract:

Disjunctions in the scope of possibility modals give rise to a conjunctive inference, generally referred to as ‘Free choice.’ For example, (1) suggests that Angie can take Spanish and can take Calculus (and hence that she can ‘choose’ between the two). This inference is problematic, since it is not validated by a classical semantics for modals, in combination with a Boolean analysis of disjunction. To complicate things further, free choice tends to disappear under negation: (2) doesn’t merely suggest that Angie can’t choose, but rather that she can take neither Spanish nor Calculus. This second effect is sometimes referred to as ‘Dual prohibition.’

(1) Angie can take Spanish or Calculus.
    ⇝ Angie can choose between the two. (Free choice)

(2) Angie cannot take Spanish or Calculus.
    ⇝ Angie can take neither of the two. (Dual prohibition)

The Free choice-Dual prohibition pattern has sparked an industry of theories in philosophy of language and formal semantics/pragmatics since the seventies. A theory of this pattern not only has to derive free choice in positive contexts and dual prohibition in negative ones; it should also answer questions about these readings such as: are they part of the semantics of sentences like the above or do they arise as extra inferences? And what is the status of these readings? Are they at-issue or not at-issue meanings? How do they interact with other aspects of meaning?
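Why free choice is not classically valid can be shown with a toy countermodel, sketched here in Python purely for illustration (not course material): treat possibility as truth at some accessible world, and build a model where only Spanish is ever available.

```python
# A hypothetical Kripke-style countermodel: each accessible world records
# which courses Angie takes there. In both accessible worlds she takes
# Spanish; Calculus is available nowhere.
accessible = [{"spanish"}, {"spanish"}]

def possibly(prop):
    """'Can' as classical possibility: true at some accessible world."""
    return any(prop(w) for w in accessible)

can_sp_or_ca = possibly(lambda w: "spanish" in w or "calculus" in w)
can_sp = possibly(lambda w: "spanish" in w)
can_ca = possibly(lambda w: "calculus" in w)

print(can_sp_or_ca)       # True:  (1) "Angie can take Spanish or Calculus" holds
print(can_sp and can_ca)  # False: the free-choice reading does not follow
```

The premise of (1) comes out true while the conjunctive inference fails, so any account of free choice must add something to the classical modal-plus-Boolean package.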

There are two main approaches in the literature, differing in particular as to whether they derive free choice as an implicature or not. We will outline the two approaches and their divergent predictions and explore how they fare against a range of experimental evidence in the literature from both adults and children. In addition, we will discuss the interaction between free choice and quantifiers, plurality, generics, and presuppositions. The goal of the course is to enable students to conduct their own experimental or theoretical research on this complex topic in semantics/pragmatics.

Schedule

  • Class 1. The problem and a sketch of the implicature approach 
  • Class 2. Some options on the semantics approach 
  • Class 3. Experimental evidence: the case of negative free choice 
  • Class 4. Free choice and presuppositions 
  • Class 5. Free choice, cancellation, and scope 







A Linguist's Guide to Neural Networks

Area: Language and Computation (LaCo)

Level: Foundational

Week 2 (click here for course time and room)

Lecturer(s): Tim Van de Cruys (KU Leuven)

Course website: http://www.ccl.kuleuven.be/Courses/esslli2022/

Abstract:

This course will provide an overview of the neural network paradigm in natural language processing, with a specific focus on linguistic applications. In recent years, neural network approaches have obtained strong performance on a wide range of different NLP tasks, using end-to-end neural architectures that do not rely on traditional, task-specific feature engineering. The course will provide an overview of the various neural architectures that exist, with a specific focus on current state-of-the-art transformer-based architectures, and the methods that are used to train them. Moreover, the course will pay specific attention to linguistic aspects, viz. what linguistic information might be implicitly present within these models, and how they can be used for linguistic analysis. Each session will consist of a theoretical lecture, followed by a hands-on practical session.

Additional information:








Formal Semantics of Natural Language

Area: Language and Logic (LaLo)

Level: Foundational

Week 1 (click here for course time and room)

Lecturer(s): Yoad Winter

Course website: https://www.phil.uu.nl/~yoad/esslli2022/esslli2022-course.html

Abstract:

This foundational course will introduce classical formal semantics of natural language, focusing on new directions and recent research. The first part of the course will cover semantic foundations including entailment, ambiguity, compositionality, types, models and basic lambda calculus. The second part will cover more advanced topics from recent work on presuppositions, plurals and events, focusing on empirical phenomena of foundational importance to compositionality: presupposition projection, distributive quantification, and modification across categories. The course is intended for students with basic mathematical and scientific background, but does not presuppose specific knowledge in logic or theoretical linguistics. For students who are new to formal semantics, the course will serve as a general overview which emphasizes the scientific value of analyzing semantic phenomena using elegant and rigorously defined mathematical methods. For students with previous background, the course will be useful as a pointer to current research of major importance for the foundations of formal semantics.
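The basic machinery of the first part (models, types, lambda calculus) can be previewed with a hypothetical two-entity model, sketched in Python for illustration: predicates are functions from entities to truth values, and determiners are generalized quantifiers taking a restrictor and a scope.

```python
# A hypothetical toy model with two entities. Predicates: entity -> bool.
entities = ["ann", "bob"]

student = lambda x: x in {"ann", "bob"}
sleeps  = lambda x: x in {"ann", "bob"}
smokes  = lambda x: x == "ann"

# Determiners as generalized quantifiers: restrictor -> scope -> bool,
# mirroring the lambda-calculus types ((e -> t) -> (e -> t) -> t).
every = lambda restr: lambda scope: all(scope(x) for x in entities if restr(x))
some  = lambda restr: lambda scope: any(scope(x) for x in entities if restr(x))

print(every(student)(sleeps))  # True:  "every student sleeps"
print(every(student)(smokes))  # False: "every student smokes"
print(some(student)(smokes))   # True:  "some student smokes"
```

The curried functions mirror how meanings compose with syntax: each application step corresponds to combining a constituent with its sister node in the derivation.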

Additional information:


