
Publication


Featured research published by Mehrnoosh Sadrzadeh.


Journal of Logic and Computation | 2007

Epistemic Actions as Resources

Alexandru Baltag; Bob Coecke; Mehrnoosh Sadrzadeh

We provide algebraic semantics together with a sound and complete sequent calculus for information update due to epistemic actions. This semantics is flexible enough to accommodate incomplete as well as wrong information, e.g. due to secrecy and deceit, as well as nested knowledge. We give a purely algebraic treatment of the muddy children puzzle, which moreover extends to situations where the children are allowed to lie and cheat. Epistemic actions, that is, information exchanges between agents A, B, . . . ∈ A, are modeled as elements of a quantale. The quantale (Q, ∨, •) acts on an underlying Q-right module (M, ∨) of epistemic propositions and facts. The epistemic content is encoded by appearance maps, one pair f_A : M → M and f_A^Q : Q → Q of (lax) morphisms for each agent A ∈ A, which preserve the module and quantale structure respectively. By adjunction, they give rise to epistemic modalities [12], capturing the agents’ knowledge on propositions and actions. The module action is epistemic update and gives rise to dynamic modalities [21], cf. weakest precondition. This model subsumes the crucial fragment of Baltag, Moss and Solecki’s [6] dynamic epistemic logic, abstracting it in a constructive fashion while introducing resource-sensitive structure on the epistemic actions.
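Spelled out, the two adjunctions mentioned in the abstract take the following shape (a sketch; the modality symbols □_A and [q] are our notation for the epistemic and dynamic modalities, not necessarily the paper's):

```latex
\[
f_A(m) \le n \;\Longleftrightarrow\; m \le \Box_A\, n,
\qquad
m \cdot q \le n \;\Longleftrightarrow\; m \le [q]\, n .
\]
```

The left equivalence says agent A knows n at m exactly when A's appearance of m entails n; the right one recovers the weakest-precondition reading of the dynamic modality from the update action.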


arXiv: Computation and Language | 2011

Concrete sentence spaces for compositional distributional models of meaning

Edward Grefenstette; Mehrnoosh Sadrzadeh; Stephen Clark; Bob Coecke; Stephen Pulman

Coecke, Sadrzadeh, and Clark [3] developed a compositional model of meaning for distributional semantics, in which each word in a sentence has a meaning vector and the distributional meaning of the sentence is a function of the tensor products of the word vectors. Abstractly speaking, this function is the morphism corresponding to the grammatical structure of the sentence in the category of finite dimensional vector spaces. In this paper, we provide a concrete method for implementing this linear meaning map, by constructing a corpus-based vector space for the type of sentence. Our construction method is based on structured vector spaces whereby meaning vectors of all sentences, regardless of their grammatical structure, live in the same vector space. Our proposed sentence space is the tensor product of two noun spaces, in which the basis vectors are pairs of words each augmented with a grammatical role. This enables us to compare meanings of sentences by simply taking the inner product of their vectors.
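As a toy illustration of this construction (not the paper's implementation; the two-dimensional vectors below are made-up stand-ins for corpus-derived noun vectors), a sentence vector lives in the tensor product of two noun spaces and sentence meanings are compared by inner product:

```python
from itertools import product

def tensor(u, v):
    # Kronecker (tensor) product of two vectors, flattened to one list
    return [a * b for a, b in product(u, v)]

def inner(u, v):
    # inner product used to compare sentence meanings
    return sum(a * b for a, b in zip(u, v))

# hypothetical 2-dimensional noun vectors
dog, cat = [1.0, 0.0], [0.0, 1.0]

# sentence vectors in the noun ⊗ noun sentence space
s1 = tensor(dog, cat)   # e.g. "dogs chase cats"
s2 = tensor(cat, dog)   # e.g. "cats chase dogs"

print(inner(s1, s1))  # 1.0
print(inner(s1, s2))  # 0.0: orthogonal sentence meanings
```

Because all sentence vectors land in the same space, sentences with different grammatical structure remain directly comparable, which is the point of the construction.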


Empirical Methods in Natural Language Processing | 2014

Evaluating Neural Word Representations in Tensor-Based Compositional Settings

Dmitrijs Milajevs; Dimitri Kartsaklis; Mehrnoosh Sadrzadeh; Matthew Purver

We provide a comparative study between neural word representations and traditional vector spaces based on co-occurrence counts, in a number of compositional tasks. We use three different semantic spaces and implement seven tensor-based compositional models, which we then test (together with simpler additive and multiplicative approaches) in tasks involving verb disambiguation and sentence similarity. To check their scalability, we additionally evaluate the spaces using simple compositional methods on larger-scale tasks with less constrained language: paraphrase detection and dialogue act tagging. In the more constrained tasks, co-occurrence vectors are competitive, although choice of compositional method is important; on the larger-scale tasks, they are outperformed by neural word embeddings, which show robust, stable performance across the tasks.
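The "simpler additive and multiplicative approaches" mentioned above compose word vectors pointwise; a minimal sketch (the vectors are hypothetical, not drawn from the paper's semantic spaces):

```python
def add_compose(u, v):
    # additive composition: elementwise sum of the word vectors
    return [a + b for a, b in zip(u, v)]

def mult_compose(u, v):
    # multiplicative composition: elementwise (Hadamard) product
    return [a * b for a, b in zip(u, v)]

# hypothetical 3-dimensional word vectors
run = [0.5, 0.25, 0.0]
fast = [0.5, 0.0, 0.75]

print(add_compose(run, fast))   # [1.0, 0.25, 0.75]
print(mult_compose(run, fast))  # [0.25, 0.0, 0.0]
```

Unlike the tensor-based models, these operations ignore word order and grammatical structure, which is why they serve as baselines.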


Journal of Logic and Computation | 2013

The Frobenius anatomy of word meanings I: subject and object relative pronouns

Mehrnoosh Sadrzadeh; Stephen Clark; Bob Coecke

This paper develops a compositional vector-based semantics of subject and object relative pronouns within a categorical framework. Frobenius algebras are used to formalise the operations required to model the semantics of relative pronouns, including passing information between the relative clause and the modified noun phrase, as well as copying, combining, and discarding parts of the relative clause. We develop two instantiations of the abstract semantics, one based on a truth-theoretic approach and one based on corpus statistics.
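Concretely, over a vector space with a fixed basis, the Frobenius copying and combining operations amount to duplicating and pointwise-multiplying basis coordinates; a minimal sketch (function names and values are our own, for illustration only):

```python
def frobenius_copy(v):
    # Δ: e_i ↦ e_i ⊗ e_i — the vector becomes the diagonal of a matrix
    n = len(v)
    return [[v[i] if i == j else 0.0 for j in range(n)] for i in range(n)]

def frobenius_merge(u, v):
    # μ: e_i ⊗ e_i ↦ e_i — pointwise multiplication of coordinates
    return [a * b for a, b in zip(u, v)]

noun = [0.5, 0.25]
print(frobenius_copy(noun))               # [[0.5, 0.0], [0.0, 0.25]]
print(frobenius_merge(noun, [1.0, 2.0]))  # [0.5, 0.5]
```

It is this copy/merge pair that lets a relative pronoun pass the head noun's information into the relative clause and combine the result back with the noun.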


Annals of Pure and Applied Logic | 2013

Lambek vs. Lambek: Functorial Vector Space Semantics and String Diagrams for Lambek Calculus

Bob Coecke; Edward Grefenstette; Mehrnoosh Sadrzadeh

The Distributional Compositional Categorical (DisCoCat) model is a mathematical framework that provides compositional semantics for meanings of natural language sentences. It consists of a computational procedure for constructing meanings of sentences, given their grammatical structure in terms of compositional type-logic, and given the empirically derived meanings of their words. For the particular case that the meaning of words is modelled within a distributional vector space model, its experimental predictions, derived from real large scale data, have outperformed other empirically validated methods that could build vectors for a full sentence. This success can be attributed to a conceptually motivated mathematical underpinning, something which the other methods lack, by integrating qualitative compositional type-logic and quantitative modelling of meaning within a category-theoretic mathematical framework. The type-logic used in the DisCoCat model is Lambek's pregroup grammar. Pregroup types form a posetal compact closed category, which can be passed, in a functorial manner, on to the compact closed structure of vector spaces, linear maps and tensor product. The diagrammatic versions of the equational reasoning in compact closed categories can be interpreted as the flow of word meanings within sentences. Pregroups simplify Lambek's previous type-logic, the Lambek calculus. The latter and its extensions have been extensively used to formalise and reason about various linguistic phenomena. Hence, the apparent reliance of the DisCoCat on pregroups has been seen as a shortcoming. This paper addresses this concern, by pointing out that one may as well realise a functorial passage from the original type-logic of Lambek, a monoidal bi-closed category, to vector spaces, or to any other model of meaning organised within a monoidal bi-closed category.
The corresponding string diagram calculus, due to Baez and Stay, now depicts the flow of word meanings, and also reflects the structure of the parse trees of the Lambek calculus.
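For readers unfamiliar with pregroup reductions, here is a small stack-based sketch (our own encoding, not from the paper: a type is a list of (base, adjoint-order) pairs, and an adjacent pair x^(z) x^(z+1) contracts to the unit; this suffices for planar reductions such as a simple transitive sentence):

```python
def reduce_type(ts):
    # contract adjacent pairs (b, z)(b, z+1) -> 1, stack-style
    stack = []
    for base, z in ts:
        if stack and stack[-1] == (base, z - 1):
            stack.pop()  # contraction, e.g. n · n^r -> 1 or n^l · n -> 1
        else:
            stack.append((base, z))
    return stack

# "Alice likes Bob": n · (n^r s n^l) · n
words = [("n", 0), ("n", 1), ("s", 0), ("n", -1), ("n", 0)]
print(reduce_type(words))  # [('s', 0)]: the type reduces to a sentence
```

The string diagrams mentioned above depict exactly these contractions as "cups" wiring the noun types into the verb.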


Electronic Notes in Theoretical Computer Science | 2005

Algebra and Sequent Calculus for Epistemic Actions

Alexandru Baltag; Bob Coecke; Mehrnoosh Sadrzadeh

We introduce an algebraic approach to Dynamic Epistemic Logic. This approach has the advantage that: (i) its semantics is a transparent algebraic object with a minimal set of primitives from which most ingredients of Dynamic Epistemic Logic arise, (ii) it goes with the introduction of non-determinism, (iii) it naturally extends beyond boolean sets of propositions, up to intuitionistic and non-distributive situations, hence allowing to accommodate constructive computational, information-theoretic as well as non-classical physical settings, and (iv) introduces a structure on the actions, which now constitute a quantale. We also introduce a corresponding sequent calculus (which extends Lambek calculus), in which propositions, actions as well as agents appear as resources in a resource-sensitive dynamic-epistemic logic.
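For reference, a quantale as used above is a sup-lattice equipped with an associative multiplication that distributes over arbitrary joins on both sides (the standard definition, not specific to this paper):

```latex
\[
a \cdot \Big(\bigvee_i b_i\Big) = \bigvee_i \,(a \cdot b_i),
\qquad
\Big(\bigvee_i a_i\Big) \cdot b = \bigvee_i \,(a_i \cdot b).
\]
```

The join models non-deterministic choice between actions, and the multiplication models their sequential composition.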


arXiv: Computation and Language | 2014

A Study of Entanglement in a Categorical Framework of Natural Language

Dimitri Kartsaklis; Mehrnoosh Sadrzadeh

In both quantum mechanics and corpus linguistics based on vector spaces, the notion of entanglement provides a means for the various subsystems to communicate with each other. In this paper we examine a number of implementations of the categorical framework of Coecke et al. [4] for natural language, from an entanglement perspective. Specifically, our goal is to better understand in what way the level of entanglement of the relational tensors (or the lack of it) affects the compositional structures in practical situations. Our findings reveal that a number of proposals for verb construction lead to almost separable tensors, a fact that considerably simplifies the interactions between the words. We examine the ramifications of this fact, and we show that the use of Frobenius algebras mitigates the potential problems to a great extent. Finally, we briefly examine a machine learning method that creates verb tensors exhibiting a sufficient level of entanglement.
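In the smallest case, a verb tensor over two 2-dimensional noun spaces is a 2×2 matrix, and it is separable (non-entangled) exactly when it has rank 1, i.e. zero determinant; a toy check (our own example values, not the paper's data):

```python
def is_separable_2x2(m):
    # a 2x2 tensor factors as an outer product u ⊗ v iff its determinant is 0
    (a, b), (c, d) = m
    return abs(a * d - b * c) < 1e-9

separable = [[1.0, 2.0], [2.0, 4.0]]  # outer product of (1, 2) with (1, 2)
entangled = [[1.0, 0.0], [0.0, 1.0]]  # identity-like, rank 2

print(is_separable_2x2(separable))  # True
print(is_separable_2x2(entangled))  # False
```

When a verb tensor is (almost) separable, composing it with its subject and object degenerates: the subject and object no longer interact through the verb, which is the problem the abstract describes.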


Annals of Pure and Applied Logic | 2011

Algebraic semantics and model completeness for intuitionistic public announcement logic

Mehrnoosh Sadrzadeh; Alessandra Palmigiano

In this paper, we start studying epistemic updates using the standard toolkit of duality theory. We focus on public announcements. We give the dual characterization of the corresponding submodel-injection map, as a certain pseudo-quotient map between the complex algebras respectively associated with the given model and with its relativized submodel. The dual characterization we provide naturally generalizes to much wider classes of algebras, which include, but are not limited to, arbitrary BAOs and arbitrary modal expansions of Heyting algebras (HAOs). As an application, we axiomatize the intuitionistic analogue of PAL, which we refer to as IPAL, and prove soundness and completeness of IPAL w.r.t. both algebraic and relational models.


Journal of Logic and Computation | 2016

The Frobenius anatomy of word meanings II: possessive relative pronouns

Mehrnoosh Sadrzadeh; Stephen Clark; Bob Coecke

40 pages, Journal of Logic and Computation, Essays dedicated to Roy Dyckhoff on the occasion of his retirement, S. Graham-Lengrand and D. Galmiche (eds.), 2014


Meeting of the Association for Computational Linguistics | 2014

Resolving Lexical Ambiguity in Tensor Regression Models of Meaning

Dimitri Kartsaklis; Nal Kalchbrenner; Mehrnoosh Sadrzadeh

This paper provides a method for improving tensor-based compositional distributional models of meaning by the addition of an explicit disambiguation step prior to composition. In contrast with previous research where this hypothesis has been successfully tested against relatively simple compositional models, in our work we use a robust model trained with linear regression. The results we get in two experiments show the superiority of the prior disambiguation method and suggest that the effectiveness of this approach is model-independent.

Collaboration


Dive into Mehrnoosh Sadrzadeh's collaborations.

Top Co-Authors


Dmitrijs Milajevs

Queen Mary University of London


Matthew Purver

Queen Mary University of London


Roy Dyckhoff

University of St Andrews
