
Publication


Featured research published by Gemma Boleda.


International Conference on Computational Linguistics | 2014

UTexas: Natural Language Semantics using Distributional Semantics and Probabilistic Logic

Islam Beltagy; Stephen Roller; Gemma Boleda; Katrin Erk; Raymond J. Mooney

We represent natural language semantics by combining logical and distributional information in probabilistic logic. We use Markov Logic Networks (MLNs) for the RTE task and Probabilistic Soft Logic (PSL) for the STS task. The system is evaluated on the SICK dataset. Our best system achieves 73% accuracy on the RTE task and a Pearson's correlation of 0.71 on the STS task.
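As a rough illustration of the probabilistic-logic side: PSL replaces Boolean connectives with Łukasiewicz relaxations over soft truth values in [0, 1]. A minimal sketch, assuming hypothetical graded predicates (the variable names and values are illustrative, not the paper's actual rule set):

```python
# Łukasiewicz relaxations of the logical connectives, as used in
# Probabilistic Soft Logic (PSL): truth values are reals in [0, 1].
def l_and(a, b):
    # Conjunction: max(0, a + b - 1)
    return max(0.0, a + b - 1.0)

def l_or(a, b):
    # Disjunction: min(1, a + b)
    return min(1.0, a + b)

def l_not(a):
    # Negation: 1 - a
    return 1.0 - a

def distance_to_satisfaction(body, head):
    # A rule body -> head is fully satisfied when head >= body;
    # PSL minimizes the weighted sum of these distances.
    return max(0.0, body - head)

# Hypothetical soft truth values (not from the paper):
similar = 0.8   # e.g. distributional similarity of two words
entails = 0.7   # graded entailment between them
print(l_and(similar, entails))                     # 0.5
print(distance_to_satisfaction(similar, entails))
```

Because the connectives are piecewise-linear, inference in PSL reduces to convex optimization, which is what makes it attractive for graded tasks like STS.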


Meeting of the Association for Computational Linguistics | 2016

“Look, some green circles!”: learning to quantify from images

Ionut Sorodoc; Angeliki Lazaridou; Gemma Boleda; Aurélie Herbelot; Sandro Pezzelle; Raffaella Bernardi

In this paper, we investigate whether a neural network model can learn the meaning of natural language quantifiers (no, some, and all) from their use in visual contexts. We show that memory networks perform well in this task, and that explicit counting is not necessary for the system's performance, supporting psycholinguistic evidence on the acquisition of quantifiers.


Empirical Methods in Natural Language Processing | 2016

Convolutional neural network language models

Ngoc-Quan Pham; Germán Kruszewski; Gemma Boleda

Convolutional Neural Networks (CNNs) have been shown to yield very strong results in several Computer Vision tasks. Their application to language has received much less attention, and it has mainly focused on static classification tasks, such as sentence classification for Sentiment Analysis or relation extraction. In this work, we study the application of CNNs to language modeling, a dynamic, sequential prediction task that requires models to capture both local and long-range dependency information. Our contribution is twofold. First, we show that CNNs achieve 11-26% better absolute performance than feed-forward neural language models, demonstrating their potential for language representation even in sequential tasks. Compared to recurrent models, our model outperforms RNNs but falls below state-of-the-art LSTM models. Second, we gain some understanding of the behavior of the model, showing that CNNs in language act as feature detectors at a high level of abstraction, as in Computer Vision, and that the model can profitably use information from as far as 16 words before the target.
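The feature-detector view described in the abstract can be caricatured in a few lines: a causal convolution window over word embeddings produces n-gram-like features, which feed a softmax over the vocabulary to predict the next word. Everything below (dimensions, random weights, a single ReLU filter bank) is an illustrative toy, not the paper's architecture:

```python
import math
import random

random.seed(0)
VOCAB, EMB, FILTERS, WIDTH = 50, 8, 16, 3

def mat(rows, cols):
    return [[random.gauss(0, 0.1) for _ in range(cols)] for _ in range(rows)]

E = mat(VOCAB, EMB)                            # word embeddings
W = [mat(WIDTH, EMB) for _ in range(FILTERS)]  # one (WIDTH x EMB) conv filter each
V = mat(VOCAB, FILTERS)                        # output projection

def next_word_probs(token_ids):
    # Causal window: the last WIDTH embeddings, left-padded with zeros,
    # so the prediction only sees words before the target.
    xs = [[0.0] * EMB] * (WIDTH - 1) + [E[t] for t in token_ids]
    window = xs[-WIDTH:]
    # Each filter acts as a soft n-gram detector over the window (ReLU).
    feats = [max(0.0, sum(w[i][j] * window[i][j]
                          for i in range(WIDTH) for j in range(EMB)))
             for w in W]
    logits = [sum(V[v][f] * feats[f] for f in range(FILTERS))
              for v in range(VOCAB)]
    m = max(logits)                            # softmax over the vocabulary
    exps = [math.exp(l - m) for l in logits]
    z = sum(exps)
    return [e / z for e in exps]

probs = next_word_probs([4, 8, 15])
print(len(probs), round(sum(probs), 6))        # 50 1.0
```

In the actual model, many such windows slide over the whole sequence and the receptive field stacks up across layers, which is how information from 16 words back can reach the prediction.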


Empirical Methods in Natural Language Processing | 2015

Distributional Semantics in Use

Raffaella Bernardi; Gemma Boleda; Raquel Fernández; Denis Paperno

In this position paper we argue that an adequate semantic model must account for language in use, taking into account how discourse context affects the meaning of words and larger linguistic units. Distributional semantic models are very attractive models of meaning mainly because they capture conceptual aspects and are automatically induced from natural language data. However, they need to be extended in order to account for language use in a discourse or dialogue context. We discuss phenomena that the new generation of distributional semantic models should capture, and propose concrete tasks on which they could be tested.


Meeting of the Association for Computational Linguistics | 2012

Distributional Semantics in Technicolor

Elia Bruni; Gemma Boleda; Marco Baroni; Nam Khanh Tran


Archive | 2004

Relational adjectives as properties of kinds

Louise McNally; Gemma Boleda


Joint Conference on Lexical and Computational Semantics | 2013

Montague Meets Markov: Deep Semantics with Probabilistic Logical Form

Islam Beltagy; Cuong K. Chau; Gemma Boleda; Dan Garrette; Katrin Erk; Raymond J. Mooney


New Journal of Physics | 2013

A scaling law beyond Zipf's law and its relation to Heaps' law

Francesc Font-Clos; Gemma Boleda; Alvaro Corral
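No abstract is shown for this paper, but its title invokes the classic connection it builds on: in a text whose word frequencies follow Zipf's law, the vocabulary grows sublinearly with text length, which is Heaps' law. A toy simulation under an assumed Zipfian unigram model (the exponent and vocabulary size are arbitrary, and this is the textbook relation, not the paper's new scaling law):

```python
import random
from itertools import accumulate

random.seed(1)

# Toy Zipfian unigram model: the word of rank r has probability ∝ r**(-alpha).
alpha, n_ranks = 1.1, 10_000
cum = list(accumulate(r ** -alpha for r in range(1, n_ranks + 1)))

def vocab_size(n_tokens):
    """Number of distinct word types observed in a sample of n_tokens."""
    draws = random.choices(range(n_ranks), cum_weights=cum, k=n_tokens)
    return len(set(draws))

# Heaps' law: vocabulary grows sublinearly with text length
# (V ~ N**beta with beta < 1), so doubling the text far less
# than doubles the number of new word types.
for n in (1_000, 2_000, 4_000):
    print(n, vocab_size(n))
```

Running this shows the type count falling further and further behind the token count as the sample grows, the qualitative behavior that links the two laws.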


Proceedings of the 10th International Conference on Computational Semantics (IWCS 2013) -- Long Papers | 2013

Intensionality was only alleged: On adjective-noun composition in distributional semantics

Gemma Boleda; Marco Baroni; Louise McNally


Joint Conference on Lexical and Computational Semantics | 2012

Regular polysemy: A distributional model

Gemma Boleda; Sebastian Padó; Jason Utt

Collaboration


Dive into Gemma Boleda's collaborations.

Top Co-Authors

Islam Beltagy

University of Texas at Austin


Katrin Erk

University of Texas at Austin


Raymond J. Mooney

University of Texas at Austin


Alvaro Corral

Autonomous University of Barcelona


Toni Badia

Pompeu Fabra University
