Publication


Featured research published by Lorenzo Ferrone.


Computational Linguistics | 2015

When the whole is not greater than the combination of its parts: A decompositional look at compositional distributional semantics

Fabio Massimo Zanzotto; Lorenzo Ferrone; Marco Baroni

Distributional semantics has been extended to phrases and sentences by means of composition operations. We look at how these operations affect similarity measurements, showing that similarity equations of an important class of composition methods can be decomposed into operations performed on the subparts of the input phrases. This establishes a strong link between these models and convolution kernels.
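The decomposition property described above can be illustrated for the simplest composition method, vector addition: the dot product of two additively composed phrase vectors equals the sum of the pairwise dot products of their subparts. A minimal sketch, with hypothetical toy word vectors (not the paper's data or code):

```python
# Decomposition of similarity under additive composition:
# <a + b, c + d> = <a,c> + <a,d> + <b,c> + <b,d>

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def add_compose(u, v):
    # Additive composition: the phrase vector is the sum of its word vectors.
    return [x + y for x, y in zip(u, v)]

# Hypothetical toy word vectors.
black, cat = [1.0, 0.0, 2.0], [0.5, 1.0, -1.0]
dark, feline = [0.8, 0.2, 1.5], [0.4, 0.9, -0.7]

phrase1 = add_compose(black, cat)    # "black cat"
phrase2 = add_compose(dark, feline)  # "dark feline"

# Similarity of the wholes...
whole = dot(phrase1, phrase2)
# ...decomposes exactly into operations on the subparts.
parts = (dot(black, dark) + dot(black, feline)
         + dot(cat, dark) + dot(cat, feline))

assert abs(whole - parts) < 1e-9
```

This is exactly the shape of a convolution kernel, a sum of similarities over pairs of substructures, which is the link the paper establishes.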


KSII Transactions on Internet and Information Systems | 2017

Have You Lost the Thread? Discovering Ongoing Conversations in Scattered Dialog Blocks

Fabio Massimo Zanzotto; Lorenzo Ferrone

Finding threads in textual dialogs is emerging as a need to better organize stored knowledge. We capture this need by introducing the novel task of discovering ongoing conversations in scattered dialog blocks. Our aim in this article is twofold. First, we propose a publicly available testbed for the task, circumventing the otherwise insurmountable privacy problems of Big Personal Data: we show that personal dialogs can be surrogated with theatrical plays. Second, we propose a suite of computationally light learning models that can use syntactic and semantic features. With this suite, we show that models for this challenging task should include features capturing shifts in language use and, possibly, modeling underlying scripts.


Proceedings of the Third International Conference on Statistical Language and Speech Processing (SLSP 2015), Volume 9449 | 2015

Decoding Distributed Tree Structures

Lorenzo Ferrone; Fabio Massimo Zanzotto; Xavier Carreras

Encoding structural information in low-dimensional vectors is a recent trend in natural language processing that builds on distributed representations [14]. However, despite their success in replacing structural information in final tasks, it is still unclear whether these distributed representations contain enough information about the original structures. In this paper we take a specific distributed representation, the distributed trees (DT) [17], and analyze the reverse problem: can the original structure be reconstructed given only its distributed representation? Our experiments show that this is indeed the case: DTs encode a great deal of information about the original tree, and this information is often enough to reconstruct the original object.
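The decoding question can be made concrete with a toy sketch. This is an assumed simplification, not the paper's DT algorithm: tree fragments (hypothetical vocabulary below) are mapped to nearly-orthogonal high-dimensional random vectors, a tree is encoded as the sum of its fragments' vectors, and decoding tests which candidate fragments have a high dot product with that sum.

```python
import random

random.seed(0)
DIM = 2000

def rand_vec():
    # Components drawn from N(0, 1/DIM): high-dimensional random vectors
    # are nearly orthogonal, which is what makes decoding possible.
    return [random.gauss(0, 1) / DIM ** 0.5 for _ in range(DIM)]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# Hypothetical vocabulary of tree fragments.
fragments = ["(S (NP)(VP))", "(NP (DT)(NN))", "(VP (V)(NP))", "(PP (P)(NP))"]
codes = {f: rand_vec() for f in fragments}

# Encode a tree as the sum of the vectors of the fragments it contains.
tree = {"(S (NP)(VP))", "(VP (V)(NP))"}
encoding = [sum(codes[f][i] for f in tree) for i in range(DIM)]

# Decode: a fragment is judged present if its dot product with the encoding
# is close to 1 (its self-similarity) rather than close to 0 (noise).
decoded = {f for f in fragments if dot(codes[f], encoding) > 0.5}
assert decoded == tree
```

The dot product of a present fragment with the encoding concentrates around 1, while absent fragments concentrate around 0 with standard deviation roughly 1/sqrt(DIM), so a threshold cleanly separates the two cases.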


International Symposium on Neural Networks | 2017

Can we explain natural language inference decisions taken with neural networks? Inference rules in distributed representations

Fabio Massimo Zanzotto; Lorenzo Ferrone

Natural Language Inference (NLI) is a key, complex task where machine learning (ML) is playing an important role. However, ML has progressively obfuscated the role of linguistically-motivated inference rules, which should be the core of NLI systems. In this paper, we introduce distributed inference rules as a novel way to encode linguistically-motivated inference rules in learning interpretable NLI classifiers. We propose two encoders: the Distributed Partial Tree Encoder and the Distributed Smoothed Partial Tree Encoder. These encoders allow modeling syntactic and syntactic-semantic inference rules as distributed representations ready to be used in ML models over large datasets. Although far from the state-of-the-art of end-to-end deep learning systems on large datasets, our shallow networks positively exploit inference rules for NLI, improving over baseline systems. This is a first positive step towards interpretable and explainable end-to-end deep learning systems.


Joint Conference on Lexical and Computational Semantics | 2014

Compositional Distributional Semantics Models in Chunk-based Smoothed Tree Kernels

Lorenzo Ferrone; Fabio Massimo Zanzotto

The field of compositional distributional semantics has proposed very interesting and reliable models for accounting for the distributional meaning of simple phrases. These models, however, tend to disregard syntactic structure when applied to larger sentences. In this paper we propose the chunk-based smoothed tree kernels (CSTKs) as a way to exploit syntactic structure as well as the reliability of these compositional models for simple phrases. We experiment with the Recognizing Textual Entailment (RTE) datasets. Our experiments show that CSTKs perform better than basic compositional distributional semantic models (CDSMs) recursively applied at the sentence level, and also better than syntactic tree kernels.
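The "smoothed" idea can be sketched in miniature. The following is an assumed toy illustration, not the paper's CSTK implementation: each chunk is composed additively from hypothetical word vectors, and the kernel between two chunked sentences sums cosine similarities over all chunk pairs, so lexically different but semantically close chunks still contribute a soft match.

```python
import math

# Hypothetical toy word vectors for illustration only.
vectors = {
    "big": [1.0, 0.2], "large": [0.9, 0.3],
    "dog": [0.1, 1.0], "hound": [0.2, 1.1],
}

def compose(chunk):
    # Additive composition of the chunk's word vectors.
    return [sum(vectors[w][i] for w in chunk) for i in range(2)]

def cosine(u, v):
    return dotp(u, v) / (math.hypot(*u) * math.hypot(*v))

def dotp(u, v):
    return sum(x * y for x, y in zip(u, v))

def smoothed_kernel(chunks_a, chunks_b):
    # Sum soft (cosine) matches over all pairs of chunks, instead of
    # the hard 0/1 node matches of a plain syntactic tree kernel.
    return sum(cosine(compose(a), compose(b))
               for a in chunks_a for b in chunks_b)

score = smoothed_kernel([["big", "dog"]], [["large", "hound"]])
assert score > 0.95  # near-synonymous chunks match softly
```

A plain syntactic tree kernel would score this pair 0, since no words are shared; the smoothed variant recovers the similarity through the compositional vectors.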


International Conference on Computational Linguistics | 2014

haLF: Comparing a Pure CDSM Approach with a Standard Machine Learning System for RTE

Lorenzo Ferrone; Fabio Massimo Zanzotto

In this paper, we describe our submission to the Shared Task #1. We tried to follow the underlying idea of the task, that is, evaluating the gap between full-fledged recognizing textual entailment systems and compositional distributional semantic models (CDSMs) applied to this task. We thus submitted two runs: 1) a system obtained with a machine learning approach based on feature spaces of rules with variables, and 2) a system completely based on a CDSM that mixes structural and syntactic information by using distributed tree kernels. Our analysis shows that, under the same conditions, the fully CDSM-based system is still far from being competitive with more complex methods.


International Conference on Computational Linguistics | 2014

Towards Syntax-aware Compositional Distributional Semantic Models

Lorenzo Ferrone; Fabio Massimo Zanzotto


Joint Symposium of Semantic Processing (JSSP) | 2013

Linear Compositional Distributional Semantics and Structural Kernels

Lorenzo Ferrone; Fabio Massimo Zanzotto


arXiv: Computation and Language | 2017

Symbolic, Distributed and Distributional Representations for Natural Language Processing in the Era of Deep Learning: A Survey

Lorenzo Ferrone; Fabio Massimo Zanzotto


Neural Information Processing Systems | 2015

Predicting embedded syntactic structures from natural language sentences with neural network approaches

Gregory Senay; Fabio Massimo Zanzotto; Lorenzo Ferrone; Luca Rigazio

Collaboration


Dive into Lorenzo Ferrone's collaborations.

Top Co-Authors

Fabio Massimo Zanzotto

University of Rome Tor Vergata