
Publication


Featured research published by Islam Beltagy.


Meeting of the Association for Computational Linguistics | 2014

Probabilistic Soft Logic for Semantic Textual Similarity

Islam Beltagy; Katrin Erk; Raymond J. Mooney

Probabilistic Soft Logic (PSL) is a recently developed framework for probabilistic logic. We use PSL to combine logical and distributional representations of natural-language meaning, where distributional information is represented in the form of weighted inference rules. We apply this framework to the task of Semantic Textual Similarity (STS) (i.e. judging the semantic similarity of natural-language sentences), and show that PSL gives improved results compared to a previous approach based on Markov Logic Networks (MLNs) and a purely distributional approach.
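As a hedged illustration (not the paper's code), the soft-logic semantics PSL builds on can be sketched with Łukasiewicz operators: truth values are continuous in [0, 1], and a weighted rule contributes a "distance to satisfaction" that inference minimizes. The rule and values below are invented.

```python
# Sketch of Lukasiewicz soft-logic operators as used in PSL, where truth
# values are continuous in [0, 1] rather than Boolean.

def l_and(a, b):
    # Lukasiewicz t-norm (soft conjunction)
    return max(0.0, a + b - 1.0)

def l_or(a, b):
    # Lukasiewicz t-conorm (soft disjunction)
    return min(1.0, a + b)

def l_implies(a, b):
    # Soft implication: fully true when b >= a
    return min(1.0, 1.0 - a + b)

def rule_distance(weight, body, head):
    # Distance to satisfaction of a weighted rule body -> head:
    # the penalty that PSL inference minimizes.
    return weight * max(0.0, body - head)

# A distributional similarity score (e.g. cosine of word vectors) can serve
# as the weight of a lexical rule such as grumpy(X) -> sad(X).
print(l_implies(0.9, 0.7))            # approximately 0.8
print(rule_distance(0.8, 0.9, 0.7))   # approximately 0.16
```

The key property exploited here is that Łukasiewicz inference is a convex optimization problem, which is what makes PSL inference efficient compared to MLNs.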


Computational Linguistics | 2016

Representing meaning with a combination of logical and distributional models

Islam Beltagy; Stephen Roller; Pengxiang Cheng; Katrin Erk; Raymond J. Mooney

NLP tasks differ in the semantic information they require, and at this time no single semantic representation fulfills all requirements. Logic-based representations characterize sentence structure, but do not capture the graded aspect of meaning. Distributional models give graded similarity ratings for words and phrases, but do not capture sentence structure in the same detail as logic-based approaches. It has therefore been argued that the two are complementary. We adopt a hybrid approach that combines logical and distributional semantics using probabilistic logic, specifically Markov Logic Networks. In this article, we focus on the three components of a practical system: 1) logical representation focuses on representing the input problems in probabilistic logic; 2) knowledge base construction creates weighted inference rules by integrating distributional information with other sources; and 3) probabilistic inference involves solving the resulting MLN inference problems efficiently. To evaluate our approach, we use the task of textual entailment, which can utilize the strengths of both logic-based and distributional representations. In particular, we focus on the SICK data set, where we achieve state-of-the-art results. We also release a lexical entailment data set of 10,213 rules extracted from the SICK data set, which is a valuable resource for evaluating lexical entailment systems.
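A toy sketch of the second component, knowledge base construction: distributional similarity (here, cosine over invented three-dimensional vectors) becomes the weight of a lexical inference rule. The vectors, threshold, and rule syntax are illustrative assumptions, not the article's actual resources.

```python
import math

# Sketch: turning distributional similarity into weighted lexical
# inference rules for a probabilistic-logic knowledge base.

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def lexical_rules(vectors, threshold=0.5):
    """Emit weighted rules w1(X) -> w2(X) for sufficiently similar word pairs."""
    rules = []
    words = sorted(vectors)
    for i, w1 in enumerate(words):
        for w2 in words[i + 1:]:
            sim = cosine(vectors[w1], vectors[w2])
            if sim >= threshold:
                rules.append((f"{w1}(X) -> {w2}(X)", sim))
    return rules

# Invented toy vectors standing in for a real distributional model.
toy_vectors = {
    "man": [0.9, 0.1, 0.2],
    "guy": [0.85, 0.15, 0.25],
    "tree": [0.1, 0.9, 0.0],
}
for rule, weight in lexical_rules(toy_vectors):
    print(f"{weight:.2f} :: {rule}")
```

In the article's actual system the weighted rules also draw on resources such as WordNet, and the weights are calibrated rather than used as raw cosines.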


International Conference on Computational Linguistics | 2014

UTexas: Natural Language Semantics using Distributional Semantics and Probabilistic Logic

Islam Beltagy; Stephen Roller; Gemma Boleda; Katrin Erk; Raymond J. Mooney

We represent natural language semantics by combining logical and distributional information in probabilistic logic. We use Markov Logic Networks (MLN) for the RTE task, and Probabilistic Soft Logic (PSL) for the STS task. The system is evaluated on the SICK dataset. Our best system achieves 73% accuracy on the RTE task, and a Pearson’s correlation of 0.71 on the STS task.
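For reference, the Pearson's correlation reported for the STS task is a generic metric and can be computed directly; this is not code from the UTexas system, and the gold/system scores below are hypothetical.

```python
import math

# Pearson correlation between system similarity scores and gold
# judgments, the metric used to score STS systems.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical gold judgments vs. system scores on a 1-5 similarity scale.
gold = [1.0, 4.5, 3.0, 2.0]
pred = [1.2, 4.0, 3.5, 1.8]
print(round(pearson(gold, pred), 3))
```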


Meeting of the Association for Computational Linguistics | 2014

Semantic Parsing using Distributional Semantics and Probabilistic Logic

Islam Beltagy; Katrin Erk; Raymond J. Mooney

We propose a new approach to semantic parsing that is not constrained by a fixed formal ontology and purely logical inference. Instead, we use distributional semantics to generate only the relevant part of an on-the-fly ontology. Sentences and the on-the-fly ontology are represented in probabilistic logic. For inference, we use probabilistic logic frameworks like Markov Logic Networks (MLN) and Probabilistic Soft Logic (PSL). This semantic parsing approach is evaluated on two tasks, Textual Entailment (RTE) and Textual Similarity (STS), both accomplished using inference in probabilistic logic. Experiments show the potential of the approach.
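A minimal sketch of the on-the-fly ontology idea: rather than consulting a fixed ontology, generate only the rules connecting words that actually occur in the input pair. The similarity table and rule syntax below are invented for illustration.

```python
# Sketch of an "on-the-fly ontology": generate only the inference rules
# linking words in the text to words in the hypothesis, with weights
# assumed to come from a distributional similarity model.

def on_the_fly_rules(text_words, hyp_words, similarity, threshold=0.5):
    """Link each text word to each hypothesis word it is similar to."""
    rules = []
    for t in text_words:
        for h in hyp_words:
            sim = similarity(t, h)
            if sim >= threshold:
                rules.append((f"{t}(X) -> {h}(X)", sim))
    return rules

# Toy similarity table standing in for cosine over word vectors.
toy_sim = {("ogre", "monster"): 0.8, ("ogre", "garden"): 0.1}
sim = lambda a, b: toy_sim.get((a, b), 0.0)

print(on_the_fly_rules(["ogre"], ["monster", "garden"], sim))
# [('ogre(X) -> monster(X)', 0.8)]
```

Restricting rule generation to the input pair keeps the resulting probabilistic-logic inference problem small, which is the practical point of generating the ontology on the fly.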


Meeting of the Association for Computational Linguistics | 2016

Improved Semantic Parsers For If-Then Statements

Islam Beltagy; Chris Quirk

Digital personal assistants are becoming both more common and more useful. The major NLP challenge for personal assistants is machine understanding: translating natural language user commands into an executable representation. This paper focuses on understanding rules written as If-Then statements, though the techniques should be portable to other semantic parsing tasks. We view understanding as structure prediction and show improved models using both conventional techniques and neural network models. We also discuss various ways to improve generalization and reduce overfitting: synthetic training data from paraphrasing, grammar combinations, feature selection, and ensembles of multiple systems. An ensemble of these techniques achieves a new state-of-the-art result with an 8% accuracy improvement.
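One of the techniques mentioned, ensembling multiple systems, can be sketched as a simple majority vote over candidate parses of an If-Then recipe. This is only one flavor of ensembling, and the channel and function names below are hypothetical.

```python
from collections import Counter

# Sketch: majority-vote ensemble over If-Then (IFTTT-style) parses,
# each represented as a (trigger, action) pair.

def ensemble_vote(predictions):
    """Pick the most common parse across component systems."""
    return Counter(predictions).most_common(1)[0][0]

# Invented outputs of three hypothetical parsers for the command
# "If I get an email, then text me."
parses = [
    ("Email.new_message", "SMS.send"),
    ("Email.new_message", "SMS.send"),
    ("Email.new_attachment", "SMS.send"),
]
print(ensemble_vote(parses))  # ('Email.new_message', 'SMS.send')
```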


Archive | 2014

Natural Language Semantics Using Probabilistic Logic

Islam Beltagy

Abstract: With better natural-language semantic representations, computers can support more applications more effectively through a better understanding of natural text. However, no single semantic representation at this time fulfills all the requirements of a satisfactory representation. Logic-based representations like first-order logic capture many linguistic phenomena using logical constructs, and they come with standardized inference mechanisms, but standard first-order logic fails to capture the graded aspect of meaning in language. Distributional models use contextual similarity to predict the graded semantic similarity of words and phrases, but they do not adequately capture logical structure. In addition, there have been a few recent attempts to combine the two representations, either on the logic side (still not a graded representation) or on the distributional side (not full logic). We propose using probabilistic logic to represent natural-language semantics, combining the expressivity and automated inference of logic with the gradedness of distributional representations. We evaluate this semantic representation on two tasks, Recognizing Textual Entailment (RTE) and Semantic Textual Similarity (STS); better performance on RTE and STS indicates better semantic understanding. Our system has three main components: 1) parsing and task representation, 2) knowledge base construction, and 3) inference. The input natural-language sentences of the RTE/STS task are mapped to logical form using Boxer, a rule-based system built on top of a CCG parser, and are then used to formulate the RTE/STS problem in probabilistic logic. A knowledge base is then constructed as weighted inference rules collected from different sources, such as WordNet and on-the-fly lexical rules from distributional semantics.


Joint Conference on Lexical and Computational Semantics | 2013

Montague Meets Markov: Deep Semantics with Probabilistic Logical Form

Islam Beltagy; Cuong K. Chau; Gemma Boleda; Dan Garrette; Katrin Erk; Raymond J. Mooney


Wireless Communications and Networking Conference | 2011

A new routing metric and protocol for multipath routing in cognitive networks

Islam Beltagy; Moustafa Youssef; Mohamed N. El-Derini


National Conference on Artificial Intelligence | 2014

Efficient Markov logic inference for natural language semantics

Islam Beltagy; Raymond J. Mooney


arXiv: Computation and Language | 2015

Representing Meaning with a Combination of Logical Form and Vectors.

Islam Beltagy; Stephen Roller; Pengxiang Cheng; Katrin Erk; Raymond J. Mooney

Collaboration


Dive into Islam Beltagy's collaborations.

Top Co-Authors

Katrin Erk
University of Texas at Austin

Raymond J. Mooney
University of Texas at Austin

Stephen Roller
University of Texas at Austin

Gemma Boleda
University of Texas at Austin

Pengxiang Cheng
University of Texas at Austin

Moustafa Youssef
Egypt-Japan University of Science and Technology

Cuong K. Chau
University of Texas at Austin

Dan Garrette
University of Texas at Austin