Rafael Grimson
University of Buenos Aires
Publication
Featured research published by Rafael Grimson.
International Journal of Geographical Information Science | 2011
Bart Kuijpers; Rafael Grimson; Walied Othman
Moving objects produce trajectories, which are stored in databases by means of finite samples of time-stamped locations. When speed limitations at these sample points are also known, space–time prisms (also called beads) (Pfoser and Jensen 1999, Egenhofer 2003, Miller 2005) can be used to model the uncertainty about an object's location in between sample points. In this setting, a query of particular interest that has been studied in the literature of geographic information systems (GIS) is the alibi query. This Boolean query asks whether two moving objects could have physically met. This amounts to deciding whether the chains of space–time prisms (also called necklaces of beads) of these objects intersect. This problem can be reduced to deciding whether two space–time prisms intersect. The alibi query can be seen as a constraint database query. In the constraint database model, spatial and spatiotemporal data are stored as Boolean combinations of polynomial equalities and inequalities over the real numbers. The relational calculus augmented with polynomial constraints is the standard first-order query language for constraint databases, and the alibi query can be expressed in it. The evaluation of the alibi query in the constraint database model relies on the elimination of a block of three existential quantifiers. Implementations of general-purpose elimination algorithms, such as those provided by QEPCAD, Redlog, and Mathematica, are, for practical purposes, too slow in answering the alibi query for two specific space–time prisms. These software packages completely fail to answer the alibi query in the parametric case (i.e., when it is formulated in terms of parameters representing the sample points and speed constraints). The main contribution of this article is an analytical solution to the parametric alibi query, which can be used to answer the alibi query on two specific space–time prisms in constant time (a matter of milliseconds in our implementation). It solves the alibi query for chains of space–time prisms in time proportional to the sum of the lengths of the chains. To back this claim up, we implemented our method in Mathematica alongside the traditional quantifier elimination method. The solutions we propose are based on geometric argumentation, and they illustrate the fact that some practical problems require creative solutions even where, at least in theory, existing systems could provide one.
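To make the objects involved concrete, the sketch below encodes membership in a planar space–time prism and a naive, sampled intersection test for two prisms. It is an illustration only, not the analytical solution of the article; the function names, the grid resolution n and the spatial padding of ±1 are hypothetical choices.

```python
import numpy as np

def in_prism(p, q1, t1, q2, t2, vmax):
    """Membership test for a planar space-time prism (bead).

    p  = (x, y, t) query point in space-time.
    q1 = (x1, y1) sample point at time t1, q2 = (x2, y2) at time t2.
    vmax = speed bound between the two samples.
    The prism is the set of space-time points reachable from q1 after t1
    and from which q2 is still reachable by t2, at speed at most vmax.
    """
    x, y, t = p
    if not (t1 <= t <= t2):
        return False
    d1 = np.hypot(x - q1[0], y - q1[1])   # distance already travelled
    d2 = np.hypot(x - q2[0], y - q2[1])   # distance still to cover
    return d1 <= vmax * (t - t1) and d2 <= vmax * (t2 - t)

def prisms_may_intersect(prism_a, prism_b, n=50):
    """Brute-force alibi check on a sampled space-time grid (illustration only).

    Each prism is a tuple (q1, t1, q2, t2, vmax).  Returns True if some
    sampled point lies in both prisms; the article replaces this kind of
    search by an exact constant-time analytical test.
    """
    (a1, ta1, a2, ta2, va) = prism_a
    (b1, tb1, b2, tb2, vb) = prism_b
    t_lo, t_hi = max(ta1, tb1), min(ta2, tb2)
    if t_lo > t_hi:
        return False
    xs = np.linspace(min(a1[0], a2[0], b1[0], b2[0]) - 1,
                     max(a1[0], a2[0], b1[0], b2[0]) + 1, n)
    ys = np.linspace(min(a1[1], a2[1], b1[1], b2[1]) - 1,
                     max(a1[1], a2[1], b1[1], b2[1]) + 1, n)
    ts = np.linspace(t_lo, t_hi, n)
    return any(in_prism((x, y, t), a1, ta1, a2, ta2, va) and
               in_prism((x, y, t), b1, tb1, b2, tb2, vb)
               for t in ts for x in xs for y in ys)
```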
International Journal of River Basin Management | 2013
Natalia Blanca Montroull; Ramiro I. Saurral; Inés Camilloni; Rafael Grimson; Pablo Vasquez
The Iberá wetlands, located in the La Plata Basin, are a fragile ecosystem that hosts numerous species of flora and fauna and constitute one of the largest inland freshwater systems in the world. In this study, the hydroclimatologic response to projected climatic changes in the Iberá wetlands is assessed. Bias-corrected temperature and precipitation data from four Regional Climate Models (RCMs) developed for the CLARIS-LPB project were used to drive the calibrated variable infiltration capacity (VIC) hydrological model for different time slices. The derived future scenarios consist of changes in temperature, precipitation and the water level of the Iberá Lake for the periods 2021–2040 and 2071–2090 with respect to the present. All RCMs are consistent in predicting a warming for the near future (0–2°C) and towards the end of the century (1.5–4.5°C) in the study region, but differ in the sign and percentage of precipitation changes. VIC modelling results suggest that the Iberá Lake level could increase in the twenty-first century and that this increment would be higher in the summer months. Nevertheless, the projected water-level increase of about 10 cm may not be so relevant, as it is of the same order of magnitude as the observed interdecadal variability of the system.
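The abstract mentions bias-corrected RCM temperature and precipitation as forcing for the VIC model. The snippet below is a minimal sketch of one common correction strategy, a monthly delta-change adjustment against an observed climatology; the study does not state that this exact scheme was used, and all names and values are hypothetical.

```python
import numpy as np

def delta_bias_correct(model_hist, model_fut, obs_hist, multiplicative=False):
    """Monthly delta-change bias correction (simplified illustration).

    model_hist, model_fut, obs_hist: arrays of shape (n_years, 12) holding
    monthly values.  An additive shift is typical for temperature, a
    multiplicative scaling for precipitation.
    """
    if multiplicative:
        ratio = obs_hist.mean(axis=0) / model_hist.mean(axis=0)
        return model_fut * ratio             # rescale each calendar month
    shift = obs_hist.mean(axis=0) - model_hist.mean(axis=0)
    return model_fut - shift                  # shift each calendar month

# Hypothetical usage with synthetic monthly precipitation series (mm/month):
rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 60.0, size=(30, 12))    # "observed" climatology
hist = obs * 1.2                              # model historical run with a wet bias
fut = hist * 1.1                              # projected run
corrected = delta_bias_correct(hist, fut, obs, multiplicative=True)
```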
Journal of Computer and System Sciences | 2010
Santiago Figueira; Daniel Gorín; Rafael Grimson
In classical logics, the meaning of a formula is invariant with respect to the renaming of bound variables. This property, normally taken for granted, has been shown not to hold in the case of Independence Friendly (IF) logics. In this paper we argue that this is not an inherent characteristic of these logics but a defect in the way in which the compositional semantics given by Hodges for the regular fragment was generalized to arbitrary formulas. We fix this by proposing an alternative formalization, based on a variation of the classical notion of valuation. Basic metatheoretical results are proven. We present these results for Hodges' slash logic (from which they can be easily transferred to other IF-like logics) and we also consider the flattening operator, for which we give a novel game-theoretical semantics.
Remote Sensing Letters | 2016
N. S. Morandeira; Rafael Grimson; P. Kandus
ABSTRACT The initial step in most object-based classification methodologies is the application of a segmentation algorithm to define objects. In the context of synthetic aperture radar (SAR) image analysis, the presence of speckle noise might hamper the segmentation quality. The aim of this study is to assess the segmentation performance on SAR images when different filters, or no filter, are applied before segmentation. In particular, the performance of the mean-shift segmentation algorithm combined with different adaptive and non-adaptive filters is assessed based on both synthetic and natural SAR images. The studied filters include the non-adaptive Boxcar filter and four adaptive filters: the well-known Refined Lee filter and three recently proposed non-local filters differing, in particular, in their dissimilarity criteria: the Hellinger and the Kullback–Leibler filters are based on stochastic distances, whereas the NL-SAR filter is based on the generalized likelihood ratio. Two measures were used for quality assessment: -index and -index. Over-segmentation was assessed by the -index, the ratio of the resulting number of segments to the number of connected components of the ground-truth classes. The accuracy of the best possible classification attainable from the segmentation result was assessed against ground-truth information by maximizing the -index. A Monte Carlo experiment conducted on synthetic images shows that the quality measures significantly differ for the applied filters. Our results indicate that the use of an adaptive filter improves the performance of the segmentation. In particular, the combination of the mean-shift segmentation algorithm with the NL-SAR filter gives the best results, and the resulting process is less sensitive to variations in the mean-shift operational parameters than when applying other filters or no filter. The results obtained may help improve the reliability of land-cover classification analyses based on an object-based approach on SAR data.
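A minimal sketch of the kind of pipeline evaluated here: a non-adaptive Boxcar (moving-average) filter followed by mean-shift segmentation, approximated below with scikit-learn's MeanShift clustering over (row, column, log-intensity) features. The parameter values, the feature construction and the use of sklearn are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.cluster import MeanShift

def boxcar_then_meanshift(sar_intensity, window=5, bandwidth=0.3, spatial_weight=0.05):
    """Boxcar speckle smoothing followed by mean-shift segmentation.

    sar_intensity: 2D array of SAR intensities (small images only; clustering
    over all pixels is slow).  Each resulting cluster label is one object.
    """
    filtered = uniform_filter(sar_intensity, size=window)       # non-adaptive Boxcar filter
    rows, cols = np.indices(filtered.shape)
    features = np.column_stack([
        spatial_weight * rows.ravel(),                          # spatial coordinates keep
        spatial_weight * cols.ravel(),                          # segments spatially compact
        np.log(filtered.ravel() + 1e-6),                        # log-intensity tames speckle
    ])
    labels = MeanShift(bandwidth=bandwidth, bin_seeding=True).fit_predict(features)
    return labels.reshape(sar_intensity.shape)

def oversegmentation_ratio(n_segments, n_ground_truth_components):
    """Over-segmentation index as described in the abstract: produced segments
    divided by the number of connected components of the ground-truth classes."""
    return n_segments / n_ground_truth_components
```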
international geoscience and remote sensing symposium | 2015
Rafael Grimson; Natalia S. Morandeira; Alejandro C. Frery
This work presents the use of stochastic similarity measures as features with statistical significance for the design of nonlocal-means despeckling filters. Assuming that the observations follow a Gamma model with two parameters (mean and number of looks), patches are compared by means of the Kullback–Leibler and Hellinger distances, and by their Shannon entropies. A convolution mask is formed using the p-values of tests that verify whether the patches come from the same distribution. The filter performances are assessed using well-known phantoms, three measures of quality, and a Monte Carlo experiment with several factors. The proposed filters are contrasted with the Refined Lee and NL-SAR filters.
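As a pointer to the kind of quantity involved, the sketch below computes the squared Hellinger distance between two Gamma intensity models with the same number of looks (for which a closed form exists) and turns it into a simple exponential patch weight. This simplified weighting is not the p-value based mask used in the paper; the function names and the bandwidth h are hypothetical.

```python
import numpy as np

def hellinger_gamma(mu1, mu2, looks):
    """Squared Hellinger distance between two Gamma intensity models with the
    same number of looks L and means mu1, mu2.

    For Gamma(shape=L, scale=mu/L) the Bhattacharyya coefficient has the
    closed form (2*sqrt(mu1*mu2)/(mu1+mu2))**L, hence
    H^2 = 1 - (2*sqrt(mu1*mu2)/(mu1+mu2))**L.
    """
    bc = (2.0 * np.sqrt(mu1 * mu2) / (mu1 + mu2)) ** looks
    return 1.0 - bc

def patch_weight(patch_a, patch_b, looks, h=0.1):
    """Nonlocal-means style weight from the Hellinger distance between the
    Gamma models fitted to two patches (simplified exponential kernel)."""
    d = hellinger_gamma(patch_a.mean(), patch_b.mean(), looks)
    return np.exp(-d / h)
```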
International Journal of Remote Sensing | 2018
Patricia Kandus; Priscilla Gail Minotti; Natalia Soledad Morandeira; Rafael Grimson; Gabriela González Trilla; Eliana Belén González; Laura San Martín; Maira Patricia Gayol
ABSTRACT South America has a large proportion of wetlands compared with other continents. While most of these wetlands were conserved in a relatively good condition until a few decades ago, pressures brought about by land use and climate change have threatened their integrity in recent years. The aim of this article is to provide a bibliometric analysis of the available scientific literature relating to the remote sensing of wetlands in South America. From 1960 to 2015, 153 articles were published in 63 different journals, with the number of articles published per year increasing progressively since 1990. This rise is paralleled by an increase in the contribution of local authors. The most intensively studied regions are the wetland macrosystems of South American mega-rivers: the Amazon and Paraná Rivers, along with the Pantanal at the headwaters of the Paraguay River. Few studies spanned more than two countries. The most frequent objectives were mapping, covering all types of wetlands with optical data, and hydrology, focusing on floodplain wetlands with microwave data as the preferred data source. The substantial growth of the last decade reflects an increase in technological and scientific capacities. Nevertheless, the state of the art regarding the remote sensing of wetlands in South America remains enigmatic. Fundamental questions and guidelines which may contribute to the understanding of the functioning of these ecosystems are yet to be fully defined, and there is considerable dispersion in the use of data and remote-sensing approaches.
Journal of Complexity | 2009
Rafael Grimson; Bart Kuijpers
We analyze the arithmetic complexity of the linear programming feasibility problem over the reals. For the case of polyhedra defined by 2n half-spaces in R^n we prove that the set I^(2n,n) of parameters describing nonempty polyhedra has an exponential number of limiting hypersurfaces. From this geometric result we obtain, as a corollary, the existence of a constant c > 1 such that, if dense or sparse representation is used to code polynomials, the length of any quantifier-free formula expressing the set I^(2n,n) is bounded from below by Ω(c^n). Other related complexity results are stated; in particular, a lower bound for algebraic computation trees based on the notion of limiting hypersurface is presented.
Reports on Mathematical Logic | 2014
Rafael Grimson; Bart Kuijpers
We consider the Σ¹₁-fragment of second-order logic over the vocabulary ⟨+, ×, 0, 1, <, S1, …, Sk⟩, interpreted over the reals, where the predicate symbols Si are interpreted as semialgebraic sets. We show that, in this context, satisfiability of formulas is decidable for the first-order ∃-quantifier fragment and undecidable for the ∃∀- and ∀-fragments. We also show that for these three fragments the same (un)decidability results hold for containment and equivalence of formulas.
Mathematical Logic Quarterly | 2012
Rafael Grimson; Bart Kuijpers; Walied Othman
We introduce new first-order languages for elementary n-dimensional geometry and elementary n-dimensional affine geometry (n ≥ 2), based on extending FO(β, ≡) and FO(β), respectively, with new function symbols. Here, β stands for the betweenness relation and ≡ for the congruence relation. We show that the associated theories admit effective quantifier elimination.
geometric modeling and processing | 2010
Rafael Grimson
We study the algebraic complexity of the sign condition problem for any given family of polynomials. Essentially, the problem consists in determining the sign condition satisfied by a fixed family of polynomials at a query point, performing as few arithmetic operations as possible. After defining the sign condition and point location problems precisely, we introduce a method, called the dialytic method, to solve the first problem efficiently. This method involves a linearization of the original polynomials and provides the best known algorithm to solve the sign condition problem. Moreover, we prove a lower bound showing that the dialytic method is almost optimal. Finally, we extend our method to the point location problem. The dialytic method solves (non-uniformly) the sign condition problem for a family of s polynomials in R[X1,...,Xn] given by an arithmetic circuit.
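The linearization idea mentioned in the abstract can be illustrated as follows: each distinct monomial of the family becomes a coordinate, so every polynomial is a linear form in the monomial vector, and the sign condition at a query point reduces to one matrix–vector product after evaluating the monomials once. The sketch below shows only this idea with a naive dense encoding and hypothetical names; it does not reproduce the complexity-optimal constructions of the paper.

```python
import numpy as np

def build_dialytic_matrix(polys):
    """Linearize a family of polynomials.

    polys: list of dicts mapping exponent tuples (a1, ..., an) to coefficients,
    e.g. {(2, 0): 1.0, (0, 1): -3.0} encodes x1^2 - 3*x2.
    Returns (monomials, M) where M[i, j] is the coefficient of monomial j in
    polynomial i, so that  signs = sign(M @ monomial_values(point)).
    """
    monomials = sorted({e for p in polys for e in p})
    index = {e: j for j, e in enumerate(monomials)}
    M = np.zeros((len(polys), len(monomials)))
    for i, p in enumerate(polys):
        for e, c in p.items():
            M[i, index[e]] = c
    return monomials, M

def sign_condition(point, monomials, M):
    """Sign condition of the family at a query point: evaluate every monomial
    once, then take the signs of a single matrix-vector product."""
    x = np.asarray(point, dtype=float)
    vals = np.array([np.prod(x ** np.array(e)) for e in monomials])
    return np.sign(M @ vals)

# Hypothetical usage with two bivariate polynomials:
polys = [{(2, 0): 1.0, (0, 2): 1.0, (0, 0): -1.0},   # x^2 + y^2 - 1
         {(1, 0): 1.0, (0, 1): -1.0}]                # x - y
monomials, M = build_dialytic_matrix(polys)
print(sign_condition((0.5, 0.25), monomials, M))     # -> [-1.  1.]
```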