Publication


Featured research published by Daniel Romero.


IEEE Transactions on Medical Imaging | 2013

Characterization and Modeling of the Peripheral Cardiac Conduction System

Rafael Sebastian; Viviana Zimmerman; Daniel Romero; Damián Sánchez-Quintana; Alejandro F. Frangi

Biophysical models of the heart have the potential to provide insights into its pathophysiology, which requires accurate modeling of both anatomy and function. The electrical activation sequence of the ventricles depends strongly on the cardiac conduction system (CCS). Its morphology and function cannot be observed in vivo, so the available data come from histological studies. We present a review of the available data on the peripheral CCS, including new experiments. To build a realistic model of the CCS, we designed a procedure to extract its morphological characteristics from stained calf tissue samples. A CCS model personalized with our measurements was built using L-systems. The effect of key unknown model parameters on the electrical activation of the left ventricle was analyzed. The generated CCS models share the main characteristics of the observed stained Purkinje networks, and the timing of the simulated electrical activation sequences was in the physiological range for CCS models that included a sufficient density of Purkinje-myocardial junctions (PMJs). These results show that this approach is a promising methodology for collecting domain knowledge and automatically building improved CCS models of the heart.
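
As a rough illustration of the L-system idea (with invented rules and depth, not the model personalized to the calf-tissue measurements), the short Python sketch below rewrites a branching axiom for a few generations and counts the terminal symbols, which play the role of candidate Purkinje-myocardial junction sites.

```python
# Minimal L-system sketch: string-rewriting rules grow a branching tree.
# Rules and iteration depth are illustrative, not the calf-tissue-fitted model.

RULES = {
    "S": "F[+S][-S]",   # a segment sprouts two child branches
    "F": "FF",          # existing segments elongate
}

def rewrite(axiom: str, generations: int) -> str:
    """Apply the production rules in parallel for a number of generations."""
    s = axiom
    for _ in range(generations):
        s = "".join(RULES.get(ch, ch) for ch in s)
    return s

if __name__ == "__main__":
    tree = rewrite("S", generations=4)
    # Terminal 'S' symbols play the role of distal endpoints where
    # Purkinje-myocardial junctions (PMJs) could be placed.
    print("string length:", len(tree))
    print("candidate PMJ sites:", tree.count("S"))
```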


Conference on Automated Deduction | 2011

Backward trace slicing for rewriting logic theories

María Alpuente; Demis Ballis; Javier Espert; Daniel Romero

Trace slicing is a widely used technique for execution trace analysis that supports program debugging, analysis, and comprehension. In this paper, we present a backward trace slicing technique that can be used for the analysis of Rewriting Logic theories. Our trace slicing technique allows us to systematically trace back rewrite sequences modulo equational axioms (such as associativity and commutativity) by means of an algorithm that dynamically simplifies the traces by detecting control and data dependencies and dropping useless data that do not influence the final result. Our methodology is particularly suitable for analyzing complex, textually large system computations such as those delivered as counterexample traces by Maude model checkers.
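
The technique itself operates on Maude rewrite theories modulo axioms; as a toy illustration of the backward-slicing idea only, the Python sketch below records, for each step, which input positions each output position depends on, and walks the trace from the last step to the first so that only the positions influencing the slicing criterion are kept.

```python
# Toy backward trace slicing: each step records, for every output position,
# the set of input positions it depends on. Walking the trace backwards
# from the slicing criterion keeps only the relevant positions.

from typing import Dict, List, Set

Step = Dict[str, Set[str]]  # output position -> input positions it depends on

def backward_slice(trace: List[Step], criterion: Set[str]) -> List[Set[str]]:
    """Return, for each state in the trace (first to last), the positions
    that influence the criterion observed in the final state."""
    relevant = set(criterion)
    slices = [relevant]
    for step in reversed(trace):
        prev_relevant: Set[str] = set()
        for out_pos in relevant:
            # Positions untouched by the step depend only on themselves.
            prev_relevant |= step.get(out_pos, {out_pos})
        relevant = prev_relevant
        slices.append(relevant)
    return list(reversed(slices))

if __name__ == "__main__":
    # Two rewrite steps over a flat term with positions "1", "2", "3".
    trace = [
        {"2": {"1", "2"}},   # step 1: position 2 computed from positions 1 and 2
        {"3": {"2"}},        # step 2: position 3 copied from position 2
    ]
    print(backward_slice(trace, criterion={"3"}))
    # -> positions 1 and 2 matter initially; only position 3 at the end.
```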


formal methods | 2009

Specification and Verification of Web Applications in Rewriting Logic

María Alpuente; Demis Ballis; Daniel Romero

This paper presents a Rewriting Logic framework that formalizes the interactions between Web servers and Web browsers through a communication protocol abstracting HTTP. The proposed framework includes a scripting language that is powerful enough to model the dynamics of complex Web applications by encompassing the main features of the most popular Web scripting languages (e.g. PHP, ASP, Java Servlets). We also provide a detailed characterization of browser actions (e.g. forward/backward navigation, page refresh, and new window/tab openings) via rewrite rules, and show how our models can be naturally model-checked by using the Linear Temporal Logic of Rewriting (LTLR), which is a Linear Temporal Logic specifically designed for model-checking rewrite theories. Our formalization is particularly suitable for verification purposes, since it allows one to perform in-depth analyses of many subtle aspects related to Web interaction. Finally, the framework has been completely implemented in Maude, and we report on some successful experiments that we conducted by using the Maude LTLR model-checker.
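
The framework itself is written in Maude and checked with the LTLR model checker; as a rough illustration of the underlying idea only, the Python sketch below explores a tiny, made-up browser/server transition system by breadth-first search and reports any state violating a simple safety property (a protected page reached without login). The pages, transitions, and property are all hypothetical.

```python
# Explicit-state exploration of a tiny, hypothetical browser/server model.
# Only an illustration of checking a safety property by search; the paper
# itself uses Maude rewrite theories and the LTLR model checker.

from collections import deque

# A state is (current_page, logged_in). Transitions model browser actions.
def successors(state):
    page, logged_in = state
    if page == "home":
        yield ("login", logged_in)       # follow link to the login form
        yield ("account", logged_in)     # try to open the account page
    elif page == "login":
        yield ("home", True)             # successful login, back to home
    elif page == "account":
        yield ("home", logged_in)        # navigate back

def violates(state):
    page, logged_in = state
    return page == "account" and not logged_in   # safety property

def check(initial):
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if violates(state):
            return state                 # counterexample state
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None

if __name__ == "__main__":
    print("violation found:", check(("home", False)))   # ('account', False) here
```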


IEEE Transactions on Biomedical Engineering | 2011

Fast Multiscale Modeling of Cardiac Electrophysiology Including Purkinje System

Ali Pashaei; Daniel Romero; Rafael Sebastian; Oscar Camara; Alejandro F. Frangi

In this paper, we present a modeling methodology to couple the cardiac conduction system to cardiac myocytes through a model of Purkinje-ventricular junctions to yield fast and realistic electrical activation of the ventricles. A patient-specific biventricular geometry is obtained by processing computed tomography scan data. A one-manifold implementation of the fast marching method based on Eikonal-type equations is used for modeling heart electrophysiology, which facilitates the multiscale 1-D-3-D coupling at very low computational costs. The method is illustrated in in-silico experiments where we analyze and compare alternative pacing strategies on the same patient-specific anatomy. We also show very good agreement between the results from the proposed approach and more detailed and comprehensive biophysical models of cardiac electrophysiology. The effect of atrioventricular delay on the distribution of activation time in the myocardium is studied with two experiments. Given the reasonable computational times and realistic activation sequences provided by our method, it can have an important clinical impact on the selection of optimal implantation sites of pacing leads or placement of ablation catheter tips in the context of cardiac rhythm management therapies.
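
The paper's solver is a one-manifold fast marching method for Eikonal-type equations on patient-specific meshes; the sketch below only conveys the coupling idea on a made-up graph, using Dijkstra's algorithm (a discrete relative of fast marching) to compute earliest activation times when fast Purkinje edges, Purkinje-myocardial junctions, and slow myocardial edges are mixed. All node names and delays are illustrative.

```python
# Toy illustration of the 1-D/3-D coupling idea: activation times are computed
# as earliest arrival times over a graph whose edge weights are conduction
# delays. Purkinje edges conduct quickly, myocardial edges slowly, and PMJ
# edges add a junction delay. Geometry and delays below are invented.

import heapq

def activation_times(edges, source):
    """edges: dict node -> list of (neighbor, conduction_delay_ms)."""
    times = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        t, node = heapq.heappop(heap)
        if t > times.get(node, float("inf")):
            continue                      # stale heap entry
        for nbr, delay in edges.get(node, []):
            nt = t + delay
            if nt < times.get(nbr, float("inf")):
                times[nbr] = nt
                heapq.heappush(heap, (nt, nbr))
    return times

if __name__ == "__main__":
    graph = {
        "his":        [("purkinje_a", 5.0), ("purkinje_b", 6.0)],  # fast 1-D tree
        "purkinje_a": [("pmj_1", 3.0)],
        "purkinje_b": [("pmj_2", 3.0)],
        "pmj_1":      [("myo_1", 10.0)],                           # junction + slow tissue
        "pmj_2":      [("myo_2", 10.0)],
        "myo_1":      [("myo_2", 15.0)],                           # slow 3-D myocardium
    }
    print(activation_times(graph, source="his"))
```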


IEEE Transactions on Biomedical Engineering | 2011

Construction of a Computational Anatomical Model of the Peripheral Cardiac Conduction System

Rafael Sebastian; Viviana Zimmerman; Daniel Romero; Alejandro F. Frangi

A methodology is presented here for the automatic construction of a ventricular model of the cardiac conduction system (CCS), which is currently a missing block in many multiscale cardiac electromechanical models. It includes the His bundle, the left bundle branches, and the peripheral CCS. The algorithm is fundamentally an enhancement of a rule-based method known as Lindenmayer systems (L-systems). The generative procedure has been divided into three consecutive, independent stages, which subsequently build the CCS from proximal to distal sections. Each stage is governed by a set of user parameters together with anatomical and physiological constraints that direct the generation process and adhere to the structural observations derived from histological studies. Several parameters are defined using statistical distributions to introduce stochastic variability into the models. The CCS built with this approach can generate electrical activation sequences with physiological characteristics.
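
To convey the staged, stochastic flavor of the generative procedure, here is a minimal Python sketch in which each stage grows children from the terminal branches of the previous one, drawing lengths and angles from distributions; the stage settings and distributions are placeholders, not the parameters or constraints used in the paper.

```python
# Staged, stochastic branch generation: each stage grows the tree from
# proximal to distal sections, drawing branch lengths and angles from
# distributions. All distributions and stage settings are illustrative.

import random

def grow_stage(parents, n_children, length_mean, length_sd, rng):
    """Each terminal branch of the previous stage spawns n_children children."""
    children = []
    for parent_idx, _parent in enumerate(parents):
        for _ in range(n_children):
            children.append({
                "parent": parent_idx,                    # index into the previous stage
                "length": max(0.1, rng.gauss(length_mean, length_sd)),
                "angle": rng.uniform(-60.0, 60.0),       # degrees off the parent axis
            })
    return children

if __name__ == "__main__":
    rng = random.Random(42)
    his_bundle = [{"parent": None, "length": 20.0, "angle": 0.0}]
    bundle_branches = grow_stage(his_bundle, n_children=2,
                                 length_mean=30.0, length_sd=5.0, rng=rng)
    peripheral = grow_stage(bundle_branches, n_children=3,
                            length_mean=10.0, length_sd=3.0, rng=rng)
    print(len(bundle_branches), "bundle branches,", len(peripheral), "peripheral branches")
```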


Electronic Notes in Theoretical Computer Science | 2009

A Visual Technique for Web Pages Comparison

María Alpuente; Daniel Romero

Despite the exponential growth of the WWW and the success of the Semantic Web, there is limited support today for handling the information found on the Web. In this scenario, techniques and tools that support effective information retrieval are becoming increasingly important. In this work, we present a technique for recognizing and comparing the visual structural information of Web pages. The technique is based on a classification of the set of HTML tags that is guided by the visual effect of each tag on the whole structure of the page. This allows us to translate a Web page into a normalized form in which groups of HTML tags are mapped to a common canonical one. A metric to compute the distance between two different pages is also introduced. Then, by means of a compression process, we are also able to reduce the complexity of recognizing similar structures as well as the processing time when comparing the differences between two Web pages. Finally, we briefly describe a prototype implementation of our tool along with several examples that demonstrate the feasibility of our approach.
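
As a rough sketch of the normalization-and-comparison idea (with a made-up tag classification and a plain edit-distance metric rather than the paper's definitions), the following Python code maps HTML tags to visual classes, turns each page into a canonical sequence, and computes a normalized distance between two pages.

```python
# Sketch of the normalize-then-compare idea: HTML tags are mapped to a small
# set of visual classes, each page becomes a canonical sequence, and a
# normalized edit distance serves as a simplified page distance.

from html.parser import HTMLParser

TAG_CLASS = {
    "table": "BLOCK", "div": "BLOCK", "p": "BLOCK", "ul": "BLOCK", "ol": "BLOCK",
    "h1": "HEAD", "h2": "HEAD", "h3": "HEAD",
    "b": "INLINE", "i": "INLINE", "span": "INLINE", "a": "INLINE",
    "img": "MEDIA",
}

class Normalizer(HTMLParser):
    def __init__(self):
        super().__init__()
        self.sequence = []
    def handle_starttag(self, tag, attrs):
        self.sequence.append(TAG_CLASS.get(tag, "OTHER"))

def normalize(page):
    parser = Normalizer()
    parser.feed(page)
    return parser.sequence

def edit_distance(a, b):
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        cur = [i]
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (x != y)))
        prev = cur
    return prev[-1]

def page_distance(page1, page2):
    s1, s2 = normalize(page1), normalize(page2)
    return edit_distance(s1, s2) / max(len(s1), len(s2), 1)

if __name__ == "__main__":
    a = "<div><h1>Title</h1><p><b>text</b></p></div>"
    b = "<table><h2>Title</h2><p><i>text</i></p></table>"
    print(page_distance(a, b))   # 0.0: same canonical structure
```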


Science of Computer Programming | 2014

Using conditional trace slicing for improving Maude programs

María Alpuente; Demis Ballis; Francisco Frechina; Daniel Romero

Understanding the behavior of software is important for improving existing software. In this paper, we present a trace slicing technique that is suitable for analyzing complex, textually large computations in rewriting logic, which is a general framework, efficiently implemented in the Maude language, that seamlessly unifies a wide variety of logics and models of concurrency. Given a Maude execution trace T and a slicing criterion for the trace (i.e., a piece of information that we want to observe in the final computation state), we traverse T from back to front, and the backward dependence of the observed information is incrementally computed at each execution step. At the end of the traversal, a simplified trace slice is obtained by filtering out all the irrelevant data that do not impact the data of interest. By narrowing the size of the trace, the slicing technique favors better inspection and debugging activities, since the most tedious and irrelevant inspections that are routinely performed during diagnosis and bug localization can be eliminated automatically. Moreover, cutting down the execution trace can expose opportunities for further improvement, which we illustrate by means of several examples that we execute using iJulienne, a trace slicer that implements our conditional slicing technique and is endowed with a trace querying mechanism that increases flexibility and reduction power.


International Conference on Logic Programming | 2012

Backward trace slicing for conditional rewrite theories

María Alpuente; Demis Ballis; Francisco Frechina; Daniel Romero

In this paper, we present a trace slicing technique for rewriting logic that is suitable for analyzing complex, textually large system computations in rewrite theories that may contain conditional equations and/or rules. Given a conditional execution trace T and a slicing criterion for the trace (i.e., a set of positions that we want to observe in the final state of the trace), we traverse T from back to front and, at each rewrite step, we incrementally compute the origins of the observed positions, which is done by inductively processing the conditions of the applied equations and rules. During the traversal, we also carry a Boolean compatibility condition that is needed for the executability of the processed rewrite steps. At the end of the traversal, the trace slice is obtained by filtering out the irrelevant data that do not contribute to the criterion of interest.
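
Extending the earlier slicing sketch to the conditional case, the toy Python code below also accumulates an instantiated condition for every step that actually contributes to the criterion, mimicking the role of the compatibility condition; the dependency maps and condition strings are invented for illustration.

```python
# Toy extension of backward slicing to conditional steps: besides mapping
# output positions to input positions, each step carries an instantiated
# condition, and the slice collects the conditions of the steps it actually
# traverses (the "compatibility condition").

from typing import Dict, List, Set, Tuple

# (dependencies, condition_as_text); condition text is purely illustrative.
CondStep = Tuple[Dict[str, Set[str]], str]

def backward_slice_conditional(trace: List[CondStep], criterion: Set[str]):
    relevant = set(criterion)
    conditions: List[str] = []
    for deps, cond in reversed(trace):
        if relevant & deps.keys():        # the step contributed to the criterion
            conditions.append(cond)
        prev: Set[str] = set()
        for pos in relevant:
            prev |= deps.get(pos, {pos})  # untouched positions depend on themselves
        relevant = prev
    return relevant, list(reversed(conditions))

if __name__ == "__main__":
    trace = [
        ({"2": {"1"}}, "x > 0"),
        ({"3": {"2"}}, "y == f(x)"),
    ]
    print(backward_slice_conditional(trace, {"3"}))
    # -> initial relevant positions plus the accumulated conditions
```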


Software Engineering and Formal Methods | 2006

A Semi-Automatic Methodology for Repairing Faulty Web Sites

María Alpuente; Demis Ballis; Moreno Falaschi; Daniel Romero

The development and maintenance of Web sites are difficult tasks. To maintain the consistency of ever-larger, complex Web sites, Web administrators need effective mechanisms that assist them in fixing every possible inconsistency. In this paper, we present a novel methodology for semi-automatically repairing faulty Web sites that can be integrated on top of an existing rewriting-based verification technique developed in previous work. Starting from a categorization of the kinds of errors that can be found during Web verification activities, we formulate a stepwise transformation procedure that achieves correctness and completeness of the Web site w.r.t. its formal specification while respecting the structure of the document (e.g., the schema of an XML document). Finally, we briefly describe a prototype implementation of the repair tool, which we used for an experimental evaluation of our method.
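
A toy sketch of the repair step, assuming a much-simplified site model and error categorization: each reported error kind (a missing page for completeness, a forbidden fragment for correctness) triggers a corresponding transformation. The categories and repair actions are illustrative placeholders, not the paper's rule-based procedure.

```python
# Toy repair loop: verification reports are categorized and each category
# triggers a transformation of a simple dict-based site model.

def repair(site, errors):
    """site: dict page_name -> page text; errors: list of (kind, detail)."""
    for kind, detail in errors:
        if kind == "missing_page":
            # Completeness repair: add a stub so required links resolve.
            site.setdefault(detail, "<html><body>TODO: required page</body></html>")
        elif kind == "forbidden_pattern":
            page, pattern = detail
            # Correctness repair: delete the offending fragment.
            site[page] = site[page].replace(pattern, "")
    return site

if __name__ == "__main__":
    site = {"home.html": "<html><body>Hello <blink>SALE!</blink></body></html>"}
    errors = [
        ("missing_page", "contact.html"),
        ("forbidden_pattern", ("home.html", "<blink>SALE!</blink>")),
    ]
    print(repair(site, errors))
```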


Web Reasoning and Rule Systems | 2007

A fast algebraic web verification service

María Alpuente; Demis Ballis; Moreno Falaschi; Pedro Ojeda; Daniel Romero

In this paper, we present the rewriting-based Web verification service WebVerdi-M, which is able to recognize forbidden/incorrect patterns and incomplete/missing Web pages. WebVerdi-M relies on a powerful Web verification engine written in Maude, which automatically derives the error symptoms. Thanks to the AC pattern matching supported by Maude and its metalevel facilities, WebVerdi-M enjoys much better performance and usability than a previous implementation of the verification framework. Using the XML benchmarking tool xmlgen, we developed scalable experiments that demonstrate the usefulness of our approach.
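
As a minimal illustration of forbidden-pattern checking (WebVerdi-M itself uses Maude's AC pattern matching and a rule-based specification language), the following Python sketch scans an XML page for a hypothetical forbidden element.

```python
# Minimal illustration of forbidden-pattern checking on an XML page using
# Python's ElementTree; the page content and the forbidden pattern are
# invented examples, not the service's specification language.

import xml.etree.ElementTree as ET

def find_forbidden(page_xml, forbidden_tag, forbidden_text=None):
    """Return elements whose tag (and optionally text) match the forbidden pattern."""
    root = ET.fromstring(page_xml)
    hits = []
    for elem in root.iter(forbidden_tag):
        if forbidden_text is None or (elem.text or "").strip() == forbidden_text:
            hits.append(elem)
    return hits

if __name__ == "__main__":
    page = "<page><user><name>alice</name><password>secret</password></user></page>"
    # Forbidden pattern: a password element appearing anywhere in a public page.
    leaks = find_forbidden(page, "password")
    print("forbidden pattern found:", len(leaks), "time(s)")
```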

Collaboration


Dive into Daniel Romero's collaborations.

Top Co-Authors

María Alpuente
Polytechnic University of Valencia

Marcela Daniele
National University of Río Cuarto

Francisco Frechina
Polytechnic University of Valencia

Javier Espert
Polytechnic University of Valencia

Pedro Ojeda
Polytechnic University of Valencia