A. Benso
Catholic University of the Sacred Heart
Publications
Featured research published by A. Benso.
european test symposium | 2000
M. Lobetti Bodoni; A. Benso; Silvia Anna Chiusano; S. Di Carlo; G. Di Natale; Paolo Ernesto Prinetto
This paper proposes a solution to the problem of testing a system containing many distributed memories of different sizes. The proposed solution relies on the development of a BIST architecture characterized by a single BIST processor, implemented as a microprogrammable machine able to execute different test algorithms; a wrapper for each SRAM, including standard memory BIST modules; and an interface block to manage communication between the SRAM and the BIST processor. Both area overhead and routing costs are minimized, and a scan-based approach provides full diagnostic capabilities for any faults detected in the memories under test.
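As a rough illustration of the shared-processor idea described above, the sketch below runs one march-style test algorithm over several memories of different sizes from a single controller loop. The algorithm chosen (a simplified MATS+) and all names are assumptions for the example, not the paper's actual architecture.

```python
# Hypothetical sketch: one BIST "processor" applying the same test
# algorithm to many SRAMs of different sizes, modeled as Python lists.

def march_element(mem, op_sequence, ascending=True):
    """Apply a sequence of (operation, value) steps to every address."""
    addrs = range(len(mem)) if ascending else range(len(mem) - 1, -1, -1)
    errors = []
    for a in addrs:
        for op, val in op_sequence:
            if op == "w":
                mem[a] = val
            elif op == "r" and mem[a] != val:
                errors.append((a, val, mem[a]))  # (addr, expected, read)
    return errors

def run_bist(memories):
    """Single controller, many memories: a simplified MATS+ sequence."""
    report = {}
    for name, mem in memories.items():
        errs = []
        errs += march_element(mem, [("w", 0)])                   # up(w0)
        errs += march_element(mem, [("r", 0), ("w", 1)])         # up(r0,w1)
        errs += march_element(mem, [("r", 1), ("w", 0)], False)  # down(r1,w0)
        report[name] = errs
    return report

mems = {"sram_small": [0] * 16, "sram_big": [0] * 256}
print(run_bist(mems))  # fault-free memories -> empty error lists
```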
IEEE Transactions on Computers | 2012
Alessandro Savino; S. Di Carlo; Gianfranco Michele Maria Politano; A. Benso; Alberto Bosio; G. Di Natale
What is the probability that the execution state of a given microprocessor running a given application is correct, in a certain working environment with a given soft-error rate? Trying to answer this question using fault injection can be very expensive and time-consuming. This paper proposes the baseline for a new methodology, based on microprocessor error probability profiling, that aims at estimating fault injection results without the need for a typical fault injection setup. The proposed methodology is based on two main ideas: a one-time fault-injection analysis of the microprocessor architecture to characterize the probability of successful execution of each of its instructions in the presence of a soft error, and a static and very fast analysis of the control and data flow of the target software application to compute its probability of success. The presented work goes beyond the dependability evaluation problem; it also has the potential to become the backbone for new tools that help engineers choose the best hardware and software architecture to structurally maximize the probability of correct execution of the target software.
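A minimal sketch of the composition step this methodology implies: given per-instruction success probabilities (obtained, per the abstract, from a one-time fault-injection characterization), combine them over the instructions an application executes. The opcode names and probability values below are invented for illustration; independence of errors is an assumption of this sketch, not a claim from the paper.

```python
import math

# Hypothetical per-opcode probability of correct execution under a
# given soft-error rate, as produced by a one-time characterization.
P_OK = {"add": 0.9999, "load": 0.9990, "store": 0.9992, "branch": 0.9995}

def program_success_probability(instruction_trace):
    """Assuming independent errors, the program executes correctly
    only if every executed instruction does; sum logs for stability."""
    log_p = sum(math.log(P_OK[op]) for op in instruction_trace)
    return math.exp(log_p)

trace = ["load", "add", "add", "store", "branch"] * 1000
print(f"P(correct execution) = {program_success_probability(trace):.4f}")
```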
IEEE Transactions on Very Large Scale Integration Systems | 2008
A. Benso; S. Di Carlo; Paolo Ernesto Prinetto; Yervant Zorian
Core-based design and reuse are the two key elements for efficient system-on-chip (SoC) development. Unfortunately, they also introduce new challenges in SoC testing, such as core test reuse and the need for a common test infrastructure working with cores originating from different vendors. The IEEE 1500 Standard for Embedded Core Testing addresses these issues by proposing a flexible hardware test wrapper architecture for embedded cores, together with a core test language (CTL) used to describe the implemented wrapper functionalities. Several intellectual property providers have already announced IEEE Standard 1500 compliance in both existing and future design blocks. In this paper, we address the problem of guaranteeing the compliance of a wrapper architecture and its CTL description with the IEEE Standard 1500. This step is mandatory to fully trust the wrapper functionalities in applying the test sequences to the core. We present a systematic methodology to build a verification framework for IEEE Standard 1500 compliant cores, allowing core providers and/or integrators to verify the compliance of their products (sold or purchased) with the standard.
IEEE Transactions on Computers | 2008
A. Benso; A. Bosio; S. Di Carlo; G. Di Natale; Paolo Ernesto Prinetto
Memory testing commonly faces two issues: the characterization of detailed and realistic fault models and the definition of time-efficient test algorithms. Among the different types of algorithms proposed for testing static random access memories, march tests have proven to be faster, simpler, and regularly structured. The majority of the published march tests have been manually generated. Unfortunately, the continuous evolution of memory technology introduces new classes of faults, such as dynamic and linked faults, which makes handwriting test algorithms harder and does not always lead to optimal results. Although some researchers have published handmade march tests able to deal with new fault models, a comprehensive methodology to automatically generate march tests addressing both classic and new fault models is still an open issue. This paper proposes a new polynomial algorithm to automatically generate march tests. The formal model adopted to represent memory faults allows the definition of a general methodology to deal with static, dynamic, and linked faults. Experimental results show that the new automatically generated march tests reduce the test complexity and, therefore, the test time, compared to the well-known state of the art in memory testing.
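To make the "regularly structured" nature of march tests concrete, the sketch below encodes a well-known hand-made march test (March C-) as a list of march elements and computes its complexity in operations per cell. The encoding is a common textbook representation, not the paper's generation algorithm.

```python
# A march test is a sequence of march elements: an address order
# ("up"/"down") plus a list of read/write operations applied at
# every address before moving to the next one.
MARCH_C_MINUS = [
    ("up",   ["w0"]),
    ("up",   ["r0", "w1"]),
    ("up",   ["r1", "w0"]),
    ("down", ["r0", "w1"]),
    ("down", ["r1", "w0"]),
    ("up",   ["r0"]),
]

def complexity(march_test):
    """Operations per memory cell; March C- is the classic 10n test."""
    return sum(len(ops) for _, ops in march_test)

print(complexity(MARCH_C_MINUS))  # -> 10, i.e. a 10n march test
```

Automatic generation, as the paper proposes, amounts to searching this space of element sequences for the shortest test whose read operations expose every fault in a given fault model.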
computational intelligence in bioinformatics and computational biology | 2008
A. Benso; S. Di Carlo; Gianfranco Michele Maria Politano; Luca Sterpone
This paper proposes a new and very flexible data model, called gene expression graph (GEG), for gene expression analysis and classification. Three features differentiate GEGs from other available microarray data representation structures: (i) the memory occupation of a GEG is independent of the number of samples used to build it; (ii) a GEG more clearly expresses relationships among expressed and non-expressed genes in both healthy and diseased tissue experiments; (iii) GEGs make it easy to implement very efficient classifiers. The paper also presents a simple classifier for sample-based classification to show the flexibility and user-friendliness of the proposed data structure.
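One plausible reading of property (i) can be sketched as follows: nodes are genes, an edge links two genes expressed together in a sample, and edge weights accumulate across samples, so the graph's size is bounded by the number of genes rather than the number of samples. The threshold, data layout, and construction rule below are assumptions for illustration, not the paper's exact definition of a GEG.

```python
from collections import defaultdict
from itertools import combinations

def build_geg(samples, threshold=0.5):
    """samples: list of {gene: expression_level} dicts.
    Returns {(gene_a, gene_b): weight}, where weight counts in how
    many samples the two genes were expressed together."""
    weights = defaultdict(int)
    for sample in samples:
        expressed = sorted(g for g, v in sample.items() if v >= threshold)
        for g1, g2 in combinations(expressed, 2):
            weights[(g1, g2)] += 1  # graph grows with genes, not samples
    return dict(weights)

healthy = [{"A": 0.9, "B": 0.8, "C": 0.1},
           {"A": 0.7, "B": 0.9, "C": 0.2}]
print(build_geg(healthy))  # -> {('A', 'B'): 2}
```

Adding more samples only increments existing edge weights, which is the sense in which memory occupation stays independent of sample count.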
bioinformatics and bioengineering | 2008
A. Benso; S. Di Carlo; Gianfranco Michele Maria Politano; Luca Sterpone
This paper proposes an innovative data structure to be used as a backbone in designing microarray phenotype sample classifiers. The data structure is based on graphs and is built from a differential analysis of the expression levels of healthy and diseased tissue samples in a microarray dataset. By construction, the data structure exhibits a number of properties well suited to problems such as feature extraction, clustering, and classification.
european conference on radiation and its effects on components and systems | 2001
A. Benso; S. Di Carlo; G. Di Natale; Paolo Ernesto Prinetto; L. Tagliaferri
This paper proposes a C/C++ source-to-source compiler able to increase the dependability properties of a given application. The adopted strategy is based on two main techniques: variable duplication/triplication and control flow checking. These techniques are validated by emulating fault occurrence through software fault injection. The chosen test case is a client-server application that computes and draws a Mandelbrot fractal.
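The two hardening techniques named above can be illustrated with a hand-written example of the kind of code such a compiler could emit: every live variable gets a shadow copy that is compared against the original, and a run-time signature is checked against the expected control flow. This is a toy in Python rather than the authors' C/C++ tool, and all names are invented.

```python
class FaultDetected(Exception):
    """Raised when a consistency check fails at run time."""

def harden_sum(values):
    # Variable duplication: each live variable has a shadow copy,
    # updated by duplicated statements and compared after each use.
    total, total_dup = 0, 0
    # Control-flow checking: the loop body updates a signature that
    # must match the statically known expected value at loop exit.
    sig, expected_sig = 0, len(values)
    for v in values:
        total += v
        total_dup += v            # duplicated statement
        sig += 1                  # signature update for this block
        if total != total_dup:    # data-error check
            raise FaultDetected("variable mismatch")
    if sig != expected_sig:       # control-flow check
        raise FaultDetected("control-flow error")
    return total

print(harden_sum([1, 2, 3]))  # -> 6
```

A bit flip corrupting `total` (but not `total_dup`) would trip the first check; a skipped or repeated iteration would trip the second.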
IEEE Design & Test of Computers | 2009
A. Benso; S. Di Carlo; Paolo Ernesto Prinetto; A. Bosio
Functional verification of complex SoC designs is a challenging task, which fortunately is increasingly supported by automation. This article proposes a verification component for IEEE Std 1500, to be plugged into a commercial verification tool suite.
international biennial baltic electronics conference | 2008
A. Benso; S. Di Carlo; Paolo Ernesto Prinetto; Alessandro Savino; Alberto Scionti
Test coverage evaluation is one of the most critical issues in microprocessor software-based testing. Whenever the test is developed in the absence of a structural model of the microprocessor, the evaluation of the final test coverage may become a major issue. In this paper, we present a microprocessor modeling technique based on entity-relationship diagrams allowing the definition and the computation of custom coverage functions. The proposed model is very flexible and particularly effective when a structural model of the microprocessor is not available.
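A toy illustration of a custom coverage function over such a model: entities (instructions, registers) and their relationships form a set of modeled pairs, and coverage is the fraction of pairs exercised by a test program. The entity and relationship names below are invented for the example and do not come from the paper.

```python
# Hypothetical ER model fragment: relationships (instr, reg) meaning
# "instruction instr reads or writes register reg".
MODEL = {("ADD", "R1"), ("ADD", "R2"), ("LOAD", "R1"), ("STORE", "R2")}

def coverage(executed_pairs):
    """Custom coverage function: fraction of modeled
    instruction/register relationships exercised by the test."""
    return len(MODEL & set(executed_pairs)) / len(MODEL)

test_program = [("ADD", "R1"), ("LOAD", "R1")]
print(f"{coverage(test_program):.0%}")  # -> 50%
```

Because the metric is defined over the ER model rather than over gates or nets, it can be computed even when no structural model of the microprocessor is available, which is the situation the paper targets.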
Archive | 2018
Roberta Bardini; Gianfranco Michele Maria Politano; A. Benso; Stefano Di Carlo
Synthetic Biology is characterized by a forward-engineering approach to the design of biological systems implementing desired functionalities. The Synthetic Biology design cycle benefits from understanding and properly representing the underlying biological complexity, allowing the behavior of the target system to be predicted. Considering the intrinsic nature of the systems to be designed from a Systems Biology perspective is a key requirement to support the Synthetic Biology design cycle. In particular, good models for synthetic biological systems must express hierarchy, encapsulation, selective communication, spatiality, quantitative mechanisms, and stochasticity. Computational models not only properly handle such modeling requirements; they can also manage heterogeneous information in compositional processes, support formal analysis and simulation, and can further be exploited for knowledge interchange among the scientific community. In particular, the nets-within-nets formalism expresses all of these features while providing high flexibility in the modeling task. The formalism is well suited to representing heterogeneous systems and in general provides extraordinary expressivity, thanks to its capability of tuning the abstraction level in each part of the model.
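The hierarchy and encapsulation properties mentioned above can be sketched with a very small toy: a Petri-net-like object whose tokens may themselves be nets, so a nested net moves through the outer net as a token while keeping its own internal dynamics. This is an illustrative simplification of the nets-within-nets idea, not an implementation of the formalism; all class and place names are invented.

```python
class Net:
    """A minimal marking-based net: places hold lists of tokens,
    and a token may itself be another Net (nets within nets)."""

    def __init__(self, name, marking):
        self.name = name
        self.marking = marking  # place -> list of tokens

    def fire(self, src, dst):
        """Move one token from place src to place dst, if enabled."""
        if not self.marking.get(src):
            return False
        self.marking.setdefault(dst, []).append(self.marking[src].pop())
        return True

# An object net (e.g. a molecule) traveling inside a system net
# (e.g. a two-compartment cell model).
molecule = Net("molecule", {"inactive": [1], "active": []})
cell = Net("cell", {"cytoplasm": [molecule], "nucleus": []})

cell.fire("cytoplasm", "nucleus")                      # nested net moves as a token
cell.marking["nucleus"][0].fire("inactive", "active")  # and evolves internally
print([n.name for n in cell.marking["nucleus"]])       # -> ['molecule']
```

Encapsulation falls out naturally: the outer net never inspects the inner marking, and the abstraction level can differ between the two layers.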