Michael Masin
IBM
Publications
Featured research published by Michael Masin.
Embedded Software | 2013
David Broman; Christopher Brooks; Lev Greenberg; Edward A. Lee; Michael Masin; Stavros Tripakis; Michael Wetter
In this paper, we explain how to achieve deterministic execution of FMUs (Functional Mockup Units) under the FMI (Functional Mockup Interface) standard. In particular, we focus on co-simulation, where an FMU either contains its own internal simulation algorithm or serves as a gateway to a simulation tool. We give conditions on the design of FMUs and master algorithms (which orchestrate the execution of FMUs) to achieve deterministic co-simulation. We show that with the current version of the standard, these conditions demand capabilities from FMUs that are optional in the standard and rarely provided by an FMU in practice. When FMUs lacking these required capabilities are used to compose a model, many basic modeling capabilities become unachievable, including simple discrete-event simulation and variable-step-size numerical integration algorithms. We propose a small extension to the standard and a policy for designing FMUs that enables deterministic execution for a much broader class of models. The extension enables a master algorithm to query an FMU for the time of events that are expected in the future. We show that a model can be executed deterministically if all FMUs in the model are either memoryless or implement one of rollback or step-size prediction. We show further that such a model can contain at most one “legacy” FMU that is not memoryless and provides neither rollback nor step-size prediction.
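The master-algorithm conditions above can be sketched in a few lines. This is a hypothetical, Python-flavored FMU interface for illustration only; the real FMI standard defines a C API with different names and semantics. Each FMU reports the time of its next expected event (the step-size-prediction capability the paper proposes), and the master advances every FMU by the minimum such step, which keeps execution deterministic.

```python
# Hypothetical, simplified FMU interface for illustrating a deterministic
# master-algorithm loop; not the actual FMI C API.

class SimpleFMU:
    """A memoryless gain block: output = 2 * input, no internal events."""
    def __init__(self):
        self.time = 0.0
        self.input = 0.0
        self.output = 0.0

    def next_event_time(self):
        # Step-size prediction: this FMU never generates events on its own.
        return float("inf")

    def do_step(self, h):
        self.time += h
        self.output = 2.0 * self.input

class PeriodicSource:
    """Emits an event every `period` seconds and predicts its next event time."""
    def __init__(self, period):
        self.period = period
        self.time = 0.0
        self.output = 0.0

    def next_event_time(self):
        k = int(self.time / self.period) + 1
        return k * self.period

    def do_step(self, h):
        self.time += h
        on_event = abs(self.time % self.period) < 1e-12
        self.output = 1.0 if on_event else 0.0

def master(fmus, connections, t_end):
    """Advance all FMUs by the minimum predicted event step (deterministic)."""
    t = 0.0
    while t < t_end:
        h = min(f.next_event_time() for f in fmus) - t
        h = min(h, t_end - t)
        for f in fmus:
            f.do_step(h)
        for src, dst in connections:
            dst.input = src.output
        t += h
    return t
```

Because every step size is derived from the FMUs' own predictions rather than from wall-clock heuristics, repeated runs visit exactly the same sequence of times.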
International Conference on Hybrid Systems: Computation and Control | 2015
David Broman; Lev Greenberg; Edward A. Lee; Michael Masin; Stavros Tripakis; Michael Wetter
This paper defines a suite of requirements for future hybrid cosimulation standards, and specifically provides guidance for development of a hybrid cosimulation version of the Functional Mockup Interface (FMI). A cosimulation standard defines interfaces that enable diverse simulation tools to interoperate. Specifically, one tool defines a component that forms part of a simulation model in another tool. We focus on components with inputs and outputs that are functions of time, and specifically on mixtures of discrete events and continuous time signals. This hybrid mixture is not well supported by existing cosimulation standards, and specifically not by FMI 2.0, for reasons that are explained in this paper. The paper defines a suite of test components, giving a mathematical model of an ideal behavior, plus a discussion of practical implementation considerations. The discussion includes acceptance criteria by which we can determine whether a standard supports definition of each component. In addition, we define a set of test compositions that define requirements for coordination between components, including consistent handling of timed events.
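One way to picture the structure of such a test suite is a component given as an ideal mathematical behavior plus an acceptance criterion against which an implementation's samples are checked. The pure-delay component and tolerance below are invented for illustration and are not taken from the paper's actual suite.

```python
# Illustrative hybrid-cosimulation test component: an ideal behavior as a
# function of time, plus an acceptance check over a candidate's samples.
# Component choice and tolerance are made up for this sketch.

def ideal_unit_delay(x, d):
    """Ideal behavior of a pure time delay: y(t) = x(t - d), 0 before start."""
    return lambda t: x(t - d) if t >= d else 0.0

def accepts(candidate_samples, ideal, tol=1e-9):
    """Acceptance criterion: every (t, y) sample matches the ideal behavior."""
    return all(abs(y - ideal(t)) <= tol for t, y in candidate_samples)
```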
Software and Systems Modeling | 2017
Fabio Cremona; Marten Lohstroh; David Broman; Edward A. Lee; Michael Masin; Stavros Tripakis
Model-based design methodologies are commonly used in industry for the development of complex cyber-physical systems (CPSs). There are many different languages, tools, and formalisms for model-based design, each with its strengths and weaknesses. Instead of accepting some weaknesses of a particular tool, an alternative is to embrace heterogeneity, and to develop tool integration platforms and protocols to leverage the strengths from different environments. A fairly recent attempt in this direction is the functional mock-up interface (FMI) standard that includes support for co-simulation. Although this standard has reached acceptance in industry, it provides only limited support for simulating systems that mix continuous and discrete behavior, which are typical of CPS. This paper identifies the representation of time as a key problem, because the FMI representation does not support well the discrete events that typically occur at the cyber-physical boundary. We analyze alternatives for representing time in hybrid co-simulation and conclude that a superdense model of time using integers only solves many of these problems. We show how an execution engine can pick an adequate time resolution, and how disparities between time representations internal to co-simulated components and the resulting effects of time quantization can be managed. We propose a concrete extension to the FMI standard for supporting hybrid co-simulation that includes integer time, automatic choice of time resolution, and the use of absent signals. We explain how these extensions can be implemented modularly within the frameworks of existing simulation environments.
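The integer-time idea can be illustrated with a small sketch; the helper names and the resolution-selection rule here are illustrative assumptions, not the proposed FMI extension itself. Time is represented as an integer count of ticks at a global resolution chosen so that every component's period is exactly representable, which avoids the floating-point quantization problems the paper analyzes.

```python
import math
from fractions import Fraction

# Sketch of integer time with automatic resolution choice: pick the coarsest
# resolution (seconds per tick) that represents all given periods exactly,
# i.e. the GCD of the periods as exact fractions of a second.

def choose_resolution(periods):
    """gcd(a/b, c/d) = gcd(a*d, c*b) / (b*d), folded over all periods."""
    fracs = [Fraction(p).limit_denominator(10**12) for p in periods]
    g = fracs[0]
    for f in fracs[1:]:
        g = Fraction(math.gcd(g.numerator * f.denominator,
                              f.numerator * g.denominator),
                     g.denominator * f.denominator)
    return g  # seconds per tick

def to_ticks(t_seconds, resolution):
    """Convert a time in seconds to an exact integer tick count."""
    q = Fraction(t_seconds).limit_denominator(10**12) / resolution
    assert q.denominator == 1, "time not representable at this resolution"
    return q.numerator
```

For example, components with periods 0.1 s and 0.25 s share the exact resolution 0.05 s, so all their event times become integers and comparisons for simultaneity are exact.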
Design, Automation and Test in Europe | 2015
Nikunj Bajaj; Pierluigi Nuzzo; Michael Masin; Alberto L. Sangiovanni-Vincentelli
We address the problem of synthesizing safety-critical cyber-physical system architectures to minimize a cost function while guaranteeing the desired reliability. We cast the problem as an integer linear program on a reconfigurable graph which models the architecture. Since generating symbolic probability constraints by exhaustive enumeration of failure cases on all possible graph configurations takes exponential time, we propose two algorithms to reduce the problem complexity: Integer-Linear Programming Modulo Reliability (ILP-MR) and Integer-Linear Programming with Approximate Reliability (ILP-AR). We compare the two approaches and demonstrate their effectiveness on the design of aircraft electric power system architectures.
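As a toy stand-in for the cost-versus-reliability trade-off the paper formulates as an ILP, the sketch below brute-forces the selection of redundant components; the component data are invented, and the exhaustive enumeration is precisely what ILP-MR and ILP-AR are designed to avoid on realistic problem sizes.

```python
import itertools

# Toy architecture selection: choose a subset of candidate components (each
# with a cost and an independent failure probability) so that at least one
# survives, minimizing total cost subject to a reliability target.
# Brute force stands in for the paper's integer linear programs.

def synthesize(components, target_reliability):
    """components: list of (cost, failure_prob). Returns (cost, subset) or None."""
    best = None
    for k in range(1, len(components) + 1):
        for subset in itertools.combinations(components, k):
            p_all_fail = 1.0
            for _, p in subset:
                p_all_fail *= p
            reliability = 1.0 - p_all_fail
            cost = sum(c for c, _ in subset)
            if reliability >= target_reliability and (best is None or cost < best[0]):
                best = (cost, subset)
    return best
```

With two cheap unreliable generators and one expensive reliable one, redundancy wins: two cheap units in parallel meet the target at lower total cost.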
Journal of Scheduling | 2014
Michael Masin; Tal Raviv
We study a generalized version of the minimum-makespan job-shop problem in which multiple instances of each job are to be processed. The system starts with specified inventory levels in all buffers and must finish with desired inventory levels in the buffers at the end of the planning horizon. A schedule that minimizes the completion time of all the operations is sought. We develop a polynomial-time asymptotic approximation procedure for the problem: the ratio between the value of the delivered solution and the optimal one converges to one as the multiplicity of the problem increases. Our algorithm uses the solution of the linear relaxation of a time-indexed mixed-integer formulation of the problem. In addition, a heuristic method inspired by this approximation algorithm is presented and is numerically shown to outperform known methods on a large set of standard test problems of moderate job multiplicity.
Procedia Computer Science | 2013
Michael Masin; Lior Limonad; Aviad Sela; David Boaz; Lev Greenberg; Nir Mashkif; Ran Rinat
Viewpoint modeling is an effective approach for analyzing and designing complex systems. Splitting the various elements and corresponding constraints into different perspectives of interest enables separation of concerns such as domains of expertise, levels of abstraction, and stages in the lifecycle. In Systems Engineering, for example, different viewpoints could include functional requirements, physical architecture, safety, geometry, timing, and scenarios. Despite partial interdependences, the models are usually developed independently by different parties, using different tools and languages. However, the essence of Systems Engineering requires repeated integration of many viewpoints in order to find feasible designs and to make good architectural decisions, e.g., in each mapping between consecutive levels of abstraction and in each design-space exploration. This integration into one consistent model becomes a significant challenge from both modeling and information-management perspectives. In this paper we propose (1) a modular algebraic viewpoint representation that is robust to design evolution and suitable for generating integrated optimization/analysis models, and (2) an underlying ontology-based approach for consistently integrating local viewpoint concepts into a unified design-space model. We show an example of an optimization model with different combinations of partially interdependent Analysis Viewpoints. Using the proposed modeling and information-management approaches, the underlying viewpoint equations can be applied without modification, making the approach pluggable.
Systems, Man, and Cybernetics | 2013
Roy S. Kalawsky; Yingchun Tian; Demetrios Joannou; Imad Sanduka; Michael Masin
Creation and management of large, complex systems of systems (SoS) can be a daunting task for even the most experienced engineers. Architecting these systems requires considerable domain expertise and next-generation tools that seamlessly link requirements elicitation through modeling and simulation to implementation. The resulting SoS can be so complex that it is virtually impossible for a single person to understand all its interactions. An important aid to this process is the use of architecture patterns, which provide blueprints for proven elements of a system that can be incorporated into the evolution of a system or service. This paper describes a novel approach to validating architecture patterns during the early phases of service or product design and development. We propose an architecture optimization method called Concise Modeling to be used in conjunction with architecture patterns to expedite SoS architecture exploration and analysis.
Design, Automation and Test in Europe | 2017
Michael Masin; Francesca Palumbo; H. Myrhaug; J. A. de Oliveira Filho; M. Pastena; Maxime Pelcat; Luigi Raffo; Francesco Regazzoni; A. A. Sanchez; A. Toffetti; E. de la Torre; K. Zedda
In the last few years, besides the concepts of embedded and interconnected systems, the notion of Cyber-Physical Systems (CPS) has emerged: embedded computational devices that collaborate, sense and control physical elements, and often respond to humans. The continuous interaction between the physical and computing layers makes their design and maintenance extremely complex. Uncertainty management and runtime reconfigurability, to mention only the most relevant concerns, are rarely tackled by available toolchains. In this context, the Cross-layer modEl-based fRamework for multi-oBjective dEsign of Reconfigurable systems in unceRtain hybRid envirOnments (CERBERO) EU project aims at developing a design environment for CPS based on two pillars: 1) a cross-layer model-based approach to describe, optimize, and analyze the system and all its different views concurrently, and 2) an advanced adaptivity support based on a multi-layer autonomous engine. In this work, we describe the components and the developments required for seamless design of reusable and reconfigurable CPS and Systems of Systems in uncertain hybrid environments.
Complex Systems Design & Management | 2015
Henry Broodney; Michael Masin; Evgeny Shindin; Uri Shani; Roy S. Kalawsky; Demetrios Joannou; Yingchun Tian; Antara Bhatt; Imad Sanduka
Domain experience is a key driver of design quality, especially during the early design phases of a product or service. Currently, the only practical way to bring such experience into a project is to directly engage subject-matter experts, which creates a potential resource-availability bottleneck when the experts are not available at the time they are required. Whilst many domain-specific tools have attempted to capture expert knowledge in embedded analytics, allowing less experienced engineers to perform complex tasks, this is certainly not the case for highly complex systems of systems, whose architectures can go far beyond what a single human being can comprehend. This paper proposes a new approach to leveraging design expertise in a manner that facilitates architectural exploration and architecture optimization by using pre-defined architecture patterns. In addition, we propose a means to streamline such a process by delineating the knowledge-creation process and the architectural-exploration analytics, with mechanisms to facilitate information flow from the former to the latter through a carefully designed integration framework.
Archive | 2014
Ofer Shir; Shahar Chen; David Amid; Oded Margalit; Michael Masin; Ateret Anaby-Tavor; David Boaz
We consider two complementary tasks for consuming optimization results of a given multiobjective problem by decision-makers. The underpinning in both exploratory tasks is analyzing Pareto landscapes, and we propose in both cases discrete graph-based reductions. Firstly, we introduce interactive navigation from a given suboptimal reference solution to Pareto efficient solution-points. The proposed traversal mechanism is based upon landscape improvement-transitions from the reference towards Pareto-dominating solutions in a baby-steps fashion – accepting relatively small variations in the design-space. The Efficient Frontier and the archive of Pareto suboptimal points are to be obtained by population-based multiobjective solvers, such as Evolutionary Multiobjective Algorithms. Secondly, we propose a framework for automatically recommending a preferable subset of points belonging to the Frontier that accounts for the decision-maker’s tendencies. We devise a line of action that activates one of two approaches: either recommending the top offensive team – the gain-prone subset of points, or the top defensive team – the loss-averse subset of points. We describe the entire recommendation process and formulate mixed-integer linear programs for solving its combinatorial graph-based problems.
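The Pareto machinery underlying both tasks can be sketched as follows, assuming minimization in every objective; the function names are illustrative, and the paper's navigation and recommendation steps operate on graph structures built on top of primitives like these.

```python
# Minimal Pareto utilities: dominance test, frontier extraction, and the
# set of dominating points reachable from a suboptimal reference solution.

def dominates(a, b):
    """a Pareto-dominates b: no worse in all objectives, better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_frontier(points):
    """Keep every point not dominated by some other point."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

def improving_neighbors(reference, points):
    """Points that dominate the reference: candidate improvement transitions."""
    return [p for p in points if dominates(p, reference)]
```

Navigation as described in the abstract then amounts to repeatedly stepping from the reference to a nearby member of `improving_neighbors` until a frontier point is reached.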