
Publication


Featured research published by Robert Seater.


Requirements Engineering | 2007

Requirement progression in problem frames: deriving specifications from requirements

Robert Seater; Daniel Jackson; Rohit Gheyi

A technique is presented for obtaining a specification from a requirement through a series of incremental steps. The starting point is a Problem Frame description, involving a decomposition of the environment into interconnected domains and a formal requirement on phenomena of those domains. In each step, the requirement is moved towards the machine, leaving behind a trail of “breadcrumbs”—partial domain descriptions representing assumptions about the behaviors of those domains. Eventually, the transformed requirement references only phenomena at the interface of the machine and can therefore serve as a specification. Each step is justified by a mechanically checkable implication, ensuring that, if the machine obeys the derived specification and the domain assumptions are valid, the requirement will hold. The technique is formalized in Alloy and demonstrated on two examples.
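Each progression step rests on an implication: if the machine obeys the derived specification and the domain assumptions hold, the original requirement follows. As a rough illustration (not the paper's Alloy formalization), the check can be sketched over a tiny boolean model, with hypothetical phenomena m (machine acts) and d (domain responds):

```python
from itertools import product

def implication_holds(spec, assumptions, requirement, num_vars):
    """Check (spec AND assumptions) => requirement over all boolean valuations."""
    for vals in product([False, True], repeat=num_vars):
        if spec(*vals) and assumptions(*vals) and not requirement(*vals):
            return False  # counterexample: step not justified
    return True

# Hypothetical phenomena: m = machine performs its action, d = domain responds.
spec        = lambda m, d: m             # transformed requirement on the machine
assumptions = lambda m, d: (not m) or d  # "breadcrumb": if m then the domain responds
requirement = lambda m, d: d             # original requirement: domain responds

print(implication_holds(spec, assumptions, requirement, 2))  # True
```

An Alloy analyzer performs the analogous check with `check` assertions over relational models rather than exhaustive boolean enumeration.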


International Symposium on Software Testing and Analysis | 2004

Automating commutativity analysis at the design level

Greg Dennis; Robert Seater; Derek Rayside; Daniel Jackson

Two operations commute if executing them serially in either order results in the same change of state. In a system in which commands may be issued simultaneously by different users, lack of commutativity can result in unpredictable behaviour, even if the commands are serialized, because one user's command may be preempted by another's, and thus executed in an unanticipated state. This paper describes an automated approach to analyzing commutativity. The operations are expressed as constraints in a declarative modelling language such as Alloy, and a constraint solver is used to find violating scenarios. A case study application to the beam scheduling component of a proton therapy machine (originally specified in OCL) revealed several violations of commutativity in which requests from medical technicians in treatment rooms could conflict with the actions of a beam operator in a master control room. Some of the issues involved in automating the analysis for OCL itself are also discussed.
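The commutativity property being checked can be sketched in a few lines. This is not the paper's Alloy-and-constraint-solver approach; it is a hypothetical exhaustive-enumeration analogue over a tiny state space, with made-up operations (a saturating increment and a reset):

```python
def find_commutativity_violation(states, op_a, op_b):
    """Return a start state where op_a;op_b and op_b;op_a disagree, or None."""
    for s in states:
        if op_a(op_b(s)) != op_b(op_a(s)):
            return s  # a "violating scenario" in the paper's sense
    return None

# Hypothetical operations on an integer-counter state.
inc = lambda s: min(s + 1, 3)  # increment, saturating at 3
reset = lambda s: 0            # reset to zero

# inc(reset(0)) = 1 but reset(inc(0)) = 0, so state 0 witnesses a violation.
print(find_commutativity_violation(range(4), inc, reset))  # 0
```

A constraint solver searches the same space of scenarios symbolically instead of enumerating it, which is what makes the approach scale to declarative models.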


Foundations of Software Engineering | 2006

Lightweight extraction of syntactic specifications

Mana Taghdiri; Robert Seater; Daniel Jackson

A method for extracting syntactic specifications from heap-manipulating code is described. The state of the heap is represented as an environment mapping each variable or field to a relational expression. A procedure is executed symbolically, obtaining an environment for the post-state that gives the value of each variable and field in terms of the values of variables and fields of the pre-state. Approximation is introduced by forming relational unions at merge points in the control flow graph, and by widening union-of-join expressions to transitive closures. The resulting analysis is linear in the length of the code and the number of fields, but capable of producing non-trivial specifications of surprising accuracy.
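The core idea of an environment mapping post-state values to pre-state expressions can be sketched for straight-line code. This hypothetical Python sketch handles only simple variable assignments (no heap, merge points, or widening), writing the pre-state value of each variable v as the symbol "v0":

```python
def symbolic_exec(stmts, variables):
    """Symbolically execute assignments, mapping each variable to an
    expression over pre-state values (x0, y0, ...)."""
    env = {v: f"{v}0" for v in variables}  # initially, each variable is its pre-state value
    for target, expr_fn in stmts:
        env[target] = expr_fn(env)  # new symbolic value, built from the current environment
    return env

# Example program:  tmp = x;  x = y;  y = tmp   (a swap)
stmts = [
    ("tmp", lambda e: e["x"]),
    ("x",   lambda e: e["y"]),
    ("y",   lambda e: e["tmp"]),
]
post = symbolic_exec(stmts, ["x", "y", "tmp"])
print(post["x"], post["y"])  # y0 x0
```

The extracted "specification" here is that x's post-state equals y's pre-state and vice versa; the paper's analysis does the analogous derivation with relational expressions over heap fields.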


Eclipse Technology Exchange | 2005

An analysis and visualization for revealing object sharing

Derek Rayside; Lucy Mendel; Robert Seater; Daniel Jackson

Sharing mutable data (via aliasing) is a powerful programming technique. To facilitate sharing, object-oriented programming languages permit the programmer to selectively break encapsulation boundaries. However, sharing data makes programs harder to understand and reason about, because, unlike encapsulated data, shared data cannot be reasoned about in a modular fashion. This paper presents an analysis and a visualizer to help the programmer understand and reason about shared data.
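The non-modular reasoning problem the abstract describes is easy to reproduce. In this minimal illustration (not taken from the paper), a mutation through one reference is visible through an alias, so neither reference can be reasoned about in isolation:

```python
# Two names, one shared mutable object.
a = [1, 2]
b = a          # alias, not a copy
b.append(3)    # mutation through b...
print(a)       # [1, 2, 3]   ...is visible through a
```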


Languages and Compilers for Parallel Computing | 2001

Polynomial time array dataflow analysis

Robert Seater; David Wonnacott

Array dataflow analysis is a valuable tool for supercomputer compilers. However, the worst-case asymptotic time complexities for modern array dataflow analysis techniques are either not well understood or alarmingly high. For example, the Omega Test uses a subset of the 2^(2^(2^(O(n)))) language of Presburger Arithmetic for analysis of affine dependences; its use of uninterpreted function symbols for nonaffine terms introduces additional sources of complexity. Even traditional data dependence analysis of affine dependences is equivalent to integer programming, and is thus NP-complete. These worst-case complexities have raised questions about the wisdom of using array dataflow analysis in a production compiler, despite empirical data that show that various tests run quickly in practice. In this paper, we demonstrate that a polynomial-time algorithm can produce accurate information about the presence of loop-carried array dataflow. We first identify a subdomain of Presburger Arithmetic that can be manipulated (by the Omega Library) in polynomial time; we then describe a modification to prevent exponential blowup of the Omega Library's algorithm for manipulating function symbols. Restricting the Omega Test to these polynomial cases can, in principle, reduce the accuracy of the dataflow information produced. We therefore present the results of our investigation of the effects of these restrictions on the detection of loop-carried array dataflow dependences (which prevent parallelization). These restrictions block parallelization of only a few unimportant loop nests in the approximately 18000 lines of benchmark code we studied. The use of our subdomain of Presburger Arithmetic also gives a modest reduction in analysis time, even with our current unoptimized implementation, as long as we do not employ our modified algorithms for function symbols. The data collected in our empirical studies also suggest directions for improving both accuracy and efficiency.
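For contrast with the Omega Test's expressive but costly machinery, the classic GCD test is an example of a polynomial-time (indeed constant-time per pair) dependence check for affine subscripts. This sketch is not the paper's algorithm; it only decides whether a write to a[p*i + q] and a read of a[r*i + s] can ever touch the same element:

```python
from math import gcd

def gcd_test_may_depend(p, q, r, s):
    """May a[p*i1 + q] and a[r*i2 + s] refer to the same element?
    p*i1 - r*i2 = s - q has an integer solution iff gcd(p, r) divides s - q."""
    g = gcd(p, r)
    return (s - q) % g == 0

# a[2*i] written, a[2*i + 1] read: gcd(2, 2) = 2 does not divide 1,
# so the references are provably independent.
print(gcd_test_may_depend(2, 0, 2, 1))  # False
# a[2*i] written, a[2*i + 4] read: a dependence is possible.
print(gcd_test_may_depend(2, 0, 2, 4))  # True
```

Tests like this are fast but conservative (ignoring loop bounds and direction); the paper's contribution is retaining much of the Omega Test's precision while staying within a polynomial-time subdomain.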


Integrated Communications, Navigation and Surveillance Conference | 2011

Decision support tools for the tower flight data manager system

Vineet Mehta; Mary Ellen Miller; Tom G. Reynolds; Mariya Ishutkina; Richard Jordan; Robert Seater; William Moser

The FAA, carrier airlines and passengers are all familiar with the inefficiencies and costs that result from delays in the air transportation system. A significant portion of the delays are associated with operations on or near the surface of major airports. There continues to be keen interest in improving efficiency of surface operations in order to reduce delay costs. The projected growth of air traffic demand is further fueling this interest. The Tower Flight Data Manager (TFDM) initiative by the FAA is aimed at providing tools in the air traffic control tower that would aid in improving operational efficiency on the surface. This initiative seeks to provide benefits through the consolidation of legacy automation systems/displays, the coherent fusion of information from multiple external systems, and the prediction and planning of flight operations on the surface. The ability to predict and plan operations on the surface is delivered to air traffic controllers through a set of decision support tools. These tools provide decision support in the following functional areas: airport configuration, runway assignment, taxi routing, sequencing & scheduling, departure metering, and departure routing. This paper is focused on describing an implementation of these tools in a prototype of the TFDM system. This prototype system is installed at Dallas/Fort Worth airport and is undergoing operational testing. The paper provides selected results from the application of the decision support tools, as well as discussion of future enhancements to the tools.


IEEE/AIAA Digital Avionics Systems Conference | 2011

The Tower Flight Data Manager prototype system

Vineet Mehta; Steven D. Campbell; James K. Kuchar; William Moser; Hayley J. Davison Reynolds; Tom G. Reynolds; Robert Seater

The Tower Flight Data Manager (TFDM) will serve as the next generation air traffic control tower automation platform for surface and local airspace operations. TFDM provides three primary enhancements over current systems: consolidation of diverse data and information sources into a single platform; electronic data exchange, including flight data entries, within and outside the tower cab; and a suite of decision support capabilities leveraging TFDM's access to external data sources and systems. This paper describes a TFDM prototype system that includes integrated surveillance, flight data, and decision support display components. Enhancements in airport configuration management, runway assignment, taxi routing, sequencing and scheduling, and departure route assurance are expected to yield significant benefits in delay reduction, fuel savings, additional capacity, improved access, enhanced safety, and reduced environmental impact. Data are provided on system performance and air traffic controller acceptance from simulation studies and a preliminary field demonstration at Dallas / Ft. Worth International Airport.


Global Humanitarian Technology Conference | 2015

Analysis of decision making skills for large scale disaster response

Charles E. Rose; Robert Seater; Adam S Norige

A large scale disaster such as the detonation of an improvised nuclear device (IND) in a U.S. city would pose significant response challenges for all levels of government, private organizations, and the general public. Public officials and emergency managers would face difficult and high-impact choices throughout the response effort, and they must prepare to make timely, critical decisions throughout the effort. Decision making preparation may involve more than technical training and resources. It may extend to emergency managers being cognitively and emotionally prepared for the situations they may face. This paper presents the first step toward the larger goal of developing alternative disaster preparedness training methods that teach effective decision making. The project team interviewed highly experienced disaster response professionals and analyzed decisions they emphasized as being both important and difficult during an IND response. The respondents also identified the critical skills needed to make those decisions effectively. This paper reports on the findings and analysis of specific decisions and skills required for an IND response.


Automated Software Engineering | 2003

Debugging overconstrained declarative models using unsatisfiable cores

Ilya Shlyakhter; Robert Seater; Daniel Jackson; Manu Sridharan; Mana Taghdiri


IEEE International Conference on Requirements Engineering | 2006

Requirement Progression in Problem Frames Applied to a Proton Therapy System

Robert Seater; Daniel Jackson

Collaboration

Top co-authors of Robert Seater (all at the Massachusetts Institute of Technology):

Daniel Jackson
Adam S Norige
Charles E. Rose
Greg Dennis
Lucy Mendel
Tom G. Reynolds
Vineet Mehta