Bruce G. Cameron
Massachusetts Institute of Technology
Publications
Featured research published by Bruce G. Cameron.
Journal of Spacecraft and Rockets | 2014
Daniel Selva; Bruce G. Cameron; Edward F. Crawley
This paper presents a methodology to explore the architectural trade space of Earth observing satellite systems, and applies it to the Earth Science Decadal Survey. The architecting problem is formulated as a combinatorial optimization problem with three sets of architectural decisions: instrument selection, assignment of instruments to satellites, and mission scheduling. A computational tool was created to automatically synthesize architectures based on valid combinations of options for these three decisions and evaluate them according to several figures of merit, including satisfaction of program requirements, data continuity, affordability, and proxies for fairness, technical, and programmatic risk. A population-based heuristic search algorithm is used to search the trade space. The novelty of the tool is that it uses a rule-based expert system to model the knowledge-intensive components of the problem, such as scientific requirements, and to capture the nonlinear positive and negative interactions bet...
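As an illustration of the kind of population-based heuristic search the abstract describes, the sketch below runs a minimal genetic-style search over a toy instrument-selection decision only; the instrument names, science values, costs, and scoring rule are invented for the example and are not taken from the paper or its rule-based expert system.

```python
import random

# Toy instrument-selection problem: pick a subset of candidate instruments.
# Names, science values, and costs are illustrative placeholders only.
INSTRUMENTS = ["radar_alt", "lidar", "microwave_rad", "ir_sounder", "gps_ro"]
SCIENCE_VALUE = [0.30, 0.25, 0.20, 0.15, 0.10]   # proxy for requirement satisfaction
COST = [450, 400, 250, 200, 100]                  # $M, notional
BUDGET = 800                                      # $M cap, notional

def merit(selection):
    """Figure of merit: science value of selected instruments, zero if over budget."""
    cost = sum(c for c, s in zip(COST, selection) if s)
    if cost > BUDGET:
        return 0.0
    return sum(v for v, s in zip(SCIENCE_VALUE, selection) if s)

def search(pop_size=30, generations=50, mutation_rate=0.1, seed=0):
    """Minimal population-based heuristic: tournament selection plus bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.random() < 0.5 for _ in INSTRUMENTS] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)               # tournament of two
            parent = a if merit(a) >= merit(b) else b
            child = [s ^ (rng.random() < mutation_rate) for s in parent]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=merit)

best = search()
print([n for n, s in zip(INSTRUMENTS, best) if s], round(merit(best), 2))
```

In the actual methodology the evaluation step is a rule-based expert system rather than this additive score, but the search loop plays the same role.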
IEEE Aerospace Conference | 2014
Marc Sanchez; Daniel Selva; Bruce G. Cameron; Edward F. Crawley; Antonios Seas; Bernie Seery
NASA is currently conducting an architecture study for the next-generation Space Communication and Navigation system. This is an extremely complex problem with a variety of options in terms of band selection (RF, from S-band to Ka-band and beyond, or optical), network type (bent-pipe, circuit-switched, or packet-switched), fractionation strategies (monolithic, mother-daughters, homogeneous fractionation), orbit and constellation design (GEO/MEO/LEO, number of planes, number of satellites per plane), and so forth. When all the combinations are considered, the size of the tradespace grows to several million architectures. The ability of these architectures to meet the requirements from different user communities and other stakeholders (e.g., regulators, international partners) needs to be assessed. In this context, a computational tool was developed to enable the exploration of such a large space of architectures in terms of both performance and cost. A preliminary version of this tool was presented in a paper last year. This paper describes an updated version of the tool featuring a higher-fidelity, rule-based scheduling algorithm, as well as several modifications in the architecture enumeration and cost models. It also discusses the validation results for the tool using real TDRSS data, as well as the results and sensitivity analyses for several forward-looking scenarios. Particular emphasis is placed on families of architectures that are of interest to NASA, namely TDRSS-like architectures, architectures based on hosted payloads, and highly distributed architectures.
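To make the combinatorial growth concrete, the snippet below enumerates a full-factorial tradespace from a hand-picked set of decisions and options; the decision names and option lists are simplified stand-ins, not the enumeration actually used in the tool, and even this toy version shows how each added decision multiplies the count toward the millions cited above.

```python
from itertools import product

# Illustrative architectural decisions and options (simplified stand-ins).
DECISIONS = {
    "band":            ["S", "X", "Ku", "Ka", "optical"],
    "network_type":    ["bent-pipe", "circuit-switched", "packet-switched"],
    "fractionation":   ["monolithic", "mother-daughters", "homogeneous"],
    "orbit":           ["GEO", "MEO", "LEO"],
    "num_planes":      [1, 2, 3, 4],
    "sats_per_plane":  [1, 2, 3, 4, 6, 8],
}

names = list(DECISIONS)
options = [DECISIONS[n] for n in names]

# Full-factorial size grows multiplicatively with each decision added.
size = 1
for opts in options:
    size *= len(opts)
print(f"{size} architectures in this toy tradespace")

# Architectures can be generated lazily rather than stored in memory.
architectures = (dict(zip(names, combo)) for combo in product(*options))
print(next(architectures))
```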
IEEE Aerospace Conference | 2013
Marc Sanchez; Daniel Selva; Bruce G. Cameron; Edward F. Crawley; Antonios Seas; Bernie Seery
NASA's Space Communication and Navigation (SCaN) Program is responsible for providing communication and navigation services to space missions and other users in and beyond low Earth orbit. The current SCaN architecture consists of three independent networks: the Space Network (SN), which contains the TDRS relay satellites in GEO; the Near Earth Network (NEN), which consists of several NASA-owned and commercially operated ground stations; and the Deep Space Network (DSN), with three ground stations in Goldstone, Madrid, and Canberra. The first task of this study is the stakeholder analysis. The goal of the stakeholder analysis is to identify the main stakeholders of the SCaN system and their needs. Twenty-one main groups of stakeholders have been identified and put on a stakeholder map. Their needs are currently being elicited by means of interviews and an extensive literature review. The data will then be analyzed by applying Cameron and Crawley's stakeholder analysis theory, with a view to highlighting dominant needs and conflicting needs. The second task of this study is the architectural tradespace exploration of the next-generation TDRSS. The space of possible architectures for SCaN is represented by a set of architectural decisions, each of which has a discrete set of options. A computational tool is used to automatically synthesize a very large number of possible architectures by enumerating different combinations of decisions and options. The same tool contains models to evaluate the architectures in terms of performance and cost. The performance model uses the stakeholder needs and requirements identified in the previous steps as inputs, and it is based on the VASSAR methodology presented in a companion paper. This paper summarizes the current status of the MIT SCaN architecture study. It starts by motivating the need to perform tradespace exploration studies in the context of relay data systems through a description of the history of NASA's space communication networks. It then presents the generalities of possible architectures for future space communication and navigation networks. Finally, it describes the tools and methods being developed, clearly indicating the architectural decisions that have been taken into account as well as the systematic approach followed to model them. The purpose of this study is to explore the SCaN architectural tradespace by means of a computational tool. This paper describes the tool, while the tradespace exploration is underway.
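As a sketch of the prioritization step that follows the stakeholder analysis, the code below weights notional needs by an assumed stakeholder importance score and flags needs cited by more than one stakeholder; all stakeholder groups, needs, and weights are invented for illustration and are not the twenty-one groups identified in the study.

```python
# Toy stakeholder-needs prioritization; every name and weight is illustrative.
stakeholders = {
    # stakeholder group: (assumed importance to the program, cited needs)
    "science_missions":       (0.9, ["high data volume", "low latency"]),
    "human_spaceflight":      (0.8, ["low latency", "high availability"]),
    "international_partners": (0.5, ["cross-support standards"]),
    "regulators":             (0.4, ["spectrum compliance"]),
}

# Aggregate need priority as the sum of the importances of stakeholders citing it.
priority, cited_by = {}, {}
for name, (importance, needs) in stakeholders.items():
    for need in needs:
        priority[need] = priority.get(need, 0.0) + importance
        cited_by.setdefault(need, []).append(name)

for need, score in sorted(priority.items(), key=lambda kv: -kv[1]):
    shared = " (shared need)" if len(cited_by[need]) > 1 else ""
    print(f"{score:.1f}  {need}{shared}")
```

Shared needs are the places where dominant or conflicting requirements would be examined further.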
Systems Engineering | 2013
Ryan C. Boas; Bruce G. Cameron; Edward F. Crawley
Commonality, or the reuse and sharing of components, manufacturing processes, architectures, interfaces, and infrastructure across the members of a product family, is an often leveraged strategy targeted at improving corporate profitability. Commonality strategies are widespread in the literature and in industrial practice, but a clear gap exists: The literature has a distinctly positive bias towards the benefits of commonality, whereas industrial success with commonality has been mixed. This article explores two phenomena, divergence and lifecycle offsets, that may prevent companies from properly assessing and realizing the potential benefits of commonality. Using a multiple case study approach, we trace commonality levels through the lifecycles of seven complex product families that span the aerospace, automotive, semiconductor capital equipment, and printing industries. The case studies indicate that commonality tends to decline over time, a phenomenon we title divergence. In contrast to the prevailing concept of parallel development in product families, we find that lifecycle offsets, or temporal separations between the development, manufacturing, operations, and/or retirement phases of two or more products, are prevalent in industrial practice. Through this exploratory study, we find that lifecycle offsets may reduce the potential benefits of commonality, make the realization of benefits much more difficult, delay the realization of benefits, and reallocate potential benefits across individual products. We predict that lifecycle offsets exacerbate divergence. We propose a framework for categorizing parts-level changes that explicitly recognizes the potential for divergence. We conclude with guidance for product family managers, namely, that commonality be managed dynamically throughout the product family lifecycle, rather than as a static property. Additionally, we articulate the need to make commonality decisions from a product family perspective, a perspective that may lead to decisions that create near-term costs for one variant but result in larger long-term savings for a second variant and for the product family as a whole.
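A minimal way to see divergence in data is to track a commonality index over the family's lifecycle; the sketch below computes the fraction of parts shared by every variant at each snapshot year, using entirely made-up part lists rather than data from the seven case studies.

```python
# Toy commonality index: parts shared by every variant over total unique parts.
# Part identifiers and the yearly snapshots are fabricated for illustration.
snapshots = {
    2005: {"variant_A": {"p1", "p2", "p3", "p4"},
           "variant_B": {"p1", "p2", "p3", "p5"}},
    2010: {"variant_A": {"p1", "p2", "p6", "p7"},
           "variant_B": {"p1", "p8", "p9", "p5"}},
}

def commonality_index(variants):
    """Shared parts divided by total unique parts across the family."""
    shared = set.intersection(*variants.values())
    union = set.union(*variants.values())
    return len(shared) / len(union)

for year in sorted(snapshots):
    print(year, round(commonality_index(snapshots[year]), 2))
# A declining index over successive snapshots is the divergence the paper describes.
```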
Engineering Management Journal | 2011
Bruce G. Cameron; Edward F. Crawley; Wen Feng; Maokai Lin
The authors develop a methodology to compare the firm's knowledge of its networked environment with the priority of its needs. This enables managers to make explicit decisions about the firm's needs, and the feasibility of satisfying them with the firm's production function. A scalable mathematical model of indirect transactions has been built on a foundation of a theory of generalized exchange. Using a multinational resource extraction project as an example, we show how this methodology can be used to prioritize the outputs of the firm, relative to its needs and its strategic environment. This work enables managers to perform an analysis of the firm's stakeholders to determine the relative priority of stakeholder needs, according to the stakeholders' importance to the firm, prior to the translation of needs into requirements.
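One way to picture the indirect-transaction idea is to score value paths that loop back to the firm through intermediate stakeholders; the sketch below multiplies edge weights along two-hop paths in an invented stakeholder network, and is only a cartoon of the generalized-exchange model rather than the paper's formulation.

```python
# Toy stakeholder value network: edge weight = strength of value flow (0..1).
# All actors and weights are invented for illustration.
flows = {
    ("firm", "government"):      0.7,   # taxes, local employment
    ("government", "firm"):      0.6,   # permits, political support
    ("firm", "community"):       0.5,   # jobs, infrastructure
    ("community", "government"): 0.8,   # votes, social license
}

def two_hop_paths_to(target, flows):
    """Score indirect (two-hop) value paths ending at `target` by multiplying weights."""
    scored = []
    for (a, b), w1 in flows.items():
        for (c, d), w2 in flows.items():
            if b == c and d == target and a != target:
                scored.append(((a, b, d), w1 * w2))
    return sorted(scored, key=lambda kv: -kv[1])

for path, score in two_hop_paths_to("firm", flows):
    print(" -> ".join(path), round(score, 2))
```

Ranking these indirect returns is one simple proxy for deciding which of the firm's outputs matter most to satisfying its own needs.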
Systems Engineering | 2015
Peter F. Davison; Bruce G. Cameron; Edward F. Crawley
Many systems undergo significant architecture-level change throughout their lifecycles in order to adapt to new operating and funding contexts, to react to failed technology development, or to incorporate new technologies. In all cases early architecture selection and technology investment decisions will constrain the system to certain regions of the tradespace, which can limit the evolvability of the system and its robustness to exogenous changes. In this paper we present a method for charting development pathways within a tradespace of potential architectures, with a view to enabling robustness to technology portfolio realization and later architectural changes. The tradespace is first transformed into a weighted, directed graph of architecture nodes with connectivity determined by relationships between technology portfolios and functional architecture. The tradespace exploration problem is then restated as a shortest path problem through this graph. This method is applied to the tradespace of in-space transportation architectures for missions to Mars, finding that knowledge of pathways through the tradespace can identify negative coupling between functional architectures and particular technologies, as well as identify ways to prioritize future technology investments.
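The restatement as a shortest-path problem can be illustrated with a standard Dijkstra search over a small, made-up graph of architecture nodes; the node names and transition costs are placeholders, and in the actual method the edge weights would come from the technology-portfolio and switching-cost relationships described above.

```python
import heapq

# Toy directed graph of architectures; edge weight = notional cost of evolving
# from one architecture to the next (all values invented for illustration).
graph = {
    "chemical_baseline":   {"chemical_aggregated": 2.0, "nuclear_thermal": 5.0},
    "chemical_aggregated": {"nuclear_thermal": 2.5, "electric_hybrid": 4.0},
    "nuclear_thermal":     {"electric_hybrid": 1.5},
    "electric_hybrid":     {},
}

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm returning (total cost, node sequence)."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nxt, w in graph[node].items():
            if nxt not in visited:
                heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

print(shortest_path(graph, "chemical_baseline", "electric_hybrid"))
```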
Journal of Spacecraft and Rockets | 2014
Jonathan Battat; Bruce G. Cameron; Alexander Rudat; Edward F. Crawley
Sponsored by the United States National Aeronautics and Space Administration (Massachusetts Institute of Technology Research Grant).
IEEE/OSA Journal of Optical Communications and Networking | 2016
Marc Sanchez Net; Iñigo del Portillo; Edward F. Crawley; Bruce G. Cameron
Optical communications are a key technology enabler to return increasing amounts of data from space exploration platforms such as robotic spacecraft in Earth orbit or across the solar system. However, several challenges have hindered the deployment and utilization of this technology in an operational context, most notably its sensitivity to atmospheric impairments such as cloud coverage. To mitigate this problem, building a network of interconnected and geographically dispersed ground stations has been proposed as a possible solution to ensure that, at any point in time, at least one space-to-ground optical link is available to contact the space-based platforms. In this paper, we present a new approach for quantifying the availability of an optical ground network that is both computationally inexpensive and suitable for high-level architectural concept studies. Based on the cloud fraction data set, several approximation methods are used to estimate the probability of having a certain number of space-to-ground links fail due to cloud coverage. They are developed in order to capture increasingly complex atmospheric factors, from sites with independent weather conditions to stations that are both temporally and spatially correlated. Finally, the proposed approximation methods are benchmarked, and recommendations on how to utilize and implement them are summarized.
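For the simplest of the approximations, independent weather at each site, the probability that every ground station is simultaneously clouded out is just the product of the per-site cloud fractions; the snippet below computes that and the resulting network availability for a made-up set of sites.

```python
from math import prod

# Notional annual cloud fractions per candidate ground site (invented values).
cloud_fraction = {
    "site_desert":  0.15,
    "site_plateau": 0.30,
    "site_island":  0.45,
}

# Independence assumption: all sites are cloudy simultaneously with probability
# equal to the product of the individual cloud fractions.
p_all_blocked = prod(cloud_fraction.values())
availability = 1.0 - p_all_blocked
print(f"P(no optical link available) = {p_all_blocked:.4f}")
print(f"Network availability         = {availability:.4f}")
```

The temporally and spatially correlated cases the paper also treats would require the joint cloud statistics rather than this simple product.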
Systems Engineering | 2015
Morgan Dwyer; Bruce G. Cameron; Zoe Szajnfarber
The government acquisition system is consistently plagued by cost growth and by attempts at acquisition reform. Despite these persistent challenges, the academic community lacks a methodology for studying complex acquisition programs both in depth and longitudinally throughout their life cycles. In this paper, we present a framework for studying complex acquisition programs that provides researchers with a strategy for systematically studying cost growth mechanisms. The proposed framework provides a means to identify specific technical and organizational mechanisms for cost growth, to organize those mechanisms using design structure matrices, and to observe the evolution of those mechanisms throughout a program's life cycle. To illustrate the utility of our framework, we apply it to analyze a case study of the National Polar-orbiting Operational Environmental Satellite System (NPOESS) program. Ultimately, we demonstrate that the framework enables us to generate unique insights into the mechanisms that induced cost growth on NPOESS and that were unacknowledged by previous studies. Specifically, we observed that complexity was injected into the technical system well before the program's cost estimates began to increase and that it was the complexity of the NPOESS organization that hindered the program's ability to effectively estimate and to manage its costs.
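As a rough illustration of the design-structure-matrix bookkeeping described above, the sketch below stores pairwise interactions between invented cost-growth mechanisms in a small binary DSM and summarizes how coupled each one is; the mechanisms and links are placeholders, not findings from the NPOESS case.

```python
# Toy design structure matrix (DSM) for cost-growth mechanisms.
# Mechanism names and interactions are illustrative placeholders.
mechanisms = ["requirement creep", "sensor redesign", "schedule slip",
              "workforce turnover", "budget re-baseline"]

# dsm[i][j] = 1 means mechanism i drives or aggravates mechanism j.
dsm = [
    [0, 1, 1, 0, 1],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1],
    [0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0],
]

# Coupling summary: how many interactions each mechanism has in each direction.
for i, name in enumerate(mechanisms):
    outgoing = sum(dsm[i])
    incoming = sum(row[i] for row in dsm)
    print(f"{name:20s} drives {outgoing}, driven by {incoming}")
```

Tracking how the nonzero entries accumulate over successive program phases is the longitudinal view the framework is meant to support.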
Journal of Aerospace Information Systems | 2015
Marc Sanchez Net; Iñigo del Portillo; Bruce G. Cameron; Edward F. Crawley; Daniel Selva
Methods to design space communication networks at the link level are well understood and abound in the literature. Nevertheless, models that analyze the performance and cost of the entire network are scarce, and they typically rely on computationally expensive simulations that can only be applied to specific network designs. This paper presents an architectural model to quantitatively optimize space communication networks given future customer demands, communication technology, and contract modalities to deploy the network. The model is implemented and validated against NASA’s Tracking and Data Relay Satellite System. It is then used to evaluate new architectures for the fourth-generation Tracking and Data Relay Satellite System given the capabilities of new optical and Ka-band technologies, as well as the possibility to deploy network assets as hosted payloads. Results indicate that optical technology can provide a significant improvement in the network capabilities and lifecycle cost, especially when pl...
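The performance-versus-lifecycle-cost comparison at the heart of such a study typically reduces to identifying the non-dominated architectures; the sketch below performs that Pareto filtering on a handful of made-up (performance, cost) points, which stand in for the model's outputs rather than reproducing them.

```python
# Toy (performance, lifecycle cost) evaluations; higher performance and lower
# cost are better. All numbers are fabricated stand-ins for model outputs.
architectures = {
    "Ka_only_GEO":     (0.70, 3.2),   # (benefit score, cost in $B)
    "optical_plus_Ka": (0.92, 3.6),
    "hosted_payloads": (0.55, 2.1),
    "optical_only":    (0.85, 3.9),
}

def pareto_front(points):
    """Keep architectures not dominated in both performance and cost."""
    front = {}
    for name, (perf, cost) in points.items():
        dominated = any(p >= perf and c <= cost and (p, c) != (perf, cost)
                        for p, c in points.values())
        if not dominated:
            front[name] = (perf, cost)
    return front

print(pareto_front(architectures))
```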