Brent E. Tweddle
Massachusetts Institute of Technology
Publications
Featured research published by Brent E. Tweddle.
Journal of Guidance, Control, and Dynamics | 2011
Brent E. Tweddle; Alvar Saenz-Otero
When a large spacecraft fails on orbit, the diagnostic and debugging process is challenging, time consuming, costly, and possibly dangerous. The ability to visually inspect a large spacecraft can be helpful because it provides a reliable source of independent data on the large spacecraft’s current state. Recent proposals by Henshaw et al. [1] have discussed launching large spacecraft with a small inspector satellite that is attached to the host. This inspector satellite could be deployed if an on-orbit anomaly were to occur. Henshaw et al. argued that, for this approach to be successful, the inspector spacecraft must minimize the requirements that are levied on the larger host spacecraft. Their approach implies that neither communications nor power systems should be shared with the host, and that a passive vision-based navigation system should be used with a single, small fiducial marker. It was also noted that it is important for the inspector to minimize the risk of colliding with the host spacecraft and causing further damage. This leads to an important design question: how should the relative state of the inspector be estimated in an accurate and reliable fashion? This Note presents a design for a relative state estimator based on a small fiducial target that is observed by a single monochrome camera. This design solves the exterior orientation problem [2] for four point features using a globally convergent nonlinear iterative algorithm. Note that four coplanar points are the minimum number of points required to solve for the relative pose without ambiguity, assuming perspective projection [3]. The relative pose is then filtered with the dynamics to estimate the 12-degree-of-freedom relative state (i.e., three-dimensional position, orientation, and linear and angular velocity). This algorithm was tested on the Synchronized Position, Hold, Engage, and Reorient Experimental Satellites (SPHERES) system [4] with a computer vision upgrade [5], and a summary of these results is presented in this Note. This Note is an expanded version of earlier publications by the first author [6,7].
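The exterior orientation step described above (relative pose from four coplanar fiducial points seen by a single camera) can be illustrated with a short sketch. The Note uses a globally convergent nonlinear iterative solver; the stand-in below uses OpenCV's planar PnP solver (IPPE) instead, and the marker size, camera intrinsics, and pixel coordinates are assumed values for illustration, not data from the publication.

```python
# Sketch: recovering relative pose from four coplanar fiducial points.
# OpenCV's IPPE solver stands in for the globally convergent iterative
# algorithm described in the Note; all numeric values below are assumptions.
import numpy as np
import cv2

# Four coplanar fiducial corners in the target frame (metres), on the z = 0 plane.
s = 0.05  # assumed half-width of the marker
object_points = np.array([[-s, -s, 0.0],
                          [ s, -s, 0.0],
                          [ s,  s, 0.0],
                          [-s,  s, 0.0]], dtype=np.float64)

# Corresponding pixel measurements from the monochrome camera (assumed).
image_points = np.array([[312.4, 245.1],
                         [402.7, 248.9],
                         [399.2, 337.5],
                         [309.8, 333.0]], dtype=np.float64)

# Assumed pinhole intrinsics with no lens distortion.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# IPPE is intended for planar (coplanar) point sets such as this marker.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist,
                              flags=cv2.SOLVEPNP_IPPE)
if ok:
    R, _ = cv2.Rodrigues(rvec)   # rotation from target frame to camera frame
    print("relative position (camera frame):", tvec.ravel())
```

The resulting pose measurement would then be passed to the dynamics filter to estimate the full 12-degree-of-freedom relative state.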
International Conference on Robotics and Automation | 2011
Brian C. Williams; Nicolas Hudson; Brent E. Tweddle; Roland Brockers; Larry H. Matthies
A Feature and Pose Constrained Extended Kalman Filter (FPC-EKF) is developed for highly dynamic, computationally constrained micro aerial vehicles. Vehicle localization is achieved using only a low-performance inertial measurement unit and a single camera. The FPC-EKF framework augments the vehicle's state with both previous vehicle poses and critical environmental features, including vertical edges. This filter framework efficiently incorporates measurements from hundreds of opportunistic visual features to constrain the motion estimate, while allowing navigation and sustained tracking with respect to a few persistent features. In addition, vertical features in the environment are opportunistically used to provide global attitude references. Accurate pose estimation is demonstrated on a sequence including fast traversal, where visual features enter and exit the field of view quickly, as well as hover and ingress maneuvers, where drift-free navigation is achieved with respect to the environment.
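As a rough illustration of the pose-augmentation idea behind a feature- and pose-constrained EKF, the sketch below clones a past pose into the filter state so that a relative measurement between the current and cloned pose can constrain the motion estimate. It is a deliberately simplified one-dimensional, linear stand-in; the actual FPC-EKF operates on full 6-DOF poses, hundreds of opportunistic features, and vertical-edge attitude references.

```python
# Minimal 1-D sketch of state augmentation ("pose cloning") in an EKF.
# All models and noise values are illustrative assumptions.
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity model: [pos, vel]
Q = np.diag([1e-4, 1e-3])                  # assumed process noise

x = np.zeros(2)                            # current [position, velocity]
P = np.eye(2) * 0.1

def clone_pose(x, P):
    """Augment the state with a copy of the current position (a 'pose clone')."""
    x_aug = np.concatenate([x, [x[0]]])
    J = np.vstack([np.eye(2), [[1.0, 0.0]]])     # clone Jacobian
    return x_aug, J @ P @ J.T

def propagate(x_aug, P_aug):
    """Propagate only the live state; cloned poses stay fixed."""
    n = x_aug.size
    Phi = np.eye(n)
    Phi[:2, :2] = F
    Qa = np.zeros((n, n)); Qa[:2, :2] = Q
    return Phi @ x_aug, Phi @ P_aug @ Phi.T + Qa

def update_relative(x_aug, P_aug, z, r=1e-2):
    """Feature-derived relative measurement: current position minus cloned one."""
    n = x_aug.size
    H = np.zeros((1, n)); H[0, 0] = 1.0; H[0, 2] = -1.0
    S = H @ P_aug @ H.T + r
    K = P_aug @ H.T / S
    x_new = x_aug + (K * (z - H @ x_aug)).ravel()
    P_new = (np.eye(n) - K @ H) @ P_aug
    return x_new, P_new

x_aug, P_aug = clone_pose(x, P)
x_aug, P_aug = propagate(x_aug, P_aug)
x_aug, P_aug = update_relative(x_aug, P_aug, z=0.05)
```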
Journal of Field Robotics | 2015
Brent E. Tweddle; Alvar Saenz-Otero; John J. Leonard; David W. Miller
This paper presents a new approach for solving the simultaneous localization and mapping problem for inspecting an unknown and uncooperative object that is spinning about an arbitrary axis in space. The approach probabilistically models the six-degree-of-freedom rigid-body dynamics in a factor graph formulation. Using the incremental smoothing and mapping system, the method estimates a feature-based map of the target object, as well as the object's position, orientation, linear velocity, angular velocity, center of mass, principal axes, and ratios of inertia. This solves an important problem for spacecraft proximity operations. Additionally, it provides a generic framework for incorporating rigid-body dynamics that may be applied to a number of other terrestrial applications. To evaluate the approach, the Synchronized Position, Hold, Engage, and Reorient Experimental Satellites (SPHERES) were used as a testbed within the microgravity environment of the International Space Station. The SPHERES satellites, using body-mounted stereo cameras, captured a dataset of a target object that was spinning at ten rotations per minute about its unstable, intermediate axis. This dataset was used to experimentally evaluate the approach described in this paper, and the results showed that it was able to estimate a geometric map and the position, orientation, linear and angular velocities, center of mass, and ratios of inertia of the target object.
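The paper's factor-graph formulation builds on incremental smoothing and mapping (iSAM). The sketch below shows only the incremental-smoothing skeleton using GTSAM's Python bindings; the custom rigid-body dynamics and feature factors described in the paper are not part of stock GTSAM, so generic relative-pose factors stand in for them, and all keys, poses, and noise values are illustrative assumptions.

```python
# Sketch of an incremental smoothing (iSAM2) skeleton with GTSAM.
# Between factors stand in for the paper's dynamics/feature factors.
import numpy as np
import gtsam

isam = gtsam.ISAM2()
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-3))
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-2))

graph = gtsam.NonlinearFactorGraph()
initial = gtsam.Values()

x0 = gtsam.symbol('x', 0)
graph.add(gtsam.PriorFactorPose3(x0, gtsam.Pose3(), prior_noise))
initial.insert(x0, gtsam.Pose3())
isam.update(graph, initial)

# At each new stereo frame, add a factor linking the previous and current
# target pose (in the full system this would be a dynamics/feature factor).
for k in range(1, 5):
    graph = gtsam.NonlinearFactorGraph()
    initial = gtsam.Values()
    xk_prev, xk = gtsam.symbol('x', k - 1), gtsam.symbol('x', k)
    delta = gtsam.Pose3(gtsam.Rot3.Yaw(0.1), gtsam.Point3(0.01, 0.0, 0.0))
    graph.add(gtsam.BetweenFactorPose3(xk_prev, xk, delta, odom_noise))
    initial.insert(xk, isam.calculateEstimate().atPose3(xk_prev).compose(delta))
    isam.update(graph, initial)

print(isam.calculateEstimate().atPose3(gtsam.symbol('x', 4)))
```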
Journal of Spacecraft and Rockets | 2014
Dehann Fourie; Brent E. Tweddle; Steve Ulrich; Alvar Saenz-Otero
This paper describes a vision-based relative navigation and control strategy for inspecting an unknown, noncooperative, and possibly spinning object in space using a visual-inertial system that is designed to minimize the computational requirements while maintaining a safe relative distance. The proposed spacecraft inspection system relies solely on a calibrated stereo camera and a three-axis gyroscope to maintain a safe inspection distance while following a circular trajectory around the object. The navigation system is based on image processing algorithms, which extract the relative position and velocity between the inspector and the object, and a simple control approach is used to ensure that the desired range and bearing are maintained throughout the inspection maneuver. The hardware implementation details of the system are provided. Computer simulation results and experiments conducted aboard the International Space Station during Expedition 34 are reported to demonstrate the performance and applicability of the proposed system.
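The abstract does not give the control law, so the sketch below shows one simple way such a range-and-bearing hold could look: a PD-style radial channel that regulates the standoff distance plus a tangential channel that drives a slow circumnavigation. All gains, setpoints, and frame conventions are assumptions for illustration, not the controller flown on orbit.

```python
# Sketch of a simple range/bearing hold with a tangential term that drives a
# slow circling inspection trajectory. All numeric values are assumptions.
import numpy as np

def inspection_control(rel_pos, rel_vel, r_des=1.0, w_des=0.05, kp=0.4, kd=0.8):
    """Return the commanded specific force on the inspector.

    rel_pos, rel_vel: target position/velocity relative to the inspector (m, m/s).
    Thrusting along +f moves the inspector toward the target, so the relative
    state responds with rel_vel_dot ~= -f.
    """
    rng = np.linalg.norm(rel_pos)
    u_hat = rel_pos / rng                      # line-of-sight unit vector
    range_rate = float(u_hat @ rel_vel)

    # Radial channel: PD regulation of the standoff distance r_des.
    f_radial = (kp * (rng - r_des) + kd * range_rate) * u_hat

    # Tangential channel: drive the tangential relative velocity toward the
    # value needed for a circumnavigation at w_des rad/s (direction assumed).
    t_hat = np.cross(np.array([0.0, 0.0, 1.0]), u_hat)
    t_hat /= np.linalg.norm(t_hat)
    v_tan = rel_vel - range_rate * u_hat
    f_tangential = kd * (v_tan - w_des * rng * t_hat)

    return f_radial + f_tangential

u = inspection_control(np.array([1.2, 0.1, 0.0]), np.array([0.0, 0.02, 0.0]))
```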
AIAA Guidance, Navigation, and Control Conference | 2012
Michael O'Connor; Brent E. Tweddle; Jacob G. Katz; Alvar Saenz-Otero; David W. Miller; Konrad Makowka
Large spacecraft represent a significant investment of time and money, and are often risky ventures. The ability to visually inspect a craft provides operators and engineers with valuable information about on-orbit failures or salvageability. To minimize the risk of damage to the large craft, initial inspection maneuvers should be designed to maintain a safe observation distance. This paper discusses the design of such an inspection maneuver using range measurements that are directly applicable to stereo vision and are supplemented with gyroscope data to determine distance and rates in a relative frame. Using the SPHERES satellite testbed, the ultrasonic measurement system is used in place of vision hardware to test the vision-navigation algorithm in a microgravity environment. The paper discusses the error characteristics of the system and develops the control framework for the inspection maneuver. The implementation of the algorithm in simulation is compared with test results from the SPHERES satellites onboard the International Space Station.
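Since the maneuver is designed around range measurements that are directly applicable to stereo vision, the standard disparity-to-depth relation Z = f * B / d indicates where such range measurements come from. The focal length, baseline, and pixel values below are assumptions for illustration, not the actual flight camera calibration.

```python
# Sketch: range from stereo disparity using the pinhole relation Z = f * B / d.
# Focal length, baseline, and pixel coordinates are illustrative assumptions.
import numpy as np

f_px = 600.0        # focal length in pixels (assumed)
baseline_m = 0.09   # stereo baseline in metres (assumed)

def stereo_range(u_left, u_right):
    """Depth in metres from the horizontal pixel coordinates of a feature
    matched between the rectified left and right images."""
    disparity = u_left - u_right
    if disparity <= 0.0:
        raise ValueError("non-positive disparity: feature at or beyond infinity")
    return f_px * baseline_m / disparity

# A 12-pixel disparity corresponds to a range of 4.5 m with these values.
print(stereo_range(331.0, 319.0))
```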
AIAA Guidance, Navigation, and Control (GNC) Conference | 2013
Dehann Fourie; Brent E. Tweddle; Steve Ulrich; Alvar Saenz-Otero
This paper describes a vision-based relative navigation and control strategy for inspecting an unknown and non-cooperative object in space using a visual-inertial system. The proposed inspection spacecraft system relies solely on a calibrated stereo camera and a three-axis gyroscope to maintain a safe inspection distance while following a circular trajectory around the object. The navigation system is based on image processing algorithms, which extract the relative position and velocity between the inspector and the object, and a simple control approach is used to ensure that the desired range and bearing are maintained throughout the inspection maneuver. Hardware implementation details of the VERTIGO Goggles system are provided, and design considerations related to the addition of the stereo camera system and associated processor unit to the SPHERES satellites are discussed. Computer simulation results and experiments conducted aboard the International Space Station during Expedition 34 are reported to demonstrate the performance and applicability of the proposed hardware and related navigation and control systems for inspecting an unknown spacecraft.
Archive | 2010
Enrico Stoll; Alvar Saenz-Otero; Brent E. Tweddle
Most malfunctioning spacecraft require only a minor maintenance operation, but have to be retired due to the lack of so-called On-Orbit Servicing (OOS) opportunities. There is no maintenance and repair infrastructure for space systems. Occasionally, Space Shuttle-based servicing missions were launched, but no routine procedures are foreseen for individual spacecraft. The unmanned approach is to utilize the capabilities of robots to dock a servicer spacecraft onto a malfunctioning target spacecraft and execute complex OOS operations, controlled from the ground. Most OOS demonstration missions aim at equipping the servicing spacecraft with a high degree of autonomy. However, not all spacecraft can be serviced autonomously. Equipping the human operator on the ground with the possibility of instantaneous interaction with the servicer satellite is a very beneficial capability that complements autonomous operations. This work focuses on such teleoperated space systems, with a strong emphasis on multimodal feedback: human-spacecraft interaction is considered that utilizes multiple human senses through which the operator can receive output from a technical device. This work proposes a new concept for free-flyer control and presents the development of a corresponding test environment.
AIAA SPACE 2009 Conference & Exposition | 2009
Brent E. Tweddle; Alvar Saenz-Otero; David W. Miller
Journal of Field Robotics | 2016
Brent E. Tweddle; Timothy P. Setterfield; Alvar Saenz-Otero; David W. Miller
Intelligent Robots and Systems | 2014
Brent E. Tweddle; Timothy P. Setterfield; Alvar Saenz-Otero; David W. Miller; John J. Leonard