Publication


Featured research published by Bob Kanefsky.


Archive | 1996

Super-Resolved Surface Reconstruction from Multiple Images

Peter Cheeseman; Bob Kanefsky; Richard Kraft; John Stutz; Robin Hanson

This paper describes a Bayesian method for constructing a super-resolved surface model by combining information from a set of images of the given surface. We develop the theory and algorithms in detail for the 2-D reconstruction problem, appropriate for the case where all images are taken from roughly the same direction and under similar lighting conditions. We show the results of this 2-D reconstruction on Viking Martian data. These results show dramatic improvements in both spatial and gray-scale resolution. The Bayesian approach uses a neighbor correlation model as well as pixel data from the image set. Some extensions of this method are discussed, including 3-D surface reconstruction and the resolution of diffraction blurred images.
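The published method is a full Bayesian estimator with a neighbor-correlation prior. As a rough, hypothetical illustration of the underlying idea of combining several sub-pixel-shifted images onto a finer grid (not the authors' code; all names and parameters below are assumptions), a minimal NumPy sketch:

```python
import numpy as np

def super_resolve(images, shifts, factor=2):
    """Naive shift-and-add sketch: accumulate each low-resolution frame
    onto a finer grid at its known sub-pixel offset, then average.
    This is NOT the paper's Bayesian estimator; it only illustrates why
    multiple offset views of the same scene carry sub-pixel information."""
    h, w = images[0].shape
    acc = np.zeros((h * factor, w * factor))
    cnt = np.zeros_like(acc)

    for img, (dy, dx) in zip(images, shifts):
        # Fine-grid coordinates of every low-res sample in this frame.
        ys = np.round((np.arange(h)[:, None] + dy) * factor).astype(int)
        xs = np.round((np.arange(w)[None, :] + dx) * factor).astype(int)
        yy = np.broadcast_to(ys, (h, w)).ravel()
        xx = np.broadcast_to(xs, (h, w)).ravel()
        keep = (yy >= 0) & (yy < h * factor) & (xx >= 0) & (xx < w * factor)
        np.add.at(acc, (yy[keep], xx[keep]), img.ravel()[keep])
        np.add.at(cnt, (yy[keep], xx[keep]), 1.0)

    return acc / np.maximum(cnt, 1.0)  # fine-grid cells never hit stay 0
```

A true Bayesian reconstruction would replace the plain average with a MAP estimate under an image-formation model and the neighbor-correlation prior described in the abstract.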


IEEE Intelligent Systems | 2004

MAPGEN: mixed-initiative planning and scheduling for the Mars Exploration Rover mission

Mitchell Ai-Chang; John L. Bresina; Leonard Charest; Adam Chase; Jennifer Hsu; Ari K. Jónsson; Bob Kanefsky; Paul H. Morris; Kanna Rajan; Jeffrey Yglesias; Brian G. Chafin; William C. Dias; Pierre Maldague

The Mars Exploration Rover mission is one of NASA's most ambitious science missions to date. Launched in the summer of 2003, each rover carries instruments for conducting remote and in situ observations to elucidate the planet's past climate, water activity, and habitability. Science is MER's primary driver, so making the best use of the scientific instruments, within the available resources, is a crucial aspect of the mission. To address this criticality, the MER project team selected MAPGEN (mixed-initiative activity plan generator) as an activity-planning tool. MAPGEN combines two existing systems, each with a strong heritage: the APGEN activity-planning tool from the Jet Propulsion Laboratory and the Europa planning and scheduling system from NASA Ames Research Center. We discuss the issues arising from combining these tools in this mission's context. MAPGEN is the first AI-based system to control a space platform on another planet's surface.
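MAPGEN's real engine is a constraint-based temporal planner built on APGEN and Europa; the toy Python sketch below only conveys the flavor of turning prioritized science requests into a resource-bounded activity plan. Activity names, durations, and budgets are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    duration: int      # minutes
    energy: float      # watt-hours consumed
    priority: int      # lower value = more important

def schedule(activities, sol_minutes=600, energy_budget=300.0):
    """Toy priority-driven scheduler: place activities back-to-back while
    time and energy remain. Illustrative only; MAPGEN's planner reasons
    about temporal constraints rather than using this greedy loop."""
    plan, t, energy = [], 0, 0.0
    for act in sorted(activities, key=lambda a: a.priority):
        if t + act.duration <= sol_minutes and energy + act.energy <= energy_budget:
            plan.append((t, act.name))
            t += act.duration
            energy += act.energy
    return plan

if __name__ == "__main__":
    # Hypothetical science requests for one sol.
    requests = [
        Activity("pancam_panorama", 90, 60.0, 1),
        Activity("drive_to_target", 120, 110.0, 2),
        Activity("apxs_integration", 180, 45.0, 1),
        Activity("uhf_comm_pass", 30, 25.0, 0),
    ]
    for start, name in schedule(requests):
        print(f"t+{start:4d} min  {name}")
```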


IEEE Aerospace Conference | 1998

Design of the Remote Agent experiment for spacecraft autonomy

Douglas E. Bernard; Gregory A. Dorais; Chuck Fry; Edward B. Gamble; Bob Kanefsky; James Kurien; William Millar; Nicola Muscettola; P. Pandurang Nayak; Barney Pell; Kanna Rajan; Nicolas Rouquette; Benjamin D. Smith; Brian C. Williams

This paper describes the Remote Agent flight experiment for spacecraft commanding and control. In the Remote Agent approach, the operational rules and constraints are encoded in the flight software. The software may be considered an autonomous remote agent of the spacecraft operators in the sense that the operators rely on the agent to achieve particular goals. The experiment will be executed during the flight of NASA's Deep Space One technology validation mission. During the experiment, the spacecraft will not be given the usual detailed sequence of commands to execute. Instead, it will be given a list of goals to achieve during the experiment. In flight, the Remote Agent software will generate a plan to accomplish the goals and then execute the plan in a robust manner while keeping track of how well the plan is being accomplished. During plan execution, the Remote Agent stays on the lookout for any hardware faults that might require recovery actions or replanning. In addition to describing the design of the Remote Agent, this paper discusses technology-insertion challenges and the approach taken in the Remote Agent to address them. The experiment integrates several spacecraft autonomy technologies developed at NASA Ames and the Jet Propulsion Laboratory: on-board planning, a robust multithreaded executive, and model-based failure diagnosis and recovery.
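As a purely illustrative sketch of the goal → plan → execute → monitor loop described above (not the flight software; goal names are made up and a random draw stands in for fault detection):

```python
import random

def make_plan(goals):
    """Expand each goal into a (hypothetical) list of plan steps."""
    return [f"{goal}:step{i}" for goal in goals for i in (1, 2)]

def execute(step):
    """Pretend to run a step; occasionally report a hardware fault."""
    return random.random() > 0.1   # True = nominal, False = fault

def remote_agent(goals, max_replans=3):
    """Toy loop: plan from goals, execute while monitoring, replan on faults."""
    for attempt in range(max_replans):
        plan = make_plan(goals)
        for step in plan:
            if not execute(step):
                print(f"fault during {step}; replanning (attempt {attempt + 1})")
                break            # drop back to the planner
        else:
            print("all goals achieved")
            return True
    print("giving up after repeated faults")
    return False

if __name__ == "__main__":
    remote_agent(["take_navigation_images", "thrust_segment_A"])
```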


Journal of Geophysical Research | 1999

Analyzing Pathfinder data using virtual reality and superresolved imaging

Carol R. Stoker; Eric Zbinden; Theodore T. Blackmon; Bob Kanefsky; Joel Hagen; Charles F. Neveu; Daryl N. Rasmussen; Kurt Schwehr; Michael H. Sims

The Mars Pathfinder mission used a unique capability to rapidly generate and interactively display three-dimensional (3-D) photorealistic virtual reality (VR) models of the Martian surface. An interactive terrain visualization system creates and renders digital terrain models produced from stereo images taken by the Imager for Mars Pathfinder (IMP) camera. The stereo pipeline, an automated machine vision algorithm, correlates features between the left and right images to determine their disparity and computes the corresponding positions using the known camera geometry. These positions are connected to form a polygonal mesh upon which IMP images are overlaid as textures. During the Pathfinder mission, VR models were produced and displayed almost as fast as images were received. The VR models were viewed using MarsMap, an interface that allows the model to be viewed from any perspective driven by a standard three-button computer mouse. MarsMap incorporates graphical representations of the lander and rover and the sequence and spatial locations at which rover data were taken. Graphical models of the rover were placed in the model to indicate the rover position at the end of each day of the mission. Images taken by Sojourner cameras are projected into the model as 2-D billboards to show their proper perspective. Distance and angle measurements can be made on features viewed in the model using a mouse-driven 3-D cursor and a point-and-click interface. MarsMap was used to assist with archiving and planning Sojourner activities and to make detailed measurements of surface features such as wind streaks and rock size and orientation that are difficult to perform using 2-D images. Superresolution image processing is a computational method for improving image resolution by a factor of √n by combining n independent images. This technique was used on Pathfinder to obtain better-resolved images of Martian surface features. We show results from superresolving IMP camera images of six targets, including near- and far-field objects, and discuss how the resolution improvement aids interpretation. Similar flood deposits can be seen on both of the Twin Peaks that cannot be resolved in raw images. Millimeter-sized pits are resolved on the rocks Wedge and Halfdome. Other rocks at the Pathfinder site exhibit fine-scale layering that is otherwise invisible. Use of the method resulted in the probable discovery of an artifact of intelligent life on Mars: a part of the Pathfinder spacecraft.
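The geometric core of the stereo pipeline, converting a measured disparity into range from the known camera geometry, follows the standard pinhole stereo relation. The sketch below uses placeholder numbers, not the actual IMP camera calibration:

```python
def disparity_to_range(disparity_px, baseline_m, focal_px):
    """Standard pinhole stereo relation: range = focal_length * baseline / disparity.
    A generic textbook formula, not the IMP-specific calibration."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers only: a 0.15 m stereo baseline, a 500-pixel focal
# length, and a 5-pixel measured disparity give a range of 15 m.
print(disparity_to_range(5.0, 0.15, 500.0))   # -> 15.0
```

The √n superresolution gain quoted in the abstract comes from the same statistical reasoning as averaging n independent noisy measurements of each scene point.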


Workshop on Physics and Computation | 1992

Computational Complexity And Phase Transitions

Peter Cheeseman; Bob Kanefsky; Will Taylor

It is well known that for many NP-complete problems, such as K-Sat, K-colorability, etc., typical cases are easy to solve, so that computationally hard cases must be rare (assuming P ≠ NP). This paper shows that NP-complete problems can be summarized by at least one “order parameter”, and that the hard problems occur at a critical value of such a parameter. This critical value separates two regions of characteristically different properties. For example, for K-colorability, the critical value separates overconstrained from underconstrained random graphs, and it marks the value at which the probability of a solution changes abruptly from near 0 to near 1. It is the high density of well-separated almost-solutions (local minima) at this boundary that causes search algorithms to “thrash”. This boundary is a type of phase transition, and we show that it is preserved under mappings between problems. We show that for some P problems either there is no phase transition or it occurs for bounded N (and so bounds the cost). These results suggest a way of deciding whether a problem is in P or NP and why the two classes differ.
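The paper's experiments concern K-SAT and K-colorability; a tiny self-contained experiment in the same spirit (random 3-SAT with a brute-force satisfiability check, not the authors' code) shows the abrupt drop in the fraction of satisfiable instances as the clause-to-variable ratio crosses its critical value, which for 3-SAT lies near 4.3:

```python
import itertools
import random

def random_3sat(n_vars, n_clauses, rng):
    """Random 3-SAT instance: each clause picks 3 distinct variables
    and negates each with probability 1/2."""
    return [[v if rng.random() < 0.5 else -v
             for v in rng.sample(range(1, n_vars + 1), 3)]
            for _ in range(n_clauses)]

def satisfiable(clauses, n_vars):
    """Brute-force check; fine for the tiny instances used here."""
    for bits in itertools.product([False, True], repeat=n_vars):
        if all(any((lit > 0) == bits[abs(lit) - 1] for lit in clause)
               for clause in clauses):
            return True
    return False

if __name__ == "__main__":
    rng, n, trials = random.Random(0), 10, 30
    for ratio in (2.0, 3.0, 4.0, 4.3, 5.0, 6.0):
        sat = sum(satisfiable(random_3sat(n, int(ratio * n), rng), n)
                  for _ in range(trials))
        print(f"clauses/variables = {ratio:.1f}: {sat}/{trials} satisfiable")
```

Around the critical ratio, instances are also the most expensive to decide, which is the “really hard problems” phenomenon the paper analyzes.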


IEEE Aerospace Conference | 2000

Ground tools for autonomy in the 21st century

Kanna Rajan; M. Shirley; Will Taylor; Bob Kanefsky

Ground tools for unmanned spacecraft are changing rapidly, driven by twin innovations: advanced autonomy and ubiquitous networking. Critical issues are the delegation of low-level decision-making to software; the transparency and accountability of that software; mixed-initiative control, i.e., the ability of controllers to adjust portions of the software's activity without disturbing other portions; and the makeup and geographic distribution of the flight control team. These innovations will enable ground controllers to manage space-based resources much more efficiently and, in the case of science missions, give principal investigators an unprecedented level of direct control. This paper explores these ideas by describing the ground tools for the Remote Agent experiment aboard the Deep Space 1 spacecraft in May of 1999. The experiment demonstrated autonomous control capabilities including goal-oriented commanding, on-board planning, robust plan execution, and model-based fault protection. We then speculate on the effect of these technologies on the future of spacecraft ground control.


International Joint Conference on Artificial Intelligence | 1991

Where the really hard problems are

Peter Cheeseman; Bob Kanefsky; Will Taylor


Icarus | 2010

The High Resolution Imaging Science Experiment (HiRISE) during MRO’s Primary Science Phase (PSP)

Alfred S. McEwen; Maria E. Banks; Nicole Faith Baugh; Kris J. Becker; Aaron K. Boyd; James W. Bergstrom; Ross A. Beyer; Edward Bortolini; Nathan T. Bridges; Shane Byrne; Bradford Castalia; Frank C. Chuang; Larry S. Crumpler; Ingrid Daubar; Alix K. Davatzes; Donald G. Deardorff; Alaina DeJong; W. Alan Delamere; Eldar Zeev Noe Dobrea; Colin M. Dundas; Eric M. Eliason; Yisrael Espinoza; Audrie Fennema; Kathryn Elspeth Fishbaugh; Terry Forrester; Paul E. Geissler; John A. Grant; J. L. Griffes; John P. Grotzinger; V. C. Gulick


Archive | 2000

Can Distributed Volunteers Accomplish Massive Data Analysis Tasks?

Bob Kanefsky; Nadine G. Barlow; Virginia C. Gulick; Peter Norvig


Archive | 2003

MAPGEN: mixed-initiative planning and scheduling for the Mars '03 MER mission

Mitchell Ai-Chang; John L. Bresina; Len Charest; Ari K. Jónsson; Jennifer Hsu; Bob Kanefsky; Pierre Maldague; Paul Morris; Kanna Rajan; Jeffrey Yglesias

Collaboration


Dive into Bob Kanefsky's collaborations.

Top Co-Authors

Kanna Rajan
California Institute of Technology

Gregory A. Dorais
California Institute of Technology

Nicola Muscettola
California Institute of Technology

Nicolas Rouquette
California Institute of Technology