Richard Papasin
Ames Research Center
Publications
Featured research published by Richard Papasin.
Computer Assisted Radiology and Surgery | 2003
Russell J. Andrews; Robert W. Mah; Stefanie S. Jeffrey; Michael Guerrero; Richard Papasin; C. Reed
Abstract The NASA Smart Probe combines information from multiple microsensors—using fuzzy logic and neural network software—to provide a unique tissue “signature” in real time. This report presents recent advances in the probe architecture itself plus clinical information gathered from women undergoing biopsy for suspected breast cancer by the NASA licensee, BioLuminate (Dublin, CA, USA). The multiparameter Smart Probe for breast cancer—1 mm in diameter—can clearly differentiate normal breast, benign lesions, and breast carcinoma. The sensors employed in the Smart Probe for breast cancer include electrical impedance and optical spectroscopy (OS) (both broadband (white) light and laser light (infrared and blue/fluorescence)). Data are acquired 100 times per second; a typical breast “biopsy” generates roughly 500 MB of data. Potential applications of nanoelectrode arrays and the Smart Probe concept for deep brain recording and stimulation in neurosurgery are also noted.
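The abstract describes fusing readings from several microsensors (electrical impedance plus optical-spectroscopy bands) with neural-network software to yield a tissue classification in real time. A minimal sketch of that idea, assuming a single linear layer with softmax output — the feature layout, weights, and class set below are hypothetical illustrations, not the NASA design:

```python
import numpy as np

# Hypothetical tissue classes from the abstract's three categories.
CLASSES = ["normal", "benign", "carcinoma"]

def softmax(z):
    # Numerically stable softmax: shift by the max before exponentiating.
    e = np.exp(z - z.max())
    return e / e.sum()

def classify(features, W, b):
    """Map one multisensor sample (impedance + spectral bands) to a
    probability distribution over tissue classes via a linear layer."""
    return softmax(W @ features + b)

# Illustrative weights only; a real probe would train these on labeled data.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))   # 3 classes x 4 sensor features (assumed)
b = np.zeros(3)

# One sample: e.g. impedance, white-light, infrared, blue/fluorescence.
sample = np.array([0.8, 0.1, 0.3, 0.5])
probs = classify(sample, W, b)
label = CLASSES[int(np.argmax(probs))]
```

At the stated 100 samples per second, such a per-sample forward pass is trivially fast, which is consistent with the real-time claim in the abstract.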
Applied Soft Computing | 2001
Richard Papasin; Yuri Gawdiak; David A. Maluf; Christopher Leidich; Peter B. Tran
Networks of video cameras, meteorological sensors, and ancillary electronic equipment are under development in collaboration among NASA Ames Research Center, the Federal Aviation Administration (FAA), and the National Oceanic and Atmospheric Administration (NOAA). These networks are to be established at and near airports to provide real-time information on local weather conditions that affect aircraft approaches and landings. The prototype network is an airport-approach-zone camera system (AAZCS), which has been deployed at San Francisco International Airport (SFO) and San Carlos Airport (SQL). The AAZCS includes remotely controlled color video cameras located on top of SFO and SQL air-traffic control towers. The cameras are controlled by the NOAA Center Weather Service Unit located at the Oakland Air Route Traffic Control Center and are accessible via a secure Web site. The AAZCS cameras can be zoomed, panned, and tilted to cover a field of view 220° wide. The NOAA observer can see the sky condition as it is changing, thereby making possible a real-time evaluation of the conditions along the approach zones of SFO and SQL. The next-generation network, denoted a remote tower sensor system (RTSS), will soon be deployed at the Half Moon Bay Airport and a version of it will eventually be deployed at Los Angeles International Airport. In addition to remote control of video cameras via secure Web links, the RTSS offers real-time weather observations, remote sensing, portability, and a capability for deployment at remote and uninhabited sites. The RTSS can be used at airports that lack control towers, as well as at major airport hubs, to provide synthetic augmentation of vision for both local and remote operations under what would otherwise be conditions of low or even zero visibility.
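The cameras described above accept remote pan, tilt, and zoom commands while covering a 220°-wide field of view. A minimal sketch of such a command model, assuming pan is clamped to a ±110° envelope — the class, limits, and method names are illustrative, not the deployed AAZCS control interface:

```python
from dataclasses import dataclass

PAN_RANGE_DEG = 220.0  # total pan coverage stated in the abstract

@dataclass
class PTZCamera:
    """Hypothetical pan-tilt-zoom state for one tower-mounted camera."""
    pan_deg: float = 0.0
    tilt_deg: float = 0.0
    zoom: float = 1.0

    def pan_to(self, angle_deg: float) -> float:
        # Clamp the requested pan angle to +/-110 degrees (220 total).
        half = PAN_RANGE_DEG / 2
        self.pan_deg = max(-half, min(half, angle_deg))
        return self.pan_deg
```

Clamping at the controller, rather than trusting remote Web clients to send legal angles, mirrors the abstract's emphasis on secure remote operation.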
Archive | 2002
Russell J. Andrews; Robert W. Mah; Stefanie S. Jeffrey; K. Freitas; Michael Guerrero; Richard Papasin; C. Reed
Automating the sensor, effector, and sensor-effector communication aspects of surgery is essential to perform operations at a site remote from the surgeon, e.g. in space. Real-time tissue recognition can be combined with image guidance to augment the sensor component; a robot can be the remote effector component. The NASA Smart Probe uses neural networks to combine data from multiple microsensors in real time to provide a unique tissue “signature”. The concept has been demonstrated in both animal models and clinical trials with women undergoing breast “biopsy” (optical followed by histological). Minimally invasive multiparameter real-time tissue recognition should improve (1) cancer diagnosis, (2) localization (e.g. functional neurosurgery), and (3) tissue monitoring (e.g. cerebral or cardiac ischemia). The Smart Probe can become the sensor component of a self-contained robotic surgical device.
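The sensor/effector split described above can be sketched as a simple closed loop: the effector advances only while the sensor-side classifier reports normal tissue. The function, the stopping rule, and the use of precomputed labels are illustrative assumptions, not the paper's actual control scheme:

```python
def advance_until(readings, classify):
    """Step a hypothetical effector through successive sensor readings;
    return the depth reached before the classifier first reports a
    non-normal tissue signature."""
    depth = 0
    for sample in readings:
        if classify(sample) != "normal":
            break  # stop the effector at the tissue boundary
        depth += 1
    return depth

# Demo with precomputed labels and an identity "classifier".
labels = ["normal", "normal", "benign", "carcinoma"]
reached = advance_until(labels, lambda s: s)  # reached == 2
```

Because the loop decides per sample, the same structure works whether the classifier runs beside the probe or across a communication link, which is the remote-surgery point the abstract makes.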
Archive | 2005
Bradley J. Betts; Robert W. Mah; Richard Papasin; Rommel Del Mundo; Dawn McIntosh; Charles Jorgensen
Archive | 2003
Edward Wilson; David W. Sutter; Dustin S. Berkovitz; Bradley J. Betts; Edmund Kong; Rommel delMundo; Christopher R. Lages; Robert W. Mah; Richard Papasin
Archive | 2003
Richard Papasin; Bradley J. Betts; Rommel Del Mundo; Michael Guerrero; Robert W. Mah; Dawn McIntosh; Edward Wilson
Archive | 2002
Bradley J. Betts; Rommel delMundo; Sharif Elcott; Dawn McIntosh; Brian Niehaus; Richard Papasin; Robert W. Mah; Daniel Clancy
Stereotactic and Functional Neurosurgery | 1997
Russell J. Andrews; Robert W. Mah; A. Galvagni; M. Guerrero; Richard Papasin; M. Wallace; J. Winters
Archive | 2003
Bradley J. Betts; Richard Papasin; Sharif Elcott; Dawn McIntosh; Rommel Del Mundo; Brian Niehaus; Robert W. Mah; Michael Guerrero; Edward Wilson
Applied Imagery Pattern Recognition Workshop | 2000
Russell J. Andrews; Robert W. Mah; Stefanie S. Jeffrey; A. Aghevli; K. Freitas; M. Guerrero; Richard Papasin; C. Reed