Mark Draelos
Duke University
Publications
Featured research published by Mark Draelos.
international conference on multisensor fusion and integration for intelligent systems | 2012
Mark Draelos; Nikhil Deshpande; Edward Grant
With proper calibration of its color and depth cameras, the Kinect can capture detailed color point clouds at up to 30 frames per second. This capability positions the Kinect for use in robotics as a low-cost navigation sensor. Thus, techniques are presented for efficiently calibrating the Kinect depth camera and altering its optical system to improve its suitability for imaging short-range obstacles. To perform depth calibration, a calibration rig and software were developed to automatically map raw depth values to object depths. The calibration rig consisted of a traditional chessboard calibration target with easily locatable depth features at its exterior corners, which facilitated software extraction of corresponding object depths and raw depth values. To modify the Kinect's optics for improved short-range imaging, Nyko's Zoom adapter was used due to its simplicity and low cost. Although effective at reducing the Kinect's minimum range, these optics introduced pronounced distortion in depth. A method based on capturing depth images of planar objects at various depths produced an empirical depth distortion model for correcting such distortion in software. Together, the modified optics and the empirical depth undistortion procedure demonstrated the ability to improve the Kinect's resolution and decrease its minimum range by approximately 30%.
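The mapping from raw Kinect depth values to metric depth is commonly modeled as linear in inverse depth. As a hedged illustration of the kind of calibration fit the abstract describes (the function names, the inverse-linear model, and the coefficients are assumptions for illustration, not the paper's exact procedure):

```python
import numpy as np

def fit_depth_model(raw_values, true_depths):
    """Fit a two-parameter model mapping raw depth values to metric
    depth, assuming depth = 1 / (a * raw + b). Returns (a, b)."""
    # Ordinary least squares in inverse-depth space, where the model is linear.
    a, b = np.polyfit(np.asarray(raw_values, float),
                      1.0 / np.asarray(true_depths, float), 1)
    return a, b

def raw_to_depth(raw, a, b):
    """Convert a raw depth value to metric depth using fitted coefficients."""
    return 1.0 / (a * raw + b)
```

In practice, the calibration rig described above would supply the (raw value, object depth) pairs at the chessboard corners; the fit itself is a one-line least-squares problem once those correspondences are extracted.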
international conference on image processing | 2015
Mark Draelos; Qiang Qiu; Alexander M. Bronstein; Guillermo Sapiro
Intel's newly announced low-cost RealSense 3D camera claims significantly better precision than other currently available low-cost platforms and is expected to become ubiquitous in laptops and mobile devices starting this year. In this paper, we demonstrate for the first time that the RealSense camera can be easily converted into a true low-cost gaze tracker. Gaze has become increasingly relevant as an input for human-computer interaction due to its association with attention; it is also critical in clinical mental health diagnosis. We present a novel 3D gaze and fixation tracker based on the eye surface geometry captured with the RealSense 3D camera. First, eye surface 3D point clouds are segmented to extract the pupil center and iris using registered infrared images. Under non-ellipsoid eye surface and single fixation point assumptions, pupil centers and iris normal vectors are used to first estimate gaze for each eye, and then a single fixation point for both eyes simultaneously using a RANSAC-based approach. With a simple learned bias field correction model, the fixation tracker demonstrates a mean error of approximately 1 cm at 20-30 cm, which is adequate for gaze and fixation tracking in human-computer interaction and mental health diagnosis applications.
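The core geometric step of combining two per-eye gaze rays into one fixation point reduces to finding the point closest, in a least-squares sense, to a set of 3D rays. A minimal sketch of that inner step (the function name and interface are illustrative; the paper wraps such an estimate in a RANSAC loop and bias correction, which are omitted here):

```python
import numpy as np

def fixation_point(origins, directions):
    """Least-squares point nearest to a set of 3D rays.

    origins: (N, 3) ray origins, e.g. pupil centers.
    directions: (N, 3) gaze direction vectors (need not be unit length).
    Solves sum_i (I - d_i d_i^T) x = sum_i (I - d_i d_i^T) p_i.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(np.asarray(origins, float), np.asarray(directions, float)):
        d = d / np.linalg.norm(d)
        M = np.eye(3) - np.outer(d, d)  # projector orthogonal to the ray
        A += M
        b += M @ p
    return np.linalg.solve(A, b)
```

With noisy per-eye rays, this closed-form solve is cheap enough to run inside a RANSAC hypothesis loop, which is presumably why a sampling-based robustification is practical here.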
Biomedical Optics Express | 2017
Oscar Carrasco-Zevallos; Christian Viehland; Brenton Keller; Mark Draelos; Anthony N. Kuo; Cynthia A. Toth; Joseph A. Izatt
During microsurgery, en face imaging of the surgical field through the operating microscope limits the surgeon's depth perception and visualization of instruments and sub-surface anatomy. Surgical procedures outside microsurgery, such as breast tumor resections, may also benefit from visualization of sub-surface tissue structures. The widespread clinical adoption of optical coherence tomography (OCT) in ophthalmology and its growing prominence in other fields, such as cancer imaging, has motivated the development of intraoperative OCT for real-time tomographic visualization of surgical interventions. This article reviews key technological developments in intraoperative OCT and their applications in human surgery. We focus on handheld OCT probes, microscope-integrated OCT systems, and OCT-guided laser treatment platforms designed for intraoperative use. Moreover, we discuss intraoperative OCT adjuncts and processing techniques currently under development to optimize the surgical feedback derivable from OCT data. Lastly, we survey salient clinical studies of intraoperative OCT for human surgery.
intelligent robots and systems | 2014
Nikhil Deshpande; Edward Grant; Mark Draelos; Thomas C. Henderson
This paper presents a low-complexity, novel approach to wireless sensor network (WSN) assisted autonomous mobile robot (AMR) navigation. The goal is to have an AMR navigate to a target location using only the information inherent to WSNs, i.e., topology of the WSN and received signal strength (RSS) information, while executing an efficient navigation path. Here, the AMR has neither the location information for the WSN, nor any sophisticated ranging equipment for prior mapping. Two schemes are proposed utilizing particle filtering based bearing estimation with RSS values obtained from directional antennas. Real-world experiments demonstrate the effectiveness of the proposed schemes. In the basic node-to-node navigation scheme, the bearing-only particle filtering reduces trajectory length by 11.7% (indoors) and 15% (outdoors), when compared to using raw bearing measurements. The advanced scheme further reduces the trajectory length by 22.8% (indoors) and 19.8% (outdoors), as compared to the basic scheme. The mechanisms exploit the low-cost, low-complexity advantages of the WSNs to provide an effective method for map-less and ranging-less navigation.
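A minimal sketch of the bearing-only particle filtering idea this abstract relies on: particles represent candidate bearings to the next WSN node, each noisy bearing measurement (derived from directional-antenna RSS) reweights them, and resampling plus a circular mean yields the filtered bearing. All names, noise models, and parameters below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def bearing_particle_filter(measurements, n_particles=500,
                            proc_std=0.05, meas_std=0.3, seed=0):
    """Filter a sequence of noisy bearing measurements (radians).

    Returns one bearing estimate (circular mean of particles) per step.
    """
    rng = np.random.default_rng(seed)
    particles = rng.uniform(-np.pi, np.pi, n_particles)
    estimates = []
    for z in measurements:
        # Predict: random-walk process noise on each particle's bearing.
        particles = particles + rng.normal(0.0, proc_std, n_particles)
        # Update: Gaussian likelihood of the wrapped angular error.
        err = np.angle(np.exp(1j * (z - particles)))  # wrap to [-pi, pi]
        w = np.exp(-0.5 * (err / meas_std) ** 2)
        w /= w.sum()
        # Resample, then estimate via the circular mean.
        particles = rng.choice(particles, n_particles, p=w)
        estimates.append(np.angle(np.mean(np.exp(1j * particles))))
    return estimates
```

The appeal for map-less navigation is that this uses only RSS-derived bearings, with no ranging hardware or node coordinates, matching the constraints stated above.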
Robotics and Autonomous Systems | 2016
Nikhil Deshpande; Edward Grant; Thomas C. Henderson; Mark Draelos
international conference on robotics and automation | 2018
Mark Draelos; Brenton Keller; Gao Tang; Anthony N. Kuo; Kris K. Hauser; Joseph A. Izatt
Biomedical Optics Express | 2018
Mark Draelos; Brenton Keller; Christian Viehland; Oscar Carrasco-Zevallos; Anthony N. Kuo; Joseph A. Izatt
Biomedical Optics Express | 2018
Brenton Keller; Mark Draelos; Gao Tang; Sina Farsiu; Anthony N. Kuo; Kris K. Hauser; Joseph A. Izatt
intelligent robots and systems | 2017
Mark Draelos; Brenton Keller; Cynthia A. Toth; Anthony N. Kuo; Kris K. Hauser; Joseph A. Izatt
Investigative Ophthalmology & Visual Science | 2017
Mark Draelos; Brenton Keller; Anthony N. Kuo; Joseph A. Izatt