Joshua Strickon
Apple Inc.
Publications
Featured research published by Joshua Strickon.
IBM Systems Journal | 2000
Joseph A. Paradiso; Kai-yuh Hsiao; Joshua Strickon; Joshua Lifton; Ari Adler
This paper describes four different systems that we have developed for capturing various manners of gesture near interactive surfaces. The first is a low-cost scanning laser rangefinder adapted to accurately track the position of bare hands in a plane just above a large projection display. The second is an acoustic system that detects the position of taps on a large, continuous surface (such as a table, wall, or window) by measuring the differential time-of-arrival of the acoustic shock impulse at several discrete locations. The third is a sensate carpet that uses a grid of piezoelectric wire to measure the dynamic location and pressure of footfalls. The fourth is a swept radio frequency (RF) tag reader that measures the height, approximate location, and other properties (orientation or a control variable like pressure) of objects containing passive, magnetically coupled resonant tags, and updates the continuous parameters of all tagged objects at 30 Hz. In addition to discussing the technologies and surveying different approaches, we give sample applications for each system.
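As a rough illustration of the principle behind the second (acoustic) system, the sketch below recovers a tap location on a surface from the differential time-of-arrival of the impulse at four contact pickups. The pickup layout, propagation speed, and brute-force grid search are assumptions made for the example, not details of the system described above.

```python
# Hedged sketch: locating a tap from differential time-of-arrival (TDOA) at four pickups.
# Sensor geometry, wave speed, and the grid search are illustrative assumptions.
import numpy as np

SENSORS = np.array([[0.0, 0.0], [1.2, 0.0], [0.0, 0.9], [1.2, 0.9]])  # pickup positions (m)
WAVE_SPEED = 2500.0  # assumed propagation speed of the tap impulse in the surface (m/s)

def locate_tap(arrival_times, grid_step=0.005):
    """Return the surface point whose predicted differential delays best match the measured ones."""
    t = np.asarray(arrival_times) - arrival_times[0]      # reference all delays to pickup 0
    xs = np.arange(0.0, 1.2 + grid_step, grid_step)
    ys = np.arange(0.0, 0.9 + grid_step, grid_step)
    gx, gy = np.meshgrid(xs, ys)
    candidates = np.stack([gx.ravel(), gy.ravel()], axis=1)  # every point on a coarse surface grid
    dists = np.linalg.norm(candidates[:, None, :] - SENSORS[None, :, :], axis=2)
    predicted = (dists - dists[:, :1]) / WAVE_SPEED          # predicted TDOAs for each candidate
    error = np.sum((predicted - t) ** 2, axis=1)             # squared mismatch with the measurement
    return tuple(candidates[np.argmin(error)])

# Synthetic check: a tap at (0.4, 0.3) should be recovered to within the grid resolution.
tap = np.array([0.4, 0.3])
times = np.linalg.norm(SENSORS - tap, axis=1) / WAVE_SPEED
print(locate_tap(times))
```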
Human Factors in Computing Systems | 1998
Joshua Strickon; Joseph A. Paradiso
In an earlier system, a set of receive electrodes placed about the display perimeter provided signals that corresponded to body distance. Although this system responded well enough to body dynamics and location for its original application, the limited disambiguation from four receive electrodes could not yield a repeatable hand tracker without excessively constraining the body's posture and placement. We have developed an inexpensive scanning laser rangefinder to measure the real-time position of bare hands in a 2D plane at distances of up to several meters. We have used this device to build a precise, multipoint "touchscreen" interface for large video projection systems. In this paper, we describe the concepts and hardware, and outline an application for an interactive multimedia environment. Other groups have implemented hand trackers using video cameras and computer vision techniques. Some [3] employ IR light sources and cameras behind a translucent rear-projected screen to see hands near the front, while others [4] use multiple cameras to observe a 2D gesture space. Like most vision approaches, the performance of these systems can suffer from background light (including light from the display itself in the latter case), image clutter, limited speed of response, and the need for multi-camera correspondence.
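To make the geometry concrete, here is a minimal sketch of how a single (scan angle, range) return from a planar rangefinder mounted at one corner of the screen might be mapped to display pixel coordinates. The mounting position, screen dimensions, and resolution are hypothetical values, not those of the actual hardware.

```python
# Hedged sketch: one rangefinder return -> pixel coordinates, scanner at the lower-left corner.
# Screen size and resolution below are made-up parameters for illustration.
import math

SCREEN_W_M, SCREEN_H_M = 2.0, 1.5    # assumed physical size of the projection surface (m)
RES_X, RES_Y = 1024, 768             # assumed projector resolution (pixels)

def scan_to_pixels(angle_rad, range_m):
    """Convert a polar (angle, range) hand return into pixel coordinates on the display."""
    x = range_m * math.cos(angle_rad)    # horizontal offset along the screen (m)
    y = range_m * math.sin(angle_rad)    # vertical offset up the screen (m)
    if not (0.0 <= x <= SCREEN_W_M and 0.0 <= y <= SCREEN_H_M):
        return None                      # hand is outside the active display area
    px = int(x / SCREEN_W_M * (RES_X - 1))
    py = int((1.0 - y / SCREEN_H_M) * (RES_Y - 1))   # flip: pixel origin is at the top-left
    return px, py

print(scan_to_pixels(math.radians(40.0), 1.3))  # a hand ~1.3 m out along a 40-degree scan line
```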
Intelligent Robots and Systems | 2003
Cynthia Breazeal; Andrew G. Brooks; Jesse Gray; Matt Hancher; Cory D. Kidd; John McBean; Dan Stiehl; Joshua Strickon
This work motivates interactive robot theatre as an interesting test bed to explore research issues in the development of sociable robots and to investigate the relationship between autonomous robots and intelligent environments. We present the implementation of our initial exploration in this area, highlighting three core technologies: first, an integrated show control software development platform for the design and control of an intelligent stage; second, a stereo vision system that tracks multiple features on multiple audience participants in real time; and third, an interactive, autonomous robot performer with natural and expressive movement that combines techniques from character animation and robot control.
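As a hedged sketch of the kind of computation behind such a stereo tracker, the snippet below triangulates one matched feature from a rectified stereo pair into a 3D position using an assumed focal length, baseline, and principal point. Feature detection and matching, and the paper's actual vision pipeline, are outside its scope.

```python
# Hedged sketch: depth from disparity for one matched feature in a rectified stereo pair.
# Focal length, baseline, and principal point are assumed calibration values.
FOCAL_PX = 800.0        # assumed focal length in pixels
BASELINE_M = 0.12       # assumed distance between the two cameras (m)
CX, CY = 320.0, 240.0   # assumed principal point for 640x480 images

def feature_to_xyz(u_left, v_left, u_right):
    """Triangulate a feature seen at (u_left, v_left) and (u_right, v_left) in rectified images."""
    disparity = u_left - u_right
    if disparity <= 0:
        return None                           # no valid depth for zero or negative disparity
    z = FOCAL_PX * BASELINE_M / disparity     # depth in front of the camera pair (m)
    x = (u_left - CX) * z / FOCAL_PX          # lateral offset (m)
    y = (v_left - CY) * z / FOCAL_PX          # vertical offset (m)
    return x, y, z

print(feature_to_xyz(350.0, 250.0, 310.0))    # a feature roughly 2.4 m away, slightly off-centre
```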
International Conference on Computer Graphics and Interactive Techniques | 1998
Peter Rice; Joshua Strickon
This work combines an innovative, graphical, interactive music system with a state-of-the-art laser tracking device. An abstract graphical representation of a musical piece is projected onto a large vertical display surface. Users are invited to shape musical layers by pulling and stretching animated objects with natural, unencumbered hand movements. Each of the graphical objects is specifically designed to represent and control a particular bit of musical content. Objects incorporate simple behaviors and simulated physical properties to generate unique sonic personalities that contribute to their overall musical aesthetic. The project uses a scanning laser rangefinder to track multiple hands in a plane just forward of the projection surface. Using quadrature-phase detection, this inexpensive device can locate up to six independent points in a plane with cm-scale accuracy at up to 30 Hz. Bare hands can be tracked to within a four-meter radius, without sensitivity to background light or complexion.
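The following sketch illustrates, under assumed constants, how one such object might be modeled: a damped spring that follows the tracked hand while grabbed and relaxes back when released, with its stretch mapped to a 0-127 control value. The spring parameters and the mapping are illustrative choices, not the project's implementation.

```python
# Hedged sketch: one "stretchable" object as a damped spring driven by a tracked hand.
class StretchableObject:
    """A graphical object whose stretch from rest drives a musical control value."""

    def __init__(self, rest_x, stiffness=40.0, damping=6.0):
        self.rest_x = rest_x        # resting position when nobody is holding the object
        self.stiffness = stiffness  # spring constant pulling it back toward rest
        self.damping = damping      # damping so the release settles rather than oscillating forever
        self.x = rest_x
        self.v = 0.0

    def update(self, hand_x, grabbed, dt=1.0 / 30.0):
        """Advance one tracker frame (~30 Hz); follow the hand while grabbed, relax otherwise."""
        if grabbed:
            self.x = hand_x
            self.v = 0.0
        else:
            accel = -self.stiffness * (self.x - self.rest_x) - self.damping * self.v
            self.v += accel * dt
            self.x += self.v * dt

    def control_value(self, max_stretch=0.5):
        """Map the current stretch (m) onto a 0-127 control value for the sound engine."""
        stretch = min(abs(self.x - self.rest_x), max_stretch)
        return int(stretch / max_stretch * 127)

obj = StretchableObject(rest_x=1.0)
obj.update(hand_x=1.3, grabbed=True)    # the tracked hand pulls the object 30 cm to the right
print(obj.control_value())              # roughly 76 of 127 while held stretched
obj.update(hand_x=1.3, grabbed=False)   # released: the spring begins pulling it back to rest
print(obj.control_value())
```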
IEEE Computer Graphics and Applications | 2004
Joshua Strickon; Joseph A. Paradiso
Computer graphics has matured considerably over the past decade. Photorealistic animation and physically modeled simulations run on low-cost graphics cards in PCs with commodity software, and jobs that used to take days to run on a high-end machine can now be rendered in real time on an augmented PC. Accordingly, the computer graphics research community has pushed out to the fringes, exploring, for example, autonomous actors, parameter extraction from images and data, and rendering details so fine that they're barely noticeable. Interaction, however, is another story. Beyond the keyboard and mouse control of today's dominant GUIs, there are no established, more appropriate techniques for interacting with graphics and mixed media. Researchers in fields such as human-computer interfaces, virtual reality, and interactive art have defined and approached interaction in many different ways, but the field hasn't yet converged on a standard set of tools beyond the most basic available on any computer. The technologies of transduction are also following Moore's law, provoking an explosion in the ways input and output can be coupled to and interpreted by a computer. Interaction with graphical and multimedia systems has thus remained an active frontier that spans many fields of application, attracting approaches that are occasionally visionary, often inventive, and sometimes crazy, but that always offer a wild and stimulating intellectual ride.
Archive | 2008
Steve Porter Hotelling; Joshua Strickon; Brian Q. Huppi; Imran Chaudhri; Greg Christie; Bas Ording; Duncan Robert Kerr; Jonathan P. Ive
Archive | 2005
Steve Porter Hotelling; Brian Q. Huppi; Joshua Strickon; Duncan Robert Kerr; Bas Ording; Imran Chaudhri; Greg Christie; Jonathan P. Ive
Archive | 2005
Steven Porter Hotelling; Chris Ligtenberg; Duncan Robert Kerr; Bartley K. Andre; Joshua Strickon; Brian Q. Huppi; Imran Chaudhri; Greg Christie; Bas Ording
Archive | 2007
Wayne Carl Westerman; Joshua Strickon
Archive | 2009
Steve Porter Hotelling; Joshua Strickon; Brian Q. Huppi; Christoph Horst Krah