Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jeffrey R. Blum is active.

Publication


Featured research published by Jeffrey R. Blum.


IEEE Transactions on Neural Systems and Rehabilitation Engineering | 2011

A Game Platform for Treatment of Amblyopia

Long To; Benjamin Thompson; Jeffrey R. Blum; Goro Maehara; Robert F. Hess; Jeremy R. Cooperstock

We have developed a prototype device for take-home use that can be used in the treatment of amblyopia. The therapeutic scenario we envision involves patients first visiting a clinic, where their vision parameters are assessed and suitable parameters are determined for therapy. Patients then proceed with the actual therapeutic treatment on their own, using our device, which consists of an Apple iPod Touch running a specially modified game application. Our rationale for choosing to develop the prototype around a game stems from multiple requirements that such an application satisfies. First, system operation must be sufficiently straightforward that ease of use is not an obstacle. Second, the application itself should be compelling and motivate use more so than a traditional therapeutic task if it is to be used regularly outside of the clinic. This is particularly relevant for children, as compliance is a major issue for current treatments of childhood amblyopia. However, despite the traditional opinion that treatment of amblyopia is only effective in children, our initial results add to the growing body of evidence that improvements in visual function can be achieved in adults with amblyopia.


international conference on human computer interaction | 2009

Did Minority Report Get It Wrong? Superiority of the Mouse over 3D Input Devices in a 3D Placement Task

François Bérard; Jessica Ip; Mitchel Benovoy; Dalia El-Shimy; Jeffrey R. Blum; Jeremy R. Cooperstock

Numerous devices have been invented with three or more degrees of freedom (DoF) to compensate for the assumed limitations of the 2 DoF mouse in the execution of 3D tasks. Nevertheless, the mouse remains the dominant input device in desktop 3D applications, which leads us to pose the following question: is the dominance of the mouse due simply to its widespread availability and long-term user habituation, or is the mouse, in fact, more suitable than dedicated 3D input devices to an important subset of 3D tasks? In the two studies reported in this paper, we measured performance efficiency of a group of subjects in accomplishing a 3D placement task and also observed physiological indicators through biosignal measurements. Subjects used both a standard 2D mouse and three other 3 DoF input devices. Much to our surprise, the standard 2D mouse outperformed the 3D input devices in both studies.


human factors in computing systems | 2013

Listen to it yourself!: evaluating usability of what's around me? for the blind

Sabrina A. Panëels; Adriana Olmos; Jeffrey R. Blum; Jeremy R. Cooperstock

Although multiple GPS-based navigation applications exist for the visually impaired, these are typically poorly suited for in-situ exploration, require cumbersome hardware, lack support for widely accessible geographic databases, or do not take advantage of advanced functionality such as spatialized audio rendering. These shortcomings led to our development of a novel spatial awareness application that leverages the capabilities of a smartphone coupled with worldwide geographic databases and spatialized audio rendering to convey surrounding points of interest. This paper describes the usability evaluation of our system through a task-based study and a longer-term deployment, each conducted with six blind users in real settings. The findings highlight the importance of testing in ecologically valid contexts over sufficient periods to face real-world challenges, including balancing quality versus quantity for audio information, overcoming limitations imposed by sensor accuracy and quality of database information, and paying appropriate design attention to physical interaction with the device.


Journal on Multimodal User Interfaces | 2014

Real-time emergency response: improved management of real-time information during crisis situations

Jeffrey R. Blum; Alexander Eichhorn; Severin Smith; Michael Sterle-Contala; Jeremy R. Cooperstock

The decision-making process during crisis and emergency scenarios intertwines human intelligence with infocommunications. In such scenarios, the tasks of data acquisition, manipulation, and analysis involve a combination of cognitive processes and information and communications technologies, all of which are vital to effective situational awareness and response capability. To support such capabilities, we describe our real-time emergency response (rtER) system, implemented with the intention of helping to manage the potential torrents of data that are available during a crisis, and that could easily overwhelm human cognitive capacity in the absence of technological mediation. Specifically, rtER seeks to address the research challenges surrounding the real-time collection of relevant data, especially live video, making this information rapidly available to a team of humans, and giving them the tools to manipulate, tag, and filter the most critical information of relevance to the situation.
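The abstract above describes tooling rather than a specific algorithm, but the core idea (a team tagging and filtering a torrent of incoming live items) can be illustrated with a small sketch. Everything below is hypothetical: LiveItem, most_relevant, and the vote-plus-recency ranking are assumptions for illustration, not the rtER data model.

```python
# Hypothetical sketch (not the rtER codebase): a minimal data model for
# incoming live items (e.g., video streams) that a response team can tag,
# up-vote, and filter in real time.
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class LiveItem:
    item_id: str
    source_url: str                      # e.g., a live video stream URL
    received_at: datetime
    tags: set[str] = field(default_factory=set)
    relevance_votes: int = 0             # team members up-vote critical items

def most_relevant(items, required_tags=None, max_age_minutes=30, limit=10):
    """Return the freshest, most up-voted items matching the given tags."""
    cutoff = datetime.now() - timedelta(minutes=max_age_minutes)
    candidates = [
        it for it in items
        if it.received_at >= cutoff
        and (not required_tags or required_tags <= it.tags)
    ]
    # Rank primarily by team votes, then by recency.
    return sorted(candidates,
                  key=lambda it: (it.relevance_votes, it.received_at),
                  reverse=True)[:limit]
```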


Mobile Networks and Applications | 2013

Spatialized Audio Environmental Awareness for Blind Users with a Smartphone

Jeffrey R. Blum; Mathieu Bouchard; Jeremy R. Cooperstock

Numerous projects have investigated assistive navigation technologies for the blind community, tackling challenges ranging from interface design to sensory substitution. However, none of these have successfully integrated what we consider to be the three factors necessary for a widely deployable system that delivers a rich experience of one’s environment: implementation on a commodity device, use of a pre-existing worldwide point of interest (POI) database, and a means of rendering the environment that is superior to a naive playback of spoken text. Our “In Situ Audio Services” (ISAS) application responds to these needs, allowing users to explore an urban area without necessarily having a particular destination in mind. We describe the technical aspects of its implementation, user requirements, interface design, safety concerns, POI data source issues, and further requirements to make the system practical on a wider basis. Initial qualitative feedback from blind users is also discussed.
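As a rough illustration of the spatialized rendering the abstract refers to, the sketch below computes the bearing from the user's position to a POI and converts the user-relative angle into a simple stereo pan. This is a minimal sketch under assumed conventions (a plain pan law rather than HRTF-based rendering, and arbitrary example coordinates near Montreal); it is not the ISAS implementation.

```python
# Minimal sketch of the geometry a spatialized POI renderer needs
# (illustrative, not the ISAS code).
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2), in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def stereo_pan(user_lat, user_lon, heading_deg, poi_lat, poi_lon):
    """Return pan in [-1, 1]: -1 = hard left, 0 = ahead or behind, +1 = hard right."""
    rel = (bearing_deg(user_lat, user_lon, poi_lat, poi_lon) - heading_deg + 540.0) % 360.0 - 180.0
    return math.sin(math.radians(rel))   # crude pan law; a real renderer would use HRTFs

# Example: a POI due east of a user facing north pans hard right (≈ 1.0).
print(stereo_pan(45.5048, -73.5772, 0.0, 45.5048, -73.5700))
```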


international symposium on wearable computers | 2016

Grabbing at an angle: menu selection for fabric interfaces

Nur Al-huda Hamdan; Jeffrey R. Blum; Florian Heller; Ravi Kanth Kosuru; Jan O. Borchers

This paper investigates the pinch angle as a menu selection technique for two-dimensional foldable textile controllers. Based on the principles of marking menus, the selection of a menu item is performed by grabbing a fold at a specific angle, while changing value is performed by rolling the fold between the fingers. In a first experiment we determined an upper bound for the number of different angles users can reliably grab into a piece of fabric on their forearm. Our results show that users can, without looking at it, reliably grab fabric on their forearm with an average accuracy between 30° and 45°, which would provide up to six different menu options selectable with the initial pinch. In a second experiment, we show that our textile sensor, Grabrics, can detect fold angles at 45° spacing with up to 85% accuracy. Our studies also found that user performance and workload are independent of the fabric types that were tested.
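A hedged sketch of the selection step the abstract describes: quantizing a detected grab angle into one of a few menu items at fixed angular spacing. The item count, angular span, and nearest-bin rounding below are illustrative assumptions, not the Grabrics sensing or recognition code.

```python
# Hypothetical mapping from a detected fold/pinch angle to a menu item,
# using the 45-degree spacing mentioned in the abstract's second experiment.
def menu_item_for_angle(angle_deg, num_items=4, spacing_deg=45.0):
    """Return the index of the menu item whose nominal angle is closest.

    Items are assumed to sit at 0, spacing, 2*spacing, ... degrees, covering
    num_items * spacing degrees in total (e.g., 4 items over 0-180 degrees).
    """
    span = num_items * spacing_deg
    wrapped = angle_deg % span
    return int(round(wrapped / spacing_deg)) % num_items

# Example: with 45-degree spacing, a grab detected at 92 degrees selects item 2.
assert menu_item_for_angle(92) == 2
assert menu_item_for_angle(10) == 0
```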


user interface software and technology | 2017

Raising the Heat: Electrical Muscle Stimulation for Simulated Heat Withdrawal Response

Pascal E. Fortin; Jeffrey R. Blum; Jeremy R. Cooperstock

Virtual Reality (VR) has numerous mechanisms for making a virtual scene more compellingly real. Most effort has been focused on visual and auditory techniques for immersive environments, although some commercial systems now include relatively crude haptic effects through handheld controllers or haptic suits. We present results from a pilot experiment demonstrating the use of Electrical Muscle Stimulation (EMS) to trick participants into thinking a surface is dangerously hot even though it is below 50 °C. This is accomplished by inducing an artificial heat withdrawal reflex by contracting the participant's biceps shortly after contact with the virtual hot surface. Although the effects of multiple experimental confounds need to be quantified in future work, results so far suggest that EMS could potentially be used to modify temperature perception in VR and AR contexts. Such an illusion has applications for VR gaming as well as emergency response and workplace training and simulation, in addition to providing new insights into the human perceptual system.
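The triggering logic is simple enough to sketch. The code below is a hypothetical illustration of the timing described in the abstract; ems_device.stimulate, the channel name, and the latency and intensity constants are all assumptions, not an API from the paper.

```python
# Hypothetical sketch: fire an EMS-driven biceps contraction shortly after the
# VR engine reports contact with the virtual hot surface, so the arm pulls
# back as if reflexively withdrawing from heat.
import threading

WITHDRAWAL_LATENCY_S = 0.15   # assumed delay between contact and contraction
PULSE_DURATION_S = 0.30       # assumed length of the EMS burst

def on_virtual_contact(ems_device):
    """Call when the tracked hand touches the virtual hot surface."""
    def fire():
        # `ems_device` is an assumed driver object, not a real library.
        ems_device.stimulate(channel="biceps",
                             intensity=0.6,          # fraction of calibrated maximum
                             duration_s=PULSE_DURATION_S)
    # Delay the contraction slightly so it reads as a reflex, not as the touch itself.
    threading.Timer(WITHDRAWAL_LATENCY_S, fire).start()
```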


human factors in computing systems | 2018

Punching Empathy into Yourself and Others: Subversive Transformation of Hostility

Jeffrey R. Blum; Pascal E. Fortin; Feras Al Taha; Yubei Xiong; James Sham

Using technology to convey information and feelings between people is a key goal of many interactive systems, typically with the highest connection fidelity possible. However, the choices made during design and implementation inevitably impact how the communication is perceived. As part of the Empathy Mirror project [4], we explore using technology to instead invert the expressed physical aggression of one participant into a soothing massage for another. Participants take out their aggression on a punching bag. The system detects the magnitude of the blows, and processes them into vibrations rendered via a massage seat to a second participant. Participants reflect on how technology can subvert our intentions, such that the receiver's perception may be very different from what the sender originally communicated. In this case, the most aggressive action available to a participant is to remove themselves from the exhibit, leaving the receiver with no positive vibes, effectively nullifying the ability to be hostile.
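A minimal sketch of the inversion the exhibit performs, assuming an accelerometer on the punching bag and a normalized intensity input on the massage seat; the scaling constants are illustrative, not taken from the installation.

```python
# Hypothetical mapping from punch strength to a gentle massage vibration level.
def punch_magnitude(accel_xyz, gravity=9.81):
    """Peak acceleration magnitude (m/s^2) with gravity removed, floored at 0."""
    x, y, z = accel_xyz
    return max((x * x + y * y + z * z) ** 0.5 - gravity, 0.0)

def massage_intensity(magnitude, max_expected=60.0):
    """Map punch strength to a vibration level in [0, 1].

    Harder punches produce stronger (but capped, non-aggressive) vibrations,
    inverting hostility into a calmer signal for the person in the seat.
    """
    return min(magnitude / max_expected, 1.0)

# Example: a blow peaking around 44 m/s^2 above gravity maps to roughly 0.73.
print(massage_intensity(punch_magnitude((40.0, 30.0, 20.0))))
```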


augmented human international conference | 2016

Expressing Human State via Parameterized Haptic Feedback for Mobile Remote Implicit Communication

Jeffrey R. Blum; Jeremy R. Cooperstock

As part of a mobile remote implicit communication system, we use vibrotactile patterns to convey background information between two people on an ongoing basis. Unlike systems that use memorized tactons (haptic icons), we focus on methods for translating parameters of a user's state (e.g., activity level, distance, physiological state) into dynamically created patterns that summarize the state over a brief time interval. We describe the vibration pattern used in our current user study to summarize a partner's activity, as well as preliminary findings. Further, we propose additional possibilities for enriching the information content.
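To make the idea of parameterized (rather than memorized) patterns concrete, here is a small sketch that turns one state parameter, step count over the last interval, into a pulse train. The thresholds and pulse timings are illustrative assumptions, not the pattern evaluated in the study.

```python
# Hypothetical parameterized encoding: the partner's recent activity level is
# summarized as a short pulse train whose count and strength scale with how
# active they were.
def activity_pattern(step_count, interval_minutes=10):
    """Return a list of (amplitude, on_ms, off_ms) pulses summarizing activity."""
    steps_per_min = step_count / max(interval_minutes, 1)
    level = min(steps_per_min / 120.0, 1.0)      # 120 steps/min ~= brisk walking
    n_pulses = 1 + round(level * 4)              # 1 to 5 pulses
    amplitude = 0.2 + 0.8 * level                # 0.2 to 1.0
    return [(amplitude, 80, 120) for _ in range(n_pulses)]

# Example: a mostly sedentary interval yields a single faint pulse,
# while a brisk walk yields five strong pulses.
print(activity_pattern(50))     # ~1 pulse, low amplitude
print(activity_pattern(1200))   # 5 pulses, full amplitude
```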


human computer interaction with mobile devices and services | 2014

Body-worn sensors for remote implicit communication

Jeffrey R. Blum

Although there has been a great deal of work on using machine learning algorithms to categorize user activity (e.g., walking, biking) from accelerometer and other sensor data, less attention has been focused on relaying such information in real-time for remote implicit communication. Humans are very familiar with using both explicit and implicit communication with others physically near us, for example, by using body language and modulating our voice tone and volume. Likewise, remote explicit communication has also existed for a long time in the form of phone calls, text messages, and other mechanisms for communicating explicitly over great distances. However, remote implicit communication between mobile users has been less explored, likely since there have been limited avenues for collecting such information and rendering it in a meaningful way while we go about our daily lives.
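The first sentence refers to the standard windowed-feature pipeline for activity recognition; the sketch below shows that pipeline on toy synthetic data using scikit-learn. It is illustrative only (the features, window size, and the synthetic "still"/"walking" classes are assumptions), not the thesis work itself.

```python
# Illustrative activity-recognition pipeline: simple features over short
# accelerometer windows, fed to an off-the-shelf classifier whose label could
# then be relayed to a remote partner.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(accel_window):
    """Features for one window of (N, 3) accelerometer samples."""
    mag = np.linalg.norm(accel_window, axis=1)
    return [mag.mean(), mag.std(), mag.max() - mag.min()]

# Toy synthetic data standing in for labeled sensor recordings.
rng = np.random.default_rng(0)
def synth(noise_scale, n=40):
    return [rng.normal(9.8, noise_scale, size=(50, 3)) for _ in range(n)]

windows = synth(0.2) + synth(2.5)                 # low vs. high motion energy
labels = ["still"] * 40 + ["walking"] * 40

X = np.array([window_features(w) for w in windows])
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)

# Classify a new window; the label would then be encoded for the remote partner.
new_window = rng.normal(9.8, 2.5, size=(50, 3))
print(clf.predict([window_features(new_window)])[0])   # expected: "walking"
```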

Collaboration


Dive into Jeffrey R. Blum's collaborations.
