Ravi Kuber
University of Maryland, Baltimore County
Publications
Featured research published by Ravi Kuber.
Universal Access in the Information Society | 2008
Emma Murphy; Ravi Kuber; Graham McAllister; Philip Strain; Wai Yu
This paper describes an empirical study conducted to gain a deeper understanding of the challenges faced by the visually impaired community when accessing the Web. The study, involving 30 blind and partially sighted computer users, identified navigation strategies and perceptions of page layout and graphics when using assistive devices such as screen readers. Analysis of the data revealed that current assistive technologies impose navigational constraints and provide limited information on web page layout. Conveying additional spatial information could enhance the exploration process for visually impaired Internet users; it could also assist collaboration between blind and sighted users when performing web-based tasks. The findings from the survey have informed the development of a non-visual interface, which uses the benefits of multimodal technologies to present spatial and navigational cues to the user.
Virtual Reality | 2006
Wai Yu; Ravi Kuber; Emma Murphy; Philip Strain; Graham McAllister
This paper introduces a novel interface designed to help blind and visually impaired people explore and navigate the Web. In contrast to traditionally used assistive tools, such as screen readers and magnifiers, the new interface employs a combination of audio and haptic features to provide spatial and navigational information to users. The haptic features are presented via a low-cost force-feedback mouse, allowing blind people to interact with the Web in a similar fashion to their sighted counterparts. The audio provides navigational and textual information through non-speech sounds and synthesised speech. Interacting with the multimodal interface offers a novel experience to target users, especially those with total blindness. A series of experiments was conducted to ascertain the usability of the interface and compare its performance to that of a traditional screen reader. Results have shown the advantages the new multimodal interface offers blind and visually impaired people, including enhanced perception of the spatial layout of Web pages and navigation towards elements on a page. Issues regarding the design of the haptic and audio features raised in the evaluation are discussed and presented as recommendations for future work.
Human Factors in Computing Systems | 2007
Ravi Kuber; Wai Yu; Graham McAllister
Haptic technologies are thought to have the potential to help blind individuals overcome the challenges experienced when accessing the Web. This paper proposes a structured, participatory approach for developing targeted haptic sensations for web page exploration, and reports preliminary results showing how HTML elements can be represented through force feedback. Findings are then compared with mappings from previous studies, demonstrating the need to provide tailored haptic sensations for blind Internet users. This research aims to culminate in a framework encompassing a vocabulary of haptic sensations, with accompanying recommendations for designers to reference when developing inclusive web solutions.
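The vocabulary of haptic sensations described above can be sketched as a lookup from HTML element types to force-feedback parameters. This is a minimal illustration only; the element-to-sensation pairings, effect names, and magnitudes below are hypothetical, not the mappings reported in the paper:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class HapticSensation:
    effect: str       # e.g. "friction", "groove", "bump"
    magnitude: float  # normalised force strength, 0.0-1.0


# Hypothetical vocabulary: each element type maps to a distinguishable cue.
HAPTIC_VOCABULARY = {
    "a":     HapticSensation("bump", 0.6),      # links pop under the cursor
    "h1":    HapticSensation("groove", 0.8),    # headings feel recessed
    "img":   HapticSensation("friction", 0.4),  # images drag slightly
    "table": HapticSensation("groove", 0.5),
}


def sensation_for(element: str) -> Optional[HapticSensation]:
    """Look up the cue to render when the cursor enters an element."""
    return HAPTIC_VOCABULARY.get(element.lower())
```

A renderer built on such a table would play the returned effect through the force-feedback device whenever the pointer crosses into the corresponding element, leaving elements with no entry silent.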
International Journal of Human-Computer Studies / International Journal of Man-Machine Studies | 2010
Ravi Kuber; Wai Yu
Research suggests that human limitations are rarely considered in the design of knowledge-based authentication systems. To ease entry to a system, individuals tend to choose passwords which are easy to recall; however, inappropriate selection can compromise data security. A novel approach has been developed to restore the balance between security and memorability through use of the haptic channel. This paper introduces the Tactile Authentication System (TAS), which enables the user to authenticate by recalling a sequence of pre-selected tactile sensations. The design process undertaken to develop distinguishable tactile stimuli for use within TAS is described, and details of the recognition-based tactile authentication mechanism are also presented. Findings from an empirical study reported in this paper revealed that 16 participants were able to authenticate access to TAS over the course of a one-month period, with low levels of error. The approach was found to offer benefits over conventional visual-based authentication methods: tactile stimuli are presented underneath the fingertips, and are therefore occluded from others. As the sense of touch is personal to each user, tactile stimuli are difficult to describe in concrete terms and cannot easily be written down or disclosed, thereby reducing the chance of unauthorized third-party access.
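The recognition-based mechanism described above can be sketched as follows: in each round the user's pre-selected stimulus is shuffled in among distractors, and access is granted only if every round's selection matches the secret sequence. This is a hedged illustration of the general idea, not the TAS implementation; the stimulus names and challenge structure are invented for the example:

```python
import secrets


def build_challenge(secret_stimulus: str, distractors: list) -> list:
    """Shuffle the user's stimulus in among distractors for one round."""
    grid = list(distractors) + [secret_stimulus]
    # Fisher-Yates shuffle driven by the secrets module, so the
    # position of the genuine stimulus is unpredictable.
    for i in range(len(grid) - 1, 0, -1):
        j = secrets.randbelow(i + 1)
        grid[i], grid[j] = grid[j], grid[i]
    return grid


def authenticate(secret_sequence: list, selections: list) -> bool:
    """Grant access only if every round's selection matches the secret."""
    if len(selections) != len(secret_sequence):
        return False
    return all(chosen == expected
               for chosen, expected in zip(selections, secret_sequence))
```

Because the stimuli themselves are rendered only under the fingertips, an observer watching the screen sees nothing that identifies which grid cell carried the genuine sensation.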
International Journal of Human-Computer Studies / International Journal of Man-Machine Studies | 2011
Huimin Qian; Ravi Kuber; Andrew Sears
As mobile technologies such as cellular telephones reduce in both size and cost, and improve in fidelity, they become a more attractive option for performing tasks such as surfing the Web and accessing applications while on-the-go. The small size of the visual display limits the amount of information that can be presented, which may lead to cluttered interfaces. Tactile feedback (e.g. vibrations) provides one solution to reducing the burden on the visual channel. This paper describes a series of studies conducted with the goal of developing perceivable tactile icons (tactons) to aid in non-visual interactions with mobile applications. In contrast to previous work, our research addresses the development of pairs of tactons, rather than individual tactons, with the goal of conveying two-state signals such as ‘on/off’, ‘slower/faster’, or ‘left/right’. Such communication can help reduce visual demands associated with using mobile applications, allowing the device to convey important information while the users’ hands and eyes are otherwise occupied. Realistic conditions were simulated in a laboratory-based environment to determine how auditory distracters could affect the perception of tactons. Findings show that recognition rates differed depending on the design of the vibration pair parameters, and type of auditory distracter. This research culminated in a set of guidelines, which tactile interface designers can integrate as they design mobile applications to improve access, as well as insights which can guide future research on tactile feedback for mobile devices.
Conference on Computers and Accessibility | 2009
Huimin Qian; Ravi Kuber; Andrew Sears
This paper describes a study designed to identify salient tactile cues which can be integrated with a cellular telephone interface, to provide non-visual feedback to users when accessing mobile applications. A set of tactile icons (tactons) have been developed by manipulating the pulse duration and interval of vibrotactile signals. Participants were presented with pairs of tactons, and asked to differentiate between each respective pair and rank their salience. Results suggested that the combination of two static tactons is the most effective way to convey tactile information, when compared with dynamic or mixed tactile cues. Further studies will be conducted to refine feedback in order to communicate the presence of graphical objects on a mobile device interface, or to present events and alerts more effectively. The long term goal is to improve access to an interface by using the tactile channel, thereby freeing the visual and auditory channels to perform other tasks.
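The tacton design space described in these two studies, varying the pulse duration and inter-pulse interval of a vibrotactile signal, can be sketched as a small data structure a motor driver could play back. The specific timing values and the 'slower'/'faster' pairing below are hypothetical illustrations, not parameters taken from the studies:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Tacton:
    pulse_ms: int     # duration of each vibration pulse
    interval_ms: int  # silent gap between pulses
    pulses: int       # number of pulses in the train

    def timeline(self):
        """Expand into (state, duration_ms) segments for a motor driver."""
        segments = []
        for i in range(self.pulses):
            segments.append(("on", self.pulse_ms))
            if i < self.pulses - 1:
                segments.append(("off", self.interval_ms))
        return segments


# A hypothetical two-state pair, e.g. 'slower'/'faster': a few long,
# sparse pulses versus many short, dense ones, so the pair remains
# distinguishable even under auditory distraction.
SLOWER = Tacton(pulse_ms=400, interval_ms=300, pulses=2)
FASTER = Tacton(pulse_ms=100, interval_ms=80, pulses=4)
```

Keeping the two members of a pair far apart in both duration and density is one plausible reading of why some vibration-pair designs in the study were recognised more reliably than others.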
Proceedings of the 13th Web for All Conference | 2016
Ali Abdolrahmani; Ravi Kuber; Amy Hurst
In this paper, we describe a study focusing on the situationally-induced impairments and disabilities (SIIDs) which individuals who are blind encounter when interacting with mobile devices. We conducted semi-structured interviews with eight legally-blind participants, and presented them with three scenarios to inspire discussion relating to SIIDs. Nine main themes emerged from analysis of the participant interviews, including the challenges faced when using a mobile device one-handed while using a cane to detect obstacles along the intended path, the impact of using a mobile device under inhospitable conditions, and concerns associated with using a mobile device in environments where privacy and safety may be compromised (e.g. when using public transport). These were found to reduce the quality of the subjective interaction experience, and in some cases to limit use of mobile technologies in public venues. Insights from our research can be used to guide the design of future mobile interfaces to better meet the needs of users who are often excluded from the design process.
International Journal of Human-Computer Interaction | 2013
Ravi Kuber; Franklin P. Wright
Interpersonal communication benefits greatly from the emotional information encoded by facial expression, body language, and tone of voice. However, these cues are often lost when sending text-based instant messages. This article describes a novel approach utilizing gestural and brain–computer interface (BCI) input to replace the missing emotional cues, with the aim of augmenting the instant messaging process. A set of exploratory studies was conducted using a commercially available BCI headset. An initial study validated the emotional data automatically captured by the device. Subsequently, an instant messaging application was developed, which detected emotions and facial gestures and presented them to the user's chat partner via progress bars and an avatar. Findings from an evaluation revealed that the novel approach facilitates communication containing a greater percentage of affective terms compared with traditional, text-based instant messaging environments. Strong levels of confidence were expressed when using the system to both convey and infer affective states, contributing to a rich subjective user experience.
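The progress-bar presentation of affective state described above can be sketched as a mapping from headset emotion readings onto per-emotion bars shown to the chat partner. This is a hypothetical illustration of the display idea only; the emotion names, value range, and text rendering are assumptions, not the study's software:

```python
def emotion_bars(readings: dict, width: int = 10) -> list:
    """Render each 0.0-1.0 emotion reading as a text progress bar.

    `readings` maps emotion name -> normalised intensity from the headset.
    """
    bars = []
    for emotion, level in sorted(readings.items()):
        level = min(max(level, 0.0), 1.0)  # clamp noisy sensor values
        filled = round(level * width)
        bars.append(f"{emotion:<12}[{'#' * filled}{'-' * (width - filled)}]")
    return bars
```

In a chat client, such bars would be refreshed continuously alongside the avatar, so affective context travels with the text rather than being typed out explicitly.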
Interacting with Computers | 2011
Huimin Qian; Ravi Kuber; Andrew Sears; Emma Murphy
Older adults are recommended to remain physically active to reduce the risk of chronic diseases and to maintain psychological well-being. At the same time, research also suggests that levels of fitness can be raised among this group. This paper describes the development and evaluation of a mobile technology which enables older adults to monitor and modify their walking habits, with the long-term aim of sustaining appropriate levels of physical activity. An empirical study was conducted with twenty older adults to determine the feasibility of the proposed solution, with results indicating that tactile signals could be perceived while in motion and could support participants in walking at a range of paces. However, the effects were difficult to discern due to limitations of the hardware. In response, a novel low-cost prototype was developed to amplify vibrations, and the effectiveness of redundant auditory information was investigated with the goal of enhancing the perception of the cues. A second study was conducted to determine the impact of multimodal feedback on walking behavior. Findings revealed that participants were able to maintain a desired pace more consistently when redundant auditory information was presented alongside the tactile feedback. These results suggest that, when the visual channel is not available, tactile cues presented via a mobile device should be augmented with auditory feedback. Our research also suggests that mobile devices could be made more effective for alternative applications if they are designed to allow for stronger tactile feedback.
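The core feedback decision in such a pace-monitoring application can be sketched as a check against a target pace band, firing a paired tactile-plus-redundant-audio cue only when the walker drifts outside it. The cue names, threshold, and cadence units below are hypothetical, not values from the studies:

```python
def pace_cue(steps_per_min: float, target: float, tolerance: float = 5.0):
    """Return a (tactile, audio) cue pair, or None while pace is acceptable.

    Pairing each tactile cue with a redundant audio cue mirrors the
    multimodal design found helpful in the second study.
    """
    if steps_per_min < target - tolerance:
        return ("fast_pulse", "rising_tone")   # prompt the walker to speed up
    if steps_per_min > target + tolerance:
        return ("slow_pulse", "falling_tone")  # prompt the walker to slow down
    return None                                # within the band: stay silent
```

Staying silent inside the tolerance band avoids habituating the walker to constant vibration, so the cues retain their salience when a correction is actually needed.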
Interacting with Computers | 2011
Shaojian Zhu; Ravi Kuber; Matthew Tretter; M. Sile O'Modhrain
Haptic technologies are often used to improve access to the structural content of graphical user interfaces, thereby augmenting the interaction process for blind users. While haptic design guidelines offer valuable assistance when developing non-visual interfaces, the recommendations presented are often tailored to the feedback produced by one particular haptic input/output device. A blind user is therefore restricted to interacting with a device which may be unfamiliar to him or her, rather than selecting from the range of commercially available products. This paper reviews devices available on the new and second-hand markets, and describes an exploratory study undertaken with 12 blindfolded sighted participants to determine the effectiveness of three devices for non-visual web interaction. The force-feedback devices chosen for the study varied in the number of translations and rotations the user was able to perform when interacting with them. Results indicated that the Novint Falcon could be used to target items faster in the first task presented, compared with the other devices. However, participants agreed that the force-feedback mouse was most comfortable to use when interacting with the interface. Findings highlight the benefits which low-cost haptic input/output devices can offer to the non-visual browsing process, and the changes that may be needed to accommodate their limitations. The study also highlights the need for web designers to integrate appropriate haptic feedback on their web sites, catering for the strengths and weaknesses of various devices, in order to provide universally accessible sites and online applications.