Kevin P. Moloney
Georgia Institute of Technology
Publications
Featured research published by Kevin P. Moloney.
Human Factors in Computing Systems | 2003
Julie A. Jacko; Ingrid U. Scott; François Sainfort; Leon Barnard; Paula J. Edwards; V. Kathlene Emery; Thitima Kongnakorn; Kevin P. Moloney; Brynley S. Zorich
This study examines the effects of multimodal feedback on the performance of older adults with different visual abilities. Older adults possessing normal vision (n=29) and those diagnosed with Age-Related Macular Degeneration (AMD) (n=30) performed a series of drag-and-drop tasks under varying forms of feedback. User performance was assessed with measures of feedback exposure time and accuracy. Results indicated that, in some cases, non-visual (e.g., auditory or haptic) and multimodal (bi- and trimodal) feedback forms demonstrated significant performance gains over visual-only feedback, for both AMD and normally sighted users. In addition to visual acuity, the effects of manual dexterity and computer experience are considered.
Conference on Universal Usability | 2002
V. Kathlene Emery; Paula J. Edwards; Julie A. Jacko; Kevin P. Moloney; Leon Barnard; Thitima Kongnakorn; François Sainfort; Ingrid U. Scott
This experiment examines the effect of combinations of feedback (auditory, haptic, and/or visual) on the performance of older adults completing a drag-and-drop computer task. Participants completed a series of drag-and-drop tasks under each of seven feedback conditions (3 unimodal, 3 bimodal, 1 trimodal). Performance was assessed using measures of efficiency and accuracy. For the analyses, participants were grouped by level of computer experience. All users performed well under auditory-haptic bimodal feedback, and experienced users responded well to all multimodal feedback. Based on the performance benefits for older adults seen in this experiment, future research should investigate how to effectively integrate multimodal feedback into graphical user interfaces in order to improve usability for this growing and diverse user group.
Behaviour & Information Technology | 2004
Julie A. Jacko; V. Kathlene Emery; Paula J. Edwards; Mahima Ashok; Leon Barnard; Thitima Kongnakorn; Kevin P. Moloney; François Sainfort
This experiment examines the effect that computer experience and various combinations of feedback (auditory, haptic, and/or visual) have on the performance of older adults completing a drag-and-drop task on a computer. Participants were divided into three computer experience groups, based on their frequency of use and breadth of computer knowledge. Each participant completed a series of drag-and-drop tasks under each of seven feedback conditions (three unimodal, three bimodal, one trimodal). Performance was assessed using measures of efficiency and accuracy. Experienced users responded well to all multimodal feedback, while users without experience responded well to auditory-haptic bimodal feedback but poorly to haptic-visual bimodal feedback. Based on the performance benefits for older adults seen in this experiment, future research should investigate how to effectively integrate multimodal feedback into graphical user interfaces in order to improve usability for this growing and diverse user group.
Human Factors in Computing Systems | 2004
Julie A. Jacko; Leon Barnard; Thitima Kongnakorn; Kevin P. Moloney; Paula J. Edwards; V. Kathlene Emery; François Sainfort
This study examines the effects of multimodal feedback on the performance of older adults with an ocular disease, Age-Related Macular Degeneration (AMD), when completing a simple computer-based task. Visually healthy older users (n = 6) and older users with AMD (n = 6) performed a series of drag-and-drop tasks that incorporated a variety of different feedback modalities. The user groups were equivalent with respect to traditional visual function metrics and measured subject cofactors, aside from the presence or absence of AMD. Results indicate that users with AMD exhibited decreased performance with respect to required feedback exposure time. Some non-visual and multimodal feedback forms show potential as solutions to enhance performance, for those with AMD as well as for visually healthy older adults.
International Journal of Human-Computer Interaction | 2005
Julie A. Jacko; Kevin P. Moloney; Thitima Kongnakorn; Leon Barnard; Paula J. Edwards; V. Kathlene Leonard; François Sainfort; Ingrid U. Scott
This study examines the effects of the most common cause of blindness in persons over the age of 55 in the United States, age-related macular degeneration (AMD), on the performance of older adults when completing a simple computer-based task. Older users with normal vision (n = 6) and with AMD (n = 6) performed a series of drag-and-drop tasks that incorporated a variety of different feedback modalities. The user groups were equivalent with respect to traditional visual function parameters (i.e., visual acuity, contrast sensitivity, and color vision) and measured subject cofactors, aside from the presence or absence of AMD (i.e., drusen and retinal pigment epithelial mottling). Task performance was assessed with measures of time (trial time and feedback exposure time) and accuracy (error frequency). Results indicate that users with AMD exhibited decreased performance with respect to required feedback exposure time, total trial time, and errors committed. Some nonvisual and multimodal feedback forms show potential as solutions for enhanced performance, for those with AMD as well as for visually healthy older adults.
ACM Transactions on Computer-Human Interaction | 2006
Kevin P. Moloney; Julie A. Jacko; Brani Vidakovic; François Sainfort; V. Kathlene Leonard; Bin Shi
The current ubiquity of information technology has increased variability among users, creating a corresponding need to properly capture and understand these individual differences. This study introduces a novel application of multifractal statistical methods to distinguish users via patterns of variability within high-frequency pupillary response behavior (PRB) data collected during computer-based interaction. PRB was measured from older adults, including two groups diagnosed with age-related macular degeneration (AMD) spanning a range of visual acuities (n = 14), and one visually healthy control group (i.e., disease-free, 20/20--20/32 acuity) (n = 14). Three measures of the multifractal spectrum (the distribution of regularity indices extracted from time series data) distinguished the user groups: 1) Spectral Mode; 2) Broadness; and 3) Left Slope. The results demonstrate a clear relationship between the values of these measures and the level of visual capability. These analytical techniques leverage the inherent complexity and richness of high-frequency physiological response data, which can be used to meaningfully differentiate individuals whose sensory and cognitive capabilities may be affected by aging and visual impairment. Multifractal analysis provides an objective, quantifiable means of uncovering and examining the underlying signatures in physiological behavior that may account for individual differences in interaction needs and behaviors.
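The spectrum measures named above (mode, broadness, left slope) derive from how a signal's regularity varies across scales. As a rough, self-contained illustration only (not the paper's actual wavelet-based pipeline; the data below is synthetic), this sketch estimates structure-function scaling exponents zeta(q) for a time series and reads off a crude spectrum "broadness" from their local slopes:

```python
import math
import random

def structure_exponents(x, qs=(0.5, 1.0, 2.0, 3.0), scales=(1, 2, 4, 8, 16)):
    """Estimate scaling exponents zeta(q) of a 1-D series x via the
    structure-function method: S_q(s) = mean |x[i+s] - x[i]|^q, and
    zeta(q) is the slope of log S_q(s) versus log s."""
    zetas = {}
    for q in qs:
        pts = []
        for s in scales:
            incs = [abs(x[i + s] - x[i]) for i in range(len(x) - s)]
            sq = sum(v ** q for v in incs) / len(incs)
            pts.append((math.log(s), math.log(sq)))
        # least-squares slope of the log-log points
        n = len(pts)
        mx = sum(p[0] for p in pts) / n
        my = sum(p[1] for p in pts) / n
        slope = (sum((p[0] - mx) * (p[1] - my) for p in pts) /
                 sum((p[0] - mx) ** 2 for p in pts))
        zetas[q] = slope
    return zetas

def spectrum_broadness(zetas):
    """Local regularity h(q) ~ d zeta / d q; the spread of h over q is a
    crude analogue of the 'broadness' of the multifractal spectrum
    (near zero for a monofractal signal)."""
    qs = sorted(zetas)
    hs = [(zetas[qs[i + 1]] - zetas[qs[i]]) / (qs[i + 1] - qs[i])
          for i in range(len(qs) - 1)]
    return max(hs) - min(hs)

# Synthetic stand-in for a pupil-diameter trace: a Gaussian random walk,
# for which zeta(q) is approximately q/2 and broadness is near zero.
random.seed(0)
walk = [0.0]
for _ in range(4096):
    walk.append(walk[-1] + random.gauss(0, 1))
z = structure_exponents(walk)
print({q: round(v, 3) for q, v in z.items()}, round(spectrum_broadness(z), 3))
```

A signal with genuinely multifractal variability, as the paper reports for PRB data, would instead show a markedly nonlinear zeta(q) and a wide broadness.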
Behaviour & Information Technology | 2005
Paula J. Edwards; Leon Barnard; V. Kathlene Leonard; Ji Soo Yi; Kevin P. Moloney; Thitima Kongnakorn; Julie A. Jacko; François Sainfort
This paper examines factors that affect performance on a basic menu selection task by users who are visually healthy and users with Diabetic Retinopathy (DR) in order to inform better interface design. Linear and logistic regression models were used to examine various contextual factors that influenced task efficiency (time) and accuracy (errors). Interface characteristics such as multimodal feedback, Windows® accessibility settings, and menu item location were investigated along with various visual function and participant characteristics. Results indicated that Windows® accessibility settings and other factors, including age, computer experience, visual acuity, contrast sensitivity, and menu item location, were significant predictors of task performance.
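The kind of regression modelling this abstract describes can be sketched as a minimal logistic fit of a binary error outcome against two hypothetical predictors (an accessibility-settings flag and an acuity score). Everything below is illustrative: plain gradient descent on invented synthetic data, not the study's models or measurements:

```python
import math
import random

def logistic_fit(X, y, lr=0.5, steps=2000):
    """Plain gradient-descent logistic regression:
    P(error) = sigmoid(w . x + b), fit by minimizing mean log loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    n = len(X)
    for _ in range(steps):
        gw = [0.0] * len(w)
        gb = 0.0
        for xi, yi in zip(X, y):
            p = 1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            err = p - yi  # gradient of log loss w.r.t. the logit
            for j, xj in enumerate(xi):
                gw[j] += err * xj
            gb += err
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

# Synthetic illustration: error odds drop when accessibility settings are
# on (feature 0) and rise with worse acuity (feature 1). Invented data.
random.seed(1)
X, y = [], []
for _ in range(300):
    access = random.randint(0, 1)       # accessibility settings on?
    acuity = random.uniform(0.0, 1.0)   # 0 = normal vision, 1 = severe loss
    logit = -1.5 * access + 3.0 * acuity - 0.5
    y.append(1 if random.random() < 1 / (1 + math.exp(-logit)) else 0)
    X.append([access, acuity])
w, b = logistic_fit(X, y)
print([round(v, 2) for v in w], round(b, 2))
```

The fitted signs (negative for the accessibility flag, positive for acuity loss) mirror the qualitative pattern the abstract reports; the linear-regression side of the analysis would model continuous task time the same way with a squared-error loss.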
Behaviour & Information Technology | 2005
Julie A. Jacko; Leon Barnard; Ji Soo Yi; Paula J. Edwards; V. Kathlene Leonard; Thitima Kongnakorn; Kevin P. Moloney; François Sainfort
This study investigates the effectiveness of two design interventions, the Microsoft® Windows® accessibility settings and multimodal feedback, aimed at enhancing a menu selection task for users with diabetic retinopathy (DR) with stratified levels of visual dysfunction. Several menu selection task performance measures, both time- and accuracy-based, were explored across different interface conditions and across groups of participants stratified by degree of vision loss. The results showed that the Windows® accessibility settings had a significant positive impact on performance for participants with DR. In contrast, multimodal feedback had a negligible effect for all participants. Strategies for applying multimodal feedback to menu selection are discussed, as well as the potential benefits and drawbacks of the Windows® accessibility settings.
Conference on Computers and Accessibility | 2004
Paula J. Edwards; Leon Barnard; V. Kathlene Emery; Ji Soo Yi; Kevin P. Moloney; Thitima Kongnakorn; Julie A. Jacko; François Sainfort; Pamela Oliver; Joseph Pizzimenti; Annette Bade; Greg Fecho; Josephine Shallo-Hoffmann
This paper examines factors that affect performance of a basic menu selection task by users who are visually healthy and users with Diabetic Retinopathy (DR) in order to inform better interface design. Interface characteristics such as multimodal feedback, Windows® accessibility settings, and menu item location were investigated. Analyses of Variance (ANOVA) were employed to examine the effects of interface features on task performance. Linear regression was used to further examine and model various contextual factors that influenced task performance. Results indicated that Windows® accessibility settings significantly improved performance of participants with more progressed DR. Additionally, other factors, including age, computer experience, visual acuity, and menu location were significant predictors of the time required for subjects to complete the task.
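The ANOVA mentioned above reduces to comparing between-group and within-group variance. A minimal sketch on hypothetical task-time data (the numbers are invented, not the study's):

```python
def one_way_anova_F(groups):
    """One-way ANOVA F statistic: between-group mean square divided by
    within-group mean square, for k independent groups of measurements."""
    k = len(groups)
    N = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / N
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((v - m) ** 2 for v in g) for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (N - k))

# Hypothetical menu-selection times (seconds): default interface versus
# Windows accessibility settings enabled (invented numbers).
default_ui = [14.2, 15.1, 13.8, 16.0, 14.9]
access_ui = [11.0, 10.4, 11.8, 10.9, 11.5]
F = one_way_anova_F([default_ui, access_ui])
print(round(F, 2))
```

A large F (relative to the F distribution with k-1 and N-k degrees of freedom) indicates the interface condition explains more variance than chance, which is the form of evidence behind the accessibility-settings finding.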
Journal of Statistical Computation and Simulation | 2006
Bin Shi; Kevin P. Moloney; Y. Pan; V. K. Leonard; Brani Vidakovic; Julie A. Jacko; François Sainfort
This paper addresses the problem of classifying users with different visual abilities on the basis of their pupillary response while performing computer-based tasks. Multiscale Schur monotone (MSM) summaries of high frequency pupil diameter measurements are utilized as feature (or input) vectors in this classification. Various MSM measures, such as Shannon, Picard, and Emlen entropies, the Gini coefficient and the Fishlow measure, are investigated to assess their discriminatory characteristics. A combination of classifiers, motivated by a Bayesian paradigm, is proposed to minimize and stabilize the misclassification rate in training and test sets with the goal of improving classification accuracy. In addition, the issue of wavelet basis selection for optimal classification performance is discussed. The members of the Pollen wavelet library are included as competitors. The proposed methodology is validated with extensive simulation and applied to high-frequency pupil diameter measurements collected from 36 individuals with varying ocular abilities and pathologies. The expected misclassification rate of our procedure can be as low as 21% by appropriately choosing the Schur monotone summary and using a properly selected wavelet basis.
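Two of the Schur monotone summaries named above, Shannon entropy and the Gini coefficient, can be computed directly from a normalized feature vector. A toy sketch (the vectors below stand in for normalized wavelet energies of a pupil-diameter series and are invented, not the paper's data):

```python
import math

def shannon_entropy(p):
    """Shannon entropy of a discrete distribution p (components sum to 1):
    maximal for a uniform vector, small when mass concentrates."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def gini(p):
    """Gini coefficient of p: 0 for a uniform vector, approaching 1 as
    mass concentrates in few components. Like the entropy, it is Schur
    monotone: it responds consistently to majorization, i.e. to how
    'spread out' the energy is across components."""
    s = sorted(p)
    n = len(s)
    cum = sum((i + 1) * v for i, v in enumerate(s))
    return (2 * cum) / (n * sum(s)) - (n + 1) / n

# Toy feature vectors: energy spread evenly versus concentrated.
flat = [0.25, 0.25, 0.25, 0.25]
peaky = [0.85, 0.05, 0.05, 0.05]
print(round(shannon_entropy(flat), 3), round(shannon_entropy(peaky), 3))
print(round(gini(flat), 3), round(gini(peaky), 3))
```

Summaries like these turn a high-frequency measurement into a short feature vector; the paper's contribution is in choosing such summaries (and the wavelet basis producing the energies) so that a combined classifier separates the user groups well.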