Ahreum Lee
Hanyang University
Publications
Featured research published by Ahreum Lee.
Human Movement Science | 2015
Ahreum Lee; Kiburm Song; Hokyoung Ryu; Jieun Kim; Gyuhyun Kwon
The growing popularity of gaming applications and ever-faster mobile carrier networks have drawn attention to an issue closely tied to command input performance. A mirroring game service, which provides the same game to PC and mobile phone users simultaneously, allows them to play against each other with very different control interfaces. For efficient mobile game design, it is therefore essential to have a predictive model that measures how touch input compares with PC interfaces. The present study empirically tests the keystroke-level model (KLM) for predicting the time performance of basic interaction controls on the touch-sensitive smartphone interface (i.e., tapping, pointing, dragging, and flicking). A modified KLM, tentatively called the fingerstroke-level model (FLM), is proposed, with operator time estimates derived from regression models.
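To make the KLM/FLM idea concrete, the sketch below sums per-operator time estimates over an interaction sequence. The operator set (tap, point, drag, flick) follows the abstract; the Fitts'-law form and all coefficient values are illustrative assumptions, not the regression estimates reported in the paper.

```python
# Minimal KLM/FLM-style completion-time sketch.
# Coefficients are placeholders, NOT the paper's regression estimates.
import math
from typing import List, Tuple

# Assumed movement-operator form: T = a + b * log2(distance / width + 1).
COEFF = {
    "point": (0.20, 0.12),
    "drag":  (0.25, 0.15),
    "flick": (0.10, 0.05),
}
TAP_TIME = 0.15  # placeholder constant time for a tap, in seconds


def operator_time(op: str, distance: float = 0.0, width: float = 1.0) -> float:
    """Predicted time for a single touch operator (placeholder coefficients)."""
    if op == "tap":
        return TAP_TIME
    a, b = COEFF[op]
    return a + b * math.log2(distance / width + 1)


def predict_task_time(sequence: List[Tuple[str, float, float]]) -> float:
    """KLM-style prediction: sum operator times over an interaction sequence."""
    return sum(operator_time(op, d, w) for op, d, w in sequence)


# Example: tap a button, drag an item 300 px to a 48 px target, then flick.
task = [("tap", 0.0, 1.0), ("drag", 300.0, 48.0), ("flick", 200.0, 48.0)]
print(f"Predicted completion time: {predict_task_time(task):.2f} s")
```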
Human Factors in Computing Systems | 2015
Ahreum Lee
The wearable camera industry is facing low adoption rates due to concerns over the amount of data the devices collect and the difficulty of differentiating them from mobile phones and digital cameras. To improve adoption rates, the perception of the wearable camera should be changed. This research portrays wearable cameras as tools for sharing personal experiences. A 50-day study was conducted to determine what types of experiences are meaningful to users. These factors should be considered when designing wearable cameras for a personal re-experiencing system.
Proceedings of the Asian HCI Symposium '18 on Emerging Research Collection | 2018
Ahreum Lee; Hokyoung Ryu
The lifelogging camera continuously captures one's surroundings, so lifelog photos can form a medium by which to sketch out and share one's autobiographical memory with others. Frequently, however, lifelog photos do not convey the context or significance of the situations to those who were not present when the photos were taken. This paper explores the social value of lifelog photos by proposing different levels of autobiographical narrative within Panofsky's framework. By measuring narrative engagement with a questionnaire and the activation level of the lateral prefrontal cortex (LPFC), we found that delivering the autobiographical narrative at the iconological level triggers the receiver's empathetic response and emotional tagging of the sharer's lifelog photos.
International Journal of Environmental Research and Public Health | 2018
Ahreum Lee; Hokyoung Ryu; Jae-Kwan Kim; Eunju Jeong
Older adults are known to have less cognitive control capability and greater susceptibility to distraction than young adults. Previous studies have reported age-related problems in selective attention and inhibitory control, yielding mixed results depending on the modality and context in which stimuli and tasks were presented. The purpose of this study was to empirically demonstrate a modality-specific loss of inhibitory control in processing audio-visual information with ageing. A group of 30 young adults (mean age = 25.23, standard deviation (SD) = 1.86) and 22 older adults (mean age = 55.91, SD = 4.92) performed the audio-visual contour identification task (AV-CIT). We compared performance on visual/auditory identification (Uni-V, Uni-A) with performance on visual/auditory identification in the presence of distraction in the counterpart modality (Multi-V, Multi-A). The findings showed a modality-specific effect on inhibitory control. Uni-V performance was significantly better than Multi-V, indicating that auditory distraction significantly hampered visual target identification. However, Multi-A performance was significantly enhanced compared to Uni-A, indicating that auditory target identification was significantly enhanced by visual distraction. Additional analysis showed an age-specific effect on the enhancement between Uni-A and Multi-A depending on the level of visual inhibition. Together, our findings indicate that the loss of visual inhibitory control was beneficial for auditory target identification in a multimodal context in older adults. A likely multisensory information-processing strategy in older adults is further discussed in relation to aged cognition.
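As an illustration of the within-subject Uni/Multi comparison described above, the sketch below contrasts paired condition scores for one modality. The variable names and the use of a paired t-test are assumptions made for illustration, and the randomly generated scores are placeholders, not data or results from the study.

```python
# Illustrative within-subject comparison of Uni-V vs Multi-V performance.
# Randomly generated placeholder scores -- NOT data from the study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=42)
n_participants = 30

# Hypothetical identification accuracy per participant and condition.
uni_v = rng.uniform(0.70, 1.00, size=n_participants)    # visual target, no distractor
multi_v = rng.uniform(0.55, 0.90, size=n_participants)  # visual target + auditory distractor

# Paired comparison of the two conditions (assumed analysis, for illustration only).
t_stat, p_value = stats.ttest_rel(uni_v, multi_v)
print(f"Uni-V vs Multi-V: t = {t_stat:.2f}, p = {p_value:.4f}")
```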
International Journal of Industrial Ergonomics | 2013
Jieun Kim; Ahreum Lee; Hokyoung Ryu
Journal of Visualized Experiments | 2018
Kyoungwon Seo; Ahreum Lee; Jieun Kim; Hokyoung Ryu; Hojin Choi
Human Factors in Computing Systems | 2013
Kiburm Song; Jihoon Kim; Yoon-Han Cho; Ahreum Lee; Hokyoung Ryu; Jung-Woon Choi; Yong Joo Lee
HCIK '16 Proceedings of HCI Korea | 2016
Ahreum Lee; Hokyoung Ryu
Interactions | 2015
Ahreum Lee; Kyoungwon Seo; Jieun Kim; Gyu Hyun Kwon; Hokyoung Ryu
QScience Proceedings | 2013
David Parsons; Jae Lim Ahn; Ahreum Lee; Kiburm Song; Miyoung Yoon