
Publications


Featured research published by Jaclyn Barnes.


Conference on Computers and Accessibility | 2016

Breaking Barriers to Digital Literacy: An Intergenerational Social-Cognitive Approach

Keith Atkinson; Jaclyn Barnes; Judith Albee; Peter Anttila; Judith Haataja; Kanak Nanavati; Kelly S. Steelman; Charles Wallace

In entering the digital realm, older adults face obstacles beyond the more clearly understood physical and cognitive barriers traditionally associated with accessibility. This experience report is a collection of narratives from learners and student tutors who participate in our digital literacy sessions for seniors. We point out ways in which attitudes and motivations, framed by social and cultural factors, can either hinder or assist with adoption of commodity digital technology among older newcomers. We also show how a social-cognitive approach can help learners overcome barriers to digital literacy.


Human-Robot Interaction | 2016

Making Live Theatre with Multiple Robots as Actors: Bringing Robots to Rural Schools to Promote STEAM Education for Underserved Students

Myounghoon Jeon; Maryram FakhrHosseini; Jaclyn Barnes; Zackery Duford; Ruimin Zhang; Joseph Ryan; Eric Vasey

We have sought to promote STEM (science, technology, engineering, and math) education for underserved students using interactive robots. As a further attempt to integrate art and design into STEM education (i.e., STEAM), in the present paper we introduce our afterschool program, in which elementary students create live theatre using multiple robots as actors. We hope to receive feedback and comments on our afterschool curriculum and case study so that we can run better sessions at schools and develop a standardized protocol for this robot-actors approach.


International Conference on Ubiquitous Robots and Ambient Intelligence | 2017

The influence of robot design on acceptance of social robots

Jaclyn Barnes; Maryam FakhrHosseini; Myounghoon Jeon; Chung Hyuk Park; Ayanna Howard

Human-robot interaction (HRI) is a rapidly growing area of research, and the factors that shape the relationship between robots and the people who use them are still being determined. In the present research, we sought to ascertain whether robot appearance influenced people's preference for, interest in, and communication with particular robots. To this aim, 18 college students were asked to interact with 5 different robots, answer an investigator-designed questionnaire, and complete an adapted PHIT-40 scale for each robot. Results suggest that, regardless of the actual interaction with the robots, participants prefer robots that resemble animals or humans over those that represent an imaginary creature or do not resemble a creature at all. Results are discussed in terms of social-robot applications and design features.


Human-Robot Interaction | 2017

Child-Robot Theater: STEAM Education in an Afterschool Program

Jaclyn Barnes; Maryam FakhrHosseini; Eric Vasey; Zackery Duford; Joseph Ryan; Myounghoon Jeon

Children in an elementary school afterschool program utilizing the STEAM (science, technology, engineering, arts, and math) education paradigm created and acted in short plays with a variety of social robots.


International Conference on Human-Computer Interaction | 2015

Development and Evaluation of Emotional Robots for Children with Autism Spectrum Disorders

Myounghoon Jeon; Ruimin Zhang; William Lehman; Seyedeh Maryam Fakhrhosseini; Jaclyn Barnes; Chung Hyuk Park

Individuals with Autism Spectrum Disorders (ASD) often have difficulty recognizing emotional cues in ordinary interaction. To address this, we are developing a social robot that teaches children with ASD to recognize emotion in the simpler and more controlled context of interaction with a robot. An emotion recognition program using the Viola-Jones algorithm for face detection is in development. To better understand emotion expression by social robots, a study was conducted in which 11 college students matched animated facial expressions, and emotionally neutral sentences spoken in affective voices, to various emotions. Overall, facial expressions had greater recognition accuracy and higher perceived intensity than voices. Future work will test recognition of combined faces and voices.


International Conference on Human-Computer Interaction | 2018

Non-invasive Gaze Direction Estimation from Head Orientation for Human-Machine Interaction.

Zhi Zheng; Yuguang Wang; Jaclyn Barnes; Xingliang Li; Chung Hyuk Park; Myounghoon Jeon

Gaze direction is one of the most important interaction cues and is widely used in human-machine interaction. In scenarios where participants' head movement is involved and/or participants are sensitive to body-attached sensors, traditional gaze tracking methods, such as commercial eye trackers, are not appropriate: participants must hold a fixed head pose during tracking or wear invasive sensors that are distracting and uncomfortable. Head orientation has therefore been used to approximate gaze direction in these cases. However, the difference between head orientation and gaze direction has not been thoroughly and numerically evaluated, so how to derive gaze direction accurately from head orientation remains an open question. This article makes two contributions toward solving these problems. First, we evaluated the difference between people's frontal head orientation and their gaze direction when looking at objects in different directions. Second, we developed functions that map people's gaze direction from their frontal head orientation. The accuracy of the proposed gaze tracking method is around 7°, and the method can easily be layered on top of any existing remote head orientation method to perform non-invasive gaze direction estimation.
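The abstract describes fitting functions that map head orientation to gaze direction from calibration data. The paper's actual mapping functions are not given here; a minimal sketch, assuming a simple per-axis polynomial fit over (yaw, pitch) calibration samples, could look like:

```python
import numpy as np

def fit_head_to_gaze(head_angles, gaze_angles, degree=1):
    """Fit one polynomial per axis mapping head orientation to gaze
    direction, from calibration samples of shape (n_samples, 2) with
    columns (yaw, pitch) in degrees. Hypothetical illustration only."""
    head = np.asarray(head_angles, dtype=float)
    gaze = np.asarray(gaze_angles, dtype=float)
    return [np.polyfit(head[:, i], gaze[:, i], degree)
            for i in range(head.shape[1])]

def estimate_gaze(coeffs, head_angle):
    """Predict (yaw, pitch) gaze direction from one head orientation."""
    head_angle = np.asarray(head_angle, dtype=float)
    return np.array([np.polyval(c, h) for c, h in zip(coeffs, head_angle)])

# Synthetic calibration: gaze deviates linearly from head orientation.
yaw = np.linspace(-30, 30, 13)
pitch = np.linspace(-20, 20, 13)
head = np.column_stack([yaw, pitch])
gaze = np.column_stack([1.3 * yaw + 2.0, 1.1 * pitch - 1.0])
coeffs = fit_head_to_gaze(head, gaze)
pred = estimate_gaze(coeffs, [10.0, 10.0])  # recovers the linear mapping
```

In practice the head orientation would come from a remote head-pose estimator, and the calibration pairs from participants fixating known targets; the choice of polynomial degree is an assumption.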


Robot and Human Interactive Communication | 2017

Love at first sight: Mere exposure to robot appearance leaves impressions similar to interactions with physical robots

S. Maryam Fakhr Hosseini; Samantha Hilliger; Jaclyn Barnes; Myounghoon Jeon; Chung Hyuk Park; Ayanna M. Howard

As the technology needed to make robots robust and affordable draws ever nearer, human-robot interaction (HRI) research to make robots more useful and accessible to the general population becomes more crucial. In this study, 59 college students filled out an online survey soliciting their judgments of seven social robots based solely on appearance. Results suggest that participants prefer robots that resemble animals or humans over those that represent an imaginary creature or do not resemble a creature at all. Results are discussed in terms of social robot applications and design features.


International Conference on Ubiquitous Robots and Ambient Intelligence | 2017

Robot Opera: A modularized afterschool program for STEAM education at local elementary school

Myounghoon Jeon; Jaclyn Barnes; Maryam FakhrHosseini; Eric Vasey; Zack Duford; Zhi Zheng; Emily A. Dare

The importance of STEM education has steadily increased. Recently, it has been extended to "STEAM" by adding art and design to the equation. To promote STEAM education using robots, we have devised the "Robot Opera" program at a local elementary school, extending our earlier child-robot theater program. After introducing our pedagogical approach, we outline our ongoing program, composed of four modules: acting, dancing, sonifying, and drawing. We hope this work-in-progress paper can facilitate discussion on how to use robots in informal learning environments.


Archive | 2017

Virtual Reality Museum of Consumer Technologies

Avinash Subramanian; Jaclyn Barnes; Naveena Vemulapalli; Sumeet Chhawri

Given the rapid pace of technical development over the past several decades, many people have fond memories of using devices that are no longer common. We built a prototype of a virtual museum of consumer technologies to explore this, with the intention of prompting visitors' memories of using past technology. The prototype was created using the Janus VR browser and evaluated on a 2D display by 7 young adult users. It successfully prompted memories in all of the evaluators, and all users rated the pleasure of touring the museum as neutral or better. Future work involves building a more comprehensive museum and exploring better ways to utilize virtual reality for more engaging experiences.


International Conference on Auditory Display | 2016

Musical Robots For Children With ASD Using A Client-Server Architecture

Ruimin Zhang; Jaclyn Barnes; Joseph Ryan; Myounghoon Jeon; Chung Hyuk Park; Ayanna M. Howard

Collaboration


Dive into Jaclyn Barnes's collaborations.

Top Co-Authors

Myounghoon Jeon, Michigan Technological University
Chung Hyuk Park, George Washington University
Eric Vasey, Michigan Technological University
Joseph Ryan, Michigan Technological University
Maryam FakhrHosseini, Michigan Technological University
Ruimin Zhang, Michigan Technological University
Ayanna M. Howard, Georgia Institute of Technology
Zackery Duford, Michigan Technological University
Zhi Zheng, Vanderbilt University
Avinash Subramanian, Michigan Technological University