Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Aysu Erdemir is active.

Publication


Featured research published by Aysu Erdemir.


Applied Perception in Graphics and Visualization | 2011

Egocentric distance perception in real and HMD-based virtual environments: the effect of limited scanning method

Qiufeng Lin; Xianshi Xie; Aysu Erdemir; Gayathri Narasimham; Timothy P. McNamara; John J. Rieser; Bobby Bodenheimer

We conducted four experiments on egocentric depth perception using blind walking with a restricted scanning method in both the real and a virtual environment. Our viewing condition in all experiments was monocular. We varied the field of view (real), scan direction (real), blind walking method (real and virtual), and self-representation (virtual) over distances of 4 meters to 7 meters. The field of view varied between 21.1° and 13.6°. The scan direction varied between near-to-far scanning and far-to-near scanning. The blind walking method varied between direct blind walking and an indirect method of blind walking that matched the geometry of our laboratory. We varied self-representation among a self-avatar (a fully tracked, animated, first-person representation of the user), a static avatar (a mannequin avatar that did not move), and no avatar (a disembodied camera view of the virtual environment). In the real environment, we find an effect of field of view; participants performed more accurately with the larger field of view. In both real and virtual environments, we find an effect of blind walking method; participants performed more accurately in direct blind walking. We do not find distance underestimation in either environment, nor do we find an effect of self-representation.


Robot and Human Interactive Communication | 2012

Learning structural affordances through self-exploration

Erdem Erdemir; D.M. Wilkes; Kazuhiko Kawamura; Aysu Erdemir

The goal of this paper is to develop a cognitive developmental approach for a humanoid robot so that it can provisionally discover self-affordance relations between certain arm limb movements and the corresponding motor units by exploring the outcomes of its random arm movements while in a crawling position. Learning of the right- and left-arm affordances is based on self-exploration and a set of experiences, similar to how a human baby discovers the action-effect relations of its own arm movements. We address the early development of self-affordances, similar to that of infants, which encode the relationships between actions, objects, and their effects on the environment.
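The exploration loop described above — perform random actions, record their outcomes, and associate each action with its typical effect — can be sketched in miniature. This is an illustrative toy only, not the authors' implementation: the action names, effect values, and noise model are all invented for the example.

```python
# Toy sketch of learning affordances through self-exploration: a simulated
# robot performs random arm actions, observes noisy effects (here, body
# displacement in cm), and learns each action's typical outcome as the
# mean of its observed effects.
import random

random.seed(0)

ACTIONS = ["left_arm_forward", "left_arm_back",
           "right_arm_forward", "right_arm_back"]

# Hypothetical ground-truth effect of each action, observed with sensor noise.
TRUE_EFFECT = {
    "left_arm_forward": 2.0,
    "left_arm_back": -1.5,
    "right_arm_forward": 2.0,
    "right_arm_back": -1.5,
}

def explore(n_trials):
    """Perform random actions and record (action, noisy observed effect) pairs."""
    experience = []
    for _ in range(n_trials):
        action = random.choice(ACTIONS)
        effect = TRUE_EFFECT[action] + random.gauss(0, 0.2)
        experience.append((action, effect))
    return experience

def learn_affordances(experience):
    """Associate each action with the mean of its observed effects."""
    sums, counts = {}, {}
    for action, effect in experience:
        sums[action] = sums.get(action, 0.0) + effect
        counts[action] = counts.get(action, 0) + 1
    return {a: sums[a] / counts[a] for a in sums}

affordances = learn_affordances(explore(500))
for action, mean_effect in sorted(affordances.items()):
    print(f"{action}: {mean_effect:+.2f} cm")
```

With enough random trials, the learned means converge to the underlying action-effect relations — the same intuition as an infant discovering what its arm movements do, though the paper's actual method operates on a humanoid robot's motor units rather than on scalar displacements.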


Journal of Fluency Disorders | 2018

The effect of emotion on articulation rate in persistence and recovery of childhood stuttering

Aysu Erdemir; Tedra A. Walden; Caswell M. Jefferson; Dahye Choi; Robin M. Jones

PURPOSE: This study investigated the possible association of emotional processes and articulation rate in preschool-age children who stutter and persist (persisting), children who stutter and recover (recovered), and children who do not stutter (nonstuttering).

METHODS: The participants were ten persisting, ten recovered, and ten nonstuttering children between the ages of 3 and 5 years, who were classified as persisting, recovered, or nonstuttering approximately 2-2.5 years after the experimental testing took place. The children were exposed to three emotionally arousing video clips (baseline, positive, and negative) and produced a narrative based on a text-free storybook following each video clip. From the audio recordings of these narratives, individual utterances were transcribed and articulation rates were calculated.

RESULTS: Persisting children exhibited significantly slower articulation rates following the negative emotion condition, unlike recovered and nonstuttering children, whose articulation rates were not affected by either of the two emotion-inducing conditions. Moreover, all stuttering children displayed faster rates during fluent compared to stuttered speech; however, the recovered children were significantly faster than the persisting children during fluent speech.

CONCLUSION: Negative emotion plays a detrimental role in the speech-motor control processes of children who persist, whereas children who eventually recover seem to exhibit a relatively more stable and mature speech-motor system. This suggests that complex interactions between speech-motor and emotional processes are at play in stuttering recovery and persistence, and that articulation rates following negative emotion, or during stuttered versus fluent speech, might be considered potential factors for prospectively predicting persistence of and recovery from stuttering.
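Articulation rate, the measure at the center of the study above, is conventionally computed as syllables per second of articulated speech, with pauses excluded. A minimal sketch of that conventional formula (an assumption for illustration — the study's actual analysis procedure is not described here):

```python
# Illustrative sketch of the conventional articulation-rate formula:
# syllables per second over the articulated portion of an utterance,
# i.e. total speaking time minus pause time.
def articulation_rate(num_syllables, speaking_time_s, pause_time_s=0.0):
    """Return syllables per second of articulated speech (pauses excluded)."""
    articulation_time = speaking_time_s - pause_time_s
    if articulation_time <= 0:
        raise ValueError("articulation time must be positive")
    return num_syllables / articulation_time

# e.g. a 12-syllable utterance spoken in 4.0 s with 0.8 s of pauses
rate = articulation_rate(12, 4.0, 0.8)
print(f"{rate:.2f} syllables/s")  # 3.75 syllables/s
```

Slower rates, as observed in the persisting group after the negative condition, correspond to smaller values of this measure for utterances of comparable length.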


Psychomusicology: Music, Mind and Brain | 2017

Singing without Hearing: A Comparative Study of Children and Adults Singing a Familiar Tune

Sara L. Beck; John J. Rieser; Aysu Erdemir

The current study had 2 goals: to examine and compare baseline singing accuracy of 3 age-groups of participants (children ages 5–8 years, children ages 9–12, and adults) when performing a familiar song from memory, and to examine all participants’ use of auditory and proprio-kinesthetic feedback in regulating pitch by masking their ability to hear themselves. Pitch accuracy, error variability, and tonal stability were examined. All participants were asked to sing “The Alphabet Song” from memory in 2 conditions: normal auditory feedback and masked auditory feedback. Under both feedback conditions, there was significant improvement between the youngest children (ages 5–8) and the adults on all 3 measures, but not between the older children (ages 9–12) and the adults. Participants in every age-group performed more poorly in terms of interval accuracy and error variability when they could not hear themselves. In terms of tonal stability, however, we found an age by feedback interaction such that auditory masking negatively affected key stability for children ages 9 to 12, but not the younger children or the adults. This suggests that although older children may rely heavily on auditory feedback to control relative pitch accuracy and tonal center, adults and younger children may show a different pattern of feedback monitoring for interval-based singing accuracy and maintaining a consistent tonal center.


Applied Perception in Graphics and Visualization | 2011

Egocentric distance perception in HMD-based virtual environments

Qiufeng Lin; Xianshi Xie; Aysu Erdemir; Gayathri Narasimham; Timothy P. McNamara; John J. Rieser; Bobby Bodenheimer

We conducted a follow-up experiment to the work of Lin et al. [2011]. The experimental protocol was the same as that of Experiment Four in Lin et al. [2011], except the viewing condition was binocular instead of monocular. In that work there was no distance underestimation, contrary to what has been widely reported elsewhere, and we were motivated in this experiment to see whether stereoscopic effects in head-mounted displays (HMDs) accounted for this result.


Ecological Psychology | 2014

Knowing the Results of One's Own Actions Without Visual or Auditory Feedback When Walking, Throwing, and Singing

John J. Rieser; Aysu Erdemir; Ngoc-Thoa Khuu; Sara L. Beck

People's actions in everyday life often occur in situations where they can see and hear the results of their actions. In this article we summarize studies where people walked without seeing, threw a ball without seeing, and sang a tune without hearing. In each case, people reported perceptual awareness of the temporally unfolding results of their actions and judged with varying degrees of accuracy how far they walked relative to their remembered surroundings, where the ball's trajectory landed relative to their remembered surroundings, and how accurately they stayed in tune when singing without hearing.


Applied Perception in Graphics and Visualization | 2011

Using endpoints to judge alterations in self-produced trajectories in an immersive virtual environment

Erin A. McManus; Aysu Erdemir; Stephen W. Bailey; John J. Rieser; Bobby Bodenheimer

McManus et al. [2011] studied a user's ability to judge errors in self-produced motion, more specifically throwing. We now take the first step toward discriminating which cues subjects are using in order to make their judgments. The endpoint of the ball is one such cue; the restricted field of view (FOV) of the head-mounted display (HMD) makes it difficult for users to view the complete trajectory of the ball, making the endpoint one of the more consistent cues available during the experiment. For the current study, we hid the trajectory of the ball and showed only its landing point.


Journal of Speech Language and Hearing Research | 2017

Executive Functions Impact the Relation between Respiratory Sinus Arrhythmia and Frequency of Stuttering in Young Children Who Do and Do Not Stutter.

Robin M. Jones; Tedra A. Walden; Edward G. Conture; Aysu Erdemir; Warren Lambert; Stephen W. Porges


Applied Perception in Graphics and Visualization | 2011

Perceiving alterations in trajectories while throwing in a virtual environment

Erin A. McManus; Qiufeng Lin; Aysu Erdemir; Stephen W. Bailey; John J. Rieser; Bobby Bodenheimer


Journal of Vision | 2013

Learning and the Role of Visual Information in Calibrating the Forces of Throws

John J. Rieser; Ngoc-Thoa Khuu; Aysu Erdemir

Collaboration


Dive into Aysu Erdemir's collaboration.
