Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Yuko Yamashita is active.

Publication


Featured research published by Yuko Yamashita.


Frontiers in Psychology | 2013

Acoustic Analyses of Speech Sounds and Rhythms in Japanese- and English-Learning Infants

Yuko Yamashita; Yoshitaka Nakajima; Kazuo Ueda; Yohko Shimada; David Hirsh; Takeharu Seno; Benjamin Alexander Smith

The purpose of this study was to explore developmental changes in spectral fluctuations and temporal periodicity in the speech of Japanese- and English-learning infants. Three age groups (15, 20, and 24 months) were selected, because infants diversify their phonetic inventories with age. Natural speech of the infants was recorded. We utilized a critical-band-filter bank, which simulated the frequency resolution of the adult auditory periphery. First, correlations between the power fluctuations of the critical-band outputs were examined with factor analysis, in order to see how the critical bands should be connected to each other if a listener is to differentiate sounds in infants' speech. We then analyzed the temporal fluctuations of the factor scores by calculating autocorrelations. The analysis identified, at 24 months of age in both linguistic environments, the same three factors that had been observed in adult speech. These three factors were shifted to a higher frequency range, corresponding to the smaller vocal tract size of the infants. The results suggest that the vocal tract structures of the infants had developed into an adult-like configuration by 24 months of age in both language environments. The proportion of utterances with short-term periodicity increased with age in both environments; this trend was clearer in the Japanese environment.
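The pipeline described in the abstract (band-pass filtering, per-band power envelopes, factor analysis of the band-power fluctuations, and autocorrelation of the factor scores) can be sketched roughly as follows. The band edges, frame size, the synthetic test signal, and the use of PCA as a stand-in for factor analysis are illustrative assumptions, not the study's actual parameters.

```python
import numpy as np

def bandpass_bank(signal, fs, edges):
    """Split `signal` into adjacent frequency bands via FFT masking."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    bands = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        masked = np.where((freqs >= lo) & (freqs < hi), spec, 0)
        bands.append(np.fft.irfft(masked, n=len(signal)))
    return np.array(bands)                     # (n_bands, n_samples)

def power_envelopes(bands, frame=256):
    """Mean power per non-overlapping frame, per band."""
    n = bands.shape[1] // frame * frame
    framed = bands[:, :n].reshape(bands.shape[0], -1, frame)
    return (framed ** 2).mean(axis=2)          # (n_bands, n_frames)

def factor_scores(powers, n_factors=3):
    """PCA stand-in for the factor analysis of band-power fluctuations."""
    x = powers - powers.mean(axis=1, keepdims=True)
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    return vt[:n_factors]                      # (n_factors, n_frames)

def autocorr(x):
    """Normalized autocorrelation, exposing temporal periodicity."""
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:]
    return r / r[0]

# Toy input: a 300 Hz tone with slow amplitude modulation.
fs = 16000
t = np.arange(fs) / fs
speech = np.sin(2 * np.pi * 300 * t) * (1 + 0.5 * np.sin(2 * np.pi * 4 * t))
bands = bandpass_bank(speech, fs, edges=[100, 400, 1600, 6400])
scores = factor_scores(power_envelopes(bands))
r = autocorr(scores[0])
```

Peaks in `r` at nonzero lags would indicate periodic structure in the dominant factor's fluctuation, which is the kind of temporal periodicity the study quantifies.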


Annual ACIS International Conference on Computer and Information Science | 2016

Head pose estimation and movement analysis for speech scene

Rinko Komiya; Takeshi Saitoh; Miharu Fuyuno; Yuko Yamashita; Yoshitaka Nakajima

Public speaking is an essential skill for a large variety of professions and in everyday life. However, these skills are difficult to master. This paper focuses on the automatic assessment of nonverbal facial behavior during public speaking and proposes a simple and efficient method of head pose estimation and motion analysis. We collected nine speech scenes from a recitation contest at a Japanese high school and applied the proposed method to evaluate the performances. For head pose estimation, our method obtained acceptable accuracy on the speech scenes. The proposed motion analysis method can calculate the frequencies and moving ranges of head motion. As a result, we found a correlation between the moving range and the eye-contact score.


International Journal of Software Innovation (IJSI) | 2017

Head pose estimation and motion analysis of public speaking videos

Rinko Komiya; Takeshi Saitoh; Miharu Fuyuno; Yuko Yamashita; Yoshitaka Nakajima

Public speaking is an essential skill in a large variety of professions and also in everyday life. However, it can be difficult to master. This paper focuses on the automatic assessment of nonverbal facial behavior during public speaking and proposes simple and efficient methods of head pose estimation and motion analysis. The authors collected nine and six speech videos from a recitation and oration contest, respectively, conducted at a Japanese high school and applied the proposed method to evaluate the contestants' performance. For the estimation of head pose from speech videos, their method produced results with an acceptable level of accuracy. The proposed motion analysis method can be used for calculating frequencies and moving ranges of head motion. The authors found that the proposed parameters and the eye-contact score are strongly correlated and that the proposed frequency and moving range parameters are suitable for evaluating public speaking. Thus, on the basis of these features, a teacher can provide accurate feedback to help a speaker improve.

Keywords: English Oration Contest, English Recitation Contest, Facial Feature Point, Head Pose Estimation, Image Processing, Motion Analysis, Speech Video
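The two motion parameters named above (the frequency and the moving range of head motion) and a correlation check against eye-contact scores can be sketched in simplified form. The frame rate, the mean-crossing definition of frequency, the synthetic yaw trace, and the toy score data are assumptions for illustration, not the authors' actual procedure or results.

```python
import numpy as np

def moving_range(yaw):
    """Span of head rotation (degrees) over the whole speech."""
    return float(np.max(yaw) - np.min(yaw))

def motion_frequency(yaw, fps=30.0):
    """Head-motion rate: mean-crossings of the yaw trace, in cycles/s."""
    centered = yaw - yaw.mean()
    crossings = np.sum(np.diff(np.sign(centered)) != 0)
    return crossings * fps / (2.0 * len(yaw))

# Synthetic 10-second yaw trace: slow left-right scanning near 0.5 Hz.
fps = 30.0
t = np.arange(0, 10, 1 / fps)
yaw = 15 * np.sin(2 * np.pi * 0.5 * t + 0.1)

mr = moving_range(yaw)
freq = motion_frequency(yaw, fps)

# Pearson correlation between a motion parameter and eye-contact scores
# across speakers (toy numbers, not the paper's data).
ranges = np.array([12.0, 25.0, 18.0, 30.0, 9.0])
eye_contact = np.array([4.1, 2.8, 3.5, 2.2, 4.6])
corr = np.corrcoef(ranges, eye_contact)[0, 1]
```

With per-speaker parameter vectors in hand, a strong `corr` magnitude is what would justify using these features as automatic proxies for an eye-contact rating.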


Learner Corpus Studies in Asia and the World | 2014

Analyzing Speech Pauses and Facial Movement Patterns in Multimodal Public-Speaking Data of EFL Learners

Miharu Fuyuno; Yuko Yamashita; Yoshikiyo Kawase; Yoshitaka Nakajima


Acoustical Science and Technology | 2018

How sonority appears in speech analyses

Yoshitaka Nakajima; Kazuo Ueda; Gerard B. Remijn; Yuko Yamashita; Takuya Kishida


言語科学 (Language Science) | 2016

Inter-turn pauses in the conversations of three-year-old children and their parents

Yuko Yamashita


基幹教育紀要 (Bulletin of KIKAN Education) | 2016

ESP Education for Arts Students: Public Speaking Instruction Using Films (芸術系ESP教育 : 映画を用いたパブリック・スピーキング指導)

Yuko Yamashita; Miharu Fuyuno


International Journal of Psychology | 2016

Developing effective instructions to decrease Japanese speakers' nervousness during English and Japanese public speeches: Evidence from psychological and physiological measurements: P0636

Miharu Fuyuno; Yuko Yamashita; Yuki Yamada; Yoshitaka Nakajima


CILC2016: 8th International Conference on Corpus Linguistics, pp. 447-461 | 2016

Semantic Structure, Speech Units and Facial Movements: Multimodal Corpus Analysis of English Public Speaking

Miharu Fuyuno; Yuko Yamashita; Takeshi Saitoh; Yoshitaka Nakajima


Proceedings of the Auditory Research Meeting (聴覚研究会資料) | 2014

Influence of speech rate and pauses on the efficiency of English public speaking of Japanese EFL learners

Yuko Yamashita; Miharu Fuyuno; Yoshitaka Nakajima

Collaboration


Dive into Yuko Yamashita's collaborations.

Top Co-Authors

Takeshi Saitoh (Kyushu Institute of Technology)
Rinko Komiya (Kyushu Institute of Technology)