Yuko Yamashita
Shibaura Institute of Technology
Publications
Featured research published by Yuko Yamashita.
Frontiers in Psychology | 2013
Yuko Yamashita; Yoshitaka Nakajima; Kazuo Ueda; Yohko Shimada; David Hirsh; Takeharu Seno; Benjamin Alexander Smith
The purpose of this study was to explore developmental changes in spectral fluctuations and temporal periodicity in the speech of Japanese- and English-learning infants. Three age groups (15, 20, and 24 months) were selected because infants diversify their phonetic inventories with age. Natural speech of the infants was recorded and passed through a critical-band filter bank that simulated the frequency resolution of the adult auditory periphery. First, the correlations between the power fluctuations of the critical-band outputs were subjected to factor analysis, in order to see how the critical bands would have to be connected to one another if a listener were to differentiate the sounds in infants’ speech. We then analyzed the temporal fluctuations of the factor scores by calculating autocorrelations. At 24 months of age, the analysis identified the same three factors that had been observed in adult speech, in both linguistic environments. These factors were shifted toward a higher frequency range, corresponding to the smaller vocal tract size of the infants. The results suggest that the infants’ vocal tract structures had developed into an adult-like configuration by 24 months of age in both language environments. The proportion of utterances with short-term periodicity increased with age in both environments, and this trend was clearer in the Japanese environment.
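The abstract describes a three-step pipeline: filter the speech through a critical-band filter bank, factor-analyze the correlations between the band power fluctuations, and assess the temporal periodicity of the factor scores via autocorrelation. The Python sketch below illustrates that pipeline under assumed parameters (band edges, 10-ms frames, three factors); it is not the authors' implementation.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt
from sklearn.decomposition import FactorAnalysis

def band_power_envelopes(signal, fs, band_edges, frame_len=0.01):
    """Critical-band-like filter bank: band-pass each band and take
    frame-wise RMS power (assumed frame length: 10 ms)."""
    hop = int(frame_len * fs)
    n_frames = len(signal) // hop
    envelopes = []
    for lo, hi in band_edges:
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, signal)
        frames = band[: n_frames * hop].reshape(n_frames, hop)
        envelopes.append(np.sqrt(np.mean(frames ** 2, axis=1)))
    return np.array(envelopes).T          # shape: (frames, bands)

def factor_scores(envelopes, n_factors=3):
    """Factor analysis of the correlations between band power fluctuations."""
    z = (envelopes - envelopes.mean(0)) / envelopes.std(0)
    fa = FactorAnalysis(n_components=n_factors)
    return fa.fit_transform(z), fa.components_

def autocorr(x):
    """Normalized autocorrelation of one factor-score time series."""
    x = x - x.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    return ac / ac[0]

# Hypothetical band edges (Hz) covering part of the speech range:
# band_edges = [(100, 200), (200, 400), (400, 800), (800, 1600), (1600, 3200)]
# env = band_power_envelopes(speech, fs, band_edges)
# scores, loadings = factor_scores(env)
# periodicity = autocorr(scores[:, 0])
```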
annual acis international conference on computer and information science | 2016
Rinko Komiya; Takeshi Saitoh; Miharu Fuyuno; Yuko Yamashita; Yoshitaka Nakajima
Public speaking is an essential skill in a large variety of professions and in everyday life. However, it is difficult to master. This paper focuses on the automatic assessment of nonverbal facial behavior during public speaking and proposes a simple and efficient method of head pose estimation and motion analysis. We collected nine speech scenes from a recitation contest at a Japanese high school and applied the proposed method to evaluate the contestants’ performance. The head pose estimation achieved acceptable accuracy on the speech scenes, and the proposed motion analysis method computes the frequencies and moving ranges of head motion. As a result, we found a correlation between the moving range and the eye contact score.
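The paper describes head pose estimation from facial behavior only at a high level here. One common way to recover head pose from detected facial feature points is to solve a Perspective-n-Point problem against a generic 3D face model; the sketch below uses OpenCV's solvePnP with a hypothetical six-point landmark set and assumed camera parameters, and is not necessarily the authors' method.

```python
import numpy as np
import cv2

# Generic 3D reference points of a face model (nose tip, chin, eye corners,
# mouth corners) in arbitrary model units -- an assumed simplification.
MODEL_POINTS = np.array([
    (0.0,    0.0,    0.0),     # nose tip
    (0.0,  -330.0,  -65.0),    # chin
    (-225.0, 170.0, -135.0),   # left eye outer corner
    (225.0,  170.0, -135.0),   # right eye outer corner
    (-150.0, -150.0, -125.0),  # left mouth corner
    (150.0,  -150.0, -125.0),  # right mouth corner
], dtype=np.float64)

def head_pose(image_points, frame_size):
    """Estimate head rotation (pitch, yaw, roll in degrees) from six 2D
    facial feature points, assuming a pinhole camera with no distortion."""
    h, w = frame_size
    focal = w                                   # rough focal-length guess
    camera = np.array([[focal, 0, w / 2],
                       [0, focal, h / 2],
                       [0, 0, 1]], dtype=np.float64)
    dist = np.zeros(4)                          # assume no lens distortion
    ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS, image_points, camera, dist)
    rot, _ = cv2.Rodrigues(rvec)
    pitch = np.degrees(np.arctan2(rot[2, 1], rot[2, 2]))
    yaw   = np.degrees(np.arcsin(-rot[2, 0]))
    roll  = np.degrees(np.arctan2(rot[1, 0], rot[0, 0]))
    return pitch, yaw, roll
```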
International Journal of Software Innovation (IJSI) | 2017
Rinko Komiya; Takeshi Saitoh; Miharu Fuyuno; Yuko Yamashita; Yoshitaka Nakajima
Public speaking is an essential skill in a large variety of professions and also in everyday life. However, it can be difficult to master. This paper focuses on the automatic assessment of nonverbal facial behavior during public speaking and proposes simple and efficient methods of head pose estimation and motion analysis. The authors collected nine and six speech videos from a recitation and oration contest, respectively, conducted at a Japanese high school and applied the proposed method to evaluate the contestants’ performance. For the estimation of head pose from speech videos, their method produced results with an acceptable level of accuracy. The proposed motion analysis method can be used for calculating frequencies and moving ranges of head motion. The authors found that the proposed parameters and the eye-contact score are strongly correlated and that the proposed frequency and moving range parameters are suitable for evaluating public speaking. Thus, on the basis of these features, a teacher can provide accurate feedback to help a speaker improve.
Keywords: English Oration Contest, English Recitation Contest, Facial Feature Point, Head Pose Estimation, Image Processing, Motion Analysis, Speech Video
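The frequency and moving-range parameters described above can be summarized from a per-frame head-angle series, and their relation to eye-contact scores checked with a Pearson correlation. The sketch below assumes per-frame yaw angles and per-speaker eye-contact scores, and defines the parameters illustratively (dominant FFT frequency, peak-to-peak range) rather than exactly as in the paper.

```python
import numpy as np
from scipy.stats import pearsonr

def motion_features(angles, fps):
    """Summarize one head-angle time series (degrees per frame).

    Frequency: dominant rate of angle change from the FFT of the
    mean-removed series (Hz). Moving range: max minus min angle (degrees).
    """
    x = np.asarray(angles, dtype=float)
    x = x - x.mean()
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    dominant = freqs[1:][np.argmax(spectrum[1:])]   # skip the DC bin
    moving_range = np.ptp(angles)
    return dominant, moving_range

# Hypothetical per-speaker data: one yaw series per speaker plus the
# eye-contact score assigned by human raters.
# ranges = [motion_features(yaw, fps=30)[1] for yaw in yaw_series_per_speaker]
# r, p = pearsonr(ranges, eye_contact_scores)
# print(f"moving range vs. eye contact: r={r:.2f}, p={p:.3f}")
```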
Learner Corpus Studies in Asia and the World | 2014
Miharu Fuyuno; Yuko Yamashita; Yoshikiyo Kawase; Yoshitaka Nakajima
Acoustical Science and Technology | 2018
Yoshitaka Nakajima; Kazuo Ueda; Gerard B. Remijn; Yuko Yamashita; Takuya Kishida
言語科学 [Language Science] | 2016
Yuko Yamashita
基幹教育紀要 [Bulletin of KIKAN Education] | 2016
Yuko Yamashita; Miharu Fuyuno
International Journal of Psychology | 2016
Miharu Fuyuno; Yuko Yamashita; Yuki Yamada; Yoshitaka Nakajima
CILC2016: 8th International Conference on Corpus Linguistics, pp. 447-461 | 2016
Miharu Fuyuno; Yuko Yamashita; Takeshi Saitoh; Yoshitaka Nakajima
Proceedings of the Auditory Research Meeting (聴覚研究会資料) | 2014
Yuko Yamashita; Miharu Fuyuno; Yoshitaka Nakajima