Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Takeshi Ninomiya is active.

Publication


Featured research published by Takeshi Ninomiya.


Intelligent Robots and Systems | 2007

The Waseda Flutist Robot No. 4 Refined IV: Enhancing the sound clarity and the articulation between notes by improving the design of the lips and tonguing mechanisms

Jorge Solis; Koichi Taniguchi; Takeshi Ninomiya; Tetsuro Yamamoto; Atsuo Takanishi

As a result of our research, the Waseda Flutist Robot is able to play the flute at a level comparable to an intermediate flutist. To enhance the expressiveness of its performance, we are focusing on improving the mechanical design of the simulated organs as well as implementing automated algorithms for generating expressive music performance. In previous research, we developed a human-like vocal cord to improve the production of vibrato. In this paper, we present the newest version of the flutist robot, in which the lips, oral cavity and tonguing mechanisms were improved. These improvements were proposed to effectively control the attack time and double tonguing. A short attack time is useful for producing clear sounds, and double tonguing is an important articulation that helps players produce shaped notes and smooth transitions between notes. The lip mechanism has 3 DOF, which enables accurate control of the air-stream parameters (width, thickness and angle). The lips were made more human-like by using a thermoplastic rubber (Septon), and the oral cavity was designed to resemble the human one, also made of Septon. Inside the oral cavity, an improved tonguing mechanism (1 DOF) was designed to better reproduce the double-tonguing technique. A set of experiments was performed to analyze the improvements in the dynamical properties of the sound while playing the flute.
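
The abstract describes a 3-DOF lip mechanism that sets the width, thickness and angle of the air jet. As a minimal sketch of how such a target might be represented in software (the data class, units and one-to-one mapping are assumptions for illustration, not the robot's actual control code):

```python
from dataclasses import dataclass

# Hypothetical air-stream target; the units and ranges are illustrative.
@dataclass
class AirStream:
    width_mm: float      # width of the air jet
    thickness_mm: float  # thickness of the air jet
    angle_deg: float     # angle of the jet against the embouchure hole

def lip_commands(target: AirStream) -> dict:
    """Map an air-stream target to the three lip DOF.

    A one-to-one mapping is assumed purely for illustration; a real controller
    would use a calibrated kinematic model of the Septon lips.
    """
    return {
        "lip_width_dof": target.width_mm,
        "lip_thickness_dof": target.thickness_mm,
        "lip_angle_dof": target.angle_deg,
    }

if __name__ == "__main__":
    print(lip_commands(AirStream(width_mm=8.0, thickness_mm=1.2, angle_deg=35.0)))
```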


International Conference on Robotics and Automation | 2008

Development of Waseda flutist robot WF-4RIV: Implementation of auditory feedback system

Jorge Solis; Koichi Taniguchi; Takeshi Ninomiya; Tetsuro Yamamoto; Atsuo Takanishi

Up to now, different kinds of musical performance robots (MPRs) and robotic musicians (RMs) have been developed. MPRs are designed to closely reproduce the motor skills displayed by humans when playing musical instruments; from this approach, MPRs serve as benchmarks for studying human motor control from an engineering point of view and for better understanding human-robot interaction from a musical point of view. In contrast, RMs are conceived as automated mechanisms designed to create new forms of musical expression from a musical-engineering point of view. Our research at Waseda University has focused on developing an anthropomorphic flutist robot, aiming to study human motor control from an engineering point of view, to understand how to facilitate human-robot interaction, and to propose new applications for humanoid robots in musical terms. As a result, the Waseda Flutist Robot No. 4 Refined IV (WF-4RIV) has been developed. In previous research, we focused on improving the mechanical system to enhance the clarity of the sound. However, further improvements to the control system are required to enable the robot to autonomously improve the quality of the sound during a flute performance. For this purpose, we proposed implementing an auditory feedback control system on the flutist robot. The proposed system is composed of an expressive music generator, a feed-forward air-pressure control system and a pitch evaluation module. The experimental results with the WF-4RIV confirmed the improvements in the flute performance.
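
The system combines an expressive music generator, feed-forward air-pressure control and a pitch evaluation module. The sketch below shows only the generic idea of closing the loop from measured pitch back to blowing pressure; the autocorrelation pitch estimator, the gain and the pressure units are assumptions, not the paper's implementation.

```python
import numpy as np

def estimate_pitch(frame: np.ndarray, sample_rate: int) -> float:
    """Rough pitch estimate via autocorrelation (illustrative, not the paper's method)."""
    frame = frame - frame.mean()
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    min_lag = sample_rate // 2000   # upper pitch bound ~2 kHz
    max_lag = sample_rate // 100    # lower pitch bound ~100 Hz
    lag = min_lag + np.argmax(corr[min_lag:max_lag])
    return sample_rate / lag

def feedback_step(target_hz: float, measured_hz: float,
                  pressure_kpa: float, gain: float = 0.002) -> float:
    """One auditory-feedback update: nudge the blowing pressure toward the value
    that brings the measured pitch to the target (gain is an assumed constant)."""
    error = target_hz - measured_hz
    return pressure_kpa + gain * error
```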


Advanced Robotics | 2009

Implementation of an Auditory Feedback Control System on an Anthropomorphic Flutist Robot Inspired on the Performance of a Professional Flutist

Jorge Solis; Koichi Taniguchi; Takeshi Ninomiya; Klaus Petersen; Tetsuro Yamamoto; Atsuo Takanishi

Up to now, different kinds of musical performance robots (MPRs) have been developed. MPRs are designed to closely reproduce the motor skills humans require to play musical instruments. Our research at Waseda University has focused on developing an anthropomorphic flutist robot. As a result, the Waseda Flutist Robot No. 4 Refined IV (WF-4RIV) has been designed to mechanically reproduce the human organs involved in flute playing. Although the WF-4RIV can play the flute at a level close to that of an intermediate player, further improvements in its cognitive capabilities are still required (i.e., autonomously improving the quality of the sound during a performance based on sound processing). For this purpose, in this paper we present the implementation of an Auditory Feedback Control System (AFCS) designed to enable the flutist robot to analyze the flute sound during a performance (in a similar way to how professional flutists practice before a concert). The AFCS is composed of three subsystems, each detailed in this paper: the WF-4RIV's control system, an expressive music generator and a pitch evaluation system. A set of experiments was proposed to verify the effectiveness of the AFCS in controlling the air pressure and in detecting and correcting faulty notes during a performance. The experimental results confirm the improvements in the flute sound produced by the robot.
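
One task of the pitch evaluation system is detecting faulty notes during a performance. A minimal sketch of that idea is shown below: flag notes whose measured pitch deviates from the score by more than a tolerance. The MIDI-to-Hz conversion and the 50-cent threshold are assumptions chosen for illustration, not values from the paper.

```python
from math import log2

def midi_to_hz(note: int) -> float:
    return 440.0 * 2.0 ** ((note - 69) / 12.0)

def cents(measured_hz: float, reference_hz: float) -> float:
    return 1200.0 * log2(measured_hz / reference_hz)

def find_faulty_notes(score_midi, measured_hz, tol_cents: float = 50.0):
    """Return indices of notes whose pitch error exceeds the tolerance."""
    faulty = []
    for i, (note, hz) in enumerate(zip(score_midi, measured_hz)):
        if abs(cents(hz, midi_to_hz(note))) > tol_cents:
            faulty.append(i)
    return faulty

# Example: the third note is nearly a semitone flat and gets flagged.
print(find_faulty_notes([60, 62, 64], [261.6, 293.7, 311.1]))  # -> [2]
```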


Advanced Robotics | 2010

Development of the Anthropomorphic Saxophonist Robot WAS-1: Mechanical Design of the Simulated Organs and Implementation of Air Pressure Feedback Control

Jorge Solis; Takeshi Ninomiya; Klaus Petersen; Masaki Takeuchi; Atsuo Takanishi

The research on the Waseda Flutist Robot, ongoing since 1990, is an approach to understanding human motor control from an engineering point of view, as well as to introducing novel ways of teaching music. More recently, the authors have proposed, as a long-term goal, enabling musical performance robots to interact with musical partners. For this purpose, we present two research approaches: implementing more advanced cognitive capabilities on the Waseda Flutist Robot No. 4 Refined IV (WF-4RIV) (i.e., visual/aural processing) and developing a new musical performance robot (i.e., for duet performance). In this paper, we focus on developing an anthropomorphic saxophonist robot as a benchmark for better understanding how interaction with musical partners can be facilitated. As a result, we have developed the Waseda Saxophonist Robot No. 1 (WAS-1), with 15 d.o.f., which mechanically simulates the organs involved in saxophone playing. We present the details of the mechanical design of the simulated organs, as well as the implementation of a musical performance control system based on air-pressure feedback control. A set of experiments was proposed to verify the effectiveness of the simulated organs in producing the saxophone sound and the effectiveness of the proposed air-pressure feedback control, by comparing the saxophone performance of WAS-1 against that of an intermediate saxophonist. Finally, a preliminary experiment was carried out to analyze the feasibility of realizing a duet performance with the WF-4RIV. The experimental results confirm that WAS-1 can produce a saxophone sound close to a human player's performance in terms of pitch and volume.
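
The abstract names air-pressure feedback control but does not give the controller form; a discrete PI loop is a common choice and is assumed in the sketch below. The gains, pressure units and actuator interface are illustrative only.

```python
class PressureController:
    """Minimal discrete PI loop driving a blowing-pressure actuator toward a
    target pressure. Gains and the actuator interface are assumptions; the
    paper only states that air-pressure feedback control is used."""

    def __init__(self, kp: float = 0.8, ki: float = 0.2, dt: float = 0.01):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, target_kpa: float, measured_kpa: float) -> float:
        error = target_kpa - measured_kpa
        self.integral += error * self.dt
        # Command sent to the air-pump / valve actuator (normalized units).
        return self.kp * error + self.ki * self.integral

ctrl = PressureController()
command = ctrl.update(target_kpa=4.0, measured_kpa=3.6)
```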


Intelligent Robots and Systems | 2009

Development of anthropomorphic musical performance robots: From understanding the nature of music performance to its application to entertainment robotics

Jorge Solis; Klaus Petersen; Takeshi Ninomiya; Masaki Takeuchi; Atsuo Takanishi

For several decades, research at Waseda University has focused on developing anthropomorphic robots able to play musical instruments. More recently, the authors have succeeded in developing a human-like robot able to play the alto saxophone. As a result of these research efforts, the Waseda Flutist Robot WF-4RIV and the Waseda Saxophonist Robot WAS-1 have been designed to reproduce the performance of human players. As a long-term goal, we propose to enable interaction between musical performance robots (i.e., a robot orchestra). Such an approach may allow us not only to propose new forms of musical expression, but also to contribute to a better understanding of some of the mechanisms that enable humans to communicate in musical terms. In general terms, communication among humans within an orchestra is a special case of conventional human social behavior: the rhythm, harmony and timbre of the music played reflect the emotional states of the musicians. Of course, we do not consider a musical performance robot (MPR) merely a sophisticated MIDI instrument; rather, its human-like design and the integration of perceptual capabilities may enable it to act on its own autonomous initiative, based on models that consider its own physical constraints. Due to the complexity of this long-term goal, in this paper we present our first steps towards enabling interaction between musical performance robots. In particular, the musical performance control systems are detailed. Using MIDI data, we performed preliminary experiments to enable a duet performance between the WF-4RIV and the WAS-1. We expect that, as a result of this research, a single anthropomorphic robot may eventually perform different kinds of woodwind instruments and interact with others at the level of perceptual capabilities (as humans do).
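
The duet experiments are driven by MIDI data. A minimal sketch of that coordination idea is below, routing note messages on MIDI channel 0 to the flutist robot and channel 1 to the saxophonist. The use of the `mido` library and the robot send functions are assumptions for illustration; the paper only states that MIDI data was used.

```python
import mido

def play_duet(path: str, send_to_flutist, send_to_saxophonist):
    """Stream a MIDI file in real time and split the notes between two robots."""
    mid = mido.MidiFile(path)
    # mid.play() yields messages in playback time, merged across tracks.
    for msg in mid.play():
        if msg.type not in ("note_on", "note_off"):
            continue
        if msg.channel == 0:
            send_to_flutist(msg)       # e.g., WF-4RIV fingering + breath command
        elif msg.channel == 1:
            send_to_saxophonist(msg)   # e.g., WAS-1 fingering + air-pump command
```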


International Conference on Robotics and Automation | 2007

Implementation of Expressive Performance Rules on the WF-4RIII by modeling a professional flutist performance using NN

Jorge Solis; Kei Suefuji; Koichi Taniguchi; Takeshi Ninomiya; Maki Maeda; Atsuo Takanishi

In this paper, we detail a methodology for automatically generating an expressive performance on the anthropomorphic flutist robot. A feed-forward neural network trained with the error back-propagation algorithm was implemented to model the expressiveness of a professional flutist's performance. In particular, note duration and vibrato were considered as performance rules (sources of variation) to enhance the expressiveness of the robot's performance. From the mechanical point of view, the vibrato and lung systems were redesigned to effectively control the proposed music performance rules. An experimental setup was proposed to verify the effectiveness of generating a new expressive score from a model built on the performance of a professional flutist. As a result, the flutist robot was able to automatically produce an expressive performance similar to the human one from a nominal score.
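
A minimal sketch of the kind of model described follows: a feed-forward network trained with back-propagation that maps local score features to expressive deviations (note-duration scaling and vibrato depth). The input features, network size, PyTorch usage and training setup are assumptions; the paper only states that a feed-forward network was trained with back-propagation on the duration and vibrato rules.

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(4, 16),   # assumed inputs: pitch, nominal duration, metrical position, phrase position
    nn.Tanh(),
    nn.Linear(16, 2),   # outputs: duration scaling factor, vibrato depth
)

def train(features: torch.Tensor, targets: torch.Tensor, epochs: int = 200):
    """features: (N, 4) score descriptors; targets: (N, 2) deviations measured
    from a professional flutist's recording (hypothetical data layout)."""
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(features), targets)
        loss.backward()   # error back-propagation
        opt.step()
    return model
```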


Robot and Human Interactive Communication | 2007

Towards an expressive performance of the Waseda Flutist Robot: Production of Vibrato

Jorge Solis; Koichi Taniguchi; Takeshi Ninomiya; Atsuo Takanishi

The research on the anthropomorphic flutist robot at Waseda University has focused on clarifying human motor control while playing the flute, proposing novel applications for humanoid robots and enabling communication with humans at the emotional level. As a result, the flutist robot can nearly reproduce the basic technical skills required to play the flute, and some of the extended technical skills have been roughly simulated. However, further improvements are still required to enhance the expressiveness of the robot's performance. In this paper, we focus on better understanding how to enhance the expressiveness of the flute performance by studying in more detail the vibrato, which is believed to be one of the most important means of expressing feelings and ideas while playing the flute. For this reason, the newest version of the flutist robot, the Waseda Flutist Robot No. 4 Refined III, has been developed by improving the design of the lung and designing a human-like vocal cord mechanism, both of which are believed to be closely related to the production of vibrato. The details of the new mechanisms are given, and experiments were conducted to understand the effect of the movement of the diaphragm and glottis while producing vibrato.
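
As a simple illustration of the quantity being shaped by the lung and vocal-cord mechanisms, vibrato can be approximated as a periodic modulation of the blowing pressure. The sinusoidal form, base pressure, rate and depth below are assumptions used only to make the idea concrete, not measurements from the paper.

```python
import numpy as np

def vibrato_pressure(t: np.ndarray, base_kpa: float = 4.0,
                     depth: float = 0.08, rate_hz: float = 5.0) -> np.ndarray:
    """Blowing pressure over time t (seconds) with sinusoidal vibrato."""
    return base_kpa * (1.0 + depth * np.sin(2.0 * np.pi * rate_hz * t))

t = np.linspace(0.0, 2.0, 2000)
p = vibrato_pressure(t)   # pressure profile for a 2-second note
```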


International Conference on Robotics and Automation | 2009

Development of the anthropomorphic saxophonist robot WAS-1: Mechanical design of the lip, tonguing, fingers and air pump mechanisms

Klaus Petersen; Jorge Solis; Takeshi Ninomiya; Tetsuro Yamamoto; Masaki Takeuchi; Atsuo Takanishi

Research on the development of musical performance robots has intensified considerably in recent decades. In fact, the development of anthropomorphic robots able to play musical instruments has served as a means of understanding human motor control from an engineering point of view, as well as of understanding how to enable communication between humans and robots from an emotional point of view. In particular, our research aims at developing an anthropomorphic saxophonist robot able not only to perform a musical score, but also to interact with other musical performance robots (i.e., the Waseda Flutist Robot) at an emotional level of perception. This year, we focused on the mechanical design of an anthropomorphic robot, the Waseda Saxophonist Robot No. 1 (WAS-1), designed to play an alto saxophone. WAS-1 has a total of 15 DOF that mechanically reproduce the following organs involved in saxophone playing: lips (1 DOF), tongue (1 DOF), oral cavity, lungs (air pump: 1 DOF; air valve: 1 DOF) and fingers (11 DOF). To verify the effectiveness of the sound production, a set of experiments was conducted. In particular, the air flow-pressure characteristics, the level of mechanical noise and the ripple-effect ratio are presented. Finally, a qualitative evaluation of the sound produced by WAS-1 is presented and discussed. The experimental results confirm the effectiveness of the proposed mechanisms in producing the saxophone sound.
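
For reference, the degree-of-freedom allocation stated in the abstract can be summarized as a small configuration table; the dictionary layout is just an illustrative way to organize the published numbers.

```python
# WAS-1 DOF breakdown as reported in the abstract.
WAS1_DOF = {
    "lips": 1,
    "tongue": 1,
    "air_pump": 1,   # part of the lung system
    "air_valve": 1,  # part of the lung system
    "fingers": 11,
}

assert sum(WAS1_DOF.values()) == 15  # total DOF reported for WAS-1
```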


IEEE International Conference on Biomedical Robotics and Biomechatronics | 2008

The Waseda Flutist Robot No.4 Refined IV: From a musical partner to a musical teaching tool

Jorge Solis; Koichi Taniguchi; Takeshi Ninomiya; Klaus Petersen; Tetsuro Yamamoto; Atsuo Takanishi

Up to now, different kinds of musical performance robots (MPRs) have been developed. MPRs are designed to closely reproduce the human organs involved in playing musical instruments. Our research on the Waseda Flutist Robot has focused on clarifying human motor control from an engineering point of view. As a result, the Waseda Flutist Robot No. 4 Refined IV (WF-4RIV) is able to play the flute at a level close to that of an intermediate player. Thanks to its human-like design and the advanced technical skills displayed by the WF-4RIV, novel forms of musical education can be conceived. In this paper, the General Transfer Skill System (GTSS) is implemented on the flutist robot, towards enabling the automated transfer of technical skills from the robot to beginner flutists. A set of experiments is carried out to verify the evaluation and interaction modules of the GTSS. The experimental results show that the robot can quantitatively evaluate the performance of beginners and automatically recognize the melodies they perform.
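
A minimal sketch of the two capabilities mentioned in the experiments follows: scoring a beginner's note sequence against a reference and recognizing which stored melody was played. Edit distance over pitch sequences is an assumption chosen for illustration; the paper does not specify the GTSS evaluation metric.

```python
def edit_distance(a, b):
    """Levenshtein distance between two note sequences (MIDI numbers)."""
    dp = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, y in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (x != y))
    return dp[-1]

def evaluate(played, reference):
    """Score in [0, 1]: 1.0 means the notes match the reference exactly."""
    return 1.0 - edit_distance(played, reference) / max(len(reference), 1)

def recognize(played, melodies: dict):
    """Return the name of the stored melody closest to the played sequence."""
    return min(melodies, key=lambda name: edit_distance(played, melodies[name]))

melodies = {"tune_a": [60, 62, 64, 65], "tune_b": [67, 65, 64, 62]}
print(evaluate([60, 62, 63, 65], melodies["tune_a"]))   # one wrong note -> 0.75
print(recognize([60, 62, 63, 65], melodies))            # -> "tune_a"
```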


IEEE International Conference on Biomedical Robotics and Biomechatronics | 2008

Development of the Waseda Flutist Robot No. 4 Refined IV: Implementation of a real-time interaction system with human partners

Klaus Petersen; Jorge Solis; Koichi Taniguchi; Takeshi Ninomiya; Tetsuro Yamamoto; Atsuo Takanishi

The aim of our research is to develop an anthropomorphic flutist robot that, on the one hand, reproduces the human motor skills required for playing the flute and, on the other hand, displays cognitive capabilities for interacting with other (human) musicians. In this paper, we detail the recent mechanical improvements of the Waseda Flutist Robot (WF-4RIV) that enhance the realistic production of the flute sound. In particular, improved lip, oral-cavity and tonguing mechanisms are introduced and described: the ability to deform the lip shape with 3 DOF allows us to accurately control the characteristics of the air stream (width, thickness and angle), and an improved tonguing mechanism (1 DOF) has been designed to reproduce double tonguing. Furthermore, we present the implementation of a real-time interaction system with human partners. As a first approach, we developed a vision processing algorithm to track the 3D orientation and position of a musical instrument: image data is recorded using two cameras attached to the head of the robot and processed in real time. The proposed algorithm is based on color histogram matching and particle filter techniques to follow the position of a musician's hands on an instrument. Data analysis enables us to determine the orientation and location of the instrument, and we map these parameters to musical performance parameters of the WF-4RIV, such as sound vibrato and sound volume. A set of experiments was proposed to verify the effectiveness of the proposed tracking system during interaction with a human player. We conclude that the quality of the musical performance of the WF-4RIV and its capability to interact with musical partners have been significantly improved by the implementation of the techniques proposed in this paper.
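
The tracking pipeline combines color histogram matching with a particle filter. A compact sketch of that general approach is below, using OpenCV histogram back-projection and a basic NumPy particle filter; the histogram bins, motion noise and HSV model are assumptions, not the paper's parameters.

```python
import cv2
import numpy as np

def hue_histogram(bgr_patch: np.ndarray) -> np.ndarray:
    """Hue histogram of a reference patch (e.g., the player's hand on the instrument)."""
    hsv = cv2.cvtColor(bgr_patch, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], None, [32], [0, 180])
    return cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

def track_step(frame_bgr, particles, hist, motion_std=8.0):
    """One particle-filter update on a single camera frame.

    particles: (N, 2) array of (x, y) hypotheses for the tracked hand.
    Weights come from the color back-projection at each particle position.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)

    # Predict: random-walk motion model, clipped to the image bounds.
    particles = particles + np.random.normal(0.0, motion_std, particles.shape)
    h, w = backproj.shape
    particles[:, 0] = np.clip(particles[:, 0], 0, w - 1)
    particles[:, 1] = np.clip(particles[:, 1], 0, h - 1)

    # Weight by color similarity and resample.
    weights = backproj[particles[:, 1].astype(int), particles[:, 0].astype(int)] + 1e-6
    weights = weights / weights.sum()
    idx = np.random.choice(len(particles), size=len(particles), p=weights)
    particles = particles[idx]

    estimate = particles.mean(axis=0)   # tracked hand position in this frame
    return particles, estimate
```

With two calibrated cameras, per-camera estimates of this kind could then be triangulated to recover the 3D position that drives the vibrato and volume mapping, though the paper does not publish that step in detail.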

Collaboration


Dive into Takeshi Ninomiya's collaborations.
