
Publication


Featured research published by Ikuo Matsuo.


Journal of the Acoustical Society of America | 2001

A model of echolocation of multiple targets in 3D space from a single emission

Ikuo Matsuo; Junji Tani; Masafumi Yano

Bats, using frequency-modulated echolocation sounds, can capture a moving target in real 3D space. The process by which they are able to accomplish this, however, is not completely understood. This work proposes and analyzes a model of one mechanism that may play a role in the echolocation process of real bats. This mechanism allows for the localization of targets in 3D space from the echoes produced by a single emission. It is impossible to locate multiple targets in 3D space by using only the delay time between an emission and the resulting echoes received at two points (i.e., two ears). To locate multiple targets in 3D space requires directional information for each target. The frequency of the spectral notch, which is the frequency corresponding to the minimum of the external ear's transfer function, provides a crucial cue for directional localization. The spectrum of the echoes from nearly equidistant targets includes spectral components of both the interference between the echoes and the interference resulting from the physical process of reception at the external ear. Thus, in order to extract the spectral component associated with the external ear, this component must first be distinguished from the spectral components associated with the interference of echoes from nearly equidistant targets. In the model presented, a computation that consists of the deconvolution of the spectrum is used to extract the external-ear-dependent component in the time domain. This model describes one mechanism that can be used to locate multiple targets in 3D space.
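The deconvolution idea can be illustrated generically with a cepstrum: multiplicative interference ripple in the echo spectrum becomes an additive peak at the corresponding delay separation in the time (quefrency) domain. The following is a minimal sketch of that generic technique, not the paper's implementation; the toy echo and sample positions are invented for illustration.

```python
import numpy as np

def real_cepstrum(signal):
    """IFFT of the log magnitude spectrum: periodic ripple in the
    spectrum (interference of echoes from nearly equidistant targets)
    shows up as a peak at the quefrency equal to their delay separation."""
    spectrum = np.abs(np.fft.rfft(signal))
    return np.fft.irfft(np.log(spectrum + 1e-12))

# Toy echo: two reflections separated by 32 samples.
echo = np.zeros(256)
echo[10] = 1.0
echo[42] = 0.8
cep = real_cepstrum(echo)
# The strongest peak (ignoring quefrency 0) sits at the delay separation.
separation = int(np.argmax(cep[1:len(cep) // 2])) + 1
```

In a full model, the component at the echo-interference quefrency would be separated out, leaving the slowly varying external-ear-dependent component.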


Journal of the Acoustical Society of America | 2004

An echolocation model for the restoration of an acoustic image from a single-emission echo

Ikuo Matsuo; Masafumi Yano

Bats can form a fine acoustic image of an object using frequency-modulated echolocation sound. The acoustic image is an impulse response, known as a reflected-intensity distribution, which is composed of amplitude and phase spectra over a range of frequencies. However, bats detect only the amplitude spectrum due to the low time resolution of their peripheral auditory system, and the frequency range of emission is restricted. It is therefore necessary to restore the acoustic image from limited information. The amplitude spectrum varies with changes in the configuration of the reflected-intensity distribution, while the phase spectrum varies with changes in both its configuration and location. Here, by introducing some reasonable constraints, a method is proposed for restoring an acoustic image from the echo. The configuration is extrapolated from the amplitude spectrum of the restricted frequency range by using the continuity condition of the amplitude spectrum at the minimum frequency of the emission and the minimum phase condition. Determining the location requires extracting the amplitude spectra that vary with location. For this purpose, Gaussian chirplets with a carrier frequency compatible with bat emission sweep rates were used. The location is estimated from the temporal changes of the amplitude spectra.
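The minimum phase condition invoked above has a standard computational form: for a minimum-phase signal, the phase spectrum is fully determined by the log amplitude spectrum, and the signal can be reconstructed by the homomorphic (cepstral folding) method. This is a sketch of that generic construction, not the paper's restoration procedure; the test signal is invented.

```python
import numpy as np

def minimum_phase_spectrum(amplitude):
    """Given a full (two-sided) amplitude spectrum, reconstruct the
    complex spectrum of the minimum-phase signal with that amplitude,
    via the standard homomorphic (cepstral folding) construction."""
    n = len(amplitude)
    cep = np.fft.ifft(np.log(amplitude + 1e-12))
    fold = np.zeros(n)
    fold[0] = 1.0
    fold[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        fold[n // 2] = 1.0
    return np.exp(np.fft.fft(cep * fold))

# Demo: h = [1, 0.5, 0, ...] is minimum phase (its zero lies inside the
# unit circle), so it is recoverable from its amplitude spectrum alone.
h = np.zeros(64)
h[0], h[1] = 1.0, 0.5
restored = np.fft.ifft(minimum_phase_spectrum(np.abs(np.fft.fft(h)))).real
```

The same principle lets an impulse response be restored from an amplitude spectrum when the minimum phase assumption holds.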


Frontiers in Behavioral Neuroscience | 2016

Echolocating Big Brown Bats, Eptesicus fuscus, Modulate Pulse Intervals to Overcome Range Ambiguity in Cluttered Surroundings.

Alyssa Wheeler; Kara A. Fulton; Jason E. Gaudette; Ryan Simmons; Ikuo Matsuo; James A. Simmons

Big brown bats (Eptesicus fuscus) emit trains of brief, wideband frequency-modulated (FM) echolocation sounds and use echoes of these sounds to orient, find insects, and guide flight through vegetation. They are observed to emit sounds that alternate between short and long inter-pulse intervals (IPIs), forming sonar sound groups. The occurrence of these strobe groups has been linked to flight in cluttered acoustic environments, but how exactly bats use sonar sound groups to orient and navigate is still a mystery. Here, the production of sound groups during clutter navigation was examined. Controlled flight experiments were conducted in which the proximity of the nearest obstacles was systematically decreased while the extended scene was kept constant. Four bats flew along a corridor of varying widths (100, 70, and 40 cm) bounded by rows of vertically hanging plastic chains while in-flight echolocation calls were recorded. Bats shortened their IPIs for more rapid spatial sampling and also grouped their sounds more tightly when flying in narrower corridors. Bats emitted echolocation calls with progressively shorter IPIs over the course of a flight, and began their flights by emitting shorter starting IPI calls when clutter was denser. The percentage of sound groups containing 3 or more calls increased with increasing clutter proximity. Moreover, IPI sequences with internal structure became more pronounced as corridor width narrowed. A novel metric for analyzing the temporal organization of sound sequences was developed, and the results indicate that the time interval between echolocation calls depends heavily on the preceding time interval. The occurrence of specific IPI patterns was dependent upon clutter, which suggests that sonar sound grouping may be an adaptive strategy for coping with pulse-echo ambiguity in cluttered surroundings.
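The finding that each interval depends heavily on the preceding one can be quantified with a simple lag-1 statistic on the IPI sequence. A minimal sketch of that idea (this is not the paper's novel metric, and the call times below are fabricated for illustration):

```python
import numpy as np

def ipi_lag1_correlation(call_times_ms):
    """Correlation between consecutive inter-pulse intervals.
    Values far from zero mean each IPI depends on the one before it."""
    ipi = np.diff(np.asarray(call_times_ms, dtype=float))
    return float(np.corrcoef(ipi[:-1], ipi[1:])[0, 1])

# Fabricated example: strict short/long alternation (sonar sound groups
# of two calls) yields a strongly negative lag-1 correlation.
ipis = [10.0, 50.0] * 10
call_times = np.concatenate([[0.0], np.cumsum(ipis)])
r = ipi_lag1_correlation(call_times)
```

A sequence of independent, randomly drawn IPIs would instead give a correlation near zero.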


Journal of the Acoustical Society of America | 2009

Analysis of the temporal structure of fish echoes using the dolphin broadband sonar signal

Ikuo Matsuo; Tomohito Imaizumi; Tomonari Akamatsu; Masahiko Furusawa; Yasushi Nishimori

Behavioral experiments indicate that dolphins detect and discriminate prey targets by echolocating with broadband sonar signals. The fish echo contains components from multiple reflections, including those from the swim bladder and other organs, and can be used for identifying fish species and estimating fish abundance. In this paper, temporal structures were extracted from fish echoes using the cross-correlation function and a low-pass filter. First, echoes were measured from an anesthetized fish in a water tank. The number of reflectors, their intensities, and the echo duration were shown to depend on the species, individual, and orientation of the fish. In particular, the echo duration provided useful information on fish body height and for species identification. Second, echoes were measured from live fish suspended by nylon monofilament lines in the open sea. It was shown that the echo duration could be estimated regardless of whether or not the fish were moving.
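The extraction pipeline named above, cross-correlation of the echo with the emitted pulse followed by low-pass smoothing of the magnitude, can be sketched generically as follows. The signal parameters are invented for the demo, and a zero-phase FFT low-pass stands in for whatever filter design the paper actually used.

```python
import numpy as np

def echo_envelope(echo, emitted, fs, cutoff_hz=4000.0):
    """Cross-correlate the echo with the emitted broadband pulse
    (matched filtering), then low-pass the magnitude to obtain a
    smooth envelope whose peaks mark individual reflectors."""
    mf = np.correlate(echo, emitted, mode="full")
    env = np.abs(mf)
    # Zero-phase FFT-domain low-pass (a sketch, not a production filter).
    spec = np.fft.rfft(env)
    freqs = np.fft.rfftfreq(len(env), d=1.0 / fs)
    spec[freqs > cutoff_hz] = 0.0
    return np.fft.irfft(spec, n=len(env))

# Invented demo: two reflections, the second weaker and later.
fs = 100_000
emitted = np.sin(2 * np.pi * 20_000 * np.arange(64) / fs)
echo = np.zeros(512)
echo[100:164] += emitted
echo[260:324] += 0.5 * emitted
env = echo_envelope(echo, emitted, fs)
```

The spacing between envelope peaks gives the echo duration that the paper relates to fish body height.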


Journal of the Acoustical Society of America | 2011

Evaluation of the echolocation model for range estimation of multiple closely spaced objects.

Ikuo Matsuo

Experimental evidence indicates that bats can use frequency-modulated echolocation to identify objects with an accuracy of less than 1 μs. However, when modeling this process, it is difficult to estimate the delay times of multiple closely spaced objects by analyzing the echo spectrum, because the sequence of delay separations cannot be determined without information on the temporal changes in the interference patterns of the echoes. To extract the temporal changes, Gaussian chirplets with a carrier frequency compatible with bat emission sweep rates are introduced. The delay time for object 1 (T(1)) is estimated from the echo spectrum around the onset time. The delay time for object 2 (T(2)) is obtained by adding T(1) to the delay separation between objects 1 and 2. Further objects are located in sequence by this procedure. Here, echoes were measured from single and multiple objects at a low signal-to-noise ratio. It was confirmed that the delay time for a single object could be estimated with an accuracy of about 1.3 μs. The range accuracy was less than 6 μs when the frequency bandwidth was less than 10 kHz. The delay time for multiple closely spaced objects could be estimated with a high range resolution by extracting the interference pattern.
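A Gaussian chirplet of the kind referred to here is a linear-FM carrier under a Gaussian envelope. The sketch below only constructs such a chirplet; the parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def gaussian_chirplet(t, t0, sigma, f0, sweep_rate):
    """Linear-FM carrier (instantaneous frequency f0 + sweep_rate * tau)
    under a Gaussian envelope of width sigma, centered at time t0."""
    tau = t - t0
    envelope = np.exp(-0.5 * (tau / sigma) ** 2)
    return envelope * np.cos(2 * np.pi * (f0 + 0.5 * sweep_rate * tau) * tau)

# Illustrative parameters: a 40 kHz carrier sweeping downward under a
# 0.5 ms Gaussian window, sampled at 250 kHz.
fs = 250_000
t = np.arange(0, 0.01, 1.0 / fs)
c = gaussian_chirplet(t, t0=0.005, sigma=0.0005, f0=40_000.0,
                      sweep_rate=-20e6)
```

Correlating echoes against a bank of such chirplets at different center times exposes the temporal change of the interference spectrum that the delay-separation estimate relies on.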


IEEE International Underwater Technology Symposium | 2013

Automated acoustic detection of fin whale calls off Kushiro-Tokachi at the deep sea floor observatory

Ikuo Matsuo; T. Akamatsu; Ryoichi Iwase; Katsuyoshi Kawaguchi

Automatic acoustic detection and tracking are useful methods for understanding the behavior and populations of marine animals. A seismometer and hydrophones were installed on the ocean bottom off Kushiro, Japan. The acoustic data were recorded at a sampling frequency of 100 samples per second by 4 ocean-bottom hydrophones. Fin whale calls, consisting of a frequency downsweep in the range 20-15 Hz with a duration of about 1 second, were present in these data. In this paper, fin whale calls were automatically detected by extracting such downsweep signals from 1457 days of acoustic data recorded between 2009 and 2012. The acoustic data were transformed into spectrograms using the short-time FFT. Fin whale calls were detected by computing, at each time, the correlation between the measured spectrogram and a criterion spectrogram computed from a typical fin whale call. It was demonstrated that call detections increased from October to February. In addition, the movements of fin whales could be estimated from temporal changes in the time differences between calls detected at two hydrophones.
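The detection scheme, correlating the measured spectrogram against a criterion spectrogram of a typical call, can be sketched as follows. The 100 samples/s rate matches the paper; the synthetic downsweep, window sizes, and threshold are invented for the demo.

```python
import numpy as np

def spectrogram(x, fs, nfft=64, hop=16):
    """Magnitude spectrogram (Hann window, short-time FFT).
    Returns an array of shape (freq_bins, time_frames)."""
    win = np.hanning(nfft)
    frames = [np.abs(np.fft.rfft(win * x[i:i + nfft]))
              for i in range(0, len(x) - nfft + 1, hop)]
    return np.array(frames).T

def detect_calls(spec, template, threshold=0.95):
    """Slide the criterion template over the spectrogram; report frame
    indices where the normalized correlation exceeds the threshold."""
    nb, nt = template.shape
    hits = []
    for t in range(spec.shape[1] - nt + 1):
        patch = spec[:nb, t:t + nt]
        denom = np.linalg.norm(patch) * np.linalg.norm(template)
        if denom > 0 and float(np.sum(patch * template)) / denom > threshold:
            hits.append(t)
    return hits

# Synthetic 1-s downsweep from 20 Hz to 15 Hz at fs = 100 samples/s,
# embedded at 4 s into an otherwise silent 10-s record.
fs = 100
tt = np.arange(100) / fs
sweep = np.sin(2 * np.pi * (20.0 * tt - 2.5 * tt ** 2))
signal = np.zeros(1000)
signal[400:500] = sweep
template = spectrogram(sweep, fs)
hits = detect_calls(spectrogram(signal, fs), template)
```

With a 16-sample hop, the embedded call starting at sample 400 corresponds to frame 25, which the detector flags.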


Journal of the Acoustical Society of America | 2011

Reconstruction of the signal produced by a directional sound source from remote multi-microphone recordings

Francesco Guarato; John Hallam; Ikuo Matsuo

A mathematical method is developed for reconstructing the signal produced by a directional sound source from knowledge of the same signal in the far field, i.e., from microphone recordings. The key idea is to compute inverse filters that compensate for the directional filtering of the signal by the sound source directivity, using a least-squares error optimization strategy. Previous work pointed out how strongly the method depends on the arrival times of the signal in the microphone recordings. Two strategies are used in this paper for calculating the time shifts that are afterward taken as inputs, together with source directivity, for the reconstruction. The method has been tested in a laboratory environment, where ground truth was available, with a Polaroid transducer as source. The reconstructions are similar with both strategies. The performance of the method also depends on source orientation.
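The core operation, a least-squares inverse of the source's directional filtering, reduces in the frequency domain to a Tikhonov-regularized division. The sketch below assumes circular convolution and a known directivity impulse response, both simplifications; the paper's optimization additionally handles arrival times, and the filter here is invented for the demo.

```python
import numpy as np

def reconstruct_source(recording, directivity_ir, reg=1e-6):
    """Least-squares (Tikhonov-regularized) inverse filtering:
    G(f) = conj(H(f)) / (|H(f)|^2 + reg) compensates the directional
    filter H applied by the source in the recorded direction."""
    n = len(recording)
    H = np.fft.rfft(directivity_ir, n)
    G = np.conj(H) / (np.abs(H) ** 2 + reg)
    return np.fft.irfft(np.fft.rfft(recording) * G, n)

# Demo with an invented directional filter and circular convolution.
source = np.sin(2 * np.pi * 5 * np.arange(256) / 256)
ir = np.array([1.0, 0.3])
recording = np.fft.irfft(np.fft.rfft(source) * np.fft.rfft(ir, 256), 256)
estimate = reconstruct_source(recording, ir)
```

The regularization term keeps the inversion stable at frequencies where the directivity response is weak, at the cost of slightly biasing the estimate there.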


Journal of the Acoustical Society of America | 2013

Localization and tracking of moving objects in two-dimensional space by echolocation

Ikuo Matsuo

Bats use frequency-modulated echolocation to identify and capture moving objects in real three-dimensional space. Experimental evidence indicates that bats are capable of locating static objects with a range accuracy of less than 1 μs. A previously introduced model estimates the ranges of multiple static objects using linear frequency modulation (LFM) sound and Gaussian chirplets with a carrier frequency compatible with bat emission sweep rates. The delay time for a single object was estimated with an accuracy of about 1.3 μs by measuring the echo at a low signal-to-noise ratio (SNR). The range accuracy was dependent not only on the SNR but also on the Doppler shift, which depended on the object's movement. However, it was unclear whether this model could estimate the range of a moving object at each timepoint. In this study, echoes were measured from a rotating pole at two receiving points by intermittently emitting LFM sounds. The model was shown to localize moving objects in two-dimensional space by accurately estimating each object's range at each timepoint.


Frontiers in Physiology | 2013

Echolocation of static and moving objects in two-dimensional space using bat-like frequency-modulation sound

Ikuo Matsuo

Bats use frequency-modulated echolocation to identify and capture moving objects in real three-dimensional space. The big brown bat, Eptesicus fuscus, emits linear period modulation sound, and is capable of locating static objects with a range accuracy of less than 1 μs. A previously introduced model can estimate the ranges of multiple static objects using linear frequency modulation (LFM) sound and Gaussian chirplets with a carrier frequency compatible with bat emission sweep rates. The delay time for a single object was estimated with an accuracy of about 1.3 μs by measuring the echo at a low signal-to-noise ratio. This model could estimate the location of each moving object in two-dimensional space. In this study, linear period modulation sounds, mimicking the emitted pulses of big brown bats, were introduced as the emitted signals. Echoes were measured from moving objects at two receiving points by intermittently emitting these sounds. It was shown that this model could localize moving objects in two-dimensional space by accurately estimating the objects' ranges.


Journal of the Acoustical Society of America | 2014

Acoustic tracking of bats in clutter environments using microphone arrays

Ikuo Matsuo; Alyssa Wheeler; Laura N. Kloepper; Jason E. Gaudette; James A. Simmons

The big brown bat, Eptesicus fuscus, uses echolocation for foraging and orientation. Bats change their echolocation calls depending on the environment, so it is necessary to clarify the changes in the acoustic characteristics of these calls. In this study, the flight paths of bats were tracked by computing the time differences of arrival (TDOA) at a microphone array system in the flight room. The acoustic patterns of the echolocation calls were calculated from the call data measured at each microphone by compensating for spreading and absorption losses. The head aim and beam pattern at each harmonic were computed from these acoustic patterns. It was examined whether these acoustic beam patterns were dependent on the clutter environment, that is, on the density of chains. [This research was supported by ONR, NSF, and JST, CREST.]
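TDOA estimates of the kind used for the tracking above typically come from the peak of the cross-correlation between microphone pairs. A minimal sketch of that single step (the pulse and sampling rate are invented; the array geometry and position solver are not reproduced):

```python
import numpy as np

def tdoa_seconds(sig_a, sig_b, fs):
    """Time difference of arrival from the cross-correlation peak:
    positive when the call reaches microphone A before microphone B."""
    c = np.correlate(sig_b, sig_a, mode="full")
    lag = int(np.argmax(c)) - (len(sig_a) - 1)
    return lag / fs

# Invented example: the same call arrives 30 samples later at mic B.
fs = 250_000
pulse = np.hanning(64)
mic_a = np.zeros(1024)
mic_b = np.zeros(1024)
mic_a[100:164] = pulse
mic_b[130:194] = pulse
dt = tdoa_seconds(mic_a, mic_b, fs)
```

With TDOAs from several microphone pairs and known microphone positions, the source position can then be solved by multilateration.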

Collaboration


Dive into Ikuo Matsuo's collaborations.

Top Co-Authors

Tomonari Akamatsu
National Agriculture and Food Research Organization

Tomohito Imaizumi
Tokyo University of Marine Science and Technology

Masanori Ito
Tohoku Gakuin University

Katsuyoshi Kawaguchi
Japan Agency for Marine-Earth Science and Technology

Kazuki Yamato
Tohoku Gakuin University

Ryoichi Iwase
Japan Agency for Marine-Earth Science and Technology

Kazuo Amakasu
Tokyo University of Marine Science and Technology