
Publication


Featured research published by Mingjun Jiang.


IEEE Sensors Journal | 2017

Real-Time Vibration Source Tracking Using High-Speed Vision

Mingjun Jiang; Qingyi Gu; Tadayoshi Aoyama; Takeshi Takaki; Idaku Ishii

In this paper, a concept of vision-based vibration source localization that extracts vibration image regions using pixel-level digital filters in a high-frame-rate (HFR) video is proposed. The method can detect periodic changes in the audio frequency range in the image intensities at pixels of vibrating objects. Owing to the acute directivity of the optical image sensor, our HFR-vision-based method can localize a vibration source more accurately than acoustic source localization methods. By applying pixel-level digital filters to clipped region-of-interest (ROI) images, in which the center position of a vibrating object is tracked at a fixed position, our method can reduce the latency effect on a digital filter, which may degrade the localization accuracy in vibration source tracking. Pixel-level digital filters for 128 × 128 ROI images, which are tracked from 512 × 512 input images, are implemented on a 1000-frames/s vision platform that can measure vibration distributions at 100 Hz or higher. Our tracking system allows a vibrating object to be tracked in real time at the center of the camera view by controlling a pan-tilt active vision system. We present several experimental tracking results for objects vibrating at high frequencies, which cannot be observed by standard video cameras or the naked human eye, including a flying quadcopter with rotating propellers, and demonstrate its performance in vibration source localization with sub-degree-level angular directivity, which is more acute than the directivity of a few or more degrees in acoustic-based source localization.
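The pixel-level filtering described above can be sketched in a few lines. The following is a minimal illustration under assumed inputs, not the authors' real-time implementation: it takes a stack of grayscale HFR frames as a NumPy array, applies the same band-pass IIR filter to every pixel's intensity time series, and flags pixels whose filtered response is strong as candidate vibration-source pixels. The frame rate, pass band, and threshold are placeholder values.

```python
import numpy as np
from scipy.signal import butter, lfilter

def vibration_mask(frames, fps=1000.0, band=(80.0, 120.0), threshold=5.0):
    """Flag pixels whose intensity oscillates inside `band` (Hz).

    frames : (T, H, W) array of grayscale high-frame-rate images.
    Returns a boolean (H, W) mask of candidate vibration-source pixels.
    Placeholder frame rate, pass band, and threshold; a sketch only.
    """
    nyq = fps / 2.0
    b, a = butter(2, [band[0] / nyq, band[1] / nyq], btype="band")
    x = np.asarray(frames, dtype=float)
    # Remove each pixel's mean so only intensity fluctuations remain.
    x = x - x.mean(axis=0, keepdims=True)
    # Filtering along axis 0 applies the same digital filter to every
    # pixel's time series independently (a "pixel-level digital filter").
    y = lfilter(b, a, x, axis=0)
    energy = np.sqrt((y ** 2).mean(axis=0))  # per-pixel RMS response
    return energy > threshold
```

As the abstract notes, applying such filters to a tracked ROI keeps the vibrating object at a fixed pixel position, so each pixel's time series is not corrupted by the object's own motion.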


Sensors | 2016

Pixel-Level and Robust Vibration Source Sensing in High-Frame-Rate Video Analysis

Mingjun Jiang; Tadayoshi Aoyama; Takeshi Takaki; Idaku Ishii



Sensors | 2017

Motion-Blur-Free High-Speed Video Shooting Using a Resonant Mirror

Michiaki Inoue; Qingyi Gu; Mingjun Jiang; Takeshi Takaki; Idaku Ishii; Kenji Tajima

This study proposes a novel concept of actuator-driven frame-by-frame intermittent tracking for motion-blur-free video shooting of fast-moving objects. The camera frame and shutter timings are synchronized with a free-vibration-type actuator vibrating with a large amplitude at hundreds of hertz, so that motion blur in free-viewpoint high-frame-rate video of fast-moving objects is significantly reduced while drawing the maximum performance from the actuator. We develop a prototype of a motion-blur-free video shooting system by implementing our frame-by-frame intermittent tracking algorithm on a high-speed video camera system with a resonant mirror vibrating at 750 Hz. It can capture 1024 × 1024 images of fast-moving objects at 750 fps with an exposure time of 0.33 ms without motion blur. Several experimental results for fast-moving objects verify that our proposed method can reduce image degradation from motion blur without decreasing the camera exposure time.
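The timing idea behind this intermittent tracking can be illustrated with a short sketch. It is an approximation under assumed numbers, not the authors' controller: the mirror is modeled as a pure sinusoid, and each exposure window (the 0.33 ms quoted above) is centered on the instant in every mirror cycle when the mirror's angular velocity matches the object's apparent angular velocity, so relative image motion is roughly cancelled while the shutter is open. The mirror amplitude and object speed below are made-up values.

```python
import numpy as np

def exposure_schedule(f_mirror=750.0, amp_deg=5.0, target_deg_per_s=2000.0,
                      exposure_s=0.33e-3, n_frames=10):
    """Sketch of frame-by-frame intermittent tracking timing.

    Mirror angle:    theta(t) = amp * sin(2*pi*f*t)
    Mirror velocity: theta'(t) = 2*pi*f*amp * cos(2*pi*f*t)
    One exposure per mirror cycle is centered on the instant where the
    mirror velocity equals the object's apparent angular velocity.
    All parameter values are illustrative placeholders.
    """
    amp = np.deg2rad(amp_deg)
    target = np.deg2rad(target_deg_per_s)
    peak = 2.0 * np.pi * f_mirror * amp          # maximum mirror velocity
    if abs(target) > peak:
        raise ValueError("object too fast for this mirror amplitude/frequency")
    phase = np.arccos(target / peak)             # phase of velocity match
    t_match = phase / (2.0 * np.pi * f_mirror)   # seconds into each cycle
    period = 1.0 / f_mirror
    starts = t_match + period * np.arange(n_frames) - exposure_s / 2.0
    return list(zip(starts, starts + exposure_s))

# First few exposure windows for a 750 Hz resonant mirror (illustrative).
for open_t, close_t in exposure_schedule()[:3]:
    print(f"expose {open_t * 1e3:.3f} ms -> {close_t * 1e3:.3f} ms")
```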


International Conference on Robotics and Biomimetics | 2016

Vibration source localization for motion-blurred high-frame-rate videos

Mingjun Jiang; Tadayoshi Aoyama; Takeshi Takaki; Idaku Ishii



ROBOMECH Journal | 2017

Motion-blur-free video shooting system based on frame-by-frame intermittent tracking

Michiaki Inoue; Mingjun Jiang; Yuji Matsumoto; Takeshi Takaki; Idaku Ishii



International Conference on Advanced Intelligent Mechatronics | 2018

Real-time Multicopter Detection Using Pixel-level Digital Filters for Frame-Interpolated High-frame-rate Images

Kohei Shimasaki; Mingjun Jiang; Takeshi Takaki; Idaku Ishii

We investigate the effect of appearance variations on the detectability of vibration feature extraction with pixel-level digital filters for high-frame-rate videos. In particular, we consider robust vibrating-object tracking, which is clearly different from conventional appearance-based object tracking with spatial pattern recognition in a high-quality image region of a certain size. For 512 × 512 videos of a rotating fan located at different positions and orientations and captured at 2000 frames per second with different lens settings, we verify how many pixels are extracted as vibrating regions by the pixel-level digital filters. The effectiveness of dynamics-based vibration features is demonstrated by examining the robustness against changes in the aperture size and focal condition of the camera lens, the apparent size and orientation of the tracked object, its rotational frequency, and the complexity and movement of the background scene. Tracking experiments with a flying multicopter with rotating propellers are also described to verify the robustness of localization under complex imaging conditions in outdoor scenarios.
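The pixel-counting check described here is straightforward to express. The helper below is hypothetical and simply assumes that a boolean vibration mask (such as one produced by a pixel-level digital filter) has already been computed for each capture condition.

```python
import numpy as np

def count_vibrating_pixels(masks_by_condition):
    """Count filter-flagged pixels for each capture condition.

    masks_by_condition : dict mapping a condition label (e.g. a lens or
    aperture setting) to a boolean (H, W) mask produced by a pixel-level
    digital filter. Hypothetical helper for the robustness check above.
    """
    return {label: int(np.count_nonzero(mask))
            for label, mask in masks_by_condition.items()}
```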


International Conference on Advanced Intelligent Mechatronics | 2018

Multithread Active Vision-based Modal Analysis Using Multiple Vibration Distribution Synthesis method

Tadayoshi Aoyama; Liang Li; Mingjun Jiang; Takeshi Takaki; Idaku Ishii; Hua Yang; Chikako Umemoto; Hiroshi Matsuda; Makoto Chikaraishi; Akimasa Fujiwara



Transactions of the JSME (in Japanese) | 2018

A fast vibration source tracking algorithm using pixel-level digital filters

Kohei Shimasaki; Mingjun Jiang; Takeshi Takaki; Idaku Ishii

In this paper, we propose a robust method of vibration feature extraction using pixel-level digital filters in a high-frame-rate (HFR) video contaminated with motion blurs from a vibration source. This is completely different from conventional appearance-based object tracking methods that require spatial images with high-quality resolution. Our method allows a certain amount of motion blur to acquire brighter images by using a longer exposure time setting. This enables vibration source localization (VSL) under insufficient illumination. To verify its effectiveness, we present an experimental real-time tracking result of a flying multicopter with propellers rotating at several hundred hertz by implementing pixel-level digital filters for 128×128 images on a 1000 fps vision platform. Additionally, we kept the extracted vibration region at camera view center by controlling a pan-tilt active vision system.
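The final step, keeping the extracted vibration region at the center of the camera view, can be sketched as a simple proportional pan-tilt update. This is an illustration with made-up gain and field-of-view values, not the authors' controller: it converts the pixel offset of the detected region's centroid from the image center into incremental pan and tilt angles.

```python
import numpy as np

def pan_tilt_command(mask, fov_deg=(40.0, 30.0), gain=0.5):
    """Proportional pan/tilt update that recenters the vibration region.

    mask : boolean (H, W) vibration mask from a pixel-level filter.
    Returns (d_pan_deg, d_tilt_deg); (0, 0) when nothing was detected.
    The gain and field of view are illustrative placeholders.
    """
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return 0.0, 0.0
    h, w = mask.shape
    # Pixel offset of the detected region's centroid from the image center.
    err_x = xs.mean() - (w - 1) / 2.0
    err_y = ys.mean() - (h - 1) / 2.0
    # Convert pixel error to an angle, assuming a linear pixel-angle mapping.
    d_pan = gain * err_x * fov_deg[0] / w
    d_tilt = gain * err_y * fov_deg[1] / h
    return d_pan, d_tilt
```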


Journal of Robotics and Mechatronics | 2018

Real-Time Monocular Three-Dimensional Motion Tracking Using a Multithread Active Vision System

Shaopeng Hu; Mingjun Jiang; Takeshi Takaki; Idaku Ishii


Journal of Physics: Conference Series | 2018

High-speed Vision System for Dynamic Structural Distributed Displacement Analysis

Zulhaj Aliansyah; Mingjun Jiang; Takeshi Takaki; Idaku Ishii

Collaboration


Dive into Mingjun Jiang's collaborations.

Top Co-Authors

Hiroshi Matsuda
Tokyo University of Agriculture and Technology

Hua Yang
Hiroshima University

Liang Li
Hiroshima University