Idaku Ishii
Hiroshima University
Publications
Featured research published by Idaku Ishii.
IEEE Transactions on Electron Devices | 2003
Takashi Komuro; Idaku Ishii; Masatoshi Ishikawa; Atsushi Yoshida
This paper describes a new vision chip architecture for high-speed target tracking. The processing speed and the number of pixels are improved by hardware implementation of a special algorithm that exploits a property of high-speed vision and by the introduction of bit-serial and cumulative summation circuits. As a result, 18 objects in a 128 × 128 image can be tracked in 1 ms. Based on the architecture, a prototype chip has been developed; 64 × 64 pixels are integrated on a 7 mm square chip, and the power consumption for obtaining the centroid of an object every 1 ms is 112 mW. Experiments were performed on an evaluation board developed to test the chip under actual operating conditions. High-speed target tracking, including multitarget tracking with collision and separation, has been successfully achieved.
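The centroid computation that the chip's bit-serial, cumulative-summation circuits perform can be sketched in software: the zeroth-order moment counts object pixels, the first-order moments accumulate coordinates, and the centroid follows by division. A minimal Python sketch (illustrative only; the chip computes this per tracked object, in parallel, in hardware):

```python
def centroid(binary_image):
    """Centroid of a binary object via zeroth/first-order moments.

    Software sketch of the moment accumulation that the chip's
    bit-serial cumulative-summation circuits perform in hardware.
    """
    m00 = m10 = m01 = 0
    for y, row in enumerate(binary_image):
        for x, v in enumerate(row):
            if v:
                m00 += 1   # zeroth-order moment: object area
                m10 += x   # first-order moment in x
                m01 += y   # first-order moment in y
    if m00 == 0:
        return None        # no object pixels in the image
    return (m10 / m00, m01 / m00)

# A 2 x 2 object spanning (1,1)..(2,2) has its center at (1.5, 1.5):
img = [[0, 0, 0, 0],
       [0, 1, 1, 0],
       [0, 1, 1, 0],
       [0, 0, 0, 0]]
print(centroid(img))  # → (1.5, 1.5)
```

Tracking at 1 ms intervals then reduces to re-running this accumulation on each new frame inside a small search window around the previous centroid.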
international conference on robotics and automation | 2010
Idaku Ishii; Tetsuro Tatebe; Qingyi Gu; Yuta Moriue; Takeshi Takaki; Kenji Tajima
This paper introduces a high-speed vision system called IDP Express, which can execute real-time image processing and high frame rate video recording simultaneously. In IDP Express, a dedicated FPGA (Field Programmable Gate Array) board processes 512 × 512 pixel images from two camera heads by implementing image processing algorithms as hardware logic; the input images and processed results are transferred to standard PC memory at a rate of 2000 fps or more. Owing to the simultaneous high-frame-rate video processing and recording, IDP Express can be used as an intelligent video logger for long-term high-speed phenomenon analysis even when the measured objects move quickly in a wide area. We applied IDP Express to a mechanical target tracking system to record a high-frame-rate video at high resolution for a crucial moment, which is magnified by tracking the measured objects with visual feedback control. Several experiments on moving objects that undergo sudden shape deformation were performed. The results of the experiments involving the explosion of a rotating balloon and the crash of falling custard pudding have been provided to verify the effectiveness of IDP Express.
intelligent robots and systems | 2009
Idaku Ishii; Taku Taniguchi; Ryo Sukenobe; Kenkichi Yamamoto
In this paper, we introduce a high-speed vision platform, H3 (Hiroshima Hyper Human) Vision, which can simultaneously process a 1024 × 1024 pixel image at 1000 fps and a 256 × 256 pixel image at 10000 fps by implementing image processing algorithms as hardware logic on a dedicated FPGA board. Various types of algorithms are implemented to show that H3 Vision can work as a high-speed image processing engine. Experimental results are shown for multi-target color tracking, feature point tracking, optical flow detection, and pattern recognition using higher-order local auto-correlation (HLAC) features at a frame rate of 1000 fps or more.
IEEE Transactions on Circuits and Systems for Video Technology | 2012
Idaku Ishii; Taku Taniguchi; Kenkichi Yamamoto; Takeshi Takaki
In this paper, we develop a high-frame-rate (HFR) vision system that can estimate the optical flow in real time at 1000 f/s for 1024×1024 pixel images via the hardware implementation of an improved optical flow detection algorithm on a high-speed vision platform. Based on the Lucas-Kanade method, we adopt an improved gradient-based algorithm that can adaptively select a pseudo-variable frame rate according to the amplitude of the estimated optical flow to accurately detect the optical flow for objects moving at high and low speeds in the same image. The performance of our developed HFR optical flow system was verified through experimental results for high-speed movements such as a top's spinning motion and a human's pitching motion.
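The pseudo-variable frame rate idea can be illustrated with a toy sketch: a gradient-based (Lucas-Kanade-style) flow estimate over a window, plus a rule that chooses how many frames back to difference so the expected displacement stays near one pixel. The target displacement, clamp values, and 1-D simplification below are hypothetical illustrations, not the paper's actual selection rule:

```python
def lk_flow_1d(prev, curr, x, w=2):
    """1-D Lucas-Kanade sketch: least-squares solve of Ix*u = -It
    over a window of half-width w around pixel x."""
    num = den = 0.0
    for i in range(x - w, x + w + 1):
        ix = (prev[i + 1] - prev[i - 1]) / 2.0  # spatial gradient
        it = curr[i] - prev[i]                   # temporal gradient
        num += ix * it
        den += ix * ix
    return -num / den if den else 0.0

def select_frame_interval(flow_amplitude, max_interval=8, target_disp=1.0):
    """Pseudo-variable frame rate: pick how many frames back to
    difference against so the expected displacement is ~target_disp
    pixels. Slow motion -> longer interval (lower effective frame
    rate); fast motion -> shorter interval. Parameter values are
    hypothetical."""
    if flow_amplitude <= 0:
        return max_interval
    k = round(target_disp / flow_amplitude)
    return max(1, min(max_interval, k))

# A ramp signal shifted right by one pixel yields a flow of ~1 px/frame:
prev = list(range(10))
curr = [i - 1 for i in range(10)]
print(lk_flow_1d(prev, curr, 5))  # → 1.0
```

Running both directions (x and y) per pixel, with the interval chosen per region from the previous flow estimate, gives the flavor of the adaptive scheme.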
IEEE Transactions on Circuits and Systems for Video Technology | 2013
Qingyi Gu; Takeshi Takaki; Idaku Ishii
This paper describes a high-frame-rate (HFR) vision system that can extract locations and features of multiple objects in an image at 2000 f/s for 512 × 512 images by implementing a cell-based multiobject feature extraction algorithm as hardware logic on a field-programmable gate array-based high-speed vision platform. In the hardware implementation of the algorithm, 25 higher-order local autocorrelation features of 1024 objects in an image can be simultaneously extracted for multiobject recognition by dividing the image into 8 × 8 cells concurrently with calculation of the zeroth and first-order moments to obtain the sizes and locations of multiple objects. Our developed HFR multiobject extraction system was verified by performing several experiments: tracking for multiple objects rotating at 16 r/s, recognition for multiple patterns projected at 1000 f/s, and recognition for human gestures with quick finger motion.
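The cell-based part of the algorithm can be sketched in software: the image is divided into fixed-size cells, and each cell independently accumulates zeroth- and first-order moments, so the sizes and locations of many objects fall out concurrently. A simplified Python sketch (the FPGA evaluates all cells in parallel; the cell size and output format here are illustrative):

```python
def cell_moments(binary_image, cell=8):
    """Per-cell zeroth/first-order moments: each cell reports the
    area (m00) and centroid of the object pixels it contains, so
    multiple objects can be located in one pass. Software sketch of
    the cell-based extraction; cell size is illustrative."""
    h, w = len(binary_image), len(binary_image[0])
    out = {}
    for cy in range(0, h, cell):
        for cx in range(0, w, cell):
            m00 = m10 = m01 = 0
            for y in range(cy, min(cy + cell, h)):
                for x in range(cx, min(cx + cell, w)):
                    if binary_image[y][x]:
                        m00 += 1   # area
                        m10 += x   # x-coordinate sum
                        m01 += y   # y-coordinate sum
            if m00:  # report only cells that contain object pixels
                out[(cx // cell, cy // cell)] = (m00, m10 / m00, m01 / m00)
    return out
```

In the paper's setting the per-cell moments are computed alongside the 25 HLAC features of each cell, so recognition and localization share one pass over the image.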
IEEE Transactions on Electron Devices | 2006
Idaku Ishii; Kenkichi Yamamoto; Munehiro Kubozono
This paper describes very large scale integration implementation using a new vision chip architecture specialized for target tracking and recognition. A 64 × 64 pixel prototype vision chip and its evaluation results are shown. The extraction algorithms of both higher-order local autocorrelation (HLAC) features and moment features are implemented on the prototype chip in order to achieve high-speed image processing and enhanced pixel integration. The chip is integrated on a 5.00 mm × 5.00 mm die using a 0.35-μm CMOS DLP/TLM process; the pixel size is 44.2 μm × 48.3 μm. The maximum current consumption is approximately 400 mA, and the chip can calculate all HLAC features more than 26 times in 1 ms. The experimental results also demonstrate that the chip can successfully recognize and count high-speed objects in real time.
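An HLAC feature sums, over all image positions, the product of pixel values at a fixed set of offsets; the standard set for binary images enumerates the 25 shift-inequivalent offset patterns (up to order 2) within a 3 × 3 neighborhood. The three masks below are representative examples of that set, not the full 25, and the software loop is only a sketch of what the chip evaluates in parallel:

```python
# Three example HLAC masks (orders 0, 1, 2). The full standard set
# has 25 shift-inequivalent offset patterns in a 3x3 neighborhood.
MASKS = [
    [(0, 0)],                   # order 0: the pixel itself
    [(0, 0), (1, 0)],           # order 1: horizontal pair
    [(0, 0), (1, 0), (-1, 0)],  # order 2: horizontal triple
]

def hlac(binary_image, masks=MASKS):
    """feature_k = sum over positions r of the product of pixels at
    r + a for each offset a in mask_k (interior positions only)."""
    h, w = len(binary_image), len(binary_image[0])
    feats = []
    for mask in masks:
        total = 0
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                p = 1
                for dx, dy in mask:
                    p *= binary_image[y + dy][x + dx]
                total += p
        feats.append(total)
    return feats
```

Because every mask is a sum of local products, all features can be accumulated in one raster scan, which is what makes the per-pixel hardware implementation natural.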
international conference on robotics and automation | 2004
Idaku Ishii; K. Kato; S. Kurozumi; H. Nagai; A. Numata; K. Tajima
This paper describes a prototype system of an Mm (mega-pixel and milli-second) vision camera head, which strikes a balance between high resolution and high speed. This system is designed based on the concept of MmVision with intelligent pixel selection; target tracking results at a 1000 fps level are shown.
IEEE Transactions on Automation Science and Engineering | 2008
Idaku Ishii; Shogo Kurozumi; Kensuke Orito; Hiroshi Matsuda
Quantifying the rapid action of a mouse scratching its head with its hind leg can provide an objective behavioral indicator for atopic dermatitis, and the development of new drugs for this disease can be significantly expedited by automating such quantification. In this paper, we propose methods for extracting the scratching patterns of a mouse by focusing on its rapid and periodic behavioral patterns. We extracted the scratching patterns from high-speed video images of actual laboratory mice, and our experimental results show that scratching can be automatically quantified without misinterpretation. Note to Practitioners - The proposed pattern detection method is entirely free of the false detections caused by markers painted on the animals' skin. Practitioners in laboratory experiments can therefore easily apply it to other periodic behavior analyses, such as gait pattern analysis and tremble detection, for automated quantitative evaluation of behaviors related to brain, heart, or other diseases, even if the experimental animals are sensitive to painting.
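The core idea, detecting rapid periodic motion, can be illustrated with a toy autocorrelation sketch on a per-frame motion-energy signal (for instance, the inter-frame difference energy). This is a hypothetical illustration of periodicity detection, not the paper's extraction procedure:

```python
def dominant_period(signal):
    """Estimate the period (in frames) of a motion-energy signal as
    the lag with maximal autocorrelation. Hypothetical illustration:
    a strongly periodic signal, e.g. a scratching bout, produces a
    clear peak at its repetition interval."""
    n = len(signal)
    mean = sum(signal) / n
    s = [v - mean for v in signal]          # remove the DC offset
    best_lag, best_corr = 0, float("-inf")
    for lag in range(1, n // 2):
        corr = sum(s[i] * s[i + lag] for i in range(n - lag))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return best_lag

# A signal repeating every 5 frames is detected with period 5:
sig = [1 if i % 5 < 2 else 0 for i in range(40)]
print(dominant_period(sig))  # → 5
```

At a high frame rate, a scratching bout of tens of hertz maps to a short, stable lag, which is easy to separate from slow, aperiodic body movement.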
1st Annual International IEEE-EMBS Special Topic Conference on Microtechnologies in Medicine and Biology. Proceedings (Cat. No.00EX451) | 2000
Hiromasa Oku; Idaku Ishii; Masatoshi Ishikawa
With the rapid development of biotechnology, observing motile microorganisms has become increasingly important for research. It is difficult to observe living motile microorganisms through a microscope because they easily move out of its field of view, so instruments that keep objects within the field of view are important. Visual feedback from an image sensor is advantageous for tracking motile microorganisms; however, the frame rate of most conventional vision systems (NTSC 30 Hz / PAL 25 Hz) is too slow for such visual feedback. In this study, the authors develop a new instrument to track motile microorganisms using a high-speed vision system, the CPV system, which can acquire and process images in approximately 1 ms. Experimental results demonstrate successful visual tracking of a living paramecium within the field of view of the microscope.
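The tracking loop behind such an instrument can be sketched as simple proportional visual feedback: each cycle, measure the organism's centroid in the image and command the stage to cancel a fraction of the offset from the image center. The gain, image center, and update rule below are hypothetical values for illustration; the CPV system closes a loop of this kind roughly every 1 ms:

```python
def stage_step(centroid, center=(64.0, 64.0), gain=0.5):
    """One visual-feedback iteration: return a stage correction that
    cancels a fraction (gain) of the tracking error, keeping the
    organism near the image center. gain and center are hypothetical."""
    ex = centroid[0] - center[0]
    ey = centroid[1] - center[1]
    return (gain * ex, gain * ey)

# Simulated loop: an organism first seen at (80, 70) converges to the
# image center as corrections are applied each cycle.
pos = (80.0, 70.0)
for _ in range(30):
    dx, dy = stage_step(pos)
    pos = (pos[0] - dx, pos[1] - dy)
```

With a 1 ms cycle the per-step error is small even for a fast-swimming paramecium, which is exactly why the high frame rate makes stable tracking possible.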
international conference on robotics and automation | 2010
Takeshi Takaki; Youhei Omasa; Idaku Ishii; Tomohiro Kawahara; Masazumi Okajima
This paper presents a force visualization mechanism for endoscopic surgical instruments using a moiré fringe. The mechanism can display fringes or characters that correspond to the magnitude of the force between the surgical instruments and internal organs without the use of electronic elements such as amplifiers and strain gauges. As the mechanism is simply attached to the surgical instruments, no additional devices are needed in the operating room, nor are wires needed to connect such devices. The structure is simple, and its fabrication is inexpensive. An example is shown with the mechanism mounted on 10-mm forceps. We experimentally verified in vivo, using a pig, that it can display characters corresponding to the magnitude of the force, thus visually conveying the force even in an endoscopic image.
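The reason a passive grating pair can make a tiny force visible is moiré magnification: in one common formulation, a relative shift d between two gratings of pitches p1 and p2 moves the beat fringe by roughly d · p1 / |p1 − p2|, so a small strain-induced displacement becomes a large fringe motion. The pitch values below are hypothetical, and this is a textbook approximation rather than this paper's specific design:

```python
def moire_amplification(p1, p2):
    """Approximate moire magnification factor for two overlaid
    gratings of pitches p1 and p2 (common textbook formulation):
    a relative shift d moves the beat fringe by ~d * p1 / |p1 - p2|.
    Pitch values used in the example are hypothetical."""
    return p1 / abs(p1 - p2)

# Gratings of pitch 1.00 mm and 1.05 mm amplify displacement ~20x,
# so a 0.05 mm deflection moves the fringe by about 1 mm.
print(moire_amplification(1.0, 1.05))
```

This amplification is what lets the mechanism render force levels as visibly moving fringes or characters with no electronics at all.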