Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Jihyun Cho is active.

Publication


Featured research published by Jihyun Cho.


IEEE Journal of Solid-State Circuits | 2014

A 3.4-μW Object-Adaptive CMOS Image Sensor With Embedded Feature Extraction Algorithm for Motion-Triggered Object-of-Interest Imaging

Jaehyuk Choi; Seokjun Park; Jihyun Cho; Euisik Yoon

We report a low-power object-adaptive CMOS imager that reduces both spatial and temporal bandwidth. The imager embeds a feature extraction algorithm for identifying objects of interest. The sensor wakes up when triggered by motion sensing and extracts features from the captured image to detect an object of interest (OOI). Full-image capture and image signal transmission are performed only when objects of interest are found, which significantly reduces power consumption at the sensor node. This motion-triggered OOI imaging saves more than 96.5% of the spatial bandwidth through the feature output, and saves temporal bandwidth through motion-triggered wakeup and object-adaptive imaging. The sensor achieves low power consumption by employing a reconfigurable differential-pixel architecture with a reduced power supply voltage and by implementing the feature extraction algorithm with compact mixed-signal circuitry. The chip operates at 0.22 μW/frame in motion-sensing mode and at 3.4 μW/frame during feature extraction. The on-chip feature extraction circuits demonstrated a 94.5% detection rate for humans on a set of 200 sample images.
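The escalating wake-up pipeline described above can be sketched as follows. This is a toy model, not the chip's logic: the function names and the detector stub are illustrative assumptions, and only the two per-mode power figures come from the abstract.

```python
# Toy model of motion-triggered object-of-interest (OOI) imaging: the sensor
# idles in motion-sensing mode and escalates to feature extraction, and only
# then to full capture, when each stage fires. Full-capture power is not
# modeled here because the abstract does not state it.

MOTION_SENSING_UW = 0.22   # uW/frame in motion-sensing mode (from abstract)
FEATURE_EXTRACT_UW = 3.4   # uW/frame during feature extraction (from abstract)

def process_frame(motion_detected, is_object_of_interest):
    """Return (action, power in uW/frame) for one frame."""
    if not motion_detected:
        return "idle", MOTION_SENSING_UW
    if not is_object_of_interest():
        return "discard", FEATURE_EXTRACT_UW
    return "capture-and-transmit", FEATURE_EXTRACT_UW

# Most frames cost only 0.22 uW/frame, which is where the saving comes from.
print(process_frame(False, lambda: False))  # ('idle', 0.22)
```

The point of the structure is that the expensive stages run only on the rare frames that pass the cheaper gates before them.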


International Solid-State Circuits Conference | 2012

A 1.36μW adaptive CMOS image sensor with reconfigurable modes of operation from available energy/illumination for distributed wireless sensor network

Jaehyuk Choi; Seokjun Park; Jihyun Cho; Euisik Yoon

For outdoor surveillance, sensitivity and dynamic range are important for delivering reliable images under widely changing illumination. However, constant monitoring with maximum awareness requires large power consumption and is not suitable for energy-limited applications such as battery-operated and/or energy-scavenging wireless sensor nodes. One way to reduce power is voltage scaling [1-4]; however, it significantly reduces the SNR and results in poor image quality [4]. The signal can easily be corrupted by noise in dark conditions or saturated in bright conditions. Most imagers with high sensitivity and wide dynamic range [5,6] consume large power (>50 mW), making them unsuitable for wireless imager nodes. Therefore, it is imperative to implement a sensor that adapts to environmental changes: the sensor keeps monitoring at extremely low power and switches to high-sensitivity or wide-dynamic-range operation only when needed due to illumination changes or when the host requests detailed image transmission. The sensor switches back to the monitoring mode as a default, or when enough operating energy is not available from the battery or energy harvester. In this paper, we report an adaptive CMOS image sensor that employs four modes: monitoring, normal, high-sensitivity, and wide-dynamic-range (WDR). This adaptive feature enables reliable monitoring while significantly extending battery lifetime for wireless image-sensor nodes.
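The mode-switching policy described above can be sketched as a small selector. This is an illustrative sketch only: the paper defines the four modes, but the request labels and the function interface here are invented for the example.

```python
# Toy selector for the four operating modes of the adaptive sensor. The
# energy check mirrors the paper's policy of falling back to monitoring
# whenever harvested/battery energy is insufficient; the request names
# ("low-light", "high-contrast", "detail") are assumptions.

def select_mode(energy_available, illumination_request=None):
    """Pick an operating mode from the energy budget and an optional request."""
    if not energy_available:
        return "monitoring"              # default ultra-low-power mode
    modes = {"low-light": "high-sensitivity",
             "high-contrast": "wide-dynamic-range",
             "detail": "normal"}
    return modes.get(illumination_request, "monitoring")

print(select_mode(True, "low-light"))    # high-sensitivity
print(select_mode(False, "low-light"))   # monitoring: falls back to save energy
```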


International Solid-State Circuits Conference | 2014

7.2 243.3pJ/pixel bio-inspired time-stamp-based 2D optic flow sensor for artificial compound eyes

Seokjun Park; Jihyun Cho; Kyuseok Lee; Euisik Yoon

Miniaturized, low-power artificial compound eyes with a small form factor and low payload are a promising way to provide wide-field information for micro-air-vehicle (MAV) applications. Recently, research efforts have been made to realize bio-inspired artificial compound eyes that mimic the wide field of view (FoV) of insect visual organs by implementing photoreceptors that independently face different angles [1-2]. However, these approaches have drawbacks: they require complicated fabrication processes to form a hemispherical lens configuration and to secure an independent optical path to each photoreceptor. We take a simple and practical approach to realize wide-field optic-flow sensing in a pseudo-hemispherical configuration by mounting a number of 2D-array optic flow sensors on a flexible PCB module, as shown in Figure 7.2.1. In this scheme, the 2D optic flow sensor must meet the requirements of MAV applications: extremely low power consumption while maintaining robust optic flow generation. Conventional optic flow algorithms, such as Lucas-Kanade, require huge amounts of numerical calculation and therefore substantial digital hardware (CPU and/or FPGA), resulting in large power consumption [3-4]. As an alternative low-power approach, bio-inspired elementary-motion-detector (EMD) based algorithms (or neuromorphic algorithms) have been studied and implemented in analog VLSI circuits for autonomous navigation [5-6]. However, pure analog signal processing is easily susceptible to temperature and process variations, and it is difficult to scale the pixel size or apply low-power design techniques when extensive analog processing is implemented in pixel-level circuits.
In this work, we have devised and implemented a time-stamp-based optic flow algorithm, modified from the conventional EMD algorithm to give an optimum partitioning of hardware blocks between the analog and digital domains as well as an adequate allocation of pixel-level, column-parallel, and chip-level processing. Temporal filtering, which would require huge hardware resources if implemented in the digital domain, remains in a pixel-level analog processing unit. Feature detection is implemented in column-parallel digital circuits. The embedded digital core decodes the 2D time-stamp information into velocity at the chip level. Finally, the estimated 16b optic flow data are compressed and transmitted to the host over a 4-wire Serial Peripheral Interface (SPI) bus.
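The core idea of time-stamp-based optic flow can be illustrated in a few lines. This is a minimal 1-D sketch under assumptions of my own (function name, pixel pitch, integer time stamps), not the chip's implementation: each pixel latches the time at which a feature is detected, and velocity is recovered from the stamp difference between neighboring pixels, replacing the analog delay line of a classical EMD with simple digital subtraction.

```python
# 1-D sketch of time-stamp-based optic flow: velocity between adjacent
# pixels is pixel pitch divided by the difference of their feature
# detection times.

def optic_flow_from_stamps(stamps, pixel_pitch_um=10.0):
    """Estimate 1-D velocities (um per time unit) from per-pixel time stamps."""
    flow = []
    for t_a, t_b in zip(stamps, stamps[1:]):
        dt = t_b - t_a
        # A zero or negative interval means no consistent motion was observed.
        flow.append(pixel_pitch_um / dt if dt > 0 else 0.0)
    return flow

# A feature sweeping across 4 pixels, arriving 2 time units apart, yields a
# constant velocity of 10 um / 2 = 5 um per time unit.
print(optic_flow_from_stamps([0, 2, 4, 6]))  # [5.0, 5.0, 5.0]
```

Because the velocity decode reduces to subtractions and a reciprocal (implementable as a look-up table), it suits chip-level digital processing far better than a Lucas-Kanade-style least-squares solve.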


IEEE Journal of Solid-State Circuits | 2016

A Bidirectional Neural Interface Circuit With Active Stimulation Artifact Cancellation and Cross-Channel Common-Mode Noise Suppression

Adam E. Mendrela; Jihyun Cho; Jeffrey A. Fredenburg; Vivek Nagaraj; Theoden I. Netoff; Michael P. Flynn; Euisik Yoon

This work presents a bidirectional neural interface circuit that enables simultaneous recording and stimulation using a stimulation-artifact cancellation circuit. The system employs a common average referencing (CAR) front-end circuit to suppress cross-channel environmental noise, further facilitating use in clinical environments. This paper also introduces a new range-adapting (RA) SAR ADC to lower the system power consumption. A prototype was fabricated in 0.18 μm CMOS, characterized, and tested in vivo in an epileptic rat model. The prototype attenuates stimulation artifacts by up to 42 dB and suppresses cross-channel noise by up to 39.8 dB. The measured power consumption per channel is 330 nW, and the area per channel is 0.17 mm².
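Common average referencing itself is simple arithmetic, which the following sketch illustrates. The chip performs this in analog circuitry; this Python version only shows the operation: subtracting the cross-channel mean cancels any noise component shared by all electrodes.

```python
# Common average referencing (CAR): for each time sample, subtract the mean
# across channels so that common-mode noise (identical on every channel)
# cancels exactly, leaving only the per-channel neural signal.

def car(samples):
    """Apply CAR to one time-sample taken across all channels."""
    mean = sum(samples) / len(samples)
    return [s - mean for s in samples]

# A +10 common-mode offset on every channel is removed entirely.
print(car([12.0, 8.0, 11.0, 9.0]))  # [2.0, -2.0, 1.0, -1.0]
```

The trade-off is that any signal component genuinely shared across channels is also removed, which is why CAR targets environmental interference rather than localized neural activity.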


IEEE Journal of Solid-State Circuits | 2015

An Energy/Illumination-Adaptive CMOS Image Sensor With Reconfigurable Modes of Operations

Jaehyuk Choi; Seokjun Park; Jihyun Cho; Euisik Yoon

We present an energy/illumination-adaptive CMOS image sensor for distributed wireless sensor applications. The adaptive feature enables always-on imaging at extremely low power consumption, extending the lifetime of wireless image-sensor nodes, and provides optimal images across a wide range of illuminations. For adaptive operation, the sensor employs reconfigurable modes of operation. Most of the time, the sensor stays in a monitoring mode, which keeps imaging at extremely low power consumption. The sensor switches to a high-sensitivity or wide-dynamic-range imaging mode when illumination varies and sufficient power is available from energy harvesting, and switches back to the monitoring mode to save battery energy. The sensor operates at 1.36 μW/frame in the monitoring mode from harvested energy and provides high-sensitivity (24 V/lx·s) and wide-dynamic-range (99.2 dB) images at 867 μW in battery operation. The chip achieved a power FOM of 15.4 pW/pixel·frame in 0.18 μm technology.


IEEE Journal of Solid-State Circuits | 2014

A 3-D Camera With Adaptable Background Light Suppression Using Pixel-Binning and Super-Resolution

Jihyun Cho; Jaehyuk Choi; Seong-Jin Kim; Seokjun Park; Jungsoon Shin; James D. K. Kim; Euisik Yoon

This paper presents a CMOS time-of-flight (TOF) 3-D camera employing a column-level background light (BGL) suppression scheme for high-resolution outdoor imaging. The column-level approach minimizes the pixel size for high-density pixel arrays. Pixel-binning and super-resolution can be adaptively applied for optimal BGL suppression at given spatiotemporal resolutions. A prototype sensor was fabricated in a 0.11 μm CMOS process. The sensor achieved a fill factor of 24% in a pixel pitch of 5.9 μm, the smallest among TOF cameras reported to date. Measurements showed a temporal noise of 1.47 cm-rms with a 100 ms integration time at a demodulation frequency of 12.5 MHz using a white target at 1 m distance. The non-linearity was measured as 1% over the range of 0.75 m to 4 m. BGL suppression above 100 klx was demonstrated in indoor and outdoor experiments, while the BGL-induced offset remained below 2.6 cm from 0 to 100 klx.
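For context on the numbers above, the standard continuous-wave TOF relations can be worked through in a short sketch. This is textbook TOF arithmetic, not code from the paper; the function names are mine.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(phase_rad, f_mod_hz=12.5e6):
    """Distance from the measured demodulation phase: d = c*phi / (4*pi*f)."""
    return C * phase_rad / (4 * math.pi * f_mod_hz)

def unambiguous_range(f_mod_hz=12.5e6):
    """Maximum distance before the phase wraps past 2*pi: c / (2*f)."""
    return C / (2 * f_mod_hz)

# At the 12.5 MHz demodulation frequency used here, the range folds over at
# about 12 m, comfortably beyond the 0.75-4 m span that was characterized.
print(round(unambiguous_range(), 2))  # 11.99
```

This also shows the usual trade-off: a lower modulation frequency extends the unambiguous range, while a higher one improves distance resolution for the same phase noise.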


Symposium on VLSI Circuits | 2015

Enabling closed-loop neural interface: A bi-directional interface circuit with stimulation artifact cancellation and cross-channel CM noise suppression

Adam E. Mendrela; Jihyun Cho; Jeffrey A. Fredenburg; Cynthia A. Chestek; Michael P. Flynn; Euisik Yoon

We present the first bi-directional neural interface chip that employs a stimulation-artifact cancellation circuit to allow concurrent recording and stimulation. To further suppress cross-channel common-mode noise, we incorporated a novel common average referencing (CAR) circuit in conjunction with a range-adapting (RA) SAR ADC for low-power implementation. The fabricated prototype attenuates stimulation artifacts by up to 42 dB and suppresses common-mode noise across channels by up to 39.8 dB, consuming 330 nW and occupying 0.17 mm² per channel.


IEEE Transactions on Biomedical Circuits and Systems | 2015

A PWM Buck Converter With Load-Adaptive Power Transistor Scaling Scheme Using Analog-Digital Hybrid Control for High Energy Efficiency in Implantable Biomedical Systems

Sung Yun Park; Jihyun Cho; Kyuseok Lee; Euisik Yoon

We report a pulse width modulation (PWM) buck converter that achieves a power conversion efficiency (PCE) of >80% at light loads (<100 μA) for implantable biomedical systems. To achieve a high PCE at light loads, the buck converter adaptively reconfigures the sizes of the power PMOS and NMOS transistors and their gate drivers according to the load current, while operating at a fixed frequency of 1 MHz. The converter employs an analog-digital hybrid control scheme for coarse/fine adjustment of the power transistors. The coarse digital control generates an approximate duty cycle for the given load and selects an appropriate power-transistor width to minimize redundant power dissipation. The fine analog control provides the final tuning of the duty cycle to compensate for the error of the coarse digital control. Mode switching between the analog and digital controls is accomplished by a mode arbiter, which estimates the average duty cycle for the given load condition from the limit cycle oscillations (LCO) induced by coarse adjustment. The fabricated buck converter achieved a peak efficiency of 86.3% at 1.4 mA and >80% efficiency over a wide range of load conditions from 45 μA to 4.1 mA, while generating a 1 V output from a 2.5-3.3 V supply. The converter occupies 0.375 mm² in a 0.18 μm CMOS process and requires two external components: a 1.2 μF capacitor and a 6.8 μH inductor.
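The coarse/fine duty-cycle split can be sketched numerically. This is an illustration only: the ideal buck relation D = Vout/Vin is standard, but the 5-bit coarse resolution and the function names are assumptions, not chip parameters.

```python
# Sketch of the analog-digital hybrid duty-cycle control: a digital loop
# quantizes the duty cycle to a coarse step, and an analog loop trims the
# leftover error. Resolution here (5 bits) is an illustrative assumption.

def duty_cycle(v_out, v_in):
    """Ideal buck-converter duty cycle: D = Vout / Vin."""
    return v_out / v_in

def coarse_fine_split(d, coarse_bits=5):
    """Quantize D digitally; the residue is handled by the fine analog control."""
    step = 1.0 / (1 << coarse_bits)
    coarse = int(d / step) * step
    fine = d - coarse
    return coarse, fine

d = duty_cycle(1.0, 2.5)   # 1 V output from a 2.5 V supply -> D = 0.4
print(coarse_fine_split(d))
```

With D = 0.4 and a 1/32 step, the coarse loop settles at 0.375 and leaves a 0.025 residue for the analog fine control, mirroring how the coarse control only needs to get close while the fine control absorbs the quantization error.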


International Workshop on Cellular Nanoscale Networks and Their Applications (CNNA) | 2014

Jihyun Cho; Seokjun Park; Jaehyuk Choi; Euisik Yoon

Miniaturized, low-power implementation of a vision system is critical in battery-operated systems such as wireless sensor networks (WSN), micro-air-vehicles (MAV), and mobile phones. Conventional digital-intensive processing operates on raw images containing huge redundancy, which degrades power efficiency and speed. This paper reports multi-level mixed-mode processing schemes for VLSI implementations that are efficient in power, area, and speed. In this approach, processing is distributed across pixel-level, column-level, and chip-level processors. Each processor operates in mixed analog and digital domains for an optimal use of resources. Three vision chips have been designed and characterized to show the effectiveness of this approach. First, motion detection and feature extraction are implemented in an object-adaptive CMOS image sensor to remove temporal and spatial redundancies for low-power operation. Second, a neuromorphic algorithm for optic flow generation is implemented in mixed-mode circuits: event-driven analog processing units allow low-power pre-processing, while the digital processor provides robust backend processing. Finally, background light subtraction is implemented in a 3-D camera for outdoor mobile applications; the reconfigurable pixel array, implemented with pixel-merging and super-resolution, achieves faster processing and better background light suppression.


Proceedings of SPIE | 2013

Seokjun Park; Jaehyuk Choi; Jihyun Cho; Euisik Yoon

Monitoring wide-field surrounding information is essential for vision-based autonomous navigation in micro-air-vehicles (MAV). Our image-cube (iCube) module, which consists of multiple sensors facing different angles in 3-D space, can be applied to wide field-of-view optic flow estimation (μ-Compound Eyes) and attitude control (μ-Ocelli) in the Micro Autonomous Systems and Technology (MAST) platforms. In this paper, we report an analog/digital (A/D) mixed-mode optic-flow sensor that generates both optic flows and normal images in different modes for μ-Compound Eyes and μ-Ocelli applications. The sensor employs a time-stamp-based optic flow algorithm, modified from the conventional EMD (elementary motion detector) algorithm to give an optimum partitioning of hardware blocks between the analog and digital domains as well as an adequate allocation of pixel-level, column-parallel, and chip-level signal processing. Temporal filtering, which would require huge hardware resources if implemented in the digital domain, remains in a pixel-level analog processing unit. The remaining blocks, including feature detection and time-stamp latching, are implemented with digital circuits in a column-parallel processing unit. Finally, time-stamp information is decoded into velocity using look-up tables, multiplications, and simple subtraction circuits in a chip-level processing unit, significantly reducing core digital processing power consumption. In the normal image mode, the sensor generates 8-b digital images using single-slope ADCs in the column unit. In the optic flow mode, the sensor estimates 8-b 1-D optic flows with the integrated mixed-mode algorithm core and 2-D optic flows with external time-stamp processing.

Collaboration


Dive into Jihyun Cho's collaborations.

Top Co-Authors

Euisik Yoon

University of Michigan

Kyuseok Lee

University of Michigan
