Publication


Featured research published by Soo Chul Lim.


Autonomous Robots | 2004

Integration of a Rehabilitation Robotic System (KARES II) with Human-Friendly Man-Machine Interaction Units

Zeungnam Bien; Myung Jin Chung; Pyung Hun Chang; Dong-Soo Kwon; Dae-Jin Kim; Jeong-Su Han; Jae-Hean Kim; Do-Hyung Kim; Hyung-Soon Park; Sang Hoon Kang; Kyoobin Lee; Soo Chul Lim

In this paper, we report key results from the design and evaluation of a wheelchair-based robotic arm system named KARES II (KAIST Rehabilitation Engineering Service System II), newly developed for people with disabilities. KARES II is designed around tasks identified as necessary for the target users, namely people with spinal cord injury. First, we defined twelve important tasks from extensive interviews and questionnaires. Based on these tasks, all subsystems were then designed, simulated, and developed. A robotic arm with active compliance and intelligent visual servoing capability was built using a cable-driven mechanism. Various kinds of human-robot interfaces were developed to provide a broad range of services according to the user's level of disability; eye-mouse, shoulder/head, and EMG signal-based control subsystems serve this purpose. We also describe the integration of the KARES II system and discuss the user trials. The subsystems are installed on two main platforms: a mobile platform and a wheelchair platform. To evaluate KARES II in a real-world setting, we performed user trials with six selected potential end users with spinal cord injury.


Intelligent Robots and Systems | 2002

Human-friendly interfaces of wheelchair robotic system for handicapped persons

Jae-Woong Min; Kyoobin Lee; Soo Chul Lim; Dong-Soo Kwon

With the increasing number of people with disabilities, there is a growing demand for human-friendly interfaces for mobility aids. To meet this need, we developed two interfaces that use shoulder and head motion to control a powered wheelchair. To obtain suitable wheelchair control signals, the workspaces of the shoulder and head were analyzed with a magnetic position sensor. The two interfaces were designed to meet four guidelines: human-friendly design, ease of wearing, intuitive driving, and low cost. Force-sensing resistors (FSRs) are used to measure changes in shoulder and head motion. The usefulness of the interfaces was verified in a clinical experiment with six subjects with C4 or C5 spinal cord injuries.


Journal of Micromechanics and Microengineering | 2014

Development of a flexible three-axis tactile sensor based on screen-printed carbon nanotube-polymer composite

Soonjae Pyo; Jae Ik Lee; Min Ook Kim; Taeyoung Chung; Yongkeun Oh; Soo Chul Lim; Joonah Park; Jongbaeg Kim

A flexible, three-axis carbon nanotube (CNT)–polymer composite-based tactile sensor is presented. The proposed sensor consists of a flexible substrate, four sensing cells, and a bump structure. A CNT–polydimethylsiloxane (PDMS) composite is produced by a solvent evaporation method so that the CNTs are well dispersed within the PDMS matrix. The composite is directly patterned onto a flexible substrate using a screen printing technique to fabricate a sensor with four sensing cells. When a force is applied to the bump, the magnitude and direction of the force can be detected by comparing the changes in the electrical resistance of each sensing cell, caused by the piezoresistive effect of the composite. The experimentally verified sensing characteristics of the fabricated sensor exhibit a linear relationship between the resistance change and the applied force, and the measured sensitivities for normal and shear forces are 6.67 and 86.7 %/N for forces up to 2.0 and 0.5 N, respectively. Repeatability experiments show a maximum deviation of 2.00% in the resistance change within the tested force range.
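As a concrete illustration of this readout, the sketch below decodes normal and shear force from the four cells' fractional resistance changes, taking the published sensitivities as linear calibration constants. The cell ordering and the common-mode/differential decomposition are illustrative assumptions, not the authors' calibration procedure.

```python
# Hypothetical differential readout for a four-cell, three-axis tactile sensor.
# Assumes cells are ordered [+x, -x, +y, -y] around the bump and respond
# linearly with the sensitivities reported in the abstract.

S_NORMAL = 6.67   # resistance change per newton for normal force, %/N
S_SHEAR = 86.7    # resistance change per newton for shear force, %/N

def decode_force(dr):
    """Estimate (fx, fy, fz) in newtons from the fractional resistance
    changes dr (%) of the four sensing cells, ordered [+x, -x, +y, -y]."""
    rx_pos, rx_neg, ry_pos, ry_neg = dr
    fz = (rx_pos + rx_neg + ry_pos + ry_neg) / 4.0 / S_NORMAL  # common mode -> normal
    fx = (rx_pos - rx_neg) / 2.0 / S_SHEAR                     # differential -> x shear
    fy = (ry_pos - ry_neg) / 2.0 / S_SHEAR                     # differential -> y shear
    return fx, fy, fz

print(decode_force([13.3, 13.3, 13.3, 13.3]))  # pure ~2 N normal load
print(decode_force([49.0, -36.0, 6.5, 6.5]))   # mixed normal + x shear
```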


Human Factors in Computing Systems | 2012

Evaluation of human tangential force input performance

Bhoram Lee; Hyun-Jeong Lee; Soo Chul Lim; Hyung-Kew Lee; Seungju Han; Joonah Park

While interacting with mobile devices, users may press against touch screens and also exert tangential force on the display in a sliding manner. We seek to guide UI design based on the tangential force applied by a user to the surface of a hand-held device. A prototype interface using tangential force input was implemented with a force-sensitive layer and an elastic layer and was used for the user experiment. We investigated users' ability to reach and maintain target force levels and considered the effects of hand pose and direction of force input. Our results show no significant difference in performance between applying force while holding the device in one hand and in two hands. We also observed that users experience greater physical and perceived load when applying tangential force in the left-right direction than in the up-down direction. Based on the experimental results, we discuss considerations for user interface applications of tangential-force-based interfaces.


Sensors | 2015

Expansion of Smartwatch Touch Interface from Touchscreen to Around Device Interface Using Infrared Line Image Sensors

Soo Chul Lim; Jungsoon Shin; Seung-Chan Kim; Joonah Park

Touchscreen interaction has become a fundamental means of controlling mobile phones and smartwatches. However, the small form factor of a smartwatch limits the available interactive surface area. To overcome this limitation, we propose expanding the touch region of the screen to the back of the user's hand. We developed a touch module that senses the touched finger position on the back of the hand using infrared (IR) line image sensors, based on the calibrated IR intensity and the maximum-intensity region of an IR array. For a complete touch-sensing solution, a gyroscope installed in the smartwatch is used to read wrist gestures. The gyroscope data are processed with a dynamic time warping (DTW) gesture recognition algorithm to eliminate unintended touch inputs during free wrist motion while wearing the smartwatch. A prototype of the sensing module was implemented in a commercial smartwatch, and we confirmed that the sensed position of a finger touching the back of the hand can be used to control the smartwatch graphical user interface. Our system not only affords a novel experience for smartwatch users, but also provides a basis for developing other useful interfaces.
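The rejection of unintended input hinges on matching gyroscope traces against gesture templates with DTW. Below is a minimal sketch of that matching step; the feature choice (raw three-axis gyroscope samples), the template set, and the rejection threshold are assumptions for illustration, not the paper's tuned values.

```python
import math

def dtw_distance(a, b):
    """Dynamic time warping distance between two sequences of
    3-axis gyroscope samples (each sample is an (x, y, z) tuple)."""
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(a[i - 1], b[j - 1])   # Euclidean sample distance
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

def classify(signal, templates, reject_above=50.0):
    """Return the best-matching gesture label, or None when the wrist
    motion matches no template well enough (treated as unintended input)."""
    label, best = None, float("inf")
    for name, template in templates.items():
        dist = dtw_distance(signal, template)
        if dist < best:
            label, best = name, dist
    return label if best <= reject_above else None
```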


International Journal of Medical Robotics and Computer Assisted Surgery | 2015

Role of combined tactile and kinesthetic feedback in minimally invasive surgery

Soo Chul Lim; Hyung-Kew Lee; Joonah Park

Haptic feedback is of critical importance in surgical tasks. However, conventional surgical robots do not provide haptic feedback to surgeons during surgery. Thus, in this study, a combined tactile and kinesthetic feedback system was developed to provide haptic feedback to surgeons during robotic surgery.


Advanced Robotics | 2014

Tactile display with tangential and normal skin displacement for robot-assisted surgery

Soo Chul Lim; Hyung-Kew Lee; Eunhyup Doh; Kwang-Seok Yun; Joonah Park

This paper proposes a tactile display providing both shear and normal feedback to the fingertip, generating three-axis tactile feedback during teleoperation of a surgical robot. The display is composed of five balloons actuated by controlled pneumatic pressure. The implemented display measures 18 mm × 18 mm × 15 mm, a size suitable for mounting on the master controls of a surgical robot. The maximum normal and shear displacements are 2 and 1.3 mm, respectively. The proposed tactile display can provide perceivable stimuli to a human finger pad in all five directions: normal, distal, proximal, radial, and ulnar. This paper also reports psychophysical measurements of the minimum perceivable movement of the developed tactile display in each of the five directions.


Sensors | 2017

Inferring Interaction Force from Visual Information without Using Physical Force Sensors

Wonjun Hwang; Soo Chul Lim

In this paper, we present an interaction force estimation method that uses visual information rather than a physical force sensor. Specifically, we propose a novel deep learning-based method that uses only sequential images to estimate the interaction force against a target object whose shape is changed by an external force. The force applied to the target can be estimated from the visual shape changes. However, the shape differences between images are subtle. To address this problem, we formulate a recurrent neural network-based deep model with fully-connected layers, which models complex temporal dynamics from the visual representations. Extensive evaluations show that the proposed models successfully estimate the interaction forces using only the corresponding sequential images, in particular for objects made of different materials: a sponge, a PET bottle, a human arm, and a tube. The forces predicted by the proposed method closely match those measured by force sensors.
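As a sketch of the model family described here, the PyTorch snippet below runs an LSTM over per-frame visual features and regresses the force with fully-connected layers. The encoder architecture, layer sizes, and scalar force output are assumptions for illustration; the paper's exact network is not reproduced.

```python
import torch
import torch.nn as nn

class VisualForceEstimator(nn.Module):
    """Toy recurrent force estimator: per-frame visual features -> LSTM -> FC head."""
    def __init__(self, feat_dim=256, hidden=128):
        super().__init__()
        # Small convolutional encoder standing in for the paper's visual representation.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(32 * 4 * 4, feat_dim),
        )
        self.rnn = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Sequential(nn.Linear(hidden, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, frames):                       # frames: (B, T, 3, H, W)
        b, t = frames.shape[:2]
        feats = self.encoder(frames.flatten(0, 1))   # (B*T, feat_dim)
        out, _ = self.rnn(feats.view(b, t, -1))      # temporal dynamics over frames
        return self.head(out[:, -1])                 # force estimate at the last frame

model = VisualForceEstimator()
pred = model(torch.randn(2, 8, 3, 64, 64))  # two clips of eight 64x64 frames
print(pred.shape)                           # torch.Size([2, 1])
```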


Advanced Robotics | 2011

Presentation of Surface Height Profiles Based on Frequency Modulation at Constant Amplitude Using Vibrotactile Elements

Soo Chul Lim; Ki-Uk Kyung; Dong-Soo Kwon

This study investigates the effects of frequency modulation of vibration elements in representing tactile shape. Tactile shapes were rendered by frequency differences at constant amplitude through a 30-pin (5 × 6) tactile feedback array that stimulated the finger pad. Experiment I showed that participants perceive height changes when the frequency is modulated. In Experiment II, participants were asked to discriminate three basic tactile shape patterns generated with different frequencies at constant amplitude; the results proved that spatial height information can be represented by modulating temporal information. In Experiment III, the frequency modulation method was applied to a tactile mouse system: for a more practical application, dynamic frequency modulation under passive touch was used to transmit tactile height-pattern information at the mouse pointer. The results showed that participants were able to discern eight predefined shapes with an accuracy of 98.4% upon passive touch.
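A minimal sketch of this rendering rule, under assumed values: each pin in the 5 × 6 array vibrates at constant amplitude, and height is encoded only in the pin's drive frequency. The frequency range and the linear height-to-frequency mapping are illustrative, not the paper's calibrated parameters.

```python
import numpy as np

F_MIN, F_MAX = 20.0, 250.0   # assumed vibrotactile frequency range, Hz
AMPLITUDE = 1.0              # constant drive amplitude for every pin

def pin_frequencies(height_map):
    """Map normalized heights in [0, 1] on the 5 x 6 pin array to drive
    frequencies; amplitude stays fixed, so only temporal cues carry shape."""
    h = np.clip(np.asarray(height_map, dtype=float), 0.0, 1.0)
    return F_MIN + h * (F_MAX - F_MIN)

def drive_signal(freq_hz, t):
    """Constant-amplitude sinusoid for one pin at time samples t (seconds)."""
    return AMPLITUDE * np.sin(2 * np.pi * freq_hz * t)

ramp = np.tile(np.linspace(0.0, 1.0, 6), (5, 1))  # an inclined plane across the array
print(pin_frequencies(ramp).round(1))
```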


International Conference on Human Haptic Sensing and Touch Enabled Computer Applications | 2010

Physical contact of devices: utilization of beats for interpersonal communication

Soo Chul Lim; Seung-Chan Kim; Jung-Hoon Hwang; Dong-Soo Kwon

This paper proposes an interpersonal communication method based on the tactile beats that arise when devices are brought into physical contact. Beats are a well-known phenomenon in which amplitude modulation occurs when vibrating objects with similar, but not identical, operating frequencies are physically connected. Vibration signals in each configuration (no contact, and contact with the same or different frequencies) were measured using a laser vibrometer. A preliminary user study revealed that the induced tactile stimulus was perceived well by the subjects. As an application, we describe social touch interaction using hand-held devices assigned different operating vibration frequencies. By mapping the frequency deviation between devices to quantifiable social information, the proposed method is expected to support interpersonal communication based on physical contact, with enhanced emotional and social experiences.
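The underlying physics is simple enough to sketch: summing two sinusoids at nearby frequencies yields a carrier at the mean frequency whose amplitude envelope pulses at the difference frequency. The specific frequencies below are illustrative assumptions, not the devices' actual operating points.

```python
import numpy as np

f1, f2 = 250.0, 256.0  # assumed operating frequencies of the two devices, Hz
t = np.linspace(0.0, 1.0, 8000, endpoint=False)

# Superposition of the two devices' vibrations once they are in contact.
combined = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# Equivalent form: 2 * cos(pi * (f1 - f2) * t) * sin(pi * (f1 + f2) * t),
# i.e. a (f1 + f2) / 2 Hz carrier whose envelope pulses at |f1 - f2| Hz.
print(f"perceived beat frequency: {abs(f1 - f2):.1f} Hz")
```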
