Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Ming-Sui Lee is active.

Publication


Featured research published by Ming-Sui Lee.


Human Factors in Computing Systems | 2010

Touching the void: direct-touch interaction for intangible displays

Liwei Chan; HuiShan Kao; Mike Y. Chen; Ming-Sui Lee; Jane Yung-jen Hsu; Yi-Ping Hung

In this paper, we explore the challenges of applying direct-touch interaction to intangible displays and investigate methodologies for improving it. Direct-touch interaction simplifies object manipulation because it combines input and display into a single integrated interface. While direct-touch technology on traditional tangible displays is commonplace, similar direct-touch interaction within an intangible display paradigm presents many challenges. Given the lack of tactile feedback, direct-touch interaction on an intangible display may perform poorly even on the simplest target acquisition tasks. To study this problem, we created a prototype intangible display. In an initial study, we collected user discrepancy data on the interpreted 3D location of targets shown on the display. The results showed that participants performed poorly in determining the z-coordinate of the targets and were imprecise in executing screen touches: thirty percent of positioning operations had errors larger than 30 mm from the actual surface. This finding motivated a second study, in which we quantified task time in the presence of visual and audio feedback. Pseudo-shadow visual feedback proved helpful in improving both user performance and satisfaction.
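
The error analysis above comes down to measuring how far each recorded touch lands from the intangible display surface. The sketch below, written against hypothetical touch data and assuming the display plane sits at a known depth, shows one way such statistics (mean depth error and the share of touches beyond 30 mm) could be computed; it is not the authors' code.

```python
import numpy as np

def touch_error_stats(touch_points, display_z=0.0, threshold_mm=30.0):
    """Summarize depth errors of touch points relative to an intangible
    display surface assumed to lie at z = display_z (millimetres).

    touch_points: array-like of shape (N, 3) holding (x, y, z) in millimetres.
    Returns the mean absolute z-error and the fraction of touches whose
    error exceeds threshold_mm.
    """
    z_err = np.abs(np.asarray(touch_points, dtype=float)[:, 2] - display_z)
    return z_err.mean(), float((z_err > threshold_mm).mean())

# Hypothetical example: three touches, one of them 45 mm off the surface.
mean_err, frac_over = touch_error_stats([[10, 20, 5], [0, 0, -12], [30, 5, 45]])
print(f"mean |z| error = {mean_err:.1f} mm, fraction > 30 mm = {frac_over:.2f}")
```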


International Conference of the IEEE Engineering in Medicine and Biology Society | 2012

Noncontact respiratory measurement of volume change using depth camera

Meng-Chieh Yu; Jia-Ling Liou; Shuenn-Wen Kuo; Ming-Sui Lee; Yi-Ping Hung

In this study, a system is developed to measure human chest wall motion for respiratory volume estimation without any physical contact. Based on a depth image sensing technique, respiratory volume is estimated by measuring morphological changes of the chest wall. We evaluated the system against a standard reference device, and the results show strong agreement in respiratory volume measurement (correlation coefficient r = 0.966). The isovolume test shows small variation in total respiratory volume during the isovolume maneuver (standard deviation < 107 ml). A regional pulmonary measurement test was then performed on a patient, and the results show a visible difference in pulmonary function between the diseased and contralateral sides of the thorax after thoracotomy. This approach has strong potential for personal health care and preventive medicine, as it provides a novel, low-cost, and convenient way to measure a user's respiratory volume.
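
The volume estimate described above is driven by morphological changes of the chest wall as seen in the depth images. A minimal sketch of that idea follows, assuming a calibrated depth map in millimetres, a chest region-of-interest mask, and a known per-pixel footprint; the correlation check mirrors the paper's comparison against a reference device but uses made-up numbers. This is an illustration, not the authors' implementation.

```python
import numpy as np

def chest_volume_change(depth_ref, depth_now, roi, pixel_area_mm2):
    """Estimate chest-wall volume change (millilitres) between a reference
    depth frame and the current frame over a chest region of interest.

    depth_ref, depth_now: 2-D arrays of depth in millimetres (camera to body).
    roi: boolean mask selecting chest-wall pixels.
    pixel_area_mm2: physical area covered by one pixel at the chest distance.
    A decrease in depth means the chest moved toward the camera (inhalation).
    """
    displacement_mm = (depth_ref - depth_now)[roi]        # per-pixel chest motion
    volume_mm3 = displacement_mm.sum() * pixel_area_mm2   # sum of displaced columns
    return volume_mm3 / 1000.0                            # mm^3 -> ml

# Hypothetical comparison against a reference device (made-up values);
# the paper reports r = 0.966 on real data.
est = np.array([480.0, 510.0, 495.0, 505.0])
ref = np.array([500.0, 520.0, 500.0, 515.0])
r = np.corrcoef(est, ref)[0, 1]
print(f"correlation with reference: r = {r:.3f}")
```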


Virtual Reality Software and Technology | 2007

Gesture-based interaction for a magic crystal ball

Liwei Chan; Yi-Fan Chuang; Meng-Chieh Yu; Yi-liu Chao; Ming-Sui Lee; Yi-Ping Hung; Jane Yung-jen Hsu

Crystal balls are popularly regarded as media for divination or fortune-telling, an image drawn largely from fantasy films and fiction in which an augur sees into the past, the present, or the future through a crystal ball. These strong associations make the crystal ball a natural interface for accessing and manipulating visual media in an intuitive, imaginative, and playful manner. We developed an interactive visual display system named Magic Crystal Ball (MaC Ball). MaC Ball is a spherical display system that allows users to see a virtual object or scene appearing inside a transparent sphere and to manipulate the displayed content with bare-handed interactions, giving the feeling of acting with magic power. With MaC Ball, users can manipulate the display through touch and hover interactions: waving a hand above the ball causes clouds to blow up from the bottom of the ball, sliding fingers on the ball rotates the displayed object, and pressing with a single finger selects an object or triggers a button. MaC Ball takes advantage of the impressions associated with crystal balls, letting users act on visual media as their imagination suggests. MaC Ball has high potential for advertising and demonstration in museums, product launches, and other venues.


Biomedical Engineering Systems and Technologies | 2012

Multiparameter Sleep Monitoring Using a Depth Camera

Meng-Chieh Yu; Huan Wu; Jia-Ling Liou; Ming-Sui Lee; Yi-Ping Hung

In this study, a depth analysis technique was developed to monitor a user's breathing rate, sleep position, and body movement during sleep without any physical contact. A cross-section method was proposed to detect the user's head and torso from the sequence of depth images. In the experiment, eight participants were asked to change sleep position (supine and side-lying) every fifteen breathing cycles while lying on a bed. The results showed that the proposed method is promising for detecting the head and torso across various sleeping postures and body shapes. In addition, a realistic overnight sleep monitoring experiment was conducted; the results demonstrated that the system is promising for monitoring sleep under realistic conditions, and its measurement accuracy was better than in the first experiment. This study is significant in providing a non-contact technology that measures multiple sleep conditions and assists users in better understanding their sleep quality.
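
Breathing-rate extraction from depth video can be illustrated by tracking the mean depth of the detected torso region over time and counting inhalation peaks. The sketch below assumes such a per-frame torso-depth signal is already available (the paper's cross-section head/torso detection is not reproduced) and uses SciPy's peak finding; it is a simplified stand-in, not the authors' method.

```python
import numpy as np
from scipy.signal import find_peaks

def breathing_rate_bpm(torso_depth_mean, fps, min_period_s=2.0):
    """Estimate breathing rate (breaths per minute) from the mean depth of a
    torso region over time. Inhalations appear as peaks in the negated depth
    signal because the chest moves toward the camera.

    torso_depth_mean: 1-D array with one mean-depth sample per frame (mm).
    fps: frame rate of the depth camera.
    min_period_s: assumed minimum time between breaths, to reject noise.
    """
    signal = -np.asarray(torso_depth_mean, dtype=float)   # peaks = inhalations
    signal -= signal.mean()
    peaks, _ = find_peaks(signal, distance=max(1, int(min_period_s * fps)))
    duration_min = len(signal) / fps / 60.0
    return len(peaks) / duration_min if duration_min > 0 else 0.0
```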


IEEE Transactions on Biomedical Engineering | 2012

Multimedia-Assisted Breathwalk-Aware System

Meng-Chieh Yu; Huan Wu; Ming-Sui Lee; Yi-Ping Hung

Breathwalk is a practice that combines specific patterns of footsteps synchronized with breathing. In this study, we developed a multimedia-assisted Breathwalk-aware system that detects the user's walking and breathing conditions and provides appropriate multimedia guidance on a smartphone. Through the mobile device, the system enhances the user's awareness of walking and breathing behaviors. As an example application in slow technology, the system could help meditation beginners learn "walking meditation," a type of meditation that aims to take each pace as slowly as possible, to synchronize footsteps with breathing, and to land every footstep toes first. In a pilot study, we developed a walking-aware system and evaluated whether the multimedia-assisted mechanism can enhance beginners' walking awareness during walking meditation. Experimental results show that it effectively assists beginners in slowing their walking speed and reducing incorrect footsteps. In the second experiment, we evaluated the Breathwalk-aware system to find a better feedback mechanism for learning Breathwalk techniques during walking meditation. The results show that the combined visual-auditory mechanism is a better multimedia-assisted mechanism for walking meditation than either the visual or the auditory mechanism alone.
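
One way to quantify the footstep-breath synchronization that the system above monitors is to measure how close each step lands to a breathing-phase boundary. The sketch below is a hypothetical illustration with invented timestamps and an assumed tolerance; it is not the paper's detection pipeline.

```python
import numpy as np

def step_breath_sync_ratio(step_times, breath_phase_times, tolerance_s=0.4):
    """Fraction of footsteps landing within tolerance_s seconds of a
    breathing-phase boundary (e.g. the start of an inhale or exhale).

    step_times, breath_phase_times: 1-D arrays of timestamps in seconds.
    A higher ratio means walking and breathing are better synchronized.
    """
    steps = np.asarray(step_times, dtype=float)
    phases = np.asarray(breath_phase_times, dtype=float)
    # distance from each step to the nearest breathing-phase boundary
    nearest = np.min(np.abs(steps[:, None] - phases[None, :]), axis=1)
    return float((nearest <= tolerance_s).mean())

# Hypothetical session: four steps against breaths starting every 3 seconds.
print(step_breath_sync_ratio([0.1, 3.2, 6.9, 9.1], [0.0, 3.0, 6.0, 9.0]))  # 0.75
```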


IEEE International Conference on Ubi-Media Computing | 2008

QPalm: A gesture recognition system for remote control with list menu

Yu-Hsin Chang; Liwei Chan; Ju-Chun Ko; Ming-Sui Lee; Jane Hsu; Yi-Ping Hung

The coming ubiquity of digital media content is driving the need for better interaction between people and media. In this work, we propose a novel interaction technique, QPalm, which allows the user to control media via a list menu shown on a distant display by drawing circles in the air with one hand. To manipulate a list menu remotely, QPalm includes two basic functions, browsing and choosing, realized by recognizing the user's palm performing circular and push motions in the air. The circular motion provides fluidity in scrolling a menu up and down, while the push motion is an intuitive way to choose an item during a circular motion. Based on this design, we developed a vision system built on a stereo camera that tracks the user's palm without interference from bystanders behind or next to the operating user. More specifically, the contributions of this work include: (1) an intuitive interaction technique, QPalm, for remote control with a list menu, and (2) a palm tracking algorithm to support QPalm that uses only depth and motion information from the images, for practical deployment.
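
The two QPalm gestures could, in principle, be separated by inspecting a short palm trajectory: a push shows a rapid decrease in depth toward the camera, while browsing shows a large angle swept in the image plane. The heuristic below is a rough illustrative sketch under those assumptions, with made-up thresholds; it is not the palm-tracking algorithm described in the paper.

```python
import numpy as np

def classify_palm_motion(trajectory, push_depth_mm=80.0, min_angle_rad=3.0):
    """Rough classifier for the two QPalm gestures from a short palm
    trajectory of (x, y, z) positions in millimetres (z = distance to camera).

    Returns "push" if the palm moved sharply toward the camera, "circle" if
    the x-y path swept a large accumulated angle around its centroid, and
    None otherwise. Illustrative heuristic only.
    """
    traj = np.asarray(trajectory, dtype=float)
    if traj[0, 2] - traj[-1, 2] > push_depth_mm:       # depth dropped quickly
        return "push"
    xy = traj[:, :2] - traj[:, :2].mean(axis=0)        # centre the x-y path
    angles = np.unwrap(np.arctan2(xy[:, 1], xy[:, 0]))
    if abs(angles[-1] - angles[0]) > min_angle_rad:    # swept most of a circle
        return "circle"
    return None
```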


Conference on Multimedia Modeling | 2011

i-m-Breath: the effect of multimedia biofeedback on learning abdominal breath

Meng-Chieh Yu; Jin-Shing Chen; King-Jen Chang; Hsu Sc; Ming-Sui Lee; Yi-Ping Hung

Breathing is a natural and important exercise for human beings, and the right breathing method can make people healthier and even happier. i-m-Breath was developed to assist users in learning abdominal breathing; it uses Respiration Girth Sensors (RGS) to measure the user's breathing pattern and provides visual feedback to support the learning process. In this paper, we study the effect of the biofeedback mechanism on learning abdominal breathing. We cooperated with the College of Medicine at National Taiwan University to conduct experiments exploring whether the biofeedback mechanism affects the learning of abdominal breathing. The results showed that i-m-Breath helps people shift their breathing habit from chest breathing to abdominal breathing, and the system will be used in the hospital in the future. Finally, this study is significant in providing a biofeedback mechanism that assists users in better understanding their breathing pattern and improving their breathing habits.
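
The distinction between chest and abdominal breathing that i-m-Breath visualizes can be illustrated by comparing the expansion amplitudes of two girth sensors over one breath cycle. The sketch below is a simplified, hypothetical classifier, not the system's actual feedback logic.

```python
import numpy as np

def breathing_style(chest_girth, abdomen_girth):
    """Label one breath cycle as "abdominal" or "chest" breathing by comparing
    the expansion amplitude measured by two Respiration Girth Sensors.

    chest_girth, abdomen_girth: 1-D arrays of girth-sensor readings covering
    the same breath (arbitrary but consistent units).
    """
    chest_amp = np.ptp(np.asarray(chest_girth, dtype=float))      # peak-to-peak
    abdomen_amp = np.ptp(np.asarray(abdomen_girth, dtype=float))  # expansion
    return "abdominal" if abdomen_amp > chest_amp else "chest"
```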


Conference on Multimedia Modeling | 2010

Transformational breathing between present and past: virtual exhibition system of the Mao-Kung Ting

Chun-Ko Hsieh; Xin Tong; Yi-Ping Hung; Chia-Ping Chen; Ju-Chun Ko; Meng-Chieh Yu; Han-Hung Lin; Szu-Wei Wu; Yi-Yu Chung; Liang-Chun Lin; Ming-Sui Lee; Chu-Song Chen; Jiaping Wang; Quo-Ping Lin; I-Ling Liu

The Mao-Kung Ting is one of the most precious artifacts in the National Palace Museum. With a five-hundred-character inscription cast inside, the Mao-Kung Ting is regarded as a very important historical document, dating back to 800 B.C. Motivated by the goal of revealing the great nature of the artifact and interpreting it as a meaningful narrative, we propose an innovative Virtual Exhibition System to facilitate communication between the Mao-Kung Ting and its audience. We develop the Virtual Exhibition System around the following scenarios: "Breathing through the History" and "View-dependent Display".


International Conference on Health Informatics | 2012

Breath and position monitoring during sleeping with a depth camera

Meng-Chieh Yu; Huan Wu; Jia-Ling Liou; Ming-Sui Lee; Yi-Ping Hung


International Conference on Pattern Recognition | 2012

Human action recognition using Action Trait Code

Shih-Yao Lin; Chuen-Kai Shie; Shen-Chi Chen; Ming-Sui Lee; Yi-Ping Hung

Collaboration


Dive into Ming-Sui Lee's collaborations.

Top Co-Authors

Yi-Ping Hung, National Taiwan University
Meng-Chieh Yu, National Taiwan University
Mike Y. Chen, National Taiwan University
Huan Wu, National Taiwan University
Jia-Ling Liou, National Taiwan University
Ju-Chun Ko, National Taiwan University
Liwei Chan, National Chiao Tung University
Shih-Yao Lin, National Taiwan University
Han-Hung Lin, National Taiwan University
Jane Yung-jen Hsu, National Taiwan University