Publication


Featured research published by Yu-Hsuan Huang.


international conference on computer graphics and interactive techniques | 2015

Scope+: a stereoscopic video see-through augmented reality microscope

Yu-Hsuan Huang; Tzu-Chieh Yu; Pei-Hsuan Tsai; Yu-Xiang Wang; Wan-Ling Yang; Ming Ouhyoung

When using a conventional stereo microscope, users often have to make repetitive head movements to retrieve information away from the microscope. During microsurgery, this also raises the risks of surgeons losing focus and of increased fatigue. Therefore, Scope+, a stereoscopic video see-through augmented reality system, was created not only to solve the problems mentioned above but also to improve the user's stereo microscopy experience.


human factors in computing systems | 2018

CatAR: A Novel Stereoscopic Augmented Reality Cataract Surgery Training System with Dexterous Instruments Tracking Technology

Yu-Hsuan Huang; Hao-Yu Chang; Wan-Ling Yang; Yu-Kai Chiu; Tzu-Chieh Yu; Pei-Hsuan Tsai; Ming Ouhyoung

We propose CatAR, a novel stereoscopic augmented reality (AR) cataract surgery training system. It provides dexterous instrument tracking using a specially designed infrared optical system with 2 cameras and 1 reflective marker. The tracking accuracy at the instrument tip is 20 µm, much higher than that of previous simulators. Moreover, our system allows trainees to use and see real surgical instruments while practicing. Five training modules with 31 parameters were designed, and 28 participants were enrolled to conduct efficacy and validity tests. The results revealed significant differences between novice and experienced surgeons, and improvements in surgical skills after practicing with CatAR were also significant.
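A two-camera, single-marker setup like the one described implies recovering the marker's 3D position by triangulating the viewing rays from both cameras. The paper does not publish its algorithm, so the following is only an illustrative sketch of classic midpoint triangulation; the function name and the geometry in the example are hypothetical, not CatAR's implementation.

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Estimate a 3D point from two camera rays.

    o1, o2 -- camera centers; d1, d2 -- ray directions toward the marker.
    Returns the midpoint of the shortest segment connecting the two rays
    (the standard closed-form closest-point solution for two skew lines).
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b          # approaches 0 when the rays are parallel
    s = (b * e - c * d) / denom    # parameter along ray 1
    t = (a * e - b * d) / denom    # parameter along ray 2
    p1 = o1 + s * d1               # closest point on ray 1
    p2 = o2 + t * d2               # closest point on ray 2
    return (p1 + p2) / 2
```

With calibrated cameras, each ray would come from a camera center and the back-projected pixel of the detected marker; noisy detections make the two rays skew, which is why the midpoint (rather than an exact intersection) is used.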


international conference on computer graphics and interactive techniques | 2017

Cinematography tutorials in virtual reality

Yu-Kai Chiu; Yu-Hsuan Huang; Ming Ouhyoung

Traditional cinematography tutorials often separate theory from practical experience: the theory is taught first, and students must then practice on their own. However, cinematography equipment is costly and unaffordable for most students. In this poster, we introduce a virtual reality tutorial system for cinematography. The system contains hands-on drills and simulations of cinematography equipment, so users can learn cinematography in an immersive way and gain hands-on experience at the same time. Our system can also be adapted into an assisting tool for production use, not just a tutorial.


international conference on computer graphics and interactive techniques | 2017

KidPen: a stroke-based method for kid-style sketches synthesis from photos

Wan-Ling Yang; Mei-Yun Chen; Hong-Shiang Ling; Yu-Hsuan Huang; Ming Ouhyoung

Drawings by children usually have a unique charm due to their naïve and untutored styles. To make kid-style art easy to produce, we propose KidPen, a method that can transform realistic photos into kid-style sketches. Synthesizing kid-style sketches is challenging because children often draw objects with large shape changes and content simplification. We propose a stroke composition method based on a general cognitive process of human copy-drawing, so the system is not restricted to specific object categories. A perceptual study shows no significant difference in naturalness between the sketches synthesized by our method and children's drawings.


international conference on computer graphics and interactive techniques | 2017

AR filming: augmented reality guide for compositing footage in filmmaking

Yu-Kai Chiu; Yi-Lung Kao; Yu-Hsuan Huang; Ming Ouhyoung

Compositing multiple pieces of footage into one video typically requires a robotic arm, because the camera motion model needs to be precise; shooting such footage with a hand-held camera is extremely difficult. However, the cost of a robotic arm is extremely high, so we introduce an augmented reality guiding system to replace it. In our system, we utilize augmented reality to guide the user's camera motion and implement an algorithm for stabilization and camera motion alignment for a hand-held camera. The system reduces cost while maintaining good quality in the result.


user interface software and technology | 2016

A Novel Real Time Monitor System of 3D Printing Layers for Better Slicing Parameter Setting

Yu-Kai Chiu; Hao-Yu Chang; Wan-Ling Yang; Yu-Hsuan Huang; Ming Ouhyoung

We propose a novel real-time monitoring system for 3D printers with dual cameras, which capture and reconstruct the printed result layer by layer. With the reconstructed image, we can apply computer vision techniques to evaluate the difference from the ideal path generated from the G-code. The difference gives clues for classifying the possible causes of a flawed result, so the system can advise the user on better slicing parameter settings. We believe this system can help beginners and other 3D printer users who struggle with parameter settings.
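The comparison step can be pictured as rasterizing the ideal G-code path into a binary mask and scoring its overlap with the binarized captured layer image. The sketch below is not the paper's implementation; the simple line rasterizer and the intersection-over-union score are illustrative stand-ins for whatever comparison the authors actually used.

```python
import numpy as np

def path_mask(segments, shape, width=1):
    """Rasterize ideal tool-path segments ((x0, y0), (x1, y1)) into a mask."""
    mask = np.zeros(shape, dtype=bool)
    for (x0, y0), (x1, y1) in segments:
        n = int(max(abs(x1 - x0), abs(y1 - y0))) + 1
        xs = np.linspace(x0, x1, n).round().astype(int)
        ys = np.linspace(y0, y1, n).round().astype(int)
        for x, y in zip(xs, ys):
            # Stamp a small square around each sample to give the path width.
            mask[max(0, y - width):y + width + 1,
                 max(0, x - width):x + width + 1] = True
    return mask

def layer_iou(printed, ideal):
    """Intersection-over-union between the observed layer and the ideal path."""
    inter = np.logical_and(printed, ideal).sum()
    union = np.logical_or(printed, ideal).sum()
    return inter / union if union else 1.0
```

A low score on a layer would flag a deviation; which slicing parameter is to blame would then need a separate classification step, as the abstract describes.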


international conference on computer graphics and interactive techniques | 2016

A modified Wheatstone-style head-mounted display prototype for narrow field-of-view video see-through augmented reality

Pei-Hsuan Tsai; Yu-Hsuan Huang; Yu-Ju Tsai; Hao-Yu Chang; Masatoshi Chang-Ogimoto; Ming Ouhyoung

Users often have a poor experience with general virtual reality head-mounted displays (HMDs) because of the low pixel density through the optical lenses. For this reason, a narrow field of view (FoV) with high pixel density is the main goal we pursue for near-field video see-through augmented reality (AR) applications with sophisticated operations, such as biological observation with an AR microscope (e.g., Scope+ [Huang et al. 2015]), AR surgery simulation, and telescope applications. Resolution high enough to see tiny objects clearly is therefore the most important concern in this paper.
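The trade-off described above, where a fixed panel resolution spread over a narrower FoV yields higher angular pixel density, can be made concrete with a back-of-the-envelope calculation. The display numbers below are hypothetical, not the prototype's specifications.

```python
def pixels_per_degree(h_pixels: int, h_fov_deg: float) -> float:
    """Average angular pixel density across the horizontal field of view."""
    return h_pixels / h_fov_deg

# The same hypothetical 1080-pixel-wide panel, used two ways:
wide = pixels_per_degree(1080, 90.0)    # wide-FoV VR HMD: 12 px/deg
narrow = pixels_per_degree(1080, 30.0)  # narrow-FoV design: 36 px/deg
```

Tripling the angular pixel density without changing the panel is exactly what narrowing the FoV buys, which is why it suits near-field tasks that demand fine detail.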


international conference on computer graphics and interactive techniques | 2016

ThirdEye: a coaxial feature tracking system for stereoscopic video see-through augmented reality

Yu-Xiang Wang; Yu-Ju Tsai; Yu-Hsuan Huang; Wan-Ling Yang; Tzu-Chieh Yu; Yu-Kai Chiu; Ming Ouhyoung

For a stereoscopic augmented reality (AR) system, continuous feature tracking of the observed target is required to place a virtual object in real-world coordinates. In addition, the dual cameras must be placed at a proper distance apart to obtain correct stereo images for video see-through applications. Both higher resolution and higher frame rate (FPS) can improve the user experience; however, feature tracking can become the bottleneck with high-resolution images, and latency increases if image processing is done before tracking.


international conference on computer graphics and interactive techniques | 2016

A novel dexterous instrument tracking system for augmented reality cataract surgery training system

Yu-Hsuan Huang; Wan-Ling Yang; Yi-Lung Kao; Yu-Kai Chiu; Ya-Bo Huang; Hao-Yu Chang; Ming Ouhyoung

Cataract surgery is one of the most common operations performed worldwide; computer-aided training systems are therefore important for reducing the risk that results from inexperienced surgical performance. We introduce a novel optical tracking method for the surgical instrument in a cataract surgery training system, which is equipped with infrared cameras on top of and inside the eye model. Our system is suitable for an augmented reality training environment and allows trainees to manipulate a real surgical tool. It uses only 2 cameras for tracking and provides high resolution at the tip of the surgical tool.


user interface software and technology | 2015

Scope+: A Stereoscopic Video See-Through Augmented Reality Microscope

Yu-Hsuan Huang; Tzu-Chieh Yu; Pei-Hsuan Tsai; Yu-Xiang Wang; Wan-Ling Yang; Ming Ouhyoung

When using a conventional stereo microscope, users need to move their heads away from the eyepieces repeatedly to access additional information, such as anatomical structures from an atlas. The same happens during microsurgery when surgeons want to check the patient's data again, and such disruptions can cost them their target and their concentration. To solve this critical problem and to improve the user experience of the stereo microscope, we present Scope+, a stereoscopic video see-through augmented reality system. Scope+ is designed for biological procedures, education, and surgical training. While performing biological procedures, for example the dissection of a frog, an anatomical atlas is shown inside the head-mounted display (HMD), overlaid onto the magnified images. For educational purposes, specimens are no longer silent under Scope+: when their body parts are pointed at with a marked stick, related animations or transparent-background videos merge with the real object and interact with observers. Surgeons who want to improve their microsurgical technique can practice with Scope+, which provides a full set of foot-pedal controls identical to those of a standard surgical microscope. Moreover, in cooperation with specially designed phantom models, this augmented reality system guides the user through key steps of an operation, such as continuous curvilinear capsulorhexis in cataract surgery. Because Scope+ adopts video see-through rather than optical see-through technology, remote observation via another Scope+ or a web application is possible. This feature can not only assist teachers during experiment classes but also help researchers keep their eyes on the observables after work. An array mode, powered by a motor-driven stage plate, allows users to load multiple samples at the same time; quick comparison between samples is possible by switching them with the foot pedal.

Collaboration


Dive into Yu-Hsuan Huang's collaborations.

Top Co-Authors

Ming Ouhyoung (National Taiwan University)
Wan-Ling Yang (National Taiwan University)
Yu-Kai Chiu (National Taiwan University)
Pei-Hsuan Tsai (National Taiwan University)
Tzu-Chieh Yu (National Taiwan University)
Hao-Yu Chang (National Taiwan University)
Yu-Xiang Wang (National Taiwan University)
Yi-Lung Kao (National Taiwan University)
Yu-Ju Tsai (National Taiwan University)
Hong-Shiang Ling (National Taiwan University)