Publication


Featured research published by Hyunseok Choi.


International Journal of Medical Robotics and Computer Assisted Surgery | 2016

An effective visualization technique for depth perception in augmented reality-based surgical navigation.

Hyunseok Choi; Byunghyun Cho; Ken Masamune; Makoto Hashizume; Jaesung Hong

Depth perception is a major issue in augmented reality (AR)-based surgical navigation. We propose an AR and virtual reality (VR) switchable visualization system with distance information, and evaluate its performance in a surgical navigation set-up.


Minimally Invasive Therapy & Allied Technologies | 2017

A portable surgical navigation device to display resection planes for bone tumor surgery

Hyunseok Choi; Yeongkyoon Park; Seongpung Lee; Hogun Ha; Sungmin Kim; Hwan Seong Cho; Jaesung Hong

Introduction: Surgical navigation has been used in musculoskeletal tumor surgical procedures to improve the precision of tumor resection. Despite the favorable attributes of navigation-assisted surgery, conventional systems do not display the resection margin in real time, and preoperative manual input is required. In addition, navigation systems are often expensive and complex, and this has limited their widespread use. In this study, we propose an augmented reality surgical navigation system that uses a tablet personal computer with no external tracking system. Material and methods: We realized a real-time safety margin display based on three-dimensional dilation. The resection plane induced by the safety margin is updated in real time according to the direction of sawing. The minimum separation between the saw and the resection plane is also calculated and displayed. The surgeon can resect bone tumors accurately by referring to the resection plane and the minimum separation updated in real time. Results: The effectiveness of the system was demonstrated with experiments on pig pelvises. When the desired resection margin was 10 mm, the measured resection margin was 9.85 ± 1.02 mm. Conclusions: The proposed method exhibits sufficient accuracy and convenience for use in bone tumor resection. It also has favorable practical applicability due to its low cost and portability.
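The real-time safety margin display rests on three-dimensional morphological dilation of the tumor model. A minimal sketch of that idea, assuming a hypothetical 1 mm isotropic voxel grid and a synthetic tumor mask (not the authors' implementation):

```python
import numpy as np
from scipy import ndimage

voxel_mm = 1.0    # assumed isotropic voxel size
margin_mm = 10.0  # desired safety margin, as in the paper

# Hypothetical binary tumor mask on a 40^3 voxel grid
tumor = np.zeros((40, 40, 40), dtype=bool)
tumor[15:25, 15:25, 15:25] = True

# Spherical structuring element with radius margin_mm / voxel_mm
r = int(round(margin_mm / voxel_mm))
zz, yy, xx = np.mgrid[-r:r + 1, -r:r + 1, -r:r + 1]
ball = (xx**2 + yy**2 + zz**2) <= r**2

# Dilating the tumor mask expands it isotropically by the margin;
# the surface of the dilated mask is the resection boundary to display.
with_margin = ndimage.binary_dilation(tumor, structure=ball)
print(tumor.sum(), with_margin.sum())
```

In practice the dilation can be precomputed once on the segmented tumor, leaving only the resection-plane update for the real-time loop.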


Cogent engineering | 2017

Effective calibration of an endoscope to an optical tracking system for medical augmented reality

Seongpung Lee; Hyunki Lee; Hyunseok Choi; Sangseo Jeon; Jaesung Hong

Background: We investigated the methods of calibrating an endoscope to an optical tracking system (OTS) for high accuracy augmented reality (AR)-based surgical navigation. We compared the possible calibration methods, and suggested the best method in terms of accuracy and speed in a medical environment. Material and methods: A calibration board with an attached OTS marker was used to acquire the pose data of the endoscope for the calibration. The transformation matrix from the endoscope to the OTS marker was calculated using the data. The calibration was performed by moving either the board or the endoscope in various placements. The re-projection error was utilized for evaluating the matrix. Results: From the statistical analysis, the method of moving the board was significantly more accurate than the method of moving the endoscope (p < 0.05). This difference resulted mainly from the uneven error distribution in the OTS measurement range and also the hand tremor in holding the endoscope. Conclusions: To increase the accuracy of AR, camera-to-OTS calibration should be performed by moving the board, and the board and the endoscope should be as close as possible to the OTS. This finding can contribute to improving the visualization accuracy in AR-based surgical navigation.
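The re-projection error used to evaluate the calibration matrix can be sketched as follows. All numbers here are hypothetical (intrinsics, board pose, and detected corners are synthetic), so this only illustrates the metric, not the study's setup:

```python
import numpy as np

# Assumed camera intrinsics (focal length and principal point in pixels)
K = np.array([[800., 0., 320.],
              [0., 800., 240.],
              [0., 0., 1.]])

# Hypothetical calibration result: board-to-camera rigid transform
T = np.eye(4)
T[:3, 3] = [0.0, 0.0, 500.0]   # board 500 mm in front of the camera

def project(K, T, pts3d):
    """Project Nx3 board points into pixel coordinates via T, then K."""
    pts_h = np.hstack([pts3d, np.ones((len(pts3d), 1))])
    cam = (T @ pts_h.T).T[:, :3]
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]

# Checkerboard corner grid (mm) and noisy "detected" pixel locations
grid = np.array([[x * 30., y * 30., 0.] for y in range(4) for x in range(5)])
rng = np.random.default_rng(0)
detected = project(K, T, grid) + rng.normal(0, 0.3, (20, 2))

# RMS re-projection error: the figure of merit for a calibration matrix
err = np.sqrt(np.mean(np.sum((project(K, T, grid) - detected) ** 2, axis=1)))
print(f"RMS re-projection error: {err:.2f} px")
```

A smaller RMS error over many board placements indicates a better camera-to-OTS transformation, which is how the board-moving and endoscope-moving methods were compared.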


Computer Assisted Radiology and Surgery | 2018

Perspective pinhole model with planar source for augmented reality surgical navigation based on C-arm imaging

Hogun Ha; Sangseo Jeon; Seongpung Lee; Hyunseok Choi; Jaesung Hong

Purpose: For augmented reality surgical navigation based on C-arm imaging, the accuracy of the augmented reality overlaid onto the X-ray image is imperative. However, overlay displacement is generated when a conventional pinhole model, which describes the geometric relationship of a normal camera, is adopted for C-arm calibration. Thus, a modified model for C-arm calibration is proposed to reduce this displacement, which is essential for accurate surgical navigation. Method: Based on an analysis of the displacement pattern generated for three-dimensional objects, we assumed that the displacement originated from movement of the X-ray source position with depth. In the proposed method, X-ray source movement was modeled as variable intrinsic parameters and represented in the pinhole model by replacing the point source with a planar source. Results: The improvement, a reduced displacement, was verified by comparing the overlay accuracy of augmented reality surgical navigation between the conventional and proposed methods. The proposed method achieved a more accurate overlay on the X-ray image in spatial position as well as in depth of the object volume. Conclusion: We validated that the intrinsic parameters describing the source position depend on depth for a three-dimensional object, and showed that the displacement can be reduced and made independent of depth by using the proposed planar source model.
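The core idea, that an X-ray source whose effective position shifts with depth behaves like a pinhole camera with depth-dependent intrinsics, can be sketched numerically. The linear drift coefficient and geometry below are assumptions for illustration, not the paper's fitted model:

```python
import numpy as np

SRC_DIST = 1000.0  # mm, assumed source-to-detector distance

def project_point_source(pt, src_xy=(0.0, 0.0)):
    """Central projection onto the detector plane z = 0 from a source at
    (src_xy, SRC_DIST): the conventional point-source pinhole geometry."""
    x, y, z = pt
    sx, sy = src_xy
    s = SRC_DIST / (SRC_DIST - z)          # magnification at depth z
    return np.array([sx + (x - sx) * s, sy + (y - sy) * s])

def project_planar_source(pt, drift=(0.01, 0.0)):
    """Planar-source variant: the effective source position shifts with
    the point's depth (the drift-per-mm coefficient is made up)."""
    x, y, z = pt
    sx, sy = drift[0] * z, drift[1] * z    # depth-dependent source offset
    return project_point_source(pt, (sx, sy))

# A point-source model fitted at one depth misplaces points at other
# depths; the planar model absorbs this displacement into the intrinsics.
near = project_planar_source((10.0, 0.0, 50.0))
far = project_planar_source((10.0, 0.0, 200.0))
print(near, far)
```

The same lateral position projects differently at 50 mm and 200 mm depth under the depth-dependent source, which is the displacement pattern the paper's calibration compensates for.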


Clinical Orthopaedics and Related Research | 2018

Can Augmented Reality Be Helpful in Pelvic Bone Cancer Surgery? An In Vitro Study

Hwan Seong Cho; Min Suk Park; Sanjay Gupta; Ilkyu Han; Han-Soo Kim; Hyunseok Choi; Jaesung Hong

Background Application of surgical navigation for pelvic bone cancer surgery may prove useful, but in addition to the fact that research supporting its adoption remains relatively preliminary, the actual navigation devices are physically large, occupying considerable space in already crowded operating rooms. To address this issue, we developed and tested a navigation system for pelvic bone cancer surgery assimilating augmented reality (AR) technology to simplify the system by embedding the navigation software into a tablet personal computer (PC). Questions/purposes Using simulated tumors and resections in a pig pelvic model, we asked: Can AR-assisted resection reduce errors in terms of planned bone cuts and improve ability to achieve the planned margin around a tumor in pelvic bone cancer surgery? Methods We developed an AR-based navigation system for pelvic bone tumor surgery, which could be operated on a tablet PC. We created 36 bone tumor models for simulation of tumor resection in pig pelves and assigned 18 each to the AR-assisted resection group and conventional resection group. To simulate a bone tumor, bone cement was inserted into the acetabular dome of the pig pelvis. Tumor resection was simulated in two scenarios. The first was AR-assisted resection by an orthopaedic resident and the second was resection using conventional methods by an orthopaedic oncologist. For both groups, resection was planned with a 1-cm safety margin around the bone cement. Resection margins were evaluated by an independent orthopaedic surgeon who was blinded as to the type of resection. All specimens were sectioned twice: first through a plane parallel to the medial wall of the acetabulum and second through a plane perpendicular to the first. The distance from the resection margin to the bone cement was measured at four different locations for each plane. The largest of the four errors on a plane was adopted for evaluation. 
Therefore, each specimen had two values of error, which were collected from two perpendicular planes. The resection errors were classified into four grades: ≤ 3 mm; 3 to 6 mm; 6 to 9 mm; and > 9 mm or any tumor violation. Student’s t-test was used for statistical comparison of the mean resection errors of the two groups. Results The mean of 36 resection errors of 18 pelves in the AR-assisted resection group was 1.59 mm (SD, 4.13 mm; 95% confidence interval [CI], 0.24-2.94 mm) and the mean error of the conventional resection group was 4.55 mm (SD, 9.7 mm; 95% CI, 1.38-7.72 mm; p < 0.001). All specimens in the AR-assisted resection group had errors < 6 mm, whereas 78% (28 of 36) of errors in the conventional group were < 6 mm. Conclusions In this in vitro simulated tumor model, we demonstrated that AR assistance could help to achieve the planned margin. Our model was designed as a proof of concept; although our findings do not justify a clinical trial in humans, they do support continued investigation of this system in a live animal model, which will be our next experiment. Clinical Relevance The AR-based navigation system provides additional information of the tumor extent and may help surgeons during pelvic bone cancer surgery without the need for more complex and cumbersome conventional navigation systems.
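The statistical comparison above is an independent two-sample t-test on the per-specimen resection errors. A hedged illustration of that test, using synthetic stand-in arrays (NOT the study's data, whose reported means were 1.59 mm and 4.55 mm):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical resection errors (mm), 36 measurements per group
ar_errors = rng.normal(1.6, 1.0, 36)     # AR-assisted group
conv_errors = rng.normal(4.6, 2.0, 36)   # conventional group

# Two-sample t-test on the group means, as in the paper
t, p = stats.ttest_ind(ar_errors, conv_errors)
print(f"t = {t:.2f}, p = {p:.3g}")
```

A small p-value here corresponds to the paper's p < 0.001 finding that the AR-assisted group's mean error was significantly lower.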


Computer Assisted Radiology and Surgery | 2016

An all-joint-control master device for single-port laparoscopic surgery robots

Seongbo Shim; Taehun Kang; Daekeun Ji; Hyunseok Choi; Sanghyun Joung; Jaesung Hong

Purpose: Robots for single-port laparoscopic surgery (SPLS) typically have all of their joints located inside the abdomen during surgery, whereas with the da Vinci system only the tip of the robot arm is inserted and manipulated. A typical master device that controls only the tip with six degrees of freedom (DOFs) is not suitable for SPLS robots because of safety concerns. Methods: We designed an ergonomic six-DOF master device that can control all of the joints of an SPLS robot. We matched each joint of the master, the slave, and the human arm to decouple the all-joint motions of the slave robot. Counterbalance masses were used to reduce operator fatigue. Mapping factors were determined based on kinematic analysis and were used to achieve all-joint control with minimal error at the tip of the slave robot. Results: The proposed master device has two noteworthy features: efficient joint matching to the human arm to decouple each joint motion of the slave robot, and accurate mapping factors that minimize the trajectory error between the master and slave tips. Conclusions: We confirmed that an operator can manipulate the slave robot intuitively with the master device and that both tips follow similar trajectories with minimal error.


Workshop on Augmented Environments for Computer-Assisted Interventions | 2014

A Simple and Accurate Camera-Sensor Calibration for Surgical Endoscopes and Microscopes

Seongpung Lee; Hyunki Lee; Hyunseok Choi; Jaesung Hong

Nowadays, augmented reality (AR) has become a key technology for surgical navigation. To build AR, camera-sensor calibration must be performed between a camera and the sensor that tracks it. A common approach is to move the camera so as to solve an AX = XB type formula. In clinical environments, however, endoscopes and microscopes are commonly used, and moving these cameras during calibration is difficult because of their weight and size. We therefore propose a method that solves for the camera-sensor matrix by expanding the AX = XB equation to the AX = BYC equation: instead of moving the camera, we move the calibration pattern. In experiments, we compared the AX = BYC solution with the AX = XB solution in terms of accuracy, and found the proposed method to be more convenient and accurate than the conventional one.
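The AX = XB constraint at the heart of camera-sensor (hand-eye) calibration can be made concrete with synthetic transforms. This sketch only verifies the relationship between a camera motion A, the induced sensor-marker motion B, and the fixed unknown X; it does not implement either paper's solver:

```python
import numpy as np

def rt(axis, angle, t):
    """Build a 4x4 rigid transform from a rotation axis/angle (Rodrigues'
    formula) and a translation vector."""
    axis = np.asarray(axis, float)
    axis /= np.linalg.norm(axis)
    Kx = np.array([[0, -axis[2], axis[1]],
                   [axis[2], 0, -axis[0]],
                   [-axis[1], axis[0], 0]])
    R = np.eye(3) + np.sin(angle) * Kx + (1 - np.cos(angle)) * (Kx @ Kx)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

X = rt([0, 0, 1], 0.3, [10., 5., 0.])   # unknown camera-to-marker transform
A = rt([0, 1, 0], 0.5, [0., 0., 20.])   # camera motion between two views
B = np.linalg.inv(X) @ A @ X            # induced tracker-marker motion

# Calibration solvers recover X from many such (A, B) pairs;
# here we only check that the constraint itself holds.
assert np.allclose(A @ X, X @ B)
```

The AX = BYC formulation replaces camera motion with motion of the calibration pattern, introducing a second fixed unknown Y, but rests on the same chain-of-transforms reasoning.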


Proceedings of the Korean Society of CAD/CAM Engineers Conference | 2013

Development of a Surgical Navigation System using Augmented Reality

Hyunseok Choi; Sangseo Jeon; Jaesung Hong


Journal of Electronic Imaging | 2018

Comparative study of hand–eye calibration methods for augmented reality using an endoscope

Seongpung Lee; Hyunki Lee; Hyunseok Choi; Sangseo Jeon; Hogun Ha; Jaesung Hong


IEEE-ASME Transactions on Mechatronics | 2018

Robotic System for Bone Drilling Using a Rolling Friction Mechanism

Seongbo Shim; Hyunseok Choi; Daekeun Ji; Wongin Kang; Jaesung Hong

Collaboration


Dive into Hyunseok Choi's collaboration.

Top Co-Authors

Jaesung Hong
Daegu Gyeongbuk Institute of Science and Technology

Seongpung Lee
Daegu Gyeongbuk Institute of Science and Technology

Hogun Ha
Daegu Gyeongbuk Institute of Science and Technology

Hyunki Lee
Daegu Gyeongbuk Institute of Science and Technology

Sangseo Jeon
Daegu Gyeongbuk Institute of Science and Technology

Daekeun Ji
Daegu Gyeongbuk Institute of Science and Technology

Hwan Seong Cho
Seoul National University Bundang Hospital

Seongbo Shim
Daegu Gyeongbuk Institute of Science and Technology

Han-Soo Kim
Seoul National University Hospital