Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Jun Geol Baek is active.

Publication


Featured research published by Jun Geol Baek.


Expert Systems With Applications | 2013

Algorithm learning based neural network integrating feature selection and classification

Hyunsoo Yoon; Cheong Sool Park; Jun Seok Kim; Jun Geol Baek

Feature selection and classification techniques have been studied independently, without considering the interaction between both procedures, which leads to degraded performance. In this paper, we present a new neural network approach, called an algorithm learning based neural network (ALBNN), to improve classification accuracy by integrating the feature selection and classification procedures. In general, a knowledge-based artificial neural network operates on prior knowledge from domain experience, which provides it with better starting points for the target function and leads to better classification accuracy. However, prior knowledge is usually difficult to identify. Instead of using unknown background resources, the proposed method utilizes prior knowledge that is mathematically calculated from the properties of other learning algorithms such as PCA, LARS, C4.5, and SVM. We employ the extreme learning machine in this study to obtain better initial points faster and to avoid irrelevant, time-consuming work such as determining the architecture and manual tuning. ALBNN correctly approximates a target hypothesis by considering the interaction between the two procedures while minimizing the errors of each. The approach produces new relevant features and improves classification accuracy. Experimental results exhibit improved performance on various classification problems. ALBNN can be applied to various fields requiring high classification accuracy.
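The extreme learning machine mentioned above admits a very compact sketch: input weights are drawn at random and only the output weights are solved analytically, which is why no architecture search or manual tuning is needed. The toy problem, hidden-layer size, and activation below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, hidden=40):
    """Extreme learning machine: random input weights and biases,
    output weights solved analytically by least squares."""
    W = rng.normal(size=(X.shape[1], hidden))
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)                      # random hidden-layer features
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Hypothetical XOR-like two-class problem with +-1 labels.
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] * X[:, 1] > 0, 1.0, -1.0)
W, b, beta = elm_fit(X, y, hidden=40)
acc = (np.sign(elm_predict(X, W, b, beta)) == y).mean()
```

Because the only "training" is one least-squares solve, fitting is effectively instantaneous compared to backpropagation, at the cost of a somewhat larger hidden layer.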


Expert Systems With Applications | 2014

Density weighted support vector data description

Myungraee Cha; Jun Seok Kim; Jun Geol Baek

Research highlights: We present density-weighted support vector data description (DW-SVDD). DW-SVDD improves classification accuracy by introducing a density weight into SVDD. The density weight is the relative density of each data point, computed using the k-nearest neighbor (k-NN) approach. DW-SVDD prioritizes data points in high-density regions by applying the density weight to the search for the SVDD description. Experimental results demonstrate that DW-SVDD improves classification accuracy.

One-class classification (OCC) has received a lot of attention because of its usefulness in the absence of statistically representative non-target data. In this situation, the objective of OCC is to find the optimal description of the target data in order to better identify outlier or non-target data. Support vector data description (SVDD), an example of OCC, is widely used for its flexible description boundaries and its freedom from assumptions about the data distribution. By mapping the target dataset into a high-dimensional space, SVDD finds the spherical description boundary for the target data. In this process, SVDD considers only the kernel-based distance between each data point and the spherical description, not the density distribution of the data. As a result, data points in high-density regions may be excluded from the description, decreasing classification performance. To solve this problem, we propose a new SVDD that introduces the notion of density weight: the relative density of each data point, based on the density distribution of the target data, computed using the k-nearest neighbor (k-NN) approach. By incorporating this weight into the search for an optimal description, the new method prioritizes data points in high-density regions, and eventually the optimal description shifts toward these regions. We demonstrate the improved performance of the new SVDD using various datasets from the UCI repository.
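The k-NN density weight described above can be sketched in a few lines: score each point by the inverse of its mean distance to its k nearest neighbors, then normalize. The exact normalization and how the weight enters the SVDD optimization are the paper's own choices; this is only an illustrative stand-in.

```python
import numpy as np

def knn_density_weight(X, k=3):
    """Relative density weight per point: inverse of the mean distance
    to the k nearest neighbors, normalized so the weights average to 1.
    (Illustrative sketch; the paper's exact weighting may differ.)"""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)                 # exclude self-distance
    knn_dist = np.sort(D, axis=1)[:, :k].mean(axis=1)
    dens = 1.0 / (knn_dist + 1e-12)
    return dens / dens.mean()

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(size=(50, 2)) * 0.2,      # dense cluster
               rng.normal(size=(5, 2)) * 2 + 5])    # sparse points
w = knn_density_weight(X, k=3)
```

Points in the dense cluster receive larger weights than the sparse points, which is exactly the effect that pulls the SVDD description toward high-density regions.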


Expert Systems With Applications | 2011

Spline regression based feature extraction for semiconductor process fault detection using support vector machine

Jonghyuck Park; Ick Hyun Kwon; Sung Shick Kim; Jun Geol Baek

Research highlights: A feature extraction method using spline regression is proposed. Faulty signals are detected by a support vector machine classifier. The proposed method is good for data compression and sensitive fault detection; in particular, it outperforms conventional methods when the signal is delayed.

Quality control is attracting more attention in the semiconductor market due to harsh competition. This paper considers fault detection (FD), a well-known philosophy in quality control. Conventional methods, such as the non-stationary SPC chart, PCA, PLS, and Hotelling's T², are widely used to detect faults. However, even for identical processes, the process time differs, and missing data may hinder fault detection. Artificial intelligence (AI) techniques are used to deal with these problems. In this paper, a new fault detection method using spline regression and a support vector machine (SVM) is proposed. For a given process signal, spline regression is applied with step-change points treated as knot points. The coefficients multiplying the basis functions of the spline are taken as the features of the signal. The SVM uses those extracted features as input variables to construct the classifier for fault detection. Numerical experiments on artificial data replicating semiconductor manufacturing signals evaluate the performance of the proposed method.
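The idea of using regression coefficients at knot points as compressed features can be illustrated with simple piecewise polynomial fits (a simplified stand-in for the paper's spline-basis regression; the knot positions, degree, and test signal below are assumptions):

```python
import numpy as np

def piecewise_poly_features(signal, knots, degree=2):
    """Fit a low-order polynomial to each segment between knot points
    (step-change points) and use the coefficients as features.
    A long raw signal is compressed into a short coefficient vector."""
    t = np.arange(len(signal))
    feats = []
    bounds = [0, *knots, len(signal)]
    for a, b in zip(bounds[:-1], bounds[1:]):
        seg_t, seg_y = t[a:b], signal[a:b]
        coef = np.polyfit(seg_t - seg_t[0], seg_y, deg=min(degree, len(seg_y) - 1))
        feats.extend(coef)
    return np.array(feats)

# Hypothetical process signal: ramp up, hold, ramp down (knots at 40 and 70).
sig = np.concatenate([np.linspace(0, 1, 40), np.full(30, 1.0), np.linspace(1, 0, 30)])
f = piecewise_poly_features(sig, knots=[40, 70], degree=2)
```

A 100-point signal collapses to 9 coefficients, which would then be fed to the SVM classifier in place of the raw trace.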


Computers & Industrial Engineering | 2010

Dispatching rule for non-identical parallel machines with sequence-dependent setups and quality restrictions

Hyo Heon Ko; Jihyun Kim; Sung Shick Kim; Jun Geol Baek

This paper proposes a dispatching rule that guarantees a predetermined minimum quality level for non-identical parallel machines with multiple product types. Manufacturers are focusing on improving the overall quality of their products, as the demand for top quality products is increasing. Such changes increase the possibility of neglecting another crucial factor in manufacturing schedules, namely due date. Traditionally, jobs are dispatched with the focus on meeting due dates. That is, jobs are assigned to machines without consideration of product quality. This approach opens the possibility of manufacturing poor quality products. Realizing the shortcomings of the existing dispatching rules, manufacturers are tempted to dispatch jobs with the objective of maximizing product quality. With such an attempt, jobs are likely to be assigned to high performance machines only. In turn, waiting times will increase and job delays are inevitable. This research proposes a dispatching rule that satisfies both criteria, reducing due date delays, while ensuring a predefined product quality level. A quality index is introduced to standardize various product qualities. The index is used to ensure a predetermined quality level, whilst minimizing product delays. Simulations compare various dispatch methods, evaluating them based on mean tardiness and product quality.


Journal of Bioscience and Bioengineering | 2011

Analysis of the time dependency of ammonia-oxidizing bacterial community dynamics in an activated sludge bioreactor

Jun Geol Baek; Jonghyuck Park; Taek Seung Kim; Hee Deung Park

An autoregressive error term model was applied to examine the dynamic oscillation of ammonia-oxidizing bacterial (AOB) lineages found in an activated sludge bioreactor. The current abundance of AOB lineages was affected by the past abundance of AOB lineages and past environmental and operational factors as well as current influencing factors.


Journal of Intelligent Manufacturing | 2012

Advanced semiconductor fabrication process control using dual filter exponentially weighted moving average

Hyo Heon Ko; Jihyun Kim; Sang Hoon Park; Jun Geol Baek; Sung Shick Kim

The semiconductor industry needs to meet high standards to ensure survival and success in the 21st century. Rising customer expectations demand that the semiconductor industry manufacture products with both accuracy and precision. To comply with these strict demands, an effective control method for semiconductor manufacturing is introduced. The process environment is afflicted by process disturbances, and their differing characteristics require the control method to respond accordingly. This study utilizes two separate exponentially weighted moving average (EWMA) filters simultaneously to improve the performance of the control method. By utilizing dual filters, the influence of white noise is reduced and accurate process control is made possible. The proposed methodology is evaluated through simulation in comparison with two other control methods.
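The dual-filter intuition can be sketched with two EWMA recursions run side by side: a slow filter (small weight) smooths out white noise, while a fast filter (large weight) reacts quickly to genuine disturbances. How the two estimates are combined into the controller update is the paper's own rule; the weights and signal below are illustrative assumptions.

```python
def ewma(values, lam):
    """Exponentially weighted moving average:
    z_t = lam * x_t + (1 - lam) * z_{t-1}."""
    z = values[0]
    out = [z]
    for v in values[1:]:
        z = lam * v + (1 - lam) * z
        out.append(z)
    return out

# Hypothetical process output with white noise and a step disturbance at t=4.
noisy = [0.0, 0.1, -0.1, 0.05, 2.0, 2.1, 1.9, 2.05]
slow = ewma(noisy, lam=0.1)   # suppresses noise, reacts slowly
fast = ewma(noisy, lam=0.7)   # tracks the step change quickly
```

After the step, the fast filter sits near the new level of 2 while the slow filter is still climbing, which is why a single-filter EWMA must trade off noise rejection against responsiveness.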


Expert Systems With Applications | 2011

An integrated music video browsing system for personalized television

Hyoung Gook Kim; Jin Young Kim; Jun Geol Baek

In this paper, we propose an integrated music video browsing system for personalized digital television. The system provides automatic music emotion classification, automatic theme-based music classification, salient region detection, and shot classification. From the audio (music) tracks, highlight detection and emotion classification are performed on the basis of temporal energy, timbre, and tempo information. For the video tracks, shot detection is performed to classify shots into face shots and color-based shots. Lastly, automatic grouping by theme is performed on music titles and their lyrics. Using a database of international music videos, we evaluate the performance of each function implemented in this paper. The experimental results show that the music browsing system achieves remarkable performance, so our system can be adopted in any digital television to provide personalized services.


Expert Systems With Applications | 2011

Intelligent adaptive process control using dynamic deadband for semiconductor manufacturing

Hyo Heon Ko; Jun Seok Kim; Jihyun Kim; Jun Geol Baek; Sung Shick Kim

Research highlights: This paper proposes an efficient control method to minimize process error and reduce process variance in semiconductor manufacturing. The proposed methodology is dynamic deadband control, which uses a region (band) to detect the status of a process change. The methodology was evaluated by simulation and implemented in the photo process of a semiconductor manufacturing company to verify its practicality.

This paper proposes an efficient control method to minimize process error and to reduce process variance in semiconductor manufacturing. The photolithography (photo) process forms the complex semiconductor circuit and is important for quality. Obstacles to the process include the facility itself, vibration, wear and tear, product/process changes, and environmental influences. Control methodologies currently used to address these issues often amplify the variation of the process by failing to perform adequate process control. Therefore, this paper proposes an effective process control method that reduces process variance by quickly detecting and identifying process disturbances and accurately reflecting the degree of change in the process control. The proposed dynamic deadband control uses a region (band) to detect the status of a process change and adjusts the process control based on the changes detected. This helps the semiconductor manufacturer perform more precise control, reduce fluctuations, and produce products of uniform quality. It can also contribute to yield through the quality incentive and tighter process control.
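The core deadband idea is simple to sketch: the controller ignores errors that stay inside the band (treating them as noise) and only adjusts when the error leaves it. The sketch below uses a static band and a hypothetical proportional gain; the paper's contribution is making the band itself dynamic, which is not reproduced here.

```python
def deadband_control(errors, band=0.5, gain=0.5):
    """Apply a correction only when the observed error leaves the
    deadband; within-band errors are treated as process noise.
    Static band for illustration -- the paper's band adapts dynamically."""
    adjustment = 0.0
    history = []
    for e in errors:
        if abs(e) > band:
            adjustment -= gain * e   # react to an apparent real change
        history.append(adjustment)
    return history

# Hypothetical error sequence: noise, then a disturbance, then noise again.
adj = deadband_control([0.1, -0.2, 0.3, 1.2, 1.1, 0.2], band=0.5, gain=0.5)
```

The first three small errors produce no adjustment at all, so the controller does not amplify white noise, yet it responds as soon as the 1.2 error breaches the band.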


Expert Systems With Applications | 2017

Real-time contrasts control chart using random forests with weighted voting

Seongwon Jang; Seung Hwan Park; Jun Geol Baek

Research highlights: We propose RTC control charts using random forests with weighted voting. The F-measure, G-mean, and MCC are used as performance measures to assign proper weights. Our method detects faults more rapidly by making the monitoring statistics continuous. Our method can identify where a fault occurs because a tree-based classifier is used. Experiments demonstrate that our method is more effective than existing methods.

Real-time fault detection and isolation are important tasks in process monitoring. A real-time contrasts (RTC) control chart converts the process monitoring problem into a real-time classification problem and outperforms existing methods. However, the monitoring statistics of the original RTC chart are discrete, which can make fault detection less efficient. To make the monitoring statistics continuous, distance-based RTC control charts using support vector machines (SVM) and kernel linear discriminant analysis (KLDA) were proposed. Although the distance-based RTC charts outperformed the original RTC chart, they have the disadvantage that it is difficult to analyze the causes of faults when using them. Therefore, we propose improved RTC control charts using random forests with weighted voting. These charts detect changes more rapidly by making the monitoring statistics continuous, and they can also analyze the causes of faults in a similar manner to the original RTC chart. Further, they alleviate the class imbalance problem by using the F-measure, G-mean, and Matthews correlation coefficient (MCC) as performance measures to assign proper weights to individual classifiers. Experiments show that the proposed methods outperform the original RTC chart and are more effective than the distance-based RTC charts using SVM and KLDA.
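The weighted-voting step that makes the monitoring statistic continuous can be sketched directly: each tree's vote for the fault class is weighted by a per-tree performance score, yielding a real number in [0, 1] rather than a discrete vote count. The three trees and their scores below are hypothetical, and how the scores are estimated (F-measure, G-mean, or MCC) is the paper's design choice.

```python
import numpy as np

def weighted_vote(tree_preds, weights):
    """Continuous monitoring statistic from a forest: weight each tree's
    vote for the 'fault' class (1 = fault, 0 = normal) by its performance
    score, instead of counting unweighted discrete votes."""
    weights = np.asarray(weights, dtype=float)
    votes = np.asarray(tree_preds, dtype=float)
    return float((votes * weights).sum() / weights.sum())

# Three hypothetical trees; the third is the most reliable one.
stat = weighted_vote([1, 0, 1], weights=[0.6, 0.5, 0.9])
```

With unweighted voting this window would score a discrete 2/3; the weighted statistic of 0.75 reflects that the two fault votes came from the better-performing trees, and small shifts in tree reliability move the statistic continuously.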


IEEE International Conference on Semantic Computing | 2017

Solving the Singularity Problem of Semiconductor Process Signal Using Improved Dynamic Time Warping

Jae Yeol Hong; Seung Hwan Park; Jun Geol Baek

In micro-scale processes such as semiconductor manufacturing, variation in process variables directly affects the quality of the end product, so it is important to monitor and manage fluctuations in these variables. The process signals generated, however, have a moving average and non-uniform variance, differ in length in specific portions, and differ in total length. Recently, DTW (dynamic time warping) has been used to align such signals, but because the mean and variance of the signal are not uniform, a singularity problem arises. A singularity is an unintuitive alignment in which a single point in one time series is connected to a long portion of another time series. To solve this problem, in this paper, we propose using the MODWT (maximal overlap discrete wavelet transform) as a feature of the signal and then calculating the warping path using divided matrices. This solves the singularity problem that occurs when applying DTW to process signals. Comparing the results of applying DTW and the proposed method to process signals confirms that the proposed method outperforms DTW.
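For reference, plain DTW, the baseline the paper improves on, fits in a few lines. The MODWT feature extraction and divided-matrix warping-path computation that the authors add to suppress singularities are not reproduced here; the two toy series are illustrative.

```python
import numpy as np

def dtw_distance(a, b):
    """Plain dynamic time warping distance between two 1-D series.
    The singularity problem arises when one point in `a` is allowed
    to align with a long run of points in `b` (or vice versa)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

d_same = dtw_distance([0, 1, 2, 3], [0, 1, 2, 3])
d_shift = dtw_distance([0, 1, 2, 3], [0, 0, 1, 2, 3])  # same shape, delayed
```

Both distances are zero because warping absorbs the delay; that same flexibility is what produces pathological one-to-many alignments on signals with drifting mean and variance.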

Collaboration


Dive into Jun Geol Baek's collaborations.

Top Co-Authors
