Publication


Featured research published by Young-Gi Byun.


IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing | 2013

An Area-Based Image Fusion Scheme for the Integration of SAR and Optical Satellite Imagery

Young-Gi Byun; Jaewan Choi; Youkyung Han

The task of enhancing the perception of a scene by combining information captured from different image sensors is usually known as multisensor image fusion. This paper presents an area-based image fusion algorithm to merge SAR (Synthetic Aperture Radar) and optical images. The co-registration of the two images is first conducted using the proposed registration method prior to image fusion. Segmentation into active and inactive areas is then performed on the SAR texture image for selective injection of the SAR image into the panchromatic (PAN) image. An integrated image based on these two images is generated by the novel area-based fusion scheme, which imposes different fusion rules for each segmented area. Finally, this image is fused into a multispectral (MS) image through the hybrid pansharpening method proposed in previous research. Experimental results demonstrate that the proposed method shows better performance than other fusion algorithms and has the potential to be applied to the multisensor fusion of SAR and optical images.
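
As a rough illustration of the area-based idea only (not the authors' implementation), the sketch below segments a SAR texture image into active and inactive areas by thresholding local variance and applies a different blending rule in each area; the array names, window size, and blend weights are assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_variance(img, size=9):
    """Local variance as a simple SAR texture measure."""
    mean = uniform_filter(img, size)
    mean_sq = uniform_filter(img * img, size)
    return mean_sq - mean * mean

def area_based_fusion(pan, sar, var_thresh=None, active_weight=0.6):
    """Blend SAR into PAN only where the SAR texture is 'active'.

    pan, sar : co-registered float arrays of identical shape (assumption).
    Active (high-texture) areas receive a stronger SAR contribution;
    inactive areas keep the PAN values almost unchanged.
    """
    texture = local_variance(sar)
    if var_thresh is None:
        var_thresh = np.median(texture)          # crude split, for illustration
    active = texture > var_thresh

    fused = pan.copy()
    fused[active] = (1 - active_weight) * pan[active] + active_weight * sar[active]
    fused[~active] = 0.9 * pan[~active] + 0.1 * sar[~active]
    return fused, active
```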


IEEE Transactions on Geoscience and Remote Sensing | 2014

Parameter Optimization for the Extraction of Matching Points Between High-Resolution Multisensor Images in Urban Areas

Youkyung Han; Jaewan Choi; Young-Gi Byun; Yong-Il Kim

The objective of this paper is to extract a suitable number of evenly distributed matched points, given the characteristics of the site and the sensors involved. The intent is to increase the accuracy of automatic image-to-image registration for high-resolution multisensor data. The initial set of matching points is extracted using a scale-invariant feature transform (SIFT)-based method, which is further used to evaluate the initial geometric relationship between the features of the reference and sensed images. The precise matching points are extracted considering location differences and local properties of features. The values of the parameters used in the precise matching are optimized using an objective function that considers both the distribution of the matching points and the reliability of the transformation model. In case studies, the proposed algorithm extracts an appropriate number of well-distributed matching points and achieves a higher correct-match rate than the SIFT method. The registration results for all sensors are acceptably accurate, with a root-mean-square error of less than 1.5 m.
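
For readers unfamiliar with the initial matching stage, the following sketch extracts SIFT matches with OpenCV and filters them by agreement with the median displacement, a crude stand-in for the paper's location-difference criterion; it does not reproduce the parameter optimization described above, and the thresholds are illustrative assumptions.

```python
import cv2
import numpy as np

def sift_matches(ref_img, sen_img, ratio=0.8, dist_tol=20.0):
    """Initial SIFT matching followed by a simple location-difference filter.

    ref_img, sen_img : 8-bit grayscale arrays (assumption).
    ratio            : Lowe's ratio-test threshold.
    dist_tol         : max deviation (pixels) from the median displacement.
    """
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(ref_img, None)
    kp2, des2 = sift.detectAndCompute(sen_img, None)

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    raw = matcher.knnMatch(des1, des2, k=2)
    good = [m for m, n in raw if m.distance < ratio * n.distance]

    # Keep only matches whose displacement agrees with the median displacement,
    # a stand-in for the paper's location-difference criterion.
    disp = np.array([np.array(kp2[m.trainIdx].pt) - np.array(kp1[m.queryIdx].pt)
                     for m in good])
    if len(disp) == 0:
        return []
    median = np.median(disp, axis=0)
    keep = np.linalg.norm(disp - median, axis=1) < dist_tol
    return [m for m, k in zip(good, keep) if k]
```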


IEEE Geoscience and Remote Sensing Letters | 2013

Hybrid Pansharpening Algorithm for High Spatial Resolution Satellite Imagery to Improve Spatial Quality

Jaewan Choi; Junho Yeom; Anjin Chang; Young-Gi Byun; Yong-Il Kim

Most pansharpened images from existing algorithms tend to exhibit a tradeoff between spectral preservation and spatial enhancement. In this letter, we developed a hybrid pansharpening algorithm based on primary and secondary high-frequency information injection to efficiently improve the spatial quality of the pansharpened image. The injected high-frequency information in our algorithm is composed of two types of data, i.e., the difference between the panchromatic and intensity images, and the Laplacian-filtered image of the high-frequency information. The extracted high frequencies are injected into the multispectral image using a locally adaptive fusion parameter and postprocessing of the fusion parameter. In experiments using various satellite images, our results show better spatial quality than those of other fusion algorithms while maintaining as much spectral information as possible.
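
A minimal sketch of primary/secondary high-frequency injection, assuming the MS bands are already resampled to the PAN grid; the global covariance-based gain below replaces the letter's locally adaptive fusion parameter and its postprocessing.

```python
import numpy as np
from scipy.ndimage import laplace

def hybrid_pansharpen(ms, pan, secondary_weight=0.5):
    """Inject primary (PAN - intensity) and secondary (Laplacian of primary)
    high-frequency detail into each multispectral band.

    ms  : float array of shape (bands, H, W), upsampled to the PAN grid (assumption).
    pan : float array of shape (H, W).
    The per-band gain here is a global covariance ratio; the letter uses a
    locally adaptive parameter instead.
    """
    intensity = ms.mean(axis=0)
    primary = pan - intensity                   # primary high-frequency detail
    secondary = -laplace(primary)               # secondary detail (edge emphasis)
    detail = primary + secondary_weight * secondary

    fused = np.empty_like(ms)
    for b in range(ms.shape[0]):
        band = ms[b]
        gain = np.cov(band.ravel(), intensity.ravel())[0, 1] / (intensity.var() + 1e-12)
        fused[b] = band + gain * detail
    return fused
```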


Remote Sensing | 2015

Image Fusion-Based Change Detection for Flood Extent Extraction Using Bi-Temporal Very High-Resolution Satellite Images

Young-Gi Byun; Youkyung Han; Tae-Byeong Chae

Change detection based on satellite images of the same area acquired at different dates is of widespread interest, owing to the increasing number of flood-related disasters. The images help to generate products that support emergency response and flood management at a global scale. In this paper, a novel unsupervised change detection approach based on image fusion is introduced. The approach aims to extract the reliable flood extent from very high-resolution (VHR) bi-temporal images. The method takes advantage of the spectral distortion that occurs during the image fusion process to detect the areas changed by the flood. To this end, a change candidate image is extracted from the fused image generated from the bi-temporal images by considering local spectral distortion. This is done by employing the universal image quality index (UIQI), a measure for local evaluation of spectral distortion. The decision threshold for the determination of changed pixels is set by applying a probability mixture model to the change candidate image based on the expectation-maximization (EM) algorithm. We used bi-temporal KOMPSAT-2 satellite images to detect the flooded area in the city of N'Djamena in Chad. The performance of the proposed method was visually and quantitatively compared with existing change detection methods. The results showed that the proposed method achieved an overall accuracy (OA = 75.04) close to that of the support vector machine (SVM)-based supervised change detection method. Moreover, the proposed method showed better performance in differentiating the flooded area from the permanent water body compared to the existing change detection methods.
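
The UIQI and EM-thresholding steps can be illustrated as follows. This is a sketch, not the paper's implementation: it uses scikit-learn's GaussianMixture as a stand-in for the probability mixture model, and the window size and float-array inputs are assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.mixture import GaussianMixture

def local_uiqi(x, y, size=7, eps=1e-12):
    """Universal image quality index Q in a sliding window, for float arrays x, y.

    Q = 4 * sigma_xy * mean_x * mean_y / ((sigma_x^2 + sigma_y^2) * (mean_x^2 + mean_y^2))
    """
    mx, my = uniform_filter(x, size), uniform_filter(y, size)
    sxx = uniform_filter(x * x, size) - mx * mx
    syy = uniform_filter(y * y, size) - my * my
    sxy = uniform_filter(x * y, size) - mx * my
    return 4 * sxy * mx * my / ((sxx + syy) * (mx * mx + my * my) + eps)

def em_change_mask(candidate):
    """Split a change-candidate image into changed / unchanged pixels with a
    two-component Gaussian mixture fitted by EM (a stand-in for the paper's
    probability mixture model)."""
    samples = candidate.reshape(-1, 1)
    gmm = GaussianMixture(n_components=2, random_state=0).fit(samples)
    labels = gmm.predict(samples).reshape(candidate.shape)
    changed_label = np.argmin(gmm.means_.ravel())   # low UIQI = more distortion
    return labels == changed_label
```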


IEEE Geoscience and Remote Sensing Letters | 2015

Object-Based Change Detection of Very High Resolution Satellite Imagery Using the Cross-Sharpening of Multitemporal Data

Biao Wang; Seok-Keun Choi; Young-Gi Byun; Soungki Lee; Jaewan Choi

In this letter, we present a method for unsupervised change detection based on the cross-sharpening of multitemporal images and image segmentation. Our method effectively reduces the change detection errors caused by relief or spatial displacement between multitemporal images with different acquisition angles. A total of four cross-sharpened images, including two general pansharpened images, were generated. Then, two pairs of cross-sharpened images were analyzed using change detection indexes. The effectiveness of the proposed method compared with other unsupervised change detection methods is demonstrated through experimentation.
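
A compact sketch of the cross-sharpening idea, assuming a simple GIHS-style sharpener and a Euclidean-distance change index (both assumptions, not necessarily those used in the letter): pairs that share the same PAN band are compared, so view-angle and relief displacement largely cancel in the difference.

```python
import numpy as np

def gihs_sharpen(ms, pan):
    """Simple GIHS-style pansharpening used here only to build the
    cross-sharpened image set (the letter may use a different sharpener).

    ms : float array (bands, H, W) on the PAN grid; pan : float array (H, W)."""
    intensity = ms.mean(axis=0)
    return ms + (pan - intensity)               # broadcast over bands

def cross_sharpened_change(ms1, pan1, ms2, pan2):
    """Compare image pairs that share the same PAN band, so that relief /
    view-angle displacement largely cancels in each difference."""
    f11 = gihs_sharpen(ms1, pan1)   # MS at t1 sharpened with PAN at t1
    f21 = gihs_sharpen(ms2, pan1)   # MS at t2 sharpened with PAN at t1
    f12 = gihs_sharpen(ms1, pan2)
    f22 = gihs_sharpen(ms2, pan2)
    d1 = np.linalg.norm(f11 - f21, axis=0)      # change index for the PAN-t1 pair
    d2 = np.linalg.norm(f12 - f22, axis=0)      # change index for the PAN-t2 pair
    return (d1 + d2) / 2
```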


Journal of Remote Sensing | 2015

Automatic and accurate registration of VHR optical and SAR images using a quadtree structure

Youkyung Han; Young-Gi Byun

In this article, we combined intensity- and feature-based similarity measures to co-register very-high-resolution (VHR) optical and SAR images. The global translation difference between the optical and SAR images is first minimized by applying a mutual information (MI) intensity-based similarity measure from the coarsest to the finest level of image pyramids constructed from the two images. Matching points are then extracted by considering the spatial distance and gradient orientation of linear features extracted from each image. To increase the reliability of the registration result, a quadtree-based structure is constructed (1) to mask out regions from the similarity measurement, such as dense urban or heterogeneous areas, which can cause large differences in geometric and radiometric properties between the two images; and (2) to extract evenly distributed and precise matching points by considering regional properties of the study site. To evaluate the generalization of the proposed method, various VHR optical and SAR sensors are used and combined to construct the study sites. The proposed method extracted evenly distributed matching points across the whole image and derived reliable registration results through a non-linear transformation constructed from those points.
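
The coarse, intensity-based stage can be illustrated with a histogram-based mutual information measure and a brute-force translation search. The coarse-to-fine pyramid, the feature-based matching, and the quadtree masking described above are omitted, and the bin count and search radius are assumptions.

```python
import numpy as np

def mutual_information(a, b, bins=64):
    """Mutual information between two same-sized images from a joint histogram,
    as used for the coarse (translation-only) alignment stage."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def best_translation(opt, sar, search=10):
    """Brute-force search for the integer shift maximizing MI; the article
    performs this coarse-to-fine over an image pyramid (omitted here)."""
    h, w = opt.shape
    best, best_shift = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            a = opt[max(0, dy):h + min(0, dy), max(0, dx):w + min(0, dx)]
            b = sar[max(0, -dy):h + min(0, -dy), max(0, -dx):w + min(0, -dx)]
            mi = mutual_information(a, b)
            if mi > best:
                best, best_shift = mi, (dy, dx)
    return best_shift, best
```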


Remote Sensing Letters | 2014

A texture-based fusion scheme to integrate high-resolution satellite SAR and optical images

Young-Gi Byun

Image fusion of different types of satellite images is of widespread interest due to the increasing availability of various spaceborne imaging sensors. In this letter, a new multi-sensor image fusion method is presented, in which a texture-based fusion rule is used to address the fusion of synthetic aperture radar (SAR) and optical images. First, the panchromatic (PAN) and SAR images are integrated in the wavelet domain by taking a weighted average of the corresponding input pixels. The weights are decided adaptively according to the local spatial autocorrelation characteristics of the SAR texture image. This integrated image is then fused with the multispectral (MS) image through the generalized intensity-hue-saturation (GIHS) image fusion method to obtain the final fused image. To quantitatively evaluate the performance of the proposed method, comparisons with existing fusion methods are carried out in this letter. Experimental results demonstrate that the proposed method outperforms the other fusion methods and produces satisfactory fusion results.
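
A minimal sketch of the wavelet-domain weighted averaging, assuming a single-level Haar transform and a local-variance weight in place of the letter's spatial autocorrelation measure; the subsequent GIHS fusion with the MS image (fused = MS + (PAN - I)) is omitted here.

```python
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def wavelet_weighted_fusion(pan, sar, size=9):
    """Integrate PAN and SAR in the wavelet domain with per-pixel weights.

    pan, sar : co-registered float arrays of identical, even-sized shape (assumption).
    The weight is normalized local variance of the SAR image, a stand-in for
    the local spatial autocorrelation measure used in the letter.
    """
    w = uniform_filter(sar * sar, size) - uniform_filter(sar, size) ** 2
    w = (w - w.min()) / (w.max() - w.min() + 1e-12)   # normalize to [0, 1]

    cA_p, (cH_p, cV_p, cD_p) = pywt.dwt2(pan, "haar")
    cA_s, (cH_s, cV_s, cD_s) = pywt.dwt2(sar, "haar")
    cA_w, _ = pywt.dwt2(w, "haar")
    wl = np.clip(cA_w / 2.0, 0.0, 1.0)   # Haar cA is twice the 2x2 local mean

    fused_coeffs = (
        cA_p,                              # keep the PAN approximation band
        (
            (1 - wl) * cH_p + wl * cH_s,   # blend detail sub-bands by weight
            (1 - wl) * cV_p + wl * cV_s,
            (1 - wl) * cD_p + wl * cD_s,
        ),
    )
    return pywt.idwt2(fused_coeffs, "haar")
```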


Remote Sensing Letters | 2018

Efficient seamline determination for UAV image mosaicking using edge detection

Truong Linh Nguyen; Young-Gi Byun; Dongyeob Han; Jungwon Huh

Image mosaicking of individual high-resolution unmanned aerial vehicle (UAV) images is required to obtain sufficient coverage over extensive roads. During the mosaicking process, visible seams may be generated due to differences in illumination or projection between individual images, or the presence of moving objects. This study presents an efficient seamline determination technique based on edge detection for UAV road surface images. The algorithm can be divided into three main steps. First, we detect the edges in the overlapping intensity image within the road boundary. Next, we obtain an automatic seamline passing through regions of non-attraction in areas of overlap. Finally, we adjust the values of the overlapping region using the values of the corresponding individual images by following the coordinates of the seamline detected in the second step, ultimately creating an image mosaic. The experiment using UAV images of a road surface demonstrates that the proposed method produces a satisfactory result. The proposed method can be applied for quick mosaicking of UAV images intended for maintaining road safety.
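
A seam-carving-style sketch of the seamline search, assuming 8-bit grayscale overlap regions on a common grid; the cost combining radiometric difference and Canny edges, and its weighting, are assumptions rather than the study's actual formulation.

```python
import cv2
import numpy as np

def seamline(overlap_a, overlap_b, edge_weight=5.0):
    """Find a vertical seam through the overlap that avoids edges and large
    radiometric differences, via a seam-carving-style dynamic program.

    overlap_a, overlap_b : 8-bit grayscale overlap regions of the two images
                           on a common grid (assumption).
    Returns one column index per row describing the seam.
    """
    diff = cv2.absdiff(overlap_a, overlap_b).astype(np.float64)
    edges = cv2.Canny(overlap_a, 50, 150).astype(np.float64) / 255.0
    cost = diff + edge_weight * 255.0 * edges    # penalize crossing edges

    h, w = cost.shape
    acc = cost.copy()
    for r in range(1, h):                        # accumulate minimal path cost
        left = np.roll(acc[r - 1], 1);  left[0] = np.inf
        right = np.roll(acc[r - 1], -1); right[-1] = np.inf
        acc[r] += np.minimum(np.minimum(left, acc[r - 1]), right)

    seam = np.empty(h, dtype=int)
    seam[-1] = int(np.argmin(acc[-1]))
    for r in range(h - 2, -1, -1):               # backtrack the seam upwards
        c = seam[r + 1]
        lo, hi = max(0, c - 1), min(w, c + 2)
        seam[r] = lo + int(np.argmin(acc[r, lo:hi]))
    return seam
```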


Desalination | 2009

Detection of Cochlodinium polykrikoides red tide based on two-stage filtering using MODIS data

Yong-Min Kim; Young-Gi Byun; Yong-Il Kim; Yangdam Eo


ETRI Journal | 2011

Extraction and Regularization of Various Building Boundaries with Complex Shapes Utilizing Distribution Characteristics of Airborne LIDAR Points

Jeong-Ho Lee; Soohee Han; Young-Gi Byun; Yong-Il Kim

Collaboration


Dive into Young-Gi Byun's collaborations.

Top Co-Authors

Yong-Il Kim (Seoul National University)
Kiyun Yu (Seoul National University)
Tae-Byeong Chae (Korea Aerospace Research Institute)
Youkyung Han (Seoul National University)
Jaewan Choi (Chungbuk National University)
Jeong-Ho Lee (Seoul National University)
Yong-Min Kim (Seoul National University)
Yong Huh (Seoul National University)