Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Songyot Nakariyakul is active.

Publications


Featured research published by Songyot Nakariyakul.


Optical Engineering | 2008

Hyperspectral waveband selection for contaminant detection on poultry carcasses

Songyot Nakariyakul; David Casasent

We address the important product inspection application of contaminant detection on chicken carcasses. Detection of four contaminant types of interest (duodenum, ceca, colon, and ingesta) from chickens fed with three different feeds (corn, milo, and wheat) is considered. We consider feature selection algorithms for choosing a small set of spectral bands (wavelengths) in hyperspectral (HS) data for online contaminant detection. For cases when an optimal solution is not realistic, we introduce our new improved forward floating selection algorithm; we call it a quasi-optimal (close to optimal) algorithm. Our algorithm is an improvement on the state-of-the-art sequential forward floating selection algorithm. We train our algorithm on a pixel database using only corn-fed chickens and test it on HS images of carcasses with three feeds. Our new algorithm gives an excellent detection rate and performs better than other suboptimal feature selection algorithms on this database.
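As a rough illustration of the wrapper idea behind waveband selection, the sketch below greedily grows a band subset by cross-validated accuracy. The synthetic data, the band_score criterion, and the logistic-regression classifier are placeholders, and this is plain forward selection, not the authors' improved forward floating selection algorithm.

```python
# Greedy forward selection of spectral bands: a simplified sketch, not the
# authors' IFFS algorithm. X is a placeholder (n_pixels, n_bands) matrix and
# y holds placeholder contaminant/clean labels.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 60))               # placeholder hyperspectral pixels
y = (X[:, 10] + X[:, 42] > 0).astype(int)    # placeholder labels

def band_score(bands):
    """Cross-validated accuracy using only the chosen bands."""
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, bands], y, cv=3).mean()

selected, remaining = [], list(range(X.shape[1]))
for _ in range(5):                           # pick 5 wavebands
    best_band = max(remaining, key=lambda b: band_score(selected + [b]))
    selected.append(best_band)
    remaining.remove(best_band)

print("selected bands:", selected, "score:", round(band_score(selected), 3))
```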


The Journal of Supercomputing | 2013

Fast spatial averaging: an efficient algorithm for 2D mean filtering

Songyot Nakariyakul

We present a new fast spatial averaging technique that efficiently implements operations for spatial averaging or two-dimensional mean filtering. To perform spatial averaging of an M×N image with an averaging filter of size m×n, our proposed method requires approximately 4MN additions and no division. This is very promising, since the major computations required by our algorithm depend only on the size of the original image but not on the size of the averaging filter. To our knowledge, this technique requires the smallest number of additions for mean filtering. Experimental results on various image sizes using different filter sizes confirm that our fast spatial averaging algorithm is significantly faster than other spatial averaging algorithms, especially when the size of the input image is very large.
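To illustrate why the cost can be independent of the filter size, here is a minimal mean filter built on a summed-area table; the reuse of partial sums is the same idea, but the paper's exact bookkeeping, addition count, and avoidance of division may differ, so treat this as a sketch rather than the published algorithm.

```python
# Mean filtering whose cost does not grow with the window size, illustrated
# with a summed-area table (integral image).
import numpy as np

def mean_filter(img, m, n):
    """Return the m x n box mean of a 2-D array (valid region only)."""
    img = np.asarray(img, dtype=np.float64)
    # Summed-area table with a zero border so each window sum is 4 lookups.
    sat = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    sat[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    # Window sum over [i, i+m) x [j, j+n) for every valid (i, j).
    sums = sat[m:, n:] - sat[:-m, n:] - sat[m:, :-n] + sat[:-m, :-n]
    return sums / (m * n)

image = np.arange(36, dtype=float).reshape(6, 6)
print(mean_filter(image, 3, 3))
```

The table is built in a constant number of passes over the image, after which every window mean costs four lookups, which is why the work depends on the image size rather than on m x n.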


International Conference on Wavelet Analysis and Pattern Recognition | 2008

Improved forward floating selection algorithm for feature subset selection

Songyot Nakariyakul; David Casasent

We present results on two new databases for a new improved forward floating selection (IFFS) algorithm for selecting a subset of features. The algorithm improves upon the state-of-the-art sequential forward floating selection algorithm by adding a new search strategy that checks, at each sequential step, whether removing any feature in the selected feature set and adding a new one can improve the resultant feature set. We find that this method provides optimal or quasi-optimal (close to optimal) solutions for many selected subsets and requires significantly less computational load than an exhaustive-search optimal feature selection algorithm. Our experimental results for two different databases demonstrate that our algorithm consistently selects better subsets than other quasi-optimal feature selection algorithms do.
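A minimal sketch of the replacement check described above, assuming a placeholder criterion and synthetic data: for each already-selected feature, try exchanging it for each unselected feature and keep the best improving swap. The floating (backward-removal) steps inherited from SFFS are omitted.

```python
# Sketch of the "replace one selected feature" check used after each forward
# step; the floating (backward-removal) part of SFFS/IFFS is omitted.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 20))
y = (X[:, 3] - X[:, 7] + 0.1 * rng.normal(size=300) > 0).astype(int)

def criterion(features):
    """Placeholder criterion: correlation of a simple score with the labels."""
    score = X[:, features].sum(axis=1)
    return abs(np.corrcoef(score, y)[0, 1])

def replacement_step(selected, all_features):
    """Try every (selected, candidate) swap; keep the best improving one."""
    best_set, best_val = list(selected), criterion(selected)
    for f_out in selected:
        for f_in in set(all_features) - set(selected):
            trial = [f for f in selected if f != f_out] + [f_in]
            val = criterion(trial)
            if val > best_val:
                best_set, best_val = trial, val
    return best_set, best_val

subset, value = replacement_step([3, 5, 12], range(X.shape[1]))
print("improved subset:", sorted(subset), "criterion:", round(value, 3))
```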


Pattern Recognition Letters | 2014

Suboptimal branch and bound algorithms for feature subset selection: A comparative study

Songyot Nakariyakul

The branch and bound algorithm is an optimal feature selection method that is well known for its computational efficiency. However, when the dimensionality of the original feature space is large, the computational time of the branch and bound algorithm becomes excessive. If the optimality of the solution is allowed to be compromised, one can further improve the search speed of the branch and bound algorithm: the look-ahead search strategy can be employed to eliminate many solutions deemed to be suboptimal early in the search. In this paper, a comparative study of the look-ahead scheme, in terms of computational cost and solution quality, is carried out on four major branch and bound algorithms using real data sets. We also explore the use of suboptimal branch and bound algorithms on a high-dimensional data set and compare their performance with other well-known suboptimal feature selection algorithms.
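For context, the sketch below is a generic depth-first branch and bound that selects d of D features under a monotone criterion, pruning any node that can no longer beat the best complete subset found so far. The per-feature weights and the criterion are placeholders, and none of the four variants compared in the paper is reproduced here.

```python
# Textbook branch and bound for choosing d of D features under a criterion
# that is monotone (discarding a feature never increases it). The criterion
# here is a placeholder; real applications use e.g. class-separability measures.
import numpy as np

rng = np.random.default_rng(2)
WEIGHTS = rng.uniform(0.1, 1.0, size=10)     # placeholder per-feature "value"

def J(features):
    """Monotone placeholder criterion: sum of the selected feature weights."""
    return WEIGHTS[list(features)].sum()

def branch_and_bound(D, d):
    best = {"value": -np.inf, "subset": None}

    def search(current, removable):
        if len(current) == d:                       # complete solution
            if J(current) > best["value"]:
                best["value"], best["subset"] = J(current), set(current)
            return
        for f in sorted(removable):
            child = current - {f}
            if J(child) <= best["value"]:           # bound test: prune branch
                continue
            # Only features "after" f may be removed deeper down, so no
            # subset is generated twice.
            search(child, {g for g in removable if g > f})

    search(set(range(D)), set(range(D)))
    return best["subset"], best["value"]

subset, value = branch_and_bound(D=10, d=4)
print("best subset:", sorted(subset), "criterion:", round(value, 3))
```

Monotonicity of the criterion is what makes the bound test safe: once a node falls below the incumbent, none of its descendants can recover.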


Information Sciences | 2014

A comparative study of suboptimal branch and bound algorithms

Songyot Nakariyakul

The branch and bound algorithm is widely known as an efficient approach for selecting optimal feature subsets. If the optimality of the solution is allowed to be compromised, it is possible to further improve the search speed of the branch and bound algorithm. This paper studies the look-ahead search strategy, which can eliminate many solutions deemed to be suboptimal early in the branch and bound search. We propose ways to incorporate the look-ahead search scheme into four major branch and bound algorithms, namely the basic branch and bound algorithm, the ordered branch and bound algorithm, the fast branch and bound algorithm, and the adaptive branch and bound algorithm. A comparative study of the look-ahead scheme, in terms of computational cost and solution quality, is carried out for these suboptimal branch and bound algorithms on real data sets. Furthermore, we test the feasibility of using suboptimal branch and bound algorithms on a high-dimensional data set and compare their performance with other well-known suboptimal feature selection algorithms.
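One way the look-ahead relaxation might be expressed, purely as an assumption for illustration: tighten the usual bound test by a margin eps, so that a branch is discarded unless it can beat the incumbent by more than eps. Setting eps to zero recovers the exact search; the paper's own definition of the look-ahead condition may differ.

```python
# Illustrative look-ahead bound test: prune more aggressively by requiring a
# node to beat the incumbent by a margin eps. eps = 0 recovers the exact
# branch and bound; larger eps prunes more but may discard the optimum.
def prune(node_criterion: float, incumbent: float, eps: float = 0.0) -> bool:
    """Return True if the branch rooted at this node should be discarded."""
    return node_criterion <= incumbent + eps

# Example: a node with criterion 0.92 against an incumbent of 0.90.
print(prune(0.92, 0.90, eps=0.0))   # False -> explore (exact search)
print(prune(0.92, 0.90, eps=0.05))  # True  -> pruned (suboptimal search)
```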


International Conference on Wavelet Analysis and Pattern Recognition | 2008

On the suboptimal solutions using the adaptive branch and bound algorithm for feature selection

Songyot Nakariyakul

The branch and bound algorithm is an optimal feature selection method that is well-known for its computational efficiency. The recently developed adaptive branch and bound algorithm has been shown to be several times faster than other versions of the branch and bound algorithm. If the optimality of the algorithm is allowed to be compromised, we can further improve the search speed by employing the look-ahead search strategy to eliminate many solutions deemed to be suboptimal early in the search. We investigate the effects of this scheme on the computational cost and suboptimal solutions obtained using the adaptive branch and bound algorithm and compare them with those using the basic branch and bound algorithm. Our experimental results for two different databases demonstrate that by setting the look-ahead parameter to an appropriate value, we can significantly reduce the search time of the adaptive branch and bound algorithm while retaining its optimal solutions.


Knowledge-Based Systems | 2018

High-dimensional hybrid feature selection using interaction information-guided search

Songyot Nakariyakul

With the rapid growth of high-dimensional data sets in recent years, the need for reducing the dimensionality of data has grown significantly. Although wrapper approaches tend to achieve higher accuracy rates than filter techniques for the same number of selected features, only a few wrapper algorithms are applicable to high-dimensional data sets because their computational time becomes excessive. We thus propose a new hybrid feature selection algorithm that is computationally efficient and achieves high accuracy rates for high-dimensional data. The proposed method employs interaction information to guide the search, sequentially adds one feature at a time to the currently selected subset, and adopts early stopping to prevent overfitting and speed up the search. Our method is dynamic and selects only relevant and irredundant features that significantly improve the accuracy rates. Our experimental results for eleven high-dimensional data sets demonstrate that our algorithm consistently outperforms prior feature selection techniques while requiring a reasonable amount of search time.
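A rough sketch of how interaction information could guide a forward search with a wrapper check and early stopping; the synthetic discretized data, the relevance-plus-synergy score, and the stopping rule are assumptions for illustration, not the paper's exact procedure.

```python
# Sketch of an interaction-information-guided forward search with a wrapper
# check and early stopping. Data, score, and stopping rule are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.integers(0, 3, size=(400, 30))             # already-discretized features
y = ((X[:, 1] + X[:, 4]) % 3 == 0).astype(int)     # class depends on a feature pair

def entropy(*cols):
    """Joint Shannon entropy (bits) of one or more discrete columns."""
    counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)[1]
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def mutual_info(a, b):
    return entropy(a) + entropy(b) - entropy(a, b)

def interaction_info(a, b, c):
    """I(a, b; c) - I(a; c) - I(b; c): positive values indicate synergy."""
    i_pair_class = entropy(a, b) + entropy(c) - entropy(a, b, c)
    return i_pair_class - mutual_info(a, c) - mutual_info(b, c)

def wrapper_accuracy(features):
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, features], y, cv=3).mean()

selected, best_acc = [], 0.0
candidates = list(range(X.shape[1]))
while candidates:
    # Filter stage: relevance plus synergy with the already-selected features.
    def score(f):
        return mutual_info(X[:, f], y) + sum(
            interaction_info(X[:, f], X[:, s], y) for s in selected)
    f_best = max(candidates, key=score)
    # Wrapper stage with early stopping: keep the feature only if accuracy improves.
    acc = wrapper_accuracy(selected + [f_best])
    if acc <= best_acc:
        break
    selected.append(f_best)
    candidates.remove(f_best)
    best_acc = acc

print("selected features:", selected, "cv accuracy:", round(best_acc, 3))
```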


Natural Language Processing and Knowledge Engineering | 2009

Study on criterion function models in the adaptive branch and bound algorithm

Songyot Nakariyakul

The adaptive branch and bound algorithm was recently introduced to accelerate the search speed for optimal feature selection. The algorithm improves upon prior branch and bound algorithms in many aspects. One of the major improvements is to model the criterion function as a simple mathematical function and to adapt it in the proposed jump search strategy to avoid redundant computations. In this paper, we investigate various mathematical functions that can be used as the criterion function model. Experimental results for two real data sets demonstrate that other simple criterion function models perform as well as the default one.
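As a toy illustration of the modeling question, the snippet below fits a few simple candidate forms to synthetic (subset size, criterion value) pairs and compares their residuals; the candidate models and the synthetic curve are assumptions, since the abstract does not list the models that were evaluated.

```python
# Fit a few simple candidate models to (subset size, criterion value) pairs
# and compare their residuals. The curve below is synthetic; the actual
# criterion-function models studied in the paper may differ.
import numpy as np
from scipy.optimize import curve_fit

k = np.arange(1, 21)                               # subset sizes
J = 1.0 - np.exp(-0.3 * k) + 0.01 * np.random.default_rng(4).normal(size=k.size)

models = {
    "linear":      lambda k, a, b: a * k + b,
    "logarithmic": lambda k, a, b: a * np.log(k) + b,
    "saturating":  lambda k, a, b: a * (1.0 - np.exp(-b * k)),
}

for name, f in models.items():
    params, _ = curve_fit(f, k, J, maxfev=10000)
    rss = ((J - f(k, *params)) ** 2).sum()
    print(f"{name:12s} residual sum of squares: {rss:.4f}")
```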


International Conference on Image Processing | 2009

A new feature selection algorithm for multispectral and polarimetric vehicle images

Songyot Nakariyakul

Multispectral and polarimetric data have been shown to provide detailed information useful for automatic target recognition applications. A major limitation of using these data in remote sensing is that they often consist of a large number of features with an inadequate number of samples. To reduce the number of features, we thus present a new generalized steepest ascent feature selection technique that selects only a small subset of important features to use for classification. Our proposed algorithm improves upon the prior steepest ascent algorithm by selecting a better starting search point and performing a more thorough search. It is guaranteed to provide solutions that equal or exceed those of the classical sequential forward floating selection algorithm. Initial results for one multispectral and polarimetric data set show that our algorithm yields better classification results than other suboptimal search algorithms.
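A minimal sketch of a steepest-ascent subset search: starting from an initial subset, apply the single best improving swap (drop one selected feature, add one unselected) until no swap helps. The least-squares criterion and the starting subset are placeholders; the paper's generalization concerns choosing a better starting point and searching more thoroughly, which is not reproduced here.

```python
# Steepest-ascent subset search over fixed-size subsets: repeatedly apply the
# best single swap while it improves the (placeholder) criterion.
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 15))
y = X[:, 2] + 0.5 * X[:, 9] + 0.1 * rng.normal(size=200)

def criterion(features):
    """Placeholder: R^2 of a least-squares fit restricted to the subset."""
    A = np.column_stack([X[:, features], np.ones(len(y))])
    resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return 1.0 - resid.var() / y.var()

def steepest_ascent(start, n_features):
    current, best = list(start), criterion(start)
    while True:
        # Evaluate every single swap against the current subset, keep the best.
        swaps = [
            [f for f in current if f != f_out] + [f_in]
            for f_out in current
            for f_in in set(range(n_features)) - set(current)
        ]
        candidate = max(swaps, key=criterion)
        if criterion(candidate) <= best:            # no improving swap left
            return sorted(current), best
        current, best = candidate, criterion(candidate)

subset, value = steepest_ascent(start=[0, 1, 2], n_features=X.shape[1])
print("final subset:", subset, "criterion:", round(value, 4))
```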


Computational Intelligence in Bioinformatics and Computational Biology | 2016

Gene selection using interaction information for microarray-based cancer classification

Songyot Nakariyakul

Gene selection is an important pre-processing step in microarray analysis and classification. While traditional gene selection algorithms focus on identifying relevant and irredundant genes, we present a new gene selection algorithm that chooses gene subsets based on their interaction information. Many individual genes may be irrelevant to the class, but when combined, they can interact and provide information useful for classification. Our proposed gene selection algorithm is tested on four well-known cancer microarray datasets. Initial results show that our algorithm selects effective gene subsets and outperforms prior gene selection algorithms in terms of classification accuracy.
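A self-contained toy example of the synergy argument above: two synthetic binary "genes" that are individually uninformative about the class but jointly determine it (via XOR) have near-zero individual mutual information yet roughly one bit of interaction information. Real expression values would first be discretized.

```python
# Two synthetic binary "genes" are each irrelevant to the class on their own,
# but their combination (XOR) determines it, giving a large positive
# interaction information.
import numpy as np

rng = np.random.default_rng(6)
g1 = rng.integers(0, 2, size=5000)
g2 = rng.integers(0, 2, size=5000)
cls = g1 ^ g2                                      # class = XOR of the two genes

def entropy(*cols):
    counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)[1]
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def mutual_info(a, b):
    return entropy(a) + entropy(b) - entropy(a, b)

def interaction_info(a, b, c):
    """I(a, b; c) - I(a; c) - I(b; c): positive values indicate synergy."""
    return (entropy(a, b) + entropy(c) - entropy(a, b, c)
            - mutual_info(a, c) - mutual_info(b, c))

print("I(g1; class)     ", round(mutual_info(g1, cls), 4))             # ~0 bits
print("I(g2; class)     ", round(mutual_info(g2, cls), 4))             # ~0 bits
print("interaction info ", round(interaction_info(g1, g2, cls), 4))    # ~1 bit
```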

Collaboration


Dive into Songyot Nakariyakul's collaborations.

Top Co-Authors


David Casasent

Carnegie Mellon University
