Wei-Chen Cheng
National Taiwan University
Publication
Featured research published by Wei-Chen Cheng.
Neurocomputing | 2014
Cheng-Yuan Liou; Wei-Chen Cheng; Jiun-Wei Liou; Daw-Ran Liou
This paper presents a training method that encodes each word into a different vector in semantic space and relates it to low-entropy coding. An Elman network is employed to process word sequences from literary works. The trained codes possess reduced entropy and are used in ranking, indexing, and categorizing literary works. A modification of the method that trains multiple vectors for each polysemous word is also presented, where each vector represents a different meaning of the word, so that the several meanings of a word can be accommodated. The method is applied to stylistic analyses of two Chinese novels, Dream of the Red Chamber and Romance of the Three Kingdoms.
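The idea of jointly training word codes and an Elman (simple recurrent) network can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the toy vocabulary, dimensions, and the omission of the actual weight/code updates are all assumptions made for brevity.

```python
import numpy as np

# Each word has a trainable code vector; an Elman network reads the codes in
# sequence and predicts the next word. In training, prediction errors would
# update both the network weights and the word codes (only the forward pass
# is shown here).
rng = np.random.default_rng(0)
vocab = ["red", "chamber", "three", "kingdoms"]   # illustrative vocabulary
V, D, H = len(vocab), 3, 5                        # vocab, code, hidden sizes

codes = rng.normal(0, 0.1, (V, D))                # one code vector per word
W_in  = rng.normal(0, 0.1, (H, D))
W_rec = rng.normal(0, 0.1, (H, H))                # Elman recurrent weights
W_out = rng.normal(0, 0.1, (V, H))

def step(word_id, h):
    """One Elman step: new hidden state and next-word probabilities."""
    h_new = np.tanh(W_in @ codes[word_id] + W_rec @ h)
    logits = W_out @ h_new
    p = np.exp(logits - logits.max())
    return h_new, p / p.sum()

h = np.zeros(H)
for wid in [0, 1, 2]:                             # feed a short word sequence
    h, p = step(wid, h)
```

The prediction error at each step would be back-propagated into `codes` as well as the weights, which is what drives the codes toward a low-entropy arrangement.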
international conference on neural information processing | 2008
Cheng-Yuan Liou; Wei-Chen Cheng
This work presents a neighborhood preservation method for constructing a latent manifold that preserves the relative Euclidean distances among neighboring data points. Its computational cost is close to that of a linear algorithm, and its performance in preserving local relationships is promising when compared with LLE and Isomap.
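One common way to realize such neighborhood preservation is gradient descent on a local stress energy that penalizes mismatch between original and embedded distances among k nearest neighbors only. The sketch below uses that generic formulation; the data, k, step size, and iteration count are illustrative, and the paper's actual algorithm (like LLE and Isomap) differs in the details.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 5))                      # toy high-dimensional data
k = 5

D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
nbrs = np.argsort(D, axis=1)[:, 1:k + 1]          # k nearest neighbors

def local_stress(Y):
    """Sum of squared distance errors over neighboring pairs only."""
    return sum((np.linalg.norm(Y[i] - Y[j]) - D[i, j]) ** 2
               for i in range(len(X)) for j in nbrs[i])

Y = rng.normal(scale=0.01, size=(30, 2))          # initial 2-D layout
before = local_stress(Y)
for _ in range(200):
    grad = np.zeros_like(Y)
    for i in range(len(X)):
        for j in nbrs[i]:
            diff = Y[i] - Y[j]
            d = np.linalg.norm(diff) + 1e-12
            g = 2.0 * (d - D[i, j]) * diff / d    # d(stress)/dY
            grad[i] += g
            grad[j] -= g
    Y -= 0.01 * grad
after = local_stress(Y)
```

Because only neighboring pairs enter the energy, each iteration is roughly linear in the number of points for fixed k, which is the sense in which such methods approach linear cost.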
international conference on neural information processing | 2008
Cheng-Yuan Liou; Wei-Chen Cheng
This paper presents a novel technique that separates the pattern representations in each hidden layer to facilitate classification tasks. The technique requires that all patterns in the same class have nearby representations and that patterns in different classes have distant representations. This requirement is applied to every pair of data patterns to train a selected hidden layer of an MLP or RNN. The MLP can be trained layer by layer in a feedforward manner to obtain well-resolved representations, and the trained MLP can then serve as a kind of kernel function for categorizing multiple classes.
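The pairwise requirement on one hidden layer can be sketched with a contrastive-style energy: same-class pairs are pulled together and different-class pairs are pushed apart up to a margin. The toy data, margin, and the deliberately crude numerical gradient below are illustrative assumptions, not the paper's training procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
# Two well-separated toy classes in 4-D.
X = np.vstack([rng.normal(-1, 0.3, (10, 4)), rng.normal(1, 0.3, (10, 4))])
y = np.array([0] * 10 + [1] * 10)
W = rng.normal(0, 0.1, (4, 3))                    # one hidden layer, 3 units
margin, lr, eps = 2.0, 0.05, 1e-4

def pair_energy(W):
    """Mean over all pairs: pull same-class close, push others past margin."""
    H = np.tanh(X @ W)
    terms = []
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            d = np.linalg.norm(H[i] - H[j])
            terms.append(d ** 2 if y[i] == y[j] else max(0.0, margin - d) ** 2)
    return float(np.mean(terms))

for _ in range(30):                               # numerical-gradient descent
    e0, g = pair_energy(W), np.zeros_like(W)
    for idx in np.ndindex(*W.shape):
        W[idx] += eps
        g[idx] = (pair_energy(W) - e0) / eps      # forward difference
        W[idx] -= eps
    W -= lr * g

H = np.tanh(X @ W)
within  = np.mean([np.linalg.norm(H[i] - H[j]) for i in range(20)
                   for j in range(i + 1, 20) if y[i] == y[j]])
between = np.mean([np.linalg.norm(H[i] - H[j]) for i in range(20)
                   for j in range(i + 1, 20) if y[i] != y[j]])
```

After training, within-class hidden distances are smaller than between-class ones, which is the separation the next layer (or a classifier) then exploits.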
international symposium on neural networks | 2012
Wei-Chen Cheng; Jiun-Wei Liou; Cheng-Yuan Liou
Group analysis of a large number of brain images involves two kinds of analysis. One extracts the information and relations in the population of brains; the other combines the information of the individual brains in the group. Linear or nonlinear dimension-reduction algorithms are the main tools for the first analysis, which reveals information about the population: the hidden relations in the distribution can then be visualized in a low-dimensional, visible space. Image registration is the critical part of the second analysis, which integrates the information or statistics of individual brains. The statistics are registered to a template, commonly the mean brain image of the population, so that statistics from different subjects can be compared in the same stereotaxic space. The process of registering images to the template is called normalization, and the quality of registration determines the normalization and the interpretability of the results. This work constructs ordered representations, derived from a self-organizing map, from a set of brain images to serve as multiple templates. A novel method based on the ordered representations, transformation diversion, is proposed to improve the registration, a non-linear deformation, in a general manner. Discriminative low-dimensional representations of populations of Alzheimer's disease and normal subjects are also shown. The set of ordered representations not only shows the population information but also improves the normalization process.
Memetic Computing | 2010
Wei-Chen Cheng; Cheng-Yuan Liou
This paper presents a distance-invariant manifold that preserves the neighborhood relations among data patterns. Every pattern has a corresponding cell in the manifold space, and the constellation of neighboring cells closely resembles that of the patterns. The manifold is invariant under translation, rotation, and scaling of the pattern coordinates. The neighborhood relations among cells are adjusted and improved in each iteration according to the reduction of the distance-preservation energy.
Knowledge Based Systems | 2012
Wei-Chen Cheng; Jau-Chi Huang; Cheng-Yuan Liou
We report the discovery of strong correlations between protein-coding regions and the prediction errors obtained when a simple recurrent network is used to segment genome sequences. The SARS genome is used to demonstrate how training is conducted and how the corresponding results are derived. The distribution of prediction errors reveals the underlying hidden regularity of the genome sequences, and the results are consistent with biologists' findings: the predicted protein-coding features of the SARS genome. This implies that the simple recurrent network can provide new features for further biological studies when applied to genomes. The HA gene of influenza A subtype H1N1 is analyzed in a similar way.
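The error-profiling idea can be sketched as follows: a simple recurrent network predicts the next nucleotide, and the per-position prediction error is then inspected for structure. The sequence below is a synthetic repeat standing in for a real genome, and the least-squares readout is an echo-state-style shortcut; both are assumptions for brevity, since the paper trains the full network.

```python
import numpy as np

rng = np.random.default_rng(3)
bases = "ACGT"
seq = "ACGT" * 100                                # toy, strongly regular "genome"
ids = np.array([bases.index(c) for c in seq])
onehot = np.eye(4)[ids]

H = 8
W_in  = rng.normal(0, 0.5, (H, 4))                # fixed input weights
W_rec = rng.normal(0, 0.1, (H, H))                # fixed recurrent weights

# Collect hidden states along the sequence.
h, states = np.zeros(H), []
for t in range(len(ids) - 1):
    h = np.tanh(W_in @ onehot[t] + W_rec @ h)
    states.append(h)
S = np.array(states)

# Fit the output weights to predict the next base, then record the
# per-position squared prediction error.
W_out = np.linalg.lstsq(S, onehot[1:], rcond=None)[0].T
err = np.sum((S @ W_out.T - onehot[1:]) ** 2, axis=1)
```

On a real genome, positions where `err` stays systematically high or low are the signal correlated with protein-coding regions.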
Computational and Mathematical Methods in Medicine | 2013
Cheng-Yuan Liou; Shen-Han Tseng; Wei-Chen Cheng; Huai-Ying Tsai
In modern bioinformatics, finding an efficient way to associate sequence fragments with biological functions is an important issue. This paper presents a structural approach based on context-free grammars extracted from original DNA or protein sequences, which differs radically from statistical methods. The approach is compared with a topological-entropy-based method to assess the consistency and differences of the complexity results.
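The topological-entropy baseline mentioned above can be sketched directly: pick the word length n for which all 4**n subwords could in principle occur in a prefix of length 4**n + n - 1, count the distinct length-n subwords that actually occur there, and normalize. The sequences below are illustrative, not real genomic data.

```python
import random
from math import log

def topological_entropy(seq, alphabet=4):
    """Normalized count of distinct n-mers in a prefix of length 4**n + n - 1."""
    n = 1
    while alphabet ** (n + 1) + n <= len(seq):
        n += 1                                    # largest feasible word length
    window = seq[: alphabet ** n + n - 1]
    subwords = {window[i:i + n] for i in range(len(window) - n + 1)}
    return log(len(subwords), alphabet) / n

random.seed(0)
repetitive = "ACGT" * 100                         # low-complexity sequence
shuffled = "".join(random.choice("ACGT") for _ in range(400))
low, high = topological_entropy(repetitive), topological_entropy(shuffled)
```

A repetitive sequence yields a small value (few distinct subwords), while a random one approaches 1, which is the complexity scale the grammar-based measure is checked against.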
asian conference on intelligent information and database systems | 2011
Jau-Chi Huang; Wei-Chen Cheng; Cheng-Yuan Liou
We present a novel method for training the Elman network to learn literary works, and we report findings obtained during the training process. Both the word codes and the network weights are trained by this method, and the training error can be greatly reduced by iteratively re-encoding all words.
international conference on neural information processing | 2009
Wei-Chen Cheng; Cheng-Yuan Liou
This paper presents a distance-invariance method for constructing a low-dimensional manifold that preserves the neighborhood topological relations among data patterns. The manifold can display close relationships among patterns.
international conference on neural information processing | 2009
Cheng-Yuan Liou; Wei-Chen Cheng
This paper presents an MLP kernel that maps all patterns of a class to a single point in the output-layer space and maps different classes to different points. These widely separated class points can be used for further classification. The kernel is a layered feed-forward network: each layer is trained independently, layer after layer, in a bottom-up construction using the class differences. The class labels are not used in the training process. The kernel can be used to separate multiple classes.