
Publications


Featured research published by Ting-Li Chen.


The Journal of Neuroscience | 2006

Spike Count Reliability and the Poisson Hypothesis

Asohan Amarasingham; Ting-Li Chen; Stuart Geman; Matthew T. Harrison; David L. Sheinberg

The variability of cortical activity in response to repeated presentations of a stimulus has been an area of controversy in the ongoing debate regarding the evidence for fine temporal structure in nervous system activity. We present a new statistical technique for assessing the significance of observed variability in the neural spike counts with respect to a minimal Poisson hypothesis, which avoids the conventional but troubling assumption that the spiking process is identically distributed across trials. We apply the method to recordings of inferotemporal cortical neurons of primates presented with complex visual stimuli. On this data, the minimal Poisson hypothesis is rejected: the neuronal responses are too reliable to be fit by a typical firing-rate model, even allowing for sudden, time-varying, and trial-dependent rate changes after stimulus onset. The statistical evidence favors a tightly regulated stimulus response in these neurons, close to stimulus onset, although not further away.
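The dispersion argument at stake can be illustrated with the Fano factor (the variance-to-mean ratio of spike counts). This is not the paper's test statistic, which handles trial-dependent rates far more carefully, but it shows the basic point: any mixture of Poisson counts is at least Poisson-dispersed, so counts that are too reliable are incompatible with a Poisson model. A minimal numpy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def fano_factor(counts):
    """Fano factor: variance-to-mean ratio of spike counts across trials."""
    return counts.var(ddof=1) / counts.mean()

# Under any Poisson model, even one with a trial-dependent rate, counts are
# at least as dispersed as a fixed-rate Poisson (mixtures of Poissons are
# overdispersed), so the Fano factor is about 1 or larger.
rates = rng.uniform(5.0, 15.0, size=20000)        # trial-dependent rates
poisson_counts = rng.poisson(rates)               # Poisson spike counts

# A tightly regulated response: nearly the same count on every trial.
reliable_counts = 10 + (rng.random(20000) < 0.1)

print(fano_factor(poisson_counts))    # well above 1: overdispersed
print(fano_factor(reliable_counts))   # far below 1: too reliable for Poisson
```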


Journal of Applied Statistics | 2014

Image warping using radial basis functions

Ting-Li Chen; Stuart Geman

Image warping is the process of deforming an image through a transformation of its domain, which is typically a subset of R^2. Given the destination of a collection of points, the problem becomes one of finding a suitable smooth interpolation for the destinations of the remaining points of the domain. A common solution is to use the thin plate spline (TPS). We find that the TPS often introduces unintended distortions of image structures. In this paper, we will analyze interpolation by TPS, experiment with other radial basis functions, and suggest two alternative functions that provide better results.
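The setup can be sketched in a few lines (a hypothetical numpy illustration, not the paper's code): a radial-basis warp interpolates the prescribed control-point destinations exactly and extends them smoothly to the rest of the domain, and swapping the TPS kernel r^2 log r for another radial function changes how the deformation spreads.

```python
import numpy as np

def rbf_warp(src_pts, dst_pts, phi):
    """Build a 2-D warp mapping src control points exactly onto dst points.

    The warp is f(x) = sum_i w_i * phi(|x - src_i|) per coordinate, plus an
    affine term, with the weights found by solving the usual augmented
    linear system for RBF interpolation."""
    n = len(src_pts)
    d = np.linalg.norm(src_pts[:, None, :] - src_pts[None, :, :], axis=-1)
    A = np.zeros((n + 3, n + 3))
    A[:n, :n] = phi(d)
    A[:n, n] = 1.0
    A[:n, n + 1:] = src_pts
    A[n, :n] = 1.0
    A[n + 1:, :n] = src_pts.T
    b = np.zeros((n + 3, 2))
    b[:n] = dst_pts
    w = np.linalg.solve(A, b)

    def f(x):
        r = np.linalg.norm(x[:, None, :] - src_pts[None, :, :], axis=-1)
        return phi(r) @ w[:n] + w[n] + x @ w[n + 1:]
    return f

# Thin plate spline kernel r^2 log r (the common choice) vs a Gaussian.
tps = lambda r: np.where(r > 0, r**2 * np.log(np.maximum(r, 1e-12)), 0.0)
gauss = lambda r: np.exp(-(r / 0.5) ** 2)

src = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.], [0.5, 0.5]])
dst = src.copy()
dst[4] += 0.1                      # nudge the centre control point
warp = rbf_warp(src, dst, tps)
print(warp(src))                   # control points land exactly on dst
```

Evaluating the two kernels on a dense grid of domain points is how one would visualize the different distortion behaviours the paper discusses.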


Biomedical Engineering and Informatics | 2009

A Markov Random Field Model for Medical Image Denoising

Ting-Li Chen

In this paper, we model the image prior using a Markov random field. It is difficult to model image priors directly on the intensity value of each pixel, as the relationships between pixel intensities are extremely complicated. Instead, we model the probability of observing the filter responses. The 5×5 filters are learned by PCA on 5×5 patches. The distributions of the filter responses are modeled by double-exponential distributions, with parameters also obtained from PCA. Based on this prior model, denoising is carried out by Bayesian analysis: the clean image is the most likely image given the observation and the prior knowledge. We perform gradient ascent on the logarithm of the posterior probability to find the most likely image. We apply this denoising algorithm to fMRI and ultrasound images and obtain very good denoising results.
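A stripped-down sketch of this style of MAP denoising (with assumed details: plain horizontal and vertical difference filters stand in for the paper's PCA-learned 5×5 filters, and the noise is taken to be Gaussian with known variance):

```python
import numpy as np

def denoise(y, sigma=0.1, lam=3.0, step=0.005, iters=300):
    """Gradient ascent on the log posterior
        log p(x | y) = -||y - x||^2 / (2 sigma^2) - lam * sum_k |f_k * x| + C,
    i.e. a Gaussian likelihood plus double-exponential (Laplace) priors on
    filter responses. The filters here are simple differences; the paper
    instead learns 5x5 filters by PCA."""
    x = y.copy()
    for _ in range(iters):
        dh = x - np.roll(x, 1, axis=1)       # horizontal filter response
        dv = x - np.roll(x, 1, axis=0)       # vertical filter response
        # Subgradient of the Laplace prior term with respect to each pixel.
        prior = -lam * (np.sign(dh) - np.roll(np.sign(dh), -1, axis=1)
                        + np.sign(dv) - np.roll(np.sign(dv), -1, axis=0))
        likelihood = (y - x) / sigma**2
        x = x + step * (likelihood + prior)
    return x

rng = np.random.default_rng(0)
clean = np.full((32, 32), 0.5)
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
restored = denoise(noisy)
```

The Laplace prior acts like a soft threshold on filter responses, so the restored image has visibly lower total variation than the noisy input while the likelihood term keeps it close to the observation.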


IEEE Transactions on Information Theory | 2008

On the Minimum Entropy of a Mixture of Unimodal and Symmetric Distributions

Ting-Li Chen; Stuart Geman

Progressive encoding of a signal generally involves an estimation step, designed to reduce the entropy of the residual of an observation over the entropy of the observation itself. Oftentimes the conditional distributions of an observation, given already-encoded observations, are well fit within a class of symmetric and unimodal distributions (e.g., the two-sided geometric distributions in images of natural scenes, or symmetric Paretian distributions in models of financial data). It is common practice to choose an estimator that centers, or aligns, the modes of the conditional distributions, since it is common sense that this will minimize the entropy, and hence the coding cost of the residuals. But with the exception of a special case, there has been no rigorous proof. Here we prove that the entropy of an arbitrary mixture of symmetric and unimodal distributions is minimized by aligning the modes. The result generalizes to unimodal and rotation-invariant distributions in R^n. We illustrate the result through some experiments with natural images.
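The claim is easy to check numerically. The sketch below (an illustration, not the paper's proof) computes the differential entropy of an equal mixture of two Laplace densities by numerical integration: the entropy grows as the modes are pulled apart and is smallest when they coincide.

```python
import numpy as np

def laplace_pdf(x, mu, b=1.0):
    """Density of the Laplace (double-exponential) distribution."""
    return np.exp(-np.abs(x - mu) / b) / (2 * b)

def mixture_entropy(offset, grid):
    """Differential entropy of an equal mixture of two Laplace densities
    whose modes are `offset` apart, by numerical integration on `grid`."""
    p = 0.5 * laplace_pdf(grid, 0.0) + 0.5 * laplace_pdf(grid, offset)
    dx = grid[1] - grid[0]
    return -np.sum(p * np.log(p)) * dx

grid = np.linspace(-30, 30, 60001)
entropies = {d: mixture_entropy(d, grid) for d in (0.0, 0.5, 1.0, 2.0)}
# Entropy increases with the distance between the modes; the aligned
# mixture (offset 0) attains the minimum, as the theorem asserts.
print(entropies)
```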


Journal of Multivariate Analysis | 2016

On the weak convergence and Central Limit Theorem of blurring and nonblurring processes with application to robust location estimation

Ting-Li Chen; Hironori Fujisawa; Su-Yun Huang; Chii-Ruey Hwang

This article studies the weak convergence and the associated Central Limit Theorem for blurring and nonblurring processes. These results are then applied to the estimation of a location parameter. Simulation studies show that the location estimation based on the convergence point of the blurring process is more robust, and often more efficient, than that of the nonblurring process.
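A one-dimensional toy sketch of the two processes (hypothetical code; the Gaussian kernel, the bandwidth, and the final "largest group" read-out are my simplifications of the papers' procedure): the blurring process updates the sample itself at every step, the nonblurring process moves points against the fixed original sample, and both yield a location estimate that ignores gross outliers, unlike the sample mean.

```python
import numpy as np

rng = np.random.default_rng(1)
# 95 "good" observations around 0 plus 5 gross outliers near 20.
data = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(20.0, 0.5, 5)])

def kernel(u, h):
    return np.exp(-0.5 * (u / h) ** 2)

def nonblurring(x, h=1.0, iters=300):
    """Nonblurring process (ordinary mean-shift): the points move, but
    averages are always taken over the fixed original sample."""
    y = x.copy()
    for _ in range(iters):
        w = kernel(y[:, None] - x[None, :], h)
        y = (w @ x) / w.sum(axis=1)
    return y

def blurring(x, h=1.0, iters=300):
    """Blurring process: the sample itself is updated at every step."""
    y = x.copy()
    for _ in range(iters):
        w = kernel(y[:, None] - y[None, :], h)
        y = (w @ y) / w.sum(axis=1)
    return y

def largest_group(y, gap=1.0):
    """Location of the largest group of collapsed points."""
    s = np.sort(y)
    groups = np.split(s, np.flatnonzero(np.diff(s) > gap) + 1)
    return max(groups, key=len).mean()

# Both process-based estimates stay near 0; the mean is dragged upward.
print(largest_group(blurring(data)),
      largest_group(nonblurring(data)),
      data.mean())
```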


Statistics & Probability Letters | 2018

On the asymptotic variance of reversible Markov chain without cycles

Chi-Hao Wu; Ting-Li Chen

Markov chain Monte Carlo (MCMC) is a popular approach to sampling from high-dimensional distributions, and the asymptotic variance is a commonly used criterion to evaluate performance. While most popular MCMC algorithms are reversible, there is a growing literature on the development and analysis of nonreversible MCMC. Chen and Hwang (2013) showed that a reversible MCMC can be improved by adding an antisymmetric perturbation, and conjectured that it cannot be improved when there is no cycle in the corresponding graph. In this paper, we present a rigorous proof of this conjecture. The proof is based on the fact that a transition matrix with an acyclic structure produces the minimum commute time between vertices.
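The objects in play can be made concrete (a sketch under my own choice of chain and test function; the "rotational" term below is one example of an antisymmetric perturbation preserving the stationary distribution). For the random walk on a 3-cycle, the standard fundamental-matrix formula shows the perturbation lowers the asymptotic variance, consistent with Chen and Hwang (2013); on an acyclic graph no such improvement is available, which is the conjecture proved here.

```python
import numpy as np

def asymptotic_variance(P, pi, f):
    """sigma^2 = lim_n Var(sum_k f(X_k)) / n for transition matrix P with
    stationary law pi, via sigma^2 = <f~, (2 Z - I) f~>_pi, where
    Z = (I - P + Pi)^{-1} is the fundamental matrix and f~ = f - pi.f."""
    n = len(pi)
    Pi = np.tile(pi, (n, 1))
    ft = f - pi @ f
    Z = np.linalg.inv(np.eye(n) - P + Pi)
    return ft @ (np.diag(pi) @ (2 * Z - np.eye(n))) @ ft

# Random walk on a 3-cycle: reversible, uniform stationary distribution.
P = np.zeros((3, 3))
for i in range(3):
    P[i, (i + 1) % 3] = 0.5
    P[i, (i - 1) % 3] = 0.5
pi = np.full(3, 1 / 3)
f = np.array([1.0, 0.0, 0.0])

# The same chain plus an antisymmetric rotational perturbation, which keeps
# pi stationary but makes the chain nonreversible.
eps = 0.4
Q = P.copy()
for i in range(3):
    Q[i, (i + 1) % 3] += eps
    Q[i, (i - 1) % 3] -= eps

print(asymptotic_variance(P, pi, f), asymptotic_variance(Q, pi, f))
```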


SIAM Journal on Control and Optimization | 2018

Optimal Variance Reduction for Markov Chain Monte Carlo

Lu-Jing Huang; Yin-Ting Liao; Ting-Li Chen; Chii-Ruey Hwang

Markov chain Monte Carlo (MCMC) has been widely used to approximate the expectation of the statistic of a given probability measure π on a finite set, and the asymptotic variance is a typical a...


Journal of Statistical Computation and Simulation | 2016

On the strengths of the self-updating process clustering algorithm

Shang-Ying Shiu; Ting-Li Chen

The self-updating process (SUP) is a clustering algorithm that stands from the viewpoint of data points and simulates how data points move and perform self-clustering. It is an iterative process on the sample space that allows for both time-varying and time-invariant operators. By simulations and comparisons, this paper shows that SUP is particularly competitive in clustering (i) data with noise, (ii) data with a large number of clusters, and (iii) unbalanced data. When noise is present in the data, SUP is able to isolate the noise points while performing clustering simultaneously. The property of local updating enables SUP to handle data with a large number of clusters and data of various structures. This paper also shows that the blurring mean-shift is a static SUP, so the discussion of the strengths of SUP applies to the blurring mean-shift as well.


Statistics & Probability Letters | 2013

Accelerating reversible Markov chains

Ting-Li Chen; Chii-Ruey Hwang


Annals of the Institute of Statistical Mathematics | 2015

On the convergence and consistency of the blurring mean-shift process

Ting-Li Chen
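Since the blurring mean-shift is a static SUP, the clustering behaviour described above can be sketched in a few lines (a hypothetical one-dimensional toy, with the bandwidth chosen by hand):

```python
import numpy as np

def blurring_mean_shift(x, h=0.5, iters=60):
    """Blurring mean-shift (a static SUP): every point is simultaneously
    replaced by the kernel-weighted average of the current sample."""
    x = np.asarray(x, dtype=float).copy()
    for _ in range(iters):
        w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
        x = (w @ x) / w.sum(axis=1)
    return x

rng = np.random.default_rng(2)
data = np.concatenate([rng.normal(c, 0.1, 30) for c in (0.0, 3.0, 6.0)])
collapsed = blurring_mean_shift(data)

# Points belonging to one cluster coincide after the collapse, so clusters
# can be counted from the large gaps in the sorted collapsed sample.
s = np.sort(collapsed)
n_clusters = 1 + int(np.sum(np.diff(s) > 1.0))
print(n_clusters)   # 3
```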

Collaboration


Ting-Li Chen's top co-authors:

Shang-Ying Shiu (National Taipei University)

Hironori Fujisawa (Graduate University for Advanced Studies)

Chi-Hao Wu (National Taiwan University)