Kenny Ye
Stony Brook University
Publications
Featured research published by Kenny Ye.
Nature | 2011
Ryan E. Mills; Klaudia Walter; Chip Stewart; Robert E. Handsaker; Ken Chen; Can Alkan; Alexej Abyzov; Seungtai Yoon; Kai Ye; R. Keira Cheetham; Asif T. Chinwalla; Donald F. Conrad; Yutao Fu; Fabian Grubert; Iman Hajirasouliha; Fereydoun Hormozdiari; Lilia M. Iakoucheva; Zamin Iqbal; Shuli Kang; Jeffrey M. Kidd; Miriam K. Konkel; Joshua M. Korn; Ekta Khurana; Deniz Kural; Hugo Y. K. Lam; Jing Leng; Ruiqiang Li; Yingrui Li; Chang-Yun Lin; Ruibang Luo
Genomic structural variants (SVs) are abundant in humans, differing from other forms of variation in extent, origin and functional impact. Despite progress in SV characterization, the nucleotide resolution architecture of most SVs remains unknown. We constructed a map of unbalanced SVs (that is, copy number variants) based on whole genome DNA sequencing data from 185 human genomes, integrating evidence from complementary SV discovery approaches with extensive experimental validations. Our map encompassed 22,025 deletions and 6,000 additional SVs, including insertions and tandem duplications. Most SVs (53%) were mapped to nucleotide resolution, which facilitated analysing their origin and functional impact. We examined numerous whole and partial gene deletions with a genotyping approach and observed a depletion of gene disruptions amongst high frequency deletions. Furthermore, we observed differences in the size spectra of SVs originating from distinct formation mechanisms, and constructed a map of SV hotspots formed by common mechanisms. Our analytical framework and SV map serve as a resource for sequencing-based association studies.
Nature | 2014
Ivan Iossifov; Brian J. O'Roak; Stephan J. Sanders; Michael Ronemus; Niklas Krumm; Dan Levy; Holly A.F. Stessman; Kali Witherspoon; Laura Vives; Karynne E. Patterson; Joshua D. Smith; Bryan W. Paeper; Deborah A. Nickerson; Jeanselle Dea; Shan Dong; Luis E. Gonzalez; Jeffrey D. Mandell; Shrikant Mane; Catherine Sullivan; Michael F. Walker; Zainulabedin Waqar; Liping Wei; A. Jeremy Willsey; Boris Yamrom; Yoon Lee; Ewa Grabowska; Ertugrul Dalkic; Zihua Wang; Steven Marks; Peter Andrews
Whole exome sequencing has proven to be a powerful tool for understanding the genetic architecture of human disease. Here we apply it to more than 2,500 simplex families, each having a child with an autistic spectrum disorder. By comparing affected to unaffected siblings, we show that 13% of de novo missense mutations and 43% of de novo likely gene-disrupting (LGD) mutations contribute to 12% and 9% of diagnoses, respectively. Including copy number variants, coding de novo mutations contribute to about 30% of all simplex and 45% of female diagnoses. Almost all LGD mutations occur opposite wild-type alleles. LGD targets in affected females significantly overlap the targets in males of lower intelligence quotient (IQ), but neither overlaps significantly with targets in males of higher IQ. We estimate that LGD mutation in about 400 genes can contribute to the joint class of affected females and males of lower IQ, with an overlapping and similar number of genes vulnerable to contributory missense mutation. LGD targets in the joint class overlap with published targets for intellectual disability and schizophrenia, and are enriched for chromatin modifiers, FMRP-associated genes and embryonically expressed genes. Most of the significance for the latter comes from affected females.
Journal of Statistical Planning and Inference | 2000
Kenny Ye; William Li; Agus Sudjianto
We propose symmetric Latin hypercubes for designs of computer experiments. The goal is to offer a compromise between computing effort and design optimality. The proposed class of designs has some advantages over the regular Latin hypercube design with respect to criteria such as entropy and the minimum intersite distance. An exchange algorithm is proposed for constructing optimal symmetric Latin hypercube designs. This algorithm is compared with two existing algorithms by Park (1994. J. Statist. Plann. Inference 39, 95–111) and Morris and Mitchell (1995. J. Statist. Plann. Inference 43, 381–402). Some examples, including a real case study in the automotive industry, are used to illustrate the performance of the new designs and the algorithms.
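The defining symmetry (for every design run, its mirror image through the centre of the levels is also a run) is easy to sketch. The function below is an illustrative random construction of a symmetric Latin hypercube, not the exchange algorithm proposed in the paper; the name `symmetric_lhd` is hypothetical:

```python
import random

def symmetric_lhd(n, m, seed=0):
    """Random symmetric Latin hypercube with n runs, m factors, levels 1..n.
    For every row x, the reflected row (n+1-x_1, ..., n+1-x_m) is also a row."""
    rng = random.Random(seed)
    half = n // 2
    cols = []
    for _ in range(m):
        levels = list(range(1, half + 1))  # one representative from each level pair {lv, n+1-lv}
        rng.shuffle(levels)
        # per entry, randomly pick the low or the high member of the pair
        cols.append([lv if rng.random() < 0.5 else n + 1 - lv for lv in levels])
    rows = [list(r) for r in zip(*cols)]          # first half of the design
    rows += [[n + 1 - x for x in row] for row in rows]  # mirrored runs
    if n % 2 == 1:
        rows.append([(n + 1) // 2] * m)           # centre point when n is odd
    return rows
```

Because each column uses exactly one member of every pair {lv, n+1-lv} in the first half and its complement in the mirrored half, every column is a permutation of 1..n, so the result is a genuine Latin hypercube.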
Journal of the American Statistical Association | 1998
Kenny Ye
Latin hypercubes have been frequently used in conducting computer experiments. In this paper, a class of orthogonal Latin hypercubes that preserves orthogonality among columns is proposed. Applying an orthogonal Latin hypercube design to a computer experiment benefits the data analysis in two ways. First, it retains the orthogonality of traditional experimental designs. The estimates of linear effects of all factors are uncorrelated not only with each other, but also with the estimates of all quadratic effects and bilinear interactions. Second, it can facilitate nonparametric fitting procedures, because one can select good space-filling designs within the class of orthogonal Latin hypercubes according to selection criteria.
Annals of Biomedical Engineering | 2003
Stefan Judex; Steve Boyd; Yi-Xian Qin; Simon Turner; Kenny Ye; Ralph Müller; Clinton T. Rubin
Extremely low magnitude mechanical stimuli (<10 microstrain) induced at high frequencies are anabolic to trabecular bone. Here, we used finite element (FE) modeling to investigate the mechanical implications of a one year mechanical intervention. Adult female sheep stood with their hindlimbs either on a vibrating plate (30 Hz, 0.3 g) for 20 min/d, 5 d/wk or on an inactive plate. Microcomputed tomography data of 1 cm bone cubes extracted from the medial femoral condyles were transformed into FE meshes. Simulated compressive loads applied to the trabecular meshes in the three orthogonal directions indicated that the low level mechanical intervention significantly increased the apparent trabecular tissue stiffness of the femoral condyle in the longitudinal (+17%, p < 0.02), anterior–posterior (+29%, p < 0.01), and medial-lateral (+37%, p < 0.01) direction, thus reducing apparent strain magnitudes for a given applied load. For a given apparent input strain (or stress), the resultant stresses and strains within trabeculae were more uniformly distributed in the off-axis loading directions in cubes of mechanically loaded sheep. These data suggest that trabecular bone responds to low level mechanical loads with intricate adaptations beyond a simple reduction in apparent strain magnitude, producing a structure that is stiffer and less prone to fracture for a given load.
Technometrics | 2003
William Li; Dennis K. J. Lin; Kenny Ye
This article considers optimal foldover plans for nonregular designs. By using the indicator function, we define words with fractional lengths. The extended word-length pattern is then used to select among nonregular foldover designs. Some general properties of foldover designs are obtained using the indicator function. We prove that the full-foldover plan that reverses the signs of all factors is optimal for all 12-run and 20-run orthogonal designs. The optimal foldover plans for all 16-run (regular and nonregular) orthogonal designs are constructed and tabulated for practical use. Optimal foldover plans for higher-order orthogonal designs can be constructed in a similar manner.
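The full-foldover plan shown optimal above simply reverses the sign of every factor in every run and appends the mirrored runs. A minimal sketch (the function name is illustrative): folding over a resolution III half fraction recovers the full factorial, the classic use of foldover to break alias chains:

```python
from itertools import product

def full_foldover(design):
    """Append the mirror-image runs: reverse the sign of every factor
    in every run of a two-level (+1/-1) design."""
    return design + [[-x for x in run] for run in design]

# 2^(3-1) half fraction with defining relation I = ABC (runs where A*B*C = +1)
half = [[+1, +1, +1], [+1, -1, -1], [-1, +1, -1], [-1, -1, +1]]

folded = full_foldover(half)
# the folded runs (A*B*C = -1) together with the original half give the full 2^3 factorial
```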
Journal of Quality Technology | 2002
Vijayan N. Nair; Winson Taam; Kenny Ye
Robust design studies with functional responses are becoming increasingly common. The goal in these studies is to analyze location and dispersion effects and optimize performance over a range of input-output values. Taguchi and others have proposed the so-called signal-to-noise ratio analysis for robust design with dynamic characteristics. We consider more general and flexible methods for analyzing location and dispersion effects from such studies and use three real applications to illustrate the methods. Two applications demonstrate the usefulness of functional regression techniques for location and dispersion analysis while the third illustrates a parametric analysis with two-stage modeling. Both a mean-variance analysis for random selection of noise settings as well as a control-by-noise interaction analysis for explicitly controlled noise factors are considered.
Journal of Quality Technology | 2000
Kenny Ye; Michael S. Hamada
The Lenth method is an objective method for testing effects from unreplicated factorial designs and eliminates the subjectivity in using a half-normal plot. The Lenth statistics are computed for the factorial effects and compared to corresponding critical values. Since the distribution of the Lenth statistics is not mathematically tractable, we propose a simple simulation method to estimate the critical values. Confidence intervals for the estimated critical values can also easily be obtained. Tables of critical values are provided for a large number of designs, and their use is demonstrated with data from three experiments. The proposed method can also be adapted to estimate critical values for other methods.
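A minimal sketch of the simulation idea (function names are illustrative, and the quantile convention is an assumption): compute Lenth's pseudo standard error (PSE) from the effect estimates, then estimate a critical value for the Lenth statistic by repeatedly drawing null effects and taking a quantile of the simulated statistics:

```python
import random
import statistics

def pse(effects):
    """Lenth's pseudo standard error for unreplicated factorial effects:
    an initial estimate s0 from the median absolute effect, then a
    re-estimate after trimming effects larger than 2.5 * s0."""
    s0 = 1.5 * statistics.median(abs(c) for c in effects)
    trimmed = [abs(c) for c in effects if abs(c) < 2.5 * s0]
    return 1.5 * statistics.median(trimmed)

def lenth_critical_value(m, alpha=0.05, nsim=20000, seed=1):
    """Estimate the individual-error-rate critical value for the Lenth
    statistic |c_j| / PSE by simulating m null (standard normal) effects."""
    rng = random.Random(seed)
    stats = []
    for _ in range(nsim):
        effects = [rng.gauss(0, 1) for _ in range(m)]
        p = pse(effects)
        stats.extend(abs(c) / p for c in effects)
    stats.sort()
    return stats[int((1 - alpha) * len(stats))]
```

Effects whose Lenth statistic exceeds the simulated critical value are declared active; tightening `alpha` or switching to the maximum statistic per simulation would give an experimentwise rather than individual error rate.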
Technometrics | 2004
Shao Wei Cheng; William Li; Kenny Ye
This article discusses the optimal blocking criteria for nonregular two-level designs. We extend the optimal blocking criteria of Cheng and Wu to nonregular designs by adapting the G- and G2-minimum aberration criteria discussed by Tang and Deng. To define word-length pattern for nonregular designs, we extend the notion of “word” to nonregular designs through a polynomial representation of factorial designs. We define treatment resolution and block resolution for evaluating the degrees of aliasing and confounding. We propose four new criteria, which we use to search for optimal blocking schemes of 12-run, 16-run, and 20-run two-level orthogonal arrays.
Journal of Fluids Engineering-transactions of The Asme | 2002
B. DeVolder; James Glimm; John W. Grove; Y. Kang; Y. Lee; K. Pao; David H. Sharp; Kenny Ye
A general discussion of the quantification of uncertainty in numerical simulations is presented. A principal conclusion is that the distribution of solution errors is the leading term in the assessment of the validity of a simulation and its associated uncertainty in the Bayesian framework. Key issues that arise in uncertainty quantification are discussed for two examples drawn from shock wave physics and modeling of petroleum reservoirs. Solution error models, confidence intervals, and Gaussian error statistics based on simulation studies are presented.