Publication


Featured research published by Edward K. Cheng.


Proceedings of Second International Workshop on Active Matrix Liquid Crystal Displays | 1995

Modeling of leakage current distributions in series connected polysilicon thin film transistors

Edward K. Cheng; J.C. Sturm; I-Wei Wu; Tsu-Jae King

Leakage current is an important parameter in thin-film transistors (TFTs) for achieving gray scales in active matrix liquid crystal displays (AMLCDs). Leakage regulation is especially important in polysilicon TFTs, where the leakage is significantly greater than in their amorphous silicon counterparts. To reduce the leakage current, it is common practice to place two polysilicon TFTs in series, which may reduce the leakage current by over an order of magnitude. In this work, it is shown for the first time how the leakage current of a series-connected pair can be analytically predicted from the I-V characteristics of a single TFT, and how the leakage current distribution of single transistors may affect the distribution of the series-connected devices. These implications are important for estimating the pixel yield in AMLCDs from single-device characteristics.
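The core idea of the abstract, predicting series-pair leakage from a single device's I-V curve, can be sketched numerically: two identical TFTs in series must carry the same current, so the intermediate node voltage settles where the two device currents balance. The exponential leakage model and all numbers below are illustrative assumptions, not the paper's fitted model.

```python
import math

def leakage_single(v_ds, i0=1e-12, v0=2.0):
    # Hypothetical single-TFT leakage model (field-enhanced, roughly
    # exponential in drain-source voltage). Illustrative only.
    return i0 * math.exp(v_ds / v0)

def leakage_series(v_total, model, iters=60):
    # Two series-connected TFTs: bisect for the intermediate node voltage
    # v1 where model(v1) == model(v_total - v1), then return that current.
    lo, hi = 0.0, v_total
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if model(mid) > model(v_total - mid):
            hi = mid  # bottom device carries too much current; lower v1
        else:
            lo = mid
    v1 = 0.5 * (lo + hi)
    return model(v1)

v = 10.0
single = leakage_single(v)
series = leakage_series(v, leakage_single)
print(f"single: {single:.3e} A, series pair: {series:.3e} A, "
      f"reduction: {single / series:.1f}x")
```

With identical devices the node voltage lands at v/2, so each device sees half the stress; because the assumed leakage grows exponentially with voltage, the pair leaks over an order of magnitude less than a single device, consistent with the abstract's claim.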


Sociological Methods & Research | 2014

Comment on Dawid, Faigman, and Fienberg (2014)

Edward K. Cheng

Out of the morass that characterizes the law’s handling of statistical evidence, Dawid, Faigman, and Fienberg (DFF) have articulated an important distinction between the effects of causes and the causes of effects. Rather than blaming the legal system’s confusion over statistical evidence on ignorance or ideology, they more helpfully suggest that the culprit may be a failure to understand the context in which statistical studies typically arise. At its core, DFF’s contribution is a valuable call for precision. ‘‘Statistical evidence of causation’’ is not a wholesale category. And just as we would not conflate factual with proximate causation in tort law, or causation with correlation in statistics, neither should we conflate the effects of causes with the causes of effects. In addition to the applications offered in the article, DFF’s cause–effect distinction implicates, or at least evokes, several other discussions in the law of evidence. I hope to highlight three of these links in the brief remarks that follow.


The Journal of Legal Studies | 2018

Detection and Correction of Case-Publication Bias

Edward K. Cheng

Case-publication bias, the possibility that certain legal outcomes may be more likely to be published or observed than others, carries significant implications for both legal actors and researchers. In this article, I propose a method for detecting and correcting case-publication bias based on ideas from multiple-systems estimation, a technique traditionally used for estimating hidden populations. I apply the method to a simulated data set of admissibility decisions to confirm its efficacy, then to a newly collected data set on false-confession expert testimony, where the model estimates that the observed 16 percent admissibility rate may be in reality closer to 28 percent. The article thus draws attention to the problem of case-publication bias and offers a practical statistical tool for detecting and correcting it.
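The multiple-systems estimation idea the abstract invokes can be illustrated in its simplest two-source form, the Lincoln-Petersen estimator: if two overlapping sources each observe part of a hidden population of decisions, the overlap lets you estimate the total. The article's actual model is more elaborate; the source names and counts below are hypothetical.

```python
def lincoln_petersen(n1, n2, m):
    # Two-source capture-recapture estimate of a hidden population:
    # n1 cases found in source A, n2 in source B, m found in both.
    # Under independent "capture", total N is approximately n1 * n2 / m.
    if m == 0:
        raise ValueError("sources share no cases; estimate undefined")
    return n1 * n2 / m

# Hypothetical counts: 120 decisions in database A, 90 in B, 60 in both.
n_hat = lincoln_petersen(120, 90, 60)
observed = 120 + 90 - 60  # unique decisions actually seen
print(f"estimated total: {n_hat:.0f}, observed: {observed}, "
      f"estimated hidden: {n_hat - observed:.0f}")
```

Applying such an estimate separately to admitted and excluded decisions is one way differential publication rates by outcome, the bias the article targets, could be detected.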


International Journal of Evidence and Proof | 2016

DNA, Blue Bus, and phase changes

Edward K. Cheng; G. Alexander Nunn

In ‘Exploring the Proof Paradoxes’, Mike Redmayne comprehensively surveyed the puzzles at the intersection of law and statistics, the most famous of which is the Blue Bus problem, which prohibits legal actors from ascribing liability purely on the basis of probabilistic evidence. DNA evidence, however, is a longstanding exception to Blue Bus. Like Blue Bus, DNA presents probabilistic evidence of identity. Unlike Blue Bus, DNA is widely accepted as legitimate, even when it stands alone as so-called ‘naked’ statistical evidence. Observers often explain such DNA exceptionalism in two ways: either that people break down in extreme cases, or relatedly, that modern DNA testing generates effectively unique (as opposed to probabilistic) identifications. While both explanations are understandable, they are unsatisfying in certain ways. Breakdown theory seems unprincipled and falls victim to slippery slopes. Uniqueness theory rests on a fiction and fails to delineate a threshold for when probabilities are sufficiently small to be considered ‘unique’. In this paper inspired by our reading of Professor Redmayne’s piece, we propose a quantitative explanation for DNA exceptionalism. Specifically, we argue that as random match probabilities become smaller, the probability of error (i.e. mistaken identification) sharply transitions from high to low. This sharp change in probability, which we label a ‘phase change’, explains why legal actors can treat DNA as non-probabilistic evidence. The phase change further avoids slippery slope problems and helps define when one can legitimately treat DNA—or any similarly qualified forensic identification method for that matter—as a form of direct evidence.
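One simple way to see the sharp transition the abstract describes is to model the chance of a mistaken identification given a pool of alternative sources: the error probability collapses once the random match probability drops well below one over the pool size. This toy model and the pool size are assumptions for illustration, not necessarily the authors' exact formulation.

```python
def misidentification_prob(match_prob, pool_size):
    # P(matched individual is not the true source), assuming one true
    # source plus pool_size innocents who each match independently with
    # probability match_prob, and a uniform prior over candidates.
    expected_innocent_matches = pool_size * match_prob
    return expected_innocent_matches / (1.0 + expected_innocent_matches)

pool = 1_000_000  # hypothetical population of alternative sources
for p in (1e-3, 1e-6, 1e-9, 1e-12):
    print(f"match prob {p:.0e}: P(error) = {misidentification_prob(p, pool):.4f}")
```

The output swings from near-certain error to near-zero error as the match probability crosses roughly 1/pool_size, the kind of "phase change" that would let legal actors treat a sufficiently small random match probability as effectively direct evidence.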


Chance | 2016

A Bayesian Look at the Baby Annie Case

Edward K. Cheng

The problem of how to help juries handle forensic statistics and other types of statistical evidence has long bedeviled the legal system. One longstanding proposal is to use Bayes Theorem, which provides a rigorous framework that jurors can use to update their beliefs as various pieces of statistical evidence are admitted at trial. From their inception, however, Bayesian approaches to legal proof have been controversial. Proponents argue that a Bayesian approach can help jurors correctly incorporate quantitative data into their deliberations, avoid cognitive bias, and ultimately make more informed decisions. Opponents, by contrast, worry that such a technical framework in the hands of inexperienced laypersons is an invitation to error, and that its focus on quantitative evidence detrimentally devalues qualitative evidence, dehumanizes the trial process, and adversely affects the legal system’s legitimacy. In this article, we take a look at a recent tragic case from New York that throws the power and drawbacks of a Bayesian approach into stark relief. In January 2012, the New York Times chronicled the plight of Baby Annie Li, a two-month-old who prosecutors believed had died at the hands of her parents’ abuse. The physical evidence was damning. Baby Annie’s head injuries were consistent with child abuse: “a massive skull fracture from two ‘non-accidental’ blows that caused brain damage, hemorrhaging and eye injuries, as well as two broken legs and a fractured rib that had not fully healed.”
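The Bayesian updating the abstract describes is easiest to see in odds form: each piece of evidence multiplies the prior odds by its likelihood ratio. The prior and likelihood ratios below are hypothetical numbers for illustration only, not figures from the Baby Annie case.

```python
def update_odds(prior_odds, likelihood_ratio):
    # Bayes' theorem in odds form: posterior odds = prior odds * LR.
    return prior_odds * likelihood_ratio

def odds_to_prob(odds):
    return odds / (1.0 + odds)

# Hypothetical: prior odds of guilt 1:9, then two independent pieces of
# evidence with likelihood ratios 20 and 5.
odds = 1 / 9
for lr in (20.0, 5.0):
    odds = update_odds(odds, lr)
print(f"posterior probability of guilt: {odds_to_prob(odds):.3f}")
```

The appeal for proponents is exactly this mechanical transparency; the opponents' worry is that laypersons may misjudge the likelihood ratios or treat the arithmetic as more authoritative than the underlying evidence warrants.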


Archive | 2006

Modern Scientific Evidence: The Law and Science of Expert Testimony

Edward K. Cheng; David L. Faigman; Michael J. Saks; Joseph Sanders


Virginia Law Review | 2004

Does Frye or Daubert Matter? A Study of Scientific Admissibility Standards

Edward K. Cheng; Albert Yoon


Northwestern University Law Review | 2005

Structural Laws and the Puzzle of Regulating Behavior

Edward K. Cheng


Stanford Law Review | 2007

The Myth of the Generalist Judge

Edward K. Cheng


Journal of Law and Policy | 2005

Mitochondrial DNA: Emerging Legal Issues

Edward K. Cheng

Collaboration


Dive into Edward K. Cheng's collaborations.

Top Co-Authors

Tsu-Jae King

University of California
