Arthur Fine
University of Washington
Publications
Featured research published by Arthur Fine.
Journal of Mathematical Physics | 1982
Arthur Fine
We provide necessary and sufficient conditions for several observables to have a joint distribution. Applied to the bivalent observables of a quantum correlation experiment, these conditions turn out to be equivalent to the Bell inequalities, and also to the existence of deterministic hidden variables. We connect the no-hidden-variables theorem of Kochen and Specker to these conditions for joint distributions. We conclude with a new theorem linking joint distributions and commuting observables, and show how, in light of that theorem, violations of the Bell inequalities correspond to failures of commutativity.
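A concrete rendering of the conditions (our notation, not the paper's): for a two-setting, two-outcome correlation experiment with settings x, x' on one wing and y, y' on the other, a joint distribution for all four observables exists exactly when the Clauser-Horne form of the Bell inequalities holds,

\[
-1 \;\le\; P(x,y) + P(x,y') + P(x',y') - P(x',y) - P(x) - P(y') \;\le\; 0,
\]

where P(x,y) denotes the probability that both outcomes are +1 for the setting pair (x,y), P(x) and P(y') are the corresponding single-wing probabilities, and the remaining inequalities are obtained by permuting the settings.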
Synthese | 1982
Arthur Fine
This paper constructs two classes of models for the quantum correlation experiments used to test the Bell-type inequalities, synchronization models and prism models. Both classes employ deterministic hidden variables, satisfy the causal requirements of physical locality, and yield precisely the quantum mechanical statistics. In the synchronization models, the joint probabilities, for each emission, do not factor in the manner of stochastic independence, showing that such factorizability is not required for locality. In the prism models the observables are not random variables over a common space; hence these models throw into question the entire random variables idiom of the literature. Both classes of models appear to be testable.
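For orientation, the factorizability (stochastic-independence) condition that the synchronization models violate can be written, in notation of our own choosing rather than the paper's, as

\[
P(a, b \mid x, y, \lambda) \;=\; P(a \mid x, \lambda)\, P(b \mid y, \lambda),
\]

for hidden state \(\lambda\), analyzer settings \(x, y\), and outcomes \(a, b\). The models reproduce the quantum statistics and respect locality even though the joint probability for each emission fails to factor in this way.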
Foundations of Physics | 2005
Maximilian Schlosshauer; Arthur Fine
Recently, W. H. Zurek presented a novel derivation of the Born rule based on a mechanism termed environment-assisted invariance, or “envariance” [W. H. Zurek, Phys. Rev. Lett. 90(2), 120404 (2003)]. We review this approach and identify fundamental assumptions that have implicitly entered into it, emphasizing issues that any such derivation is likely to face.
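A rough sketch of the envariance argument under review (standard notation, not the paper's): for an equal-amplitude entangled state of system and environment,

\[
|\psi_{\mathcal{SE}}\rangle \;=\; \frac{1}{\sqrt{N}} \sum_{k=1}^{N} e^{i\phi_k}\, |s_k\rangle\, |\varepsilon_k\rangle ,
\]

a swap of two system states can be undone by a counterswap acting only on the environment, so, the argument goes, their probabilities cannot differ and must equal \(1/N\); a fine-graining step then extends this to unequal amplitudes, yielding the Born rule \(p_k = |c_k|^2\). The paper examines what must be assumed for steps of this kind to go through.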
Archive | 1994
Mara Beller; Arthur Fine
The EPR paper (Einstein, Podolsky and Rosen, 1935; hereafter “EPR”) appeared in the May 15, 1935 issue of Physical Review. The paper’s impact was due in large part to its demonstration of an incompatibility between quantum mechanics (if regarded as both correct and complete) and plausible principles concerning physical reality. Two other items appeared in Physical Review before Bohr’s own response: a note by Edwin C. Kemble (Kemble, 1935) and a letter by Arthur E. Ruark (Ruark, 1935). Each author attempted, in a different way, to rescue quantum mechanics from the EPR conclusion by questioning the concept of reality that underlay the EPR argument. Similarly, Schrödinger wrote to Pauli: “For me this note [the EPR paper] was the cause to rethink once again the issue (which we know essentially for a long time already) ... that the expressions ‘to have a value really’, ‘to be actually constituted so and so’ and similar [expressions] are senseless phrases” (von Meyenn, et al., eds., 1985, Vol. 2, 406).
Physical Review Letters | 2012
Maximilian Schlosshauer; Arthur Fine
Pusey, Barrett, and Rudolph introduce a new no-go theorem for hidden-variables models of quantum theory. We make precise the class of models targeted and construct equivalent models that evade the theorem. The theorem requires assumptions for models of composite systems, which we examine, determining compactness as the weakest assumption needed. On that basis, we demonstrate results of the Bell-Kochen-Specker theorem. Given compactness and the relevant class of models, the theorem can be seen as showing that some measurements on composite systems must have built-in inefficiencies, complicating its testing.
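For context, in the usual statement of the PBR result (our summary, not the paper's wording), a hidden-variables model assigns to each pure state \(|\psi\rangle\) a distribution \(\mu_\psi(\lambda)\) over ontic states, and the theorem concludes that distinct pure states must have non-overlapping distributions, provided independent preparations of two systems compose as a product,

\[
\mu_{\psi \otimes \phi}(\lambda_1, \lambda_2) \;=\; \mu_{\psi}(\lambda_1)\, \mu_{\phi}(\lambda_2).
\]

The compactness condition discussed in the paper concerns what, short of this full product assumption, composite-system models must satisfy for the theorem to go through.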
Foundations of Physics | 1989
Arthur Fine
This paper examines the efficiency problem involved in experimental tests of so-called “local” hidden variables. It separates the phenomenological locality at issue in the Bell case from Einstein’s different conception of locality, and shows how phenomenological locality also differs from the factorizability needed to derive the Bell inequalities in the stochastic case. It then pursues the question of whether factorizable, local models (or, equivalently, deterministic ones) exist for the experiments designed to test the Bell inequalities, thus rendering the experimental argument against them incomplete. This leads to an investigation of the so-called “prism models” and to new inequalities for a significant class of such models, inequalities that are testable even at the low efficiencies of the photon correlation experiments.
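One common way to display the kind of distinction drawn here (terminology and notation from the general literature, not necessarily the paper's): at the level of observed statistics, phenomenological locality requires only that each wing's outcome probabilities be independent of the distant setting,

\[
P(a \mid x, y) \;=\; P(a \mid x),
\]

whereas the factorizability used to derive the stochastic Bell inequalities is a condition on the hidden-variable level,

\[
P(a, b \mid x, y, \lambda) \;=\; P(a \mid x, \lambda)\, P(b \mid y, \lambda).
\]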
Foundations of Physics | 1978
Arthur Fine; Paul Teller
In the contemporary discussion of hidden variable interpretations of quantum mechanics, much attention has been paid to the “no hidden variable” proof contained in an important paper of Kochen and Specker. It is a little-noticed fact that Bell published a proof of the same result the preceding year, in his well-known 1966 article, where it is modestly described as a corollary to Gleason’s theorem. We want to bring out the great simplicity of Bell’s formulation of this result and to show how it can be extended in certain respects.
Foundations of Physics | 1991
Arthur Fine
This paper addresses the “inefficiency loophole” in the Bell theorem. We examine factorizable stochastic models for the Bell inequalities, where we allow the detection efficiency to depend both on the “hidden” state of the measured system and on its passage through an analyzer. We show that, nevertheless, if the efficiency functions are symmetric between the two wings of the experiment, one can dispense with supplementary assumptions and derive new inequalities that enable the models to be tested even in highly inefficient experiments.
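As a schematic rendering of the setup described (our notation; the paper's formulation may differ), the probability of a joint detection with outcomes a, b in such a model takes the form

\[
P(a, b, \mathrm{det} \mid x, y, \lambda) \;=\; \eta(\lambda, x)\, \eta(\lambda, y)\, P(a \mid x, \lambda)\, P(b \mid y, \lambda),
\]

with the symmetry assumption amounting to the same efficiency function \(\eta\) appearing in both wings; the new inequalities then constrain the observed coincidence statistics of any such model.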
Archive | 1996
Arthur Fine
Terms like “complementarity,” “potentia,” the “collapse” of the wave packet, “phenomena” identified with the whole experimental arrangement, and so forth, mark the standard interpretation of quantum mechanics. Despite different public faces, there is a core unifying theme. The theory is probabilistic and, in contrast with statistical mechanics, the standard interpretation regards the probabilities as objective, in the sense that it does not ground them in human limitations concerning knowledge of the finer details of things. The objectivity of the probabilities makes for indeterminism and, more fundamentally, for some sort of irrealism, since, according to the standard view, in significant situations there just are no finer details of things. The irreducibility of the probabilities might be thought to constitute a realism of a higher order (with respect to the probabilities themselves), except that in the standard interpretation the probabilities are entirely instrumental. They express a relation between a physical system and acts of measurement; they are probabilities for measurement outcomes. Thus, in a curious turnabout, despite the objectivity of the probabilities the observer enters quantum theory in a fundamental way. On Bohr’s view we are required to divide each experimental situation into an observer part, which is treated classically and to which we do not apply the quantum formalism, and a quantum part, to which we do. On Heisenberg’s view the probabilities in the wave function somehow objectively represent both real “potentialities” and also subjective knowledge. Standing outside the causal order, an act of measurement “actualizes” a potentiality and, when we take account of this actualization, our changing knowledge is again objectively represented by a new “collapsed” wave function. Either view makes a mystery of how any object ever comes to possess any property; that is, of how anything at all actually happens. Standardly, we are cautioned not to inquire further. Physics stops here.
Archive | 2007
Maximilian Schlosshauer; Arthur Fine
This is an introduction to decoherence with an emphasis on the foundational and conceptual aspects of the theory. It explores the extent to which decoherence suggests a solution to the measurement problem, and evaluates the role of decoherence in several different interpretations of quantum mechanics. [This paper is essentially a short version of: M. Schlosshauer, “Decoherence, the measurement problem, and interpretations of quantum mechanics,” Reviews of Modern Physics 76: 1267–1305 (2004).]
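For context, the basic mechanism surveyed can be stated compactly in standard textbook form (not specific to this paper): an interaction that entangles system and environment,

\[
\Big(\sum_i c_i\, |s_i\rangle\Big) |E_0\rangle \;\longrightarrow\; \sum_i c_i\, |s_i\rangle\, |E_i\rangle,
\]

suppresses local interference between the \(|s_i\rangle\), since the reduced density matrix

\[
\rho_{\mathcal S} \;=\; \mathrm{Tr}_{\mathcal E}\, |\Psi\rangle\langle\Psi| \;=\; \sum_{i,j} c_i c_j^{*}\, \langle E_j | E_i \rangle\, |s_i\rangle\langle s_j|
\]

becomes approximately diagonal in the pointer basis as the environmental states become nearly orthogonal, \(\langle E_j | E_i \rangle \approx \delta_{ij}\).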