Publication


Featured research published by Alastair A. Abbott.


Physical Review A | 2012

Strong Kochen-Specker theorem and incomputability of quantum randomness

Alastair A. Abbott; Cristian S. Calude; Jonathan Conder; Karl Svozil

The Kochen-Specker theorem shows the impossibility for a hidden variable theory to consistently assign values to certain (finite) sets of observables in a way that is non-contextual and consistent with quantum mechanics. If we require non-contextuality, the consequence is that many observables must not have pre-existing definite values. However, the Kochen-Specker theorem does not allow one to determine which observables must be value indefinite. In this paper we present an improvement on the Kochen-Specker theorem which allows one to actually locate observables which are provably value indefinite. Various technical and subtle aspects relating to this formal proof and its connection to quantum mechanics are discussed. This result is then utilized for the proposal and certification of a dichotomic quantum random number generator operating in a three-dimensional Hilbert space.
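A hedged formal sketch of the setting (my paraphrase, not taken from the paper): a noncontextual hidden-variable theory would need a value assignment v on the set O of one-dimensional projection observables of the three-dimensional Hilbert space, respecting the quantum constraint that exactly one projector in each orthonormal triple yields the outcome 1:

```latex
% Hypothetical noncontextual value assignment (illustrative paraphrase):
v : \mathcal{O} \to \{0, 1\}, \qquad
\sum_{i=1}^{3} v\!\left( |e_i\rangle\langle e_i| \right) = 1
\quad \text{for every orthonormal basis } \{ |e_1\rangle, |e_2\rangle, |e_3\rangle \}.
```

The Kochen-Specker theorem says no total assignment of this kind exists; the strengthened version above goes further and identifies specific observables on which any partial assignment must be undefined.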


Physical Review A | 2014

Value-indefinite observables are almost everywhere

Alastair A. Abbott; Cristian S. Calude; Karl Svozil

Kochen-Specker theorems assure the breakdown of certain types of noncontextual hidden-variable theories through the nonexistence of global, holistic frame functions; however, they do not allow us to identify where this breakdown occurs, nor the extent of it. It was recently shown [Phys. Rev. A 86, 062109 (2012)] that this breakdown does not occur everywhere; here we show that it is maximal in that it occurs almost everywhere and thus prove that quantum indeterminacy, often referred to as contextuality or value indefiniteness, is a global property as is often assumed. In contrast to the Kochen-Specker theorem, we only assume the weaker noncontextuality condition that any potential value assignments that may exist are locally noncontextual. Under this assumption, we prove that once a single arbitrary observable is fixed to occur with certainty, almost (i.e., with Lebesgue measure one) all remaining observables are indeterminate.
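In symbols, the headline result can be transcribed as follows (a hedged paraphrase, writing P_phi = |phi><phi| for a one-dimensional projection observable and v for a partial, locally noncontextual value assignment):

```latex
% Hedged transcription of the "almost everywhere" statement: once one
% projector is assigned the value 1, the definite-valued projectors
% form a set of measure zero.
v(P_\psi) = 1 \;\Longrightarrow\;
\mu\bigl( \{ \phi : v(P_\phi) \text{ is defined} \} \bigr) = 0,
```

where mu denotes the normalised Lebesgue measure on the unit sphere of the Hilbert space.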


Journal of Mathematical Physics | 2015

A variant of the Kochen-Specker theorem localising value indefiniteness

Alastair A. Abbott; Cristian S. Calude; Karl Svozil

The Kochen-Specker theorem proves the inability to assign, simultaneously, noncontextual definite values to all (of a finite set of) quantum mechanical observables in a consistent manner. If one assumes that any definite values behave noncontextually, one can nonetheless only conclude that some observables (in this set) are value indefinite. In this paper, we prove a variant of the Kochen-Specker theorem showing that, under the same assumption of noncontextuality, if a single one-dimensional projection observable is assigned the definite value 1, then no one-dimensional projection observable that is incompatible (i.e., non-commuting) with this one can be assigned consistently a definite value. Unlike standard proofs of the Kochen-Specker theorem, in order to localise and show the extent of value indefiniteness, this result requires a constructive method of reduction between Kochen-Specker sets. If a system is prepared in a pure state ψ, then it is reasonable to assume that any value assignment (i.e., hidd...
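The localisation claimed in the abstract can be written compactly (my hedged transcription; for one-dimensional projections, incompatibility, i.e. non-commutation, amounts to the two unit vectors being neither parallel nor orthogonal):

```latex
% Hedged transcription of the variant theorem: fixing one projection
% observable to the value 1 leaves every incompatible projection
% observable value indefinite.
v(P_\psi) = 1 \ \text{ and } \ 0 < |\langle \psi | \phi \rangle| < 1
\;\Longrightarrow\; v(P_\phi) \text{ is undefined.}
```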


arXiv: Quantum Physics | 2010

Understanding the Quantum Computational Speed-up via De-quantisation

Alastair A. Abbott; Cristian S. Calude

While it seems possible that quantum computers may allow for algorithms offering a computational speed-up over classical algorithms for some problems, the issue is poorly understood. We explore this computational speed-up by investigating the ability to de-quantise quantum algorithms into classical simulations of the algorithms which are as efficient in both time and space as the original quantum algorithms. The process of de-quantisation helps formulate conditions to determine if a quantum algorithm provides a real speed-up over classical algorithms. These conditions can be used to develop new quantum algorithms more effectively (by avoiding features that could allow the algorithm to be efficiently classically simulated) and to create new classical algorithms (by using features which have proved valuable for quantum algorithms). Results on many different methods of de-quantisations are presented, as well as a general formal definition of de-quantisation. De-quantisations employing higher-dimensional classical bits, as well as those using matrix-simulations, put emphasis on entanglement in quantum algorithms; a key result is that any algorithm in which the entanglement is bounded is de-quantisable. These methods are contrasted with the stabiliser formalism de-quantisations due to the Gottesman-Knill Theorem, as well as those which take advantage of the topology of the circuit for a quantum algorithm. The benefits and limits of the different methods are discussed, and the importance of utilising a range of techniques is emphasised. We further discuss some features of quantum algorithms which current de-quantisation methods do not cover and highlight several important open questions in the area.
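To make the key result concrete, here is a minimal sketch of the zero-entanglement case (a simplification I chose for illustration, not the paper's general construction): a circuit of single-qubit gates keeps the register in a product state, so each qubit can be simulated by its own pair of amplitudes.

```python
import numpy as np

# Minimal sketch of the zero-entanglement case of de-quantisation
# (my simplification for illustration, not the paper's construction):
# a circuit of single-qubit gates keeps an n-qubit register in a
# product state, so each qubit can be tracked as its own 2-amplitude
# vector at cost O(n) per gate layer instead of O(2^n).

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def simulate_product_circuit(n_qubits, gates):
    """gates: list of (qubit_index, 2x2 unitary) pairs, applied in order."""
    qubits = [np.array([1.0, 0.0], dtype=complex) for _ in range(n_qubits)]
    for idx, u in gates:
        qubits[idx] = u @ qubits[idx]  # local update: O(1) per gate
    return qubits  # the full state is the (implicit) tensor product

# Hadamard on every qubit of a 20-qubit register: a naive state-vector
# simulation stores 2^20 amplitudes; this one stores 20 pairs.
state = simulate_product_circuit(20, [(i, H) for i in range(20)])
print(state[0])  # qubit 0 amplitudes: [0.707..., 0.707...]
```

The paper's bounded-entanglement results generalise this idea: any algorithm in which the entanglement stays bounded admits an efficient classical simulation of this kind.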


Mathematical Structures in Computer Science | 2014

A Quantum Random Number Generator Certified by Value Indefiniteness

Alastair A. Abbott; Cristian S. Calude; Karl Svozil

In this paper we propose a new ternary QRNG based on measuring located value indefinite observables with probabilities 1/4, 1/2, 1/4, and prove that every sequence generated is maximally unpredictable, 3-bi-immune (a stronger form of bi-immunity), and its prefixes are Borel normal. The ternary quantum random digits produced by the QRNG are algorithmically transformed into quantum random bits using an alphabetic morphism which preserves all the above properties.
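As a toy illustration of the pipeline described above, the sketch below simulates a ternary source with outcome probabilities 1/4, 1/2, 1/4 and applies a simple unbiasing morphism of my own choosing (0 to "0", 1 to the empty word, 2 to "1"); the paper's alphabetic morphism, which also preserves bi-immunity and Borel normality, is a different and stronger construction.

```python
import random

# Toy simulation of the ternary QRNG output distribution (1/4, 1/2, 1/4)
# followed by a simple unbiasing morphism of my own choosing: 0 -> "0",
# 1 -> the empty word, 2 -> "1". This is NOT the paper's alphabetic
# morphism, which additionally preserves bi-immunity and Borel normality.

def ternary_source(n, seed=0):
    """Draw n i.i.d. ternary digits with probabilities 1/4, 1/2, 1/4."""
    rng = random.Random(seed)
    return [rng.choices((0, 1, 2), weights=(1, 2, 1))[0] for _ in range(n)]

def to_bits(digits):
    """Erase the middle symbol; the surviving bits are i.i.d. and fair."""
    table = {0: "0", 1: "", 2: "1"}
    return "".join(table[d] for d in digits)

bits = to_bits(ternary_source(100_000))
print(len(bits), bits.count("1") / len(bits))  # ~50000 bits, frequency ~0.5
```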


arXiv: Quantum Physics | 2015

On the Unpredictability of Individual Quantum Measurement Outcomes

Alastair A. Abbott; Cristian S. Calude; Karl Svozil

We develop a general, non-probabilistic model of prediction which is suitable for assessing the (un)predictability of individual physical events. We use this model to provide, for the first time, a rigorous proof of the unpredictability of a class of individual quantum measurement outcomes, a well-known quantum attribute postulated or claimed for a long time.


Information-an International Interdisciplinary Journal | 2015

A Non-Probabilistic Model of Relativised Predictability in Physics

Alastair A. Abbott; Cristian S. Calude; Karl Svozil

Unpredictability is an important concept throughout physics and plays a central role in quantum information theory. Despite this, little effort has been devoted to studying generalised notions or models of (un)predictability in physics. In this paper, we continue the programme of developing a general, non-probabilistic model of (un)predictability in physics. We present a more refined model that is capable of studying different degrees of “relativised” unpredictability. This model is based on the ability of an agent, acting via uniform, effective means, to predict correctly and reproducibly the outcome of an experiment using finite information extracted from the environment. We use this model to study the degree of unpredictability certified by different quantum phenomena further, showing that quantum complementarity guarantees a form of relativised unpredictability that is weaker than that guaranteed by Kochen–Specker-type value indefiniteness. We exemplify further the difference between certification by complementarity and value indefiniteness by showing that, unlike value indefiniteness, complementarity is compatible with the production of computable sequences of bits.


Applied Mathematics and Computation | 2012

De-quantisation of the quantum Fourier transform

Alastair A. Abbott

The quantum Fourier transform (QFT) plays an important role in many known quantum algorithms such as Shor's algorithm for prime factorisation. In this paper we show that the QFT algorithm can, on a restricted set of input states, be de-quantised into a classical algorithm which is both more efficient and simpler than the quantum algorithm. By working directly with the algorithm instead of the circuit, we develop a simple classical version of the quantum basis-state algorithm. We formulate conditions for a separable state to remain separable after the QFT is performed, and use these conditions to extend the de-quantised algorithm to work on all such states without loss of efficiency. Our technique highlights the linearity of quantum mechanics as the fundamental feature accounting for the difference between quantum and de-quantised algorithms; it is this linearity which makes the QFT such a useful tool in quantum computation.
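The basis-state case mentioned in the abstract has a particularly clean classical form: QFT|x> is a product state, each qubit carrying a single phase determined by x. A short sketch under that standard identity (my illustration, not the paper's full algorithm):

```python
import numpy as np

# Sketch of the basis-state case (a standard identity, my illustration):
# for a computational basis state |x>, the n-qubit QFT output factorises as
#   QFT|x> = tensor_{l=1..n} (|0> + exp(2*pi*1j*x / 2**l) |1>) / sqrt(2),
# so it is described by n phases rather than 2^n amplitudes.

def qft_basis_state(x, n):
    """Return QFT|x> on n qubits as a list of per-qubit 2-vectors."""
    return [np.array([1.0, np.exp(2j * np.pi * x / 2**l)]) / np.sqrt(2)
            for l in range(1, n + 1)]

# O(2^n) sanity check of the O(n) description against the dense QFT output.
n, x = 3, 5
dense = np.array([np.exp(2j * np.pi * x * y / 2**n)
                  for y in range(2**n)]) / np.sqrt(2**n)
recombined = qft_basis_state(x, n)[0]
for factor in qft_basis_state(x, n)[1:]:
    recombined = np.kron(recombined, factor)
print(np.allclose(dense, recombined))  # True
```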


Physical Review A | 2016

Noise and disturbance of qubit measurements: An information-theoretic characterization

Alastair A. Abbott; Cyril Branciard



international conference on case based reasoning | 2011

Ontology-Aided product classification: a nearest neighbour approach

Alastair A. Abbott; Ian D. Watson


Collaboration


Dive into Alastair A. Abbott's collaborations.

Top Co-Authors

Karl Svozil, Vienna University of Technology
Julian Wechs, Centre national de la recherche scientifique
Richard Hua, University of Auckland
Bülent Demirel, Vienna University of Technology