Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Maria Schuld is active.

Publication


Featured research published by Maria Schuld.


Contemporary Physics | 2015

An introduction to quantum machine learning

Maria Schuld; Ilya Sinayskiy; Francesco Petruccione

Machine learning algorithms learn a desired input-output relation from examples in order to interpret new inputs. This is important for tasks such as image and speech recognition or strategy optimisation, with growing applications in the IT industry. In the last couple of years, researchers investigated if quantum computing can help to improve classical machine learning algorithms. Ideas range from running computationally costly algorithms or their subroutines efficiently on a quantum computer to the translation of stochastic methods into the language of quantum theory. This contribution gives a systematic overview of the emerging field of quantum machine learning. It presents the approaches as well as technical details in an accessible way, and discusses the potential of a future theory of quantum learning.


Quantum Information Processing | 2014

The quest for a Quantum Neural Network

Maria Schuld; Ilya Sinayskiy; Francesco Petruccione

With the overwhelming success in the field of quantum information in the last decades, the ‘quest’ for a Quantum Neural Network (QNN) model began in order to combine quantum computing with the striking properties of neural computing. This article presents a systematic approach to QNN research, which so far consists of a conglomeration of ideas and proposals. Concentrating on Hopfield-type networks and the task of associative memory, it outlines the challenge of combining the nonlinear, dissipative dynamics of neural computing and the linear, unitary dynamics of quantum computing. It establishes requirements for a meaningful QNN and reviews existing literature against these requirements. It is found that none of the proposals for a potential QNN model fully exploits both the advantages of quantum physics and computing in neural networks. An outlook on possible ways forward is given, emphasizing the idea of Open Quantum Neural Networks based on dissipative quantum computing.
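
For readers unfamiliar with the classical side of this comparison, here is a minimal sketch of a Hopfield-type associative memory with its nonlinear, dissipative update rule. The patterns and network size are made up for illustration, and the snippet does not reproduce any of the quantum proposals reviewed in the paper.

```python
import numpy as np

# Minimal classical Hopfield-type associative memory (illustrative only).
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])       # stored +/-1 patterns

# Hebbian weights W_ij = (1/N) sum_m p_i^m p_j^m with zero self-coupling
W = patterns.T @ patterns / patterns.shape[1]
np.fill_diagonal(W, 0)

def recall(x, n_iter=10):
    """Nonlinear, dissipative update x <- sign(W x) until it settles."""
    for _ in range(n_iter):
        x = np.sign(W @ x)
        x[x == 0] = 1                              # break ties
    return x

noisy = patterns[0].copy()
noisy[0] *= -1                                     # corrupt one bit
print(recall(noisy))                               # relaxes back to the stored pattern
```

The sign nonlinearity and the relaxation towards a stored pattern are exactly the features the abstract identifies as hard to reconcile with linear, unitary quantum dynamics.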


Pacific Rim International Conference on Artificial Intelligence | 2014

Quantum Computing for Pattern Classification

Maria Schuld; Ilya Sinayskiy; Francesco Petruccione

It is well known that for certain tasks, quantum computing outperforms classical computing. A growing number of contributions try to use this advantage in order to improve or extend classical machine learning algorithms by methods of quantum information theory. This paper gives a brief introduction to quantum machine learning using the example of pattern classification. We introduce a quantum pattern classification algorithm that draws on Trugenberger's proposal for measuring the Hamming distance on a quantum computer [C. A. Trugenberger, Phys. Rev. Lett. 87, 2001] and discuss its advantages using handwritten digit recognition based on the MNIST database.
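
As a rough classical analogue of such a classification rule, every stored binary pattern can vote for its class with a weight that decreases with its Hamming distance to the new input. The cosine-squared weighting below is an assumption chosen to mimic the amplitude-based weighting of the quantum routine, and the patterns and labels are invented for illustration.

```python
import numpy as np

# Hamming-distance-weighted voting classifier (classical illustration only).
def classify(x, patterns, labels):
    n = len(x)
    d = (patterns != x).sum(axis=1)              # Hamming distances to stored patterns
    w = np.cos(np.pi * d / (2 * n)) ** 2         # closer patterns get larger weights
    scores = {c: w[labels == c].sum() for c in np.unique(labels)}
    return max(scores, key=scores.get)

patterns = np.array([[0, 0, 1, 1], [0, 1, 1, 1], [1, 0, 0, 0], [1, 1, 0, 0]])
labels = np.array([0, 0, 1, 1])
print(classify(np.array([0, 0, 1, 0]), patterns, labels))   # -> 0
```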


Physical Review A | 2016

Prediction by linear regression on a quantum computer

Maria Schuld; Ilya Sinayskiy; Francesco Petruccione

We give an algorithm for prediction on a quantum computer which is based on a linear regression model with least-squares optimization. In contrast to related previous contributions suffering from the problem of reading out the optimal parameters of the fit, our scheme focuses on the machine-learning task of guessing the output corresponding to a new input given examples of data points. Furthermore, we adapt the algorithm to process nonsparse data matrices that can be represented by low-rank approximations, and significantly improve the dependency on its condition number. The prediction result can be accessed through a single-qubit measurement or used for further quantum information processing routines. The algorithm's runtime is logarithmic in the dimension of the input space, provided the data is given as quantum information as an input to the routine.
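
A purely classical sketch of the same prediction task, using an SVD-based pseudoinverse (which also covers low-rank data matrices), looks as follows. The data, sizes and truncation threshold are illustrative assumptions; the snippet says nothing about the quantum speedup itself.

```python
import numpy as np

# Least-squares prediction for a new input, without ever reporting the fit parameters.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                  # training inputs
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.normal(size=100)

U, s, Vt = np.linalg.svd(X, full_matrices=False)
s_inv = np.where(s > 1e-8, 1.0 / s, 0.0)       # truncate tiny singular values (low-rank case)
w = Vt.T @ (s_inv * (U.T @ y))                 # least-squares solution X^+ y

x_new = rng.normal(size=5)
print(float(x_new @ w))                        # the single-number prediction
```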


Physics Letters A | 2015

Simulating a perceptron on a quantum computer

Maria Schuld; Ilya Sinayskiy; Francesco Petruccione

Perceptrons are the basic computational unit of artificial neural networks, as they model the activation mechanism of an output neuron due to incoming signals from its neighbours. As linear classifiers, they play an important role in the foundations of machine learning. In the context of the emerging field of quantum machine learning, several attempts have been made to develop a corresponding unit using quantum information theory. Based on the quantum phase estimation algorithm, this paper introduces a quantum perceptron model imitating the step-activation function of a classical perceptron. This scheme requires resources in O(n) (where n is the size of the input) and promises efficient applications for more complex structures such as trainable quantum neural networks.
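
For reference, the classical step-activation perceptron that the quantum model imitates takes only a few lines; the weights and input below are arbitrary examples, not values from the paper.

```python
import numpy as np

# Classical perceptron with a step activation (illustrative only).
def perceptron(x, w, b=0.0):
    return 1 if np.dot(w, x) + b >= 0 else 0   # step function on the weighted sum

x = np.array([0.2, -1.0, 0.5])
w = np.array([1.0, 0.5, -0.3])
print(perceptron(x, w))                        # -> 0, the weighted sum is negative
```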


Physical Review A | 2014

Quantum walks on graphs representing the firing patterns of a quantum neural network

Maria Schuld; Ilya Sinayskiy; Francesco Petruccione

Quantum walks have been shown to be fruitful tools in analysing the dynamic properties of quantum systems. This article proposes to use quantum walks as an approach to Quantum Neural Networks (QNNs). QNNs replace binary McCulloch-Pitts neurons with a qubit in order to use the advantages of quantum computing in neural networks. A quantum walk on the firing states of such a QNN is supposed to simulate central properties of the dynamics of classical neural networks, such as associative memory. It is shown that a biased discrete Hadamard walk derived from the updating process of a biological neuron does not lead to a unitary walk. However, a Stochastic Quantum Walk between the global firing states of a QNN can be constructed and it is shown that it contains the feature of associative memory. The quantum contribution to the walk accounts for a modest speed-up in some regimes.
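
To illustrate the basic tool, the snippet below simulates a standard (unbiased) discrete-time Hadamard walk on a line. It is not the biased or stochastic walk constructed in the paper, and the step count is an arbitrary choice.

```python
import numpy as np

# Discrete-time Hadamard walk on a line (illustrative only).
n_steps = 50
n_pos = 2 * n_steps + 1                        # positions -n_steps .. +n_steps
state = np.zeros((n_pos, 2), dtype=complex)
state[n_steps, 0] = 1.0                        # walker at the origin, coin |0>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard coin

for _ in range(n_steps):
    state = state @ H.T                        # toss the coin at every position
    shifted = np.zeros_like(state)
    shifted[1:, 0] = state[:-1, 0]             # coin |0> shifts the walker right
    shifted[:-1, 1] = state[1:, 1]             # coin |1> shifts the walker left
    state = shifted

prob = (np.abs(state) ** 2).sum(axis=1)        # position distribution after 50 steps
print(int(prob.argmax()) - n_steps)            # most likely position sits far from the origin
```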


EPL | 2017

Implementing a distance-based classifier with a quantum interference circuit

Maria Schuld; Mark Fingerhuth; Francesco Petruccione

Lately, much attention has been given to quantum algorithms that solve pattern recognition tasks in machine learning. Many of these quantum machine learning algorithms try to implement classical models on large-scale universal quantum computers that have access to non-trivial subroutines such as Hamiltonian simulation, amplitude amplification and phase estimation. We approach the problem from the opposite direction and analyse a distance-based classifier that is realised by a simple quantum interference circuit. After state preparation, the circuit only consists of a Hadamard gate as well as two single-qubit measurements and can be implemented with small-scale setups available today. We demonstrate this using the IBM Quantum Experience and analyse the classifier with numerical simulations.
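
A classical emulation of a distance-based classifier of this flavour is easy to write down: on unit-length feature vectors, each training point votes with a weight that shrinks with its squared distance to the new input. The specific weighting 1 - |x - x_m|^2/(4M) and the toy data are assumptions for illustration; the quantum circuit evaluates the corresponding quantity through interference rather than explicitly.

```python
import numpy as np

# Distance-weighted voting on unit-length feature vectors (classical illustration).
def normalize(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

X = np.array([normalize([0.9, 1.1]), normalize([1.0, 0.8]),
              normalize([-1.0, 1.0]), normalize([-0.8, 1.2])])
y = np.array([1, 1, -1, -1])

def classify(x_new):
    x_new = normalize(x_new)
    M = len(X)
    weights = 1.0 - np.sum((x_new - X) ** 2, axis=1) / (4 * M)
    return int(np.sign(np.dot(y, weights)))    # sign of the weighted class vote

print(classify([1.0, 1.0]))                    # -> 1, closer to the first class
```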


Scientific Reports | 2018

Quantum ensembles of quantum classifiers

Maria Schuld; Francesco Petruccione

Quantum machine learning witnesses an increasing number of quantum algorithms for data-driven decision making, a problem with potential applications ranging from automated image recognition to medical diagnosis. Many of those algorithms are implementations of quantum classifiers, or models for the classification of data inputs with a quantum computer. Following the success of collective decision making with ensembles in classical machine learning, this paper introduces the concept of quantum ensembles of quantum classifiers. Creating the ensemble corresponds to a state preparation routine, after which the quantum classifiers are evaluated in parallel and their combined decision is accessed by a single-qubit measurement. This framework naturally allows for exponentially large ensembles in which, similar to Bayesian learning, the individual classifiers do not have to be trained. As an example, we analyse an exponentially large quantum ensemble in which each classifier is weighed according to its performance in classifying the training data, leading to new results for quantum as well as classical machine learning.
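
The classical intuition can be captured in a short accuracy-weighted ensemble: a collection of untrained classifiers (here small, in the quantum case exponentially large) votes on a new input, each weighted by its accuracy on the training set. The random linear classifiers and data below are illustrative assumptions, not the construction analysed in the paper.

```python
import numpy as np

# Accuracy-weighted ensemble of untrained linear classifiers (illustrative only).
rng = np.random.default_rng(1)
X_train = rng.normal(size=(50, 3))
y_train = np.sign(X_train @ np.array([1.0, -1.0, 0.5]))    # labels from a hidden rule

W = rng.normal(size=(200, 3))                              # ensemble: h_w(x) = sign(w . x)

train_preds = np.sign(X_train @ W.T)                       # (50, 200) predictions
accuracy = (train_preds == y_train[:, None]).mean(axis=0)  # per-classifier training accuracy

def ensemble_predict(x_new):
    votes = np.sign(W @ x_new)                             # each member's vote
    return int(np.sign(np.dot(accuracy, votes)))           # accuracy-weighted majority

print(ensemble_predict(np.array([0.5, -0.5, 0.1])))        # weighted ensemble prediction
```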


Archive | 2018

Quantum Computing for Training

Maria Schuld; Francesco Petruccione

The previous chapter looked into strategies of implementing inference algorithms on a quantum computer, or how to compute the prediction of a model using a quantum instead of a classical device. This chapter will be concerned with how to optimise models using quantum computers, a subject targeted by a large share of the quantum machine learning literature.


Archive | 2018

Learning with Quantum Models

Maria Schuld; Francesco Petruccione

The last two chapters were mainly concerned with the translation of known machine learning models and optimisation techniques into quantum algorithms in order to harvest potential runtime speedups known from quantum computing. This chapter will look into ‘genuine’ quantum models for machine learning which either have no direct equivalent in classical machine learning, or which are quantum extensions of classical models with a new quality of dynamics. A quantum model as we understand it here is a model function or distribution that is based on the mathematical formalism of quantum theory, or naturally implemented by a quantum device. For example, it has been obvious from the last chapters that Gibbs distributions play a prominent role in some areas of machine learning. At the same time, quantum systems can be in a ‘Gibbs state’. Previously, we described a number of attempts to use the quantum Gibbs states in order to sample from a (classical) Gibbs distribution. But what happens if we just use the ‘quantum Gibbs distribution’? What properties would such models or training schemes exhibit? What if we use other distributions that are easy to prepare on a quantum device but difficult on a classical one, and construct machine learning algorithms from them? How powerful are the classifiers constructed from variational circuits in Section, that is, if we use the input-output relation of a quantum circuit as a core machine learning model f(x) and train the circuit to generalise from data?
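
As a reminder of the classical object being contrasted here, the snippet below enumerates a Gibbs (Boltzmann) distribution p(s) proportional to exp(-E(s)/T) over a handful of binary configurations with an Ising-type energy. The couplings and temperature are made-up values, and the quantum Gibbs state is of course not captured by this classical calculation.

```python
import numpy as np
from itertools import product

# Classical Gibbs distribution over three +/-1 spins (illustrative only).
J = np.array([[0.0, 1.0, -0.5],
              [1.0, 0.0, 0.3],
              [-0.5, 0.3, 0.0]])                 # symmetric couplings
T = 1.0                                          # temperature

def energy(s):
    return -0.5 * s @ J @ s                      # Ising-type energy

states = [np.array(s) for s in product([-1, 1], repeat=3)]
weights = np.array([np.exp(-energy(s) / T) for s in states])
probs = weights / weights.sum()                  # normalised Gibbs probabilities

for s, p in zip(states, probs):
    print(s, round(float(p), 3))
```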

Collaboration


Dive into Maria Schuld's collaborations.

Top Co-Authors

Ilya Sinayskiy (University of KwaZulu-Natal)
Seth Lloyd (Massachusetts Institute of Technology)
Leonard Wossnig (University College London)