Publication


Featured research published by Dan Ventura.


Information Sciences | 2000

Quantum associative memory

Dan Ventura; Tony R. Martinez

This paper combines quantum computation with classical neural network theory to produce a quantum computational learning algorithm. Quantum computation (QC) uses microscopic quantum level effects to perform computational tasks and has produced results that in some cases are exponentially faster than their classical counterparts. The unique characteristics of quantum theory may also be used to create a quantum associative memory (QuAM) with a capacity exponential in the number of neurons. This paper combines two quantum computational algorithms to produce such a quantum associative memory. The result is an exponential increase in the capacity of the memory when compared to traditional associative memories such as the Hopfield network. The paper covers necessary high-level quantum mechanical and quantum computational ideas and introduces a QuAM. Theoretical analysis proves the utility of the memory, and it is noted that a small version should be physically realizable in the near future.
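Grover-style amplitude amplification, the search primitive that QuAM recall builds on, is easy to simulate classically. The numpy sketch below illustrates only that primitive, not the paper's two-algorithm memory construction; the qubit count and marked pattern are made up:

```python
import numpy as np

# Minimal classical simulation of Grover-style amplitude amplification,
# the search primitive underlying QuAM recall. The paper's specific
# memory construction is not reproduced here.

n_qubits = 4
N = 2 ** n_qubits
marked = 0b1011          # pattern to recall (hypothetical)

state = np.full(N, 1 / np.sqrt(N))   # uniform superposition over all basis states

iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))  # standard Grover iteration count
for _ in range(iterations):
    state[marked] *= -1                  # oracle: phase-flip the marked pattern
    state = 2 * state.mean() - state     # diffusion: inversion about the mean

probs = state ** 2
print(f"P(marked) after {iterations} iterations: {probs[marked]:.3f}")
```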


Archive | 2000

Quantum Neural Networks

Alexandr A. Ezhov; Dan Ventura

This chapter outlines the research, development and perspectives of quantum neural networks, a burgeoning new field that integrates classical neurocomputing with quantum computation [1]. It is argued that the study of quantum neural networks may give us both a new understanding of brain function and unprecedented possibilities for creating new systems for information processing, including solving classically intractable problems, associative memory with exponential capacity and possibly overcoming the limitations posed by the Church-Turing thesis.


International Symposium on Neural Networks | 2004

Choosing a Starting Configuration for Particle Swarm Optimization

Mark Richards; Dan Ventura

The performance of particle swarm optimization (PSO) can be improved by strategically selecting the starting positions of the particles. This work proposes using the generators of centroidal Voronoi tessellations (CVT) as the starting points for the swarm. The performance of swarms initialized with this method is compared with the standard PSO algorithm on several standard test functions. Results suggest that CVT initialization improves PSO performance in high-dimensional spaces.
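As a rough illustration of the idea (an assumed reconstruction, not the authors' code), the sketch below approximates CVT generators with Lloyd's algorithm over a dense uniform sample and returns them as PSO starting positions:

```python
import numpy as np

def cvt_start_positions(n_particles, dim, bounds, n_samples=20000, iters=25, seed=0):
    """Approximate CVT generators via Lloyd's algorithm on a dense uniform
    sample, for use as PSO starting positions (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    samples = rng.uniform(lo, hi, size=(n_samples, dim))
    gens = rng.uniform(lo, hi, size=(n_particles, dim))  # initial generators
    for _ in range(iters):
        # assign each sample to its nearest generator
        d = np.linalg.norm(samples[:, None, :] - gens[None, :, :], axis=2)
        nearest = d.argmin(axis=1)
        # move each generator to the centroid of its Voronoi cell
        for k in range(n_particles):
            cell = samples[nearest == k]
            if len(cell):
                gens[k] = cell.mean(axis=0)
    return gens

swarm = cvt_start_positions(n_particles=30, dim=10, bounds=(-5.12, 5.12))
print(swarm.shape)  # (30, 10) evenly spread starting positions
```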


Briefings in Bioinformatics | 2015

LC-MS alignment in theory and practice: a comprehensive algorithmic review

Robert Smith; Dan Ventura; John T. Prince

Liquid chromatography-mass spectrometry is widely used for comparative replicate sample analysis in proteomics, lipidomics and metabolomics. Before statistical comparison, registration must be established to match corresponding analytes from run to run. Alignment, the most popular correspondence approach, consists of constructing a function that warps the content of runs to most closely match a given reference sample. To date, dozens of correspondence algorithms have been proposed, creating a daunting challenge for practitioners in algorithm selection. Yet, existing reviews have highlighted only a few approaches. In this review, we describe 50 correspondence algorithms to facilitate practical algorithm selection. We elucidate the motivation for correspondence and analyze the limitations of current approaches, which include prohibitive runtimes, numerous user parameters, model limitations and the need for reference samples. We suggest and describe a paradigm shift for overcoming current correspondence limitations by building on known liquid chromatography-mass spectrometry behavior.
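To make the warping idea concrete, here is a toy sketch, not drawn from the review, that fits a low-order polynomial warp from anchor features matched between a run and a reference; all retention times are invented:

```python
import numpy as np

# Toy retention-time alignment: given anchor features matched between a
# run and a reference, fit a smooth warp t_ref ~ w(t_run) and apply it to
# all retention times in the run. Real correspondence algorithms are far
# more involved; this only illustrates the warping idea.

rt_run = np.array([10.2, 25.7, 41.3, 58.9, 74.1])   # anchor RTs in this run (made up)
rt_ref = np.array([11.0, 26.9, 42.0, 59.1, 73.5])   # matched RTs in the reference

coeffs = np.polyfit(rt_run, rt_ref, deg=2)           # quadratic warp function
warp = np.poly1d(coeffs)

all_rts = np.array([15.0, 33.3, 50.0, 66.6])         # every feature in the run
print(warp(all_rts))                                  # warped into reference time
```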


Foundations of Physics Letters | 1999

Initializing the Amplitude Distribution of a Quantum State

Dan Ventura; Tony R. Martinez

To date, quantum computational algorithms have operated on a superposition of all basis states of a quantum system. Typically, this is because it is assumed that some function f is known and implementable as a unitary evolution. However, what if only some points of the function f are known? It then becomes important to be able to encode only the knowledge that we have about f. This paper presents an algorithm that requires a polynomial number of elementary operations for initializing a quantum system to represent only the m known points of a function f.
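The end state the algorithm prepares can be pictured directly. The numpy sketch below builds the target state vector classically, with nonzero amplitude only on the m known points; the paper's actual contribution, a circuit of polynomially many elementary operations that prepares this state, is not reproduced:

```python
import numpy as np

# Target of the initialization algorithm, viewed as a state vector:
# equal amplitude 1/sqrt(m) on the m basis states where f is known,
# zero elsewhere. Built directly here only to show the end result.

n_qubits = 3
N = 2 ** n_qubits
known_points = {0b001, 0b100, 0b110}    # m = 3 known (x, f(x)) encodings (hypothetical)

state = np.zeros(N)
state[list(known_points)] = 1.0
state /= np.linalg.norm(state)           # normalize to a valid quantum state

print(state)
```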


International Conference on Neural Information Processing | 2000

Quantum associative memory with distributed queries

A. A. Ezhov; A. V. Nifanova; Dan Ventura

This paper discusses a model of quantum associative memory which generalizes the completing associative memory proposed by Ventura and Martinez. Like that model, our system is based on Grover's well-known algorithm for searching an unsorted quantum database. However, the model presented in this paper suggests the use of a distributed query of general form. It is demonstrated that spurious memories form an unavoidable part of the quantum associative memory model; however, the very presence of these spurious states makes it possible to organize a controlled process of data retrieval using a specially formed initial state of the quantum database and a specially chosen transformation performed upon it. Concrete examples illustrating the properties of the proposed model are also presented.
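One way to picture a distributed query (an assumed illustration, not the authors' construction) is a query state whose amplitude decays with Hamming distance from the query pattern:

```python
import numpy as np

# A distributed query: instead of placing all amplitude on the query
# pattern, spread amplitude over all basis states, decaying with Hamming
# distance from the query. Illustrative only; parameters are made up.

n_qubits = 4
N = 2 ** n_qubits
query = 0b1010
decay = 0.3    # relative amplitude per bit of Hamming distance (assumed)

hamming = np.array([bin(i ^ query).count("1") for i in range(N)])
amps = decay ** hamming
amps /= np.linalg.norm(amps)   # normalize to a valid quantum state

print(f"amplitude on query: {amps[query]:.3f}")
print(f"next largest amplitudes: {np.sort(amps)[::-1][1:5]}")
```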


Archive | 1998

An Artificial Neuron with Quantum Mechanical Properties

Dan Ventura; Tony R. Martinez

Quantum computation uses microscopic quantum level effects to perform computational tasks and has produced results that in some cases are exponentially faster than their classical counterparts. Choosing the best weights for a neural network is a time-consuming problem, which makes harnessing this ‘quantum parallelism’ appealing. This paper briefly covers the necessary high-level quantum theory and introduces a model for a quantum neuron.


International Joint Conference on Neural Networks | 2006

Preparing More Effective Liquid State Machines Using Hebbian Learning

David Norton; Dan Ventura

In liquid state machines, separation is a critical attribute of the liquid, which is traditionally not trained. This paper investigates the effects of using Hebbian learning in the liquid to improve separation. When presented with random input, Hebbian learning does not dramatically change separation; however, it does improve separation when the liquid is presented with real-world speech data.
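The Hebbian rule inside the liquid can be sketched in a simplified rate-based form; the paper works with spiking neurons, and every parameter below is assumed:

```python
import numpy as np

# Simplified rate-based Hebbian update on the liquid's recurrent weights:
# strengthen a synapse when pre- and post-synaptic units are active
# together. This only conveys the correlational rule; values are made up.

rng = np.random.default_rng(0)
n = 50
W = rng.normal(0.0, 0.1, size=(n, n))             # recurrent liquid weights
eta = 0.01                                         # learning rate (assumed)

for _ in range(100):
    pre = (rng.random(n) < 0.2).astype(float)      # pre-synaptic activity (0/1)
    post = (rng.random(n) < 0.2).astype(float)     # post-synaptic activity (0/1)
    W += eta * np.outer(post, pre)                 # Hebb: dw_ij = eta * post_i * pre_j
    W = np.clip(W, -1.0, 1.0)                      # keep weights bounded

print(f"mean weight after Hebbian updates: {W.mean():.4f}")
```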


Neurocomputing | 2010

Improving liquid state machines through iterative refinement of the reservoir

David Norton; Dan Ventura

Liquid state machines (LSMs) exploit the power of recurrent spiking neural networks (SNNs) without training the SNN. Instead, LSMs randomly generate this network and then use it as a filter for a generic machine learner. Previous research has shown that LSMs can yield competitive results; however, the process can require numerous time-consuming epochs before finding a viable filter. We have developed a method for iteratively refining these randomly generated networks, so that the LSM will yield a more effective filter in fewer epochs than the traditional method. We define a new metric for evaluating the quality of a filter before calculating the accuracy of the LSM. The LSM then uses this metric to drive a novel algorithm founded on principles integral to both Hebbian and reinforcement learning. We compare this new method with traditional LSMs across two artificial pattern recognition problems and two simplified problems derived from the TIMIT dataset. Depending on the problem, our method demonstrates improvements in accuracy ranging from 15% to almost 600%.
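The kind of metric the method drives can be illustrated with a simple between-class distance over liquid state vectors; this is an assumed stand-in, not the paper's metric:

```python
import numpy as np

# A simple stand-in for a separation metric: mean distance between the
# class-centroid liquid states. It only conveys the idea of scoring a
# filter by how far apart it maps the classes, before any readout training.

def separation(states, labels):
    """states: (n_samples, liquid_dim) liquid state vectors;
    labels: class label per sample."""
    centroids = [states[labels == c].mean(axis=0) for c in np.unique(labels)]
    dists = [np.linalg.norm(a - b)
             for i, a in enumerate(centroids) for b in centroids[i + 1:]]
    return np.mean(dists)

rng = np.random.default_rng(1)
states = rng.normal(size=(200, 64))        # made-up liquid responses
labels = rng.integers(0, 2, size=200)      # made-up class labels
print(f"separation of a random reservoir: {separation(states, labels):.3f}")
```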


International Joint Conference on Neural Networks | 2006

Spatiotemporal Pattern Recognition via Liquid State Machines

Eric Goodman; Dan Ventura

The applicability of complex networks of spiking neurons as a general-purpose machine learning technique remains open. Building on previous work using macroscopic exploration of the parameter space of an (artificial) neural microcircuit, we investigate the possibility of using a liquid state machine to solve two real-world problems: stockpile surveillance signal alignment and spoken phoneme recognition.

Collaboration


Dan Ventura's top co-authors:

Derrall Heath, Brigham Young University
David Norton, Brigham Young University
Robert Smith, Brigham Young University
Adam Drake, Brigham Young University
John T. Prince, Brigham Young University
Dah-Jye Lee, Brigham Young University
Nancy Fulda, Brigham Young University