Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Nikzad Toomarian is active.

Publication


Featured research published by Nikzad Toomarian.


Neural Networks | 1992

Learning a trajectory using adjoint functions and teacher forcing

Nikzad Toomarian; Jacob Barhen

A new methodology for faster supervised temporal learning in nonlinear neural networks is presented. It builds upon the concept of adjoint operators to enable a fast computation of the gradients of an error functional with respect to all parameters of the neural architecture, and exploits the concept of teacher forcing to incorporate information regarding the desired output into the activation dynamics. The importance of the initial or final time conditions for the adjoint equations (i.e., the error propagation equations) is discussed. A new algorithm is presented, in which the adjoint equations are solved simultaneously (i.e., forward in time) with the activation dynamics of the neural network. We also indicate how teacher forcing can be modulated in time as learning proceeds. The algorithm is illustrated by examples. The results show that the learning time is reduced by one to two orders of magnitude with respect to previously published results, while trajectory tracking is significantly improved. The proposed methodology makes hardware implementation of temporal learning attractive for real-time applications.


Computers & Geosciences | 2000

Reservoir parameter estimation using a hybrid neural network

Fred Aminzadeh; Jacob Barhen; Charles W. Glover; Nikzad Toomarian

The accuracy of an artificial neural network (ANN) algorithm is a crucial issue in the estimation of an oil field’s reservoir properties from the log and seismic data. This paper demonstrates the use of the k-fold cross validation technique to obtain confidence bounds on an ANN’s accuracy statistic from a finite sample set. In addition, we also show that an ANN’s classification accuracy is dramatically improved by transforming the ANN’s input feature space to a dimensionally smaller, new input space. The new input space represents a feature space that maximizes the linear separation between classes. Thus, the ANN’s convergence time and accuracy are improved because the ANN must merely find nonlinear perturbations to the starting linear decision boundaries. These techniques for estimating ANN accuracy bounds and feature space transformations are demonstrated on the problem of estimating the sand thickness in an oil field reservoir based only on remotely sensed seismic data. © 2000 Elsevier Science Ltd. All rights reserved.
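The k-fold idea can be sketched in a few lines: the finite sample is split into k folds, the classifier is trained on k−1 folds and scored on the held-out fold, and the fold-to-fold spread of the accuracies yields an approximate confidence bound. The sketch below uses synthetic data and a simple nearest-centroid classifier in place of the paper's ANN, purely for illustration:

```python
import numpy as np

# Sketch of k-fold cross validation confidence bounds on classification
# accuracy. Synthetic data and a nearest-centroid classifier stand in for
# the paper's seismic features and ANN; all specifics are illustrative.

rng = np.random.default_rng(1)
n = 200
X = np.vstack([rng.normal(-1, 1, (n // 2, 3)), rng.normal(1, 1, (n // 2, 3))])
y = np.repeat([0, 1], n // 2)
perm = rng.permutation(n)
X, y = X[perm], y[perm]

def nearest_centroid_accuracy(Xtr, ytr, Xte, yte):
    """Train a nearest-centroid classifier and return held-out accuracy."""
    c0 = Xtr[ytr == 0].mean(axis=0)
    c1 = Xtr[ytr == 1].mean(axis=0)
    pred = (np.linalg.norm(Xte - c1, axis=1)
            < np.linalg.norm(Xte - c0, axis=1)).astype(int)
    return (pred == yte).mean()

k = 5
folds = np.array_split(np.arange(n), k)
accs = []
for i in range(k):
    te = folds[i]
    tr = np.concatenate([folds[j] for j in range(k) if j != i])
    accs.append(nearest_centroid_accuracy(X[tr], y[tr], X[te], y[te]))
accs = np.array(accs)

# The fold-to-fold spread gives an approximate confidence bound on accuracy.
mean = accs.mean()
half_width = 1.96 * accs.std(ddof=1) / np.sqrt(k)   # ~95% normal-theory interval
print(f"accuracy = {mean:.2f} +/- {half_width:.2f}")
```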


international conference on robotics and automation | 1993

A neural network based identification of environment models for compliant control of space robots

Subramanian Venkataraman; Sandeep Gulati; Jacob Barhen; Nikzad Toomarian

Many space robotic systems would be required to operate in uncertain or even unknown environments. The problem of identifying such environments for compliance control is considered. In particular, neural networks are used for identifying environments that a robot establishes contact with. Both function approximation and parameter identification (with fixed nonlinear structure and unknown parameters) results are presented. The environment model structure considered is relevant to two space applications: cooperative execution of tasks by robots and astronauts, and sample acquisition during planetary exploration. Compliant motion experiments have been performed with a robotic arm, placed in contact with a single-degree-of-freedom electromechanical environment. In the experiments, desired contact forces are computed using a neural network, given a desired motion trajectory. Results of the control experiments performed on robot hardware are described and discussed.


international symposium on neural networks | 1991

Fast temporal neural learning using teacher forcing

Nikzad Toomarian; Jacob Barhen

A methodology for faster supervised temporal learning in nonlinear neural networks is presented. The authors introduce the concept of terminal teacher forcing and appropriately modify the activation dynamics of the neural network. They also indicate how teacher forcing can be decreased as the learning proceeds. In order to make the algorithm more tangible, the authors compare its different phases to an important aspect of learning inspired by a real-life analogy. The results show that the learning time is reduced by one to two orders of magnitude with respect to conventional methods. The authors limited themselves to an example of representative complexity. It is demonstrated that a circular trajectory can be learned in about 400 iterations.


Journal of Petroleum Science and Engineering | 1999

Estimation of reservoir parameters using a hybrid neural network

Fred Aminzadeh; Jacob Barhen; Charles W. Glover; Nikzad Toomarian

Estimation of an oil field's reservoir properties using seismic data is a crucial issue. The accuracy of those estimates and the associated uncertainty are also important information. This paper demonstrates the use of the k-fold cross validation technique to obtain confidence bounds on an artificial neural network's (ANN) accuracy statistic from a finite sample set. In addition, we also show that an ANN's classification accuracy is dramatically improved by transforming the ANN's input feature space to a dimensionally smaller, new input space. The new input space represents a feature space that maximizes the linear separation between classes. Thus, the ANN's convergence time and accuracy are improved because the ANN must merely find nonlinear perturbations to the starting linear decision boundaries. These techniques for estimating ANN accuracy bounds and feature space transformations are demonstrated on the problem of estimating the sand thickness in an oil field reservoir based only on remotely sensed seismic data.


Applied Mathematics Letters | 1990

Application of adjoint operators to neural learning

Jacob Barhen; Nikzad Toomarian; Sandeep Gulati

A new methodology for neural learning of nonlinear mappings is presented. It exploits the concept of adjoint operators to enable a fast global computation of the network's response to perturbations in all system parameters.


international symposium on neural networks | 1994

Learning without local minima

Jacob Barhen; Nikzad Toomarian; Amir Fijany

A computationally efficient methodology for overcoming local minima in nonlinear neural network learning is presented. This methodology is based on the newly discovered TRUST global optimization paradigm. Enhancements to the backpropagation schema in feedforward multilayer architectures, and to adjoint-operator learning in recurrent networks are discussed. Extensions to TRUST now formally guarantee reaching a global minimum in the multidimensional case. Results for a standard benchmark are included, to illustrate the theoretical developments.


Concurrency and Computation: Practice and Experience | 1994

Time Parallel Solution of Linear Partial Differential Equations on the Intel Touchstone Delta Supercomputer

Nikzad Toomarian; Amir Fijany; Jacob Barhen

The paper presents the implementation of a new class of massively parallel algorithms for solving certain time-dependent partial differential equations (PDEs) on massively parallel supercomputers. Such PDEs are usually solved numerically, by discretization in time and space, and by applying a time-stepping procedure to data and algorithms potentially parallelized in the spatial domain. In a radical departure from such a strictly sequential temporal paradigm, we have developed a concept of time-parallel algorithms, which allows the marching in time to be fully parallelized. This is achieved by using a set of transformations based on eigenvalue-eigenvector decomposition of the matrices involved in the discrete formalism. Our time-parallel algorithms possess a highly decoupled structure, and can therefore be efficiently implemented on emerging, massively parallel, high-performance supercomputers, with a minimum of communication and synchronization overhead. We have successfully carried out a proof-of-concept demonstration of the basic ideas using a two-dimensional heat equation example implemented on the Intel Touchstone Delta supercomputer. Our results indicate that linear, and even superlinear, speed-up can be achieved and maintained for a very large number of processor nodes.
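The eigenvalue-eigenvector transformation at the heart of this approach can be sketched on a tiny 1-D heat equation: once the time-stepping matrix A is diagonalized, the state at any time level n is V diag(λⁿ) V⁻¹ u₀, so all time levels become independent computations over the eigenmodes. This is an illustrative toy (explicit Euler, Dirichlet boundaries, hypothetical parameters), not the paper's exact formulation:

```python
import numpy as np

# Sketch of the time-parallel idea on a 1-D heat equation: diagonalize the
# time-stepping matrix so every time level can be computed independently.
# Illustrative only; parameters and discretization are hypothetical.

m, dt, dx, alpha = 32, 1e-4, 1.0 / 33, 1.0
r = alpha * dt / dx**2                 # < 0.5, so explicit Euler is stable

# Symmetric tridiagonal stepping matrix A: u^{n+1} = A u^n.
A = (1 - 2 * r) * np.eye(m) + r * (np.eye(m, k=1) + np.eye(m, k=-1))

u0 = np.sin(np.pi * np.arange(1, m + 1) * dx)   # initial temperature profile

# Conventional, strictly sequential time stepping.
u_seq = u0.copy()
for _ in range(50):
    u_seq = A @ u_seq

# Time-parallel form: A = V diag(lam) V^T (A is symmetric), hence
# u^n = V diag(lam^n) V^T u0 -- each mode, and each time level, is independent.
lam, V = np.linalg.eigh(A)
c = V.T @ u0                           # modal coefficients of the initial state
u_par = V @ (lam**50 * c)              # jump straight to time level 50

print(np.allclose(u_seq, u_par))
```

Because each eigenmode evolves by a scalar power, the work for different time levels (or different modes) can be distributed across processors with essentially no communication, which is the decoupling the paper exploits on the Touchstone Delta.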


international symposium on neural networks | 1993

Learning trajectories with a hierarchy of oscillatory modules

Pierre Baldi; Nikzad Toomarian

The most successful approach to learning has been the backpropagation method. Although very powerful on relatively simple problems, theoretical analysis and simulations show that this approach breaks down as soon as sufficiently complex problems are considered. To overcome this fundamental limitation, a hierarchical and modular approach is suggested, directly inspired from biological networks, whereby a certain degree of structure is introduced in the learning system. This approach is applied to a simple example of trajectory learning of a semi-figure eight.


international parallel processing symposium | 1994

Massively parallel algorithms for solution of the Schrödinger equation

Amir Fijany; Jacob Barhen; Nikzad Toomarian

Time-parallel algorithms for solution of the Schrödinger equation are developed. By using the Crank-Nicolson method, it is shown that the solution of the problem can be fully parallelized in time, leading to a massive temporal parallelism in the computation with a minimum of communication and synchronization requirements. Our results clearly indicate that the Crank-Nicolson method, in addition to its excellent numerical properties, is also highly suitable for massively parallel computation.
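The combination of Crank-Nicolson with the time-parallel transformation can be sketched as follows: the implicit update (I + i Δt H/2) ψⁿ⁺¹ = (I − i Δt H/2) ψⁿ is diagonalized in the eigenbasis of H, so every mode evolves independently by a unimodular scalar factor. The discretization and parameters below are illustrative (free particle, units ħ = m = 1), not the paper's setup:

```python
import numpy as np

# Sketch of Crank-Nicolson for the free 1-D Schrödinger equation
# i dpsi/dt = H psi, and its time-parallel reformulation via the
# eigendecomposition of H. Illustrative parameters; units hbar = m = 1.

m, dx, dt = 64, 0.1, 0.01
# H = -(1/2) d^2/dx^2, central differences.
lap = (np.eye(m, k=1) - 2 * np.eye(m) + np.eye(m, k=-1)) / dx**2
H = -0.5 * lap

psi0 = np.exp(-((np.arange(m) - m / 2) * dx) ** 2).astype(complex)
psi0 /= np.linalg.norm(psi0)           # normalized Gaussian wave packet

# Sequential Crank-Nicolson: (I + i dt/2 H) psi^{n+1} = (I - i dt/2 H) psi^n.
I = np.eye(m)
A, B = I + 0.5j * dt * H, I - 0.5j * dt * H
psi_seq = psi0.copy()
for _ in range(20):
    psi_seq = np.linalg.solve(A, B @ psi_seq)

# Time-parallel form: H = Q diag(w) Q^T (H is symmetric), so mode k evolves
# independently by the factor ((1 - i dt w_k/2)/(1 + i dt w_k/2))^n.
w, Q = np.linalg.eigh(H)
g = (1 - 0.5j * dt * w) / (1 + 0.5j * dt * w)   # |g| = 1: unitary per mode
psi_par = Q @ (g**20 * (Q.T @ psi0))            # jump straight to step 20

print(np.allclose(psi_seq, psi_par))
print(np.isclose(np.linalg.norm(psi_par), 1.0))  # norm is preserved
```

The per-mode factors have unit modulus, which reflects the "excellent numerical properties" of Crank-Nicolson (unconditional stability and norm conservation) while making every time level an independent computation.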

Collaboration


Dive into Nikzad Toomarian's collaborations.

Top Co-Authors

Jacob Barhen (Oak Ridge National Laboratory)
Amir Fijany (California Institute of Technology)
Sandeep Gulati (Jet Propulsion Laboratory)
Charles W. Glover (Oak Ridge National Laboratory)
Michail Zak (California Institute of Technology)
Benjamin Blalock (California Institute of Technology)
Farrokh Vatan (California Institute of Technology)
Fred Aminzadeh (University of Southern California)
Mohammad Mojarradi (California Institute of Technology)
Sorin Cristoloveanu (California Institute of Technology)