Publication


Featured research published by Bruce E. Rosen.


Mathematical and Computer Modelling | 1992

Genetic Algorithms and Very Fast Simulated Reannealing: A comparison

Lester Ingber; Bruce E. Rosen

We compare Genetic Algorithms (GA) with a functional search method, Very Fast Simulated Reannealing (VFSR), that is not only efficient in its search strategy but also statistically guaranteed to find the function optima. GA has previously been demonstrated to be competitive with other standard Boltzmann-type simulated annealing techniques. Presenting a suite of six standard test functions to GA and VFSR codes from previous studies, without any additional fine tuning, strongly suggests that VFSR can be expected to be orders of magnitude more efficient than GA.


Workshop on Physics and Computation | 1992

Function Optimization Based On Advanced Simulated Annealing

Bruce E. Rosen

Solutions to numerical problems often involve finding (or fitting) a set of parameters to optimize a function. A novel extension of the simulated annealing method, Very Fast Simulated Reannealing (VFSR) [6, 13, 8], has been proposed for optimizing difficult functions. VFSR has an exponentially decreasing temperature reduction schedule which is faster than both Boltzmann annealing and fast (Cauchy) annealing. VFSR is shown to be superior to these two methods at optimizing a difficult multimodal function.
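
For reference, the three cooling schedules being compared differ only in how the temperature falls with the annealing step k. A minimal sketch following the schedules as published by Ingber; the starting temperature T0 and the VFSR control constant c are problem-dependent tuning parameters:

```python
import math

def boltzmann_temperature(t0, k):
    """Boltzmann annealing: T(k) = T0 / ln(k + 1), valid for k >= 1.
    Guarantees convergence but cools only logarithmically."""
    return t0 / math.log(k + 1.0)

def cauchy_temperature(t0, k):
    """Fast (Cauchy) annealing: T(k) = T0 / k, valid for k >= 1."""
    return t0 / k

def vfsr_temperature(t0, k, c, dim):
    """VFSR: T(k) = T0 * exp(-c * k**(1/D)) for a D-dimensional
    parameter space, an exponentially faster reduction than the
    Boltzmann or Cauchy schedules."""
    return t0 * math.exp(-c * k ** (1.0 / dim))
```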


International Symposium on Neural Networks | 1994

A simulated annealing approach to job shop scheduling using critical block transition operators

Takeshi Yamada; Bruce E. Rosen; Ryohei Nakano

The job shop scheduling problem is one of the most difficult NP-hard combinatorial optimization problems. This research investigates finding optimal and near-optimal schedules using simulated annealing and a schedule permutation procedure. New schedules are generated by permuting operations within existing schedules. Simulated annealing probabilistically chooses one of the new schedules and probabilistically accepts or rejects it, allowing importance-sampling search over the job shop schedule space. The initial and (minimum) final temperatures are adaptively determined a priori, and a reintensification strategy improves the search by resetting the current temperature and state. Experimental results show this simple and flexible method can find near-optimal schedules and often outperforms previous SA approaches.
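
The accept/reject step described here is the standard Metropolis criterion. A minimal sketch, assuming hypothetical helpers `permute` (standing in for the critical block transition operator) and `makespan` (the schedule cost); the geometric cooling is a placeholder for the paper's adaptively determined temperature schedule:

```python
import math
import random

def anneal_schedule(schedule, permute, makespan, t_init, t_min, alpha=0.95):
    """Simulated annealing over job shop schedules.

    permute(schedule) -> a neighboring schedule; makespan(schedule) ->
    cost to minimize. Both are assumed stand-ins for the operators in
    the paper, not its exact procedures.
    """
    current, best = schedule, schedule
    temp = t_init
    while temp > t_min:
        candidate = permute(current)
        delta = makespan(candidate) - makespan(current)
        # Metropolis criterion: always accept improvements; accept a
        # worse schedule with probability exp(-delta / temp).
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            current = candidate
            if makespan(current) < makespan(best):
                best = current
        temp *= alpha  # geometric cooling as a simple placeholder
    return best
```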


International Symposium on Neural Networks | 1997

Improving the accuracy of financial time series prediction using ensemble networks and high order statistics

Roy Schwaerzel; Bruce E. Rosen

We apply neural network ensembles to the task of forecasting financial time series and explore the use of high order statistical information as part of the network inputs. We show that the prediction accuracy of the time series can be significantly improved using this methodology. Since prediction accuracy is only a proxy for profitability in the financial market, we also report good and profitable results using a profit/loss metric based on market simulations. Our simulations show an improvement of between 1.3% and 12.4% over a simple buy-and-hold trading strategy, and an improvement of between 6.5% and 20.9% over a trading strategy using linear autoregressive models.
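
A minimal sketch of the two ingredients named in the abstract, under assumed interfaces: the `predict` method on each trained model is hypothetical, and simple forecast averaging plus skewness and kurtosis as the high order statistics are assumptions for illustration, not the paper's exact design:

```python
def higher_order_features(window):
    """Augment a window of raw observations with high order statistics
    (sample skewness and kurtosis) for use as extra network inputs."""
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    m3 = sum((x - mean) ** 3 for x in window) / n
    m4 = sum((x - mean) ** 4 for x in window) / n
    skew = m3 / var ** 1.5 if var else 0.0
    kurt = m4 / var ** 2 if var else 0.0
    return list(window) + [skew, kurt]

def ensemble_forecast(models, window):
    """Combine one-step-ahead forecasts from several trained networks
    by simple averaging (one common ensemble rule)."""
    features = higher_order_features(window)
    predictions = [m.predict(features) for m in models]
    return sum(predictions) / len(predictions)
```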


Biological Cybernetics | 1992

Process control with adaptive range coding

Bruce E. Rosen; James M. Goodwin; Jacques J. Vidal

Dynamical control with adaptive range coding eliminates fundamental shortcomings found in earlier applications of range (coarse) coding, which used fixed partitioning. Adaptive range coding has the advantages of efficient implementation, execution and generalization. With the adaptive algorithm, region shapes are continually adjusted during operation and will self-organize to reflect the global dynamics of the system and the environment. The system progressively develops a control map relating environmental states, control actions, and future reinforcements.
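
As an illustration only, here is one way a one-dimensional adaptive partition could adjust region boundaries toward visited states; the update rule below is an assumption for the sketch, not the paper's algorithm:

```python
def region_index(boundaries, state):
    """Range coding: map a continuous state to the index of the
    region it falls in, given sorted boundary positions."""
    for i, b in enumerate(boundaries):
        if state < b:
            return i
    return len(boundaries)

def adapt_partition(boundaries, state, rate=0.05):
    """Pull the boundary nearest to an observed state toward it, so
    frequently visited parts of the state space end up with finer
    regions (illustrative update, not the published rule)."""
    i = min(range(len(boundaries)), key=lambda j: abs(boundaries[j] - state))
    boundaries[i] += rate * (state - boundaries[i])
    boundaries.sort()
    return boundaries
```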


International Conference of the IEEE Engineering in Medicine and Biology Society | 1988

Machine operant conditioning

Bruce E. Rosen; James M. Goodwin; Jacques J. Vidal

This research investigates learning of machine reflexes by applying punishment and reward reinforcement to teach artificial neuronlike systems a prescribed behavior. Stochastic neuronlike elements based on the classical weighted-sum-of-inputs and threshold model can learn stimulus-response associations through emulated classical Pavlovian conditioning, i.e., by making associations between conditioned and unconditioned stimuli and subsequent responses. Several mathematical models have been developed which apply abstractions of classical conditioning to such threshold logic devices. Temporal sequences of stimulus-response associations can be dynamically learned by using operant conditioning when only aggregate external reinforcement is available.
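
A minimal sketch of a stochastic neuronlike element of the kind described; the sigmoid firing probability and the reward/punishment weight update are illustrative assumptions, not the paper's exact model:

```python
import math
import random

def fire_probability(weights, inputs, bias):
    """Stochastic threshold element: the weighted sum of inputs sets
    the probability of firing through a sigmoid."""
    activation = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-activation))

def emit_response(weights, inputs, bias):
    """Fire stochastically according to the element's probability."""
    return random.random() < fire_probability(weights, inputs, bias)

def reinforce(weights, inputs, fired, reward, rate=0.1):
    """Reward (positive) strengthens the weights that supported the
    emitted response; punishment (negative reward) weakens them."""
    sign = 1.0 if fired else -1.0
    return [w + rate * reward * sign * x for w, x in zip(weights, inputs)]
```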


Systems, Man and Cybernetics | 1994

Parallel very fast simulated reannealing by temperature block partitioning

Sandra G. Dykes; Bruce E. Rosen

This work describes a parallel implementation of very fast simulated reannealing (VFSR), an advanced simulated annealing method for optimization of nonlinear, multi-dimensional functions with large numbers of local minima. Parallel VFSR speedups on a CM-2 Connection Machine are reported for eight functions: De Jong's test suite, a 10-D parabolic function, and two multi-modal, highly nonlinear functions. Within the test set, the function characteristic most affecting parallel VFSR performance is the number of optimized function parameters. Low-dimension functions profited least from parallelization, exhibiting speedups from 2 to 78 (where speedups are based on the number of function evaluation cycles). Speedups for the three 10-D cost functions increased to 410, 823 and 1124. On a stochastic high-dimensional (D=30) quartic cost function, the cycle ratio was over 19000. We present results of a systematic study of the dimensionality effect on three test functions.


International Journal of Pattern Recognition and Artificial Intelligence | 1992

Image recognition and reconstruction using associative magnetic processing

James M. Goodwin; Bruce E. Rosen; Jacques J. Vidal

This paper presents a technique for image recognition, reconstruction, and processing using a novel massively parallel system. This device is a physical implementation of a Boltzmann machine type of neural network based on the use of magnetic thin films and opto-magnetic control. Images or patterns in the form of pixel arrays are imposed on the magnetic film using a laser in an external magnetic field. These images are learned and can be recalled later when a similar image is presented. A stored image is recallable even when a partial, noisy, or corrupted version of that image is imposed on the film. The system can also be used for feature detection and image compression. The operation and construction of the physical system are described, together with a discussion of the physical basis for its operation. The authors have developed Monte Carlo style computer simulations of the system for a variety of platforms, including serial workstations and hypercube-configured parallel systems. They describe here some of the factors involved in computer simulations of the system, which can be fast and relatively simple in implementation. Simulation results are presented and, in particular, the behavior of the model under simulated annealing in the light of statistical physics is discussed. The simulation itself can be used as a neural network model capable of the functions ascribed to the physical device.
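
The Monte Carlo simulations mentioned are in the Metropolis style used for spin systems. A minimal single-spin-flip sketch over a two-dimensional array of spins, assuming a simple nearest-neighbor ferromagnetic coupling as an illustrative energy model (not the film's actual physics):

```python
import math
import random

def metropolis_sweep(spins, temperature, coupling=1.0):
    """One Metropolis sweep over a 2-D array of +/-1 spins with
    nearest-neighbor coupling and periodic boundaries."""
    rows, cols = len(spins), len(spins[0])
    for _ in range(rows * cols):
        i, j = random.randrange(rows), random.randrange(cols)
        neighbors = (spins[(i - 1) % rows][j] + spins[(i + 1) % rows][j] +
                     spins[i][(j - 1) % cols] + spins[i][(j + 1) % cols])
        # Energy change from flipping spin (i, j) under E = -J * s_i * s_j.
        delta = 2.0 * coupling * spins[i][j] * neighbors
        if delta <= 0 or random.random() < math.exp(-delta / temperature):
            spins[i][j] = -spins[i][j]
    return spins
```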


International Symposium on Neural Networks | 1997

A perceptron-like online algorithm for tracking the median

Tom Bylander; Bruce E. Rosen

We present an online algorithm for tracking the median of a series of values. The algorithm updates its current estimate of the median by incrementing or decrementing a fixed value, which is analogous to perceptron updating. The median value of a sequence minimizes the absolute loss, i.e., the sum of absolute deviations. The analysis shows that the worst-case absolute loss of our algorithm is comparable to the absolute loss of any sequence of target medians, given restrictions on how much the target can change per trial.
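
The update rule is exactly as stated: nudge the estimate a fixed step toward each incoming value. A minimal sketch, with the step size, which trades tracking speed against precision, left to the caller:

```python
def track_median(values, step, init=0.0):
    """Online median tracking: increment or decrement the current
    estimate by a fixed step, analogous to perceptron updating."""
    estimate = init
    for v in values:
        if v > estimate:
            estimate += step
        elif v < estimate:
            estimate -= step
        yield estimate
```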


ACM Symposium on Applied Computing | 1994

Training hard to learn networks using advanced simulated annealing methods

Bruce E. Rosen; James M. Goodwin

We describe a method for avoiding local minima by combining Very Fast Simulated Reannealing (VFSR) with BEP. While convergence to the best training weights can be slower than with gradient descent methods, it is faster than other SA network training methods. More importantly, convergence to the optimal weight set is statistically guaranteed. We demonstrate VFSR network training on a set of difficult but linearly separable logic functions and a set of nonlinearly separable parity problems, and compare the performance of VFSR network training with conjugate-gradient-trained backpropagation networks. Introduction: Backpropagation networks [1] are nonlinear methods for mapping a set of multidimensional input patterns x = x_1, ..., x_N to a corresponding set of M-dimensional output patterns y = y_1, ..., y_M. While they admit a variety of network architectures, they are typified by sets of input, hidden, and output units taking on different values. The units are arranged in feedforward layers, and units in adjacent layers are connected, with a weight w associated with each connection. A typical architecture is the three-layer network, in which the input units are connected to the hidden units and the hidden units to the output units; there are often also direct connections between the input and output units. A bias unit may be included to reflect constant input values. A network operates by first assigning an input pattern x to the input units; the hidden unit values h and the output unit values o are then calculated as h_k(x) = f(w_k0 + sum_j w_kj * x_j).
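
The forward computation at the end of that passage, written out for a three-layer network; the sigmoid choice for f and the dict container for the weights are assumptions for illustration:

```python
import math

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

def layer(weights, biases, inputs, f=sigmoid):
    """One feedforward layer: h_k = f(w_k0 + sum_j w_kj * x_j),
    where each row of weights holds one unit's incoming weights."""
    return [f(b + sum(w * x for w, x in zip(row, inputs)))
            for row, b in zip(weights, biases)]

def forward(net, inputs):
    """Three-layer forward pass: input -> hidden -> output. net is a
    hypothetical dict holding the weight matrices and bias vectors."""
    hidden = layer(net["w_hidden"], net["b_hidden"], inputs)
    return layer(net["w_output"], net["b_output"], hidden)
```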

Collaboration


Dive into Bruce E. Rosen's collaboration.

Top Co-Authors

Tom Bylander
University of Texas at San Antonio

Takeshi Yamada
Nippon Telegraph and Telephone

Barry Schifrin
University of Texas at San Antonio

J.M. Goodwin
University of Texas at San Antonio

Lester Ingber
California Institute of Technology