Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Seho Oh is active.

Publication


Featured research published by Seho Oh.


IEEE Transactions on Neural Networks | 1991

Query-based learning applied to partially trained multilayer perceptrons

Jenq-Neng Hwang; J.J. Choi; Seho Oh; Robert J. Marks

An approach is presented for query-based neural network learning. A layered perceptron partially trained for binary classification is considered. The single-output neuron is trained to be either a zero or a one. A test decision is made by thresholding the output at, for example, one-half. The set of inputs that produce an output of one-half forms the classification boundary. The authors adopted an inversion algorithm for the neural network that allows generation of this boundary. For each boundary point, the classification gradient can be generated. The gradient provides a useful measure of the steepness of the multidimensional decision surfaces. Conjugate input pairs are generated using the boundary point and gradient information and presented to an oracle for proper classification. These data are used to refine further the classification boundary, thereby increasing the classification accuracy. The result can be a significant reduction in the training set cardinality in comparison with, for example, randomly generated data points. An application example to power system security assessment is given.
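The boundary-finding loop is easy to sketch. Below is a minimal illustration in Python (not the authors' implementation): a stand-in single-output perceptron is inverted by gradient descent to a point on its net(x) = 0.5 level set, the classification gradient is estimated by finite differences, and a conjugate input pair straddling the boundary is submitted to a hypothetical `oracle` for labeling.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)   # stand-in "partially trained" weights
W2, b2 = rng.normal(size=8), 0.0

def net(x):
    """Single-output layered perceptron with sigmoid units."""
    h = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))
    return 1.0 / (1.0 + np.exp(-(W2 @ h + b2)))

def grad(x, eps=1e-5):
    """Finite-difference estimate of the classification gradient at x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x); d[i] = eps
        g[i] = (net(x + d) - net(x - d)) / (2 * eps)
    return g

def invert_to_boundary(x, target=0.5, steps=200, lr=0.5):
    """Inversion: descend (net(x) - target)^2 until x sits on the decision boundary."""
    for _ in range(steps):
        x = x - lr * 2.0 * (net(x) - target) * grad(x)
    return x

oracle = lambda x: int(x.sum() > 0)           # hypothetical stand-in for the expert oracle

xb = invert_to_boundary(rng.normal(size=2))   # a point near the net(x) = 0.5 boundary
g = grad(xb)
g /= np.linalg.norm(g)                        # steepest direction across the boundary
delta = 0.1
pair = (xb + delta * g, xb - delta * g)       # conjugate input pair straddling the boundary
queries = [(p, oracle(p)) for p in pair]      # labeled by the oracle, then added to training
print(queries)
```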


IEEE Transactions on Geoscience and Remote Sensing | 1992

Inversion of snow parameters from passive microwave remote sensing measurements by a neural network trained with a multiple scattering model

Leung Tsang; Zhengxiao Chen; Seho Oh; Robert J. Marks; Alfred T. C. Chang

The inversion of snow parameters from passive microwave remote sensing measurements is performed with a neural network trained with a dense-media multiple-scattering model. The input-output pairs generated by the scattering model are used to train the neural network. Simultaneous inversion of three parameters, mean-grain size of ice particles in snow, snow density, and snow temperature from five brightness temperatures, is reported. It is shown that the neural network gives good results for simulated data. The absolute percentage errors for mean-grain size of ice particles and snow density are less than 10%, and the absolute error for snow temperature is less than 3 K. The neural network with the trained weighting coefficients of the three-parameter model is also used to invert SSM/I data taken over the Antarctic region.
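The train-on-the-forward-model pattern can be sketched compactly. The snippet below substitutes an invented smooth map for the dense-media multiple-scattering model and uses scikit-learn's MLPRegressor; the parameter ranges, network size, and forward map are assumptions for illustration only.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

lo = np.array([0.1, 0.1, 250.0])   # assumed ranges: grain size [mm], density, temperature [K]
hi = np.array([1.0, 0.4, 273.0])
A = rng.normal(size=(5, 3))

def forward_model(p):
    """Invented stand-in for the multiple-scattering model:
    snow parameters -> five brightness temperatures."""
    return 200.0 + 40.0 * np.tanh(A @ ((p - lo) / (hi - lo)))

# Input-output pairs from the forward model, used with roles swapped:
# brightness temperatures in, snow parameters out.
params = lo + (hi - lo) * rng.random((2000, 3))
tb = np.array([forward_model(p) for p in params])

net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000, random_state=0)
net.fit(tb, (params - lo) / (hi - lo))           # train on normalized targets

p_true = lo + (hi - lo) * rng.random(3)          # held-out "measurement"
p_est = lo + (hi - lo) * net.predict(forward_model(p_true).reshape(1, -1))[0]
print(p_true, p_est)
```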


Medical Physics | 1998

Optimization of intensity modulated beams with volume constraints using two methods: cost function minimization and projections onto convex sets.

Paul S. Cho; Shinhak Lee; Robert J. Marks; Seho Oh; Steve G. Sutlief; Mark H. Phillips

For accurate prediction of normal tissue tolerance, it is important that the volumetric information of dose distribution be considered. However, in dosimetric optimization of intensity modulated beams, the dose-volume factor is usually neglected. In this paper we describe two methods of volume-dependent optimization for intensity modulated beams such as those generated by computer-controlled multileaf collimators. The first method uses a volume sensitive penalty function in which fast simulated annealing is used for cost function minimization (CFM). The second technique is based on the theory of projections onto convex sets (POCS) in which the dose-volume constraint is replaced by a limit on integral dose. The ability of the methods to respect the dose-volume relationship was demonstrated by using a prostate example involving partial volume constraints to the bladder and the rectum. The volume sensitive penalty function used in the CFM method can be easily adopted by existing optimization programs. The convex projection method can find solutions in much shorter time with minimal user interaction.
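A minimal sketch of the first (CFM) method, with an invented toy geometry: a volume-sensitive penalty counts the fraction of organ voxels above a dose threshold, and simulated annealing searches over nonnegative beamlet weights. The dose matrix, prescription numbers, and cooling schedule are all assumptions, not the paper's clinical setup.

```python
import numpy as np

rng = np.random.default_rng(2)

n_beamlets, n_voxels = 20, 100
D = rng.random((n_voxels, n_beamlets))        # toy dose-deposition matrix
target, organ = slice(0, 50), slice(50, 100)
d_rx = 65.0                                   # target prescription dose
v_max, d_thresh = 0.3, 30.0                   # dose-volume limit: <=30% of organ above 30 Gy

def cost(w):
    d = D @ w
    target_term = np.mean((d[target] - d_rx) ** 2)
    frac_hot = np.mean(d[organ] > d_thresh)   # fraction of organ voxels over threshold
    volume_term = max(0.0, frac_hot - v_max) ** 2
    return target_term + 1e4 * volume_term    # volume-sensitive penalty

w = np.full(n_beamlets, 1.0)
c = cost(w)
T = 1.0
for step in range(5000):                       # simulated annealing over beam weights
    w_new = np.maximum(w + rng.normal(scale=0.05, size=n_beamlets), 0.0)
    c_new = cost(w_new)
    if c_new < c or rng.random() < np.exp((c - c_new) / T):
        w, c = w_new, c_new
    T = 1.0 / (1 + step)                       # fast-annealing-style cooling schedule
print(c, np.mean((D @ w)[organ] > d_thresh))
```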


IEEE Transactions on Neural Networks | 1995

Similarities of error regularization, sigmoid gain scaling, target smoothing, and training with jitter

Russell Reed; Robert J. Marks; Seho Oh

The generalization performance of feedforward layered perceptrons can, in many cases, be improved either by smoothing the target via convolution, regularizing the training error with a smoothing constraint, decreasing the gain (i.e., slope) of the sigmoid nonlinearities, or adding noise (i.e., jitter) to the input training data. In certain important cases, these procedures yield highly similar results, although at different costs. Training with jitter, for example, requires significantly more computation than sigmoid scaling.
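The connection is easiest to make concrete in the linear special case, where training on input-jittered data is known to approximate ridge regression with penalty n·σ². The check below is an illustrative sketch of that special case, not the paper's perceptron analysis.

```python
import numpy as np

rng = np.random.default_rng(3)

n, sigma, copies = 50, 0.3, 2000
X = rng.normal(size=(n, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=n)

# Ordinary least squares on many jittered copies of the training set.
Xj = np.tile(X, (copies, 1)) + sigma * rng.normal(size=(copies * n, 3))
yj = np.tile(y, copies)
w_jitter, *_ = np.linalg.lstsq(Xj, yj, rcond=None)

# Ridge regression with the penalty that input jitter is expected to induce.
lam = n * sigma ** 2
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)

print(w_jitter)   # the two solutions agree to within sampling error
print(w_ridge)
```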


IEEE Transactions on Signal Processing | 1994

Kernel synthesis for generalized time-frequency distributions using the method of alternating projections onto convex sets

Seho Oh; Robert J. Marks; Les E. Atlas

Cohen's generalized time-frequency distribution (GTFR) requires the choice of a two-dimensional kernel. The kernel directly affects many performance attributes of the GTFR such as time resolution, frequency resolution, realness, and conformity to time and frequency marginals. A number of different kernels may suffice for a given performance constraint (high frequency resolution, for example). Interestingly, most sets of kernels satisfying commonly used performance constraints are convex. We describe a method whereby kernels can be designed that satisfy two or more of these constraints. If there exists a nonempty intersection among the constraint sets, then the theory of alternating projection onto convex sets (POCS) guarantees convergence to a kernel that satisfies all of the constraints. If the constraints can be partitioned into two sets, each with a nonempty intersection, then POCS guarantees convergence to a kernel that satisfies the inconsistent constraints with minimum mean-square error. We apply kernels synthesized using POCS to the generation of some example GTFRs, and compare their performance to the spectrogram, Wigner distribution, and cone-kernel GTFR.
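A toy version of the alternation, assuming two simplified constraint sets: an affine set fixing the τ = 0 slice of the kernel (which preserves the time marginal) and a support constraint near the axes of the ambiguity plane (a crude stand-in for cross-term suppression). The grid size and support region are arbitrary.

```python
import numpy as np

N = 64
phi = np.random.default_rng(4).random((N, N))        # arbitrary starting kernel
theta = np.fft.fftfreq(N)[:, None]
tau = np.fft.fftfreq(N)[None, :]
mask = (np.abs(theta) < 0.1) | (np.abs(tau) < 0.1)   # assumed support region

def proj_marginal(p):
    p = p.copy()
    p[:, 0] = 1.0                    # nearest kernel with the required tau = 0 slice
    return p

def proj_support(p):
    return np.where(mask, p, 0.0)    # nearest kernel supported on the region

for _ in range(50):                  # alternating projections onto the two sets
    phi = proj_support(proj_marginal(phi))

# The fixed point satisfies both constraints: tau = 0 slice is 1, rest is masked.
print(phi[:, 0].min(), np.abs(phi[~mask]).max())
```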


IEEE International Conference on Fuzzy Systems | 1993

Adaptive membership function fusion and annihilation in fuzzy if-then rules

B.G. Song; Robert J. Marks; Seho Oh; Payman Arabshahi; Thomas P. Caudell; J.J. Choi

The parameters of the input and output fuzzy membership functions for fuzzy if-then min-max inferencing may be adapted using supervised learning applied to training data. Under the assumption that the inference surface is in some sense smooth, the process of adaptation can reveal overdetermination of the fuzzy system in two ways. First, if two membership functions come sufficiently close to each other, they can be fused into a single membership function. Second, annihilation occurs when a membership function becomes sufficiently narrow. In both cases, the number of if-then rules is reduced. In certain cases, the overall performance of the fuzzy system can be improved by this adaptive pruning. The process of membership function fusion and annihilation is illustrated with two examples.
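The two pruning rules can be sketched directly. In the snippet below each membership function is a (center, width) pair; the fusion and annihilation thresholds and the averaging rule are assumptions for illustration, not the paper's exact criteria.

```python
def prune(mfs, fuse_tol=0.15, annihilate_tol=0.02):
    """Fuse membership functions whose centers nearly coincide; delete ones
    that have become too narrow to matter. Returns the surviving functions."""
    # Annihilation: a very narrow function fires on almost no inputs.
    mfs = [(c, w) for c, w in mfs if w > annihilate_tol]
    # Fusion: merge neighbors closer than fuse_tol into one averaged function.
    merged = []
    for c, w in sorted(mfs):
        if merged and abs(c - merged[-1][0]) < fuse_tol:
            c0, w0 = merged[-1]
            merged[-1] = ((c0 + c) / 2, (w0 + w) / 2)   # one rule replaces two
        else:
            merged.append((c, w))
    return merged

# After adaptation, two functions have drifted together and one has collapsed.
membership = [(0.10, 0.20), (0.18, 0.25), (0.55, 0.01), (0.90, 0.30)]
print(prune(membership))   # -> [(0.14, 0.225), (0.9, 0.3)]
```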


IEEE Transactions on Signal Processing | 1992

Some properties of the generalized time frequency representation with cone-shaped kernel

Seho Oh; Robert J. Marks

The cone-shaped kernel generalized time-frequency representation (GTFR) of Zhao, Atlas, and Marks (ZAM) has been shown empirically to generate quite good time-frequency representations in comparison to other approaches. The authors analyze some specific properties of this GTFR and compare them to other TFRs. Asymptotically, the GTFR is shown to produce results identical to those of the spectrogram for stationary signals. Interference terms normally present in many GTFRs are shown to be attenuated drastically by the use of the ZAM-GTFR. The ability of the ZAM-GTFR to track frequency hopping is shown to be close to that of the Wigner distribution. When a signal is subjected to white noise, the ZAM-GTFR produces an unbiased estimate of the ZAM-GTFR of the signal without noise. In many other GTFRs, the power spectral density of the noise is superimposed on the GTFR of the signal. It is also shown that, in discrete form, the ZAM-GTFR is generally invertible.
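A compact discrete sketch of the cone-kernel construction, with simplified conventions (normalization and lag windowing are omitted, so this is an illustration rather than the paper's exact definition): the bilinear product x[n+τ]x*[n−τ] is accumulated over the cone |m| ≤ |τ| before the Fourier transform over lag, which is the mechanism that attenuates cross terms.

```python
import numpy as np

def zam_tfr(x, max_lag=32):
    """Discrete cone-kernel time-frequency representation (simplified)."""
    n_samp = len(x)
    tfr = np.zeros((n_samp, 2 * max_lag), dtype=complex)
    for n in range(n_samp):
        r = np.zeros(2 * max_lag, dtype=complex)       # local autocorrelation in lag
        for tau in range(-max_lag, max_lag):
            acc = 0.0
            for m in range(-abs(tau), abs(tau) + 1):   # cone support |m| <= |tau|
                i, j = n + m + tau, n + m - tau
                if 0 <= i < n_samp and 0 <= j < n_samp:
                    acc += x[i] * np.conj(x[j])
            r[tau] = acc
        tfr[n] = np.fft.fft(r)                         # transform lag -> frequency
    return tfr.real

# Frequency-hopping test signal: the ZAM-GTFR should track the hop sharply.
t = np.arange(256)
x = np.exp(2j * np.pi * np.where(t < 128, 0.1, 0.3) * t)
Z = zam_tfr(x)
print(np.argmax(Z[64]), np.argmax(Z[192]))   # peak bins before and after the hop
```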


Physics in Medicine and Biology | 1997

Conformal radiotherapy computation by the method of alternating projections onto convex sets

Shinhak Lee; Paul S. Cho; Robert J. Marks; Seho Oh

Synthesis of beam profiles for a given dose prescription is a central problem in radiotherapy. Care must be taken in the beam design to expose the tumour volume at a high level, to avoid significant irradiation of critical organs, and to minimize exposure of all other tissue. Use of the synthesis procedure known as alternating projections onto convex sets (POCS) is shown to be a viable approach to beam design. POCS is a powerful tool for signal and image restoration and synthesis. Convex sets of signals obeying desired constraint sets are first specified. Then, by repeated projections onto these sets, convergence is to a signal obeying all desired constraints if the constraint sets have a nonempty intersection. In this paper we apply the method of POCS to conformal radiotherapy dose computation. The performance of the method is shown through three representative examples.
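The alternation can be gestured at with a toy geometry: one step clips the dose to the prescription box, the other maps back to a dose realizable by nonnegative beam weights via least squares. The nonnegativity clip makes the second step only an approximate projection, so this is an illustration of the scheme, not the paper's exact operators.

```python
import numpy as np

rng = np.random.default_rng(5)

n_beamlets, n_voxels = 30, 120
D = rng.random((n_voxels, n_beamlets))             # toy dose-deposition matrix
tumour, critical = slice(0, 40), slice(40, 80)     # remaining voxels: other tissue
lo = np.zeros(n_voxels); hi = np.full(n_voxels, 30.0)
lo[tumour], hi[tumour] = 60.0, 66.0                # high, uniform tumour dose
hi[critical] = 15.0                                # spare the critical organ

d = np.zeros(n_voxels)
for _ in range(200):
    d = np.clip(d, lo, hi)                         # project onto the prescription box
    w, *_ = np.linalg.lstsq(D, d, rcond=None)      # nearest beam profile ...
    w = np.maximum(w, 0.0)                         # ... restricted to w >= 0
    d = D @ w                                      # back to physically realizable doses

print(d[tumour].min(), d[tumour].max(), d[critical].max())
```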


International Symposium on Neural Networks | 1992

Regularization using jittered training data

Russell Reed; Seho Oh; Robert J. Marks

The authors investigate the training of a layered perceptron with jittered data. They study the effect of generating additional training data by adding noise to the input data and show that it introduces convolutional smoothing of the target function. Training using such jittered data is shown, under a small variance assumption, to be equivalent to Lagrangian regularization with a derivative regularizer. Training with jitter allows regularization within the conventional layered perceptron architecture.
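The convolutional-smoothing claim admits a quick empirical check, sketched below under assumed settings: with a step target and Gaussian input jitter, the regression function a flexible model would fit (estimated here by binning) should follow the Gaussian CDF, i.e. the step convolved with the noise density.

```python
import numpy as np
from math import erf

rng = np.random.default_rng(6)

sigma = 0.2
x = rng.uniform(-2, 2, size=500_000)
y = (x > 0).astype(float)                     # step target g(x)
u = x + sigma * rng.normal(size=x.size)       # jittered inputs, labels unchanged

# Empirical regression E[y | u] via binning, i.e. what a flexible model fits.
bins = np.linspace(-1, 1, 41)
idx = np.digitize(u, bins)
emp = np.array([y[idx == i].mean() for i in range(1, len(bins))])

# Predicted smoothed target: (g * gaussian)(u) = Phi(u / sigma).
centers = (bins[:-1] + bins[1:]) / 2
phi = np.array([0.5 * (1 + erf(c / (sigma * np.sqrt(2)))) for c in centers])

print(np.max(np.abs(emp - phi)))              # small: jitter smoothed the step
```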


IEEE Transactions on Circuits and Systems | 1989

Alternating projection neural networks

Robert J. Marks; Seho Oh; Les E. Atlas

The authors consider a class of neural networks whose performance can be analyzed and geometrically visualized in a signal space environment. Alternating projection neural networks (APNNs) perform by alternately projecting between two or more constraint sets. Criteria for desired and unique convergence are easily established. The network can be configured as either a content-addressable memory or a classifier. Convergence of the APNN can be improved by the use of sigmoid-type nonlinearities and/or by increasing the number of neurons in a hidden layer.
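A minimal content-addressable-memory sketch, with invented dimensions: the state alternates between projection onto the subspace spanned by the stored library vectors and re-clamping of the known neurons. The projector P = F(FᵀF)⁻¹Fᵀ is the standard orthogonal projector onto a column space; the clamping split is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)

n, k = 20, 3
F = rng.normal(size=(n, k))                    # library of stored patterns (columns)
P = F @ np.linalg.solve(F.T @ F, F.T)          # orthogonal projector onto span(F)

clamped = np.arange(12)                        # neurons whose values are known
stored = F[:, 1]                               # pattern we hope to recall
key = stored[clamped]                          # partial, known portion of it

s = np.zeros(n)
s[clamped] = key
for _ in range(100):
    s = P @ s                                  # project onto the stored subspace
    s[clamped] = key                           # re-clamp the known neurons

print(np.max(np.abs(s - stored)))              # ~0: full pattern recalled
```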

Collaboration


Dive into Seho Oh's collaboration.

Top Co-Authors

Les E. Atlas (University of Washington)
Dayle G. Ellison (Washington University in St. Louis)
J.J. Choi (University of Washington)
Paul S. Wilhelm (Washington University in St. Louis)
Kwan F. Cheung (University of Washington)
Dong C. Park (University of Washington)
Russell Reed (University of Washington)
Alfred T. C. Chang (Goddard Space Flight Center)