Olvi L. Mangasarian
University of California, San Diego
Publications
Featured research published by Olvi L. Mangasarian.
Knowledge Discovery and Data Mining | 2001
Glenn Fung; Olvi L. Mangasarian
Instead of a standard support vector machine (SVM) that classifies points by assigning them to one of two disjoint half-spaces, points are classified by assigning them to the closest of two parallel planes (in input or feature space) that are pushed apart as far as possible. This formulation, which can also be interpreted as regularized least squares and considered in the much more general context of regularized networks [8, 9], leads to an extremely fast and simple algorithm for generating a linear or nonlinear classifier that merely requires the solution of a single system of linear equations. In contrast, standard SVMs solve a quadratic or a linear program that requires considerably longer computational time. Computational results on publicly available datasets indicate that the proposed proximal SVM classifier has test set correctness comparable to that of standard SVM classifiers, with computation times that can be an order of magnitude faster. The linear proximal SVM easily handles large datasets, as indicated by the classification of a 2-million-point, 10-attribute dataset in 20.8 seconds. All computational results are based on 6 lines of MATLAB code.
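For concreteness, here is a minimal NumPy sketch of the single-linear-system idea described in this abstract; the least-squares formulation with E = [A -e] solved via (I/nu + E'E)[w; gamma] = E'De is the standard proximal SVM statement, while the function names and default parameter value are illustrative assumptions, not the authors' code.

import numpy as np

def proximal_svm_fit(A, d, nu=1.0):
    """Fit a linear proximal SVM by solving one linear system.
    A : (m, n) array, one data point per row
    d : (m,) array of labels in {+1, -1}
    nu : positive trade-off parameter (assumed default)"""
    m, n = A.shape
    e = np.ones((m, 1))
    E = np.hstack([A, -e])                  # E = [A  -e]
    rhs = E.T @ d.reshape(-1, 1)            # E' D e, since D e = d
    lhs = np.eye(n + 1) / nu + E.T @ E      # I/nu + E'E
    wg = np.linalg.solve(lhs, rhs)          # the single linear system
    return wg[:n].ravel(), float(wg[n, 0])  # w, gamma

def proximal_svm_predict(X, w, gamma):
    """Classify rows of X by the sign of the separating plane x'w - gamma."""
    return np.sign(X @ w - gamma)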
Optimization Methods & Software | 1992
Kristin P. Bennett; Olvi L. Mangasarian
A single linear programming formulation is proposed which generates a plane that minimizes an average sum of misclassified points belonging to two disjoint point sets in n-dimensional real space. When the convex hulls of the two sets are also disjoint, the plane completely separates the two sets. When the convex hulls intersect, our linear program, unlike all previously proposed linear programs, is guaranteed to generate some error-minimizing plane, without the imposition of extraneous normalization constraints that inevitably fail to handle certain cases. The effectiveness of the proposed linear program has been demonstrated by successfully testing it on a number of databases. In addition, it has been used in conjunction with the multisurface method of piecewise-linear separation to train a feed-forward neural network with a single hidden layer.
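For reference, one commonly cited form of this robust linear program is sketched below; the symbols A, B, e, y, z follow the usual statement rather than being quoted from this abstract. Given an m x n matrix A of points from one set and a k x n matrix B of points from the other,

\[
\begin{aligned}
\min_{w,\gamma,y,z}\quad & \tfrac{1}{m}\,e^{\top}y + \tfrac{1}{k}\,e^{\top}z\\
\text{subject to}\quad & Aw - e\gamma + y \ge e,\\
& -Bw + e\gamma + z \ge e,\\
& y \ge 0,\quad z \ge 0,
\end{aligned}
\]

where e is a vector of ones and y, z collect the per-point violations of the bounding planes x'w = gamma +/- 1, averaged in the objective.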
Operations Research | 1995
Olvi L. Mangasarian; W. Nick Street; William H. Wolberg
Two medical applications of linear programming are described in this paper. Specifically, linear programming-based machine learning techniques are used to increase the accuracy and objectivity of breast cancer diagnosis and prognosis. The first application to breast cancer diagnosis utilizes characteristics of individual cells, obtained from a minimally invasive fine needle aspirate, to discriminate benign from malignant breast lumps. This allows an accurate diagnosis without the need for a surgical biopsy. The diagnostic system in current operation at University of Wisconsin Hospitals was trained on samples from 569 patients and has had 100% chronological correctness in diagnosing 131 subsequent patients. The second application, recently put into clinical practice, is a method that constructs a surface predicting when breast cancer is likely to recur in patients who have had their cancers excised. This gives the physician and the patient better information with which to plan treatment, and may eliminate the need for a prognostic surgical procedure. The novel feature of the predictive approach is the ability to handle cases for which cancer has not recurred (censored data) as well as cases for which cancer has recurred at a specific time. The prognostic system has an expected error of 13.9 to 18.3 months, which is better than that of other available techniques.
IEEE Transactions on Pattern Analysis and Machine Intelligence | 2006
Olvi L. Mangasarian; Edward W. Wild
A new approach to support vector machine (SVM) classification is proposed wherein each of two data sets is proximal to one of two distinct planes that are not parallel to each other. Each plane is generated so that it is closest to one of the two data sets and as far as possible from the other. Each of the two nonparallel proximal planes is obtained by a single MATLAB command as the eigenvector corresponding to the smallest eigenvalue of a generalized eigenvalue problem. Classification by proximity to two distinct nonlinear surfaces generated by a nonlinear kernel also leads to two simple generalized eigenvalue problems. The effectiveness of the proposed method is demonstrated by tests on simple examples as well as on a number of public data sets. These examples show the advantages of the proposed approach in both computation time and test set correctness.
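A rough NumPy/SciPy sketch of the generalized-eigenvalue computation behind each plane follows; the Tikhonov regularization term delta*I and all names are assumptions made for this illustration, not the authors' code.

import numpy as np
from scipy.linalg import eigh

def proximal_plane(A, B, delta=1e-4):
    """Compute one nonparallel proximal plane x'w = gamma that is close to
    the rows of A and far from the rows of B, via a generalized eigenvalue
    problem."""
    n = A.shape[1]
    G = np.hstack([A, -np.ones((A.shape[0], 1))])   # [A  -e]
    H = np.hstack([B, -np.ones((B.shape[0], 1))])   # [B  -e]
    P = G.T @ G + delta * np.eye(n + 1)             # numerator matrix
    Q = H.T @ H + delta * np.eye(n + 1)             # denominator matrix, kept positive definite
    # Symmetric generalized eigenproblem P z = lam Q z; eigh returns the
    # eigenvalues in ascending order, so the first eigenvector minimizes
    # z'Pz / z'Qz, i.e. gives the desired plane.
    _, V = eigh(P, Q)
    z = V[:, 0]
    return z[:n], z[n]                               # w, gamma

# The second plane follows by swapping the roles of A and B; a new point is
# then assigned to the class whose plane is nearer in the sense of
# |x'w - gamma| / ||w||.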
Journal of Mathematical Analysis and Applications | 1967
Olvi L. Mangasarian; S. Fromovitz
Optimality criteria form the foundations of mathematical programming both theoretically and computationally. In general, these criteria can be classified as either necessary or sufficient. Of course, one would like to have the same criterion be both necessary and sufficient. However, this occurs only under somewhat ideal conditions which are rarely satisfied in practice. In the absence of convexity, one is never assured, in general, of the sufficiency of any such optimality criterion. We are then left with only the necessary optimality criterion to face the vast number of mathematical programming problems which are not convex. The best-known necessary optimality criterion for a mathematical programming problem is the Kuhn-Tucker criterion [1]. However, the Fritz John criterion [2], which predates the Kuhn-Tucker criterion by about three years, is in a sense more general. In order for the Kuhn-Tucker criterion to hold, one must impose a constraint qualification on the constraints of the problem. On the other hand, no such qualification need be imposed in order that the Fritz John criterion hold. Moreover, the Fritz John criterion itself can be used to derive a form of the constraint qualification for the Kuhn-Tucker criterion. Originally, Fritz John derived his conditions for the case of inequality constraints alone. If equality constraints are present and each is merely replaced by two inequality constraints, the original Fritz John conditions become useless because every feasible point satisfies them. The new generalization of Fritz John's conditions derived in this work treats equalities as equalities and does not convert them to inequalities, which makes it possible to handle equalities and inequalities together. Another contribution of the present work is a constraint qualification for equalities and inequalities together; previous constraint qualifications treated equalities and inequalities separately.
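For orientation, one standard modern statement of the generalized Fritz John conditions for min f(x) subject to g(x) <= 0 and h(x) = 0 (a sketch in current notation, not quoted from the paper) is: at a local minimum \(\bar x\) there exist multipliers \((\lambda_0, \lambda, \mu)\), not all zero, with

\[
\lambda_0 \nabla f(\bar x) + \sum_{i} \lambda_i \nabla g_i(\bar x) + \sum_{j} \mu_j \nabla h_j(\bar x) = 0,
\qquad \lambda_i\, g_i(\bar x) = 0,\qquad \lambda_0 \ge 0,\ \lambda \ge 0 .
\]

The associated constraint qualification, now commonly called the Mangasarian-Fromovitz constraint qualification, requires the gradients \(\nabla h_j(\bar x)\) to be linearly independent and a direction d to exist with \(\nabla h_j(\bar x)^{\top} d = 0\) for all j and \(\nabla g_i(\bar x)^{\top} d < 0\) for the active inequalities; under it one may take \(\lambda_0 = 1\), recovering the Kuhn-Tucker conditions.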
Computational Optimization and Applications | 2001
Yuh-Jye Lee; Olvi L. Mangasarian
Smoothing methods, extensively used for solving important mathematical programming problems and applications, are applied here to generate and solve an unconstrained smooth reformulation of the support vector machine for pattern classification using a completely arbitrary kernel. We term such a reformulation a smooth support vector machine (SSVM). A fast Newton–Armijo algorithm for solving the SSVM converges globally and quadratically. Numerical results and comparisons are given to demonstrate the effectiveness and speed of the algorithm. On six publicly available datasets, the tenfold cross-validation correctness of SSVM was the highest among the five methods compared, and SSVM was also the fastest. On larger problems, SSVM was comparable to or faster than SVMlight (T. Joachims, in Advances in Kernel Methods—Support Vector Learning, MIT Press: Cambridge, MA, 1999), SOR (O.L. Mangasarian and David R. Musicant, IEEE Transactions on Neural Networks, vol. 10, pp. 1032–1037, 1999) and SMO (J. Platt, in Advances in Kernel Methods—Support Vector Learning, MIT Press: Cambridge, MA, 1999). SSVM can also generate a highly nonlinear separating surface such as a checkerboard.
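A small NumPy sketch of the smooth unconstrained reformulation is given below; the objective form is the standard SSVM one, while the names, defaults, and the use of a generic quasi-Newton solver in place of the paper's Newton–Armijo method are assumptions made for illustration.

import numpy as np
from scipy.optimize import minimize

def smooth_plus(x, alpha=5.0):
    """Smooth approximation of the plus function (x)_+ = max(0, x):
    p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha * x)).
    np.logaddexp avoids overflow for large |x|."""
    return x + np.logaddexp(0.0, -alpha * x) / alpha

def ssvm_objective(wg, A, d, nu=1.0, alpha=5.0):
    """Unconstrained smooth SVM-style objective (variable names assumed):
    (nu/2) * ||p(e - D(Aw - e*gamma), alpha)||^2 + (1/2) * (w'w + gamma^2)."""
    n = A.shape[1]
    w, gamma = wg[:n], wg[n]
    r = smooth_plus(1.0 - d * (A @ w - gamma), alpha)
    return 0.5 * nu * (r @ r) + 0.5 * (w @ w + gamma ** 2)

# The paper minimizes this with a globally and quadratically convergent
# Newton-Armijo iteration; a generic quasi-Newton routine stands in here
# purely for illustration:
# wg0 = np.zeros(A.shape[1] + 1)
# result = minimize(ssvm_objective, wg0, args=(A, d), method="BFGS")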
Computational Optimization and Applications | 1996
Chunhui Chen; Olvi L. Mangasarian
We propose a class of parametric smooth functions that approximate the fundamental plus function, (x)+ = max{0, x}, by twice integrating a probability density function. This leads to classes of smooth parametric nonlinear equation approximations of nonlinear and mixed complementarity problems (NCPs and MCPs). For any solvable NCP or MCP, existence of an arbitrarily accurate solution to the smooth nonlinear equations, as well as to the NCP or MCP, is established for a sufficiently large value of the smoothing parameter α. Newton-based algorithms are proposed for the smooth problem. For strongly monotone NCPs, global convergence and local quadratic convergence are established. For solvable monotone NCPs, each accumulation point of the proposed algorithms solves the smooth problem. Exact solutions of our smooth nonlinear equation for various values of the parameter α generate an interior path, which is different from the central path of interior point methods. Computational results for 52 test problems compare favorably with those for another Newton-based method. The smoothing technique is capable of efficiently solving the test problems solved by Dirkse and Ferris [6], Harker and Xiao [11], and Pang and Gabriel [28].
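To make the twice-integration construction concrete (a sketch, not quoted from the paper): integrating a probability density once gives a smooth approximation to the step function, and integrating again gives a smooth approximation to the plus function,

\[
\hat s(x,\alpha) = \int_{-\infty}^{x} \alpha\, d(\alpha t)\, dt \;\approx\; \operatorname{step}(x),
\qquad
\hat p(x,\alpha) = \int_{-\infty}^{x} \hat s(t,\alpha)\, dt \;\approx\; (x)_+ ,
\]

where d is the chosen density and α > 0 controls the accuracy. For example, with the sigmoid density \(d(x) = e^{-x}/(1+e^{-x})^{2}\) this yields \(\hat p(x,\alpha) = x + \tfrac{1}{\alpha}\log(1+e^{-\alpha x})\), the same smoothing of the plus function used in the smooth support vector machine entry above.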
Mathematical Programming | 1979
Shih-Ping Han; Olvi L. Mangasarian
It is shown that the existence of a strict local minimum satisfying the constraint qualification of [16] or McCormick's [12] second-order sufficient optimality condition implies the existence of a class of exact local penalty functions (that is, ones with a finite value of the penalty parameter) for a nonlinear programming problem. A lower bound on the penalty parameter is given by a norm of the optimal Lagrange multipliers which is dual to the norm used in the penalty function.
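As a rough illustration of the result (notation assumed here, not taken from the paper): for min f(x) subject to g(x) <= 0 and h(x) = 0, the penalty function

\[
P_c(x) \;=\; f(x) \;+\; c\,\bigl\lVert \bigl( \max\{g(x),0\},\; h(x) \bigr) \bigr\rVert
\]

is exact at the strict local minimum \(\bar x\), in the sense that \(\bar x\) is also a local minimizer of \(P_c\), once the finite parameter c exceeds the dual norm of the optimal Lagrange multipliers, where the dual is taken with respect to the norm used in the penalty term.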
Journal of Optimization Theory and Applications | 1977
Olvi L. Mangasarian
A unified treatment is given for iterative algorithms for the solution of the symmetric linear complementarity problem.
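The problem referred to, in its standard form (stated here for orientation; the notation is not taken from the article), is: find \(x \in \mathbb{R}^{n}\) such that

\[
x \ge 0, \qquad Mx + q \ge 0, \qquad x^{\top}(Mx + q) = 0,
\]

where M is a symmetric n x n matrix and \(q \in \mathbb{R}^{n}\).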
Journal of Global Optimization | 2000
Paul S. Bradley; Olvi L. Mangasarian