J. Michael Steele
University of Pennsylvania
Publications
Featured research published by J. Michael Steele.
American Mathematical Monthly | 2004
J. Michael Steele
1. Starting with Cauchy 2. The AM-GM inequality 3. Lagrange's identity and Minkowski's conjecture 4. On geometry and sums of squares 5. Consequences of order 6. Convexity - the third pillar 7. Integral intermezzo 8. The ladder of power means 9. Hölder's inequality 10. Hilbert's inequality and compensating difficulties 11. Hardy's inequality and the flop 12. Symmetric sums 13. Majorization and Schur convexity 14. Cancellation and aggregation. Solutions to the exercises. Notes. References.
Archive | 2004
David Aldous; J. Michael Steele
This survey describes a general approach to a class of problems that arise in combinatorial probability and combinatorial optimization. Formally, the method is part of weak convergence theory, but in concrete problems the method has a flavor of its own. A characteristic element of the method is that it often calls for one to introduce a new, infinite, probabilistic object whose local properties inform us about the limiting properties of a sequence of finite problems.
Journal of Algorithms | 1982
J. Michael Steele; Andrew Chi-Chih Yao
A topological method is given for obtaining lower bounds for the height of algebraic decision trees. The method is applied to the knapsack problem where an Ω(n2) bound is obtained for trees with bounded-degree polynomial tests, thus extending the Dobkin-Lipton result for linear trees. Applications to the convex hull problem and the distinct element problem are also indicated. Some open problems are discussed.
Mathematics of Operations Research | 1990
J. Michael Steele
The classical problems reviewed are the traveling salesman problem, minimal spanning tree, minimal matching, greedy matching, minimal triangulation, and others. Each optimization problem is considered for finite sets of points in ℝ^d, and the feature of principal interest is the value of the associated objective function. Special attention is given to the asymptotic behavior of this value under probabilistic assumptions, but both probabilistic and worst case analyses are surveyed.
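The probabilistic asymptotics surveyed can be probed numerically. Below is a minimal Python sketch (my own illustration, not from the paper) that computes the Euclidean minimal spanning tree length of n uniform random points in the unit square with Prim's algorithm; for d = 2 the classical results predict growth of order √n, so the printed ratio should settle near a constant.

```python
import math
import random

def mst_length(points):
    """Total edge length of the Euclidean minimal spanning tree (Prim's algorithm)."""
    n = len(points)
    in_tree = [False] * n
    dist = [float("inf")] * n  # dist[v] = distance from v to the current tree
    dist[0] = 0.0
    total = 0.0
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: dist[i])
        in_tree[u] = True
        total += dist[u]
        for v in range(n):
            if not in_tree[v]:
                d = math.dist(points[u], points[v])
                if d < dist[v]:
                    dist[v] = d
    return total

random.seed(0)
for n in (50, 200, 800):
    pts = [(random.random(), random.random()) for _ in range(n)]
    print(n, mst_length(pts) / math.sqrt(n))  # ratio should roughly stabilize
```

The O(n²) Prim implementation is chosen for self-containment; for large n one would use a k-d tree or Delaunay-based neighbor structure instead.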
Archive | 1995
J. Michael Steele
A review is given of the results on the length of the longest increasing subsequence and related problems. The review covers results on random and pseudorandom sequences as well as deterministic ones. Although most attention is given to previously published research, some new proofs and new results are given. In particular, some new phenomena are demonstrated for the monotonic subsequences of sections of sequences. A number of open problems from the literature are also surveyed.
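For readers who want to experiment with the quantity under review, the length of a longest strictly increasing subsequence can be computed in O(n log n) by the standard patience-sorting technique. This sketch is a general-purpose illustration, not an algorithm taken from the survey:

```python
import bisect

def lis_length(seq):
    """Length of the longest strictly increasing subsequence, in O(n log n)."""
    # tails[k] = smallest possible tail value of an increasing subsequence of length k+1
    tails = []
    for x in seq:
        k = bisect.bisect_left(tails, x)  # bisect_left enforces strict increase
        if k == len(tails):
            tails.append(x)
        else:
            tails[k] = x
    return len(tails)
```

For a uniformly random permutation of length n, the expected value of this statistic is asymptotic to 2√n, the celebrated resolution of Ulam's problem.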
Journal of the American Statistical Association | 1987
Diane L. Souvaine; J. Michael Steele
The least median of squared residuals regression line (or LMS line) is that line y = ax + b for which the median of the squared residuals |y_i − ax_i − b|² is minimized over all choices of a and b. If we rephrase the traditional ordinary least squares (OLS) problem as finding the a and b that minimize the mean of |y_i − ax_i − b|², one can see that in a formal sense LMS just replaces a “mean” by a “median.” This way of describing LMS regression does not do justice to its remarkable properties. In fact, LMS regression behaves in ways that distinguish it greatly from OLS as well as from many other methods for robustifying OLS (see, e.g., Rousseeuw 1984). As illustrations given here show, the LMS regression line should provide a valuable tool for studying those data sets in which the usual linear model assumptions are violated by the presence of some (not too small) groups of data values that behave distinctly from the bulk of the data. This feature of LMS regression is illustrated by the fit given in...
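The "mean to median" swap can be illustrated with a crude brute force that scores the line through every pair of data points by its median squared residual. This is emphatically not the efficient Souvaine-Steele algorithm, and the exact LMS minimizer need not pass through two data points, so treat it only as an O(n³) approximation sketch:

```python
import itertools
import statistics

def lms_line(xs, ys):
    """Approximate LMS fit: best line through a pair of data points,
    scored by the median of the squared residuals."""
    best = None  # (median squared residual, slope a, intercept b)
    for i, j in itertools.combinations(range(len(xs)), 2):
        if xs[i] == xs[j]:
            continue  # vertical candidate line; skipped in this sketch
        a = (ys[j] - ys[i]) / (xs[j] - xs[i])
        b = ys[i] - a * xs[i]
        med = statistics.median((y - a * x - b) ** 2 for x, y in zip(xs, ys))
        if best is None or med < best[0]:
            best = (med, a, b)
    return best[1], best[2]
```

On data whose bulk lies on one line plus a cluster of outliers, this fit tracks the bulk, whereas OLS would be dragged toward the outliers.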
Journal of Applied Probability | 1987
J. Michael Steele; Lawrence A. Shepp; William F. Eddy
Let V_{k,n} be the number of vertices of degree k in the Euclidean minimal spanning tree of X_1, …, X_n, where the X_i are independent, absolutely continuous random variables with values in ℝ^d. It is proved that n^{-1} V_{k,n} converges with probability 1 to a constant α_{k,d}. Intermediate results provide information about how the vertex degrees of a minimal spanning tree change as points are added or deleted, about the decomposition of minimal spanning trees into probabilistically similar trees, and about the mean and variance of V_{k,n}.
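The almost-sure convergence of the degree fractions can be observed in simulation. Here is a minimal sketch (my own, not the authors') that extracts the MST edges with Prim's algorithm and tabulates the fraction of vertices of each degree k, i.e. the empirical n^{-1} V_{k,n}:

```python
import math

def mst_edges(points):
    """Edge list (parent, child) of the Euclidean MST via Prim's algorithm."""
    n = len(points)
    in_tree = [False] * n
    dist = [float("inf")] * n
    parent = [-1] * n
    dist[0] = 0.0
    edges = []
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: dist[i])
        in_tree[u] = True
        if parent[u] >= 0:
            edges.append((parent[u], u))
        for v in range(n):
            if not in_tree[v]:
                d = math.dist(points[u], points[v])
                if d < dist[v]:
                    dist[v], parent[v] = d, u
    return edges

def degree_fractions(points):
    """Empirical n^{-1} V_{k,n}: fraction of MST vertices having each degree k."""
    n = len(points)
    deg = [0] * n
    for u, v in mst_edges(points):
        deg[u] += 1
        deg[v] += 1
    counts = {}
    for d in deg:
        counts[d] = counts.get(d, 0) + 1
    return {k: c / n for k, c in sorted(counts.items())}
```

Running `degree_fractions` on growing samples of uniform points gives empirical estimates of the limiting constants α_{k,d}.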
Journal of Combinatorial Theory | 1978
J. Michael Steele
Let M be a matrix with entries from {1, 2, …, s} with n rows such that no matrix M′ formed by taking k rows of M has s^k distinct columns. Let f(k; n, s) be the largest integer for which there is an M with f(k; n, s) distinct columns. It is proved that f(k; n, s) = s^n − Σ_{j=k}^{n} C(n, j)(s − 1)^{n−j}. This result is related to a conjecture of Erdős and Szekeres that any set of 2^{k−2} + 1 points in ℝ² contains a set of k points which form a convex polygon.
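The closed form can be sanity-checked by exhaustive search for tiny parameters. In this sketch (an illustration of mine, not from the paper), a set of distinct columns is admissible when no choice of k rows exhibits all s^k patterns, matching the condition above:

```python
from itertools import combinations, product
from math import comb

def f_formula(k, n, s):
    """f(k; n, s) = s^n - sum_{j=k}^{n} C(n, j) (s - 1)^(n - j)."""
    return s ** n - sum(comb(n, j) * (s - 1) ** (n - j) for j in range(k, n + 1))

def f_brute(k, n, s):
    """Largest set of distinct columns in {0, ..., s-1}^n such that no
    k-subset of rows sees all s^k patterns among the chosen columns."""
    cols = list(product(range(s), repeat=n))
    for r in range(len(cols), 0, -1):  # search sizes from largest down
        for subset in combinations(cols, r):
            ok = True
            for rows in combinations(range(n), k):
                patterns = {tuple(c[i] for i in rows) for c in subset}
                if len(patterns) == s ** k:
                    ok = False
                    break
            if ok:
                return r
    return 0
```

For s = 2 the condition says that no k coordinates are shattered, and the formula agrees with the Sauer-Shelah bound; e.g. f(2; 3, 2) = 4 = C(3, 0) + C(3, 1). The brute force is exponential and only usable for very small n and s.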
SIAM Journal on Computing | 1989
J. Michael Steele; Timothy Law Snyder
A method is presented for determining the asymptotic worst-case behavior of quantities like the length of the minimal spanning tree or the length of an optimal traveling salesman tour of n points in the unit d-cube. In each of these classical problems, the worst-case lengths are proved to have the exact asymptotic growth rate of β n^{(d−1)/d}.
Technometrics | 1989
Richard D. De Veaux; J. Michael Steele