Armand Siegel
Boston University
Publication
Featured research published by Armand Siegel.
Physics of Fluids | 1964
William C. Meecham; Armand Siegel
A Wiener‐Hermite functional expansion is used to treat a random initial value process involving the Burgers model equation. The nonlinear model equation has many of the characteristics of the Navier‐Stokes equation. It is found that the functional expansion converges better the larger the separation variable in the correlation function (the nearer to joint normal is the distribution). To the present order, the treatment is similar to a quasinormal assumption. The computations show that the correlation function quickly approaches an equilibrium form for quite different initial values. The power spectrum function approaches an equilibrium form also, where it falls off like the inverse second power of the wavenumber.
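For reference only (the notation below is standard and is not quoted from the abstract), the Burgers model equation treated in the paper and the relation between the correlation function R(r) and the power spectrum E(k), whose equilibrium form is reported to fall off like the inverse second power of the wavenumber, can be written as
\[
\frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x} = \nu\,\frac{\partial^{2} u}{\partial x^{2}},
\qquad
E(k) = \frac{2}{\pi}\int_{0}^{\infty} R(r)\cos(kr)\,dr,
\qquad
E(k) \sim k^{-2}\ \text{at equilibrium},
\]
where u(x,t) is the velocity field, \nu the viscosity, and the cosine transform is written up to the usual choice of normalization.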
Epilepsia | 1982
Armand Siegel; Cheryl L. Grady; Allan F. Mirsky
Summary: The EEGs of subjects with absence seizures were examined to determine if changes occurred prior to spike‐wave bursts that could be used to predict bursts. A number of 20‐s epochs of EEG prior to spike‐wave bursts (preburst epochs) and during periods remote from bursts (control epochs) were examined in 5 subjects. Power‐spectrum analysis was carried out on each epoch and frequency bands from 0 to 50 c/s were combined into 2‐c/s bandwidths. Logarithmically transformed power values in each frequency band were entered into a discriminant analysis algorithm for each subject separately. Results were expressed in terms of a test for significant differences between preburst and control epochs (F statistic) and a “success ratio” of discriminant analysis classification, defined as the proportion of correct classifications in both groups, as obtained using a cross‐validation procedure. A significant preburst EEG pattern was found in 4 of the 5 subjects, and success ratios ranged from 0.64 to 0.83. Each subject's preburst EEG seemed to be characterized by a unique pattern of changes, and thus no common prodromal signal was found. The EEG changes did not appear to be caused by overt behaviors, such as eye closure or drowsiness. The findings suggest that the preburst EEG pattern represents a functional alteration in brain activity which could arise from the burst‐producing mechanism directly.
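A minimal sketch of the pipeline described above, assuming a sampling rate, band edges, and off-the-shelf estimators that are illustrative choices rather than details taken from the paper: each 20‐s epoch is reduced to log power in 2‐c/s bands, the epochs are classified by linear discriminant analysis, and performance is reported as a cross‐validated success ratio.

# Illustrative sketch only: sampling rate, band edges, and estimator choices
# are assumptions, not parameters taken from the paper.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_predict

FS = 200           # assumed sampling rate (Hz)
BANDS = [(lo, lo + 2) for lo in range(0, 50, 2)]   # 2-c/s bands covering 0-50 c/s

def band_log_power(epoch):
    """Log-transformed power in each 2-c/s band for one 20-s EEG epoch (1-D array)."""
    freqs, pxx = welch(epoch, fs=FS, nperseg=4 * FS)
    feats = []
    for lo, hi in BANDS:
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(pxx[mask].sum() + 1e-12))
    return np.array(feats)

def success_ratio(preburst_epochs, control_epochs):
    """Cross-validated proportion of correctly classified epochs, both groups pooled."""
    X = np.array([band_log_power(e) for e in preburst_epochs + control_epochs])
    y = np.array([1] * len(preburst_epochs) + [0] * len(control_epochs))
    pred = cross_val_predict(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
    return (pred == y).mean()

Leave-one-out cross-validation stands in here for whatever cross-validation scheme the authors actually used; the success ratio returned corresponds to the quantity defined in the abstract.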
Journal of Mathematical Physics | 1965
Armand Siegel; Tsutomu Imamura; William C. Meecham
The Wiener‐Hermite functional expansion, which is the expansion of a random function about a Gaussian function, is here substituted into the Burgers one‐dimensional model equation of turbulence. The result is a hierarchy of equations which (along with initial conditions) determine the kernel functions which play the role of expansion coefficients in the series. Initial conditions are postulated, based on physical reasoning, criteria of simplicity, and the assumption that the series is to represent the late decay stage (in which the Gaussian correction is small and also decreasing with time). These are shown to justify an iterative solution to the equations. The first correction to the Gaussian approximation is calculated. This is then tested by evaluating the correction to the flatness factor, which for an exactly Gaussian function has the value 3, but which has been found by experiment (in real three‐dimensional fluids, of course) to have a value which deviates from the Gaussian value increasingly rapidly...
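For orientation (the kernel and functional notation below is assumed, not quoted from the paper), the form of the Wiener‐Hermite expansion and the flatness factor used as the test quantity are
\[
u(x,t) = \int K^{(1)}(x,x_{1};t)\,H^{(1)}(x_{1})\,dx_{1}
+ \iint K^{(2)}(x,x_{1},x_{2};t)\,H^{(2)}(x_{1},x_{2})\,dx_{1}\,dx_{2} + \cdots,
\qquad
F = \frac{\langle u^{4}\rangle}{\langle u^{2}\rangle^{2}},
\]
where the H^{(n)} are Wiener‐Hermite functionals of an underlying Gaussian white-noise process, the kernels K^{(n)} are the expansion coefficients fixed by the hierarchy of equations, the first term alone gives the Gaussian approximation, and F = 3 exactly when u is Gaussian.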
Physics of Fluids | 1963
Armand Siegel; Tsutomu Imamura; William C. Meecham
A method is presented which exploits the nearness to Gaussianity of velocity probability distributions in turbulence, by expanding the velocity field function about the Gaussian approximation. Many of the mathematical manifestations of the method are so new that it is hard to obtain physical insight into them. Hence, a pilot project of a simplified nature is undertaken which uses the Burgers one-dimensional model equation instead of the Navier-Stokes or MHD equations.
Journal of Theoretical Biology | 1981
Armand Siegel
The theory of the stochastic aspects of the generation of the EEG due to Elul is mathematically incomplete, and contains two important paradoxes. Moreover, the methodology of his evaluation of Gaussianity in the distribution of the EEG amplitude has been called into question. This article supplies the missing proof, shows how the logical difficulties can be overcome, and justifies (insofar as possible with our present knowledge) his conclusions with respect to Gaussianity. We close the gap in the mathematical presentation by showing to what extent the spectrum of the EEG will agree with that of its neuronal generators. The first paradox arose when Elul revised his theory to conform to his tetrodotoxin experiment. This revision made it impossible to understand how the Central Limit Theorem could still govern the amplitude distribution of the EEG, even though the use of this theorem was the most fundamental and attractive aspect of his system and seemed to be amply justified in many respects. The paradox is resolved by applying the Central Limit Theorem in its modified form, which applies to sums of dependent variables. The second paradox arises in Elul's finding that EEG during mental task performance is less Gaussian than that in the idle state. Less Gaussian means less random, i.e. presumably arising from more organized neuronal configurations. How can this be reconciled with the conventional picture of task EEG as “desynchronized”? The resolution involves a careful distinction between organization and synchronization, and it is shown that highly organized brain states accompanying task performance are very unlikely to involve neuronal synchronization in the conventional sense.
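A small numerical illustration of the modified Central Limit Theorem invoked above: the AR(1) dependence across generators and every parameter below are assumptions made for the demonstration, not a model taken from the paper, but they show that strongly non-Gaussian, mutually dependent contributions can still sum to a nearly Gaussian amplitude distribution.

# Toy illustration of the Central Limit Theorem for dependent summands
# (assumed AR(1) dependence across generators; not Elul's or Siegel's model).
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(0)
n_gen, n_draws, rho = 1000, 5000, 0.6

# Latent Gaussian chain correlated across generator index; dependence decays geometrically.
latent = np.empty((n_gen, n_draws))
latent[0] = rng.standard_normal(n_draws)
for i in range(1, n_gen):
    latent[i] = rho * latent[i - 1] + np.sqrt(1 - rho**2) * rng.standard_normal(n_draws)

contrib = latent ** 3            # strongly non-Gaussian individual contributions
summed = contrib.sum(axis=0)     # sum over many dependent generators

print(kurtosis(contrib[0]))      # excess kurtosis well above 0 for a single generator
print(kurtosis(summed))          # near 0: the summed signal is approximately Gaussian

The decay of the dependence with distance between generators is what keeps the modified theorem applicable; fully synchronized generators would not average out in this way, which is consistent with the abstract's distinction between organization and synchronization.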
Progress of Theoretical Physics | 1961
Armand Siegel
The Boltzmann linear operator is expanded such that the successive terms in the expansion may be readily interpreted as corresponding to successive kinetic-theoretical corrections. Applications to noise spectra are given.
Foundations of Physics | 1970
Armand Siegel
The quantum formalism of distinguishable, yet equivalent particles (with symmetric or antisymmetric wave functions) is here worked out. The result is an entirely explicit formulation of the way in which classical mechanics emerges from quantum mechanics for such particles. Distinguishability is achieved at the cost of dynamical precision; the two are, in fact, complementary.
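As a textbook reminder (standard notation, not taken from the paper), the symmetric and antisymmetric two-particle wave functions referred to are
\[
\Psi_{\pm}(x_{1},x_{2}) = \frac{1}{\sqrt{2}}\bigl[\psi_{a}(x_{1})\,\psi_{b}(x_{2}) \pm \psi_{b}(x_{1})\,\psi_{a}(x_{2})\bigr],
\]
with the upper sign for bosons and the lower sign for fermions; exchanging the particle labels leaves such states unchanged up to sign, which is what makes the particles equivalent and what the paper's formulation must reconcile with distinguishability.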
Foundations of Physics | 1970
Armand Siegel
It is shown here that the microcanonical ensemble for a system of noninteracting bosons and fermions contains a subensemble of state vectors for which all particles of the system are distinguishable. This “IQC” (inner quantum-classical) subensemble is therefore fully classical, except for a rather extreme quantization of particle momentum and position, which appears as the natural price that must be paid for distinguishability. The contribution of the IQC subensemble to the entropy is readily calculated, and the criterion for this to be a good approximation to the exact entropy is a logarithmically strengthened form of the usual criterion for the validity of classical statistics in terms of the thermal de Broglie wavelength and the average volume per particle. Thus, it becomes possible to derive the Maxwell-Boltzmann distribution directly from the ensemble in the classical limit, using fully classical reasoning about the distinguishability of particles. The entropy is additive—the N! factor of the Boltzmann count cancels out in the course of the calculation, and the “N! paradox” is thereby resolved. The method of “correct Boltzmann counting” and the lowest term of the Wigner-Kirkwood series for the partition function are seen to be partly based on the IQC subensemble, and their partly nonclassical nature is clarified. The clear separation in the full ensemble of classical and nonclassical components makes it possible to derive the classical statistics of indistinguishable particles from their quantum statistics in a controlled, explicit way. This is particularly important for nonequilibrium theory. The treatment of molecular collisions along too-literally classical lines turns out to require exorbitantly high temperatures, although there are suggestions of indirect ways in which classical nonequilibrium theory might be justified at ordinary temperatures. The applicability of exact classical ergodic and mixing theory to systems at ordinary temperatures is called into question, although the general idea of coarse-graining is confirmed. The concepts on which the IQC idea is based are shown to give rise to a series development of thermostatistical quantities, starting with the distinguishable-particle approximation.
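For context, the standard quantities behind the criterion and the N! factor mentioned above (textbook forms, not derived in the abstract) are the thermal de Broglie wavelength, the usual classical-statistics condition, and the entropy of the monatomic ideal gas that division by N! makes additive:
\[
\lambda_{T} = \frac{h}{\sqrt{2\pi m k_{B} T}},
\qquad
\frac{N}{V}\,\lambda_{T}^{3} \ll 1,
\qquad
S = N k_{B}\left[\ln\!\frac{V}{N\lambda_{T}^{3}} + \frac{5}{2}\right].
\]
The “logarithmically strengthened” criterion of the abstract refers to a tightened version of the middle condition.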
Archive | 1984
Armand Siegel
My aim in this essay is to show how far it is possible to defend the quality of ‘humanness’ against the scientific world view, under stringent (and, I would say, appropriate) conditions. The conditions are: first, not to resort to anything in the nature of vitalism or teleology — by modern standards, a not particularly stringent condition; second, which is, I think, quite stringent, not to resort to anything in the nature of holism or organicism, nor appeal to hypothetical schemes of thought or laws of nature of a non-empirical kind, like dialectics. In effect, I propose to defend this quality, which we may think of for purposes of this discussion as tied in with such felt, relatively noncognitive entities as ‘the spark of life’ or free will, sense of self or sense of one’s uniqueness, without challenging the scientific reducibility of life to the laws of physics, and without postulating any laws of physics not in the textbooks today. The defense I speak of is not new. It was first put forth some fifty years ago by Alfred North Whitehead, in the early chapters of his book Science and the Modern World (Whitehead [1925]). I shall take up only Whitehead’s very simple proposals of his early chapters, without the speculative material further on. I shall argue, first, that this stark message of Whitehead’s deserves new attention because it is all that one can reasonably expect; and second, by means of various observations including a proposition that I call a ‘quasi-theorem’, which is a continuation of Whitehead’s thought, that it is really enough, if properly and not very complicatedly developed, and therefore deserves the further attention of philosophers.
Archive | 1974
Armand Siegel; Mildred Siegel
For pure science, the fifties and early sixties were a golden age: literally, that is to say, financially. Though in many respects they saw a flourishing of scientific achievement too, this is not what we are concerned with here. Rather we wish to discuss a social phenomenon, an era of unparalleled public, governmental support for the sciences which seems, in the early seventies, to be drawing to a close. With the culminating 1973 budget cuts of the Nixon Administration, catastrophic to the earlier well-heeled way of life of the sciences, some reflections are in order. We grant that at this same time a whole style of governmental expenditure is, at least temporarily, going out the window, so science is not being singled out right now. But at the time of its rise it was singled out; and one needs to remember that its decline began during the Johnson administration, when social programs and the like still flourished. Thus there is good reason to treat science as a special case.