Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where J. Wolfowitz is active.

Publication


Featured research published by J. Wolfowitz.


Information & Computation | 1962

Channels with Arbitrarily Varying Channel Probability Functions

J. Kiefer; J. Wolfowitz

Let S be a finite collection of c.p.f.'s as described in Section 4.1. Suppose that s varies arbitrarily from letter to letter, instead of remaining constant during the transmission of a word of length n. Let \( S^t = S \), \( t = 1, \ldots, n \). For every n-sequence \( s_n = \left( s^1, \ldots, s^n \right) \in \prod\limits_{t=1}^{n} S^t = S^n \) we define


Archive | 1974

Maximum probability estimators and related topics

Lionel Weiss; J. Wolfowitz


Annals of the Institute of Statistical Mathematics | 1964

Optimum extrapolation and interpolation designs, I

J. Kiefer; J. Wolfowitz

\[ P_{s_n}\left\{ v\left( u_0 \right) = v_0 \right\} = P\left\{ v\left( u_0 \right) = v_0 \mid s_n \right\} = \prod\limits_{t=1}^{n} w\left( y_t \mid x_t \mid s^t \right) \]
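As a concrete illustration of this product formula, here is a minimal computational sketch. The two-state binary c.p.f. table `w` and all the numbers in it are hypothetical, chosen only to show how the word probability factors over letters when the state sequence is fixed.

```python
# Word probability for an arbitrarily varying channel (AVC):
# w[s][x][y] is the (hypothetical) probability of receiving y
# when x is sent and the channel is in state s.

from math import prod

# Hypothetical binary channel with two states: a "clean" BSC(0.1)
# and a "noisy" BSC(0.3).
w = {
    0: {0: {0: 0.9, 1: 0.1}, 1: {0: 0.1, 1: 0.9}},   # state s = 0
    1: {0: {0: 0.7, 1: 0.3}, 1: {0: 0.3, 1: 0.7}},   # state s = 1
}

def word_probability(x_word, y_word, s_word):
    """P_{s_n}{v(u_0) = v_0} = prod_t w(y_t | x_t | s^t): for a fixed
    (though arbitrarily chosen) state sequence s_n, the probability of
    the received word factors into per-letter terms."""
    return prod(w[s][x][y] for x, y, s in zip(x_word, y_word, s_word))

# The state may change arbitrarily from letter to letter:
print(word_probability(x_word=(0, 1, 1, 0),
                       y_word=(0, 1, 0, 0),
                       s_word=(0, 1, 1, 0)))   # 0.9 * 0.7 * 0.3 * 0.9
```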


Information & Computation | 1963

On channels without a capacity

J. Wolfowitz


Information & Computation | 1960

A note on the strong converse of the coding theorem for the general discrete finite-memory channel

J. Wolfowitz

Such a channel is called “arbitrarily varying”. Results on such channels have not yet reached the level of results on compound channels because the problems are much more difficult. This chapter is devoted to some of the results already obtained.


Information & Computation | 1968

Note on the Gaussian channel with feedback and a power constraint

J. Wolfowitz

Contents: Purpose of this monograph; The maximum likelihood estimator; The maximum probability estimator; Maximum probability estimators with a general loss function; Asymptotic behavior of the likelihood function, asymptotically sufficient statistics; Efficiency of maximum likelihood estimators; Testing hypotheses.


Probability Theory and Related Fields | 1969

Asymptotically minimax tests of composite hypotheses

Lionel Weiss; J. Wolfowitz

Summary: For regression problems where observations may be taken at points in a set X which does not coincide with the set Y on which the regression function is of interest, we consider the problem of finding a design (allocation of observations) which minimizes the maximum over Y of the variance function (of estimated regression). Specific examples are calculated for one-dimensional polynomial regression when Y is much smaller than or much larger than X. A related problem of optimum estimation of two regression coefficients is studied. This paper contains proofs of results first announced at the 1962 Minneapolis Meeting of the Institute of Mathematical Statistics. No prior knowledge of design theory is needed to read this paper.
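The criterion described in this summary can be made concrete with a small numerical sketch. The function names, the quadratic regression, the candidate design, and the grids below are illustrative assumptions, not the paper's calculations.

```python
# Sketch of the minimax-variance design criterion: pick a design on X
# to minimize the maximum, over a set Y, of the variance function of
# the fitted regression, max_{u in Y} f(u)^T M(xi)^{-1} f(u).

import numpy as np

def f(u, degree=2):
    """Regressors for one-dimensional polynomial regression."""
    return np.array([u**j for j in range(degree + 1)])

def info_matrix(points, weights, degree=2):
    """M(xi) = sum_i w_i f(x_i) f(x_i)^T for a design xi on X."""
    return sum(w * np.outer(f(x, degree), f(x, degree))
               for x, w in zip(points, weights))

def max_variance_over_Y(points, weights, Y, degree=2):
    """The quantity an optimum extrapolation design minimizes."""
    Minv = np.linalg.inv(info_matrix(points, weights, degree))
    return max(f(u, degree) @ Minv @ f(u, degree) for u in Y)

# Observations allowed on X = [-1, 1]; regression of interest on
# Y = [1, 1.5], i.e. an extrapolation problem with Y outside X.
design_points = [-1.0, 0.0, 1.0]          # a candidate design on X
design_weights = [1/3, 1/3, 1/3]
Y = np.linspace(1.0, 1.5, 51)
print(max_variance_over_Y(design_points, design_weights, Y))
```

Comparing this value across candidate designs is the minimax comparison the summary describes; the optimum design is the one for which the printed maximum is smallest.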


Journal of the Royal Statistical Society. Series A (General) | 1962

Coding Theorems of Information Theory.

I. J. Good; J. Wolfowitz

which therefore always exists when C(k) exists. Many writers on information theory call by implication C(0-k) the capacity of the channel. They do this by proving a coding theorem and a weak converse (Wolfowitz, 1961, Section 7.6), and then calling the constant involved the capacity. In Wolfowitz (1961), where C is called the capacity, I pointed out that, to prove that C is the capacity, one has to prove a coding theorem and strong converse (Wolfowitz, 1961, Section 5.6, esp. p. 59).
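For reference, the two statements being contrasted are, for a channel of capacity C with \( N(n, \lambda) \) the maximal length of a code of word length n and error probability at most \( \lambda \):

\[ \text{coding theorem:}\quad N(n, \lambda) \geq 2^{n(C - \varepsilon)} \quad \text{for all sufficiently large } n, \]
\[ \text{strong converse:}\quad N(n, \lambda) \leq 2^{n(C + \varepsilon)} \quad \text{for all sufficiently large } n, \]

both holding for every \( \varepsilon > 0 \) and every \( 0 < \lambda < 1 \). The weak converse asserts the second bound, roughly, only for sufficiently small \( \lambda \), which is why proving it does not by itself justify calling the constant involved the capacity.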


Archive | 1961

Heuristic Introduction to the Discrete Memoryless Channel

J. Wolfowitz

The strong converse for the discrete memoryless channel was proved by the author (Wolfowitz, 1957). (The result actually proved is stronger than the strong converse because of the \( O(\sqrt{n}) \) term in the exponent.) Subsequently the author (1958) and Feinstein (1959) independently gave the capacity C of a discrete finite-memory channel, and proved the strong converse of the coding theorem for the special discrete finite-memory channel studied (Wolfowitz, 1957, 1958). In the present note we prove the strong converse for the general discrete finite-memory channel. Thus our result includes that of Wolfowitz (1958) and Feinstein (1959) as a special case. The proof is a slight modification of the proof of Wolfowitz (1958), whose notation and definitions are hereby assumed. For a definition of the capacity C see (Wolfowitz, 1958) or (Feinstein, 1959); for a definition of the general discrete finite-memory channel see (Feinstein, 1959) or (Feinstein, 1958, p. 90). We shall assume without essential loss of generality that both the input and output alphabets consist of two letters, say 0 and 1; extension to the case where each alphabet contains any finite number of symbols is trivial. Any sequence of n zeros or ones will be called an n-sequence. A code (N, λ) is a set
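The excerpt breaks off mid-definition; in the standard notation of Wolfowitz's work, the definition being started presumably reads:

\[ \left\{ (u_1, A_1), \ldots, (u_N, A_N) \right\}, \]

where each \( u_i \) is an n-sequence, the \( A_i \) are disjoint sets of n-sequences, and \( P\left\{ v(u_i) \in A_i \right\} \geq 1 - \lambda \) for \( i = 1, \ldots, N \); that is, N messages, each decoded correctly with probability at least \( 1 - \lambda \).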


Archive | 1961

The Semi-Continuous Memoryless Channel

J. Wolfowitz

This note is a comment on the beautiful coding schemes for a Gaussian channel with feedback due to Schalkwijk and Kailath (1966) and Schalkwijk (1966), particularly on the second of these. Very important and basic work from a different point of view has been done by Elias (1961, 1966). The coding scheme is simply described in Section 2. The new result of this paper is the strong converse in Section 3, whose proof is a simple modification of the proof of Theorem 9.2.2 of Wolfowitz (1961, 1964). Section 4 contains a few comments on the Gaussian channel with feedback and a power constraint (channel GFP) which seem not to have been made elsewhere. Section 5 is a digression on the time-continuous Gaussian channel which contains a very simple proof of the strong converse for that channel. A simple description of the Gaussian channel without feedback and with a power constraint is given in Section 9.2 of Wolfowitz (1961, 1964). We now describe channel GFP, which differs from the former channel only in having feedback. A message is transmitted over channel GFP by n signals. The first signal is a function of the message; suppose the signal is a. Let \( z_1, \ldots, z_n \) be independent, normal chance variables with means zero and variance \( \sigma^2 \). The first signal received is \( y_1 = a + z_1 \). The next signal sent is a function \( \varphi_2(y_1) \) of \( y_1 \) and the message. The i-th signal sent is a function \( \varphi_i(y_1, \ldots, y_{i-1}) \) of the message and the variables exhibited, etc. The i-th signal received is \( y_i = \varphi_i + z_i \), etc. The power constraint is described in Section 2, (2.11) and (2.12), below. The functions \( \varphi_i \) depend upon the particular coding scheme used.
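The transmission protocol just described lends itself to a short simulation sketch. The encoder `phi` below is a hypothetical stand-in, not the Schalkwijk-Kailath scheme, and the power constraint of Section 2 is not enforced.

```python
# Simulation sketch of channel GFP: n signals, the i-th a function of
# the message and all previously received (fed-back) values, each
# corrupted by independent Gaussian noise.

import random

def transmit(message, phi, n, sigma):
    """The i-th sent signal is phi(message, y_1, ..., y_{i-1}); the
    i-th received signal is y_i = phi(...) + z_i with z_i ~ N(0, sigma^2)."""
    received = []
    for _ in range(n):
        signal = phi(message, received)   # may use all feedback so far
        z = random.gauss(0.0, sigma)      # independent noise term z_i
        received.append(signal + z)
    return received

# Hypothetical encoder: send the message first, then keep sending the
# current discrepancy between the message and the last received value.
def phi(message, feedback):
    if not feedback:
        return message
    return message - feedback[-1]

print(transmit(message=1.0, phi=phi, n=5, sigma=0.5))
```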

Collaboration


Dive into J. Wolfowitz's collaborations.

Top Co-Authors

Wassily Hoeffding

University of North Carolina at Chapel Hill
