Abstract
A cellular game is a dynamical system in which cells, placed in some discrete structure, are regarded as playing a game with their immediate neighbors. Individual strategies may be either deterministic or stochastic. Strategy success is measured according to some universal and unchanging criterion. Successful strategies persist and spread; unsuccessful ones disappear. In this thesis, two cellular game models are formally defined and compared to cellular automata. Computer simulations of these models are presented. Conditions providing maximal average cell success, on one- and two-dimensional lattices, are examined. It is shown that these conditions are not necessarily stable, and an example of such instability is analyzed. It is also shown that Nash equilibrium strategies are not necessarily stable. Finally, a particular kind of zero-depth, two-strategy cellular game is discussed; such a game is called a simple cellular game. It is shown that if a simple cellular game is left/right symmetric, and if there are initially only finitely many cells using one strategy, the zone in which this strategy occurs has probability 0 of expanding arbitrarily far in one direction only. With probability 1, it will either expand in both directions or disappear. Computer simulations of such games are presented.
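To make the dynamics concrete, the following is a minimal sketch of a cellular game on a one-dimensional ring, under assumed details not specified above: a hypothetical 2x2 payoff table, two deterministic strategies labeled 0 and 1, and an "imitate the most successful neighbor" update rule as the mechanism by which successful strategies spread.

```python
import random

def step(strategies, payoff):
    """One synchronous update on a ring: each cell scores the payoff
    of playing the game against its two immediate neighbors, then
    adopts the strategy of its highest-scoring neighbor (ties keep
    the cell's own strategy)."""
    n = len(strategies)
    # Total payoff each cell earns from games with its left and right neighbors.
    scores = [
        payoff[strategies[i]][strategies[(i - 1) % n]]
        + payoff[strategies[i]][strategies[(i + 1) % n]]
        for i in range(n)
    ]
    new = []
    for i in range(n):
        best = i
        for j in ((i - 1) % n, (i + 1) % n):
            if scores[j] > scores[best]:
                best = j  # strictly better neighbor wins
        new.append(strategies[best])
    return new

# Hypothetical payoff table: payoff[my_strategy][neighbor_strategy].
payoff = [[3, 0],
          [5, 1]]

random.seed(0)
cells = [random.randint(0, 1) for _ in range(40)]
for _ in range(20):
    cells = step(cells, payoff)
```

Under this update rule a uniform configuration is a fixed point, since all scores are equal and ties preserve each cell's own strategy; the interesting behavior, as in the simulations described above, occurs at the boundaries between zones of differing strategies.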