A cellular automaton is a deterministic rewriting dynamical system that evolves in discrete time and discrete space, the latter usually a grid. It consists of a grid of cells that are updated locally but synchronously across the grid, according to a global time scale and a global recursive rule governing the evolution of the state of each cell as a function of the states of its neighboring cells. While the model of cellular automata has the same computational power as other Turing-universal models, and is therefore fundamentally equivalent to them since they can emulate each other, one of the most salient features of cellular automata is the qualitative diversity of their space-time evolutions when exploring different rules and different initial conditions. Their characteristic patterns appear faster than in other computing models and, as a result of their synchronous nature, can be displayed visually in a compact manner, making them suitable to be studied both quantitatively and qualitatively, and also to be compared to physical and natural phenomena.

Two-dimensional Cellular Automata

Two-dimensional cellular automata were studied in the early 1950s by researchers such as Stanislaw Ulam, John von Neumann, and Nils Aall Barricelli in the context of fluid dynamics and biological systems. Ulam and von Neumann created a method for calculating liquid motion in the late 1950s. The driving concept of the method was to consider a liquid as a group of discrete units and to calculate the motion of each based on its neighbors' behavior. Ulam suggested using a discrete system for creating a reductionist model of self-replication. John von Neumann considered these cellular automata models in his quest to find a "universal constructor", a computational model that would be able to describe itself and self-reproduce. Much later, in 1984, Christopher Langton found a self-reproducing cellular automaton, known as Langton's loops, living in a much smaller rule space (fewer states and a shorter rule) than the one constructed by von Neumann. Barricelli performed many of the earliest numerical experiments with cellular automata as a framework for artificial life, a precursor of evolutionary algorithms. A different type of neighborhood from the one considered by von Neumann in two-dimensional cellular automata is named after Edward F. Moore, a pioneer of cellular automata theory. The Moore neighborhood is composed of nine cells: a central cell and the eight cells that surround it.

In 1969, German computer pioneer Konrad Zuse published his book Calculating Space, proposing that the physical laws of the universe are discrete by nature and that the entire universe is the output of a deterministic computation on a single cellular automaton. In 1970, John Conway introduced a two-state, two-dimensional cellular automaton with the Moore neighborhood that became known as the 'Game of Life' (GoL), popularized by Martin Gardner in a Scientific American column. One of the most striking properties of GoL is not only that it appears to capture some of the most basic processes of life (birth, reproduction, and death) in an extremely simple model, but that the model displays the kind of rich behavior characteristic of living systems. GoL was later proven to be Turing universal. WireWorld is another common two-dimensional cellular automaton.

One-dimensional Cellular Automata

In the 1980s, a comprehensive search of cellular automata models was performed by Stephen Wolfram, who contributed significantly to the expansion and popularization of the field. Wolfram systematically studied and introduced a simplified model of cellular automata, the simplest non-trivial case: one-dimensional, two-state automata that produce two-dimensional space-time evolutions. Those living in one-dimensional space and considering only the states of the two closest cell neighbors were named Elementary Cellular Automata (ECA) by Wolfram. Wolfram performed an exhaustive study of ECA and other rewriting systems in increasingly larger rule spaces, published in his book A New Kind of Science (2002). Among his discoveries is that even the simplest CA models contain rules able to generate high-quality statistical randomness, even from the simplest initial conditions, such as the so-called Rule 30 (where 30 comes from the rule's binary representation converted to decimal). This came as a surprise because, although some simple dynamical systems were known to produce chaotic behavior, such random-looking behavior was thought to require continuous time or continuous space to support it. In cellular automata, by contrast, not only are the rules and initial conditions simple, as in other discrete or continuous dynamical systems, but they also run on discrete space and time.
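As a minimal illustrative sketch (not code from any of the works cited here), the following Python program evolves Rule 30 from a single black cell on a cyclic row and extracts the center column, the sequence whose statistical randomness Wolfram observed; all names are illustrative:

```python
# Minimal sketch: evolve ECA Rule 30 from a single black cell.
RULE = 30
# Bit k of the rule number gives the next state for the neighborhood
# whose three cells, read as a binary number, equal k.
TABLE = [(RULE >> i) & 1 for i in range(8)]

def step(row):
    """Synchronously update a row with cyclic boundary conditions."""
    n = len(row)
    return [TABLE[(row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n]]
            for i in range(n)]

width, steps = 101, 50
row = [0] * width
row[width // 2] = 1               # single black cell as initial condition
center_column = []
for _ in range(steps):
    center_column.append(row[width // 2])
    row = step(row)
print(''.join(map(str, center_column)))   # random-looking bit sequence
```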

Wolfram's PCE and Computational Irreducibility

In the early 2000s, Wolfram and Cook showed that another ECA rule, Rule 110, was capable of Turing universality. The rule is so minimalistic yet so powerful that it led Wolfram to postulate a Principle of Computational Equivalence (PCE), establishing that rules capable of non-trivial behavior are equally powerful and capable of Turing universality. The PCE was later explored by H. Zenil and J. Riedel, who showed that most non-trivial ECA rules can be reprogrammed, with an appropriate compiler under rescaling, to behave as other rules displaying very different qualitative behavior, hence providing further evidence in favor of Wolfram's PCE. A similar but theoretical exploration was undertaken by G. Theyssier, arriving at similar conclusions. The PCE also led Wolfram to propose a principle of computational irreducibility, on the basis that if the PCE held and most non-trivial rules were indeed capable of unbounded complexity, then most rules should also be irreducible (under a reasonable assumption based on the Church-Turing thesis). Wolfram proposed this type of irreducibility as the impossibility of finding shortcuts to any feature of the behavior of a non-trivial system without running the rule itself nearly step by step, thus extending, to a stronger type of irreducibility, the more widely studied undecidability and unreachability problems reducible to the Halting problem. H. Zwirn provided a formal framework for Wolfram's irreducibility in terms of a non-conventional approach to computational complexity based on logical depth. A philosophical angle was also studied by J. Dubucs and by D. Reisinger et al. Zenil et al. took an experimental systemic approach, and K. Sutner has also given technical accounts and an investigation into the meaning of the PCE and a framework for its discussion.

Enumeration, coding, and initial conditions

In 1983, Wolfram was also the first to explore CAs in a systematic fashion, and he conceived a simple enumeration scheme based on the rule representation. Each ECA rule can be represented by a row of all 3-tuples of cell states (the central cell and its two closest neighbors), of which there are 8 cases, followed by a row assigning to each case either a white or a black cell. This latter row, consisting of binary digits given that ECA form a 2-state rule space, can be read as a binary number, e.g. 00000010, which in decimal is rule 2. All possible mappings give a total number of \(2^8=256\) rules. Increasing the number of states (colors) or the number of neighbors on each side in the description of each rule makes the rule-space size grow exponentially (for \(k\) states and \(r\) neighbors on each side there are \(k^{k^{2r+1}}\) rules), with every larger rule space including the smaller ones. Space-time evolutions have \(d+1\) dimensions, where \(d\) is the dimension of the CA and the extra dimension is time. For example, ECA, which are 1-dimensional, produce space-time evolutions in 2 dimensions. CAs such as the Game of Life, which are 2-dimensional, produce space-time evolutions in 3 dimensions.
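The enumeration scheme can be made concrete with a short Python sketch (function and variable names are illustrative): the 8-bit binary expansion of the rule number is exactly the output row of the table, and counting all possible output rows gives the size of the rule space:

```python
# Sketch of Wolfram's enumeration scheme for ECA rules.
def rule_table(rule_number):
    """Map each of the 8 neighborhood 3-tuples to the next cell state."""
    bits = f'{rule_number:08b}'            # e.g. rule 30 -> '00011110'
    # Neighborhoods listed from (1,1,1) down to (0,0,0), matching bit order.
    neighborhoods = [(a, b, c) for a in (1, 0) for b in (1, 0) for c in (1, 0)]
    return dict(zip(neighborhoods, map(int, bits)))

print(rule_table(30))    # {(1,1,1): 0, (1,1,0): 0, ..., (0,0,0): 0}
print(2 ** (2 ** 3))     # 256 rules: one output bit per 8 neighborhoods
```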

Classification of Cellular Automata

Starting from random initial conditions, Wolfram observed a wide range of qualitatively different behaviors in the space-time evolutions of ECA. The discovery of these various qualitative behaviors led Wolfram to propose a heuristic behavioral classification: Class 1 rules evolve to a homogeneous state; Class 2 rules evolve to periodic or otherwise simple stable structures; Class 3 rules produce random-looking (chaotic) behavior; and Class 4 rules produce persistent localized structures that interact in complex ways.

Rules such as ECA rule 110, like the Game of Life, exhibit Class 4 behavior. Rules like ECA rule 30 that are random-looking belong to Class 3. Attempts to formalize the classification, or to come up with alternative ones, have led to different approaches and proposals, including those of A. Wuensche, W. Li and N.H. Packard, J. Baetens, H. Zenil, and others, based on other order parameters such as information-theoretic (statistical entropy), power-spectral, topological, surface, lossless-compression (such as LZW), lattice, Lyapunov-exponent, algorithmic-complexity, mean-field, and morphological-diversity classifications. These can themselves be categorized into rule-based and post-evolution-based approaches. Rule-based approaches focus on an examination of the generating rules; this is the case, for example, of Langton's lambda parameter (the density of non-zero values in the rule). These approaches, however, are very limited because of undecidability results (Culik et al.). Post-evolution approaches are observer-dependent by the same undecidability arguments, but they are better placed to adopt a Bayesian approach to behavior that updates a rule's class membership (as compared to classifications based only on inspecting the generating rules); examples are those based on entropy, Lyapunov exponents, lossless compression, or algorithmic complexity.

Post-evolution approaches also allow a qualitative study of the sensitivity of a rule to different initial conditions. For example, ECA rule 22 is bi-stable: one behavior belongs to Class 2, as it produces a Sierpiński fractal-like pattern, while the other, to which it converges in the limit as a function of initial input length, produces a random-looking output similar to that of ECA rule 30 (because the longer the string, the smaller the chance of the symmetry required to produce the fractal-like behavior). This kind of analysis is impossible with rule-based approaches such as Langton's lambda or state diagrams that only inspect the static rule. Measures such as entropy, or lossless compression with popular compression algorithms, are limited to statistical (i)regularities. Only measures able to cope with undecidable problems, such as the universal measures of algorithmic complexity and algorithmic probability applied by Riedel and Zenil, are equipped, in principle, to deal with the range of rich possible behavior displayed by a universal model such as CA.

Work by Riedel and Zenil also showed that classifications are not fundamental, because for every (non-trivial) rule there is a compiler with which the rule can be reprogrammed to emulate rules from any other behavioral class. However, they also showed that a new classification can be recovered by looking at how difficult it is to find a rule-compiler tuple that emulates a wider range of rules, both quantitatively and qualitatively different. The invariant is, therefore, the combination of the most likely behavior and the algorithmic probability of finding a short compiler to make the original rule emulate other rules.
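As a rough illustration of post-evolution measures (a sketch only, not the specific estimators used in the works cited above), the following Python code compares the Shannon entropy and the zlib-compressed length of the space-time evolutions of a few ECA rules; random-looking Class 3 rules such as rule 30 score high on both, while simple rules score low:

```python
# Crude post-evolution discriminators: Shannon entropy of the evolution
# and its zlib-compressed length, computed for a few ECA rules.
import zlib
from collections import Counter
from math import log2

def entropy(cells):
    counts = Counter(cells)
    n = len(cells)
    return -sum(c / n * log2(c / n) for c in counts.values())

def evolve(rule, width=101, steps=100):
    table = [(rule >> i) & 1 for i in range(8)]
    row = [0] * width
    row[width // 2] = 1
    history = []
    for _ in range(steps):
        history.extend(row)
        row = [table[(row[(i - 1) % width] << 2) | (row[i] << 1)
                     | row[(i + 1) % width]] for i in range(width)]
    return history

for rule in (30, 90, 110, 250):
    h = evolve(rule)
    print(rule, round(entropy(h), 3), len(zlib.compress(bytes(h))))
```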

When considering sensitivity to initial conditions, one has to define a distance between initial conditions. The most appropriate choice is to enumerate initial conditions in Gray-code order, which guarantees that only one digit changes from one string to the next, hence introducing the smallest possible change in the initial condition when studying its effect on the output evolution of the CA.
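A minimal sketch of this idea in Python, using the standard binary-reflected Gray code (the helper name is illustrative):

```python
# Enumerate n-bit initial conditions in reflected Gray-code order,
# so consecutive initial conditions differ in exactly one cell.
def gray_codes(n):
    for i in range(2 ** n):
        g = i ^ (i >> 1)                 # binary-reflected Gray code
        yield [int(b) for b in f'{g:0{n}b}']

for cond in gray_codes(3):
    print(cond)
# [0,0,0], [0,0,1], [0,1,1], [0,1,0], [1,1,0], [1,1,1], [1,0,1], [1,0,0]
```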

Subclasses of Cellular Automata

Reversible CA

A cellular automaton is reversible if, for every current configuration of the cellular automaton, there is exactly one past configuration (preimage). If one thinks of a cellular automaton as a function mapping configurations to configurations, reversibility implies that this function is bijective. If a cellular automaton is reversible, its time-reversed behavior can also be described as a cellular automaton; this fact is a consequence of the Curtis–Hedlund–Lyndon theorem, a topological characterization of cellular automata. For cellular automata in which not every configuration has a preimage, the configurations without preimages are called 'Garden of Eden' patterns. With John Myhill, Moore proved the Garden of Eden theorem characterizing the cellular automaton rules that have patterns with no predecessor.
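For intuition, preimages and Garden of Eden configurations can be counted by brute force on a small cyclic lattice; this finite check is only illustrative and does not by itself establish reversibility on the infinite lattice. A Python sketch, with illustrative names:

```python
# Brute force: count preimages of every length-n cyclic configuration
# under an ECA rule. Exactly one preimage each would mean the rule is a
# bijection on this finite torus; zero preimages marks a 'Garden of Eden'
# configuration on the torus.
from itertools import product
from collections import Counter

def step(row, table):
    n = len(row)
    return tuple(table[(row[(i - 1) % n] << 2) | (row[i] << 1)
                       | row[(i + 1) % n]] for i in range(n))

def preimage_counts(rule, n=8):
    table = [(rule >> i) & 1 for i in range(8)]
    return Counter(step(c, table) for c in product((0, 1), repeat=n))

counts = preimage_counts(30)
print(max(counts.values()))      # >1: rule 30 is not injective here
print(2 ** 8 - len(counts))      # number of Garden of Eden configurations
```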

For one-dimensional cellular automata, there are known algorithms for deciding whether a rule is reversible or irreversible. However, for cellular automata of two or more dimensions, reversibility is undecidable; that is, there is no algorithm that takes as input an automaton rule and is guaranteed to determine correctly whether the automaton is reversible. The proof by Jarkko Kari is related to the tiling problem for Wang tiles.

Reversible cellular automata are often used to simulate such physical phenomena as gas and fluid dynamics since they obey the laws of thermodynamics. Such cellular automata have rules specially constructed to be reversible. Such systems have been studied by Tommaso Toffoli, Norman Margolus and others. Several techniques can be used to explicitly construct reversible cellular automata with known inverses.
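One standard example of such a technique is the second-order construction, which makes any rule reversible by XOR-ing its output with the configuration two time steps back; the inverse simply runs the same update with the time order swapped, so no information is ever lost. A minimal Python sketch (names are illustrative):

```python
# Second-order construction: make any ECA reversible by XOR-ing its
# output with the configuration two time steps back.
def second_order_step(prev, curr, table):
    n = len(curr)
    nxt = [table[(curr[(i - 1) % n] << 2) | (curr[i] << 1)
                 | curr[(i + 1) % n]] ^ prev[i] for i in range(n)]
    return curr, nxt    # the pair (curr, nxt) is the new (prev, curr)

# Inverse: prev = f(curr) XOR nxt, i.e. the same update run backwards.
table = [(30 >> i) & 1 for i in range(8)]   # e.g. second-order Rule 30
prev, curr = [0] * 16, [0] * 16
curr[8] = 1
for _ in range(10):
    prev, curr = second_order_step(prev, curr, table)
```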

Totalistic

A special class of cellular automata is that of the totalistic cellular automata. The state of each cell in a totalistic cellular automaton is represented by a number (usually an integer value drawn from a finite set), and the value of a cell at time \(t\) depends only on the sum of the values of the cells in its neighborhood (possibly including the cell itself) at time \(t-1\). If the state of a cell at time \(t\) depends on both its own state and the total of its neighbors at time \(t-1\), then the cellular automaton is properly called outer totalistic. Conway's GoL is an example of an outer totalistic cellular automaton with cell values 0 and 1; outer totalistic cellular automata with the same Moore neighborhood structure as Life are sometimes called life-like cellular automata.
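Since GoL is outer totalistic, its update can be written entirely in terms of a cell's own state and the sum of its eight Moore neighbors, as in this minimal Python sketch (a toroidal grid is assumed for simplicity):

```python
# Conway's Game of Life as an outer totalistic rule: the next state
# depends only on the cell's own state and the SUM of its Moore neighbors.
def life_step(grid):
    rows, cols = len(grid), len(grid[0])
    def nsum(r, c):
        return sum(grid[(r + dr) % rows][(c + dc) % cols]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0))
    return [[1 if (grid[r][c] == 1 and nsum(r, c) in (2, 3))
             or (grid[r][c] == 0 and nsum(r, c) == 3) else 0
             for c in range(cols)] for r in range(rows)]

# A 'glider' on a small torus:
grid = [[0] * 8 for _ in range(8)]
for r, c in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    grid[r][c] = 1
for _ in range(4):
    grid = life_step(grid)   # after 4 steps the glider has moved diagonally
print(sum(map(sum, grid)))   # still 5 live cells: the glider persists
```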

Continuous spatial automata

Continuous spatial automata have a continuum of locations, the state of a location being a finite vector of real numbers. Time is also continuous, and the state evolves according to differential equations. One important example is reaction-diffusion systems, governed by differential equations proposed by Alan Turing in the context of his work on morphogenesis to explain how chemical reactions could create the stripes on zebras and the spots on leopards. The Belousov–Zhabotinsky reaction is a spatio-temporal chemical oscillator that can be simulated by means of a cellular automaton.
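Such systems are typically explored numerically by discretizing space and time. The following Python sketch performs explicit Euler steps of the Gray-Scott reaction-diffusion model (a standard two-chemical example, not Turing's original equations) on a one-dimensional ring; NumPy is assumed, and the parameter values are illustrative rather than tuned to any particular pattern:

```python
# One explicit Euler step of the Gray-Scott reaction-diffusion model
# for two chemical concentrations u and v on a 1-D ring.
import numpy as np

def gray_scott_step(u, v, Du=0.16, Dv=0.08, F=0.035, k=0.065, dt=1.0):
    lap = lambda a: np.roll(a, 1) + np.roll(a, -1) - 2 * a  # discrete Laplacian
    uvv = u * v * v
    u_new = u + dt * (Du * lap(u) - uvv + F * (1 - u))
    v_new = v + dt * (Dv * lap(v) + uvv - (F + k) * v)
    return u_new, v_new

u = np.ones(200)
v = np.zeros(200)
v[90:110] = 0.5                    # local perturbation seeds the pattern
for _ in range(1000):
    u, v = gray_scott_step(u, v)
```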

Non-square lattice/grid

Lattices on which a CA runs do not have to be square. Maurice Margenstern, for example, has introduced CA on the hyperbolic plane.

Applications

References


irreducibility and unpredictability. Minds and Machines, vol. 22, no. 3, pp. 149-165.