A cellular automaton is a deterministic rewriting dynamical system that evolves in discrete time and discrete space, the latter usually a grid. It consists of a grid of cells that are updated locally but synchronously across the grid, according to a global clock and a single recursive rule governing the evolution of the state of each cell as a function of the states of its neighboring cells. While the model of cellular automata has the same computational power as other Turing-universal models, and is therefore fundamentally equivalent to them since they can emulate each other, one of the most salient features of cellular automata is the qualitative diversity of their space-time evolutions when exploring different rules and different initial conditions. Their characteristic patterns emerge faster than in other computing models and, as a result of their synchronous nature, can be displayed visually in a compact manner, making them suitable for both quantitative and qualitative study and for comparison with physical and natural phenomena.

## Contents

* 1 Two-dimensional Cellular Automata
* 2 One-dimensional Cellular Automata
* 3 Wolfram's PCE and Computational Irreducibility
* 4 Enumeration, coding, and initial conditions
* 5 Classification of Cellular Automata
* 6 Subclasses of Cellular Automata
* 7 Applications

## Two-dimensional Cellular Automata

Two-dimensional cellular automata were studied in the early 1950s by people such as Stanislaw Ulam, John von Neumann, and Nils Aall Barricelli in the context of fluid dynamics and biological systems. Ulam and von Neumann created a method for calculating liquid motion in the 1950s. The driving concept of the method was to consider a liquid as a group of discrete units and calculate the motion of each based on the behavior of its neighbors. Ulam suggested using a discrete system for creating a reductionist model of self-replication. John von Neumann considered these cellular automata models in his quest for a "universal constructor", a computational model that would be able to describe itself and self-reproduce. Much later, in 1984, Christopher Langton found a self-reproducing cellular automaton, known as Langton's loops, in a much smaller rule space (fewer states and a shorter rule) than that constructed by von Neumann. Barricelli performed many of the earliest numerical experiments with cellular automata as a framework for artificial life and as a precursor of evolutionary algorithms.

A different type of neighborhood from the one considered by von Neumann in two-dimensional cellular automata is named after Edward F. Moore, a pioneer of cellular automata theory. The Moore neighborhood is composed of nine cells: a central cell and the eight cells which surround it. In 1969, the German computer pioneer Konrad Zuse published his book Calculating Space, proposing that the physical laws of the universe are discrete by nature and that the entire universe is the output of a deterministic computation on a single cellular automaton. In the 1970s, John Conway introduced a two-state, two-dimensional cellular automaton with the Moore neighborhood that became known as the 'Game of Life' (GoL), popularized by Martin Gardner in a Scientific American article. One of the most striking properties of GoL is that it not only appears to capture some of the most basic processes of life (birth, reproduction, and death) in an extremely simple model, but also displays the kind of rich behavior characteristic of living systems.
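As a concrete illustration of a two-state rule acting synchronously over the Moore neighborhood, the following is a minimal Python sketch of one GoL update on a wrap-around grid (an illustrative sketch assumed for this presentation, not code from the sources discussed here):

```python
import numpy as np

def game_of_life_step(grid):
    """One synchronous Game of Life update on a toroidal (wrap-around) grid.

    grid: 2D NumPy array of 0s (dead cells) and 1s (live cells).
    """
    # Sum over the eight Moore neighbors, with periodic boundary conditions.
    neighbors = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # Birth: a dead cell with exactly 3 live neighbors becomes alive.
    # Survival: a live cell with 2 or 3 live neighbors stays alive.
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(grid.dtype)

# Example: a glider travelling across a 10x10 torus.
grid = np.zeros((10, 10), dtype=int)
for y, x in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    grid[y, x] = 1
for _ in range(4):
    grid = game_of_life_step(grid)
```

All cells are updated from the same snapshot of the grid, which is what makes the update synchronous.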
GoL was later proven to be Turing universal. WireWorld is another common two-dimensional cellular automaton.

## One-dimensional Cellular Automata

In the 1980s, a comprehensive search of cellular automata models was performed by Stephen Wolfram, who contributed significantly to the expansion and popularization of the field. Wolfram systematically studied a simplified model of cellular automata: the simplest non-trivial case, consisting of one-dimensional, two-state automata that produce two-dimensional space-time evolutions. Those living in one-dimensional space that consider only the states of the two closest neighboring cells were named Elementary Cellular Automata (ECA) by Wolfram. Wolfram performed an exhaustive study of ECA and other rewriting systems in increasingly larger rule spaces, as published in his book A New Kind of Science (2002). Among his discoveries is that even in the simplest CA models some rules are able to generate high-quality statistical randomness, even from the simplest initial conditions. This came as a surprise because, although some simple dynamical systems were already known to be able to produce chaotic behavior such as that of the so-called Rule 30 (where 30 comes from the binary representation of the rule converted to decimal), such chaotic behavior was the result of continuous time or continuous space supporting the random-looking behavior. In cellular automata, however, rules and initial conditions are not only as simple as in other discrete or continuous dynamical systems, but they also run on discrete space and time.

## Wolfram's PCE and Computational Irreducibility

In the early 2000s, Wolfram and Cook showed that another ECA rule, Rule 110, is capable of Turing universality. The rule is so minimalistic yet so powerful that it led Wolfram to postulate a Principle of Computational Equivalence (PCE), establishing that rules capable of non-trivial behavior are equally powerful and capable of Turing universality. The PCE was later explored by H. Zenil and J. Riedel, who showed that, with an appropriate compiler and under rescaling, most non-trivial ECA rules can be reprogrammed to behave like rules displaying very different qualitative behavior, hence providing further evidence in favor of Wolfram's PCE. A similar but more theoretical exploration was undertaken by G. Theyssier, arriving at similar conclusions. The PCE also led Wolfram to propose a principle of computational irreducibility, on the grounds that if the PCE holds and most non-trivial rules are indeed capable of unbounded complexity, then most rules should also be irreducible (under a reasonable assumption based on the Church-Turing thesis). Wolfram's version of this kind of irreducibility is the inability to shortcut any feature of the behavior of a non-trivial system without running the rule itself nearly step by step, a stronger type of irreducibility than the more widely studied undecidability and reachability problems reducible to the Halting problem. H. Zwirn provided a formal framework for Wolfram's irreducibility in terms of a non-conventional approach to computational complexity based on logical depth. A philosophical angle was studied by J. Dubucs and by D. Reisinger et al.; Zenil et al. took an experimental, systemic approach; and K. Sutner has given technical accounts and an investigation into the meaning of the PCE and the framework in which it can be discussed.
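A minimal Python sketch of how a rule such as Rule 30 or the Turing-universal Rule 110 evolves a row of cells into a two-dimensional space-time diagram (an illustrative sketch assumed here; the rule-number decoding follows the enumeration scheme described in the next section):

```python
def eca_evolve(rule_number, initial_row, steps):
    """Evolve an Elementary Cellular Automaton with periodic boundaries.

    rule_number: Wolfram rule number in 0..255.
    initial_row: list of 0s and 1s.
    steps: number of time steps to compute.
    Returns the space-time evolution as a list of rows.
    """
    # Bit k of the rule number is the next state of a cell whose
    # (left, center, right) neighborhood reads k in binary.
    table = [(rule_number >> k) & 1 for k in range(8)]
    rows = [list(initial_row)]
    n = len(initial_row)
    for _ in range(steps):
        prev = rows[-1]
        rows.append([
            table[(prev[(i - 1) % n] << 2) | (prev[i] << 1) | prev[(i + 1) % n]]
            for i in range(n)
        ])
    return rows

# Rule 30 from a single black cell: a random-looking triangular pattern.
width = 63
row = [0] * width
row[width // 2] = 1
for r in eca_evolve(30, row, 31):
    print("".join("#" if c else " " for c in r))
```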
## Enumeration, coding, and initial conditions

In 1983, Wolfram was also the first to explore CAs in a systematic fashion, and he conceived a simple enumeration scheme based on the rule representation. Each ECA rule can be represented by a row of all 3-tuples (the central cell and its two closest neighbors), of which there are 8 cases, followed by a row assigning either a white or a black cell to each case. This latter row, consisting of binary digits given that ECA form a 2-state rule space, can be read as a binary number, e.g. 00000010, which in decimal gives the rule number (rule 2 in this example). All possible mappings give a total of \\(2^8=256\\) rules. Increasing the number of states (colors) or the number of neighbors on each side in the description of each rule makes the rule space grow exponentially, with every larger space including the smaller ones (in general, a one-dimensional CA with \\(k\\) states and range \\(r\\) has \\(k^{k^{2r+1}}\\) possible rules). Space-time evolutions have \\(d+1\\) dimensions, where \\(d\\) is the dimension of the CA and the extra dimension is time. For example, ECA are 1-dimensional and evolve in 2-dimensional space-time; CAs such as the Game of Life are 2-dimensional and produce space-time evolutions in 3 dimensions.

## Classification of Cellular Automata

Starting from random initial conditions, Wolfram observed a wide range of qualitatively different behaviors in the space-time evolutions of ECA. The discovery of the various qualitative behaviors displayed by ECA led Wolfram to propose a behavioral, heuristic classification:

* Class 1: Nearly all initial patterns evolve quickly into a stable, homogeneous state. Any randomness in the initial pattern disappears.
* Class 2: Nearly all initial patterns evolve quickly into stable or oscillating structures. Some of the randomness in the initial pattern may filter out, but some remains. Local changes to the initial pattern tend to remain local.
* Class 3: Nearly all initial patterns evolve in a pseudo-random or chaotic manner. Any stable structures that appear are quickly destroyed by the surrounding noise. Local changes to the initial pattern tend to spread indefinitely.
* Class 4: Nearly all initial patterns evolve into structures that interact in complex and interesting ways, with the formation of local structures that are able to survive for long periods of time.

Rules such as ECA rule 110, like the Game of Life, exhibit Class 4 behavior. Random-looking rules like ECA rule 30 belong to Class 3. Attempts to formalize this classification or come up with alternative ones have led to different approaches and proposals, by A. Wuensche, W. Li and N.H. Packard, J. Baetens, H. Zenil, and others, based on other order parameters such as information-theoretic (statistical entropy), power-spectral, topological, surface, lossless-compression (such as LZW), lattice, Lyapunov-exponent, algorithmic-complexity, mean-field, and morphological-diversity classifications. These can themselves be categorized as rule-based or post-evolution-based. Rule-based approaches focus on an examination of the generating rules; this is the case, for example, of Langton's lambda parameter (the density of non-zero values in the rule table), sketched below. Such approaches, however, are very limited because of undecidability results (Culik et al.).
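A minimal Python sketch of Langton's lambda read directly from a rule table (an illustrative reading of the parameter as the fraction of neighborhood configurations mapped to a non-quiescent state; Langton's original formulation may differ in details):

```python
def langton_lambda(rule_number, states=2, neighborhood=3):
    """Langton's lambda for a 1D CA rule given by its Wolfram rule number:
    the fraction of neighborhood configurations mapped to a non-quiescent
    (non-zero) state in the rule table.
    """
    table_size = states ** neighborhood        # 8 entries for an ECA
    digits, n = [], rule_number
    for _ in range(table_size):                # rule number written in base `states`
        digits.append(n % states)
        n //= states
    return sum(1 for d in digits if d != 0) / table_size

print(langton_lambda(30))    # 0.5   (Class 3, random-looking)
print(langton_lambda(110))   # 0.625 (Class 4, complex)
print(langton_lambda(0))     # 0.0   (Class 1, trivial)
```

The value is obtained without running the rule at all, which is precisely why such static parameters cannot, in general, predict the behavioral class a rule's evolution will actually fall into.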
Post-evolution approaches, such as those based on entropy, Lyapunov exponents, lossless compression, or algorithmic complexity, are observer-dependent by the same undecidability arguments, but they are better placed to adapt, in a Bayesian fashion, by updating a rule's class membership as more of its behavior is observed (as compared with classifications based only on inspecting the generating rules). Post-evolution approaches also allow a qualitative study of the sensitivity of a rule to different initial conditions. For example, ECA rule 22 is bi-stable: one behavior belongs to Class 2, producing a Sierpinski fractal-like pattern, while the other, to which it converges in the limit as a function of the length of the initial input, produces a random-looking output similar to that of ECA rule 30 (because the longer the string, the lower the chance of the symmetry required to produce the fractal-like behavior). This kind of analysis is impossible with rule-based approaches, such as Langton's lambda or state diagrams, that only inspect the static rule. Measures such as entropy or lossless compression with popular compression algorithms are limited to statistical (ir)regularities. Only measures able to cope with undecidable problems, such as the universal ones based on algorithmic complexity and algorithmic probability introduced by Riedel and Zenil, are equipped, in principle, to deal with the range of rich behavior possible in a universal model such as CA. Work by Riedel and Zenil also showed that classifications are not fundamental, because for every (non-trivial) rule there is a compiler with which the rule can be reprogrammed to emulate rules from any other behavioral class. However, they also showed that a new classification can be recovered by looking at how difficult it is to find a rule-compiler pair that emulates a wider range of quantitatively and qualitatively different rules. The invariant is, therefore, the combination of the most likely behavior and the algorithmic probability of finding a short compiler that makes the original rule emulate other rules. When considering sensitivity to initial conditions, one has to define a distance between initial conditions; the most appropriate choice is a Gray code, which guarantees that only one digit changes from one string to the next, hence introducing the smallest possible change in the initial condition when studying its effect on the output evolution of the CA.

## Subclasses of Cellular Automata

### Reversible CA

A cellular automaton is reversible if, for every current configuration of the cellular automaton, there is exactly one past configuration (preimage). If one thinks of a cellular automaton as a function mapping configurations to configurations, reversibility implies that this function is bijective. If a cellular automaton is reversible, its time-reversed behavior can also be described as a cellular automaton; this fact is a consequence of the Curtis–Hedlund–Lyndon theorem, a topological characterization of cellular automata. For cellular automata in which not every configuration has a preimage, the configurations without preimages are called 'Garden of Eden' patterns. With John Myhill, Moore proved the Garden of Eden theorem characterizing the cellular automaton rules that have patterns with no predecessor. For one-dimensional cellular automata, there are known algorithms for deciding whether a rule is reversible or irreversible.
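As an illustration of what non-reversibility looks like in practice, here is a minimal brute-force Python sketch that searches small cyclic configurations for two distinct preimages of the same configuration, which certifies that an ECA rule is not injective and hence not reversible (a heuristic screen assumed for illustration only, not the efficient decision procedures of Amoroso and Patt or the de Bruijn-graph techniques cited in the references):

```python
from itertools import product

def eca_step(rule_number, cells):
    """One ECA step on a cyclic configuration given as a tuple of 0s and 1s."""
    n = len(cells)
    return tuple(
        (rule_number >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    )

def find_collision(rule_number, max_size=8):
    """Search for two distinct cyclic configurations with the same image.

    Finding a collision proves the rule is not reversible; finding none is
    inconclusive, since only configurations up to max_size cells are checked.
    """
    for n in range(1, max_size + 1):
        images = {}
        for cells in product((0, 1), repeat=n):
            img = eca_step(rule_number, cells)
            if img in images:
                return images[img], cells   # two preimages of the same configuration
            images[img] = cells
    return None

print(find_collision(30))    # a collision: rule 30 is not reversible
print(find_collision(204))   # None: rule 204 is the identity rule, which is reversible
```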
For cellular automata of two or more dimensions, however, reversibility is undecidable; that is, there is no algorithm that takes as input an automaton rule and is guaranteed to determine correctly whether the automaton is reversible. The proof by Jarkko Kari is related to the tiling problem with Wang tiles. Reversible cellular automata are often used to simulate physical phenomena such as gas and fluid dynamics, since they obey the laws of thermodynamics. Such cellular automata have rules specially constructed to be reversible. Such systems have been studied by Tommaso Toffoli, Norman Margolus, and others. Several techniques can be used to explicitly construct reversible cellular automata with known inverses.

### Totalistic CA

A special class of cellular automata are the totalistic cellular automata. The state of each cell in a totalistic cellular automaton is represented by a number (usually an integer value drawn from a finite set), and the value of a cell at time \\(t\\) depends only on the sum of the values of the cells in its neighborhood (possibly including the cell itself) at time \\(t-1\\). If the state of the cell at time \\(t\\) depends on both its own state and the total of its neighbors at time \\(t-1\\), then the cellular automaton is properly called outer totalistic. Conway's GoL is an example of an outer totalistic cellular automaton with cell values 0 and 1; outer totalistic cellular automata with the same Moore neighborhood structure as Life are sometimes called life-like cellular automata.

### Continuous spatial automata

Continuous spatial automata have a continuum of locations. The state of a location is a finite number of real numbers. Time is also continuous, and the state evolves according to differential equations. One important example is reaction-diffusion textures, governed by differential equations proposed by Alan Turing in the context of his theory of morphogenesis to explain how chemical reactions could create the stripes on zebras and the spots on leopards. The Belousov–Zhabotinsky reaction is a spatio-temporal chemical oscillator that can be simulated by means of a cellular automaton.

### Non-square lattices/grids

Lattices on which a CA runs do not have to be square. Maurice Margenstern, for example, has introduced CA on the hyperbolic plane.

## Applications

* Traffic
* Game theory (Firing squad synchronization problem, Majority problem)
* Pseudo-randomness
* Computing with colliding particles
* Quantum gravity: Fredkin and Wolfram are strong proponents of a CA-based physics, and in 2016 Gerard 't Hooft published a book-length development of the idea of rebuilding quantum mechanics using cellular automata.

## References

* Amoroso, Serafino; Patt, Yale N. (1972). "Decision Procedures for Surjectivity and Injectivity of Parallel Maps for Tessellation Structures". J. Comput. Syst. Sci. 6 (5): 448–464. doi:10.1016/s0022-0000(72)80013-8.
* Burton H. Voorhees (1996). Computational Analysis of One-Dimensional Cellular Automata. World Scientific. p. 8. ISBN 978-981-02-2221-5.
* Durand-Lose, Jérôme (2001). "Representing reversible cellular automata with reversible block cellular automata". Discrete Mathematics and Theoretical Computer Science AA: 145–154.
* Fredkin, Edward (1990). "Digital mechanics: an informational process based on reversible universal cellular automata". Physica D 45: 254–270.
* Ilachinski, Andrew (2001). Cellular Automata: A Discrete Universe. World Scientific. ISBN 978-981-238-183-5.
* Kari, Jarkko (1990). "Reversibility of 2D cellular automata is undecidable". Physica D 45 (1–3): 379–385. doi:10.1016/0167-2789(90)90195-U.
* Kari, Jarkko (1999). "On the circuit depth of structurally reversible cellular automata". Fundamenta Informaticae 38: 93–107.
* Kari, Jarkko (2005). "Theory of cellular automata: a survey". Theoretical Computer Science 334: 3–33.
* Li, Wentian; Packard, Norman (1990). "The structure of the elementary cellular automata rule space". Complex Systems 4: 281–297.
* Nicolis (1974). "Dissipative Structures, Catastrophes, and Pattern Formation: A Bifurcation Analysis". PNAS 71 (7): 2748–2751.
* Margenstern, Maurice (2007). Cellular Automata in Hyperbolic Spaces – Tome I, Volume 1.
* Riedel, Jürgen; Zenil, Hector (2018). "Rule Primality, Minimal Generating Sets and Turing-Universality in the Causal Decomposition of Elementary Cellular Automata". Journal of Cellular Automata 13: 479–497.
* Riedel, Jürgen; Zenil, Hector (2018). "Cross-boundary Behavioural Reprogrammability Reveals Evidence of Pervasive Universality". International Journal of Unconventional Computing 13 (14–15): 309–357.
* Sutner, Klaus (1991). "De Bruijn Graphs and Linear Cellular Automata". Complex Systems 5: 19–30.
* Smith, Alvy Ray. "Cellular Automata Complexity Trade-Offs".
* Smith, Alvy Ray. "Simple Computation-Universal Cellular Spaces".
* Smith, Alvy Ray. "Simple Nontrivial Self-Reproducing Machines".
* Smith, Alvy Ray. "Introduction to and Survey of Cellular Automata or Polyautomata Theory".
* Gardner, Martin (1970). "Mathematical Games: The fantastic combinations of John Conway's new solitaire game 'life'". Scientific American 223: 120–123. doi:10.1038/scientificamerican1070-120.
* 't Hooft, Gerard (2016). The Cellular Automaton Interpretation of Quantum Mechanics. Springer International Publishing. doi:10.1007/978-3-319-41285-6.
* Reisinger, Drew; Martin, Taylor; Blankenship, Mason; Harrison, Christopher; Squires, Jesse; Beavers, Anthony (2012). "Exploring Wolfram's Notion of Computational Irreducibility with a Two-Dimensional Cellular Automaton". In Zenil, Hector (Ed.) Irreducibility and Computational Equivalence: 10 Years After Wolfram's A New Kind of Science. Springer, Heidelberg.
* Toffoli, Tommaso; Margolus, Norman (1987). Cellular Automata Machines: A New Environment for Modeling. Cambridge, MA: MIT Press. p. 27. ISBN 9780262200608.
* von Neumann, John (1951). "The general and logical theory of automata". In L.A. Jeffress (Ed.) Cerebral Mechanisms in Behavior – The Hixon Symposium. John Wiley & Sons, New York, pp. 1–31.
* Weisstein, E. W. "Books about Cellular Automata." http://www.ericweisstein.com/encyclopedias/books/CellularAutomata.html.
* Wolfram, Stephen (1983). "Statistical Mechanics of Cellular Automata". Rev. Mod. Phys. 55: 601–644.
* Wolfram, Stephen (1985). "Twenty Problems in the Theory of Cellular Automata". Physica Scripta T9: 170–183.
* Wolfram, Stephen (Ed.) (1986). Theory and Application of Cellular Automata. Reading, MA: Addison-Wesley.
* Wolfram, Stephen (1994). Cellular Automata and Complexity: Collected Papers. Reading, MA: Addison-Wesley.
* Wolfram, Stephen (2002). A New Kind of Science. Champaign, IL: Wolfram Media.
* Wuensche, Andrew; Lesser, M. (1992). The Global Dynamics of Cellular Automata: An Atlas of Basin of Attraction Fields of One-Dimensional Cellular Automata. Reading, MA: Addison-Wesley.
"Compression-based investigation of the dynamical properties of cellular automata and other systems". Complex Systems. 19 (1). * Zenil, Hector (2013), “Asymptotic behavior and ratios of complexity in cellular automata,” International Journal of Bifurcation and Chaos, vol. 23, no. 9, Article ID 1350159. * Zenil, Hector (2012) Irreducibility and Computational Equivalence: 10 Years After Wolfram's A New Kind of Science, Springer, Heidelberg. * Sutner, Klaus (2012) Computational Equivalence and Classical Recursion Theory. In Zenil, Hector (Ed.) Irreducibility and Computational Equivalence: 10 Years After Wolfram's A New Kind of Science, Springer, Heidelberg. * Zenil, H., Soler-Toscano, F., Joosten, J. (2012) Empirical encounters with computational irreducibility and unpredictability. Minds and Machinesvol. 22, Number 3, pp. 149-165. * Zuse, Konrad "The Computing Universe", Int. Jour. of Theo. Phy. 21, 589–600, 1982. * Zuse, Konrad (1970) "Calculating Space", MIT Technical Translation AZT-70-164-GEMIT, Massachusetts Institute of Technology (Project MAC), Cambridge, Mass. 02139. Adrian German and Hector Zenil (eds). In Zenil, Hector (2012) A Computable Universe, World Scientific Publishing Company. * Zwirn, Hervé (2013) Computational Irreducibility and Computational Analogy, Complex Systems 24(2). * Zwirn, Hervé and Delahaye, Jean-Paul (2012) Unpredictability and Computational Irreducibility. In Zenil, Hector (Ed.) Irreducibility and Computational Equivalence: 10 Years After Wolfram's A New Kind of Science, Springer, Heidelberg.