The Society of Control: steering organisms, from cells to human societies

5 May 2023

Control and orders

After the Second World War, the automatic devices used for fire control, especially anti-aircraft fire, inspired a new “Theory of Control”. Norbert Wiener, one of the inventors of these “servo-mechanisms”, then extended their theory, not only into original mathematics based on differential equations of a physico-mathematical type, but also into a new look at humans and society[1]. No more need for confused “democratic” debates among citizens, always too ideological, but automatic “governance” of humans and of society, along self-controlled paths, proposed by differential equations or imposed by the machines that implement them. Once the political and ethical goals were set … formal servo-mechanisms would propose or impose optimal paths to reach them, that is, mathematical geodesics in predetermined phase spaces (the spaces of pertinent observables and parameters): humans would then adapt, in an optimal way, to this “automatic flight” towards higher goals. And here is the point: the mathematical methods used solve the equations by optimization techniques, thus dictating the best possible paths towards the pre-established ends, in pre-given spaces. In economics, a somewhat more robust mathematics is then added to the elementary and rather banal equations of Léon Walras (the more complex Fokker-Planck equations stepped in), that is, a mathematics that better expresses dynamic adaptability and makes it possible to follow the optimal path, since “there is no alternative”.
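For the record, one standard textbook form of the (one-dimensional) Fokker-Planck equation mentioned here – a general statement, not the specific economic models alluded to – describes the time evolution of a probability density p(x,t) under a drift μ and a diffusion coefficient D:

\[ \frac{\partial p(x,t)}{\partial t} \;=\; -\,\frac{\partial}{\partial x}\big[\mu(x,t)\,p(x,t)\big] \;+\; \frac{\partial^{2}}{\partial x^{2}}\big[D(x,t)\,p(x,t)\big]. \]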

In the following decades, Discrete State Machines (DSMs, in Turing’s terms, 1950) became very important. Programming acquired its scientific autonomy: digital machines were conceived not only to solve equations but to execute programs of all kinds. Many began to think directly in terms of algorithms, no longer “going through the math”. At that time, imperative and functional languages came to light, a great bipartition of styles that still persists, despite the birth of “object-oriented” languages, interactive “eco-rithms”, etc. However, all of them, beginning with functional languages, “boil down” to imperative machine languages. That is, compilers and interpreters are programs that translate, under the control of operating systems, the high-level scripts written by humans in computer languages into lower-level or even machine languages. They “translate” the equations of functional languages, the interactions between objects, etc. into “orders”, in an imperative style. Lambda-calculus, the mathematical origin of functional languages and, thanks to types, also of object-oriented languages, explains this in a mathematically rigorous way: a (functional) equation between terms is equivalent to the execution of reduction orders (the alpha-beta-eta rules) down to a common term, the “reductum” (Church-Rosser theorem)[2].
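As a minimal illustration (hypothetical toy code, not how actual compilers work), the “same” computation can be written in an equational, functional style and in the imperative style of “orders” on a memory cell, into which the former is ultimately translated:

```python
# A sketch only: the same sum expressed as an equation-style functional
# definition and as an imperative sequence of orders updating a memory cell.

from functools import reduce

def total_functional(xs):
    # equational style: total(xs) = fold (+) 0 xs
    return reduce(lambda acc, x: acc + x, xs, 0)

def total_imperative(xs):
    # imperative style: a list of orders on a mutable memory cell
    acc = 0
    for x in xs:
        acc = acc + x
    return acc

assert total_functional([1, 2, 3]) == total_imperative([1, 2, 3]) == 6
```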

A difference must then be observed between these two approaches. The equations in continua of the Theory of Control, often non-linear, allow us to conceive the “nuances”, the small fluctuations, the small “dissent”, let’s say, that can modify the trajectory, even in a very significant way (in the case of non-linearity): the “control” is smooth, somehow resilient. That is, the system adjusts itself, adapts to the perturbation by oscillating; it tolerates the fluctuation, incorporates it, integrates it. In this process, the optimum can be modified; the fluctuation can lead to a geodesic that lies in another “valley”.
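A minimal numerical sketch of this “change of valley” (a toy gradient system, chosen here only for illustration, not a model from control theory): the dynamics dx/dt = x − x³ has two stable states, the “valleys” x = +1 and x = −1, and a tiny fluctuation of the initial condition near the unstable point x = 0 is enough to switch which valley the trajectory ends in.

```python
# Toy illustration: a small fluctuation sends the trajectory into another "valley".
# dx/dt = x - x**3 has two stable equilibria, x = +1 and x = -1.

def settle(x, dt=0.01, steps=10_000):
    """Integrate dx/dt = x - x**3 with the explicit Euler method."""
    for _ in range(steps):
        x += dt * (x - x**3)
    return x

print(settle(+1e-6))   # ends near +1
print(settle(-1e-6))   # a tiny "dissent" in the initial condition: ends near -1
```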

The imperative orders on discrete data types, to which all programming languages reduce, allow no fluctuations, no nuance; they do not even conceive “la secrète noirceur du lait” (the secret blackness of milk), a typical interference of waves in continua. Even Valiant’s “eco-rithms” (interacting algorithms in a “virtual ecosystem”), as long as the entire environment is discrete, yield deterministic and predictable trajectories – typically, when iterated from the same initial conditions, they produce the same path, identically; this is the strength of discrete universes and of the algorithms on them. Of course, one may force in some quantum randomness, the discrete spin-up/spin-down of an electron, or use thermal fluctuations in the computer to produce random digits, but this is “external” to the theory of algorithms, an added conceptual universe – and some try to understand even the former as unpredictability in continua (via “hidden variables” theories).
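A sketch of this point about determinism (hypothetical toy code, not a claim about any particular system): a discrete map replayed from the same initial state yields the identical trajectory, while randomness has to be injected from outside the algorithm (here via the operating system, through Python’s os.urandom) for two runs to differ.

```python
import os

def discrete_map(x, m=2**31 - 1, a=16807):
    """A deterministic map on a finite, discrete state space (a classic LCG step)."""
    return (a * x) % m

def trajectory(x0, n=10):
    xs = [x0]
    for _ in range(n):
        xs.append(discrete_map(xs[-1]))
    return xs

# Same initial condition => identically the same path, run after run.
assert trajectory(42) == trajectory(42)

# Randomness is "external": injected from the environment, not produced by the algorithm.
noisy_start = 42 + os.urandom(1)[0]
print(trajectory(noisy_start)[:3])
```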

In summary, a culture of “control”, yet with crucial differences, dominates these leading approaches to knowledge and action: the mathematical physics of optimal paths in continua, applied to human society from Walras to Wiener and their followers, and programming as “lists of instructions” on discrete data types. Biology has been a major victim of the second control myth, based on the instructive-programming approach: we now know “the instructions for ontogenesis written by God” (Collins, announcing the “decoding” of DNA, 2001). This is presented in an ambitious way in the recent book by Jennifer A. Doudna (Nobel Prize in Chemistry, 2020) and Samuel H. Sternberg, “A Crack in Creation: Gene Editing and the Unthinkable Power to Control Evolution”. The idea is to re-program DNA with the new tools, CRISPR-Cas9: “we are already supplanting the deaf, dumb, and blind system that has shaped genetic material on our planet for eons and replacing it with a conscious, intentional system of human-directed evolution” (Epilogue)[3].

Far-from-equilibrium systems

Control Theory and its physico-mathematical variants, in their classical setting, refer to systems at equilibrium (no flow of energy and matter). The mildest criticism one may raise is that a cell, an organism … an economic system are surely not at equilibrium, but are permanently crossed by flows of energy and matter. The attention of a few has therefore moved towards far-from-equilibrium systems, at least since the pioneering work of Prigogine[4]. In particular, the notion of “self-organization” makes it possible to account for forms of autonomous closure, such as whirls of all sorts and other dynamically self-shaping structures that may recall the closure of cells and their internal dynamics. As far as biology is concerned, the progress is remarkable, since ignoring the flows of energy and matter seems rather “naive”, to say the least.

Yet, this is largely insufficient for analyzing life. First, self-organized far-from-equilibrium physical systems are “spontaneous”. More precisely, they “necessarily” occur under certain boundary and/or initial conditions. A flame, a hurricane … are mathematically expected, up to some probabilities, if the pre-given space of possibilities (the values of the phase space, as the ensemble of pertinent observables and parameters) is set at suitable values. And they keep being created, ex novo, continually. Life is not spontaneous; it is not re-created continually, but is the result of a history. A cell always comes from a cell, and we are far from being able to tackle the singularity at the origin of life from inert matter: invoking that origin as an excuse to analyze life in physical terms is a red herring, and forces us to “pull towards life” one or more existing theories of the inert. The transition from inert matter to living organisms would first require a suitable theory of organisms, and then its unification with (or the construction of theoretical bridges towards) physical theories which are not yet unified among themselves, though their phenomena occur simultaneously in a cell (quantum and classical physics, hydrodynamics …)[5].

So a cell is not spontaneous: it inherits organization, from DNA to complex molecular networks to membranes… and plastically modifies it at all scales through interactions with the context. In short, a cell is not the self-organization of fluxes; rather, it uses fluxes by constraining/canalizing/harnessing them, first of all by means of an inherited organization. And I would dare to say more, though I hesitate here: no phenotype is “necessary” – at most, perhaps, some basic metabolism, a membrane, some physico-chemical trace of a past history, which are all inherited features. But from the dinosaurs to the reader’s nose, phenotypes did not need to be.

Changing phase spaces

As H. Weyl observed, “all fundamental principles in physics are symmetry principles”. This is so for the conservation properties (momentum, energy…) that are shared by all theories in physics and are expressed by symmetries in the equations (Noether’s theorems). The flow equations that make far-from-equilibrium systems mathematically intelligible are also based on conservation properties, thus on symmetries. Analyses of symmetry breaking or symmetry change step in when symmetries are modified, e.g. in critical transitions. Yet the most fundamental symmetry, proper to all physical theories, is the stability of the pre-given phase space. That is, different physical theories may be given in different spaces of pertinent observables and parameters, yet each has a pre-given phase space (up, in some cases, to the number of parameters, which are however of identical “type”).
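As a reminder, here is the simplest textbook instance of Noether’s theorems (one degree of freedom only, not the general statement): if the Lagrangian L(q, q̇) does not depend explicitly on time (time-translation symmetry), then the energy E = q̇ ∂L/∂q̇ − L is conserved along the solutions of the Euler-Lagrange equation, since

\[ \frac{dE}{dt} \;=\; \dot q\left(\frac{d}{dt}\frac{\partial L}{\partial \dot q} - \frac{\partial L}{\partial q}\right) - \frac{\partial L}{\partial t} \;=\; -\,\frac{\partial L}{\partial t} \;=\; 0. \]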

In the mathematical physics of far-from-equilibrium systems as well, one has to fix the phase space in order to write the (flow) equations, where time appears as an irreversible parameter. “Surfaces” are then determined, in all pertinent dimensions, as geodesics (largely unpredictable, of course), that is, as optimal paths. This is a first, major unsuitability for analyzing life. One may obtain some vague hint of the structure of the lungs, of the hand… by such an analysis. Typically, the fractal dimension of the vascular system and of the lungs may be approximated by optimality criteria[6]. This generalizes an analysis whose foundations were set, in the equilibrium case, by Turing’s and Thom’s (different and original) work. While this approach applies beautifully to inert matter produced by organisms (colors on furs, shells…), it is merely suggestive of the role of physical constraints in the formation of functional organs. In short, cells’ reproduction with variation is also constrained by physical forces (the flows of blood and amniotic fluid or of air, for arteries and lungs respectively). That is, in embryogenesis, cells’ reproduction with variation is primary and yields the functional diversity of these organs; physical constraints then apply: if their structure were exactly fractal, we would be dead. The diversity of lungs, and of the alveoli within them, contributes to the resilience of a population, and of the organ in each individual, under changing ecosystemic conditions.
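For readers unfamiliar with the notion, the fractal dimension referred to here can be taken in its standard box-counting form (a general definition, not the specific model of the cited work):

\[ D \;=\; \lim_{\varepsilon \to 0} \frac{\log N(\varepsilon)}{\log (1/\varepsilon)}, \]

where N(ε) is the number of boxes of side ε needed to cover the structure; the optimality criteria mentioned above then suggest the value of D that a branching organ may approximate.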

The point is that “optimality” criteria apply only in pre-given phase spaces, with an internal (partial) order (“this is better than that”). This makes no sense in biological evolution, where the space of possibilities (the ecosystem) is co-constructed with and by the “enabled” dynamics[7]. Changing phase spaces and the role of rare events singularize the time of history[8], that is, phylogenetic but also ontogenetic time, on top of the irreversible time of thermodynamics. It is a time of “heterogenesis”, the production of diversity from diversity, which should be given a proper dimension, as proposed in a recent and original mathematical approach[9]. In heterogenesis, the trajectories are produced by (differential) operators: when they meet, they create a new phase space. A typical heterogenetic phenomenon is the encounter of the evolutionary trajectories of a bacterium and of an archaeon, some one billion years ago, which contributed to the formation of a radically new observable, the eukaryotic cell and its ecosystem (the bacterium yielded the mitochondrion). As has very often happened in the history of physics, some new mathematics is at last being built for biology (see the reference to Sarti et al.).

In summary, organisms are, of course, far-from-equilibrium systems, but this does not help to deduce many key properties of life. That is, the analysis of physical constraints may contribute to framing some properties of organs where flows, typically, are very relevant – much less for … the hand, where only the analysis of its historical “exaptation” à la Gould, in changing ecosystems, makes sense. More generally, we exclude any analysis of phylogenetic trajectories as optimal paths, in view of the absence of a pre-given phase space, which is instead co-constituted by the very heterogenetic dynamics. Selection is not the choice of the best, which applies only to breeders, Darwin’s model, but the exclusion of the incompatible within a co-constituted ecosystem. Similarly, “closure of constraints”[10], with its characteristic times at different levels of organization, goes beyond any property entailed by physical theories. The historicity of life is one such property: even when enriched with some reference to a physico-chemical trace of evolution – DNA, a key historical constraint on molecular dynamics in a cell – mathematical physics cannot even grasp, as H. Weyl observed, the peculiarity of biological time, which deserves a proper analysis (and at least one extra dimension)[11]. The rich history of physics is punctuated by the invention of new theories, often incompatible with the existing ones (hydrodynamics, thermodynamics, quantum and relativistic physics …), when facing new phenomena, looking at them differently, or simply changing the scale of observation. Some physicists forget this history and look at the phenomena of life, rather peculiar ones indeed, through, at best, some “conservative” extensions of what they already know.

Relational/dialectical causal regimes and time

We stressed the historicity of life as breaking the fundamental theoretical symmetry of physical theories: the stability of the phase space. Biological evolution is the result of a branching cascade of contingent relational contexts. That is, the phylogenetic history of an organism is actually the history of the contexts of relations, the ecosystems, it has been part of. Does the history or the relational context define the (scientific) identity of an organism? They jointly do. The necessary integration of the relational and the historical processes has recently been explored in an “organizational perspective”, that is, by providing “a fine-grained characterization of the mutual dependence between an organism’s parts” and its historical context. It is perhaps sound to consider this integration a “dialectical perspective”: the evolutionary, ecosystemic and organismal relations make it possible to understand each individual as historical, as well as in its relation to the ecosystem; their “synthesis” follows as a new historical stage. This subsequent stage is thus a dialectical synthesis of relational interactions in historical time. The heterogenesis of the eukaryotic cell and of its co-constructed ecosystem, mentioned above, would then be the (dialectical) synthesis of two evolutionary paths and their ecosystems: those of some archaea and bacteria. Continuing the quotation: “Biological organisms are understood as natural systems realizing a dual causal regime. On the one hand, they are thermodynamically open systems…. On the other hand, biological organisms control the thermodynamic flow through the action of structures that, at specific time scales, exert constraints on the ongoing processes and transformations. In particular, organisms are constituted by a set of constraints that (1) are generative—they canalize target processes in such a way to maintain the conditions of existence of other constraints and (2) are dependent—their existence relies on the action of other constraints”, including ecosystemic constraints[12].

This forces us to broaden the physical regime of causality. Note first that, in physics, causality may be beautifully framed by conservation principles, thus by symmetries. So a stone falls “for symmetry reasons” (by Einstein’s equivalence principle, gravitation is inertia in Riemannian manifolds, i.e. momentum conservation, a symmetry principle). On top of this, in biology, we need to frame causality in relational terms, in particular through the notion of enablement. A virus “causes” a deadly lung infection if enabled by the patient’s condition. More generally, ecosystemic enablement is relational and dialectical, as it produces a new phase space.

Often, in physics, causality is related to the orientation of time, where thermodynamics and quantum theories of interacting systems set the arrow of time[13]. On top of this linear time, the historical time of evolution is produced by the process itself, as a “dialectical” synthesis of a permanent re-tuning of distributed but correlated biological clocks and rhythms in an ecosystem: “the time of an ecosystem is a tissue of interacting rhythms and frequencies: when deforming these interactions or their tissue, rhythms, frequencies and their tuning change; conversely, a deformation of rhythms or frequencies and of their tuning modifies the tissue, the time of the ecosystem”[14]. More generally, all levels of organization are causally relevant in biology (molecular, cellular, organismal, ecosystemic…) and continually interact, in particular in the form of “bio-resonance”[15]. How this may be consistently related to Hegel’s notion of “dialectics” goes beyond my competence: I strongly recommend reading the thesis defended on February 17, 2023, by R. S. Haukedal, Agency and Organisation: The Dialectics of Nature and Life.
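One standard toy model of interacting rhythms – the Kuramoto model of coupled phase oscillators, used here only as an illustration of the quoted idea, not as the author’s formalism – shows how deforming the interactions (the coupling K) changes the way the oscillators’ rhythms tune to one another:

```python
import math, random

def kuramoto(K, n=30, dt=0.01, steps=3000, seed=0):
    """Coupled phase oscillators: d(theta_i)/dt = w_i + (K/n) * sum_j sin(theta_j - theta_i)."""
    rng = random.Random(seed)
    w = [rng.gauss(1.0, 0.2) for _ in range(n)]           # intrinsic frequencies (rhythms)
    theta = [rng.uniform(0, 2 * math.pi) for _ in range(n)]
    for _ in range(steps):
        coupling = [(K / n) * sum(math.sin(tj - ti) for tj in theta) for ti in theta]
        theta = [(ti + dt * (wi + ci)) % (2 * math.pi)
                 for ti, wi, ci in zip(theta, w, coupling)]
    # order parameter r in [0, 1]: r ~ 0 means dispersed rhythms, r ~ 1 a common tuning
    return abs(sum(complex(math.cos(t), math.sin(t)) for t in theta)) / n

print(kuramoto(K=0.0))   # weak/no interaction: rhythms stay dispersed
print(kuramoto(K=2.0))   # stronger interaction: rhythms tune to one another
```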

In view of the key role of time in dialectical approaches to biological dynamics and their causal structure, let us conclude with a further change with respect to physical theorizing: a duality between mass or energy and time. In quantum physics (Schrödinger’s equation), energy is viewed as a (mathematical) operator, while time is a parameter. Pauli generalized this to other physical theories, where the operatorial role of energy forces time to be a parameter. It is this author’s conjecture that, in biology, time should instead be mathematized as the key operator, the actual constructor of the dialectical dynamics, the result of the tissue of interacting times mentioned above, while energy and mass are “just” parameters (as they are in the allometric equations of biology[16]). Let us hope that the dialectical perspective in biology, once framed in the newborn mathematics of heterogenesis, will help us to set up a robust context of intelligibility for organismal and evolutionary biology and their theoretical integration. This would further develop recent work[17], well beyond the tales of the programs and orders of the genocentric view, while integrating what is pertinent in the thermodynamic approach. In particular, the biological notion of anti-entropy[18], which differs from negentropy in physics, uses its “opposite”, entropy, in order to develop: they “dialectically” feed each other while producing biological novelty, a broad topic under exploration[19].
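For concreteness (standard textbook statements, not the author’s proposed formalism): in Schrödinger’s equation the energy appears as an operator, the Hamiltonian \( \hat H \), acting on the state, while time t is only a parameter; conversely, in the allometric equations of biology, mass enters as a mere parameter of a scaling law:

\[ i\hbar\,\frac{\partial \psi(t)}{\partial t} \;=\; \hat H\,\psi(t), \qquad\qquad B \;=\; a\,M^{\,b}, \]

where B is, for instance, the metabolic rate, M the body mass and b an empirical exponent (close to 3/4 in Kleiber’s law).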


[1] Bennett, S. (1992). A history of control engineering, 1930-1955. IET.

[2] Barendregt, H. (1984). The Lambda Calculus: Its Syntax and Semantics. Amsterdam: North-Holland (see this author’s picture in the book).

[3] For more, see Longo G. (2021a). Programming Evolution: a Crack in Science. A Review of the book by Nobel winner, Jennifer A. Doudna, and Samuel H. Sternberg “A Crack in Creation: Gene Editing and the Unthinkable Power to Control Evolution” 2017, in Organisms. J. Bio Sci., Vol. 5, No. 1.

[4] Nicolis G., Prigogine I. (1977). Self-organization in non-equilibrium systems. New York, Wiley.

[5] Chibbaro, S., Rondoni, L. and Vulpiani, A. (2015). Reductionism, Emergence and Levels of Reality: The Importance of Being Borderline, Springer, Berlin. See also Longo, G.  (2016). A review-essay on reductionism: some reasons for reading the book by S. Chibbaro, L. Rondoni, A. Vulpiani. Urbanomic, London, https://www.urbanomic.com/document/on-the-borderline/ , May 8.

[6] Bailly F., Gaill F., Mosseri R. (1994). “Morphogenèse et croissance biologique : un modèle dynamique simple pour le poumon”, in La biologie théorique à Solignac, Edition Polytechnica, 65-94.

[7] Longo, G, Montévil, M & Kauffman, S. (2012). No entailing laws, but enablement in the evolution of the biosphere. Invited Paper, ACM proceedings of Genetic and evolutionary Computation Conference, GECCO’12, Philadelphia (PA, USA).

[8] Longo, G. (2018). How Future Depends on Past Histories and Rare Events in Systems of Life, Foundations of Science, 23 (3):443-474

[9] Sarti, A., Citti, G., Piotrowski, D. (2022). Differential Heterogenesis. In: Differential Heterogenesis. Lecture Notes in Morphogenesis. Springer, Cham. https://doi.org/10.1007/978-3-030-97797-9_4

[10] Montévil, M. & Mossio, M. (2015). Closure of constraints in biological organisation. Journal of Theoretical Biology, vol. 372: 179-191

[11] Longo G. (2021). Confusing biological rhythms and physical clocks. Today’s ecological relevance of Bergson-Einstein debate on time. In A. Campo, S. Gozzano (eds.), Einstein vs Bergson. An enduring quarrel of time, De Gruyter.

[12] The quotations are from Montévil M and Mossio M (2020) The Identity of Organisms in Scientific Practice: Integrating Historical and Relational Conceptions. Front. Physiol. 11:611. See also Marinucci A. (2023) From deterministic biology to relational biology, to appear.

[13] Connes A., Rovelli C. 1994. Von Neumann algebra automorphisms and time-thermodynamics relation in general covariant quantum theories. Class. Quant. Grav. 11, 12, 2899-2918.

[14] From Longo G. (2021). Confusing biological rhythms and physical clocks. Today’s ecological relevance of Bergson-Einstein debate on time. In A. Campo, S. Gozzano (eds.), Einstein vs Bergson. An enduring quarrel of time, De Gruyter.

[15] Buiatti, M., & Longo, G. (2013). Randomness and multilevel interactions in biology. Theory in Biosciences, 132(3), 139‑158.

[16] Schmidt-Nielsen K. (1984) Scaling: Why Is Animal Size so Important? Cambridge Univ. Press, Cambridge.

[17] Soto A., Longo G., Noble D. (eds.) (2016) From the century of the genome to the century of the organism: New theoretical approaches, a Special issue of Progress in Biophysics and Mol. Biology, Vol. 122, 1, Elsevier.

[18] Bailly, Francis, & Longo, G. (2009). Biological organization and anti-entropy. Journal of Biological Systems, 17(1), 63‑96.

[19] Chollat-Namy M., Longo G. (2023). Entropie, Néguentropie et Anti-entropie : le jeu des tensions pour penser le vivant, forthcoming.
