1.4. The epistemological mutation of the sciences
The epistemology of science is an exciting study and one of renewed interest today, as the question of the boundary between physics and metaphysics is at the heart of contemporary debate. This stems, in particular, from the increasingly shared sense, when looking closely at the current technoscientific revolution, that we are entering not into the unknown, but into the unknowable. It is hard to believe what is happening. The constantly updated work on the origin of knowledge and the understanding of its deep mechanisms of constitution is increasingly indispensable for understanding and analyzing the contemporary world, not least owing to the emergence of the most bizarre scenarios about the exo-somatization of the capacities of the human brain. “Even if we could hope to understand everything, we would still have to understand what it means to understand” (Max Planck’s 1944 speech).
I would like to put forward here the idea – which will be substantiated in the rest of the book – that the constitution of knowledge at the heart of the mathematical–physical relationship is fundamentally exo-distributive, whereas what is at work in the life sciences is by nature endo-contributive. Indeed, there is an epistemological divide between the sciences of inert matter, based essentially on physics as a relationship to reality, and those of the living world sustained by biology. To defend this point of view, which will be further developed in the following chapters, it is appropriate to return briefly to the foundations and history of knowledge.
The reading of two books and a series of interviews has recently advanced and challenged my thinking on the epistemology of science. These works will be widely cited in the sections that follow: Quantum Physics, a Philosophy (Bitbol 2008) and Mathematics and Natural Sciences (Bailly and Longo 2011). Indeed, these readings have led me to take some of my earlier convictions or intuitions further, and have also overturned some of them.
The first is the role played by the mobility achieved by humans in particular, who, at the top of the pyramid of living things, have been able to abstract themselves from their physical relationship to time and space and so derive the first spatio-temporal formalisms. This genesis of formalisms was at first very much linked to sensory perceptions: toying with infinite continuity, the perception of the successor in time or space as the initiator of the succession of whole numbers, then the emergence of proportions opening onto the fraction, whose combination with the concept of the successor leads to the rational numbers. We can also cite the concept of the straight line as the shortest path from one point to another, the circle and so on. Each time, as Bailly and Longo (2011) tell us, a formal construction principle is put into action and, through its confrontation with the principle of proof, acquires its status of “truth”. Thus, principles of construction and principles of proof are intimately linked. Within mathematics itself, this applies between semantics and syntax (or logicism), and also between mathematics and experimental physics. In addition, other ancient notions, which would only be mathematized much later, appeared, some as a condition of survival for mankind: for example, the notion of causality, of which causes induce which effects. There is no smoke without fire; the sudden fleeing of a flock can indicate the presence of a dangerous predator, and so on. Another notion arose from sensory experience well before its mathematization: the probability that something will occur. Agrarian societies soon learned to judge the likelihood that it would rain; today, we would speak of a probability based on a certain number of factors, but without certainty.
But this process is not only deductive; it is also inductive, when experience and measurement require an evolution of formalism. This leads to a fruitful interaction through co-fertilization, through the “physicalization” of mathematics. Direct access to tangible reality, to experimentation and direct measurements, where the observable and the measurable merge, will exhaust this phase of the history of physical sciences, with Newtonian physics at its peak. We could say that this whole process will conform to the principle of intuition. What is born in the principle of construction does not betray the practical criteria of intuition: Euclidean space and its geometry, an arithmetic that does not question the fundamentals of what is shown by sensory experience, succession, commutativity, parallel lines that never intersect, etc.
However, this scientific development will, almost as a side benefit, generate observation tools that increase the scope of phenomena subject to human scrutiny. The telescope will reveal the invisible of the infinitely large (astrophysics), while microscopy and the tools of electromagnetic experimentation and measurement will reveal the invisible of the infinitely small. Galileo will assert, in a completely counter-intuitive way, that it is not the Sun that revolves around the Earth but the opposite: this is the passage from geocentrism to heliocentrism. The game of co-fertilization between mathematics and physics will then literally explode and give rise to new formalisms that are increasingly counter-intuitive. They will no longer conform to the principles emanating from the relationship to the visible and temporal world perceived directly by the human senses. They will be efficient, however, because they will allow the formal construction to legitimize itself in the principles of proof through measurement and indirect observation. We will see, moreover, that in certain cases these devices can no longer be thought of as neutral and transparent, intervening in no way on the observed reality and merely transcribing, by observation and measurement, a faithful part of it (an assumption that no longer holds in quantum physics, for example).
But this legitimacy through evidence or experience will remain inevitably partial. This is moreover a first elementary principle of incompleteness. We will return to it because, however multiple it may be, experience will remain discrete (discontinuous and countable) in this world of the invisible and will never completely cover the whole continuous field, theoretically covered by formalism itself. Experience will tell us that this principle is verified here and there, while never being able to prove that this is true everywhere and always.
There are many examples of theories leading to counter-intuitive principles of proof. For mathematics, this is the case of the field of complex numbers, which escapes the order and successor relations; the matrix formalism, which furthermore escapes the principle of commutativity of the product; non-commutative geometry; geometric algebra, which axiomatizes the fact that two parallel lines intersect at infinity; Hilbert’s N-dimensional spaces, etc. Their equivalents in physics can also be mentioned: general relativity, which upsets our intuitive reference points in terms of space–time; quantum physics, with non-locality, for example, or Heisenberg’s uncertainty principle (the more precisely we know the position of a particle, the less precisely we can determine its momentum, and vice versa); or string theory, which requires a 10-dimensional geometric formalism.
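To make this loss of intuitive properties concrete, here is a minimal illustrative sketch in Python (using numpy; the matrices chosen are arbitrary examples, not drawn from any particular theory): the product of two matrices depends on the order of the factors, and the complex field admits no order compatible with its arithmetic.

```python
import numpy as np

# Matrix product: the result depends on the order of the factors,
# unlike the product of ordinary real numbers.
A = np.array([[0, 1],
              [0, 0]])
B = np.array([[0, 0],
              [1, 0]])
print(A @ B)                          # [[1 0], [0 0]]
print(B @ A)                          # [[0 0], [0 1]]
print(np.array_equal(A @ B, B @ A))   # False: commutativity is lost

# Complex numbers: no total order compatible with the field operations
# exists, and Python accordingly refuses to compare them.
try:
    1j < 2j
except TypeError as err:
    print("no order on the complex field:", err)
```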
Here again, new principles of uncertainty emerge, linked to the fact that, in reality, it is impossible to measure each component of a complex number or of a matrix separately, but only their product, their square or a linear combination of them in the space of real numbers (which, as their name indicates, are what is actually measured). The same is true for 10-dimensional spaces, whose reduction can only be observed by projection into a three- or four-dimensional space. This introduces an intrinsic distance between the “deterministic truth” of a theory and the uncertainty of the principles of proof. Knowing the product of two variables is not enough to determine the value of each variable: a product equal to 6 may come from 2 × 3 just as well as from 1 × 6. Likewise, the appearance of probabilistic variables makes it impossible to reproduce experiments in which strictly the same initial conditions give strictly the same result. Conversely, it can be observed that two absolutely identical experimental results can come from two completely different situations in formal terms.
Let us take a well-known example from quantum physics. The wave functions mathematized by Schrödinger’s equation can be broken down into a product of two functions: a form function (the trajectory for the wave, the probability of presence for the particle or confined wave) and an energy function governed by the universal law of energy conservation. Formally, this theory is perfectly deterministic: the equation and its solution have no random component. It is the interpretation, and especially the process of quantum measurement, that introduces the uncontrollable, through what is historically called the rule of wave packet reduction (or projection): a random projection onto one of the components. Without measurement, the evolution of a quantum system would be just as deterministic for a physicist as that of a classical system.
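A minimal sketch can make this distinction tangible (Python with numpy; the two-level system and the Hadamard rotation are illustrative choices, not taken from the book): the unitary evolution below yields exactly the same state on every run, while the Born-rule measurement step is the only place where chance enters.

```python
import numpy as np

rng = np.random.default_rng()

# Deterministic part: unitary (Schrödinger-like) evolution of a two-level
# state. Starting from |0>, a Hadamard rotation produces exactly the same
# superposition on every run -- nothing random so far.
psi0 = np.array([1.0, 0.0], dtype=complex)
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ psi0                      # always (|0> + |1>)/sqrt(2)

# Measurement: the Born rule turns amplitudes into probabilities, and the
# state is projected at random onto one of its components -- this is the
# only step where chance enters.
probs = np.abs(psi) ** 2            # [0.5, 0.5]
outcomes = [int(rng.choice([0, 1], p=probs)) for _ in range(10)]

print(psi)        # identical on every run
print(outcomes)   # differs from run to run
```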
Conversely, a formal theory can only be constructed within a formal locality (a physical or temporal space defined by a set of axioms). Another source of incompleteness then emerges, that of the influence of what is not local or of what lies at the limits of the physical or temporal formal locality: hidden variables, for example, side effects, or sensitivity to initial conditions.
All in all, we can see that what underpins the epistemology of this mathematical–physical relationship is exo-distributive. The principle of formal construction proceeds from the fabrication of a formal intelligence external to the object, which even “creates the object” by constituting it from a conceptual point of view. The principle of proof is separate, and real matter is thought by the scientist to be subject to formal laws that determine it, even in time, according to a principle of causality. Even if incompleteness and uncertainties are intrinsic (we will return to this), the formalism is distributed within the real to constitute it as an object that “no longer” escapes human understanding. In effect, formal continuity creates real continuity.
Thus, let us imagine a lottery drawing machine made up of spheres, each containing ten balls: one sphere for the units, one for the tens, one for the hundreds, and so on. Although the result of the draw is perfectly random, each of the processes involved – the impacts of the balls against one another and against the walls, the centrifugal acceleration – is perfectly deterministic, and yet their combination creates a perfectly random process. This possibility was highlighted by Henri Poincaré (1890) for the laws of gravitation in his work on the three-body problem. The temporality of the machine, and its conformity to the laws distributed within it to account for its functioning, determine it entirely, even if incompleteness and indeterminacy prevail through the combinatorial process (impacts, speeds, etc.). We speak of deterministic chaos. Indeed, in theory, if the initial conditions at the time of launching the draw were exactly the same (weight and shape of the balls, atmosphere, temperature, pressure, etc.), the result would be identical each time. But the multiplicity of impacts, the divergent nature of the rebounds on the walls and the sensitivity of the trajectories to the exact nature of each impact mean that a tiny variation in one of the many conditions, whether initial or occurring during the draw, makes the uncertainty grow exponentially with time.
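A toy stand-in can make this sensitivity visible (a sketch only; the logistic map is not a model of the lottery machine, just a standard textbook example of deterministic chaos): two trajectories that start 10⁻¹² apart end up completely unrelated after a few dozen steps.

```python
# The logistic map: a fully deterministic rule, yet a difference of 1e-12
# in the initial condition grows until the two trajectories are unrelated.
def trajectory(x0, steps=60, r=4.0):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = trajectory(0.200000000000)
b = trajectory(0.200000000001)      # initial condition perturbed by 1e-12

for n in (0, 20, 40, 60):
    # the gap grows roughly exponentially with the number of steps
    print(n, abs(a[n] - b[n]))
```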
So what difference does it make if the objects are not inert and the machine is a biological, living organism? The first and most important difference is that for the machine there is no contingent intention other than to produce the lottery number. The designer’s intention is entirely and exclusively translated into a technique with no autonomy with respect to it. There is no intrinsic conservation requirement apart from the principle of energy conservation, no intrinsic “intelligence or decision-making capacity” distributed in the spheres full of balls or in the balls themselves. For the machine, essence precedes existence. The design of the machine and its operation have the sole aim of putting the machine at the service of its designer’s objective: to draw the lottery at random.
For the living world, on the other hand, for each complex constituent element there is a teleology which perpetually reinvents itself, driven by a “contingent purpose” which runs through it like a force field: the forces of disorder, of entropic alteration, against the forces of anti-entropic conservation, an intrinsic intelligence, a distributed decision-making capacity, continuously reinventing its ecorithms (Valiant 2013) in a co-contributory manner. Moreover, the chaos intrinsic to the complexity and variability of the initial and “living” conditions is subject to ongoing anti-chaotic corrections, maintained by adaptive ecorithms.
It is important to pause here for a moment on this probably inappropriate term, purpose. I use it here deliberately to give me the opportunity to clarify a fundamental point to which I will return at the end of the book. Indeed, the word purpose is tricky because it most often implies the notion of intentionality. As with the machine, it implies the existence of a designer (a great architect) with a precise intention; therefore, the word teleology would be more precise. For it is an internal purpose within a living organism, whereas purpose brings to mind the whole universe and a designer (or a general “Purpose-Being”, like Aristotle’s First Mover).
Here, I consider things from a physical point of view and at this stage forbid myself from crossing the line between the physical and the metaphysical. In fact, for me, from a physical point of view, the history of evolution, from the inert to the living world to the human, is a magnificent story of emancipation of the forces of conservation constantly opposed to the forces of alteration. They are present in the inert, of course, engaging counter-entropic forces (antimatter, etc.).
The constitution of organisms (with a membrane defining an inside and an outside), with a first stage of complexity organized around autonomous functioning and a logical capacity to react to an environment trying to alter them, constituted the first threshold of emancipation of the forces of conservation with respect to inert matter. The grouping into eco-communities and then into multicellular organisms constituted a second threshold. Reproduction constituted the third threshold crossed by the forces of conservation in their long process of emancipation. To illustrate this, it has recently been shown, for example, that the aging of bacteria is kept in check by nature through reproduction. Indeed, these bacteria age and die by accumulating “waste” proteins that are generated by their metabolism and not eliminated. Their reproduction allows an ongoing dilution of these “waste” proteins and therefore the survival of the species.
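A schematic sketch can illustrate this dilution argument (it is not a biological model; the accumulation rate and the division interval are arbitrary assumptions): waste builds up at a constant rate, and each division halves the load inherited along a line of descent, so the burden stays bounded instead of growing without limit.

```python
# Waste accumulates at a constant rate; each division splits it between
# the daughters, so the load carried along one line of descent stays
# bounded instead of growing without limit.
def waste_over_time(steps, divide_every=None, rate=1.0):
    waste = 0.0
    history = []
    for t in range(1, steps + 1):
        waste += rate                  # metabolic "waste" proteins pile up
        if divide_every and t % divide_every == 0:
            waste /= 2                 # each daughter inherits half the load
        history.append(waste)
    return history

print(max(waste_over_time(100)))                  # no division: grows to 100.0
print(max(waste_over_time(100, divide_every=5)))  # with division: stays below 10
```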
The mobility achieved by living creatures probably marks a fourth threshold, which, as we have seen, led to the first spatio-temporal formalisms in hominids and probably, in forms different from those associated with consciousness, in a certain number of animals. The conquest of the conscious mind by humans constituted the fifth, allowing them to compensate through intelligence and imagination for their poor physical performance in terms of survival. Transhumanists, for their part, believe that this history will not be written solely within the traditional theories of evolution and that other thresholds of conservation will soon be crossed, such as the eternal safeguarding of a dematerialized human being, placed in memory, or the emergence of an indefinitely repairable post-human.
To use the word purpose therefore means crossing a very delicate line between the physical and the metaphysical, a barrier that Max Planck crossed, somewhat riskily, at the end of his life:
As a man who has devoted his whole life to the most lucid science and the study of matter, I can tell you this in conclusion to my research on atoms: there is no matter as such. All matter originates and exists only by virtue of a force which causes the particles of an atom to vibrate and which holds this entire atomic system together. We must assume behind this force the existence of a conscious and intelligent mind. This mind is the matrix of all matter3.
It is no coincidence that this statement by Planck, one of history’s great physicists, made in Florence in 1944, three years before his death, has been widely reported on the Internet. It has given rise to heated debates on social networks, and attempts to lay claim to it are legion. Many who do not have the scientific background to understand what Max Planck was talking about embark on vague theories about “God, or no God”. Believers, sometimes with an esotericism of breathtaking stupidity, use this phrase to say that “non-believers” are idiots, or they invoke it so as not to be taken for idiots themselves if they believe in God or in “the forces of the spirit”, the famous phrase from François Mitterrand’s farewell address. We know to what extent this attempt to cross the line between the physical and the metaphysical is a high-risk exercise. It digs up the hatchet between science and religion, even if metaphysics is a broader concept. It opens the way for the conscious self to take possession of spirituality, according to Erich Neumann (1954).
From a temporal point of view, another fundamental difference can be deduced from the above. Thus, the temporality of a machine is thought of as a succession of times “following” a “current” time, whereas for a living organism, the current time is entirely stretched towards the next time in the name of a contingent teleology which is conservation. The living world is endo-contributive, its contribution being entirely directed towards this contingent teleology.
From this point of view, there is a second, even more interesting, difference. A dog continuously exhibits the phenomenological reality of a dog in its environment. It is continuous as a dog, and it is the mathematization of its biological localities that will produce a discontinuity. For example, the equation of its morphogenesis and the mathematization of its cardiac functioning will segment it while its complex reality, including its relationship with the human living with it, will be fundamentally unified.
We have seen that in the mathematical–physical epistemological space, formal continuity creates real continuity. For the biological living being it is the opposite because the living object is substantiated by a continuous self-referencing of its behaviors.
The question that will be key to the development of non-programmable artificial intelligence – that is, a “machine” in which the existence of the machine (experiential learning, as in deep learning machines) would take precedence over its essence – is this: is it capable of breaking the hermetic barrier between the inevitably exo-distributive machine and the inevitably endo-contributive living organism? We will return to this by asking: can an endo-contributive machine be considered alive? This obviously brings the question back to that of the possibility of machine consciousness, which immediately opens the debate on transhumanism. Are we on the eve of a new dawn of emancipation of the forces of conservation?
1. Homeostasis is the stabilizing ability of living organisms to perform dynamic adjustment functions according to the environment or the stresses they are subjected to (temperature, pressure, chemical composition, etc.).
2. By coding the possible proofs of a formal theory as arithmetic numbers, Gödel shows that there are statements that we would intuitively consider true which are neither provable nor refutable in that formal theory.
3. Max Planck’s famous speech given in Florence, Italy, in 1944, entitled Das Wesen der Materie (“The Nature of Matter”). Retrieved from Archiv zur Geschichte der Max-Planck-Gesellschaft, Abt. Va, Rep. 11 Planck, Nr. 1797.