Limits of Science? - John E. Beerbower

Endnotes


1 Perhaps one could fashion an “understanding” of the natural world based upon the existence of patterns and regularities, without introducing causality. At a minimum, it seems clear that science could not have arisen if the Universe had not been characterized by regularities and constants. (Actually, forget science. Life as we know it could not have existed without such regularities.) And, perhaps mere recognition of the existence of regularities or patterns would have sufficed for purposes of survival. But, I think that the normal conception of understanding implies causal explanations—e.g., at least, why there are regularities that make certain events predictable.

2 Arkes, citing writings by Professors Lewis White Beck and Jeffrie Murphy, describes this argument as asserting “the most decisive part of Kant’s answer to Hume: viz., that Hume’s own argument becomes intelligible only on the basis of Kant’s understanding.” Id. Hume is discussed in some detail below.

3 Some current research has indicated that the human mind is “hard-wired” with modules that are suitable for quite specific purposes. We may be born with “domains of intuitive knowledge” that govern particular areas of cognition. Both the facility for and the resulting structure of language seem to be of such a nature. Certain models that arise in psychology, biology and physics may also be intuitive. Adam Frank, About Time (2012), pp.7–8. It has even been suggested that the evolution of the processes for moving information among the modules gave rise to analogy and metaphor, thereby boosting human creativity. Id. We return to some of these ideas in more detail in a later chapter.

4 A different but related point is that the senses necessarily restrict and the mind necessarily simplifies and orders the information received—if such restrictions and simplifications did not exist, a person would be reduced to helpless confusion by the unmanageable, overwhelming volume of data received. As Professor Barrow wrote, “The mind is the most effective algorithmic compressor of information that we have so far encountered in Nature. It reduces complex sequences of sense data to simple abbreviated forms which permit the existence of thought and memory. The natural limits that nature imposes upon the sensitivity of our eyes and ears prevent us from being overloaded with information about the world.” Theories of Everything, p.11.

5 This analogy of a theory being like a pair of spectacles suggests another point of relevance to the discussion here. As with a pair of spectacles, it is difficult to look at a theory (the spectacles) and through it at the same time. See Stephen Toulmin, Foresight and Understanding (1961), p.101. So we face a challenge in our efforts to understand science because the theories that constitute an integral part of that science shape not just our understanding of the science but also the vocabulary with which we discuss it. See also, Deutsch, The Beginning of Infinity, p.199.

6 A similar line of thinking gave rise to a substantial debate among philosophers of science, under the heading of “incommensurability.” See, e.g., Thomas Kuhn, The Structure of Scientific Revolutions (1962), pp.147–50, 170–73; Paul Feyerabend, “Explanation, Reduction and Empiricism,” in H. Feigl and G. Maxwell (eds.), Scientific Explanation, Space and Time (1962) at pp.28–97. Kuhn argued that the new paradigms, following each revolution, were not directly comparable to the prior theories because the concepts, as well as the vocabulary, were different. Feyerabend asserted that the conceptual incompatibility of successive theories created a conservative bias favoring the existing theories, partly because the development of a new theory generally required a new way of thinking and of seeing. See Stanford Encyclopedia of Philosophy, “The Incommensurability of Scientific Theories” (revised March 5, 2013).

7 “Scientific” theories generally fall into one of four major and ostensibly different patterns of explanation: (i) the deductive model, in which the event to be explained is a logically (mathematically) necessary consequence of the explanatory premises; (ii) probabilistic relationships, which often resemble deductive models but include statistical premises; (iii) teleological or functional explanations, which identify the functions or roles played by various agents in events; and (iv) “genetic” explanations, in which phenomena emerge from the events preceding them. See Ernest Nagel, The Structure of Science (1961), pp.20–26.

8 Rudolf Carnap described the deductive construction of the logician as “a skeleton of a language rather than a language proper, i.e., one capable of describing facts. It becomes a factual language only if supplemented by descriptive signs.” “Foundations of Logic and Mathematics,” International Encyclopedia of Unified Science (1939), No. 3, p.32. He also quoted Einstein as saying, “So far as the theorems of mathematics are about reality they are not certain; and so far as they are certain they are not about reality.” Id., p.56.

9 The term “gene” was not used by Mendel. The concept was set out by Dutch botanist Hugo de Vries in 1889 (an inheritable unit that caused a specific trait, which unit he called a pangene). The word “gene” was introduced in 1909 by Danish botanist Wilhelm Johannsen. See “Gene” (revised 2009), The Stanford Encyclopedia of Philosophy.

10 I use “inductive inference” here in the common, lay sense of drawing at least tentative inferences from the observation of apparent patterns or regularities in events. The philosopher’s “problem of induction” is discussed at some length in the next section of this chapter.

11 The editor added a footnote there explaining that Hume used the word “science” in the pre-nineteenth century sense of knowledge or learning generally.

12 Hume used “knowledge” to refer to pure mathematics or pure deduction, things known from “first principles.” Understanding of the physical world came from (and with) probabilities and was referred to by Hume as “opinion.” Any theories about cause and effect in the physical world were necessarily of the latter type. See Ian Hacking, The Emergence of Probability, pp.180–1.

13 Hume goes on to characterize inductive reasoning in a rather pejorative manner. He asserts that the inductive inferences from experience are the mere result of “custom or habit.” “There is some other principle which determines him to form such a conclusion [that the future will resemble the past]. This principle is custom or habit.” Id., p.43. “All inferences from experience, therefore, are effects of custom, not of reasoning.” Id., p.44. “[A]fter a repetition of similar instances, the mind is carried by habit, upon the appearance of one event, to expect its usual attendant [the apparently related event], and to believe that it will exist.” Id., p.69. “Our idea, therefore, of necessity and causation arises entirely from the uniformity observable in nature, where similar objects are constantly conjoined together, and the mind is determined by custom to infer the one from the appearance of the other.” Id., p.75.

14 Probability is not an unambiguous concept. Rudolf Carnap argued that there is a concept of “logical probability” or “inductive probability” that reflects the confidence we can have in a particular theory or proposed law of nature in light of certain evidence—i.e., that reflects the likelihood that the law is true or the degree of confirmation provided by the evidence. He contrasted this concept with the more familiar concept of “statistical probability.” See Rudolf Carnap, Philosophical Foundations of Physics (Basic Books, 1966), pp.20, 34–5. This subject is discussed further in the chapter on Mathematics.

15 John Stuart Mill used the concept of causality or causal inference to explain the essence of inductive reasoning in his book A System of Logic (1904). See Peter Lipton, Inference to the Best Explanation, pp.18–19. An inference of causality was justified where all known cases of B were preceded by A or where there is only one identifiable difference between similar situations in which B occurs and those in which it does not (e.g., the presence of A). Id.

16 Foresight and understanding necessarily involve a fundamental subjective element, that is, they incorporate or turn upon the mental process of comprehension. The concept of explanation also, of course, implicates mental processes, and even prediction presumes more than a mere passive observer. We will return to these issues, but I would like to minimize their role for the present discussion.

17 Deutsch explains that “[o]ne cannot make even the simplest prediction without invoking quite a sophisticated explanatory framework.” Id., p.15. Testing and measurement techniques themselves depend upon theories of testing and measurement. Id., p.317. “The very idea that an experience has been repeated is not itself a sensory experience, but a theory” because “we are tacitly relying on explanatory theories to tell us which combinations of variables in our experience we should interpret as being ‘repeated’ phenomena in the underlying reality, and which are local or irrelevant.” Id., p.7. For a detailed discussion of the impressive scientific achievements involved in developing accurate measurement techniques in one particular area (thermometry, the measurement of temperature), and of their contribution to the advancement of science, see Hasok Chang, Inventing Temperature: Measurement and Scientific Progress (2004).

18 If this statement seems strange, I refer the reader back to the discussion of the concept of the “gene.” To summarize, the theory of genes provides a highly accurate basis for predicting the heritability of traits, with actual hands-on applications, e.g., in the breeding of animals. But, modern biology strongly suggests that there is no such thing as a “gene,” at least not in the sense that the theory was traditionally understood.

19 Of course, a characteristic of light, which we discuss below, is that it appears to travel at the same speed whether approaching the front or the back of a moving body, that is, that its speed relative to a body is independent of the speed or direction of the motion of the observing body, contrary to “normal” experience. Thus, particles of light do not display the characteristics that Feynman says would be expected of his imagined gravity-inducing particles.

20 Einstein’s theory included the distortion of time as well as space, so that gravity also operates through the warping of time, which is a concept harder to visualize than the warping of space. Id., p.15n.

21 There is a gap between intuition and our pure sensory perceptions. Take the airplane. Our common sensory experiences probably tell us that the airplane flies because the force of the air on the underside of the moving wing pushes the airplane upwards. The aeronautical engineers tell us that, instead, the movement of the wing through the air creates a vacuum on the top of the wings which pulls the airplane upwards. The result is the same. The explanatory model is different. I think that our reasoning abilities allow us to incorporate the engineers’ model into our intuitive experience, even if our initial physical sensation would have suggested to us that the first model was correct. When you stick your arm out of the window of a moving car, you probably would say that you feel the air pushing your arm upwards and backwards. But, if you ask yourself whether it is possible that, in fact, a vacuum is pulling your arm upwards and backwards, you can conclude that the sensation is consistent with either explanation.

22 Lord Rees says that Einstein transcended Newton by offering insights into gravity that made it seem more “‘natural’ and linked it to the nature of space and time, and the Universe itself.” From Here to Infinity, p.81. I, like many (I would think), find the hypothesized curvature of space-time to be a concept that is far from normal. However, what Lord Rees probably means by “natural” is that the theory presents gravity as an integral part of the natural order, even if we find that natural order to seem contrary to the experiences of our senses or to our common sense.

23 More precisely, these criteria are not strictly requirements for achieving the status of a scientific theory but are standards for preferring one theory over another. Clearly, a useful theory will have the quality of generality, that is, it can extend to a wide variety of observations and consequences. In addition, philosophers of science and many scientists themselves consider simplicity to be not just a virtue but an inherent quality of a good theory. Simplicity may be hard to define and it would certainly seem to have some cultural basis (or bias), but it seems that it is often possible, when confronted with two alternative theories, to identify the one that is simpler. Finally, there is elegance: “closely allied to simplicity, this regulative maxim separates what is ugly and cumbersome from sweeping ideas that carry élan and give pleasure upon comprehension.” Henry Margenau, “What is a Theory?,” in Sherman R. Krupp (ed.), The Structure of Economic Science (1966), p.26. Elegance is undoubtedly a culturally-influenced criterion—certainly, what it is that constitutes elegance will be culturally determined; perhaps the relevance of the criterion itself is also a cultural phenomenon.

24 It can be argued that a history of successful predictions, especially predictions of things not already known, justifies a belief in the theory, at least until a better one comes along. This approach to science was formalized as the hypothetico-deductive model by Carl Hempel in Philosophy of Natural Science (1966). Professor Lipton observed that “the hypothetico-deductive model seems genuinely to reflect scientific practice, which is perhaps why it has become the scientists’ philosophy of science.” Lipton, Inference to the Best Explanation, p.15.

25 Indeed, Lakatos was sharply critical of Popper’s definition of (the methodology of) science as based upon the sole use of propositions that are falsifiable. “For Popper’s criterion ignores the remarkable tenacity of scientific theories. Scientists have thick skins. They do not abandon a theory merely because facts contradict it. They normally either invent some rescue hypothesis …[or] they ignore [the anomalous results]. …[S]cientists talk about anomalies, recalcitrant instances, not refutations. History of science, of course, is full of accounts of how crucial experiments allegedly killed theories. But such accounts are fabricated long after the theory had been abandoned.” Id. This criticism, as expressed, is based upon his view of how scientists act, but Lakatos also argues, in effect, that the test of falsification is not possible. See also, Polanyi, “The Creative Imagination,” Chemical and Engineering News 44 (1966), p.85 (“Verification and falsification are both formally indeterminate procedures.”).

26 Another problem is that of “underdetermination,” where the theory does not provide a unique explanation of causation. See, e.g., Peter Lipton, Inference to the Best Explanation (2004) (Second Edition), pp.5–7.

27 Professor Barrow attributes the realization of the crucial significance of the initial conditions to two of the “deepest thinkers of the nineteenth century,” James Clerk Maxwell and Henri Poincaré, quoting Poincaré’s statement that “a very small cause which escapes our notice determines a considerable effect that we cannot fail to see, and then we say that the effect is due to chance. … [I]t may happen that small differences in the initial conditions produce very great ones in the final phenomena. A small error in the former will produce an enormous error in the latter. Prediction becomes impossible… .” Id.

28 As we shall discuss further, all predictions will be subject to the ceteris paribus condition: the prediction of the theory will be realized on the assumption that nothing else material to the result, beyond what has been hypothesized in the making of the prediction, occurs in the process. Many scientific predictions can be tested in a laboratory where extraneous factors can be controlled or, at least, measured. In other cases, the theory may incorporate all of the necessary causal factors within the parameters of current technology for measurement or, in other words, experimental error. Such theories can be considered as applying to “closed systems”—all relevant variables are within the system. However, the ceteris paribus condition is particularly daunting in “open systems,” subjects like economics and biology—it is almost inconceivable that any prediction made by an economist or evolutionary biologist subject to the ceteris paribus condition could be realized in the real world. The theory will never be able to incorporate and control for or measure all relevant factors, and there will always be relevant factors that have changed. So, unless we are prepared to deny the status of “science” to certain bodies of study that we have generally considered to be scientific, we may conclude that there has to be a broader set of criteria that determines whether a theory is scientific. Such theories have been called “explanation sketches,” since a complete explanation can never be achieved, only more and more complete explanations as additional variables are incorporated into the theory. Carl G. Hempel and Paul Oppenheim, “Studies in the Logic of Explanation,” Philosophy of Science, XV (1948), pp.130–39.

29 Of course, one could undertake an interesting exploration in the history of ideas to ascertain the actual origins of the root concept and the influences of various intellectuals on others. One could also speculate on the apparent concurrent emergence of ideas (whose “time has come”) that seems to appear in the history of ideas. Naturally, the concepts of competition and survival of the fittest also can be—and have been—applied to the “evolution” of various bodies of thought, including the natural sciences.

30 Let me add, and explain, a caveat to this assertion. There has been some empirical evidence of regularity or predictability in the observed, laboratory evolution of bacteria. See Carl Zimmer, “Watching Bacteria Evolve, With Predictable Results,” The New York Times, Science, August 15, 2013. Research conducted under the direction of João Xavier at Memorial Sloan-Kettering Cancer Center, the results of which were published in the journal Cell Reports, demonstrated that a common species of bacteria called Pseudomonas aeruginosa repeatedly experienced mutations in the same gene that produced bacteria that could travel faster through the medium in which they were raised so as to obtain more food. This study dealt with only a specific adaptation in a specific environment. Dr. Xavier is quoted as saying: “In this case, it could be that there are only a few solutions in the evolutionary space.” Id. Given that the particular mutation was capable of giving rise to a physical characteristic that was clearly useful, if there were no other viable mutations that could be as useful, then it was predictable with large numbers and repeated reproduction that such a mutation would be likely to appear in new samples. It is hard to imagine the state of knowledge that would be required for scientists to have predicted such a mutation if they had never seen it before.

31 The practice of diagnostic medicine may more accurately be characterized as intuitive rather than probabilistic. The distinction I am making can be captured in the question “Could a computer be programmed (at least, theoretically) to be as effective a diagnostician as a human being is?” To the extent that the successful practice of diagnosis and treatment depends upon intuitive and creative insights, then that level of achievement will be beyond the capability of the computer because such insights are not computable.

32 For example, concerns have been expressed about the meaning of certain clinical trials that fail to detect any statistically meaningful effects of a drug across the sample group, despite individual stories of almost miraculous effects of the same drug on particular individuals. The explanation may be in our failure to understand “just how individualized human physiology and human pathology really are.” Clifton Leaf, “Do Clinical Trials Work?,” The New York Times, Sunday Review, The Opinion Pages, July 14, 2013. In other words, there may be quite specific factors in particular individuals with a certain disease that make a drug highly effective, while many others with the “same” disease may not have those factors.

33 Byers discusses the different uses of the words “subjective” and “objective.” Citing the New Oxford American Dictionary, he notes that “objective” can be used to mean free of personal opinion or of a personal (biased) perspective but can also be used to mean “not dependent on the mind for existence.” Byers suggests that science is objective in the first sense, but not necessarily so in the second. Id., pp.92, 98.

34 I admit that part of the wonder I felt as I began to master neoclassical value theory, and then more traditional industrial organization economics, was the recognition that it seemed to provide an intellectual framework for the Midwestern Republican political values with which I had been raised during the late 1950s and early 1960s. I then promptly saw that one of the reasons for the violent antipathy toward the economics department being expressed by most of my college contemporaries in the elite Eastern academy of the late-1960s was the very fact that a significant part of the established doctrine did indeed seem to provide support for free markets, inequality in the distribution of wealth and income and laissez faire government economic policies. The fact that economists were making substantial progress in understanding externalities, the consequences of inequality and other shortcomings of the established doctrines was quite irrelevant.

35 Arthur Stanley Eddington, the Plumian Professor of Experimental Philosophy at Cambridge University during the first half of the twentieth century and a contemporary and colleague of Albert Einstein, believed that fundamental science describing the physical world could be developed by pure thought. Observation and experimentation were useful tools that could greatly speed the process of discovery but were, nonetheless, ultimately unnecessary. He declared: “My conclusion is that not only the laws of nature but numerical values of the fundamental relationships or forces that are presumed to be constant, such as the speed of light and the gravitational constant, can be deduced from epistemological considerations, so that we can have a priori knowledge of them.” The Philosophy of Physical Science (1939), p.58, quoted in Barrow, The Constants of Nature (2003), p.83.

36 Physicist Paul Davies asserts that it “is generally agreed” among scientists that the Laws of Nature have four characteristics: they are universal, they are absolute, they are eternal and they are omnipotent (“all powerful,” meaning that they cannot be avoided or evaded). The Mind of God, pp.82–3. He observes that these are qualities “that were formerly attributed to the God from which [the Laws] were once supposed to have come.” Id., p.82. Davies suggests that the question that does divide scientists is whether the Laws are somehow real (that is, that they exist independently of the physical world) or are simply constructs invented by scientists. Id., pp.83–4.

37 And, certainly, theoretical work does go on exploring the possible consequences to our understanding of cosmology, for example, if one permits (or assumes) changes in the laws or the constants. See, e.g., Smolin, Time Reborn; João Magueijo, Faster Than the Speed of Light (2003).

38 Lord Rees explains that if one takes a few meters as a normal distance for man, that distance would have to be increased by twenty-five factors of ten to reach the observable limits of the Universe. Just Six Numbers, pp.5–6. From a meter distance, our smallest measurable size (using electron microscopes and particle accelerators) would be about seventeen negative factors of ten smaller. Physicists speculate that the smallest structures of nature, like the proposed superstrings, would be smaller by another seventeen negative factors of ten. Id., pp.6–7.

39 The example of the developments leading to Newton’s theories is relatively easy to understand. The change from an Earth-centered to a Sun-centered then a no-centered worldview is widely understood and accepted in modern society. There was a paradigm shift that was far broader than the scientific theories directly involved. It is likely that certain societal and cultural developments were useful, if not necessary, to the acceptance of the new theories and, perhaps, even to the conception of them. What about Einstein and his theories of relativity? At a superficially appealing level, one can note that societal developments concurrent to the scientific theory included the breakdown of established orders and universal truths and the emergence of cultural and moral relativity. Are these all elements of a paradigm shift? Can one say that the Special Theory of Relativity is a cousin to moral relativity and then to political correctness? I think not. The relativity of the theories in physics does not in any way suggest that there are no universal truths or bases for making judgments of the correctness or value of various positions. Indeed, as Lord Rees has observed, “It’s a pity, in retrospect, that he called his theory ‘relativity’. Its essence is that the local laws are just the same in different frames of reference. ‘Theory of invariance’ might have been a more apt choice, and would have staunched the misleading analogies with relativism in human contexts.” From Here to Infinity, p.137. Nonetheless, there is certainly some truth to the observation that our values and views of the world are being challenged on many fronts. I do not attribute that phenomenon to Albert Einstein. I think that it reflects the sense of tumult and uncertainty that will accompany significant shifts in views of the world, whether revolutionary in the Kuhnian sense or more accretive and cumulative.

