4.12 Types of Theory Development

In his Introduction to Metascience (1976) Hickey distinguishes three types of theory development, which he calls extension, elaboration and revision.

Theory extension is the use of a currently tested and nonfalsified explanation to address a new scientific problem. The extension could be as simple as adding hypothetical statements to make a general explanation more specific for the problem at hand.

A more complex strategy for theory extension is analogy. In his Computational Philosophy of Science (1988) Thagard describes his strategy for mechanized theory development, which consists in patterning a proposed solution to a new problem by analogy with an existing explanation for a different subject. Using a system design based on this strategy, his discovery system PI (an acronym for “Processes of Induction”) reconstructed the development of the theory of sound waves by analogy with the description of water waves. The system was his Ph.D. dissertation.

In Mental Leaps: Analogy in Creative Thought (1995), coauthored with Keith Holyoak, Thagard further explains that analogy is a kind of nondeductive logic, which he calls “analogic”. It involves firstly the “source analogue”, which is the known domain that the investigator already understands in terms of familiar patterns, and secondly the “target analogue”, which is the unfamiliar domain that the investigator is trying to understand. Analogic is the strategy whereby the investigator comes to understand the target domain by seeing it in terms of the source domain. Analogic involves a “mental leap”, because the two analogues may initially seem unrelated; and it is called a “leap”, because analogic is not conclusive in the way that deductive logic is.
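
The transfer mechanism that analogic involves can be suggested schematically. The following Python sketch maps the relational structure of a familiar source analogue (water waves) onto an unfamiliar target analogue (sound); the toy schemas, attribute names and string encodings are illustrative assumptions of this sketch and are not the frame structures actually used in Thagard’s PI system.

```python
# Toy illustration of analogic: transfer the relational structure of a
# familiar source analogue (water waves) to an unfamiliar target
# analogue (sound). The schemas are invented for illustration; they are
# not the data structures of Thagard's PI system.

source = {
    "medium": "water",
    "relations": [
        "a disturbance propagates outward through water",
        "greater amplitude produces greater intensity",
        "waves reflect off barriers",
    ],
}

target = {
    "medium": "air",
    # Echoes: the observed overlap that triggers the analogy.
    "relations": ["waves reflect off barriers"],
}

def analogize(source, target):
    """Transfer each source relation not already known of the target,
    rewriting the source medium into the target medium. The transfers
    are conjectures, not deductions: analogic is not conclusive."""
    return [
        relation.replace(source["medium"], target["medium"])
        for relation in source["relations"]
        if relation not in target["relations"]
    ]

for conjecture in analogize(source, target):
    print("Conjecture about sound:", conjecture)
```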

It may be noted that if the output state description generated by an analogy-based system such as PI is radically different from anything previously seen by the scientific profession studying the target analogue, then the members of that profession may experience the communication constraint to the high degree that is usually associated with a theory revision. The communication constraint is discussed below (Section 4.26).

Theory elaboration is the correction of a currently falsified theory to create a new theory by adding new factors or variables that correct the falsified universally quantified statements and the erroneous predictions of the old theory. The new theory has the same test design as the old theory. The correction is not merely an ad hoc exclusion of individual exceptional cases, but rather a change in the universally quantified statements. This process is often misrepresented as “saving” a falsified theory, but in fact it creates a new one.

For example the introduction of a variable for the volume quantity and the development of a constant coefficient for the particular gas could elaborate Gay-Lussac’s law for gases into the combined gas law, which unites Gay-Lussac’s law, Boyle’s law and Charles’ law. Similarly Friedman’s macroeconomic quantity theory might be elaborated into a Keynesian liquidity-preference function by the introduction of an interest rate, to account for the cyclicality manifest in an annual time series describing the calculated velocity parameter and to display the liquidity-trap phenomenon.
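
The gas-law example can be made concrete with a small numerical sketch, assuming synthetic data rather than historical measurements. In the Python code below the old law P = kT, adequate at a fixed volume, yields grossly erroneous predictions once volume varies, and introducing the volume variable elaborates it into the combined law P = cT/V, which fits the same observations.

```python
# Sketch of theory elaboration on the gas-law example. The old theory
# P = k*T (Gay-Lussac) is falsified once volume varies; adding the
# volume variable yields the combined law P = c*T/V. The observations
# are synthetic, generated from P = 0.0821*T/V.

data = [  # (temperature K, volume L, observed pressure atm)
    (300, 10.0, 2.463),
    (300, 5.0, 4.926),
    (400, 10.0, 3.284),
    (400, 2.0, 16.420),
]

def fit_old(data):
    # Old theory P = k*T: least squares through the origin on T.
    return sum(p * t for t, v, p in data) / sum(t * t for t, v, p in data)

def fit_new(data):
    # Elaborated theory P = c*T/V: regress P on the composite term T/V.
    return sum(p * t / v for t, v, p in data) / sum((t / v) ** 2 for t, v, p in data)

k, c = fit_old(data), fit_new(data)
for t, v, p in data:
    print(f"T={t} V={v}: observed {p:.2f}, old theory {k * t:.2f}, "
          f"elaborated theory {c * t / v:.2f}")
```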

Pat Langley’s BACON discovery system implements theory elaboration. It is named after the English philosopher Francis Bacon (1561-1626), who thought that scientific discovery can be routinized. BACON is a set of successive and increasingly sophisticated discovery systems that construct quantitative laws and theories from input measurements. Langley designed and implemented BACON in 1979 as his Ph.D. dissertation, written in the Carnegie-Mellon department of psychology under the direction of Simon. The system is described in Scientific Discovery: Computational Explorations of the Creative Processes (1987), which Langley coauthored with Simon, Bradshaw and Zytkow.

BACON uses Simon’s heuristic-search design strategy, which may be construed as a sequential application of theory elaboration. Given sets of observation measurements for two or more variables, BACON searches for functional relations among the variables. BACON has simulated the discovery of several historically significant empirical laws, including Boyle’s law of gases, Kepler’s third planetary law, Galileo’s law of motion for objects on inclined planes, and Ohm’s law of electrical current.
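
The flavor of this heuristic search can be conveyed by a minimal sketch in the spirit of the earliest BACON programs, though it is not Langley’s actual code: if two terms rise together, form their ratio; if one rises as the other falls, form their product; stop when a derived term is nearly constant. Run on rough planetary data, the sketch recovers Kepler’s third law in the form D³/P² ≈ constant.

```python
# Minimal sketch of BACON's slope heuristic (in the spirit of the early
# BACON programs, not Langley's code). Terms that rise together suggest
# a ratio; terms that vary inversely suggest a product; a nearly
# constant derived term is reported as a law.

def nearly_constant(vals, tol=0.02):
    mean = sum(vals) / len(vals)
    return all(abs(v - mean) <= tol * abs(mean) for v in vals)

def direct_trend(u, w):
    # Rows are sorted on the first variable, so comparing endpoints is
    # a crude but adequate monotonicity test for this sketch.
    return (u[-1] - u[0]) * (w[-1] - w[0]) > 0

def bacon(x, y, xname, yname, max_terms=6):
    prev, prev_name = x, xname
    if direct_trend(x, y):
        cur, cur_name = [a / b for a, b in zip(x, y)], f"{xname}/{yname}"
    else:
        cur, cur_name = [a * b for a, b in zip(x, y)], f"{xname}*{yname}"
    for _ in range(max_terms):
        if nearly_constant(cur):
            return cur_name, sum(cur) / len(cur)
        if direct_trend(prev, cur):
            new, name = [c / p for c, p in zip(cur, prev)], f"({cur_name})/({prev_name})"
        else:
            new, name = [c * p for c, p in zip(cur, prev)], f"({cur_name})*({prev_name})"
        prev, prev_name, cur, cur_name = cur, cur_name, new, name
    return None

# Kepler's third law from rounded planetary data:
# orbital radius D (astronomical units) and period P (years).
D = [0.387, 0.723, 1.000, 1.524, 5.203]
P = [0.241, 0.615, 1.000, 1.881, 11.862]
print(bacon(D, P, "D", "P"))  # ((D/P)*(D))*(D/P), i.e. D^3/P^2, is ~1
```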

Theory revision is a reorganization of currently existing information to create a new theory. In his Origins of Modern Science, 1300-1800 Herbert Butterfield wrote that in both celestial and terrestrial physics the historic scientific revolution was brought about not by new observations or additional evidence, but by transpositions that took place inside the minds of the scientists (p. 1). The results of theory revision may be radically different from the input theories, so revision might be undertaken only after repeated attempts at both theory extension and theory elaboration have failed to correct a previously falsified theory. The source for the input state description for mechanized theory revision is the descriptive vocabulary of the currently untested theories addressing the problem at hand. The descriptive vocabulary of previously falsified theories may also be included as input to make an accumulative state description, because the vocabularies of rejected theories can be productively cannibalized for their scrap value. The new theory is most likely to be called revolutionary if the revision is great, because theory revision typically produces greater change to the current language state than does theory extension or theory elaboration, thus producing psychologically disorienting semantical dissolution.

Hickey’s METAMODEL discovery system synthesizes theory revisions. It constructed the Keynesian macroeconomic theory from U.S. statistical data available prior to 1936, the publication year of Keynes’ revolutionary General Theory of Employment, Interest and Money. The applicability of the METAMODEL for this theory revision was already known in retrospect from the fact that, as the 1980 Nobel-laureate econometrician Lawrence Klein wrote in his Keynesian Revolution (1947), all the important parts of Keynes’ theory can be found in the works of one or another of his predecessors. The METAMODEL, described in Hickey’s Introduction to Metascience (1976), is a mechanized generative grammar with combinatorial transition rules producing econometric models. The grammar is a finite-state generative grammar, both to satisfy the collinearity constraint for the regression-estimated equations and to satisfy the formal requirements for executable multi-equation predictive models. The system tests for collinearity, statistical significance, serial correlation, goodness-of-fit properties of the equations, and accurate out-of-sample retrodictions. Simon calls this combinatorial type of system a “generate-and-test” design.
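
The generate-and-test design can likewise be illustrated by a toy sketch, which is an invented assumption of this illustration and not Hickey’s actual METAMODEL: enumerate candidate regressor sets for a target variable, estimate each equation by least squares, and retain only the equations that pass collinearity, goodness-of-fit and out-of-sample screens. The variable names, synthetic data and thresholds below are likewise invented.

```python
# Toy generate-and-test sketch (not Hickey's METAMODEL): enumerate
# candidate regressor sets, estimate each equation by least squares,
# and keep equations that pass collinearity, fit and out-of-sample
# screens. Data, names and thresholds are invented for illustration.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, n_train = 40, 30
# Synthetic "historical series": consumption depends on income; the
# other candidate variables are noise the tests should screen out.
income = rng.uniform(50, 150, n)
interest = rng.uniform(1, 8, n)
sunspots = rng.uniform(0, 1, n)
consumption = 10 + 0.8 * income + rng.normal(0, 2, n)
candidates = {"income": income, "interest": interest, "sunspots": sunspots}

def test_model(y, cols, names):
    X = np.column_stack([np.ones(len(y))] + cols)
    if np.linalg.cond(X) > 1e3:        # collinearity screen
        return None
    beta, *_ = np.linalg.lstsq(X[:n_train], y[:n_train], rcond=None)
    resid = y[:n_train] - X[:n_train] @ beta
    r2 = 1 - resid.var() / y[:n_train].var()                  # fit screen
    oos = np.mean(np.abs(y[n_train:] - X[n_train:] @ beta))   # retrodiction screen
    if r2 < 0.9 or oos > 3.0:
        return None
    return names, np.round(beta, 2), round(r2, 3)

# Generate: every nonempty combination of candidate regressors.
for k in range(1, len(candidates) + 1):
    for combo in itertools.combinations(candidates, k):
        accepted = test_model(consumption, [candidates[c] for c in combo], combo)
        if accepted:
            print("accepted equation:", accepted)
```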

Hickey also used his METAMODEL system in 1976 to develop a post-classical macrosociometric functionalist model of the American national society with fifty years of historical time-series data. To the shock, chagrin and dismay of academic sociologists, it is not a social-psychological theory, and four sociological journals therefore rejected Hickey’s paper describing the model and its findings about the national society’s dynamics and stability characteristics. The paper is reprinted below as “Appendix I” to BOOK VIII.

The academic sociologists’ a priori ontological commitments to romanticism and social-psychological reductionism rendered the referees invincibly obdurate. Their criticisms also betrayed their Luddite mentality toward mechanized theory development. Later in the mid-1980s Hickey integrated his macrosociometric model into a Keynesian macroeconometric model to produce an institutionalist macroeconometric model for the Indiana Department of Commerce, Division of Economic Analysis.
