In this paper I challenge the pernicious aspects of Milton Friedman's methodological outlook that continue to hold sway over mainstream neoclassical economists. I do this by showing how Friedman's own methodological dicta could have been used against him when he famously advanced the expectations critique of the Phillips curve in his presidential address to the American Economic Association. I use this case study to further suggest that psychological and neurophysiological data should not be deemed irrelevant to economic science.
In this paper, I shall defend two main claims. First, Friedman’s famous paper “On the methodology of positive economics” (“F53”) cannot be properly understood without taking into account the influence of three authors who are neither cited nor mentioned in the paper: Max Weber, Frank Knight, and Karl Popper. I shall trace both their substantive influence on F53 and the historical route by which this influence took place. Once one has understood these ingredients, especially Weber’s ideal types, many of F53’s astonishing sentences, like “the more significant the theory, the more unrealistic the assumptions”, make good sense. Second, I shall claim that the much-discussed question of whether Friedman’s essay espouses an instrumentalist or a realist position is the wrong question to ask. I shall illustrate this by a comparison with examples from physics in which unrealistic assumptions are also made. There, too, the question of whether these assumptions are indicators of instrumentalism or realism is not appropriate. Cleared of these misunderstandings, F53 presents itself as an interesting and reasonable but much less controversial contribution to the methodology of economics.
Throughout the history of mankind, energy security has always been seen as a means of protection from disruptions of essential energy systems. The idea of protection from disruptions emerged from the process of securing political and military control over energy resources in order to set up policies and measures for managing risks that affect all elements of energy systems. The various systems put in place to achieve energy security are the driving force behind energy innovations and emerging trends in the energy sector. Our paper discusses the energy security status and innovations in the energy sector in the European Union (EU). We analyze recent developments in energy policy and the exploitation of energy sources, and scrutinize the channels of energy streaming to the EU countries and the risks associated with this energy import. Moreover, we argue that the shift to low-carbon production of energy and the massive deployment of renewable energy sources (RES) might become the key issue in ensuring the energy security of the EU and its independence from external energy supplies. RES, distributed energy resources (DER), and “green energy” based on energy efficiency and the shift to alternative energy supply might change the energy security status quo for the EU.
I distinguish several doctrines that economic methodologists have found attractive, all of which have a positivist flavour. One of these is the doctrine that preference assignments in economics are just shorthand descriptions of agents' choice behaviour. Although most of these doctrines are problematic, the latter doctrine about preference assignments is a respectable one, I argue. It doesn't entail any of the problematic doctrines, and indeed it is warranted independently of them.
Julian Reiss correctly identified a trilemma about economic models: we cannot jointly maintain that they are false, that they nevertheless explain, and that only true accounts explain. In this reply we give reasons to reject the second premise, that economic models explain. Intuitions to the contrary should be distrusted.
This article is a prelude to an experimental study of the preference concept in economics. I argue that a new empirical approach called experimental philosophy of science is a promising approach to advance the philosophy of economics. In particular, I discuss two debates in the field, the neuroeconomics controversy and the commonsensible realism debate, and suggest how experimental and survey techniques can generate data that will inform these debates. Some of the likely objections from philosophers and economists are addressed, and possible ways of operationalizing different preference concepts are illustrated.
This paper aims (a) to provide characterizations of realism and instrumentalism that are philosophically interesting and applicable to economics; and (b) to defend instrumentalism against realism as a methodological stance in economics. The starting point is the observation that ‘all models are false’, which, or so I argue, is difficult to square with the realist's aim of truth, even if the latter is understood as ‘partial’ or ‘approximate’. The three cheers in favour of instrumentalism are: (1) Once we have usefulness, truth is redundant. (2) There is something disturbing about causal structure. (3) It's better to do what one can than to chase rainbows.
Milton Friedman’s 1953 essay created controversy and consternation amongst economists. It provided a prescription, based on empirically generated predictive success, of how to do economics, yet many saw it as a surrender of the search for truth and theoretical beauty within the discipline. This article reviews a 50th anniversary festschrift devoted to views of the essay. The purpose of the volume is to provide today’s reader with the essay, responses, and a guide to interpreting it. The volume is selective and several contributors have their own agendas, but the feeling of tumult the essay still engenders is nicely conveyed.
In this paper I study how the theoretical categories of consumption theory were used by Milton Friedman in order to classify empirical data and obtain predictions. Friedman advocated a case-by-case definition of these categories that traded theoretical coherence for empirical content. I contend that this methodological strategy creates a clear incentive to contest any prediction contrary to our interest: it can always be argued that such predictions rest on a wrong classification of data. My conjecture is that this methodological strategy can help explain why Friedman’s predictions never generated the consensus he expected among his peers.
The firm is a real entity and not an imaginary, fictitious or linguistic entity. This implies that the firm as a whole exhibits a sufficient degree of unity or cohesiveness and is durable and persistent through time. The firm is essentially composed of a particular combination of constituents that are bound together by something that acts as an ontological glue, and it is therefore non-reducible to other more basic entities, i.e., to its parts or its members. From our perspective, the firm is not simply an aggregate or a collection. It is a real integrated entity and a dynamic causal system. Institutional and organizational aspects enter the picture. These assertions stand in sharp contrast to mainstream theories of the firm, whose proponents are more preoccupied with questions of contractual provisions, vertical integration or opportunism than with the general and more fundamental question of what firms really are.
In this paper I study Milton Friedman’s statistical education, paying special attention to the different methodological approaches (Fisher, Neyman and Savage) to which he was exposed. I contend that these statistical procedures involved different views as to the evaluation of statistical predictions. In this light, the thesis defended in Friedman’s 1953 methodological essay appears substantially ungrounded.
This book brings together ten previously published essays on the philosophy of economics and economic methodology. The general theme is the application of Karl Popper's philosophy of science to economics -- not only by Popper himself but also by other members of the "Popperian school." There are three major issues that surface repeatedly: the applicability of Popper's falsificationist philosophy of science; the applicability of I. Lakatos's "methodology of scientific research programs" to economics; and the question of Popper's "situational analysis" approach to social science.
Many of the things that we try to explain, in both our common sense and our scientific engagement with the world, are capable of being explained more or less finely: that is, with greater or lesser attention to the detail of the producing mechanism. A natural assumption, pervasive if not always explicit, is that, other things being equal, the more fine-grained an explanation, the better. Thus, Jon Elster, who also thinks there are instrumental reasons for wanting a more fine-grained explanation, assumes that in any case the mere fact of getting nearer the detail of production makes such an explanation intrinsically superior: “a more detailed explanation is also an end in itself”. Michael Taylor agrees: “A good explanation should be, amongst other things, as fine-grained as possible.”
A review of A. Hirsch and N. de Marchi's thorough historical study of Milton Friedman's life-long work as an economist (and more specifically as a monetary economist) and as an economic methodologist (in his famous essay "The Methodology of Positive Economics").
Economics today cannot predict the likely outcome of specific events any better than it could in the time of Adam Smith. This is Alexander Rosenberg's controversial challenge to the scientific status of economics. Rosenberg explains that the defining characteristic of any science is predictive improvability--the capacity to create more precise forecasts by evaluating the success of earlier predictions--and he forcefully argues that because economics has not been able to increase its predictive power for over two centuries, it is not a science.
According to the instrumentalism of Friedman and Machlup it is irrelevant whether the explanatory principles or “assumptions” of a theory satisfy any criterion of “plausibility,” “realism,” “credibility,” or “soundness.” In this view the main or only criterion for selecting theories is whether a theory yields empirically testable implications that turn out to be consistent with observations. All we should require or expect from a theory is that it is a useful instrument for the purpose of prediction. Considerations of the “efficiency” of a theory for the purpose of ordering our experiences are permitted, but considerations of “plausibility” are not. “Explanatory assumptions” are not really explanatory in the sense that they claim to represent underlying causal processes in reality; they only serve to generate, by deduction, implications that are in accordance with as many observations as possible.
The F-twist is giving way to the methodology of scientific research programs. Milton Friedman's “The Methodology of Positive Economics” is being supplanted as the orthodox rationale for neoclassical economics by Imre Lakatos' account of scientific respectability. Friedman's instrumentalist thesis that theories are to be judged by the confirmation of their consequences and not the realism of their assumptions has long been widely endorsed by economists, under Paul Samuelson's catchy rubric “the F-twist.” It retains its popularity among economists who want no truck with methodology, but among the increasing number of able economists who are writing on methodology the F-twist has been surrendered, not so much because these writers have decided it is false, as because something better has finally come along.