First published in 2001, Causality in Macroeconomics addresses the long-standing problems of causality while taking macroeconomics seriously. The practical concerns of the macroeconomist and abstract concerns of the philosopher inform each other. Grounded in pragmatic realism, the book rejects the popular idea that macroeconomics requires microfoundations, and argues that the macroeconomy is a set of structures that are best analyzed causally. Ideas originally due to Herbert Simon and the Cowles Commission are refined and generalized to non-linear systems, particularly to the non-linear systems with cross-equation restrictions that are ubiquitous in modern macroeconomic models with rational expectations. These ideas help to clarify philosophical as well as economic issues. The structural approach to causality is then used to evaluate more familiar approaches to causality due to Granger, LeRoy and Glymour, Spirtes, Scheines and Kelly, as well as vector autoregressions, the Lucas critique, and the exogeneity concepts of Engle, Hendry and Richard.
Elliott Sober forcefully restates his well-known counterexample to Reichenbach's principle of the common cause: bread prices in Britain and sea levels in Venice both rise over time and are, therefore, correlated; yet they are ex hypothesi not causally connected, which violates the principle of the common cause. The counterexample employs nonstationary data—i.e., data with time-dependent population moments. Common measures of statistical association do not generally reflect probabilistic dependence among nonstationary data. I demonstrate the inadequacy of the counterexample and of some previous responses to it, and illustrate more appropriate measures of probabilistic dependence in the nonstationary case. Outline: A challenge to the principle of the common cause; Sober's argument and the attempts to rescue the principle; Probabilistic dependence; Nonstationary time series; Probabilistic dependence in nonstationary time series; Do Venetian sea levels and British bread prices violate the principle of the common cause?
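To make the nonstationarity point concrete, here is a minimal simulation sketch (my illustration, not drawn from the paper): two causally unrelated series that each drift upward show a large correlation in levels, while an association measure computed on their first differences, one simple way of respecting the nonstationarity, is near zero. The variable names `bread` and `sea` are just stand-ins for Sober's example.

```python
# Illustrative sketch: two causally unrelated but trending series look
# correlated in levels; their first differences do not.
import numpy as np

rng = np.random.default_rng(0)
n = 200
bread = 0.05 * np.arange(n) + rng.normal(scale=1.0, size=n)  # trending, independent noise
sea = 0.03 * np.arange(n) + rng.normal(scale=1.0, size=n)    # trending, independent noise

corr_levels = np.corrcoef(bread, sea)[0, 1]                     # large: driven by the shared time trend
corr_diffs = np.corrcoef(np.diff(bread), np.diff(sea))[0, 1]    # near zero: no genuine dependence

print(f"correlation in levels:      {corr_levels:.2f}")
print(f"correlation in differences: {corr_diffs:.2f}")
```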
Discontented people might talk of corruption in the Commons, closeness in the Commons and the necessity of reforming the Commons, said Mr. Spenlow solemnly, in conclusion; but when the price of wheat per bushel had been the highest, the Commons had been the busiest; and a man might lay his hand upon his heart, and say this to the whole world, – ‘Touch the Commons, and down comes the country!’.
The dominant view among macroeconomists is that macroeconomics reduces to microeconomics, both in the sense that all macroeconomic phenomena arise out of microeconomic phenomena and in the sense that macroeconomic theory—to the extent that it is correct—can be derived from microeconomic theory. More than that, proponents of the dominant view hold that macroeconomics should in practice use the reduced microeconomic theory: this is the program of microfoundations for macroeconomics to which the vast majority of macroeconomists adhere. The "microfoundational" models that they actually employ are, however, characterized by another feature: they are highly idealized, even when they are applied as direct characterizations of actual data, which itself consists of macroeconomic aggregates. This paper explores the interrelationship between reductionism and idealization in the microfoundational program and the role of idealization in empirical modeling.
Macroeconomists overwhelmingly believe that macroeconomics requires microfoundations, typically understood as a strong eliminativist reductionism. The microfoundational program aims to recover intentionality. In the face of technical and data constraints, macroeconomists typically employ a representative-agent model, in which a single agent solves the microeconomic optimization problem for the whole economy, and take it to be microfoundationally adequate. The characteristic argument for the representative-agent model holds that the possibility of the sequential elaboration of the model to cover any number of individual agents justifies treating the policy conclusions of the single-agent model as practically relevant. This eschatological justification is examined and rejected.
For more than 20 years, Deirdre McCloskey has campaigned to convince the economics profession that it is hopelessly confused about statistical significance. She argues that many practices associated with significance testing are bad science and that most economists routinely employ these bad practices: “Though to a child they look like science, with all that really hard math, no science is being done in these and 96 percent of the best empirical economics …” (McCloskey 1999). McCloskey's charges are analyzed and rejected. That statistical significance is not economic significance is a jejune and uncontroversial claim, and there is no convincing evidence that economists systematically mistake the two. Other elements of McCloskey's analysis of statistical significance are shown to be ill-founded, and her criticisms of the practices of economists are found to be based in inaccurate readings and tendentious interpretations of those economists' work. Properly used, significance tests are a valuable tool for assessing signal strength, for assisting in model specification, and for determining causal structure.
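As a hedged illustration of the distinction between statistical and economic significance (mine, not McCloskey's or the paper's), the sketch below simulates a regression in which the true effect is trivially small yet, with a large enough sample, the t-statistic clears any conventional threshold.

```python
# Sketch: a substantively tiny effect can still be "statistically significant".
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.normal(size=n)
y = 0.01 * x + rng.normal(size=n)   # true slope 0.01: x explains ~0.01% of the variance of y

xc = x - x.mean()
beta = np.sum(xc * (y - y.mean())) / np.sum(xc**2)              # OLS slope
resid = y - y.mean() - beta * xc
se = np.sqrt(np.sum(resid**2) / (n - 2) / np.sum(xc**2))        # conventional standard error
print(f"estimate = {beta:.4f}, t-statistic = {beta / se:.1f}")
# The t-statistic is around 10, yet the estimated effect is economically
# negligible: statistical significance alone says nothing about importance.
```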
'Data mining' refers to a broad class of activities that have in common a search over different ways to process or package data statistically or econometrically, with the purpose of making the final presentation meet certain design criteria. We characterize three attitudes toward data mining: first, that it is to be avoided and, if it is engaged in, that statistical inferences must be adjusted to account for it; second, that it is inevitable and that the only results of any interest are those that transcend the variety of alternative data-mined specifications (a view associated with Leamer's extreme-bounds analysis); and third, that it is essential and that the only hope we have of using econometrics to uncover true economic relationships is to be found in the intelligent mining of data. The first approach confuses considerations of sampling distribution with considerations of epistemic warrant and reaches an unnecessarily hostile attitude toward data mining. The second approach relies on a notion of robustness that has little relationship to truth: there is no good reason to expect a true specification to be robust to alternative specifications. Robustness is not, in general, a carrier of epistemic warrant. The third approach is operationalized in the general-to-specific search methodology of the LSE school of econometrics. Its success demonstrates that intelligent data mining is an important element in empirical investigation in economics.
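The sketch below is a deliberately stripped-down caricature of a general-to-specific search (my construction; the LSE methodology also involves diagnostic testing, multiple search paths, and encompassing checks): start from a general model with many candidate regressors and repeatedly drop the least significant one until every remaining t-statistic clears a threshold.

```python
# Caricature of a general-to-specific specification search.
import numpy as np

def ols_tstats(X, y):
    """Return OLS coefficients and their t-statistics for a regression of y on X."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - k)
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
    return beta, beta / se

def general_to_specific(X, y, names, t_crit=2.0):
    """Drop the least significant regressor until all remaining |t| >= t_crit."""
    keep = list(range(X.shape[1]))
    while True:
        beta, t = ols_tstats(X[:, keep], y)
        worst = int(np.argmin(np.abs(t)))
        if abs(t[worst]) >= t_crit or len(keep) == 1:
            return [names[i] for i in keep], beta
        keep.pop(worst)

# Simulated example: only the first two of eight candidates truly matter.
# (A spurious regressor may occasionally survive at t_crit = 2.0.)
rng = np.random.default_rng(2)
n = 500
X = rng.normal(size=(n, 8))
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(size=n)
names = [f"x{i}" for i in range(8)]
print(general_to_specific(X, y, names))
```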
Argues that the ontological reduction of macroeconomics to microeconomics is untenable. Topics include the existence of macroeconomic aggregates, the microfoundations of macroeconomics, an examination of the general price level, and the limits of the scientific development of microeconomics.
The Methodology of Empirical Macroeconomics stakes out a pragmatic middle ground between traditional, prescriptive economic methodology and recent descriptive methodology. The former is sometimes seen as arrogantly telling economists how to do their work and the latter as irrelevant to their practice. The lectures are built around a case study of a concrete example of macroeconomic analysis. They demonstrate that economic methodology and the philosophy of science offer insights that help to resolve the genuine concerns of macroeconomists. Some examples of questions addressed include: What is the relationship between theoretical models and empirical observations? What is the relevance of macroeconomics to policy? Should macroeconomics be viewed as a special case of microeconomics? What is the place of long-standing philosophical issues in macroeconomics, such as the scope and nature of economic laws, the role of idealizations, methodological individualism, and the problem of causality?
Economics prefers complete explanations: general over partial equilibrium, microfoundational over aggregate. Similarly, probabilistic accounts of causation frequently prefer greater detail to less, as in typical resolutions of Simpson’s paradox. Strategies of causal refinement equally aim to distinguish direct from indirect causes. Yet there are countervailing practices in economics. Representative-agent models aim to capture economic motivation but not to reduce the level of aggregation. Small structural vector-autoregression and dynamic stochastic general-equilibrium models are practically preferred to larger ones. The distinction between exogenous and endogenous variables suggests partitioning the world into distinct subsystems. The tension in these practices is addressed within a structural account of causation inspired by the work of Herbert Simon, which defines cause with reference to complete systems and is adapted to deal with incomplete systems and piecemeal evidence. The focus is on understanding the constraints that a structural account of causation places on the freedom to model complex or lower-order systems as simpler or higher-order systems and on the degree to which piecemeal evidence can be incorporated into a structural account.
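For readers unfamiliar with Simpson’s paradox, the following toy example (a standard textbook set of counts, not taken from the paper) shows how an association can reverse between the pooled data and its subgroups, which is why probabilistic accounts of causation typically prefer the finer-grained description.

```python
# Toy Simpson's-paradox table: the treated/untreated comparison reverses
# once the data are disaggregated into subgroups.
# counts: (treated recovered, treated total, untreated recovered, untreated total)
groups = {
    "subgroup A": (81, 87, 234, 270),
    "subgroup B": (192, 263, 55, 80),
}

pooled = [0, 0, 0, 0]
for name, (tr, tn, ur, un) in groups.items():
    pooled = [p + v for p, v in zip(pooled, (tr, tn, ur, un))]
    print(f"{name}: treated {tr/tn:.0%} vs untreated {ur/un:.0%}")  # treated better in each subgroup

tr, tn, ur, un = pooled
print(f"pooled    : treated {tr/tn:.0%} vs untreated {ur/un:.0%}")  # untreated better when pooled
```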
The difficulty of conducting relevant experiments has long been regarded as the central challenge to learning about the economy from data. The standard solution, going back to Haavelmo's famous “The Probability Approach in Econometrics” (1944), involved two elements: first, it placed substantial weight on a priori theory as a source of structural information, reducing econometric estimates to measurements of causally articulated systems; second, it emphasized the need for an appropriate statistical model of the data. These elements are usually seen as tightly linked. I argue that they are, to a large extent, separable. Careful attention to the role of an empirically justified statistical model in underwriting probability explains puzzles not only in economics, but more generally with respect to recent criticisms of Reichenbach's principle of the common cause, which lies behind graph-theoretic causal search algorithms. And it provides an antidote to the pessimism of Nancy Cartwright and others about the possibilities for learning causal structure from passive observation in econometrics and related areas.
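A minimal sketch of the screening-off idea behind Reichenbach's principle, which graph-theoretic causal search algorithms exploit (my simulation, not the paper's): a common cause z induces a correlation between its effects x and y that vanishes once z is partialled out.

```python
# Screening off by a common cause: corr(x, y) is nonzero, corr(x, y | z) is ~0.
import numpy as np

rng = np.random.default_rng(3)
n = 5000
z = rng.normal(size=n)                 # common cause
x = 0.9 * z + rng.normal(size=n)       # effect 1
y = 0.7 * z + rng.normal(size=n)       # effect 2

def residualize(v, z):
    """Remove the linear influence of z from v (simple partialling out)."""
    zc = z - z.mean()
    slope = np.dot(zc, v - v.mean()) / np.dot(zc, zc)
    return v - v.mean() - slope * zc

print("corr(x, y)    :", round(np.corrcoef(x, y)[0, 1], 2))   # clearly nonzero
print("corr(x, y | z):", round(np.corrcoef(residualize(x, z), residualize(y, z))[0, 1], 2))  # ~ 0
```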
Modern empirical macroeconomic models, known as structural vector autoregressions (SVARs), are dynamic models that typically claim to represent a causal order among contemporaneously valued variables and to merely represent non-structural (reduced-form) co-occurrence between lagged variables and contemporaneous variables. The strategy is held to meet the minimal requirements for identifying the residual errors in particular equations in the model with independent, though otherwise not directly observable, exogenous causes (“shocks”) that ultimately account for change in the model. In nonstationary models, such shocks accumulate so that variables have discernible trends. Econometricians have conceived of variables that trend in sympathy with each other (so-called “cointegrated variables”) as sharing one or more of these unobserved trends as a common cause. Estimates of the values of both the otherwise unobservable individual shocks and the otherwise unobservable common trends can be backed out of cointegrated systems of equations. The issue addressed in this paper is whether and in what circumstances these values can be regarded as observations of real entities rather than merely artifacts of the representation of variables in the model. The issue is related, on the one hand, to practical methodological problems in the use of SVARs for policy analysis—e.g., does it make sense to estimate shocks or trends in one model and then use them as measures of variables in a conceptually distinct model? The issue is also related to debates in the philosophical analysis of causation—particularly, whether we are entitled, as assumed by the developers of Bayes-net approaches, to rely on the causal Markov condition (a generalization of Reichenbach’s common-cause condition) or whether cointegration generates a practical example of Nancy Cartwright’s “byproducts” objection to the causal Markov condition.
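The following sketch (my simulation, not one of the paper's models) illustrates the mechanics being discussed: two series driven by a single unobserved stochastic trend wander together, a suitable linear combination of them is stationary (cointegration), and a crude estimate of the common trend can be backed out that tracks the true, unobserved one.

```python
# Two cointegrated series sharing one unobserved common stochastic trend.
import numpy as np

rng = np.random.default_rng(4)
n = 400
trend = np.cumsum(rng.normal(size=n))        # unobserved common stochastic trend
x = trend + rng.normal(size=n)               # observed series 1
y = 2.0 * trend + rng.normal(size=n)         # observed series 2

coint_combo = y - 2.0 * x                    # cointegrating combination: the trend cancels
trend_hat = 0.5 * (x + 0.5 * y)              # naive "backed-out" estimate of the common trend

print("std of levels         :", round(np.std(x), 1), round(np.std(y), 1))     # large, trend-dominated
print("std of combination    :", round(np.std(coint_combo), 1))                # small, stationary
print("corr(trend, trend_hat):", round(np.corrcoef(trend, trend_hat)[0, 1], 3))  # close to 1
```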
Although the relevance and importance of his work have been recognized only belatedly, Charles Sanders Peirce was, throughout his life, a careful student and significant contributor to the development of logic, scientific theory, and philosophy generally. Occasionally, complete appreciation of Peirce's efforts has been hampered because his work is often unique and, at times, highly idiosyncratic. Yet, we hope to show in this paper that for one aspect of his work in logic Peirce did not abandon the ordinary without purpose. Only relatively recently have philosophers of science become interested in the logic of discovery - that is, in the logic of the selection of hypotheses to be tested rather than simply in the ways of testing hypotheses which have already been selected.
Recent debates over the nature of causation, causal inference, and the uses of causal models in counterfactual analysis, involving inter alia Nancy Cartwright (Hunting Causes and Using Them), James Woodward (Making Things Happen), and Judea Pearl (Causality), hinge on how causality is represented in models. Economists’ indigenous approach to causal representation goes back to the work of Herbert Simon with the Cowles Commission in the early 1950s. The paper explicates a scheme for the representation of causal structure, inspired by Simon, and shows how this representation sheds light on some important debates in the philosophy of causation. This structural account is compared to Woodward’s manipulability account. It is used to evaluate the recent debates – particularly, with respect to the nature of causal structure, the identity of causes, causal independence, and modularity. Special attention is given to modeling issues that arise in empirical economics.
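As a rough gloss on the Simon-inspired idea of reading causal order off a structural system (my sketch, covering only a fully recursive case with no simultaneous blocks), the snippet below records which variables each equation takes as direct causes and recovers the order in which the variables can be solved; the three-equation system is hypothetical.

```python
# Recursive causal ordering: solve causes before their effects (Python 3.9+).
from graphlib import TopologicalSorter

# Hypothetical three-equation system: x is exogenous, y depends on x, z on x and y.
direct_causes = {
    "x": set(),          # x = a
    "y": {"x"},          # y = b*x + c
    "z": {"x", "y"},     # z = d*x + e*y + f
}

order = list(TopologicalSorter(direct_causes).static_order())
print("causal (solution) order:", order)   # ['x', 'y', 'z']
```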
Through most of the history of economics, the most influential commentators on methodology were also eminent practitioners of economics. And even not so long ago, it was so. Milton Friedman, Paul Samuelson, Trygve Haavelmo, and Tjalling Koopmans were awarded Nobel prizes for their substantive contributions to economics, and were each important contributors to methodological thought. But the fashion has changed. Specialization has increased. Not only has methodology become its own field, but many practitioners have come to agree with Frank Hahn's (1992) view that methodology is a distraction to the practitioner, best left to the professional methodologists and philosophers, and of little practical import even when delivered from their pens. John Sutton's lectures, Marshall's Tendencies: What Economists Can Know, are a welcome return to the older fashion, for Sutton is an eminent practitioner of game theory and industrial organization. One of the main themes of these rich and nuanced lectures is the relationship of economic theory to econometric evidence. Sutton's reflections on econometrics appear to arise from the darker recesses of his practitioner's soul. While he affects a sunny disposition and ends on a hopeful note, his analysis articulates the lurking fear that econometrics is a hopeless project and that economics has little to learn from the interaction of theory and econometrics. Sutton's book is like a play in which virtue triumphs, but the villain gets all the good lines.
(2013). Beyond mechanical markets – asset price swings, risk and the role of the state. Journal of Economic Methodology, Vol. 20 (Methodology, Systemic Risk, and the Economics Profession), pp. 69-75. doi: 10.1080/1350178X.2013.774856.