  1. Roger E. Backhouse & Mary S. Morgan (2000). Introduction: Is Data Mining a Methodological Problem? Journal of Economic Methodology 7 (2):171-181.
    This survey of the symposium papers argues that the problem of data mining should be of interest to both practicing econometricians and specialists in economic methodology. After summarizing some of the main points to arise in the symposium, it draws on recent work in the philosophy of science to point to parallels between data mining and practices engaged in routinely by experimental scientists. These suggest that data mining might be seen in a more positive light than conventional doubts about it (...)
  2. Bradley W. Bateman (1987). Keynes's Changing Conception of Probability. Economics and Philosophy 3 (1):97-.
  3. J. Berkovitz (1995). What Econometrics Cannot Teach Quantum Mechanics. Studies in History and Philosophy of Science Part B 26 (2):163-200.
    Cartwright (1989) and Humphreys (1989) have suggested theories of probabilistic causation for singular events, which are based on modifications of traditional causal linear modelling. On the basis of her theory, Cartwright offered an allegedly local, and non-factorizable, common-cause model for the EPR experiment. In this paper I consider Cartwright's and Humphreys' theories. I argue that, provided plausible assumptions obtain, local models for EPR in the framework of these theories are committed to Bell inequalities, which are violated by experiment.
  4. Gregor Betz (2006). Prediction or Prophecy? The Boundaries of Economic Foreknowledge and Their Socio-Political Consequences. DUV.
    Gregor Betz explores the following questions: Where are the limits of economics, in particular the limits of economic foreknowledge? Are macroeconomic forecasts credible predictions or mere prophecies and what would this imply for the way economic policy decisions are taken? Is rational economic decision making possible without forecasting at all?
  5. Justin Bledin & Sharon Shewmake (2004). Research Programs, Model-Building and Actor-Network-Theory: Reassessing the Case of the Leontief Paradox. Journal of Economic Methodology 11 (4):455-476.
    Methodology of scientific research programs (MSRP), model-building and actor-network-theory (ANT) are woven together to provide a layered study of the Leontief paradox. Neil De Marchi's Lakatosian account examined the paradox within an Ohlin-Samuelson research program. A model-building approach rather highlights the ability of Leontief's input-output model to mediate between international trade theory and the world by facilitating an empirical application of the Heckscher-Ohlin Theorem. The epistemological implications of this model-building approach provide an alternative explanation of why Samuelson and other prominent (...)
  6. David Booth (1991). Review: Bernt P. Stigum, Toward a Formal Science of Economics. The Axiomatic Method in Economics and Econometrics. Journal of Symbolic Logic 56 (3):1102-1103.
  7. Nancy Cartwright & J. Reiss, Uncertainty in Econometrics: Evaluating Policy Counterfactuals.
  8. Filippo Cesarano (2006). Economic History and Economic Theory. Journal of Economic Methodology 13 (4):447-467.
    Since the mid-1950s the spread of formal models and econometric method has greatly improved the study of the past, giving rise to the 'new' economic history; at the same time, the influence of economic history on economists and economics has markedly declined. This paper argues that the contribution of history to the advancement of economics is still paramount, as is evident from the evolution of monetary theory and institutions. JEL classification: N01, A12.
  9. Hsiang‐Ke Chao (2005). A Misconception of the Semantic Conception of Econometrics? Journal of Economic Methodology 12 (1):125-135.
    Davis argues that Suppe's semantic conception provides a better understanding of the problem of theory-data confrontations. Applying his semantic methodology to the LSE (London School of Economics) approach of econometrics, he concludes that the LSE approach fails to address the issue of bridging the theory-data gap. This paper suggests two other versions of the semantic view of theories in the philosophy of science, due to Suppes and van Fraassen, and argues that the LSE approach can be construed under these two (...)
  10. Steven Cook (2005). On the Semantic Approach to Econometric Methodology. Journal of Economic Methodology 12 (1):117-123.
    In recent research, Davis (2005) has introduced the semantic conception of theories as a means of studying the differing practices of the Textbook and LSE approaches to econometric modelling. In this paper, Davis' (2005) use of the semantic view is examined, with close attention paid to the stated roles of the semantic notions of 'model dimensions' and 'bridging assumptions'. While comments concerning the latter are of a supportive nature, some concerns are raised in relation to Davis' use of model dimensions.
  11. Steven Cook (2003). A Kuhnian Perspective on Econometric Methodology. Journal of Economic Methodology 10 (1):59-78.
    While there exist numerous applications of Kuhn's analysis of scientific revolutions to economics, there is yet to be an application to econometrics. The present paper addresses this via an analysis of the often-documented transition between the textbook and LSE methodologies witnessed in British time series econometrics. This exercise allows a number of issues to be raised. First, it will be questioned whether the observed transition in econometrics is an appropriate subject for analysis within the Kuhnian framework. This is the primary (...)
  12. Steven Cook (2001). Observations on the Practice of Data-Mining: Comments on the JEM Symposium. Journal of Economic Methodology 8 (3):415-419.
    A positive view of data-mining has been recently presented in a Journal of Economic Methodology (JEM) symposium. This is in stark contrast to the stance normally taken. In this note consideration of the Bayesian philosophy of science literature and the impact of data revision extends the analysis of data-mining. Introduction of these issues is seen to provide support for the arguments presented in the JEM symposium.
  13. Steven Cook (1999). Methodological Aspects of the Encompassing Principle. Journal of Economic Methodology 6 (1):61-78.
    The philosophy of science literature has played an increasing role in discussion of econometric methodology in recent years, and the Hendry methodology in particular has received much attention. Despite this, the encompassing principle has been overlooked in the methodological literature. This paper addresses this by examining the major methodological implications of the principle.
  14. George C. Davis (2005). A Rejoinder to Cook and Response to Chao: Moving the Textbook/LSE Debate Forward. Journal of Economic Methodology 12 (1):137-147.
    The reply by Cook and comment by Chao demonstrate Kuhn's thesis that different scientists place different values on different components of their common discipline. This fact is demonstrated by first succinctly summarizing Cook's and my original points within the framework of a simple choice model. I then respond to Cook and Chao. I close by offering some suggestions on how the Textbook/LSE debate could be moved forward.
  15. George C. Davis (2005). Clarifying the 'Puzzle' Between the Textbook and LSE Approaches to Econometrics: A Comment on Cook's Kuhnian Perspective on Econometric Modelling. Journal of Economic Methodology 12 (1):93-115.
    In a recent article, Cook conducted a Kuhnian analysis of the difference between the Textbook and LSE econometric approaches. This paper uses a semantic conception of theories (Suppe 1989) and a finer gradation of the theory reduction process to clarify the apparent puzzle that exists between the Textbook and LSE approaches to econometrics. The paper demonstrates that a Kuhnian analysis in isolation can be more misleading than realized.
  16. George C. Davis (2000). A Semantic Interpretation of Haavelmo's Structure of Econometrics. Economics and Philosophy 16 (2):205-228.
    Trygve Haavelmo's 1944 article ‘The Probability Approach in Econometrics’ is considered by most to have provided the foundations for present day econometrics (Morgan, 1990, Chapters 8 and 9). Since Haavelmo (1944), extraordinary advances have been made in econometrics. However, over the last two decades the efficacy and scientific status of econometrics have become questionable. Not surprisingly, the growing discontent with econometrics has been accompanied by a growing interest in econometric methodology.
  17. Tom Engsted (2009). Statistical Vs. Economic Significance in Economics and Econometrics: Further Comments on McCloskey and Ziliak. Journal of Economic Methodology 16 (4):393-408.
    I comment on the controversy between McCloskey and Ziliak and Hoover and Siegler on statistical versus economic significance, in the March 2008 issue of the Journal of Economic Methodology. I argue that while McCloskey and Ziliak are right in emphasizing 'real error', i.e. non-sampling error that cannot be eliminated through specification testing, they fail to acknowledge those areas in economics, e.g. rational expectations macroeconomics and asset pricing, where researchers clearly distinguish between statistical and economic significance and where statistical testing plays (...)
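As an aside on the distinction Engsted (and entry 31 below) discusses, the following minimal sketch simulates a hypothetical regression in which a coefficient is statistically significant by any conventional standard yet economically negligible. It is not drawn from either paper; the sample size, effect size and variable names are all made up for illustration.

```python
import numpy as np

# Minimal sketch, assuming a made-up data-generating process: a huge sample
# with a tiny true effect. Statistically significant, economically negligible.
rng = np.random.default_rng(0)
n = 1_000_000                       # very large sample
x = rng.normal(size=n)              # hypothetical regressor, mean 0, sd 1
beta_true = 0.01                    # tiny effect: 1% of a standard deviation
y = beta_true * x + rng.normal(size=n)

# OLS slope and its textbook standard error (no intercept needed here:
# both variables are mean zero by construction)
beta_hat = np.cov(x, y, bias=True)[0, 1] / np.var(x)
resid = y - beta_hat * x
se = np.sqrt(resid.var() / (n * x.var()))
t_stat = beta_hat / se

print(f"beta_hat = {beta_hat:.4f}, se = {se:.4f}, t = {t_stat:.1f}")
# Typical output: t around 10, "significant" at any conventional level,
# yet a one-sd change in x moves y by only about 1% of its own sd.
```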
  18. Susan Feigenbaum & David M. Levy (1993). The Market for (Ir)Reproducible Econometrics. Social Epistemology 7 (3):215-232.
  19. Damien Fennell (2007). Why Functional Form Matters: Revealing the Structure in Structural Models in Econometrics. Philosophy of Science 74 (5):1033-1045.
    This paper argues that econometricians' explicit adoption of identification conditions in structural equation modelling commits them to read the functional form of their equations in a strong, nonmathematical way. This content, which is implicitly attributed to the functional form of structural equations, is part of what makes an equation structural. Unfortunately, econometricians are not explicit about the role functional form plays in signifying structural content. In order to remedy this, the second part of this paper presents an interpretation of the functional (...)
  20. Franklin M. Fisher (1969). Causation and Specification in Economic Theory and Econometrics. Synthese 20 (4):489-500.
  21. Maria Carla Galavotti (1990). Explanation and Causality: Some Suggestions From Econometrics. Topoi 9 (2):161-169.
  22. Clark Glymour (1985). Interpreting Leamer. Economics and Philosophy 1 (2):290-.
  23. Robert S. Goldfarb, H. O. Stekler & Joel David (2005). Methodological Issues in Forecasting: Insights From the Egregious Business Forecast Errors of Late 1930. Journal of Economic Methodology 12 (4):517-542.
    This paper examines some economic forecasts made in late 1930 that were intended to predict economic activity in the United States, in order to shed light on several methodological issues. We document that these forecasts were extremely optimistic, predicting that the recession in the US would soon end, and that 1931 would show a recovery. These forecasts displayed egregious errors, because 1931 witnessed the largest negative growth rate for the US economy in any year in the twentieth century. A specific (...)
  24. Wenceslao J. González (1996). On the Theoretical Basis of Prediction in Economics. Journal of Social Philosophy 27 (3):201-228.
  25. Clinton A. Greene (2000). I Am Not, nor Have I Ever Been a Member of a Data-Mining Discipline. Journal of Economic Methodology 7 (2):217-230.
    This paper argues classical statistics and standard econometrics are based on a desire to meet scientific standards for accumulating reliable knowledge. Science requires two inputs, mining of existing data for inspiration and new or 'out-of-sample' data for predictive testing. Avoidance of data-mining is neither possible nor desirable. In economics out-of-sample data is relatively scarce, so the production process should intensively exploit the existing data. But the two inputs should be thought of as complements rather than substitutes. And we neglect the (...)
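The complementarity Greene describes, mining the existing data for a specification and then judging it on data that played no part in the search, can be illustrated with a toy simulation. This sketch is mine, not Greene's; every variable is pure noise by construction, so any "discovered" relation is spurious and should fail the predictive test.

```python
import numpy as np

# Toy simulation: 20 candidate regressors and an outcome unrelated to all
# of them, so any in-sample "discovery" is spurious by construction.
rng = np.random.default_rng(1)
n, k = 200, 20
X = rng.normal(size=(n, k))
y = rng.normal(size=n)

# First half of the sample is for mining, second half for predictive testing
half = n // 2
X_in, y_in, X_out, y_out = X[:half], y[:half], X[half:], y[half:]

# "Mine" the existing data: keep whichever regressor correlates best in-sample
corrs = [abs(np.corrcoef(X_in[:, j], y_in)[0, 1]) for j in range(k)]
best = int(np.argmax(corrs))

# Fit the mined specification on the mining sample
b = np.cov(X_in[:, best], y_in, bias=True)[0, 1] / np.var(X_in[:, best])
a = y_in.mean() - b * X_in[:, best].mean()
in_r2 = 1 - np.var(y_in - (a + b * X_in[:, best])) / np.var(y_in)

# Predictive test on data that played no role in the search
pred = a + b * X_out[:, best]
oos_r2 = 1 - np.mean((y_out - pred) ** 2) / np.var(y_out)

print(f"in-sample R^2 after search: {in_r2:.3f}")
print(f"out-of-sample R^2:          {oos_r2:.3f}")
# The first number is positive by construction (we searched for it); the
# second is typically near zero or negative: the mined relation fails the
# predictive test, which is why the two uses of data are complements.
```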
  26. Alastair R. Hall & Fernanda P. M. Peixe (2000). Data Mining and the Selection of Instruments. Journal of Economic Methodology 7 (2):265-277.
    Instrumental variables estimation is widely applied in econometrics. To implement the method, it is necessary to specify a vector of instruments. In this paper, it is argued that there are compelling reasons to use the data for instrument selection, but that it is desirable to ensure the resulting estimator still behaves in the way predicted by standard textbook theory. These arguments lead one to propose three criteria for data-based instrument selection. The remainder of the paper assesses the extent (...)
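For readers unfamiliar with the estimator whose instruments Hall and Peixe discuss selecting, here is a self-contained two-stage least squares sketch with one common data-based relevance diagnostic, a first-stage F statistic. It illustrates only the general setup under an assumed simulated design, not the paper's three selection criteria.

```python
import numpy as np

# Hypothetical data-generating process: x is endogenous (correlated with the
# structural error u), z supplies two valid instruments, and the true slope is 1.0.
rng = np.random.default_rng(2)
n = 500
z = rng.normal(size=(n, 2))                 # candidate instruments
u = rng.normal(size=n)                      # structural error
x = z[:, 0] + 0.2 * z[:, 1] + u + rng.normal(size=n)
y = 1.0 * x + u

def add_const(m):
    # Prepend a column of ones to a 1-D or 2-D regressor array
    return np.column_stack([np.ones(len(m)), m])

# First stage: regress x on the instruments and keep the fitted values
Z = add_const(z)
pi = np.linalg.lstsq(Z, x, rcond=None)[0]
x_hat = Z @ pi

# A common data-based relevance check: the first-stage F statistic
rss0 = np.sum((x - x.mean()) ** 2)          # restricted: constant only
rss1 = np.sum((x - x_hat) ** 2)             # unrestricted: with instruments
F = ((rss0 - rss1) / 2) / (rss1 / (n - 3))

# Second stage: regress y on the fitted values (the 2SLS point estimate)
beta_2sls = np.linalg.lstsq(add_const(x_hat), y, rcond=None)[0]
beta_ols = np.linalg.lstsq(add_const(x), y, rcond=None)[0]   # biased here

print(f"first-stage F: {F:.1f}")
print(f"2SLS slope: {beta_2sls[1]:.2f}, OLS slope: {beta_ols[1]:.2f} (true: 1.0)")
```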
  27. Bernd Hayo (1998). Simplicity in Econometric Modelling: Some Methodological Considerations. Journal of Economic Methodology 5 (2):247-261.
    It is shown how simplicity in econometric modelling can be defended from three different methodological positions: a 'traditional scientific', a rhetorical and a hermeneutical one. Moreover, it is argued that the claim of methodological superiority by supporters of general-to-specific modelling is largely rhetoric. In practice there does not exist a viable alternative to simple modelling in empirical economics.
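Several entries above (Cook, Davis, Hayo) turn on the contrast between simple "Textbook" specifications and LSE-style general-to-specific reduction, so a toy sketch of the latter may help. This shows only the backwards-elimination core under an assumed |t| >= 2 retention rule; the actual LSE methodology also relies on diagnostic, encompassing and out-of-sample tests, none of which are modelled here.

```python
import numpy as np

def ols_t_stats(X, y):
    """OLS coefficients and conventional t-statistics for design matrix X."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - k)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta, beta / np.sqrt(np.diag(cov))

def general_to_specific(X, y, names, t_min=2.0):
    """Repeatedly drop the least significant regressor while its |t| < t_min."""
    keep = list(range(X.shape[1]))
    while keep:
        _, t = ols_t_stats(X[:, keep], y)
        worst = int(np.argmin(np.abs(t)))
        if abs(t[worst]) >= t_min:
            break
        del keep[worst]
    return [names[j] for j in keep]

# Hypothetical general model with six candidate regressors; only two matter.
rng = np.random.default_rng(3)
n = 300
X = rng.normal(size=(n, 6))
y = 1.5 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(size=n)

print(general_to_specific(X, y, [f"x{j}" for j in range(6)]))
# Typically prints ['x0', 'x1']: the deliberately over-general starting model
# is reduced step by step to a simple one.
```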
  28. Kevin D. Hoover (2002). Symposium on Marshall's Tendencies: 5 Sutton's Critique of Econometrics. Economics and Philosophy 18 (1):45-54.
    Through most of the history of economics, the most influential commentators on methodology were also eminent practitioners of economics. And even not so long ago, it was so. Milton Friedman, Paul Samuelson, Trygve Haavelmo, and Tjalling Koopmans were awarded Nobel prizes for their substantive contributions to economics, and were each important contributors to methodological thought. But the fashion has changed. Specialization has increased. Not only has methodology become its own field, but many practitioners have come to agree with Frank Hahn's (...)
  29. Kevin D. Hoover (1990). The Logic of Causal Inference: Econometrics and the Conditional Analysis of Causation. Economics and Philosophy 6 (2):207-.
  30. Kevin D. Hoover & Stephen J. Perez (2000). Three Attitudes Towards Data Mining. Journal of Economic Methodology 7 (2):195-210.
    'Data mining' refers to a broad class of activities that have in common a search over different ways to process or package data statistically or econometrically, with the purpose of making the final presentation meet certain design criteria. We characterize three attitudes toward data mining: first, that it is to be avoided and, if it is engaged in, that statistical inferences must be adjusted to account for it; second, that it is inevitable and that the only results of any interest (...)
  31. Kevin D. Hoover & Mark V. Siegler (2008). Sound and Fury: McCloskey and Significance Testing in Economics. Journal of Economic Methodology 15 (1):1-37.
    For more than 20 years, Deirdre McCloskey has campaigned to convince the economics profession that it is hopelessly confused about statistical significance. She argues that many practices associated with significance testing are bad science and that most economists routinely employ these bad practices: 'Though to a child they look like science, with all that really hard math, no science is being done in these and 96 percent of the best empirical economics ...' (McCloskey 1999). McCloskey's charges are analyzed and rejected. (...)