  1. Roger E. Backhouse & Mary S. Morgan (2000). Introduction: Is Data Mining a Methodological Problem? Journal of Economic Methodology 7 (2):171-181.
    This survey of the symposium papers argues that the problem of data mining should be of interest to both practicing econometricians and specialists in economic methodology. After summarizing some of the main points to arise in the symposium, it draws on recent work in the philosophy of science to point to parallels between data mining and practices engaged in routinely by experimental scientists. These suggest that data mining might be seen in a more positive light than conventional doubts about it (...)
  2. Bradley W. Bateman (1987). Keynes's Changing Conception of Probability. Economics and Philosophy 3 (01):97-.
  3. J. Berkovitz (1995). What Econometrics Cannot Teach Quantum Mechanics. Studies in History and Philosophy of Science Part B 26 (2):163-200.
    Cartwright (1989) and Humphreys (1989) have suggested theories of probabilistic causation for singular events, which are based on modifications of traditional causal linear modelling. On the basis of her theory, Cartwright offered an allegedly local, and non-factorizable, common-cause model for the EPR experiment. In this paper I consider Cartwright's and Humphreys's theories. I argue that, provided plausible assumptions obtain, local models for EPR in the framework of these theories are committed to Bell inequalities, which are violated by experiment.
  4. Gregor Betz (2006). Prediction or Prophecy? The Boundaries of Economic Foreknowledge and Their Socio-Political Consequences. DUV.
    Gregor Betz explores the following questions: Where are the limits of economics, in particular the limits of economic foreknowledge? Are macroeconomic forecasts credible predictions or mere prophecies and what would this imply for the way economic policy decisions are taken? Is rational economic decision making possible without forecasting at all?
  5. Justin Bledin & Sharon Shewmake (2004). Research Programs, Model-Building and Actor-Network-Theory: Reassessing the Case of the Leontief Paradox. Journal of Economic Methodology 11 (4):455-476.
    Methodology of scientific research programs (MSRP), model-building and actor-network-theory (ANT) are woven together to provide a layered study of the Leontief paradox. Neil De Marchi's Lakatosian account examined the paradox within an Ohlin-Samuelson research program. A model-building approach instead highlights the ability of Leontief's input-output model to mediate between international trade theory and the world by facilitating an empirical application of the Heckscher-Ohlin Theorem. The epistemological implications of this model-building approach provide an alternative explanation of why Samuelson and other prominent (...)
  6. David Booth (1991). Review: Bernt P. Stigum, Toward a Formal Science of Economics. The Axiomatic Method in Economics and Econometrics. [REVIEW] Journal of Symbolic Logic 56 (3):1102-1103.
  7. Nancy Cartwright & J. Reiss, Uncertainty in Econometrics: Evaluating Policy Counterfactuals.
  8. Filippo Cesarano (2006). Economic History and Economic Theory. Journal of Economic Methodology 13 (4):447-467.
    Since the mid-1950s the spread of formal models and econometric method has greatly improved the study of the past, giving rise to the 'new' economic history; at the same time, the influence of economic history on economists and economics has markedly declined. This paper argues that the contribution of history to the advancement of economics is still paramount, as is evident from the evolution of monetary theory and institutions. JEL classification: N01, A12.
  9. Hsiang‐Ke Chao (2005). A Misconception of the Semantic Conception of Econometrics? Journal of Economic Methodology 12 (1):125-135.
    Davis argues that Suppe's semantic conception provides a better understanding of the problem of theory-data confrontations. Applying his semantic methodology to the LSE (London School of Economics) approach of econometrics, he concludes that the LSE approach fails to address the issue of bridging the theory-data gap. This paper suggests two other versions of the semantic view of theories in the philosophy of science, due to Suppes and van Fraassen, and argues that the LSE approach can be construed under these two (...)
  10. Steven Cook (2005). On the Semantic Approach to Econometric Methodology. Journal of Economic Methodology 12 (1):117-123.
    In recent research, Davis (2005) has introduced the semantic conception of theories as a means of studying the differing practices of the Textbook and LSE approaches to econometric modelling. In this paper, Davis' (2005) use of the semantic view is examined, with close attention paid to the stated roles of the semantic notions of 'model dimensions' and 'bridging assumptions'. While comments concerning the latter are of a supportive nature, some concerns are raised in relation to Davis' use of model dimensions.
  11. Steven Cook (2003). A Kuhnian Perspective on Econometric Methodology. Journal of Economic Methodology 10 (1):59-78.
    While there exist numerous applications of Kuhn's analysis of scientific revolutions to economics, there is yet to be an application to econometrics. The present paper addresses this via an analysis of the often-documented transition between the textbook and LSE methodologies witnessed in British time series econometrics. This exercise allows a number of issues to be raised. First, it will be questioned whether the observed transition in econometrics is an appropriate subject for analysis within the Kuhnian framework. This is the primary (...)
  12. Steven Cook (2001). Observations on the Practice of Data-Mining: Comments on the JEM Symposium. Journal of Economic Methodology 8 (3):415-419.
    A positive view of data-mining has been recently presented in a Journal of Economic Methodology (JEM) symposium. This is in stark contrast to the stance normally taken. In this note, consideration of the Bayesian philosophy of science literature and of the impact of data revision extends the analysis of data-mining. Introduction of these issues is seen to provide support for the arguments presented in the JEM symposium.
  13. Steven Cook (1999). Methodological Aspects of the Encompassing Principle. Journal of Economic Methodology 6 (1):61-78.
    The philosophy of science literature has played an increasing role in discussion of econometric methodology in recent years, and the Hendry methodology in particular has received much attention. Despite this, the encompassing principle has been overlooked in the methodological literature. This paper addresses that omission by examining the major methodological implications of the principle.
  14. A. C. Darnell (2001). A Review of Jan R. Magnus and Mary S. Morgan's Methodology and Tacit Knowledge: Two Experiments in Econometrics. [REVIEW] Journal of Economic Methodology 8 (2):344-348.
  15. George C. Davis (2005). Clarifying the 'Puzzle' Between the Textbook and LSE Approaches to Econometrics: A Comment on Cook's Kuhnian Perspective on Econometric Modelling. Journal of Economic Methodology 12 (1):93-115.
    In a recent article, Cook conducted a Kuhnian analysis of the difference between the Textbook and LSE econometric approaches. This paper uses a semantic conception of theories (Suppe 1989) and a finer gradation of the theory of reduction process to clarify the apparent puzzle that exists between the Textbook and LSE approaches to econometrics. The paper demonstrates that a Kuhnian analysis in isolation can be more misleading than realized.
  16. George C. Davis (2005). A Rejoinder to Cook and Response to Chao: Moving the Textbook/LSE Debate Forward. Journal of Economic Methodology 12 (1):137-147.
    The reply by Cook and comment by Chao demonstrate Kuhn's thesis that different scientists place different values on different components of their common discipline. This fact is demonstrated by first succinctly summarizing Cook's and my original points within the framework of a simple choice model. I then respond to Cook and Chao. I close by offering some suggestions on how the Textbook/LSE debate could be moved forward.
  17. George C. Davis (2000). A Semantic Interpretation of Haavelmo's Structure of Econometrics. Economics and Philosophy 16 (2):205-228.
    Trygve Haavelmo's 1944 article ‘The Probability Approach in Econometrics’ is considered by most to have provided the foundations for present day econometrics (Morgan, 1990, Chapters 8 and 9). Since Haavelmo (1944), extraordinary advances have been made in econometrics. However, over the last two decades the efficacy and scientific status of econometrics have become questionable. Not surprisingly, the growing discontent with econometrics has been accompanied by a growing interest in econometric methodology.
  18. Michel de Vroey (1999). Equilibrium and Disequilibrium in Economic Theory: A Confrontation of the Classical, Marshallian and Walras-Hicksian Conceptions. Economics and Philosophy 15 (02):161-185.
    When the economic theory of the last decades becomes a subject of reflection for historians of economic theory, a striking feature which they will have to explain is the demise of the disequilibrium concept. Previously, economists had no qualms concerning the view that the market or the economy was exhibiting disequilibria. Amongst many possible quotations, the following, drawn from Viner's well-known article on Marshall, illustrates this:
  19. David Dearmont & David A. Bessler (1997). A Bayesian Treatment of Duhem's Thesis: The Case of the 'Farm Problem' in Agricultural Economics. Economics and Philosophy 13 (2):149-158.
    In this paper we consider a Bayesian treatment of ‘Duhem's thesis’, the proposition that theories are never refuted on empirical grounds because they cannot be tested in isolation from auxiliary hypotheses about initial conditions or the operation of scientific instruments. Sawyer, Beed, and Sankey (1997) consider Duhem's thesis (and its restatement in stronger and weaker forms as the ‘Duhem-Quine thesis’) and its role in hypothesis testing, using four theories from economics and finance as examples. Here we consider Duhem's thesis in (...)
  20. Tom Engsted (2009). Statistical Vs. Economic Significance in Economics and Econometrics: Further Comments on McCloskey and Ziliak. Journal of Economic Methodology 16 (4):393-408.
    I comment on the controversy between McCloskey and Ziliak and Hoover and Siegler on statistical versus economic significance, in the March 2008 issue of the Journal of Economic Methodology. I argue that while McCloskey and Ziliak are right in emphasizing 'real error', i.e. non-sampling error that cannot be eliminated through specification testing, they fail to acknowledge those areas in economics, e.g. rational expectations macroeconomics and asset pricing, where researchers clearly distinguish between statistical and economic significance and where statistical testing plays (...)
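
The statistical-versus-economic-significance distinction that Engsted (and entries 37 and 39 below) discusses can be made concrete with a small simulation. The sketch below is a generic illustration of the point, not an example taken from any of these papers; the data-generating process and numbers are hypothetical.

```python
import numpy as np

# Statistical vs. economic significance: with a large enough sample, a tiny
# true effect is estimated precisely enough to be "statistically significant"
# at any conventional level, yet its magnitude may be economically negligible.
rng = np.random.default_rng(2)
n = 2_000_000
true_beta = 0.005                    # economically trivial effect size

x = rng.normal(size=n)
y = true_beta * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta
se = np.sqrt((resid @ resid / (n - 2)) * np.linalg.inv(X.T @ X)[1, 1])

print("estimate:", round(beta[1], 4), " t-statistic:", round(beta[1] / se, 1))
# The t-statistic (roughly 7 in this design) comfortably rejects beta = 0,
# but a one standard deviation change in x still moves y by only about 0.005
# standard deviations: significant in the statistical sense, negligible in
# the economic one.
```
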
  21. Susan Feigenbaum & David M. Levy (1993). The Market for (Ir)Reproducible Econometrics. Social Epistemology 7 (3):215–232.
  22. Damien Fennell (2007). Why Functional Form Matters: Revealing the Structure in Structural Models in Econometrics. Philosophy of Science 74 (5):1033-1045.
    This paper argues that econometricians' explicit adoption of identification conditions in structural equation modelling commits them to read the functional form of their equations in a strong, nonmathematical way. This content, which is implicitly attributed to the functional form of structural equations, is part of what makes an equation structural. Unfortunately, econometricians are not explicit about the role functional form plays in signifying structural content. In order to remedy this, the second part of this paper presents an interpretation of the functional (...)
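
For readers unfamiliar with the structural/reduced-form contrast that Fennell's argument turns on, the sketch below writes out a textbook linear supply-and-demand system; the specific equations are a standard illustration and are not taken from the paper.

```latex
% A textbook linear supply-and-demand system (illustrative only).
% Structural form: each equation is read as an autonomous mechanism.
\begin{align}
  q_t &= \alpha_0 + \alpha_1 p_t + \alpha_2 y_t + u_t && \text{(demand)}\\
  q_t &= \beta_0  + \beta_1 p_t  + \beta_2 w_t + v_t  && \text{(supply)}
\end{align}
% Reduced form: solve for the endogenous variables (p_t, q_t) in terms of the
% exogenous ones (y_t, w_t).  The \pi's are mixtures of the structural
% parameters and describe no single mechanism.
\begin{align}
  p_t &= \pi_{10} + \pi_{11} y_t + \pi_{12} w_t + \varepsilon_{1t}\\
  q_t &= \pi_{20} + \pi_{21} y_t + \pi_{22} w_t + \varepsilon_{2t}
\end{align}
% Identification conditions (e.g. y_t excluded from supply, w_t from demand)
% are what allow the structural coefficients to be recovered from the \pi's.
```
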
  23. Franklin M. Fisher (1969). Causation and Specification in Economic Theory and Econometrics. Synthese 20 (4):489 - 500.
  24. Maria Carla Galavotti (1990). Explanation and Causality: Some Suggestions From Econometrics. Topoi 9 (2):161-169.
  25. Clark Glymour (1985). Interpreting Leamer. Economics and Philosophy 1 (02):290-.
  26. Robert S. Goldfarb, H. O. Stekler & Joel David (2005). Methodological Issues in Forecasting: Insights From the Egregious Business Forecast Errors of Late 1930. Journal of Economic Methodology 12 (4):517-542.
    This paper examines some economic forecasts made in late 1930 that were intended to predict economic activity in the United States in order to shed light on several methodological issues. We document that these forecasts were extremely optimistic, predicting that the recession in the US would soon end, and that 1931 would show a recovery. These forecasts displayed egregious errors, because 1931 witnessed the largest negative growth rate for the US economy in any year in the twentieth century. A specific (...)
  27. Wenceslao J. González (1996). On the Theoretical Basis of Prediction in Economics. Journal of Social Philosophy 27 (3):201-228.
  28. C. W. J. Granger (2001). A Review of Jan R. Magnus and Mary S. Morgan's Methodology and Tacit Knowledge: Two Experiments in Econometrics. [REVIEW] Journal of Economic Methodology 8 (2):339-343.
  29. Clinton A. Greene (2000). I Am Not, nor Have I Ever Been a Member of a Data-Mining Discipline. Journal of Economic Methodology 7 (2):217-230.
    This paper argues that classical statistics and standard econometrics are based on a desire to meet scientific standards for accumulating reliable knowledge. Science requires two inputs: mining of existing data for inspiration, and new or 'out-of-sample' data for predictive testing. Avoidance of data-mining is neither possible nor desirable. In economics out-of-sample data is relatively scarce, so the production process should intensively exploit the existing data. But the two inputs should be thought of as complements rather than substitutes. And we neglect the (...)
  30. Alastair R. Hall & Fernanda P. M. Peixe (2000). Data Mining and the Selection of Instruments. Journal of Economic Methodology 7 (2):265-277.
    Instrumental variables estimation is widely applied in econometrics. To implement the method, it is necessary to specify a vector of instruments. In this paper, it is argued that there are compelling reasons to use the data for instrument selection, but that it is desirable to ensure the resulting estimator still behaves in the way predicted by standard textbook theory. These arguments lead one to propose three criteria for data based instrument selection. The remainder of the paper assesses the extent (...)
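
As background to the instrument-selection question Hall and Peixe raise, here is a minimal two-stage least squares sketch in which the instrument vector is simply taken as given; it does not implement the authors' data-based selection criteria, and the data-generating process is hypothetical.

```python
import numpy as np

# Minimal two-stage least squares (2SLS) with a given instrument vector Z.
# Hall and Peixe's data-based selection criteria are NOT implemented here;
# the instruments are simply taken as given for illustration.
rng = np.random.default_rng(0)
n = 5_000

z = rng.normal(size=(n, 2))                  # two exogenous instruments
common = rng.normal(size=n)                  # source of endogeneity
x = z @ np.array([1.0, 0.5]) + common + rng.normal(size=n)
y = 2.0 * x + common + rng.normal(size=n)    # true slope on x is 2.0

X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])

beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]        # biased by endogeneity
X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]       # first stage: project X on Z
beta_2sls = np.linalg.lstsq(X_hat, y, rcond=None)[0]   # second stage

print("OLS slope: ", round(beta_ols[1], 3))   # noticeably above 2.0
print("2SLS slope:", round(beta_2sls[1], 3))  # close to 2.0
```
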
  31. Bernd Hayo (1998). Simplicity in Econometric Modelling: Some Methodological Considerations. Journal of Economic Methodology 5 (2):247-261.
    It is shown how simplicity in econometric modelling can be defended from three different methodological positions: a 'traditional scientific', a rhetorical and a hermeneutical one. Moreover, it is argued that the claim of methodological superiority by supporters of general-to-specific modelling is largely rhetoric. In practice there does not exist a viable alternative to simple modelling in empirical economics.
  32. Kevin D. Hoover (2006). Fragility and Robustness in Econometrics: Introduction to the Symposium. Journal of Economic Methodology 13 (2):159-160.
  33. Kevin D. Hoover (2002). Symposium on Marshall's Tendencies: 5 Sutton's Critique of Econometrics. Economics and Philosophy 18 (1):45-54.
    Through most of the history of economics, the most influential commentators on methodology were also eminent practitioners of economics. And even not so long ago, it was so. Milton Friedman, Paul Samuelson, Trygve Haavelmo, and Tjalling Koopmans were awarded Nobel prizes for their substantive contributions to economics, and were each important contributors to methodological thought. But the fashion has changed. Specialization has increased. Not only has methodology become its own field, but many practitioners have come to agree with Frank Hahn's (...)
  34. Kevin D. Hoover (1994). Econometrics as Observation: The Lucas Critique and the Nature of Econometric Inference. Journal of Economic Methodology 1 (1):65-80.
  35. Kevin D. Hoover (1990). The Logic of Causal Inference: Econometrics and the Conditional Analysis of Causation. Economics and Philosophy 6 (02):207-.
  36. Kevin D. Hoover & Stephen J. Perez (2000). Three Attitudes Towards Data Mining. Journal of Economic Methodology 7 (2):195-210.
    'Data mining' refers to a broad class of activities that have in common a search over different ways to process or package data statistically or econometrically with the purpose of making the final presentation meet certain design criteria. We characterize three attitudes toward data mining: first, that it is to be avoided and, if it is engaged in, that statistical inferences must be adjusted to account for it; second, that it is inevitable and that the only results of any interest (...)
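
The kind of search the first attitude worries about is easy to reproduce. The toy sketch below is my own construction, not Hoover and Perez's simulations: it regresses pure noise on each of many irrelevant candidate regressors and reports only the best t-statistic, showing why unadjusted inference after a specification search is misleading.

```python
import numpy as np

# Toy specification search: regress pure noise on each of many irrelevant
# candidate regressors and keep only the largest absolute t-statistic.
rng = np.random.default_rng(1)
n, k = 100, 50                       # observations, candidate regressors

y = rng.normal(size=n)               # outcome unrelated to every candidate
candidates = rng.normal(size=(n, k))

best_t = 0.0
for j in range(k):
    X = np.column_stack([np.ones(n), candidates[:, j]])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    se = np.sqrt((resid @ resid / (n - 2)) * np.linalg.inv(X.T @ X)[1, 1])
    best_t = max(best_t, abs(beta[1] / se))

# With 50 independent tries the best |t| routinely exceeds the usual 1.96
# cut-off, even though no candidate has any true relation to y.
print("largest |t| over", k, "specifications:", round(best_t, 2))
```
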
  37. Kevin D. Hoover & Mark V. Siegler (2008). Sound and Fury: McCloskey and Significance Testing in Economics. Journal of Economic Methodology 15 (1):1-37.
    For more than 20 years, Deirdre McCloskey has campaigned to convince the economics profession that it is hopelessly confused about statistical significance. She argues that many practices associated with significance testing are bad science and that most economists routinely employ these bad practices: 'Though to a child they look like science, with all that really hard math, no science is being done in these and 96 percent of the best empirical economics ...' (McCloskey 1999). McCloskey's charges are analyzed and rejected. (...)
  38. Edward E. Leamer (1985). Self-Interpretation. Economics and Philosophy 1 (02):295-.
    My essay “Let's Take the Con out of Econometrics” is intended to be an amusing, titillating, and even annoying distillation of ideas that I have published in a more formal, academic style in many different locations over the course of several years. As far as I could tell, these ideas were widely ignored until I adopted the more contentious style of “Con,” which, since its publication two years ago, has been reprinted in two volumes and excerpted in two others. There (...)
  39. Chee Kian Leong (2009). The Cult of Statistical Significance: How the Standard Error Costs Us Jobs, Justice, and Lives. Journal of Economic Methodology 16 (4):431-434.
  40. Thomas Mayer (2000). Data Mining: A Reconsideration. Journal of Economic Methodology 7 (2):183-194.
    Data mining occurs because most economic hypotheses do not have a unique empirical interpretation but allow the econometrician much leeway in selecting conditioning variables, lags, functional forms, and sometimes the sample. The resulting problems are of interest not only to methodologists and philosophers concerned with how hypotheses are validated in the presence of some inevitable ad hocery, but also to readers of economics journals who have no interest in methodology but need to know whether to believe what they read. Since (...)
  41. Alessio Moneta (2005). Causality in Macroeconometrics: Some Considerations About Reductionism and Realism. Journal of Economic Methodology 12 (3):433-453.
    This paper investigates the varieties of reductionism and realism about causal relations in macroeconometrics. There are two issues, which are kept distinct in the analysis but which are interrelated in the development of econometrics. The first one is the question of the reducibility of causal relations to regularities, measured in statistics by correlations. The second one is the question of the reducibility of causes among macroeconomic aggregates to microeconomic behaviour. It is argued that there is a continuum of possible positions (...)
  42. Alessio Moneta & Federica Russo (2014). Causal Models and Evidential Pluralism in Econometrics. Journal of Economic Methodology 21 (1):54-76.
    Social research, from economics to demography and epidemiology, makes extensive use of statistical models in order to establish causal relations. The question arises as to what guarantees the causal interpretation of such models. In this paper we focus on econometrics and advance the view that causal models are ‘augmented’ statistical models that incorporate important causal information which contributes to their causal interpretation. The primary objective of this paper is to argue that causal claims are established on the basis of a (...)
  43. Jonathan Perraton (2011). Explaining Growth? The Case of the Trade–Growth Relationship. Journal of Economic Methodology 18 (3):283-296.
    This paper provides a critical analysis of the modelling strategies adopted in the trade–growth literature. Despite a huge number of econometric studies, there is a growing dissatisfaction with such studies and serious questions over what exactly has been learnt from them. Econometric work has been criticized for, amongst other things, its lack of clear relationship to underlying theory and questionable use of proxies for trade policy. It is frequently unclear what hypothesis is being tested in this literature. Universalist assumptions of (...)
  44. Federica Russo (2010). Representation and Structure in Economics. The Methodology of Econometric Models of the Consumption Function, Hsiang-Ke Chao. Routledge, 2009, XIV + 161 Pages. [REVIEW] Economics and Philosophy 26 (1):114-118.
  45. Ron P. Smith (1999). Unit Roots and All That: The Impact of Time-Series Methods on Macroeconomics. Journal of Economic Methodology 6 (2):239-258.
    Over the past two decades applied macroeconomics has been transformed by the widespread adoption of a set of new statistical techniques: unit-root tests, vector autoregressions, Granger causality and cointegration. Although these techniques were developed to answer statistical questions, they diffused very rapidly through applied economics because they were thought to be able to answer important theoretical questions in macroeconomics. This paper argues that these techniques have not delivered on the early promises; not because they were not useful - they are (...)
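
A quick way to see why the techniques Smith surveys spread so fast is the spurious-regression problem they were designed to guard against. The simulation below is an illustration of that problem, not drawn from the paper: it regresses two independent random walks on one another and reports the conventional t-statistic.

```python
import numpy as np

# Spurious regression: two independent random walks (unit-root processes)
# regressed on one another typically yield a large but meaningless t-statistic.
rng = np.random.default_rng(3)
n = 500

y = np.cumsum(rng.normal(size=n))    # random walk
x = np.cumsum(rng.normal(size=n))    # independent random walk

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta
se = np.sqrt((resid @ resid / (n - 2)) * np.linalg.inv(X.T @ X)[1, 1])

print("slope:", round(beta[1], 3), " naive t-statistic:", round(beta[1] / se, 1))
# The t-statistic is usually far outside +/-2 even though the series are
# unrelated; unit-root tests, differencing and cointegration analysis exist
# precisely to keep such regressions from being read as genuine relationships.
```
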
  46. Aris Spanos (2006). Revisiting the Omitted Variables Argument: Substantive Vs. Statistical Adequacy. Journal of Economic Methodology 13 (2):179-218.
    The problem of omitted variables is commonly viewed as a statistical misspecification issue which renders the inference concerning the influence of X_t on y_t unreliable, due to the exclusion of certain relevant factors W_t. That is, omitting certain potentially important factors W_t may confound the influence of X_t on y_t. The textbook omitted variables argument attempts to assess the seriousness of this unreliability using the sensitivity of the estimator to the inclusion/exclusion of W_t (...)
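
The "textbook omitted variables argument" the abstract refers to is, in its standard linear form, the following. The notation follows the abstract (y_t, X_t, W_t), but the particular formulation is a generic reconstruction rather than a quotation from Spanos.

```latex
% Textbook omitted-variables argument (generic reconstruction).
% True model:
\begin{equation}
  y_t = \beta' X_t + \gamma' W_t + u_t .
\end{equation}
% Estimated model, omitting W_t:
\begin{equation}
  y_t = b' X_t + \varepsilon_t .
\end{equation}
% The least-squares estimator of b then converges to
\begin{equation}
  b = \beta + \Delta \gamma , \qquad
  \Delta = \big[\mathrm{E}(X_t X_t')\big]^{-1} \mathrm{E}(X_t W_t') ,
\end{equation}
% so inference about the influence of X_t on y_t is confounded unless
% \gamma = 0 or X_t and W_t are uncorrelated (\Delta = 0).
```
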
  47. Peter Spirtes (2005). Graphical Models, Causal Inference, and Econometric Models. Journal of Economic Methodology 12 (1):3-34.
    A graphical model is a graph that represents a set of conditional independence relations among the vertices (random variables). The graph is often given a causal interpretation as well. I describe how graphical causal models can be used in an algorithm for constructing partial information about causal graphs from observational data that is reliable in the large sample limit, even when some of the variables in the causal graph are unmeasured. I also describe an algorithm for estimating from observational data (...)
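
A minimal example of what "a graph representing conditional independence relations" amounts to: in the causal chain X -> Z -> Y, the graph implies that X is independent of Y given Z, which for linear-Gaussian data shows up as a near-zero partial correlation. The simulation below is my own toy illustration; it is not the algorithm described in the paper.

```python
import numpy as np

# Causal chain X -> Z -> Y.  The graph implies X _||_ Y | Z, which in this
# linear-Gaussian example appears as a near-zero partial correlation.
rng = np.random.default_rng(4)
n = 50_000

x = rng.normal(size=n)
z = 0.8 * x + rng.normal(size=n)
y = 0.8 * z + rng.normal(size=n)

def partial_corr(a, b, given):
    """Correlation of the residuals of a and b after regressing each on `given`."""
    C = np.column_stack([np.ones(len(given)), given])
    ra = a - C @ np.linalg.lstsq(C, a, rcond=None)[0]
    rb = b - C @ np.linalg.lstsq(C, b, rcond=None)[0]
    return np.corrcoef(ra, rb)[0, 1]

print("corr(X, Y):            ", round(float(np.corrcoef(x, y)[0, 1]), 3))  # clearly non-zero
print("partial corr(X, Y | Z):", round(float(partial_corr(x, y, z)), 3))    # approximately zero
# Constraint-based causal search uses batteries of such (conditional)
# independence tests to recover partial information about the causal graph
# from observational data.
```
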
  48. Genaro Sucarrat (2010). Econometric Reduction Theory and Philosophy. Journal of Economic Methodology 17 (1):53-75.
    Econometric reduction theory provides a comprehensive probabilistic framework for the analysis and classification of the reductions (simplifications) associated with empirical econometric models. However, the available approaches to econometric reduction theory are unable to satisfactorily accommodate a commonplace theory of social reality, namely that the course of history is indeterministic, that history does not repeat itself and that the future depends on the past. Using concepts from philosophy this paper proposes a solution to these shortcomings, which in addition permits new reductions, (...)