Abstract
Philosophers have recently argued, against a prevailing orthodoxy, that standards of knowledge partly depend on a subject’s interests; the more that is at stake for the subject, the less she is in a position to know. This view, dubbed “Pragmatic Encroachment”, has historical and conceptual connections to arguments in philosophy of science against the received model of science as value free. I bring the two debates together. I argue that Pragmatic Encroachment and the model of value-laden science reinforce each other. Drawing on Douglas’s argument about the indispensability of value judgments in science, and on psychological evidence about people’s inability to reason objectively about what they care about, I introduce a novel argument for Pragmatic Encroachment.
Notes
This is so although Douglas (2009) uses the word “knowledge” throughout her book, and declares at its outset that “we have no better way of producing knowledge about the natural world than doing science” (2009, 1). In addition, in her argument for the indirect role (2009, Ch. 5), she draws on Heil (1983), who, like the current debaters of PE, discusses knowledge and belief in general, rather than science.
Fallibilism is, roughly, the view that a subject can know that p while acknowledging a remote possibility that p is false (Fantl and McGrath 2009, Ch. 1).
For the sake of the argument in this paper, I leave the truth condition for knowledge out of the discussion. For a discussion of the sense in which scientific theories may satisfy it, see Miller (2013, 1295–1296). I also leave out Gettier cases.
I thank an anonymous reviewer for improving my original reconstruction of the argument.
If we reconstruct the example such that the paranoid suffers from a pathological influence on his evidential reasoning rather than his beliefs directly, the difference between the paranoid and normal people turns from a difference in kind to a difference in degree, and the example loses its normative force. It is obvious that because the paranoid is extremely influenced by motivating factors, his belief is not justified. But it does not follow that a normal person who is moderately influenced by motivating factors is also not justified. Drinking heavily before driving is irresponsible, but this does not mean that any amount of drinking before driving is also irresponsible. I thank an anonymous reviewer for pressing me on this point.
Cf. a central principle of Jewish law “we do not impose on the community a hardship which the majority cannot endure” (Babylonian Talmud, Baba Bathra 60b).
See Miller and Record (2013, 122–127) for further discussion of this point.
For a detailed analysis of the relations between role oughts, epistemic responsibility, and standards of justified belief, see Miller and Record (2013).
The characterization of knowledge as a species of belief is part of the mainstream analysis of knowledge in analytic epistemology. It is not clear, however, that philosophers of science are committed to it, as they tend to focus on acceptance of theories or public claims to knowledge in their normative philosophical analysis, and regard belief as belonging to the realm of the psychology of the scientist, which is not of philosophical interest. For an explicit argument to the effect that knowledge is justified true belief or acceptance, see Cohen (1992, 90–92).
Fantl and McGrath (2010) also interpret Rudner’s (1953) argument as concerning pragmatic theory acceptance, rather than knowledge.
Lacey’s distinction between endorsement and acceptance is similar to Betz’s distinction, which was discussed in §2, between certain or near-certain theories that can be taken to be true for all practical purposes, and less-than-certain hypotheses that should be reported by scientists along with their error probabilities. It fails for similar reasons.
References
Betz, G. (2013). In defence of the value-free ideal. European Journal for the Philosophy of Science, 3(2), 207–220.
Brown, J. (2008). Subject-sensitive invariantism and the knowledge norm for practical reasoning. Noûs, 42(2), 167–189.
Castel, B., & Sismondo, S. (2003). The art of science. Peterborough: Broadview Press.
Cohen, L. J. (1992). An essay on belief and acceptance. Oxford: Clarendon.
Cohen, S. (1999). Contextualism, skepticism, and the structure of reasons. Philosophical Perspectives, 13, 57–89.
Conee, E., & Feldman, R. (2004). Evidentialism: essays in epistemology. Oxford: Oxford University Press.
DeRose, K. (1995). Solving the skeptical problem. The Philosophical Review, 104(1), 1–52.
Douglas, H. (2000). Inductive risk and values in science. Philosophy of Science, 67(4), 559–579.
Douglas, H. (2009). Science, policy, and the value-free ideal. Pittsburgh: University of Pittsburgh Press.
Elliott, K. C. (2011). Is a little pollution good for you? Incorporating societal values in environmental research. New York: Oxford University Press.
Fantl, J., & McGrath, M. (2009). Knowledge in an uncertain world. Oxford: Oxford University Press.
Fantl, J., & McGrath, M. (2010). Pragmatic encroachment. In S. Bernecker & D. Pritchard (Eds.), The Routledge companion to epistemology (pp. 558–578). London: Routledge.
Feldman, R. (2003). Epistemology. Upper Saddle River: Prentice Hall.
Geist, C., Löwe, B., & Van Kerkhove, B. (2010). Peer review and knowledge by testimony in mathematics. In B. Löwe & T. Müller (Eds.), Philosophy of mathematics: sociological aspects and mathematical practice (pp. 155–178). London: College Publications.
Gettier, E. (1963). Is justified true belief knowledge? Analysis, 23, 121–123.
Gilbert, M. (2002). Belief and acceptance as features of groups. Protosociology, 16, 35–69.
Goldman, A. I. (1999). Knowledge in a social world. New York: Oxford University Press.
Grcar, J. F. (2013). Errors and corrections in mathematics literature. Notices of the American Mathematical Society, 60(4), 418–425.
Hawthorne, J. (2003). Knowledge and lotteries. Oxford: Oxford University Press.
Heil, J. (1983). Believing what one ought. Journal of Philosophy, 80(11), 752–765.
Hempel, C. (1965). Science and human values. In Aspects of scientific explanation: And other essays in the philosophy of science (pp. 81–96). New York: Free Press.
Intemann, K., & De Melo-Martín, I. (2010). Social values and scientific evidence: the case of the HPV vaccines. Biology and Philosophy, 25(2), 203–213.
Jarvis, J. F., & Tyson, J. A. (1981). FOCAS: faint object classification and analysis system. The Astronomical Journal, 86(3), 476–495.
Jeffrey, R. C. (1956). Valuation and acceptance of scientific hypotheses. Philosophy of Science, 23(3), 237–246.
Kitcher, P. (2011). Science in a democratic society. Amherst: Prometheus Books.
Klayman, J. D. (1995). Varieties of confirmation bias. Psychology of Learning and Motivation, 32, 385–418.
Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480–498.
Lacey, H. (1999). Is science value free? Values and scientific understanding. London: Routledge.
Lilienfeld, S. O., Ammirati, R., & Landfield, K. (2009). Giving debiasing away: can psychological research on correcting cognitive errors promote human welfare? Perspectives on Psychological Science, 4(4), 390–398.
Longino, H. (2002). The fate of knowledge. Princeton: Princeton University Press.
May, J., Sinnott-Armstrong, W., Hull, J. G., & Zimmerman, A. (2010). Practical interests, relevant alternatives, and knowledge attributions: an empirical study. Review of Philosophy and Psychology, 1, 265–273.
McMullin, E. (1983). Values in science. In P. Asquith & T. Nickles (Eds.), PSA 1982 (Vol. 2, pp. 3–28). East Lansing: PSA.
Miller, B. (2013). When is consensus knowledge based? distinguishing shared knowledge from mere agreement. Synthese, 190(7), 1293–1316.
Miller, B., & Record, I. (2013). Justified belief in a digital age: on the epistemic implications of secret internet technologies. Episteme, 10(2), 101–118.
Mizrahi, M. (2012). Does “ought” imply “can” from an epistemic point of view? Philosophia, 40, 829–840.
Nathanson, M. B. (2008). Desperately seeking mathematical truth. Notices of the American Mathematical Society, 55(7), 773.
Neta, R. (2007). Anti-intellectualism and the knowledge-action principle. Philosophy and Phenomenological Research, 75(1), 180–187.
Nickerson, R. (1998). Confirmation bias: a ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220.
Popper, K. R. (1981). The rationality of scientific revolutions. In I. Hacking (Ed.), Scientific revolutions (pp. 80–106). New York: Oxford University Press.
Pritchard, D. (2006). Review of Knowledge and practical interests, by Jason Stanley. Notre Dame Philosophical Reviews. http://ndpr.nd.edu/news/25054/?id=6885
Reach, K. (1946). The foundations of our knowledge. Synthese, 5(1/2), 83–86.
Rudner, R. (1953). The scientist qua a scientist makes value judgments. Philosophy of Science, 20(1), 1–6.
Schlick, M. (1917/1985). General theory of knowledge. Peru, IL: Open Court.
Solomon, M. (2001). Social empiricism. Cambridge: MIT Press.
Stanley, J. (2005). Knowledge and practical interests. Oxford: Oxford University Press.
Stanley, J. (2007). Replies to Gilbert Harman, Ram Neta, and Stephen Schiffer. Philosophy and Phenomenological Research, 75(1), 196–210.
Stocker, M. (1971). “Ought” and “can”. Australasian Journal of Philosophy, 49(3), 303–316.
Van Fraassen, B. (1980). The scientific image. Oxford: Oxford University Press.
Wilholt, T. (2009). Bias and values in scientific research. Studies in History and Philosophy of Science, 40(1), 92–101.
Wray, K. B. (2001). Collective belief and acceptance. Synthese, 129(3), 319–333.
Acknowledgements
I thank Arnon Keren, Jacob Stegenga, Dan Hicks, Isaac Record, and Sandy Goldberg for helpful comments on earlier versions of this paper. I thank Ruth Weintraub, David Enoch, Moti Mizrahi, and Levi Spectre for useful discussion. I thank two anonymous reviewers for their helpful comments. This paper was presented at the Philosophy of Science (PSA) Meeting (San Diego, 2012), the Society for Philosophy of Science in Practice (SPSP) Conference (Exeter, 2011), and the Israeli Society for History and Philosophy of Science Annual Meeting (Jerusalem, 2012). I thank the audience members for useful comments and discussion. This paper was partly written when I was an Azrieli Postdoctoral Fellow at the Department of Philosophy, University of Haifa. I am grateful to the Azrieli Foundation for an award of an Azrieli Fellowship. I am also grateful to the Edmond J. Safra Center for Ethics, Tel Aviv University, and the Dan David Foundation, Tel Aviv University, for postdoctoral fellowships.
Miller, B. Science, values, and pragmatic encroachment on knowledge. Euro Jnl Phil Sci 4, 253–270 (2014). https://doi.org/10.1007/s13194-014-0087-4