This book departs from the premise that context represents a complex relational configuration which can no longer be conceived as an analytic prime but rather requires a parts-whole perspective to capture its inherent dynamism. The edited volume presents a collection of papers which examine the connectedness between context, contextualization and entextualization. They address the questions of how meaning and speech acts are situated in context, how context influences both, how context is imported into the discourse, and how context is entextualized in discourse. The papers cover institutional and non-institutional contexts, the language of Greek laws, political discourse, confrontational media discourse, and task-oriented face-to-face and back-to-back interactions. They reflect current moves in pragmatics and discourse analysis to cross disciplinary and methodological boundaries by integrating relevant premises and insights, in particular cognition, adaptive action, negotiation of meaning, sequentiality, recipient design and genre.
This contribution investigates the role of context in natural-language communication by differentiating between linguistic and sociocultural contexts. It is firmly anchored to a dialogue framework and based on a relational conception of context as a structured and interactionally organised phenomenon. However, context is not only examined from this bottom-up or micro perspective, but also from a top-down or macro viewpoint as pre- and co-supposed sociocultural context. Here, context is not solely seen as an interactionally organised phenomenon, but rather as a sociocultural apparatus which strongly influences the interpretation of micro situations. The section on micro building blocks and local meaning argues for a sociopragmatic approach to natural-language communication, thus accommodating both speech act theory and conversation analysis. It examines the question of how linguistic and sociocultural contexts are accommodated by the micro building blocks of speech act and turn, and speaker and hearer. The results obtained are systematised in the section micro meets macro and adapted to the requirements of the dialogue act of a plus/minus-validity claim, based on the contextualisation of Jürgen Habermas's conception of the ratification of validity claims, adopted from his theory of communicative action (1987). The definition of a plus/minus-validity claim is further supplemented by the Gricean Cooperative Principle, the ethnomethodological premise of accountability of social action, the conversation-analytic notion of sequential organisation, and the interpersonal concepts of face and participation format. Validity claims are discussed from both bottom-up and top-down perspectives, stressing the dynamics of context with regard to both process and product, and selection and construction.
The notion of program verification appears to trade upon an equivocation. Algorithms, as logical structures, are appropriate subjects for deductive verification. Programs, as causal models of those structures, are not. The success of program verification as a generally applicable and completely reliable method for guaranteeing program performance is not even a theoretical possibility.
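To make the equivocation concrete, the following minimal sketch (an illustration added here, not drawn from the paper) shows the two notions side by side: the annotated conditions concern the abstract algorithm and can be established deductively, whereas running the code merely exercises one causal realization of it on one input.

```python
# Illustrative sketch (not from the paper): the annotated conditions concern the
# abstract algorithm; executing the function only tests a particular causal
# realization of it on particular inputs.

def integer_division(a: int, b: int) -> tuple:
    """Return (q, r) with a == q * b + r and 0 <= r < b, given a >= 0 and b > 0."""
    assert a >= 0 and b > 0                  # precondition
    q, r = 0, a
    while r >= b:                            # loop invariant: a == q * b + r and r >= 0
        q, r = q + 1, r - b
    assert a == q * b + r and 0 <= r < b     # postcondition, provable from the invariant
    return q, r

print(integer_division(17, 5))               # (3, 2): a test of the program, not a proof
```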
The distinguished theologian, David Ray Griffin, has advanced a set of thirteen theses intended to characterize (what he calls) “Neo-Darwinism”, which he contrasts with “Intelligent Design”. Griffin maintains that Neo-Darwinism is “atheistic” in forgoing a creator but suggests that, by adopting a more modest scientific naturalism and embracing a more naturalistic theology, it is possible to find “a third way” that reconciles religion and science. The considerations adduced here suggest that Griffin has promised more than he can deliver. On his account, God is in the laws of nature; therefore, any influence He exerts is natural rather than supernatural. But if the differences God makes are not empirically detectable, then Griffin’s account is just as objectionable as a theory of supernatural intervention. And Griffin has not shown that evolution, as distinct from his idiosyncratic sense of Neo-Darwinism, is incompatible with theism.
A debate over the theoretical capabilities of formal methods in computer science has raged for more than two years now. The function of this paper is to summarize the key elements of this debate and to respond to important criticisms others have advanced by placing these issues within a broader context of philosophical considerations about the nature of hardware and of software and about the kinds of knowledge that we have the capacity to acquire concerning their performance.
Luciano Floridi (2003) offers a theory of information as a strongly semantic notion, according to which information encapsulates truth, thereby making truth a necessary condition for a sentence to qualify as information. While Floridi provides an impressive development of this position, the aspects of his approach of greatest philosophical significance are its foundations rather than its formalization. He rejects the conception of information as meaningful data, yet at least three theses that conception entails – that information can be false; that tautologies are information; and that “It is true that ...” is non-redundant – appear to be defensible. This inquiry offers various logical, epistemic, and ordinary-language grounds to demonstrate that an account of his kind is too narrow to be true and that its adoption would hopelessly obscure crucial differences between information, misinformation, and disinformation.
The distinction between misinformation and disinformation becomes especially important in political, editorial, and advertising contexts, where sources may make deliberate efforts to mislead, deceive, or confuse an audience in order to promote their personal, religious, or ideological objectives. The difference consists in having an agenda. It thus bears comparison with lying, because lies are assertions that are false, that are known to be false, and that are asserted with the intention to mislead, deceive, or confuse. One context in which disinformation abounds is the study of the death of JFK, which I know from more than a decade of personal research experience. Here I reflect on that experience and advance a preliminary theory of disinformation that is intended to stimulate thinking on this increasingly important subject. Five kinds of disinformation are distinguished and exemplified by real-life cases I have encountered. It follows that the story you are about to read is true.
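As a rough illustration of the distinctions at issue, the difference an agenda makes can be put schematically; the predicates and their combinations below are a simplified gloss, not the paper's own analysis.

```python
# Simplified, illustrative gloss on the distinctions discussed above; the predicates
# and their combinations are not the paper's own analysis.

def classify(is_true: bool, believed_true_by_source: bool, intent_to_mislead: bool) -> str:
    if is_true and not intent_to_mislead:
        return "information"
    if not is_true and believed_true_by_source:
        return "misinformation"      # false, but asserted without an agenda
    if not is_true and not believed_true_by_source and intent_to_mislead:
        return "disinformation"      # false, known to be false, asserted to mislead
    return "unclassified"

print(classify(False, True, False))   # misinformation
print(classify(False, False, True))   # disinformation
```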
This paper develops an approach to inference to the best explanation that integrates a Popperian conception of natural laws with a modified Hempelian account of explanation, on the one hand, and Hacking's law of likelihood (in its nomic guise), on the other, providing a robust abductivist model of science that appears to overcome the obstacles that confront its inductivist, deductivist, and hypothetico-deductivist alternatives. This philosophy of science clarifies and illuminates some fundamental aspects of ontology and epistemology, especially concerning the relations between frequencies and propensities. Among the most important elements of this conception is the central role of degrees of nomic expectability in explanation, prediction, and inference, for which this investigation provides a theoretical defense.
Carl G. Hempel exerted greater influence upon philosophers of science than any other figure during the 20th century. In this far-reaching collection, distinguished philosophers contribute valuable studies that illuminate and clarify the central problems to which Hempel was devoted. The essays enhance our understanding of the development of logical empiricism as the major intellectual influence for scientifically-oriented philosophers and philosophically-minded scientists of the 20th century.
This paper pursues the question, To what extent does the propensity approach to probability contribute to plausible solutions to various anomalies which occur in quantum mechanics? The position I shall defend is that, of the three interpretations — the frequency, the subjective, and the propensity — only the third accommodates the possibility, in principle, of providing a realistic interpretation of ontic indeterminism. If these considerations are correct, then they lend support to Popper's contention that the propensity construction tends to remove (at least some of) the mystery from quantum phenomena.
Cognitive science has been dominated by the computational conception that cognition is computation across representations. To the extent to which cognition as computation across representations is supposed to be a purposive, meaningful, algorithmic, problem-solving activity, however, computers appear to be incapable of cognition. They are devices that can facilitate computations only in virtue of semantic grounding relations under which their marks function as special kinds of signs for their users. Even their algorithmic, problem-solving character arises from their interpretation by human users. Strictly speaking, computers as such — apart from human users — are not only incapable of cognition, but even incapable of computation, properly construed. If we want to understand the nature of thought, then we have to study thinking, not computing, because they are not the same thing.
Cosmides, Wason, and Johnson-Laird, among others, have presented evidence suggesting that reasoning abilities tend to be domain specific, insofar as humans do not appear to acquire capacities for logical reasoning that are applicable across different contexts. Unfortunately, the significance of these findings depends upon the specific variety of logical reasoning under consideration. Indeed, there seem to be at least three grounds for doubting such conclusions, since: (1) tests of reasoning involving the use of material conditionals may not be appropriate for representing ordinary thinking, especially when it concerns causal processes involving the use of causal conditionals instead; (2) tests of domain specificity may fail to acknowledge the crucial role fulfilled by rules of inference, such as modus ponens and modus tollens, which appear to be completely general across different contexts; and (3) tests that focus exclusively upon deductive reasoning may misinterpret findings involving the use of inductive reasoning, which is of primary importance for human evolution.
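For concreteness, a minimal sketch of the kind of selection task such studies employ follows; the rule and cards are illustrative examples, not the materials used in the studies discussed above.

```python
# Illustrative selection-task logic for the material conditional
# "if a card shows a vowel, the other side shows an even number".
# The rule and cards are examples, not the materials from the studies cited.

def is_vowel(ch: str) -> bool:
    return ch.upper() in "AEIOU"

def must_turn(visible: str) -> bool:
    """A card needs checking iff its hidden side could falsify the conditional."""
    if visible.isalpha():
        return is_vowel(visible)     # vowel showing: a hidden odd number would falsify (modus ponens)
    return int(visible) % 2 != 0     # odd number showing: a hidden vowel would falsify (modus tollens)

cards = ["A", "K", "4", "7"]
print([card for card in cards if must_turn(card)])   # ['A', '7']
```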
The purpose of this paper is to demonstrate that Grünbaum's purported defense of Hempel's thesis of the symmetry of explanation and prediction is fundamentally inadequate by virtue of the fact that Grünbaum adopts an extended and revised version of the thesis pertaining to scientific understanding in general in lieu of the original and restricted version advanced by Hempel pertaining to scientific explanation in particular. When Hempel's thesis rather than Grünbaum's revision is recognized as the relevant object of criticism, it becomes clear that Grünbaum has not demonstrated that Hempel stands ab omni naevo vindicatus. Indeed, when correctly understood, Bromberger's criticisms, especially, support the sound conclusion that the relationship between explanations and predictions is sometimes symmetrical yet sometimes asymmetrical, i.e., the relationship that obtains is non-symmetrical.
The purpose of this essay is to investigate the properties of singular causal systems and their population manifestations, with special concern for the thesis of methodological individualism, which claims that there are no properties of social groups that cannot be adequately explained exclusively by reference to properties of individual members of those groups, i.e., at the level of individuals. Individuals, however, may be viewed as singular causal systems, i.e., as instantiations of (arrangements of) dispositional properties. From this perspective, methodological individualism appears to be an ambiguous thesis: some properties of collections of (independent) systems of the same kind are reducible, but other properties of collections of (dependent) systems of the same kind are not. In cases of the first kind, therefore, methodological individualism is true, but trivial; while in cases of the second kind, it is significant, but false. Hence, if the arguments that follow are correct, at least some of the properties of social groups should qualify as emergent.
The social exchange theory of reasoning, which is championed by Leda Cosmides and John Tooby, falls under the general rubric of evolutionary psychology and asserts that human reasoning is governed by content-dependent, domain-specific, evolutionarily derived algorithms. According to Cosmides and Tooby, the presumptive existence of what they call cheater-detection algorithms disconfirms the claim that we reason via general-purpose mechanisms or via inductively acquired principles. We contend that the Cosmides/Tooby arguments in favor of domain-specific algorithms or evolutionarily derived mechanisms fail and that the notion of a social exchange rule, which is central to their theory, is not correctly characterized. As a consequence, whether or not their conclusion is true cannot be established on the basis of the arguments they have presented.
My purpose is to explain, first, that there is an alternative to Harnad's version of the symbol grounding problem, which is known as the problem of primitives; second, that there is an alternative to his solution (which is externalist) in the form of a dispositional conception (which is internalist); and, third, that, while the TTT, properly understood, may provide partial and fallible evidence for the presence of similar mental powers, it cannot supply conclusive proof, because more than observable symbolic manipulation and robotic behavior is involved here, as he admits (Harnad 1991). Carrying the problem further appears to require inference to the best explanation.
The purpose of this paper is (a) to provide a systematic defense of the single-case propensity account of probabilistic explanation from the criticisms advanced by Hanna and by Humphreys and (b) to offer a critical appraisal of the aleatory conception advanced by Humphreys and of the deductive-nomological-probabilistic approach Railton has proposed. The principal conclusion supported by this analysis is that the Requirements of Maximal Specificity and of Strict Maximal Specificity afford the foundation for completely objective explanations of probabilistic explananda, so long as they are employed on the basis of propensity criteria of explanatory relevance.
When computing is defined as the causal implementation of algorithms and algorithms are defined as effective decision procedures, human thought is mental computation only if it is governed by mental algorithms. An examination of ordinary thinking, however, suggests that most human thought processes are non-algorithmic. Digital machines, moreover, are mark-manipulating or string-processing systems whose marks or strings do not stand for anything for those systems, while minds are semiotic (or “sign-using”) systems for which signs stand for other things for those systems. Computing, at best, turns out to be no more than a special kind of thinking.
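The idea of mark manipulation without the marks standing for anything can be illustrated with a toy string-rewriting system; the example is added here and is not the paper's own.

```python
# Toy illustration (not the paper's own example) of pure mark manipulation:
# the rules transform strings that do not stand for anything for the system
# applying them; any meaning lies entirely with the users who interpret them.

RULES = {"AB": "BA", "BA": "B"}          # arbitrary rewrite rules over meaningless marks

def rewrite_once(s: str) -> str:
    for pattern, replacement in RULES.items():
        if pattern in s:
            return s.replace(pattern, replacement, 1)
    return s

s = "AAB"
for _ in range(5):
    print(s)                              # AAB, ABA, BAA, BA, B
    s = rewrite_once(s)
```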
Taking Brian Cantwell Smith’s study, “Limits of Correctness in Computers,” as its point of departure, this article explores the role of models in computer science. Smith identifies two kinds of models that play an important role, where specifications are models of problems and programs are models of possible solutions. Both presuppose the existence of conceptualizations as ways of conceiving the world “in certain delimited ways.” But high-level programming languages also function as models of virtual (or abstract) machines, while low-level programming languages function as models of causal (or physical) machines. The resulting account suggests that sets of models embedded within models are indispensable for computer programming.
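A hypothetical sketch may help fix the layering; the specification and program below are examples introduced here, not Smith's. The specification models the problem, the program models a candidate solution, and the language implementation and hardware beneath it model further (virtual and causal) machines.

```python
# Hypothetical example of the layered models discussed above: a specification as a
# model of the problem and a program as a model of one possible solution. The Python
# interpreter and the hardware beneath it supply further (virtual and causal) models.

def satisfies_spec(inp: list, out: list) -> bool:
    """Specification: out is inp rearranged into non-decreasing order."""
    return out == sorted(inp)

def insertion_sort(xs: list) -> list:
    """Program: one model of a solution to the problem the specification models."""
    result = []
    for x in xs:
        i = 0
        while i < len(result) and result[i] <= x:
            i += 1
        result.insert(i, x)
    return result

data = [3, 1, 2]
print(satisfies_spec(data, insertion_sort(data)))   # True for this input; a test, not a proof
```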
The purpose of this paper is to provide a systematic appraisal of the covering law and statistical relevance theories of statistical explanation advanced by Carl G. Hempel and by Wesley C. Salmon, respectively. The analysis is intended to show that the difference between these accounts is in principle analogous to the distinction between truth and confirmation, where Hempel's analysis applies to what is taken to be the case and Salmon's analysis applies to what is the case. Specifically, it is argued (a) that statistical explanations exhibit the nomic expectability of their explanandum events, which in some cases may be strong but in other cases will not be; (b) that the statistical relevance criterion is more fundamental than the requirement of maximal specificity and should therefore displace it; and (c) that if statistical explanations are to be envisioned as inductive arguments at all, then it is only in a qualified sense, since, in particular, the requirement of high inductive probability between explanans and explanandum must be abandoned.
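A small numerical illustration (the figures are invented for this purpose) shows why the high-probability requirement must go while statistical relevance remains:

\[
P(E \mid C) = 0.25, \qquad P(E \mid \neg C) = 0.01 .
\]

Since \(P(E \mid C) \neq P(E \mid \neg C)\), the property \(C\) is statistically relevant to \(E\) and may properly figure in its explanation, even though the nomic expectability it confers on \(E\) is weak; an explanation citing \(C\) could therefore never satisfy a requirement of high inductive probability.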
The thesis of this paper is that extensional language alone provides an essentially inadequate foundation for the logical formalization of any lawlike statement. The arguments presented are intended to demonstrate that lawlike sentences are logically general dispositional statements requiring an essentially intensional reduction sentence formulation. By introducing a non-extensional logical operator, the 'fork', the difference between universal and statistical laws emerges in a distinction between dispositional predicates of universal strength as opposed to those of merely statistical strength. While the logical form of universal and statistical laws appears to be fundamentally dissimilar on the standard account, from this point of view their syntactical structure is basically the same.
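Schematically (the symbolism below is an approximation introduced here, not the paper's own 'fork' notation), a universal law and a statistical law might be rendered as

\[
(x)(t)\,\bigl(Rxt \,\pitchfork_{u}\, Axt\bigr)
\qquad\text{and}\qquad
(x)(t)\,\bigl(Rxt \,\pitchfork_{p}\, Axt\bigr),
\]

where \(\pitchfork_{u}\) marks a dispositional connective of universal strength and \(\pitchfork_{p}\) one of merely statistical strength \(p\). Both formulations share the same syntactic form and differ only in the strength attached to the connective, which is the sense in which the syntactical structure of universal and statistical laws comes out basically the same.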
My purpose here is to elaborate the reasons I maintain that Salmon has not been completely successful in reporting the history of work on explanation. The most important limitation of his account is that it does not emphasize the critical necessity to embrace a suitable conception of probability in the development of the theory of probabilistic explanation.
The shapes of neurons and glial cells dictate many important aspects of their functions. In olfactory systems, certain architectural features are characteristic of these two cell types across a wide variety of species. The accumulated evidence suggests that these common features may play fundamental roles in olfactory information processing. For instance, the primary olfactory neuropil in most vertebrate and invertebrate olfactory systems is organized into discrete modules called glomeruli. Inside each glomerulus, sensory axons and CNS neurons branch and synapse in patterns that are repeated across species. In many species, moreover, the glomeruli are enveloped by a thin and ordered layer of glial processes. The glomerular arrangement reflects the processing of odor information in modules that encode the discrete molecular attributes of the odorant stimuli being processed. Recent studies of the mechanisms that guide the development of olfactory neurons and glial cells have revealed complex reciprocal interactions between these two cell types, which may be necessary for the establishment of modular compartments. Collectively, the findings reviewed here suggest that specialized cellular architecture plays key functional roles in the detection, analysis, and discrimination of odors at early steps in olfactory processing.
John Searle distinguished between weak and strong artificial intelligence (AI). This essay discusses a third alternative, mild AI, according to which a machine may be capable of possessing a species of mentality. Using James Fetzer's conception of minds as semiotic systems, the possibility of what might be called “mild AI” receives consideration. Fetzer argues against strong AI by contending that digital machines lack the ground relationship required of semiotic systems. In this essay, the implementational nature of semiotic processes posited by Charles S. Peirce's triadic sign relation is re-examined in terms of the underlying dispositional processes and the ontological levels they would span in an inanimate machine. This suggests that, if non-human mentality can be replicated rather than merely simulated in a digital machine, the direction to pursue appears to be that of mild AI.
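One speculative way to picture an interpretant as a disposition is sketched below; the names and structure are illustrative assumptions, not the essay's own formulation.

```python
# Speculative sketch: a triadic sign relation in which the interpretant is modeled
# as a disposition to behave, given a context, when the sign vehicle is encountered.
# The names and structure are illustrative assumptions, not the essay's formulation.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Sign:
    vehicle: str                          # the physical mark or signal
    object: str                           # what the sign is taken to stand for
    interpretant: Callable[[str], str]    # disposition: context -> behavior

smoke = Sign(
    vehicle="smoke",
    object="fire",
    interpretant=lambda context: "flee" if context == "indoors" else "investigate",
)

print(smoke.object, smoke.interpretant("indoors"))   # fire flee
```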
In this paper I argue for a computational theory of thinking that does not eliminate the mind. In doing so, I will defend computationalism against the arguments of John Searle and James Fetzer, and briefly respond to other common criticisms.