This study described parent participation in the informed consent conference for randomized clinical trials (RCTs) in childhood leukemia and documented the relationship of physician communication to parent participation. Parents of 140 children with newly diagnosed leukemia who were eligible for RCTs were studied at six sites using comprehensive methods involving direct observation and transcripts of parent-physician communication based on audiotapes. Parent participation during the informed consent conference reflected a wide range of content categories. Consistent with hypotheses, Physician Rapport and Partnership Building related to parent participation in the informed consent conference but Information Giving did not. Higher parent socioeconomic status also was related to greater parent participation for two of three measures of parent participation. Findings suggest that physician behaviors that provide support and facilitate communication may enhance parental participation in the informed consent conference for RCTs in childhood leukemia.
The title is meant to emphasize the immense loss of status I take logic to have undergone in recent decades, and to suggest something about its causes. The loss is most obvious in the context of higher education, where almost no post-secondary institutions now have effectual general requirements in standard formal logic, as that was easily understood thirty or more years ago. Courses in so-called 'critical thinking' are, with rare and noble exceptions, only a further illustration of the point, for many of them, if not most, say nothing at all about logical form and formal logic, and proceed as if thought and discourse could be critically understood and appraised in total ignorance of their formal aspects.
In this paper I examine William Alston's work on the epistemology of religious belief, focusing on the threat to the epistemic status of Christian belief presented by awareness of religious diversity. I argue that Alston appears to misunderstand the epistemic significance of the ‘practical rationality’ of the Christian mystical practice. I suggest that this error is due to a more fundamental misunderstanding, regarding the significance of practical rationality, in Alston's ‘doxastic practice’ approach to epistemology; an error that leads to arbitrariness among the class of rational doxastic practices. I suggest how one might remedy this weakness, with an additional, epistemic, criterion that rational doxastic practices must satisfy.
Is arguing over ontology a mistake? A recent proposal by Karen Bennett suggests that some metaphysical disputes, such as those over constitution and composition, can be dismissed on epistemic grounds. Given that both sides in a dispute try to minimize the differences between them, there are no good metaphysical grounds for choosing between them. In this paper, I expand on her epistemic dismissivism, arguing that given the Quinean conception of the task and method of metaphysics, we are warranted in believing that all ontological disputes will end in a draw, even if they have not yet done so. By a draw, I mean that while both sides in a dispute are genuinely disagreeing about what there is and there are still moves open to them, there are no moves remaining that will advance the discourse further.
I undertake to explain how the well-known laws of formal logic – Barbara Syllogism, modus ponens, etc. – relate to experience by developing Edmund Husserl's critique of Formalism and Psychologism in logical theory and then briefly explaining his positive views of the laws of logic. His view rests upon his understanding of the proposition as a complex, intentional property. The laws of formal logic are, on his view (and mine), statements about the truth values of propositions as determined by their formal character and relationships alone. The laws thus understood explain how algorithms set up to mirror them can accomplish what they do to advance knowledge, even though they operate purely mechanically. Further, they explain the proper sense in which formal laws "govern," and may guide, processes of actual thinking. Husserl's theory is a realist theory in the sense that, on his interpretation, the laws of pure or formal logic hold true regardless of what any individual, culture or species may or may not think, or even if no thinking ever occurs.
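The abstract's claim that purely mechanical algorithms can "mirror" the laws of formal logic is nicely illustrated by modern proof assistants, which check inferences with no appeal to anyone's thinking. As a minimal sketch (not from the paper itself), modus ponens is stated and machine-verified in Lean as follows:

```lean
-- Modus ponens as a checked inference: given a proof h of P → Q
-- and a proof p of P, the term `h p` is mechanically verified to prove Q.
theorem modus_ponens (P Q : Prop) (h : P → Q) (p : P) : Q := h p
```

The checker accepts the term by formal character alone, which is exactly the sense in which the law is said to hold independently of any individual thinker.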
Sometimes metaphysicians appeal to simplicity as a reason to prefer one metaphysical theory to another, especially when a philosophical dispute has otherwise reached a state of equilibrium. In this paper, I show that given a Quinean conception of metaphysics, several initially plausible justifications for simplicity as a metaphysical criterion do not succeed. If philosophers wish to preserve simplicity as a metaphysical criterion, therefore, they must radically reconceive the project of metaphysics.
We will study several weak axiom systems that use the Subtraction and Division primitives (rather than Addition and Multiplication) to formally encode the theorems of Arithmetic. Provided such axiom systems do not recognize Multiplication as a total function, we will show that it is feasible for them to verify their Semantic Tableaux, Herbrand, and Cut-Free consistencies. If our axiom systems additionally do not recognize Addition as a total function, they will be capable of recognizing the consistency of their Hilbert-style deductive proofs. Our axiom systems will not be strong enough to recognize their Canonical Reflection principle, but they will be capable of recognizing an approximation of it, called the "Tangibility Reflection Principle". We will also prove some new versions of the Second Incompleteness Theorem stating essentially that it is not possible to extend our exceptions to the Incompleteness Theorem much further.
Twentieth Century philosophical thought has expressed itself for the most part through two great Movements: the phenomenological and the analytical. Each movement originated in reaction against idealistic—or at least anti-realistic—views of "the world". And each has collapsed back into an idealism not different in effect from that which it initially rejected. Both movements began with an appeal to meanings or concepts, regarded as objective realities capable of entering the flow of experience without loss of their objective status or of their power to reveal to us an objective world as it would be if there were no subjective apprehensions of it. Both movements ended with a surrender of the objectivity of meanings and concepts in this strong sense, coming to treat them as at most more-or-less shareable components of a somehow communalized experience, but in any case incapable of revealing how things are irrespective of actual human experience. For the old Egocentric Predicament, with its "ideas" etc., is substituted a Lingocentric or Histrocentric Predicament of "language" and its elements. Hilary Putnam speaks for the current consensus: 'Internal realism says that we don't know what we are talking about when we talk about "things in themselves"' (The Many Faces of Realism, p. 36).
This article will study a class of deduction systems that allow for a limited use of the modus ponens method of deduction. We will show that it is possible to devise axiom systems α that can recognize their consistency under a deduction system D provided that: (1) α treats multiplication as a 3-way relation (rather than as a total function), and that (2) D does not allow for the use of a modus ponens methodology above essentially the levels of Π1 and Σ1 formulae. Part of what will make this boundary-case exception to the Second Incompleteness Theorem interesting is that we will also characterize generalizations of the Second Incompleteness Theorem that take force when we only slightly weaken the assumptions of our boundary-case exceptions in any of several further directions.
We derive a Mal'cev condition for congruence meet-semidistributivity and then use it to prove two theorems. Theorem A: if a variety in a finite language is congruence meet-semidistributive and residually less than some finite cardinal, then it is finitely based. Theorem B: there is an algorithm which, given a finite cardinal m and a finite algebra in a finite language, determines whether the variety generated by the algebra is congruence meet-semidistributive and residually less than m.
Gödel’s Second Incompleteness Theorem states that axiom systems of sufficient strength are unable to verify their own consistency. We will show that axiomatizations for a computer’s floating point arithmetic can recognize their cut-free consistency in a stronger respect than is feasible under integer arithmetics. This paper will include both new generalizations of the Second Incompleteness Theorem and techniques for evading it.
Jesus The Logician. ABSTRACT: In understanding how discipleship to Jesus Christ works, a major issue is how he automatically presents himself to our minds. It is characteristic of most 20th century Christians that he does not automatically come to mind as one of great intellectual power: as Lord of universities and research institutes, of the creative disciplines and scholarship. The Gospel accounts of how he actually worked, however, challenge this intellectually marginal image of him and help us to see him at home in the best of academic and scholarly settings of today, where many of us are called to be his apprentices.
Using a result of Gurevich and Lewis on the word problem for finite semigroups, we give short proofs that the following theories are hereditarily undecidable: (1) finite graphs of vertex-degree at most 3; (2) finite nonvoid sets with two distinguished permutations; (3) finite-dimensional vector spaces over a finite field with two distinguished endomorphisms.
Let K be a finite set of finite structures. We give a syntactic characterization of the property: every element of K is injective in ISP(K). We use this result to establish that A is injective in ISP(A) for every two-element algebra A.
Nanostructured materials should present a good resistance to irradiation because the large volume fraction of grain boundaries can be an important sink for radiation-induced defects. The objective of the present study is to experimentally investigate the impact of irradiation on the microstructure and mechanical properties of nanostructured materials. Nickel and Cu-0.5Al2O3 specimens were synthesized by electrodeposition (ED) and severe plastic deformation (SPD). The mean grain size of the unirradiated specimens is about 30 nm for the ED Ni and about 115 nm for the SPD Ni. 590 MeV proton irradiation and 840 keV nickel ion irradiation were conducted at room temperature. Vickers hardness measurements and transmission electron microscope observations were performed to examine the impact of irradiation on nanocrystalline materials. It appears that the irradiation-induced microstructure in Ni and in Cu-0.5Al2O3, which leads to hardening, consists exclusively of stacking fault tetrahedra. Their density appears much lower than in the case of coarser-grained material. These results, experimentally showing the resistance of nanostructured material to radiation damage, are presented here.
The microstructural modifications due to irradiation in hcp pure metals and their consequences on the mechanical properties have been investigated. Experimental results for proton-irradiated pure polycrystalline titanium are presented and discussed. Samples have been irradiated with 590 MeV protons to a low dose range at two different temperatures, room temperature and 523 K. Defect sizes and densities as a function of dose have been determined by means of transmission electron microscopy observations, and hardening has been measured from uniaxial tensile stress-strain curves. The dose dependence of the irradiation hardening has been found to depend strongly on the irradiation temperature. These results are discussed in terms of the main deformation mechanism operating at each temperature.
We present a comprehensive dislocation dynamics (DD) study of the strength of stacking fault tetrahedra (SFT) to screw dislocation glide in fcc Cu. Our methodology explicitly accounts for partial dislocation reactions in fcc crystals, which allows us to provide more detailed insights into the dislocation-SFT processes than previous DD studies. The resistance due to stacking fault surfaces to dislocation cutting has been computed using atomistic simulations and added in the form of a point stress to our DD methodology. We obtain a value of 1658.9 MPa, which translates into an extra force resolved on the glide plane that dislocations must overcome before they can penetrate SFTs. In fact, we see they do not, leading to two well-differentiated regimes: (i) partial dislocation reactions, resulting in partial SFT damage, and (ii) impenetrable SFTs resulting in the creation of Orowan loops. We obtain SFT strength maps as a function of dislocation glide plane-SFT intersection height, interaction orientation, and dislocation line length. In general, SFTs are weaker obstacles the smaller the encountered triangular area is, which has allowed us to derive simple scaling laws with the slipped area as the only variable. These laws suffice to explain all strength curves and are used to derive a simple model of dislocation-SFT strength. The stresses required to break through obstacles in the 2.5-4.8 nm size range have been computed to be 100-300 MPa, in good agreement with some experimental estimations and molecular dynamics calculations.
Large-scale molecular dynamics simulations of cascade production of the primary damage state are performed in nanocrystalline nickel with an average grain diameter of 12 nm and primary knock-on atom kinetic energies ranging from 5 to 30 keV. The role of the grain boundary during cascade production in irradiated NC Ni is discussed in terms of grain-boundary structure. It is shown that regions of misfit in the grain boundaries can absorb self-interstitials and that stacking-fault tetrahedra are formed in the neighbourhood of the grain boundary.
Irradiation induces the formation of stacking fault tetrahedra (SFTs) in a number of fcc metals, such as stainless steel and pure copper. In order to understand the role of the material's parameters in this formation, pure Cu, Ni, Pd and Al, with stacking fault energies of 45, 125, 180 and 166 mJ m⁻² respectively, have been irradiated with high-energy protons to a dose of about 10⁻² dpa at room temperature. The irradiation-induced microstructure has been investigated using transmission electron microscopy. All irradiated metals but Al present SFTs. The proportion of perfect, truncated and grouped SFTs has been determined. The SFT energy as a function of size has been calculated using continuum elasticity, with respect to the energy of a number of other possible defect configurations. It appears that the key parameters are the stacking fault energy and the shear modulus. Their implications for the formation and stability of the SFTs are discussed.
Single-crystal nickel was irradiated with 590 MeV protons to 10⁻¹ dpa at room temperature. Irradiated and unirradiated tensile samples were deformed and relaxation tests were performed at temperatures between 77 and 423 K. The tests show a strong temperature dependence of the flow stress for samples irradiated to 0.1 dpa as compared to the unirradiated case. Unirradiated and irradiated deformed microstructures were investigated by transmission electron microscopy. The initial plastic deformation of the samples irradiated at 0.1 dpa shows strain localization in the form of defect-free channels over the temperature range from 77 to 293 K. Deformation processes are analysed through the determination of the activation energies of the deformation mechanisms as deduced from relaxation tests. The activation energy has an approximate value of 0.5 eV in unirradiated samples. In the irradiated samples it is suggested that multiple deformation processes are operative in the temperature range from 77 to 423 K.