Abstract The effects of heat treatment and cold work on the superconducting properties of Ti-20 at.% Nb have been studied by measurement of magnetization, M(H), critical current density, Jc(H), and transition temperature, Tc. The metallurgy has been determined by electron and optical microscopy and x-ray diffraction. Quenching the recrystallized alloy from the single-phase β region above 500°C produces a martensitic α″ phase characterized by Hc2 = 50 kOe, Tc = 6.5 K and a high Jc arising from pinning of flux lines by the α″ structure. After very short ageing times at 330°C the α″ reverts to the β phase; Hc2 and Tc fall to 15 kOe and 4.5 K respectively, and Jc drops by several orders of magnitude. On further ageing, ω-Ti precipitates and Hc2 and Tc increase because of Nb enrichment of the matrix. Pinning by the ω precipitate causes Jc to increase. A pronounced peak effect in the Jc(H) and M(H) curves has been studied in detail. There is strong evidence for a barrier to flux-line entry into the pinning centres below the peak field. This barrier may be due to a Nb-rich shell surrounding the ω precipitate. If the quenched alloy is worked, the α″ phase is retained and Tc and Hc2 are unaltered. On ageing, the α″ reverts to strained β, but Tc, Hc2 and Jc do not drop as in the recrystallized alloy. Prolonged ageing produces significant growth of α-Ti precipitate, and Tc, Hc2 and Jc all increase as in the recrystallized alloy.
We describe a series of kinetic Monte Carlo simulations of post-cascade radiation damage evolution in α-iron that illustrates the part played by elastic interaction between defects. Elastic interactions are included as a bias to the diffusion of mobile point defects and defect clusters. The simulations show that recombination fractions are reduced, and vacancy clustering is enhanced. The sensitivity of these effects to temperature, cascade energy, and geometric description of vacancy clusters is also investigated.
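The biased-diffusion idea in this abstract, elastic interactions folded into the hop rates of mobile defects, can be illustrated with a minimal sketch. This is not the authors' simulation: the one-dimensional lattice, the 1/r form of the interaction, the function names, and every parameter value below are illustrative assumptions only.

```python
import math
import random

# Minimal 1-D kinetic Monte Carlo sketch: a single mobile defect hops
# left/right on a lattice, and an assumed elastic interaction with a
# fixed sink at site 0 biases the hop rates. All values are illustrative.

KT = 0.025          # thermal energy (eV), roughly room temperature
NU0 = 1.0e13        # attempt frequency (1/s)
E_MIG = 0.3         # unbiased migration barrier (eV)
A_ELASTIC = 0.5     # assumed elastic interaction strength (eV * site)

def elastic_energy(x):
    """Illustrative attractive 1/r-type interaction with a sink at x = 0."""
    return -A_ELASTIC / max(abs(x), 1)

def hop_rate(x_from, x_to):
    """Arrhenius rate; half the elastic energy change is added to the barrier."""
    d_e = elastic_energy(x_to) - elastic_energy(x_from)
    return NU0 * math.exp(-(E_MIG + 0.5 * d_e) / KT)

def kmc_steps_to_sink(x0, rng):
    """Residence-time KMC until the defect reaches the sink; returns (hops, time)."""
    x, t, hops = x0, 0.0, 0
    while x != 0:
        r_left, r_right = hop_rate(x, x - 1), hop_rate(x, x + 1)
        r_tot = r_left + r_right
        t += -math.log(rng.random()) / r_tot      # exponential waiting time
        x += -1 if rng.random() * r_tot < r_left else 1
        hops += 1
    return hops, t

rng = random.Random(42)
steps, elapsed = kmc_steps_to_sink(10, rng)
```

The residence-time algorithm advances the clock by an exponentially distributed waiting time and selects each hop in proportion to its rate; because the elastic term lowers the barrier for hops that reduce the interaction energy, the defect drifts toward the sink, which is the sense in which the abstract's "bias to the diffusion" operates.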
When animals choose between completing a cognitive task and “escaping,” proper interpretation of their behavior depends crucially on methodological details, including how forced and freely chosen tests are mixed and whether appropriate transfer tests are administered. But no matter how rigorous the test, it is impossible to go beyond functional similarity between human and nonhuman behaviors to certainty about human-like consciousness.
Ethical guidance from the British Medical Association (BMA) about treating doctor–patients is compared and contrasted with evidence from a qualitative study of general practitioners (GPs) who have been patients. Semistructured interviews were conducted with 17 GPs who had experienced a significant illness. Their experiences were discussed and issues about both being and treating doctor–patients were revealed. Interpretative phenomenological analysis was used to evaluate the data. In this article data extracts are used to illustrate and discuss three key points that summarise the BMA ethical guidance, in order to develop a picture of how far experiences map onto guidance. The data illustrate and extend the complexities of the issues outlined by the BMA document. In particular, differences between experienced GPs and those who have recently completed their training are identified. This analysis will be useful for medical professionals both when they themselves are unwell and when they treat doctor–patients. It will also inform recommendations for professionals who educate medical students or trainees.
This paper introduces a new, expanded range of relevant cognitive psychological research on collaborative recall and social memory to the philosophical debate on extended and distributed cognition. We start by examining the case for extended cognition based on the complementarity of inner and outer resources, by which neural, bodily, social, and environmental resources with disparate but complementary properties are integrated into hybrid cognitive systems, transforming or augmenting the nature of remembering or decision-making. Adams and Aizawa, noting this distinctive complementarity argument, say that they agree with it completely; but they describe it as “a non-revolutionary approach” which leaves “the cognitive psychology of memory as the study of processes that take place, essentially without exception, within nervous systems.” In response, we carve out, on distinct conceptual and empirical grounds, a rich middle ground between internalist forms of cognitivism and radical anti-cognitivism. Drawing both on extended cognition literature and on Sterelny’s account of the “scaffolded mind” (this issue), we develop a multidimensional framework for understanding varying relations between agents and external resources, both technological and social. On this basis we argue that, independent of any more “revolutionary” metaphysical claims about the partial constitution of cognitive processes by external resources, a thesis of scaffolded or distributed cognition can substantially influence or transform explanatory practice in cognitive science. Critics also cite various empirical results as evidence against the idea that remembering can extend beyond skull and skin. We respond with a more principled, representative survey of the scientific psychology of memory, focussing in particular on robust recent empirical traditions for the study of collaborative recall and transactive social memory.
We describe our own empirical research on socially distributed remembering, aimed at identifying conditions for mnemonic emergence in collaborative groups. Philosophical debates about extended, embedded, and distributed cognition can thus make richer, mutually beneficial contact with independently motivated research programs in the cognitive psychology of memory.
The ability to predict the actions of other agents is vital for joint action tasks. Recent theory suggests that action prediction relies on an emulator system that permits observers to use information about their own motor dynamics to predict the actions of other agents. If this is the case, then predictions for self-generated actions should be more accurate than predictions for other-generated actions. We tested this hypothesis by employing a self/other synchronization paradigm where prediction accuracy for recordings of self-generated movements was compared with prediction accuracy for other-generated movements. As expected, predictions were more accurate when the observer’s movement dynamics matched the movement dynamics of the recording. This is consistent with the idea that the observer’s movement dynamics influence the predictions they generate.
Descartes was born in La Haye (now Descartes) in Touraine and educated at the Jesuit college of La Flèche in Anjou. Descartes’ modern reputation as a rationalistic armchair philosopher, whose mind–body dualism is the source of damaging divisions between psychology and the life sciences, is almost entirely undeserved. Some 90% of his surviving correspondence is on mathematics and scientific matters, from acoustics and hydrostatics to chemistry and the practical problems of constructing scientific instruments. Descartes was just as interested in the motions of matter as in the supernatural soul, and he advised against spending too much time on metaphysical inquiries which neglect imagination and the senses. After meeting the Dutch engineer Isaac Beeckman in 1618, Descartes became committed to a systematically ‘mechanical’ account of nature. This involved explaining all natural processes in terms of interactions between microscopic material bodies in motion. Descartes modelled his physics and cosmology on the behaviour of fluids, which also have a distinctive and central role in his physiology: the key processes for natural philosophical investigation are the circulation and mutual displacement of constrained bodies, rather than the isolated collisions of atoms in a void. Descartes settled in Holland in 1628, and commenced an ambitious programme of physiological research. In 1630 he was ‘studying chemistry and anatomy simultaneously’, and in late 1632 he was ‘dissecting the heads of various animals’, in order to ‘explain what imagination, memory, etc. consist in’. By late 1633 Descartes had almost completed L’homme, the Treatise on Man; but he abandoned plans to publish it along with a work on matter theory and optics which relied on Copernican cosmology, after he heard of the condemnation of Galileo. L’homme was published posthumously in 1662 (Latin) and 1664 (French); not until 1972 was it fully translated into English.
L’homme draws on many (mostly unnamed) Renaissance medical writers, and covers, firstly, a range of traditional physiological topics, such as digestion and respiration.
Expert skill in music performance involves an apparent paradox. On stage, expert musicians are required accurately to retrieve information that has been encoded over hours of practice. Yet they must also remain open to the demands of the ever-changing situational contingencies with which they are faced during performance. To further explore this apparent paradox and the way in which it is negotiated by expert musicians, this article profiles theories presented by Roger Chaffin, Hubert Dreyfus and Tony and Helga Noice. For Chaffin, expert skill in music performance relies solely upon overarching mental representations, while, for Dreyfus, such representations are needed only by novices, while experts rely on a more embodied form of coping. Between Chaffin and Dreyfus sit the Noices, who argue that both overarching cognitive structures and embodied processes underlie expert skill. We then present the Applying Intelligence to the Reflexes (AIR) approach: a differently nuanced model of expert skill aligned with the integrative spirit of the Noices’ research. The AIR approach suggests that musicians negotiate the apparent paradox of expert skill via a mindedness that allows flexibility of attention during music performance. We offer data from recent doctoral research conducted by the first author of this article to demonstrate at a practical level the usefulness of the AIR approach when attempting to understand the complexities of expert skill in music performance.
In Without Justification, Jonathan Sutton undermines the orthodox view that a justified belief needn’t constitute knowledge; develops a battery of arguments for the unorthodox thesis that you justifiedly believe P iff you know P; and explores the topics of testimony and inference in light of his equation of justification and knowledge (J=K). This book is essential reading at epistemology’s cutting edge. In §I, we’ll take an extended tour of the book, raising various questions and objections along the way. In §II, we’ll assess Sutton’s three main arguments for J=K, which form the heart of his project.
Analytic epistemologists agree that, whatever else is true of epistemic justification, it is distinct from knowledge. However, if recent work by Jonathan Sutton is correct, this view is deeply mistaken, for according to Sutton justification is knowledge. That is, a subject is justified in believing that p iff he knows that p. Sutton further claims that there is no concept of epistemic justification distinct from knowledge. Since knowledge is factive, a consequence of Sutton’s view is that there are no false justified beliefs. Following Sutton, I will begin by outlining kinds of beliefs that do not constitute knowledge but that seem to be justified. I will then be in a position to critically evaluate Sutton’s arguments for his position that justification is knowledge, concluding that he fails to establish his bold thesis. In the course of so doing, I will defend the following rule of assertion: (The JBK-rule) One must: assert p only if one has justification to believe that one knows that p.
The Orthodox View (OV) of the relation between epistemic justification and knowledge has it that justification is conceptually prior to knowledge—and so, can be used to provide a noncircular account of knowledge. OV has come under threat from the increasingly popular “Knowledge First” movement (KFM) in epistemology. I assess several anti-OV arguments due to three of KFM’s most prominent members: Timothy Williamson, Jonathan Sutton, and Alexander Bird. I argue that OV emerges from these attacks unscathed.
This paper assesses several prominent recent attacks on the view that epistemic justification is conceptually prior to knowledge. I argue that this view—call it the Received View (RV)—emerges from these attacks unscathed. I start with Timothy Williamson’s two strongest arguments for the claim that all evidence is knowledge (E>K), which impugns RV when combined with the claim that justification depends on evidence. One of Williamson’s arguments assumes a false epistemic closure principle; the other misses some alternative (to E>K) explanations of a putative fact about the evidence a particular subject has. Next, I neutralize each of Jonathan Sutton’s three recent arguments to the conclusion that any justified belief constitutes knowledge. Finally, I consider a recent analysis of justification due to Alexander Bird, according to which justified belief is possible knowledge. I argue that Bird’s analysis delivers neither a sufficient nor (more importantly) a necessary condition for justification.