A coarse-grained W–25% Cu alloy is subjected to high-pressure torsion (HPT) at room temperature to different strains. Evolution of the microstructure during HPT processing is studied using X-ray diffraction analysis and scanning and transmission electron microscopy. It is demonstrated that HPT processing results in fragmentation of the tungsten particles and the formation of a 5–15 nm grain size nanostructure at equivalent strains of ≥256 (saturation). It is shown that the nanostructured W–25% Cu is thermostable up to 500°C, with grain growth up to 50 nm at 720°C. During HPT processing, the lattice parameter of the copper and tungsten was found to increase and decrease, respectively, with increased level of equivalent strain. This is proposed to occur through the interdiffusion of copper atoms into tungsten grains and tungsten atoms into copper grains, as suggested by energy-dispersive X-ray analysis of individual grains. The formation of a limited solid solution is considered and possible mechanisms for this effect are discussed.
By definition, the subjective probability distribution of a random event is revealed by the (‘rational’) subject's choice between bets — a view expressed by F. Ramsey, B. De Finetti, L. J. Savage and traceable to E. Borel and, it can be argued, to T. Bayes. Since hypotheses are not observable events, no bet can be made, and paid off, on a hypothesis. The subjective probability distribution of hypotheses (or of a parameter, as in the current ‘Bayesian’ statistical literature) is therefore a figure of speech, an ‘as if’, justifiable in the limit. Given a long sequence of previous observations, the subjective posterior probabilities of events still to be observed are derived by using a mathematical expression that would approximate the subjective probability distribution of hypotheses, if these could be bet on. This position was taken by most, but not all, respondents to a ‘Round Robin’ initiated by J. Marschak after M. H. DeGroot's talk on Stopping Rules presented at the UCLA Interdisciplinary Colloquium on Mathematics in Behavioral Sciences. Other participants: K. Borch, H. Chernoff, R. Dorfman, W. Edwards, T. S. Ferguson, G. Graves, K. Miyasawa, P. Randolph, L. J. Savage, R. Schlaifer, R. L. Winkler. Attention is also drawn to K. Borch's article in this issue.
Dr Edwards' stimulating and provocative book advances the thesis that the appropriate axiomatic basis for inductive inference is not that of probability, with its addition axiom, but rather likelihood: the concept introduced by Fisher as a measure of relative support amongst different hypotheses. Starting from the simplest considerations and assuming no more than a modest acquaintance with probability theory, the author sets out to reconstruct nothing less than a consistent theory of statistical inference in science.
We perceive colour, shape, sound and touch 'bound together' in a single experience. The following arguments about this binding phenomenon are raised: (1) The individual signals passing from neurone to neurone are not bound together, whether as elements of information or physically. (2) Within a single cell, binding in terms of bringing together of information is potentially feasible. A physical substrate may also be available. (3) It is therefore proposed that a bound conscious experience must be a property of an individual cell, not of a group of cells. Since it is unlikely that one specific neurone is conscious, it is suggested that every neurone has a version of our consciousness, or at least some form of sentience. However absurd this may seem, it appears to be consistent with the available evidence; arguably the only explanation that is. It probably does not alter the way we should expect to experience the world, but may help to explain the ways we seem to differ from digital computers and some of the paradoxes seen in mental illness. It predicts non-digital features of intracellular computation, for which there is already evidence, and which should be open to further experimental exploration. The arguments given may well prove flawed or the conclusion biologically or physically untenable, but the idea is raised for discussion not least because a formal demonstration that it is invalid may help to identify more fruitful avenues.
It is argued that both neuroscience and physics point towards a similar re-assessment of our concepts of space, time and 'reality', which, by removing some apparent paradoxes, may lead to a view which can provide a natural place for consciousness and language within biophysics. There are reasons to believe that relationships between entities in experiential space and time and in modern physicists' space and time are quite different, neither corresponding to our geometric schooling. The elements of the universe may be better described not as 'particles' but as dynamic processes giving rise, where they interface with each other, to the transfer, and at least in some cases experience, of 'pure' or 'active' information, the mental and physical just reflecting different standpoints. Although this analysis draws on general features of quantum dynamics, it is argued that purely quantum level events (and their 'interpretations') are unlikely to be relevant to the understanding of consciousness. The processes that might be able to give rise, within brain cells, to an experience like ours are briefly reviewed. It is suggested that the elementary signals that are integrated to generate a spatial experience may have features more in common with words than pixels. It is further suggested that the laws of integration of words in language may provide useful clues to the way biophysical integration of signals in neurons relates to integration of elements in experiential space.
In popular articles that play down the genetical differences among human populations, it is often stated that about 85% of the total genetical variation is due to individual differences within populations and only 15% to differences between populations or ethnic groups. It has therefore been proposed that the division of Homo sapiens into these groups is not justified by the genetic data. This conclusion, due to R.C. Lewontin in 1972, is unwarranted because the argument ignores the fact that most of the information that distinguishes populations is hidden in the correlation structure of the data and not simply in the variation of the individual factors. The underlying logic, which was discussed in the early years of the last century, is here discussed using a simple genetical example.
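The abstract's point about aggregation can be sketched numerically. The following simulation is illustrative only — the populations, allele frequencies and locus count are all invented, not taken from the paper: each locus on its own places only about 1% of its variance between the two populations, yet combining many such weakly informative loci classifies individuals with high accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: two populations, 100 independent biallelic loci,
# with allele frequencies differing only slightly (0.5 vs 0.6).
n, loci = 500, 100
pop_a = rng.binomial(1, 0.5, size=(n, loci))
pop_b = rng.binomial(1, 0.6, size=(n, loci))

# Per locus: between-population share of variance is tiny.
# Variance of a Bernoulli(p) variable is p(1 - p).
within = 0.5 * (0.5 * 0.5 + 0.6 * 0.4)   # mean within-population variance
total = 0.55 * 0.45                       # variance with populations pooled
between_share = 1.0 - within / total      # roughly 1% at a single locus

# Aggregating loci: classify each individual by total allele count.
scores_a = pop_a.sum(axis=1)
scores_b = pop_b.sum(axis=1)
threshold = (scores_a.mean() + scores_b.mean()) / 2
accuracy = ((scores_a < threshold).mean() + (scores_b >= threshold).mean()) / 2
```

Under these invented numbers, `between_share` comes out near 0.01 while `accuracy` exceeds 0.8 — the same kind of contrast the abstract attributes to the joint structure of many loci versus variation at any single one.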
This expands the proposal in 'Is consciousness only a property of individual cells?' to attempt to cover all relevant psychological, neuroscientific and philosophical issues. Some of the material is now dated (in 2011) but chiefly in the sense that tentative proposals have become firmer views for me. An example of this is the clarification of complementarities in 'Are our spaces made of words?'.
We propose a descriptive version of the classical multi-attribute utility model; to that end, we add a new parameter, momentary salience, to the customary formulation. The addition of this parameter allows the theory to accommodate changes in the decision maker’s mood and circumstances, as the saliencies of anticipated consequences are driven by concerns of the moment. By allowing for the number of consequences given attention at the moment of decision to vary, the new model mutes the criticism that SEU models call for an omniscient decision maker. Use of the model is illustrated with a large-scale longitudinal study showing that adolescent smokers have higher utility for smoking than nonsmokers. We also propose to use the model hierarchically to describe everyday decisions that people deal with repeatedly. Big decisions, which set policy, guide a host of nested little decisions, which in turn lead to action. For a little decision, one of the options will be consistent with the policy, and will inherit its high utility. Accordingly, most little decisions will be made quickly and will follow the policy. However, people do sometimes decide to violate their own policies, and we describe how these lapses can lead to collapse of the policy. (shrink)
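The salience-augmented model can be read as a gated variant of the customary weighted sum of attribute utilities. The sketch below is not the authors' formulation — the attribute names, weights and salience values are all hypothetical — but it shows how gating attention on consequences can reverse a preference without changing the underlying utilities.

```python
def utility(option, weights, salience):
    """Salience-gated multi-attribute utility: each attribute's
    weighted value contributes only to the extent it is salient
    at the moment of decision (salience in [0, 1])."""
    return sum(weights[a] * option[a] * salience.get(a, 0.0)
               for a in option)

# Two hypothetical options described on three attributes.
smoke   = {"pleasure": 0.9, "health": -0.8, "image": 0.4}
abstain = {"pleasure": 0.1, "health":  0.6, "image": 0.0}
weights = {"pleasure": 1.0, "health":  1.0, "image": 0.5}

# A mood in which long-term health is not salient:
now = {"pleasure": 1.0, "image": 1.0}
# A reflective mood in which all consequences receive attention:
reflective = {"pleasure": 1.0, "health": 1.0, "image": 1.0}
```

With these numbers, smoking beats abstaining when health is gated out of attention, and the preference flips when all consequences are salient — the kind of mood-dependent variation the added parameter is meant to accommodate.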
Is toleration a requirement of morality or a dictate of prudence? What limits are there to toleration? What is required of us if we are to promote a truly tolerant society? These themes--the grounds, limits, and requirements of toleration--are central to this book, which presents the W.B. Morrell Memorial Lectures on Toleration, given in 1986 at the University of York. Covering a wide range of practical and theoretical issues, the contributors--including F.A. Hayek, Maurice Cranston, and Karl Popper--consider the philosophical difficulties inherent in the concept as well as the practical problems of implementing a policy of toleration. Although the contributors differ in their conclusions about the grounds of toleration, they all share a belief in the importance of the concept both historically and in modern society.
This paper explores the question of the purpose of education within the context of Lyotard's framing of the postmodern condition. It points to some of the continuities and discontinuities in the framing of the current condition as postmodern and the recurrent problematics of truth-telling that mark this condition. It suggests that educationally the postmodern condition is marked by lifelong learning, a constant apprenticeship rather than mastery, wherein language stutters.