We introduce the CUBISM system for the analysis and deep understanding of multi-participant dialogues. CUBISM brings together two typically separate forms of discourse analysis: semantic analysis and sociolinguistic analysis. In the paper proper, we describe and illustrate major components of the CUBISM system, and discuss the challenge posed by the system’s ultimate purpose, which is to automatically detect anomalous changes in participants’ expressed or implied beliefs about the world and each other, including shifts toward or away from cultural and community norms.
The paper describes a system for dealing with nestings of belief in terms of the mechanism of computational environment. A method is offered for computing the beliefs of A about B (and so on) in terms of the system’s existing knowledge structures about A and B separately. A proposal for belief percolation is put forward: percolation being a side effect of the process of the computation of nested beliefs, but one which could explain the acquisition of unsupported beliefs. It is argued that the mechanism proposed is compatible with a general least-effort hypothesis concerning human mental functioning.
I discuss two questions: (1) would Duhem have accepted the thesis of the continuity of scientific methodology? and (2) to what extent is the Oxford tradition of classification/subalternation of sciences continuous with early modern science? I argue that Duhem would have been surprised by the claim that scientific methodology is continuous; he expected at best only a continuity of physical theories, which he was trying to isolate from the perpetual fluctuations of methods and metaphysics. I also argue that the evidence does not support the conclusion that early modern doctrines about mathematics and physics are continuous with the subalternation of sciences from Grosseteste, Bacon, and the theologians of fourteenth-century Oxford. The official and dominant context for early modern scientific methodology seems to have been progressive Thomism, and early modern thinkers seem to have pitted themselves against it.
When John von Neumann turned his interest to computers, he was one of the leading mathematicians of his time. In the 1940s, he helped design two of the first stored-program digital electronic computers. He authored reports explaining the functional organization of modern computers for the first time, thereby influencing their construction worldwide (von Neumann, 1945; Burks et al., 1946). In the first of these reports, von Neumann described the computer as analogous to a brain, with an input “organ” (analogous to sensory neurons), a memory, an arithmetical and a logical “organ” (analogous to associative neurons), and an output “organ” (analogous to motor neurons). His experience with computers convinced him that brains and computers, both having to do with the processing of information, should be studied by a new discipline: automata theory. In fact, according to von Neumann, automata theory would cover not only computers and brains, but also any biological or artificial systems that dealt with information and control, including robots and genes. Von Neumann never formulated a full-blown mathematical theory of automata, but he wrote several important exploratory papers (von Neumann, 1951, 1956, 1966). Meanwhile, besides designing hardware, he developed some of the first programs, programming languages, programming techniques, and numerical methods for solving mathematical problems using computers. (Much of his work on computing is reprinted in Aspray and Burks, 1987.) Shortly before his death in 1956, he wrote an informal synthesis of his views about brains. Though von Neumann left his manuscript sketchy and unfinished, Yale University Press published it as The Computer and the Brain in 1958. The 2000 reprint of this small but informative book is an opportunity to learn, or be reminded of, von Neumann’s thoughts on the computational organization of the mind-brain.
Von Neumann began by explaining computers, which for him were essentially number crunchers: to compute was “to operate on ..
Nothing in McKay & Dennett’s (M&D’s) target article deals with the issue of how the adaptivity, or some other aspect, of beliefs might become a biological adaptation; that is, how the functions discussed might be coded in the brain in such a way that their development was also coded in gametes or sex transmission cells.