In information societies, operations, decisions and choices previously left to humans are increasingly delegated to algorithms, which may advise, if not decide, about how data should be interpreted and what actions should be taken as a result. More and more often, algorithms mediate social processes, business transactions, governmental decisions, and how we perceive, understand, and interact among ourselves and with the environment. Gaps between the design and operation of algorithms and our understanding of their ethical implications can have severe consequences affecting individuals as well as groups and whole societies. This paper makes three contributions to clarify the ethical importance of algorithmic mediation. It provides a prescriptive map to organise the debate. It reviews the current discussion of ethical aspects of algorithms. And it assesses the available literature in order to identify areas requiring further work to develop the ethics of algorithms.
There are many branches of philosophy called “the philosophy of X,” where X = disciplines ranging from history to physics. The philosophy of artificial intelligence has a long history, and there are many courses and texts with that title. Surprisingly, the philosophy of computer science is not nearly as well-developed. This article proposes topics that might constitute the philosophy of computer science and describes a course covering those topics, along with suggested readings and assignments.
A primary goal of quantum computer science is to find an explanation for the fact that quantum computers are more powerful than classical computers. In this paper I argue that to answer this question is to compare algorithmic processes of various kinds and to describe the possibility spaces associated with these processes. By doing this, we explain how it is possible for one process to outperform its rival. Further, in this and similar examples little is gained in subsequently asking a how-actually question. Once one has explained how-possibly, there is little left to do.
This book constitutes the refereed proceedings of the Third International Symposium on Stochastic Algorithms: Foundations and Applications, SAGA 2005, held in Moscow, Russia in October 2005. The 14 revised full papers presented together with 5 invited papers were carefully reviewed and selected for inclusion in the book. The contributed papers included in this volume cover both theoretical as well as applied aspects of stochastic computations, with a special focus on new algorithmic ideas involving stochastic decisions and the design and evaluation of stochastic algorithms within realistic scenarios.
Humans and animals make inferences about the world under limited time and knowledge. In contrast, many models of rational inference treat the mind as a Laplacean Demon, equipped with unlimited time, knowledge, and computational might. Following H. Simon's notion of satisficing, the authors have proposed a family of algorithms based on a simple psychological mechanism: one-reason decision making. These fast and frugal algorithms violate fundamental tenets of classical rationality: They neither look up nor integrate all information. By computer simulation, the authors held a competition between the satisficing "Take The Best" algorithm and various "rational" inference procedures. The Take The Best algorithm matched or outperformed all competitors in inferential speed and accuracy. This result is an existence proof that cognitive mechanisms capable of successful performance in the real world do not need to satisfy the classical norms of rational inference.
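To make the one-reason decision making described above concrete, here is a minimal sketch in the spirit of the Take The Best heuristic. The cue names, cue order, and the two hypothetical objects are illustrative assumptions, not data or code from the paper.

```python
# A minimal sketch of one-reason decision making in the spirit of "Take The Best".
# The cue list and the cue profiles below are illustrative assumptions.
import random

# Cues ordered by (assumed) validity, highest first.
CUES = ["capital", "exposition_site", "soccer_team"]

# Hypothetical cue profiles: 1 = positive, 0 = negative, None = unknown.
CITIES = {
    "A": {"capital": 1, "exposition_site": 1, "soccer_team": None},
    "B": {"capital": 0, "exposition_site": 1, "soccer_team": 1},
}

def take_the_best(x, y, cues=CUES, profiles=CITIES):
    """Infer which of x, y scores higher on the criterion (e.g. population).

    Search cues in order of validity and stop at the first cue that
    discriminates; no further information is integrated (one-reason decision).
    """
    for cue in cues:
        vx, vy = profiles[x].get(cue), profiles[y].get(cue)
        if vx == 1 and vy != 1:
            return x
        if vy == 1 and vx != 1:
            return y
    return random.choice([x, y])  # no cue discriminates: guess

print(take_the_best("A", "B"))  # decided by the first discriminating cue
```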
This paper presents the first bibliometric mapping analysis of the field of computer and information ethics (C&IE). It provides a map of the relations between 400 key terms in the field. This term map can be used to get an overview of concepts and topics in the field and to identify relations between information and communication technology concepts on the one hand and ethical concepts on the other hand. To produce the term map, a data set of over a thousand articles published in leading journals and conference proceedings in the C&IE field was constructed. With the help of various computer algorithms, key terms were identified in the titles and abstracts of the articles and co-occurrence frequencies of these key terms were calculated. Based on the co-occurrence frequencies, the term map was constructed. This was done using a computer program called VOSviewer. The term map provides a visual representation of the C&IE field and, more specifically, of the organization of the field around three main concepts, namely privacy, ethics, and the Internet.
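As a rough illustration of the kind of preprocessing involved, the following sketch counts co-occurrences of key terms in a handful of made-up titles and abstracts. The term list and documents are assumptions for the example; VOSviewer itself is not reimplemented here.

```python
# A minimal sketch of counting key-term co-occurrences in titles/abstracts,
# the kind of matrix a term-mapping tool such as VOSviewer takes as input.
# The term list and documents below are illustrative, not the paper's data set.
from itertools import combinations
from collections import Counter

TERMS = ["privacy", "ethics", "internet", "surveillance", "autonomy"]

documents = [
    "privacy and ethics of internet surveillance",
    "autonomy, privacy and the ethics of profiling",
    "internet governance and surveillance",
]

cooccurrence = Counter()
for doc in documents:
    present = sorted(t for t in TERMS if t in doc.lower())
    for a, b in combinations(present, 2):   # each unordered pair, once per document
        cooccurrence[(a, b)] += 1

for (a, b), n in cooccurrence.most_common():
    print(f"{a} -- {b}: {n}")
```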
In the technical literature of computer science, the concept of an effective procedure is closely associated with the notion of an instruction that precisely specifies an action. Turing machine instructions are held up as providing paragons of instructions that "precisely describe" or "well define" the actions they prescribe. Numerical algorithms and computer programs are judged effective just insofar as they are thought to be translatable into Turing machine programs. Nontechnical procedures (e.g., recipes, methods) are summarily dismissed as ineffective on the grounds that their instructions lack the requisite precision. But despite the pivotal role played by the notion of a precisely specified instruction in classifying procedures as effective and ineffective, little attention has been paid to the manner in which instructions "precisely specify" the actions they prescribe. It is the purpose of this paper to remedy this defect. The results are startling. The reputed exemplary precision of Turing machine instructions turns out to be a myth. Indeed, the most precise specifications of action are provided not by the procedures of theoretical computer science and mathematics (algorithms) but rather by the nontechnical procedures of everyday life. I close with a discussion of some of the ramifications of these conclusions for understanding and designing concrete computers and their programming languages.
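For readers who want the object of comparison in front of them, here is a minimal sketch of a table of Turing-machine-style instructions: each entry specifies, for a state and scanned symbol, what to write, how to move the head, and which state to enter next. The particular machine (it complements a binary string) is an assumed example, not one drawn from the paper.

```python
# A minimal sketch of Turing-machine-style instructions: each entry maps
# (state, scanned symbol) to (symbol to write, head move, next state).
# The particular machine (it complements a binary string) is an assumed example.
BLANK = "_"
INSTRUCTIONS = {
    ("flip", "0"): ("1", +1, "flip"),
    ("flip", "1"): ("0", +1, "flip"),
    ("flip", BLANK): (BLANK, 0, "halt"),
}

def run(tape_string, state="flip"):
    tape = dict(enumerate(tape_string))
    head = 0
    while state != "halt":
        symbol = tape.get(head, BLANK)
        write, move, state = INSTRUCTIONS[(state, symbol)]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape) if tape[i] != BLANK)

print(run("0110"))  # -> "1001"
```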
In this paper I attempt to cast the current program verification debate within a more general perspective on the methodologies and goals of computer science. I show, first, how any method involved in demonstrating the correctness of a physically executing computer program, whether by testing or formal verification, involves reasoning that is defeasible in nature. Then, through a delineation of the senses in which programs can be run as tests, I show that the activities of testing and formal verification do not necessarily share the same goals and thus do not always constitute alternatives. The testing of a program is not always intended to demonstrate a program's correctness. Testing may seek to accept or reject nonprograms including algorithms, specifications, and hypotheses regarding phenomena. The relationship between these kinds of testing and formal verification is couched in a more fundamental relationship between two views of computer science, one properly containing the other.
The formulas-as-types isomorphism tells us that every proof and theorem, in the intuitionistic implicational logic $H_\rightarrow$, corresponds to a lambda term or combinator and its type. The algorithms of Bunder very efficiently find a lambda term inhabitant, if any, of any given type of $H_\rightarrow$ and of many of its subsystems. In most cases the search procedure has a simple bound based roughly on the length of the formula involved. Computer implementations of some of these procedures were done in Dekker. In this paper we extend these methods to full classical propositional logic as well as to its various subsystems. This extension has partly been implemented by Oostdijk.
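A standard worked instance of the formulas-as-types correspondence may help fix ideas (this is the textbook example, not Bunder's search procedure): the combinator K = λx.λy.x inhabits the $H_\rightarrow$ theorem α → (β → α), as the following derivation shows.

```latex
% A standard instance of formulas-as-types (not Bunder's search procedure):
% the combinator K = \lambda x.\lambda y.\,x inhabits \alpha \to (\beta \to \alpha).
\[
\frac{\dfrac{x:\alpha,\; y:\beta \;\vdash\; x:\alpha}
            {x:\alpha \;\vdash\; \lambda y.\,x \;:\; \beta \to \alpha}}
     {\vdash\; \lambda x.\lambda y.\,x \;:\; \alpha \to (\beta \to \alpha)}
\]
```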
This book constitutes the refereed proceedings of the 8th International Conference on Theory and Applications of Satisfiability Testing, SAT 2005, held in St Andrews, Scotland in June 2005. The 26 revised full papers presented together with 16 revised short papers presented as posters during the technical programme were carefully selected from 73 submissions. The whole spectrum of research in propositional and quantified Boolean formula satisfiability testing is covered including proof systems, search techniques, probabilistic analysis of algorithms and their properties, problem encodings, industrial applications, specific tools, case studies, and empirical results.
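As a pointer to the kind of search technique covered in such proceedings, here is a minimal sketch of DPLL-style satisfiability search with unit propagation and splitting. The clause encoding (lists of signed integers, DIMACS-style) is an assumption made for brevity; real solvers add clause learning, decision heuristics, and much more.

```python
# A minimal sketch of DPLL-style satisfiability search (unit propagation plus
# splitting). Clauses are lists of non-zero integers; a negative literal is the
# negation of its absolute value (DIMACS-style encoding, assumed for brevity).
def dpll(clauses, assignment=None):
    assignment = dict(assignment or {})
    changed = True
    while changed:                      # unit propagation to a fixed point
        changed = False
        simplified = []
        for clause in clauses:
            if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                continue                # clause already satisfied
            rest = [l for l in clause if abs(l) not in assignment]
            if not rest:
                return None             # clause falsified: conflict
            if len(rest) == 1:
                l = rest[0]
                assignment[abs(l)] = l > 0
                changed = True
            else:
                simplified.append(rest)
        clauses = simplified
    if not clauses:
        return assignment               # all clauses satisfied
    var = abs(clauses[0][0])            # split on an unassigned variable
    for value in (True, False):
        result = dpll(clauses, {**assignment, var: value})
        if result is not None:
            return result
    return None

print(dpll([[1, 2], [-1, 3], [-2, -3]]))  # a satisfying assignment, or None
```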
There are many algorithm texts that provide lots of well-polished code and proofs of correctness. Instead, this book presents insights, notations, and analogies to help the novice describe and think about algorithms like an expert. By looking at both the big picture and easy step-by-step methods for developing algorithms, the author helps students avoid the common pitfalls. He stresses paradigms such as loop invariants and recursion to unify a huge range of algorithms into a few meta-algorithms. Part of the goal is to teach the students to think abstractly. Without getting bogged down in formal proofs, the book fosters a deeper understanding of how and why each algorithm works. These insights are presented in a slow and clear manner accessible to second- or third-year students of computer science, preparing them to find their own innovative ways to solve problems.
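A small example of the loop-invariant paradigm the book stresses: binary search over a sorted list, with the invariant that justifies each step stated as comments. The example is illustrative and is not reproduced from the book.

```python
# Binary search over a sorted list, with the loop invariant stated as comments.
# (Illustrative example; not reproduced from the book.)
def binary_search(a, target):
    lo, hi = 0, len(a)
    # Invariant: if target occurs in a, its index lies in the half-open range [lo, hi).
    while lo < hi:
        mid = (lo + hi) // 2
        if a[mid] < target:
            lo = mid + 1        # everything at or below mid is too small
        elif a[mid] > target:
            hi = mid            # everything at or above mid is too large
        else:
            return mid
        # Invariant re-established; [lo, hi) strictly shrinks, so the loop terminates.
    return -1                   # range empty: target not present

print(binary_search([2, 3, 5, 7, 11, 13], 7))   # -> 3
print(binary_search([2, 3, 5, 7, 11, 13], 6))   # -> -1
```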
This book constitutes the refereed proceedings of the 7th International Conference on Theory and Applications of Satisfiability Testing, SAT 2004, held in Vancouver, BC, Canada in May 2004. The 24 revised full papers presented together with 2 invited papers were carefully selected from 72 submissions. In addition there are 2 reports on the 2004 SAT Solver Competition and the 2004 QBF Solver Evaluation. The whole spectrum of research in propositional and quantified Boolean formula satisfiability testing is covered, bringing together the fields of theoretical and experimental computer science as well as the many relevant application areas.
A fixed-parameter algorithm is one that provides an optimal solution to a combinatorial problem. This research-level text is an application-oriented introduction to the growing and highly topical area of the development and analysis of efficient fixed-parameter algorithms for hard problems. The book is divided into three parts: a broad introduction that provides the general philosophy and motivation; followed by coverage of algorithmic methods developed over the years in fixed-parameter algorithmics forming the core of the book; and a discussion of the essentials of parameterized hardness theory with a focus on W-hardness, which parallels NP-hardness, then stating some relations to polynomial-time approximation algorithms, and finishing up with a list of selected case studies to show the wide range of applicability of the presented methodology. Aimed at graduate and research mathematicians, programmers, algorithm designers and computer scientists, the book introduces the basic techniques and results and provides a fresh view on this highly innovative field of algorithmic research.
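One of the canonical techniques in this area is the bounded search tree; the following sketch applies it to Vertex Cover parameterized by the solution size k, giving a search tree of size at most 2^k. The example is illustrative and is not reproduced from the book.

```python
# A minimal sketch of the bounded-search-tree technique from fixed-parameter
# algorithmics: Vertex Cover parameterized by solution size k. Branching on an
# arbitrary uncovered edge gives a search tree of size at most 2^k.
# (Illustrative example; not reproduced from the book.)
def vertex_cover(edges, k):
    """Return a vertex cover of size <= k, or None if none exists."""
    uncovered = list(edges)
    if not uncovered:
        return set()
    if k == 0:
        return None
    u, v = uncovered[0]
    for chosen in (u, v):               # one endpoint of the edge must be in any cover
        remaining = [e for e in uncovered if chosen not in e]
        sub = vertex_cover(remaining, k - 1)
        if sub is not None:
            return {chosen} | sub
    return None

edges = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a"), ("a", "c")]
print(vertex_cover(edges, 2))   # e.g. {'a', 'c'}
print(vertex_cover(edges, 1))   # None
```

The running time is exponential only in the parameter k, not in the size of the graph, which is the defining property of a fixed-parameter algorithm.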
Color has long been at home in the domains of classical art and aesthetics. However, with the introduction of computer art in Germany in the early 1960s, a new ‘rational theory’ of art, media and color emerged. Many believed this new ‘science’ of art would generate computer algorithms which would enable new media aesthetic ‘principles to be formulated mathematically’ — thus ending the lofty mystifications that have, for too long, been associated with Romantic notions about artwork and art-making. Although, as German computer artist Herbert Franke noted, ‘Traditional aesthetic modes of expression are of little avail to the artist who works with technical systems, particularly computers’, I argue herein that the shifts in aesthetic theory brought about by the introduction of computer art are more complex than a clear-cut subsumption to mathematics, information theory or cybernetics. While the ‘Programming the Beautiful’ project that began in Europe in the 1960s — the three-fold rationalization of the artist, the artwork and color — has been largely realized, nonetheless, today we find ourselves steeped in the neo-Romanticization of new media art, the reglorification of the role of the artist, and new digital color — ironically, the exact opposite of what the project for rational art initially sought.
This book constitutes the refereed proceedings of the 14th International Conference on Theory and Applications of Satisfiability Testing, SAT 2011, held in Ann Arbor, MI, USA in June 2011. The 25 revised full papers presented together with ...
Part I presents a model of interactive computation and a metric for expressiveness, Part II relates interactive models of computation to physics, and Part III considers empirical models from a philosophical perspective. Interaction machines, which extend Turing Machines to interaction, are shown in Part I to be more expressive than Turing Machines by a direct proof, by adapting Gödel's incompleteness result, and by observability metrics. Observation equivalence provides a tool for measuring expressiveness according to which interactive systems are more expressive than algorithms. Refinement of function equivalence by observation of outer interactive behavior and inner computation steps is examined. The change of focus from algorithms specified by computable functions to interaction specified by observation equivalence captures the essence of empirical computer science. Part II relates interaction in models of computation to observation in the natural sciences. Explanatory power in physics is specified by the same observability metric as expressiveness in interactive systems. Realist models of inner structure are characterized by induction, abduction, and Occam's Razor. Interactive realism extends the hidden-variable model of Einstein to hidden interfaces that provide extra degrees of freedom to formulate hypotheses with testable predictions conforming with quantum theory. Greater expressiveness of collaborative computational observers (writers) than single observers implies that hidden-interface models are more expressive than hidden-variable models. By providing a common foundation for empirical computational and physical models we can use precise results about computational models to establish properties of physical models. Part III shows that the evolution in computing from algorithms to interaction parallels that in physics from rationalism to empiricism. Plato's cave metaphor is interactively extended from Platonic rationalism to empiricism. The Turing test is extended to TMs with hidden interfaces that express interactive thinking richer than the traditional Turing test. Interactive (nonmonotonic) extensions of logic such as the closed-world assumption suggest that interactiveness is incompatible with monotonic logical inference. Procedure call, atomicity of transactions, and taking a fixed point are techniques for closing open systems similar to "preparation" followed by "observation" of a physical system. Pragmatics is introduced as a framework for extending logical models with a fixed syntax and semantics to multiple-interface models that support collaboration among clients sharing common resources.
A proof of ‘correctness’ for a mathematical algorithm cannot be relevant to executions of a program based on that algorithm because both the algorithm and the proof are based on assumptions that do not hold for computations carried out by real-world computers. Thus, proving the ‘correctness’ of an algorithm cannot establish the trustworthiness of programs based on that algorithm. Despite the (deceptive) sameness of the notations used to represent them, the transformation of an algorithm into an executable program is a wrenching metamorphosis that changes a mathematical abstraction into a prescription for concrete actions to be taken by real computers. Therefore, it is verification of program executions (processes) that is needed, not of program texts that are merely the scripts for those processes. In this view, verification is the empirical investigation of: (a) the behavior that programs invoke in a computer system and (b) the larger context in which that behavior occurs. Here, deduction can play no more, and no less, a role than it does in the empirical sciences.
Types now play an essential role in computer science; their ascent originates from Principia Mathematica. Type checking and type inference algorithms are used to prevent semantic errors in programs, and type theories are the native language of several major interactive theorem provers. Some of these trace key features back to Principia.
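To give a concrete, if minimal, instance of the kind of type-checking algorithm at issue, the following sketch checks terms of the simply typed lambda calculus; the tuple encoding of terms and types is an assumption made for the example.

```python
# A minimal sketch of type checking for the simply typed lambda calculus.
# Terms and types are encoded as nested tuples (an assumed, illustrative encoding):
#   ("var", x) | ("lam", x, type, body) | ("app", f, a)      -- terms
#   base-type names (strings), or ("arrow", t1, t2)          -- types
def type_of(term, ctx=None):
    """Return the type of `term` in context `ctx`, or raise if it is ill-typed."""
    ctx = ctx or {}
    kind = term[0]
    if kind == "var":
        return ctx[term[1]]
    if kind == "lam":
        _, x, t, body = term
        return ("arrow", t, type_of(body, {**ctx, x: t}))
    if kind == "app":
        _, f, a = term
        tf, ta = type_of(f, ctx), type_of(a, ctx)
        if tf[0] == "arrow" and tf[1] == ta:
            return tf[2]
        raise TypeError(f"cannot apply {tf} to {ta}")
    raise ValueError(f"unknown term: {term!r}")

# K = \x:A. \y:B. x has type A -> (B -> A)
K = ("lam", "x", "A", ("lam", "y", "B", ("var", "x")))
print(type_of(K))   # -> ('arrow', 'A', ('arrow', 'B', 'A'))
```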
This article reviews the strengths and limitations of five major paradigms of medical computer-assisted decision making (CADM): (1) clinical algorithms, (2) statistical analysis of collections of patient data, (3) mathematical models of physical processes, (4) decision analysis, and (5) symbolic reasoning or artificial intelligence (AI). No one technique is best for all applications, and there is recent promising work which combines two or more established techniques. We emphasize both the inherent power of symbolic reasoning and the promise of artificial intelligence and the other techniques to complement each other.
Mathematical Logic for Computer Science is a mathematics textbook with theorems and proofs, but the choice of topics has been guided by the needs of computer science students. The method of semantic tableaux provides an elegant way to teach logic that is both theoretically sound and yet sufficiently elementary for undergraduates. To provide a balanced treatment of logic, tableaux are related to deductive proof systems. The logical systems presented are: propositional calculus (including binary decision diagrams), predicate calculus, resolution, Hoare logic, Z, and temporal logic. Answers to exercises (for instructors only) as well as Prolog source code for algorithms may be found via the Springer London web site: http://www.springer.com/978-1-85233-319-5. Mordechai Ben-Ari is an associate professor in the Department of Science Teaching of the Weizmann Institute of Science. He is the author of numerous textbooks on concurrency, programming languages and logic, and has developed software tools for teaching concurrency. In 2004, Ben-Ari received the ACM/SIGCSE Award for Outstanding Contributions to Computer Science Education.
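A compact sketch of the tableau method for the propositional connectives may be useful alongside the description above. The tuple encoding of formulas is an assumption; the book's own implementations (in Prolog) are available at the URL given in the entry.

```python
# A compact sketch of the semantic-tableau method for propositional logic only.
# Formulas are nested tuples (an assumed encoding): an atom is a string, and
# compound formulas are ("not", f), ("and", f, g), ("or", f, g), ("imp", f, g).
def satisfiable(branch):
    """Return True iff the set of formulas on this tableau branch is satisfiable."""
    branch = list(branch)
    for i, f in enumerate(branch):
        rest = branch[:i] + branch[i + 1:]
        if isinstance(f, str):
            continue                                    # positive literal
        op = f[0]
        if op == "not" and isinstance(f[1], str):
            continue                                    # negative literal
        if op == "and":                                 # alpha rules: extend the branch
            return satisfiable(rest + [f[1], f[2]])
        if op == "not" and f[1][0] == "or":
            return satisfiable(rest + [("not", f[1][1]), ("not", f[1][2])])
        if op == "not" and f[1][0] == "imp":
            return satisfiable(rest + [f[1][1], ("not", f[1][2])])
        if op == "not" and f[1][0] == "not":
            return satisfiable(rest + [f[1][1]])
        if op == "or":                                  # beta rules: the branch splits
            return satisfiable(rest + [f[1]]) or satisfiable(rest + [f[2]])
        if op == "imp":
            return satisfiable(rest + [("not", f[1])]) or satisfiable(rest + [f[2]])
        if op == "not" and f[1][0] == "and":
            return satisfiable(rest + [("not", f[1][1])]) or satisfiable(rest + [("not", f[1][2])])
    # Only literals remain: the branch is open unless it contains p and ("not", p).
    atoms = {f for f in branch if isinstance(f, str)}
    return not any(f[1] in atoms for f in branch if not isinstance(f, str))

def valid(f):
    return not satisfiable([("not", f)])        # f is valid iff its negation has no open tableau

print(valid(("imp", "p", ("or", "p", "q"))))    # True:  p -> (p v q)
print(valid(("imp", ("or", "p", "q"), "p")))    # False: (p v q) -> p
```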
In this essay, I examine the notion of a nondeterministic algorithm from both a conceptual and historical point of view. I argue that the intuitions underwriting nondeterminism in the context of contemporary theoretical computer science cannot be reconciled with the intuitions that originally motivated nondeterminism. I identify four different intuitions about nondeterminism: nondeterminism as evidence for the Church-Turing thesis; nondeterminism as a natural reflection of the mathematician's behavior; nondeterminism as a formal, mathematical generalization; and nondeterminism as a physical process. I shall argue that there are irreconcilable tensions among these intuitions and, moreover, that these tensions have not been acknowledged in the conceptual development of theoretical computer science. In fact, a careful review of the received history reveals a number of gaps and discontinuities. For instance, I examine the seminal writings of Turing and argue that the formal introduction of nondeterminism is to be found there and not, as is often supposed, in the research of the late 1950s and early 1960s. I also examine the period after 1960 and argue that even the more recent developments concerning nondeterminism are hard to reconstruct. Although I firmly believe that a rigorous philosophical account of nondeterminism is needed, I am not sure that such an account will bring us any closer to a solution to any of the notoriously difficult theoretical problems that turn on the notion of a nondeterministic algorithm. I do believe, however, that a clear conceptual account of nondeterminism will shed light on many long-standing philosophical problems. I conclude by pointing to a number of philosophical questions that resonate with issues raised by the foregoing investigation of nondeterministic algorithms.
There are theoretical limitations to what can be implemented by a computer program. In this paper we are concerned with a limitation on the strength of computer implemented deduction. We use a version of the Curry paradox to arrive at this limitation.
Context: Consistency of mathematical constructions in numerical analysis and the application of computerized proofs in the light of the occurrence of numerical chaos in simple systems. Purpose: To show that a computer in general and a numerical analysis in particular can add its own peculiarities to the subject under study. Hence the need for thorough theoretical studies of chaos in numerical simulation. Hence, a questioning of what e.g. a numerical disproof of a theorem in physics or a prediction in numerical economics could mean. Method: A simple algebraic model system is subjected to a deeper structure of underlying variables. With an algorithm simulating the steps in taking a limit of second-order difference quotients, the error terms are studied against the background of their algebraic expression. Results: With the algorithm that was applied to a simple quadratic polynomial system we found unstably amplified round-off errors. The possibility of numerical chaos is already known, but not in such a simple system as used in our paper. The amplification of the errors implies that it is not possible with computer means to constructively show that the algebra and numerical analysis will ‘in the long run’ converge to each other and the error term will vanish. The algebraic vanishing of the error term cannot be demonstrated with the use of the computer because the round-off errors are amplified. In philosophical terms, the amplification of the round-off error is equivalent to the continuum hypothesis. This means that the requirement of (numerical) construction of mathematical objects is no safeguard against inference-only conclusions of qualities of (numerical) mathematical objects. Unstably amplified round-off errors are the same type of problem as the ordering in size of transfinite cardinal numbers. The difference is that the former problem is created within the requirements of constructive mathematics. This can be seen as the reward for working numerically constructively.
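A generic numerical illustration of the phenomenon described (not the paper's particular system): for f(x) = x², the second-order difference quotient equals 2 exactly in algebra, yet in double-precision floating point the round-off error is amplified rather than vanishing as the step h shrinks.

```python
# A generic illustration (not the paper's particular system) of how round-off
# error is amplified in second-order difference quotients. Algebraically,
# (f(x+h) - 2 f(x) + f(x-h)) / h**2 equals exactly 2 for f(x) = x**2, yet in
# double precision the error grows as h shrinks instead of vanishing.
def second_difference(f, x, h):
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

f = lambda x: x * x
for exponent in range(1, 9):
    h = 10.0 ** -exponent
    approx = second_difference(f, 1.0, h)
    print(f"h = 1e-{exponent}:  quotient = {approx:.10f}   error = {abs(approx - 2.0):.2e}")
```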
This paper combines naturalized metaphysics and a philosophical reflection on a recently evolving interdisciplinary branch of quantum chemistry, ab initio molecular dynamics. Bridging the gaps among chemistry, physics, and computer science, this cutting-edge research field explores the structure and dynamics of complex molecular many-body systems through computer simulations. These simulations are allegedly crafted solely by the laws of fundamental physics, and are explicitly designed to capture nature as closely as possible. The models and algorithms employed, however, involve many approximations and significant degrees of idealization of their target systems. Therefore, for philosophers of science the pivotal question of whether relying only on the fundamental laws of physics supports a reductionist or realist stance arises. One conceivable answer to this question is that the irreducible approximations and idealizations support rather anti-realist positions. After reviewing an influential attitude in the philosophy of computer simulations and the debate concerning scientific realism, I offer a fair interpretation of such ab initio modelling in quantum chemistry within a naturalistic metaphysical framework that gives rise to a specific type of ontic structural realism.
Combining physics, mathematics and computer science, quantum computing has developed in the past two decades from a visionary idea to one of the most fascinating areas of quantum mechanics. The recent excitement in this lively and speculative domain of research was triggered by Peter Shor (1994) who showed how a quantum algorithm could exponentially "speed up" classical computation and factor large numbers into primes much more rapidly (at least in terms of the number of computational steps involved) than any known classical algorithm. Shor's algorithm was soon followed by several other algorithms that aimed to solve combinatorial and algebraic problems, and in the last few years theoretical study of quantum systems serving as computational devices has achieved tremendous progress. Common belief has it that the implementation of Shor's algorithm on a large scale quantum computer would have devastating consequences for current cryptography protocols which rely on the premiss that all known classical worst case algorithms for factoring take time exponential in the length of their input (see, e.g., Preskill 2005). Consequently, experimentalists around the world are engaged in tremendous attempts to tackle the technological difficulties that await the realization of such a large scale quantum computer. But regardless of whether these technological problems can be overcome (Unruh 1995, Ekert and Jozsa 1996, Haroche and Raimond 1996), it is noteworthy that no proof exists yet for the general superiority of quantum computers over their classical counterparts.
The algorithm, a building block of computer science, is defined from an intuitive and pragmatic point of view, through a methodological lens of philosophy rather than that of formal computation. The treatment extracts properties of abstraction, control, structure, finiteness, effective mechanism, and imperativity, and intentional aspects of goal and preconditions. The focus on the algorithm as a robust conceptual object obviates issues of correctness and minimality. Neither the articulation of an algorithm nor the dynamic process constitutes the algorithm itself. Analysis for implications in computer science and philosophy reveals unexpected results, new questions, and new perspectives on current questions, including the relationship between our informally construed algorithms and Turing machines. Exploration in terms of current computational and philosophical thinking invites further developments.
Some philosophers suggest that the development of scientific knowledge is a kind of Darwinian process. The process of discovery, however, is one problematic element of this analogy. I compare Herbert Simon's attempt to simulate scientific discovery in a computer program to recent connectionist models that were not designed for that purpose, but which provide useful cases to help evaluate this aspect of the analogy. In contrast to the classic A.I. approach Simon used, "neural networks" contain no explicit protocols, but are generic learning systems built on the model of the interconnections of neurons in the brain. I describe two cases that take the connectionist approach a step further by using genetic algorithms, a form of evolutionary computation that explicitly models Darwinian mechanisms. These cases show that Darwinian mechanisms can make novel discoveries of complex, previously unknown patterns. With some caveats, they lend support to evolutionary epistemology.
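To make the Darwinian mechanisms concrete, here is a minimal genetic algorithm on a toy bitstring problem (maximize the number of 1-bits). It is an illustrative sketch, not either of the case studies discussed in the paper.

```python
# A minimal genetic algorithm on a toy problem (maximise the number of 1-bits),
# just to make the Darwinian mechanisms concrete. It is not either of the case
# studies discussed in the paper.
import random

GENES, POP, GENERATIONS, MUTATION = 20, 30, 60, 0.02
fitness = sum                                        # count of 1-bits in the genome

def tournament(population):
    return max(random.sample(population, 3), key=fitness)

population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
for gen in range(GENERATIONS):
    offspring = []
    while len(offspring) < POP:
        a, b = tournament(population), tournament(population)
        cut = random.randrange(1, GENES)                                      # one-point crossover
        child = a[:cut] + b[cut:]
        child = [1 - g if random.random() < MUTATION else g for g in child]   # mutation
        offspring.append(child)
    population = offspring

print("best fitness after evolution:", fitness(max(population, key=fitness)))
```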
After the novel, and subsequently cinema, privileged narrative as the key form of cultural expression of the modern age, the computer age introduces its correlate — database. Why does new media favour the database form over others? Can we explain its popularity by analysing the specificity of the digital medium and of computer programming? What is the relationship between database and another form, which has traditionally dominated human culture — narrative? In addressing these questions, I discuss the connection between the computer's ontology — the way software represents the world — and the new cultural forms privileged by computer culture such as database. I propose that computerisation of culture involves projection of two fundamental parts of computer software — data structures and algorithms — onto the cultural sphere. Thus CD-ROMs and Web databases are cultural manifestations of one half of this ontology — data structures; while new media narratives are manifestations of the second part — algorithms. I conclude by proposing that in computer culture database and narrative do not have the same status. Given that on the level of data organisation most new media objects are databases, it is not surprising that on the level of form database also dominates new media culture.
We show that in quantum logic of closed subspaces of Hilbert space one cannot substitute quantum operations for classical (standard Hilbert space) ones and treat them as primitive operations. We consider two possible ways of such a substitution and arrive at operation algebras that are not lattices, which proves the claim. We devise algorithms and programs which write down any two-variable expression in an orthomodular lattice by means of classical and quantum operations in an identical form. Our results show that lattice structure and classical operations uniquely determine quantum logic underlying Hilbert space. As a consequence of our result, recent proposals for a deduction theorem with quantum operations in an orthomodular lattice, as well as a substitution of quantum operations for the usual standard Hilbert space ones in quantum logic, prove to be misleading. Quantum computer quantum logic is also discussed.
The increasing knowledge intensity of jobs, typical of a knowledge economy, highlights the role of firms as integrators of know-how and skills. As economic activity becomes mainly intellectual and requires the integration of specific and idiosyncratic skills, firms need to allocate skills to tasks, and traditional hierarchical control becomes increasingly ineffective. In this work, we explore under what circumstances networks of agents, which bear specific skills, may self-organize in order to complete tasks. We use a computer simulation approach and investigate how local interaction of agents, endowed with skills and individual decision-making rules, may produce an aggregate network structure able to perform tasks. To design algorithms that mimic individual decision-making, we borrow from the computer science literature and, in particular, from studies addressing protocols that produce cooperation in Peer-to-Peer networks. We found that self-organization depends on the structural features of formal or informal organizational networks embedding both professionals, who hold skills, and project managers, who hold access to jobs.
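A heavily simplified toy model in the spirit of the setup described above (assumed for illustration, not the paper's actual simulation): agents holding single skills sit on a random network, project managers hold tasks requiring sets of skills, and teams can be assembled only by searching a manager's immediate neighbourhood.

```python
# A heavily simplified, assumed toy model (not the paper's actual simulation):
# agents with single skills on a random network, tasks requiring sets of skills,
# and teams assembled only through local neighbourhood search.
import random

random.seed(1)
SKILLS = list("ABCDE")
N_AGENTS, N_TASKS, NEIGHBOURS = 40, 15, 4

agents = {i: random.choice(SKILLS) for i in range(N_AGENTS)}
network = {i: random.sample([j for j in agents if j != i], NEIGHBOURS) for i in agents}
tasks = [set(random.sample(SKILLS, 3)) for _ in range(N_TASKS)]

def assemble_team(manager, needed):
    """Recruit skills by looking only at the manager's neighbourhood (local interaction)."""
    team, missing = [], set(needed)
    for j in network[manager]:
        if agents[j] in missing:
            team.append(j)
            missing.discard(agents[j])
    return team if not missing else None

completed = sum(assemble_team(random.choice(list(agents)), task) is not None for task in tasks)
print(f"tasks completed through purely local recruitment: {completed}/{N_TASKS}")
```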