Search results for 'Computer algorithms'

1000+ found
  1. A. P. Ershov & Donald Ervin Knuth (eds.) (1981). Algorithms in Modern Mathematics and Computer Science: Proceedings, Urgench, Uzbek SSR, September 16-22, 1979. Springer-Verlag.
  2. J. Arnoldi (2016). Computer Algorithms, Market Manipulation and the Institutionalization of High Frequency Trading. Theory, Culture and Society 33 (1):29-52.
  3. Charles Rackoff (2006). Dietzfelbinger Martin. Primality Testing in Polynomial Time—From Randomized Algorithms to “PRIMES is in P”. Lecture Notes in Computer Science, Vol. 3000. Springer-Verlag, 2004, x + 147 pp. [REVIEW] Bulletin of Symbolic Logic 12 (3):494-496.
  4. Jiri Becvar (1971). Review: Robert R. Korfhage, Logic and Algorithms with Applications to the Computer and Information Sciences. [REVIEW] Journal of Symbolic Logic 36 (2):344-346.
  5. Jiří Bečvář (1971). Korfhage Robert R. Logic and Algorithms with Applications to the Computer and Information Sciences. John Wiley & Sons, Inc., New York, London, and Sydney, 1966, xii + 194 pp. [REVIEW] Journal of Symbolic Logic 36 (2):344-346.
  6. W. Dean (forthcoming). Algorithms and the Mathematical Foundations of Computer Science. Notre Dame Journal of Formal Logic.
  7. Witold Lipski (1977). Paterson M. S. Complexity of Matrix Algorithms. Foundations of Computer Science, edited by de Bakker J. W., Mathematical Centre Tracts 63, Mathematisch Centrum, Amsterdam 1975, pp. 179–215. [REVIEW] Journal of Symbolic Logic 42 (3):422.
  8. Nancy Lynch (1981). Machtey Michael and Young Paul. An Introduction to the General Theory of Algorithms. The Computer Science Library, Theory of Computation Series. North-Holland, New York, Oxford, and Shannon, 1978, vii + 264 pp. [REVIEW] Journal of Symbolic Logic 46 (4):877-878.
  9. Brent Mittelstadt, Patrick Allo, Mariarosaria Taddeo, Sandra Wachter & Luciano Floridi (forthcoming). The Ethics of Algorithms: Mapping the Debate. Big Data and Society.
    In information societies, operations, decisions and choices previously left to humans are increasingly delegated to algorithms, which may advise, if not decide, about how data should be interpreted and what actions should be taken as a result. More and more often, algorithms mediate social processes, business transactions, governmental decisions, and how we perceive, understand, and interact among ourselves and with the environment. Gaps between the design and operation of algorithms and our understanding of their ethical implications can (...)
  10. Donald Ervin Knuth (2010). Selected Papers on Design of Algorithms. Center for the Study of Language and Information.
  11. André Thayse (1984). P-Functions and Boolean Matrix Factorization: A Unified Approach for Wired, Programmed, and Microprogrammed Implementations of Discrete Algorithms.
  12. Michael E. Cuffaro (2015). How-Possibly Explanations in (Quantum) Computer Science. Philosophy of Science 82 (5):737-748.
    A primary goal of quantum computer science is to find an explanation for the fact that quantum computers are more powerful than classical computers. In this paper I argue that to answer this question is to compare algorithmic processes of various kinds and to describe the possibility spaces associated with these processes. By doing this, we explain how it is possible for one process to outperform its rival. Further, in this and similar examples little is gained in subsequently asking (...)
  13. William J. Rapaport (2005). Philosophy of Computer Science. Teaching Philosophy 28 (4):319-341.
    There are many branches of philosophy called “the philosophy of X,” where X = disciplines ranging from history to physics. The philosophy of artificial intelligence has a long history, and there are many courses and texts with that title. Surprisingly, the philosophy of computer science is not nearly as well-developed. This article proposes topics that might constitute the philosophy of computer science and describes a course covering those topics, along with suggested readings and assignments.
  14. O. B. Lupanov (ed.) (2005). Stochastic Algorithms: Foundations and Applications: Third International Symposium, SAGA 2005, Moscow, Russia, October 20-22, 2005: Proceedings. [REVIEW] Springer.
    This book constitutes the refereed proceedings of the Third International Symposium on Stochastic Algorithms: Foundations and Applications, SAGA 2005, held in Moscow, Russia in October 2005. The 14 revised full papers presented together with 5 invited papers were carefully reviewed and selected for inclusion in the book. The contributed papers included in this volume cover both theoretical as well as applied aspects of stochastic computations with a special focus on new algorithmic ideas involving stochastic decisions and the design and (...)
  15. Richard Heersmink, Jeroen van den Hoven, Nees Jan van Eck & Jan van den Berg (2011). Bibliometric Mapping of Computer and Information Ethics. Ethics and Information Technology 13 (3):241-249.
    This paper presents the first bibliometric mapping analysis of the field of computer and information ethics (C&IE). It provides a map of the relations between 400 key terms in the field. This term map can be used to get an overview of concepts and topics in the field and to identify relations between information and communication technology concepts on the one hand and ethical concepts on the other hand. To produce the term map, a data set of over thousand (...)
  16. Gerd Gigerenzer & Daniel G. Goldstein (1996). Reasoning the Fast and Frugal Way: Models of Bounded Rationality. Psychological Review 103 (4):650-669.
    Humans and animals make inferences about the world under limited time and knowledge. In contrast, many models of rational inference treat the mind as a Laplacean Demon, equipped with unlimited time, knowledge, and computational might. Following H. Simon's notion of satisficing, the authors have proposed a family of algorithms based on a simple psychological mechanism: one-reason decision making. These fast and frugal algorithms violate fundamental tenets of classical rationality: They neither look up nor integrate all information. By (...) simulation, the authors held a competition between the satisficing "Take The Best" algorithm and various "rational" inference procedures. The Take The Best algorithm matched or outperformed all competitors in inferential speed and accuracy. This result is an existence proof that cognitive mechanisms capable of successful performance in the real world do not need to satisfy the classical norms of rational inference.
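    A rough illustrative sketch of the one-reason decision making described in the abstract above, in the spirit of the "Take The Best" heuristic: cues are checked in order of (assumed) validity, and the first cue that discriminates decides, with no further information looked up or integrated. The cue names, ordering, and toy data below are invented for illustration and are not Gigerenzer and Goldstein's materials; Python is used only as a convenient notation.

      # One-reason decision making in the spirit of "Take The Best" (illustrative sketch).
      # Each object maps cue names to 1 (positive), 0 (negative), or None (unknown).

      def take_the_best(obj_a, obj_b, cues):
          """Infer which object scores higher on the criterion; `cues` is ordered by validity."""
          for cue in cues:
              a, b = obj_a.get(cue), obj_b.get(cue)
              if a == 1 and b in (0, None):
                  return "a"          # first discriminating cue decides
              if b == 1 and a in (0, None):
                  return "b"
          return "guess"              # no cue discriminates

      # Toy example (hypothetical cues for inferring which of two cities is larger).
      cues = ["has_team", "is_capital", "has_university"]   # assumed validity order
      city_a = {"has_team": 1, "is_capital": 1, "has_university": 1}
      city_b = {"has_team": 0, "is_capital": 0, "has_university": 1}
      print(take_the_best(city_a, city_b, cues))   # -> "a"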
  17. Carol E. Cleland (2001). Recipes, Algorithms, and Programs. Minds and Machines 11 (2):219-237.
    In the technical literature of computer science, the concept of an effective procedure is closely associated with the notion of an instruction that precisely specifies an action. Turing machine instructions are held up as providing paragons of instructions that "precisely describe" or "well define" the actions they prescribe. Numerical algorithms and computer programs are judged effective just insofar as they are thought to be translatable into Turing machine programs. Nontechnical procedures (e.g., recipes, methods) are summarily dismissed as (...)
  18. Timothy R. Colburn (1991). Program Verification, Defeasible Reasoning, and Two Views of Computer Science. Minds and Machines 1 (1):97-116.
    In this paper I attempt to cast the current program verification debate within a more general perspective on the methodologies and goals of computer science. I show, first, how any method involved in demonstrating the correctness of a physically executing computer program, whether by testing or formal verification, involves reasoning that is defeasible in nature. Then, through a delineation of the senses in which programs can be run as tests, I show that the activities of testing and formal (...)
  19. M. W. Bunder & R. M. Rizkalla (2009). Proof-Finding Algorithms for Classical and Subclassical Propositional Logics. Notre Dame Journal of Formal Logic 50 (3):261-273.
    The formulas-as-types isomorphism tells us that every proof and theorem, in the intuitionistic implicational logic $H_\rightarrow$, corresponds to a lambda term or combinator and its type. The algorithms of Bunder very efficiently find a lambda term inhabitant, if any, of any given type of $H_\rightarrow$ and of many of its subsystems. In most cases the search procedure has a simple bound based roughly on the length of the formula involved. Computer implementations of some of these procedures were done (...)
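    The formulas-as-types correspondence mentioned in the abstract above can be illustrated with a small bounded search for a normal lambda-term inhabitant of an implicational type. This is only a generic sketch under assumed conventions (types as nested tuples, a simple depth bound), not the Bunder procedures the paper analyses.

      # Bounded search for a lambda-term inhabitant of an implicational type (illustrative sketch).
      # Types: atoms are strings; an arrow A -> B is the tuple ('->', A, B).
      from itertools import count

      def arrow_parts(ty):
          """Split A1 -> ... -> An -> a into ([A1, ..., An], a)."""
          args = []
          while isinstance(ty, tuple) and ty[0] == '->':
              args.append(ty[1])
              ty = ty[2]
          return args, ty

      def inhabit(ty, ctx=(), depth=6, fresh=None):
          """Return a lambda term (as a string) of type `ty`, or None if none is found."""
          fresh = fresh if fresh is not None else count()
          if depth == 0:
              return None
          if isinstance(ty, tuple) and ty[0] == '->':        # to inhabit A -> B, bind x:A and inhabit B
              x = f"x{next(fresh)}"
              body = inhabit(ty[2], ctx + ((x, ty[1]),), depth, fresh)
              return None if body is None else f"(\\{x}. {body})"
          for x, tau in ctx:                                 # atomic goal: try each hypothesis ending in it
              args, head = arrow_parts(tau)
              if head != ty:
                  continue
              terms = []
              for a in args:
                  t = inhabit(a, ctx, depth - 1, fresh)
                  if t is None:
                      break
                  terms.append(t)
              else:
                  return " ".join([x] + terms)
          return None

      print(inhabit(('->', 'a', ('->', 'b', 'a'))))               # (\x0. (\x1. x0)), i.e. K : a -> b -> a
      print(inhabit(('->', ('->', ('->', 'a', 'b'), 'a'), 'a')))  # None: Peirce's law has no inhabitant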
  20. Fahiem Bacchus & Toby Walsh (eds.) (2005). Theory and Applications of Satisfiability Testing: 8th International Conference, SAT 2005, St Andrews, UK, June 19-23, 2005: Proceedings. [REVIEW] Springer.
    This book constitutes the refereed proceedings of the 8th International Conference on Theory and Applications of Satisfiability Testing, SAT 2005, held in St Andrews, Scotland in June 2005. The 26 revised full papers presented together with 16 revised short papers presented as posters during the technical programme were carefully selected from 73 submissions. The whole spectrum of research in propositional and quantified Boolean formula satisfiability testing is covered including proof systems, search techniques, probabilistic analysis of algorithms and their properties, (...)
  21. Jeff Edmonds (2008). How to Think About Algorithms. Cambridge University Press.
    There are many algorithm texts that provide lots of well-polished code and proofs of correctness. Instead, this book presents insights, notations, and analogies to help the novice describe and think about algorithms like an expert. By looking at both the big picture and easy step-by-step methods for developing algorithms, the author helps students avoid the common pitfalls. He stresses paradigms such as loop invariants and recursion to unify a huge range of algorithms into a few meta-algorithms. (...)
  22. Holger H. Hoos & David G. Mitchell (eds.) (2005). Theory and Applications of Satisfiability Testing: 7th International Conference, SAT 2004, Vancouver, BC, Canada, May 10-13, 2004: Revised Selected Papers. [REVIEW] Springer.
    This book constitutes the refereed proceedings of the 7th International Conference on Theory and Applications of Satisfiability Testing, SAT 2004, held in Vancouver, BC, Canada in May 2004. The 24 revised full papers presented together with 2 invited papers were carefully selected from 72 submissions. In addition there are 2 reports on the 2004 SAT Solver Competition and the 2004 QBF Solver Evaluation. The whole spectrum of research in propositional and quantified Boolean formula satisfiability testing is covered; bringing together the (...)
  23. Rolf Niedermeier (2006). Invitation to Fixed-Parameter Algorithms. Oxford University Press.
    A fixed-parameter algorithm is an algorithm that provides an optimal solution to a combinatorial problem. This research-level text is an application-oriented introduction to the growing and highly topical area of the development and analysis of efficient fixed-parameter algorithms for hard problems. The book is divided into three parts: a broad introduction that provides the general philosophy and motivation; followed by coverage of algorithmic methods developed over the years in fixed-parameter algorithmics forming the core of the book; and a discussion of (...)
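    As a concrete illustration of what "fixed-parameter algorithm" means here, a standard textbook example (not code from Niedermeier's book) is the bounded search tree for Vertex Cover: deciding whether a graph has a vertex cover of size at most k takes time roughly O(2^k · m), so the combinatorial explosion is confined to the parameter k rather than the input size.

      # Bounded search tree for Vertex Cover, a classic fixed-parameter algorithm (illustrative sketch).
      def vertex_cover(edges, k):
          """Return a vertex cover of size <= k as a set, or None if none exists."""
          if not edges:
              return set()
          if k == 0:
              return None
          u, v = edges[0]
          for chosen in (u, v):         # one endpoint of any edge must be in the cover
              rest = [e for e in edges if chosen not in e]
              cover = vertex_cover(rest, k - 1)
              if cover is not None:
                  return cover | {chosen}
          return None

      # Toy example: a path on four vertices has a cover of size 2 but not of size 1.
      path = [(1, 2), (2, 3), (3, 4)]
      print(vertex_cover(path, 2))   # e.g. {1, 3}
      print(vertex_cover(path, 1))   # None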
  24. Ofer Strichman & Stefan Szeider (eds.) (2010). Theory and Applications of Satisfiability Testing – SAT 2010: 13th International Conference, SAT 2010, Edinburgh, UK, July 2010: Proceedings. [REVIEW] Springer.
    The LNCS series reports state-of-the-art results in computer science research, development, and education, at a high level and in both printed and electronic form.
  25. C. L. Kane (2010). 'Programming the Beautiful': Informatic Color and Aesthetic Transformations in Early Computer Art. Theory, Culture and Society 27 (1):73-93.
    Color has long been at home in the domains of classical art and aesthetics. However, with the introduction of computer art in Germany in the early 1960s, a new ‘rational theory’ of art, media and color emerged. Many believed this new ‘science’ of art would generate computer algorithms which would enable new media aesthetic ‘principles to be formulated mathematically’ — thus ending the lofty mystifications that have, for too long, been associated with Romantic notions about artwork and (...)
  26. X. Chen & G. M. Megson (1993). A Methodology of Partitioning and Mapping for Given Regular Arrays with Lower Dimension. University of Newcastle Upon Tyne, Computing Science.
  27. John Franco, Endre Boros & P. L. Hammer (eds.) (1999). The Satisfiability Problem. Elsevier.
  28. Karem A. Sakallah & Laurent Simon (eds.) (2011). Theory and Applications of Satisfiability Testing – SAT 2011: 14th International Conference, SAT 2011, Ann Arbor, MI, USA, June 19-22, 2011: Proceedings. [REVIEW] Springer.
    This book constitutes the refereed proceedings of the 14th International Conference on Theory and Applications of Satisfiability Testing, SAT 2011, held in Ann Arbor, MI, USA in June 2011. The 25 revised full papers presented together with ...
  29. Dominique Snyers & André Thayse (1987). From Logic Design to Logic Programming: Theorem Proving Techniques and P-Functions.
  30. Peter Wegner (1999). Towards Empirical Computer Science. The Monist 82 (1):58-108.
    Part I presents a model of interactive computation and a metric for expressiveness, Part II relates interactive models of computation to physics, and Part III considers empirical models from a philosophical perspective. Interaction machines, which extend Turing Machines to interaction, are shown in Part I to be more expressive than Turing Machines by a direct proof, by adapting Gödel's incompleteness result, and by observability metrics. Observation equivalence provides a tool for measuring expressiveness according to which interactive systems are more expressive (...)
  31. David A. Nelson (1992). Deductive Program Verification (a Practitioner's Commentary). Minds and Machines 2 (3):283-307.
    A proof of ‘correctness’ for a mathematical algorithm cannot be relevant to executions of a program based on that algorithm because both the algorithm and the proof are based on assumptions that do not hold for computations carried out by real-world computers. Thus, proving the ‘correctness’ of an algorithm cannot establish the trustworthiness of programs based on that algorithm. Despite the (deceptive) sameness of the notations used to represent them, the transformation of an algorithm into an executable program is a (...)
  32. Robert L. Constable, The Triumph of Types: Principia Mathematica's Impact on Computer Science.
    Types now play an essential role in computer science; their ascent originates from Principia Mathematica. Type checking and type inference algorithms are used to prevent semantic errors in programs, and type theories are the native language of several major interactive theorem provers. Some of these trace key features back to Principia.
  33. Edward A. Feigenbaum (1984). Computer-Assisted Decision Making in Medicine. Journal of Medicine and Philosophy 9 (2).
    This article reviews the strengths and limitations of five major paradigms of medical computer-assisted decision making (CADM): (1) clinical algorithms, (2) statistical analysis of collections of patient data, (3) mathematical models of physical processes, (4) decision analysis, and (5) symbolic reasoning or artificial intelligence (AI). No one technique is best for all applications, and there is recent promising work which combines two or more established techniques. We emphasize both the inherent power of symbolic reasoning and the promise of (...)
  34. J. C. Kunz, E. H. Shortliffe, B. G. Buchanan & E. A. Feigenbaum (1984). Computer-Assisted Decision Making in Medicine. Journal of Medicine and Philosophy 9 (2):135-160.
    This article reviews the strengths and limitations of five major paradigms of medical computer-assisted decision making (CADM): (1) clinical algorithms, (2) statistical analysis of collections of patient data, (3) mathematical models of physical processes, (4) decision analysis, and (5) symbolic reasoning or artificial intelligence (AI). No one technique is best for all applications, and there is recent promising work which combines two or more established techniques. We emphasize both the inherent power of symbolic reasoning and the promise of (...)
  35. M. Ben-Ari (1993). Mathematical Logic for Computer Science. Prentice Hall.
    Mathematical Logic for Computer Science is a mathematics textbook with theorems and proofs, but the choice of topics has been guided by the needs of computer science students. The method of semantic tableaux provides an elegant way to teach logic that is both theoretically sound and yet sufficiently elementary for undergraduates. To provide a balanced treatment of logic, tableaux are related to deductive proof systems. The logical systems presented are: propositional calculus (including binary decision diagrams); predicate calculus; resolution; Hoare (...)
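    The method of semantic tableaux mentioned in the blurb above can be shown in miniature: a branch is expanded by the tableau rules and closes when it contains a literal and its negation, and a formula is valid exactly when every branch for its negation closes. The encoding of formulas as nested tuples below is an assumption of this sketch, not Ben-Ari's notation.

      # A miniature tableau-style satisfiability test for propositional logic (illustrative sketch).
      # Formulas: ('atom', p), ('not', A), ('and', A, B), ('or', A, B), ('imp', A, B).
      def tableau_sat(formulas, literals=frozenset()):
          """Return True iff the list of formulas is jointly satisfiable."""
          if not formulas:
              return True
          f, rest = formulas[0], formulas[1:]
          tag = f[0]
          if tag == 'atom' or (tag == 'not' and f[1][0] == 'atom'):
              name = f[1] if tag == 'atom' else f[1][1]
              sign = (tag == 'atom')
              if (name, not sign) in literals:      # branch closes on a contradiction
                  return False
              return tableau_sat(rest, literals | {(name, sign)})
          if tag == 'not':
              g = f[1]
              if g[0] == 'not':
                  return tableau_sat([g[1]] + rest, literals)
              if g[0] == 'and':
                  return tableau_sat([('or', ('not', g[1]), ('not', g[2]))] + rest, literals)
              if g[0] == 'or':
                  return tableau_sat([('not', g[1]), ('not', g[2])] + rest, literals)
              if g[0] == 'imp':
                  return tableau_sat([g[1], ('not', g[2])] + rest, literals)
          if tag == 'and':
              return tableau_sat([f[1], f[2]] + rest, literals)
          if tag == 'or':                            # branching rule
              return (tableau_sat([f[1]] + rest, literals) or
                      tableau_sat([f[2]] + rest, literals))
          if tag == 'imp':                           # A -> B behaves like (not A) or B
              return (tableau_sat([('not', f[1])] + rest, literals) or
                      tableau_sat([f[2]] + rest, literals))
          raise ValueError(f"unknown formula {f!r}")

      # Peirce's law ((p -> q) -> p) -> p is classically valid: its negation is unsatisfiable.
      peirce = ('imp', ('imp', ('imp', ('atom', 'p'), ('atom', 'q')), ('atom', 'p')), ('atom', 'p'))
      print(tableau_sat([('not', peirce)]))                                                # False
      print(tableau_sat([('imp', ('atom', 'p'), ('atom', 'q')), ('not', ('atom', 'q'))]))  # True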
  36. Walter Warwick (2001). The Conceptual Development of Nondeterminism in Theoretical Computer Science. Dissertation, Indiana University
    In this essay, I examine the notion of a nondeterministic algorithm from both a conceptual and historical point of view. I argue that the intuitions underwriting nondeterminism in the context of contemporary theoretical computer science cannot be reconciled with the intuitions that originally motivated nondeterminism. I identify four different intuitions about nondeterminism: nondeterminism as evidence for the Church-Turing thesis; nondeterminism as a natural reflection of the mathematician's behavior; nondeterminism as a formal, mathematical generalization; and nondeterminism as a physical (...)
  37. Wayne Aitken & Jeffrey A. Barrett (2004). Computer Implication and the Curry Paradox. Journal of Philosophical Logic 33 (6):631-637.
    There are theoretical limitations to what can be implemented by a computer program. In this paper we are concerned with a limitation on the strength of computer implemented deduction. We use a version of the Curry paradox to arrive at this limitation.
  38. Jeffrey Barrett (2004). Computer Implication and the Curry Paradox. Journal of Philosophical Logic 33 (6):631-637.
    There are theoretical limitations to what can be implemented by a computer program. In this paper we are concerned with a limitation on the strength of computer implemented deduction. We use a version of the Curry paradox to arrive at this limitation.
  39. Marc Davio, Jean-Pierre Deschamps & André Thayse (1983). Digital Systems, with Algorithm Implementation.
  40. Nicholas Furl, P. Jonathon Phillips & Alice J. O'Toole (2002). Face Recognition Algorithms and the Other‐Race Effect: Computational Mechanisms for a Developmental Contact Hypothesis. Cognitive Science 26 (6):797-815.
  41. Han Geurdes, The Construction of Transfinite Equivalence Algorithms.
    Context: Consistency of mathematical constructions in numerical analysis and the application of computerized proofs in the light of the occurrence of numerical chaos in simple systems. Purpose: To show that a computer in general and a numerical analysis in particular can add its own peculiarities to the subject under study. Hence the need of thorough theoretical studies on chaos in numerical simulation. Hence, a questioning of what e.g. a numerical disproof of a theorem in physics or a prediction in (...)
  42. Carsten Seck (2012). Metaphysics Within Chemical Physics: The Case of Ab Initio Molecular Dynamics. [REVIEW] Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 43 (2):361-375.
    This paper combines naturalized metaphysics and a philosophical reflection on a recently evolving interdisciplinary branch of quantum chemistry, ab initio molecular dynamics. Bridging the gaps among chemistry, physics, and computer science, this cutting-edge research field explores the structure and dynamics of complex molecular many-body systems through computer simulations. These simulations are allegedly crafted solely by the laws of fundamental physics, and are explicitly designed to capture nature as closely as possible. The models and algorithms employed, however, involve (...)
  43. Amit Hagar & Michael Cuffaro (2015). Quantum Computing. Stanford Encyclopedia of Philosophy.
    Combining physics, mathematics and computer science, quantum computing has developed in the past two decades from a visionary idea to one of the most fascinating areas of quantum mechanics. The recent excitement in this lively and speculative domain of research was triggered by Peter Shor (1994) who showed how a quantum algorithm could exponentially "speed up" classical computation and factor large numbers into primes much more rapidly (at least in terms of the number of computational steps involved) than any (...)
  44. Robin K. Hill (2016). What an Algorithm Is. Philosophy and Technology 29 (1):35-59.
    The algorithm, a building block of computer science, is defined from an intuitive and pragmatic point of view, through a methodological lens of philosophy rather than that of formal computation. The treatment extracts properties of abstraction, control, structure, finiteness, effective mechanism, and imperativity, and intentional aspects of goal and preconditions. The focus on the algorithm as a robust conceptual object obviates issues of correctness and minimality. Neither the articulation of an algorithm nor the dynamic process constitute the algorithm itself. (...)
  45. Robert T. Pennock (2000). Can Darwinian Mechanisms Make Novel Discoveries?: Learning From Discoveries Made by Evolving Neural Networks. [REVIEW] Foundations of Science 5 (2):225-238.
    Some philosophers suggest that the development of scientific knowledge is a kind of Darwinian process. The process of discovery, however, is one problematic element of this analogy. I compare Herbert Simon's attempt to simulate scientific discovery in a computer program to recent connectionist models that were not designed for that purpose, but which provide useful cases to help evaluate this aspect of the analogy. In contrast to the classic A.I. approach Simon used, "neural networks" contain no explicit protocols, but are generic learning systems built on the model (...)
  46. Lev Manovich (2000). Database as a Genre of New Media. AI and Society 14 (2):176-183.
    After the novel, and subsequently cinema privileged narrative as the key form of cultural expression of the modern age, the computer age introduces its correlate — database. Why does new media favour database form over others? Can we explain its popularity by analysing the specificity of the digital medium and of computer programming? What is the relationship between database and another form, which has traditionally dominated human culture — narrative? In addressing these questions, I discuss the connection between (...)
  47. Norman D. Megill & Mladen Pavičić (2002). Deduction, Ordering, and Operations in Quantum Logic. Foundations of Physics 32 (3):357-378.
    We show that in quantum logic of closed subspaces of Hilbert space one cannot substitute quantum operations for classical (standard Hilbert space) ones and treat them as primitive operations. We consider two possible ways of such a substitution and arrive at operation algebras that are not lattices, which proves the claim. We devise algorithms and programs which write down any two-variable expression in an orthomodular lattice by means of classical and quantum operations in an identical form. Our results show (...)
  48. Edoardo Mollona & Andrea Marcozzi (2009). Self-Emerging Coordination Mechanisms for Knowledge Integration Processes. Mind and Society 8 (2):223-241.
    The increasing knowledge intensity of jobs, typical of a knowledge economy, highlights the role of firms as integrators of know-how and skills. As economic activity becomes mainly intellectual and requires the integration of specific and idiosyncratic skills, firms need to allocate skills to tasks, and traditional hierarchical control becomes increasingly ineffective. In this work, we explore under what circumstances networks of agents, which bear specific skills, may self-organize in order to complete tasks. We use a computer simulation approach and (...)
  49. Kiyoko F. Aoki-Kinoshita, Minoru Kanehisa, Ming-Yang Kao, Xiang-Yang Li & Weizhao Wang (2006). A 6-Approximation Algorithm for Computing Smallest Common AoN-Supertree with Application to the Reconstruction of Glycan Trees. In O. Stock & M. Schaerf (eds.), Lecture Notes in Computer Science. Springer-Verlag, 100-110.
  50. Jose M. Badia, Peter Benner, Rafael Mayo & Enrique S. Quintana-Orti (2006). Parallel Algorithms for Balanced Truncation Model Reduction of Sparse Systems. In O. Stock & M. Schaerf (eds.), Lecture Notes in Computer Science. Springer-Verlag, 267-275.
Results 1 – 50 of 1000