  1. Michael Alekhnovich, Sam Buss, Shlomo Moran & Toniann Pitassi (2001). Minimum Propositional Proof Length is NP-Hard to Linearly Approximate. Journal of Symbolic Logic 66 (1):171-191.
    We prove that the problem of determining the minimum propositional proof length is NP-hard to approximate within a factor of $2^{\log^{1-o(1)} n}$. These results are very robust in that they hold for almost all natural proof systems, including: Frege systems, extended Frege systems, resolution, Horn resolution, the polynomial calculus, the sequent calculus, and the cut-free sequent calculus. Our hardness of approximation results usually apply to proof length measured either by (...)
  2. Donald A. Alton (1976). Diversity of Speed-Ups and Embeddability in Computational Complexity. Journal of Symbolic Logic 41 (1):199-214.
  3. Jeremy Avigad, Eliminating Definitions and Skolem Functions in First-Order Logic.
    From proofs in any classical first-order theory that proves the existence of at least two elements, one can eliminate definitions in polynomial time. From proofs in any classical first-order theory strong enough to code finite functions, including sequential theories, one can also eliminate Skolem functions in polynomial time.
  4. Arnold Beckmann & Samuel R. Buss (2009). Polynomial Local Search in the Polynomial Hierarchy and Witnessing in Fragments of Bounded Arithmetic. Journal of Mathematical Logic 9 (01):103-138.
  5. Patrick Blackburn & Maarten Marx (2002). Remarks on Gregory's “Actually” Operator. Journal of Philosophical Logic 31 (3):281-288.
    In this note we show that the classical modal technology of Sahlqvist formulas gives quick proofs of the completeness theorems in [8] (D. Gregory, Completeness and decidability results for some propositional modal logics containing "actually" operators, Journal of Philosophical Logic 30(1): 57-78, 2001) and vastly generalizes them. Moreover, as a corollary, interpolation theorems for the logics considered in [8] are obtained. We then compare Gregory's modal language enriched with an "actually" operator with the work of Arthur Prior now known under (...)
  6. Patrick Blackburn & Edith Spaan (1993). A Modal Perspective on the Computational Complexity of Attribute Value Grammar. Journal of Logic, Language and Information 2 (2):129-169.
    Many of the formalisms used in Attribute Value grammar are notational variants of languages of propositional modal logic, and testing whether two Attribute Value Structures unify amounts to testing for modal satisfiability. In this paper we put this observation to work. We study the complexity of the satisfiability problem for nine modal languages which mirror different aspects of AVS description formalisms, including the ability to express re-entrancy, the ability to express generalisations, and the ability to express recursive constraints. Two main (...)
  7. Peter Bosch, David Gabelaia & Jérôme Lang (eds.) (2009). Logic, Language, and Computation: 7th International Tbilisi Symposium on Logic, Language, and Computation. Lecture Notes in Artificial Intelligence 5422. Springer.
  8. Alessandra Carbone (2000). Quantified Propositional Logic and the Number of Lines of Tree-Like Proofs. Studia Logica 64 (3):315-321.
    There is an exponential speed-up in the number of lines of the quantified propositional sequent calculus over Substitution Frege systems if one considers proofs as trees. Whether this is also true for the number of symbols is still an open problem.
  9. Douglas Cenzer & Jeffrey B. Remmel (2006). Complexity, Decidability and Completeness. Journal of Symbolic Logic 71 (2):399-424.
    We give resource bounded versions of the Completeness Theorem for propositional and predicate logic. For example, it is well known that every computable consistent propositional theory has a computable complete consistent extension. We show that, when length is measured relative to the binary representation of natural numbers and formulas, every polynomial time decidable propositional theory has an exponential time (EXPTIME) complete consistent extension whereas there is a nondeterministic polynomial time (NP) decidable theory which has no polynomial time complete consistent extension (...)
  10. Gregory J. Chaitin (1970). Computational Complexity and Gödel's Incompleteness Theorem. Rio de Janeiro: Centro Técnico Científico, Pontifícia Universidade Católica do Rio de Janeiro.
  11. Christopher Cherniak (1984). Computational Complexity and the Universal Acceptance of Logic. Journal of Philosophy 81 (12):739-758.
  12. Michael E. Cuffaro (forthcoming). How-Possibly Explanations in Quantum Computer Science. Philosophy of Science.
    A primary goal of quantum computer science is to find an explanation for the fact that quantum computers are more powerful than classical computers. In this paper I argue that to answer this question is to compare algorithmic processes of various kinds, and in so doing to describe the possibility spaces associated with these processes. By doing this we explain how it is possible for one process to outperform its rival. Further, in this and similar examples little is gained in (...)
  13. Michael E. Cuffaro (forthcoming). On the Significance of the Gottesman-Knill Theorem. British Journal for the Philosophy of Science.
    According to the Gottesman-Knill theorem, quantum algorithms which utilise only the operations belonging to a certain restricted set are efficiently simulable classically. Since some of the operations in this set generate entangled states, it is commonly concluded that entanglement is insufficient to enable quantum computers to outperform classical computers. I argue in this paper that this conclusion is misleading. First, the statement of the theorem (that the particular set of quantum operations in question can be simulated using a classical computer) (...)
  14. Cédric Dégremont, Lena Kurzen & Jakub Szymanik (2011). On the Tractability of Comparing Informational Structures. In J. van Eijck & R. Verbrugge (eds.), Proceedings of the Workshop 'Reasoning about Other Minds: Logical and Cognitive Perspectives'.
  15. Stéphane Demri (1997). A Completeness Proof for a Logic with an Alternative Necessity Operator. Studia Logica 58 (1):99-112.
    We show the completeness of a Hilbert-style system LK, defined by M. Valiev, involving the knowledge operator K dedicated to reasoning with incomplete information. The completeness proof uses a variant of Makinson's canonical model construction. Furthermore we prove that the theoremhood problem for LK is co-NP-complete, using techniques similar to those used to prove that the satisfiability problem for propositional S5 is NP-complete.
  16. Jeanne Ferrante (1979). The Computational Complexity of Logical Theories. Springer-Verlag.
  17. Harvey Friedman, Lecture Notes on Term Rewriting and Computational Complexity.
    The main powerful method for establishing termination of term rewriting systems was discovered by Nachum Dershowitz through the introduction of certain natural well founded orderings (lexicographic path orderings). This leads to natural decision problems which may be of the highest computational complexity of any decidable problems appearing in a natural established computer science context.
  18. Nikos Gorogiannis & Mark D. Ryan (2002). Implementation of Belief Change Operators Using BDDs. Studia Logica 70 (1):131-156.
    While the theory of belief change has attracted a lot of interest from researchers, work on implementing belief change and actually putting it to use in real-world problems is still scarce. In this paper, we present an implementation of propositional belief change using Binary Decision Diagrams. Upper complexity bounds for the algorithm are presented and discussed. The approach is presented both in the general case and for specific belief change operators from the literature. In an effort to gain (...)
  19. Erich Grädel, Phokion G. Kolaitis & Moshe Y. Vardi (1997). On the Decision Problem for Two-Variable First-Order Logic. Bulletin of Symbolic Logic 3 (1):53-69.
    We identify the computational complexity of the satisfiability problem for FO², the fragment of first-order logic consisting of all relational first-order sentences with at most two distinct variables. Although this fragment was shown to be decidable a long time ago, the computational complexity of its decision problem has not been pinpointed so far. In 1975 Mortimer proved that FO² has the finite-model property, which means that if an FO²-sentence is satisfiable, then it has a finite (...)
  20. Amit Hagar & Alex Korolev (2007). Quantum Hypercomputation—Hype or Computation? Philosophy of Science 74 (3):347-363.
    A recent attempt to compute a (recursion‐theoretic) noncomputable function using the quantum adiabatic algorithm is criticized and found wanting. Quantum algorithms may outperform classical algorithms in some cases, but so far they retain the classical (recursion‐theoretic) notion of computability. A speculation is then offered as to where the putative power of quantum computers may come from.
  21. Amit Hagar & Alexandre Korolev (2006). Quantum Hypercomputability? Minds and Machines 16 (1):87-93.
    A recent proposal to solve the halting problem with the quantum adiabatic algorithm is criticized and found wanting. Contrary to other physical hypercomputers, where one believes that a physical process “computes” a (recursion-theoretic) non-computable function simply because one believes the physical theory that presumably governs or describes such a process, believing the theory (i.e., quantum mechanics) in the case of the quantum adiabatic “hypercomputer” is tantamount to acknowledging that the hypercomputer cannot perform its task.
  22. Amit Hagar & Giuseppe Sergioli (forthcoming). Counting Steps: A Finitist Interpretation of Objective Probability in Physics. Epistemologia.
    We propose a new interpretation of objective deterministic chances in statistical physics based on physical computational complexity. This notion applies to a single physical system (be it an experimental set-up in the lab, or a subsystem of the universe), and quantifies (1) the difficulty of realizing one physical state given another, (2) the 'distance' (in terms of physical resources) from one physical state to another, and (3) the size of the set of time-complexity functions that are compatible with the physical (...)
  23. Ortrun Ibens (2002). Connection Tableau Calculi with Disjunctive Constraints. Studia Logica 70 (2):241-270.
    Automated theorem proving amounts to solving search problems in usually tremendous search spaces. A lot of research therefore focuses on search space reductions. Our approach reduces the search space which arises when using so-called connection tableau calculi for first-order automated theorem proving. It uses disjunctive constraints over first-order equations to compress certain parts of this search space. We present the basics of our constrained-connection-tableau calculi, a constraint extension of connection tableau calculi, and deal with the efficient handling of constraints during (...)
  24. Aleksandar Ignjatović (1995). Delineating Classes of Computational Complexity Via Second Order Theories with Weak Set Existence Principles. I. Journal of Symbolic Logic 60 (1):103-121.
  25. Barry E. Jacobs (1977). On Generalized Computational Complexity. Journal of Symbolic Logic 42 (1):47-58.
  26. Juha Kontinen (2004). Definability of Second Order Generalized Quantifiers. Dissertation.
    We study second order generalized quantifiers on finite structures. One starting point of this research has been the notion of definability of Lindström quantifiers. We formulate an analogous notion for second order generalized quantifiers and study definability of second order generalized quantifiers in terms of Lindström quantifiers.
  27. Juha Kontinen & Jakub Szymanik (2011). Characterizing Definability of Second-Order Generalized Quantifiers. In L. Beklemishev & R. de Queiroz (eds.), Proceedings of the 18th Workshop on Logic, Language, Information and Computation, Lecture Notes in Artificial Intelligence 6642. Springer.
    We study definability of second-order generalized quantifiers. We show that the question whether a second-order generalized quantifier $\mathcal{Q}_1$ is definable in terms of another quantifier $\mathcal{Q}_2$, the base logic being monadic second-order logic, reduces to the question whether a quantifier $\mathcal{Q}^{\star}_1$ is definable in $\mathrm{FO}(\mathcal{Q}^{\star}_2, <, +, \times)$ for certain first-order quantifiers $\mathcal{Q}^{\star}_1$ and $\mathcal{Q}^{\star}_2$. We use our characterization to show new definability and non-definability results for second-order generalized quantifiers. In particular, we show that the monadic second-order majority quantifier $\mathrm{Most}^1$ is not definable (...)
  28. Juha Kontinen & Jakub Szymanik (2008). A Remark on Collective Quantification. Journal of Logic, Language and Information 17 (2):131-140.
    We consider collective quantification in natural language. For many years the common strategy in formalizing collective quantification has been to define the meanings of collective determiners, quantifying over collections, using certain type-shifting operations. These type-shifting operations, i.e., lifts, define the collective interpretations of determiners systematically from the standard meanings of quantifiers. All the lifts considered in the literature turn out to be definable in second-order logic. We argue that second-order definable quantifiers are probably not expressive enough to formalize all collective (...)
  29. Jan Krajíček & Pavel Pudlák (1989). Propositional Proof Systems, the Consistency of First Order Theories and the Complexity of Computations. Journal of Symbolic Logic 54 (3):1063-1079.
    We consider the problem about the length of proofs of the sentences $\operatorname{Con}_S(\underline{n})$ saying that there is no proof of contradiction in S whose length is ≤ n. We show the relation of this problem to some problems about propositional proof systems.
  30. Theodor Leiber (1999). Deterministic Chaos and Computational Complexity: The Case of Methodological Complexity Reductions. [REVIEW] Journal for General Philosophy of Science 30 (1):87-101.
    Some problems rarely discussed in traditional philosophy of science are mentioned: The empirical sciences using mathematico-quantitative theoretical models are frequently confronted with several types of computational problems posing primarily methodological limitations on explanatory and prognostic matters. Such limitations may arise from the appearances of deterministic chaos and (too) high computational complexity in general. In many cases, however, scientists circumvent such limitations by utilizing reductional approximations or complexity reductions for intractable problem formulations, thus constructing new models which are computationally tractable. (...)
  31. McGraw-Hill, Computational Complexity and Gödel's Incompleteness Theorem.
    Given any simply consistent formal theory F of the state complexity L(S) of finite binary sequences S as computed by 3-tape-symbol Turing machines, there exists a natural number L(F) such that L(S) > n is provable in F only if n < L(F). On the other hand, almost all finite binary sequences S satisfy L(S) > L(F). The proof resembles Berry's (...)
  32. Albert R. Meyer & Patrick C. Fischer (1972). Computational Speed-Up by Effective Operators. Journal of Symbolic Logic 37 (1):55-68.
  33. Adam Morton (2004). Epistemic Virtues, Metavirtues, and Computational Complexity. Noûs 38 (3):481–502.
    I argue that considerations about computational complexity show that all finite agents need characteristics like those that have been called epistemic virtues. The necessity of these virtues follows in part from the nonexistence of shortcuts, or efficient ways of finding shortcuts, to cognitively expensive routines. It follows that agents must possess the capacities – metavirtues – of developing in advance the cognitive virtues they will need when time and memory are at a premium.
  35. Georg Moser & Richard Zach (2006). The Epsilon Calculus and Herbrand Complexity. Studia Logica 82 (1):133-155.
    Hilbert's ε-calculus is based on an extension of the language of predicate logic by a term-forming operator ε_x (its standard defining equivalences are recalled below). Two fundamental results about the ε-calculus, the first and second epsilon theorem, play a rôle similar to that which the cut-elimination theorem plays in sequent calculus. In particular, Herbrand's Theorem is a consequence of the epsilon theorems. The paper investigates the epsilon theorems and the complexity of the elimination procedure underlying their proof, as well as the length of Herbrand disjunctions of existential (...)
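    As a point of reference (a standard textbook formulation, not quoted from the paper itself), the defining equivalences of Hilbert's ε-operator can be displayed as:
    $\exists x\, A(x) \;\leftrightarrow\; A(\varepsilon_x A(x))$
    $\forall x\, A(x) \;\leftrightarrow\; A(\varepsilon_x \neg A(x))$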
  36. Marcin Mostowski & Jakub Szymanik (2012). Semantic Bounds for Everyday Language. Semiotica 188 (1/4):363-372.
    We consider the notion of everyday language. We claim that everyday language is semantically bounded by the properties expressible in the existential fragment of second-order logic. Two arguments for this thesis are formulated. Firstly, we show that the so-called Barwise test of negation normality works properly only when assuming our main thesis. Secondly, we discuss the argument from practical computability for finite universes. Everyday language sentences are directly or indirectly verifiable. We show that in both cases they are bounded by second-order (...)
  37. Marcin Mostowski & Jakub Szymanik (2007). Computational Complexity of Some Ramsey Quantifiers in Finite Models. Bulletin of Symbolic Logic 13:281-282.
    The problem of computational complexity of semantics for some natural language constructions – considered in [M. Mostowski, D. Wojtyniak 2004] – motivates an interest in the complexity of Ramsey quantifiers in finite models. In general a sentence with a Ramsey quantifier R of the form R x,y H(x,y) is interpreted as ∃A (A is large relative to the universe ∧ A² ⊆ H); this clause is displayed more explicitly below. In the paper cited the problem of the complexity of the Hintikka sentence is reduced to the problem of computational (...)
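    For readability, the interpretation of a Ramsey quantifier quoted in the abstract can be written out as a single clause (this merely restates the clause above; the largeness condition is the paper's, not an addition):
    $R\,x,y\; H(x,y) \;\Longleftrightarrow\; \exists A\,\bigl(A \text{ is large relative to the universe} \;\wedge\; A^{2} \subseteq H\bigr)$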
  38. Alexandre Muzy, Franck Varenne, Bernard P. Zeigler, Jonathan Caux, Patrick Coquillard, Luc Touraille, Dominique Prunetti, Philippe Caillou, Olivier Michel & David R. C. Hill (2013). Refounding of the Activity Concept? Towards a Federative Paradigm for Modeling and Simulation. Simulation - Transactions of the Society for Modeling and Simulation International 89 (2):156-177.
    Currently, the widely used notion of activity is increasingly present in computer science. However, because this notion is used in specific contexts, it becomes vague. Here, the notion of activity is scrutinized in various contexts and, accordingly, put in perspective. It is discussed through four scientific disciplines: computer science, biology, economics, and epistemology. The definition of activity usually used in simulation is extended to new qualitative and quantitative definitions. In computer science, biology and economics disciplines, the new simulation activity definition (...)
  39. Carlo Penco & Daniele Porello (2010). Sense and Proof. In M. D'Agostino, G. Giorello, F. Laudisa, T. Pievani & C. Sinigaglia (eds.), New Essays in Logic and Philosophy of Science. College Publications.
    In this paper we give some formal examples of ideas developed by Penco in two papers on the tension inside Frege's notion of sense (see Penco 2003). The paper attempts to compose the tension between semantic and cognitive aspects of sense, through the idea of sense as proof or procedure – not as an alternative to the idea of sense as truth condition, but as complementary to it (as it happens sometimes in the old tradition of procedural semantics).
  40. I. Pitowsky (1996). Laplace's Demon Consults an Oracle: The Computational Complexity of Prediction. Studies in History and Philosophy of Science Part B 27 (2):161-180.
  41. Ian Pratt-Hartmann (2008). On the Computational Complexity of the Numerically Definite Syllogistic and Related Logics. Bulletin of Symbolic Logic 14 (1):1-28.
    The numerically definite syllogistic is the fragment of English obtained by extending the language of the classical syllogism with numerical quantifiers. The numerically definite relational syllogistic is the fragment of English obtained by extending the numerically definite syllogistic with predicates involving transitive verbs. This paper investigates the computational complexity of the satisfiability problem for these fragments. We show that the satisfiability problem (= finite satisfiability problem) for the numerically definite syllogistic is strongly NP-complete, and that the satisfiability problem (= finite (...)
  42. Arto Salomaa (1985). Computation and Automata. Cambridge University Press.
    This introduction to certain mathematical topics central to theoretical computer science treats computability and recursive functions, formal languages and automata, computational complexity, and cryptography. The presentation is essentially self-contained, with detailed proofs of all statements provided. Although it begins with the basics, it proceeds to some of the most important recent developments in theoretical computer science.
  43. Nathan Segerlind (2007). The Complexity of Propositional Proofs. Bulletin of Symbolic Logic 13 (4):417-481.
    Propositional proof complexity is the study of the sizes of propositional proofs and, more generally, the resources necessary to certify propositional tautologies. Questions about proof sizes have connections with computational complexity, theories of arithmetic, and satisfiability algorithms. This article includes a broad survey of the field, and a technical exposition of some recently developed techniques for proving lower bounds on proof sizes.
  44. Robert I. Soare (1977). Computational Complexity, Speedable and Levelable Sets. Journal of Symbolic Logic 42 (4):545-563.
  45. Larry Stockmeyer (1987). Classifying the Computational Complexity of Problems. Journal of Symbolic Logic 52 (1):1-43.
  46. Viggo Stoltenberg-Hansen (1980). On Computational Complexity in Weakly Admissible Structures. Journal of Symbolic Logic 45 (2):353-358.
  47. Jakub Szymanik (2010). Computational Complexity of Polyadic Lifts of Generalized Quantifiers in Natural Language. Linguistics and Philosophy 33 (3):215-250.
    We study the computational complexity of polyadic quantifiers in natural language. This type of quantification is widely used in formal semantics to model the meaning of multi-quantifier sentences. First, we show that the standard constructions that turn simple determiners into complex quantifiers, namely Boolean operations, iteration, cumulation, and resumption, are tractable (the usual definitions of iteration and cumulation are recalled below). Then, we provide an insight into the branching operation, which yields intractable natural language multi-quantifier expressions. Next, we focus on a linguistic case study. We use computational complexity results to investigate semantic (...)
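    For orientation, the iteration and cumulation lifts mentioned in the abstract can be recalled in their usual form from generalized quantifier theory (standard definitions, not quoted from the paper); for type $\langle 1,1\rangle$ quantifiers $Q_1, Q_2$, sets $A, B$ and a binary relation $R$:
    $\mathrm{It}(Q_1,Q_2)(A,B,R) \;\Longleftrightarrow\; Q_1(A,\{a : Q_2(B,\{b : R(a,b)\})\})$
    $\mathrm{Cum}(Q_1,Q_2)(A,B,R) \;\Longleftrightarrow\; Q_1(A,\{a : \exists b\,(B(b)\wedge R(a,b))\}) \;\wedge\; Q_2(B,\{b : \exists a\,(A(a)\wedge R(a,b))\})$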
  48. Jakub Szymanik (2009). Quantifiers in TIME and SPACE. Computational Complexity of Generalized Quantifiers in Natural Language. Dissertation, University of Amsterdam.
    In the dissertation we study the complexity of generalized quantifiers in natural language. Our perspective is interdisciplinary: we combine philosophical insights with theoretical computer science, experimental cognitive science and linguistic theories. In Chapter 1 we argue for identifying a part of meaning, the so-called referential meaning (model-checking), with algorithms. Moreover, we discuss the influence of computational complexity theory on cognitive tasks. We give some arguments to treat as cognitively tractable only those problems which can be computed in polynomial time. (...)
  49. Jakub Szymanik (2009). The Computational Complexity of Quantified Reciprocals. In Peter Bosch, David Gabelaia & Jérôme Lang (eds.), Logic, Language, and Computation: 7th International Tbilisi Symposium on Logic, Language, and Computation. Lecture Notes in Artificial Intelligence 5422. Springer.
    We study the computational complexity of reciprocal sentences with quantified antecedents. We observe a computational dichotomy between different interpretations of reciprocity, and shed some light on the status of the so-called Strong Meaning Hypothesis.
  50. Jakub Szymanik (2007). A Note on Some Neuroimaging Study of Natural Language Quantifiers Comprehension. Neuropsychologia 45 (9):2158-2160.
    We discuss the McMillan et al. (2005) paper devoted to studying brain activity during comprehension of sentences with generalized quantifiers. According to the authors, their results verify a particular computational model of natural language quantifier comprehension posited by several linguists and logicians (e.g., see van Benthem, 1986). We challenge this statement by invoking the computational difference between first-order quantifiers and divisibility quantifiers (e.g., see Mostowski, 1998). Moreover, we suggest other studies on quantifier comprehension, which can throw more light on (...)