Sport, Rules and Values presents a philosophical perspective on some issues concerning the character of sport. Central questions for the text are motivated by real-life sporting examples as described in newspaper reports. For instance, the (supposed) subjectivity of umpiring decisions is explored via an examination of the judging of ice-skating at the Salt Lake City Olympic Games of 2002. Throughout, the presentation is rich in concrete cases from sporting situations, including baseball, football, and soccer. While granting the constitutive nature of the rules of sport, discussion focuses on three broad uses commonly urged for rules: in defining sport; in judging or assessing sport (as deployed by judges or umpires); and in characterizing the value of sport, especially if that value is regarded as moral value. In general, McFee rejects a conception of the determinacy of rules as possible within sport, and a parallel picture of the determinacy assumed to be required by philosophy.
Several recent studies and initiatives have emphasized the importance of a strong ethical organizational DNA (ODNA) to create and promote an effective corporate governance culture of trust, integrity and intellectual honesty. This paper highlights the drawbacks of an excessively heavy reliance on rules-based approaches that increase the cost of doing business, overshadow essential elements of good corporate governance, create a culture of dependency, and can result in legal absolutism. The paper makes the case that the way forward for effective corporate governance is to strike an optimal balance between rules-based and principles-based approaches. The recent corporate scandals have demonstrated that the ethical ODNA is critical to the driving force and basis of legal and regulatory requirements. Effective governance means adhering to ethical principles, not merely complying with rules, and is a crucial guardian of a firm’s reputation and integrity. It is through an effective corporate governance program (that is, one that optimally captures and integrates the appropriate aspects of rules-based and principles-based approaches, and identifies and assesses the related risks) that an organization can reconfigure its ODNA for improved performance. Focusing on the ethical ODNA as the basis of new governance measures provides an opportunity to develop a competitive advantage as it represents a potential source of differentiation, strengthens the relationship with all stakeholders of the organization by building a culture of trust and integrity, and re-instills investor confidence. This paper employs dialectical reasoning that links the ODNA through principles-driven rules in developing a risks-based approach. A comparison from a risk assessment perspective between rules-based and principles-based approaches is presented.
Although there have been few applications employing dialectical reasoning in business research, this methodology can be extremely useful in isolating ethical issues and integrating them into the business process. The risks-based approach captures the benefits of both rules-based and principles-based approaches, and incorporates trust-based principles such as solidarity, subsidiarity and covenantal relationships.
We reflect on lessons that the lottery and preface paradoxes provide for the logic of uncertain inference. One of these lessons is the unreliability of the rule of conjunction of conclusions in such contexts, whether the inferences are probabilistic or qualitative; this leads us to an examination of consequence relations without that rule, the study of other rules that may nevertheless be satisfied in its absence, and a partial rehabilitation of conjunction as a ‘lossy’ rule. A second lesson is the possibility of rational inconsistent belief; this leads us to formulate criteria for deciding when an inconsistent set of beliefs may reasonably be retained.
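The lottery makes the failure of conjunction easy to see in code. The sketch below is my own toy illustration, not from the paper: qualitative acceptance is modelled as meeting a probability threshold, and conjoining individually acceptable conclusions yields an unacceptable one.

```python
def accept(prob, threshold=0.99):
    """Qualitative acceptance modelled as meeting a probability threshold."""
    return prob >= threshold

n = 1000                   # fair lottery: 1000 tickets, exactly one winner
p_each_loses = 1 - 1 / n   # 0.999 -- "ticket i loses" is acceptable for each i
p_all_lose = 0.0           # some ticket must win, so the conjunction is certainly false

assert accept(p_each_loses)     # every conjunct is individually acceptable
assert not accept(p_all_lose)   # ...but the conjunction of the conclusions is not
```

A 'lossy' conjunction rule, in this spirit, would only guarantee a weaker lower bound on the conjunction's probability rather than preserve acceptability outright.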
The value of solidarity, which is exemplified in noble groups like the Civil Rights Movement along with more mundane teams, families and marriages, is distinctive in part because people are in solidarity over, for or with regard to something, such as common sympathies, interests, values, etc. I use this special feature of solidarity to resolve a longstanding puzzle about enacted social moral rules: aren’t these things just heuristics, rules of thumb or means of coordination that we ‘fetishize’ or ‘worship’ if we stubbornly insist on sticking to them when we can do more good by breaking them? I argue that when we are in a certain kind of solidarity with others, united by social moral rules that we have established among ourselves, the rules we have developed and maintain are a constitutive part of our solidary relationships with one another; and it is part of being in this sort of solidarity with our comrades that we are presumptively required to follow the social moral rules that join us together. Those in the Polish Revolution, for example, were bound by informally enforced rules about publicity, free speech and the use of violence, so following their own rules became a way of standing in a valuable sort of solidarity with one another. I explain why we can have non-instrumental reasons to follow the social moral rules that exist in our own society, to improve our rules and even sometimes to break the otherwise good rules that help to unite us.
Rules proliferate; some are kept with a bureaucratic stringency bordering on the absurd, while others are manipulated and ignored in ways that injure our sense of justice. Under what conditions should we make exceptions to rules, and when should they be followed despite particular circumstances? The two dominant models in the current literature on rules are the particularist account and that which sees the application of rules as normative. Taking a position that falls between these two extremes, Alan Goldman is the first to provide a systematic framework to clarify when we need to follow rules in our moral, legal, and prudential decisions, and when we ought not to do so. This book will be of great interest to advanced students and professionals working in philosophy, law, decision theory, and the social sciences.
Genuine rules cannot capture our intuitive moral judgments because, if usable, they mention only a limited number of factors as relevant to decisions. But morally relevant factors are both numerous and unpredictable in the ways they interact to change priorities among them. Particularists have pointed this out, but their account of moral judgment is also inadequate, leaving no room for genuine reasoning or argument. Reasons must be general even if not universal. Particularists can insist that our judgments be reflective, unbiased, informed, and sensitive, requiring a background of experiences that expand sympathy and empathy for others. But beyond this, our judgments must be coherent. This requirement provides a way to reason to the correct answer to a controversial issue—the answer most coherent with our body of settled judgments. Rawls' account of coherence in terms of reflective equilibrium, where we adjust particular judgments to match rules and adjust rules to match judgments, is rejected since rules have no independent force. Instead, the central requirement is that we not judge cases differently without being able to cite a morally relevant difference between them. Such differences must make a difference elsewhere as well, although they need not do so universally. Factors cannot be relevant in only one context because they reflect values that must recur to be maintained. The method of moral reasoning based on this requirement is specified as follows: first, the specification of competing values or interests in the problematic case; second, the location of paradigm cases in which these competing values are prioritized, making sure that these settled judgments are reflective, informed, and sensitive; third, the search for relevant differences between the settled and problematic cases or the location of alternative, more closely analogous paradigms. The paper ends with an illustration of the method applied to the issue of doctor-assisted suicide.
I summarise a conception of morality as containing a set of rules which hold ceteris paribus and which impose pro-tanto obligations. I explain two ways in which moral rules are ceteris-paribus, according to whether an exception is duty-voiding or duty-overriding. I defend the claim that moral rules are ceteris-paribus against two qualms suggested by Luke Robinson’s discussion of moral rules and against the worry that such rules are uninformative. I show that Robinson’s argument that moral rules cannot ground pro-tanto obligations is unsound, because it confuses an absolute reason for an obligation with a reason for an absolute obligation, and because it overlooks the possibility that priority rules may be rules for ordering pro-tanto obligations rather than rules for eliminating contenders for the status of absolute obligation.
This article examines the learning of a scientific procedure, and its connection to the greater scientific community through the notion of Wittgensteinian rules. The analysis reveals this connection by demonstrating that learning in interaction is largely grounded in rule-based community descriptions and judgments rather than any inner process. This same analysis also demonstrates that learning processes are particularly suited for such an analysis because rules and concomitant phenomena comprise a significant portion of any learning interaction. This analysis further reveals the elucidating merit of Wittgensteinian rules, their relation to community and the concept of practice, and promotes the efficacy of participant-generated rule-formulations as analytic descriptors.
This article questions the continued use and application of EVA® (economic value added) because it is epistemologically a non-sequitur, fails to satisfy the requirements of sound research methodology in terms of being a reliable and valid metric, and is unlikely to satisfy the requirements of Rule 702 of the Federal Rules of Evidence. In the light of these insufficiencies, the continued use of EVA® is ethically questionable, and moreover in time is likely to result in class actions.
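For context, the metric under criticism is conventionally computed as after-tax operating profit minus a charge for the capital employed. A minimal sketch of that textbook formula, with hypothetical figures (the numbers are invented for illustration):

```python
def eva(nopat, wacc, invested_capital):
    """Economic value added: net operating profit after tax minus a
    capital charge at the weighted average cost of capital (WACC)."""
    return nopat - wacc * invested_capital

# Hypothetical firm: $15m NOPAT, 12.5% WACC, $100m invested capital.
assert eva(15_000_000, 0.125, 100_000_000) == 2_500_000
```

The article's methodological complaint can be read off the inputs: NOPAT, WACC and invested capital each rest on contestable accounting adjustments, so the output inherits their unreliability.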
The article discusses burden of proof rules in social criticism. By social criticism I mean an argumentative situation in which an opponent publicly argues against certain social practices; the examples I consider are discrimination on the basis of species and discrimination on the basis of one's nationality. I argue that burden of proof rules assumed by those who defend discrimination are somewhat dubious. In social criticism, there are no shared values which would uncontroversially determine what is the reasonable presumption and who has the burden of proof, nor are there formal rules which would end the debate and determine the winner at a specific point.
A certain type of inference rule in (multi-)modal logics, generalizing Gabbay's Irreflexivity rule, is introduced, and some general completeness results about modal logics axiomatized with such rules are proved.
Inflectional morphology has been taken as a paradigmatic example of rule-governed grammatical knowledge (Pinker, 1999). The plausibility of this claim may be related to the fact that it is mainly based on studies of English, which has a very simple inflectional system. We examined the representation of inflectional morphology in Serbian, which encodes number, gender, and case for nouns. Linguists standardly characterize this system as a complex set of rules, with disagreements about their exact form. We present analyses of a large corpus of nouns which showed that, as in English, Serbian inflectional morphology is quasiregular: It exhibits numerous partial regularities creating neighborhoods that vary in size and consistency. We then asked whether a simple connectionist network could encode this statistical information in a manner that also supported generalization. A network trained on 3,244 Serbian nouns learned to produce correctly inflected phonological forms from a specification of a word's lemma, gender, number, and case, and generalized to untrained cases. The model's performance was sensitive to variables that also influence human performance, including surface and lemma frequency. It was also influenced by inflectional neighborhood size, a novel measure of the consistency of meaning to form mapping. A word-naming experiment with native Serbian speakers showed that this measure also affects human performance. The results suggest that, as in English, generating correctly inflected forms involves satisfying a small number of simultaneous probabilistic constraints relating form and meaning. Thus, common computational mechanisms may govern the representation and use of inflectional information across typologically diverse languages.
We describe PADUA, a protocol designed to support two agents debating a classification by offering arguments based on association rules mined from individual datasets. We motivate the style of argumentation supported by PADUA, and describe the protocol. We discuss the strategies and tactics that can be employed by agents participating in a PADUA dialogue. PADUA is applied to a typical problem in the classification of routine claims for a hypothetical welfare benefit. We particularly address the problems that arise from the extensive number of misclassified examples typically found in such domains, where the high error rate is a widely recognised problem. We give examples of the use of PADUA in this domain, and explore in particular the effect of intermediate predicates. We have also done a large scale evaluation designed to test the effectiveness of using PADUA to detect misclassified examples, and to provide a comparison with other classification systems.
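Arguments of the kind PADUA exchanges rest on mined association rules, which are conventionally scored by support and confidence. A minimal sketch of those two measures over a hypothetical welfare-style dataset (the attribute names are invented for illustration, not taken from the paper):

```python
def support(itemset, transactions):
    """Fraction of records containing every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    """conf(X -> Y) = support(X and Y together) / support(X)."""
    return support(antecedent | consequent, transactions) / support(antecedent, transactions)

# Invented welfare-style records: one set of attribute facts per case.
records = [
    {"age>60", "contrib>=10y", "entitled"},
    {"age>60", "contrib>=10y", "entitled"},
    {"age>60", "contrib<10y", "not_entitled"},
    {"age<60", "contrib>=10y", "not_entitled"},
]

rule_conf = confidence({"age>60", "contrib>=10y"}, {"entitled"}, records)
assert rule_conf == 1.0   # the mined rule holds on every matching record
```

Misclassified records of the kind the paper discusses would show up here as rules whose confidence is dragged below 1.0 by noisy cases, which is what gives an opposing agent material for counterargument.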
The paper introduces the concept of polar decision rules and establishes that majority rules are polar rules. We identify second-best rules and penultimate rules in cases where majority rules are optimal or most inferior, respectively. In particular, we specify the almost-expert rule and the almost-majority rule as the secondary rules of the expert and majority rules, respectively.
Summary. A small step is made in the direction of defining some general basic rules which can serve as a framework for research in several fields of the social sciences. The method of working with analogies asks for a more accurate approach. Starting from the concept of evolution in the form of a basic rule, another basic rule is formulated. This rule shows which factors are most important in long-term developments and what types of development one can expect.
We consider the problem of choosing the location of a public facility either (a) on a tree network or (b) in a Euclidean space. (a) (1996) characterize the class of target rules on a tree network by Pareto efficiency and population-monotonicity. Using Vohra's (1999) characterization of rules that satisfy Pareto efficiency and replacement-domination, we give a short proof of the previous characterization and show that it also holds on the domain of symmetric preferences. (b) The result obtained for model (a) proves to be crucial for the analysis of the problem of choosing the location of a public facility in a Euclidean space. Our main result is the characterization of the class of coordinatewise target rules by unanimity, strategy-proofness, and either replacement-domination or population-monotonicity.
My main concern is to characterize accurately the project represented in the _Rules_, especially in its relation to early-modern logic; towards the end of the paper, I briefly consider why Descartes stopped work on the _Rules_.
Visser's rules form a basis for the admissible rules of intuitionistic propositional logic IPC. Here we show that this result can be generalized to arbitrary intermediate logics: Visser's rules form a basis for the admissible rules of any intermediate logic for which they are admissible. This implies that if Visser's rules are derivable in an intermediate logic, then that logic has no nonderivable admissible rules. We also provide a necessary and sufficient condition for the admissibility of Visser's rules. We apply these results to some specific intermediate logics and obtain that Visser's rules form a basis for the admissible rules of, for example, De Morgan logic, and that Dummett's logic and the propositional Gödel logics do not have nonderivable admissible rules.
We discuss a `negative' way of defining frame classes in (multi)modal logic, and address the question of whether these classes can be axiomatized by derivation rules, the `non-ξ rules', styled after Gabbay's Irreflexivity Rule. The main result of this paper is a metatheorem on completeness, of the following kind: If Λ is a derivation system having a set of axioms that are special Sahlqvist formulas and Λ+ is the extension of Λ with a set of non-ξ rules, then Λ+ is strongly sound and complete with respect to the class of frames determined by the axioms and the rules.
The purpose of this research review is to examine the usefulness of reconstructing problematic interpersonal conflict behavior as violations of rules for critical discussions. Dialectical reconstruction of interpersonal conflict behavior sheds light on the ways in which dialectical fallacies influence not only the course of a critical discussion, but also the personal and relationship outcomes experienced by arguers. Conflict sequences such as cross complaining and demand/withdraw are shown to be problematic, in part, because they prevent parties from resolving their differences through rational dialogue. The paper concludes by presenting some implications of the pragma-dialectical reconstruction of interpersonal conflict behavior.
In this paper we argue that hybrid logic is the deductive setting most natural for Kripke semantics. We do so by investigating hybrid axiomatics for a variety of systems, ranging from the basic hybrid language (a decidable system with the same complexity as orthodox propositional modal logic) to the strong Priorean language (which offers full first-order expressivity). We show that hybrid logic offers a genuinely first-order perspective on Kripke semantics: it is possible to define base logics which extend automatically to a wide variety of frame classes and to prove completeness using the Henkin method. In the weaker languages, this requires the use of non-orthodox rules. We discuss these rules in detail and prove non-eliminability and eliminability results. We also show how another type of rule, which reflects the structure of the strong Priorean language, can be employed to give an even wider coverage of frame classes. We show that this deductive apparatus gets progressively simpler as we work our way up the expressivity hierarchy, and conclude the paper by showing that the approach transfers to first-order hybrid logic.
This paper analyzes the problem of deriving a ranking of fixed-cardinality subsets of a universal set from a given ranking of the elements of this universal set. Only subsets with a given number of elements are being ranked, which is where the approach in this paper differs from the literature on extension rules that establish preference relations on the power set of the universal set. Common examples of areas where such preferences on subsets with a fixed cardinality are needed are elections of committees of a given size, many-to-one matchings, and decision problems under ignorance. The main result of the paper is a characterization of a class of lexicographic rank-ordered rules by means of two axioms, namely, a responsiveness condition used in the matching literature and a well-known neutrality requirement which ensures that the names of the alternatives are irrelevant for the ranking of the sets.
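One simple member of the lexicographic family can be sketched directly. The variant below, a best-first comparison of sorted rank vectors, is my own illustration rather than the paper's axiomatized class, but it shows the shape of such rules: a fixed-size committee is preferred when its best member outranks the other's best member, ties being broken by the next-best members in turn.

```python
def rank_vector(subset, ranking):
    """Positions of a subset's members in the base ranking, best first."""
    pos = {x: i for i, x in enumerate(ranking)}
    return sorted(pos[x] for x in subset)

def lex_better(a_set, b_set, ranking):
    """a_set beats b_set under a best-first lexicographic rank-ordered rule."""
    return rank_vector(a_set, ranking) < rank_vector(b_set, ranking)

ranking = ["a", "b", "c", "d"]   # a is best, d is worst

# {a, d} beats {b, c}: its best member a tops everything in the rival set.
assert lex_better({"a", "d"}, {"b", "c"}, ranking)
# With best members tied, the second-best member decides.
assert lex_better({"b", "c"}, {"b", "d"}, ranking)
```

Replacing a member by a better-ranked element can only improve the rank vector, so this variant is responsive in the sense used in the matching literature; and since only positions in the base ranking matter, names of alternatives are irrelevant, matching the neutrality requirement.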
Defeasible rules are said to allow for the following two-staged sequence, viz., that p → q and yet p & r → not-q. This is puzzling because in the logic of conditionals the sufficiency of p for q cannot normally be undermined if one adds to the antecedent a further proposition r. Critics argue that the better approach to comprehending defeasibility is explicitly to represent the limiting factor r in a single-stage articulation of the rule, viz., as p & not-r → q. This is a more complete statement of the rule; where the old rule announced, confidently and without qualification, that in circumstances or class of cases p consequence q follows, the new rule admits from the outset that q only follows if we add the qualifier that limits the antecedent circumstances to the smaller class of cases p & not-r. This is not only more complete, the critics argue, it is also a more accurate or truthful statement of the rule. The rule p → q, articulated at the first stage of the defeasible process, is only a pretender encountered along the way to this more complete truth. In this paper I show how, and why, contrary to the critics, the process of private law adjudication supports the multi-staged structure and representation of defeasible legal rules. What matters when one party makes a claim against another party is not so much that the claim is true (or even that it is advanced as a legitimate claim under the rule that is most likely to be true), but rather that the claim is of the sort that the party can justifiably say is true. The latter, at least, is what carries the plaintiff past the initial burden of proof, and what calls for the defendant to reply if the plaintiff is not to carry the day. I illustrate the argument by way of the court’s reluctance in a tort case to entertain a purely statistical basis for a plaintiff’s claim that the defendant has caused her injury.
Rather, the court will insist on direct evidence that appears to arise out of the particular circumstances of the case as the plaintiff alleges them to be, even if, as a purely statistical matter, the latter sort of claim is less likely to be true than the claim grounded in pure statistical likelihood. I relate this argument for defeasible rules (and against rules that are more truthful) to some important arguments offered by the philosopher Stephen Darwall in his recent book The Second-Person Standpoint. Private law’s defeasible rules, I argue, capture the special respect (and interpersonal accountability) that attaches only to rights arising out of “second personal” claiming (and counterclaiming) and which are not realizable merely as a matter of “third personal” (moral) truth.
In this article it is pointed out what kind of rules for communication and argumentation are required in order to make it possible to resolve disputes in an orderly way. In section 2, Gricean maxims and Searlean speech act conditions are integrated in such a way that five general rules for communication can be formulated. In section 3, starting from Lewis's definition of convention, it is argued that the interactional effect of accepting is conventionally linked with the complex communicative act of argumentation. In section 4, the rules for argumentation are placed in a dialogical perspective.
The evolution of boundedly rational rules for playing normal form games is studied within stationary environments of stochastically changing games. Rules are viewed as algorithms prescribing strategies for the different normal form games that arise. It is shown that many of the “folk results” of evolutionary game theory, typically obtained with a fixed game and fixed strategies, carry over to the present environments. The results are also related to some recent experiments on rules and games.
We give a characterization of majority voting rules with quorums in the framework of May's seminal article (Econometrica 20:680–684, 1952). According to these voting rules, an alternative is socially chosen if and only if it obtains the relative majority of votes and the total number of voters not abstaining reaches the quorum.
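The characterized rules are straightforward to state computationally. A minimal sketch, assuming votes are tallied as for, against and abstentions, with the quorum applying to the non-abstaining voters as the abstract describes:

```python
def quorum_majority(votes_for, votes_against, quorum):
    """Chosen iff votes_for wins the relative majority of votes cast and
    the number of non-abstaining voters (for + against) reaches the quorum."""
    return votes_for + votes_against >= quorum and votes_for > votes_against

assert quorum_majority(30, 20, quorum=40)       # 50 voting >= 40, and 30 > 20
assert not quorum_majority(30, 5, quorum=40)    # clear majority, but turnout misses the quorum
```

The second case is the distinctive feature of these rules: massive abstention can defeat an alternative that every voter who bothered to vote supported.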
If the Visser rules are admissible for an intermediate logic, they form a basis for the admissible rules of the logic. How to characterize the admissible rules of intermediate logics for which not all of the Visser rules are admissible is not known. In this paper we give a brief overview of results on admissible rules in the context of intermediate logics. We apply these results to some well-known intermediate logics. We provide natural examples of logics for which the Visser rules are derivable, admissible but nonderivable, or not admissible.
For small panels of experts (e.g., boards of managers, courts, specialized committees), n ≦ 5, this paper provides an algorithm for ranking the seven efficient and commonly used weighted majority rules by their respective performance. These rules are termed efficient since they constitute the set of potentially optimal decision rules in uncertain, symmetric, pairwise choice situations. The main contribution of this study is the discovery of an essential ordering of six of these rules, which entails that the set of possible rankings of the seven rules is almost single-peaked. The essential ordering significantly reduces the number of possible rankings of the rules, and thus simplifies the development of the ranking algorithm. It also has important applications when the available information on the experts' decisional skills is incomplete.
The wide gap between the prescriptions of moral rules and actual human behaviour is attributed to two factors which undermine the authority of moral rules, the one mainly affecting people's behaviour as individuals, and the other their behaviour as members of collectives. (a) Morality suffers from an inner tension: if it allows exemptions, e.g. that lying may be used as a retaliatory or protective measure, then its domain is eroded; if it allows no exceptions, it is too stringent and it is flouted. Hence the moral agent who is the upholder of morality is also the transgressor of its rules. (b) Groups have evolved hostile and oppressive institutions for dealing with each other. In the setting up of these institutions and in the practising of them, individual moral responsibility is attenuated to vanishing point, so that moral rules are ineffectual in a large area of human interaction.
If self-interested behavior conflicts with the collective welfare, rules of cooperation are often installed to prevent egoistic behavior. We hypothesized that installing such rules may instigate personal moral norms of cooperation, but that they fail in doing so when installed by a leader who is self-interested rather than self-sacrificing. Three studies confirmed this and also showed that, consequently, only self-sacrificing leaders were able to install rules that increase cooperation without the need for a perfectly operating monitoring system.
Non-compensatory aggregation rules are applied in a variety of problems such as voting theory, multi-criteria analysis, composite indicators, web ranking algorithms and so on. A major open problem is the fact that non-compensability implies the analytical cost of losing all available information about intensity of preference, i.e. if some variables are measured on interval or ratio scales, they have to be treated as measured on an ordinal scale. Here this problem has been tackled in its most general formulation, that is when mixed measurement scales (interval, ratio and ordinal) are used and both stochastic and fuzzy uncertainties are present. The objectives of this article are first to present a comprehensive review of useful solutions already proposed in the literature and second to advance the state of the art, mainly in the theoretical guarantee that weights have the meaning of importance coefficients and can be summarized in a voting matrix. This is a key result for using non-compensatory Condorcet-consistent rules. A proof on the probability of existence of ties in the voting matrix is also developed.
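The voting matrix mentioned at the end can be sketched as follows: each criterion acts as a voter whose importance weight it casts for every pairwise comparison it supports. This is a toy illustration of the general idea, not the article's formal construction; alternatives, rankings and weights are invented.

```python
def voting_matrix(profiles, weights):
    """profiles: one best-first ranking of the alternatives per criterion;
    weights: importance coefficient of each criterion (summing to 1).
    Returns V with V[a][b] = total weight of criteria placing a above b."""
    alts = profiles[0]
    V = {a: {b: 0.0 for b in alts if b != a} for a in alts}
    for ranking, w in zip(profiles, weights):
        pos = {x: i for i, x in enumerate(ranking)}
        for a in alts:
            for b in alts:
                if a != b and pos[a] < pos[b]:
                    V[a][b] += w
    return V

# Three hypothetical criteria ranking alternatives x, y, z (best first).
profiles = [["x", "y", "z"], ["y", "x", "z"], ["x", "z", "y"]]
weights = [0.5, 0.25, 0.25]
V = voting_matrix(profiles, weights)
assert V["x"]["y"] == 0.75 and V["x"]["z"] == 1.0   # x beats both rivals here
```

Because only the pairwise order within each criterion enters V, the construction is non-compensatory: a huge cardinal advantage on one criterion counts no more than a narrow one, which is exactly the information loss the article discusses.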
This chapter concerns Hume's account in Book I of the Treatise of Human Nature (1739) of the operation of ‘general rules’. It considers their relation to conceptions of regularity, probability, circumstance, and experience that obtained in early modern logic and natural philosophy, taking occasion to reflect upon the significance of Hume's claim, in the Enquiry Concerning Human Understanding, that natural philosophy and moral philosophy are ‘derived from the same principles’. It concludes by suggesting that a number of Hume's essays are structured as reflections and refinements upon commonly-held general rules.
Empirical studies have demonstrated that uncertainty about event probabilities, also known as ambiguity or second-order uncertainty, can affect decision makers’ choice preferences. Despite the importance of second-order uncertainty in decision making, almost no effort has been directed towards the development of methods that evaluate the accuracy of second-order probabilities. In this paper, we describe conditions under which strictly proper scoring rules can be used to assess the accuracy of second-order probability judgments. We investigate the effectiveness of using a particular strictly proper scoring rule - the ranked probability score - to discourage biased assessments of second-order uncertainty.
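The ranked probability score itself has a standard closed form: the sum of squared differences between the cumulative forecast distribution and the cumulative outcome distribution over the ordered categories. A minimal sketch of that textbook definition (the example forecasts are invented):

```python
def ranked_probability_score(forecast, outcome_index):
    """RPS for a probabilistic forecast over ordered categories:
    sum over categories of (forecast CDF - outcome CDF)^2.
    Lower is better; the rule is strictly proper, so honest
    reporting minimizes the expected score."""
    rps, cdf_forecast, cdf_outcome = 0.0, 0.0, 0.0
    for k, p in enumerate(forecast):
        cdf_forecast += p
        cdf_outcome += 1.0 if k == outcome_index else 0.0
        rps += (cdf_forecast - cdf_outcome) ** 2
    return rps

# Three ordered categories; the middle one occurs (index 1).
assert ranked_probability_score([0.0, 1.0, 0.0], 1) == 0.0   # perfect forecast
assert ranked_probability_score([1.0, 0.0, 0.0], 1) == 1.0   # all mass one category too low
```

Because the score works on cumulative distributions, it penalizes mass placed far from the outcome more than mass placed in an adjacent category, which is what makes it apt for ordered judgments.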
In this paper I consider the concept of an illocutionary rule - i.e., a rule of the form "X counts as Y in context C" - and examine the role it plays in explaining the nature of verbal communication and the conventionality of natural languages. My aim is to find a middle ground between John R. Searle's view, according to which every conventional speech act has to be explained in terms of illocutionary rules that underlie its performance, and the view held by Ruth G. Millikan, who seems to suggest that the formula "X counts as Y in context C" has no application in our theorizing about human linguistic practice. I claim, namely, that the concept of an illocutionary rule is theoretically useful, though not explanatorily basic. I argue that using the formula "X counts as Y in context C" we can classify illocutionary acts by what Millikan calls their conventional outcomes, and thereby make them susceptible to naturalistic explanation.
As the ongoing literature on the paradoxes of the Lottery and the Preface reminds us, the nature of the relation between probability and rational acceptability remains far from settled. This article provides a novel perspective on the matter by exploiting a recently noted structural parallel with the problem of judgment aggregation. After offering a number of general desiderata on the relation between finite probability models and sets of accepted sentences in a Boolean sentential language, it is noted that a number of these constraints will be satisfied if and only if acceptable sentences are true under all valuations in a distinguished non-empty set W. Drawing inspiration from distance-based aggregation procedures, various scoring rule based membership conditions for W are discussed and a possible point of contact with ranking theory is considered. The paper closes with various suggestions for further research.
Rule utilitarianism has recently enjoyed a resurgence of interest triggered by Brad Hooker’s sophisticated treatment in Ideal Code, Real World. An intriguing new debate has now broken out about how best to formulate rule utilitarianism – whether to evaluate candidate moral codes in terms of the value of their consequences at a fixed rate (such as 90%) of social acceptance (as Hooker contends), or to evaluate codes in terms of the value of their consequences throughout the entire range of possible acceptance rates (as Hooker’s opponent Ridge contends). I shall argue that both Hooker’s fixed-rate rule-utilitarianism and Ridge’s variable-rate rule-utilitarianism, suitably interpreted and revised, survive the criticisms that each theorist lodges against the other. But I shall use the insights gained through this examination to argue that both these forms of rule utilitarianism, arguably the best available, fall prey to two fatal problems that have gone unnoticed in these debates, or indeed in most debates about rule utilitarianism. The weaknesses I describe in Hooker and Ridge’s forms of rule utilitarianism threaten to undermine all versions of rule utilitarianism, not just the nuanced versions developed by Hooker and Ridge.
Pettit presents a selection of essays touching upon metaphysics, philosophical psychology, and the theory of rational regulation. The first part of the book discusses the rule-following character of thought. The second considers how choice can be responsive to different sorts of factors, while still being under the control of thought. The third examines the implications of this view of choice and rationality for the normative regulation of social behavior.
We investigate the computational complexity of deciding whether a given inference rule is admissible for some modal and superintuitionistic logics. We state a broad condition under which the admissibility problem is coNEXP-hard. We also show that admissibility in several well-known systems (including GL, S4, and IPC) is in coNEXP, thus obtaining a sharp complexity estimate for admissibility in these systems.
Lewis’s view of the way conventions are passed on may have some especially interesting consequences for the study of language. I’ll start by briefly discussing agreements and disagreements that I have with Lewis’s general views on conventions and then turn to how linguistic conventions spread. I’ll compare views of mainstream generative linguistics, in particular, Chomsky’s views on how syntactic forms are passed on, with the sort of view of language acquisition and language change advocated by usage-based or construction grammars, which seem to fit better with Lewis’s ideas. Then I will illustrate the interest of Lewis’s perspective on the dissemination of conventions with a variety of linguistic examples.
Many of our categorization experiences are non-transitive. For some objects a, b and c, a and b can appear indistinguishable, and likewise b and c, but a and c can appear distinguishable. Many categories also appear to be smooth; transitions between cases are not experienced as sharp, but rather as continuous. These two features of our categorization experiences tend to be addressed separately. Moreover, many views model smoothness by making use of infinite degrees. This paper presents a methodological strategy that shows how solving the transitivity problem first can thereby introduce explanatory resources that feed into an account of smoothness without having to make use of infinite degrees.
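The non-transitivity at issue can be modelled as a tolerance relation: two stimuli match whenever they differ by less than a just-noticeable difference. A minimal sketch (my own illustration, with an arbitrary threshold and stimulus values):

```python
def indistinguishable(a, b, jnd=1.0):
    """Two stimuli match when they differ by less than a just-noticeable
    difference -- a tolerance relation, which need not be transitive."""
    return abs(a - b) < jnd

# Three stimuli spaced just under one jnd apart.
a, b, c = 0.0, 0.7, 1.4
assert indistinguishable(a, b) and indistinguishable(b, c)
assert not indistinguishable(a, c)   # transitivity fails across the chain
```

Chains of such pairwise matches are also what make category transitions feel continuous, which is why the two phenomena invite the joint treatment the paper proposes.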