We show that classical two-valued logic is included in weak extensions of normal three-valued logics and also that normal three-valued logics are best viewed not as deviant logics but instead as strong extensions of classical two-valued logic obtained by adding a modal operator and the right axioms. This article develops a general method for formulating the right axioms to construct a two-valued system with theorems that correspond to all of the logical truths of any normal three-valued logic. The extended classical system can then express anything that can be expressed in the three-valued logic, so there can be no reason to abandon two-valued logic in favor of three-valued logic. Moreover, the two-valued modal system is preferable, because it enables us to study interactions of different operators with different rationales. It also makes it easier to introduce quantifiers and iteration. Nothing is lost and much is gained by choosing the extended two-valued approach over normal three-valued logics.
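The general idea can be illustrated with a toy sketch (my own illustration, not the article's axiomatization): take the strong Kleene (K3) truth tables over the values {0, 1/2, 1} and add a "determinately true" operator D whose output is always classical, so that D-guarded formulas obey two-valued logic even though bare K3 atoms do not.

```python
# A minimal sketch, assuming strong Kleene (K3) tables and a
# "determinately true" operator D; this illustrates the general idea of
# recovering two-valued behavior inside a three-valued setting, not the
# article's own axiom system.
from fractions import Fraction

HALF = Fraction(1, 2)
VALUES = [Fraction(0), HALF, Fraction(1)]

def neg(a):      # K3 negation: 1 - a
    return 1 - a

def conj(a, b):  # K3 conjunction: minimum
    return min(a, b)

def disj(a, b):  # K3 disjunction: maximum
    return max(a, b)

def det(a):      # modal operator D: true iff a is (determinately) true
    return Fraction(1) if a == 1 else Fraction(0)

for a in VALUES:
    # D always yields a classical value...
    assert det(a) in (0, 1)
    # ...so excluded middle holds for D-guarded atoms, although
    # disj(a, neg(a)) takes the value 1/2 when a itself is 1/2:
    assert disj(det(a), neg(det(a))) == 1
```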
Pt. 1. The individual and his creator. The fear of God in our time -- Natural morality -- In-depth Torah study -- Levels of mitzvot -- The personal element in serving God -- Religious experience -- Naturalness in the worship of God -- The significance of Torah values -- Tension vs. tranquility in the worship of God -- Pt. 2. The individual and society. Fundamentals of prayer -- Derekh eretz, being a mensch -- "I dwell among my people" -- The obligation to sanctify God's name -- Attending to the needs of the community -- The message beyond mere words -- How to relate to one who has lost his faith -- Pt. 3. The individual and his life. Humanity -- Dealing with crisis -- Adhering to values -- Independent decision-making.
An individual's accountability to oneself leads to self-regulatory behaviour. A field experiment afforded an opportunity to test this relation, given that external accountability conditions were absent. A single-group pre-test/post-test design was used to test the hypothesis. A group of full-time resident management students, n ≈ 550, takes four meals a day in the institute mess. As part of the experiment, food wastage in the form of leftovers on the plates of subjects was measured. As a pre-test, the measurement occurred at two levels. Subjects could see how much they were adding to the total waste by looking at a weighing scale placed under a waste basket, and they could also see the total waste data for each of the four meals for the day and a day earlier displayed in a prominent place. After 105 days, the weighing scale under the basket was removed, and as a post-test measurement, the total waste data for the four meals were noted down for another 72 days. A manipulation test indicated that the experiment had the desired effect of invoking self-accountability in subjects during the pre-test phase and diluting it during the post-test phase. Time series analysis of the pre-test and post-test data indicated that wastage decreased in the pre-test phase; the post-test data, however, showed an increase over time. The results indicate that accountability conditions such as social norms invoke self-accountability cognition, leading to self-regulatory behaviours in individuals.
Among the alternatives to non-relativistic quantum mechanics (NRQM) there are those that make predictions different from those of quantum mechanics in as-yet-untested circumstances, while remaining compatible with current empirical findings. In order to test these predictions, one must isolate one's system from environmentally induced decoherence, which, on the standard view of NRQM, is the dynamical mechanism responsible for the 'apparent' collapse in open quantum systems. But while recent advances in condensed-matter physics may lead in the near future to experimental setups that will allow one to test the two hypotheses, namely genuine collapse vs. decoherence, and hence make progress toward a solution to the quantum measurement problem, those philosophers and physicists who advocate an information-theoretic approach to the foundations of quantum mechanics are still unwilling to acknowledge the empirical character of the issue at stake. Here I argue that in doing so they are displaying an unwarranted double standard.
It is occasionally claimed that the important work of philosophers, physicists, and mathematicians in the nineteenth and early twentieth centuries made Kant's critical philosophy of geometry look somewhat unattractive. Indeed, from the wider perspective of the discovery of non-Euclidean geometries, the replacement of Newtonian physics with Einstein's theories of relativity, and the rise of quantificational logic, Kant's philosophy seems "quaint at best and silly at worst". While there is no doubt that Kant's transcendental project involves his own conceptions of Newtonian physics, Euclidean geometry and Aristotelian logic, the issue at stake is whether the replacement of these conceptions collapses Kant's philosophy into an unfortunate embarrassment. Thus, in evaluating the debate over the contemporary relevance of Kant's philosophical project, one faces the following two questions: (1) Are there any contradictions between the scientific developments of our era and Kant's philosophy? (2) What is left of the Kantian legacy in light of our modern conceptions of logic, geometry and physics? Within this broad context, this paper aims to evaluate the Kantian project vis-à-vis the discovery and application of non-Euclidean geometries. Many important philosophers have evaluated Kant's philosophy of geometry throughout the last century, but opinions with regard to the impact of non-Euclidean geometries on it diverge. At the beginning of the century there was a consensus that the Euclidean character of space should be considered a consequence of the Kantian project, i.e., of the metaphysical view of space and of the synthetic a priori character of geometry. The impact of non-Euclidean geometries was then thought to undermine the Kantian project, since it implied, according to positivists such…
Recent studies in psychiatry reveal an acceptance of trauma acquired through the media. Traditionally restricted to immediate experience, Post-traumatic Stress Disorder (PTSD) is now expanding to include mediated experience. How did this development come about? How does mediated trauma manifest itself? What are its consequences? This essay addresses these questions through three cases: the 'trauma film paradigm', an early-1960s research program that employed films to simulate traumatic effects; the psychiatric study of the clinical effects of watching catastrophic events on television, culminating with the September 11 attacks; and reports on drone operators who exhibit PTSD symptoms after flying combat missions away from the war zone. The recognition of mediated trauma marks a qualitative change in the understanding of media effects, rendering the impact literal and the consequences clinical. What informs recent speculations about the possibility of trauma through media is a conceptual link between visual media and contemporary conceptions of trauma.
Some have argued that a subject has an inner awareness of its conscious mental states by virtue of the non-introspective, reflexive awareness that any conscious state has of itself. But what exactly is it like to have a ubiquitous and reflexive inner awareness of one's conscious states, as distinct from one's outer awareness of the apparent world? This essay derives a model of ubiquitous inner awareness (UIA) from Sebastian Watzl's recent theory of attention as the activity of structuring consciousness into an experiential center and periphery. I develop Watzl's theory into an account of UIA by suggesting that a subject is acquainted with its own conscious mental states through being reflexively aware of how these states are structured by attention into a unified subjective perspective. I compare this Watzl-inspired account of UIA favorably against other contemporary analytic and classical Buddhist accounts of reflexive awareness and subjective character, which variously ground the inner awareness of conscious states in their intrinsic phenomenal quality of "for-me-ness," their affective/hedonic valence, or a subject's disposition to introspect them. The Watzl-inspired account also accommodates possible counterexamples to Watzl's theory posed by states of minimal phenomenal experience such as lucid dreamless sleep and non-dual meditative awareness.
In a recent online lecture, the acclaimed novelist Amit Chaudhuri responded to an accusation that has greeted his fiction since the start of his literary career: that, since his novels contain people and events drawn from his own life (as he openly admits), they are better thought of as thinly disguised memoirs, and not really novels at all. In this paper, I discuss this charge by drawing on an account by the philosopher Stephen Mulhall of the work of another distinguished novelist, J.M. Coetzee (more specifically, that work which features the character Elizabeth Costello). In particular, I want to establish the pertinence to Chaudhuri's lecture of Mulhall's analogy between aspects of that work and the work of the influential art historian and critic Michael Fried on the history of modernist painting. In so doing, I aim to show that the commitment to the projects of literary modernism and realism which Mulhall sees in Coetzee (and Costello) can also be seen in Chaudhuri's understanding of the sense in which his novels both are, and are not, autobiographical.
The present study was motivated by the hypothesis that inputs from internal states in obsessive–compulsive individuals are attenuated, which could be one source of the pervasive doubting and checking in OCD. Participants who were high or low in OC tendencies were asked to produce specific levels of muscle tension with and without biofeedback, and their accuracy in producing the required muscle tension levels was assessed. As predicted, high OC participants performed more poorly than low OC participants on this task when biofeedback was not available. When biofeedback was provided, the difference between the groups was eliminated, and withdrawing the monitor reinstated it. Finally, when given the opportunity, high OC participants were more likely than low OC participants to request biofeedback. These results suggest that doubt in OCD may be grounded in a real and general deficiency in accessing internal states.
The neurophysiological evidence from the Miyashita group's experiments on monkeys, as well as cognitive experience common to us all, suggests that local neuronal spike-rate distributions might persist in the absence of their eliciting stimulus. In Hebb's cell-assembly theory, learning dynamics stabilize such self-maintaining reverberations. Quasi-quantitative modeling of the experimental data on internal representations in association-cortex modules identifies the reverberations as the internal code. This leads to cognitive and neurophysiological predictions, many following directly from the language used to describe the activity in the experimental delay period, others from the details of how the model captures the properties of the internal representations.
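The idea of learning dynamics stabilizing self-maintaining reverberations can be illustrated with a standard Hopfield-style attractor network (a generic textbook sketch, not the authors' model): after Hebbian learning, recurrent dynamics alone restore and then sustain the stored activity pattern once the stimulus is withdrawn.

```python
# A toy Hopfield-style sketch (generic, not the model in the target
# article): Hebbian outer-product learning stores one pattern, and the
# recurrent dynamics sustain it after the eliciting stimulus is gone.
import numpy as np

rng = np.random.default_rng(0)
N = 200
pattern = rng.choice([-1, 1], size=N)        # the learned representation

W = np.outer(pattern, pattern) / N           # Hebbian learning
np.fill_diagonal(W, 0)                       # no self-connections

state = pattern.copy()
flipped = rng.choice(N, size=40, replace=False)
state[flipped] *= -1                         # degraded cue: 20% corrupted

for _ in range(10):                          # stimulus absent from here on:
    state = np.sign(W @ state)               # recurrent dynamics only
    state[state == 0] = 1

overlap = (state @ pattern) / N              # 1.0 = pattern fully sustained
print(f"overlap with stored pattern: {overlap:.2f}")
```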
Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence-processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to mimic human behavior more closely than existing models. The first model uses a deep linguistic model, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus.
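As a toy illustration of the kind of combination the abstract describes (not the authors' actual models), one can compute word-by-word surprisal from a syntactic language-model probability modulated by a co-reference classifier's probability; all numbers below are made-up placeholders.

```python
# A hedged toy, not the models from the article: surprisal of a word
# given a syntactic LM probability and a co-reference classifier
# probability, treated as independent factors for simplicity.
import math

def surprisal(p_syntax, p_coref):
    """Surprisal in bits; higher values predict longer reading times."""
    return -math.log2(p_syntax * p_coref)

# The same syntactic slot gets different difficulty predictions once
# the co-reference component weighs in (placeholder probabilities):
print(surprisal(0.05, 0.9))   # antecedent strongly supported  ~4.5 bits
print(surprisal(0.05, 0.1))   # antecedent dispreferred        ~7.6 bits
```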
A book on the notion of fundamental length, covering issues in the philosophy of mathematics, metaphysics, and the history and philosophy of modern physics, from classical electrodynamics to current theories of quantum gravity. Published in 2014 by Cambridge University Press.
This study examines unethical purchasing practices from the perspective of buyer–supplier relationships. Based on a review of the inter-organizational literature and qualitative data from in-depth interviews with purchase managers from diverse industries, a conceptual framework is proposed, and theoretical arguments leading to propositions are presented. Taking into consideration the presence or absence of an explicit or implicit company policy sanctioning ethically questionable activities, unethical purchasing practices are conceptualized as a three-tiered set. Three broad themes emerge from the analysis toward explaining purchasing ethics from a buyer–seller perspective: (a) inter-organizational power issues (inter-organizational power and idiosyncratic investments), (b) inter-organizational relational issues (long-term orientation and satisfaction), and (c) interpersonal relational issues (interpersonal ties and trust). Theoretical and managerial implications of the conceptual framework are discussed.
Combining physics, mathematics and computer science, quantum computing and its sister discipline of quantum information have developed in the past few decades from visionary ideas into two of the most fascinating areas of quantum theory. General interest and excitement in quantum computing was initially triggered by Peter Shor (1994), who showed how a quantum algorithm could exponentially "speed up" classical computation and factor large numbers into primes far more efficiently than any (known) classical algorithm. Shor's algorithm was soon followed by several other algorithms that aimed to solve combinatorial and algebraic problems, and in the years since, the theoretical study of quantum systems serving as computational devices has achieved tremendous progress. Common belief has it that the implementation of Shor's algorithm on a large-scale quantum computer would have devastating consequences for current cryptography protocols, which rely on the premise that all known classical worst-case algorithms for factoring take time exponential in the length of their input (see, e.g., Preskill 2005). Consequently, experimentalists around the world are engaged in attempts to tackle the technological difficulties that prevent the realisation of a large-scale quantum computer. But regardless of whether these technological problems can be overcome (Unruh 1995; Ekert and Jozsa 1996; Haroche and Raimond 1996), it is noteworthy that no proof yet exists for the general superiority of quantum computers over their classical counterparts.

The philosophical interest in quantum computing is manifold. From a social-historical perspective, quantum computing is a domain where experimentalists find themselves ahead of their fellow theorists. Indeed, quantum mysteries such as entanglement and nonlocality were historically considered a philosophical quibble, until physicists discovered that these mysteries might be harnessed to devise new efficient algorithms. But while the technology for harnessing the power of 50–100 qubits (the basic units of information in a quantum computer) is now within reach (Preskill 2018), only a handful of quantum algorithms exist, and the question of whether these can truly outperform any conceivable classical alternative is still open. From a more philosophical perspective, advances in quantum computing may yield foundational benefits. For example, it may turn out that the technological capabilities that allow us to isolate quantum systems by shielding them from the effects of decoherence for a period of time long enough to manipulate them will also allow us to make progress on some fundamental problems in the foundations of quantum theory itself. Indeed, the development and implementation of efficient quantum algorithms may help us better understand the border between classical and quantum physics (Cuffaro 2017, 2018a; cf. Pitowsky 1994, 100), and perhaps even illuminate fundamental concepts such as measurement and causality. Finally, the idea that abstract mathematical concepts such as computability and complexity may not only be translated into physics, but also re-written by physics, bears directly on the autonomous character of computer science and the status of its theoretical entities, the so-called "computational kinds". As such, it is also relevant to the long-standing philosophical debate on the relationship between mathematics and the physical world.
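The reduction at the heart of Shor's algorithm, factoring via order-finding, can be shown with a purely classical sketch; the quantum speed-up lies entirely in the order-finding step, which is done below by brute force (standard number theory, not code from any of the works cited).

```python
# Classical toy of Shor's reduction: factoring N via the multiplicative
# order of a random a modulo N. Only the order() step is what a quantum
# computer accelerates; here it is brute force, hence exponential.
# Assumes N is odd, composite, and not a prime power.
from math import gcd
from random import randrange

def order(a, N):
    """Smallest r > 0 with a**r = 1 (mod N)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor(N):
    while True:
        a = randrange(2, N)
        g = gcd(a, N)
        if g > 1:
            return g                   # lucky: a already shares a factor
        r = order(a, N)
        if r % 2 == 0:
            y = pow(a, r // 2, N)      # a**(r/2) mod N, a square root of 1
            if y != N - 1:             # discard the trivial root -1
                return gcd(y - 1, N)   # guaranteed nontrivial factor of N

print(factor(15))                      # prints 3 or 5
```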
The authors use the theoretical notion of anomie to examine the impact of top management's control mechanisms on the environment of the marketing function. Based on a literature review and in-depth field interviews with marketing managers in diverse industries, a conceptual model is proposed that incorporates two managerial control mechanisms, viz. output and process control, and relates their distinctive influence to anomie in the marketing function. Three contingency variables, i.e., resource scarcity, power, and ethics codification, are proposed to moderate the relationship between control mechanisms and anomie. The authors also argue for a link between anomic environments and the propensity for unethical marketing practices to occur. Theoretical and managerial implications of the proposed conceptual model are discussed.
The Internet of Things (IoT) has recently seen rapid adoption for autonomous data exchange between devices. A large number of transformers are required to distribute power over a wide area, and to ensure their normal operation, live-detection and fault-diagnosis methods for power transformers are studied. This article presents an IoT-based approach for condition monitoring and control of the many distribution transformers utilized in a power distribution network. The research is carried out using the vibration analysis method. The results show that the accuracy of the improved diagnosis algorithm is 99.01, 100, and 100% for normal, aging, and faulty transformers, respectively. The system designed in this article can effectively monitor the healthy operation of power transformers remotely and in real time, improving the safety, stability, and reliability of transformer operation.
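As a hedged illustration of what a vibration-analysis check might look like (the article's actual improved diagnosis algorithm is not reproduced here), one could threshold the share of spectral energy near the transformer's fundamental vibration frequency; the sample rate, the 100 Hz fundamental, and the thresholds below are all assumptions.

```python
# Illustrative only, not the article's algorithm: classify a transformer
# vibration signal by how much spectral energy sits near the assumed
# 100 Hz fundamental. All constants are placeholder assumptions.
import numpy as np

FS = 10_000  # sample rate in Hz (assumed)

def band_energy_ratio(signal, f0=100.0, width=10.0):
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1 / FS)
    band = (freqs > f0 - width) & (freqs < f0 + width)
    return spectrum[band].sum() / spectrum.sum()

def diagnose(signal):
    ratio = band_energy_ratio(signal)
    if ratio > 0.8:
        return "normal"   # energy concentrated at the fundamental
    if ratio > 0.4:
        return "aging"    # growing sidebands / harmonics
    return "fault"        # broadband vibration

# Synthetic check: a clean 100 Hz hum with slight noise reads "normal".
t = np.arange(0, 1, 1 / FS)
signal = np.sin(2 * np.pi * 100 * t) + 0.01 * np.random.randn(len(t))
print(diagnose(signal))
```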
In quantum computing, where algorithms exist that can solve computational problems more efficiently than any known classical algorithms, the elimination of errors that result from external disturbances or from imperfect gates has become the ...
It is shown that if quantum physics is interpreted according to the philosophy of monistic idealism--that consciousness is the ground of all being--then some of the important dualisms of philosophy can be integrated.
Steven Pinker presents four ideals of the Enlightenment in his popular book Enlightenment Now: The Case for Reason, Science, Humanism, and Progress. He argues his case brilliantly and convincingly through cogent arguments in a language comprehensible to the reader of the present century, and whether it is reason, science, humanism or progress, he defends his position powerfully. He justifies his views by citing 75 graphs charting the improvements humanity has made in prosperity, longevity, education, equality of men and women, health, political freedom and medical breakthroughs. Though Pinker makes an excellent case for the positive contributions of the Enlightenment, he ignores the negative aspects that caused a great schism between the white race and others who are black and brown. The paper highlights some of the negative comments made by such Enlightenment thinkers as Montesquieu, Voltaire, Chambers, Down and Down and others. Through their literary and scientific writings, these scholars and researchers downgraded the black and brown races, causing a rift that led to slavery, colonialism and apartheid. The paper reveals these negative aspects ignored by Pinker in his otherwise well-researched book on the Enlightenment. Since Pinker presents a one-sided case by including only the positive contributions of the Enlightenment, I recommend that he write a sequel to his present work outlining the negative aspects responsible for the numerous political, social and environmental problems facing humanity today. By using dialectical logic in place of the logic of contraries, he might be able to synthesize both the positive and negative aspects of the Enlightenment. He can then argue that humanity might be propelled to make progress more efficiently, and at a faster pace, toward humanism and world peace.
We argue that current constructive approaches to the special theory of relativity do not derive the geometrical Minkowski structure from the dynamics, but rather assume it. We further argue that within current physics there can be no dynamical derivation of primitive geometrical notions such as length. In this, we believe, we continue an argument initiated by Einstein.
Since its establishment in 1979, the Video Archive for Holocaust Testimonies at Yale University has given rise to numerous studies of history, memory and trauma in the wake of the Holocaust. While acknowledging its audiovisual nature, previous accounts have nevertheless failed to consider the significance of this novel archival formation and how it shapes the production and reception of survivors' testimonies. This article occasions an unlikely encounter between the trauma and testimony discourse developed by Dori Laub, Shoshana Felman and Lawrence Langer in the context of the Yale archive, and the theory of "technical media" developed by the German media theorist Friedrich Kittler. It argues that the trauma and testimony discourse has a technological unconscious in the form of videotape technology, which crucially conditions the way trauma is conceived in this discourse. It is only with an audiovisual medium capable of capturing and reproducing evidence of the fleeting unconscious that a discourse concerned with the unarticulated past becomes intelligible.
We present a brief history of decoherence, from its roots in the foundations of classical statistical mechanics to the current spin-bath models in condensed matter physics. We analyze the philosophical import of the subject matter for three different foundational problems, and find that, contrary to the received view, decoherence is less instrumental to their solutions than is commonly believed. What makes decoherence more philosophically interesting, we argue, are the methodological issues it draws attention to, and the question of the universality of quantum mechanics.
This study examines unethical purchasing practices from the perspective of buyer–supplier relationships. Based on a review of the inter-organizational literature and qualitative data from in-depth interviews with purchase managers from diverse industries, a conceptual framework is proposed, and theoretical arguments leading to propositions are presented. Taking into consideration the presence or absence of an explicit or implicit company policy sanctioning ethically questionable activities, unethical purchasing practices are conceptualized as a three-tiered set. Three broad themes emerge from the analysis toward explaining purchasing ethics from a buyer–seller perspective: inter-organizational power issues, inter-organizational relational issues, and interpersonal relational issues. Theoretical and managerial implications of the conceptual framework are discussed.
Scientific realism is dead, or so many philosophers believe. Its death was announced when philosophers became convinced that one can accept all scientific results without committing oneself to metaphysical existence claims about theoretical entities (Fine 1986, 112). In addition, the inability of self-proclaimed scientific realists, despite recurrent demands, to distinguish themselves from their rival anti-realists (Stein 1989) didn't exactly help their cause. If realists cannot identify the key feature or features that set them apart from their opponents, then there is really no need to conduct a debate on scientific realism, is there?
Philosophers of the 17th and 18th centuries who worked within the tradition of modern natural law became interested in political economy in part as they attempted to reconcile two conflicting images of economic activity. On the one hand, from the legal point of view economic activity was understood as a morally neutral and benign activity that could be regulated by simple and clear rules of justice. On the other hand, it was seen as a realm of political struggle, manipulation, deceit and the exercise of hidden forms of domination. This article examines the legal and moral contexts of Adam Smith's excursion into political economy by interpreting the roles played by these two images of the market in the theory of value articulated in book I of The Wealth of Nations.
In the wake of the current financial crisis triggered by risky mortgage-backed securities, the question of ethics and risk-taking is once again front and center for both practitioners and academics. Although risk-taking is considered an integral part of strategic decision-making, firms can sometimes be propelled to take risks for reasons other than calculated strategic choice. The authors argue that a firm's risk-taking propensity is affected by its ethical climate (egoistic or benevolent) and its emphasis on output control to manage its marketing function. The firm's long-term orientation is argued to moderate the control–risk-propensity relationship. The authors also extend research on risk and performance, arguing that the association between risk-taking propensity and firm performance is contingent on the ownership structure (publicly traded versus privately held) of the firm. Based on survey data from a sample of manufacturing industries in the United States, the results show a significant impact of ethical climate and marketing output control on a firm's risk-taking propensity; risk-taking propensity also shows a stronger association with firm performance in privately held firms than in publicly traded firms.
Clinical pharmacology is a specialty with many attributes, and our association with the subject has allowed us to acquire, apply and disseminate myriad aspects of research and practice. Though clinical pharmacologists are conspicuous by virtue of their small number, recent years have shown a growing need for the course. In the review below we navigate through several aspects of the subject as we encountered them from time to time: from critical appraisal of the literature, to application of knowledge of drugs in clinical practice, and on to clinical and basic research, the drug development process, and policy making; these are but a few of the many fields which constitute the scope of clinical pharmacology. The importance of the subject lies in allowing a trainee to develop a broad overview of the entire process, from drug generation to drug distribution to drug utilization, a process meant for the greater common goal of better health for all. We foresee a bright future for the subject, though with a slight skepticism thrown in. In the present article, we make use of personal experiences and references from the literature to help you get a broad view of what clinical pharmacology means to us.
Recent suggestions to supply quantum mechanics (QM) with realistic foundations by reformulating it in light of quantum information theory (QIT) are examined and found wanting, by pointing to a basic conceptual problem that QIT itself ignores: the measurement problem. Since one cannot ignore the measurement problem and at the same time pretend to be a realist, the suggestions to reformulate QM in light of QIT are, as they stand, nothing but instrumentalism in disguise.
I discuss the philosophical implications that the rising new science of quantum computing may have for the philosophy of computer science. While quantum algorithms leave the notion of Turing computability intact, they may re-describe the abstract space of computational complexity theory, and hence militate against the autonomous character of some of the concepts and categories of computer science.
An interpretation of John Rawls's justice as fairness as a deliberative, critical, argumentative strategy for evaluating existing institutions is offered and its plausibility discussed. I argue that justice as fairness aims at synthesizing the moral values claimed by existing social institutions into a coherent model of a well-ordered society, in order to demand that these institutions stand up to the values they promise. Understood in this way, justice as fairness provides a set of idealizing mirrors through which power dynamics in society can be viewed, but does not function as a model for an ideal society. Key words: distributive justice; immanent criticism; justice as fairness; political liberalism; public reason; John Rawls; reflective equilibrium.
One of the recurrent problems in the foundations of physics is to explain why we rarely observe certain phenomena that are allowed by our theories and laws. In thermodynamics, for example, the spontaneous approach towards equilibrium is ubiquitous, yet the time-reversal-invariant laws that presumably govern thermal behaviour at the microscopic level equally allow spontaneous departures from equilibrium to occur. Why are the former processes frequently observed while the latter almost never reported? Another example comes from quantum mechanics, where the formalism, if considered complete and universally applicable, predicts the existence of macroscopic superpositions, monstrous Schrödinger cats, and these are never observed: while electrons and atoms enjoy the cloudiness of waves, macroscopic objects are always localized to definite positions.
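The time-reversal point can be spelled out in one line (a standard textbook observation, added here only to make the step explicit): Newtonian dynamics is invariant under t ↦ -t, so every equilibrium-approaching trajectory has a velocity-reversed twin that departs from equilibrium.

```latex
% If x(t) solves m\ddot{x} = -\nabla V(x), then so does y(t) := x(-t):
m\,\ddot{y}(t) = m\,\ddot{x}(-t) = -\nabla V\bigl(x(-t)\bigr) = -\nabla V\bigl(y(t)\bigr).
```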