The Devonian Duvernay Formation in northwest central Alberta, Canada, has become a hot play in the past few years due to its richness in liquid and gaseous hydrocarbon resources. Oil and gas generation in this shale formation made it the source rock for many oil and gas fields in its vicinity. We attempt to showcase the characterization of the Duvernay Formation using 3D multicomponent seismic data integrated with the available well-log and other relevant data. This has been done by deriving rock-physics parameters through deterministic simultaneous and joint impedance inversion, followed by appropriate quantitative interpretation. In particular, we determine the brittleness of the Duvernay interval, which helps us identify the sweet spots therein. The scope of this characterization exercise was extended to explore the induced seismicity observed in the area, which is perceived to be associated with hydraulic-fracture stimulation of the Duvernay and has attracted media coverage lately. We attempt to integrate our results with the induced-seismicity data available in the public domain and elaborate on the learning experience gained so far.
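The abstract does not specify which rock-physics parameters were derived from the simultaneous inversion; one common choice, offered here only as an illustrative assumption, is Goodway's Lamé attributes, computed from the inverted P- and S-impedances \(I_P\) and \(I_S\) and then cross-plotted to flag brittle, organic-rich zones:

\[
\lambda\rho = I_P^{\,2} - 2\,I_S^{\,2}, \qquad \mu\rho = I_S^{\,2}.
\]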
In the collectivist culture of India, family occupies a central place in organizing the social and personal lives of the people. However, the forces of industrialization and urbanization are changing lifestyles and leading to a reprioritization of values. Against this backdrop, this study examined the pattern of actual and desired family values in the context of ecology, family type and generation. The sample was drawn from urban, semi-urban and rural areas in central India, representing parent–child pairs belonging to joint and nuclear families. Using an indigenously developed measure of family values, the study revealed that ecological setting and generation significantly influenced the family value of positive interaction. The type of family emerged as a significant factor for three major values, that is, social order, manners and helping. The interaction of ecology, family type and generation was significant for the values of social order and helping. In general, the type of family and ecological setting exerted the major influence in shaping values. The implications of the findings for parenting are discussed.
The North Delhi Fold Belt (NDFB) exposure of the Delhi Supergroup of rocks is significant for its structurally controlled uranium mineralization. The Narnaul-Palsana tract within the Khetri subbasin of the NDFB comprises the arenaceous Alwar and argillaceous Ajabgarh Groups of the Delhi Supergroup. The metasedimentary sequence has been subjected to polyphase deformation and igneous intrusion. We used heliborne magnetic data to enhance our geologic understanding of the area. Total magnetic intensity data are gridded and enhanced to resolve the magnetic anomalies. The regional magnetic signature reveals a deep-seated fracture. Varying concentrations of magnetic minerals in different lithologies are reflected in the magnetic response and provide clues to the formational trends. Trend lines and breaks are extracted from the magnetic signature. Thematic analysis of trend lines reveals formational trends that indicate an antiformal and synformal fold pattern in different sectors of the study area. The spatial correlation of the fold patterns is used to decipher the tectonic sequence. Superimposition of antiformal folding over an earlier antiform-synform structure and displacement due to later faulting is inferred. Magnetic data analysis is used as a tool to unravel the regional structural fabric of the area, which is widely concealed below soil cover.
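The abstract does not specify which enhancement filters were applied to the gridded total magnetic intensity T; one widely used enhancement for tracing formational trend lines over variable depth-to-source, given here only as an illustrative assumption, is the tilt derivative,

\[
\mathrm{TDR} = \arctan\!\left(\frac{\partial T/\partial z}{\sqrt{(\partial T/\partial x)^{2} + (\partial T/\partial y)^{2}}}\right),
\]

whose zero contour tends to track the edges of magnetic sources regardless of anomaly amplitude.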
Misconduct in medical science research is an unfortunate reality. Science, for the most part, operates on the basis of trust. Researchers are expected to carry out their work and report their findings honestly. But, sadly, that is not how science always gets done. Reports keep surfacing from various countries about plagiarised work, doctored results and fabricated data. Scientific misconduct is a scourge afflicting the field of science, yet it unfortunately receives little attention in developing countries like India, especially in health care services. A recent survey and a meta-analysis suggest that the few cases that do float up represent only the tip of a large iceberg. This paper therefore highlights the reasons for misconduct and the steps that can be taken to reduce it. The paper also throws light on the Indian scenario in relation to misconduct.
This paper delves into a two-agent scheduling problem in which two agents compete for a single resource. Each agent has a set of jobs to be processed by a single machine. The processing time, release time, weight, and due date of each job are known in advance. Both agents have their own objectives, which are conflicting in nature. The first agent tries to minimize the total completion time, while the second agent tries to minimize the number of tardy jobs. The two-agent scheduling problem, an NP-hard problem, has a wide variety of applications ranging from the manufacturing industry to cloud computing service providers. Due to this wide applicability, each variant of the problem requires a different algorithm, adapted to the user's requirements. This paper provides mathematical models, heuristic algorithms, and two nature-based metaheuristic algorithms to solve the problem. The algorithms' performance was gauged against the optimal solution obtained from the AMPL-CPLEX solver in terms of both solution quality and computational time. The outlined metaheuristics produce comparable solutions within a short computational time. For medium-sized problems, the proposed metaheuristics even obtain better solutions than the CPLEX solver, while requiring much less computation time.
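As a purely illustrative, hypothetical sketch of the trade-off described above (not the authors' heuristic or metaheuristic), the following Python snippet interleaves agent A's jobs ordered by shortest processing time with agent B's jobs ordered by earliest due date on a single machine, and then reports agent A's total completion time and agent B's number of tardy jobs:

from dataclasses import dataclass

@dataclass
class Job:
    agent: str        # 'A' or 'B'
    p: int            # processing time
    d: int = 10**9    # due date (only meaningful for agent B in this toy)

def schedule(jobs):
    """Greedy merge: A-jobs by SPT, B-jobs by EDD; pick B's next job if it can
    still finish on time (or is shorter than A's next job), otherwise pick A's."""
    a = sorted((j for j in jobs if j.agent == 'A'), key=lambda j: j.p)
    b = sorted((j for j in jobs if j.agent == 'B'), key=lambda j: j.d)
    t, seq = 0, []
    while a or b:
        if b and (not a or t + b[0].p <= b[0].d or b[0].p <= a[0].p):
            j = b.pop(0)
        else:
            j = a.pop(0)
        t += j.p
        seq.append((j, t))          # (job, completion time)
    return seq

def evaluate(seq):
    total_completion_A = sum(c for j, c in seq if j.agent == 'A')
    tardy_B = sum(1 for j, c in seq if j.agent == 'B' and c > j.d)
    return total_completion_A, tardy_B

if __name__ == "__main__":
    jobs = [Job('A', 3), Job('A', 5), Job('B', 4, d=6), Job('B', 2, d=12)]
    total_A, tardy_B = evaluate(schedule(jobs))
    print(total_A, tardy_B)  # agent A's total completion time, agent B's tardy-job count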
The paper considers two-agent order acceptance scheduling problems with different scheduling criteria. Two agents have a set of jobs to be processed by a single machine. The processing time and due date of each job are known in advance. In the order acceptance scheduling problem, jobs are allowed to be rejected. The objective of the problem is to maximize the net revenue while keeping the weighted number of tardy jobs for the second agent within a predetermined value. A mixed-integer linear programming (MILP) formulation is provided to obtain the optimal solution. Because the problem is NP-hard, the MILP can be used to solve only small problem instances optimally. To solve problem instances of realistic size, heuristic and metaheuristic algorithms are proposed. A heuristic method is used to secure a quick solution, while a metaheuristic based on particle swarm optimization (PSO) is designed to obtain near-optimal solutions. Numerical experiments are conducted on benchmark instances obtained from the literature to test the performance of the proposed algorithms. The proposed PSO obtains solutions within 0.1% of the optimum for problem instances of up to 60 jobs, and its performance is found to be significantly better than that of the heuristic.
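As a hedged sketch of the kind of MILP such a problem admits (the notation, the big-M sequencing constraints, and the exact definition of net revenue are assumptions of this illustration, not taken from the paper), let y_j indicate acceptance of job j with revenue r_j and processing time p_j, let x_{jk} indicate that j precedes k, let C_j be its completion time, and let U_j be a tardy indicator for the second agent's jobs J_2 with weights v_j and bound Q:

\[
\begin{aligned}
\max\ & \sum_{j} r_j\,y_j\\
\text{s.t.}\ & x_{jk} + x_{kj} \ge y_j + y_k - 1, && j < k,\\
& C_k \ge C_j + p_k - M\,(3 - x_{jk} - y_j - y_k), && j \ne k,\\
& C_j \ge p_j\,y_j, && \forall j,\\
& C_j - d_j \le M\,U_j, && j \in J_2,\\
& \sum_{j \in J_2} v_j\,U_j \le Q,\\
& y_j,\ U_j,\ x_{jk} \in \{0,1\},\quad C_j \ge 0 .
\end{aligned}
\]

Penalty terms (for example, weighted tardiness of the first agent's accepted jobs) would be subtracted from the objective depending on how the paper defines net revenue.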
A Bayesian approach using wavelet-coefficient modeling is proposed for de-noising additive white Gaussian noise in medical magnetic resonance imaging (MRI). In a parallel acquisition process, the magnetic resonance image is affected by white Gaussian noise, which is additive in nature. A normal inverse Gaussian probability distribution function is used to model the wavelet coefficients, and a Bayesian approach is implemented for filtering the noisy wavelet coefficients. The maximum likelihood estimator and the median absolute deviation estimator are used to find the signal parameters, signal variances, and noise variance of the distribution. The minimum mean square error (MMSE) estimator is used to estimate the true wavelet coefficients. The proposed method is evaluated on MR images. Performance and image-quality parameters show that the proposed method reduces noise more effectively than other state-of-the-art methods, providing 8.83%, 2.02%, 6.61%, and 30.74% improvement in peak signal-to-noise ratio, structural similarity index, Pratt's figure of merit, and Bhattacharyya coefficient, respectively, over existing well-accepted methods. The effectiveness of the proposed method is further evaluated using the mean squared difference (MSD), which measures the degree of dissimilarity; the MSD of 0.000324 for the proposed method is lower than that of the other existing methods. Experimental results show that the proposed method achieves better signal-to-noise ratio performance than the other tested de-noising methods.
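The following Python sketch illustrates only the overall pipeline described above (wavelet decomposition, MAD-based noise estimate, per-subband MMSE-style shrinkage, reconstruction). It substitutes a Gaussian signal prior for the paper's normal inverse Gaussian prior, so the shrinkage rule is an assumption of this illustration rather than the paper's estimator; the PyWavelets package is assumed to be available.

import numpy as np
import pywt

def denoise_mri(image, wavelet="db4", levels=3):
    # Multilevel 2D wavelet decomposition.
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    # Noise standard deviation from the finest diagonal subband (MAD estimator).
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    out = [coeffs[0]]  # keep the approximation subband untouched
    for (cH, cV, cD) in coeffs[1:]:
        shrunk = []
        for c in (cH, cV, cD):
            # Empirical signal variance for the subband (clipped at zero),
            # then a Wiener-style MMSE shrinkage factor.
            sig_var = max(np.mean(c ** 2) - sigma ** 2, 0.0)
            shrunk.append(c * sig_var / (sig_var + sigma ** 2))
        out.append(tuple(shrunk))
    rec = pywt.waverec2(out, wavelet)
    return rec[: image.shape[0], : image.shape[1]]  # crop any reconstruction padding

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.outer(np.hanning(128), np.hanning(128))        # stand-in "image"
    noisy = clean + 0.05 * rng.standard_normal(clean.shape)   # additive white Gaussian noise
    restored = denoise_mri(noisy)
    print(np.mean((noisy - clean) ** 2), np.mean((restored - clean) ** 2))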
Natural fracture networks (NFNs) are used in unconventional reservoir simulators to model pressure and saturation changes in fractured rocks. These fracture networks are often derived from well data, or from well data combined with a variety of seismic-derived attributes that provide spatial information away from the wells. In cases in which there is a correlation between faults and fractures, the use of a fault indicator can provide additional constraints on the spatial location of the natural fractures. We use a fault attribute based on fault-oriented semblance as a secondary conditioner for the generation of NFNs. In addition, the distribution of faults automatically extracted from the fault-oriented semblance is used to augment the well-derived statistics for natural fracture generation. Without the benefit of this automated fault-extraction solution, manually extracting the fault statistics from the seismic data would be prohibitively tedious and time consuming. Finally, we determine, on a 3D field unconventional data set, that the use of fault-oriented semblance results in simulations that are significantly more geologically reasonable.
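As a toy, hypothetical illustration of what secondary conditioning can mean in this setting (not the workflow or software used in the paper), the Python snippet below draws fracture centres from a non-homogeneous Poisson process whose cell-wise intensity is a well-derived background rate modulated by a normalized fault-likelihood attribute:

import numpy as np

def sample_fracture_centres(fault_attr, base_rate, rng=None):
    """fault_attr: array scaled to [0, 1]; base_rate: expected fractures per cell
    where the attribute equals 1."""
    rng = rng or np.random.default_rng()
    lam = base_rate * np.clip(fault_attr, 0.0, 1.0)   # cell-wise Poisson intensity
    counts = rng.poisson(lam)                          # fracture count per cell
    return np.argwhere(counts > 0), counts

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    attr = rng.random((50, 50)) ** 4        # stand-in fault-likelihood map, skewed low
    centres, counts = sample_fracture_centres(attr, base_rate=0.5, rng=rng)
    print(len(centres), counts.sum())       # cells with fractures, total fracture count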
We propose a novel edge detector for images corrupted by Gaussian noise using a proximal support vector machine (PSVM). The edges of a noisy image are detected using a two-stage architecture: smoothing of the image is first performed using regularized anisotropic diffusion (RAD), followed by classification using the PSVM; we term this the RAD-PSVM method. In this process, a feature vector is formed for each pixel using the denoised coefficient's class and the local orientations, so that edges can be detected in all possible directions in an image. From experiments conducted on both synthetic and benchmark images, we observe that our RAD-PSVM approach outperforms other state-of-the-art edge detection approaches, both qualitatively and quantitatively.
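The following Python sketch illustrates only the smoothing stage, using a generic (unregularized) Perona-Malik anisotropic diffusion followed by a gradient-magnitude feature; the paper's regularized diffusion and the PSVM classification stage are not reproduced here, so treat this as an assumption-laden stand-in for the pre-processing step:

import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, gamma=0.2):
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Finite-difference gradients toward the four neighbours.
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # Edge-stopping conductance (Perona-Malik, exponential flavour).
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += gamma * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    step = np.zeros((64, 64)); step[:, 32:] = 1.0          # synthetic vertical edge
    noisy = step + 0.2 * rng.standard_normal(step.shape)
    smooth = anisotropic_diffusion(noisy)
    grad_mag = np.hypot(*np.gradient(smooth))              # feature a classifier could use
    print(grad_mag.max())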
THE IMMORTAL FLY: ETERNAL WHISPERS. WHO IS SHE? Author: Rituparna Ray Chaudhuri. My book, 'The Immortal Fly: Eternal Whispers: Based on True Events of a Family', has recently been published by Partridge (USA) in association with Penguin Random House (UK) and has achieved a separate Google identity.

As the author of the book, I thought to define for myself what 'depression' means. I wanted to explain it in many ways, but the quotation that seemed best to me was: "My life will end someday, but it will end at my convenience." To be accurate about the medical explanation of the term 'depression', I went deep into it and drew on medical reports from India and abroad, with special mention of our family's Dr. Amit De, MD, Senior Consultant at Narayana Multispeciality Hospital, Barasat, Kolkata (India), and many other medical associates of that hospital, as imbibed in the book. I then thought the term was best defined by: "Death is nature's way of saying, 'Your table is ready.'"

I am missing her. A year has now passed without her, and unlike before, everything is becoming more scattered, gloomy and desolate. She is nowhere to hear my words, yet she is still the only one with whom I can share my feelings intensely. Even now, when I close my eyes, I can visualize what I left a year ago, on 7 February 2019 at 8.20 a.m. in the hospital, after a continuous struggle of fifty days. On the fifty-first day my father said, "The end of our fifty years of relationship has been completed with these fifty days." Whoever she may have been to others, she is our legend. To me, she is 'My Ma'.

The story begins, 'I failed preciously on success of my life.' Simplicity, innocence, belief and faith met unknowingly with filthy waves skilfully immersed in betrayal, sorcery, jealousy, greed, revenge and lies. The Daughter writes, "I had asked Ma many times, but with her impenetrable personality and her dynamic words to everyone, delivered with a tinge of a smile on her face, she was reluctant to continue her conversation with me. I thought, hence, that I must not be indefinite in my spoken words. Whom shall I blame?" Based on the true story of a family that came from South Calcutta (India) to a suburb, staying at the home of the Daughter's maternal grandmother, this book reveals, through facts and true events, how Destiny unknowingly played a further, abominable role in the Fate of the Daughter, until one day, on 7 February 2019, everything was finished by 8.20 a.m. The Daughter is therefore left alone on this earth with the immortal words written in her diary, 'Eternal Whispers': "My words to myself: that I am to fulfil my Ma's wish."

Keywords: 1. Diary and True Events 2. The Chaotic Society 3. Fatality 4. Of a Family 5. Science, Philosophy and Literature 6. Severe Depression 7. Medical Journey.

The alternative title of the book: The Greatest Mistake or Fortune. The book mainly carries the intense words of a journey through the relationship between a Mother and her Daughter, and leaves readers in an abrupt situation where one must indeed define "Man is the creature under circumstances..."
In this book, Deep K. Datta-Ray strives to explore some of the deep foundations of Indian diplomacy with and beyond the discourse of modernity, especially its preoccupation with power, control, and violence. Datta-Ray argues that modern diplomacy is rooted in a model of violence and control, and that Indian diplomacy is striving to move beyond this. Indian diplomacy draws inspiration from India's civilizational ethos and its preoccupation with dharma, right conduct, and a non-violent way of being with the world. For Datta-Ray, the Indian approach to diplomacy, as it draws from the civilizational streams of the Ramayana and Mahabharata as well as Indo-Mughal experiments in creative diplomacy and Gandhian and Nehruvian...
The Utica Formation in eastern Ohio possesses all the prerequisites for a successful unconventional play. Attempts at seismic reservoir characterization of the Utica Formation were discussed in part 1, in which, after providing the geologic background of the area of study, we explained the preconditioning of prestack seismic data, well-log correlation, and the building of robust low-frequency models for prestack simultaneous impedance inversion. All these efforts were aimed at the identification of sweet spots in the Utica Formation in terms of organic richness as well as brittleness. Here we elaborate on some aspects of that exercise, such as the challenges we faced in the determination of the total organic carbon (TOC) volume and the computation of brittleness indices based on mineralogical and geomechanical considerations. The prediction of TOC in the Utica play using a methodology suited to cases in which limited seismic as well as well-log data are available is demonstrated first. Thereafter, because no universally accepted indicator of brittleness exists, mechanical as well as mineralogical attempts to extract brittleness information for the Utica play are discussed. Although an attempt is made to determine brittleness from mechanical rock-physics parameters derived from seismic data, the available X-ray diffraction data and regional petrophysical modeling make it possible to determine the brittleness index from mineralogical data and thereafter to derive it from seismic data.
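Two brittleness definitions of the kind alluded to above are commonly used in the literature; they are given here as representative examples, since the paper's exact mineral weighting and normalization bounds may differ. A mineralogical index treats quartz (and sometimes carbonate) as the brittle fraction of the rock, while a geomechanical, Rickman-style index averages Young's modulus E and Poisson's ratio \(\nu\) after normalizing each over the interval of interest:

\[
\mathrm{BI}_{\mathrm{min}} = \frac{W_{\mathrm{qtz}}}{W_{\mathrm{qtz}} + W_{\mathrm{carb}} + W_{\mathrm{clay}}},
\qquad
\mathrm{BI}_{\mathrm{mech}} = \frac{1}{2}\left[\frac{E - E_{\min}}{E_{\max} - E_{\min}} + \frac{\nu_{\max} - \nu}{\nu_{\max} - \nu_{\min}}\right].
\]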
A book on the notion of fundamental length, covering issues in the philosophy of mathematics, metaphysics, and the history and philosophy of modern physics, from classical electrodynamics to current theories of quantum gravity. Published in 2014 by Cambridge University Press.
Spanning forty years of Ray's career, these essays, for the first time collected in one volume, present the filmmaker's reflections on the art and craft of the cinematic medium and include his thoughts on sentimentalism, mass culture, ...
Combining physics, mathematics and computer science, quantum computing and its sister discipline of quantum information have developed in the past few decades from visionary ideas into two of the most fascinating areas of quantum theory. General interest and excitement in quantum computing were initially triggered by Peter Shor (1994), who showed how a quantum algorithm could exponentially "speed up" classical computation and factor large numbers into primes far more efficiently than any (known) classical algorithm. Shor's algorithm was soon followed by several other algorithms that aimed to solve combinatorial and algebraic problems, and in the years since, the theoretical study of quantum systems serving as computational devices has achieved tremendous progress. Common belief has it that the implementation of Shor's algorithm on a large-scale quantum computer would have devastating consequences for current cryptography protocols, which rely on the premise that all known classical worst-case algorithms for factoring take time exponential in the length of their input (see, e.g., Preskill 2005). Consequently, experimentalists around the world are engaged in attempts to tackle the technological difficulties that prevent the realisation of a large-scale quantum computer. But regardless of whether these technological problems can be overcome (Unruh 1995; Ekert and Jozsa 1996; Haroche and Raimond 1996), it is noteworthy that no proof exists yet for the general superiority of quantum computers over their classical counterparts.

The philosophical interest in quantum computing is manifold. From a social-historical perspective, quantum computing is a domain where experimentalists find themselves ahead of their fellow theorists. Indeed, quantum mysteries such as entanglement and nonlocality were historically considered a philosophical quibble, until physicists discovered that these mysteries might be harnessed to devise new efficient algorithms. But while the technology for harnessing the power of 50–100 qubits (the basic unit of information in the quantum computer) is now within reach (Preskill 2018), only a handful of quantum algorithms exist, and the question of whether these can truly outperform any conceivable classical alternative is still open. From a more philosophical perspective, advances in quantum computing may yield foundational benefits. For example, it may turn out that the technological capabilities that allow us to isolate quantum systems, by shielding them from the effects of decoherence for a period of time long enough to manipulate them, will also allow us to make progress on some fundamental problems in the foundations of quantum theory itself. Indeed, the development and implementation of efficient quantum algorithms may help us understand better the border between classical and quantum physics (Cuffaro 2017, 2018a; cf. Pitowsky 1994, 100), and perhaps even illuminate fundamental concepts such as measurement and causality. Finally, the idea that abstract mathematical concepts such as computability and complexity may not only be translated into physics, but also re-written by physics, bears directly on the autonomous character of computer science and the status of its theoretical entities, the so-called "computational kinds". As such it is also relevant to the long-standing philosophical debate on the relationship between mathematics and the physical world.
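To make the claimed exponential separation concrete (the figures below are standard textbook estimates, not taken from the entry itself, and the gate count for Shor's algorithm depends on the multiplication circuit used): for an n-bit integer, Shor's algorithm requires on the order of

\[
O\!\left(n^{2}\log n\,\log\log n\right)
\]

elementary quantum gates, whereas the best known classical factoring algorithm, the general number field sieve, runs in heuristic time

\[
\exp\!\left(O\!\left(n^{1/3}(\log n)^{2/3}\right)\right).
\]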
Huw Price (1996, 2002, 2003) argues that causal-dynamical theories that aim to explain thermodynamic asymmetry in time are misguided. He points out that in seeking a dynamical factor responsible for the general tendency of entropy to increase, these approaches fail to appreciate the true nature of the problem in the foundations of statistical mechanics (SM). I argue that it is Price who is guilty of misapprehending the issue at stake. When properly understood, causal-dynamical approaches in the foundations of SM offer a solution to a different problem, one that unfortunately receives no attention in Price's celebrated work.
The neurophysiological evidence from the Miyashita group's experiments on monkeys, as well as cognitive experience common to us all, suggests that local neuronal spike-rate distributions might persist in the absence of their eliciting stimulus. In Hebb's cell-assembly theory, learning dynamics stabilize such self-maintaining reverberations. Quasi-quantitative modeling of the experimental data on internal representations in association-cortex modules identifies the reverberations as the internal code. This leads to cognitive and neurophysiological predictions, many following directly from the language used to describe the activity in the experimental delay period, others from the details of how the model captures the properties of the internal representations.
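As a schematic, hypothetical illustration of the idea that Hebbian learning can stabilize a self-maintaining reverberation (this is a binary Hopfield-style toy, not the rate-based cortical-module model discussed above), the Python snippet below stores one pattern with a Hebbian outer-product rule and shows that the network's activity settles on, and holds, that pattern after the noisy eliciting stimulus is removed:

import numpy as np

rng = np.random.default_rng(0)
N = 100
pattern = rng.choice([-1, 1], size=N)            # the stored "internal representation"
W = np.outer(pattern, pattern) / N               # Hebbian outer-product weights
np.fill_diagonal(W, 0.0)

# Noisy stimulus: the pattern with ~20% of units flipped; then the stimulus is removed.
state = np.where(rng.random(N) < 0.2, -pattern, pattern)
for _ in range(10):                              # free-running (delay-period) dynamics
    state = np.sign(W @ state)
    state[state == 0] = 1

print("overlap with stored pattern:", float(pattern @ state) / N)  # ~1.0 if activity persists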
Prof. G. C. Pande, in his work 'Studies in the Origins of Buddhism', speaks of the theory of relation (paccaya) while discussing the principle of dependent origination (paṭiccasamuppāda). The theory of relation (paccaya) is a law explaining how the dhammas exist by being related through certain relations. It is a further extension of the law of dependent origination (paṭiccasamuppāda). Things come into existence in our day-to-day life. The law of dependent origination explains that they come into existence depending upon some other factors. The theory of relation explains that such dependence on the other dhammas is possible due to certain relations. In other words, paṭiccasamuppāda explains the process of the existence of conditioned things, while the relation (paccaya) explains the relation existing between the different phases coming into existence. Such relations, too, are explained only with respect to conditioned things.
This article analyzes how the medical gaze made possible by MRI operates in radiological laboratories. It argues that although computer-assisted medical imaging technologies such as MRI shift radiological analysis to the realm of cyborg visuality, radiological analysis continues to depend on visualization produced by other technologies and diagnostic inputs. In the radiological laboratory, MRI is used to produce diverse sets of images of the internal parts of the body in order to zero in on and visually extract the pathology. Visual extraction of pathology becomes possible, however, because of the visual training of radiologists in understanding and interpreting the anatomic details of the whole body. These two levels of viewing constitute the bifocal vision of the radiologists. To make these levels of viewing work complementarily, the body, as it is presented in the body atlases, is made notational.
It is occasionally claimed that the important work of philosophers, physicists, and mathematicians in the nineteenth and early twentieth centuries made Kant's critical philosophy of geometry look somewhat unattractive. Indeed, from the wider perspective of the discovery of non-Euclidean geometries, the replacement of Newtonian physics with Einstein's theories of relativity, and the rise of quantificational logic, Kant's philosophy seems "quaint at best and silly at worst". While there is no doubt that Kant's transcendental project involves his own conceptions of Newtonian physics, Euclidean geometry and Aristotelian logic, the issue at stake is whether the replacement of these conceptions collapses Kant's philosophy into an unfortunate embarrassment. Thus, in evaluating the debate over the contemporary relevance of Kant's philosophical project, one is faced with the following two questions: (1) Are there any contradictions between the scientific developments of our era and Kant's philosophy? (2) What is left of the Kantian legacy in light of our modern conceptions of logic, geometry and physics? Within this broad context, this paper aims to evaluate the Kantian project vis-à-vis the discovery and application of non-Euclidean geometries. Many important philosophers have evaluated Kant's philosophy of geometry throughout the last century, but opinions with regard to the impact of non-Euclidean geometries on it diverge. At the beginning of the century there was a consensus that the Euclidean character of space should be considered a consequence of the Kantian project, i.e., of the metaphysical view of space and of the synthetic a priori character of geometry. The impact of non-Euclidean geometries was then thought to undermine the Kantian project since it implied, according to positivists such...
I discuss the philosophical implications that the rising new science of quantum computing may have for the philosophy of computer science. While quantum algorithms leave the notion of Turing computability intact, they may re-describe the abstract space of computational complexity theory and hence militate against the autonomous character of some of the concepts and categories of computer science.
Recent suggestions to supply quantum mechanics (QM) with realistic foundations by reformulating it in light of quantum information theory (QIT) are examined and found wanting, by pointing to a basic conceptual problem that QIT itself ignores, namely, the measurement problem. Since one cannot ignore the measurement problem and at the same time pretend to be a realist, the suggestions to reformulate QM in light of QIT are, as they stand, nothing but instrumentalism in disguise.
This study examines unethical purchasing practices from the perspective of buyer–supplier relationships. Based on a review of the inter-organizational literature and qualitative data from in-depth interviews with purchase managers from diverse industries, a conceptual framework is proposed, and theoretical arguments leading to propositions are presented. Taking into consideration the presence or absence of an explicit or implicit company policy sanctioning ethically questionable activities, unethical purchasing practices are conceptualized as a three-tiered set. Three broad themes emerge from the analysis toward explaining purchasing ethics from a buyer–seller perspective: (a) inter-organizational power issues (inter-organizational power and idiosyncratic investments), (b) inter-organizational relational issues (long-term orientation and satisfaction), and (c) interpersonal relational issues (interpersonal ties and trust). Theoretical and managerial implications of the conceptual framework are discussed.
An individual's accountability to oneself leads to self-regulatory behaviour. A field experiment afforded an opportunity to test this relation, given that external accountability conditions were absent. A single-group pre-test/post-test design was used to test the hypothesis. A group of full-time resident management students, n ≈ 550, takes four meals a day in the institute mess. As a part of the experiment, food wastage in the form of leftovers on the plates of subjects was measured. In the pre-test phase, the measurement occurred at two levels: subjects could see how much they were adding to the total waste by looking at a weighing scale placed under a waste basket, and they could also see the total waste data for each of the four meals, for the current day and the day before, displayed in a prominent place. After 105 days, the weighing scale under the basket was removed, and as a post-test measurement, the total waste data for the four meals were noted down for another 72 days. A manipulation test indicated that the experiment had the desired effect of invoking self-accountability in subjects during the pre-test phase and diluting it during the post-test phase. Time-series analysis indicated that wastage decreased during the pre-test phase, whereas the post-test data showed an increase over time. The results indicate that accountability conditions such as social norms invoke self-accountability cognition, leading to self-regulatory behaviours in individuals.
A remarkable theorem by Clifton, Bub and Halvorson (2003) (CBH) characterizes quantum theory in terms of information-theoretic principles. According to Bub (2004, 2005), the philosophical significance of the theorem is that quantum theory should be regarded as a "principle" theory about (quantum) information rather than a "constructive" theory about the dynamics of quantum systems. Here we criticize Bub's principle approach, arguing that if the mathematical formalism of quantum mechanics remains intact, then there is no escape route from solving the measurement problem by constructive theories. We further propose a (Wigner-type) thought experiment that, we argue, demonstrates that quantum mechanics on the information-theoretic approach is incomplete.
A recent attempt to compute a (recursion‐theoretic) noncomputable function using the quantum adiabatic algorithm is criticized and found wanting. Quantum algorithms may outperform classical algorithms in some cases, but so far they retain the classical (recursion‐theoretic) notion of computability. A speculation is then offered as to where the putative power of quantum computers may come from.
The authors use the theoretical notion of anomie to examine the impact of top management's control mechanisms on the environment of the marketing function. Based on a literature review and in-depth field interviews with marketing managers in diverse industries, a conceptual model is proposed that incorporates the two managerial control mechanisms, viz. output and process control, and relates their distinctive influence to anomie in the marketing function. Three contingency variables, i.e., resource scarcity, power, and ethics codification, are proposed to moderate the relationship between control mechanisms and anomie. The authors also argue for the link between anomic environments and the propensity of unethical marketing practices to occur. Theoretical and managerial implications of the proposed conceptual model are discussed.
Loop quantum gravity predicts that spatial geometry is fundamentally discrete. Whether this discreteness entails a departure from exact Lorentz symmetry is a matter of dispute that has generated an interesting methodological dilemma. On the one hand, one would like the theory to agree with current experiments, but so far tests at the highest energies we can reach show no sign of such a departure. On the other hand, one would like the theory to yield testable predictions, and deformations of exact Lorentz symmetry in certain yet-to-be-tested regimes may have phenomenological consequences. Here I discuss two arguments that exemplify this dilemma, expose their shortcomings, and compare them to other cases from the history of physics that share their symptoms.
I discuss a rarely mentioned correspondence between Einstein and Swann on the constructive approach to the special theory of relativity, in which Einstein points out that attempts to construct a dynamical explanation of relativistic kinematical effects require postulating a fundamental length scale at the level of the dynamics. I use this correspondence to shed light on several issues under dispute in current philosophy of spacetime that were highlighted recently in Harvey Brown's monograph Physical Relativity, namely, Einstein's view of the distinction between principle and constructive theories, and the consequences of pursuing the constructive approach in the context of spacetime theories.
In the wake of the current financial crises triggered by risky mortgage-backed securities, the question of ethics and risk-taking is once again front and center for both practitioners and academics. Although risk-taking is considered an integral part of strategic decision-making, firms can sometimes be propelled to take risks for reasons other than calculated strategic choices. The authors argue that a firm's risk-taking propensity is affected by its ethical climate (egoistic or benevolent) and its emphasis on output control to manage its marketing function. The firm's long-term orientation is argued to moderate the control–risk propensity relationship. The authors also extend research on risk and performance and argue that the association between risk-taking propensity and firm performance is contingent on the ownership structure (publicly traded versus privately held) of the firm. Based on survey data from a sample of manufacturing industries in the United States, the results show a significant impact of ethical climate and marketing output control on a firm's risk-taking propensity; risk-taking propensity also shows a stronger association with firm performance in privately held firms than in publicly traded firms.
The present study was motivated by the hypothesis that inputs from internal states are attenuated in obsessive–compulsive (OC) individuals, which could be one source of the pervasive doubting and checking in obsessive–compulsive disorder (OCD). Participants who were high or low in OC tendencies were asked to produce specific levels of muscle tension with and without biofeedback, and their accuracy in producing the required muscle-tension levels was assessed. As predicted, high-OC participants performed more poorly than low-OC participants on this task when biofeedback was not available. When biofeedback was provided, the difference between the groups was eliminated, and withdrawing the monitor again reversed this effect. Finally, when given the opportunity, high-OC participants were more likely than low-OC participants to request biofeedback. These results suggest that doubt in OCD may be grounded in a real and general deficiency in accessing internal states.
This study examines unethical purchasing practices from the perspective of buyer–supplier relationships. Based on a review of the inter-organizational literature and qualitative data from in-depth interviews with purchase managers from diverse industries, a conceptual framework is proposed, and theoretical arguments leading to propositions are presented. Taking into consideration the presence or absence of an explicit or implicit company policy sanctioning ethically questionable activities, unethical purchasing practices are conceptualized as a three-tiered set. Three broad themes emerge from the analysis toward explaining purchasing ethics from a buyer–seller perspective: inter-organizational power issues, inter-organizational relational issues, and interpersonal relational issues. Theoretical and managerial implications of the conceptual framework are discussed.