Parkinson's Disease (PD) is a long-term degenerative disorder of the central nervous system that mainly affects the motor system. Symptoms generally come on slowly over time; early in the disease, the most obvious are shaking, rigidity, slowness of movement, and difficulty with walking. The cause of PD is unknown, and early diagnosis is difficult. This paper presents an artificial neural network system with a backpropagation algorithm to help doctors identify PD. Previous research on predicting the presence of PD has reported accuracy rates of up to 93% [1]; however, prediction accuracy for small classes is reduced. The proposed design of the neural network system significantly increases robustness, and the network's recognition rate is shown to reach 100%.
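As a rough illustration of the kind of classifier described, the sketch below trains a small feed-forward network with backpropagation on synthetic two-class data. The architecture, learning rate, and data are assumptions made for illustration; they are not the paper's actual design or a clinical PD dataset.

```python
# Minimal backpropagation-trained neural classifier (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for two diagnostic classes (not real voice measurements).
X = np.vstack([rng.normal(-1.0, 0.5, (50, 4)), rng.normal(1.0, 0.5, (50, 4))])
y = np.array([0] * 50 + [1] * 50).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 units; small random initial weights.
W1 = rng.normal(0, 0.1, (4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.1, (8, 1)); b2 = np.zeros(1)

lr = 0.5
for epoch in range(2000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradient of mean squared error through the sigmoids.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);  b1 -= lr * d_h.mean(axis=0)

accuracy = ((out > 0.5).astype(int) == y).mean()
```

On cleanly separated synthetic clusters like these, training accuracy approaches 100%, mirroring the recognition rate the abstract reports on its own data.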
The 2008 financial crisis exposed the dark side of the financial sector in the UK. It brought attention to the contaminated culture of the business, which accommodated the systemic malpractices that largely contributed to the financial turmoil of 2008. In the wake of the crisis there seems to be a wide consensus that this contaminated culture can no longer be accepted and needs to change. This article examines the ills of the UK financial market, more specifically the cultural contamination problem, which was uncovered by the 2008 financial crisis, in order to explore its genesis and the suitable solutions for it. In this regard, the article analyses the ethical finance sector from theoretical and practical perspectives in order to assess its role in addressing the cultural contamination problem of the UK financial market.
This comprehensive text on African Mathematics addresses some of the problematic issues in the field, such as attitudes, curriculum development, educational change, academic achievement, standardized and other tests, performance factors, student characteristics, cross-cultural differences and studies, literacy, native speakers, social class and differences, equal education, teaching methods, and more.
This collection of specially commissioned essays is the first of its kind in English on the work of Antonio Negri, the Italian philosopher and political theorist. The spectacular success of Empire, Negri's collaboration with Michael Hardt, has brought Negri's writing to a new, wider audience. A substantial body of his writing is now available to an English-speaking readership. Outstanding contributors—including Michael Hardt, Sergio Bologna, Kathi Weeks and Nick Dyer-Witheford—reveal the variety and complexity of Negri's thought and explore its unique relevance to modern politics. Negri is one of the most sophisticated analysts of modern political philosophy. Philosophers and critics alike find his work both difficult and exhilarating, engaging as it does with Marx, Spinoza, Deleuze, Guattari, Tronti and others. This book is ideal for readers who want to get to grips with Negri's key themes, in particular his theories on labour, capital, power, the state and revolution. It makes a great introduction to his work for students of political philosophy, as well as providing a comprehensive critical approach for Negri enthusiasts.
The spectacular success of Empire and Multitude has brought Antonio Negri's writing to a new and wider audience. Negri's work is singular in its depth and expression. It can be difficult to grasp the complexity of his ideas, as they are rooted in the history of philosophy. This book offers an introduction to his thinking and is ideal for readers who want to come to grips with his key themes. Contributors include Pierre Macherey, Daniel Bensaïd, Charles Wolfe, Alex Callinicos, Miguel Vatter, Jason Read, Alberto Toscano, Mahmut Mutman, Ted Stolze, and Judith Revel. Written with dynamism and originality, the book will appeal to anyone interested in the evolution of Negri's thought, and especially to students of political philosophy, international studies, and literary theory. This book is the sequel to The Philosophy of Antonio Negri, Volume One: Resistance in Practice, but can be read entirely independently.
Some time ago, Joel Katzav and Brian Ellis debated the compatibility of dispositional essentialism with the principle of least action. Surprisingly, very little has been said on the matter since, even by the most naturalistically inclined metaphysicians. Here, we revisit the Katzav–Ellis arguments of 2004–05. We outline the two problems for the dispositionalist identified by Katzav in his 2004 paper, and claim they are not as problematic for the dispositional essentialist as it first seems – but not for the reasons espoused by Ellis.
The idea that democracy is under threat, after being largely dormant for at least 40 years, is looming increasingly large in public discourse. Complex systems theory offers a range of powerful new tools to analyse the stability of social institutions in general, and democracy in particular. What makes a democracy stable? And which processes potentially lead to instability of a democratic system? This paper offers a complex systems perspective on this question, informed by areas of the mathematical, natural, and social sciences. We explain the meaning of the term 'stability' in different disciplines and discuss how laws, rules, and regulations, but also norms, conventions, and expectations are decisive for the stability of a social institution such as democracy.
Political scientists have conventionally assumed that achieving democracy is a one-way ratchet. Only very recently has the question of ‘democratic backsliding’ attracted any research attention. We argue that democratic instability is best understood with tools from complexity science. The explanatory power of complexity science arises from several features of complex systems. Their relevance in the context of democracy is discussed. Several policy recommendations are offered to help stabilize current systems of representative democracy.
This paper attempts to analyze the performance of 57 member states of the Organization of Islamic Cooperation (OIC) based on selected indicators from several sectors, namely demography, economics, education, and technology and innovation. Specifically, it aims firstly at portraying an overview of OIC performance based on six selected indicators, followed by analyzing the relationship between selected development variables and literacy, and exploring the state of OIC performance as indicated by their achievement on the selected indicators. The study was undertaken vis-à-vis the prevailing theories on modernization and development, as well as the widely asserted underdevelopment of the contemporary Muslim ummah as claimed by numerous contemporary Muslim scholars. The data were solicited, among others, from the World Bank Database, the Statistical, Economic and Social Research and Training Centre for Islamic Countries of the OIC, and some previous studies. They were then analyzed using SPSS, with the results being generated mainly through the use of descriptive statistics. Among other things, the study found that there is a positive correlation between Muslim countries' urban population growth and literacy rate, that there is a steady decline in the number of OIC countries as they are categorized from lower to higher income placement, and that the percentage of scientific publications of all 57 OIC countries is far below that of any single developed nation.
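The correlation finding reported above can be illustrated with a minimal Pearson-correlation computation. The country figures below are invented placeholders, not the study's actual OIC data.

```python
# Hedged sketch: Pearson correlation between urban population growth and
# literacy rate, of the kind the descriptive analysis reports.
import statistics

urban_growth = [1.2, 2.5, 3.1, 0.8, 2.0, 3.6, 1.5, 2.8]   # % per year (assumed)
literacy     = [62,  74,  81,  55,  70,  85,  66,  78]     # % (assumed)

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson_r(urban_growth, literacy)
```

A value of r near +1 on such data corresponds to the positive urban-growth/literacy association the study describes; the real analysis was run in SPSS over the full 57-country dataset.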
Background: In this study, medical errors are defined as unintentional patient harm caused by a doctor’s mistake. Due to limited research, this topic is poorly understood in Malaysia. The objective of this study was to determine the proportion of doctors intending to disclose medical errors, and their attitudes/perceptions pertaining to medical errors. Methods: This cross-sectional study was conducted at a tertiary public hospital from July to December 2015 among 276 randomly selected doctors. Data were collected using a standardized and validated self-administered questionnaire intended to measure disclosure and attitudes/perceptions. The scale had four vignettes in total: two medical and two surgical. Each vignette consisted of five questions, and each question measured disclosure, categorised as “No Disclosure”, “Partial Disclosure” or “Full Disclosure”. Data were keyed in and analysed using STATA v13.0. Results: Only 10.1% intended to disclose medical errors. Most respondents felt that they possessed an attitude/perception of adequately disclosing errors to patients. There was a statistically significant difference between the intention to disclose and perceived disclosures. Most respondents agreed that disclosing an error would make them less likely to get sued, that minor errors should be reported, and that they experienced relief from disclosing errors. Conclusion: Most doctors in this study would not disclose medical errors, although they perceived that the errors were serious and felt responsible for them. Poor disclosure could be due to the fear of litigation and improper mechanisms/procedures for disclosure.
Many philosophers have claimed that Bayesianism can provide a simple justification for hypothetico-deductive inference, long regarded as a cornerstone of the scientific method. Following up a remark of van Fraassen, we analyze a problem for the putative Bayesian justification of H-D inference in the case where what we learn from observation is logically stronger than what our theory implies. Firstly, we demonstrate that in such cases the simple Bayesian justification does not necessarily apply. Secondly, we identify a set of sufficient conditions for the mismatch in logical strength to be justifiably ignored as a "harmless idealization". Thirdly, we argue, based upon scientific examples, that the pattern of H-D inference of which there is a ready Bayesian justification is only rarely the pattern that one actually finds at work in science. Whatever the other virtues of Bayesianism, the idea that it yields a simple justification of a pervasive pattern of scientific inference appears to have been oversold.
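The mismatch the authors analyze can be made concrete with a toy probability space (our own construction, not the paper's): H entails its prediction E, so conditioning on E confirms H, yet conditioning on a logically stronger observation E′ disconfirms it.

```python
# Toy illustration: H entails E, but what is observed (Eprime) is a proper
# subset of E, and conditioning on it lowers the probability of H.
from fractions import Fraction as F

# Four possible worlds with prior probabilities summing to 1 (invented).
prior = {"w1": F(3, 10), "w2": F(1, 10), "w3": F(2, 10), "w4": F(4, 10)}

H      = {"w1", "w2"}          # the hypothesis
E      = {"w1", "w2", "w3"}    # H's prediction: H is a subset of E, so H entails E
Eprime = {"w2", "w3"}          # what is actually observed: logically stronger than E

def prob(event):
    return sum(prior[w] for w in event)

def cond(a, b):
    """P(A | B) by Bayesian conditioning."""
    return prob(a & b) / prob(b)

p_h              = prob(H)          # prior probability of H
p_h_given_e      = cond(H, E)       # confirmation, as the H-D picture promises
p_h_given_eprime = cond(H, Eprime)  # disconfirmation, despite H entailing E
```

Here P(H) = 2/5, P(H|E) = 2/3, but P(H|E′) = 1/3: the simple Bayesian justification of H-D inference applies to E, not to the strictly stronger E′ one actually learns.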
Phenomenology, as I understand it, by G. C. Deb.--Phenomenology: education and evaluation, by Kazi A. Kadir.--Phenomenology, by B. H. Siddiqi.--Phenomenology in perspective, by James L. Kinder.
Philosophy of history; the idea of the not-being and the history, by K. M. Jamil.--Philosophy of history, by Khwaja Ashkar Husain.--Philosophy of history, by A. H. Kamali.--Philosophy of history, by B. H. Siddiqi.--Philosophy of history: explanation in history, by Kazi A. Kadir.
Forthcoming in Cognitive Architecture: from bio-politics to noo-politics, eds. Deborah Hauptmann, Warren Neidich and Abdul-Karim Mustapha. INTRODUCTION The cognitive and affective sciences have benefitted in the last twenty years from a rethinking of the long-dominant computer model of the mind espoused by the standard approaches of computationalism and connectionism. The development of this alternative, often named the “embodied mind” approach or the “4EA” approach (embodied, embedded, enactive, extended, affective), has relied on a trio of classical 20th-century phenomenologists for its philosophical framework: Husserl, Heidegger, and Merleau-Ponty.[1] In this essay I propose that the French thinker Gilles Deleuze can provide the conceptual framework that will enable us to thematize some unstated presuppositions of the 4EA school, as well as to sharpen, extend, and/or radicalize some of their explicit presuppositions. I highlight three areas here: 1) an ontology of distributed and differential systems, using Deleuze’s notion of the virtual; 2) a thought of multiple subjectification practices rather than a thought of “the” subject, even if it be seen as embodied and embedded; and 3) a rethinking of the notion of affect in order to thematize a notion of “political affect.”[2] I will develop this proposal with reference to Bruce Wexler’s Brain and Culture,[3] a work which resonates superbly with the Deleuzean approach.
Numerous studies have examined the manner in which minority groups, including refugees, are depicted in the media discourse of the host countries or the dominant majority groups. The results of such studies indicate that media systematically discriminate against these minority groups and deem them a security, economic, and hygiene threat to the majority groups. Through the use of Lakoff and Johnson’s conceptual metaphor theory, this study compares and contrasts the representation of Syrian refugees in the online media discourse of not only host countries but also non-host countries, which, in this study, refers to nations that do not host Syrian refugees. The results show that statistical differences between the metaphors used by host and non-host countries only occur for the metaphors that describe the entry of refugees and the burden they are inflicting on the host countries. This is clearly indicated by the p-values of the log-likelihood test.
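The log-likelihood statistic mentioned above is standardly computed, in corpus linguistics, from a metaphor's frequency in two corpora and the corpora's sizes. The sketch below uses the common Dunning/Rayson two-corpus formula; the token counts are invented placeholders, not the study's actual metaphor frequencies.

```python
# Log-likelihood (G2) comparison of a metaphor's frequency across two corpora.
import math

def log_likelihood(freq1, size1, freq2, size2):
    """Dunning/Rayson log-likelihood for an item's frequency in two corpora."""
    total = freq1 + freq2
    e1 = size1 * total / (size1 + size2)   # expected count in corpus 1
    e2 = size2 * total / (size1 + size2)   # expected count in corpus 2
    ll = 0.0
    for observed, expected in ((freq1, e1), (freq2, e2)):
        if observed > 0:
            ll += observed * math.log(observed / expected)
    return 2 * ll

# Hypothetical counts: an entry metaphor ("flood of refugees") in host vs
# non-host media corpora of 100,000 tokens each.
g2 = log_likelihood(85, 100_000, 40, 100_000)
significant = g2 > 3.84   # chi-square critical value, df = 1, p < 0.05
```

A G2 above 3.84 corresponds to p < 0.05, the kind of threshold against which the study's host/non-host differences were judged.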
Econophysics is a new and exciting cross-disciplinary research field that applies models and modelling techniques from statistical physics to economic systems. It is not, however, without its critics: prominent figures in more mainstream economic theory have criticized some elements of the methodology of econophysics. One of the main lines of criticism concerns the nature of the modelling assumptions and idealizations involved, and a particular target are ‘kinetic exchange’ approaches used to model the emergence of inequality within the distribution of individual monetary income. This article will consider such models in detail, and assess the warrant of the criticisms drawing upon the philosophical literature on modelling and idealization. Our aim is to provide the first steps towards informed mediation of this important and interesting interdisciplinary debate, and our hope is to offer guidance with regard to both the practice of modelling inequality, and the inequality of modelling practice. 1 Introduction; 1.1 Econophysics and its discontents; 1.2 Against burglar economics; 2 Modelling Inequality; 2.1 Mainstream economic models for income distribution; 2.2 Econophysics models for income distribution; 3 Idealizations in Kinetic Exchange Models; 3.1 Binary interactions; 3.2 Conservation principles; 3.3 Exchange dynamics; 4 Fat Tails and Savings; 5 Evaluation.
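A minimal kinetic exchange simulation of the general kind discussed (a hedged sketch with invented parameters, not a specific model from the article): repeated money-conserving binary exchanges drive an initially equal distribution toward an exponential (Boltzmann-Gibbs) one, whose Gini coefficient is about 0.5.

```python
# Random-pooling kinetic exchange: pairs of agents pool their money and split
# it at random; total money is conserved in every interaction.
import random

random.seed(1)
N, STEPS = 1000, 200_000
money = [1.0] * N                     # everyone starts equal

for _ in range(STEPS):
    i, j = random.randrange(N), random.randrange(N)
    if i == j:
        continue
    pool = money[i] + money[j]        # binary interaction conserves money
    share = random.random()
    money[i], money[j] = share * pool, (1 - share) * pool

total = sum(money)

def gini(ms):
    """Gini coefficient: 0 = perfect equality; ~0.5 for an exponential law."""
    ms = sorted(ms)
    n = len(ms)
    cum = sum((k + 1) * m for k, m in enumerate(ms))
    return 2 * cum / (n * sum(ms)) - (n + 1) / n

g = gini(money)
```

The conservation principle and the binary-interaction idealization sketched here are exactly the modelling assumptions the article's Sections 3.1 and 3.2 scrutinize.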
Environmental sustainability is one of the contemporary discourses that has abundant values embedded in the Quran and Sunnah teachings. Islam gives great emphasis to the environment, as it is preserved and protected under the Maqasid al-Shariah. The general outlook of the Quranic paradigm on utilizing the natural environment is based on the prohibition of aggression and misuse. It is likewise founded on construction and sustainable use. Thus, this article attempts to elaborate key concepts of the Quran and Sunnah teachings that reveal imperative values for environmental sustainability. The research method employed in this paper is an analytical study of Quranic verses with special highlights of tafsir bi al-ma’thur, tafsir bi al-ra’yi, and historical narrative. In short, this paper brings to light the relevance of classical and contemporary works of Quran and Sunnah studies that have meticulous values for shaping a better world of human–nature interaction.
Symplectic reduction is a formal process through which degeneracy within the mathematical representations of physical systems displaying gauge symmetry can be controlled via the construction of a reduced phase space. Typically such reduced spaces provide us with a formalism for representing both instantaneous states and evolution uniquely, and for this reason can be justifiably afforded the status of fundamental dynamical arena, the otiose structure having been eliminated from the original phase space. Essential to the application of symplectic reduction is the precept that the first class constraints are the relevant gauge generators. This prescription becomes highly problematic for reparameterization invariant theories within which the Hamiltonian itself is a constraint, not least because it would seem to render prima facie distinct stages of a history physically identical and observable functions changeless. Here we will consider this problem of time within non-relativistic mechanical theory with a view to both more fully understanding the temporal structure of these timeless theories and better appreciating the corresponding issues in relativistic mechanics. For the case of nonrelativistic reparameterization invariant theory, application of symplectic reduction will be demonstrated to be both unnecessary, since the degeneracy involved is benign, and inappropriate, since it leads to a trivial theory. With this anti-reductive position established we will then examine two rival methodologies for consistently representing change and observable functions within the original phase space before evaluating the relevant philosophical implications. We will conclude with a preview of the case against symplectic reduction being applied to canonical general relativity.
Starting from a generalized Hamilton-Jacobi formalism, we develop a new framework for constructing observables and their evolution in theories invariant under global time reparametrizations. Our proposal relaxes the usual Dirac prescription for the observables of a totally constrained system and allows one to recover the influential partial and complete observables approach in a particular limit. Difficulties such as the non-unitary evolution of the complete observables in terms of certain partial observables are explained as a breakdown of this limit. Identification of our observables relies upon a physical distinction between gauge symmetries that exist at the level of histories and states, and those that exist at the level of histories and not states. This distinction resolves a tension in the literature concerning the physical interpretation of the partial observables and allows for a richer class of observables in the quantum theory. There is the potential for the application of our proposal to the quantization of gravity when understood in terms of the Shape Dynamics formalism.
In 1981 Unruh proposed that fluid mechanical experiments could be used to probe key aspects of the quantum phenomenology of black holes. In particular, he claimed that an analogue to Hawking radiation could be created within a fluid mechanical `dumb hole', with the event horizon replaced by a sonic horizon. Since then an entire sub-field of `analogue gravity' has been created. In 2016 Steinhauer reported the experimental observation of quantum Hawking radiation and its entanglement in a Bose-Einstein condensate analogue black hole. What can we learn from such analogue experiments? In particular, in what sense can they provide evidence of novel phenomena such as black hole Hawking radiation?
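Unruh's analogy rests on the fact that sound in a flowing fluid obeys the same wave equation as a scalar field in a curved spacetime. The standard textbook statement of the resulting prediction (quoted here for orientation, not taken from this paper) is an analogue Hawking temperature set by the flow-velocity gradient at the sonic horizon, the surface where the flow speed v crosses the sound speed c_s:

```latex
% Analogue Hawking temperature at a sonic horizon (Unruh 1981 form):
T_H \;=\; \frac{\hbar}{2\pi k_B}
          \left|\frac{\partial}{\partial z}\bigl(v - c_s\bigr)\right|_{\text{horizon}}
```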
Background: Obtaining informed consent for participation in genomic research in low-income settings presents specific ethical issues requiring attention. These include the challenges that arise when providing information about unfamiliar and technical research methods, the implications of complicated infrastructure and data sharing requirements, and the potential consequences of future research with samples and data. This study investigated researchers’ and participants’ parents’ experiences of a consent process and understandings of a genome-wide association study of malaria involving children aged five and under in Mali. It aimed to inform best practices in recruiting participants into genomic research. Methods: A qualitative rapid ethical assessment was undertaken. Fifty-five semi-structured interviews were conducted with the parents of research participants. An additional nine semi-structured interviews were conducted with senior research scientists, research assistants and with a member of an ethics committee. A focus group with five parents of research participants and direct observations of four consent processes were also conducted. French and translated English transcripts were descriptively and thematically coded using OpenCode software. Results: Participants’ parents in the MalariaGEN study had differing understandings of the causes of malaria, the rationale for collecting blood samples, the purposes of the study and the kinds of information the study would generate. Genomic aspects of the research, including the gene/environment interaction underlying susceptibility or resistance to severe malaria, proved particularly challenging to explain and understand. Conclusions: This study identifies a number of areas to be addressed in the design of consent processes for genomic research, some of which require careful ethical analysis.
These include determining how much information should be provided about differing aspects of the research and how best to promote understandings of genomic research. We conclude that it is important to build capacity in the design and conduct of effective and appropriate consent processes for genomic research in low and middle-income settings. Additionally, consideration should be given to the role of review committees and community consultation activities in protecting the interests of participants in genomic research.
The analysis of the temporal structure of canonical general relativity and the connected interpretational questions with regard to the role of time within the theory both rest upon the need to respect the fundamentally dual role of the Hamiltonian constraints found within the formalism. Any consistent philosophical approach towards the theory must do justice to the role of these constraints in both generating dynamics, in the context of phase space, and generating unphysical symmetry transformations, in the context of a hypersurface embedded within a solution. A first denial of time in the terms of a position of reductive temporal relationalism can be shown to be troubled by failure on the first count, and a second denial in the terms of Machian temporal relationalism can be found to be hampered by failure on the second. A third denial of time, consistent with both of the Hamiltonian constraints' roles, is constituted by the implementation of a scheme for constructing observables in terms of correlations and leads to a radical Parmenidean timelessness. The motivation for and implications of each of these three denials are investigated.
We claim that, as it stands, the Deutsch–Wallace–Everett approach to quantum theory is conceptually incoherent. This charge is based upon the approach’s reliance upon decoherence arguments that conflict with its own fundamental precepts regarding probabilistic reasoning in two respects. This conceptual conflict obtains even if the decoherence arguments deployed are aimed merely towards the establishment of certain ‘emergent’ or ‘robust’ structures within the wave function: To be relevant to physical science notions such as robustness must be empirically grounded, and, on our analysis, this grounding can only plausibly be done in precisely the probabilistic terms that lead to conceptual conflict. Thus, the incoherence problems presented necessitate either the provision of a new, non-probabilistic empirical grounding for the notions of robustness and emergence in the context of decoherence, or the abandonment of the Deutsch–Wallace–Everett programme for quantum theory.
Despite all its merits, the vast majority of critical attention devoted to colonialist literature restricts itself by severely bracketing the political context of culture and history. This typical facet of humanistic closure requires the critic systematically to avoid an analysis of the domination, manipulation, exploitation, and disfranchisement that are inevitably involved in the construction of any cultural artifact or relationship. I can best illustrate such closures in the field of colonialist discourse with two brief examples. In her book The Colonial Encounter, which contrasts the colonial representations of three European and three non-European writers, M. M. Mahood skirts the political issue quite explicitly by arguing that she chose those authors precisely because they are “innocent of emotional exploitation of the colonial scene” and are “distanced” from the politics of domination.1 We find a more interesting example of this closure in Homi Bhabha’s criticism. While otherwise provocative and illuminating, his work rests on two assumptions—the unity of the “colonial subject” and the “ambivalence” of colonial discourse—that are inadequately problematized and, I feel, finally unwarranted and unacceptable. In rejecting Edward Said’s “suggestion that colonial power and discourse is possessed entirely by the colonizer,” Bhabha asserts, without providing any explanation, the unity of the “colonial subject.”2 I do not wish to rule out, a priori, the possibility that at some rarefied theoretical level the varied material and discursive antagonisms between conquerors and natives can be reduced to the workings of a single “subject”; but such a unity, let alone its value, must be demonstrated, not assumed. Though he cites Frantz Fanon, Bhabha completely ignores Fanon’s definition of the conqueror/native relation as a “Manichean” struggle—a definition that is not a fanciful metaphoric caricature but an accurate representation of a profound conflict.3 1. M. M.
Mahood, The Colonial Encounter: A Reading of Six Novels, pp. 170, 171; and see p. 3. As many other studies demonstrate, the emotional innocence and the distance of the six writers whom Mahood has chosen—Joseph Conrad, E. M. Forster, Graham Greene, Chinua Achebe, R. K. Narayan, and V. S. Naipaul—are, at best, highly debatable. 2. Homi K. Bhabha, “The Other Question—The Stereotype and Colonial Discourse,” Screen 24: 25, 19. 3. Frantz Fanon, The Wretched of the Earth, trans. Constance Farrington, p. 41. Abdul R. JanMohamed, assistant professor of English at the University of California, Berkeley, is the author of Manichean Aesthetics: The Politics of Literature in Colonial Africa. He is a founding member and associate editor of Cultural Critique and is currently working on a study of Richard Wright.
The goal of responsible engineers is the creation of useful and safe technological products and a commitment to public health, while respecting the autonomy of clients and the public. Because engineers often face moral dilemmas in resolving such issues, different engineers have chosen different courses of action depending on their respective moral value orientations. Islam provides a value-based mechanism rooted in the Maqasid al-Shari‘ah (the objectives of Islamic law). This mechanism prioritizes some values over others and could help resolve the moral dilemmas faced in engineering. This paper applies Islamic interpretive-evaluative maxims to two core issues in engineering ethics: genetically modified foods and whistleblowing. The study aims primarily to provide problem-solving maxims within the Maqasid al-Shari‘ah matrix through which such moral dilemmas in science and engineering could be studied and resolved.
The professions have focused considerable attention on developing codes of conduct. Despite their efforts, there is considerable controversy regarding the propriety of professional codes of ethics. Many provisions of professional codes seem to exacerbate disputes between the profession and the public rather than providing a framework that satisfies the public's desire for moral behavior. After examining three professional codes, we divide the provisions of professional codes into those which urge professionals to avoid moral hazard, those which maintain professional courtesy, and those which serve the public interest. We note that whereas provisions urging the avoidance of moral hazard are uncontroversial, the public is suspicious of provisions protecting professional courtesy. Public interest provisions are controversial when the public and the profession disagree as to what is in the public interest. Based on these observations, we conclude with recommendations regarding the content of professional codes.