Is conceptual analysis required for reductive explanation? If there is no a priori entailment from microphysical truths to phenomenal truths, does reductive explanation of the phenomenal fail? We say yes. Ned Block and Robert Stalnaker say no.
The explanatory gap. Consciousness is a mystery. No one has ever given an account, even a highly speculative, hypothetical, and incomplete account, of how a physical thing could have phenomenal states. Suppose that consciousness is identical to a property of the brain, say activity in the pyramidal cells of layer 5 of the cortex involving reverberatory circuits from cortical layer 6 to the thalamus and back to layers 4 and 6, as Crick and Koch have suggested for visual consciousness. Still, that identity itself calls out for explanation! Proponents of an explanatory gap disagree about whether the gap is permanent. Some say that we are like the scientifically naive person who is told that matter = energy but does not have the concepts required to make sense of the idea. If we can acquire these concepts, the gap is closable. Others say the gap is unclosable because of our cognitive limitations. Still others say that the gap is a consequence of the fundamental nature of consciousness.
This book shows through argument and numerous policy-related examples how understanding moral philosophy can improve economic analysis, how moral philosophy can benefit from economists' analytical tools, and how economic analysis and moral philosophy together can inform public policy. Part I explores the idea of rationality and its connections to ethics, arguing that when they defend their formal model of rationality, most economists implicitly espouse contestable moral principles. Part II addresses the nature and measurement of welfare, utilitarianism, and cost-benefit analysis. Part III discusses freedom, rights, equality, and justice - moral notions that are relevant to evaluating policies but which have played little if any role in conventional welfare economics. Finally, Part IV explores work in social choice theory and game theory that is relevant to moral decision making. Each chapter includes recommended reading and discussion questions.
Despite their success in describing and predicting cognitive behavior, the plausibility of so-called ‘rational explanations’ is often contested on the grounds of computational intractability. Several cognitive scientists have argued that such intractability is an orthogonal pseudoproblem, however, since rational explanations account for the ‘why’ of cognition but are agnostic about the ‘how’. Their central premise is that humans do not actually perform the rational calculations posited by their models, but only act as if they do. Whether or not the problem of intractability is solved by recourse to ‘as if’ explanations critically depends, inter alia, on the semantics of the ‘as if’ connective. We examine the five most sensible explications in the literature, and conclude that none of them circumvents the problem. As a result, rational ‘as if’ explanations must obey the minimal computational constraint of tractability.
I apply Kooi and Tamminga's (2012) idea of correspondence analysis for many-valued logics to strong three-valued logic (K3). First, I characterize each possible single entry in the truth-table of a unary or a binary truth-functional operator that could be added to K3 by a basic inference scheme. Second, I define a class of natural deduction systems on the basis of these characterizing basic inference schemes and a natural deduction system for K3. Third, I show that each of the resulting natural deduction systems is sound and complete with respect to its particular semantics. Among other things, I thus obtain a new proof system for Łukasiewicz's three-valued logic.
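To fix ideas, the strong Kleene truth tables that underlie K3 can be written down directly. The sketch below (in Python, purely illustrative; it does not reproduce the paper's characterizing inference schemes or deduction systems) encodes the standard K3 operators with 1 for true, 0.5 for undefined, and 0 for false, where only 1 is designated.

def k3_neg(a):
    return 1 - a          # negation: swaps true and false, leaves 0.5 fixed

def k3_and(a, b):
    return min(a, b)      # conjunction: minimum under the order 0 < 0.5 < 1

def k3_or(a, b):
    return max(a, b)      # disjunction: maximum under the same order

def k3_cond(a, b):
    return max(1 - a, b)  # the standard K3 conditional, i.e. not-a or b

for a in (0, 0.5, 1):
    for b in (0, 0.5, 1):
        print(a, b, k3_and(a, b), k3_or(a, b), k3_cond(a, b))

Note that under these tables no formula takes the designated value 1 on every assignment, which is why K3 has no tautologies even though its consequence relation is non-trivial.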
The paper reacts against the strict separation between dialectical and rhetorical approaches to argumentation and argues that argumentative discourse can be analyzed and evaluated more adequately if the two are systematically combined. Such an integrated approach makes it possible to show how the opportunities available in each of the dialectical stages of a critical discussion have been used strategically to further the rhetorical aims of the speaker or writer. The approach is illustrated with the help of an analysis of an ‘advertorial’ published by R. J. Reynolds Tobacco Company.
Gettier presented the now famous Gettier problem as a challenge to epistemology. The methods Gettier used to construct his challenge, however, utilized certain principles of formal logic that are actually inappropriate for the natural language discourse of the Gettier cases. In that challenge to epistemology, Gettier also makes truth claims that would be considered controversial in analytic philosophy of language. The Gettier challenge has escaped scrutiny in these other relevant academic disciplines, however, because of its façade as an epistemological analysis. This article examines Gettier's methods with the analytical tools of logic and analytic philosophy of language.
In this work, a novel 6D four-wing hyperchaotic system with a line equilibrium based on a flux-controlled memristor model is proposed. The novel system is inspired by an existing 5D four-wing hyperchaotic system introduced by Zarei. Fundamental properties of the novel system are discussed, and its complex behaviors are characterized using phase portraits, the Lyapunov exponent spectrum, bifurcation diagrams, and spectral entropy. When a suitable set of parameters is chosen, the system exhibits a rich repertoire of dynamic behaviors, including double-period bifurcation of the quasiperiod, single two-wing chaotic attractors, and four-wing chaotic attractors. Further analysis of the novel system shows that multiple coexisting attractors can be observed with different system parameter values and initial values. Moreover, the feasibility of the proposed mathematical model is also demonstrated by using Multisim simulations based on an electronic analog of the model. Finally, the active control method is used to design an appropriate controller to realize synchronization between the proposed 6D memristive hyperchaotic system and the 6D hyperchaotic Yang system, which have different structures. The Routh–Hurwitz criterion is used to prove the rationality of the controller, and the feasibility and effectiveness of the proposed synchronization method are demonstrated by numerical simulations.
By introducing a flux-controlled memristor with quadratic nonlinearity into a 4D hyperchaotic system as a feedback term, a novel 5D hyperchaotic four-wing memristive system (HFWMS) is derived in this paper. The HFWMS, with multiline equilibrium and three positive Lyapunov exponents, presents very complex dynamic characteristics, such as the existence of chaos, hyperchaos, limit cycles, and periodic orbits. The dynamic characteristics of the HFWMS are analyzed by using equilibria, phase portraits, Poincaré maps, the Lyapunov exponent spectrum, bifurcation diagrams, and spectral entropy. Of particular interest is that this novel system can generate a two-wing hyperchaotic attractor under appropriate parameters and initial conditions. Moreover, an FPGA realization of the novel 5D HFWMS is reported, which confirms that the system has complex dynamic behavior. Finally, synchronization of the 5D hyperchaotic system with a differently structured system by active control and a secure signal-masking application of the HFWMS are implemented based on numerical simulations and FPGA. This research demonstrates that the hardware-based design of the 5D HFWMS can be applied to various chaos-based embedded system applications, including random number generation, cryptography, and secure communication.
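Both of these abstracts lean on Lyapunov exponents as the diagnostic for chaos and hyperchaos. Since neither abstract states the memristive systems' equations, the sketch below uses the classic Lorenz system as a stand-in to show, under that assumption, how a largest Lyapunov exponent is typically estimated numerically from the divergence of two nearby trajectories (the Benettin renormalization method); a positive value indicates chaos, and hyperchaos requires at least two positive exponents.

import numpy as np

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Stand-in vector field; the papers' memristive systems are not reproduced here.
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(s, dt, f):
    # One fourth-order Runge-Kutta integration step.
    k1 = f(s)
    k2 = f(s + 0.5 * dt * k1)
    k3 = f(s + 0.5 * dt * k2)
    k4 = f(s + dt * k3)
    return s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def largest_lyapunov(x0, dt=0.01, steps=20000, d0=1e-8):
    a = np.array(x0, dtype=float)
    b = a + np.array([d0, 0.0, 0.0])   # perturbed companion trajectory
    total = 0.0
    for _ in range(steps):
        a = rk4_step(a, dt, lorenz)
        b = rk4_step(b, dt, lorenz)
        d = np.linalg.norm(b - a)
        total += np.log(d / d0)
        b = a + (b - a) * (d0 / d)     # renormalize the separation back to d0
    return total / (steps * dt)

print(largest_lyapunov([1.0, 1.0, 1.0]))  # roughly 0.9 for these Lorenz parameters

A full exponent spectrum of the kind reported in the abstracts would track a whole set of perturbation vectors and re-orthonormalize them at each renormalization step rather than following a single pair.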
A systematic comparative analysis of shale reservoir characteristics of the Wufeng-Longmaxi and Niutitang Formations in the southeastern Sichuan Basin and its neighboring areas was conducted with respect to mineralogy, organic geochemistry, pore structure, methane sorption, brittleness, and fractures. Results indicate that organic matter (OM)-hosted pores that are hundreds of nanometers to micrometers in size in the Longmaxi shale are well developed in migrated OM rather than in the in situ OM, and they are the dominant reservoir spaces. Furthermore, the total organic carbon (TOC), brittleness, organic pores, and bedding fractures have good synergistic development relationships. However, there are fewer OM-hosted pores in the Niutitang shale; they are smaller in size, usually less than 30 nm, and have a more complicated pore structure. The intergranular pores in cataclastic organic-inorganic mineral fragments are the dominant reservoir spaces in the Niutitang shale and are coupled with stronger methane sorption and desorption capacities. The piecewise correlation between TOC and brittleness indicates significant differences in pore and fracture characteristics. When the TOC [Formula: see text], the TOC, brittleness, organic/inorganic pores, and fractures develop synergistically; when the TOC [Formula: see text], even though the increase in ductility reduces the number of fractures, the lower cohesive strength, internal friction angle, and weaker surfaces of interlayer fractures and cataclastic minerals promote the development of slip fractures, which significantly improves the fracture effectiveness and reservoir spaces for free and adsorbed shale gas. The Longmaxi, Wufeng, and Niutitang shales formed and evolved in different evolutionary stages. With the evolution of hydrocarbon generation, diagenesis, tectonic deformation, and pressure, the size and proportion of OM-hosted pores gradually decrease. At the same time, the complexity of the pore-fracture structure, the methane adsorption/desorption capacity, and the proportion of inorganic pores and fractures increase.
The sceptical positions philosophers have adopted with respect to neuroimaging data are based on detailed evaluations of subtraction, which is one of many data analysis techniques used with neuroimaging data. These positions are undermined when the epistemic implications of the use of a diversity of data analysis techniques are taken into account. I argue that different data analysis techniques reveal different patterns in the data. Through the use of multiple data analysis techniques, researchers can produce results that are locally robust. Thus, the epistemology of neuroimaging must take into consideration the details of the different data analysis techniques that are used to evaluate neuroimaging data, and the specific theoretical aims towards which those techniques are deployed. 1 Introduction; 2 Scepticism about Neuroimaging; 3 Data Analysis and Evidence; 4 Deconvolution and Pattern Classification Analysis; 4.1 Deconvolution analysis; 4.2 Region of interest selection; 4.3 Pattern classification analysis; 5 The Strength of Multiple Analyses; 6 Conclusion.
The analysis of video-recorded interaction consists of various professionalized ways of seeing participant behavior through multimodal, co-operative, or intercorporeal lenses. While these perspectives are often adopted simultaneously, each creates a different view of the human body and interaction. Moreover, microanalysis is often produced through local practices of sense-making that involve the researchers’ bodies. Previous research has not fully elaborated how adopting these different ways of seeing human behavior influences both what is seen from a video and how it is seen, as well as the way the interpretation of the data ultimately unfolds in the interaction between researchers. In this article, we provide a theoretical-methodological discussion of the microanalytic research process. We explore how “seeing” affect in interaction as co-operative and multimodal action differs from seeing it as an intercorporeal experience. First, we introduce the multimodal conversation analytic, co-operative, and intercorporeal approaches to microanalysis. Second, we apply and compare these practices in analyzing a video-recorded interaction of a romantic couple. Furthermore, we examine a video-recorded episode of us, the researchers, reflecting on our analytic observations about this interaction. We suggest that adopting a multimodal and co-operative perspective constructs affect as co-produced and displayed through observable action, while an intercorporeal perspective produces affect as an embodied and experienced phenomenon. While the former enables locating affect in a specific moment and in identifiable body parts, the latter facilitates recognizing the experienced side of affect. These different modes of professional vision complement one another in capturing affect in interaction, while both are ultimately put to use in local interactions between the researchers.
Network analysis is a fundamental approach to the study of social structure. This chapter traces its development, distinguishing characteristics, and analytic principles. It emphasizes the intellectual unity of three research traditions: the anthropological concept of the social network, the sociological conception of social structure as social network, and structural explanations of political processes. Network analysts criticize the normative, categorical, dyadic, and bounded-group emphases prevalent in many sociological analyses. They claim that the most direct way to study a social system is to analyze the pattern of ties linking its members. By analyzing complex hierarchical structures of asymmetric ties, they study power, stratification, and structural change.
When scientists seek further confirmation of their results, they often attempt to duplicate the results using diverse means. To the extent that they are successful in doing so, their results are said to be robust. This paper investigates the logic of such "robustness analysis" [RA]. The most important and challenging question an account of RA can answer is what sense of evidential diversity is involved in RAs. I argue that prevailing formal explications of such diversity are unsatisfactory. I propose a unified, explanatory account of diversity in RAs. The resulting account is, I argue, truer to actual cases of RA in science; moreover, this account affords us a helpful new foothold on the logic undergirding RAs.
The article aims to analyze the semantic fields of two Korean terms from specialized judicial terminology, i.e. court and tribunal, which are usually both rendered in English by the single hypernym court. This analysis, although carried out on limited Korean data, is intended to indicate the differences between the uses of these two Korean terms and to indicate the reasons why court is currently the most common English equivalent. At the same time, the author, by pointing to the historical and cultural background, explains the reasons why the term court is not always correct. The methods used in compiling the data highlight differences in the semantics of the Korean terms covered by the English hypernym court.
Epistemic Analysis, as I conceive of it, is concerned with the analysis of knowledge. The precincts of my concern have, however, been determined by the ...
Many philosophical naturalists eschew analysis in favor of discovering metaphysical truths a posteriori, contending that analysis does not lead to philosophical insight. A countercurrent to this approach seeks to reconcile a certain account of conceptual analysis with philosophical naturalism; prominent and influential proponents of this methodology include the late David Lewis, Frank Jackson, Michael Smith, Philip Pettit, and David Armstrong. Naturalistic analysis is a tool for locating in the scientifically given world the objects and properties we quantify over in everyday discourse. This collection gathers work from a range of prominent philosophers who are working within this tradition, offering important new work as well as critical evaluations of the methodology. Its centerpiece is an important posthumous paper by David Lewis, "Ramseyan Humility," published here for the first time. The contributors first address issues of philosophy of mind, semantics, and the new methodology's a priori character, then turn to matters of metaphysics, and finally consider problems regarding normativity. Conceptual Analysis and Philosophical Naturalism is one of the first efforts to apply this approach to such a wide range of philosophical issues. Contributors: David Braddon-Mitchell, Mark Colyvan, Frank Jackson, Justine Kingsbury, Fred Kroon, David Lewis, Dustin Locke, Kelby Mason, Jonathan McKeown-Green, Peter Menzies, Robert Nola, Daniel Nolan, Philip Pettit, Huw Price, Denis Robinson, Steve Stich, Daniel Stoljar.
The dangers of character reification for cladistic inference are explored. The identification and analysis of characters always involves theory-laden abstraction—there is no theory-free "view from nowhere." Given theory-ladenness, and given a real world with actual objects and processes, how can we separate robustly real biological characters from uncritically reified characters? One way to avoid reification is through the employment of objectivity criteria that give us good methods for identifying robust primary homology statements. I identify six such criteria and explore each with examples. Ultimately, it is important to minimize character reification, because poor character analysis leads to dismal cladograms, even when proper phylogenetic analysis is employed. Given the deep and systemic problems associated with character reification, it is ironic that philosophers have focused almost entirely on phylogenetic analysis and neglected character analysis.
Philosophical conceptual analysis is an experimental method. Focusing on this helps to defend it against the skepticism of experimental philosophers who follow Weinberg, Nichols & Stich. To explore the experimental aspect of philosophical conceptual analysis, I consider a simpler instance of the same activity: everyday linguistic interpretation. I argue that this, too, is experimental in nature. And in both conceptual analysis and linguistic interpretation, the intuitions considered problematic by experimental philosophers are necessary but epistemically irrelevant. They are like variables introduced into mathematical proofs which drop out before the solution. Or better, they are like the hypotheses that drive science, which do not themselves need to be true. In other words, it does not matter whether or not intuitions are accurate as descriptions of the natural kinds that undergird philosophical concepts; the aims of conceptual analysis can still be met.
Based on the theory of acoustic waves, a circular surface radiator model is introduced as a basis for constructing a knowledge transfer model for a knowledge alliance. The three main variables in the model are chosen to be the number of enterprises in the knowledge alliance, the frequency of knowledge transfer, and the relationship distances between the knowledge bodies. The internal mechanism of knowledge transfer in a knowledge alliance is studied, and the direct relationships among the internal influencing factors are explored. The results show that the number of enterprises in the knowledge alliance, the knowledge transfer frequency, and the knowledge transfer effect are positively correlated. The “Rayleigh distance” in the knowledge field is the appropriate relationship distance measure for assessing knowledge transfer within the alliance. The Rayleigh distance is highly correlated with the number of enterprises in the knowledge alliance and the knowledge transfer frequency. Moreover, the number of enterprises in the knowledge alliance and the knowledge transfer frequency are interrelated.
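For readers unfamiliar with the acoustic quantity being borrowed here, the Rayleigh distance of a circular piston radiator is commonly defined as follows; this is only the standard acoustics definition, and the paper's reinterpretation of it as a relationship distance between knowledge bodies is not reproduced.

\[
  R \;=\; \frac{S}{\lambda} \;=\; \frac{\pi a^{2}}{\lambda},
\]

where $a$ is the radius of the radiating surface, $S = \pi a^{2}$ its area, and $\lambda$ the wavelength; $R$ marks the conventional boundary between the near field and the far field of the radiator.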
According to the standard opinions in the literature, blocking the unacceptable consequences of the notorious slingshot argument requires imposing constraints on the metaphysics of facts or on theories of definite descriptions (or class abstracts). This paper argues that both of these well-known strategies to rebut the slingshot overshoot the mark. The slingshot, first and foremost, raises the question as to the adequate logical formalization of statements about facts, i.e. of factual contexts. It will be shown that a rigorous application of Quine’s maxim of shallow analysis to formalizations of factual contexts paves the way for an account of formalizing such contexts which blocks the slingshot without ramifications for theories of facts or definite descriptions.
This paper studies the stability analysis of fractional-order bidirectional associative memory neural networks with mixed time-varying delays. The orders of these systems lie in the interval (1, 2). Firstly, a sufficient condition is derived to ensure the finite-time stability of the systems by resorting to some analytical techniques and some elementary inequalities. Next, a sufficient condition is obtained to guarantee the global asymptotic stability of the systems based on the Laplace transform, the mean value theorem, the generalized Gronwall inequality, and some properties of Mittag–Leffler functions. In particular, these conditions are expressed as algebraic inequalities which can be easily calculated in practical applications. Finally, some numerical examples are given to verify the feasibility and effectiveness of the obtained main results.
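For reference, the two-parameter Mittag–Leffler function invoked in such stability conditions is standardly defined as

\[
  E_{\alpha,\beta}(z) \;=\; \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + \beta)}, \qquad \alpha > 0, \; \beta > 0,
\]

which reduces to the exponential function for $\alpha = \beta = 1$; in fractional-order systems it plays the role that the exponential plays in integer-order stability theory.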
Introduction to the Two Volumes. PART ONE: G. E. MOORE ON ETHICS, EPISTEMOLOGY, AND PHILOSOPHICAL ANALYSIS. CHAPTER 1 Common Sense and Philosophical Analysis; CHAPTER 2 Moore on Skepticism, Perception, and Knowledge; CHAPTER 3 Moore on Goodness and the Foundations of Ethics; CHAPTER 4 The Legacies and Lost Opportunities of Moore’s Ethics; Suggested Further Reading. PART TWO: BERTRAND RUSSELL ON LOGICAL AND LINGUISTIC ANALYSIS. CHAPTER 5 Logical Form, Grammatical Form, and the Theory of Descriptions; CHAPTER 6 Logic and Mathematics: The Logicist Reduction; CHAPTER 7 Logical Constructions and the External World; CHAPTER 8 Russell’s Logical Atomism; Suggested Further Reading. PART THREE: LUDWIG WITTGENSTEIN’S TRACTATUS. CHAPTER 9 The Metaphysics of the Tractatus; CHAPTER 10 Meaning, Truth, and Logic in the Tractatus; CHAPTER 11 The Tractarian Test of Intelligibility and Its Consequences; Suggested Further Reading. PART FOUR: LOGICAL POSITIVISM, EMOTIVISM, AND ETHICS. CHAPTER 12 The Logical Positivists on Necessity and Apriori Knowledge; CHAPTER 13 The Rise and Fall of the Empiricist Criterion of Meaning; CHAPTER 14 Emotivism and Its Critics; CHAPTER 15 Normative Ethics in the Era of Emotivism: The Anticonsequentialism of Sir David Ross; Suggested Further Reading. PART FIVE: THE POST-POSITIVIST PERSPECTIVE OF THE EARLY W. V. QUINE. CHAPTER 16 The Analytic and the Synthetic, the Necessary and the Possible, the Apriori and the Aposteriori; CHAPTER 17 Meaning and Holistic Verificationism; Suggested Further Reading. Index.
Sentiment analysis is the field of natural language processing concerned with analyzing opinionated data for the purpose of decision making. An opinion is a statement about a subject which expresses the sentiments as well as the emotions of the opinion makers on the topic. In this paper, we develop a sentiment analysis tool named SENTI-METER. This tool estimates the success rate of social campaigns based on the algorithms we developed, which analyze the sentiment of individual words as well as entire blogs. Social campaigns have a huge impact on the mindset of people. One such campaign, named Swachh Bharat Abhiyan, was launched in India on October 2, 2014. Our tool computes an elaborated analysis of Swachh Bharat Abhiyan, which examines the success rate of this social campaign. Here, we performed a location-wise analysis of the campaign and predict the degree of polarity of tweets, along with monthly and weekly analyses of the tweets. The experiments were conducted in five phases, namely extraction and preprocessing of tweets, tokenization, sentiment evaluation of a line, sentiment evaluation of a blog, and analysis. Our tool is also capable of handling transliterated words. Unbiased tweets related to this specific campaign were extracted from Twitter, and on comparison with manual tagging we were able to achieve 84.47% accuracy using a unigram machine learning approach. This approach helps the government to implement social campaigns effectively for the betterment of society.
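As a rough illustration of what line-level unigram sentiment scoring can look like, the sketch below uses a tiny, purely hypothetical lexicon; SENTI-METER's actual lexicon, preprocessing pipeline, transliteration handling, and learned classifier are not reproduced here.

POSITIVE = {"clean", "great", "success", "good", "support"}   # hypothetical lexicon
NEGATIVE = {"dirty", "bad", "failure", "poor", "waste"}

def tokenize(text):
    # Strip common punctuation, hashtag and mention markers, then lowercase.
    return [w.strip(".,!?#@").lower() for w in text.split() if w.strip(".,!?#@")]

def line_sentiment(text):
    # Score = (positive unigrams - negative unigrams) / total tokens, in [-1, 1].
    tokens = tokenize(text)
    if not tokens:
        return 0.0
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return (pos - neg) / len(tokens)

def blog_sentiment(lines):
    # A blog (or tweet stream) score as the mean of its line scores.
    scores = [line_sentiment(line) for line in lines]
    return sum(scores) / len(scores) if scores else 0.0

print(line_sentiment("Great success for the clean streets campaign!"))

A supervised unigram classifier of the kind the abstract reports (84.47% accuracy against manual tags) would replace the fixed lexicon with weights learned from labelled tweets, but the aggregation from words to lines to blogs follows the same pattern.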
Modelers often rely on robustness analysis, the search for predictions common to several independent models. Robustness analysis has been characterized and championed by Richard Levins and William Wimsatt, who see it as central to modern theoretical practice. The practice has also been severely criticized by Steven Orzack and Elliott Sober, who claim that it is a nonempirical form of confirmation, effective only under unusual circumstances. This paper addresses Orzack and Sober's criticisms by giving a new account of robustness analysis and showing how the practice can identify robust theorems. Once the structure of robust theorems is clearly articulated, it can be shown that such theorems have a degree of confirmation, despite the lack of direct empirical evidence for their truth.
According to evolutionary psychology predictions, women’s mate preferences shift between fertile and nonfertile times of the month to reflect ancestral fitness benefits. Our meta-analytic test involving 58 independent reports was largely nonsupportive. Specifically, fertile women did not especially desire sex in short-term relationships with men purported to be of high genetic quality. The few significant preference shifts appeared to be research artifacts. The effects declined over time in published work, were limited to studies that used broader, less precise definitions of the fertile phase, and were found only in published research.
This is a major, wide-ranging history of analytic philosophy since 1900, told by one of the tradition's leading contemporary figures. The first volume takes the story from 1900 to mid-century. The second brings the history up to date.
The etiological approach to ‘proper functions’ in biology can be strengthened by relating it to Robert Cummins' general treatment of function ascription. The proper functions of a biological trait are the functions it is assigned in a Cummins-style functional explanation of the fitness of ancestors. These functions figure in selective explanations of the trait. It is also argued that some recent etiological theories include inaccurate accounts of selective explanation in biology. Finally, a generalization of the notion of selective explanation allows an analysis of the proper functions of human artifacts.
Community-researcher partnerships constitute one of the most important recent developments in biomedical ethics. The partnerships protect vulnerable communities within which research is conducted and help ensure that the communities benefit from the research. At the same time, they embody deep, core values about the social nature of persons and the value of community that significantly modify the radical individualism too often associated with the prevailing concepts of autonomy and respect for persons. This article examines the burgeoning literature on community-researcher partnerships to identify the main ways of thinking ethically about the obligations of investigators and the roles and rights of communities in scientific research. The paper helps to uncover the deep commonalities and differences that mark the current debate in this emerging arena of research ethics, a debate over the social nature of persons that is beginning to influence the understanding of other bioethical issues.
Recommender systems are recently developed computer-assisted tools that support the social and informational needs of various communities and help users exploit huge amounts of data for making optimal decisions. In this study, we present a new recommender system for assessment and risk prediction in child welfare institutions in Israel. The system exploits a large diachronic repository of manually completed questionnaires on the functioning of welfare institutions and proposes two different rule-based computational models. The system accepts users’ requests via a simple graphical interface, calculates the institutions’ profiles according to user preferences, and presents assessment scores, trends, and comparative analyses of the corresponding data using assorted visual aids. Based on the analysis, the system offers three different strategies for objective assessment of the institutions’ functioning and risks. Qualitative and quantitative evaluation of the system’s effectiveness and accuracy demonstrates that it substantially improves the assessment process of a welfare institution. Moreover, it provides an effective tool for objective large-scale analysis of an institution’s overall state and trends, which were previously based primarily on the institution supervisors’ subjective judgment and intuition. In addition, the proposed recommender system has great practical and social impact, as it may help identify and avert potential problems, malfunctions, flaws, risks, and even tragic incidents in child welfare institutions, as well as increase their overall functioning levels. As a result, as a long-term social implication, the system may also help reduce inequality and social gaps in Israeli society.
Smith, Glass, and Miller have reported a meta-analysis of over 500 studies comparing some form of psychological therapy with a control condition. They report that, when averaged over all dependent measures of outcome, psychological therapy is .85 standard deviations better than the control treatment. We examined the subset of studies included in the Smith et al. meta-analysis that contained both a psychotherapy and a placebo treatment. The median of the mean effect sizes for these 32 studies was .15. There was a nonsignificant inverse relationship between mean outcome and the following: sample size, duration of therapy, use of measures of outcome other than undisguised self-report, measurement of outcome at follow-up, and use of real patients rather than subjects solicited for the purposes of participation in a research study. A qualitative analysis of the studies in terms of the type of patient involved indicates that those using psychiatric outpatients had essentially zero effect sizes and that none using psychiatric inpatients provides convincing evidence for psychotherapeutic effectiveness. The only studies clearly demonstrating significant effects of psychotherapy were the ones that did not use real patients. For the most part, these studies involved small samples of subjects and brief treatments, occasionally described in quasi-behavioristic language. It was concluded that for real patients there is no evidence that the benefits of psychotherapy are greater than those of placebo treatment.
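The effect sizes referred to above are standardized mean differences. In the Smith, Glass, and Miller tradition they are commonly computed as below, though the exact standardizer varies across primary studies, so take this as the generic form rather than the precise formula used in every case:

\[
  \Delta \;=\; \frac{\bar{X}_{\text{treatment}} - \bar{X}_{\text{control}}}{s_{\text{control}}}
\]

An effect of .85 therefore means the average treated client ends up about 0.85 control-group standard deviations above the control-group mean, whereas the .15 found for the therapy-versus-placebo subset marks a far smaller separation.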
The present study, based on the construct comparability approach, performs a comparative analysis of grade point averages for seven courses, using exploratory factor analysis (EFA) and the Partial Credit Model (PCM) with a sample of 1398 students from 8 schools in the province of Alicante. EFA confirmed a one-factor model which explains 74.44% of the variance. Cronbach's alpha value for this factor was .94. The PCM supported the one-factor model, and an optimal fit was achieved in all of the courses. The analysis of differential item functioning showed no significant differences in any course. An equitable distribution was observed in the evolution of the difficulty indices along the measurement scale for each course. This type of analysis confirms the measurement of a single latent construct across the different topics analysed, despite addressing various theoretical and procedural contents.
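For reference, the reliability coefficient reported above is standardly computed as

\[
  \alpha \;=\; \frac{k}{k-1}\left(1 \;-\; \frac{\sum_{i=1}^{k} \sigma_{i}^{2}}{\sigma_{X}^{2}}\right),
\]

where $k$ is the number of items (here, the seven courses), $\sigma_{i}^{2}$ is the variance of item $i$, and $\sigma_{X}^{2}$ is the variance of the total score; the reported value of .94 indicates very high internal consistency for the single extracted factor.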