The moral psychology of sympathy is the linchpin of the sentimentalist moral theories of both David Hume and Adam Smith. In this paper, I attempt to diagnose the critical differences between Hume's and Smith's respective accounts of sympathy in order to argue that Smithian sympathy is more properly suited to serve as a basis for impartial moral evaluations and judgments than is Humean sympathy. By way of arguing this claim, I take up the problem of overcoming sympathetic partiality in the construction of a moral point of view, acknowledged by both writers, as my primary platform. My contention is that Humean sympathy is too mechanistic to actually deliver an impartial adjudicatory perspective, and that Smithian sympathy, with its evaluative, imaginative components, succeeds where Hume's account falls short. The paper comprises six sections: (i) introductory remarks, (ii) a discussion of Humean sympathy, (iii) a discussion of Smithian sympathy and its distinctness, (iv) a critical analysis of Hume's attempt to correct for sympathetic partiality in the construction of the judicial spectator's general point of view, (v) a critical discussion of sympathetic partiality in Smithian sympathy and (vi) a critical analysis of Smith's construction of the impartial spectator perspective as a moral point of view.
Learn the tips and tricks used by a top photographer in the digital photography industry in Rick Sammon's Top Digital Photography Secrets. Filled with beautiful photographs and the techniques Rick Sammon used to capture them, this book offers the motivation to capture stunning photographs and the tools and tricks you need to do so. With more than 100 techniques for use behind the camera, this book will improve the camera skills of both amateur and experienced photographers. Additionally, this book includes a chapter on post-production secrets and a Rick Sammon DVD guide to lighting, camera, and digital photography basics. In this 1-hour DVD, Rick shows you how to get great photos using a variety of lighting sources, camera-specific techniques, and basic rules of photography.
In arguments in support of capitalism, the following propositions are sometimes advanced or presupposed: the best life for the individual is one of consumption, understood in a broad sense that includes aesthetic pleasures and entertainment as well as consumption of goods in the ordinary sense; consumption is to be valued because it promotes happiness or welfare, which is the ultimate good; since there are not enough opportunities for consumption to provide satiation for everybody, some principles of distributive justice must be chosen to decide who gets what; the total to be distributed has first to be produced. What is produced depends, among other things, on the motivation and information of the producers. The theory of justice must take account of the fact that different principles of distribution have different effects on motivation and information; economic theory tells us that the motivational and informational consequences of private ownership of the means of production are superior to those of the various forms of collective ownership. In the traditional controversy over the relative merits of capitalism and economic systems, the focus has been on proposition. In this paper, I consider instead propositions and. Before one can even begin to discuss how values are to be allocated, one must consider what they are – what it is that ought to be valued. I shall argue that at the center of Marxism is a specific conception of the good life as one of active self-realization, rather than passive consumption.
Objective Bayesianism is a methodological theory that is currently applied in statistics, philosophy, artificial intelligence, physics and other sciences. This book develops the formal and philosophical foundations of the theory, at a level accessible to a graduate student with some familiarity with mathematical notation.
This book is an expanded and revised edition of the author's critically acclaimed volume Nuts and Bolts for the Social Sciences. In twenty-six succinct chapters, Jon Elster provides an account of the nature of explanation in the social sciences. He offers an overview of key explanatory mechanisms in the social sciences, relying on hundreds of examples and drawing on a large variety of sources - psychology, behavioral economics, biology, political science, historical writings, philosophy and fiction. Writing in accessible and jargon-free language, Elster aims at accuracy and clarity while eschewing formal models. In a provocative conclusion, Elster defends the centrality of qualitative social sciences in a two-front war against soft and hard forms of obscurantism.
Bayesian nets are widely used in artificial intelligence as a calculus for causal reasoning, enabling machines to make predictions, perform diagnoses, take decisions and even to discover causal relationships. This book, aimed at researchers and graduate students in computer science, mathematics and philosophy, brings together two important research topics: how to automate reasoning in artificial intelligence, and the nature of causality and probability in philosophy.
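To give a flavor of the calculus described above, here is a minimal sketch of prediction and diagnosis in a hypothetical two-node net (Rain → WetGrass); the structure and all probabilities are illustrative assumptions of mine, not examples taken from the book.

```python
# Hypothetical two-node Bayesian net: Rain -> WetGrass.
# Prediction reasons from cause to effect; diagnosis reasons
# from observed effect back to cause via Bayes' theorem.
# All numbers are illustrative assumptions.

p_rain = 0.2                              # prior P(Rain)
p_wet_given = {True: 0.9, False: 0.1}     # P(WetGrass | Rain)

# Prediction: marginalize over the cause to get P(WetGrass).
p_wet = (p_wet_given[True] * p_rain
         + p_wet_given[False] * (1 - p_rain))

# Diagnosis: invert the causal arrow with Bayes' theorem.
p_rain_given_wet = p_wet_given[True] * p_rain / p_wet
```

Real Bayesian-net engines generalize exactly this pattern (marginalization and Bayesian updating) to nets with many nodes, where naive enumeration becomes intractable and specialized inference algorithms take over.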
The emulation theory of representation is developed and explored as a framework that can revealingly synthesize a wide variety of representational functions of the brain. The framework is based on constructs from control theory (forward models) and signal processing (Kalman filters). The idea is that in addition to simply engaging with the body and environment, the brain constructs neural circuits that act as models of the body and environment. During overt sensorimotor engagement, these models are driven by efference copies in parallel with the body and environment, in order to provide expectations of the sensory feedback, and to enhance and process sensory information. These models can also be run off-line in order to produce imagery, estimate outcomes of different actions, and evaluate and develop motor plans. The framework is initially developed within the context of motor control, where it has been shown that inner models running in parallel with the body can reduce the effects of feedback delay problems. The same mechanisms can account for motor imagery as the off-line driving of the emulator via efference copies. The framework is extended to account for visual imagery as the off-line driving of an emulator of the motor-visual loop. I also show how such systems can provide for amodal spatial imagery. Perception, including visual perception, results from such models being used to form expectations of, and to interpret, sensory input. I close by briefly outlining other cognitive functions that might also be synthesized within this framework, including reasoning, theory of mind phenomena, and language. Key Words: efference copies; emulation theory of representation; forward models; Kalman filters; motor control; motor imagery; perception; visual imagery.
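The core mechanism of the abstract above can be sketched in one dimension: a forward model driven by an efference copy, corrected by noisy sensory feedback via a Kalman gain when run on-line, and run without feedback when producing imagery. The parameters (a, b, q, r) and the measurement sequence are illustrative assumptions of mine, not values from the article.

```python
# Minimal 1-D emulator sketch: next state = a*x + b*u, where u is the
# efference copy of the motor command. Illustrative parameters only.
a, b = 1.0, 0.5        # process model coefficients
q, r = 0.01, 0.25      # process and sensor noise variances

def emulator_step(x, p, u, z=None):
    """One predict(+update) step on a state estimate x with variance p.
    z=None runs the emulator 'off-line', as in motor/visual imagery:
    prediction driven by the efference copy, with no sensory correction."""
    x_pred = a * x + b * u           # drive the model with the efference copy
    p_pred = a * a * p + q
    if z is None:                    # off-line: imagery, no feedback
        return x_pred, p_pred
    k = p_pred / (p_pred + r)        # Kalman gain
    x_new = x_pred + k * (z - x_pred)  # correct with sensory feedback
    return x_new, (1 - k) * p_pred

# On-line engagement: noisy measurements pull the estimate into line.
x, p = 0.0, 1.0
for z in [0.6, 1.1, 1.4]:
    x, p = emulator_step(x, p, u=1.0, z=z)

# Off-line imagery: the same model, same efference copies, no input.
xi, pi = x, p
for _ in range(3):
    xi, pi = emulator_step(xi, pi, u=1.0)
```

On-line, the gain k trades off the model's prediction against sensory feedback, which is how an emulator can compensate for feedback delay; off-line, uncertainty accumulates because nothing corrects the model, which fits the article's treatment of imagery as prediction without input.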
To understand the mind, we need to draw equally on the fields of cognitive science and neuroscience. But these two fields have very separate intellectual roots, and very different styles. So how can the two be reconciled in order to develop a full understanding of the mind and brain? This is the focus of this landmark new book.
At least since the French moralists—Montaigne, Pascal, La Rochefoucauld, La Bruyère—it has been a commonplace that people can fool themselves as well as others about their beliefs and motivations. In this article, I consider some mechanisms of transmutation and misrepresentation, and their impact on behavior. I argue that deception and self-deception are not merely ex post rationalizations of behavior whose real motive and explanation are found elsewhere, but that they have independent causal and explanatory power. If people, that is, did not fool themselves or others about why they do what they do, they would act differently. The reason is that deception and self-deception take place under constraints that prevent us from offering totally opportunistic or self-serving rationalizations of what we do. There is a consistency constraint that is induced by the costs of being seen as offering inconsistent justifications for one's behavior, and an imperfection constraint that is induced by the costs of being seen as offering justifications that are too blatantly self-serving.
Technical change, defined as the manufacture and modification of tools, is generally thought to have played an important role in the evolution of intelligent life on earth, comparable to that of language. In this volume, first published in 1983, Jon Elster approaches the study of technical change from an epistemological perspective. He first sets out the main methods of scientific explanation and then applies those methods to some of the central theories of technical change. In particular, Elster considers neoclassical, evolutionary, and Marxist theories, whilst also devoting a chapter to Joseph Schumpeter's influential theory.
This brief commentary has three goals. The first is to argue that ‘‘framework debate’’ in cognitive science is unresolvable. The idea that one theory or framework can singly account for the vast complexity and variety of cognitive processes seems unlikely if not impossible. The second goal is a consequence of this: We should consider how the various theories on offer work together in diverse contexts of investigation. A final goal is to supply a brief review for readers who are compelled by these points to explore existing literature on the topic. Despite this literature, pluralism has garnered very little attention from broader cognitive science. We end by briefly considering what it might mean for theoretical cognitive science.
... there are cases in which on the basis of a temporally extended content of consciousness a unitary apprehension takes place which is spread out over a temporal interval (the so-called specious present). ... That several successive tones yield a melody is possible only in this way, that the succession of psychical processes are united "forthwith" in a common structure.
In Knowing Emotions, Furtak argues that it is only through the emotions that we can perceive meaning in life, and only by feeling emotions that we are able to recognize the value or significance of anything whatsoever. Our affective responses and dispositions therefore play a critical role in human existence, and their felt quality is intimately related to the awareness they provide.
In this article I outline, apply, and defend a theory of natural representation. The main consequences of this theory are: i) representational status is a matter of how physical entities are used, and specifically is not a matter of causation, nomic relations with the intentional object, or information; ii) there are genuine (brain-)internal representations; iii) such representations are really representations, and not just farcical pseudo-representations, such as attractors, principal components, state-space partitions, or what-have-you; and iv) the theory allows us to sharply distinguish those complex behaviors which are genuinely cognitive from those which are merely complex and adaptive.
We describe a “centipede’s dilemma” that faces the sciences of human interaction. Research on human interaction has been involved in extensive theoretical debate, although the vast majority of research tends to focus on a small set of human behaviors, cognitive processes, and interactive contexts. The problem is that naturalistic human interaction must integrate all of these factors simultaneously, and grander theoretical integration cannot come only from focused experimental or computational agendas. We look to dynamical systems theory as a framework for thinking about how these multiple behaviors, processes, and contexts can be integrated into a broader account of human interaction. By introducing and utilizing basic concepts of self-organization and synergy, we review empirical work that shows how human interaction is flexible and adaptive and structures itself incrementally during unfolding interactive tasks, such as conversation, or more focused goal-based contexts. We end by acknowledging that dynamical systems accounts are very short on concrete models, and we briefly describe ways that theoretical frameworks could be integrated, rather than endlessly disputed, to achieve some success on the centipede’s dilemma of human interaction.
In this paper, I link philosophical discussion of policies for trans inclusion or exclusion to a method of policy making. I address the relationship between concerns about safety, fairness, and inclusion in policy making about the inclusion of transwomen athletes into women’s sport. I argue for an approach based on lexical priority rather than simple ‘balancing’, considering the different values in a specific order. I present justifying reasons for this approach and this lexical order, based on the special obligations of International Federations such as World Rugby. As a result, I provide a justificatory framework for the WR Guidelines that exclude transwomen from the women’s game in WR competitions. Finally, I give an account of a maximally safe, maximally fair and maximally inclusive form of sex categorisation in sport.
Using the Gödel incompleteness result for leverage, Roger Penrose has argued that the mechanism for consciousness involves quantum gravitational phenomena, acting through microtubules in neurons. We show that this hypothesis is implausible. First, the Gödel result does not imply that human thought is in fact non-algorithmic. Second, whether or not non-algorithmic quantum gravitational phenomena actually exist, and if they did how that could conceivably implicate microtubules, and if microtubules were involved, how that could conceivably implicate consciousness, is entirely speculative. Third, cytoplasmic ions such as calcium and sodium are almost certainly present in the microtubule pore, barring the quantum-mechanical effects Penrose envisages. Finally, physiological evidence indicates that consciousness does not directly depend on microtubule properties in any case, rendering doubtful any theory according to which consciousness is generated in the microtubules.
The question of whether time is its own best representation is explored. Though there is theoretical debate between proponents of internal models and embedded cognition proponents (e.g. Brooks R 1991 Artificial Intelligence 47 139–59) concerning whether the world is its own best model, proponents of internal models are often content to let time be its own best representation. This happens via the time update of the model that simply allows the model’s state to evolve along with the state of the modeled domain. I argue that this is neither necessary nor advisable. I show that this is not necessary by describing how internal modeling approaches can be generalized to schemes that explicitly represent time by maintaining trajectory estimates rather than state estimates. Though there are a variety of ways this could be done, I illustrate the proposal with a scheme that combines filtering, smoothing and prediction to maintain an estimate of the modeled domain’s trajectory over time. I show that letting time be its own representation is not advisable by showing how trajectory estimation schemes can provide accounts of temporal illusions, such as apparent motion, that pose serious difficulties for any scheme that lets time be its own representation.
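The filtering/smoothing/prediction combination described above can be given a toy sketch: the system maintains an estimate of a short trajectory rather than a single current state, so a new measurement revises the estimated past (smoothing) as well as the present (filtering), while prediction extends the estimate into the future. The gains and the constant-velocity prediction rule are illustrative assumptions of mine, not the article's actual scheme.

```python
# Toy trajectory-estimation sketch. A trajectory is a list of state
# estimates, oldest first. Gains are illustrative assumptions.

def smooth(trajectory, z, f_gain=0.6, s_gain=0.4):
    """Fold a new measurement z into the trajectory estimate:
    filter the newest state, then retrodictively smooth earlier ones."""
    present = trajectory[-1] + f_gain * (z - trajectory[-1])
    correction = present - trajectory[-1]
    past = [x + s_gain * correction for x in trajectory[:-1]]
    return past + [present]

def predict(trajectory, steps=1):
    """Constant-velocity extrapolation from the last two estimates."""
    v = trajectory[-1] - trajectory[-2]
    return [trajectory[-1] + v * (k + 1) for k in range(steps)]

traj = [0.0, 0.0]          # initial estimate: object at rest at 0
traj = smooth(traj, 1.0)   # a measurement at 1.0 arrives ...
# ... and the estimate of the *past* position is revised upward too:
# the kind of retrodictive revision invoked to explain temporal
# illusions such as apparent motion.
```

The point of the sketch is structural: because the represented quantity is a trajectory, later evidence can legitimately rewrite the represented past, which a scheme that lets time be its own representation cannot do.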
This article outlines the main tenets of affect theory and links these to Sloterdijk’s spherology. Where affect foregrounds prepersonal energies and posthuman impulses, spherology provides a lens for considering how humans congregate in constantly reconfiguring socialities in their pursuit of legitimacy and immunity. The article then explores the relevance of “affective spheres” for contemporary social science research. The article’s main argument here is that research of contemporary organisational and professional practices must increasingly be spherogenic, or seeking to build “affective spheres.” The basis of this argument is the in situ complexity and fast-changing nature of practices, and the increasing challenges involved in objectifying or ‘freezing’, and analysing or dissecting, such practices. The article draws for its case study on a video-reflexive project conducted in a U.S. health service. The article concludes that the notion of research as spherogenics counterbalances the conventional methodological emphasis on a predetermined stance—whether neutral or political—in our construction and enactment of social science research.
I argue against a growing radical trend in current theoretical cognitive science that moves from the premises of embedded cognition, embodied cognition, dynamical systems theory and/or situated robotics to conclusions either to the effect that the mind is not in the brain or that cognition does not require representation, or both. I unearth the considerations at the foundation of this view: Haugeland's bandwidth-component argument to the effect that the brain is not a component in cognitive activity, and arguments inspired by dynamical systems theory and situated robotics to the effect that cognitive activity does not involve representations. Both of these strands depend not only on a shift of emphasis from higher cognitive functions to things like sensorimotor processes, but also depend on a certain understanding of how sensorimotor processes are implemented - as closed-loop control systems. I describe a much more sophisticated model of sensorimotor processing that is not only more powerful and robust than simple closed-loop control, but for which there is great evidence that it is implemented in the nervous system. This is the emulation theory of representation, according to which the brain constructs inner dynamical models, or emulators, of the body and environment which are used in parallel with the body and environment to enhance motor control and perception and to provide faster feedback during motor processes, and can be run off-line to produce imagery and evaluate sensorimotor counterfactuals. I then show that the emulation framework is immune to the radical arguments, and makes apparent why the brain is a component in cognitive activity, and exactly what the representations are in sensorimotor control.
In evolutionary medicine, researchers characterize some outcomes as evolutionary mismatch. Mismatch problems arise as the result of organisms living in environments to which they are poorly adapted, typically as the result of some rapid environmental change. Depression, anxiety, obesity, myopia, insomnia, breast cancer, dental problems, and numerous other negative health outcomes have all been characterized as mismatch problems. The exact nature of evolutionary mismatch itself is unclear, however. This leads to a lack of clarity about the sorts of problems that evolutionary mismatch can actually explain. Resolving this challenge is important not only for the evolutionary health literature, but also because the notion of evolutionary mismatch involves central concepts in evolutionary biology: fitness, evolution in changing environments, and so forth. In this paper, I examine two characterizations of mismatch currently in the literature. I propose that we conceptualize mismatch as a relation between an optimal environment and an actual environment. Given an organism and its particular physiology, the optimal environment is the environment in which the organism’s fitness is maximized: in other words, the optimal environment is that in which the organism’s fitness is as high as it can possibly be. The actual environment is the environment in which the organism actually finds itself. To the extent that there is a discordance between the organism’s actual and optimal environments, there is an evolutionary mismatch. In the paper, I show that this account of mismatch gives us the right result when other accounts fail, and provides useful targets for investigation.
An attempt is made to defend a general approach to the spatial content of perception, an approach according to which perception is imbued with spatial content in virtue of certain kinds of connections between a perceiving organism's sensory input and its behavioral output. The most important aspect of the defense involves clearly distinguishing two kinds of perceptuo-behavioral skills—the formation of dispositions, and a capacity for emulation. The former, the formation of dispositions, is argued to be the central pivot of spatial content. I provide a neural information processing interpretation of what these dispositions amount to, and describe how dispositions, so understood, are an obvious implementation of Gareth Evans' proposal on the topic. Furthermore, I describe what sorts of contribution are made by emulation mechanisms, and I also describe exactly how the emulation framework differs from similar but distinct notions with which it is often unhelpfully confused, such as sensorimotor contingencies and forward models.
The problem of how physical systems, such as brains, come to represent themselves as subjects in an objective world is addressed. I develop an account of the requirements for this ability that draws on and refines work in a philosophical tradition that runs from Kant through Peter Strawson to Gareth Evans. The basic idea is that the ability to represent oneself as a subject in a world whose existence is independent of oneself involves the ability to represent space, and in particular, to represent oneself as one object among others in an objective spatial realm. In parallel, I provide an account of how this ability, and the mechanisms that support it, are realized neurobiologically. This aspect of the article draws on, and refines, work done in the neurobiology and psychology of egocentric and allocentric spatial representation.
A variety of theoretical frameworks predict the resemblance of behaviors between two people engaged in communication, in the form of coordination, mimicry, or alignment. However, little is known about the time course of the behavior matching, even though there is evidence that dyads synchronize oscillatory motions (e.g., postural sway). This study examined the temporal structure of nonoscillatory actions—language, facial, and gestural behaviors—produced during a route communication task. The focus was the temporal relationship between matching behaviors in the interlocutors (e.g., facial behavior in one interlocutor vs. the same facial behavior in the other interlocutor). Cross-recurrence analysis revealed that within each category tested (language, facial, gestural), interlocutors synchronized matching behaviors, at temporal lags short enough to provide imitation of one interlocutor by the other, from one conversational turn to the next. Both social and cognitive variables predicted the degree of temporal organization. These findings suggest that the temporal structure of matching behaviors provides low-level and low-cost resources for human interaction.
The nature of temporal experience is typically explained in one of a small number of ways, most of which are versions of either retentionalism or extensionalism. After describing these, I make a distinction between two kinds of temporal character that could structure temporal experience: A-ish contents are those that present events as structured in past/present/future terms, and B-ish contents are those that present events as structured in earlier-than/later-than/simultaneous-with relations. There are a few exceptions, but most of the literature ignores this distinction, and silently assumes temporal experience is A-ish. I then argue that temporal character is not scale invariant, but rather that temporal experience is A-ish at larger scales, and B-ish at smaller scales. I then point out that this scale non-invariance opens the possibility of hybrid views. I clarify my own view as a hybrid view, according to which temporal experience is B-ish at small scales – and at this scale my trajectory estimation model (TEM) applies – but A-ish at larger scales, and at the larger scale my TEM does not apply. I then motivate this hybrid position by first defending it against arguments that have tried to show that the TEM is untenable. Since the hybrid view has TEM as its small-scale component, it must address this objection. I then put pressure on the main alternative account, extensionalism, by showing that its proponents have not adequately dealt with the problem of temporal illusions. The result is a new theory motivated by i) explaining its virtues, ii) showing that objections to it can be met, and iii) showing that objections to its main competitors have not been met.
This is the first major study of Michel Foucault as a political thinker. Written in clear prose, Foucault and the Political explores the ramifications for political theory of the whole range of Foucault's writing, including materials only recently made available. Jon Simons argues that Foucault's work is animated by a tension between his presentation of modern life as "unbearably heavy" and his temptation to escape its limitations by aiming for "unbearable lightness." Through expositions of Foucault's ideas on power/knowledge, subjectification, governmentality, political rationality and the aesthetics of existence, Simons demonstrates how Foucault resists both extremes. Foucault's thought entails an ethic of permanent resistance, best embodied in radical democracy. Simons relates Foucault's work both to contemporary political thinkers, such as Michael Walzer, Charles Taylor and Jurgen Habermas, as well as to scholars challenging conventional political categories, especially feminist and gay theorists such as Judith Butler.
Language maps signals onto meanings through the use of two distinct types of structure. First, the space of meanings is discretized into categories that are shared by all users of the language. Second, the signals employed by the language are compositional: The meaning of the whole is a function of its parts and the way in which those parts are combined. In three iterated learning experiments using a vast, continuous, open-ended meaning space, we explore the conditions under which both structured categories and structured signals emerge ex nihilo. While previous experiments have been limited to either categorical structure in meanings or compositional structure in signals, these experiments demonstrate that when the meaning space lacks clear preexisting boundaries, more subtle morphological structure that lacks straightforward compositionality—as found in natural languages—may evolve as a solution to joint pressures from learning and communication.
Nothing is more obvious than the fact that we are able to experience events in the world such as a ball deflecting from the cross-bar of a goal. But what is the temporal relation between these two things, the event, and our experience of the event? One possibility is that the world progresses temporally through a sequence of instantaneous states – the striker’s foot in contact with the ball, then the ball between the striker and the goal, then the ball in contact with the cross-bar, and so forth – while the perceiver’s experience is likewise a sequence of experience states, each one of which corresponds to, or is experience of, a corresponding state of the world – for example, a perception of the foot in contact with the ball, followed by a perception of the ball in the air, followed by a perception of the ball in contact with the cross-bar. This way of understanding the relationship between experience and the world is very natural, and nearly universal. However, it rests on two assumptions that can be brought into question.
A number of recent attempts to bridge Husserlian phenomenology of time consciousness and contemporary tools and results from cognitive science or computational neuroscience are described and critiqued. An alternate proposal is outlined that lacks the weaknesses of existing accounts.
Action is a means of acquiring perceptual information about the environment. Turning around, for example, alters your spatial relations to surrounding objects and, hence, which of their properties you visually perceive. Moving your hand over an object’s surface enables you to feel its shape, temperature, and texture. Sniffing and walking around a room enables you to track down the source of an unpleasant smell. Active or passive movements of the body can also generate useful sources of perceptual information (Gibson 1966, 1979). The pattern of optic flow in the retinal image produced by forward locomotion, for example, contains information about the direction in which you are heading, while motion parallax is a “cue” used by the visual system to estimate the relative distances of objects in your field of view. In these uncontroversial ways and others, perception is instrumentally dependent on action. According to an explanatory framework that Susan Hurley (1998) dubs the “Input-Output Picture”, the dependence of perception on action is purely instrumental: "Movement can alter sensory inputs and so result in different perceptions… changes in output are merely a means to changes in input, on which perception depends directly" (1998: 342).

The action-based theories of perception, reviewed in this entry, challenge the Input-Output Picture. They maintain that perception can also depend in a noninstrumental or constitutive way on action (or, more generally, on capacities for object-directed motor control). This position has taken many different forms in the history of philosophy and psychology. Most action-based theories of perception in the last 300 years, however, have looked to action in order to explain how vision, in particular, acquires either all or some of its spatial representational content. Accordingly, these are the theories on which we shall focus here.
We begin in Section 1 by discussing George Berkeley’s Towards a New Theory of Vision (1709), the historical locus classicus of action-based theories of perception, and one of the most influential texts on vision ever written. Berkeley argues that the basic or “proper” deliverance of vision is not an arrangement of voluminous objects in three-dimensional space, but rather a two-dimensional manifold of light and color. We then turn to a discussion of Lotze, Helmholtz, and the local sign doctrine. The “local signs” were felt cues for the mind to know what sort of spatial content to imbue visual experience with. For Lotze, these cues were “inflowing” kinaesthetic feelings that result from actually moving the eyes, while, for Helmholtz, they were “outflowing” motor commands sent to move the eyes.

In Section 2, we discuss sensorimotor contingency theories, which became prominent in the 20th century. These views maintain that an ability to predict the sensory consequences of self-initiated actions is necessary for perception. Among the motivations for this family of theories is the problem of visual direction constancy—why do objects appear to be stationary even though the locations on the retina to which they reflect light change with every eye movement?—as well as experiments on adaptation to optical rearrangement devices (ORDs) and sensory substitution.

Section 3 examines two other important 20th century theories. According to what we shall call the motor component theory, efference copies generated in the oculomotor system and/or proprioceptive feedback from eye-movements are used together with incoming sensory inputs to determine the spatial attributes of perceived objects. Efferent readiness theories, by contrast, look to the particular ways in which perceptual states prepare the observer to move and act in relation to the environment.
The modest readiness theory, as we shall call it, claims that the way an object’s spatial attributes are represented in visual experience can be modulated by one or another form of covert action planning. The bold readiness theory argues for the stronger claim that perception just is covert readiness for action.

In Section 4, we move to the disposition theory, most influentially articulated by Gareth Evans (1982, 1985), but more recently defended by Rick Grush (2000, 2007). Evans’ theory is, at its core, very similar to the bold efferent readiness theory. There are some notable differences, though. Evans’ account is more finely articulated in some philosophical respects. It also does not posit a reduction of perception to behavioral dispositions, but rather posits that certain complicated relations between perceptual input and behavioral output provide spatial content. Grush proposes a very specific theory that is like Evans’ in that it does not posit a reduction, but unlike Evans’ view, does not put behavioral dispositions and sensory input on an undifferentiated footing.
Social identity is a factor that is associated with well-being and community participation. Some studies have shown that ethnic identity goes along with empowerment, and that interaction between the two leads to greater indices of well-being and community participation. However, other works suggest a contextual circumstance may condition the nature of these relations. By means of a cross-sectional study, we analyzed the relations of social identification and collective psychological empowerment with personal well-being, social well-being and community participation in a sample of Basques. A total of 748 Basques participated. Individuals who were highly identified or fused with Basque speakers and who were highly empowered showed higher indices of well-being and of community participation than non-fused individuals with low empowerment. The results also suggest that social identification offsets the negative effects of perceiving the group as a linguistic minority. Collective psychological empowerment proved to be an especially relevant factor that needs to continue to be explored.