Possibly the most fundamental scientific problem is the origin of time and causality. The inherent difficulty is that all scientific theories of origins and evolution consider the existence of time and causality as given. We tackle this problem by starting from the concept of self-organization, which is seen as the spontaneous emergence of order out of primordial chaos. Self-organization can be explained by the selective retention of invariant or consistent variations, implying a breaking of the initial symmetry exhibited by randomness. In the case of time, we start from a random graph connecting primitive “events”. Selection on the basis of consistency eliminates cyclic parts of the graph, so that transitive closure can transform it into a partial order relation of precedence. Causality is assumed to be carried by causal “agents” which undergo a more traditional variation and selection, giving rise to causal laws that are partly contingent, partly necessary.
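To make the construction concrete, here is a minimal Python sketch (not from the paper; all names and parameters are illustrative assumptions) of the selection step described above: randomly proposed connections between events are retained only when they do not close a cycle, and the transitive closure of the surviving graph yields a strict partial order of precedence.

```python
# Sketch: random event graph -> cycle-free selection -> partial order.
import random

def random_precedence_order(n_events=8, n_edges=20, seed=1):
    random.seed(seed)
    succ = {e: set() for e in range(n_events)}  # event -> direct successors

    def reaches(a, b):
        """True if b is reachable from a (depth-first search)."""
        stack, seen = [a], set()
        while stack:
            x = stack.pop()
            if x == b:
                return True
            if x not in seen:
                seen.add(x)
                stack.extend(succ[x])
        return False

    # Variation: propose random connections; selection: retain only those
    # that stay "consistent", i.e. do not create a cycle.
    for _ in range(n_edges):
        a, b = random.sample(range(n_events), 2)
        if not reaches(b, a):
            succ[a].add(b)

    # Transitive closure turns the surviving graph into a precedence
    # relation: a precedes b whenever b is reachable from a.
    return {(a, b) for a in succ for b in range(n_events)
            if a != b and reaches(a, b)}

order = random_precedence_order()
# By construction the relation is irreflexive, antisymmetric and
# transitive: a strict partial order of precedence.
assert all((b, a) not in order for (a, b) in order)
```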
Emergence is defined as a process that cannot be described by a fixed model consisting of invariant distinctions. Hence emergence must be described by a meta-model, representing the transition from one model to another by means of a distinction dynamics. The dynamics of distinctions is based on the processes of variation and selection, resulting in an invariant distinction, which constrains the variety and thus defines a new system. A classification of emergence processes is proposed, based on the following criteria: amount of variety, internality/externality of variation and selection, number of levels, and contingency of constraint. It is argued that traditional formal and computational models are incapable of representing the more general types of emergence, but that it is possible to generalize them on the basis of the dynamics of distinctions.
The science of complexity is based on a new way of thinking that stands in sharp contrast to the philosophy underlying Newtonian science, which is based on reductionism, determinism, and objective knowledge. This paper reviews the historical development of this new world view, focusing on its philosophical foundations. Determinism was challenged by quantum mechanics and chaos theory. Systems theory replaced reductionism by a scientifically based holism. Cybernetics and postmodern social science showed that knowledge is intrinsically subjective. These developments are being integrated under the heading of “complexity science”. Its central paradigm is the multi-agent system. Agents are intrinsically subjective and uncertain about their environment and future, but out of their local interactions, a global organization emerges. Although different philosophers, and in particular the postmodernists, have voiced similar ideas, the paradigm of complexity still needs to be fully assimilated by philosophy. This will throw a new light on old philosophical issues such as relativism, ethics and the role of the subject.
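As a toy illustration of the multi-agent paradigm (entirely hypothetical, not taken from the paper), the sketch below shows agents on a ring who each see only their two neighbours, yet whose repeated local imitation produces a global organization of consensus domains.

```python
# Toy multi-agent system: local majority imitation on a ring.
import random

def step(states):
    new = []
    n = len(states)
    for i in range(n):
        # Each agent only perceives itself and its two neighbours...
        neighbourhood = [states[i - 1], states[i], states[(i + 1) % n]]
        # ...and adopts the local majority opinion.
        new.append(max(set(neighbourhood), key=neighbourhood.count))
    return new

random.seed(0)
states = [random.choice("AB") for _ in range(30)]   # initial disorder
for _ in range(10):                                 # purely local updates
    states = step(states)
print("".join(states))   # long uniform runs: emergent global order
```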
In this chapter we want to provide philosophical tools for understanding and reasoning about complex systems. Classical thinking, as taught at most schools and universities, faces several problems in coping with complexity. We review classical thinking and its drawbacks when dealing with complexity, and then present ways of thinking that allow a better understanding of complex systems. Examples illustrate the ideas presented. This chapter does not deal with specific tools and techniques for managing complex systems; rather, we try to bring forth ideas that facilitate thinking and speaking about complex systems.
Allen (2001) proposed the “Getting Things Done” (GTD) method for personal productivity enhancement, and reduction of the stress caused by information overload. This paper argues that recent insights in psychology and cognitive science support and extend GTD’s recommendations. We first summarize GTD with the help of a flowchart. We then review the theories of situated, embodied and distributed cognition that purport to explain how the brain processes information and plans actions in the real world. The conclusion is that the brain relies heavily on the environment to function as an external memory, a trigger for actions, and a source of affordances, disturbances and feedback. We then show how these principles are practically implemented in GTD, with its focus on organizing tasks into “actionable” external memories, and on opportunistic, situation-dependent execution. Finally, we propose an extension of GTD to support collaborative work, inspired by the concept of stigmergy.
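The GTD flowchart lends itself to a compact rendering in code. The routing below follows Allen's standard workflow questions (is the item actionable? can it be done in under two minutes? can it be delegated or deferred?); the function signature and bucket names are our own illustrative choices, not the paper's.

```python
# Sketch of the GTD processing flowchart as a decision function.
def process(item, actionable, minutes_needed=None, delegatable=False,
            has_date=False):
    """Route one captured item to a GTD bucket (an external memory)."""
    if not actionable:
        return "trash / someday-maybe / reference"
    if minutes_needed is not None and minutes_needed < 2:
        return "do it now"            # GTD's two-minute rule
    if delegatable:
        return "waiting-for list"     # delegate, then track
    return "calendar" if has_date else "next-actions list"

print(process("old flyer", actionable=False))
print(process("reply to Ann", actionable=True, minutes_needed=1))
print(process("book venue", actionable=True, minutes_needed=30, has_date=True))
```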
It is argued that the acceptance of knowledge in a community depends on several, approximately independent selection "criteria". The objective criteria are distinctiveness, invariance and controllability; the subjective ones are individual utility, coherence, simplicity and novelty; and the intersubjective ones are publicity, expressivity, formality, collective utility, conformity and authority. Science demarcates itself from other forms of knowledge by explicitly controlling for the objective criteria.
The present paper criticizes Chalmers's discussion of the Singularity, viewed as the emergence of a superhuman intelligence via the self-amplifying development of artificial intelligence. The situated and embodied view of cognition rejects the notion that intelligence could arise in a closed 'brain-in-a-vat' system, because intelligence is rooted in a high-bandwidth, sensory-motor interaction with the outside world. Instead, it is proposed that superhuman intelligence can emerge only in a distributed fashion, in the form of a self-organizing network of humans, computers, and other technologies: the 'Global Brain'.
While art and science still functioned side-by-side during the Renaissance, their methods and perspectives diverged during the nineteenth century, creating a still-enduring separation between the "two cultures". Recently, artists and scientists have again begun to collaborate more frequently, as promoted most radically by the ArtScience movement. This approach aims at a true synthesis between the intuitive, imaginative methods of art and the rational, rule-governed methods of science. To prepare the ground for a theoretical synthesis, this paper surveys the fundamental commonalities and differences between science and art. Science and art are united in their creative investigation, where coherence, pattern or meaning play a vital role in the development of concepts, while relying on concrete representations to experiment with the resulting insights. On the other hand, according to the standard conception, science seeks an understanding that is universal, objective and unambiguous, while art focuses on unique, subjective and open-ended experiences. Both offer prospect and coherence, mystery and complexity, albeit with science preferring the former and art, the latter. The paper concludes with some examples of artscience works that combine all these aspects.
Testing the validity of knowledge requires formal expression of that knowledge. Formality of an expression is defined as the invariance, under changes of context, of the expression's meaning, i.e. the distinction which the expression represents. This encompasses both mathematical formalism and operational determination. The main advantages of formal expression are storability, universal communicability, and testability. They provide a selective edge in the Darwinian competition between ideas. However, formality can never be complete, as the context cannot be eliminated. Primitive terms, observation set-ups, and background conditions are inescapable parts of formal or operational definitions that all refer to a context beyond the formal system. Heisenberg's Uncertainty Principle and Gödel's Theorem provide special cases of this more universal limitation principle. Context-dependent expressions, on the other hand, have the benefit of being more flexible, intuitive and direct, and putting less strain on memory. It is concluded that formality is not an absolute property, but a context-dependent one: different people will apply different amounts of formality in different situations or for different purposes. Some recent computational and empirical studies of formality and contexts illustrate the emerging scientific investigation of this dependence.
A new conceptual framework is proposed to situate and integrate the parallel theories of Turchin, Powers, Campbell and Simon. A system is defined as a constraint on variety. This entails a 2 × 2 × 2 classification scheme for “higher-order” systems, using the dimensions of constraint, (static) variety, and (dynamic) variation. The scheme distinguishes two classes of metasystems from supersystems and other types of emergent phenomena. Metasystems are defined as constrained variations of constrained variety. Control is characterized as a constraint exerted by a separate system. The emergence of hierarchical systems is motivated by evolutionary principles. The positive feedback between variety and constraint, which underlies the “branching growth of the penultimate level,” leads to the interpretation of metasystem transitions (MSTs) as phases of accelerated change in a continuous evolutionary progression toward increasing variety. The most important MSTs in the history of evolution are reinterpreted in this framework: mechanical motion, dissipative structuration, life, multicellular differentiation, sexuality, simple reflex, complex reflex, associating, thinking, metarationality and social interaction.
Heylighen, F. (1995). Selection of organization at the social level: Obstacles and facilitators of metasystem transitions. World Futures 45 (The Quantum of Evolution): 181–212.
The Principia Cybernetica Project was created to develop an integrated philosophy or world view, based on the theories of evolution, self-organization, systems and cybernetics. Its conceptual network has been implemented as an extensive website. The present paper reviews the assumptions behind the project, focusing on its rationale, its philosophical presuppositions, and its concrete methodology for computer-supported collaborative development. Principia Cybernetica starts from a process ontology, where a sequence of elementary actions produces ever more complex forms of organization through the mechanisms of variation and selection and of metasystem transitions. Its epistemology is constructivist and evolutionary: models are constructed by subjects for their own purposes, but undergo selection by the environment. Its ethics takes fitness and the continuation of evolution as the basic value, and derives more concrete guidelines from this implicit purpose. Together, these postulates and their implications provide answers to a range of age-old philosophical questions.
Meme replication is described as a 4-stage process, consisting of assimilation, retention, expression and transmission. The effects of different objective, subjective, intersubjective and meme-centered selection criteria on each of these stages are discussed.
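Since a meme must pass all four stages in sequence to replicate, its overall fitness can be modelled as the product of per-stage success probabilities. A small illustration, with invented numbers:

```python
# Illustrative 4-stage meme replication pipeline (probabilities invented).
stages = {
    "assimilation": 0.8,   # is it noticed, understood and accepted?
    "retention":    0.6,   # is it stored in memory?
    "expression":   0.5,   # is it converted into behaviour or language?
    "transmission": 0.9,   # does the expression reach another host?
}

# A failure at any stage stops replication, so fitness is the product.
fitness = 1.0
for stage, p in stages.items():
    fitness *= p
print(f"replication fitness per host contact: {fitness:.3f}")   # 0.216
```

Selection criteria can then be read as forces that raise or lower individual factors: a simpler meme scores higher on assimilation, a more conformist one on transmission.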
The context of a linguistic expression is defined as everything outside the expression itself that is necessary for unambiguous interpretation of the expression. As meaning can be conveyed either by the implicit, shared context or by the explicit form of the expression, the degree of context-dependence or "contextuality" of communication will vary, depending on the situation and preferences of the language producer. An empirical measure of this variation is proposed, the "formality" or "F-score", based on the frequencies of different word classes. Nouns, adjectives, articles and prepositions are more frequent in low-context or "formal" types of expression; pronouns, adverbs, verbs and interjections are more frequent in high-context styles. This measure adequately distinguishes different genres of language production using data for Dutch, French, Italian, and English. Factor analyses applied to data in 7 different languages produce a similar factor as the most important one. Both the data and the theoretical model suggest that contextuality decreases when unambiguous understanding becomes more important or more difficult to achieve, when the separation in space, time or background between the interlocutors increases, and when the speaker is male, introverted and/or academically educated.
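In code, the F-score reduces to a simple linear combination of word-class frequencies. The formula below follows Heylighen and Dewaele's published definition, with frequencies expressed as percentages of all words; the sample counts are invented for illustration.

```python
# F-score: formality as a linear combination of word-class frequencies.
def f_score(freq):
    """freq: word class -> percentage of all words in the sample."""
    # Classes that anchor meaning in explicit form raise F...
    formal = freq["noun"] + freq["adjective"] + freq["preposition"] + freq["article"]
    # ...classes that defer to the shared context lower it.
    contextual = freq["pronoun"] + freq["verb"] + freq["adverb"] + freq["interjection"]
    return (formal - contextual + 100) / 2   # rescaled to roughly 0..100

sample = {"noun": 28.0, "adjective": 8.0, "preposition": 12.0, "article": 10.0,
          "pronoun": 6.0, "verb": 15.0, "adverb": 4.0, "interjection": 0.5}
print(f_score(sample))   # about 66: a rather formal, low-context sample
```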
This short comment confirms Longo’s observation about the importance of symmetries for understanding space and time, but raises the additional issue of the transition from reversible to irreversible transformations.
Given that knowledge consists of finite models of an infinitely complex reality, how can we explain that it is nevertheless reliable most of the time? Survival in a variable environment requires an internal model whose complexity (variety) matches the complexity of the environment that is to be controlled. The reduction of the infinite complexity of the sensed environment to a finite map requires a strong mechanism of categorization. A measure of cognitive complexity (C) is defined, which quantifies the average amount of trial-and-error needed to find the adequate category. C can be minimized by "probability ordering" of the possible categories, where the most probable alternatives ("defaults") are explored first. The reduction of complexity by such ordering requires a low statistical entropy for the cognized environment. This entropy is automatically kept down by the natural selection of "fit" configurations. The high probability, "default" cognitive categorizations are then merely mappings of environmentally "fit" configurations.
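A worked example (with an invented probability distribution) makes the benefit of probability ordering concrete: exploring categories from most to least probable minimizes the expected number of trials, and the saving over blind search grows as the distribution becomes more peaked, i.e. as its entropy decreases.

```python
# Expected trial-and-error cost with and without probability ordering.
probs = [0.5, 0.25, 0.15, 0.07, 0.03]   # hypothetical category probabilities

def expected_trials(order):
    """Mean number of categories tried before hitting the correct one."""
    return sum(rank * p for rank, p in enumerate(order, start=1))

ordered = sorted(probs, reverse=True)   # "defaults" explored first
print(expected_trials(ordered))         # about 1.9 trials on average
print((len(probs) + 1) / 2)             # 3.0 for a blind, unordered search
```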
Despite the intrinsic complexity of integrating individual, social and technologically supported intelligence, the paper proposes a relatively simple ‘connectionist’ framework for conceptualizing distributed cognitive systems. Shared information sources (documents) are represented as nodes connected by links of variable strength, which increases each time the documents co-occur in usage patterns. This learning procedure captures and exploits its users’ implicit knowledge to help them find relevant information, thus supporting an unconscious form of exchange. These principles are applied to a concrete problem domain: architects sharing design knowledge through a database of associatively connected building projects.
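A minimal sketch of this learning procedure (the increment size, function names and data are illustrative assumptions, not the paper's implementation): links strengthen whenever documents are consulted together in a session, and the strongest links to a given document then act as recommendations.

```python
# Hebbian-style co-occurrence learning over a network of documents.
from collections import defaultdict
from itertools import combinations

strength = defaultdict(float)            # (doc_a, doc_b) -> link strength

def record_session(docs, increment=1.0):
    """Reinforce links between all documents co-consulted in one session."""
    for a, b in combinations(sorted(set(docs)), 2):
        strength[(a, b)] += increment

def related(doc, top=3):
    """Documents most strongly linked to `doc`, in descending strength."""
    scores = {(set(pair) - {doc}).pop(): s
              for pair, s in strength.items() if doc in pair}
    return sorted(scores, key=scores.get, reverse=True)[:top]

record_session(["projectA", "projectB", "projectC"])
record_session(["projectA", "projectB"])
print(related("projectA"))   # ['projectB', 'projectC']
```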
Quantum phenomena are notoriously difficult to grasp. The present paper first reviews the most important quantum concepts in a non-technical manner: superposition, uncertainty, collapse of the wave function, entanglement and non-locality. It then tries to clarify these concepts by examining their analogues in complex, self-organizing systems. These include bifurcations, attractors, emergent constraints, order parameters and non-local correlations. They are illustrated with concrete examples that include Rayleigh–Bénard convection, social self-organization and Gestalt perception of ambiguous figures. In both cases, quantum and self-organizing, the core process appears to be a symmetry breaking that irreversibly and unpredictably “collapses” an ambiguous state into one of a number of initially equivalent “eigenstates” or “attractors”. Some speculations are proposed about the non-linear amplification of quantum fluctuations of the vacuum being ultimately responsible for such symmetry breaking.
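A toy dynamical model (invented for illustration, not from the paper) shows such symmetry breaking: the system dx/dt = x - x³ has two equivalent attractors at +1 and -1, and an arbitrarily small fluctuation around the unstable symmetric state x = 0 unpredictably "collapses" the state into one of them.

```python
# Toy symmetry breaking: noise selects one of two equivalent attractors.
import random

def settle(noise=1e-9, dt=0.01, steps=5000, seed=None):
    random.seed(seed)
    x = 0.0                                  # symmetric, ambiguous state
    for _ in range(steps):
        # Deterministic drift toward +/-1, plus a tiny random fluctuation.
        x += (x - x**3) * dt + random.uniform(-noise, noise)
    return x                                 # ends near +1 or -1

# Which attractor wins is irreversible and unpredictable, e.g. [1, -1, 1, ...]
print([round(settle(seed=s)) for s in range(6)])
```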
The scientific worldview is based on laws, which are supposed to be certain, objective, and independent of time and context. The narrative worldview found in literature, myth and religion, is based on stories, which relate the events experienced by a subject in a particular context with an uncertain outcome. This paper argues that the concept of “agent”, supported by the theories of evolution, cybernetics and complex adaptive systems, allows us to reconcile scientific and narrative perspectives. An agent follows a course of action through its environment with the aim of maximizing its fitness. Navigation along that course combines the strategies of regulation, exploitation and exploration, but needs to cope with often-unforeseen diversions. These can be positive (affordances, opportunities), negative (disturbances, dangers) or neutral (surprises). The resulting sequence of encounters and actions can be conceptualized as an adventure. Thus, the agent appears to play the role of the hero in a tale of challenge and mystery that is very similar to the "monomyth", the basic storyline that underlies all myths and fairy tales according to Campbell [1949]. This narrative dynamics is driven forward in particular by the alternation between prospect (the ability to foresee diversions) and mystery (the possibility of achieving an as yet absent prospect), two aspects of the environment that are particularly attractive to agents. This dynamics generalizes the scientific notion of a deterministic trajectory by introducing a variable “horizon of knowability”: the agent is never fully certain of its further course, but can anticipate depending on its degree of prospect.
We approach the problem of the extended mind from a radically non-dualist perspective. The separation between mind and matter is an artefact of the outdated mechanistic worldview, which leaves no room for mental phenomena such as agency, intentionality, or feeling. We propose to replace it by an action ontology, which conceives mind and matter as aspects of the same network of processes. By adopting the intentional stance, we interpret the catalysts of elementary reactions as agents exhibiting desires, intentions, and sensations. Autopoietic networks of reactions constitute more complex super-agents, which moreover exhibit memory, deliberation and sense-making. In the specific case of social networks, individual agents coordinate their actions via the propagation of challenges. The distributed cognition that emerges from this interaction cannot be situated in any individual brain. This non-dualist, holistic view extends and operationalises process metaphysics and Eastern philosophies. It is supported by both mindfulness experiences and mathematical models of action, self-organisation, and cognition.
We show that TTOM (thinking through other minds) has a lot to offer for the study of the evolution of cultures, but that this also brings to the fore the dark implications of TTOM, which remain unexposed in Veissière et al. Those implications lead us to move beyond a meme-centered or organism-centered concept of fitness based on free-energy minimization, toward a social system-centered view.
Standard ethical frameworks struggle to deal with transhumanism, ecological issues and the rising technodiversity because they are focused on guiding and evaluating human behavior. Ethics needs its Copernican revolution to be able to deal with all moral agents, including not only humans, but also artificial intelligent agents, robots or organizations of all sizes. We argue that embracing the complexity worldview is the first step towards this revolution, and that standard ethical frameworks are still entrenched in the Newtonian worldview. We first spell out the foundational assumptions of the Newtonian worldview, where all change is reduced to material particles following predetermined trajectories governed by the laws of nature. However, modern physical theories such as relativity, quantum mechanics, chaos theory and thermodynamics have drawn a much more confusing and uncertain picture, and inspired indecisive, subjectivist, relativist, nihilist or postmodern worldviews. Based on cybernetics, systems theory and the new sciences of complexity, we introduce the complexity worldview that sees the world as interactions and their emergent organizations. We use this complexity worldview to show the limitations of standard ethical frameworks such as deontology, theology, consequentialism, virtue ethics, evolutionary ethics and pragmatism. Keywords: complexity, philosophy, ethics, cybernetics, transhumanism, universal ethics, systems ethics.