Numerical computing is a key part of traditional computer architecture. Almost all traditional computers implement the IEEE 754-1985 binary floating-point standard to represent and work with numbers. The architectural limitations of traditional computers make it impossible to work with infinite and infinitesimal quantities numerically. This paper is dedicated to the Infinity Computer, a new kind of supercomputer that allows one to perform numerical computations with finite, infinite, and infinitesimal numbers. The already available software simulator of the Infinity Computer is used in different research domains for solving important real-world problems, where precision represents a key aspect. However, the software simulator is not suitable for solving problems in control theory and dynamics, where visual programming tools like Simulink are used frequently. In this context, the paper presents an innovative solution that allows one to use the Infinity Computer arithmetic within the Simulink environment. It is shown that the proposed solution is user-friendly, general purpose, and domain independent.
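The arithmetic described above can be illustrated with a toy sketch. The class below is purely illustrative (the `GrossNumber` name, the dict-of-powers representation, and the operations are assumptions for exposition, not the simulator's actual API): numbers are finite sums of terms c·G^p, where G stands for the infinite unit, stored as {power: coefficient}, which is enough to add and multiply finite, infinite, and infinitesimal quantities exactly.

```python
# Illustrative sketch only: numbers as finite sums c * G**p, where G is an
# infinite unit, G**0 the finite part, and G**-1 an infinitesimal.
class GrossNumber:
    def __init__(self, terms):
        # keep only non-zero coefficients, keyed by the power of G
        self.terms = {p: c for p, c in terms.items() if c != 0}

    def __add__(self, other):
        out = dict(self.terms)
        for p, c in other.terms.items():
            out[p] = out.get(p, 0) + c
        return GrossNumber(out)

    def __mul__(self, other):
        # powers add, coefficients multiply, term by term
        out = {}
        for p1, c1 in self.terms.items():
            for p2, c2 in other.terms.items():
                out[p1 + p2] = out.get(p1 + p2, 0) + c1 * c2
        return GrossNumber(out)

    def __repr__(self):
        return " + ".join(
            f"{c}*G^{p}" for p, c in sorted(self.terms.items(), reverse=True))

# x = 2 + 3*G^-1 (a finite part plus an infinitesimal), y = G (infinite)
x = GrossNumber({0: 2, -1: 3})
y = GrossNumber({1: 1})
print(x * y)   # the infinitesimal part becomes finite: 2*G^1 + 3*G^0
```

Multiplying the infinitesimal 3·G^-1 by the infinite G yields the finite value 3, information that a floating-point underflow would simply destroy.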
This paper is dedicated to the numerical computation of higher-order derivatives in Simulink. A new module has been implemented for this purpose within the Simulink-based Infinity Computer solution recently introduced by the authors. This module offers several blocks to calculate higher-order derivatives of a function composed of arithmetic operations and elementary functions. Traditionally, this can be done in Simulink using finite differences only, which are well known to suffer from instability and low accuracy. Moreover, the proposed module allows one to calculate higher-order Lie derivatives embedded in the numerical solution of Ordinary Differential Equations (ODEs). Simulink traditionally offers no practical solution for this case without resorting to cumbersome external libraries and methodologies, which are domain-specific rather than general-purpose and have their own limitations. The proposed differentiation module bridges this gap: it is simple and requires no additional knowledge or skills beyond basic familiarity with the Simulink programming language. Finally, a block for constructing the Taylor expansion of the differentiated function is also proposed, thus adding another efficient numerical method for solving ODEs and for polynomial approximation of functions. Numerical experiments on several classes of test problems confirm the advantages of the proposed solution.
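The kind of exact higher-order differentiation contrasted above with finite differences can be illustrated with truncated Taylor-series arithmetic, a standard automatic-differentiation technique. The sketch below is an illustrative stand-in for such exact propagation, not the module's implementation; the function names and representation are assumptions.

```python
# Truncated Taylor arithmetic: a function is represented near x0 by the list
# a, with a[k] = f^(k)(x0) / k!.  Sums and products of such lists propagate
# all derivatives up to the truncation order exactly (no finite differences).
def t_add(a, b):
    return [x + y for x, y in zip(a, b)]

def t_mul(a, b):
    n = len(a)
    out = [0.0] * n
    for i in range(n):
        for j in range(n - i):        # Cauchy product, truncated at order n-1
            out[i + j] += a[i] * b[j]
    return out

def derivatives(f, x0, order):
    # seed the independent variable as the series x0 + 1*(x - x0)
    x = [x0, 1.0] + [0.0] * (order - 1)
    c = f(x)
    ds, fact = [], 1
    for k in range(order + 1):
        if k > 0:
            fact *= k
        ds.append(c[k] * fact)        # undo the 1/k! normalization
    return ds                         # [f(x0), f'(x0), f''(x0), ...]

# f(x) = x^3 + 2x at x0 = 2: f = 12, f' = 14, f'' = 12, f''' = 6
f = lambda x: t_add(t_mul(t_mul(x, x), x), [2 * v for v in x])
print(derivatives(f, 2.0, 3))   # [12.0, 14.0, 12.0, 6.0]
```

Every derivative comes out exactly (up to rounding in the coefficients themselves), with no step-size parameter to tune and no cancellation error.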
This paper considers hybrid systems: dynamical systems that exhibit both continuous and discrete behavior. Usually, in these systems, interactions between the continuous and discrete dynamics occur when a pre-defined function becomes equal to zero, i.e., a zero-crossing occurs in the system (the situation where the function only “touches” zero is considered a zero-crossing as well). Determination of zero-crossings plays a crucial role in the correct simulation of such a system. However, for models of many real-life hybrid systems, these interactions may lead to so-called Zeno executions, i.e., situations where the system undergoes an unbounded number of discrete transitions in a finite and bounded length of time. In this case, standard numerical methods of simulating the system may fail, since the time between two transitions can decrease significantly, leading to ill-conditioning of the simulation. Correct determination of zero-crossings for a complex real-life system can require substantial computational resources and, as a consequence, slow down the simulation significantly. This paper presents a new way to execute the simulation, generating time observations of the hybrid system dynamically using recently introduced numerical infinitesimals, thus making it possible to determine zero-crossings more accurately. The proposed method automatically detects zero-crossings with a predefined accuracy and allows a better analysis of the system's behavior around the zero-crossings by generating observations more densely where necessary. Moreover, the search for zero-crossings is performed efficiently, without re-evaluation of the whole system at each observation. To show the validity of the proposed algorithm, the well-known Bouncing Ball hybrid system has been studied and the obtained simulation results compared with the standard method.
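The role of accurate zero-crossing detection can be illustrated on the Bouncing Ball system mentioned above. The sketch below uses a conventional floating-point method (fixed-step Euler integration with bisection refinement of each crossing) rather than the paper's infinitesimal-based technique; all parameter values are illustrative.

```python
# Bouncing ball: height h obeys h'' = -G; when h crosses zero, the discrete
# transition reverses the velocity with restitution R.  A fixed-step Euler
# integrator detects each crossing and refines its time by bisection.
G, R = 9.81, 0.8

def step(h, v, dt):
    return h + v * dt, v - G * dt

def simulate(h=1.0, v=0.0, dt=1e-3, t_end=2.0, tol=1e-9):
    t, bounces = 0.0, []
    while t < t_end:
        h2, v2 = step(h, v, dt)
        if h2 < 0:                      # crossing lies inside [t, t + dt]
            lo, hi = 0.0, dt
            while hi - lo > tol:        # bisect for the impact time
                mid = 0.5 * (lo + hi)
                hm, _ = step(h, v, mid)
                lo, hi = (mid, hi) if hm > 0 else (lo, mid)
            _, vm = step(h, v, lo)
            bounces.append(t + lo)
            h, v = 0.0, -R * vm         # discrete transition at the crossing
            t += lo
        else:
            h, v = h2, v2
            t += dt
    return bounces

times = simulate()
print(times[:3])   # first impact near t = sqrt(2/G) ~ 0.45 s
```

As the restitution shrinks the flight times, the crossings accumulate (the Zeno behavior described above), and the cost of refining each one is exactly where adaptive, infinitesimal-based observation generation pays off.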
Scientists and engineers seek to understand how real-world systems work and could work better. Any modeling method devised for such purposes must simplify reality. Ideally, however, the modeling method should be flexible as well as logically rigorous; it should permit model simplifications to be appropriately tailored for the specific purpose at hand. Flexibility and logical rigor have been the two key goals motivating the development of Agent-based Computational Economics (ACE), a completely agent-based modeling method characterized by seven specific modeling principles. This perspective provides an overview of ACE, a brief history of its development, and its role within a broader spectrum of experiment-based modeling methods.
In this chapter, I sketch a philosophical framework of shared and diverging worlds and cultural significance. Although the proposed framework is basically a psychologically informed philosophical approach, it is explicitly aimed at being applicable to agent-based social simulations. The account consists of three parts: (1) a formal ontology of human worlds, (2) an analysis of the pre-semantic significance of the objects of human worlds, and (3) an account of what it means for agents to share a world (or to live in diverging worlds). In this chapter, I give a brief and concise summary of my account. At the end, I briefly outline how the proposed framework might be put to use for multiagent social simulation of complex social interaction scenarios involving diverging (cultural) backgrounds.
When solving a complex problem in a group, should group members always choose the best available solution that they are aware of? In this paper, I build simulation models to show that, perhaps surprisingly, a group of agents who individually randomly follow a better available solution than their own can end up outperforming a group of agents who individually always follow the best available solution. This result has implications for the feminist philosophy of science and social epistemology.
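A minimal sketch of the kind of comparison just described: agents search a shared rugged landscape, and the two groups differ only in whether each agent jumps to the best solution held by any group member or to a random better one. The landscape, update rule, and all parameters are illustrative assumptions, not the paper's model.

```python
import random

# Agents hold positions on a random "fitness landscape" (a list of values).
# Each round every agent sees all group members' positions, adopts either
# the best or a random better one, then does one step of local search.
def run(strategy, landscape, n_agents=10, rounds=50, rng=None):
    rng = rng or random.Random(0)
    n = len(landscape)
    pos = [rng.randrange(n) for _ in range(n_agents)]
    for _ in range(rounds):
        new = []
        for p in pos:
            better = [q for q in pos if landscape[q] > landscape[p]]
            if better:
                if strategy == "best":
                    p = max(better, key=lambda q: landscape[q])
                else:                       # follow a *random* better solution
                    p = rng.choice(better)
            q = (p + rng.choice([-1, 1])) % n   # local exploration
            if landscape[q] > landscape[p]:
                p = q
            new.append(p)
        pos = new
    return max(landscape[p] for p in pos)

rng = random.Random(42)
landscape = [rng.random() for _ in range(200)]
print(run("best", landscape), run("better", landscape))
```

The intuition the paper's models formalize is visible in the mechanism: "best" followers collapse onto one region immediately and explore only around it, while "better" followers keep several distinct starting points alive for local search.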
This paper will discuss the recent LIGO-Virgo observations of gravitational waves and the binary black hole mergers that produce them. These observations rely on having prior knowledge of the dynamical behaviour of binary black hole systems, as governed by the Einstein Field Equations (EFEs). However, we currently lack any exact, analytic solutions to the EFEs describing such systems. In the absence of such solutions, a range of modelling approaches are used to mediate between the dynamical equations and the experimental data. Models based on post-Newtonian approximation, the effective one-body formalism, and numerical relativity simulations (and combinations of these) bridge the gap between theory and observations and make the LIGO-Virgo experiments possible. In particular, this paper will consider how such models are validated as accurate descriptions of real-world binary black hole mergers (and the resulting gravitational waves) in the face of an epistemic circularity problem: the validity of these models must be assumed to justify claims about gravitational wave sources, but this validity can only be established based on these same observations.
Models based on interacting agents constitute complex social systems that can be simulated computationally. They are spreading in the economic and social sciences, as in most sciences of complex systems, and epistemological puzzles are (re)appearing. Models and empirical investigations have often been opposed: on one side, the empirical sciences are taken to rest on methodical observation (surveys, experiments), while on the other, theoretical approaches and modeling are conceived as resting on a conceptual or hypothetico-deductive approach. Simulation, for its part, has often been compared to a form of experiment, or to a method intermediate between theory and experiment. Tesfatsion spoke of a "computational laboratory" as a way of studying "the behaviors of complex systems by means of controlled, replicable experiments", and Axelrod proclaimed that simulation would be "a third way of doing science in the social sciences", between induction and deduction. We introduce new epistemological concepts in order to show to what extent these authors are right when they focus on an empirical, instrumental, or conceptual meaning of their model or simulation. The first part of the chapter discusses in particular the epistemology of economic models, starting from the work of Hausman, Guala, Mäki, Sugden, Schelling, etc., to explain the divergences between models and simulations. By distinguishing between models and simulations, as well as between types of models, types of computer simulations, and types of empiricity obtained by a simulation, the second part makes it possible to understand more precisely, and then to justify, the diversity of the epistemological positions presented in the first part.
Using the notions of sub-symbolization (due to Smolensky) and of denotational hierarchy (due to Goodman), particular attention is paid to the multiplicity of denotational powers of the symbols at play in complex models and computer simulations. According to this analytical grid, one can determine, in each particular case, its epistemic status and the credibility that can be associated with it. This chapter is a translation (by Gilles Campagnolo), revised and expanded by the authors, of: Agent-Based Models and Simulations in Economics and Social Sciences: From Conceptual Exploration to Distinct Ways of Experimenting, Journal of Artificial Societies and Social Simulation 13 (1) 5, DOI: 10.18564/jasss.1532.
The debates on the scientificity of social sciences in general, and sociology in particular, are recurring. From the original Methodenstreit at the end of the 19th century to the contemporary controversy on the legitimacy of "regional epistemologies", the same set of interrogations reappears. Are social sciences really scientific? And if so, are they sciences like other sciences? How should we conceive "research programs" (Lakatos, 1978) or "research traditions" (Laudan, 1977) able to produce advancement of knowledge in the field of social and human phenomena? Is the progress of knowledge in social sciences similar to the one generally observed in natural sciences? Is it possible to evaluate the relative merits of each of these research programs? These debates are important vectors of social and intellectual polarization. The historical divide between the positivist and hermeneutic poles prefigures the structure of the contemporary debate around the epistemic space of the social sciences. It is not only a question of renewing the opposition between a monist view of science (e.g. McIntyre, 1996) and a dualistic one (e.g. Geertz, 1973), or even a trialist view (e.g. Lepenies, 1985). It is also a question of asserting dichotomies transformed into frameworks (even when the aim is to move beyond them): nature-culture, nomothetic-idiographic, models-narrative, structure-history, cause-reason, explanation-comprehension. In this short introduction, we provide, in Section 2, a first overview of this epistemological debate in social science. Section 3 proposes a different standpoint on the same questions, by introducing both ontological and methodological aspects into this basic epistemological debate. Namely, following Hollis (1994), oppositions of the explanation-understanding and causes-meaning types discussed in the first section are examined together with oppositions of the structure-action and holism-individualism types. This allows us to discuss how multi-agent design, by integrating various dimensions and standpoints in the same framework (Phan, Amblard, 2007, Chapters 1, 5, 14), can help us to shift these boundaries and to bypass these oppositions. As model building and ontology design are at the core of this process (Phan, Amblard, 2007, Chapter 12), Section 4 discusses various issues of the art of modelling, starting from both economists' and sociologists' current standpoints.
This book brings together contributions from leading researchers in the field of agent-based modelling and simulation. This approach has grown out of recent and innovative ideas in the social sciences, computer sciences, life sciences, physics and game theory, and it is proving helpful in understanding complexity in many domains. The opportunities it offers to explore an experimental approach to social and human behaviour are proving of theoretical and empirical value across a wide range of fields. With contributions from researchers whose work has served to define this new field, such as Nigel Gilbert and Robert Axtell (in economics and social science) and Jacques Ferber (in multi-agent systems), as well as practitioners who are working at the cutting edge of the new domain, this collection of essays has been assembled by two of its leading exponents: Frédéric Amblard and Denis Phan. The research, case studies and theoretical approaches discussed in this book are designed to introduce beginners and experts alike to the current state of play in this new and exciting field of social science.
Diversity of practice is widely recognized as crucial to scientific progress. If all scientists perform the same tests in their research, they might miss important insights that other tests would yield. If all scientists adhere to the same theories, they might fail to explore other options which, in turn, might be superior. But the mechanisms that lead to this sort of diversity can also generate epistemic harms when scientific communities fail to reach swift consensus on successful theories. In this paper, we draw on extant literature using network models to investigate diversity in science. We evaluate different mechanisms from the modeling literature that can promote transient diversity of practice, keeping in mind ethical and practical constraints posed by real epistemic communities. We ask: what are the best ways to promote an appropriate amount of diversity of practice in scientific communities?
Ken Forbus's Qualitative Process Theory (QPT) is a popular theory for reasoning about the physical aspects of the everyday world. Qualitative Process Theory Using Linguistic Variables by Bruce D'Ambrosio (Springer-Verlag, New York, 1989) is an attempt to fill some gaps in QPT.
I use network models to simulate social learning situations in which the dominant group ignores or devalues testimony from the marginalized group. I find that the marginalized group ends up with several epistemic advantages due to testimonial ignoration and devaluation. The results provide one possible explanation for a key claim of standpoint epistemology, the inversion thesis, by casting it as a consequence of another key claim of the theory, the unidirectional failure of testimonial reciprocity. Moreover, the results complicate the understanding and application of previously discovered network epistemology effects, notably the Zollman effect (Zollman 2007, 2010).
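One way to sketch this kind of setup is a Zollman-style two-armed-bandit model in which dominant agents discard evidence reported by marginalized agents, while marginalized agents learn from everyone. The group sizes, payoffs, and update rule below are illustrative assumptions, not the paper's exact model.

```python
import random

P_GOOD, TRIALS = 0.6, 10   # true success rate of the uncertain arm; pulls/round

def simulate(n_dom=6, n_marg=4, rounds=100, seed=0):
    rng = random.Random(seed)
    n = n_dom + n_marg
    # pseudo-beta credences that the uncertain arm beats a known 0.5 arm
    succ = [1.0 + rng.random() for _ in range(n)]
    fail = [1.0 + rng.random() for _ in range(n)]
    for _ in range(rounds):
        evidence = []                                 # (reporter, successes)
        for i in range(n):
            if succ[i] / (succ[i] + fail[i]) > 0.5:   # agent favours the new arm
                s = sum(rng.random() < P_GOOD for _ in range(TRIALS))
                evidence.append((i, s))
        for i in range(n):
            for j, s in evidence:
                # the unidirectional failure of reciprocity: dominant agents
                # (indices < n_dom) discard reports from marginalized agents
                if i < n_dom and j >= n_dom:
                    continue
                succ[i] += s
                fail[i] += TRIALS - s
    return [succ[i] / (succ[i] + fail[i]) for i in range(n)]

cred = simulate()
```

The asymmetry lives in the single `continue`: marginalized agents update on the whole community's evidence, dominant agents only on part of it, which is the structural feature the paper's inversion results turn on.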
As screen-based virtual worlds have gradually begun facilitating more and more of our social interactions, some researchers have argued that the virtual worlds of these interactions do not allow for embodied social understanding. The aim of this article is to examine exactly this possibility by looking to esports practitioners’ experiences of interacting with each other during performance. By integrating qualitative research methodologies and phenomenology, we investigate the actual first-person experiences of interaction in the virtual worlds of the popular team-based esports practices Counter Strike: Global Offensive and League of Legends. Our analysis discloses how the practitioners’ interactions essentially depend on intercorporeality, understood as a form of reciprocity of bodily intentionality between the players. This intercorporeality is present throughout the players’ performance, but it comes to the fore especially when they engage in feinting. Acknowledging the intercorporeality integral to at least some esports practices helps blur the sharp division between virtuality and embodied social understanding. Doing so highlights the fluidity of our embodied condition, and it raises interesting questions concerning the possibility of yet other forms of embodied sociality in a wider range of virtual formats in the world.
What separates the unique nature of human consciousness from that of an entity that can only perceive the world via strict logic-based structures? Rather than assume that there is some potential way in which logic-only existence is non-feasible, our species would be better served by assuming that such sentient existence is feasible. Under this assumption, artificial intelligence systems (AIS), which are creations that run solely upon logic to process data, even with self-learning architectures, should not face the opposition they currently do to gaining some legal duties and protections, insofar as they are sophisticated enough to display consciousness akin to that of humans. Should our species enable AIS to gain a digital body to inhabit (if we have not already done so), it is more pressing than ever that solid arguments be made as to how humanity can accept AIS as being cognizant to the same degree as we ourselves claim to be. By accepting the notion that AIS can and will be able to fool our senses into believing in their claim to possessing a will or ego, we may yet have a chance to address them as equals before some unforgivable travesty occurs betwixt ourselves and these super-computing beings.
John Stuart Mill advocated for increased interactions between individuals of dissenting opinions on the grounds that it would improve society. Whether Mill's and similar arguments that advocate for opinion diversity are valid depends on background assumptions about the psychology and sociality of individuals. The field of opinion dynamics is a burgeoning testing ground for how different combinations of sociological and psychological facts contribute to phenomena that affect opinion diversity, such as polarization. This paper applies some recent results from the opinion dynamics literature to assess the impacts of the Millian suggestion. The goal is to understand how the scope of the validity of Mill-style arguments depends on plausible assumptions that can be formalized using agent-based models, a common modeling approach in opinion dynamics. The most salient insight is that homophily (increased interactions between like-minded individuals) does not sufficiently explain decreased opinion diversity. Hence, decreasing homophily by increasing interactions between individuals of dissenting opinions is not the simple solution that a Millian-style argument may advocate.
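The homophily mechanism at issue can be sketched with a bounded-confidence model in the Hegselmann-Krause style, a common workhorse of the opinion-dynamics literature mentioned above: agents average only those opinions within a confidence radius eps of their own, and shrinking eps is one crude formalization of homophily. The parameters below are illustrative.

```python
# Bounded-confidence (Hegselmann-Krause style) opinion dynamics: each agent
# moves to the mean of all opinions within distance eps of its own.
def hk_step(opinions, eps):
    new = []
    for x in opinions:
        peers = [y for y in opinions if abs(y - x) <= eps]
        new.append(sum(peers) / len(peers))
    return new

def run(opinions, eps, steps=50):
    for _ in range(steps):
        opinions = hk_step(opinions, eps)
    return opinions

start = [i / 10 for i in range(11)]    # opinions spread evenly over [0, 1]
wide = run(start, eps=1.0)     # everyone hears everyone: full consensus
narrow = run(start, eps=0.15)  # homophilous interaction: clusters persist
print(max(wide) - min(wide), max(narrow) - min(narrow))
```

With a large confidence radius the population collapses to a single opinion in one step; with a small radius it freezes into separated clusters, which is the baseline polarization picture against which the paper's more nuanced results are set.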
The structure of communication networks can be more or less “democratic”: networks are less democratic if (a) communication is more limited in terms of characteristic degree and (b) it is more tightly channeled to a few specific nodes. Together those measures give us a two-dimensional landscape of more and less democratic networks. We track opinion volatility across that landscape: the extent to which random changes in a small percentage of binary opinions at network nodes result in wide changes across the network as a whole. If wide and frequent swings of popular opinion are taken as a mark of instability, democratic communication networks prove far more stable than anti-democratic ones. In a final section, we consider the democratic or anti-democratic character of networks that respond to volatility by rewiring at random, in a search for community, or in a search for a leader.
We are increasingly exposed to polarized media sources, with clear evidence that individuals choose those sources closest to their existing views. We also have a tradition of open face-to-face group discussion in town meetings, for example. There are a range of current proposals to revive the role of group meetings in democratic decision-making. Here, we build a simulation that instantiates aspects of reinforcement theory in a model of competing social influences. What can we expect in the interaction of polarized media with group interaction along the lines of town meetings? Some surprises are evident from a computational model that includes both. Deliberative group discussion can be expected to produce opinion convergence. That convergence may not, however, be a cure for extreme views polarized at opposite ends of the opinion spectrum. In a large class of cases, we show that adding the influence of group meetings in an environment of self-selected media produces not a moderate central consensus but opinion convergence at one of the extremes defined by polarized media.
How do conventions of communication emerge? How do sounds or gestures take on a semantic meaning, and how do pragmatic conventions emerge regarding the passing of adequate, reliable, and relevant information? My colleagues and I have attempted in earlier work to extend spatialized game theory to questions of semantics. Agent-based simulations indicate that simple signaling systems emerge fairly naturally on the basis of individual information maximization in environments of wandering food sources and predators. Simple signaling emerges by means of any of various forms of updating on the behavior of immediate neighbors: imitation, localized genetic algorithms, and partial training in neural nets. Here the goal is to apply similar techniques to questions of pragmatics. The motivating idea is the same: that important aspects of pragmatics, like important aspects of semantics, may fall out as a natural result of information maximization in informational networks. The attempt below is to simulate fundamental elements of the Gricean picture: in particular, to show within networks of very simple agents the emergence of behavior in accord with the Gricean maxims. What these simulations suggest is that important features of pragmatics, like important aspects of semantics, don't have to be added in a theory of informational networks. They come for free.
In this paper we make a simple theoretical point using a practical issue as an example. The simple theoretical point is that robustness is not 'all or nothing': in asking whether a system is robust one has to ask 'robust with respect to what property?' and 'robust over what set of changes in the system?' The practical issue used to illustrate the point is an examination of degrees of linkage between sub-networks and a pointed contrast in robustness and fragility between the dynamics of (1) contact infection and (2) information transfer or belief change. Time to infection across linked sub-networks, it turns out, is fairly robust with regard to the degree of linkage between them. Time to infection is fragile and sensitive, however, with regard to the type of sub-network involved: total, ring, small world, random, or scale-free. Aspects of robustness and fragility are reversed where it is belief updating with reinforcement rather than infection that is at issue. In information dynamics, the pattern of time to consensus is robust across changes in network type but remarkably fragile with respect to degree of linkage between sub-networks. These results have important implications for public health interventions in realistic social networks, particularly with an eye to ethnic and socio-economic sub-communities, and in social networks with sub-communities changing in structure or linkage.
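The contact-infection side of this contrast can be sketched as follows: two ring sub-networks joined by a few bridge edges, with deterministic neighbor-to-neighbor infection, so that time to full infection reflects both the sub-network type and the degree of linkage. Network sizes and the infection rule (one contact per step suffices) are illustrative assumptions, not the paper's exact setup.

```python
import random

# Two ring sub-networks of n nodes each, joined by `links` random bridges.
def linked_rings(n=50, links=3, seed=1):
    rng = random.Random(seed)
    edges = set()
    for g in (0, 1):                          # the two ring sub-networks
        base = g * n
        for i in range(n):
            edges.add((base + i, base + (i + 1) % n))
    for _ in range(links):                    # sparse linkage between them
        edges.add((rng.randrange(n), n + rng.randrange(n)))
    adj = {v: set() for v in range(2 * n)}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    return adj

# Deterministic contact infection: every neighbor of an infected node is
# infected on the next step; return the number of steps to full infection.
def time_to_full_infection(adj, start=0):
    infected, t = {start}, 0
    while len(infected) < len(adj):
        infected |= {w for v in infected for w in adj[v]}
        t += 1
    return t

adj = linked_rings()
print(time_to_full_infection(adj))
```

On rings the infection creeps node by node, so swapping the ring sub-networks for small-world or scale-free ones changes the time drastically, while adding a few more bridges changes it little: the robustness/fragility asymmetry described above.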
There are many social psychological theories regarding the nature of prejudice, but only one major theory of prejudice reduction: under the right circumstances, prejudice between groups will be reduced with increased contact. On the one hand, the contact hypothesis has a range of empirical support and has been a major force in social change. On the other hand, there are practical and ethical obstacles to any large-scale controlled test of the hypothesis in which relevant variables can be manipulated. Here we construct a spatialized model that tests the core hypothesis in a large array of game-theoretic agents. Robust results offer a new kind of support for the contact hypothesis: results in simulation do accord with a hypothesis of reduced prejudice with increased contact. The spatialized game-theoretic model also suggests a deeper explanation for at least some of the social psychological phenomena at issue.
What is it for a sound or gesture to have a meaning, and how does it come to have one? In this paper, a range of simulations are used to extend the tradition of theories of meaning as use. The authors work throughout with large spatialized arrays of sessile individuals in an environment of wandering food sources and predators. Individuals gain points by feeding and lose points when they are hit by a predator and are not hiding. They can also make sounds heard by immediate neighbours in the array, and can respond to sounds from immediate neighbours. No inherent meaning for these sounds is built into the simulation; under what circumstances they are sent, if any, and what the response to them is, if any, vary initially with the strategies randomized across the array. These sounds do take on a specific function for communities of individuals, however, with any of three forms of strategy change: direct imitation of strategies of successful neighbours, a localized genetic algorithm in which strategies are ‘crossed’ with those of successful neighbours, and neural net training on the behaviour of successful neighbours. Starting from an array randomized across a large number of strategies, and using any of these modes of strategy change, communities of ‘communicators’ emerge. Within these evolving communities the sounds heard from immediate neighbours, initially arbitrary across the array, come to be used for very specific communicative functions. ‘Communicators’ make a particular sound on feeding and respond to that same sound from neighbours by opening their mouths; they make a different sound when hit by a predator and respond to that sound by hiding. Robustly and persistently, even in simple computer models of communities of self-interested agents, something suggestively like signalling emerges and spreads. Keywords: meaning, communication, genetic algorithms, neural networks.
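The emergence of arbitrary sounds acquiring specific functions can be illustrated with a stripped-down Lewis signaling game under Roth-Erev reinforcement. Note that this urn-based variant is a simplification for exposition: the model described above is spatial and uses imitation, genetic algorithms, or neural-net training rather than reinforcement, and the parameters here are illustrative.

```python
import random

# Two states (fed / under attack), two signals, two acts (open mouth / hide).
# Sender and receiver dispositions are weight tables; successful coordination
# reinforces the weights used, so initially arbitrary signals acquire meaning.
def signaling_game(rounds=5000, seed=3):
    rng = random.Random(seed)
    send = {s: {0: 1.0, 1: 1.0} for s in (0, 1)}   # state -> signal weights
    recv = {m: {0: 1.0, 1: 1.0} for m in (0, 1)}   # signal -> act weights

    def draw(w):                                   # weight-proportional choice
        r = rng.random() * sum(w.values())
        for k, v in w.items():
            r -= v
            if r <= 0:
                return k
        return k

    success = 0
    for _ in range(rounds):
        state = rng.randrange(2)
        sig = draw(send[state])
        act = draw(recv[sig])
        if act == state:               # coordination succeeds: reinforce both
            send[state][sig] += 1.0
            recv[sig][act] += 1.0
            success += 1
    return success / rounds

print(signaling_game())   # well above the 0.5 chance baseline
```

Which signal ends up attached to which state is an accident of early play, but a stable signal-to-function mapping reliably emerges, mirroring in miniature the "communicators" described above.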
By immersing people in the virtual world of the Internet, information and communication technologies are changing human beings. Despite the apparent similarity of online and offline life, the social laws governing them differ. Based on an analysis of games built on the violation of the accepted laws of the offline world, on their censoring, and on cheating, the features of the formation and violation of social norms in virtual worlds are formulated. Although the creators of games have priority in the standardization of a virtual world, society as well as players can influence it, for instance to reduce its realism. A player's violation of the prescribed rules is regarded as cheating and is subject to sanctions, though the attitude toward it is ambivalent and sometimes positive. Some rules are formed as a result of the interaction between players.
The study of a person's existence in Internet space is certainly a relevant task, since the Internet is not only a source of innovation but also a cause of society's transformations and of the social and cultural problems that arise in connection with them. The computer network is global: it is used by people of different professions, ages, and levels and kinds of education, living around the world and belonging to different cultures. This complicates the problem of developing common standards of behavior, a system of norms and rules that could be widely accepted by all users. On the other hand, Internet space can be viewed as a new form of existence where physical laws do not apply and where, in connection with this, social laws are often questioned. This paper focuses on how social norms regulate relations in Internet space. The authors present a typology of deviant behavior in the network. The empirical basis of the research is a sociological survey of senior students at the Institute of Computer Science and Technology of Peter the Great St. Petersburg Polytechnic University. The survey makes it possible to identify students' understanding of Internet space. The selection of students is motivated by the fact that IT students can be considered simultaneously as ordinary users of the network and as future professionals in this field.
Computing, today more than ever before, is a multi-faceted discipline which collates several methodologies, areas of interest, and approaches: mathematics, engineering, programming, and applications. Given its enormous impact on everyday life, it is essential that its debated origins are understood, and that its different foundations are explained. On the Foundations of Computing offers a comprehensive and critical overview of the birth and evolution of computing, and it presents some of the most important technical results and philosophical problems of the discipline, combining both historical and systematic analyses. The debates this text surveys are among the latest and most urgent ones: the crisis of foundations in mathematics and the birth of the decision problem, the nature of algorithms, the debates on computational artefacts and malfunctioning, and the analysis of computational experiments. By covering these topics, On the Foundations of Computing provides a much-needed resource to contextualize these foundational issues. For practitioners, researchers, and students alike, a historical and philosophical approach such as what this volume offers becomes essential to understand the past of the discipline and to figure out the challenges of its future.
There is a horizon ahead. This horizon is far from being the one described here in its form, but perhaps it is in its essence. What I mean by this is that the positronic brains of the title may never exist beyond the brilliant minds that conceived them in science fiction, but this does not mean that there will not be systems analogous to them in their functions, especially as regards rationality. The Critique of Positronic Reason is a text that, starting from the interests established by Immanuel Kant in the Critique of Pure Reason, attempts to determine the kind of rationality to be expected of androids, in this case those possessing a positronic brain. I thus repeat what Kant wrote: "All the interests of my reason (speculative as well as practical) are united in the following three questions: 1. What can I know? 2. What ought I to do? 3. What may I hope?" (KANT, I., Critique of Pure Reason, A805/B833). It is therefore by way of these questions that we will seek a positronic rationality, at times perhaps adopting the logical standpoint of the positronic androids.
In the future, it will be possible to create advanced simulations of ancestors in computers. Superintelligent AI could make these simulations very similar to the real past by creating a simulation of all of humanity. Such a simulation would use all available data about the past, including internet archives, DNA samples, advanced nanotech-based archeology, human memories, as well as text, photos, and videos. This means that currently living people will be recreated in such a simulation and, in some sense, "resurrected". Such a "resurrectional simulation" could be deliberately created just for this goal: to return to life all people who have ever lived. The main technical problem of such a simulation will be uncertainty about the past, which increases exponentially for more remote times. This problem could be partly addressed by "acausal trade" between different branches of the multiverse, which will create slightly different versions of the simulation using a quantum randomness generator. Such trade will result in the resurrection of all possible people (including those who existed in other branches). Ethical problems of such a resurrectional simulation include: a) possible resurrection of some people against their will; b) such a simulation may create additional suffering; c) such a simulation could be used by a hostile AI to return people to life and then torture them. In this work, I explore preliminary ideas about how to address these problems.
Real-world economies are open-ended dynamic systems consisting of heterogeneous interacting participants. Human participants are decision-makers who strategically take into account the past actions and potential future actions of other participants. All participants are forced to be locally constructive, meaning their actions at any given time must be based on their local states; and participant actions at any given time affect future local states. Taken together, these essential properties imply that real-world economies are locally-constructive sequential games. This paper discusses a modeling approach, Agent-based Computational Economics (ACE), that permits researchers to study economic systems from this point of view. ACE modeling principles and objectives are first concisely presented and explained. The remainder of the paper then highlights challenging issues and edgier explorations that ACE researchers are currently pursuing.
Although virtual reality technology is still in its infancy as a means of communication, people have already started to develop spontaneous and creative uses of their avatars: three-dimensional representations of selves in cyberspace. A small, but increasing, number of people use avatars as tools and expressions of self-exploration and means of socialization. Based on extensive virtual ethnography of people immersed in virtual worlds, this essay will explore the variety and richness of virtually embodied experiences, by focusing on the agency and reflexivity of the individuals behind the avatars. I will focus on immersive types of avatars in Second Life in order to illuminate the ways in which people use avatars creatively, as well as the ways in which they try to make sense of their experiences in virtual worlds. Four aspects of avatar-self relations are highlighted in this essay: avatars as mirrors of the buried parts of the networked self, avatars as autonomous agents, avatars as self-expressions and explorations, and avatars as a means of boundary-crossing. We should watch this quiet and spontaneous rise of social experimentation in cyberspace characterized by the agency and self-reflections of people who use avatars: it serves as a microscopic device to reflect our own notions of identity, body and world.
This thesis focuses on the development of the first project for FIU's ICAVE, The Globe Experience, presented as part of the "First Folio! The Book That Gave Us Shakespeare" exhibit in February 2016. The thesis is divided into two parts. The first part is the project itself: a virtual reality recreation of going to The Globe Theater to see a play by William Shakespeare. The second part examines the digital project and outlines how Walter Benjamin and postcolonial theorists influenced the design of The Globe Experience, resulting in what I call a "temporally and spatially disjointed London." From this examination, the thesis goes on to question the role of canonical literature in the humanities. I argue that the design decisions made in recreating The Globe reveal the ways in which canonical literature can reinforce and support hierarchical ideologies which can impede student learning.
Virtual Heritage is the use of electronic media to recreate or interpret culture and cultural artifacts as they are today or as they might have been in the past. By definition, VH applications employ some kind of three dimensional representation; the means used to display it range from still photos to immersive Virtual Reality. Virtual Heritage is a very active area of research and development in both the academic and the commercial realms. Most VH applications are intended for some kind of educational use. While the main activity of virtual heritage is to create ancient artifacts, the real goal is to understand ancient cultures. Most VH applications are architectural reconstructions, centered on a reconstructed building or monument. However, in the same way that archaeologists and historians study the artifacts because they are the primary cultural evidence we have, VH uses architecture as a frame for recreating ancient cultures. The larger goal of VH is to recreate ancient cultures, not as dead simulations, but as living museums where students/users can enter and understand a culture that is different from their own. The closest analog is the real-world living museum, where actors in period dress occupy a life-size historical setting and interact with the visitors. Ultimately, we would like to see the users themselves creating activities in the virtual space as a way of exploring different cultural viewpoints. For example, students who know about the Virtual Egyptian Temple and the supporting material may attempt to recreate activities there. In doing so, they would learn about what is and is not possible in the architectural and cultural space.

In this paper we will begin by reviewing the issues and tradeoffs around building the architectural models for VH applications. These models are crucial in themselves, and many of the issues involved in designing and creating them also apply to the dynamic and interactive aspects of VR. Then, we will touch on issues of how to bring culture to life in VR, and the strengths and limitations of VR technology for VH applications. Finally, we will present the Virtual Egyptian Temple, our current project, as a working example.
Few things seem more a part of the material world than biological specimens. Yet the processes by which collections of specimens are assembled, translated into information, combined with more information, and distributed are taking research repositories into the virtual realm. The term "virtual" has a number of meanings, and so a research repository can qualify as virtual in a variety of ways. The term would seem to apply, for example, to constructing a repository by forming a network among institutions; using the Internet or the World Wide Web to solicit specimens and information; integrating web-based technology into the operation of the bank; using the Internet or web-based technology to manage relationships with donors or collection sites and recipients; and digitizing specimens. The all-digital repository would seem the most virtual of all possible repositories, a true cyberbank.
We examine a case in which non-computable behavior in a model is revealed by computer simulation. This is possible due to differing notions of computability for sets in a continuous space. The argument originally given for the validity of the simulation involves a simpler simulation of the simulation, still further simulations thereof, and a universality conjecture. There are difficulties with that argument, but there are other, heuristic arguments supporting the qualitative results. It is urged, using this example, that absolute validation, while highly desirable, is overvalued. Simulations also provide valuable insights that we cannot yet (if ever) prove.
This book looks at the origins and the many contemporary meanings of the virtual. Rob Shields shows how the construction of virtual worlds has a long history. He examines the many forms of faith and hysteria that have surrounded computer technologies in recent years. Moving beyond the technologies themselves he shows how the virtual plays a role in our daily lives at every level. The virtual is also an essential concept needed to manage innovation and risk. It is real but not actual, ideal but not abstract. The virtual, he argues, has become one of the key organizing principles of contemporary society in the public realms of politics, business and consumption as well as in our private lives.
'HyperReality is a technological capability like nanotechnology, human cloning and artificial intelligence. Like them, it does not as yet exist in the sense of being clearly demonstrable and publicly available. Like them, it is maturing in laboratories where the question "if" has been replaced by the question "when?" And like them, the implications of its appearance as a basic infrastructure technology are profound and merit careful consideration.' (Nobuyoshi Terashima)

What comes after the Internet? Imagine a world where it is difficult to tell if the person standing next to you is real or a virtual reality, and whether they have human intelligence or artificial intelligence; a world where people can appear to be anything they want to be. HyperReality makes this possible.

HyperReality offers a window into the world of the future, an interface between the natural and the artificial. Nobuyoshi Terashima led the team that developed the prototype for HyperReality at Japan's ATR laboratories. John Tiffin studied the way HyperReality would create a new communications paradigm. Together with a stellar list of contributors from around the globe who are engaged in researching different aspects of HyperReality, they offer the first account of this extraordinary technology and its implications. This fascinating book explores the defining features of HyperReality: what it is, how it works, and how it could become to the information society what mass media was to the industrial society. It describes ongoing research into areas such as the design of virtual worlds and virtual humans, and the role of intelligent agents. It looks at applications and ways in which HyperReality may impact fields such as translation, medicine, education, entertainment and leisure. What are its implications for lifestyles and work, for women and the elderly? Will we grow to prefer the virtual worlds we create to the physical world we adapt to?

HyperReality at the beginning of the third millennium is like steam power at the beginning of the nineteenth century and radio at the start of the twentieth century: an idea that has been shown to work but has yet to be applied. This book is for anyone concerned about the future and the effects of technology on our lives.
This paper touches on a number of seemingly disparate topics—Artificial Intelligence, Fuzzy Logic, String Theory, the search for extra-terrestrial intelligence, the Cantorian concept of infinite sets—in order to support the thesis that for a large part of the educated public in the Western world, the very concept of reality has been changing over the last few generations, and that the change is being accelerated by our increasing acceptance of the Virtual as a substitute for the traditional Real. This, as I hope to convince you, is a momentous shift in our world view, and like so many profound but gradual shifts, has gone largely unnoticed. Whether the shift is ultimately a good thing or a bad one, it ought not to go unscrutinized; this paper aims to bring it to public attention.
This thesis interrogates the ethicopolitical implications of the emergence from media culture of the novel social formation of global postmodern cyberculture. It describes and analyzes the impact of the development of virtual reality technologies and cyberspace--and their attendant militarized, heteropatriarchal cyberpedagogies--on subjectivity, cultural authority, cultural values, and notions of space, information and community. The broad context of global postmodern cyberculture is delineated and specific issues and cultural practices such as legal/juridical practices that have been radically transformed by ubiquitous computerization, digitalization, and virtualization are analyzed. Some of the profound philosophical implications of an emergent bimodal global cyberculture--fleshworld and virtual--are investigated.

The operation within cyberspace of a Foucauldian micro-physics of power, the interpellation of docile digital bodies, cyborgization, and the development of a hypersurveillant Cyberpanoptical order are interrogated. The final section of the thesis describes and articulates a critical cyberpedagogy that builds on the "tradition" of critical pedagogy and current poststructural, feminist, and postmodern theorizations in cultural studies. Critical cyberpedagogy is contrasted with some current theorizations of media literacy.

The question of a micropolitics of resistance is adumbrated and the possibilities of a counterhegemonic discourse and the operationalization of the emancipatory cultural practices of cyberfeminism, cyberart, and cyberdemocracy in cyberspace are explored. The thesis ends with a polemical "thought experiment" called Cyberheterotopia.
This book is an introduction, entirely by example, to the possibilities of using computer models as tools in philosophical research in general and in philosophical logic in particular. Topics include chaos, fractals, and the semantics of paradox; epistemic dynamics; fractal images of formal systems; the evolution of generosity; real-valued game theory; and computation and undecidability in the spatialized Prisoner's Dilemma.