The Sinian Dengying Formation in the Sichuan Basin, southwest China, consisting mainly of dolomites, is one of the most ancient gas-producing series in the world. During the past half-century, gas exploration in the formation has largely been based on lithostratigraphic correlation, but a regional correlation scheme of time significance is usually lacking, making lateral correlation of strata between gas fields difficult. To overcome this problem, we interpreted about 2500 km of 2D regional seismic lines using the seismic sequence analysis method. As a result, a sequence stratigraphic framework was constructed, consisting of two sequences and five systems tracts. By integrating the analysis of isopach maps with stratal stacking patterns, we identify three depositional facies belts within the formation: a shallow-water platform facies belt in the eastern and southern regions, a relatively deep-water basin facies belt in the northwestern region, and a northwest-dipping slope facies belt between them. During the development of sequence one, in the lower part of the Dengying Formation, retrogradation and aggradation dominated in the eastern and southern platform region, whereas depositional condensation prevailed in the northwestern basin region; at that time, the depocenter was located on the eastern and southern platform. Sequence two, in the upper part of the Dengying Formation, is instead dominated by northwest-dipping sigmoid, oblique, and shingled prograding packages of the platform-margin slope facies belt, indicating that the depocenter had shifted to the former basin region in the northwest. As a result, the basin was filled gradually, and the platform-slope-basin topography finally evolved into a northwest-dipping ramp. Our study suggests that the Late Sinian Sichuan Basin likely consisted of a series of shallow-water platforms separated by relatively deep-water depressions or basins, which provides important clues for gas exploration.
In _Sino-Theology and the Philosophy of History_ Leopold Leeb presents the ideas of an influential Chinese intellectual, Liu Xiaofeng, whose approach to the question of a Christian theology for China is both controversial and inspiring.
There is a long-standing disagreement in the philosophy of probability and Bayesian decision theory about whether an agent can hold a meaningful credence about an upcoming action, while she deliberates about what to do. Can she believe that it is, say, 70% probable that she will do A, while she chooses whether to do A? No, say some philosophers, for Deliberation Crowds Out Prediction (DCOP), but others disagree. In this paper, we propose a valid core for DCOP, and identify terminological causes for some of the apparent disputes.
Can an agent deliberating about an action A hold a meaningful credence that she will do A? 'No', say some authors, for 'Deliberation Crowds Out Prediction' (DCOP). Others disagree, but we argue here that such disagreements are often terminological. We explain why DCOP holds in a Ramseyian operationalist model of credence, but show that it is trivial to extend this model so that DCOP fails. We then discuss a model due to Joyce, and show that Joyce's rejection of DCOP rests on terminological choices about terms such as 'intention', 'prediction', and 'belief'. Once these choices are in view, they reveal underlying agreement between Joyce and the DCOP-favouring tradition that descends from Ramsey. Joyce's Evidential Autonomy Thesis (EAT) is effectively DCOP, in different terminological clothing. Both principles rest on the so-called 'transparency' of first-person present-tensed reflection on one's own mental states.
This path-breaking volume will have a transformative impact on the field of early Chinese intellectual history and will be of great interest to scholars and students alike.
In this paper we explore the relationship between norms of belief revision that may be adopted by members of a community and the resulting dynamic properties of the distribution of beliefs across that community. We show that at a qualitative level many aspects of social belief change can be obtained from a very simple model, which we call ‘threshold influence’. In particular, we focus on the question of what makes the beliefs of a community stable under various dynamical situations. We also consider refinements and alternatives to the ‘threshold’ model, the most significant of which is to consider changes to plausibility judgements rather than mere beliefs. We show first that some such change is mandated by difficult problems with belief-based dynamics related to the need to decide on an order in which different beliefs are considered. Secondly, we show that the resulting plausibility-based account results in a deterministic dynamical system that is non-deterministic at the level of beliefs.
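The paper does not give code, but a minimal sketch of a generic 'threshold influence' update of the kind gestured at here (an agent adopts a belief once the proportion of neighbours holding it reaches a threshold) might look as follows. The graph structure, threshold values, and function names are illustrative assumptions, not the authors' formulation, and this monotone variant never drops beliefs.

```python
# Minimal sketch of a "threshold influence" belief dynamic (illustrative only).
# Each agent adopts belief in a proposition once the fraction of its
# neighbours believing it reaches that agent's personal threshold.

def threshold_step(beliefs, neighbours, thresholds):
    """One synchronous update; beliefs maps agent -> bool."""
    updated = {}
    for agent, believes in beliefs.items():
        nbrs = neighbours[agent]
        if not nbrs:
            updated[agent] = believes
            continue
        support = sum(beliefs[n] for n in nbrs) / len(nbrs)
        # Adopt the belief when enough neighbours hold it; otherwise keep the old state.
        updated[agent] = believes or support >= thresholds[agent]
    return updated

def run_until_stable(beliefs, neighbours, thresholds, max_rounds=100):
    """Iterate until the belief distribution is stable (a fixed point)."""
    for _ in range(max_rounds):
        new = threshold_step(beliefs, neighbours, thresholds)
        if new == beliefs:  # stability: no agent changes its mind
            return new
        beliefs = new
    return beliefs

# Example: a line of three agents; the middle agent is easily influenced.
if __name__ == "__main__":
    neighbours = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
    beliefs = {"a": True, "b": False, "c": False}
    thresholds = {"a": 0.9, "b": 0.5, "c": 0.9}
    print(run_until_stable(beliefs, neighbours, thresholds))
```

In this toy run the belief spreads from agent a through b to c, after which no further update changes anything, illustrating the kind of stability question the paper studies.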
Confucianism is often valued as a doctrine that highlights both the individual and social dimensions of the ideal person, for it indeed puts special emphasis on such lofty goals as loving all humanity and cultivating the self. Through a close and critical analysis of the texts of the Analects and the Mencius, however, this article attempts to demonstrate that because Confucius and Mencius always take filial piety, or, more generally, consanguineous affection, as not only the foundation but also the supreme principle of human life, the individual and social dimensions are inevitably subordinated to and substantially negated by the filial precisely within the Confucian framework, with the result that Confucianism in essence is neither collectivism nor individualism, but "consanguinitism."
_An Introduction to Chinese Philosophy_ unlocks the mystery of ancient Chinese philosophy and unravels the complexity of Chinese Buddhism by placing them in the contemporary context of discourse. Elucidates the central issues and debates in Chinese philosophy, its different schools of thought, and its major philosophers. Covers eight major philosophers in the ancient period, among them Confucius, Laozi, and Zhuangzi. Illuminates the links between different schools of philosophy. Opens the door to further study of the relationship between Chinese and Western philosophy.
Two alternative accounts of quantum spontaneous symmetry breaking (SSB) are compared and one of them, the decompositional account in the algebraic approach, is argued to be superior for understanding quantum SSB. Two exactly solvable models are given as applications of our account -- the Weiss-Heisenberg model for ferromagnetism and the BCS model for superconductivity. Finally, the decompositional account is shown to be more conducive to the causal explanation of quantum SSB.
In this paper, we first propose a simple formal language to specify types of agents in terms of necessary conditions for their announcements. Based on this language, types of agents are treated as ‘first-class citizens’ and studied extensively in various dynamic epistemic frameworks which are suitable for reasoning about knowledge and agent types via announcements and questions. To demonstrate our approach, we discuss various versions of Smullyan’s Knights and Knaves puzzles, including the Hardest Logic Puzzle Ever (HLPE) proposed by Boolos (in Harv Rev Philos 6:62–65, 1996). In particular, we formalize HLPE and verify a classic solution to it. Moreover, we propose a spectrum of new puzzles based on HLPE by considering subjective (knowledge-based) agent types and relaxing the implicit epistemic assumptions in the original puzzle. The new puzzles are harder than the previously proposed ones in the literature, in the sense that they require deeper epistemic reasoning. Surprisingly, we also show that a version of HLPE in which the agents do not know the others’ types does not have a solution at all. Our formalism paves the way for studying these new puzzles using automatic model checking techniques.
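The dynamic epistemic formalism itself is not reproduced in the abstract, but the flavour of treating agent types as objects to be solved for can be illustrated with a brute-force check of a classic two-agent Knights and Knaves puzzle. The puzzle chosen and the code are illustrative only and far simpler than HLPE.

```python
from itertools import product

# Classic puzzle: A says "We are both knaves." Knights always speak truly,
# knaves always speak falsely. Enumerate all type assignments and keep
# those consistent with the announcement.

def consistent_worlds():
    solutions = []
    for a_is_knight, b_is_knight in product([True, False], repeat=2):
        statement = (not a_is_knight) and (not b_is_knight)  # "we are both knaves"
        # A knight's statement must be true, a knave's must be false.
        if statement == a_is_knight:
            solutions.append(("knight" if a_is_knight else "knave",
                              "knight" if b_is_knight else "knave"))
    return solutions

print(consistent_worlds())  # [('knave', 'knight')]: A is a knave, B is a knight.
```

Model checking the epistemic puzzles discussed in the paper works in the same spirit, but over much richer models that also track what the agents know about each other's types.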
Phase transitions are well-understood phenomena in thermodynamics (TD), but it turns out that they are mathematically impossible in finite statistical-mechanical (SM) systems. Hence, phase transitions are truly emergent properties. They reappear in the thermodynamic limit (TL), i.e., in infinite systems; however, most, if not all, systems in which they occur are finite, so whence comes the justification for taking the TL? The problem is then traced back to the TD characterization of phase transitions, and it turns out that the characterization is the result of serious idealizations which, under suitable circumstances, approximate actual conditions.
Organizational wrongdoing damages organizational performance and limits the development of organizations. Although organizational members may know of the wrongdoing and have the opportunity to blow the whistle, they may keep silent because of the interpersonal risks. However, leaders can play an important role in shaping employee whistleblowing. This study focuses on uncovering the mechanisms by which authentic leaders influence employee whistleblowing, using a sample from China. Results demonstrate that authentic leadership is positively related to internal whistleblowing, and that team psychological safety and personal identification each partially mediate the relationship between authentic leadership and internal whistleblowing. The study contributes to the extant theory by filling the gap between leadership and whistleblowing.
We prove that RCA₀ + RT²₂ ⊬ WKL₀ by showing that for any set C not of PA-degree and any set A, there exists an infinite subset G of A or of its complement A̅, such that G ⊕ C is also not of PA-degree.
This article argues for an anti-deflationist view of scientific representation. Our discussion begins with an analysis of the recent Callender–Cohen deflationary view on scientific representation. We then argue that there are at least two radically different ways in which a thing can be represented: one is purely symbolic, and therefore conventional, and the other is epistemic. The failure to recognize that scientific models are epistemic vehicles rather than symbolic ones has led to the mistaken view that whatever distinguishes scientific models from other representational vehicles must merely be a matter of pragmatics. It is then argued that even though epistemic vehicles also contain conventional elements, they do their job of demonstration in spite of such elements.
Traditional theories construe approximate truth or truthlikeness as a measure of closeness to facts (singular facts), and idealization as an act of either assuming zero or otherwise very small differences from facts, or imagining ideal conditions under which scientific laws are either approximately true or will be so when the conditions are relaxed. I first explain the serious but not insurmountable difficulties for the theories of approximation, and then argue that more serious and perhaps insurmountable difficulties for the theory of idealization force us to sever its close tie to approximation. This leads to an appreciation of lawlikeness as a measure of closeness to laws, which I argue is the real measure of idealization, whose main purpose is to carve nature at its joints.
I first give a brief summary of a critique of the traditional theories of approximation and idealization; and after identifying one of the major roles of idealization as detaching component processes or systems from their joints, a detailed analysis is given of idealized laws – which are discoverable and/or applicable – in such processes and systems (i.e., idealized model systems). Then, I argue that dispositional properties should be regarded as admissible properties for laws and that such an inclusion supplies the much needed connection between idealized models and the laws they 'produce' or 'accommodate'. I then argue that idealized law-statements so produced or accommodated in the models may be either true simpliciter or true approximately, but the latter is not because of the idealizations involved. I argue that the kind of limiting-case idealizations that produce approximate truth are best regarded as approximation; and finally I compare my theory with some existing theories of laws of nature. "We seem to trace [in King Lear] ... the tendency of imagination to analyse and abstract, to decompose human nature into its constituent factors, and then to construct beings in whom one or more of these factors is absent or atrophied or only incipient."
(Recipient of the 2020 Everett Mendelsohn Prize.) This article revisits the development of the protoplasm concept as it originally arose from critiques of the cell theory, and examines how the term “protoplasm” transformed from a botanical term of art in the 1840s to the so-called “living substance” and “the physical basis of life” two decades later. I show that there were two major shifts in biological materialism that needed to occur before protoplasm theory could be elevated to equal status with cell theory in the nineteenth century. First, I argue that biologists had to accept that life could inhere in matter alone, regardless of form. Second, I argue that in the 1840s, ideas of what formless, biological matter was capable of changed dramatically: going from a “coagulation paradigm” that had existed since Theophrastus to a more robust conception of matter that was itself capable of movement and self-maintenance. In addition to revisiting Schleiden and Schwann’s original writings on cell theory, this article looks especially closely at Hugo von Mohl’s definition of the protoplasm concept in 1846 and how it differed from his primordial utricle theory of cell structure two years earlier. This article draws on Lakoff and Johnson’s theory of “ontological metaphors” to show that the cell, primordial utricle, and protoplasm can be understood as material container, object, and substance, respectively, and that these overlapping distinctions help explain the chaotic and confusing early history of cell theory.
We present the interval-valued intuitionistic fuzzy ordered weighted cosine similarity (IVIFOWCS) measure in this paper, which combines the interval-valued intuitionistic fuzzy cosine similarity measure with the generalized ordered weighted averaging operator. The main advantage of the IVIFOWCS measure is that it provides a parameterized family of similarity measures, so the decision maker can use it to consider a wide range of possibilities and select the aggregation operator in accordance with his or her interests. We study some of its main properties and particular cases, such as the interval-valued intuitionistic fuzzy ordered weighted arithmetic cosine similarity measure and the interval-valued intuitionistic fuzzy maximum cosine similarity measure. The IVIFOWCS measure is not only a generalization of several existing similarity measures but can also deal with the correlation of different decision matrices for interval-valued intuitionistic fuzzy values. Furthermore, we present an application of the IVIFOWCS measure to the group decision-making problem. Finally, the existing similarity measures are compared with the IVIFOWCS measure in an illustrative example.
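As a rough illustration of the building blocks only (not the full interval-valued intuitionistic fuzzy construction defined in the paper), the sketch below combines a component-wise cosine similarity with ordered weighted averaging (OWA) aggregation. The weighting vector, data, and function name are made up for the example.

```python
import numpy as np

def owa_cosine_similarity(x, y, weights):
    """Ordered weighted cosine similarity between two collections of
    feature vectors x[i], y[i] (one pair per criterion).

    The per-criterion cosine similarities are sorted in decreasing order
    and then aggregated with the OWA weighting vector.
    """
    sims = []
    for xi, yi in zip(x, y):
        xi, yi = np.asarray(xi, float), np.asarray(yi, float)
        sims.append(xi @ yi / (np.linalg.norm(xi) * np.linalg.norm(yi)))
    sims = np.sort(sims)[::-1]          # reorder: largest similarity first
    weights = np.asarray(weights, float)
    return float(weights @ sims)

# Toy example with three criteria and an OWA weighting vector summing to 1.
x = [[0.6, 0.3], [0.8, 0.1], [0.5, 0.5]]
y = [[0.5, 0.4], [0.7, 0.2], [0.9, 0.1]]
print(owa_cosine_similarity(x, y, weights=[0.5, 0.3, 0.2]))
```

The IVIFOWCS measure of the paper works analogously but on membership and non-membership intervals rather than plain numeric vectors, and the choice of weighting vector is what yields the parameterized family of measures mentioned above.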
Mandarin focus particles systematically have heterogeneous uses. By examining details of two focus particles, jiu ‘only’ and dou ‘even’, this paper explores the hypothesis that varieties of alternatives give rise to systematic ‘ambiguities’. Specifically, by positing sum-based alternative sets and atom-based ones, it maintains an unambiguous semantics of jiu as a weak ‘only’ and dou as ‘even’, while deriving their variability through interaction with alternatives. Independently motivated analyses of distributive/collective readings and contrastive topics, combined with varieties of alternatives, deliver the full range of facts concerning jiu and dou. Theoretically, the paper illustrates an integration of Link and Landman’s theory of pluralities into Rooth’s alternative semantics.
This paper offers a fine-grained analysis of different versions of the well-known sure-thing principle. We show that Savage's formal formulation of the principle, i.e., his second postulate (P2), is strictly stronger than what was originally intended.
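For readers who want the formal statement being compared, one standard rendering of Savage's second postulate is given below; the notation is ours, not the paper's.

```latex
% Savage's sure-thing postulate P2 (one standard rendering; notation ours).
% Requires amssymb for \succsim. For acts f, g, f', g' and an event E:
\begin{align*}
&\text{if } f(s) = f'(s),\; g(s) = g'(s) \text{ for all } s \in E, \\
&\text{and } f(s) = g(s),\; f'(s) = g'(s) \text{ for all } s \notin E, \\
&\text{then } f \succsim g \iff f' \succsim g'.
\end{align*}
```

That is, preference between two acts depends only on the states where they differ; the paper's claim is that this formal postulate is strictly stronger than the informal sure-thing reasoning it was meant to capture.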
The event-triggered consensus control of leader-following multiagent systems subject to external disturbances is investigated using output feedback. In particular, a novel distributed event-triggered protocol is proposed that adopts dynamic observers to estimate the internal state information from the measurable output signal. It is shown that under the developed observer-based event-triggered protocol, multiple agents reach consensus with the desired disturbance attenuation ability while exhibiting no Zeno behavior. Finally, a simulation is presented to verify the obtained results.
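The abstract does not reproduce the triggering rule; for orientation, a common form of static event-triggering condition used in consensus work of this kind is sketched below, in our notation and purely as an illustration, not necessarily the authors' exact condition.

```latex
% A typical event-triggering rule (illustrative; not necessarily the authors' condition).
% Agent i next broadcasts at time t^i_{k+1} when the measurement error
% e_i(t) = \hat{x}_i(t) - x_i(t) grows too large relative to the local
% disagreement with its neighbours:
\[
t^i_{k+1} = \inf\Big\{ t > t^i_k \;:\;
\|e_i(t)\|^2 \;\ge\; \sigma \Big\| \sum_{j \in \mathcal{N}_i} a_{ij}\,
\big(\hat{x}_i(t) - \hat{x}_j(t)\big) \Big\|^2 + \mu\, e^{-\nu t} \Big\},
\]
% with design constants \sigma, \mu, \nu > 0; the strictly positive exponential
% term is one standard device for excluding Zeno behaviour.
```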
Two types of guanxi have a close association with auditor independence in China: firm-level connections derived from state ownership and personal connections developed through management affiliations with external auditors. This article examines the effects of these two types of connection and their joint effect on audit quality. We find that state ownership and management affiliations with the external auditor both increase the probability of receiving a clean audit opinion in China. Furthermore, the probability increment brought by management affiliations for non-state-owned enterprises (NSOEs) is greater than that for state-owned enterprises (SOEs). These results suggest that state ownership and management affiliations are two important types of connection that impair auditor independence, and that management affiliations are of greater importance to private-sector firms than to SOEs.
This paper presents a novel fractional-order PID controller tuning strategy based on Bode’s optimal loop shaping, which is commonly used for LTI feedback systems. First, the controller parameters are obtained from the flat-phase property and Bode’s optimal reference model, so that the controlled system is robust to gain variations and can achieve the desired transient performance under various control requirements. Then, a robustness analysis of the controlled system is carried out to support the results. Furthermore, the parameter setting is analyzed to demonstrate the superiority of the proposed controller. Finally, simulation examples are given to verify the accuracy and usefulness of the proposed control strategy. The proposed fractional-order PID controller does not place any restriction on the controlled plant, so it can be widely applied to both integer-order and fractional-order systems.
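For context, the controller and reference structures referred to here are standard; in common notation (not taken from the paper) the fractional-order PID controller and Bode's ideal loop transfer function are:

```latex
% Fractional-order PID controller (standard form):
\[
C(s) = K_p + \frac{K_i}{s^{\lambda}} + K_d\, s^{\mu}, \qquad \lambda, \mu > 0,
\]
% Bode's ideal (optimal) loop transfer function, which gives a flat phase of
% -\gamma \pi / 2 around the gain crossover frequency \omega_c, and hence
% robustness of the phase margin to gain variations:
\[
L(s) = \left(\frac{\omega_c}{s}\right)^{\gamma}, \qquad 1 < \gamma < 2.
\]
```

The extra exponents λ and μ, compared with an integer-order PID, are what give the tuning strategy the additional degrees of freedom mentioned in the abstract.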
Philosophers debate over the truth of the Doctrine of Doing and Allowing, the thesis that there is a morally significant difference between doing harm and merely allowing harm to happen. Deontologists tend to accept this doctrine, whereas consequentialists tend to reject it. A robust defence of this doctrine would require a conceptual distinction between doing and allowing that both matches our ordinary use of the concepts in a wide range of cases and enables a justification for the alleged moral difference. In this article, I argue not only that a robust defence of this doctrine is available, but also that it is available within a consequentialist framework.
Recently, infrared human action recognition has attracted increasing attention because it has many advantages over visible light, such as robustness to illumination changes and shadows. However, infrared action data remain limited, which degrades the performance of infrared action recognition. Motivated by the idea of transfer learning, an infrared human action recognition framework using auxiliary data from visible light is proposed to address the problem of limited infrared action data. In the proposed framework, we first construct a novel Cross-Dataset Feature Alignment and Generalization framework to map the infrared data and visible light data into a common feature space, where Kernel Manifold Alignment and a dual aligned-to-generalized encoders model are employed to represent the features. Then, a support vector machine is trained using both the infrared data and the visible light data, and can classify the features derived from infrared data. The proposed method is evaluated on InfAR, a publicly available infrared human action dataset. To build up auxiliary data, we set up a novel visible light action dataset, XD145. Experimental results show that the proposed method achieves state-of-the-art performance compared with several transfer learning and domain adaptation methods.
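The Cross-Dataset Feature Alignment and Generalization framework itself is not specified in the abstract; the sketch below only illustrates the final stage described above (training one SVM on aligned infrared and visible-light features and classifying infrared samples). All array names, shapes, and the random data are illustrative assumptions; the alignment step is not reproduced.

```python
import numpy as np
from sklearn.svm import SVC

# Illustrative placeholders: features are assumed to have already been mapped
# into a common space by an alignment step (e.g., kernel manifold alignment),
# which is not implemented here.
rng = np.random.default_rng(0)
infrared_feats = rng.normal(size=(200, 64))      # aligned infrared features
visible_feats = rng.normal(size=(300, 64))       # aligned visible-light features
infrared_labels = rng.integers(0, 12, size=200)  # 12 action classes (assumed)
visible_labels = rng.integers(0, 12, size=300)

# Train on both modalities jointly, as the abstract describes.
X_train = np.vstack([infrared_feats, visible_feats])
y_train = np.concatenate([infrared_labels, visible_labels])
clf = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)

# Classify held-out infrared samples.
infrared_test = rng.normal(size=(20, 64))
print(clf.predict(infrared_test))
```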
In this paper, a criticism of the traditional theories of approximation and idealization is given as a summary of previous works. After identifying the real purpose and measure of idealization in the practice of science, it is argued that the best way to characterize idealization is not to formulate a logical model – something analogous to Hempel's D-N model for explanation – but to study its different guises in the praxis of science. A case study is then made in thermostatistical physics. After a brief sketch of the theories of phase transitions and critical phenomena, I examine the various idealizations that go into the making of models at three different levels. The intended result is to induce a deeper appreciation of the complexity and fruitfulness of idealization in the praxis of model-building, not to give an abstract theory of it.
The main factors responsible for the nonstationarity of seismic signals are the nonstationarity of the geologic structural sequences and the complex pore structure. Time-frequency analysis can identify the various frequency components of seismic data and reveal their time-variant features. Choosing a proper time-frequency decomposition algorithm is the key to analyzing these nonstationary signals and revealing the geologic information contained in the seismic data. According to the Heisenberg uncertainty principle, we cannot obtain the finest time location and the best frequency resolution at the same time, which results in a trade-off between time resolution and frequency resolution. For instance, the most commonly used approach is the short-time Fourier transform, in which the predefined window length limits the flexibility to adjust the temporal and spectral resolution simultaneously. The continuous wavelet transform (CWT) produces an “adjustable” resolution time-frequency map using dilation and translation of a basic wavelet; however, the CWT has limitations in dealing with fast-varying instantaneous frequencies. The synchrosqueezing transform (SST) can improve the quality and readability of the time-frequency representation. We have developed a high-resolution and effective time-frequency analysis method, based on the SST with the three-parameter wavelet (TPW) as the basic wavelet, to characterize geologic bodies contained in the seismic data. The TPW is superior in time-frequency resolution to the Morlet and Ricker wavelets. Experiments on synthetic and field data demonstrate the method's validity and effectiveness, and it can assist in oil/gas reservoir identification.
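As a concrete anchor for the resolution trade-off mentioned above, here is a minimal short-time Fourier transform of a synthetic nonstationary signal using SciPy. The synchrosqueezing transform with a three-parameter wavelet developed in the paper is not implemented here, and the window length and test signal are arbitrary choices for illustration.

```python
import numpy as np
from scipy.signal import stft

# Synthetic nonstationary signal: a linear chirp sampled at 1 kHz.
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
signal = np.cos(2 * np.pi * (10 * t + 30 * t**2))  # frequency sweeps from 10 to 130 Hz

# Short-time Fourier transform; nperseg fixes the window length and hence the
# time/frequency resolution trade-off discussed above.
freqs, times, Zxx = stft(signal, fs=fs, nperseg=256)
spectrogram = np.abs(Zxx)

# Dominant frequency at each time slice (a crude ridge estimate).
ridge = freqs[np.argmax(spectrogram, axis=0)]
print(ridge[:5], ridge[-5:])
```

A longer window sharpens the frequency estimate but smears the time location of the chirp's frequency change, which is exactly the limitation that motivates the CWT- and SST-based approaches described above.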
Electromechanical actuators (EMAs) are increasingly widely used as actuation devices in the flight control systems of aircraft and helicopters. The reliability of EMAs is vital because their malfunction can cause serious accidents, so it is important to detect and diagnose EMA faults in a timely manner. However, EMAs often run under variable conditions in realistic environments, and their vibration signals are nonlinear and nonstationary, which makes effective fault diagnosis difficult. This paper proposes a fault diagnosis method for electromechanical actuators based on variational mode decomposition (VMD), multifractal detrended fluctuation analysis (MFDFA), and a probabilistic neural network (PNN). First, the vibration signals are decomposed by VMD into a number of intrinsic mode functions (IMFs). Second, the multifractal features hidden in the IMFs are extracted using MFDFA, and the generalized Hurst exponents are selected as the feature vectors. Then, principal component analysis is introduced to reduce the dimensionality of the extracted feature vectors. Finally, the probabilistic neural network is used to classify the fault modes. The experimental results show that this method can effectively diagnose EMA faults even under different working conditions. Moreover, the diagnostic performance of the proposed method has an advantage over that of the EMD-MFDFA method for feature extraction.
As the aim of the responsible robotics initiative is to ensure that responsible practices are inculcated within each stage of design, development and use, this impetus is undergirded by the alignment of ethical and legal considerations towards socially beneficial ends. While every effort should be expended to ensure that issues of responsibility are addressed at each stage of technological progression, from a theoretical perspective an irresponsibility inherent in the nature of robotics technologies threatens to thwart the endeavour. This is because the concept of responsibility, despite being treated as such, is not monolithic: rather, this seemingly unified concept consists of converging and confluent concepts that shape the idea of what we colloquially call responsibility. Viewed in this way, robotics will be simultaneously responsible and irresponsible depending on the particular concept of responsibility that is foregrounded: an observation that cuts against the grain of the drive towards responsible robotics. This problem is further compounded by responsible design and development as contrasted with responsible use. From a different perspective, the difficulty in defining the concept of responsibility in robotics arises because human responsibility is the main frame of reference. Robotic systems are increasingly expected to achieve human-level performance, including the capacities associated with responsibility and other criteria which are necessary to act responsibly. This subsists within a larger phenomenon in which the difference between humans and non-humans, be they animals or artificial systems, appears increasingly blurred, thereby disrupting orthodox understandings of responsibility. This paper seeks to supplement the responsible robotics impulse by proposing a complementary set of human rights directed specifically against the harms arising from robotic and artificial intelligence technologies. The relationship between the responsibilities of the agent and the rights of the patient suggests that a rights regime is the other side of the responsibility coin. The major distinction of this approach is to invert the power relationship: while human agents are perceived to control robotic patients, the prospect of this relationship becoming reversed is emerging. As robotic technologies become ever more sophisticated, and even genuinely complex, asserting human rights directly against robotic harms becomes increasingly important. Such an approach includes not only developing human rights that ‘protect’ humans but also ‘strengthen’ people against the challenges introduced by robotics and AI [this distinction parallels Berlin’s negative and positive concepts of liberty], by emphasising the social and reflective character of the notion of humanness as well as the difference between the human and nonhuman. This will allow using the human frame of reference as constitutive of, rather than only subject to, robotic and AI technologies, where it is human, and not technology, characteristics that shape the human rights framework in the first place.