During the past decade, the size of 3D seismic data volumes and the number of seismic attributes have increased to the extent that it is difficult, if not impossible, for interpreters to examine every seismic line and time slice. To address this problem, several seismic facies classification algorithms, including k-means, self-organizing maps, generative topographic mapping, support vector machines, Gaussian mixture models, and artificial neural networks, have been successfully used to extract features of geologic interest from multiple volumes. Although well documented in the literature, the terminology and complexity of these algorithms may bewilder the average seismic interpreter, and few papers have applied these competing methods to the same data volume. We have reviewed six commonly used algorithms and applied them to a single 3D seismic data volume acquired over the Canterbury Basin, offshore New Zealand, where one of the main objectives was to differentiate the architectural elements of a turbidite system. Not surprisingly, the most important parameter in this analysis was the choice of the correct input attributes, which in turn depended on careful pattern recognition by the interpreter. We found that supervised learning methods provided accurate estimates of the desired seismic facies, whereas unsupervised learning methods also highlighted features that might otherwise be overlooked.
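As an illustration of the simplest of the listed methods, the following is a minimal k-means sketch in NumPy. This is a generic Lloyd's-algorithm implementation, not the study's actual workflow; the attribute matrix, cluster count, and iteration budget are placeholders.

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Minimal Lloyd's k-means: X is an (n_samples, n_attributes) array
    of seismic attribute vectors; returns cluster labels and centroids."""
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    # Initialize centroids from randomly chosen samples.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each sample to its nearest centroid (Euclidean distance).
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each centroid to the mean of its assigned samples.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids
```

In practice, each sample would be a vector of attribute values (e.g., coherence, curvature, spectral components) at one voxel, and the resulting labels form the facies volume.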
Seismic interpretation is based on the identification of reflector configuration and continuity, with coherent reflectors having a distinct amplitude, frequency, and phase. Skilled interpreters may classify reflector configurations as parallel, converging, truncated, or hummocky, and use their expertise to identify stratigraphic packages and unconformities. In principle, a given pattern can be explicitly defined as a combination of waveform and reflector configuration properties, although such “clustering” is often done subconsciously. Computer-assisted classification of seismic attribute volumes builds on the same concepts. Seismic attributes not only quantify characteristics of the seismic reflection events, but also measure aspects of reflector configurations. The Mississippi Lime resource play of northern Oklahoma and southern Kansas provides a particularly challenging problem. Instead of defining the facies stratigraphically, we need to define them either diagenetically or structurally. Using a 3D seismic survey acquired in Osage County, Oklahoma, we use Kohonen self-organizing maps to classify different diagenetically altered facies of the Mississippi Lime play. The 256 prototype vectors reduce to only three or four distinct “natural” clusters. We use ground truth of seismic facies seen on horizontal image logs to fix three average attribute data vectors near the well locations, resulting in three “known” facies, and do a minimum Euclidean distance supervised classification. The predicted clusters correlate well to the poststack impedance inversion result.
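The supervised step described above, assigning each attribute vector to the nearest of the “known” facies averages, amounts to a minimum-Euclidean-distance classifier and can be sketched as follows. The facies prototype vectors below are hypothetical stand-ins for the well-derived averages.

```python
import numpy as np

def min_distance_classify(samples, facies_vectors):
    """Assign each attribute vector to the 'known' facies whose average
    attribute vector is nearest in Euclidean distance."""
    samples = np.asarray(samples, dtype=float)
    facies = np.asarray(facies_vectors, dtype=float)
    # Distance of every sample to every facies prototype vector.
    d = np.linalg.norm(samples[:, None, :] - facies[None, :, :], axis=2)
    # Label = index of the nearest prototype.
    return d.argmin(axis=1)
```

Because the method is distance based, attributes are normally standardized (e.g., to zero mean and unit variance) before classification so that no single attribute dominates the distance.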
Seismic facies estimation is a critical component in understanding the stratigraphy and lithology of hydrocarbon reservoirs. With the adoption of 3D technology and increasing survey size, manual techniques of facies classification have become increasingly time consuming. In addition, the number of seismic attributes has increased dramatically, providing increasingly accurate measurements of reflector morphology. However, these seismic attributes add multiple “dimensions” to the data, greatly expanding the amount of data to be analyzed. Principal component analysis and self-organizing maps (SOMs) are popular techniques to reduce such dimensionality by projecting the data onto a lower order space in which clusters can be more readily identified and interpreted. After dimensionality reduction, popular classification algorithms such as neural networks, k-means, and Kohonen SOMs are routinely applied to well log prediction and analysis and to seismic facies modeling. Although these clustering methods have been successful in many hydrocarbon exploration projects, they have some inherent limitations. We explored one of the more recent techniques, known as generative topographic mapping (GTM), which addresses the shortcomings of Kohonen SOMs and aids data classification. We applied GTM to perform multiattribute seismic facies classification of a carbonate conglomerate oil field in the Veracruz Basin of southern Mexico. The presence of conglomerate carbonates makes the reservoir units laterally and vertically highly heterogeneous, a heterogeneity observed at well-log, core-slab, and thin-section scales. We applied unsupervised GTM classification to determine the “natural” clusters in the data set. Finally, we introduced supervision into GTM and calculated the probability of occurrence of the seismic facies seen at the wells over the reservoir units. In this manner, we were able to assign a level of confidence to encountering facies that corresponded to good and poor production.
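The dimensionality-reduction step mentioned above can be sketched with plain PCA via the singular value decomposition (GTM itself is a probabilistic, considerably more involved model and is not reproduced here). The data below are synthetic placeholders, not attributes from the study.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project attribute vectors onto the top principal components.
    Returns the reduced data and the variance explained by each component."""
    X = np.asarray(X, dtype=float)
    Xc = X - X.mean(axis=0)  # center each attribute
    # SVD of the centered data; rows of Vt are the principal axes,
    # ordered by decreasing singular value.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained_var = s**2 / (len(X) - 1)
    return Xc @ Vt[:n_components].T, explained_var
```

Clustering (k-means, SOM, or GTM) is then run on the reduced coordinates rather than on the full attribute vectors, which both speeds up classification and suppresses noise in the weakest components.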
At the turn of the nineteenth century, Friedrich Schlegel developed an influential theory of irony that anticipated some of the central concerns of postmodernity. His most vocal contemporary critic, the philosopher Hegel, sought to demonstrate that Schlegel’s theory of irony tacitly relied on certain problematic aspects of Fichte’s philosophy. While Schlegel’s theory of irony has generated seemingly endless commentary in recent critical discourse, Hegel’s critique of Schlegelian irony has gone largely neglected. This essay’s primary aim is to defend Hegel’s critique of Schlegel by isolating irony’s underlying Fichtean epistemology. Drawing on Søren Kierkegaard’s The Concept of Irony in the final section of this essay, I argue that Hegel’s critique of irony can motivate a dialectical hermeneutics that offers a powerful alternative both to Paul de Man’s poststructuralist hermeneutics and to recent cultural-studies-oriented criticism that tends to reduce literary texts to sociohistorical epiphenomena.
Coleridge rarely mentions Hegel in his philosophical writings and seems to have read very little of Hegel's work. Yet I argue that Coleridge's criticisms of Schelling's philosophy—as recorded in letters and marginalia—betray remarkable intellectual affinities with his nearly exact contemporary Hegel, particularly in their shared doubts about Schelling's foundationalist intuitionism. With this background in place, I seek to demonstrate that volume one of Coleridge's Biographia Literaria is a radically self-undermining text: its philosophical argument, far from slavishly recapitulating Schelling's philosophy, remains haunted by a quasi-Hegelian skepticism toward intuition even as it advances intuition as the foundation of its theoretical edifice.
There is growing evidence that words acquired early in life are processed faster and more accurately than words acquired later, even by adults. As neuropsychological and neuroimaging studies have implicated different brain networks in the processing of action verbs and concrete nouns, the present study was aimed at contrasting reaction times to early- and later-acquired action verbs and concrete nouns, in order to determine whether age-of-acquisition effects are expressed differently for the two types of words. Our results show that while word frequency affected both types of words in the same way, distinct age-of-acquisition effects were observed for action verbs and concrete nouns. A further experiment specified that this difference was observed for verbs describing actions belonging to the human motor repertoire, but not for verbs denoting actions outside this repertoire (e.g., to neigh). We interpret these data within a recently emerging framework according to which language processing is associated with sensorimotor programs.
Nanoparticles of cobalt ferrite prepared by the co-precipitation method with crystallite size varying from 4.7 to 41 nm have been characterized by positron annihilation lifetime spectroscopy. Three lifetime components are fitted to the lifetime data. The shortest lifetime component is attributed to the delocalized positron lifetime shortened by defect trapping. The intermediate lifetime is assigned to positron annihilation in diffuse vacancy clusters or microvoids at the grain boundaries and at the grain-boundary triple points. The longest component corresponds to the pick-off annihilation of ortho-positronium formed at the larger voids. The variations in these lifetimes and their relative intensities with annealing temperature and crystallite size have been studied in detail.
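The three-component decomposition described above is conventionally obtained by fitting a sum of decaying exponentials to the lifetime spectrum. The sketch below, using SciPy on synthetic noiseless data, is purely illustrative: the intensities and lifetimes are invented values, not those measured in the study, and instrument-resolution convolution and source corrections are omitted.

```python
import numpy as np
from scipy.optimize import curve_fit

def lifetime_model(t, i1, tau1, i2, tau2, i3, tau3):
    """Idealized three-component positron lifetime spectrum: a sum of
    decaying exponentials with relative intensities i_k and lifetimes
    tau_k (resolution function and source corrections omitted)."""
    return (i1 / tau1 * np.exp(-t / tau1)
            + i2 / tau2 * np.exp(-t / tau2)
            + i3 / tau3 * np.exp(-t / tau3))

def fit_three_components(t, counts, p0):
    """Nonlinear least-squares fit with nonnegative parameters."""
    popt, _ = curve_fit(lifetime_model, t, counts, p0=p0,
                        bounds=(0.0, np.inf))
    return popt
```

In a real analysis, the reported lifetimes and intensities come from such a fit, and their evolution with annealing temperature and crystallite size is what carries the physical interpretation.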
The ontological model framework for an operational theory has generated much interest in recent years. The debate concerning the reality of quantum states has been made more precise in this framework. With the introduction of a generalized notion of contextuality in this framework, it has been shown that the completely mixed state of a qubit is preparation contextual. Interestingly, this new idea of preparation contextuality has been used to demonstrate the nonlocality of some \(\psi\)-epistemic models without any use of Bell’s inequality. In particular, the nonlocality of a non-maximally \(\psi\)-epistemic model has been demonstrated from the preparation contextuality of a maximally mixed qubit and Schrödinger’s steerability of the maximally entangled state of two qubits (Leifer and Maroney, Phys Rev Lett 110:120401, 2013). In this paper, we show that any mixed state is preparation contextual. We then show that the nonlocality of any bipartite pure entangled state of Schmidt rank two follows from preparation contextuality and steerability, provided we impose a certain condition on the epistemicity of the underlying ontological model. More interestingly, if the pure entangled state has Schmidt rank greater than two, its nonlocality follows without any further condition on epistemicity. Our result thus establishes a stronger connection between nonlocality and preparation contextuality by revealing the nonlocality of any bipartite pure entangled state without any use of Bell-type inequalities.
We show that, within the class of ontological models due to Harrigan and Spekkens, those satisfying preparation-measurement reciprocity must allow indeterminism comparable to that in quantum theory. Our result implies that one can design a quantum random number generator for which it is impossible, even in principle, to construct a reciprocal deterministic model.
Originally published in 1954, on behalf of the National Institute of Economic and Social Research, this book presents a general review of British economic statistics in relation to the uses made of them for policy purposes. The text begins with an examination, in general terms, of the ways in which statistics can help in guiding or assessing policy, covering housing, coal, the development areas, agricultural price-fixing, the balance of external payments and the balance of the economy. The problems of statistical application are then separately discussed under the headings of quality, presentation and availability, and organization. A full bibliography and reference table of principal British economic statistics are also included. This book will be of value to anyone with an interest in British economic history and statistics.
The development of executive functions (EF) is known to correlate with cultural, contextual, and social factors. However, studies considering all of the basic EF are still scarce in Brazil, most notably in the Northeast region, which is known for its social inequality and economic gap. This study aimed to analyze the developmental trajectories and structure of four EF, namely inhibition, flexibility, working memory, and planning. In addition, the potential effects of socioeconomic status (SES) and gender were examined. The sample included 230 Brazilian children between 7 and 12 years old, homogeneously distributed by age, gender, and type of school. The EF were assessed through the Brazilian version of the Child Executive Functions Battery (CEF-B). A global effect of age was found for most of the EF measures evaluated. The effect of gender was mostly nonsignificant, reaching significance on only 4 of the 12 tasks. There was a significant SES effect on 8 tasks, all in favor of private-school children. Exploratory factor and correlation analyses showed a four-factor EF structure, corroborating the theoretical distribution assumed in the CEF-B. A developmental progression is evident in the results for all of the EF measures evaluated. While gender had little influence on EF, SES seems to significantly impact EF development. As normative data are still lacking for Northeast Brazil, this study may help in understanding EF developmental trajectories and provide tools for neuropsychological evaluation.
The work presents itself as “a manual of fundamental catechetics,” offering a broad overview of the issues and problems of contemporary catechesis. It has its origin in La catechesi oggi. Manuale di catechetica fondamentale (2001), the fruit of some thirty years of reflection on catechesis by Emilio Alberich Sotomayor, professor at the Pontifical Salesian University in Rome. He was joined by Jérôme Vallabaraj, professor of catechetics at the same Univer…
This issue of the Revue des sciences religieuses publishes most of the papers from a colloquium devoted to practices of adult formation in the social fabric and the ecclesial space (Pratiques de formation des adultes dans le tissu social et l’espace ecclésial). The colloquium concluded the implementation of the project defined by the Research Group of the Institut de Pédagogie Religieuse (GRIPR) under the 1998-2003 four-year research contract (Université Marc Bloch). Reflection on adult formation in the Church is one of the major concerns of…
Introducing a colloquium, the author traces the changes that have taken place since a manifesto on the Christian formation of adults was published in 1978. Even then, however, it declared that people in formation could become “agents of their own change.”
With rapid growth in Far Eastern economies, it is becoming imperative to understand the culturally driven ethical-value underpinnings of management processes in this region of the world. In this study, we propose a broadened version of Hofstede’s and others’ conception of Confucian dynamism, anchored in Confucius’s teachings as preserved in the Lunyu (Analects), which form the foundation of individual-social moral interactions. Based on a content analysis of the Analects using the qualitative analysis software NVivo, we identified six work-based values and six life-based values of the society prescribed by Confucius in the Analects. These factors are further analyzed and mapped in the context of the three Confucian ethical dimensions. The business implications of the results and directions for future research are finally discussed.
The work is the fruit of action research carried out by two researchers whose ecclesial responsibilities lead them to meet and accompany not only young people aged 13 to 18 but also parents, teachers, and educators. After serving for more than fifteen years as head of the chaplaincy of lycées and collèges in the diocese of Metz, Ch. Aulenbacher is maître de conférences in practical theology in our Faculty. Chercheu…
This commentary defines an additional characteristic of human learning. The nature of this test differs from those proposed by Newell: it is a hard, pass/fail test. A theory of cognition therefore cannot partially satisfy it; either the theory conforms to the requirement fully, or it doesn't. If a theory of cognition cannot satisfy this property of human learning, then the theory is not valid at all.
It is common practice to analyze fracture spacing data collected from scanlines and wells at various resolutions for the purposes of aquifer and reservoir characterization. However, the influence of resolution on such analyses is not well studied. Lacunarity is a parameter used for multiscale analysis of spatial data. In quantitative terms, at any given scale, it is a function of the mean and variance of the distribution of masses captured by a gliding window of that scale moved across any pattern of interest. We have described the application of lacunarity for delineating differences between the scale-dependent clustering attributes of data collected at different resolutions along a scanline. Specifically, we considered data collected at different resolutions from two outcrop exposures, a pavement and a cliff section, of the Cretaceous turbiditic sandstones of the Chatsworth Formation widely exposed in southern California. For each scanline, we analyzed data from low-resolution aerial or ground photographs and high-resolution ground measurements for scale-dependent clustering attributes. High-resolution data show larger values of scale-dependent lacunarity than their respective low-resolution counterparts. We further performed a bootstrap analysis for each data set to test the significance of these clustering differences: we generated 300 random realizations for each data set and ran the lacunarity analysis on them. The lacunarity of each higher-resolution data set lay significantly above the 90th-percentile values of its realizations, confirming that the higher-resolution data differ significantly from random and that the fractures are clustered. We therefore postulate that lower-resolution data capture fracture zones with relatively uniform spacing, whereas higher-resolution data capture the thin and short splay joints and sheared joints that contribute to fracture clustering. Such findings have important implications for understanding the organization of fractures in fracture corridors, which in turn is critical for modeling and upscaling exercises.
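The gliding-box lacunarity described above (the ratio of the second moment to the squared first moment of window masses) and the shuffled-realization test can be sketched for a 1-D binary fracture indicator. The window sizes and realization counts below are illustrative, not those of the study.

```python
import numpy as np

def lacunarity(seq, box_size):
    """Gliding-box lacunarity of a 1-D binary sequence (1 = fracture):
    Lambda(r) = <M^2> / <M>^2, where M is the mass (fracture count) in
    each window of length r slid one position at a time."""
    seq = np.asarray(seq, dtype=float)
    masses = np.convolve(seq, np.ones(box_size), mode="valid")
    m1 = masses.mean()
    return (masses**2).mean() / m1**2 if m1 > 0 else np.nan

def shuffled_percentile(seq, box_size, n_real=300, pct=90, seed=0):
    """Lacunarity percentile over randomly shuffled realizations, used
    to test whether the observed clustering exceeds chance."""
    rng = np.random.default_rng(seed)
    vals = [lacunarity(rng.permutation(seq), box_size)
            for _ in range(n_real)]
    return np.percentile(vals, pct)
```

Lacunarity is bounded below by 1 (a perfectly uniform mass distribution); clustered fracture patterns inflate the window-to-window variance and hence the ratio, which is why an observed value above the shuffled 90th percentile indicates significant clustering.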
Today, thanks to Noam Chomsky and his fellow media analysts, it is almost axiomatic for thousands, possibly millions, of us that public opinion in "free market" democracies is manufactured just like any other mass-market product: soap, switches, or sliced bread. We know that while, legally and constitutionally, speech may be free, the space in which that freedom can be exercised has been snatched from us and auctioned to the highest bidders. Neoliberal capitalism isn't just about the accumulation of capital (for some). It's also about the accumulation of power (for some), the accumulation of freedom (for some). Conversely, for the rest of the world, the people who are excluded from neoliberalism's governing body, it's about the erosion of capital, the erosion of power, the erosion of freedom. In the "free" market, free speech has become a commodity like everything else: justice, human rights, drinking water, clean air. It's available only to those who can afford it. And naturally, those who can afford it use free speech to manufacture the kind of product, confect the kind of public opinion, that best suits their purpose. (News they can use.) Exactly how they do this has been the subject of much of Noam Chomsky's political writing.