Computational models of semantic memory exploit the co-occurrence statistics of words in naturally occurring text to extract information about the meanings of the words in a language. Such models implicitly specify a representation of temporal context. Depending on the model, words are said to have occurred in the same context if they are presented within a moving window, within the same sentence, or within the same document. The temporal context model (TCM), which specifies a particular definition of temporal context, has proved useful in the study of episodic memory. The predictive temporal context model (pTCM) uses the same definition of temporal context to generate semantic memory representations. Taken together, pTCM and TCM may prove to be part of a general model of declarative memory.
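The moving-window notion of context described above can be made concrete with a minimal sketch. The code below is not TCM or pTCM; it is a hypothetical illustration of the simpler co-occurrence approach the abstract contrasts them with: words count as sharing a context when they fall within a fixed window of each other, and the resulting count vectors can be compared with cosine similarity. The window size and toy sentence are arbitrary choices for illustration.

```python
from collections import Counter, defaultdict
import math

def cooccurrence_vectors(tokens, window=2):
    """Build a sparse co-occurrence vector for each word: counts of
    every other word appearing within `window` positions of it."""
    vectors = defaultdict(Counter)
    for i, word in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if i != j:
                vectors[word][tokens[j]] += 1
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[w] * v[w] for w in u)
    norm = (math.sqrt(sum(c * c for c in u.values()))
            * math.sqrt(sum(c * c for c in v.values())))
    return dot / norm if norm else 0.0

tokens = "the cat chased the mouse and the dog chased the cat".split()
vecs = cooccurrence_vectors(tokens, window=2)
# "cat" and "dog" never co-occur directly, yet their context vectors
# overlap (both occur near "the" and "chased"), so similarity is high.
print(cosine(vecs["cat"], vecs["dog"]))  # ≈ 0.91
```

Varying the window (or replacing it with sentence or document boundaries) changes which words count as contextual neighbors, which is exactly the modeling choice the abstract says distinguishes these models.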
Purpose – The purpose of this paper is to examine institutional influences on the customer service and complaints handling (CS/CH) practices of the Australian Internet industry. Design/methodology/approach – The study adopted a qualitative research methodology using semi-structured interviews as the research method and was informed by constructivist/interpretive approaches to knowledge. Eleven senior executives from key Internet industry stakeholder organizations were interviewed. Findings – Using a neo-institutional theory lens, this study found that institutional forces played a pivotal role in bringing all Internet industry stakeholders together to address CS/CH shortcomings in the old Telecommunications Consumer Protection (TCP) Code 2007. This led to significant changes to the CS/CH practices detailed in the revised TCP Code 2012. The findings revealed that frequent and fateful collaborations between central institutional actors have led to the emergence of organizational fields. The actors identified in these emerging organizational fields actively influence CS/CH practices and their subsequent implementation in vLISPs. Research limitations/implications – The study focused on the functional aspects of service quality (SQ). Technical aspects of SQ are equally important, and future research needs to consider both when assessing the overall performance of vLISPs. Practical implications – The findings encourage vLISP managers to continue collaborating with external stakeholders and to develop customer-friendly practices that deliver desirable CS/CH outcomes. Social implications – The findings revealed that when all vLISP industry stakeholders collaborate on a focal issue, there is noticeable progress towards the development of CS practices that contribute to a better customer service experience. 
Originality/value – An evidence-based approach was used to understand and explain how and why institutional actors in technology-based service organizations act together. A significant contribution of this study is the identification and discussion of emerging organizational fields comprising the central actors in the Internet industry. These emerging fields have the potential to develop into mature organizational fields and to inform future CS/CH practices and consumer protection policies in the Australian Internet industry.
Recent EEG studies of the early postmortem interval, which suggest the persistence of electrophysiological coherence and connectivity in the brains of animals and humans, reinforce the need for further investigation of the relationship between the brain’s activity and the dying process. Neuroscience is now in a position to empirically evaluate the extended process of dying and, more specifically, to investigate the possibility of brain activity following the cessation of cardiac and respiratory function. Under the direction of the Center for Healthy Minds at the University of Wisconsin-Madison, research was conducted in India on a postmortem meditative state cultivated by some Tibetan Buddhist practitioners in which decomposition is putatively delayed. For all healthy baseline (HB) and postmortem subjects presented here, we collected resting-state electroencephalographic data, mismatch negativity (MMN), and auditory brainstem response (ABR). In this study, we present HB data to demonstrate the feasibility of a sparse-electrode EEG configuration for capturing well-defined ERP waveforms from living subjects under very challenging field conditions. While living subjects displayed well-defined MMN and ABR responses, no recognizable EEG waveforms were discernible in any of the tukdam cases.
Radiation therapy therapists (RTTs) face challenging daily tasks that leave them prone to high attrition and burnout and subsequent deficits in performance. Here, we employed an accelerated alpha-theta neurofeedback protocol, implementable in a busy medical workplace, to test whether 12 RTTs could learn the protocol and exhibit behavioral and brain performance-related benefits. Following the 3-week protocol, participants showed a decrease in subjective cognitive workload, a decrease in response time during a performance task, and a decrease in desynchrony of the alpha electroencephalogram band. Additionally, a novel microstate analysis for neurofeedback showed a significant decrease in global field power following neurofeedback. These results suggest that the RTTs successfully learned the protocol and improved in perceived cognitive workload over the 3 weeks of neurofeedback. In sum, this study presents promising behavioral improvements as well as neurophysiological evidence of brain performance-related changes following neurofeedback, supporting the feasibility of implementing neurofeedback in a busy workplace and encouraging its further study as a tool to mitigate burnout.
To improve the quality and efficiency of research, groups within the scientific community seek to exploit the value of data sharing. Funders, institutions, and specialist organizations are developing and implementing strategies to encourage or mandate data sharing within and across disciplines, with varying degrees of success. Academic journals in ecology and evolution have adopted several types of public data archiving policies requiring authors to make the data underlying scholarly manuscripts freely available. The effort to increase data sharing in the sciences is one part of a broader “data revolution” that has prompted discussion of a paradigm shift in scientific research. Yet anecdotes from the community and studies evaluating data availability suggest that these policies have not achieved the desired effects in either the quantity or the quality of available datasets. We conducted a qualitative, interview-based study with journal editorial staff and other stakeholders in the academic publishing process to examine how journals enforce data archiving policies. We specifically sought to establish whom editors and other stakeholders perceive as responsible for ensuring data completeness and quality in the peer review process. Our analysis revealed little consensus on how data archiving policies should be enforced and who should hold authors accountable for dataset submissions. Themes in interviewee responses included hopefulness that reviewers would take the initiative to review datasets and trust in authors to ensure the completeness and quality of their datasets. We highlight problematic aspects of these thematic responses and offer potential starting points for improving the public data archiving process.