10 found
  1. The Simultaneous Type, Serial Token Model of Temporal Attention and Working Memory. Howard Bowman & Brad Wyble - 2007 - Psychological Review 114 (1):38-70.
  2. Attentional Episodes in Visual Perception. Brad Wyble, Mary C. Potter, Howard Bowman & Mark Nieuwenstein - 2011 - Journal of Experimental Psychology: General 140 (3):488-505.
  3. Strategic Regulation of Cognitive Control by Emotional Salience: A Neural Network Model. Bradley Wyble, Dinkar Sharma & Howard Bowman - 2008 - Cognition and Emotion 22 (6):1019-1051.
  4. Attention is More Than Prediction Precision. Howard Bowman, Marco Filetti, Brad Wyble & Christian Olivers - 2013 - Behavioral and Brain Sciences 36 (3):206-208.
    A cornerstone of the target article is that, in a predictive coding framework, attention can be modelled by weighting prediction error with a measure of precision. We argue that this is not a complete explanation, especially in light of ERP (event-related potential) data showing large evoked responses for frequently presented target stimuli, which are thus predicted.
  5. On the Limits of Evidence Accumulation of the Preconscious Percept. Alberto Avilés, Howard Bowman & Brad Wyble - 2020 - Cognition 195:104080.
  6. Effect of tDCS Over the Right Inferior Parietal Lobule on Mind-Wandering Propensity. Sean Coulborn, Howard Bowman, R. Chris Miall & Davinia Fernández-Espejo - 2020 - Frontiers in Human Neuroscience 14.
  7. PITL2MONA: Implementing a Decision Procedure for Propositional Interval Temporal Logic. Rodolfo Gómez & Howard Bowman - 2004 - Journal of Applied Non-Classical Logics 14 (1-2):105-148.
    Interval Temporal Logic (ITL) is a finite-time linear temporal logic with applications in hardware verification, temporal logic programming and the specification of multimedia documents. Due to the logic's non-elementary complexity, efficient ITL-based verification tools have been difficult to develop, even for propositional subsets. MONA is an efficient implementation of an automata-based decision procedure for the logic WS1S. Despite the non-elementary complexity of WS1S, MONA has been successfully applied to problems such as hardware synthesis, protocol verification and theorem proving. Here we consider a (...)
  8. Placing Meta-Stable States of Consciousness Within the Predictive Coding Hierarchy: The Deceleration of the Accelerated Prediction Error. Amirali Shirazibeheshti, Jennifer Cooke, Srivas Chennu, Ram Adapa, David K. Menon, Seyed Ali Hojjatoleslami, Adrien Witon, Ling Li, Tristan Bekinschtein & Howard Bowman - 2018 - Consciousness and Cognition 63:123-142.
  9. Illusions of Integration Are Subjectively Impenetrable: Phenomenological Experience of Lag 1 Percepts During Dual-Target RSVP. Luca Simione, Elkan G. Akyürek, Valentina Vastola, Antonino Raffone & Howard Bowman - 2017 - Consciousness and Cognition 51:181-192.
  10. Understanding Visual Attention with RAGNAROC: A Reflexive Attention Gradient Through Neural AttRactOr Competition. Brad Wyble, Chloe Callahan-Flintoft, Hui Chen, Toma Marinov, Aakash Sarkar & Howard Bowman - 2020 - Psychological Review 127 (6):1163-1198.
    A quintessential challenge for any perceptual system is the need to focus on task-relevant information without being blindsided by unexpected yet important information. The human visual system incorporates several solutions to this challenge, one of which is a reflexive covert attention system that is rapidly responsive to both the physical salience and the task relevance of new information. This paper presents a model that simulates behavioral and neural correlates of reflexive attention as the product of brief neural attractor states that are (...)