EyeLink EEG / fNIRS / TMS Publications

All EyeLink EEG, fNIRS, and TMS research publications (with concurrent eye tracking) up to 2021, together with early 2022 papers, are listed below by year. You can search the publications using keywords such as P300, Gamma band, NIRS, etc. You can also search for individual author names. If we missed any EyeLink EEG, fNIRS, or TMS articles, please email us!

605 entries (page 1 of 7)

2022

Timo L. Kvamme; Mesud Sarmanlu; Christopher Bailey; Morten Overgaard

Neurofeedback modulation of the sound-induced flash illusion using parietal cortex alpha oscillations reveals dependency on prior multisensory congruency Journal Article

In: Neuroscience, vol. 482, pp. 1–17, 2022.

Spontaneous neural oscillations are key predictors of perceptual decisions to bind multisensory signals into a unified percept. Research links decreased alpha power in the posterior cortices to attention and audiovisual binding in the sound-induced flash illusion (SIFI) paradigm. This suggests that controlling alpha oscillations would be a way of controlling audiovisual binding. In the present feasibility study we used MEG-neurofeedback to train one group of subjects to increase left/right and another to increase right/left alpha power ratios in the parietal cortex. We tested for changes in audiovisual binding in a SIFI paradigm where flashes appeared in both hemifields. Results showed that the neurofeedback induced a significant asymmetry in alpha power for the left/right group, not seen for the right/left group. Corresponding asymmetry changes in audiovisual binding in illusion trials (with 2, 3, and 4 beeps paired with 1 flash) were not apparent. Exploratory analyses showed that neurofeedback training effects were present for illusion trials with the lowest numeric disparity (i.e., 2 beeps and 1 flash trials) only if the previous trial had high congruency (2 beeps and 2 flashes). Our data suggest that the relation between parietal alpha power (an index of attention) and its effect on audiovisual binding is dependent on the learned causal structure in the previous stimulus. The present results suggests that low alpha power biases observers towards audiovisual binding when they have learned that audiovisual signals originate from a common origin, consistent with a Bayesian causal inference account of multisensory perception.

  • doi:10.1016/j.neuroscience.2021.11.028
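
The exact neurofeedback computation used by Kvamme et al. is not reproduced here; as a rough illustration of the quantity being trained, the sketch below computes an alpha-band (8–12 Hz) power ratio between a left and a right parietal sensor using Welch's method. The signals, sampling rate, and channel assignment are all hypothetical.

```python
import numpy as np
from scipy.signal import welch

def alpha_power(signal, fs, band=(8.0, 12.0)):
    """Integrated alpha-band power of a 1-D sensor time series (Welch PSD)."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])

# Hypothetical left/right parietal traces: 2 s at 1000 Hz, 10 Hz alpha plus noise.
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
left = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
right = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# A left/right alpha power ratio of the kind a neurofeedback protocol might target.
print(f"left/right alpha power ratio: {alpha_power(left, fs) / alpha_power(right, fs):.2f}")
```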

Lorenzo Diana; Giulia Scotti; Edoardo N Aiello; Patrick Pilastro; Aleksandra K Eberhard-Moscicka; René M Müri; Nadia Bolognini

Conventional and HD-tDCS may (or may not) modulate overt attentional orienting: An integrated spatio-temporal approach and methodological reflection Journal Article

In: Brain Sciences, vol. 12, no. 71, pp. 1–20, 2022.

Transcranial Direct Current Stimulation (tDCS) has been employed to modulate visuospatial attentional asymmetries, however, further investigation is needed to characterize tDCS-associated variability in more ecological settings. In the present research, we tested the effects of offline, anodal conventional tDCS (Experiment 1) and HD-tDCS (Experiment 2) delivered over the posterior parietal cortex (PPC) and Frontal Eye Field (FEF) of the right hemisphere in healthy participants. Attentional asymmetries were measured by means of an eye tracking-based, ecological paradigm, that is, a Free Visual Exploration task of naturalistic pictures. Data were analyzed from a spatiotemporal perspective. In Experiment 1, a pre-post linear mixed model (LMM) indicated a leftward attentional shift after PPC tDCS; this effect was not confirmed when the individual baseline performance was considered. In Experiment 2, FEF HD-tDCS was shown to induce a significant leftward shift of gaze position, which emerged after 6 s of picture exploration and lasted for 200 ms. The present results do not allow us to conclude on a clear efficacy of offline conventional tDCS and HD-tDCS in modulating overt visuospatial attention in an ecological setting. Nonetheless, our findings highlight a complex relationship among stimulated area, focality of stimulation, spatiotemporal aspects of deployment of attention, and the role of individual baseline performance in shaping the effects of tDCS.
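
The pre-post linear mixed model mentioned in the abstract is a standard analysis; a minimal sketch of fitting such a model in Python with statsmodels is shown below. The data are simulated and the column names (subject, session, gaze_x) are illustrative only, not the authors' variables or exact model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate per-picture mean horizontal gaze positions for 20 subjects,
# before and after stimulation, with a small post-session leftward shift.
rng = np.random.default_rng(1)
rows = []
for subj in range(20):
    subj_offset = rng.normal(0, 0.2)                  # subject-specific baseline bias
    for session in ("pre", "post"):
        shift = -0.3 if session == "post" else 0.0
        for _ in range(30):                           # 30 pictures per session
            rows.append({"subject": f"s{subj}",
                         "session": session,
                         "gaze_x": subj_offset + shift + rng.normal(0, 1.0)})
df = pd.DataFrame(rows)

# Linear mixed model: fixed effect of session, random intercept per subject.
model = smf.mixedlm("gaze_x ~ session", df, groups=df["subject"])
print(model.fit().summary())
```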

Johannes Rennig; Michael S Beauchamp

Intelligibility of audiovisual sentences drives multivoxel response patterns in human superior temporal cortex Journal Article

In: NeuroImage, vol. 247, pp. 118796, 2022.

Regions of the human posterior superior temporal gyrus and sulcus (pSTG/S) respond to the visual mouth movements that constitute visual speech and the auditory vocalizations that constitute auditory speech, and neural responses in pSTG/S may underlie the perceptual benefit of visual speech for the comprehension of noisy auditory speech. We examined this possibility through the lens of multivoxel pattern responses in pSTG/S. BOLD fMRI data was collected from 22 participants presented with speech consisting of English sentences presented in five different formats: visual-only; auditory with and without added auditory noise; and audiovisual with and without auditory noise. Participants reported the intelligibility of each sentence with a button press and trials were sorted post-hoc into those that were more or less intelligible. Response patterns were measured in regions of the pSTG/S identified with an independent localizer. Noisy audiovisual sentences with very similar physical properties evoked very different response patterns depending on their intelligibility. When a noisy audiovisual sentence was reported as intelligible, the pattern was nearly identical to that elicited by clear audiovisual sentences. In contrast, an unintelligible noisy audiovisual sentence evoked a pattern like that of visual-only sentences. This effect was less pronounced for noisy auditory-only sentences, which evoked similar response patterns regardless of intelligibility. The successful integration of visual and auditory speech produces a characteristic neural signature in pSTG/S, highlighting the importance of this region in generating the perceptual benefit of visual speech.

  • doi:10.1016/j.neuroimage.2021.118796

Erin Goddard; Thomas A. Carlson; Alexandra Woolgar

Spatial and feature-selective attention have distinct, interacting effects on population-level tuning Journal Article

In: Journal of Cognitive Neuroscience, vol. 34, no. 2, pp. 290–312, 2022.

Attention can be deployed in different ways: When searching for a taxi in New York City, we can decide where to attend (e.g., to the street) and what to attend to (e.g., yellow cars). Although we use the same word to describe both processes, nonhuman primate data suggest that these produce distinct effects on neural tuning. This has been challenging to assess in humans, but here we used an opportunity afforded by multivariate decoding of MEG data. We found that attending to an object at a particular location and attending to a particular object feature produced effects that interacted multiplicatively. The two types of attention induced distinct patterns of enhancement in occipital cortex, with feature-selective attention producing relatively more enhancement of small feature differences and spatial attention producing relatively larger effects for larger feature differences. An information flow analysis further showed that stimulus representations in occipital cortex were Granger-caused by coding in frontal cortices earlier in time and that the timing of this feedback matched the onset of attention effects. The data suggest that spatial and feature-selective attention rely on distinct neural mechanisms that arise from frontal-occipital information exchange, interacting multiplicatively to selectively enhance task-relevant information.

  • doi:10.1162/jocn_a_01796

2021

Delia A. Gheorghe; Muriel T. N. Panouillères; Nicholas D. Walsh

Investigating the effects of cerebellar transcranial direct current stimulation on saccadic adaptation and cortisol response Journal Article

In: Cerebellum and Ataxias, vol. 8, no. 1, pp. 1–11, 2021.

Background: Transcranial Direct Current Stimulation (tDCS) over the prefrontal cortex has been shown to modulate subjective, neuronal and neuroendocrine responses, particularly in the context of stress processing. However, it is currently unknown whether tDCS stimulation over other brain regions, such as the cerebellum, can similarly affect the stress response. Despite increasing evidence linking the cerebellum to stress-related processing, no studies have investigated the hormonal and behavioural effects of cerebellar tDCS. Methods: This study tested the hypothesis of a cerebellar tDCS effect on mood, behaviour and cortisol. To do this we employed a single-blind, sham-controlled design to measure performance on a cerebellar-dependent saccadic adaptation task, together with changes in cortisol output and mood, during online anodal and cathodal stimulation. Forty-five participants were included in the analysis. Stimulation groups were matched on demographic variables, potential confounding factors known to affect cortisol levels, mood and a number of personality characteristics. Results: Results showed that tDCS polarity did not affect cortisol levels or subjective mood, but did affect behaviour. Participants receiving anodal stimulation showed an 8.4% increase in saccadic adaptation, which was significantly larger compared to the cathodal group (1.6%). Conclusion: The stimulation effect on saccadic adaptation contributes to the current body of literature examining the mechanisms of cerebellar stimulation on associated function. We conclude that further studies are needed to understand whether and how cerebellar tDCS may modulate stress reactivity under challenge conditions.

  • doi:10.1186/s40673-020-00124-y

Guanpeng Chen; Ziyun Zhu; Qing He; Fang Fang

Offline transcranial direct current stimulation improves the ability to perceive crowded targets Journal Article

In: Journal of Vision, vol. 21, no. 2, pp. 1–10, 2021.

The deleterious effect of nearby flankers on target identification in the periphery is known as visual crowding. Studying visual crowding can advance our understanding of the mechanisms of visual awareness and object recognition. Alleviating visual crowding is one of the major ways to improve peripheral vision. The aim of the current study was to examine whether transcranial direct current stimulation (tDCS) was capable of alleviating visual crowding at different visual eccentricities and with different visual tasks. In the present single-blind sham-controlled study, subjects were instructed to perform an orientation discrimination task or a letter identification task with isolated and crowded targets in the periphery, before and after applying 20 minutes of 2 mA anodal tDCS to visual cortex of the hemisphere contralateral or ipsilateral to visual stimuli. Contralateral tDCS significantly alleviated the orientation crowding effect at two different eccentricities and the letter crowding effect. This alleviation was absent after sham or ipsilateral stimulation and could not be fully explained by the performance improvement with the isolated targets. These findings demonstrated that offline tDCS was effective in alleviating visual crowding across different visual eccentricities and tasks, therefore providing a promising way to improve spatial vision rapidly in crowded scenes.

  • doi:10.1167/jov.21.2.1

Andra Coldea; Stephanie Morand; Domenica Veniero; Monika Harvey; Gregor Thut

Parietal alpha tACS shows inconsistent effects on visuospatial attention Journal Article

In: PLoS ONE, vol. 16, no. 8, pp. e0255424, 2021.

Transcranial alternating current stimulation (tACS) is a popular technique that has been used for manipulating brain oscillations and inferring causality regarding the brain-behaviour relationship. Although it is a promising tool, the variability of tACS results has raised questions regarding the robustness and reproducibility of its effects. Building on recent research using tACS to modulate visuospatial attention, we here attempted to replicate findings of lateralized parietal tACS at alpha frequency to induce a change in attention bias away from the contra- towards the ipsilateral visual hemifield. 40 healthy participants underwent tACS in two separate sessions where either 10 Hz tACS or sham was applied via a high-density montage over the left parietal cortex at 1.5 mA for 20 min, while performance was assessed in an endogenous attention task. Task and tACS parameters were chosen to match those of previous studies reporting positive effects. Unlike these studies, we did not observe lateralized parietal alpha tACS to affect attention deployment or visual processing across the hemifields as compared to sham. Likewise, additional resting electroencephalography immediately offline to tACS did not reveal any notable effects on individual alpha power or frequency. Our study emphasizes the need for more replication studies and systematic investigations of the factors that drive tACS effects.

  • doi:10.1371/journal.pone.0255424

Raymundo Machado Azevedo Neto; Andreas Bartels

Disrupting short-term memory maintenance in premotor cortex affects serial dependence in visuomotor integration Journal Article

In: Journal of Neuroscience, vol. 41, no. 45, pp. 9392–9402, 2021.

Human behavior is biased by past experience. For example, when intercepting a moving target, the speed of previous targets will bias responses in future trials. Neural mechanisms underlying this so-called serial dependence are still under debate. Here, we tested the hypothesis that the previous trial leaves a neural trace in brain regions associated with encoding task-relevant information in visual and/or motor regions. We reasoned that injecting noise by means of transcranial magnetic stimulation (TMS) over premotor and visual areas would degrade such memory traces and hence reduce serial dependence. To test this hypothesis, we applied bursts of TMS pulses to right visual motion processing region hV5/MT1 and to left dorsal premotor cortex (PMd) during intertrial intervals of a coincident timing task performed by twenty healthy human participants (15 female). Without TMS, participants presented a bias toward the speed of the previous trial when intercepting moving targets. TMS over PMd decreased serial dependence in comparison to the control Vertex stimulation, whereas TMS applied over hV5/MT1 did not. In addition, TMS seems to have specifically affected the memory trace that leads to serial dependence, as we found no evidence that participants' behavior worsened after applying TMS. These results provide causal evidence that an implicit short-term memory mechanism in premotor cortex keeps information from one trial to the next, and that this information is blended with current trial information so that it biases behavior in a visuomotor integration task with moving objects.

  • doi:10.1523/jneurosci.0380-21.2021

P. J. Hills; G. Arabacı; J. Fagg; L. Canter; C. Thompson; R. Moseley

Low-frequency rTMS to the parietal lobe increases eye-movement carryover and decreases hazard rating Journal Article

In: Neuropsychologia, vol. 158, pp. 107895, 2021.

The persistence of attentional set from one task to a secondary unrelated task, revealed through carryover of eye movements, has been attributed to increased activation in the parietal lobe and decreased activation to the frontal lobe. To directly test this, we adopted a modified version of the Thompson and Crundall (2011) paradigm using low-frequency repetitive TMS to P3 and F3. In each trial, participants viewed letter-strings that were arranged horizontally, vertically, or randomly across the screen before viewing a road image and providing a hazardousness rating for it. The orientation of the letter search influenced eye movements to the road images and this carryover was greater following stimulation to F3 than to P3 (or sham). Furthermore, hazardous ratings were lower following P3 stimulation. These results confirm the involvement of attentional orienting and switching mechanisms in the carryover of eye movements. It is suggested that this “attentional inertia” effect will increase with greater orienting of attentional resources in an initial task and poor inhibition of previously-relevant settings between tasks.

  • doi:10.1016/j.neuropsychologia.2021.107895

Tzu Yu Hsu; Jui Tai Chen; Philip Tseng; Chin An Wang

Role of the frontal eye field in human microsaccade responses: A TMS study Journal Article

In: Biological Psychology, vol. 165, pp. 108202, 2021.

Microsaccade is a type of fixational eye movements that is modulated by various sensory and cognitive processes, and impact our visual perception. Although studies in monkeys have demonstrated a functional role for the superior colliculus and frontal eye field (FEF) in controlling microsaccades, our understanding of the neural mechanisms underlying the generation of microsaccades is still limited. By applying continuous theta-burst stimulation (cTBS) over the right FEF and the vertex, we investigated the role of the FEF in generating human microsaccade responses evoked by salient stimuli or by changes in background luminance. We observed higher microsaccade rates prior to target appearance, and larger rebound in microsaccade occurrence following salient stimuli, when disruptive cTBS was applied over FEF compared to vertex stimulation. Moreover, the microsaccade direction modulation after changes in background luminance was disrupted with FEF stimulation. Together, our results constitute the first evidence of FEF modulation in human microsaccade responses.

  • doi:10.1016/j.biopsycho.2021.108202

Tzu Yu Hsu; Yu Fan Hsu; Hsin Yi Wang; Chin An Wang

Role of the frontal eye field in human pupil and saccade orienting responses Journal Article

In: European Journal of Neuroscience, vol. 54, no. 1, pp. 4283–4294, 2021.

The appearance of a salient stimulus evokes a series of orienting responses including saccades and pupil size to prepare the body for appropriate action. The midbrain superior colliculus (SC) that receives critical control signals from the frontal eye field (FEF) is hypothesized to coordinate all components of orienting. It has shown recently that the FEF, together with the SC, is also importantly involved in the control of pupil size, in addition to its well-documented role in eye movements. Although the role of the FEF in pupil size is demonstrated in monkeys, its role in human pupil responses and the coordination between pupil size and saccades remains to be established. Through applying continuous theta-burst stimulation over the right FEF and vertex, we investigated the role of the FEF in human pupil and saccade responses evoked by a salient stimulus, and the coordination between pupil size and saccades. Our results showed that neither saccade reaction times (SRT) nor pupil responses evoked by salient stimuli were modulated by FEF stimulation. In contrast, the correlation between pupil size and SRTs in the contralateral stimulus condition was diminished with FEF stimulation, but intact with vertex stimulation. Moreover, FEF stimulation effects between saccade and pupil responses associated with salient stimuli correlated across participants. This is the first transcranial magnetic stimulation (TMS) study on the pupil orienting response, and our findings suggest that human FEF was involved in coordinating pupil size and saccades, but not involved in the control of pupil orienting responses.

  • doi:10.1111/ejn.15253

Zhenlan Jin; Ruie Gou; Junjun Zhang; Ling Li

The role of frontal pursuit area in interaction between smooth pursuit eye movements and attention: A TMS study Journal Article

In: Journal of Vision, vol. 21, no. 3, pp. 1–10, 2021.

Close coupling between attention and smooth pursuit eye movements has been widely established and frontal eye field (FEF) is a “hub” region for attention and eye movements. Frontal pursuit area (FPA), a subregion of the FEF, is part of neural circuit for the pursuit, here, we directly checked the role of the FPA in the interaction between the pursuit and attention. To do it, we applied a dual-task paradigm where an attention demanding task was integrated into the pursuit target and interrupted the FPA using transcranial magnetic stimulation (TMS). In the study, participants were required to pursue a moving circle with a letter inside, which changed to another one every 100 ms and report whether “H” (low attentional load) or one of “H,” “S,” or “L” (high attentional load) appeared during the trial. As expected, increasing the attentional load decreased accuracy of the letter detection. Importantly, the FPA TMS had no effect on both the pursuit and letter detection tasks in the low load condition, whereas it reduced 200 to 320 ms gain, but tended to increase the letter detection accuracy in the high load condition. Moreover, individual's FPA TMS effect on pursuit gain

  • doi:10.1167/jov.21.3.11

Björn Machner; Jonathan Imholz; Lara Braun; Philipp J. Koch; Tobias Bäumer; Thomas F. Münte; Christoph Helmchen; Andreas Sprenger

Resting-state functional connectivity in the attention networks is not altered by offline theta-burst stimulation of the posterior parietal cortex or the temporo-parietal junction as compared to a vertex control site Journal Article

In: Neuroimage: Reports, vol. 1, no. 2, pp. 100013, 2021.

Disruption of resting-state functional connectivity (RSFC) between core regions of the dorsal attention network (DAN), including the bilateral superior parietal lobule (SPL), and structural damage of the right-lateralized ventral attention network (VAN), including the temporo-parietal junction (TPJ), have been described as neural basis for hemispatial neglect. Pursuing a virtual lesion model, we aimed to perturbate the attention networks of 22 healthy subjects by applying continuous theta burst stimulation (cTBS) to the right SPL or TPJ. We first created network masks of the DAN and VAN based on RSFC analyses from a RS-fMRI baseline session and determined the SPL and TPJ stimulation site within the respective mask. We then performed RS-fMRI immediately after cTBS of the SPL, TPJ (active sites) or vertex (control site). RSFC between SPL/TPJ and whole brain as well as between predefined regions of interest (ROI) in the attention networks was analyzed in a within-subject design. Contrary to our hypothesis, seed-based RSFC did not differ between the four experimental conditions. The individual change in ROI-to-ROI RSFC from baseline to post-stimulation did also not differ between active (SPL, TPJ) and control (vertex) cTBS. In our study, a single session offline cTBS over the right SPL or TPJ could not alter RSFC in the attention networks as compared to a control stimulation, maybe because effects wore off too early. Future studies should consider a modified cTBS protocol, concurrent TMS-fMRI or transcranial direct current stimulation.

  • doi:10.1016/j.ynirp.2021.100013

Adam M. McNeill; Rebecca L. Monk; Adam W. Qureshi; Stergios Makris; Valentina Cazzato; Derek Heim

Elevated ad libitum alcohol consumption following continuous theta burst stimulation to the left-dorsolateral prefrontal cortex is partially mediated by changes in craving Journal Article

In: Cognitive, Affective and Behavioral Neuroscience, vol. 21, no. 6, pp. 1–11, 2021.

A Correction to this paper has been published: https://doi.org/10.3758/s13415-021-00948-z. Previous research indicates that following alcohol intoxication, activity in prefrontal cortices is reduced, linking to changes in associated cognitive processes, such as inhibitory control, attentional bias (AB), and craving. While these changes have been implicated in alcohol consumption behaviour, it has yet to be fully illuminated how these frontal regions and cognitive processes interact to govern alcohol consumption behaviour. The current preregistered study applied continuous theta burst transcranial magnetic stimulation (cTBS) to examine directly these relationships while removing the wider pharmacological effects of alcohol. A mixed design was implemented, with cTBS stimulation to right and left dorsolateral prefrontal cortex (DLPFC), the medial orbital frontal cortex (mOFC) and Vertex, with measures of inhibitory control, AB, and craving taken both pre- and post-stimulation. Ad libitum consumption was measured using a bogus taste task. Results suggest that rDLPFC stimulation impaired inhibitory control but did not significantly increase ad libitum consumption. However, lDLPFC stimulation heightened craving and increased consumption, with findings indicating that changes in craving partially mediated the relationship between cTBS stimulation of prefrontal regions and ad libitum consumption. Medial OFC stimulation and AB findings were inconclusive. Overall, results implicate the left DLPFC in the regulation of craving, which appears to be a prepotent cognitive mechanism by which alcohol consumption is driven and maintained.

  • doi:10.3758/s13415-021-00948-z

Lara Merken; Marco Davare; Peter Janssen; Maria C. Romero

Behavioral effects of continuous theta-burst stimulation in macaque parietal cortex Journal Article

In: Scientific Reports, vol. 11, pp. 4511, 2021.

The neural mechanisms underlying the effects of continuous Theta-Burst Stimulation (cTBS) in humans are poorly understood. Animal studies can clarify the effects of cTBS on individual neurons, but behavioral evidence is necessary to demonstrate the validity of the animal model. We investigated the behavioral effect of cTBS applied over parietal cortex in rhesus monkeys performing a visually-guided grasping task with two differently sized objects, which required either a power grip or a pad-to-side grip. We used Fitts' law, predicting shorter grasping times (GT) for large compared to small objects, to investigate cTBS effects on two different grip types. cTBS induced long-lasting object-specific and dose-dependent changes in GT that remained present for up to two hours. High-intensity cTBS increased GTs for a power grip, but shortened GTs for a pad-to-side grip. Thus, high-intensity stimulation strongly reduced the natural GT difference between objects (i.e. the Fitts' law effect). In contrast, low-intensity cTBS induced the opposite effects on GT. Modifying the coil orientation from the standard 45-degree to a 30-degree angle induced opposite cTBS effects on GT. These findings represent behavioral evidence for the validity of the nonhuman primate model to study the neural underpinnings of non-invasive brain stimulation.

Close

  • doi:10.1038/s41598-021-83904-8

Close

Kentaro Miyamoto; Nadescha Trudel; Kevin Kamermans; Michele C. Lim; Alberto Lazari; Lennart Verhagen; Marco K. Wittmann; Matthew F. S. Rushworth

Identification and disruption of a neural mechanism for accumulating prospective metacognitive information prior to decision-making Journal Article

In: Neuron, vol. 109, no. 8, pp. 1396–1408, 2021.

Abstract | Links | BibTeX

@article{Miyamoto2021,
title = {Identification and disruption of a neural mechanism for accumulating prospective metacognitive information prior to decision-making},
author = {Kentaro Miyamoto and Nadescha Trudel and Kevin Kamermans and Michele C. Lim and Alberto Lazari and Lennart Verhagen and Marco K. Wittmann and Matthew F. S. Rushworth},
doi = {10.1016/j.neuron.2021.02.024},
year = {2021},
date = {2021-01-01},
journal = {Neuron},
volume = {109},
number = {8},
pages = {1396--1408},
publisher = {Elsevier Inc.},
abstract = {More than one type of probability must be considered when making decisions. It is as necessary to know one's chance of performing choices correctly as it is to know the chances that desired outcomes will follow choices. We refer to these two choice contingencies as internal and external probability. Neural activity across many frontal and parietal areas reflected internal and external probabilities in a similar manner during decision-making. However, neural recording and manipulation approaches suggest that one area, the anterior lateral prefrontal cortex (alPFC), is highly specialized for making prospective, metacognitive judgments on the basis of internal probability; it is essential for knowing which decisions to tackle, given its assessment of how well they will be performed. Its activity predicted prospective metacognitive judgments, and individual variation in activity predicted individual variation in metacognitive judgments. Its disruption altered metacognitive judgments, leading participants to tackle perceptual decisions they were likely to fail.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

More than one type of probability must be considered when making decisions. It is as necessary to know one's chance of performing choices correctly as it is to know the chances that desired outcomes will follow choices. We refer to these two choice contingencies as internal and external probability. Neural activity across many frontal and parietal areas reflected internal and external probabilities in a similar manner during decision-making. However, neural recording and manipulation approaches suggest that one area, the anterior lateral prefrontal cortex (alPFC), is highly specialized for making prospective, metacognitive judgments on the basis of internal probability; it is essential for knowing which decisions to tackle, given its assessment of how well they will be performed. Its activity predicted prospective metacognitive judgments, and individual variation in activity predicted individual variation in metacognitive judgments. Its disruption altered metacognitive judgments, leading participants to tackle perceptual decisions they were likely to fail.

Close

  • doi:10.1016/j.neuron.2021.02.024

Close

Roberto F. Salamanca-Giron; Estelle Raffin; Sarah B. Zandvliet; Martin Seeber; Christoph M. Michel; Paul Sauseng; Krystel R. Huxlin; Friedhelm C. Hummel

Enhancing visual motion discrimination by desynchronizing bifocal oscillatory activity Journal Article

In: NeuroImage, vol. 240, pp. 118299, 2021.

Abstract | Links | BibTeX

@article{SalanamcaGiron2021,
title = {Enhancing visual motion discrimination by desynchronizing bifocal oscillatory activity},
author = {Roberto F. Salamanca-Giron and Estelle Raffin and Sarah B. Zandvliet and Martin Seeber and Christoph M. Michel and Paul Sauseng and Krystel R. Huxlin and Friedhelm C. Hummel},
doi = {10.1016/j.neuroimage.2021.118299},
year = {2021},
date = {2021-01-01},
journal = {NeuroImage},
volume = {240},
pages = {118299},
publisher = {Elsevier Inc.},
abstract = {Visual motion discrimination involves reciprocal interactions in the alpha band between the primary visual cortex (V1) and mediotemporal areas (V5/MT). We investigated whether modulating alpha phase synchronization using individualized multisite transcranial alternating current stimulation (tACS) over V5 and V1 regions would improve motion discrimination. We tested 3 groups of healthy subjects with the following conditions: (1) individualized In-Phase V1alpha-V5alpha tACS (0° lag), (2) individualized Anti-Phase V1alpha-V5alpha tACS (180° lag) and (3) sham tACS. Motion discrimination and EEG activity were recorded before, during and after tACS. Performance significantly improved in the Anti-Phase group compared to the In-Phase group 10 and 30 min after stimulation. This result was explained by decreases in bottom-up alpha-V1 gamma-V5 phase-amplitude coupling. One possible explanation of these results is that Anti-Phase V1alpha-V5alpha tACS might impose an optimal phase lag between stimulation sites due to the inherent speed of wave propagation, hereby supporting optimized neuronal communication.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

Visual motion discrimination involves reciprocal interactions in the alpha band between the primary visual cortex (V1) and mediotemporal areas (V5/MT). We investigated whether modulating alpha phase synchronization using individualized multisite transcranial alternating current stimulation (tACS) over V5 and V1 regions would improve motion discrimination. We tested 3 groups of healthy subjects with the following conditions: (1) individualized In-Phase V1alpha-V5alpha tACS (0° lag), (2) individualized Anti-Phase V1alpha-V5alpha tACS (180° lag) and (3) sham tACS. Motion discrimination and EEG activity were recorded before, during and after tACS. Performance significantly improved in the Anti-Phase group compared to the In-Phase group 10 and 30 min after stimulation. This result was explained by decreases in bottom-up alpha-V1 gamma-V5 phase-amplitude coupling. One possible explanation of these results is that Anti-Phase V1alpha-V5alpha tACS might impose an optimal phase lag between stimulation sites due to the inherent speed of wave propagation, hereby supporting optimized neuronal communication.
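
The cross-frequency measure invoked in the abstract (alpha-phase to gamma-amplitude coupling) can be illustrated with the common mean-vector-length modulation index. The sketch below is a generic, self-contained example on a simulated signal and is not the authors' pipeline; in the cross-regional case described above, the phase and amplitude would come from two different sources (V1 and V5), and the sampling rate and band limits here are assumptions.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
# toy signal: 10 Hz alpha whose amplitude/phase modulates 60 Hz gamma
alpha = np.sin(2 * np.pi * 10 * t)
gamma = (1 + 0.5 * alpha) * np.sin(2 * np.pi * 60 * t)
eeg = alpha + 0.3 * gamma + 0.1 * rng.standard_normal(t.size)

def bandpass(x, lo, hi, fs, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

phase = np.angle(hilbert(bandpass(eeg, 8, 13, fs)))    # alpha phase (would be the V1 signal)
amp = np.abs(hilbert(bandpass(eeg, 40, 80, fs)))       # gamma amplitude (would be the V5 signal)
mi = np.abs(np.mean(amp * np.exp(1j * phase)))         # mean-vector-length modulation index
print(f"modulation index: {mi:.4f}")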

Close

  • doi:10.1016/j.neuroimage.2021.118299

Close

Omer Sharon; Firas Fahoum; Yuval Nir

Transcutaneous vagus nerve stimulation in humans induces pupil dilation and attenuates alpha oscillations Journal Article

In: Journal of Neuroscience, vol. 41, no. 2, pp. 320–330, 2021.

Abstract | Links | BibTeX

@article{Sharon2021,
title = {Transcutaneous vagus nerve stimulation in humans induces pupil dilation and attenuates alpha oscillations},
author = {Omer Sharon and Firas Fahoum and Yuval Nir},
doi = {10.1523/JNEUROSCI.1361-20.2020},
year = {2021},
date = {2021-01-01},
journal = {Journal of Neuroscience},
volume = {41},
number = {2},
pages = {320--330},
abstract = {Vagus nerve stimulation (VNS) is widely used to treat drug-resistant epilepsy and depression. While the precise mechanisms mediating its long-term therapeutic effects are not fully resolved, they likely involve locus coeruleus (LC) stimulation via the nucleus of the solitary tract, which receives afferent vagal inputs. In rats, VNS elevates LC firing and forebrain noradrenaline levels, whereas LC lesions suppress VNS therapeutic efficacy. Noninvasive transcutaneous VNS (tVNS) uses electrical stimulation that targets the auricular branch of the vagus nerve at the cymba conchae of the ear. However, the extent to which tVNS mimics VNS remains unclear. Here, we investigated the short-term effects of tVNS in healthy human male volunteers (n = 24), using high-density EEG and pupillometry during visual fixation at rest. We compared short (3.4 s) trials of tVNS to sham electrical stimulation at the earlobe (far from the vagus nerve branch) to control for somatosensory stimulation. Although tVNS and sham stimulation did not differ in subjective intensity ratings, tVNS led to robust pupil dilation (peaking 4-5 s after trial onset) that was significantly higher than following sham stimulation. We further quantified, using parallel factor analysis, how tVNS modulates idle occipital alpha (8-13Hz) activity identified in each participant. We found greater attenuation of alpha oscillations by tVNS than by sham stimulation. This demonstrates that tVNS reliably induces pupillary and EEG markers of arousal beyond the effects of somatosensory stimulation, thus supporting the hypothesis that tVNS elevates noradrenaline and other arousal-promoting neuromodulatory signaling, and mimics invasive VNS.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

Vagus nerve stimulation (VNS) is widely used to treat drug-resistant epilepsy and depression. While the precise mechanisms mediating its long-term therapeutic effects are not fully resolved, they likely involve locus coeruleus (LC) stimulation via the nucleus of the solitary tract, which receives afferent vagal inputs. In rats, VNS elevates LC firing and forebrain noradrenaline levels, whereas LC lesions suppress VNS therapeutic efficacy. Noninvasive transcutaneous VNS (tVNS) uses electrical stimulation that targets the auricular branch of the vagus nerve at the cymba conchae of the ear. However, the extent to which tVNS mimics VNS remains unclear. Here, we investigated the short-term effects of tVNS in healthy human male volunteers (n = 24), using high-density EEG and pupillometry during visual fixation at rest. We compared short (3.4 s) trials of tVNS to sham electrical stimulation at the earlobe (far from the vagus nerve branch) to control for somatosensory stimulation. Although tVNS and sham stimulation did not differ in subjective intensity ratings, tVNS led to robust pupil dilation (peaking 4-5 s after trial onset) that was significantly higher than following sham stimulation. We further quantified, using parallel factor analysis, how tVNS modulates idle occipital alpha (8-13Hz) activity identified in each participant. We found greater attenuation of alpha oscillations by tVNS than by sham stimulation. This demonstrates that tVNS reliably induces pupillary and EEG markers of arousal beyond the effects of somatosensory stimulation, thus supporting the hypothesis that tVNS elevates noradrenaline and other arousal-promoting neuromodulatory signaling, and mimics invasive VNS.
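
As a rough illustration of the pupillometric comparison described above (baseline-corrected dilation, tVNS versus sham, peaking 4-5 s after trial onset), the sketch below uses simulated data with assumed epoch shapes and sampling rate; it is not the authors' analysis.

import numpy as np

fs = 250.0
n_trials, n_samples = 80, int(10 * fs)                # 10-s epochs, stimulation onset at t = 2 s
rng = np.random.default_rng(4)
pupil = rng.standard_normal((n_trials, n_samples))    # hypothetical pupil-diameter epochs
is_tvns = rng.integers(0, 2, n_trials).astype(bool)   # True = tVNS trial, False = sham

onset = int(2 * fs)
baseline = pupil[:, :onset].mean(axis=1, keepdims=True)
dilation = pupil - baseline                            # baseline-corrected traces
window = slice(onset + int(4 * fs), onset + int(5 * fs))   # 4-5 s after onset (reported peak)
effect = dilation[is_tvns, window].mean() - dilation[~is_tvns, window].mean()
print(f"tVNS minus sham dilation (a.u.): {effect:.3f}")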

Close

  • doi:10.1523/JNEUROSCI.1361-20.2020

Close

Chloé Stengel; Marine Vernet; Julià L. Amengual; Antoni Valero-Cabré

Causal modulation of right hemisphere fronto-parietal phase synchrony with Transcranial Magnetic Stimulation during a conscious visual detection task Journal Article

In: Scientific Reports, vol. 11, pp. 3807, 2021.

Abstract | Links | BibTeX

@article{Stengel2021,
title = {Causal modulation of right hemisphere fronto-parietal phase synchrony with Transcranial Magnetic Stimulation during a conscious visual detection task},
author = {Chloé Stengel and Marine Vernet and Julià L. Amengual and Antoni Valero-Cabré},
doi = {10.1038/s41598-020-79812-y},
year = {2021},
date = {2021-01-01},
journal = {Scientific Reports},
volume = {11},
pages = {3807},
publisher = {Nature Publishing Group UK},
abstract = {Correlational evidence in non-human primates has reported increases of fronto-parietal high-beta (22–30 Hz) synchrony during the top-down allocation of visuo-spatial attention. But may inter-regional synchronization at this specific frequency band provide a causal mechanism by which top-down attentional processes facilitate conscious visual perception? To address this question, we analyzed electroencephalographic (EEG) signals from a group of healthy participants who performed a conscious visual detection task while we delivered brief (4 pulses) rhythmic (30 Hz) or random bursts of Transcranial Magnetic Stimulation (TMS) to the right Frontal Eye Field (FEF) prior to the onset of a lateralized target. We report increases of inter-regional synchronization in the high-beta band (25–35 Hz) between the electrode closest to the stimulated region (the right FEF) and right parietal EEG leads, and increases of local inter-trial coherence within the same frequency band over bilateral parietal EEG contacts, both driven by rhythmic but not random TMS patterns. Such increases were accompanied by improvements of conscious visual sensitivity for left visual targets in the rhythmic but not the random TMS condition. These outcomes suggest that high-beta inter-regional synchrony can be modulated non-invasively and that high-beta oscillatory activity across the right dorsal fronto-parietal network may contribute to the facilitation of conscious visual perception. Our work supports future applications of non-invasive brain stimulation to restore impaired visually-guided behaviors by operating on top-down attentional modulatory mechanisms.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

Correlational evidence in non-human primates has reported increases of fronto-parietal high-beta (22–30 Hz) synchrony during the top-down allocation of visuo-spatial attention. But may inter-regional synchronization at this specific frequency band provide a causal mechanism by which top-down attentional processes facilitate conscious visual perception? To address this question, we analyzed electroencephalographic (EEG) signals from a group of healthy participants who performed a conscious visual detection task while we delivered brief (4 pulses) rhythmic (30 Hz) or random bursts of Transcranial Magnetic Stimulation (TMS) to the right Frontal Eye Field (FEF) prior to the onset of a lateralized target. We report increases of inter-regional synchronization in the high-beta band (25–35 Hz) between the electrode closest to the stimulated region (the right FEF) and right parietal EEG leads, and increases of local inter-trial coherence within the same frequency band over bilateral parietal EEG contacts, both driven by rhythmic but not random TMS patterns. Such increases were accompanied by improvements of conscious visual sensitivity for left visual targets in the rhythmic but not the random TMS condition. These outcomes suggest that high-beta inter-regional synchrony can be modulated non-invasively and that high-beta oscillatory activity across the right dorsal fronto-parietal network may contribute to the facilitation of conscious visual perception. Our work supports future applications of non-invasive brain stimulation to restore impaired visually-guided behaviors by operating on top-down attentional modulatory mechanisms.
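
Two of the quantities reported in the abstract, inter-regional phase synchrony and inter-trial coherence, have compact standard estimators. The sketch below shows one common way to compute a phase-locking value (PLV) and ITC in the high-beta band from simulated trial data; the names, filter settings, and data shapes are assumptions and this is not the authors' code.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500.0
n_trials, n_samples = 100, 500
rng = np.random.default_rng(2)
frontal = rng.standard_normal((n_trials, n_samples))    # e.g., electrode near the FEF, trials x time
parietal = rng.standard_normal((n_trials, n_samples))   # e.g., a right parietal electrode

def band_phase(x, lo=25.0, hi=35.0):
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return np.angle(hilbert(filtfilt(b, a, x, axis=-1), axis=-1))

ph_f, ph_p = band_phase(frontal), band_phase(parietal)
plv = np.abs(np.mean(np.exp(1j * (ph_f - ph_p)), axis=0))   # inter-regional PLV per time point
itc = np.abs(np.mean(np.exp(1j * ph_p), axis=0))            # inter-trial coherence at the parietal site
print(plv.shape, itc.shape)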

Close

  • doi:10.1038/s41598-020-79812-y

Close

Fosca Al Roumi; Sébastien Marti; Liping Wang; Marie Amalric; Stanislas Dehaene

Mental compression of spatial sequences in human working memory using numerical and geometrical primitives Journal Article

In: Neuron, vol. 109, no. 16, pp. 2627–2639, 2021.

Abstract | Links | BibTeX

@article{AlRoumi2021,
title = {Mental compression of spatial sequences in human working memory using numerical and geometrical primitives},
author = {Fosca Al Roumi and Sébastien Marti and Liping Wang and Marie Amalric and Stanislas Dehaene},
doi = {10.1016/j.neuron.2021.06.009},
year = {2021},
date = {2021-01-01},
journal = {Neuron},
volume = {109},
number = {16},
pages = {2627--2639},
abstract = {How does the human brain store sequences of spatial locations? We propose that each sequence is internally compressed using an abstract, language-like code that captures its numerical and geometrical regularities. We exposed participants to spatial sequences of fixed length but variable regularity while their brain activity was recorded using magneto-encephalography. Using multivariate decoders, each successive location could be decoded from brain signals, and upcoming locations were anticipated prior to their actual onset. Crucially, sequences with lower complexity, defined as the minimal description length provided by the formal language, led to lower error rates and to increased anticipations. Furthermore, neural codes specific to the numerical and geometrical primitives of the postulated language could be detected, both in isolation and within the sequences. These results suggest that the human brain detects sequence regularities at multiple nested levels and uses them to compress long sequences in working memory.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

How does the human brain store sequences of spatial locations? We propose that each sequence is internally compressed using an abstract, language-like code that captures its numerical and geometrical regularities. We exposed participants to spatial sequences of fixed length but variable regularity while their brain activity was recorded using magneto-encephalography. Using multivariate decoders, each successive location could be decoded from brain signals, and upcoming locations were anticipated prior to their actual onset. Crucially, sequences with lower complexity, defined as the minimal description length provided by the formal language, led to lower error rates and to increased anticipations. Furthermore, neural codes specific to the numerical and geometrical primitives of the postulated language could be detected, both in isolation and within the sequences. These results suggest that the human brain detects sequence regularities at multiple nested levels and uses them to compress long sequences in working memory.

Close

  • doi:10.1016/j.neuron.2021.06.009

Close

Thomas Andrillon; Angus Burns; Teigane Mackay; Jennifer Windt; Naotsugu Tsuchiya

Predicting lapses of attention with sleep-like slow waves Journal Article

In: Nature Communications, vol. 12, pp. 3657, 2021.

Abstract | Links | BibTeX

@article{Andrillon2021,
title = {Predicting lapses of attention with sleep-like slow waves},
author = {Thomas Andrillon and Angus Burns and Teigane Mackay and Jennifer Windt and Naotsugu Tsuchiya},
doi = {10.1038/s41467-021-23890-7},
year = {2021},
date = {2021-01-01},
journal = {Nature Communications},
volume = {12},
pages = {3657},
publisher = {Springer US},
abstract = {Attentional lapses occur commonly and are associated with mind wandering, where focus is turned to thoughts unrelated to ongoing tasks and environmental demands, or mind blanking, where the stream of consciousness itself comes to a halt. To understand the neural mechanisms underlying attentional lapses, we studied the behaviour, subjective experience and neural activity of healthy participants performing a task. Random interruptions prompted participants to indicate their mental states as task-focused, mind-wandering or mind-blanking. Using high-density electroencephalography, we report here that spatially and temporally localized slow waves, a pattern of neural activity characteristic of the transition toward sleep, accompany behavioural markers of lapses and preceded reports of mind wandering and mind blanking. The location of slow waves could distinguish between sluggish and impulsive behaviours, and between mind wandering and mind blanking. Our results suggest attentional lapses share a common physiological origin: the emergence of local sleep-like activity within the awake brain.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

Attentional lapses occur commonly and are associated with mind wandering, where focus is turned to thoughts unrelated to ongoing tasks and environmental demands, or mind blanking, where the stream of consciousness itself comes to a halt. To understand the neural mechanisms underlying attentional lapses, we studied the behaviour, subjective experience and neural activity of healthy participants performing a task. Random interruptions prompted participants to indicate their mental states as task-focused, mind-wandering or mind-blanking. Using high-density electroencephalography, we report here that spatially and temporally localized slow waves, a pattern of neural activity characteristic of the transition toward sleep, accompany behavioural markers of lapses and preceded reports of mind wandering and mind blanking. The location of slow waves could distinguish between sluggish and impulsive behaviours, and between mind wandering and mind blanking. Our results suggest attentional lapses share a common physiological origin: the emergence of local sleep-like activity within the awake brain.

Close

  • doi:10.1038/s41467-021-23890-7

Close

M. Antúnez; S. Mancini; J. A. Hernández-Cabrera; L. J. Hoversten; H. A. Barber; M. Carreiras

Cross-linguistic semantic preview benefit in Basque-Spanish bilingual readers: Evidence from fixation-related potentials Journal Article

In: Brain and Language, vol. 214, pp. 104905, 2021.

Abstract | Links | BibTeX

@article{Antunez2021,
title = {Cross-linguistic semantic preview benefit in Basque-Spanish bilingual readers: Evidence from fixation-related potentials},
author = {M. Antúnez and S. Mancini and J. A. Hernández-Cabrera and L. J. Hoversten and H. A. Barber and M. Carreiras},
doi = {10.1016/j.bandl.2020.104905},
year = {2021},
date = {2021-01-01},
journal = {Brain and Language},
volume = {214},
pages = {104905},
abstract = {During reading, we can process and integrate information from words allocated in the parafoveal region. However, whether we extract and process the meaning of parafoveal words is still under debate. Here, we obtained Fixation-Related Potentials in a Basque-Spanish bilingual sample during a Spanish reading task. By using the boundary paradigm, we presented different parafoveal previews that could be either Basque non-cognate translations or unrelated Basque words. We prove for the first time cross-linguistic semantic preview benefit effects in alphabetic languages, providing novel evidence of modulations in the N400 component. Our findings suggest that the meaning of parafoveal words is processed and integrated during reading and that such meaning is activated and shared across languages in bilingual readers.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

During reading, we can process and integrate information from words allocated in the parafoveal region. However, whether we extract and process the meaning of parafoveal words is still under debate. Here, we obtained Fixation-Related Potentials in a Basque-Spanish bilingual sample during a Spanish reading task. By using the boundary paradigm, we presented different parafoveal previews that could be either Basque non-cognate translations or unrelated Basque words. We prove for the first time cross-linguistic semantic preview benefit effects in alphabetic languages, providing novel evidence of modulations in the N400 component. Our findings suggest that the meaning of parafoveal words is processed and integrated during reading and that such meaning is activated and shared across languages in bilingual readers.

Close

  • doi:10.1016/j.bandl.2020.104905

Close

Martín Antúnez; Sara Milligan; Juan Andrés Hernández‐Cabrera; Horacio A. Barber; Elizabeth R. Schotter

Semantic parafoveal processing in natural reading: Insight from fixation‐related potentials & eye movements Journal Article

In: Psychophysiology, pp. e13986, 2021.

Abstract | Links | BibTeX

@article{Antunez2021a,
title = {Semantic parafoveal processing in natural reading: Insight from fixation‐related potentials & eye movements},
author = {Martín Antúnez and Sara Milligan and Juan Andrés Hernández‐Cabrera and Horacio A. Barber and Elizabeth R. Schotter},
doi = {10.1111/psyp.13986},
year = {2021},
date = {2021-01-01},
journal = {Psychophysiology},
pages = {e13986},
abstract = {Prior research suggests that we may access the meaning of parafoveal words during reading. We explored how semantic-plausibility parafoveal processing takes place in natural reading through the co-registration of eye movements (EM) and fixation-related potentials (FRPs), using the boundary paradigm. We replicated previous evidence of semantic parafoveal processing from highly controlled reading situations, extending their findings to more ecologically valid reading scenarios. Additionally, and exploring the time-course of plausibility preview effects, we found distinct but complementary evidence from EM and FRPs measures. FRPs measures, showing a different trend than EM evidence, revealed that plausibility preview effects may be long-lasting. We highlight the importance of a co-registration set-up in ecologically valid scenarios to disentangle the mechanisms related to semantic-plausibility parafoveal processing.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

Prior research suggests that we may access the meaning of parafoveal words during reading. We explored how semantic-plausibility parafoveal processing takes place in natural reading through the co-registration of eye movements (EM) and fixation-related potentials (FRPs), using the boundary paradigm. We replicated previous evidence of semantic parafoveal processing from highly controlled reading situations, extending their findings to more ecologically valid reading scenarios. Additionally, and exploring the time-course of plausibility preview effects, we found distinct but complementary evidence from EM and FRPs measures. FRPs measures, showing a different trend than EM evidence, revealed that plausibility preview effects may be long-lasting. We highlight the importance of a co-registration set-up in ecologically valid scenarios to disentangle the mechanisms related to semantic-plausibility parafoveal processing.

Close

  • doi:10.1111/psyp.13986

Close

Damiano Azzalini; Anne Buot; Stefano Palminteri; Catherine Tallon-Baudry

Responses to heartbeats in ventromedial prefrontal cortex contribute to subjective preference-based decisions Journal Article

In: Journal of Neuroscience, vol. 41, no. 23, pp. 5102–5114, 2021.

Abstract | Links | BibTeX

@article{Azzalini2021,
title = {Responses to heartbeats in ventromedial prefrontal cortex contribute to subjective preference-based decisions},
author = {Damiano Azzalini and Anne Buot and Stefano Palminteri and Catherine Tallon-Baudry},
doi = {10.1523/JNEUROSCI.1932-20.2021},
year = {2021},
date = {2021-01-01},
journal = {Journal of Neuroscience},
volume = {41},
number = {23},
pages = {5102--5114},
abstract = {Forrest Gump or The Matrix? Preference-based decisions are subjective and entail self-reflection. However, these self-related features are unaccounted for by known neural mechanisms of valuation and choice. Self-related processes have been linked to a basic interoceptive biological mechanism, the neural monitoring of heartbeats, in particular in ventromedial prefrontal cortex (vmPFC), a region also involved in value encoding. We thus hypothesized a functional coupling between the neural monitoring of heartbeats and the precision of value encoding in vmPFC. Human participants of both sexes were presented with pairs of movie titles. They indicated either which movie they preferred or performed a control objective visual discrimination that did not require self-reflection. Using magnetoencephalography, we measured heartbeat-evoked responses (HERs) before option presentation and confirmed that HERs in vmPFC were larger when preparing for the subjective, self-related task. We retrieved the expected cortical value network during choice with time-resolved statistical modeling. Crucially, we show that larger HERs before option presentation are followed by stronger value encoding during choice in vmPFC. This effect is independent of overall vmPFC baseline activity. The neural interaction between HERs and value encoding predicted preference-based choice consistency over time, accounting for both interindividual differences and trial-to-trial fluctuations within individuals. Neither cardiac activity nor arousal fluctuations could account for any of the effects. HERs did not interact with the encoding of perceptual evidence in the discrimination task. Our results show that the self-reflection underlying preference-based decisions involves HERs, and that HER integration to subjective value encoding in vmPFC contributes to preference stability.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

Forrest Gump or The Matrix? Preference-based decisions are subjective and entail self-reflection. However, these self-related features are unaccounted for by known neural mechanisms of valuation and choice. Self-related processes have been linked to a basic interoceptive biological mechanism, the neural monitoring of heartbeats, in particular in ventromedial prefrontal cortex (vmPFC), a region also involved in value encoding. We thus hypothesized a functional coupling between the neural monitoring of heartbeats and the precision of value encoding in vmPFC. Human participants of both sexes were presented with pairs of movie titles. They indicated either which movie they preferred or performed a control objective visual discrimination that did not require self-reflection. Using magnetoencephalography, we measured heartbeat-evoked responses (HERs) before option presentation and confirmed that HERs in vmPFC were larger when preparing for the subjective, self-related task. We retrieved the expected cortical value network during choice with time-resolved statistical modeling. Crucially, we show that larger HERs before option presentation are followed by stronger value encoding during choice in vmPFC. This effect is independent of overall vmPFC baseline activity. The neural interaction between HERs and value encoding predicted preference-based choice consistency over time, accounting for both interindividual differences and trial-to-trial fluctuations within individuals. Neither cardiac activity nor arousal fluctuations could account for any of the effects. HERs did not interact with the encoding of perceptual evidence in the discrimination task. Our results show that the self-reflection underlying preference-based decisions involves HERs, and that HER integration to subjective value encoding in vmPFC contributes to preference stability.

Close

  • doi:10.1523/JNEUROSCI.1932-20.2021

Close

Shlomit Beker; John J. Foxe; Sophie Molholm

Oscillatory entrainment mechanisms and anticipatory predictive processes in children with autism spectrum disorder Journal Article

In: Journal of Neurophysiology, vol. 126, no. 5, pp. 1783–1798, 2021.

Abstract | Links | BibTeX

@article{Beker2021,
title = {Oscillatory entrainment mechanisms and anticipatory predictive processes in children with autism spectrum disorder},
author = {Shlomit Beker and John J. Foxe and Sophie Molholm},
doi = {10.1152/jn.00329.2021},
year = {2021},
date = {2021-01-01},
journal = {Journal of Neurophysiology},
volume = {126},
number = {5},
pages = {1783--1798},
abstract = {Anticipating near-future events is fundamental to adaptive behavior, whereby neural processing of predictable stimuli is significantly facilitated relative to nonpredictable events. Neural oscillations appear to be a key anticipatory mechanism by which processing of upcoming stimuli is modified, and they often entrain to rhythmic environmental sequences. Clinical and anecdotal observations have led to the hypothesis that people with autism spectrum disorder (ASD) may have deficits in generating predictions, and as such, a candidate neural mechanism may be failure to adequately entrain neural activity to repetitive environmental patterns, to facilitate temporal predictions. We tested this hypothesis by interrogating temporal predictions and rhythmic entrainment using behavioral and electrophysiological approaches. We recorded high-density electroencephalography in children with ASD and typically developing (TD) age- and IQ-matched controls, while they reacted to an auditory target as quickly as possible. This auditory event was either preceded by predictive rhythmic visual cues or was not preceded by any cue. Both ASD and control groups presented comparable behavioral facilitation in response to the Cue versus No-Cue condition, challenging the hypothesis that children with ASD have deficits in generating temporal predictions. Analyses of the electrophysiological data, in contrast, revealed significantly reduced neural entrainment to the visual cues and altered anticipatory processes in the ASD group. This was the case despite intact stimulus-evoked visual responses. These results support intact behavioral temporal prediction in response to a cue in ASD, in the face of altered neural entrainment and anticipatory processes.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

Anticipating near-future events is fundamental to adaptive behavior, whereby neural processing of predictable stimuli is significantly facilitated relative to nonpredictable events. Neural oscillations appear to be a key anticipatory mechanism by which processing of upcoming stimuli is modified, and they often entrain to rhythmic environmental sequences. Clinical and anecdotal observations have led to the hypothesis that people with autism spectrum disorder (ASD) may have deficits in generating predictions, and as such, a candidate neural mechanism may be failure to adequately entrain neural activity to repetitive environmental patterns, to facilitate temporal predictions. We tested this hypothesis by interrogating temporal predictions and rhythmic entrainment using behavioral and electrophysiological approaches. We recorded high-density electroencephalography in children with ASD and typically developing (TD) age- and IQ-matched controls, while they reacted to an auditory target as quickly as possible. This auditory event was either preceded by predictive rhythmic visual cues or was not preceded by any cue. Both ASD and control groups presented comparable behavioral facilitation in response to the Cue versus No-Cue condition, challenging the hypothesis that children with ASD have deficits in generating temporal predictions. Analyses of the electrophysiological data, in contrast, revealed significantly reduced neural entrainment to the visual cues and altered anticipatory processes in the ASD group. This was the case despite intact stimulus-evoked visual responses. These results support intact behavioral temporal prediction in response to a cue in ASD, in the face of altered neural entrainment and anticipatory processes.

Close

  • doi:10.1152/jn.00329.2021

Close

Chama Belkhiria; Vsevolod Peysakhovich

EOG metrics for cognitive workload detection Journal Article

In: Procedia Computer Science, vol. 192, pp. 1875–1884, 2021.

Abstract | Links | BibTeX

@article{Belkhiria2021,
title = {EOG metrics for cognitive workload detection},
author = {Chama Belkhiria and Vsevolod Peysakhovich},
doi = {10.1016/j.procs.2021.08.193},
year = {2021},
date = {2021-01-01},
journal = {Procedia Computer Science},
volume = {192},
pages = {1875--1884},
publisher = {Elsevier B.V.},
abstract = {Increasing workload is a central notion in human factors research that can decrease the performance and yield accidents. Thus, it is crucial to understand the impact of different internal operator's factors including eye movements, memory and audio-visual integration. Here, we explored the relationship between cognitive workload (low vs. high) and eye movements (saccades, fixations and smooth pursuit). The task difficulty was induced by auditory noise, arithmetical count and working memory load. We estimated cognitive workload using EOG and EEG-based mental state monitoring. One novelty consists in recording the EOG around the ears (alternative EOG) and around the eyes (conventional EOG). The number of blinks and saccades amplitude increased along with the difficulty increase (p ≤ 0.05). We found significant correlations between EOG and EEG (theta/alpha ratio) and between conventional and alternative EOG signal. The increase in cognitive load may disturb the coding and maintenance of related visual information. Alternative EOG metrics could be a valuable tool for detecting workload.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

Increasing workload is a central notion in human factors research that can decrease the performance and yield accidents. Thus, it is crucial to understand the impact of different internal operator's factors including eye movements, memory and audio-visual integration. Here, we explored the relationship between cognitive workload (low vs. high) and eye movements (saccades, fixations and smooth pursuit). The task difficulty was induced by auditory noise, arithmetical count and working memory load. We estimated cognitive workload using EOG and EEG-based mental state monitoring. One novelty consists in recording the EOG around the ears (alternative EOG) and around the eyes (conventional EOG). The number of blinks and saccades amplitude increased along with the difficulty increase (p ≤ 0.05). We found significant correlations between EOG and EEG (theta/alpha ratio) and between conventional and alternative EOG signal. The increase in cognitive load may disturb the coding and maintenance of related visual information. Alternative EOG metrics could be a valuable tool for detecting workload.
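
The EEG index correlated with the EOG metrics in the abstract, the theta/alpha power ratio, can be computed from a Welch power spectrum as sketched below. This is a generic illustration with simulated data and assumed band limits, not the authors' processing chain.

import numpy as np
from scipy.signal import welch

fs = 256.0
rng = np.random.default_rng(3)
eeg_segment = rng.standard_normal(int(60 * fs))   # hypothetical 60-s single-channel EEG segment

freqs, psd = welch(eeg_segment, fs=fs, nperseg=int(4 * fs))

def band_power(freqs, psd, lo, hi):
    sel = (freqs >= lo) & (freqs < hi)
    return np.trapz(psd[sel], freqs[sel])

theta = band_power(freqs, psd, 4, 8)
alpha = band_power(freqs, psd, 8, 13)
print(f"theta/alpha ratio: {theta / alpha:.2f}")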

Close

  • doi:10.1016/j.procs.2021.08.193

Close

Christoph Huber-Huber; Julia Steininger; Markus Grüner; Ulrich Ansorge

Psychophysical dual-task setups do not measure pre-saccadic attention but saccade-related strengthening of sensory representations Journal Article

In: Psychophysiology, vol. 58, no. 5, pp. e13787, 2021.

Abstract | Links | BibTeX

@article{HuberHuber2021a,
title = {Psychophysical dual-task setups do not measure pre-saccadic attention but saccade-related strengthening of sensory representations},
author = {Christoph Huber-Huber and Julia Steininger and Markus Grüner and Ulrich Ansorge},
doi = {10.1111/psyp.13787},
year = {2021},
date = {2021-01-01},
journal = {Psychophysiology},
volume = {58},
number = {5},
pages = {e13787},
abstract = {Visual attention and saccadic eye movements are linked in a tight, yet flexible fashion. In humans, this link is typically studied with dual-task setups. Participants are instructed to execute a saccade to some target location, while a discrimination target is flashed on a screen before the saccade can be made. Participants are also instructed to report a specific feature of this discrimination target at the trial end. Discrimination performance is usually better if the discrimination target occurred at the same location as the saccade target compared to when it occurred at a different location, which is explained by the mandatory shift of attention to the saccade target location before saccade onset. This pre-saccadic shift of attention presumably enhances the perception of the discrimination target if it occurred at the same, but not if it occurred at a different location. It is, however, known that a dual-task setup can alter the primary process under investigation. Here, we directly compared pre-saccadic attention in single-task versus dual-task setups using concurrent electroencephalography (EEG) and eye-tracking. Our results corroborate the idea of a pre-saccadic shift of attention. They, however, question that this shift leads to the same-position discrimination advantage. The relation of saccade and discrimination target position affected the EEG signal only after saccade onset. Our results, thus, favor an alternative explanation based on the role of saccades for the consolidation of sensory and short-term memory. We conclude that studies with dual-task setups arrived at a valid conclusion despite not measuring exactly what they intended to measure.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

Visual attention and saccadic eye movements are linked in a tight, yet flexible fashion. In humans, this link is typically studied with dual-task setups. Participants are instructed to execute a saccade to some target location, while a discrimination target is flashed on a screen before the saccade can be made. Participants are also instructed to report a specific feature of this discrimination target at the trial end. Discrimination performance is usually better if the discrimination target occurred at the same location as the saccade target compared to when it occurred at a different location, which is explained by the mandatory shift of attention to the saccade target location before saccade onset. This pre-saccadic shift of attention presumably enhances the perception of the discrimination target if it occurred at the same, but not if it occurred at a different location. It is, however, known that a dual-task setup can alter the primary process under investigation. Here, we directly compared pre-saccadic attention in single-task versus dual-task setups using concurrent electroencephalography (EEG) and eye-tracking. Our results corroborate the idea of a pre-saccadic shift of attention. They, however, question that this shift leads to the same-position discrimination advantage. The relation of saccade and discrimination target position affected the EEG signal only after saccade onset. Our results, thus, favor an alternative explanation based on the role of saccades for the consolidation of sensory and short-term memory. We conclude that studies with dual-task setups arrived at a valid conclusion despite not measuring exactly what they intended to measure.

Close

  • doi:10.1111/psyp.13787

Close

Anna Hudson; Amie J. Durston; Sarah D. McCrackin; Roxane J. Itier

Emotion, gender and gaze discrimination tasks do not differentially impact the neural processing of angry or happy facial expressions - A mass univariate ERP analysis Journal Article

In: Brain Topography, vol. 34, no. 6, pp. 813–833, 2021.

Abstract | Links | BibTeX

@article{Hudson2021,
title = {Emotion, gender and gaze discrimination tasks do not differentially impact the neural processing of angry or happy facial expressions - A mass univariate ERP analysis},
author = {Anna Hudson and Amie J. Durston and Sarah D. McCrackin and Roxane J. Itier},
doi = {10.1007/s10548-021-00873-x},
year = {2021},
date = {2021-01-01},
journal = {Brain Topography},
volume = {34},
number = {6},
pages = {813--833},
publisher = {Springer US},
abstract = {Facial expression processing is a critical component of social cognition yet, whether it is influenced by task demands at the neural level remains controversial. Past ERP studies have found mixed results with classic statistical analyses, known to increase both Type I and Type II errors, which Mass Univariate statistics (MUS) control better. However, MUS open-access toolboxes can use different fundamental statistics, which may lead to inconsistent results. Here, we compared the output of two MUS toolboxes, LIMO and FMUT, on the same data recorded during the processing of angry and happy facial expressions investigated under three tasks in a within-subjects design. Both toolboxes revealed main effects of emotion during the N170 timing and main effects of task during later time points typically associated with the LPP component. Neither toolbox yielded an interaction between the two factors at the group level, nor at the individual level in LIMO, confirming that the neural processing of these two face expressions is largely independent from task demands. Behavioural data revealed main effects of task on reaction time and accuracy, but no influence of expression or an interaction between the two. Expression processing and task demands are discussed in the context of the consistencies and discrepancies between the two toolboxes and existing literature.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

Facial expression processing is a critical component of social cognition yet, whether it is influenced by task demands at the neural level remains controversial. Past ERP studies have found mixed results with classic statistical analyses, known to increase both Type I and Type II errors, which Mass Univariate statistics (MUS) control better. However, MUS open-access toolboxes can use different fundamental statistics, which may lead to inconsistent results. Here, we compared the output of two MUS toolboxes, LIMO and FMUT, on the same data recorded during the processing of angry and happy facial expressions investigated under three tasks in a within-subjects design. Both toolboxes revealed main effects of emotion during the N170 timing and main effects of task during later time points typically associated with the LPP component. Neither toolbox yielded an interaction between the two factors at the group level, nor at the individual level in LIMO, confirming that the neural processing of these two face expressions is largely independent from task demands. Behavioural data revealed main effects of task on reaction time and accuracy, but no influence of expression or an interaction between the two. Expression processing and task demands are discussed in the context of the consistencies and discrepancies between the two toolboxes and existing literature.
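
For readers unfamiliar with the mass univariate approach compared in the abstract, the core idea is one statistical test per electrode and time sample, followed by a correction for the resulting multiple comparisons. The sketch below illustrates that idea with point-wise paired t-tests and a Benjamini-Hochberg FDR correction on simulated subject-level ERPs; it is a deliberately simplified stand-in and does not reproduce the bootstrap or clustering statistics actually used by LIMO or FMUT.

import numpy as np
from scipy.stats import ttest_rel

n_subj, n_elec, n_time = 24, 64, 300
rng = np.random.default_rng(5)
erp_angry = rng.standard_normal((n_subj, n_elec, n_time))   # hypothetical subject-level ERPs
erp_happy = rng.standard_normal((n_subj, n_elec, n_time))

t, p = ttest_rel(erp_angry, erp_happy, axis=0)    # one paired t-test per electrode/sample

def fdr_bh(pvals, q=0.05):
    flat = np.sort(pvals.ravel())
    m = flat.size
    passing = flat[flat <= (np.arange(1, m + 1) / m) * q]
    cutoff = passing.max() if passing.size else 0.0
    return pvals <= cutoff

sig_mask = fdr_bh(p)                               # boolean electrode x time significance map
print(f"{sig_mask.mean() * 100:.1f}% of samples significant")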

Close

  • doi:10.1007/s10548-021-00873-x

Close

Gelu Ionescu; Aline Frey; Nathalie Guyader; Emmanuelle Kristensen; Anton Andreev; Anne Guérin-Dugué

Synchronization of acquisition devices in neuroimaging: An application using co-registration of eye movements and electroencephalography Journal Article

In: Behavior Research Methods, pp. 1–20, 2021.

Abstract | Links | BibTeX

@article{Ionescu2021,
title = {Synchronization of acquisition devices in neuroimaging: An application using co-registration of eye movements and electroencephalography},
author = {Gelu Ionescu and Aline Frey and Nathalie Guyader and Emmanuelle Kristensen and Anton Andreev and Anne Guérin-Dugué},
doi = {10.3758/s13428-021-01756-6},
year = {2021},
date = {2021-01-01},
journal = {Behavior Research Methods},
pages = {1--20},
publisher = {Springer US},
abstract = {Interest in applications for the simultaneous acquisition of data from different devices is growing. In neuroscience for example, co-registration complements and overcomes some of the shortcomings of individual methods. However, precise synchronization of the different data streams involved is required before joint data analysis. Our article presents and evaluates a synchronization method which maximizes the alignment of information across time. Synchronization through common triggers is widely used in all existing methods, because it is very simple and effective. However, this solution has been found to fail in certain practical situations, namely for the spurious detection of triggers and/or when the timestamps of triggers sampled by each acquisition device are not jointly distributed linearly for the entire duration of an experiment. We propose two additional mechanisms, the "Longest Common Subsequence" algorithm and a piecewise linear regression, in order to overcome the limitations of the classical method of synchronizing common triggers. The proposed synchronization method was evaluated using both real and artificial data. Co-registrations of electroencephalographic signals (EEG) and eye movements were used for real data. We compared the effectiveness of our method to another open source method implemented using EYE-EEG toolbox. Overall, we show that our method, implemented in C++ as a DOS application, is very fast, robust and fully automatic.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

Interest in applications for the simultaneous acquisition of data from different devices is growing. In neuroscience for example, co-registration complements and overcomes some of the shortcomings of individual methods. However, precise synchronization of the different data streams involved is required before joint data analysis. Our article presents and evaluates a synchronization method which maximizes the alignment of information across time. Synchronization through common triggers is widely used in all existing methods, because it is very simple and effective. However, this solution has been found to fail in certain practical situations, namely for the spurious detection of triggers and/or when the timestamps of triggers sampled by each acquisition device are not jointly distributed linearly for the entire duration of an experiment. We propose two additional mechanisms, the "Longest Common Subsequence" algorithm and a piecewise linear regression, in order to overcome the limitations of the classical method of synchronizing common triggers. The proposed synchronization method was evaluated using both real and artificial data. Co-registrations of electroencephalographic signals (EEG) and eye movements were used for real data. We compared the effectiveness of our method to another open source method implemented using EYE-EEG toolbox. Overall, we show that our method, implemented in C++ as a DOS application, is very fast, robust and fully automatic.
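
The abstract's two key ingredients, matching common trigger codes between devices with a longest-common-subsequence style strategy and fitting a linear clock mapping between the matched timestamps, can be sketched in a few lines. The example below is illustrative only (the authors' tool is a C++ application): it uses Python's difflib for the subsequence matching and a single linear fit where the paper uses piecewise regression, and the trigger streams are made-up data.

import numpy as np
from difflib import SequenceMatcher

# hypothetical trigger streams: (code, timestamp in s) per device, with one spurious code in each
eeg_codes, eeg_times = [1, 2, 7, 3, 4, 5], [0.10, 1.12, 1.50, 2.15, 3.18, 4.20]
eye_codes, eye_times = [1, 2, 3, 9, 4, 5], [5.00, 6.02, 7.05, 7.40, 8.08, 9.10]

matcher = SequenceMatcher(a=eeg_codes, b=eye_codes, autojunk=False)
pairs = [(i + k, j + k)
         for i, j, size in matcher.get_matching_blocks()
         for k in range(size)]                      # indices of triggers common to both streams

x = np.array([eeg_times[i] for i, _ in pairs])
y = np.array([eye_times[j] for _, j in pairs])
slope, intercept = np.polyfit(x, y, 1)              # eye-tracker time = slope * EEG time + offset
print(f"clock drift (slope): {slope:.6f}, offset: {intercept:.3f} s")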

Close

  • doi:10.3758/s13428-021-01756-6

Close

Silvia L. Isabella; J. Allan Cheyne; Douglas Cheyne

Inhibitory control in the absence of awareness: Interactions between frontal and motor cortex oscillations mediate implicitly learned responses Journal Article

In: Frontiers in Human Neuroscience, vol. 15, pp. 786035, 2021.

Abstract | Links | BibTeX

@article{Isabella2021,
title = {Inhibitory control in the absence of awareness: Interactions between frontal and motor cortex oscillations mediate implicitly learned responses},
author = {Silvia L. Isabella and J. Allan Cheyne and Douglas Cheyne},
doi = {10.3389/fnhum.2021.786035},
year = {2021},
date = {2021-01-01},
journal = {Frontiers in Human Neuroscience},
volume = {15},
pages = {786035},
abstract = {Cognitive control of action is associated with conscious effort and is hypothesised to be reflected by increased frontal theta activity. However, the functional role of these increases in theta power, and how they contribute to cognitive control remains unknown. We conducted an MEG study to test the hypothesis that frontal theta oscillations interact with sensorimotor signals in order to produce controlled behaviour, and that the strength of these interactions will vary with the amount of control required. We measured neuromagnetic activity in 16 healthy adults performing a response inhibition (Go/Switch) task, known from previous work to modulate cognitive control requirements using hidden patterns of Go and Switch cues. Learning was confirmed by reduced reaction times (RT) to patterned compared to random Switch cues. Concurrent measures of pupil diameter revealed changes in subjective cognitive effort with stimulus probability, even in the absence of measurable behavioural differences, revealing instances of covert variations in cognitive effort. Significant theta oscillations were found in five frontal brain regions, with theta power in the right middle frontal and right premotor cortices parametrically increasing with cognitive effort. Similar increases in oscillatory power were also observed in motor cortical gamma, suggesting an interaction. Right middle frontal and right precentral theta activity predicted changes in pupil diameter across all experimental conditions, demonstrating a close relationship between frontal theta increases and cognitive control. Although no theta-gamma cross-frequency coupling was found, long-range theta phase coherence among the five significant sources between bilateral middle frontal, right inferior frontal, and bilateral premotor areas was found, thus providing a mechanism for the relay of cognitive control between frontal and motor areas via theta signalling. Furthermore, this provides the first evidence for the sensitivity of frontal theta oscillations to implicit motor learning and its effects on cognitive load. More generally these results present a possible mechanism for this frontal theta network to coordinate response preparation, inhibition and execution.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

Cognitive control of action is associated with conscious effort and is hypothesised to be reflected by increased frontal theta activity. However, the functional role of these increases in theta power, and how they contribute to cognitive control remains unknown. We conducted an MEG study to test the hypothesis that frontal theta oscillations interact with sensorimotor signals in order to produce controlled behaviour, and that the strength of these interactions will vary with the amount of control required. We measured neuromagnetic activity in 16 healthy adults performing a response inhibition (Go/Switch) task, known from previous work to modulate cognitive control requirements using hidden patterns of Go and Switch cues. Learning was confirmed by reduced reaction times (RT) to patterned compared to random Switch cues. Concurrent measures of pupil diameter revealed changes in subjective cognitive effort with stimulus probability, even in the absence of measurable behavioural differences, revealing instances of covert variations in cognitive effort. Significant theta oscillations were found in five frontal brain regions, with theta power in the right middle frontal and right premotor cortices parametrically increasing with cognitive effort. Similar increases in oscillatory power were also observed in motor cortical gamma, suggesting an interaction. Right middle frontal and right precentral theta activity predicted changes in pupil diameter across all experimental conditions, demonstrating a close relationship between frontal theta increases and cognitive control. Although no theta-gamma cross-frequency coupling was found, long-range theta phase coherence among the five significant sources between bilateral middle frontal, right inferior frontal, and bilateral premotor areas was found, thus providing a mechanism for the relay of cognitive control between frontal and motor areas via theta signalling. Furthermore, this provides the first evidence for the sensitivity of frontal theta oscillations to implicit motor learning and its effects on cognitive load. More generally these results present a possible mechanism for this frontal theta network to coordinate response preparation, inhibition and execution.

Close

  • doi:10.3389/fnhum.2021.786035

Close

Jianrong Jia; Ying Fan; Huan Luo

Alpha-band phase modulates bottom-up feature processing Journal Article

In: Cerebral Cortex, pp. 1–9, 2021.

Abstract | Links | BibTeX

@article{Jia2021,
title = {Alpha-band phase modulates bottom-up feature processing},
author = {Jianrong Jia and Ying Fan and Huan Luo},
doi = {10.1093/cercor/bhab291},
year = {2021},
date = {2021-01-01},
journal = {Cerebral Cortex},
pages = {1--9},
abstract = {Recent studies reveal that attention operates in a rhythmic manner, that is, sampling each location or feature alternatively over time. However, most evidence derives from top-down tasks, and it remains elusive whether bottom-up processing also entails dynamic coordination. Here, we developed a novel feature processing paradigm and combined time-resolved behavioral measurements and electroencephalogram (EEG) recordings to address the question. Specifically, a salient color in a multicolor display serves as a noninformative cue to capture attention and presumably reset the oscillations of feature processing. We then measured the behavioral performance of a probe stimulus associated with either high- or low-salient color at varied temporal lags after the cue. First, the behavioral results (i.e., reaction time) display an alpha-band ($\sim$8 Hz) profile with a consistent phase lag between high- and low-salient conditions. Second, simultaneous EEG recordings show that behavioral performance is modulated by the phase of alpha-band neural oscillation at the onset of the probes. Finally, high- and low-salient probes are associated with distinct preferred phases of alpha-band neural oscillations. Taken together, our behavioral and neural results convergingly support a central function of alpha-band rhythms in feature processing, that is, features with varied saliency levels are processed at different phases of alpha neural oscillations.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

Recent studies reveal that attention operates in a rhythmic manner, that is, sampling each location or feature alternatively over time. However, most evidence derives from top-down tasks, and it remains elusive whether bottom-up processing also entails dynamic coordination. Here, we developed a novel feature processing paradigm and combined time-resolved behavioral measurements and electroencephalogram (EEG) recordings to address the question. Specifically, a salient color in a multicolor display serves as a noninformative cue to capture attention and presumably reset the oscillations of feature processing. We then measured the behavioral performance of a probe stimulus associated with either high- or low-salient color at varied temporal lags after the cue. First, the behavioral results (i.e., reaction time) display an alpha-band (~8 Hz) profile with a consistent phase lag between high- and low-salient conditions. Second, simultaneous EEG recordings show that behavioral performance is modulated by the phase of alpha-band neural oscillation at the onset of the probes. Finally, high- and low-salient probes are associated with distinct preferred phases of alpha-band neural oscillations. Taken together, our behavioral and neural results convergingly support a central function of alpha-band rhythms in feature processing, that is, features with varied saliency levels are processed at different phases of alpha neural oscillations.
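
A common way to relate behaviour to the phase of alpha oscillations at stimulus onset, as in the EEG analysis described above, is to estimate single-trial phase and then bin trials by phase. The sketch below is a bare-bones, simulated-data illustration under assumed parameters (sampling rate, band limits, epoch layout), not the authors' method; note also that Hilbert phase estimates at the epoch edge are distorted in practice and are usually taken from padded or pre-stimulus data.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500.0
n_trials, n_samples = 200, 500                        # 1-s epochs ending at probe onset
rng = np.random.default_rng(6)
epochs = rng.standard_normal((n_trials, n_samples))   # hypothetical single-channel epochs
rt = rng.uniform(0.3, 0.8, n_trials)                  # hypothetical reaction times (s)

b, a = butter(4, [8 / (fs / 2), 13 / (fs / 2)], btype="band")
phase_at_onset = np.angle(hilbert(filtfilt(b, a, epochs, axis=-1), axis=-1))[:, -1]

n_bins = 6
bins = np.digitize(phase_at_onset, np.linspace(-np.pi, np.pi, n_bins + 1)) - 1
bins = np.clip(bins, 0, n_bins - 1)
mean_rt = [rt[bins == k].mean() for k in range(n_bins)]   # mean RT per alpha-phase bin
print(np.round(mean_rt, 3))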

  • doi:10.1093/cercor/bhab291
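
The central analysis idea in the Jia, Fan and Luo entry above (behavioural performance depending on alpha phase at probe onset) can be illustrated with a generic phase-binning sketch. The array names, sampling rate, and 8-12 Hz band below are assumptions for illustration only, not the authors' exact pipeline.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500  # assumed sampling rate in Hz
b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")

def alpha_phase_at_onset(eeg_epochs, probe_idx):
    # eeg_epochs: (n_trials, n_samples) single-channel epochs;
    # probe_idx: sample index of probe onset within each epoch.
    alpha = filtfilt(b, a, eeg_epochs, axis=1)    # alpha-band filter
    phase = np.angle(hilbert(alpha, axis=1))      # instantaneous phase
    return phase[np.arange(eeg_epochs.shape[0]), probe_idx]

def mean_rt_by_phase(phase, rt, n_bins=6):
    # Average reaction time within equally spaced phase bins from -pi to pi.
    edges = np.linspace(-np.pi, np.pi, n_bins + 1)
    which = np.clip(np.digitize(phase, edges) - 1, 0, n_bins - 1)
    return np.array([rt[which == k].mean() for k in range(n_bins)])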

Efthymia C. Kapnoula; Bob McMurray

Idiosyncratic use of bottom-up and top-down information leads to differences in speech perception flexibility: Converging evidence from ERPs and eye-tracking Journal Article

In: Brain and Language, vol. 223, pp. 105031, 2021.

Abstract | Links | BibTeX

@article{Kapnoula2021,
title = {Idiosyncratic use of bottom-up and top-down information leads to differences in speech perception flexibility: Converging evidence from ERPs and eye-tracking},
author = {Efthymia C. Kapnoula and Bob McMurray},
doi = {10.1016/j.bandl.2021.105031},
year = {2021},
date = {2021-01-01},
journal = {Brain and Language},
volume = {223},
pages = {105031},
publisher = {Elsevier Inc.},
abstract = {Listeners generally categorize speech sounds in a gradient manner. However, recent work, using a visual analogue scaling (VAS) task, suggests that some listeners show more categorical performance, leading to less flexible cue integration and poorer recovery from misperceptions (Kapnoula et al., 2017, 2021). We asked how individual differences in speech gradiency can be reconciled with the well-established gradiency in the modal listener, showing how VAS performance relates to both Visual World Paradigm and EEG measures of gradiency. We also investigated three potential sources of these individual differences: inhibitory control; lexical inhibition; and early cue encoding. We used the N1 ERP component to track pre-categorical encoding of Voice Onset Time (VOT). The N1 linearly tracked VOT, reflecting a fundamentally gradient speech perception; however, for less gradient listeners, this linearity was disrupted near the boundary. Thus, while all listeners are gradient, they may show idiosyncratic encoding of specific cues, affecting downstream processing.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Listeners generally categorize speech sounds in a gradient manner. However, recent work, using a visual analogue scaling (VAS) task, suggests that some listeners show more categorical performance, leading to less flexible cue integration and poorer recovery from misperceptions (Kapnoula et al., 2017, 2021). We asked how individual differences in speech gradiency can be reconciled with the well-established gradiency in the modal listener, showing how VAS performance relates to both Visual World Paradigm and EEG measures of gradiency. We also investigated three potential sources of these individual differences: inhibitory control; lexical inhibition; and early cue encoding. We used the N1 ERP component to track pre-categorical encoding of Voice Onset Time (VOT). The N1 linearly tracked VOT, reflecting a fundamentally gradient speech perception; however, for less gradient listeners, this linearity was disrupted near the boundary. Thus, while all listeners are gradient, they may show idiosyncratic encoding of specific cues, affecting downstream processing.

  • doi:10.1016/j.bandl.2021.105031

Hamid Karimi-Rouzbahani; Alexandra Woolgar; Anina N. Rich

Neural signatures of vigilance decrements predict behavioural errors before they occur Journal Article

In: eLife, vol. 10, pp. e60563, 2021.

Abstract | Links | BibTeX

@article{KarimiRouzbahani2021,
title = {Neural signatures of vigilance decrements predict behavioural errors before they occur},
author = {Hamid Karimi-Rouzbahani and Alexandra Woolgar and Anina N. Rich},
doi = {10.7554/ELIFE.60563},
year = {2021},
date = {2021-01-01},
journal = {eLife},
volume = {10},
pages = {e60563},
abstract = {There are many monitoring environments, such as railway control, in which lapses of attention can have tragic consequences. Problematically, sustained monitoring for rare targets is difficult, with more misses and longer reaction times over time. What changes in the brain underpin these ‘vigilance decrements'? We designed a multiple-object monitoring (MOM) paradigm to examine how the neural representation of information varied with target frequency and time performing the task. Behavioural performance decreased over time for the rare target (monitoring) condition, but not for a frequent target (active) condition. This was mirrored in neural decoding using magnetoencephalography: coding of critical information declined more during monitoring versus active conditions along the experiment. We developed new analyses that can predict behavioural errors from the neural data more than a second before they occurred. This facilitates pre-empting behavioural errors due to lapses in attention and provides new insight into the neural correlates of vigilance decrements.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

There are many monitoring environments, such as railway control, in which lapses of attention can have tragic consequences. Problematically, sustained monitoring for rare targets is difficult, with more misses and longer reaction times over time. What changes in the brain underpin these ‘vigilance decrements'? We designed a multiple-object monitoring (MOM) paradigm to examine how the neural representation of information varied with target frequency and time performing the task. Behavioural performance decreased over time for the rare target (monitoring) condition, but not for a frequent target (active) condition. This was mirrored in neural decoding using magnetoencephalography: coding of critical information declined more during monitoring versus active conditions along the experiment. We developed new analyses that can predict behavioural errors from the neural data more than a second before they occurred. This facilitates pre-empting behavioural errors due to lapses in attention and provides new insight into the neural correlates of vigilance decrements.

  • doi:10.7554/ELIFE.60563
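
The authors developed bespoke analyses for predicting errors (detailed in the paper); as a rough illustration of the more standard time-resolved decoding the abstract also refers to, one can train a classifier per time point on hypothetical MEG epochs and inspect how decoding accuracy evolves in the seconds before a behavioural error. Everything below (array names, classifier choice) is an assumption for illustration only.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def decoding_timecourse(meg, y, cv=5):
    # meg: (n_trials, n_sensors, n_times) epochs; y: per-trial label
    # (e.g. upcoming hit vs. miss). One classifier per time point.
    n_trials, n_sensors, n_times = meg.shape
    scores = np.zeros(n_times)
    for t in range(n_times):
        clf = LogisticRegression(max_iter=1000)
        scores[t] = cross_val_score(clf, meg[:, :, t], y, cv=cv).mean()
    return scores  # cross-validated accuracy at each time point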

Julian Q. Kosciessa; Ulman Lindenberger; Douglas D. Garrett

Thalamocortical excitability modulation guides human perception under uncertainty Journal Article

In: Nature Communications, vol. 12, pp. 2430, 2021.

Abstract | Links | BibTeX

@article{Kosciessa2021,
title = {Thalamocortical excitability modulation guides human perception under uncertainty},
author = {Julian Q. Kosciessa and Ulman Lindenberger and Douglas D. Garrett},
doi = {10.1038/s41467-021-22511-7},
year = {2021},
date = {2021-01-01},
journal = {Nature Communications},
volume = {12},
pages = {2430},
publisher = {Springer US},
abstract = {Knowledge about the relevance of environmental features can guide stimulus processing. However, it remains unclear how processing is adjusted when feature relevance is uncertain. We hypothesized that (a) heightened uncertainty would shift cortical networks from a rhythmic, selective processing-oriented state toward an asynchronous (“excited”) state that boosts sensitivity to all stimulus features, and that (b) the thalamus provides a subcortical nexus for such uncertainty-related shifts. Here, we had young adults attend to varying numbers of task-relevant features during EEG and fMRI acquisition to test these hypotheses. Behavioral modeling and electrophysiological signatures revealed that greater uncertainty lowered the rate of evidence accumulation for individual stimulus features, shifted the cortex from a rhythmic to an asynchronous/excited regime, and heightened neuromodulatory arousal. Crucially, this unified constellation of within-person effects was dominantly reflected in the uncertainty-driven upregulation of thalamic activity. We argue that neuromodulatory processes involving the thalamus play a central role in how the brain modulates neural excitability in the face of momentary uncertainty.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Knowledge about the relevance of environmental features can guide stimulus processing. However, it remains unclear how processing is adjusted when feature relevance is uncertain. We hypothesized that (a) heightened uncertainty would shift cortical networks from a rhythmic, selective processing-oriented state toward an asynchronous (“excited”) state that boosts sensitivity to all stimulus features, and that (b) the thalamus provides a subcortical nexus for such uncertainty-related shifts. Here, we had young adults attend to varying numbers of task-relevant features during EEG and fMRI acquisition to test these hypotheses. Behavioral modeling and electrophysiological signatures revealed that greater uncertainty lowered the rate of evidence accumulation for individual stimulus features, shifted the cortex from a rhythmic to an asynchronous/excited regime, and heightened neuromodulatory arousal. Crucially, this unified constellation of within-person effects was dominantly reflected in the uncertainty-driven upregulation of thalamic activity. We argue that neuromodulatory processes involving the thalamus play a central role in how the brain modulates neural excitability in the face of momentary uncertainty.

  • doi:10.1038/s41467-021-22511-7

James E. Kragel; Stephan Schuele; Stephen VanHaerents; Joshua M. Rosenow; Joel L. Voss

Rapid coordination of effective learning by the human hippocampus Journal Article

In: Science Advances, vol. 7, no. 25, pp. eabf7144, 2021.

Abstract | Links | BibTeX

@article{Kragel2021,
title = {Rapid coordination of effective learning by the human hippocampus},
author = {James E. Kragel and Stephan Schuele and Stephen VanHaerents and Joshua M. Rosenow and Joel L. Voss},
doi = {10.1126/sciadv.abf7144},
year = {2021},
date = {2021-01-01},
journal = {Science Advances},
volume = {7},
number = {25},
pages = {eabf7144},
abstract = {Although the human hippocampus is necessary for long-term memory, controversial findings suggest that it may also support short-term memory in the service of guiding effective behaviors during learning. We tested the counterintuitive theory that the hippocampus contributes to long-term memory through remarkably short-term processing, as reflected in eye movements during scene encoding. While viewing scenes for the first time, short-term retrieval operative within the episode over only hundreds of milliseconds was indicated by a specific eye-movement pattern, which was effective in that it enhanced spatiotemporal memory formation. This viewing pattern was predicted by hippocampal theta oscillations recorded from depth electrodes and by shifts toward top-down influence of hippocampal theta on activity within visual perception and attention networks. The hippocampus thus supports short-term memory processing that coordinates behavior in the service of effective spatiotemporal learning.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Although the human hippocampus is necessary for long-term memory, controversial findings suggest that it may also support short-term memory in the service of guiding effective behaviors during learning. We tested the counterintuitive theory that the hippocampus contributes to long-term memory through remarkably short-term processing, as reflected in eye movements during scene encoding. While viewing scenes for the first time, short-term retrieval operative within the episode over only hundreds of milliseconds was indicated by a specific eye-movement pattern, which was effective in that it enhanced spatiotemporal memory formation. This viewing pattern was predicted by hippocampal theta oscillations recorded from depth electrodes and by shifts toward top-down influence of hippocampal theta on activity within visual perception and attention networks. The hippocampus thus supports short-term memory processing that coordinates behavior in the service of effective spatiotemporal learning.

  • doi:10.1126/sciadv.abf7144

Wouter Kruijne; Christian N. L. Olivers; Hedderik van Rijn

Neural repetition suppression modulates time perception: Evidence from electrophysiology and pupillometry Journal Article

In: Journal of Cognitive Neuroscience, vol. 33, no. 7, pp. 1230–1252, 2021.

Abstract | Links | BibTeX

@article{Kruijne2021,
title = {Neural repetition suppression modulates time perception: Evidence from electrophysiology and pupillometry},
author = {Wouter Kruijne and Christian N. L. Olivers and Hedderik van Rijn},
doi = {10.1162/jocn_a_01705},
year = {2021},
date = {2021-01-01},
journal = {Journal of Cognitive Neuroscience},
volume = {33},
number = {7},
pages = {1230--1252},
abstract = {Human time perception is malleable and subject to many biases. For example, it has repeatedly been shown that stimuli that are physically intense or that are unexpected seem to last longer. Two competing hypotheses have been proposed to account for such biases: One states that these temporal illusions are the result of increased levels of arousal that speeds up neural clock dynamics, whereas the alternative “magnitude coding” account states that the magnitude of sensory responses causally modulates perceived durations. Common experimental paradigms used to study temporal biases cannot dissociate between these accounts, as arousal and sensory magnitude covary and modulate each other. Here, we present two temporal discrimination experiments where two flashing stimuli demarcated the start and end of a to-be-timed interval. These stimuli could be either in the same or a different location, which led to different sensory responses because of neural repetition suppression. Crucially, changes and repetitions were fully predictable, which allowed us to explore effects of sensory response magnitude without changes in arousal or surprise. Intervals with changing markers were perceived as lasting longer than those with repeating markers. We measured EEG (Experiment 1) and pupil size (Experiment 2) and found that temporal perception was related to changes in ERPs (P2) and pupil constriction, both of which have been related to responses in the sensory cortex. Conversely, correlates of surprise and arousal (P3 amplitude and pupil dilation) were unaffected by stimulus repetitions and changes. These results demonstrate, for the first time, that sensory magnitude affects time perception even under constant levels of arousal.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Human time perception is malleable and subject to many biases. For example, it has repeatedly been shown that stimuli that are physically intense or that are unexpected seem to last longer. Two competing hypotheses have been proposed to account for such biases: One states that these temporal illusions are the result of increased levels of arousal that speeds up neural clock dynamics, whereas the alternative “magnitude coding” account states that the magnitude of sensory responses causally modulates perceived durations. Common experimental paradigms used to study temporal biases cannot dissociate between these accounts, as arousal and sensory magnitude covary and modulate each other. Here, we present two temporal discrimination experiments where two flashing stimuli demarcated the start and end of a to-be-timed interval. These stimuli could be either in the same or a different location, which led to different sensory responses because of neural repetition suppression. Crucially, changes and repetitions were fully predictable, which allowed us to explore effects of sensory response magnitude without changes in arousal or surprise. Intervals with changing markers were perceived as lasting longer than those with repeating markers. We measured EEG (Experiment 1) and pupil size (Experiment 2) and found that temporal perception was related to changes in ERPs (P2) and pupil constriction, both of which have been related to responses in the sensory cortex. Conversely, correlates of surprise and arousal (P3 amplitude and pupil dilation) were unaffected by stimulus repetitions and changes. These results demonstrate, for the first time, that sensory magnitude affects time perception even under constant levels of arousal.

  • doi:10.1162/jocn_a_01705

Louisa Kulke; Lena Brümmer; Arezoo Pooresmaeili; Annekathrin Schacht

Overt and covert attention shifts to emotional faces: Combining EEG, eye tracking, and a go/no-go paradigm Journal Article

In: Psychophysiology, vol. 58, no. 8, pp. e13838, 2021.

Abstract | Links | BibTeX

@article{Kulke2021,
title = {Overt and covert attention shifts to emotional faces: Combining EEG, eye tracking, and a go/no-go paradigm},
author = {Louisa Kulke and Lena Brümmer and Arezoo Pooresmaeili and Annekathrin Schacht},
doi = {10.1111/psyp.13838},
year = {2021},
date = {2021-01-01},
journal = {Psychophysiology},
volume = {58},
number = {8},
pages = {e13838},
abstract = {In everyday life, faces with emotional expressions quickly attract attention and eye movements. To study the neural mechanisms of such emotion-driven attention by means of event-related brain potentials (ERPs), tasks that employ covert shifts of attention are commonly used, in which participants need to inhibit natural eye movements towards stimuli. It remains, however, unclear how shifts of attention to emotional faces with and without eye movements differ from each other. The current preregistered study aimed to investigate neural differences between covert and overt emotion-driven attention. We combined eye tracking with measurements of ERPs to compare shifts of attention to faces with happy, angry, or neutral expressions when eye movements were either executed (go conditions) or withheld (no-go conditions). Happy and angry faces led to larger EPN amplitudes, shorter latencies of the P1 component, and faster saccades, suggesting that emotional expressions significantly affected shifts of attention. Several ERPs (N170, EPN, LPC) were augmented in amplitude when attention was shifted with an eye movement, indicating an enhanced neural processing of faces if eye movements had to be executed together with a reallocation of attention. However, the modulation of ERPs by facial expressions did not differ between the go and no-go conditions, suggesting that emotional content enhances both covert and overt shifts of attention. In summary, our results indicate that overt and covert attention shifts differ but are comparably affected by emotional content.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

In everyday life, faces with emotional expressions quickly attract attention and eye movements. To study the neural mechanisms of such emotion-driven attention by means of event-related brain potentials (ERPs), tasks that employ covert shifts of attention are commonly used, in which participants need to inhibit natural eye movements towards stimuli. It remains, however, unclear how shifts of attention to emotional faces with and without eye movements differ from each other. The current preregistered study aimed to investigate neural differences between covert and overt emotion-driven attention. We combined eye tracking with measurements of ERPs to compare shifts of attention to faces with happy, angry, or neutral expressions when eye movements were either executed (go conditions) or withheld (no-go conditions). Happy and angry faces led to larger EPN amplitudes, shorter latencies of the P1 component, and faster saccades, suggesting that emotional expressions significantly affected shifts of attention. Several ERPs (N170, EPN, LPC) were augmented in amplitude when attention was shifted with an eye movement, indicating an enhanced neural processing of faces if eye movements had to be executed together with a reallocation of attention. However, the modulation of ERPs by facial expressions did not differ between the go and no-go conditions, suggesting that emotional content enhances both covert and overt shifts of attention. In summary, our results indicate that overt and covert attention shifts differ but are comparably affected by emotional content.

  • doi:10.1111/psyp.13838

Seungji Lee; Doyoung Lee; Hyunjae Gil; Ian Oakley; Yang Seok Cho; Sung Phil Kim

Eye fixation-related potentials during visual search on acquaintance and newly-learned faces Journal Article

In: Brain Sciences, vol. 11, no. 2, pp. 1–15, 2021.

Abstract | Links | BibTeX

@article{Lee2021b,
title = {Eye fixation-related potentials during visual search on acquaintance and newly-learned faces},
author = {Seungji Lee and Doyoung Lee and Hyunjae Gil and Ian Oakley and Yang Seok Cho and Sung Phil Kim},
doi = {10.3390/brainsci11020218},
year = {2021},
date = {2021-01-01},
journal = {Brain Sciences},
volume = {11},
number = {2},
pages = {1--15},
abstract = {Searching familiar faces in the crowd may involve stimulus-driven attention by emotional significance, together with goal-directed attention due to task-relevant needs. The present study investigated the effect of familiarity on attentional processes by exploring eye fixation-related potentials (EFRPs) and eye gazes when humans searched for, among other distracting faces, either an acquaintance's face or a newly-learned face. Task performance and gaze behavior were indistinguishable for identifying either faces. However, from the EFRP analysis, after a P300 component for successful search of target faces, we found greater deflections of right parietal late positive potentials in response to newly-learned faces than acquaintance's faces, indicating more involvement of goal-directed attention in processing newly-learned faces. In addition, we found greater occipital negativity elicited by acquaintance's faces, reflecting emotional responses to significant stimuli. These results may suggest that finding a familiar face in the crowd would involve lower goal-directed attention and elicit more emotional responses.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Searching familiar faces in the crowd may involve stimulus-driven attention by emotional significance, together with goal-directed attention due to task-relevant needs. The present study investigated the effect of familiarity on attentional processes by exploring eye fixation-related potentials (EFRPs) and eye gazes when humans searched for, among other distracting faces, either an acquaintance's face or a newly-learned face. Task performance and gaze behavior were indistinguishable for identifying either faces. However, from the EFRP analysis, after a P300 component for successful search of target faces, we found greater deflections of right parietal late positive potentials in response to newly-learned faces than acquaintance's faces, indicating more involvement of goal-directed attention in processing newly-learned faces. In addition, we found greater occipital negativity elicited by acquaintance's faces, reflecting emotional responses to significant stimuli. These results may suggest that finding a familiar face in the crowd would involve lower goal-directed attention and elicit more emotional responses.

  • doi:10.3390/brainsci11020218
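
Eye fixation-related potentials, as in the Lee et al. entry above, require cutting EEG epochs time-locked to fixation onsets reported by the eye tracker. The helper below is a simplified, hypothetical sketch of that re-epoching step; the artifact rejection, baseline correction, and overlap handling used in the actual study are omitted.

import numpy as np

def fixation_locked_epochs(eeg, fs, fixation_onsets_s, tmin=-0.2, tmax=0.6):
    # eeg: (n_channels, n_samples) continuous recording synchronised with the
    # eye tracker; fixation_onsets_s: fixation onset times in seconds.
    pre, post = int(-tmin * fs), int(tmax * fs)
    epochs = []
    for onset in fixation_onsets_s:
        i = int(round(onset * fs))
        if i - pre >= 0 and i + post <= eeg.shape[1]:
            epochs.append(eeg[:, i - pre:i + post])
    return np.stack(epochs)  # (n_fixations, n_channels, n_times)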

Cai S. Longman; Heike Elchlepp; Stephen Monsell; Aureliu Lavric

Serial or parallel proactive control of components of task-set? A task-switching investigation with concurrent EEG and eye-tracking Journal Article

In: Neuropsychologia, vol. 160, pp. 107984, 2021.

Abstract | Links | BibTeX

@article{Longman2021,
title = {Serial or parallel proactive control of components of task-set? A task-switching investigation with concurrent EEG and eye-tracking},
author = {Cai S. Longman and Heike Elchlepp and Stephen Monsell and Aureliu Lavric},
doi = {10.1016/j.neuropsychologia.2021.107984},
year = {2021},
date = {2021-01-01},
journal = {Neuropsychologia},
volume = {160},
pages = {107984},
publisher = {Elsevier Ltd},
abstract = {Among the issues examined by studies of cognitive control in multitasking is whether processes underlying performance in the different tasks occur serially or in parallel. Here we ask a similar question about processes that pro-actively control task-set. In task-switching experiments, several indices of task-set preparation have been extensively documented, including anticipatory orientation of gaze to the task-relevant location (an unambiguous marker of reorientation of attention), and a positive polarity brain potential over the posterior cortex (whose functional significance is less well understood). We examine whether these markers of preparation occur in parallel or serially, and in what order. On each trial a cue required participants to make a semantic classification of one of three digits presented simultaneously, with the location of each digit consistently associated with one of three classification tasks (e.g., if the task was odd/even, the digit at the top of the display was relevant). The EEG positivity emerged following, and appeared time-locked to, the anticipatory fixation on the task-relevant location, which might suggest serial organisation. However, the fixation-locked positivity was not better defined than the cue-locked positivity; in fact, for the trials with the earliest fixations the positivity was better time-locked to the cue onset. This is more consistent with (re)orientation of spatial attention occurring in parallel with, but slightly before, the reconfiguration of other task-set components indexed by the EEG positivity.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Among the issues examined by studies of cognitive control in multitasking is whether processes underlying performance in the different tasks occur serially or in parallel. Here we ask a similar question about processes that pro-actively control task-set. In task-switching experiments, several indices of task-set preparation have been extensively documented, including anticipatory orientation of gaze to the task-relevant location (an unambiguous marker of reorientation of attention), and a positive polarity brain potential over the posterior cortex (whose functional significance is less well understood). We examine whether these markers of preparation occur in parallel or serially, and in what order. On each trial a cue required participants to make a semantic classification of one of three digits presented simultaneously, with the location of each digit consistently associated with one of three classification tasks (e.g., if the task was odd/even, the digit at the top of the display was relevant). The EEG positivity emerged following, and appeared time-locked to, the anticipatory fixation on the task-relevant location, which might suggest serial organisation. However, the fixation-locked positivity was not better defined than the cue-locked positivity; in fact, for the trials with the earliest fixations the positivity was better time-locked to the cue onset. This is more consistent with (re)orientation of spatial attention occurring in parallel with, but slightly before, the reconfiguration of other task-set components indexed by the EEG positivity.

  • doi:10.1016/j.neuropsychologia.2021.107984

Sara LoTemplio; Jack Silcox; Kara D. Federmeier; Brennan R. Payne

Inter- and intra-individual coupling between pupillary, electrophysiological, and behavioral responses in a visual oddball task Journal Article

In: Psychophysiology, vol. 58, no. 4, pp. e13758, 2021.

Abstract | Links | BibTeX

@article{LoTemplio2021,
title = {Inter- and intra-individual coupling between pupillary, electrophysiological, and behavioral responses in a visual oddball task},
author = {Sara LoTemplio and Jack Silcox and Kara D. Federmeier and Brennan R. Payne},
doi = {10.1111/psyp.13758},
year = {2021},
date = {2021-01-01},
journal = {Psychophysiology},
volume = {58},
number = {4},
pages = {e13758},
abstract = {Although the P3b component of the event-related brain potential is one of the most widely studied components, its underlying generators are not currently well understood. Recent theories have suggested that the P3b is triggered by phasic activation of the locus-coeruleus norepinephrine (LC-NE) system, an important control center implicated in facilitating optimal task-relevant behavior. Previous research has reported strong correlations between pupil dilation and LC activity, suggesting that pupil diameter is a useful indicator for ongoing LC-NE activity. Given the strong relationship between LC activity and pupil dilation, if the P3b is driven by phasic LC activity, there should be a robust trial-to-trial relationship with the phasic pupillary dilation response (PDR). However, previous work examining relationships between concurrently recorded pupillary and P3b responses has not supported this. One possibility is that the relationship between the measures might be carried primarily by either inter-individual (i.e., between-participant) or intra-individual (i.e., within-participant) contributions to coupling, and prior work has not systematically delineated these relationships. Doing so in the current study, we do not find evidence for either inter-individual or intra-individual relationships between the PDR and P3b responses. However, baseline pupil dilation did predict the P3b. Interestingly, both the PDR and P3b independently predicted inter-individual and intra-individual variability in decision response time. Implications for the LC-P3b hypothesis are discussed.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Although the P3b component of the event-related brain potential is one of the most widely studied components, its underlying generators are not currently well understood. Recent theories have suggested that the P3b is triggered by phasic activation of the locus-coeruleus norepinephrine (LC-NE) system, an important control center implicated in facilitating optimal task-relevant behavior. Previous research has reported strong correlations between pupil dilation and LC activity, suggesting that pupil diameter is a useful indicator for ongoing LC-NE activity. Given the strong relationship between LC activity and pupil dilation, if the P3b is driven by phasic LC activity, there should be a robust trial-to-trial relationship with the phasic pupillary dilation response (PDR). However, previous work examining relationships between concurrently recorded pupillary and P3b responses has not supported this. One possibility is that the relationship between the measures might be carried primarily by either inter-individual (i.e., between-participant) or intra-individual (i.e., within-participant) contributions to coupling, and prior work has not systematically delineated these relationships. Doing so in the current study, we do not find evidence for either inter-individual or intra-individual relationships between the PDR and P3b responses. However, baseline pupil dilation did predict the P3b. Interestingly, both the PDR and P3b independently predicted inter-individual and intra-individual variability in decision response time. Implications for the LC-P3b hypothesis are discussed.

  • doi:10.1111/psyp.13758
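
The inter- versus intra-individual coupling question in the LoTemplio et al. entry above is commonly handled by splitting a trial-level predictor into its person mean and the trial-wise deviation from that mean before fitting a mixed model. The sketch below illustrates only that centering logic, with hypothetical column names; the authors' actual models are specified in the paper.

import statsmodels.formula.api as smf

def fit_coupling_model(df):
    # df: trial-level DataFrame with hypothetical columns
    # 'subject', 'pupil' (pupillary response), and 'p3b' (P3b amplitude).
    person_mean = df.groupby("subject")["pupil"].transform("mean")
    df = df.assign(pupil_between=person_mean,               # inter-individual
                   pupil_within=df["pupil"] - person_mean)  # intra-individual
    model = smf.mixedlm("p3b ~ pupil_within + pupil_between",
                        data=df, groups=df["subject"])
    return model.fit()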

Sarah D. McCrackin; Roxane J. Itier

I can see it in your eyes: Perceived gaze direction impacts ERP and behavioural measures of affective theory of mind Journal Article

In: Cortex, vol. 143, pp. 205–222, 2021.

Abstract | Links | BibTeX

@article{McCrackin2021,
title = {I can see it in your eyes: Perceived gaze direction impacts ERP and behavioural measures of affective theory of mind},
author = {Sarah D. McCrackin and Roxane J. Itier},
doi = {10.1016/j.cortex.2021.05.024},
year = {2021},
date = {2021-01-01},
journal = {Cortex},
volume = {143},
pages = {205--222},
publisher = {Elsevier Ltd},
abstract = {Looking at someone's eyes is thought to be important for affective theory of mind (aTOM), our ability to infer their emotional state. However, it is unknown whether an individual's gaze direction influences our aTOM judgements and what the time course of this influence might be. We presented participants with sentences describing individuals in positive, negative or neutral scenarios, followed by direct or averted gaze neutral face pictures of those individuals. Participants made aTOM judgements about each person's mental state, including their affective valence and arousal, and we investigated whether the face gaze direction impacted those judgements. Participants rated that gazers were feeling more positive when they displayed direct gaze as opposed to averted gaze, and that they were feeling more aroused during negative contexts when gaze was averted as opposed to direct. Event-related potentials associated with face perception and affective processing were examined using mass-univariate analyses to track the time-course of this eye-gaze and affective processing interaction at a neural level. Both positive and negative trials were differentiated from neutral trials at many stages of processing. This included the early N200 and EPN components, believed to reflect automatic emotion areas activation and attentional selection respectively. This also included the later P300 and LPP components, thought to reflect elaborative cognitive appraisal of emotional content. Critically, sentence valence and gaze direction interacted over these later components, which may reflect the incorporation of eye-gaze in the cognitive evaluation of another's emotional state. The results suggest that gaze perception directly impacts aTOM processes, and that altered eye-gaze processing in clinical populations may contribute to associated aTOM impairments.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Looking at someone's eyes is thought to be important for affective theory of mind (aTOM), our ability to infer their emotional state. However, it is unknown whether an individual's gaze direction influences our aTOM judgements and what the time course of this influence might be. We presented participants with sentences describing individuals in positive, negative or neutral scenarios, followed by direct or averted gaze neutral face pictures of those individuals. Participants made aTOM judgements about each person's mental state, including their affective valence and arousal, and we investigated whether the face gaze direction impacted those judgements. Participants rated that gazers were feeling more positive when they displayed direct gaze as opposed to averted gaze, and that they were feeling more aroused during negative contexts when gaze was averted as opposed to direct. Event-related potentials associated with face perception and affective processing were examined using mass-univariate analyses to track the time-course of this eye-gaze and affective processing interaction at a neural level. Both positive and negative trials were differentiated from neutral trials at many stages of processing. This included the early N200 and EPN components, believed to reflect automatic emotion areas activation and attentional selection respectively. This also included the later P300 and LPP components, thought to reflect elaborative cognitive appraisal of emotional content. Critically, sentence valence and gaze direction interacted over these later components, which may reflect the incorporation of eye-gaze in the cognitive evaluation of another's emotional state. The results suggest that gaze perception directly impacts aTOM processes, and that altered eye-gaze processing in clinical populations may contribute to associated aTOM impairments.

  • doi:10.1016/j.cortex.2021.05.024

Sarah D. McCrackin; Roxane J. Itier

Feeling through another's eyes: Perceived gaze direction impacts ERP and behavioural measures of positive and negative affective empathy Journal Article

In: NeuroImage, vol. 226, pp. 117605, 2021.

Abstract | Links | BibTeX

@article{McCrackin2021a,
title = {Feeling through another's eyes: Perceived gaze direction impacts ERP and behavioural measures of positive and negative affective empathy},
author = {Sarah D. McCrackin and Roxane J. Itier},
doi = {10.1016/j.neuroimage.2020.117605},
year = {2021},
date = {2021-01-01},
journal = {NeuroImage},
volume = {226},
pages = {117605},
publisher = {Elsevier Inc.},
abstract = {Looking at the eyes informs us about the thoughts and emotions of those around us, and impacts our own emotional state. However, it is unknown how perceiving direct and averted gaze impacts our ability to share the gazer's positive and negative emotions, abilities referred to as positive and negative affective empathy. We presented 44 participants with contextual sentences describing positive, negative and neutral events happening to other people (e.g. “Her newborn was saved/killed/fed yesterday afternoon.”). These were designed to elicit positive, negative, or little to no empathy, and were followed by direct or averted gaze images of the individuals described. Participants rated their affective empathy for the individual and their own emotional valence on each trial. Event-related potentials time-locked to face-onset and associated with empathy and emotional processing were recorded to investigate whether they were modulated by gaze direction. Relative to averted gaze, direct gaze was associated with increased positive valence in the positive and neutral conditions and with increased positive empathy ratings. A similar pattern was found at the neural level, using robust mass-univariate statistics. The N100, thought to reflect an automatic activation of emotion areas, was modulated by gaze in the affective empathy conditions, with opposite effect directions in positive and negative conditions. The P200, an ERP component sensitive to positive stimuli, was modulated by gaze direction only in the positive empathy condition. Positive and negative trials were processed similarly at the early N200 processing stage, but later diverged, with only negative trials modulating the EPN, P300 and LPP components. These results suggest that positive and negative affective empathy are associated with distinct time-courses, and that perceived gaze direction uniquely modulates positive empathy, highlighting the importance of studying empathy with face stimuli.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Looking at the eyes informs us about the thoughts and emotions of those around us, and impacts our own emotional state. However, it is unknown how perceiving direct and averted gaze impacts our ability to share the gazer's positive and negative emotions, abilities referred to as positive and negative affective empathy. We presented 44 participants with contextual sentences describing positive, negative and neutral events happening to other people (e.g. “Her newborn was saved/killed/fed yesterday afternoon.”). These were designed to elicit positive, negative, or little to no empathy, and were followed by direct or averted gaze images of the individuals described. Participants rated their affective empathy for the individual and their own emotional valence on each trial. Event-related potentials time-locked to face-onset and associated with empathy and emotional processing were recorded to investigate whether they were modulated by gaze direction. Relative to averted gaze, direct gaze was associated with increased positive valence in the positive and neutral conditions and with increased positive empathy ratings. A similar pattern was found at the neural level, using robust mass-univariate statistics. The N100, thought to reflect an automatic activation of emotion areas, was modulated by gaze in the affective empathy conditions, with opposite effect directions in positive and negative conditions. The P200, an ERP component sensitive to positive stimuli, was modulated by gaze direction only in the positive empathy condition. Positive and negative trials were processed similarly at the early N200 processing stage, but later diverged, with only negative trials modulating the EPN, P300 and LPP components. These results suggest that positive and negative affective empathy are associated with distinct time-courses, and that perceived gaze direction uniquely modulates positive empathy, highlighting the importance of studying empathy with face stimuli.

  • doi:10.1016/j.neuroimage.2020.117605

Amir H. Meghdadi; Barry Giesbrecht; Miguel P. Eckstein

EEG signatures of contextual influences on visual search with real scenes Journal Article

In: Experimental Brain Research, vol. 239, no. 3, pp. 797–809, 2021.

Abstract | Links | BibTeX

@article{Meghdadi2021,
title = {EEG signatures of contextual influences on visual search with real scenes},
author = {Amir H. Meghdadi and Barry Giesbrecht and Miguel P. Eckstein},
doi = {10.1007/s00221-020-05984-8},
year = {2021},
date = {2021-01-01},
journal = {Experimental Brain Research},
volume = {239},
number = {3},
pages = {797--809},
publisher = {Springer Berlin Heidelberg},
abstract = {The use of scene context is a powerful way by which biological organisms guide and facilitate visual search. Although many studies have shown enhancements of target-related electroencephalographic activity (EEG) with synthetic cues, there have been fewer studies demonstrating such enhancements during search with scene context and objects in real world scenes. Here, observers covertly searched for a target in images of real scenes while we used EEG to measure the steady state visual evoked response to objects flickering at different frequencies. The target appeared in its typical contextual location or out of context while we controlled for low-level properties of the image including target saliency against the background and retinal eccentricity. A pattern classifier using EEG activity at the relevant modulated frequencies showed target detection accuracy increased when the target was in a contextually appropriate location. A control condition, for which observers searched the same images for a different target orthogonal to the contextual manipulation, resulted in no effects of scene context on classifier performance, confirming that image properties cannot explain the contextual modulations of neural activity. Pattern classifier decisions for individual images were also related to the aggregated observer behavioral decisions for individual images. Together, these findings demonstrate target-related neural responses are modulated by scene context during visual search with real world scenes and can be related to behavioral search decisions.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

The use of scene context is a powerful way by which biological organisms guide and facilitate visual search. Although many studies have shown enhancements of target-related electroencephalographic activity (EEG) with synthetic cues, there have been fewer studies demonstrating such enhancements during search with scene context and objects in real world scenes. Here, observers covertly searched for a target in images of real scenes while we used EEG to measure the steady state visual evoked response to objects flickering at different frequencies. The target appeared in its typical contextual location or out of context while we controlled for low-level properties of the image including target saliency against the background and retinal eccentricity. A pattern classifier using EEG activity at the relevant modulated frequencies showed target detection accuracy increased when the target was in a contextually appropriate location. A control condition, for which observers searched the same images for a different target orthogonal to the contextual manipulation, resulted in no effects of scene context on classifier performance, confirming that image properties cannot explain the contextual modulations of neural activity. Pattern classifier decisions for individual images were also related to the aggregated observer behavioral decisions for individual images. Together, these findings demonstrate target-related neural responses are modulated by scene context during visual search with real world scenes and can be related to behavioral search decisions.

  • doi:10.1007/s00221-020-05984-8
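
The frequency-tagging logic in the Meghdadi et al. entry above (objects flickering at different rates, with responses read out at the matching spectral lines) can be illustrated with a minimal FFT-based amplitude estimate. This is a simplified sketch, not the authors' classifier pipeline.

import numpy as np

def ssvep_amplitude(eeg, fs, flicker_hz):
    # eeg: 1-D array, one channel over the search interval.
    # Returns the spectral amplitude at the tagged flicker frequency.
    spectrum = np.abs(np.fft.rfft(eeg)) / eeg.size
    freqs = np.fft.rfftfreq(eeg.size, d=1 / fs)
    return spectrum[np.argmin(np.abs(freqs - flicker_hz))]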

Michael Christopher Melnychuk; Ian H. Robertson; Emanuele R. G. Plini; Paul M. Dockree

A bridge between the breath and the brain: Synchronization of respiration, a pupillometric marker of the locus coeruleus, and an EEG marker of attentional control state Journal Article

In: Brain Sciences, vol. 11, pp. 1324, 2021.

Abstract | Links | BibTeX

@article{Melnychuk2021,
title = {A bridge between the breath and the brain: Synchronization of respiration, a pupillometric marker of the locus coeruleus, and an EEG marker of attentional control state},
author = {Michael Christopher Melnychuk and Ian H. Robertson and Emanuele R. G. Plini and Paul M. Dockree},
doi = {10.3390/brainsci11101324},
year = {2021},
date = {2021-01-01},
journal = {Brain Sciences},
volume = {11},
pages = {1324},
abstract = {Yogic and meditative traditions have long held that the fluctuations of the breath and the mind are intimately related. While respiratory modulation of cortical activity and attentional switching are established, the extent to which electrophysiological markers of attention exhibit synchronization with respiration is unknown. To this end, we examined (1) frontal midline theta-beta ratio (TBR), an indicator of attentional control state known to correlate with mind wandering episodes and functional connectivity of the executive control network; (2) pupil diameter (PD), a known proxy measure of locus coeruleus (LC) noradrenergic activity; and (3) respiration for evidence of phase synchronization and information transfer (multivariate Granger causality) during quiet restful breathing. Our results indicate that both TBR and PD are simultaneously synchronized with the breath, suggesting an underlying oscillation of an attentionally relevant electrophysiological index that is phase-locked to the respiratory cycle which could have the potential to bias the attentional system into switching states. We highlight the LC's pivotal role as a coupling mechanism between respiration and TBR, and elaborate on its dual functions as both a chemosensitive respiratory nucleus and a pacemaker of the attentional system. We further suggest that an appreciation of the dynamics of this weakly coupled oscillatory system could help deepen our understanding of the traditional claim of a relationship between breathing and attention.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Yogic and meditative traditions have long held that the fluctuations of the breath and the mind are intimately related. While respiratory modulation of cortical activity and attentional switching are established, the extent to which electrophysiological markers of attention exhibit synchronization with respiration is unknown. To this end, we examined (1) frontal midline theta-beta ratio (TBR), an indicator of attentional control state known to correlate with mind wandering episodes and functional connectivity of the executive control network; (2) pupil diameter (PD), a known proxy measure of locus coeruleus (LC) noradrenergic activity; and (3) respiration for evidence of phase synchronization and information transfer (multivariate Granger causality) during quiet restful breathing. Our results indicate that both TBR and PD are simultaneously synchronized with the breath, suggesting an underlying oscillation of an attentionally relevant electrophysiological index that is phase-locked to the respiratory cycle which could have the potential to bias the attentional system into switching states. We highlight the LC's pivotal role as a coupling mechanism between respiration and TBR, and elaborate on its dual functions as both a chemosensitive respiratory nucleus and a pacemaker of the attentional system. We further suggest that an appreciation of the dynamics of this weakly coupled oscillatory system could help deepen our understanding of the traditional claim of a relationship between breathing and attention.

  • doi:10.3390/brainsci11101324
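
The frontal midline theta-beta ratio examined by Melnychuk et al. is, in essence, mean theta-band power divided by mean beta-band power. A minimal sketch using a Welch power spectrum follows; the band edges are common conventions and may differ from those used in the paper.

import numpy as np
from scipy.signal import welch

def theta_beta_ratio(eeg, fs, theta=(4, 7), beta=(13, 30)):
    # eeg: 1-D array from a frontal midline channel (e.g. Fz).
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    theta_power = psd[(freqs >= theta[0]) & (freqs <= theta[1])].mean()
    beta_power = psd[(freqs >= beta[0]) & (freqs <= beta[1])].mean()
    return theta_power / beta_power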

Anna M. Monk; Daniel N. Barry; Vladimir Litvak; Gareth R. Barnes; Eleanor A. Maguire

Watching movies unfold, a frame-by-frame analysis of the associated neural dynamics Journal Article

In: eNeuro, vol. 8, no. 4, pp. 1–12, 2021.

Abstract | Links | BibTeX

@article{Monk2021,
title = {Watching movies unfold, a frame-by-frame analysis of the associated neural dynamics},
author = {Anna M. Monk and Daniel N. Barry and Vladimir Litvak and Gareth R. Barnes and Eleanor A. Maguire},
doi = {10.1523/ENEURO.0099-21.2021},
year = {2021},
date = {2021-01-01},
journal = {eNeuro},
volume = {8},
number = {4},
pages = {1--12},
abstract = {Our lives unfold as sequences of events. We experience these events as seamless, although they are composed of individual images captured in between the interruptions imposed by eye blinks and saccades. Events typically involve visual imagery from the real world (scenes), and the hippocampus is frequently engaged in this context. It is unclear, however, whether the hippocampus would be similarly responsive to unfolding events that involve abstract imagery. Addressing this issue could provide insights into the nature of its contribution to event processing, with relevance for theories of hippocampal function. Consequently, during magnetoencephalography (MEG), we had female and male humans watch highly matched unfolding movie events composed of either scene image frames that reflected the real world, or frames depicting abstract patterns. We examined the evoked neuronal responses to each image frame along the time course of the movie events. Only one difference between the two conditions was evident, and that was during the viewing of the first image frame of events, detectable across frontotemporal sensors. Further probing of this difference using source reconstruction revealed greater engagement of a set of brain regions across parietal, frontal, premotor, and cerebellar cortices, with the largest change in broadband (1–30 Hz) power in the hippocampus during scene-based movie events. Hippocampal engagement during the first image frame of scene-based events could reflect its role in registering a recognizable context perhaps based on templates or schemas. The hippocampus, therefore, may help to set the scene for events very early on.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Our lives unfold as sequences of events. We experience these events as seamless, although they are composed of individual images captured in between the interruptions imposed by eye blinks and saccades. Events typically involve visual imagery from the real world (scenes), and the hippocampus is frequently engaged in this context. It is unclear, however, whether the hippocampus would be similarly responsive to unfolding events that involve abstract imagery. Addressing this issue could provide insights into the nature of its contribution to event processing, with relevance for theories of hippocampal function. Consequently, during magnetoencephalography (MEG), we had female and male humans watch highly matched unfolding movie events composed of either scene image frames that reflected the real world, or frames depicting abstract patterns. We examined the evoked neuronal responses to each image frame along the time course of the movie events. Only one difference between the two conditions was evident, and that was during the viewing of the first image frame of events, detectable across frontotemporal sensors. Further probing of this difference using source reconstruction revealed greater engagement of a set of brain regions across parietal, frontal, premotor, and cerebellar cortices, with the largest change in broadband (1–30 Hz) power in the hippocampus during scene-based movie events. Hippocampal engagement during the first image frame of scene-based events could reflect its role in registering a recognizable context perhaps based on templates or schemas. The hippocampus, therefore, may help to set the scene for events very early on.

  • doi:10.1523/ENEURO.0099-21.2021

Anna M. Monk; Marshall A. Dalton; Gareth R. Barnes; Eleanor A. Maguire

The role of hippocampal-ventromedial prefrontal cortex neural dynamics in building mental representations Journal Article

In: Journal of Cognitive Neuroscience, vol. 33, no. 1, pp. 89–103, 2021.

Abstract | Links | BibTeX

@article{Monk2021a,
title = {The role of hippocampal-ventromedial prefrontal cortex neural dynamics in building mental representations},
author = {Anna M. Monk and Marshall A. Dalton and Gareth R. Barnes and Eleanor A. Maguire},
doi = {10.1162/jocn},
year = {2021},
date = {2021-01-01},
journal = {Journal of Cognitive Neuroscience},
volume = {33},
number = {1},
pages = {89--103},
abstract = {The hippocampus and ventromedial prefrontal cortex (vmPFC) play key roles in numerous cognitive domains including mind-wandering, episodic memory and imagining the future. Perspectives differ on precisely how they support these diverse functions, but there is general agreement that it involves constructing representations comprised of numerous elements. Visual scenes have been deployed extensively in cognitive neuroscience because they are paradigmatic multi-element stimuli. However, it remains unclear whether scenes, rather than other types of multi-feature stimuli, preferentially engage hippocampus and vmPFC. Here we leveraged the high temporal resolution of magnetoencephalography to test participants as they gradually built scene imagery from three successive auditorily-presented object descriptions and an imagined 3D space. This was contrasted with constructing mental images of non-scene arrays that were composed of three objects and an imagined 2D space. The scene and array stimuli were, therefore, highly matched, and this paradigm permitted a closer examination of step-by-step mental construction than has been undertaken previously. We observed modulation of theta power in our two regions of interest: in anterior hippocampus during the initial stage, and in vmPFC during the first two stages, of scene relative to array construction. Moreover, the scene-specific anterior hippocampal activity during the first construction stage was driven by the vmPFC, with mutual entrainment between the two brain regions thereafter. These findings suggest that hippocampal and vmPFC neural activity is especially tuned to scene representations during the earliest stage of their formation, with implications for theories of how these brain areas enable cognitive functions such as episodic memory.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

The hippocampus and ventromedial prefrontal cortex (vmPFC) play key roles in numerous cognitive domains including mind-wandering, episodic memory and imagining the future. Perspectives differ on precisely how they support these diverse functions, but there is general agreement that it involves constructing representations comprised of numerous elements. Visual scenes have been deployed extensively in cognitive neuroscience because they are paradigmatic multi-element stimuli. However, it remains unclear whether scenes, rather than other types of multi-feature stimuli, preferentially engage hippocampus and vmPFC. Here we leveraged the high temporal resolution of magnetoencephalography to test participants as they gradually built scene imagery from three successive auditorily-presented object descriptions and an imagined 3D space. This was contrasted with constructing mental images of non-scene arrays that were composed of three objects and an imagined 2D space. The scene and array stimuli were, therefore, highly matched, and this paradigm permitted a closer examination of step-by-step mental construction than has been undertaken previously. We observed modulation of theta power in our two regions of interest: in anterior hippocampus during the initial stage, and in vmPFC during the first two stages, of scene relative to array construction. Moreover, the scene-specific anterior hippocampal activity during the first construction stage was driven by the vmPFC, with mutual entrainment between the two brain regions thereafter. These findings suggest that hippocampal and vmPFC neural activity is especially tuned to scene representations during the earliest stage of their formation, with implications for theories of how these brain areas enable cognitive functions such as episodic memory.

  • doi:10.1162/jocn

Peter R. Murphy; Niklas Wilming; Diana C. Hernandez-Bocanegra; Genis Prat-Ortega; Tobias H. Donner

Adaptive circuit dynamics across human cortex during evidence accumulation in changing environments Journal Article

In: Nature Neuroscience, vol. 24, no. 7, pp. 987–997, 2021.

Abstract | Links | BibTeX

@article{Murphy2021,
title = {Adaptive circuit dynamics across human cortex during evidence accumulation in changing environments},
author = {Peter R. Murphy and Niklas Wilming and Diana C. Hernandez-Bocanegra and Genis Prat-Ortega and Tobias H. Donner},
doi = {10.1038/s41593-021-00839-z},
year = {2021},
date = {2021-01-01},
journal = {Nature Neuroscience},
volume = {24},
number = {7},
pages = {987--997},
publisher = {Springer US},
abstract = {Many decisions under uncertainty entail the temporal accumulation of evidence that informs about the state of the environment. When environments are subject to hidden changes in their state, maximizing accuracy and reward requires non-linear accumulation of evidence. How this adaptive, non-linear computation is realized in the brain is unknown. We analyzed human behavior and cortical population activity (measured with magnetoencephalography) recorded during visual evidence accumulation in a changing environment. Behavior and decision-related activity in cortical regions involved in action planning exhibited hallmarks of adaptive evidence accumulation, which could also be implemented by a recurrent cortical microcircuit. Decision dynamics in action-encoding parietal and frontal regions were mirrored in a frequency-specific modulation of the state of the visual cortex that depended on pupil-linked arousal and the expected probability of change. These findings link normative decision computations to recurrent cortical circuit dynamics and highlight the adaptive nature of decision-related feedback to the sensory cortex.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1038/s41593-021-00839-z

Dinavahi V. P. S. Murty; Keerthana Manikandan; Wupadrasta Santosh Kumar; Ranjini Garani Ramesh; Simran Purokayastha; Bhargavi Nagendra; M. L. Abhishek; Aditi Balakrishnan; Mahendra Javali; Naren Prahalada Rao; Supratim Ray

Stimulus-induced gamma rhythms are weaker in human elderly with mild cognitive impairment and Alzheimer's disease Journal Article

In: eLife, vol. 10, pp. e61666, 2021.

Abstract | Links | BibTeX

@article{Murty2021,
title = {Stimulus-induced gamma rhythms are weaker in human elderly with mild cognitive impairment and Alzheimer's disease},
author = {Dinavahi V. P. S. Murty and Keerthana Manikandan and Wupadrasta Santosh Kumar and Ranjini Garani Ramesh and Simran Purokayastha and Bhargavi Nagendra and M. L. Abhishek and Aditi Balakrishnan and Mahendra Javali and Naren Prahalada Rao and Supratim Ray},
doi = {10.7554/eLife.61666},
year = {2021},
date = {2021-01-01},
journal = {eLife},
volume = {10},
pages = {e61666},
abstract = {Alzheimer's disease (AD) in the elderly adds substantially to the socioeconomic burden, necessitating early diagnosis. While recent studies in rodent models of AD have suggested diagnostic and therapeutic value for gamma rhythms in the brain, the same has not been rigorously tested in humans. In this case-control study, we recruited a large population (N = 244; 106 females) of elderly (>49 years) subjects from the community, who viewed large gratings that induced strong gamma oscillations in their electroencephalogram (EEG). These subjects were classified as healthy (N = 227), mildly cognitively impaired (MCI; N = 12), or AD (N = 5) based on clinical history and Clinical Dementia Rating scores. Surprisingly, stimulus-induced gamma rhythms, but not alpha or steady-state visually evoked responses, were significantly lower in MCI/AD subjects compared to their age- and gender-matched controls. This reduction was not due to differences in eye movements or baseline power. Our results suggest that gamma could be used as a potential screening tool for MCI/AD in humans.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.7554/eLife.61666

Gaëlle Nicolas; Eric Castet; Adrien Rabier; Emmanuelle Kristensen; Michel Dojat; Anne Guérin-Dugué

Neural correlates of intra-saccadic motion perception Journal Article

In: Journal of Vision, vol. 21, no. 11, pp. 1–24, 2021.

Abstract | Links | BibTeX

@article{Nicolas2021,
title = {Neural correlates of intra-saccadic motion perception},
author = {Gaëlle Nicolas and Eric Castet and Adrien Rabier and Emmanuelle Kristensen and Michel Dojat and Anne Guérin-Dugué},
doi = {10.1167/jov.21.11.19},
year = {2021},
date = {2021-01-01},
journal = {Journal of Vision},
volume = {21},
number = {11},
pages = {1--24},
abstract = {Retinal motion of the visual scene is not consciously perceived during ocular saccades in normal everyday conditions. It has been suggested that extra-retinal signals actively suppress intra-saccadic motion perception to preserve stable perception of the visual world. However, using stimuli optimized to preferentially activate the M-pathway, Castet and Masson (2000) demonstrated that motion can be perceived during a saccade. Based on this psychophysical paradigm, we used electroencephalography and eye-tracking recordings to investigate the neural correlates related to the conscious perception of intra-saccadic motion. We demonstrated the effective involvement during saccades of the cortical areas V1-V2 and MT-V5, which convey motion information along the M-pathway. We also showed that individual motion perception was related to retinal temporal frequency.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1167/jov.21.11.19

J. A. Nij Bijvank; E. M. M. Strijbis; I. M. Nauta; S. D. Kulik; L. J. Balk; C. J. Stam; A. Hillebrand; J. J. G. Geurts; B. M. J. Uitdehaag; L. J. Rijn; A. Petzold; M. M. Schoonheim

Impaired saccadic eye movements in multiple sclerosis are related to altered functional connectivity of the oculomotor brain network Journal Article

In: NeuroImage: Clinical, vol. 32, pp. 102848, 2021.

Abstract | Links | BibTeX

@article{NijBijvank2021,
title = {Impaired saccadic eye movements in multiple sclerosis are related to altered functional connectivity of the oculomotor brain network},
author = {J. A. Nij Bijvank and E. M. M. Strijbis and I. M. Nauta and S. D. Kulik and L. J. Balk and C. J. Stam and A. Hillebrand and J. J. G. Geurts and B. M. J. Uitdehaag and L. J. Rijn and A. Petzold and M. M. Schoonheim},
doi = {10.1016/j.nicl.2021.102848},
year = {2021},
date = {2021-01-01},
journal = {NeuroImage: Clinical},
volume = {32},
pages = {102848},
publisher = {Elsevier Inc.},
abstract = {Background: Impaired eye movements in multiple sclerosis (MS) are common and could represent a non-invasive and accurate measure of (dys)functioning of interconnected areas within the complex brain network. The aim of this study was to test whether altered saccadic eye movements are related to changes in functional connectivity (FC) in patients with MS. Methods: Cross-sectional eye movement (pro-saccades and anti-saccades) and magnetoencephalography (MEG) data from the Amsterdam MS cohort were included from 176 MS patients and 33 healthy controls. FC was calculated between all regions of the Brainnetome atlas in six conventional frequency bands. Cognitive function and disability were evaluated by previously validated measures. The relationships between saccadic parameters and both FC and clinical scores in MS patients were analysed using multivariate linear regression models. Results: In MS, pro- and anti-saccades were abnormal compared to healthy controls. A relationship of saccadic eye movements was found with FC of the oculomotor network, which was stronger for regional than global FC. In general, abnormal eye movements were related to higher delta and theta FC but lower beta FC. Strongest associations were found for pro-saccadic latency and FC of the precuneus (beta band β = -0.23},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.nicl.2021.102848

Hamideh Norouzi; Niloofar Tavakoli; Mohammad Reza Daliri

Alpha oscillation during the performance of a new variant of working memory-guided saccade task: Evidence from behavioral and electroencephalographic analyses Journal Article

In: International Journal of Psychophysiology, vol. 166, pp. 61–70, 2021.

Abstract | Links | BibTeX

@article{Norouzi2021,
title = {Alpha oscillation during the performance of a new variant of working memory-guided saccade task: Evidence from behavioral and electroencephalographic analyses},
author = {Hamideh Norouzi and Niloofar Tavakoli and Mohammad Reza Daliri},
doi = {10.1016/j.ijpsycho.2021.05.008},
year = {2021},
date = {2021-01-01},
journal = {International Journal of Psychophysiology},
volume = {166},
pages = {61--70},
publisher = {Elsevier B.V.},
abstract = {Working memory (WM) can be considered as a limited-capacity system which is capable of saving information temporarily with the aim of processing. The aim of the present study was to establish whether eccentricity representation in WM could be decoded from electroencephalography (EEG) alpha-band oscillations in parietal cortex during the delay period while performing a memory-guided saccade (MGS) task. In this regard, we recorded EEG and eye-tracking signals of 17 healthy volunteers in a variant version of the MGS task. We designed the modified version of the MGS task for the first time to investigate the effect of locating stimuli in two different positions, at a near (6°) eccentricity and a far (12°) eccentricity, on saccade error as a behavioral parameter. Another goal of the study was to discern whether or not varying the stimulus loci can alter behavioral and electroencephalographic data while performing the variant version of the MGS task. Our findings demonstrate that saccade error for the near-fixation condition is significantly smaller than for the far-from-fixation condition. We observed an increase in alpha power in the parietal lobe in near vs. far conditions. In addition, the results indicate that the increase in alpha (8–12 Hz) power from fixation to memory was negatively correlated with saccade error. The novel approach of using simultaneous EEG/eye-tracking recording in the modified MGS task provided both behavioral and electroencephalographic analyses of oscillatory activity during this new version of the MGS task.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.ijpsycho.2021.05.008

John Orczyk; Charles E. Schroeder; Ilana Y. Abeles; Manuel Gomez-Ramirez; Pamela D. Butler; Yoshinao Kajikawa

Comparison of scalp ERP to faces in macaques and humans Journal Article

In: Frontiers in Systems Neuroscience, vol. 15, pp. 667611, 2021.

Abstract | Links | BibTeX

@article{Orczyk2021,
title = {Comparison of scalp ERP to faces in macaques and humans},
author = {John Orczyk and Charles E. Schroeder and Ilana Y. Abeles and Manuel Gomez-Ramirez and Pamela D. Butler and Yoshinao Kajikawa},
doi = {10.3389/fnsys.2021.667611},
year = {2021},
date = {2021-01-01},
journal = {Frontiers in Systems Neuroscience},
volume = {15},
pages = {667611},
abstract = {Face recognition is an essential activity of social living, common to many primate species. Underlying processes in the brain have been investigated using various techniques and compared between species. Functional imaging studies have shown face-selective cortical regions and their degree of correspondence across species. However, the temporal dynamics of face processing, particularly processing speed, are likely different between them. Across sensory modalities, activation of primary sensory cortices in macaque monkeys occurs at about 3/5 the latency of corresponding activation in humans, though this human-simian difference may diminish or disappear in higher cortical regions. We recorded scalp event-related potentials (ERPs) to presentation of faces in macaques and estimated the peak latency of ERP components. Comparisons of latencies between macaques (112 ms) and humans (192 ms) suggested that the 3:5 ratio could be preserved in higher cognitive regions of face processing between those species.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.3389/fnsys.2021.667611

Anastasia O. Ovchinnikova; Anatoly N. Vasilyev; Ivan P. Zubarev; Bogdan L. Kozyrskiy; Sergei L. Shishkin

MEG-based detection of voluntary eye fixations used to control a computer Journal Article

In: Frontiers in Neuroscience, vol. 15, pp. 619591, 2021.

Abstract | Links | BibTeX

@article{Ovchinnikova2021,
title = {MEG-based detection of voluntary eye fixations used to control a computer},
author = {Anastasia O. Ovchinnikova and Anatoly N. Vasilyev and Ivan P. Zubarev and Bogdan L. Kozyrskiy and Sergei L. Shishkin},
doi = {10.3389/fnins.2021.619591},
year = {2021},
date = {2021-01-01},
journal = {Frontiers in Neuroscience},
volume = {15},
pages = {619591},
abstract = {Gaze-based input is an efficient way of hands-free human-computer interaction. However, it suffers from the inability of gaze-based interfaces to discriminate voluntary and spontaneous gaze behaviors, which are overtly similar. Here, we demonstrate that voluntary eye fixations can be discriminated from spontaneous ones using short segments of magnetoencephalography (MEG) data measured immediately after the fixation onset. Recently proposed convolutional neural networks (CNNs), linear finite impulse response filters CNN (LF-CNN) and vector autoregressive CNN (VAR-CNN), were applied for binary classification of the MEG signals related to spontaneous and voluntary eye fixations collected in healthy participants (n = 25) who performed a game-like task by fixating on targets voluntarily for 500 ms or longer. Voluntary fixations were identified as those followed by a fixation in a special confirmatory area. Spontaneous vs. voluntary fixation-related single-trial 700 ms MEG segments were non-randomly classified in the majority of participants, with the group average cross-validated ROC AUC of 0.66 ± 0.07 for LF-CNN and 0.67 ± 0.07 for VAR-CNN (M ± SD). When the time interval, from which the MEG data were taken, was extended beyond the onset of the visual feedback, the group average classification performance increased up to 0.91. Analysis of spatial patterns contributing to classification did not reveal signs of significant eye movement impact on the classification results. We conclude that the classification of MEG signals has a certain potential to support gaze-based interfaces by avoiding false responses to spontaneous eye fixations on a single-trial basis. Current results for intention detection prior to gaze-based interface's feedback, however, are not sufficient for online single-trial eye fixation classification using MEG data alone, and further work is needed to find out if it could be used in practical applications.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.3389/fnins.2021.619591

Yali Pan; Steven Frisson; Ole Jensen

Neural evidence for lexical parafoveal processing Journal Article

In: Nature Communications, vol. 12, pp. 5234, 2021.

Abstract | Links | BibTeX

@article{Pan2021a,
title = {Neural evidence for lexical parafoveal processing},
author = {Yali Pan and Steven Frisson and Ole Jensen},
doi = {10.1038/s41467-021-25571-x},
year = {2021},
date = {2021-01-01},
journal = {Nature Communications},
volume = {12},
pages = {5234},
publisher = {Springer US},
abstract = {In spite of the reduced visual acuity, parafoveal information plays an important role in natural reading. However, competing models on reading disagree on whether words are previewed parafoveally at the lexical level. We find neural evidence for lexical parafoveal processing by combining a rapid invisible frequency tagging (RIFT) approach with magnetoencephalography (MEG) and eye-tracking. In a silent reading task, target words are tagged (flickered) subliminally at 60 Hz. The tagging responses measured when fixating on the pre-target word reflect parafoveal processing of the target word. We observe stronger tagging responses during pre-target fixations when followed by low compared with high lexical frequency targets. Moreover, this lexical parafoveal processing is associated with individual reading speed. Our findings suggest that reading unfolds in the fovea and parafovea simultaneously to support fluent reading.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1038/s41467-021-25571-x

Hame Park; Christoph Kayser

The neurophysiological basis of the trial-wise and cumulative ventriloquism aftereffects Journal Article

In: Journal of Neuroscience, vol. 41, no. 5, pp. 1068–1079, 2021.

Abstract | Links | BibTeX

@article{Park2021d,
title = {The neurophysiological basis of the trial-wise and cumulative ventriloquism aftereffects},
author = {Hame Park and Christoph Kayser},
doi = {10.1523/JNEUROSCI.2091-20.2020},
year = {2021},
date = {2021-01-01},
journal = {Journal of Neuroscience},
volume = {41},
number = {5},
pages = {1068--1079},
abstract = {Our senses often receive conflicting multisensory information, which our brain reconciles by adaptive recalibration. A classic example is the ventriloquism aftereffect, which emerges following both cumulative (long-term) and trial-wise exposure to spatially discrepant multisensory stimuli. Despite the importance of such adaptive mechanisms for interacting with environments that change over multiple timescales, it remains debated whether the ventriloquism aftereffects observed following trial-wise and cumulative exposure arise from the same neurophysiological substrate. We address this question by probing electroencephalography recordings from healthy humans (both sexes) for processes predictive of the aftereffect biases following the exposure to spatially offset audiovisual stimuli. Our results support the hypothesis that discrepant multisensory evidence shapes aftereffects on distinct timescales via common neurophysiological processes reflecting sensory inference and memory in parietal-occipital regions, while the cumulative exposure to consistent discrepancies additionally recruits prefrontal processes. During the subsequent unisensory trial, both trial-wise and cumulative exposure bias the encoding of the acoustic information, but do so distinctly. Our results posit a central role of parietal regions in shaping multisensory spatial recalibration, suggest that frontal regions consolidate the behavioral bias for persistent multisensory discrepancies, but also show that the trial-wise and cumulative exposure bias sound position encoding via distinct neurophysiological processes.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1523/JNEUROSCI.2091-20.2020

Mohsen Parto Dezfouli; Saeideh Davoudi; Robert T. Knight; Mohammad Reza Daliri; Elizabeth L. Johnson

Prefrontal lesions disrupt oscillatory signatures of spatiotemporal integration in working memory Journal Article

In: Cortex, vol. 138, pp. 113–126, 2021.

Abstract | Links | BibTeX

@article{PartoDezfouli2021,
title = {Prefrontal lesions disrupt oscillatory signatures of spatiotemporal integration in working memory},
author = {Mohsen Parto Dezfouli and Saeideh Davoudi and Robert T. Knight and Mohammad Reza Daliri and Elizabeth L. Johnson},
doi = {10.1016/j.cortex.2021.01.016},
year = {2021},
date = {2021-01-01},
journal = {Cortex},
volume = {138},
pages = {113--126},
publisher = {Elsevier Ltd},
abstract = {How does the human brain integrate spatial and temporal information into unified mnemonic representations? Building on classic theories of feature binding, we first define the oscillatory signatures of integrating 'where' and 'when' information in working memory (WM) and then investigate the role of prefrontal cortex (PFC) in spatiotemporal integration. Fourteen individuals with lateral PFC damage and 20 healthy controls completed a visuospatial WM task while electroencephalography (EEG) was recorded. On each trial, two shapes were presented sequentially in a top/bottom spatial orientation. We defined EEG signatures of spatiotemporal integration by comparing the maintenance of two possible where-when configurations: the first shape presented on top and the reverse. Frontal delta-theta (δθ; 2–7 Hz) activity, frontal-posterior δθ functional connectivity, lateral posterior event-related potentials, and mesial posterior alpha phase-to-gamma amplitude coupling dissociated the two configurations in controls. WM performance and frontal and mesial posterior signatures of spatiotemporal integration were diminished in PFC lesion patients, whereas lateral posterior signatures were intact. These findings reveal both PFC-dependent and independent substrates of spatiotemporal integration and link optimal performance to PFC.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.cortex.2021.01.016

J. Perez-Osorio; A. Abubshait; A. Wykowska

Irrelevant robot signals in a categorization task induce cognitive conflict in performance, eye trajectories, the N2 component of the EEG, and frontal theta oscillations Journal Article

In: Journal of Cognitive Neuroscience, vol. 34, no. 1, pp. 108–126, 2021.

Abstract | Links | BibTeX

@article{PerezOsorioJAbubshaitA2021,
title = {Irrelevant robot signals in a categorization task induce cognitive conflict in performance, eye trajectories, the N2 component of the EEG, and frontal theta oscillations},
author = {J. Perez-Osorio and A. Abubshait and A. Wykowska},
doi = {10.1162/jocn},
year = {2021},
date = {2021-01-01},
journal = {Journal of Cognitive Neuroscience},
volume = {34},
number = {1},
pages = {108--126},
abstract = {Understanding others' nonverbal behavior is essential for social interaction, as it allows, among others, to infer mental states. While gaze communication, a well-established nonverbal social behavior, has shown its importance in inferring others' mental states, not much is known about the effects of irrelevant gaze signals on cognitive conflict markers during collaborative settings. Here, participants completed a categorization task where they categorized objects based on their color while observing images of a robot. On each trial, participants observed the robot iCub grasping an object from a table and offering it to them to simulate a handover. Once the robot “moved” the object forward, participants were asked to categorize the object according to its color. Before participants were allowed to respond, the robot made a lateral head/gaze shift. The gaze shifts were either congruent or incongruent with the object's color. We expected that incongruent head-cues would induce more errors (Study 1), would be associated with more curvature in eye-tracking trajectories (Study 2), and would induce larger amplitudes in electrophysiological markers of cognitive conflict (Study 3). Results of the three studies show more oculomotor interference as measured in error rates (Study 1), larger curvature in eye-tracking trajectories (Study 2), and higher amplitudes of the N2 event-related potential (ERP) of the EEG signals as well as higher Event-Related Spectral Perturbation (ERSP) amplitudes (Study 3) for incongruent trials compared to congruent trials. Our findings reveal that behavioral, ocular and electrophysiological markers can index the influence of irrelevant signals during goal-oriented tasks.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1162/jocn

Thomas Pfeffer; Adrian Ponce-Alvarez; Konstantinos Tsetsos; Thomas Meindertsma; Christoffer Julius Gahnström; Ruud Lucas Brink; Guido Nolte; Andreas Karl Engel; Gustavo Deco; Tobias Hinrich Donner

Circuit mechanisms for the chemical modulation of cortex-wide network interactions and behavioral variability Journal Article

In: Science Advances, vol. 7, no. 29, pp. eabf5620, 2021.

Abstract | Links | BibTeX

@article{Pfeffer2021,
title = {Circuit mechanisms for the chemical modulation of cortex-wide network interactions and behavioral variability},
author = {Thomas Pfeffer and Adrian Ponce-Alvarez and Konstantinos Tsetsos and Thomas Meindertsma and Christoffer Julius Gahnström and Ruud Lucas Brink and Guido Nolte and Andreas Karl Engel and Gustavo Deco and Tobias Hinrich Donner},
doi = {10.1126/sciadv.abf5620},
year = {2021},
date = {2021-01-01},
journal = {Science Advances},
volume = {7},
number = {29},
pages = {eabf5620},
abstract = {Influential theories postulate distinct roles of catecholamines and acetylcholine in cognition and behavior. However, previous physiological work reported similar effects of these neuromodulators on the response properties (specifically, the gain) of individual cortical neurons. Here, we show a double dissociation between the effects of catecholamines and acetylcholine at the level of large-scale interactions between cortical areas in humans. A pharmacological boost of catecholamine levels increased cortex-wide interactions during a visual task, but not rest. An acetylcholine boost decreased interactions during rest, but not task. Cortical circuit modeling explained this dissociation by differential changes in two circuit properties: The local excitation-inhibition balance (more strongly increased by catecholamines) and intracortical transmission (more strongly reduced by acetylcholine). The inferred catecholaminergic mechanism also predicted noisier decision-making, which we confirmed for both perceptual and value-based choice behavior. Our work highlights specific circuit mechanisms for shaping cortical network interactions and behavioral variability by key neuromodulatory systems.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1126/sciadv.abf5620

Michael Plöchl; Ian Fiebelkorn; Sabine Kastner; Jonas Obleser

Attentional sampling of visual and auditory objects is captured by theta-modulated neural activity Journal Article

In: European Journal of Neuroscience, pp. 1–16, 2021.

Abstract | Links | BibTeX

@article{Ploechl2021,
title = {Attentional sampling of visual and auditory objects is captured by theta-modulated neural activity},
author = {Michael Plöchl and Ian Fiebelkorn and Sabine Kastner and Jonas Obleser},
doi = {10.1111/ejn.15514},
year = {2021},
date = {2021-01-01},
journal = {European Journal of Neuroscience},
pages = {1--16},
abstract = {Recent evidence suggests that visual attention alternately samples two behaviourally relevant objects at approximately 4 Hz, rhythmically shifting between the objects. Whether similar attentional rhythms exist in other sensory modalities, however, is not yet clear. We therefore adapted and extended an established paradigm to investigate visual and potential auditory attentional rhythms, as well as possible interactions, on both a behavioural (detection performance},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1111/ejn.15514

Ella Podvalny; Leana E. King; Biyu J. He

Spectral signature and behavioral consequence of spontaneous shifts of pupil-linked arousal in human Journal Article

In: eLife, vol. 10, pp. e68265, 2021.

Abstract | Links | BibTeX

@article{Podvalny2021,
title = {Spectral signature and behavioral consequence of spontaneous shifts of pupil-linked arousal in human},
author = {Ella Podvalny and Leana E. King and Biyu J. He},
doi = {10.7554/eLife.68265},
year = {2021},
date = {2021-01-01},
journal = {eLife},
volume = {10},
pages = {e68265},
abstract = {Arousal levels perpetually rise and fall spontaneously. How markers of arousal—pupil size and frequency content of brain activity—relate to each other and influence behavior in humans is poorly understood. We simultaneously monitored magnetoencephalography and pupil in healthy volunteers at rest and during a visual perceptual decision-making task. Spontaneously varying pupil size correlates with power of brain activity in most frequency bands across large-scale resting-state cortical networks. Pupil size recorded at prestimulus baseline correlates with subsequent shifts in detection bias (c) and sensitivity (d'). When dissociated from pupil-linked state, prestimulus spectral power of resting state networks still predicts perceptual behavior. Fast spontaneous pupil constriction and dilation correlate with large-scale brain activity as well but not perceptual behavior. Our results illuminate the relation between central and peripheral arousal markers and their respective roles in human perceptual decision-making.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.7554/eLife.68265

Estelle Raffin; Adrien Witon; Roberto F Salamanca-Giron; Krystel R Huxlin; Friedhelm C Hummel

Functional segregation within the dorsal frontoparietal network: A multimodal dynamic causal modeling study Journal Article

In: Cerebral Cortex, pp. 1–19, 2021.

Abstract | Links | BibTeX

@article{Raffin2021,
title = {Functional segregation within the dorsal frontoparietal network: A multimodal dynamic causal modeling study},
author = {Estelle Raffin and Adrien Witon and Roberto F Salamanca-Giron and Krystel R Huxlin and Friedhelm C Hummel},
doi = {10.1093/cercor/bhab409},
year = {2021},
date = {2021-01-01},
journal = {Cerebral Cortex},
pages = {1--19},
abstract = {Discrimination and integration of motion direction requires the interplay of multiple brain areas. Theoretical accounts of perception suggest that stimulus-related (i.e., exogenous) and decision-related (i.e., endogenous) factors affect distributed neuronal processing at different levels of the visual hierarchy. To test these predictions, we measured brain activity of healthy participants during a motion discrimination task, using electroencephalography (EEG) and functional magnetic resonance imaging (fMRI). We independently modeled the impact of exogenous factors (task demand) and endogenous factors (perceptual decision-making) on the activity of the motion discrimination network and applied Dynamic Causal Modeling (DCM) to both modalities. DCM for event-related potentials (DCM-ERP) revealed that task demand impacted the reciprocal connections between the primary visual cortex (V1) and medial temporal areas (V5). With practice, higher visual areas were increasingly involved, as revealed by DCM-fMRI. Perceptual decision-making modulated higher levels (e.g., V5-to-Frontal Eye Fields, FEF), in a manner predictive of performance. Our data suggest that lower levels of the visual network support early, feature-based selection of responses, especially when learning strategies have not been implemented. In contrast, perceptual decision-making operates at higher levels of the visual hierarchy by integrating sensory information with the internal state of the subject.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1093/cercor/bhab409

Hamed Rahimi-Nasrabadi; Jianzhong Jin; Reece Mazade; Carmen Pons; Sohrab Najafian; Jose Manuel Alonso

Image luminance changes contrast sensitivity in visual cortex Journal Article

In: Cell Reports, vol. 34, no. 5, pp. 1–21, 2021.

Abstract | Links | BibTeX

@article{RahimiNasrabadi2021,
title = {Image luminance changes contrast sensitivity in visual cortex},
author = {Hamed Rahimi-Nasrabadi and Jianzhong Jin and Reece Mazade and Carmen Pons and Sohrab Najafian and Jose Manuel Alonso},
doi = {10.1016/j.celrep.2021.108692},
year = {2021},
date = {2021-01-01},
journal = {Cell Reports},
volume = {34},
number = {5},
pages = {1--21},
publisher = {Elsevier Company.},
abstract = {Accurate measures of contrast sensitivity are important for evaluating visual disease progression and for navigation safety. Previous measures suggested that cortical contrast sensitivity was constant across widely different luminance ranges experienced indoors and outdoors. Against this notion, here, we show that luminance range changes contrast sensitivity in both cat and human cortex, and the changes are different for dark and light stimuli. As luminance range increases, contrast sensitivity increases more within cortical pathways signaling lights than those signaling darks. Conversely, when the luminance range is constant, light-dark differences in contrast sensitivity remain relatively constant even if background luminance changes. We show that a Naka-Rushton function modified to include luminance range and light-dark polarity accurately replicates both the statistics of light-dark features in natural scenes and the cortical responses to multiple combinations of contrast and luminance. We conclude that differences in light-dark contrast increase with luminance range and are largest in bright environments.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.celrep.2021.108692

Isabelle A. Rosenthal; Shridhar R. Singh; Katherine L. Hermann; Dimitrios Pantazis; Bevil R. Conway

Color space geometry uncovered with magnetoencephalography Journal Article

In: Current Biology, vol. 31, no. 3, pp. 515–526, 2021.

Abstract | Links | BibTeX

@article{Rosenthal2021,
title = {Color space geometry uncovered with magnetoencephalography},
author = {Isabelle A. Rosenthal and Shridhar R. Singh and Katherine L. Hermann and Dimitrios Pantazis and Bevil R. Conway},
doi = {10.1016/j.cub.2020.10.062},
year = {2021},
date = {2021-01-01},
journal = {Current Biology},
volume = {31},
number = {3},
pages = {515--526},
publisher = {Elsevier Ltd.},
abstract = {The geometry that describes the relationship among colors, and the neural mechanisms that support color vision, are unsettled. Here, we use multivariate analyses of measurements of brain activity obtained with magnetoencephalography to reverse-engineer a geometry of the neural representation of color space. The analyses depend upon determining similarity relationships among the spatial patterns of neural responses to different colors and assessing how these relationships change in time. We evaluate the approach by relating the results to universal patterns in color naming. Two prominent patterns of color naming could be accounted for by the decoding results: the greater precision in naming warm colors compared to cool colors evident by an interaction of hue and lightness, and the preeminence among colors of reddish hues. Additional experiments showed that classifiers trained on responses to color words could decode color from data obtained using colored stimuli, but only at relatively long delays after stimulus onset. These results provide evidence that perceptual representations can give rise to semantic representations, but not the reverse. Taken together, the results uncover a dynamic geometry that provides neural correlates for color appearance and generates new hypotheses about the structure of color space.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.cub.2020.10.062

Giulia C. Salgari; Geoffrey F. Potts; Joseph Schmidt; Chi C. Chan; Christopher C. Spencer; Jeffrey S. Bedwell

Event-related potentials to rare visual targets and negative symptom severity in a transdiagnostic psychiatric sample Journal Article

In: Clinical Neurophysiology, vol. 132, no. 7, pp. 1526–1536, 2021.

Abstract | Links | BibTeX

@article{Salgari2021,
title = {Event-related potentials to rare visual targets and negative symptom severity in a transdiagnostic psychiatric sample},
author = {Giulia C. Salgari and Geoffrey F. Potts and Joseph Schmidt and Chi C. Chan and Christopher C. Spencer and Jeffrey S. Bedwell},
doi = {10.1016/j.clinph.2021.02.398},
year = {2021},
date = {2021-01-01},
journal = {Clinical Neurophysiology},
volume = {132},
number = {7},
pages = {1526--1536},
publisher = {International Federation of Clinical Neurophysiology},
abstract = {Objectives: Negative psychiatric symptoms are often resistant to treatments, regardless of the disorder in which they appear. One model for a cause of negative symptoms is impairment in higher-order cognition. The current study examined how particular bottom-up and top-down mechanisms of selective attention relate to severity of negative symptoms across a transdiagnostic psychiatric sample. Methods: The sample consisted of 130 participants: 25 schizophrenia-spectrum disorders, 26 bipolar disorders, 18 unipolar depression, and 61 nonpsychiatric controls. The relationships between attentional event-related potentials following rare visual targets (i.e., N1, N2b, P2a, and P3b) and severity of the negative symptom domains of anhedonia, avolition, and blunted affect were evaluated using frequentist and Bayesian analyses. Results: P3b and N2b mean amplitudes were inversely related to the Positive and Negative Syndrome Scale-Negative Symptom Factor severity score across the entire sample. Subsequent regression analyses showed a significant negative transdiagnostic relationship between P3b amplitude and blunted affect severity. Conclusions: Results indicate that negative symptoms, and particularly blunted affect, may have a stronger association with deficits in top-down mechanisms of selective attention. Significance: This suggests that people with greater severity of blunted affect, independent of diagnosis, do not allocate sufficient cognitive resources when engaging in activities requiring selective attention.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.clinph.2021.02.398

Sebastian Schindler; Niko Busch; Maximilian Bruchmann; Maren Isabel Wolf; Thomas Straube

Early ERP functions are indexed by lateralized effects to peripherally presented emotional faces and scrambles Journal Article

In: Psychophysiology, vol. 59, no. 2, pp. e13959, 2021.

Abstract | Links | BibTeX

@article{Schindler2021,
title = {Early ERP functions are indexed by lateralized effects to peripherally presented emotional faces and scrambles},
author = {Sebastian Schindler and Niko Busch and Maximilian Bruchmann and Maren Isabel Wolf and Thomas Straube},
doi = {10.1111/psyp.13959},
year = {2021},
date = {2021-01-01},
journal = {Psychophysiology},
volume = {59},
number = {2},
pages = {e13959},
abstract = {A large body of research suggests that early event-related potentials (ERPs), such as the P1 and N1, are potentiated by attention and represent stimulus amplification. However, recent accounts suggest that the P1 is associated with inhibiting the irrelevant visual field evidenced by a pronounced ipsilateral P1 during sustained attention to peripherally presented stimuli. The current EEG study further investigated this issue to reveal how lateralized ERP findings are modulated by face and emotional information. Therefore, participants were asked to fixate the center of the screen and pay sustained attention either to the right or left visual field, where angry or neutral faces or their Fourier phase-scrambled versions were presented. We found a bilateral P1 to all stimuli with relatively increased, but delayed, ipsilateral P1 amplitudes to faces but not to scrambles. Explorative independent component analyses dissociated an earlier lateralized larger contralateral P1 from a later bilateral P1. By contrast, the N170 showed a contralateral enhancement to all stimuli, which was most pronounced for neutral faces attended in the left hemifield. Finally, increased contralateral alpha power was found for both attended hemifields but was not significantly related to poststimulus ERPs. These results provide evidence against a general inhibitory role of the P1 but suggest stimulus-specific relative enhancements of the ipsilateral P1 for the irrelevant visual hemifield. The lateralized N170, however, is associated with stimulus amplification as a function of facial features.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1111/psyp.13959

Sebastian Schindler; Clara Tirloni; Maximilian Bruchmann; Thomas Straube

Face and emotional expression processing under continuous perceptual load tasks: An ERP study Journal Article

In: Biological Psychology, vol. 161, pp. 108056, 2021.

Abstract | Links | BibTeX

@article{Schindler2021a,
title = {Face and emotional expression processing under continuous perceptual load tasks: An ERP study},
author = {Sebastian Schindler and Clara Tirloni and Maximilian Bruchmann and Thomas Straube},
doi = {10.1016/j.biopsycho.2021.108056},
year = {2021},
date = {2021-01-01},
journal = {Biological Psychology},
volume = {161},
pages = {108056},
publisher = {Elsevier B.V.},
abstract = {High perceptual load is thought to impair already the early stages of visual processing of task-irrelevant visual stimuli. However, recent studies showed no effects of perceptual load on early ERPs in response to task-irrelevant emotional faces. In this preregistered EEG study (N = 40), we investigated the effects of continuous perceptual load on ERPs to fearful and neutral task-irrelevant faces and their phase-scrambled versions. Perceptual load did not modulate face or emotion effects for the P1 or N170. In contrast, larger face-scramble and fearful-neutral differentiation were found during low as compared to high load for the Early Posterior Negativity (EPN). Further, face-independent P1, but face-dependent N170 emotional modulations were observed. Taken together, our findings show that P1 and N170 face and emotional modulations are highly resistant to load manipulations, indicating a high degree of automaticity during this processing stage, whereas the EPN might represent a bottleneck in visual information processing.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.biopsycho.2021.108056

Constanze Schmitt; Jakob C. B. Schwenk; Adrian Schütz; Jan Churan; André Kaminiarz; Frank Bremmer

Preattentive processing of visually guided self-motion in humans and monkeys Journal Article

In: Progress in Neurobiology, vol. 205, pp. 102117, 2021.

Abstract | Links | BibTeX

@article{Schmitt2021,
title = {Preattentive processing of visually guided self-motion in humans and monkeys},
author = {Constanze Schmitt and Jakob C. B. Schwenk and Adrian Schütz and Jan Churan and André Kaminiarz and Frank Bremmer},
doi = {10.1016/j.pneurobio.2021.102117},
year = {2021},
date = {2021-01-01},
journal = {Progress in Neurobiology},
volume = {205},
pages = {102117},
abstract = {The visually-based control of self-motion is a challenging task, requiring – if needed – immediate adjustments to keep on track. Accordingly, it would appear advantageous if the processing of self-motion direction (heading) was predictive, thereby accelerating the encoding of unexpected changes, and un-impaired by attentional load. We tested this hypothesis by recording EEG in humans and macaque monkeys with similar experimental protocols. Subjects viewed a random dot pattern simulating self-motion across a ground plane in an oddball EEG paradigm. Standard and deviant trials differed only in their simulated heading direction (forward-left vs. forward-right). Event-related potentials (ERPs) were compared in order to test for the occurrence of a visual mismatch negativity (vMMN), a component that reflects preattentive and likely also predictive processing of sensory stimuli. Analysis of the ERPs revealed signatures of a prediction mismatch for deviant stimuli in both humans and monkeys. In humans, a MMN was observed starting 110 ms after self-motion onset. In monkeys, peak response amplitudes following deviant stimuli were enhanced compared to the standard already 100 ms after self-motion onset. We consider our results strong evidence for a preattentive processing of visual self-motion information in humans and monkeys, allowing for ultrafast adjustments of their heading direction.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.pneurobio.2021.102117

Jack W. Silcox; Brennan R. Payne

The costs (and benefits) of effortful listening on context processing: A simultaneous electrophysiology, pupillometry, and behavioral study Journal Article

In: Cortex, vol. 142, pp. 296–316, 2021.

Abstract | Links | BibTeX

@article{Silcox2021,
title = {The costs (and benefits) of effortful listening on context processing: A simultaneous electrophysiology, pupillometry, and behavioral study},
author = {Jack W. Silcox and Brennan R. Payne},
doi = {10.1016/j.cortex.2021.06.007},
year = {2021},
date = {2021-01-01},
journal = {Cortex},
volume = {142},
pages = {296--316},
publisher = {Elsevier Ltd},
abstract = {There is an apparent disparity between the fields of cognitive audiology and cognitive electrophysiology as to how linguistic context is used when listening to perceptually challenging speech. To gain a clearer picture of how listening effort impacts context use, we conducted a pre-registered study to simultaneously examine electrophysiological, pupillometric, and behavioral responses when listening to sentences varying in contextual constraint and acoustic challenge in the same sample. Participants (N = 44) listened to sentences that were highly constraining and completed with expected or unexpected sentence-final words (“The prisoners were planning their escape/party”) or were low-constraint sentences with unexpected sentence-final words (“All day she thought about the party”). Sentences were presented either in quiet or with +3 dB SNR background noise. Pupillometry and EEG were simultaneously recorded and subsequent sentence recognition and word recall were measured. While the N400 expectancy effect was diminished by noise, suggesting impaired real-time context use, we simultaneously observed a beneficial effect of constraint on subsequent recognition memory for degraded speech. Importantly, analyses of trial-to-trial coupling between pupil dilation and N400 amplitude showed that when participants showed increased listening effort (i.e., greater pupil dilation), there was a subsequent recovery of the N400 effect, but at the same time, higher effort was related to poorer subsequent sentence recognition and word recall. Collectively, these findings suggest divergent effects of acoustic challenge and listening effort on context use: while noise impairs the rapid use of context to facilitate lexical semantic processing in general, this negative effect is attenuated when listeners show increased effort in response to noise. However, this effort-induced reliance on context for online word processing comes at the cost of poorer subsequent memory.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.cortex.2021.06.007

Rodolfo Solís-Vivanco; Ole Jensen; Mathilde Bonnefond

New insights on the ventral attention network: Active suppression and involuntary recruitment during a bimodal task Journal Article

In: Human Brain Mapping, vol. 42, no. 6, pp. 1699–1713, 2021.

Abstract | Links | BibTeX

@article{SolisVivanco2021,
title = {New insights on the ventral attention network: Active suppression and involuntary recruitment during a bimodal task},
author = {Rodolfo Solís-Vivanco and Ole Jensen and Mathilde Bonnefond},
doi = {10.1002/hbm.25322},
year = {2021},
date = {2021-01-01},
journal = {Human Brain Mapping},
volume = {42},
number = {6},
pages = {1699--1713},
abstract = {Detection of unexpected, yet relevant events is essential in daily life. fMRI studies have revealed the involvement of the ventral attention network (VAN), including the temporo-parietal junction (TPJ), in such process. In this MEG study with 34 participants (17 women), we used a bimodal (visual/auditory) attention task to determine the neuronal dynamics associated with suppression of the activity of the VAN during top-down attention and its recruitment when information from the unattended sensory modality is involuntarily integrated. We observed an anticipatory power increase of alpha/beta oscillations (12–20 Hz, previously associated with functional inhibition) in the VAN following a cue indicating the modality to attend. Stronger VAN power increases were associated with better task performance, suggesting that the VAN suppression prevents shifting attention to distractors. Moreover, the TPJ was synchronized with the frontal eye field in that frequency band, indicating that the dorsal attention network (DAN) might participate in such suppression. Furthermore, we found a 12–20 Hz power decrease and enhanced synchronization, in both the VAN and DAN, when information between sensory modalities was congruent, suggesting an involvement of these networks when attention is involuntarily enhanced due to multisensory integration. Our results show that effective multimodal attentional allocation includes the modulation of the VAN and DAN through upper-alpha/beta oscillations. Altogether these results indicate that the suppressing role of alpha/beta oscillations might operate beyond sensory regions.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1002/hbm.25322

Jemaine E. Stacey; Mark Crook-Rumsey; Alexander Sumich; Christina J. Howard; Trevor Crawford; Kinneret Livne; Sabrina Lenzoni; Stephen Badham

Age differences in resting state EEG and their relation to eye movements and cognitive performance Journal Article

In: Neuropsychologia, vol. 157, pp. 107887, 2021.

Abstract | Links | BibTeX

@article{Stacey2021,
title = {Age differences in resting state EEG and their relation to eye movements and cognitive performance},
author = {Jemaine E. Stacey and Mark Crook-Rumsey and Alexander Sumich and Christina J. Howard and Trevor Crawford and Kinneret Livne and Sabrina Lenzoni and Stephen Badham},
doi = {10.1016/j.neuropsychologia.2021.107887},
year = {2021},
date = {2021-01-01},
journal = {Neuropsychologia},
volume = {157},
pages = {107887},
publisher = {Elsevier Ltd},
abstract = {Prior research has focused on EEG differences across age or EEG differences across cognitive tasks/eye tracking. There are few studies linking age differences in EEG to age differences in behavioural performance which is necessary to establish how neuroactivity corresponds to successful and impaired ageing. Eighty-six healthy participants completed a battery of cognitive tests and eye-tracking measures. Resting state EEG (n = 75, 31 young, 44 older adults) was measured for delta, theta, alpha and beta power as well as for alpha peak frequency. Age deficits in cognition were aligned with the literature, showing working memory and inhibitory deficits along with an older adult advantage in vocabulary. Older adults showed poorer eye movement accuracy and response times, but we did not replicate literature showing a greater age deficit for antisaccades than for prosaccades. We replicated EEG literature showing lower alpha peak frequency in older adults but not literature showing lower alpha power. Older adults also showed higher beta power and less parietal alpha power asymmetry than young adults. Interaction effects showed that better prosaccade performance was related to lower beta power in young adults but not in older adults. Performance at the trail making test part B (measuring task switching and inhibition) was improved for older adults with higher resting state delta power but did not depend on delta power for young adults. It is argued that individuals with higher slow-wave resting EEG may be more resilient to age deficits in tasks that utilise cross-cortical processing.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.neuropsychologia.2021.107887

Benjamin J Stauch; Alina Peter; Heike Schuler; Pascal Fries

Stimulus-specific plasticity in human visual gamma-band activity and functional connectivity Journal Article

In: eLife, vol. 10, pp. e68240, 2021.

Abstract | Links | BibTeX

@article{Stauch2021,
title = {Stimulus-specific plasticity in human visual gamma-band activity and functional connectivity},
author = {Benjamin J Stauch and Alina Peter and Heike Schuler and Pascal Fries},
doi = {10.7554/elife.68240},
year = {2021},
date = {2021-01-01},
journal = {eLife},
volume = {10},
pages = {e68240},
abstract = {Under natural conditions, the visual system often sees a given input repeatedly. This provides an opportunity to optimize processing of the repeated stimuli. Stimulus repetition has been shown to strongly modulate neuronal-gamma band synchronization, yet crucial questions remained open. Here we used magnetoencephalography in 30 human subjects and find that gamma decreases across ≈10 repetitions and then increases across further repetitions, revealing plastic changes of the activated neuronal circuits. Crucially, increases induced by one stimulus did not affect responses to other stimuli, demonstrating stimulus specificity. Changes partially persisted when the inducing stimulus was repeated after 25 minutes of intervening stimuli. They were strongest in early visual cortex and increased interareal feedforward influences. Our results suggest that early visual cortex gamma synchronization enables adaptive neuronal processing of recurring stimuli. These and previously reported changes might be due to an interaction of oscillatory dynamics with established synaptic plasticity mechanisms.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.7554/elife.68240

David W. Sutterer; Andrew J. Coia; Vincent Sun; Steven K. Shevell; Edward Awh

Decoding chromaticity and luminance from patterns of EEG activity Journal Article

In: Psychophysiology, vol. 58, no. 4, pp. e13779, 2021.

Abstract | Links | BibTeX

@article{Sutterer2021,
title = {Decoding chromaticity and luminance from patterns of EEG activity},
author = {David W. Sutterer and Andrew J. Coia and Vincent Sun and Steven K. Shevell and Edward Awh},
doi = {10.1111/psyp.13779},
year = {2021},
date = {2021-01-01},
journal = {Psychophysiology},
volume = {58},
number = {4},
pages = {e13779},
abstract = {A long-standing question in the field of vision research is whether scalp-recorded EEG activity contains sufficient information to identify stimulus chromaticity. Recent multivariate work suggests that it is possible to decode which chromaticity an observer is viewing from the multielectrode pattern of EEG activity. There is debate, however, about whether the claimed effects of stimulus chromaticity on visual evoked potentials (VEPs) are instead caused by unequal stimulus luminances, which are achromatic differences. Here, we tested whether stimulus chromaticity could be decoded when potential confounds with luminance were minimized by (1) equating chromatic stimuli in luminance using heterochromatic flicker photometry for each observer and (2) independently varying the chromaticity and luminance of target stimuli, enabling us to test whether the pattern for a given chromaticity generalized across wide variations in luminance. We also tested whether luminance variations can be decoded from the topography of voltage across the scalp. In Experiment 1, we presented two chromaticities (appearing red and green) at three luminance levels during separate trials. In Experiment 2, we presented four chromaticities (appearing red, orange, yellow, and green) at two luminance levels. Using a pattern classifier and the multielectrode pattern of EEG activity, we were able to accurately decode the chromaticity and luminance level of each stimulus. Furthermore, we were able to decode stimulus chromaticity when we trained the classifier on chromaticities presented at one luminance level and tested at a different luminance level. Thus, EEG topography contains robust information regarding stimulus chromaticity, despite large variations in stimulus luminance.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1111/psyp.13779

David W. Sutterer; Sean M. Polyn; Geoffrey F. Woodman

α-Band activity tracks a two-dimensional spotlight of attention during spatial working memory maintenance Journal Article

In: Journal of Neurophysiology, vol. 125, no. 3, pp. 957–971, 2021.

Abstract | Links | BibTeX

@article{Sutterer2021a,
title = {α-Band activity tracks a two-dimensional spotlight of attention during spatial working memory maintenance},
author = {David W. Sutterer and Sean M. Polyn and Geoffrey F. Woodman},
doi = {10.1152/jn.00582.2020},
year = {2021},
date = {2021-01-01},
journal = {Journal of Neurophysiology},
volume = {125},
number = {3},
pages = {957--971},
abstract = {Covert spatial attention is thought to facilitate the maintenance of locations in working memory, and EEG α-band activity (8–12 Hz) is proposed to track the focus of covert attention. Recent work has shown that multivariate patterns of α-band activity track the polar angle of remembered locations relative to fixation. However, a defining feature of covert spatial attention is that it facilitates processing in a specific region of the visual field, and prior work has not determined whether patterns of α-band activity track the two-dimensional (2-D) coordinates of remembered stimuli within a visual hemifield or are instead maximally sensitive to the polar angle of remembered locations around fixation. Here, we used a lateralized spatial estimation task, in which observers remembered the location of one or two target dots presented to one side of fixation, to test this question. By applying a linear discriminant classifier to the topography of α-band activity, we found that we were able to decode the location of remembered stimuli. Critically, model comparison revealed that the pattern of classifier choices observed across remembered positions was best explained by a model assuming that α-band activity tracks the 2-D coordinates of remembered locations rather than a model assuming that α-band activity tracks the polar angle of remembered locations relative to fixation. These results support the hypothesis that this α-band activity is involved in the spotlight of attention, and arises from mid- to lower-level visual areas involved in maintaining spatial locations in working memory. NEW & NOTEWORTHY A substantial body of work has shown that patterns of EEG α-band activity track the angular coordinates of attended and remembered stimuli around fixation, but whether these patterns track the two-dimensional coordinates of stimuli presented within a visual hemifield remains an open question. Here, we demonstrate that α-band activity tracks the two-dimensional coordinates of remembered stimuli within a hemifield, showing that α-band activity reflects a spotlight of attention focused on locations maintained in working memory.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1152/jn.00582.2020

Yu Takagi; Laurence Tudor Hunt; Mark W. Woolrich; Timothy E. J. Behrens; Miriam C. Klein-Flügge

Adapting non-invasive human recordings along multiple task-axes shows unfolding of spontaneous and over-trained choice Journal Article

In: eLife, vol. 10, pp. 1–27, 2021.

Abstract | Links | BibTeX

@article{Takagi2021,
title = {Adapting non-invasive human recordings along multiple task-axes shows unfolding of spontaneous and over-trained choice},
author = {Yu Takagi and Laurence Tudor Hunt and Mark W. Woolrich and Timothy E. J. Behrens and Miriam C. Klein-Flügge},
doi = {10.7554/eLife.60988},
year = {2021},
date = {2021-01-01},
journal = {eLife},
volume = {10},
pages = {1--27},
abstract = {Choices rely on a transformation of sensory inputs into motor responses. Using invasive single neuron recordings, the evolution of a choice process has been tracked by projecting population neural responses into state spaces. Here, we develop an approach that allows us to recover similar trajectories on a millisecond timescale in non-invasive human recordings. We selectively suppress activity related to three task-axes, relevant and irrelevant sensory inputs and response direction, in magnetoencephalography data acquired during context-dependent choices. Recordings from premotor cortex show a progression from processing sensory input to processing the response. In contrast to previous macaque recordings, information related to choice-irrelevant features is represented more weakly than choice-relevant sensory information. To test whether this mechanistic difference between species is caused by extensive over-training common in non-human primate studies, we trained humans on >20,000 trials of the task. Choice-irrelevant features were still weaker than relevant features in premotor cortex after over-training.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.7554/eLife.60988

Travis N. Talcott; Nicholas Gaspelin

Eye movements are not mandatorily preceded by the N2pc component Journal Article

In: Psychophysiology, vol. 58, no. 6, pp. e13821, 2021.

Abstract | Links | BibTeX

@article{Talcott2021,
title = {Eye movements are not mandatorily preceded by the N2pc component},
author = {Travis N. Talcott and Nicholas Gaspelin},
doi = {10.1111/psyp.13821},
year = {2021},
date = {2021-01-01},
journal = {Psychophysiology},
volume = {58},
number = {6},
pages = {e13821},
abstract = {Researchers typically distinguish between two mechanisms of attentional selection in vision: overt and covert attention. A commonplace assumption is that overt eye movements are automatically preceded by shifts of covert attention during visual search. Although the N2pc component is a putative index of covert attentional orienting, little is currently known about its relationship with overt eye movements. This is because most previous studies of the N2pc component prohibit overt eye movements. The current study assessed this relationship by concurrently measuring covert attention (via the N2pc) and overt eye movements (via eye tracking). Participants searched displays for a lateralized target stimulus and were allowed to generate overt eye movements during the search. We then assessed whether overt eye movements were preceded by the N2pc component. The results indicated that saccades were preceded by an N2pc component, but only when participants were required to carefully inspect the target stimulus before initiating the eye movement. When participants were allowed to make naturalistic eye movements in service of visual search, there was no evidence of an N2pc component before eye movements. These findings suggest that the N2pc component does not always precede overt eye movements during visual search. Implications for understanding the relationship between covert and overt attention are discussed.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1111/psyp.13821

Mats W. J. van Es; Tom R. Marshall; Eelke Spaak; Ole Jensen; Jan-Mathijs Schoffelen

Phasic modulation of visual representations during sustained attention Journal Article

In: European Journal of Neuroscience, pp. 1–18, 2021.

Abstract | Links | BibTeX

@article{Es2021,
title = {Phasic modulation of visual representations during sustained attention},
author = {Mats W. J. van Es and Tom R. Marshall and Eelke Spaak and Ole Jensen and Jan-Mathijs Schoffelen},
doi = {10.1111/ejn.15084},
year = {2021},
date = {2021-01-01},
journal = {European Journal of Neuroscience},
pages = {1--18},
abstract = {Sustained attention has long been thought to benefit perception in a continuous fashion, but recent evidence suggests that it affects perception in a discrete, rhythmic way. Periodic fluctuations in behavioral performance over time, and modulations of behavioral performance by the phase of spontaneous oscillatory brain activity point to an attentional sampling rate in the theta or alpha frequency range. We investigated whether such discrete sampling by attention is reflected in periodic fluctuations in the decodability of visual stimulus orientation from magnetoencephalographic (MEG) brain signals. In this exploratory study, human subjects attended one of two grating stimuli while MEG was being recorded. We assessed the strength of the visual representation of the attended stimulus using a support vector machine (SVM) to decode the orientation of the grating (clockwise vs. counterclockwise) from the MEG signal. We tested whether decoder performance depended on the theta/alpha phase of local brain activity. While the phase of ongoing activity in visual cortex did not modulate decoding performance, theta/alpha phase of activity in the FEF and parietal cortex, contralateral to the attended stimulus did modulate decoding performance. These findings suggest that phasic modulations of visual stimulus representations in the brain are caused by frequency-specific top-down activity in the fronto-parietal attention network.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1111/ejn.15084

TianHong Zhang; YingYu Yang; LiHua Xu; XiaoChen Tang; YeGang Hu; Xin Xiong; YanYan Wei; HuiRu Cui; YingYing Tang; HaiChun Liu; Tao Chen; Zhi Liu; Li Hui; ChunBo Li; XiaoLi Guo; JiJun Wang

Inefficient integration during multiple facial processing in pre-morbid and early phases of psychosis Journal Article

In: The World Journal of Biological Psychiatry, pp. 1–13, 2021.

Abstract | Links | BibTeX

@article{Zhang2021f,
title = {Inefficient integration during multiple facial processing in pre-morbid and early phases of psychosis},
author = {TianHong Zhang and YingYu Yang and LiHua Xu and XiaoChen Tang and YeGang Hu and Xin Xiong and YanYan Wei and HuiRu Cui and YingYing Tang and HaiChun Liu and Tao Chen and Zhi Liu and Li Hui and ChunBo Li and XiaoLi Guo and JiJun Wang},
doi = {10.1080/15622975.2021.2011402},
year = {2021},
date = {2021-01-01},
journal = {The World Journal of Biological Psychiatry},
pages = {1--13},
publisher = {Taylor & Francis},
abstract = {Objectives: We used eye-tracking to evaluate multiple facial context processing and event-related potential (ERP) to evaluate multiple facial recognition in individuals at clinical high risk (CHR) for psychosis. Methods: In total, 173 subjects (83 CHRs and 90 healthy controls [HCs]) were included and their emotion perception performances were assessed. A total of 40 CHRs and 40 well-matched HCs completed an eye-tracking task where they viewed pictures depicting a person in the foreground, presented as context-free, context-compatible, and context-incompatible. During the two-year follow-up, 26 CHRs developed psychosis, including 17 individuals who developed first-episode schizophrenia (FES). Eighteen well-matched HCs were made to complete the face number detection ERP task with image stimuli of one, two, or three faces. Results: Compared to the HC group, the CHR group showed reduced visual attention to contextual processing when viewing multiple faces. With the increasing complexity of contextual faces, the differences in eye-tracking characteristics also increased. In the ERP task, the N170 amplitude decreased with a higher face number in FES patients, while it increased with a higher face number in HCs. Conclusions: Individuals in the very early phase of psychosis showed facial processing deficits with supporting evidence of different scan paths during context processing and disruption of N170 during multiple facial recognition.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1080/15622975.2021.2011402

Wieske Van Zoest; Christoph Huber-Huber; Matthew D. Weaver; Clayton Hickey

Strategic distractor suppression improves selective control in human vision Journal Article

In: Journal of Neuroscience, vol. 41, no. 33, pp. 7120–7135, 2021.

Abstract | Links | BibTeX

@article{VanZoest2021,
title = {Strategic distractor suppression improves selective control in human vision},
author = {Wieske Van Zoest and Christoph Huber-Huber and Matthew D. Weaver and Clayton Hickey},
doi = {10.1523/JNEUROSCI.0553-21.2021},
year = {2021},
date = {2021-01-01},
journal = {Journal of Neuroscience},
volume = {41},
number = {33},
pages = {7120--7135},
abstract = {Our visual environment is complicated, and our cognitive capacity is limited. As a result, we must strategically ignore some stimuli to prioritize others. Common sense suggests that foreknowledge of distractor characteristics, like location or color, might help us ignore these objects. But empirical studies have provided mixed evidence, often showing that knowing about a distractor before it appears counterintuitively leads to its attentional selection. What has looked like strategic distractor suppression in the past is now commonly explained as a product of prior experience and implicit statistical learning, and the long-standing notion that distractor suppression is reflected in α-band oscillatory brain activity has been challenged by results appearing to link α to target resolution. Can we strategically, proactively suppress distractors? And, if so, does this involve α? Here, we use the concurrent recording of human EEG and eye movements in optimized experimental designs to identify behavior and brain activity associated with proactive distractor suppression. Results from three experiments show that knowing about distractors before they appear causes a reduction in electrophysiological indices of covert attentional selection of these objects and a reduction in the overt deployment of the eyes to the location of the objects. This control is established before the distractor appears and is predicted by the power of cue-elicited α activity over the visual cortex. Foreknowledge of distractor characteristics therefore leads to improved selective control, and α oscillations in visual cortex reflect the implementation of this strategic, proactive mechanism.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1523/JNEUROSCI.0553-21.2021
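
The entry above links proactive distractor suppression to the power of cue-elicited alpha activity over visual cortex. Purely as an illustration (not code from the cited study), the sketch below shows one common way to estimate pre-stimulus alpha power from epoched EEG; the array shapes, filter settings, and time window are assumptions.

```python
# Hypothetical sketch: estimate cue-elicited (pre-stimulus) alpha power from
# epoched EEG. Shapes, filter order, and the analysis window are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def prestim_alpha_power(epochs, sfreq, t0, window=(-0.5, 0.0), band=(8.0, 12.0)):
    """epochs: (n_trials, n_channels, n_times); t0: time (s) of the first sample."""
    b, a = butter(4, [band[0], band[1]], btype="bandpass", fs=sfreq)
    alpha = filtfilt(b, a, epochs, axis=-1)           # alpha-band filtered signal
    power = np.abs(hilbert(alpha, axis=-1)) ** 2      # instantaneous alpha power
    times = t0 + np.arange(epochs.shape[-1]) / sfreq
    mask = (times >= window[0]) & (times < window[1])
    return power[..., mask].mean(axis=-1)             # (n_trials, n_channels)

# Simulated example: 100 trials, 2 posterior channels, 1 s epochs at 500 Hz
rng = np.random.default_rng(0)
sim = rng.standard_normal((100, 2, 500))
print(prestim_alpha_power(sim, sfreq=500.0, t0=-0.8).shape)  # (100, 2)
```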

Bo Yao; Jason R. Taylor; Briony Banks; Sonja A. Kotz

Reading direct speech quotes increases theta phase-locking: Evidence for cortical tracking of inner speech? Journal Article

In: NeuroImage, vol. 239, pp. 118313, 2021.

Abstract | Links | BibTeX

@article{Yao2021a,
title = {Reading direct speech quotes increases theta phase-locking: Evidence for cortical tracking of inner speech?},
author = {Bo Yao and Jason R. Taylor and Briony Banks and Sonja A. Kotz},
doi = {10.1016/j.neuroimage.2021.118313},
year = {2021},
date = {2021-01-01},
journal = {NeuroImage},
volume = {239},
pages = {118313},
publisher = {Elsevier Inc.},
abstract = {Growing evidence shows that theta-band (4–7 Hz) activity in the auditory cortex phase-locks to rhythms of overt speech. Does theta activity also encode the rhythmic dynamics of inner speech? Previous research established that silent reading of direct speech quotes (e.g., Mary said: “This dress is lovely!”) elicits more vivid inner speech than indirect speech quotes (e.g., Mary said that the dress was lovely). As we cannot directly track the phase alignment between theta activity and inner speech over time, we used EEG to measure the brain's phase-locked responses to the onset of speech quote reading. We found that direct (vs. indirect) quote reading was associated with increased theta phase synchrony over trials at 250–500 ms post-reading onset, with sources of the evoked activity estimated in the speech processing network. An eye-tracking control experiment confirmed that increased theta phase synchrony in direct quote reading was not driven by eye movement patterns, and more likely reflects synchronous phase resetting at the onset of inner speech. These findings suggest a functional role of theta phase modulation in reading-induced inner speech.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.neuroimage.2021.118313
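
The phase-locking effect described above is typically quantified as inter-trial phase coherence (ITPC). Below is a minimal, hypothetical sketch of a theta-band ITPC computation with MNE-Python; the frequencies, cycle counts, and data shapes are illustrative assumptions rather than the cited study's pipeline.

```python
# Hypothetical sketch of theta-band inter-trial phase coherence (ITPC).
import numpy as np
import mne

def theta_itpc(epochs_data, sfreq, freqs=np.arange(4, 8)):
    """epochs_data: (n_trials, n_channels, n_times) -> ITPC (n_channels, n_freqs, n_times)."""
    tfr = mne.time_frequency.tfr_array_morlet(
        epochs_data, sfreq=sfreq, freqs=freqs, n_cycles=freqs / 2.0, output="complex"
    )
    phase_vectors = tfr / np.abs(tfr)          # unit-length phase vectors per trial
    return np.abs(phase_vectors.mean(axis=0))  # resultant vector length across trials

rng = np.random.default_rng(1)
itpc = theta_itpc(rng.standard_normal((80, 4, 600)), sfreq=500.0)
print(itpc.shape)  # (4, 4, 600)
```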

Stephen Whitmarsh; Christophe Gitton; Veikko Jousmäki; Jérôme Sackur; Catherine Tallon-Baudry

Neuronal correlates of the subjective experience of attention Journal Article

In: European Journal of Neuroscience, no. January, pp. 1–18, 2021.

Abstract | Links | BibTeX

@article{Whitmarsh2021,
title = {Neuronal correlates of the subjective experience of attention},
author = {Stephen Whitmarsh and Christophe Gitton and Veikko Jousmäki and Jérôme Sackur and Catherine Tallon-Baudry},
doi = {10.1111/ejn.15395},
year = {2021},
date = {2021-01-01},
journal = {European Journal of Neuroscience},
number = {January},
pages = {1--18},
abstract = {The effect of top–down attention on stimulus-evoked responses and alpha oscillations and the association between arousal and pupil diameter are well established. However, the relationship between these indices, and their contribution to the subjective experience of attention, remains largely unknown. Participants performed a sustained (10–30 s) attention task in which rare (10%) targets were detected within continuous tactile stimulation (16 Hz). Trials were followed by attention ratings on an 8-point visual scale. Attention ratings correlated negatively with contralateral somatosensory alpha power and positively with pupil diameter. The effect of pupil diameter on attention ratings extended into the following trial, reflecting a sustained aspect of attention related to vigilance. The effect of alpha power did not carry over to the next trial and furthermore mediated the association between pupil diameter and attention ratings. Variations in steady-state amplitude reflected stimulus processing under the influence of alpha oscillations but were only weakly related to subjective ratings of attention. Together, our results show that both alpha power and pupil diameter are reflected in the subjective experience of attention, albeit on different time spans, while continuous stimulus processing might not contribute to the experience of attention.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1111/ejn.15395

Aurélien Weiss; Valérian Chambon; Junseok K. Lee; Jan Drugowitsch; Valentin Wyart

Interacting with volatile environments stabilizes hidden-state inference and its brain signatures Journal Article

In: Nature Communications, vol. 12, pp. 2228, 2021.

Abstract | Links | BibTeX

@article{Weiss2021,
title = {Interacting with volatile environments stabilizes hidden-state inference and its brain signatures},
author = {Aurélien Weiss and Valérian Chambon and Junseok K. Lee and Jan Drugowitsch and Valentin Wyart},
doi = {10.1038/s41467-021-22396-6},
year = {2021},
date = {2021-01-01},
journal = {Nature Communications},
volume = {12},
pages = {2228},
publisher = {Springer US},
abstract = {Making accurate decisions in uncertain environments requires identifying the generative cause of sensory cues, but also the expected outcomes of possible actions. Although both cognitive processes can be formalized as Bayesian inference, they are commonly studied using different experimental frameworks, making their formal comparison difficult. Here, by framing a reversal learning task either as cue-based or outcome-based inference, we found that humans perceive the same volatile environment as more stable when inferring its hidden state by interaction with uncertain outcomes than by observation of equally uncertain cues. Multivariate patterns of magnetoencephalographic (MEG) activity reflected this behavioral difference in the neural interaction between inferred beliefs and incoming evidence, an effect originating from associative regions in the temporal lobe. Together, these findings indicate that the degree of control over the sampling of volatile environments shapes human learning and decision-making under uncertainty.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1038/s41467-021-22396-6

Maximilian Bruchmann; Sebastian Schindler; Mandana Dinyarian; Thomas Straube

The role of phase and orientation for ERP modulations of spectrum-manipulated fearful and neutral faces Journal Article

In: Psychophysiology, pp. e13974, 2021.

Abstract | Links | BibTeX

@article{Bruchmann2021,
title = {The role of phase and orientation for ERP modulations of spectrum-manipulated fearful and neutral faces},
author = {Maximilian Bruchmann and Sebastian Schindler and Mandana Dinyarian and Thomas Straube},
doi = {10.1111/psyp.13974},
year = {2021},
date = {2021-01-01},
journal = {Psychophysiology},
pages = {e13974},
abstract = {Prioritized processing of fearful compared to neutral faces has been proposed to result from evolutionary adaptation of the contrast sensitivity function (CSF) to the features of emotionally relevant faces and/or vice versa. However, it is unknown whether a stimulus merely has to feature the amplitude spectrum of a fearful face to be prioritized or whether the relevant spatial frequencies have to occur with specific phases and orientations. Prioritized processing is indexed by specific increases of Event-Related Potentials (ERPs) of the EEG and occurs throughout different early processing stages, indexed by emotion-related modulations of the P1, N170, and EPN. In this pre-registered study, we manipulated phase and amplitude properties of the Fourier spectra of neutral and fearful faces to test the effect of phase coherence (PC, face vs. scramble) and orientation coherence (OC, original vs. rotational average) and their interactions with differential emotion processing. We found that differential emotion processing was not present at the level of P1 but strongly affected N170 and EPN. In both cases, intact phase coherence was required for enhanced processing of fearful faces. OC did not interact with emotion. While faces produced the typical N170 effect, we observed a reversed effect for scrambles. Additional exploratory independent component analysis (ICA) suggests that this reversal could signal a mismatch between an early "perceptual hypothesis" and feedback of configural information. In line with our expectations, fearful-neutral differences for the N170 and EPN depend on configural information, i.e., recognizable faces.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1111/psyp.13974
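
The entry above manipulates the phase and amplitude spectra of face images. As a generic illustration of Fourier phase scrambling (keeping an image's amplitude spectrum while replacing its phase spectrum), here is a minimal NumPy sketch; the actual stimulus construction in the study may differ.

```python
# Hypothetical sketch of Fourier phase scrambling of an image.
import numpy as np

def phase_scramble(image, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    amplitude = np.abs(np.fft.fft2(image))
    # Taking the phase of a random real-valued image preserves the conjugate
    # symmetry needed for a real-valued result after the inverse transform.
    random_phase = np.angle(np.fft.fft2(rng.standard_normal(image.shape)))
    return np.real(np.fft.ifft2(amplitude * np.exp(1j * random_phase)))

face = np.random.default_rng(2).random((256, 256))   # stand-in for a face image
scrambled = phase_scramble(face)
# The amplitude spectrum is unchanged while the image itself is unrecognizable.
print(np.allclose(np.abs(np.fft.fft2(face)), np.abs(np.fft.fft2(scrambled)), atol=1e-6))
```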

Anne Buot; Damiano Azzalini; Maximilien Chaumon; Catherine Tallon-Baudry

Does stroke volume influence heartbeat evoked responses? Journal Article

In: Biological Psychology, vol. 165, pp. 108165, 2021.

Abstract | Links | BibTeX

@article{Buot2021,
title = {Does stroke volume influence heartbeat evoked responses?},
author = {Anne Buot and Damiano Azzalini and Maximilien Chaumon and Catherine Tallon-Baudry},
doi = {10.1016/j.biopsycho.2021.108165},
year = {2021},
date = {2021-01-01},
journal = {Biological Psychology},
volume = {165},
pages = {108165},
publisher = {Elsevier B.V.},
abstract = {We know surprisingly little on how heartbeat-evoked responses (HERs) vary with cardiac parameters. Here, we measured both stroke volume, or volume of blood ejected at each heartbeat, with impedance cardiography, and HER amplitude with magneto-encephalography, in 21 male and female participants at rest with eyes open. We observed that HER co-fluctuates with stroke volume on a beat-to-beat basis, but only when no correction for cardiac artifact was performed. This highlights the importance of an ICA correction tailored to the cardiac artifact. We also observed that easy-to-measure cardiac parameters (interbeat intervals, ECG amplitude) are sensitive to stroke volume fluctuations and can be used as proxies when stroke volume measurements are not available. Finally, interindividual differences in stroke volume were reflected in MEG data, but whether this effect is locked to heartbeats is unclear. Altogether, our results question assumptions on the link between stroke volume and HERs.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.biopsycho.2021.108165
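
The entry above stresses the importance of an ICA correction tailored to the cardiac artifact. Below is a minimal, hypothetical sketch of ECG-related component removal with MNE-Python; it assumes a Raw recording that includes an ECG channel, and the component count and filter settings are illustrative rather than the authors' pipeline.

```python
# Hypothetical sketch of ICA-based removal of the cardiac artifact with MNE-Python.
# 'raw' is assumed to be a preloaded mne.io.Raw object containing an ECG channel.
from mne.preprocessing import ICA

def remove_cardiac_artifact(raw, n_components=30, random_state=97):
    high_passed = raw.copy().filter(l_freq=1.0, h_freq=None)  # ICA is more stable on high-passed data
    ica = ICA(n_components=n_components, random_state=random_state)
    ica.fit(high_passed)
    ecg_components, scores = ica.find_bads_ecg(raw)            # components correlated with the ECG
    ica.exclude = ecg_components
    cleaned = ica.apply(raw.copy())                            # reconstruct data without those components
    return cleaned, ecg_components
```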

Christoforos Christoforou; Argyro Fella; Paavo H. T. Leppänen; George K. Georgiou; Timothy C. Papadopoulos

Fixation-related potentials in naming speed: A combined EEG and eye-tracking study on children with dyslexia Journal Article

In: Clinical Neurophysiology, vol. 132, no. 11, pp. 2798–2807, 2021.

Abstract | Links | BibTeX

@article{Christoforou2021,
title = {Fixation-related potentials in naming speed: A combined EEG and eye-tracking study on children with dyslexia},
author = {Christoforos Christoforou and Argyro Fella and Paavo H. T. Leppänen and George K. Georgiou and Timothy C. Papadopoulos},
doi = {10.1016/j.clinph.2021.08.013},
year = {2021},
date = {2021-01-01},
journal = {Clinical Neurophysiology},
volume = {132},
number = {11},
pages = {2798--2807},
publisher = {International Federation of Clinical Neurophysiology},
abstract = {Objective: We combined electroencephalography (EEG) and eye-tracking recordings to examine the underlying factors elicited during the serial Rapid-Automatized Naming (RAN) task that may differentiate between children with dyslexia (DYS) and chronological age controls (CAC). Methods: Thirty children with DYS and 30 CAC (M_age = 9.79 years; age range 7.6 through 12.1 years) performed a set of serial RAN tasks. We extracted fixation-related potentials (FRPs) under phonologically similar (rime-confound) or visually similar (resembling lowercase letters) and dissimilar (non-confounding and discrete uppercase letters, respectively) control tasks. Results: Results revealed significant differences in FRP amplitudes between DYS and CAC groups under the phonologically similar and phonologically non-confounding conditions. No differences were observed in the case of the visual conditions. Moreover, regression analysis showed that the average amplitude of the extracted components significantly predicted RAN performance. Conclusion: FRPs capture neural components during the serial RAN task informative of differences between DYS and CAC and establish a relationship between neurocognitive processes during serial RAN and dyslexia. Significance: We suggest our approach as a methodological model for the concurrent analysis of neurophysiological and eye-gaze data to decipher the role of RAN in reading.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.clinph.2021.08.013
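
Fixation-related potentials are obtained by segmenting the continuous EEG around fixation onsets provided by the eye tracker. The sketch below is a generic NumPy illustration of that step; window lengths, the baseline choice, and variable names are assumptions, not the cited study's procedure.

```python
# Hypothetical sketch of fixation-related potential (FRP) epoching.
import numpy as np

def fixation_related_epochs(eeg, fix_onsets, sfreq, tmin=-0.2, tmax=0.6):
    """eeg: (n_channels, n_samples); fix_onsets: fixation-onset sample indices."""
    pre, post = int(round(tmin * sfreq)), int(round(tmax * sfreq))
    epochs = []
    for onset in fix_onsets:
        if onset + pre < 0 or onset + post > eeg.shape[1]:
            continue                                               # skip fixations near the edges
        segment = eeg[:, onset + pre:onset + post]
        baseline = segment[:, :-pre].mean(axis=1, keepdims=True)   # mean over tmin..0
        epochs.append(segment - baseline)
    return np.stack(epochs)                                        # (n_fixations, n_channels, n_times)

rng = np.random.default_rng(3)
epochs = fixation_related_epochs(rng.standard_normal((32, 50000)), [1000, 2500, 4000], sfreq=500)
print(epochs.mean(axis=0).shape)  # average FRP: (32, 400)
```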

Edan Daniel; Ilan Dinstein

Individual magnitudes of neural variability quenching are associated with motion perception abilities Journal Article

In: Journal of Neurophysiology, vol. 125, no. 4, pp. 1111–1120, 2021.

Abstract | Links | BibTeX

@article{Daniel2021,
title = {Individual magnitudes of neural variability quenching are associated with motion perception abilities},
author = {Edan Daniel and Ilan Dinstein},
doi = {10.1152/jn.00355.2020},
year = {2021},
date = {2021-01-01},
journal = {Journal of Neurophysiology},
volume = {125},
number = {4},
pages = {1111--1120},
abstract = {Remarkable trial-by-trial variability is apparent in cortical responses to repeating stimulus presentations. This neural variability across trials is relatively high before stimulus presentation and then reduced (i.e., quenched) ~0.2 s after stimulus presentation. Individual subjects exhibit different magnitudes of variability quenching, and previous work from our lab has revealed that individuals with larger variability quenching exhibit lower (i.e., better) perceptual thresholds in a contrast discrimination task. Here, we examined whether similar findings were also apparent in a motion detection task, which is processed by distinct neural populations in the visual system. We recorded EEG data from 35 adult subjects as they detected the direction of coherent motion in random dot kinematograms. The results demonstrated that individual magnitudes of variability quenching were significantly correlated with coherent motion thresholds, particularly when presenting stimuli with low dot densities, where coherent motion was more difficult to detect. These findings provide consistent support for the hypothesis that larger magnitudes of neural variability quenching are associated with better perceptual abilities in multiple visual domain tasks. NEW & NOTEWORTHY The current study demonstrates that better visual perception abilities in a motion discrimination task are associated with larger quenching of neural variability. In line with previous studies and signal detection theory principles, these findings support the hypothesis that cortical sensory neurons increase reproducibility to enhance detection and discrimination of sensory stimuli.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1152/jn.00355.2020
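
Variability quenching is commonly quantified as the change in across-trial variance from a pre-stimulus to a post-stimulus window. The following is a minimal NumPy sketch under that definition; the windows and array shapes are illustrative assumptions.

```python
# Hypothetical sketch of trial-to-trial variability quenching (% change in
# across-trial variance from a pre- to a post-stimulus window).
import numpy as np

def variability_quenching(epochs, times, pre=(-0.2, 0.0), post=(0.15, 0.4)):
    """epochs: (n_trials, n_channels, n_times); times: (n_times,) in seconds."""
    trial_variance = epochs.var(axis=0)                                   # variance across trials
    pre_var = trial_variance[:, (times >= pre[0]) & (times < pre[1])].mean(axis=1)
    post_var = trial_variance[:, (times >= post[0]) & (times < post[1])].mean(axis=1)
    return 100.0 * (post_var - pre_var) / pre_var                         # per-channel % change

times = np.arange(-0.3, 0.7, 1.0 / 500)
rng = np.random.default_rng(4)
print(variability_quenching(rng.standard_normal((60, 8, times.size)), times).shape)  # (8,)
```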

Jonathan Daume; Peng Wang; Alexander Maye; Dan Zhang; Andreas K. Engel

Non-rhythmic temporal prediction involves phase resets of low-frequency delta oscillations Journal Article

In: NeuroImage, vol. 224, pp. 117376, 2021.

Abstract | Links | BibTeX

@article{Daume2021,
title = {Non-rhythmic temporal prediction involves phase resets of low-frequency delta oscillations},
author = {Jonathan Daume and Peng Wang and Alexander Maye and Dan Zhang and Andreas K. Engel},
doi = {10.1016/j.neuroimage.2020.117376},
year = {2021},
date = {2021-01-01},
journal = {NeuroImage},
volume = {224},
pages = {117376},
publisher = {Elsevier Inc.},
abstract = {The phase of neural oscillatory signals aligns to the predicted onset of upcoming stimulation. Whether such phase alignments represent phase resets of underlying neural oscillations or just rhythmically evoked activity, and whether they can be observed in a rhythm-free visual context, however, remains unclear. Here, we recorded the magnetoencephalogram while participants were engaged in a temporal prediction task, judging the visual or tactile reappearance of a uniformly moving stimulus. The prediction conditions were contrasted with a control condition to dissociate phase adjustments of neural oscillations from stimulus-driven activity. We observed stronger delta band inter-trial phase consistency (ITPC) in a network of sensory, parietal and frontal brain areas, but no power increase reflecting stimulus-driven or prediction-related evoked activity. Delta ITPC further correlated with prediction performance in the cerebellum and visual cortex. Our results provide evidence that phase alignments of low-frequency neural oscillations underlie temporal predictions in a non-rhythmic visual and crossmodal context.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.neuroimage.2020.117376

Saeideh Davoudi; Mohsen Parto Dezfouli; Robert T. Knight; Mohammad Reza Daliri; Elizabeth L. Johnson

Prefrontal lesions disrupt posterior alpha–gamma coordination of visual working memory representations Journal Article

In: Journal of Cognitive Neuroscience, vol. 33, no. 9, pp. 1798–1810, 2021.

Abstract | Links | BibTeX

@article{Davoudi2021,
title = {Prefrontal lesions disrupt posterior alpha–gamma coordination of visual working memory representations},
author = {Saeideh Davoudi and Mohsen Parto Dezfouli and Robert T. Knight and Mohammad Reza Daliri and Elizabeth L. Johnson},
doi = {10.1162/jocn_a_01715},
year = {2021},
date = {2021-01-01},
journal = {Journal of Cognitive Neuroscience},
volume = {33},
number = {9},
pages = {1798--1810},
abstract = {How does the human brain prioritize different visual representations in working memory (WM)? Here, we define the oscillatory mechanisms supporting selection of “where” and “when” features from visual WM storage and investigate the role of pFC in feature selection. Fourteen individuals with lateral pFC damage and 20 healthy controls performed a visuospatial WM task while EEG was recorded. On each trial, two shapes were presented sequentially in a top/bottom spatial orientation. A retro-cue presented mid-delay prompted which of the two shapes had been in either the top/bottom spatial position or first/second temporal position. We found that cross-frequency coupling between parieto-occipital alpha (α; 8–12 Hz) oscillations and topographically distributed gamma (γ; 30–50 Hz) activity tracked selection of the distinct cued feature in controls. This signature of feature selection was disrupted in patients with pFC lesions, despite intact α–γ coupling independent of feature selection. These findings reveal a pFC-dependent parieto-occipital α–γ mechanism for the rapid selection of visual WM representations.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1162/jocn_a_01715
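
Alpha–gamma cross-frequency coupling of the kind described above can be quantified in several ways; one generic choice is the mean-vector-length measure of Canolty et al. (2006). The sketch below illustrates that measure for a single channel; the filter bands and parameters are assumptions, and this is not necessarily the coupling metric used by the authors.

```python
# Hypothetical sketch of alpha-gamma phase-amplitude coupling (mean vector length).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs):
    b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
    return filtfilt(b, a, x)

def alpha_gamma_coupling(signal, fs):
    """signal: 1-D single-channel time series; returns the coupling strength."""
    alpha_phase = np.angle(hilbert(bandpass(signal, 8.0, 12.0, fs)))       # phase of alpha
    gamma_amplitude = np.abs(hilbert(bandpass(signal, 30.0, 50.0, fs)))    # gamma envelope
    return np.abs(np.mean(gamma_amplitude * np.exp(1j * alpha_phase)))

rng = np.random.default_rng(5)
print(alpha_gamma_coupling(rng.standard_normal(5000), fs=500.0))
```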

Jan Willem De Gee; Camile M. C. Correa; Matthew Weaver; Tobias H. Donner; Simon Van Gaal

Pupil dilation and the slow wave ERP reflect surprise about choice outcome resulting from intrinsic variability in decision confidence Journal Article

In: Cerebral Cortex, vol. 31, no. 7, pp. 3565–3578, 2021.

Abstract | Links | BibTeX

@article{DeGee2021,
title = {Pupil dilation and the slow wave ERP reflect surprise about choice outcome resulting from intrinsic variability in decision confidence},
author = {Jan Willem De Gee and Camile M. C. Correa and Matthew Weaver and Tobias H. Donner and Simon Van Gaal},
doi = {10.1093/cercor/bhab032},
year = {2021},
date = {2021-01-01},
journal = {Cerebral Cortex},
volume = {31},
number = {7},
pages = {3565--3578},
abstract = {Central to human and animal cognition is the ability to learn from feedback in order to optimize future rewards. Such a learning signal might be encoded and broadcasted by the brain's arousal systems, including the noradrenergic locus coeruleus. Pupil responses and the positive slow wave component of event-related potentials reflect rapid changes in the arousal level of the brain. Here, we ask whether and how these variables may reflect surprise: the mismatch between one's expectation about being correct and the outcome of a decision, when expectations fluctuate due to internal factors (e.g., engagement). We show that during an elementary decision task in the face of uncertainty both physiological markers of phasic arousal reflect surprise. We further show that pupil responses and slow wave event-related potential are unrelated to each other and that prediction error computations depend on feedback awareness. These results further advance our understanding of the role of central arousal systems in decision-making under uncertainty.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1093/cercor/bhab032

Megan T. Debettencourt; Stephanie D. Williams; Edward K. Vogel; Edward Awh

Sustained attention and spatial attention distinctly influence long-term memory encoding Journal Article

In: Journal of Cognitive Neuroscience, vol. 33, no. 10, pp. 2132–2148, 2021.

Abstract | Links | BibTeX

@article{Debettencourt2021,
title = {Sustained attention and spatial attention distinctly influence long-term memory encoding},
author = {Megan T. Debettencourt and Stephanie D. Williams and Edward K. Vogel and Edward Awh},
doi = {10.1162/jocn_a_01748},
year = {2021},
date = {2021-01-01},
journal = {Journal of Cognitive Neuroscience},
volume = {33},
number = {10},
pages = {2132--2148},
abstract = {Our attention is critically important for what we remember. Prior measures of the relationship between attention and memory, however, have largely treated “attention” as a monolith. Here, across three experiments, we provide evidence for two dissociable aspects of attention that influence encoding into long-term memory. Using spatial cues together with a sensitive continuous report procedure, we find that long-term memory response error is affected by both trial-by-trial fluctuations of sustained attention and prioritization via covert spatial attention. Furthermore, using multivariate analyses of EEG, we track both sustained attention and spatial attention before stimulus onset. Intriguingly, even during moments of low sustained attention, there is no decline in the representation of the spatially attended location, showing that these two aspects of attention have robust but independent effects on long-term memory encoding. Finally, sustained and spatial attention predicted distinct variance in long-term memory performance across individuals. That is, the relationship between attention and long-term memory suggests a composite model, wherein distinct attentional subcomponents influence encoding into long-term memory. These results point toward a taxonomy of the distinct attentional processes that constrain our memories.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1162/jocn_a_01748

Federica Degno; Otto Loberg; Simon P. Liversedge

Co-registration of eye movements and fixation-related potentials in natural reading: Practical issues of experimental design and data analysis Journal Article

In: Collabra: Psychology, vol. 7, no. 1, pp. 1–28, 2021.

Abstract | Links | BibTeX

@article{Degno2021,
title = {Co-registration of eye movements and fixation-related potentials in natural reading: Practical issues of experimental design and data analysis},
author = {Federica Degno and Otto Loberg and Simon P. Liversedge},
doi = {10.1525/collabra.18032},
year = {2021},
date = {2021-01-01},
journal = {Collabra: Psychology},
volume = {7},
number = {1},
pages = {1--28},
abstract = {A growing number of studies are using co-registration of eye movement (EM) and fixation-related potential (FRP) measures to investigate reading. However, the number of co-registration experiments remains small when compared to the number of studies in the literature conducted with EMs and event-related potentials (ERPs) alone. One reason for this is the complexity of the experimental design and data analyses. The present paper is designed to support researchers who might have expertise in conducting reading experiments with EM or ERP techniques and are wishing to take their first steps towards co-registration research. The objective of this paper is threefold. First, to provide an overview of the issues that such researchers would face. Second, to provide a critical overview of the methodological approaches available to date to deal with these issues. Third, to offer an example pipeline and a full set of scripts for data preprocessing that may be adopted and adapted for one's own needs. The data preprocessing steps are based on EM data parsing via Data Viewer (SR Research), and the provided scripts are written in Matlab and R. Ultimately, with this paper we hope to encourage other researchers to run co-registration experiments to study reading and human cognition more generally.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1525/collabra.18032

Gisella K. Diaz; Edward K. Vogel; Edward Awh

Perceptual grouping reveals distinct roles for sustained slow wave activity and alpha oscillations in working memory Journal Article

In: Journal of Cognitive Neuroscience, vol. 33, no. 7, pp. 1354–1364, 2021.

Abstract | Links | BibTeX

@article{Diaz2021,
title = {Perceptual grouping reveals distinct roles for sustained slow wave activity and alpha oscillations in working memory},
author = {Gisella K. Diaz and Edward K. Vogel and Edward Awh},
doi = {10.1162/jocn_a_01719},
year = {2021},
date = {2021-01-01},
journal = {Journal of Cognitive Neuroscience},
volume = {33},
number = {7},
pages = {1354--1364},
abstract = {Multiple neural signals have been found to track the number of items stored in working memory (WM). These signals include oscillatory activity in the alpha band and slow-wave components in human EEG, both of which vary with storage loads and predict individual differences in WM capacity. However, recent evidence suggests that these two signals play distinct roles in spatial attention and item-based storage in WM. Here, we examine the hypothesis that sustained negative voltage deflections over parieto-occipital electrodes reflect the number of individuated items in WM, whereas oscillatory activity in the alpha frequency band (8–12 Hz) within the same electrodes tracks the attended positions in the visual display. We measured EEG activity while participants stored the orientation of visual elements that were either grouped by collinearity or not. This grouping manipulation altered the number of individuated items perceived while holding constant the number of locations occupied by visual stimuli. The negative slow wave tracked the number of items stored and was reduced in amplitude in the grouped condition. By contrast, oscillatory activity in the alpha frequency band tracked the number of positions occupied by the memoranda and was unaffected by perceptual grouping. Perceptual grouping, then, reduced the number of individuated representations stored in WM as reflected by the negative slow wave, whereas the location of each element was actively maintained as indicated by alpha power. These findings contribute to the emerging idea that distinct classes of EEG signals work in concert to successfully maintain online representations in WM.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1162/jocn_a_01719

Marcos Domic-Siede; Martín Irani; Joaquín Valdés; Marcela Perrone-Bertolotti; Tomás Ossandón

Theta activity from frontopolar cortex, mid-cingulate cortex and anterior cingulate cortex shows different roles in cognitive planning performance Journal Article

In: NeuroImage, vol. 226, pp. 117557, 2021.

Abstract | Links | BibTeX

@article{DomicSiede2021,
title = {Theta activity from frontopolar cortex, mid-cingulate cortex and anterior cingulate cortex shows different roles in cognitive planning performance},
author = {Marcos Domic-Siede and Martín Irani and Joaquín Valdés and Marcela Perrone-Bertolotti and Tomás Ossandón},
doi = {10.1016/j.neuroimage.2020.117557},
year = {2021},
date = {2021-01-01},
journal = {NeuroImage},
volume = {226},
pages = {117557},
publisher = {Elsevier Inc.},
abstract = {Cognitive planning, the ability to develop a sequenced plan to achieve a goal, plays a crucial role in human goal-directed behavior. However, the specific role of frontal structures in planning is unclear. We used a novel and ecological task that allowed us to separate the planning period from the execution period. The spatio-temporal dynamics of EEG recordings showed that planning induced a progressive and sustained increase of frontal-midline theta activity (FMθ) over time. Source analyses indicated that this activity was generated within the prefrontal cortex. Theta activity from the right mid-Cingulate Cortex (MCC) and the left Anterior Cingulate Cortex (ACC) was correlated with an increase in the time needed for elaborating plans. On the other hand, left Frontopolar cortex (FP) theta activity exhibited a negative correlation with the time required for executing a plan. Since reaction times of planning execution correlated with correct responses, left FP theta activity might be associated with efficiency and accuracy in making a plan. Associations between theta activity from the right MCC and the left ACC with reaction times of the planning period may reflect high cognitive demand of the task, due to the engagement of attentional control and conflict monitoring implementation. In turn, the specific association between left FP theta activity and planning performance may reflect the participation of this brain region in successfully self-generated plans.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.neuroimage.2020.117557

Linda Drijvers; Ole Jensen; Eelke Spaak

Rapid invisible frequency tagging reveals nonlinear integration of auditory and visual information Journal Article

In: Human Brain Mapping, vol. 42, no. 4, pp. 1138–1152, 2021.

Abstract | Links | BibTeX

@article{Drijvers2021,
title = {Rapid invisible frequency tagging reveals nonlinear integration of auditory and visual information},
author = {Linda Drijvers and Ole Jensen and Eelke Spaak},
doi = {10.1002/hbm.25282},
year = {2021},
date = {2021-01-01},
journal = {Human Brain Mapping},
volume = {42},
number = {4},
pages = {1138--1152},
abstract = {During communication in real-life settings, the brain integrates information from auditory and visual modalities to form a unified percept of our environment. In the current magnetoencephalography (MEG) study, we used rapid invisible frequency tagging (RIFT) to generate steady-state evoked fields and investigated the integration of audiovisual information in a semantic context. We presented participants with videos of an actress uttering action verbs (auditory; tagged at 61 Hz) accompanied by a gesture (visual; tagged at 68 Hz, using a projector with a 1,440 Hz refresh rate). Integration difficulty was manipulated by lower-order auditory factors (clear/degraded speech) and higher-order visual factors (congruent/incongruent gesture). We identified MEG spectral peaks at the individual (61/68 Hz) tagging frequencies. We furthermore observed a peak at the intermodulation frequency of the auditory and visually tagged signals (f_visual − f_auditory = 7 Hz), specifically when lower-order integration was easiest because signal quality was optimal. This intermodulation peak is a signature of nonlinear audiovisual integration, and was strongest in left inferior frontal gyrus and left temporal regions; areas known to be involved in speech-gesture integration. The enhanced power at the intermodulation frequency thus reflects the ease of lower-order audiovisual integration and demonstrates that speech-gesture information interacts in higher-order language areas. Furthermore, we provide a proof-of-principle of the use of RIFT to study the integration of audiovisual stimuli, in relation to, for instance, semantic context.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1002/hbm.25282
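
Frequency-tagging analyses read out spectral power at the stimulation frequencies and at their intermodulation frequency (here 68 − 61 = 7 Hz). The sketch below is a hypothetical NumPy illustration of that readout on simulated sensor data; it is not the MEG source-level analysis used in the study.

```python
# Hypothetical sketch: power at the tagging frequencies (61 and 68 Hz) and at
# their intermodulation frequency (7 Hz) from the trial-averaged signal.
import numpy as np

def tagging_power(epochs, sfreq, freqs_of_interest=(61.0, 68.0, 7.0)):
    """epochs: (n_trials, n_times) for one sensor; returns power at each frequency."""
    evoked = epochs.mean(axis=0)                        # keep only the phase-locked part
    spectrum = np.abs(np.fft.rfft(evoked)) ** 2
    freqs = np.fft.rfftfreq(evoked.size, d=1.0 / sfreq)
    return {f: spectrum[np.argmin(np.abs(freqs - f))] for f in freqs_of_interest}

sfreq = 1000.0
t = np.arange(0, 2.0, 1.0 / sfreq)
rng = np.random.default_rng(6)
trials = np.sin(2 * np.pi * 61 * t) + np.sin(2 * np.pi * 68 * t) + rng.standard_normal((50, t.size))
print(tagging_power(trials, sfreq))  # large power at 61 and 68 Hz, little at 7 Hz here
```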

Amie J. Durston; Roxane J. Itier

The early processing of fearful and happy facial expressions is independent of task demands – Support from mass univariate analyses Journal Article

In: Brain Research, vol. 1765, pp. 147505, 2021.

Abstract | Links | BibTeX

@article{Durston2021,
title = {The early processing of fearful and happy facial expressions is independent of task demands – Support from mass univariate analyses},
author = {Amie J. Durston and Roxane J. Itier},
doi = {10.1016/j.brainres.2021.147505},
year = {2021},
date = {2021-01-01},
journal = {Brain Research},
volume = {1765},
pages = {147505},
publisher = {Elsevier B.V.},
abstract = {Most ERP studies on facial expressions of emotion have yielded inconsistent results regarding the time course of emotion effects and their possible modulation by task demands. Most studies have used classical statistical methods with a high likelihood of type I and type II errors, which can be limited with Mass Univariate statistics. FMUT and LIMO are currently the only two available toolboxes for Mass Univariate analysis of ERP data and use different fundamental statistics. Yet, no direct comparison of their output has been performed on the same dataset. Given the current push to transition to robust statistics to increase results replicability, here we compared the output of these toolboxes on data previously analyzed using classic approaches (Itier & Neath-Tavares, 2017). The early (0–352 ms) processing of fearful, happy, and neutral faces was investigated under three tasks in a within-subject design that also controlled gaze fixation location. Both toolboxes revealed main effects of emotion and task but neither yielded an interaction between the two, confirming the early processing of fear and happy expressions is largely independent of task demands. Both toolboxes found virtually no difference between neutral and happy expressions, while fearful (compared to neutral and happy) expressions modulated the N170 and EPN but elicited maximum effects after the N170 peak, around 190 ms. Similarities and differences in the spatial and temporal extent of these effects are discussed in comparison to the published classical analysis and the rest of the ERP literature.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.brainres.2021.147505
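
Mass univariate testing of ERP data can take several forms; FMUT and LIMO (named above) are dedicated toolboxes, while MNE-Python offers generic cluster-based permutation tests. The sketch below uses the latter on simulated condition data purely to illustrate the mass univariate idea; it is not the FMUT or LIMO procedure, and the shapes and parameters are assumptions.

```python
# Hypothetical sketch of a mass univariate two-condition comparison using a
# cluster-based permutation test over time points (MNE-Python).
import numpy as np
from mne.stats import permutation_cluster_test

rng = np.random.default_rng(7)
fearful = rng.standard_normal((30, 300))   # subjects x time points, condition A
neutral = rng.standard_normal((30, 300))   # subjects x time points, condition B

stat_obs, clusters, cluster_pvals, _ = permutation_cluster_test(
    [fearful, neutral], n_permutations=1000, seed=0
)
print([p for p in cluster_pvals if p < 0.05])   # clusters surviving correction (none expected here)
```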

Tobias Feldmann-Wöstefeld; Marina Weinberger; Edward Awh

Spatially guided distractor suppression during visual search Journal Article

In: Journal of Neuroscience, vol. 41, no. 14, pp. 3180–3191, 2021.

Abstract | Links | BibTeX

@article{FeldmannWoestefeld2021,
title = {Spatially guided distractor suppression during visual search},
author = {Tobias Feldmann-Wöstefeld and Marina Weinberger and Edward Awh},
doi = {10.1523/JNEUROSCI.2418-20.2021},
year = {2021},
date = {2021-01-01},
journal = {Journal of Neuroscience},
volume = {41},
number = {14},
pages = {3180--3191},
abstract = {Past work has demonstrated that active suppression of salient distractors is a critical part of visual selection. Evidence for goal-driven suppression includes below-baseline visual encoding at the position of salient distractors (Gaspelin and Luck, 2018) and neural signals such as the distractor positivity (Pd) that track how many distractors are presented in a given hemifield (Feldmann-Wöstefeld and Vogel, 2019). One basic question regarding distractor suppression is whether it is inherently spatial or nonspatial in character. Indeed, past work has shown that distractors evoke both spatial (Theeuwes, 1992) and nonspatial forms of interference (Folk and Remington, 1998), motivating a direct examination of whether space is integral to goal-driven distractor suppression. Here, we use behavioral and EEG data from adult humans (male and female) to provide clear evidence for a spatial gradient of suppression surrounding salient singleton distractors. Replicating past work, both reaction time and neural indices of target selection improved monotonically as the distance between target and distractor increased. Importantly, these target selection effects were paralleled by a monotonic decline in the amplitude of the Pd, an electrophysiological index of distractor suppression. Moreover, multivariate analyses revealed spatially selective activity in the alpha band that tracked the position of the target and, critically, revealed suppressed activity at spatial channels centered on distractor positions. Thus, goal-driven selection of relevant over irrelevant information benefits from a spatial gradient of suppression surrounding salient distractors.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
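
The Pd component discussed in this abstract is conventionally quantified as a contralateral-minus-ipsilateral voltage difference at lateral posterior electrodes, defined relative to the distractor's hemifield. The sketch below illustrates that generic computation; the PO7/PO8 electrode pair and the 200-350 ms window are common conventions taken as assumptions here, not the parameters of this study.

import numpy as np

def pd_amplitude(epochs, times, ch_names, distractor_side, window=(0.2, 0.35)):
    """epochs: (n_trials, n_channels, n_times) in volts; distractor_side: list of 'left'/'right'."""
    po7, po8 = ch_names.index("PO7"), ch_names.index("PO8")
    tmask = (times >= window[0]) & (times <= window[1])
    left_mean = epochs[:, po7, :][:, tmask].mean(axis=1)   # per-trial mean voltage at PO7
    right_mean = epochs[:, po8, :][:, tmask].mean(axis=1)  # per-trial mean voltage at PO8
    side = np.asarray(distractor_side)
    contra = np.where(side == "left", right_mean, left_mean)  # electrode opposite the distractor
    ipsi = np.where(side == "left", left_mean, right_mean)
    return float((contra - ipsi).mean())  # Pd: positive contra-minus-ipsi difference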

Tobias Feldmann-Wüstefeld

Neural measures of working memory in a bilateral change detection task Journal Article

In: Psychophysiology, vol. 58, no. 1, pp. e13683, 2021.

Abstract | Links | BibTeX

@article{FeldmannWuestefeld2021,
title = {Neural measures of working memory in a bilateral change detection task},
author = {Tobias Feldmann-Wüstefeld},
doi = {10.1111/psyp.13683},
year = {2021},
date = {2021-01-01},
journal = {Psychophysiology},
volume = {58},
number = {1},
pages = {e13683},
abstract = {The change detection task is a widely used paradigm to examine visual working memory processes. Participants memorize a set of items and then, try to detect changes in the set after a retention period. The negative slow wave (NSW) and contralateral delay activity (CDA) are event-related potentials in the EEG signal that are commonly used in change detection tasks to track working memory load, as both increase with the number of items maintained in working memory (set size). While the CDA was argued to more purely reflect the memory-specific neural activity than the NSW, it also requires a lateralized design and attention shifts prior to memoranda onset, imposing more restrictions on the task than the NSW. The present study proposes a novel change detection task in which both CDA and NSW can be measured at the same time. Memory items were presented bilaterally, but their distribution in the left and right hemifield varied, inducing a target imbalance or “net load.” NSW increased with set size, whereas CDA increased with net load. In addition, a multivariate linear classifier was able to decode the set size and net load from the EEG signal. CDA, NSW, and decoding accuracy predicted an individual's working memory capacity. In line with the notion of a bilateral advantage in working memory, accuracy, and CDA data suggest that participants tended to encode items relatively balanced. In sum, this novel change detection task offers a basis to make use of converging neural measures of working memory in a comprehensive paradigm.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
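
As an illustration of the multivariate decoding analysis mentioned in the abstract, the sketch below trains a cross-validated linear classifier to predict set size from per-trial EEG features. The feature definition (one value per channel), the classifier choice, and the simulated data are assumptions for illustration only, not the analysis reported in the paper.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_channels = 400, 64
X = rng.normal(size=(n_trials, n_channels))        # hypothetical per-trial features (e.g., mean retention-period voltage per channel)
set_size = rng.choice([2, 4, 6], size=n_trials)    # labels: number of memoranda

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, set_size, cv=5)   # stratified 5-fold cross-validation by default
print(f"mean decoding accuracy: {scores.mean():.2f} (chance = {1/3:.2f})")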

Joshua J. Foster; William Thyer; Janna W. Wennberg; Edward Awh

Covert attention increases the gain of stimulus-evoked population codes Journal Article

In: Journal of Neuroscience, vol. 41, no. 8, pp. 1802–1815, 2021.

Abstract | Links | BibTeX

@article{Foster2021,
title = {Covert attention increases the gain of stimulus-evoked population codes},
author = {Joshua J. Foster and William Thyer and Janna W. Wennberg and Edward Awh},
doi = {10.1523/JNEUROSCI.2186-20.2020},
year = {2021},
date = {2021-01-01},
journal = {Journal of Neuroscience},
volume = {41},
number = {8},
pages = {1802--1815},
abstract = {Covert spatial attention has a variety of effects on the responses of individual neurons. However, relatively little is known about the net effect of these changes on sensory population codes, even though perception ultimately depends on population activity. Here, we measured the EEG in human observers (male and female), and isolated stimulus-evoked activity that was phase-locked to the onset of attended and ignored visual stimuli. Using an encoding model, we reconstructed spatially selective population tuning functions from the pattern of stimulus-evoked activity across the scalp. Our EEG-based approach allowed us to measure very early visually evoked responses occurring ~100 ms after stimulus onset. In Experiment 1, we found that covert attention increased the amplitude of spatially tuned population responses at this early stage of sensory processing. In Experiment 2, we parametrically varied stimulus contrast to test how this effect scaled with stimulus contrast. We found that the effect of attention on the amplitude of spatially tuned responses increased with stimulus contrast, and was well described by an increase in response gain (i.e., a multiplicative scaling of the population response). Together, our results show that attention increases the gain of spatial population codes during the first wave of visual processing.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
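
The inverted encoding model used in this line of work follows a two-step logic: estimate a weight matrix mapping hypothetical spatial channels to electrodes on training data, then invert those weights on held-out data to recover the channel (population tuning) responses. A minimal sketch of that logic is given below; the basis set, data shapes, and variable names are assumptions for illustration, not the authors' implementation.

import numpy as np

def make_basis(angles_deg, n_channels=8):
    """Unimodal spatial channel basis (a common IEM choice): cosine half-width tuning raised to a power."""
    angles = np.asarray(angles_deg, dtype=float)
    centers = np.arange(n_channels) * (360.0 / n_channels)
    d = np.deg2rad((angles[:, None] - centers[None, :] + 180.0) % 360.0 - 180.0)
    return np.cos(d / 2.0) ** 7                      # one row of predicted channel responses per trial

def iem_train_test(B_train, angles_train, B_test, n_channels=8):
    """B_*: (n_trials, n_electrodes) evoked activity. Returns reconstructed channel responses for B_test."""
    C_train = make_basis(angles_train, n_channels)                 # (n_trials, k)
    # Step 1: least-squares estimate of the channel-to-electrode weights
    W, *_ = np.linalg.lstsq(C_train, B_train, rcond=None)          # solves C_train @ W = B_train, W: (k, n_electrodes)
    W = W.T                                                        # -> (n_electrodes, k)
    # Step 2: invert the model on held-out data to recover channel responses
    C_test = np.linalg.lstsq(W, B_test.T, rcond=None)[0].T         # (n_test_trials, k)
    return C_test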

Wendel M. Friedl; Andreas Keil

Aversive conditioning of spatial position sharpens neural population-level tuning in visual cortex and selectively alters alpha-band activity Journal Article

In: Journal of Neuroscience, vol. 41, no. 26, pp. 5723–5733, 2021.

Abstract | Links | BibTeX

@article{Friedl2021,
title = {Aversive conditioning of spatial position sharpens neural population-level tuning in visual cortex and selectively alters alpha-band activity},
author = {Wendel M. Friedl and Andreas Keil},
doi = {10.1523/JNEUROSCI.2889-20.2021},
year = {2021},
date = {2021-01-01},
journal = {Journal of Neuroscience},
volume = {41},
number = {26},
pages = {5723--5733},
abstract = {Processing capabilities for many low-level visual features are experientially malleable, aiding sighted organisms in adapting to dynamic environments. Explicit instructions to attend a specific visual field location influence retinotopic visuocortical activity, amplifying responses to stimuli appearing at cued spatial positions. It remains undetermined both how such prioritization affects surrounding nonprioritized locations, and if a given retinotopic spatial position can attain enhanced cortical representation through experience rather than instruction. The current report examined visuocortical response changes as human observers (N = 51, 19 male) learned, through differential classical conditioning, to associate specific screen locations with aversive outcomes. Using dense-array EEG and pupillometry, we tested the preregistered hypotheses of either sharpening or generalization around an aversively associated location following a single conditioning session. Competing hypotheses tested whether mean response changes would take the form of a Gaussian (generalization) or difference-of-Gaussian (sharpening) distribution over spatial positions, peaking at the viewing location paired with a noxious noise. Occipital 15 Hz steady-state visual evoked potential responses were selectively heightened when viewing aversively paired locations and displayed a nonlinear, difference-of-Gaussian profile across neighboring locations, consistent with suppressive surround modulation of nonprioritized positions. Measures of alpha-band (8-12 Hz) activity were differentially altered in anterior versus posterior locations, while pupil diameter exhibited selectively heightened responses to noise-paired locations but did not evince differences across the nonpaired locations. These results indicate that visuocortical spatial representations are sharpened in response to location-specific aversive conditioning, while top-down influences indexed by alpha-power reduction exhibit posterior generalization and anterior sharpening.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
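
Steady-state visual evoked potential amplitude at the driving frequency (15 Hz here) is typically read out from the Fourier spectrum of the averaged occipital signal. The sketch below shows that generic computation on simulated data; the sampling rate, epoch length, and channel are assumptions, not the recording parameters of this study.

import numpy as np

def ssvep_amplitude(signal, sfreq, target_freq=15.0):
    """signal: 1-D averaged time series (e.g., an occipital channel); returns amplitude at target_freq."""
    n = signal.size
    window = np.hanning(n)
    spectrum = np.fft.rfft(signal * window)
    freqs = np.fft.rfftfreq(n, d=1.0 / sfreq)
    amp = 2.0 * np.abs(spectrum) / window.sum()      # single-sided amplitude spectrum, corrected for window gain
    return amp[np.argmin(np.abs(freqs - target_freq))]

# Hypothetical usage: 2 s at 500 Hz containing a 1-microvolt 15 Hz component plus noise
sfreq = 500.0
t = np.arange(0, 2.0, 1.0 / sfreq)
oz = 1e-6 * np.sin(2 * np.pi * 15.0 * t) + 1e-7 * np.random.default_rng(2).normal(size=t.size)
print(f"15 Hz amplitude ~ {ssvep_amplitude(oz, sfreq):.2e} V")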

R. Frömer; H. Lin; C. K. Dean Wolf; M. Inzlicht; A. Shenhav

Expectations of reward and efficacy guide cognitive control allocation Journal Article

In: Nature Communications, vol. 12, pp. 1030, 2021.

Abstract | Links | BibTeX

@article{Froemer2021,
title = {Expectations of reward and efficacy guide cognitive control allocation},
author = {R. Frömer and H. Lin and C. K. Dean Wolf and M. Inzlicht and A. Shenhav},
doi = {10.1038/s41467-021-21315-z},
year = {2021},
date = {2021-01-01},
journal = {Nature Communications},
volume = {12},
pages = {1030},
publisher = {Springer US},
abstract = {The amount of mental effort we invest in a task is influenced by the reward we can expect if we perform that task well. However, some of the rewards that have the greatest potential for driving these efforts are partly determined by factors beyond one's control. In such cases, effort has more limited efficacy for obtaining rewards. According to the Expected Value of Control theory, people integrate information about the expected reward and efficacy of task performance to determine the expected value of control, and then adjust their control allocation (i.e., mental effort) accordingly. Here we test this theory's key behavioral and neural predictions. We show that participants invest more cognitive control when this control is more rewarding and more efficacious, and that these incentive components separately modulate EEG signatures of incentive evaluation and proactive control allocation. Our findings support the prediction that people combine expectations of reward and efficacy to determine how much effort to invest.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
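
The core claim of Expected Value of Control theory summarized above, that allocated control should increase with both expected reward and efficacy, can be made concrete with a toy model in which expected payoff is reward times efficacy times performance, minus an effort cost. The functional forms below (a logistic performance curve and a quadratic cost) are assumptions chosen only to illustrate the qualitative prediction, not the model fitted in the paper.

import numpy as np

def evc(control, reward, efficacy, cost_weight=0.5):
    p_correct = 1.0 / (1.0 + np.exp(-control))       # performance improves with invested control
    return efficacy * reward * p_correct - cost_weight * control ** 2

control_levels = np.linspace(0, 3, 301)
for reward, efficacy in [(1, 1.0), (4, 1.0), (4, 0.25)]:
    best = control_levels[np.argmax(evc(control_levels, reward, efficacy))]
    print(f"reward={reward}, efficacy={efficacy}: allocate control ~ {best:.2f}")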

Jordan Garrett; Tom Bullock; Barry Giesbrecht

Tracking the contents of spatial working memory during an acute bout of aerobic exercise Journal Article

In: Journal of Cognitive Neuroscience, vol. 33, no. 7, pp. 1271–1286, 2021.

Abstract | Links | BibTeX

@article{Garrett2021,
title = {Tracking the contents of spatial working memory during an acute bout of aerobic exercise},
author = {Jordan Garrett and Tom Bullock and Barry Giesbrecht},
doi = {10.1162/jocn_a_01714},
year = {2021},
date = {2021-01-01},
journal = {Journal of Cognitive Neuroscience},
volume = {33},
number = {7},
pages = {1271--1286},
abstract = {Recent studies have reported enhanced visual responses during acute bouts of physical exercise, suggesting that sensory systems may become more sensitive during active exploration of the environment. This raises the possibility that exercise may also modulate brain activity associated with other cognitive functions, like visual working memory, that rely on patterns of activity that persist beyond the initial sensory evoked response. Here, we investigated whether the neural coding of an object location held in memory is modulated by an acute bout of aerobic exercise. Participants performed a spatial change detection task while seated on a stationary bike at rest and during low-intensity cycling (∼50 watts/50 RPM). Brain activity was measured with EEG. An inverted encoding modeling technique was employed to estimate location-selective channel response functions from topographical patterns of alpha-band (8–12 Hz) activity. There was strong evidence of robust spatially selective responses during stimulus presentation and retention periods both at rest and during exercise. During retention, the spatial selectivity of these responses decreased in the exercise condition relative to rest. A temporal generalization analysis indicated that models trained on one time period could be used to reconstruct the remembered locations at other time periods, however, generalization was degraded during exercise. Together, these results demonstrate that it is possible to reconstruct the contents of working memory at rest and during exercise, but that exercise can result in degraded responses, which contrasts with the enhancements observed in early sensory processing.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
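
The alpha-band features that topographic encoding models of this kind are trained on are, generically, band-limited power envelopes. The sketch below shows one standard way to obtain such envelopes (band-pass filter plus Hilbert transform); the filter settings, sampling rate, and data shapes are illustrative assumptions rather than the authors' preprocessing.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def alpha_power(data, sfreq, band=(8.0, 12.0), order=4):
    """data: (n_channels, n_times). Returns instantaneous alpha-band power, same shape."""
    b, a = butter(order, [band[0] / (sfreq / 2.0), band[1] / (sfreq / 2.0)], btype="band")
    filtered = filtfilt(b, a, data, axis=-1)         # zero-phase band-pass filter
    analytic = hilbert(filtered, axis=-1)            # analytic signal per channel
    return np.abs(analytic) ** 2                     # squared envelope = power

# Hypothetical usage on random data: 64 channels, 10 s sampled at 250 Hz
rng = np.random.default_rng(3)
eeg = rng.normal(size=(64, 250 * 10))
power = alpha_power(eeg, sfreq=250.0)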

