

EyeLink Non-Human Primate Publications

All EyeLink non-human primate research publications up to 2020 (with some early 2021 papers) are listed below by year. You can search the publications using keywords such as Temporal Cortex, Macaque, Antisaccade, etc. You can also search for individual author names. If we have missed any EyeLink non-human primate article, please email us!

All EyeLink non-human primate publications are also available for download / import into reference management software as a single BibTeX (.bib) file.
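
For readers who prefer to filter the file programmatically rather than through a reference manager, a minimal Python sketch along the following lines can count the entries and run a keyword search over the raw BibTeX text. The file name is an assumption (use whatever name you saved the download under), and the parsing relies only on each record starting on a new line with "@article{" or "@misc{", as in the entries shown below.

import re
from collections import Counter

# Assumption: the downloaded .bib file was saved locally under this name.
with open("eyelink_nhp_publications.bib", encoding="utf-8") as f:
    bib = f.read()

# Split the file into individual records at each line that begins a new
# "@article{...}" / "@misc{...}" entry.
entries = [chunk for chunk in re.split(r"\n(?=@\w+\{)", bib)
           if chunk.lstrip().startswith("@")]
print(len(entries), "entries")

# Case-insensitive keyword filter over the raw entry text (title, author,
# abstract, and keyword fields all count as matches).
keyword = "macaque"
hits = [entry for entry in entries if keyword.lower() in entry.lower()]
print(len(hits), "entries mention", repr(keyword))

# Tally the matching entries by publication year.
years = Counter(re.findall(r"year\s*=\s*\{(\d{4})\}", "\n".join(hits)))
print(sorted(years.items(), reverse=True))

The same splitting-and-filtering approach works for any field-level query; for heavier use, a dedicated BibTeX parser or a reference manager's built-in search is the more robust route.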

 

574 entries (page 1 of 6)

2021

Milena Raffi; Andrea Meoni; Alessandro Piras

Analysis of microsaccades during extended practice of a visual discrimination task in the macaque monkey Journal Article

Neuroscience Letters, 743, pp. 1–7, 2021.


@article{Raffi2021,
title = {Analysis of microsaccades during extended practice of a visual discrimination task in the macaque monkey},
author = {Milena Raffi and Andrea Meoni and Alessandro Piras},
doi = {10.1016/j.neulet.2020.135581},
year = {2021},
date = {2021-01-01},
journal = {Neuroscience Letters},
volume = {743},
pages = {1--7},
publisher = {Elsevier B.V.},
abstract = {The spatial location indicated by a visual cue can bias microsaccades directions towards or away from the cue. Aim of this work was to evaluate the microsaccades characteristics during the monkey's training, investigating the relationship between a shift of attention and practice. The monkey was trained to press a lever at a target onset, then an expanding optic flow stimulus appeared to the right of the target. After a variable time delay, a visual cue appeared within the optic flow stimulus and the monkey had to release the lever in a maximum reaction time (RT) of 700 ms. In the control task no visual cue appeared and the monkey had to attend a change in the target color. Data were recorded in 9 months. Results revealed that the RTs at the control task changed significantly across time. The microsaccades directions were significantly clustered toward the visual cue, suggesting that the animal developed an attentional bias toward the visual space where the cue appeared. The microsaccades amplitude differed significantly across time. The microsaccades peak velocity differed significantly both across time and within the time delays, indicating that the monkey made faster microsaccades when it expected the cue to appear. The microsaccades number was significantly higher in the control task with respect to discrimination. The lack of change in microsaccades rate, duration, number and direction across time indicates that the experience acquired during practicing the task did not influence microsaccades generation.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}


Amarender R Bogadhi; Leor N Katz; Anil Bollimunta; David A Leopold; Richard J Krauzlis

Midbrain activity supports high-level visual properties in primate temporal cortex Miscellaneous

2021.


@misc{Bogadhi2021,
title = {Midbrain activity supports high-level visual properties in primate temporal cortex},
author = {Amarender R Bogadhi and Leor N Katz and Anil Bollimunta and David A Leopold and Richard J Krauzlis},
doi = {10.1101/841155},
year = {2021},
date = {2021-01-01},
booktitle = {Neuron},
volume = {109},
pages = {1--10},
publisher = {Elsevier Inc.},
abstract = {The evolution of the primate brain is marked by a dramatic increase in the number of neocortical areas that process visual information 1. This cortical expansion supports two hallmarks of high-level primate vision - the ability to selectively attend to particular visual features 2 and the ability to recognize a seemingly limitless number of complex visual objects 3. Given their prominent roles in high-level vision for primates, it is commonly assumed that these cortical processes supersede the earlier versions of these functions accomplished by the evolutionarily older brain structures that lie beneath the cortex. Contrary to this view, here we show that the superior colliculus (SC), a midbrain structure conserved across all vertebrates 4, is necessary for the normal expression of attention-related modulation and object selectivity in a newly identified region of macaque temporal cortex. Using a combination of psychophysics, causal perturbations and fMRI, we identified a localized region in the temporal cortex that is functionally dependent on the SC. Targeted electrophysiological recordings in this cortical region revealed neurons with strong attention-related modulation that was markedly reduced during attention deficits caused by SC inactivation. Many of these neurons also exhibited selectivity for particular visual objects, and this selectivity was also reduced during SC inactivation. Thus, the SC exerts a causal influence on high-level visual processing in cortex at a surprisingly late stage where attention and object selectivity converge, perhaps determined by the elemental forms of perceptual processing the SC has supported since before there was a neocortex.},
keywords = {},
pubstate = {published},
tppubtype = {misc}
}


Francesco Fabbrini; Rufin Vogels

Within- and between-hemifields generalization of repetition suppression in inferior temporal cortex Journal Article

Journal of Neurophysiology, 125 (1), pp. 1–20, 2021.


@article{Fabbrini2021,
title = {Within- and between-hemifields generalization of repetition suppression in inferior temporal cortex},
author = {Francesco Fabbrini and Rufin Vogels},
doi = {10.1152/jn.00361.2020},
year = {2021},
date = {2021-01-01},
journal = {Journal of Neurophysiology},
volume = {125},
number = {1},
pages = {1--20},
abstract = {The decrease in response with stimulus repetition is a common property observed in many sensory brain areas. This repetition suppression (RS) is ubiquitous in neurons of macaque inferior temporal (IT) cortex, the end-stage of the ventral visual pathway. The neural mechanisms of RS in IT are still unclear, and one possibility is that it is inherited from areas upstream to IT that show also RS. Since neurons in IT have larger receptive fields compared to earlier visual areas, we examined the inheritance hypothesis by presenting adapter and test stimuli at widely different spatial locations along both vertical and horizontal meridians, and across hemifields. RS was present for distances between adapter and test stimuli up to 22°, and when the two stimuli were presented in different hemifields. Also, we examined the position tolerance of the stimulus selectivity of adaptation by comparing the responses to a test stimulus following the same (repetition trial) or a different adapter (alternation trial) at a different position than the test stimulus. Stimulus-selective adaptation was still present and consistently stronger in the later phase of the response for distances up to 18°. Finally, we observed stimulus-selective adaptation in repetition trials even without a measurable excitatory response to the adapter stimulus. To accommodate these and previous data, we propose that at least part of the stimulus-selective adaptation in IT is based on short-term plasticity mechanisms within IT and/or reflects top-down activity from areas downstream to IT.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}


2020

Jacob A Westerberg; Alexander Maier; Geoffrey F Woodman; Jeffrey D Schall

Performance monitoring during visual priming Journal Article

Journal of Cognitive Neuroscience, 32 (3), pp. 515–526, 2020.


@article{Westerberg2020,
title = {Performance monitoring during visual priming},
author = {Jacob A Westerberg and Alexander Maier and Geoffrey F Woodman and Jeffrey D Schall},
doi = {10.1162/jocn_a_01499},
year = {2020},
date = {2020-11-01},
journal = {Journal of Cognitive Neuroscience},
volume = {32},
number = {3},
pages = {515--526},
publisher = {MIT Press - Journals},
abstract = {Repetitive performance of single-feature (efficient or popout) visual search improves RTs and accuracy. This phenomenon, known as priming of pop-out, has been demonstrated in both humans and macaque monkeys. We investigated the relationship between performance monitoring and priming of pop-out. Neuronal activity in the supplementary eye field (SEF) contributes to performance monitoring and to the generation of performance monitoring signals in the EEG. To determine whether priming depends on performance monitoring, we investigated spiking activity in SEF as well as the concurrent EEG of two monkeys performing a priming of pop-out task. We found that SEF spiking did not modulate with priming. Surprisingly, concurrent EEG did covary with priming. Together, these results suggest that performance monitoring contributes to priming of pop-out. However, this performance monitoring seems not mediated by SEF. This dissociation suggests that EEG indices of performance monitoring arise from multiple, functionally distinct neural generators.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}


Guillaume Doucet; Roberto A Gulli; Benjamin W Corrigan; Lyndon R Duong; Julio C Martinez-Trujillo

Modulation of local field potentials and neuronal activity in primate hippocampus during saccades Journal Article

Hippocampus, 30 (3), pp. 192–209, 2020.


@article{Doucet2020,
title = {Modulation of local field potentials and neuronal activity in primate hippocampus during saccades},
author = {Guillaume Doucet and Roberto A Gulli and Benjamin W Corrigan and Lyndon R Duong and Julio C Martinez-Trujillo},
doi = {10.1002/hipo.23140},
year = {2020},
date = {2020-07-01},
journal = {Hippocampus},
volume = {30},
number = {3},
pages = {192--209},
publisher = {Wiley},
abstract = {Primates use saccades to gather information about objects and their relative spatial arrangement, a process essential for visual perception and memory. It has been proposed that signals linked to saccades reset the phase of local field potential (LFP) oscillations in the hippocampus, providing a temporal window for visual signals to activate neurons in this region and influence memory formation. We investigated this issue by measuring hippocampal LFPs and spikes in two macaques performing different tasks with unconstrained eye movements. We found that LFP phase clustering (PC) in the alpha/beta (8–16 Hz) frequencies followed foveation onsets, while PC in frequencies lower than 8 Hz followed spontaneous saccades, even on a homogeneous background. Saccades to a solid grey background were not followed by increases in local neuronal firing, whereas saccades toward appearing visual stimuli were. Finally, saccade parameters correlated with LFPs phase and amplitude: saccade direction correlated with delta (≤4 Hz) phase, and saccade amplitude with theta (4–8 Hz) power. Our results suggest that signals linked to saccades reach the hippocampus, producing synchronization of delta/theta LFPs without a general activation of local neurons. Moreover, some visual inputs co-occurring with saccades produce LFP synchronization in the alpha/beta bands and elevated neuronal firing. Our findings support the hypothesis that saccade-related signals enact sensory input-dependent plasticity and therefore memory formation in the primate hippocampus.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}


Ramina Adam; Kevin D Johnston; Ravi S Menon; Stefan Everling

Functional reorganization during the recovery of contralesional target selection deficits after prefrontal cortex lesions in macaque monkeys Journal Article

NeuroImage, 207, pp. 1–17, 2020.


@article{Adam2020,
title = {Functional reorganization during the recovery of contralesional target selection deficits after prefrontal cortex lesions in macaque monkeys},
author = {Ramina Adam and Kevin D Johnston and Ravi S Menon and Stefan Everling},
doi = {10.1016/j.neuroimage.2019.116339},
year = {2020},
date = {2020-01-01},
journal = {NeuroImage},
volume = {207},
pages = {1--17},
publisher = {Elsevier Ltd},
abstract = {Visual extinction has been characterized by the failure to respond to a visual stimulus in the contralesional hemifield when presented simultaneously with an ipsilesional stimulus (Corbetta and Shulman, 2011). Unilateral damage to the macaque frontoparietal cortex commonly leads to deficits in contralesional target selection that resemble visual extinction. Recently, we showed that macaque monkeys with unilateral lesions in the caudal prefrontal cortex (PFC) exhibited contralesional target selection deficits that recovered over 2–4 months (Adam et al., 2019). Here, we investigated the longitudinal changes in functional connectivity (FC) of the frontoparietal network after a small or large right caudal PFC lesion in four macaque monkeys. We collected ultra-high field resting-state fMRI at 7-T before the lesion and at weeks 1–16 post-lesion and compared the functional data with behavioural performance on a free-choice saccade task. We found that the pattern of frontoparietal network FC changes depended on lesion size, such that the recovery of contralesional extinction was associated with an initial increase in network FC that returned to baseline in the two small lesion monkeys, whereas FC continued to increase throughout recovery in the two monkeys with a larger lesion. We also found that the FC between contralesional dorsolateral PFC and ipsilesional parietal cortex correlated with behavioural recovery and that the contralesional dorsolateral PFC showed increasing degree centrality with the frontoparietal network. These findings suggest that both the contralesional and ipsilesional hemispheres play an important role in the recovery of function. Importantly, optimal compensation after large PFC lesions may require greater recruitment of distant and intact areas of the frontoparietal network, whereas recovery from smaller lesions was supported by a normalization of the functional network.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}


Habiba Azab; Benjamin Y Hayden

Partial integration of the components of value in anterior cingulate cortex Journal Article

Behavioral Neuroscience, 134 (4), pp. 296–308, 2020.


@article{Azab2020,
title = {Partial integration of the components of value in anterior cingulate cortex},
author = {Habiba Azab and Benjamin Y Hayden},
doi = {10.1037/bne0000382},
year = {2020},
date = {2020-01-01},
journal = {Behavioral Neuroscience},
volume = {134},
number = {4},
pages = {296--308},
abstract = {Evaluation often involves integrating multiple determinants of value, such as the different possible outcomes in risky choice. A brain region can be placed either before or after a presumed evaluation stage by measuring how responses of its neurons depend on multiple determinants of value. A brain region could also, in principle, show partial integration, which would indicate that it occupies a middle position between (preevaluative) nonintegration and (postevaluative) full integration. Existing mathematical techniques cannot distinguish full from partial integration and therefore risk misidentifying regional function. Here we use a new Bayesian regression-based approach to analyze responses of neurons in dorsal anterior cingulate cortex (dACC) to risky offers. We find that dACC neurons only partially integrate across outcome dimensions, indicating that dACC cannot be assigned to either a pre- or postevaluative position. Neurons in dACC also show putative signatures of value comparison, thereby demonstrating that comparison does not require complete evaluation before proceeding.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}


Marzyeh Azimi; Mariann Oemisch; Thilo Womelsdorf

Dissociation of nicotinic α7 and α4/β2 sub-receptor agonists for enhancing learning and attentional filtering in nonhuman primates Journal Article

Psychopharmacology, 237 (4), pp. 997–1010, 2020.


@article{Azimi2020,
title = {Dissociation of nicotinic $\alpha$7 and $\alpha$4/$\beta$2 sub-receptor agonists for enhancing learning and attentional filtering in nonhuman primates},
author = {Marzyeh Azimi and Mariann Oemisch and Thilo Womelsdorf},
doi = {10.1007/s00213-019-05430-w},
year = {2020},
date = {2020-01-01},
journal = {Psychopharmacology},
volume = {237},
number = {4},
pages = {997--1010},
publisher = {Psychopharmacology},
abstract = {Rationale: Nicotinic acetylcholine receptors (nAChRs) modulate attention, memory, and higher executive functioning, but it is unclear how nACh sub-receptors mediate different mechanisms supporting these functions. Objectives: We investigated whether selective agonists for the alpha-7 nAChR versus the alpha-4/beta-2 nAChR have unique functional contributions for value learning and attentional filtering of distractors in the nonhuman primate. Methods: Two adult rhesus macaque monkeys performed reversal learning following systemic administration of either the alpha-7 nAChR agonist PHA-543613 or the alpha-4/beta-2 nAChR agonist ABT-089 or a vehicle control. Behavioral analysis quantified performance accuracy, speed of processing, reversal learning speed, the control of distractor interference, perseveration tendencies, and motivation. Results: We found that the alpha-7 nAChR agonist PHA-543613 enhanced the learning speed of feature values but did not modulate how salient distracting information was filtered from ongoing choice processes. In contrast, the selective alpha-4/beta-2 nAChR agonist ABT-089 did not affect learning speed but reduced distractibility. This dissociation was dose-dependent and evident in the absence of systematic changes in overall performance, reward intake, motivation to perform the task, perseveration tendencies, or reaction times. Conclusions: These results suggest nicotinic sub-receptor specific mechanisms consistent with (1) alpha-4/beta-2 nAChR specific amplification of cholinergic transients in prefrontal cortex linked to enhanced cue detection in light of interferences, and (2) alpha-7 nAChR specific activation prolonging cholinergic transients, which could facilitate subjects to follow-through with newly established attentional strategies when outcome contingencies change. These insights will be critical for developing function-specific drugs alleviating attention and learning deficits in neuro-psychiatric diseases.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}


Pragathi Priyadharsini Balasubramani; Meghan C Pesce; Benjamin Y Hayden

Activity in orbitofrontal neuronal ensembles reflects inhibitory control Journal Article

European Journal of Neuroscience, 51 (10), pp. 2033–2051, 2020.


@article{Balasubramani2020,
title = {Activity in orbitofrontal neuronal ensembles reflects inhibitory control},
author = {Pragathi Priyadharsini Balasubramani and Meghan C Pesce and Benjamin Y Hayden},
doi = {10.1111/ejn.14638},
year = {2020},
date = {2020-01-01},
journal = {European Journal of Neuroscience},
volume = {51},
number = {10},
pages = {2033--2051},
abstract = {Stopping, or inhibition, is a form of self-control that is a core element of flexible and adaptive behavior. Its neural origins remain unclear. Some views hold that inhibition decisions reflect the aggregation of widespread and diverse pieces of information, including information arising in ostensible core reward regions (i.e., outside the canonical executive system). We recorded activity of single neurons in the orbitofrontal cortex (OFC) of macaques, a region associated with economic decisions, and whose role in inhibition is debated. Subjects performed a classic inhibition task known as the stop signal task. Ensemble decoding analyses reveal a clear firing rate pattern that distinguishes successful from failed inhibition and that begins after the stop signal and before the stop signal reaction time (SSRT). We also found a different and orthogonal ensemble pattern that distinguishes successful from failed stopping before the beginning of the trial. These signals were distinct from, and orthogonal to, value encoding, which was also observed in these neurons. The timing of the early and late signals was, respectively, consistent with the idea that neuronal activity in OFC encodes inhibition both proactively and reactively.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}


Kévin Blaize; Fabrice Arcizet; Marc Gesnik; Harry Ahnine; Ulisse Ferrari; Thomas Deffieux; Pierre Pouget; Frédéric Chavane; Mathias Fink; José Alain Sahel; Mickael Tanter; Serge Picaud

Functional ultrasound imaging of deep visual cortex in awake nonhuman primates Journal Article

Proceedings of the National Academy of Sciences, 117 (25), pp. 14453–14463, 2020.


@article{Blaize2020,
title = {Functional ultrasound imaging of deep visual cortex in awake nonhuman primates},
author = {Kévin Blaize and Fabrice Arcizet and Marc Gesnik and Harry Ahnine and Ulisse Ferrari and Thomas Deffieux and Pierre Pouget and Frédéric Chavane and Mathias Fink and José Alain Sahel and Mickael Tanter and Serge Picaud},
doi = {10.1073/pnas.1916787117},
year = {2020},
date = {2020-01-01},
journal = {Proceedings of the National Academy of Sciences},
volume = {117},
number = {25},
pages = {14453--14463},
abstract = {Deep regions of the brain are not easily accessible to investigation at the mesoscale level in awake animals or humans. We have recently developed a functional ultrasound (fUS) technique that enables imaging hemodynamic responses to visual tasks. Using fUS imaging on two awake nonhuman primates performing a passive fixation task, we constructed retinotopic maps at depth in the visual cortex (V1, V2, and V3) in the calcarine and lunate sulci. The maps could be acquired in a single-hour session with relatively few presentations of the stimuli. The spatial resolution of the technology is illustrated by mapping patterns similar to ocular dominance (OD) columns within superficial and deep layers of the primary visual cortex. These acquisitions using fUS suggested that OD selectivity is mostly present in layer IV but with extensions into layers II/III and V. This imaging technology provides a new mesoscale approach to the mapping of brain activity at high spatiotemporal resolution in awake subjects within the whole depth of the cortex.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}


Amarender R Bogadhi; Antimo Buonocore; Ziad M Hafed

Task-irrelevant visual forms facilitate covert and overt spatial selection Journal Article

Journal of Neuroscience, 40 (49), pp. 9496–9506, 2020.


@article{Bogadhi2020,
title = {Task-irrelevant visual forms facilitate covert and overt spatial selection},
author = {Amarender R Bogadhi and Antimo Buonocore and Ziad M Hafed},
doi = {10.1523/jneurosci.1593-20.2020},
year = {2020},
date = {2020-01-01},
journal = {Journal of Neuroscience},
volume = {40},
number = {49},
pages = {9496--9506},
abstract = {Covert and overt spatial selection behaviors are guided by both visual saliency maps derived from early visual features as well as priority maps reflecting high-level cognitive factors. However, whether mid-level perceptual processes associated with visual form recognition contribute to covert and overt spatial selection behaviors remains unclear. We hypothesized that if peripheral visual forms contribute to spatial selection behaviors, then they should do so even when the visual forms are task-irrelevant. We tested this hypothesis in male and female human subjects as well as in male macaque monkeys performing a visual detection task. In this task, subjects reported the detection of a suprathreshold target spot presented on top of one of two peripheral images, and they did so with either a speeded manual button press (humans) or a speeded saccadic eye movement response (humans and monkeys). Crucially, the two images, one with a visual form and the other with a partially phase-scrambled visual form, were completely irrelevant to the task. In both manual (covert) and oculomotor (overt) response modalities, and in both humans and monkeys, response times were faster when the target was congruent with a visual form than when it was incongruent. Importantly, incongruent targets were associated with almost all errors, suggesting that forms automatically captured selection behaviors. These findings demonstrate that mid-level perceptual processes associated with visual form recognition contribute to covert and overt spatial selection. This indicates that neural circuits associated with target selection, such as the superior colliculus, may have privileged access to visual form information. SIGNIFICANCE STATEMENT Spatial selection of visual information either with (overt) or without (covert) foveating eye movements is critical to primate behavior. However, it is still not clear whether spatial maps in sensorimotor regions known to guide overt and covert spatial selection are influenced by peripheral visual forms. We probed the ability of humans and monkeys to perform overt and covert target selection in the presence of spatially congruent or incongruent visual forms. Even when completely task-irrelevant, images of visual objects had a dramatic effect on target selection, acting much like spatial cues used in spatial attention tasks. Our results demonstrate that traditional brain circuits for orienting behaviors, such as the superior colliculus, likely have privileged access to visual object representations.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}


Sophie Brulé; Bastien Herlin; Pierre Pouget; Marcus Missal

Ketamine reduces temporal expectation in the rhesus monkey Journal Article

Psychopharmacology, pp. 1–9, 2020.


@article{Brule2020,
title = {Ketamine reduces temporal expectation in the rhesus monkey},
author = {Sophie Brulé and Bastien Herlin and Pierre Pouget and Marcus Missal},
doi = {10.1007/s00213-020-05706-6},
year = {2020},
date = {2020-01-01},
journal = {Psychopharmacology},
pages = {1--9},
publisher = {Psychopharmacology},
abstract = {Rationale: Ketamine, a well-known general dissociative anesthetic agent that is a non-competitive antagonist of the N-methyl-D-aspartate receptor, perturbs the perception of elapsed time and the expectation of upcoming events. Objective: The objective of this study was to determine the influence of ketamine on temporal expectation in the rhesus monkey. Methods: Two rhesus monkeys were trained to make a saccade between a central warning stimulus and an eccentric visual target that served as imperative stimulus. The delay between the warning and the imperative stimulus could take one of four different values randomly with the same probability (variable foreperiod paradigm). During experimental sessions, a subanesthetic low dose of ketamine (0.25–0.35 mg/kg) was injected i.m. and the influence of the drug on movement latency was measured. Results: We found that in the control conditions, saccadic latencies strongly decreased with elapsed time before the appearance of the visual target showing that temporal expectation built up during the delay period between the warning and the imperative stimulus. However, after ketamine injection, temporal expectation was significantly reduced in both subjects. In addition, ketamine also increased average movement latency but this effect could be dissociated from the reduction of temporal expectation. Conclusion: In conclusion, a subanesthetic dose of ketamine could have two independent effects: increasing reaction time and decreasing temporal expectation. This alteration of temporal expectation could explain cognitive deficits observed during ketamine use.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}


Ting Yu Chang; Raymond Doudlah; Byounghoon Kim; Adhira Sunkara; Lowell W Thompson; Meghan E Lowe; Ari Rosenberg

Functional links between sensory representations, choice activity, and sensorimotor associations in parietal cortex Journal Article

eLife, 9, pp. 1–27, 2020.


@article{Chang2020b,
title = {Functional links between sensory representations, choice activity, and sensorimotor associations in parietal cortex},
author = {Ting Yu Chang and Raymond Doudlah and Byounghoon Kim and Adhira Sunkara and Lowell W Thompson and Meghan E Lowe and Ari Rosenberg},
doi = {10.7554/eLife.57968},
year = {2020},
date = {2020-01-01},
journal = {eLife},
volume = {9},
pages = {1--27},
abstract = {Three-dimensional (3D) representations of the environment are often critical for selecting actions that achieve desired goals. The success of these goal-directed actions relies on 3D sensorimotor transformations that are experience-dependent. Here we investigated the relationships between the robustness of 3D visual representations, choice-related activity, and motor-related activity in parietal cortex. Macaque monkeys performed an eight-alternative 3D orientation discrimination task and a visually guided saccade task while we recorded from the caudal intraparietal area using laminar probes. We found that neurons with more robust 3D visual representations preferentially carried choice-related activity. Following the onset of choice-related activity, the robustness of the 3D representations further increased for those neurons. We additionally found that 3D orientation and saccade direction preferences aligned, particularly for neurons with choice-related activity, reflecting an experience-dependent sensorimotor association. These findings reveal previously unrecognized links between the fidelity of ecologically relevant object representations, choice-related activity, and motor-related activity.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}


Ting Yu Chang; Lowell Thompson; Raymond Doudlah; Byounghoon Kim; Adhira Sunkara; Ari Rosenberg

Optimized but not maximized cue integration for 3D visual perception Journal Article

eNeuro, 7 (1), pp. 1–18, 2020.


@article{Chang2020c,
title = {Optimized but not maximized cue integration for 3D visual perception},
author = {Ting Yu Chang and Lowell Thompson and Raymond Doudlah and Byounghoon Kim and Adhira Sunkara and Ari Rosenberg},
doi = {10.1523/ENEURO.0411-19.2019},
year = {2020},
date = {2020-01-01},
journal = {eNeuro},
volume = {7},
number = {1},
pages = {1--18},
abstract = {Reconstructing three-dimensional (3D) scenes from two-dimensional (2D) retinal images is an ill-posed problem. Despite this, 3D perception of the world based on 2D retinal images is seemingly accurate and precise. The integration of distinct visual cues is essential for robust 3D perception in humans, but it is unclear whether this is true for non-human primates (NHPs). Here, we assessed 3D perception in macaque monkeys using a planar surface orientation discrimination task. Perception was accurate across a wide range of spatial poses (orientations and distances), but precision was highly dependent on the plane's pose. The monkeys achieved robust 3D perception by dynamically reweighting the integration of stereoscopic and perspective cues according to their pose-dependent reliabilities. Errors in performance could be explained by a prior resembling the 3D orientation statistics of natural scenes. We used neural network simulations based on 3D orientation-selective neurons recorded from the same monkeys to assess how neural computation might constrain perception. The perceptual data were consistent with a model in which the responses of two independent neuronal populations representing stereoscopic cues and perspective cues (with perspective signals from the two eyes combined using nonlinear canonical computations) were optimally integrated through linear summation. Perception of combined-cue stimuli was optimal given this architecture. However, an alternative architecture in which stereoscopic cues, left eye perspective cues, and right eye perspective cues were represented by three independent populations yielded two times greater precision than the monkeys. This result suggests that, due to canonical computations, cue integration for 3D perception is optimized but not maximized.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}


Chih-Yang Chen; Denis Matrov; Richard Veale; Hirotaka Onoe; Masatoshi Yoshida; Kenichiro Miura; Tadashi Isa

Properties of visually-guided saccadic behavior and bottom-up attention in marmoset, macaque, and human Journal Article

Journal of Neurophysiology, 2020.


@article{Chen2020a,
title = {Properties of visually-guided saccadic behavior and bottom-up attention in marmoset, macaque, and human},
author = {Chih-Yang Chen and Denis Matrov and Richard Veale and Hirotaka Onoe and Masatoshi Yoshida and Kenichiro Miura and Tadashi Isa},
doi = {10.1152/jn.00312.2020},
year = {2020},
date = {2020-01-01},
journal = {Journal of Neurophysiology},
abstract = {The saccade is a stereotypic behavior whose investigation improves our understanding of how primate brains implement precise motor control. Furthermore, saccades offer an important window into the cognitive and attentional state of the brain. Historically, saccade studies have largely relied on macaque. However, the cortical network giving rise to the saccadic command is difficult to study in macaque because relevant cortical areas lie in sulci and are difficult to access. Recently, a New World monkey – the marmoset – has garnered attention as an attractive alternative to macaque because of its smooth cortical surface, its smaller body, and its amenability to transgenic technology. However, adoption of marmoset for oculomotor research has been limited due to a lack of in-depth descriptions of marmoset saccade kinematics and their ability to perform psychophysical and cognitive tasks. Here, we directly compare free-viewing and visually-guided behavior of marmoset, macaque, and human engaged in identical tasks under similar conditions. In video free-viewing task, all species exhibited qualitatively similar saccade kinematics including saccade main sequence up to 25° in amplitude. Furthermore, the conventional bottom-up saliency model predicted gaze targets at similar rates for all species. We further verified their visually-guided behavior by training them with step and gap saccade tasks. All species showed similar gap effect and express saccades in the gap paradigm. Our results suggest that the three species have similar natural and task-guided visuomotor behavior. The marmoset can be trained on saccadic tasks and thus can serve as a model for oculomotor, attention, and cognitive research.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1152/jn.00312.2020
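
For readers who want to summarize saccade kinematics in the way this abstract references, the "main sequence" is commonly quantified by fitting peak velocity against amplitude with a saturating exponential. The snippet below is a purely illustrative sketch with synthetic data; the model form, parameter names, and values are assumptions, not the authors' analysis code.

import numpy as np
from scipy.optimize import curve_fit

def main_sequence(amplitude_deg, v_max, c):
    # Saturating-exponential main-sequence model: peak velocity grows with
    # amplitude and asymptotes at v_max (deg/s); c (deg) sets the rise rate.
    return v_max * (1.0 - np.exp(-amplitude_deg / c))

# Placeholder amplitudes (deg) and peak velocities (deg/s) of detected saccades.
amp = np.array([1.0, 2.0, 5.0, 10.0, 15.0, 20.0, 25.0])
vpk = np.array([80.0, 150.0, 300.0, 450.0, 520.0, 560.0, 580.0])

params, _ = curve_fit(main_sequence, amp, vpk, p0=(600.0, 8.0))
print("fitted v_max = %.1f deg/s, c = %.1f deg" % tuple(params))

Fitting the same model separately per species would give a compact way to compare main sequences across marmoset, macaque, and human.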

Xiaomo Chen; Marc Zirnsak; Gabriel M Vega; Eshan Govil; Stephen G Lomber; Tirin Moore

Parietal cortex regulates visual salience and salience-driven behavior Journal Article

Neuron, 106 (1), pp. 177–187, 2020.

@article{Chen2020g,
title = {Parietal cortex regulates visual salience and salience-driven behavior},
author = {Xiaomo Chen and Marc Zirnsak and Gabriel M Vega and Eshan Govil and Stephen G Lomber and Tirin Moore},
doi = {10.1016/j.neuron.2020.01.016},
year = {2020},
date = {2020-01-01},
journal = {Neuron},
volume = {106},
number = {1},
pages = {177--187},
publisher = {Elsevier Inc.},
abstract = {Unique stimuli stand out. Despite an abundance of competing sensory stimuli, the detection of the most salient ones occurs without effort, and that detection contributes to the guidance of adaptive behavior. Neurons sensitive to the salience of visual stimuli are widespread throughout the primate visual system and are thought to shape the selection of visual targets. However, a neural source of salience remains elusive. In an attempt to identify a source of visual salience, we reversibly inactivated parietal cortex and simultaneously recorded salience signals in prefrontal cortex. Inactivation of parietal cortex not only caused pronounced and selective reductions of salience signals in prefrontal cortex but also diminished the influence of salience on visually guided behavior. These observations demonstrate a causal role of parietal cortex in regulating salience signals within the brain and in controlling salience-driven behavior.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.neuron.2020.01.016

Xiaomo Chen; Marc Zirnsak; Gabriel M Vega; Tirin Moore

Frontal eye field neurons selectively signal the reward value of prior actions Journal Article

Progress in Neurobiology, 195 , pp. 1–10, 2020.

@article{Chen2020i,
title = {Frontal eye field neurons selectively signal the reward value of prior actions},
author = {Xiaomo Chen and Marc Zirnsak and Gabriel M Vega and Tirin Moore},
doi = {10.1016/j.pneurobio.2020.101881},
year = {2020},
date = {2020-01-01},
journal = {Progress in Neurobiology},
volume = {195},
pages = {1--10},
publisher = {Elsevier},
abstract = {The consequences of individual actions are typically unknown until well after they are executed. This fact necessitates a mechanism that bridges delays between specific actions and reward outcomes. We looked for the presence of such a mechanism in the post-movement activity of neurons in the frontal eye field (FEF), a visuomotor area in prefrontal cortex. Monkeys performed an oculomotor gamble task in which they made eye movements to different locations associated with dynamically varying reward outcomes. Behavioral data showed that monkeys tracked reward history and made choices according to their own risk preferences. Consistent with previous studies, we observed that the activity of FEF neurons is correlated with the expected reward value of different eye movements before a target appears. Moreover, we observed that the activity of FEF neurons continued to signal the direction of eye movements, the expected reward value, and their interaction well after the movements were completed and when targets were no longer within the neuronal response field. In addition, this post-movement information was also observed in local field potentials, particularly in low-frequency bands. These results show that neural signals of prior actions and expected reward value persist across delays between those actions and their experienced outcomes. These memory traces may serve a role in reward-based learning in which subjects need to learn actions predicting delayed reward.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.pneurobio.2020.101881

E Cleeren; I D Popivanov; W Van Paesschen; Peter Janssen

Fast responses to images of animate and inanimate objects in the nonhuman primate amygdala Journal Article

Scientific Reports, 10 , pp. 1–11, 2020.

@article{Cleeren2020,
title = {Fast responses to images of animate and inanimate objects in the nonhuman primate amygdala},
author = {E Cleeren and I D Popivanov and W {Van Paesschen} and Peter Janssen},
doi = {10.1038/s41598-020-71885-z},
year = {2020},
date = {2020-01-01},
journal = {Scientific Reports},
volume = {10},
pages = {1--11},
publisher = {Nature Publishing Group UK},
abstract = {Visual information reaches the amygdala through the various stages of the ventral visual stream. There is, however, evidence that a fast subcortical pathway for the processing of emotional visual input exists. To explore the presence of this pathway in primates, we recorded local field potentials in the amygdala of four rhesus monkeys during a passive fixation task showing images of ten object categories. Additionally, in one of the monkeys we also obtained multi-unit spiking activity during the same task. We observed remarkably fast medium and high gamma responses in the amygdala of the four monkeys. These responses were selective for the different stimulus categories, showed within-category selectivity, and peaked as early as 60 ms after stimulus onset. Multi-unit responses in the amygdala were lagging the gamma responses by about 40 ms. Thus, these observations add further evidence that selective visual information reaches the amygdala of nonhuman primates through a very fast route.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1038/s41598-020-71885-z

Benjamin R Cowley; Adam C Snyder; Katerina Acar; Ryan C Williamson; Byron M Yu; Matthew A Smith

Slow drift of neural activity as a signature of impulsivity in macaque visual and prefrontal cortex Journal Article

Neuron, 108 (3), pp. 551–567, 2020.

@article{Cowley2020,
title = {Slow drift of neural activity as a signature of impulsivity in macaque visual and prefrontal cortex},
author = {Benjamin R Cowley and Adam C Snyder and Katerina Acar and Ryan C Williamson and Byron M Yu and Matthew A Smith},
doi = {10.1016/j.neuron.2020.07.021},
year = {2020},
date = {2020-01-01},
journal = {Neuron},
volume = {108},
number = {3},
pages = {551--567},
publisher = {Elsevier Inc.},
abstract = {An animal's decision depends not only on incoming sensory evidence but also on its fluctuating internal state. This state embodies multiple cognitive factors, such as arousal and fatigue, but it is unclear how these factors influence the neural processes that encode sensory stimuli and form a decision. We discovered that, unprompted by task conditions, animals slowly shifted their likelihood of detecting stimulus changes over the timescale of tens of minutes. Neural population activity from visual area V4, as well as from prefrontal cortex, slowly drifted together with these behavioral fluctuations. We found that this slow drift, rather than altering the encoding of the sensory stimulus, acted as an impulsivity signal, overriding sensory evidence to dictate the final decision. Overall, this work uncovers an internal state embedded in population activity across multiple brain areas and sheds further light on how internal states contribute to the decision-making process.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.neuron.2020.07.021

Olga Dal Monte; Cheng C J Chu; Nicholas A Fagan; Steve W C Chang

Specialized medial prefrontal–amygdala coordination in other-regarding decision preference Journal Article

Nature Neuroscience, 23 (4), pp. 565–574, 2020.

@article{DalMonte2020,
title = {Specialized medial prefrontal–amygdala coordination in other-regarding decision preference},
author = {Olga {Dal Monte} and Cheng C J Chu and Nicholas A Fagan and Steve W C Chang},
doi = {10.1038/s41593-020-0593-y},
year = {2020},
date = {2020-01-01},
journal = {Nature Neuroscience},
volume = {23},
number = {4},
pages = {565--574},
publisher = {Springer US},
abstract = {Social behaviors recruit multiple cognitive operations that require interactions between cortical and subcortical brain regions. Interareal synchrony may facilitate such interactions between cortical and subcortical neural populations. However, it remains unknown how neurons from different nodes in the social brain network interact during social decision-making. Here we investigated oscillatory neuronal interactions between the basolateral amygdala and the rostral anterior cingulate gyrus of the medial prefrontal cortex while monkeys expressed context-dependent positive or negative other-regarding preference (ORP), whereby decisions affected the reward received by another monkey. Synchronization between the two nodes was enhanced for a positive ORP but suppressed for a negative ORP. These interactions occurred in beta and gamma frequency bands depending on the area contributing the spikes, exhibited a specific directionality of information flow associated with a positive ORP and could be used to decode social decisions. These findings suggest that specialized coordination in the medial prefrontal–amygdala network underlies social-decision preferences.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1038/s41593-020-0593-y

Becket R Ebitz; Jiaxin Cindy Tu; Benjamin Y Hayden

Rules warp feature encoding in decision-making circuits Journal Article

PLOS Biology, 18 (11), pp. 1–38, 2020.

@article{Ebitz2020,
title = {Rules warp feature encoding in decision-making circuits},
author = {Becket R Ebitz and Jiaxin Cindy Tu and Benjamin Y Hayden},
doi = {10.1371/journal.pbio.3000951},
year = {2020},
date = {2020-01-01},
journal = {PLOS Biology},
volume = {18},
number = {11},
pages = {1--38},
abstract = {We have the capacity to follow arbitrary stimulus–response rules, meaning simple policies that guide our behavior. Rule identity is broadly encoded across decision-making circuits, but there are less data on how rules shape the computations that lead to choices. One idea is that rules could simplify these computations. When we follow a rule, there is no need to encode or compute information that is irrelevant to the current rule, which could reduce the metabolic or energetic demands of decision-making. However, it is not clear if the brain can actually take advantage of this computational simplicity. To test this idea, we recorded from neurons in 3 regions linked to decision-making, the orbitofrontal cortex (OFC), ventral striatum (VS), and dorsal striatum (DS), while macaques performed a rule-based decision-making task. Rule-based decisions were identified via modeling rules as the latent causes of decisions. This left us with a set of physically identical choices that maximized reward and information, but could not be explained by simple stimulus–response rules. Contrasting rule-based choices with these residual choices revealed that following rules (1) decreased the energetic cost of decision-making; and (2) expanded rule-relevant coding dimensions and compressed rule-irrelevant ones. Together, these results suggest that we use rules, in part, because they reduce the costs of decision-making through a distributed representational warping in decision-making circuits.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1371/journal.pbio.3000951

Steven P Errington; Geoffrey F Woodman; Jeffrey D Schall

Dissociation of medial frontal β-bursts and executive control Journal Article

Journal of Neuroscience, 40 (48), pp. 9272–9282, 2020.

@article{Errington2020,
title = {Dissociation of medial frontal $\beta$-bursts and executive control},
author = {Steven P Errington and Geoffrey F Woodman and Jeffrey D Schall},
doi = {10.1523/JNEUROSCI.2072-20.2020},
year = {2020},
date = {2020-01-01},
journal = {Journal of Neuroscience},
volume = {40},
number = {48},
pages = {9272--9282},
abstract = {The neural mechanisms of executive and motor control concern both basic researchers and clinicians. In human studies, preparation and cancellation of movements are accompanied by changes in the $\beta$-frequency band (15-29 Hz) of electroencephalogram (EEG). Previous studies with human participants performing stop signal (countermanding) tasks have described reduced frequency of transient $\beta$-bursts over sensorimotor cortical areas before movement initiation and increased $\beta$-bursting over medial frontal areas with movement cancellation. This modulation has been interpreted as contributing to the trial-by-trial control of behavior. We performed identical analyses of EEG recorded over the frontal lobe of macaque monkeys (one male, one female) performing a saccade countermanding task. While we replicate the occurrence and modulation of $\beta$-bursts associated with initiation and cancellation of saccades, we found that $\beta$-bursts occur too infrequently to account for the observed stopping behavior. We also found $\beta$-bursts were more common after errors, but their incidence was unrelated to response time (RT) adaptation. These results demonstrate the homology of this EEG signature between humans and macaques but raise questions about the current interpretation of $\beta$ band functional significance.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1523/JNEUROSCI.2072-20.2020
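
β-bursts of the kind analyzed in this entry are typically defined as transient excursions of the band-limited power envelope above a threshold. The following is a minimal, hypothetical sketch of such a detector; the band edges, threshold rule, and names are assumptions, not the authors' pipeline.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def detect_beta_bursts(eeg, fs, band=(15.0, 29.0), thresh_sd=2.0):
    # Band-pass the signal in the beta range, take the analytic amplitude,
    # and flag samples whose envelope exceeds mean + thresh_sd * SD.
    b, a = butter(4, [band[0] / (fs / 2.0), band[1] / (fs / 2.0)], btype="band")
    envelope = np.abs(hilbert(filtfilt(b, a, eeg)))
    threshold = envelope.mean() + thresh_sd * envelope.std()
    return envelope > threshold  # boolean mask of putative burst samples

fs = 1000.0
eeg = np.random.randn(int(10 * fs))  # placeholder EEG trace (10 s)
burst_mask = detect_beta_bursts(eeg, fs)
print("fraction of samples inside bursts: %.3f" % burst_mask.mean())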

Katharine A Shapcott; Joscha T Schmiedt; Kleopatra Kouroupaki; Ricardo Kienitz; Andreea Lazar; Wolf Singer; Michael C Schmid

Reward-related suppression of neural activity in macaque visual area V4 Journal Article

Cerebral Cortex, 30 (9), pp. 4871–4881, 2020.

@article{Shapcott2020,
title = {Reward-related suppression of neural activity in macaque visual area V4},
author = {Katharine A Shapcott and Joscha T Schmiedt and Kleopatra Kouroupaki and Ricardo Kienitz and Andreea Lazar and Wolf Singer and Michael C Schmid},
doi = {10.1093/cercor/bhaa079},
year = {2020},
date = {2020-01-01},
journal = {Cerebral Cortex},
volume = {30},
number = {9},
pages = {4871--4881},
abstract = {In order for organisms to survive, they need to detect rewarding stimuli, for example, food or a mate, in a complex environment with many competing stimuli. These rewarding stimuli should be detected even if they are nonsalient or irrelevant to the current goal. The value-driven theory of attentional selection proposes that this detection takes place through reward-associated stimuli automatically engaging attentional mechanisms. But how this is achieved in the brain is not very well understood. Here, we investigate the effect of differential reward on the multiunit activity in visual area V4 of monkeys performing a perceptual judgment task. Surprisingly, instead of finding reward-related increases in neural responses to the perceptual target, we observed a large suppression at the onset of the reward indicating cues. Therefore, while previous research showed that reward increases neural activity, here we report a decrease. More suppression was caused by cues associated with higher reward than with lower reward, although neither cue was informative about the perceptually correct choice. This finding of reward-associated neural suppression further highlights normalization as a general cortical mechanism and is consistent with predictions of the value-driven attention theory.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1093/cercor/bhaa079

Zhenhua Shi; Xiaomo Chen; Changming Zhao; He He; Veit Stuphorn; Dongrui Wu

Multi-view broad learning system for primate oculomotor decision decoding Journal Article

IEEE Transactions on Neural Systems and Rehabilitation Engineering, 28 (9), pp. 1908–1920, 2020.

@article{Shi2020a,
title = {Multi-view broad learning system for primate oculomotor decision decoding},
author = {Zhenhua Shi and Xiaomo Chen and Changming Zhao and He He and Veit Stuphorn and Dongrui Wu},
doi = {10.1109/TNSRE.2020.3003342},
year = {2020},
date = {2020-01-01},
journal = {IEEE Transactions on Neural Systems and Rehabilitation Engineering},
volume = {28},
number = {9},
pages = {1908--1920},
abstract = {Multi-view learning improves the learning performance by utilizing multi-view data: data collected from multiple sources, or feature sets extracted from the same data source. This approach is suitable for primate brain state decoding using cortical neural signals. This is because the complementary components of simultaneously recorded neural signals, local field potentials (LFPs) and action potentials (spikes), can be treated as two views. In this paper, we extended broad learning system (BLS), a recently proposed wide neural network architecture, from single-view learning to multi-view learning, and validated its performance in decoding monkeys' oculomotor decision from medial frontal LFPs and spikes. We demonstrated that medial frontal LFPs and spikes in non-human primate do contain complementary information about the oculomotor decision, and that the proposed multi-view BLS is a more effective approach for decoding the oculomotor decision than several classical and state-of-the-art single-view and multi-view learning approaches.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1109/TNSRE.2020.3003342

Ramona Siebert; Nick Taubert; Silvia Spadacenta; Peter W Dicke; Martin A Giese; Peter Thier

A naturalistic dynamic monkey head avatar elicits species-typical reactions and overcomes the uncanny valley Journal Article

eNeuro, 7 (4), pp. 1–17, 2020.

@article{Siebert2020,
title = {A naturalistic dynamic monkey head avatar elicits species-typical reactions and overcomes the uncanny valley},
author = {Ramona Siebert and Nick Taubert and Silvia Spadacenta and Peter W Dicke and Martin A Giese and Peter Thier},
doi = {10.1523/ENEURO.0524-19.2020},
year = {2020},
date = {2020-01-01},
journal = {eNeuro},
volume = {7},
number = {4},
pages = {1--17},
abstract = {Research on social perception in monkeys may benefit from standardized, controllable, and ethologically valid renditions of conspecifics offered by monkey avatars. However, previous work has cautioned that monkeys, like humans, show an adverse reaction toward realistic synthetic stimuli, known as the “uncanny valley” effect. We developed an improved naturalistic rhesus monkey face avatar capable of producing facial expressions (fear grin, lip smack and threat), animated by motion capture data of real monkeys. For validation, we additionally created decreasingly naturalistic avatar variants. Eight rhesus macaques were tested on the various videos and avoided looking at less naturalistic avatar variants, but not at the most naturalistic or the most unnaturalistic avatar, indicating an uncanny valley effect for the less naturalistic avatar versions. The avoidance was deepened by motion and accompanied by physiological arousal. Only the most naturalistic avatar evoked facial expressions comparable to those toward the real monkey videos. Hence, our findings demonstrate that the uncanny valley reaction in monkeys can be overcome by a highly naturalistic avatar.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1523/ENEURO.0524-19.2020

Cheng Tang; Roger Herikstad; Aishwarya Parthasarathy; Camilo Libedinsky; Shih Cheng Yen

Minimally dependent activity subspaces for working memory and motor preparation in the lateral prefrontal cortex Journal Article

eLife, 9 , pp. 1–23, 2020.

@article{Tang2020,
title = {Minimally dependent activity subspaces for working memory and motor preparation in the lateral prefrontal cortex},
author = {Cheng Tang and Roger Herikstad and Aishwarya Parthasarathy and Camilo Libedinsky and Shih Cheng Yen},
doi = {10.7554/ELIFE.58154},
year = {2020},
date = {2020-01-01},
journal = {eLife},
volume = {9},
pages = {1--23},
abstract = {The lateral prefrontal cortex is involved in the integration of multiple types of information, including working memory and motor preparation. However, it is not known how downstream regions can extract one type of information without interference from the others present in the network. Here, we show that the lateral prefrontal cortex of non-human primates contains two minimally dependent low-dimensional subspaces: one that encodes working memory information, and another that encodes motor preparation information. These subspaces capture all the information about the target in the delay periods, and the information in both subspaces is reduced in error trials. A single population of neurons with mixed selectivity forms both subspaces, but the information is kept largely independent from each other. A bump attractor model with divisive normalization replicates the properties of the neural data. These results provide new insights into neural processing in prefrontal regions.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.7554/ELIFE.58154

David A Tovar; Jacob A Westerberg; Michele A Cox; Kacie Dougherty; Thomas A Carlson; Mark T Wallace; Alexander Maier

Stimulus feature-specific information flow along the columnar cortical microcircuit revealed by multivariate laminar spiking analysis Journal Article

Frontiers in Systems Neuroscience, 14 , pp. 1–14, 2020.

@article{Tovar2020,
title = {Stimulus feature-specific information flow along the columnar cortical microcircuit revealed by multivariate laminar spiking analysis},
author = {David A Tovar and Jacob A Westerberg and Michele A Cox and Kacie Dougherty and Thomas A Carlson and Mark T Wallace and Alexander Maier},
doi = {10.3389/fnsys.2020.600601},
year = {2020},
date = {2020-01-01},
journal = {Frontiers in Systems Neuroscience},
volume = {14},
pages = {1--14},
abstract = {Most of the mammalian neocortex is comprised of a highly similar anatomical structure, consisting of a granular cell layer between superficial and deep layers. Even so, different cortical areas process different information. Taken together, this suggests that cortex features a canonical functional microcircuit that supports region-specific information processing. For example, the primate primary visual cortex (V1) combines the two eyes' signals, extracts stimulus orientation, and integrates contextual information such as visual stimulation history. These processes co-occur during the same laminar stimulation sequence that is triggered by the onset of visual stimuli. Yet, we still know little regarding the laminar processing differences that are specific to each of these types of stimulus information. Univariate analysis techniques have provided great insight by examining one electrode at a time or by studying average responses across multiple electrodes. Here we focus on multivariate statistics to examine response patterns across electrodes instead. Specifically, we applied multivariate pattern analysis (MVPA) to linear multielectrode array recordings of laminar spiking responses to decode information regarding the eye-of-origin, stimulus orientation, and stimulus repetition. MVPA differs from conventional univariate approaches in that it examines patterns of neural activity across simultaneously recorded electrode sites. We were curious whether this added dimensionality could reveal neural processes on the population level that are challenging to detect when measuring brain activity without the context of neighboring recording sites. We found that eye-of-origin information was decodable for the entire duration of stimulus presentation, but diminished in the deepest layers of V1. Conversely, orientation information was transient and equally pronounced along all layers. More importantly, using time-resolved MVPA, we were able to evaluate laminar response properties beyond those yielded by univariate analyses. Specifically, we performed a time generalization analysis by training a classifier at one point of the neural response and testing its performance throughout the remaining period of stimulation. Using this technique, we demonstrate repeating (reverberating) patterns of neural activity that have not previously been observed using standard univariate approaches.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.3389/fnsys.2020.600601
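
The time-generalization analysis described in this abstract (train a classifier at one time point, then test it at every other time point) can be prototyped in a few lines. The sketch below uses synthetic data and scikit-learn; the array shapes, classifier choice, and train/test split are assumptions rather than the authors' implementation.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic laminar spiking data: trials x channels x time bins, with binary labels.
rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 200, 24, 50
X = rng.normal(size=(n_trials, n_channels, n_times))
y = rng.integers(0, 2, size=n_trials)
X[y == 1, :, 20:] += 0.5  # inject a label-dependent signal after bin 20

half = n_trials // 2  # simple split: first half for training, second half for testing
accuracy = np.zeros((n_times, n_times))  # rows: training time, columns: testing time
for t_train in range(n_times):
    clf = LinearDiscriminantAnalysis().fit(X[:half, :, t_train], y[:half])
    for t_test in range(n_times):
        accuracy[t_train, t_test] = clf.score(X[half:, :, t_test], y[half:])
# Above-chance accuracy away from the diagonal of `accuracy` indicates that the
# decoded pattern generalizes (reverberates) across time.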

Pedro G Vieira; Matthew R Krause; Christopher C Pack

tACS entrains neural activity while somatosensory input is blocked Journal Article

PLoS Biology, 18 (10), pp. 1–14, 2020.

@article{Vieira2020,
title = {tACS entrains neural activity while somatosensory input is blocked},
author = {Pedro G Vieira and Matthew R Krause and Christopher C Pack},
doi = {10.1371/journal.pbio.3000834},
year = {2020},
date = {2020-01-01},
journal = {PLoS Biology},
volume = {18},
number = {10},
pages = {1--14},
abstract = {Transcranial alternating current stimulation (tACS) modulates brain activity by passing electrical current through electrodes that are attached to the scalp. Because it is safe and noninvasive, tACS holds great promise as a tool for basic research and clinical treatment. However, little is known about how tACS ultimately influences neural activity. One hypothesis is that tACS affects neural responses directly, by producing electrical fields that interact with the brain's endogenous electrical activity. By controlling the shape and location of these electric fields, one could target brain regions associated with particular behaviors or symptoms. However, an alternative hypothesis is that tACS affects neural activity indirectly, via peripheral sensory afferents. In particular, it has often been hypothesized that tACS acts on sensory fibers in the skin, which in turn provide rhythmic input to central neurons. In this case, there would be little possibility of targeted brain stimulation, as the regions modulated by tACS would depend entirely on the somatosensory pathways originating in the skin around the stimulating electrodes. Here, we directly test these competing hypotheses by recording single-unit activity in the hippocampus and visual cortex of alert monkeys receiving tACS. We find that tACS entrains neuronal activity in both regions, so that cells fire synchronously with the stimulation. Blocking somatosensory input with a topical anesthetic does not significantly alter these neural entrainment effects. These data are therefore consistent with the direct stimulation hypothesis and suggest that peripheral somatosensory stimulation is not required for tACS to entrain neurons.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1371/journal.pbio.3000834
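
Entrainment of spiking to the stimulation waveform, as tested in this study, is often quantified with a phase-locking value (PLV): the resultant vector length of spike phases relative to the tACS cycle. A minimal sketch under that assumption follows; the frequency, names, and data are illustrative, not the authors' analysis.

import numpy as np

def phase_locking_value(spike_times_s, tacs_freq_hz):
    # Phase of the tACS cycle at each spike time, then the length of the mean
    # resultant vector; 0 = no locking, 1 = perfect locking to the cycle.
    phases = (2.0 * np.pi * tacs_freq_hz * np.asarray(spike_times_s)) % (2.0 * np.pi)
    return np.abs(np.mean(np.exp(1j * phases)))

spikes = np.sort(np.random.uniform(0.0, 60.0, 500))  # placeholder spike times (s)
print("PLV at 10 Hz: %.3f" % phase_locking_value(spikes, 10.0))

Comparing such a PLV with and without a somatosensory block would mirror the logic of the comparison reported above.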

Benjamin Voloh; Mariann Oemisch; Thilo Womelsdorf

Phase of firing coding of learning variables across the fronto-striatal network during feature-based learning Journal Article

Nature Communications, 11 , pp. 1–16, 2020.

@article{Voloh2020,
title = {Phase of firing coding of learning variables across the fronto-striatal network during feature-based learning},
author = {Benjamin Voloh and Mariann Oemisch and Thilo Womelsdorf},
doi = {10.1038/s41467-020-18435-3},
year = {2020},
date = {2020-01-01},
journal = {Nature Communications},
volume = {11},
pages = {1--16},
publisher = {Springer US},
abstract = {The prefrontal cortex and striatum form a recurrent network whose spiking activity encodes multiple types of learning-relevant information. This spike-encoded information is evident in average firing rates, but finer temporal coding might allow multiplexing and enhanced readout across the connected network. We tested this hypothesis in the fronto-striatal network of nonhuman primates during reversal learning of feature values. We found that populations of neurons encoding choice outcomes, outcome prediction errors, and outcome history in their firing rates also carry significant information in their phase-of-firing at a 10–25 Hz band-limited beta frequency at which they synchronize across lateral prefrontal cortex, anterior cingulate cortex and anterior striatum when outcomes were processed. The phase-of-firing code exceeds information that can be obtained from firing rates alone and is evident for inter-areal connections between anterior cingulate cortex, lateral prefrontal cortex and anterior striatum. For the majority of connections, the phase-of-firing information gain is maximal at phases of the beta cycle that were offset from the preferred spiking phase of neurons. Taken together, these findings document enhanced information of three important learning variables at specific phases of firing in the beta cycle at an inter-areally shared beta oscillation frequency during goal-directed behavior.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1038/s41467-020-18435-3
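
A phase-of-firing analysis of the kind summarized above assigns each spike the instantaneous phase of the band-limited LFP at the spike time. The sketch below is illustrative only; the band edges, sampling rate, and variable names are assumptions, not the authors' code.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def spike_phases(lfp, spike_times_s, fs, band=(10.0, 25.0)):
    # Filter the LFP in the chosen beta band, take the instantaneous phase via
    # the Hilbert transform, and read off the phase at each spike time.
    b, a = butter(4, [band[0] / (fs / 2.0), band[1] / (fs / 2.0)], btype="band")
    phase = np.angle(hilbert(filtfilt(b, a, lfp)))
    idx = (np.asarray(spike_times_s) * fs).astype(int)  # spike times -> sample indices
    return phase[idx]

fs = 1000.0
lfp = np.random.randn(int(5 * fs))                 # placeholder LFP (5 s)
spikes = np.sort(np.random.uniform(0.0, 5.0, 80))  # placeholder spike times (s)
phases = spike_phases(lfp, spikes, fs)
# A non-uniform distribution of `phases` (e.g., assessed with a Rayleigh test) is the
# starting point for asking whether spike phase carries information beyond firing rate.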

Steven Wiesner; Ian W Baumgart; Xin Huang

Spatial arrangement drastically changes the neural representation of multiple visual stimuli that compete in more than one feature domain Journal Article

Journal of Neuroscience, 40 (9), pp. 1834–1848, 2020.

@article{Wiesner2020,
title = {Spatial arrangement drastically changes the neural representation of multiple visual stimuli that compete in more than one feature domain},
author = {Steven Wiesner and Ian W Baumgart and Xin Huang},
doi = {10.1523/JNEUROSCI.1950-19.2020},
year = {2020},
date = {2020-01-01},
journal = {Journal of Neuroscience},
volume = {40},
number = {9},
pages = {1834--1848},
abstract = {Natural scenes often contain multiple objects and surfaces. However, how neurons in the visual cortex represent multiple visual stimuli is not well understood. Previous studies have shown that, when multiple stimuli compete in one feature domain, the evoked neuronal response is biased toward the stimulus that has a stronger signal strength. We recorded from two male macaques to investigate how neurons in the middle temporal cortex (MT) represent multiple stimuli that compete in more than one feature domain. Visual stimuli were two random-dot patches moving in different directions. One stimulus had low luminance contrast and moved with high coherence, whereas the other had high contrast and moved with low coherence. We found that how MT neurons represent multiple stimuli depended on the spatial arrangement. When two stimuli were overlapping, MT responses were dominated by the stimulus component that had high contrast. When two stimuli were spatially separated within the receptive fields, the contrast dominance was abolished. We found the same results when using contrast to compete with motion speed. Our neural data and computer simulations using a V1-MT model suggest that the contrast dominance found with overlapping stimuli is due to normalization occurring at an input stage fed to MT, and MT neurons cannot overturn this bias based on their own feature selectivity. The interaction between spatially separated stimuli can largely be explained by normalization within MT. Our results revealed new rules on stimulus competition and highlighted the impact of hierarchical processing on representing multiple stimuli in the visual cortex.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1523/JNEUROSCI.1950-19.2020
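
The normalization account invoked in this abstract is commonly written as a contrast-weighted average of the responses to the individual stimulus components. The toy implementation below is a generic version of that standard model; the parameter values and exact form are assumptions, not the authors' fitted model.

def normalized_response(c1, r1, c2, r2, sigma=0.1):
    # Divisive normalization for two superimposed stimuli: each component's
    # driven response r_i is weighted by its signal strength c_i (e.g., contrast),
    # and the sum is divided by the pooled strength plus a semi-saturation constant.
    return (c1 * r1 + c2 * r2) / (c1 + c2 + sigma)

# High-contrast / low-coherence component paired with a low-contrast / high-coherence one:
print(normalized_response(c1=0.8, r1=20.0, c2=0.1, r2=60.0))

With overlapping stimuli the high-contrast component dominates the weighted average, which is the qualitative effect reported above.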

Vanessa A D Wilson; Carolin Kade; Sebastian Moeller; Stefan Treue; Igor Kagan; Julia Fischer

Macaque gaze responses to the primatar: A virtual macaque head for social cognition research Journal Article

Frontiers in Psychology, 11 , pp. 1–13, 2020.

@article{Wilson2020a,
title = {Macaque gaze responses to the primatar: A virtual macaque head for social cognition research},
author = {Vanessa A D Wilson and Carolin Kade and Sebastian Moeller and Stefan Treue and Igor Kagan and Julia Fischer},
doi = {10.3389/fpsyg.2020.01645},
year = {2020},
date = {2020-01-01},
journal = {Frontiers in Psychology},
volume = {11},
pages = {1--13},
abstract = {Following the expanding use and applications of virtual reality in everyday life, realistic virtual stimuli are of increasing interest in cognitive studies. They allow for control of features such as gaze, expression, appearance, and movement, which may help to overcome limitations of using photographs or video recordings to study social responses. In using virtual stimuli however, one must be careful to avoid the uncanny valley effect, where realistic stimuli can be perceived as eerie, and induce an aversion response. At the same time, it is important to establish whether responses to virtual stimuli mirror responses to depictions of a real conspecific. In the current study, we describe the development of a new virtual monkey head with realistic facial features for experiments with nonhuman primates, the “Primatar.” As a first step toward validation, we assessed how monkeys respond to facial images of a prototype of this Primatar compared to images of real monkeys (RMs), and an unrealistic model. We also compared gaze responses between original images and scrambled as well as obfuscated versions of these images. We measured looking time to images in six freely moving long-tailed macaques (Macaca fascicularis) and gaze exploration behavior in three rhesus macaques (Macaca mulatta). Both groups showed more signs of overt attention to original images than scrambled or obfuscated images. In addition, we found no evidence for an uncanny valley effect; since for both groups, looking times did not differ between real, realistic, or unrealistic images. These results provide important data for further development of our Primatar for use in social cognition studies and more generally for cognitive research with virtual stimuli in nonhuman primates. Future research on the absence of an uncanny valley effect in macaques is needed, to elucidate the roots of this mechanism in humans.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.3389/fpsyg.2020.01645

Close

Seng Bum Michael Yoo; Benjamin Y Hayden

The transition from evaluation to selection involves neural subspace reorganization in core reward regions Journal Article

Neuron, 105 (4), pp. 1–13, 2020.

Abstract | Links | BibTeX

@article{Yoo2020,
title = {The transition from evaluation to selection involves neural subspace reorganization in core reward regions},
author = {Seng Bum Michael Yoo and Benjamin Y Hayden},
doi = {10.1016/j.neuron.2019.11.013},
year = {2020},
date = {2020-01-01},
journal = {Neuron},
volume = {105},
number = {4},
pages = {1--13},
publisher = {Elsevier Inc.},
abstract = {Economic choice proceeds from evaluation, in which we contemplate options, to selection, in which we weigh options and choose one. These stages must be differentiated so that decision makers do not proceed to selection before evaluation is complete. We examined responses of neurons in two core reward regions, orbitofrontal (OFC) and ventromedial prefrontal cortex (vmPFC), during two-option choice with asynchronous offer presentation. Our data suggest that neurons selective during the first (presumed evaluation) and second (presumed comparison and selection) offer epochs come from a single pool. Stage transition is accompanied by a shift toward orthogonality in the low-dimensional population response manifold. Nonetheless, the relative position of each option in driving responses in the population subspace is preserved. The orthogonalization we observe supports the hypothesis that the transition from evaluation to selection leads to reorganization of response subspace and suggests a mechanism by which value-related signals are prevented from prematurely driving choice.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.neuron.2019.11.013
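
A minimal sketch of the kind of subspace comparison this abstract describes, assuming synthetic spike-count matrices of shape (trials x neurons) for two hypothetical task epochs; it computes principal angles between the epochs' leading principal-component subspaces and is a generic illustration, not the authors' analysis pipeline.

import numpy as np

def epoch_subspace(X, n_dims=3):
    # Orthonormal basis (neurons x n_dims) for the top principal components
    # of mean-centered activity X (trials x neurons).
    Xc = X - X.mean(axis=0, keepdims=True)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:n_dims].T

def principal_angles_deg(Q1, Q2):
    # Principal angles between two subspaces given orthonormal bases:
    # 0 deg = perfectly aligned, 90 deg = orthogonal.
    s = np.clip(np.linalg.svd(Q1.T @ Q2, compute_uv=False), -1.0, 1.0)
    return np.degrees(np.arccos(s))

# Synthetic stand-in data; a real analysis would use recorded OFC/vmPFC activity.
rng = np.random.default_rng(0)
X_epoch1 = rng.normal(size=(200, 50))   # hypothetical "evaluation" epoch
X_epoch2 = rng.normal(size=(200, 50))   # hypothetical "selection" epoch

Q1, Q2 = epoch_subspace(X_epoch1), epoch_subspace(X_epoch2)
print(principal_angles_deg(Q1, Q2))     # angles near 90 deg indicate reorganization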

Mengxi Yun; Takashi Kawai; Masafumi Nejime; Hiroshi Yamada; Masayuki Matsumoto

Signal dynamics of midbrain dopamine neurons during economic decision-making in monkeys Journal Article

Science Advances, 6 , pp. 1–15, 2020.

Abstract | Links | BibTeX

@article{Yun2020,
title = {Signal dynamics of midbrain dopamine neurons during economic decision-making in monkeys},
author = {Mengxi Yun and Takashi Kawai and Masafumi Nejime and Hiroshi Yamada and Masayuki Matsumoto},
doi = {10.1126/sciadv.aba4962},
year = {2020},
date = {2020-01-01},
journal = {Science Advances},
volume = {6},
pages = {1--15},
abstract = {When we make economic choices, the brain first evaluates available options and then decides whether to choose them. Midbrain dopamine neurons are known to reinforce economic choices through their signal evoked by outcomes after decisions are made. However, although critical internal processing is executed while decisions are being made, little is known about the role of dopamine neurons during this period. We found that dopamine neurons exhibited dynamically changing signals related to the internal processing while rhesus monkeys were making decisions. These neurons encoded the value of an option immediately after it was offered and then gradually changed their activity to represent the animal's upcoming choice. Similar dynamics were observed in the orbitofrontal cortex, a center for economic decision-making, but the value-to-choice signal transition was completed earlier in dopamine neurons. Our findings suggest that dopamine neurons are a key component of the neural network that makes choices from values during ongoing decision-making processes.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1126/sciadv.aba4962

Polina Zamarashkina; Dina V Popovkina; Anitha Pasupathy

Timing of response onset and offset in macaque V4: stimulus and task dependence Journal Article

Journal of Neurophysiology, 123 (6), pp. 2311–2325, 2020.

Abstract | Links | BibTeX

@article{Zamarashkina2020,
title = {Timing of response onset and offset in macaque V4: stimulus and task dependence},
author = {Polina Zamarashkina and Dina V Popovkina and Anitha Pasupathy},
doi = {10.1152/jn.00586.2019},
year = {2020},
date = {2020-01-01},
journal = {Journal of Neurophysiology},
volume = {123},
number = {6},
pages = {2311--2325},
abstract = {In the primate visual cortex, both the magnitude of the neuronal response and its timing can carry important information about the visual world, but studies typically focus only on response magnitude. Here, we examine the onset and offset latency of the responses of neurons in area V4 of awake, behaving macaques across several experiments in the context of a variety of stimuli and task paradigms. Our results highlight distinct contributions of stimuli and tasks to V4 response latency. We found that response onset latencies are shorter than typically cited (median = 75.5 ms), supporting a role for V4 neurons in rapid object and scene recognition functions. Moreover, onset latencies are longer for smaller stimuli and stimulus outlines, consistent with the hypothesis that longer latencies are associated with higher spatial frequency content. Strikingly, we found that onset latencies showed no significant dependence on stimulus occlusion, unlike in inferotemporal cortex, nor on task demands. Across the V4 population, onset latencies had a broad distribution, reflecting the diversity of feedforward, recurrent, and feedback connections that inform the responses of individual neurons. Response offset latencies, on the other hand, displayed the opposite tendency in their relationship to stimulus and task attributes: they are less influenced by stimulus appearance but are shorter in guided saccade tasks compared with fixation tasks. The observation that response latency is influenced by stimulus- and task-associated factors emphasizes a need to examine response timing alongside firing rate in determining the functional role of area V4. NEW & NOTEWORTHY Onset and offset timing of neuronal responses can provide information about visual environment and neuron's role in visual processing and its anatomical connectivity. In the first comprehensive examination of onset and offset latencies in the intermediate visual cortical area V4, we find neurons respond faster than previously reported, making them ideally suited to contribute to rapid object and scene recognition. While response onset reflects stimulus characteristics, timing of response offset is influenced more by behavioral task.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1152/jn.00586.2019
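
Onset latencies like those reported here are often estimated by thresholding a peristimulus time histogram against baseline variability. The sketch below shows one common heuristic on a synthetic PSTH; the detection rule and its parameters are illustrative assumptions, not necessarily the method used in the paper.

import numpy as np

def onset_latency(psth, t, baseline_mask, n_sd=3.0, n_consec=3):
    # First post-stimulus time at which the PSTH exceeds baseline mean
    # plus n_sd standard deviations for n_consec consecutive bins.
    mu, sd = psth[baseline_mask].mean(), psth[baseline_mask].std()
    above = psth > mu + n_sd * sd
    for i in range(len(t) - n_consec + 1):
        if t[i] > 0 and above[i:i + n_consec].all():
            return t[i]
    return np.nan   # no detectable response

# Synthetic PSTH: 10 Hz baseline, response starting 75 ms after stimulus onset.
t = np.arange(-200.0, 400.0, 5.0)               # 5-ms bins, 0 = stimulus onset
rng = np.random.default_rng(1)
psth = 10 + rng.normal(0, 1, t.size)
psth[t >= 75] += 40
print(onset_latency(psth, t, baseline_mask=(t < 0)))   # ~75 ms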

Armin Najarpour Foroushani; Sujaya Neupane; Pablo De Heredia Pastor; Christopher C Pack; Mohamad Sawan

Spatial resolution of local field potential signals in macaque V4 Journal Article

Journal of Neural Engineering, 17 (2), pp. 1–23, 2020.

Abstract | Links | BibTeX

@article{Foroushani2020,
title = {Spatial resolution of local field potential signals in macaque V4},
author = {Armin Najarpour Foroushani and Sujaya Neupane and Pablo {De Heredia Pastor} and Christopher C Pack and Mohamad Sawan},
doi = {10.1088/1741-2552/ab7321},
year = {2020},
date = {2020-01-01},
journal = {Journal of Neural Engineering},
volume = {17},
number = {2},
pages = {1--23},
publisher = {IOP Publishing},
abstract = {Objective. An important challenge for the development of cortical visual prostheses is to generate spatially localized percepts of light, using artificial stimulation. Such percepts are called phosphenes, and the goal of prosthetic applications is to generate a pattern of phosphenes that matches the structure of the retinal image. A preliminary step in this process is to understand how the spatial positions of phosphene-like visual stimuli are encoded in the distributed activity of cortical neurons. The spatial resolution with which the distributed responses discriminate positions puts a limit on the capability of visual prosthesis devices to induce phosphenes at multiple positions. While most previous prosthetic devices have targeted the primary visual cortex, the extrastriate cortex has the advantage of covering a large part of the visual field with a smaller amount of cortical tissue, providing the possibility of a more compact implant. Here, we studied how well ensembles of Local Field Potentials (LFPs) and Multiunit activity (MUA) responses from extrastriate cortical visual area V4 of a behaving macaque monkey can discriminate between two-dimensional spatial positions. Approach. We used support vector machines (SVM) to determine the capabilities of LFPs and MUA to discriminate responses to phosphene-like stimuli (probes) at different spatial separations. We proposed a selection strategy based on the combined responses of multiple electrodes and used the linear learning weights to find the minimum number of electrodes for fine and coarse discriminations. We also measured the contribution of correlated trial-to-trial variability in the responses to the discrimination performance for MUA and LFP. Main results. We found that despite the large receptive field sizes in V4, the combined responses from multiple sites, whether MUA or LFP, are capable of fine and coarse discrimination of positions. Our electrode selection procedure significantly increased discrimination performance while reducing the required number of electrodes. Analysis of noise correlations in MUA and LFP responses showed that noise correlations in LFPs carry more information about spatial positions. Significance. This study determined the coding strategy for fine discrimination, suggesting that spatial positions could be well localized with patterned stimulation in extrastriate area V4. It also provides a novel approach to build a compact prosthesis with relatively few electrodes, which has the potential advantage of reducing tissue damage in real applications.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1088/1741-2552/ab7321
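
The SVM-based position discrimination described in this abstract can be prototyped with scikit-learn. The snippet below trains a linear SVM on synthetic trial-by-electrode feature matrices for two probe positions and ranks electrodes by the magnitude of the learned weights; all data shapes and parameters are stand-ins, not the recorded V4 responses or the authors' exact selection procedure.

import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Synthetic trial-wise features (e.g., band-limited LFP power per electrode)
# for probes at two nearby positions; real analyses would use recorded data.
rng = np.random.default_rng(2)
n_trials, n_electrodes = 200, 96
X_pos_a = rng.normal(0.0, 1.0, size=(n_trials, n_electrodes))
X_pos_b = rng.normal(0.3, 1.0, size=(n_trials, n_electrodes))
X = np.vstack([X_pos_a, X_pos_b])
y = np.r_[np.zeros(n_trials), np.ones(n_trials)]

clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0, max_iter=10000))
acc = cross_val_score(clf, X, y, cv=5)          # 5-fold cross-validated accuracy
print(f"decoding accuracy: {acc.mean():.2f} +/- {acc.std():.2f}")

# The linear weights rank electrodes by their contribution to the
# discrimination, loosely analogous to the electrode-selection idea above.
clf.fit(X, y)
weights = np.abs(clf.named_steps["linearsvc"].coef_).ravel()
print("most informative electrodes:", np.argsort(weights)[::-1][:5])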

Mathilda Froesel; Quentin Goudard; Marc Hauser; Maëva Gacoin; Suliann Ben Hamed

Automated video-based heart rate tracking for the anesthetized and behaving monkey Journal Article

Scientific Reports, 10 , pp. 1–11, 2020.

Abstract | Links | BibTeX

@article{Froesel2020,
title = {Automated video-based heart rate tracking for the anesthetized and behaving monkey},
author = {Mathilda Froesel and Quentin Goudard and Marc Hauser and Ma{ë}va Gacoin and Suliann {Ben Hamed}},
doi = {10.1038/s41598-020-74954-5},
year = {2020},
date = {2020-01-01},
journal = {Scientific Reports},
volume = {10},
pages = {1--11},
publisher = {Nature Publishing Group UK},
abstract = {Heart rate (HR) is extremely valuable in the study of complex behaviours and their physiological correlates in non-human primates. However, collecting this information is often challenging, involving either invasive implants or tedious behavioural training. In the present study, we implement a Eulerian video magnification (EVM) heart tracking method in the macaque monkey combined with wavelet transform. This is based on a measure of image to image fluctuations in skin reflectance due to changes in blood influx. We show a strong temporal coherence and amplitude match between EVM-based heart tracking and ground truth ECG, from both color (RGB) and infrared (IR) videos, in anesthetized macaques, to a level comparable to what can be achieved in humans. We further show that this method allows to identify consistent HR changes following the presentation of conspecific emotional voices or faces. EVM is used to extract HR in humans but has never been applied to non-human primates. Video photoplethysmography allows to extract awake macaques HR from RGB videos. In contrast, our method allows to extract awake macaques HR from both RGB and IR videos and is particularly resilient to the head motion that can be observed in awake behaving monkeys. Overall, we believe that this method can be generalized as a tool to track HR of the awake behaving monkey, for ethological, behavioural, neuroscience or welfare purposes.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1038/s41598-020-74954-5
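
A heavily simplified, photoplethysmography-style version of video-based heart-rate estimation: band-pass the frame-by-frame mean intensity of a skin region and take the dominant frequency. The paper's Eulerian video magnification plus wavelet approach is more involved; the function and test signal below are illustrative assumptions only.

import numpy as np
from scipy.signal import butter, filtfilt

def heart_rate_bpm(roi_mean, fs, lo=1.5, hi=5.0):
    # Band-pass the mean skin-ROI intensity around plausible macaque heart
    # rates, then take the dominant frequency of the filtered trace.
    b, a = butter(3, [lo, hi], btype="bandpass", fs=fs)
    x = filtfilt(b, a, roi_mean - roi_mean.mean())
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= lo) & (freqs <= hi)
    return 60.0 * freqs[band][np.argmax(power[band])]

# Synthetic trace: 30 s of video at 30 fps with a 2.5 Hz (150 bpm) pulse.
fs = 30.0
t = np.arange(0, 30.0, 1.0 / fs)
rng = np.random.default_rng(3)
roi_mean = 100 + 0.5 * np.sin(2 * np.pi * 2.5 * t) + rng.normal(0, 0.3, t.size)
print(heart_rate_bpm(roi_mean, fs))   # close to 150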

Jennifer M Groh; John M Pearson; Jeff T Mohl

Monkeys and humans implement causal inference to simultaneously localize auditory and visual stimuli Journal Article

Journal of Neurophysiology, 124 (3), pp. 715–727, 2020.

Abstract | Links | BibTeX

@article{Groh2020,
title = {Monkeys and humans implement causal inference to simultaneously localize auditory and visual stimuli},
author = {Jennifer M Groh and John M Pearson and Jeff T Mohl},
doi = {10.1152/jn.00046.2020},
year = {2020},
date = {2020-01-01},
journal = {Journal of Neurophysiology},
volume = {124},
number = {3},
pages = {715--727},
abstract = {The environment is sampled by multiple senses, which are woven together to produce a unified perceptual state. However, optimally unifying such signals requires assigning particular signals to the same or different underlying objects or events. Many prior studies (especially in animals) have assumed fusion of cross-modal information, whereas recent work in humans has begun to probe the appropriateness of this assumption. Here we present results from a novel behavioral task in which both monkeys (Macaca mulatta) and humans localized visual and auditory stimuli and reported their perceived sources through saccadic eye movements. When the locations of visual and auditory stimuli were widely separated, subjects made two saccades, while when the two stimuli were presented at the same location they made only a single saccade. Intermediate levels of separation produced mixed response patterns: a single saccade to an intermediate position on some trials or separate saccades to both locations on others. The distribution of responses was well described by a hierarchical causal inference model that accurately predicted both the explicit “same vs. different” source judgments as well as biases in localization of the source(s) under each of these conditions. The results from this task are broadly consistent with prior work in humans across a wide variety of analogous tasks, extending the study of multisensory causal inference to nonhuman primates and to a natural behavioral task with both a categorical assay of the number of perceived sources and a continuous report of the perceived position of the stimuli.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1152/jn.00046.2020
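
Hierarchical causal inference models of this kind are usually built on the closed-form posterior over a common versus separate causes (Körding et al., 2007). The sketch below implements that standard formulation for one visual and one auditory measurement; whether its parameterization matches the exact model fitted in the paper is an assumption.

import numpy as np

def p_common(x_v, x_a, sig_v, sig_a, sig_p, prior_common=0.5, mu_p=0.0):
    # Posterior probability that the visual and auditory measurements share a
    # single source, under Gaussian likelihoods and a Gaussian spatial prior.
    var_v, var_a, var_p = sig_v**2, sig_a**2, sig_p**2
    z1 = var_v * var_a + var_v * var_p + var_a * var_p
    like_c1 = np.exp(-0.5 * ((x_v - x_a)**2 * var_p
                             + (x_v - mu_p)**2 * var_a
                             + (x_a - mu_p)**2 * var_v) / z1) / (2 * np.pi * np.sqrt(z1))
    z2 = (var_v + var_p) * (var_a + var_p)
    like_c2 = np.exp(-0.5 * ((x_v - mu_p)**2 / (var_v + var_p)
                             + (x_a - mu_p)**2 / (var_a + var_p))) / (2 * np.pi * np.sqrt(z2))
    return like_c1 * prior_common / (like_c1 * prior_common + like_c2 * (1 - prior_common))

# Widely separated cues -> low probability of a common source (two saccades);
# nearly co-located cues -> high probability (a single saccade).
print(p_common(x_v=0.0, x_a=24.0, sig_v=2.0, sig_a=8.0, sig_p=20.0))   # low
print(p_common(x_v=0.0, x_a=2.0,  sig_v=2.0, sig_a=8.0, sig_p=20.0))   # high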

Roberto A Gulli; Lyndon R Duong; Benjamin W Corrigan; Guillaume Doucet; Sylvain Williams; Stefano Fusi; Julio C Martinez-Trujillo

Context-dependent representations of objects and space in the primate hippocampus during virtual navigation Journal Article

Nature Neuroscience, 23 (1), pp. 103–112, 2020.

Abstract | Links | BibTeX

@article{Gulli2020,
title = {Context-dependent representations of objects and space in the primate hippocampus during virtual navigation},
author = {Roberto A Gulli and Lyndon R Duong and Benjamin W Corrigan and Guillaume Doucet and Sylvain Williams and Stefano Fusi and Julio C Martinez-Trujillo},
doi = {10.1038/s41593-019-0548-3},
year = {2020},
date = {2020-01-01},
journal = {Nature Neuroscience},
volume = {23},
number = {1},
pages = {103--112},
publisher = {Springer US},
abstract = {The hippocampus is implicated in associative memory and spatial navigation. To investigate how these functions are mixed in the hippocampus, we recorded from single hippocampal neurons in macaque monkeys navigating a virtual maze during a foraging task and a context–object associative memory task. During both tasks, single neurons encoded information about spatial position; a linear classifier also decoded position. However, the population code for space did not generalize across tasks, particularly where stimuli relevant to the associative memory task appeared. Single-neuron and population-level analyses revealed that cross-task changes were due to selectivity for nonspatial features of the associative memory task when they were visually available (perceptual coding) and following their disappearance (mnemonic coding). Our results show that neurons in the primate hippocampus nonlinearly mix information about space and nonspatial elements of the environment in a task-dependent manner; this efficient code flexibly represents unique perceptual experiences and correspondent memories.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1038/s41593-019-0548-3
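
Cross-task generalization of a population code, as tested in this study, can be probed with a decoder trained in one task and tested in the other. The sketch below does this on synthetic spike counts with deliberately remapped tuning between the two "tasks"; the decoder choice (multinomial logistic regression) and all data are illustrative assumptions, not the authors' methods.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic population spike counts (trials x neurons) labelled by position
# bin, generated with different tuning in the two "tasks".
rng = np.random.default_rng(4)
n_trials, n_neurons, n_positions = 300, 80, 6
pos_task1 = rng.integers(0, n_positions, n_trials)
pos_task2 = rng.integers(0, n_positions, n_trials)
tuning1 = rng.normal(size=(n_positions, n_neurons))
tuning2 = rng.normal(size=(n_positions, n_neurons))   # remapped tuning
X1 = tuning1[pos_task1] + rng.normal(0, 1, (n_trials, n_neurons))
X2 = tuning2[pos_task2] + rng.normal(0, 1, (n_trials, n_neurons))

dec = LogisticRegression(max_iter=2000)

# Within-task decoding of position (cross-validated) vs. cross-task transfer.
print("within task 1:", cross_val_score(dec, X1, pos_task1, cv=5).mean())
dec.fit(X1, pos_task1)
print("task 1 -> task 2:", dec.score(X2, pos_task2))   # near chance (1/6) here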

Ziad M Hafed; Laurent Goffart

Gaze direction as equilibrium: More evidence from spatial and temporal aspects of small-saccade triggering in the rhesus macaque monkey Journal Article

Journal of Neurophysiology, 123 (1), pp. 308–322, 2020.

Abstract | Links | BibTeX

@article{Hafed2020,
title = {Gaze direction as equilibrium: More evidence from spatial and temporal aspects of small-saccade triggering in the rhesus macaque monkey},
author = {Ziad M Hafed and Laurent Goffart},
doi = {10.1152/JN.00588.2019},
year = {2020},
date = {2020-01-01},
journal = {Journal of Neurophysiology},
volume = {123},
number = {1},
pages = {308--322},
abstract = {Rigorous behavioral studies made in human subjects have shown that small-eccentricity target displacements are associated with increased saccadic reaction times, but the reasons for this remain unclear. Before characterizing the neurophysiological foundations underlying this relationship between the spatial and temporal aspects of saccades, we tested the triggering of small saccades in the male rhesus macaque monkey. We also compared our results to those obtained in human subjects, both from the existing literature and through our own additional measurements. Using a variety of behavioral tasks exercising visual and nonvisual guidance of small saccades, we found that small saccades consistently require more time than larger saccades to be triggered in the nonhuman primate, even in the absence of any visual guidance and when valid advance information about the saccade landing position is available. We also found a strong asymmetry in the reaction times of small upper versus lower visual field visually guided saccades, a phenomenon that has not been described before for small saccades, even in humans. Following the suggestion that an eye movement is not initiated as long as the visuo-oculomotor system is within a state of balance, in which opposing commands counterbalance each other, we propose that the longer reaction times are a signature of enhanced times needed to create the symmetry-breaking condition that puts downstream premotor neurons into a push-pull regime necessary for rotating the eyeballs. Our results provide an important catalog of nonhuman primate oculomotor capabilities on the miniature scale, allowing concrete predictions on underlying neurophysiological mechanisms. NEW & NOTEWORTHY Leveraging a multitude of neurophysiological investigations in the rhesus macaque monkey, we generated and tested hypotheses about small-saccade latencies in this animal model. We found that small saccades always take longer, on average, than larger saccades to trigger, regardless of visual and cognitive context. Moreover, small downward saccades have the longest latencies overall. Our results provide an important documentation of oculomotor capabilities of an indispensable animal model for neuroscientific research in vision, cognition, and action.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1152/JN.00588.2019

Eric Hart; Alexander C Huk

Recurrent circuit dynamics underlie persistent activity in the macaque frontoparietal network Journal Article

eLife, 9 , pp. 1–22, 2020.

Abstract | Links | BibTeX

@article{Hart2020,
title = {Recurrent circuit dynamics underlie persistent activity in the macaque frontoparietal network},
author = {Eric Hart and Alexander C Huk},
doi = {10.7554/eLife.52460},
year = {2020},
date = {2020-01-01},
journal = {eLife},
volume = {9},
pages = {1--22},
abstract = {During delayed oculomotor response tasks, neurons in the lateral intraparietal area (LIP) and the frontal eye fields (FEF) exhibit persistent activity that reflects the active maintenance of behaviorally relevant information. Despite many computational models of the mechanisms of persistent activity, there is a lack of circuit-level data from the primate to inform the theories. To fill this gap, we simultaneously recorded ensembles of neurons in both LIP and FEF while macaques performed a memory-guided saccade task. A population encoding model revealed strong and symmetric long-timescale recurrent excitation between LIP and FEF. Unexpectedly, LIP exhibited stronger local functional connectivity than FEF, and many neurons in LIP had longer network and intrinsic timescales. The differences in connectivity could be explained by the strength of recurrent dynamics in attractor networks. These findings reveal reciprocal multi-area circuit dynamics in the frontoparietal network during persistent activity and lay the groundwork for quantitative comparisons to theoretical models.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.7554/eLife.52460
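
The "intrinsic timescales" mentioned in this abstract are commonly estimated by fitting an exponential decay to the across-trial autocorrelation of binned spike counts, in the spirit of Murray et al. (2014). The sketch below applies that generic recipe to synthetic data; it is not the population encoding model used in the paper.

import numpy as np
from scipy.optimize import curve_fit

def intrinsic_timescale(counts, bin_ms=50.0):
    # Fit tau (ms) of an exponential decay to the across-trial autocorrelation
    # of binned activity (trials x bins).
    n_bins = counts.shape[1]
    lags, ac = [], []
    for i in range(n_bins):
        for j in range(i + 1, n_bins):
            lags.append((j - i) * bin_ms)
            ac.append(np.corrcoef(counts[:, i], counts[:, j])[0, 1])
    expdecay = lambda dt, a, tau, b: a * np.exp(-dt / tau) + b
    (a, tau, b), _ = curve_fit(expdecay, np.array(lags), np.array(ac),
                               p0=[0.5, 200.0, 0.0], maxfev=10000)
    return tau

# Synthetic binned activity whose autocorrelation decays with tau ~ 200 ms.
rng = np.random.default_rng(5)
n_trials, n_bins, tau_true, bin_ms = 400, 10, 200.0, 50.0
alpha = np.exp(-bin_ms / tau_true)
latent = np.zeros((n_trials, n_bins))
latent[:, 0] = rng.normal(size=n_trials)
for k in range(1, n_bins):
    latent[:, k] = alpha * latent[:, k - 1] + np.sqrt(1 - alpha**2) * rng.normal(size=n_trials)
counts = 5 + 2 * latent + rng.normal(0, 1, latent.shape)   # noisy "spike counts"
print(intrinsic_timescale(counts, bin_ms))                 # roughly 200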

Christopher A Henry; Adam Kohn

Spatial contextual effects in primary visual cortex limit feature representation under crowding Journal Article

Nature Communications, 11 , pp. 1–12, 2020.

Abstract | Links | BibTeX

@article{Henry2020,
title = {Spatial contextual effects in primary visual cortex limit feature representation under crowding},
author = {Christopher A Henry and Adam Kohn},
doi = {10.1038/s41467-020-15386-7},
year = {2020},
date = {2020-01-01},
journal = {Nature Communications},
volume = {11},
pages = {1--12},
publisher = {Springer US},
abstract = {Crowding is a profound loss of discriminability of visual features, when a target stimulus is surrounded by distractors. Numerous studies of human perception have characterized how crowding depends on the properties of a visual display. Yet, there is limited understanding of how and where stimulus information is lost in the visual system under crowding. Here, we show that macaque monkeys exhibit perceptual crowding for target orientation that is similar to humans. We then record from neuronal populations in monkey primary visual cortex (V1). These populations show an appreciable loss of information about target orientation in the presence of distractors, due both to divisive and additive modulation of responses to targets by distractors. Our results show that spatial contextual effects in V1 limit the discriminability of visual features and can contribute substantively to crowding.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1038/s41467-020-15386-7
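
Divisive versus additive modulation by distractors can be separated, to first order, by fitting a line relating each neuron's responses with distractors present to its responses to the same targets alone: the slope captures a multiplicative (divisive) component and the intercept an additive one. The fit below runs on synthetic tuning data and is only a schematic of that idea, not the analysis reported in the paper.

import numpy as np

# Per neuron, relate mean responses to targets-plus-distractors to responses
# to the same targets alone, across a set of target orientations.
rng = np.random.default_rng(7)
n_neurons, n_orientations = 60, 8
r_alone = rng.gamma(shape=4.0, scale=5.0, size=(n_neurons, n_orientations))
true_gain, true_offset = 0.6, 3.0
r_crowded = true_gain * r_alone + true_offset + rng.normal(0, 1, r_alone.shape)

gains, offsets = [], []
for alone, crowded in zip(r_alone, r_crowded):
    slope, intercept = np.polyfit(alone, crowded, 1)   # least-squares line
    gains.append(slope)
    offsets.append(intercept)
print(f"median gain ~ {np.median(gains):.2f}, median offset ~ {np.median(offsets):.1f} spikes/s")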

James P Herman; Fabrice Arcizet; Richard J Krauzlis

Attention-related modulation of caudate neurons depends on superior colliculus activity Journal Article

eLife, 9 , pp. 1–26, 2020.

Abstract | Links | BibTeX

@article{Herman2020,
title = {Attention-related modulation of caudate neurons depends on superior colliculus activity},
author = {James P Herman and Fabrice Arcizet and Richard J Krauzlis},
doi = {10.7554/ELIFE.53998},
year = {2020},
date = {2020-01-01},
journal = {eLife},
volume = {9},
pages = {1--26},
abstract = {Recent work has implicated the primate basal ganglia in visual perception and attention, in addition to their traditional role in motor control. The basal ganglia, especially the caudate nucleus “head” (CDh) of the striatum, receive indirect anatomical connections from the superior colliculus, a midbrain structure that is known to play a crucial role in the control of visual attention. To test the possible functional relationship between these subcortical structures, we recorded CDh neuronal activity of macaque monkeys before and during unilateral superior colliculus (SC) inactivation in a spatial attention task. SC inactivation significantly altered the attention-related modulation of CDh neurons and strongly impaired the classification of task epochs based on CDh activity. Only inactivation of SC on the same side of the brain as recorded CDh neurons, not the opposite side, had these effects. These results demonstrate a novel interaction between SC activity and attention-related visual processing in the basal ganglia.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.7554/ELIFE.53998

Ahmad Jezzini; Camillo Padoa-Schioppa

Neuronal activity in the primate amygdala during economic choice Journal Article

Journal of Neuroscience, 40 (6), pp. 1286–1301, 2020.

Abstract | Links | BibTeX

@article{Jezzini2020,
title = {Neuronal activity in the primate amygdala during economic choice},
author = {Ahmad Jezzini and Camillo Padoa-Schioppa},
doi = {10.1523/JNEUROSCI.0961-19.2019},
year = {2020},
date = {2020-01-01},
journal = {Journal of Neuroscience},
volume = {40},
number = {6},
pages = {1286--1301},
abstract = {Multiple lines of evidence link economic choices to the orbitofrontal cortex (OFC), but other brain regions may contribute to the computation and comparison of economic values. A particularly strong candidate is the basolateral amygdala (BLA). Amygdala lesions impair performance in reinforcer devaluation tasks, suggesting that the BLA contributes to value computation. Furthermore, previous studies of the BLA have found neuronal activity consistent with a value representation. Here, we recorded from the BLA of two male rhesus macaques choosing between different juices. Offered quantities varied from trial to trial, and relative values were inferred from choices. Approximately one-third of BLA cells were task-related. Our analyses revealed the presence of three groups of neurons encoding variables offer value, chosen value, and chosen juice. In this respect, the BLA appeared similar to the OFC. The two areas differed for the proportion of neurons in each group, as the fraction of chosen value cells was significantly higher in the BLA. Importantly, the activity of these neurons reflected the subjective nature of value. Firing rates in the BLA were sustained throughout the trial and maximal after juice delivery. In contrast, firing rates in the OFC were phasic and maximal shortly after offer presentation. Our results suggest that the BLA supports economic choice and reward expectation.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1523/JNEUROSCI.0961-19.2019

Kohitij Kar; James J DiCarlo

Fast recurrent processing via ventrolateral prefrontal cortex Is needed by the primate ventral stream for robust core visual object recognition Journal Article

Neuron, pp. 1–13, 2020.

Abstract | Links | BibTeX

@article{Kar2020,
title = {Fast recurrent processing via ventrolateral prefrontal cortex Is needed by the primate ventral stream for robust core visual object recognition},
author = {Kohitij Kar and James J DiCarlo},
doi = {10.1016/j.neuron.2020.09.035},
year = {2020},
date = {2020-01-01},
journal = {Neuron},
pages = {1--13},
publisher = {Elsevier Inc.},
abstract = {Distributed neural population spiking patterns in macaque inferior temporal (IT) cortex that support core object recognition require additional time to develop for specific, “late-solved” images. This suggests the necessity of recurrent processing in these computations. Which brain circuits are responsible for computing and transmitting these putative recurrent signals to IT? To test whether the ventrolateral prefrontal cortex (vlPFC) is a critical recurrent node in this system, here, we pharmacologically inactivated parts of vlPFC and simultaneously measured IT activity while monkeys performed object discrimination tasks. vlPFC inactivation deteriorated the quality of late-phase (>150 ms from image onset) IT population code and produced commensurate behavioral deficits for late-solved images. Finally, silencing vlPFC caused the monkeys' IT activity and behavior to become more like those produced by feedforward-only ventral stream models. Together with prior work, these results implicate fast recurrent processing through vlPFC as critical to producing behaviorally sufficient object representations in IT.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.neuron.2020.09.035

Sanjeev B Khanna; Jonathan A Scott; Matthew A Smith

Dynamic shifts of visual and saccadic signals in prefrontal cortical regions 8Ar and FEF Journal Article

Journal of Neurophysiology, 124 (6), pp. 1774–1791, 2020.

Abstract | Links | BibTeX

@article{Khanna2020,
title = {Dynamic shifts of visual and saccadic signals in prefrontal cortical regions 8Ar and FEF},
author = {Sanjeev B Khanna and Jonathan A Scott and Matthew A Smith},
doi = {10.1152/jn.00669.2019},
year = {2020},
date = {2020-01-01},
journal = {Journal of Neurophysiology},
volume = {124},
number = {6},
pages = {1774--1791},
abstract = {Active vision is a fundamental process by which primates gather information about the external world. Multiple brain regions have been studied in the context of simple active vision tasks in which a visual target's appearance is temporally separated from saccade execution. Most neurons have tight spatial registration between visual and saccadic signals, and in areas such as prefrontal cortex (PFC), some neurons show persistent delay activity that links visual and motor epochs and has been proposed as a basis for spatial working memory. Many PFC neurons also show rich dynamics, which have been attributed to alternative working memory codes and the representation of other task variables. Our study investigated the transition between processing a visual stimulus and generating an eye movement in populations of PFC neurons in macaque monkeys performing a memory guided saccade task. We found that neurons in two subregions of PFC, the frontal eye fields (FEF) and area 8Ar, differed in their dynamics and spatial response profiles. These dynamics could be attributed largely to shifts in the spatial profile of visual and motor responses in individual neurons. This led to visual and motor codes for particular spatial locations that were instantiated by different mixtures of neurons, which could be important in PFC's flexible role in multiple sensory, cognitive, and motor tasks. NEW & NOTEWORTHY A central question in neuroscience is how the brain transitions from sensory representations to motor outputs. The prefrontal cortex contains neurons that have long been implicated as important in this transition and in working memory. We found evidence for rich and diverse tuning in these neurons, which was often spatially misaligned between visual and saccadic responses. This feature may play an important role in flexible working memory capabilities.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1152/jn.00669.2019

Ricardo Kienitz; Michele A Cox; Kacie Dougherty; Richard C Saunders; Joscha T Schmiedt; David A Leopold; Alexander Maier; Michael C Schmid

Theta, but not gamma oscillations in area V4 depend on input from primary visual cortex Journal Article

Current Biology, pp. 1–12, 2020.

Abstract | Links | BibTeX

@article{Kienitz2020,
title = {Theta, but not gamma oscillations in area V4 depend on input from primary visual cortex},
author = {Ricardo Kienitz and Michele A Cox and Kacie Dougherty and Richard C Saunders and Joscha T Schmiedt and David A Leopold and Alexander Maier and Michael C Schmid},
doi = {10.1016/j.cub.2020.10.091},
year = {2020},
date = {2020-01-01},
journal = {Current Biology},
pages = {1--12},
publisher = {Elsevier Ltd.},
abstract = {Theta (3–9 Hz) and gamma (30–100 Hz) oscillations have been observed at different levels along the hierarchy of cortical areas and across a wide set of cognitive tasks. In the visual system, the emergence of both rhythms in primary visual cortex (V1) and mid-level cortical areas V4 has been linked with variations in perceptual reaction times [1–5]. Based on analytical methods to infer causality in neural activation patterns, it was concluded that gamma and theta oscillations might both reflect feedforward sensory processing from V1 to V4 [6–10]. Here, we report on experiments in macaque monkeys in which we experimentally assessed the presence of both oscillations in the neural activity recorded from multi-electrode arrays in V1 and V4 before and after a permanent V1 lesion. With intact cortex, theta and gamma oscillations could be reliably elicited in V1 and V4 when monkeys viewed a visual contour illusion and showed phase-to-amplitude coupling. Laminar analysis in V1 revealed that both theta and gamma oscillations occurred primarily in the supragranular layers, the cortical output compartment of V1. However, there was a clear dissociation between the two rhythms in V4 that became apparent when the major feedforward input to V4 was removed by lesioning V1: although V1 lesioning eliminated V4 theta, it had little effect on V4 gamma power except for delaying its emergence by >100 ms. These findings suggest that theta is more tightly associated with feedforward processing than gamma and pose limits on the proposed role of gamma as a feedforward mechanism.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.cub.2020.10.091
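
Phase-to-amplitude coupling of the kind reported here is often quantified with a mean-vector-length measure (in the style of Canolty et al., 2006): extract theta phase and gamma amplitude via band-pass filtering and the Hilbert transform, then test how strongly amplitude clusters at a preferred phase. The sketch below applies that generic measure to a synthetic LFP; the band limits and the coupling metric are illustrative assumptions, not necessarily those used in the paper.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=3):
    b, a = butter(order, [lo, hi], btype="bandpass", fs=fs)
    return filtfilt(b, a, x)

def theta_gamma_coupling(lfp, fs, theta=(3.0, 9.0), gamma=(30.0, 100.0)):
    # Mean-vector-length coupling: how strongly gamma amplitude clusters at a
    # preferred theta phase (0 = none; larger values = stronger coupling).
    phase = np.angle(hilbert(bandpass(lfp, theta[0], theta[1], fs)))
    amp = np.abs(hilbert(bandpass(lfp, gamma[0], gamma[1], fs)))
    return np.abs(np.mean(amp * np.exp(1j * phase))) / amp.mean()

# Synthetic LFP: 6 Hz theta whose cycle modulates the envelope of 60 Hz gamma.
fs = 1000.0
t = np.arange(0, 20.0, 1.0 / fs)
rng = np.random.default_rng(6)
theta_wave = np.sin(2 * np.pi * 6 * t)
gamma_wave = (1 + 0.8 * theta_wave) * np.sin(2 * np.pi * 60 * t)
lfp = theta_wave + 0.3 * gamma_wave + rng.normal(0, 0.2, t.size)
print(theta_gamma_coupling(lfp, fs))   # clearly nonzero for this toy signal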

Daniel L Kimmel; Gamaleldin F Elsayed; John P Cunningham; William T Newsome

Value and choice as separable and stable representations in orbitofrontal cortex Journal Article

Nature Communications, 11 , pp. 1–19, 2020.

Abstract | Links | BibTeX

@article{Kimmel2020,
title = {Value and choice as separable and stable representations in orbitofrontal cortex},
author = {Daniel L Kimmel and Gamaleldin F Elsayed and John P Cunningham and William T Newsome},
doi = {10.1038/s41467-020-17058-y},
year = {2020},
date = {2020-01-01},
journal = {Nature Communications},
volume = {11},
pages = {1--19},
publisher = {Springer US},
abstract = {Value-based decision-making requires different variables—including offer value, choice, expected outcome, and recent history—at different times in the decision process. Orbitofrontal cortex (OFC) is implicated in value-based decision-making, but it is unclear how downstream circuits read out complex OFC responses into separate representations of the relevant variables to support distinct functions at specific times. We recorded from single OFC neurons while macaque monkeys made cost-benefit decisions. Using a novel analysis, we find separable neural dimensions that selectively represent the value, choice, and expected reward of the present and previous offers. The representations are generally stable during periods of behavioral relevance, then transition abruptly at key task events and between trials. Applying new statistical methods, we show that the sensitivity, specificity and stability of the representations are greater than expected from the population's low-level features—dimensionality and temporal smoothness—alone. The separability and stability suggest a mechanism—linear summation over static synaptic weights—by which downstream circuits can select for specific variables at specific times.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1038/s41467-020-17058-y

Kenji W Koyano; Adam P Jones; David B T McMahon; Elena N Waidmann; Brian E Russ; David A Leopold

Dynamic suppression of average facial structure shapes neural tuning in three macaque face patches Journal Article

Current Biology, 31, pp. 1–18, 2020.

@article{Koyano2020,
title = {Dynamic suppression of average facial structure shapes neural tuning in three macaque face patches},
author = {Kenji W Koyano and Adam P Jones and David B T McMahon and Elena N Waidmann and Brian E Russ and David A Leopold},
doi = {10.1016/j.cub.2020.09.070},
year = {2020},
date = {2020-01-01},
journal = {Current Biology},
volume = {31},
pages = {1--18},
publisher = {Elsevier Ltd.},
abstract = {The visual perception of identity in humans and other primates is thought to draw upon cortical areas specialized for the analysis of facial structure. A prominent theory of face recognition holds that the brain computes and stores average facial structure, which it then uses to efficiently determine individual identity, though the neural mechanisms underlying this process are controversial. Here, we demonstrate that the dynamic suppression of average facial structure plays a prominent role in the responses of neurons in three fMRI-defined face patches of the macaque. Using photorealistic face stimuli that systematically varied in identity level according to a psychophysically based face space, we found that single units in the AF, AM, and ML face patches exhibited robust tuning around average facial structure. This tuning emerged after the initial excitatory response to the face and was expressed as the selective suppression of sustained responses to low-identity faces. The coincidence of this suppression with increased spike timing synchrony across the population suggests a mechanism of active inhibition underlying this effect. Control experiments confirmed that the diminished responses to low-identity faces were not due to short-term adaptation processes. We propose that the brain's neural suppression of average facial structure facilitates recognition by promoting the extraction of distinctive facial characteristics and suppressing redundant or irrelevant responses across the population.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.cub.2020.09.070

Aravind Krishna; Seiji Tanabe; Adam Kohn

Decision signals in the local field potentials of early and mid-level macaque visual cortex Journal Article

Cerebral Cortex, pp. 1–15, 2020.

@article{Krishna2020,
title = {Decision signals in the local field potentials of early and mid-level macaque visual cortex},
author = {Aravind Krishna and Seiji Tanabe and Adam Kohn},
doi = {10.1093/cercor/bhaa218},
year = {2020},
date = {2020-01-01},
journal = {Cerebral Cortex},
pages = {1--15},
abstract = {The neural basis of perceptual decision making has typically been studied using measurements of single neuron activity, though decisions are likely based on the activity of large neuronal ensembles. Local field potentials (LFPs) may, in some cases, serve as a useful proxy for population activity and thus be useful for understanding the neural basis of perceptual decision making. However, little is known about whether LFPs in sensory areas include decision-related signals. We therefore analyzed LFPs recorded using two 48-electrode arrays implanted in primary visual cortex (V1) and area V4 of macaque monkeys trained to perform a fine orientation discrimination task. We found significant choice information in low (0–30 Hz) and higher (70–500 Hz) frequency components of the LFP, but little information in gamma frequencies (30–70 Hz). Choice information was more robust in V4 than V1 and stronger in LFPs than in simultaneously measured spiking activity. LFP-based choice information included a global component, common across electrodes within an area. Our findings reveal the presence of robust choice-related signals in the LFPs recorded in V1 and V4 and suggest that LFPs may be a useful complement to spike-based analyses of decision making.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1093/cercor/bhaa218

Jan Kubanek; Julian Brown; Patrick Ye; Kim Butts Pauly; Tirin Moore; William Newsome

Remote, brain region-specific control of choice behavior with ultrasonic waves Journal Article

Science Advances, 6, pp. 1–10, 2020.

@article{Kubanek2020,
title = {Remote, brain region-specific control of choice behavior with ultrasonic waves},
author = {Jan Kubanek and Julian Brown and Patrick Ye and Kim Butts Pauly and Tirin Moore and William Newsome},
doi = {10.1126/sciadv.aaz4193},
year = {2020},
date = {2020-01-01},
journal = {Science Advances},
volume = {6},
pages = {1--10},
abstract = {The ability to modulate neural activity in specific brain circuits remotely and systematically could revolutionize studies of brain function and treatments of brain disorders. Sound waves of high frequencies (ultrasound) have shown promise in this respect, combining the ability to modulate neuronal activity with sharp spatial focus. Here, we show that the approach can have potent effects on choice behavior. Brief, low-intensity ultrasound pulses delivered noninvasively into specific brain regions of macaque monkeys influenced their decisions regarding which target to choose. The effects were substantial, leading to around a 2:1 bias in choices compared to the default balanced proportion. The effect presence and polarity was controlled by the specific target region. These results represent a critical step towards the ability to influence choice behavior noninvasively, enabling systematic investigations and treatments of brain circuits underlying disorders of choice.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1126/sciadv.aaz4193

Marcin Leszczyński; Annamaria Barczak; Yoshinao Kajikawa; Istvan Ulbert; Arnaud Y Falchier; Idan Tal; Saskia Haegens; Lucia Melloni; Robert T Knight; Charles E Schroeder

Dissociation of broadband high-frequency activity and neuronal firing in the neocortex Journal Article

Science Advances, 6, pp. 1–13, 2020.

@article{Leszczynski2020,
title = {Dissociation of broadband high-frequency activity and neuronal firing in the neocortex},
author = {Marcin Leszczy{ń}ski and Annamaria Barczak and Yoshinao Kajikawa and Istvan Ulbert and Arnaud Y Falchier and Idan Tal and Saskia Haegens and Lucia Melloni and Robert T Knight and Charles E Schroeder},
doi = {10.1101/531368},
year = {2020},
date = {2020-01-01},
journal = {Science Advances},
volume = {6},
pages = {1--13},
abstract = {Broadband High-frequency Activity (BHA; 70-150 Hz), also known as “high gamma,” a key analytic signal in human intracranial recordings is often assumed to reflect local neural firing (multiunit activity; MUA). Accordingly, BHA has been used to study neuronal population responses in auditory (1,2), visual (3,4), language (5), mnemonic processes (6-9) and cognitive control (10,11). BHA is arguably the electrophysiological measure best correlated with the Blood Oxygenation Level Dependent (BOLD) signal in fMRI (12-13). However, beyond the fact that BHA correlates with neuronal spiking (12, 14-16), the neuronal populations and physiological processes generating BHA are not precisely defined. Although critical for interpreting intracranial signals in human and non-human primates, the precise physiology of BHA remains unknown. Here, we show that BHA dissociates from MUA in primary visual and auditory cortex. Using laminar multielectrode data in monkeys, we found a bimodal distribution of stimulus-evoked BHA across depth of a cortical column: an early-deep, followed by a later-superficial layer response. Only, the early-deep layer BHA had a clear local MUA correlate, while the more prominent superficial layer BHA had a weak or undetectable MUA correlate. In many cases, particularly in V1 (70%), supragranular sites showed strong BHA in lieu of any detectable increase in MUA. Due to volume conduction, BHA from both the early-deep and the later-supragranular generators contribute to the field potential at the pial surface, though the contribution may be weighted towards the late-supragranular BHA. Our results demonstrate that the strongest generators of BHA are in the superficial cortical layers and show that the origins of BHA include a mixture of the neuronal action potential firing and dendritic processes separable from this firing. It is likely that the typically-recorded BHA signal emphasizes the latter processes to a greater extent than previously recognized.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1101/531368

Baowang Li; Brandy N Routh; Daniel Johnston; Eyal Seidemann; Nicholas J Priebe

Voltage-Gated Intrinsic Conductances Shape the Input-Output Relationship of Cortical Neurons in Behaving Primate V1 Journal Article

Neuron, 107 (1), pp. 185–196.e4, 2020.

@article{Li2020a,
title = {Voltage-Gated Intrinsic Conductances Shape the Input-Output Relationship of Cortical Neurons in Behaving Primate V1},
author = {Baowang Li and Brandy N Routh and Daniel Johnston and Eyal Seidemann and Nicholas J Priebe},
doi = {10.1016/j.neuron.2020.04.001},
year = {2020},
date = {2020-01-01},
journal = {Neuron},
volume = {107},
number = {1},
pages = {185--196.e4},
publisher = {Elsevier Inc.},
abstract = {Li et al. used whole-cell recording to reveal a large and unexpected voltage-gated intrinsic conductance that dramatically alters the integrative properties of primate V1 neurons. Therefore, a standard computational model of sensory neurons that incorporates linear integration of synaptic inputs followed by a threshold nonlinearity requires revision.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.neuron.2020.04.001

Zhongqiao Lin; Chechang Nie; Yuanfeng Zhang; Yang Chen; Tianming Yang

Evidence accumulation for value computation in the prefrontal cortex during decision making Journal Article

Proceedings of the National Academy of Sciences, 117 (48), pp. 30728–30737, 2020.

@article{Lin2020ab,
title = {Evidence accumulation for value computation in the prefrontal cortex during decision making},
author = {Zhongqiao Lin and Chechang Nie and Yuanfeng Zhang and Yang Chen and Tianming Yang},
doi = {10.1073/pnas.2019077117},
year = {2020},
date = {2020-01-01},
journal = {Proceedings of the National Academy of Sciences},
volume = {117},
number = {48},
pages = {30728--30737},
abstract = {A key step of decision making is to determine the value associated with each option. The evaluation process often depends on the accumulation of evidence from multiple sources, which may arrive at different times. How evidence is accumulated for value computation in the brain during decision making has not been well studied. To address this problem, we trained rhesus monkeys to perform a decision-making task in which they had to make eye movement choices between two targets, whose reward probabilities had to be determined with the combined evidence from four sequentially presented visual stimuli. We studied the encoding of the reward probabilities associated with the stimuli and the eye movements in the orbitofrontal (OFC) and the dorsolateral prefrontal (DLPFC) cortices during the decision process. We found that the OFC neurons encoded the reward probability associated with individual pieces of evidence in the stimulus domain. Importantly, the representation of the reward probability in the OFC was transient, and the OFC did not encode the reward probability associated with the combined evidence from multiple stimuli. The computation of the combined reward probabilities was observed only in the DLPFC and only in the action domain. Furthermore, the reward probability encoding in the DLPFC exhibited an asymmetric pattern of mixed selectivity that supported the computation of the stimulus-to-action transition of reward information. Our results reveal that the OFC and the DLPFC play distinct roles in the value computation during evidence accumulation.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1073/pnas.2019077117

Ye Liu; Ming Li; Xian Zhang; Yiliang Lu; Hongliang Gong; Jiapeng Yin; Zheyuan Chen; Liling Qian; Yupeng Yang; Ian Max Andolina; Stewart Shipp; Niall Mcloughlin; Shiming Tang; Wei Wang

Hierarchical Representation for Chromatic Processing across Macaque V1, V2, and V4 Journal Article

Neuron, 108 (3), pp. 538–550.e5, 2020.

@article{Liu2020f,
title = {Hierarchical Representation for Chromatic Processing across Macaque V1, V2, and V4},
author = {Ye Liu and Ming Li and Xian Zhang and Yiliang Lu and Hongliang Gong and Jiapeng Yin and Zheyuan Chen and Liling Qian and Yupeng Yang and Ian Max Andolina and Stewart Shipp and Niall Mcloughlin and Shiming Tang and Wei Wang},
doi = {10.1016/j.neuron.2020.07.037},
year = {2020},
date = {2020-01-01},
journal = {Neuron},
volume = {108},
number = {3},
pages = {538--550.e5},
publisher = {Elsevier Inc.},
abstract = {How does our visual brain generate perceptual color space? Liu et al. find that within a uniform blob-like architecture of hue responses, chromotopic maps develop progressively in scale and precision along the visual hierarchy of macaque V1, V2, and V4. Such hierarchical refinement improves spectral uniformity, better reflecting color perception.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.neuron.2020.07.037

Adi Lixenberg; Merav Yarkoni; Yehudit Botschko; Mati Joshua

Encoding of eye movements explains reward-related activity in cerebellar simple spikes Journal Article

Journal of Neurophysiology, 123 (2), pp. 786–799, 2020.

@article{Lixenberg2020,
title = {Encoding of eye movements explains reward-related activity in cerebellar simple spikes},
author = {Adi Lixenberg and Merav Yarkoni and Yehudit Botschko and Mati Joshua},
doi = {10.1152/JN.00363.2019},
year = {2020},
date = {2020-01-01},
journal = {Journal of Neurophysiology},
volume = {123},
number = {2},
pages = {786--799},
abstract = {The cerebellum exhibits both motor and reward-related signals. However, it remains unclear whether reward is processed independently from the motor command or might reflect the motor consequences of the reward drive. To test how reward-related signals interact with sensorimotor processing in the cerebellum, we recorded Purkinje cell simple spike activity in the cerebellar floccular complex while monkeys were engaged in smooth pursuit eye movement tasks. The color of the target signaled the size of the reward the monkeys would receive at the end of the target motion. When the tracking task presented a single target, both pursuit and neural activity were only slightly modulated by the reward size. The reward modulations in single cells were rarely large enough to be detected. These modulations were only significant in the population analysis when we averaged across many neurons. In two-target tasks where the monkey learned to select based on the size of the reward outcome, both behavior and neural activity adapted rapidly. In both the single- and two-target tasks, the size of the reward-related modulation matched the size of the effect of reward on behavior. Thus, unlike cortical activity in eye movement structures, the reward-related signals could not be dissociated from the motor command. These results suggest that reward information is integrated with the eye movement command upstream of the Purkinje cells in the floccular complex. Thus reward-related modulations of the simple spikes are akin to modulations found in motor behavior and not to the central processing of the reward value. NEW & NOTEWORTHY Disentangling sensorimotor and reward signals is only possible if these signals do not completely overlap. We recorded activity in the floccular complex of the cerebellum while monkeys performed tasks designed to separate representations of reward from those of movement. Activity modulation by reward could be accounted for by the coding of eye movement parameters, suggesting that reward information is already integrated into motor commands upstream of the floccular complex.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1152/JN.00363.2019

Kaleb A Lowe; Wolf Zinke; Anthony M Phipps; Josh Cosman; Micala Maddox; Jeffrey D Schall; Charles F Caskey

Visuomotor transformations are modulated by focused ultrasound over frontal eye field Journal Article

Ultrasound in Medicine & Biology, pp. 1–14, 2020.

@article{Lowe2020,
title = {Visuomotor transformations are modulated by focused ultrasound over frontal eye field},
author = {Kaleb A Lowe and Wolf Zinke and Anthony M Phipps and Josh Cosman and Micala Maddox and Jeffrey D Schall and Charles F Caskey},
doi = {10.1016/j.ultrasmedbio.2020.11.022},
year = {2020},
date = {2020-01-01},
journal = {Ultrasound in Medicine & Biology},
pages = {1--14},
abstract = {Neuromodulation with focused ultrasound (FUS) is being widely explored as a non-invasive tool to stimulate focal brain regions because of its superior spatial resolution and coverage compared with other neuromodulation methods. The precise effects of FUS stimulation on specific regions of the brain are not yet fully understood. Here, we characterized the behavioral effects of FUS stimulation directly applied through a craniotomy over the macaque frontal eye field (FEF). In macaque monkeys making directed eye movements to perform visual search tasks with direct or arbitrary responses, focused ultrasound was applied through a craniotomy over the FEF. Saccade response times (RTs) and error rates were determined for trials without or with FUS stimulation with pulses at a peak negative pressure of either 250 or 425 kPa. Both RTs and error rates were affected by FUS. Responses toward a target located contralateral to the FUS stimulation were approximately 3 ms slower in the presence of FUS in both monkeys studied, while only one exhibited a slowing of responses for ipsilateral targets. Error rates were lower in one monkey in this study. In another search task requiring making eye movements toward a target (pro-saccades) or in the opposite direction (anti-saccades), the RT for pro-saccades increased in the presence of FUS stimulation. Our results indicate the effectiveness of FUS to modulate saccadic responses when stimulating FEF in awake, behaving non-human primates.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.ultrasmedbio.2020.11.022

Liya Ma; Janahan Selvanayagam; Maryam Ghahremani; Lauren K Hayrynen; Kevin D Johnston; Stefan Everling

Single-unit activity in marmoset posterior parietal cortex in a gap saccade task Journal Article

Journal of Neurophysiology, 123 (3), pp. 896–911, 2020.

@article{Ma2020e,
title = {Single-unit activity in marmoset posterior parietal cortex in a gap saccade task},
author = {Liya Ma and Janahan Selvanayagam and Maryam Ghahremani and Lauren K Hayrynen and Kevin D Johnston and Stefan Everling},
doi = {10.1152/JN.00614.2019},
year = {2020},
date = {2020-01-01},
journal = {Journal of Neurophysiology},
volume = {123},
number = {3},
pages = {896--911},
abstract = {Abnormal saccadic eye movements can serve as biomarkers for patients with several neuropsychiatric disorders. The common marmoset (Callithrix jacchus) is becoming increasingly popular as a nonhuman primate model to investigate the cortical mechanisms of saccadic control. Recently, our group demonstrated that microstimulation in the posterior parietal cortex (PPC) of marmosets elicits contralateral saccades. Here we recorded single-unit activity in the PPC of the same two marmosets using chronic microelectrode arrays while the monkeys performed a saccadic task with gap trials (target onset lagged fixation point offset by 200 ms) interleaved with step trials (fixation point disappeared when the peripheral target appeared). Both marmosets showed a gap effect, shorter saccadic reaction times (SRTs) in gap vs. step trials. On average, stronger gap-period responses across the entire neuronal population preceded shorter SRTs on trials with contralateral targets although this correlation was stronger among the 15% “gap neurons,” which responded significantly during the gap. We also found 39% “target neurons” with significant saccadic target-related responses, which were stronger in gap trials and correlated with the SRTs better than the remaining neurons. Compared with saccades with relatively long SRTs, short-SRT saccades were preceded by both stronger gap-related and target-related responses in all PPC neurons, regardless of whether such response reached significance. Our findings suggest that the PPC in the marmoset contains an area that is involved in the modulation of saccadic preparation. NEW & NOTEWORTHY As a primate model in systems neuroscience, the marmoset is a great complement to the macaque monkey because of its unique advantages. To identify oculomotor networks in the marmoset, we recorded from the marmoset posterior parietal cortex during a saccadic task and found single-unit activities consistent with a role in saccadic modulation. This finding supports the marmoset as a valuable model for studying oculomotor control.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1152/JN.00614.2019

Tatiana Malevich; Antimo Buonocore; Ziad M Hafed

Rapid stimulus-driven modulation of slow ocular position drifts Journal Article

eLife, 9, pp. 1–22, 2020.

@article{Malevich2020,
title = {Rapid stimulus-driven modulation of slow ocular position drifts},
author = {Tatiana Malevich and Antimo Buonocore and Ziad M Hafed},
doi = {10.7554/ELIFE.57595},
year = {2020},
date = {2020-01-01},
journal = {eLife},
volume = {9},
pages = {1--22},
abstract = {The eyes are never still during maintained gaze fixation. When microsaccades are not occurring, ocular position exhibits continuous slow changes, often referred to as drifts. Unlike microsaccades, drifts remain to be viewed as largely random eye movements. Here we found that ocular position drifts can, instead, be very systematically stimulus-driven, and with very short latencies. We used highly precise eye tracking in three well trained macaque monkeys and found that even fleeting (~8 ms duration) stimulus presentations can robustly trigger transient and stimulus-specific modulations of ocular position drifts, and with only approximately 60 ms latency. Such drift responses are binocular, and they are most effectively elicited with large stimuli of low spatial frequency. Intriguingly, the drift responses exhibit some image pattern selectivity, and they are not explained by convergence responses, pupil constrictions, head movements, or starting eye positions. Ocular position drifts have very rapid access to exogenous visual information.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.7554/ELIFE.57595

Vahid Mehrpour; Julio C Martinez-Trujillo; Stefan Treue

Attention amplifies neural representations of changes in sensory input at the expense of perceptual accuracy Journal Article

Nature Communications, 11, pp. 1–8, 2020.

@article{Mehrpour2020,
title = {Attention amplifies neural representations of changes in sensory input at the expense of perceptual accuracy},
author = {Vahid Mehrpour and Julio C Martinez-Trujillo and Stefan Treue},
doi = {10.1038/s41467-020-15989-0},
year = {2020},
date = {2020-01-01},
journal = {Nature Communications},
volume = {11},
pages = {1--8},
publisher = {Springer US},
abstract = {Attention enhances the neural representations of behaviorally relevant stimuli, typically by a push–pull increase of the neuronal response gain to attended vs. unattended stimuli. This selectively improves perception and consequently behavioral performance. However, to enhance the detectability of stimulus changes, attention might also distort neural representations, compromising accurate stimulus representation. We test this hypothesis by recording neural responses in the visual cortex of rhesus monkeys during a motion direction change detection task. We find that attention indeed amplifies the neural representation of direction changes, beyond a similar effect of adaptation. We further show that humans overestimate such direction changes, providing a perceptual correlate of our neurophysiological observations. Our results demonstrate that attention distorts the neural representations of abrupt sensory changes and consequently perceptual accuracy. This likely represents an evolutionary adaptive mechanism that allows sensory systems to flexibly forgo accurate representation of stimulus features to improve the encoding of stimulus change.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1038/s41467-020-15989-0

Atsushi Noritake; Taihei Ninomiya; Masaki Isoda

Representation of distinct reward variables for self and other in primate lateral hypothalamus Journal Article

Proceedings of the National Academy of Sciences, 117 (10), pp. 5516–5524, 2020.

@article{Noritake2020,
title = {Representation of distinct reward variables for self and other in primate lateral hypothalamus},
author = {Atsushi Noritake and Taihei Ninomiya and Masaki Isoda},
doi = {10.1073/pnas.1917156117},
year = {2020},
date = {2020-01-01},
journal = {Proceedings of the National Academy of Sciences},
volume = {117},
number = {10},
pages = {5516--5524},
abstract = {The lateral hypothalamus (LH) has long been implicated in maintaining behavioral homeostasis essential for the survival of an individual. However, recent evidence suggests its more widespread roles in behavioral coordination, extending to the social domain. The neuronal and circuit mechanisms behind the LH processing of social information are unknown. Here, we show that the LH represents distinct reward variables for “self” and “other” and is causally involved in shaping socially motivated behavior. During a Pavlovian conditioning procedure incorporating ubiquitous social experiences where rewards to others affect one's motivation, LH cells encoded the subjective value of self-rewards, as well as the likelihood of self- or other-rewards. The other-reward coding was not a general consequence of other's existence, but a specific effect of other's reward availability. Coherent activity with and top-down information flow from the medial prefrontal cortex, a hub of social brain networks, contributed to signal encoding in the LH. Furthermore, deactivation of LH cells eliminated the motivational impact of other-rewards. These results indicate that the LH constitutes a subcortical node in social brain networks and shapes one's motivation by integrating cortically derived, agent-specific reward information.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1073/pnas.1917156117

Wei Song Ong; Seth Madlon-Kay; Michael L Platt

Neuronal correlates of strategic cooperation in monkeys Journal Article

Nature Neuroscience, 24, pp. 1–22, 2020.

@article{Ong2020c,
title = {Neuronal correlates of strategic cooperation in monkeys},
author = {Wei Song Ong and Seth Madlon-Kay and Michael L Platt},
doi = {10.1038/s41593-020-00746-9},
year = {2020},
date = {2020-01-01},
journal = {Nature Neuroscience},
volume = {24},
pages = {1--22},
publisher = {Springer US},
abstract = {We recorded neural activity in male monkeys playing a variant of the game ‘chicken' in which they made decisions to cooperate or not cooperate to obtain rewards of different sizes. Neurons in the middle superior temporal sulcus (mSTS)—previously implicated in social perception—signaled strategic information, including payoffs, intentions of the other player, reward outcomes and predictions about the other player. Moreover, a subpopulation of mSTS neurons selectively signaled cooperatively obtained rewards. Neurons in the anterior cingulate gyrus, previously implicated in vicarious reinforcement and empathy, carried less information about strategic variables, especially cooperative reward. Strategic signals were not reducible to perceptual information about the other player or motor contingencies. These findings suggest that the capacity to compute models of other agents has deep roots in the strategic social behavior of primates and that the anterior cingulate gyrus and the mSTS support these computations.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1038/s41593-020-00746-9

Tyler R Peel; Suryadeep Dash; Stephen G Lomber; Brian D Corneil

Frontal eye field inactivation alters the readout of superior colliculus activity for saccade generation in a task-dependent manner Journal Article

Journal of Computational Neuroscience, pp. 1–21, 2020.

@article{Peel2020,
title = {Frontal eye field inactivation alters the readout of superior colliculus activity for saccade generation in a task-dependent manner},
author = {Tyler R Peel and Suryadeep Dash and Stephen G Lomber and Brian D Corneil},
doi = {10.1101/646604},
year = {2020},
date = {2020-01-01},
journal = {Journal of Computational Neuroscience},
pages = {1--21},
publisher = {Journal of Computational Neuroscience},
abstract = {Saccades require a spatiotemporal transformation of activity between the intermediate layers of the superior colliculus (iSC) and downstream brainstem burst generator. The dynamic linear ensemble-coding model (Goossens and Van Opstal, 2006) proposes that each iSC spike contributes a fixed mini-vector to saccade displacement. Although biologically-plausible, this model assumes cortical areas like the frontal eye fields (FEF) simply provide the saccadic goal to be executed by the iSC and brainstem burst generator. However, the FEF and iSC operate in unison during saccades, and a pathway from the FEF to the brainstem burst generator that bypasses the iSC exists. Here, we investigate the impact of large yet reversible inactivation of the FEF on iSC activity in the context of the model across four saccade tasks. We exploit the overlap of saccade vectors generated when the FEF is inactivated or not, comparing the number of iSC spikes for metrically-matched saccades. We found that the iSC emits fewer spikes for metrically-matched saccades during FEF inactivation. The decrease in spike count is task-dependent, with a greater decrease accompanying more cognitively-demanding saccades. Our results show that FEF integrity influences the readout of iSC activity in a task-dependent manner. We propose that the dynamic linear ensemble-coding model be modified so that FEF inactivation increases the gain of a readout parameter, effectively increasing the influence of a single iSC spike. We speculate that this modification could be instantiated by a direct pathway from the FEF to the omnipause region that modulates the excitability of the brainstem burst generator. Significance statement One of the enduring puzzles in the oculomotor system is how it achieves the spatiotemporal transformation, converting spatial activity within the intermediate layers of the superior colliculus (iSC) into a rate code within the brainstem burst generator. The spatiotemporal transformation has traditionally been viewed as the purview of the oculomotor brainstem. Here, within the context of testing a biologically-plausible model of the spatiotemporal transformation, we show that reversible inactivation of the frontal eye fields (FEF) decreases the number of spikes issued by the iSC for metrically-matched saccades, with greater decreases accompanying more cognitively-demanding tasks. These results show that signals from the FEF influence the spatiotemporal transformation.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1101/646604

Sorin A Pojoga; Natasha Kharas; Valentin Dragoi

Perceptually unidentifiable stimuli influence cortical processing and behavioral performance Journal Article

Nature Communications, 11, pp. 1–12, 2020.

@article{Pojoga2020,
title = {Perceptually unidentifiable stimuli influence cortical processing and behavioral performance},
author = {Sorin A Pojoga and Natasha Kharas and Valentin Dragoi},
doi = {10.1038/s41467-020-19848-w},
year = {2020},
date = {2020-01-01},
journal = {Nature Communications},
volume = {11},
pages = {1--12},
abstract = {Our daily behavior is dynamically influenced by conscious and unconscious processes. Although the neural bases of conscious experience have been extensively investigated over the past several decades, how unconscious information impacts neural circuitry and behavior remains unknown. Here, we recorded populations of neurons in macaque primary visual cortex (V1) to find that perceptually unidentifiable stimuli repeatedly presented in the absence of awareness are encoded by neural populations in a way that facilitates their future processing in the context of a behavioral task. Such exposure increases stimulus sensitivity and information encoded in cell populations, even though animals are unaware of stimulus identity. This phenomenon is consistent with a Hebbian mechanism underlying an increase in functional connectivity specifically for the neurons activated by subthreshold stimuli. This form of unsupervised adaptation may constitute a vestigial pre-attention system using the mere frequency of stimulus occurrence to change stimulus representations even when sensory inputs are perceptually invisible.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Close

Our daily behavior is dynamically influenced by conscious and unconscious processes. Although the neural bases of conscious experience have been extensively investigated over the past several decades, how unconscious information impacts neural circuitry and behavior remains unknown. Here, we recorded populations of neurons in macaque primary visual cortex (V1) to find that perceptually unidentifiable stimuli repeatedly presented in the absence of awareness are encoded by neural populations in a way that facilitates their future processing in the context of a behavioral task. Such exposure increases stimulus sensitivity and information encoded in cell populations, even though animals are unaware of stimulus identity. This phenomenon is consistent with a Hebbian mechanism underlying an increase in functional connectivity specifically for the neurons activated by subthreshold stimuli. This form of unsupervised adaptation may constitute a vestigial pre-attention system using the mere frequency of stimulus occurrence to change stimulus representations even when sensory inputs are perceptually invisible.

  • doi:10.1038/s41467-020-19848-w

Joern K Pomper; Silvia Spadacenta; Friedemann Bunjes; Daniel Arnstein; Martin A Giese; Peter Thier

Representation of the observer's predicted outcome value in mirror and nonmirror neurons of macaque F5 ventral premotor cortex Journal Article

Journal of Neurophysiology, 124 (3), pp. 941–961, 2020.

Abstract | Links | BibTeX

@article{Pomper2020,
title = {Representation of the observer's predicted outcome value in mirror and nonmirror neurons of macaque F5 ventral premotor cortex},
author = {Joern K Pomper and Silvia Spadacenta and Friedemann Bunjes and Daniel Arnstein and Martin A Giese and Peter Thier},
doi = {10.1152/jn.00234.2020},
year = {2020},
date = {2020-01-01},
journal = {Journal of Neurophysiology},
volume = {124},
number = {3},
pages = {941--961},
abstract = {In the search for the function of mirror neurons, a previous study reported that F5 mirror neuron responses are modulated by the value that the observing monkey associates with the grasped object. Yet we do not know whether mirror neurons are modulated by the expected reward value for the observer or also by other variables, which are causally dependent on value (e.g., motivation, attention directed at the observed action, arousal). To clarify this, we trained two rhesus macaques to observe a grasping action on an object kept constant, followed by four fully predictable outcomes of different values (2 outcomes with positive and 2 with negative emotional valence). We found a consistent order in population activity of both mirror and nonmirror neurons that matches the order of the value of this predicted outcome but that does not match the order of the above-mentioned value-dependent variables. These variables were inferred from the probability not to abort a trial, saccade latency, modulation of eye position during action observation, heart rate, and pupil size. Moreover, we found subpopulations of neurons tuned to each of the four predicted outcome values. Multidimensional scaling revealed equal normalized distances of 0.25 between the two positive and between the two negative outcomes suggesting the representation of a relative value, scaled to the task setting. We conclude that F5 mirror neurons and nonmirror neurons represent the observer's predicted outcome value, which in the case of mirror neurons may be transferred to the observed object or action. NEW & NOTEWORTHY Both the populations of F5 mirror neurons and nonmirror neurons represent the predicted value of an outcome resulting from the observation of a grasping action. Value-dependent motivation, arousal, and attention directed at the observed action do not provide a better explanation for this representation. The population activity's metric suggests an optimal scaling of value representation to task setting.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

In the search for the function of mirror neurons, a previous study reported that F5 mirror neuron responses are modulated by the value that the observing monkey associates with the grasped object. Yet we do not know whether mirror neurons are modulated by the expected reward value for the observer or also by other variables, which are causally dependent on value (e.g., motivation, attention directed at the observed action, arousal). To clarify this, we trained two rhesus macaques to observe a grasping action on an object kept constant, followed by four fully predictable outcomes of different values (2 outcomes with positive and 2 with negative emotional valence). We found a consistent order in population activity of both mirror and nonmirror neurons that matches the order of the value of this predicted outcome but that does not match the order of the above-mentioned value-dependent variables. These variables were inferred from the probability not to abort a trial, saccade latency, modulation of eye position during action observation, heart rate, and pupil size. Moreover, we found subpopulations of neurons tuned to each of the four predicted outcome values. Multidimensional scaling revealed equal normalized distances of 0.25 between the two positive and between the two negative outcomes suggesting the representation of a relative value, scaled to the task setting. We conclude that F5 mirror neurons and nonmirror neurons represent the observer's predicted outcome value, which in the case of mirror neurons may be transferred to the observed object or action. NEW & NOTEWORTHY Both the populations of F5 mirror neurons and nonmirror neurons represent the predicted value of an outcome resulting from the observation of a grasping action. Value-dependent motivation, arousal, and attention directed at the observed action do not provide a better explanation for this representation. The population activity's metric suggests an optimal scaling of value representation to task setting.

  • doi:10.1152/jn.00234.2020

Pierre Pouget; Stephen Frey; Harry Ahnine; David Attali; Julien Claron; Charlotte Constans; Jean Francois Aubry; Fabrice Arcizet

Neuronavigated repetitive transcranial ultrasound stimulation induces long-lasting and reversible effects on oculomotor performance in non-human primates Journal Article

Frontiers in Physiology, 11 , pp. 1–13, 2020.

Abstract | Links | BibTeX

@article{Pouget2020,
title = {Neuronavigated repetitive transcranial ultrasound stimulation induces long-lasting and reversible effects on oculomotor performance in non-human primates},
author = {Pierre Pouget and Stephen Frey and Harry Ahnine and David Attali and Julien Claron and Charlotte Constans and Jean Francois Aubry and Fabrice Arcizet},
doi = {10.3389/fphys.2020.01042},
year = {2020},
date = {2020-01-01},
journal = {Frontiers in Physiology},
volume = {11},
pages = {1--13},
abstract = {Since the late 2010s, Transcranial Ultrasound Stimulation (TUS) has been used experimentally to carry out safe, non-invasive stimulation of the brain with better spatial resolution than Transcranial Magnetic Stimulation (TMS). This innovative stimulation method has emerged as a novel and valuable device for studying brain function in humans and animals. In particular, single pulses of TUS directed to oculomotor regions have been shown to modulate visuomotor behavior of non-human primates during 100 ms ultrasound pulses. In the present study, a sustained effect was induced by applying 20-s trains of neuronavigated repetitive Transcranial Ultrasound Stimulation (rTUS) to oculomotor regions of the frontal cortex in three non-human primates performing an antisaccade task. With the help of MRI imaging and a frame-less stereotactic neuronavigation system (SNS), we were able to demonstrate that neuronavigated TUS (outside of the MRI scanner) is an efficient tool to carry out neuromodulation procedures in non-human primates. We found that, following neuronavigated rTUS, saccades were significantly modified, resulting in shorter latencies compared to no-rTUS trials. This behavioral modulation was maintained for up to 20 min. Oculomotor behavior returned to baseline after 18–31 min and could not be significantly distinguished from the no-rTUS condition. This study is the first to show that neuronavigated rTUS can have a persistent effect on monkey behavior with a quantified return-time to baseline. The specificity of the effects could not be explained by auditory confounds.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Since the late 2010s, Transcranial Ultrasound Stimulation (TUS) has been used experimentally to carry out safe, non-invasive stimulation of the brain with better spatial resolution than Transcranial Magnetic Stimulation (TMS). This innovative stimulation method has emerged as a novel and valuable device for studying brain function in humans and animals. In particular, single pulses of TUS directed to oculomotor regions have been shown to modulate visuomotor behavior of non-human primates during 100 ms ultrasound pulses. In the present study, a sustained effect was induced by applying 20-s trains of neuronavigated repetitive Transcranial Ultrasound Stimulation (rTUS) to oculomotor regions of the frontal cortex in three non-human primates performing an antisaccade task. With the help of MRI imaging and a frame-less stereotactic neuronavigation system (SNS), we were able to demonstrate that neuronavigated TUS (outside of the MRI scanner) is an efficient tool to carry out neuromodulation procedures in non-human primates. We found that, following neuronavigated rTUS, saccades were significantly modified, resulting in shorter latencies compared to no-rTUS trials. This behavioral modulation was maintained for up to 20 min. Oculomotor behavior returned to baseline after 18–31 min and could not be significantly distinguished from the no-rTUS condition. This study is the first to show that neuronavigated rTUS can have a persistent effect on monkey behavior with a quantified return-time to baseline. The specificity of the effects could not be explained by auditory confounds.

  • doi:10.3389/fphys.2020.01042

Paul Henri Prévot; Kevin Gehere; Fabrice Arcizet; Himanshu Akolkar; Mina A Khoei; Kévin Blaize; Omar Oubari; Pierre Daye; Marion Lanoë; Manon Valet; Sami Dalouz; Paul Langlois; Elric Esposito; Valérie Forster; Elisabeth Dubus; Nicolas Wattiez; Elena Brazhnikova; Céline Nouvel-Jaillard; Yannick LeMer; Joanna Demilly; Claire Maëlle Fovet; Philippe Hantraye; Morgane Weissenburger; Henri Lorach; Elodie Bouillet; Martin Deterre; Ralf Hornig; Guillaume Buc; José Alain Sahel; Guillaume Chenegros; Pierre Pouget; Ryad Benosman; Serge Picaud

Behavioural responses to a photovoltaic subretinal prosthesis implanted in non-human primates Journal Article

Nature Biomedical Engineering, 4 (2), pp. 172–180, 2020.

Abstract | Links | BibTeX

@article{Prevot2020,
title = {Behavioural responses to a photovoltaic subretinal prosthesis implanted in non-human primates},
author = {Paul Henri Prévot and Kevin Gehere and Fabrice Arcizet and Himanshu Akolkar and Mina A Khoei and Kévin Blaize and Omar Oubari and Pierre Daye and Marion Lano{ë} and Manon Valet and Sami Dalouz and Paul Langlois and Elric Esposito and Valérie Forster and Elisabeth Dubus and Nicolas Wattiez and Elena Brazhnikova and Céline Nouvel-Jaillard and Yannick LeMer and Joanna Demilly and Claire Ma{ë}lle Fovet and Philippe Hantraye and Morgane Weissenburger and Henri Lorach and Elodie Bouillet and Martin Deterre and Ralf Hornig and Guillaume Buc and José Alain Sahel and Guillaume Chenegros and Pierre Pouget and Ryad Benosman and Serge Picaud},
doi = {10.1038/s41551-019-0484-2},
year = {2020},
date = {2020-01-01},
journal = {Nature Biomedical Engineering},
volume = {4},
number = {2},
pages = {172--180},
abstract = {Retinal dystrophies and age-related macular degeneration related to photoreceptor degeneration can cause blindness. In blind patients, although the electrical activation of the residual retinal circuit can provide useful artificial visual perception, the resolutions of current retinal prostheses have been limited either by large electrodes or small numbers of pixels. Here we report the evaluation, in three awake non-human primates, of a previously reported near-infrared-light-sensitive photovoltaic subretinal prosthesis. We show that multipixel stimulation of the prosthesis within radiation safety limits enabled eye tracking in the animals, that they responded to stimulations directed at the implant with repeated saccades and that the implant-induced responses were present two years after device implantation. Our findings pave the way for the clinical evaluation of the prosthesis in patients affected by dry atrophic age-related macular degeneration.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Retinal dystrophies and age-related macular degeneration related to photoreceptor degeneration can cause blindness. In blind patients, although the electrical activation of the residual retinal circuit can provide useful artificial visual perception, the resolutions of current retinal prostheses have been limited either by large electrodes or small numbers of pixels. Here we report the evaluation, in three awake non-human primates, of a previously reported near-infrared-light-sensitive photovoltaic subretinal prosthesis. We show that multipixel stimulation of the prosthesis within radiation safety limits enabled eye tracking in the animals, that they responded to stimulations directed at the implant with repeated saccades and that the implant-induced responses were present two years after device implantation. Our findings pave the way for the clinical evaluation of the prosthesis in patients affected by dry atrophic age-related macular degeneration.

  • doi:10.1038/s41551-019-0484-2

Rishi Rajalingham; Kohitij Kar; Sachi Sanghavi; Stanislas Dehaene; James J DiCarlo

The inferior temporal cortex is a potential cortical precursor of orthographic processing in untrained monkeys Journal Article

Nature Communications, 11 (1), pp. 1–13, 2020.

Abstract | Links | BibTeX

@article{Rajalingham2020,
title = {The inferior temporal cortex is a potential cortical precursor of orthographic processing in untrained monkeys},
author = {Rishi Rajalingham and Kohitij Kar and Sachi Sanghavi and Stanislas Dehaene and James J DiCarlo},
doi = {10.1038/s41467-020-17714-3},
year = {2020},
date = {2020-01-01},
journal = {Nature Communications},
volume = {11},
number = {1},
pages = {1--13},
publisher = {Springer US},
abstract = {The ability to recognize written letter strings is foundational to human reading, but the underlying neuronal mechanisms remain largely unknown. Recent behavioral research in baboons suggests that non-human primates may provide an opportunity to investigate this question. We recorded the activity of hundreds of neurons in V4 and the inferior temporal cortex (IT) while naïve macaque monkeys passively viewed images of letters, English words and non-word strings, and tested the capacity of those neuronal representations to support a battery of orthographic processing tasks. We found that simple linear read-outs of IT (but not V4) population responses achieved high performance on all tested tasks, even matching the performance and error patterns of baboons on word classification. These results show that the IT cortex of untrained primates can serve as a precursor of orthographic processing, suggesting that the acquisition of reading in humans relies on the recycling of a brain network evolved for other visual functions.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

The ability to recognize written letter strings is foundational to human reading, but the underlying neuronal mechanisms remain largely unknown. Recent behavioral research in baboons suggests that non-human primates may provide an opportunity to investigate this question. We recorded the activity of hundreds of neurons in V4 and the inferior temporal cortex (IT) while naïve macaque monkeys passively viewed images of letters, English words and non-word strings, and tested the capacity of those neuronal representations to support a battery of orthographic processing tasks. We found that simple linear read-outs of IT (but not V4) population responses achieved high performance on all tested tasks, even matching the performance and error patterns of baboons on word classification. These results show that the IT cortex of untrained primates can serve as a precursor of orthographic processing, suggesting that the acquisition of reading in humans relies on the recycling of a brain network evolved for other visual functions.

  • doi:10.1038/s41467-020-17714-3

Sina Salehi; Mohammad Reza A Dehaqani; Behrad Noudoost; Hossein Esteky

Distinct mechanisms of face representation by enhancive and suppressive neurons of the inferior temporal cortex Journal Article

Journal of Neurophysiology, 124 (4), pp. 1216–1228, 2020.

Abstract | Links | BibTeX

@article{Salehi2020,
title = {Distinct mechanisms of face representation by enhancive and suppressive neurons of the inferior temporal cortex},
author = {Sina Salehi and Mohammad Reza A Dehaqani and Behrad Noudoost and Hossein Esteky},
doi = {10.1152/jn.00203.2020},
year = {2020},
date = {2020-01-01},
journal = {Journal of Neurophysiology},
volume = {124},
number = {4},
pages = {1216--1228},
abstract = {Face-selective neurons in the inferior temporal (IT) cortex respond to faces by either increasing (ENH) or decreasing (SUP) their spiking activities compared with their baseline. Although nearly half of IT face neurons are selectively suppressed by face stimulation, their role in face representation is not clear. To address this issue, we recorded the spiking activities and local field potential (LFP) from IT cortex of three monkeys while they viewed a large set of visual stimuli. LFP high-gamma (HG-LFP) power indicated the presence of both ENH and SUP face-selective neural clusters in IT cortex. The magnitude of HG-LFP power of the recording sites was correlated with the magnitude of change in the evoked spiking activities of its constituent neurons for both ENH and SUP face clusters. Spatial distribution of the ENH and SUP face clusters suggests the presence of a complex and heterogeneous face hypercluster organization in IT cortex. Importantly, ENH neurons conveyed more face category and SUP neurons conveyed more face identity information at both the single-unit and neuronal population levels. Onset and peak of suppressive single-unit, neuronal population, and HG-LFP power activities lagged those of the ENH ones. These results demonstrate that IT neuronal code for face representation is optimized by increasing sparseness through selective suppression of a subset of face neurons. We suggest that IT cortex contains spatial clusters of both ENH and SUP face neurons with distinct specialized functional role in face representation. NEW & NOTEWORTHY Electrophysiological and imaging studies have suggested that face information is encoded by a network of clusters of enhancive face-selective neurons in the visual cortex of man and monkey. We show that nearly half of face-selective neurons are suppressed by face stimulation. The suppressive neurons form spatial clusters and convey more face identity information than the enhancive face neurons. Our results suggest the presence of two neuronal subsystems for coarse and fine face information processing.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Face-selective neurons in the inferior temporal (IT) cortex respond to faces by either increasing (ENH) or decreasing (SUP) their spiking activities compared with their baseline. Although nearly half of IT face neurons are selectively suppressed by face stimulation, their role in face representation is not clear. To address this issue, we recorded the spiking activities and local field potential (LFP) from IT cortex of three monkeys while they viewed a large set of visual stimuli. LFP high-gamma (HG-LFP) power indicated the presence of both ENH and SUP face-selective neural clusters in IT cortex. The magnitude of HG-LFP power of the recording sites was correlated with the magnitude of change in the evoked spiking activities of its constituent neurons for both ENH and SUP face clusters. Spatial distribution of the ENH and SUP face clusters suggests the presence of a complex and heterogeneous face hypercluster organization in IT cortex. Importantly, ENH neurons conveyed more face category and SUP neurons conveyed more face identity information at both the single-unit and neuronal population levels. Onset and peak of suppressive single-unit, neuronal population, and HG-LFP power activities lagged those of the ENH ones. These results demonstrate that IT neuronal code for face representation is optimized by increasing sparseness through selective suppression of a subset of face neurons. We suggest that IT cortex contains spatial clusters of both ENH and SUP face neurons with distinct specialized functional role in face representation. NEW & NOTEWORTHY Electrophysiological and imaging studies have suggested that face information is encoded by a network of clusters of enhancive face-selective neurons in the visual cortex of man and monkey. We show that nearly half of face-selective neurons are suppressed by face stimulation. The suppressive neurons form spatial clusters and convey more face identity information than the enhancive face neurons. Our results suggest the presence of two neuronal subsystems for coarse and fine face information processing.

  • doi:10.1152/jn.00203.2020

David J Schaeffer; Janahan Selvanayagam; Kevin D Johnston; Ravi S Menon; Winrich A Freiwald; Stefan Everling

Face selective patches in marmoset frontal cortex Journal Article

Nature Communications, 11 , pp. 1–8, 2020.

Abstract | Links | BibTeX

@article{Schaeffer2020,
title = {Face selective patches in marmoset frontal cortex},
author = {David J Schaeffer and Janahan Selvanayagam and Kevin D Johnston and Ravi S Menon and Winrich A Freiwald and Stefan Everling},
doi = {10.1038/s41467-020-18692-2},
year = {2020},
date = {2020-01-01},
journal = {Nature Communications},
volume = {11},
pages = {1--8},
publisher = {Springer US},
abstract = {In humans and macaque monkeys, socially relevant face processing is accomplished via a distributed functional network that includes specialized patches in frontal cortex. It is unclear whether a similar network exists in New World primates, who diverged ~35 million years from Old World primates. The common marmoset is a New World primate species ideally placed to address this question given their complex social repertoire. Here, we demonstrate the existence of a putative high-level face processing network in marmosets. Like Old World primates, marmosets show differential activation in anterior cingulate and lateral prefrontal cortices while they view socially relevant videos of marmoset faces. We corroborate the locations of these frontal regions by demonstrating functional and structural connectivity between these regions and temporal lobe face patches. Given the evolutionary separation between macaques and marmosets, our results suggest this frontal network specialized for social face processing predates the separation between Platyrrhini and Catarrhini.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

In humans and macaque monkeys, socially relevant face processing is accomplished via a distributed functional network that includes specialized patches in frontal cortex. It is unclear whether a similar network exists in New World primates, who diverged ~35 million years from Old World primates. The common marmoset is a New World primate species ideally placed to address this question given their complex social repertoire. Here, we demonstrate the existence of a putative high-level face processing network in marmosets. Like Old World primates, marmosets show differential activation in anterior cingulate and lateral prefrontal cortices while they view socially relevant videos of marmoset faces. We corroborate the locations of these frontal regions by demonstrating functional and structural connectivity between these regions and temporal lobe face patches. Given the evolutionary separation between macaques and marmosets, our results suggest this frontal network specialized for social face processing predates the separation between Platyrrhini and Catarrhini.

  • doi:10.1038/s41467-020-18692-2

Philipp Schwedhelm; Daniel Baldauf; Stefan Treue

The lateral prefrontal cortex of primates encodes stimulus colors and their behavioral relevance during a match-to-sample task Journal Article

Scientific Reports, 10 , pp. 1–12, 2020.

Abstract | Links | BibTeX

@article{Schwedhelm2020,
title = {The lateral prefrontal cortex of primates encodes stimulus colors and their behavioral relevance during a match-to-sample task},
author = {Philipp Schwedhelm and Daniel Baldauf and Stefan Treue},
doi = {10.1038/s41598-020-61171-3},
year = {2020},
date = {2020-01-01},
journal = {Scientific Reports},
volume = {10},
pages = {1--12},
publisher = {Springer US},
abstract = {The lateral prefrontal cortex of primates (lPFC) plays a central role in complex cognitive behavior, in decision-making as well as in guiding top-down attention. However, how and where in lPFC such behaviorally relevant signals are computed is poorly understood. We analyzed neural recordings from chronic microelectrode arrays implanted in lPFC region 8Av/45 of two rhesus macaques. The animals performed a feature match-to-sample task requiring them to match both motion and color information in a test stimulus. This task allowed to separate the encoding of stimulus motion and color from their current behavioral relevance on a trial-by-trial basis. We found that upcoming motor behavior can be robustly predicted from lPFC activity. In addition, we show that 8Av/45 encodes the color of a visual stimulus, regardless of its behavioral relevance. Most notably, whether a color matches the searched-for color can be decoded independent of a trial's motor outcome and while subjects detect unique feature conjunctions of color and motion. Thus, macaque area 8Av/45 computes, among other task-relevant information, the behavioral relevance of visual color features. Such a signal is most critical for both the selection of responses as well as the deployment of top-down modulatory signals, like feature-based attention.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

The lateral prefrontal cortex of primates (lPFC) plays a central role in complex cognitive behavior, in decision-making as well as in guiding top-down attention. However, how and where in lPFC such behaviorally relevant signals are computed is poorly understood. We analyzed neural recordings from chronic microelectrode arrays implanted in lPFC region 8Av/45 of two rhesus macaques. The animals performed a feature match-to-sample task requiring them to match both motion and color information in a test stimulus. This task allowed to separate the encoding of stimulus motion and color from their current behavioral relevance on a trial-by-trial basis. We found that upcoming motor behavior can be robustly predicted from lPFC activity. In addition, we show that 8Av/45 encodes the color of a visual stimulus, regardless of its behavioral relevance. Most notably, whether a color matches the searched-for color can be decoded independent of a trial's motor outcome and while subjects detect unique feature conjunctions of color and motion. Thus, macaque area 8Av/45 computes, among other task-relevant information, the behavioral relevance of visual color features. Such a signal is most critical for both the selection of responses as well as the deployment of top-down modulatory signals, like feature-based attention.

  • doi:10.1038/s41598-020-61171-3

H N Schwerdt; K Amemori; D J Gibson; L L Stanwicks; T Yoshida; N P Bichot; S Amemori; R Desimone; R Langer; M J Cima; A M Graybiel

Dopamine and beta-band oscillations differentially link to striatal value and motor control Journal Article

Science Advances, 6 , pp. 1–17, 2020.

Abstract | Links | BibTeX

@article{Schwerdt2020,
title = {Dopamine and beta-band oscillations differentially link to striatal value and motor control},
author = {H N Schwerdt and K Amemori and D J Gibson and L L Stanwicks and T Yoshida and N P Bichot and S Amemori and R Desimone and R Langer and M J Cima and A M Graybiel},
doi = {10.1126/sciadv.abb9226},
year = {2020},
date = {2020-01-01},
journal = {Science Advances},
volume = {6},
pages = {1--17},
abstract = {Parkinson's disease is characterized by decreased dopamine and increased beta-band oscillatory activity accompanying debilitating motor and mood impairments. Coordinate dopamine-beta opposition is considered a normative rule for basal ganglia function. We report a breakdown of this rule. We developed multimodal systems allowing the first simultaneous, chronic recordings of dopamine release and beta-band activity in the striatum of nonhuman primates during behavioral performance. Dopamine and beta signals were anticorrelated over seconds-long time frames, in agreement with the posited rule, but at finer time scales, we identified conditions in which these signals were modulated with the same polarity. These measurements demonstrated that task-elicited beta suppressions preceded dopamine peaks and that relative dopamine-beta timing and polarity depended on reward value, performance history, movement, and striatal domain. These findings establish a new view of coordinate dopamine and beta signaling operations, critical to guide novel strategies for diagnosing and treating Parkinson's disease and related neurodegenerative disorders.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Parkinson's disease is characterized by decreased dopamine and increased beta-band oscillatory activity accompanying debilitating motor and mood impairments. Coordinate dopamine-beta opposition is considered a normative rule for basal ganglia function. We report a breakdown of this rule. We developed multimodal systems allowing the first simultaneous, chronic recordings of dopamine release and beta-band activity in the striatum of nonhuman primates during behavioral performance. Dopamine and beta signals were anticorrelated over seconds-long time frames, in agreement with the posited rule, but at finer time scales, we identified conditions in which these signals were modulated with the same polarity. These measurements demonstrated that task-elicited beta suppressions preceded dopamine peaks and that relative dopamine-beta timing and polarity depended on reward value, performance history, movement, and striatal domain. These findings establish a new view of coordinate dopamine and beta signaling operations, critical to guide novel strategies for diagnosing and treating Parkinson's disease and related neurodegenerative disorders.

  • doi:10.1126/sciadv.abb9226

Caspar M Schwiedrzik; Sandrin S Sudmann

Pupil diameter tracks statistical structure in the environment to increase visual sensitivity Journal Article

Journal of Neuroscience, 40 (23), pp. 4565–4575, 2020.

Abstract | Links | BibTeX

@article{Schwiedrzik2020,
title = {Pupil diameter tracks statistical structure in the environment to increase visual sensitivity},
author = {Caspar M Schwiedrzik and Sandrin S Sudmann},
doi = {10.1523/JNEUROSCI.0216-20.2020},
year = {2020},
date = {2020-01-01},
journal = {Journal of Neuroscience},
volume = {40},
number = {23},
pages = {4565--4575},
abstract = {Pupil diameter determines how much light hits the retina, and thus, how much information is available for visual processing. This is regulated by a brainstem reflex pathway. Here, we investigate whether this pathway is under control of internal models about the environment. This would allow adjusting pupil dynamics to environmental statistics to augment information transmission. We present image sequences containing internal temporal structure to humans of either sex and male macaque monkeys. We then measure whether the pupil tracks this temporal structure not only at the rate of luminance variations, but also at the rate of statistics not available from luminance information alone. We find entrainment to environmental statistics in both species. This entrainment directly affects visual processing by increasing sensitivity at the environmentally relevant temporal frequency. Thus, pupil dynamics are matched to the temporal structure of the environment to optimize perception, in line with an active sensing account.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Pupil diameter determines how much light hits the retina, and thus, how much information is available for visual processing. This is regulated by a brainstem reflex pathway. Here, we investigate whether this pathway is under control of internal models about the environment. This would allow adjusting pupil dynamics to environmental statistics to augment information transmission. We present image sequences containing internal temporal structure to humans of either sex and male macaque monkeys. We then measure whether the pupil tracks this temporal structure not only at the rate of luminance variations, but also at the rate of statistics not available from luminance information alone. We find entrainment to environmental statistics in both species. This entrainment directly affects visual processing by increasing sensitivity at the environmentally relevant temporal frequency. Thus, pupil dynamics are matched to the temporal structure of the environment to optimize perception, in line with an active sensing account.

  • doi:10.1523/JNEUROSCI.0216-20.2020

2019

Maria C Romero; Marco Davare; Marcelo Armendariz; Peter Janssen

Neural effects of transcranial magnetic stimulation at the single-cell level Journal Article

Nature Communications, 10 , pp. 2642, 2019.

Abstract | Links | BibTeX

@article{Romero2019,
title = {Neural effects of transcranial magnetic stimulation at the single-cell level},
author = {Maria C Romero and Marco Davare and Marcelo Armendariz and Peter Janssen},
doi = {10.1038/s41467-019-10638-7},
year = {2019},
date = {2019-12-01},
journal = {Nature Communications},
volume = {10},
pages = {2642},
publisher = {Nature Publishing Group},
abstract = {Transcranial magnetic stimulation (TMS) can non-invasively modulate neural activity in humans. Despite three decades of research, the spatial extent of the cortical area activated by TMS is still controversial. Moreover, how TMS interacts with task-related activity during motor behavior is unknown. Here, we applied single-pulse TMS over macaque parietal cortex while recording single-unit activity at various distances from the center of stimulation during grasping. The spatial extent of TMS-induced activation is remarkably restricted, affecting the spiking activity of single neurons in an area of cortex measuring less than 2 mm in diameter. In task-related neurons, TMS evokes a transient excitation followed by reduced activity, paralleled by a significantly longer grasping time. Furthermore, TMS-induced activity and task-related activity do not summate in single neurons. These results furnish crucial experimental evidence for the neural effects of TMS at the single-cell level and uncover the neural underpinnings of behavioral effects of TMS.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Transcranial magnetic stimulation (TMS) can non-invasively modulate neural activity in humans. Despite three decades of research, the spatial extent of the cortical area activated by TMS is still controversial. Moreover, how TMS interacts with task-related activity during motor behavior is unknown. Here, we applied single-pulse TMS over macaque parietal cortex while recording single-unit activity at various distances from the center of stimulation during grasping. The spatial extent of TMS-induced activation is remarkably restricted, affecting the spiking activity of single neurons in an area of cortex measuring less than 2 mm in diameter. In task-related neurons, TMS evokes a transient excitation followed by reduced activity, paralleled by a significantly longer grasping time. Furthermore, TMS-induced activity and task-related activity do not summate in single neurons. These results furnish crucial experimental evidence for the neural effects of TMS at the single-cell level and uncover the neural underpinnings of behavioral effects of TMS.

  • doi:10.1038/s41467-019-10638-7

Ariana R Andrei; Sorin A Pojoga; Roger Janz; Valentin Dragoi

Integration of cortical population signals for visual perception Journal Article

Nature Communications, 10 , pp. 3832, 2019.

Abstract | Links | BibTeX

@article{Andrei2019,
title = {Integration of cortical population signals for visual perception},
author = {Ariana R Andrei and Sorin A Pojoga and Roger Janz and Valentin Dragoi},
doi = {10.1038/s41467-019-11736-2},
year = {2019},
date = {2019-12-01},
journal = {Nature Communications},
volume = {10},
pages = {3832},
publisher = {Nature Publishing Group},
abstract = {Visual stimuli evoke heterogeneous responses across nearby neural populations. These signals must be locally integrated to contribute to perception, but the principles underlying this process are unknown. Here, we exploit the systematic organization of orientation preference in macaque primary visual cortex (V1) and perform causal manipulations to examine the limits of signal integration. Optogenetic stimulation and visual stimuli are used to simultaneously drive two neural populations with overlapping receptive fields. We report that optogenetic stimulation raises firing rates uniformly across conditions, but improves the detection of visual stimuli only when activating cells that are preferentially-tuned to the visual stimulus. Further, we show that changes in correlated variability are exclusively present when the optogenetically and visually-activated populations are functionally-proximal, suggesting that correlation changes represent a hallmark of signal integration. Our results demonstrate that information from functionally-proximal neurons is pooled for perception, but functionally-distal signals remain independent. Primary visual cortical neurons exhibit diverse responses to visual stimuli yet how these signals are integrated during visual perception is not well understood. Here, the authors show that optogenetic stimulation of neurons situated near the visually‐driven population leads to improved orientation detection in monkeys through changes in correlated variability.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Visual stimuli evoke heterogeneous responses across nearby neural populations. These signals must be locally integrated to contribute to perception, but the principles underlying this process are unknown. Here, we exploit the systematic organization of orientation preference in macaque primary visual cortex (V1) and perform causal manipulations to examine the limits of signal integration. Optogenetic stimulation and visual stimuli are used to simultaneously drive two neural populations with overlapping receptive fields. We report that optogenetic stimulation raises firing rates uniformly across conditions, but improves the detection of visual stimuli only when activating cells that are preferentially-tuned to the visual stimulus. Further, we show that changes in correlated variability are exclusively present when the optogenetically and visually-activated populations are functionally-proximal, suggesting that correlation changes represent a hallmark of signal integration. Our results demonstrate that information from functionally-proximal neurons is pooled for perception, but functionally-distal signals remain independent. Primary visual cortical neurons exhibit diverse responses to visual stimuli yet how these signals are integrated during visual perception is not well understood. Here, the authors show that optogenetic stimulation of neurons situated near the visually‐driven population leads to improved orientation detection in monkeys through changes in correlated variability.

  • doi:10.1038/s41467-019-11736-2

Seth W Egger; Evan D Remington; Chia-Jung Chang; Mehrdad Jazayeri

Internal models of sensorimotor integration regulate cortical dynamics Journal Article

Nature Neuroscience, 22 , pp. 1871–1882, 2019.

Abstract | Links | BibTeX

@article{Egger2019,
title = {Internal models of sensorimotor integration regulate cortical dynamics},
author = {Seth W Egger and Evan D Remington and Chia-Jung Chang and Mehrdad Jazayeri},
doi = {10.1038/s41593-019-0500-6},
year = {2019},
date = {2019-10-01},
journal = {Nature Neuroscience},
volume = {22},
pages = {1871--1882},
publisher = {Springer Science and Business Media LLC},
abstract = {Sensorimotor control during overt movements is characterized in terms of three building blocks: a controller, a simulator and a state estimator. We asked whether the same framework could explain the control of internal states in the absence of movements. Recently, it was shown that the brain controls the timing of future movements by adjusting an internal speed command. We trained monkeys in a novel task in which the speed command had to be dynamically controlled based on the timing of a sequence of flashes. Recordings from the frontal cortex provided evidence that the brain updates the internal speed command after each flash based on the error between the timing of the flash and the anticipated timing of the flash derived from a simulated motor plan. These findings suggest that cognitive control of internal states may be understood in terms of the same computational principles as motor control. Control of movements can be understood in terms of the interplay between a controller, a simulator and an estimator. Egger et al. show that cortical neurons establish the same building blocks to control cognitive states in the absence of movement.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Sensorimotor control during overt movements is characterized in terms of three building blocks: a controller, a simulator and a state estimator. We asked whether the same framework could explain the control of internal states in the absence of movements. Recently, it was shown that the brain controls the timing of future movements by adjusting an internal speed command. We trained monkeys in a novel task in which the speed command had to be dynamically controlled based on the timing of a sequence of flashes. Recordings from the frontal cortex provided evidence that the brain updates the internal speed command after each flash based on the error between the timing of the flash and the anticipated timing of the flash derived from a simulated motor plan. These findings suggest that cognitive control of internal states may be understood in terms of the same computational principles as motor control. Control of movements can be understood in terms of the interplay between a controller, a simulator and an estimator. Egger et al. show that cortical neurons establish the same building blocks to control cognitive states in the absence of movement.

  • doi:10.1038/s41593-019-0500-6

Ramina Adam; Kevin D Johnston; Stefan Everling

Recovery of contralesional saccade choice and reaction time deficits after a unilateral endothelin-1-induced lesion in the macaque caudal prefrontal cortex Journal Article

Journal of Neurophysiology, 122 (2), pp. 672–690, 2019.

Abstract | Links | BibTeX

@article{Adam2019,
title = {Recovery of contralesional saccade choice and reaction time deficits after a unilateral endothelin-1-induced lesion in the macaque caudal prefrontal cortex},
author = {Ramina Adam and Kevin D Johnston and Stefan Everling},
doi = {10.1152/jn.00078.2019},
year = {2019},
date = {2019-08-01},
journal = {Journal of Neurophysiology},
volume = {122},
number = {2},
pages = {672--690},
publisher = {American Physiological Society Bethesda, MD},
abstract = {The caudal primate prefrontal cortex (PFC) is involved in target selection and visually guided saccades through both covert attention and overt orienting eye movements. Unilateral damage to the caudal PFC often leads to decreased awareness of a contralesional target alone, referred to as “neglect,” or when it is presented simultaneously with an ipsilesional target, referred to as “extinction.” In the current study, we examined whether deficits in contralesional target selection were due to contralesional oculomotor deficits, such as slower reaction times. We experimentally induced a focal ischemic lesion in the right caudal PFC of 4 male macaque monkeys using the vasoconstrictor endothelin-1 and measured saccade choice and reaction times on double-stimulus free-choice tasks and single-stimulus trials before and after the lesion. We found that 1) endothelin-1-induced lesions in the caudal PFC produced contralesional target selection deficits that varied in severity and duration based on lesion volume and location; 2) contralesional neglect-like deficits were transient and recovered by week 4 postlesion; 3) contralesional extinction-like deficits were longer lasting and recovered by weeks 8–16 postlesion; 4) contralesional reaction time returned to baseline well before the contralesional choice deficit had recovered; and 5) neither the mean reaction times nor the reaction time distributions could account for the degree of contralesional extinction on the free-choice task throughout recovery. These findings demonstrate that the saccade choice bias observed after a right caudal PFC lesion is not exclusively due to contralesional motor deficits, but instead reflects a combination of impaired motor and attentional processing.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

The caudal primate prefrontal cortex (PFC) is involved in target selection and visually guided saccades through both covert attention and overt orienting eye movements. Unilateral damage to the caudal PFC often leads to decreased awareness of a contralesional target alone, referred to as “neglect,” or when it is presented simultaneously with an ipsilesional target, referred to as “extinction.” In the current study, we examined whether deficits in contralesional target selection were due to contralesional oculomotor deficits, such as slower reaction times. We experimentally induced a focal ischemic lesion in the right caudal PFC of 4 male macaque monkeys using the vasoconstrictor endothelin-1 and measured saccade choice and reaction times on double-stimulus free-choice tasks and single-stimulus trials before and after the lesion. We found that 1) endothelin-1-induced lesions in the caudal PFC produced contralesional target selection deficits that varied in severity and duration based on lesion volume and location; 2) contralesional neglect-like deficits were transient and recovered by week 4 postlesion; 3) contralesional extinction-like deficits were longer lasting and recovered by weeks 8–16 postlesion; 4) contralesional reaction time returned to baseline well before the contralesional choice deficit had recovered; and 5) neither the mean reaction times nor the reaction time distributions could account for the degree of contralesional extinction on the free-choice task throughout recovery. These findings demonstrate that the saccade choice bias observed after a right caudal PFC lesion is not exclusively due to contralesional motor deficits, but instead reflects a combination of impaired motor and attentional processing.

  • doi:10.1152/jn.00078.2019

Kun Guo; Zhihan Li; Yin Yan; Wu Li

Viewing heterospecific facial expressions: an eye-tracking study of human and monkey viewers Journal Article

Experimental Brain Research, 237 (8), pp. 2045–2059, 2019.

Abstract | Links | BibTeX

@article{Guo2019,
title = {Viewing heterospecific facial expressions: an eye-tracking study of human and monkey viewers},
author = {Kun Guo and Zhihan Li and Yin Yan and Wu Li},
doi = {10.1007/s00221-019-05574-3},
year = {2019},
date = {2019-08-01},
journal = {Experimental Brain Research},
volume = {237},
number = {8},
pages = {2045--2059},
publisher = {Springer Berlin Heidelberg},
abstract = {Common facial expressions of emotion have distinctive patterns of facial muscle movements that are culturally similar among humans, and perceiving these expressions is associated with stereotypical gaze allocation at local facial regions that are characteristic for each expression, such as eyes in angry faces. It is, however, unclear to what extent this ‘universality' view can be extended to process heterospecific facial expressions, and how ‘social learning' process contributes to heterospecific expression perception. In this eye-tracking study, we examined face-viewing gaze allocation of human (including dog owners and non-dog owners) and monkey observers while exploring expressive human, chimpanzee, monkey and dog faces (positive, neutral and negative expressions in human and dog faces; neutral and negative expressions in chimpanzee and monkey faces). Human observers showed species- and experience-dependent expression categorization accuracy. Furthermore, both human and monkey observers demonstrated different face-viewing gaze distributions which were also species dependent. Specifically, humans predominantly attended at human eyes but animal mouth when judging facial expressions. Monkeys' gaze distributions in exploring human and monkey faces were qualitatively different from exploring chimpanzee and dog faces. Interestingly, the gaze behaviour of both human and monkey observers were further affected by their prior experience of the viewed species. It seems that facial expression processing is species dependent, and social learning may play a significant role in discriminating even rudimentary types of heterospecific expressions.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Common facial expressions of emotion have distinctive patterns of facial muscle movements that are culturally similar among humans, and perceiving these expressions is associated with stereotypical gaze allocation at local facial regions that are characteristic for each expression, such as eyes in angry faces. It is, however, unclear to what extent this ‘universality' view can be extended to process heterospecific facial expressions, and how ‘social learning' process contributes to heterospecific expression perception. In this eye-tracking study, we examined face-viewing gaze allocation of human (including dog owners and non-dog owners) and monkey observers while exploring expressive human, chimpanzee, monkey and dog faces (positive, neutral and negative expressions in human and dog faces; neutral and negative expressions in chimpanzee and monkey faces). Human observers showed species- and experience-dependent expression categorization accuracy. Furthermore, both human and monkey observers demonstrated different face-viewing gaze distributions which were also species dependent. Specifically, humans predominantly attended at human eyes but animal mouth when judging facial expressions. Monkeys' gaze distributions in exploring human and monkey faces were qualitatively different from exploring chimpanzee and dog faces. Interestingly, the gaze behaviour of both human and monkey observers were further affected by their prior experience of the viewed species. It seems that facial expression processing is species dependent, and social learning may play a significant role in discriminating even rudimentary types of heterospecific expressions.

  • doi:10.1007/s00221-019-05574-3

Florian Sandhaeger; Constantin von Nicolai; Earl K Miller; Markus Siegel

Monkey EEG links neuronal color and motion information across species and scales Journal Article

eLife, 8 , pp. 1–21, 2019.

Abstract | Links | BibTeX

@article{Sandhaeger2019,
title = {Monkey EEG links neuronal color and motion information across species and scales},
author = {Florian Sandhaeger and Constantin von Nicolai and Earl K Miller and Markus Siegel},
doi = {10.7554/eLife.45645},
year = {2019},
date = {2019-07-01},
journal = {eLife},
volume = {8},
pages = {1--21},
publisher = {eLife Sciences Publications, Ltd},
abstract = {It remains challenging to relate EEG and MEG to underlying circuit processes and comparable experiments on both spatial scales are rare. To close this gap between invasive and non-invasive electrophysiology we developed and recorded human-comparable EEG in macaque monkeys during visual stimulation with colored dynamic random dot patterns. Furthermore, we performed simultaneous microelectrode recordings from 6 areas of macaque cortex and human MEG. Motion direction and color information were accessible in all signals. Tuning of the non-invasive signals was similar to V4 and IT, but not to dorsal and frontal areas. Thus, MEG and EEG were dominated by early visual and ventral stream sources. Source level analysis revealed corresponding information and latency gradients across cortex. We show how information-based methods and monkey EEG can identify analogous properties of visual processing in signals spanning spatial scales from single units to MEG – a valuable framework for relating human and animal studies.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

It remains challenging to relate EEG and MEG to underlying circuit processes and comparable experiments on both spatial scales are rare. To close this gap between invasive and non-invasive electrophysiology we developed and recorded human-comparable EEG in macaque monkeys during visual stimulation with colored dynamic random dot patterns. Furthermore, we performed simultaneous microelectrode recordings from 6 areas of macaque cortex and human MEG. Motion direction and color information were accessible in all signals. Tuning of the non-invasive signals was similar to V4 and IT, but not to dorsal and frontal areas. Thus, MEG and EEG were dominated by early visual and ventral stream sources. Source level analysis revealed corresponding information and latency gradients across cortex. We show how information-based methods and monkey EEG can identify analogous properties of visual processing in signals spanning spatial scales from single units to MEG – a valuable framework for relating human and animal studies.

  • doi:10.7554/eLife.45645

Junxiang Luo; Keyan He; Ian Max Andolina; Xiaohong Li; Jiapeng Yin; Zheyuan Chen; Yong Gu; Wei Wang

Going with the flow: The neural mechanisms underlying illusions of complex-flow motion Journal Article

Journal of Neuroscience, 39 (14), pp. 2664–2685, 2019.

Abstract | Links | BibTeX

@article{Luo2019,
title = {Going with the flow: The neural mechanisms underlying illusions of complex-flow motion},
author = {Junxiang Luo and Keyan He and Ian Max Andolina and Xiaohong Li and Jiapeng Yin and Zheyuan Chen and Yong Gu and Wei Wang},
doi = {10.1523/JNEUROSCI.2112-18.2019},
year = {2019},
date = {2019-01-01},
journal = {Journal of Neuroscience},
volume = {39},
number = {14},
pages = {2664--2685},
abstract = {Studying the mismatch between perception and reality helps us better understand the constructive nature of the visual brain. The Pinna-Brelstaff motion illusion is a compelling example illustrating how a complex moving pattern can generate an illusory motion perception. When an observer moves toward (expansion) or away (contraction) from the Pinna-Brelstaff figure, the figure appears to rotate. The neural mechanisms underlying the illusory complex-flow motion of rotation, expansion, and contraction remain unknown. We studied this question at both perceptual and neuronal levels in behaving male macaques by using carefully parametrized Pinna-Brelstaff figures that induce the above motion illusions. We first demonstrate that macaques perceive illusory motion in a manner similar to that of human observers. Neurophysiological recordings were subsequently performed in the middle temporal area (MT) and the dorsal portion of the medial superior temporal area (MSTd). We find that subgroups of MSTd neurons encoding a particular global pattern of real complex-flow motion (rotation, expansion, contraction) also represent illusory motion patterns of the same class. They require an extra 15 ms to reliably discriminate the illusion. In contrast, MT neurons encode both real and illusory local motions with similar temporal delays. These findings reveal that illusory complex-flow motion is first represented in MSTd by the same neurons that normally encode real complex-flow motion. However, the extraction of global illusory motion in MSTd from other classes of real complex-flow motion requires extra processing time. Our study illustrates a cascaded integration mechanism from MT to MSTd underlying the transformation from external physical to internal nonveridical flow-motion perception.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Studying the mismatch between perception and reality helps us better understand the constructive nature of the visual brain. The Pinna-Brelstaff motion illusion is a compelling example illustrating how a complex moving pattern can generate an illusory motion perception. When an observer moves toward (expansion) or away (contraction) from the Pinna-Brelstaff figure, the figure appears to rotate. The neural mechanisms underlying the illusory complex-flow motion of rotation, expansion, and contraction remain unknown. We studied this question at both perceptual and neuronal levels in behaving male macaques by using carefully parametrized Pinna-Brelstaff figures that induce the above motion illusions. We first demonstrate that macaques perceive illusory motion in a manner similar to that of human observers. Neurophysiological recordings were subsequently performed in the middle temporal area (MT) and the dorsal portion of the medial superior temporal area (MSTd). We find that subgroups of MSTd neurons encoding a particular global pattern of real complex-flow motion (rotation, expansion, contraction) also represent illusory motion patterns of the same class. They require an extra 15 ms to reliably discriminate the illusion. In contrast, MT neurons encode both real and illusory local motions with similar temporal delays. These findings reveal that illusory complex-flow motion is first represented in MSTd by the same neurons that normally encode real complex-flow motion. However, the extraction of global illusory motion in MSTd from other classes of real complex-flow motion requires extra processing time. Our study illustrates a cascaded integration mechanism from MT to MSTd underlying the transformation from external physical to internal nonveridical flow-motion perception.

Close

  • doi:10.1523/JNEUROSCI.2112-18.2019

Liya Ma; Jason L Chan; Kevin D Johnston; Stephen G Lomber; Stefan Everling

Macaque anterior cingulate cortex deactivation impairs performance and alters lateral prefrontal oscillatory activities in a rule-switching task Journal Article

PLoS Biology, 17 (7), pp. e3000045, 2019.

Abstract | Links | BibTeX

@article{Ma2019b,
title = {Macaque anterior cingulate cortex deactivation impairs performance and alters lateral prefrontal oscillatory activities in a rule-switching task},
author = {Liya Ma and Jason L Chan and Kevin D Johnston and Stephen G Lomber and Stefan Everling},
doi = {10.1371/journal.pbio.3000045},
year = {2019},
date = {2019-01-01},
journal = {PLoS Biology},
volume = {17},
number = {7},
pages = {e3000045},
abstract = {In primates, both the dorsal anterior cingulate cortex (dACC) and the dorsolateral prefrontal cortex (dlPFC) are key regions of the frontoparietal cognitive control network. To study the role of the dACC and its communication with the dlPFC in cognitive control, we recorded local field potentials (LFPs) from the dlPFC before and during the reversible deactivation of the dACC, in macaque monkeys engaging in uncued switches between 2 stimulus-response rules, namely prosaccade and antisaccade. Cryogenic dACC deactivation impaired response accuracy during maintenance of—but not the initial switching to—the cognitively demanding antisaccade rule, which coincided with a reduction in task-related theta activity and the correct-error (C-E) difference in dlPFC beta-band power. During both rule switching and maintenance, dACC deactivation prolonged the animals' reaction time and reduced task-related alpha power in the dlPFC. Our findings support a role of the dACC in prefrontal oscillatory activities that are involved in the maintenance of a new, challenging task rule.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1371/journal.pbio.3000045

Corentin Massot; Uday K Jagadisan; Neeraj J Gandhi

Sensorimotor transformation elicits systematic patterns of activity along the dorsoventral extent of the superior colliculus in the macaque monkey Journal Article

Communications Biology, 2 , pp. 1–14, 2019.

Abstract | Links | BibTeX

@article{Massot2019,
title = {Sensorimotor transformation elicits systematic patterns of activity along the dorsoventral extent of the superior colliculus in the macaque monkey},
author = {Corentin Massot and Uday K Jagadisan and Neeraj J Gandhi},
doi = {10.1038/s42003-019-0527-y},
year = {2019},
date = {2019-01-01},
journal = {Communications Biology},
volume = {2},
pages = {1--14},
abstract = {The superior colliculus (SC) is an excellent substrate to study sensorimotor transformations. To date, the spatial and temporal properties of population activity along its dorsoventral axis have been inferred from single electrode studies. Here, we recorded SC population activity in non-human primates using a linear multi-contact array during delayed saccade tasks. We show that during the visual epoch, information appeared first in dorsal layers and systematically later in ventral layers. During the delay period, the laminar organization of low-spiking rate activity matched that of the visual epoch. During the pre-saccadic epoch, spiking activity emerged first in a more ventral layer, ~ 100 ms before saccade onset. This buildup of activity appeared later on nearby neurons situated both dorsally and ventrally, culminating in a synchronous burst across the dorsoventral axis, ~ 28 ms before saccade onset. Collectively, these results reveal a principled spatiotemporal organization of SC population activity underlying sensorimotor transformation for the control of gaze.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1038/s42003-019-0527-y

Vincent B McGinty

Overt attention toward appetitive cues enhances their subjective value, independent of orbitofrontal cortex activity Journal Article

eNeuro, 6 (6), pp. 1–19, 2019.

Abstract | Links | BibTeX

@article{McGinty2019,
title = {Overt attention toward appetitive cues enhances their subjective value, independent of orbitofrontal cortex activity},
author = {Vincent B McGinty},
doi = {10.1523/ENEURO.0230-19.2019},
year = {2019},
date = {2019-01-01},
journal = {eNeuro},
volume = {6},
number = {6},
pages = {1--19},
abstract = {Neural representations of value underlie many behaviors that are crucial for survival. Previously, we found that value representations in primate orbitofrontal cortex (OFC) are modulated by attention, specifically, by overt shifts of gaze toward or away from reward-associated visual cues (McGinty et al., 2016). Here, we investigate the influence of overt attention on behavior by asking how gaze shifts correlate with reward anticipatory responses and whether activity in OFC mediates this correlation. Macaque monkeys viewed pavlovian conditioned appetitive cues on a visual display, while the fraction of time they spent looking toward or away from the cues was measured using an eye tracker. Also measured during cue presentation were the reward anticipation, indicated by conditioned licking responses (CRs), and single-neuron activity in OFC. In general, gaze allocation predicted subsequent licking responses: the longer the monkeys spent looking at a cue at a given time point in a trial, the more likely they were to produce an anticipatory CR later in that trial, as if the subjective value of the cue were increased. To address neural mechanisms, mediation analysis measured the extent to which the gaze–CR correlation could be statistically explained by the concurrently recorded firing of OFC neurons. The resulting mediation effects were indistinguishable from chance. Therefore, while overt attention may increase the subjective value of reward-associated cues (as revealed by anticipatory behaviors), the underlying mechanism remains unknown, as does the functional significance of gaze-driven modulation of OFC value signals.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1523/ENEURO.0230-19.2019

Priyanka S Mehta; Jiaxin Cindy Tu; Giuliana A LoConte; Meghan C Pesce; Benjamin Y Hayden

Ventromedial prefrontal cortex tracks multiple environmental variables during search Journal Article

Journal of Neuroscience, 39 (27), pp. 5336–5350, 2019.

Abstract | Links | BibTeX

@article{Mehta2019,
title = {Ventromedial prefrontal cortex tracks multiple environmental variables during search},
author = {Priyanka S Mehta and Jiaxin Cindy Tu and Giuliana A LoConte and Meghan C Pesce and Benjamin Y Hayden},
doi = {10.1523/JNEUROSCI.2365-18.2019},
year = {2019},
date = {2019-01-01},
journal = {Journal of Neuroscience},
volume = {39},
number = {27},
pages = {5336--5350},
abstract = {To make efficient foraging decisions, we must combine information about the values of available options with nonvalue information. Some accounts of ventromedial PFC (vmPFC) suggest that it has a narrow role limited to evaluating immediately available options. We examined responses of neurons in area 14 (a putative macaque homolog of human vmPFC) as 2 male macaques performed a novel foraging search task. Although many neurons encoded the values of immediately available offers, they also independently encoded several other variables that influence choice, but that are conceptually distinct from offer value. These variables include average reward rate, number of offers viewed per trial, previous offer values, previous outcome sizes, and the locations of the currently attended offer. We conclude that, rather than serving as a specialized economic value center, vmPFC plays a broad role in integrating relevant environmental information to drive foraging decisions.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1523/JNEUROSCI.2365-18.2019

Adam P Morris; Bart Krekelberg

A stable visual world in primate primary visual cortex Journal Article

Current Biology, 29 (9), pp. 1471–1480, 2019.

Abstract | Links | BibTeX

@article{Morris2019,
title = {A stable visual world in primate primary visual cortex},
author = {Adam P Morris and Bart Krekelberg},
doi = {10.1016/j.cub.2019.03.069},
year = {2019},
date = {2019-01-01},
journal = {Current Biology},
volume = {29},
number = {9},
pages = {1471--1480},
publisher = {Elsevier Ltd.},
abstract = {Humans and other primates rely on eye movements to explore visual scenes and to track moving objects. As a result, the image that is projected onto the retina—and propagated throughout the visual cortical hierarchy—is almost constantly changing and makes little sense without taking into account the momentary direction of gaze. How is this achieved in the visual system? Here, we show that in primary visual cortex (V1), the earliest stage of cortical vision, neural representations carry an embedded “eye tracker” that signals the direction of gaze associated with each image. Using chronically implanted multi-electrode arrays, we recorded the activity of neurons in area V1 of macaque monkeys during tasks requiring fast (exploratory) and slow (pursuit) eye movements. Neurons were stimulated with flickering, full-field luminance noise at all times. As in previous studies, we observed neurons that were sensitive to gaze direction during fixation, despite comparable stimulation of their receptive fields. We trained a decoder to translate neural activity into metric estimates of gaze direction. This decoded signal tracked the eye accurately not only during fixation but also during fast and slow eye movements. After a fast eye movement, the eye-position signal arrived in V1 at approximately the same time at which the new visual information arrived from the retina. Using simulations, we show that this V1 eye-position signal could be used to take into account the sensory consequences of eye movements and map the fleeting positions of objects on the retina onto their stable position in the world. Visual input arrives as a series of snapshots, each taken from a different line of sight, due to eye movements from one part of a scene to another. How do we nevertheless see a stable visual world? Morris and Krekelberg show that in primary visual cortex, the neural representation of each snapshot includes “metadata” that tracks gaze direction.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.cub.2019.03.069

Aidan P Murphy; David A Leopold

A parameterized digital 3D model of the Rhesus macaque face for investigating the visual processing of social cues Journal Article

Journal of Neuroscience Methods, 324 , pp. 1–14, 2019.

Abstract | Links | BibTeX

@article{Murphy2019b,
title = {A parameterized digital 3D model of the Rhesus macaque face for investigating the visual processing of social cues},
author = {Aidan P Murphy and David A Leopold},
doi = {10.1016/j.jneumeth.2019.06.001},
year = {2019},
date = {2019-01-01},
journal = {Journal of Neuroscience Methods},
volume = {324},
pages = {1--14},
publisher = {Elsevier},
abstract = {Background: Rhesus macaques are the most popular model species for studying the neural basis of visual face processing and social interaction using intracranial methods. However, the challenge of creating realistic, dynamic, and parametric macaque face stimuli has limited the experimental control and ethological validity of existing approaches. New method: We performed statistical analyses of in vivo computed tomography data to generate an anatomically accurate, three-dimensional representation of Rhesus macaque cranio-facial morphology. The surface structures were further edited, rigged and textured by a professional digital artist with careful reference to photographs of macaque facial expression, colouration and pelage. Results: The model offers precise, continuous, parametric control of craniofacial shape, emotional expression, head orientation, eye gaze direction, and many other parameters that can be adjusted to render either static or dynamic high-resolution faces. Example single-unit responses to such stimuli in macaque inferotemporal cortex demonstrate the value of parametric control over facial appearance and behaviours. Comparison with existing method(s): The generation of such a high-dimensionality and systematically controlled stimulus set of conspecific faces, with accurate craniofacial modelling and professional finalization of facial details, is currently not achievable using existing methods. Conclusions: The results herald a new set of possibilities in adaptive sampling of a high-dimensional and socially meaningful feature space, thus opening the door to systematic testing of hypotheses about the abundant neural specialization for faces found in the primate.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.jneumeth.2019.06.001

Sunny Nigam; Sorin A Pojoga; Valentin Dragoi

Synergistic coding of visual information in columnar networks Journal Article

Neuron, 104 , pp. 402–411, 2019.

Abstract | Links | BibTeX

@article{Nigam2019,
title = {Synergistic coding of visual information in columnar networks},
author = {Sunny Nigam and Sorin A Pojoga and Valentin Dragoi},
doi = {10.1016/j.neuron.2019.07.006},
year = {2019},
date = {2019-01-01},
journal = {Neuron},
volume = {104},
pages = {402--411},
publisher = {Elsevier Inc.},
abstract = {Incoming stimuli are encoded collectively by populations of cortical neurons, which transmit information by using a neural code thought to be predominantly redundant. Redundant coding is widely believed to reflect a design choice whereby neurons with overlapping receptive fields sample environmental stimuli to convey similar information. Here, we performed multi-electrode laminar recordings in awake monkey V1 to report significant synergistic interactions between nearby neurons within a cortical column. These interactions are clustered non-randomly across cortical layers to form synergy and redundancy hubs. Homogeneous sub-populations comprising synergy hubs decode stimulus information significantly better compared to redundancy hubs or heterogeneous sub-populations. Mechanistically, synergistic interactions emerge from the stimulus dependence of correlated activity between neurons. Our findings suggest a refinement of the prevailing ideas regarding coding schemes in sensory cortex: columnar populations can efficiently encode information due to synergistic interactions even when receptive fields overlap and shared noise between cells is high.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.neuron.2019.07.006

Kaiser Niknam; Amir Akbarian; Kelsey Clark; Yasin Zamani; Behrad Noudoost; Neda Nategh

Characterizing and dissociating multiple time-varying modulatory computations influencing neuronal activity Journal Article

PLoS Computational Biology, 15 (9), pp. e1007275, 2019.

Abstract | Links | BibTeX

@article{Niknam2019,
title = {Characterizing and dissociating multiple time-varying modulatory computations influencing neuronal activity},
author = {Kaiser Niknam and Amir Akbarian and Kelsey Clark and Yasin Zamani and Behrad Noudoost and Neda Nategh},
doi = {10.1371/journal.pcbi.1007275},
year = {2019},
date = {2019-01-01},
journal = {PLoS Computational Biology},
volume = {15},
number = {9},
pages = {e1007275},
abstract = {In many brain areas, sensory responses are heavily modulated by factors including attentional state, context, reward history, motor preparation, learned associations, and other cognitive variables. Modelling the effect of these modulatory factors on sensory responses has proven challenging, mostly due to the time-varying and nonlinear nature of the underlying computations. Here we present a computational model capable of capturing and dissociating multiple time-varying modulatory effects on neuronal responses on the order of milliseconds. The model's performance is tested on extrastriate perisaccadic visual responses in nonhuman primates. Visual neurons respond to stimuli presented around the time of saccades differently than during fixation. These perisaccadic changes include sensitivity to the stimuli presented at locations outside the neuron's receptive field, which suggests a contribution of multiple sources to perisaccadic response generation. Current computational approaches cannot quantitatively characterize the contribution of each modulatory source in response generation, mainly due to the very short timescale on which the saccade takes place. In this study, we use a high spatiotemporal resolution experimental paradigm along with a novel extension of the generalized linear model framework (GLM), termed the sparse-variable GLM, to allow for time-varying model parameters representing the temporal evolution of the system with a resolution on the order of milliseconds. We used this model framework to precisely map the temporal evolution of the spatiotemporal receptive field of visual neurons in the middle temporal area during the execution of a saccade. Moreover, an extended model based on a factorization of the sparse-variable GLM allowed us to disassociate and quantify the contribution of individual sources to the perisaccadic response. Our results show that our novel framework can precisely capture the changes in sensitivity of neurons around the time of saccades, and provide a general framework to quantitatively track the role of multiple modulatory sources over time.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1371/journal.pcbi.1007275

Mariann Oemisch; Stephanie Westendorff; Marzyeh Azimi; Seyed Alireza Hassani; Salva Ardid; Paul Tiesinga; Thilo Womelsdorf

Feature-specific prediction errors and surprise across macaque fronto-striatal circuits Journal Article

Nature Communications, 10 , pp. 176, 2019.

Abstract | Links | BibTeX

@article{Oemisch2019,
title = {Feature-specific prediction errors and surprise across macaque fronto-striatal circuits},
author = {Mariann Oemisch and Stephanie Westendorff and Marzyeh Azimi and Seyed Alireza Hassani and Salva Ardid and Paul Tiesinga and Thilo Womelsdorf},
doi = {10.1038/s41467-018-08184-9},
year = {2019},
date = {2019-01-01},
journal = {Nature Communications},
volume = {10},
pages = {176},
publisher = {Springer US},
abstract = {To adjust expectations efficiently, prediction errors need to be associated with the precise features that gave rise to the unexpected outcome, but this credit assignment may be problematic if stimuli differ on multiple dimensions and it is ambiguous which feature dimension caused the outcome. Here, we report a potential solution: neurons in four recorded areas of the anterior fronto-striatal networks encode prediction errors that are specific to feature values of different dimensions of attended multidimensional stimuli. The most ubiquitous prediction error occurred for the reward-relevant dimension. Feature-specific prediction error signals a) emerge on average shortly after non-specific prediction error signals, b) arise earliest in the anterior cingulate cortex and later in dorsolateral prefrontal cortex, caudate and ventral striatum, and c) contribute to feature-based stimulus selection after learning. Thus, a widely-distributed feature-specific eligibility trace may be used to update synaptic weights for improved feature-based attention.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1038/s41467-018-08184-9

Davide Paoletti; Christoph Braun; Elisabeth Julie Vargo; Wieske van Zoest

Spontaneous pre-stimulus oscillatory activity shapes the way we look: A concurrent imaging and eye-movement study Journal Article

European Journal of Neuroscience, 49 , pp. 137–149, 2019.

Abstract | Links | BibTeX

@article{Paoletti2019,
title = {Spontaneous pre-stimulus oscillatory activity shapes the way we look: A concurrent imaging and eye-movement study},
author = {Davide Paoletti and Christoph Braun and Elisabeth Julie Vargo and Wieske van Zoest},
doi = {10.1111/ejn.14285},
year = {2019},
date = {2019-01-01},
journal = {European Journal of Neuroscience},
volume = {49},
pages = {137--149},
abstract = {Previous behavioural studies have accrued evidence that response time plays a critical role in determining whether selection is influenced by stimulus saliency or target template. In the present work, we investigated to what extent the variations in timing and consequent oculomotor controls are influenced by spontaneous variations in pre-stimulus alpha oscillations. We recorded simultaneously brain activity using magnetoencephalography (MEG) and eye movements while participants performed a visual search task. Our results show that slower saccadic reaction times were predicted by an overall stronger alpha power in the 500 ms time window preceding the stimulus onset, while weaker alpha power was a signature of faster responses. When looking separately at performance for fast and slow responses, we found evidence for two specific sources of alpha activity predicting correct versus incorrect responses. When saccades were quickly elicited, errors were predicted by stronger alpha activity in posterior areas, comprising the angular gyrus in the temporal-parietal junction (TPJ) and possibly the lateral intraparietal area (LIP). Instead, when participants were slower in responding, an increase of alpha power in frontal eye fields (FEF), supplementary eye fields (SEF) and dorsolateral pre-frontal cortex (DLPFC) predicted erroneous saccades. In other words, oculomotor accuracy in fast responses was predicted by alpha power differences in more posterior areas, while the accuracy in slow responses was predicted by alpha power differences in frontal areas, in line with the idea that these areas may be differentially related to stimulus-driven and goal-driven control of selection.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1111/ejn.14285

Michael A Paradiso; Seth Akers-Campbell; Octavio Ruiz; James E Niemeyer; Stuart Geman; Jackson Loper

Transsacadic information and corollary discharge in local field potentials of macaque V1 Journal Article

Frontiers in Integrative Neuroscience, 12 , pp. 1–18, 2019.

Abstract | Links | BibTeX

@article{Paradiso2019,
title = {Transsacadic information and corollary discharge in local field potentials of macaque V1},
author = {Michael A Paradiso and Seth Akers-Campbell and Octavio Ruiz and James E Niemeyer and Stuart Geman and Jackson Loper},
doi = {10.3389/fnint.2018.00063},
year = {2019},
date = {2019-01-01},
journal = {Frontiers in Integrative Neuroscience},
volume = {12},
pages = {1--18},
abstract = {Approximately three times per second, human visual perception is interrupted by a saccadic eye movement. In addition to taking the eyes to a new location, several lines of evidence suggest that the saccades play multiple roles in visual perception. Indeed, it may be crucial that visual processing is informed about movements of the eyes in order to analyze visual input distinctly and efficiently on each fixation and preserve stable visual perception of the world across saccades. A variety of studies has demonstrated that activity in multiple brain areas is modulated by saccades. The hypothesis tested here is that these signals carry significant information that could be used in visual processing. To test this hypothesis, local field potentials (LFPs) were simultaneously recorded from multiple electrodes in macaque primary visual cortex (V1); support vector machines (SVMs) were used to classify the peri-saccadic LFPs. We find that LFPs in area V1 carry information that can be used to distinguish neural activity associated with fixations from saccades, precisely estimate the onset time of fixations, and reliably infer the directions of saccades. This information may be used by the brain in processes including visual stability, saccadic suppression, receptive field (RF) remapping, fixation amplification, and trans-saccadic visual perception.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.3389/fnint.2018.00063

Aishwarya Parthasarathy; Cheng Tang; Roger Herikstad; Loong Fah Cheong; Shih Cheng Yen; Camilo Libedinsky

Time-invariant working memory representations in the presence of code-morphing in the lateral prefrontal cortex Journal Article

Nature Communications, 10 , pp. 4995, 2019.

Abstract | Links | BibTeX

@article{Parthasarathy2019,
title = {Time-invariant working memory representations in the presence of code-morphing in the lateral prefrontal cortex},
author = {Aishwarya Parthasarathy and Cheng Tang and Roger Herikstad and Loong Fah Cheong and Shih Cheng Yen and Camilo Libedinsky},
doi = {10.1038/s41467-019-12841-y},
year = {2019},
date = {2019-01-01},
journal = {Nature Communications},
volume = {10},
pages = {4995},
publisher = {Springer US},
abstract = {Maintenance of working memory is thought to involve the activity of prefrontal neuronal populations with strong recurrent connections. However, it was recently shown that distractors evoke a morphing of the prefrontal population code, even when memories are maintained throughout the delay. How can a morphing code maintain time-invariant memory information? We hypothesized that dynamic prefrontal activity contains time-invariant memory information within a subspace of neural activity. Using an optimization algorithm, we found a low-dimensional subspace that contains time-invariant memory information. This information was reduced in trials where the animals made errors in the task, and was also found in periods of the trial not used to find the subspace. A bump attractor model replicated these properties, and provided predictions that were confirmed in the neural data. Our results suggest that the high-dimensional responses of prefrontal cortex contain subspaces where different types of information can be simultaneously encoded with minimal interference.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1038/s41467-019-12841-y

Alina Peter; Cem Uran; Johanna Klon-Lipok; Rasmus Roese; Sylvia Van Stijn; William Barnes; Jarrod R Dowdall; Wolf Singer; Pascal Fries; Martin Vinck

Surface color and predictability determine contextual modulation of V1 firing and gamma oscillations Journal Article

eLife, 8 , pp. 1–38, 2019.

Abstract | Links | BibTeX

@article{Peter2019,
title = {Surface color and predictability determine contextual modulation of V1 firing and gamma oscillations},
author = {Alina Peter and Cem Uran and Johanna Klon-Lipok and Rasmus Roese and Sylvia {Van Stijn} and William Barnes and Jarrod R Dowdall and Wolf Singer and Pascal Fries and Martin Vinck},
doi = {10.7554/eLife.42101},
year = {2019},
date = {2019-01-01},
journal = {eLife},
volume = {8},
pages = {1--38},
abstract = {The integration of direct bottom-up inputs with contextual information is a core feature of neocortical circuits. In area V1, neurons may reduce their firing rates when their receptive field input can be predicted by spatial context. Gamma-synchronized (30–80 Hz) firing may provide a complementary signal to rates, reflecting stronger synchronization between neuronal populations receiving mutually predictable inputs. We show that large uniform surfaces, which have high spatial predictability, strongly suppressed firing yet induced prominent gamma synchronization in macaque V1, particularly when they were colored. Yet, chromatic mismatches between center and surround, breaking predictability, strongly reduced gamma synchronization while increasing firing rates. Differences between responses to different colors, including strong gamma-responses to red, arose from stimulus adaptation to a full-screen background, suggesting prominent differences in adaptation between M- and L-cone signaling pathways. Thus, synchrony signaled whether RF inputs were predicted from spatial context, while firing rates increased when stimuli were unpredicted from context.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.7554/eLife.42101

Dina V Popovkina; Wyeth Bair; Anitha Pasupathy

Modeling diverse responses to filled and outline shapes in macaque V4 Journal Article

Journal of Neurophysiology, 121 (3), pp. 1059–1077, 2019.

Abstract | Links | BibTeX

@article{Popovkina2019,
title = {Modeling diverse responses to filled and outline shapes in macaque V4},
author = {Dina V Popovkina and Wyeth Bair and Anitha Pasupathy},
doi = {10.1152/jn.00456.2018},
year = {2019},
date = {2019-01-01},
journal = {Journal of Neurophysiology},
volume = {121},
number = {3},
pages = {1059--1077},
abstract = {Visual area V4 is an important midlevel cortical processing stage that subserves object recognition in primates. Studies investigating shape coding in V4 have largely probed neuronal responses with filled shapes, i.e., shapes defined by both a boundary and an interior fill. As a result, we do not know whether form-selective V4 responses are dictated by boundary features alone or if interior fill is also important. We studied 43 V4 neurons in two male macaque monkeys ( Macaca mulatta) with a set of 362 filled shapes and their corresponding outlines to determine how interior fill modulates neuronal responses in shape-selective neurons. Only a minority of neurons exhibited similar response strength and shape preferences for filled and outline stimuli. A majority responded preferentially to one stimulus category (either filled or outline shapes) and poorly to the other. Our findings are inconsistent with predictions of the hierarchical-max (HMax) V4 model that builds form selectivity from oriented boundary features and takes little account of attributes related to object surface, such as the phase of the boundary edge. We modified the V4 HMax model to include sensitivity to interior fill by either removing phase-pooling or introducing unoriented units at the V1 level; both modifications better explained our data without increasing the number of free parameters. Overall, our results suggest that boundary orientation and interior surface information are both maintained until at least the midlevel visual representation, consistent with the idea that object fill is important for recognition and perception in natural vision.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1152/jn.00456.2018

Rishi Rajalingham; James J DiCarlo

Reversible inactivation of different millimeter-scale regions of primate IT results in different patterns of core object recognition deficits Journal Article

Neuron, 102 , pp. 493–505, 2019.

Abstract | Links | BibTeX

@article{Rajalingham2019,
title = {Reversible inactivation of different millimeter-scale regions of primate IT results in different patterns of core object recognition deficits},
author = {Rishi Rajalingham and James J DiCarlo},
doi = {10.1016/j.neuron.2019.02.001},
year = {2019},
date = {2019-01-01},
journal = {Neuron},
volume = {102},
pages = {493--505},
publisher = {Elsevier Inc.},
abstract = {Extensive research suggests that the inferior temporal (IT) population supports visual object recognition behavior. However, causal evidence for this hypothesis has been equivocal, particularly beyond the specific case of face-selective subregions of IT. Here, we directly tested this hypothesis by pharmacologically inactivating individual, millimeter-scale subregions of IT while monkeys performed several core object recognition subtasks, interleaved trial-by-trial. First, we observed that IT inactivation resulted in reliable contralateral-biased subtask-selective behavioral deficits. Moreover, inactivating different IT subregions resulted in different patterns of subtask deficits, predicted by each subregion's neuronal object discriminability. Finally, the similarity between different inactivation effects was tightly related to the anatomical distance between corresponding inactivation sites. Taken together, these results provide direct evidence that the IT cortex causally supports general core object recognition and that the underlying IT coding dimensions are topographically organized.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1016/j.neuron.2019.02.001

Douglas A Ruff; Marlene R Cohen

Simultaneous multi-area recordings suggest that attention improves performance by reshaping stimulus representations Journal Article

Nature Neuroscience, 22 , pp. 1669–1676, 2019.

Abstract | Links | BibTeX

@article{Ruff2019,
title = {Simultaneous multi-area recordings suggest that attention improves performance by reshaping stimulus representations},
author = {Douglas A Ruff and Marlene R Cohen},
doi = {10.1038/s41593-019-0477-1},
year = {2019},
date = {2019-01-01},
journal = {Nature Neuroscience},
volume = {22},
pages = {1669--1676},
publisher = {Springer US},
abstract = {Visual attention dramatically improves individuals' ability to see and modulates the responses of neurons in every known visual and oculomotor area, but whether such modulations can account for perceptual improvements is unclear. We measured the relationship between populations of visual neurons, oculomotor neurons and behavior during detection and discrimination tasks. We found that neither of the two prominent hypothesized neuronal mechanisms underlying attention (which concern changes in information coding and the way sensory information is read out) provide a satisfying account of the observed behavioral improvements. Instead, our results are more consistent with the hypothesis that attention reshapes the representation of attended stimuli to more effectively influence behavior. Our results suggest a path toward understanding the neural underpinnings of perception and cognition in health and disease by analyzing neuronal responses in ways that are constrained by behavior and interactions between brain areas.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

  • doi:10.1038/s41593-019-0477-1

Amirsaman Sajad; David C Godlove; Jeffrey D Schall

Cortical microcircuitry of performance monitoring Journal Article

Nature Neuroscience, 22 , pp. 265–274, 2019.

Abstract | Links | BibTeX

@article{Sajad2019,
title = {Cortical microcircuitry of performance monitoring},
author = {Amirsaman Sajad and David C Godlove and Jeffrey D Schall},
doi = {10.1038/s41593-018-0309-8},
year = {2019},
date = {2019-01-01},
journal = {Nature Neuroscience},
volume = {22},
pages = {265--274},
publisher = {Springer US},
abstract = {The medial frontal cortex enables performance monitoring, indexed by the error-related negativity (ERN) and manifested by performance adaptations. We recorded electroencephalogram over and neural spiking across all layers of the supplementary eye field, an agranular cortical area, in monkeys performing a saccade-countermanding (stop signal) task. Neurons signaling error production, feedback predicting reward gain or loss, and delivery of fluid reward had different spike widths and were concentrated differently across layers. Neurons signaling error or loss of reward were more common in layers 2 and 3 (L2/3), whereas neurons signaling gain of reward were more common in layers 5 and 6 (L5/6). Variation of error– and reinforcement-related spike rates in L2/3 but not L5/6 predicted response time adaptation. Variation in error-related spike rate in L2/3 but not L5/6 predicted ERN magnitude. These findings reveal novel features of cortical microcircuitry supporting performance monitoring and confirm one cortical source of the ERN.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Jason M Samonds; Veronica Choi; Nicholas J Priebe

Mice discriminate stereoscopic surfaces without fixating in depth Journal Article

Journal of Neuroscience, 39 (41), pp. 8024–8037, 2019.

@article{Samonds2019,
title = {Mice discriminate stereoscopic surfaces without fixating in depth},
author = {Jason M Samonds and Veronica Choi and Nicholas J Priebe},
doi = {10.1523/JNEUROSCI.0895-19.2019},
year = {2019},
date = {2019-01-01},
journal = {Journal of Neuroscience},
volume = {39},
number = {41},
pages = {8024--8037},
abstract = {Stereopsis is a ubiquitous feature of primate mammalian vision, but little is known about if and how rodents such as mice use stereoscopic vision. We used random dot stereograms to test for stereopsis in male and female mice, and they were able to discriminate near from far surfaces over a range of disparities, with diminishing performance for small and large binocular disparities. Based on two-photon measurements of disparity tuning, the range of disparities represented in the visual cortex aligns with the behavior and covers a broad range of disparities. When we examined their binocular eye movements, we found that, unlike primates, mice did not systematically vary relative eye positions or use vergence eye movements when presented with different disparities. Nonetheless, the representation of disparity tuning was wide enough to capture stereoscopic information over a range of potential vergence angles. Although mice share fundamental characteristics of stereoscopic vision with primates and carnivores, their lack of disparity-dependent vergence eye movements and wide neuronal representation suggests that they may use a distinct strategy for stereopsis.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Morteza Sarafyazd; Mehrdad Jazayeri

Hierarchical reasoning by neural circuits in the frontal cortex Journal Article

Science, 364 , pp. 1–11, 2019.

@article{Sarafyazd2019,
title = {Hierarchical reasoning by neural circuits in the frontal cortex},
author = {Morteza Sarafyazd and Mehrdad Jazayeri},
doi = {10.1126/science.aav8911},
year = {2019},
date = {2019-01-01},
journal = {Science},
volume = {364},
pages = {1--11},
abstract = {Humans process information hierarchically. In the presence of hierarchies, sources of failures are ambiguous. Humans resolve this ambiguity by assessing their confidence after one or more attempts. To understand the neural basis of this reasoning strategy, we recorded from dorsomedial frontal cortex (DMFC) and anterior cingulate cortex (ACC) of monkeys in a task in which negative outcomes were caused either by misjudging the stimulus or by a covert switch between two stimulus-response contingency rules. We found that both areas harbored a representation of evidence supporting a rule switch. Additional perturbation experiments revealed that ACC functioned downstream of DMFC and was directly and specifically involved in inferring covert rule switches. These results reveal the computational principles of hierarchical reasoning, as implemented by cortical circuits.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Veronica E Scerra; M Gabriela Costello; Emilio Salinas; Terrence R Stanford

All-or-none context dependence delineates limits of FEF visual target selection Journal Article

Current Biology, 29 (2), pp. 294–305, 2019.

@article{Scerra2019,
title = {All-or-none context dependence delineates limits of FEF visual target selection},
author = {Veronica E Scerra and M Gabriela Costello and Emilio Salinas and Terrence R Stanford},
doi = {10.1016/j.cub.2018.12.013},
year = {2019},
date = {2019-01-01},
journal = {Current Biology},
volume = {29},
number = {2},
pages = {294--305},
abstract = {Choices of where to look are informed by perceptual judgments, which locate objects of current value or interest within the visual scene. This perceptual-motor transform is partly implemented in the frontal eye field (FEF), where visually responsive neurons appear to select behaviorally relevant visual targets and, subsequently, saccade-related neurons select the movements required to look at them. Here, we use urgent decision-making tasks to show (1) that FEF motor activity can direct accurate, visually informed choices in the complete absence of prior target-distracter discrimination by FEF visual responses and (2) that such discrimination by FEF visual cells shows an all-or-none reliance on the presence of stimulus attributes strongly associated with saliency-driven attentional allocation. The present findings suggest that FEF visual target selection is specific to visual judgments made on the basis of saliency and may not play a significant role in guiding saccadic choices informed solely by feature content.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Shiva Farashahi; Christopher H Donahue; Benjamin Y Hayden; Daeyeol Lee; Alireza Soltani

Flexible combination of reward information across primates Journal Article

Nature Human Behaviour, 3 (11), pp. 1215–1224, 2019.

@article{Farashahi2019,
title = {Flexible combination of reward information across primates},
author = {Shiva Farashahi and Christopher H Donahue and Benjamin Y Hayden and Daeyeol Lee and Alireza Soltani},
doi = {10.1038/s41562-019-0714-3},
year = {2019},
date = {2019-01-01},
journal = {Nature Human Behaviour},
volume = {3},
number = {11},
pages = {1215--1224},
publisher = {Springer US},
abstract = {A fundamental but rarely contested assumption in economics and neuroeconomics is that decision-makers compute subjective values of risky options by multiplying functions of reward probability and magnitude. By contrast, an additive strategy for valuation allows flexible combination of reward information required in uncertain or changing environments. We hypothesized that the level of uncertainty in the reward environment should determine the strategy used for valuation and choice. To test this hypothesis, we examined choice between risky options in humans and rhesus macaques across three tasks with different levels of uncertainty. We found that whereas humans and monkeys adopted a multiplicative strategy under risk when probabilities are known, both species spontaneously adopted an additive strategy under uncertainty when probabilities must be learned. Additionally, the level of volatility influenced relative weighting of certain and uncertain reward information, and this was reflected in the encoding of reward magnitude by neurons in the dorsolateral prefrontal cortex.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
