
American Physiological Society

Modeling auditory-visual evoked eye-head gaze shifts in dynamic multisteps

Overview of attention for article published in Journal of Neurophysiology, January 2018

About this Attention Score

  • Average Attention Score compared to outputs of the same age
  • Average Attention Score compared to outputs of the same age and source

Mentioned by

  • 4 X users
  • 2 Facebook pages

Citations

  • 5 (Dimensions)

Readers on

  • 23 (Mendeley)
Title: Modeling auditory-visual evoked eye-head gaze shifts in dynamic multisteps
Published in: Journal of Neurophysiology, January 2018
DOI: 10.1152/jn.00502.2017
Pubmed ID:
Authors: Bahadir Kasap, A John van Opstal

Abstract

In dynamic visual or auditory gaze double-steps, a brief target flash or sound burst is presented in mid-flight of an intervening eye-head gaze shift. Behavioral experiments in humans and monkeys have indicated that the subsequent eye and head movements to the target are goal-directed, regardless of stimulus timing, first gaze-shift characteristics, and initial conditions. This remarkable behavior requires that the gaze-control system (i) has continuous access to accurate signals about eye-in-head position and ongoing eye-head movements, (ii) accounts for different internal signal delays, and (iii) updates the retinal (TE) and head-centric (TH) target coordinates into appropriate eye-centered and head-centered motor commands on millisecond time scales. As predictive, feedforward remapping of targets cannot account for this behavior, we propose that targets are transformed into, and stored in, a stable reference frame as soon as their sensory information becomes available. We present a computational model in which recruited cells in the midbrain Superior Colliculus drive the eyes and head to the stored target location through a common dynamic oculocentric gaze-velocity command, which is continuously updated from the stable goal and transformed into appropriate oculocentric and craniocentric motor commands. We describe two equivalent, yet conceptually different, implementations that both account for the complex but accurate kinematic behavior and trajectories of eye-head gaze shifts under a variety of challenging multisensory conditions, such as dynamic visual-auditory multisteps.
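The core feedback idea in the abstract can be illustrated with a toy simulation. The sketch below is not the authors' model (it omits the Superior Colliculus population entirely and collapses everything to one dimension): it only shows how storing the goal in a stable frame and recomputing a common gaze-velocity command from the remaining motor error, every millisecond, yields goal-directed shifts even when the target changes mid-flight. All names, gains, the head-contribution fraction, and the oculomotor-range limit are assumptions made for illustration.

```python
# Minimal 1-D sketch (assumed values throughout, not the authors' implementation):
# the target is stored in a stable, body-centered frame; each millisecond a
# common gaze-velocity command is recomputed from the remaining motor error and
# split into eye-in-head and head-on-body components.

DT = 0.001        # 1 ms update interval (s)
GAZE_GAIN = 60.0  # assumed proportional gain on gaze error (1/s)
HEAD_SHARE = 0.3  # assumed fraction of the gaze command routed to the head
EYE_LIMIT = 35.0  # assumed oculomotor range (deg)

def gaze_shift(target_body, eye0=0.0, head0=0.0, n_steps=300):
    """Drive 1-D eye and head toward a target stored in body coordinates."""
    eye, head = eye0, head0
    trajectory = []
    for _ in range(n_steps):
        gaze = eye + head                 # current gaze-in-body direction
        error = target_body - gaze        # dynamic motor error to stored goal
        gaze_vel = GAZE_GAIN * error      # common gaze-velocity command
        head_vel = HEAD_SHARE * gaze_vel  # craniocentric component
        eye_vel = gaze_vel - head_vel     # oculocentric component
        eye = max(-EYE_LIMIT, min(EYE_LIMIT, eye + eye_vel * DT))
        head += head_vel * DT
        trajectory.append((eye, head, eye + head))
    return trajectory

# Example: a 40-deg gaze shift from straight ahead. Because the goal lives in
# a stable frame, a dynamic double-step amounts to rerunning the loop with the
# new target from whatever eye/head state the system is in at that moment.
eye_f, head_f, gaze_f = gaze_shift(40.0)[-1]  # gaze_f converges to ~40 deg
```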

X Demographics

The data shown below were collected from the profiles of 4 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 23 Mendeley readers of this research output.

Geographical breakdown

Country    Count    As %
Unknown    23       100%

Demographic breakdown

Readers by professional status          Count    As %
Student > Ph.D. Student                 5        22%
Researcher                              3        13%
Student > Master                        3        13%
Student > Doctoral Student              2        9%
Professor                               2        9%
Other                                   3        13%
Unknown                                 5        22%

Readers by discipline                   Count    As %
Psychology                              5        22%
Neuroscience                            5        22%
Computer Science                        2        9%
Agricultural and Biological Sciences    1        4%
Sports and Recreations                  1        4%
Other                                   3        13%
Unknown                                 6        26%
Attention Score in Context

This research output has an Altmetric Attention Score of 3. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 21 May 2018.
  • All research outputs: #14,920,631 of 25,382,440 outputs
  • Outputs from Journal of Neurophysiology: #4,267 of 8,425 outputs
  • Outputs of similar age: #229,497 of 448,910 outputs
  • Outputs of similar age from Journal of Neurophysiology: #57 of 99 outputs
Altmetric has tracked 25,382,440 research outputs across all sources so far. This one is in the 40th percentile – i.e., 40% of other outputs scored the same or lower than it.
So far Altmetric has tracked 8,425 research outputs from this source. They receive a mean Attention Score of 4.8. This one is in the 48th percentile – i.e., 48% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 448,910 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 48th percentile – i.e., 48% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 99 others from the same source and published within six weeks on either side of this one. This one is in the 39th percentile – i.e., 39% of its contemporaries scored the same or lower than it.
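As a rough check on how these percentiles relate to the ranks above, one can approximate the percentile as (total − rank) / total. Altmetric's exact tie-handling rule is not stated on this page, so small discrepancies between this estimate and the quoted figures are expected. A minimal sketch:

```python
# Back-of-envelope check of the quoted percentiles, assuming
# percentile ~ (total - rank) / total. The exact tie-handling rule is
# unknown here, so small deviations from the quoted values are expected.
cohorts = {
    "All research outputs":        (14_920_631, 25_382_440),  # quoted: 40th
    "Journal of Neurophysiology":  (4_267, 8_425),            # quoted: 48th
    "Similar age, all sources":    (229_497, 448_910),        # quoted: 48th
    "Similar age, same journal":   (57, 99),                  # quoted: 39th
}
for label, (rank, total) in cohorts.items():
    pct = 100 * (total - rank) / total
    print(f"{label}: rank #{rank:,} of {total:,} -> ~{pct:.0f}th percentile")
```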