ORIENT ERC adv Grant nr. 693400

Project summary

Rapid object identification is crucial for our survival, but it poses daunting challenges to the brain when many stimuli compete for attention, and multiple sensory and motor systems are involved in planning and generating an eye-head gaze-orienting response to a selected goal.

Research question: How do normal and sensory-impaired brains decide which signals to integrate ("goal"), or suppress ("distracter")?

Audiovisual (AV) integration only helps for spatially and temporally aligned stimuli. However, sensory inputs differ markedly in their reliability and precision, reference frames, and processing delays, yielding considerable spatial-temporal uncertainty. Moreover, vision and audition utilise coordinates that misalign whenever the eyes and head move, while their sensory acuities vary across space and time in fundamentally different ways. As a result, assessing AV alignment poses a number of major neuro-computational problems, which so far have been studied only for the simplest stimulus-response conditions.

In three integrated research projects my team studies and models bottom-up (sensory) and top-down (task-driven) gaze-control mechanisms of AV integration in complex environments:
  1. We impose unique eye-head gaze-orienting tasks on healthy controls and on patients with sensory, motor, or cognitive disorders, while systematically varying the AV stimulus statistics. We will uncover how healthy and impaired systems construct Priors about the environment, and utilise dynamic active and passive self-motion feedback (1 PhD/2 postdocs; Nijmegen).
  2. We challenge prevailing computational models of statistical inference in the brain by incorporating top-down control and fast eye-head sensorimotor feedback into cortical-midbrain neural network models, driven by our novel psychophysical data and our recent monkey neurophysiological data (1 postdoc; Nijmegen).
  3. As a critical test for our concepts and models, we apply our results to a novel autonomous humanoid eye-head robot that is equipped with foveal vision, realistic auditory inputs, three-dimensional nested motor systems, and rapid sensorimotor feedback and learning algorithms (1 PhD/1 postdoc/1 engineer; Lisbon).




The Nijmegen team.


John van Opstal, coordinator
Jesse Heckman, PhD student, psychophysics
Annemiek Barsingerhorn, postdoc, psychophysics
Ad Snik, co-supervisor
Francesca Rocchi, postdoc (until May 2020), AV psychophysics
Arezoo Alizadeh, postdoc, computational modelling
Antoine Bernas, postdoc, auditory motion and plasticity
Guus van Bentum, postdoc, auditory motion and plasticity
Our new L-arm auditory - vestibular robot system
for our moving sound experiments (d.d. June 2021).
Our experimental facilities and support staff
Gunter Windau, ICT
Stijn Martens, Mechanics
Ruurd Lof, Physics, Electronics
  
Two-axis vestibular chair and Teun Maas (master student)
Auditory setup ('the sphere')
fNIRS - EEG lab
   
   
   
Merlijn Kemna and Sten Lahaije
Bachelor students
Mina Galis and Mathijs Prudon
Bachelor students
Lennaert van der Molen & Wouter Mattheussens
Master students





The Lisbon team.


Alexandre Bernardino, co-PI
Jose Santos-Victor, Dept. Head
Our prototype Eye
Miguel Lucas, master student, robotics, Apr-Oct 2017
Alex and Carlos Aleluia, master student, robotics
Mariana Martins, master student, computer vision
       
       
       
Akhil John, PhD student, robotics,
and his new eye-design
Eye with torsional pulley for Superior Oblique,
and Rui Cardoso, master student, eye control
Bernardo das Chagas, master student, eye-head control and
Gonçalo Pereira, master student, computer vision and control (sensor fusion)
   
       
       
       
Reza Javanmard, postdoc, Learning optimal control with NN
and Henrique Granado, student, works on model-free learning.
Jose Noronha and Ricardo Valadas
Master students, modelling eye-muscle sideslip
John and Alex, proud supervisors of the EyeTeam....
Jose Gomes, master student, works with the event camera, and
Jose Gaspar, PI, Visual navigation and calibration.





Fish-eye camera view from our prototype eye, showing Alex, Jose, John, and Plinio.



Typical Zoom session of our weekly Team discussions (March 9, 2021)







Subproject 1. Multisensory human gaze-orienting behavior in health and disease.

Background and relevance.
Human AV integration has typically been studied under head-fixed, stationary conditions in simple sensory environments. In this Subproject we will extend the current body of knowledge by studying multisensory-evoked orienting responses in cluttered AV environments for: (i) head-unrestrained active gaze shifts, with or without (ii) whole-body low-frequency passive rotation through two-axis vestibular stimulation.
While active eye-head orienting allows use of all available sensory cues and efferent motor feedback signals (Figure B2-4), head-restrained passive vestibular rotation denies feedback from neck proprioception and head-motor corollary discharges. Comparing these movement conditions allows us to assess the importance and contribution of motor-feedback signals to overt orienting behaviour in health and disease, in particular under challenging audiovisual search conditions.



The research problems outlined in Section (a) are studied in healthy control subjects and in different groups of patients. The philosophy of my experiments is that the advanced neuro-computational algorithms for multisensory-evoked gaze shifts pose highly nontrivial challenges, in particular for the impaired brain. Further, our Bayesian theory predicts specific, quantitative effects (regarding precision and accuracy) of reduced sensory resolution on AV integration and eye-head coordination; studying the behavioural strategies of patients with particular sensory impairments (visual, auditory, or vestibular) will therefore shed new light on the implementation of statistical processing in the brain.
I firmly argue that these insights will be of crucial importance for future improvements of sensory aids and behavioural therapies for these patients.
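
As an illustration of the kind of quantitative prediction our Bayesian framework makes, consider the standard maximum-likelihood formulation of cue combination (a minimal sketch, assuming independent Gaussian noise on the auditory and visual location estimates x_A and x_V with variances sigma_A^2 and sigma_V^2; it is not the full dynamic model of this Action):

\[ \hat{x}_{AV} \;=\; \frac{\sigma_V^2}{\sigma_A^2+\sigma_V^2}\,x_A \;+\; \frac{\sigma_A^2}{\sigma_A^2+\sigma_V^2}\,x_V, \qquad \sigma_{AV}^2 \;=\; \frac{\sigma_A^2\,\sigma_V^2}{\sigma_A^2+\sigma_V^2} \;\le\; \min\!\left(\sigma_A^2,\sigma_V^2\right). \]

A patient with degraded auditory (or visual) resolution should thus shift the weight toward the intact modality and gain less precision from integration; these are precisely the accuracy and precision effects that our gaze-orienting experiments quantify.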




Subproject 2. Computational modelling

Beyond the state-of-the-art:
Our conceptual dynamic Bayesian model will be formulated into a neurobiologically realistic neuro-computational model of multisensory-evoked eye-head gaze control, guided by the results from our most recent monkey recordings in the midbrain, and our new human behavioural experiments. We will significantly extend our optimal control model of the midbrain Superior Colliculus (SC) (Figure 5), which explains how (eye-only) saccade-related activity of the neuronal population in the SC encodes the instantaneous trajectory and kinematics of saccadic eye movements by summing the instantaneous spike contribution of each neuron across the recruited population:
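
In compact form (a sketch of this linear dynamic ensemble-coding scheme, with N_k(t) the cumulative number of spikes of neuron k up to time t, and m_k its fixed, site-specific movement contribution in the motor map, summed over the N_pop recruited cells):

\[ \Delta\vec{E}(t) \;=\; \sum_{k=1}^{N_{pop}} N_k(t)\,\vec{m}_k . \]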



We recently discovered an important topography in the functional organisation of the SC, in which peak firing rates and burst durations of recruited cells vary systematically with their location in the SC motor map: rostral cells (associated with small saccades) have higher firing rates for their optimal saccade (~800 spks/s) than caudal cells (large saccades; ~300 spks/s). Our theoretical analysis revealed that this property underlies the well-documented nonlinear, saturating relation between saccade amplitude and peak eye velocity. Interestingly, this saccade "main sequence" reflects an optimal control strategy that implements a speed-accuracy trade-off in the primate oculomotor system. It accounts, in the best possible way, for two conflicting constraints in the system: (i) generate a precise eye movement as fast as possible, (ii) despite the poor spatial resolution (high uncertainty) of the peripheral retina.
We have thus identified the first neurophysiological correlate of an optimal control mechanism in the brain!
Our model is driven by recorded spike trains, and predicts the full kinematic repertoire of eye saccades in all directions and amplitudes with remarkable simplicity: only two free parameters (the gain B and the feedback delay in Fig. 5).
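
To make this two-parameter structure concrete, the sketch below (Python; an illustrative toy with assumed variable names and parameter values, not the code of our published model) sums the recorded spike trains into a dynamic desired gaze displacement and feeds it through a simple local feedback loop with gain B and feedback delay:

    import numpy as np

    def population_drive(spike_times, site_vecs, t):
        # spike_times: list of 1-D numpy arrays of spike times (s) per neuron;
        # site_vecs: list of 2-D movement contributions (mini-vectors) per site.
        # Sum over recruited neurons: spike count up to time t, times site vector.
        return sum(np.sum(st <= t) * m for st, m in zip(spike_times, site_vecs))

    def simulate_saccade(spike_times, site_vecs, B=60.0, delay=0.020,
                         dt=0.001, duration=0.08):
        # Local feedback: eye velocity is proportional (gain B) to the difference
        # between the desired displacement and a delayed efference copy of the
        # displacement realised so far.
        n = int(duration / dt)
        eye = np.zeros((n, 2))
        for i in range(1, n):
            t = i * dt
            j = max(0, i - int(delay / dt))           # delayed feedback sample
            motor_error = population_drive(spike_times, site_vecs, t) - eye[j]
            eye[i] = eye[i - 1] + B * motor_error * dt
        return eye

With a larger gain the movement becomes faster but, because of the delay, may start to oscillate; in the actual model the gain and delay are the only quantities tuned to the recorded saccade kinematics.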

My ERC project:
The model of Figure 5 is far from complete, and will form the starting point for a full 3D sensorimotor scheme that includes eye-head coordination, and local excitation and global inhibition within an intrinsic SC network, such that it explains, rather than uses, the recorded bursting activity profiles of the SC cells.

So far (June, 2019), we have made considerable progress in our modelling efforts, by developing a full computational model with spiking neurons, and by extending the model of Figure 5 to eye-head coordination (see our publications: Kasap and Van Opstal, 2018a,b,c, 2019, and Van Opstal and Kasap, 2018, 2019).




Subproject 3. Biomimetic Eye Robotic System

Motivation
The real critical test for the validity of our models is a realistic hardware implementation that has to cope flexibly with the same fundamental target selection, acquisition, and following problems as human subjects (see Cully and Clune, Nature 521: 503-507, 2015, for recent strong support for this argument). In my third subproject, my collaborating team from the Instituto Superior Técnico in Lisbon will design and test a novel humanoid eye robotic system that is guided by the same principles as uncovered from our monkey recordings (Fig. 5), our human psychophysics (Fig. B4-2), and our modelling.
The test is critical because (i) a robot will face truly unexpected situations in the environment and in its own dynamics, which will be unforeseen in necessarily restricted computer-only simulations. (ii) Its physical dynamics can be quite complex and a priori unknown, because of (unknown) internal delays in its circuitry, nonlinear interactions, and complex dependencies between different segments, etc. These aspects are at best simplified in computational modelling. (iii) The robot will have to autonomously discover and learn its own dynamics and optimal, yet flexible, behaviours through reinforcement (by minimising "costs"), through exploration of, and interaction with, the environment, relying on the general principles uncovered in my Subprojects 1 and 2. An important challenge to be solved in this Subproject will be to define the set of Cost Functions (e.g., speed-accuracy trade-off, energy expenditure, movement duration, task constraints, trajectory formation, etc.) under which the control is optimised, just as in humans.
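
As a concrete (and deliberately simplified) way to write such a composite cost, one could take a weighted sum of movement duration, control effort, and endpoint error; the weights alpha, beta, gamma below are illustrative placeholders, not the final choice for the robot:

\[ J(u, p) \;=\; \alpha\, p \;+\; \beta \int_{0}^{p} \lVert u(t)\rVert^{2}\, dt \;+\; \gamma\, \lVert x(p) - x_{T}\rVert^{2}, \]

where p is the movement duration, u(t) the control signal, x(p) the final state, and x_T the target. The speed-accuracy trade-off then emerges from the balance between the duration and accuracy terms; reinforcement learning amounts to discovering, through interaction with the environment, the policy that minimises such a cost for the robot's initially unknown dynamics.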

So far, the gaze-control behaviours of existing robots are not guided by the principles of the human brain. For example, the eyes are typically simplified to cameras with homogeneous fields of view, equipped with relatively simple Winner-Take-All computations to find an instructed target. Similar limitations apply to the robot's ears. However, our studies, and those of others, have made clear that the presence of a fovea in primate eyes (predator!) is crucial to understand the efficient, rapid, and precise search for targets through fast saccadic eye-head gaze shifts with optimal kinematics, and limited (visual, serial) processing time. Further, adequate sound localisation (on the basis of different, but complementary, acoustic cues) is needed to extend the range of the limited visual resolution to far beyond the fovea and visual field, and to allow for efficient audiovisual integration that makes the total system better than the sum of its parts. For these reasons we will construct, in close collaboration with our Lisbon colleagues, our own novel robot system. Figure 8 illustrates the basic mechanics of our planned eye robot.
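
One of the complementary acoustic cues mentioned above, the interaural time difference (ITD), can be estimated by cross-correlating the left and right microphone signals. The sketch below (Python; an illustrative toy, not the robot's actual audio pipeline; 'left' and 'right' are assumed 1-D numpy arrays of equal length, sampled at fs Hz) returns the lag with maximal correlation within the physiologically plausible range:

    import numpy as np

    def estimate_itd(left, right, fs, max_itd=0.7e-3):
        # Restrict the search to |ITD| < ~0.7 ms (human-sized head).
        max_lag = int(max_itd * fs)
        lags = np.arange(-max_lag, max_lag + 1)
        xcorr = [np.dot(left[max(0, -l):len(left) - max(0, l)],
                        right[max(0, l):len(right) - max(0, -l)]) for l in lags]
        return lags[int(np.argmax(xcorr))] / fs   # ITD in seconds

Spectral (pinna) cues and interaural level differences would complement this estimate for elevation and for high frequencies, respectively.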

Fig. 8

Figure 8: (a) Schematic of our robotic humanoid eye, consisting of three antagonistic muscle pairs, each connected to a single rotational motor that is driven by the neural network models of Subproject 2. Motors and muscles are drawn displaced re. eye for illustrative purposes only. For example, the medial and lateral recti (MR, LR) are both connected to a single motor with a vertical rotation axis. Motors can rotate bi-directionally, at a velocity that is specified by our SC model (Fig. 5). The LR and MR will thus be appropriately lengthened or shortened to generate a saccade. Hor.-vert. rotations will be confined to about 40 deg from straight ahead, and torsion to about 15 deg, to impose a realistic oculomotor range that necessitates early use of head movements for peripheral goals. The three motors together implement the 3D kinematic principles described by Listing's and Donders' laws. We will investigate whether the muscles should also be equipped with strain gauges to provide proprioceptive feedback signals. (b) The visual input is filtered such that it has high resolution only around the center ('foveal vision').
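
A minimal sketch of the space-variant 'foveal' filtering of panel (b) is given below (Python; an illustrative toy, not the robot's actual vision pipeline; 'image' is assumed to be a 2-D numpy array and sigma_max an arbitrary blur scale in pixels): the image stays sharp at the optical center and is blended into an increasingly blurred copy with growing eccentricity.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def foveate(image, sigma_max=8.0):
        h, w = image.shape
        yy, xx = np.mgrid[0:h, 0:w]
        # Normalised eccentricity: 0 at the image center, 1 at the far corner.
        r = np.hypot(yy - h / 2, xx - w / 2) / np.hypot(h / 2, w / 2)
        blurred = gaussian_filter(image.astype(float), sigma=sigma_max)
        # Blend sharp and blurred copies with a weight that grows with eccentricity.
        return (1 - r) * image + r * blurred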







Progress reports (d.d. May, 2021)


Months 1-18


The project started in January 2017, setting up the collaboration with the Visual Lab of prof Alexandre Bernardino of the Robotics Institute at the Instituto Superior Técnico in Lisbon as third Linked Party. The project coordinator (JvO) paid several visits to the Lisbon group to jointly draft a formal Memorandum of Understanding, in which both parties agreed on the terms of the collaboration. The MoU was formally signed by the Institute Deans of both Universities (Nijmegen and Lisbon) in Sept. 2017.

The two-axis vestibular chair at the Faculty of Science (part of Radboud Research Facilities, RRF) became available for testing by the end of the spring of 2017. During the first few months of the project (Feb-May, 2017), the PI appointed Bahadir Kasap (PhD student), to work on a model of the oculomotor midbrain by implementing a novel spiking neural network algorithm (implementation of Subproject 2). At the moment of writing this report (August 2018), five manuscripts have arisen from this work (two publications appeared: in the Journal of Neurophysiology (2018), and in Neurocomputing (2018); three papers have been submitted: to Frontiers in Applied Mathematics and Statistics, to PLoS Computational Biology, and to Progress in Brain Research).

In the meantime, the PI initiated a search to fill the PhD and postdoc positions for Subproject 1 (human psychophysics). Two excellent candidates were appointed in the fall of 2017: Jesse Heckman (PhD1, per Sept 2017), and Annemiek Barsingerhorn (Postdoc 1, per Oct 2017). We were lucky to appoint prof Ad Snik, who recently retired from the Otolaryngology Dept. of the Radboud UMC, per Sept 1, 2017 as co-supervisor of the patient psychophysics. Prof Snik is a world-recognised expert on auditory technology and audiology, and his appointment therefore fits perfectly with the Action's aims concerning our work with sensory patients.

We have appointed a second postdoc to pursue the work on modelling the gaze control system with spiking neuronal networks (Subproject 2; Arezoo Alizadeh), and have also hired a third and a fourth postdoc to strengthen the team of Subproject 1 (Guus van Bentum and Antoine Bernas).

The collaboration with prof Bernardino is well on its way. Between April and Oct 2017 a master student (Miguel Ruiz Lucas) designed and tested a prototype robotic eye (resulting in his Master Thesis report, and his graduation in Oct 2017). As of April 2018, two Lisbon master students are continuing the work to improve and extend this prototype (Carlos Aleluia, who works on the 3D kinematics and on mechanical improvements, and Mariana Martins, who focuses on visual-image processing and stabilisation of the system's positioning). We have recently recruited a PhD student (Akhil John, appointed per Sept. 2018) and have recently hired an excellent postdoc (Reza Javanmard, Iran, appointed fall 2020). Because of the successful collaboration, the PI prepared an amendment to the Grant Agreement in 2019 to change the status of the Lisbon group to that of co-Beneficiary, so that the collaboration will also include formal scientific work from the Lisbon group.

Months 1-30 (midterm)


The project started in January 2017, setting up the collaboration with the Visual Lab of prof Alexandre Bernardino of the Robotics Institute at the Instituto Superior Técnico in Lisbon, and starting the research by hiring the first applicants and interns. The Memorandum of Understanding was formally signed by the Institute Deans of both Universities (Nijmegen and Lisbon) in Sept. 2017. So far, 22 research papers have resulted from the work in this Action: 15 papers from Subproject 1, 6 papers from Subproject 2, and 1 paper from Subproject 3. Overall, the project can be considered a great success.

Subproject 1: Human multisensory gaze control in complex environments: psychophysics.

The multisensory two-axis vestibular chair at the Faculty of Science (Radboud Research Facilities), which is the central experimental facility for this Action, became fully available for the psychophysical experiments of Subproject 1 in June 2018 (for a video, see: Chair). In Sept. 2017, J Heckman (PhD 1), and per Oct. 2017, A Barsingerhorn (Postdoc 1), were appointed. Experiments on the neural mechanisms underlying sound localisation in noisy environments, and on the Bayesian mechanisms underlying audiovisual integration, were published in 2017/2018/2019: Van Opstal et al., 2017; Bremen et al., 2018; Van Bentum et al., 2017; Ege et al., 2018a,b; 2019; Zonooz et al., 2018a,b. The PI has given several presentations on this work at international conferences and invited seminars, e.g. at the NCM meetings in Santa Fe, NM, USA, and in Toyama, Japan; in Rovereto, Italy; and in Kosice, Slovakia. PhD1 presented his work at the Gordon Conference on eye movements in Lewiston, ME, USA. Postdoc 1 presents her current results at the European Conference on Eye Movements in Alicante. Recently, we published our first paper on head-movement pursuit of moving sounds, which received a lot of international attention (even an appearance of the PI on an Australian breakfast show!). The paper appeared in eNeuro in 2021: JA Garcia-Uceda Calvo, AJ Van Opstal and MM Van Wanrooij: "Adaptive response behavior in the pursuit of unpredictably moving sounds". See: eNeuro 8(3) 1-14, 2021.
We hired prof A Snik per Sept 1, 2017 to work on sensory-deprived patients in collaboration with our applicants. He is a world-recognized expert on auditory technology and audiology, and his appointment fits perfectly in the Action’s aims. In Jan 2019, we attracted F Rocchi (Italy) as Postdoc2 to ORIENT to work on audio-visual psychophysics and plasticity/adaptation.
To set up the auditory patient work under the supervision of Prof Snik, we hired two young PhD researchers for a period of 6 months (Jan-June, 2019): S Sharma, and S Ausili. They performed sound-localization studies in our lab with hearing-impaired patients, equipped with a cochlear implant (either unilateral, or bilateral) and a hearing aid (so-called bimodal electro-acoustic hearing). So far, five publications have appeared from this work (Snik et al., 2019; Huinck et al., 2019; Vogt et al., 2018; Sharma et al., 2019; Ausili et al., 2019), and about 6 more papers are expected to follow this year and next.

After Francesca received a very good offer from the UK, she left our project. Fortunately, we could hire two new postdocs, Guus van Bentum and Antoine Bernas, to continue the research on sound-localisation plasticity, and to start a new research line on moving sounds. Our latest paper on moving sounds in the horizontal plane (Garcia et al., eNeuro, 2021; see publication list) received quite some (inter)national press attention. Our new "moving-sound-vestibular-eye/head movement" setup, which allows a sound to move on a two-dimensional sphere (radius 1.5 m) around the subject while the subject follows the sound source with a coordinated eye-head movement (sometimes while also being rotated passively around the vertical axis), is now nearly completed (see the image at the Nijmegen Team). We will start these exciting new experiments together with a Bachelor of Science intern, Noortje van der Mast, in September.

Subproject 2: Computational modelling of the eye-head gaze-control system.
During the first months of the Action (Feb-May, 2017), the PI appointed B Kasap (PhD student) to work on a computational model of the midbrain by implementing a novel spiking neural network algorithm. Six manuscripts emerged from this work (in the J Neurophysiology (2018), in Neurocomputing (2018), in Front Appl Math and Stat (2018), in PLoS Comput Biol (2019), and two papers in press in Progr Brain Res). In April 2019, the PI appointed Postdoc3 on this Subproject, dr A Alizadeh (Iran), who will extend the current spiking network model to 3D eye-head coordination. We are currently preparing two new research papers, to be submitted by the summer of 2021.
In the meantime, our Neurophysics master student Lennaert van der Molen finished a modelling study on 3D eye-head coordination (see internship reports). We will develop this work into a manuscript for publication.

Subproject 3: Humanoid robotic model of the eye-head gaze-control system.

The collaboration with prof Bernardino on Subproject 3 goes very well. In 2018, the PI submitted an amendment to the Grant Agreement to make the Lisbon group co-Beneficiary, so that the collaboration can now include formal scientific work from the Lisbon group, with its own allotted budget.
Between April and Oct 2017 a master student (Miguel Ruiz Lucas) designed and tested a prototype robotic eye (resulting in a Master Thesis report, and his graduation in Oct 2017). From April 2018 to Oct. 2019, the work continued with two new students: Carlos Aleluia, who worked on the 3D kinematics and on mechanical improvements, and Mariana Martins, who focused on the visual-image processing and positional stabilisation of the system. This work is expected to lead to a first joint publication this year. We aim to recruit more interns from the Technical Institute to collaborate on our project; master students from Nijmegen may also be sent to Lisbon for brief internships.
During the first half of the Action, the PI paid 23 short (typically 4-5 day) visits to the Lisbon group to collaborate on Subproject 3. A first paper on 3D eye movements was published in December 2018: Van Opstal, Strabismus, 2018. The coordinator and Bernardino recruited PhD2 in Sept. 2018 (Akhil John, India), and we have hired our Postdoc 4 (Reza Javanmard, Iran), starting in the fall of 2020.

In Lisbon, we currently work on an unconventional prototype of a rotating humanoid eye, in which we aim to model and control the six elastic eye muscles. Although other labs have produced more or less realistic robotic eye-movement models, none have so far included the actual complexity of the true three-degrees-of-freedom problem, which leads to Donders' law and Listing's law, and the mechanical properties of the muscles. We are currently adapting our system to also include dynamic viscosity, and we take muscle side-slip into account, to better approach the overdamped mechanics and the dynamic change of pulling directions of the muscles of the biological human eye.
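
For reference, the overdamped mechanics mentioned here is classically approximated by a lumped, two-time-constant plant (a textbook simplification, not an identified model of our robot):

\[ P(s) \;=\; \frac{\Theta(s)}{U(s)} \;=\; \frac{K}{(T_1 s + 1)(T_2 s + 1)}, \qquad T_1 \gg T_2, \]

with, in the oculomotor literature, a slow time constant on the order of 100-250 ms and a fast one on the order of 10 ms. Muscle side-slip and viscosity add, on top of this lumped description, the orientation-dependent change of pulling directions that the prototype has to reproduce.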

For a nice demo of the 3D saccades, made by our new prototype eye, see Vision from the Eye.

NEWS: In May, 2021 our revised first collaborative manuscript was accepted in PLoS Computational Biology!!
John, A., Aleluia, C., Van Opstal, A.J., and Bernardino, A.:
Modelling 3D saccade generation by feedforward optimal control. PLoS Computational Biology, 2021.
For a pdf of our paper, see: Lisbon_PLoSCB2021.pdf

NEWS: In July 2022, we submitted our second manuscript to the journal IEEE Transactions on Neural Networks and Learning Systems!!
Javanmard Alitappeh R, John A, Dias B, Van Opstal, A.J. and Bernardino, A:
Emergence of human oculomotor behavior from optimal control of a cable-driven biomimetic robotic eye
For a sneak preview of our paper, see: Javanmard_etal2022_IEEE.pdf


See also IST news Sept 2019 for a nice news item about our project on the IST website.



MiniSymposia VisLab Lisboa

Our project was presented at a MiniSymposium in the Lisbon VisLab on June 21, 2019 (see the presentations). For the thesis reports of our students, see the extended abstracts.

Our second Minisymposium at VisLab was held on June 22, 2021 (featured speakers: Akhil John, Reza Javanmard with Henrique Granado, Bernardo das Chagas with Jose de Noronha and Ricardo Valadas (the control and muscle-slip team), and Goncalo Pereira and Jose Gomes (the visual team)). Nearly 30 participants joined in the discussions.
An impression of the (Zoom) meeting can be found here. (see also the presentations).















Internship reports from the Lisbon collaboration:

(Note: The full thesis reports and their extended abstracts are publicly accessible on the Lisbon IST repository.
E.g., at https://fenix.tecnico.ulisboa.pt/downloadFile/1407770020546218/Extended_abstractCorrigido_Miguel_Lucas_nr_76781.pdf)





Internship reports from RUN-Biophysics students:















Minisymposium at IST Visual Lab, Lisboa, June 21, 2019



On Friday, June 21, 2019, we organised a minisymposium at the IST Visual Lab in Lisboa, in which Alex hosted the meeting, and John, Akhil, Mariana and Carlos presented their ongoing work to the department on the 7th floor of Torre Norte. Below, find the pdf files of their presentations:
  1. John's presentation on the background of our robotics project is found here
  2. Akhil's presentation on the mechanical implementation of the 3D robotic eye is found here
  3. Mariana's presentation on the 3D visual mapping algorithms for rotation and translation of the robot's camera eye is found here
  4. Carlos' presentation on a 3D model of the robot's eye and the use of optimal control to understand Listing's law is found here

    Listing's Law for the eye: An optimal control principle?
    Listing's Law, measured for head-restrained monkey saccades in 3D. In quaternion laboratory coordinates (r'x,r'y,r'z), Listing's plane (r'x = ar'y + br'z) has a width of only 0.6 deg. The primary position (PP: (1, a, b)) points upwards (as the center of the oculomotor range, OMR, is about 15 deg downward); i.e., the plane is tilted re. gravity. Figure taken from Hess et al., Vision Res., 1992.
    There is controversy about the origin of this two degrees of freedom (dof) behaviour: mechanical constraints (e.g., muscle pulleys) vs. neural (control) strategies. The following arguments support a control strategy:
    1. Listing's law (rx = 0) only holds for the head upright and still, and the eyes looking at infinity.
    2. For vergence eye movements, ocular torsion depends on vertical eye orientation: eyes have three dof!
    3. During torsional head rotations, the eyes counter-roll: eyes have three dof!
    4. The eyes counter-roll during static head tilts: eyes have three dof!
    5. After small spontaneous violations of LL there is no passive drift back into the plane: The 3D eye's orientation is controlled!
    6. Small spontaneous violations of LL (up to 1-2 deg) are actively compensated by the next saccade: The saccadic system has three dof!
    7. After brainstem microstimulation (e.g. in riMLF, or NRTP; see van Opstal et al., J Neuroscience, 1996) the eyes will roll out of LP (by up to 10 deg!). However, the next saccade brings the eyes back to LP! Saccades really have three dof when needed!
    8. Indeed, during head-free gaze saccades, the eyes do not obey LL! However, they do after the head movement is finished (e.g. Tweed, Science, 1998): The saccadic system has three dof!
    No passive mechanical model can ever account for these findings! (but an active (i.e., neural) control model could ....)
    Computer simulations of random 3D saccades with a nonlinear mechanical model of our 3D robotic eye with three controllers, and 6 elastic 'muscles', at realistic insertion points on the globe (by Carlos). This model can generate 3D eye orientations over a large range (e.g., torsion up to 15 deg). Optimal control on a linearised version of this model is assumed to minimize the weighted sum of different costs. Here, we show two different strategies to account for Listing's law:
    Top figure includes three costs: saccade duration (p; time discount), total energy consumption during the trajectory (proportional to squared control velocity), and saccade accuracy; the latter requires the eye to be on the target T=(ry, rz) and in LP (i.e., rx = 0) at the end of the movement (at time p). PP is assumed to be straight ahead.
    Bottom figure includes four costs: duration (p), energy, saccade accuracy (but now only looking at the 2D target location; torsion left free), and total muscle force at each fixation (at p).
    Both models yield 3D single-axis rotations to generate saccade trajectories that keep the eye quite close to LP! The latter even yields a downward tilted plane (dotted yellow line; the central OMR is at about 15 deg upward from PP), due to the force of gravity acting on our (macro-eye) system.
    Interestingly, the saccades followed nearly straight trajectories (i.e., they synchronized the 3D motor commands (top)), and yielded velocity profiles that obey the well-known nonlinear saccade main sequence (increase of saccade duration with amplitude, saturation of peak eye velocity with amplitude, and roughly a fixed-duration acceleration phase), although the model is (at present) free of (multiplicative) noise.
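
    To relate such simulated (or measured) 3D eye orientations back to the quantitative statement above that Listing's plane has a width of only ~0.6 deg, a simple least-squares plane fit suffices. The sketch below (Python; assumed data format, not our actual analysis code) fits rx = a*ry + b*rz to a set of rotation vectors and reports the plane's width as the standard deviation of the torsional residuals:

        import numpy as np

        def fit_listing_plane(r):
            # r: (N, 3) array of rotation vectors (deg), columns (rx, ry, rz).
            A = r[:, 1:3]                      # predictors: ry, rz
            coeffs, *_ = np.linalg.lstsq(A, r[:, 0], rcond=None)
            residuals = r[:, 0] - A @ coeffs   # torsional deviation from the plane
            return coeffs, residuals.std()     # plane parameters (a, b) and width (deg)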






















    2nd Minisymposium at IST Visual Lab, Lisboa, June 22, 2021



    On Tuesday, June 22, 2021, we organised our second minisymposium at the IST Visual Lab in Lisboa. Plinio Moreno and Alex Bernardino hosted the meeting (about 30 participants), and Akhil, Bernardo/Jose/Ricardo, Reza/Henrique and Goncalo/Jose presented their ongoing work in brief 10' presentations to the audience in Torre Norte. Below, you can find the pdf files of their presentations:




    1. Akhil's presentation on our recently published paper in PLoS Computational Biology is found here
    2. Bernardo's presentation on the implementation of a head-neck system with 4 d.o.f. is found here
    3. Jose and Ricardo's presentation on the calculation of muscle sideslip is found here
    4. Reza's presentation on a neural-network reinforcement-learning model of the eye is found here
    5. Henrique's presentation on model-free learning of the eye's dynamical properties is found here
    6. Goncalo's presentation on using sensor fusion for an eye-fixed camera and IMU is found here
    7. Jose's presentation on using sensor fusion with an event camera and IMU is found here
























      Poster presentations at CoSyne 2022, Lisboa, March 17-20, 2022



      From March 17-20, 2022, we participated in the international conference on Computational and Systems Neuroscience (Cosyne), in Lisbon:


      see this Website for Henrique's poster video talk

      and this Website for Bernardo's poster video talk
























      Poster presentation of Akhil at IROS 2023, Detroit, USA, October 1-5, 2023



      From October 1-5, 2023, we participated in the International Conference on Intelligent Robots and Systems (IROS 2023) in Detroit. Akhil presented a poster, entitled "Learning Open-Loop Saccadic Control of a 3D Biomimetic Eye Using the Actor-Critic Algorithm", on using Reinforcement Learning to fully characterize a 3D model of our robotic eye system; work that was carried out by our master student Henrique Granado.
      Akhil's poster received ample positive attention, which may lead to further collaborations with other groups.

      His video presentation can be viewed HERE
























      Results from the Nijmegen group





      Feb 2020: Our PupilLabs measurements in the vestibular chair (infrared eye images at 120 Hz) nicely demonstrate Listing's law during saccades:
      (courtesy: Annemiek, Jesse, and Lennaert)




      Another recording of Listing's Law with the PupilLabs eye tracker, made by Annemiek at home:








      Demonstrations.

      1. Here, you can view a demonstration of our two-axis VESTIBULAR CHAIR
      2. Torsional (clockwise/counter-clockwise) nystagmus of the eye, measured with our PUPIL LABS eye tracker (you can clearly see the rotation of the iris!). It unequivocally shows that the eye's orientation has three degrees of freedom.
      3. See here a brief demonstration of saccades made by our first prototype ROBOT EYE, and for our second prototype, see Vision from the New Eye
      4. Here's an MRI MOVIE of the human eye (courtesy: Hotte et al., Transl. Vis Sci Technol 5:9, 2016) making large horizontal left-right saccades. Note the large sideways movements of the optic nerve through the viscous fat surrounding the globe! It causes the transfer function of the ocular plant to be overdamped. Also note the lateral and medial rectus eye muscles.


      Click here to see Jesse (opens new window)
      Current 2D audio-visual array in the chair (63 locations in azimuth and
      elevation). Each speaker also has a central LED for visual stimulation.
      Jesse makes eye-head movements to targets (here, shown in the light),
      while under passive two-axis sinusoidal vestibular rotation.





      List of publications from ORIENT

      1. A.J. Van Opstal, J. Vliegen, and T. Van Esch
        Reconstructing spectral cues for sound localization from responses to rippled noise stimuli.
        PLoS ONE 12(3): e0174185, 2017 (doi)
      2. G.C. Van Bentum, A.J. Van Opstal, C.C.M. van Aartrijk, and M.M. Van Wanrooij
        Level-weighted response averaging in elevation to synchronous amplitude-modulated sounds.
        Journal of the Acoustical Society of America, 142: 3094-3103, 2017 (doi)
      3. B. Kasap and A.J. Van Opstal
        A spiking neural network model of the midbrain Superior Colliculus that generates saccadic motor commands.
        Biological Cybernetics, 111: 249-268, 2017 (doi)
      4. P. Bremen, R. Massoudi, M.M. Van Wanrooij, and A.J. van Opstal
        Audio-Visual Integration in a Redundant Target Paradigm: A Comparison between Rhesus Macaque and Man.
        Frontiers Behavioural Neuroscience 11: 89, 2017 (doi)
      5. R. Ege, A.J. van Opstal, P. Bremen and M.M. Van Wanrooij
        Testing the precedence effect in the median plane reveals backward spatial masking of sound.
        Scientific Reports (Nature) 8(1):8670, 2018 (doi)
      6. B. Kasap and A.J. Van Opstal
        A model for auditory-visual evoked eye-head gaze shifts in dynamic multi-steps.
        Journal of Neurophysiology 119: 1796-1808, 2018a (doi)
      7. B. Kasap and A.J. Van Opstal
        Dynamic parallelism for spike propagation in GPU accelerated spiking neural network simulations.
        Neurocomputing 302: 55-65, 2018b (doi)
      8. B. Kasap and A.J. Van Opstal
        Double stimulation in a spiking neural network model of the midbrain Superior Colliculus.
        Frontiers in Applied Mathematics and Statistics, 4: 47, 2018c (doi)
      9. R. Ege, A.J. van Opstal, and M.M. Van Wanrooij
        Accuracy-precision trade-off in human sound localisation.
        Scientific Reports (Nature), 8: 16399, 2018 (doi)
        (Supplementary material)
      10. B. Zonooz, E. Arani, and A.J. Van Opstal
        Learning to localise weakly-informative sound spectra with and without feedback.
        Scientific Reports (Nature), 8: 17933, 2018 (doi)
      11. L.P.H. Van de Rijt, M.M. Van Wanrooij, A.F.M. Snik, E.A.M. Mylanus, A.J. Van Opstal and A. Roye
        Measuring cortical activity during auditory processing with functional near-infrared spectroscopy.
        Journal of Hearing Science 8(4) pp. 9-18, 2018 (doi)
      12. K. Vogt, H. Frenzel, S.A. Ausili, D. Hollfelder, B. Wollenberg, A.F.M. Snik, and M.H.J. Agterberg
        Improved directional hearing of children with congenital unilateral conductive hearing loss implanted with an active bone-conduction implant or an active middle ear implant.
        Hearing Research 370: 238-247, 2018 (doi) .
      13. A.J. Van Opstal
        Editorial: 200 years Franciscus Cornelis Donders.
        Strabismus 26 (4) 159-162, 2018. (doi)
      14. B. Zonooz, E. Arani, P.A.T.R. Aalbers, K.P. Koerding and A.J. Van Opstal
        Spectral weighting underlies perceived sound elevation.
        Scientific Reports (Nature), 9: 1642, 2019 (doi)
      15. B. Kasap and A.J. Van Opstal
        Microstimulation in a spiking neural network model of the midbrain superior colliculus.
        PLoS Computational Biology, 15(4): e1006522, 2019 (doi)
      16. R. Ege, A.J. van Opstal, and M.M. Van Wanrooij
        Experience shapes human sound-localisation behaviour.
        eNeuro (J Neuroscience) 6(2): 1-15, 2019 (doi)
      17. S.A. Ausili, B. Backus, M.J.H. Agterberg, A.J. Van Opstal, and M.M. Van Wanrooij
        Sound Localization in real-time vocoded cochlear-implant simulations with normal-hearing listeners.
        Trends in Hearing, 23:1-18, 2019 (doi)
      18. S. Sharma, L.H.M. Mens, A.F.M. Snik, A.J. Van Opstal, and M.M. Van Wanrooij
        An individual with hearing preservation and bimodal hearing using a cochlear implant and hearing aids has perturbed sound localization but preserved speech perception.
        Frontiers in Neurology, 10:637, 2019 (doi) .
      19. W.J. Huinck, E.A.M. Mylanus, and A.F.M. Snik
        Expanding unilateral cochlear implantation criteria for adults with bilateral acquired severe sensorineural hearing loss.
        European Archives of OtoRhinoLaryngology, 265(5): 1313-1320, 2019 (doi)
      20. A. Snik, H. Maier, B. Hodgetts, M. Kompis, G. Mertens, P. van de Heijning, T. Lenarz, and A. Bosman
        Efficacy of Auditory Implants for Patients With Conductive and Mixed Hearing Loss Depends on Implant Center.
        Otology and Neurotology 40: 430-435, 2019 (doi)
      21. A.J. Van Opstal and B. Kasap
        Maps and sensorimotor transformations for eye-head gaze shifts: Role of the midbrain Superior Colliculus.
        Progress in Brain Research, Vol. 249, Ch. 2, pp. 19-33, 2019 (doi)
      22. A.J. Van Opstal and B. Kasap
        Electrical stimulation in a spiking neural network model of the monkey Superior Colliculus.
        Progress in Brain Research, Vol. 249, Ch. 11, pp. 153-166, 2019 (doi)
      23. L.P.H. Van de Rijt, A. Roye, E.A.M. Mylanus, A.J. Van Opstal, and M.M. Van Wanrooij
        The principle of inverse effectiveness in audiovisual speech perception.
        Frontiers in Human Neuroscience 13:335, 2019 (doi)
      24. B. Zonooz, and A.J. van Opstal
        Differential adaptation in azimuth and elevation to acute monaural spatial hearing after training with visual feedback.
        eNeuro (J Neurosci), 6(5): 1-18, 2019 (doi)
      25. K. Vogt, J.-W. Wasmann, A.J. van Opstal, A.F.M. Snik, and M.J.H. Agterberg
        Contribution of spectral pinna cues for sound localization in children with congenital unilateral hearing loss after hearing rehabilitation.
        Hearing Research, 385: 107847, 2020 (doi)
      26. N. Wardenga, A.F.M. Snik, E. Kludt, B. Waldmann, T. Lenarz, H. Maier
        Hearing-aid treatment for patients with mixed hearing loss. Part II: Speech recognition in comparison to direct acoustic cochlear stimulation.
        Audiology and Neurotology, 2020 (doi)
      27. S.A. Ausili, M.J.H. Agterberg, A. Engel, C. Voelter, J.-P. Thomas, S. Brill, A.F.M. Snik, S. Dazert, A.J. Van Opstal and E.A.M. Mylanus.
        Spatial Hearing by Bilateral Cochlear Implant Users With Temporal Fine-Structure Processing.
        Frontiers Neurology, 11: 915, 2020 (doi)
      28. L. Wang, E. Noordanus and AJ van Opstal
        Estimating multiple latencies in the auditory system from auditory steady‐state responses on a single EEG channel.
        Scientific Reports (Nature), 2021 (doi)
      29. G.C. Van Bentum, M.M. Van Wanrooij and A.J. Van Opstal
        Spatiotemporal factors influence sound-source segregation in localization behavior.
        Journal of Neurophysiology 125: 555-567, 2021. (doi)
      30. J.A. Garcia Uceda-Calvo, M.M. Van Wanrooij, and A.J. Van Opstal.
        Adaptive Response Behavior in the Pursuit of Unpredictably Moving Sounds.
        eNeuro 8(3): ENEURO.0556-20.2021 1-14, 2021. (doi)
      31. A. John, C. Aleluia, A.J. Van Opstal and A. Bernardino.
        Modelling 3D saccade generation by feedforward optimal control.
        PLoS Computational Biology 17(5): e1008975, 2021 (doi)
      32. L. Wang, E. Noordanus, and A.J. Van Opstal
        Towards real-time detection of auditory steady-state responses: a comparative study.
        IEEE Access, 9: 108975 - 108991, 2021 (https://doi.org/10.1109/ACCESS.2021.3100157)
      33. L.P.H. Van de Rijt, A.J. Van Opstal, and M.M. Van Wanrooij
        Multisensory integration-attention trade-off in cochlear-implanted deaf individuals.
        Frontiers Neuroscience, 15: 683804, 2021 (https://doi.org/10.3389/fnins.2021.683804)
      34. S. Sharma, W. Nogueira, A.J. Van Opstal, J. Chalupper, L.H.M. Mens, and M.M. Van Wanrooij
        Amount of frequency compression in bimodal cochlear implant users is a poor predictor for audibility and spatial hearing.
        Journal of Speech, Language, and Hearing Research, 64(11) 4056-4069, 2021 (https://doi.org/10.1044/2021_JSLHR-20-00653)
      35. A. Alizadeh and A.J. Van Opstal
        A spiking neural network of the midbrain Superior Colliculus generates saccades that are invariant to input strength.
        Scientific Reports, published, 2021.
      36. T. Anastasio, J. Demer, R.J. Leigh, A. Luebke, A.J. van Opstal, L. Optican, S. Ramat, and D. Zee (Eds.)
        David A. Robinson's Modeling the Oculomotor Control System.
        Progress in Brain Research (Elsevier Publ.), Vol. 267 (455 pp, 21 chapters)
      37. A. Alizadeh and A.J. Van Opstal
        Dynamic Control of Eye-Head Gaze Shifts by a Spiking Neural Network Model of the Superior Colliculus.
        Frontiers in Computational Neuroscience, published, 2022.
      38. L.C.E. Veugen, A.J. Van Opstal, D. Louvet, and M.M. Van Wanrooij.
        Reaction times to monaural and binaural spectrotemporal modulations: normal hearing and simulated impaired hearing.
        Trends in Hearing 26: 1-16, 2022
      39. L. Guadron, A.J. van Opstal, and J. Goossens
        Speed-accuracy tradeoffs influence the main sequence of saccadic eye movements.
        published in Scientific Reports (Nature), 2022
      40. A. John, A. Bernardino, A.J. Van Opstal
        A Cable-Driven Robotic Eye for Understanding Eye-Movement Control.
        In: IEEE Xplore: 2023, 9th International Conference on Robotics, Automation and Applications (ICARA) (manuscript)
      41. L. Guadron, S.A. Titchener, C.J. Abbott, L.N. Ayton, A.J. van Opstal, M.A. Petoe, and J. Goossens
        The Saccade Main Sequence in Patients with Retinitis Pigmentosa and Advanced Age-Related Macular Degeneration.
        Investigat. Ophthalmol. Vis. Sci., accepted, Feb. 2023
      42. S. Sharma, L.H.M. Mens, A.F.M. Snik, A.J. Van Opstal, and M.M. Van Wanrooij
        Hearing asymmetry biases spatial hearing in bimodal cochlear-implant users despite bilateral low-frequency hearing preservation.
        Trends in Hearing, 2022, accepted
      43. H. Granado, A. John, R. Alitappeh Javanmard, A.J. Van Opstal, A. Bernardino
        Learning open-loop saccadic control of a 3D biomimetic eye using the actor-critic algorithm.
        In: IEEE Xplore: 2023, International Conference on Intelligent Robots and Systems, Detroit USA (IROS) (manuscript)
        Upcoming:


      44. Javanmard Alitappeh R, John A, Dias B, Van Opstal, A.J. and Bernardino, A:
        Emergence of human oculomotor behavior from optimal control of a cable-driven biomimetic robotic eye.
        IEEE Transactions on Neural Networks and Learning Systems, under review, 2023 (manuscript)
      45. A.J. van Opstal
        Neural control of instantaneous kinematics of eye-head gaze shifts in monkey Superior Colliculus.
        Communications Biology (Nature; under review, Feb 2023)
      46. G.C. Van Bentum, A.J. Van Opstal, and M.M. Van Wanrooij
        Localization of two detectable synchronous sounds is impossible.
        J Neurophysiology, to be submitted, 2023
      47. E. Noordanus, L. Wang, and A.J. Van Opstal
        Fingerprinting the auditory system with EEG using multi-sine stimuli and nonlinear system identification.
        in revision for JARO.
      48. R.F. van der Willigen, A.J. van Opstal and H. Versnel.
        Spectrotemporal sound processing of naturalistic sounds in monkeys and humans.
        To be submitted to J Neurophysiology, March 2023
      49. L.C.E. Veugen, A.J. Van Opstal, J. Chalupper, and M.M. Van Wanrooij.
        Spectral-temporal sensitivity of bimodal cochlear implant users.
        To be submitted to JARO, March 2023
      50. R. Ege, A.J. van Opstal, and M.M. Van Wanrooij
        Frequency transfer of the ventriloquism after effect.
        eNeuro (J Neurosci), to be submitted, 2023

