EyeLink Learning Resources

In addition to our legendary support service, we have a range of resources that will help you get your eye-tracking research running smoothly – from setting up your EyeLink system, through creating an eye-tracking task, to collecting and analyzing high-quality eye-tracking data. You will need to register with our Support Forum to access many of these resources. All forum links will work once you have logged into the Support Forum in the same browser as this website.




Setting up your eye tracker correctly is an important step and is key to ensuring optimal data quality. We have a range of resources on the support forum that will help you set up your eye tracker and become familiar with its use, including quick-start guides and detailed video tutorials.


EyeLink 1000 Plus
EyeLink 1000 Plus Video


EyeLink Portable Duo
EyeLink Portable Duo Eye Tracker



EyeLink systems can interface with a wide range of stimulus presentation software packages, including commercial products such as E-Prime and Presentation, as well as software such as Psychtoolbox, OpenSesame, and PsychoPy. You can even program tasks and control EyeLink systems using Python and other languages. In all cases the stimulus presentation software can control the eye tracker (for example, starting and stopping recording), send messages into the data file (for example, marking stimulus onset or participant response), and receive gaze data over the link during recording (enabling gaze-contingent paradigms). You can of course also use our own powerful and intuitive stimulus presentation software – Experiment Builder, which makes programming complex eye-tracking tasks very straightforward.
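As an illustration of the message-sending side of this integration, the sketch below builds the kinds of text messages that stimulus presentation software commonly writes into an EyeLink data file. The message formats follow common Data Viewer conventions (`TRIALID`, `TRIAL_RESULT`, `!V TRIAL_VAR`); the live-experiment snippet in the comments assumes a connected EyeLink host PC and uses PyLink's `sendMessage()`.

```python
# Sketch: building the text messages that stimulus presentation software
# typically sends into an EyeLink data file. In a real task these strings
# would be passed to pylink's EyeLink.sendMessage().

def trial_start_message(trial_index):
    """Marks the start of a trial so Data Viewer can segment the recording."""
    return f"TRIALID {trial_index}"

def trial_var_message(name, value):
    """Attaches a trial variable (e.g. condition) for Data Viewer reports."""
    return f"!V TRIAL_VAR {name} {value}"

def trial_end_message(result_code=0):
    """Marks the end of a trial; 0 conventionally means a normal trial."""
    return f"TRIAL_RESULT {result_code}"

# In a live experiment (hypothetical sketch; requires an EyeLink host PC):
#   import pylink
#   tracker = pylink.EyeLink("100.1.1.1")
#   tracker.sendMessage(trial_start_message(1))
#   tracker.sendMessage("stimulus_onset")  # marks stimulus onset
#   tracker.sendMessage(trial_var_message("condition", "congruent"))
#   tracker.sendMessage(trial_end_message())

print(trial_start_message(1))                       # TRIALID 1
print(trial_var_message("condition", "congruent"))  # !V TRIAL_VAR condition congruent
```

Timestamping matters here: messages are stamped by the tracker on arrival, which is what allows events such as stimulus onset to be aligned precisely with the gaze data.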

Experiment Builder
Experiment Builder
SR Research Experiment Builder is our drag-and-drop program for making computer-based psychology and neuroscience experiments. It’s easy to use but also incredibly powerful. Experiment Builder can be downloaded and used in evaluation mode for 30 days without a license.

Experiment Builder is very easy to learn, and our 12-part video tutorial series is a great place to start. It leads you through Experiment Builder’s key features, in the context of a simple Picture-Response eye tracking experiment. For a detailed overview of the topics covered, and to access the video series itself, please visit our Support Forum:

We also have several webinars that demonstrate features of Experiment Builder in the context of specific eye tracking tasks, or highlight more advanced features:

Finally, we have a large number of commented experiment templates and example projects that you can use as a starting point for your own projects, or as a way to learn more about Experiment Builder. For a complete list of templates, expand the bar below:

Templates/Example Projects
Experiment Builder comes with pre-installed templates:

  1. Simple: Basic template for eye-tracking experiments
  2. Stroop: Basic template for non-eye-tracking experiments
  3. Picture: Shows how to use/display image resources
  4. TextLine: Display single-line text, and auto-generate interest areas for it
  5. TextPage: As above, using multi-line text
  6. GCWindow: Using real-time gaze position to display a gaze-contingent window
  7. Track: Display gaze position, set resource position based on gaze location
  8. Change: Basic change blindness experiment. Shows use of fixation triggers
  9. Saccade: Basic pro-/anti-saccade experiment
  10. Pursuit: Illustrates several kinds of sinusoidal movement in a pursuit task
  11. Video: Display video clips during an experiment

There are also a large number of popular experimental examples on the Support Forum. See the following forum page for a detailed overview:

  1. Detailed Overview of Experiment Builder Examples

Or go directly to each example:

  1. Attentional Blink
  2. Balloon Analogue Risk Task
  3. Basic Custom Class Variable Use
  4. Boundary Paradigm
  5. Collecting Participant Information
  6. Creating False Memories
  7. Different Instructions at the Beginning of Each Block
  8. Dual Task Interference
  9. Fitts’ Law
  10. Gender-Neutral Pronoun Use
  11. Generate a Random Number
  12. Getting Button Responses from Parallel Port Button Boxes for Non-Eye Tracking Project
  13. Getting Subject Response from a Likert Scale
  14. Getting Subject Response from Sliding Scale Survey Questions
  15. Habituation
  16. Implicit Association Test
  17. Lexical Decision
  18. Listen to .wav and Type Response
  19. Maintained Fixation
  20. Manipulating variables with Python functions without Custom Class
  21. Manual Reaction Time Calculation
  22. Mental Rotation
  23. Mock Web Browsing
  24. Mouse Drag and Drop
  25. Mouse Trigger
  26. Moving Target
  27. Multiple Presses/Responses per Trial
  28. Online Drift Correction on Mouse Click
  29. Operation Span
  30. Paced Auditory Serial Addition Task
  31. Prosaccade and Antisaccade Tasks
  32. Question Every X Trials (Keyboard and/or Mouse Response)
  33. Recycling Trials with Incorrect Responses
  34. Repeated Invisible Boundaries
  35. Response Checking and RT calculation (with Keyboard/EyeLink Button Box)
  36. Reversed SNARC and Simon Effects
  37. Semantic Category
  38. Sentence Picture Verification Task
  39. Setting a Fixed Inter-Trial Interval in Experiment Builder
  40. Simon Effect
  41. Simple Invisible Boundary
  42. SOA Manipulation in EyeLink Experiments
  43. SOA Manipulation in Non-EyeLink Experiments
  44. Sternberg Search
  45. Stroopfeedback
  46. Study and test trials
  47. Subliminal Perception
  48. Synchronizing Eye Movements and Audio Recording
  49. Temporal Order and Spatial Memory
  50. Terminate Block When all Trials are Correct
  51. Terminate Block When Average RT is Below Threshold
  52. TextLine with Comprehension Questions
  53. Tower of Hanoi task
  54. Tracking Key/Button Pressing Down Time for Behavior Studies
  55. Up/Down Method using Custom Class
  56. Updown Method
  57. Using Lists
  58. Using MRI Sync Pulses (TTL) to Trigger Stimulus Onset
  59. Variable Fixation
  60. Variable Fixation Screen Duration and Reporting the Duration Online
  61. Variable MultiPage MultiLine Text with Questions
  62. Variable MultiPage MultiLine Text with Questions Alternate Version
  63. Variable Number of Multiple Responses in Each Trial
  64. Visual Lexical Access
  65. Visual Search EB Program
  66. Visual World Paradigm
  67. Wisconsin Card Sorting Task


Psychtoolbox and EyeLink
Psychtoolbox for Matlab comes with native support for EyeLink eye-tracking systems, along with many example scripts that demonstrate the full range of integration – from basic control of the eye tracker, through implementing gaze-contingent tasks, to adding integration messages for our Data Viewer analysis software. These resources are available on the support forum:


Psychology Software Tools’ E-Prime software can be integrated with EyeLink systems via user scripts and inline scripts embedded within your experiment. Example scripts demonstrate the full range of integration functionality, including how to implement gaze-contingent tasks.


Neurobehavioral Systems Presentation software can be integrated with EyeLink systems through function calls via the PresLink Extension. We provide example scripts that illustrate the full range of integration functionality, including gaze-contingent stimulus presentation and Data Viewer integration.


PsychoPy comes with native support for the EyeLink eye-tracking systems via PyLink, our Python wrapper for our core SDK. Example scripts illustrate the full range of EyeLink integration using pygame and VisionEgg graphics libraries.


OpenSesame and EyeLink
OpenSesame is an open-source stimulus presentation package that is also based on Python. EyeLink integration can be achieved via PyLink or via an EyeLink plugin for OpenSesame. Follow the link below for OpenSesame resources.


Programming Languages
EyeLink Programming Language
Our Software Development Kit enables experimental tasks to be implemented in a wide range of programming languages, including C, C#, and Python. The Software Development Kit is free and can be downloaded from the support forum.


Analyzing eye tracking data can seem like a daunting task, but we have a range of resources to help, including our powerful and intuitive Data Viewer Analysis Software. EyeLink Data Files can also be imported directly into Matlab or R.

Data Viewer
Data Viewer
EyeLink Data Viewer is a powerful software tool for analyzing EyeLink data. Data Viewer supports EDF files recorded by EyeLink I, II, 1000, 1000 Plus, and Portable Duo. Links to the Data Viewer Installers, Software License Installation, User Manual, and Frequently Asked Questions are below.

Data Viewer is easy to learn, and a great place to start is with our video tutorial series, which is available on the support forum.

You can see Data Viewer in action, and learn about some of its more advanced features in our webinar series, available on the support forum.


There are a number of ways in which you can work with EyeLink Data Files (EDF Files) in Matlab. You can convert the files to ASCII or import the EyeLink Data Files directly. More information on both of these approaches is available on the Support Forum – just follow the link:


Other Software
edf2asc for EyeLink
EyeLink Data Files can be converted to ASCII and read directly into statistics software such as R or SPSS, or imported into Excel or Python data tables for further processing and analysis.
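To show what reading the converted output into Python might look like, here is a minimal parser sketch. It assumes a simplified monocular edf2asc output in which sample lines begin with a numeric timestamp followed by x, y, and pupil size, and messages appear as `MSG <timestamp> <text>`; check your own edf2asc output before relying on these column positions.

```python
# Sketch: reading edf2asc-converted ASCII lines into Python lists.
# Assumes a simplified monocular format (timestamp, x, y, pupil) for
# sample lines and "MSG <timestamp> <text>" for messages.

def parse_asc_lines(lines):
    samples, messages = [], []
    for line in lines:
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "MSG" and len(parts) >= 3:
            messages.append((int(parts[1]), " ".join(parts[2:])))
        elif parts[0].isdigit() and len(parts) >= 4:
            try:
                x, y, pupil = (float(p) for p in parts[1:4])
            except ValueError:
                continue  # missing data during a blink ("." fields)
            samples.append((int(parts[0]), x, y, pupil))
    return samples, messages

demo = [
    "MSG 1000 stimulus_onset",
    "1002   512.3   384.1   1500.0 ...",
    "1004   .       .       0.0    ...",  # blink sample: skipped
]
samples, messages = parse_asc_lines(demo)
```

From here the tuples can be handed to pandas, NumPy, or written out as CSV for R.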


EyeLink systems can be integrated with a wide range of other neurophysiological recording devices, including EEG and fNIRS equipment. A range of common integration solutions are outlined below. More detailed technical information is available on the support forum.


BioSemi EyeLink Integration

BioSemi produces state-of-the-art EEG instruments for researchers. Integrating BioSemi EEG hardware with an EyeLink eye tracker is very easy, requiring only a simple trigger cable. Our stimulus presentation software, Experiment Builder, has dedicated biometric device control nodes that streamline the integration process. For more information, please see our BioSemi and EyeLink integration page.


Brain Products
Brain Products EyeLink Integration
Brain Products provides EEG hardware and software solutions for neurophysiological research. The hardware integration between Brain Products EEG systems and our eye trackers involves a combination of TCP/IP and TTL communication. Experiment Builder has dedicated nodes for easy synchronization and integration. For more information, please go to our Brain Products and EyeLink integration page.


g.tec EyeLink Integration
g.tec is a medical engineering company with 20 years of experience in brain-computer interfaces. They have a page outlining the integration between the EyeLink 1000 Plus and their g.Nautilus dry or gel-based wireless EEG systems via their g.EYEtracking Interface. The link below will take you off our site to their page for more information about g.tec and EyeLink integration.


EGI EyeLink Integration
Philips-EGI are pioneers in the field of dense-array EEG recording. We have worked closely with Philips-EGI to bring very high levels of integration between EyeLink systems and Net Station 2. Our stimulus presentation software, Experiment Builder, can directly control EGI systems and EyeLink eye trackers. For more information, please go to our dedicated page.


Neuroscan EEG EyeLink Integration
Compumedics Neuroscan is a leading provider of technologies for high-density EEG recording and electromagnetic source localization. Integrating Neuroscan EEG hardware with EyeLink systems is very straightforward and simply requires a trigger cable. SR Research Experiment Builder has dedicated nodes that facilitate the integration. More information is available on our EyeLink-Compumedics Neuroscan integration page.


NIRx EEG EyeLink Integration
NIRx provides integrated solutions for fNIRS neuroimaging. Tests conducted at NIRx headquarters in Berlin confirm that, unlike some other infrared eye trackers, EyeLink systems do not interfere with the fNIRS signal. Integrating their hardware with EyeLink systems is quite simple, requiring only two cables. For stimulus presentation and data collection integration you can use Experiment Builder, E-Prime, Presentation, Psychtoolbox, and other packages. More information is available on our NIRx and EyeLink integration page.



If you’re interested in eye tracking using a particular paradigm, we have comprehensive webinars on pupillometry, psycholinguistics, and dynamic stimuli.

Pupillometry Webinar

For an introduction to recording and analyzing pupil data, there’s a webinar on pupillometry on the Support Forum. The webinar covers background material (pupil physiology and why pupil data are interesting), experimental design considerations (non-cognitive influences on pupil size, pupil foreshortening effects), converting pupil area values to mm, and analysis approaches (difference from baseline, time series).


Psycholinguistics Webinar
A webinar on the Support Forum covers programming and data analysis for reading and visual world experiments, illustrating each topic with a simple example eye-tracking task.


Dynamic Stimuli
Dynamic Stimuli
If you’re interested in designing and analyzing experiments using dynamic stimuli such as videos, there’s a webinar on the Support Forum. Experiment Builder topics include creating simple dynamic stimuli from patterns, displaying videos, and synchronizing audio and video. Data Viewer topics include creating beeswarm plots, dynamic heatmaps, and dynamic interest areas, as well as isolating “pursuit” eye movements.
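To give a flavor of what a dynamic interest area involves, the sketch below tests each gaze sample against a rectangle whose position is computed for that sample's timestamp. The linear motion model is an illustrative assumption for a smoothly drifting stimulus, not Data Viewer's internal algorithm.

```python
# Sketch: scoring gaze against a moving (dynamic) interest area whose
# rectangle is recomputed for each gaze sample's timestamp.

def rect_at(t, start_rect, velocity):
    """Rectangle (left, top, right, bottom) after moving for t seconds."""
    dx, dy = velocity[0] * t, velocity[1] * t
    l, tp, r, b = start_rect
    return (l + dx, tp + dy, r + dx, b + dy)

def gaze_in_rect(gaze, rect):
    x, y = gaze
    l, t, r, b = rect
    return l <= x <= r and t <= y <= b

# Interest area starts at (100, 100)-(200, 200), drifting right at 50 px/s
ia_start = (100, 100, 200, 200)
velocity = (50, 0)
hit = gaze_in_rect((260, 150), rect_at(2.0, ia_start, velocity))  # → True
```

For video stimuli, the same idea applies with per-frame interest-area definitions instead of a motion model.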