Interest Areas for Subtitles in Video/Subtitle Experiments
Delivered Friday, November 14 2025, 8:30am EST
This webinar was given as a pre-workshop training session for the Reading Subtitles Eye-Tracking Workshop (December 2025, Leiden)
This webinar is about eye-tracking experiments involving the presentation of videos with subtitles. When analysing eye movements during the reading of subtitles, it can be useful to have interest areas around the text (e.g., around words, phrases, or characters) that appear and disappear in sync with the subtitles.
This webinar goes through the steps for generating these dynamic interest areas from SRT files so they can be imported into a main Experiment Builder task for data collection.
The process involves using the SRT file (step 1 in the diagram below) to generate an Experiment Builder data source (step 3). The data source is used by a special 'dummy' Experiment Builder project (step 4) to generate interest area files for the main project (step 5). The video needs to have the subtitles burned/hardcoded into it using third-party software such as Camtasia (step 2) so that it can be added to the main task (step 5). At the analysis stage (step 6), the text-based dynamic interest areas will appear and disappear in sync with the subtitles.
You can access a video recording of the Interest Areas for Subtitles in Video/Subtitle Experiments webinar from the following links:
The materials/examples from the webinar can be downloaded from the following links:
- EyeTrackingForVideosWithSubtitles.pptx -- the PowerPoint presentation used in the webinar
- SRT_Subtitle_Conversion.zip -- a zip file containing a Python script/application that converts SRT subtitle files into an Experiment Builder Data Source file (i.e., a file that can be imported to populate the Data Source of the interest-area-generating project), along with the example SRT files containing the subtitle text that was hardcoded into the video files.
- Video_Generate_Subtitle_IAS_Files.ebz -- the interest area generating project used to create dynamic interest area set files for the subtitles.
- Video_Subtitles_Main_Experimental_Project.ebz -- the main experimental project used to present the videos with the hardcoded subtitles (the videos are in the project's library). The project also uses the interest area set files from the interest-area-generating project to ensure the interest areas are synced with the video and available for analysis.
- OriginalVideoFiles.zip -- the original example video files without any subtitles.
- VideoSubtitleExampleViewingSession.dvz -- a Data Viewer viewing session for some example data collected using the materials covered in the webinar.
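The core of the SRT-to-Data-Source conversion can be sketched as follows. This is a minimal, illustrative parser, not the converter shipped in SRT_Subtitle_Conversion.zip: it reads each SRT cue, converts the `HH:MM:SS,mmm` timestamps to milliseconds, and writes one tab-delimited row per subtitle. The column names (`onset`, `offset`, `subtitle`) are assumptions for illustration; the actual column names expected by the interest-area-generating project may differ.

```python
import re

# Regex for an SRT timing line: "00:00:01,000 --> 00:00:02,500"
TIME_RE = re.compile(
    r"(\d{2}):(\d{2}):(\d{2}),(\d{3})\s*-->\s*(\d{2}):(\d{2}):(\d{2}),(\d{3})"
)

def to_ms(h, m, s, ms):
    """Convert one SRT timestamp (hours, minutes, seconds, ms) to milliseconds."""
    return ((int(h) * 60 + int(m)) * 60 + int(s)) * 1000 + int(ms)

def parse_srt(srt_text):
    """Return a list of (onset_ms, offset_ms, text) tuples, one per SRT cue."""
    cues = []
    # Cues are separated by blank lines.
    for block in re.split(r"\n\s*\n", srt_text.strip()):
        lines = block.splitlines()
        for i, line in enumerate(lines):
            match = TIME_RE.search(line)
            if match:
                g = match.groups()
                onset = to_ms(*g[:4])
                offset = to_ms(*g[4:])
                # Everything after the timing line is the subtitle text.
                text = " ".join(lines[i + 1:]).strip()
                cues.append((onset, offset, text))
                break
    return cues

def write_datasource(cues, path):
    """Write cues as a tab-delimited file with a header row (one row per cue)."""
    with open(path, "w", encoding="utf-8") as f:
        f.write("onset\toffset\tsubtitle\n")
        for onset, offset, text in cues:
            f.write(f"{onset}\t{offset}\t{text}\n")
```

For example, a cue timed `00:00:01,000 --> 00:00:02,500` with the text "Hello world" would be parsed as `(1000, 2500, "Hello world")`.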
This webinar covers:
- How to hardcode/burn subtitles into a video from SRT subtitle files
- How to convert SRT subtitle files into an Experiment Builder-friendly format that can be used to populate the Data Source of the Experiment Builder project that generates the interest area set files
- How to use a dummy Experiment Builder project to generate dynamic interest area set files for the subtitle text
- How to add the videos and interest area set files to an Experiment Builder project used in running a video/subtitle experiment
- Considerations for interest area IDs for subtitles and how to extract common reading measures from Data Viewer for the interest areas/text of the subtitles
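To give a flavour of what word-level interest areas involve, the sketch below splits one subtitle line into per-word rectangles. It assumes a monospaced font of known pixel size and a horizontally centred subtitle at a fixed vertical position; these metrics (`CHAR_W`, `CHAR_H`, `SCREEN_W`, `BASELINE_Y`) are illustrative assumptions, not values from the webinar materials. In practice the dummy Experiment Builder project computes the rectangles from the actual rendered text.

```python
CHAR_W, CHAR_H = 14, 32           # assumed pixel width/height of one character
SCREEN_W, BASELINE_Y = 1920, 980  # assumed display width and subtitle y position

def word_rectangles(subtitle):
    """Return (word, left, top, right, bottom) for each word in one subtitle line."""
    total_w = len(subtitle) * CHAR_W
    x = (SCREEN_W - total_w) // 2       # left edge of the centred line
    rects = []
    for word in subtitle.split(" "):
        left = x
        right = x + len(word) * CHAR_W
        rects.append((word, left, BASELINE_Y, right, BASELINE_Y + CHAR_H))
        x = right + CHAR_W              # advance past the trailing space
    return rects
```

Each rectangle, paired with the cue's onset/offset times from the SRT file, is what ends up as one dynamic interest area in the generated interest area set file, so the interest area ID/label scheme chosen here determines how reading measures are grouped in Data Viewer.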
Having basic knowledge of Experiment Builder functionality will help in understanding the concepts discussed in this webinar, but it is not required. For an introduction to the basic functionality of Experiment Builder see the Experiment Builder Video Tutorials.
If you have any questions, don't hesitate to get in touch at support@sr-research.com!

