Create infant-friendly eyetracking experiments with PsychoPy and Tobii eyetrackers.
This package is based on psychopy_tobii_controller, with some improvements and modifications for developmental research.
- This project is not a stand-alone program. It is an extension that lets PsychoPy work with Tobii eye trackers.
- This project is unofficial.
- Test the scripts thoroughly before jumping into data collection!
Yu-Han Luo
- Clone or download this folder
- Install the package with
pip install .
or put the folder in your project
import os
from psychopy import visual, core
from psychopy_tobii_infant import TobiiInfantController
# create a Window to control the monitor
win = visual.Window(
    size=[1280, 1024],
    units='norm',
    fullscr=True,
    allowGUI=False)
# initialize TobiiInfantController to communicate with the eyetracker
controller = TobiiInfantController(win)
# show the relative position of the subject to the eyetracker
# Press space to exit
controller.show_status()
# run calibration
# - Use 1~9 (depending on the number of calibration points) to present
# calibration stimulus and 0 to hide the target.
# - Press space to start collecting calibration samples.
# - Press return (Enter) to finish the calibration and show the result.
# - Choose the points to recalibrate with 1~9.
# - Press decision_key (default: space) to accept the calibration or to recalibrate.
# stimuli to use in calibration
# The number of stimuli must be equal to or greater than the number of calibration points.
CALISTIMS = [x for x in os.listdir('infant/') if x.endswith('.png')]
# prepend the folder so the paths point to the calibration stimuli
CALISTIMS = ['infant/{}'.format(x) for x in CALISTIMS]
controller.run_calibration([(-0.4, 0.4), (-0.4, -0.4), (0.0, 0.0), (0.4, 0.4), (0.4, -0.4)], CALISTIMS)
# Start recording
controller.start_recording('demo1-test.tsv')
core.wait(3) # record for 3 seconds
# stop recording
controller.stop_recording()
# close the file
controller.close()
# shut down the experiment
win.close()
core.quit()
Currently tested on Python 3.5.7 and Python 3.6.9
- PsychoPy
    - supports both PsychoPy 2 (tested on 1.90.3; should work on older versions) and PsychoPy 3 (tested on 3.2.3)
- tobii-research
    - >=1.6.0 for Python 3.5
    - >=1.7.0 for Python 3.6
If you wish to run calibration validation, you need to install the Tobii Pro SDK add-ons or put them inside the source folder as shown below. The add-ons are Apache-2.0 licensed.
psychopy_tobii_infant
├── __init__.py
└── tobii_research_addons
├── __init__.py
├── ScreenBasedCalibrationValidation.py
└── vectormath.py
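With the files placed as above, a quick way to check that the add-ons can be found is to import the subpackage (a minimal sketch based on the layout shown):

```python
# if the add-ons sit inside psychopy_tobii_infant as shown above,
# this import should succeed without errors
from psychopy_tobii_infant import tobii_research_addons

print(tobii_research_addons.__file__)  # shows where the add-ons were loaded from
```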
Demo stimuli are released under Creative Commons CC0 (no copyright reserved).
Notes
On Windows, the PsychoPy window sometimes does not have keyboard focus, so key presses are not detected by PsychoPy. Users might get stuck in `show_status()` or other procedures that require keyboard input. Details can be found in #8. Two workarounds are provided (see the sketch below):
- Put `from moviepy.config import get_setting` at the beginning of the script.
- Use Alt + Tab to manually focus the PsychoPy window.
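A minimal sketch of the first workaround; everything else follows the basic usage above:

```python
# workaround for the keyboard-focus issue described in #8:
# place this import at the very top of the experiment script
from moviepy.config import get_setting  # noqa: F401 (imported only to trigger the workaround)

from psychopy import visual
from psychopy_tobii_infant import TobiiInfantController

win = visual.Window(size=[1280, 1024], units='norm', fullscr=True, allowGUI=False)
controller = TobiiInfantController(win)
controller.show_status()  # keyboard input (space) should now be detected
```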
The demo scripts illustrate typical use cases:
- Basic calibration
    - Show the relative position of the subject to the eyetracker
    - Run five-points calibration
- Looking time with a static image
    - Show the relative position of the subject to the eyetracker
    - Run five-points calibration
    - Collect looking time data based on the eyetracker (static image)
- Looking time with a video
    - Show the relative position of the subject to the eyetracker
    - Run five-points calibration
    - Collect looking time data based on the eyetracker (video)
- Automatic calibration with adjusted parameters
    - Show the relative position of the subject to the eyetracker
    - Adjust parameters of the calibration procedure
    - Run five-points calibration automatically
- Customized calibration
    - Show the relative position of the subject to the eyetracker
    - Use a customized calibration procedure to attract the participant's attention (the visual stimulus shrinks and a sound plays during calibration)
    - Run five-points calibration
- Calibration with sound
    - Show the relative position of the subject to the eyetracker
    - Run five-points calibration with sound
- Calibration validation
    - Show the relative position of the subject to the eyetracker
    - Run five-points calibration automatically
    - Run calibration validation automatically and show the results
- Validation procedure for `TobiiInfantController`.
- New `shuffle` argument (default is `True`) for `TobiiInfantController.run_calibration` to control the randomization of calibration stimuli.
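For example, to present the calibration stimuli in a fixed order instead of a random one (a sketch reusing `CALISTIMS` and the calibration points from the basic usage above):

```python
# disable randomization of the calibration stimuli
controller.run_calibration(
    [(-0.4, 0.4), (-0.4, -0.4), (0.0, 0.0), (0.4, 0.4), (0.4, -0.4)],
    CALISTIMS,
    shuffle=False)
```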
- A large part of the code has been refactored. If you use a modified version of this package, please be aware of this before upgrading!
- New class `InfantStimuli` handles the images used for infant-friendly calibration/validation. Users can now pass additional arguments of `psychopy.visual.ImageStim` for the calibration stimuli.
- The code now conforms to PEP8, and the class names were renamed accordingly. The old names are backward-compatible, so old scripts should run as expected:
    - `tobii_controller` -> `TobiiController`
    - `tobii_infant_controller` -> `TobiiInfantController`
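In existing scripts only the imported name needs to change; a sketch of the update:

```python
# new PEP8-style class name (as in the basic usage above)
from psychopy_tobii_infant import TobiiInfantController

controller = TobiiInfantController(win)

# scripts written against the old snake_case name (tobii_infant_controller)
# should keep running, since the class names are backward-compatible
```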
- Calibration validation provided by Tobii Pro SDK add-ons.
- New `focus_time` parameter in `tobii_controller.run_calibration` allows adjusting the duration given to the subject to focus on the calibration target.
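A sketch of its use, assuming the infant controller accepts the same parameter as `tobii_controller.run_calibration`; the 0.5 s value is only illustrative:

```python
# focus_time: how long the subject is given to focus on the target
controller.run_calibration(
    [(-0.4, 0.4), (-0.4, -0.4), (0.0, 0.0), (0.4, 0.4), (0.4, -0.4)],
    CALISTIMS,
    focus_time=0.5)
```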
- Data precision was slightly improved.
- Code readability was improved.
- `tobii_controller.get_current_pupil_size` now returns the average pupil size of both eyes instead of a separate value for each eye.
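For example, polling the averaged value during recording (a sketch continuing from the basic usage above and assuming the infant controller inherits this method; the polling loop and interval are arbitrary):

```python
controller.start_recording('demo-pupil.tsv')
for _ in range(3):
    core.wait(1)  # poll once per second
    # a single value: the average pupil size of both eyes
    print('pupil size:', controller.get_current_pupil_size())
controller.stop_recording()
```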
- `tobii_controller._write_header` is no longer used. This only affects users who called it in their scripts.
- Python 3.6 support: `tobii-research`, starting from v1.7, supports Python 3.6 (yay!).
- New `audio` parameter in `psychopy_tobii_infant.infant_tobii_controller.run_calibration()`. Users can provide a `psychopy.sound.Sound` object to play during calibration.
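For example (a sketch; the sound file is a placeholder and the calibration arguments follow the basic usage above):

```python
from psychopy import sound

# sound to play while the calibration stimuli are presented
attention_sound = sound.Sound('infant/attention.wav')  # placeholder file

controller.run_calibration(
    [(-0.4, 0.4), (-0.4, -0.4), (0.0, 0.0), (0.4, 0.4), (0.4, -0.4)],
    CALISTIMS,
    audio=attention_sound)
```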
- Removed redundant property getters and setters. This does not affect users.
- Removed `embed_event`. The output file now always records the events at the end of the data. This only affects users who used `embed_event` in their scripts.
GPL v3.0 or later
This package is built upon/inspired by the following packages, for which credit goes out to the respective authors.
- PsychoPy
- Tobii Pro SDK
- Tobii Pro SDK add-ons, Apache-2.0 licensed
- psychopy_tobii_controller by Hiroyuki Sogo
- PyGaze by Edwin S. Dalmaijer
Please consider citing PsychoPy to encourage open-source projects:
(APA formatted)
- Peirce, J. (2009). Generating stimuli for neuroscience using PsychoPy. Frontiers in Neuroinformatics, 2, 10. doi:10.3389/neuro.11.010.2008