Eye-Tracking Calibration
Version: v1 (current)
A calibration task for establishing accurate gaze tracking in web-based studies that use webcam eye tracking.
Overview
The Eye-Tracking Calibration task is an essential setup step for studies that use eye-tracking technology to monitor participants' gaze patterns. Calibration maps the relationship between camera-captured eye images and screen coordinates, enabling accurate measurement of where participants are looking during experimental tasks.
This task guides participants through:
- Positioning: Adjusting seating distance and camera angle
- Calibration: Fixating on targets at known screen locations
- Validation: Testing accuracy of the calibration
- Recalibration: If needed, repeating calibration for better accuracy
Eye-tracking calibration is particularly important for:
- Visual attention studies: Measuring gaze patterns on scenes, faces, or interfaces
- Reading research: Tracking eye movements during text reading
- Usability testing: Where users look on web pages or applications
- Attention control: Verifying fixation in tasks requiring central gaze
- Clinical assessment: Eye movement abnormalities (saccades, smooth pursuit)
Why Researchers Use This Task
- Measurement Validity: Accurate eye-tracking requires proper calibration
- Data Quality: Poor calibration leads to noisy, unusable gaze data
- Standardization: Consistent calibration procedure across participants
- Quality Control: Validation step identifies insufficient calibration
- Participant Engagement: Interactive calibration increases compliance
- Remote Studies: Enables eye-tracking in web-based, unmoderated studies
Current Implementation Status
Fully Implemented:
- ✅ Multi-point calibration (5-point, 9-point, 13-point)
- ✅ Visual feedback during calibration
- ✅ Validation phase with accuracy reporting
- ✅ Recalibration option if accuracy insufficient
- ✅ Positioning guidance (camera, lighting, distance checks)
- ✅ Integration with WebGazer.js (webcam-based eye tracking)
Partially Implemented:
- ⚠️ Limited to webcam-based tracking (not hardware eye trackers)
- ⚠️ Accuracy depends on webcam quality and lighting
Not Yet Implemented:
- ❌ Integration with external eye-tracking hardware (Tobii, EyeLink)
- ❌ Advanced validation metrics (fixation stability, drift correction)
- ❌ Continuous recalibration during long sessions
Configuration Parameters
Calibration Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| Calibration Points | number | 9 | Number of calibration targets (5, 9, or 13) |
| Point Duration (ms) | number | 1000 | Duration participant must fixate each point |
| Inter Point Interval (ms) | number | 500 | Delay between calibration points |
| Validation Enabled | boolean | true | Run validation phase after calibration |
| Min Accuracy Threshold | number | 100 | Maximum mean gaze error (in pixels) allowed to accept the calibration |
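The three supported point counts correspond to standard grid layouts (see Calibration Procedures below). A minimal sketch of generating normalized target coordinates for each layout — the function names are illustrative, not part of the task's API:

```javascript
// Generate normalized [x, y] targets (0..1) for a calibration layout.
// A margin keeps targets away from the exact screen edge.
function calibrationPoints(count, margin = 0.05) {
  const lo = margin, mid = 0.5, hi = 1 - margin;
  const corners = [[lo, lo], [hi, lo], [lo, hi], [hi, hi]];
  const edges = [[mid, lo], [lo, mid], [hi, mid], [mid, hi]];
  const quadrants = [[0.25, 0.25], [0.75, 0.25], [0.25, 0.75], [0.75, 0.75]];
  switch (count) {
    case 5:  return [...corners, [mid, mid]];
    case 9:  return [...corners, ...edges, [mid, mid]];
    case 13: return [...corners, ...edges, [mid, mid], ...quadrants];
    default: throw new Error(`Unsupported point count: ${count}`);
  }
}

// Scale normalized points to pixel coordinates for a given viewport.
function toPixels(points, width, height) {
  return points.map(([x, y]) => [Math.round(x * width), Math.round(y * height)]);
}
```

Normalized coordinates keep the layout independent of screen resolution; pixel conversion happens once the viewport size is known.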
Positioning Check
| Parameter | Type | Default | Description |
|---|---|---|---|
| Check Camera | boolean | true | Verify webcam is working |
| Check Lighting | boolean | true | Check for adequate face illumination |
| Check Distance | boolean | true | Guide participant to optimal distance (~50-70cm) |
| Allow Skip Positioning | boolean | false | Let participant skip positioning checks (not recommended) |
Visual Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| Target Size (px) | number | 20 | Size of calibration target dot |
| Target Color (hex) | string | 'FF0000' | Color of calibration target (red default) |
| Show Gaze Cursor | boolean | true | Display real-time gaze position during validation |
| Background Color (hex) | string | 'E0E0E0' | Background color during calibration |
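Note that the hex color parameters are stored without a leading '#'. A small illustrative sketch (not the task's actual rendering code) of turning the visual parameters into CSS-ready values for the target dot:

```javascript
// Build inline-style values for the calibration target from the
// visual parameters above. Hex colors are stored without a leading '#'.
function targetStyle({ sizePx = 20, colorHex = 'FF0000' } = {}) {
  return {
    width: `${sizePx}px`,
    height: `${sizePx}px`,
    borderRadius: '50%',            // render the target as a round dot
    backgroundColor: `#${colorHex}`,
  };
}
```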
Recalibration
| Parameter | Type | Default | Description |
|---|---|---|---|
| Allow Recalibration | boolean | true | Offer recalibration if accuracy insufficient |
| Max Recalibration Attempts | number | 3 | Maximum number of recalibration attempts |
| Auto Recalibrate Threshold | number | 150 | Automatically trigger recalibration if error > threshold |
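The three recalibration parameters interact with the accuracy threshold from the calibration table. One way to sketch the resulting decision logic (the function and return labels are hypothetical, assuming error at or below the accuracy threshold is accepted and attempts are counted from 1):

```javascript
// Decide what happens after a validation pass, using the recalibration
// parameters above. All thresholds are mean gaze error in pixels.
function recalibrationDecision(meanErrorPx, attempt, {
  minAccuracyThreshold = 100,    // accept at or below this error
  autoRecalibrateThreshold = 150,
  maxAttempts = 3,
} = {}) {
  if (meanErrorPx <= minAccuracyThreshold) return 'accept';
  if (attempt >= maxAttempts) return 'exclude';        // out of attempts
  if (meanErrorPx > autoRecalibrateThreshold) return 'auto_recalibrate';
  return 'offer_recalibration';  // borderline: let the participant choose
}
```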
Data Output
Markers and Responses
The task is passive (no participant clicks), so no response objects are recorded. Markers track each calibration point display.
Markers (point_shown):
```json
{
  "type": "point_shown",
  "ts": "2024-01-01T00:00:02.000Z",
  "hr": 2234.56,
  "data": {
    "pos": "top_left",
    "index": 0
  }
}
```
The pos field contains the grid position key (e.g., "top_left", "middle_middle", "bottom_right"). The index field is the sequential point number in the calibration sequence.
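For post-processing, the calibration sequence can be recovered from the raw marker list. A minimal sketch (the helper name is illustrative):

```javascript
// Extract the ordered sequence of calibration target positions from a
// participation's markers: keep point_shown events, sort by index.
function calibrationSequence(markers) {
  return markers
    .filter((m) => m.type === 'point_shown')
    .sort((a, b) => a.data.index - b.data.index)
    .map((m) => m.data.pos);
}
```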
Summary Artifact
None. The eye-tracking calibration task does not generate a summary artifact. Calibration data is available in the participation markers for post-processing.
Calibration Procedures
5-Point Calibration (Fast)
Points: Center + 4 corners
1       2

    5

3       4
Use Case: Quick calibration for tasks not requiring high precision
Duration: ~10 seconds
Accuracy: Moderate (±100-150 pixels typical)
9-Point Calibration (Standard)
Points: 3x3 grid covering screen
1 2 3
4 5 6
7 8 9
Use Case: Standard for most eye-tracking studies
Duration: ~15 seconds
Accuracy: Good (±75-100 pixels typical)
13-Point Calibration (High Accuracy)
Points: 9-point grid + 4 additional points at intermediate positions (one per screen quadrant)
1     2     3
   10    11
4     5     6
   12    13
7     8     9
Use Case: High-precision requirements (reading, small targets)
Duration: ~20 seconds
Accuracy: Best (±50-75 pixels typical)
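The approximate durations above follow from the point duration and inter-point interval defaults; a back-of-envelope sketch (the quoted durations also include a little overhead for transitions):

```javascript
// Estimated calibration duration: each point is shown for pointDurationMs,
// with interPointIntervalMs between points (no interval after the last).
function estimatedDurationMs(points, pointDurationMs = 1000, interPointIntervalMs = 500) {
  return points * pointDurationMs + (points - 1) * interPointIntervalMs;
}
```

With the defaults, 5 points take 7 s, 9 points take 13 s, and 13 points take 19 s of pure calibration time, consistent with the rough figures quoted above.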
Participant Experience
1. Positioning Check:
   - "Please adjust your position so your face is clearly visible in the camera preview."
   - Camera view shown with face detection overlay
   - Distance guidance: "Move closer/farther until your face fills the oval."
   - Lighting check: "Ensure your face is well-lit. Avoid backlighting."
2. Calibration Instructions:
   - "You will see red dots appear on the screen. Please look directly at each dot and keep your eyes on it until it disappears."
   - "Try to keep your head still during calibration."
   - "Click 'Start Calibration' when ready."
3. Calibration Procedure:
   - A red dot appears at the first target location
   - Participant fixates for 1 second
   - The dot moves to the next location
   - Repeat for all calibration points (9 by default)
4. Validation:
   - "Now we'll test the calibration accuracy."
   - Dots appear again; estimated gaze position is shown with a cursor
   - Accuracy is computed and reported: "Calibration accuracy: 78 pixels (Good)"
5. Recalibration Decision:
   - If accuracy is good (< 100px): "Calibration successful! Proceed to task."
   - If accuracy is poor (> 100px): "Calibration accuracy is low. Would you like to recalibrate for better results?"
6. Completion:
   - "Eye-tracking is ready. Please keep your head relatively still during the task."
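The accuracy figure reported to participants during validation (e.g. "78 pixels") is a mean distance between gaze estimates and the known validation targets. A minimal sketch of that computation, assuming gaze samples are paired with the target shown when they were recorded:

```javascript
// Mean Euclidean error (pixels) between gaze estimates and the known
// validation target positions they were collected at.
// samples: [{ gaze: [x, y], target: [x, y] }, ...]
function meanGazeError(samples) {
  if (samples.length === 0) return NaN;
  const total = samples.reduce((sum, { gaze, target }) => {
    const dx = gaze[0] - target[0];
    const dy = gaze[1] - target[1];
    return sum + Math.hypot(dx, dy);  // per-sample Euclidean distance
  }, 0);
  return total / samples.length;
}
```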
Design Recommendations
Environment Requirements
- Lighting: Even, frontal lighting (no backlighting from windows)
- Distance: 50-70 cm from screen (arm's length)
- Camera: Built-in laptop webcam or external webcam (720p minimum, 1080p preferred)
- Stability: Chair and desk stable (no rocking or movement)
- Glasses: Compatible with glasses, but contacts or no correction preferred
Calibration Design
- Number of Points: 9-point standard; 13-point for reading or high precision
- Duration per Point: 1000ms minimum; 1500ms for noisy environments
- Validation: Always validate; recalibrate if mean error > 100px
- Instructions: Emphasize head stability and fixation at center of target
Quality Control
- Accuracy Threshold: Accept if mean error < 100px for standard tasks; < 75px for precision tasks
- Maximum Attempts: Allow 3 recalibration attempts; if still poor, exclude participant
- Head Movement: Monitor and warn if excessive head movement detected
- Glasses/Contacts: Ask participants; note in data (may affect accuracy)
Integration with Tasks
- Timing: Calibrate immediately before eye-tracking tasks
- Recalibration: For sessions >30 minutes, recalibrate midway
- Drift Correction: Consider brief validation/correction between blocks
- Reminders: Remind participants to keep head still at task start
Common Issues and Solutions
| Issue | Solution |
|---|---|
| Poor accuracy (>150px error) | Check lighting, distance, glasses; recalibrate; may need to exclude |
| Camera not detected | Check permissions; ensure webcam connected; try different browser |
| Face not detected | Improve lighting; ensure face centered; remove obstructions (hat, hair) |
| Accuracy degrades during task | Participant moving head; recalibrate between blocks; remind about stillness |
| Calibration hangs on one point | Participant not fixating; provide clearer instructions; check for understanding |
| Gaze cursor jumps around | Insufficient samples; increase calibration duration; improve lighting |
Validation Criteria
Acceptable Calibration:
- Mean error < 100 pixels
- Max error < 200 pixels
- Data loss < 10%
- Precision < 50 pixels SD
Good Calibration:
- Mean error < 75 pixels
- Max error < 150 pixels
- Data loss < 5%
- Precision < 30 pixels SD
Excellent Calibration:
- Mean error < 50 pixels
- Max error < 100 pixels
- Data loss < 2%
- Precision < 20 pixels SD
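The three tiers above can be applied mechanically when screening calibrations in post-processing. A sketch, assuming all four metrics must meet a tier's limits for the calibration to qualify (the function name and "reject" label are illustrative):

```javascript
// Quality tiers from the Validation Criteria section, strictest first.
const TIERS = [
  { label: 'excellent',  meanErr: 50,  maxErr: 100, lossPct: 2,  precisionSd: 20 },
  { label: 'good',       meanErr: 75,  maxErr: 150, lossPct: 5,  precisionSd: 30 },
  { label: 'acceptable', meanErr: 100, maxErr: 200, lossPct: 10, precisionSd: 50 },
];

// Return the best tier whose limits all hold, or 'reject' otherwise.
function classifyCalibration({ meanErr, maxErr, lossPct, precisionSd }) {
  for (const t of TIERS) {
    if (meanErr < t.meanErr && maxErr < t.maxErr &&
        lossPct < t.lossPct && precisionSd < t.precisionSd) {
      return t.label;
    }
  }
  return 'reject';
}
```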
Use Cases by Research Area
Visual Attention
- Free Viewing: Where participants look on scenes/images
- Visual Search: Scan paths during search tasks
- Change Blindness: What participants fixate before/after change
Reading Research
- Eye Movements: Saccades, fixations, regressions during reading
- Word Difficulty: Fixation duration on easy vs. hard words
- Dyslexia: Atypical eye movement patterns
Usability/UX
- Web Design: Heat maps of gaze on interfaces
- Advertising: Attention to ads, banner blindness
- Navigation: How users explore menus and pages
Clinical Assessment
- Saccades: Latency, accuracy, velocity
- Smooth Pursuit: Tracking moving targets
- Fixation Stability: Nystagmus, tremor
References
- Duchowski, A. T. (2017). Eye Tracking Methodology: Theory and Practice (3rd ed.). Springer.
- Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., & Hays, J. (2016). WebGazer: Scalable webcam eye tracking using user interactions. In Proceedings of the 25th International Joint Conference on Artificial Intelligence (pp. 3839-3845).
- Holmqvist, K., Nyström, M., Andersson, R., Dewhurst, R., Jarodzka, H., & Van de Weijer, J. (2011). Eye Tracking: A Comprehensive Guide to Methods and Measures. Oxford University Press.
- Semmelmann, K., & Weigelt, S. (2018). Online webcam-based eye tracking in cognitive science: A first look. Behavior Research Methods, 50(2), 451-465.
See Also
- Pro/Antisaccade - Oculomotor control without eye-tracking
- Posner Cueing - Spatial attention (can use eye-tracking)
- Visual Search - Scan patterns analyzable with eye-tracking