Eye-Tracking Calibration (Interactive)
Version: v1 (current)
An interactive click-based calibration task that establishes accurate gaze estimation for web-based studies using webcam eye tracking.
Overview
The Eye-Tracking Calibration (Interactive) task is an alternative calibration approach for studies that use eye-tracking technology. Unlike dwell-based calibration where participants must maintain fixation for a set duration, this interactive variant requires participants to click on each calibration point to advance to the next target.
This task guides participants through:
- Interactive Calibration: Clicking on targets at known screen locations (3x3 grid)
- Multiple Iterations: Optionally repeating the 9-point sequence for improved accuracy
- Completion: Brief pause before proceeding to the next task
Eye-tracking calibration is particularly important for:
- Visual attention studies: Measuring gaze patterns on scenes, faces, or interfaces
- Reading research: Tracking eye movements during text reading
- Usability testing: Where users look on web pages or applications
- Attention control: Verifying fixation in tasks requiring central gaze
- Clinical assessment: Eye movement abnormalities (saccades, smooth pursuit)
Why Researchers Use This Task
- User Control: Participants advance at their own pace by clicking
- Reduced Frustration: No waiting for dwell timers to complete
- Better Engagement: Active clicking may increase participant focus
- Faster Calibration: Participants can click quickly when ready
- Clear Feedback: Explicit action (click) provides better sense of progress
- Accessibility: Easier for participants who have difficulty maintaining steady gaze
Current Implementation Status
Fully Implemented:
- ✅ 9-point calibration (3x3 grid covering screen)
- ✅ Click-based advancement (no dwell timers)
- ✅ Randomized point presentation order
- ✅ Multiple iteration support
- ✅ Configurable point size and pause duration
- ✅ Integration with WebGazer.js (webcam-based eye tracking)
Partially Implemented:
- ⚠️ Limited to webcam-based tracking (not hardware eye trackers)
- ⚠️ No validation phase (assumes participant clicked accurately)
- ⚠️ Accuracy depends on webcam quality and lighting
Not Yet Implemented:
- ❌ Post-calibration validation with accuracy reporting
- ❌ Recalibration option based on accuracy metrics
- ❌ Integration with external eye-tracking hardware (Tobii, EyeLink)
- ❌ Positioning checks (camera, lighting, distance)
Configuration Parameters
Calibration Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| Iterations | number | 1 | Number of full rounds through all 9 calibration points |
| Final Pause (ms) | number | 600 | Pause duration after the last point before task completion |
| Point Size (px) | number | 28 | Size of the calibration dot in pixels |
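As a rough sketch, these parameters might be set in a configuration object like the one below. The key names `iterations`, `final_pause_ms`, and `point_size_px` are taken from the parameter table and the Common Issues section; the surrounding configuration format is an assumption, not the platform's actual schema.

```javascript
// Hypothetical configuration for the interactive calibration task.
// Values follow the Calibration Design recommendations below:
// extra iterations and a larger dot for improved accuracy.
const calibrationConfig = {
  iterations: 2,        // two full passes through the 9-point grid
  final_pause_ms: 1000, // longer transition pause before the next task
  point_size_px: 40,    // larger dot for older adults or low vision
};
```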
Calibration Point Positions
The task uses a fixed 9-point grid at standard screen positions:
| Position | Screen Location | X Coordinate | Y Coordinate |
|---|---|---|---|
| Top-left | Upper left corner | 10% width | 10% height |
| Top-middle | Top center | 50% width | 10% height |
| Top-right | Upper right corner | 90% width | 10% height |
| Middle-left | Left center | 10% width | 50% height |
| Center | Screen center | 50% width | 50% height |
| Middle-right | Right center | 90% width | 50% height |
| Bottom-left | Lower left corner | 10% width | 90% height |
| Bottom-middle | Bottom center | 50% width | 90% height |
| Bottom-right | Lower right corner | 90% width | 90% height |
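The grid above can be generated from the three fractional offsets (10%, 50%, 90%) of the viewport. The following sketch shows one way to compute the point centers; the function name and object shape are illustrative, not the task's internal code, though the `pos` keys match those reported in the data output (e.g., `top_left`, `middle_middle`).

```javascript
// Compute the 9 calibration point centers for a given viewport size.
// Positions follow the fixed 10% / 50% / 90% grid described above.
function calibrationPoints(width, height) {
  const fracs = [0.1, 0.5, 0.9];
  const rows = ["top", "middle", "bottom"];
  const cols = ["left", "middle", "right"];
  const points = [];
  rows.forEach((row, r) => {
    cols.forEach((col, c) => {
      points.push({
        pos: `${row}_${col}`,       // e.g., "top_left", "middle_middle"
        x: fracs[c] * width,        // horizontal center in pixels
        y: fracs[r] * height,       // vertical center in pixels
      });
    });
  });
  return points;
}
```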
Data Output
Markers and Responses
The task records a `point_shown` marker when each calibration point is displayed and a `point_clicked` marker when the participant clicks it; each click is also stored as a response.
Markers (point_shown):
{
  "type": "point_shown",
  "ts": "2024-01-01T00:00:01.000Z",
  "hr": 1234.56,
  "data": {
    "pos": "top_left",
    "index": 0
  }
}
Markers (point_clicked):
{
  "type": "point_clicked",
  "ts": "2024-01-01T00:00:02.247Z",
  "hr": 2481.56,
  "data": {
    "pos": "top_left",
    "index": 0
  }
}
Response Data:
{
  "pos": "top_left",
  "index": 0
}
| Field | Type | Description |
|---|---|---|
| pos | string | Grid position key (e.g., "top_left", "middle_middle", "bottom_right") |
| index | number | Sequential point number in the calibration sequence (0-8 for first iteration, 9-17 for second, etc.) |
Summary Artifact
None. The interactive eye-tracking calibration task does not generate a summary artifact. Calibration data is available in the participation markers and responses for post-processing.
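For post-processing, a useful derived measure is the click latency at each point, obtained by pairing `point_shown` and `point_clicked` markers on their shared `index`. The sketch below assumes the marker shape documented above; this analysis helper is not part of the task itself.

```javascript
// Pair point_shown / point_clicked markers by calibration point index
// and compute the click latency for each point, using the "hr"
// high-resolution timestamps from the marker payloads.
function clickLatencies(markers) {
  const shown = new Map(); // index -> onset timestamp
  const latencies = [];
  for (const m of markers) {
    if (m.type === "point_shown") {
      shown.set(m.data.index, m.hr);
    } else if (m.type === "point_clicked") {
      const onset = shown.get(m.data.index);
      if (onset !== undefined) {
        latencies.push({
          pos: m.data.pos,
          index: m.data.index,
          latency_ms: m.hr - onset,
        });
      }
    }
  }
  return latencies;
}
```

Applied to the example markers above (`hr` 1234.56 at display, 2481.56 at click), this yields a latency of 1247 ms for the first point.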
Calibration Procedure
9-Point Grid Layout
The task always uses a 9-point calibration grid:
1 2 3
4 5 6
7 8 9
Presentation Order: Points appear in randomized order to prevent anticipatory eye movements.
Duration: Variable (depends on participant click speed, typically 10-30 seconds)
Accuracy: Depends on participant clicking accuracy and webcam quality (no validation phase)
Participant Experience
1. Calibration Instructions:
   - "You will see dots appear on the screen one at a time."
   - "Click on each dot to proceed to the next one."
   - "Try to keep your head still during calibration."
2. Calibration Procedure:
   - Dot appears at random screen location
   - Participant clicks on the dot
   - Next dot immediately appears at new location
   - Repeat for all 9 points
   - If multiple iterations configured, sequence repeats
3. Completion:
   - After final click, brief pause (default 600ms)
   - Task completes automatically
   - "Eye-tracking is ready. Please keep your head relatively still during the task."
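The randomized, multi-iteration sequence described above could be produced as follows. This is a sketch under assumptions: the function name is illustrative, and whether the task reshuffles on every pass (rather than once) is not specified in this documentation.

```javascript
// Build the full presentation sequence: each iteration is one pass
// through all grid positions, shuffled with a Fisher-Yates shuffle
// so points appear in unpredictable order (preventing anticipation).
function buildSequence(positions, iterations) {
  const sequence = [];
  for (let it = 0; it < iterations; it++) {
    const pass = positions.slice(); // copy, shuffle per pass
    for (let i = pass.length - 1; i > 0; i--) {
      const j = Math.floor(Math.random() * (i + 1));
      [pass[i], pass[j]] = [pass[j], pass[i]];
    }
    sequence.push(...pass);
  }
  return sequence;
}
```

With 9 positions and `iterations: 2`, this yields an 18-item sequence, matching the `index` ranges (0-8, then 9-17) described in the data output.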
Design Recommendations
When to Use Interactive Calibration
Choose Interactive Calibration When:
- Participants prefer active control over passive waiting
- You want faster calibration (participants can click immediately when ready)
- Dwell-based calibration causes frustration or confusion
- Participants have difficulty maintaining steady fixation
- You need a simplified calibration without validation phases
Choose Standard Calibration When:
- You need validation and accuracy metrics
- You want to ensure proper fixation duration at each point
- You require recalibration options based on accuracy
- High-precision eye-tracking is critical for your study
Environment Requirements
- Lighting: Even, frontal lighting (no backlighting from windows)
- Distance: 50-70 cm from screen (arm's length)
- Camera: Built-in laptop webcam or external webcam (720p minimum, 1080p preferred)
- Stability: Chair and desk stable (no rocking or movement)
- Glasses: Compatible with glasses, but contacts or no correction preferred
Calibration Design
- Iterations: 1 iteration sufficient for most studies; 2-3 iterations for improved accuracy
- Point Size: 28px default; increase to 40px for older adults or vision impairments
- Final Pause: 600ms default; increase to 1000ms if participants need transition time
- Instructions: Emphasize accurate clicking on the center of each dot
Quality Control Considerations
Important Limitations:
- No automatic validation of calibration accuracy
- Assumes participants click accurately on each point center
- No recalibration mechanism based on accuracy metrics
- Requires manual review of eye-tracking data quality in subsequent tasks
Recommendations:
- Include validation task after calibration (e.g., fixation verification)
- Monitor data quality in early trials of eye-tracking tasks
- Consider excluding participants with poor gaze data quality
- Document any participant-reported issues (glasses, lighting, camera problems)
Integration with Tasks
- Timing: Calibrate immediately before eye-tracking tasks
- Recalibration: For sessions >30 minutes, run calibration task again midway
- Reminders: Remind participants to keep head still at task start
- Validation: Consider adding a brief fixation verification task after calibration
Comparison with Standard Calibration
| Feature | Interactive Calibration | Standard Calibration |
|---|---|---|
| Advancement | Click on each point | Automatic after dwell duration |
| Duration | Variable (participant-controlled) | Fixed (point_duration_ms × num_points) |
| Validation | None | Yes (with accuracy reporting) |
| Recalibration | Not available | Available if accuracy insufficient |
| Positioning Checks | None | Camera, lighting, distance checks |
| User Control | High (clicks to advance) | Low (waits for timers) |
| Accuracy Feedback | None | Mean/max error in pixels |
| Complexity | Simple (click only) | Full calibration workflow |
| Best For | Quick studies, user preference | High-precision, quality control |
Common Issues and Solutions
| Issue | Solution |
|---|---|
| Participant clicks too quickly | Emphasize accuracy over speed in instructions; increase final_pause_ms |
| Clicks miss the dot center | Increase point_size_px; provide practice trial before calibration |
| Poor eye-tracking after calibration | Run multiple iterations; check lighting and distance; consider standard calibration |
| Participant confused about task | Clarify that they should click directly on the dot, not elsewhere |
| Dots appear in unexpected order | Normal behavior (randomized order prevents anticipation) |
| Task completes too abruptly | Increase final_pause_ms to provide transition time |
Use Cases by Research Area
Visual Attention
- Free Viewing: Where participants look on scenes/images
- Visual Search: Scan paths during search tasks
- Change Blindness: What participants fixate before/after change
Reading Research
- Eye Movements: Saccades, fixations, regressions during reading
- Word Difficulty: Fixation duration on easy vs. hard words
- Dyslexia: Atypical eye movement patterns
Usability/UX
- Web Design: Heat maps of gaze on interfaces
- Advertising: Attention to ads, banner blindness
- Navigation: How users explore menus and pages
Clinical Assessment
- Saccades: Latency, accuracy, velocity
- Smooth Pursuit: Tracking moving targets
- Fixation Stability: Nystagmus, tremor
References
- Duchowski, A. T. (2017). Eye Tracking Methodology: Theory and Practice (3rd ed.). Springer.
- Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., & Hays, J. (2016). WebGazer: Scalable webcam eye tracking using user interactions. In Proceedings of the 25th International Joint Conference on Artificial Intelligence (pp. 3839-3845).
- Holmqvist, K., Nyström, M., Andersson, R., Dewhurst, R., Jarodzka, H., & Van de Weijer, J. (2011). Eye Tracking: A Comprehensive Guide to Methods and Measures. Oxford University Press.
- Semmelmann, K., & Weigelt, S. (2018). Online webcam-based eye tracking in cognitive science: A first look. Behavior Research Methods, 50(2), 451-465.
See Also
- Eye-Tracking Calibration - Standard dwell-based calibration with validation
- Pro/Antisaccade - Oculomotor control without eye-tracking
- Posner Cueing - Spatial attention (can use eye-tracking)
- Visual Search - Scan patterns analyzable with eye-tracking