Eye-Tracking Calibration (Interactive)

Version: v1 (current)

An interactive, click-based calibration task that establishes accurate gaze tracking for web-based studies using webcam eye tracking.

Overview

The Eye-Tracking Calibration (Interactive) task is an alternative calibration approach for studies that use eye-tracking technology. Unlike dwell-based calibration where participants must maintain fixation for a set duration, this interactive variant requires participants to click on each calibration point to advance to the next target.

This task guides participants through:

  1. Interactive Calibration: Clicking on targets at known screen locations (3x3 grid)
  2. Multiple Iterations: Optionally repeating the 9-point sequence for improved accuracy
  3. Completion: Brief pause before proceeding to the next task

Eye-tracking calibration is particularly important for:

  • Visual attention studies: Measuring gaze patterns on scenes, faces, or interfaces
  • Reading research: Tracking eye movements during text reading
  • Usability testing: Where users look on web pages or applications
  • Attention control: Verifying fixation in tasks requiring central gaze
  • Clinical assessment: Eye movement abnormalities (saccades, smooth pursuit)

Why Researchers Use This Task

  1. User Control: Participants advance at their own pace by clicking
  2. Reduced Frustration: No waiting for dwell timers to complete
  3. Better Engagement: Active clicking may increase participant focus
  4. Faster Calibration: Participants can click quickly when ready
  5. Clear Feedback: Explicit action (click) provides better sense of progress
  6. Accessibility: Easier for participants who have difficulty maintaining steady gaze

Current Implementation Status

Fully Implemented:

  • ✅ 9-point calibration (3x3 grid covering screen)
  • ✅ Click-based advancement (no dwell timers)
  • ✅ Randomized point presentation order
  • ✅ Multiple iteration support
  • ✅ Configurable point size and pause duration
  • ✅ Integration with WebGazer.js (webcam-based eye tracking)

Partially Implemented:

  • ⚠️ Limited to webcam-based tracking (not hardware eye trackers)
  • ⚠️ No validation phase (assumes participant clicked accurately)
  • ⚠️ Accuracy depends on webcam quality and lighting

Not Yet Implemented:

  • ❌ Post-calibration validation with accuracy reporting
  • ❌ Recalibration option based on accuracy metrics
  • ❌ Integration with external eye-tracking hardware (Tobii, EyeLink)
  • ❌ Positioning checks (camera, lighting, distance)

Configuration Parameters

Calibration Parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| Iterations | number | 1 | Number of full rounds through all 9 calibration points |
| Final Pause (ms) | number | 600 | Pause duration after the last point before task completion |
| Point Size (px) | number | 28 | Size of the calibration dot in pixels |
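As an illustration, the three parameters can be sketched as a configuration object. The snake_case keys mirror the names used later in this document (final_pause_ms, point_size_px); the interface itself is an assumption, not the task's actual schema.

```typescript
// Hypothetical sketch of the task configuration; field names are
// illustrative, with defaults taken from the table above.
interface CalibrationConfig {
  iterations: number;      // full rounds through all 9 points
  final_pause_ms: number;  // pause after the last point
  point_size_px: number;   // diameter of the calibration dot
}

const defaults: CalibrationConfig = {
  iterations: 1,
  final_pause_ms: 600,
  point_size_px: 28,
};
```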

Calibration Point Positions

The task uses a fixed 9-point grid at standard screen positions:

| Position | Screen Location | X Coordinate | Y Coordinate |
|---|---|---|---|
| Top-left | Upper left corner | 10% width | 10% height |
| Top-middle | Top center | 50% width | 10% height |
| Top-right | Upper right corner | 90% width | 10% height |
| Middle-left | Left center | 10% width | 50% height |
| Center | Screen center | 50% width | 50% height |
| Middle-right | Right center | 90% width | 50% height |
| Bottom-left | Lower left corner | 10% width | 90% height |
| Bottom-middle | Bottom center | 50% width | 90% height |
| Bottom-right | Lower right corner | 90% width | 90% height |
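The grid above can be computed directly from the viewport size. The sketch below is illustrative (the function name is an assumption), but the fractions and position keys match the table and the response data in this document.

```typescript
// Compute the 9 calibration points as pixel coordinates from the
// viewport size, using the 10% / 50% / 90% fractions in the table.
// Position keys (e.g. "top_left") match those in the response data.
type CalPoint = { pos: string; x: number; y: number };

function calibrationPoints(width: number, height: number): CalPoint[] {
  const rows: [string, number][] = [["top", 0.1], ["middle", 0.5], ["bottom", 0.9]];
  const cols: [string, number][] = [["left", 0.1], ["middle", 0.5], ["right", 0.9]];
  const points: CalPoint[] = [];
  for (const [row, fy] of rows) {
    for (const [col, fx] of cols) {
      points.push({ pos: `${row}_${col}`, x: fx * width, y: fy * height });
    }
  }
  return points;
}
```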

Data Output

Markers and Responses

The task records a marker when each calibration point is displayed, and a response when the participant clicks it.

Markers (point_shown):

```json
{
  "type": "point_shown",
  "ts": "2024-01-01T00:00:01.000Z",
  "hr": 1234.56,
  "data": {
    "pos": "top_left",
    "index": 0
  }
}
```

Markers (point_clicked):

```json
{
  "type": "point_clicked",
  "ts": "2024-01-01T00:00:02.247Z",
  "hr": 2481.56,
  "data": {
    "pos": "top_left",
    "index": 0
  }
}
```

Response Data:

```json
{
  "pos": "top_left",
  "index": 0
}
```

| Field | Type | Description |
|---|---|---|
| pos | string | Grid position key (e.g., "top_left", "middle_middle", "bottom_right") |
| index | number | Sequential point number in the calibration sequence (0-8 for the first iteration, 9-17 for the second, etc.) |

Summary Artifact

None. The interactive eye-tracking calibration task does not generate a summary artifact. Calibration data is available in the participation markers and responses for post-processing.
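One common post-processing step is to pair each point_shown marker with its point_clicked marker and derive a click latency from the high-resolution (hr) timestamps. The sketch below shows the idea; the function name is an assumption, but the marker shape follows the examples above.

```typescript
// Pair point_shown / point_clicked markers by calibration index and
// compute the click latency (ms) from the hr timestamps.
type Marker = { type: string; hr: number; data: { pos: string; index: number } };

function clickLatencies(markers: Marker[]): Map<number, number> {
  const shownAt = new Map<number, number>();
  const latencies = new Map<number, number>();
  for (const m of markers) {
    if (m.type === "point_shown") {
      shownAt.set(m.data.index, m.hr);
    } else if (m.type === "point_clicked" && shownAt.has(m.data.index)) {
      latencies.set(m.data.index, m.hr - shownAt.get(m.data.index)!);
    }
  }
  return latencies;
}
```

Applied to the two example markers above, index 0 yields a latency of 2481.56 − 1234.56 ≈ 1247 ms.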

Calibration Procedure

9-Point Grid Layout

The task always uses a 9-point calibration grid:

```
1    2    3

4    5    6

7    8    9
```

Presentation Order: Points appear in randomized order to prevent anticipatory eye movements.

Duration: Variable (depends on participant click speed, typically 10-30 seconds)

Accuracy: Depends on participant clicking accuracy and webcam quality (no validation phase)
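The randomized, multi-iteration sequence can be sketched as follows. This is an illustrative reconstruction (the function names are assumptions): each iteration reshuffles the 9 position keys, and the sequential index continues across iterations (0-8, 9-17, ...), as in the response data.

```typescript
// Build the full presentation sequence for a given iteration count,
// reshuffling the 9 positions each round (Fisher-Yates).
const POSITIONS = [
  "top_left", "top_middle", "top_right",
  "middle_left", "middle_middle", "middle_right",
  "bottom_left", "bottom_middle", "bottom_right",
];

function shuffle<T>(arr: T[]): T[] {
  const a = arr.slice();
  for (let i = a.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [a[i], a[j]] = [a[j], a[i]];
  }
  return a;
}

function presentationSequence(iterations: number): { pos: string; index: number }[] {
  const seq: { pos: string; index: number }[] = [];
  for (let it = 0; it < iterations; it++) {
    for (const pos of shuffle(POSITIONS)) {
      seq.push({ pos, index: seq.length }); // index runs 0..(9*iterations - 1)
    }
  }
  return seq;
}
```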

Participant Experience

  1. Calibration Instructions:

    • "You will see dots appear on the screen one at a time."
    • "Click on each dot to proceed to the next one."
    • "Try to keep your head still during calibration."
  2. Calibration Procedure:

    • Dot appears at random screen location
    • Participant clicks on the dot
    • Next dot immediately appears at new location
    • Repeat for all 9 points
    • If multiple iterations configured, sequence repeats
  3. Completion:

    • After final click, brief pause (default 600ms)
    • Task completes automatically
    • "Eye-tracking is ready. Please keep your head relatively still during the task."
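The flow above amounts to a simple show-click loop followed by a final pause. The sketch below is purely illustrative: runCalibration and the three hooks (showPoint, awaitClickOn, pause) are assumed helpers, not the task's real API.

```typescript
// Hypothetical task loop: show each point, wait for a click on it,
// then pause briefly and finish.
async function runCalibration(
  sequence: { pos: string; index: number }[],
  finalPauseMs: number,
  hooks: {
    showPoint: (pos: string) => void;          // emits point_shown
    awaitClickOn: (pos: string) => Promise<void>; // resolves on point_clicked
    pause: (ms: number) => Promise<void>;
  },
): Promise<void> {
  for (const point of sequence) {
    hooks.showPoint(point.pos);
    await hooks.awaitClickOn(point.pos);
  }
  await hooks.pause(finalPauseMs); // default 600 ms
}
```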

Design Recommendations

When to Use Interactive Calibration

Choose Interactive Calibration When:

  • Participants prefer active control over passive waiting
  • You want faster calibration (participants can click immediately when ready)
  • Dwell-based calibration causes frustration or confusion
  • Participants have difficulty maintaining steady fixation
  • You need a simplified calibration without validation phases

Choose Standard Calibration When:

  • You need validation and accuracy metrics
  • You want to ensure proper fixation duration at each point
  • You require recalibration options based on accuracy
  • High-precision eye-tracking is critical for your study

Environment Requirements

  • Lighting: Even, frontal lighting (no backlighting from windows)
  • Distance: 50-70 cm from screen (arm's length)
  • Camera: Built-in laptop webcam or external webcam (720p minimum, 1080p preferred)
  • Stability: Chair and desk stable (no rocking or movement)
  • Glasses: Compatible with glasses, but contacts or no correction preferred

Calibration Design

  • Iterations: 1 iteration sufficient for most studies; 2-3 iterations for improved accuracy
  • Point Size: 28px default; increase to 40px for older adults or vision impairments
  • Final Pause: 600ms default; increase to 1000ms if participants need transition time
  • Instructions: Emphasize accurate clicking on the center of each dot

Quality Control Considerations

Important Limitations:

  • No automatic validation of calibration accuracy
  • Assumes participants click accurately on each point center
  • No recalibration mechanism based on accuracy metrics
  • Requires manual review of eye-tracking data quality in subsequent tasks

Recommendations:

  • Include validation task after calibration (e.g., fixation verification)
  • Monitor data quality in early trials of eye-tracking tasks
  • Consider excluding participants with poor gaze data quality
  • Document any participant-reported issues (glasses, lighting, camera problems)
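A validation task of the kind recommended above typically reduces to comparing gaze estimates against known target locations. The sketch below shows one way to compute mean and max error in pixels; the names are assumptions, not part of this task.

```typescript
// Given gaze estimates recorded while the participant fixated known
// validation targets, compute mean and max Euclidean error (pixels).
type GazeSample = { x: number; y: number; targetX: number; targetY: number };

function gazeError(samples: GazeSample[]): { mean: number; max: number } {
  const errs = samples.map(s => Math.hypot(s.x - s.targetX, s.y - s.targetY));
  const mean = errs.reduce((a, b) => a + b, 0) / errs.length;
  return { mean, max: Math.max(...errs) };
}
```

A threshold on the mean error (the acceptable value depends on the study's precision needs) can then drive an exclusion or recalibration decision.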

Integration with Tasks

  • Timing: Calibrate immediately before eye-tracking tasks
  • Recalibration: For sessions >30 minutes, run calibration task again midway
  • Reminders: Remind participants to keep head still at task start
  • Validation: Consider adding a brief fixation verification task after calibration

Comparison with Standard Calibration

| Feature | Interactive Calibration | Standard Calibration |
|---|---|---|
| Advancement | Click on each point | Automatic after dwell duration |
| Duration | Variable (participant-controlled) | Fixed (point_duration_ms × num_points) |
| Validation | None | Yes (with accuracy reporting) |
| Recalibration | Not available | Available if accuracy insufficient |
| Positioning Checks | None | Camera, lighting, distance checks |
| User Control | High (clicks to advance) | Low (waits for timers) |
| Accuracy Feedback | None | Mean/max error in pixels |
| Complexity | Simple (click only) | Full calibration workflow |
| Best For | Quick studies, user preference | High-precision, quality control |

Common Issues and Solutions

| Issue | Solution |
|---|---|
| Participant clicks too quickly | Emphasize accuracy over speed in instructions; increase final_pause_ms |
| Clicks miss the dot center | Increase point_size_px; provide a practice trial before calibration |
| Poor eye-tracking after calibration | Run multiple iterations; check lighting and distance; consider standard calibration |
| Participant confused about task | Clarify that they should click directly on the dot, not elsewhere |
| Dots appear in unexpected order | Normal behavior (randomized order prevents anticipation) |
| Task completes too abruptly | Increase final_pause_ms to provide transition time |

Use Cases by Research Area

Visual Attention

  • Free Viewing: Where participants look on scenes/images
  • Visual Search: Scan paths during search tasks
  • Change Blindness: What participants fixate before/after change

Reading Research

  • Eye Movements: Saccades, fixations, regressions during reading
  • Word Difficulty: Fixation duration on easy vs. hard words
  • Dyslexia: Atypical eye movement patterns

Usability/UX

  • Web Design: Heat maps of gaze on interfaces
  • Advertising: Attention to ads, banner blindness
  • Navigation: How users explore menus and pages

Clinical Assessment

  • Saccades: Latency, accuracy, velocity
  • Smooth Pursuit: Tracking moving targets
  • Fixation Stability: Nystagmus, tremor

References

  • Duchowski, A. T. (2017). Eye Tracking Methodology: Theory and Practice (3rd ed.). Springer.

  • Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., & Hays, J. (2016). WebGazer: Scalable webcam eye tracking using user interactions. In Proceedings of the 25th International Joint Conference on Artificial Intelligence (pp. 3839-3845).

  • Holmqvist, K., Nyström, M., Andersson, R., Dewhurst, R., Jarodzka, H., & Van de Weijer, J. (2011). Eye Tracking: A Comprehensive Guide to Methods and Measures. Oxford University Press.

  • Semmelmann, K., & Weigelt, S. (2018). Online webcam-based eye tracking in cognitive science: A first look. Behavior Research Methods, 50(2), 451-465.

See Also