
Eye-Tracking Calibration

Version: v1 (current)

A calibration task for establishing accurate eye-gaze tracking during web-based studies using webcam-based eye tracking.

Overview

The Eye-Tracking Calibration task is an essential setup step for studies that use eye-tracking technology to monitor participants' gaze patterns. Calibration maps the relationship between camera-captured eye images and screen coordinates, enabling accurate measurement of where participants are looking during experimental tasks.

This task guides participants through:

  1. Positioning: Adjusting seating distance and camera angle
  2. Calibration: Fixating on targets at known screen locations
  3. Validation: Testing accuracy of the calibration
  4. Recalibration: If needed, repeating calibration for better accuracy

Eye-tracking calibration is particularly important for:

  • Visual attention studies: Measuring gaze patterns on scenes, faces, or interfaces
  • Reading research: Tracking eye movements during text reading
  • Usability testing: Where users look on web pages or applications
  • Attention control: Verifying fixation in tasks requiring central gaze
  • Clinical assessment: Eye movement abnormalities (saccades, smooth pursuit)

Why Researchers Use This Task

  1. Measurement Validity: Accurate eye-tracking requires proper calibration
  2. Data Quality: Poor calibration leads to noisy, unusable gaze data
  3. Standardization: Consistent calibration procedure across participants
  4. Quality Control: Validation step identifies insufficient calibration
  5. Participant Engagement: Interactive calibration increases compliance
  6. Remote Studies: Enables eye-tracking in web-based, unmoderated studies

Current Implementation Status

Fully Implemented:

  • ✅ Multi-point calibration (5-point, 9-point, 13-point)
  • ✅ Visual feedback during calibration
  • ✅ Validation phase with accuracy reporting
  • ✅ Recalibration option if accuracy insufficient
  • ✅ Positioning guidance (camera, lighting, distance checks)
  • ✅ Integration with WebGazer.js (webcam-based eye tracking)

Partially Implemented:

  • ⚠️ Limited to webcam-based tracking (not hardware eye trackers)
  • ⚠️ Accuracy depends on webcam quality and lighting

Not Yet Implemented:

  • ❌ Integration with external eye-tracking hardware (Tobii, EyeLink)
  • ❌ Advanced validation metrics (fixation stability, drift correction)
  • ❌ Continuous recalibration during long sessions

Configuration Parameters

Calibration Parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| Calibration Points | number | 9 | Number of calibration targets (5, 9, or 13) |
| Point Duration (ms) | number | 1000 | Duration participant must fixate each point |
| Inter Point Interval (ms) | number | 500 | Delay between calibration points |
| Validation Enabled | boolean | true | Run validation phase after calibration |
| Min Accuracy Threshold | number | 100 | Maximum error in pixels to accept calibration |

Positioning Check

| Parameter | Type | Default | Description |
|---|---|---|---|
| Check Camera | boolean | true | Verify webcam is working |
| Check Lighting | boolean | true | Check for adequate face illumination |
| Check Distance | boolean | true | Guide participant to optimal distance (~50-70 cm) |
| Allow Skip Positioning | boolean | false | Let participant skip positioning checks (not recommended) |

Visual Parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| Target Size (px) | number | 20 | Size of calibration target dot |
| Target Color (hex) | string | 'FF0000' | Color of calibration target (red default) |
| Show Gaze Cursor | boolean | true | Display real-time gaze position during validation |
| Background Color (hex) | string | 'E0E0E0' | Background color during calibration |

Recalibration

| Parameter | Type | Default | Description |
|---|---|---|---|
| Allow Recalibration | boolean | true | Offer recalibration if accuracy insufficient |
| Max Recalibration Attempts | number | 3 | Maximum number of recalibration attempts |
| Auto Recalibrate Threshold | number | 150 | Automatically trigger recalibration if error > threshold |
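
The four parameter groups above can be collected into a single configuration object. A minimal sketch using the documented defaults; the key names are illustrative, and the actual task schema may differ:

```javascript
// Hypothetical configuration object combining the parameter tables above.
// Values are the documented defaults; key names are assumptions.
const calibrationConfig = {
  calibration: {
    calibrationPoints: 9,        // 5, 9, or 13 targets
    pointDurationMs: 1000,       // fixation time per target
    interPointIntervalMs: 500,   // delay between targets
    validationEnabled: true,     // run validation phase afterward
    minAccuracyThreshold: 100,   // max error (px) to accept calibration
  },
  positioning: {
    checkCamera: true,
    checkLighting: true,
    checkDistance: true,         // guide participant to ~50-70 cm
    allowSkipPositioning: false, // skipping checks is not recommended
  },
  visual: {
    targetSizePx: 20,
    targetColor: 'FF0000',       // red target dot
    showGazeCursor: true,        // live gaze cursor during validation
    backgroundColor: 'E0E0E0',
  },
  recalibration: {
    allowRecalibration: true,
    maxRecalibrationAttempts: 3,
    autoRecalibrateThreshold: 150, // px; auto-trigger above this error
  },
};
```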

Data Output

Markers and Responses

The task is passive (no participant clicks), so no response objects are recorded. Markers track each calibration point display.

Markers (point_shown):

{
  "type": "point_shown",
  "ts": "2024-01-01T00:00:02.000Z",
  "hr": 2234.56,
  "data": {
    "pos": "top_left",
    "index": 0
  }
}

The pos field contains the grid position key (e.g., "top_left", "middle_middle", "bottom_right"). The index field is the sequential point number in the calibration sequence.
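
For a 9-point (3x3) layout, the grid position key can be derived from the row-major point index. A sketch of that mapping and of building the marker payload shown above; the function names are hypothetical, and the `hr` (high-resolution timer) field is omitted for brevity:

```javascript
// Row and column labels matching keys like "top_left" and "middle_middle".
const ROWS = ['top', 'middle', 'bottom'];
const COLS = ['left', 'middle', 'right'];

// Row-major index (0-8) -> grid position key for a 3x3 layout.
function posKeyForIndex(index) {
  const row = Math.floor(index / 3);
  const col = index % 3;
  return `${ROWS[row]}_${COLS[col]}`;
}

// Build a point_shown marker payload for a given calibration point.
// (The "hr" high-resolution timestamp from the example is omitted here.)
function pointShownMarker(index, now = new Date()) {
  return {
    type: 'point_shown',
    ts: now.toISOString(),
    data: { pos: posKeyForIndex(index), index },
  };
}
```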

Summary Artifact

None. The eye-tracking calibration task does not generate a summary artifact. Calibration data is available in the participation markers for post-processing.

Calibration Procedures

5-Point Calibration (Fast)

Points: Center + 4 corners

    1           2

          5

    3           4

Use Case: Quick calibration for tasks not requiring high precision

Duration: ~10 seconds

Accuracy: Moderate (±100-150 pixels typical)

9-Point Calibration (Standard)

Points: 3x3 grid covering screen

    1     2     3

    4     5     6

    7     8     9

Use Case: Standard for most eye-tracking studies

Duration: ~15 seconds

Accuracy: Good (±75-100 pixels typical)

13-Point Calibration (High Accuracy)

Points: 9-point grid + 4 additional intermediate points between the center and corners (the 3x3 grid already covers the edge midpoints)

    1        2        3

       10        11

    4        5        6

       12        13

    7        8        9

Use Case: High-precision requirements (reading, small targets)

Duration: ~20 seconds

Accuracy: Best (±50-75 pixels typical)
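
The three layouts above can be expressed as normalized (0-1) screen coordinates. A minimal sketch, assuming a 10% inset from the screen edges and placing the four extra 13-point targets between the center and corners; the function name and exact insets are assumptions, not the task's actual implementation:

```javascript
// Normalized target coordinates for 5-, 9-, and 13-point layouts.
// Points are inset 10% from the screen edges (an assumed value).
function calibrationPoints(count) {
  // Base 3x3 grid, row-major: top-left ... bottom-right.
  const nine = [];
  for (const y of [0.1, 0.5, 0.9]) {
    for (const x of [0.1, 0.5, 0.9]) nine.push({ x, y });
  }
  if (count === 9) return nine;
  if (count === 5) {
    // Center + 4 corners.
    return [
      { x: 0.1, y: 0.1 }, { x: 0.9, y: 0.1 },
      { x: 0.5, y: 0.5 },
      { x: 0.1, y: 0.9 }, { x: 0.9, y: 0.9 },
    ];
  }
  if (count === 13) {
    // 9-point grid plus 4 intermediate points between center and corners.
    return nine.concat([
      { x: 0.3, y: 0.3 }, { x: 0.7, y: 0.3 },
      { x: 0.3, y: 0.7 }, { x: 0.7, y: 0.7 },
    ]);
  }
  throw new Error('count must be 5, 9, or 13');
}
```

Multiplying each normalized coordinate by the viewport width and height yields the pixel position of each target dot.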

Participant Experience

  1. Positioning Check:

    • "Please adjust your position so your face is clearly visible in the camera preview."
    • Camera view shown with face detection overlay
    • Distance guidance: "Move closer/farther until your face fills the oval."
    • Lighting check: "Ensure your face is well-lit. Avoid backlighting."
  2. Calibration Instructions:

    • "You will see red dots appear on the screen. Please look directly at each dot and keep your eyes on it until it disappears."
    • "Try to keep your head still during calibration."
    • "Click 'Start Calibration' when ready."
  3. Calibration Procedure:

    • Red dot appears in corner
    • Participant fixates for 1 second
    • Dot moves to next location
    • Repeat for all calibration points (~9 total)
  4. Validation:

    • "Now we'll test the calibration accuracy."
    • Dots appear again; gaze position shown with cursor
    • Accuracy computed: "Calibration accuracy: 78 pixels (Good)"
  5. Recalibration Decision:

    • If accuracy good (< 100px): "Calibration successful! Proceed to task."
    • If accuracy poor (> 100px): "Calibration accuracy is low. Would you like to recalibrate for better results?"
  6. Completion:

    • "Eye-tracking is ready. Please keep your head relatively still during the task."
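
The timing in step 3 follows directly from the Point Duration and Inter Point Interval parameters: each dot is shown for the fixation duration, with a gap before the next one. A rough total-duration estimate under that assumption (function name hypothetical):

```javascript
// Estimated duration of the calibration sequence itself, excluding
// instruction screens and the validation phase.
function estimateCalibrationMs(
  nPoints,
  pointDurationMs = 1000,
  interPointIntervalMs = 500,
) {
  return nPoints * pointDurationMs + (nPoints - 1) * interPointIntervalMs;
}

// estimateCalibrationMs(9) -> 13000 ms with the defaults, consistent with
// the ~15 s quoted for 9-point once instructions add a few seconds.
```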

Design Recommendations

Environment Requirements

  • Lighting: Even, frontal lighting (no backlighting from windows)
  • Distance: 50-70 cm from screen (arm's length)
  • Camera: Built-in laptop webcam or external webcam (720p minimum, 1080p preferred)
  • Stability: Chair and desk stable (no rocking or movement)
  • Glasses: Compatible with glasses, but contacts or no correction preferred

Calibration Design

  • Number of Points: 9-point standard; 13-point for reading or high precision
  • Duration per Point: 1000ms minimum; 1500ms for noisy environments
  • Validation: Always validate; recalibrate if mean error > 100px
  • Instructions: Emphasize head stability and fixation at center of target

Quality Control

  • Accuracy Threshold: Accept if mean error < 100px for standard tasks; < 75px for precision tasks
  • Maximum Attempts: Allow 3 recalibration attempts; if still poor, exclude participant
  • Head Movement: Monitor and warn if excessive head movement detected
  • Glasses/Contacts: Ask participants; note in data (may affect accuracy)

Integration with Tasks

  • Timing: Calibrate immediately before eye-tracking tasks
  • Recalibration: For sessions >30 minutes, recalibrate midway
  • Drift Correction: Consider brief validation/correction between blocks
  • Reminders: Remind participants to keep head still at task start

Common Issues and Solutions

| Issue | Solution |
|---|---|
| Poor accuracy (>150 px error) | Check lighting, distance, glasses; recalibrate; may need to exclude |
| Camera not detected | Check permissions; ensure webcam connected; try different browser |
| Face not detected | Improve lighting; ensure face centered; remove obstructions (hat, hair) |
| Accuracy degrades during task | Participant moving head; recalibrate between blocks; remind about stillness |
| Calibration hangs on one point | Participant not fixating; provide clearer instructions; check for understanding |
| Gaze cursor jumps around | Insufficient samples; increase calibration duration; improve lighting |

Validation Criteria

Acceptable Calibration:

  • Mean error < 100 pixels
  • Max error < 200 pixels
  • Data loss < 10%
  • Precision < 50 pixels SD

Good Calibration:

  • Mean error < 75 pixels
  • Max error < 150 pixels
  • Data loss < 5%
  • Precision < 30 pixels SD

Excellent Calibration:

  • Mean error < 50 pixels
  • Max error < 100 pixels
  • Data loss < 2%
  • Precision < 20 pixels SD
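
The three tiers above can be applied mechanically to a validation result. A minimal sketch, assuming errors and precision are reported in pixels and data loss as a fraction; the function name and tier labels are illustrative:

```javascript
// Classify a calibration run against the acceptable/good/excellent tiers.
// meanError, maxError, precisionSD in pixels; dataLoss as a fraction (0-1).
function calibrationQuality({ meanError, maxError, dataLoss, precisionSD }) {
  const meets = (mean, max, loss, prec) =>
    meanError < mean && maxError < max && dataLoss < loss && precisionSD < prec;
  if (meets(50, 100, 0.02, 20)) return 'excellent';
  if (meets(75, 150, 0.05, 30)) return 'good';
  if (meets(100, 200, 0.10, 50)) return 'acceptable';
  return 'insufficient'; // exclude or recalibrate
}
```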

Use Cases by Research Area

Visual Attention

  • Free Viewing: Where participants look on scenes/images
  • Visual Search: Scan paths during search tasks
  • Change Blindness: What participants fixate before/after change

Reading Research

  • Eye Movements: Saccades, fixations, regressions during reading
  • Word Difficulty: Fixation duration on easy vs. hard words
  • Dyslexia: Atypical eye movement patterns

Usability/UX

  • Web Design: Heat maps of gaze on interfaces
  • Advertising: Attention to ads, banner blindness
  • Navigation: How users explore menus and pages

Clinical Assessment

  • Saccades: Latency, accuracy, velocity
  • Smooth Pursuit: Tracking moving targets
  • Fixation Stability: Nystagmus, tremor

References

  • Duchowski, A. T. (2017). Eye Tracking Methodology: Theory and Practice (3rd ed.). Springer.

  • Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., & Hays, J. (2016). WebGazer: Scalable webcam eye tracking using user interactions. In Proceedings of the 25th International Joint Conference on Artificial Intelligence (pp. 3839-3845).

  • Holmqvist, K., Nyström, M., Andersson, R., Dewhurst, R., Jarodzka, H., & Van de Weijer, J. (2011). Eye Tracking: A Comprehensive Guide to Methods and Measures. Oxford University Press.

  • Semmelmann, K., & Weigelt, S. (2018). Online webcam-based eye tracking in cognitive science: A first look. Behavior Research Methods, 50(2), 451-465.

See Also