

Evaluating Visual Distortions in an AR Prototype
During my internship at Meta Reality Labs, I led a perceptual study to measure user sensitivity to optical distortions in an AR varifocal system, generating insights that informed prototype design and product strategy.
Disclaimer: Some project details have been modified or omitted due to confidentiality.
Project Overview
Client: Meta Reality Labs (Internship)
My Role: Research Intern (Behavioural, Quantitative Perceptual Studies)
Timeline: December 2019 to March 2020 (4 months)
Skills & Tools: Experimental design, eye-tracking integration, perceptual studies, Python, MATLAB, optical measurement, lab usability testing, product prototyping, quantitative data modelling (mixed effects and predictive modelling) & visualization in R
Summary: As part of my internship at Meta Reality Labs, I evaluated whether users could perceive spatial distortions in a novel augmented reality varifocal prototype, providing research insights that informed optical design and product development strategy.
Problem
AR headset prototypes rely on advanced optics to deliver realistic depth cues. However, these optics can introduce subtle distortions that disrupt comfort, usability, and immersion. To support the development of a new varifocal system, the product team required evidence on whether such distortions were perceptible to users in realistic scenarios.
Research Goals
- Measure spatial distortions introduced by the prototype optics.
- Assess whether users could perceive these distortions during natural eye movements.
Methods
I collaborated with engineers, optical scientists, and product managers to design and execute the study:
- Optical Measurement:
  - Set up a laser system on an optical table to quantify how the device's optics distort light.
  - Modeled the geometric distortions in MATLAB for comparison with the human perception results.
- Perceptual Experiment:
  - Developed a Python-based experiment integrating eye-tracking hardware.
  - Tracked gaze patterns while users performed controlled eye movements.
  - Participants reported whether they perceived expansion or compression of visual elements.
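To give a flavor of the geometric modeling step: a simple way to describe lens distortion is a one-parameter radial (Brown-style) model applied to a grid of points, which yields a displacement map showing how far each point shifts across the field of view. This is an illustrative sketch only, not the confidential model used in the study; the parameter value is made up.

```python
import numpy as np

def radial_distort(xy, k1):
    """Apply a one-parameter radial (Brown) distortion to
    normalized image coordinates: x' = x * (1 + k1 * r^2)."""
    r2 = np.sum(xy**2, axis=1, keepdims=True)
    return xy * (1.0 + k1 * r2)

# Regular grid of points spanning the field of view (normalized coordinates).
grid = np.stack(np.meshgrid(np.linspace(-1, 1, 21),
                            np.linspace(-1, 1, 21)), axis=-1).reshape(-1, 2)

k1 = 0.02  # hypothetical coefficient; positive k1 gives pincushion-like expansion
warped = radial_distort(grid, k1)

# Displacement map: how far each grid point moves under the distortion.
displacement = np.linalg.norm(warped - grid, axis=1)
print(f"max displacement: {displacement.max():.4f} (normalized units)")
```

Comparing a map like this against laser measurements makes it possible to ask the perceptual question precisely: at what displacement magnitude do users start to notice?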
Key Findings
- Users were less likely to perceive distortions under specific optical conditions.
- These results suggested that the prototype could allow some flexibility in rendering virtual content without noticeably degrading the user experience.
- While COVID-19 cut full data collection short, initial testing yielded promising insights that I published in my final report.
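Detection results of this kind are typically summarized with a psychometric function relating distortion magnitude to the probability of noticing it. Below is a minimal, self-contained sketch using simulated responses (not the study's data), assuming a logistic detection model and a simple maximum-likelihood grid fit; all numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def psychometric(x, threshold, slope):
    """Probability of reporting a distortion of magnitude x (logistic model)."""
    return 1.0 / (1.0 + np.exp(-slope * (x - threshold)))

# Simulated observer: hypothetical detection threshold at 2% distortion.
TRUE_THRESHOLD, TRUE_SLOPE = 2.0, 3.0
levels = np.repeat(np.linspace(0.0, 4.0, 9), 40)  # distortion magnitudes (%), 40 trials each
responses = rng.random(levels.size) < psychometric(levels, TRUE_THRESHOLD, TRUE_SLOPE)

# Maximum-likelihood fit over a coarse grid of candidate parameters.
best = None
for t in np.linspace(0.5, 3.5, 61):
    for s in np.linspace(0.5, 6.0, 56):
        p = np.clip(psychometric(levels, t, s), 1e-9, 1 - 1e-9)
        ll = np.sum(np.where(responses, np.log(p), np.log(1 - p)))
        if best is None or ll > best[0]:
            best = (ll, t, s)

_, est_threshold, est_slope = best
print(f"estimated detection threshold: about {est_threshold:.2f}% distortion")
```

A fitted threshold like this is what lets engineers translate "less likely to perceive" into a concrete tolerance band for the optics and rendering pipeline.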
Impact
- Backed product strategy pivots with user research evidence.
- Enabled optical engineers to refine the varifocal prototype design based on perceptual tolerances.
- Demonstrated the feasibility of balancing optical design tradeoffs with user comfort in AR.
What I learned
- Gained hands-on experience integrating eye-tracking hardware with Python to run perceptual studies.
- Learned to work closely with engineers and optical scientists in a fast-paced product development environment.
- Navigated tight timelines and unexpected challenges during the COVID-19 pandemic while still delivering actionable research insights.