What if text could adapt to you—your gaze, your movement, your surroundings—in real time? Typography has always been about clarity and communication, but in the immersive worlds of Augmented and Virtual Reality (AR/VR), it must evolve into something more dynamic, responsive, and alive. This presentation explores how adaptable typography can redefine our interaction with text in AR/VR environments.
The future of typography in AR/VR lies in its ability to adapt to users and contexts through the integration of sensors, AI, and variable fonts. Drawing on my research and my experience designing for AR/VR headsets at Meta, I will share concepts for how technologies such as LIDAR, eye tracking, gesture recognition, and ambient light sensors can drive real-time typographic adjustments. By linking these inputs to variable font axes, typography becomes fluid and context-aware, responding dynamically to space, motion, and user behaviour. This approach redefines how we design and consume text, paving the way for more personalised, accessible, and engaging reading experiences.
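As one concrete illustration of this sensor-to-type pipeline, here is a minimal sketch, assuming a browser context with the experimental AmbientLightSensor API (part of the Generic Sensor specification) and a variable font that exposes the standard wght axis; the lux thresholds, weight range, and the adaptWeight helper are hypothetical choices for this example, not part of any shipping AR/VR toolkit.

```typescript
// The Generic Sensor API is not yet in TypeScript's default DOM typings,
// so we declare the minimal surface this sketch relies on.
declare class AmbientLightSensor extends EventTarget {
  constructor(options?: { frequency?: number });
  readonly illuminance: number | null;
  start(): void;
}

const MIN_LUX = 10;    // roughly a dim room
const MAX_LUX = 1000;  // roughly indirect daylight

// Map ambient illuminance onto a font-weight value: heavier strokes in
// bright surroundings, lighter ones in dim conditions.
function luxToWeight(lux: number): number {
  const t = Math.min(Math.max((lux - MIN_LUX) / (MAX_LUX - MIN_LUX), 0), 1);
  return Math.round(400 + t * 300); // interpolate across wght 400-700
}

// Hypothetical helper: wire a sensor stream to an element's variable-font axis.
function adaptWeight(target: HTMLElement): void {
  const sensor = new AmbientLightSensor({ frequency: 2 }); // two readings per second
  sensor.addEventListener('reading', () => {
    const weight = luxToWeight(sensor.illuminance ?? MIN_LUX);
    target.style.fontVariationSettings = `'wght' ${weight}`;
  });
  sensor.start();
}

adaptWeight(document.body);
```

The same pattern generalises to other inputs: gaze depth from eye tracking or room geometry from LIDAR could be normalised and mapped onto axes such as optical size or width in the same way.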
The session will explore applications such as typography that adapts to user focus, environmental conditions, and movement, breaking free from static layouts. We’ll also examine new paradigms for text consumption in AR/VR, from navigating spatial text environments to interacting with intelligent, context-aware content.
Finally, I will connect these innovations with emerging AI capabilities, showcasing how typography can deliver inclusive, tailored experiences for diverse users in immersive spaces.
The talk will conclude with a vision for the future of reading in spatial computing, where typography becomes an intelligent, adaptive system that enhances how we interact with text in three-dimensional space. Attendees will leave with a sense of typography that isn’t just read but experienced.
Niteesh Yadav