Translating the world around using AI & AR
CLIENT
Meta
ROLE
Product Designer
YEAR
2022
Overview
Designed intuitive AR interface for translating text and speech in real-time using AI, enabling seamless multilingual interactions via Meta’s smart glasses.
Impact
Working on the Live Translations project gave me deep insight into the complexities of designing real-time immersive experiences in an entirely new domain. It offered a fresh perspective on balancing technological possibilities with user needs, ensuring the design enhances rather than disrupts the user's natural environment. It also demanded a more iterative approach to prototyping and testing, even in a highly confidential and constrained setting, to refine the user experience.
Moreover, integrating AI and ML (computer vision, OCR, ASR, etc.) into emerging consumer products, particularly in creating seamless, intuitive interactions, was nerve-wracking yet exciting. These learnings gave me much-needed confidence and will shape how I approach future projects in this space.
The Problem
Staying focused while conversing in a foreign language
It's often a struggle to stay engaged with the world around us when relying on our phones—let alone for translations. Reaching for a phone pulls us out of the moment and disrupts real-time interactions. The challenge was to design a seamless, hands-free translation experience through AR smart glasses, allowing users to translate text and speech directly in their line of sight without distraction.