At TED2025, which began on April 7th in Vancouver, Google’s Android XR head, Shahram Izadi, showcased a prototype of the company’s extended reality (XR) glasses. He appeared on stage unremarkably dressed in a sweater and glasses, but a closer look revealed that his eyewear was no ordinary accessory.
Google has been steadily expanding into new technology markets, and XR is among them. Considerable buzz has surrounded the company’s XR glasses, which align with its ongoing focus on artificial intelligence and innovation.
Until now, the glasses had existed only as speculation, leaving many eager for a first glimpse. The unveiling at TED2025 finally offered that preview.
During his presentation, Izadi demonstrated the glasses’ impressive capabilities, including real-time translation and scanning text from a book. These features indicate that the glasses rely heavily on Google’s Gemini AI, enhancing user interaction in a manner similar to Meta’s Ray-Ban smart glasses.
However, with Gemini tightly integrated into Google’s Android operating system, the glasses’ functionality is expected to surpass that of competing products. Izadi remarked, “These glasses work with your phone, streaming back and forth,” emphasizing their lightweight design and seamless access to the phone’s applications.
This integration points to a practical, versatile device, potentially setting the stage for a significant leap in XR technology. Despite the exciting demonstration, detailed information about the glasses remains scarce.
Key aspects such as pricing, full capabilities, and availability are still under wraps, leaving enthusiasts and potential users eagerly awaiting more updates. As Google continues to innovate, the industry is keen to see how these XR glasses will evolve.