The augmented reality (AR) glasses will show subtitles and translations in real time.
During Google’s annual I/O developer conference, CEO Sundar Pichai teased an upcoming product: smart glasses. The glasses can translate nearby speech and display the translation, in the wearer’s own language, directly on the lenses.
Built-in microphones listen to nearby audio, which is then transcribed into text. The text is translated and displayed on the lens of the glasses.
While you can achieve a similar result using the Google Translate app on your phone, the glasses promise a less intrusive experience.
Google didn’t reveal much else about the AR glasses. Instead, the teaser focused primarily on the glasses’ ability to help people understand different languages more naturally.
Tech companies have been heavily investing in AR technology for many years, though few have demonstrated practical benefits.
Unsurprisingly, it turns out that people don’t want to wear clunky tech glasses unless there’s a clear benefit to doing so. Live translations are a refreshingly useful implementation for an industry ripe for practicality.
This isn’t Google’s first rodeo, and it’s good to see the company learning from previous mistakes.
In 2014, the search-engine-turned-hardware company released Google Glass, its first iteration of AR glasses. The product was ahead of its time, and the underlying technology couldn’t keep up. It also didn’t offer any clear benefits that convinced people to wear the $1,499 glasses daily.
Google Glass then transitioned into a business product, enabling workers to video call and access specialised software hands-free. This is a powerful use case in some industries, though by no means a killer feature for a mainstream product.
It seems that Google hasn’t given up on building consumer AR glasses just yet. It’s not clear if the unnamed translation-enabled glasses will come with any other features such as AR maps and video calls.
In other words, live translations may be one of many features in the new Google smart glasses, or it may be the only feature in a product built for a niche audience.
Google is by no means the only company investing in AR glasses. Apple is rumoured to be working on its own AR glasses too.
When asked if Apple is working on AR products, CEO Tim Cook replied that “it’s something we’re doing a lot of things on behind that curtain”.
Snapchat’s parent company, Snap, has already released a few iterations of its Spectacles smart glasses. The first three generations of Spectacles could record video, while the latest fourth-generation model also lets wearers view AR effects in real time.
Ray-Ban has also partnered with Meta (formerly Facebook) to launch Ray-Ban Stories, a pair of smart glasses that can record videos and post them to Meta apps such as Facebook and Instagram.
While most tech glasses on the market don’t currently include AR functionality, they are often stepping stones towards a more ambitious future: the metaverse.
Smart glasses will serve as key products that bridge real life with digital assets.
Experts believe that the metaverse will be accessible through multiple mediums, mainly virtual and augmented reality devices. Of the two, AR offers a less isolating experience because it blends the physical and digital worlds, whereas VR cuts the user off from real life.
As a result, AR glasses are seen as crucial entry points into the metaverse. Tech companies eager to participate in the metaverse are investing heavily in such technology to gain a lead in this attractive new market.
So while Google’s new glasses may initially only offer translations, they may eventually serve as a critical piece in its metaverse puzzle.
As CEO Sundar Pichai put it, “computing over time will adapt to people [rather] than people adapting to computers. You won’t always interact with computing in a black rectangle in front of you. So, just like you speak to people, you see, and interact, computers will become more immersive. They’ll be there when you need [them] to be.”