A new pair of smart eyeglasses is designed to let deaf or hard-of-hearing people see subtitles during conversations. The company XRAI Glass has released software for augmented reality (AR) smart glasses, which connect to the wearer's phone, where the companion app runs.
According to MailOnline, the technology launched in nine languages worldwide with additional features, such as a personal assistant, on-demand conversation replay, and real-time translation, so anyone can use it in daily life.
What It Can Do
The smart eyeglasses weigh less than three ounces, have darkened lenses, connect to the phone, and project images onto a tiny screen-like display. Users are promised an experience comparable to watching a 205-inch IMAX cinema screen.
Wearers can use the glasses daily and still clearly see the person they are talking to while subtitles appear in real time. Each speaker's subtitles are shown in a different color, making it easier to distinguish who is talking.
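The per-speaker color behavior described above can be sketched in a few lines. This is a minimal illustration, not XRAI's actual implementation; the speaker IDs and the color palette are assumptions.

```python
# Hypothetical sketch: give each detected speaker a stable subtitle color.
COLORS = ["yellow", "cyan", "magenta", "green"]

class SubtitleColorizer:
    def __init__(self):
        self.assigned = {}  # maps speaker_id -> color

    def color_for(self, speaker_id):
        # Reuse the speaker's color if seen before; otherwise take the next free one.
        if speaker_id not in self.assigned:
            self.assigned[speaker_id] = COLORS[len(self.assigned) % len(COLORS)]
        return self.assigned[speaker_id]

colorizer = SubtitleColorizer()
print(colorizer.color_for("speaker-1"))  # yellow
print(colorizer.color_for("speaker-2"))  # cyan
print(colorizer.color_for("speaker-1"))  # yellow again: stable per speaker
```

Keeping the mapping stable across the conversation is what lets a reader associate a color with a person rather than with a single utterance.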
XRAI's software runs in conjunction with Nreal AR smart glasses and is designed for a range of uses. Business Wire reports that these include a personal assistant that lets wearers ask questions, such as about the weather, with the answer instantly popping up in subtitles.
Moreover, it has a feature for replaying conversations from a previous day. Simply say "Hey, XRAI" to command the software to recall past conversations.
Lastly, the software can now transcribe and subtitle conversations in some of the world's most spoken languages: English, Mandarin, Japanese, Korean, Spanish, French, Portuguese, German, and Italian. More languages will be added as the company improves the software.
READ ALSO: $3,840 Smart Shoe Helps Blind, Visually Impaired People With Its Ultrasonic Sensors That Avoid Obstacles
How It Works
The smart eyeglasses capture audio using Bluetooth microphones and send it to the phone, which processes it into subtitles that appear on the glasses as augmented reality. As Metro reported, the audio is only processed, never stored.
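The capture-to-display flow described above can be sketched as a simple three-stage pipeline. This is a rough illustration under stated assumptions: the function names and the `Subtitle` type are hypothetical stand-ins, and `transcribe` here is a stub rather than a real speech recognizer.

```python
# Hypothetical sketch of the glasses -> phone -> AR display pipeline.
from dataclasses import dataclass

@dataclass
class Subtitle:
    speaker: str
    text: str

def capture_audio() -> bytes:
    # Stand-in for the Bluetooth microphone feed from the glasses.
    return b"raw-audio-bytes"

def transcribe(audio: bytes) -> Subtitle:
    # Stand-in for on-phone speech recognition.
    # The audio is processed in memory and never written to storage.
    return Subtitle(speaker="speaker-1", text="Hello there")

def render_on_glasses(sub: Subtitle) -> str:
    # The phone sends the finished subtitle back to the AR display.
    return f"[{sub.speaker}] {sub.text}"

frame = render_on_glasses(transcribe(capture_audio()))
print(frame)  # [speaker-1] Hello there
```

Doing the heavy processing on the phone rather than on the glasses is consistent with the article's description of lightweight hardware paired with a companion app.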
The latency between a word being spoken and that word appearing on the screen is less than one second, so subtitles display in real time. The Siri-like feature, where users say "Hey, XRAI," uses deep learning built on GPT-3, a large language model for understanding natural-language questions.
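The wake-word behavior can be sketched as a simple routing step: only utterances beginning with the trigger phrase are handed to the assistant, while everything else is treated as conversation to subtitle. The phrase matching and routing labels here are illustrative assumptions, not XRAI's actual logic.

```python
# Hypothetical wake-word gate: forward "Hey, XRAI" queries to the assistant,
# and treat all other speech as conversation to be subtitled.
WAKE_WORD = "hey, xrai"

def route(utterance: str) -> str:
    normalized = utterance.lower().strip()
    if normalized.startswith(WAKE_WORD):
        # Strip the wake word (and trailing punctuation) to leave the query.
        query = normalized[len(WAKE_WORD):].strip(" ,")
        return f"assistant:{query}"
    return "subtitle"

print(route("Hey, XRAI, what's the weather?"))  # assistant:what's the weather?
print(route("Nice to meet you"))                # subtitle
```

A gate like this keeps ordinary conversation out of the assistant, so the language model only sees text the wearer deliberately addressed to it.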
As with all such technologies, privacy is a major concern. The makers said they are leaning into the decentralized architecture of Web 3.0, putting data back into the hands of the user; it will never be stored in the cloud.
Moreover, XRAI Glass could someday be part of a blended-reality technology that shows users digital elements in the real world, meaning it could also find a place in the metaverse, as the company aims to reach 100,000 users globally by the end of 2024.
RELATED ARTICLE: University of Utah's 'Smart Glasses' Auto Focuses What You're Looking At
Check out more news and information on Tech & Innovation in Science Times.