Voice-activated AI glasses use your own SKIN to tell you what you’re seeing
TWO college students have designed game-changing AI-activated glasses – powered by your body.
After watching their grandparents suffer from severe visual impairments, final-year Stanford students Daniel Kim and Arjun Oberoi devised an ingenious idea for the Red Bull Basement global innovation competition.
While smart glasses are already on the market, Kim and Oberoi’s Argus prototype is on a different level thanks to revolutionary Wi-R tech, which transmits data using a person’s skin.
For users with a condition like macular degeneration, for example, a quick question about what is in front of them or which medicine they are holding will receive a prompt, detailed response from the built-in AI via a small speaker on the frame.
The two computer whizkids won the US national title at MIT in Boston.
They will compete in the world championships in Tokyo early next month, joining fellow innovators from all over the globe.
The winners will spend a few weeks being mentored in Silicon Valley and having their creations fast-tracked.
The Stanford duo fought off fierce competition from similarly bright-minded students, who unleashed their concepts in front of an expert panel that included MIT alum and Netflix science star Emily Calandrelli.
Argus, however, was a class apart.
Rather than connecting to a cellular or Wi-Fi network, the glasses use Wi-R.
The tech has been around for a decade, but computer gurus at Purdue University have recently made huge advances, and development kits are now, according to Oberoi, “out in the wild.”
The human body is well-equipped to carry electrical signals. With Wi-R, those signals can be confined to the body and its immediate surroundings rather than broadcast into the air.
Kim explained to The U.S. Sun that the devices can emit electrical signals “using your skin as a transmitter.”
The two-part visual aid comprises a camera module on the glasses and a computer module that can be carried in a pocket.
The two work together and are activated by voice commands.
The system can then answer questions about the surrounding environment, read text, and even recognize faces.
The students claim that the combination of Wi-R and edge AI processors will make their device “10 times less expensive” than rivals, as well as lighter, with all-day battery life.
“Whenever you have something an inch away from the skin, you can get a signal transmitted from our computer module to the camera module, which allows us to be 100 times more efficient, energy efficient than Wi-Fi,” Kim said.
“So you’re able to keep the battery on the camera module super lightweight, which is really important for making sure that it’s ergonomic and not causing any discomfort.”
Each of the 10 teams—whittled down from over 15,000 applications—spoke for 10 minutes at MIT, outlining their vision and business possibilities for their creations.
Princeton computer science students Foyez Alauddin and Brian Shi created an all-encompassing program to help immigrant families complete essential documents in their native languages.
Meanwhile, Mirna Jaber and Izabella Herrera-Nunez from the Georgia Institute of Technology offered an enticing AI-powered real-time meal planner to help combat the global food waste problem.
The high-tech Argus glasses, however, were the deserving winners, with the finals in Japan up next.
Kim and Oberoi’s plan has been two years in the making, and now they aim to conquer the world.
The global champs will receive an all-expense-paid trip to Los Angeles and Silicon Valley for a three-week immersive experience in partnership with Plug and Play VC.
They will network with top venture capitalists to help them turn their product into a reality.
“Both of us have personal people who, like, this would help them in their everyday life,” said Oberoi.
“Winning gives us many more opportunities to connect with people at the forefront of the industries whose problems we’re also trying to solve. It’s amazing.”