Mark Zuckerberg says we're close to controlling our AR glasses with brain signals
Date:
Wed, 21 Feb 2024 15:15:48 +0000
Description:
Mark Zuckerberg says we're close to being able to use Meta's EMG wristband to control AR smart glasses with our brain signals.
FULL STORY ======================================================================
Move over, eye-tracking and handset controls for VR headsets and AR glasses: according to CEO Mark Zuckerberg, Meta is close to selling a device that can be controlled by your brain signals.
Speaking on the Morning Brew Daily podcast, Zuckerberg was asked to give examples of AI's most impressive use cases. Ever keen to hype up the products Meta makes (he also recently took to Instagram to explain why the Meta Quest 3 is better than the Apple Vision Pro), he started to discuss the Ray-Ban Meta Smart Glasses, which use AI and their camera to answer questions about what you see (though, annoyingly, this is still only available to some lucky users in beta form).
He then went on to discuss "one of the wilder things we're working on": a neural interface in the form of a wristband. Zuckerberg also took a moment to poke fun at Elon Musk's Neuralink, saying he wouldn't want to put a chip in his brain until the tech is mature, unlike the first human subject to be implanted with it.
Meta's EMG wristband can read the nervous-system signals your brain sends to your hands and arms. According to Zuckerberg, this tech would allow you to merely think about how you want to move your hand, and that movement would happen in the virtual space without requiring big real-world motions.
Zuckerberg has shown off Meta's prototype EMG wristband before in a video, though not the headset it works with. What's interesting about his podcast statement is that he goes on to say he feels Meta is close to having a product "in the next few years" that people can buy and use.
Understandably, he gives only a vague release window, and, unfortunately, there's no mention of how much something like this would cost (though we're ready for it to cost as much as one of the best smartwatches). Still, this system could be a major leap forward for privacy, utility, and accessibility in Meta's AR and VR tech.
The next next-gen XR advancement?
Currently, if you want to communicate with the Ray-Ban Meta Smart Glasses via their Look and Ask feature, or to respond to a text message you've been sent without getting your phone out, you have to talk to them. This is fine most of the time, but there might be questions you want to ask or replies you want to send that you'd rather keep private.
The EMG wristband would allow you to type out these messages using subtle hand gestures, so you could maintain a higher level of privacy. Though, as the podcast hosts note, this has issues of its own, not least of which is schools having a harder time trying to stop students from cheating in tests. Gone are the days of sneaking in notes; it's all about secretly bringing AI into your exam.
Then there are utility advantages. While this kind of wristband would also be useful in VR, Zuckerberg has mostly talked about it being used with AR smart glasses. The big success, at least for the Ray-Ban Meta Smart Glasses, is that they're sleek and lightweight; if you glance at them, they're not noticeably different from a regular pair of Ray-Bans.
Adding cameras, sensors, and a chipset for managing hand gestures may affect this slim design. That is, unless you put some of this functionality and processing power into a separate device like the wristband.
Some changes would still need to be made to the specs themselves; chiefly, they'll need to have built-in displays, perhaps like the Xreal Air 2 Pro's screens. But we'll just have to wait and see what the next Meta smart glasses have in store for us.
Lastly, there's accessibility. By their very nature, AR and VR are very physical things: you have to physically move your arms around, make hand gestures, and push buttons, which can make them very inaccessible for folks with disabilities that affect mobility and dexterity.
These kinds of brain-signal sensors start to address this issue. Rather than having to physically act, someone could simply think about doing it, and the virtual interface would interpret those thoughts accordingly.
Based on the demos shown so far, some movement is still required to use Meta's neural interface, so it's far from a perfect solution, but it's the first step to making this tech more accessible, and we're excited to see where it goes next.
======================================================================
Link to news story:
https://www.techradar.com/computing/virtual-reality-augmented-reality/mark-zuckerberg-says-were-close-to-controlling-our-ar-glasses-with-brain-signals
--- Mystic BBS v1.12 A47 (Linux/64)
* Origin: tqwNet Technology News (1337:1/100)