Can Apple beat Meta's smart glasses by adding cameras and AI to AirPods Pro?
Date:
Tue, 01 Oct 2024 01:00:51 +0000
Description:
Apple focuses on AirPods with cameras and AI.
FULL STORY ======================================================================
After Meta made AI wearables a centerpiece of its announcements at Meta Connect 2024, the question arises of how its rivals will respond. In particular, there are a lot of rumors flying around about what Apple has
planned in response to the upgraded Ray-Ban Meta smart glasses and the upcoming Orion smart glasses, which employ augmented reality along with AI.
One major long-running rumor is that Apple plans to incorporate the hardware and AI software for a wearable not into glasses, but into its next-generation AirPods Pro. That might include cameras and AI features to match what you see in Meta's smart glasses. This potential competition sets the stage for a battle of wearables, with both companies seeking to redefine how users interact with the digital and physical worlds.
Meta boasted that the upgraded Ray-Ban Meta smart glasses can take photos, live stream video, and otherwise provide hands-free access to the world, with both audio and visual input controlled by voice commands. With the Meta
AI assistant integrated into the device, the smart glasses can handle
requests conversationally. However, they are only a shadow of what
the Orion smart glasses previewed at the event could do. Orion will employ augmented reality to meld digital content with the physical world through a holographic display.
Apple's approach with the speculative AirPods Pro is more about leveraging AI for contextual awareness, using infrared cameras to interpret the space
around you. The earbuds wouldn't take photos or video from the perspective of your ears. Rather, they would use visual input to subtly improve navigation and fitness tracking and to respond more reliably to gesture controls. They would also likely augment Apple's Vision Pro headset, delivering even more accurate spatial audio by tracking head movements and adjusting the sound based on the
user's surroundings.
The best way to think about the difference between Meta's smart glasses
and Apple's AI-equipped AirPods is how each connects the AI to the hardware. They might divide prospective users based on what they want from their wearables. Meta's Ray-Ban glasses are geared toward capturing and sharing experiences visually, while Apple's AirPods seem to be more about enhancing the AI assistant with more passive input than is currently available. What they share is an interest in immersive experiences and in making AI an ever-present aspect of wearable technology.
While Orion may link the physical and digital worlds through augmented reality, Apple's AirPods offer a lighter touch, enhancing environmental awareness and boosting the AI assistant's ability to help you. However, the integration with the Apple headset gives more weight to the AirPods and their capacity to deliver AI experiences. That's especially true if they can sync
up with Apple's extensive ecosystem of devices.
======================================================================
Link to news story:
https://www.techradar.com/computing/artificial-intelligence/can-apple-beat-metas-smart-glasses-by-adding-cameras-and-ai-to-airpods-pro
--- Mystic BBS v1.12 A47 (Linux/64)
* Origin: tqwNet Technology News (1337:1/100)