
The Ray-Ban Meta Smart Glasses have multimodal AI now

1 min read

When the Ray-Ban Meta Smart Glasses launched last fall, they were a pretty neat content capture tool and a surprisingly solid pair of headphones. But they were missing a key feature: multimodal AI — the ability for an AI assistant to process multiple types of input, such as photos, audio, and text. A few weeks after launch, Meta rolled out an early access program; now, for everyone else, the wait is over. Multimodal AI is rolling out to all users.

The timing is uncanny. The Humane AI Pin just launched and bellyflopped with reviewers after a universally poor user experience, casting a bad omen over AI gadgets in general. But having futzed around with the early access AI beta on the Ray-Ban Meta Smart Glasses for the past few months, I think it's premature to write this class of gadget off entirely.

Read more at The Verge


About the author
Andrew Sabastian is a tech whiz obsessed with everything technology. He's a software and tech mastermind who likes to feed readers gritty tech news to keep their techie intellects nourished.