When AI-powered visual search first debuted on the Ray-Ban Meta smart glasses last year, it had some impressive (and disturbing) powers. A new feature in the latest beta, however, looks genuinely useful. In a Threads post, Meta CTO Andrew Bosworth said it can act as a kind of tour guide for travelers, identifying landmarks and offering more information about them.
Bosworth shared a few example images, with captions displayed as text beneath each picture, covering topics like the history of San Francisco’s “Painted Ladies” houses and why the Golden Gate Bridge is orange (the color makes it easier to see in fog).
Mark Zuckerberg also showed off the new feature on Instagram, using a few videos shot in Montana. This time, the glasses respond with audio, describing the history of Big Sky Mountain and the Roosevelt Arch and explaining how snow forms, as a caveman might put it.
Meta unveiled the feature at its Connect event last year as part of the glasses’ new “multimodal” capabilities, which let the AI answer questions about your surroundings. That, in turn, became possible when real-time information rolled out to all of Meta’s smart glasses, powered in part by Bing Search, replacing the earlier 2022 knowledge cutoff.
The feature is part of Meta’s Google Lens-like functionality, which lets users “show” the AI something through the glasses, such as a piece of fruit or foreign text to translate, and ask questions about it. It’s available to anyone enrolled in Meta’s early access program, which currently has a limited number of spots. “For those who still don’t have access to the beta, you can add yourself to the waitlist while we work to make this available to more people,” Bosworth wrote in his post.
Topics #AI #Artificial intelligence #Facebook #Google #landmarks #Mark Zuckerberg #Meta #Meta smart glasses #news #Ray-Ban