https://www.theverge.com/2024/3/28/24114454/meta-ai-ray-ban-smart-glasses-launch

Starting next month, Meta's Ray-Ban smart glasses will support multimodal AI features to perform translation, along with object, animal, and monument identification.
Users can activate the glasses' smart assistant by saying "Hey Meta," followed by a prompt or question. The assistant responds through the speakers built into the frames, which are usually audible only to the wearer.
Although Meta's AI was able to correctly identify pets (cats and dogs) and artwork, it didn't get things right 100 percent of the time. The glasses struggled to identify zoo animals that were far away and behind cages.
It also tended to label most animals as dogs, and it failed to properly identify an exotic fruit called a cherimoya after multiple tries. As for AI translations, the NYT found that the glasses support English, Spanish, Italian, French, and German.