Ray-Ban and Meta's jointly developed smart glasses are getting a new update. With it, the social media giant's artificial intelligence will be able to answer questions using real-time information. Meta will also begin testing "multimodal" capabilities, which let the AI make recommendations based on the wearer's surroundings.
Meta's artificial intelligence will now be able to answer questions about current events
Meta AI's knowledge had been limited to information available up to December 2022, so it could not help users with topics such as current events, match results, road conditions, or the weather. According to Meta CTO Andrew Bosworth, this will change soon: smart glasses users in the US will gain access to real-time information, powered in part by Bing.
Meta is also working on multimodal AI. Thanks to this feature, first shown at the Connect event, Meta's assistant will be able to answer contextual questions about the user's surroundings. Asked "What is this fruit?", for example, it will be able to identify the fruit the wearer is pointing at.
For now, the update will not be available to everyone; those who opt in to try it will gain access to the new features in 2024. It remains to be seen whether Meta, which has so far struggled to find the success it wants in this field, will fare better this time.