The Ray-Ban Meta smart glasses received three new features this morning, adding a range of artificial intelligence (AI) capabilities. The first is Live AI, which can see and understand what the wearer is looking at: users can ask the glasses about whatever is in view, and the feature works without the "Hey Meta" wake command.
Next, Live Translation translates Spanish, French, and Italian speech into English in real time, with the wearer hearing the English translation as audio through the glasses.
At the same time, a transcript of the conversation and its translation can be read on the screen of the phone the glasses are paired with. Meta notes that the feature is still in testing, so errors may occur, and that it will continue to improve over time.
Finally, the glasses now support Shazam, letting users identify whatever song is playing by saying "Hey Meta, what is this song?" For now, all of these features are available only to users in the United States and Canada.