At Google I/O, Google announced a new Perspectives feature for Google Search that will surface not only web pages but also short videos from YouTube Shorts. On the day of the announcement, Google said only that artificial intelligence (AI) was used to match videos to users' searches, without giving a detailed description.
Today DeepMind shared some information about how its AI expertise was used to understand the content of YouTube Shorts. Unlike other YouTube videos, Shorts carry no accompanying information about the uploaded content.
The Flamingo visual language model is used to understand the visual content of YouTube Shorts. Based on an analysis of the frames in a video, metadata is generated, stored, and associated with that video. For example, if a video shows a cat chasing a mouse, the metadata "cat", "mouse", and "chasing" will be stored for it.
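To make this concrete, here is a minimal Python sketch of that indexing step. Everything in it is an assumption for illustration: `describe_frame` is a hypothetical stand-in for a visual language model like Flamingo (whose real interface is not public), and the fixed caption it returns simply mirrors the article's cat-and-mouse example.

```python
from dataclasses import dataclass, field

@dataclass
class Short:
    """A YouTube Short together with metadata derived from its frames."""
    video_id: str
    frames: list                          # decoded video frames
    metadata: set = field(default_factory=set)

def describe_frame(frame) -> str:
    """Hypothetical stand-in for a visual language model such as Flamingo.
    A real model would caption the frame's pixels; we return a fixed
    caption purely for illustration."""
    return "a cat chasing a mouse"

STOPWORDS = {"a", "an", "the"}

def index_short(short: Short) -> None:
    """Caption each frame and keep the salient words as searchable metadata."""
    for frame in short.frames:
        caption = describe_frame(frame)   # e.g. "a cat chasing a mouse"
        words = {w for w in caption.lower().split() if w not in STOPWORDS}
        short.metadata.update(words)      # -> {"cat", "chasing", "mouse"}
```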
This way, when a user searches, YouTube Shorts videos with matching metadata can appear in the search results.
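A hedged sketch of the retrieval side, assuming the metadata is kept as a simple keyword set per video (the article does not describe Google's actual matching or ranking logic): a query matches a Short when its terms overlap the stored keywords.

```python
def search_shorts(query: str, index: dict) -> list:
    """Return IDs of Shorts whose metadata overlaps the query terms.
    `index` maps video_id -> set of metadata keywords; real Search
    ranking is of course far more involved than this overlap count."""
    terms = set(query.lower().split())
    hits = {vid: len(terms & keywords) for vid, keywords in index.items()}
    return sorted((vid for vid, n in hits.items() if n > 0),
                  key=lambda vid: -hits[vid])

# Example: the cat video indexed above matches a related query.
index = {"short-123": {"cat", "chasing", "mouse"},
         "short-456": {"dog", "beach"}}
print(search_shorts("cat chasing a mouse", index))  # ['short-123']
```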