Google today showed off Project Astra, one of the prototypes it is developing. Project Astra takes the virtual assistant a step further: rather than operating on text input alone, it also works with video and audio and can identify the user's environment.
In the demo, Google used a smartphone and its camera to interact with the surroundings. The virtual assistant responded promptly and fluidly, much like a conversation with another person.
Project Astra can also remember things and recall them when asked. For example, you can ask it where you left your glasses.
These capabilities point to various future possibilities, such as integration into smart glasses or camera-equipped earphones, giving each user their own "JARVIS".
Google says Project Astra may launch as part of Gemini sometime in 2024.