Google I/O 2024: New AI Assistant Can Read Everything Through Your Phone’s Camera
Project Astra – The Everyday AI Assistant
The multimodal assistant from Google, built on Gemini, is basically its way of telling OpenAI that it is ready for the battle. So, how does this version of the AI assistant work? Google uses your phone's camera to guide the AI assistant in helping you understand the things around you.
It can even read code displayed on a PC screen and help you determine its purpose or work through a complex section. That's not all: you can point the camera at the street and ask the AI assistant to tell you where you are, and get more localised details if needed.
Google says these Project Astra capabilities will be available in Gemini Live within the main Gemini app later this year. The tech will initially work on Pixel phones, and Google envisions the AI assistant coming to more devices, including smart glasses and even TWS earbuds someday.