Google showed off quite a few things revolving around AI at Google I/O 2025, and one of those things was the new agentic capabilities it's been working on for its Project Astra technology. Since Project Astra powers Gemini Live, this is likely something we'll see come to Gemini Live at some point down the line. Google's demo during the keynote depicted a user asking Gemini for help. Gemini then kicked in with a response and started doing things for the user.
The neat thing about all this was the agentic capabilities that Project Astra had acquired, which it used to assist the person in the demo. In Google's example, the user is working on a bike and asks for help finding a user's manual for a Huffy mountain bike. The user then asks Gemini to scroll to the page about the brakes.
This is followed by several more requests, including opening YouTube and finding a video on how to fix a stripped screw, as well as finding out what size hex nut is needed for the repair. Gemini is even asked to call the local bike shop to see if they have parts. Gemini does this and then tells the user that the parts they need are in stock.
All of this happens while the user continues to work on the bike throughout the conversation, only stopping a few times for very specific reasons.
New agentic capabilities for Project Astra basically control the phone for you
As you've likely already caught on, the new agentic capabilities shown in the demo allow Gemini to control your phone for you. This includes everything from scrolling pages to dialing a call, effectively turning Gemini into the true universal AI assistant Google wants it to be. Google doesn't mention anything about when this will be available, but it's probably not going to be anytime in the immediate future.
These new capabilities are just a few of the improvements Google is making to Gemini. Later this year, Google plans to add Gemini-powered language translation to Google Meet. It's also using Gemini to power cars and Google TV.