To coincide with the launch of the Samsung Galaxy S25 range at today’s Galaxy Unpacked event, Google has announced some impressive updates to its Gemini AI platform. Many of the improvements are specific to new devices like the Galaxy S25, but some also work on the older Galaxy S24 and the Pixel 9 phones.
The stand-out feature is Gemini’s new ability to chain actions together. This means you can now do things like connect to Google Maps to search for nearby restaurants, then draft a text in Google Messages to send to people you’d like to invite to lunch, all through Gemini commands.
The chaining ability is being added to all devices that run Gemini, “depending on extensions”: in other words, a developer needs to have written an extension linking a given app to Gemini before it can take part in a chain. Naturally, all the major Google apps already have Gemini extensions, and extensions are also available for the Samsung Reminder, Samsung Calendar, Samsung Notes, and Samsung Clock apps.
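If you’re curious how this kind of chaining works under the hood, Gemini’s public API already exposes a similar mechanism called function calling, where the model decides which of your registered tools to invoke, and in what order. Below is a minimal sketch using the google-generativeai Python SDK; the find_restaurants and draft_message functions are hypothetical stand-ins (the actual Samsung and Google extensions aren’t public), and the model name and API key are placeholders.

```python
# Minimal sketch of multi-step "chained" tool use with Gemini's public
# function-calling API (google-generativeai SDK). find_restaurants and
# draft_message are hypothetical stand-ins for real app extensions.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key

def find_restaurants(near: str) -> list[str]:
    """Return names of restaurants near the given place (toy data)."""
    return ["Luigi's Trattoria", "Green Bowl", "Taco Corner"]

def draft_message(recipient: str, text: str) -> str:
    """Draft (but do not send) a text message to a contact (toy stub)."""
    return f"Draft to {recipient}: {text}"

# The SDK builds tool schemas from the type hints and docstrings above.
model = genai.GenerativeModel(
    "gemini-1.5-flash",  # model name is an assumption; use any available one
    tools=[find_restaurants, draft_message],
)

# With automatic function calling enabled, the model can chain the tools:
# look up restaurants first, then feed the result into a drafted invite.
chat = model.start_chat(enable_automatic_function_calling=True)
reply = chat.send_message(
    "Find a restaurant near King's Cross and draft a lunch invite to Sam."
)
print(reply.text)
```

The key point is that the model, not your code, decides to call find_restaurants first and pass its output into draft_message, which is essentially what action chaining does at the consumer level.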
Gemini Live goes multimodal
Google’s Gemini Live, the part of Gemini that lets you have a natural, human-like conversation with the AI, is also getting some major multimodal upgrades. You will now be able to add images, files, and YouTube videos to the conversation you’re having. For example, you could ask Gemini Live, “Hey, take a look at this picture of my school project and tell me how I could make it better”, upload the picture, and get a response.
The Gemini multimodal improvements are not available across the board, however, and will require a Galaxy S24, S25, or Pixel 9 to work.
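For developers, the public Gemini API offers a rough analogue of these multimodal conversations. The sketch below, again using the google-generativeai Python SDK, uploads a local image and asks the model for feedback, much like the school-project example above; the file name, model name, and API key are all placeholders.

```python
# Rough developer-side analogue of attaching a picture in Gemini Live,
# via the public Gemini API. File name, model, and key are placeholders.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")
model = genai.GenerativeModel("gemini-1.5-flash")

# Upload the image via the File API, then pass it alongside the prompt.
picture = genai.upload_file("school_project.jpg")  # hypothetical local file
response = model.generate_content(
    [picture, "Take a look at this picture of my school project "
              "and tell me how I could make it better."]
)
print(response.text)
```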
Video: Google Pixel 9 with Gemini Live | Now We’re Talking (YouTube)
Project Astra
Finally, Google has announced that Project Astra capabilities will be coming in the next few months, arriving first on Galaxy S25 and Pixel phones. Project Astra is Google’s prototype AI assistant that lets you interact with the world around you, using your phone’s camera to ask questions about what you’re looking at and where you are. So, you can simply point your phone at something and ask Gemini to tell you about it, or ask when the next stop on your bus route is coming up.
Project Astra works on mobile phones, but takes your experience to the next level when combined with Google’s prototype hands-free AI glasses, so you can simply start asking Gemini questions about what you’re looking at, without having to interact with a screen at all.
Video: Project Astra | Exploring the future capabilities of a universal AI assistant (YouTube)
While there’s still no news on a release date for this next generation of Google glasses, they will join Meta’s Ray-Ban smart glasses in the emerging market for AI wearables when they finally become available.