Apple rolled out its latest iOS update yesterday, including what I've been waiting to play with: Apple Visual Intelligence. It seems, at least for now, that Apple Visual Intelligence works with ChatGPT and Google Search, but I was not able to trigger it for Apple Maps the way the initial demo showed.
Apple wrote:
A new visual intelligence experience builds on Apple Intelligence and helps users learn about objects and places instantly, thanks to the new Camera Control on the iPhone 16 lineup. Visual intelligence can summarize and copy text, translate text between languages, detect phone numbers or email addresses with the option to add to contacts, and more. Camera Control also allows users to search Google so they can see where they can buy an item, or benefit from ChatGPT’s problem-solving skills to ask for an explanation about a complex diagram, such as from class notes. Users are in control of when third-party tools are used and what information is shared.
So sure, it was able to help me find where to buy a similar football with Google:
I was able to get it to do some of my homework with ChatGPT:
It could even describe the toys I have on my shelf with ChatGPT:
But when I tried local searches on restaurants I was looking at, like we saw in the demo, it would not work.
Here is the original demo that starts at the 55:05 mark:
ChatGPT is super integrated; just watch this demo from Sam Altman and the OpenAI team:
So this has a ton of potential, if integrated into more AI solutions.
ChatGPT in Apple Intelligence: Here's the 10-minute video from today's announcement. Pretty cool integration. Again, Google must be freaking out a bit over this... https://t.co/Y5r9Se9K5U pic.twitter.com/OYXzVzwi83
— Glenn Gabe (@glenngabe) December 11, 2024
I am trying to trigger Apple Maps with it but that doesn't seem to work for me.
— Barry Schwartz (@rustybrick) December 11, 2024
Forum discussion at X.