This article contains affiliate links. As an Amazon Associate, Next Level Mac earns from qualifying purchases.
At its WWDC event in the summer of 2024, Apple made a lot of promises about Apple Intelligence. Time has since revealed that many of the demos shown at that now-infamous WWDC were either noninteractive or outright nonexistent. It was so bad, in fact, that some of the Apple team members who worked on Apple Intelligence didn’t even recognize features shown at that ill-fated event. Ouch.
Fast-forward to the fall of 2025. Is Apple Intelligence still little more than a mockup of non-working features and broken promises, or has Apple evolved it into something worthy of the “Built for Apple Intelligence” slogan slapped onto every Mac, iPhone, iPad, Apple Watch, and Vision Pro? Once the wheat is sorted from the chaff, the truth is not just revealing—it’s transformative.
More Than Just a 26 on the Box
I love the reference in TRON: Legacy where an ENCOM executive jokes to Alan Bradley that “This year, we put a 12 on the box.” It’s a tongue-in-cheek admission that nothing in ENCOM’s OS had changed that year other than the number on the box.
Apple could easily have fallen into that trap when the OS 26 versions were released in September 2025. Apple Intelligence could have still been in the oven, as it had been for well over a year. Apple took much-deserved criticism for the first rollout of Apple Intelligence, both for the initial features and for the lack of them.
Fortunately, Apple listened to the criticisms and made internal moves to correct many of Apple Intelligence’s issues. Now, in late 2025, it’s—better. It’s not perfect, and there’s clearly room to grow. But when you take stock of what Apple Intelligence actually is, rather than what it was never trying to be, it starts to make sense.
And, as the old saying goes, don’t throw the baby out with the bathwater here. Apple Intelligence in late 2025 has evolved, and it has some new features across devices that really add value to Apple’s line of products that support it.
What’s New?
Across its product lines, Apple Intelligence is supported on Mac, iPhone, iPad, Apple Watch, and Vision Pro. Notably absent, at least at the moment, are the HomePods and Apple TV, although both are overdue for refreshes and might see Apple Intelligence added soon.
Here’s the list of the major features added to supported devices:
iOS 26 (iPhone)
• Visual intelligence: understands what’s on your screen so you can search, ask questions, or take actions based on that content.
• Live Translation in Messages, FaceTime, and Phone; spoken translations for calls.
• Live Translation with AirPods (hear the other person’s speech translated in real time).
• Shortcuts intelligent actions (summarize text, create images, or call Apple Intelligence models in a workflow).
• Image Playground & Genmoji enhancements (more control and additional ChatGPT styles).
• Reminders with AI suggestions and auto-categorization.
iPadOS 26
• Live Translation in Messages, FaceTime, and Phone.
• Shortcuts intelligent actions (summaries, image creation, model responses).
• Image Playground & Genmoji enhancements (including new ChatGPT styles).
• Reminders with AI suggestions and auto-categorization.
macOS 26 “Tahoe” (Mac)
• Live Translation in Messages, FaceTime, and Phone.
• Shortcuts intelligent actions (including the new Use Model action).
• Image Playground & Genmoji enhancements (including ChatGPT styles).
• Reminders with AI suggestions and auto-categorization.
watchOS 26 (Apple Watch)
• Workout Buddy: AI-generated, personalized pep talks using a text-to-speech model and your workout data.
• Live Translation in Messages on Apple Watch (when paired to an Apple-Intelligence-enabled iPhone).
• Smart actions in Messages (contextual suggestions like sharing location).
visionOS 26 (Apple Vision Pro)
• Apple Intelligence with expanded language support plus Image Playground updates (more control; Genmoji).
• Additional AI features available on visionOS 26 include Writing Tools, Smart Reply/Summaries, Notification/Notes transcription summaries, natural-language search and memory movies in Photos, and more.
There’s a lot to unpack here, so I’ll start with the Apple Intelligence features that are common across supported devices.
Live Translation: The Babel Fish Brought to Life
OK, so the Babel Fish is a fictional species from the book and movie The Hitchhiker’s Guide to the Galaxy. One could shove it into one’s ear, and every language from every species in the universe could immediately be understood. The stuff of science fiction, right? Not anymore.
Live Translation is a powerful tool, integrated into Apple Intelligence, that does exactly that. When you hear a language you don’t understand, Apple Intelligence can translate it and speak it back to you in your own language. It’s literally the Babel Fish, but without having to shove an alien fish into your ear to use it. Whew!
Currently, Live Translation supports English, French, German, Portuguese, and Spanish. That’s just the beginning, as the number of supported languages is expected to grow quickly.
But how good is it? Well, I happen to speak both English and Spanish, so my wife and I were able to give it a try. She put in our new AirPods Pro 3, connected to my iPhone with iOS 26 installed. Then I spoke to her in Spanish. It worked exactly as expected. She was able to hear the Spanish sentences I spoke to her translated into English in the AirPods Pro 3.
It was so good, in fact, that she went out and bought a pair for her workplace as well. She deals with customers on occasion who only speak Spanish, and she does not. Being able to have those conversations with them in real time has empowered her to connect with them immediately and earn their trust—and their business. Live Translation is much more than a novelty. It empowers people to communicate and connect with each other in ways that were never possible just a short time ago.
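If you’re curious how this looks from the developer side, Apple exposes the same on-device translation engine through its Translation framework. Here’s a minimal SwiftUI sketch using the framework’s simple text API (the view name and phrases are mine; the live audio pipeline used for calls isn’t part of this basic API):

```swift
import SwiftUI
import Translation

// Minimal sketch: translate one Spanish phrase to English on-device
// using the Translation framework (iOS 18+ / macOS 15+).
struct TranslateDemo: View {
    // Request a Spanish -> English session; on first run the system
    // may prompt the user to download the language models.
    @State private var configuration: TranslationSession.Configuration? =
        .init(source: Locale.Language(identifier: "es"),
              target: Locale.Language(identifier: "en"))
    @State private var translated = "…"

    var body: some View {
        Text(translated)
            .translationTask(configuration) { session in
                do {
                    let response = try await session.translate("¿Cómo estás?")
                    translated = response.targetText // "How are you?"
                } catch {
                    translated = "Translation failed: \(error.localizedDescription)"
                }
            }
    }
}
```

The translation itself happens with downloaded on-device models, which is the same privacy story as the consumer feature.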
Shortcuts Make Complex Tasks Simple
I have this saying that “what gets made easy gets done.” That’s where Apple Intelligence features like Shortcuts come in. They take complex workflows and turn them into simplified, powerful outcomes.
Among the many great features of Shortcuts are:
• Automate multi-step tasks across apps by chaining “actions” into a workflow you can run with one tap or a voice command.
• Run shortcuts from Siri, the Shortcuts app, widgets, the Share Sheet, Control Center, the Action button, and more.
• Trigger event-based automations (time, location, app opens, messages, device settings, and more).
• Draw on a huge action library pulled from system features and third-party apps (see the App Intents sketch after this list).
• Keep your shortcuts in step with iCloud sync across iPhone, iPad, and Mac (they show up on Apple Watch, too).
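That last point deserves a quick illustration. Third-party apps plug their features into that action library through Apple’s App Intents framework. Here’s a minimal, hypothetical sketch (the intent and its word-count logic are mine, not from any real app):

```swift
import AppIntents

// Hypothetical sketch: a third-party app exposing a "Count Words"
// action that Shortcuts can chain with other actions.
struct WordCountIntent: AppIntent {
    static var title: LocalizedStringResource = "Count Words"
    static var description = IntentDescription("Counts the words in the given text.")

    @Parameter(title: "Text")
    var text: String

    // Shortcuts passes `text` in and can feed the returned count
    // into whatever action comes next in the workflow.
    func perform() async throws -> some IntentResult & ReturnsValue<Int> {
        let count = text.split(whereSeparator: \.isWhitespace).count
        return .result(value: count)
    }
}
```

Once an app ships something like this, the action appears in the Shortcuts library automatically, and its output can flow straight into the next step of a workflow with no extra glue.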
It all sounds great, doesn’t it? If you’ve ever used an automation tool like Zapier or n8n, you know the power that automations can bring to your workflows. What’s the difference here? The deep integration with Apple’s native apps and ecosystem.
Shortcuts in Apple Intelligence bring native, powerful automations, such as:
• New Apple Intelligence category in the Shortcuts Gallery.
• “Use Model” action lets a shortcut call an AI model: On-Device, Private Cloud Compute, or ChatGPT; you can pass data in and feed the output to later actions (see the sketch after this list).
• Practical uses Apple highlights: summarize text (Writing Tools), create images (Image Playground), parse PDFs or transcripts, then continue the workflow (e.g., file it, message it, add to a spreadsheet).
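For developers, the Use Model action has a sibling: the Foundation Models framework, which lets an app call the same on-device model directly. A minimal sketch, assuming the basic session API Apple showed at WWDC 2025 (the instructions and prompt are mine):

```swift
import FoundationModels

// Minimal sketch: ask the on-device Apple Intelligence model for a
// summary, roughly what the "Use Model" Shortcuts action does with
// its On-Device option.
func summarize(_ text: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in two sentences."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```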
OK, great. That explains the “how.” But what about the “why”? What’s the point of all of these Shortcuts features? Here are some super practical uses for Shortcuts.
• Saves time on repetitive stuff (one tap/voice instead of bouncing between apps).
• Hands-free and context-aware with Siri and automations (runs at the right moment).
• Bridges apps: passes data from one step to the next for consistent, error-free results.
• Adds AI when useful (summaries, rewrites, image creation) while keeping data private via on-device or Private Cloud Compute.
Two features stand out to me here. The first is privacy. Apple Intelligence can run on-device, removing the need to send anything to Apple’s servers, or anyone else’s. This protects your privacy and your sensitive data by keeping it on your device. The practice of AI companies harvesting data, and in some cases publishing it publicly, is well known. Privacy matters, and it’s great to see Apple taking it seriously in Apple Intelligence.
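That privacy posture even shows up in code. As I understand the Foundation Models API, an app is expected to check whether the on-device model is available before offering AI features at all, rather than silently falling back to someone’s cloud:

```swift
import FoundationModels

// Sketch: gate AI features on the on-device model's availability,
// so nothing is quietly routed off-device.
func checkOnDeviceModel() {
    switch SystemLanguageModel.default.availability {
    case .available:
        print("On-device model ready; no data needs to leave the device.")
    case .unavailable(let reason):
        print("On-device model unavailable: \(reason)")
    }
}
```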
The other big feature is using Apple devices hands-free. This serves accessibility needs for those who may not be able to use Apple products with their hands, a thoughtful move by Apple that values inclusivity.
It also serves another powerful need: legality. For example, in my native state of South Carolina, it is now illegal to hold a mobile device, such as a smartphone, while driving. Hands-free operability through automations like Shortcuts makes it possible to complete complex workflows without even touching your iPhone. It’s good for the user and good for legal compliance. Both are wins, and both are made possible through Shortcuts.
Image Playground and Genmoji Improvements
I admit, the first time I tried Image Playground, it seemed, well, dreadful. I didn’t get the point. Images seemed limited to cartoonish caricatures, and there were (and still are) far better image generation tools online.
What’s better about it now? I think that depends on what you want from image generation. If you’re looking for realism, Image Playground isn’t your Huckleberry, to say the least. I prompted it to generate an image of a Boxer dog playing in a field of green grass with a blue sky on a sunny day. Note that the length of the prompt you can add is super limited. The result?
[Image: Image Playground’s attempt at a Boxer dog playing in a field of green grass under a blue sky]
I guess if a Boxer bred with a Martian, that poor creature could be the result. It’s like the joke about tying a pork chop around someone’s neck to get their dog to play with them, only the other way around. Bless its little heart.
On the other hand, if you’re looking for shenanigans, Image Playground is kind of fun. You can drag and drop icons onto images of people you have photos of and get some zany results. Here’s one it came up with for me.
[Image: a zany Image Playground creation made by dropping icons onto a photo]
I’ll give it this: it’s fast. Really fast. Running those generations on my MacBook Pro M4 Max was nearly instantaneous. It’s WAY faster than waiting for ChatGPT to generate an image. With the newly released M5 chips and their Neural Accelerators, it should be faster still. And at least companies aren’t swiping every generated image and converting them to their own use without permission. That is, until one of them scrapes this article, anyway.
So, by those standards, Image Playground IS better in the new Apple Intelligence updates for the OS 26 rollouts. Here’s the problem with it for me: it’s a novelty, nothing more. If Apple gets serious about real image generation on-device, especially with the powers of the M5 chip deeply integrated, it’ll give the online players a run for their money. Until then, use Image Playground for what it is: a playground. But have some fun while you’re there. It does have the word “play” in it, after all.
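If you want the playground inside your own app, Apple provides an ImagePlayground framework for exactly that. A hedged sketch, assuming the SwiftUI sheet modifier introduced alongside Image Playground (the view name and concept string are mine):

```swift
import SwiftUI
import ImagePlayground

// Sketch: present the system Image Playground sheet from an app
// (iOS 18.2+ / macOS 15.2+) and capture the generated image's URL.
struct DogArtButton: View {
    @State private var showPlayground = false
    @State private var imageURL: URL?

    var body: some View {
        Button("Generate") { showPlayground = true }
            .imagePlaygroundSheet(
                isPresented: $showPlayground,
                concept: "a Boxer dog in a field of green grass"
            ) { url in
                imageURL = url // file URL of the generated image
            }
    }
}
```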
Working Out Together-ish
New to Apple Watch and watchOS 26 is the Workout Buddy feature. It combines your workout data with a text-to-speech AI voice modeled on the voice patterns of fitness coaches, giving you a real-time AI workout companion capable of:
• Before: short pep talk (e.g., weekly mileage, ring progress).
• During: mile/km splits, pace changes, notable milestones.
• After: stats summary and achievement call-outs.
What’s great about the AI coach is that it works across a variety of workout types. Among the list of supported workouts are:
• Run (indoor/outdoor)
• Walk (indoor/outdoor)
• Cycle (indoor/outdoor)
• Hiking
• Elliptical
• Stair Stepper
• HIIT
• Functional Strength
• Traditional Strength
I do ALL of those every day.
OK, no, I don’t.
But I DO walk for exercise, and I tested the Workout Buddy feature with my Apple Watch Series 10, iPhone 15 Pro Max, and AirPods Pro 3. See, you have to have an Apple Watch Series 6 or later, an iPhone 15 Pro or later, and Bluetooth headphones or earbuds for it all to work. Oh, and you also need iOS 26 on the iPhone and watchOS 26 on the Watch.
What’s it like? It’s AI, but it’s certainly better than the feeling of isolation that can come with working out in a vacuum.
What’s great about it is that the process, once set up on your iPhone, is seamless. You can simply tell your iPhone or Apple Watch to start a workout with your Workout Buddy, and it all just works. You can also set up the same kind of automation to start your favorite music at the same time, all of which removes friction and makes using Workout Buddy simple. And as you know from earlier, I believe that what gets made easy gets done.
On the flip side, it’s clear you’re listening to a text-to-speech voice in your ear while you work out. You have to decide whether hearing those prompts in a simulated voice is your jam or not. Apple clearly feels that it is, and it doesn’t make moves like this without supporting research and data to back it. In the end, you have to make up your own mind about how much AI you want in your life.
My take on it is that Workout Buddy is a useful component of Apple Intelligence and adds value to the workout process. Just keep it in the same place mentally that you keep the written data you track on your phone and Apple Watch, and I think you’ll enjoy this new feature.
What Apple Intelligence Isn’t and Won’t Ever Be
All of these new features really do add value to the Apple ecosystem. But what about the things that Apple Intelligence isn’t, at least yet?
The one glaring feature that has yet to be implemented, and is sorely overdue, is an improved Siri. In late 2025, Siri is at best lagging behind the competition, and at worst, it’s a hot mess. That might even be putting things too politely.
Ask Siri to do basic research, and you’re likely to get, well, much of nothing, even on high-spec iPhones, iPads, and Macs. And don’t even get me started on the dreadful implementation of Siri on HomePods, especially the woefully out-of-date HomePod mini. My daily arguments with the two I own leave me somewhere between not bothering with them and turning them into baseballs with long tails, to be hit into the next county.
The promise, if not the evidence, is there that Siri is undergoing a major update, likely to be released sometime in 2026. How long will Apple owners patiently wait before seeking out alternatives? I certainly hope it’s not too much longer, both for my own sanity and for the sake of Apple’s reputation.
Also, Apple Intelligence is NOT a chatbot built around a giant cloud LLM (Large Language Model) like ChatGPT, Google Gemini, Claude, or others. I don’t believe it ever will be. I also don’t believe it ever needs to be. Each of those models, with a few exceptions that can run on local PCs and Macs, runs in the cloud. You know what happens when LLMs run in the cloud? They keep your interactions and your data as part of their records. That’s fine if you’re fine with it, but it’s against the spirit of both Apple and Apple Intelligence, which is meant to run on your local devices privately.
Apple already has a ChatGPT integration in iOS 26 and iPadOS 26, but it can be turned off with the tap of a toggle switch to protect your privacy. That’s meaningful, because it preserves the one thing that none of those LLMs give you once you sign in and start using them: choice. And when a human can no longer choose, they cease to be human. That’s kind of a problem.
Apple Intelligence has a single focus: to help you use Apple devices and software more efficiently and effectively, if you choose to do so. You can even turn off Apple Intelligence if you’d like—just as you can with ChatGPT. That’s a lot of control over the use, privacy, and security of your devices. That’s not a deal any of those LLMs will offer you—I assure you.
Take Apple Intelligence for what it is: an Apple-centric way to get more value from the everyday use of your Apple hardware and software.
Is It Better After All?
Yes, it is. Apple Intelligence continues to make progress and has the potential to be a real game-changer for Apple owners.
It’s not there yet. But what is “there,” anyway? When should any OS, AI, or anything anywhere in the digital world be “finished”? There’s always room for growth, always room for evolution, and always room for revolution.
Apple Intelligence has grown. It stands ready to evolve. And, with time and continued adoption, it can revolutionize the way we use, and just maybe, love, our Apple devices. It’s up to Apple to make it so.