Apple Intelligence has finally arrived, and like most AI on smartphones so far, it’s mostly underwhelming.
The debut features of Apple Intelligence are all very familiar: there are glowing gradients and sparkle icons that indicate the presence of AI; writing tools that make your emails sound more professional; and an AI eraser in Photos that blots away distractions. It’s all here, and it all works okay. But none of it is even close to the time-saving computing platform shift we’ve been promised.
Essentially, there are two Apple Intelligences: the one that’s here now and the one we might see in the future. Even in today’s launch announcement, Apple is busy teasing the features that haven’t launched yet. What’s here today is a handful of tools that loosely share a common theme: helping you weed out distractions and find the signal in the noise. That’s the theory, anyway.
Apple uses AI to summarize groups of notifications so you can catch up on what you missed faster. You can summarize long emails and use a new focus mode that filters out unnecessary distractions. In practice, these things kind of work, though after a week of using them, I don’t feel like I saved much time or energy.
In the Mail app, AI summaries appear where the first line of an email would normally show up when you’re viewing an entire inbox; there’s also an option to summarize individual emails. Maybe it’s a reflection of how useless email has become, but I didn’t find either of these features terribly helpful. You know what feature we already use that summarizes an email pretty well? The subject line. At least that’s true of most emails I get; they’re usually short and to the point. Maybe Tim Cook saves himself a lot of time reading long emails, but personally, I could live without a little summary of every email the DNC sends me asking for three dollars by midnight.
Notification summaries seem a little more promising to me; at the very least, it’s pretty funny seeing AI try to summarize a string of gossipy texts or a bunch of notifications from your doorbell. The feature also surfaced a genuinely important detail in a thread of texts from a friend, and had I not seen that summary when glancing at my phone, I might not have read those messages until much later. That was helpful.
Over in Photos, you’ll find the new Clean Up tool in your editing options. It’s designed to quickly remove objects from a scene; you can tap something that’s been automatically highlighted by the tool or outline something yourself that you want removed. It runs on-device, so you only have to wait a few moments, and you’ll see the selected object (mostly) disappear.
The tool does a good-enough job, especially for smaller objects in the background. But it’s only about as good as Google’s older Magic Eraser tool in Google Photos, and occasionally a bit better; it’s no match for Google’s Magic Editor, which uses generative AI for incredibly convincing object removal. That tool runs in the cloud, so it’s a little apples to oranges, but still. I can use Google Photos’ on-device Magic Eraser tool on my four-year-old iPhone 12 Mini, and the results are pretty close to what I get with Clean Up running on the iPhone 16: not a great argument for the AI phone upgrade cycle.
There’s also, of course, an upgraded Siri. Sure, it looks different, and typing queries is a handy addition, but you don’t have to use it for long to realize it’s basically the same old Siri with a new coat of paint. It handles natural language better and includes more product knowledge to help you find settings on your iPhone, but that’s about it right now. Apple has promised big updates for Siri down the road, and features like a ChatGPT extension are scheduled to arrive by the end of the year. But the big stuff — contextual awareness, the ability to take action in apps — is all planned for 2025.
Other features, like AI-generated photo memories and smart replies, do what they’re supposed to do but lack a certain human touch. I didn’t send any of the AI-suggested replies in my messages, even though they conveyed the right sentiments. If I’m going to take the time to respond to a text, I might as well just write “That’s tough” myself rather than have AI do it, you know? Isn’t that part of the point of texting someone? I also prompted Photos to create a memory of moments with my kid, which it did, but it gave the result the eerily impersonal title “Joyous Moments with Child.”
To be clear, criticism of Apple Intelligence is not an endorsement of other phones’ intelligences; they’re all varying degrees of unhelpful right now. If you want to make it look like a helicopter crashed in an empty meadow, then sure, there’s AI for that. But if you want help getting things done? That’s not quite ready.
And in fairness, this is v1, and Apple has been pretty clear that its more impressive Intelligence features will drip out over the next year. But Apple also put a big, bright “Built for Apple Intelligence” bow on every new iPhone, iPad, and Mac it’s selling right now, suggesting we’d be sorry if we bought an Apple device that couldn’t handle AI. If Apple Intelligence is a letdown right now, it’s because Apple built it up to impossible heights.
There’s more to come, and some of it looks really promising. This first wave of AI features is Apple playing catch-up to Google and Samsung. But no phone maker has yet created a cohesive set of time-saving AI tools. Apple might be arriving late, but the game is just getting started.