Apple Intelligence isn’t very smart yet—and Apple’s OK with that

Apple’s ability to build tools right into the operating systems is undeniably powerful and convenient. Photo: David Paul Morris/Bloomberg News

Summary

Our columnist reviews the first wave of AI features coming to iPhones with iOS 18.1, and asks Apple’s software chief why so much is still missing.

Me: “Hey Siri, when will you become the smart assistant Apple always promised?”

Siri: “Here’s what I found on the web for ‘The smart assistant Apple promised.’”

Welcome to Apple Intelligence, where “intelligence” is still a work in progress.

Apple will launch iOS 18.1 next week, bringing its much-anticipated generative-AI tools to the iPhone 15 Pro models and the new iPhone 16 lineup. It will be available for most newer iPads and Macs, too.

If you’re expecting AI fireworks, prepare for AI…sparklers. Back in June, at the company’s annual developers conference, executives showed off do-it-yourself emojis, ChatGPT integration and a Siri that can recall the name of a person you met months ago. Apple has even been running ads for some features. None are in this release.

“This is a big lift,” Craig Federighi, Apple’s senior vice president of software engineering, told me at the company’s headquarters. “You could put something out there and have it be sort of a mess. Apple’s point of view is more like, ‘Let’s try to get each piece right and release it when it’s ready.’” (You can watch our full video interview here.)

Yes, while other companies rush out generative-AI tools, sometimes with controversy, Apple is moving cautiously. Federighi denies the company is behind, saying it’s prioritizing privacy and responsibility.

I’ve been testing Apple Intelligence on my iPhone and iPad. Apple’s ability to build tools right into the operating systems is undeniably powerful and convenient. But many are half-baked. I asked Federighi to explain the features—and Apple’s broader AI strategy.

Smarter Siri?

When you summon Siri in iOS 18.1, the screen’s edges light up with a rainbow glow and a more human-sounding voice responds. You can even type your queries by tapping at the bottom of the iPhone screen.

It’s like a Hollywood remake: fancier special effects but the same old plot holes. Siri is still best for basic commands (timers, weather and music, etc.), and often falls back on “Here’s what I found on the web” or admits it doesn’t understand. That is, unless you ask about using your Apple device—say, how to adjust Screen Time limits. That’s when generative AI kicks in and the “new” Siri shines.

So where is this smarter Siri—the one that can piece together context from your calendar, email and messages to take action? The one that can call on ChatGPT if it doesn’t know the answer?

“Siri is adopting that in stages and will benefit in stages over the coming year,” Federighi said.

Recently, OpenAI, Meta and Microsoft gave their chatbots eerily humanlike voices, enabling them to hold long conversations and answer questions about the world.

Apple’s assistant is built differently, Federighi said, emphasizing that it processes 1.5 billion requests daily. Those other chatbots are great if you want to ask a question about quantum mechanics, he said, but they won’t open your garage or send a text message.

“There’s a trade-off across capabilities,” he said. “Will these worlds converge? Of course.”

Writing Tools

Confession: I used Apple Intelligence to help write some of what you just read. I pasted the interview transcript into the Notes app, highlighted the passages about Siri, tapped the Writing Tools pop-up and selected Summary.

It’s convenient—you can use it within most apps where you’re working with text. But while Apple’s results are decent, other chatbot apps offer more control, letting you specify summary length or provide detailed rewriting prompts. Apple’s Writing Tools are the convenient drive-thru right on the highway; OpenAI’s ChatGPT is the better restaurant a few miles off your route.

Apple will soon offer deeper ChatGPT integration right within Writing Tools, but you still might want to lean on Apple for its privacy benefits. ChatGPT and most other chatbots based on large language models send everything you type to their cloud servers.

Apple does a lot on the device, but when it needs more processing power—summarizing a long email, for instance—it taps encrypted, private cloud-based models. With Private Cloud Compute, Federighi promises that the privacy of your device is extended to its cloud servers, and the data isn’t stored.
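For the technically curious, here is a rough sketch of that split, written in Swift. To be clear, this is purely illustrative: the type names and the length cutoff are hypothetical, and Apple’s actual routing logic and Private Cloud Compute interfaces aren’t public.

    // Conceptual sketch only: hypothetical names, not Apple's actual (non-public) API.
    // It illustrates the split described above: a request is handled on the device when
    // the local model can cope, and is otherwise routed to Private Cloud Compute, where
    // Apple says requests are encrypted and not stored.

    enum SummarizationBackend {
        case onDevice       // small model running on the iPhone itself
        case privateCloud   // Apple-run servers; per Apple, data isn't retained
    }

    // Hypothetical routing rule: short passages stay local, long email threads go to the cloud.
    func backend(forCharacterCount count: Int) -> SummarizationBackend {
        count < 4_000 ? .onDevice : .privateCloud
    }

    print(backend(forCharacterCount: 500))     // onDevice
    print(backend(forCharacterCount: 12_000))  // privateCloud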

Summarize notifications

My favorite part of Apple Intelligence? Notification summaries. Instead of seeing 10 alerts about my garage opening and closing, I get a simple update: “Multiple status changes for Garage Door, recently closed.” You can opt into this feature and choose which types of notifications you’d like summarized. All the summarizing happens on the device.

It’s definitely useful—but the occasional flubs and impersonal summaries are also pretty hilarious. How did it summarize a series of texts from my wife about our toddler’s tantrum?

“Child is not behaving well.”

Federighi acknowledged that sometimes the summaries aren’t so funny.

A recent viral post showed Apple summarizing a breakup text thread as “No longer in a relationship; wants belongings from the apartment.” Federighi says Apple Intelligence didn’t do “a horrible job” with it, but he recognized the larger issue: “There’s a variety of kinds of communication that can come through and sometimes those are sensitive matters.” In some cases, Apple won’t automatically summarize a notification because it might not handle it well, he says.

Apple also added summarization elsewhere, including the Mail app and voice recordings within the Notes app. Tim Cook told my colleague that email summarization changed his life. Wow! Just wait ’til Genmoji gets to him.

Clean up photos

Apple Intelligence will let you create images and emojis with a text prompt. Except—you guessed it—that’s not in this release, either. Instead, there’s a tool for removing unwanted parts of a photo, similar to Google’s Magic Eraser.

Select Clean Up in the Photos app and tap the objects or people you want to remove, and the on-device generative-AI model does its thing. It works well when the object or person is set against a simple, solid background—like a blue sky. But with more complex backgrounds, it struggles, leaving behind visible errors.

Many generative-AI image tools let you generate new parts of a photo. Apple doesn’t—and that’s intentional.

“People view photographic content as something they can rely on as indicative of reality,” Federighi said. “It’s important to us that we help purvey accurate information, not fantasy.”

That’s reassuring. Apple may be lagging behind AI rivals, but most of those companies haven’t demonstrated they prioritize privacy or the thoughtful rollout of these powerful tools.

Few of the upstart competitors have as much to lose as Apple, which trades on the trust of over a billion people. Who wants an iPhone experience where Siri blabs on about the meaning of life but forgets to set our alarms?

“This is a many-year, honestly, even decadeslong arc of this technology playing out, and so we’re going to do it responsibly,” Federighi said.

Until then, let’s say it all together now: “Here’s what I found on the web.”


Write to Joanna Stern at joanna.stern@wsj.com
