
How will Apple re-think AI features for WWDC 2025?


“Let’s work the problem, people. Let’s not make things worse by guessin’.”

Between the delay of a bunch of promised Apple Intelligence features and the realignment of Siri and other AI features under Apple software chief Craig Federighi, I’ve got to think that the Apple Intelligence situation is pretty intense right now.

The somewhat half-baked set of features we saw announced at WWDC in June 2024 was, according to Bloomberg’s Mark Gurman, the result of Apple going “all-in on AI” nearly a year before. Apple added a bunch of features to iOS 18 and macOS 15 and came up with a marketing plan built around the new Apple Intelligence brand, but by Apple standards it was a rush job.

Contrast that with the current status of Apple Intelligence. In the last few months, Apple has had to pull back on features it already promised and bring in new leadership. Meanwhile, WWDC 2025 looms. There’s not a lot of time to decide how Apple’s going to approach its AI functionality over the next year.

When I consider what’s going on at Apple right now, I keep thinking back to one of my favorite movies, “Apollo 13,” in which a bunch of engineers back in Houston are guided through a series of intense analytical steps by Flight Director Gene Kranz in order to understand what’s happened and how they can best work the problem and save the crew.

This exchange, between Kranz (played by Ed Harris) and Flight Controller Sy Liebergot (Clint Howard), is what I’ve kept thinking about:

Gene Kranz: Can we review our status here, Sy? Let’s look at this thing from a standpoint of status. What have we got on the spacecraft that’s good?

Sy Liebergot: I’ll get back to you, Gene.

While no lives are at stake, this is very much a situation where a daunting technical challenge demands an immediate response. So what do you do if you’re Federighi and Mike Rockwell (the new head of Siri)? You do what Gene Kranz did (and yes, kids, “Apollo 13” is a true story): look at the entire thing from a standpoint of status.

What does Apple have in artificial intelligence that’s good?

It’s triage, which involves reviewing a list of projects and determining what’s feasible. Balancing the needs of WWDC marketing with the art of the possible has to be one of the toughest things Federighi and company have done in the last few months.

First up: What’s the current status of the items announced last year and delayed back in March? Are they close to shipping, or far off? Federighi and company need to find out whether these features are just lagging, or if the initial conception was misguided and things need to be reconceptualized.

For example, Gurman has just reported that Apple is using Perplexity’s AI to build internal coding tools. Where does that leave Swift Assist, which was demoed last year but has never appeared, even in beta? One of the jobs of this triage project is to decide that some ideas just didn’t pan out, and move on to new ideas.

Another question for both Federighi and Apple’s marketing group is how to handle features already promised a year ago. Do they get re-promised? Are they not mentioned? If they’re reconceptualized, how does that get communicated? Apple has always been reluctant to admit to failure, so do revised features just get re-announced without any acknowledgement that they were previously promised?

Next: What’s the state of the stuff being worked on that hasn’t been announced? Between the features being built by Federighi’s team and the work he’s inheriting from AI chief John Giannandrea, there are undoubtedly a whole bunch of new items that were intended for the 2025-2026 OS cycle.

Obviously, the first step is a status check, to get a realistic sense of when a feature will be ready to ship to customers. But there’s another aspect to this part of the job: the whole group needs to consider the mistakes they made last year in gauging readiness. Last year’s judgment about what was ready to be announced was… flawed. How does Apple avoid that this time around? And with those mistakes in mind, which features are really going to ship by spring 2026? Everything else should wait for WWDC 2026.

Someone also needs to look critically at Apple’s own AI models and judge whether they’re suitable for deployment. One would hope that over the past year, Apple has developed better versions of the models it currently ships on devices, but even those new models may still lag behind the functionality of models from other providers. Some reports suggest that Federighi has softened on the use of third-party models in Apple features and functionality.

If Apple’s models are not state-of-the-art (and they are almost certainly not), are there “quick wins” Apple could accomplish by integrating third-party models? Could they be integrated into specific features? Does Apple have time to build a modular AI system that lets users choose which models—Apple, OpenAI, Perplexity, Google, whatever—they’d prefer to use? (And, separately, is Apple going to provide tools for app developers to integrate AI functionality as well?)
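To make that “modular” idea a little more concrete, here’s a minimal sketch (in Swift, since this is Apple we’re talking about) of what a pluggable model layer could look like. To be clear, none of this is a real Apple API; every name here (TextModelProvider, ModelRegistry, and so on) is hypothetical, just an illustration of features calling an abstraction rather than a specific model:

```swift
import Foundation

// Hypothetical sketch only. These are not real Apple APIs; they illustrate
// how features could talk to "a model" instead of one hard-coded backend.

/// A common interface any model backend could conform to.
protocol TextModelProvider {
    var name: String { get }
    func complete(prompt: String) async throws -> String
}

/// Stand-in for an on-device model (e.g. Apple's own).
struct OnDeviceModel: TextModelProvider {
    let name = "Apple (on-device)"
    func complete(prompt: String) async throws -> String {
        // Placeholder: a real version would call a local model runtime.
        return "(on-device completion for: \(prompt))"
    }
}

/// Stand-in for a remote third-party model (OpenAI, Perplexity, etc.)
/// reached over HTTPS. The endpoint and payload shape are invented.
struct RemoteModel: TextModelProvider {
    let name: String
    let endpoint: URL
    func complete(prompt: String) async throws -> String {
        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONEncoder().encode(["prompt": prompt])
        let (data, _) = try await URLSession.shared.data(for: request)
        return String(decoding: data, as: UTF8.self)
    }
}

/// The "user picks a model" part: features ask the registry for the
/// current provider, so backends can be swapped without touching features.
struct ModelRegistry {
    private var providers: [String: TextModelProvider] = [:]
    var defaultName: String?

    mutating func register(_ provider: TextModelProvider) {
        providers[provider.name] = provider
        if defaultName == nil { defaultName = provider.name }
    }

    func provider(named name: String? = nil) -> TextModelProvider? {
        providers[name ?? defaultName ?? ""]
    }
}
```

The point of the registry is that a feature like, say, notification summarization would ask for “the user’s chosen model” rather than hard-coding one backend, which is what would let Apple (or a third-party developer) swap among its own models, OpenAI, Perplexity, or Google without rewriting every feature.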

At WWDC, we’ll get our first sense of what Apple, with a revised software structure and still feeling the sting from failing to ship what it promised last year, thinks it can deliver. Failure is not an option.



