This wasn’t the AI keynote you expected… That’s the point

A screenshot from the WWDC 2025 keynote

Tim Cook played pit crew. Craig Federighi drove an F1 car. I was more focused on the quiet shift beneath it all — Apple’s entering a new phase of intelligence, continuity and user-centred system design.

WWDC 2025 leaned heavily into drama: high-octane intros, choir renditions of App Store reviews, and cinematic flair. But beneath the theatrics, Apple delivered meaningful, if subtle, shifts across its ecosystem — especially for those of us building for complex, multi-surface user experiences.

As someone who leads a mobile and cross-platform engineering team, I watched this keynote not for the hype, but for the signals: where Apple is heading, and what it means for developers, designers, and digital product teams who want to stay ahead of the curve.

What year is it? Apple just made it obvious

Apple is ditching its traditional OS version numbers in favour of a more universal, automotive-inspired format: iOS 26, iPadOS 26, macOS 26, etc. A small change on the surface — but one that reflects a broader shift in how Apple wants people to think about its platforms: as a cohesive, synchronised ecosystem.

This should make life a bit easier for both technical and non-technical teams. With version numbers finally lining up across iPhone, iPad, Mac, Watch and everything else, you won’t need to second-guess which OS is which anymore. No more “wait, is iOS 18 the same year as watchOS 11?” — just one shared version to track across the board.

Liquid Glass: Looks Sharp, Reads… We’ll See

Liquid Glass (Source: Apple Newsroom)

The design refresh, dubbed Liquid Glass, was perhaps the boldest aesthetic change since the iOS 7 flat design era. Inspired by the spatial interface of Vision Pro, it brings translucency and depth across devices.

There’s clear ambition here — especially in how the design language ties together across platforms. But I do have some reservations about how it plays out on the phone. The notifications on the lock screen looked difficult to read, and while the glassy look is visually impressive, the distortion during scroll felt more distracting than elegant.

Frosted glass in earlier versions felt subtle and refined — this new approach might be pushing things a bit too far. There are some really strong ideas in there, but Apple may need to tone down the effect slightly to keep text legible and UI components easy to read. Contrast is going to be key.

That said, the updates do look much slicker on the other platforms. The translucent menu bar on Mac, in particular, hits the right balance — clean, modern, and still practical.

Interestingly, Apple’s share price dropped by as much as 1.6% mid-keynote — maybe a sign that not everyone’s sold on the visual direction just yet.

Apple Intelligence: A Quiet Power Move

Rather than throwing out flashy generative AI demos like some of its competitors, Apple is betting on on-device intelligence with privacy baked in.

Apple’s new Foundation Models framework gives developers direct access to its language models, running fully offline and on device. That shift has big implications. Running models locally not only preserves user privacy, but also removes the need for cloud API calls — cutting both latency and cost.

The hesitation for many developers has always been the expense and complexity of integrating large language models. By removing that barrier, Apple’s making it easier (and more appealing) to experiment with AI features inside apps. It’s a more accessible, lower-risk way to build intelligent experiences — one that could encourage a much wider wave of AI-powered development.

Apple Intelligence updates include allowing users to search for similar items using Google, Etsy, or other supported apps. (Source: Apple Newsroom)
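To give a flavour of how low the barrier now is, here’s a minimal sketch of calling the on-device model, based on the Foundation Models API shown in the sessions. The framework is in its first betas, so treat the exact names as provisional:

```swift
import FoundationModels

struct ModelUnavailableError: Error {}

// Minimal sketch: ask the on-device model for a short summary.
// Requires hardware that supports Apple Intelligence.
func summarise(_ text: String) async throws -> String {
    // Check availability before exposing any AI feature in the UI.
    guard case .available = SystemLanguageModel.default.availability else {
        throw ModelUnavailableError()
    }

    // A session wraps a conversation with the local model:
    // no network call, no API key, no per-token cost.
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Summarise this in two sentences: \(text)"
    )
    return response.content
}
```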

Features like screenshot querying, adding calendar events, or even tapping into ChatGPT all feel practical and usable right now — not just part of some vague, long-term vision.

Take automotive, for example: imagine an in-car assistant that understands natural language, suggests nearby charging points based on route, or summarises recent trips — all handled locally, without user data leaving the vehicle. That kind of intelligent, on-device capability unlocks a more privacy-conscious and globally scalable user experience.
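As a sketch of what that could look like using the same framework — everything here, from the `Trip` type to the prompt, is hypothetical rather than a real automotive API:

```swift
import FoundationModels

// Hypothetical trip log entry; illustrative only.
struct Trip {
    let destination: String
    let distanceKm: Double
    let energyUsedKWh: Double
}

// Summarise recent trips entirely on device.
// The trip data never leaves the vehicle.
func summariseTrips(_ trips: [Trip]) async throws -> String {
    let session = LanguageModelSession(
        instructions: "You are an in-car assistant. Be brief and factual."
    )
    let log = trips
        .map { "\($0.destination): \($0.distanceKm) km, \($0.energyUsedKWh) kWh" }
        .joined(separator: "\n")
    let response = try await session.respond(
        to: "Summarise this week's driving in two sentences:\n\(log)"
    )
    return response.content
}
```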

Real-World Features Worth Watching

Apple’s most compelling updates are often the quiet ones — the ones that strip friction from everyday experiences.

Hold Assist is a good example. Your iPhone now recognises hold music and offers to stay on the line for you, ringing when someone finally answers. Android’s had a version of this for a while, but Apple’s take feels polished and user-friendly. It’s a quality-of-life feature that’s long overdue for iOS users — one that genuinely saves time and reduces passive frustration.

Live Translation is a broader play at cross-device accessibility. It’s available across Phone, Messages, and FaceTime, but the experience will vary. For texts, translations are instant — fluid and invisible. On FaceTime, live captions work naturally in a visual format. But the phone call translation — where your voice is translated and spoken aloud to the recipient — feels more experimental. The delay and lack of emotional nuance might make it awkward for now, even if the underlying tech is impressive.

Live Translations (Source: Apple Newsroom)

CarPlay UI updates tackle a long-standing distraction. Incoming calls no longer hijack the full screen, keeping navigation and driver context intact. For those of us working on in-car UX, these small interaction changes are critical. Attention in vehicles is a limited resource — and HMI design needs to respect that. This update reflects a more mature approach to contextual priority, which is central to how we design and build for the automotive space.

macOS & Continuity Get Smarter

macOS Tahoe continues to blur the lines between devices. Live Activities from your iPhone now surface on your Mac — Uber Eats orders, for example. Spotlight can now run automations, send emails, and more. It’s an interesting move, bringing command-line-esque behaviour to a broader user base.

It feels like Apple is pushing for “power users” across the board — not just on desktop. These additions matter when your users live across screens.
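The machinery behind this appears to be the existing App Intents framework: expose your app’s actions as intents, and surfaces like Spotlight and Shortcuts can run them directly. A minimal intent looks something like this (the intent and parameter names are illustrative):

```swift
import AppIntents

// A minimal App Intent. Actions declared like this are what
// Spotlight and Shortcuts can surface and run directly.
// `SendUpdateIntent` and `message` are illustrative names.
struct SendUpdateIntent: AppIntent {
    static var title: LocalizedStringResource = "Send Status Update"

    @Parameter(title: "Message")
    var message: String

    func perform() async throws -> some IntentResult {
        // Hand off to the app's own messaging layer here.
        return .result()
    }
}
```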

visionOS & Shared Experience

Vision Pro didn’t take centre stage this year, but a few quiet upgrades suggest Apple is preparing for broader, more social use of the device.

In visionOS 26, widgets become spatial, integrating seamlessly into a user’s space. You can now truly make your spatial environment your own, tweaking widgets to the perfect size, colour, and even depth, placing them exactly where they feel right. And the new widgets themselves are fantastic — we’re getting a clock you can actually style, a weather widget that visually mirrors the real-world conditions outside, and a photo widget that can either wrap around you in a breathtaking panorama or, my personal favourite, act as a magical ‘window’ into another place entirely.

Customisable widgets (Source: Apple Newsroom)
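Notably, these are built with the same WidgetKit framework developers already use elsewhere, so the shape of a spatial widget should feel familiar. A minimal sketch, with the spatial placement and styling handled by the system (`SpatialClockWidget` is an illustrative name):

```swift
import WidgetKit
import SwiftUI

// A standard WidgetKit widget. On visionOS 26, the same definition
// can be placed, resized, and styled in the user's space.
struct ClockEntry: TimelineEntry {
    let date: Date
}

struct ClockProvider: TimelineProvider {
    func placeholder(in context: Context) -> ClockEntry {
        ClockEntry(date: .now)
    }
    func getSnapshot(in context: Context, completion: @escaping (ClockEntry) -> Void) {
        completion(ClockEntry(date: .now))
    }
    func getTimeline(in context: Context, completion: @escaping (Timeline<ClockEntry>) -> Void) {
        // Refresh roughly every minute.
        completion(Timeline(entries: [ClockEntry(date: .now)], policy: .after(.now + 60)))
    }
}

struct SpatialClockWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "SpatialClock", provider: ClockProvider()) { entry in
            Text(entry.date, style: .time)
                .font(.largeTitle)
        }
        .configurationDisplayName("Clock")
    }
}
```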

visionOS now lets organisations easily share a common pool of devices among team members. Your eye and hand data, vision prescription, and accessibility settings are securely stored on your iPhone, so you can quickly use a shared team device or a friend’s Vision Pro as a guest.

In my opinion, this is a meaningful shift for usability. When I tried the Vision Pro last year, this was one of the biggest friction points. I wear glasses, so setting it up to my prescription was essential. But that made it essentially locked to me — if someone else wanted to try it, they’d either have to deal with a distorted view or reset the setup entirely. It didn’t feel like a shared product.

Now, with support for multiple profiles, Vision Pro starts to move beyond the single-user paradigm. It’s a small but important step toward more accessible and social spatial computing.

PlayStation VR2 controller support was another notable update. I’m not really a gamer, but it’s a smart move that opens the door to more mainstream use cases — and signals that Apple is serious about growing Vision Pro beyond productivity demos and solitary media consumption.

Final Thoughts

Was this the AI-centric WWDC that many were anticipating? Not on the surface. But to view that as a shortcoming is to miss the fundamental shift that took place. This keynote was a deliberate and necessary recalibration.

After the friction caused by last year’s delayed Siri promises, Apple is clearly prioritising deliverable, foundational technology over speculative demos. This is their reset — a move to reposition for the next wave of interaction, one built on a trustworthy and truly integrated intelligence.

For developers and product teams, this is the most crucial signal. The immediate payoff is already visible in the enhanced capabilities within Spotlight and Shortcuts. This isn’t a vague future promise; it’s a tangible, reliable foundation on which to build more powerful, deeply integrated app experiences right now.

There is still much to explore in the developer sessions, especially what this groundwork means for a future seemingly focused on “visual intelligence.”

But one thing is certain: this wasn’t just an incremental update. It was a clear statement of direction — Apple has laid the groundwork. Now it’s up to us to build on it.
