Apple says it made ‘AI for the rest of us’ — and it’s right

An Apple executive giving a presentation at WWDC 2024.
Apple
After many months of anxious waiting and feverish rumors, it’s finally happened: Apple has revealed its generative artificial intelligence (AI) system to the world at its Worldwide Developers Conference (WWDC).

Yet unlike ChatGPT and Google Gemini, the freshly unveiled Apple Intelligence tools and features look like someone actually took the time to think about how AI can be used to better the world, not burn it down. If it works half as well as Apple promises it will, it could be the best AI system on the market.

Every aspect of Apple Intelligence bears the classic Apple hallmarks, from its intense focus on user privacy to its natural, seamless integration into the company’s devices and operating systems. Apple’s insistence on waiting until its AI was ready — rather than rushing some dangerous, half-baked product out the door for maximum profit — is exactly what we’ve come to expect from Tim Cook and company. And we’re all better off for it.

Slow and steady

Apple Intelligence on iPhone pulling data from across apps.
Apple

I can’t blame Apple for being slow when it comes to launching generative AI tools. We’ve all seen the damage that unchecked AI can cause. From medical misinformation and deepfakes to job losses and revenge porn, getting it wrong comes with some serious side effects.

That means the race to the AI crown sometimes feels like a race to the bottom, with everyone so desperate to “win” that they resort to pumping out increasingly powerful and perilous tools with no oversight or thought beforehand.

Today, though, Apple got it spot-on. Apple Intelligence is baked into existing Apple apps — apps that you use every day and intimately understand how to use. Where others have tossed over the keys to the kingdom and said “go out there and have fun,” Apple has powered up your existing daily workflows with precisely the right tools in the right places.

There are several significant benefits to this method. For one thing, it brings the required learning curve right back down to earth. You already know how to write an email and edit a photo on your device — with Apple Intelligence, those same processes continue to exist, but now they have a few more generative bells and whistles to play around with.

Apple has also put its world-famous design sense to good use, integrating AI tools into apps that you use every day in ways that look totally natural. You don’t need to learn prompt engineering, you don’t need to load up any plug-ins, and you don’t need to pay for any new apps. In fact, you barely need to do anything different at all.

Privacy first

Apple talking about privacy with AI apps.
Apple

And there’s more. By constraining its generative AI tools within existing apps and operating system features, Apple can put a lid on dangerous and risky content that is far too easy to create in rival products.

But Apple’s not just looking to protect everyone else from what you might want to create in a dark moment — it’s looking to protect you as well. We’ve all seen what a privacy nightmare existing AI tools can be, with their propensity to leak the private data that they so voraciously vacuum up. Apple Intelligence takes a different approach.

For starters, Apple Intelligence processes most AI requests on your device, meaning no one else can get so much as a sniff of your data — not Apple, not third-party app makers, not anyone. That’s been the case with other Apple-made features for years, and doing the same with AI was a must. Apple, to its credit, has obliged.

When a cloud server really is required to process your queries and requests, Apple has tightly locked that down too. The cloud servers are Apple’s own, but the company has no access to your data. Better yet, the software running on those servers can be inspected by independent experts to verify that Apple is keeping its word. Just try getting a similar promise from OpenAI or Google.

A better way

Apple's Craig Federighi talks about Apple Intelligence at the Worldwide Developers Conference (WWDC) 2024.
Apple

It’s unlikely that Apple’s approach is foolproof (nothing truly is, after all). But it’s a far more accessible approach than any of us are used to seeing. Not only does Apple Intelligence look more usable and understandable than anything we’ve seen before, but it looks safer and more private too. In taking this approach, Apple is showing that AI doesn’t have to mean the destruction of humanity. It could instead mean helpful everyday tools and fun little Genmoji. Who’d have thought?

The company’s software chief, Craig Federighi, summed up Apple Intelligence as “AI for the rest of us.” I couldn’t have put it any better.

Alex Blake
Alex Blake has been working with Digital Trends since 2019, where he spends most of his time writing about Mac computers…