This Week's Sponsor:

DEVONTHINK

Store, Organize, and Work the Smart Way


Apple Details Its AI Foundation Models and Applebot Web Scraping

From Apple’s Machine Learning Research1 blog:

Our foundation models are trained on Apple’s AXLearn framework, an open-source project we released in 2023. It builds on top of JAX and XLA, and allows us to train the models with high efficiency and scalability on various training hardware and cloud platforms, including TPUs and both cloud and on-premise GPUs. We used a combination of data parallelism, tensor parallelism, sequence parallelism, and Fully Sharded Data Parallel (FSDP) to scale training along multiple dimensions such as data, model, and sequence length.

We train our foundation models on licensed data, including data selected to enhance specific features, as well as publicly available data collected by our web-crawler, AppleBot. Web publishers have the option to opt out of the use of their web content for Apple Intelligence training with a data usage control.

We never use our users’ private personal data or user interactions when training our foundation models, and we apply filters to remove personally identifiable information like social security and credit card numbers that are publicly available on the Internet. We also filter profanity and other low-quality content to prevent its inclusion in the training corpus. In addition to filtering, we perform data extraction, deduplication, and the application of a model-based classifier to identify high quality documents.
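As a toy illustration of the kind of filtering Apple describes – a minimal regex sketch with made-up patterns, nothing resembling Apple’s actual pipeline – removing SSN- and card-shaped numbers from text might look like this:

```python
import re

# Toy patterns for the two PII examples Apple mentions: social security
# numbers (123-45-6789) and credit card numbers (13-16 digits, optionally
# separated by spaces or hyphens). Real pipelines are far more thorough.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD_RE = re.compile(r"\b\d(?:[ -]?\d){12,15}\b")

def scrub_pii(text: str) -> str:
    """Replace SSN- and card-shaped numbers with a redaction token."""
    text = SSN_RE.sub("[REDACTED]", text)
    text = CARD_RE.sub("[REDACTED]", text)
    return text

sample = "SSN 123-45-6789, card 4111 1111 1111 1111."
print(scrub_pii(sample))  # SSN [REDACTED], card [REDACTED].
```

A pass like this is just one stage; as the post notes, it sits alongside deduplication, extraction, and model-based quality classifiers.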

It’s a very technical read, but it shows how Apple approached building AI features in their products and how their on-device and server models compare to others in the industry (on servers, Apple claims their model is essentially neck and neck with GPT-4-Turbo, OpenAI’s older model).
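For a sense of what the data parallelism mentioned in the post means in practice, here’s a toy numpy sketch of the core idea – shard the batch across devices, compute gradients locally, then average them. This is a conceptual illustration only, not Apple’s AXLearn/JAX code, which layers tensor and sequence parallelism and FSDP on top of it:

```python
import numpy as np

# Toy data parallelism: each simulated "device" computes the gradient on its
# shard of the batch, then the gradients are averaged (an all-reduce). This
# is the core idea behind data-parallel training.

rng = np.random.default_rng(0)
w = np.zeros(4)                       # model parameters (a linear model)
X = rng.normal(size=(8, 4))           # global batch of 8 examples
y = X @ np.array([1.0, -2.0, 0.5, 3.0])

def local_grad(w, X_shard, y_shard):
    """Mean-squared-error gradient computed on one shard."""
    err = X_shard @ w - y_shard
    return 2 * X_shard.T @ err / len(y_shard)

# Split the batch across 4 simulated devices and average their gradients.
shards = zip(np.split(X, 4), np.split(y, 4))
grads = [local_grad(w, Xs, ys) for Xs, ys in shards]
avg_grad = np.mean(grads, axis=0)

# Sanity check: averaging per-shard gradients reproduces the full-batch
# gradient, so the model update is identical to single-device training.
full_grad = local_grad(w, X, y)
assert np.allclose(avg_grad, full_grad)
```

The assertion at the end captures why this scales: the averaged shard gradients equal the full-batch gradient exactly, no matter how many devices the batch is split across.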

This blog post, however, pretty much parallels my reaction to the WWDC keynote. Everything was fun and cool until they showed generative image creation that spits out slop “resembling” (strong word) other people; and in this post, everything was cool until they mentioned how – surprise! – Applebot had already indexed web content to train their model without publishers’ consent, who can only opt out now. (This was also confirmed by Apple executives elsewhere.)
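Worth noting for publishers: the opt-out Apple refers to is its Applebot-Extended user agent, which Apple documents as a robots.txt-level control governing whether crawled content may be used for model training (blocking it doesn’t remove a site from search features). A site opting out entirely would add something like:

```
# Let Applebot crawl for search features,
# but opt out of AI training use.
User-agent: Applebot-Extended
Disallow: /
```

This only controls use going forward, though; it doesn’t undo training on content crawled before the control existed.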

As a creator and website owner, I guess that these things will never sit right with me. Why should we accept that certain data sets require a licensing fee but anything that is found “on the open web” can be mindlessly scraped, parsed, and regurgitated by an AI? Web publishers (and especially indie web publishers these days, who cannot afford lawsuits or hiring law firms to strike expensive deals) deserve better.

It’s disappointing to see Apple muddy an otherwise compelling set of features (some of which I really want to try) with practices that are no better than the rest of the industry.


  1. How long until this becomes the ‘Apple Intelligence Research’ website? ↩︎
Permalink

The Latest from AppStories and Ruminate

Enjoy the latest episodes from MacStories’ family of podcasts:

For the latest WWDC episode of AppStories, Federico is joined by Myke Hurley to talk about the Vision Pro and Apple Intelligence before John pops up with some AI tidbits and a WWDC vibe check from in and around Apple Park.

For this special episode of AppStories, Federico is joined by Jonathan and Niléane live in the Club MacStories+ Discord community to share their first impressions of the WWDC 2024 Keynote.

This episode is sponsored by:

  • Kolide – It ensures that if a device isn’t secure it can’t access your apps. It’s Device Trust for Okta. Watch the demo now.

Recorded live in the Club MacStories Discord, Federico and John share their final preparations and plans for WWDC 2024 along with some last-minute predictions.

On AppStories+, Federico reveals his trio of iPad Pros and we take questions from Club members about WWDC.

This episode is sponsored by:

  • CleanMyMac X – Your Mac. As good as new. Get 15% off today with code APPSTORIES15.
  • Kolide – It ensures that if a device isn’t secure it can’t access your apps. It’s Device Trust for Okta. Watch the demo now.


This week, new MacStories podcasts, the Ruminate intro song is back, snack news, some keyboard accessories, and an alternative to the small web.

Read more


Interview Roundup: Apple’s Executives Talk Up Apple Intelligence and WWDC

In what has become a yearly WWDC tradition, Apple executives have been out talking about the big announcements from this year’s conference. Craig Federighi, Greg Joswiak, John Giannandrea, and Tim Cook have given interviews to YouTubers, news sites, and John Gruber on a special edition of The Talk Show streamed live in spatial video.

They gave fascinating answers to some questions, particularly about Apple Intelligence, so without further ado, here’s a roundup of interesting Apple executive interviews over the past few days.

Read more


1Password Extended Access Management: Secure Every Sign-In for Every App on Every Device [WWDC Sponsor]

In a perfect world, end users would only work on managed devices with IT-approved apps. But every day, employees use personal devices and unapproved apps that aren’t protected by MDM, IAM, or any other security tool.

There’s a giant gap between the security tools we have and the way we actually work. 1Password calls it the Access-Trust Gap, and they’ve also created the first-ever solution to fill it.

1Password Extended Access Management secures every sign-in for every app on every device. It includes the password manager you know and love, and the device trust solution you’ve probably heard of on this podcast, back when it was called Kolide.

1Password Extended Access Management cares about user experience and privacy, which means it can go places other tools can’t – like personal and contractor devices. It ensures that every device is known and healthy, and every login is protected. So stop trying to ban BYOD or Shadow IT, and start protecting them with 1Password Extended Access Management.

Check it out today.

Our thanks to 1Password for sponsoring our WWDC coverage this week.


tvOS 18: The MacStories Overview

Yesterday, during its WWDC 2024 opening keynote, Apple officially revealed its latest software story for Apple TV. Coming this fall, tvOS 18 introduces new intelligence-based features such as InSight and on-device Siri, native 21:9 aspect ratio support, new screen savers, and a host of noteworthy additions to enhance the at-home TV viewing experience. Let’s jump into everything new coming to Apple TV.

InSight

Apple’s video player is something of a hidden gem when it comes to playback and controls for audio and captions. A few years ago, the company expanded its functionality with a quick swipe-down gesture that reveals an Info panel with details about the currently-playing content and quick access to the user’s Up Next queue. Premiering this fall is a new feature nestled between those two elements called InSight.

A new addition to Apple TV+, InSight gives users real-time access to information about the actors and their characters onscreen, as well as the soundtrack in a given scene, allowing viewers to quickly add that song or musical performance to an Apple Music playlist to enjoy later. Compared to Amazon Prime Video’s X-Ray feature that came before it, there’s plenty of fine-grained detail that could still be added to InSight before its fall launch, but this is a great start.

In addition to accessing InSight on the big screen, users will also be able to view real-time actor, character, and music information through the Remote app found in Control Center on iOS and iPadOS, allowing access to the same information for a distraction-free experience when watching with friends and family.

Read more


Apple Announces New Features Coming to Its Services This Fall

Alongside updates to Apple’s platforms and Apple Intelligence, the company announced an assortment of new features coming to its line of services this fall. From the press release in Apple Newsroom:

“So many of our users rely on Apple services throughout their day, from navigating their commute with Apple Maps, to making easy and secure payments with Apple Pay, to curating playlists with Apple Music,” said Eddy Cue, Apple’s senior vice president of Services. “We’re excited to give them even more to love about our services, like the ability to explore national parks with hikes in Apple Maps, redeem rewards or access installments with Apple Pay, and enjoy music with loved ones through SharePlay in Apple Music.”

I like that this services roundup is becoming an annual WWDC tradition. Some of these features were mentioned or shown on-screen during the keynote, but it’s easy for them to get overlooked in light of major operating system changes. While they might seem small in comparison, improvements to Apple’s services can have lasting day-to-day impacts on those who use them, myself included.

A few of my favorite services updates this year:

  • A new Places Library in Maps that allows you to save locations and write notes about them.
  • Tap to Provision, an easier way to add credit and debit cards to Wallet by tapping them instead of entering card numbers.
  • Redesigned event tickets in Wallet that can feature new types of data, including parking and weather information.
  • The Library tab in Apple Fitness+ for quicker access to saved workouts, Custom Plans, and Stacks.
  • Redesigned iCloud settings to better surface recommendations and features you’re using.

Check out the press release for all the updates coming to Apple’s services this fall. There’s a lot to look forward to there, and I’m happy to see the company continuing to push its services forward.


You can follow all of our WWDC coverage through our WWDC 2024 hub or subscribe to the dedicated WWDC 2024 RSS feed.

Permalink

Apple Intelligence: The MacStories Overview

After months of anticipation and speculation about what Apple could be doing in the world of artificial intelligence, we now have our first glimpse at the company’s approach: Apple Intelligence. Based on generative models, Apple Intelligence uses a combination of on-device and cloud processing to offer intelligence features that are personalized, useful, and secure. In today’s WWDC keynote, Tim Cook went so far as to call it “the next big step for Apple.”

From the company’s press release on Apple Intelligence:

“We’re thrilled to introduce a new chapter in Apple innovation. Apple Intelligence will transform what users can do with our products — and what our products can do for our users,” said Tim Cook, Apple’s CEO. “Our unique approach combines generative AI with a user’s personal context to deliver truly helpful intelligence. And it can access that information in a completely private and secure way to help users do the things that matter most to them. This is AI as only Apple can deliver it, and we can’t wait for users to experience what it can do.”

It’s clear from today’s presentation that Apple is positioning itself as taking a different approach to AI than the rest of the industry. The company is putting generative models at the core of its devices while seeking to stay true to its principles. And that starts with privacy.

Read more


A Look at Code Completion and Swift Assist Coming in Xcode 16

Source: Apple.

Earlier today, I got the very first live demo of Swift Assist, one of the many developer tools Apple introduced at WWDC. I also saw code completion in action. It was an impressive demo, and although the tools seem like magic and will undoubtedly be valuable to developers, they do have their limitations, which are worth exploring.

Code Completion in Action. Source: Apple.

First, from what I could tell, code completion works extremely well. The demo I saw was of a simple restaurant app that displayed a menu. As an Apple representative typed variables and other items into Xcode, code completion named things in ways that made sense for a restaurant menu, such as Name, Price, and Calories. The feature also filled in types like strings, integers, and bools, along with the appropriate surrounding syntax.

In most cases, after typing just a handful of characters, the correct suggestion appeared and with a quick tap of the Tab key, the rest of the line of code was filled in. When the suggestion wasn’t what was wanted, a little additional typing steered the AI that backs code completion to the correct solution.

The model that drives code completion is trained specifically for the Swift programming language and Apple’s APIs. It runs locally on a developer’s Mac, enhancing privacy and ensuring that it’s available regardless of Internet connectivity. Although Apple was vague about the code on which the model was trained, it was clear from my briefing that it wasn’t Apple’s own internal code; Apple did say, however, that it is code the company is authorized to use. I was also told that the model isn’t trained on the code of the developers who use the feature. Also worth noting is that Apple’s code completion model is continually updated independently of Xcode’s own release cycle.

Read more


iOS and iPadOS 18: The MacStories Overview

Image: Apple.

At its WWDC 2024 keynote held earlier today online and with an in-person event at Apple Park in Cupertino, California, Apple officially announced the next versions of the operating systems for iPhone and iPad – iOS and iPadOS 18.

As widely speculated in the lead-up to the event, Apple’s focus for both OSes largely revolves around artificial intelligence – or, as the company now likes to refer to the AI acronym, “Apple Intelligence”. The new AI features promise to make both operating systems, well, more intelligent than before thanks to a completely revamped Siri and proactive functionality that learns from users’ habits and apps. Presented as a fast, private, and personal set of features that draws on the user’s context and combines it with generative models, Apple Intelligence – which will debut in U.S. English only later this year, with a beta expected later this summer – will power a variety of new system features and experiences, ranging from the revamped Siri and text analysis to image creation, performing actions inside apps, and more.

But AI-related improvements aren’t the only new features Apple announced today. From a renewed focus on Home Screen customization and a redesigned Control Center to a new design for tab bars on iPad and expanded Tapbacks in Messages, Apple has shown that, while it can follow the rest of the tech industry in rethinking how AI can enhance how we use our devices, it can continue shipping other functionality for iPhone and iPad, too. Or, at the very least, it certainly can for the iPhone and iOS.

We’ll have in-depth overviews for both iOS and iPadOS 18 when the public betas for each OS come out next month, and, of course, we’ll continue diving into the announcements later this week on MacStories via our WWDC 2024 hub as well as AppStories. We’ll also have a dedicated story about Apple Intelligence coming later on MacStories with the highlights of all the AI-infused features announced by Apple today.

In the meantime, here’s a recap of everything else that Apple showed today for iOS and iPadOS 18.

Read more