Posts tagged with "developer tools"

Where’s Swift Assist?

Last June at WWDC, Apple announced Swift Assist, a way to generate Swift code using natural language prompts. However, as Tim Hardwick writes for MacRumors, Swift Assist hasn’t been heard from since then:

Unlike Apple Intelligence, Swift Assist never appeared in beta. Apple hasn’t announced that it’s been delayed or cancelled. The company has since released Xcode 16.3 beta 2, and as Michael Tsai points out, it’s not even mentioned in the release notes.

Meanwhile, developers have moved on, adopting services like Cursor, which does much of what was promised with Swift Assist, if not more. A similar tool built specifically for Swift projects and Apple’s APIs would be a great addition to Xcode, but it’s been nine months, and developers haven’t heard anything more about Swift Assist. Apple owes them an update.

Permalink

On Apple Offering an Abstraction Layer for AI on Its Platforms

Source: Apple.

I’ve been thinking about Apple’s position in AI a lot this week, and I keep coming back to this idea: if Apple is making the best consumer-grade computers for AI right now, but Apple Intelligence is failing third-party developers with a lack of AI-related APIs, should the company try something else to make it easier for developers to integrate AI into their apps?

Gus Mueller, creator of Acorn and Retrobatch, has been pondering similar thoughts:

A week or so ago I was grousing to some friends that Apple needs to open up things on the Mac so other LLMs can step in where Siri is failing. In theory we (developers) could do this today, but I would love to see a blessed system where Apple provided APIs to other LLM providers.

Are there security concerns? Yes, of course there are, there always will be. But I would like the choice.

The crux of the issue in my mind is this: Apple has a lot of good ideas, but they don’t have a monopoly on them. I would like some other folks to come in and try their ideas out. I would like things to advance at the pace of the industry, and not Apple’s. Maybe with a blessed system in place, Apple could watch and see how people use LLMs and other generative models (instead of giving us Genmoji that look like something Fisher-Price would make). And maybe open up the existing Apple-only models to developers. There are locally installed image processing models that I would love to take advantage of in my apps.

The idea is a fascinating one: if Apple Intelligence cannot compete with the likes of ChatGPT or Claude for the foreseeable future, but third-party developers are building apps on top of those services' APIs, is there a scenario in which Apple could regain control of the burgeoning AI app ecosystem by offering its own native bridge to those APIs?
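To make the idea more concrete, here's a minimal sketch of what a "blessed" abstraction layer could look like from a third-party developer's perspective. To be clear, everything below, from the protocol name to the method signatures, is hypothetical; no such API exists in Apple's SDKs today.

```swift
import Foundation

// Hypothetical sketch only: none of these types exist in Apple's SDKs.
// The idea is a provider-agnostic protocol that Apple could vend, with
// ChatGPT, Claude, or an on-device Apple model plugged in behind it.
protocol LanguageModelProvider {
    var displayName: String { get }
    func complete(_ prompt: String) async throws -> String
}

// An example third-party provider conforming to the hypothetical protocol.
struct ClaudeProvider: LanguageModelProvider {
    let displayName = "Claude"
    func complete(_ prompt: String) async throws -> String {
        // A real implementation would call Anthropic's API here.
        "…"
    }
}

// An app could then ask the system for whichever provider the user has
// chosen, without caring which company is behind it.
func summarize(_ text: String, using provider: any LanguageModelProvider) async throws -> String {
    try await provider.complete("Summarize the following text:\n\(text)")
}
```

The point isn't the specifics; it's that a system-level protocol along these lines would let users pick a model the way they pick a default browser, while apps write against a single API.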

Read more


The M3 Ultra Mac Studio for Local LLMs

Speaking of the new Mac Studio and Apple making the best computers for AI: this is a terrific overview by Max Weinbach of the new M3 Ultra chip and its real-world performance with various on-device LLMs:

The Mac I’ve been using for the past few days is the Mac Studio with M3 Ultra SoC, 32-core CPU, 80-core GPU, 256GB Unified Memory (192GB usable for VRAM), and 4TB SSD. It’s the fastest computer I have. It is faster in my workflows for even AI than my gaming PC (which will be used for comparisons below; it has an Intel i9 13900K, RTX 5090, 64GB of DDR5, and a 2TB NVMe SSD).

It’s a very technical read, but the comparison between the M3 Ultra and a vanilla (non-optimized) RTX 5090 is mind-blowing to me. According to Weinbach, it all comes down to Apple’s MLX framework:

I’ll keep it brief; the LLM performance is essentially as good as you’ll get for the majority of models. You’ll be able to run better models faster with larger context windows on a Mac Studio or any Mac with Unified Memory than essentially any PC on the market. This is simply the inherent benefit of not only Apple Silicon but Apple’s MLX framework (the reason we can efficiently run the models without preloading KV Cache into memory, as well as generate tokens faster as context windows grow).

In case you’re not familiar, MLX is Apple’s open-source framework that – I’m simplifying – optimizes training and serving models on Apple Silicon’s unified memory architecture. It is a wonderful project with over 1,600 community models available for download.

As Weinbach concludes:

I see one of the best combos any developer can do as: M3 Ultra Mac Studio with an Nvidia 8xH100 rented rack. Hopper and Blackwell are outstanding for servers, M3 Ultra is outstanding for your desk. Different machines for a different use, while it’s fun to compare these for sport, that’s not the reality.

There really is no competition for an AI workstation today. The reality is, the only option is a Mac Studio.

Don’t miss the benchmarks in the story.

Permalink

App Store Connect Adds New Tools for Developers to Promote Their Apps

Source: Apple.

App Store Connect, the web app that developers use to submit their apps to Apple’s App Store and manage them, was updated yesterday with new tools developers can use to promote their apps.

Source: Apple.

Developers have been able to submit promotional requests to Apple for quite some time, but the new Featuring Nomination process is now baked right into App Store Connect. When submitting a nomination, developers are asked for information about their app, and nominations can be made for events such as a new app launch or the addition of in-app content and features. When an app is chosen for featuring by the App Store editorial team, developers will be notified in App Store Connect, too.

App Store Connect has also added the ability to generate promotional materials. The assets created can be used on social media and other platforms to promote app launches and other significant events.

These new App Store Connect tools promise to make promoting apps more convenient by including the Featuring Nomination process alongside other aspects of app submission. However, I expect it’s the ready-made promotional assets that are the more significant addition for smaller developers who may not have the budget or skills to create the materials themselves.


A Look at Code Completion and Swift Assist Coming in Xcode 16

Source: Apple.

Earlier today, I got the very first live demo of Swift Assist, one of the many developer tools introduced today by Apple. I also saw code completion in action. It was an impressive demo, and although the tools seem like magic and will undoubtedly be valuable to developers, they do have their limitations, which are worth exploring.

Code Completion in Action. Source: Apple.

First, from what I could tell, code completion works extremely well. The demo I saw was of a simple restaurant app that displayed a menu. As an Apple representative typed variables and other items into Xcode, code completion named things in ways that made sense for a restaurant menu, such as Name, Price, and Calories. The feature also filled in types like strings, integers, and bools, along with the appropriate surrounding syntax.

In most cases, after typing just a handful of characters, the correct suggestion appeared, and with a quick tap of the Tab key, the rest of the line of code was filled in. When the suggestion wasn’t what was wanted, a little additional typing steered the AI that backs code completion toward the correct solution.
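For context, the code being completed looked roughly like the snippet below. This is my reconstruction of the demo rather than Apple's actual sample project, so the names are illustrative only.

```swift
import Foundation

// A reconstruction of the kind of model type the restaurant demo built up:
// property names and types like these were suggested by code completion
// after only a few typed characters.
struct MenuItem {
    let name: String
    let price: Double
    let calories: Int
    let isVegetarian: Bool
}

// Sample data of the sort the demo displayed in its menu list.
let menu: [MenuItem] = [
    MenuItem(name: "Margherita Pizza", price: 12.50, calories: 850, isVegetarian: true),
    MenuItem(name: "Chicken Parmesan", price: 16.00, calories: 1100, isVegetarian: false)
]
```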

The model that drives code completion is trained specifically for the Swift programming language and Apple’s APIs. It runs locally on a developer’s Mac, enhancing privacy and ensuring that it’s available regardless of Internet connectivity. Although Apple was vague about the code the model was trained on, it was clear from my briefing that it wasn’t trained on Apple’s own internal code; Apple said only that it’s code the company is authorized to use. I was also told that the model isn’t trained on the code of developers who use the feature. It’s also worth noting that Apple’s code completion model is updated continually, independent of Xcode’s own release cycle.

Read more


Apple Highlights Its 2023 Developer Programs

Every year, Apple recaps the programs and other services it has launched for developers. Ever since I started covering Apple, there’s been a certain amount of tension between the company and its developers. This year, that strain is running higher than I’ve ever seen, at least among the solo and small developer teams we typically cover.

However, it can simultaneously be true that Apple provides developers with valuable resources that are constantly changing. In a press release today, Apple highlights the following developer programs from 2023:

This year, Apple has also updated the Apple Developer Forums in advance of WWDC and rolled out Pathways, a collection of videos, documentation, and other resources focused on core topics like Design, Swift, SwiftUI, Games, visionOS, and App Store distribution.

There are a lot of great resources here, far more than were available when I was learning to code around 2015. I’m particularly intrigued by Pathways, which looks as though it does an excellent job of pulling together materials that would otherwise require developers to consult multiple sources.


A TestFlight Update: Patched, But Still Broken

Just over a year ago, I wrote about the poor performance of TestFlight, the app that App Store developers rely on for beta testing their own apps. Today, thanks to a couple of rounds of Feedback submissions, TestFlight is working better than before, but it’s not fixed. With WWDC around the corner, I thought I’d provide a quick update and share a few suggestions for fixes and features I’d like to see Apple implement.

One of the benefits of writing about TestFlight last year was that it became clear to me that, although my use of the app was unusual, I wasn’t alone. Other writers who test a lot of apps, and super fans who love trying the latest versions of their favorite apps, got in touch to share similar experiences, which convinced me that the issue was related to the number of betas I had in TestFlight. My experience was one of the worst, but with others in a similar boat, I took the time to file a Feedback report to see if there was anything that could be done to improve TestFlight.

An example of a beta app set to automatically update. But at least on my iPhone, none do.

That initial Feedback attempt ultimately went nowhere. Then, I got busy and resigned myself to getting by as best I could. However, getting by was no longer an option as the Vision Pro’s release date approached. That added a significant number of new betas to my TestFlight collection. By March, the Mac version of TestFlight had stopped working entirely. With apps lined up in my review queue, that posed a problem I couldn’t work around.

I removed inactive betas using my iPhone and removed myself from testing as many active betas as I could bear. However, nothing worked, so I filed another report with the black box known as Feedback. Fortunately, this time, it worked. After some back-and-forth sharing logs and screen recordings of TestFlight failing to load any content, I received a message that something had been adjusted on Apple’s end to shake things loose. Just like that, TestFlight was working again, although sluggishly.

TestFlight once again loads betas on my Mac, but not always with icons.

My immediate problem is fixed, and I’ve been managing old betas more carefully to avoid a repeat of what happened on the Mac before. However, it’s clear that TestFlight needs more than just the quick fix that solved the worst of my problems. First of all, although TestFlight works again on my Mac, it’s slow to load on all OSes and clearly in need of work to allow it to handle larger beta collections more gracefully. And there’s a lot of other low-hanging fruit that would make managing large beta collections better on every OS, including:

  • the addition of a search field to make it easier to quickly locate a particular app
  • sorting by multiple criteria like developer, app name, and app category
  • filtering to allow users to only display installed or uninstalled betas
  • a single toggle in the Settings app to turn off all existing and future email notifications of new beta releases
  • attention to the automatic installation of beta updates, which has never worked consistently for me
  • a versioning system that allows users to see whether the App Store version of an app has caught up to its beta releases
  • automatic installation of betas after an OS update or ‘factory restore,’ because currently those apps’ icons are installed, but the apps aren’t usable until they’re manually re-installed from TestFlight one by one

It’s time for Apple to spend some time updating TestFlight beyond the band-aid fix that got it working again for me. It’s been a full decade since Apple acquired TestFlight. Today, the app is crucial to iOS, iPadOS, watchOS, and visionOS development, and while it’s not as critical to macOS development, it’s used more often than not by Mac developers, too. Apple has gone to great lengths to explain the benefits of its developer program to justify its App Store commissions generally and the Core Technology Fee in the EU specifically. TestFlight is just one piece of that program, but it’s an important one that has been neglected for too long and no longer squares with the company’s professed commitment to developers.


NVIDIA Introduces Remote Scene Rendering for Vision Pro Development

NVIDIA is in the midst of its 2024 GTC AI conference, and among the many posts published by the company yesterday was a bit of news about the Apple Vision Pro:

Announced today at NVIDIA GTC, a new software framework built on Omniverse Cloud APIs, or application programming interfaces, lets developers easily send their Universal Scene Description (OpenUSD) industrial scenes from their content creation applications to the NVIDIA Graphics Delivery Network (GDN), a global network of graphics-ready data centers that can stream advanced 3D experiences to Apple Vision Pro.

That’s a bit of an NVIDIA word salad, but what they’re saying is that developers will be able to take immersive scenes built using OpenUSD, an open standard for creating 3D scenes, render them remotely, and deliver them to the Apple Vision Pro over Wi-Fi.

What caught my eye about this announcement is the remote rendering and Wi-Fi delivery part. NVIDIA has been using its data centers to deliver high-resolution gaming via its GeForce NOW streaming service. I’ve tried it with the Vision Pro, and it works really well.

NVIDIA says:

The workflow also introduces hybrid rendering, a groundbreaking technique that combines local and remote rendering on the device. Users can render fully interactive experiences in a single application from Apple’s native SwiftUI and Reality Kit with the Omniverse RTX Renderer streaming from GDN.

That means visionOS developers will be able to offload the rendering of an immersive environment to NVIDIA’s servers while adding to the scene using Apple’s SwiftUI and RealityKit frameworks, which Apple and NVIDIA expect will create new opportunities for customers:

“The breakthrough ultra-high-resolution displays of Apple Vision Pro, combined with photorealistic rendering of OpenUSD content streamed from NVIDIA accelerated computing, unlocks an incredible opportunity for the advancement of immersive experiences,” said Mike Rockwell, vice president of the Vision Products Group at Apple. “Spatial computing will redefine how designers and developers build captivating digital content, driving a new era of creativity and engagement.”

“Apple Vision Pro is the first untethered device which allows for enterprise customers to realize their work without compromise,” said Rev Lebaredian, vice president of simulation at NVIDIA. “We look forward to our customers having access to these amazing tools.”

The press release frames this as a technology aimed at enterprise users, but given NVIDIA’s importance to the gaming industry, I wouldn’t be surprised to see the new frameworks employed there too. Also notable is the quote from Apple’s Mike Rockwell, given the two companies’ historically chilly relationship.
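To picture how that hybrid split might look on the visionOS side, here's a rough sketch. The RealityView, RealityKit, and SwiftUI pieces are real Apple APIs; the remoteStreamedEntity() helper is a stand-in for whatever NVIDIA's Omniverse Cloud/GDN client framework ends up exposing and is entirely hypothetical.

```swift
import SwiftUI
import RealityKit

// Hypothetical placeholder for NVIDIA's streaming client. In a real app this
// would come from the Omniverse Cloud APIs, whose Swift surface hasn't been
// detailed publicly, so this function is purely illustrative.
func remoteStreamedEntity() async -> Entity {
    Entity()
}

struct HybridSceneView: View {
    var body: some View {
        RealityView { content in
            // Remote half: the photorealistic OpenUSD scene rendered in
            // NVIDIA's data centers and streamed to the headset over Wi-Fi.
            let streamed = await remoteStreamedEntity()
            content.add(streamed)

            // Local half: lightweight RealityKit content rendered on device,
            // composited into the same scene.
            let marker = ModelEntity(
                mesh: .generateSphere(radius: 0.05),
                materials: [SimpleMaterial(color: .red, isMetallic: false)]
            )
            marker.position = [0, 1.2, -1]
            content.add(marker)
        }
        .overlay(alignment: .bottom) {
            // Native SwiftUI layered on top of the streamed scene.
            Text("Streaming from NVIDIA GDN")
                .padding()
                .glassBackgroundEffect()
        }
    }
}
```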

Permalink

FinanceKit Opens Real-Time Apple Card, Apple Cash, and Apple Savings Transaction Data to Third-Party Apps

Ivan Mehta, writing for TechCrunch:

Apple’s iOS 17.4 update is primarily about adapting iOS to EU’s Digital Market Act Regulation. But the company has also released a new API called FinanceKit that lets developers fetch transactions and balance information from Apple Card, Apple Cash, and Savings with Apple.

If you use an Apple Card and a budgeting and financial tracking app, you’ll know why this is a big deal. I’ve been tracking my expenses with Copilot for over a year now, and I was pleased to see in Mehta’s story that Copilot, along with YNAB and Monarch, has teamed up with Apple to be among the first third-party apps to use FinanceKit.

Before FinanceKit, I could only track my Apple Card expenses by importing a CSV file of my transactions once a month, when a new statement appeared in the Wallet app. Not only was that laborious, but it also defeated the purpose of an app like Copilot, which otherwise lets you see where you stand with your budget in real time. The process was such a bad experience that I used my Apple Card a lot less than I would have otherwise. Now, those Apple Card transactions will be recorded in Copilot, YNAB, and Monarch as they’re made, just like transactions from any other credit card.
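For the curious, here's roughly what reading those transactions looks like with FinanceKit. This is a minimal sketch based on my understanding of the framework, so treat the exact type and method names (FinanceStore, TransactionQuery, and so on) as approximations; actual use also requires a FinanceKit entitlement granted by Apple.

```swift
import FinanceKit

// A minimal sketch of fetching recent Apple Card, Apple Cash, and Savings
// transactions. Names and signatures reflect my reading of FinanceKit and
// may not match the shipping API exactly.
func fetchRecentTransactions() async throws -> [FinanceKit.Transaction] {
    let store = FinanceStore.shared

    // Ask the user for permission to read their financial data.
    let status = try await store.requestAuthorization()
    guard status == .authorized else { return [] }

    // Pull the 50 most recent transactions; an app like Copilot would keep
    // this in sync instead of re-importing a monthly CSV.
    let query = TransactionQuery(limit: 50)
    return try await store.transactions(query: query)
}
```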

Permalink