Posts tagged with "developer tools"

App Store Connect Adds New Tools for Developers to Promote Their Apps

App Store Connect, the web app developers use to submit their apps to Apple’s App Store and manage them, was updated yesterday with new tools for promoting those apps.

Developers have been able to submit promotional requests to Apple for quite some time, but the new Featuring Nomination process is now baked right into App Store Connect. Nominations are submitted from within App Store Connect, where developers are asked for information about their app. Nominations can be made for events such as a new app launch or the addition of in-app content and features. When an app is chosen by the App Store editorial team for a feature, developers will be notified in App Store Connect, too.

App Store Connect has also added the ability to generate promotional materials. The assets created can be used on social media and other platforms to promote app launches and other significant events.

These new App Store Connect tools promise to make promoting apps more convenient by including the Featuring Nomination process alongside other aspects of app submission. However, I expect it’s the ready-made promotional assets that are the more significant addition for smaller developers who may not have the budget or skills to create the materials themselves.


A Look at Code Completion and Swift Assist Coming in Xcode 16

Earlier today, I got the very first live demo of Swift Assist, one of the many developer tools Apple introduced. I also saw code completion in action. It was an impressive demo, and although the tools seem like magic and will undoubtedly be valuable to developers, they do have limitations, which are worth exploring.

Code Completion in Action. Source: Apple.

First, from what I could tell, code completion works extremely well. The demo I saw was of a simple restaurant app that displayed a menu. As an Apple representative typed variables and other items into Xcode, code completion named things in ways that made sense for a restaurant menu, such as Name, Price, and Calories. The feature also filled in types like strings, integers, and bools, along with the appropriate surrounding syntax.

In most cases, after typing just a handful of characters, the correct suggestion appeared, and with a quick tap of the Tab key, the rest of the line of code was filled in. When the suggestion wasn’t what was wanted, a little additional typing steered the AI behind code completion to the correct solution.
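To give a sense of what the demo produced, here’s a rough Swift sketch of the kind of model that was built on screen. The type and property names are my reconstruction of what I saw, not Apple’s actual sample code.

```swift
// A rough reconstruction of the kind of model built in the demo.
// The names below are my own approximation of what was shown,
// not Apple's sample code.
struct MenuItem {
    let name: String
    let price: Double
    let calories: Int
}

struct Menu {
    let items: [MenuItem]

    // Typing just the first few characters of a method like this was
    // enough for code completion to suggest the rest of the line,
    // accepted with a tap of the Tab key.
    func items(under maxCalories: Int) -> [MenuItem] {
        items.filter { $0.calories <= maxCalories }
    }
}
```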

The model that drives code completion is trained specifically for the Swift programming language and Apple’s APIs. It runs locally on a developer’s Mac, enhancing privacy and ensuring that it’s available regardless of Internet connectivity. Although Apple was vague about the code the model was trained on, it was clear from my briefing that it wasn’t trained on Apple’s own internal code; Apple said only that it is code the company is authorized to use. I was also told that the model isn’t trained on the code of the developers who use the feature. Also worth noting: Apple’s code completion model is updated continually, independent of Xcode’s own release cycle.


Apple Highlights Its 2023 Developer Programs

Every year, Apple recaps the programs and other services it has launched for developers. Ever since I started covering Apple, there’s been a certain amount of tension between it and its developers. This year, that strain is running higher than I’ve ever seen, at least among the solo and small developer teams we typically cover.

However, it can simultaneously be true that Apple provides valuable resources for developers and that those resources are constantly evolving. In a press release today, Apple highlights the following developer programs from 2023:

This year, Apple has also updated the Apple Developer Forums in advance of WWDC and rolled out Pathways, a collection of videos, documentation, and other resources focused on core topics like Design, Swift, SwiftUI, Games, visionOS, and App Store distribution.

There are a lot of great resources here. Far more than when I was learning to code around 2015. I’m particularly intrigued by Pathways, which looks as though it does an excellent job of pulling together materials that would otherwise require developers to consult multiple sources.


A TestFlight Update: Patched, But Still Broken

Just over a year ago, I wrote about the poor performance of TestFlight, the app that App Store developers rely on for beta testing their apps. Today, thanks to a couple of rounds of Feedback submissions, TestFlight is working better than before, but it’s not fixed. With WWDC around the corner, I thought I’d provide a quick update and share a few suggestions for fixes and features I’d like to see Apple implement.

One of the benefits of writing about TestFlight last year was that it became clear to me that, although my use of the app was unusual, I wasn’t alone. Other writers who test a lot of apps, as well as super fans who love trying the latest versions of their favorite apps, got in touch to share similar experiences, which convinced me that the issue was related to the number of betas I had in TestFlight. My experience was one of the worst, but with others in a similar boat, I took the time to file a Feedback report to see if anything could be done to improve TestFlight.

An example of a beta app set to automatically update. But at least on my iPhone, none do.

That initial Feedback attempt ultimately went nowhere. Then, I got busy and resigned myself to getting by as best I could. However, getting by was no longer an option as the Vision Pro’s release date approached. That added a significant number of new betas to my TestFlight collection. By March, the Mac version of TestFlight had stopped working entirely. With apps lined up in my review queue, that posed a problem I couldn’t work around.

I removed inactive betas using my iPhone and removed myself from testing as many active betas as I could bear. However, nothing worked, so I filed another report with the black box known as Feedback. Fortunately, this time, it worked. After some back-and-forth sharing logs and screen recordings of TestFlight failing to load any content, I received a message that something had been adjusted on Apple’s end to shake things loose. Just like that, TestFlight was working again, although sluggishly.

TestFlight once again loads betas on my Mac, but not always with icons.

My immediate problem is fixed, and I’ve been managing old betas more carefully to avoid a repeat of what happened on the Mac. However, it’s clear that TestFlight needs more than the quick fix that solved the worst of my problems. First of all, although TestFlight works again on my Mac, it’s slow to load on every OS and clearly needs work to handle large beta collections more gracefully. There’s also plenty of other low-hanging fruit that would improve beta management across platforms, including:

  • the addition of a search field to make it easier to quickly locate a particular app
  • sorting by multiple criteria like developer, app name, and app category
  • filtering to allow users to only display installed or uninstalled betas
  • a single toggle in the Settings app to turn off all existing and future email notifications of new beta releases
  • attention to the automatic installation of beta updates, which has never worked consistently for me
  • a versioning system that allows users to see whether the App Store version of an app has caught up to its beta releases
  • automatic installation of betas after an OS update or ‘factory restore’ because currently, those apps’ icons are installed, but the apps aren’t usable until they’re manually re-installed from TestFlight one by one

It’s time for Apple to spend some time updating TestFlight beyond the band-aid fix that got it working again for me. It’s been a full decade since Apple acquired TestFlight. Today, the app is crucial to iOS, iPadOS, watchOS, and visionOS development, and while it’s not as critical to macOS development, it’s used more often than not by Mac developers, too. Apple has gone to great lengths to explain the benefits of its developer program to justify its App Store commissions generally and the Core Technology Fee in the EU specifically. TestFlight is just one piece of that program, but it’s an important one that has been neglected for too long and no longer squares with the company’s professed commitment to developers.


NVIDIA Introduces Remote Scene Rendering for Vision Pro Development

NVIDIA is in the midst of its 2024 GTC AI conference, and among the many posts the company published yesterday was a bit of news about the Apple Vision Pro:

Announced today at NVIDIA GTC, a new software framework built on Omniverse Cloud APIs, or application programming interfaces, lets developers easily send their Universal Scene Description (OpenUSD) industrial scenes from their content creation applications to the NVIDIA Graphics Delivery Network (GDN), a global network of graphics-ready data centers that can stream advanced 3D experiences to Apple Vision Pro.

That’s a bit of an NVIDIA word salad, but what they’re saying is that developers will be able to take immersive scenes built using OpenUSD, an open standard for creating 3D scenes, render them remotely, and deliver them to the Apple Vision Pro over Wi-Fi.

What caught my eye about this announcement is the remote rendering and Wi-Fi delivery part. NVIDIA has been using its data centers to deliver high-resolution gaming via its GeForce NOW streaming service. I’ve tried it with the Vision Pro, and it works really well.

NVIDIA says:

The workflow also introduces hybrid rendering, a groundbreaking technique that combines local and remote rendering on the device. Users can render fully interactive experiences in a single application from Apple’s native SwiftUI and Reality Kit with the Omniverse RTX Renderer streaming from GDN.

That means visionOS developers will be able to offload the rendering of an immersive environment to NVIDIA’s servers while adding to the scene with Apple’s SwiftUI and RealityKit frameworks, which Apple and NVIDIA expect will create new opportunities for customers:

“The breakthrough ultra-high-resolution displays of Apple Vision Pro, combined with photorealistic rendering of OpenUSD content streamed from NVIDIA accelerated computing, unlocks an incredible opportunity for the advancement of immersive experiences,” said Mike Rockwell, vice president of the Vision Products Group at Apple. “Spatial computing will redefine how designers and developers build captivating digital content, driving a new era of creativity and engagement.”

“Apple Vision Pro is the first untethered device which allows for enterprise customers to realize their work without compromise,” said Rev Lebaredian, vice president of simulation at NVIDIA. “We look forward to our customers having access to these amazing tools.”

The press release frames this as a technology for enterprise users, but given NVIDIA’s importance to the gaming industry, I wouldn’t be surprised to see the new frameworks employed there, too. Also notable is the quote from Apple’s Mike Rockwell, given the two companies’ historically chilly relationship.
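To make the hybrid rendering idea a little more concrete, here’s a minimal visionOS sketch of the local half of such a scene. The SwiftUI and RealityKit code below uses Apple’s shipping APIs; the remotely streamed Omniverse content is represented only by a placeholder comment, since NVIDIA’s client framework isn’t something I’ve been able to test.

```swift
import SwiftUI
import RealityKit

// A minimal sketch of the locally rendered half of a hybrid scene on
// visionOS. The streamed Omniverse content is a placeholder below;
// NVIDIA's client API isn't part of Apple's SDK, so that piece is
// hypothetical.
struct HybridSceneView: View {
    var body: some View {
        RealityView { content in
            // Local, interactive RealityKit content rendered on device,
            // such as controls or annotations layered over the scene.
            let marker = ModelEntity(
                mesh: .generateSphere(radius: 0.05),
                materials: [SimpleMaterial(color: .blue, isMetallic: false)]
            )
            marker.position = [0, 1.2, -1]
            content.add(marker)

            // The photorealistic industrial scene itself would be streamed
            // from NVIDIA's Graphics Delivery Network via the Omniverse
            // Cloud APIs and composited here.
        }
    }
}
```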

Permalink

FinanceKit Opens Real-Time Apple Card, Apple Cash, and Apple Savings Transaction Data to Third-Party Apps

Ivan Mehta, writing for TechCrunch:

Apple’s iOS 17.4 update is primarily about adapting iOS to EU’s Digital Market Act Regulation. But the company has also released a new API called FinanceKit that lets developers fetch transactions and balance information from Apple Card, Apple Cash, and Savings with Apple.

If you use an Apple Card and a budgeting and financial tracking app, you’ll know why this is a big deal. I’ve been tracking my expenses with Copilot for over a year now, and I was pleased to see in Mehta’s story that Copilot, YNAB, and Monarch have teamed up with Apple to be the first third-party apps to use FinanceKit.

Before FinanceKit, I could only track my Apple Card expenses by importing a CSV file of my transactions once a month, when a new statement appeared in the Wallet app. Not only was that laborious, but it defeated the purpose of an app like Copilot, which otherwise lets you see where you stand with your budget in real time. The process was such a bad experience that I used my Apple Card a lot less than I would have otherwise. Now, those Apple Card transactions will be recorded in Copilot, YNAB, and Monarch as they’re made, just like any other credit card.
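For developers curious about what adopting the new API involves, here’s a minimal sketch of what fetching recent transactions with FinanceKit might look like. The query and property names reflect my reading of the framework and may differ slightly from the shipping API, so treat it as an illustration rather than production code.

```swift
import FinanceKit

// A minimal sketch of fetching recent Wallet transactions with FinanceKit.
// Names are based on my reading of the framework and may differ slightly
// from the shipping API; treat this as an illustration only.
struct WalletTransactionFetcher {
    func latestTransactions(limit: Int = 50) async throws -> [Transaction] {
        let store = FinanceStore.shared

        // The user must explicitly grant access to their financial data.
        let status = try await store.requestAuthorization()
        guard status == .authorized else { return [] }

        // Fetch the most recent Apple Card, Apple Cash, and Savings
        // transactions, newest first.
        let query = TransactionQuery(
            sortDescriptors: [SortDescriptor(\.transactionDate, order: .reverse)],
            predicate: nil,
            limit: limit
        )
        return try await store.transactions(query: query)
    }
}
```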

Permalink

The Best Way to Take Screenshots on Apple Vision Pro

Taking good-looking screenshots on the Apple Vision Pro isn’t easy, but it’s not impossible either. I’ve already spent many hours taking screenshots on the device, and I thought I’d share my experience and some practical tips for getting the best screenshots possible.

Although I’ve only had the Apple Vision Pro for a week, I’ve already spent a lot of time thinking about and refining my screenshot workflow out of necessity. That’s because after I spent around three hours writing my first visionOS app review of CARROT Weather and Mercury Weather, I spent at least as much time trying to get the screenshots I wanted. If that had been a review of the iOS versions of those apps, the same number of screenshots would have taken less than a half hour. That’s a problem because I simply don’t have that much time to devote to screenshots.

Taking screenshots with the Apple Vision Pro is difficult because of the way the device works. Like other headsets, the Apple Vision Pro uses foveated rendering, a technique that reduces the computing power needed to display the headset’s images. In practical terms, that means the only part of the device’s view that is in focus is where you’re looking. The focal point changes as your eyes move, so you don’t notice that part of the view is blurry. This mirrors how the human eye works, so as long as the eye tracking is good, which it is on the Apple Vision Pro, the experience is good, too.

However, as well as foveated rendering works for using the Apple Vision Pro, it’s terrible for screenshots. You can take a quick screenshot by pressing the top button and Digital Crown, but you’ll immediately see that everything except where you were looking when you took the screen-grab is out of focus. That’s fine for sharing a quick image with a friend, but if you want something suitable for publishing, it’s not a good option.

Fortunately, Apple thought of this, and there’s a solution, but it involves using Xcode and another developer tool. Of course, using Xcode to take screenshots is a little like using Logic Pro to record voice memos, except there are plenty of simple apps for recording voice memos, whereas Xcode is currently your only choice for taking crisp screenshots on the Vision Pro. So until there’s another option, it pays to learn your way around these developer tools to get the highest quality screenshots as efficiently as possible.


The Apple Vision Pro Developer Strap

Jeff Benjamin, writing for 9to5Mac, has a comprehensive breakdown of what the Apple Vision Pro Developer Strap can and can’t do. One of the primary benefits for developers is capturing video. As Benjamin writes:

The Developer Strap also lets developers capture a direct video feed from Apple Vision Pro via a wired USB-C connection using Reality Composer Pro. File transfers of the captured feed occur via the direct USB-C connection. Users without the strap can still capture these feeds but via Wi-Fi only.

Benjamin also explains how to use the strap to access Recovery Mode:

You can also restore visionOS using Recovery Mode via the wired connection made possible by the Developer Strap. This includes downgrading from visionOS beta releases.

My experience is in line with Benjamin’s. The Developer Strap may make capturing short videos and screenshots easier, but it can’t do much else.

I will add, however, that I was contacted by a MacStories reader who tipped me off to one other thing the Developer Strap can do, which is act as a video source for QuickTime. This works a lot like capturing screenshots and video from an Apple TV via QuickTime, and the advantage is that you can capture more than the 60-second cap imposed by Reality Composer Pro. That’s great, except that the capture is foveated, meaning that the video recorded will be blurry everywhere except where you’re looking.

Permalink

Apple Offers USB-C Enabled Vision Pro Strap to Registered Developers

Apple is offering a new Vision Pro accessory to registered developers: a head strap with a USB-C connector for $299. There aren’t a lot of details about the strap, which is designed to be connected to a Mac to accelerate Vision Pro development and testing, other than this description, which sits behind a developer account login:

Overview

The Developer Strap is an optional accessory that provides a USB-C connection between Apple Vision Pro and Mac and is helpful for accelerating the development of graphics-intensive apps and games. The Developer Strap provides the same audio experience as the in-box Right Audio Strap, so developers can keep the Developer Strap attached for both development and testing.

Tech specs

  • USB-C data connection
  • Individually amplified dual driver audio pods
  • Compatible with Mac

Although we haven’t been able to confirm the capabilities of the Developer Strap, USB-C may allow developers to connect the Vision Pro to their network over Ethernet or access external storage, for example.

Why is a USB-C dongle $299? It’s expensive, but as the description makes clear, it incorporates the speaker found in the Vision Pro’s right strap, which it replaces, and that explains at least part of the cost.