

Posts tagged with "api"

FinanceKit Opens Real-Time Apple Card, Apple Cash, and Apple Savings Transaction Data to Third-Party Apps

Ivan Mehta, writing for TechCrunch:

Apple’s iOS 17.4 update is primarily about adapting iOS to EU’s Digital Market Act Regulation. But the company has also released a new API called FinanceKit that lets developers fetch transactions and balance information from Apple Card, Apple Cash, and Savings with Apple.

If you use an Apple Card and a budgeting and financial tracking app, you’ll know why this is a big deal. I’ve been tracking my expenses with Copilot for over a year now, and I was pleased to see in Mehta’s story that Copilot, along with YNAB and Monarch, has teamed up with Apple to be among the first third-party apps to use FinanceKit.

Before FinanceKit, I could only track my Apple Card expenses by importing a CSV file of my transactions once a month, when a new statement appeared in the Wallet app. Not only was that laborious, but it defeated the purpose of an app like Copilot, which otherwise lets you see where you stand with your budget in real time. The process was such a bad experience that I used my Apple Card a lot less than I would have otherwise. Now, those Apple Card transactions will be recorded in Copilot, YNAB, and Monarch as they’re made, just like any other credit card.
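For developers, the integration looks pleasantly small. Here’s a rough sketch of fetching transactions with FinanceKit, based on the framework’s documented shape for iOS 17.4 – treat the exact signatures and query options as an approximation:

```swift
import FinanceKit

// Rough sketch: ask for permission, then pull recent Apple Card/Cash/Savings
// transactions. Signatures approximate FinanceKit's documented API for
// iOS 17.4 and may not match exactly.
func recentTransactions() async throws -> [FinanceKit.Transaction] {
    let store = FinanceStore.shared

    // The user must explicitly grant access to their financial data.
    guard try await store.requestAuthorization() == .authorized else { return [] }

    // Fetch the 50 most recent transactions across connected accounts.
    return try await store.transactions(query: TransactionQuery(limit: 50))
}
```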

Permalink

MacStories Starter Pack: Reverse-Engineering the Matter API and My ‘Save to Matter’ Shortcut

My Save to Matter shortcut.

Editor’s Note: Reverse-Engineering the Matter API and My ‘Save to Matter’ Shortcut is part of the MacStories Starter Pack, a collection of ready-to-use shortcuts, apps, workflows, and more that we’ve created to help you get the most out of your Mac, iPhone, and iPad.

For the past few months, I’ve been enjoying and keeping an eye on the development of Matter, a new read-later service that aims to combine a powerful text parser with elegant design, social discovery features, annotations, and the ability to listen to articles as audio. I’m not one to typically care about the latest VC-backed startup that promises to revolutionize reading articles with social features, but Matter struck me for a few reasons: the app’s reader mode is gorgeous; the ability to annotate articles with highlights is great; and, more importantly, it has the best, most human-sounding text-to-audio conversion engine I’ve ever tested.

Something else happened a few months ago: Matter introduced an official plugin to sync your article highlights as Markdown notes to Obsidian. Integration with PKM-style apps is a hot trend right now in the modern crop of read-later services (John covered this very topic here), so I wasn’t shocked to see that Matter joined Readwise in supporting Obsidian with a plugin. Something about it piqued my interest though:

If Matter didn’t have a public API, how could the Obsidian plugin even sync to the Matter service?

Obviously, there had to be an API involved behind the scenes, which Matter hadn’t announced yet, but which I could potentially reverse-engineer and integrate with Shortcuts. And that’s exactly what I’ve been doing for the past month.

My experiments with the still-unannounced Matter API have developed on three separate fronts, and I’m going to share the results in three different places:

  • Today on MacStories, I’m going to share a one-click shortcut called Save to Matter that lets you save any article to your Matter queue directly from the share sheet or anywhere else on iOS, iPadOS, or macOS without having to use the Matter extension;
  • Tomorrow on MacStories Weekly for Club MacStories members, I will share MatterBot, an advanced Matter shortcut that lets you take complete control over your Matter queue with support for exporting annotations as Markdown or even downloading articles as MP3 files;
  • Next week for Club MacStories+ and Premier members only, I will share MatterPod, another advanced shortcut that lets you turn your Matter queue into a Matter podcast feed hosted on your own web server.

Before we dive in, I also want to confirm that I privately reached out to the folks at Matter weeks ago about my experiments, and they were cool with me writing about my findings and sharing shortcuts I’ve built for the Matter API.

With that being said, let’s take a look at how you can get started with the Matter API and the Save to Matter shortcut.
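At a high level, saving an article boils down to a single authenticated HTTP request – the same kind of call the Obsidian plugin makes behind the scenes. Since the API is still unannounced, the endpoint and field names in this sketch are invented placeholders, not Matter’s real interface; it’s only meant to illustrate the shape of the request the shortcut builds:

```swift
import Foundation

// Illustrative only: Matter's API is unannounced, so this endpoint and the
// field names are invented placeholders, not the real interface.
var request = URLRequest(url: URL(string: "https://matter-api.example.invalid/save")!)
request.httpMethod = "POST"
request.setValue("Bearer YOUR_TOKEN", forHTTPHeaderField: "Authorization") // token the app authenticates with
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
request.httpBody = try? JSONEncoder().encode(["url": "https://example.com/article"])

URLSession.shared.dataTask(with: request) { _, response, _ in
    print(response ?? "no response") // a 2xx status would mean the article was queued
}.resume()
```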



Feedly Opens Up API

From the Feedly blog:

Millions of users depend on their feedly for inspiration, information, and to feed their mind. But one size does not fit all. Individuals have different workflows, different habits, and different devices. In our efforts to evolve feedly from a product to a platform, we have therefore decided to open up the feedly API. Effective immediately, developers are welcome to deliver new applications, experiences, and innovations via the feedly cloud. We feel strongly that this will help to accelerate innovation and better serve our users.

API documentation here. I’m looking forward to playing with this in the next couple of weeks.
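As a quick taste, here’s a minimal sketch of hitting the feedly cloud API from Swift – fetching the signed-in user’s profile, assuming you’ve obtained an OAuth access token through feedly’s developer program:

```swift
import Foundation

// Minimal sketch: fetch the signed-in user's feedly profile.
// YOUR_ACCESS_TOKEN stands in for an OAuth token from feedly's developer program.
var request = URLRequest(url: URL(string: "https://cloud.feedly.com/v3/profile")!)
request.setValue("OAuth YOUR_ACCESS_TOKEN", forHTTPHeaderField: "Authorization")

URLSession.shared.dataTask(with: request) { data, _, error in
    guard let data, error == nil,
          let profile = try? JSONSerialization.jsonObject(with: data) else { return }
    print(profile) // user id, email, and connected services
}.resume()
```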

Permalink

Philips Releases Hue API and iOS SDK


As reported by TechCrunch, Philips has released an API and iOS SDK for the hue, the company’s wireless lighting system that gives users control through an iOS app.

“We’re now at a point where there are already about 10 applications that have been shared and built from the unofficial developer community for new applications around Hue,” explained George Yianni, Hue System Architect, in an interview. “Now what we want to do as Philips is we actually want to help and grow and encourage this community, and give them tools and proper documentation. Also, we want to give them commitment that this is the API and we’re going to support it and it won’t change overnight.”

Prior to the official release of an API and SDK, third-party developers had already reverse-engineered Philips’ apps to create their own solutions to control hue’s system (based on a “bridge” that communicates with the actual lightbulbs). An iOS app called Ambify lets users pair their music with hue lights; here at MacStories, I linked to a video back in November showing an unofficial hue Python library that could work with Pythonista to automate the process of switching lights on and off.

The API opens a lot of interesting possibilities for third-party software and hardware makers. The hue already shipped with its own options for remote control and “presets” (called “recipes”) for different lighting settings aimed at providing users with ways to easily replicate specific color combinations based on photos (available in the app’s photo library) or targeted towards lifestyle improvements (such as waking users up in the morning with a gradual light increase).

With an SDK and API, developers can now take advantage of these concepts: aside from the “simple” remote control features, imagine apps that could activate specific hue settings when you’re reading or watching a movie, parse voice-based commands with dictation, or integrate with an iOS device’s Reminders, Calendars, or Location Services. On the hardware side, it should be possible – at least in theory – to develop gadgets capable of combining personal data with hue to leverage Philips’ “smart” lighting system in completely new ways. An obvious implementation would be for health and fitness-monitoring accessories such as Nike’s FuelBand; as far as rumors go, an Apple iWatch could integrate with hue to exchange a user’s data and personal stats (Apple isn’t new to third-party collaborations of this kind).
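The API itself is refreshingly simple: the bridge exposes a local REST interface, and changing a light’s state is a single JSON request. A minimal sketch – the bridge IP and username below are placeholders you’d get from your own setup:

```swift
import Foundation

// Minimal sketch of the bridge's local REST API: turn light 1 on, set its
// brightness, and shift it toward blue. "192.168.1.2" and "newdeveloper" are
// placeholders for your bridge's IP and an authorized username.
let url = URL(string: "http://192.168.1.2/api/newdeveloper/lights/1/state")!
var request = URLRequest(url: url)
request.httpMethod = "PUT"
request.httpBody = try? JSONSerialization.data(
    withJSONObject: ["on": true, "bri": 200, "hue": 46920] // hue 46920 ≈ blue
)

URLSession.shared.dataTask(with: request) { data, _, _ in
    if let data { print(String(data: data, encoding: .utf8) ?? "") } // bridge's success/error report
}.resume()
```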

Right now, Philips’ hue API is promising and shows great potential for more forward-thinking software and hardware implementations. You can read more on Philips’ website.


The Siri API

Samuel Iglesias has written an excellent post detailing the (possible) challenges developers will have to cope with if Apple decides to release a Siri API.

The second half of Siri integration, Semantics, is the tricky part: something that most iOS developers have never dealt with. Semantics will attempt to capture the various ways a user can ask for something, and, more importantly, the ways Siri, in turn, can ask for more information should that be required. This means that developers will need to imagine and provide “hints” about the numerous ways a user can ask for something. Sure, machine learning can cover some of that, but at this early stage Siri will need human supervision to work seamlessly.

This is exactly what I have been wondering since speculation on the Siri API started last year. How will an app be capable of telling Siri the kinds of input (read: natural language) it accepts? Will developers have to do it manually? Will Apple provide a series of automated tools to associate specific features (say, creating a task in OmniFocus) with common expressions and words? And how is Apple going to look into the natural language processing developers will implement in their apps?
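To make the question concrete, here’s a purely speculative sketch – none of this API exists – of what declaring those “hints” might look like: an app registering phrase templates that Siri could match a user’s utterances against.

```swift
// Purely speculative: no Siri API exists, so this just models the "semantic
// hints" idea in plain Swift – an app declaring the phrasings it understands.
struct SemanticHint {
    let templates: [String]                    // e.g. "add {task} to {project}"
    let handler: ([String: String]) -> Void    // called with the extracted slots
}

let createTask = SemanticHint(
    templates: ["add {task} to {project}", "remind me to {task}"],
    handler: { slots in
        print("Creating '\(slots["task"] ?? "")' in '\(slots["project"] ?? "Inbox")'")
    }
)
```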

Of course, the Siri API is still at the speculation stage, but it does make sense to greatly expand upon Siri’s capabilities as an assistant capable of working with any app. The TBA sessions at WWDC are intriguing, and Tim Cook said we’ll be pleased with the direction they’re taking with Siri. Right now, I’d say integrating with third-party software would be a fantastic direction.

Permalink

iCloud’s First Six Months: The Developers Weigh In

On October 12th, 2011, iCloud launched to millions of iOS users impatiently waiting to get their devices syncing with Apple’s new platform, which CEO Tim Cook went on to call the company’s next big insight for the next decade. Six months and 85 million customers later, iCloud has proven to be a substantial improvement for syncing a user’s email, contacts, and other data accessed by Apple’s apps. For third-party developers, however, adoption of iCloud sync and storage features has turned out to be a bit trickier, and possibly less intuitive than Apple’s own implementation, due to the early nature of the platform.


Instagram’s New Experiment: Open Up The API for Third-Party Uploads

Hipstamatic, a photo sharing app for the iPhone that allows users to apply vintage/analog effects and filters to their photos, has become the first app to directly integrate with Instagram. The popular iPhone-only sharing service, now boasting over 27 million users and on the verge of releasing an Android app, has so far allowed third-party developers to integrate their apps with the Instagram API only to display a user’s photos or feed. The API hasn’t allowed for the creation of real Instagram clients on other devices, as uploading could be done exclusively through Instagram’s own app.
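For context, this is roughly what the read-only side of Instagram’s API allowed until now – a sketch of fetching the authenticated user’s feed, with the access token standing in for credentials obtained through Instagram’s developer program:

```swift
import Foundation

// Sketch of the read-only side of Instagram's v1 API: fetch the signed-in
// user's feed. YOUR_ACCESS_TOKEN is a placeholder for an OAuth token.
let url = URL(string: "https://api.instagram.com/v1/users/self/feed?access_token=YOUR_ACCESS_TOKEN")!

URLSession.shared.dataTask(with: url) { data, _, _ in
    guard let data,
          let json = try? JSONSerialization.jsonObject(with: data) as? [String: Any],
          let items = json["data"] as? [[String: Any]] else { return }
    for item in items {
        print(item["link"] ?? "") // each item includes image URLs, caption, and likes
    }
}.resume()
```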

Today, however, an update to Hipstamatic and a collaboration between the two services first reported by Fast Company might signal an important change in Instagram’s direction and nature as a photo sharing service. The new Hipstamatic, available now on the App Store, comes with a redesigned “HipstaShare” system to send photos to various social networks including Facebook, Twitter, and Flickr. Among the supported services, a new Instagram option now enables you to log into your Instagram account and upload photos directly within Hipstamatic, without leaving the app. There is no “forwarding” of files to the Instagram app, nor does Hipstamatic ask you to download the Instagram app from the App Store – this is true uploading to Instagram done by a third party, via the API.

Unlike most photo sharing apps these days, Hipstamatic puts great focus on recreating the analog experience of shooting photos and carefully selecting the equipment you’d like to shoot with. With a somewhat accurate representation of vintage films, lenses, camera cases, and flash units, Hipstamatic appeals to the kind of user who isn’t simply interested in capturing a fleeting moment and sharing it in seconds; rather, as famous appearances in publications like The New York Times confirm, the Hipstamatic crowd is more of a passionate gathering of 4 million users looking to spend minutes, if not hours, trying to achieve the perfect setup for each occasion, spending one dollar at a time on in-app purchases that unlock different filters and “parts” of the cameras supported in Hipstamatic. Unlike Instagram or, say, Camera+, Hipstamatic isn’t built to shoot & share; the ultimate goal is undoubtedly sharing, but it’d be more appropriate to describe Hipstamatic’s workflow as “set up, shoot, then share”.

Hipstamatic seems to have realized, however, that sharing can’t be relegated to an accessory that takes second place behind the app’s custom effects and unlockable items. Whilst in-app purchases and fancy graphics may have played an important role in driving Hipstamatic’s success so far, apps can’t go without a strong sharing and social foundation nowadays, and since its launch two years ago, Instagram has seen tremendous growth despite being an iPhone-only app. With this update, Instagram and Hipstamatic are doing each other a favor: Instagram gets to test the waters with an API that now allows for uploading through other clients that support similar feature sets; Hipstamatic maintains its existing functionality, but adds a new social layer that plugs natively into the world’s hottest photo sharing startup.

Looking at the terms of the “deal” (I don’t think any revenue sharing is taking place between the two parties), it appears both sides got the perks they wanted. This native integration comes with an Instagram icon in Hipstamatic’s new sharing menu, which, when tapped, will let you log into your account. Once active, each “Hipstaprint” (another fancy name for photos) can be shared on a variety of networks, with Facebook even supporting friend tagging. You can upload multiple photos at once if you want, too. In the sharing panel, you can optionally decide to activate “equipment tagging” – this option will, alongside the client’s information, include #hashtags for the lens, film and other equipment that you use in your Hipstamatic camera.

On Instagram’s side, things get a little more interesting. Hipstamatic photos get uploaded respecting Instagram’s photo sizes, and they get a border around the image to, I guess, indicate their “print” nature. Together with the title, Instagram will display the aforementioned equipment tags and a “Taken with Hipstamatic” link that, when tapped, will ask you to launch Hipstamatic. If you don’t have Hipstamatic installed on your iPhone, the link will take you to the app’s App Store page.

Overall, what really intrigues me about this collaboration isn’t the Hipstamatic update per se – version “250” of the app is solid and well-built, but I don’t use Hipstamatic myself on a regular basis, as I prefer more direct tools like Instagram, Camera+, or even the Facebook app for iOS. What I really think could be huge, both for the companies involved and the users, is the API that Hipstamatic is leveraging here. Hipstamatic is doing the right thing: sharing has become a fundamental part of the mobile photo taking process, and it would be foolish to ignore Instagram’s popularity and come up with a whole new network.

Instagram, on the other hand, is taking an interesting path (no pun intended) that, sometime down the road, might turn what was once an iPhone app into a de-facto option for all future social sharing implementations. A few months from now, would it be crazy to think Camera+ could integrate with Instagram to offer native uploads? Or to imagine built-in support for Instagram photo uploads in, say, iOS, Twitter clients, and other photo apps? I don’t think so. Just as “taken with Hipstamatic” stands out in today’s Instagram feeds, “Upload to Instagram” doesn’t sound too absurd at this point.


Face Detection Technology And APIs Make Their Way Into iOS 5

After doing some digging in iOS 5, 9to5 Mac today reported that Apple is planning to open up face detection APIs to developers. From what they found, it appears Apple will bring the face detection techniques that Photo Booth on Lion currently employs to iOS, allowing developers to build apps that utilize the APIs.

These claims come after 9to5 Mac found the ‘CIFaceFeature’ and ‘CIDetector’ APIs within a recent beta build of iOS 5, which they describe as “very advanced” APIs. The first of the two can be used by developers to locate a person’s mouth and eyes, whilst the latter processes images for face detection. Apple’s online Developer Library already has some notes on the new APIs for developers to take advantage of.
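Based on those Developer Library notes, the two classes work together in a straightforward way: you create a CIDetector for faces, run it over a CIImage, and get back CIFaceFeature objects describing each face. A quick sketch:

```swift
import CoreImage

// Quick sketch: run Core Image's face detector over a photo and read back the
// located features. "photo.jpg" is a placeholder path.
let image = CIImage(contentsOf: URL(fileURLWithPath: "photo.jpg"))!
let detector = CIDetector(ofType: CIDetectorTypeFace,
                          context: nil,
                          options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])

for case let face as CIFaceFeature in detector?.features(in: image) ?? [] {
    print("Face at \(face.bounds)")
    if face.hasLeftEyePosition { print("  left eye: \(face.leftEyePosition)") }
    if face.hasMouthPosition { print("  mouth: \(face.mouthPosition)") }
}
```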

Obviously, at this point there is only speculation as to what these purported APIs could be used for – an obvious guess is that the iPad version of Photo Booth may gain the effects added to the Lion version, which take advantage of information about where a person’s facial features are. It is also claimed that Polar Rose, a facial recognition software company purchased by Apple last year, played a role in the development of these technologies within iOS and Lion.

[Via 9to5 Mac]


Carousel Is A Beautiful Instagram Client for Mac

Back in April we covered Instadesk, the first Instagram client for Mac that, through an interface design similar to iTunes and iPhoto, allowed you to browse Instagram photos, users, likes, and comments directly from your desktop. The app was one of thousands built on the Instagram API, a set of tools that enables third-party developers to plug into your Instagram feed and retrieve photos uploaded by you or relevant to you. Of all the Instagram-connected apps we’ve covered, Instadesk saw huge success as the first to land on the Mac App Store.

Carousel, however, wants to step up the game by offering a beautiful and slick way to access Instagram from your Mac, with a design that’s heavily inspired by iOS yet runs natively on OS X. I don’t know if the developers are using the Iconfactory’s Chameleon framework for this, but Carousel certainly has some similarities with Twitterrific – the Twitter client from the Iconfactory that shares its codebase across the Mac, iPhone, and iPad. So what’s this all about? First off, Carousel presents a minimal, vertically oriented interface, as if you were looking at your iPhone’s screen in portrait mode while browsing Instagram. The photo stream is embedded directly into the app’s window, with beautiful Instagram photos to flick through as they load. At the bottom, three tabs let you switch between your feed, popular photos, and your profile. Every photo can be enlarged via Quick Look, saved locally on your Mac, or commented on and liked thanks to a wide selection of keyboard shortcuts.

In Carousel, you can open every user’s profile to check out their photos. You can comment on and like pictures, too, with interaction happening inside an iOS-like popover that resembles Twitterrific’s implementation of conversation views and profiles. You can even see whether a user is following you, or whether you’re following them. Lastly, the app can be themed: Carousel’s default theme is already gorgeous in my opinion, but you can switch to a classic Mac or red theme from the Settings.

Carousel can be downloaded for free, or you can purchase a license for $4.99 (introductory price) from the developer’s website. More screenshots below.