Posts tagged with "siri"

Siri Vs. Google Voice Search, Four Months Later

Rob Griffiths, comparing Siri to Google Voice Search at Macworld:

Because of the speed, accuracy, and usefulness of Google’s search results, I’ve pretty much stopped using Siri. Sure, it takes a bit of extra effort to get started, but for me, that effort is worth it. Google has taken a key feature of the iOS ecosystem and made it seem more than a little antiquated. When your main competitor is shipping something that works better, faster, and more intuitively than your built-in solution, I’d hope that’d drive you to improve your built-in solution.

When the Google Search app was updated with Voice Search in October 2012, I concluded by saying:

Right now, the new Voice Search won’t give smarter results to international users, and it would be unfair to compare it to Siri, because they are two different products. Perhaps Google’s intention is to make Voice Search a more Siri-like product with Google Now, but that’s another platform, another product, and, ultimately, pure speculation.

When Clark Goble posted his comparison of Siri Vs. Google Voice Search in November, I summed up my thoughts on the “usefulness” of both voice input solutions:

I’m always around a computer or iOS device, and the only times when I can’t directly manipulate a UI with my hands is when I’m driving or cooking. I want to know how Siri compares to Google in letting me complete tasks such as converting pounds to grams and texting my girlfriend, not showing me pictures of the Eiffel Tower.

From my interview with John Siracusa:

And yet the one part of Google voice search that Google can control without Apple’s interference — the part where it listens to your speech and converts it to words — has much better perceptual performance than Siri. Is that just a UI choice, where Apple went with a black box that you speak into and wait to see what Siri thinks you said? Or is it because Google’s speech-to-text service is so much more responsive than Apple’s that Google could afford to provide much more granular feedback? I suspect it’s the latter, and that’s bad for Apple. (And, honestly, if it’s the former, then Apple made a bad call there too.)

Now, four months after Google Voice Search launched, I still think Google’s implementation is, from a user experience standpoint, superior. While it’s nice that Siri says things like “Ok, here you go”, I just want to get results faster. I don’t care if my virtual assistant has manners: I want it to be neutral and efficient. Is Siri’s distinct personality a key element to its success? Does the way Siri is built justify the fact that Google Voice Search is almost twice as fast as Siri? Or are Siri’s manners just a way to give some feedback while the software is working on a process that, in practice, takes longer than Google’s?

I still believe that Siri’s biggest advantage remains its deep connection with the operating system. Siri is faster to invoke and it can directly plug into apps like Reminders, Calendar, Mail, or Clock. Google can’t parse your upcoming schedule or create new calendar events for you. It’s safe to assume Apple’s policy will always preclude Google from having that kind of automatic, invisible, seamless integration with iOS.

But I have been wondering whether Google could ever take the midway approach and offer a voice-based “assistant” that also plays by Apple’s rules.

Example: users can’t set a default browser on iOS, but Google shipped Chrome as an app; the Gmail app has push notifications; Google Maps was pulled from iOS 6 and Google released it as a standalone app. What’s stopping Google from applying the same concept to a Google Now app? Of course, such an app would be a “watered down” version of Google Now for Android, but it could still request access to your local Calendar and Reminders like other apps can; it would be able to look into your Contacts and location; it would obviously push Google+ as an additional sharing service (alongside the built-in Twitter and Facebook). It would use the Google Maps SDK and let users open web links in Google Chrome. Search commands would be based on Voice Search technology, but results wouldn’t appear in a web view under a search box – it would be a native app. The app would be able to create new events with or without showing Apple’s UI; for Mail.app and Messages integration, it would work just like Google Chrome’s Mail sharing: it’d bring up a Mail panel with the transcribed version of your voice command.

Technically, I believe this is possible – not because I am assuming it, but because other apps are doing the exact same thing, only with regular text input. See: Drafts. What I don’t know is whether this would be in Google’s interest, or if Apple would ever approve it (although, if based on publicly-available APIs and considering Voice Search was approved, I don’t see why not).

If such an app ever comes out, how many people would, like Rob, “pretty much stop using Siri”? How many would accept the trade-off of a less integrated solution in return for more speed and reliability?

An earlier version of this post stated that calendar events can’t be created programmatically on iOS. They can, in fact, be created without having to show Apple’s UI, as apps such as Agenda and Fantastical have shown.
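As a rough illustration of that kind of programmatic access – a minimal sketch of my own in modern Swift, using the public EventKit framework, with placeholder values – creating an event without showing any Apple UI looks roughly like this:

import EventKit

let store = EKEventStore()
// The user has to grant calendar access before anything can be saved.
store.requestAccess(to: .event) { granted, _ in
    guard granted else { return }
    let event = EKEvent(eventStore: store)
    event.title = "Dinner"                                   // placeholder values
    event.startDate = Date().addingTimeInterval(3600)
    event.endDate = event.startDate.addingTimeInterval(3600)
    event.calendar = store.defaultCalendarForNewEvents
    // Saving goes straight to the user's calendar; no Apple-provided UI is involved.
    try? store.save(event, span: .thisEvent)
}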

Permalink

iWatch Potential

Bruce “Tog” Tognazzini, Apple employee #66 and founder of the Human Interface Group, has published a great post on the potential of the “iWatch” – a so-called smartwatch Apple could release in the near future (via MG Siegler). While I haven’t been exactly excited by the features offered by current smartwatches – namely, the Pebble and other Bluetooth-based watches – the possibilities explored by Bruce made me think about a future ecosystem where, essentially, the iPhone will “think” in the background and the iWatch will “talk” directly to us. I believe that having bulky smartwatches with high-end CPUs won’t be nearly as important as ensuring a reliable, constant connection between lightweight wearable devices and the “real” computers in our pocket – smartphones.

The entire post is worth a read, so I’ll just highlight a specific paragraph about health tracking:

Having the watch facilitate a basic test like blood pressure monitoring would be a god-send, but probably at prohibitive cost in dollars, size, and energy. However, people will write apps that will carry out other medical tests that will end up surprising us, such as tests for early detection of tremor, etc. The watch could also act as a store-and-forward data collector for other more specialized devices, cutting back the cost of specialized sensors that would then need be little more than a sensor, a Blue Tooth chip, and a battery. Because the watch is always with us, it will be able to deliver a long-term data stream, rather than a limited snapshot, providing insight often missing from tests administered in a doctor’s office.

Dealing with all sorts of blood, temperature, and pressure tests on a regular basis, I can tell you that building data sets that span weeks and months – “archives” of a patient with graphs and charts, for instance – still involves too much friction. Monitoring blood pressure is still done with dedicated devices that most people don’t know how to operate. But imagine accurate, industry-certified, low-energy sensors capable of monitoring this kind of data and sending it back automatically to an iPhone for further processing, and you can see how friction could be removed while a) making people’s lives better and b) building data sets that don’t require any user input (you’d be surprised to know how much data can be extrapolated from the combination of “simple” tests like blood pressure monitoring and body temperature).

The health aspect of a possible “iWatch” is just one facet of a device that Apple may or may not release any time soon. While I’m not sure about some of the ideas proposed by Bruce (passcode locks seem overly complex when the devices themselves could have biometric scanners built-in; Siri conversations in public still feel awkward and the service is far from responsive, especially on 3G), I believe others are definitely in the realm of the technologically feasible and genuinely beneficial to users (and Apple). Imagine crowdsourced data from the iWatch when applied to Maps, or the iWatch being able to “tell us” about upcoming appointments or reminders when we’re driving so we won’t have to reach for an iPhone (combine iWatch vibrations and an “always-on” display with Siri Eyes Free and you get the idea).

As our iPhones grow more powerful and connected with each generation, I like to think that, in a not-so-distant future, some of that power will be used to process data from wearable devices that have a more direct connection to us and the world around us.

Permalink

Open Google Maps Directions With Siri or Launch Center Pro

Here’s a fun experiment to launch the Google Maps app via URL scheme directly into a new Directions view.

As I detailed this morning, the new Google Maps app for iPhone lets you launch specific views and modes using a URL scheme. You don’t need to be a developer to use the URL scheme; this means you’ll be able to launch the Google Maps app from Safari, Launch Center Pro, or any other launcher using the base comgooglemaps:// URL.

Google’s URL scheme supports directions with address and transportation parameters. It lets you specify a starting address with the saddr parameter, and a destination address with daddr.

Further, you can add a directionsmode parameter to open the directions in a specific mode, such as driving or transit.
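To make that concrete, here’s a rough sketch of my own (not from the original post): the first line is the kind of URL you could paste as-is into Safari or a Launch Center Pro action, and the short Swift snippet shows how an app might open it. The addresses are placeholders, and opening the scheme from an app assumes the usual setup (such as whitelisting comgooglemaps in Info.plist on newer iOS versions).

import UIKit

// Example directions URL built from Google's documented parameters.
// Placeholder addresses; percent-encode real ones as needed.
let directions = "comgooglemaps://?saddr=Rome,+Italy&daddr=Viterbo,+Italy&directionsmode=driving"

// Opening it from an app; the same string works as-is in Safari or Launch Center Pro.
if let url = URL(string: directions), UIApplication.shared.canOpenURL(url) {
    UIApplication.shared.open(url)   // hands off to the Google Maps app
}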

With these parameters, it becomes possible to set up a nice automated workflow to launch directions using Siri or Launch Center Pro. Read more


Siri Vs. Google Voice Search

Clark Goble has posted an excellent review of Siri and Google Voice Search, taking into account the improvements Apple and Google made to their services in the past few weeks. His experience largely reflects mine: Siri is more useful to do stuff, Google is faster at transcribing dictated input and displaying search results.

That said Siri still has places it needs to improve. It really should speak far more results. For certain classes of queries Siri should display a large simple result and speak it rather than the stylized Wolfram result it now provides. Given that Siri already has started speaking more results, I suspect we’ll see that added over the next month. Siri also has a problem of not letting you speak over it. I’d like it to be able to let me answer before I have to listen to every part of the question she’s asking. Finally I think there are several types of queries Siri needs to be optimized for. Temperature conversions, forthcoming movie releases, television schedules, and time series sporting event statistics really are all things Siri needs to do better.

In October, I wrote:

Google Search for iOS doesn’t want to be Siri: after all, it can’t. It has some similar features (“Do I need a raincoat today?”), but it’s not an assistant. It couldn’t be per Apple’s restrictions, and Google isn’t even trying to posit it as a Siri replacement. It’s Voice Search.

I also agree with Clark in regard to the tests many people conduct to compare Siri to Google. I’m not interested in the funny and witty responses – as entertaining a demo as they are – because, after all, I want to know how voice works in real life. I’m always around a computer or iOS device, and the only times when I can’t directly manipulate a UI with my hands is when I’m driving or cooking. I want to know how Siri compares to Google in letting me complete tasks such as converting pounds to grams and texting my girlfriend, not showing me pictures of the Eiffel Tower.

As an Italian user, I have to say Siri still has a long way to go with a language that’s full of variables such as conjugations and pronouns. Some Italian responses are poorly worded (see screenshot above), and sentences containing a “that” are still transcribed literally. Sports results for Serie A lack player pictures and coach information, and results for the last match are displayed instead of rankings. Siri asks me if I mean “November 19th” or “November 20th” when I ask for “tomorrow” minutes after midnight on November 19th, but simply replying “the 19th” doesn’t work.

Italian Siri has also been getting better, though. It presents more results for businesses open at night in my area if I ask after 11 PM, and it appears to accept a more varied vocabulary for Reminders and Calendar integration. I can also attest that reliability has improved recently, but it’s still far from perfect.

If you want a balanced and detailed take on the differences between Siri and Google, read Clark’s review here.

Permalink

Apple’s Hire of William Stasior May Be for More than Just Search

Earlier this afternoon, AllThingsD’s Kara Swisher reported that Apple has hired Amazon executive William Stasior, who was in charge of Amazon’s A9, which focuses on product and visual search technologies. Swisher reports that Stasior will be working with the Siri team in his new position at Apple.

What’s more intriguing is what else Stasior might find himself working on — presumably, strengthening Apple’s search and search advertising technology in the wake of its increasing competition with Google.

“Apple’s search and search advertising technology” covers a broad swath of search that could be… well, anything. If I were to take a stab at what Apple might specifically want Stasior for, I’d look at one of the products A9 ended up introducing on the App Store. Flow Powered by Amazon is a visual search app that attempts to visually recognize and display relevant information about books, music, video games, and more by simply pointing your smartphone’s camera at the cover or UPC barcode. The app allows people to bring up product details and customer ratings by identifying the product’s packaging (it’s in the same vein as Google Goggles).

Siri would be well suited not just as a voice assistant, but as a visual assistant. Given Apple’s recent foray into books, magazines, and textbooks, using Siri to scan and subscribe to a magazine through your iPhone, get more information on a paperback, or find more novels by an author could be a possibility. I could see Apple offering album ratings for music from iTunes, or displaying rental fees when you scan the cover of a Blu-ray boxset. A9 also powers CloudSearch and Product Search at Amazon — I don’t see the hire being related to search advertising, but rather to product search as it applies to Apple’s digital ecosystems.

Permalink

Apple Details iOS 6 Feature Availability By Country

As noted by Horace Dediu, Apple has published an official list of iOS 6 feature availability on its website. While iOS 6 is officially coming out next week, on September 19th, not every feature will be available in every country.

The list focuses mainly on Siri, Maps, and Dictation. Availability of iTunes Store and App Store content is mentioned as well, but that’s not really new if you’ve been following the expansion of Apple’s digital storefronts over the past few months. What’s interesting ahead of iOS 6’s launch is the list of features that, due to content limitations or the “beta” nature of Siri, won’t be available in some parts of the world.

For instance, Maps’ “standard” operation will be available from Afghanistan to Zimbabwe for a total of 181 supported countries. This should include the “standard” view of Maps – the new tiles that Apple is using after removing Google’s tiles from iOS 6 entirely. Similarly, the Satellite view of Maps will be available in the same 181 countries worldwide. However, things start getting different with Maps’ Directions and Turn-by-Turn navigation: the former will be available in 96 countries, the latter in 56. 3D buildings, another feature of Maps, will only be available in the United States at launch, whereas Traffic information will be available in 23 countries. Lastly, Maps Local Search will be available in 49 countries, and Business Reviews and Photos in 15.

Siri is even more limited. In spite of the voice assistant gaining support for more languages in iOS 6, several functionalities and integrations will be limited by the user’s location. So, for instance, while everyone will be able to set Siri to a supported language and issue commands, Sports data will be limited to 15 countries; Twitter and Facebook integration to 14; Local Search and Restaurant Information to 10, but Restaurant Reviews will only be available through Siri in 9 countries and Reservations in 3 (USA, Canada, Mexico). Another Siri integration, Movies, will be limited to 13 countries for Movie Information, 4 for Reviews, and only 3 for showtimes.

As Apple embraces more third-party services in its operating systems, it’s no surprise that some features will be restricted to only the countries where those services are fully operational. The same happened with the first version of Siri last year – some commands were only supported in the United States initially.

Check out the full list of iOS 6 feature availability here.


The Rise Of Third Party Services And Fall Of Google In iOS

When Apple introduced iOS 6 to the world at this year’s WWDC, one of the most talked about moves was Apple’s decision to step away from their partnership with Google Maps and create their own maps app. In many respects, it wasn’t too surprising given the increasingly strained relationship between Apple and Google in the years since the iPhone launched and Google became a competitor with Android, but in recent weeks it was also revealed that YouTube will no longer be included as a pre-installed app as of iOS 6. That leaves Google Search as the only remaining Google service to be integrated into iOS. Yet whilst Apple has been severing its relationship with Google, it has been courting numerous other service providers and integrating them into iOS over the past few years.

Curious to visualise this information, I made a list of every notable service that has been integrated with iOS (and when) and then created the above graphic (click on it to view a larger version). When I had compiled the list, it was pretty compelling (and longer than I had realised), but I think the graphic takes it to the next level and really tells a story about iOS and Apple’s relationship with other services.

Read more


The Siri API

Samuel Iglesias has written an excellent post detailing the (possible) challenges developers will have to cope with if Apple decides to release a Siri API.

The second half of Siri integration, Semantics, is the tricky part: something that most iOS developers have never dealt with. Semantics will attempt to capture the various ways a user can ask for something, and, more importantly, the ways Siri, in turn, can ask for more information should that be required. This means that developers will need to imagine and provide “hints” about the numerous ways a user can ask for something. Sure, machine learning can cover some of that, but at this early stage Siri will need human supervision to work seamlessly.

This is exactly what I have been wondering since speculation on the Siri API started last year. How will an app be capable of telling Siri the kinds of input (read: natural language) it accepts? Will developers have to do it manually? Will Apple provide a series of automated tools to associate specific features (say, creating a task in OmniFocus) with common expressions and words? And how is Apple going to look into the natural language processing developers will implement in their apps?
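Purely to make those questions concrete – none of what follows exists in any Apple SDK; every type, method, and identifier below is invented for illustration – a phrase-hint registration might look something like this in Swift:

import Foundation

// HYPOTHETICAL sketch: these types are not a real API.
// They only illustrate the idea of "hints" mapping common phrases to an app action.
struct SiriActionHint {
    let action: String            // an app-defined action identifier
    let examplePhrases: [String]  // sample utterances supplied by the developer
}

let createTaskHint = SiriActionHint(
    action: "create-task",        // e.g. creating a task in OmniFocus
    examplePhrases: [
        "Add a task to OmniFocus",
        "Remind me in OmniFocus to buy milk",
        "Create an OmniFocus task called buy milk"
    ]
)

// A speculative registration call an app might make at launch; in a real API the
// system would combine these hints with machine learning to route matching speech.
func registerWithSiri(_ hints: [SiriActionHint]) {
    print("Registered \(hints.count) hint(s) – hypothetically.")
}

registerWithSiri([createTaskHint])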

Of course, the Siri API is still at the speculation stage, but it does make sense to greatly expand upon Siri’s capabilities as an assistant capable of working with any app. The TBA sessions at WWDC are intriguing, and Tim Cook said we’ll be pleased with the direction they’re taking with Siri. Right now, I’d say integrating with third-party software would be a fantastic direction.

Permalink

Apple Posts New Siri Ads Featuring John Malkovich

Following a series of “celebrity ads” for the iPhone 4S’ voice-based assistant released last month, Apple today posted two new Siri TV commercials featuring actor John Malkovich. The ads, titled “Joke” and “Life”, show Malkovich casually talking to Siri with short sentences and a series of single words such as “weather” or “evening”, perhaps in an effort to showcase both Malkovich’s particular attitude and Siri’s capability of handling short commands with seemingly no context (“evening” returns a series of calendar appointments, “linguica” displays local restaurants).

According to a recent study, the previous commercials featuring Zooey Deschanel and Samuel L. Jackson fared well with viewers, who, reportedly, were highly receptive to familiar faces of celebrities illustrating the latest features of the iPhone in a familiar, almost casual setting. MacRumors has put together a number of possible responses Siri can give to Malkovich’s query – tests performed with the question asked by Samuel L. Jackson showed that, in practice, Siri was a little less accurate than its primetime counterpart.

The new Siri ads are available on Apple’s website and YouTube channel, and we have embedded the official versions below.
Read more