Posts tagged with "google"

Inbox by Gmail to Add New Smart Reply Feature This Week

Inbox by Gmail is about to get a whole lot smarter this week with a new feature called Smart Reply. Bálint Miklós on the Official Gmail Blog explains:

Smart Reply suggests up to three responses based on the emails you get. For those emails that only need a quick response, it can take care of the thinking and save precious time spent typing. And for those emails that require a bit more thought, it gives you a jump start so you can respond right away.

The feature will be rolling out to the Inbox by Gmail app on iOS and Android later this week, though it will only work in English for now. Smart Reply uses machine learning to recognize which emails need responses and then generates up to three appropriate responses for the user to pick from. The Google Research Blog has more details on how the researchers got the feature to work.
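Google's real system is built on sequence-to-sequence neural networks (per the Research Blog post), which is far beyond a blog-sized snippet. Purely as a toy illustration of the idea – pick the most relevant short replies for a given email – here's a sketch that ranks a pool of canned responses by word overlap; the replies and scoring are entirely made up:

```python
# Toy sketch of the Smart Reply idea: rank a fixed pool of canned
# replies by word overlap with the incoming email and suggest the
# top three. Google's real system uses sequence-to-sequence neural
# networks; everything below is an invented illustration.

CANNED_REPLIES = [
    "Sounds good to me!",
    "Sorry, I can't make it.",
    "Thanks, I'll take a look.",
    "Yes, let's do it.",
    "No, thank you.",
    "I'll get back to you soon.",
]

def words(text):
    """Lowercase the text, strip basic punctuation, return a set of words."""
    for ch in "?!,.":
        text = text.replace(ch, " ")
    return set(text.lower().split())

def suggest_replies(email_text, n=3):
    """Return the n canned replies sharing the most words with the email."""
    email_words = words(email_text)
    scored = sorted(CANNED_REPLIES,
                    key=lambda reply: len(words(reply) & email_words),
                    reverse=True)
    return scored[:n]

suggestions = suggest_replies(
    "Can you take a look at this and let me know if it sounds good?")
print(suggestions)
```

A real ranker would also decide *whether* an email needs a reply at all before suggesting anything.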

And much like how Inbox gets better when you report spam, the responses you choose (or don’t choose!) help improve future suggestions. For example, when Smart Reply was tested at Google, a common suggestion in the workplace was “I love you.” Thanks to Googler feedback, Smart Reply is now SFW :)


Google’s App Indexing Adding Support for iOS 9 Universal Links

Google Developers, writing on the surprisingly still-around Google+:

Getting your app content found on Google just got easier. App Indexing is now compatible with HTTP deep link standards for iOS 9, as it has been on Android from the beginning. That means that you can start getting your app content into the Search results page on Safari in iOS, simply by adding Universal Links to your iOS app, then integrating with our SDK. With this improvement, we will no longer support new integrations on iOS 7 and iOS 8. Users will start seeing your app content in Safari on iOS at the end of October.

Google has additional documentation here. I’m glad they’re adding support for this relatively soon.
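The "Universal Links" half of this is standard iOS 9 plumbing: you host an `apple-app-site-association` JSON file over HTTPS at your domain's root, declaring which URL paths your app handles. A minimal example (the team ID, bundle ID, and paths here are placeholders):

```json
{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appID": "ABCDE12345.com.example.myapp",
        "paths": ["/articles/*", "NOT /articles/private/*"]
      }
    ]
  }
}
```

Google's SDK then sits on top of that so indexed app content can surface in Safari search results.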


Apple’s New ‘Move to iOS’ Android App

Zac Hall, writing for 9to5Mac on Apple’s first Android app to move user data to iOS:

Once you complete the selection process, the app creates a private Wi-Fi network used by both devices to wirelessly transfer content. After the transfer process is complete, Move to iOS will notify you if any content was not able to move to your new iPhone or iPad, then recommend recycling your old Android phone at a local Apple Store. After continuing the setup process on the iPhone or iPad, the settings and content should appear intact.

The process is integrated with iOS 9’s new setup flow – you get an option to import data from Android when setting up an iOS 9 device for the first time. The Android app is available here (and it’s got some… interesting customer reviews).


Google Photos Will Now Show You Photos and Videos From the Past

Sean O’Kane at The Verge:

The Google Photos app will now serve up cards in the “assistant view” that urge you to “rediscover this day,” and they can include photos, photo collages, or videos. The cards will tell you where you were and who you were with on that day, and the app also sticks a little graphic over everything that tells you which year it was from — another little bit that is extremely similar to Timehop.

The first rule of modern photo management services is that, sooner or later, they’re going to bring back a feature from Everpix. I used to love this in the defunct service; it makes sense for the Assistant view of Google Photos. It’s surprising to me that Apple still hasn’t added something like this to Photos (you can search for “one year ago”, but it’s not as precise or visible).
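The selection logic behind a "rediscover this day" card is easy to picture: match today's month and day against photos from earlier years. A minimal sketch, with invented filenames and no claim that this is how Google Photos actually works internally:

```python
from datetime import date

# Toy sketch of a "rediscover this day" feature: given a photo
# library keyed by capture date, pick the photos taken on today's
# month/day in earlier years, grouped by year.

def rediscover(library, today):
    """library: dict mapping datetime.date -> list of photo filenames."""
    matches = {}
    for taken, photos in library.items():
        if (taken.month, taken.day) == (today.month, today.day) \
                and taken.year < today.year:
            matches[taken.year] = photos
    return dict(sorted(matches.items()))

library = {
    date(2013, 5, 28): ["beach.jpg", "sunset.jpg"],
    date(2014, 5, 28): ["party.jpg"],
    date(2014, 11, 2): ["fireworks.jpg"],
}
print(rediscover(library, date(2015, 5, 28)))
# {2013: ['beach.jpg', 'sunset.jpg'], 2014: ['party.jpg']}
```

The year grouping is what powers the little "which year it was from" graphic the card overlays on everything.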


Google App for iOS Gets Support for Smarter “OK Google” Questions

Clever update to the Google app for iOS released today: because Google can’t replicate the system-wide Now on Tap overlay on iOS, they have enabled a similar experience for webpages displayed inside the app. Now, when you’re looking at a webpage that contains information you want to look up, you can say “OK Google” and ask a contextual question that Google will likely know the answer to.

I just took it for a spin, and I was able to get a smart answer for a webpage that mentioned Liam Gallagher (“when was he born”, I asked) and another for Everybody’s Gone to the Rapture (“when is the release date” was my question). This, of course, isn’t as flexible as Now on Tap’s deep integration with Android apps and the OS, but it can be handy to save a bit of time when browsing in the Google app.

The technical achievements of Google’s Now and smart answer technologies continue to impress me, although I wonder about their practicality for most people in everyday usage.


Google Starts Showing Busy Times for Places in Search

Here’s a clever addition to Google search results announced by the company today: you can now view busy times for places listed in search results. Google writes:

Now, you can avoid the wait and see the busiest times of the week at millions of places and businesses around the world directly from Google Search. For example, just search for “Blue Bottle Williamsburg”, tap on the title and see how busy it gets throughout the day. Enjoy your extra time!

This is already working in Safari for iOS (see screenshot above), and I’m going to use it a lot – my girlfriend and I are constantly checking out new places in Rome, and it helps to see the best times so we can avoid queues and possible parking issues around a store. I assume this uses the location shared by the Google app for iOS (which can keep track of where you go as it’s always running in the background) as well as Google Maps. Very cool, and I’d like to see it inside Google Maps as well.
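Presumably the underlying data is aggregated, anonymized visit history, reduced to a per-hour popularity curve for each place. As a toy sketch of that reduction (the visit data below is invented): bucket visit timestamps by hour and express each hour as a fraction of the day's peak.

```python
from collections import Counter

# Toy sketch of "busy times": bucket recorded visits by hour of day
# and report each hour's share of the peak hour. Made-up data;
# Google's real source is aggregated, anonymized location history.

def busyness_by_hour(visit_hours):
    """visit_hours: iterable of ints 0-23, one per recorded visit."""
    counts = Counter(visit_hours)
    peak = max(counts.values())
    return {hour: counts[hour] / peak for hour in sorted(counts)}

visits = [8, 9, 9, 12, 12, 12, 12, 13, 13, 18, 18, 19]
print(busyness_by_hour(visits))
# 12:00 is the peak (4 visits) and scores 1.0; 9:00 scores 0.5, etc.
```

Normalizing against the peak rather than showing raw counts is exactly what a bar-chart UI like Google's needs.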


Chrome for iOS Gets ‘Physical Web’ Support for Beacon Discovery

Chrome for iOS has been updated today with support for the Physical Web, an initiative aimed at interacting with beacons based on the new Eddystone protocol through webpages instead of apps. Chrome’s Today widget on iOS (previously used for opening tabs and starting voice searches) can now scan for nearby beacons broadcasting URLs and offer to open them in Chrome directly.

From the blog post:

When users who have enabled the Physical Web open the Today view, the Chrome widget scans for broadcasted URLs and displays these results, using estimated proximity of the beacons to rank the content. You can learn more about the types of user experiences that the Physical Web enables by visiting our cookbook and joining the open source community on GitHub.

This is Google’s attempt at improving upon one of the biggest shortcomings of Apple’s iBeacon: app discoverability. iBeacons can achieve great utility if an associated or compatible app is already installed on a user’s device and sends a notification, but iOS doesn’t have a simple, consistent way to browse nearby beacons and start interacting with them right away. With Eddystone and the Physical Web, Google is hoping the web can smooth the jump from the OS to a discovered smart device and its functionality. Here’s how they explain it:

The Physical Web is an approach to unleash the core superpower of the web: interaction on demand. People should be able to walk up to any smart device - a vending machine, a poster, a toy, a bus stop, a rental car - and not have to download an app first. Everything should be just a tap away.

Essentially, Google wants to give every smart device a web address that doesn’t require an app store. This plays to Google’s strengths and, potentially, its core business model, but it also sounds like a superior solution for some cases if the overhead of app discovery is taken out of the equation altogether (for more on the differences between iBeacon and the Physical Web, see this). The Physical Web implementation in Chrome for iOS looks clever and well done, and I’m hoping I’ll get to play with it at some point. It seems crazy that all of this is available in an iOS widget.
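The "broadcasting URLs" part is neat in itself: an Eddystone-URL frame squeezes a URL into a handful of bytes using a scheme-prefix byte and single-byte expansion codes for common substrings like ".com". Here's a minimal decoder following the frame layout in the open Eddystone spec (the example beacon payload is made up):

```python
# Minimal decoder for an Eddystone-URL frame: one frame-type byte
# (0x10), a signed TX-power-at-0m byte, a URL-scheme prefix byte,
# then the URL with common substrings compressed into single
# expansion bytes, per the open Eddystone-URL spec.

SCHEMES = ["http://www.", "https://www.", "http://", "https://"]
EXPANSIONS = [".com/", ".org/", ".edu/", ".net/", ".info/", ".biz/", ".gov/",
              ".com", ".org", ".edu", ".net", ".info", ".biz", ".gov"]

def decode_eddystone_url(frame: bytes) -> str:
    if frame[0] != 0x10:
        raise ValueError("not an Eddystone-URL frame")
    # frame[1] is TX power at 0 m (signed dBm), usable for distance estimates
    url = SCHEMES[frame[2]]
    for byte in frame[3:]:
        url += EXPANSIONS[byte] if byte < len(EXPANSIONS) else chr(byte)
    return url

frame = bytes([0x10, 0xEE, 0x03]) + b"example" + bytes([0x07])
print(decode_eddystone_url(frame))  # https://example.com
```

The TX-power byte is presumably what Chrome's widget combines with observed signal strength to rank results by estimated proximity.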


“Laughing and Crying My Way Through the New Google Photos”

My first watch of this video hit me emotionally in a way that’s hard to articulate. The film itself is a new kind of uncanny valley for digital artifacts: Assistant and its algorithms combined these clips in a way that no reasonable person would attempt. Ever. The result is surreal, random, creepy, sad, and oddly funny. It had to be a coincidence of timing that I had only just returned from visiting Grumpy on his deathbed. But partly because of that timing, this video present came at a moment when I was primed to appreciate it. Maybe it won’t be long before services try (and fail) to do this sort of thing on purpose, offering us narratives that highlight timely memories, or videos designed to fill anticipated emotional needs. My photos are still uploading.

Ryan Gantz has shared a personal story about photos he took at family events and how Google Photos put them all together automatically. The result is indeed funny and weird at the same time, but Ryan ended up appreciating it anyway.

There have been some interesting discussions about privacy and the value of Google Photos over the past week. So far, I agree completely with Manton Reece:

My family photos are the most important files I have on my computer, and I very rarely share any photos of my kids publicly. But ironically I’m willing to overlook some of the privacy concerns around this exactly because the photos are so valuable to me. I want multiple copies in the cloud, and I want the power of search that Google has built.

“Kind of creepy but I appreciate it” seems to be a common theme around Google Photos.


“Google Photos Is Gmail for Your Images”

The information gleaned from analyzing these photos does not travel outside of this product — not today. But if I thought we could return immense value to the users based on this data I’m sure we would consider doing that. For instance, if it were possible for Google Photos to figure out that I have a Tesla, and Tesla wanted to alert me to a recall, that would be a service that we would consider offering, with appropriate controls and disclosure to the user. Google Now is a great example. When I’m late for a flight and I get a Google Now notification that my flight has been delayed I can chill out and take an extra hour, breathe deep.

Steven Levy interviewed Google’s Bradley Horowitz about Google Photos. The article includes some fascinating details on how the technology behind it could be applied in the future. (Ads aren’t part of the plan – for now. It’s easy to imagine how they could be.)

I’m currently uploading years of photos to Google’s cloud because I’m interested in their search technology. I ran some initial tests on a first batch of photos, and the machine learning was indeed impressive: the service organized photos by location and people, but, more importantly, it let me search for common keywords like “fireworks”, “beach”, and “pets”. This could also have negative repercussions, however, as Casey Newton noted in his story on Google Photos:

Google’s face detection is so powerful that I’m glad you have the option to disable it. It created an amazingly comprehensive photo album of my ex-boyfriend, and instantly reliving every holiday and road trip together just by tapping his face overwhelmed me. It’s magic, yes, but it can catch you off guard. (And it’s not perfect: a colleague who tried the service discovered that Google thought his wife was at least four different people.)

Finding photos and rediscovering memories is just as important as managing them – if not more so. I believe that machine learning and deep neural networks have huge potential to help us organize and retrieve information we’d otherwise forget, and Google is well positioned to tackle this. If anything, Google Photos makes for a good additional backup option alongside iCloud Photo Library.
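Once a model has tagged each photo with labels – the genuinely hard, ML part – keyword search reduces to a simple inverted index from label to photos. A toy sketch with invented labels and filenames:

```python
from collections import defaultdict

# Toy sketch of label-based photo search: assume a model has already
# tagged each photo with labels, then build an inverted index from
# label to photos so keyword queries become a dictionary lookup.

def build_index(labels_by_photo):
    index = defaultdict(list)
    for photo, labels in labels_by_photo.items():
        for label in labels:
            index[label].append(photo)
    return index

photos = {
    "IMG_001.jpg": ["beach", "pets"],
    "IMG_002.jpg": ["fireworks"],
    "IMG_003.jpg": ["beach"],
}
index = build_index(photos)
print(index["beach"])      # ['IMG_001.jpg', 'IMG_003.jpg']
print(index["fireworks"])  # ['IMG_002.jpg']
```

Face grouping, as Casey Newton's anecdote shows, is the same idea with a far more error-prone labeler.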
