Posts tagged with "google"

NotebookLM Plus Is Now Available to Google One AI Premium Subscribers

In this week’s extended post-show for AppStories+ subscribers, Federico and I covered the AI tools we use. NotebookLM is one we have in common because it’s such a powerful research tool. The service allows you to upload documents and other files to a notebook and then query what you’ve collected. It’s better than a traditional search tool because you can ask complex questions, discover connections between topics, and generate materials like timelines and summaries.

Yesterday, Google announced that NotebookLM Plus is now available to Google One AI Premium subscribers, significantly expanding its reach. Previously, the extended functionality was only available as an add-on for Google Workspace subscribers.

The Plus version of NotebookLM increases the number of notebooks, sources, and audio overviews available, allows users to customize the tone of their notebooks, and lets them share notebooks with others. Google One AI Premium also includes access to Gemini Advanced and Gemini integration with Gmail, Docs, and other Google services, plus 2 TB of Google Drive cloud storage.

My DMA notebook.

I’ve only begun to scratch the surface of what is possible with NotebookLM and am currently moving my notebook setup from one Google account to another, but it’s already proven to be a valuable research tool. Examples of the types of materials I’ve collected for querying include:

  • legislative material and articles about Apple’s DMA compliance,
  • my past macOS reviews,
  • summaries of and links to stories published on MacStories and Club MacStories,
  • video hardware research materials, and
  • manuals for home appliances and gadgets.

Having already collected and read these materials, I find navigating them with NotebookLM to be far faster than repeatedly skimming through them to pull out details. I also appreciate the ability to create materials like timelines for topics that span months or years.

Google One AI Premium is available from Google for $19.99 per month.


Gemini 2.0 and LLMs Integrated with Apps

Busy day at Google today: the company rolled out version 2.0 of its Gemini AI assistant (previously announced in December) to more users, with a variety of new and updated models. From the Google blog:

Today, we’re making the updated Gemini 2.0 Flash generally available via the Gemini API in Google AI Studio and Vertex AI. Developers can now build production applications with 2.0 Flash.

We’re also releasing an experimental version of Gemini 2.0 Pro, our best model yet for coding performance and complex prompts. It is available in Google AI Studio and Vertex AI, and in the Gemini app for Gemini Advanced users.

We’re releasing a new model, Gemini 2.0 Flash-Lite, our most cost-efficient model yet, in public preview in Google AI Studio and Vertex AI.

Finally, 2.0 Flash Thinking Experimental will be available to Gemini app users in the model dropdown on desktop and mobile.
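If you're curious what "generally available via the Gemini API" looks like in practice, a minimal sketch of calling 2.0 Flash from Python might look something like this. I'm assuming the google-generativeai SDK and the model identifier from Google's developer documentation; the API key environment variable is just a placeholder:

```python
# Minimal sketch of a Gemini 2.0 Flash call via the google-generativeai SDK.
# Assumes `pip install google-generativeai` and an API key from Google AI Studio
# stored in the GEMINI_API_KEY environment variable.
import os

import google.generativeai as genai

genai.configure(api_key=os.environ["GEMINI_API_KEY"])

model = genai.GenerativeModel("gemini-2.0-flash")
response = model.generate_content(
    "Summarize the differences between Flash, Flash-Lite, and Pro in one paragraph."
)
print(response.text)
```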



Doing Research with NotebookLM

Fascinating blog post by Vidit Bhargava (creator of the excellent LookUp dictionary app) about how he worked on his master’s thesis with the aid of Google’s NotebookLM.

I used NotebookLM throughout my thesis, not because I was interested in it generating content for me (I think AI generated text and images are sloppy and classless); but because it’s a genuinely great research organization tool that provides utility of drawing connections between discreet topics and helping me understand my own journey better.

Make sure to check out the examples of his interviews and research material as indexed by the service.

As I explained in an episode of AppStories a while back, and as John also expanded upon in the latest issue of the Monthly Log for Club members, we believe that assistive AI tools that leverage modern LLM advancements to help people work better (and less) are infinitely superior to whatever useless slop generative tools produce.

Google’s NotebookLM is, in my opinion, one of the most intriguing new tools in this field. For the past two months, I’ve been using it as a personal search assistant for the entire archive of 10 years of annual iOS reviews – that’s more than half a million words in total. Not only can NotebookLM search that entire library in seconds, but it does so with even the most random natural language queries about the most obscure details I’ve ever covered in my stories, such as “When was the copy and paste menu renamed to edit menu?” (It was iOS 16.) It’s becoming increasingly challenging for me, after all these years, to keep track of the growing list of iOS-related minutiae; from a personal productivity standpoint, NotebookLM has to be one of the most exciting new products I’ve tried in a while. (Alongside Shortwave for email.)

Just today, I discovered that my read-later tool of choice – Readwise Reader – offers a native integration to let you search highlights with NotebookLM. That’s another source that I’m definitely adding to NotebookLM, and I’m thinking of how I could replicate the same Readwise Reader setup (highlights are appended to a single Google Doc) with Zapier and RSS feeds. Wouldn’t it be fun, for instance, if I could search the entire archive of AppStories show notes in NotebookLM, or if I could turn starred items from Feedbin into a standalone notebook as well?
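The RSS half of that idea wouldn't take much plumbing. Here's a rough sketch of flattening a feed into a single text file that could be uploaded to NotebookLM as a source; the feed URL below is a placeholder, and since NotebookLM has no public API, the upload step itself would still be manual:

```python
# Rough sketch: flatten an RSS feed (e.g. a starred-items feed) into a plain-text
# file that can be uploaded to NotebookLM as a source. The feed URL is a
# placeholder; NotebookLM has no public API, so the upload stays manual.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.com/starred.xml"  # hypothetical starred-items feed

with urllib.request.urlopen(FEED_URL) as resp:
    root = ET.fromstring(resp.read())

lines = []
for item in root.iter("item"):
    title = item.findtext("title", default="(untitled)")
    link = item.findtext("link", default="")
    description = item.findtext("description", default="")
    lines.append(f"{title}\n{link}\n{description}\n")

with open("starred-items.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines))
```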

I’m probably going to have to sign up for NotebookLM Plus when it launches for non-business accounts, which, according to Google, should happen in early 2025.


Google’s Antitrust Loss, Why Apple Doesn’t Just Build a Search Engine, and What Comes Next

Yesterday, Federal District Judge Amit Mehta issued a ruling in favor of the government in the U.S. Justice Department’s antitrust case against Google. Judge Mehta didn’t mince words:

Google is a monopolist, and it has acted as one to maintain its monopoly. It has violated Section 2 of the Sherman Act.

The Judge further explained his ruling:

Specifically, the court holds that (1) there are relevant product markets for general search services and general search text ads; (2) Google has monopoly power in those markets; (3) Google’s distribution agreements are exclusive and have anticompetitive effects; and (4) Google has not offered valid procompetitive justifications for those agreements. Importantly, the court also finds that Google has exercised its monopoly power by charging supracompetitive prices for general search text ads. That conduct has allowed Google to earn monopoly profits.

It’s a long opinion, coming in at nearly 300 pages, but the upshot of why Judge Mehta ruled the way he did is summed up nicely near the beginning of the tome:

But Google also has a major, largely unseen advantage over its rivals: default distribution. Most users access a general search engine through a browser (like Apple’s Safari) or a search widget that comes preloaded on a mobile device. Those search access points are preset with a “default” search engine. The default is extremely valuable real estate. Because many users simply stick to searching with the default, Google receives billions of queries every day through those access points. Google derives extraordinary volumes of user data from such searches. It then uses that information to improve search quality. Google so values such data that, absent a user-initiated change, it stores 18 months-worth of a user’s search history and activity.

If you’re interested in how web search works and the business deals that drive it, the opinion is a great primer. Plus, although the details already dribbled out over the course of the 10-week trial, there are lots of interesting bits of information buried in there for anyone interested in Apple’s search deal with Google.



The New York Times Declares that Voice Assistants Have Lost the ‘AI Race’

Brian Chen, Nico Grant, and Karen Weise of The New York Times set out to explain why voice assistants like Siri, Alexa, and Google Assistant seem primitive by comparison to ChatGPT. According to ex-Apple, Amazon, and Google engineers and employees, the difference is grounded in the approach the companies took with their assistants:

The assistants and the chatbots are based on different flavors of A.I. Chatbots are powered by what are known as large language models, which are systems trained to recognize and generate text based on enormous data sets scraped off the web. They can then suggest words to complete a sentence.

In contrast, Siri, Alexa and Google Assistant are essentially what are known as command-and-control systems. These can understand a finite list of questions and requests like “What’s the weather in New York City?” or “Turn on the bedroom lights.” If a user asks the virtual assistant to do something that is not in its code, the bot simply says it can’t help.
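To make the distinction concrete, here's a toy sketch of the "command-and-control" pattern the Times describes. It's not how Siri or Alexa are actually built, just the general shape of a system that handles a finite list of requests and gives up on everything else:

```python
# Toy command-and-control assistant: a finite list of patterns mapped to handlers.
# Anything that doesn't match a known pattern gets the familiar "can't help" reply.
import re


def weather(city: str) -> str:
    return f"Looking up the weather in {city}..."  # stub handler


def lights(room: str) -> str:
    return f"Turning on the {room} lights."  # stub handler


COMMANDS = [
    (re.compile(r"what'?s the weather in (?P<city>[^?]+)", re.I), lambda m: weather(m["city"])),
    (re.compile(r"turn on the (?P<room>.+?) lights?", re.I), lambda m: lights(m["room"])),
]


def handle(utterance: str) -> str:
    for pattern, action in COMMANDS:
        match = pattern.search(utterance)
        if match:
            return action(match)
    return "Sorry, I can't help with that."  # no generative fallback


print(handle("What's the weather in New York City?"))
print(handle("Write me a haiku about my cat."))  # falls through to the fallback
```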

In the case of Siri, former Apple engineer John Burkey said the company’s assistant was designed as a monolithic database that took weeks to update with new capabilities. Burkey left Apple in 2016 after less than two years at the company, according to his LinkedIn bio. According to other unnamed Apple sources, the company has been testing AI based on large language models in the years since Burkey’s departure:

At Apple’s headquarters last month, the company held its annual A.I. summit, an internal event for employees to learn about its large language model and other A.I. tools, two people who were briefed on the program said. Many engineers, including members of the Siri team, have been testing language-generating concepts every week, the people said.

It’s not surprising that sources have told The New York Times that Apple is researching the latest advances in artificial intelligence. All you have to do is visit the company’s Machine Learning Research website to see that. But to declare a winner in ‘the AI race’ based on the architecture of where voice assistants started compared to today’s chatbots is a bit facile. Voice assistants may be primitive by comparison to chatbots, but it’s far too early to count Apple, Google, or Amazon out or declare the race over, for that matter.


Google Appears to Have Stopped Serving AMP Search Results to Safari Users on iOS and iPadOS 15

Update: Although Google has not commented on the lack of AMP links in its search results, Danny Sullivan has tweeted that their disappearance from iOS and iPadOS 15 is a bug that Google is working to fix.


Earlier today, developer Jeff Johnson published a story noting that AMP links have seemingly vanished from Google search results in Safari on iOS and iPadOS 15. AMP is Google’s cached URL system that’s designed to speed up the mobile web but often ruins website functionality and junks up URLs. I’ve never been a fan of AMP and neither has Federico.

Google search results for an article that returns AMP results on iOS 14 but not iOS 15.

iOS and iPadOS 15 introduced extensions to Safari, and one of the most popular categories has been extensions that redirect AMP links to the canonical version of the URL. I covered two of our favorites, Amplosion by Christian Selig and Overamped, both of which continue to be among the top paid Safari extensions on the App Store.

Safari extensions that redirect AMP URLs have proven popular on the App Store.

Jeff Johnson, the maker of Stop the Madness, another Safari extension that redirects AMP links, noticed while updating his extension that AMP links had disappeared from Google search results.

Johnson ran some tests:

With this User-Agent [iOS 15’s], there are no AMP links in Google search results, but if I simply change Version/15.0 to Version/14.0 and keep the rest the same, Google search results suddenly have AMP links again! This is reproducible on my iPhone, in the Xcode iPhone simulator, and also in desktop Safari Mac with its User-Agent spoofed as iPhone.

Google search results still return AMP URLs on iOS 14.

I’ve done some digging myself, as has Federico, and we have been able to reproduce the same results. I searched Google for an article published today on The Verge. Sure enough, on iOS 14, I get AMP results, but not on iOS 15, where the links point to theverge.com. I ran the same test using Google Chrome, Firefox, and Microsoft’s Edge browser on iOS 15, and all returned Google search results with AMP links. Safari on iOS and iPadOS 15 is the only one of the four browsers that doesn’t return AMP links in Google search results.
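For anyone who wants to approximate Johnson's comparison outside of a browser, a rough sketch looks like this. Keep in mind that Google may serve consent pages or treat scripted requests differently from real Safari traffic, so this is only an approximation of his in-browser tests, and the search query is just an example:

```python
# Approximate the User-Agent comparison: request the same Google search with an
# iOS 14 Safari UA and an iOS 15 Safari UA, then look for AMP markers in the HTML.
# Google may serve consent pages or block scripted requests, so treat this as a
# rough check rather than a faithful reproduction of in-browser behavior.
import urllib.parse
import urllib.request

QUERY = urllib.parse.quote("the verge iphone review")  # example query
URL = f"https://www.google.com/search?q={QUERY}"

USER_AGENTS = {
    "iOS 14": "Mozilla/5.0 (iPhone; CPU iPhone OS 14_0 like Mac OS X) "
              "AppleWebKit/605.1.15 (KHTML, like Gecko) Version/14.0 Mobile/15E148 Safari/604.1",
    "iOS 15": "Mozilla/5.0 (iPhone; CPU iPhone OS 15_0 like Mac OS X) "
              "AppleWebKit/605.1.15 (KHTML, like Gecko) Version/15.0 Mobile/15E148 Safari/604.1",
}

for label, ua in USER_AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    with urllib.request.urlopen(req) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    has_amp = "ampproject.org" in html or "/amp/" in html
    print(f"{label}: AMP markers found = {has_amp}")
```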

I wondered what might be going on, so I contacted Google PR to see if they could explain it. I haven’t heard back yet but will update this story if I do.

Meanwhile, Johnson has a theory that seems plausible to me:

So, is it possible that Google has given up on AMP in Safari on iOS 15 because of the popularity of AMP blocking extensions? Who can say, but it’s certainly an interesting coincidence. I can say that it’s a very recent change. I know from my own testing that Google search results still included AMP links for the first week after iOS 15 was released on September 20.

The timing certainly lines up. I know there were AMP links to redirect when I was testing Amplosion and Overamped on the iOS 15 betas and shortly after its launch, but sometime in the past two weeks or so, they have completely vanished from Google search results in Safari for iOS and iPadOS 15. I hope the change sticks.


Apple and Google Partner for COVID-19 Contact Tracing

Today Apple announced a special partnership with Google to develop contact tracing technology designed to reduce the spread of COVID-19 in the coming months. The plan involves two steps:

First, in May, both companies will release APIs that enable interoperability between Android and iOS devices using apps from public health authorities. These official apps will be available for users to download via their respective app stores.

Second, in the coming months, Apple and Google will work to enable a broader Bluetooth-based contact tracing platform by building this functionality into the underlying platforms. This is a more robust solution than an API and would allow more individuals to participate, if they choose to opt in, as well as enable interaction with a broader ecosystem of apps and government health authorities. Privacy, transparency, and consent are of utmost importance in this effort, and we look forward to building this functionality in consultation with interested stakeholders. We will openly publish information about our work for others to analyze.

Additionally, Apple has published draft technical documentation covering their joint work with Google.

This marks only the latest of several efforts by Apple to help fight the spread of COVID-19. The company developed an app and website, in partnership with the CDC, to help people with symptoms know what they should do. Additionally, Tim Cook has been tweeting periodic updates about the masks and face shields the company has sourced and developed to send to first responders. Today’s partnership with Google, however, may be the most significant effort to date.

The World Health Organization explains how contact tracing – which involves keeping track of anyone who has been in contact with an infected person – can help limit the transmission of disease. Although current social distancing policies are a strict form of containment, well-implemented contact tracing could help prevent the need for such drastic measures in the future.
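For a sense of how a system like this can work without tracking anyone's identity or location, here's a deliberately simplified, conceptual sketch. It is not the Apple/Google specification, which defines its own key derivation and Bluetooth payloads, but it captures the general shape of decentralized, opt-in exposure matching:

```python
# Conceptual sketch of decentralized contact tracing (NOT the Apple/Google spec):
# each phone broadcasts short-lived random identifiers over Bluetooth and records
# the identifiers it hears nearby. If a user later tests positive and opts in,
# their broadcast identifiers are published, and every other phone checks locally
# whether any of them appear in its own log of observed identifiers.
import secrets


class Phone:
    def __init__(self) -> None:
        self.broadcast_ids: list[str] = []  # identifiers this phone has sent
        self.heard_ids: set[str] = set()    # identifiers heard from nearby phones

    def rotate_identifier(self) -> str:
        rolling_id = secrets.token_hex(16)  # fresh random ID, rotated frequently
        self.broadcast_ids.append(rolling_id)
        return rolling_id

    def observe(self, rolling_id: str) -> None:
        self.heard_ids.add(rolling_id)

    def check_exposure(self, published_positive_ids: set[str]) -> bool:
        # Matching happens on-device; only the positive user's IDs are shared.
        return bool(self.heard_ids & published_positive_ids)


alice, bob = Phone(), Phone()
bob.observe(alice.rotate_identifier())    # Alice and Bob were near each other

published = set(alice.broadcast_ids)      # Alice tests positive and opts in
print("Bob was exposed:", bob.check_exposure(published))  # True
```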

I’m glad that Apple and Google are collaborating on this effort, and that it will be privacy-first and opt-in. Anything that can be done to minimize the spread of COVID-19 is a good thing.


Google Integrates Assistant App with Siri Shortcuts on iOS

Google released an exciting update for its Assistant iOS app today, bringing support for Siri shortcuts and, for the first time, opening lines of communication between the two competing assistants.

Siri and the Google Assistant have historically been unable to work together in any way, but thanks to the opening up of Siri via shortcuts in iOS 12, that changes now. With the latest update, you can set up a shortcut in iOS that lets Siri immediately trigger any command you’d like to give Google’s Assistant.



Google Maps Adds Commuting Features

Google has announced that later this week, it will add several new commuter-focused features to its Maps app for iOS and Android. The update includes live, personalized traffic data, support for ‘mixed-mode’ commutes, real-time bus and train tracking, and integration with Apple Music, Google Play Music, and Spotify.

The update will include a dedicated ‘Commute’ tab in the Maps app. After users identify their commute, Google Maps will provide live traffic data about the route. The Android app will also include notifications about delays as they happen so you can adjust your trip.

Google Maps will also support mixed-mode commutes. That means, for example, commuters who travel by car, train, and on foot will see commute information relevant to each leg of their journey. Real-time bus and train tracking is being added in 80 cities worldwide too.

Playback controls for Apple Music, Spotify, and Google Play Music are coming to Google Maps. Spotify users on Android will also be able to browse and select content from inside the app.

As someone who used to commute by train every day, I particularly appreciate the focus on public transportation. Google hasn’t said, but hopefully these new features will be included as part of Google Maps’ CarPlay integration too.

Google Maps is available as a free download on the App Store.