Andy Greenberg, writing for Wired, has a good explanation of differential privacy:
> Differential privacy, translated from Apple-speak, is the statistical science of trying to learn as much as possible about a group while learning as little as possible about any individual in it. With differential privacy, Apple can collect and store its users’ data in a format that lets it glean useful notions about what people do, say, like and want. But it can’t extract anything about a single, specific one of those people that might represent a privacy violation. And neither, in theory, could hackers or intelligence agencies.
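The mechanism behind that claim is easier to see in a toy example. The textbook local technique is randomized response: each user perturbs their answer before it ever leaves the device, so no single report can be trusted, yet the noise averages out across the population. Apple's production system uses more elaborate mechanisms than this, so treat the following Python sketch (all names in it are illustrative) as a minimal demonstration of the idea, not Apple's implementation:

```python
import random

def randomized_response(truth: bool) -> bool:
    """Perturb one user's yes/no answer on-device.

    With probability 1/2 report the truth; otherwise report a fair
    coin flip. Any single report could plausibly be noise, which gives
    each individual deniability (this simple mechanism satisfies
    ln(3)-differential privacy).
    """
    if random.random() < 0.5:
        return truth                  # honest answer
    return random.random() < 0.5      # random answer

def estimate_true_fraction(reports: list[bool]) -> float:
    """Recover the population-level 'yes' rate from noisy reports.

    P(report yes) = 0.5 * true_rate + 0.25, so invert:
    true_rate ~= 2 * observed_rate - 0.5.
    """
    observed = sum(reports) / len(reports)
    return 2 * observed - 0.5

# Simulate 100,000 users, 30% of whom would truthfully answer "yes".
truths = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(t) for t in truths]
print(f"estimate: {estimate_true_fraction(reports):.3f}")  # close to 0.300
```

Looking at any one report, an analyst (or an attacker who steals the database) learns almost nothing: a reported "yes" is only three times as likely to come from a true "yes" as from a true "no". Aggregated over enough users, though, the estimate of the group-level rate is accurate, which is exactly the trade the quote describes.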
And:
> Differential privacy, [University of Pennsylvania computer scientist Aaron] Roth explains, seeks to mathematically prove that a certain form of data analysis can’t reveal anything about an individual—that the output of an algorithm remains identical with and without the input containing any given person’s private data. “You might do something more clever than the people before to anonymize your data set, but someone more clever than you might come around tomorrow and de-anonymize it,” says Roth. “Differential privacy, because it has a provable guarantee, breaks that loop. It’s future proof.”
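The "provable guarantee" here is the standard definition of ε-differential privacy from Dwork, McSherry, Nissim, and Smith (2006), which the passage above paraphrases. In LaTeX:

```latex
% A randomized algorithm M is \epsilon-differentially private if, for
% every pair of datasets D and D' differing in one individual's record,
% and for every set S of possible outputs:
\Pr[\,M(D) \in S\,] \;\le\; e^{\epsilon} \cdot \Pr[\,M(D') \in S\,]
```

Strictly speaking, the outputs are not literally identical with and without a given person's data; their distributions are within a factor of e^ε of each other, and the smaller ε is, the less any result can hinge on one person's record. Because the bound holds against any attacker with any auxiliary information, no cleverer de-anonymization technique discovered tomorrow can break it, which is what Roth means by "future proof."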