On Thursday Google engineers released their vision for bringing the company's privacy standards more in line with consumer expectations. The new approach, dubbed the Privacy Sandbox, aims to ensure that ads remain relevant for users who are willing to share their data with websites and advertisers.
“Over the last couple of weeks, we’ve started sharing our preliminary ideas for a Privacy Sandbox, a secure environment for personalization that also protects user privacy,” wrote Justin Schuh, director of Chrome engineering, in a blog post.
With the help of the web community, Google will develop standards that will partly restrict fingerprinting on the web and improve the way browser cookies are classified, among other tasks. The restriction on fingerprinting is particularly important because consumers poorly understand the technique and fingerprint data is in demand among hackers.
The standards would anonymize aggregate user data and keep the majority of the data on the device, not in the cloud. This would reduce the risk of the data being stolen, leaked or otherwise compromised; as we saw with MoviePass, companies are often inept at safeguarding it.
Along with the company’s overall vision, Google released a summary of each proposal. Googlers are testing ways to deliver advertisements to large groups of like-minded people without allowing personally identifiable data ever to leave the user’s device.
The approach builds on techniques known as differential privacy, which Chrome has used for nearly five years, showing that a browser can avoid sharing data about an individual user with an advertiser without sacrificing ad targeting.
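To see how differential privacy can work in principle, here is a minimal sketch of randomized response, one classic technique in that family (the attribute being measured and all numbers below are hypothetical, not from Google's proposals). Each user's report is individually deniable, yet the aggregate statistic remains recoverable:

```python
import random

def randomized_response(truth: bool, p_truth: float = 0.75) -> bool:
    """Report the true bit with probability p_truth; otherwise flip a coin.

    Any single report could plausibly be noise, which protects the user,
    but an aggregator that knows p_truth can still estimate the
    population-level rate.
    """
    if random.random() < p_truth:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports: list, p_truth: float = 0.75) -> float:
    """Invert the noise: E[reported rate] = p_truth * r + (1 - p_truth) * 0.5."""
    reported = sum(reports) / len(reports)
    return (reported - (1 - p_truth) * 0.5) / p_truth

# Simulate 100,000 users, 30% of whom have some sensitive attribute.
random.seed(0)
true_bits = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(b) for b in true_bits]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

The advertiser-facing aggregate (roughly 30%) survives, while no individual report reveals whether that particular user has the attribute.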
The group will also work to improve conversion metrics, in hopes of helping advertisers and publishers determine whether the ads they serve actually lead to more business.
“The proposals are a first step in exploring how to address the measurement needs of the advertiser without letting the advertiser track a specific user across sites,” wrote Schuh.
The proposals also include measures for fraud protection and for defining the sandbox’s boundaries, since historically, removing capabilities from the web has caused developers (and hackers) to find workarounds rather than follow the intended path.
Schuh asserts that this has already happened: when some browsers took action to block cookies, the result was new techniques like fingerprinting, which consumers understand even less.
Small fragments of information, such as which device a user has or which fonts they have installed, may not be personally identifiable on their own. But by combining enough of these trivial data points, someone can generate a unique identifier and match a user across websites.
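To illustrate how such fragments combine, here is a sketch of a naive fingerprinting function (the attribute names and values are invented for the example, not taken from any real tracking script):

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Hash a canonicalized set of browser attributes into one identifier.

    No single attribute identifies the user, but hashing the sorted
    combination yields a value that is stable for one browser setup and
    usually differs between users.
    """
    canonical = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical attributes a script could read without any cookies.
user_a = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Chrome/76.0",
    "screen": "2560x1440",
    "timezone": "America/New_York",
    "fonts": "Arial,Fira Code,Noto Sans",
}
user_b = dict(user_a, timezone="Europe/Berlin")  # only one attribute differs

print(fingerprint(user_a) == fingerprint(user_a))  # stable across "visits"
print(fingerprint(user_a) == fingerprint(user_b))  # distinct setups diverge
```

Because the same browser produces the same hash on every site it visits, the identifier works like a cookie the user never agreed to and cannot clear.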
“This subversion of user choice is wrong,” wrote Schuh — something that has led Google to develop the Privacy Sandbox.
Update: Some in the ad industry are very skeptical of Google’s motives for keeping cookies around, with one saying, “Google is pushing cookies harder than a dealer on Sesame Street.”