Google is working on new privacy measures. Source: Shutterstock

Can Google’s Privacy Sandbox protect the future of the vibrant web?

GOOGLE is a technology giant that has a reputation for doing the right thing, which is why a large percentage of the global population relies on its products — from search and maps to cloud computing, ads, and beyond.

Given the position it enjoys, the company always attempts to stay one step ahead of regulators in data protection and privacy matters. Its latest initiative is something it calls the Privacy Sandbox.

Google launched the sandbox a couple of weeks ago and is now working to involve all parts of the web community in a bid to find privacy solutions that work for everyone and protect the future of the vibrant web.

At first glance, the initiative does seem to be a knee-jerk reaction to competitors who have begun blocking cookies (small pieces of data that browsers store and that advertisers use to help place relevant ads) entirely, affecting advertisers as well as publishers.
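To make the mechanism concrete, here is a minimal, hypothetical sketch of how an ad-tech script might use a cookie to recognise a returning browser. The cookie name and helper function are invented for illustration; the point is that the identifier sits in browser storage, where the user can inspect, block, or clear it.

```typescript
// Illustrative only: recognising the same browser on a later visit via a cookie.
// Unlike a fingerprint, this identifier lives in browser storage, so the user
// can inspect, block, or clear it at any time.
function getOrCreateAdId(): string {
  const match = document.cookie.match(/(?:^|; )ad_id=([^;]+)/);
  if (match) return match[1];                       // returning visitor
  const id = crypto.randomUUID();                   // fresh random identifier
  // Cross-site ad cookies would additionally need "SameSite=None; Secure";
  // this sketch sets a simple first-party cookie that lasts one year.
  document.cookie = `ad_id=${id}; max-age=${60 * 60 * 24 * 365}; path=/`;
  return id;
}

console.log("ad id for this browser:", getOrCreateAdId());
```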

However, the reality is that Google has been inviting employees, users, developers, publishers, advertisers, regulators, and other stakeholders to have discussions about privacy for more than two years now — and the sandbox is just a new attempt to collectively create a common solution.

Google is against blocking cookies altogether in the name of privacy

While Google’s Privacy Sandbox is experimenting with interesting technologies such as federated learning and the blockchain, the company is against simply blocking cookies on browsers.

“First, large scale blocking of cookies undermine people’s privacy by encouraging opaque techniques such as fingerprinting,” said Google Chrome Engineering Director Justin Schuh.

Fingerprinting is a technique developers use to bypass the restrictions being placed on cookies: it collects bits of user information, such as device type and installed fonts, to generate a unique identifier that can be used to match a user across websites.
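For a sense of why that is harder for users to control, here is an illustrative sketch, not taken from any real tracking library, of how a script could combine ordinary browser signals into a stable identifier without storing anything on the device. The helper, the hash choice, and the signal list are all assumptions made for the example.

```typescript
// Minimal illustration of browser fingerprinting (hypothetical, for explanation only).
// Each signal on its own is innocuous; combined, they can single out a browser
// without writing anything to storage, which is why a fingerprint cannot be
// "cleared" the way a cookie can.

// Simple FNV-1a hash, just to turn the combined signals into a compact identifier.
function fnv1a(input: string): string {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193);
  }
  return (hash >>> 0).toString(16);
}

function fingerprint(): string {
  const signals = [
    navigator.userAgent,                                      // browser and OS details
    navigator.language,                                       // preferred language
    String(navigator.hardwareConcurrency ?? "n/a"),           // CPU core count
    `${screen.width}x${screen.height}x${screen.colorDepth}`,  // display characteristics
    Intl.DateTimeFormat().resolvedOptions().timeZone,         // time zone
    // Real-world scripts add many more signals: installed fonts, canvas and
    // WebGL rendering quirks, audio stack behaviour, and so on.
  ];
  return fnv1a(signals.join("||"));
}

console.log("fingerprint:", fingerprint());
```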

“Unlike cookies, users cannot clear their fingerprint, and therefore cannot control how their information is collected. We think this subverts user choice and is wrong.”

Second, Schuh explains, blocking cookies without another way to deliver relevant ads significantly reduces publishers’ primary means of funding, which jeopardizes the future of the vibrant web.

Many publishers have been able to continue to invest in freely accessible content because they can be confident that their advertising will fund their costs.

“If this funding is cut, we are concerned that we will see much less accessible content for everyone.

“Recent studies have shown that when advertising is made less relevant by removing cookies, funding for publishers falls by 52 percent on average.”

Google believes that the raft of privacy features it is trialing via its Privacy Sandbox will provide a more balanced solution — something that takes the opinions, needs, and expectations of all stakeholders into account.

What is going on inside the sandbox?

As Google outlined at the start, the intent of the Privacy Sandbox is to work with everyone that makes up the internet ecosystem.

One group of its partners includes CloudFlare, Royal Holloway (University of London), and the University of Waterloo. Together, they’ve created a privacy-preserving protocol they call Privacy Pass.

“Privacy Pass leverages an idea from cryptography — zero-knowledge proofs — to let users prove their identity across multiple sites anonymously without enabling tracking. Privacy Pass is fully open source under a BSD license and the code is available on GitHub.”
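As a rough illustration of the idea, the sketch below shows only the issue-and-redeem shape of an anonymous-token scheme. The cryptographic blinding that gives Privacy Pass its unlinkability (the issuer never sees the tokens it signs, so issuance and redemption cannot be tied together) is deliberately left out, and the issuer key, function names, and HMAC construction are assumptions made for readability rather than the actual protocol.

```typescript
// Conceptual sketch of the anonymous-token idea behind protocols like Privacy Pass.
// Hypothetical names; the real protocol blinds the tokens cryptographically so the
// issuer cannot link what it signed to what is later redeemed. That step is omitted
// here to keep the flow readable.
import { createHmac, randomBytes } from "crypto";

const ISSUER_KEY = randomBytes(32);   // secret held by the token issuer
const spent = new Set<string>();      // double-spend protection

// Step 1: after the user passes one challenge (e.g. a CAPTCHA), the issuer signs
// a batch of client-generated tokens.
function issueTokens(tokens: string[]): string[] {
  return tokens.map(t => createHmac("sha256", ISSUER_KEY).update(t).digest("hex"));
}

// Step 2: later, on another site, the client spends one token instead of
// re-identifying itself. The verifier only learns "this token was validly issued".
function redeem(token: string, signature: string): boolean {
  const expected = createHmac("sha256", ISSUER_KEY).update(token).digest("hex");
  if (expected !== signature || spent.has(token)) return false;
  spent.add(token);
  return true;
}

// Client side: generate random tokens, have them signed once, spend them later.
const tokens = Array.from({ length: 3 }, () => randomBytes(16).toString("hex"));
const signatures = issueTokens(tokens);
console.log(redeem(tokens[0], signatures[0]));   // true: accepted
console.log(redeem(tokens[0], signatures[0]));   // false: already spent
```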

Google is also using (and promoting to the data science community) a technology known as federated learning to determine how ads can be targeted by clustering people into groups without revealing any data to advertisers or even having any personally identifiable data leave the browser.

It’s a concept that has evolved over the years and essentially involves helping data science professionals train their artificial intelligence and machine learning models on real user data without any of that data leaving the user’s device.

Since the algorithm is trained on the device and only insights are transferred to the developers, privacy concerns and associated risks are reduced significantly.
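As a toy illustration of that flow, the sketch below simulates federated averaging with a tiny linear model: each “client” trains on its own samples and shares only updated weights, which a “server” averages. The model, data, and function names are invented for the example and are not Google’s implementation.

```typescript
// Minimal federated-averaging sketch (hypothetical, illustrative only):
// each "device" fits y ≈ w*x + b on its own data and shares only the updated
// weights; the raw samples never leave the client arrays.

type Weights = { w: number; b: number };
type Sample = { x: number; y: number };

// Local training step: a few rounds of gradient descent on the device's own data.
function trainLocally(start: Weights, data: Sample[], lr = 0.01, epochs = 5): Weights {
  let { w, b } = start;
  for (let e = 0; e < epochs; e++) {
    let gw = 0;
    let gb = 0;
    for (const { x, y } of data) {
      const err = w * x + b - y;          // prediction error on this sample
      gw += (2 * err * x) / data.length;
      gb += (2 * err) / data.length;
    }
    w -= lr * gw;
    b -= lr * gb;
  }
  return { w, b };                         // only the model update is shared
}

// Server step: average client updates, weighted by how much data each client had.
function federatedAverage(updates: { weights: Weights; n: number }[]): Weights {
  const total = updates.reduce((sum, u) => sum + u.n, 0);
  return updates.reduce(
    (acc, u) => ({
      w: acc.w + (u.weights.w * u.n) / total,
      b: acc.b + (u.weights.b * u.n) / total,
    }),
    { w: 0, b: 0 }
  );
}

// Three clients with private data roughly following y = 2x + 1.
const clients: Sample[][] = [
  [{ x: 1, y: 3.1 }, { x: 2, y: 4.9 }],
  [{ x: 3, y: 7.2 }, { x: 4, y: 8.8 }, { x: 5, y: 11.1 }],
  [{ x: 6, y: 13.0 }],
];

let globalModel: Weights = { w: 0, b: 0 };
for (let round = 0; round < 100; round++) {
  const updates = clients.map(data => ({ weights: trainLocally(globalModel, data), n: data.length }));
  globalModel = federatedAverage(updates);   // the server never sees the samples
}
console.log(globalModel);   // roughly w ≈ 2, b ≈ 1, learned without raw data leaving the clients
```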

When the company and its partners figure out how federated learning can be deployed for better campaign targeting, it should benefit users as well as advertisers.

There are other initiatives and proposals that the Privacy Sandbox is going to review, especially in the areas of conversion measurement, ad selection, and fraud prevention, and Google says it is keen to work with everyone, to everyone’s advantage, to ensure that the future of the vibrant web is protected.