In the past, companies were often careless with their customers’ personal data, falling back on contractual guarantees when criticized. However, with repeated stories of data mishandling and breaches, and the threat of GDPR fines, senior executives are increasingly cautious about sharing data externally or relying on third-party data. This is bad news for the advertising ecosystem, which relies on the movement of digital data to deliver targeted ads.
Our unique decentralized platform removes the data privacy risks associated with sharing raw data by implementing technical safeguards that prevent misuse.
PII is a North American term regularly used in AdTech and by US government agencies. Personal Data is the European equivalent of PII and is the focus of GDPR; however, it has a broader definition that encompasses more categories of data.
The most direct way to avoid the misuse of personal data is not to centralize or share it. This is especially important when sharing with a third party, because if they suffer a data leak or breach, you are still liable.
PII and Personal Data are never shared between parties. We utilize groundbreaking decentralized technology to allow analysis to be conducted across isolated datasets, removing the need to ever share raw data.
Under GDPR, hashing of personal data is usually pseudonymization, not anonymization. Hashed data must therefore still be handled as personal data.
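A minimal Python sketch illustrates why a hashed identifier is only pseudonymized: hashing is deterministic, so anyone holding a list of candidate emails can re-identify the hash by a dictionary attack. (All names and values here are illustrative, not part of our platform.)

```python
import hashlib

def pseudonymize(email: str) -> str:
    """SHA-256 hash of a normalized email address (a common pseudonymization step)."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

hashed = pseudonymize("jane.doe@example.com")

# Because hashing is deterministic, anyone with a list of candidate emails
# can re-hash each candidate and compare -- a simple dictionary attack:
candidates = ["john.smith@example.com", "jane.doe@example.com"]
recovered = next((c for c in candidates if pseudonymize(c) == hashed), None)
# 'recovered' is the original address: the hash still maps back to an
# identifiable individual, so it must be treated as personal data.
```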
We utilize hashing for pseudonymization, but the raw data always remains in the control of the data owner. Analysis across datasets is conducted via a non-reversible mathematical model, so personal information is never exposed.
Equifax: On 8 September 2017, Equifax revealed that 143 million US consumers could have been affected by a breach in which hackers accessed data such as names, addresses, and dates of birth, as well as credit card numbers in a smaller number of cases.
Cambridge Analytica: One of the most high-profile cases of the last decade saw the data analytics firm use personal information, harvested from more than 50 million Facebook profiles without permission, to build a system that could target personalized political advertisements.
Emma’s Diary: The pregnancy and childcare advice site sold customer information to Experian, specifically for use by the Labour Party. Experian then created a database which the party used to profile new mums in the run-up to the 2017 General Election, without their explicit permission.
Facebook API Leak: In September 2018, Facebook revealed that nearly 50 million users’ personal data had been exposed in a hack that took advantage of a flaw in Facebook’s code. The leak also affected services that use Facebook for user logins.
In June 2018, the German Federal DPA ruled that Facebook’s process of matching hashed email addresses does not produce anonymized data; the hashes remain personal data. It also ruled that transferring such data from one controller to another is not admissible on the basis of legitimate interest, but requires the customer’s consent.
In September 2018, complaints were filed with European data protection authorities against Google and other AdTech firms. The complaint states that when an individual visits a website and is shown an ad, personal data, including location, device, cookie ID and IP address, is broadcast to a number of companies.
Yes, but GDPR-compliant technology should only be considered a starting point. It is the responsibility of each user of the technology to ensure that their use of it complies with the principles of GDPR. Our technology ensures that when running analysis across another party’s data, you never take on the role of data processor. This means you are only responsible for the compliant handling of your own data.
Under GDPR, hashing of personal data is usually pseudonymization, not anonymization, so hashed data must still be handled as personal data. Our platform hashes all fields containing personal data and keeps them in the control of the data owner. Joining is done by applying a non-reversible mathematical model to the hashes, ensuring no personal information is ever exposed.
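To make the idea concrete, here is an illustrative sketch of joining two datasets via hashed tokens, not our actual matching protocol. The keyed hash, shared key, and email values are assumptions for the example; each party tokenizes its own records locally, and only the aggregate result is learned.

```python
import hashlib
import hmac

# Hypothetical per-collaboration key. A keyed hash (HMAC) stops outsiders
# from rebuilding tokens by hashing publicly known email addresses.
SHARED_KEY = b"example-collaboration-key"

def blind(email: str) -> str:
    """HMAC-SHA-256 of a normalized email: a matching token, not the raw value."""
    return hmac.new(SHARED_KEY, email.strip().lower().encode(), hashlib.sha256).hexdigest()

# Each party tokenizes its own records locally; raw emails never leave home.
party_a_emails = {"a@example.com", "b@example.com", "c@example.com"}
party_b_emails = {"b@example.com", "c@example.com", "d@example.com"}

tokens_a = {blind(e) for e in party_a_emails}
tokens_b = {blind(e) for e in party_b_emails}

# Only the aggregate insight -- the size of the overlap -- is revealed.
overlap = len(tokens_a & tokens_b)
```

In practice this kind of join would run inside an isolated environment with further protections layered on top, but the sketch captures the core point: matching happens on tokens, never on raw identifiers.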
When two or more data points are combined, they can become personal data. For example, an individual’s browsing habits could reveal their identity, sexuality and ethnic origin. We employ various differential privacy concepts, including data rounding, noise addition, redaction thresholds and rate limits to ensure that individuals cannot be re-identified through combinations of data.
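A toy sketch, under assumed parameter values, of how three of these concepts combine on a single count query: a redaction threshold suppresses small groups, Laplace noise masks exact values, and rounding coarsens the output. Real deployments would calibrate the parameters carefully and add rate limits.

```python
import random

def private_count(true_count: int, epsilon: float = 1.0,
                  round_to: int = 10, threshold: int = 50):
    """Toy combination of a redaction threshold, Laplace noise and rounding."""
    # Redaction threshold: suppress counts over small groups entirely,
    # so no answer is released that could single out individuals.
    if true_count < threshold:
        return None
    # Laplace noise with scale 1/epsilon, built as the difference of two
    # exponential draws (a standard construction of the Laplace distribution).
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    # Rounding: report counts only to the nearest bucket.
    return round((true_count + noise) / round_to) * round_to
```

Rate limits matter because without them an analyst could repeat the same query many times and average away the noise; the sketch above only covers the per-query protections.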
GDPR requires that individuals be able to request that all instances of their personal data be deleted. If you have shared raw data internally or externally, honoring such a request becomes increasingly difficult. Where an individual’s data is held in a bunker, permission to analyze can be rescinded, or the data within the bunker can be updated to remove the individual.
Typical audience selection platforms rely on hashing, which does not result in anonymized data; the output must therefore still be handled as personal data under GDPR. Our unique execution bunkers are able to match and “tag” identities for re-marketing purposes without sharing the source data with the third party.
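A much-simplified sketch of the tagging idea, with hypothetical identifiers: the advertiser contributes only hashed tokens for its segment, and matching users are flagged inside the publisher’s environment. In the actual platform this matching happens inside an execution bunker rather than by exchanging hash lists, and a keyed hash would replace the bare SHA-256 shown here.

```python
import hashlib

def pseudonym(user_id: str) -> str:
    """Illustrative matching token; a real deployment would use a keyed hash."""
    return hashlib.sha256(user_id.strip().lower().encode()).hexdigest()

# The advertiser shares only hashed identifiers for its target segment.
advertiser_segment = {pseudonym(u) for u in ["u1", "u2", "u3"]}

# Inside the publisher's environment, its own users are hashed the same way
# and tagged on a match; the advertiser's raw source list is never exposed.
publisher_users = {"u2": {}, "u3": {}, "u4": {}}
for uid, profile in publisher_users.items():
    profile["in_segment"] = pseudonym(uid) in advertiser_segment
```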
Any time you pass raw data to another party, even a strategic partner, they become a legal controller of your data. Once the data has been passed, you no longer have physical control over any onward sharing, which can have adverse ramifications if that data is then sold on or misused. We never share raw data. Permissions are granted to run statistical analysis or tagging only, and they can be rescinded at any time, without ever passing raw data or sacrificing your control or ownership.
Organizations looking to buy or rent data must be diligent in ensuring the data was “lawfully and fairly obtained” and that individuals understood their data would be shared with other parties. We enable organizations to collaborate with partners without sharing raw data, so they can gain insights from the combined analysis without becoming the controller of the third party’s data.