GDPR Compliance and Blockchain: The French Data Protection Authority Offers Initial Guidance

Oct 3, 2018

By Linda Henry


The French Data Protection Authority (“CNIL”) recently became the first data protection authority to provide guidance as to how the European Union’s General Data Protection Regulation (“GDPR”) applies to blockchain.

A few key takeaways from the CNIL report are as follows:

  • Data controllers: A legal entity or natural person who has the right to write on a blockchain and submit transactions for validation (referred to in the CNIL’s report as a “participant”) may be considered a data controller if it records personal data on a blockchain and (i) is a natural person engaging in a professional or commercial activity or (ii) is a corporate entity. For example, if a bank enters customer data on a blockchain, the bank would be considered a data controller.
  • Joint controllers: The CNIL advises that if there are multiple participants, the parties should designate a single entity or participant as the data controller in order to avoid joint liability under Article 26 of GDPR.  In addition, designating a single entity or participant as the data controller will provide data subjects with a single controller against whom they can enforce their rights.
  • Smart contract developers:  A smart contract developer may be considered a data processor if the smart contract processes personal data on behalf of the controller.  The CNIL provides the example of a software developer that offers a smart contract to insurance companies that will automatically compensate airline passengers under their travel insurance policies if a flight is delayed.  In this example, the smart contract developer is considered a data processor.
  • Miners: 
    • A miner may be considered a data processor if it executes the instructions of the data controller when verifying whether a transaction meets specified technical criteria.  The CNIL acknowledges the practical difficulties that would result from considering miners as data processors in a public blockchain, and the impracticalities of satisfying the requirement for the miner, as data processor, to sign a data processing agreement with the data controller.  The CNIL indicates that it is still considering this issue and encourages others to find innovative ways to address issues that would arise when miners are considered data processors.
    • Because miners validate transactions on behalf of blockchain participants and do not determine the purpose and means of processing, miners would not be considered data controllers.
  • Privacy by design and data minimization:
    • In order to comply with GDPR’s privacy by design and data minimization requirements, data controllers must consider whether blockchain is the appropriate technology for the intended use case and whether they will be able to comply with GDPR requirements.  The CNIL notes that data transfers on a public blockchain may be especially problematic since miners may be validating transactions outside of the EU.
    • If personal data cannot be stored off-chain, hashing and encryption should be considered.
  • Right to erasure: The CNIL acknowledges that compliance with GDPR’s right to erasure may be technically impossible with respect to data on a blockchain, and notes that a more detailed analysis is needed as to how the right to erasure applies to blockchain. The CNIL strongly cautions against using blockchain to store unencrypted personal data and indicates that deletion of private keys should be considered when determining how to comply with the right to erasure requirement.
  • Security:  The CNIL recommends considering whether a minimum number of miners should be required in order to help prevent a 51% attack.  In addition, there should be a contingency plan for modifying algorithms in the event a vulnerability is detected.
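One way to picture the CNIL’s hashing suggestion and the key-deletion approach to erasure is a salted-hash commitment: only a digest of the personal data is written on-chain, while the raw data and the salt are kept off-chain, where they can still be deleted. The sketch below is illustrative only (the function names and flow are not from the CNIL report), and whether deleting the off-chain material satisfies the right to erasure remains, as the CNIL notes, an open legal question.

```python
import hashlib
import secrets

def commit_personal_data(data: bytes) -> tuple[bytes, bytes]:
    """Return (salt, digest). Only the digest would be written on-chain;
    the salt and the raw data stay off-chain and can later be deleted."""
    salt = secrets.token_bytes(32)
    digest = hashlib.sha256(salt + data).digest()
    return salt, digest

def verify(data: bytes, salt: bytes, digest: bytes) -> bool:
    # Anyone holding the off-chain data and salt can show it matches the
    # on-chain digest; without the salt, the digest alone is not reversible
    # by simple guessing of common values.
    return hashlib.sha256(salt + data).digest() == digest

# Illustrative usage: commit an email address, then verify it.
salt, digest = commit_personal_data(b"jane.doe@example.com")
assert verify(b"jane.doe@example.com", salt, digest)
```

Deleting the off-chain salt and data leaves only an unlinkable digest on the immutable ledger, which is the intuition behind the CNIL’s suggestion that destroying keys or verification material be considered when assessing erasure.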

The CNIL notes that its analysis is focused only on blockchain and not the broader category of distributed ledger technology (DLT).  Although the CNIL indicates that it may offer guidance on GDPR’s applicability to other DLTs in the future, it chose to focus its analysis on blockchain because DLT solutions that are not blockchains do not yet lend themselves to a generic analysis.  (The CNIL’s full report (in French) and introductory materials accompanying the report can be found here).


The Weight of “GDPR Lite”

By Dawn Ingley | In June 2018, California’s legislature took the first steps to ensure that the state’s approach to data privacy aligned more closely with the European Union’s General Data Protection Regulation (GDPR), the de facto global industry standard for data protection. Though legislators have acknowledged that further refinements to the California Consumer Privacy Act (CCPA) will be necessary in the coming months, its salient requirements are known.

The ABA’s Valentine’s Gift to Same-Sex Couples: Formal Opinion 485 Requires Judges to Perform Marriages

By Jennifer Thompson | On Valentine’s Day, the American Bar Association (ABA) Standing Committee on Ethics and Professional Responsibility issued Formal Opinion 485, entitled “Judges Performing Same-Sex Marriages,” stating that judges may not decline to perform marriages for couples of the same sex.

The Intersection of Artificial Intelligence and the Model Rules of Professional Conduct

By Linda Henry | Artificial intelligence is transforming the legal profession, and attorneys are increasingly using AI-powered software to assist with a wide range of tasks, from due diligence review and issue spotting during contract negotiation to predicting case outcomes.

Follow the Leader: Will Congressional and Corporate Push for Federal Privacy Regulations Leave Some Technology Giants in the Dust?

By Dawn Ingley | On October 24, 2018, Apple CEO Tim Cook, one of the keynote speakers at the International Conference of Data Protection and Privacy Commissioners, threw down the gauntlet when he assured an audience of data protection professionals that Apple fully supports a “GDPR-like” federal data privacy law in the United States.

Yes, Lawyers Too! ABA Formal Opinion 483 and the Affirmative Duty to Inform Clients of Data Breaches

By Jennifer Thompson | Developments in the rules and regulations governing data breaches happen as quickly as you can click through the headlines on your favorite news media site.  Now, the American Bar Association (“ABA”) has gotten in on the action and is mandating that attorneys notify current clients of real or substantially likely data breaches where confidential client information is or may be compromised.

D-Link Continues Challenges to FTC’s Data Security Authority

By Linda Henry | On September 21, 2018, the FTC and D-Link Systems Inc. each filed a motion for summary judgment in one of the most closely watched recent enforcement actions in privacy and data security law (FTC v. D-Link Systems Inc., No. 3:17-cv-00039).  The dispute, which dates back to early 2017, may have widespread implications on companies’ potential liability for lax security practices, even in the absence of actual consumer harm.

Good, Bad or Ugly? Implementation of Ethical Standards In the Age of AI

By Dawn Ingley | With the explosion of artificial intelligence (AI) implementations, several technology organizations have established AI ethics teams to ensure that their many uses of AI across platforms are reasonable, fair and non-discriminatory.  Yet, to date, very few details have emerged regarding those teams—Who are the members?  What standards are applied to creation and implementation of AI?  Axon, the manufacturer behind community policing products and services such as body cameras and related video analytics, has embarked upon creation of an ethics board.  Google’s DeepMind Ethics and Society division (DeepMind) also seeks to temper the innovative potential of AI with the dangers of a technology that is not inherently “value-neutral” and that could lead to outcomes ranging from good to bad to downright ugly.  Indeed, a peek behind both ethics programs may offer some interesting insights into the direction of all corporate AI ethics programs.

IoT Device Companies: The FTC is Monitoring Your COPPA Data Deletion Duties and More

By Jennifer Thompson | Recent Federal Trade Commission (FTC) activities with respect to the Children’s Online Privacy Protection Act (COPPA) demonstrate a continued interest in, and increased scrutiny of, companies subject to COPPA. While the FTC has pursued companies for alleged violations of all facets of its COPPA Six Step Compliance Plan, most recently the FTC has focused on the obligation to promptly and securely delete all data collected if it is no longer needed. Taken as a whole, recent FTC activity may indicate a desire on the part of the FTC to expand its regulatory reach.

Predictive Algorithms in Sentencing: Are We Automating Bias?

By Linda Henry | Although algorithms are often presumed to be objective and unbiased, recent investigations into algorithms used in the criminal justice system to predict recidivism have produced compelling evidence that such algorithms may be racially biased.  As a result of one such investigation by ProPublica, the New York City Council recently passed the first bill in the country designed to address algorithmic discrimination in government agencies. The goal of New York City’s algorithmic accountability bill is to monitor algorithms used by municipal agencies and provide recommendations as to how to make the City’s algorithms fairer and more transparent.