While you’ve been focused on CCPA Compliance Efforts, Elon has Been Developing Cyborgs

Dec 16, 2019

By Meredith Sidewater



As I look back at calendar year 2019 and contemplate the legal work that came my way, it is impossible to overlook the impact that privacy laws have on organizations. Just as businesses began to exhale after a grueling race toward the GDPR compliance deadline, the California legislature enacted a law that required the focused attention of every company collecting personal information on California consumers.

Most Americans agree that data privacy is an issue that matters to them and over which they feel little control. A June 2019 Pew Research Center survey of randomly selected U.S. adults revealed that 62% of Americans believe “it is not possible to go through daily life without companies collecting data about them.” The survey also captures Americans’ perceptions of the potential risks of personal data proliferation relative to its benefits. These findings support the attention that legislators and privacy advocates give to personal data protection and regulation.

Privacy laws, once enacted, send in-house legal staff, technologists, compliance teams, and data management personnel scurrying to develop policies and processes, write code, and train employees in an effort to ensure compliance by a looming effective date. Companies spend significant dollars and deploy large teams of people to address CCPA requirements, after having just done the same in response to the GDPR.

I, like some of you, am a lawyer who has spent the last 18 months advising companies about their CCPA compliance obligations. I participated in the design of a CCPA compliance program for a multi-billion-dollar data company, and I have spent a great deal of time counseling smaller companies on their CCPA compliance efforts.

While researching a topic the other day, I happened upon a video in which Elon Musk introduces his new company, Neuralink, to an audience of potential recruits. Somehow, while I was busy offering sage counsel in the arcane space of the CCPA, Neuralink’s debut slipped under my radar. I watched the presentation and was struck. It became frighteningly clear that the privacy laws that legislators, lawyers, lobbyists, advocates, and businesses are so busy negotiating and navigating around may be one critical defense against humanity’s demise.

Neuralink is a company operating in the realm of “neurotechnology.” Its immediate goal is to repair broken brain circuitry with a device consisting of a chip and tiny threads that are implanted in a human brain by a robot, thereby creating a “brain/machine interface.” Neuralink is already testing the device in monkeys and forecasts that it will become part of a human study by the end of 2020, likely in a quadriplegic patient with a C1–C4 spinal injury.

Mr. Musk foresees that the procedure to implant the Neuralink device could ultimately be as simple as LASIK: no hospital stay, and only conscious sedation. A brain chip would be as accessible as hair transplants and breast augmentation. The ready availability of a device that could restore mobility and successfully treat conditions such as Parkinson’s, ALS, depression, and chronic pain is extraordinary. Having lost my mother at too young an age to Multiple System Atrophy, a neurodegenerative disorder, I understand the suffering such diseases inflict on patients and their families.

Yet, listening carefully to Mr. Musk’s presentation, I felt a chill creep up my still-human spine. He spoke of a “symbiosis with AI.” He opines that Neuralink’s technologies will be “important at a civilization level scale” in that they will give humans the “option of merging with AI.” He hopes that Neuralink will help mitigate the “existential threat of AI” by enabling a “well-aligned future” between humans and machines, which I interpreted as a scenario in which humans and machines coexist peaceably rather than in a war of the worlds. Just how long is the evolutionary timeline between chip implantation at the corner doc-in-a-box and a cyborg nation? And what happens when this technological “tertiary super-intelligence layer,” controlled by an app from the App Store, becomes available on a mass scale?

Will the privacy laws devised by humans be able to protect us once our every thought, emotion, and memory is recorded and our moods decoded? Can we effectively defend these devices, and ourselves, from malignant custom code? How do we design and build an ethical framework with which to employ this technology? Can we, as mere humans, be prescient enough to craft legal protections that will preserve humanity? Happy New Year, and welcome to our brave new world.
