While you’ve been focused on CCPA Compliance Efforts, Elon has Been Developing Cyborgs

Dec 16, 2019

By Meredith Sidewater


As I look back at calendar year 2019 and contemplate the legal work that came my way, it is impossible to overlook the impact that privacy laws have on organizations. Just as businesses began to exhale after a grueling race toward the GDPR compliance deadline, the California legislature enacted the California Consumer Privacy Act (CCPA), a law that required the focused attention of every company collecting personal information on California consumers.

Most Americans agree that data privacy is an issue that matters to them and over which they feel little control. A June 2019 Pew Research Center survey of randomly selected U.S. adults revealed that 62% of Americans believe "it is not possible to go through daily life without companies collecting data about them." The survey further explored how Americans weigh the potential risks of personal data proliferation against its benefits. These findings support the attention that legislators and privacy advocates give to the issue of personal data protection and regulation.

Privacy laws, once enacted, send in-house legal staff, technologists, compliance teams, and data management personnel scurrying to develop policies and processes, write code, and train employees in an effort to ensure compliance by a looming effective date. Companies spent significant dollars and devoted large teams to address CCPA requirements, after having just done the same in response to the GDPR.

I, like some of you, am a lawyer who has spent the last 18 months advising companies about their CCPA compliance obligations. I participated in the design of a CCPA compliance program for a multi-billion-dollar data company, and I have spent a great deal of time counseling smaller companies on their CCPA compliance efforts.

While researching a topic the other day, I happened upon a video in which Elon Musk introduces his new company, Neuralink, to an audience of potential recruits. Somehow, while I was busy offering sage counsel in the arcane space of the CCPA, Neuralink's debut slipped under my radar. I watched the presentation and was struck. It became frighteningly clear that the privacy laws that legislators, lawyers, lobbyists, advocates, and businesses are so busy negotiating and navigating around may be one critical defense against humanity's demise.

Neuralink is a company operating in the realm of "neurotechnology," and its immediate goal is to repair broken brain circuitry with a device consisting of a chip and tiny threads implanted in a human brain by a robot, thereby creating a "brain/machine interface." Neuralink is already testing the device in monkeys and forecasts that it will become part of a human study by the end of 2020, likely in a quadriplegic patient with a C1–C4 spinal injury.

Mr. Musk foresees that the procedure to implant the Neuralink device could ultimately be as simple as LASIK surgery: no hospital stay, with only conscious sedation. A brain chip could become as accessible as hair transplants and breast augmentation. Ready availability of a device that could reactivate mobility and successfully treat conditions such as Parkinson's, ALS, depression, and chronic pain would be extraordinary. Having lost my mother at too young an age to Multiple System Atrophy, a neurodegenerative disorder, I understand the suffering such diseases inflict on patients and their families.

Yet, listening carefully to Mr. Musk's presentation, I felt a chill creep up my still-human spine. He spoke of a "symbiosis with AI." He opined that Neuralink's technologies will be "important at a civilization level scale" in that they will give humans the "option of merging with AI." He hopes that Neuralink will help mitigate the "existential threat of AI" by enabling a "well-aligned future" between humans and machines, which I interpreted as a scenario where humans and machines coexist peaceably rather than in a war of the worlds. Just how long is the evolutionary timeline between chip implantation at the corner doc-in-a-box and a cyborg nation? And what happens when this technological "tertiary super-intelligence layer," controlled by an app from the App Store, becomes available on a mass scale?

Will the privacy laws devised by humans be able to protect us once our every thought, emotion, and memory is recorded and our moods decoded? Can we effectively defend these devices, and ourselves, from the effects of malignant custom code? How do we design and build an ethical framework with which to employ this technology? Can we, as mere humans, be prescient enough to craft legal protections that will preserve humanity? Happy New Year, and welcome to our brave new world.
