Patrick Law Group Presenting on AI Business Ethics at NAMWOLF’s Driving Diversity & Leadership Conference in New Orleans

Patrick Law Group’s Linda Henry will present a CLE entitled “Risky Business: Legal and Ethical Considerations in Artificial Intelligence” at the NAMWOLF Driving Diversity & Leadership Conference on February 18, 2019, in New Orleans, LA.

Description of CLE:

Artificial Intelligence (“AI”) is becoming ubiquitous in business operations.  This CLE presentation will explore key legal considerations arising from the use of AI, including privacy, data security, intellectual property, allocation of liability and general contract considerations.  We will also discuss policy considerations related to the use of AI (e.g., algorithmic bias) and ethical considerations arising from the use of AI in law practice (e.g., how the Rules of Professional Conduct may apply).

Bringing Value by Bringing AI

Pittsburgh, PA, USA and Atlanta, GA, USA | February 1, 2019

Patrick Law Group, LLC and LegalSifter partner to bring Artificial Intelligence to the Contract Negotiation Table

LegalSifter and Patrick Law Group, LLC (“PLG”) are proud to announce a strategic partnership to offer a “combined intelligence” solution for contract review, preparation and negotiation to corporations and other business clients.  The combination of Patrick Law Group’s transactional legal expertise and business acumen and LegalSifter’s artificial intelligence allows Clients to review and execute business contracts quickly, confidently and cost-effectively.  Patrick Law Group has deep expertise negotiating a wide range of commercial agreements, and Clients will see immediate benefits from its adoption and implementation of this innovative legal technology.

How This Combined Human-Technology Approach Will Save Time While Also Adding Value

Combining the power of Patrick Law Group’s highly experienced transactional lawyers with LegalSifter’s powerful artificial intelligence technology means Clients can review and draft contracts more quickly by employing a consistent approach and process.  Contracts are uploaded to a secure site, and then reviewed (“sifted”) for legal issues.  LegalSifter utilizes technology (“Sifters”) trained to read text, look for specific concepts, and identify important terms that require additional consideration or are missing entirely.  The Sifters trigger help text and alternative language suggestions that have been customized and tailored to the Client’s business and negotiation positions by the legal team at Patrick Law Group.  In addition, Sifters learn from experience and improve over time.  Since the product is fully configurable, Clients can feel confident that their specific legal perspectives are incorporated while having access to their trusted legal advisors 24x7x365.
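
As a purely hypothetical sketch of the concept (LegalSifter’s actual Sifters are proprietary trained models, and every pattern, name and message below is invented for illustration), a “sifter” can be modeled as a concept detector that flags missing terms and attaches the legal team’s customized help text:

```python
# Toy illustration of the workflow described above: scan contract text for
# expected concepts and surface tailored help text for anything missing.
import re

# Concepts a legal team might require, with illustrative detection patterns.
SIFTERS = {
    "limitation of liability": re.compile(r"limitation of liability|shall not exceed", re.I),
    "governing law": re.compile(r"governing law|governed by the laws of", re.I),
    "confidentiality": re.compile(r"confidential information", re.I),
}

# Help text reflecting the client's negotiation positions (assumed examples).
HELP_TEXT = {
    "limitation of liability": "Consider adding a liability cap tied to fees paid.",
    "governing law": "Specify the client's preferred governing jurisdiction.",
    "confidentiality": "Add mutual confidentiality obligations.",
}

def sift(contract_text: str) -> list[str]:
    """Return advisory flags for concepts that appear to be missing."""
    return [
        f"MISSING {concept}: {HELP_TEXT[concept]}"
        for concept, pattern in SIFTERS.items()
        if not pattern.search(contract_text)
    ]

if __name__ == "__main__":
    sample = "This Agreement shall be governed by the laws of the State of Georgia."
    for flag in sift(sample):
        print(flag)
```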

Why PLG Chose to be an Early Adopter of This Cutting-Edge Tech Solution

“Patrick Law Group is committed to providing innovative legal solutions to our Clients, and we are excited to bring artificial intelligence to the contract negotiation process.  Artificial Intelligence is transforming the practice of law, and our Clients will be able to expedite contract review by leveraging LegalSifter’s AI and PLG’s deep expertise in drafting and negotiating commercial agreements,” said A. Elizabeth (Lizz) Patrick, Chief Client Officer and Founder of Patrick Law Group.

CEO of LegalSifter, Kevin Miller, said that “Patrick Law Group knows contracts. They bring expertise and leadership, and we bring software that unleashes that expertise and leadership. They will be an outstanding combined intelligence partner!  We are fortunate to work with Lizz and her team.”

Patrick Law Group, LLC will offer LegalSifter-enabled solutions to Clients for a wide range of contract negotiation matters, including Software-as-a-Service Agreements, Software License and Support Agreements, Data Security Agreements, EULAs, Non-Disclosure Agreements, Business Associate Agreements and Services Agreements.

For additional information, please contact A. Elizabeth (Lizz) Patrick, Chief Client Officer, at 404-437-6731 or lpatrick@patricklawgroup.com, or Linda Henry, Deputy Counsel and Vice President of Operations, at 404-525-3229 or lhenry@patricklawgroup.com.

About Patrick Law Group, LLC

Patrick Law Group, LLC is an agile, results-driven law firm focused on preparing and negotiating business contracts for our Clients in the United States and abroad. We have drafted thousands of agreements for companies in a broad range of industries for their purchase and sale of a wide spectrum of goods and services, including those relating to complex construction and development projects, technology, procurement, logistics and fulfillment, and promotional marketing and advertising. We are woman-owned and operated, certified as a Women’s Business Enterprise (WBE) by the Women’s Business Enterprise National Council (WBENC), and a member firm of the National Association of Minority & Women Owned Law Firms (NAMWOLF).

About LegalSifter

LegalSifter is dedicated to bringing affordable legal services to the world by empowering people with artificial intelligence. LegalSifter intends to achieve its mission by working with the legal profession, not against it.

Contacts

Linda Henry
Deputy Counsel & Vice President of Operations
lhenry@patricklawgroup.com
404-525-3229

Vesatee Merkerson
Director of Legal Operations
vmerkerson@patricklawgroup.com
404-275-2109

Laura Taylor
Vice President of Partner Experience
laura@legalsifter.com
412-523-5977

Maggie Frey
Vice President of Growth & Partnerships
maggie@legalsifter.com
412-779-4271

Peggy Abood, Senior Counsel for PLG, Adopts an Elephant from The David Sheldrick Wildlife Trust USA

Patrick Law Group proudly supports The David Sheldrick Wildlife Trust USA, an organization dedicated to the preservation of Kenya’s wildlife and the prevention of the cruel ivory trade that threatens the African elephant population. Peggy Abood, Senior Counsel for Patrick Law Group, has adopted one of the elephants; her name is Shukuru.

For more information about Shukuru and how to support elephants like her, click here.

GDPR Compliance and Blockchain: The French Data Protection Authority Offers Initial Guidance

By Linda Henry

The French Data Protection Authority (“CNIL”) recently became the first data protection authority to provide guidance as to how the European Union’s General Data Protection Regulation (“GDPR”) applies to blockchain.

A few key takeaways from the CNIL report are as follows:

  • Data controllers: Legal entities or natural persons who have a right to write on a blockchain and create transactions submitted for validation (referred to in the CNIL’s report as “participants”) can be considered data controllers if they record personal data on a blockchain and either (i) are natural persons engaging in a professional or commercial activity or (ii) are legal entities. For example, if a bank enters customer data on a blockchain, the bank would be considered a data controller.
  • Joint controllers: The CNIL advises that if there are multiple participants, the parties should designate a single entity or participant as the data controller in order to avoid joint liability under Article 26 of the GDPR.  Designating a single data controller also provides data subjects with a single controller against whom they can enforce their rights.
  • Smart contract developers:  A smart contract developer may be considered a data processor if the smart contract processes personal data on behalf of the controller.  The CNIL provides the example of a software developer that offers a smart contract to insurance companies that will automatically compensate airline passengers under their travel insurance policies if a flight is delayed.  In this example, the smart contract developer is considered a data processor.
  • Miners: 
    • A miner may be considered a data processor if it executes the instructions of the data controller when verifying whether a transaction meets specified technical criteria.  The CNIL acknowledges the practical difficulties that would result from considering miners as data processors in a public blockchain, and the impracticalities of satisfying the requirement for the miner, as data processor, to sign a data processing agreement with the data controller.  The CNIL indicates that it is still considering this issue and encourages others to find innovative ways to address issues that would arise when miners are considered data processors.
    • Because miners validate transactions on behalf of blockchain participants and do not determine the purpose and means of processing, miners would not be considered data controllers.
  • Privacy by design and data minimization:
    • In order to comply with GDPR’s privacy by design and data minimization requirements, data controllers must consider whether blockchain is the appropriate technology for the intended use case and whether they will be able to comply with GDPR requirements.  The CNIL notes that data transfers on a public blockchain may be especially problematic since miners may be validating transactions outside of the EU.
    • If personal data cannot be stored off-chain, hashing and encryption should be considered (see the sketch following this list).
  • Right to erasure: The CNIL acknowledges that compliance with GDPR’s right to erasure may be technically impossible with respect to data on a blockchain, and notes that a more detailed analysis is needed as to how the right to erasure applies to blockchain. The CNIL strongly cautions against using blockchain to store unencrypted personal data and indicates that deletion of private keys should be considered when determining how to comply with the right to erasure requirement.
  • Security:  The CNIL recommends considering if a minimum number of miners should be required in order to help prevent a 51% attack.  In addition, there should be a contingency plan to modify algorithms in the event a vulnerability is detected.
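
To make the hashing suggestion concrete, the sketch below shows the commitment pattern the CNIL’s report contemplates: only a salted hash is written on-chain, while the personal data and the secret salt remain in conventional, erasable storage. This is a simplified illustration under assumed names and storage choices, not code from the CNIL report.

```python
# Illustrative sketch only: a plain dictionary stands in for off-chain storage,
# and the returned digest stands in for the value a participant would write
# on-chain. Deleting the off-chain data and secret salt afterwards leaves the
# immutable on-chain digest practically unlinkable to the data subject.
import hashlib
import os

off_chain_store: dict[str, tuple[bytes, bytes]] = {}  # record id -> (salt, data)

def commit_personal_data(record_id: str, personal_data: bytes) -> str:
    """Keep the data off-chain and return the hex digest to record on-chain."""
    salt = os.urandom(32)  # a secret salt defeats brute-force guessing of the data
    off_chain_store[record_id] = (salt, personal_data)
    return hashlib.sha256(salt + personal_data).hexdigest()

def erase(record_id: str) -> None:
    """Honor an erasure request by destroying the off-chain data and salt."""
    off_chain_store.pop(record_id, None)

on_chain_digest = commit_personal_data("tx-001", b"Jane Doe, jane@example.com")
print(on_chain_digest)  # this digest is the only value stored on the chain
erase("tx-001")         # the digest now references nothing recoverable
```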

The CNIL notes that its analysis is focused only on blockchain and not the broader category of distributed ledger technology (DLT).  Although the CNIL indicates that it may offer guidance on GDPR’s applicability to other DLTs in the future, it chose to focus its analysis on blockchain because DLT solutions that are not blockchains do not yet lend themselves to a generic analysis.  (The CNIL’s full report (in French) and introductory materials accompanying the report can be found here).

OTHER THOUGHT LEADERSHIP POSTS:

The Intersection of Artificial Intelligence and the Model Rules of Professional Conduct

By Linda Henry | Artificial intelligence is transforming the legal profession, and attorneys are increasingly using AI-powered software to assist with a wide range of tasks, ranging from due diligence review and issue spotting during the contract negotiation process to predicting case outcomes.

Follow the Leader: Will Congressional and Corporate Push for Federal Privacy Regulations Leave Some Technology Giants in the Dust?

By Dawn Ingley | On October 24, 2018, Apple CEO Tim Cook, one of the keynote speakers at the International Conference of Data Protection and Privacy Commissioners, threw down the gauntlet when he assured an audience of data protection professionals that Apple fully supports a “GDPR-like” federal data privacy law in the United States.

Yes, Lawyers Too! ABA Formal Opinion 483 and the Affirmative Duty to Inform Clients of Data Breaches

By Jennifer Thompson | Developments in the rules and regulations governing data breaches happen as quickly as you can click through the headlines on your favorite news media site.  Now, the American Bar Association (“ABA”) has gotten in on the action and is mandating that attorneys notify current clients of real or substantially likely data breaches where confidential client information is or may be compromised.

GDPR Compliance and Blockchain: The French Data Protection Authority Offers Initial Guidance

By Linda Henry | The French Data Protection Authority (“CNIL”) recently became the first data protection authority to provide guidance as to how the European Union’s General Data Protection Regulation (“GDPR”) applies to blockchain.

D-Link Continues Challenges to FTC’s Data Security Authority

By Linda Henry | On September 21, 2018, the FTC and D-Link Systems Inc. each filed a motion for summary judgment in one of the most closely watched recent enforcement actions in privacy and data security law (FTC v. D-Link Systems Inc., No. 3:17-cv-00039).  The dispute, which dates back to early 2017, may have widespread implications for companies’ potential liability for lax security practices, even in the absence of actual consumer harm.

Good, Bad or Ugly? Implementation of Ethical Standards In the Age of AI

By Dawn Ingley | With the explosion of artificial intelligence (AI) implementations, several technology organizations have established AI ethics teams to ensure that their respective and myriad uses across platforms are reasonable, fair and non-discriminatory.  Yet, to date, very few details have emerged regarding those teams—Who are the members?  What standards are applied to creation and implementation of AI?  Axon, the manufacturer behind community policing products and services such as body cameras and related video analytics, has embarked upon creation of an ethics board.  Google’s DeepMind Ethics and Society division (DeepMind) also seeks to temper the innovative potential of AI with the dangers of a technology that is not inherently “value-neutral” and that could lead to outcomes ranging from good to bad to downright ugly.  Indeed, a peek behind both ethics programs may offer some interesting insights into the direction of all corporate AI ethics programs.

IoT Device Companies: The FTC is Monitoring Your COPPA Data Deletion Duties and More

By Jennifer Thompson | Recent Federal Trade Commission (FTC) activities with respect to the Children’s Online Privacy Protection Act (COPPA) demonstrate a continued interest in, and increased scrutiny of, companies subject to COPPA. While the FTC has pursued companies for alleged violations of all facets of its COPPA Six Step Compliance Plan, most recently the FTC has focused on the obligation to promptly and securely delete all data collected if it is no longer needed. Taken as a whole, recent FTC activity may indicate a desire on the part of the FTC to expand its regulatory reach.

Predictive Algorithms in Sentencing: Are We Automating Bias?

By Linda Henry | Although algorithms are often presumed to be objective and unbiased, recent investigations into algorithms used in the criminal justice system to predict recidivism have produced compelling evidence that such algorithms may be racially biased.  As a result of one such investigation by ProPublica, the New York City Council recently passed the first bill in the country designed to address algorithmic discrimination in government agencies. The goal of New York City’s algorithmic accountability bill is to monitor algorithms used by municipal agencies and provide recommendations as to how to make the City’s algorithms fairer and more transparent.

My Car Made Me Do It: Tales from a Telematics Trial

By Dawn Ingley | Recently, my automobile insurance company gauged my interest in saving up to 20% on insurance premiums.  The catch?  For three months, I would be required to install a plug-in monitor that collected extensive metadata—average speeds and distances, routes routinely traveled, seat belt usage and other types of data.  But to what end?  Was the purpose of the monitor to learn more about my driving practices and to encourage better driving habits?  To share my data with advertisers wishing to serve up a buy-one, get-one free coupon for paper towels from my favorite grocery store (just as I pass by it) on my touchscreen dashboard?  Or to build a “risk profile” that could be sold to parties (AirBnB, banks, other insurance companies) who may have a vested interest in learning more about my propensity for making good decisions?  The answer could be, “all of the above.”

When Data Scraping and the Computer Fraud and Abuse Act Collide

By Linda Henry | As the volume of data available on the internet continues to increase at an extraordinary pace, it is no surprise that many companies are eager to harvest publicly available data for their own use and monetization.  Data scraping has come a long way since its early days, which involved manually copying data visible on a website.  Today, data scraping is a thriving industry, and high-performance web scraping tools are fueling the big data revolution.  Like many technological advances though, the law has not kept up with the technology that enables scraping. As a result, the state of the law on data scraping remains in flux.

D-Link Continues Challenges to FTC’s Data Security Authority

By Linda Henry

On September 21, 2018, the FTC and D-Link Systems Inc. each filed a motion for summary judgment in one of the most closely watched recent enforcement actions in privacy and data security law (FTC v. D-Link Systems Inc., No. 3:17-cv-00039).  The dispute, which dates back to early 2017, may have widespread implications for companies’ potential liability for lax security practices, even in the absence of actual consumer harm.

In January 2017, the FTC sued D-Link for engaging in unfair or deceptive acts in violation of Section 5 of the FTC Act in connection with D-Link’s failure to take reasonable steps to secure its routers and Internet Protocol (IP) cameras from widely known and reasonably foreseeable security risks.  The FTC’s complaint focused on D-Link’s marketing practices, noting that D-Link’s marketing materials and user manuals included statements in bold, italicized, all-capitalized text that D-Link’s routers were “easy to secure” with “advanced network security.”  D-Link also promoted the security of its IP cameras in its marketing materials, specifically referencing the devices’ security in large capital letters.  In addition, the IP camera packaging listed security claims, such as “secure-connection” next to a lock icon, as one of the product features.

Although a U.S. district court judge dismissed three of the FTC’s six claims in September 2017, the judge also rejected D-Link’s argument that the FTC lacked statutory authority to regulate data security for IoT companies as an unfair practice under Section 5 of the FTC Act.  In the court’s Order Regarding Motion to Dismiss, the court stated that “the fact that data security is not expressly enumerated as within the FTC’s enforcement powers is of no moment to the exercise of its statutory authority.”  With respect to the court’s dismissal of the FTC’s unfairness claim, the court agreed with D-Link that the FTC had failed to provide any concrete facts demonstrating actual harm to consumers, and reasoned that the absence of any concrete facts makes it just as possible that D-Link’s devices would not cause substantial harm to consumers and that “the FTC cannot rely on wholly conclusory allegations about potential injury to tilt the balance in its favor.”

Despite the court’s dismissal of the FTC’s unfairness claim, the court indicated that the claim might have survived a motion to dismiss if the FTC had tied the unfairness claim to the representations underlying the deception claims.  The court stated that “a consumer’s purchase of a device that fails to be reasonably secure — let alone as secure as advertised — would likely be in the ballpark of a ‘substantial injury,’ particularly when aggregated across a large group of consumers.”  Although the court’s reasoning indicates that there are limits to the FTC’s data security enforcement capabilities, it did not completely foreclose the possibility that lax security practices might be deemed to violate the unfairness prong of the FTC Act even in the absence of evidence of actual harm to consumers.

The FTC argued in its September 2018 motion for summary judgment that summary judgment is appropriate because there is no dispute that D-Link made representations regarding the security of its devices from unauthorized access, that the devices contained numerous vulnerabilities making them susceptible to unauthorized access, and that D-Link’s security statements were material to consumers.  The FTC noted that “there is no genuine dispute that D-Link routers and IP cameras have contained serious, foreseeable, and easily preventable vulnerabilities permitting unauthorized access; that D-Link knew of these vulnerabilities; and that D-Link sold and marketed these devices as secure anyway.”

In D-Link’s motion for summary judgment, D-Link argued that the FTC’s remaining deception claims were based on “expert conjecture” with no evidentiary support.  D-Link stressed that the FTC’s failure to present any evidence that an identifiable consumer was deceived by D-Link’s marketing statements or that any of the routers or cameras were actually compromised demonstrated that there was no harm for the court to remedy.

D-Link is significant because the outcome may have a substantial impact on the FTC’s ability to successfully pursue a claim under Section 5 of the FTC Act in the absence of evidence that there has been an actual harm or injury to consumers. In addition, the outcome of D-Link may shape the FTC’s approach to classifying informational harm that impacts consumers following a data breach.

Even if the D-Link decision offers more clarity around the scope of the FTC’s regulatory authority on data security, the FTC’s past guidance regarding data security and privacy remains useful when evaluating a company’s data security practices.  Over the past few years, the FTC has repeatedly stressed that a company’s failure to implement reasonable security measures may be considered deceptive or unfair, and has stated that “the touchstone of the FTC’s approach to data security is reasonableness: a company’s data security measures must be reasonable in light of the sensitivity and volume of consumer information it holds, the size and complexity of its data operations, and the cost of available tools to improve security and reduce vulnerabilities.” In addition, the FTC’s motions in D-Link confirm that a company should ensure that it actually follows all security practices it claims to follow.
