The digital economy is revolutionizing every aspect of our lives, and success in today’s economy requires that businesses become disruptors and innovators. At Patrick Law Group, we believe that a critical component of competitiveness in the digital economy is the sharing of, and collaboration around, relevant information.
We recognize the increasing difficulty our Clients face in identifying relevant content and insightful business perspectives on the changes and developments important to their practice areas and business interests. We invest in creating and sharing Client-centric content, providing our Clients with current insights and knowledge that inform critical business decisions and the development of cogent business strategies.
By Dawn Ingley | With the explosion of artificial intelligence (AI) implementations, several technology organizations have established AI ethics teams to ensure that their myriad uses of AI across platforms are reasonable, fair and non-discriminatory. Yet, to date, very few details have emerged regarding those teams—Who are the members? What standards are applied to the creation and implementation of AI? Axon, the manufacturer behind community policing products and services such as body cameras and related video analytics, has embarked upon the creation of an ethics board. Google’s DeepMind Ethics and Society division (DeepMind) likewise seeks to balance the innovative potential of AI against the dangers of a technology that is not inherently “value-neutral” and that could lead to outcomes ranging from good to bad to downright ugly. Indeed, a peek behind both ethics programs may offer some interesting insights into the direction of all corporate AI ethics programs.
By Jennifer Thompson | Recent Federal Trade Commission (FTC) activities with respect to the Children’s Online Privacy Protection Act (COPPA) demonstrate a continued interest in, and increased scrutiny of, companies subject to COPPA. While the FTC has pursued companies for alleged violations of all facets of its COPPA Six Step Compliance Plan, the FTC has most recently focused on the obligation to promptly and securely delete collected data once it is no longer needed. Taken as a whole, recent FTC activity may indicate a desire on the part of the FTC to expand its regulatory reach.
By Linda Henry | Although algorithms are often presumed to be objective and unbiased, recent investigations into algorithms used in the criminal justice system to predict recidivism have produced compelling evidence that such algorithms may be racially biased. As a result of one such investigation by ProPublica, the New York City Council recently passed the first bill in the country designed to address algorithmic discrimination in government agencies. The goal of New York City’s algorithmic accountability bill is to monitor algorithms used by municipal agencies and provide recommendations as to how to make the City’s algorithms fairer and more transparent.
By Dawn Ingley | Recently, my automobile insurance company gauged my interest in saving up to 20% on insurance premiums. The catch? For three months, I would be required to install a plug-in monitor that collected extensive metadata—average speeds and distances, routes routinely traveled, seat belt usage and other types of data. But to what end? Was the purpose of the monitor to learn more about my driving practices and to encourage better driving habits? To share my data with advertisers wishing to serve up a buy-one, get-one free coupon for paper towels from my favorite grocery store (just as I pass by it) on my touchscreen dashboard? Or to build a “risk profile” that could be sold to parties (AirBnB, banks, other insurance companies) who may have a vested interest in learning more about my propensity for making good decisions? The answer could be, “all of the above.”
By Linda Henry | As the volume of data available on the internet continues to increase at an extraordinary pace, it is no surprise that many companies are eager to harvest publicly available data for their own use and monetization. Data scraping has come a long way since its early days, which involved manually copying data visible on a website. Today, data scraping is a thriving industry, and high-performance web scraping tools are fueling the big data revolution. Like many technological advances though, the law has not kept up with the technology that enables scraping. As a result, the state of the law on data scraping remains in flux.
By Jennifer Thompson | In October 2016, Uber discovered that the personal contact information of some 57 million Uber customers and drivers, as well as the driver’s license numbers of over 600,000 United States Uber drivers, had been hacked. Uber, like many companies, maintained a vulnerability disclosure or “bug bounty” program that invited hackers to test Uber’s systems for certain vulnerabilities and offered financial rewards for qualifying discoveries. In fact, Uber has paid out over $1,000,000 under its program, which is administered through HackerOne, a third-party vendor. Uber initially characterized the breach as an authorized vulnerability disclosure, paid the hackers $100,000, and the hackers deleted the records. Yet Uber has since faced lawsuits, governmental inquiries and much public criticism in connection with this payment.